
Overview

The Synheart Core Dart/Flutter SDK provides a unified API for collecting HSI-compatible data, processing human state on-device, and generating focus/emotion signals in Flutter applications.

Key Features:
  • Cross-platform support (iOS + Android)
  • On-device HSI Runtime
  • Real-time state updates
  • Modular design (enable only what you need)
  • Privacy-first architecture

Installation

Add to your pubspec.yaml:
dependencies:
  synheart_core: ^1.0.0
Install dependencies:
flutter pub get

Platform Configuration

iOS Configuration

Add to ios/Runner/Info.plist:
<!-- Health data access (if using Wear module) -->
<key>NSHealthShareUsageDescription</key>
<string>This app needs access to your health data to provide insights.</string>

<!-- Motion & Fitness (if using Phone module) -->
<key>NSMotionUsageDescription</key>
<string>This app uses motion data to understand your activity patterns.</string>

Android Configuration

Add to android/app/src/main/AndroidManifest.xml:
<!-- Required permissions -->
<uses-permission android:name="android.permission.ACTIVITY_RECOGNITION" />

<!-- Health Connect Permissions (if using Wear module) -->
<uses-permission android:name="android.permission.health.READ_HEART_RATE"/>
<uses-permission android:name="android.permission.health.READ_HEART_RATE_VARIABILITY"/>
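
Note that on Android 10 (API level 29) and above, ACTIVITY_RECOGNITION is a runtime permission, so the manifest entry alone is not enough. Below is a minimal sketch of requesting it with the permission_handler package (an assumption; the SDK may provide its own permission flow):

import 'package:permission_handler/permission_handler.dart';

/// Requests the runtime ACTIVITY_RECOGNITION permission on Android 10+.
/// Sketch only; assumes permission_handler is added to pubspec.yaml.
Future<bool> ensureActivityRecognitionPermission() async {
  final status = await Permission.activityRecognition.request();
  return status.isGranted;
}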

Basic Usage

Initialize the SDK

import 'package:synheart_core/synheart_core.dart';

// Initialize with basic modules
await Synheart.initialize(
  userId: 'anon_user_123',
  config: SynheartConfig(
    enableWear: true,
    enablePhone: true,
    enableBehavior: true,
  ),
);

Subscribe to HSV Updates

The HSV (Human State Vector) is the core state representation:
Synheart.onHSVUpdate.listen((hsv) {
  // Access state axes
  final affect = hsv.meta.axes.affect;
  final engagement = hsv.meta.axes.engagement;
  final activity = hsv.meta.axes.activity;
  final context = hsv.meta.axes.context;

  // Affect indicators
  print('Arousal Index: ${affect.arousalIndex}');
  print('Valence Stability: ${affect.valenceStability}');

  // Engagement indicators
  print('Engagement Stability: ${engagement.engagementStability}');
  print('Interaction Cadence: ${engagement.interactionCadence}');

  // Activity indicators
  print('Motion Index: ${activity.motionIndex}');
  print('Posture Stability: ${activity.postureStability}');

  // State embedding (64D vector)
  print('State Embedding: ${hsv.meta.embedding.vector}');
  print('Window Type: ${hsv.meta.embedding.windowType}'); // micro, short, medium, long
});
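
Each call to listen() returns a Dart StreamSubscription. In long-lived widgets it is worth holding on to the subscription and cancelling it once updates are no longer needed; a minimal sketch:

import 'dart:async';

import 'package:synheart_core/synheart_core.dart';

// Hold the subscription so it can be cancelled later.
StreamSubscription<HumanStateVector>? _hsvSubscription;

void startListening() {
  _hsvSubscription = Synheart.onHSVUpdate.listen((hsv) {
    // Handle the update, e.g. push it into your app state.
  });
}

Future<void> stopListening() async {
  await _hsvSubscription?.cancel();
  _hsvSubscription = null;
}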

Enable Interpretation Modules

Interpretation modules are optional and must be explicitly enabled:
// Enable Focus tracking
await Synheart.enableFocus();
Synheart.onFocusUpdate.listen((focus) {
  print('Focus Score: ${focus.estimate.score}'); // 0.0-1.0
  print('Confidence: ${focus.estimate.confidence}');
  print('State: ${focus.state}'); // focused, distracted, neutral
});

// Enable Emotion tracking
await Synheart.enableEmotion();
Synheart.onEmotionUpdate.listen((emotion) {
  print('Stress Index: ${emotion.stressIndex}'); // 0.0-1.0
  print('Energy Level: ${emotion.energyLevel}');
  print('Primary Emotion: ${emotion.primaryEmotion}');
});

Enable Cloud Connector

Upload HSV snapshots to the cloud in HSI 1.0 format (requires user consent):
// Initialize SDK with Cloud Connector
await Synheart.initialize(
  userId: 'anon_user_123',
  config: SynheartConfig(
    enableWear: true,
    enablePhone: true,
    enableBehavior: true,
    cloudConfig: CloudConfig(
      tenantId: 'your_tenant_id',
      hmacSecret: 'your_hmac_secret',
      subjectId: 'user_123',
      instanceId: 'device_abc',
      baseUrl: 'https://api.synheart.com',
      maxQueueSize: 100,
      uploadInterval: Duration(minutes: 5),
    ),
  ),
);

// Cloud connector auto-starts if cloudConfig is provided
// Uploads happen automatically when:
// - HSV is updated
// - Rate limit allows (per window type)
// - Network is available
// - User has granted cloudUpload consent

// Force immediate upload
await Synheart.uploadNow();

// Flush entire upload queue
await Synheart.flushUploadQueue();

// Disable cloud uploads
await Synheart.disableCloud();

Complete Example

import 'package:flutter/material.dart';
import 'package:synheart_core/synheart_core.dart';

class HumanStateMonitor extends StatefulWidget {
  @override
  _HumanStateMonitorState createState() => _HumanStateMonitorState();
}

class _HumanStateMonitorState extends State<HumanStateMonitor> {
  HumanStateVector? _currentHSV;
  FocusState? _currentFocus;
  EmotionState? _currentEmotion;
  bool _isInitialized = false;

  @override
  void initState() {
    super.initState();
    _initializeSynheart();
  }

  Future<void> _initializeSynheart() async {
    try {
      // Initialize Core SDK
      await Synheart.initialize(
        userId: 'user_123',
        config: SynheartConfig(
          enableWear: true,
          enablePhone: true,
          enableBehavior: true,
        ),
      );

      // Enable interpretation modules
      await Synheart.enableFocus();
      await Synheart.enableEmotion();

      // Subscribe to updates
      Synheart.onHSVUpdate.listen((hsv) {
        setState(() => _currentHSV = hsv);
      });

      Synheart.onFocusUpdate.listen((focus) {
        setState(() => _currentFocus = focus);
      });

      Synheart.onEmotionUpdate.listen((emotion) {
        setState(() => _currentEmotion = emotion);
      });

      setState(() => _isInitialized = true);
    } catch (e) {
      print('Initialization error: $e');
    }
  }

  @override
  void dispose() {
    Synheart.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    if (!_isInitialized) {
      return Scaffold(
        body: Center(child: CircularProgressIndicator()),
      );
    }

    return Scaffold(
      appBar: AppBar(title: Text('Human State Monitor')),
      body: ListView(
        padding: EdgeInsets.all(16),
        children: [
          // HSV State
          Card(
            child: Padding(
              padding: EdgeInsets.all(16),
              child: Column(
                crossAxisAlignment: CrossAxisAlignment.start,
                children: [
                  Text('HSV State', style: Theme.of(context).textTheme.titleLarge),
                  SizedBox(height: 8),
                  if (_currentHSV != null) ...[
                    Text('Arousal: ${(_currentHSV!.meta.axes.affect.arousalIndex ?? 0).toStringAsFixed(2)}'),
                    Text('Valence Stability: ${(_currentHSV!.meta.axes.affect.valenceStability ?? 0).toStringAsFixed(2)}'),
                    Text('Engagement: ${(_currentHSV!.meta.axes.engagement.engagementStability ?? 0).toStringAsFixed(2)}'),
                    Text('Window: ${_currentHSV!.meta.embedding.windowType}'),
                  ] else
                    Text('Waiting for data...'),
                ],
              ),
            ),
          ),
          SizedBox(height: 16),

          // Focus State
          Card(
            child: Padding(
              padding: EdgeInsets.all(16),
              child: Column(
                crossAxisAlignment: CrossAxisAlignment.start,
                children: [
                  Text('Focus', style: Theme.of(context).textTheme.titleLarge),
                  SizedBox(height: 8),
                    if (_currentFocus != null) ...[
                      Text('Score: ${_currentFocus!.estimate.score.toStringAsFixed(2)}'),
                      Text('Confidence: ${_currentFocus!.estimate.confidence.toStringAsFixed(2)}'),
                      Text('State: ${_currentFocus!.state}'),
                      LinearProgressIndicator(value: _currentFocus!.estimate.score),
                  ] else
                    Text('Waiting for data...'),
                ],
              ),
            ),
          ),
          SizedBox(height: 16),

          // Emotion State
          Card(
            child: Padding(
              padding: EdgeInsets.all(16),
              child: Column(
                crossAxisAlignment: CrossAxisAlignment.start,
                children: [
                  Text('Emotion', style: Theme.of(context).textTheme.titleLarge),
                  SizedBox(height: 8),
                    if (_currentEmotion != null) ...[
                      Text('Stress Index: ${_currentEmotion!.stressIndex.toStringAsFixed(2)}'),
                      Text('Energy Level: ${_currentEmotion!.energyLevel.toStringAsFixed(2)}'),
                      Text('Primary Emotion: ${_currentEmotion!.primaryEmotion}'),
                  ] else
                    Text('Waiting for data...'),
                ],
              ),
            ),
          ),
        ],
      ),
    );
  }
}

HSV State Reference

The HumanStateVector object contains the core human state representation:
class HumanStateVector {
  // Version and timestamp
  final String version;
  final int timestamp;

  // Emotion state (optional interpretation)
  final EmotionState emotion;

  // Focus state (optional interpretation)
  final FocusState focus;

  // Behavioral metrics
  final BehaviorState behavior;

  // Context information
  final ContextState context;

  // Meta state (axes, embeddings, device info)
  final MetaState meta;
}

class MetaState {
  // State axes (affect, engagement, activity, context)
  final HSIAxes axes;

  // State embedding (64D vector)
  final StateEmbedding embedding;

  // Session and device info
  final String sessionId;
  final DeviceInfo device;
  final double samplingRateHz;
}

class HSIAxes {
  final AffectAxis? affect;
  final EngagementAxis? engagement;
  final ActivityAxis? activity;
  final ContextAxis? context;
}

class AffectAxis {
  final double? arousalIndex;       // 0.0-1.0
  final double? valenceStability;   // 0.0-1.0
}

class EngagementAxis {
  final double? engagementStability;  // 0.0-1.0
  final double? interactionCadence;   // 0.0-1.0
}

class ActivityAxis {
  final double? motionIndex;        // 0.0-1.0
  final double? postureStability;   // 0.0-1.0
}

class ContextAxis {
  final double? screenActiveRatio;     // 0.0-1.0
  final double? sessionFragmentation;  // 0.0-1.0
}

class StateEmbedding {
  final List<double> vector;  // 64D normalized vector
  final int timestamp;
  final String windowType;    // micro, short, medium, long
}
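
Because the state embedding is a normalized 64-dimensional vector, two snapshots can be compared with a plain cosine similarity. The helper below is a hypothetical illustration, not part of the SDK:

import 'dart:math' as math;

/// Cosine similarity between two state embeddings (hypothetical helper,
/// not part of the SDK). Returns a value in [-1.0, 1.0].
double cosineSimilarity(List<double> a, List<double> b) {
  assert(a.length == b.length);
  var dot = 0.0, normA = 0.0, normB = 0.0;
  for (var i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  if (normA == 0 || normB == 0) return 0.0;
  return dot / (math.sqrt(normA) * math.sqrt(normB));
}

// Usage: cosineSimilarity(previousHsv.meta.embedding.vector, hsv.meta.embedding.vector);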

Focus Module

// Enable focus tracking
await Synheart.enableFocus();

// Listen to updates
Synheart.onFocusUpdate.listen((focus) {
  final estimate = focus.estimate;

  print('Score: ${estimate.score}');           // 0.0-1.0
  print('Confidence: ${estimate.confidence}'); // 0.0-1.0
  print('State: ${focus.state}');              // FocusState enum

  // Focus states: focused, distracted, neutral, deep_focus
  if (focus.state == FocusState.distracted) {
    showNotification('You seem distracted. Take a break?');
  }
});

// Disable when done
await Synheart.disableFocus();

Emotion Module

// Enable emotion tracking
await Synheart.enableEmotion();

// Listen to updates
Synheart.onEmotionUpdate.listen((emotion) {
  print('Stress Index: ${emotion.stressIndex}');     // 0.0-1.0
  print('Energy Level: ${emotion.energyLevel}');     // 0.0-1.0
  print('Primary Emotion: ${emotion.primaryEmotion}');

  // Track stress over time
  if (emotion.stressIndex > 0.7) {
    triggerStressAlert();
  }
});

// Disable when done
await Synheart.disableEmotion();

Cloud Connector

The Cloud Connector automatically uploads HSV snapshots to the Synheart Platform in HSI 1.0 format:
// Initialize with CloudConfig
await Synheart.initialize(
  userId: 'anon_user_123',
  config: SynheartConfig(
    enableWear: true,
    enablePhone: true,
    enableBehavior: true,
    cloudConfig: CloudConfig(
      tenantId: 'your_tenant_id',        // From app registration
      hmacSecret: 'your_hmac_secret',    // From app registration
      subjectId: 'user_123',             // Pseudonymous user ID
      instanceId: 'device_abc',          // Device UUID
      baseUrl: 'https://api.synheart.com',
      maxQueueSize: 100,                 // Offline queue size
      uploadInterval: Duration(minutes: 5),
    ),
  ),
);

// Cloud connector auto-starts and handles:
// - Automatic uploads when HSV updates
// - Rate limiting per window type (micro: 30s, short: 2m, medium: 10m, long: 1h)
// - Offline queueing with persistence (max 100 snapshots, FIFO)
// - Network monitoring with auto-flush
// - HMAC-SHA256 authentication
// - Exponential backoff retry (1s, 2s, 4s)

// Force immediate upload
await Synheart.uploadNow();

// Flush entire upload queue
await Synheart.flushUploadQueue();

// Disable cloud uploads
await Synheart.disableCloud();
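
The exponential backoff described above (1s, 2s, 4s) is handled inside the connector; the sketch below only illustrates the pattern and is not the SDK's implementation (uploadSnapshot is a hypothetical callback):

// Illustration only: exponential backoff with three retries (1s, 2s, 4s),
// mirroring the behaviour described above. The real connector handles this
// internally; uploadSnapshot is a hypothetical function.
Future<void> uploadWithBackoff(Future<void> Function() uploadSnapshot) async {
  const delays = [Duration(seconds: 1), Duration(seconds: 2), Duration(seconds: 4)];
  for (var attempt = 0; ; attempt++) {
    try {
      await uploadSnapshot();
      return;
    } catch (_) {
      if (attempt >= delays.length) rethrow;
      await Future.delayed(delays[attempt]);
    }
  }
}
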
Upload Payload (HSI 1.0 format):
{
  "subject": {
    "subject_type": "pseudonymous_user",
    "subject_id": "user_123"
  },
  "snapshots": [
    {
      "hsi_version": "1.0",
      "observed_at_utc": "2025-12-28T10:30:00Z",
      "computed_at_utc": "2025-12-28T10:30:10Z",
      "producer": {
        "name": "Synheart Core SDK",
        "version": "1.0.0",
        "instance_id": "device_abc"
      },
      "windows": {
        "micro": {
          "start_utc": "2025-12-28T10:29:30Z",
          "end_utc": "2025-12-28T10:30:00Z",
          "duration_seconds": 30
        }
      },
      "axes": {
        "affect": {
          "arousal_index": 0.72,
          "valence_stability": 0.65
        },
        "engagement": {
          "engagement_stability": 0.81,
          "interaction_cadence": 0.55
        },
        "behavior": {
          "motion_index": 0.45,
          "posture_stability": 0.78,
          "screen_active_ratio": 0.90,
          "session_fragmentation": 0.22
        }
      },
      "embeddings": [
        {
          "dimension": 64,
          "model": "synheart_fusion_v1",
          "vector": [0.12, 0.34, ...]
        }
      ],
      "privacy": {
        "contains_pii": false
      }
    }
  ]
}
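
Requests are authenticated with HMAC-SHA256 using the hmacSecret from CloudConfig. The sketch below shows how such a signature could be computed with package:crypto; the exact canonical string and header expected by the Synheart Platform are assumptions here:

// Sketch: HMAC-SHA256 signature over a request body, using package:crypto.
// The canonical string and header name expected by the Synheart Platform are
// assumptions; consult the platform documentation for the exact scheme.
import 'dart:convert';

import 'package:crypto/crypto.dart';

String signBody(String hmacSecret, String requestBody) {
  final hmac = Hmac(sha256, utf8.encode(hmacSecret));
  final digest = hmac.convert(utf8.encode(requestBody));
  return digest.toString(); // hex-encoded signature
}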

Configuration Options

final config = SynheartConfig(
  // Module enables
  enableWear: true,
  enablePhone: true,
  enableBehavior: true,

  // Update intervals
  hsiUpdateInterval: Duration(seconds: 30),

  // Capability level (default: core)
  capabilityLevel: CapabilityLevel.core,

  // Debug mode
  debug: false,
);

await Synheart.initialize(
  userId: 'user_123',
  config: config,
);

Error Handling

try {
  await Synheart.initialize(
    userId: userId,
    config: config,
  );
} on PermissionDeniedError catch (e) {
  print('Permission denied: ${e.message}');
} on InitializationError catch (e) {
  print('Initialization failed: ${e.message}');
} on SynheartError catch (e) {
  print('SDK error: ${e.message}');
}

Consent Management

The SDK enforces consent at multiple levels:
// Check consent status
final consentStatus = await Synheart.getConsentStatus();
print('Wear module: ${consentStatus.wear}');
print('Cloud sync: ${consentStatus.cloud}');

// Request consent for specific modules
final granted = await Synheart.requestConsent(
  modules: [ConsentModule.wear, ConsentModule.cloud],
  reason: 'We need access to provide personalized insights',
);

if (granted) {
  await Synheart.enableCloud();
}

// Revoke consent
await Synheart.revokeConsent(ConsentModule.cloud);

Performance Considerations

  • HSI updates: ~80ms latency
  • CPU usage: ~1.5%
  • Memory footprint: ~12MB
  • Battery impact: ~0.3%/hr

You can check these figures at runtime:
// Check SDK health
final health = await Synheart.getHealth();
print('CPU: ${health.cpuUsage}%');
print('Memory: ${health.memoryUsage}MB');
print('Battery: ${health.batteryImpact}%/hr');

API Reference

Synheart

Main SDK class.

Static Methods:
Method               Description                         Returns
initialize()         Initialize the SDK                  Future<void>
enableFocus()        Enable focus module                 Future<void>
enableEmotion()      Enable emotion module               Future<void>
enableCloud()        Verify cloud connector is active    Future<void>
uploadNow()          Force immediate upload              Future<void>
flushUploadQueue()   Flush entire upload queue           Future<void>
disableCloud()       Disable cloud uploads               Future<void>
dispose()            Cleanup resources                   Future<void>
Streams:
Stream            Type                         Description
onHSVUpdate       Stream<HumanStateVector>     HSV state updates (every 30s)
onFocusUpdate     Stream<FocusState>           Focus state estimates
onEmotionUpdate   Stream<EmotionState>         Emotion state estimates

Author: Israel Goytom