
What is Synheart Core SDK?

Synheart Core SDK is the unified integration point for developers building on Synheart. It lets apps collect multimodal signals, process human state on-device via the synheart-runtime Rust engine (session-state-flux pipeline), generate focus and emotion signals, upload derived state snapshots (HSI 1.0 format) to the cloud with user consent, and visualize state dashboards.

Key Features

Unified API

  • Single SDK: One integration point for all Synheart features
  • Modular Design: Enable only the modules you need
  • Consistent Interface: Same API patterns across all platforms
  • Multi-Module Support: Wear, Phone, Behavior, HSI Runtime, Consent, Cloud

On-Device Processing

  • HSI Runtime: Powered by synheart-runtime (Rust) — on-device human state fusion via session-state-flux pipeline
  • Privacy-First: All processing happens on-device
  • Low Latency: ≤ 100ms HSI update latency
  • Efficient: < 2% CPU, < 15MB memory, < 0.5%/hr battery

Multi-Platform Support

| Platform | Language | Status | Repository |
|---|---|---|---|
| Flutter/Dart | Dart | Ready | synheart-core-sdk-dart |
| Android | Kotlin | Ready | synheart-core-sdk-kotlin |
| iOS | Swift | Ready | synheart-core-sdk-swift |

Capability System

Access levels tied to app signature and tenant ID:
| Module | Core | Extended | Research |
|---|---|---|---|
| Wear | Derived biosignals | Higher frequency | Raw streams |
| Phone | Motion, screen | Advanced app context | Full context |
| Behavior | Basic metrics | Extended metrics | Event-level streams |
| HSI Runtime | Basic state | Full embedding | Full fusion vectors |
| Connector | Ingest (HSI 1.0) | Extended endpoints | Research endpoints |
By default, third-party apps run under Core capabilities. Extended and Research capabilities are enabled only for first‑party Synheart apps and approved partners, since they may unlock higher-sensitivity data access and require additional privacy/security review and contractual controls.
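As a sketch of how an app might branch on its granted level at runtime — note that `Synheart.capabilityLevel` and the `CapabilityLevel` enum are illustrative assumptions, not confirmed SDK API; check the SDK reference for the actual names:

```dart
import 'package:synheart_core/synheart_core.dart';

// Hypothetical sketch: gate higher-sensitivity features behind the granted
// capability level. `Synheart.capabilityLevel` and `CapabilityLevel` are
// assumed names for illustration only.
Future<void> configureForGrantedLevel() async {
  final level = await Synheart.capabilityLevel; // granted at initialize()
  if (level == CapabilityLevel.extended || level == CapabilityLevel.research) {
    // Extended/Research apps may enable richer interpretation modules
    Synheart.activate(SynheartFeature.emotion);
  } else {
    // Core apps fall back to derived signals only
    Synheart.activate(SynheartFeature.focus);
  }
}
```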

Module System

The Core SDK consolidates all Synheart signal channels:
Synheart Core SDK

├── Wear Module
│      (HR, HRV, sleep, motion — derived signals only)

├── Phone Module
│      (motion, screen state, coarse app context)

├── Behavior Module (Synheart Behavior)
│      (interaction patterns: taps, scrolls, typing cadence)

├── HSI Runtime (On-device, synheart-runtime Rust engine)
│      - session → state → flux pipeline
│      - multimodal fusion into HSI JSON
│      - state axes & indices
│      - time windows (30s, 5m, 1h, 24h)
│      - 64D state embedding

├── Interpretation Modules (Optional)
│      ├── Synheart Emotion
│      │     (affect modeling - optional, explicit enable)
│      └── Synheart Focus
│            (engagement/focus estimation - optional, explicit enable)

├── Consent Module
│      (permissions, masking, enforcement)

└── Cloud Connector
       (secure, consent-gated uploads)
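To make the runtime's output concrete, here is an illustrative HSI 1.0-style snapshot. The field names are assumptions inferred from the axes, indices, windows, and 64-dimensional embedding described above — consult the HSI specification for the authoritative schema. The `embedding` array is truncated here; the runtime emits 64 floats.

```json
{
  "version": "hsi-1.0",
  "timestamp": "2025-01-15T10:30:00Z",
  "windows": ["30s", "5m", "1h", "24h"],
  "state": {
    "axes": { "arousal": 0.42, "valence": 0.18 },
    "indices": { "stress": 0.31, "engagement": 0.66 }
  },
  "embedding": [0.012, -0.087, 0.114]
}
```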

Quick Start

Flutter/Dart

Dart SDK Guide

Complete Flutter integration guide
Installation:
dependencies:
  synheart_core: ^1.1.0
Basic Usage:
import 'package:synheart_core/synheart_core.dart';

// Initialize Core SDK
await Synheart.initialize(
  userId: 'anon_user_123',
  config: SynheartConfig(
    allowUnsignedCapabilities: true,  // Use capabilityToken in production
  ),
);

// Subscribe to HSI updates (raw JSON from synheart-runtime)
Synheart.onHSIUpdate.listen((hsiJson) {
  print('HSI JSON: $hsiJson');
});

// Optional: Enable interpretation modules
Synheart.activate(SynheartFeature.focus);
Synheart.onFocusUpdate.listen((focus) {
  print('Focus Score: ${focus.estimate.score}');
});

Synheart.activate(SynheartFeature.emotion);
Synheart.onEmotionUpdate.listen((emotion) {
  print('Stress Index: ${emotion.stressIndex}');
});

// Enable cloud upload (with consent)
Synheart.activate(SynheartFeature.cloud);

Android (Kotlin)

Kotlin SDK Guide

Complete Android integration guide
Installation:
dependencies {
    implementation("ai.synheart:synheart-core:1.1.0")
}
Basic Usage:
import com.synheart.core.Synheart
import com.synheart.core.models.SynheartConfig

// Initialize
Synheart.initialize(
    context = context,
    userId = "anon_user_123",
    config = SynheartConfig(
        allowUnsignedCapabilities = true  // Use capabilityToken in production
    )
)

// Subscribe to HSI updates (raw JSON from synheart-runtime).
// Flow.collect suspends, so launch each subscription in its own coroutine
// ('scope' is any CoroutineScope, e.g. androidx lifecycleScope).
scope.launch {
    Synheart.onHSIUpdate.collect { hsiJson ->
        println("HSI JSON: $hsiJson")
    }
}

// Optional: Enable interpretation modules
Synheart.activate(SynheartFeature.FOCUS)
scope.launch {
    Synheart.onFocusUpdate.collect { focus ->
        println("Focus Score: ${focus.score}")
    }
}

Synheart.activate(SynheartFeature.CLOUD)

iOS (Swift)

Swift SDK Guide

Complete iOS integration guide
Installation:
dependencies: [
    .package(url: "https://github.com/synheart-ai/synheart-core-sdk-swift.git", from: "1.1.0")
]
Basic Usage:
import SynheartCore
import Combine

var cancellables = Set<AnyCancellable>()

// Initialize
try await Synheart.initialize(
    userId: "anon_user_123",
    config: SynheartConfig(
        allowUnsignedCapabilities: true  // Use capabilityToken in production
    )
)

// Subscribe to HSI updates (raw JSON from synheart-runtime)
Synheart.onHSIUpdate
    .sink { hsiJson in
        print("HSI JSON: \(hsiJson)")
    }
    .store(in: &cancellables)

// Optional: Enable interpretation modules
Synheart.activate(.focus)
Synheart.onFocusUpdate
    .sink { focus in
        print("Focus Score: \(focus.score)")
    }
    .store(in: &cancellables)

Synheart.activate(.cloud)

Privacy & Security

  • Zero Raw Content: No text content, microphone audio, URLs, or messages
  • On-Device Processing: All inference happens locally
  • No Raw Biosignals: Only derived signals are exposed externally
  • Consent-Gated: All cloud uploads require explicit consent
  • Capability-Enforced: Feature access tied to app signature and tenant ID
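Consent gating could look like the following sketch — `Synheart.requestConsent` and `ConsentScope` are assumed names for the Consent Module's API, so verify against the Consent System documentation before use:

```dart
import 'package:synheart_core/synheart_core.dart';

// Illustrative consent-gating sketch (hypothetical API names).
Future<void> enableCloudWithConsent() async {
  final granted = await Synheart.requestConsent(ConsentScope.cloudUpload);
  if (granted) {
    // Uploads carry only derived HSI snapshots, never raw biosignals
    Synheart.activate(SynheartFeature.cloud);
  }
}
```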

Performance Benchmarks

| Metric | Target | Typical |
|---|---|---|
| CPU Usage | < 2% | ~1.5% |
| Memory | < 15MB | ~12MB |
| Battery Impact | < 0.5%/hr | ~0.3%/hr |
| HSI Update Latency | ≤ 100ms | ~80ms |
| Cloud Upload Time | ≤ 80ms | ~60ms |
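A rough way to sanity-check the update-latency target on a given device is to time the interval between consecutive HSI updates. Note this measures inter-arrival time as seen by the app, not the runtime's internal processing latency:

```dart
import 'package:synheart_core/synheart_core.dart';

// Sketch: log the interval between consecutive HSI updates to compare
// against the ≤ 100ms target on real hardware.
void monitorHsiUpdateInterval() {
  final stopwatch = Stopwatch()..start();
  Synheart.onHSIUpdate.listen((_) {
    print('ms since last HSI update: ${stopwatch.elapsedMilliseconds}');
    stopwatch.reset();
  });
}
```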

Available SDKs

Dart SDK

Flutter/Dart integration

Kotlin SDK

Android/Kotlin integration

Swift SDK

iOS/Swift integration

Technical Documentation

Architecture

Module system and HSI Runtime

HSI Specification

State axes, indices, embeddings, and windows

Capability System

Access level enforcement

Consent System

Permission model and enforcement

Use Cases

Mental Health Platforms

Monitor physiological and behavioral signals for emotional state tracking:
// Initialize the SDK
await Synheart.initialize(
  userId: userId,
  config: SynheartConfig(
    allowUnsignedCapabilities: true,  // Use capabilityToken in production
  ),
);

Synheart.activate(SynheartFeature.emotion);

// Monitor stress indicators
Synheart.onEmotionUpdate.listen((emotion) {
  if (emotion.stressIndex > 0.7) {
    showStressAlert();
  }
});

Productivity Apps

Track focus and engagement for productivity optimization:
await Synheart.initialize(
  userId: userId,
  config: SynheartConfig(
    allowUnsignedCapabilities: true,  // Use capabilityToken in production
  ),
);

Synheart.activate(SynheartFeature.focus);

// Monitor focus state
Synheart.onFocusUpdate.listen((focus) {
  updateFocusScore(focus.estimate.score);
  if (focus.estimate.score < 0.3) {
    suggestBreak();
  }
});

Research Applications

Collect comprehensive human state data for research:
// Initialize with research capability token
await Synheart.initialize(
  userId: participantId,
  config: SynheartConfig(
    capabilityToken: researchToken,  // Research capability requires an authorized token
    capabilitySecret: researchSecret,
  ),
);

// Enable cloud upload for data collection
Synheart.activate(SynheartFeature.cloud);

Next Steps

Quick Start

Get started in 5 minutes

Architecture Guide

Deep dive into the system

HSI Specification

Understand Synheart Core’s internal state model

GitHub

View source code

Author: Israel Goytom
Organization: Synheart AI