Start building emotion-aware applications with Synheart in minutes. This guide shows you how to use Synheart CLI to generate mock HSI data, Synheart Core to process it, and Synheart Emotion to detect emotions.

Get started in three steps

Get your Synheart integration running and detect emotions from HSI data in real time.

Step 1: Install Synheart CLI and SDKs

First, install the Synheart CLI to generate mock HSI data for local development, then install the SDKs for your platform.

Install Synheart CLI

Install the CLI to generate mock HSI data:
git clone https://github.com/synheart-ai/synheart-cli
cd synheart-cli
make install
Ensure Go’s bin directory is in your PATH:
export PATH="$PATH:$HOME/go/bin"
Start the mock server:
synheart mock start
This starts a WebSocket server on ws://127.0.0.1:8787/hsi broadcasting HSI-compatible events.
The Synheart CLI WebSocket stream is a great way to validate HSI event production/consumption in your own tooling. The current mobile SDKs focus on on-device collection + inference and do not directly ingest the CLI WebSocket stream.
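If you want to sanity-check the stream before wiring up your own tooling, a few lines of Python are enough. The sketch below assumes the mock server emits JSON text frames (verify against your CLI version) and uses the third-party websockets package:

# Minimal sketch: inspect the CLI's HSI stream (assumes JSON text frames).
import asyncio
import json

import websockets  # pip install websockets

async def main():
    async with websockets.connect("ws://127.0.0.1:8787/hsi") as ws:
        async for message in ws:
            event = json.loads(message)  # each frame should be one HSI-compatible event
            print(event)

asyncio.run(main())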

Install SDKs

Choose your platform and install the necessary SDKs.

Flutter

Prerequisites

  • Flutter >= 3.22.0
  • Dart >= 3.0.0

Installation

Add to your pubspec.yaml:
dependencies:
  synheart_core: ^0.0.1
  synheart_emotion: ^0.2.3
  # Optional (recommended for real wearable biosignals)
  synheart_wear: ^0.2.1
Then fetch the packages:
flutter pub get

Android (Kotlin)

Prerequisites

  • Android SDK API 21+
  • Kotlin 1.8+

Installation

Add to your build.gradle.kts:
dependencies {
    // Synheart Emotion (published)
    implementation("ai.synheart:emotion:0.1.0")

    // Synheart Core (Kotlin) is currently most reliably consumed as a local module.
    // See the repo README for setup options.
}

iOS (Swift)

Prerequisites

  • iOS 13.0+
  • Swift 5.9+
  • Xcode 15.0+

Installation

Swift Package Manager:
dependencies: [
    .package(url: "https://github.com/synheart-ai/synheart-core-swift.git", from: "0.1.0"),
    .package(url: "https://github.com/synheart-ai/synheart-emotion-ios.git", from: "0.1.0")
]

Python

Prerequisites

  • Python >= 3.8

Installation

pip install synheart-emotion
Note: For mobile apps, use Flutter/Kotlin/Swift. Python SDKs are primarily for backend/CLI use.

Step 2: Initialize SDKs

Initialize Synheart Core (optional) and Synheart Emotion (works standalone).
Flutter:
import 'package:synheart_emotion/synheart_emotion.dart';

void main() async {
  // Initialize Synheart Emotion (standalone)
  final emotionEngine = EmotionEngine.fromPretrained(
    const EmotionConfig(
      window: Duration(seconds: 60), // length of the biosignal window analyzed per inference
      step: Duration(seconds: 5), // how often a new window (and result) is produced
    ),
  );
}
Kotlin:
import com.synheart.emotion.EmotionConfig
import com.synheart.emotion.EmotionEngine
import java.util.Date

// Initialize Synheart Emotion (standalone)
val emotionEngine = EmotionEngine.fromPretrained(EmotionConfig())
Swift:
import SynheartEmotion

let emotionEngine = try EmotionEngine.fromPretrained(config: EmotionConfig())

Step 3: Stream biosignals and detect emotions

Feed heart rate and RR intervals into Synheart Emotion to detect emotions in real time.
Flutter:
// Push biosignal samples into the engine (example values)
emotionEngine.push(
  hr: 72.0,
  rrIntervalsMs: [823.0, 810.0, 798.0, 815.0, 820.0],
  timestamp: DateTime.now().toUtc(),
);

final results = emotionEngine.consumeReady();
if (results.isNotEmpty) {
  final result = results.first;
  print('Emotion: ${result.emotion} (${(result.confidence * 100).toStringAsFixed(1)}%)');
}
Kotlin:
emotionEngine.push(
    hr = 72.0,
    rrIntervalsMs = listOf(850.0, 820.0, 830.0, 815.0, 828.0),
    timestamp = Date()
)

val results = emotionEngine.consumeReady()
if (results.isNotEmpty()) {
    val result = results.first()
    println("Emotion: ${result.emotion} (${(result.confidence * 100).toInt()}%)")
}
Swift:
emotionEngine.push(
    hr: 72.0,
    rrIntervalsMs: [850.0, 820.0, 830.0, 815.0, 828.0],
    timestamp: Date()
)

let results = emotionEngine.consumeReady()
if let result = results.first {
    print("Emotion: \(result.emotion) (\(Int(result.confidence * 100))%)")
}
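Python (hypothetical): this guide doesn't show the Python package's public API, so the sketch below simply mirrors the mobile SDK flow from Steps 2 and 3. Treat EmotionEngine.from_pretrained, push, and consume_ready as assumed names and verify them against the synheart-emotion package docs.

# Hypothetical mirror of the mobile SDK flow; names are assumptions.
from datetime import datetime, timezone

from synheart_emotion import EmotionConfig, EmotionEngine  # assumed import path

engine = EmotionEngine.from_pretrained(EmotionConfig())  # assumed constructor

# Push biosignal samples (example values).
engine.push(
    hr=72.0,
    rr_intervals_ms=[850.0, 820.0, 830.0, 815.0, 828.0],
    timestamp=datetime.now(timezone.utc),
)

for result in engine.consume_ready():  # assumed accessor
    print(f"Emotion: {result.emotion} ({result.confidence * 100:.1f}%)")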

How It Works

  1. Synheart CLI generates realistic mock HSI data and streams it via WebSocket
  2. Synheart Core processes HSI snapshots and makes them available to downstream consumers (the current mobile SDKs collect signals on-device rather than ingesting the CLI stream directly)
  3. Synheart Emotion receives HSI data and detects emotions using on-device ML models (see the end-to-end sketch below)
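Putting the three pieces together, a hedged end-to-end sketch in Python might look like the following. It builds on the assumptions from the two sketches above: JSON frames on the mock stream, hr and rr_intervals_ms event field names, and the hypothetical Python engine API.

# End-to-end sketch: CLI mock stream -> emotion detection.
# ASSUMPTIONS: JSON events carrying "hr" and "rr_intervals_ms" fields, and the
# hypothetical Python engine API sketched above. Verify both before relying on this.
import asyncio
import json
from datetime import datetime, timezone

import websockets  # pip install websockets

from synheart_emotion import EmotionConfig, EmotionEngine  # assumed import path

async def main():
    engine = EmotionEngine.from_pretrained(EmotionConfig())  # assumed constructor
    async with websockets.connect("ws://127.0.0.1:8787/hsi") as ws:
        async for message in ws:
            event = json.loads(message)
            engine.push(
                hr=event["hr"],  # assumed field name
                rr_intervals_ms=event["rr_intervals_ms"],  # assumed field name
                timestamp=datetime.now(timezone.utc),
            )
            for result in engine.consume_ready():  # assumed accessor
                print(f"Emotion: {result.emotion} ({result.confidence * 100:.1f}%)")

asyncio.run(main())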

Integration Benefits

Privacy-First

All processing happens on-device. No biosignal data is sent to servers.

Real-Time

Sub-5ms inference latency for instant emotional state updates.

Research-Grade

Tested and validated on real-world datasets with proven accuracy.

Next steps


Author: Israel Goytom