
What is Synheart Emotion?

Synheart Emotion is a multi-platform SDK for inferring momentary emotions from biosignals (heart rate and RR intervals) directly on device, preserving privacy and delivering real-time performance.

Supported Emotions (see the snippet after this list):
  • 😊 Amused: Positive, engaged emotional state
  • 😌 Calm: Relaxed, peaceful emotional state
  • 😰 Stressed: Anxious, tense emotional state
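Each class maps to a member of the Emotion enum. Below is a minimal Python sketch of branching on a predicted class; the AMUSED and CALM member names are assumptions here, only STRESSED appears in the examples later in this page:
from synheart_emotion import Emotion

def describe(emotion: Emotion) -> str:
    # Map each supported class to its short description
    if emotion == Emotion.AMUSED:
        return "Positive, engaged"
    if emotion == Emotion.CALM:
        return "Relaxed, peaceful"
    return "Anxious, tense"  # Emotion.STRESSED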

Key Features

On-Device Processing

  • All inference happens locally on your device
  • No network calls required
  • No raw biometric data leaves the device
  • Privacy-first by design

Real-Time Performance

  • < 5ms inference latency on mid-range devices
  • < 3 MB memory footprint (engine + buffers)
  • < 2% CPU usage during active streaming
  • < 100 KB model size

Research-Based

  • Trained on WESAD dataset (wearable stress and affect detection)
  • 78% accuracy on 3-class emotion recognition
  • Linear SVM model with feature engineering
  • Reproducible with published model card

Multi-Platform

Platform       SDK                  Installation                       Version   Status
Python         synheart-emotion     pip install synheart-emotion       0.1.0     ✅ Ready
Dart/Flutter   synheart_emotion     flutter pub add synheart_emotion   0.2.1     ✅ Ready
Kotlin         ai.synheart:emotion  JitPack                            0.1.0     ✅ Ready
Swift          SynheartEmotion      Swift Package Manager              0.1.0     ✅ Ready

Architecture

All SDKs implement the same architecture:
Wearable / Sensor
   └─(HR bpm, RR ms)──► Your App
                            │
                            ▼
                 Synheart Emotion SDK
   [Ring Buffer] → [Feature Extraction] → [Normalization]
                                                │
                                                ▼
                                             [Model]
                                                │
                                                ▼
                                          EmotionResult

Components:
  • Ring Buffer: Holds last 60s of HR/RR data (configurable)
  • Feature Extractor: Computes HR mean, SDNN, and RMSSD (sketched after this list)
  • Scaler: Standardizes features using training μ/σ
  • Model: Linear SVM (One-vs-Rest) with softmax
  • Emitter: Throttles outputs (default: every 5s)
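To make the pipeline concrete, here is a minimal, illustrative Python sketch of the feature-extraction stage using standard HRV definitions (the SDK's internal function names and exact artifact handling may differ):
import math

def extract_features(hr_samples, rr_intervals_ms):
    # Features over the current rolling window: [hr_mean, sdnn, rmssd]
    hr_mean = sum(hr_samples) / len(hr_samples)

    # SDNN: standard deviation of the RR intervals (ms)
    rr_mean = sum(rr_intervals_ms) / len(rr_intervals_ms)
    sdnn = math.sqrt(sum((rr - rr_mean) ** 2 for rr in rr_intervals_ms) / len(rr_intervals_ms))

    # RMSSD: root mean square of successive RR-interval differences (ms)
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    rmssd = math.sqrt(sum(d ** 2 for d in diffs) / len(diffs))

    return [hr_mean, sdnn, rmssd]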

Quick Start Examples

from datetime import datetime
from synheart_emotion import EmotionEngine, EmotionConfig

# Initialize engine
config = EmotionConfig()
engine = EmotionEngine.from_pretrained(config)

# Push biosignal data
engine.push(
    hr=72.0,
    rr_intervals_ms=[850.0, 820.0, 830.0, 845.0, 825.0],
    timestamp=datetime.now()
)

# Get inference results
results = engine.consume_ready()
for result in results:
    print(f"Emotion: {result.emotion} ({result.confidence:.1%})")
Full Python SDK Guide →

Use Cases

Mental Health Apps

Monitor stress levels in real-time:
from synheart_emotion import EmotionEngine, Emotion

engine = EmotionEngine.from_pretrained()

# Stream biosignals from wearable
for biosignal in wearable_stream():
    engine.push(
        hr=biosignal.heart_rate,
        rr_intervals_ms=biosignal.rr_intervals,
        timestamp=biosignal.timestamp
    )

    results = engine.consume_ready()
    for result in results:
        if result.emotion == Emotion.STRESSED and result.confidence > 0.7:
            send_notification("Consider taking a break")

Wellness Coaching

Track emotional patterns throughout the day:
// In your Flutter wellness app
final emotionEngine = EmotionEngine.fromPretrained();

// Integrate with Synheart Wear
synheartWear.streamHRV(windowSize: Duration(seconds: 60))
  .listen((metrics) {
    emotionEngine.push(
      hr: metrics.getMetric(MetricType.hr)!,
      rrIntervalsMs: metrics.rrIntervals,
      timestamp: DateTime.now(),
    );

    final results = emotionEngine.consumeReady();
    if (results.isNotEmpty) {
      updateDashboard(results.first);
    }
  });

Research Applications

Collect emotion data for scientific studies:
// Android research app
val emotionEngine = EmotionEngine.fromPretrained(
    EmotionConfig(
        windowDuration = 60_000L, // 60 seconds
        stepDuration = 5_000L      // 5 seconds
    )
)

// Biosignal samples are pushed elsewhere in the app (see the examples above);
// here, log every emitted result for analysis
val results = emotionEngine.consumeReady()
results.forEach { result ->
    database.insert(
        timestamp = result.timestamp,
        emotion = result.emotion,
        confidence = result.confidence
    )
}

Model Details

Model Type: Linear SVM (One-vs-Rest)
Task: Momentary emotion recognition from HR/RR
Input Features: [hr_mean, sdnn, rmssd] over a 60s rolling window

Performance:
  • Accuracy: ~78%
  • Macro-F1: ~72%
  • Latency: < 5ms on modern mid-range devices
The model is trained on a WESAD-derived 3-class subset with artifact rejection and normalization (see the scoring sketch below).

View Model Card →
View Technical Specification (RFC E1.1) →
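To make the scoring step concrete, here is a minimal NumPy sketch: standardize the three features with the training μ/σ, compute one linear decision score per class, and softmax the scores into confidences. The scaler and weight values below are placeholders, not the shipped model parameters:
import numpy as np

# Placeholder parameters; the real μ/σ, weights, and biases ship with the pretrained model
MU = np.array([70.0, 50.0, 40.0])      # training means of [hr_mean, sdnn, rmssd]
SIGMA = np.array([10.0, 20.0, 25.0])   # training standard deviations
W = np.zeros((3, 3))                   # one weight row per class (amused, calm, stressed)
B = np.zeros(3)                        # one bias per class

def score(features):
    # Standardize with training μ/σ, then compute one-vs-rest linear decision scores
    z = (np.asarray(features) - MU) / SIGMA
    scores = W @ z + B
    # Softmax the decision scores into per-class confidences
    exp = np.exp(scores - scores.max())
    return exp / exp.sum()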

API Parity

All SDKs expose identical functionality:
Feature              Python   Kotlin   Swift   Dart
EmotionConfig        ✅       ✅       ✅      ✅
EmotionEngine        ✅       ✅       ✅      ✅
EmotionResult        ✅       ✅       ✅      ✅
EmotionError         ✅       ✅       ✅      ✅
Feature Extraction   ✅       ✅       ✅      ✅
Linear SVM Model     ✅       ✅       ✅      ✅
Thread-Safe          ✅       ✅       ✅      ✅
Sliding Window       ✅       ✅       ✅      ✅

Privacy & Security

  • On-Device Processing: All emotion inference happens locally
  • No Data Retention: Raw biometric data is not retained after processing
  • No Network Calls: No data is sent to external servers
  • Privacy-First Design: No built-in storage; you control what gets persisted (see the sketch after this list)
  • Not a Medical Device: This library is for wellness and research purposes only
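Because the SDK stores nothing itself, persistence is entirely in your hands. Here is a minimal, hypothetical sketch that writes only derived results to SQLite, reusing the engine from the Quick Start (the table layout is illustrative):
import sqlite3

db = sqlite3.connect("emotions.db")
db.execute("CREATE TABLE IF NOT EXISTS results (ts TEXT, emotion TEXT, confidence REAL)")

for result in engine.consume_ready():
    # Persist only the derived result; raw HR/RR samples stay inside the engine's buffer
    db.execute(
        "INSERT INTO results VALUES (?, ?, ?)",
        (result.timestamp.isoformat(), str(result.emotion), result.confidence),
    )
db.commit()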

Resources

Citation

If you use this SDK in your research:
@software{synheart_emotion,
  title = {Synheart Emotion: Multi-platform SDK for on-device emotion inference from biosignals},
  author = {Synheart AI Team},
  year = {2025},
  version = {0.1.0},
  url = {https://github.com/synheart-ai/synheart-emotion}
}

Author: Israel Goytom
Made with ❤️ by the Synheart AI Team