
Documentation Index

Fetch the complete documentation index at: https://docs.synheart.ai/llms.txt

Use this file to discover all available pages before exploring further.

What is Synheart Behavior?

Synheart Behavior is a multi-platform SDK for inferring behavioral signals from digital interactions (scroll, tap, swipe, typing, notifications, calls) directly on device, ensuring privacy and real-time performance.

Supported Motion States:
  • 🛌 LAYING
  • 🚶 MOVING
  • 🪑 SITTING
  • 🧍 STANDING

Key Features

On-Device Processing

  • All inference happens locally on your device
  • No network calls required
  • No raw interaction data leaves the device
  • Privacy-first by design

On-Demand Metrics Calculation

  • Calculate behavioral metrics for custom time ranges within sessions
  • Query metrics for specific time periods without ending the session
  • Supports both active and ended sessions
  • Automatic validation ensures time ranges are within session bounds
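As a sketch of querying a recent window mid-session, using the Dart `calculateMetricsForTimeRange` API shown later on this page (the return-map keys follow the example there; exact shapes are documented in the SDK reference):

```dart
// Sketch: query the last 60 seconds without ending the session.
// The range must lie within the session's bounds, or validation rejects it.
final session = await behavior.startSession();
// ... user interacts ...
final now = DateTime.now().millisecondsSinceEpoch ~/ 1000;
final metrics = await behavior.calculateMetricsForTimeRange(
  startTimestampSeconds: now - 60,
  endTimestampSeconds: now,
  // sessionId is optional; omitted here, so the active session is assumed
);
print('Intensity: ${metrics['behavioral_metrics']['interaction_intensity']}');
```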

Privacy-First Design

  • No PII: No names, contacts, or user-identifying data
  • No content capture: No text, images, or semantic data
  • No keystroke logging: Text input captured only as abstract tap events
  • Event-level metadata only: Timing and physical metrics only

Multi-Platform

| Platform | SDK | Installation | Version | Status |
| --- | --- | --- | --- | --- |
| Flutter | synheart_behavior | flutter pub add synheart_behavior | 0.2.1 | ✅ Ready |
| Kotlin/Android | ai.synheart:behavior | Gradle: implementation 'ai.synheart:behavior:0.4.1' | 0.4.1 | ✅ Ready |
| Swift/iOS | SynheartBehavior | Swift Package: .package(url: "https://github.com/synheart-ai/synheart-behavior-swift.git", from: "0.2.0") | 0.3.0 | ✅ Ready |

Architecture

Behavior SDKs are input layers: they capture interaction events and stream them out. Higher-level metrics (focus hint, distraction score, etc.) are computed downstream by the Synheart runtime when behavior is fed through Synheart Core.
User Interactions
   └─(scroll, tap, swipe, typing, notifications, calls)──► Synheart Behavior SDK
            [Event Collector] → [Session Aggregator]
                  │                       │
            [ML Motion Model]      BehaviorEvents + Summary
                  │                       │
            MotionState                   ▼
                  │                  Your App  (or Synheart Core
                  │                            for HSI fusion)
                  └──────────────────────►
Components:
  • Event Collectors: Capture scroll, tap, swipe, app_switch, notification, call, typing, and clipboard events.
  • Session Aggregator: Aggregates events into session-level summaries on-device.
  • Motion State Inference: On-device ML model for activity recognition (LAYING, MOVING, SITTING, STANDING).
Behavior is content-free: only timing and physical metadata. Use the SDK standalone for raw events, or feed it into Synheart Core to fuse with biosignals into HSI.
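Used standalone, the event stream can be routed by type in your app. A minimal sketch, assuming the event type names from the collector list above are exposed via `eventType` (whether this is a string or an enum varies by platform; check the per-platform pages):

```dart
// Sketch: route raw BehaviorEvents by type. Only timing and physical
// metadata are present on each event; no content is ever captured.
behavior.onEvent.listen((event) {
  switch (event.eventType) {
    case 'scroll':
    case 'swipe':
      // gesture metadata only (timestamps, physical metrics)
      break;
    case 'app_switch':
      // feed into your own task-switch bookkeeping
      break;
    default:
      // tap, typing, notification, call, clipboard, ...
      break;
  }
});
```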

Quick Start Examples

import 'package:synheart_behavior/synheart_behavior.dart';

final behavior = await SynheartBehavior.initialize(
  config: const BehaviorConfig(
    enableInputSignals: true,
    enableAttentionSignals: true,
    enableMotionLite: true,
  ),
);

// Wrap app to enable gesture tracking
return behavior.wrapWithGestureDetector(
  MaterialApp(...),
);

// Listen to real-time events
behavior.onEvent.listen((event) {
  print('Event: ${event.eventType} at ${event.timestamp}');
});

// Start a session
final session = await behavior.startSession();
Full Flutter SDK Guide →

Use Cases

Focus and Distraction Apps

Monitor user focus in real-time:
final behavior = await SynheartBehavior.initialize(
  config: const BehaviorConfig(
    enableInputSignals: true,
    enableAttentionSignals: true,
  ),
);

behavior.onEvent.listen((event) {
  // Track interaction patterns
});

final session = await behavior.startSession();
// ... user interacts with app ...
final summary = await session.end();

if (summary.behavioralMetrics.focusHint > 0.7) {
  print('User is highly focused');
} else if (summary.behavioralMetrics.behavioralDistractionScore > 0.6) {
  print('User appears distracted');
}

Digital Wellness Analytics

Track behavioral patterns throughout the day:
// Start session when user opens app
final session = await behavior.startSession();

// End session when user closes app
final summary = await session.end();

// Analyze session metrics
print('Interaction Intensity: ${summary.behavioralMetrics.interactionIntensity}');
print('Deep Focus Blocks: ${summary.behavioralMetrics.deepFocusBlocks.length}');
print('Task Switch Rate: ${summary.behavioralMetrics.taskSwitchRate}');

On-Demand Metrics for Time Ranges

Calculate metrics for specific time periods within a session:
// Calculate metrics for a custom time range
final metrics = await behavior.calculateMetricsForTimeRange(
  startTimestampSeconds: 1767688063,  // Unix timestamp in seconds
  endTimestampSeconds: 1767688130,     // Unix timestamp in seconds
  sessionId: 'SESS-1767688063415',     // Optional: session ID
);

// Access calculated metrics
print('Events in range: ${metrics['activity_summary']['total_events']}');
print('Interaction intensity: ${metrics['behavioral_metrics']['interaction_intensity']}');
print('Distraction score: ${metrics['behavioral_metrics']['behavioral_distraction_score']}');

// Motion state (if available)
if (metrics['motion_state'] != null) {
  print('Motion state: ${metrics['motion_state']['major_state']}');
}

Cognitive Load Estimation

Estimate cognitive load from interaction patterns:
final summary = await session.end();

// High interaction intensity + low fragmentation = high cognitive load
final cognitiveLoad = summary.behavioralMetrics.interactionIntensity *
                     (1 - summary.behavioralMetrics.fragmentedIdleRatio);

if (cognitiveLoad > 0.8) {
  // Suggest taking a break
}

Behavioral Metrics

Each BehaviorSessionSummary carries two metric bundles. Per-platform pages document the exact field types — the canonical set is small.

BehavioralMetrics

  • interactionIntensity (0–1): Overall interaction rate and engagement.
  • taskSwitchRate: Frequency of app switching.
  • taskSwitchCost: Penalty for fragmented switching.
  • idleTimeRatio (0–1) / activeTimeRatio (0–1).
  • fragmentedIdleRatio (0–1): Ratio of fragmented vs continuous idle periods.
  • burstiness: Temporal clustering of interaction events.
  • behavioralDistractionScore (0–1): Distraction proxy.
  • focusHint (0–1): Focus-quality proxy.
  • deepFocusBlocks: Sustained-engagement intervals.
  • notificationLoad: Notification pressure proxy.
  • scrollJitterRate: Scroll-pattern irregularity.
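As a sketch, the bundle can be folded into a compact end-of-session report. Field names are as listed above; the `summary` comes from `session.end()` as in the earlier examples, and the formatting choices are illustrative:

```dart
// Sketch: compact report from a BehaviorSessionSummary's metric bundle.
final m = summary.behavioralMetrics;
final report = StringBuffer()
  ..writeln('active ${(m.activeTimeRatio * 100).toStringAsFixed(0)}% / '
      'idle ${(m.idleTimeRatio * 100).toStringAsFixed(0)}%')
  ..writeln('focus ${m.focusHint.toStringAsFixed(2)} '
      '(${m.deepFocusBlocks.length} deep-focus blocks)')
  ..writeln('switching: rate ${m.taskSwitchRate}, cost ${m.taskSwitchCost}');
print(report);
```

Because the bounded metrics all live in [0, 1], they can be compared or combined directly without per-session normalization.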

TypingSessionSummary (when typing events occurred)

  • typingSpeed, typingCadenceStability, typingCadenceVariability.
  • typingActivityRatio, typingGapRatio, typingBurstiness, typingInteractionIntensity.
  • clipboardActivityRate: (copy + paste + cut) / (typing taps + clipboard actions).
  • correctionRate: (backspace + delete) / (typing taps + backspace + delete).
All metrics are bounded and numerically stable.
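The two ratio formulas above can be checked with plain arithmetic. A worked example with hypothetical counts (note that clipboard actions = copy + paste + cut):

```dart
// Hypothetical event counts for one typing session:
const typingTaps = 120;
const backspace = 9, deleteKey = 1;
const copy = 2, paste = 3, cut = 0;

// (copy + paste + cut) / (typing taps + clipboard actions)
final clipboardActivityRate =
    (copy + paste + cut) / (typingTaps + copy + paste + cut); // 5/125 = 0.04

// (backspace + delete) / (typing taps + backspace + delete)
final correctionRate =
    (backspace + deleteKey) / (typingTaps + backspace + deleteKey); // 10/130 ≈ 0.077
```

Both denominators include the numerator's counts, which is what keeps the ratios bounded in [0, 1].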

Motion State Inference

When enableMotionLite is enabled, the SDK uses an on-device ML model (LinearSVC, one-vs-rest) to predict motion states from accelerometer and gyroscope features. The native (Kotlin) layer ships the model directly; the Flutter SDK forwards motion classification to the engine runtime instead of surfacing it on the session summary.

Native (Kotlin / Swift): the motion state is available on the session summary:
val summary = session.end()
summary.motionState?.let {
    println("Motion: ${it.majorState} (confidence ${it.confidence})")
}
// States: laying, moving, sitting, standing

Flutter: motion classification is performed by the Synheart runtime when behavior is fed through Synheart Core. It does not appear on BehaviorSessionSummary. Use Synheart.onHSIUpdate to read the runtime’s motion-state output.

API Features

The SDK provides the following functionality:
  • BehaviorConfig
  • SynheartBehavior
  • BehaviorEvent
  • BehaviorSession
  • Motion State Inference
  • Real-Time Event Streaming
  • Session Management
  • On-Demand Metrics Calculation
  • Thread-Safe
  • Automatic Session Ending

Available SDKs

Flutter SDK

Cross-platform mobile apps (iOS + Android)

Kotlin/Android SDK

Native Android applications

Swift/iOS SDK

Native iOS applications

Privacy & Security

  • On-Device Processing: All behavioral inference happens locally
  • No Data Retention: Raw interaction events are not retained after processing
  • No Network Calls: No data is sent to external servers
  • Privacy-First Design: No built-in storage; you control what gets persisted
  • No Content Capture: No text, images, or semantic data collected
  • Event-Level Metadata Only: Only timing and physical metrics

Resources

Citation

If you use this SDK in your research:
@software{synheart_behavior,
  title = {Synheart Behavior: Multi-platform SDK for on-device behavioral signal inference from digital interactions},
  author = {Synheart AI Team},
  year = {2025},
  version = {0.2.1},
  url = {https://github.com/synheart-ai/synheart-behavior-flutter}
}