The Human State Interface (HSI) is the canonical interchange format for human-state outputs. It is a separate, language-agnostic specification (github.com/synheart-ai/hsi, currently version 1.3) that is not owned by this SDK. In the Synheart engine pipeline, HSI is the wire format that flux produces from HSV (the typed inference output of the Synheart Runtime). The SDK is a consumer of HSI: it receives HSI JSON from the Synheart runtime (which embeds the engine pipeline) and exposes it as typed events. For the canonical contract, see the HSI overview on this site or the hsi repository.

Pipeline recap

feature pipeline ──► FeatureSet
inference engine ──► HSV[] + 64D embedding   (see HSV Specification)
flux            ──► HSI 1.3 JSON payload
Synheart runtime broadcasts the HSI JSON
SDK exposes:    Synheart.onHSIUpdate (raw JSON), Synheart.onStateUpdate (typed)
The SDK does not produce HSI itself — flux does. The SDK does not validate HSI either; it parses leniently and exposes a typed view. For the upstream typed shape, see HSV Specification.
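For orientation, a hypothetical, heavily abbreviated HSI envelope might look like the sketch below. The field names follow the extraction rules described later on this page; the exact nesting (in particular how axis readings are grouped under domains) is governed by the canonical schema in the hsi repository, not by this sketch:

```json
{
  "hsi_version": "1.3",
  "subject_id": "sub-123",
  "timestamp_ms": 1700000000000,
  "hsi": {
    "focus":   { "value": 0.62, "confidence": 0.81 },
    "arousal": { "value": 0.44, "confidence": 0.73 }
  },
  "meta": {
    "provenance": {
      "sources": [
        { "signals": ["hr", "hrv"], "source_tier": 2 }
      ]
    },
    "synheart": { "tiers": { "kinematic": 1, "digital": 2 } }
  }
}
```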

Where HSI enters Synheart Core

flux emits HSI window
        │
Synheart runtime broadcasts the window (raw JSON + timestamp)
        │
SDK listener (registered by Synheart.initialize)
        ├── Synheart.onHSIUpdate     → Stream<String>      (raw JSON)
        └── Synheart.onStateUpdate   → Stream<HSIState>    (typed)

Typed view: HSIState

class HSIState {
  final String subjectId;
  final int timestampMs;
  final HSIAxes hsi;
  final Modalities modalities;
  final Tiers tiers;
  final String rawJson;       // preserved for diagnostic use
}
HSIState.fromJson(json) extracts:
  • timestampMs from timestamp_ms or observed_at_ms.
  • subjectId from subject_id.
  • hsi from map['hsi'] (or the top-level when the engine wraps differently).
  • modalities derived from meta.provenance.sources[*].signals per the Synheart signal-modality mapping contract.
  • tiers derived from per-source source_tier (physiological) and meta.synheart.tiers.{kinematic, digital}.
rawJson is preserved on every state so apps that need full payload access can read it directly.
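The timestamp fallback can be sketched as follows (illustrative helpers, not the SDK source; the real logic lives inside HSIState.fromJson):

```dart
import 'dart:convert';

// Hypothetical helpers mirroring the lenient extraction described above:
// prefer timestamp_ms, fall back to observed_at_ms, and tolerate absence.
int? extractTimestampMs(Map<String, dynamic> map) =>
    (map['timestamp_ms'] ?? map['observed_at_ms']) as int?;

String? extractSubjectId(Map<String, dynamic> map) =>
    map['subject_id'] as String?;

void main() {
  final map = jsonDecode(
          '{"subject_id":"sub-1","observed_at_ms":1700000000000}')
      as Map<String, dynamic>;
  print(extractTimestampMs(map)); // 1700000000000 (observed_at_ms fallback)
  print(extractSubjectId(map));   // sub-1
}
```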

Axes (HSIAxes)

The current SDK emits four axis-shaped accessors:
class HSIAxes {
  final HSIAxisValue? focus;
  final HSIAxisValue? arousal;
  final HSIAxisValue? capacity;
  final HSIAxisValue? sleep;
}

class HSIAxisValue {
  final double value;       // 0.0..1.0
  final double confidence;  // 0.0..1.0
}
These map onto the HSV heads emitted upstream:
| HSI axis (`HSIAxes` field) | HSV head (`HsvType`) |
| --- | --- |
| `focus` | `Focus` |
| `arousal` | `Emotion.arousal` (one component of the Multiscalar value) |
| `capacity` | `Capacity` |
| `sleep` | `Sleep` |
The HSI 1.3 canonical domain set is physiological / kinematic / digital / cognitive / affective — those describe the modalities that produced a reading, not the head identity. Head names (focus, capacity, etc.) appear as axis_reading.axis strings under those domains. Missing axes are null (not zero).
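Because missing axes are null rather than zero, reads should be null-aware. A minimal standalone sketch, re-declaring the classes above so it runs on its own:

```dart
// Sketch only: these declarations mirror the HSIAxes / HSIAxisValue
// shapes shown above, reduced to two axes for brevity.
class HSIAxisValue {
  final double value;       // 0.0..1.0
  final double confidence;  // 0.0..1.0
  const HSIAxisValue(this.value, this.confidence);
}

class HSIAxes {
  final HSIAxisValue? focus;
  final HSIAxisValue? arousal;
  const HSIAxes({this.focus, this.arousal});
}

void main() {
  const axes = HSIAxes(focus: HSIAxisValue(0.62, 0.81)); // arousal absent
  // Missing axes are null, not zero, so use null-aware access:
  print(axes.focus?.value);    // 0.62
  print(axes.arousal == null); // true
}
```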

Modalities

class Modalities {
  final bool physiological;
  final bool kinematic;
  final bool digital;
}
Use these to gate UX — for example, drop physiology copy when physiological == false.
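A minimal sketch of that gating, using a hypothetical statusCopy helper (the Modalities shape mirrors the class above):

```dart
// Sketch: choose user-facing copy based on which modalities contributed.
class Modalities {
  final bool physiological;
  final bool kinematic;
  final bool digital;
  const Modalities(this.physiological, this.kinematic, this.digital);
}

String statusCopy(Modalities m) {
  if (!m.physiological) {
    // No physiological source contributed: avoid heart-rate wording.
    return 'Derived from motion and device use';
  }
  return 'Derived from heart-rate and other physiological signals';
}

void main() {
  print(statusCopy(const Modalities(false, true, true)));
  // → Derived from motion and device use
}
```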

Tiers

class Tiers {
  final int? physiological;  // 1..=4 from source.source_tier (HSV's source_tier)
  final int? kinematic;      // 1..=3 from meta.synheart.tiers.kinematic
  final int? digital;        // 1..=3 from meta.synheart.tiers.digital
}
Lower number = higher fidelity. The tier values originate on the upstream Hsv.tiers (TierBundle); flux carries them into the HSI envelope.

Modality / tier derivation in the SDK

Signal-name → modality mapping (used by HSIState.fromJson when reading flux output):
| Signal name | Modality |
| --- | --- |
| `hr`, `hrv`, `rr`, `ecg`, `spo2`, `respiration`, `temperature`, `eda` | physiological |
| `accel`, `gyro`, `motion`, `posture`, `step` | kinematic |
| `tap`, `scroll`, `swipe`, `key`, `notification`, `app`, `screen` | digital |
Source tier walk: the physiological tier is the highest-numbered (worst) tier across sources emitting any physiological signal, so the lowest-fidelity source wins for the modality.
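The walk can be sketched as below, with a hypothetical Source type standing in for the per-source provenance entries:

```dart
// Sketch of the "worst-numbered tier wins" walk; higher number = lower
// fidelity. Source is a stand-in for meta.provenance.sources entries.
class Source {
  final List<String> signals;
  final int sourceTier;
  const Source(this.signals, this.sourceTier);
}

const physiologicalSignals = {
  'hr', 'hrv', 'rr', 'ecg', 'spo2', 'respiration', 'temperature', 'eda'
};

int? physiologicalTier(List<Source> sources) {
  int? worst;
  for (final s in sources) {
    if (s.signals.any(physiologicalSignals.contains)) {
      // Keep the highest (worst) tier number seen for the modality.
      if (worst == null || s.sourceTier > worst) worst = s.sourceTier;
    }
  }
  return worst; // null when no physiological source contributed
}

void main() {
  final tier = physiologicalTier(const [
    Source(['hr'], 1),
    Source(['eda'], 3),
    Source(['accel'], 2), // kinematic only; ignored here
  ]);
  print(tier); // 3 — the lowest-fidelity physiological source wins
}
```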

Validation

The SDK does not enforce HSI schema validation — that lives with the producer (flux) and any cloud consumer. Apps that need strict validation can cross-check rawJson against hsi-1.3.schema.json before consuming. The HSI repo defines two validation levels:
  • HSI-VALIDATE-BASIC — structural + range checks against the JSON Schema for the claimed hsi_version.
  • HSI-VALIDATE-STRICT — cross-field integrity (references, time ordering, dimension checks).
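Short of full schema validation, an app can run a quick structural sanity check before consuming a window. A hand-rolled sketch, not a substitute for validating against hsi-1.3.schema.json:

```dart
// Illustrative analogue of a basic structural check (a tiny subset of
// what HSI-VALIDATE-BASIC covers); real validation should use the
// canonical JSON Schema for the claimed hsi_version.
import 'dart:convert';

bool basicCheck(String rawJson) {
  final map = jsonDecode(rawJson);
  if (map is! Map<String, dynamic>) return false;
  if (map['hsi_version'] is! String) return false; // claimed version present
  if (map['timestamp_ms'] is! int &&
      map['observed_at_ms'] is! int) return false; // some timestamp present
  return true;
}

void main() {
  print(basicCheck('{"hsi_version":"1.3","timestamp_ms":1}')); // true
  print(basicCheck('{"timestamp_ms":1}'));                     // false
}
```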

Embedding presence

The 64D Johnson-Lindenstrauss embedding from the inference engine is forwarded into embeddings[] on the HSI envelope when capability level and HSI version permit. The SDK’s typed HSIState does not surface the embedding directly — apps that need it parse rawJson.
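A sketch of reading the embedding out of rawJson. The embeddings field name follows the envelope description above; the assumption that each entry is a bare numeric array (rather than an object wrapping one) is this sketch's, not the spec's:

```dart
import 'dart:convert';

// Hypothetical helper: return the first embedding vector, or null when
// the envelope carries none (capability level / HSI version may omit it).
List<double>? firstEmbedding(String rawJson) {
  final map = jsonDecode(rawJson) as Map<String, dynamic>;
  final embeddings = map['embeddings'];
  if (embeddings is! List || embeddings.isEmpty) return null;
  final first = embeddings.first;
  if (first is! List) return null;
  return first.map((v) => (v as num).toDouble()).toList();
}

void main() {
  final v = firstEmbedding('{"embeddings":[[0.1,0.2,0.3]]}');
  print(v?.length); // 3
  print(firstEmbedding('{"embeddings":[]}')); // null
}
```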

Subscribing

StreamSubscription<HSIState>? sub;

@override
void initState() {
  super.initState();
  sub = Synheart.onStateUpdate.listen((state) {
    setState(() {
      _arousal = state.hsi.arousal?.value;
      _hasPhysio = state.modalities.physiological;
    });
  });
}

@override
void dispose() {
  sub?.cancel();
  super.dispose();
}
The stream is BehaviorSubject-backed: late subscribers receive the latest value. Synheart.currentHSIState is also exposed for synchronous access.

Privacy posture

HSI by design carries no PII (privacy.contains_pii = false). The runtime hashes subject_id before emit; cloud-bound HSI carries subject_hash, not the SDK-supplied subjectId. The 64D JL embedding is privacy-preserving by construction — the random projection is non-invertible, so the original HSVs cannot be reconstructed from the embedding alone.