HSV (Human State Vector) is the typed intermediate representation in the Synheart engine pipeline. It is what the Synheart Runtime emits from each inference head. Flux is the engine component that packs HSVs into HSI 1.3 payloads for the wire. This page documents the canonical Hsv struct emitted by the Synheart Runtime’s inference engine.

Pipeline placement

raw signals
    │
    ▼
feature pipeline ──►  FeatureSet              (60s window, 5s step, signal tiers)
    │
    ▼
inference engine ──►  InferenceOutput         (HSV[] + 64D embedding)
    │                   - 6 cognitive/physio heads
    │                   - 1 motion/posture head
    ▼
flux             ──►  HSI 1.3 payload         (canonical wire format)

Hsv is what the Synheart Runtime emits per inference head. Flux consumes InferenceOutput (or &[Hsv]) and produces an HSI payload via HsiBuilder::build_from_output(...).

The seven canonical heads

HsvType enum (ALL_CAPS on the wire, snake_case in JSON). Head, group, and output shape:
  • Emotion (EMOTION): primary cognitive. Multiscalar — typically [("valence", v), ("arousal", a)].
  • Focus (FOCUS): primary cognitive. Scalar in [0, 1].
  • Capacity (CAPACITY): primary cognitive. Scalar in [0, 1].
  • Sleep (SLEEP): physiological context. Scalar plus optional breakdown (SleepScore.components).
  • Recovery (RECOVERY): physiological context. Scalar.
  • Strain (STRAIN): physiological context. Scalar.
  • MotionState (MOTION_STATE): behavioral context. Multiscalar indicator + notes carrying LAYING / SITTING / STANDING / MOVING / UNKNOWN.
The inference engine can be configured to enable a subset; default is all six core heads + MotionState.
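The seven heads and their two spellings can be sketched as a plain Rust enum. This is an illustrative reconstruction from the table above, not the real definition: the method names `wire_name` and `json_name` are assumptions (the actual crate may derive these via serde attributes instead).

```rust
/// Hypothetical sketch of HsvType with its wire (ALL_CAPS) and JSON
/// (snake_case) spellings; method names here are illustrative only.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum HsvType {
    Emotion,
    Focus,
    Capacity,
    Sleep,
    Recovery,
    Strain,
    MotionState,
}

impl HsvType {
    /// ALL_CAPS spelling used on the wire.
    pub fn wire_name(self) -> &'static str {
        match self {
            HsvType::Emotion => "EMOTION",
            HsvType::Focus => "FOCUS",
            HsvType::Capacity => "CAPACITY",
            HsvType::Sleep => "SLEEP",
            HsvType::Recovery => "RECOVERY",
            HsvType::Strain => "STRAIN",
            HsvType::MotionState => "MOTION_STATE",
        }
    }

    /// snake_case spelling used in JSON.
    pub fn json_name(self) -> &'static str {
        match self {
            HsvType::Emotion => "emotion",
            HsvType::Focus => "focus",
            HsvType::Capacity => "capacity",
            HsvType::Sleep => "sleep",
            HsvType::Recovery => "recovery",
            HsvType::Strain => "strain",
            HsvType::MotionState => "motion_state",
        }
    }
}

fn main() {
    assert_eq!(HsvType::MotionState.wire_name(), "MOTION_STATE");
    assert_eq!(HsvType::MotionState.json_name(), "motion_state");
    println!("ok");
}
```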

Hsv struct

pub struct Hsv {
    pub hsv_id: Option<String>,             // optional deterministic id
    pub hsv_type: HsvType,                  // which head emitted this
    pub value: HsvValue,                    // Scalar | Multiscalar
    pub confidence: f64,                    // [0, 1] — quality-adjusted, tier-capped
    pub window: WindowRef,                  // { start_ms, end_ms, aggregation }
    pub inference: InferenceMetadata,       // mode, engine, engine_version, model_id, model_version, components
    pub providers: Vec<String>,             // e.g. ["garmin", "whoop"]
    pub source_tier: u8,                    // physiological fidelity tier
    pub tiers: TierBundle,                  // per-modality tier bundle (HSI 1.3)
    pub breakdown: Option<Vec<(String, f64)>>, // sub-components (e.g. SleepScore parts)
    pub notes: Option<String>,              // human-readable notes (forwarded to HSI axis_reading.notes)
}

pub enum HsvValue {
    Scalar(ScalarValue),                    // single value clamped to [0, 1]
    Multiscalar(Vec<(String, f64)>),        // named scalars, each [0, 1]
}

HSV invariants

Hsv::validate() enforces:
  • confidence ∈ [0, 1].
  • Every scalar value in value (whether Scalar or each entry of Multiscalar) is in [0, 1].
  • window.end_ms > window.start_ms.
Higher values mean more of the named property. Axis inversion is forbidden across versions — once a head emits focus where higher means more focus, it cannot flip semantics.
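The three invariants above can be sketched as a standalone check. The stand-in types below are simplified (the real `Hsv` carries more fields, and `Hsv::validate()` is a method, not a free function):

```rust
// Minimal sketch of the invariants Hsv::validate() enforces, using
// simplified stand-in types; not the real struct or signature.
pub enum HsvValue {
    Scalar(f64),
    Multiscalar(Vec<(String, f64)>),
}

pub struct Window {
    pub start_ms: u64,
    pub end_ms: u64,
}

pub fn validate(value: &HsvValue, confidence: f64, window: &Window) -> Result<(), String> {
    let in_unit = |x: f64| (0.0..=1.0).contains(&x);

    // Invariant 1: confidence ∈ [0, 1].
    if !in_unit(confidence) {
        return Err(format!("confidence {confidence} outside [0, 1]"));
    }

    // Invariant 2: every scalar value is in [0, 1].
    match value {
        HsvValue::Scalar(v) if !in_unit(*v) => {
            return Err(format!("scalar {v} outside [0, 1]"));
        }
        HsvValue::Multiscalar(entries) => {
            for (name, v) in entries {
                if !in_unit(*v) {
                    return Err(format!("{name} = {v} outside [0, 1]"));
                }
            }
        }
        _ => {}
    }

    // Invariant 3: window.end_ms > window.start_ms.
    if window.end_ms <= window.start_ms {
        return Err("window.end_ms must be > window.start_ms".into());
    }
    Ok(())
}

fn main() {
    let w = Window { start_ms: 0, end_ms: 60_000 };
    assert!(validate(&HsvValue::Scalar(0.7), 0.9, &w).is_ok());
    assert!(validate(&HsvValue::Scalar(1.2), 0.9, &w).is_err());
    println!("ok");
}
```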

Tier-capped confidence

The inference engine caps confidence by signal fidelity tier so proxy signals never over-report certainty:
  • Tier 1: native RR ground truth. Max confidence ≤ 1.0.
  • Tier 2: vendor HRV (RMSSD/SDNN/stress/recovery). Max confidence ≤ 0.9.
  • Tier 3: HR series. Max confidence ≤ 0.7.
  • Tier 4: HR snapshot. Max confidence ≤ 0.5.
  • Tier 0: no physiological signal. Max confidence head-dependent; typically rulepack only.
The Tier-4 ONNX guard short-circuits Emotion/Focus/Capacity to rulepack inference when the window carries fewer than 5 HR samples — no probabilistic inference on single-snapshot signals. The source_tier field on Hsv mirrors tiers.physiological for HSI 1.2 wire compatibility; the per-modality TierBundle is the canonical 1.3 carrier.
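The cap and the Tier-4 guard reduce to a few lines. Function names here are illustrative, and the Tier-0 cap is a conservative placeholder (the page says it is head-dependent in the real engine):

```rust
// Illustrative sketch of tier-capped confidence and the Tier-4 guard.
// Names and the Tier-0 placeholder are assumptions, not the real API.
fn tier_cap(tier: u8) -> f64 {
    match tier {
        1 => 1.0, // native RR ground truth
        2 => 0.9, // vendor HRV (RMSSD/SDNN/stress/recovery)
        3 => 0.7, // HR series
        4 => 0.5, // HR snapshot
        _ => 0.0, // Tier 0: head-dependent in the real engine; conservative placeholder here
    }
}

/// Cap a head's raw confidence by the signal fidelity tier, so proxy
/// signals never over-report certainty.
fn cap_confidence(raw: f64, tier: u8) -> f64 {
    raw.min(tier_cap(tier))
}

/// Tier-4 ONNX guard: fewer than 5 HR samples in the window short-circuits
/// Emotion/Focus/Capacity to rulepack inference.
fn use_rulepack_fallback(tier: u8, hr_samples_in_window: usize) -> bool {
    tier == 4 && hr_samples_in_window < 5
}

fn main() {
    assert_eq!(cap_confidence(0.95, 2), 0.9); // capped by Tier 2
    assert_eq!(cap_confidence(0.4, 3), 0.4);  // already below the Tier 3 cap
    assert!(use_rulepack_fallback(4, 1));     // single snapshot: rulepack only
    println!("ok");
}
```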

64D Johnson-Lindenstrauss embedding

InferenceOutput.embedding carries a 64-dimensional, L2-normalized, deterministic, non-invertible projection of the HSVs:
pub struct EmbeddingOutput {
    pub vector: Vec<f64>,        // 64 dims, ‖v‖₂ = 1
    pub space: &'static str,     // "synheart-jl-64"
    pub dims: u32,               // 64
    pub encoding: &'static str,  // "float32"
    pub vector_hash: Option<String>, // sha256:<hex> integrity tag
}
Properties:
  • Always 64 dims.
  • Always L2-normalized to unit norm.
  • Deterministic — identical HSVs produce byte-identical vectors.
  • Privacy-preserving — Johnson-Lindenstrauss random projection is non-invertible: you cannot reconstruct the underlying HSVs from the embedding.
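A toy version of these properties can be written in a few lines: a fixed pseudo-random projection matrix (here generated by a seeded LCG, purely for illustration; the real engine's matrix, seed, and input layout are internal) maps the HSV scalars into 64 dimensions, then the result is L2-normalized:

```rust
// Toy sketch of a deterministic JL-style projection: NOT the real
// synheart-jl-64 space, just an illustration of the listed properties.
const DIMS: usize = 64;

// Deterministic LCG so the projection matrix is identical on every run.
fn lcg(state: &mut u64) -> f64 {
    *state = state
        .wrapping_mul(6364136223846793005)
        .wrapping_add(1442695040888963407);
    // Map the top 53 bits to [-1, 1); a stand-in for Gaussian JL entries.
    ((*state >> 11) as f64 / (1u64 << 53) as f64) * 2.0 - 1.0
}

fn jl_embed(scalars: &[f64]) -> Vec<f64> {
    let mut seed = 0x5359_4e48u64; // fixed seed => deterministic output
    let mut out = vec![0.0; DIMS];
    for &x in scalars {
        for o in out.iter_mut() {
            *o += x * lcg(&mut seed);
        }
    }
    // L2-normalize to unit norm.
    let norm = out.iter().map(|v| v * v).sum::<f64>().sqrt();
    if norm > 0.0 {
        for o in out.iter_mut() {
            *o /= norm;
        }
    }
    out
}

fn main() {
    let a = jl_embed(&[0.7, 0.4, 0.9]);
    let b = jl_embed(&[0.7, 0.4, 0.9]);
    assert_eq!(a, b); // deterministic: identical inputs, byte-identical vectors
    let norm: f64 = a.iter().map(|v| v * v).sum::<f64>().sqrt();
    assert!((norm - 1.0).abs() < 1e-9); // unit L2 norm
    println!("ok");
}
```

Non-invertibility follows from the shape alone: 64 output dimensions cannot uniquely determine an input of higher effective dimensionality, and the projection discards the information needed to run it backwards.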
With HsiBuilder::build_from_output(InferenceOutput), Flux consumes the embedding already computed by the inference engine. The fallback path HsiBuilder::build(&[Hsv]) recomputes the embedding from the HSVs — same result, just an extra projection step. Once the embedding reaches the HSI payload, the capability level and HSI version control whether it appears in the output.

InferenceOutput

pub struct InferenceOutput {
    pub hsvs: Vec<Hsv>,
    pub embedding: EmbeddingOutput,
}
Inference engine entry points:
  • StateRuntime::infer(features, baseline_present): HSV-only inference; returns Vec<Hsv>.
  • StateRuntime::infer_with_embedding(features, baseline_present): returns InferenceOutput (HSVs + 64D embedding).
  • StateRuntime::infer_with_personalization(features, ctx): HSV inference with a personalization context.
  • StateRuntime::infer_with_embedding_personalization(...): both, with personalization.
  • StateRuntime::infer_head(head, features, baseline_present): single-head inference.

HSV → HSI mapping (Flux)

HsiBuilder (in the flux engine component):
  1. Takes &[Hsv] plus the embedding vector + optional integrity tag.
  2. Maps each HSV onto an HSI axis reading via flux::export::map_axes(hsvs, window_ids, default_source_ids).
  3. Writes evidence_source_ids from hsv.providers (or the builder’s defaults when providers is empty).
  4. Forwards breakdown entries as one HSI axis_reading per component.
  5. Forwards notes to axis_reading.notes.
  6. Carries the 64D embedding into embeddings[] if capability/policy permits.
  7. Stamps source_tier and tiers into the HSI envelope.
  8. Returns a validated HsiPayload.
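Steps 2 to 5 can be approximated with simplified stand-in types. The struct shapes and function signature below are illustrative only; the real flux::export::map_axes takes window and source-id arguments this sketch flattens away:

```rust
// Simplified stand-in for the HSV -> axis-reading mapping (steps 2-5):
// evidence sources fall back to builder defaults, breakdown entries
// each become their own reading, notes are forwarded. Not the real API.
#[derive(Debug, PartialEq)]
struct AxisReading {
    axis: String,
    value: f64,
    source_ids: Vec<String>,
    notes: Option<String>,
}

fn map_axis(
    axis: &str,
    value: f64,
    providers: &[String],
    defaults: &[String],
    breakdown: &[(String, f64)],
    notes: Option<&str>,
) -> Vec<AxisReading> {
    // Step 3: evidence_source_ids come from hsv.providers, or the
    // builder's defaults when providers is empty.
    let sources: Vec<String> = if providers.is_empty() {
        defaults.to_vec()
    } else {
        providers.to_vec()
    };
    // Step 2 + 5: the main reading, with notes forwarded.
    let mut readings = vec![AxisReading {
        axis: axis.to_string(),
        value,
        source_ids: sources.clone(),
        notes: notes.map(str::to_string),
    }];
    // Step 4: one extra axis reading per breakdown component.
    for (component, v) in breakdown {
        readings.push(AxisReading {
            axis: format!("{axis}.{component}"),
            value: *v,
            source_ids: sources.clone(),
            notes: None,
        });
    }
    readings
}

fn main() {
    let readings = map_axis(
        "sleep",
        0.8,
        &[],                              // no providers on this HSV...
        &["garmin".to_string()],          // ...so the builder default applies
        &[("deep".to_string(), 0.3)],     // SleepScore-style breakdown entry
        None,
    );
    assert_eq!(readings.len(), 2);
    assert_eq!(readings[0].source_ids, vec!["garmin".to_string()]);
    println!("ok");
}
```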
  • build(&[Hsv]): computes the embedding, then builds the HSI payload.
  • build_from_output(&InferenceOutput): preferred; uses the embedding pre-computed by the inference engine.
  • build_json(&[Hsv]) / build_json_from_output(...): same, plus pretty-printed JSON.

Why HSV exists alongside HSI

  • HSV (Hsv): in-process, typed, head-shaped. Optimised for fast, deterministic, type-safe inference inside the Synheart Runtime; in-runtime consumers (personalization, scoring) use Hsv directly without going through JSON.
  • HSI 1.3: JSON wire format, multi-language, version-stable. Optimised for cross-process, cross-platform, cross-vendor interchange; the canonical contract lives in the hsi repo.
Flux is the boundary that converts the in-process typed shape into the wire shape. Producers that don’t run the Synheart Runtime can still emit HSI directly; SDK consumers never see HSV — they only ever see HSI 1.3 JSON.

What the Synheart Core SDK exposes

The SDK does not expose Hsv directly to host code (Dart/Kotlin/Swift). It exposes:
  • The HSI JSON via Synheart.onHSIUpdate — what flux emitted.
  • Typed HSIState (axes, modalities, tiers) via Synheart.onStateUpdate — parsed from the HSI JSON; see HSI in Synheart Core.
Apps that need direct Hsv access integrate at the runtime / engine level — typically only research or internal-Synheart builds.