Research & Verification

We adhere to a strict “Trust but Verify” doctrine. Synthetic data is dangerous if it drifts from reality; every run must be explainable and reproducible.

  • Venue-specific funding models live in src/aleatoric/venues/* with tests (tests/test_venues_models.py).
  • Presets (src/aleatoric/presets.py) encode starting parameters; adjust manifests for your venue deltas and rerun with fixed seeds.
  • Use /mcp/simulate_funding_regime to compare funding paths before committing to a manifest (a sketch of such a comparison follows this list).
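
To make the fixed-seed comparison concrete, here is a minimal sketch that calls a locally running server. Only the endpoint path comes from the list above; the base URL, payload fields (preset, venue, seed), preset names, and response key are assumptions, not the documented schema.

```python
# Hypothetical sketch: compare two funding paths via /mcp/simulate_funding_regime.
# The payload fields, preset/venue names, and response key are assumptions,
# not the documented schema.
import requests

BASE_URL = "http://localhost:8000"  # assumed address of a locally running server


def simulate(preset: str, venue: str, seed: int) -> list[float]:
    """Request one funding-rate path for a preset/venue pair with a fixed seed."""
    resp = requests.post(
        f"{BASE_URL}/mcp/simulate_funding_regime",
        json={"preset": preset, "venue": venue, "seed": seed},  # assumed fields
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["funding_path"]  # assumed response key


# Fix the seed so the comparison isolates the preset delta, not sampling noise.
baseline = simulate("calm", "venue_a", seed=42)
stressed = simulate("stressed", "venue_a", seed=42)
delta = max(abs(a - b) for a, b in zip(baseline, stressed))
print(f"max abs funding delta: {delta:.6f}")
```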

Suggested checks when comparing against live data, not automated in the repo (a sketch of all three follows the list):

  • KS test (Kolmogorov–Smirnov): checks that the distribution of synthetic returns is statistically indistinguishable from the distribution of historical returns.
  • Autocorrelation of squared returns: confirms that volatility clustering (persistent periods of high or low volatility) mimics real markets.
  • Hurst exponent: checks the long-range “memory” of the series (H ≈ 0.5 random walk, H > 0.5 trending, H < 0.5 mean-reverting).
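
All three checks are straightforward to script outside the repo with numpy and scipy. The sketch below assumes `synthetic` and `historical` are 1-D arrays of simple returns; the loading step is left out.

```python
# Standalone sketch of the three suggested checks; not part of the repo's suite.
# `synthetic` and `historical` are assumed 1-D numpy arrays of simple returns.
import numpy as np
from scipy import stats


def ks_check(synthetic: np.ndarray, historical: np.ndarray, alpha: float = 0.05) -> bool:
    """Two-sample KS test: True means we cannot reject that both samples
    come from the same distribution at the given significance level."""
    _statistic, p_value = stats.ks_2samp(synthetic, historical)
    return p_value > alpha


def volatility_clustering(returns: np.ndarray, lag: int = 1) -> float:
    """Autocorrelation of squared returns; values well above zero indicate
    volatility clustering (high-vol periods follow high-vol periods)."""
    sq = returns ** 2
    return float(np.corrcoef(sq[:-lag], sq[lag:])[0, 1])


def hurst_exponent(returns: np.ndarray, max_lag: int = 100) -> float:
    """Hurst exponent via the variance-of-differences method on the cumulative
    path: ~0.5 random walk, >0.5 trending, <0.5 mean-reverting."""
    path = np.cumsum(returns)
    lags = np.arange(2, max_lag)
    tau = np.array([np.std(path[lag:] - path[:-lag]) for lag in lags])
    slope, _intercept = np.polyfit(np.log(lags), np.log(tau), 1)
    return float(slope)


# Example usage with assumed loaders:
# synthetic = np.loadtxt("synthetic_returns.csv")
# historical = np.loadtxt("historical_returns.csv")
# print(ks_check(synthetic, historical), volatility_clustering(synthetic), hurst_exponent(synthetic))
```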

The repository itself enforces and documents a further set of guarantees:

  • Regression coverage is documented in validation/VALIDATION_REPORT.md (158 tests, 82% coverage).
  • Determinism and manifest/schema parity are enforced in tests/test_determinism.py and tests/test_mcp_manifest_parity.py (a generic sketch of the determinism pattern follows this list).
  • Audit logs and metrics include job_id, seed, and preset for traceability (api/audit.py, observability.py).
  • The coverage snapshot reflects the report timestamp (2025-12-10); refresh these numbers when rerunning the suite.
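
The determinism guarantee follows a pattern that is easy to reproduce outside the repo: run the same simulation twice with the same seed and assert identical output. The sketch below is a generic version of that pattern, not the repo's actual test; simulate_funding_regime here is a stand-in for whatever entry point the manifest drives.

```python
# Generic determinism pattern (a sketch, not the repo's tests/test_determinism.py).
import hashlib
import json

import numpy as np


def simulate_funding_regime(seed: int, n_steps: int = 1_000) -> np.ndarray:
    """Stand-in simulator: any function that is a pure function of its seed."""
    rng = np.random.default_rng(seed)
    return rng.normal(loc=0.0001, scale=0.0005, size=n_steps)


def run_digest(seed: int) -> str:
    """Hash the full output so any drift, however small, changes the digest."""
    path = simulate_funding_regime(seed)
    return hashlib.sha256(json.dumps(path.tolist()).encode()).hexdigest()


def test_same_seed_same_output():
    assert run_digest(seed=7) == run_digest(seed=7)


def test_different_seeds_diverge():
    assert run_digest(seed=7) != run_digest(seed=8)
```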