
Quickstart

In this quickstart you will:

  • Make one authenticated request
  • Download a Parquet dataset
  • (Optional) use MCP tools for validation + reproducibility

You will need:

  • An API key
  • curl installed

Fastest path: generate a dataset (Parquet)


The data pipeline has four stages: validate, generate, cache, and export.

```bash
# Set your API key and request a 60-second BTC dataset with a fixed seed.
export ALEATORIC_API_KEY="your-api-key"

curl -sS -X POST "https://mcp.aleatoric.systems/data/generate" \
  -H "X-API-Key: $ALEATORIC_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "config": { "symbol": "BTC", "seed": 42 },
    "duration_seconds": 60
  }' \
  -o response.json

# Extract the download URL from the response metadata, then fetch the Parquet file.
python3 -c 'import json; print(json.load(open("response.json"))["download_url"])' > download_url.txt
curl -L "$(cat download_url.txt)" -o aleatoric.parquet
```
After these commands complete, you should have:

  • A file named aleatoric.parquet in your current directory.
  • A response.json file containing metadata (including download_url and manifest_hash).
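If you prefer to do the same thing from Python, the sketch below mirrors the curl flow above. It assumes the third-party `requests` package is installed; the endpoint, headers, payload, and the `download_url`/`manifest_hash` fields come from the example above, but any other response fields are not guaranteed here.

```python
# Minimal Python sketch of the curl flow above (requests assumed installed).
import os
import requests

API_URL = "https://mcp.aleatoric.systems/data/generate"
api_key = os.environ["ALEATORIC_API_KEY"]

# Request a 60-second BTC dataset with a fixed seed, as in the curl example.
resp = requests.post(
    API_URL,
    headers={"X-API-Key": api_key, "Content-Type": "application/json"},
    json={"config": {"symbol": "BTC", "seed": 42}, "duration_seconds": 60},
    timeout=120,
)
resp.raise_for_status()
metadata = resp.json()

# The response metadata includes download_url and manifest_hash (see above).
print("manifest hash:", metadata["manifest_hash"])

# Download the Parquet artifact to the current directory.
parquet = requests.get(metadata["download_url"], timeout=120)
parquet.raise_for_status()
with open("aleatoric.parquet", "wb") as f:
    f.write(parquet.content)
```

Running this leaves the same aleatoric.parquet in your working directory as the curl commands do.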

Optional: MCP tool workflow (validation + cache export)


If you’re integrating with an MCP client, the typical flow (sketched in the example after this list) is:

  1. validate_config → validate a SimulationManifest and record the deterministic hash.
  2. generate_dataset → generate a batch dataset (and optionally populate cache entries).
  3. export_cache → export a cached artifact as Parquet.
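The sketch below shows that three-step flow using the official MCP Python SDK’s streamable-HTTP client. The server path (`/mcp`), the authentication header, and every tool argument below are assumptions for illustration only; consult the API reference for the real request/response shapes.

```python
# Sketch of the validate -> generate -> export tool flow over MCP.
# Tool names come from the docs; argument shapes and the endpoint path are assumed.
import asyncio
import os

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

MCP_URL = "https://mcp.aleatoric.systems/mcp"  # assumed MCP endpoint path
CONFIG = {"symbol": "BTC", "seed": 42}         # same config as the curl example


async def main() -> None:
    headers = {"X-API-Key": os.environ["ALEATORIC_API_KEY"]}  # assumed auth header
    async with streamablehttp_client(MCP_URL, headers=headers) as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # 1. Validate the manifest and record its deterministic hash.
            validated = await session.call_tool("validate_config", {"config": CONFIG})

            # 2. Generate a batch dataset (optionally populating cache entries).
            generated = await session.call_tool(
                "generate_dataset", {"config": CONFIG, "duration_seconds": 60}
            )

            # 3. Export a cached artifact as Parquet.
            exported = await session.call_tool("export_cache", {"format": "parquet"})

            print(validated, generated, exported, sep="\n")


asyncio.run(main())
```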

For request/response shapes, see the API index and API reference.

  • If you want to know which endpoint or tool to call: API
  • If you want reproducibility guarantees: Determinism
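As an optional sanity check on reproducibility, you can issue the same request twice and compare the manifest_hash values in the two responses. The sketch below assumes that identical configs (including the seed) yield identical manifest hashes; see the Determinism page for the authoritative guarantee.

```python
# Hypothetical reproducibility check: request the same config twice and compare
# the manifest_hash fields (assumes identical configs produce identical hashes).
import os
import requests

API_URL = "https://mcp.aleatoric.systems/data/generate"
HEADERS = {
    "X-API-Key": os.environ["ALEATORIC_API_KEY"],
    "Content-Type": "application/json",
}
PAYLOAD = {"config": {"symbol": "BTC", "seed": 42}, "duration_seconds": 60}


def manifest_hash() -> str:
    resp = requests.post(API_URL, headers=HEADERS, json=PAYLOAD, timeout=120)
    resp.raise_for_status()
    return resp.json()["manifest_hash"]


first, second = manifest_hash(), manifest_hash()
print("match" if first == second else "mismatch", first, second)
```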