Compare commits

...

52 Commits

Author SHA1 Message Date
AdrianoDev 7fa269de14 feat(deploy): auto-include docker-compose.local.yml override
The deploy-noclone.sh script now automatically loads an optional
$DEPLOY_DIR/docker-compose.local.yml, if present, as the last -f
override. Useful for machine-specific fixes (e.g. DOCKER_API_VERSION
for watchtower on old daemons). Gitignored by design: not versioned in
the repo.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-29 22:44:01 +02:00
AdrianoDev c9ab211c38 chore(build-push): reuse persisted docker login
Skip the login if ~/.docker/config.json already contains auth for the
registry. Lets you run 'docker login' once and then launch the script
without exporting GITEA_PAT on every run.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-29 21:40:56 +02:00
AdrianoDev 287c4b5372 chore: remove deploy.sh and buildx registry cache
- scripts/deploy.sh removed (replaced by deploy-noclone.sh)
- build-push.sh: removed registry cache-from/cache-to (the laptop's
  local buildx cache is sufficient; no more buildcache:* images on the
  Gitea registry)
- docs: cleaned up orphaned references

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-29 21:25:38 +02:00
AdrianoDev ba29572e93 chore(deploy): local build + no-clone deploy, remove Gitea CI
- scripts/build-push.sh: replicates the CI job locally (8 images, buildx cache, :latest + :sha-X tags)
- scripts/deploy-noclone.sh: VPS deploy without cloning (curl raw config + image pull)
- removed .gitea/workflows/ci.yml
- README + DEPLOYMENT updated: laptop -> registry -> VPS, paths /docker/cerbero_mcp
- ruff fixes in 3 tests (I001, SIM117, UP037, F821)

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-29 20:37:06 +02:00
AdrianoDev 4f3e959805 feat(deploy): docker-compose.traefik.yml overlay for behind-Traefik
ci / ruff lint (push) Failing after 13s
ci / mypy mcp_common (push) Successful in 22s
ci / pytest (push) Successful in 32s
ci / validate compose + Caddyfile (push) Failing after 2m23s
ci / build & push to registry (push) Has been skipped
For a shared VPS (e.g. with Gitea) where Traefik already manages 80/443.

- gateway/Caddyfile: env-aware listen + auto_https + trusted_proxies
  (defaults unchanged for standalone mode).
- docker-compose.traefik.yml: overlay that removes the host port
  bindings, attaches the gateway to Traefik's external network, and
  sets labels for Host(cerbero-mcp.tielogic.xyz) routing + TLS via the
  Traefik certresolver. Caddy listens on plain HTTP :80 internally.
- scripts/deploy.sh: detects BEHIND_TRAEFIK=true → adds -f
  docker-compose.traefik.yml to every docker compose call.
- DEPLOYMENT.md: new section 2a (standalone vs behind-Traefik topology)
  + a behind-Traefik subsection with the required env vars.

Usage:
  docker compose -f docker-compose.prod.yml -f docker-compose.traefik.yml \
                 --env-file .env up -d

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-29 09:56:07 +02:00
AdrianoDev a1110c8ecb feat(safety+audit+deploy): consistency_check + audit log file sink + deploy script
ci / ruff lint (push) Failing after 12s
ci / mypy mcp_common (push) Successful in 25s
ci / pytest (push) Successful in 35s
ci / validate compose + Caddyfile (push) Successful in 2m3s
ci / build & push to registry (push) Has been skipped
#2 Env switch safety:
- mcp_common/environment.py: new consistency_check() that prevents
  accidental switches to mainnet. Raises EnvironmentMismatchError if
  resolved=mainnet without an explicit creds["environment"]="mainnet",
  or on a declared/resolved mismatch. Override via STRICT_MAINNET=false.
- Wired into app_factory.run_exchange_main at boot.
- 6 new consistency tests.

#3 Audit log persistence:
- mcp_common/audit.py: additional TimedRotatingFileHandler when the
  AUDIT_LOG_FILE env var is set. Rotation at midnight UTC, 30-day
  retention by default (AUDIT_LOG_BACKUP_DAYS). JSONL format with
  SecretsFilter.
- docker-compose.prod.yml: bind mount /var/log/cerbero-mcp + the
  AUDIT_LOG_FILE env var for the 4 exchange services (write endpoints).
- 2 new file-sink tests.

#1 Deploy script:
- scripts/deploy.sh: idempotent; does docker login + repo clone/pull +
  secrets copy with chmod 600 + .env creation + audit dir setup + image
  pull + up + public HTTPS smoke test.
- DEPLOYMENT.md updated: sections 2 (script), 3 (mainnet safety),
  4 (audit log queries); later sections renumbered.

Tests: 488/488 green.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-29 09:29:04 +02:00
AdrianoDev 019b7e3298 docs: README + DEPLOYMENT with working CI/CD state
ci / ruff lint (push) Successful in 14s
ci / mypy mcp_common (push) Successful in 23s
ci / pytest (push) Successful in 30s
ci / validate compose + Caddyfile (push) Successful in 2m2s
ci / build & push to registry (push) Successful in 1m32s
README adds a 'CI/CD pipeline' section describing the 5 jobs and the
image tags. DEPLOYMENT expands section 1 with Gitea runner details
(network gitea_gitea-internal, runner-images image, ubuntu-latest
label) and the user-level REGISTRY_TOKEN secret with write:package scope.
2026-04-29 09:18:30 +02:00
AdrianoDev 2fb7043790 ci: push base image to the registry + parametrize BASE_IMAGE in the service Dockerfiles
ci / ruff lint (push) Successful in 13s
ci / mypy mcp_common (push) Successful in 23s
ci / pytest (push) Successful in 29s
ci / validate compose + Caddyfile (push) Successful in 2m0s
ci / build & push to registry (push) Successful in 1m43s
Buildx with the docker-container driver does not see images loaded in
the local daemon. Solution: push the base image as
git.tielogic.xyz/adriano/cerbero-mcp/base:latest, and the 6 service
Dockerfiles use ${BASE_IMAGE}:${BASE_TAG} with a "cerbero-base" default
for local dev, overridden in CI to the registry path.
2026-04-29 09:09:47 +02:00
AdrianoDev 38fd7db259 ci: use secrets.REGISTRY_TOKEN for docker login (write:package scope)
ci / ruff lint (push) Successful in 14s
ci / mypy mcp_common (push) Successful in 26s
ci / pytest (push) Successful in 32s
ci / validate compose + Caddyfile (push) Successful in 2m24s
ci / build & push to registry (push) Failing after 4m13s
The auto-injected GITEA_TOKEN does not include the write:package scope
required to push to the registry. A manual PAT created under User
Settings → Applications with write:package scope is needed, stored as
the REGISTRY_TOKEN repo secret.
2026-04-29 08:53:31 +02:00
AdrianoDev 9da2e12473 lint: ruff clean services/ (autofix + manual + ignore E741)
ci / ruff lint (push) Successful in 15s
ci / validate compose + Caddyfile (push) Successful in 2m6s
ci / mypy mcp_common (push) Successful in 30s
ci / pytest (push) Successful in 34s
ci / build & push to registry (push) Failing after 47s
- 24 safe autofixes (SIM105 contextlib.suppress, F401 unused imports,
  I001 import order, B007 unused loop var, F811 redef, F841 unused).
- 15 unsafe fixes (UP038 X|Y in isinstance, SIM108 ternary, etc.).
- Manual fixes: SIM102 nested if in deribit term_structure, E402
  imports in test_cot.py + sentiment server.py.
- Ignore E741 ('l' variables in list comprehensions in
  deribit/client.py: stylistic, not a bug).

Tests: 478/478 green.
2026-04-29 08:44:12 +02:00
AdrianoDev 910f80c99b ci: setup-python@v5 with 3.13 + curl uv install (setup-uv@v5 did not apply python-version)
ci / mypy mcp_common (push) Successful in 25s
ci / pytest (push) Successful in 33s
ci / validate compose + Caddyfile (push) Successful in 3m35s
ci / build & push to registry (push) Has been skipped
ci / ruff lint (push) Failing after 52s
2026-04-29 08:29:24 +02:00
AdrianoDev fe7a9dd9c0 ci: use astral-sh/setup-uv@v5 with python-version 3.13 (manages uv + Python + cache)
ci / ruff lint (push) Failing after 55s
ci / mypy mcp_common (push) Successful in 24s
ci / pytest (push) Successful in 30s
ci / build & push to registry (push) Has been cancelled
ci / validate compose + Caddyfile (push) Has been cancelled
2026-04-29 08:23:50 +02:00
AdrianoDev 503f7a4b17 ci: install Python 3.13 via uv (runner image only has 3.10)
ci / ruff lint (push) Failing after 21s
ci / mypy mcp_common (push) Successful in 32s
ci / pytest (push) Successful in 39s
ci / build & push to registry (push) Has been cancelled
ci / validate compose + Caddyfile (push) Has been cancelled
2026-04-29 08:22:29 +02:00
AdrianoDev 0956283463 ci: runs-on ubuntu-latest (more stable label)
ci / ruff lint (push) Failing after 37s
ci / mypy mcp_common (push) Successful in 20s
ci / pytest (push) Successful in 30s
ci / validate compose + Caddyfile (push) Failing after 3m40s
ci / build & push to registry (push) Has been cancelled
2026-04-29 08:21:07 +02:00
AdrianoDev 7cc28cd6de ci: install uv via astral script + add to GITHUB_PATH
ci / ruff lint (push) Failing after 6s
ci / mypy mcp_common (push) Failing after 7s
ci / pytest (push) Failing after 6s
ci / validate compose + Caddyfile (push) Failing after 2m27s
ci / build & push to registry (push) Has been cancelled
2026-04-29 08:18:07 +02:00
AdrianoDev b91f843d89 ci: remove probe workflow (runner network issue resolved)
ci / ruff lint (push) Failing after 1m4s
ci / mypy mcp_common (push) Failing after 13s
ci / pytest (push) Failing after 13s
ci / build & push to registry (push) Has been cancelled
ci / validate compose + Caddyfile (push) Has been cancelled
2026-04-29 08:13:50 +02:00
AdrianoDev fd811d0692 ci(probe): minimal workflow to diagnose runner shell/tools
ci / ruff lint (push) Failing after 31s
ci / mypy mcp_common (push) Failing after 37s
ci / pytest (push) Failing after 31s
ci / validate compose + Caddyfile (push) Failing after 31s
probe / probe shell + tools (push) Successful in 1s
ci / build & push to registry (push) Has been skipped
2026-04-29 07:58:50 +02:00
AdrianoDev 1fea7d4ea1 ci: install uv via pipx (setup-uv@v3 was skipped by the Gitea runner)
ci / ruff lint (push) Failing after 42s
ci / mypy mcp_common (push) Failing after 41s
ci / pytest (push) Failing after 35s
ci / validate compose + Caddyfile (push) Failing after 44s
ci / build & push to registry (push) Has been skipped
2026-04-29 07:54:17 +02:00
AdrianoDev b1aea194ad docs: add COT report tools to README macro section
ci / ruff lint (push) Failing after 39s
ci / mypy mcp_common (push) Failing after 27s
ci / pytest (push) Failing after 33s
ci / validate compose + Caddyfile (push) Failing after 37s
ci / build & push to registry (push) Has been skipped
2026-04-29 00:10:06 +02:00
AdrianoDev 8dfb932c8c feat(mcp-macro): expose COT report tools via MCP endpoint 2026-04-29 00:09:20 +02:00
AdrianoDev dc285daac8 feat(mcp-macro): fetch_cot_extreme_positioning scanner
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-29 00:06:52 +02:00
AdrianoDev 2474445b4c feat(mcp-macro): fetch_cot_disaggregated async fetcher with cache 2026-04-29 00:03:53 +02:00
AdrianoDev 66bcab05f9 feat(mcp-macro): fetch_cot_tff async fetcher with cache 2026-04-29 00:02:06 +02:00
AdrianoDev e206df49e4 feat(mcp-macro): add parse_tff_row + parse_disagg_row Socrata mappers 2026-04-29 00:00:04 +02:00
AdrianoDev bf152d90fd feat(mcp-macro): add compute_percentile + classify_extreme pure helpers 2026-04-28 23:58:38 +02:00
AdrianoDev 201f263c77 feat(mcp-macro): add CFTC contract codes constants for COT report 2026-04-28 23:57:31 +02:00
AdrianoDev 28e77cddee docs(plan): COT report implementation plan (8 TDD tasks)
ci / ruff lint (push) Failing after 33s
ci / mypy mcp_common (push) Failing after 31s
ci / pytest (push) Failing after 43s
ci / validate compose + Caddyfile (push) Failing after 38s
ci / build & push to registry (push) Has been skipped
2026-04-28 23:54:05 +02:00
AdrianoDev ad3f542c0f docs: fix markdown lint COT spec (blanks, table padding, code lang)
ci / ruff lint (push) Failing after 38s
ci / mypy mcp_common (push) Failing after 36s
ci / pytest (push) Failing after 34s
ci / validate compose + Caddyfile (push) Failing after 32s
ci / build & push to registry (push) Has been skipped
2026-04-28 23:51:01 +02:00
AdrianoDev b218ac3a2c docs: COT report design spec for mcp-macro
ci / mypy mcp_common (push) Failing after 33s
ci / ruff lint (push) Failing after 37s
ci / pytest (push) Failing after 28s
ci / validate compose + Caddyfile (push) Failing after 33s
ci / build & push to registry (push) Has been skipped
TFF (gpe5-46if) for equity/financials: ES, NQ, RTY, ZN, ZB, 6E, 6J, DX
Disaggregated (72hh-3qpy) for commodities: CL, GC, SI, HG, ZW, ZC, ZS

3 MCP tools: get_cot_tff, get_cot_disaggregated,
get_cot_extreme_positioning (5/95 percentile scanner).

Pure-logic helpers + httpx integration tests + ACL.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 23:43:40 +02:00
AdrianoDev c31ee0a121 ci: use astral-sh/setup-uv@v3 (integrated cache, no fragile curl|sh)
ci / ruff lint (push) Failing after 55s
ci / pytest (push) Failing after 32s
ci / validate compose + Caddyfile (push) Failing after 32s
ci / build & push to registry (push) Has been skipped
ci / mypy mcp_common (push) Failing after 31s
2026-04-28 23:39:17 +02:00
AdrianoDev a2fdca3afd ci: clean runs-on syntax (runner stable after crash-loop fix)
ci / ruff lint (push) Failing after 42s
ci / mypy mcp_common (push) Failing after 27s
ci / pytest (push) Failing after 32s
ci / validate compose + Caddyfile (push) Failing after 30s
ci / build & push to registry (push) Has been skipped
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 23:24:33 +02:00
AdrianoDev eec1c11cb3 ci: try array syntax runs-on
ci / ruff lint (push) Failing after 29s
ci / mypy mcp_common (push) Failing after 27s
ci / pytest (push) Failing after 29s
ci / validate compose + Caddyfile (push) Failing after 34s
ci / build & push to registry (push) Has been skipped
2026-04-28 23:23:51 +02:00
AdrianoDev 05b431c9c1 ci: try ubuntu-22.04 label
ci / ruff lint (push) Failing after 40s
ci / pytest (push) Failing after 33s
ci / mypy mcp_common (push) Failing after 29s
ci / validate compose + Caddyfile (push) Failing after 34s
ci / build & push to registry (push) Has been skipped
2026-04-28 23:19:59 +02:00
AdrianoDev 59ae9687c8 ci: runs-on tielogic-ci (label specific to the registered runner)
ci / ruff lint (push) Failing after 1m11s
ci / mypy mcp_common (push) Failing after 26s
ci / pytest (push) Failing after 36s
ci / validate compose + Caddyfile (push) Failing after 1m56s
ci / build & push to registry (push) Has been cancelled
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 23:17:43 +02:00
AdrianoDev 65641a7de8 ci: validate-config job + registry-based cache
ci / ruff lint (push) Has been cancelled
ci / mypy mcp_common (push) Has been cancelled
ci / pytest (push) Has been cancelled
ci / validate compose + Caddyfile (push) Has been cancelled
ci / build & push to registry (push) Has been cancelled
- New validate-config job: docker compose -f docker-compose.{yml,prod.yml}
  config -q (checks YAML syntax + env variables) + caddy validate
  --config Caddyfile (gateway syntax).
- build-and-push now also needs validate-config: no image push if the
  compose files or the Caddyfile are broken.
- Docker buildx cache moved from type=gha (requires a configured Gitea
  Actions cache backend server) to type=registry,ref=<prefix>/buildcache:<name>,
  which uses the registry itself as cache storage. Works out of the
  box, no extra setup.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 23:02:17 +02:00
AdrianoDev c251fda886 feat(ci/cd): Gitea Actions + registry + Watchtower auto-update
ci / ruff lint (push) Failing after 1m37s
ci / mypy mcp_common (push) Has been cancelled
ci / pytest (push) Has been cancelled
ci / build & push to registry (push) Has been cancelled
CI pipeline (.gitea/workflows/ci.yml):
- Jobs: lint (ruff), typecheck (mypy gating on mcp_common + warn-only
  on the services), test (pytest, 455 tests).
- build-and-push job on main only: builds gateway + 6 MCP images via
  docker/build-push-action@v6, logs in to the Gitea registry with
  docker/login-action@v3 + the auto-injected secrets.GITEA_TOKEN.
- Distributed type=gha cache for Docker layers → subsequent runs 5-10x
  faster. :latest + :sha-XXXXXXX tags for each image.

VPS deploy (docker-compose.prod.yml):
- No local build: only `image:` entries from git.tielogic.xyz/adriano/
  cerbero-mcp/<service>:latest. IMAGE_TAG variable to pin a specific
  sha.
- Containerized Watchtower service that polls every 5 minutes
  (configurable via WATCHTOWER_POLL_INTERVAL) and auto-updates the
  containers labeled com.centurylinklabs.watchtower.enable=true.
  Registry auth reuses ~/.docker/config.json bind-mounted read-only.

DEPLOYMENT.md: full runbook for VPS setup, registry login, secrets,
.env, post-deploy smoke test, rollback (pin to a sha), disabling
auto-update, Traefik upload-limit note. README updated with the link.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 22:52:40 +02:00
AdrianoDev 6b7b3f7658 chore: httpx retry transport + healthcheck stdlib + mypy config
- mcp_common/http.py: new async_client() helper with
  AsyncHTTPTransport(retries=3) to handle transient connection errors,
  plus a generic call_with_retry() async retry decorator. Replaced 25
  httpx.AsyncClient(...) occurrences in the deribit/hyperliquid/
  sentiment/macro clients. 5 new tests.

- Dockerfile healthcheck: switched from a python+httpx subprocess to
  stdlib urllib.request.urlopen() on all 6 MCP services. Zero external
  dependencies in the runtime check, explicit 3s timeout, slightly
  slimmer images.

- pyproject.toml: added [tool.mypy] python_version=3.13 with a
  multi-package mypy_path + ignore_missing_imports overrides for the
  vendor SDKs (pybit, alpaca, hyperliquid, pythonjsonlogger). mypy 1.20
  in dev deps; ruff pinned to 0.5.x. mcp_common passes mypy clean; 44
  pre-existing type errors surfaced in the services but are
  non-blocking; fixes to be planned separately.

- 455 tests green.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 07:26:17 +02:00
AdrianoDev 4d9db750be chore: ruff py313, conftest unification, audit log, shared app factory
- pyproject.toml: ruff target-version py311 → py313 (42 lint warnings
  auto-fixed via UP rules); added consider_namespace_packages = true,
  which resolves the conftest collision between services and makes it
  possible to run pytest on the whole cross-service suite.

- mcp_common.audit: new audit_write_op() helper with a dedicated
  mcp.audit logger. Wired into all write endpoints of deribit, bybit,
  alpaca and hyperliquid (place_order, place_combo_order, cancel_*,
  set_*, close_*, transfer_*, switch_*, amend_*) with principal +
  target + non-sensitive payload + summarized result.

- mcp_common.app_factory: ExchangeAppSpec + run_exchange_main()
  centralizes the __main__.py boilerplate (configure_root_logging,
  fail_fast_if_missing, summarize, load creds, resolve_environment,
  load token store, uvicorn). The 4 exchange __main__.py files shrink
  from ~60 LOC each to ~25 declarative LOC. mcp_common.env_validation
  promoted from mcp_deribit (re-export shim kept for
  test_env_validation back-compat).

- 8 new tests (4 audit + 4 app_factory). Full suite: 450/450 green.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-28 00:27:02 +02:00
AdrianoDev a13e3fe045 feat: 15 new quant indicators (common + deribit + bybit + macro + sentiment)
Common (mcp_common):
- indicators.py: vol_cone, hurst_exponent, half_life_mean_reversion,
  garch11_forecast, autocorrelation, rolling_sharpe, var_cvar
- options.py (new): oi_weighted_skew, smile_asymmetry, atm_vs_wings_vol,
  dealer_gamma_profile, vanna_charm_aggregate
- microstructure.py (new): orderbook_imbalance (ratio + microprice + slope)
- stats.py (new): Engle-Granger cointegration_test + ADF helper

Deribit (+6 MCP tools):
- get_dealer_gamma_profile (net dealer gamma + flip level)
- get_vanna_charm (OI-weighted aggregate vanna/charm)
- get_oi_weighted_skew, get_smile_asymmetry, get_atm_vs_wings_vol
- get_orderbook_imbalance

Bybit (+2 MCP tools):
- get_orderbook_imbalance, get_basis_term_structure (dated futures curve)

Macro (+2 MCP tools):
- get_yield_curve_slope (2y10y/5y30y + butterfly + regime)
- get_breakeven_inflation (FRED T5YIE/T10YIE/T5YIFR)

Sentiment (+3 MCP tools):
- get_funding_arb_spread (compact annualized arb opportunities)
- get_liquidation_heatmap (heuristic from OI delta + funding extremes,
  no paid Coinglass feed)
- get_cointegration_pairs (Engle-Granger on Binance hourly crypto pairs)

All TDD, pure Python (no numpy/scipy in mcp_common). README updated
with the full list. 442 total tests green.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-27 23:58:07 +02:00
AdrianoDev 867180f4bf feat(gateway): auto TLS + rate limit + IP allowlist on write endpoints
Configures the Caddy gateway for the deploy at cerbero-mcp.tielogic.xyz:

- Custom Caddy build with the mholt/caddy-ratelimit plugin (Dockerfile +
  build via xcaddy).
- Automatic TLS via Let's Encrypt (requires a DNS A record + ports
  80/443 reachable), HSTS preload, security headers.
- Per-IP rate limiting (60 req/min on reads, 10 req/min on writes,
  sliding window).
- IP allowlist on write endpoints (place_*, cancel_*, set_*, close_*,
  transfer_*, amend_*, switch_*): IPs not in WRITE_ALLOWLIST → 403.
- The default WRITE_ALLOWLIST covers loopback + the Docker bridge: a
  bot on the same machine (host or container) works with no extra
  configuration; external public IPs must be added explicitly.
- Smoke test and README updated for the new gateway URL.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-27 23:24:06 +02:00
AdrianoDev c2fd8330ca feat(mcp-deribit,mcp-bybit): add place_combo_order
Deribit: private/create_combo + place_order on the combo instrument →
one spread crossing instead of N (lower expected slippage on liquid
structures). Core ACL + leverage cap on every leg.

Bybit: place_batch_order on category=option (atomic multi-leg, 1 API
round-trip). Rejects category != option (perp/linear have no native
batch support). orderLinkId auto-generated per leg.

All tests: deribit 48/48, bybit 123/123.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-27 23:12:09 +02:00
AdrianoDev bacd5aab33 docs: README and smoke updated with live checks + resolver kwargs
Documented the environment resolution precedence and the use of the new
default_base_url_live/testnet kwargs of resolve_environment. The smoke
README is extended with the 6 read-only live tool checks.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-27 22:29:46 +02:00
AdrianoDev 21da74e8a1 refactor(mcp-common): centralize base_url defaults in resolve_environment
Added optional kwargs default_base_url_live / default_base_url_testnet
to resolve_environment. Removed 8 duplicated creds.setdefault calls
from the 4 services (alpaca, bybit, deribit, hyperliquid), which now
pass the canonical URLs straight to the resolver.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-27 22:28:18 +02:00
AdrianoDev 81fb5e8c29 test(smoke): added read-only live tool checks on the 6 MCPs 2026-04-27 22:11:47 +02:00
AdrianoDev 1dbf9bbd7b fix(mcp-bybit): wire creds + env_info into create_app (was missing) 2026-04-27 21:39:25 +02:00
AdrianoDev d1cea403a7 fix(docker/base): rename package option-mcp-common → mcp-common 2026-04-27 21:36:21 +02:00
AdrianoDev 33f358c13a test(smoke): bash + curl + jq script for the 6 MCPs 2026-04-27 21:35:48 +02:00
AdrianoDev de19d42850 feat(mcp-alpaca): leverage_cap + paper resolver + environment_info 2026-04-27 17:57:17 +02:00
AdrianoDev fb8c43cc61 feat(mcp-hyperliquid): leverage_cap + testnet resolver + environment_info 2026-04-27 17:55:26 +02:00
AdrianoDev e958422fe5 feat(mcp-bybit): leverage_cap + testnet resolver + environment_info 2026-04-27 17:53:07 +02:00
AdrianoDev 0b471f207a feat(mcp-deribit): testnet resolver + environment_info tool + env override 2026-04-27 17:50:48 +02:00
AdrianoDev ecb2d0e4c2 refactor(mcp-deribit): replace risk_guard with local leverage_cap 2026-04-27 17:49:35 +02:00
89 changed files with 7691 additions and 506 deletions
@@ -36,3 +36,6 @@ config/*.env
# MCP config with token (only .example is tracked)
.mcp.json
# Local compose override (machine-specific, old-daemon fixes, etc.)
docker-compose.local.yml
@@ -0,0 +1,448 @@
# Deployment Cerbero_mcp
An operational guide for deploying the MCP suite on a public VPS.
The architecture: Gitea hosts the code + container registry; images are
built and pushed from the **development machine** (laptop) to the
registry; the production VPS builds nothing, it only pulls the
ready-made containers and relies on Watchtower for automatic rollover.
```
┌──────────────────────────┐ ┌─────────────────────────┐ ┌──────────────────────────────────┐
│ Laptop dev │ │ Gitea git.tielogic.xyz │ │ VPS produzione │
│ │ │ │ │ cerbero-mcp.tielogic.xyz │
│ build-push.sh ──push──▶ │───▶│ ┌────────────────────┐ │ │ │
│ (8 image) │ │ │ Container registry │ │ │ ┌────────────────────────────┐ │
│ git push ─────────────▶ │───▶│ └────────────────────┘ │◀──┼──┤ docker compose │ │
│ │ │ ┌────────────────────┐ │ pull │ (docker-compose.prod.yml) │ │
│ │ │ │ Cerbero-mcp repo │ │ │ │ gateway, mcp-* │ │
│ │ │ └────────────────────┘ │ │ │ watchtower (poll 5min) │ │
│ │ │ │ │ └────────────────────────────┘ │
└──────────────────────────┘ └─────────────────────────┘ └──────────────────────────────────┘
```
No CI/CD on Gitea: quality and builds are the laptop's responsibility
before pushing (local lint/tests, then `scripts/build-push.sh`).
## 1. Build & push images (from the laptop)
The `scripts/build-push.sh` script builds and pushes the 8 images to
the Gitea registry, replicating the old CI job locally. Prerequisites:
- `docker` + `buildx` on the laptop.
- A Gitea Personal Access Token with `write:package` scope (User
  Settings → Applications → Generate Token).
```bash
export GITEA_PAT='<PAT_write:package>'
export GITEA_USER=adriano
# All 8 images (base + gateway + 6 mcp-*)
./scripts/build-push.sh
# Only specific ones (e.g. after changing a single service)
./scripts/build-push.sh base mcp-bybit
```
The script:
- Runs `docker login git.tielogic.xyz`.
- Builds with `docker buildx build --push` (local buildx cache on the
  laptop, no registry cache: subsequent builds stay fast without
  weighing on the registry).
- Tags `:latest` + `:sha-<short_HEAD>`.
- For the mcp-* images, passes `BASE_IMAGE`/`BASE_TAG` as build args so
  they inherit from the freshly pushed `base` image.
Recommended order: build `base` before the `mcp-*` images (the script
does so by default when called without arguments).
## 1b. Local quality gate (recommended before pushing)
Before `build-push.sh`, run locally the checks that CI used to run:
```bash
uv run ruff check services/
uv run mypy services/common/src/mcp_common
uv run pytest services/ --tb=short
docker compose -f docker-compose.prod.yml config -q
```
All of them must be green before pushing images to the registry.
## 2a. Topology: standalone vs behind-Traefik
Cerbero_mcp supports two deployment topologies:
### Standalone (Caddy terminates TLS directly)
```
Internet ──[443]──► Caddy gateway ──► mcp-* services
(ACME Let's Encrypt)
```
Setup: `docker-compose.prod.yml` on its own. Caddy binds host ports
80/443 and obtains certificates automatically via ACME. Suited to a
dedicated VPS with nothing else on 80/443.
### Behind-Traefik (Traefik terminates TLS)
```
Internet ──[443]──► Traefik ──[traefik network]──► Caddy gateway ──► mcp-* services
(TLS+ACME) (rate-limit, IP allowlist)
```
Setup: `docker-compose.prod.yml` + the `docker-compose.traefik.yml`
overlay. Caddy does not bind on the host; it listens on plain HTTP
`:80` inside the `traefik` network. Traefik handles routing for
`Host(cerbero-mcp.tielogic.xyz)`, TLS, and ACME. Suited to a VPS shared
with other services (Gitea, etc.).
## 2. Automated deploy (no-clone script)
The quickest route is the idempotent `scripts/deploy-noclone.sh`. The
repo is **not** cloned on the VPS: the script downloads over raw HTTP
only the files strictly needed at runtime (compose files, Caddyfile,
public assets). Run on the VPS:
```bash
# Prerequisites
export GITEA_PAT="<PAT with read:package scope>"
export GITEA_USER=adriano
# Create the deploy dir and copy the secrets into it via scp from a safe location
sudo mkdir -p /docker/cerbero_mcp/secrets
sudo chown -R "$USER" /docker/cerbero_mcp
# scp deribit.json bybit.json hyperliquid.json alpaca.json \
# macro.json sentiment.json core.token observer.token \
# vps:/docker/cerbero_mcp/secrets/
# Behind Traefik (optional, only if the VPS is shared with Gitea or others)
# export BEHIND_TRAEFIK=true
# export TRAEFIK_NETWORK=gitea_traefik-public
curl -sL -o /tmp/deploy-noclone.sh \
  https://git.tielogic.xyz/Adriano/Cerbero-mcp/raw/branch/main/scripts/deploy-noclone.sh
chmod +x /tmp/deploy-noclone.sh
/tmp/deploy-noclone.sh
```
The script performs: docker login to the registry → downloads
`docker-compose.prod.yml`, `docker-compose.traefik.yml`,
`gateway/Caddyfile`, `gateway/public/*` into `/docker/cerbero_mcp/` →
chmod 600 on the secrets → generates an initial `.env` (testnet) →
creates `/var/log/cerbero-mcp` owned `1000:1000` → pulls images from
the registry → `docker compose up -d` → public smoke test.
To update later, re-run the same script (it preserves `.env` and the
secrets, and reloads the config from the updated `main` branch).
**Path overrides**: `DEPLOY_DIR` (default `/docker/cerbero_mcp`),
`SECRETS_SRC` (default `$DEPLOY_DIR/secrets`), `AUDIT_LOG_DIR` (default
`/var/log/cerbero-mcp`).
**Local compose override (`docker-compose.local.yml`)**: the script
automatically includes an optional `$DEPLOY_DIR/docker-compose.local.yml`
as the last `-f`. Useful for machine-specific fixes (e.g. forcing
`DOCKER_API_VERSION` on watchtower when the VPS daemon is older than
the expected API). Gitignored by design: it is never downloaded from
the repo, you create it by hand on the VPS. Example:
```yaml
# /docker/cerbero_mcp/docker-compose.local.yml
services:
  watchtower:
    environment:
      DOCKER_API_VERSION: "1.44"
```
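A minimal sketch of how the script assembles the final compose invocation under these rules (the `COMPOSE_FILES` variable is illustrative; the real script may build the list differently, and the command is echoed here rather than executed):

```shell
DEPLOY_DIR=${DEPLOY_DIR:-/docker/cerbero_mcp}
COMPOSE_FILES="-f docker-compose.prod.yml"
# Behind-Traefik overlay, if requested...
if [ "${BEHIND_TRAEFIK:-false}" = "true" ]; then
  COMPOSE_FILES="$COMPOSE_FILES -f docker-compose.traefik.yml"
fi
# ...then the machine-local override always goes last, so it wins on conflicts.
if [ -f "$DEPLOY_DIR/docker-compose.local.yml" ]; then
  COMPOSE_FILES="$COMPOSE_FILES -f $DEPLOY_DIR/docker-compose.local.yml"
fi
echo "docker compose $COMPOSE_FILES --env-file .env up -d"
```

Ordering matters because later `-f` files override earlier ones in `docker compose`.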
### Behind-Traefik mode
If a Traefik instance already runs on the VPS (e.g. the same VPS as
Gitea), add the following to your `.env` before running the script:
```bash
BEHIND_TRAEFIK=true
TRAEFIK_NETWORK=gitea_traefik-public # name of Traefik's external network
TRAEFIK_CERTRESOLVER=letsencrypt # resolver name in Traefik
TRAEFIK_ENTRYPOINT=websecure # Traefik HTTPS entrypoint
# Gateway ports no longer needed (Traefik binds 80/443):
# GATEWAY_HTTP_PORT, GATEWAY_HTTPS_PORT are not used.
```
The script detects `BEHIND_TRAEFIK=true` and uses
`docker compose -f docker-compose.prod.yml -f docker-compose.traefik.yml`.
The Caddy gateway does NOT bind host ports 80/443; it is exposed via
Traefik with labels for `Host(cerbero-mcp.tielogic.xyz)`.
Checking the Traefik network:
```bash
docker network ls | grep -i traefik
# Typically you will see: gitea_traefik-public, traefik_default, etc.
# Use the EXACT name as TRAEFIK_NETWORK in .env.
```
## 3. Safety: testnet → mainnet switch
`mcp_common.environment.consistency_check` (invoked at boot by
`run_exchange_main`) PREVENTS accidental switches:
- If the resolved environment is **mainnet** but the corresponding
  secret JSON does not contain an explicit `"environment": "mainnet"`
  → boot aborts with `EnvironmentMismatchError`.
- If the secret declares an environment different from the resolved one
  (e.g. `creds["environment"]="mainnet"` while an env var selects
  testnet) → boot aborts.
**To switch a specific exchange to mainnet** (e.g. bybit):
1. Edit `secrets/bybit.json`: add `"environment": "mainnet"`.
2. Edit `.env`: set `BYBIT_TESTNET=false`.
3. `docker compose -f docker-compose.prod.yml --env-file .env restart mcp-bybit`.
Without the explicit flag in the secret, the mcp-bybit container fails
at boot, and Watchtower will NOT roll forward onto versions with broken
mainnet creds.
Setting `STRICT_MAINNET=false` in `.env` allows mainnet without the
explicit confirmation (a safety downgrade, discouraged in production).
## 4. Persistent audit log
All write endpoints (`place_order`, `place_combo_order`, `cancel_*`,
`set_*`, `close_*`, `transfer_*`, `amend_*`, `switch_*`) emit a structured
JSON record on the `mcp.audit` logger.

**Sinks**:
- Container stdout/stderr (always, visible via `docker logs`).
- Persistent JSONL file on a host volume:
  `${AUDIT_LOG_DIR:-/var/log/cerbero-mcp}/<service>.audit.jsonl`.
  Rotated at midnight UTC with retention `AUDIT_LOG_BACKUP_DAYS`
  (default 30 days).
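The file sink's rotation behaviour (midnight UTC, N daily backups) maps directly onto Python's stdlib `TimedRotatingFileHandler`; a minimal sketch, assuming audit records arrive as pre-serialized JSON strings (the `make_audit_handler` wiring function is illustrative, not the project's actual code):

```python
import logging
import logging.handlers


def make_audit_handler(path: str, backup_days: int = 30) -> logging.Handler:
    # JSONL file rotated at midnight UTC, keeping `backup_days` old files
    # (corresponds to AUDIT_LOG_BACKUP_DAYS in the text above).
    handler = logging.handlers.TimedRotatingFileHandler(
        path, when="midnight", utc=True, backupCount=backup_days
    )
    # Records are already serialized JSON, so the formatter is a passthrough.
    handler.setFormatter(logging.Formatter("%(message)s"))
    return handler
```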
**Example record**:
```json
{
  "audit_event": "write_op",
  "action": "place_order",
  "exchange": "bybit",
  "principal": "core",
  "target": "BTCUSDT",
  "payload": {"side": "Buy", "qty": 0.01, "price": 60000, "leverage": 3},
  "result": {"order_id": "abc123", "status": "submitted"}
}
```
**Operational queries**:
```bash
# Follow today's entire audit log
tail -f /var/log/cerbero-mcp/*.audit.jsonl
# Only place_order on bybit
jq -c 'select(.action=="place_order" and .exchange=="bybit")' \
  /var/log/cerbero-mcp/bybit.audit.jsonl
# Errors
jq -c 'select(.error)' /var/log/cerbero-mcp/*.audit.jsonl
# Operations by a given principal
jq -c 'select(.principal=="core")' /var/log/cerbero-mcp/*.audit.jsonl
```
Secrets (api_key, password) are automatically redacted by `SecretsFilter`
before reaching any sink.
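A redaction filter of this kind can be sketched with a stdlib `logging.Filter`; this is an assumption about how `SecretsFilter` behaves (the key set and the `payload` attribute are illustrative, not the project's real implementation):

```python
import logging

# Illustrative key set — the real SecretsFilter may redact more fields.
SENSITIVE_KEYS = {"api_key", "api_secret", "password", "token"}


class SecretsFilter(logging.Filter):
    """Sketch of a redaction filter like the one mentioned above."""

    def filter(self, record: logging.LogRecord) -> bool:
        payload = getattr(record, "payload", None)
        if isinstance(payload, dict):
            record.payload = {
                k: ("***" if k in SENSITIVE_KEYS else v) for k, v in payload.items()
            }
        return True  # never drop the record, only redact it
```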
## 5. Initial VPS setup (manual, alternative to the script)
**Prerequisites**: Docker Engine ≥ 24, the `docker compose` plugin, SSH
access with sudo, a DNS A record `cerbero-mcp.tielogic.xyz` → VPS IP, and
ports 80 and 443 open on the firewall (ACME challenge + HTTPS traffic).
### a) Log in to the Gitea registry
Create a Personal Access Token on Gitea (`Settings → Applications →
Generate new token`) with the `read:package` scope. Then, on the VPS:
```bash
echo "$GITEA_PAT" | docker login git.tielogic.xyz -u <gitea-username> --password-stdin
```
The credentials are saved in `~/.docker/config.json`. Watchtower bind-mounts
it read-only to perform authenticated pulls.
### b) Create the deploy dir and download the config files
There is NO need to clone the repo on the VPS. The compose files, the
`Caddyfile`, and the gateway's public assets are enough:
```bash
sudo mkdir -p /docker/cerbero_mcp/{secrets,gateway/public}
sudo chown -R "$USER" /docker/cerbero_mcp
cd /docker/cerbero_mcp
BASE=https://git.tielogic.xyz/Adriano/Cerbero-mcp/raw/branch/main
curl -fsSL -o docker-compose.prod.yml $BASE/docker-compose.prod.yml
curl -fsSL -o docker-compose.traefik.yml $BASE/docker-compose.traefik.yml
curl -fsSL -o gateway/Caddyfile $BASE/gateway/Caddyfile
curl -fsSL -o gateway/public/index.html $BASE/gateway/public/index.html
curl -fsSL -o gateway/public/status.js $BASE/gateway/public/status.js
curl -fsSL -o gateway/public/style.css $BASE/gateway/public/style.css
```
The VPS does not need to build anything; `docker-compose.prod.yml` only
pulls from the registry.
### c) Prepare secrets
```bash
mkdir -p secrets
# Copy (via scp) the JSON files with real creds:
#   secrets/deribit.json, bybit.json, alpaca.json, hyperliquid.json,
#   secrets/macro.json, sentiment.json
#   secrets/core.token, observer.token
chmod 600 secrets/*
```
### d) `.env` with runtime configuration
Create `/docker/cerbero_mcp/.env`:
```bash
# Gateway
ACME_EMAIL=adrianodalpastro@tielogic.com
GATEWAY_HTTP_PORT=80
GATEWAY_HTTPS_PORT=443
WRITE_ALLOWLIST="127.0.0.1/32 ::1/128 172.16.0.0/12"
# Image tag — `latest` for Watchtower auto-update, or pin to sha-XXXXXXX
IMAGE_TAG=latest
IMAGE_PREFIX=git.tielogic.xyz/adriano/cerbero-mcp
# Exchange environment (true=testnet, false=mainnet)
DERIBIT_TESTNET=true
BYBIT_TESTNET=true
HYPERLIQUID_TESTNET=true
ALPACA_PAPER=true
# Watchtower polling interval (sec). 300 = 5 min default.
WATCHTOWER_POLL_INTERVAL=300
```
### e) Start
```bash
docker compose -f docker-compose.prod.yml --env-file .env pull
docker compose -f docker-compose.prod.yml --env-file .env up -d
docker compose -f docker-compose.prod.yml logs -f gateway
```
Caddy automatically requests the Let's Encrypt certificate on the first
contact at `https://cerbero-mcp.tielogic.xyz`.
## 6. Auto-update via Watchtower
Watchtower (the `watchtower` service in the compose file) polls the registry
every `WATCHTOWER_POLL_INTERVAL` seconds. When it finds a new digest behind
the `:latest` tag of a container labeled
`com.centurylinklabs.watchtower.enable=true`, it:
1. `docker pull`s the new image
2. gracefully `docker stop`s the old container
3. `docker rm`s it and starts the new container with the same config +
   secrets + volumes
4. cleans up the old image (`WATCHTOWER_CLEANUP=true`)

The labeled containers are: `gateway`, `mcp-deribit`, `mcp-bybit`,
`mcp-hyperliquid`, `mcp-alpaca`, `mcp-macro`, `mcp-sentiment`. The
`watchtower` container does not update itself (to avoid loops).
### Temporarily disabling auto-update
Pin a specific SHA in `.env`:
```bash
IMAGE_TAG=sha-6b7b3f7
docker compose -f docker-compose.prod.yml --env-file .env up -d
```
This stops `:latest` from being followed; to re-enable automatic rollover,
restore `IMAGE_TAG=latest`.
### Disabling auto-update for a single service
Remove the `com.centurylinklabs.watchtower.enable=true` label for that
service in the compose file (or set it to `=false`). Watchtower ignores it
but keeps updating the others.
## 7. Rollback
```bash
# Find the SHA of the previous version
docker images "git.tielogic.xyz/adriano/cerbero-mcp/*" --format "{{.Tag}}"
# Pin it in .env
IMAGE_TAG=sha-XXXXXXX
docker compose -f docker-compose.prod.yml --env-file .env up -d
```
Watchtower will NOT upgrade it back, because the digest of the pinned tag
matches the local one.
## 8. Post-deploy smoke test
```bash
# From outside the VPS (laptop)
curl -s https://cerbero-mcp.tielogic.xyz/mcp-macro/health
# {"status":"ok",...}
# Test the write-endpoint allowlist (must answer 403 from an external IP):
curl -X POST https://cerbero-mcp.tielogic.xyz/mcp-deribit/tools/place_order \
  -H "Authorization: Bearer $(cat secrets/core.token)" \
  -d '{"instrument_name":"BTC-PERPETUAL","side":"buy","amount":1}'
# 403 forbidden: source ip not in allowlist ← OK
# On the VPS:
GATEWAY=http://localhost bash tests/smoke/run.sh
```
## 9. VPS security
- `ufw` firewall: `allow 22, 80, 443`. Deny everything else inbound.
- `fail2ban` on SSH and (optionally) on Caddy's 401 log.
- Manual secret rotation: update the `secrets/*.token` files, then
  `docker compose restart` (tokens are reloaded at boot by each MCP
  service).
- Audit log via `docker compose logs <service> | grep audit_event` — for
  production, better to redirect it to syslog or a dedicated service.
## 10. Notes on Traefik / reverse proxy in front of Gitea
Gitea is exposed via Traefik (ROOT_URL `https://git.tielogic.xyz`). For
Docker image pushes, the reverse proxy must allow large upload bodies (a
single layer can exceed 100MB).
Traefik's defaults are fine, but if you see `413 Request Entity Too Large`
during `docker push`, raise the limit in the middleware:
```yaml
# traefik dynamic config
http:
  middlewares:
    gitea-upload:
      buffering:
        maxRequestBodyBytes: 524288000  # 500MB
```
Apply it as a middleware on the Gitea router.
## 11. Updating the compose files themselves (YAML)
Watchtower updates the **images**, not `docker-compose.prod.yml` or the
`Caddyfile`. If you change the structure (new services, new env vars,
gateway changes), re-run the no-clone script on the VPS; it re-downloads
the config files from Gitea's `main` branch and applies them:
```bash
/tmp/deploy-noclone.sh
```
The script is idempotent: it preserves `.env` and `secrets/`, only updates
the config files, then runs `pull` + `up -d`.
+131 -1
@@ -8,9 +8,76 @@ split documented in `docs/superpowers/specs/2026-04-27-split-mcp-core-design.md
## Services
- `mcp-alpaca`, `mcp-bybit`, `mcp-deribit`, `mcp-hyperliquid` — exchanges
  with `place_order`, `environment_info`, server-side leverage caps
- `mcp-deribit` and `mcp-bybit` additionally expose `place_combo_order`:
  - Deribit: `private/create_combo` + an order on the combo → a single
    spread crossing instead of N (lower expected slippage on liquid
    structures).
  - Bybit: `place_batch_order` with `category=option` → atomic multi-leg
    in a single API round-trip (no fee discount, only atomicity + latency).
- `mcp-macro`, `mcp-sentiment` — read-only market data
## Local startup
## Available quantitative indicators
### Common (`mcp_common.indicators` + `options` + `microstructure` + `stats`)
- Technical: `sma`, `rsi`, `macd`, `atr`, `adx`
- Volatility: `vol_cone` (multi-window RV with percentiles), `garch11_forecast`
- Statistical: `hurst_exponent`, `half_life_mean_reversion`, `autocorrelation`,
  `cointegration_test` (Engle-Granger)
- Risk: `rolling_sharpe` (Sharpe + Sortino), `var_cvar` (historical VaR/ES)
- Microstructure: `orderbook_imbalance` (ratio + microprice + slope)
- Options: `oi_weighted_skew`, `smile_asymmetry`, `atm_vs_wings_vol`,
  `dealer_gamma_profile`, `vanna_charm_aggregate`
### Deribit (exposed as MCP tools)
DVOL, GEX, P/C ratio, skew_25d, term_structure, iv_rank, realized_vol,
technical indicators, find_by_delta, calculate_spread_payoff.
**New**: `get_dealer_gamma_profile`, `get_vanna_charm`,
`get_oi_weighted_skew`, `get_smile_asymmetry`, `get_atm_vs_wings_vol`,
`get_orderbook_imbalance`.
### Bybit
Ticker, orderbook, OHLCV, funding rate (current + history), open interest,
spot/perp basis, technical indicators. **New**: `get_orderbook_imbalance`,
`get_basis_term_structure`.
### Macro
Treasury yields, FRED indicators, equity futures, asset prices, calendar.
**New**: `get_yield_curve_slope` (2y10y/5y30y slope + butterfly + regime),
`get_breakeven_inflation` (T5YIE/T10YIE/T5YIFR), `get_cot_tff` (CFTC TFF
report, equity/financial: ES/NQ/RTY/ZN/ZB/6E/6J/DX), `get_cot_disaggregated`
(CFTC Disaggregated report, commodities: CL/GC/SI/HG/ZW/ZC/ZS),
`get_cot_extreme_positioning` (percentile scanner, ≤5/≥95 on the watchlist).
### Sentiment
News (CryptoPanic/CoinDesk), social (LunarCrush), multi-exchange funding,
OI history. **New**: `get_funding_arb_spread` (compact arb opportunities),
`get_liquidation_heatmap` (heuristic from OI delta + funding extremes),
`get_cointegration_pairs` (Engle-Granger on crypto pairs).
## Build & deploy pipeline
No CI/CD on Gitea: building the 8 images is the development machine's
responsibility, done by `scripts/build-push.sh`. The flow is:
1. **Local quality gate** (on the laptop, before pushing):
   - `uv run ruff check services/`
   - `uv run mypy services/common/src/mcp_common`
   - `uv run pytest services/`
   - `docker compose -f docker-compose.prod.yml config -q`
2. **Build & push** (on the laptop):
   ```bash
   export GITEA_PAT='<PAT_write:package>'
   ./scripts/build-push.sh                 # all 8 images
   ./scripts/build-push.sh base mcp-bybit  # only specific ones
   ```
   Tags `:latest` + `:sha-<short_HEAD>` for precise rollbacks. Buildx cache
   via the registry itself (subsequent runs 5-10× faster).
3. **Auto-rollover on the VPS**: Watchtower polls the registry every 5 min
   and updates the containers when the `:latest` digest changes.
See [`DEPLOYMENT.md`](DEPLOYMENT.md) for build & push, no-clone VPS deploy
(`scripts/deploy-noclone.sh`), smoke tests, rollback.
## Local startup (dev)
```bash
docker compose up -d
@@ -21,3 +88,66 @@ bash tests/smoke/run.sh
See `secrets/*.json` and the `*_TESTNET` / `ALPACA_PAPER` variables in
`docker-compose.yml` for environment overrides.
### Deploy to the public VPS (`cerbero-mcp.tielogic.xyz`)
See [`DEPLOYMENT.md`](DEPLOYMENT.md) for the full end-to-end runbook.
The Caddy gateway is configured for:
- Automatic TLS via Let's Encrypt (requires a DNS A/AAAA record pointing
  to the VPS and ports 80+443 reachable).
- HSTS preload, security headers (`X-Content-Type-Options`,
  `X-Frame-Options`, `Referrer-Policy`).
- Per-IP rate limiting (60 req/min on reads, 10 req/min on writes) via the
  `mholt/caddy-ratelimit` plugin.
- IP allowlist on write endpoints (`place_*`, `cancel_*`, `set_*`,
  `close_*`, `transfer_*`, `amend_*`, `switch_*`): IPs not present in
  `WRITE_ALLOWLIST` receive `403 forbidden`.
Environment variables for the deploy:
```bash
# .env (on the VPS)
ACME_EMAIL=adrianodalpastro@tielogic.com
GATEWAY_HTTP_PORT=80
GATEWAY_HTTPS_PORT=443
# Write-endpoint allowlist (space-separated CIDRs). The default covers:
# - IPv4/IPv6 loopback (a bot on the VPS host calls http://localhost)
# - Docker bridge 172.16.0.0/12 (a bot container on the same compose network)
# Add the public IPs of your external bots, if any.
WRITE_ALLOWLIST="127.0.0.1/32 ::1/128 172.16.0.0/12 1.2.3.4/32"
```
Three scenarios for the trading bot:
1. Bot container on the same compose network → calls `http://gateway:80`
   internally. Source IP = Docker bridge → covered by the default.
2. Bot process on the VPS host → calls `http://localhost`. Source IP =
   `127.0.0.1` → covered by the default.
3. External bot (laptop, another server) → calls
   `https://cerbero-mcp.tielogic.xyz` over TLS. You must add the bot's
   public IP to `WRITE_ALLOWLIST`.
If `WRITE_ALLOWLIST` is left unset, the default is loopback + Docker
bridge: no external public IP can trigger orders.
On the VPS host, the secrets must have restrictive permissions:
```bash
chmod 600 secrets/*.json secrets/*.token
```
### Environment resolution (testnet/mainnet)
Every exchange service uses `mcp_common.environment.resolve_environment()`,
which applies this precedence:
1. override env var (`DERIBIT_TESTNET`, `BYBIT_TESTNET`,
   `HYPERLIQUID_TESTNET`, `ALPACA_PAPER`)
2. flag in the secret JSON (`testnet`, or `paper` for alpaca)
3. default `testnet`
The canonical live/testnet URLs are passed as kwargs
(`default_base_url_live` / `default_base_url_testnet`) directly to the
resolver — no need to duplicate them in the secret JSON, but if present
there they take precedence over the code defaults.
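The three-step precedence can be sketched as follows. The function name matches the text, but its signature and helper names here are assumptions for illustration, not the real `mcp_common` API:

```python
import os


def resolve_environment(exchange: str, creds: dict,
                        override_var: str, default_testnet: bool = True) -> bool:
    """Return True for testnet/paper. Sketch of the precedence described above;
    the signature is illustrative, not the real mcp_common one."""
    # 1. An explicit env-var override wins.
    raw = os.environ.get(override_var)
    if raw is not None:
        return raw.strip().lower() in ("1", "true", "yes")
    # 2. Flag in the secret JSON ("testnet", or "paper" for alpaca).
    flag = creds.get("paper") if exchange == "alpaca" else creds.get("testnet")
    if flag is not None:
        return bool(flag)
    # 3. Safe default: testnet.
    return default_testnet
```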
+205
@@ -0,0 +1,205 @@
# docker-compose.prod.yml — production VPS deploy.
#
# Differences vs docker-compose.yml (dev):
# - No `build:`, only `image:` from the Gitea registry.
# - Tag `latest` (Watchtower polls for new versions).
# - Adds a `watchtower` service that auto-updates containers labeled
#   `com.centurylinklabs.watchtower.enable=true` when the latest tag changes.
# - Registry auth: `docker login git.tielogic.xyz` once on the host
#   (Watchtower reads ~/.docker/config.json bind-mounted at /config.json).
#
# Usage on the VPS:
#   docker login git.tielogic.xyz
#   docker compose -f docker-compose.prod.yml --env-file .env up -d
#
# Variable overrides in `.env` next to the compose file:
#   ACME_EMAIL=adrianodalpastro@tielogic.com
#   WRITE_ALLOWLIST="127.0.0.1/32 ::1/128 172.16.0.0/12"
#   GATEWAY_HTTP_PORT=80
#   GATEWAY_HTTPS_PORT=443
#   IMAGE_TAG=latest   # or sha-XXXXXXX for a specific pin
networks:
  internal:
    driver: bridge

volumes:
  caddy-data:
  caddy-config:

secrets:
  deribit_credentials:
    file: ./secrets/deribit.json
  hyperliquid_wallet:
    file: ./secrets/hyperliquid.json
  bybit_credentials:
    file: ./secrets/bybit.json
  alpaca_credentials:
    file: ./secrets/alpaca.json
  macro_credentials:
    file: ./secrets/macro.json
  sentiment_credentials:
    file: ./secrets/sentiment.json

x-common-security: &common-security
  cap_drop: [ALL]
  security_opt:
    - no-new-privileges:true
  restart: unless-stopped
  networks: [internal]
  labels:
    com.centurylinklabs.watchtower.enable: "true"
  volumes:
    - ${AUDIT_LOG_DIR:-/var/log/cerbero-mcp}:/var/log/cerbero-mcp:rw

x-image-prefix: &image_prefix git.tielogic.xyz/adriano/cerbero-mcp

services:
  gateway:
    image: ${IMAGE_PREFIX:-git.tielogic.xyz/adriano/cerbero-mcp}/gateway:${IMAGE_TAG:-latest}
    restart: unless-stopped
    networks: [internal]
    security_opt:
      - no-new-privileges:true
    labels:
      com.centurylinklabs.watchtower.enable: "true"
    ports:
      - "${GATEWAY_HTTP_PORT:-80}:80"
      - "${GATEWAY_HTTPS_PORT:-443}:443"
    environment:
      ACME_EMAIL: ${ACME_EMAIL:-adrianodalpastro@tielogic.com}
      WRITE_ALLOWLIST: ${WRITE_ALLOWLIST:-127.0.0.1/32 ::1/128 172.16.0.0/12}
    volumes:
      - ./gateway/Caddyfile:/etc/caddy/Caddyfile:ro
      - ./gateway/public:/srv:ro
      - caddy-data:/data
      - caddy-config:/config
    depends_on:
      mcp-deribit: { condition: service_healthy }
      mcp-hyperliquid: { condition: service_healthy }
      mcp-bybit: { condition: service_healthy }
      mcp-alpaca: { condition: service_healthy }
      mcp-macro: { condition: service_healthy }
      mcp-sentiment: { condition: service_healthy }
    healthcheck:
      test: ["CMD", "wget", "-q", "--spider", "http://localhost/"]
      interval: 30s
      timeout: 5s
      retries: 3

  mcp-deribit:
    image: ${IMAGE_PREFIX:-git.tielogic.xyz/adriano/cerbero-mcp}/mcp-deribit:${IMAGE_TAG:-latest}
    <<: *common-security
    user: "1000:1000"
    read_only: true
    tmpfs:
      - /tmp:rw,size=64M,mode=1777
    secrets: [deribit_credentials, core_token, observer_token]
    environment:
      CREDENTIALS_FILE: /run/secrets/deribit_credentials
      CORE_TOKEN_FILE: /run/secrets/core_token
      OBSERVER_TOKEN_FILE: /run/secrets/observer_token
      DERIBIT_TESTNET: "${DERIBIT_TESTNET:-true}"
      ROOT_PATH: /mcp-deribit
      AUDIT_LOG_FILE: /var/log/cerbero-mcp/deribit.audit.jsonl

  mcp-hyperliquid:
    image: ${IMAGE_PREFIX:-git.tielogic.xyz/adriano/cerbero-mcp}/mcp-hyperliquid:${IMAGE_TAG:-latest}
    <<: *common-security
    user: "1000:1000"
    read_only: true
    tmpfs:
      - /tmp:rw,size=64M,mode=1777
    secrets: [hyperliquid_wallet, core_token, observer_token]
    environment:
      HYPERLIQUID_WALLET_FILE: /run/secrets/hyperliquid_wallet
      CORE_TOKEN_FILE: /run/secrets/core_token
      OBSERVER_TOKEN_FILE: /run/secrets/observer_token
      HYPERLIQUID_TESTNET: "${HYPERLIQUID_TESTNET:-true}"
      ROOT_PATH: /mcp-hyperliquid
      AUDIT_LOG_FILE: /var/log/cerbero-mcp/hyperliquid.audit.jsonl

  mcp-bybit:
    image: ${IMAGE_PREFIX:-git.tielogic.xyz/adriano/cerbero-mcp}/mcp-bybit:${IMAGE_TAG:-latest}
    <<: *common-security
    user: "1000:1000"
    read_only: true
    tmpfs:
      - /tmp:rw,size=64M,mode=1777
    secrets: [bybit_credentials, core_token, observer_token]
    environment:
      BYBIT_CREDENTIALS_FILE: /run/secrets/bybit_credentials
      CORE_TOKEN_FILE: /run/secrets/core_token
      OBSERVER_TOKEN_FILE: /run/secrets/observer_token
      BYBIT_TESTNET: "${BYBIT_TESTNET:-true}"
      ROOT_PATH: /mcp-bybit
      AUDIT_LOG_FILE: /var/log/cerbero-mcp/bybit.audit.jsonl
      PORT: "9019"

  mcp-alpaca:
    image: ${IMAGE_PREFIX:-git.tielogic.xyz/adriano/cerbero-mcp}/mcp-alpaca:${IMAGE_TAG:-latest}
    <<: *common-security
    user: "1000:1000"
    read_only: true
    tmpfs:
      - /tmp:rw,size=64M,mode=1777
    secrets: [alpaca_credentials, core_token, observer_token]
    environment:
      ALPACA_CREDENTIALS_FILE: /run/secrets/alpaca_credentials
      CORE_TOKEN_FILE: /run/secrets/core_token
      OBSERVER_TOKEN_FILE: /run/secrets/observer_token
      ALPACA_PAPER: "${ALPACA_PAPER:-true}"
      ROOT_PATH: /mcp-alpaca
      AUDIT_LOG_FILE: /var/log/cerbero-mcp/alpaca.audit.jsonl
      PORT: "9020"

  mcp-macro:
    image: ${IMAGE_PREFIX:-git.tielogic.xyz/adriano/cerbero-mcp}/mcp-macro:${IMAGE_TAG:-latest}
    <<: *common-security
    user: "1000:1000"
    read_only: true
    tmpfs:
      - /tmp:rw,size=64M,mode=1777
    secrets: [macro_credentials, core_token, observer_token]
    environment:
      MACRO_CREDENTIALS_FILE: /run/secrets/macro_credentials
      CORE_TOKEN_FILE: /run/secrets/core_token
      OBSERVER_TOKEN_FILE: /run/secrets/observer_token
      ROOT_PATH: /mcp-macro

  mcp-sentiment:
    image: ${IMAGE_PREFIX:-git.tielogic.xyz/adriano/cerbero-mcp}/mcp-sentiment:${IMAGE_TAG:-latest}
    <<: *common-security
    user: "1000:1000"
    read_only: true
    tmpfs:
      - /tmp:rw,size=64M,mode=1777
    secrets: [sentiment_credentials, core_token, observer_token]
    environment:
      SENTIMENT_CREDENTIALS_FILE: /run/secrets/sentiment_credentials
      CORE_TOKEN_FILE: /run/secrets/core_token
      OBSERVER_TOKEN_FILE: /run/secrets/observer_token
      ROOT_PATH: /mcp-sentiment

  # ========================================================
  # WATCHTOWER — auto-updates containers labeled enable=true
  # ========================================================
  watchtower:
    image: containrrr/watchtower:latest
    restart: unless-stopped
    networks: [internal]
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
      - ${HOME}/.docker/config.json:/config.json:ro
    environment:
      WATCHTOWER_LABEL_ENABLE: "true"
      WATCHTOWER_CLEANUP: "true"
      WATCHTOWER_POLL_INTERVAL: "${WATCHTOWER_POLL_INTERVAL:-300}"
      WATCHTOWER_INCLUDE_RESTARTING: "false"
      WATCHTOWER_NOTIFICATIONS_LEVEL: info
      WATCHTOWER_LOG_LEVEL: info
    command: --interval ${WATCHTOWER_POLL_INTERVAL:-300}
+60
@@ -0,0 +1,60 @@
# docker-compose.traefik.yml — overlay to integrate Cerbero_mcp with a
# Traefik already running on the host (e.g. the same VPS hosting Gitea).
#
# USAGE:
#   docker compose -f docker-compose.prod.yml -f docker-compose.traefik.yml \
#     --env-file .env up -d
#
# Differences vs standalone docker-compose.prod.yml:
# - The Caddy gateway has NO host port bindings (Traefik is the public
#   entry point on 80/443).
# - The gateway is also connected to the external `traefik` network
#   (override TRAEFIK_NETWORK via env if it differs, e.g. `gitea_traefik-public`).
# - Caddy does NOT do auto-TLS — Traefik terminates TLS and handles ACME
#   Let's Encrypt. Caddy listens in plaintext on :80 inside the Docker
#   network.
# - Trusted proxies: Caddy honors the X-Forwarded-For received from
#   Traefik for `remote_ip` matching (rate limit + WRITE_ALLOWLIST).
# - Traefik labels on the gateway: Host(`cerbero-mcp.tielogic.xyz`)
#   routing + automatic TLS.
#
# Additional .env variables required:
#   TRAEFIK_NETWORK=gitea_traefik-public  # name of Traefik's network
#   TRAEFIK_CERTRESOLVER=letsencrypt      # resolver name in your Traefik config
#   TRAEFIK_ENTRYPOINT=websecure          # Traefik HTTPS entrypoint
networks:
  traefik:
    external: true
    name: ${TRAEFIK_NETWORK:-gitea_traefik-public}

services:
  gateway:
    # Override: no host port binding, traffic flows only through Traefik
    ports: !reset []
    networks:
      - internal
      - traefik
    environment:
      ACME_EMAIL: ${ACME_EMAIL:-adrianodalpastro@tielogic.com}
      WRITE_ALLOWLIST: ${WRITE_ALLOWLIST:-127.0.0.1/32 ::1/128 172.16.0.0/12}
      # Behind-proxy mode: Caddy listens on plain HTTP :80, no auto_https
      LISTEN: ":80"
      AUTO_HTTPS: "off"
      # Traefik is the forwarding proxy; trust private ranges + optionally Traefik's CIDR
      TRUSTED_PROXIES: ${TRUSTED_PROXIES:-private_ranges}
    labels:
      com.centurylinklabs.watchtower.enable: "true"
      traefik.enable: "true"
      traefik.docker.network: ${TRAEFIK_NETWORK:-gitea_traefik-public}
      traefik.http.routers.cerbero-mcp.rule: "Host(`cerbero-mcp.tielogic.xyz`)"
      traefik.http.routers.cerbero-mcp.entrypoints: ${TRAEFIK_ENTRYPOINT:-websecure}
      traefik.http.routers.cerbero-mcp.tls: "true"
      traefik.http.routers.cerbero-mcp.tls.certresolver: ${TRAEFIK_CERTRESOLVER:-letsencrypt}
      traefik.http.services.cerbero-mcp.loadbalancer.server.port: "80"
      # Security headers at the Traefik level (redundant with Caddy, but
      # useful if Caddy is ever removed). Comment out to avoid duplication.
      traefik.http.routers.cerbero-mcp.middlewares: cerbero-mcp-secheaders@docker
      traefik.http.middlewares.cerbero-mcp-secheaders.headers.stsSeconds: "31536000"
      traefik.http.middlewares.cerbero-mcp-secheaders.headers.stsIncludeSubdomains: "true"
      traefik.http.middlewares.cerbero-mcp-secheaders.headers.contentTypeNosniff: "true"
      traefik.http.middlewares.cerbero-mcp-secheaders.headers.referrerPolicy: "no-referrer"
+14 -4
@@ -36,12 +36,20 @@ services:
# GATEWAY — unica porta host, reverse proxy + landing page
# ========================================================
gateway:
image: caddy:2-alpine
build:
context: ./gateway
dockerfile: Dockerfile
image: cerbero-gateway:dev
restart: unless-stopped
networks: [internal]
security_opt:
- no-new-privileges:true
ports: ["${GATEWAY_PORT:-8080}:8080"]
ports:
- "${GATEWAY_HTTP_PORT:-80}:80"
- "${GATEWAY_HTTPS_PORT:-443}:443"
environment:
ACME_EMAIL: ${ACME_EMAIL:-adrianodalpastro@tielogic.com}
WRITE_ALLOWLIST: ${WRITE_ALLOWLIST:-127.0.0.1/32 ::1/128 172.16.0.0/12}
volumes:
- ./gateway/Caddyfile:/etc/caddy/Caddyfile:ro
- ./gateway/public:/srv:ro
@@ -55,7 +63,7 @@ services:
mcp-macro: { condition: service_healthy }
mcp-sentiment: { condition: service_healthy }
healthcheck:
test: ["CMD", "wget", "-q", "--spider", "http://localhost:8080/"]
test: ["CMD", "wget", "-q", "--spider", "http://localhost/"]
interval: 30s
timeout: 5s
retries: 3
@@ -78,6 +86,7 @@ services:
CREDENTIALS_FILE: /run/secrets/deribit_credentials
CORE_TOKEN_FILE: /run/secrets/core_token
OBSERVER_TOKEN_FILE: /run/secrets/observer_token
DERIBIT_TESTNET: "true" # override secrets/deribit.json testnet flag
ROOT_PATH: /mcp-deribit
mcp-hyperliquid:
@@ -95,6 +104,7 @@ services:
HYPERLIQUID_WALLET_FILE: /run/secrets/hyperliquid_wallet
CORE_TOKEN_FILE: /run/secrets/core_token
OBSERVER_TOKEN_FILE: /run/secrets/observer_token
HYPERLIQUID_TESTNET: "true" # override secrets/hyperliquid.json testnet flag
ROOT_PATH: /mcp-hyperliquid
mcp-bybit:
@@ -112,7 +122,7 @@ services:
BYBIT_CREDENTIALS_FILE: /run/secrets/bybit_credentials
CORE_TOKEN_FILE: /run/secrets/core_token
OBSERVER_TOKEN_FILE: /run/secrets/observer_token
BYBIT_TESTNET: "false"
BYBIT_TESTNET: "true" # override secrets/bybit.json testnet flag
ROOT_PATH: /mcp-bybit
PORT: "9019"
+2 -2
@@ -6,7 +6,7 @@ RUN pip install --no-cache-dir "uv>=0.5,<0.7"
 WORKDIR /app
 COPY pyproject.toml uv.lock ./
 COPY services/common ./services/common
-RUN uv sync --frozen --no-dev --package option-mcp-common
+RUN uv sync --frozen --no-dev --package mcp-common
 ENV PATH="/app/.venv/bin:$PATH"
 FROM base AS dev
-RUN uv sync --frozen --package option-mcp-common
+RUN uv sync --frozen --package mcp-common
+3 -2
@@ -1,6 +1,7 @@
+ARG BASE_IMAGE=cerbero-base
 ARG BASE_TAG=latest
-FROM cerbero-base:${BASE_TAG} AS builder
+FROM ${BASE_IMAGE}:${BASE_TAG} AS builder
 COPY services/mcp-alpaca ./services/mcp-alpaca
 RUN uv sync --frozen --no-dev --package mcp-alpaca
@@ -19,6 +20,6 @@ ENV HOST=0.0.0.0 PORT=9020
 EXPOSE 9020
 HEALTHCHECK --interval=30s --timeout=5s --retries=3 --start-period=15s \
-  CMD python -c "import httpx, os; httpx.get(f'http://localhost:{os.environ.get(\"PORT\",\"9020\")}/health').raise_for_status()"
+  CMD python -c "import os, urllib.request; urllib.request.urlopen(f'http://localhost:{os.environ.get(\"PORT\",\"9020\")}/health', timeout=3).close()"
 CMD ["mcp-alpaca"]
+3 -2
@@ -1,6 +1,7 @@
+ARG BASE_IMAGE=cerbero-base
 ARG BASE_TAG=latest
-FROM cerbero-base:${BASE_TAG} AS builder
+FROM ${BASE_IMAGE}:${BASE_TAG} AS builder
 COPY services/mcp-bybit ./services/mcp-bybit
 RUN uv sync --frozen --no-dev --package mcp-bybit
@@ -19,6 +20,6 @@ ENV HOST=0.0.0.0 PORT=9019
 EXPOSE 9019
 HEALTHCHECK --interval=30s --timeout=5s --retries=3 --start-period=15s \
-  CMD python -c "import httpx, os; httpx.get(f'http://localhost:{os.environ.get(\"PORT\",\"9019\")}/health').raise_for_status()"
+  CMD python -c "import os, urllib.request; urllib.request.urlopen(f'http://localhost:{os.environ.get(\"PORT\",\"9019\")}/health', timeout=3).close()"
 CMD ["mcp-bybit"]
+3 -2
@@ -1,8 +1,9 @@
 # CER-P5-012 multi-stage slim: builder from cerbero-base (with uv + toolchain),
 # runtime from python:3.11-slim (venv + source only).
+ARG BASE_IMAGE=cerbero-base
 ARG BASE_TAG=latest
-FROM cerbero-base:${BASE_TAG} AS builder
+FROM ${BASE_IMAGE}:${BASE_TAG} AS builder
 COPY services/mcp-deribit ./services/mcp-deribit
 RUN uv sync --frozen --no-dev --package mcp-deribit
@@ -21,6 +22,6 @@ ENV HOST=0.0.0.0 PORT=9011
 EXPOSE 9011
 HEALTHCHECK --interval=30s --timeout=5s --retries=3 --start-period=15s \
-  CMD python -c "import httpx, os; httpx.get(f'http://localhost:{os.environ.get(\"PORT\",\"9011\")}/health').raise_for_status()"
+  CMD python -c "import os, urllib.request; urllib.request.urlopen(f'http://localhost:{os.environ.get(\"PORT\",\"9011\")}/health', timeout=3).close()"
 CMD ["mcp-deribit"]
+3 -2
@@ -1,6 +1,7 @@
+ARG BASE_IMAGE=cerbero-base
 ARG BASE_TAG=latest
-FROM cerbero-base:${BASE_TAG} AS builder
+FROM ${BASE_IMAGE}:${BASE_TAG} AS builder
 COPY services/mcp-hyperliquid ./services/mcp-hyperliquid
 RUN uv sync --frozen --no-dev --package mcp-hyperliquid
@@ -19,6 +20,6 @@ ENV HOST=0.0.0.0 PORT=9012
 EXPOSE 9012
 HEALTHCHECK --interval=30s --timeout=5s --retries=3 --start-period=15s \
-  CMD python -c "import httpx, os; httpx.get(f'http://localhost:{os.environ.get(\"PORT\",\"9012\")}/health').raise_for_status()"
+  CMD python -c "import os, urllib.request; urllib.request.urlopen(f'http://localhost:{os.environ.get(\"PORT\",\"9012\")}/health', timeout=3).close()"
 CMD ["mcp-hyperliquid"]
+3 -2
@@ -1,6 +1,7 @@
+ARG BASE_IMAGE=cerbero-base
 ARG BASE_TAG=latest
-FROM cerbero-base:${BASE_TAG} AS builder
+FROM ${BASE_IMAGE}:${BASE_TAG} AS builder
 COPY services/mcp-macro ./services/mcp-macro
 RUN uv sync --frozen --no-dev --package mcp-macro
@@ -19,6 +20,6 @@ ENV HOST=0.0.0.0 PORT=9013
 EXPOSE 9013
 HEALTHCHECK --interval=30s --timeout=5s --retries=3 --start-period=15s \
-  CMD python -c "import httpx, os; httpx.get(f'http://localhost:{os.environ.get(\"PORT\",\"9013\")}/health').raise_for_status()"
+  CMD python -c "import os, urllib.request; urllib.request.urlopen(f'http://localhost:{os.environ.get(\"PORT\",\"9013\")}/health', timeout=3).close()"
 CMD ["mcp-macro"]
+3 -2
@@ -1,6 +1,7 @@
+ARG BASE_IMAGE=cerbero-base
 ARG BASE_TAG=latest
-FROM cerbero-base:${BASE_TAG} AS builder
+FROM ${BASE_IMAGE}:${BASE_TAG} AS builder
 COPY services/mcp-sentiment ./services/mcp-sentiment
 RUN uv sync --frozen --no-dev --package mcp-sentiment
@@ -19,6 +20,6 @@ ENV HOST=0.0.0.0 PORT=9014
 EXPOSE 9014
 HEALTHCHECK --interval=30s --timeout=5s --retries=3 --start-period=15s \
-  CMD python -c "import httpx, os; httpx.get(f'http://localhost:{os.environ.get(\"PORT\",\"9014\")}/health').raise_for_status()"
+  CMD python -c "import os, urllib.request; urllib.request.urlopen(f'http://localhost:{os.environ.get(\"PORT\",\"9014\")}/health', timeout=3).close()"
 CMD ["mcp-sentiment"]
File diff suppressed because it is too large
@@ -0,0 +1,212 @@
# COT Report — design spec (mcp-macro)
**Date**: 2026-04-27
**Target service**: `mcp-macro`
**Scope**: add support for the Commitment of Traders (COT) report
published by the CFTC as a source of institutional positioning for
options, equity ETFs, and commodities.
## 1. Motivation
The weekly CFTC COT is one of the most widely followed positioning signals
for futures under US jurisdiction (equities, bonds, currencies, energy,
metals, agriculturals). It is entirely missing from `mcp-macro`, which
today covers only yields, FRED, calendar, and equity futures spot prices.
Primary uses in the Cerbero context:
- **Deribit options overlay**: BTC is structurally correlated with the
  Nasdaq, and *Leveraged Funds* positioning on NQ is a proxy for systemic
  equity risk. When lev funds are extremely short equity, the upside IV
  premium compresses → a squeeze becomes likely.
- **ETF signals**: *Asset Manager net* (TFF) approximates long-only
  institutional flow (SPY, QQQ).
- **Commodities**: *Producer/Merchant* (commercial hedgers) and *Managed
  Money* (hedge funds / CTAs) are the real top/bottom signals for oil,
  gold, copper, and agriculturals.
## 2. Decision: two reports, not one
Approach adopted:
- **Equity / financial** (S&P, NDX, Russell, treasuries, currencies) →
  **TFF** (*Traders in Financial Futures*).
- **Commodities** (oil, gold, silver, copper, grains) →
  **Disaggregated** (futures-only & options combined).
- **Legacy** (non-commercial vs commercial) → **excluded**: the report is
  obsolete and too aggregated, losing the granularity across the 4
  institutional roles.
Rationale: the two reports cover the 13 watchlist symbols with maximum
granularity and no overlap.
## 3. Data sources
Public CFTC API (no API key required), Socrata endpoint:
`https://publicreporting.cftc.gov/resource/<dataset>.json`.

| Report | Dataset ID | Frequency | Content |
| -------------------------- | ----------- | --------------------- | ---------------------------------------------------------------------- |
| TFF F&O combined | `gpe5-46if` | weekly (Fri 15:30 ET) | Dealer/Intermediary, Asset Manager, Leveraged Funds, Other Reportables |
| Disaggregated F&O combined | `72hh-3qpy` | weekly (Fri 15:30 ET) | Producer/Merchant, Swap Dealer, Managed Money, Other Reportables |

Data are observed as of the **Tuesday** of the week and published the
following **Friday** at 15:30 ET.
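A Socrata (SODA) query for these datasets can be sketched as a URL builder. The dataset IDs come from the table above; the field names (`cftc_contract_market_code`, `report_date_as_yyyy_mm_dd`) are assumptions to be checked against the dataset schema before use:

```python
from urllib.parse import urlencode

CFTC_BASE = "https://publicreporting.cftc.gov/resource"
# Dataset IDs from the table above.
DATASETS = {"tff": "gpe5-46if", "disaggregated": "72hh-3qpy"}


def cot_query_url(report_type: str, contract_code: str, limit: int = 52) -> str:
    """Build a SODA query URL for one contract, newest rows first.

    Field names are assumptions — verify against the dataset schema.
    """
    params = {
        "cftc_contract_market_code": contract_code,
        "$order": "report_date_as_yyyy_mm_dd DESC",
        "$limit": str(limit),
    }
    return f"{CFTC_BASE}/{DATASETS[report_type]}.json?{urlencode(params)}"
```

Fetching is then a plain HTTP GET of the returned URL (the project uses httpx for this, per the test plan in section 6).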
## 4. Symbol watchlist
### TFF
- `ES` (E-mini S&P 500)
- `NQ` (E-mini Nasdaq-100)
- `RTY` (E-mini Russell 2000)
- `ZN` (10-Year T-Note)
- `ZB` (30-Year T-Bond)
- `6E` (Euro FX)
- `6J` (Japanese Yen)
- `DX` (US Dollar Index)
### Disaggregated
- `CL` (Crude Oil WTI)
- `GC` (Gold)
- `SI` (Silver)
- `HG` (Copper)
- `ZW` (Wheat)
- `ZC` (Corn)
- `ZS` (Soybeans)
The `ticker → cftc_contract_market_code` mapping is kept as a constant in
the module (e.g. ES → `13874A`, CL → `067651`). The codes are public CFTC
identifiers and do not change.
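A minimal sketch of the `cot_contracts.py` constant; only the ES and CL codes are documented above, so the rest of the watchlist is elided here, and the `(report_type, code)` tuple shape is an assumption:

```python
# Sketch — the real module would map all 13 watchlist symbols.
SYMBOL_TO_CFTC_CODE: dict[str, tuple[str, str]] = {
    "ES": ("tff", "13874A"),            # E-mini S&P 500
    "CL": ("disaggregated", "067651"),  # Crude Oil WTI
}

def lookup_contract(symbol: str) -> tuple[str, str]:
    """Return (report_type, cftc_code); unknown symbols map to the 400 path."""
    try:
        return SYMBOL_TO_CFTC_CODE[symbol]
    except KeyError:
        raise ValueError(
            f"unknown_symbol: available={sorted(SYMBOL_TO_CFTC_CODE)}"
        ) from None
```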
## 5. MCP tools exposed
Three tools, all `reads` (core + observer):
### 5.1 `get_cot_tff(symbol, lookback_weeks=52)`
Returns the TFF time series for an equity/financial symbol.
Output:
```json
{
"symbol": "ES",
"report_type": "tff",
"rows": [
{
"report_date": "2026-04-22",
"dealer_long": 12345, "dealer_short": 23456, "dealer_net": -11111,
"asset_mgr_long": 654321, "asset_mgr_short": 200000, "asset_mgr_net": 454321,
"lev_funds_long": 100000, "lev_funds_short": 350000, "lev_funds_net": -250000,
"other_long": 50000, "other_short": 50000, "other_net": 0,
"open_interest": 2500000
}
],
"data_timestamp": "2026-04-27T20:00:00Z"
}
```
### 5.2 `get_cot_disaggregated(symbol, lookback_weeks=52)`
Same shape, different fields: `producer_*`, `swap_*`, `managed_money_*`,
`other_*`.
### 5.3 `get_cot_extreme_positioning(lookback_weeks=156)`
Scanner that returns, for each watchlist symbol, the historical percentile
(1y or 3y) of the latest *net position* for the key role (Leveraged Funds
for TFF, Managed Money for Disaggregated). Flags extremes at percentiles
≤ 5 or ≥ 95.
Output:
```json
{
"lookback_weeks": 156,
"extremes": [
{
"symbol": "ES", "report_type": "tff",
"key_role": "lev_funds",
"current_net": -250000,
"percentile": 3.2,
"signal": "extreme_short",
"report_date": "2026-04-22"
}
],
"data_timestamp": "2026-04-27T20:00:00Z"
}
```
`signal` ∈ `{"extreme_short", "extreme_long", "neutral"}`.
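The percentile scan above can be sketched in pure logic; the function names follow the test plan in §9, but the implementation below is an assumption, not the service's actual code:

```python
def compute_percentile(value: float, history: list[float]) -> float:
    """Percentile rank of `value` within `history`, on a 0-100 scale
    (share of historical observations <= value)."""
    if not history:
        return 50.0
    below = sum(1 for h in history if h <= value)
    return 100.0 * below / len(history)

def classify_extreme(percentile: float, threshold: float = 5.0) -> str:
    """Map a percentile to the `signal` values of §5.3."""
    if percentile <= threshold:
        return "extreme_short"   # net position deep in the lows
    if percentile >= 100.0 - threshold:
        return "extreme_long"    # net position deep in the highs
    return "neutral"
```

With the ES example above (percentile 3.2), `classify_extreme(3.2)` yields `"extreme_short"`.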
## 6. Architecture
```text
mcp-macro/
  src/mcp_macro/
    fetchers.py       # existing — adds fetch_cot_tff, fetch_cot_disaggregated, fetch_cot_extreme_positioning
    cot_contracts.py  # NEW — constants SYMBOL_TO_CFTC_CODE, CFTC_FIELD_MAPPINGS
    server.py         # existing — adds 3 endpoints + body models
  tests/
    test_cot.py       # NEW — pure-logic tests for parsing + percentile + extreme detection
    test_fetchers.py  # existing — adds integration tests with httpx_mock
```
Pure logic (percentile computation, extreme classification) lives in
`fetchers` and is tested independently of the HTTP layer. The async
fetchers use `mcp_common.http.async_client` (retry transport already in
place).
## 7. Cache
- A Socrata call typically responds in 200-800 ms.
- COT is released weekly on Friday evening ET → a 1-hour cache TTL is
overkill but safe. Reuse the existing `_TREASURY_CACHE` pattern in
`fetchers.py` (key `(symbol, report_type, lookback_weeks)`).
## 8. Edge cases
- **Pre-publication (e.g. Wednesday)**: the latest report is the previous
week's. Nothing to handle — the API returns the latest available.
- **Symbol outside the watchlist**: `get_cot_tff("INVALID")` → 400 with
payload `{"error": "unknown_symbol", "available": [...]}`.
- **CFTC API down**: the retry transport handles transient failures. On
persistent 5xx: return `{"rows": [], "error": "cftc_unavailable"}`.
- **Lookback too short** (< 4 weeks) → the percentile in extreme
positioning is unreliable. Pydantic validation: `lookback_weeks ≥ 4`.
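In the service this check would live in the request body model (Pydantic, e.g. `lookback_weeks: int = Field(156, ge=4)`); sketched here without the dependency, as an assumed equivalent:

```python
MIN_LOOKBACK_WEEKS = 4  # below this, the percentile is statistically meaningless

def validate_lookback(lookback_weeks: int) -> int:
    """Plain-Python mirror of the Pydantic `ge=4` constraint."""
    if lookback_weeks < MIN_LOOKBACK_WEEKS:
        raise ValueError(
            f"lookback_weeks must be >= {MIN_LOOKBACK_WEEKS}, got {lookback_weeks}"
        )
    return lookback_weeks
```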
## 9. Test plan
Pure logic (no HTTP):
- `compute_percentile(value, history)` with known cases.
- `classify_extreme(percentile, threshold=5)` → boundary cases.
- `parse_tff_row()` and `parse_disaggregated_row()` on mock Socrata
payloads (real documented fields).
Integration (httpx_mock):
- `fetch_cot_tff("ES", lookback_weeks=52)` with a mocked CFTC response →
verify output shape + rows ordered ASC by date.
- `fetch_cot_extreme_positioning()` with data covering both extreme and
neutral cases → verify filtering and signals.
ACL tests (TestClient):
- `POST /tools/get_cot_tff` with core/observer/no-auth → 200/200/401.
- `POST /tools/get_cot_extreme_positioning` likewise.
## 10. Out of scope (version 1)
- **History beyond 3 years**: the CFTC API has everything back to 2010,
but `lookback` defaults to 52w (= 1 year) with a reasonable max of 156w.
Decade-long history can be added in v2 if needed for backtests.
- **Disaggregated futures-only** (a different dataset from F&O combined):
less used, skipped.
- **Notification on the weekly release**: the bot should schedule for
Friday 16:00 ET; this is not the MCP server's responsibility.
- **Legacy report**: excluded (see §2).
- **Cross-symbol aggregation** (e.g. "all metals combined"): the user
composes it via the individual tools.
+56 -6
@@ -1,23 +1,73 @@
{
admin off
auto_https off
email {$ACME_EMAIL:adrianodalpastro@tielogic.com}
auto_https {$AUTO_HTTPS:on}
# Plugin mholt/caddy-ratelimit
order rate_limit before basicauth
# Trusted proxies: honor X-Forwarded-For when behind a reverse proxy
# (e.g. Traefik). Default = private ranges only.
servers {
trusted_proxies static {$TRUSTED_PROXIES:private_ranges}
}
}
:8080 {
{$LISTEN:cerbero-mcp.tielogic.xyz} {
log {
output stdout
format console
format json
}
# ───── Security headers ─────
header {
Strict-Transport-Security "max-age=31536000; includeSubDomains; preload"
X-Content-Type-Options "nosniff"
X-Frame-Options "DENY"
Referrer-Policy "no-referrer"
-Server
}
# ───── IP allowlist on write endpoints ─────
# WRITE_ALLOWLIST: space-separated CIDRs (e.g. "1.2.3.4/32 5.6.7.0/24").
# Default 127.0.0.1/32 — fail-closed when not configured.
@writes_blocked {
path_regexp ^/mcp-[a-z]+/tools/(place_|cancel_|set_|close_|transfer_|amend_|switch_)
not remote_ip {$WRITE_ALLOWLIST:127.0.0.1/32 ::1/128 172.16.0.0/12}
}
respond @writes_blocked "forbidden: source ip not in allowlist" 403
# ───── Rate limit ─────
# Reads: 60 req/min/IP, writes: 10 req/min/IP (sliding window).
rate_limit {
zone reads {
match {
not path_regexp ^/mcp-[a-z]+/tools/(place_|cancel_|set_|close_|transfer_|amend_|switch_)
}
key {remote_ip}
events 60
window 1m
}
zone writes {
match {
path_regexp ^/mcp-[a-z]+/tools/(place_|cancel_|set_|close_|transfer_|amend_|switch_)
}
key {remote_ip}
events 10
window 1m
}
}
# ───── Reverse proxy ─────
handle_path /mcp-deribit/* {
reverse_proxy mcp-deribit:9011
}
handle_path /mcp-hyperliquid/* {
reverse_proxy mcp-hyperliquid:9012
}
handle_path /mcp-bybit/* {
reverse_proxy mcp-bybit:9019
}
handle_path /mcp-alpaca/* {
reverse_proxy mcp-alpaca:9020
}
+6
@@ -0,0 +1,6 @@
FROM caddy:2.8-builder-alpine AS builder
RUN xcaddy build \
--with github.com/mholt/caddy-ratelimit
FROM caddy:2.8-alpine
COPY --from=builder /usr/bin/caddy /usr/bin/caddy
+36 -2
@@ -11,11 +11,11 @@ members = [
[tool.ruff]
line-length = 100
target-version = "py311"
target-version = "py313"
[tool.ruff.lint]
select = ["E", "F", "I", "W", "UP", "B", "SIM"]
ignore = ["E501"]
ignore = ["E501", "E741"]
[tool.ruff.lint.flake8-bugbear]
extend-immutable-calls = [
@@ -37,10 +37,44 @@ extend-immutable-calls = [
asyncio_mode = "auto"
testpaths = ["services"]
addopts = "--import-mode=importlib"
consider_namespace_packages = true
[tool.mypy]
python_version = "3.13"
strict = false
warn_return_any = true
warn_unused_ignores = true
warn_redundant_casts = true
check_untyped_defs = true
ignore_missing_imports = true
mypy_path = [
"services/common/src",
"services/mcp-alpaca/src",
"services/mcp-bybit/src",
"services/mcp-deribit/src",
"services/mcp-hyperliquid/src",
"services/mcp-macro/src",
"services/mcp-sentiment/src",
]
exclude = [
"^.*tests/.*$",
"^.venv/.*$",
]
[[tool.mypy.overrides]]
module = [
"pybit.*",
"alpaca.*",
"hyperliquid.*",
"pythonjsonlogger.*",
]
ignore_missing_imports = true
[dependency-groups]
dev = [
"pytest>=9.0.3",
"pytest-asyncio>=1.3.0",
"pytest-httpx>=0.36.2",
"mypy>=1.13",
"ruff>=0.5,<0.6",
]
+90
@@ -0,0 +1,90 @@
#!/usr/bin/env bash
# Cerbero_mcp — build & push images to the Gitea registry from the local machine.
#
# Replaces the `build-and-push` CI job from .gitea/workflows/ci.yml.
# Use it after `git push` (or without one, to push a "dirty" build).
# Watchtower on the VPS pulls automatically within WATCHTOWER_POLL_INTERVAL.
#
# Prerequisites:
# - docker + buildx
# - Gitea PAT with `write:package` scope in env $GITEA_PAT
# - $GITEA_USER (default: adriano)
#
# Usage:
# ./scripts/build-push.sh # all images
# ./scripts/build-push.sh base gateway # only specific ones
set -euo pipefail
REGISTRY="${REGISTRY:-git.tielogic.xyz}"
IMAGE_PREFIX="${IMAGE_PREFIX:-$REGISTRY/adriano/cerbero-mcp}"
GITEA_USER="${GITEA_USER:-adriano}"
SHA="$(git rev-parse --short HEAD)"
# Build order: base first (parent of the mcp-* images), then the rest.
ALL_TARGETS=(base gateway mcp-deribit mcp-bybit mcp-hyperliquid mcp-alpaca mcp-macro mcp-sentiment)
TARGETS=("${@:-${ALL_TARGETS[@]}}")
command -v docker >/dev/null || { echo "FATAL: docker non installato"; exit 1; }
docker buildx version >/dev/null || { echo "FATAL: docker buildx non disponibile"; exit 1; }
# Log in only if not already authenticated to the registry. For the first login run:
# echo "<PAT>" | docker login $REGISTRY -u $GITEA_USER --password-stdin
if grep -q "\"$REGISTRY\"" ~/.docker/config.json 2>/dev/null; then
echo "=== docker già loggato su $REGISTRY (skip login) ==="
elif [ -n "${GITEA_PAT:-}" ]; then
echo "=== docker login $REGISTRY ==="
echo "$GITEA_PAT" | docker login "$REGISTRY" -u "$GITEA_USER" --password-stdin
else
echo "FATAL: non autenticato su $REGISTRY e GITEA_PAT non settata."
echo " Esegui una volta: docker login $REGISTRY -u $GITEA_USER"
exit 1
fi
build_one() {
local name="$1"
local context file
case "$name" in
base)
context="."; file="docker/base.Dockerfile" ;;
gateway)
context="./gateway"; file="gateway/Dockerfile" ;;
mcp-*)
context="."; file="docker/${name}.Dockerfile" ;;
*)
echo "FATAL: target sconosciuto '$name'"; exit 1 ;;
esac
if [ ! -f "$file" ]; then
echo "FATAL: Dockerfile non trovato: $file"; exit 1
fi
local tag_latest="$IMAGE_PREFIX/$name:latest"
local tag_sha="$IMAGE_PREFIX/$name:sha-$SHA"
echo "=== [$name] build & push ==="
local args=(buildx build --push
-f "$file"
-t "$tag_latest"
-t "$tag_sha"
)
if [[ "$name" == mcp-* ]]; then
args+=(--build-arg "BASE_IMAGE=$IMAGE_PREFIX/base"
--build-arg "BASE_TAG=latest")
fi
args+=("$context")
docker "${args[@]}"
echo " pushed: $tag_latest"
echo " pushed: $tag_sha"
}
for t in "${TARGETS[@]}"; do
build_one "$t"
done
echo
echo "=== Tutto pushato (commit $SHA) ==="
echo "VPS Watchtower farà pull entro WATCHTOWER_POLL_INTERVAL (default 5min)."
echo "Per forzare subito:"
echo " ssh <vps> 'cd /docker/cerbero_mcp && docker compose -f docker-compose.prod.yml pull && docker compose -f docker-compose.prod.yml up -d'"
+202
@@ -0,0 +1,202 @@
#!/usr/bin/env bash
# Cerbero_mcp — deploy script for the production VPS.
#
# The repo is NOT cloned on the VPS: the script downloads only the files
# strictly needed at runtime (compose, Caddyfile, public assets) via raw
# HTTP from Gitea. Images are pulled pre-built from the Gitea registry
# (built on the dev laptop with scripts/build-push.sh).
#
# Prerequisites on the VPS (NOT managed by this script):
# 1. Docker Engine ≥ 24 + docker compose plugin installed.
# 2. DNS A record `cerbero-mcp.tielogic.xyz` → VPS IP (warn-only).
# 3. Ports 80 and 443 open on the firewall (for ACME + HTTPS traffic).
# 4. Gitea PAT with `read:package` scope, stored in env `$GITEA_PAT`.
# 5. Gitea username in env `$GITEA_USER` (default: adriano).
# 6. Exchange JSON secrets + bearer tokens available in $SECRETS_SRC
# (default: $DEPLOY_DIR/secrets/), which the script copies into
# $DEPLOY_DIR/secrets/ with 600 permissions (skipped if SECRETS_SRC == DEPLOY_DIR/secrets).
#
# Idempotent: re-runnable for updates (re-downloads the config files
# from the current branch, does NOT touch an existing .env).
set -euo pipefail
DEPLOY_DIR="${DEPLOY_DIR:-/docker/cerbero_mcp}"
SECRETS_SRC="${SECRETS_SRC:-$DEPLOY_DIR/secrets}"
GITEA_USER="${GITEA_USER:-adriano}"
GITEA_RAW_BASE="${GITEA_RAW_BASE:-https://git.tielogic.xyz/Adriano/Cerbero-mcp/raw/branch/main}"
REGISTRY="${REGISTRY:-git.tielogic.xyz}"
DOMAIN="${DOMAIN:-cerbero-mcp.tielogic.xyz}"
AUDIT_LOG_DIR="${AUDIT_LOG_DIR:-/var/log/cerbero-mcp}"
echo "=== Cerbero_mcp deploy (no-clone) → $DEPLOY_DIR (domain $DOMAIN) ==="
# ──────────────────────────────────────────────────────────────
# 1. Check prerequisites
# ──────────────────────────────────────────────────────────────
command -v docker >/dev/null || { echo "FATAL: docker non installato"; exit 1; }
command -v curl >/dev/null || { echo "FATAL: curl non installato"; exit 1; }
docker compose version >/dev/null || { echo "FATAL: docker compose plugin assente"; exit 1; }
if [ -z "${GITEA_PAT:-}" ]; then
echo "FATAL: env GITEA_PAT non settata. Export del PAT con scope read:package prima."
exit 1
fi
# Check DNS resolution (warning only, non-blocking)
ip_resolved=$(getent hosts "$DOMAIN" | awk '{print $1}' | head -1 || true)
if [ -z "$ip_resolved" ]; then
echo "WARN: $DOMAIN non risolve via DNS — TLS Let's Encrypt fallirà finché DNS non propaga."
else
echo "DNS $DOMAIN → $ip_resolved"
fi
# ──────────────────────────────────────────────────────────────
# 2. Log in to the container registry
# ──────────────────────────────────────────────────────────────
echo "=== docker login $REGISTRY ==="
echo "$GITEA_PAT" | docker login "$REGISTRY" -u "$GITEA_USER" --password-stdin
# ──────────────────────────────────────────────────────────────
# 3. Set up dirs + download config files from the repo (no clone)
# ──────────────────────────────────────────────────────────────
sudo mkdir -p "$DEPLOY_DIR"
sudo chown "$USER:$USER" "$DEPLOY_DIR"
mkdir -p "$DEPLOY_DIR/secrets" "$DEPLOY_DIR/gateway/public"
# Config files needed at runtime. Downloaded as raw files from Gitea.
# Idempotent: always re-fetches the version on main.
download() {
local rel="$1"
local dst="$DEPLOY_DIR/$rel"
echo " fetch: $rel"
curl -fsSL -o "$dst" "$GITEA_RAW_BASE/$rel" \
|| { echo "FATAL: download $rel fallito"; exit 1; }
}
echo "=== Download config da $GITEA_RAW_BASE ==="
download docker-compose.prod.yml
download docker-compose.traefik.yml
download gateway/Caddyfile
download gateway/public/index.html
download gateway/public/status.js
download gateway/public/style.css
cd "$DEPLOY_DIR"
# ──────────────────────────────────────────────────────────────
# 4. Copy secrets with 600 permissions
# ──────────────────────────────────────────────────────────────
if [ "$(realpath "$SECRETS_SRC")" != "$(realpath "$DEPLOY_DIR/secrets")" ]; then
if [ ! -d "$SECRETS_SRC" ]; then
echo "FATAL: secrets src dir $SECRETS_SRC non esiste."
echo " Atteso contenere: deribit.json bybit.json hyperliquid.json alpaca.json"
echo " macro.json sentiment.json core.token observer.token"
exit 1
fi
echo "=== Copia secrets da $SECRETS_SRC ==="
for f in deribit.json bybit.json hyperliquid.json alpaca.json macro.json sentiment.json core.token observer.token; do
if [ -f "$SECRETS_SRC/$f" ]; then
cp "$SECRETS_SRC/$f" "secrets/$f"
chmod 600 "secrets/$f"
echo " ok: secrets/$f"
else
echo " WARN: $SECRETS_SRC/$f assente — il servizio relativo fallirà al boot."
fi
done
else
echo "=== Secrets già in $DEPLOY_DIR/secrets — solo chmod 600 ==="
for f in deribit.json bybit.json hyperliquid.json alpaca.json macro.json sentiment.json core.token observer.token; do
[ -f "secrets/$f" ] && chmod 600 "secrets/$f" && echo " ok: secrets/$f" \
|| echo " WARN: secrets/$f assente — il servizio relativo fallirà al boot."
done
fi
# ──────────────────────────────────────────────────────────────
# 5. Create/update .env (preserve existing)
# ──────────────────────────────────────────────────────────────
if [ ! -f .env ]; then
echo "=== Creazione .env iniziale (testnet di default) ==="
cat > .env <<EOF
# Cerbero_mcp deploy config — edit to switch to mainnet
ACME_EMAIL=adrianodalpastro@tielogic.com
GATEWAY_HTTP_PORT=80
GATEWAY_HTTPS_PORT=443
WRITE_ALLOWLIST="127.0.0.1/32 ::1/128 172.16.0.0/12"
IMAGE_TAG=latest
IMAGE_PREFIX=git.tielogic.xyz/adriano/cerbero-mcp
# Exchange environment (true=testnet, false=mainnet).
# IMPORTANT: for mainnet, also add "environment":"mainnet" to the matching
# secret JSON, otherwise boot aborts for safety (see consistency_check).
DERIBIT_TESTNET=true
BYBIT_TESTNET=true
HYPERLIQUID_TESTNET=true
ALPACA_PAPER=true
# Set to false to allow mainnet without an explicit creds["environment"]="mainnet" (discouraged).
STRICT_MAINNET=true
# Persistent audit log for write endpoints (place_order, cancel, etc.).
AUDIT_LOG_DIR=$AUDIT_LOG_DIR
# Watchtower auto-update polling interval (sec).
WATCHTOWER_POLL_INTERVAL=300
EOF
echo " $DEPLOY_DIR/.env creato. Rivedi prima del primo up."
else
echo "=== .env preesistente — non sovrascritto ==="
fi
# ──────────────────────────────────────────────────────────────
# 6. Host audit log dir (bind mount)
# ──────────────────────────────────────────────────────────────
sudo mkdir -p "$AUDIT_LOG_DIR"
sudo chown 1000:1000 "$AUDIT_LOG_DIR"
echo "Audit log dir: $AUDIT_LOG_DIR (chown 1000:1000)"
# ──────────────────────────────────────────────────────────────
# 7. Pull images + up
# ──────────────────────────────────────────────────────────────
COMPOSE_FILES=("-f" "docker-compose.prod.yml")
if [ "${BEHIND_TRAEFIK:-false}" = "true" ]; then
echo "=== Modalità behind-traefik attiva (network ${TRAEFIK_NETWORK:-gitea_traefik-public}) ==="
COMPOSE_FILES+=("-f" "docker-compose.traefik.yml")
fi
# Machine-specific local override (e.g. watchtower DOCKER_API_VERSION fix).
# Not versioned (in .gitignore); created by hand on the VPS if needed.
if [ -f "docker-compose.local.yml" ]; then
echo "=== Override locale rilevato: docker-compose.local.yml ==="
COMPOSE_FILES+=("-f" "docker-compose.local.yml")
fi
echo "=== docker compose pull + up ==="
docker compose "${COMPOSE_FILES[@]}" --env-file .env pull
docker compose "${COMPOSE_FILES[@]}" --env-file .env up -d
# ──────────────────────────────────────────────────────────────
# 8. Check status
# ──────────────────────────────────────────────────────────────
sleep 5
echo "=== Stato container ==="
docker compose "${COMPOSE_FILES[@]}" --env-file .env ps
echo
echo "=== Smoke test (health check via gateway pubblico) ==="
sleep 10
if curl -sf -o /dev/null -m 10 "https://$DOMAIN/mcp-macro/health"; then
echo " OK: https://$DOMAIN/mcp-macro/health → 200"
else
echo " WARN: https://$DOMAIN/mcp-macro/health non risponde (DNS o cert non ancora pronti?)"
echo " Riprova fra 30s o controlla: docker compose -f docker-compose.prod.yml logs gateway"
fi
echo
echo "=== Deploy completato ==="
echo "Comandi utili (compose files: ${COMPOSE_FILES[*]}):"
echo " Logs: docker compose ${COMPOSE_FILES[*]} --env-file .env logs -f <service>"
echo " Audit: tail -f $AUDIT_LOG_DIR/*.audit.jsonl"
echo " Restart: docker compose ${COMPOSE_FILES[*]} --env-file .env restart <service>"
echo " Stop: docker compose ${COMPOSE_FILES[*]} --env-file .env down"
echo " Update: ri-esegui questo script (riscarica config + pull image)"
@@ -0,0 +1,95 @@
"""Common app factory for the mcp-{exchange} services.
Centralizes the `__main__.py` boilerplate:
- configure_root_logging (JSON)
- fail_fast_if_missing on mandatory env vars
- summarize env
- load creds JSON
- resolve_environment with default URLs
- load token store
- delegate client + app creation to a per-service callback
- uvicorn.run
Each service invokes `run_exchange_main(spec)` with a declarative spec.
"""
from __future__ import annotations
import json
import os
from collections.abc import Callable
from dataclasses import dataclass
from typing import Any
import uvicorn
from mcp_common.auth import load_token_store_from_files
from mcp_common.env_validation import fail_fast_if_missing, require_env, summarize
from mcp_common.environment import (
EnvironmentInfo,
consistency_check,
resolve_environment,
)
from mcp_common.logging import configure_root_logging
@dataclass(frozen=True)
class ExchangeAppSpec:
exchange: str
creds_env_var: str
env_var: str # e.g. "BYBIT_TESTNET", "ALPACA_PAPER"
flag_key: str # field in the secret JSON ("testnet" or "paper")
default_base_url_live: str
default_base_url_testnet: str
default_port: int
build_client: Callable[[dict, EnvironmentInfo], Any]
build_app: Callable[..., Any]
extra_summarize_envs: tuple[str, ...] = ()
def run_exchange_main(spec: ExchangeAppSpec) -> None:
configure_root_logging()
fail_fast_if_missing([spec.creds_env_var])
summarize([
spec.creds_env_var,
"CORE_TOKEN_FILE",
"OBSERVER_TOKEN_FILE",
"PORT",
"HOST",
spec.env_var,
*spec.extra_summarize_envs,
])
creds_file = require_env(spec.creds_env_var, f"{spec.exchange} credentials JSON path")
with open(creds_file) as f:
creds = json.load(f)
env_info = resolve_environment(
creds,
env_var=spec.env_var,
flag_key=spec.flag_key,
exchange=spec.exchange,
default_base_url_live=spec.default_base_url_live,
default_base_url_testnet=spec.default_base_url_testnet,
)
# Safety: prevents accidental switches to mainnet without explicit
# confirmation in the secret. Raises EnvironmentMismatchError → boot aborts on mismatch.
strict_mainnet = os.environ.get("STRICT_MAINNET", "true").lower() not in ("0", "false", "no")
consistency_check(env_info, creds, strict_mainnet=strict_mainnet)
client = spec.build_client(creds, env_info)
token_store = load_token_store_from_files(
core_token_file=os.environ.get("CORE_TOKEN_FILE"),
observer_token_file=os.environ.get("OBSERVER_TOKEN_FILE"),
)
app = spec.build_app(client=client, token_store=token_store, creds=creds, env_info=env_info)
uvicorn.run(
app,
log_config=None,
host=os.environ.get("HOST", "0.0.0.0"),
port=int(os.environ.get("PORT", str(spec.default_port))),
)
+121
@@ -0,0 +1,121 @@
"""Structured audit log for MCP write endpoints (place_order, cancel,
set_*, close_*, transfer_*). Uses a dedicated `mcp.audit` logger on a
JSON stream.
Sinks:
- stdout/stderr (always): via the root JSON logger configured by
`mcp_common.logging.configure_root_logging`.
- Persistent JSONL file (optional): if the `AUDIT_LOG_FILE` env var is
set, adds a `TimedRotatingFileHandler` that rotates at midnight with
`AUDIT_LOG_BACKUP_DAYS` days of retention (default 30). One JSON line
per record (`.jsonl` format).
For the production VPS: set `AUDIT_LOG_FILE=/var/log/cerbero-mcp/<service>.audit.jsonl`
with a bind mount of `/var/log/cerbero-mcp` in the docker-compose.
Sensitive payload fields (api_key, secret) are already filtered by the
global SecretsFilter; creds are not included here.
"""
from __future__ import annotations
import logging
import os
from logging.handlers import TimedRotatingFileHandler
from typing import Any
from mcp_common.auth import Principal
from mcp_common.logging import SecretsFilter, get_json_logger
try:
from pythonjsonlogger.json import JsonFormatter as _JsonFormatter # noqa: N813
except ImportError:
from pythonjsonlogger.jsonlogger import JsonFormatter as _JsonFormatter # noqa: N813
_logger = get_json_logger("mcp.audit", level=logging.INFO)
_file_handler_attached = False
def _configure_audit_sink() -> None:
"""Add a FileHandler to the mcp.audit logger if AUDIT_LOG_FILE is set.
Idempotent: called the first time by audit_write_op, then a no-op.
"""
global _file_handler_attached
if _file_handler_attached:
return
file_path = os.environ.get("AUDIT_LOG_FILE", "").strip()
if not file_path:
_file_handler_attached = True
return
backup_days = int(os.environ.get("AUDIT_LOG_BACKUP_DAYS", "30"))
os.makedirs(os.path.dirname(file_path) or ".", exist_ok=True)
handler = TimedRotatingFileHandler(
file_path,
when="midnight",
interval=1,
backupCount=backup_days,
encoding="utf-8",
utc=True,
)
handler.setFormatter(_JsonFormatter("%(asctime)s %(name)s %(levelname)s %(message)s"))
handler.addFilter(SecretsFilter())
_logger.addHandler(handler)
_file_handler_attached = True
def audit_write_op(
*,
principal: Principal | None,
action: str,
exchange: str,
target: str | None = None,
payload: dict[str, Any] | None = None,
result: dict[str, Any] | None = None,
error: str | None = None,
) -> None:
"""Emit a structured audit log record for a write operation.
principal: who invoked the call (None if anonymous, though normally _check
prevents getting here without a principal).
action: tool name (e.g. "place_order", "cancel_order").
exchange: service identifier (deribit, bybit, alpaca, hyperliquid).
target: instrument/symbol/order_id being acted on.
payload: non-sensitive input (qty, side, leverage, etc.).
result: client output (order_id, status, etc.).
error: error string if the operation failed.
"""
_configure_audit_sink()
record: dict[str, Any] = {
"audit_event": "write_op",
"action": action,
"exchange": exchange,
"principal": principal.name if principal else None,
"target": target,
"payload": payload or {},
}
if result is not None:
record["result"] = _summarize_result(result)
if error is not None:
record["error"] = error
_logger.error("audit", extra=record)
else:
_logger.info("audit", extra=record)
def _summarize_result(result: dict[str, Any]) -> dict[str, Any]:
"""Extract the relevant fields from the result (order_id, state, error
code) to avoid logging huge payloads.
"""
keys = (
"order_id", "order_link_id", "combo_instrument", "state", "status",
"code", "error", "stop_price", "tp_price", "transfer_id",
)
out: dict[str, Any] = {}
for k in keys:
if k in result:
out[k] = result[k]
if "orders" in result:
out["orders_count"] = len(result["orders"])
return out
@@ -0,0 +1,69 @@
"""Env validation policy: fail-fast for mandatory, soft for optional.
Usage at boot in each mcp `__main__.py`:
from mcp_common.env_validation import require_env, optional_env, summarize
creds_file = require_env("CREDENTIALS_FILE", "deribit credentials JSON path")
host = optional_env("HOST", default="0.0.0.0")
summarize(["CREDENTIALS_FILE", "HOST", "PORT"])
"""
from __future__ import annotations
import logging
import os
import sys
logger = logging.getLogger(__name__)
class MissingEnvError(RuntimeError):
"""Mandatory env var absent or empty."""
def require_env(name: str, description: str = "") -> str:
val = (os.environ.get(name) or "").strip()
if not val:
msg = f"missing mandatory env var: {name}"
if description:
msg += f" ({description})"
logger.error(msg)
raise MissingEnvError(msg)
return val
def optional_env(name: str, *, default: str = "") -> str:
val = (os.environ.get(name) or "").strip()
if not val:
if default:
logger.info("env %s not set, using default=%r", name, default)
return default
return val
def summarize(names: list[str]) -> None:
sensitive_tokens = ("SECRET", "KEY", "TOKEN", "PASSWORD", "CREDENTIAL", "WALLET")
for n in names:
val = os.environ.get(n)
if val is None:
logger.info("env[%s]: <unset>", n)
continue
if any(t in n.upper() for t in sensitive_tokens):
logger.info("env[%s]: <set, %d chars>", n, len(val))
else:
logger.info("env[%s]: %s", n, val)
def fail_fast_if_missing(names: list[str]) -> None:
missing: list[str] = []
for n in names:
if not (os.environ.get(n) or "").strip():
missing.append(n)
if missing:
logger.error("boot aborted: missing mandatory env vars: %s", missing)
print(
f"FATAL: missing mandatory env vars: {missing}",
file=sys.stderr,
)
sys.exit(2)
+80 -1
@@ -1,18 +1,31 @@
"""Environment resolver (testnet/mainnet) for MCP exchanges.
Precedence: env var > secret field > default True (testnet).
Safety: `consistency_check` prevents accidental switches to mainnet
without explicit confirmation in the secret JSON.
"""
from __future__ import annotations
import logging
import os
from dataclasses import dataclass
from typing import Literal
logger = logging.getLogger(__name__)
Environment = Literal["testnet", "mainnet"]
Source = Literal["env", "credentials", "default"]
TRUTHY = {"1", "true", "yes", "on"}
# Tokens in the base_url that indicate a testnet endpoint (case-insensitive).
TESTNET_URL_HINTS = ("test", "testnet", "paper")
class EnvironmentMismatchError(RuntimeError):
"""Boot abort: resolved environment does not match the explicit confirmation in the secret."""
@dataclass(frozen=True)
class EnvironmentInfo:
@@ -29,13 +42,18 @@ def resolve_environment(
env_var: str,
flag_key: str,
exchange: str,
default_base_url_live: str | None = None,
default_base_url_testnet: str | None = None,
) -> EnvironmentInfo:
"""Resolve the environment for an MCP exchange.
creds: dict read from the secret JSON. Must contain base_url_live and base_url_testnet.
creds: dict read from the secret JSON. May contain base_url_live/base_url_testnet
as overrides; when absent the default kwargs are used.
env_var: name of the override env var (e.g. DERIBIT_TESTNET).
flag_key: boolean key in the secret JSON (e.g. "testnet", or "paper" for alpaca).
exchange: exchange name for logging/info.
default_base_url_live / default_base_url_testnet: the exchange's canonical URLs,
applied when not present in creds.
"""
env_value = os.environ.get(env_var)
if env_value is not None:
@@ -49,6 +67,11 @@ def resolve_environment(
environment = "testnet"
source = "default"
if default_base_url_live is not None:
creds.setdefault("base_url_live", default_base_url_live)
if default_base_url_testnet is not None:
creds.setdefault("base_url_testnet", default_base_url_testnet)
base_url = creds["base_url_testnet"] if environment == "testnet" else creds["base_url_live"]
return EnvironmentInfo(
exchange=exchange,
@@ -57,3 +80,59 @@ def resolve_environment(
env_value=env_value,
base_url=base_url,
)
def consistency_check(
env_info: EnvironmentInfo,
creds: dict,
*,
strict_mainnet: bool = True,
) -> list[str]:
"""Check consistency of the resolved environment against the secret JSON.
Returns a list of warning strings. Raises EnvironmentMismatchError for
blocking mismatches.
Rules:
- If `creds["environment"]` is present and DIFFERENT from `env_info.environment`:
→ raise EnvironmentMismatchError (declared vs resolved mismatch).
- If `env_info.environment == "mainnet"` and `creds.get("environment") !=
"mainnet"`: with `strict_mainnet=True` → raise (explicit confirmation
required). With `strict_mainnet=False` → warning.
- If `env_info.base_url` contains a testnet token ("test", "testnet",
"paper") but `env_info.environment == "mainnet"` (or vice versa): warning
(URL/environment inconsistent).
"""
"""
warnings: list[str] = []
declared = creds.get("environment")
if declared and declared != env_info.environment:
raise EnvironmentMismatchError(
f"{env_info.exchange}: secret declared environment={declared!r} "
f"but resolver resolved environment={env_info.environment!r}"
)
if env_info.environment == "mainnet" and declared != "mainnet":
msg = (
f"{env_info.exchange}: resolved mainnet without explicit confirmation "
"in secret. Add `\"environment\": \"mainnet\"` to the credentials JSON."
)
if strict_mainnet:
raise EnvironmentMismatchError(msg)
warnings.append(msg)
url_lower = (env_info.base_url or "").lower()
has_test_hint = any(token in url_lower for token in TESTNET_URL_HINTS)
if env_info.environment == "mainnet" and has_test_hint:
warnings.append(
f"{env_info.exchange}: environment=mainnet but base_url contains "
f"testnet hint ({env_info.base_url!r})"
)
if env_info.environment == "testnet" and not has_test_hint and url_lower:
warnings.append(
f"{env_info.exchange}: environment=testnet but base_url does not "
f"appear to be a testnet endpoint ({env_info.base_url!r})"
)
for w in warnings:
logger.warning("environment consistency: %s", w)
return warnings
+85
@@ -0,0 +1,85 @@
"""HTTP client factory with retry/backoff on transient errors.
A light wrapper around httpx.AsyncClient: it adds an AsyncHTTPTransport
with retries=N to handle connection errors / DNS failures / refused
connections. For retries on 5xx HTTP responses use `call_with_retry()`
(a separate helper).
Standard usage:
async with async_client(timeout=15) as http:
resp = await http.get(url)
Equivalent to httpx.AsyncClient(timeout=15) but with a retry transport
for connection-level errors.
"""
from __future__ import annotations
import asyncio
import logging
from collections.abc import Awaitable, Callable
from typing import Any, TypeVar
import httpx
logger = logging.getLogger(__name__)
T = TypeVar("T")
DEFAULT_RETRIES = 3
DEFAULT_TIMEOUT = 15.0
def async_client(
*,
timeout: float = DEFAULT_TIMEOUT,
retries: int = DEFAULT_RETRIES,
follow_redirects: bool = False,
**kwargs: Any,
) -> httpx.AsyncClient:
"""httpx.AsyncClient with AsyncHTTPTransport(retries=N) by default.
retries covers connection errors / refused / DNS — not 5xx HTTP responses.
"""
transport = httpx.AsyncHTTPTransport(retries=retries)
return httpx.AsyncClient(
timeout=timeout,
transport=transport,
follow_redirects=follow_redirects,
**kwargs,
)
async def call_with_retry(
fn: Callable[[], Awaitable[T]],
*,
max_attempts: int = 3,
base_delay: float = 0.5,
max_delay: float = 8.0,
retry_on: tuple[type[BaseException], ...] = (httpx.TransportError, httpx.TimeoutException),
) -> T:
"""Retry generico async con exponential backoff.
Ritenta `fn()` se solleva una delle exception in `retry_on`. Backoff
raddoppia (0.5, 1, 2, 4, ...) clipped a max_delay. Solleva l'ultima
exception se max_attempts raggiunto.
Usabile su SDK sincroni avvolti in asyncio.to_thread (pybit, alpaca):
result = await call_with_retry(lambda: client._run(self._http.get_tickers, ...))
"""
delay = base_delay
last_exc: BaseException | None = None
for attempt in range(1, max_attempts + 1):
try:
return await fn()
except retry_on as e:
last_exc = e
if attempt == max_attempts:
break
logger.warning(
"transient error, retrying (%d/%d) in %.1fs: %s",
attempt, max_attempts, delay, type(e).__name__,
)
await asyncio.sleep(delay)
delay = min(delay * 2, max_delay)
assert last_exc is not None
raise last_exc
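The backoff schedule in `call_with_retry` doubles from `base_delay` and clips at `max_delay`; a standalone sketch of the resulting delay sequence (helper name hypothetical):

```python
def backoff_delays(base_delay: float = 0.5, max_delay: float = 8.0, attempts: int = 6) -> list[float]:
    # Sleep before retry k is base_delay * 2**k, clipped at max_delay
    return [min(base_delay * 2**k, max_delay) for k in range(attempts)]

print(backoff_delays())  # [0.5, 1.0, 2.0, 4.0, 8.0, 8.0]
```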
@@ -1,5 +1,7 @@
from __future__ import annotations
import math
def sma(values: list[float], period: int) -> float | None:
if len(values) < period:
@@ -137,3 +139,278 @@ def adx(
for i in range(period, len(dxs)):
adx_val = (adx_val * (period - 1) + dxs[i]) / period
return {"adx": adx_val, "+di": pdi, "-di": mdi}
# ───── Returns helper ─────
def _log_returns(closes: list[float]) -> list[float]:
out: list[float] = []
for i in range(1, len(closes)):
prev = closes[i - 1]
curr = closes[i]
if prev > 0 and curr > 0:
out.append(math.log(curr / prev))
return out
def _percentile(sorted_values: list[float], q: float) -> float:
if not sorted_values:
return 0.0
if len(sorted_values) == 1:
return sorted_values[0]
pos = q * (len(sorted_values) - 1)
lo = int(pos)
hi = min(lo + 1, len(sorted_values) - 1)
frac = pos - lo
return sorted_values[lo] + frac * (sorted_values[hi] - sorted_values[lo])
def _stddev(xs: list[float]) -> float:
if len(xs) < 2:
return 0.0
m = sum(xs) / len(xs)
var = sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
return math.sqrt(var)
# ───── vol_cone ─────
def vol_cone(
closes: list[float],
windows: list[int] | None = None,
annualization: int = 252,
) -> dict[int, dict[str, float | None]]:
"""Realized vol cone: per ogni window restituisce vol corrente e percentili
storici (p10/p50/p90) di tutte le rolling windows del campione.
Annualizzata (default 252 trading days).
"""
windows = windows or [10, 20, 30, 60]
rets = _log_returns(closes)
out: dict[int, dict[str, float | None]] = {}
factor = math.sqrt(annualization)
for w in windows:
if len(rets) < w:
out[w] = {"current": None, "p10": None, "p50": None, "p90": None}
continue
rolling: list[float] = []
for i in range(w, len(rets) + 1):
window_rets = rets[i - w:i]
rolling.append(_stddev(window_rets) * factor)
rolling_sorted = sorted(rolling)
out[w] = {
"current": rolling[-1],
"p10": _percentile(rolling_sorted, 0.10),
"p50": _percentile(rolling_sorted, 0.50),
"p90": _percentile(rolling_sorted, 0.90),
}
return out
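The interpolation in `_percentile` is the usual closest-ranks linear scheme; a minimal standalone check (empty/singleton guards omitted):

```python
def percentile(sorted_values: list[float], q: float) -> float:
    # Linear interpolation between closest ranks, as in _percentile above
    pos = q * (len(sorted_values) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(sorted_values) - 1)
    return sorted_values[lo] + (pos - lo) * (sorted_values[hi] - sorted_values[lo])

vals = [1.0, 2.0, 3.0, 4.0, 5.0]
print(percentile(vals, 0.5))    # 3.0
print(percentile(vals, 0.125))  # 1.5 (halfway between the first two values)
```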
# ───── hurst_exponent ─────
def hurst_exponent(closes: list[float], min_lag: int = 2, max_lag: int = 100) -> float | None:
"""Hurst via R/S analysis su log-prices. H≈0.5 random walk, >0.5 trending,
<0.5 mean-reverting.
"""
if len(closes) < max(20, max_lag):
return None
log_p = [math.log(c) for c in closes if c > 0]
if len(log_p) < max(20, max_lag):
return None
upper = min(max_lag, len(log_p) // 2)
if upper < min_lag + 1:
return None
lags = list(range(min_lag, upper))
log_lags: list[float] = []
log_rs: list[float] = []
for lag in lags:
# Build N/lag non-overlapping segments; for each compute R/S
rs_vals: list[float] = []
n_segs = len(log_p) // lag
if n_segs < 1:
continue
for seg in range(n_segs):
chunk = log_p[seg * lag:(seg + 1) * lag]
diffs = [chunk[i] - chunk[i - 1] for i in range(1, len(chunk))]
if len(diffs) < 2:
continue
mean = sum(diffs) / len(diffs)
dev = [d - mean for d in diffs]
cum = []
acc = 0.0
for d in dev:
acc += d
cum.append(acc)
r = max(cum) - min(cum)
s = _stddev(diffs)
if s > 0:
rs_vals.append(r / s)
if rs_vals:
avg_rs = sum(rs_vals) / len(rs_vals)
if avg_rs > 0:
log_lags.append(math.log(lag))
log_rs.append(math.log(avg_rs))
if len(log_lags) < 4:
return None
# Linear regression slope = Hurst
n = len(log_lags)
mx = sum(log_lags) / n
my = sum(log_rs) / n
num = sum((log_lags[i] - mx) * (log_rs[i] - my) for i in range(n))
den = sum((log_lags[i] - mx) ** 2 for i in range(n))
if den == 0:
return None
return num / den
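As a sanity check on the final regression step: if R/S grew exactly as `lag**H`, the log-log slope recovers H:

```python
import math

# If log(R/S) = H * log(lag) exactly, the OLS slope below is H
H_true = 0.7
xs = [math.log(lag) for lag in range(2, 50)]
ys = [H_true * x for x in xs]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
num = sum((xs[i] - mx) * (ys[i] - my) for i in range(n))
den = sum((xs[i] - mx) ** 2 for i in range(n))
hurst = num / den
print(round(hurst, 3))  # 0.7
```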
# ───── half_life_mean_reversion ─────
def half_life_mean_reversion(closes: list[float]) -> float | None:
"""Half-life via OU AR(1) fit: y_t - y_{t-1} = a + b*y_{t-1} + eps.
Half-life = -ln(2)/ln(1+b). Se b>=0 → no mean reversion → None.
"""
if len(closes) < 30:
return None
y_lag = closes[:-1]
delta = [closes[i] - closes[i - 1] for i in range(1, len(closes))]
n = len(y_lag)
mx = sum(y_lag) / n
my = sum(delta) / n
num = sum((y_lag[i] - mx) * (delta[i] - my) for i in range(n))
den = sum((y_lag[i] - mx) ** 2 for i in range(n))
if den == 0:
return None
b = num / den
if b >= 0:
return None
one_plus_b = 1.0 + b
if one_plus_b <= 0:
return None
return -math.log(2.0) / math.log(one_plus_b)
# ───── garch11_forecast ─────
def garch11_forecast(
closes: list[float],
max_iter: int = 50,
) -> dict[str, float] | None:
"""Forecast GARCH(1,1) one-step-ahead sigma via metodo dei momenti
semplificato (no MLE). Pure-Python: stima omega, alpha, beta tramite
iterazione di punto fisso minimizzando MSE sul squared-return tracking.
Sufficiente per ranking volatility regimes; non production-grade.
"""
rets = _log_returns(closes)
if len(rets) < 50:
return None
mean = sum(rets) / len(rets)
centered = [r - mean for r in rets]
sq = [r * r for r in centered]
# Sample variance as long-run mean
var_lr = sum(sq) / len(sq)
if var_lr <= 0:
return None
# Simple grid for (alpha, beta) minimizing MSE of sigma2 vs realized sq
best = (1e18, 0.05, 0.90)
for a in [0.02, 0.05, 0.08, 0.10, 0.15]:
for b in [0.80, 0.85, 0.88, 0.90, 0.93]:
if a + b >= 0.999:
continue
omega = var_lr * (1 - a - b)
if omega <= 0:
continue
sigma2 = var_lr
mse = 0.0
for s in sq[:-1]:
sigma2 = omega + a * s + b * sigma2
mse += (sigma2 - s) ** 2
if mse < best[0]:
best = (mse, a, b)
_, alpha, beta = best
omega = var_lr * (1 - alpha - beta)
sigma2 = var_lr
for s in sq:
sigma2 = omega + alpha * s + beta * sigma2
sigma2_next = omega + alpha * sq[-1] + beta * sigma2
return {
"sigma_next": math.sqrt(max(sigma2_next, 0.0)),
"alpha": alpha,
"beta": beta,
"omega": omega,
"long_run_sigma": math.sqrt(var_lr),
}
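The grid search pins omega to the long-run variance via omega = var_lr * (1 - alpha - beta); a minimal sketch of the variance recursion with hypothetical parameter values:

```python
# GARCH(1,1) recursion: sigma2_{t+1} = omega + alpha * r_t^2 + beta * sigma2_t
alpha, beta = 0.05, 0.90       # hypothetical persistence parameters
long_run_var = 0.0004          # ~2% daily vol, squared
omega = long_run_var * (1 - alpha - beta)

sigma2 = long_run_var          # seed the recursion at the long-run level
for r2 in (0.0009, 0.0001, 0.0004):  # synthetic squared returns
    sigma2 = omega + alpha * r2 + beta * sigma2
print(round(sigma2, 8))  # ≈ 0.00040675
```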
# ───── autocorrelation ─────
def autocorrelation(values: list[float], max_lag: int = 10) -> dict[int, float]:
"""Autocorrelation function (ACF) lag 1..max_lag. White noise → ≈ 0.
AR(1) phi → lag1 ≈ phi, lag-k ≈ phi^k.
"""
if len(values) < max_lag + 2:
return {}
n = len(values)
mean = sum(values) / n
dev = [v - mean for v in values]
var = sum(d * d for d in dev) / n
if var == 0:
return {lag: 0.0 for lag in range(1, max_lag + 1)}
out: dict[int, float] = {}
for lag in range(1, max_lag + 1):
cov = sum(dev[i] * dev[i + lag] for i in range(n - lag)) / n
out[lag] = cov / var
return out
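A quick standalone check of the ACF formula on a strictly alternating series, whose autocorrelation is ≈ -1 at lag 1 and ≈ +1 at lag 2:

```python
def acf(values: list[float], max_lag: int = 3) -> dict[int, float]:
    # Same estimator as autocorrelation() above: cov at lag k over variance
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    var = sum(d * d for d in dev) / n
    return {
        lag: sum(dev[i] * dev[i + lag] for i in range(n - lag)) / n / var
        for lag in range(1, max_lag + 1)
    }

seq = [1.0, -1.0] * 50
res = acf(seq)
print(round(res[1], 2), round(res[2], 2))  # -0.99 0.98
```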
# ───── rolling_sharpe ─────
def rolling_sharpe(
closes: list[float],
window: int = 60,
annualization: int = 252,
risk_free: float = 0.0,
) -> dict[str, float] | None:
"""Sharpe e Sortino rolling sull'ultimo `window` di log-returns.
Annualizzati. risk_free in tasso annualizzato.
"""
rets = _log_returns(closes)
if len(rets) < window:
return None
sample = rets[-window:]
daily_rf = risk_free / annualization
excess = [r - daily_rf for r in sample]
mean = sum(excess) / len(excess)
sd = _stddev(excess)
sharpe = (mean / sd) * math.sqrt(annualization) if sd > 0 else 0.0
downside = [e for e in excess if e < 0]
if downside:
ds_var = sum(d * d for d in downside) / len(excess)
ds_sd = math.sqrt(ds_var)
sortino = (mean / ds_sd) * math.sqrt(annualization) if ds_sd > 0 else 0.0
else:
        sortino = sharpe * 2  # no downside observations → treat sortino as "very good"
return {"sharpe": sharpe, "sortino": sortino, "mean_excess": mean, "stddev": sd}
# ───── var_cvar ─────
def var_cvar(returns: list[float], confidences: list[float] | None = None) -> dict[str, float]:
"""Historical VaR e CVaR (Expected Shortfall) ai livelli di confidenza.
returns: serie di rendimenti (qualsiasi periodicità). VaR/CVaR restituiti
come perdite positive (es. var_95=0.03 → -3% al 95%).
"""
confidences = confidences or [0.95, 0.99]
if len(returns) < 30:
return {}
sorted_rets = sorted(returns)
out: dict[str, float] = {}
for c in confidences:
tag = int(round(c * 100))
q = 1.0 - c
var = -_percentile(sorted_rets, q)
cutoff = -var
tail = [r for r in sorted_rets if r <= cutoff]
cvar = -(sum(tail) / len(tail)) if tail else var
out[f"var_{tag}"] = var
out[f"cvar_{tag}"] = cvar
return out
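The historical VaR/CVaR logic can be exercised standalone on a uniform grid of returns; the helper below mirrors the function above for a single confidence level:

```python
def hist_var_cvar(returns: list[float], confidence: float = 0.95) -> tuple[float, float]:
    # Historical VaR at `confidence`; CVaR = mean loss beyond the VaR cutoff
    sr = sorted(returns)
    pos = (1.0 - confidence) * (len(sr) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(sr) - 1)
    var = -(sr[lo] + (pos - lo) * (sr[hi] - sr[lo]))
    tail = [r for r in sr if r <= -var]
    cvar = -(sum(tail) / len(tail)) if tail else var
    return var, cvar

rets = [i / 1000 for i in range(-50, 50)]  # 100 returns: -5.0% … +4.9%
var95, cvar95 = hist_var_cvar(rets)
print(round(var95, 5), round(cvar95, 5))  # ≈ 0.04505 0.048
```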
+3 -3
View File
@@ -21,6 +21,7 @@ Claude Code config esempio:
"""
from __future__ import annotations
import contextlib
from typing import Any
import httpx
@@ -40,6 +41,7 @@ def _derive_input_schemas(app: FastAPI, tool_names: list[str]) -> dict[str, dict
resolvable are skipped: the caller will use a fallback.
"""
import typing
from pydantic import BaseModel
names_set = set(tool_names)
@@ -62,10 +64,8 @@ def _derive_input_schemas(app: FastAPI, tool_names: list[str]) -> dict[str, dict
if pname == "return":
continue
if isinstance(ann, type) and issubclass(ann, BaseModel):
try:
with contextlib.suppress(Exception):
out[name] = ann.model_json_schema()
except Exception:
pass
break
return out
@@ -0,0 +1,74 @@
"""Microstructure indicators: orderbook imbalance, slope, microprice.
All functions accept bids/asks as list[list[price, qty]] (the standard
exchange ticker format) and return exchange-agnostic aggregate metrics.
"""
from __future__ import annotations
def orderbook_imbalance(
bids: list[list[float]],
asks: list[list[float]],
depth: int = 10,
) -> dict[str, float | None]:
"""Imbalance ratio = (bid_vol - ask_vol) / (bid_vol + ask_vol) sui top-`depth`
livelli. Range [-1, +1]. Positivo = bid pressure, negativo = ask pressure.
Microprice (Stoll-Bertsimas): mid pesato dalla size opposta
→ P_micro = (P_bid * Q_ask + P_ask * Q_bid) / (Q_bid + Q_ask). Best level only.
Slope: variazione cumulata di volume per unità di prezzo (proxy per
liquidità in profondità).
"""
if not bids and not asks:
return {
"imbalance_ratio": None,
"bid_volume": 0.0,
"ask_volume": 0.0,
"microprice": None,
"bid_slope": None,
"ask_slope": None,
}
top_bids = bids[:depth]
top_asks = asks[:depth]
bid_vol = sum(q for _, q in top_bids)
ask_vol = sum(q for _, q in top_asks)
total = bid_vol + ask_vol
ratio = None if total == 0 else (bid_vol - ask_vol) / total
# Microprice: best bid, best ask. Weighted by opposite-side size.
microprice = None
if top_bids and top_asks:
bp, bq = top_bids[0]
ap, aq = top_asks[0]
denom = bq + aq
if denom > 0:
microprice = (bp * aq + ap * bq) / denom
bid_slope = _depth_slope(top_bids, ascending_price=False)
ask_slope = _depth_slope(top_asks, ascending_price=True)
return {
"imbalance_ratio": ratio,
"bid_volume": bid_vol,
"ask_volume": ask_vol,
"microprice": microprice,
"bid_slope": bid_slope,
"ask_slope": ask_slope,
}
def _depth_slope(levels: list[list[float]], ascending_price: bool) -> float | None:
"""Calcola |Δq / Δp| sul primo vs penultimo livello.
Slope alto = liquidità che crolla rapidamente in profondità (book sottile).
"""
if len(levels) < 2:
return None
p_first, q_first = levels[0]
p_last, q_last = levels[-1]
dp = abs(p_last - p_first)
if dp == 0:
return None
return abs(q_first - q_last) / dp
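The imbalance and microprice formulas above can be checked by hand on a tiny synthetic book:

```python
# Synthetic two-level book, best level first: [price, qty]
bids = [[100.0, 5.0], [99.5, 8.0]]
asks = [[100.5, 3.0], [101.0, 9.0]]

bid_vol = sum(q for _, q in bids)                      # 13.0
ask_vol = sum(q for _, q in asks)                      # 12.0
imbalance = (bid_vol - ask_vol) / (bid_vol + ask_vol)  # 0.04 → mild bid pressure

# Microprice: best-level mid weighted by the opposite side's size
(bp, bq), (ap, aq) = bids[0], asks[0]
microprice = (bp * aq + ap * bq) / (bq + aq)
print(microprice)  # 100.3125 — heavier bids pull it above the 100.25 mid
```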
+201
View File
@@ -0,0 +1,201 @@
"""Logiche option-flow exchange-agnostiche.
Ogni funzione accetta una lista di "legs" (dizionari) con i campi rilevanti
e ritorna metriche aggregate. La normalizzazione exchange-specific dei dati
spetta al chiamante (es. mcp-deribit costruisce le legs da chain + ticker).
"""
from __future__ import annotations
from typing import Any
# Dealer gamma convention: dealers are SHORT calls (sold to retail) and LONG
# puts. When spot rises and dealers are short calls, they buy the underlying
# to re-hedge (positive feedback → amplified vol). When spot falls and dealers
# are long puts, their hedge buys the underlying (negative feedback →
# dampening). Net dealer gamma negative → unstable market (squeezes in both
# directions).
def oi_weighted_skew(legs: list[dict[str, Any]]) -> dict[str, float | int | None]:
"""Skew aggregato pesato per OI: IV media puts - IV media calls.
Positivo = puts richer (paura), negativo = calls richer (greed).
"""
call_num = call_den = 0.0
put_num = put_den = 0.0
for leg in legs:
iv = leg.get("iv")
oi = leg.get("oi") or 0
if iv is None or oi <= 0:
continue
if leg.get("option_type") == "call":
call_num += iv * oi
call_den += oi
elif leg.get("option_type") == "put":
put_num += iv * oi
put_den += oi
call_iv = call_num / call_den if call_den > 0 else None
put_iv = put_num / put_den if put_den > 0 else None
skew = (put_iv - call_iv) if (call_iv is not None and put_iv is not None) else None
return {
"skew": skew,
"call_iv_weighted": call_iv,
"put_iv_weighted": put_iv,
"total_oi": int(call_den + put_den),
}
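A worked example of the OI weighting with three hypothetical legs:

```python
legs = [
    {"option_type": "call", "iv": 0.50, "oi": 100},
    {"option_type": "call", "iv": 0.60, "oi": 300},
    {"option_type": "put", "iv": 0.70, "oi": 200},
]
calls = [leg for leg in legs if leg["option_type"] == "call"]
# OI-weighted call IV: (0.50*100 + 0.60*300) / 400 = 0.575
call_iv = sum(leg["iv"] * leg["oi"] for leg in calls) / sum(leg["oi"] for leg in calls)
put_iv = 0.70  # single put leg, so the OI weighting is trivial
skew = put_iv - call_iv
print(round(call_iv, 3), round(skew, 3))  # 0.575 0.125 → puts richer (fear)
```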
def smile_asymmetry(legs: list[dict[str, Any]], spot: float) -> dict[str, float | None]:
"""Smile asymmetry: differenza media IV otm puts vs otm calls a parità
di moneyness. Positivo = put-side richer (skew negativo classico equity).
"""
if spot <= 0 or not legs:
return {"atm_iv": None, "asymmetry": None, "otm_put_iv": None, "otm_call_iv": None}
    # ATM IV: mean IV of strikes within ±1% of spot
    atm_ivs = [
        leg["iv"] for leg in legs
        if leg.get("iv") is not None and abs(leg.get("strike", 0) - spot) / spot < 0.01
    ]
atm_iv = sum(atm_ivs) / len(atm_ivs) if atm_ivs else None
otm_put_ivs = [
leg["iv"] for leg in legs
if leg.get("iv") is not None and leg.get("option_type") == "put" and leg.get("strike", 0) < spot * 0.95
]
otm_call_ivs = [
leg["iv"] for leg in legs
if leg.get("iv") is not None and leg.get("option_type") == "call" and leg.get("strike", 0) > spot * 1.05
]
otm_put = sum(otm_put_ivs) / len(otm_put_ivs) if otm_put_ivs else None
otm_call = sum(otm_call_ivs) / len(otm_call_ivs) if otm_call_ivs else None
asym = (otm_put - otm_call) if (otm_put is not None and otm_call is not None) else None
return {
"atm_iv": atm_iv,
"asymmetry": asym,
"otm_put_iv": otm_put,
"otm_call_iv": otm_call,
}
def atm_vs_wings_vol(legs: list[dict[str, Any]], spot: float) -> dict[str, float | None]:
"""IV ATM vs IV alle ali 25-delta. Wing richness > 0 → smile (kurtosis vol).
"""
if not legs:
return {"atm_iv": None, "wing_25d_call_iv": None, "wing_25d_put_iv": None, "wing_richness": None}
def _closest(target_delta: float, opt_type: str, tol: float = 0.1) -> float | None:
best = None
best_dist = float("inf")
for leg in legs:
d = leg.get("delta")
iv = leg.get("iv")
if d is None or iv is None or leg.get("option_type") != opt_type:
continue
dist = abs(abs(d) - abs(target_delta))
if dist < best_dist:
best_dist = dist
best = iv
return best if best_dist <= tol else None
    # ATM IV: the leg with delta closest to 0.5 (call) or -0.5 (put)
atm_call_iv = _closest(0.5, "call")
atm_put_iv = _closest(-0.5, "put")
atm_ivs = [v for v in (atm_call_iv, atm_put_iv) if v is not None]
atm_iv = sum(atm_ivs) / len(atm_ivs) if atm_ivs else None
wing_call = _closest(0.25, "call")
wing_put = _closest(-0.25, "put")
wing_avg = None
if wing_call is not None and wing_put is not None:
wing_avg = (wing_call + wing_put) / 2
richness = (wing_avg - atm_iv) if (wing_avg is not None and atm_iv is not None) else None
return {
"atm_iv": atm_iv,
"wing_25d_call_iv": wing_call,
"wing_25d_put_iv": wing_put,
"wing_richness": richness,
}
def dealer_gamma_profile(
legs: list[dict[str, Any]],
spot: float,
) -> dict[str, Any]:
"""Net dealer gamma per strike (assume dealer short calls + long puts).
Restituisce per strike: call_dealer_gamma (negativo), put_dealer_gamma
(positivo), net. Aggregato totale + zero-cross strike (gamma flip).
"""
by_strike: dict[float, dict[str, float]] = {}
for leg in legs:
strike = leg.get("strike")
gamma = leg.get("gamma")
oi = leg.get("oi") or 0
if strike is None or gamma is None or oi <= 0 or spot <= 0:
continue
contrib = float(gamma) * oi * (spot ** 2) * 0.01
entry = by_strike.setdefault(
float(strike),
{"strike": float(strike), "call_dealer_gamma": 0.0, "put_dealer_gamma": 0.0},
)
if leg.get("option_type") == "call":
entry["call_dealer_gamma"] -= contrib # dealer short calls
elif leg.get("option_type") == "put":
entry["put_dealer_gamma"] += contrib # dealer long puts
rows: list[dict[str, float]] = []
for s in sorted(by_strike.keys()):
e = by_strike[s]
e["net_dealer_gamma"] = e["call_dealer_gamma"] + e["put_dealer_gamma"]
rows.append(e)
flip_level = None
for a, b in zip(rows, rows[1:], strict=False):
if (a["net_dealer_gamma"] < 0 <= b["net_dealer_gamma"]) or (
a["net_dealer_gamma"] > 0 >= b["net_dealer_gamma"]
):
denom = b["net_dealer_gamma"] - a["net_dealer_gamma"]
if denom != 0:
frac = -a["net_dealer_gamma"] / denom
flip_level = round(a["strike"] + frac * (b["strike"] - a["strike"]), 2)
break
total = sum(r["net_dealer_gamma"] for r in rows)
return {
"by_strike": [
{
"strike": r["strike"],
"call_dealer_gamma": round(r["call_dealer_gamma"], 2),
"put_dealer_gamma": round(r["put_dealer_gamma"], 2),
"net_dealer_gamma": round(r["net_dealer_gamma"], 2),
}
for r in rows
],
"total_net_dealer_gamma": round(total, 2),
"gamma_flip_level": flip_level,
}
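The gamma-flip level is a linear interpolation between the two strikes where net dealer gamma changes sign; with hypothetical numbers:

```python
# Net dealer gamma -10 at 50000 and +30 at 52000 → zero-cross in between
a_strike, a_net = 50000.0, -10.0
b_strike, b_net = 52000.0, 30.0
frac = -a_net / (b_net - a_net)                    # 10 / 40 = 0.25
flip = round(a_strike + frac * (b_strike - a_strike), 2)
print(flip)  # 50500.0
```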
def vanna_charm_aggregate(
legs: list[dict[str, Any]],
spot: float,
) -> dict[str, Any]:
"""Vanna (∂delta/∂IV) e Charm (∂delta/∂t) aggregati pesati per OI.
Vanna positiva → IV up, calls hedge buys; charm negativa → time decay
pushes delta down (calls only).
"""
total_vanna = 0.0
total_charm = 0.0
legs_used = 0
for leg in legs:
vanna = leg.get("vanna")
charm = leg.get("charm")
oi = leg.get("oi") or 0
if vanna is None or charm is None or oi <= 0:
continue
sign = 1 if leg.get("option_type") == "call" else -1
total_vanna += float(vanna) * oi * sign
total_charm += float(charm) * oi * sign
legs_used += 1
return {
"total_vanna": total_vanna,
"total_charm": total_charm,
"legs_analyzed": legs_used,
"spot": spot,
}
+3 -3
View File
@@ -4,10 +4,10 @@ import json
import os
import time
import uuid
from datetime import UTC, datetime
from collections.abc import Callable
from contextlib import AbstractAsyncContextManager
from datetime import UTC, datetime
from typing import Any
from fastapi import FastAPI, HTTPException, Request
from fastapi.exceptions import RequestValidationError
@@ -29,7 +29,7 @@ def _error_envelope(
details: dict | None = None,
request_id: str | None = None,
) -> dict:
env = {
env: dict[str, Any] = {
"error": {
"type": type_,
"code": code,
+96
View File
@@ -0,0 +1,96 @@
"""Test statistici puri (cointegration, ADF, half-life già in indicators).
Nessuna dipendenza esterna: pure-Python.
"""
from __future__ import annotations
import math
def _ols_slope_intercept(xs: list[float], ys: list[float]) -> tuple[float, float] | None:
if len(xs) != len(ys) or len(xs) < 3:
return None
n = len(xs)
mx = sum(xs) / n
my = sum(ys) / n
num = sum((xs[i] - mx) * (ys[i] - my) for i in range(n))
den = sum((xs[i] - mx) ** 2 for i in range(n))
if den == 0:
return None
slope = num / den
intercept = my - slope * mx
return slope, intercept
def _adf_t_stat(series: list[float]) -> float | None:
"""Augmented Dickey-Fuller test stat semplificato (lag=0 → DF):
Δy_t = a + b*y_{t-1} + eps. t-stat di b vs zero.
Più negativo = più stazionario. Approssimazione: critical value ~ -2.86 al 5%.
"""
if len(series) < 30:
return None
y_lag = series[:-1]
delta = [series[i] - series[i - 1] for i in range(1, len(series))]
res = _ols_slope_intercept(y_lag, delta)
if res is None:
return None
b, a = res
n = len(y_lag)
mx = sum(y_lag) / n
den = sum((x - mx) ** 2 for x in y_lag)
if den == 0:
return None
fitted = [a + b * y_lag[i] for i in range(n)]
resid = [delta[i] - fitted[i] for i in range(n)]
rss = sum(r * r for r in resid)
if n - 2 <= 0:
return None
sigma2 = rss / (n - 2)
se_b = math.sqrt(sigma2 / den)
if se_b == 0:
return None
return b / se_b
def cointegration_test(
series_a: list[float],
series_b: list[float],
significance_t: float = -2.86,
) -> dict[str, float | bool | None]:
"""Engle-Granger cointegration:
1. OLS: y_t = alpha + beta * x_t + eps
2. ADF su residui: se t-stat < critical (-2.86 @ 5%) → cointegrate.
"""
if len(series_a) != len(series_b) or len(series_a) < 50:
return {
"cointegrated": None,
"beta": None,
"alpha": None,
"adf_t_stat": None,
"spread_mean": None,
"spread_std": None,
}
res = _ols_slope_intercept(series_b, series_a)
if res is None:
return {
"cointegrated": None,
"beta": None,
"alpha": None,
"adf_t_stat": None,
"spread_mean": None,
"spread_std": None,
}
beta, alpha = res
spread = [series_a[i] - alpha - beta * series_b[i] for i in range(len(series_a))]
t_stat = _adf_t_stat(spread)
cointegrated = (t_stat is not None and t_stat < significance_t)
n = len(spread)
mean = sum(spread) / n
var = sum((s - mean) ** 2 for s in spread) / (n - 1) if n > 1 else 0.0
return {
"cointegrated": cointegrated,
"beta": beta,
"alpha": alpha,
"adf_t_stat": t_stat,
"spread_mean": mean,
"spread_std": math.sqrt(var),
}
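A compact end-to-end sketch of the Engle-Granger recipe (OLS hedge ratio, then a DF t-stat on the spread) on two series sharing a common trend:

```python
import math

def ols(xs: list[float], ys: list[float]) -> tuple[float, float]:
    # OLS slope/intercept, same estimator as _ols_slope_intercept above
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((xs[i] - mx) * (ys[i] - my) for i in range(n))
    den = sum((xs[i] - mx) ** 2 for i in range(n))
    slope = num / den
    return slope, my - slope * mx

def df_t_stat(series: list[float]) -> float:
    # Dickey-Fuller (lag 0): regress Δy_t on y_{t-1}, return the t-stat of b
    y = series[:-1]
    d = [series[i] - series[i - 1] for i in range(1, len(series))]
    b, a = ols(y, d)
    n = len(y)
    mx = sum(y) / n
    den = sum((x - mx) ** 2 for x in y)
    rss = sum((d[i] - (a + b * y[i])) ** 2 for i in range(n))
    return b / math.sqrt(rss / (n - 2) / den)

# y shares x's trend plus a stationary alternating wiggle → cointegrated
x = [float(i) for i in range(100)]
y = [2.0 * x[i] + 1.0 + 0.5 * (-1) ** i for i in range(100)]
beta, alpha = ols(x, y)
spread = [y[i] - alpha - beta * x[i] for i in range(100)]
t = df_t_stat(spread)
# beta ≈ 2; t is far below the -2.86 cutoff → spread is stationary
```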
+137
View File
@@ -0,0 +1,137 @@
from __future__ import annotations
import json
from unittest.mock import MagicMock, patch
import pytest
from mcp_common.app_factory import ExchangeAppSpec, run_exchange_main
from mcp_common.environment import EnvironmentInfo
def _make_spec(build_client=None, build_app=None) -> ExchangeAppSpec:
return ExchangeAppSpec(
exchange="testex",
creds_env_var="TESTEX_CREDENTIALS_FILE",
env_var="TESTEX_TESTNET",
flag_key="testnet",
default_base_url_live="https://api.testex.com",
default_base_url_testnet="https://test.testex.com",
default_port=9999,
build_client=build_client or (lambda creds, env_info: MagicMock(name="client")),
build_app=build_app or (lambda **kwargs: MagicMock(name="app")),
)
def test_run_exchange_main_loads_creds_and_resolves_env(tmp_path, monkeypatch):
creds_file = tmp_path / "creds.json"
creds_file.write_text(json.dumps({"api_key": "k", "api_secret": "s"}))
monkeypatch.setenv("TESTEX_CREDENTIALS_FILE", str(creds_file))
monkeypatch.setenv("PORT", "10000")
monkeypatch.delenv("TESTEX_TESTNET", raising=False)
captured: dict = {}
def build_client(creds, env_info):
captured["creds"] = creds
captured["env_info"] = env_info
return MagicMock()
def build_app(**kwargs):
captured["app_kwargs"] = kwargs
return MagicMock()
spec = _make_spec(build_client=build_client, build_app=build_app)
with patch("mcp_common.app_factory.uvicorn.run") as mock_run:
run_exchange_main(spec)
assert captured["creds"]["api_key"] == "k"
assert captured["creds"]["base_url_live"] == "https://api.testex.com"
assert captured["creds"]["base_url_testnet"] == "https://test.testex.com"
assert isinstance(captured["env_info"], EnvironmentInfo)
assert captured["env_info"].environment == "testnet"
assert captured["env_info"].exchange == "testex"
assert "client" in captured["app_kwargs"]
assert "token_store" in captured["app_kwargs"]
assert "creds" in captured["app_kwargs"]
assert "env_info" in captured["app_kwargs"]
call_kwargs = mock_run.call_args.kwargs
assert call_kwargs["port"] == 10000 # PORT override
def test_run_exchange_main_uses_default_port(tmp_path, monkeypatch):
creds_file = tmp_path / "creds.json"
creds_file.write_text(json.dumps({}))
monkeypatch.setenv("TESTEX_CREDENTIALS_FILE", str(creds_file))
monkeypatch.delenv("PORT", raising=False)
spec = _make_spec()
with patch("mcp_common.app_factory.uvicorn.run") as mock_run:
run_exchange_main(spec)
assert mock_run.call_args.kwargs["port"] == 9999
def test_run_exchange_main_env_var_overrides_creds(tmp_path, monkeypatch):
creds_file = tmp_path / "creds.json"
    # Explicit `environment: mainnet` because the env var override → mainnet,
    # and consistency_check requires confirmation to avoid accidental switches.
creds_file.write_text(json.dumps({"testnet": True, "environment": "mainnet"}))
monkeypatch.setenv("TESTEX_CREDENTIALS_FILE", str(creds_file))
monkeypatch.setenv("TESTEX_TESTNET", "false")
captured: dict = {}
def build_client(creds, env_info):
captured["env_info"] = env_info
return MagicMock()
spec = _make_spec(build_client=build_client)
with patch("mcp_common.app_factory.uvicorn.run"):
run_exchange_main(spec)
# env var "false" overrides creds.testnet=True → mainnet
assert captured["env_info"].environment == "mainnet"
assert captured["env_info"].source == "env"
def test_run_exchange_main_aborts_on_mainnet_without_confirmation(tmp_path, monkeypatch):
"""Mainnet senza creds['environment']='mainnet' → boot abort fail-fast."""
from mcp_common.environment import EnvironmentMismatchError
creds_file = tmp_path / "creds.json"
creds_file.write_text(json.dumps({"testnet": False}))
monkeypatch.setenv("TESTEX_CREDENTIALS_FILE", str(creds_file))
monkeypatch.delenv("TESTEX_TESTNET", raising=False)
monkeypatch.delenv("STRICT_MAINNET", raising=False)
spec = _make_spec()
with (
pytest.raises(EnvironmentMismatchError),
patch("mcp_common.app_factory.uvicorn.run"),
):
run_exchange_main(spec)
def test_run_exchange_main_strict_mainnet_disabled_via_env(tmp_path, monkeypatch):
"""STRICT_MAINNET=false permette mainnet senza conferma (warning soltanto)."""
creds_file = tmp_path / "creds.json"
creds_file.write_text(json.dumps({"testnet": False}))
monkeypatch.setenv("TESTEX_CREDENTIALS_FILE", str(creds_file))
monkeypatch.setenv("STRICT_MAINNET", "false")
spec = _make_spec()
with patch("mcp_common.app_factory.uvicorn.run"):
        run_exchange_main(spec)  # must not raise
def test_run_exchange_main_missing_creds_file_exits(monkeypatch):
monkeypatch.delenv("TESTEX_CREDENTIALS_FILE", raising=False)
spec = _make_spec()
with pytest.raises(SystemExit) as exc_info:
run_exchange_main(spec)
assert exc_info.value.code == 2
+155
View File
@@ -0,0 +1,155 @@
from __future__ import annotations
import logging
import pytest
from mcp_common import audit as audit_mod
from mcp_common.audit import audit_write_op
from mcp_common.auth import Principal
@pytest.fixture
def captured_records(monkeypatch):
"""Cattura i record emessi dal logger mcp.audit (propagate=False blocca caplog).
Sostituisce il logger del modulo con uno che ha caplog attaccato.
"""
records: list[logging.LogRecord] = []
class ListHandler(logging.Handler):
def emit(self, record: logging.LogRecord) -> None:
records.append(record)
test_logger = logging.getLogger("mcp.audit.test")
test_logger.handlers.clear()
test_logger.addHandler(ListHandler())
test_logger.setLevel(logging.DEBUG)
test_logger.propagate = False
monkeypatch.setattr(audit_mod, "_logger", test_logger)
return records
def test_audit_write_op_emits_structured_record(captured_records):
p = Principal("core", {"core"})
audit_write_op(
principal=p,
action="place_order",
exchange="deribit",
target="BTC-PERPETUAL",
payload={"side": "buy", "amount": 10, "leverage": 3},
result={"order_id": "abc", "state": "open"},
)
assert len(captured_records) == 1
rec = captured_records[0]
assert rec.action == "place_order"
assert rec.exchange == "deribit"
assert rec.target == "BTC-PERPETUAL"
assert rec.principal == "core"
assert rec.payload == {"side": "buy", "amount": 10, "leverage": 3}
assert rec.result == {"order_id": "abc", "state": "open"}
def test_audit_write_op_error_uses_error_level(captured_records):
p = Principal("core", {"core"})
audit_write_op(
principal=p,
action="cancel_order",
exchange="bybit",
target="ord-123",
payload={},
error="not_found",
)
assert len(captured_records) == 1
rec = captured_records[0]
assert rec.levelname == "ERROR"
assert rec.error == "not_found"
def test_audit_write_op_summarizes_result_fields(captured_records):
p = Principal("core", {"core"})
big_result = {
"order_id": "ord-1",
"state": "submitted",
"extra_huge_field": "x" * 10000,
"orders": [{"id": 1}, {"id": 2}, {"id": 3}],
}
audit_write_op(
principal=p,
action="place_combo_order",
exchange="bybit",
payload={},
result=big_result,
)
rec = captured_records[0]
assert "extra_huge_field" not in rec.result
assert rec.result["order_id"] == "ord-1"
assert rec.result["orders_count"] == 3
def test_audit_write_op_no_principal(captured_records):
audit_write_op(
principal=None,
action="place_order",
exchange="alpaca",
payload={},
)
rec = captured_records[0]
assert rec.principal is None
def test_audit_write_op_writes_to_file_when_AUDIT_LOG_FILE_set(tmp_path, monkeypatch):
"""Con env AUDIT_LOG_FILE settato, una riga JSON appare nel file."""
import json
audit_file = tmp_path / "audit.jsonl"
monkeypatch.setenv("AUDIT_LOG_FILE", str(audit_file))
    # Reset the idempotency flag so the test re-runs setup
    audit_mod._file_handler_attached = False
    # Remove pre-existing file handlers from the logger (may point at an old file)
    from logging.handlers import TimedRotatingFileHandler
    for h in list(audit_mod._logger.handlers):
        if isinstance(h, TimedRotatingFileHandler):
            audit_mod._logger.removeHandler(h)
audit_write_op(
principal=Principal("core", {"core"}),
action="place_order",
exchange="bybit",
target="BTCUSDT",
payload={"side": "Buy", "qty": 0.01},
result={"order_id": "abc123", "status": "submitted"},
)
    # Force a flush of the file handlers
for h in audit_mod._logger.handlers:
h.flush()
assert audit_file.exists()
content = audit_file.read_text().strip()
assert content, "audit file empty"
record = json.loads(content.splitlines()[-1])
assert record["audit_event"] == "write_op"
assert record["action"] == "place_order"
assert record["exchange"] == "bybit"
assert record["target"] == "BTCUSDT"
assert record["principal"] == "core"
def test_audit_no_file_when_env_unset(tmp_path, monkeypatch):
"""Senza AUDIT_LOG_FILE, nessun file viene creato."""
monkeypatch.delenv("AUDIT_LOG_FILE", raising=False)
audit_mod._file_handler_attached = False
audit_write_op(
principal=Principal("core", {"core"}),
action="cancel_order",
exchange="bybit",
target="ord-1",
payload={},
)
    # No file created in tmp_path
files = list(tmp_path.iterdir())
assert files == []
+105 -4
View File
@@ -1,10 +1,12 @@
from __future__ import annotations
import json
import pytest
from mcp_common.environment import EnvironmentInfo, resolve_environment
from mcp_common.environment import (
EnvironmentInfo,
EnvironmentMismatchError,
consistency_check,
resolve_environment,
)
def test_env_var_overrides_secret(monkeypatch):
@@ -74,6 +76,37 @@ def test_env_value_truthy_parsing(monkeypatch, raw, expected):
assert info.environment == expected
def test_default_base_urls_applied_when_creds_missing(monkeypatch):
monkeypatch.delenv("X_TESTNET", raising=False)
creds: dict = {}
info = resolve_environment(
creds,
env_var="X_TESTNET",
flag_key="testnet",
exchange="x",
default_base_url_live="https://live.example",
default_base_url_testnet="https://test.example",
)
assert info.base_url == "https://test.example"
assert creds["base_url_live"] == "https://live.example"
assert creds["base_url_testnet"] == "https://test.example"
def test_creds_base_urls_override_defaults(monkeypatch):
monkeypatch.delenv("X_TESTNET", raising=False)
creds = {"base_url_live": "L", "base_url_testnet": "T"}
info = resolve_environment(
creds,
env_var="X_TESTNET",
flag_key="testnet",
exchange="x",
default_base_url_live="https://live.example",
default_base_url_testnet="https://test.example",
)
assert info.base_url == "T"
assert creds["base_url_live"] == "L"
def test_alpaca_paper_flag_key(monkeypatch):
"""Alpaca usa 'paper' invece di 'testnet' nel secret."""
monkeypatch.delenv("ALPACA_PAPER", raising=False)
@@ -86,3 +119,71 @@ def test_alpaca_paper_flag_key(monkeypatch):
)
assert info.environment == "mainnet"
assert info.source == "credentials"
# ───────── consistency_check ─────────
def _info(env: str, exchange: str = "deribit") -> EnvironmentInfo:
"""Helper costruisce EnvironmentInfo per test."""
return EnvironmentInfo(
exchange=exchange,
environment=env,
source="env",
env_value="false" if env == "mainnet" else "true",
base_url=f"https://api.{exchange}.com" if env == "mainnet" else f"https://test.{exchange}.com",
)
def test_consistency_check_testnet_no_confirmation_ok():
"""Testnet senza conferma esplicita → ok, ritorna []. Default safe."""
info = _info("testnet")
creds = {"api_key": "k", "api_secret": "s"}
warnings = consistency_check(info, creds)
assert warnings == []
def test_consistency_check_mainnet_no_confirmation_raises():
"""Mainnet senza creds['environment']='mainnet' esplicito → fail-fast."""
info = _info("mainnet")
creds = {"api_key": "k", "api_secret": "s"}
with pytest.raises(EnvironmentMismatchError, match="mainnet.*explicit confirmation"):
consistency_check(info, creds)
def test_consistency_check_mainnet_with_confirmation_ok():
info = _info("mainnet")
creds = {"api_key": "k", "api_secret": "s", "environment": "mainnet"}
warnings = consistency_check(info, creds)
assert warnings == []
def test_consistency_check_explicit_mismatch_raises():
"""Secret dichiara mainnet ma resolver risolve testnet → fail-fast."""
info = _info("testnet")
creds = {"environment": "mainnet"}
with pytest.raises(EnvironmentMismatchError, match="declared.*resolved"):
consistency_check(info, creds)
def test_consistency_check_strict_mainnet_disabled():
"""Con strict_mainnet=False mainnet senza conferma logga warning ma non raise."""
info = _info("mainnet")
creds = {"api_key": "k", "api_secret": "s"}
warnings = consistency_check(info, creds, strict_mainnet=False)
assert any("mainnet" in w for w in warnings)
def test_consistency_check_url_does_not_match_environment_warns():
"""Base URL contiene 'test' ma environment='mainnet' → warning."""
from mcp_common.environment import EnvironmentInfo
info = EnvironmentInfo(
exchange="bybit",
environment="mainnet",
source="env",
env_value="false",
base_url="https://api-testnet.bybit.com", # url DICE testnet ma resolver MAINNET
)
creds = {"environment": "mainnet"}
warnings = consistency_check(info, creds)
assert any("base_url" in w.lower() for w in warnings)
@@ -0,0 +1,72 @@
from __future__ import annotations
import asyncio
import httpx
import pytest
from mcp_common.http import async_client, call_with_retry
def test_async_client_uses_retry_transport():
c = async_client(retries=5)
assert isinstance(c._transport, httpx.AsyncHTTPTransport)
# internal _retries on transport
assert c._transport._pool._retries == 5
@pytest.mark.asyncio
async def test_call_with_retry_succeeds_first_try():
calls = 0
async def fn():
nonlocal calls
calls += 1
return "ok"
result = await call_with_retry(fn)
assert result == "ok"
assert calls == 1
@pytest.mark.asyncio
async def test_call_with_retry_recovers_after_transient(monkeypatch):
monkeypatch.setattr(asyncio, "sleep", _no_sleep)
calls = 0
async def fn():
nonlocal calls
calls += 1
if calls < 3:
raise httpx.ConnectError("boom")
return "ok"
result = await call_with_retry(fn, max_attempts=5, base_delay=0.0)
assert result == "ok"
assert calls == 3
async def _no_sleep(_):
return None
@pytest.mark.asyncio
async def test_call_with_retry_gives_up_after_max():
calls = 0
async def fn():
nonlocal calls
calls += 1
raise httpx.TimeoutException("slow")
with pytest.raises(httpx.TimeoutException):
await call_with_retry(fn, max_attempts=3, base_delay=0.0)
assert calls == 3
@pytest.mark.asyncio
async def test_call_with_retry_does_not_catch_unexpected():
async def fn():
raise ValueError("not transient")
with pytest.raises(ValueError):
await call_with_retry(fn, max_attempts=5, base_delay=0.0)
@@ -1,5 +1,20 @@
from mcp_common.indicators import adx, atr, macd, rsi, sma
import math
from mcp_common.indicators import (
adx,
atr,
autocorrelation,
garch11_forecast,
half_life_mean_reversion,
hurst_exponent,
macd,
rolling_sharpe,
rsi,
sma,
var_cvar,
vol_cone,
)
def test_rsi_simple():
@@ -78,3 +93,168 @@ def test_adx_flat_market():
# no directional movement → ADX near 0
assert a["adx"] is not None
assert a["adx"] < 5.0
# ---------- vol_cone ----------
def _gbm_series(mu: float, sigma: float, n: int, seed: int = 42) -> list[float]:
"""Mock GBM closes: deterministic for tests."""
import random
r = random.Random(seed)
p = [100.0]
for _ in range(n):
z = r.gauss(0.0, 1.0)
p.append(p[-1] * math.exp(mu / 252 + sigma / math.sqrt(252) * z))
return p
def test_vol_cone_returns_percentiles_per_window():
closes = _gbm_series(mu=0.0, sigma=0.5, n=400)
out = vol_cone(closes, windows=[10, 30, 60])
assert set(out.keys()) == {10, 30, 60}
for _w, stats in out.items():
assert "current" in stats
assert "p10" in stats and "p50" in stats and "p90" in stats
assert stats["p10"] <= stats["p50"] <= stats["p90"]
# annualized — sensible range for sigma=0.5
assert 0.1 < stats["p50"] < 1.5
def test_vol_cone_insufficient_data():
out = vol_cone([100.0, 101.0], windows=[10, 30])
assert out[10]["current"] is None
assert out[30]["current"] is None
# ---------- hurst_exponent ----------
def test_hurst_random_walk_near_half():
closes = _gbm_series(mu=0.0, sigma=0.3, n=500, seed=7)
h = hurst_exponent(closes)
assert h is not None
# Random walk → Hurst ≈ 0.5; the positive R/S bias on finite samples is well known.
# Wide bound: still distinguishes a random walk from strong trending (>0.85).
assert 0.35 < h < 0.85
def test_hurst_persistent_trend():
# Strong monotonic trend → H >> 0.5
closes = [100.0 + i * 0.5 + math.sin(i / 10) * 0.1 for i in range(400)]
h = hurst_exponent(closes)
assert h is not None
assert h > 0.85
def test_hurst_insufficient_data():
assert hurst_exponent([1.0, 2.0, 3.0]) is None
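One way to get a Hurst estimate with the behavior these tests expect is the scaling of lagged log-price differences (the real implementation may instead use R/S analysis, which the comments above mention). The function below is an illustrative sketch, not the library code:

```python
import math

def hurst_exponent(closes, lags=range(2, 20)):
    # Sketch: the rms of (log p[t+lag] - log p[t]) scales like lag**H.
    # Fit H as the log-log slope by least squares. Returns None on short input.
    x = [math.log(c) for c in closes]
    if len(x) < max(lags) + 2:
        return None
    pts = []
    for lag in lags:
        diffs = [x[i + lag] - x[i] for i in range(len(x) - lag)]
        rms = math.sqrt(sum(d * d for d in diffs) / len(diffs))
        if rms > 0:
            pts.append((math.log(lag), math.log(rms)))
    if len(pts) < 2:
        return None
    mx = sum(p for p, _ in pts) / len(pts)
    my = sum(q for _, q in pts) / len(pts)
    den = sum((p - mx) ** 2 for p, _ in pts)
    return sum((p - mx) * (q - my) for p, q in pts) / den if den else None
```

A diffusive random walk gives slope ≈ 0.5; a near-deterministic trend gives slope ≈ 1, which is why the trending test asserts `h > 0.85`.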
# ---------- half_life_mean_reversion ----------
def test_half_life_mean_reverting_series():
"""OU process with theta=0.1 → half-life ≈ ln(2)/0.1 ≈ 6.93."""
import random
r = random.Random(123)
theta = 0.1
mu = 100.0
sigma = 0.5
s = [mu]
for _ in range(500):
s.append(s[-1] + theta * (mu - s[-1]) + sigma * r.gauss(0, 1))
hl = half_life_mean_reversion(s)
assert hl is not None
# broad tolerance — finite-sample noise
assert 3.0 < hl < 20.0
def test_half_life_trending_returns_none():
closes = [100.0 + i for i in range(200)]
hl = half_life_mean_reversion(closes)
# No mean reversion → returns None or +inf
assert hl is None or hl > 1000
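The half-life these tests target follows from an AR(1)/Ornstein-Uhlenbeck fit: regress the one-step change on the lagged level and take half-life = -ln(2)/b for slope b < 0. A self-contained sketch (the real helper may differ in details such as how it signals "no reversion"):

```python
import math

def half_life_mean_reversion(closes):
    # OLS of delta_x on the lagged level x: delta_x = a + b*x + noise.
    # For an OU step x[t+1] = x[t] + theta*(mu - x[t]) + e, b = -theta,
    # so half-life = ln(2)/theta = -ln(2)/b. None when b >= 0 (no reversion).
    if len(closes) < 3:
        return None
    x = closes[:-1]
    dx = [b - a for a, b in zip(closes[:-1], closes[1:])]
    mx = sum(x) / len(x)
    md = sum(dx) / len(dx)
    cov = sum((xi - mx) * (di - md) for xi, di in zip(x, dx))
    var = sum((xi - mx) ** 2 for xi in x)
    if var == 0:
        return None
    b = cov / var
    if b >= 0:
        return None  # trending or unit root: no pull toward the mean
    return -math.log(2.0) / b
```

With theta = 0.1 the theoretical half-life is ln(2)/0.1 ≈ 6.93, inside the (3, 20) tolerance the OU test uses; a pure linear trend has a constant delta, zero covariance, and therefore returns None.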
# ---------- garch11_forecast ----------
def test_garch11_forecast_returns_positive_sigma():
closes = _gbm_series(mu=0.0, sigma=0.4, n=500, seed=11)
out = garch11_forecast(closes)
assert out is not None
assert out["sigma_next"] > 0
assert 0 < out["alpha"] < 1
assert 0 < out["beta"] < 1
assert out["alpha"] + out["beta"] < 1.0 # stationarity
def test_garch11_insufficient_data():
assert garch11_forecast([100.0, 101.0]) is None
# ---------- autocorrelation ----------
def test_autocorrelation_white_noise_low():
import random
r = random.Random(1)
rets = [r.gauss(0, 0.01) for _ in range(500)]
out = autocorrelation(rets, max_lag=5)
assert len(out) == 5
# white noise → all autocorr ≈ 0 (within ±2/sqrt(N))
bound = 2.0 / math.sqrt(len(rets))
for _lag, val in out.items():
assert abs(val) < bound * 2 # generous
def test_autocorrelation_lag1_strong_for_ar1():
"""AR(1) with phi=0.7 → autocorr lag-1 ≈ 0.7."""
import random
r = random.Random(2)
s = [0.0]
for _ in range(500):
s.append(0.7 * s[-1] + r.gauss(0, 0.1))
out = autocorrelation(s, max_lag=3)
assert out[1] > 0.5
assert out[2] > 0.2 # geometric decay
def test_autocorrelation_insufficient_data():
assert autocorrelation([1.0], max_lag=5) == {}
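The sample autocorrelation these tests exercise is the lagged covariance normalized by the lag-0 variance. A minimal sketch (guard thresholds are assumptions):

```python
def autocorrelation(series, max_lag: int = 5) -> dict[int, float]:
    # Sample ACF at lags 1..max_lag, normalized by the full-sample variance.
    # Returns {} when there is too little data or zero variance.
    n = len(series)
    if n <= max_lag + 1:
        return {}
    mean = sum(series) / n
    dev = [v - mean for v in series]
    var = sum(d * d for d in dev)
    if var == 0:
        return {}
    return {
        lag: sum(dev[i] * dev[i + lag] for i in range(n - lag)) / var
        for lag in range(1, max_lag + 1)
    }
```

For an AR(1) with phi = 0.7 the theoretical ACF is 0.7 at lag 1 and 0.49 at lag 2, matching the geometric-decay assertions above.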
# ---------- rolling_sharpe ----------
def test_rolling_sharpe_positive_for_uptrend():
closes = [100.0 * (1 + 0.001 * i) for i in range(252)]
s = rolling_sharpe(closes, window=60)
assert s is not None
assert s["sharpe"] > 0
assert s["sortino"] >= s["sharpe"] / 2 # sortino can be high if no downside
def test_rolling_sharpe_zero_volatility():
closes = [100.0] * 100
s = rolling_sharpe(closes, window=60)
assert s is not None
assert s["sharpe"] == 0.0 # no variance → 0 by convention
def test_rolling_sharpe_insufficient_data():
assert rolling_sharpe([100.0, 101.0], window=60) is None
# ---------- var_cvar ----------
def test_var_cvar_basic():
import random
r = random.Random(3)
rets = [r.gauss(0.0005, 0.02) for _ in range(1000)]
out = var_cvar(rets, confidences=[0.95, 0.99])
assert "var_95" in out and "cvar_95" in out
assert "var_99" in out and "cvar_99" in out
# VaR is loss → positive number representing percentile loss
assert out["var_95"] > 0
assert out["cvar_95"] >= out["var_95"] # CVaR worse than VaR
assert out["var_99"] >= out["var_95"]
def test_var_cvar_insufficient_data():
assert var_cvar([0.01], confidences=[0.95]) == {}
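A historical-simulation sketch consistent with these assertions; the real `var_cvar` may differ in percentile interpolation and the minimum-sample guard (both assumptions here):

```python
def var_cvar(returns, confidences=(0.95, 0.99)) -> dict:
    # Historical VaR/CVaR on the empirical loss distribution.
    # VaR is reported as a positive loss; CVaR is the mean loss beyond VaR.
    if len(returns) < 50:  # assumed minimum-sample guard
        return {}
    losses = sorted(-r for r in returns)  # positive = loss, ascending
    out: dict = {}
    for c in confidences:
        k = int(round(c * 100))
        idx = min(int(c * len(losses)), len(losses) - 1)
        tail = losses[idx:]  # losses at or beyond the c-quantile
        out[f"var_{k}"] = losses[idx]
        out[f"cvar_{k}"] = sum(tail) / len(tail)
    return out
```

By construction the tail mean is at least the tail boundary, so `cvar_95 >= var_95` holds, and a sorted loss array makes `var_99 >= var_95` automatic.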
@@ -0,0 +1,59 @@
from __future__ import annotations
from mcp_common.microstructure import orderbook_imbalance
def test_orderbook_imbalance_balanced():
bids = [[100.0, 1.0], [99.5, 1.0], [99.0, 1.0]]
asks = [[100.5, 1.0], [101.0, 1.0], [101.5, 1.0]]
out = orderbook_imbalance(bids, asks, depth=3)
assert abs(out["imbalance_ratio"]) < 0.01 # bilanciato
assert out["bid_volume"] == 3.0
assert out["ask_volume"] == 3.0
assert out["microprice"] is not None
def test_orderbook_imbalance_bid_heavy():
bids = [[100.0, 5.0], [99.5, 5.0]]
asks = [[100.5, 1.0], [101.0, 1.0]]
out = orderbook_imbalance(bids, asks, depth=2)
assert out["imbalance_ratio"] > 0.5 # forte bid pressure
assert out["bid_volume"] == 10.0
assert out["ask_volume"] == 2.0
def test_orderbook_imbalance_ask_heavy():
bids = [[100.0, 1.0], [99.5, 1.0]]
asks = [[100.5, 5.0], [101.0, 5.0]]
out = orderbook_imbalance(bids, asks, depth=2)
assert out["imbalance_ratio"] < -0.5
def test_orderbook_imbalance_microprice_skew():
"""Microprice è weighted mid: pesato bid/ask depth opposto."""
bids = [[100.0, 9.0]]
asks = [[101.0, 1.0]]
out = orderbook_imbalance(bids, asks, depth=1)
# large bid → microprice closer to ask (paradox: weighted by *opposite* size)
assert out["microprice"] > 100.5
def test_orderbook_imbalance_empty():
out = orderbook_imbalance([], [], depth=5)
assert out["imbalance_ratio"] is None
assert out["microprice"] is None
def test_orderbook_imbalance_one_sided():
out = orderbook_imbalance([[100.0, 1.0]], [], depth=1)
assert out["imbalance_ratio"] == 1.0 # all bid
def test_orderbook_imbalance_slope():
"""Slope = velocity of liquidity dropoff: ripido = poca liquidità in profondità."""
bids_steep = [[100.0, 10.0], [99.0, 1.0]] # depth crolla → slope alto
asks_steep = [[101.0, 10.0], [102.0, 1.0]]
out = orderbook_imbalance(bids_steep, asks_steep, depth=2)
assert out["bid_slope"] is not None
# bid liquidity drops by 9 per 1 price unit → slope ~9
assert out["bid_slope"] > 5.0
@@ -0,0 +1,144 @@
"""Test puri per mcp_common.options (logiche option-flow indipendenti
dall'exchange).
"""
from __future__ import annotations
import pytest
from mcp_common.options import (
atm_vs_wings_vol,
dealer_gamma_profile,
oi_weighted_skew,
smile_asymmetry,
vanna_charm_aggregate,
)
# ---------- oi_weighted_skew ----------
def test_oi_weighted_skew_balanced():
"""OI distribuito 50/50 calls/puts → skew vicino a 0."""
legs = [
{"iv": 0.5, "delta": 0.5, "oi": 100, "option_type": "call"},
{"iv": 0.5, "delta": -0.5, "oi": 100, "option_type": "put"},
]
out = oi_weighted_skew(legs)
assert abs(out["skew"]) < 0.01
def test_oi_weighted_skew_put_heavy():
"""Put heavy → IV media puts > IV media calls → skew positivo (put > call)."""
legs = [
{"iv": 0.4, "delta": 0.5, "oi": 50, "option_type": "call"},
{"iv": 0.7, "delta": -0.5, "oi": 500, "option_type": "put"},
]
out = oi_weighted_skew(legs)
assert out["skew"] > 0
assert out["call_iv_weighted"] > 0
assert out["put_iv_weighted"] > out["call_iv_weighted"]
def test_oi_weighted_skew_empty():
out = oi_weighted_skew([])
assert out == {"skew": None, "call_iv_weighted": None, "put_iv_weighted": None, "total_oi": 0}
# ---------- smile_asymmetry ----------
def test_smile_asymmetry_symmetric():
"""Smile simmetrico ATM → asymmetry ≈ 0."""
legs = [
{"strike": 80, "iv": 0.55, "option_type": "put"},
{"strike": 90, "iv": 0.50, "option_type": "put"},
{"strike": 100, "iv": 0.45, "option_type": "call"},
{"strike": 110, "iv": 0.50, "option_type": "call"},
{"strike": 120, "iv": 0.55, "option_type": "call"},
]
out = smile_asymmetry(legs, spot=100.0)
assert out["atm_iv"] is not None
assert abs(out["asymmetry"]) < 0.05
def test_smile_asymmetry_put_skew():
"""OTM puts (low strike) IV >> OTM calls (high strike) IV → asymmetry > 0."""
legs = [
{"strike": 80, "iv": 0.80, "option_type": "put"},
{"strike": 100, "iv": 0.50, "option_type": "call"},
{"strike": 120, "iv": 0.45, "option_type": "call"},
]
out = smile_asymmetry(legs, spot=100.0)
assert out["asymmetry"] > 0.1
def test_smile_asymmetry_no_atm():
legs = [{"strike": 200, "iv": 0.5, "option_type": "call"}]
out = smile_asymmetry(legs, spot=100.0)
assert out["atm_iv"] is None
# ---------- atm_vs_wings_vol ----------
def test_atm_vs_wings_vol_basic():
legs = [
{"strike": 90, "iv": 0.55, "delta": -0.25, "option_type": "put"},
{"strike": 100, "iv": 0.45, "delta": 0.5, "option_type": "call"},
{"strike": 110, "iv": 0.50, "delta": 0.25, "option_type": "call"},
]
out = atm_vs_wings_vol(legs, spot=100.0)
assert out["atm_iv"] == pytest.approx(0.45, rel=1e-3)
assert out["wing_25d_call_iv"] == pytest.approx(0.50, rel=1e-3)
assert out["wing_25d_put_iv"] == pytest.approx(0.55, rel=1e-3)
# ATM < wings → positive wing richness
assert out["wing_richness"] > 0
def test_atm_vs_wings_vol_no_data():
out = atm_vs_wings_vol([], spot=100.0)
assert out["atm_iv"] is None
# ---------- dealer_gamma_profile ----------
def test_dealer_gamma_profile_assumes_dealer_short_calls():
"""Convention: dealer SHORT calls (sells calls to retail), LONG puts.
Calls oi → negative dealer gamma, puts oi → positive dealer gamma.
"""
legs = [
{"strike": 100, "gamma": 0.01, "oi": 1000, "option_type": "call"},
{"strike": 100, "gamma": 0.01, "oi": 500, "option_type": "put"},
]
out = dealer_gamma_profile(legs, spot=100.0)
# call gamma greater than put gamma at same strike → net dealer short gamma
assert len(out["by_strike"]) == 1
row = out["by_strike"][0]
assert row["call_dealer_gamma"] < 0
assert row["put_dealer_gamma"] > 0
assert row["net_dealer_gamma"] < 0 # calls dominate
assert out["total_net_dealer_gamma"] < 0
def test_dealer_gamma_profile_empty():
out = dealer_gamma_profile([], spot=100.0)
assert out["by_strike"] == []
assert out["total_net_dealer_gamma"] == 0.0
# ---------- vanna_charm_aggregate ----------
def test_vanna_charm_aggregate_basic():
legs = [
{"strike": 100, "vanna": 0.05, "charm": -0.001, "oi": 1000, "option_type": "call"},
{"strike": 100, "vanna": -0.05, "charm": 0.001, "oi": 500, "option_type": "put"},
]
out = vanna_charm_aggregate(legs, spot=100.0)
assert out["total_vanna"] != 0 # some net exposure
assert "total_charm" in out
assert out["legs_analyzed"] == 2
def test_vanna_charm_aggregate_skip_missing_greeks():
legs = [
{"strike": 100, "vanna": None, "charm": -0.001, "oi": 1000, "option_type": "call"},
{"strike": 100, "vanna": 0.05, "charm": None, "oi": 500, "option_type": "put"},
]
out = vanna_charm_aggregate(legs, spot=100.0)
# both legs have at least one None greek → skipped
assert out["legs_analyzed"] == 0
@@ -0,0 +1,51 @@
from __future__ import annotations
import random
from mcp_common.stats import cointegration_test
def test_cointegrated_synthetic_pair():
"""Costruisco coppia cointegrata: B random walk, A = 2*B + noise stazionario."""
r = random.Random(1)
b = [100.0]
for _ in range(300):
b.append(b[-1] + r.gauss(0, 1))
a = [2 * b[i] + r.gauss(0, 0.5) for i in range(len(b))]
out = cointegration_test(a, b)
assert out["cointegrated"] is True
assert out["beta"] == pytest_approx(2.0, rel=0.05)
assert out["adf_t_stat"] is not None
assert out["adf_t_stat"] < -2.86
def test_not_cointegrated_independent_walks():
"""Due random walk indipendenti → spread non stazionario → no cointegration."""
r = random.Random(2)
a = [100.0]
b = [100.0]
for _ in range(300):
a.append(a[-1] + r.gauss(0, 1))
b.append(b[-1] + r.gauss(0, 1))
out = cointegration_test(a, b)
# For two independent RWs the ADF t-stat is usually > -2.86 → not cointegrated
assert out["cointegrated"] is False or out["adf_t_stat"] > -3.0
def test_cointegration_short_series():
out = cointegration_test([1.0, 2.0], [3.0, 4.0])
assert out["cointegrated"] is None
assert out["beta"] is None
def test_cointegration_mismatched_length():
out = cointegration_test([1.0, 2.0, 3.0], [1.0, 2.0])
assert out["cointegrated"] is None
def pytest_approx(value, rel):
"""Tiny helper to avoid importing pytest just for approx."""
class _Approx:
def __eq__(self, other):
return abs(other - value) <= abs(value) * rel
return _Approx()
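The first Engle-Granger step that the beta assertion relies on, the OLS hedge ratio, is sketched below; the second step (an ADF test on the residual spread, giving the `adf_t_stat` above) is omitted:

```python
def ols_beta(a, b) -> float:
    # OLS slope of a regressed on b: beta = cov(b, a) / var(b).
    # For the synthetic pair a = 2*b + noise, this recovers beta ≈ 2.
    mb = sum(b) / len(b)
    ma = sum(a) / len(a)
    cov = sum((x - mb) * (y - ma) for x, y in zip(b, a))
    var = sum((x - mb) ** 2 for x in b)
    return cov / var
```

The estimate is tight here because the random-walk regressor has a much larger spread than the stationary noise, which is exactly why the test can assert a 5% relative tolerance on beta.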
@@ -1,44 +1,29 @@
from __future__ import annotations
import json
import os
import uvicorn
from mcp_common.auth import load_token_store_from_files
from mcp_common.logging import configure_root_logging
from mcp_common.app_factory import ExchangeAppSpec, run_exchange_main
from mcp_alpaca.client import AlpacaClient
from mcp_alpaca.server import create_app
configure_root_logging()
SPEC = ExchangeAppSpec(
exchange="alpaca",
creds_env_var="ALPACA_CREDENTIALS_FILE",
env_var="ALPACA_PAPER",
flag_key="paper",
default_base_url_live="https://api.alpaca.markets",
default_base_url_testnet="https://paper-api.alpaca.markets",
default_port=9020,
build_client=lambda creds, env_info: AlpacaClient(
api_key=creds["api_key_id"],
secret_key=creds["secret_key"],
paper=(env_info.environment == "testnet"),
),
build_app=create_app,
)
def main():
creds_file = os.environ["ALPACA_CREDENTIALS_FILE"]
with open(creds_file) as f:
creds = json.load(f)
paper_env = os.environ.get("ALPACA_PAPER", "true").lower()
paper = paper_env not in ("0", "false", "no")
client = AlpacaClient(
api_key=creds["api_key_id"],
secret_key=creds["secret_key"],
paper=paper,
)
token_store = load_token_store_from_files(
core_token_file=os.environ.get("CORE_TOKEN_FILE"),
observer_token_file=os.environ.get("OBSERVER_TOKEN_FILE"),
)
app = create_app(client=client, token_store=token_store)
uvicorn.run(
app,
log_config=None,
host=os.environ.get("HOST", "0.0.0.0"),
port=int(os.environ.get("PORT", "9020")),
)
run_exchange_main(SPEC)
if __name__ == "__main__":
@@ -26,8 +26,6 @@ from alpaca.trading.client import TradingClient
from alpaca.trading.enums import (
AssetClass,
OrderSide,
OrderStatus,
OrderType,
QueryOrderStatus,
TimeInForce,
)
@@ -41,7 +39,6 @@ from alpaca.trading.requests import (
StopOrderRequest,
)
_TF_MAP = {
"1min": TimeFrame(1, TimeFrameUnit.Minute),
"5min": TimeFrame(5, TimeFrameUnit.Minute),
@@ -74,13 +71,13 @@ def _asset_class_enum(ac: str) -> AssetClass:
def _serialize(obj: Any) -> Any:
"""Recursively convert pydantic/datetime objects → json-safe."""
if obj is None or isinstance(obj, (str, int, float, bool)):
if obj is None or isinstance(obj, str | int | float | bool):
return obj
if isinstance(obj, (_dt.datetime, _dt.date)):
if isinstance(obj, _dt.datetime | _dt.date):
return obj.isoformat()
if isinstance(obj, dict):
return {k: _serialize(v) for k, v in obj.items()}
if isinstance(obj, (list, tuple)):
if isinstance(obj, list | tuple):
return [_serialize(v) for v in obj]
if hasattr(obj, "model_dump"):
return _serialize(obj.model_dump())
@@ -0,0 +1,56 @@
"""Server-side leverage cap for place_order.
The cap is read from the secret JSON via the `max_leverage` field. Defaults to 1 (cash) when absent.
"""
from __future__ import annotations
from fastapi import HTTPException
def get_max_leverage(creds: dict) -> int:
"""Legge max_leverage dal secret. Default 1 se mancante."""
raw = creds.get("max_leverage", 1)
try:
value = int(raw)
except (TypeError, ValueError):
value = 1
return max(1, value)
def enforce_leverage(
requested: int | float | None,
*,
creds: dict,
exchange: str,
) -> int:
"""Verifica e applica leverage cap. Ritorna leverage applicabile.
Solleva HTTPException(403, LEVERAGE_CAP_EXCEEDED) se requested > cap.
Se requested is None, applica il cap come default.
"""
cap = get_max_leverage(creds)
if requested is None:
return cap
lev = int(requested)
if lev < 1:
raise HTTPException(
status_code=403,
detail={
"error": "LEVERAGE_CAP_EXCEEDED",
"exchange": exchange,
"requested": lev,
"max": cap,
"reason": "leverage must be >= 1",
},
)
if lev > cap:
raise HTTPException(
status_code=403,
detail={
"error": "LEVERAGE_CAP_EXCEEDED",
"exchange": exchange,
"requested": lev,
"max": cap,
},
)
return lev
@@ -3,13 +3,15 @@ from __future__ import annotations
import os
from fastapi import Depends, HTTPException
from mcp_common.audit import audit_write_op
from mcp_common.auth import Principal, TokenStore, require_principal
from mcp_common.environment import EnvironmentInfo
from mcp_common.mcp_bridge import mount_mcp_endpoint
from mcp_common.server import build_app
from pydantic import BaseModel
from mcp_alpaca.client import AlpacaClient
from mcp_alpaca.leverage_cap import get_max_leverage
# --- Body models: reads ---
@@ -118,11 +120,39 @@ def _check(principal: Principal, *, core: bool = False, observer: bool = False)
raise HTTPException(status_code=403, detail="forbidden")
def create_app(*, client: AlpacaClient, token_store: TokenStore):
def create_app(
*,
client: AlpacaClient,
token_store: TokenStore,
creds: dict | None = None,
env_info: EnvironmentInfo | None = None,
):
creds = creds or {}
app = build_app(name="mcp-alpaca", version="0.1.0", token_store=token_store)
# ── Reads ──────────────────────────────────────────────
@app.post("/tools/environment_info", tags=["reads"])
async def t_environment_info(principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
if env_info is None:
return {
"exchange": "alpaca",
"environment": "testnet" if getattr(client, "paper", True) else "mainnet",
"source": "credentials",
"env_value": None,
"base_url": getattr(client, "base_url", None),
"max_leverage": get_max_leverage(creds),
}
return {
"exchange": env_info.exchange,
"environment": env_info.environment,
"source": env_info.source,
"env_value": env_info.env_value,
"base_url": env_info.base_url,
"max_leverage": get_max_leverage(creds),
}
@app.post("/tools/get_account", tags=["reads"])
async def t_get_account(body: AccountReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
@@ -185,37 +215,77 @@ def create_app(*, client: AlpacaClient, token_store: TokenStore):
@app.post("/tools/place_order", tags=["writes"])
async def t_place_order(body: PlaceOrderReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
return await client.place_order(
result = await client.place_order(
body.symbol, body.side, body.qty, body.notional,
body.order_type, body.limit_price, body.stop_price, body.tif, body.asset_class,
)
audit_write_op(
principal=principal, action="place_order", exchange="alpaca",
target=body.symbol,
payload={"side": body.side, "qty": body.qty, "notional": body.notional,
"order_type": body.order_type, "limit_price": body.limit_price,
"stop_price": body.stop_price, "tif": body.tif,
"asset_class": body.asset_class},
result=result,
)
return result
@app.post("/tools/amend_order", tags=["writes"])
async def t_amend_order(body: AmendOrderReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
return await client.amend_order(
result = await client.amend_order(
body.order_id, body.qty, body.limit_price, body.stop_price, body.tif,
)
audit_write_op(
principal=principal, action="amend_order", exchange="alpaca",
target=body.order_id,
payload={"qty": body.qty, "limit_price": body.limit_price,
"stop_price": body.stop_price, "tif": body.tif},
result=result,
)
return result
@app.post("/tools/cancel_order", tags=["writes"])
async def t_cancel_order(body: CancelOrderReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
return await client.cancel_order(body.order_id)
result = await client.cancel_order(body.order_id)
audit_write_op(
principal=principal, action="cancel_order", exchange="alpaca",
target=body.order_id, payload={}, result=result,
)
return result
@app.post("/tools/cancel_all_orders", tags=["writes"])
async def t_cancel_all(body: CancelAllReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
return {"canceled": await client.cancel_all_orders()}
result = {"canceled": await client.cancel_all_orders()}
audit_write_op(
principal=principal, action="cancel_all_orders", exchange="alpaca",
payload={}, result=result,
)
return result
@app.post("/tools/close_position", tags=["writes"])
async def t_close(body: ClosePositionReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
return await client.close_position(body.symbol, body.qty, body.percentage)
result = await client.close_position(body.symbol, body.qty, body.percentage)
audit_write_op(
principal=principal, action="close_position", exchange="alpaca",
target=body.symbol,
payload={"qty": body.qty, "percentage": body.percentage},
result=result,
)
return result
@app.post("/tools/close_all_positions", tags=["writes"])
async def t_close_all(body: CloseAllPositionsReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
return {"closed": await client.close_all_positions(body.cancel_orders)}
result = {"closed": await client.close_all_positions(body.cancel_orders)}
audit_write_op(
principal=principal, action="close_all_positions", exchange="alpaca",
payload={"cancel_orders": body.cancel_orders}, result=result,
)
return result
# ── MCP mount ──────────────────────────────────────────
@@ -227,6 +297,7 @@ def create_app(*, client: AlpacaClient, token_store: TokenStore):
token_store=token_store,
internal_base_url=f"http://localhost:{port}",
tools=[
{"name": "environment_info", "description": "Ambiente operativo (paper/live), source, base_url, max_leverage cap."},
{"name": "get_account", "description": "Alpaca account summary (equity, cash, buying_power)."},
{"name": "get_positions", "description": "Posizioni aperte (stocks/crypto/options)."},
{"name": "get_activities", "description": "Activity log (fills, dividends, transfers)."},
@@ -3,7 +3,6 @@ from __future__ import annotations
from unittest.mock import MagicMock
import pytest
from mcp_alpaca.client import AlpacaClient
@@ -0,0 +1,50 @@
from __future__ import annotations
from unittest.mock import MagicMock
from fastapi.testclient import TestClient
from mcp_alpaca.server import create_app
from mcp_common.auth import Principal, TokenStore
from mcp_common.environment import EnvironmentInfo
def _make_app(env_info, creds):
c = MagicMock()
c.paper = True
store = TokenStore(tokens={
"ct": Principal("core", {"core"}),
"ot": Principal("observer", {"observer"}),
})
return create_app(client=c, token_store=store, creds=creds, env_info=env_info)
def test_environment_info_paper_is_testnet():
"""Alpaca: 'paper' nel secret mappa a environment='testnet'."""
env = EnvironmentInfo(
exchange="alpaca",
environment="testnet",
source="env",
env_value="true",
base_url="https://paper-api.alpaca.markets",
)
app = _make_app(env, creds={"max_leverage": 1})
c = TestClient(app)
r = c.post("/tools/environment_info", headers={"Authorization": "Bearer ot"})
assert r.status_code == 200
body = r.json()
assert body["exchange"] == "alpaca"
assert body["environment"] == "testnet"
assert body["source"] == "env"
assert body["base_url"] == "https://paper-api.alpaca.markets"
assert body["max_leverage"] == 1
def test_environment_info_requires_auth():
env = EnvironmentInfo(
exchange="alpaca", environment="testnet", source="default",
env_value=None, base_url="https://paper-api.alpaca.markets",
)
app = _make_app(env, creds={"max_leverage": 1})
c = TestClient(app)
r = c.post("/tools/environment_info")
assert r.status_code == 401
@@ -0,0 +1,46 @@
from __future__ import annotations
import pytest
from fastapi import HTTPException
from mcp_alpaca.leverage_cap import enforce_leverage, get_max_leverage
def test_get_max_leverage_returns_creds_value():
creds = {"max_leverage": 4}
assert get_max_leverage(creds) == 4
def test_get_max_leverage_default_when_missing():
"""Default 1 (cash) se il secret non ha max_leverage."""
assert get_max_leverage({}) == 1
def test_enforce_leverage_pass_at_cap_one():
"""Alpaca cash account: cap 1, leverage 1 OK."""
creds = {"max_leverage": 1}
enforce_leverage(1, creds=creds, exchange="alpaca") # no raise
def test_enforce_leverage_reject_over_cap_one():
creds = {"max_leverage": 1}
with pytest.raises(HTTPException) as exc:
enforce_leverage(2, creds=creds, exchange="alpaca")
assert exc.value.status_code == 403
assert exc.value.detail["error"] == "LEVERAGE_CAP_EXCEEDED"
assert exc.value.detail["exchange"] == "alpaca"
assert exc.value.detail["requested"] == 2
assert exc.value.detail["max"] == 1
def test_enforce_leverage_reject_when_below_one():
creds = {"max_leverage": 1}
with pytest.raises(HTTPException) as exc:
enforce_leverage(0, creds=creds, exchange="alpaca")
assert exc.value.status_code == 403
def test_enforce_leverage_default_when_none():
"""Se requested è None, applica il cap come default."""
creds = {"max_leverage": 1}
result = enforce_leverage(None, creds=creds, exchange="alpaca")
assert result == 1
@@ -4,9 +4,8 @@ from unittest.mock import AsyncMock, MagicMock
import pytest
from fastapi.testclient import TestClient
from mcp_common.auth import Principal, TokenStore
from mcp_alpaca.server import create_app
from mcp_common.auth import Principal, TokenStore
@pytest.fixture
@@ -44,7 +43,7 @@ def mock_client():
@pytest.fixture
def http(mock_client, token_store):
app = create_app(client=mock_client, token_store=token_store)
app = create_app(client=mock_client, token_store=token_store, creds={"max_leverage": 1})
return TestClient(app)
@@ -1,44 +1,29 @@
from __future__ import annotations
import json
import os
import uvicorn
from mcp_common.auth import load_token_store_from_files
from mcp_common.logging import configure_root_logging
from mcp_common.app_factory import ExchangeAppSpec, run_exchange_main
from mcp_bybit.client import BybitClient
from mcp_bybit.server import create_app
configure_root_logging()
SPEC = ExchangeAppSpec(
exchange="bybit",
creds_env_var="BYBIT_CREDENTIALS_FILE",
env_var="BYBIT_TESTNET",
flag_key="testnet",
default_base_url_live="https://api.bybit.com",
default_base_url_testnet="https://api-testnet.bybit.com",
default_port=9019,
build_client=lambda creds, env_info: BybitClient(
api_key=creds["api_key"],
api_secret=creds["api_secret"],
testnet=(env_info.environment == "testnet"),
),
build_app=create_app,
)
def main():
creds_file = os.environ["BYBIT_CREDENTIALS_FILE"]
with open(creds_file) as f:
creds = json.load(f)
testnet_env = os.environ.get("BYBIT_TESTNET", "true").lower()
testnet = testnet_env not in ("0", "false", "no")
client = BybitClient(
api_key=creds["api_key"],
api_secret=creds["api_secret"],
testnet=testnet,
)
token_store = load_token_store_from_files(
core_token_file=os.environ.get("CORE_TOKEN_FILE"),
observer_token_file=os.environ.get("OBSERVER_TOKEN_FILE"),
)
app = create_app(client=client, token_store=token_store)
uvicorn.run(
app,
log_config=None,
host=os.environ.get("HOST", "0.0.0.0"),
port=int(os.environ.get("PORT", "9019")),
)
run_exchange_main(SPEC)
if __name__ == "__main__":
@@ -4,6 +4,7 @@ import asyncio
from typing import Any
from mcp_common import indicators as ind
from mcp_common import microstructure as micro
from pybit.unified_trading import HTTP
@@ -349,6 +350,74 @@ class BybitClient:
for r in rows
]
async def get_orderbook_imbalance(
self,
symbol: str,
category: str = "linear",
depth: int = 10,
) -> dict:
"""Microstructure: bid/ask imbalance ratio + microprice + slope."""
ob = await self.get_orderbook(symbol=symbol, category=category, limit=max(depth, 50))
result = micro.orderbook_imbalance(ob.get("bids") or [], ob.get("asks") or [], depth=depth)
return {
"symbol": symbol,
"category": category,
"depth": depth,
**result,
"timestamp": ob.get("timestamp"),
}
async def get_basis_term_structure(self, asset: str) -> dict:
"""Basis curve futures (dated) vs perp + spot. Filtra contratti future
BTCUSDT / ETHUSDT con scadenza, calcola annualized basis per ognuno.
"""
import datetime as _dt
asset = asset.upper()
spot = await self.get_ticker(f"{asset}USDT", category="spot")
perp = await self.get_ticker(f"{asset}USDT", category="linear")
sp = spot.get("last_price")
pp = perp.get("last_price")
# List of dated futures (linear/inverse)
instr = await self.get_instruments(category="linear")
items = (instr.get("instruments") or [])
futures = [
x for x in items
if x.get("symbol", "").startswith(f"{asset}-") or x.get("symbol", "").startswith(f"{asset}USDT-")
]
rows: list[dict[str, Any]] = []
if sp:
now_ms = int(_dt.datetime.now(_dt.UTC).timestamp() * 1000)
for f in futures[:10]:
tk = await self.get_ticker(f["symbol"], category="linear")
fp = tk.get("last_price")
expiry_ms = f.get("delivery_time")
if not fp or not expiry_ms:
continue
days = max((int(expiry_ms) - now_ms) / 86_400_000, 1)
basis_pct = 100.0 * (fp - sp) / sp
annualized = basis_pct * 365.0 / days
rows.append({
"symbol": f["symbol"],
"expiry_ms": int(expiry_ms),
"days_to_expiry": round(days, 2),
"future_price": fp,
"basis_pct": round(basis_pct, 4),
"annualized_basis_pct": round(annualized, 4),
})
rows.sort(key=lambda r: r["days_to_expiry"])
return {
"asset": asset,
"spot_price": sp,
"perp_price": pp,
"perp_basis_pct": round(100.0 * (pp - sp) / sp, 4) if (sp and pp) else None,
"term_structure": rows,
"data_timestamp": _dt.datetime.now(_dt.UTC).isoformat(),
}
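The annualized-basis arithmetic above is easy to check by hand. A minimal standalone sketch mirroring the calculation in `get_basis_term_structure`, with illustrative numbers (not real market data):

```python
def annualized_basis(spot: float, future: float, days_to_expiry: float) -> tuple[float, float]:
    """Percent basis of a dated future over spot, plus its annualized value."""
    basis_pct = 100.0 * (future - spot) / spot
    return basis_pct, basis_pct * 365.0 / days_to_expiry

# A future trading 1% over spot with 30 days to expiry carries ~12.17% annualized basis.
basis, ann = annualized_basis(100_000.0, 101_000.0, 30.0)
print(round(basis, 4), round(ann, 4))
```

The `days = max(..., 1)` clamp in the method guards against division blow-ups on contracts expiring within the day.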
async def get_basis_spot_perp(self, asset: str) -> dict:
asset = asset.upper()
symbol = f"{asset}USDT"
@@ -412,6 +481,51 @@ class BybitClient:
"status": "submitted",
})
async def place_combo_order(
self,
category: str,
legs: list[dict[str, Any]],
) -> dict:
"""Atomic multi-leg via /v5/order/create-batch (Bybit option only).
Bybit supports batch_order only for category='option'. For perp/linear,
use a loop of place_order calls (not atomic).
legs: [{symbol, side, qty, order_type, price?, tif?, reduce_only?}].
"""
if category != "option":
raise ValueError("place_combo_order: Bybit batch_order is only available for category='option'")
if len(legs) < 2:
raise ValueError("combo requires at least 2 legs")
import uuid
request: list[dict[str, Any]] = []
for leg in legs:
entry: dict[str, Any] = {
"symbol": leg["symbol"],
"side": leg["side"],
"qty": str(leg["qty"]),
"orderType": leg.get("order_type", "Limit"),
"timeInForce": leg.get("tif", "GTC"),
"reduceOnly": leg.get("reduce_only", False),
"orderLinkId": f"cerbero-{uuid.uuid4().hex[:16]}",
}
if leg.get("price") is not None:
entry["price"] = str(leg["price"])
request.append(entry)
resp = await self._run(self._http.place_batch_order, category=category, request=request)
result_list = (resp.get("result") or {}).get("list") or []
orders = [
{
"order_id": r.get("orderId"),
"order_link_id": r.get("orderLinkId"),
"status": "submitted",
}
for r in result_list
]
return self._envelope(resp, {"orders": orders})
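The per-leg mapping to Bybit's create-batch payload (snake_case in, camelCase out, qty/price stringified, generated orderLinkId) can be isolated as a pure function. A sketch under the same field assumptions as the loop above:

```python
import uuid

def build_batch_entry(leg: dict) -> dict:
    """Map one snake_case leg dict to a Bybit create-batch request entry."""
    entry = {
        "symbol": leg["symbol"],
        "side": leg["side"],
        "qty": str(leg["qty"]),  # Bybit expects string quantities
        "orderType": leg.get("order_type", "Limit"),
        "timeInForce": leg.get("tif", "GTC"),
        "reduceOnly": leg.get("reduce_only", False),
        "orderLinkId": f"cerbero-{uuid.uuid4().hex[:16]}",
    }
    if leg.get("price") is not None:
        entry["price"] = str(leg["price"])
    return entry

e = build_batch_entry({"symbol": "BTC-30APR26-75000-C-USDT", "side": "Buy", "qty": 0.01, "price": 5.0})
print(e["qty"], e["orderType"])
```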
async def amend_order(
self,
category: str,
@@ -0,0 +1,56 @@
"""Server-side leverage cap for place_order.
The cap is read from the secret JSON via the `max_leverage` field. Defaults to 1 (cash) if absent.
"""
from __future__ import annotations
from fastapi import HTTPException
def get_max_leverage(creds: dict) -> int:
"""Read max_leverage from the secret. Defaults to 1 if missing."""
raw = creds.get("max_leverage", 1)
try:
value = int(raw)
except (TypeError, ValueError):
value = 1
return max(1, value)
def enforce_leverage(
requested: int | float | None,
*,
creds: dict,
exchange: str,
) -> int:
"""Validate and apply the leverage cap. Returns the applicable leverage.
Raises HTTPException(403, LEVERAGE_CAP_EXCEEDED) if requested > cap.
If requested is None, the cap is applied as the default.
"""
cap = get_max_leverage(creds)
if requested is None:
return cap
lev = int(requested)
if lev < 1:
raise HTTPException(
status_code=403,
detail={
"error": "LEVERAGE_CAP_EXCEEDED",
"exchange": exchange,
"requested": lev,
"max": cap,
"reason": "leverage must be >= 1",
},
)
if lev > cap:
raise HTTPException(
status_code=403,
detail={
"error": "LEVERAGE_CAP_EXCEEDED",
"exchange": exchange,
"requested": lev,
"max": cap,
},
)
return lev
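Outside a FastAPI context the same cap rules can be exercised with a plain exception. A dependency-free sketch of the logic above (hypothetical names; the real module raises HTTPException with a structured detail dict):

```python
class LeverageCapExceeded(Exception):
    """Stand-in for the HTTPException(403) raised by enforce_leverage."""

def enforce_leverage_sketch(requested, creds):
    # Cap from the secret, coerced to int, floored at 1 (cash).
    try:
        cap = max(1, int(creds.get("max_leverage", 1)))
    except (TypeError, ValueError):
        cap = 1
    if requested is None:
        return cap  # the cap doubles as the default
    lev = int(requested)
    if lev < 1 or lev > cap:
        raise LeverageCapExceeded(f"requested={lev} max={cap}")
    return lev

print(enforce_leverage_sketch(None, {"max_leverage": 3}))  # → 3
```

Treating `None` as "apply the cap" means callers that omit leverage silently get the safest permitted value instead of an exchange default.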
@@ -3,13 +3,16 @@ from __future__ import annotations
import os
from fastapi import Depends, HTTPException
from mcp_common.audit import audit_write_op
from mcp_common.auth import Principal, TokenStore, require_principal
from mcp_common.environment import EnvironmentInfo
from mcp_common.mcp_bridge import mount_mcp_endpoint
from mcp_common.server import build_app
from pydantic import BaseModel
from pydantic import BaseModel, Field
from mcp_bybit.client import BybitClient
from mcp_bybit.leverage_cap import enforce_leverage as _enforce_leverage
from mcp_bybit.leverage_cap import get_max_leverage
# --- Body models: reads ---
@@ -97,6 +100,16 @@ class BasisSpotPerpReq(BaseModel):
asset: str
class OrderbookImbalanceReq(BaseModel):
symbol: str
category: str = "linear"
depth: int = 10
class BasisTermStructureReq(BaseModel):
asset: str
# --- Body models: writes ---
class PlaceOrderReq(BaseModel):
@@ -111,6 +124,21 @@ class PlaceOrderReq(BaseModel):
position_idx: int | None = None
class ComboLegReq(BaseModel):
symbol: str
side: str
qty: float
order_type: str = "Limit"
price: float | None = None
tif: str = "GTC"
reduce_only: bool = False
class PlaceComboOrderReq(BaseModel):
category: str = "option"
legs: list[ComboLegReq] = Field(..., min_length=2)
class AmendOrderReq(BaseModel):
category: str
symbol: str
@@ -180,11 +208,39 @@ def _check(principal: Principal, *, core: bool = False, observer: bool = False)
raise HTTPException(status_code=403, detail="forbidden")
def create_app(
*,
client: BybitClient,
token_store: TokenStore,
creds: dict | None = None,
env_info: EnvironmentInfo | None = None,
):
creds = creds or {}
app = build_app(name="mcp-bybit", version="0.1.0", token_store=token_store)
# ── Reads ──────────────────────────────────────────────
@app.post("/tools/environment_info", tags=["reads"])
async def t_environment_info(principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
if env_info is None:
return {
"exchange": "bybit",
"environment": "testnet" if client.testnet else "mainnet",
"source": "credentials",
"env_value": None,
"base_url": getattr(client, "base_url", None),
"max_leverage": get_max_leverage(creds),
}
return {
"exchange": env_info.exchange,
"environment": env_info.environment,
"source": env_info.source,
"env_value": env_info.env_value,
"base_url": env_info.base_url,
"max_leverage": get_max_leverage(creds),
}
@app.post("/tools/get_ticker", tags=["reads"])
async def t_get_ticker(body: TickerReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
@@ -265,62 +321,161 @@ def create_app(*, client: BybitClient, token_store: TokenStore):
_check(principal, core=True, observer=True)
return await client.get_basis_spot_perp(body.asset)
@app.post("/tools/get_orderbook_imbalance", tags=["reads"])
async def t_get_ob_imbalance(body: OrderbookImbalanceReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_orderbook_imbalance(body.symbol, body.category, body.depth)
@app.post("/tools/get_basis_term_structure", tags=["reads"])
async def t_get_basis_term_structure(body: BasisTermStructureReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_basis_term_structure(body.asset)
# ── Writes ─────────────────────────────────────────────
@app.post("/tools/place_order", tags=["writes"])
async def t_place_order(body: PlaceOrderReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = await client.place_order(
body.category, body.symbol, body.side, body.qty,
body.order_type, body.price, body.tif, body.reduce_only, body.position_idx,
)
audit_write_op(
principal=principal, action="place_order", exchange="bybit",
target=body.symbol,
payload={"category": body.category, "side": body.side, "qty": body.qty,
"order_type": body.order_type, "price": body.price, "tif": body.tif,
"reduce_only": body.reduce_only},
result=result,
)
return result
@app.post("/tools/place_combo_order", tags=["writes"])
async def t_place_combo_order(body: PlaceComboOrderReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = await client.place_combo_order(
category=body.category,
legs=[leg.model_dump() for leg in body.legs],
)
audit_write_op(
principal=principal, action="place_combo_order", exchange="bybit",
payload={"category": body.category,
"legs": [leg.model_dump() for leg in body.legs]},
result=result if isinstance(result, dict) else None,
)
return result
@app.post("/tools/amend_order", tags=["writes"])
async def t_amend_order(body: AmendOrderReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = await client.amend_order(
body.category, body.symbol, body.order_id, body.new_qty, body.new_price,
)
audit_write_op(
principal=principal, action="amend_order", exchange="bybit",
target=body.order_id,
payload={"category": body.category, "symbol": body.symbol,
"new_qty": body.new_qty, "new_price": body.new_price},
result=result,
)
return result
@app.post("/tools/cancel_order", tags=["writes"])
async def t_cancel_order(body: CancelOrderReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = await client.cancel_order(body.category, body.symbol, body.order_id)
audit_write_op(
principal=principal, action="cancel_order", exchange="bybit",
target=body.order_id,
payload={"category": body.category, "symbol": body.symbol},
result=result,
)
return result
@app.post("/tools/cancel_all_orders", tags=["writes"])
async def t_cancel_all(body: CancelAllReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = await client.cancel_all_orders(body.category, body.symbol)
audit_write_op(
principal=principal, action="cancel_all_orders", exchange="bybit",
target=body.symbol,
payload={"category": body.category},
result=result,
)
return result
@app.post("/tools/set_stop_loss", tags=["writes"])
async def t_set_sl(body: SetStopLossReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = await client.set_stop_loss(body.category, body.symbol, body.stop_loss, body.position_idx)
audit_write_op(
principal=principal, action="set_stop_loss", exchange="bybit",
target=body.symbol,
payload={"stop_loss": body.stop_loss, "position_idx": body.position_idx},
result=result,
)
return result
@app.post("/tools/set_take_profit", tags=["writes"])
async def t_set_tp(body: SetTakeProfitReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = await client.set_take_profit(body.category, body.symbol, body.take_profit, body.position_idx)
audit_write_op(
principal=principal, action="set_take_profit", exchange="bybit",
target=body.symbol,
payload={"take_profit": body.take_profit, "position_idx": body.position_idx},
result=result,
)
return result
@app.post("/tools/close_position", tags=["writes"])
async def t_close(body: ClosePositionReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = await client.close_position(body.category, body.symbol)
audit_write_op(
principal=principal, action="close_position", exchange="bybit",
target=body.symbol,
payload={"category": body.category},
result=result,
)
return result
@app.post("/tools/set_leverage", tags=["writes"])
async def t_set_leverage(body: SetLeverageReq, principal: Principal = Depends(require_principal)):
_enforce_leverage(body.leverage, creds=creds, exchange="bybit")
_check(principal, core=True)
result = await client.set_leverage(body.category, body.symbol, body.leverage)
audit_write_op(
principal=principal, action="set_leverage", exchange="bybit",
target=body.symbol,
payload={"category": body.category, "leverage": body.leverage},
result=result,
)
return result
@app.post("/tools/switch_position_mode", tags=["writes"])
async def t_switch_mode(body: SwitchModeReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = await client.switch_position_mode(body.category, body.symbol, body.mode)
audit_write_op(
principal=principal, action="switch_position_mode", exchange="bybit",
target=body.symbol,
payload={"category": body.category, "mode": body.mode},
result=result,
)
return result
@app.post("/tools/transfer_asset", tags=["writes"])
async def t_transfer(body: TransferReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = await client.transfer_asset(body.coin, body.amount, body.from_type, body.to_type)
audit_write_op(
principal=principal, action="transfer_asset", exchange="bybit",
payload={"coin": body.coin, "amount": body.amount,
"from_type": body.from_type, "to_type": body.to_type},
result=result,
)
return result
# ── MCP mount ──────────────────────────────────────────
@@ -332,6 +487,7 @@ def create_app(*, client: BybitClient, token_store: TokenStore):
token_store=token_store,
internal_base_url=f"http://localhost:{port}",
tools=[
{"name": "environment_info", "description": "Operating environment (testnet/mainnet), source, base_url, max_leverage cap."},
{"name": "get_ticker", "description": "Bybit ticker (spot/linear/inverse/option)."},
{"name": "get_ticker_batch", "description": "Tickers for multiple symbols."},
{"name": "get_orderbook", "description": "Orderbook with depth N."},
@@ -347,7 +503,10 @@ def create_app(*, client: BybitClient, token_store: TokenStore):
{"name": "get_trade_history", "description": "Recent fills."},
{"name": "get_open_orders", "description": "Pending orders."},
{"name": "get_basis_spot_perp", "description": "Basis spot vs linear perp."},
{"name": "get_orderbook_imbalance", "description": "Microstructure: imbalance ratio + microprice + slope over the top-N book levels."},
{"name": "get_basis_term_structure", "description": "Basis curve of dated futures vs spot, annualized."},
{"name": "place_order", "description": "Submit an order (CORE only)."},
{"name": "place_combo_order", "description": "Atomic multi-leg via place_batch_order (category=option only)."},
{"name": "amend_order", "description": "Amend an existing order."},
{"name": "cancel_order", "description": "Cancel an order."},
{"name": "cancel_all_orders", "description": "Cancel all orders."},
@@ -3,7 +3,6 @@ from __future__ import annotations
from unittest.mock import MagicMock
import pytest
from mcp_bybit.client import BybitClient
@@ -1,7 +1,6 @@
from __future__ import annotations
import pytest
from mcp_bybit.client import BybitClient
@@ -423,6 +422,64 @@ async def test_place_order_linear_no_link_id(client, mock_http):
assert "orderLinkId" not in kwargs
@pytest.mark.asyncio
async def test_place_combo_order_batch_option(client, mock_http):
"""Combo order via place_batch_order for category=option (atomic, single round-trip)."""
mock_http.place_batch_order.return_value = {
"retCode": 0,
"result": {
"list": [
{"orderId": "ord-1", "orderLinkId": "cerbero-leg1"},
{"orderId": "ord-2", "orderLinkId": "cerbero-leg2"},
]
},
}
legs = [
{"symbol": "BTC-30APR26-75000-C-USDT", "side": "Buy", "qty": 0.01, "order_type": "Limit", "price": 5.0},
{"symbol": "BTC-30APR26-80000-C-USDT", "side": "Sell", "qty": 0.01, "order_type": "Limit", "price": 3.0},
]
out = await client.place_combo_order(category="option", legs=legs)
assert len(out["orders"]) == 2
assert out["orders"][0]["order_id"] == "ord-1"
kwargs = mock_http.place_batch_order.call_args.kwargs
assert kwargs["category"] == "option"
request = kwargs["request"]
assert len(request) == 2
assert request[0]["symbol"] == "BTC-30APR26-75000-C-USDT"
assert request[0]["qty"] == "0.01"
assert request[0]["orderType"] == "Limit"
# CER: orderLinkId is mandatory for option
assert "orderLinkId" in request[0]
@pytest.mark.asyncio
async def test_place_combo_order_error(client, mock_http):
mock_http.place_batch_order.return_value = {"retCode": 10001, "retMsg": "invalid leg"}
out = await client.place_combo_order(
category="option",
legs=[
{"symbol": "X", "side": "Buy", "qty": 1, "order_type": "Limit", "price": 1.0},
{"symbol": "Y", "side": "Sell", "qty": 1, "order_type": "Limit", "price": 1.0},
],
)
assert out["error"] == "invalid leg"
assert out["code"] == 10001
@pytest.mark.asyncio
async def test_place_combo_order_rejects_non_option(client, mock_http):
"""Bybit batch_order is only available for the option category."""
with pytest.raises(ValueError, match="option"):
await client.place_combo_order(
category="linear",
legs=[
{"symbol": "BTCUSDT", "side": "Buy", "qty": 0.01, "order_type": "Market"},
{"symbol": "ETHUSDT", "side": "Sell", "qty": 0.01, "order_type": "Market"},
],
)
@pytest.mark.asyncio
async def test_cancel_order(client, mock_http):
mock_http.cancel_order.return_value = {"retCode": 0, "result": {"orderId": "ord1"}}
@@ -0,0 +1,54 @@
from __future__ import annotations
from unittest.mock import AsyncMock, MagicMock
from fastapi.testclient import TestClient
from mcp_bybit.server import create_app
from mcp_common.auth import Principal, TokenStore
from mcp_common.environment import EnvironmentInfo
def _make_app(env_info, creds):
c = MagicMock()
c.testnet = True
c.set_leverage = AsyncMock(return_value={"state": "ok"})
store = TokenStore(tokens={
"ct": Principal("core", {"core"}),
"ot": Principal("observer", {"observer"}),
})
return create_app(client=c, token_store=store, creds=creds, env_info=env_info)
def test_environment_info_full_shape():
env = EnvironmentInfo(
exchange="bybit",
environment="testnet",
source="env",
env_value="true",
base_url="https://api-testnet.bybit.com",
)
app = _make_app(env, creds={"max_leverage": 3})
c = TestClient(app)
r = c.post(
"/tools/environment_info",
headers={"Authorization": "Bearer ot"},
)
assert r.status_code == 200
body = r.json()
assert body["exchange"] == "bybit"
assert body["environment"] == "testnet"
assert body["source"] == "env"
assert body["env_value"] == "true"
assert body["base_url"] == "https://api-testnet.bybit.com"
assert body["max_leverage"] == 3
def test_environment_info_requires_auth():
env = EnvironmentInfo(
exchange="bybit", environment="testnet", source="default",
env_value=None, base_url="https://api-testnet.bybit.com",
)
app = _make_app(env, creds={"max_leverage": 3})
c = TestClient(app)
r = c.post("/tools/environment_info")
assert r.status_code == 401
@@ -0,0 +1,50 @@
from __future__ import annotations
import pytest
from fastapi import HTTPException
from mcp_bybit.leverage_cap import enforce_leverage, get_max_leverage
def test_get_max_leverage_returns_creds_value():
creds = {"max_leverage": 5}
assert get_max_leverage(creds) == 5
def test_get_max_leverage_default_when_missing():
"""Defaults to 1 (cash) when the secret has no max_leverage."""
assert get_max_leverage({}) == 1
def test_enforce_leverage_pass_under_cap():
creds = {"max_leverage": 3}
enforce_leverage(2, creds=creds, exchange="bybit") # no raise
def test_enforce_leverage_pass_at_cap():
creds = {"max_leverage": 3}
enforce_leverage(3, creds=creds, exchange="bybit") # no raise
def test_enforce_leverage_reject_over_cap():
creds = {"max_leverage": 3}
with pytest.raises(HTTPException) as exc:
enforce_leverage(10, creds=creds, exchange="bybit")
assert exc.value.status_code == 403
assert exc.value.detail["error"] == "LEVERAGE_CAP_EXCEEDED"
assert exc.value.detail["exchange"] == "bybit"
assert exc.value.detail["requested"] == 10
assert exc.value.detail["max"] == 3
def test_enforce_leverage_reject_when_below_one():
creds = {"max_leverage": 3}
with pytest.raises(HTTPException) as exc:
enforce_leverage(0, creds=creds, exchange="bybit")
assert exc.value.status_code == 403
def test_enforce_leverage_default_when_none():
"""If requested is None, the cap is applied as the default."""
creds = {"max_leverage": 3}
result = enforce_leverage(None, creds=creds, exchange="bybit")
assert result == 3
@@ -4,9 +4,8 @@ from unittest.mock import AsyncMock, MagicMock
import pytest
from fastapi.testclient import TestClient
from mcp_bybit.server import create_app
from mcp_common.auth import Principal, TokenStore
@pytest.fixture
@@ -47,12 +46,15 @@ def mock_client():
c.set_leverage = AsyncMock(return_value={"status": "leverage_set"})
c.switch_position_mode = AsyncMock(return_value={"status": "mode_switched"})
c.transfer_asset = AsyncMock(return_value={"transfer_id": "tx"})
c.place_combo_order = AsyncMock(return_value={"orders": [{"order_id": "ord-1"}, {"order_id": "ord-2"}]})
c.get_orderbook_imbalance = AsyncMock(return_value={"imbalance_ratio": 0.0, "microprice": 100.0})
c.get_basis_term_structure = AsyncMock(return_value={"asset": "BTC", "term_structure": []})
return c
@pytest.fixture
def http(mock_client, token_store):
app = create_app(client=mock_client, token_store=token_store, creds={"max_leverage": 5})
return TestClient(app)
@@ -75,6 +77,8 @@ READ_ENDPOINTS = [
("/tools/get_trade_history", {}),
("/tools/get_open_orders", {}),
("/tools/get_basis_spot_perp", {"asset": "BTC"}),
("/tools/get_orderbook_imbalance", {"symbol": "BTCUSDT"}),
("/tools/get_basis_term_structure", {"asset": "BTC"}),
]
WRITE_ENDPOINTS = [
@@ -88,9 +92,28 @@ WRITE_ENDPOINTS = [
("/tools/set_leverage", {"category": "linear", "symbol": "BTCUSDT", "leverage": 5}),
("/tools/switch_position_mode", {"category": "linear", "symbol": "BTCUSDT", "mode": "hedge"}),
("/tools/transfer_asset", {"coin": "USDT", "amount": 10.0, "from_type": "UNIFIED", "to_type": "FUND"}),
("/tools/place_combo_order", {
"category": "option",
"legs": [
{"symbol": "BTC-30APR26-75000-C-USDT", "side": "Buy", "qty": 0.01, "order_type": "Limit", "price": 5.0},
{"symbol": "BTC-30APR26-80000-C-USDT", "side": "Sell", "qty": 0.01, "order_type": "Limit", "price": 3.0},
],
}),
]
def test_place_combo_order_min_legs(http):
r = http.post(
"/tools/place_combo_order",
json={
"category": "option",
"legs": [{"symbol": "X", "side": "Buy", "qty": 1, "order_type": "Limit", "price": 1.0}],
},
headers=CORE,
)
assert r.status_code == 422
@pytest.mark.parametrize("path,payload", READ_ENDPOINTS)
def test_read_core_ok(http, path, payload):
r = http.post(path, json=payload, headers=CORE)
@@ -1,48 +1,29 @@
from __future__ import annotations
import json
import os
import uvicorn
from mcp_common.auth import load_token_store_from_files
from mcp_deribit.env_validation import (
fail_fast_if_missing,
require_env,
summarize,
)
from mcp_common.logging import configure_root_logging
from mcp_common.app_factory import ExchangeAppSpec, run_exchange_main
from mcp_deribit.client import DeribitClient
from mcp_deribit.server import create_app
configure_root_logging() # CER-P5-009: JSON by default, env LOG_FORMAT=text for dev
SPEC = ExchangeAppSpec(
exchange="deribit",
creds_env_var="CREDENTIALS_FILE",
env_var="DERIBIT_TESTNET",
flag_key="testnet",
default_base_url_live="https://www.deribit.com/api/v2",
default_base_url_testnet="https://test.deribit.com/api/v2",
default_port=9011,
build_client=lambda creds, env_info: DeribitClient(
client_id=creds["client_id"],
client_secret=creds["client_secret"],
testnet=(env_info.environment == "testnet"),
),
build_app=create_app,
)
def main():
# CER-P5-010: fail-fast boot on mandatory env vars
fail_fast_if_missing(["CREDENTIALS_FILE"])
summarize(["CREDENTIALS_FILE", "CORE_TOKEN_FILE", "OBSERVER_TOKEN_FILE", "PORT", "HOST"])
creds_file = require_env("CREDENTIALS_FILE", "deribit credentials JSON path")
with open(creds_file) as f:
creds = json.load(f)
client = DeribitClient(
client_id=creds["client_id"],
client_secret=creds["client_secret"],
testnet=bool(creds.get("testnet", True)),
)
token_store = load_token_store_from_files(
core_token_file=os.environ.get("CORE_TOKEN_FILE"),
observer_token_file=os.environ.get("OBSERVER_TOKEN_FILE"),
)
app = create_app(client=client, token_store=token_store)
uvicorn.run(
app,
log_config=None, # CER-P5-009: delegate to the root JSON logger
host=os.environ.get("HOST", "0.0.0.0"),
port=int(os.environ.get("PORT", "9011")),
)
run_exchange_main(SPEC)
if __name__ == "__main__":
@@ -1,11 +1,14 @@
from __future__ import annotations
import contextlib
import time
from dataclasses import dataclass, field
from typing import Any
import httpx
from mcp_common import indicators as ind
from mcp_common import microstructure as micro
from mcp_common import options as opt
from mcp_common.http import async_client
BASE_LIVE = "https://www.deribit.com/api/v2"
BASE_TESTNET = "https://test.deribit.com/api/v2"
@@ -41,7 +44,7 @@ class DeribitClient:
"client_id": self.client_id,
"client_secret": self.client_secret,
}
async with httpx.AsyncClient(timeout=15.0) as http:
async with async_client(timeout=15.0) as http:
resp = await http.get(url, params=params)
data = resp.json()
result = data["result"]
@@ -65,7 +68,7 @@ class DeribitClient:
if is_private and self._token:
headers["Authorization"] = f"Bearer {self._token}"
async with httpx.AsyncClient(timeout=15.0) as http:
async with async_client(timeout=15.0) as http:
resp = await http.get(url, params=request_params, headers=headers)
data = resp.json()
@@ -193,10 +196,8 @@ class DeribitClient:
name = s.get("instrument_name")
oi = s.get("open_interest")
if name and oi is not None:
with contextlib.suppress(TypeError, ValueError):
oi_by_name[name] = float(oi)
all_items = raw.get("result") or []
filtered: list[dict] = []
@@ -262,6 +263,18 @@ class DeribitClient:
"data_timestamp": _dt.datetime.now(_dt.UTC).isoformat(),
}
async def get_orderbook_imbalance(self, instrument_name: str, depth: int = 10) -> dict:
"""Microstructure: bid/ask imbalance + microprice + slope over the top-N levels."""
ob = await self.get_orderbook(instrument_name, depth=max(depth, 10))
result = micro.orderbook_imbalance(ob.get("bids") or [], ob.get("asks") or [], depth=depth)
return {
"instrument_name": instrument_name,
"depth": depth,
**result,
"timestamp": ob.get("timestamp"),
"testnet": self.testnet,
}
async def get_positions(self, currency: str = "USDC") -> list:
raw = await self._request("private/get_positions", {"currency": currency})
result = raw.get("result") or []
@@ -525,6 +538,159 @@ class DeribitClient:
"testnet": self.testnet,
}
async def _fetch_chain_legs(
self,
currency: str,
expiry_from: str | None = None,
expiry_to: str | None = None,
top_n_strikes: int = 50,
) -> tuple[float, list[dict[str, Any]]]:
"""Fetch the options chain + tickers for the top-N strikes by OI; returns
(spot, legs[]) with fields normalized for the functions in mcp_common.options.
"""
import asyncio
currency = currency.upper()
try:
idx_tk = await self.get_ticker(f"{currency}-PERPETUAL")
spot = float(idx_tk.get("mark_price") or 0)
except Exception:
spot = 0.0
chain = await self.get_instruments(
currency=currency,
kind="option",
expiry_from=expiry_from,
expiry_to=expiry_to,
limit=2000,
)
items = chain.get("instruments", [])
items.sort(key=lambda x: -(x.get("open_interest") or 0))
top = items[:top_n_strikes]
async def _ticker(name: str) -> dict:
try:
return await self.get_ticker(name)
except Exception:
return {}
tickers = await asyncio.gather(*[_ticker(i["name"]) for i in top])
legs: list[dict[str, Any]] = []
for meta, tk in zip(top, tickers, strict=True):
greeks = tk.get("greeks") or {}
legs.append({
"strike": meta.get("strike"),
"option_type": meta.get("option_type"),
"oi": meta.get("open_interest") or 0,
"iv": tk.get("mark_iv"),
"delta": greeks.get("delta"),
"gamma": greeks.get("gamma"),
"vanna": greeks.get("vanna"),
"charm": greeks.get("charm"),
"vega": greeks.get("vega"),
})
return spot, legs
async def get_dealer_gamma_profile(
self,
currency: str,
expiry_from: str | None = None,
expiry_to: str | None = None,
top_n_strikes: int = 50,
) -> dict:
"""Net dealer gamma per strike (assumes dealers are short calls/long puts).
Identifies the gamma flip level: above it the market pins, below it it squeezes.
"""
import datetime as _dt
spot, legs = await self._fetch_chain_legs(currency, expiry_from, expiry_to, top_n_strikes)
result = opt.dealer_gamma_profile(legs, spot)
return {
"currency": currency.upper(),
"spot_price": spot,
**result,
"strikes_analyzed": len(result["by_strike"]),
"data_timestamp": _dt.datetime.now(_dt.UTC).isoformat(),
"testnet": self.testnet,
}
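`opt.dealer_gamma_profile` is external to this diff. Under the sign convention stated in the docstring above (dealers short calls, long puts), a per-strike aggregation might look like the following illustrative sketch (function name and leg shape are assumptions, not the library API):

```python
from collections import defaultdict

def dealer_gamma_by_strike(legs):
    """Net dealer gamma per strike: calls contribute negative gamma, puts positive."""
    net = defaultdict(float)
    for leg in legs:
        gamma, oi = leg.get("gamma"), leg.get("oi") or 0
        if gamma is None or not leg.get("strike"):
            continue
        sign = -1.0 if leg.get("option_type") == "call" else 1.0
        net[leg["strike"]] += sign * gamma * oi
    return dict(net)

legs = [
    {"strike": 70000, "option_type": "call", "gamma": 0.0001, "oi": 500},
    {"strike": 70000, "option_type": "put", "gamma": 0.0002, "oi": 100},
]
print(dealer_gamma_by_strike(legs))
```

The gamma flip level is then the spot region where the cumulative net gamma changes sign.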
async def get_vanna_charm(
self,
currency: str,
expiry_from: str | None = None,
expiry_to: str | None = None,
top_n_strikes: int = 50,
) -> dict:
"""Vanna (∂delta/∂IV) and charm (∂delta/∂t), aggregated and OI-weighted.
Positive vanna: dealers buy spot as IV rises.
Negative charm: time decay erodes delta hedges.
"""
import datetime as _dt
spot, legs = await self._fetch_chain_legs(currency, expiry_from, expiry_to, top_n_strikes)
result = opt.vanna_charm_aggregate(legs, spot)
return {
"currency": currency.upper(),
**result,
"data_timestamp": _dt.datetime.now(_dt.UTC).isoformat(),
"testnet": self.testnet,
}
async def get_oi_weighted_skew(
self,
currency: str,
expiry_from: str | None = None,
expiry_to: str | None = None,
top_n_strikes: int = 100,
) -> dict:
"""OI-weighted aggregate skew: mean put IV minus mean call IV. Positive = fear.
"""
import datetime as _dt
_, legs = await self._fetch_chain_legs(currency, expiry_from, expiry_to, top_n_strikes)
result = opt.oi_weighted_skew(legs)
return {
"currency": currency.upper(),
**result,
"data_timestamp": _dt.datetime.now(_dt.UTC).isoformat(),
"testnet": self.testnet,
}
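`opt.oi_weighted_skew` lives in `mcp_common.options` and is not shown here. The metric described in the docstring (OI-weighted mean put IV minus call IV) can be sketched as a hypothetical stand-in:

```python
def oi_weighted_skew_sketch(legs):
    """legs: dicts with option_type ('put'/'call'), iv, oi."""
    def wavg(side):
        rows = [(l["iv"], l["oi"]) for l in legs
                if l.get("option_type") == side and l.get("iv") is not None and l.get("oi")]
        total = sum(oi for _, oi in rows)
        return sum(iv * oi for iv, oi in rows) / total if total else None
    put_iv, call_iv = wavg("put"), wavg("call")
    skew = put_iv - call_iv if put_iv is not None and call_iv is not None else None
    return {"put_iv": put_iv, "call_iv": call_iv, "skew": skew}

legs = [
    {"option_type": "put", "iv": 60.0, "oi": 100},
    {"option_type": "call", "iv": 55.0, "oi": 100},
]
print(oi_weighted_skew_sketch(legs)["skew"])
```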
async def get_smile_asymmetry(
self,
currency: str,
expiry_from: str | None = None,
expiry_to: str | None = None,
top_n_strikes: int = 100,
) -> dict:
"""IV asymmetry, OTM puts vs OTM calls. Positive = the put side is richer."""
import datetime as _dt
spot, legs = await self._fetch_chain_legs(currency, expiry_from, expiry_to, top_n_strikes)
result = opt.smile_asymmetry(legs, spot)
return {
"currency": currency.upper(),
"spot_price": spot,
**result,
"data_timestamp": _dt.datetime.now(_dt.UTC).isoformat(),
"testnet": self.testnet,
}
async def get_atm_vs_wings_vol(
self,
currency: str,
expiry_from: str | None = None,
expiry_to: str | None = None,
top_n_strikes: int = 100,
) -> dict:
"""ATM IV vs IV at the 25-delta wings. wing_richness > 0 → smile (kurtosis)."""
import datetime as _dt
spot, legs = await self._fetch_chain_legs(currency, expiry_from, expiry_to, top_n_strikes)
result = opt.atm_vs_wings_vol(legs, spot)
return {
"currency": currency.upper(),
"spot_price": spot,
**result,
"data_timestamp": _dt.datetime.now(_dt.UTC).isoformat(),
"testnet": self.testnet,
}
async def get_pc_ratio(self, currency: str) -> dict:
import datetime as _dt
@@ -714,8 +880,7 @@ class DeribitClient:
shape = "backwardation"
short_term = next((x for x in ts if 8 <= x["dte"] <= 14), None)
mid_term = next((x for x in ts if 35 <= x["dte"] <= 45), None)
if short_term and mid_term and mid_term["atm_iv"] - short_term["atm_iv"] > 5:
contango_steep = True
calendar_opp = True
@@ -963,7 +1128,7 @@ class DeribitClient:
structure = self._guess_structure(enriched)
notional = sum(l["quantity"] * spot for l in enriched) if spot else 0.0
fee_per_leg = min(0.0003 * (spot or 1) * sum(l["quantity"] for l in enriched),
0.125 * abs(net_premium)) if spot else 0.0
fees_open = round(fee_per_leg, 4)
@@ -1401,6 +1566,38 @@ class DeribitClient:
return {"error": raw.get("error", "unknown"), "state": "error"}
return r
async def place_combo_order(
self,
legs: list[dict[str, Any]],
side: str,
amount: float,
type: str = "limit",
price: float | None = None,
label: str | None = None,
) -> dict:
"""Crea un combo via private/create_combo poi piazza un singolo ordine
(buy/sell) sull'instrument_name del combo. Una sola crociata di spread
invece di N (uno per leg) → minor slippage su strutture liquide.
legs: [{instrument_name, direction: 'buy'|'sell', ratio: int}].
"""
combo_raw = await self._request("private/create_combo", {"trades": legs})
combo = combo_raw.get("result")
if combo is None:
return {"state": "error", "error": combo_raw.get("error", "unknown")}
combo_instrument = combo.get("instrument_name") or combo.get("id")
order = await self.place_order(
instrument_name=combo_instrument,
side=side,
amount=amount,
type=type,
price=price,
label=label,
)
if order.get("state") == "error":
return {"state": "error", "error": order.get("error"), "combo_instrument": combo_instrument}
return {"combo_instrument": combo_instrument, **order}
async def set_leverage(self, instrument_name: str, leverage: int) -> dict:
"""CER-016: pre-set account leverage per evitare default 50x testnet."""
raw = await self._request(
@@ -1,80 +1,18 @@
"""CER-P5-010: env validation policy — fail-fast per mandatory, soft per optional.
Usage al boot di ogni mcp `__main__.py`:
from option_mcp_common.env_validation import require_env, optional_env, summarize
creds_file = require_env("CREDENTIALS_FILE", "deribit credentials JSON path")
host = optional_env("HOST", default="0.0.0.0")
summarize(["CREDENTIALS_FILE", "HOST", "PORT"])
"""Re-export shim per backward-compat: la logica vive ora in
mcp_common.env_validation. Non aggiungere nuovo codice qui.
"""
from mcp_common.env_validation import (
MissingEnvError,
fail_fast_if_missing,
optional_env,
require_env,
summarize,
)
from __future__ import annotations
import logging
import os
import sys
logger = logging.getLogger(__name__)
class MissingEnvError(RuntimeError):
"""Mandatory env var absent or empty."""
def require_env(name: str, description: str = "") -> str:
"""Fail-fast: raise MissingEnvError se name non presente o vuoto.
Uscita dal processo con codice 2 se chiamato dal main(). Comporta
logging chiaro del missing var prima dell'exit.
"""
val = (os.environ.get(name) or "").strip()
if not val:
msg = f"missing mandatory env var: {name}"
if description:
msg += f" ({description})"
logger.error(msg)
raise MissingEnvError(msg)
return val
def optional_env(name: str, *, default: str = "") -> str:
"""Soft: ritorna env o default. Log INFO se default usato."""
val = (os.environ.get(name) or "").strip()
if not val:
if default:
logger.info("env %s not set, using default=%r", name, default)
return default
return val
def summarize(names: list[str]) -> None:
"""Log INFO di tutti gli env rilevanti con presenza (mask se SECRET/KEY/TOKEN)."""
sensitive_tokens = ("SECRET", "KEY", "TOKEN", "PASSWORD", "CREDENTIAL", "WALLET")
for n in names:
val = os.environ.get(n)
if val is None:
logger.info("env[%s]: <unset>", n)
continue
if any(t in n.upper() for t in sensitive_tokens):
logger.info("env[%s]: <set, %d chars>", n, len(val))
else:
logger.info("env[%s]: %s", n, val)
def fail_fast_if_missing(names: list[str]) -> None:
"""Verifica lista di nomi mandatory al boot. Exit 2 se uno solo manca.
Uso preferito: early call in main() per bloccare boot se config incompleta.
"""
missing: list[str] = []
for n in names:
if not (os.environ.get(n) or "").strip():
missing.append(n)
if missing:
logger.error("boot aborted: missing mandatory env vars: %s", missing)
print(
f"FATAL: missing mandatory env vars: {missing}",
file=sys.stderr,
)
sys.exit(2)
__all__ = [
"MissingEnvError",
"fail_fast_if_missing",
"optional_env",
"require_env",
"summarize",
]
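The boot pattern from the module docstring can be sketched in isolation; these are minimal stand-ins mirroring `require_env`/`optional_env` above (logging omitted), and the env var values are hypothetical:

```python
import os

class MissingEnvError(RuntimeError):
    pass

def require_env(name: str) -> str:
    # Fail-fast: empty or unset mandatory vars raise immediately.
    val = (os.environ.get(name) or "").strip()
    if not val:
        raise MissingEnvError(f"missing mandatory env var: {name}")
    return val

def optional_env(name: str, *, default: str = "") -> str:
    # Soft: fall back to the default when unset or blank.
    return (os.environ.get(name) or "").strip() or default

# Hypothetical values, for illustration only.
os.environ["CREDENTIALS_FILE"] = "/run/secrets/deribit.json"
os.environ.pop("DEMO_HOST", None)

creds_file = require_env("CREDENTIALS_FILE")          # -> "/run/secrets/deribit.json"
host = optional_env("DEMO_HOST", default="0.0.0.0")   # falls back to "0.0.0.0"
```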
@@ -1,19 +1,19 @@
from __future__ import annotations
import contextlib
import os
from fastapi import Depends, FastAPI, HTTPException
from mcp_common.audit import audit_write_op
from mcp_common.auth import Principal, TokenStore, require_principal
from mcp_common.environment import EnvironmentInfo
from mcp_common.mcp_bridge import mount_mcp_endpoint
from mcp_common.server import build_app
from pydantic import BaseModel, field_validator, model_validator
from mcp_deribit.client import DeribitClient
from mcp_deribit.leverage_cap import enforce_leverage as _enforce_leverage
from mcp_deribit.leverage_cap import get_max_leverage
# --- Body models ---
@@ -64,6 +64,11 @@ class GetOrderbookReq(BaseModel):
depth: int = 10
class OrderbookImbalanceReq(BaseModel):
instrument_name: str
depth: int = 10
class GetPositionsReq(BaseModel):
currency: str = "USDC"
@@ -112,6 +117,15 @@ class GetGexReq(BaseModel):
top_n_strikes: int = 50
class OptionFlowReq(BaseModel):
"""Body comune per indicatori option-flow (dealer gamma, vanna/charm,
OI-weighted skew, smile asymmetry, ATM vs wings)."""
currency: str
expiry_from: str | None = None
expiry_to: str | None = None
top_n_strikes: int = 100
class GetPcRatioReq(BaseModel):
currency: str
@@ -190,6 +204,28 @@ class PlaceOrderReq(BaseModel):
leverage: int | None = None # CER-016: None → default cap (3x)
class ComboLeg(BaseModel):
instrument_name: str
direction: str # "buy" | "sell"
ratio: int = 1
class PlaceComboOrderReq(BaseModel):
legs: list[ComboLeg]
side: str # "buy" | "sell"
amount: float
type: str = "limit"
price: float | None = None
label: str | None = None
leverage: int | None = None
@model_validator(mode="after")
def _at_least_two_legs(self):
if len(self.legs) < 2:
raise ValueError("combo requires at least 2 legs")
return self
class CancelOrderReq(BaseModel):
order_id: str
@@ -208,48 +244,6 @@ class ClosePositionReq(BaseModel):
instrument_name: str
# --- CER-016 notional helpers ---
async def _compute_notional_deribit(client: DeribitClient, body: PlaceOrderReq) -> float:
"""Stima notional in USD per un ordine Deribit.
- Perp USDC: contract size = 1 USD → amount è già notional USD.
- Options: amount è in base asset (BTC/ETH) → moltiplica per index price.
- Altri perp BTC/ETH: amount in USD notional.
"""
name = body.instrument_name.upper()
if name.endswith("-PERPETUAL"):
return float(body.amount)
ref_price: float | None = body.price
if ref_price is None:
try:
tk = await client.get_ticker(body.instrument_name)
ref_price = tk.get("mark_price") or tk.get("last_price")
except Exception:
ref_price = None
if not ref_price:
return float(body.amount)
return float(body.amount) * float(ref_price)
async def _current_aggregate_deribit(client: DeribitClient) -> float:
"""Somma notional posizioni aperte su Deribit (USDC)."""
try:
positions = await client.get_positions("USDC")
except Exception:
return 0.0
total = 0.0
for p in positions or []:
size = abs(float(p.get("size") or 0))
name = str(p.get("instrument") or "").upper()
if name.endswith("-PERPETUAL"):
total += size
else:
mark = float(p.get("mark_price") or 0)
total += size * mark
return total
# --- ACL helper ---
def _check(principal: Principal, *, core: bool = False, observer: bool = False) -> None:
@@ -264,18 +258,23 @@ def _check(principal: Principal, *, core: bool = False, observer: bool = False)
# --- App factory ---
def create_app(
*,
client: DeribitClient,
token_store: TokenStore,
creds: dict,
env_info: EnvironmentInfo | None = None,
) -> FastAPI:
from contextlib import asynccontextmanager
cap_default = get_max_leverage(creds)
# CER-016: pre-set leverage cap on the main perps at boot (best-effort).
@asynccontextmanager
async def _lifespan(_app: FastAPI):
for inst in ("BTC-PERPETUAL", "ETH-PERPETUAL"):
with contextlib.suppress(Exception):
await client.set_leverage(inst, cap_default)
yield
app = build_app(
@@ -292,6 +291,27 @@ def create_app(*, client: DeribitClient, token_store: TokenStore) -> FastAPI:
_check(principal, core=True, observer=True)
return client.is_testnet()
@app.post("/tools/environment_info", tags=["reads"])
async def t_environment_info(principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
if env_info is None:
return {
"exchange": "deribit",
"environment": "testnet" if client.is_testnet().get("testnet") else "mainnet",
"source": "credentials",
"env_value": None,
"base_url": client.base_url,
"max_leverage": get_max_leverage(creds),
}
return {
"exchange": env_info.exchange,
"environment": env_info.environment,
"source": env_info.source,
"env_value": env_info.env_value,
"base_url": env_info.base_url,
"max_leverage": get_max_leverage(creds),
}
@app.post("/tools/get_ticker", tags=["reads"])
async def t_get_ticker(
body: GetTickerReq, principal: Principal = Depends(require_principal)
@@ -330,6 +350,13 @@ def create_app(*, client: DeribitClient, token_store: TokenStore) -> FastAPI:
_check(principal, core=True, observer=True)
return await client.get_orderbook(body.instrument_name, body.depth)
@app.post("/tools/get_orderbook_imbalance", tags=["reads"])
async def t_get_ob_imbalance(
body: OrderbookImbalanceReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_orderbook_imbalance(body.instrument_name, body.depth)
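The endpoint above returns the microstructure indicator from `DeribitClient.get_orderbook_imbalance`. One common definition of its outputs (an assumption here, not necessarily the repo's exact formulas) is:

```python
def imbalance_and_microprice(bids: list[tuple[float, float]],
                             asks: list[tuple[float, float]]) -> dict:
    # bids/asks: [(price, size), ...], best level first. Illustrative formulas:
    # imbalance in [-1, 1], positive = bid-heavy book;
    # microprice = size-weighted mid of the top of book.
    bid_vol = sum(s for _, s in bids)
    ask_vol = sum(s for _, s in asks)
    total = bid_vol + ask_vol
    imbalance = (bid_vol - ask_vol) / total if total else 0.0
    best_bid, bid_sz = bids[0]
    best_ask, ask_sz = asks[0]
    micro = (best_bid * ask_sz + best_ask * bid_sz) / (bid_sz + ask_sz)
    return {"imbalance_ratio": imbalance, "microprice": micro}

# Bid-heavy book: microprice leans toward the ask.
book = imbalance_and_microprice(bids=[(49990.0, 3.0)], asks=[(50010.0, 1.0)])
```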
@app.post("/tools/get_positions", tags=["reads"])
async def t_get_positions(
body: GetPositionsReq, principal: Principal = Depends(require_principal)
@@ -378,6 +405,51 @@ def create_app(*, client: DeribitClient, token_store: TokenStore) -> FastAPI:
body.currency, body.expiry_from, body.expiry_to, body.top_n_strikes
)
@app.post("/tools/get_dealer_gamma_profile", tags=["reads"])
async def t_get_dealer_gamma_profile(
body: OptionFlowReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_dealer_gamma_profile(
body.currency, body.expiry_from, body.expiry_to, body.top_n_strikes
)
@app.post("/tools/get_vanna_charm", tags=["reads"])
async def t_get_vanna_charm(
body: OptionFlowReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_vanna_charm(
body.currency, body.expiry_from, body.expiry_to, body.top_n_strikes
)
@app.post("/tools/get_oi_weighted_skew", tags=["reads"])
async def t_get_oi_weighted_skew(
body: OptionFlowReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_oi_weighted_skew(
body.currency, body.expiry_from, body.expiry_to, body.top_n_strikes
)
@app.post("/tools/get_smile_asymmetry", tags=["reads"])
async def t_get_smile_asymmetry(
body: OptionFlowReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_smile_asymmetry(
body.currency, body.expiry_from, body.expiry_to, body.top_n_strikes
)
@app.post("/tools/get_atm_vs_wings_vol", tags=["reads"])
async def t_get_atm_vs_wings_vol(
body: OptionFlowReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_atm_vs_wings_vol(
body.currency, body.expiry_from, body.expiry_to, body.top_n_strikes
)
@app.post("/tools/get_pc_ratio", tags=["reads"])
async def t_get_pc_ratio(
body: GetPcRatioReq, principal: Principal = Depends(require_principal)
@@ -476,20 +548,11 @@ def create_app(*, client: DeribitClient, token_store: TokenStore) -> FastAPI:
body: PlaceOrderReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True)
lev = _enforce_leverage(body.leverage, creds=creds, exchange="deribit")
if lev != cap_default:
with contextlib.suppress(Exception):
await client.set_leverage(body.instrument_name, lev)
result = await client.place_order(
instrument_name=body.instrument_name,
side=body.side,
amount=body.amount,
@@ -499,34 +562,89 @@ def create_app(*, client: DeribitClient, token_store: TokenStore) -> FastAPI:
post_only=body.post_only,
label=body.label,
)
audit_write_op(
principal=principal, action="place_order", exchange="deribit",
target=body.instrument_name,
payload={"side": body.side, "amount": body.amount, "type": body.type,
"price": body.price, "leverage": lev, "label": body.label},
result=result,
)
return result
@app.post("/tools/place_combo_order", tags=["writes"])
async def t_place_combo_order(
body: PlaceComboOrderReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True)
lev = _enforce_leverage(body.leverage, creds=creds, exchange="deribit")
if lev != cap_default:
for leg in body.legs:
with contextlib.suppress(Exception):
await client.set_leverage(leg.instrument_name, lev)
result = await client.place_combo_order(
legs=[leg.model_dump() for leg in body.legs],
side=body.side,
amount=body.amount,
type=body.type,
price=body.price,
label=body.label,
)
audit_write_op(
principal=principal, action="place_combo_order", exchange="deribit",
target=result.get("combo_instrument") if isinstance(result, dict) else None,
payload={"legs": [leg.model_dump() for leg in body.legs],
"side": body.side, "amount": body.amount, "leverage": lev},
result=result if isinstance(result, dict) else None,
)
return result
@app.post("/tools/cancel_order", tags=["writes"])
async def t_cancel_order(
body: CancelOrderReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True)
result = await client.cancel_order(body.order_id)
audit_write_op(
principal=principal, action="cancel_order", exchange="deribit",
target=body.order_id, payload={}, result=result,
)
return result
@app.post("/tools/set_stop_loss", tags=["writes"])
async def t_set_sl(
body: SetStopLossReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True)
result = await client.set_stop_loss(body.order_id, body.stop_price)
audit_write_op(
principal=principal, action="set_stop_loss", exchange="deribit",
target=body.order_id, payload={"stop_price": body.stop_price}, result=result,
)
return result
@app.post("/tools/set_take_profit", tags=["writes"])
async def t_set_tp(
body: SetTakeProfitReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True)
result = await client.set_take_profit(body.order_id, body.tp_price)
audit_write_op(
principal=principal, action="set_take_profit", exchange="deribit",
target=body.order_id, payload={"tp_price": body.tp_price}, result=result,
)
return result
@app.post("/tools/close_position", tags=["writes"])
async def t_close_position(
body: ClosePositionReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True)
result = await client.close_position(body.instrument_name)
audit_write_op(
principal=principal, action="close_position", exchange="deribit",
target=body.instrument_name, payload={}, result=result,
)
return result
# ───── MCP endpoint (/mcp): bridge to /tools/* ─────
port = int(os.environ.get("PORT", "9011"))
@@ -538,10 +656,12 @@ def create_app(*, client: DeribitClient, token_store: TokenStore) -> FastAPI:
internal_base_url=f"http://localhost:{port}",
tools=[
{"name": "is_testnet", "description": "True se client Deribit è in modalità testnet."},
{"name": "environment_info", "description": "Ambiente operativo (testnet/mainnet), source, base_url, max_leverage cap."},
{"name": "get_ticker", "description": "Ticker di un instrument Deribit."},
{"name": "get_ticker_batch", "description": "Ticker per N instruments in parallelo (max 20)."},
{"name": "get_instruments", "description": "Lista instruments per currency."},
{"name": "get_orderbook", "description": "Orderbook L1/L2 per instrument."},
{"name": "get_orderbook_imbalance", "description": "Microstructure: imbalance ratio + microprice + slope."},
{"name": "get_positions", "description": "Posizioni aperte."},
{"name": "get_account_summary", "description": "Summary account (equity, balance)."},
{"name": "get_trade_history", "description": "Storia trade recenti."},
@@ -556,9 +676,15 @@ def create_app(*, client: DeribitClient, token_store: TokenStore) -> FastAPI:
{"name": "get_skew_25d", "description": "Skew 25-delta put/call IV + risk reversal + butterfly per expiry."},
{"name": "get_pc_ratio", "description": "Put/Call ratio aggregato su OI e volume 24h."},
{"name": "get_gex", "description": "Gamma exposure per strike + zero gamma level (top N strikes per OI)."},
{"name": "get_dealer_gamma_profile", "description": "Net dealer gamma per strike (short calls/long puts) + gamma flip level."},
{"name": "get_vanna_charm", "description": "Vanna (∂delta/∂IV) e Charm (∂delta/∂t) aggregati pesati OI."},
{"name": "get_oi_weighted_skew", "description": "Skew aggregato pesato per OI: IV puts - IV calls. Positivo = paura."},
{"name": "get_smile_asymmetry", "description": "Asymmetry IV otm-puts vs otm-calls + ATM IV reference."},
{"name": "get_atm_vs_wings_vol", "description": "IV ATM vs IV ali 25-delta. wing_richness > 0 = smile/kurtosis."},
{"name": "get_technical_indicators", "description": "Indicatori tecnici (RSI, MACD, ATR, ADX)."},
{"name": "get_realized_vol", "description": "Volatilità realizzata annualizzata (log-return std) BTC/ETH + spread IVRV."},
{"name": "place_order", "description": "Invia ordine (CORE only, testnet)."},
{"name": "place_combo_order", "description": "Crea combo via private/create_combo + piazza ordine sul combo (1 cross spread invece di N)."},
{"name": "cancel_order", "description": "Cancella ordine."},
{"name": "set_stop_loss", "description": "Setta stop loss su posizione."},
{"name": "set_take_profit", "description": "Setta take profit su posizione."},
@@ -221,6 +221,67 @@ async def test_get_dvol(httpx_mock: HTTPXMock, client: DeribitClient):
assert result["candles"][0]["close"] == 57.0
@pytest.mark.asyncio
async def test_place_combo_order(httpx_mock: HTTPXMock, client: DeribitClient):
httpx_mock.add_response(
url=re.compile(r"https://test\.deribit\.com/api/v2/public/auth"),
json=AUTH_RESP,
)
httpx_mock.add_response(
url=re.compile(r"https://test\.deribit\.com/api/v2/private/create_combo"),
json={
"result": {
"id": "BTC-COMBO-1",
"instrument_name": "BTC-COMBO-1",
"state": "active",
}
},
)
httpx_mock.add_response(
url=re.compile(r"https://test\.deribit\.com/api/v2/private/buy"),
json={
"result": {
"order": {"order_id": "ord-1", "order_state": "open"},
"trades": [],
}
},
)
legs = [
{"instrument_name": "BTC-30APR26-75000-C", "direction": "buy", "ratio": 1},
{"instrument_name": "BTC-30APR26-80000-C", "direction": "sell", "ratio": 1},
]
result = await client.place_combo_order(
legs=legs,
side="buy",
amount=1,
type="limit",
price=0.05,
label="vert-spread",
)
assert result["combo_instrument"] == "BTC-COMBO-1"
assert result["order"]["order_id"] == "ord-1"
@pytest.mark.asyncio
async def test_place_combo_order_create_combo_error(httpx_mock: HTTPXMock, client: DeribitClient):
httpx_mock.add_response(
url=re.compile(r"https://test\.deribit\.com/api/v2/public/auth"),
json=AUTH_RESP,
)
httpx_mock.add_response(
url=re.compile(r"https://test\.deribit\.com/api/v2/private/create_combo"),
json={"error": {"code": -32602, "message": "Invalid leg"}},
)
result = await client.place_combo_order(
legs=[{"instrument_name": "BTC-X", "direction": "buy", "ratio": 1}],
side="buy",
amount=1,
type="market",
)
assert result["state"] == "error"
assert "Invalid leg" in str(result.get("error"))
@pytest.mark.asyncio
async def test_cancel_order(httpx_mock: HTTPXMock, client: DeribitClient):
httpx_mock.add_response(
@@ -0,0 +1,77 @@
from __future__ import annotations
from unittest.mock import AsyncMock
from fastapi.testclient import TestClient
from mcp_common.auth import Principal, TokenStore
from mcp_common.environment import EnvironmentInfo
from mcp_deribit.server import create_app
def _make_app(env_info, creds):
c = AsyncMock()
c.set_leverage = AsyncMock(return_value={"state": "ok"})
store = TokenStore(tokens={
"ct": Principal("core", {"core"}),
"ot": Principal("observer", {"observer"}),
})
return create_app(client=c, token_store=store, creds=creds, env_info=env_info)
def test_environment_info_full_shape():
env = EnvironmentInfo(
exchange="deribit",
environment="testnet",
source="env",
env_value="true",
base_url="https://test.deribit.com/api/v2",
)
app = _make_app(env, creds={"max_leverage": 3})
c = TestClient(app)
r = c.post(
"/tools/environment_info",
headers={"Authorization": "Bearer ot"},
)
assert r.status_code == 200
body = r.json()
assert body["exchange"] == "deribit"
assert body["environment"] == "testnet"
assert body["source"] == "env"
assert body["env_value"] == "true"
assert body["base_url"] == "https://test.deribit.com/api/v2"
assert body["max_leverage"] == 3
def test_environment_info_default_source():
env = EnvironmentInfo(
exchange="deribit",
environment="testnet",
source="default",
env_value=None,
base_url="https://test.deribit.com/api/v2",
)
app = _make_app(env, creds={"max_leverage": 1})
c = TestClient(app)
r = c.post(
"/tools/environment_info",
headers={"Authorization": "Bearer ct"},
)
assert r.status_code == 200
body = r.json()
assert body["source"] == "default"
assert body["env_value"] is None
assert body["max_leverage"] == 1
def test_environment_info_requires_auth():
env = EnvironmentInfo(
exchange="deribit",
environment="testnet",
source="default",
env_value=None,
base_url="https://test.deribit.com/api/v2",
)
app = _make_app(env, creds={"max_leverage": 3})
c = TestClient(app)
r = c.post("/tools/environment_info")
assert r.status_code == 401
@@ -2,7 +2,6 @@ from __future__ import annotations
import pytest
from fastapi import HTTPException
from mcp_deribit.leverage_cap import enforce_leverage, get_max_leverage
@@ -4,8 +4,8 @@ from unittest.mock import AsyncMock, MagicMock
import pytest
from fastapi.testclient import TestClient
from mcp_common.auth import Principal, TokenStore
from mcp_deribit.server import create_app
@pytest.fixture
@@ -20,6 +20,13 @@ def mock_client():
c.get_historical = AsyncMock(return_value={"candles": []})
c.get_technical_indicators = AsyncMock(return_value={"rsi": 55.0})
c.place_order = AsyncMock(return_value={"order_id": "x"})
c.place_combo_order = AsyncMock(return_value={"combo_instrument": "BTC-COMBO-1", "order": {"order_id": "x"}})
c.get_dealer_gamma_profile = AsyncMock(return_value={"by_strike": [], "total_net_dealer_gamma": 0})
c.get_vanna_charm = AsyncMock(return_value={"total_vanna": 0, "total_charm": 0, "legs_analyzed": 0})
c.get_oi_weighted_skew = AsyncMock(return_value={"skew": 0, "call_iv_weighted": None, "put_iv_weighted": None})
c.get_smile_asymmetry = AsyncMock(return_value={"atm_iv": 0.5, "asymmetry": 0.0})
c.get_atm_vs_wings_vol = AsyncMock(return_value={"atm_iv": 0.5, "wing_richness": 0.0})
c.get_orderbook_imbalance = AsyncMock(return_value={"imbalance_ratio": 0.0, "microprice": 50000})
c.cancel_order = AsyncMock(return_value={"order_id": "x", "state": "cancelled"})
c.set_stop_loss = AsyncMock(return_value={"order_id": "x", "stop_price": 45000})
c.set_take_profit = AsyncMock(return_value={"order_id": "x", "tp_price": 55000})
@@ -34,7 +41,7 @@ def http(mock_client):
"ct": Principal("core", {"core"}),
"ot": Principal("observer", {"observer"}),
})
app = create_app(client=mock_client, token_store=store)
app = create_app(client=mock_client, token_store=store, creds={"max_leverage": 3})
return TestClient(app)
@@ -94,24 +101,108 @@ def test_place_order_observer_forbidden(http):
assert r.status_code == 403
def test_get_orderbook_imbalance_observer_ok(http):
r = http.post(
"/tools/get_orderbook_imbalance",
headers={"Authorization": "Bearer ot"},
json={"instrument_name": "BTC-PERPETUAL"},
)
assert r.status_code == 200
@pytest.mark.parametrize("path", [
"/tools/get_dealer_gamma_profile",
"/tools/get_vanna_charm",
"/tools/get_oi_weighted_skew",
"/tools/get_smile_asymmetry",
"/tools/get_atm_vs_wings_vol",
])
def test_option_flow_indicators_observer_ok(http, path):
r = http.post(path, headers={"Authorization": "Bearer ot"}, json={"currency": "BTC"})
assert r.status_code == 200, (path, r.text)
@pytest.mark.parametrize("path", [
"/tools/get_dealer_gamma_profile",
"/tools/get_vanna_charm",
"/tools/get_oi_weighted_skew",
"/tools/get_smile_asymmetry",
"/tools/get_atm_vs_wings_vol",
])
def test_option_flow_indicators_no_auth_401(http, path):
r = http.post(path, json={"currency": "BTC"})
assert r.status_code == 401, (path, r.text)
def test_place_combo_order_core_ok(http):
r = http.post(
"/tools/place_combo_order",
headers={"Authorization": "Bearer ct"},
json={
"instrument_name": "ETH-PERPETUAL",
"legs": [
{"instrument_name": "BTC-30APR26-75000-C", "direction": "buy", "ratio": 1},
{"instrument_name": "BTC-30APR26-80000-C", "direction": "sell", "ratio": 1},
],
"side": "buy",
"amount": 335, # USD — cap 200
"amount": 1,
"type": "limit",
"price": 0.05,
},
)
assert r.status_code == 200
assert r.json()["combo_instrument"] == "BTC-COMBO-1"
def test_place_combo_order_observer_forbidden(http):
r = http.post(
"/tools/place_combo_order",
headers={"Authorization": "Bearer ot"},
json={
"legs": [
{"instrument_name": "BTC-X", "direction": "buy", "ratio": 1},
{"instrument_name": "BTC-Y", "direction": "sell", "ratio": 1},
],
"side": "buy",
"amount": 1,
},
)
assert r.status_code == 403
body = r.json()
assert body["error"]["code"] == "HARD_PROHIBITION"
def test_place_combo_order_min_legs(http):
r = http.post(
"/tools/place_combo_order",
headers={"Authorization": "Bearer ct"},
json={
"legs": [{"instrument_name": "BTC-X", "direction": "buy", "ratio": 1}],
"side": "buy",
"amount": 1,
},
)
assert r.status_code == 422
def test_place_combo_order_leverage_cap_enforced(http):
r = http.post(
"/tools/place_combo_order",
headers={"Authorization": "Bearer ct"},
json={
"legs": [
{"instrument_name": "BTC-X", "direction": "buy", "ratio": 1},
{"instrument_name": "BTC-Y", "direction": "sell", "ratio": 1},
],
"side": "buy",
"amount": 1,
"leverage": 50,
},
)
assert r.status_code == 403
err = r.json()["error"]
assert err["code"] == "LEVERAGE_CAP_EXCEEDED"
def test_place_order_leverage_cap_enforced(http):
"""CER-016: reject leverage > 3x."""
"""Reject leverage > max_leverage (da secret, default 3)."""
r = http.post(
"/tools/place_order",
headers={"Authorization": "Bearer ct"},
@@ -124,22 +215,12 @@ def test_place_order_leverage_cap_enforced(http):
)
assert r.status_code == 403
body = r.json()
assert body["error"]["code"] == "HARD_PROHIBITION"
def test_place_order_reduce_only_skips_cap(http):
"""CER-016: reduce_only orders bypassano cap notional (è close)."""
r = http.post(
"/tools/place_order",
headers={"Authorization": "Bearer ct"},
json={
"instrument_name": "ETH-PERPETUAL",
"side": "sell",
"amount": 10000,
"reduce_only": True,
},
)
assert r.status_code == 200
err = body["error"]
assert err["code"] == "LEVERAGE_CAP_EXCEEDED"
details = err["details"]
assert details["exchange"] == "deribit"
assert details["requested"] == 50
assert details["max"] == 3
def test_close_position_core_ok(http):
@@ -1,42 +1,30 @@
from __future__ import annotations
from mcp_common.app_factory import ExchangeAppSpec, run_exchange_main
from mcp_hyperliquid.client import HyperliquidClient
from mcp_hyperliquid.server import create_app
SPEC = ExchangeAppSpec(
exchange="hyperliquid",
creds_env_var="HYPERLIQUID_WALLET_FILE",
env_var="HYPERLIQUID_TESTNET",
flag_key="testnet",
default_base_url_live="https://api.hyperliquid.xyz",
default_base_url_testnet="https://api.hyperliquid-testnet.xyz",
default_port=9012,
build_client=lambda creds, env_info: HyperliquidClient(
wallet_address=creds["wallet_address"],
private_key=creds["private_key"],
testnet=(env_info.environment == "testnet"),
api_wallet_address=creds.get("api_wallet_address"),
),
build_app=create_app,
)
def main():
run_exchange_main(SPEC)
if __name__ == "__main__":
@@ -6,8 +6,8 @@ import asyncio
import datetime as _dt
from typing import Any
from mcp_common import indicators as ind
from mcp_common.http import async_client
BASE_LIVE = "https://api.hyperliquid.xyz"
BASE_TESTNET = "https://api.hyperliquid-testnet.xyz"
@@ -86,7 +86,7 @@ class HyperliquidClient:
async def _post(self, payload: dict[str, Any]) -> Any:
"""POST JSON to the /info endpoint."""
async with async_client(timeout=15.0) as http:
resp = await http.post(f"{self.base_url}/info", json=payload)
resp.raise_for_status()
return resp.json()
@@ -295,7 +295,7 @@ class HyperliquidClient:
spot_price: float | None = None
spot_source = "coinbase"
try:
async with async_client(timeout=8.0) as c:
resp = await c.get(f"https://api.coinbase.com/v2/prices/{asset}-USD/spot")
if resp.status_code == 200:
spot_price = float(resp.json().get("data", {}).get("amount"))
@@ -303,7 +303,7 @@ class HyperliquidClient:
spot_price = None
if spot_price is None:
try:
async with async_client(timeout=8.0) as c:
resp = await c.get(
"https://api.kraken.com/0/public/Ticker", params={"pair": f"{asset}USD"}
)
@@ -0,0 +1,56 @@
"""Leverage cap server-side per place_order.
Cap letto dal secret JSON via campo `max_leverage`. Default 1 (cash) se assente.
"""
from __future__ import annotations
from fastapi import HTTPException
def get_max_leverage(creds: dict) -> int:
"""Legge max_leverage dal secret. Default 1 se mancante."""
raw = creds.get("max_leverage", 1)
try:
value = int(raw)
except (TypeError, ValueError):
value = 1
return max(1, value)
def enforce_leverage(
requested: int | float | None,
*,
creds: dict,
exchange: str,
) -> int:
"""Verifica e applica leverage cap. Ritorna leverage applicabile.
Solleva HTTPException(403, LEVERAGE_CAP_EXCEEDED) se requested > cap.
Se requested is None, applica il cap come default.
"""
cap = get_max_leverage(creds)
if requested is None:
return cap
lev = int(requested)
if lev < 1:
raise HTTPException(
status_code=403,
detail={
"error": "LEVERAGE_CAP_EXCEEDED",
"exchange": exchange,
"requested": lev,
"max": cap,
"reason": "leverage must be >= 1",
},
)
if lev > cap:
raise HTTPException(
status_code=403,
detail={
"error": "LEVERAGE_CAP_EXCEEDED",
"exchange": exchange,
"requested": lev,
"max": cap,
},
)
return lev
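The cap semantics can be checked in isolation; here `LeverageError` is a plain stand-in for the `HTTPException(403, LEVERAGE_CAP_EXCEEDED)` raised above, and the creds dict is illustrative:

```python
class LeverageError(Exception):
    pass

def get_max_leverage(creds: dict) -> int:
    # Mirrors the module above: default 1 (cash) when absent or malformed.
    try:
        value = int(creds.get("max_leverage", 1))
    except (TypeError, ValueError):
        value = 1
    return max(1, value)

def enforce_leverage(requested, *, creds: dict) -> int:
    cap = get_max_leverage(creds)
    if requested is None:
        return cap  # None -> apply the cap as the default
    lev = int(requested)
    if lev < 1 or lev > cap:
        raise LeverageError(f"requested={lev} max={cap}")
    return lev

creds = {"max_leverage": 3}
default_lev = enforce_leverage(None, creds=creds)  # cap applied as default
capped_lev = enforce_leverage(2, creds=creds)      # below cap, passes through
```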
@@ -3,17 +3,16 @@ from __future__ import annotations
import os
from fastapi import Depends, FastAPI, HTTPException
from mcp_common.audit import audit_write_op
from mcp_common.auth import Principal, TokenStore, require_principal
from mcp_common.environment import EnvironmentInfo
from mcp_common.mcp_bridge import mount_mcp_endpoint
from mcp_common.server import build_app
from pydantic import BaseModel, field_validator, model_validator
from mcp_hyperliquid.client import HyperliquidClient
from mcp_hyperliquid.leverage_cap import enforce_leverage as _enforce_leverage
from mcp_hyperliquid.leverage_cap import get_max_leverage
# --- Body models ---
@@ -167,35 +166,6 @@ class ClosePositionReq(BaseModel):
instrument: str
# --- CER-016 notional helpers ---
async def _compute_notional_hl(client: HyperliquidClient, body: PlaceOrderReq) -> float:
"""HL perp: amount è in base asset → notional = amount * price."""
ref_price: float | None = body.price
if ref_price is None:
try:
tk = await client.get_ticker(body.instrument)
ref_price = tk.get("mark_price") or tk.get("last_price") or tk.get("price")
except Exception:
ref_price = None
if not ref_price:
return 0.0
return float(body.amount) * float(ref_price)
async def _current_aggregate_hl(client: HyperliquidClient) -> float:
try:
positions = await client.get_positions()
except Exception:
return 0.0
total = 0.0
for p in positions or []:
size = abs(float(p.get("size") or p.get("amount") or 0))
mark = float(p.get("mark_price") or p.get("price") or 0)
total += size * mark
return total
# --- ACL helper ---
def _check(principal: Principal, *, core: bool = False, observer: bool = False) -> None:
@@ -210,11 +180,39 @@ def _check(principal: Principal, *, core: bool = False, observer: bool = False)
# --- App factory ---
def create_app(*, client: HyperliquidClient, token_store: TokenStore) -> FastAPI:
def create_app(
*,
client: HyperliquidClient,
token_store: TokenStore,
creds: dict | None = None,
env_info: EnvironmentInfo | None = None,
) -> FastAPI:
creds = creds or {}
app = build_app(name="mcp-hyperliquid", version="0.1.0", token_store=token_store)
# --- Read tools: core + observer ---
@app.post("/tools/environment_info", tags=["reads"])
async def t_environment_info(principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
if env_info is None:
return {
"exchange": "hyperliquid",
"environment": "testnet" if getattr(client, "testnet", True) else "mainnet",
"source": "credentials",
"env_value": None,
"base_url": getattr(client, "base_url", None),
"max_leverage": get_max_leverage(creds),
}
return {
"exchange": env_info.exchange,
"environment": env_info.environment,
"source": env_info.source,
"env_value": env_info.env_value,
"base_url": env_info.base_url,
"max_leverage": get_max_leverage(creds),
}
@app.post("/tools/get_markets", tags=["reads"])
async def t_get_markets(
body: GetMarketsReq, principal: Principal = Depends(require_principal)
@@ -307,15 +305,8 @@ def create_app(*, client: HyperliquidClient, token_store: TokenStore) -> FastAPI
body: PlaceOrderReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True)
enforce_leverage(body.leverage)
if not body.reduce_only:
notional = await _compute_notional_hl(client, body)
enforce_single_notional(
notional, exchange="hyperliquid", instrument=body.instrument
)
agg = await _current_aggregate_hl(client)
enforce_aggregate(agg, notional)
return await client.place_order(
_enforce_leverage(body.leverage, creds=creds, exchange="hyperliquid")
result = await client.place_order(
instrument=body.instrument,
side=body.side,
amount=body.amount,
@@ -323,34 +314,67 @@ def create_app(*, client: HyperliquidClient, token_store: TokenStore) -> FastAPI
price=body.price,
reduce_only=body.reduce_only,
)
audit_write_op(
principal=principal, action="place_order", exchange="hyperliquid",
target=body.instrument,
payload={"side": body.side, "amount": body.amount, "type": body.type,
"price": body.price, "reduce_only": body.reduce_only,
"leverage": body.leverage},
result=result,
)
return result
@app.post("/tools/cancel_order", tags=["writes"])
async def t_cancel_order(
body: CancelOrderReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True)
return await client.cancel_order(body.order_id, body.instrument)
result = await client.cancel_order(body.order_id, body.instrument)
audit_write_op(
principal=principal, action="cancel_order", exchange="hyperliquid",
target=body.order_id, payload={"instrument": body.instrument}, result=result,
)
return result
@app.post("/tools/set_stop_loss", tags=["writes"])
async def t_set_sl(
body: SetStopLossReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True)
return await client.set_stop_loss(body.instrument, body.stop_price, body.size)
result = await client.set_stop_loss(body.instrument, body.stop_price, body.size)
audit_write_op(
principal=principal, action="set_stop_loss", exchange="hyperliquid",
target=body.instrument,
payload={"stop_price": body.stop_price, "size": body.size},
result=result,
)
return result
@app.post("/tools/set_take_profit", tags=["writes"])
async def t_set_tp(
body: SetTakeProfitReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True)
return await client.set_take_profit(body.instrument, body.tp_price, body.size)
result = await client.set_take_profit(body.instrument, body.tp_price, body.size)
audit_write_op(
principal=principal, action="set_take_profit", exchange="hyperliquid",
target=body.instrument,
payload={"tp_price": body.tp_price, "size": body.size},
result=result,
)
return result
@app.post("/tools/close_position", tags=["writes"])
async def t_close_position(
body: ClosePositionReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True)
return await client.close_position(body.instrument)
result = await client.close_position(body.instrument)
audit_write_op(
principal=principal, action="close_position", exchange="hyperliquid",
target=body.instrument, payload={}, result=result,
)
return result
# ───── MCP endpoint (/mcp) — bridge to /tools/* ─────
port = int(os.environ.get("PORT", "9012"))
@@ -361,6 +385,7 @@ def create_app(*, client: HyperliquidClient, token_store: TokenStore) -> FastAPI
token_store=token_store,
internal_base_url=f"http://localhost:{port}",
tools=[
{"name": "environment_info", "description": "Ambiente operativo (testnet/mainnet), source, base_url, max_leverage cap."},
{"name": "get_markets", "description": "Lista mercati perp disponibili."},
{"name": "get_ticker", "description": "Ticker di un perp."},
{"name": "get_orderbook", "description": "Orderbook L2."},
@@ -0,0 +1,50 @@
from __future__ import annotations
from unittest.mock import MagicMock
from fastapi.testclient import TestClient
from mcp_common.auth import Principal, TokenStore
from mcp_common.environment import EnvironmentInfo
from mcp_hyperliquid.server import create_app
def _make_app(env_info, creds):
c = MagicMock()
c.testnet = True
store = TokenStore(tokens={
"ct": Principal("core", {"core"}),
"ot": Principal("observer", {"observer"}),
})
return create_app(client=c, token_store=store, creds=creds, env_info=env_info)
def test_environment_info_full_shape():
env = EnvironmentInfo(
exchange="hyperliquid",
environment="testnet",
source="env",
env_value="true",
base_url="https://api.hyperliquid-testnet.xyz",
)
app = _make_app(env, creds={"max_leverage": 3})
c = TestClient(app)
r = c.post("/tools/environment_info", headers={"Authorization": "Bearer ot"})
assert r.status_code == 200
body = r.json()
assert body["exchange"] == "hyperliquid"
assert body["environment"] == "testnet"
assert body["source"] == "env"
assert body["env_value"] == "true"
assert body["base_url"] == "https://api.hyperliquid-testnet.xyz"
assert body["max_leverage"] == 3
def test_environment_info_requires_auth():
env = EnvironmentInfo(
exchange="hyperliquid", environment="testnet", source="default",
env_value=None, base_url="https://api.hyperliquid-testnet.xyz",
)
app = _make_app(env, creds={"max_leverage": 3})
c = TestClient(app)
r = c.post("/tools/environment_info")
assert r.status_code == 401
@@ -0,0 +1,50 @@
from __future__ import annotations
import pytest
from fastapi import HTTPException
from mcp_hyperliquid.leverage_cap import enforce_leverage, get_max_leverage
def test_get_max_leverage_returns_creds_value():
creds = {"max_leverage": 5}
assert get_max_leverage(creds) == 5
def test_get_max_leverage_default_when_missing():
"""Default 1 (cash) se il secret non ha max_leverage."""
assert get_max_leverage({}) == 1
def test_enforce_leverage_pass_under_cap():
creds = {"max_leverage": 3}
enforce_leverage(2, creds=creds, exchange="hyperliquid") # no raise
def test_enforce_leverage_pass_at_cap():
creds = {"max_leverage": 3}
enforce_leverage(3, creds=creds, exchange="hyperliquid") # no raise
def test_enforce_leverage_reject_over_cap():
creds = {"max_leverage": 3}
with pytest.raises(HTTPException) as exc:
enforce_leverage(10, creds=creds, exchange="hyperliquid")
assert exc.value.status_code == 403
assert exc.value.detail["error"] == "LEVERAGE_CAP_EXCEEDED"
assert exc.value.detail["exchange"] == "hyperliquid"
assert exc.value.detail["requested"] == 10
assert exc.value.detail["max"] == 3
def test_enforce_leverage_reject_when_below_one():
creds = {"max_leverage": 3}
with pytest.raises(HTTPException) as exc:
enforce_leverage(0, creds=creds, exchange="hyperliquid")
assert exc.value.status_code == 403
def test_enforce_leverage_default_when_none():
"""Se requested è None, applica il cap come default."""
creds = {"max_leverage": 3}
result = enforce_leverage(None, creds=creds, exchange="hyperliquid")
assert result == 3
@@ -4,8 +4,8 @@ from unittest.mock import AsyncMock, MagicMock
import pytest
from fastapi.testclient import TestClient
from mcp_hyperliquid.server import create_app
from mcp_common.auth import Principal, TokenStore
from mcp_hyperliquid.server import create_app
@pytest.fixture
@@ -37,7 +37,7 @@ def http(mock_client):
"ot": Principal("observer", {"observer"}),
}
)
app = create_app(client=mock_client, token_store=store)
app = create_app(client=mock_client, token_store=store, creds={"max_leverage": 3})
return TestClient(app)
@@ -128,18 +128,8 @@ def test_place_order_observer_forbidden(http):
assert r.status_code == 403
def test_place_order_notional_cap_enforced(http):
"""CER-016: HL reject amount*price > 200."""
r = http.post(
"/tools/place_order",
headers={"Authorization": "Bearer ct"},
json={"instrument": "ETH", "side": "buy", "amount": 0.1, "price": 3350},
)
assert r.status_code == 403
assert r.json()["error"]["code"] == "HARD_PROHIBITION"
def test_place_order_leverage_cap_enforced_hl(http):
"""Reject leverage > max_leverage (da secret, default 3)."""
r = http.post(
"/tools/place_order",
headers={"Authorization": "Bearer ct"},
@@ -152,6 +142,10 @@ def test_place_order_leverage_cap_enforced_hl(http):
},
)
assert r.status_code == 403
body = r.json()
err = body["error"]
assert err["code"] == "LEVERAGE_CAP_EXCEEDED"
assert err["details"]["exchange"] == "hyperliquid"
def test_cancel_order_core_ok(http):
@@ -5,12 +5,10 @@ import os
import uvicorn
from mcp_common.auth import load_token_store_from_files
from mcp_common.logging import configure_root_logging
from mcp_macro.server import create_app
configure_root_logging() # CER-P5-009
def main():
@@ -0,0 +1,91 @@
"""Pure-logic helpers per COT report parsing e analytics.
Niente HTTP qui orchestrazione fetch sta in fetchers.py. Tutto testabile
in isolamento.
"""
from __future__ import annotations
from typing import Literal
ExtremeSignal = Literal["extreme_short", "extreme_long", "neutral"]
def compute_percentile(value: float, history: list[float]) -> float | None:
"""Percentile di `value` rispetto ad `history` (0-100, inclusive).
Restituisce None se history vuoto. Clipped a [0, 100] se value fuori range.
"""
if not history:
return None
n = len(history)
below_or_eq = sum(1 for h in history if h <= value)
pct = 100.0 * below_or_eq / n
return max(0.0, min(100.0, pct))
def classify_extreme(percentile: float | None, threshold: float = 5.0) -> ExtremeSignal:
"""Classifica un percentile come estremo short/long o neutral.
threshold default 5 flagga 5 come short, 100-5=95 come long.
"""
if percentile is None:
return "neutral"
if percentile <= threshold:
return "extreme_short"
if percentile >= 100.0 - threshold:
return "extreme_long"
return "neutral"
def _to_int(v) -> int:
try:
return int(float(v))
except (TypeError, ValueError):
return 0
def _date_only(s: str) -> str:
"""Estrae 'YYYY-MM-DD' da una data ISO con o senza timestamp."""
if not s:
return ""
return s.split("T", 1)[0]
def parse_tff_row(raw: dict) -> dict:
"""Mappa una row Socrata TFF al formato API output."""
dl = _to_int(raw.get("dealer_positions_long_all"))
ds = _to_int(raw.get("dealer_positions_short_all"))
al = _to_int(raw.get("asset_mgr_positions_long"))
as_ = _to_int(raw.get("asset_mgr_positions_short"))
ll = _to_int(raw.get("lev_money_positions_long"))
ls = _to_int(raw.get("lev_money_positions_short"))
ol = _to_int(raw.get("other_rept_positions_long"))
os_ = _to_int(raw.get("other_rept_positions_short"))
return {
"report_date": _date_only(raw.get("report_date_as_yyyy_mm_dd", "")),
"dealer_long": dl, "dealer_short": ds, "dealer_net": dl - ds,
"asset_mgr_long": al, "asset_mgr_short": as_, "asset_mgr_net": al - as_,
"lev_funds_long": ll, "lev_funds_short": ls, "lev_funds_net": ll - ls,
"other_long": ol, "other_short": os_, "other_net": ol - os_,
"open_interest": _to_int(raw.get("open_interest_all")),
}
def parse_disagg_row(raw: dict) -> dict:
"""Mappa una row Socrata Disaggregated F&O combined al formato API output."""
pl = _to_int(raw.get("prod_merc_positions_long_all"))
ps = _to_int(raw.get("prod_merc_positions_short_all"))
sl = _to_int(raw.get("swap_positions_long_all"))
ss = _to_int(raw.get("swap_positions_short_all"))
ml = _to_int(raw.get("m_money_positions_long_all"))
ms = _to_int(raw.get("m_money_positions_short_all"))
ol = _to_int(raw.get("other_rept_positions_long_all"))
os_ = _to_int(raw.get("other_rept_positions_short_all"))
return {
"report_date": _date_only(raw.get("report_date_as_yyyy_mm_dd", "")),
"producer_long": pl, "producer_short": ps, "producer_net": pl - ps,
"swap_long": sl, "swap_short": ss, "swap_net": sl - ss,
"managed_money_long": ml, "managed_money_short": ms, "managed_money_net": ml - ms,
"other_long": ol, "other_short": os_, "other_net": ol - os_,
"open_interest": _to_int(raw.get("open_interest_all")),
}
@@ -0,0 +1,36 @@
"""Costanti CFTC: ticker → contract_market_code per TFF e Disaggregated.
I codici CFTC (`cftc_contract_market_code`) sono pubblici e stabili nel tempo.
Riferimento: https://www.cftc.gov/MarketReports/CommitmentsofTraders/
"""
from __future__ import annotations
CFTC_BASE_URL = "https://publicreporting.cftc.gov/resource"
TFF_DATASET_ID = "gpe5-46if"
DISAGG_DATASET_ID = "72hh-3qpy"
# TFF: equity/financial. Mapping ticker → cftc_contract_market_code.
SYMBOL_TO_CFTC_CODE_TFF: dict[str, str] = {
"ES": "13874A", # E-mini S&P 500
"NQ": "209742", # E-mini Nasdaq-100
"RTY": "239742", # E-mini Russell 2000
"ZN": "043602", # 10-Year T-Note
"ZB": "020601", # 30-Year T-Bond
"6E": "099741", # Euro FX
"6J": "097741", # Japanese Yen
"DX": "098662", # US Dollar Index
}
# Disaggregated: commodities.
SYMBOL_TO_CFTC_CODE_DISAGG: dict[str, str] = {
"CL": "067651", # Crude Oil WTI
"GC": "088691", # Gold
"SI": "084691", # Silver
"HG": "085692", # Copper
"ZW": "001602", # Wheat
"ZC": "002602", # Corn
"ZS": "005602", # Soybeans
}
ALL_TFF_SYMBOLS: list[str] = list(SYMBOL_TO_CFTC_CODE_TFF.keys())
ALL_DISAGG_SYMBOLS: list[str] = list(SYMBOL_TO_CFTC_CODE_DISAGG.keys())
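Given these constants, the Socrata query that the fetchers issue can be previewed as a plain URL (`urlencode` takes care of the `$`-prefixed SoQL parameters):

```python
from urllib.parse import urlencode

CFTC_BASE_URL = "https://publicreporting.cftc.gov/resource"
TFF_DATASET_ID = "gpe5-46if"

code = "13874A"  # ES (E-mini S&P 500)
params = {
    "cftc_contract_market_code": code,
    "$order": "report_date_as_yyyy_mm_dd DESC",
    "$limit": "52",
}
# Same URL shape as fetch_cot_tff builds before the GET.
url = f"{CFTC_BASE_URL}/{TFF_DATASET_ID}.json?{urlencode(params)}"
print(url)
```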
@@ -4,6 +4,18 @@ from datetime import UTC, datetime, timedelta
from typing import Any
import httpx
from mcp_common.http import async_client
from mcp_macro.cot import classify_extreme, compute_percentile, parse_disagg_row, parse_tff_row
from mcp_macro.cot_contracts import (
ALL_DISAGG_SYMBOLS,
ALL_TFF_SYMBOLS,
CFTC_BASE_URL,
DISAGG_DATASET_ID,
SYMBOL_TO_CFTC_CODE_DISAGG,
SYMBOL_TO_CFTC_CODE_TFF,
TFF_DATASET_ID,
)
FRED_BASE = "https://api.stlouisfed.org/fred/series/observations"
FINNHUB_CALENDAR = "https://finnhub.io/api/v1/calendar/economic"
@@ -79,7 +91,7 @@ async def fetch_asset_price(ticker: str) -> dict[str, Any]:
return {"ticker": ticker, "error": f"unknown ticker {ticker}"}
symbol, name = mapping
async with httpx.AsyncClient(timeout=10) as client:
async with async_client(timeout=10.0) as client:
info = await _fetch_yahoo_meta(client, symbol, "10d")
meta = info.get("meta") or {}
closes = info.get("closes") or []
@@ -129,7 +141,7 @@ async def fetch_treasury_yields() -> dict[str, Any]:
("us30y", "^TYX"),
]
yields: dict[str, float | None] = {}
async with httpx.AsyncClient(timeout=10) as client:
async with async_client(timeout=10.0) as client:
for key, sym in symbols:
info = await _fetch_yahoo_meta(client, sym, "5d")
meta = info.get("meta") or {}
@@ -159,6 +171,100 @@ async def fetch_treasury_yields() -> dict[str, Any]:
return out
def yield_curve_metrics(yields: dict[str, float | None]) -> dict[str, Any]:
"""Slope + convexity da curva yields (us2y, us5y, us10y, us30y).
Convexity (butterfly): 2*us10y - us2y - us30y. >0 = curva concava.
"""
y2 = yields.get("us2y")
y5 = yields.get("us5y")
y10 = yields.get("us10y")
y30 = yields.get("us30y")
slope_2y10y = (y10 - y2) if (y2 is not None and y10 is not None) else None
slope_5y30y = (y30 - y5) if (y5 is not None and y30 is not None) else None
butterfly_2_10_30 = (2 * y10 - y2 - y30) if (y2 is not None and y10 is not None and y30 is not None) else None
regime = "unknown"
if slope_2y10y is not None:
if slope_2y10y >= 0.5:
regime = "steep"
elif slope_2y10y > 0.1:
regime = "normal"
elif slope_2y10y > -0.1:
regime = "flat"
else:
regime = "inverted"
return {
"slope_2y10y": round(slope_2y10y, 3) if slope_2y10y is not None else None,
"slope_5y30y": round(slope_5y30y, 3) if slope_5y30y is not None else None,
"butterfly_2_10_30": round(butterfly_2_10_30, 3) if butterfly_2_10_30 is not None else None,
"regime": regime,
}
async def fetch_yield_curve_slope() -> dict[str, Any]:
"""Curve slope/convexity metrics su treasury yields correnti."""
base = await fetch_treasury_yields()
metrics = yield_curve_metrics(base.get("yields") or {})
return {
"yields": base.get("yields"),
**metrics,
"data_timestamp": datetime.now(UTC).isoformat(),
}
async def fetch_breakeven_inflation(fred_api_key: str = "") -> dict[str, Any]:
"""Breakeven inflation rate via FRED:
- T10YIE (10Y breakeven, market expectation 10Y inflation)
- T5YIE (5Y breakeven)
- T5YIFR (5Y forward 5Y forward inflation expectation)
"""
if not fred_api_key:
return {"error": "No FRED API key configured", "breakevens": {}}
series_map = {
"be_5y": "T5YIE",
"be_10y": "T10YIE",
"be_5y5y_forward": "T5YIFR",
}
out: dict[str, float | None] = {}
async with async_client(timeout=10.0) as client:
for name, series_id in series_map.items():
resp = await client.get(
FRED_BASE,
params={
"series_id": series_id,
"api_key": fred_api_key,
"file_type": "json",
"sort_order": "desc",
"limit": 1,
},
)
data = resp.json()
obs = data.get("observations", [])
try:
out[name] = float(obs[0]["value"]) if obs and obs[0]["value"] != "." else None
except (ValueError, IndexError, KeyError):
out[name] = None
interpretation = "unknown"
be10 = out.get("be_10y")
if be10 is not None:
if be10 > 3.0:
interpretation = "high_inflation_expected"
elif be10 < 1.5:
interpretation = "low_inflation_expected"
else:
interpretation = "anchored"
return {
"breakevens": out,
"interpretation": interpretation,
"data_timestamp": datetime.now(UTC).isoformat(),
}
async def fetch_equity_futures() -> dict[str, Any]:
"""Fetch ES/NQ/YM/RTY futures con session detection."""
tickers = [("es", "ES=F"), ("nq", "NQ=F"), ("ym", "YM=F"), ("rty", "RTY=F")]
@@ -176,7 +282,7 @@ async def fetch_equity_futures() -> dict[str, Any]:
session = "after-hours"
out: dict[str, Any] = {}
async with httpx.AsyncClient(timeout=10) as client:
async with async_client(timeout=10.0) as client:
for key, sym in tickers:
info = await _fetch_yahoo_meta(client, sym, "5d")
meta = info.get("meta") or {}
@@ -268,7 +374,7 @@ async def fetch_economic_indicators(
result: dict[str, Any] = {}
if not fred_api_key:
return {"indicators": result, "error": "No FRED API key configured"}
async with httpx.AsyncClient(timeout=10) as client:
async with async_client(timeout=10.0) as client:
for name, series_id in series_map.items():
if indicators and name not in indicators:
continue
@@ -350,7 +456,7 @@ async def fetch_macro_calendar(
# Try Forex Factory free feed first
try:
async with httpx.AsyncClient(timeout=10) as client:
async with async_client(timeout=10.0) as client:
resp = await client.get("https://nfs.faireconomy.media/ff_calendar_thisweek.json")
if resp.status_code == 200:
raw = resp.json()
@@ -409,7 +515,7 @@ async def fetch_macro_calendar(
try:
now = datetime.now(UTC)
end_default = now + timedelta(days=days_ahead)
async with httpx.AsyncClient(timeout=10) as client:
async with async_client(timeout=10.0) as client:
resp = await client.get(
FINNHUB_CALENDAR,
params={
@@ -477,7 +583,7 @@ async def fetch_market_overview() -> dict[str, Any]:
if _MARKET_CACHE["data"] is not None and (now - _MARKET_CACHE["ts"]) < _MARKET_CACHE_TTL:
return _MARKET_CACHE["data"]
async with httpx.AsyncClient(timeout=10) as client:
async with async_client(timeout=10.0) as client:
global_data: dict[str, Any] = {}
prices: dict[str, Any] = {}
try:
@@ -514,3 +620,152 @@ async def fetch_market_overview() -> dict[str, Any]:
_MARKET_CACHE["data"] = out
_MARKET_CACHE["ts"] = now
return out
_COT_TTL = 3600.0 # 1h
_COT_CACHE: dict[tuple[str, str, int], dict[str, Any]] = {}
_COT_CACHE_TS: dict[tuple[str, str, int], float] = {}
async def fetch_cot_tff(symbol: str, lookback_weeks: int = 52) -> dict[str, Any]:
"""Fetch COT TFF report per simbolo equity/financial. Returns ASC by date."""
import time
symbol = symbol.upper()
if symbol not in SYMBOL_TO_CFTC_CODE_TFF:
return {"error": "unknown_symbol", "available": ALL_TFF_SYMBOLS}
key = (symbol, "tff", lookback_weeks)
now = time.monotonic()
if key in _COT_CACHE and (now - _COT_CACHE_TS[key]) < _COT_TTL:
return _COT_CACHE[key]
code = SYMBOL_TO_CFTC_CODE_TFF[symbol]
url = f"{CFTC_BASE_URL}/{TFF_DATASET_ID}.json"
async with async_client(timeout=10.0) as client:
resp = await client.get(
url,
params={
"cftc_contract_market_code": code,
"$order": "report_date_as_yyyy_mm_dd DESC",
"$limit": str(lookback_weeks),
},
)
if resp.status_code != 200:
return {"symbol": symbol, "report_type": "tff", "rows": [], "error": "cftc_unavailable"}
raw_rows = resp.json() or []
parsed = [parse_tff_row(r) for r in raw_rows]
parsed.sort(key=lambda r: r["report_date"]) # ASC by date
out = {
"symbol": symbol,
"report_type": "tff",
"rows": parsed,
"data_timestamp": datetime.now(UTC).isoformat(),
}
_COT_CACHE[key] = out
_COT_CACHE_TS[key] = now
return out
async def fetch_cot_disaggregated(symbol: str, lookback_weeks: int = 52) -> dict[str, Any]:
"""Fetch COT Disaggregated report per simbolo commodity. Returns ASC by date."""
import time
symbol = symbol.upper()
if symbol not in SYMBOL_TO_CFTC_CODE_DISAGG:
return {"error": "unknown_symbol", "available": ALL_DISAGG_SYMBOLS}
key = (symbol, "disaggregated", lookback_weeks)
now = time.monotonic()
if key in _COT_CACHE and (now - _COT_CACHE_TS[key]) < _COT_TTL:
return _COT_CACHE[key]
code = SYMBOL_TO_CFTC_CODE_DISAGG[symbol]
url = f"{CFTC_BASE_URL}/{DISAGG_DATASET_ID}.json"
async with async_client(timeout=10.0) as client:
resp = await client.get(
url,
params={
"cftc_contract_market_code": code,
"$order": "report_date_as_yyyy_mm_dd DESC",
"$limit": str(lookback_weeks),
},
)
if resp.status_code != 200:
return {"symbol": symbol, "report_type": "disaggregated", "rows": [], "error": "cftc_unavailable"}
raw_rows = resp.json() or []
parsed = [parse_disagg_row(r) for r in raw_rows]
parsed.sort(key=lambda r: r["report_date"])
out = {
"symbol": symbol,
"report_type": "disaggregated",
"rows": parsed,
"data_timestamp": datetime.now(UTC).isoformat(),
}
_COT_CACHE[key] = out
_COT_CACHE_TS[key] = now
return out
async def fetch_cot_extreme_positioning(lookback_weeks: int = 156) -> dict[str, Any]:
"""Scanner posizionamento estremo (percentile <=5 o >=95) sui simboli watchlist.
TFF -> key_role = lev_funds (lev_funds_net).
Disaggregated -> key_role = managed_money (managed_money_net).
"""
import asyncio
tff_tasks = [fetch_cot_tff(s, lookback_weeks) for s in ALL_TFF_SYMBOLS]
disagg_tasks = [fetch_cot_disaggregated(s, lookback_weeks) for s in ALL_DISAGG_SYMBOLS]
tff_results, disagg_results = await asyncio.gather(
asyncio.gather(*tff_tasks, return_exceptions=True),
asyncio.gather(*disagg_tasks, return_exceptions=True),
)
extremes: list[dict[str, Any]] = []
for res in tff_results:
if isinstance(res, BaseException) or not isinstance(res, dict):
continue
rows = res.get("rows") or []
if len(rows) < 4:
continue
series = [r["lev_funds_net"] for r in rows]
current = series[-1]
history = series[:-1]
pct = compute_percentile(current, history)
extremes.append({
"symbol": res["symbol"],
"report_type": "tff",
"key_role": "lev_funds",
"current_net": current,
"percentile": pct,
"signal": classify_extreme(pct),
"report_date": rows[-1]["report_date"],
})
for res in disagg_results:
if isinstance(res, BaseException) or not isinstance(res, dict):
continue
rows = res.get("rows") or []
if len(rows) < 4:
continue
series = [r["managed_money_net"] for r in rows]
current = series[-1]
history = series[:-1]
pct = compute_percentile(current, history)
extremes.append({
"symbol": res["symbol"],
"report_type": "disaggregated",
"key_role": "managed_money",
"current_net": current,
"percentile": pct,
"signal": classify_extreme(pct),
"report_date": rows[-1]["report_date"],
})
return {
"lookback_weeks": lookback_weeks,
"extremes": extremes,
"data_timestamp": datetime.now(UTC).isoformat(),
}
@@ -6,15 +6,20 @@ from fastapi import Depends, FastAPI, HTTPException
from mcp_common.auth import Principal, TokenStore, require_principal
from mcp_common.mcp_bridge import mount_mcp_endpoint
from mcp_common.server import build_app
from pydantic import BaseModel
from pydantic import BaseModel, Field
from mcp_macro.fetchers import (
fetch_asset_price,
fetch_breakeven_inflation,
fetch_cot_disaggregated,
fetch_cot_extreme_positioning,
fetch_cot_tff,
fetch_economic_indicators,
fetch_equity_futures,
fetch_macro_calendar,
fetch_market_overview,
fetch_treasury_yields,
fetch_yield_curve_slope,
)
# --- Body models ---
@@ -47,6 +52,28 @@ class GetEquityFuturesReq(BaseModel):
pass
class GetYieldCurveSlopeReq(BaseModel):
pass
class GetBreakevenInflationReq(BaseModel):
pass
class GetCotTffReq(BaseModel):
symbol: str
lookback_weeks: int = Field(default=52, ge=4, le=520)
class GetCotDisaggregatedReq(BaseModel):
symbol: str
lookback_weeks: int = Field(default=52, ge=4, le=520)
class GetCotExtremeReq(BaseModel):
lookback_weeks: int = Field(default=156, ge=4, le=520)
# --- ACL helper ---
def _check(principal: Principal, *, core: bool = False, observer: bool = False) -> None:
@@ -115,6 +142,41 @@ def create_app(*, fred_api_key: str = "", finnhub_api_key: str = "", token_store
_check(principal, core=True, observer=True)
return await fetch_equity_futures()
@app.post("/tools/get_yield_curve_slope", tags=["reads"])
async def t_get_yield_curve_slope(
body: GetYieldCurveSlopeReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_yield_curve_slope()
@app.post("/tools/get_breakeven_inflation", tags=["reads"])
async def t_get_breakeven_inflation(
body: GetBreakevenInflationReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_breakeven_inflation(fred_api_key=fred_api_key)
@app.post("/tools/get_cot_tff", tags=["reads"])
async def t_get_cot_tff(
body: GetCotTffReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_cot_tff(body.symbol, body.lookback_weeks)
@app.post("/tools/get_cot_disaggregated", tags=["reads"])
async def t_get_cot_disaggregated(
body: GetCotDisaggregatedReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_cot_disaggregated(body.symbol, body.lookback_weeks)
@app.post("/tools/get_cot_extreme_positioning", tags=["reads"])
async def t_get_cot_extreme(
body: GetCotExtremeReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_cot_extreme_positioning(body.lookback_weeks)
# ───── MCP endpoint (/mcp) — bridge to /tools/* ─────
port = int(os.environ.get("PORT", "9013"))
mount_mcp_endpoint(
@@ -130,6 +192,11 @@ def create_app(*, fred_api_key: str = "", finnhub_api_key: str = "", token_store
{"name": "get_asset_price", "description": "Prezzo cross-asset: WTI, DXY, SPX, VIX, yields, FX, ecc."},
{"name": "get_treasury_yields", "description": "Curva US Treasury 2y/5y/10y/30y + shape detection."},
{"name": "get_equity_futures", "description": "Futures ES/NQ/YM/RTY con session status."},
{"name": "get_yield_curve_slope", "description": "Slope 2y10y/5y30y + butterfly + regime (steep/normal/flat/inverted)."},
{"name": "get_breakeven_inflation", "description": "Breakeven inflation 5Y/10Y + 5y5y forward (FRED T5YIE/T10YIE/T5YIFR)."},
{"name": "get_cot_tff", "description": "COT TFF report (CFTC) per equity/financial: ES/NQ/RTY/ZN/ZB/6E/6J/DX. Roles: dealer, asset manager, leveraged funds, other."},
{"name": "get_cot_disaggregated", "description": "COT Disaggregated report (CFTC) per commodities: CL/GC/SI/HG/ZW/ZC/ZS. Roles: producer/merchant, swap dealer, managed money, other."},
{"name": "get_cot_extreme_positioning", "description": "Scanner posizionamento estremo (percentile ≤5 o ≥95) sui simboli watchlist."},
],
)
@@ -0,0 +1,117 @@
from __future__ import annotations
from mcp_macro.cot import (
classify_extreme,
compute_percentile,
parse_disagg_row,
parse_tff_row,
)
def test_compute_percentile_basic():
history = [10, 20, 30, 40, 50, 60, 70, 80, 90, 100]
assert compute_percentile(50, history) == 50.0
assert compute_percentile(10, history) == 10.0
assert compute_percentile(100, history) == 100.0
def test_compute_percentile_value_below_min():
history = [10, 20, 30]
assert compute_percentile(5, history) == 0.0
def test_compute_percentile_value_above_max():
history = [10, 20, 30]
assert compute_percentile(40, history) == 100.0
def test_compute_percentile_empty_history():
assert compute_percentile(50, []) is None
def test_classify_extreme_below_threshold():
assert classify_extreme(3.0) == "extreme_short"
assert classify_extreme(5.0) == "extreme_short" # boundary inclusive
def test_classify_extreme_above_threshold():
assert classify_extreme(96.0) == "extreme_long"
assert classify_extreme(95.0) == "extreme_long" # boundary inclusive
def test_classify_extreme_neutral():
assert classify_extreme(50.0) == "neutral"
assert classify_extreme(94.99) == "neutral"
assert classify_extreme(5.01) == "neutral"
def test_classify_extreme_none_input():
assert classify_extreme(None) == "neutral"
# Real Socrata payload (subset of relevant fields, arbitrary values for tests)
TFF_SOCRATA_ROW = {
"report_date_as_yyyy_mm_dd": "2026-04-22T00:00:00.000",
"dealer_positions_long_all": "12345",
"dealer_positions_short_all": "23456",
"asset_mgr_positions_long": "654321",
"asset_mgr_positions_short": "200000",
"lev_money_positions_long": "100000",
"lev_money_positions_short": "350000",
"other_rept_positions_long": "50000",
"other_rept_positions_short": "50000",
"open_interest_all": "2500000",
}
DISAGG_SOCRATA_ROW = {
"report_date_as_yyyy_mm_dd": "2026-04-22T00:00:00.000",
"prod_merc_positions_long_all": "100000",
"prod_merc_positions_short_all": "300000",
"swap_positions_long_all": "50000",
"swap_positions_short_all": "60000",
"m_money_positions_long_all": "200000",
"m_money_positions_short_all": "80000",
"other_rept_positions_long_all": "10000",
"other_rept_positions_short_all": "10000",
"open_interest_all": "1500000",
}
def test_parse_tff_row_extracts_all_fields():
row = parse_tff_row(TFF_SOCRATA_ROW)
assert row["report_date"] == "2026-04-22"
assert row["dealer_long"] == 12345
assert row["dealer_short"] == 23456
assert row["dealer_net"] == 12345 - 23456
assert row["asset_mgr_long"] == 654321
assert row["asset_mgr_net"] == 654321 - 200000
assert row["lev_funds_long"] == 100000
assert row["lev_funds_short"] == 350000
assert row["lev_funds_net"] == 100000 - 350000
assert row["other_long"] == 50000
assert row["other_net"] == 0
assert row["open_interest"] == 2500000
def test_parse_tff_row_handles_missing_field():
payload = {"report_date_as_yyyy_mm_dd": "2026-04-22T00:00:00.000"}
row = parse_tff_row(payload)
assert row["report_date"] == "2026-04-22"
assert row["dealer_long"] == 0
assert row["dealer_net"] == 0
def test_parse_disagg_row_extracts_all_fields():
row = parse_disagg_row(DISAGG_SOCRATA_ROW)
assert row["report_date"] == "2026-04-22"
assert row["producer_long"] == 100000
assert row["producer_short"] == 300000
assert row["producer_net"] == -200000
assert row["swap_long"] == 50000
assert row["swap_net"] == -10000
assert row["managed_money_long"] == 200000
assert row["managed_money_short"] == 80000
assert row["managed_money_net"] == 120000
assert row["other_long"] == 10000
assert row["other_net"] == 0
assert row["open_interest"] == 1500000
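A parser consistent with these assertions could look like the following sketch (the `_num` helper and its defaulting rules are assumptions, not the project's actual code):

```python
def _num(raw, key):
    # Socrata serializes counts as strings; missing/blank fields default to 0
    try:
        return int(float(raw.get(key) or 0))
    except (TypeError, ValueError):
        return 0


def parse_tff_row(raw):
    r = {
        "report_date": str(raw.get("report_date_as_yyyy_mm_dd", ""))[:10],
        "dealer_long": _num(raw, "dealer_positions_long_all"),
        "dealer_short": _num(raw, "dealer_positions_short_all"),
        "asset_mgr_long": _num(raw, "asset_mgr_positions_long"),
        "asset_mgr_short": _num(raw, "asset_mgr_positions_short"),
        "lev_funds_long": _num(raw, "lev_money_positions_long"),
        "lev_funds_short": _num(raw, "lev_money_positions_short"),
        "other_long": _num(raw, "other_rept_positions_long"),
        "other_short": _num(raw, "other_rept_positions_short"),
        "open_interest": _num(raw, "open_interest_all"),
    }
    # Net = long - short, per role
    for role in ("dealer", "asset_mgr", "lev_funds", "other"):
        r[f"{role}_net"] = r[f"{role}_long"] - r[f"{role}_short"]
    return r
```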
@@ -6,9 +6,11 @@ import httpx
import pytest
import pytest_httpx
from mcp_macro.fetchers import (
fetch_breakeven_inflation,
fetch_economic_indicators,
fetch_macro_calendar,
fetch_market_overview,
yield_curve_metrics,
)
# --- fetch_economic_indicators ---
@@ -183,3 +185,218 @@ async def test_market_overview_happy(httpx_mock: pytest_httpx.HTTPXMock):
assert result["gold"] == 2412.5
assert result["vix"] == 18.3
assert "data_timestamp" in result
# --- yield_curve_metrics ---
def test_yield_curve_metrics_normal_curve():
out = yield_curve_metrics({"us2y": 4.0, "us5y": 4.2, "us10y": 4.5, "us30y": 4.8})
assert out["slope_2y10y"] == 0.5
assert out["slope_5y30y"] == 0.6
assert out["regime"] == "steep"
# butterfly: 2*4.5 - 4.0 - 4.8 = 0.2
assert out["butterfly_2_10_30"] == 0.2
def test_yield_curve_metrics_inverted():
out = yield_curve_metrics({"us2y": 5.5, "us5y": 5.0, "us10y": 4.5, "us30y": 4.3})
assert out["slope_2y10y"] == -1.0
assert out["regime"] == "inverted"
def test_yield_curve_metrics_partial_data():
out = yield_curve_metrics({"us10y": 4.5})
assert out["slope_2y10y"] is None
assert out["regime"] == "unknown"
# --- fetch_breakeven_inflation ---
@pytest.mark.asyncio
async def test_breakeven_no_key():
out = await fetch_breakeven_inflation(fred_api_key="")
assert "error" in out
@pytest.mark.asyncio
async def test_breakeven_happy_path(httpx_mock: pytest_httpx.HTTPXMock):
for series_id, val in [("T5YIE", "2.3"), ("T10YIE", "2.5"), ("T5YIFR", "2.7")]:
httpx_mock.add_response(
url=httpx.URL(
"https://api.stlouisfed.org/fred/series/observations",
params={
"series_id": series_id,
"api_key": "k",
"file_type": "json",
"sort_order": "desc",
"limit": "1",
},
),
json={"observations": [{"value": val}]},
)
out = await fetch_breakeven_inflation(fred_api_key="k")
assert out["breakevens"]["be_5y"] == 2.3
assert out["breakevens"]["be_10y"] == 2.5
assert out["breakevens"]["be_5y5y_forward"] == 2.7
assert out["interpretation"] == "anchored"
@pytest.mark.asyncio
async def test_breakeven_high_inflation(httpx_mock: pytest_httpx.HTTPXMock):
for series_id in ("T5YIE", "T10YIE", "T5YIFR"):
httpx_mock.add_response(
url=httpx.URL(
"https://api.stlouisfed.org/fred/series/observations",
params={
"series_id": series_id,
"api_key": "k",
"file_type": "json",
"sort_order": "desc",
"limit": "1",
},
),
json={"observations": [{"value": "3.5"}]},
)
out = await fetch_breakeven_inflation(fred_api_key="k")
assert out["interpretation"] == "high_inflation_expected"
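These two tests pin down only two regions of the interpretation rule. A sketch consistent with them (the 3.0 threshold and any additional labels are assumptions):

```python
def interpret_breakevens(be_10y):
    # Assumed thresholds: the tests only show 2.5 -> "anchored"
    # and 3.5 -> "high_inflation_expected"
    if be_10y is None:
        return "unknown"
    if be_10y >= 3.0:
        return "high_inflation_expected"
    return "anchored"
```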
@pytest.mark.asyncio
async def test_fetch_cot_tff_happy_path(httpx_mock: pytest_httpx.HTTPXMock):
from mcp_macro.fetchers import fetch_cot_tff
httpx_mock.add_response(
url=httpx.URL(
"https://publicreporting.cftc.gov/resource/gpe5-46if.json",
params={
"cftc_contract_market_code": "13874A",
"$order": "report_date_as_yyyy_mm_dd DESC",
"$limit": "52",
},
),
json=[
{
"report_date_as_yyyy_mm_dd": "2026-04-22T00:00:00.000",
"dealer_positions_long_all": "12345",
"dealer_positions_short_all": "23456",
"asset_mgr_positions_long": "654321",
"asset_mgr_positions_short": "200000",
"lev_money_positions_long": "100000",
"lev_money_positions_short": "350000",
"other_rept_positions_long": "50000",
"other_rept_positions_short": "50000",
"open_interest_all": "2500000",
},
{
"report_date_as_yyyy_mm_dd": "2026-04-15T00:00:00.000",
"dealer_positions_long_all": "11000",
"dealer_positions_short_all": "22000",
"asset_mgr_positions_long": "640000",
"asset_mgr_positions_short": "210000",
"lev_money_positions_long": "110000",
"lev_money_positions_short": "320000",
"other_rept_positions_long": "48000",
"other_rept_positions_short": "52000",
"open_interest_all": "2480000",
},
],
)
out = await fetch_cot_tff("ES", lookback_weeks=52)
assert out["symbol"] == "ES"
assert out["report_type"] == "tff"
assert len(out["rows"]) == 2
# Ordering ASC by date (oldest first)
assert out["rows"][0]["report_date"] == "2026-04-15"
assert out["rows"][1]["report_date"] == "2026-04-22"
assert out["rows"][1]["lev_funds_net"] == -250000
assert "data_timestamp" in out
@pytest.mark.asyncio
async def test_fetch_cot_tff_unknown_symbol():
from mcp_macro.fetchers import fetch_cot_tff
out = await fetch_cot_tff("INVALID", lookback_weeks=52)
assert out.get("error") == "unknown_symbol"
assert "ES" in out.get("available", [])
@pytest.mark.asyncio
async def test_fetch_cot_disagg_happy_path(httpx_mock: pytest_httpx.HTTPXMock):
from mcp_macro.fetchers import fetch_cot_disaggregated
httpx_mock.add_response(
url=httpx.URL(
"https://publicreporting.cftc.gov/resource/72hh-3qpy.json",
params={
"cftc_contract_market_code": "067651",
"$order": "report_date_as_yyyy_mm_dd DESC",
"$limit": "52",
},
),
json=[
{
"report_date_as_yyyy_mm_dd": "2026-04-22T00:00:00.000",
"prod_merc_positions_long_all": "100000",
"prod_merc_positions_short_all": "300000",
"swap_positions_long_all": "50000",
"swap_positions_short_all": "60000",
"m_money_positions_long_all": "200000",
"m_money_positions_short_all": "80000",
"other_rept_positions_long_all": "10000",
"other_rept_positions_short_all": "10000",
"open_interest_all": "1500000",
},
],
)
out = await fetch_cot_disaggregated("CL", lookback_weeks=52)
assert out["symbol"] == "CL"
assert out["report_type"] == "disaggregated"
assert len(out["rows"]) == 1
assert out["rows"][0]["managed_money_net"] == 120000
assert out["rows"][0]["producer_net"] == -200000
@pytest.mark.asyncio
async def test_fetch_cot_disagg_unknown_symbol():
from mcp_macro.fetchers import fetch_cot_disaggregated
out = await fetch_cot_disaggregated("XYZ", lookback_weeks=52)
assert out.get("error") == "unknown_symbol"
assert "CL" in out.get("available", [])
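The unknown-symbol behavior, together with the contract codes visible in the mocked URLs (ES maps to TFF code 13874A, CL to disaggregated code 067651), suggests a lookup along these lines (hypothetical table and helper names):

```python
TFF_CODES = {"ES": "13874A"}      # code seen in the mocked TFF request
DISAGG_CODES = {"CL": "067651"}   # code seen in the mocked disaggregated request


def resolve_code(symbol, table):
    """Map a symbol to its CFTC contract market code, or return the error shape."""
    code = table.get(symbol.upper())
    if code is None:
        return {"error": "unknown_symbol", "available": sorted(table)}
    return {"code": code}
```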
@pytest.mark.asyncio
async def test_fetch_cot_extreme_positioning_flags_outliers(monkeypatch):
"""Mock fetch_cot_tff and fetch_cot_disaggregated to simulate history plus the latest point."""
from unittest.mock import AsyncMock
from mcp_macro import fetchers as f
# Simulate an ES series whose latest lev_funds_net sits at the bottom of the range (extreme_short)
es_rows = [
{"report_date": f"2026-{m:02d}-01", "lev_funds_net": v}
for m, v in [(1, 0), (2, 50), (3, 100), (4, -500)]
]
cl_rows = [
{"report_date": f"2026-{m:02d}-01", "managed_money_net": v}
for m, v in [(1, 100), (2, 200), (3, 300), (4, 1000)]
]
async def fake_tff(symbol, lookback_weeks):
if symbol == "ES":
return {"symbol": "ES", "report_type": "tff", "rows": es_rows}
return {"symbol": symbol, "report_type": "tff", "rows": []}
async def fake_disagg(symbol, lookback_weeks):
if symbol == "CL":
return {"symbol": "CL", "report_type": "disaggregated", "rows": cl_rows}
return {"symbol": symbol, "report_type": "disaggregated", "rows": []}
monkeypatch.setattr(f, "fetch_cot_tff", AsyncMock(side_effect=fake_tff))
monkeypatch.setattr(f, "fetch_cot_disaggregated", AsyncMock(side_effect=fake_disagg))
out = await f.fetch_cot_extreme_positioning(lookback_weeks=4)
assert "extremes" in out
by_sym = {e["symbol"]: e for e in out["extremes"]}
assert by_sym["ES"]["signal"] == "extreme_short"
assert by_sym["ES"]["key_role"] == "lev_funds"
assert by_sym["CL"]["signal"] == "extreme_long"
assert by_sym["CL"]["key_role"] == "managed_money"
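The expected extremes can be reproduced by percentile-ranking the latest net value of each series against its own history; a self-contained sketch (helper name and 5/95 thresholds are assumptions):

```python
def extreme_for_series(rows, key):
    """Rank the latest value of rows[key] within its history and label it."""
    vals = [r[key] for r in rows if r.get(key) is not None]
    if len(vals) < 2:
        return None
    lo, hi = min(vals), max(vals)
    last = vals[-1]
    pct = 50.0 if hi == lo else 100.0 * (last - lo) / (hi - lo)
    if pct <= 5.0:
        signal = "extreme_short"
    elif pct >= 95.0:
        signal = "extreme_long"
    else:
        signal = "neutral"
    # "lev_funds_net" -> key_role "lev_funds"
    return {"key_role": key.rsplit("_net", 1)[0], "percentile": round(pct, 2), "signal": signal}
```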
@@ -4,8 +4,8 @@ from unittest.mock import AsyncMock, patch
import pytest
from fastapi.testclient import TestClient
from mcp_common.auth import Principal, TokenStore
from mcp_macro.server import create_app
@pytest.fixture
@@ -125,3 +125,78 @@ def test_get_market_overview_observer_ok(http):
def test_get_market_overview_no_auth_401(http):
r = http.post("/tools/get_market_overview", json={})
assert r.status_code == 401
def test_get_cot_tff_core_ok(http):
with patch(
"mcp_macro.server.fetch_cot_tff",
new=AsyncMock(return_value={"symbol": "ES", "rows": []}),
):
r = http.post(
"/tools/get_cot_tff",
headers={"Authorization": "Bearer ct"},
json={"symbol": "ES"},
)
assert r.status_code == 200
assert r.json()["symbol"] == "ES"
def test_get_cot_tff_observer_ok(http):
with patch(
"mcp_macro.server.fetch_cot_tff",
new=AsyncMock(return_value={"symbol": "ES", "rows": []}),
):
r = http.post(
"/tools/get_cot_tff",
headers={"Authorization": "Bearer ot"},
json={"symbol": "ES"},
)
assert r.status_code == 200
def test_get_cot_tff_no_auth_401(http):
r = http.post("/tools/get_cot_tff", json={"symbol": "ES"})
assert r.status_code == 401
def test_get_cot_disagg_observer_ok(http):
with patch(
"mcp_macro.server.fetch_cot_disaggregated",
new=AsyncMock(return_value={"symbol": "CL", "rows": []}),
):
r = http.post(
"/tools/get_cot_disaggregated",
headers={"Authorization": "Bearer ot"},
json={"symbol": "CL"},
)
assert r.status_code == 200
def test_get_cot_disagg_no_auth_401(http):
r = http.post("/tools/get_cot_disaggregated", json={"symbol": "CL"})
assert r.status_code == 401
def test_get_cot_extreme_positioning_ok(http):
with patch(
"mcp_macro.server.fetch_cot_extreme_positioning",
new=AsyncMock(return_value={"extremes": []}),
):
r = http.post(
"/tools/get_cot_extreme_positioning",
headers={"Authorization": "Bearer ot"},
json={},
)
assert r.status_code == 200
def test_get_cot_extreme_positioning_lookback_too_short(http):
"""Pydantic validation: lookback_weeks < 4 → 422."""
r = http.post(
"/tools/get_cot_extreme_positioning",
headers={"Authorization": "Bearer ct"},
json={"lookback_weeks": 2},
)
assert r.status_code == 422
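The 422 assertion implies the request model constrains lookback_weeks. A plausible declaration (the ge=4 bound and the default of 52 are assumptions matching this test and the fetchers' defaults):

```python
from pydantic import BaseModel, Field


class GetCotExtremePositioningReq(BaseModel):
    # Assumed lower bound: values below 4 are rejected, which FastAPI surfaces as a 422
    lookback_weeks: int = Field(default=52, ge=4)
```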
@@ -5,7 +5,6 @@ import os
import uvicorn
from mcp_common.auth import load_token_store_from_files
from mcp_common.logging import configure_root_logging
from mcp_sentiment.server import create_app
@@ -5,7 +5,7 @@ import re
import xml.etree.ElementTree as ET
from typing import Any
import httpx
from mcp_common.http import async_client
CRYPTOPANIC_URL = "https://cryptopanic.com/api/v1/posts/"
ALTERNATIVE_ME_URL = "https://api.alternative.me/fng/"
@@ -18,7 +18,7 @@ MESSARI_NEWS_URL = "https://data.messari.io/api/v1/news"
async def _fetch_coindesk_headlines(limit: int = 20) -> list[dict[str, Any]]:
items: list[dict[str, Any]] = []
try:
async with async_client(timeout=10.0, follow_redirects=True) as client:
resp = await client.get(COINDESK_RSS)
if resp.status_code != 200:
return items
@@ -54,7 +54,7 @@ async def _fetch_cryptocompare_news(limit: int = 20) -> list[dict[str, Any]]:
"""CER-017: CryptoCompare news free (no key needed)."""
items: list[dict[str, Any]] = []
try:
async with async_client(timeout=10.0) as client:
resp = await client.get(CRYPTOCOMPARE_NEWS_URL, params={"lang": "EN"})
if resp.status_code != 200:
return items
@@ -82,7 +82,7 @@ async def _fetch_messari_news(limit: int = 20) -> list[dict[str, Any]]:
"""CER-017: Messari news free (no key needed for basic feed)."""
items: list[dict[str, Any]] = []
try:
async with async_client(timeout=10.0) as client:
resp = await client.get(MESSARI_NEWS_URL)
if resp.status_code != 200:
return items
@@ -164,7 +164,7 @@ async def fetch_crypto_news(api_key: str = "", limit: int = 20) -> dict[str, Any
async def _fetch_cryptopanic_news(api_key: str, limit: int) -> list[dict[str, Any]]:
"""Cryptopanic as one of the sources. Failure → []."""
try:
async with async_client(timeout=10.0) as client:
resp = await client.get(
CRYPTOPANIC_URL,
params={"auth_token": api_key, "public": "true"},
@@ -189,7 +189,7 @@ async def _fetch_cryptopanic_news(api_key: str, limit: int) -> list[dict[str, An
async def _fetch_lunarcrush(symbol: str, api_key: str) -> dict | None:
"""CER-P2-005: LunarCrush v4 social metrics. Returns None on failure."""
try:
async with async_client(timeout=10.0) as client:
resp = await client.get(
LUNARCRUSH_COIN_URL.format(symbol=symbol.upper()),
headers={"Authorization": f"Bearer {api_key}"},
@@ -220,7 +220,7 @@ async def fetch_social_sentiment(symbol: str = "BTC") -> dict[str, Any]:
Otherwise derives a proxy from fear_greed to populate twitter/reddit sentiment
(marked derived=True so the agent knows it is a proxy).
"""
async with async_client(timeout=10.0) as client:
fng_resp = await client.get(ALTERNATIVE_ME_URL, params={"limit": 1})
fng_data = fng_resp.json()
fng_list = fng_data.get("data", [])
@@ -275,7 +275,7 @@ async def fetch_funding_rates(asset: str = "BTC") -> dict[str, Any]:
okx_inst = f"{asset}-USDT-SWAP"
rates: list[dict[str, Any]] = []
async with async_client(timeout=10.0) as client:
# Binance
try:
resp = await client.get(BINANCE_FUNDING_URL, params={"symbol": usdt_symbol})
@@ -336,11 +336,12 @@ async def fetch_funding_rates(asset: str = "BTC") -> dict[str, Any]:
async def fetch_cross_exchange_funding(assets: list[str] | None = None) -> dict[str, Any]:
"""Multi-asset funding rate snapshot with spread and arbitrage detection."""
from datetime import UTC
from datetime import datetime as _dt
assets = [a.upper() for a in (assets or ["BTC", "ETH", "SOL"])]
snapshot: dict[str, dict[str, Any]] = {}
async with async_client(timeout=10.0) as client:
for asset in assets:
rates: dict[str, float | None] = {
"binance": None,
@@ -438,11 +439,142 @@ async def fetch_cross_exchange_funding(assets: list[str] | None = None) -> dict[
}
async def fetch_funding_arb_spread(assets: list[str] | None = None) -> dict[str, Any]:
"""Summarizes cross-exchange funding arbitrage opportunities in a compact
format: for each asset, min/max rate + spread + annualized %.
Wrapper over fetch_cross_exchange_funding focused on action items.
"""
base = await fetch_cross_exchange_funding(assets)
snapshot = base.get("snapshot") or {}
rows: list[dict[str, Any]] = []
for asset, data in snapshot.items():
rates = {k: v for k, v in data.items() if k in ("binance", "bybit", "okx", "hyperliquid") and v is not None}
if len(rates) < 2:
continue
sorted_rates = sorted(rates.items(), key=lambda x: x[1])
low_ex, low_v = sorted_rates[0]
high_ex, high_v = sorted_rates[-1]
spread = high_v - low_v
# Funding cycle: 8h on most venues (1h on Hyperliquid); assume 8h => 3 payments/day
ann_pct = spread * 3 * 365 * 100
actionable = ann_pct > 50
rows.append({
"asset": asset,
"long_venue": low_ex,
"short_venue": high_ex,
"long_funding": low_v,
"short_funding": high_v,
"spread": spread,
"annualized_pct": round(ann_pct, 2),
"actionable": actionable,
})
rows.sort(key=lambda r: -r["annualized_pct"])
return {
"opportunities": rows,
"data_timestamp": base.get("data_timestamp"),
}
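As a quick sanity check of the annualization above (8h cycle, hence 3 funding payments per day), a hypothetical per-cycle spread of 0.0005 between two venues annualizes to 54.75%, clearing the 50% actionability bar:

```python
spread = 0.0005  # hypothetical per-cycle funding gap between the two venues
ann_pct = spread * 3 * 365 * 100  # 3 cycles/day * 365 days, expressed in percent
assert round(ann_pct, 2) == 54.75
actionable = ann_pct > 50  # True: would be flagged as an opportunity
```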
async def fetch_liquidation_heatmap(asset: str = "BTC") -> dict[str, Any]:
"""Heuristic liquidation pressure: combines the OI delta with funding
extremes per asset. Does NOT use a paid liquidation feed (Coinglass):
estimates where leveraged exposure ready to be liquidated is concentrated.
long_squeeze_risk: high if OI is growing and funding is positive (longs crowded).
short_squeeze_risk: high if OI is growing and funding is negative (shorts crowded).
"""
asset = asset.upper()
oi = await fetch_oi_history(asset=asset, period="5m", limit=288)
funding = await fetch_cross_exchange_funding(assets=[asset])
snap = (funding.get("snapshot") or {}).get(asset) or {}
rates = [v for k, v in snap.items() if k in ("binance", "bybit", "okx", "hyperliquid") and v is not None]
avg_funding = sum(rates) / len(rates) if rates else None
delta_4h = oi.get("delta_pct_4h")
delta_24h = oi.get("delta_pct_24h")
long_risk = "low"
short_risk = "low"
if avg_funding is not None and delta_24h is not None:
if avg_funding > 0.0001 and delta_24h > 5:
long_risk = "high"
elif avg_funding > 0.00005 and delta_24h > 2:
long_risk = "medium"
if avg_funding < -0.0001 and delta_24h > 5:
short_risk = "high"
elif avg_funding < -0.00005 and delta_24h > 2:
short_risk = "medium"
return {
"asset": asset,
"avg_funding_rate": avg_funding,
"oi_delta_pct_4h": delta_4h,
"oi_delta_pct_24h": delta_24h,
"long_squeeze_risk": long_risk,
"short_squeeze_risk": short_risk,
"note": "heuristic; not a substitute for dedicated liquidation feeds (Coinglass).",
}
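The threshold ladder above is easy to get subtly wrong, so it helps to mirror it as a pure helper for unit testing (a sketch that copies the constants from fetch_liquidation_heatmap verbatim):

```python
def squeeze_risks(avg_funding, delta_24h):
    """Classify long/short squeeze risk from avg funding and 24h OI delta (%)."""
    long_risk = short_risk = "low"
    if avg_funding is None or delta_24h is None:
        return long_risk, short_risk
    # Longs crowded: positive funding plus growing open interest
    if avg_funding > 0.0001 and delta_24h > 5:
        long_risk = "high"
    elif avg_funding > 0.00005 and delta_24h > 2:
        long_risk = "medium"
    # Shorts crowded: negative funding plus growing open interest
    if avg_funding < -0.0001 and delta_24h > 5:
        short_risk = "high"
    elif avg_funding < -0.00005 and delta_24h > 2:
        short_risk = "medium"
    return long_risk, short_risk
```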
async def fetch_cointegration_pairs(
pairs: list[list[str]] | None = None,
lookback_hours: int = 24,
) -> dict[str, Any]:
"""Engle-Granger test on crypto pairs using Binance hourly closes.
pairs: list of [base, quote] (e.g. [["BTC", "ETH"]]). Defaults to the top 3.
"""
from mcp_common.stats import cointegration_test
pairs = pairs or [["BTC", "ETH"], ["BTC", "SOL"], ["ETH", "SOL"]]
out: list[dict[str, Any]] = []
interval = "1h"
limit = max(50, lookback_hours)
async with async_client(timeout=15.0) as client:
for pair in pairs:
if len(pair) != 2:
continue
a, b = pair[0].upper(), pair[1].upper()
sym_a = f"{a}USDT"
sym_b = f"{b}USDT"
try:
resp_a = await client.get(
"https://api.binance.com/api/v3/klines",
params={"symbol": sym_a, "interval": interval, "limit": limit},
)
resp_b = await client.get(
"https://api.binance.com/api/v3/klines",
params={"symbol": sym_b, "interval": interval, "limit": limit},
)
if resp_a.status_code != 200 or resp_b.status_code != 200:
continue
closes_a = [float(k[4]) for k in resp_a.json()]
closes_b = [float(k[4]) for k in resp_b.json()]
if len(closes_a) != len(closes_b):
n = min(len(closes_a), len(closes_b))
closes_a = closes_a[-n:]
closes_b = closes_b[-n:]
result = cointegration_test(closes_a, closes_b)
out.append({
"pair": [a, b],
"samples": len(closes_a),
**result,
})
except Exception as e:
out.append({"pair": [a, b], "error": str(e)})
out.sort(key=lambda r: r.get("adf_t_stat") or 0)
return {
"results": out,
"lookback_hours": lookback_hours,
}
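cointegration_test itself lives in mcp_common.stats and is not shown here; its first Engle-Granger step (OLS hedge ratio, then the residual series) can be sketched with numpy (assumed available), leaving the ADF stage on the residuals to the stats module:

```python
import numpy as np


def engle_granger_residuals(a, b):
    """Step 1 of Engle-Granger: OLS fit a = alpha + beta*b, return (residuals, beta).

    Step 2 (an ADF unit-root test on the residuals) is what produces the
    adf_t_stat the results above are sorted by.
    """
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    # Design matrix with intercept column
    X = np.column_stack([np.ones_like(b), b])
    alpha, beta = np.linalg.lstsq(X, a, rcond=None)[0]
    return a - (alpha + beta * b), beta
```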
async def fetch_world_news() -> dict[str, Any]:
"""Fetch world financial news from free RSS feeds."""
articles: list[dict[str, Any]] = []
async with async_client(timeout=10.0, follow_redirects=True) as client:
for source_name, url in WORLD_NEWS_FEEDS:
try:
resp = await client.get(url)
@@ -482,7 +614,7 @@ async def fetch_oi_history(asset: str = "BTC", period: str = "5m", limit: int =
limit = max(1, min(int(limit), 500))
points: list[dict[str, Any]] = []
try:
async with async_client(timeout=10.0) as client:
resp = await client.get(
BINANCE_OI_HIST_URL,
params={"symbol": symbol, "period": period, "limit": limit},
@@ -9,17 +9,20 @@ from mcp_common.mcp_bridge import mount_mcp_endpoint
from mcp_common.server import build_app
from pydantic import BaseModel
from mcp_sentiment.fetchers import (
fetch_cointegration_pairs,
fetch_cross_exchange_funding,
fetch_crypto_news,
fetch_funding_arb_spread,
fetch_funding_rates,
fetch_liquidation_heatmap,
fetch_oi_history,
fetch_social_sentiment,
fetch_world_news,
)
logger = logging.getLogger(__name__)
# --- Body models ---
class GetCryptoNewsReq(BaseModel):
@@ -42,6 +45,19 @@ class GetCrossExchangeFundingReq(BaseModel):
assets: list[str] | None = None
class GetFundingArbSpreadReq(BaseModel):
assets: list[str] | None = None
class GetLiquidationHeatmapReq(BaseModel):
asset: str = "BTC"
class GetCointegrationPairsReq(BaseModel):
pairs: list[list[str]] | None = None
lookback_hours: int = 24
class GetOiHistoryReq(BaseModel):
asset: str = "BTC"
period: str = "5m"
@@ -106,6 +122,27 @@ def create_app(*, cryptopanic_key: str = "", token_store: TokenStore) -> FastAPI
_check(principal, core=True, observer=True)
return await fetch_cross_exchange_funding(body.assets)
@app.post("/tools/get_funding_arb_spread", tags=["reads"])
async def t_get_funding_arb_spread(
body: GetFundingArbSpreadReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_funding_arb_spread(body.assets)
@app.post("/tools/get_liquidation_heatmap", tags=["reads"])
async def t_get_liquidation_heatmap(
body: GetLiquidationHeatmapReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_liquidation_heatmap(body.asset)
@app.post("/tools/get_cointegration_pairs", tags=["reads"])
async def t_get_cointegration_pairs(
body: GetCointegrationPairsReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_cointegration_pairs(body.pairs, body.lookback_hours)
@app.post("/tools/get_oi_history", tags=["reads"])
async def t_get_oi_history(
body: GetOiHistoryReq, principal: Principal = Depends(require_principal)
@@ -128,6 +165,9 @@ def create_app(*, cryptopanic_key: str = "", token_store: TokenStore) -> FastAPI
{"name": "get_world_news", "description": "News macro/world."},
{"name": "get_cross_exchange_funding", "description": "Funding multi-asset multi-exchange + arbitrage opportunities."},
{"name": "get_oi_history", "description": "Open interest history perp (Binance) + delta_pct 1h/4h/24h."},
{"name": "get_funding_arb_spread", "description": "Compact cross-exchange funding arbitrage opportunities + annualized %."},
{"name": "get_liquidation_heatmap", "description": "Heuristic liquidation pressure from OI delta + funding (long/short squeeze risk)."},
{"name": "get_cointegration_pairs", "description": "Engle-Granger cointegration test on Binance hourly crypto pairs."},
],
)
@@ -4,8 +4,8 @@ from unittest.mock import AsyncMock, patch
import pytest
from fastapi.testclient import TestClient
from mcp_common.auth import Principal, TokenStore
from mcp_sentiment.server import create_app
@pytest.fixture
@@ -157,3 +157,60 @@ def test_get_world_news_observer_ok(http):
def test_get_world_news_no_auth_401(http):
r = http.post("/tools/get_world_news", json={})
assert r.status_code == 401
# --- New indicators: funding_arb_spread, liquidation_heatmap, cointegration_pairs ---
def test_get_funding_arb_spread_ok(http):
with patch(
"mcp_sentiment.server.fetch_funding_arb_spread",
new=AsyncMock(return_value={"opportunities": []}),
):
r = http.post(
"/tools/get_funding_arb_spread",
headers={"Authorization": "Bearer ot"},
json={},
)
assert r.status_code == 200
def test_get_funding_arb_spread_no_auth_401(http):
r = http.post("/tools/get_funding_arb_spread", json={})
assert r.status_code == 401
def test_get_liquidation_heatmap_ok(http):
with patch(
"mcp_sentiment.server.fetch_liquidation_heatmap",
new=AsyncMock(return_value={"asset": "BTC", "long_squeeze_risk": "low"}),
):
r = http.post(
"/tools/get_liquidation_heatmap",
headers={"Authorization": "Bearer ct"},
json={"asset": "BTC"},
)
assert r.status_code == 200
def test_get_liquidation_heatmap_no_auth_401(http):
r = http.post("/tools/get_liquidation_heatmap", json={"asset": "BTC"})
assert r.status_code == 401
def test_get_cointegration_pairs_ok(http):
with patch(
"mcp_sentiment.server.fetch_cointegration_pairs",
new=AsyncMock(return_value={"results": []}),
):
r = http.post(
"/tools/get_cointegration_pairs",
headers={"Authorization": "Bearer ot"},
json={"pairs": [["BTC", "ETH"]]},
)
assert r.status_code == 200
def test_get_cointegration_pairs_no_auth_401(http):
r = http.post("/tools/get_cointegration_pairs", json={})
assert r.status_code == 401
@@ -0,0 +1,25 @@
# Cerbero_mcp local smoke test
```bash
# from the Cerbero_mcp/ repo root
docker compose up -d
bash tests/smoke/run.sh
docker compose down
```
The `run.sh` script verifies:
- `/health` on all 6 MCPs (expects `200`)
- `environment_info` on the 4 exchange MCPs (expects the shape `{environment, source, env_value, base_url, max_leverage}`)
- read-only live tool checks against the upstream testnets:
- deribit `get_ticker BTC-PERPETUAL`
- bybit `get_ticker BTCUSDT` (linear)
- hyperliquid `get_ticker BTC`
- alpaca `get_clock` (requires valid paper credentials)
- macro `get_treasury_yields`
- sentiment `get_funding_rates BTC`
Environment variables:
- `GATEWAY`: gateway base URL (default `http://localhost`; in production `https://cerbero-mcp.tielogic.xyz`)
- `TOKEN_FILE`: path to the read-only bearer token (default `secrets/observer.token`)
Exit code 0 = all checks passed, 1 = one or more checks failed.
@@ -0,0 +1,111 @@
#!/usr/bin/env bash
# Local smoke test for Cerbero_mcp.
# Assumes: docker compose up -d has already been run from the repo root.
# Verifies: every MCP answers /health; the 4 exchange MCPs expose environment_info.
set -euo pipefail
cd "$(dirname "$0")/../.."
GATEWAY="${GATEWAY:-http://localhost}"
TOKEN_FILE="${TOKEN_FILE:-secrets/observer.token}"
if [ ! -f "$TOKEN_FILE" ]; then
echo "ERROR: token file $TOKEN_FILE not found" >&2
exit 2
fi
TOKEN="$(cat "$TOKEN_FILE")"
fail=0
echo "=== /health ==="
for svc in mcp-deribit mcp-bybit mcp-hyperliquid mcp-alpaca mcp-macro mcp-sentiment; do
code=$(curl -s -o /dev/null -w "%{http_code}" \
-H "Authorization: Bearer $TOKEN" \
"$GATEWAY/$svc/health" || echo "000")
if [ "$code" = "200" ]; then
echo " $svc: OK"
else
echo " $svc: FAIL ($code)"
fail=$((fail + 1))
fi
done
echo
echo "=== environment_info on 4 exchange MCPs ==="
for ex in deribit bybit hyperliquid alpaca; do
body=$(curl -s -X POST \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-d '{}' \
"$GATEWAY/mcp-${ex}/tools/environment_info" 2>/dev/null || echo '{}')
if [ -z "$body" ] || [ "$body" = "{}" ]; then
echo " $ex: FAIL (empty body)"
fail=$((fail + 1))
continue
fi
if echo "$body" | jq -e '.environment, .source, .base_url, .max_leverage' > /dev/null 2>&1; then
env=$(echo "$body" | jq -r '.environment')
src=$(echo "$body" | jq -r '.source')
cap=$(echo "$body" | jq -r '.max_leverage')
echo " $ex: OK (env=$env source=$src max_leverage=$cap)"
else
echo " $ex: FAIL (shape mismatch)"
echo " body: $body"
fail=$((fail + 1))
fi
done
echo
echo "=== live tool checks (read-only against upstream testnets) ==="
_check_tool() {
local label="$1" url="$2" body="$3"
local resp
resp=$(curl -s -X POST \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-d "$body" \
"$url" 2>/dev/null || echo '{}')
if echo "$resp" | jq -e 'has("error") | not' > /dev/null 2>&1 \
&& [ -n "$resp" ] && [ "$resp" != "null" ] && [ "$resp" != "{}" ]; then
echo " $label: OK"
else
echo " $label: FAIL"
echo " body: $(echo "$resp" | head -c 200)"
fail=$((fail + 1))
fi
}
_check_tool "deribit get_ticker BTC-PERPETUAL" \
"$GATEWAY/mcp-deribit/tools/get_ticker" \
'{"instrument_name":"BTC-PERPETUAL"}'
_check_tool "bybit get_ticker BTCUSDT" \
"$GATEWAY/mcp-bybit/tools/get_ticker" \
'{"symbol":"BTCUSDT","category":"linear"}'
_check_tool "hyperliquid get_ticker BTC" \
"$GATEWAY/mcp-hyperliquid/tools/get_ticker" \
'{"instrument":"BTC"}'
_check_tool "alpaca get_clock (requires valid ALPACA_PAPER credentials)" \
"$GATEWAY/mcp-alpaca/tools/get_clock" \
'{}'
_check_tool "macro get_treasury_yields" \
"$GATEWAY/mcp-macro/tools/get_treasury_yields" \
'{}'
_check_tool "sentiment get_funding_rates BTC" \
"$GATEWAY/mcp-sentiment/tools/get_funding_rates" \
'{"asset":"BTC"}'
echo
if [ "$fail" -gt 0 ]; then
echo "SMOKE FAILED: $fail check(s) failed"
exit 1
fi
echo "SMOKE OK"
@@ -2,9 +2,12 @@ version = 1
revision = 3
requires-python = ">=3.11"
resolution-markers = [
"python_full_version >= '3.15' and sys_platform == 'win32'",
"python_full_version == '3.14.*' and sys_platform == 'win32'",
"python_full_version >= '3.15' and sys_platform == 'emscripten'",
"python_full_version == '3.14.*' and sys_platform == 'emscripten'",
"python_full_version >= '3.15' and sys_platform != 'emscripten' and sys_platform != 'win32'",
"python_full_version == '3.14.*' and sys_platform != 'emscripten' and sys_platform != 'win32'",
"python_full_version < '3.14' and sys_platform == 'win32'",
"python_full_version < '3.14' and sys_platform == 'emscripten'",
"python_full_version < '3.14' and sys_platform != 'emscripten' and sys_platform != 'win32'",
@@ -23,9 +26,11 @@ members = [
[manifest.dependency-groups]
dev = [
{ name = "mypy", specifier = ">=1.13" },
{ name = "pytest", specifier = ">=9.0.3" },
{ name = "pytest-asyncio", specifier = ">=1.3.0" },
{ name = "pytest-httpx", specifier = ">=0.36.2" },
{ name = "ruff", specifier = ">=0.5,<0.6" },
]
[[package]]
@@ -879,6 +884,79 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/41/45/1a4ed80516f02155c51f51e8cedb3c1902296743db0bbc66608a0db2814f/jsonschema_specifications-2025.9.1-py3-none-any.whl", hash = "sha256:98802fee3a11ee76ecaca44429fda8a41bff98b00a0f2838151b113f210cc6fe", size = 18437, upload-time = "2025-09-08T01:34:57.871Z" },
]
[[package]]
name = "librt"
version = "0.9.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/eb/6b/3d5c13fb3e3c4f43206c8f9dfed13778c2ed4f000bacaa0b7ce3c402a265/librt-0.9.0.tar.gz", hash = "sha256:a0951822531e7aee6e0dfb556b30d5ee36bbe234faf60c20a16c01be3530869d", size = 184368, upload-time = "2026-04-09T16:06:26.173Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/e2/1e/2ec7afcebcf3efea593d13aee18bbcfdd3a243043d848ebf385055e9f636/librt-0.9.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:90904fac73c478f4b83f4ed96c99c8208b75e6f9a8a1910548f69a00f1eaa671", size = 67155, upload-time = "2026-04-09T16:04:42.933Z" },
{ url = "https://files.pythonhosted.org/packages/18/77/72b85afd4435268338ad4ec6231b3da8c77363f212a0227c1ff3b45e4d35/librt-0.9.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:789fff71757facc0738e8d89e3b84e4f0251c1c975e85e81b152cdaca927cc2d", size = 69916, upload-time = "2026-04-09T16:04:44.042Z" },
{ url = "https://files.pythonhosted.org/packages/27/fb/948ea0204fbe2e78add6d46b48330e58d39897e425560674aee302dca81c/librt-0.9.0-cp311-cp311-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:1bf465d1e5b0a27713862441f6467b5ab76385f4ecf8f1f3a44f8aa3c695b4b6", size = 199635, upload-time = "2026-04-09T16:04:45.5Z" },
{ url = "https://files.pythonhosted.org/packages/ac/cd/894a29e251b296a27957856804cfd21e93c194aa131de8bb8032021be07e/librt-0.9.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f819e0c6413e259a17a7c0d49f97f405abadd3c2a316a3b46c6440b7dbbedbb1", size = 211051, upload-time = "2026-04-09T16:04:47.016Z" },
{ url = "https://files.pythonhosted.org/packages/18/8f/dcaed0bc084a35f3721ff2d081158db569d2c57ea07d35623ddaca5cfc8e/librt-0.9.0-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:e0785c2fb4a81e1aece366aa3e2e039f4a4d7d21aaaded5227d7f3c703427882", size = 224031, upload-time = "2026-04-09T16:04:48.207Z" },
{ url = "https://files.pythonhosted.org/packages/03/44/88f6c1ed1132cd418601cc041fbd92fed28b3a09f39de81978e0822d13ff/librt-0.9.0-cp311-cp311-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:80b25c7b570a86c03b5da69e665809deb39265476e8e21d96a9328f9762f9990", size = 218069, upload-time = "2026-04-09T16:04:50.025Z" },
{ url = "https://files.pythonhosted.org/packages/a3/90/7d02e981c2db12188d82b4410ff3e35bfdb844b26aecd02233626f46af2b/librt-0.9.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:d4d16b608a1c43d7e33142099a75cd93af482dadce0bf82421e91cad077157f4", size = 224857, upload-time = "2026-04-09T16:04:51.684Z" },
{ url = "https://files.pythonhosted.org/packages/ef/c3/c77e706b7215ca32e928d47535cf13dbc3d25f096f84ddf8fbc06693e229/librt-0.9.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:194fc1a32e1e21fe809d38b5faea66cc65eaa00217c8901fbdb99866938adbdb", size = 219865, upload-time = "2026-04-09T16:04:52.949Z" },
{ url = "https://files.pythonhosted.org/packages/52/d1/32b0c1a0eb8461c70c11656c46a29f760b7c7edf3c36d6f102470c17170f/librt-0.9.0-cp311-cp311-musllinux_1_2_riscv64.whl", hash = "sha256:8c6bc1384d9738781cfd41d09ad7f6e8af13cfea2c75ece6bd6d2566cdea2076", size = 218451, upload-time = "2026-04-09T16:04:54.174Z" },
{ url = "https://files.pythonhosted.org/packages/74/d1/adfd0f9c44761b1d49b1bec66173389834c33ee2bd3c7fd2e2367f1942d4/librt-0.9.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:15cb151e52a044f06e54ac7f7b47adbfc89b5c8e2b63e1175a9d587c43e8942a", size = 241300, upload-time = "2026-04-09T16:04:55.452Z" },
{ url = "https://files.pythonhosted.org/packages/09/b0/9074b64407712f0003c27f5b1d7655d1438979155f049720e8a1abd9b1a1/librt-0.9.0-cp311-cp311-win32.whl", hash = "sha256:f100bfe2acf8a3689af9d0cc660d89f17286c9c795f9f18f7b62dd1a6b247ae6", size = 55668, upload-time = "2026-04-09T16:04:56.689Z" },
{ url = "https://files.pythonhosted.org/packages/24/19/40b77b77ce80b9389fb03971431b09b6b913911c38d412059e0b3e2a9ef2/librt-0.9.0-cp311-cp311-win_amd64.whl", hash = "sha256:0b73e4266307e51c95e09c0750b7ec383c561d2e97d58e473f6f6a209952fbb8", size = 62976, upload-time = "2026-04-09T16:04:57.733Z" },
{ url = "https://files.pythonhosted.org/packages/70/9d/9fa7a64041e29035cb8c575af5f0e3840be1b97b4c4d9061e0713f171849/librt-0.9.0-cp311-cp311-win_arm64.whl", hash = "sha256:bc5518873822d2faa8ebdd2c1a4d7c8ef47b01a058495ab7924cb65bdbf5fc9a", size = 53502, upload-time = "2026-04-09T16:04:58.806Z" },
{ url = "https://files.pythonhosted.org/packages/bf/90/89ddba8e1c20b0922783cd93ed8e64f34dc05ab59c38a9c7e313632e20ff/librt-0.9.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:9b3e3bc363f71bda1639a4ee593cb78f7fbfeacc73411ec0d4c92f00730010a4", size = 68332, upload-time = "2026-04-09T16:05:00.09Z" },
{ url = "https://files.pythonhosted.org/packages/a8/40/7aa4da1fb08bdeeb540cb07bfc8207cb32c5c41642f2594dbd0098a0662d/librt-0.9.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:0a09c2f5869649101738653a9b7ab70cf045a1105ac66cbb8f4055e61df78f2d", size = 70581, upload-time = "2026-04-09T16:05:01.213Z" },
{ url = "https://files.pythonhosted.org/packages/48/ac/73a2187e1031041e93b7e3a25aae37aa6f13b838c550f7e0f06f66766212/librt-0.9.0-cp312-cp312-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:5ca8e133d799c948db2ab1afc081c333a825b5540475164726dcbf73537e5c2f", size = 203984, upload-time = "2026-04-09T16:05:02.542Z" },
{ url = "https://files.pythonhosted.org/packages/5e/3d/23460d571e9cbddb405b017681df04c142fb1b04cbfce77c54b08e28b108/librt-0.9.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:603138ee838ee1583f1b960b62d5d0007845c5c423feb68e44648b1359014e27", size = 215762, upload-time = "2026-04-09T16:05:04.127Z" },
{ url = "https://files.pythonhosted.org/packages/de/1e/42dc7f8ab63e65b20640d058e63e97fd3e482c1edbda3570d813b4d0b927/librt-0.9.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:f4003f70c56a5addd6aa0897f200dd59afd3bf7bcd5b3cce46dd21f925743bc2", size = 230288, upload-time = "2026-04-09T16:05:05.883Z" },
{ url = "https://files.pythonhosted.org/packages/dc/08/ca812b6d8259ad9ece703397f8ad5c03af5b5fedfce64279693d3ce4087c/librt-0.9.0-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:78042f6facfd98ecb25e9829c7e37cce23363d9d7c83bc5f72702c5059eb082b", size = 224103, upload-time = "2026-04-09T16:05:07.148Z" },
{ url = "https://files.pythonhosted.org/packages/b6/3f/620490fb2fa66ffd44e7f900254bc110ebec8dac6c1b7514d64662570e6f/librt-0.9.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:a361c9434a64d70a7dbb771d1de302c0cc9f13c0bffe1cf7e642152814b35265", size = 232122, upload-time = "2026-04-09T16:05:08.386Z" },
{ url = "https://files.pythonhosted.org/packages/e9/83/12864700a1b6a8be458cf5d05db209b0d8e94ae281e7ec261dbe616597b4/librt-0.9.0-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:dd2c7e082b0b92e1baa4da28163a808672485617bc855cc22a2fd06978fa9084", size = 225045, upload-time = "2026-04-09T16:05:09.707Z" },
{ url = "https://files.pythonhosted.org/packages/fd/1b/845d339c29dc7dbc87a2e992a1ba8d28d25d0e0372f9a0a2ecebde298186/librt-0.9.0-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:7e6274fd33fc5b2a14d41c9119629d3ff395849d8bcbc80cf637d9e8d2034da8", size = 227372, upload-time = "2026-04-09T16:05:10.942Z" },
{ url = "https://files.pythonhosted.org/packages/8d/fe/277985610269d926a64c606f761d58d3db67b956dbbf40024921e95e7fcb/librt-0.9.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:5093043afb226ecfa1400120d1ebd4442b4f99977783e4f4f7248879009b227f", size = 248224, upload-time = "2026-04-09T16:05:12.254Z" },
{ url = "https://files.pythonhosted.org/packages/92/1b/ee486d244b8de6b8b5dbaefabe6bfdd4a72e08f6353edf7d16d27114da8d/librt-0.9.0-cp312-cp312-win32.whl", hash = "sha256:9edcc35d1cae9fd5320171b1a838c7da8a5c968af31e82ecc3dff30b4be0957f", size = 55986, upload-time = "2026-04-09T16:05:13.529Z" },
{ url = "https://files.pythonhosted.org/packages/89/7a/ba1737012308c17dc6d5516143b5dce9a2c7ba3474afd54e11f44a4d1ef3/librt-0.9.0-cp312-cp312-win_amd64.whl", hash = "sha256:3cc2917258e131ae5f958a4d872e07555b51cb7466a43433218061c74ef33745", size = 63260, upload-time = "2026-04-09T16:05:14.68Z" },
{ url = "https://files.pythonhosted.org/packages/36/e4/01752c113da15127f18f7bf11142f5640038f062407a611c059d0036c6aa/librt-0.9.0-cp312-cp312-win_arm64.whl", hash = "sha256:90e6d5420fc8a300518d4d2288154ff45005e920425c22cbbfe8330f3f754bd9", size = 53694, upload-time = "2026-04-09T16:05:16.095Z" },
{ url = "https://files.pythonhosted.org/packages/5f/d7/1b3e26fffde1452d82f5666164858a81c26ebe808e7ae8c9c88628981540/librt-0.9.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f29b68cd9714531672db62cc54f6e8ff981900f824d13fa0e00749189e13778e", size = 68367, upload-time = "2026-04-09T16:05:17.243Z" },
{ url = "https://files.pythonhosted.org/packages/a5/5b/c61b043ad2e091fbe1f2d35d14795e545d0b56b03edaa390fa1dcee3d160/librt-0.9.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:7d5c8a5929ac325729f6119802070b561f4db793dffc45e9ac750992a4ed4d22", size = 70595, upload-time = "2026-04-09T16:05:18.471Z" },
{ url = "https://files.pythonhosted.org/packages/a3/22/2448471196d8a73370aa2f23445455dc42712c21404081fcd7a03b9e0749/librt-0.9.0-cp313-cp313-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:756775d25ec8345b837ab52effee3ad2f3b2dfd6bbee3e3f029c517bd5d8f05a", size = 204354, upload-time = "2026-04-09T16:05:19.593Z" },
{ url = "https://files.pythonhosted.org/packages/ac/5e/39fc4b153c78cfd2c8a2dcb32700f2d41d2312aa1050513183be4540930d/librt-0.9.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:2b8f5d00b49818f4e2b1667db994488b045835e0ac16fe2f924f3871bd2b8ac5", size = 216238, upload-time = "2026-04-09T16:05:20.868Z" },
{ url = "https://files.pythonhosted.org/packages/d7/42/bc2d02d0fa7badfa63aa8d6dcd8793a9f7ef5a94396801684a51ed8d8287/librt-0.9.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:c81aef782380f0f13ead670aae01825eb653b44b046aa0e5ebbb79f76ed4aa11", size = 230589, upload-time = "2026-04-09T16:05:22.305Z" },
{ url = "https://files.pythonhosted.org/packages/c8/7b/e2d95cc513866373692aa5edf98080d5602dd07cabfb9e5d2f70df2f25f7/librt-0.9.0-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:66b58fed90a545328e80d575467244de3741e088c1af928f0b489ebec3ef3858", size = 224610, upload-time = "2026-04-09T16:05:23.647Z" },
{ url = "https://files.pythonhosted.org/packages/31/d5/6cec4607e998eaba57564d06a1295c21b0a0c8de76e4e74d699e627bd98c/librt-0.9.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:e78fb7419e07d98c2af4b8567b72b3eaf8cb05caad642e9963465569c8b2d87e", size = 232558, upload-time = "2026-04-09T16:05:25.025Z" },
{ url = "https://files.pythonhosted.org/packages/95/8c/27f1d8d3aaf079d3eb26439bf0b32f1482340c3552e324f7db9dca858671/librt-0.9.0-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:2c3786f0f4490a5cd87f1ed6cefae833ad6b1060d52044ce0434a2e85893afd0", size = 225521, upload-time = "2026-04-09T16:05:26.311Z" },
{ url = "https://files.pythonhosted.org/packages/6b/d8/1e0d43b1c329b416017619469b3c3801a25a6a4ef4a1c68332aeaa6f72ca/librt-0.9.0-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:8494cfc61e03542f2d381e71804990b3931175a29b9278fdb4a5459948778dc2", size = 227789, upload-time = "2026-04-09T16:05:27.624Z" },
{ url = "https://files.pythonhosted.org/packages/2c/b4/d3d842e88610fcd4c8eec7067b0c23ef2d7d3bff31496eded6a83b0f99be/librt-0.9.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:07cf11f769831186eeac424376e6189f20ace4f7263e2134bdb9757340d84d4d", size = 248616, upload-time = "2026-04-09T16:05:29.181Z" },
{ url = "https://files.pythonhosted.org/packages/ec/28/527df8ad0d1eb6c8bdfa82fc190f1f7c4cca5a1b6d7b36aeabf95b52d74d/librt-0.9.0-cp313-cp313-win32.whl", hash = "sha256:850d6d03177e52700af605fd60db7f37dcb89782049a149674d1a9649c2138fd", size = 56039, upload-time = "2026-04-09T16:05:30.709Z" },
{ url = "https://files.pythonhosted.org/packages/f3/a7/413652ad0d92273ee5e30c000fc494b361171177c83e57c060ecd3c21538/librt-0.9.0-cp313-cp313-win_amd64.whl", hash = "sha256:a5af136bfba820d592f86c67affcef9b3ff4d4360ac3255e341e964489b48519", size = 63264, upload-time = "2026-04-09T16:05:31.881Z" },
{ url = "https://files.pythonhosted.org/packages/a4/0a/92c244309b774e290ddb15e93363846ae7aa753d9586b8aad511c5e6145b/librt-0.9.0-cp313-cp313-win_arm64.whl", hash = "sha256:4c4d0440a3a8e31d962340c3e1cc3fc9ee7febd34c8d8f770d06adb947779ea5", size = 53728, upload-time = "2026-04-09T16:05:33.31Z" },
{ url = "https://files.pythonhosted.org/packages/cd/c1/184e539543f06ea2912f4b92a5ffaede4f9b392689e3f00acbf8134bee92/librt-0.9.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:3f05d145df35dca5056a8bc3838e940efebd893a54b3e19b2dda39ceaa299bcb", size = 67830, upload-time = "2026-04-09T16:05:34.517Z" },
{ url = "https://files.pythonhosted.org/packages/f3/ad/23399bdcb7afca819acacdef31b37ee59de261bd66b503a7995c03c4b0dc/librt-0.9.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:1c587494461ebd42229d0f1739f3aa34237dd9980623ecf1be8d3bcba79f4499", size = 70280, upload-time = "2026-04-09T16:05:35.649Z" },
{ url = "https://files.pythonhosted.org/packages/9f/0b/4542dc5a2b8772dbf92cafb9194701230157e73c14b017b6961a23598b03/librt-0.9.0-cp314-cp314-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:b0a2040f801406b93657a70b72fa12311063a319fee72ce98e1524da7200171f", size = 201925, upload-time = "2026-04-09T16:05:36.739Z" },
{ url = "https://files.pythonhosted.org/packages/31/d4/8ee7358b08fd0cfce051ef96695380f09b3c2c11b77c9bfbc367c921cce5/librt-0.9.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f38bc489037eca88d6ebefc9c4d41a4e07c8e8b4de5188a9e6d290273ad7ebb1", size = 212381, upload-time = "2026-04-09T16:05:38.043Z" },
{ url = "https://files.pythonhosted.org/packages/f2/94/a2025fe442abedf8b038038dab3dba942009ad42b38ea064a1a9e6094241/librt-0.9.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:f3fd278f5e6bf7c75ccd6d12344eb686cc020712683363b66f46ac79d37c799f", size = 227065, upload-time = "2026-04-09T16:05:39.394Z" },
{ url = "https://files.pythonhosted.org/packages/7c/e9/b9fcf6afa909f957cfbbf918802f9dada1bd5d3c1da43d722fd6a310dc3f/librt-0.9.0-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:fcbdf2a9ca24e87bbebb47f1fe34e531ef06f104f98c9ccfc953a3f3344c567a", size = 221333, upload-time = "2026-04-09T16:05:40.999Z" },
{ url = "https://files.pythonhosted.org/packages/ac/7c/ba54cd6aa6a3c8cd12757a6870e0c79a64b1e6327f5248dcff98423f4d43/librt-0.9.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:e306d956cfa027fe041585f02a1602c32bfa6bb8ebea4899d373383295a6c62f", size = 229051, upload-time = "2026-04-09T16:05:42.605Z" },
{ url = "https://files.pythonhosted.org/packages/4b/4b/8cfdbad314c8677a0148bf0b70591d6d18587f9884d930276098a235461b/librt-0.9.0-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:465814ab157986acb9dfa5ccd7df944be5eefc0d08d31ec6e8d88bc71251d845", size = 222492, upload-time = "2026-04-09T16:05:43.842Z" },
{ url = "https://files.pythonhosted.org/packages/1f/d1/2eda69563a1a88706808decdce035e4b32755dbfbb0d05e1a65db9547ed1/librt-0.9.0-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:703f4ae36d6240bfe24f542bac784c7e4194ec49c3ba5a994d02891649e2d85b", size = 223849, upload-time = "2026-04-09T16:05:45.054Z" },
{ url = "https://files.pythonhosted.org/packages/04/44/b2ed37df6be5b3d42cfe36318e0598e80843d5c6308dd63d0bf4e0ce5028/librt-0.9.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:3be322a15ee5e70b93b7a59cfd074614f22cc8c9ff18bd27f474e79137ea8d3b", size = 245001, upload-time = "2026-04-09T16:05:46.34Z" },
{ url = "https://files.pythonhosted.org/packages/47/e7/617e412426df89169dd2a9ed0cc8752d5763336252c65dbf945199915119/librt-0.9.0-cp314-cp314-win32.whl", hash = "sha256:b8da9f8035bb417770b1e1610526d87ad4fc58a2804dc4d79c53f6d2cf5a6eb9", size = 51799, upload-time = "2026-04-09T16:05:47.738Z" },
{ url = "https://files.pythonhosted.org/packages/24/ed/c22ca4db0ca3cbc285e4d9206108746beda561a9792289c3c31281d7e9df/librt-0.9.0-cp314-cp314-win_amd64.whl", hash = "sha256:b8bd70d5d816566a580d193326912f4a76ec2d28a97dc4cd4cc831c0af8e330e", size = 59165, upload-time = "2026-04-09T16:05:49.198Z" },
{ url = "https://files.pythonhosted.org/packages/24/56/875398fafa4cbc8f15b89366fc3287304ddd3314d861f182a4b87595ace0/librt-0.9.0-cp314-cp314-win_arm64.whl", hash = "sha256:fc5758e2b7a56532dc33e3c544d78cbaa9ecf0a0f2a2da2df882c1d6b99a317f", size = 49292, upload-time = "2026-04-09T16:05:50.362Z" },
{ url = "https://files.pythonhosted.org/packages/4c/61/bc448ecbf9b2d69c5cff88fe41496b19ab2a1cbda0065e47d4d0d51c0867/librt-0.9.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:f24b90b0e0c8cc9491fb1693ae91fe17cb7963153a1946395acdbdd5818429a4", size = 70175, upload-time = "2026-04-09T16:05:51.564Z" },
{ url = "https://files.pythonhosted.org/packages/60/f2/c47bb71069a73e2f04e70acbd196c1e5cc411578ac99039a224b98920fd4/librt-0.9.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:3fe56e80badb66fdcde06bef81bbaa5bfcf6fbd7aefb86222d9e369c38c6b228", size = 72951, upload-time = "2026-04-09T16:05:52.699Z" },
{ url = "https://files.pythonhosted.org/packages/29/19/0549df59060631732df758e8886d92088da5fdbedb35b80e4643664e8412/librt-0.9.0-cp314-cp314t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:527b5b820b47a09e09829051452bb0d1dd2122261254e2a6f674d12f1d793d54", size = 225864, upload-time = "2026-04-09T16:05:53.895Z" },
{ url = "https://files.pythonhosted.org/packages/9d/f8/3b144396d302ac08e50f89e64452c38db84bc7b23f6c60479c5d3abd303c/librt-0.9.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:7d429bdd4ac0ab17c8e4a8af0ed2a7440b16eba474909ab357131018fe8c7e71", size = 241155, upload-time = "2026-04-09T16:05:55.191Z" },
{ url = "https://files.pythonhosted.org/packages/7a/ce/ee67ec14581de4043e61d05786d2aed6c9b5338816b7859bcf07455c6a9f/librt-0.9.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:7202bdcac47d3a708271c4304a474a8605a4a9a4a709e954bf2d3241140aa938", size = 252235, upload-time = "2026-04-09T16:05:56.549Z" },
{ url = "https://files.pythonhosted.org/packages/8a/fa/0ead15daa2b293a54101550b08d4bafe387b7d4a9fc6d2b985602bae69b6/librt-0.9.0-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:c0d620e74897f8c2613b3c4e2e9c1e422eb46d2ddd07df540784d44117836af3", size = 244963, upload-time = "2026-04-09T16:05:57.858Z" },
{ url = "https://files.pythonhosted.org/packages/29/68/9fbf9a9aa704ba87689e40017e720aced8d9a4d2b46b82451d8142f91ec9/librt-0.9.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:d69fc39e627908f4c03297d5a88d9284b73f4d90b424461e32e8c2485e21c283", size = 257364, upload-time = "2026-04-09T16:05:59.686Z" },
{ url = "https://files.pythonhosted.org/packages/1a/8d/9d60869f1b6716c762e45f66ed945b1e5dd649f7377684c3b176ae424648/librt-0.9.0-cp314-cp314t-musllinux_1_2_i686.whl", hash = "sha256:c2640e23d2b7c98796f123ffd95cf2022c7777aa8a4a3b98b36c570d37e85eee", size = 247661, upload-time = "2026-04-09T16:06:00.938Z" },
{ url = "https://files.pythonhosted.org/packages/70/ff/a5c365093962310bfdb4f6af256f191085078ffb529b3f0cbebb5b33ebe2/librt-0.9.0-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:451daa98463b7695b0a30aa56bf637831ea559e7b8101ac2ef6382e8eb15e29c", size = 248238, upload-time = "2026-04-09T16:06:02.537Z" },
{ url = "https://files.pythonhosted.org/packages/a0/3c/2d34365177f412c9e19c0a29f969d70f5343f27634b76b765a54d8b27705/librt-0.9.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:928bd06eca2c2bbf4349e5b817f837509b0604342e65a502de1d50a7570afd15", size = 269457, upload-time = "2026-04-09T16:06:03.833Z" },
{ url = "https://files.pythonhosted.org/packages/bc/cd/de45b239ea3bdf626f982a00c14bfcf2e12d261c510ba7db62c5969a27cd/librt-0.9.0-cp314-cp314t-win32.whl", hash = "sha256:a9c63e04d003bc0fb6a03b348018b9a3002f98268200e22cc80f146beac5dc40", size = 52453, upload-time = "2026-04-09T16:06:05.229Z" },
{ url = "https://files.pythonhosted.org/packages/7f/f9/bfb32ae428aa75c0c533915622176f0a17d6da7b72b5a3c6363685914f70/librt-0.9.0-cp314-cp314t-win_amd64.whl", hash = "sha256:f162af66a2ed3f7d1d161a82ca584efd15acd9c1cff190a373458c32f7d42118", size = 60044, upload-time = "2026-04-09T16:06:06.398Z" },
{ url = "https://files.pythonhosted.org/packages/aa/47/7d70414bcdbb3bc1f458a8d10558f00bbfdb24e5a11740fc8197e12c3255/librt-0.9.0-cp314-cp314t-win_arm64.whl", hash = "sha256:a4b25c6c25cac5d0d9d6d6da855195b254e0021e513e0249f0e3b444dc6e0e61", size = 50009, upload-time = "2026-04-09T16:06:07.995Z" },
]
[[package]]
name = "mcp"
version = "1.27.0"
wheels = [
{ url = "https://files.pythonhosted.org/packages/81/f2/08ace4142eb281c12701fc3b93a10795e4d4dc7f753911d836675050f886/msgpack-1.1.2-cp314-cp314t-win_arm64.whl", hash = "sha256:d99ef64f349d5ec3293688e91486c5fdb925ed03807f64d98d205d2713c60b46", size = 70868, upload-time = "2025-10-08T09:15:44.959Z" },
]
[[package]]
name = "mypy"
version = "1.20.2"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "librt", marker = "platform_python_implementation != 'PyPy'" },
{ name = "mypy-extensions" },
{ name = "pathspec" },
{ name = "typing-extensions" },
]
sdist = { url = "https://files.pythonhosted.org/packages/04/af/e3d4b3e9ec91a0ff9aabfdb38692952acf49bbb899c2e4c29acb3a6da3ae/mypy-1.20.2.tar.gz", hash = "sha256:e8222c26daaafd9e8626dec58ae36029f82585890589576f769a650dd20fd665", size = 3817349, upload-time = "2026-04-21T17:12:28.473Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/1f/4d/9ebeae211caccbdaddde7ed5e31dfcf57faac66be9b11deb1dc6526c8078/mypy-1.20.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:4077797a273e56e8843d001e9dfe4ba10e33323d6ade647ff260e5cd97d9758c", size = 14371307, upload-time = "2026-04-21T17:08:56.442Z" },
{ url = "https://files.pythonhosted.org/packages/95/d7/93473d34b61f04fac1aecc01368485c89c5c4af7a4b9a0cab5d77d04b63f/mypy-1.20.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:cdecf62abcc4292500d7858aeae87a1f8f1150f4c4dd08fb0b336ee79b2a6df3", size = 13258917, upload-time = "2026-04-21T17:05:50.978Z" },
{ url = "https://files.pythonhosted.org/packages/e2/30/3dd903e8bafb7b5f7bf87fcd58f8382086dea2aa19f0a7b357f21f63071b/mypy-1.20.2-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c566c3a88b6ece59b3d70f65bedef17304f48eb52ff040a6a18214e1917b3254", size = 13700516, upload-time = "2026-04-21T17:11:33.161Z" },
{ url = "https://files.pythonhosted.org/packages/07/05/c61a140aba4c729ac7bc99ae26fc627c78a6e08f5b9dd319244ea71a3d7e/mypy-1.20.2-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0deb80d062b2479f2c87ae568f89845afc71d11bc41b04179e58165fd9f31e98", size = 14562889, upload-time = "2026-04-21T17:05:27.674Z" },
{ url = "https://files.pythonhosted.org/packages/fd/87/da78243742ffa8a36d98c3010f0d829f93d5da4e6786f1a1a6f2ad616502/mypy-1.20.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:bba9ad231e92a3e424b3e56b65aa17704993425bba97e302c832f9466bb85bac", size = 14803844, upload-time = "2026-04-21T17:10:06.2Z" },
{ url = "https://files.pythonhosted.org/packages/37/52/10a1ddf91b40f843943a3c6db51e2df59c9e237f29d355e95eaab427461f/mypy-1.20.2-cp311-cp311-win_amd64.whl", hash = "sha256:baf593f2765fa3a6b1ef95807dbaa3d25b594f6a52adcc506a6b9cb115e1be67", size = 10846300, upload-time = "2026-04-21T17:12:23.886Z" },
{ url = "https://files.pythonhosted.org/packages/20/02/f9a4415b664c53bd34d6709be59da303abcae986dc4ac847b402edb6fa1e/mypy-1.20.2-cp311-cp311-win_arm64.whl", hash = "sha256:20175a1c0f49863946ec20b7f63255768058ac4f07d2b9ded6a6b46cfb5a9100", size = 9779498, upload-time = "2026-04-21T17:09:23.695Z" },
{ url = "https://files.pythonhosted.org/packages/71/4e/7560e4528db9e9b147e4c0f22660466bf30a0a1fe3d63d1b9d3b0fd354ee/mypy-1.20.2-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:4dbfcf869f6b0517f70cf0030ba6ea1d6645e132337a7d5204a18d8d5636c02b", size = 14539393, upload-time = "2026-04-21T17:07:12.52Z" },
{ url = "https://files.pythonhosted.org/packages/32/d9/34a5efed8124f5a9234f55ac6a4ced4201e2c5b81e1109c49ad23190ec8c/mypy-1.20.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:4b6481b228d072315b053210b01ac320e1be243dc17f9e5887ef167f23f5fae4", size = 13361642, upload-time = "2026-04-21T17:06:53.742Z" },
{ url = "https://files.pythonhosted.org/packages/d1/14/eb377acf78c03c92d566a1510cda8137348215b5335085ef662ab82ecd3a/mypy-1.20.2-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:34397cdced6b90b836e38182076049fdb41424322e0b0728c946b0939ebdf9f6", size = 13740347, upload-time = "2026-04-21T17:12:04.73Z" },
{ url = "https://files.pythonhosted.org/packages/b9/94/7e4634a32b641aa1c112422eed1bbece61ee16205f674190e8b536f884de/mypy-1.20.2-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:a5da6976f20cae27059ea8d0c86e7cef3de720e04c4bb9ee18e3690fdb792066", size = 14734042, upload-time = "2026-04-21T17:07:43.16Z" },
{ url = "https://files.pythonhosted.org/packages/7a/f3/f7e62395cb7f434541b4491a01149a4439e28ace4c0c632bbf5431e92d1f/mypy-1.20.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:56908d7e08318d39f85b1f0c6cfd47b0cac1a130da677630dac0de3e0623e102", size = 14964958, upload-time = "2026-04-21T17:11:00.665Z" },
{ url = "https://files.pythonhosted.org/packages/3e/0d/47e3c3a0ec2a876e35aeac365df3cac7776c36bbd4ed18cc521e1b9d255b/mypy-1.20.2-cp312-cp312-win_amd64.whl", hash = "sha256:d52ad8d78522da1d308789df651ee5379088e77c76cb1994858d40a426b343b9", size = 10911340, upload-time = "2026-04-21T17:10:49.179Z" },
{ url = "https://files.pythonhosted.org/packages/d6/b2/6c852d72e0ea8b01f49da817fb52539993cde327e7d010e0103dc12d0dac/mypy-1.20.2-cp312-cp312-win_arm64.whl", hash = "sha256:785b08db19c9f214dc37d65f7c165d19a30fcecb48abfa30f31b01b5acaabb58", size = 9833947, upload-time = "2026-04-21T17:09:05.267Z" },
{ url = "https://files.pythonhosted.org/packages/5b/c4/b93812d3a192c9bcf5df405bd2f30277cd0e48106a14d1023c7f6ed6e39b/mypy-1.20.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:edfbfca868cdd6bd8d974a60f8a3682f5565d3f5c99b327640cedd24c4264026", size = 14524670, upload-time = "2026-04-21T17:10:30.737Z" },
{ url = "https://files.pythonhosted.org/packages/f3/47/42c122501bff18eaf1e8f457f5c017933452d8acdc52918a9f59f6812955/mypy-1.20.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:e2877a02380adfcdbc69071a0f74d6e9dbbf593c0dc9d174e1f223ffd5281943", size = 13336218, upload-time = "2026-04-21T17:08:44.069Z" },
{ url = "https://files.pythonhosted.org/packages/92/8f/75bbc92f41725fbd585fb17b440b1119b576105df1013622983e18640a93/mypy-1.20.2-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:7488448de6007cd5177c6cea0517ac33b4c0f5ee9b5e9f2be51ce75511a85517", size = 13724906, upload-time = "2026-04-21T17:08:01.02Z" },
{ url = "https://files.pythonhosted.org/packages/a1/32/4c49da27a606167391ff0c39aa955707a00edc500572e562f7c36c08a71f/mypy-1.20.2-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bb9c2fa06887e21d6a3a868762acb82aec34e2c6fd0174064f27c93ede68ad15", size = 14726046, upload-time = "2026-04-21T17:11:22.354Z" },
{ url = "https://files.pythonhosted.org/packages/7f/fc/4e354a1bd70216359deb0c9c54847ee6b32ef78dfb09f5131ff99b494078/mypy-1.20.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:9d56a78b646f2e3daa865bc70cd5ec5a46c50045801ca8ff17a0c43abc97e3ee", size = 14955587, upload-time = "2026-04-21T17:12:16.033Z" },
{ url = "https://files.pythonhosted.org/packages/62/b2/c0f2056e9eb8f08c62cafd9715e4584b89132bdc832fcf85d27d07b5f3e5/mypy-1.20.2-cp313-cp313-win_amd64.whl", hash = "sha256:2a4102b03bb7481d9a91a6da8d174740c9c8c4401024684b9ca3b7cc5e49852f", size = 10922681, upload-time = "2026-04-21T17:06:35.842Z" },
{ url = "https://files.pythonhosted.org/packages/e5/14/065e333721f05de8ef683d0aa804c23026bcc287446b61cac657b902ccac/mypy-1.20.2-cp313-cp313-win_arm64.whl", hash = "sha256:a95a9248b0c6fd933a442c03c3b113c3b61320086b88e2c444676d3fd1ca3330", size = 9830560, upload-time = "2026-04-21T17:07:51.023Z" },
{ url = "https://files.pythonhosted.org/packages/ae/d1/b4ec96b0ecc620a4443570c6e95c867903428cfcde4206518eafdd5880c3/mypy-1.20.2-cp314-cp314-macosx_10_15_x86_64.whl", hash = "sha256:419413398fe250aae057fd2fe50166b61077083c9b82754c341cf4fd73038f30", size = 14524561, upload-time = "2026-04-21T17:06:27.325Z" },
{ url = "https://files.pythonhosted.org/packages/3a/63/d2c2ff4fa66bc49477d32dfa26e8a167ba803ea6a69c5efb416036909d30/mypy-1.20.2-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:e73c07f23009962885c197ccb9b41356a30cc0e5a1d0c2ea8fd8fb1362d7f924", size = 13363883, upload-time = "2026-04-21T17:11:11.239Z" },
{ url = "https://files.pythonhosted.org/packages/2a/56/983916806bf4eddeaaa2c9230903c3669c6718552a921154e1c5182c701f/mypy-1.20.2-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:0c64e5973df366b747646fc98da921f9d6eba9716d57d1db94a83c026a08e0fb", size = 13742945, upload-time = "2026-04-21T17:08:34.181Z" },
{ url = "https://files.pythonhosted.org/packages/19/65/0cd9285ab010ee8214c83d67c6b49417c40d86ce46f1aa109457b5a9b8d7/mypy-1.20.2-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:5a65aa591af023864fd08a97da9974e919452cfe19cb146c8a5dc692626445dc", size = 14706163, upload-time = "2026-04-21T17:05:15.51Z" },
{ url = "https://files.pythonhosted.org/packages/94/97/48ff3b297cafcc94d185243a9190836fb1b01c1b0918fff64e941e973cc9/mypy-1.20.2-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:4fef51b01e638974a6e69885687e9bd40c8d1e09a6cd291cca0619625cf1f558", size = 14938677, upload-time = "2026-04-21T17:05:39.562Z" },
{ url = "https://files.pythonhosted.org/packages/fd/a1/1b4233d255bdd0b38a1f284feeb1c143ca508c19184964e22f8d837ec851/mypy-1.20.2-cp314-cp314-win_amd64.whl", hash = "sha256:913485a03f1bcf5d279409a9d2b9ed565c151f61c09f29991e5faa14033da4c8", size = 11089322, upload-time = "2026-04-21T17:06:44.29Z" },
{ url = "https://files.pythonhosted.org/packages/78/c2/ce7ee2ba36aeb954ba50f18fa25d9c1188578654b97d02a66a15b6f09531/mypy-1.20.2-cp314-cp314-win_arm64.whl", hash = "sha256:c3bae4f855d965b5453784300c12ffc63a548304ac7f99e55d4dc7c898673aa3", size = 10017775, upload-time = "2026-04-21T17:07:20.732Z" },
{ url = "https://files.pythonhosted.org/packages/4e/a1/9d93a7d0b5859af0ead82b4888b46df6c8797e1bc5e1e262a08518c6d48e/mypy-1.20.2-cp314-cp314t-macosx_10_15_x86_64.whl", hash = "sha256:2de3dcea53babc1c3237a19002bc3d228ce1833278f093b8d619e06e7cc79609", size = 15549002, upload-time = "2026-04-21T17:08:23.107Z" },
{ url = "https://files.pythonhosted.org/packages/00/d2/09a6a10ee1bf0008f6c144d9676f2ca6a12512151b4e0ad0ff6c4fac5337/mypy-1.20.2-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:52b176444e2e5054dfcbcb8c75b0b719865c96247b37407184bbfca5c353f2c2", size = 14401942, upload-time = "2026-04-21T17:07:31.837Z" },
{ url = "https://files.pythonhosted.org/packages/57/da/9594b75c3c019e805250bed3583bdf4443ff9e6ef08f97e39ae308cb06f2/mypy-1.20.2-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:688c3312e5dadb573a2c69c82af3a298d43ecf9e6d264e0f95df960b5f6ac19c", size = 15041649, upload-time = "2026-04-21T17:09:34.653Z" },
{ url = "https://files.pythonhosted.org/packages/97/77/f75a65c278e6e8eba2071f7f5a90481891053ecc39878cc444634d892abe/mypy-1.20.2-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:29752dbbf8cc53f89f6ac096d363314333045c257c9c75cbd189ca2de0455744", size = 15864588, upload-time = "2026-04-21T17:11:44.936Z" },
{ url = "https://files.pythonhosted.org/packages/d7/46/1a4e1c66e96c1a3246ddf5403d122ac9b0a8d2b7e65730b9d6533ba7a6d3/mypy-1.20.2-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:803203d2b6ea644982c644895c2f78b28d0e208bba7b27d9b921e0ec5eb207c6", size = 16093956, upload-time = "2026-04-21T17:10:17.683Z" },
{ url = "https://files.pythonhosted.org/packages/5a/2c/78a8851264dec38cd736ca5b8bc9380674df0dd0be7792f538916157716c/mypy-1.20.2-cp314-cp314t-win_amd64.whl", hash = "sha256:9bcb8aa397ff0093c824182fd76a935a9ba7ad097fcbef80ae89bf6c1731d8ec", size = 12568661, upload-time = "2026-04-21T17:11:54.473Z" },
{ url = "https://files.pythonhosted.org/packages/83/01/cd7318aa03493322ce275a0e14f4f52b8896335e4e79d4fb8153a7ad2b77/mypy-1.20.2-cp314-cp314t-win_arm64.whl", hash = "sha256:e061b58443f1736f8a37c48978d7ab581636d6ab03e3d4f99e3fa90463bb9382", size = 10389240, upload-time = "2026-04-21T17:09:42.719Z" },
{ url = "https://files.pythonhosted.org/packages/28/9a/f23c163e25b11074188251b0b5a0342625fc1cdb6af604757174fa9acc9b/mypy-1.20.2-py3-none-any.whl", hash = "sha256:a94c5a76ab46c5e6257c7972b6c8cff0574201ca7dc05647e33e795d78680563", size = 2637314, upload-time = "2026-04-21T17:05:54.5Z" },
]
[[package]]
name = "mypy-extensions"
version = "1.1.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/a2/6e/371856a3fb9d31ca8dac321cda606860fa4548858c0cc45d9d1d4ca2628b/mypy_extensions-1.1.0.tar.gz", hash = "sha256:52e68efc3284861e772bbcd66823fde5ae21fd2fdb51c62a211403730b916558", size = 6343, upload-time = "2025-04-22T14:54:24.164Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/79/7b/2c79738432f5c924bef5071f933bcc9efd0473bac3b4aa584a6f7c1c8df8/mypy_extensions-1.1.0-py3-none-any.whl", hash = "sha256:1be4cccdb0f2482337c4743e60421de3a356cd97508abadd57d47403e94f5505", size = 4963, upload-time = "2025-04-22T14:54:22.983Z" },
]
[[package]]
name = "numpy"
version = "2.4.4"
wheels = [
     { url = "https://files.pythonhosted.org/packages/aa/0f/c8b64d9b54ea631fcad4e9e3c8dbe8c11bb32a623be94f22974c88e71eaf/parsimonious-0.10.0-py3-none-any.whl", hash = "sha256:982ab435fabe86519b57f6b35610aa4e4e977e9f02a14353edf4bbc75369fc0f", size = 48427, upload-time = "2022-09-03T17:01:13.814Z" },
 ]
 
+[[package]]
+name = "pathspec"
+version = "1.1.1"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/5a/82/42f767fc1c1143d6fd36efb827202a2d997a375e160a71eb2888a925aac1/pathspec-1.1.1.tar.gz", hash = "sha256:17db5ecd524104a120e173814c90367a96a98d07c45b2e10c2f3919fff91bf5a", size = 135180, upload-time = "2026-04-27T01:46:08.907Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/f1/d9/7fb5aa316bc299258e68c73ba3bddbc499654a07f151cba08f6153988714/pathspec-1.1.1-py3-none-any.whl", hash = "sha256:a00ce642f577bf7f473932318056212bc4f8bfdf53128c78bbd5af0b9b20b189", size = 57328, upload-time = "2026-04-27T01:46:07.06Z" },
+]
+
 [[package]]
 name = "pluggy"
 version = "1.6.0"
@@ -1988,27 +2134,27 @@ wheels = [
 
 [[package]]
 name = "ruff"
-version = "0.15.12"
+version = "0.5.7"
 source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/99/43/3291f1cc9106f4c63bdce7a8d0df5047fe8422a75b091c16b5e9355e0b11/ruff-0.15.12.tar.gz", hash = "sha256:ecea26adb26b4232c0c2ca19ccbc0083a68344180bba2a600605538ce51a40a6", size = 4643852, upload-time = "2026-04-24T18:17:14.305Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/bf/2b/69e5e412f9d390adbdbcbf4f64d6914fa61b44b08839a6584655014fc524/ruff-0.5.7.tar.gz", hash = "sha256:8dfc0a458797f5d9fb622dd0efc52d796f23f0a1493a9527f4e49a550ae9a7e5", size = 2449817, upload-time = "2024-08-08T15:43:07.467Z" }
 wheels = [
-    { url = "https://files.pythonhosted.org/packages/c3/6e/e78ffb61d4686f3d96ba3df2c801161843746dcbcbb17a1e927d4829312b/ruff-0.15.12-py3-none-linux_armv6l.whl", hash = "sha256:f86f176e188e94d6bdbc09f09bfd9dc729059ad93d0e7390b5a73efe19f8861c", size = 10640713, upload-time = "2026-04-24T18:17:22.841Z" },
-    { url = "https://files.pythonhosted.org/packages/ae/08/a317bc231fb9e7b93e4ef3089501e51922ff88d6936ce5cf870c4fe55419/ruff-0.15.12-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:e3bcd123364c3770b8e1b7baaf343cc99a35f197c5c6e8af79015c666c423a6c", size = 11069267, upload-time = "2026-04-24T18:17:30.105Z" },
-    { url = "https://files.pythonhosted.org/packages/aa/a4/f828e9718d3dce1f5f11c39c4f65afd32783c8b2aebb2e3d259e492c47bd/ruff-0.15.12-py3-none-macosx_11_0_arm64.whl", hash = "sha256:fe87510d000220aa1ed530d4448a7c696a0cae1213e5ec30e5874287b66557b5", size = 10397182, upload-time = "2026-04-24T18:17:07.177Z" },
-    { url = "https://files.pythonhosted.org/packages/71/e0/3310fc6d1b5e1fdea22bf3b1b807c7e187b581021b0d7d4514cccdb5fb71/ruff-0.15.12-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:84a1630093121375a3e2a95b4a6dc7b59e2b4ee76216e32d81aae550a832d002", size = 10758012, upload-time = "2026-04-24T18:16:55.759Z" },
-    { url = "https://files.pythonhosted.org/packages/11/c1/a606911aee04c324ddaa883ae418f3569792fd3c4a10c50e0dd0a2311e1e/ruff-0.15.12-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:fb129f40f114f089ebe0ca56c0d251cf2061b17651d464bb6478dc01e69f11f5", size = 10447479, upload-time = "2026-04-24T18:16:51.677Z" },
-    { url = "https://files.pythonhosted.org/packages/9d/68/4201e8444f0894f21ab4aeeaee68aa4f10b51613514a20d80bd628d57e88/ruff-0.15.12-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b0c862b172d695db7598426b8af465e7e9ac00a3ea2a3630ee67eb82e366aaa6", size = 11234040, upload-time = "2026-04-24T18:17:16.529Z" },
-    { url = "https://files.pythonhosted.org/packages/34/ff/8a6d6cf4ccc23fd67060874e832c18919d1557a0611ebef03fdb01fff11e/ruff-0.15.12-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:2849ea9f3484c3aca43a82f484210370319e7170df4dfe4843395ddf6c57bc33", size = 12087377, upload-time = "2026-04-24T18:17:04.944Z" },
-    { url = "https://files.pythonhosted.org/packages/85/f6/c669cf73f5152f623d34e69866a46d5e6185816b19fcd5b6dd8a2d299922/ruff-0.15.12-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9e77c7e51c07fe396826d5969a5b846d9cd4c402535835fb6e21ce8b28fef847", size = 11367784, upload-time = "2026-04-24T18:17:25.409Z" },
-    { url = "https://files.pythonhosted.org/packages/e8/39/c61d193b8a1daaa8977f7dea9e8d8ba866e02ea7b65d32f6861693aa4c12/ruff-0.15.12-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:83b2f4f2f3b1026b5fb449b467d9264bf22067b600f7b6f41fc5958909f449d0", size = 11344088, upload-time = "2026-04-24T18:17:12.258Z" },
-    { url = "https://files.pythonhosted.org/packages/c2/8d/49afab3645e31e12c590acb6d3b5b69d7aab5b81926dbaf7461f9441f37a/ruff-0.15.12-py3-none-manylinux_2_31_riscv64.whl", hash = "sha256:9ba3b8f1afd7e2e43d8943e55f249e13f9682fde09711644a6e7290eb4f3e339", size = 11271770, upload-time = "2026-04-24T18:17:02.457Z" },
-    { url = "https://files.pythonhosted.org/packages/46/06/33f41fe94403e2b755481cdfb9b7ef3e4e0ed031c4581124658d935d52b4/ruff-0.15.12-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:e852ba9fdc890655e1d78f2df1499efbe0e54126bd405362154a75e2bde159c5", size = 10719355, upload-time = "2026-04-24T18:17:27.648Z" },
-    { url = "https://files.pythonhosted.org/packages/0d/59/18aa4e014debbf559670e4048e39260a85c7fcee84acfd761ac01e7b8d35/ruff-0.15.12-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:dd8aed930da53780d22fc70bdf84452c843cf64f8cb4eb38984319c24c5cd5fd", size = 10462758, upload-time = "2026-04-24T18:17:32.347Z" },
-    { url = "https://files.pythonhosted.org/packages/25/e7/cc9f16fd0f3b5fddcbd7ec3d6ae30c8f3fde1047f32a4093a98d633c6570/ruff-0.15.12-py3-none-musllinux_1_2_i686.whl", hash = "sha256:01da3988d225628b709493d7dc67c3b9b12c0210016b08690ef9bd27970b262b", size = 10953498, upload-time = "2026-04-24T18:17:20.674Z" },
-    { url = "https://files.pythonhosted.org/packages/72/7a/a9ba7f98c7a575978698f4230c5e8cc54bbc761af34f560818f933dafa0c/ruff-0.15.12-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:9cae0f92bd5700d1213188b31cd3bdd2b315361296d10b96b8e2337d3d11f53e", size = 11447765, upload-time = "2026-04-24T18:17:09.755Z" },
-    { url = "https://files.pythonhosted.org/packages/ea/f9/0ae446942c846b8266059ad8a30702a35afae55f5cdc54c5adf8d7afdc27/ruff-0.15.12-py3-none-win32.whl", hash = "sha256:d0185894e038d7043ba8fd6aee7499ece6462dc0ea9f1e260c7451807c714c20", size = 10657277, upload-time = "2026-04-24T18:17:18.591Z" },
-    { url = "https://files.pythonhosted.org/packages/33/f1/9614e03e1cdcbf9437570b5400ced8a720b5db22b28d8e0f1bda429f660d/ruff-0.15.12-py3-none-win_amd64.whl", hash = "sha256:c87a162d61ab3adca47c03f7f717c68672edec7d1b5499e652331780fe74950d", size = 11837758, upload-time = "2026-04-24T18:17:00.113Z" },
-    { url = "https://files.pythonhosted.org/packages/c0/98/6beb4b351e472e5f4c4613f7c35a5290b8be2497e183825310c4c3a3984b/ruff-0.15.12-py3-none-win_arm64.whl", hash = "sha256:a538f7a82d061cee7be55542aca1d86d1393d55d81d4fcc314370f4340930d4f", size = 11120821, upload-time = "2026-04-24T18:16:57.979Z" },
+    { url = "https://files.pythonhosted.org/packages/6b/eb/06e06aaf96af30a68e83b357b037008c54a2ddcbad4f989535007c700394/ruff-0.5.7-py3-none-linux_armv6l.whl", hash = "sha256:548992d342fc404ee2e15a242cdbea4f8e39a52f2e7752d0e4cbe88d2d2f416a", size = 9570571, upload-time = "2024-08-08T15:41:56.537Z" },
+    { url = "https://files.pythonhosted.org/packages/a4/10/1be32aeaab8728f78f673e7a47dd813222364479b2d6573dbcf0085e83ea/ruff-0.5.7-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:00cc8872331055ee017c4f1071a8a31ca0809ccc0657da1d154a1d2abac5c0be", size = 8685138, upload-time = "2024-08-08T15:42:02.833Z" },
+    { url = "https://files.pythonhosted.org/packages/3d/1d/c218ce83beb4394ba04d05e9aa2ae6ce9fba8405688fe878b0fdb40ce855/ruff-0.5.7-py3-none-macosx_11_0_arm64.whl", hash = "sha256:eaf3d86a1fdac1aec8a3417a63587d93f906c678bb9ed0b796da7b59c1114a1e", size = 8266785, upload-time = "2024-08-08T15:42:08.321Z" },
+    { url = "https://files.pythonhosted.org/packages/26/79/7f49509bd844476235b40425756def366b227a9714191c91f02fb2178635/ruff-0.5.7-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a01c34400097b06cf8a6e61b35d6d456d5bd1ae6961542de18ec81eaf33b4cb8", size = 9983964, upload-time = "2024-08-08T15:42:12.419Z" },
+    { url = "https://files.pythonhosted.org/packages/bf/b1/939836b70bf9fcd5e5cd3ea67fdb8abb9eac7631351d32f26544034a35e4/ruff-0.5.7-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:fcc8054f1a717e2213500edaddcf1dbb0abad40d98e1bd9d0ad364f75c763eea", size = 9359490, upload-time = "2024-08-08T15:42:16.713Z" },
+    { url = "https://files.pythonhosted.org/packages/32/7d/b3db19207de105daad0c8b704b2c6f2a011f9c07017bd58d8d6e7b8eba19/ruff-0.5.7-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7f70284e73f36558ef51602254451e50dd6cc479f8b6f8413a95fcb5db4a55fc", size = 10170833, upload-time = "2024-08-08T15:42:20.54Z" },
+    { url = "https://files.pythonhosted.org/packages/a2/45/eae9da55f3357a1ac04220230b8b07800bf516e6dd7e1ad20a2ff3b03b1b/ruff-0.5.7-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:a78ad870ae3c460394fc95437d43deb5c04b5c29297815a2a1de028903f19692", size = 10896360, upload-time = "2024-08-08T15:42:25.2Z" },
+    { url = "https://files.pythonhosted.org/packages/99/67/4388b36d145675f4c51ebec561fcd4298a0e2550c81e629116f83ce45a39/ruff-0.5.7-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9ccd078c66a8e419475174bfe60a69adb36ce04f8d4e91b006f1329d5cd44bcf", size = 10477094, upload-time = "2024-08-08T15:42:29.553Z" },
+    { url = "https://files.pythonhosted.org/packages/e1/9c/f5e6ed1751dc187a4ecf19a4970dd30a521c0ee66b7941c16e292a4043fb/ruff-0.5.7-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:7e31c9bad4ebf8fdb77b59cae75814440731060a09a0e0077d559a556453acbb", size = 11480896, upload-time = "2024-08-08T15:42:33.772Z" },
+    { url = "https://files.pythonhosted.org/packages/c8/3b/2b683be597bbd02046678fc3fc1c199c641512b20212073b58f173822bb3/ruff-0.5.7-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8d796327eed8e168164346b769dd9a27a70e0298d667b4ecee6877ce8095ec8e", size = 10179702, upload-time = "2024-08-08T15:42:38.038Z" },
+    { url = "https://files.pythonhosted.org/packages/f1/38/c2d94054dc4b3d1ea4c2ba3439b2a7095f08d1c8184bc41e6abe2a688be7/ruff-0.5.7-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:4a09ea2c3f7778cc635e7f6edf57d566a8ee8f485f3c4454db7771efb692c499", size = 9982855, upload-time = "2024-08-08T15:42:42.031Z" },
+    { url = "https://files.pythonhosted.org/packages/7d/e7/1433db2da505ffa8912dcf5b28a8743012ee780cbc20ad0bf114787385d9/ruff-0.5.7-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:a36d8dcf55b3a3bc353270d544fb170d75d2dff41eba5df57b4e0b67a95bb64e", size = 9433156, upload-time = "2024-08-08T15:42:45.339Z" },
+    { url = "https://files.pythonhosted.org/packages/e0/36/4fa43250e67741edeea3d366f59a1dc993d4d89ad493a36cbaa9889895f2/ruff-0.5.7-py3-none-musllinux_1_2_i686.whl", hash = "sha256:9369c218f789eefbd1b8d82a8cf25017b523ac47d96b2f531eba73770971c9e5", size = 9782971, upload-time = "2024-08-08T15:42:49.354Z" },
+    { url = "https://files.pythonhosted.org/packages/80/0e/8c276103d518e5cf9202f70630aaa494abf6fc71c04d87c08b6d3cd07a4b/ruff-0.5.7-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:b88ca3db7eb377eb24fb7c82840546fb7acef75af4a74bd36e9ceb37a890257e", size = 10247775, upload-time = "2024-08-08T15:42:53.294Z" },
+    { url = "https://files.pythonhosted.org/packages/cb/b9/673096d61276f39291b729dddde23c831a5833d98048349835782688a0ec/ruff-0.5.7-py3-none-win32.whl", hash = "sha256:33d61fc0e902198a3e55719f4be6b375b28f860b09c281e4bdbf783c0566576a", size = 7841772, upload-time = "2024-08-08T15:42:57.488Z" },
+    { url = "https://files.pythonhosted.org/packages/67/1c/4520c98bfc06b9c73cd1457686d4d3935d40046b1ddea08403e5a6deff51/ruff-0.5.7-py3-none-win_amd64.whl", hash = "sha256:083bbcbe6fadb93cd86709037acc510f86eed5a314203079df174c40bbbca6b3", size = 8699779, upload-time = "2024-08-08T15:43:00.429Z" },
+    { url = "https://files.pythonhosted.org/packages/38/23/b3763a237d2523d40a31fe2d1a301191fe392dd48d3014977d079cf8c0bd/ruff-0.5.7-py3-none-win_arm64.whl", hash = "sha256:2dca26154ff9571995107221d0aeaad0e75a77b5a682d6236cf89a58c70b76f4", size = 8091891, upload-time = "2024-08-08T15:43:04.162Z" },
 ]
 
 [[package]]