Compare commits


35 Commits

Author SHA1 Message Date
AdrianoDev 95b8bcfe96 docs(V2): update README with .env URL overrides, src layout, quality gate
- Added features: upstream URL overrides via DERIBIT_URL_*, BYBIT_URL_*, etc.
- Added quality badge: 259 tests, mypy clean, ruff clean
- Added src/cerbero_mcp/ layout section
- Documented SDK quirks around URL overrides (Bybit pybit endpoint, Alpaca trading-only)
- Linked the plan alongside the spec

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 00:04:50 +02:00
AdrianoDev 697d118522 chore(V2): mypy clean — root-cause fixes in new V2 code + targeted suppressions in legacy V1
- settings.py: lambda factory + type:ignore[call-arg] for env-loaded models
- routers/*.py (6 files): explicit Environment / Client casts for request.state
- __main__.py: Literal env cast in builder, type:ignore on Settings()
- server.py: type:ignore[method-assign] on app.openapi
- deribit/tools.py: asserts on validator-normalized fields, list return type
- deribit/client.py: targeted type:ignore for no-any-return / has-type, renamed types→types_list
- hyperliquid/{client,tools}.py: asserts on validator-normalized fields, var-annotated
- alpaca/client.py: targeted type:ignore for SDK quirks (assignment, no-any-return, arg-type, union-attr)
- {macro,sentiment}/fetchers.py: targeted type:ignore for no-any-return / operator / union-attr

Mypy: 68 → 0 errors. Tests: 259 passing. Ruff: clean.
2026-04-30 20:43:03 +02:00
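The targeted-suppression approach above pairs explicit casts with runtime asserts rather than blanket ignores. A minimal, self-contained sketch of the `Literal` cast pattern (the names here are illustrative, not the project's actual API):

```python
from typing import Literal, cast

Environment = Literal["testnet", "mainnet"]

def resolve(raw: str) -> Environment:
    # mypy rejects assigning a plain str to a Literal; an assert plus a
    # targeted cast documents the invariant instead of a blanket ignore.
    assert raw in ("testnet", "mainnet")
    return cast(Environment, raw)

print(resolve("testnet"))  # → testnet
```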
AdrianoDev 436dfd6f5a feat(V2): exchange URLs configurable from .env (DERIBIT_URL_*, BYBIT_URL_*, etc.) 2026-04-30 20:36:31 +02:00
AdrianoDev b71c66917c chore(V2): quality gate cleanup
- ruff: contextlib.suppress instead of try/except/pass (client_registry, test_env_routing)
- removed legacy services/ (leftover from git rm)
- fixed integration test fixture: removed the sys.modules.pop that polluted module references in later tests (test_audit, test_client_init_default_http)

254 tests passing. Ruff: clean. Mypy: 68 pre-existing warnings from migrated V1 code (strict=false).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-30 19:02:55 +02:00
AdrianoDev b552127479 docs(V2): README rewritten for the V2.0.0 architecture
- Single Docker image, testnet/mainnet routing via bearer token
- Configuration entirely in .env
- Swagger at /apidocs
- V1 → V2 migration section with a field-mapping table
- Reference to the design spec

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-30 19:00:33 +02:00
AdrianoDev 50bc6b64b4 chore(V2): build-push.sh builds a single V2.0.0 image; removed deploy-noclone.sh
The script now publishes a single cerbero-mcp:2.0.0 tag + :latest + :sha-<short>.
deploy-noclone.sh was specific to the V1 multi-image workflow.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-30 18:59:27 +02:00
AdrianoDev ec42d141bd chore(V2): remove V1 compose overlays (prod, local, traefik) and DEPLOYMENT.md
Useful content from DEPLOYMENT.md will be folded into the new V2 README.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-30 18:58:51 +02:00
AdrianoDev 6d19165d9e chore(V2): remove services/, gateway/, secrets/, docker/ (legacy V1)
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-30 18:58:11 +02:00
AdrianoDev 1c1b3e1570 test(V2): smoke script with testnet bearer 2026-04-30 18:57:07 +02:00
AdrianoDev cee7f7ca2f feat(V2): minimal docker-compose.yml (1 service, env_file .env) 2026-04-30 18:55:23 +02:00
AdrianoDev 6148461ac1 feat(V2): single multi-stage Dockerfile in root
Multi-stage builder/runtime build, uv sync --frozen, non-root user uid 1000,
healthcheck on /health, CMD cerbero-mcp. Smoke test passed: /health 200 OK,
cerbero-mcp:2.0.0 image 229MB content size.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-30 18:54:38 +02:00
AdrianoDev f34452b2dd test(V2): integration env routing for each exchange (constructor spy)
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-30 18:51:30 +02:00
AdrianoDev a53efb7a29 feat(V2): __main__ with lifespan + 6 routers + integration test
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-30 18:48:56 +02:00
AdrianoDev f56df197e1 feat(V2): sentiment migration complete (read-only, env ignored)
- exchanges/sentiment/{client,fetchers,tools}.py: stateless SentimentClient wrapper (cryptopanic_key, lunarcrush_key)
- routers/sentiment.py: 9 POST tools under /mcp-sentiment (news, social, funding, OI, liquidations, cointegration)
- exchanges/__init__.py: builder branch for sentiment (env ignored)
- tests/unit/exchanges/sentiment: migrated test_fetchers, dropped V1-only test_server_acl
- tests/unit/test_exchanges_builder.py: added test_build_client_sentiment_no_env_distinction
- fetchers.py: env var lookup aligned to LUNARCRUSH_KEY (with LUNARCRUSH_API_KEY fallback)

241 tests passing.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-30 18:46:48 +02:00
AdrianoDev 88bd4e7bde feat(V2): macro migration complete (read-only, env ignored)
- exchanges/macro: cot.py + cot_contracts.py + fetchers.py copied 1:1 with
  import rewrite mcp_common -> cerbero_mcp.common, mcp_macro -> cerbero_mcp.exchanges.macro
- new stateless MacroClient wrapper: carries only fred_api_key/finnhub_api_key,
  no HTTP session (the fetchers use an ad-hoc async_client)
- tools.py: 11 tools (get_treasury_yields, get_yield_curve_slope,
  get_breakeven_inflation, get_economic_indicators, get_macro_calendar,
  get_market_overview, get_equity_futures, get_asset_price, get_cot_tff,
  get_cot_disaggregated, get_cot_extreme_positioning) — no writes,
  no leverage_cap
- routers/macro.py: prefix /mcp-macro, 11 POST /tools/* routes
- macro builder branch: same credentials for testnet/mainnet (env ignored);
  the registry instantiates 2 entries, negligible cost (stateless wrapper)
- migrated tests: test_cot.py + test_fetchers.py (test_server_acl.py skipped, V1-only)
- new test test_build_client_macro_no_env_distinction in test_exchanges_builder.py

Suite: 224 passed.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-30 18:42:55 +02:00
AdrianoDev 1b8ba0ef9c feat(V2): alpaca migration complete
Task 6.7: port alpaca from services/mcp-alpaca to src/cerbero_mcp.
client.py + leverage_cap.py copied 1:1 (default cap 1 cash).
tools.py: 17 tools without ACL/Principal/audit. /mcp-alpaca router with 18
routes (env_info + 17 tools). Alpaca builder branch: paper=(env=="testnet"),
api_key comes from settings.alpaca.api_key_id. client + leverage_cap tests
migrated (15 alpaca tests pass). Builder test with a stubbed alpaca-py SDK.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-30 18:39:25 +02:00
AdrianoDev 8dbaf3a0e4 feat(V2): hyperliquid migration complete
- exchanges/hyperliquid/{client,leverage_cap,tools}.py
- routers/hyperliquid.py with 16 /mcp-hyperliquid/tools/* endpoints
- hyperliquid builder in exchanges/__init__.py
- migrated tests: test_client, test_leverage_cap (V1 skipped: server_acl, environment_info)
- hyperliquid builder test (testnet vs mainnet base_url)

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-30 18:35:46 +02:00
AdrianoDev 5e42ce9c69 feat(V2): bybit migration complete (client, tools, router, tests, builder)
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-30 18:31:51 +02:00
AdrianoDev a8d970233e feat(V2): centralized client builder (deribit only for now)
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-30 18:27:50 +02:00
AdrianoDev d3ec2ee588 feat(V2): deribit router + migrated tests
The /mcp-deribit/* router mounts 34 tools (28 read + 6 write) as
POST /mcp-deribit/tools/{tool_name} endpoints, with DI for env (request.state)
and client (ClientRegistry). Write tools build a minimal creds dict
{max_leverage, client_id} from settings for leverage cap enforcement.

Deribit tests migrated: test_client.py + test_leverage_cap.py relocated
under tests/unit/exchanges/deribit/ with import rewrite mcp_* -> cerbero_mcp.*.
Skipped the legacy V1-only test_environment_info / test_server_acl / test_env_validation
(ACL and resolve_environment were removed in V2).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-30 18:26:34 +02:00
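The minimal-creds construction described above can be sketched as follows; `settings`, `request`, and the field names are illustrative stand-ins, not the real objects:

```python
from types import SimpleNamespace
from typing import Any

# Illustrative stand-ins for the app settings and the incoming request
settings = SimpleNamespace(deribit=SimpleNamespace(max_leverage=3, client_id="abc"))
request = SimpleNamespace(state=SimpleNamespace(environment="testnet"))

def build_write_creds() -> dict[str, Any]:
    # Write tools receive only what leverage cap enforcement needs,
    # never the full credential set.
    return {
        "max_leverage": settings.deribit.max_leverage,
        "client_id": settings.deribit.client_id,
    }

print(build_write_creds())  # {'max_leverage': 3, 'client_id': 'abc'}
```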
AdrianoDev daa4e02971 feat(V2): deribit migration (client, leverage_cap, tools)
Task 6.1 V2.0.0: copy client.py + leverage_cap.py from services/mcp-deribit
with rewritten imports (mcp_common -> cerbero_mcp.common, mcp_deribit ->
cerbero_mcp.exchanges.deribit). Extracted 34 async tools (28 endpoints +
is_testnet/environment_info + helpers) into tools.py: pure logic without
FastAPI/ACL. Audit calls removed for now (TODO: wiring via the router on
request.state.environment).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-30 18:23:44 +02:00
AdrianoDev 2a268b3a33 feat(V2): build_app with swagger /apidocs + middleware + handlers
Adds /docs and /redoc to the auth whitelist (paths are disabled, no security risk).

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-30 18:20:17 +02:00
AdrianoDev 73f880e7f2 feat(V2): lazy ClientRegistry with a per-key lock
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-30 18:18:18 +02:00
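A minimal sketch of a lazy registry with one lock per key, under the assumption (from the commit subject) that each client should be built at most once per key without serializing unrelated keys; the real class's API may differ:

```python
import asyncio
from collections import defaultdict
from typing import Any, Callable

class ClientRegistry:
    """Lazily builds one client per (exchange, env) key."""

    def __init__(self, builder: Callable[[str, str], Any]) -> None:
        self._builder = builder
        self._clients: dict[tuple[str, str], Any] = {}
        # One lock per key: concurrent requests for different keys
        # never wait on each other.
        self._locks: dict[tuple[str, str], asyncio.Lock] = defaultdict(asyncio.Lock)

    async def get(self, exchange: str, env: str) -> Any:
        key = (exchange, env)
        if key in self._clients:          # fast path, no lock taken
            return self._clients[key]
        async with self._locks[key]:      # slow path, serialized per key
            if key not in self._clients:  # double-checked after acquiring
                self._clients[key] = self._builder(exchange, env)
            return self._clients[key]
```

The second check after acquiring the lock covers the race where two coroutines pass the fast path together; only one of them ends up calling the builder.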
AdrianoDev 80a4a88cb1 feat(V2): error envelope module extracted from server.py 2026-04-30 18:17:15 +02:00
AdrianoDev 993326136b test(V2): common/ test migration
Copied and updated the tests from services/common/tests/ to tests/unit/common/.
Imports updated from mcp_common to cerbero_mcp.common. Removed tests for
V1-only functionality (app_factory, environment, auth/Principal, server_base).
Refactored test_audit.py (principal→actor str) and test_mcp_bridge.py
(TokenStore→valid_tokens set). 71/71 tests passing.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-30 18:16:26 +02:00
AdrianoDev 1a1f9c43ba refactor(V2): audit.py uses actor:str instead of Principal, remove legacy common/auth.py
- Removed src/cerbero_mcp/common/auth.py (V1 Principal/TokenStore/ACL)
- audit_write_op: parameter principal:Principal → actor:str|None
- mcp_bridge.py: TokenStore → valid_tokens:set[str] (V2 bearer model)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-30 18:14:10 +02:00
AdrianoDev 3868ba60ce feat(V2): common/ migration (indicators, options, microstructure, stats, http, audit, logging, mcp_bridge + auth) 2026-04-30 18:12:11 +02:00
AdrianoDev 04a34fc179 fix(V2): hoist fastapi Request import, restore importlib mode 2026-04-30 18:10:41 +02:00
AdrianoDev 2934a2d26a feat(V2): bearer auth middleware with compare_digest
Implements install_auth_middleware with a /health /apidocs /openapi.json whitelist,
timing-safe token checks via secrets.compare_digest, request.state.environment injection.
Fix pyproject: --import-mode=prepend (importlib + PEP563 breaks FastAPI Request injection).
Removed from __future__ import annotations from test_auth.py for the same reason.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-30 18:09:21 +02:00
AdrianoDev 97d93a5139 feat(V2): pydantic settings with secret str + tests
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-30 18:04:40 +02:00
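pydantic's `SecretStr` (presumably what "secret str" refers to here) masks the value in repr so credentials cannot leak into logs or tracebacks; a stdlib-only sketch of the same idea:

```python
class Secret:
    """Wraps a sensitive string so repr/str never leak it.

    pydantic's SecretStr provides this behavior for settings fields;
    this class only illustrates the idea.
    """

    def __init__(self, value: str) -> None:
        self._value = value

    def get_secret_value(self) -> str:
        # The raw value must be requested explicitly.
        return self._value

    def __repr__(self) -> str:
        return "Secret('**********')"

    __str__ = __repr__

key = Secret("real-api-key")
print(key)                     # Secret('**********')
print(key.get_secret_value())  # real-api-key
```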
AdrianoDev 005300205b chore(V2): consolidated .env.example, .env gitignored 2026-04-30 18:03:22 +02:00
AdrianoDev 8df64b5176 chore(V2): src/cerbero_mcp + tests/ skeleton
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-30 18:02:22 +02:00
AdrianoDev 8fd182e295 chore(V2): single cerbero-mcp package pyproject, workspace removed
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-30 18:01:16 +02:00
AdrianoDev b8753afad2 docs(plan): V2.0.0 task-by-task implementation plan
12-phase plan for implementing V2.0.0:
- Phase 0: bootstrap the new structure
- Phase 1: settings + consolidated .env
- Phase 2: bearer auth middleware
- Phase 3: common/ migration
- Phase 4: lazy client_registry
- Phase 5: build_app + swagger /apidocs
- Phase 6: migration of the 6 exchanges (deribit template + 5 repetitions)
- Phase 7: __main__ entrypoint with lifespan
- Phase 8: env routing integration tests
- Phase 9: Dockerfile + minimal docker-compose
- Phase 10: V1 cleanup (services/, gateway/, secrets/, docker/)
- Phase 11: README rewritten, DEPLOYMENT removed
- Phase 12: final quality gate

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-30 17:58:13 +02:00
AdrianoDev 9a137563e8 docs(spec): V2.0.0 unified image + token-based env routing
Architectural spec for V2.0.0: collapses 7 Docker images (Caddy gateway
+ 6 MCP services) into a single multi-router image. The testnet/mainnet
switch becomes per-request at runtime via bearer token (TESTNET_TOKEN
/ MAINNET_TOKEN). Configuration consolidated into a single .env, secret JSON
files removed. Swagger UI exposed at /apidocs.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-30 17:45:26 +02:00
156 changed files with 8579 additions and 7857 deletions
+53 -6
.env.example
@@ -1,7 +1,54 @@
GATEWAY_PORT=8080
# ============================================================
# CERBERO MCP — V2.0.0
# Copy to .env and fill in values. .env is gitignored.
# Generate tokens: python -c 'import secrets; print(secrets.token_urlsafe(32))'
# ============================================================
# Environment override for each MCP exchange (precedence: env > secret > default)
DERIBIT_TESTNET=true
BYBIT_TESTNET=true
HYPERLIQUID_TESTNET=true
ALPACA_PAPER=true
# ─── SERVER ─────────────────────────────────────────────────
HOST=0.0.0.0
PORT=9000
LOG_LEVEL=info
# ─── AUTH — bearer tokens for env routing ─────────────────
# The bot sends Authorization: Bearer <TOKEN>:
# - TESTNET_TOKEN → requests go to base_url_testnet
# - MAINNET_TOKEN → requests go to base_url_live
TESTNET_TOKEN=
MAINNET_TOKEN=
# ─── EXCHANGE — DERIBIT ───────────────────────────────────
DERIBIT_CLIENT_ID=
DERIBIT_CLIENT_SECRET=
DERIBIT_URL_LIVE=https://www.deribit.com/api/v2
DERIBIT_URL_TESTNET=https://test.deribit.com/api/v2
DERIBIT_MAX_LEVERAGE=3
# ─── EXCHANGE — BYBIT ─────────────────────────────────────
BYBIT_API_KEY=
BYBIT_API_SECRET=
BYBIT_URL_LIVE=https://api.bybit.com
BYBIT_URL_TESTNET=https://api-testnet.bybit.com
BYBIT_MAX_LEVERAGE=3
# ─── EXCHANGE — HYPERLIQUID ───────────────────────────────
HYPERLIQUID_WALLET_ADDRESS=
HYPERLIQUID_API_WALLET_ADDRESS=
HYPERLIQUID_PRIVATE_KEY=
HYPERLIQUID_URL_LIVE=https://api.hyperliquid.xyz
HYPERLIQUID_URL_TESTNET=https://api.hyperliquid-testnet.xyz
HYPERLIQUID_MAX_LEVERAGE=3
# ─── EXCHANGE — ALPACA ────────────────────────────────────
ALPACA_API_KEY_ID=
ALPACA_SECRET_KEY=
ALPACA_URL_LIVE=https://api.alpaca.markets
ALPACA_URL_TESTNET=https://paper-api.alpaca.markets
ALPACA_MAX_LEVERAGE=1
# ─── DATA PROVIDERS — MACRO ───────────────────────────────
FRED_API_KEY=
FINNHUB_API_KEY=
# ─── DATA PROVIDERS — SENTIMENT ───────────────────────────
CRYPTOPANIC_KEY=
LUNARCRUSH_KEY=
-448
DEPLOYMENT.md
@@ -1,448 +0,0 @@
# Cerbero_mcp deployment
Operational guide for deploying the MCP suite on a public VPS.
The architecture: Gitea hosts the code + container registry; images
are built and pushed from the **development machine** (laptop) to
the registry; the production VPS builds nothing, it only pulls
ready-made containers and uses Watchtower for automatic rollover.
```
┌──────────────────────────┐ ┌─────────────────────────┐ ┌──────────────────────────────────┐
│ Laptop dev │ │ Gitea git.tielogic.xyz │ │ VPS produzione │
│ │ │ │ │ cerbero-mcp.tielogic.xyz │
│ build-push.sh ──push──▶ │───▶│ ┌────────────────────┐ │ │ │
│ (8 image) │ │ │ Container registry │ │ │ ┌────────────────────────────┐ │
│ git push ─────────────▶ │───▶│ └────────────────────┘ │◀──┼──┤ docker compose │ │
│ │ │ ┌────────────────────┐ │ pull │ (docker-compose.prod.yml) │ │
│ │ │ │ Cerbero-mcp repo │ │ │ │ gateway, mcp-* │ │
│ │ │ └────────────────────┘ │ │ │ watchtower (poll 5min) │ │
│ │ │ │ │ └────────────────────────────┘ │
└──────────────────────────┘ └─────────────────────────┘ └──────────────────────────────────┘
```
No CI/CD on Gitea — quality and builds are the laptop's responsibility
before pushing (lint/test locally, then `scripts/build-push.sh`).
## 1. Build & push images (from the laptop)
The `scripts/build-push.sh` script builds and pushes the 8 images to the
Gitea registry, replicating the old CI job locally. Prerequisites:
- `docker` + `buildx` on the laptop.
- A Gitea Personal Access Token with `write:package` scope (User Settings
→ Applications → Generate Token).
```bash
export GITEA_PAT='<PAT_write:package>'
export GITEA_USER=adriano
# All 8 images (base + gateway + 6 mcp-*)
./scripts/build-push.sh
# Only specific ones (e.g. after modifying a single service)
./scripts/build-push.sh base mcp-bybit
```
The script:
- Runs `docker login git.tielogic.xyz`.
- Builds with `docker buildx build --push` (local buildx cache on the
laptop, no registry cache: subsequent builds stay fast without weighing
on the registry).
- Tags `:latest` + `:sha-<short_HEAD>`.
- For the mcp-* images it passes `BASE_IMAGE`/`BASE_TAG` as build-args so
they inherit from the freshly pushed `base` image.
Recommended order: build `base` before the `mcp-*` images (the script does
this by default when called without arguments).
## 1b. Local quality gate (recommended before pushing)
Before `build-push.sh`, run locally the checks that CI used to run:
```bash
uv run ruff check services/
uv run mypy services/common/src/mcp_common
uv run pytest services/ --tb=short
docker compose -f docker-compose.prod.yml config -q
```
All must be green before pushing images to the registry.
## 2a. Topology: standalone vs behind-Traefik
Cerbero_mcp supports two deployment topologies:
### Standalone (Caddy handles TLS directly)
```
Internet ──[443]──► Caddy gateway ──► mcp-* services
(ACME Let's Encrypt)
```
Setup: `docker-compose.prod.yml` alone. Caddy binds host ports
80/443 and obtains certificates automatically via ACME. Suited to a
dedicated VPS with no other services on 80/443.
### Behind-Traefik (Traefik terminates TLS)
```
Internet ──[443]──► Traefik ──[traefik network]──► Caddy gateway ──► mcp-* services
(TLS+ACME) (rate-limit, IP allowlist)
```
Setup: `docker-compose.prod.yml` + the `docker-compose.traefik.yml` overlay.
Caddy does not bind on the host; it listens on plain HTTP `:80` inside the
`traefik` network. Traefik handles routing for `Host(cerbero-mcp.tielogic.xyz)`,
TLS, ACME. Suited to a VPS shared with other services (Gitea, etc.).
## 2. Automated deploy (no-clone script)
The quickest path is `scripts/deploy-noclone.sh`, which is idempotent. The
repo is **not** cloned on the VPS: the script downloads over raw HTTP only
the files strictly needed at runtime (compose, Caddyfile, public
assets). Run on the VPS:
```bash
# Prerequisites
export GITEA_PAT="<PAT with read:package scope>"
export GITEA_USER=adriano
# Create the deploy dir and copy in the secrets via scp from a safe place
sudo mkdir -p /docker/cerbero_mcp/secrets
sudo chown -R "$USER" /docker/cerbero_mcp
# scp deribit.json bybit.json hyperliquid.json alpaca.json \
#     macro.json sentiment.json core.token observer.token \
#     vps:/docker/cerbero_mcp/secrets/
# Behind Traefik (optional, only if the VPS is shared with Gitea or others)
# export BEHIND_TRAEFIK=true
# export TRAEFIK_NETWORK=gitea_traefik-public
curl -sL -o /tmp/deploy-noclone.sh \
https://git.tielogic.xyz/Adriano/Cerbero-mcp/raw/branch/main/scripts/deploy-noclone.sh
chmod +x /tmp/deploy-noclone.sh
/tmp/deploy-noclone.sh
```
The script: docker login to the registry → downloads `docker-compose.prod.yml`,
`docker-compose.traefik.yml`, `gateway/Caddyfile`, `gateway/public/*` into
`/docker/cerbero_mcp/` → chmod 600 on the secrets → generates an initial
`.env` (testnet) → creates `/var/log/cerbero-mcp` owned `1000:1000` → pulls
the images from the registry → `docker compose up -d` → public smoke test.
To update later: re-run the same script (it preserves `.env`
and the secrets, and reloads the config from the updated `main` branch).
**Path overrides**: `DEPLOY_DIR` (default `/docker/cerbero_mcp`),
`SECRETS_SRC` (default `$DEPLOY_DIR/secrets`), `AUDIT_LOG_DIR` (default
`/var/log/cerbero-mcp`).
**Local compose override (`docker-compose.local.yml`)**: the script
automatically includes, as the last `-f`, an optional
`$DEPLOY_DIR/docker-compose.local.yml`. Useful for machine-specific fixes
(e.g. forcing `DOCKER_API_VERSION` on watchtower when the VPS daemon is
older than the expected API). Gitignored by design — it is not downloaded
from the repo; you create it by hand on the VPS. Example:
```yaml
# /docker/cerbero_mcp/docker-compose.local.yml
services:
watchtower:
environment:
DOCKER_API_VERSION: "1.44"
```
### Behind-Traefik mode
If a Traefik instance is already running on the VPS (e.g. the same VPS as
Gitea), add to your `.env` before launching the script:
```bash
BEHIND_TRAEFIK=true
TRAEFIK_NETWORK=gitea_traefik-public # name of Traefik's external network
TRAEFIK_CERTRESOLVER=letsencrypt # resolver name in Traefik
TRAEFIK_ENTRYPOINT=websecure # Traefik HTTPS entrypoint
# Gateway ports no longer needed (Traefik binds 80/443):
# GATEWAY_HTTP_PORT, GATEWAY_HTTPS_PORT are not used.
```
The script detects `BEHIND_TRAEFIK=true` and uses
`docker compose -f docker-compose.prod.yml -f docker-compose.traefik.yml`.
The Caddy gateway does NOT bind on host 80/443; it is exposed via Traefik
with labels for `Host(cerbero-mcp.tielogic.xyz)`.
Check the Traefik network:
```bash
docker network ls | grep -i traefik
# Typically you'll see: gitea_traefik-public, traefik_default, etc.
# Use the EXACT name as TRAEFIK_NETWORK in .env.
```
## 3. Safety: testnet → mainnet switch
`mcp_common.environment.consistency_check` (called from the
`run_exchange_main` boot) PREVENTS accidental switches:
- If the resolved environment is **mainnet** but the corresponding secret
JSON does not contain an explicit `"environment": "mainnet"` → boot aborts
with `EnvironmentMismatchError`.
- If the secret declares an environment different from the resolved one (e.g.
`creds["environment"]="mainnet"` but the env var sets testnet) → boot aborts.
**To switch a specific exchange to mainnet** (e.g. bybit):
1. Edit `secrets/bybit.json`: add `"environment": "mainnet"`.
2. Edit `.env`: `BYBIT_TESTNET=false`.
3. `docker compose -f docker-compose.prod.yml --env-file .env restart mcp-bybit`.
Without the explicit flag in the secret, the mcp-bybit container fails at
boot and Watchtower will NOT update to versions with broken mainnet creds.
Setting `STRICT_MAINNET=false` in `.env` allows mainnet without the explicit
confirmation (a safety downgrade, discouraged in production).
## 4. Persistent audit log
All write endpoints (`place_order`, `place_combo_order`, `cancel_*`,
`set_*`, `close_*`, `transfer_*`, `amend_*`, `switch_*`) emit a structured
JSON record on the `mcp.audit` logger.
**Sinks**:
- Container stdout/stderr (always, visible via `docker logs`).
- Persistent JSONL file on a host volume:
`${AUDIT_LOG_DIR:-/var/log/cerbero-mcp}/<service>.audit.jsonl`.
Rotation at midnight UTC with `AUDIT_LOG_BACKUP_DAYS` retention
(default 30 days).
**Example record**:
```json
{
"audit_event": "write_op",
"action": "place_order",
"exchange": "bybit",
"principal": "core",
"target": "BTCUSDT",
"payload": {"side": "Buy", "qty": 0.01, "price": 60000, "leverage": 3},
"result": {"order_id": "abc123", "status": "submitted"}
}
```
**Operational queries**:
```bash
# All of today's audit log
tail -f /var/log/cerbero-mcp/*.audit.jsonl
# Only place_order on bybit
jq -c 'select(.action=="place_order" and .exchange=="bybit")' \
/var/log/cerbero-mcp/bybit.audit.jsonl
# Errors
jq -c 'select(.error)' /var/log/cerbero-mcp/*.audit.jsonl
# Operations by a single principal
jq -c 'select(.principal=="core")' /var/log/cerbero-mcp/*.audit.jsonl
```
Secrets (api_key, password) are filtered automatically by
`SecretsFilter` before reaching the sinks.
## 5. Initial VPS setup (manual alternative to the script)
**Prerequisites**: Docker Engine ≥ 24, the `docker compose` plugin, SSH
access with sudo, a DNS A record `cerbero-mcp.tielogic.xyz` → VPS IP, ports
80 and 443 open on the firewall (ACME challenge + HTTPS traffic).
### a) Log in to the Gitea registry
Create a Personal Access Token on Gitea (`Settings → Applications →
Generate new token`) with `read:package` scope. Then on the VPS:
```bash
echo "$GITEA_PAT" | docker login git.tielogic.xyz -u <gitea-username> --password-stdin
```
Credentials are saved in `~/.docker/config.json`. Watchtower bind-mounts
this file read-only to perform authenticated pulls.
### b) Create the deploy dir and download the config files
There is NO need to clone the repo on the VPS. The compose files, the
`Caddyfile` and the gateway public assets are enough:
```bash
sudo mkdir -p /docker/cerbero_mcp/{secrets,gateway/public}
sudo chown -R "$USER" /docker/cerbero_mcp
cd /docker/cerbero_mcp
BASE=https://git.tielogic.xyz/Adriano/Cerbero-mcp/raw/branch/main
curl -fsSL -o docker-compose.prod.yml $BASE/docker-compose.prod.yml
curl -fsSL -o docker-compose.traefik.yml $BASE/docker-compose.traefik.yml
curl -fsSL -o gateway/Caddyfile $BASE/gateway/Caddyfile
curl -fsSL -o gateway/public/index.html $BASE/gateway/public/index.html
curl -fsSL -o gateway/public/status.js $BASE/gateway/public/status.js
curl -fsSL -o gateway/public/style.css $BASE/gateway/public/style.css
```
The VPS does NOT need to build anything; it uses `docker-compose.prod.yml`,
which only pulls from the registry.
### c) Prepare the secrets
```bash
mkdir -p secrets
# Copy (via scp) the JSON files with real creds:
# secrets/deribit.json, bybit.json, alpaca.json, hyperliquid.json,
# secrets/macro.json, sentiment.json
# secrets/core.token, observer.token
chmod 600 secrets/*
```
### d) `.env` with runtime configuration
Create `/docker/cerbero_mcp/.env`:
```bash
# Gateway
ACME_EMAIL=adrianodalpastro@tielogic.com
GATEWAY_HTTP_PORT=80
GATEWAY_HTTPS_PORT=443
WRITE_ALLOWLIST="127.0.0.1/32 ::1/128 172.16.0.0/12"
# Image tag — `latest` for Watchtower auto-update, or pin to sha-XXXXXXX
IMAGE_TAG=latest
IMAGE_PREFIX=git.tielogic.xyz/adriano/cerbero-mcp
# Exchange environment (true=testnet, false=mainnet)
DERIBIT_TESTNET=true
BYBIT_TESTNET=true
HYPERLIQUID_TESTNET=true
ALPACA_PAPER=true
# Watchtower polling interval (sec). 300=5min default.
WATCHTOWER_POLL_INTERVAL=300
```
### e) Start
```bash
docker compose -f docker-compose.prod.yml --env-file .env pull
docker compose -f docker-compose.prod.yml --env-file .env up -d
docker compose -f docker-compose.prod.yml logs -f gateway
```
Caddy automatically requests the Let's Encrypt certificate on first
contact at `https://cerbero-mcp.tielogic.xyz`.
## 6. Auto-update via Watchtower
Watchtower (the `watchtower` service in the compose) polls the registry every
`WATCHTOWER_POLL_INTERVAL` seconds. When it finds a new digest behind the
`:latest` tag of a container labeled `com.centurylinklabs.watchtower.enable=true`,
it:
1. `docker pull`s the new image
2. gracefully `docker stop`s the old container
3. `docker rm` + starts the new container with the same config + secrets + volumes
4. cleans up the old image (`WATCHTOWER_CLEANUP=true`)
The labeled containers are: `gateway`, `mcp-deribit`, `mcp-bybit`,
`mcp-hyperliquid`, `mcp-alpaca`, `mcp-macro`, `mcp-sentiment`. The
`watchtower` container does not update itself (to avoid loops).
### Temporarily disabling auto-update
Pin a specific SHA in `.env`:
```bash
IMAGE_TAG=sha-6b7b3f7
docker compose -f docker-compose.prod.yml --env-file .env up -d
```
This way `:latest` is no longer followed; to re-enable automatic rollover,
restore `IMAGE_TAG=latest`.
### Disabling auto-update for a single service
Remove the `com.centurylinklabs.watchtower.enable=true` label for that
service in the compose (or set it to `=false`). Watchtower ignores it but
keeps updating the others.
## 7. Rollback
```bash
# Find the SHA of the previous version
docker images "git.tielogic.xyz/adriano/cerbero-mcp/*" --format "{{.Tag}}"
# Pin it in .env
IMAGE_TAG=sha-XXXXXXX
docker compose -f docker-compose.prod.yml --env-file .env up -d
```
Watchtower will NOT downgrade it, because the digest of the pinned tag
matches the local one.
## 8. Post-deploy smoke test
```bash
# From outside the VPS (laptop)
curl -s https://cerbero-mcp.tielogic.xyz/mcp-macro/health
# {"status":"ok",...}
# Test the write-endpoint allowlist (must answer 403 from an external IP):
curl -X POST https://cerbero-mcp.tielogic.xyz/mcp-deribit/tools/place_order \
-H "Authorization: Bearer $(cat secrets/core.token)" \
-d '{"instrument_name":"BTC-PERPETUAL","side":"buy","amount":1}'
# 403 forbidden: source ip not in allowlist ← OK
# On the VPS:
GATEWAY=http://localhost bash tests/smoke/run.sh
```
## 9. VPS security
- `ufw` firewall: `allow 22, 80, 443`. Deny everything else inbound.
- `fail2ban` on SSH and (optionally) on the Caddy 401 log.
- Manual secret rotation: update the `secrets/*.token` files, then
`docker compose restart` (tokens are reloaded at boot of each
MCP service).
- Audit log via `docker compose logs <service> | grep audit_event` — for
production, better to redirect it to syslog or a dedicated service.
## 10. Traefik / reverse proxy notes in front of Gitea
Gitea is exposed via Traefik (ROOT_URL `https://git.tielogic.xyz`). To push
Docker images, the reverse proxy must allow large body uploads (a single
layer can exceed 100MB).
Traefik's defaults are fine, but if you see `413 Request Entity Too Large`
during `docker push`, raise the limit in a middleware:
```yaml
# traefik dynamic config
http:
middlewares:
gitea-upload:
buffering:
maxRequestBodyBytes: 524288000 # 500MB
```
Apply it as a middleware on the Gitea router.
## 11. Updating the compose files themselves (YAML)
Watchtower updates the **images**, not `docker-compose.prod.yml` or the
`Caddyfile`. If the structure changes (new services, new env vars, gateway
changes), re-run the no-clone script on the VPS; it re-downloads the config
files from the Gitea `main` branch and applies them:
```bash
/tmp/deploy-noclone.sh
```
The script is idempotent: it preserves `.env` and `secrets/`, updates only
the config files, then runs `pull` + `up -d`.
+27
Dockerfile
@@ -0,0 +1,27 @@
# syntax=docker/dockerfile:1.7
FROM python:3.11-slim AS builder
RUN apt-get update && apt-get install -y --no-install-recommends \
build-essential curl && rm -rf /var/lib/apt/lists/*
RUN pip install --no-cache-dir "uv>=0.5,<0.7"
WORKDIR /app
COPY pyproject.toml uv.lock ./
COPY src ./src
RUN uv sync --frozen --no-dev
FROM python:3.11-slim AS runtime
LABEL org.opencontainers.image.title="cerbero-mcp" \
org.opencontainers.image.version="2.0.0" \
org.opencontainers.image.source="https://github.com/AdrianoDev/cerbero"
WORKDIR /app
COPY --from=builder /app /app
ENV PATH="/app/.venv/bin:$PATH" \
HOST=0.0.0.0 \
PORT=9000 \
PYTHONUNBUFFERED=1
RUN useradd -m -u 1000 app && chown -R app:app /app
USER app
EXPOSE 9000
HEALTHCHECK --interval=30s --timeout=5s --retries=3 --start-period=10s \
CMD python -c "import os, urllib.request; urllib.request.urlopen(f'http://localhost:{os.environ.get(\"PORT\",\"9000\")}/health', timeout=3).close()"
CMD ["cerbero-mcp"]
+214 -121
README.md
@@ -1,153 +1,246 @@
# Cerbero_mcp
# Cerbero MCP — V2.0.0
Reusable MCP servers (exchanges + market data) for the Cerbero suite.
Split out of `Cerbero/` (commit `pre-split-2026-04-27`) as part of the
split documented in `docs/superpowers/specs/2026-04-27-split-mcp-core-design.md`
(in the historical repo).
Unified multi-exchange MCP server for the Cerbero suite. Shipped as a
single Docker image; testnet and mainnet are reachable
simultaneously through a per-request routing mechanism based on
the bearer token supplied by the client.
## Services
- `mcp-alpaca`, `mcp-bybit`, `mcp-deribit`, `mcp-hyperliquid` — exchanges
with `place_order`, `environment_info`, server-side leverage cap
- `mcp-deribit` and `mcp-bybit` additionally expose `place_combo_order`:
- Deribit: `private/create_combo` + an order on the combo → a single
spread crossing instead of N (lower expected slippage on liquid structures).
- Bybit: `place_batch_order` with `category=option` → atomic multi-leg
in a single API round-trip (no fee discount, only atomicity + latency).
- `mcp-macro`, `mcp-sentiment` — read-only market data
## Features
## Available quantitative indicators
- **A single Docker image** (`cerbero-mcp`) hosts all the exchange
  routers in one FastAPI process
- **Four exchanges** (Deribit, Bybit, Hyperliquid, Alpaca) and **two
  read-only data providers** (Macro, Sentiment)
- **Per-request testnet/mainnet switch** via the
  `Authorization: Bearer <TOKEN>` header: the same container serves both
  environments without restarts
- **Configuration entirely in `.env`**: no separate JSON credentials
  file; each exchange's upstream URLs (live/testnet) can be overridden
  through dedicated variables (`DERIBIT_URL_*`, `BYBIT_URL_*`,
  `HYPERLIQUID_URL_*`, `ALPACA_URL_*`)
- **Interactive OpenAPI/Swagger documentation** exposed at `/apidocs`
- **Verified quality**: 259 tests (unit + integration + smoke), mypy
  clean, ruff clean
### Common (`mcp_common.indicators` + `options` + `microstructure` + `stats`)
- Technical: `sma`, `rsi`, `macd`, `atr`, `adx`
- Volatility: `vol_cone` (multi-window RV with percentiles), `garch11_forecast`
- Statistical: `hurst_exponent`, `half_life_mean_reversion`, `autocorrelation`,
  `cointegration_test` (Engle-Granger)
- Risk: `rolling_sharpe` (Sharpe + Sortino), `var_cvar` (historical VaR/ES)
- Microstructure: `orderbook_imbalance` (ratio + microprice + slope)
- Options: `oi_weighted_skew`, `smile_asymmetry`, `atm_vs_wings_vol`,
  `dealer_gamma_profile`, `vanna_charm_aggregate`
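To give a feel for the kind of computation behind these names, a minimal sketch of `sma` and `rsi` in plain Python (illustrative only; the real implementations live in the indicators package and their exact signatures are not shown here):

```python
def sma(values: list[float], window: int) -> float:
    """Simple moving average over the last `window` values."""
    if len(values) < window:
        raise ValueError("not enough data")
    return sum(values[-window:]) / window


def rsi(values: list[float], period: int = 14) -> float:
    """RSI over the last `period` price changes (simple-average variant)."""
    deltas = [b - a for a, b in zip(values, values[1:])]
    recent = deltas[-period:]
    avg_gain = sum(d for d in recent if d > 0) / period
    avg_loss = sum(-d for d in recent if d < 0) / period
    if avg_loss == 0:
        return 100.0  # no down moves in the window
    rs = avg_gain / avg_loss
    return 100.0 - 100.0 / (1.0 + rs)


prices = [100.0, 101.0, 102.0, 101.5, 103.0]
print(round(sma(prices, 3), 2))  # → 102.17, mean of the last 3 closes
```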
## Quick start (development, without Docker)
### Deribit (exposed as MCP tools)
1. Copy the configuration template and fill it in:
   ```bash
   cp .env.example .env
   # edit .env with your credentials and the two tokens
   ```
2. Generate the bearer tokens:
```bash
python -c 'import secrets; print("TESTNET_TOKEN=" + secrets.token_urlsafe(32))'
python -c 'import secrets; print("MAINNET_TOKEN=" + secrets.token_urlsafe(32))'
```
3. Install the dependencies and start:
```bash
uv sync
uv run cerbero-mcp
```
4. Open the interactive documentation: <http://localhost:9000/apidocs>
## Running with Docker
```bash
cp .env.example .env  # fill in the values
docker compose up -d
```
The container exposes the port set by `PORT` in `.env` (default 9000).
## Bearer tokens and environments
| Token used | Upstream environment |
|---|---|
| `Authorization: Bearer $TESTNET_TOKEN` | each exchange's testnet URL |
| `Authorization: Bearer $MAINNET_TOKEN` | mainnet (live) URL |
| No token / unknown token | 401 Unauthorized |
The purely read-only tools (`/mcp-macro/*` and `/mcp-sentiment/*`) still
require a valid bearer, but which one (testnet or mainnet) makes no
difference because they have no testnet endpoints.
## Main endpoints
| Path | Description |
|---|---|
| `GET /health` | Healthcheck (no auth) |
| `GET /apidocs` | Swagger UI (no auth) |
| `GET /openapi.json` | OpenAPI 3.1 schema (no auth) |
| `POST /mcp-deribit/tools/{tool}` | Deribit exchange tools |
| `POST /mcp-bybit/tools/{tool}` | Bybit exchange tools |
| `POST /mcp-hyperliquid/tools/{tool}` | Hyperliquid exchange tools |
| `POST /mcp-alpaca/tools/{tool}` | Alpaca exchange tools |
| `POST /mcp-macro/tools/{tool}` | Macro/market data tools |
| `POST /mcp-sentiment/tools/{tool}` | Sentiment/news tools |
## Available tools
### Common (`cerbero_mcp.common.indicators` + `options` + `microstructure` + `stats`)
Technical (`sma`, `rsi`, `macd`, `atr`, `adx`), volatility (`vol_cone`,
`garch11_forecast`), statistical (`hurst_exponent`,
`half_life_mean_reversion`, `cointegration_test`), risk (`rolling_sharpe`,
`var_cvar`), microstructure (`orderbook_imbalance`), options
(`oi_weighted_skew`, `smile_asymmetry`, `dealer_gamma_profile`,
`vanna_charm_aggregate`).
### Deribit
DVOL, GEX, P/C ratio, skew_25d, term_structure, iv_rank, realized_vol,
technical indicators, find_by_delta, calculate_spread_payoff.
**New**: `get_dealer_gamma_profile`, `get_vanna_charm`,
`get_oi_weighted_skew`, `get_smile_asymmetry`, `get_atm_vs_wings_vol`,
`get_orderbook_imbalance`.
technical indicators, find_by_delta, calculate_spread_payoff,
get_dealer_gamma_profile, get_vanna_charm, get_oi_weighted_skew,
get_smile_asymmetry, get_atm_vs_wings_vol, get_orderbook_imbalance,
place_combo_order.
### Bybit
Ticker, orderbook, OHLCV, funding rate (current + history), open interest,
spot/perp basis, technical indicators. **New**: `get_orderbook_imbalance`,
`get_basis_term_structure`.
Ticker, orderbook, OHLCV, funding rate, open interest, spot/perp basis,
technical indicators, place_batch_order, get_orderbook_imbalance,
get_basis_term_structure.
### Hyperliquid
Account summary, positions, orderbook, historical, indicators, funding
rate, basis spot/perp, place_order, set_stop_loss, set_take_profit.
### Alpaca
Account, positions, bars, snapshot, option chain, place_order,
amend_order, cancel_order, close_position.
### Macro
Treasury yields, FRED indicators, equity futures, asset prices, calendar.
**New**: `get_yield_curve_slope` (2y10y/5y30y slope + butterfly + regime),
`get_breakeven_inflation` (T5YIE/T10YIE/T5YIFR), `get_cot_tff` (CFTC TFF
report, equity/financial: ES/NQ/RTY/ZN/ZB/6E/6J/DX), `get_cot_disaggregated`
(CFTC Disaggregated report, commodities: CL/GC/SI/HG/ZW/ZC/ZS),
`get_cot_extreme_positioning` (≤5/≥95 percentile scanner over a watchlist).
Treasury yields, FRED indicators, equity futures, asset prices, calendar,
get_yield_curve_slope, get_breakeven_inflation, get_cot_tff,
get_cot_disaggregated, get_cot_extreme_positioning.
### Sentiment
News (CryptoPanic/CoinDesk), social (LunarCrush), multi-exchange funding,
OI history. **New**: `get_funding_arb_spread` (compact arb opportunities),
`get_liquidation_heatmap` (heuristic from OI delta + funding extremes),
`get_cointegration_pairs` (Engle-Granger on crypto pairs).
OI history, get_funding_arb_spread, get_liquidation_heatmap,
get_cointegration_pairs.
## Deploying on a VPS with Traefik
On the VPS, the public network layer (TLS, IP allowlist, rate limiting)
is handled by a Traefik instance external to this repository. The
`cerbero-mcp` container exposes no ports to the outside: it joins
Traefik's docker network through labels added by an external compose
override (e.g. a `docker-compose.override.yml` versioned outside this
repo). The public security policy (IP allowlist on write endpoints) is
Traefik's responsibility.
Minimal example labels for Traefik:
```yaml
labels:
- "traefik.enable=true"
- "traefik.http.routers.cerbero.rule=Host(`cerbero-mcp.tielogic.xyz`)"
- "traefik.http.routers.cerbero.entrypoints=websecure"
- "traefik.http.routers.cerbero.tls.certresolver=letsencrypt"
- "traefik.http.services.cerbero.loadbalancer.server.port=9000"
```
## Build & deploy pipeline
No CI/CD on Gitea: building the 8 images is the development machine's
responsibility, done by `scripts/build-push.sh`. The flow is:
1. **Local quality gate** (on the laptop, before pushing):
   - `uv run ruff check services/`
   - `uv run mypy services/common/src/mcp_common`
   - `uv run pytest services/`
   - `docker compose -f docker-compose.prod.yml config -q`
2. **Build & push** (on the laptop):
   ```bash
   export GITEA_PAT='<PAT_write:package>'
   ./scripts/build-push.sh                 # all 8 images
   ./scripts/build-push.sh base mcp-bybit  # only specific ones
   ```
   Tags `:latest` + `:sha-<short_HEAD>` for pinpoint rollbacks. Buildx
   cache via the registry itself (subsequent runs 5-10× faster).
3. **Auto-rollover on the VPS**: Watchtower polls the registry every 5
   minutes and updates the containers when the digest of the `:latest`
   tag changes.
See [`DEPLOYMENT.md`](DEPLOYMENT.md) for build & push, no-clone VPS
deploy (`scripts/deploy-noclone.sh`), smoke tests, rollback.
## Local run (dev)
Image build performed on the development machine:
```bash
docker compose up -d
bash tests/smoke/run.sh
export GITEA_PAT='<PAT with write:package scope>'
./scripts/build-push.sh
```
## Configuration
The script tags `:2.0.0`, `:latest` and `:sha-<short>` for pinpoint
rollbacks and publishes to the Gitea registry. On the VPS, Watchtower
polls `:latest` and updates the container automatically.
See `secrets/*.json` and the `*_TESTNET` / `ALPACA_PAPER` variables in
`docker-compose.yml` for environment overrides.
### Deploy on a public VPS (`cerbero-mcp.tielogic.xyz`)
See [`DEPLOYMENT.md`](DEPLOYMENT.md) for the complete end-to-end runbook.
The Caddy gateway is configured for:
- Automatic TLS via Let's Encrypt (requires DNS A/AAAA records pointing
  at the VPS and ports 80+443 reachable).
- HSTS preload, security headers (`X-Content-Type-Options`,
  `X-Frame-Options`, `Referrer-Policy`).
- Per-IP rate limiting (60 req/min on reads, 10 req/min on writes) via
  the `mholt/caddy-ratelimit` plugin.
- IP allowlist on write endpoints (`place_*`, `cancel_*`, `set_*`,
  `close_*`, `transfer_*`, `amend_*`, `switch_*`): IPs not present in
  `WRITE_ALLOWLIST` receive `403 forbidden`.
Environment variables for the deploy:
Post-deploy smoke test:
```bash
# .env (on the VPS)
ACME_EMAIL=adrianodalpastro@tielogic.com
GATEWAY_HTTP_PORT=80
GATEWAY_HTTPS_PORT=443
# Write-endpoint allowlist (space-separated CIDRs). The default covers:
# - IPv4/IPv6 loopback (a bot on the VPS host calls http://localhost)
# - Docker bridge 172.16.0.0/12 (a bot in a container on the same compose network)
# Add the public IPs of your external bots, if any.
WRITE_ALLOWLIST="127.0.0.1/32 ::1/128 172.16.0.0/12 1.2.3.4/32"
PORT=9000 TESTNET_TOKEN="$TESTNET_TOKEN" bash tests/smoke/run.sh
```
Three scenarios for the trading bot:
1. Bot container on the same compose network → calls `http://gateway:80`
   internally. Source IP = Docker bridge → covered by the default.
2. Bot process on the VPS host → calls `http://localhost`. Source IP =
   `127.0.0.1` → covered by the default.
3. External bot (laptop, another server) → calls
   `https://cerbero-mcp.tielogic.xyz` over TLS. You must add the bot's
   public IP to `WRITE_ALLOWLIST`.
Without configuring `WRITE_ALLOWLIST`, the default is loopback + Docker
bridge: no external public IP can trigger orders.
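The allowlist match reduces to CIDR containment. A minimal sketch with the standard `ipaddress` module (illustrative only; the real check is performed by the Caddy `remote_ip` matcher, not by Python code in this repo):

```python
import ipaddress


def ip_allowed(source_ip: str, allowlist: str) -> bool:
    """True if source_ip falls inside any space-separated CIDR in allowlist."""
    addr = ipaddress.ip_address(source_ip)
    for cidr in allowlist.split():
        # Membership is False (not an error) when IPv4/IPv6 versions differ.
        if addr in ipaddress.ip_network(cidr, strict=False):
            return True
    return False


DEFAULT_ALLOWLIST = "127.0.0.1/32 ::1/128 172.16.0.0/12"
```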
On the VPS host the secrets must have restrictive permissions:
## Development
```bash
chmod 600 secrets/*.json secrets/*.token
uv sync
uv run pytest                   # full suite (259 tests expected)
uv run pytest tests/unit -v     # unit only
uv run pytest tests/integration -v
uv run ruff check src/ tests/
uv run mypy src/cerbero_mcp
```
### Environment resolution (testnet/mainnet)
All four commands must come back green before committing.
Each exchange service uses `mcp_common.environment.resolve_environment()`,
which applies this precedence:
### Source layout
1. override env var (`DERIBIT_TESTNET`, `BYBIT_TESTNET`,
   `HYPERLIQUID_TESTNET`, `ALPACA_PAPER`)
2. flag in the secret JSON (`testnet`, or `paper` for alpaca)
3. default `testnet`
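The precedence above can be sketched as follows (an illustrative sketch; the actual `resolve_environment()` signature in `mcp_common.environment` may differ):

```python
import os


def resolve_environment(env_var: str, secret: dict, flag_key: str = "testnet") -> str:
    """Resolve testnet/mainnet: override env var > secret JSON flag > default."""
    override = os.environ.get(env_var)
    if override is not None:  # 1. env var wins when set
        return "testnet" if override.lower() == "true" else "mainnet"
    if flag_key in secret:    # 2. then the flag in the secret JSON
        return "testnet" if secret[flag_key] else "mainnet"
    return "testnet"          # 3. safe default
```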
```
src/cerbero_mcp/
├── __main__.py          # cerbero-mcp entrypoint
├── settings.py          # Pydantic Settings (reads .env)
├── auth.py              # bearer middleware → request.state.environment
├── server.py            # build_app() + Swagger + middleware + handlers
├── client_registry.py   # lazy cache {(exchange, env): client}
├── routers/             # one file per exchange (deribit, bybit, ...)
├── exchanges/           # per-exchange logic: client + tools
└── common/              # indicators, options, microstructure, stats, ...
```
The canonical live/testnet URLs are passed directly to the resolver as
the `default_base_url_live` / `default_base_url_testnet` kwargs; there
is no need to duplicate them in the secret JSON, but if present there
they take precedence over the code defaults.
## Migrating from V1 (1.x → 2.0.0)
For anyone running V1 in production:
1. Back up `secrets/` (V2 will not use them, but they are needed as the
   source to copy from).
2. Generate the two new bearer tokens (see above).
3. Fill in `.env`, mapping the V1 fields to the V2 fields:
| V1 (JSON file) | V2 (`.env` variable) |
|---|---|
| `secrets/deribit.json` `client_id` / `client_secret` | `DERIBIT_CLIENT_ID` / `DERIBIT_CLIENT_SECRET` |
| `secrets/bybit.json` `api_key` / `api_secret` | `BYBIT_API_KEY` / `BYBIT_API_SECRET` |
| `secrets/hyperliquid.json` `wallet_address` / `private_key` | `HYPERLIQUID_WALLET_ADDRESS` / `HYPERLIQUID_PRIVATE_KEY` |
| `secrets/alpaca.json` `api_key_id` / `secret_key` | `ALPACA_API_KEY_ID` / `ALPACA_SECRET_KEY` |
| `secrets/macro.json` `fred_api_key` / `finnhub_api_key` | `FRED_API_KEY` / `FINNHUB_API_KEY` |
| `secrets/sentiment.json` `cryptopanic_key` / `lunarcrush_key` | `CRYPTOPANIC_KEY` / `LUNARCRUSH_KEY` |
4. Update the bot clients:
   - the API paths stay identical (`/mcp-{exchange}/tools/{tool}`)
   - replace `core.token` / `observer.token` with `TESTNET_TOKEN` or
     `MAINNET_TOKEN`, depending on the environment desired for the call
5. Shut down V1 (`docker compose -f <old compose> down`) and start V2
   (`docker compose up -d`).
6. Verify `/health` and `/apidocs`.
If needed, you can roll back by pulling the V1 image tags
(`cerbero-mcp-*:1.x`); keep in mind, however, that the `.env` and
`secrets/` formats are incompatible between V1 and V2: keep separate
backups.
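Step 3's mapping can be automated with a small one-off script. A hedged sketch (the JSON field names follow the table above; the function name and output format are this sketch's own, and the real secret files may carry extra fields):

```python
import json
from pathlib import Path

# V1 secret file -> {json_field: V2 env var}, per the migration table
# (remaining files follow the same pattern).
MAPPING = {
    "deribit.json": {"client_id": "DERIBIT_CLIENT_ID",
                     "client_secret": "DERIBIT_CLIENT_SECRET"},
    "bybit.json": {"api_key": "BYBIT_API_KEY",
                   "api_secret": "BYBIT_API_SECRET"},
}


def secrets_to_env(secrets_dir: str) -> list[str]:
    """Read each V1 secret JSON and emit the corresponding V2 .env lines."""
    lines: list[str] = []
    for filename, fields in MAPPING.items():
        path = Path(secrets_dir) / filename
        if not path.exists():
            continue  # skip secrets you never configured in V1
        data = json.loads(path.read_text())
        for json_field, env_var in fields.items():
            if json_field in data:
                lines.append(f"{env_var}={data[json_field]}")
    return lines
```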
## Architecture
Full design spec and implementation plan in:
- [`docs/superpowers/specs/2026-04-30-V2.0.0-unified-image-token-routing-design.md`](docs/superpowers/specs/2026-04-30-V2.0.0-unified-image-token-routing-design.md)
- [`docs/superpowers/plans/2026-04-30-V2.0.0-unified-image-token-routing.md`](docs/superpowers/plans/2026-04-30-V2.0.0-unified-image-token-routing.md)
Runtime flow summary:
```
Bot → Authorization: Bearer <TESTNET|MAINNET>_TOKEN
FastAPI auth middleware → request.state.environment ∈ {testnet, mainnet}
Router /mcp-{exchange}/tools/{tool}
ClientRegistry.get(exchange, env) → lazily cached client (reused HTTP/WS pool)
Tool function (pure logic) → exchange API
```
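The `ClientRegistry.get(exchange, env)` step amounts to lazy memoization keyed on the pair. A minimal sketch (illustrative; the real registry builds actual SDK clients instead of taking an arbitrary factory):

```python
from typing import Any, Callable


class ClientRegistry:
    """Lazily build and cache one client per (exchange, environment) pair."""

    def __init__(self, factory: Callable[[str, str], Any]) -> None:
        self._factory = factory
        self._cache: dict[tuple[str, str], Any] = {}

    def get(self, exchange: str, env: str) -> Any:
        key = (exchange, env)
        if key not in self._cache:
            # First request for this pair builds the client; later
            # requests reuse it (and its HTTP/WS connection pool).
            self._cache[key] = self._factory(exchange, env)
        return self._cache[key]
```

Because the cache is keyed on the pair, a testnet request and a mainnet request for the same exchange never share a client.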
### Upstream URL overrides
Overriding the upstream URLs from `.env` is complete for Deribit and
Hyperliquid. For Bybit it works through pybit's internal `endpoint`
attribute (a workaround documented in the client). For Alpaca the
override applies to the trading endpoint only: the data endpoints
(`data.alpaca.markets`) remain the SDK defaults.
## License
Private.
@@ -1,205 +0,0 @@
# docker-compose.prod.yml — production VPS deploy.
#
# Differences vs docker-compose.yml (dev):
# - No `build:`, only `image:` from the Gitea registry.
# - `latest` tag (Watchtower polls for new versions).
# - Adds a `watchtower` service that auto-updates containers labelled
#   `com.centurylinklabs.watchtower.enable=true` when the latest tag changes.
# - Registry auth: `docker login git.tielogic.xyz` once on the host
#   (Watchtower reads ~/.docker/config.json bind-mounted at /config.json).
#
# Usage on the VPS:
#   docker login git.tielogic.xyz
#   docker compose -f docker-compose.prod.yml --env-file .env up -d
#
# Variable overrides in the `.env` next to the compose file:
#   ACME_EMAIL=adrianodalpastro@tielogic.com
#   WRITE_ALLOWLIST="127.0.0.1/32 ::1/128 172.16.0.0/12"
#   GATEWAY_HTTP_PORT=80
#   GATEWAY_HTTPS_PORT=443
#   IMAGE_TAG=latest   # or sha-XXXXXXX for a specific pin
networks:
internal:
driver: bridge
volumes:
caddy-data:
caddy-config:
secrets:
deribit_credentials:
file: ./secrets/deribit.json
hyperliquid_wallet:
file: ./secrets/hyperliquid.json
bybit_credentials:
file: ./secrets/bybit.json
alpaca_credentials:
file: ./secrets/alpaca.json
macro_credentials:
file: ./secrets/macro.json
sentiment_credentials:
file: ./secrets/sentiment.json
core_token:
file: ./secrets/core.token
observer_token:
file: ./secrets/observer.token
x-common-security: &common-security
cap_drop: [ALL]
security_opt:
- no-new-privileges:true
restart: unless-stopped
networks: [internal]
labels:
com.centurylinklabs.watchtower.enable: "true"
volumes:
- ${AUDIT_LOG_DIR:-/var/log/cerbero-mcp}:/var/log/cerbero-mcp:rw
x-image-prefix: &image_prefix git.tielogic.xyz/adriano/cerbero-mcp
services:
gateway:
image: ${IMAGE_PREFIX:-git.tielogic.xyz/adriano/cerbero-mcp}/gateway:${IMAGE_TAG:-latest}
restart: unless-stopped
networks: [internal]
security_opt:
- no-new-privileges:true
labels:
com.centurylinklabs.watchtower.enable: "true"
ports:
- "${GATEWAY_HTTP_PORT:-80}:80"
- "${GATEWAY_HTTPS_PORT:-443}:443"
environment:
ACME_EMAIL: ${ACME_EMAIL:-adrianodalpastro@tielogic.com}
WRITE_ALLOWLIST: ${WRITE_ALLOWLIST:-127.0.0.1/32 ::1/128 172.16.0.0/12}
volumes:
- ./gateway/Caddyfile:/etc/caddy/Caddyfile:ro
- ./gateway/public:/srv:ro
- caddy-data:/data
- caddy-config:/config
depends_on:
mcp-deribit: { condition: service_healthy }
mcp-hyperliquid: { condition: service_healthy }
mcp-bybit: { condition: service_healthy }
mcp-alpaca: { condition: service_healthy }
mcp-macro: { condition: service_healthy }
mcp-sentiment: { condition: service_healthy }
healthcheck:
test: ["CMD", "wget", "-q", "--spider", "http://localhost/"]
interval: 30s
timeout: 5s
retries: 3
mcp-deribit:
image: ${IMAGE_PREFIX:-git.tielogic.xyz/adriano/cerbero-mcp}/mcp-deribit:${IMAGE_TAG:-latest}
<<: *common-security
user: "1000:1000"
read_only: true
tmpfs:
- /tmp:rw,size=64M,mode=1777
secrets: [deribit_credentials, core_token, observer_token]
environment:
CREDENTIALS_FILE: /run/secrets/deribit_credentials
CORE_TOKEN_FILE: /run/secrets/core_token
OBSERVER_TOKEN_FILE: /run/secrets/observer_token
DERIBIT_TESTNET: "${DERIBIT_TESTNET:-true}"
ROOT_PATH: /mcp-deribit
AUDIT_LOG_FILE: /var/log/cerbero-mcp/deribit.audit.jsonl
mcp-hyperliquid:
image: ${IMAGE_PREFIX:-git.tielogic.xyz/adriano/cerbero-mcp}/mcp-hyperliquid:${IMAGE_TAG:-latest}
<<: *common-security
user: "1000:1000"
read_only: true
tmpfs:
- /tmp:rw,size=64M,mode=1777
secrets: [hyperliquid_wallet, core_token, observer_token]
environment:
HYPERLIQUID_WALLET_FILE: /run/secrets/hyperliquid_wallet
CORE_TOKEN_FILE: /run/secrets/core_token
OBSERVER_TOKEN_FILE: /run/secrets/observer_token
HYPERLIQUID_TESTNET: "${HYPERLIQUID_TESTNET:-true}"
ROOT_PATH: /mcp-hyperliquid
AUDIT_LOG_FILE: /var/log/cerbero-mcp/hyperliquid.audit.jsonl
mcp-bybit:
image: ${IMAGE_PREFIX:-git.tielogic.xyz/adriano/cerbero-mcp}/mcp-bybit:${IMAGE_TAG:-latest}
<<: *common-security
user: "1000:1000"
read_only: true
tmpfs:
- /tmp:rw,size=64M,mode=1777
secrets: [bybit_credentials, core_token, observer_token]
environment:
BYBIT_CREDENTIALS_FILE: /run/secrets/bybit_credentials
CORE_TOKEN_FILE: /run/secrets/core_token
OBSERVER_TOKEN_FILE: /run/secrets/observer_token
BYBIT_TESTNET: "${BYBIT_TESTNET:-true}"
ROOT_PATH: /mcp-bybit
AUDIT_LOG_FILE: /var/log/cerbero-mcp/bybit.audit.jsonl
PORT: "9019"
mcp-alpaca:
image: ${IMAGE_PREFIX:-git.tielogic.xyz/adriano/cerbero-mcp}/mcp-alpaca:${IMAGE_TAG:-latest}
<<: *common-security
user: "1000:1000"
read_only: true
tmpfs:
- /tmp:rw,size=64M,mode=1777
secrets: [alpaca_credentials, core_token, observer_token]
environment:
ALPACA_CREDENTIALS_FILE: /run/secrets/alpaca_credentials
CORE_TOKEN_FILE: /run/secrets/core_token
OBSERVER_TOKEN_FILE: /run/secrets/observer_token
ALPACA_PAPER: "${ALPACA_PAPER:-true}"
ROOT_PATH: /mcp-alpaca
AUDIT_LOG_FILE: /var/log/cerbero-mcp/alpaca.audit.jsonl
PORT: "9020"
mcp-macro:
image: ${IMAGE_PREFIX:-git.tielogic.xyz/adriano/cerbero-mcp}/mcp-macro:${IMAGE_TAG:-latest}
<<: *common-security
user: "1000:1000"
read_only: true
tmpfs:
- /tmp:rw,size=64M,mode=1777
secrets: [macro_credentials, core_token, observer_token]
environment:
MACRO_CREDENTIALS_FILE: /run/secrets/macro_credentials
CORE_TOKEN_FILE: /run/secrets/core_token
OBSERVER_TOKEN_FILE: /run/secrets/observer_token
ROOT_PATH: /mcp-macro
mcp-sentiment:
image: ${IMAGE_PREFIX:-git.tielogic.xyz/adriano/cerbero-mcp}/mcp-sentiment:${IMAGE_TAG:-latest}
<<: *common-security
user: "1000:1000"
read_only: true
tmpfs:
- /tmp:rw,size=64M,mode=1777
secrets: [sentiment_credentials, core_token, observer_token]
environment:
SENTIMENT_CREDENTIALS_FILE: /run/secrets/sentiment_credentials
CORE_TOKEN_FILE: /run/secrets/core_token
OBSERVER_TOKEN_FILE: /run/secrets/observer_token
ROOT_PATH: /mcp-sentiment
# ========================================================
# WATCHTOWER — auto-updates containers with label enable=true
# ========================================================
watchtower:
image: containrrr/watchtower:latest
restart: unless-stopped
networks: [internal]
volumes:
- /var/run/docker.sock:/var/run/docker.sock
- ${HOME}/.docker/config.json:/config.json:ro
environment:
WATCHTOWER_LABEL_ENABLE: "true"
WATCHTOWER_CLEANUP: "true"
WATCHTOWER_POLL_INTERVAL: "${WATCHTOWER_POLL_INTERVAL:-300}"
WATCHTOWER_INCLUDE_RESTARTING: "false"
WATCHTOWER_NOTIFICATIONS_LEVEL: info
WATCHTOWER_LOG_LEVEL: info
command: --interval ${WATCHTOWER_POLL_INTERVAL:-300}
@@ -1,60 +0,0 @@
# docker-compose.traefik.yml — overlay to integrate Cerbero_mcp with a
# Traefik already running on the host (e.g. the same VPS hosting Gitea).
#
# USAGE:
#   docker compose -f docker-compose.prod.yml -f docker-compose.traefik.yml \
#     --env-file .env up -d
#
# Differences vs standalone docker-compose.prod.yml:
# - The Caddy gateway has NO host port binding (Traefik is the public
#   entry point on 80/443).
# - The gateway also joins the external `traefik` network (override the
#   TRAEFIK_NETWORK env var if yours differs, e.g. `gitea_traefik-public`).
# - Caddy does NOT do auto-TLS: Traefik terminates TLS and handles ACME
#   Let's Encrypt. Caddy listens in plaintext on :80 inside the Docker network.
# - Trusted proxies: Caddy honours the X-Forwarded-For received from
#   Traefik for the `remote_ip` match (rate limit + WRITE_ALLOWLIST).
# - Traefik labels on the gateway: routing Host(`cerbero-mcp.tielogic.xyz`) +
#   automatic TLS.
#
# Additional required .env variables:
#   TRAEFIK_NETWORK=gitea_traefik-public   # Traefik network name
#   TRAEFIK_CERTRESOLVER=letsencrypt       # resolver name in your Traefik config
#   TRAEFIK_ENTRYPOINT=websecure           # Traefik HTTPS entrypoint
networks:
traefik:
external: true
name: ${TRAEFIK_NETWORK:-gitea_traefik-public}
services:
gateway:
# Override: no host port binding, traffic flows only via Traefik
ports: !reset []
networks:
- internal
- traefik
environment:
ACME_EMAIL: ${ACME_EMAIL:-adrianodalpastro@tielogic.com}
WRITE_ALLOWLIST: ${WRITE_ALLOWLIST:-127.0.0.1/32 ::1/128 172.16.0.0/12}
# Behind-proxy mode: Caddy listens on plain HTTP :80, no auto_https
LISTEN: ":80"
AUTO_HTTPS: "off"
# Traefik is the forwarding proxy; trust private ranges + optionally Traefik's CIDR
TRUSTED_PROXIES: ${TRUSTED_PROXIES:-private_ranges}
labels:
com.centurylinklabs.watchtower.enable: "true"
traefik.enable: "true"
traefik.docker.network: ${TRAEFIK_NETWORK:-gitea_traefik-public}
traefik.http.routers.cerbero-mcp.rule: "Host(`cerbero-mcp.tielogic.xyz`)"
traefik.http.routers.cerbero-mcp.entrypoints: ${TRAEFIK_ENTRYPOINT:-websecure}
traefik.http.routers.cerbero-mcp.tls: "true"
traefik.http.routers.cerbero-mcp.tls.certresolver: ${TRAEFIK_CERTRESOLVER:-letsencrypt}
traefik.http.services.cerbero-mcp.loadbalancer.server.port: "80"
# Security headers at the Traefik level (redundant with Caddy, but useful
# if Caddy is removed in the future). Comment out to avoid duplication.
traefik.http.routers.cerbero-mcp.middlewares: cerbero-mcp-secheaders@docker
traefik.http.middlewares.cerbero-mcp-secheaders.headers.stsSeconds: "31536000"
traefik.http.middlewares.cerbero-mcp-secheaders.headers.stsIncludeSubdomains: "true"
traefik.http.middlewares.cerbero-mcp-secheaders.headers.contentTypeNosniff: "true"
traefik.http.middlewares.cerbero-mcp-secheaders.headers.referrerPolicy: "no-referrer"
@@ -1,180 +1,18 @@
networks:
internal:
driver: bridge
volumes:
caddy-data:
caddy-config:
secrets:
deribit_credentials:
file: ./secrets/deribit.json
hyperliquid_wallet:
file: ./secrets/hyperliquid.json
bybit_credentials:
file: ./secrets/bybit.json
alpaca_credentials:
file: ./secrets/alpaca.json
macro_credentials:
file: ./secrets/macro.json
sentiment_credentials:
file: ./secrets/sentiment.json
core_token:
file: ./secrets/core.token
observer_token:
file: ./secrets/observer.token
x-common-security: &common-security
cap_drop: [ALL]
security_opt:
- no-new-privileges:true
restart: unless-stopped
networks: [internal]
services:
# ========================================================
# GATEWAY — single host port, reverse proxy + landing page
# ========================================================
gateway:
build:
context: ./gateway
dockerfile: Dockerfile
image: cerbero-gateway:dev
restart: unless-stopped
networks: [internal]
security_opt:
- no-new-privileges:true
cerbero-mcp:
image: cerbero-mcp:2.0.0
build: .
container_name: cerbero-mcp
ports:
- "${GATEWAY_HTTP_PORT:-80}:80"
- "${GATEWAY_HTTPS_PORT:-443}:443"
environment:
ACME_EMAIL: ${ACME_EMAIL:-adrianodalpastro@tielogic.com}
WRITE_ALLOWLIST: ${WRITE_ALLOWLIST:-127.0.0.1/32 ::1/128 172.16.0.0/12}
volumes:
- ./gateway/Caddyfile:/etc/caddy/Caddyfile:ro
- ./gateway/public:/srv:ro
- caddy-data:/data
- caddy-config:/config
depends_on:
mcp-deribit: { condition: service_healthy }
mcp-hyperliquid: { condition: service_healthy }
mcp-bybit: { condition: service_healthy }
mcp-alpaca: { condition: service_healthy }
mcp-macro: { condition: service_healthy }
mcp-sentiment: { condition: service_healthy }
- "${PORT:-9000}:${PORT:-9000}"
env_file: .env
restart: unless-stopped
healthcheck:
test: ["CMD", "wget", "-q", "--spider", "http://localhost/"]
test:
- "CMD"
- "python"
- "-c"
- "import os, urllib.request; urllib.request.urlopen(f'http://localhost:{os.environ.get(\"PORT\",\"9000\")}/health', timeout=3).close()"
interval: 30s
timeout: 5s
retries: 3
# ========================================================
# MCP — reachable only via the gateway (no host ports)
# ========================================================
mcp-deribit:
image: cerbero-mcp-deribit:dev
build:
context: .
dockerfile: docker/mcp-deribit.Dockerfile
<<: *common-security
user: "1000:1000"
read_only: true
tmpfs:
- /tmp:rw,size=64M,mode=1777
secrets: [deribit_credentials, core_token, observer_token]
environment:
CREDENTIALS_FILE: /run/secrets/deribit_credentials
CORE_TOKEN_FILE: /run/secrets/core_token
OBSERVER_TOKEN_FILE: /run/secrets/observer_token
DERIBIT_TESTNET: "true" # override secrets/deribit.json testnet flag
ROOT_PATH: /mcp-deribit
mcp-hyperliquid:
image: cerbero-mcp-hyperliquid:dev
build:
context: .
dockerfile: docker/mcp-hyperliquid.Dockerfile
<<: *common-security
user: "1000:1000"
read_only: true
tmpfs:
- /tmp:rw,size=64M,mode=1777
secrets: [hyperliquid_wallet, core_token, observer_token]
environment:
HYPERLIQUID_WALLET_FILE: /run/secrets/hyperliquid_wallet
CORE_TOKEN_FILE: /run/secrets/core_token
OBSERVER_TOKEN_FILE: /run/secrets/observer_token
HYPERLIQUID_TESTNET: "true" # override secrets/hyperliquid.json testnet flag
ROOT_PATH: /mcp-hyperliquid
mcp-bybit:
image: cerbero-mcp-bybit:dev
build:
context: .
dockerfile: docker/mcp-bybit.Dockerfile
<<: *common-security
user: "1000:1000"
read_only: true
tmpfs:
- /tmp:rw,size=64M,mode=1777
secrets: [bybit_credentials, core_token, observer_token]
environment:
BYBIT_CREDENTIALS_FILE: /run/secrets/bybit_credentials
CORE_TOKEN_FILE: /run/secrets/core_token
OBSERVER_TOKEN_FILE: /run/secrets/observer_token
BYBIT_TESTNET: "true" # override secrets/bybit.json testnet flag
ROOT_PATH: /mcp-bybit
PORT: "9019"
mcp-alpaca:
image: cerbero-mcp-alpaca:dev
build:
context: .
dockerfile: docker/mcp-alpaca.Dockerfile
<<: *common-security
user: "1000:1000"
read_only: true
tmpfs:
- /tmp:rw,size=64M,mode=1777
secrets: [alpaca_credentials, core_token, observer_token]
environment:
ALPACA_CREDENTIALS_FILE: /run/secrets/alpaca_credentials
CORE_TOKEN_FILE: /run/secrets/core_token
OBSERVER_TOKEN_FILE: /run/secrets/observer_token
ALPACA_PAPER: "true"
ROOT_PATH: /mcp-alpaca
PORT: "9020"
mcp-macro:
image: cerbero-mcp-macro:dev
build:
context: .
dockerfile: docker/mcp-macro.Dockerfile
<<: *common-security
user: "1000:1000"
read_only: true
tmpfs:
- /tmp:rw,size=64M,mode=1777
secrets: [macro_credentials, core_token, observer_token]
environment:
MACRO_CREDENTIALS_FILE: /run/secrets/macro_credentials
CORE_TOKEN_FILE: /run/secrets/core_token
OBSERVER_TOKEN_FILE: /run/secrets/observer_token
ROOT_PATH: /mcp-macro
mcp-sentiment:
image: cerbero-mcp-sentiment:dev
build:
context: .
dockerfile: docker/mcp-sentiment.Dockerfile
<<: *common-security
user: "1000:1000"
read_only: true
tmpfs:
- /tmp:rw,size=64M,mode=1777
secrets: [sentiment_credentials, core_token, observer_token]
environment:
SENTIMENT_CREDENTIALS_FILE: /run/secrets/sentiment_credentials
CORE_TOKEN_FILE: /run/secrets/core_token
OBSERVER_TOKEN_FILE: /run/secrets/observer_token
ROOT_PATH: /mcp-sentiment
@@ -1,12 +0,0 @@
FROM python:3.11-slim AS base
RUN apt-get update && apt-get install -y --no-install-recommends \
build-essential curl \
&& rm -rf /var/lib/apt/lists/*
RUN pip install --no-cache-dir "uv>=0.5,<0.7"
WORKDIR /app
COPY pyproject.toml uv.lock ./
COPY services/common ./services/common
RUN uv sync --frozen --no-dev --package mcp-common
ENV PATH="/app/.venv/bin:$PATH"
FROM base AS dev
RUN uv sync --frozen --package mcp-common
@@ -1,25 +0,0 @@
ARG BASE_IMAGE=cerbero-base
ARG BASE_TAG=latest
FROM ${BASE_IMAGE}:${BASE_TAG} AS builder
COPY services/mcp-alpaca ./services/mcp-alpaca
RUN uv sync --frozen --no-dev --package mcp-alpaca
FROM python:3.11-slim AS runtime
LABEL org.opencontainers.image.source="https://github.com/AdrianoDev/cerbero" \
cerbero.service="mcp-alpaca"
WORKDIR /app
COPY --from=builder /app /app
ENV PATH="/app/.venv/bin:$PATH"
RUN useradd -m -u 1000 app
USER app
ENV HOST=0.0.0.0 PORT=9020
EXPOSE 9020
HEALTHCHECK --interval=30s --timeout=5s --retries=3 --start-period=15s \
CMD python -c "import os, urllib.request; urllib.request.urlopen(f'http://localhost:{os.environ.get(\"PORT\",\"9020\")}/health', timeout=3).close()"
CMD ["mcp-alpaca"]
@@ -1,25 +0,0 @@
ARG BASE_IMAGE=cerbero-base
ARG BASE_TAG=latest
FROM ${BASE_IMAGE}:${BASE_TAG} AS builder
COPY services/mcp-bybit ./services/mcp-bybit
RUN uv sync --frozen --no-dev --package mcp-bybit
FROM python:3.11-slim AS runtime
LABEL org.opencontainers.image.source="https://github.com/AdrianoDev/cerbero" \
cerbero.service="mcp-bybit"
WORKDIR /app
COPY --from=builder /app /app
ENV PATH="/app/.venv/bin:$PATH"
RUN useradd -m -u 1000 app
USER app
ENV HOST=0.0.0.0 PORT=9019
EXPOSE 9019
HEALTHCHECK --interval=30s --timeout=5s --retries=3 --start-period=15s \
CMD python -c "import os, urllib.request; urllib.request.urlopen(f'http://localhost:{os.environ.get(\"PORT\",\"9019\")}/health', timeout=3).close()"
CMD ["mcp-bybit"]
@@ -1,27 +0,0 @@
# CER-P5-012 multi-stage slim: builder from cerbero-base (with uv + toolchain),
# runtime from python:3.11-slim (venv + source only).
ARG BASE_IMAGE=cerbero-base
ARG BASE_TAG=latest
FROM ${BASE_IMAGE}:${BASE_TAG} AS builder
COPY services/mcp-deribit ./services/mcp-deribit
RUN uv sync --frozen --no-dev --package mcp-deribit
FROM python:3.11-slim AS runtime
LABEL org.opencontainers.image.source="https://github.com/AdrianoDev/cerbero" \
cerbero.service="mcp-deribit"
WORKDIR /app
COPY --from=builder /app /app
ENV PATH="/app/.venv/bin:$PATH"
RUN useradd -m -u 1000 app
USER app
ENV HOST=0.0.0.0 PORT=9011
EXPOSE 9011
HEALTHCHECK --interval=30s --timeout=5s --retries=3 --start-period=15s \
CMD python -c "import os, urllib.request; urllib.request.urlopen(f'http://localhost:{os.environ.get(\"PORT\",\"9011\")}/health', timeout=3).close()"
CMD ["mcp-deribit"]
@@ -1,25 +0,0 @@
ARG BASE_IMAGE=cerbero-base
ARG BASE_TAG=latest
FROM ${BASE_IMAGE}:${BASE_TAG} AS builder
COPY services/mcp-hyperliquid ./services/mcp-hyperliquid
RUN uv sync --frozen --no-dev --package mcp-hyperliquid
FROM python:3.11-slim AS runtime
LABEL org.opencontainers.image.source="https://github.com/AdrianoDev/cerbero" \
cerbero.service="mcp-hyperliquid"
WORKDIR /app
COPY --from=builder /app /app
ENV PATH="/app/.venv/bin:$PATH"
RUN useradd -m -u 1000 app
USER app
ENV HOST=0.0.0.0 PORT=9012
EXPOSE 9012
HEALTHCHECK --interval=30s --timeout=5s --retries=3 --start-period=15s \
CMD python -c "import os, urllib.request; urllib.request.urlopen(f'http://localhost:{os.environ.get(\"PORT\",\"9012\")}/health', timeout=3).close()"
CMD ["mcp-hyperliquid"]
@@ -1,25 +0,0 @@
ARG BASE_IMAGE=cerbero-base
ARG BASE_TAG=latest
FROM ${BASE_IMAGE}:${BASE_TAG} AS builder
COPY services/mcp-macro ./services/mcp-macro
RUN uv sync --frozen --no-dev --package mcp-macro
FROM python:3.11-slim AS runtime
LABEL org.opencontainers.image.source="https://github.com/AdrianoDev/cerbero" \
cerbero.service="mcp-macro"
WORKDIR /app
COPY --from=builder /app /app
ENV PATH="/app/.venv/bin:$PATH"
RUN useradd -m -u 1000 app
USER app
ENV HOST=0.0.0.0 PORT=9013
EXPOSE 9013
HEALTHCHECK --interval=30s --timeout=5s --retries=3 --start-period=15s \
CMD python -c "import os, urllib.request; urllib.request.urlopen(f'http://localhost:{os.environ.get(\"PORT\",\"9013\")}/health', timeout=3).close()"
CMD ["mcp-macro"]
-25
@@ -1,25 +0,0 @@
ARG BASE_IMAGE=cerbero-base
ARG BASE_TAG=latest
FROM ${BASE_IMAGE}:${BASE_TAG} AS builder
COPY services/mcp-sentiment ./services/mcp-sentiment
RUN uv sync --frozen --no-dev --package mcp-sentiment
FROM python:3.11-slim AS runtime
LABEL org.opencontainers.image.source="https://github.com/AdrianoDev/cerbero" \
cerbero.service="mcp-sentiment"
WORKDIR /app
COPY --from=builder /app /app
ENV PATH="/app/.venv/bin:$PATH"
RUN useradd -m -u 1000 app
USER app
ENV HOST=0.0.0.0 PORT=9014
EXPOSE 9014
HEALTHCHECK --interval=30s --timeout=5s --retries=3 --start-period=15s \
CMD python -c "import os, urllib.request; urllib.request.urlopen(f'http://localhost:{os.environ.get(\"PORT\",\"9014\")}/health', timeout=3).close()"
CMD ["mcp-sentiment"]
File diff suppressed because it is too large
@@ -0,0 +1,562 @@
# Cerbero MCP — V2.0.0: Unified Image, Token-Based Environment Routing
**Branch:** `V2.0.0`
**Spec date:** 2026-04-30
**Author:** Adriano
## Executive summary
An architectural rewrite of Cerbero MCP to shrink the operational surface
and simplify testnet/mainnet management. We move from 7 Docker images
(Caddy gateway + 6 MCP services) to a **single image** hosting every
exchange router in one FastAPI process. The testnet-vs-mainnet
distinction, today decided at container boot via override variables and
flags in the JSON secrets, moves **to per-request runtime**: the bearer
token presented by the client (`Authorization: Bearer <token>`) determines
which upstream endpoint is contacted for that specific call. All
configuration converges into a single `.env` file, eliminating the
separate JSON secrets. Finally, interactive OpenAPI documentation is
exposed via Swagger UI at `/apidocs`.
## Motivations
- **Operations**: 7 containers, 8 secret files, 4 docker-compose overlays
  and a Caddy gateway with a rate-limit plugin are too much for the
  expected traffic volume. A single container is enough.
- **Environment flexibility**: today a bot that wants to read mainnet and
  write testnet has to coordinate two deploys. With per-request routing it
  just picks the right bearer for each call.
- **Configuration**: 8 JSON secrets + 2 token files + override variables
  across 4 compose files = distributed state that is hard to audit. A
  single `.env` makes it obvious what is configured.
- **Dev DX**: today you need `docker compose up` even to iterate. V2
  targets `uv run cerbero-mcp` directly on a laptop, no Docker.
- **API discovery**: without Swagger the only source of truth for the
  tools is the code. `/apidocs` makes the tools explorable from a browser,
  and `/openapi.json` makes them readable by LLMs without the full MCP
  protocol.
## Design decisions
| # | Question | Decision |
|---|---|---|
| 1 | Meaning of "pass the token to the function" | Per-request routing via bearer: the same container serves testnet and mainnet simultaneously, deciding the upstream URL at runtime |
| 2 | Token granularity | 2 global tokens (`TESTNET_TOKEN`, `MAINNET_TOKEN`) valid for all exchanges |
| 3 | Core/observer (read-only) ACL | Dropped. Write protection remains via the server-side leverage cap and the firewall (Traefik on the VPS) |
| 4 | "Single image" scope | One image, **one container** with an internal multi-router (one Python process) |
| 5 | L7 gateway | Removed from the repo. The prod VPS has an externally managed Traefik. No gateway in dev |
| 6 | Configuration format | Everything in `.env`. No JSON. The port lives in `.env` |
| 7 | Swagger | `/apidocs` ON, `/openapi.json` ON, `/redoc` OFF, `/docs` OFF |
| 8a | Exchange dispatch | Path-based: `/mcp-{exchange}/tools/{tool}` (backward-compatible with V1 bots) |
| 8b | Exchange client lifecycle | Lazy cache `(exchange, env) → client`, max 8 clients (4 exchanges × 2 envs) + 2 read-only clients (macro, sentiment) |
Explicit decisions were also made on:
- **Macro and sentiment require a valid token**: even the purely read-only
  tools go through the auth middleware. A missing or unrecognized bearer
  → 401. The token's value (testnet or mainnet) is ignored for macro and
  sentiment since they have no testnet endpoints, but one of the two must
  be present.
- **No `ALLOW_MAINNET` policy**: protection against accidental mainnet use
  is delegated to (a) token custody (the mainnet token goes only to
  authorized bots), (b) the Traefik IP allowlist on the VPS, (c) the
  existing server-side leverage cap for each exchange.
- **A minimal `docker-compose.yml`** is kept for anyone who wants Docker
  locally; `docker-compose.{prod,traefik,local}.yml` are removed.
## Architecture
### Runtime stack
```
Prod (VPS):
Traefik (TLS, allowlist) ──▶ container cerbero-mcp:9000
Dev (laptop):
uv run cerbero-mcp ──▶ http://localhost:9000
```
No gateway in the repo. Traefik is managed outside this project, with
labels added through an external compose override. In dev, FastAPI is
exposed directly via uvicorn.
### Source layout
```
Cerbero_mcp/
├── pyproject.toml # single package "cerbero-mcp"
├── uv.lock
├── Dockerfile # multi-stage builder + slim runtime
├── docker-compose.yml # minimal: 1 service, env_file: .env
├── .env.example # complete template, versioned
├── .gitignore # .env excluded
├── README.md # rewritten for V2
├── docs/
│ └── superpowers/
│ ├── specs/
│ │ ├── 2026-04-27-cot-report-design.md (historical)
│ │ └── 2026-04-30-V2.0.0-unified-image-...-design.md
│ └── plans/
│ ├── 2026-04-27-cot-report.md (historical)
│ └── 2026-04-30-V2.0.0-unified-image-...-plan.md
├── src/cerbero_mcp/
│ ├── __init__.py
│ ├── __main__.py # entrypoint: uvicorn.run(app, ...)
│ ├── settings.py # Pydantic Settings for .env
│ ├── auth.py # bearer → environment
│ ├── server.py # build_app(): FastAPI + middleware + handlers + swagger
│ ├── client_registry.py # lazy cache {(exchange, env): client}
│ ├── routers/
│ │ ├── __init__.py
│ │ ├── deribit.py
│ │ ├── bybit.py
│ │ ├── hyperliquid.py
│ │ ├── alpaca.py
│ │ ├── macro.py
│ │ └── sentiment.py
│ ├── exchanges/
│ │ ├── deribit/{client.py, tools.py, leverage_cap.py}
│ │ ├── bybit/{client.py, tools.py, leverage_cap.py}
│ │ ├── hyperliquid/{client.py, tools.py, leverage_cap.py}
│ │ ├── alpaca/{client.py, tools.py, leverage_cap.py}
│ │ ├── macro/{client.py, tools.py}
│ │ └── sentiment/{client.py, tools.py}
│ └── common/
│ ├── indicators.py
│ ├── options.py
│ ├── microstructure.py
│ ├── stats.py
│ ├── http.py
│ ├── audit.py
│ ├── logging.py
│ └── errors.py # error envelope
├── tests/
│ ├── unit/
│ ├── integration/
│ └── smoke/
└── scripts/
└── build-push.sh # builds a single image
```
**What gets removed:**
- `services/` (the whole 7-sub-package monorepo layout, replaced by
  `src/cerbero_mcp/`)
- `gateway/` (Caddy + Caddyfile + landing page)
- `secrets/` (8 JSON files + 2 token files)
- `docker/` (7 separate Dockerfiles, replaced by the root `Dockerfile`)
- `docker-compose.prod.yml`, `docker-compose.local.yml`,
  `docker-compose.traefik.yml`
- `DEPLOYMENT.md` (the still-valid content moves into `README.md`)
- `mcp_common.environment` (boot-time resolver, replaced by runtime
  `auth.py`)
- `mcp_common.env_validation` (replaced by Pydantic Settings)
- `mcp_common.app_factory` (boot boilerplate, folded into `server.py`)
**What stays the same:**
- Path layout `/mcp-{exchange}/tools/{tool}` (backward-compatible with V1 bots)
- Individual MCP tools: signatures, response shape, error envelope,
  `X-Data-Timestamp` and `X-Duration-Ms` headers
- Quantitative indicator logic (`indicators`, `options`,
  `microstructure`, `stats`)
- `/health` healthcheck (identical format)
- Server-side leverage cap per exchange
- MCP-bridge tool (if in use) preserved in `common/mcp_bridge.py`
### Request flow
```
1. Bot HTTP request
POST /mcp-deribit/tools/place_order
Authorization: Bearer tk_test_xxx
{ "symbol":"BTC-PERPETUAL", "side":"buy", "qty": 0.1 }
2. Middleware AuthBearer (auth.py)
- path whitelist: /apidocs, /openapi.json, /health → bypass
- extracts the bearer
- compares it with settings.testnet_token / settings.mainnet_token
- testnet match → request.state.environment = "testnet"
- mainnet match → request.state.environment = "mainnet"
- no match → 401 UNAUTHORIZED
3. Router deribit (routers/deribit.py)
- FastAPI validates the body against the Pydantic schema
- dependency get_env(request) -> "testnet"|"mainnet"
- dependency get_client(env) -> DeribitClient
4. Client Registry (client_registry.py)
- key (exchange="deribit", env="testnet")
- cache hit → return; miss → lazily builds the client + initial auth + cache
5. Tool impl (exchanges/deribit/tools.py)
- leverage_cap.enforce(qty, max_leverage)
- client.place_order(...)
- standard response shape with data_timestamp, request_id
6. Response middleware
- X-Duration-Ms header
- inject data_timestamp if missing
7. 200 JSON
```
### Auth
```python
# src/cerbero_mcp/auth.py (condensed sketch)
import secrets
from typing import Literal

from fastapi import Request
from fastapi.responses import JSONResponse

Environment = Literal["testnet", "mainnet"]
WHITELIST_PATHS = {"/health", "/apidocs", "/openapi.json"}

def _matches(token: str, expected: str) -> bool:
    # Constant-time comparison to avoid timing attacks.
    return secrets.compare_digest(token.encode(), expected.encode())

async def auth_middleware(request: Request, call_next):
    if request.url.path in WHITELIST_PATHS:
        return await call_next(request)
    auth = request.headers.get("Authorization", "")
    if not auth.startswith("Bearer "):
        # Return the response directly: an HTTPException raised inside
        # middleware bypasses FastAPI's exception handlers.
        return JSONResponse({"detail": "missing bearer token"}, status_code=401)
    token = auth[len("Bearer "):].strip()
    settings = request.app.state.settings
    if _matches(token, settings.testnet_token.get_secret_value()):
        request.state.environment = "testnet"
    elif _matches(token, settings.mainnet_token.get_secret_value()):
        request.state.environment = "mainnet"
    else:
        return JSONResponse({"detail": "invalid token"}, status_code=401)
    return await call_next(request)
```
Tokens are compared with `secrets.compare_digest` to avoid timing attacks.
### Client registry
```python
# src/cerbero_mcp/client_registry.py (condensed)
import asyncio
from collections import defaultdict
from typing import Any
class ClientRegistry:
def __init__(self, settings):
self._settings = settings
self._clients: dict[tuple[str, Environment], Any] = {}
self._locks: dict[tuple[str, Environment], asyncio.Lock] = defaultdict(asyncio.Lock)
async def get(self, exchange: str, env: Environment) -> Any:
key = (exchange, env)
if key in self._clients:
return self._clients[key]
async with self._locks[key]:
if key in self._clients:
return self._clients[key]
client = await self._build(exchange, env)
self._clients[key] = client
return client
async def aclose(self):
for c in self._clients.values():
await c.aclose()
```
- **Lazy** construction on first use → fast boot, no auth against unused
  exchanges
- **Per-key lock** avoids double instantiation under a race
- Macro and sentiment use the same client for testnet and mainnet (the env
  is ignored), but for API uniformity they still receive the
  `(exchange, env)` key
- FastAPI lifespan: the registry is created on `startup` and closed on
  `shutdown` (closing HTTP pools, any websockets, sessions)
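The lifespan wiring might look roughly like this — a sketch with a duck-typed `app` and a stub registry; in `server.py` the function would be passed as `FastAPI(lifespan=lifespan)`:

```python
import asyncio
from contextlib import asynccontextmanager
from types import SimpleNamespace


class Registry:
    """Stub registry exposing the aclose() used at shutdown."""

    def __init__(self):
        self.closed = False

    async def aclose(self):
        self.closed = True


@asynccontextmanager
async def lifespan(app):
    app.state.registry = Registry()        # startup: create the registry
    yield                                  # the app serves requests here
    await app.state.registry.aclose()      # shutdown: release pools/sessions


async def main() -> bool:
    app = SimpleNamespace(state=SimpleNamespace())
    async with lifespan(app):
        assert not app.state.registry.closed  # open while serving
    return app.state.registry.closed           # closed after shutdown


print(asyncio.run(main()))  # True
```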
### Configuration: `.env`
A single source of truth, read by Pydantic Settings at boot. Versioned as
`.env.example` with empty placeholders; the real `.env` is gitignored.
```bash
# SERVER
HOST=0.0.0.0
PORT=9000
LOG_LEVEL=info
# AUTH
TESTNET_TOKEN=
MAINNET_TOKEN=
# DERIBIT
DERIBIT_CLIENT_ID=
DERIBIT_CLIENT_SECRET=
DERIBIT_URL_LIVE=https://www.deribit.com/api/v2
DERIBIT_URL_TESTNET=https://test.deribit.com/api/v2
DERIBIT_MAX_LEVERAGE=3
# BYBIT
BYBIT_API_KEY=
BYBIT_API_SECRET=
BYBIT_URL_LIVE=https://api.bybit.com
BYBIT_URL_TESTNET=https://api-testnet.bybit.com
BYBIT_MAX_LEVERAGE=3
# HYPERLIQUID
HYPERLIQUID_WALLET_ADDRESS=
HYPERLIQUID_API_WALLET_ADDRESS=
HYPERLIQUID_PRIVATE_KEY=
HYPERLIQUID_URL_LIVE=https://api.hyperliquid.xyz
HYPERLIQUID_URL_TESTNET=https://api.hyperliquid-testnet.xyz
HYPERLIQUID_MAX_LEVERAGE=3
# ALPACA
ALPACA_API_KEY_ID=
ALPACA_SECRET_KEY=
ALPACA_URL_LIVE=https://api.alpaca.markets
ALPACA_URL_TESTNET=https://paper-api.alpaca.markets
ALPACA_MAX_LEVERAGE=1
# MACRO
FRED_API_KEY=
FINNHUB_API_KEY=
# SENTIMENT
CRYPTOPANIC_KEY=
LUNARCRUSH_KEY=
```
Pydantic Settings with `SecretStr` for sensitive values prevents leaks in
logs/repr. `extra="ignore"` tolerates extra env vars (system variables,
Docker) without crashing. Validation fails fast at boot.
### Swagger / OpenAPI
```python
app = FastAPI(
title="Cerbero MCP",
version="2.0.0",
description="Multi-exchange MCP server. Bearer token decides environment (testnet/mainnet).",
docs_url="/apidocs",
redoc_url=None,
openapi_url="/openapi.json",
swagger_ui_parameters={
"persistAuthorization": True,
"displayRequestDuration": True,
"filter": True,
"tryItOutEnabled": True,
"tagsSorter": "alpha",
"operationsSorter": "alpha",
},
)
```
A `securityScheme = BearerAuth` is added to every endpoint under `/mcp-*`.
Click Authorize in Swagger → enter the bearer → every "Try it out" request
sends the header. Switching token = switching environment without a reload.
Tags organized by exchange:
- `system` → `/health`
- `deribit`, `bybit`, `hyperliquid`, `alpaca`, `macro`, `sentiment` →
  their respective tools
Every Pydantic request body includes `examples=[...]` with at least one
realistic example. For complex response shapes (gamma profile, orderbook
imbalance) the Pydantic response models include examples too.
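A hypothetical request model illustrating the `examples=[...]` convention; the field names mirror the `place_order` body shown in the request flow, the class name is an assumption:

```python
from pydantic import BaseModel, Field


class PlaceOrderRequest(BaseModel):
    # Per-field examples surface in Swagger's "Try it out" form.
    symbol: str = Field(examples=["BTC-PERPETUAL"])
    side: str = Field(examples=["buy"])
    qty: float = Field(gt=0, examples=[0.1])


req = PlaceOrderRequest(symbol="BTC-PERPETUAL", side="buy", qty=0.1)
print(req.model_dump())
# {'symbol': 'BTC-PERPETUAL', 'side': 'buy', 'qty': 0.1}
```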
### Dockerfile and image
```dockerfile
# syntax=docker/dockerfile:1.7
FROM python:3.11-slim AS builder
RUN apt-get update && apt-get install -y --no-install-recommends \
build-essential curl && rm -rf /var/lib/apt/lists/*
RUN pip install --no-cache-dir "uv>=0.5,<0.7"
WORKDIR /app
COPY pyproject.toml uv.lock ./
COPY src ./src
RUN uv sync --frozen --no-dev
FROM python:3.11-slim AS runtime
LABEL org.opencontainers.image.title="cerbero-mcp" \
org.opencontainers.image.version="2.0.0"
WORKDIR /app
COPY --from=builder /app /app
ENV PATH="/app/.venv/bin:$PATH" \
HOST=0.0.0.0 \
PORT=9000 \
PYTHONUNBUFFERED=1
RUN useradd -m -u 1000 app && chown -R app:app /app
USER app
EXPOSE 9000
HEALTHCHECK --interval=30s --timeout=5s --retries=3 --start-period=10s \
CMD python -c "import os, urllib.request; urllib.request.urlopen(f'http://localhost:{os.environ.get(\"PORT\",\"9000\")}/health', timeout=3).close()"
CMD ["cerbero-mcp"]
```
- A single image `cerbero-mcp:2.0.0` (+ `:latest`)
- Expected build time: ~2-3 min (vs ~12 min × 7 images in V1)
- Expected image size: ~200 MB
- Non-root user `app:1000`
- The healthcheck reads `PORT` from env (honors `.env` overrides)
### Minimal docker-compose.yml
```yaml
services:
cerbero-mcp:
image: cerbero-mcp:2.0.0
build: .
container_name: cerbero-mcp
ports:
- "${PORT:-9000}:${PORT:-9000}"
env_file: .env
restart: unless-stopped
healthcheck:
test: ["CMD", "python", "-c",
"import os, urllib.request; urllib.request.urlopen(f'http://localhost:{os.environ.get(\"PORT\",\"9000\")}/health', timeout=3).close()"]
interval: 30s
timeout: 5s
retries: 3
```
On the VPS this is extended with an external override that applies the
Traefik labels (file not versioned in this repo).
## Errors
| Case | Status | Code |
|---|---|---|
| Missing Authorization header | 401 | UNAUTHORIZED |
| Token not in `{testnet, mainnet}` | 401 | UNAUTHORIZED |
| Invalid body (Pydantic) | 422 | INVALID_INPUT |
| Exchange upstream 5xx | 502 | UPSTREAM_ERROR |
| Upstream rate limit | 429 | RATE_LIMIT |
| Unhandled exception | 500 | UNHANDLED_EXCEPTION |
The error envelope is identical to V1: fields `error.{type, code, message,
retryable, suggested_fix?, details?}`, `request_id`, `data_timestamp`.
## Test plan
**Unit:**
- `auth.py`: 5 cases (no header, malformed header, testnet token,
  mainnet token, invalid token). Verify that `request.state.environment`
  is set correctly.
- `auth.py`: token comparison uses `secrets.compare_digest` (verify with
  a test that exercises both branches).
- `auth.py`: whitelisted paths (`/health`, `/apidocs`, `/openapi.json`)
  bypass the middleware.
- `client_registry.py`: concurrency — 10 `get(deribit, testnet)` tasks in
  parallel, `_build` called exactly once.
- `client_registry.py`: distinct keys instantiate distinct clients
  (deribit/testnet ≠ deribit/mainnet ≠ bybit/testnet).
- `settings.py`: a valid `.env` loads without errors; a missing mandatory
  field raises ValidationError at boot.
- `common/`: all existing tests on indicators, options,
  microstructure, stats migrate 1:1.
- For each `exchanges/{exchange}/tools.py`: tools with a stub client
  return the expected response shape (existing V1 tests migrated).
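The registry concurrency test can be sketched with a self-contained mini-registry mirroring the code above (plain `asyncio`, no pytest needed; the real test would target `ClientRegistry` itself):

```python
import asyncio
from collections import defaultdict


class Registry:
    """Inline stand-in for ClientRegistry, counting _build invocations."""

    def __init__(self):
        self.build_calls = 0
        self._clients = {}
        self._locks = defaultdict(asyncio.Lock)

    async def _build(self, key):
        self.build_calls += 1
        await asyncio.sleep(0.01)  # simulate slow auth toward the exchange
        return object()

    async def get(self, exchange, env):
        key = (exchange, env)
        if key in self._clients:
            return self._clients[key]
        async with self._locks[key]:
            if key in self._clients:  # double-checked after acquiring the lock
                return self._clients[key]
            self._clients[key] = await self._build(key)
            return self._clients[key]


async def test_single_build() -> int:
    reg = Registry()
    clients = await asyncio.gather(
        *(reg.get("deribit", "testnet") for _ in range(10))
    )
    assert len({id(c) for c in clients}) == 1  # everyone got the same client
    return reg.build_calls


print(asyncio.run(test_single_build()))  # 1
```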
**Integration:**
- An HTTP stub intercepting requests toward the deribit URLs:
  - a request with `Bearer TESTNET_TOKEN` hits `DERIBIT_URL_TESTNET`
  - a request with `Bearer MAINNET_TOKEN` hits `DERIBIT_URL_LIVE`
- Same pattern for bybit, hyperliquid, alpaca.
- Macro and sentiment: requests with either the testnet or mainnet bearer
  both work; a request without a bearer → 401.
- Swagger UI: `GET /apidocs` returns HTML with the BearerAuth
  securityScheme present.
- OpenAPI: `GET /openapi.json` returns a valid OpenAPI 3.1 schema with a
  tag for each exchange.
**Smoke (post-deploy):**
```bash
curl -s http://localhost:9000/health
curl -s http://localhost:9000/apidocs | grep -q "Cerbero MCP"
curl -s -H "Authorization: Bearer $TESTNET_TOKEN" \
http://localhost:9000/mcp-deribit/tools/get_ticker \
-d '{"instrument": "BTC-PERPETUAL"}'
```
## Migrating from V1
For those running V1 in production:
1. **Backup**: `cp -r secrets/ secrets.v1.bak/`.
2. **Generate the V2 tokens**:
```bash
python -c 'import secrets; print("TESTNET_TOKEN=" + secrets.token_urlsafe(32))'
python -c 'import secrets; print("MAINNET_TOKEN=" + secrets.token_urlsafe(32))'
```
3. **Fill in `.env`**: map V1 → V2 (fields in the V1 JSON files have
   slightly different names; see the table in the V2 README, section
   "Migration").
4. **Update bot clients**:
   - URL unchanged (`/mcp-{exchange}/tools/{tool}` stays the same)
   - The bearer changes: where you previously used `core.token` or
     `observer.token`, now use `TESTNET_TOKEN` or `MAINNET_TOKEN`
     depending on the environment desired for that request.
5. **Shut down V1**: `docker compose down` (old compose).
6. **Start V2**: `docker compose up -d` (new minimal compose), then
   verify with `curl /health` and `curl /apidocs`.
7. **Rollback** is available by pulling the V1 image tags
   (`cerbero-mcp-*:1.x`) if needed, but `.env` and `secrets/` are
   incompatible between V1 and V2 — keep separate backups.
## Documentation cleanup
| File | V1 | V2 action |
|---|---|---|
| `README.md` | describes 6 services + Caddy + 8 secrets + build/push of 8 images | Rewritten from scratch |
| `DEPLOYMENT.md` | runbook for 7 images, Caddy gateway, Caddy allowlist, no-clone deploy | **Deleted**. Useful content moves to `README.md` |
| `docs/superpowers/specs/2026-04-27-cot-report-design.md` | spec of a past feature | Kept (historical) |
| `docs/superpowers/plans/2026-04-27-cot-report.md` | plan of a past feature | Kept (historical) |
| `docs/superpowers/specs/2026-04-30-V2.0.0-...-design.md` | (new) | Created |
| `docs/superpowers/plans/2026-04-30-V2.0.0-...-plan.md` | (new) | Created after writing-plans |
Rationale for deleting `DEPLOYMENT.md`: 16 KB of docs on build/push of 8
images, the Caddy gateway, secret mounts, the Caddy IP allowlist and
no-clone deploys — all obsolete in V2. The ~30 still-valid lines (smoke
test, Watchtower rollback) are folded into the "Deploy" section of the
new `README.md`.
## Out of scope
To avoid scope creep within the same sprint, the following stay out:
- **HSTS / custom security headers**: handled by Caddy in V1. In V2 they
  are delegated to Traefik on the VPS (managed externally). No
  application-level addition in V2.
- **Application-level rate limiting** (`slowapi` or similar): delegated to
  Traefik.
- **Prometheus metrics**: deferred to a later iteration.
- **Automatic token rotation**: out of scope for V2. Manual rotation by
  editing `.env` + restarting the container.
- **Audit-trail telemetry**: `mcp_common.audit` is preserved, but its
  evolution (log structure, sinks) is deferred.
- **Multi-account per exchange**: V2 supports 1 account per exchange.
  More accounts = a future iteration.
## Risks and mitigations
| Risk | Likelihood | Mitigation |
|---|---|---|
| Auth bug → a testnet token reaches a mainnet URL | Low, high impact | Integration tests with an HTTP stub for each exchange; the server-side leverage cap remains a second line of defense |
| `client_registry` cache does not release connections → fd leak | Medium | The FastAPI lifespan calls `aclose()` on shutdown; the healthcheck monitors the process |
| Boot failure from a missing env var in `.env` | High in dev | Pydantic Settings fails fast with a clear message; `.env.example` is versioned |
| V1→V2 migration out of sync with bots | Medium | API paths unchanged; document the V1→V2 mapping in the README |
| Concurrency: first mainnet and testnet requests to the same exchange in parallel | Medium | The per-key lock in the registry prevents races on `_build` |
| Unexpected image-size growth | Low | Multi-stage slim build, python:3.11-slim base, no unneeded dependencies |
## Success criteria
- ✅ `docker compose up -d` starts 1 container answering on `/health`
  within 5 seconds
- ✅ `curl /apidocs` renders a navigable Swagger UI
- ✅ V1 bots work with only a bearer change (identical paths)
- ✅ The same bot can alternate testnet and mainnet on consecutive
  requests by changing only the bearer
- ✅ All migrated V1 tests pass on V2
- ✅ Image build time reduced from ~12 min to ~3 min
- ✅ `services/`, `gateway/`, `secrets/`, `docker/`, `DEPLOYMENT.md` and
  the 3 docker-compose overlays removed from the repo
-86
@@ -1,86 +0,0 @@
{
admin off
email {$ACME_EMAIL:adrianodalpastro@tielogic.com}
auto_https {$AUTO_HTTPS:on}
# Plugin mholt/caddy-ratelimit
order rate_limit before basicauth
# Trusted proxies: honor X-Forwarded-For when behind a reverse proxy
# (e.g. Traefik). Default = private ranges only.
servers {
trusted_proxies static {$TRUSTED_PROXIES:private_ranges}
}
}
{$LISTEN:cerbero-mcp.tielogic.xyz} {
log {
output stdout
format json
}
# ───── Security headers ─────
header {
Strict-Transport-Security "max-age=31536000; includeSubDomains; preload"
X-Content-Type-Options "nosniff"
X-Frame-Options "DENY"
Referrer-Policy "no-referrer"
-Server
}
# ───── IP allowlist on write endpoints ─────
# WRITE_ALLOWLIST: space-separated CIDRs (e.g. "1.2.3.4/32 5.6.7.0/24").
# Default 127.0.0.1/32 — fail-closed if unconfigured.
@writes_blocked {
path_regexp ^/mcp-[a-z]+/tools/(place_|cancel_|set_|close_|transfer_|amend_|switch_)
not remote_ip {$WRITE_ALLOWLIST:127.0.0.1/32 ::1/128 172.16.0.0/12}
}
respond @writes_blocked "forbidden: source ip not in allowlist" 403
# ───── Rate limit ─────
# Reads: 60 req/min/IP, writes: 10 req/min/IP (sliding window).
rate_limit {
zone reads {
match {
not path_regexp ^/mcp-[a-z]+/tools/(place_|cancel_|set_|close_|transfer_|amend_|switch_)
}
key {remote_ip}
events 60
window 1m
}
zone writes {
match {
path_regexp ^/mcp-[a-z]+/tools/(place_|cancel_|set_|close_|transfer_|amend_|switch_)
}
key {remote_ip}
events 10
window 1m
}
}
# ───── Reverse proxy ─────
handle_path /mcp-deribit/* {
reverse_proxy mcp-deribit:9011
}
handle_path /mcp-bybit/* {
reverse_proxy mcp-bybit:9019
}
handle_path /mcp-hyperliquid/* {
reverse_proxy mcp-hyperliquid:9012
}
handle_path /mcp-alpaca/* {
reverse_proxy mcp-alpaca:9020
}
handle_path /mcp-macro/* {
reverse_proxy mcp-macro:9013
}
handle_path /mcp-sentiment/* {
reverse_proxy mcp-sentiment:9014
}
# Static landing page
handle {
root * /srv
file_server
}
}
-6
@@ -1,6 +0,0 @@
FROM caddy:2.8-builder-alpine AS builder
RUN xcaddy build \
--with github.com/mholt/caddy-ratelimit
FROM caddy:2.8-alpine
COPY --from=builder /usr/bin/caddy /usr/bin/caddy
-97
@@ -1,97 +0,0 @@
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>Cerbero — MCP gateway</title>
<link rel="stylesheet" href="/style.css">
</head>
<body>
<header>
<h1>Cerbero</h1>
<p>Autonomous crypto trading system, MCP-only architecture.</p>
</header>
<main>
<table id="services">
<thead>
<tr>
<th>Status</th>
<th>Service</th>
<th>Internal port</th>
<th>Description</th>
<th>Link</th>
</tr>
</thead>
<tbody>
<tr data-path="/mcp-memory">
<td><span class="status" aria-label="unknown"></span></td>
<td>mcp-memory</td>
<td>9015</td>
<td>Store L1/L2, system prompt base + dyn</td>
<td><a href="/mcp-memory/health">health</a> · <a href="/mcp-memory/docs">docs</a></td>
</tr>
<tr data-path="/mcp-scheduler">
<td><span class="status" aria-label="unknown"></span></td>
<td>mcp-scheduler</td>
<td>9016</td>
<td>Recurring task + core agent runner</td>
<td><a href="/mcp-scheduler/health">health</a> · <a href="/mcp-scheduler/docs">docs</a></td>
</tr>
<tr data-path="/mcp-deribit">
<td><span class="status" aria-label="unknown"></span></td>
<td>mcp-deribit</td>
<td>9011</td>
<td>Options testnet order/market</td>
<td><a href="/mcp-deribit/health">health</a> · <a href="/mcp-deribit/docs">docs</a></td>
</tr>
<tr data-path="/mcp-hyperliquid">
<td><span class="status" aria-label="unknown"></span></td>
<td>mcp-hyperliquid</td>
<td>9012</td>
<td>Perp DEX testnet</td>
<td><a href="/mcp-hyperliquid/health">health</a> · <a href="/mcp-hyperliquid/docs">docs</a></td>
</tr>
<tr data-path="/mcp-macro">
<td><span class="status" aria-label="unknown"></span></td>
<td>mcp-macro</td>
<td>9013</td>
<td>FRED indicators + Finnhub calendar</td>
<td><a href="/mcp-macro/health">health</a> · <a href="/mcp-macro/docs">docs</a></td>
</tr>
<tr data-path="/mcp-sentiment">
<td><span class="status" aria-label="unknown"></span></td>
<td>mcp-sentiment</td>
<td>9014</td>
<td>CryptoPanic news feed</td>
<td><a href="/mcp-sentiment/health">health</a> · <a href="/mcp-sentiment/docs">docs</a></td>
</tr>
<tr data-path="/mcp-telegram">
<td><span class="status" aria-label="unknown"></span></td>
<td>mcp-telegram</td>
<td>9017</td>
<td>Bot commands + operator notifications</td>
<td><a href="/mcp-telegram/health">health</a> · <a href="/mcp-telegram/docs">docs</a></td>
</tr>
<tr data-path="/mcp-portfolio">
<td><span class="status" aria-label="unknown"></span></td>
<td>mcp-portfolio</td>
<td>9018</td>
<td>Holdings + yfinance + UI htmx</td>
<td><a href="/mcp-portfolio/health">health</a> · <a href="/gui">gui</a> · <a href="/mcp-portfolio/docs">docs</a></td>
</tr>
</tbody>
</table>
<section style="margin-top: 2rem;">
<h2 style="color: var(--accent); margin-bottom: 0.5rem;">Operations console</h2>
<p><a href="/console" style="font-size: 1.1rem;">/console</a> — core agent runs, stdout/stderr events, live L1, manual trigger.</p>
</section>
</main>
<footer>
<p>Status refreshed every 5 s. Caddy gateway on the port configured via <code>GATEWAY_PORT</code>.</p>
</footer>
<script src="/status.js"></script>
</body>
</html>
-23
@@ -1,23 +0,0 @@
const rows = document.querySelectorAll("tr[data-path]");
async function poll() {
for (const row of rows) {
const dot = row.querySelector(".status");
try {
const r = await fetch(`${row.dataset.path}/health`, {
method: "GET",
cache: "no-store",
});
dot.classList.toggle("ok", r.ok);
dot.classList.toggle("err", !r.ok);
dot.setAttribute("aria-label", r.ok ? "ok" : "error");
} catch {
dot.classList.remove("ok");
dot.classList.add("err");
dot.setAttribute("aria-label", "unreachable");
}
}
}
poll();
setInterval(poll, 5000);
-101
@@ -1,101 +0,0 @@
:root {
--bg: #0f172a;
--fg: #e2e8f0;
--muted: #94a3b8;
--card: #1e293b;
--border: #334155;
--ok: #22c55e;
--err: #ef4444;
--unknown: #64748b;
--accent: #38bdf8;
}
* { box-sizing: border-box; }
body {
margin: 0;
font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, sans-serif;
background: var(--bg);
color: var(--fg);
line-height: 1.5;
}
header, main, footer {
max-width: 960px;
margin: 0 auto;
padding: 1.5rem;
}
header h1 {
margin: 0 0 0.25rem;
color: var(--accent);
font-size: 2rem;
}
header p {
margin: 0;
color: var(--muted);
}
table {
width: 100%;
border-collapse: collapse;
background: var(--card);
border-radius: 8px;
overflow: hidden;
}
th, td {
padding: 0.75rem 1rem;
text-align: left;
border-bottom: 1px solid var(--border);
}
th {
background: #0f172a;
color: var(--muted);
font-weight: 600;
font-size: 0.85rem;
text-transform: uppercase;
letter-spacing: 0.05em;
}
tr:last-child td { border-bottom: none; }
td:nth-child(3) {
font-family: ui-monospace, "SF Mono", Menlo, monospace;
color: var(--muted);
}
a {
color: var(--accent);
text-decoration: none;
margin-right: 0.5rem;
}
a:hover { text-decoration: underline; }
.status {
display: inline-block;
width: 12px;
height: 12px;
border-radius: 50%;
background: var(--unknown);
transition: background 0.3s ease;
}
.status.ok { background: var(--ok); box-shadow: 0 0 8px var(--ok); }
.status.err { background: var(--err); box-shadow: 0 0 8px var(--err); }
footer {
color: var(--muted);
font-size: 0.85rem;
margin-top: 2rem;
}
code {
background: var(--border);
padding: 0.1rem 0.3rem;
border-radius: 3px;
font-size: 0.9em;
}
+35 -32
@@ -1,17 +1,39 @@
[tool.uv.workspace]
members = [
"services/common",
"services/mcp-alpaca",
"services/mcp-bybit",
"services/mcp-deribit",
"services/mcp-hyperliquid",
"services/mcp-macro",
"services/mcp-sentiment",
[project]
name = "cerbero-mcp"
version = "2.0.0"
description = "Unified multi-exchange MCP server with token-based testnet/mainnet routing"
requires-python = ">=3.11"
authors = [{ name = "Adriano", email = "adrianodalpastro@tielogic.com" }]
dependencies = [
"fastapi>=0.115",
"uvicorn[standard]>=0.32",
"pydantic>=2.9",
"pydantic-settings>=2.6",
"httpx>=0.27",
"python-json-logger>=2.0",
"websockets>=13",
"numpy>=1.26",
"scipy>=1.13",
"statsmodels>=0.14",
"pandas>=2.2",
"pybit>=5.7",
"alpaca-py>=0.30",
"hyperliquid-python-sdk>=0.6",
]
[project.scripts]
cerbero-mcp = "cerbero_mcp.__main__:main"
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
[tool.hatch.build.targets.wheel]
packages = ["src/cerbero_mcp"]
[tool.ruff]
line-length = 100
target-version = "py313"
target-version = "py311"
[tool.ruff.lint]
select = ["E", "F", "I", "W", "UP", "B", "SIM"]
@@ -35,39 +57,20 @@ extend-immutable-calls = [
[tool.pytest.ini_options]
asyncio_mode = "auto"
testpaths = ["services"]
testpaths = ["tests"]
addopts = "--import-mode=importlib"
consider_namespace_packages = true
[tool.mypy]
python_version = "3.13"
python_version = "3.11"
strict = false
warn_return_any = true
warn_unused_ignores = true
warn_redundant_casts = true
check_untyped_defs = true
ignore_missing_imports = true
mypy_path = [
"services/common/src",
"services/mcp-alpaca/src",
"services/mcp-bybit/src",
"services/mcp-deribit/src",
"services/mcp-hyperliquid/src",
"services/mcp-macro/src",
"services/mcp-sentiment/src",
]
exclude = [
"^.*tests/.*$",
"^.venv/.*$",
]
[[tool.mypy.overrides]]
module = [
"pybit.*",
"alpaca.*",
"hyperliquid.*",
"pythonjsonlogger.*",
]
module = ["pybit.*", "alpaca.*", "hyperliquid.*", "pythonjsonlogger.*"]
ignore_missing_imports = true
[dependency-groups]
+16 -56
@@ -1,35 +1,26 @@
#!/usr/bin/env bash
# Cerbero_mcp — build & push images to the Gitea registry from a local machine.
#
# Replaces the `build-and-push` CI job in .gitea/workflows/ci.yml.
# Use it after `git push` (or without one, if you want to push a "dirty" build).
# Watchtower on the VPS pulls automatically within WATCHTOWER_POLL_INTERVAL.
# Cerbero MCP — build & push the single V2.0.0 image to the Gitea registry.
#
# Prerequisites:
# - docker + buildx
# - docker
# - Gitea PAT with `write:package` scope in env $GITEA_PAT
# - $GITEA_USER (default: adriano)
#
# Usage:
# ./scripts/build-push.sh # all images
# ./scripts/build-push.sh base gateway # only specific ones
# ./scripts/build-push.sh
# VERSION=2.0.1 ./scripts/build-push.sh
set -euo pipefail
REGISTRY="${REGISTRY:-git.tielogic.xyz}"
IMAGE_PREFIX="${IMAGE_PREFIX:-$REGISTRY/adriano/cerbero-mcp}"
GITEA_USER="${GITEA_USER:-adriano}"
VERSION="${VERSION:-2.0.0}"
SHA="$(git rev-parse --short HEAD)"
# Build order: base first (parent of the mcp-* images), then the rest.
ALL_TARGETS=(base gateway mcp-deribit mcp-bybit mcp-hyperliquid mcp-alpaca mcp-macro mcp-sentiment)
TARGETS=("${@:-${ALL_TARGETS[@]}}")
command -v docker >/dev/null || { echo "FATAL: docker non installato"; exit 1; }
docker buildx version >/dev/null || { echo "FATAL: docker buildx non disponibile"; exit 1; }
# Login solo se non già autenticato sul registry. Per primo login fai:
# echo "<PAT>" | docker login $REGISTRY -u $GITEA_USER --password-stdin
# Login solo se non già autenticato sul registry.
if grep -q "\"$REGISTRY\"" ~/.docker/config.json 2>/dev/null; then
echo "=== docker già loggato su $REGISTRY (skip login) ==="
elif [ -n "${GITEA_PAT:-}" ]; then
@@ -41,50 +32,19 @@ else
exit 1
fi
-build_one() {
-  local name="$1"
-  local context file
-  case "$name" in
-    base)
-      context="."; file="docker/base.Dockerfile" ;;
-    gateway)
-      context="./gateway"; file="gateway/Dockerfile" ;;
-    mcp-*)
-      context="."; file="docker/${name}.Dockerfile" ;;
-    *)
-      echo "FATAL: target sconosciuto '$name'"; exit 1 ;;
-  esac
+TAG_VERSION="$IMAGE_PREFIX:$VERSION"
+TAG_LATEST="$IMAGE_PREFIX:latest"
+TAG_SHA="$IMAGE_PREFIX:sha-$SHA"
-  if [ ! -f "$file" ]; then
-    echo "FATAL: Dockerfile non trovato: $file"; exit 1
-  fi
+echo "=== build cerbero-mcp:$VERSION ==="
+docker build -t "$TAG_VERSION" -t "$TAG_LATEST" -t "$TAG_SHA" .
-  local tag_latest="$IMAGE_PREFIX/$name:latest"
-  local tag_sha="$IMAGE_PREFIX/$name:sha-$SHA"
-  echo "=== [$name] build & push ==="
-  local args=(buildx build --push
-    -f "$file"
-    -t "$tag_latest"
-    -t "$tag_sha"
-  )
-  if [[ "$name" == mcp-* ]]; then
-    args+=(--build-arg "BASE_IMAGE=$IMAGE_PREFIX/base"
-           --build-arg "BASE_TAG=latest")
-  fi
-  args+=("$context")
-  docker "${args[@]}"
-  echo " pushed: $tag_latest"
-  echo " pushed: $tag_sha"
-}
-for t in "${TARGETS[@]}"; do
-  build_one "$t"
+echo "=== push ==="
+for tag in "$TAG_VERSION" "$TAG_LATEST" "$TAG_SHA"; do
+  docker push "$tag"
+  echo " pushed: $tag"
done
echo
-echo "=== Tutto pushato (commit $SHA) ==="
+echo "=== Done (commit $SHA, version $VERSION) ==="
echo "VPS Watchtower farà pull entro WATCHTOWER_POLL_INTERVAL (default 5min)."
echo "Per forzare subito:"
echo " ssh <vps> 'cd /docker/cerbero_mcp && docker compose -f docker-compose.prod.yml pull && docker compose -f docker-compose.prod.yml up -d'"
-202
@@ -1,202 +0,0 @@
#!/usr/bin/env bash
# Cerbero_mcp — deploy script for the production VPS.
#
# The repo is NOT cloned on the VPS: the script downloads only the files
# strictly needed at runtime (compose, Caddyfile, public assets)
# via raw HTTP from Gitea. Images are pulled pre-built from the Gitea
# registry (built on the dev laptop with scripts/build-push.sh).
#
# Prerequisites on the VPS (NOT managed by this script):
# 1. Docker Engine ≥ 24 + docker compose plugin installed.
# 2. DNS A record `cerbero-mcp.tielogic.xyz` → VPS IP (warn-only).
# 3. Ports 80 and 443 open on the firewall (for ACME + HTTPS traffic).
# 4. Gitea PAT with `read:package` scope, saved in env `$GITEA_PAT`.
# 5. Gitea username in env `$GITEA_USER` (default: adriano).
# 6. Exchange JSON secrets + bearer tokens available in $SECRETS_SRC
# (default: $DEPLOY_DIR/secrets/), which the script copies into
# $DEPLOY_DIR/secrets/ with 600 permissions (skipped if SECRETS_SRC == DEPLOY_DIR/secrets).
#
# Idempotent: re-runnable for updates (re-downloads the config files
# from the current branch, does NOT touch an existing .env).
set -euo pipefail
DEPLOY_DIR="${DEPLOY_DIR:-/docker/cerbero_mcp}"
SECRETS_SRC="${SECRETS_SRC:-$DEPLOY_DIR/secrets}"
GITEA_USER="${GITEA_USER:-adriano}"
GITEA_RAW_BASE="${GITEA_RAW_BASE:-https://git.tielogic.xyz/Adriano/Cerbero-mcp/raw/branch/main}"
REGISTRY="${REGISTRY:-git.tielogic.xyz}"
DOMAIN="${DOMAIN:-cerbero-mcp.tielogic.xyz}"
AUDIT_LOG_DIR="${AUDIT_LOG_DIR:-/var/log/cerbero-mcp}"
echo "=== Cerbero_mcp deploy (no-clone) → $DEPLOY_DIR (domain $DOMAIN) ==="
# ──────────────────────────────────────────────────────────────
# 1. Check prerequisites
# ──────────────────────────────────────────────────────────────
command -v docker >/dev/null || { echo "FATAL: docker non installato"; exit 1; }
command -v curl >/dev/null || { echo "FATAL: curl non installato"; exit 1; }
docker compose version >/dev/null || { echo "FATAL: docker compose plugin assente"; exit 1; }
if [ -z "${GITEA_PAT:-}" ]; then
echo "FATAL: env GITEA_PAT non settata. Export del PAT con scope read:package prima."
exit 1
fi
# Check DNS resolution (warning only, non-blocking)
ip_resolved=$(getent hosts "$DOMAIN" | awk '{print $1}' | head -1 || true)
if [ -z "$ip_resolved" ]; then
echo "WARN: $DOMAIN non risolve via DNS — TLS Let's Encrypt fallirà finché DNS non propaga."
else
echo "DNS $DOMAIN → $ip_resolved"
fi
# ──────────────────────────────────────────────────────────────
# 2. Log in to the container registry
# ──────────────────────────────────────────────────────────────
echo "=== docker login $REGISTRY ==="
echo "$GITEA_PAT" | docker login "$REGISTRY" -u "$GITEA_USER" --password-stdin
# ──────────────────────────────────────────────────────────────
# 3. Set up dirs + download config files from the repo (no clone)
# ──────────────────────────────────────────────────────────────
sudo mkdir -p "$DEPLOY_DIR"
sudo chown "$USER:$USER" "$DEPLOY_DIR"
mkdir -p "$DEPLOY_DIR/secrets" "$DEPLOY_DIR/gateway/public"
# Config files needed at runtime. Downloaded as raw files from Gitea.
# Idempotent: always re-fetches the version on main.
download() {
local rel="$1"
local dst="$DEPLOY_DIR/$rel"
echo " fetch: $rel"
curl -fsSL -o "$dst" "$GITEA_RAW_BASE/$rel" \
|| { echo "FATAL: download $rel fallito"; exit 1; }
}
echo "=== Download config da $GITEA_RAW_BASE ==="
download docker-compose.prod.yml
download docker-compose.traefik.yml
download gateway/Caddyfile
download gateway/public/index.html
download gateway/public/status.js
download gateway/public/style.css
cd "$DEPLOY_DIR"
# ──────────────────────────────────────────────────────────────
# 4. Copy secrets with 600 permissions
# ──────────────────────────────────────────────────────────────
if [ "$(realpath "$SECRETS_SRC")" != "$(realpath "$DEPLOY_DIR/secrets")" ]; then
if [ ! -d "$SECRETS_SRC" ]; then
echo "FATAL: secrets src dir $SECRETS_SRC non esiste."
echo " Atteso contenere: deribit.json bybit.json hyperliquid.json alpaca.json"
echo " macro.json sentiment.json core.token observer.token"
exit 1
fi
echo "=== Copia secrets da $SECRETS_SRC ==="
for f in deribit.json bybit.json hyperliquid.json alpaca.json macro.json sentiment.json core.token observer.token; do
if [ -f "$SECRETS_SRC/$f" ]; then
cp "$SECRETS_SRC/$f" "secrets/$f"
chmod 600 "secrets/$f"
echo " ok: secrets/$f"
else
echo " WARN: $SECRETS_SRC/$f assente — il servizio relativo fallirà al boot."
fi
done
else
echo "=== Secrets già in $DEPLOY_DIR/secrets — solo chmod 600 ==="
for f in deribit.json bybit.json hyperliquid.json alpaca.json macro.json sentiment.json core.token observer.token; do
[ -f "secrets/$f" ] && chmod 600 "secrets/$f" && echo " ok: secrets/$f" \
|| echo " WARN: secrets/$f assente — il servizio relativo fallirà al boot."
done
fi
# ──────────────────────────────────────────────────────────────
# 5. Create/update .env (preserve existing)
# ──────────────────────────────────────────────────────────────
if [ ! -f .env ]; then
echo "=== Creazione .env iniziale (testnet di default) ==="
cat > .env <<EOF
# Cerbero_mcp deploy config — edit to switch to mainnet
ACME_EMAIL=adrianodalpastro@tielogic.com
GATEWAY_HTTP_PORT=80
GATEWAY_HTTPS_PORT=443
WRITE_ALLOWLIST="127.0.0.1/32 ::1/128 172.16.0.0/12"
IMAGE_TAG=latest
IMAGE_PREFIX=git.tielogic.xyz/adriano/cerbero-mcp
# Exchange environments (true=testnet, false=mainnet).
# IMPORTANT: for mainnet also add "environment":"mainnet" to the matching
# secret JSON, otherwise boot aborts for safety (see consistency_check).
DERIBIT_TESTNET=true
BYBIT_TESTNET=true
HYPERLIQUID_TESTNET=true
ALPACA_PAPER=true
# Set to false to allow mainnet without an explicit creds["environment"]="mainnet" (not recommended).
STRICT_MAINNET=true
# Persistent audit log for write endpoints (place_order, cancel, etc.).
AUDIT_LOG_DIR=$AUDIT_LOG_DIR
# Watchtower auto-update polling interval (seconds).
WATCHTOWER_POLL_INTERVAL=300
EOF
echo " $DEPLOY_DIR/.env creato. Rivedi prima del primo up."
else
echo "=== .env preesistente — non sovrascritto ==="
fi
# ──────────────────────────────────────────────────────────────
# 6. Host audit log dir (bind-mounted volume)
# ──────────────────────────────────────────────────────────────
sudo mkdir -p "$AUDIT_LOG_DIR"
sudo chown 1000:1000 "$AUDIT_LOG_DIR"
echo "Audit log dir: $AUDIT_LOG_DIR (chown 1000:1000)"
# ──────────────────────────────────────────────────────────────
# 7. Pull images + up
# ──────────────────────────────────────────────────────────────
COMPOSE_FILES=("-f" "docker-compose.prod.yml")
if [ "${BEHIND_TRAEFIK:-false}" = "true" ]; then
echo "=== Modalità behind-traefik attiva (network ${TRAEFIK_NETWORK:-gitea_traefik-public}) ==="
COMPOSE_FILES+=("-f" "docker-compose.traefik.yml")
fi
# Machine-specific local override (e.g. DOCKER_API_VERSION fix for watchtower).
# Not versioned (in .gitignore), created by hand on the VPS if needed.
if [ -f "docker-compose.local.yml" ]; then
echo "=== Override locale rilevato: docker-compose.local.yml ==="
COMPOSE_FILES+=("-f" "docker-compose.local.yml")
fi
echo "=== docker compose pull + up ==="
docker compose "${COMPOSE_FILES[@]}" --env-file .env pull
docker compose "${COMPOSE_FILES[@]}" --env-file .env up -d
# ──────────────────────────────────────────────────────────────
# 8. Check status
# ──────────────────────────────────────────────────────────────
sleep 5
echo "=== Stato container ==="
docker compose "${COMPOSE_FILES[@]}" --env-file .env ps
echo
echo "=== Smoke test (health check via gateway pubblico) ==="
sleep 10
if curl -sf -o /dev/null -m 10 "https://$DOMAIN/mcp-macro/health"; then
echo " OK: https://$DOMAIN/mcp-macro/health → 200"
else
echo " WARN: https://$DOMAIN/mcp-macro/health non risponde (DNS o cert non ancora pronti?)"
echo " Riprova fra 30s o controlla: docker compose -f docker-compose.prod.yml logs gateway"
fi
echo
echo "=== Deploy completato ==="
echo "Comandi utili (compose files: ${COMPOSE_FILES[*]}):"
echo " Logs: docker compose ${COMPOSE_FILES[*]} --env-file .env logs -f <service>"
echo " Audit: tail -f $AUDIT_LOG_DIR/*.audit.jsonl"
echo " Restart: docker compose ${COMPOSE_FILES[*]} --env-file .env restart <service>"
echo " Stop: docker compose ${COMPOSE_FILES[*]} --env-file .env down"
echo " Update: ri-esegui questo script (riscarica config + pull image)"
-23
@@ -1,23 +0,0 @@
[project]
name = "mcp-common"
version = "0.1.0"
requires-python = ">=3.11"
dependencies = [
"fastapi>=0.115",
"uvicorn[standard]>=0.30",
"mcp>=1.0",
"httpx>=0.27",
"pydantic>=2.6",
"pydantic-settings>=2.3",
"python-json-logger>=2.0",
]
[project.optional-dependencies]
dev = ["pytest>=8", "pytest-asyncio>=0.23", "pytest-httpx>=0.30", "ruff>=0.5"]
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
[tool.hatch.build.targets.wheel]
packages = ["src/mcp_common"]
@@ -1 +0,0 @@
__all__ = []
@@ -1,95 +0,0 @@
"""Shared app factory for the mcp-{exchange} services.
Centralizes the boilerplate of each `__main__.py`:
- configure_root_logging (JSON)
- fail_fast_if_missing on mandatory env vars
- summarize env
- load creds JSON
- resolve_environment with default URLs
- load token store
- delegate client + app creation to per-service callbacks
- uvicorn.run
Each service invokes `run_exchange_main(spec)` with a declarative spec.
"""
from __future__ import annotations
import json
import os
from collections.abc import Callable
from dataclasses import dataclass
from typing import Any
import uvicorn
from mcp_common.auth import load_token_store_from_files
from mcp_common.env_validation import fail_fast_if_missing, require_env, summarize
from mcp_common.environment import (
EnvironmentInfo,
consistency_check,
resolve_environment,
)
from mcp_common.logging import configure_root_logging
@dataclass(frozen=True)
class ExchangeAppSpec:
exchange: str
creds_env_var: str
env_var: str # e.g. "BYBIT_TESTNET", "ALPACA_PAPER"
flag_key: str # field in the secret JSON ("testnet" or "paper")
default_base_url_live: str
default_base_url_testnet: str
default_port: int
build_client: Callable[[dict, EnvironmentInfo], Any]
build_app: Callable[..., Any]
extra_summarize_envs: tuple[str, ...] = ()
def run_exchange_main(spec: ExchangeAppSpec) -> None:
configure_root_logging()
fail_fast_if_missing([spec.creds_env_var])
summarize([
spec.creds_env_var,
"CORE_TOKEN_FILE",
"OBSERVER_TOKEN_FILE",
"PORT",
"HOST",
spec.env_var,
*spec.extra_summarize_envs,
])
creds_file = require_env(spec.creds_env_var, f"{spec.exchange} credentials JSON path")
with open(creds_file) as f:
creds = json.load(f)
env_info = resolve_environment(
creds,
env_var=spec.env_var,
flag_key=spec.flag_key,
exchange=spec.exchange,
default_base_url_live=spec.default_base_url_live,
default_base_url_testnet=spec.default_base_url_testnet,
)
# Safety: prevents accidental switches to mainnet without explicit confirmation
# in the secret. Raises EnvironmentMismatchError → boot abort on mismatch.
strict_mainnet = os.environ.get("STRICT_MAINNET", "true").lower() not in ("0", "false", "no")
consistency_check(env_info, creds, strict_mainnet=strict_mainnet)
client = spec.build_client(creds, env_info)
token_store = load_token_store_from_files(
core_token_file=os.environ.get("CORE_TOKEN_FILE"),
observer_token_file=os.environ.get("OBSERVER_TOKEN_FILE"),
)
app = spec.build_app(client=client, token_store=token_store, creds=creds, env_info=env_info)
uvicorn.run(
app,
log_config=None,
host=os.environ.get("HOST", "0.0.0.0"),
port=int(os.environ.get("PORT", str(spec.default_port))),
)
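For reference, the boot order implemented above (load creds → resolve environment → strict-mainnet check → build client/app → uvicorn) rests on two small conventions worth pinning down: default base URLs are seeded with `setdefault` (so the secret always wins) and `PORT` overrides the per-spec default. A standalone sketch of just those two rules, assuming nothing from `mcp_common` (`seed_urls` and `pick_port` are illustrative names, not real API):

```python
import os

def seed_urls(creds: dict, live: str, testnet: str) -> dict:
    # Seed canonical URLs only where the secret does not already provide
    # them (mirrors the creds.setdefault calls in resolve_environment).
    creds.setdefault("base_url_live", live)
    creds.setdefault("base_url_testnet", testnet)
    return creds

def pick_port(default_port: int) -> int:
    # PORT env var wins over the per-service default (mirrors the
    # uvicorn.run port resolution in run_exchange_main).
    return int(os.environ.get("PORT", str(default_port)))
```

The deleted tests below (`test_run_exchange_main_loads_creds_and_resolves_env`, `test_run_exchange_main_uses_default_port`) exercise exactly these two behaviors.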
-98
View File
@@ -1,98 +0,0 @@
from __future__ import annotations
from collections.abc import Callable
from dataclasses import dataclass, field
from functools import wraps
from fastapi import HTTPException, Request, status
@dataclass
class Principal:
name: str
capabilities: set[str] = field(default_factory=set)
@dataclass
class TokenStore:
tokens: dict[str, Principal]
def get(self, token: str) -> Principal | None:
return self.tokens.get(token)
def require_principal(request: Request) -> Principal:
auth = request.headers.get("Authorization", "")
if not auth.startswith("Bearer "):
raise HTTPException(status.HTTP_401_UNAUTHORIZED, "missing bearer token")
token = auth[len("Bearer "):].strip()
store: TokenStore = request.app.state.token_store
principal = store.get(token)
if principal is None:
raise HTTPException(status.HTTP_403_FORBIDDEN, "invalid token")
return principal
def acl_requires(*, core: bool = False, observer: bool = False) -> Callable:
"""Decorator: require at least one matching capability."""
allowed: set[str] = set()
if core:
allowed.add("core")
if observer:
allowed.add("observer")
def decorator(func: Callable) -> Callable:
@wraps(func)
async def async_wrapper(*args, **kwargs):
principal = kwargs.get("principal")
if principal is None:
for a in args:
if isinstance(a, Principal):
principal = a
break
if principal is None or not (principal.capabilities & allowed):
raise HTTPException(
status.HTTP_403_FORBIDDEN,
f"capability required: {allowed}",
)
return await func(*args, **kwargs) if _is_coro(func) else func(*args, **kwargs)
@wraps(func)
def sync_wrapper(*args, **kwargs):
principal = kwargs.get("principal")
if principal is None:
for a in args:
if isinstance(a, Principal):
principal = a
break
if principal is None or not (principal.capabilities & allowed):
raise HTTPException(
status.HTTP_403_FORBIDDEN,
f"capability required: {allowed}",
)
return func(*args, **kwargs)
return async_wrapper if _is_coro(func) else sync_wrapper
return decorator
def _is_coro(func: Callable) -> bool:
import asyncio
return asyncio.iscoroutinefunction(func)
def load_token_store_from_files(
core_token_file: str | None,
observer_token_file: str | None,
) -> TokenStore:
tokens: dict[str, Principal] = {}
if core_token_file:
with open(core_token_file) as f:
tokens[f.read().strip()] = Principal(name="core", capabilities={"core"})
if observer_token_file:
with open(observer_token_file) as f:
tokens[f.read().strip()] = Principal(
name="observer", capabilities={"observer"}
)
return TokenStore(tokens=tokens)
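The ACL semantics of this module (401 for a missing or malformed Authorization header, 403 for an unknown token, set intersection for capability checks) condense into one pure function. `check_request` below is an illustrative stand-in with a plain dict store, not part of mcp_common:

```python
def check_request(headers: dict[str, str], tokens: dict[str, set[str]],
                  required: set[str]) -> tuple[int, str]:
    # 401: no usable bearer token at all.
    auth = headers.get("Authorization", "")
    if not auth.startswith("Bearer "):
        return 401, "missing bearer token"
    # 403: token not in the store.
    capabilities = tokens.get(auth[len("Bearer "):].strip())
    if capabilities is None:
        return 403, "invalid token"
    # Same set-intersection rule as acl_requires: any overlap passes.
    if not (capabilities & required):
        return 403, f"capability required: {required}"
    return 200, "ok"
```

This is the behavior the auth test file further down asserts through FastAPI's TestClient.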
@@ -1,69 +0,0 @@
"""Env validation policy: fail-fast for mandatory vars, soft for optional ones.
Usage at boot of each mcp `__main__.py`:
from mcp_common.env_validation import require_env, optional_env, summarize
creds_file = require_env("CREDENTIALS_FILE", "deribit credentials JSON path")
host = optional_env("HOST", default="0.0.0.0")
summarize(["CREDENTIALS_FILE", "HOST", "PORT"])
"""
from __future__ import annotations
import logging
import os
import sys
logger = logging.getLogger(__name__)
class MissingEnvError(RuntimeError):
"""Mandatory env var absent or empty."""
def require_env(name: str, description: str = "") -> str:
val = (os.environ.get(name) or "").strip()
if not val:
msg = f"missing mandatory env var: {name}"
if description:
msg += f" ({description})"
logger.error(msg)
raise MissingEnvError(msg)
return val
def optional_env(name: str, *, default: str = "") -> str:
val = (os.environ.get(name) or "").strip()
if not val:
if default:
logger.info("env %s not set, using default=%r", name, default)
return default
return val
def summarize(names: list[str]) -> None:
sensitive_tokens = ("SECRET", "KEY", "TOKEN", "PASSWORD", "CREDENTIAL", "WALLET")
for n in names:
val = os.environ.get(n)
if val is None:
logger.info("env[%s]: <unset>", n)
continue
if any(t in n.upper() for t in sensitive_tokens):
logger.info("env[%s]: <set, %d chars>", n, len(val))
else:
logger.info("env[%s]: %s", n, val)
def fail_fast_if_missing(names: list[str]) -> None:
missing: list[str] = []
for n in names:
if not (os.environ.get(n) or "").strip():
missing.append(n)
if missing:
logger.error("boot aborted: missing mandatory env vars: %s", missing)
print(
f"FATAL: missing mandatory env vars: {missing}",
file=sys.stderr,
)
sys.exit(2)
@@ -1,138 +0,0 @@
"""Environment resolver (testnet/mainnet) for the MCP exchanges.
Precedence: env var > secret field > default True (testnet).
Safety: `consistency_check` prevents accidental switches to mainnet without
explicit confirmation in the secret JSON.
"""
from __future__ import annotations
import logging
import os
from dataclasses import dataclass
from typing import Literal
logger = logging.getLogger(__name__)
Environment = Literal["testnet", "mainnet"]
Source = Literal["env", "credentials", "default"]
TRUTHY = {"1", "true", "yes", "on"}
# Tokens in the base_url that indicate a testnet endpoint (case-insensitive).
TESTNET_URL_HINTS = ("test", "testnet", "paper")
class EnvironmentMismatchError(RuntimeError):
"""Boot abort: resolved environment does not match the explicit confirmation in the secret."""
@dataclass(frozen=True)
class EnvironmentInfo:
exchange: str
environment: Environment
source: Source
env_value: str | None
base_url: str
def resolve_environment(
creds: dict,
*,
env_var: str,
flag_key: str,
exchange: str,
default_base_url_live: str | None = None,
default_base_url_testnet: str | None = None,
) -> EnvironmentInfo:
"""Resolve the environment for an MCP exchange.
creds: dict read from the secret JSON. May contain base_url_live/base_url_testnet
overrides; when absent the default kwargs are used.
env_var: name of the override env var (e.g. DERIBIT_TESTNET).
flag_key: boolean key in the secret JSON (e.g. "testnet", or "paper" for alpaca).
exchange: exchange name for logging/info.
default_base_url_live / default_base_url_testnet: canonical exchange URLs,
applied when not present in creds.
"""
env_value = os.environ.get(env_var)
if env_value is not None:
is_test = env_value.strip().lower() in TRUTHY
environment: Environment = "testnet" if is_test else "mainnet"
source: Source = "env"
elif flag_key in creds:
environment = "testnet" if bool(creds[flag_key]) else "mainnet"
source = "credentials"
else:
environment = "testnet"
source = "default"
if default_base_url_live is not None:
creds.setdefault("base_url_live", default_base_url_live)
if default_base_url_testnet is not None:
creds.setdefault("base_url_testnet", default_base_url_testnet)
base_url = creds["base_url_testnet"] if environment == "testnet" else creds["base_url_live"]
return EnvironmentInfo(
exchange=exchange,
environment=environment,
source=source,
env_value=env_value,
base_url=base_url,
)
def consistency_check(
env_info: EnvironmentInfo,
creds: dict,
*,
strict_mainnet: bool = True,
) -> list[str]:
"""Check consistency of the resolved environment vs the secret JSON.
Returns a list of warning strings. Raises EnvironmentMismatchError on
blocking mismatches.
Rules:
- If `creds["environment"]` is present and DIFFERENT from
  `env_info.environment`: → raise EnvironmentMismatchError (declared vs
  resolved mismatch).
- If `env_info.environment == "mainnet"` and `creds.get("environment") !=
  "mainnet"`: with `strict_mainnet=True` → raise (explicit confirmation
  required). With `strict_mainnet=False` → warning.
- If `env_info.base_url` contains a testnet token ("test", "testnet",
  "paper") but `env_info.environment == "mainnet"` (or vice versa): warning
  (URL/environment inconsistent).
"""
warnings: list[str] = []
declared = creds.get("environment")
if declared and declared != env_info.environment:
raise EnvironmentMismatchError(
f"{env_info.exchange}: secret declared environment={declared!r} "
f"but resolver resolved environment={env_info.environment!r}"
)
if env_info.environment == "mainnet" and declared != "mainnet":
msg = (
f"{env_info.exchange}: resolved mainnet without explicit confirmation "
"in secret. Add `\"environment\": \"mainnet\"` to the credentials JSON."
)
if strict_mainnet:
raise EnvironmentMismatchError(msg)
warnings.append(msg)
url_lower = (env_info.base_url or "").lower()
has_test_hint = any(token in url_lower for token in TESTNET_URL_HINTS)
if env_info.environment == "mainnet" and has_test_hint:
warnings.append(
f"{env_info.exchange}: environment=mainnet but base_url contains "
f"testnet hint ({env_info.base_url!r})"
)
if env_info.environment == "testnet" and not has_test_hint and url_lower:
warnings.append(
f"{env_info.exchange}: environment=testnet but base_url does not "
f"appear to be a testnet endpoint ({env_info.base_url!r})"
)
for w in warnings:
logger.warning("environment consistency: %s", w)
return warnings
-137
@@ -1,137 +0,0 @@
from __future__ import annotations
import json
from unittest.mock import MagicMock, patch
import pytest
from mcp_common.app_factory import ExchangeAppSpec, run_exchange_main
from mcp_common.environment import EnvironmentInfo
def _make_spec(build_client=None, build_app=None) -> ExchangeAppSpec:
return ExchangeAppSpec(
exchange="testex",
creds_env_var="TESTEX_CREDENTIALS_FILE",
env_var="TESTEX_TESTNET",
flag_key="testnet",
default_base_url_live="https://api.testex.com",
default_base_url_testnet="https://test.testex.com",
default_port=9999,
build_client=build_client or (lambda creds, env_info: MagicMock(name="client")),
build_app=build_app or (lambda **kwargs: MagicMock(name="app")),
)
def test_run_exchange_main_loads_creds_and_resolves_env(tmp_path, monkeypatch):
creds_file = tmp_path / "creds.json"
creds_file.write_text(json.dumps({"api_key": "k", "api_secret": "s"}))
monkeypatch.setenv("TESTEX_CREDENTIALS_FILE", str(creds_file))
monkeypatch.setenv("PORT", "10000")
monkeypatch.delenv("TESTEX_TESTNET", raising=False)
captured: dict = {}
def build_client(creds, env_info):
captured["creds"] = creds
captured["env_info"] = env_info
return MagicMock()
def build_app(**kwargs):
captured["app_kwargs"] = kwargs
return MagicMock()
spec = _make_spec(build_client=build_client, build_app=build_app)
with patch("mcp_common.app_factory.uvicorn.run") as mock_run:
run_exchange_main(spec)
assert captured["creds"]["api_key"] == "k"
assert captured["creds"]["base_url_live"] == "https://api.testex.com"
assert captured["creds"]["base_url_testnet"] == "https://test.testex.com"
assert isinstance(captured["env_info"], EnvironmentInfo)
assert captured["env_info"].environment == "testnet"
assert captured["env_info"].exchange == "testex"
assert "client" in captured["app_kwargs"]
assert "token_store" in captured["app_kwargs"]
assert "creds" in captured["app_kwargs"]
assert "env_info" in captured["app_kwargs"]
call_kwargs = mock_run.call_args.kwargs
assert call_kwargs["port"] == 10000 # PORT override
def test_run_exchange_main_uses_default_port(tmp_path, monkeypatch):
creds_file = tmp_path / "creds.json"
creds_file.write_text(json.dumps({}))
monkeypatch.setenv("TESTEX_CREDENTIALS_FILE", str(creds_file))
monkeypatch.delenv("PORT", raising=False)
spec = _make_spec()
with patch("mcp_common.app_factory.uvicorn.run") as mock_run:
run_exchange_main(spec)
assert mock_run.call_args.kwargs["port"] == 9999
def test_run_exchange_main_env_var_overrides_creds(tmp_path, monkeypatch):
creds_file = tmp_path / "creds.json"
# `environment: mainnet` is explicit because the env var override → mainnet
# and consistency_check requires confirmation to avoid accidental switches.
creds_file.write_text(json.dumps({"testnet": True, "environment": "mainnet"}))
monkeypatch.setenv("TESTEX_CREDENTIALS_FILE", str(creds_file))
monkeypatch.setenv("TESTEX_TESTNET", "false")
captured: dict = {}
def build_client(creds, env_info):
captured["env_info"] = env_info
return MagicMock()
spec = _make_spec(build_client=build_client)
with patch("mcp_common.app_factory.uvicorn.run"):
run_exchange_main(spec)
# env var "false" overrides creds.testnet=True → mainnet
assert captured["env_info"].environment == "mainnet"
assert captured["env_info"].source == "env"
def test_run_exchange_main_aborts_on_mainnet_without_confirmation(tmp_path, monkeypatch):
"""Mainnet without creds['environment']='mainnet' → fail-fast boot abort."""
from mcp_common.environment import EnvironmentMismatchError
creds_file = tmp_path / "creds.json"
creds_file.write_text(json.dumps({"testnet": False}))
monkeypatch.setenv("TESTEX_CREDENTIALS_FILE", str(creds_file))
monkeypatch.delenv("TESTEX_TESTNET", raising=False)
monkeypatch.delenv("STRICT_MAINNET", raising=False)
spec = _make_spec()
with (
pytest.raises(EnvironmentMismatchError),
patch("mcp_common.app_factory.uvicorn.run"),
):
run_exchange_main(spec)
def test_run_exchange_main_strict_mainnet_disabled_via_env(tmp_path, monkeypatch):
"""STRICT_MAINNET=false allows mainnet without confirmation (warning only)."""
creds_file = tmp_path / "creds.json"
creds_file.write_text(json.dumps({"testnet": False}))
monkeypatch.setenv("TESTEX_CREDENTIALS_FILE", str(creds_file))
monkeypatch.setenv("STRICT_MAINNET", "false")
spec = _make_spec()
with patch("mcp_common.app_factory.uvicorn.run"):
run_exchange_main(spec) # does not raise
def test_run_exchange_main_missing_creds_file_exits(monkeypatch):
monkeypatch.delenv("TESTEX_CREDENTIALS_FILE", raising=False)
spec = _make_spec()
with pytest.raises(SystemExit) as exc_info:
run_exchange_main(spec)
assert exc_info.value.code == 2
-84
@@ -1,84 +0,0 @@
import pytest
from fastapi import Depends, FastAPI
from fastapi.testclient import TestClient
from mcp_common.auth import (
Principal,
TokenStore,
acl_requires,
require_principal,
)
@pytest.fixture
def token_store():
return TokenStore(tokens={
"token-core-123": Principal(name="core", capabilities={"core"}),
"token-obs-456": Principal(name="observer", capabilities={"observer"}),
})
@pytest.fixture
def app(token_store):
app = FastAPI()
app.state.token_store = token_store
@app.get("/public")
def public():
return {"ok": True}
@app.get("/private")
def private(principal: Principal = Depends(require_principal)):
return {"name": principal.name}
@app.post("/core-only")
@acl_requires(core=True, observer=False)
def core_only(principal: Principal = Depends(require_principal)):
return {"who": principal.name}
@app.post("/observer-only")
@acl_requires(core=False, observer=True)
def observer_only(principal: Principal = Depends(require_principal)):
return {"who": principal.name}
return app
def test_public_endpoint_no_auth(app):
client = TestClient(app)
assert client.get("/public").status_code == 200
def test_private_without_header_401(app):
client = TestClient(app)
assert client.get("/private").status_code == 401
def test_private_bad_token_403(app):
client = TestClient(app)
r = client.get("/private", headers={"Authorization": "Bearer nope"})
assert r.status_code == 403
def test_private_good_token_200(app):
client = TestClient(app)
r = client.get("/private", headers={"Authorization": "Bearer token-core-123"})
assert r.status_code == 200
assert r.json() == {"name": "core"}
def test_acl_core_token_on_core_only_endpoint(app):
client = TestClient(app)
r = client.post("/core-only", headers={"Authorization": "Bearer token-core-123"})
assert r.status_code == 200
def test_acl_observer_on_core_only_rejected(app):
client = TestClient(app)
r = client.post("/core-only", headers={"Authorization": "Bearer token-obs-456"})
assert r.status_code == 403
def test_acl_observer_on_observer_only_ok(app):
client = TestClient(app)
r = client.post("/observer-only", headers={"Authorization": "Bearer token-obs-456"})
assert r.status_code == 200
-189
@@ -1,189 +0,0 @@
from __future__ import annotations
import pytest
from mcp_common.environment import (
EnvironmentInfo,
EnvironmentMismatchError,
consistency_check,
resolve_environment,
)
def test_env_var_overrides_secret(monkeypatch):
monkeypatch.setenv("DERIBIT_TESTNET", "false")
creds = {"testnet": True, "base_url_live": "L", "base_url_testnet": "T"}
info = resolve_environment(
creds,
env_var="DERIBIT_TESTNET",
flag_key="testnet",
exchange="deribit",
)
assert info.environment == "mainnet"
assert info.source == "env"
assert info.env_value == "false"
assert info.base_url == "L"
def test_secret_used_when_env_missing(monkeypatch):
monkeypatch.delenv("DERIBIT_TESTNET", raising=False)
creds = {"testnet": True, "base_url_live": "L", "base_url_testnet": "T"}
info = resolve_environment(
creds,
env_var="DERIBIT_TESTNET",
flag_key="testnet",
exchange="deribit",
)
assert info.environment == "testnet"
assert info.source == "credentials"
assert info.env_value is None
assert info.base_url == "T"
def test_default_when_both_missing(monkeypatch):
monkeypatch.delenv("FOO_TESTNET", raising=False)
creds = {"base_url_live": "L", "base_url_testnet": "T"}
info = resolve_environment(
creds,
env_var="FOO_TESTNET",
flag_key="testnet",
exchange="foo",
)
assert info.environment == "testnet"
assert info.source == "default"
assert info.env_value is None
@pytest.mark.parametrize("raw,expected", [
("1", "testnet"),
("true", "testnet"),
("yes", "testnet"),
("on", "testnet"),
("TRUE", "testnet"),
("0", "mainnet"),
("false", "mainnet"),
("no", "mainnet"),
("off", "mainnet"),
("garbage", "mainnet"),
])
def test_env_value_truthy_parsing(monkeypatch, raw, expected):
monkeypatch.setenv("X_TESTNET", raw)
info = resolve_environment(
{"base_url_live": "L", "base_url_testnet": "T"},
env_var="X_TESTNET",
flag_key="testnet",
exchange="x",
)
assert info.environment == expected
def test_default_base_urls_applied_when_creds_missing(monkeypatch):
monkeypatch.delenv("X_TESTNET", raising=False)
creds: dict = {}
info = resolve_environment(
creds,
env_var="X_TESTNET",
flag_key="testnet",
exchange="x",
default_base_url_live="https://live.example",
default_base_url_testnet="https://test.example",
)
assert info.base_url == "https://test.example"
assert creds["base_url_live"] == "https://live.example"
assert creds["base_url_testnet"] == "https://test.example"
def test_creds_base_urls_override_defaults(monkeypatch):
monkeypatch.delenv("X_TESTNET", raising=False)
creds = {"base_url_live": "L", "base_url_testnet": "T"}
info = resolve_environment(
creds,
env_var="X_TESTNET",
flag_key="testnet",
exchange="x",
default_base_url_live="https://live.example",
default_base_url_testnet="https://test.example",
)
assert info.base_url == "T"
assert creds["base_url_live"] == "L"
def test_alpaca_paper_flag_key(monkeypatch):
"""Alpaca uses 'paper' instead of 'testnet' in the secret."""
monkeypatch.delenv("ALPACA_PAPER", raising=False)
creds = {"paper": False, "base_url_live": "L", "base_url_testnet": "T"}
info = resolve_environment(
creds,
env_var="ALPACA_PAPER",
flag_key="paper",
exchange="alpaca",
)
assert info.environment == "mainnet"
assert info.source == "credentials"
# ───────── consistency_check ─────────
def _info(env: str, exchange: str = "deribit") -> EnvironmentInfo:
"""Helper costruisce EnvironmentInfo per test."""
return EnvironmentInfo(
exchange=exchange,
environment=env,
source="env",
env_value="false" if env == "mainnet" else "true",
base_url=f"https://api.{exchange}.com" if env == "mainnet" else f"https://test.{exchange}.com",
)
def test_consistency_check_testnet_no_confirmation_ok():
"""Testnet senza conferma esplicita → ok, ritorna []. Default safe."""
info = _info("testnet")
creds = {"api_key": "k", "api_secret": "s"}
warnings = consistency_check(info, creds)
assert warnings == []
def test_consistency_check_mainnet_no_confirmation_raises():
"""Mainnet senza creds['environment']='mainnet' esplicito → fail-fast."""
info = _info("mainnet")
creds = {"api_key": "k", "api_secret": "s"}
with pytest.raises(EnvironmentMismatchError, match="mainnet.*explicit confirmation"):
consistency_check(info, creds)
def test_consistency_check_mainnet_with_confirmation_ok():
info = _info("mainnet")
creds = {"api_key": "k", "api_secret": "s", "environment": "mainnet"}
warnings = consistency_check(info, creds)
assert warnings == []
def test_consistency_check_explicit_mismatch_raises():
"""Secret dichiara mainnet ma resolver risolve testnet → fail-fast."""
info = _info("testnet")
creds = {"environment": "mainnet"}
with pytest.raises(EnvironmentMismatchError, match="declared.*resolved"):
consistency_check(info, creds)
def test_consistency_check_strict_mainnet_disabled():
"""Con strict_mainnet=False mainnet senza conferma logga warning ma non raise."""
info = _info("mainnet")
creds = {"api_key": "k", "api_secret": "s"}
warnings = consistency_check(info, creds, strict_mainnet=False)
assert any("mainnet" in w for w in warnings)
def test_consistency_check_url_does_not_match_environment_warns():
"""Base URL contiene 'test' ma environment='mainnet' → warning."""
from mcp_common.environment import EnvironmentInfo
info = EnvironmentInfo(
exchange="bybit",
environment="mainnet",
source="env",
env_value="false",
base_url="https://api-testnet.bybit.com", # url DICE testnet ma resolver MAINNET
)
creds = {"environment": "mainnet"}
warnings = consistency_check(info, creds)
assert any("base_url" in w.lower() for w in warnings)
@@ -1,90 +0,0 @@
from fastapi.testclient import TestClient
from mcp_common.auth import Principal, TokenStore
from mcp_common.server import build_app
def test_build_app_health():
store = TokenStore(tokens={})
app = build_app(name="test-mcp", version="0.0.1", token_store=store)
client = TestClient(app)
r = client.get("/health")
assert r.status_code == 200
body = r.json()
assert body["status"] == "healthy"
assert body["name"] == "test-mcp"
assert body["version"] == "0.0.1"
assert "uptime_seconds" in body
assert "data_timestamp" in body
assert r.headers.get("X-Duration-Ms") is not None
def test_build_app_adds_token_store():
store = TokenStore(tokens={"t1": Principal("x", {"core"})})
app = build_app(name="t", version="v", token_store=store)
assert app.state.token_store is store
def test_timestamp_injector_dict_response():
"""CER-P5-001: dict response gets data_timestamp + X-Data-Timestamp header."""
store = TokenStore(tokens={})
app = build_app(name="t", version="v", token_store=store)
@app.post("/tools/foo")
def foo():
return {"ok": True}
client = TestClient(app)
r = client.post("/tools/foo")
assert r.status_code == 200
body = r.json()
assert body["ok"] is True
assert "data_timestamp" in body
assert r.headers.get("X-Data-Timestamp") is not None
def test_timestamp_injector_list_of_dicts():
"""CER-P5-001: list of dicts → each item gets data_timestamp."""
store = TokenStore(tokens={})
app = build_app(name="t", version="v", token_store=store)
@app.post("/tools/list_items")
def list_items():
return [{"x": 1}, {"x": 2}]
client = TestClient(app)
r = client.post("/tools/list_items")
body = r.json()
assert isinstance(body, list)
assert len(body) == 2
for item in body:
assert "data_timestamp" in item
assert r.headers.get("X-Data-Timestamp") is not None
def test_timestamp_injector_preserves_existing():
"""CER-P5-001: se già presente, non override."""
store = TokenStore(tokens={})
app = build_app(name="t", version="v", token_store=store)
@app.post("/tools/already")
def already():
return {"data_timestamp": "2020-01-01T00:00:00Z", "x": 1}
client = TestClient(app)
body = client.post("/tools/already").json()
assert body["data_timestamp"] == "2020-01-01T00:00:00Z"
def test_timestamp_injector_empty_list_gets_header_only():
"""CER-P5-001: list vuota — no body modification, ma header presente."""
store = TokenStore(tokens={})
app = build_app(name="t", version="v", token_store=store)
@app.post("/tools/empty_list")
def empty_list():
return []
client = TestClient(app)
r = client.post("/tools/empty_list")
assert r.json() == []
assert r.headers.get("X-Data-Timestamp") is not None
@@ -1,29 +0,0 @@
[project]
name = "mcp-alpaca"
version = "0.1.0"
requires-python = ">=3.11"
dependencies = [
"mcp-common",
"fastapi>=0.115",
"uvicorn[standard]>=0.30",
"httpx>=0.27",
"pydantic>=2.6",
"alpaca-py>=0.32",
"pytz>=2024.1",
]
[project.optional-dependencies]
dev = ["pytest>=8", "pytest-asyncio>=0.23"]
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
[tool.hatch.build.targets.wheel]
packages = ["src/mcp_alpaca"]
[tool.uv.sources]
mcp-common = { workspace = true }
[project.scripts]
mcp-alpaca = "mcp_alpaca.__main__:main"
@@ -1,30 +0,0 @@
from __future__ import annotations
from mcp_common.app_factory import ExchangeAppSpec, run_exchange_main
from mcp_alpaca.client import AlpacaClient
from mcp_alpaca.server import create_app
SPEC = ExchangeAppSpec(
exchange="alpaca",
creds_env_var="ALPACA_CREDENTIALS_FILE",
env_var="ALPACA_PAPER",
flag_key="paper",
default_base_url_live="https://api.alpaca.markets",
default_base_url_testnet="https://paper-api.alpaca.markets",
default_port=9020,
build_client=lambda creds, env_info: AlpacaClient(
api_key=creds["api_key_id"],
secret_key=creds["secret_key"],
paper=(env_info.environment == "testnet"),
),
build_app=create_app,
)
def main():
run_exchange_main(SPEC)
if __name__ == "__main__":
main()
@@ -1,321 +0,0 @@
from __future__ import annotations
import os
from fastapi import Depends, HTTPException
from mcp_common.audit import audit_write_op
from mcp_common.auth import Principal, TokenStore, require_principal
from mcp_common.environment import EnvironmentInfo
from mcp_common.mcp_bridge import mount_mcp_endpoint
from mcp_common.server import build_app
from pydantic import BaseModel
from mcp_alpaca.client import AlpacaClient
from mcp_alpaca.leverage_cap import get_max_leverage
# --- Body models: reads ---
class AccountReq(BaseModel):
pass
class PositionsReq(BaseModel):
pass
class ActivitiesReq(BaseModel):
limit: int = 50
class AssetsReq(BaseModel):
asset_class: str = "stocks"
status: str = "active"
class TickerReq(BaseModel):
symbol: str
asset_class: str = "stocks"
class BarsReq(BaseModel):
symbol: str
asset_class: str = "stocks"
interval: str = "1d"
start: str | None = None
end: str | None = None
limit: int = 1000
class SnapshotReq(BaseModel):
symbol: str
class OptionChainReq(BaseModel):
underlying: str
expiry: str | None = None
class OpenOrdersReq(BaseModel):
limit: int = 50
class ClockReq(BaseModel):
pass
class CalendarReq(BaseModel):
start: str | None = None
end: str | None = None
# --- Body models: writes ---
class PlaceOrderReq(BaseModel):
symbol: str
side: str
qty: float | None = None
notional: float | None = None
order_type: str = "market"
limit_price: float | None = None
stop_price: float | None = None
tif: str = "day"
asset_class: str = "stocks"
class AmendOrderReq(BaseModel):
order_id: str
qty: float | None = None
limit_price: float | None = None
stop_price: float | None = None
tif: str | None = None
class CancelOrderReq(BaseModel):
order_id: str
class CancelAllReq(BaseModel):
pass
class ClosePositionReq(BaseModel):
symbol: str
qty: float | None = None
percentage: float | None = None
class CloseAllPositionsReq(BaseModel):
cancel_orders: bool = True
# --- ACL helper ---
def _check(principal: Principal, *, core: bool = False, observer: bool = False) -> None:
allowed: set[str] = set()
if core:
allowed.add("core")
if observer:
allowed.add("observer")
if not (principal.capabilities & allowed):
raise HTTPException(status_code=403, detail="forbidden")
def create_app(
*,
client: AlpacaClient,
token_store: TokenStore,
creds: dict | None = None,
env_info: EnvironmentInfo | None = None,
):
creds = creds or {}
app = build_app(name="mcp-alpaca", version="0.1.0", token_store=token_store)
# ── Reads ──────────────────────────────────────────────
@app.post("/tools/environment_info", tags=["reads"])
async def t_environment_info(principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
if env_info is None:
return {
"exchange": "alpaca",
"environment": "testnet" if getattr(client, "paper", True) else "mainnet",
"source": "credentials",
"env_value": None,
"base_url": getattr(client, "base_url", None),
"max_leverage": get_max_leverage(creds),
}
return {
"exchange": env_info.exchange,
"environment": env_info.environment,
"source": env_info.source,
"env_value": env_info.env_value,
"base_url": env_info.base_url,
"max_leverage": get_max_leverage(creds),
}
@app.post("/tools/get_account", tags=["reads"])
async def t_get_account(body: AccountReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_account()
@app.post("/tools/get_positions", tags=["reads"])
async def t_get_positions(body: PositionsReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return {"positions": await client.get_positions()}
@app.post("/tools/get_activities", tags=["reads"])
async def t_get_activities(body: ActivitiesReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return {"activities": await client.get_activities(body.limit)}
@app.post("/tools/get_assets", tags=["reads"])
async def t_get_assets(body: AssetsReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return {"assets": await client.get_assets(body.asset_class, body.status)}
@app.post("/tools/get_ticker", tags=["reads"])
async def t_get_ticker(body: TickerReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_ticker(body.symbol, body.asset_class)
@app.post("/tools/get_bars", tags=["reads"])
async def t_get_bars(body: BarsReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_bars(
body.symbol, body.asset_class, body.interval, body.start, body.end, body.limit,
)
@app.post("/tools/get_snapshot", tags=["reads"])
async def t_get_snapshot(body: SnapshotReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_snapshot(body.symbol)
@app.post("/tools/get_option_chain", tags=["reads"])
async def t_get_option_chain(body: OptionChainReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_option_chain(body.underlying, body.expiry)
@app.post("/tools/get_open_orders", tags=["reads"])
async def t_get_open_orders(body: OpenOrdersReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return {"orders": await client.get_open_orders(body.limit)}
@app.post("/tools/get_clock", tags=["reads"])
async def t_get_clock(body: ClockReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_clock()
@app.post("/tools/get_calendar", tags=["reads"])
async def t_get_calendar(body: CalendarReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return {"calendar": await client.get_calendar(body.start, body.end)}
# ── Writes ─────────────────────────────────────────────
@app.post("/tools/place_order", tags=["writes"])
async def t_place_order(body: PlaceOrderReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = await client.place_order(
body.symbol, body.side, body.qty, body.notional,
body.order_type, body.limit_price, body.stop_price, body.tif, body.asset_class,
)
audit_write_op(
principal=principal, action="place_order", exchange="alpaca",
target=body.symbol,
payload={"side": body.side, "qty": body.qty, "notional": body.notional,
"order_type": body.order_type, "limit_price": body.limit_price,
"stop_price": body.stop_price, "tif": body.tif,
"asset_class": body.asset_class},
result=result,
)
return result
@app.post("/tools/amend_order", tags=["writes"])
async def t_amend_order(body: AmendOrderReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = await client.amend_order(
body.order_id, body.qty, body.limit_price, body.stop_price, body.tif,
)
audit_write_op(
principal=principal, action="amend_order", exchange="alpaca",
target=body.order_id,
payload={"qty": body.qty, "limit_price": body.limit_price,
"stop_price": body.stop_price, "tif": body.tif},
result=result,
)
return result
@app.post("/tools/cancel_order", tags=["writes"])
async def t_cancel_order(body: CancelOrderReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = await client.cancel_order(body.order_id)
audit_write_op(
principal=principal, action="cancel_order", exchange="alpaca",
target=body.order_id, payload={}, result=result,
)
return result
@app.post("/tools/cancel_all_orders", tags=["writes"])
async def t_cancel_all(body: CancelAllReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = {"canceled": await client.cancel_all_orders()}
audit_write_op(
principal=principal, action="cancel_all_orders", exchange="alpaca",
payload={}, result=result,
)
return result
@app.post("/tools/close_position", tags=["writes"])
async def t_close(body: ClosePositionReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = await client.close_position(body.symbol, body.qty, body.percentage)
audit_write_op(
principal=principal, action="close_position", exchange="alpaca",
target=body.symbol,
payload={"qty": body.qty, "percentage": body.percentage},
result=result,
)
return result
@app.post("/tools/close_all_positions", tags=["writes"])
async def t_close_all(body: CloseAllPositionsReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = {"closed": await client.close_all_positions(body.cancel_orders)}
audit_write_op(
principal=principal, action="close_all_positions", exchange="alpaca",
payload={"cancel_orders": body.cancel_orders}, result=result,
)
return result
# ── MCP mount ──────────────────────────────────────────
port = int(os.environ.get("PORT", "9020"))
mount_mcp_endpoint(
app,
name="cerbero-alpaca",
version="0.1.0",
token_store=token_store,
internal_base_url=f"http://localhost:{port}",
tools=[
{"name": "environment_info", "description": "Ambiente operativo (paper/live), source, base_url, max_leverage cap."},
{"name": "get_account", "description": "Alpaca account summary (equity, cash, buying_power)."},
{"name": "get_positions", "description": "Posizioni aperte (stocks/crypto/options)."},
{"name": "get_activities", "description": "Activity log (fills, dividends, transfers)."},
{"name": "get_assets", "description": "Universo asset per asset_class."},
{"name": "get_ticker", "description": "Last trade + quote per simbolo (stocks/crypto/options)."},
{"name": "get_bars", "description": "OHLCV candles (stocks/crypto/options)."},
{"name": "get_snapshot", "description": "Snapshot completo stock (last trade+quote+bar)."},
{"name": "get_option_chain", "description": "Option chain per underlying."},
{"name": "get_open_orders", "description": "Ordini pending."},
{"name": "get_clock", "description": "Market clock (open/close, next_open)."},
{"name": "get_calendar", "description": "Calendar sessioni trading."},
{"name": "place_order", "description": "Invia ordine (CORE only)."},
{"name": "amend_order", "description": "Replace ordine esistente."},
{"name": "cancel_order", "description": "Cancella ordine."},
{"name": "cancel_all_orders", "description": "Cancella tutti ordini aperti."},
{"name": "close_position", "description": "Chiude posizione (tutta o parziale)."},
{"name": "close_all_positions", "description": "Liquida tutto il portafoglio."},
],
)
return app
@@ -1,50 +0,0 @@
from __future__ import annotations
from unittest.mock import MagicMock
from fastapi.testclient import TestClient
from mcp_alpaca.server import create_app
from mcp_common.auth import Principal, TokenStore
from mcp_common.environment import EnvironmentInfo
def _make_app(env_info, creds):
c = MagicMock()
c.paper = True
store = TokenStore(tokens={
"ct": Principal("core", {"core"}),
"ot": Principal("observer", {"observer"}),
})
return create_app(client=c, token_store=store, creds=creds, env_info=env_info)
def test_environment_info_paper_is_testnet():
"""Alpaca: 'paper' in the secret maps to environment='testnet'."""
env = EnvironmentInfo(
exchange="alpaca",
environment="testnet",
source="env",
env_value="true",
base_url="https://paper-api.alpaca.markets",
)
app = _make_app(env, creds={"max_leverage": 1})
c = TestClient(app)
r = c.post("/tools/environment_info", headers={"Authorization": "Bearer ot"})
assert r.status_code == 200
body = r.json()
assert body["exchange"] == "alpaca"
assert body["environment"] == "testnet"
assert body["source"] == "env"
assert body["base_url"] == "https://paper-api.alpaca.markets"
assert body["max_leverage"] == 1
def test_environment_info_requires_auth():
env = EnvironmentInfo(
exchange="alpaca", environment="testnet", source="default",
env_value=None, base_url="https://paper-api.alpaca.markets",
)
app = _make_app(env, creds={"max_leverage": 1})
c = TestClient(app)
r = c.post("/tools/environment_info")
assert r.status_code == 401
@@ -1,110 +0,0 @@
from __future__ import annotations
from unittest.mock import AsyncMock, MagicMock
import pytest
from fastapi.testclient import TestClient
from mcp_alpaca.server import create_app
from mcp_common.auth import Principal, TokenStore
@pytest.fixture
def token_store():
return TokenStore(
tokens={
"core-tok": Principal("core", {"core"}),
"obs-tok": Principal("observer", {"observer"}),
}
)
@pytest.fixture
def mock_client():
c = MagicMock()
c.get_account = AsyncMock(return_value={"equity": 100000})
c.get_positions = AsyncMock(return_value=[])
c.get_activities = AsyncMock(return_value=[])
c.get_assets = AsyncMock(return_value=[])
c.get_ticker = AsyncMock(return_value={"symbol": "AAPL"})
c.get_bars = AsyncMock(return_value={"bars": []})
c.get_snapshot = AsyncMock(return_value={})
c.get_option_chain = AsyncMock(return_value={"contracts": []})
c.get_open_orders = AsyncMock(return_value=[])
c.get_clock = AsyncMock(return_value={"is_open": True})
c.get_calendar = AsyncMock(return_value=[])
c.place_order = AsyncMock(return_value={"id": "o1"})
c.amend_order = AsyncMock(return_value={"id": "o1"})
c.cancel_order = AsyncMock(return_value={"canceled": True})
c.cancel_all_orders = AsyncMock(return_value=[])
c.close_position = AsyncMock(return_value={"id": "close1"})
c.close_all_positions = AsyncMock(return_value=[])
return c
@pytest.fixture
def http(mock_client, token_store):
app = create_app(client=mock_client, token_store=token_store, creds={"max_leverage": 1})
return TestClient(app)
CORE = {"Authorization": "Bearer core-tok"}
OBS = {"Authorization": "Bearer obs-tok"}
READ_ENDPOINTS = [
("/tools/get_account", {}),
("/tools/get_positions", {}),
("/tools/get_activities", {}),
("/tools/get_assets", {}),
("/tools/get_ticker", {"symbol": "AAPL"}),
("/tools/get_bars", {"symbol": "AAPL"}),
("/tools/get_snapshot", {"symbol": "AAPL"}),
("/tools/get_option_chain", {"underlying": "AAPL"}),
("/tools/get_open_orders", {}),
("/tools/get_clock", {}),
("/tools/get_calendar", {}),
]
WRITE_ENDPOINTS = [
("/tools/place_order", {"symbol": "AAPL", "side": "buy", "qty": 1}),
("/tools/amend_order", {"order_id": "o1", "qty": 2}),
("/tools/cancel_order", {"order_id": "o1"}),
("/tools/cancel_all_orders", {}),
("/tools/close_position", {"symbol": "AAPL"}),
("/tools/close_all_positions", {}),
]
@pytest.mark.parametrize("path,payload", READ_ENDPOINTS)
def test_read_core_ok(http, path, payload):
r = http.post(path, json=payload, headers=CORE)
assert r.status_code == 200, (path, r.text)
@pytest.mark.parametrize("path,payload", READ_ENDPOINTS)
def test_read_observer_ok(http, path, payload):
r = http.post(path, json=payload, headers=OBS)
assert r.status_code == 200, (path, r.text)
@pytest.mark.parametrize("path,payload", READ_ENDPOINTS)
def test_read_no_auth_401(http, path, payload):
r = http.post(path, json=payload)
assert r.status_code == 401, (path, r.text)
@pytest.mark.parametrize("path,payload", WRITE_ENDPOINTS)
def test_write_core_ok(http, path, payload):
r = http.post(path, json=payload, headers=CORE)
assert r.status_code == 200, (path, r.text)
@pytest.mark.parametrize("path,payload", WRITE_ENDPOINTS)
def test_write_observer_403(http, path, payload):
r = http.post(path, json=payload, headers=OBS)
assert r.status_code == 403, (path, r.text)
@pytest.mark.parametrize("path,payload", WRITE_ENDPOINTS)
def test_write_no_auth_401(http, path, payload):
r = http.post(path, json=payload)
assert r.status_code == 401, (path, r.text)
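The 200/403 matrix above pins down the capability gate implemented by `_check` in both servers: a request passes when the principal's capabilities intersect the set of capabilities the endpoint allows. A minimal standalone sketch of that rule (using a plain `allowed` predicate rather than the HTTPException-raising helper from the source):

```python
# Sketch of the capability-intersection rule behind _check.
def allowed(capabilities: set[str], *, core: bool = False, observer: bool = False) -> bool:
    """True when the principal's capabilities intersect the endpoint's gate."""
    gate: set[str] = set()
    if core:
        gate.add("core")
    if observer:
        gate.add("observer")
    return bool(capabilities & gate)  # non-empty intersection grants access
```

Reads are declared with `core=True, observer=True`, writes with `core=True` only, which is why the observer token gets 200 on every read endpoint and 403 on every write endpoint.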
@@ -1,28 +0,0 @@
[project]
name = "mcp-bybit"
version = "0.1.0"
requires-python = ">=3.11"
dependencies = [
"mcp-common",
"fastapi>=0.115",
"uvicorn[standard]>=0.30",
"httpx>=0.27",
"pydantic>=2.6",
"pybit>=5.8",
]
[project.optional-dependencies]
dev = ["pytest>=8", "pytest-asyncio>=0.23"]
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
[tool.hatch.build.targets.wheel]
packages = ["src/mcp_bybit"]
[tool.uv.sources]
mcp-common = { workspace = true }
[project.scripts]
mcp-bybit = "mcp_bybit.__main__:main"
@@ -1,30 +0,0 @@
from __future__ import annotations
from mcp_common.app_factory import ExchangeAppSpec, run_exchange_main
from mcp_bybit.client import BybitClient
from mcp_bybit.server import create_app
SPEC = ExchangeAppSpec(
exchange="bybit",
creds_env_var="BYBIT_CREDENTIALS_FILE",
env_var="BYBIT_TESTNET",
flag_key="testnet",
default_base_url_live="https://api.bybit.com",
default_base_url_testnet="https://api-testnet.bybit.com",
default_port=9019,
build_client=lambda creds, env_info: BybitClient(
api_key=creds["api_key"],
api_secret=creds["api_secret"],
testnet=(env_info.environment == "testnet"),
),
build_app=create_app,
)
def main():
run_exchange_main(SPEC)
if __name__ == "__main__":
main()
@@ -1,522 +0,0 @@
from __future__ import annotations
import os
from fastapi import Depends, HTTPException
from mcp_common.audit import audit_write_op
from mcp_common.auth import Principal, TokenStore, require_principal
from mcp_common.environment import EnvironmentInfo
from mcp_common.mcp_bridge import mount_mcp_endpoint
from mcp_common.server import build_app
from pydantic import BaseModel, Field
from mcp_bybit.client import BybitClient
from mcp_bybit.leverage_cap import enforce_leverage as _enforce_leverage
from mcp_bybit.leverage_cap import get_max_leverage
# --- Body models: reads ---
class TickerReq(BaseModel):
symbol: str
category: str = "linear"
class TickerBatchReq(BaseModel):
symbols: list[str]
category: str = "linear"
class OrderbookReq(BaseModel):
symbol: str
category: str = "linear"
limit: int = 50
class HistoricalReq(BaseModel):
symbol: str
category: str = "linear"
interval: str = "60"
start: int | None = None
end: int | None = None
limit: int = 1000
class IndicatorsReq(BaseModel):
symbol: str
category: str = "linear"
indicators: list[str] = ["rsi", "atr", "macd", "adx"]
interval: str = "60"
start: int | None = None
end: int | None = None
class FundingRateReq(BaseModel):
symbol: str
category: str = "linear"
class FundingHistoryReq(BaseModel):
symbol: str
category: str = "linear"
limit: int = 100
class OpenInterestReq(BaseModel):
symbol: str
category: str = "linear"
interval: str = "5min"
limit: int = 288
class InstrumentsReq(BaseModel):
category: str = "linear"
symbol: str | None = None
class OptionChainReq(BaseModel):
base_coin: str
expiry: str | None = None
class PositionsReq(BaseModel):
category: str = "linear"
class AccountSummaryReq(BaseModel):
pass
class TradeHistoryReq(BaseModel):
category: str = "linear"
limit: int = 50
class OpenOrdersReq(BaseModel):
category: str = "linear"
symbol: str | None = None
class BasisSpotPerpReq(BaseModel):
asset: str
class OrderbookImbalanceReq(BaseModel):
symbol: str
category: str = "linear"
depth: int = 10
class BasisTermStructureReq(BaseModel):
asset: str
# --- Body models: writes ---
class PlaceOrderReq(BaseModel):
category: str
symbol: str
side: str
qty: float
order_type: str = "Limit"
price: float | None = None
tif: str = "GTC"
reduce_only: bool = False
position_idx: int | None = None
class ComboLegReq(BaseModel):
symbol: str
side: str
qty: float
order_type: str = "Limit"
price: float | None = None
tif: str = "GTC"
reduce_only: bool = False
class PlaceComboOrderReq(BaseModel):
category: str = "option"
legs: list[ComboLegReq] = Field(..., min_length=2)
class AmendOrderReq(BaseModel):
category: str
symbol: str
order_id: str
new_qty: float | None = None
new_price: float | None = None
class CancelOrderReq(BaseModel):
category: str
symbol: str
order_id: str
class CancelAllReq(BaseModel):
category: str
symbol: str | None = None
class SetStopLossReq(BaseModel):
category: str
symbol: str
stop_loss: float
position_idx: int = 0
class SetTakeProfitReq(BaseModel):
category: str
symbol: str
take_profit: float
position_idx: int = 0
class ClosePositionReq(BaseModel):
category: str
symbol: str
class SetLeverageReq(BaseModel):
category: str
symbol: str
leverage: int
class SwitchModeReq(BaseModel):
category: str
symbol: str
mode: str
class TransferReq(BaseModel):
coin: str
amount: float
from_type: str
to_type: str
# --- ACL helper ---
def _check(principal: Principal, *, core: bool = False, observer: bool = False) -> None:
allowed: set[str] = set()
if core:
allowed.add("core")
if observer:
allowed.add("observer")
if not (principal.capabilities & allowed):
raise HTTPException(status_code=403, detail="forbidden")
def create_app(
*,
client: BybitClient,
token_store: TokenStore,
creds: dict | None = None,
env_info: EnvironmentInfo | None = None,
):
creds = creds or {}
app = build_app(name="mcp-bybit", version="0.1.0", token_store=token_store)
# ── Reads ──────────────────────────────────────────────
@app.post("/tools/environment_info", tags=["reads"])
async def t_environment_info(principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
if env_info is None:
return {
"exchange": "bybit",
"environment": "testnet" if client.testnet else "mainnet",
"source": "credentials",
"env_value": None,
"base_url": getattr(client, "base_url", None),
"max_leverage": get_max_leverage(creds),
}
return {
"exchange": env_info.exchange,
"environment": env_info.environment,
"source": env_info.source,
"env_value": env_info.env_value,
"base_url": env_info.base_url,
"max_leverage": get_max_leverage(creds),
}
@app.post("/tools/get_ticker", tags=["reads"])
async def t_get_ticker(body: TickerReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_ticker(body.symbol, body.category)
@app.post("/tools/get_ticker_batch", tags=["reads"])
async def t_get_ticker_batch(body: TickerBatchReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_ticker_batch(body.symbols, body.category)
@app.post("/tools/get_orderbook", tags=["reads"])
async def t_get_orderbook(body: OrderbookReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_orderbook(body.symbol, body.category, body.limit)
@app.post("/tools/get_historical", tags=["reads"])
async def t_get_historical(body: HistoricalReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_historical(
body.symbol, body.category, body.interval, body.start, body.end, body.limit,
)
@app.post("/tools/get_indicators", tags=["reads"])
async def t_get_indicators(body: IndicatorsReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_indicators(
body.symbol, body.category, body.indicators,
body.interval, body.start, body.end,
)
@app.post("/tools/get_funding_rate", tags=["reads"])
async def t_get_funding_rate(body: FundingRateReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_funding_rate(body.symbol, body.category)
@app.post("/tools/get_funding_history", tags=["reads"])
async def t_get_funding_history(body: FundingHistoryReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_funding_history(body.symbol, body.category, body.limit)
@app.post("/tools/get_open_interest", tags=["reads"])
async def t_get_open_interest(body: OpenInterestReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_open_interest(body.symbol, body.category, body.interval, body.limit)
@app.post("/tools/get_instruments", tags=["reads"])
async def t_get_instruments(body: InstrumentsReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_instruments(body.category, body.symbol)
@app.post("/tools/get_option_chain", tags=["reads"])
async def t_get_option_chain(body: OptionChainReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_option_chain(body.base_coin, body.expiry)
@app.post("/tools/get_positions", tags=["reads"])
async def t_get_positions(body: PositionsReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return {"positions": await client.get_positions(body.category)}
@app.post("/tools/get_account_summary", tags=["reads"])
async def t_get_account_summary(body: AccountSummaryReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_account_summary()
@app.post("/tools/get_trade_history", tags=["reads"])
async def t_get_trade_history(body: TradeHistoryReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return {"trades": await client.get_trade_history(body.category, body.limit)}
@app.post("/tools/get_open_orders", tags=["reads"])
async def t_get_open_orders(body: OpenOrdersReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return {"orders": await client.get_open_orders(body.category, body.symbol)}
@app.post("/tools/get_basis_spot_perp", tags=["reads"])
async def t_get_basis_spot_perp(body: BasisSpotPerpReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_basis_spot_perp(body.asset)
@app.post("/tools/get_orderbook_imbalance", tags=["reads"])
async def t_get_ob_imbalance(body: OrderbookImbalanceReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_orderbook_imbalance(body.symbol, body.category, body.depth)
@app.post("/tools/get_basis_term_structure", tags=["reads"])
async def t_get_basis_term_structure(body: BasisTermStructureReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_basis_term_structure(body.asset)
# ── Writes ─────────────────────────────────────────────
@app.post("/tools/place_order", tags=["writes"])
async def t_place_order(body: PlaceOrderReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = await client.place_order(
body.category, body.symbol, body.side, body.qty,
body.order_type, body.price, body.tif, body.reduce_only, body.position_idx,
)
audit_write_op(
principal=principal, action="place_order", exchange="bybit",
target=body.symbol,
payload={"category": body.category, "side": body.side, "qty": body.qty,
"order_type": body.order_type, "price": body.price, "tif": body.tif,
"reduce_only": body.reduce_only},
result=result,
)
return result
@app.post("/tools/place_combo_order", tags=["writes"])
async def t_place_combo_order(body: PlaceComboOrderReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = await client.place_combo_order(
category=body.category,
legs=[leg.model_dump() for leg in body.legs],
)
audit_write_op(
principal=principal, action="place_combo_order", exchange="bybit",
payload={"category": body.category,
"legs": [leg.model_dump() for leg in body.legs]},
result=result if isinstance(result, dict) else None,
)
return result
@app.post("/tools/amend_order", tags=["writes"])
async def t_amend_order(body: AmendOrderReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = await client.amend_order(
body.category, body.symbol, body.order_id, body.new_qty, body.new_price,
)
audit_write_op(
principal=principal, action="amend_order", exchange="bybit",
target=body.order_id,
payload={"category": body.category, "symbol": body.symbol,
"new_qty": body.new_qty, "new_price": body.new_price},
result=result,
)
return result
@app.post("/tools/cancel_order", tags=["writes"])
async def t_cancel_order(body: CancelOrderReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = await client.cancel_order(body.category, body.symbol, body.order_id)
audit_write_op(
principal=principal, action="cancel_order", exchange="bybit",
target=body.order_id,
payload={"category": body.category, "symbol": body.symbol},
result=result,
)
return result
@app.post("/tools/cancel_all_orders", tags=["writes"])
async def t_cancel_all(body: CancelAllReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = await client.cancel_all_orders(body.category, body.symbol)
audit_write_op(
principal=principal, action="cancel_all_orders", exchange="bybit",
target=body.symbol,
payload={"category": body.category},
result=result,
)
return result
@app.post("/tools/set_stop_loss", tags=["writes"])
async def t_set_sl(body: SetStopLossReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = await client.set_stop_loss(body.category, body.symbol, body.stop_loss, body.position_idx)
audit_write_op(
principal=principal, action="set_stop_loss", exchange="bybit",
target=body.symbol,
payload={"stop_loss": body.stop_loss, "position_idx": body.position_idx},
result=result,
)
return result
@app.post("/tools/set_take_profit", tags=["writes"])
async def t_set_tp(body: SetTakeProfitReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = await client.set_take_profit(body.category, body.symbol, body.take_profit, body.position_idx)
audit_write_op(
principal=principal, action="set_take_profit", exchange="bybit",
target=body.symbol,
payload={"take_profit": body.take_profit, "position_idx": body.position_idx},
result=result,
)
return result
@app.post("/tools/close_position", tags=["writes"])
async def t_close(body: ClosePositionReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = await client.close_position(body.category, body.symbol)
audit_write_op(
principal=principal, action="close_position", exchange="bybit",
target=body.symbol,
payload={"category": body.category},
result=result,
)
return result
@app.post("/tools/set_leverage", tags=["writes"])
async def t_set_leverage(body: SetLeverageReq, principal: Principal = Depends(require_principal)):
_enforce_leverage(body.leverage, creds=creds, exchange="bybit")
_check(principal, core=True)
result = await client.set_leverage(body.category, body.symbol, body.leverage)
audit_write_op(
principal=principal, action="set_leverage", exchange="bybit",
target=body.symbol,
payload={"category": body.category, "leverage": body.leverage},
result=result,
)
return result
@app.post("/tools/switch_position_mode", tags=["writes"])
async def t_switch_mode(body: SwitchModeReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = await client.switch_position_mode(body.category, body.symbol, body.mode)
audit_write_op(
principal=principal, action="switch_position_mode", exchange="bybit",
target=body.symbol,
payload={"category": body.category, "mode": body.mode},
result=result,
)
return result
@app.post("/tools/transfer_asset", tags=["writes"])
async def t_transfer(body: TransferReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = await client.transfer_asset(body.coin, body.amount, body.from_type, body.to_type)
audit_write_op(
principal=principal, action="transfer_asset", exchange="bybit",
payload={"coin": body.coin, "amount": body.amount,
"from_type": body.from_type, "to_type": body.to_type},
result=result,
)
return result
# ── MCP mount ──────────────────────────────────────────
port = int(os.environ.get("PORT", "9019"))
mount_mcp_endpoint(
app,
name="cerbero-bybit",
version="0.1.0",
token_store=token_store,
internal_base_url=f"http://localhost:{port}",
tools=[
{"name": "environment_info", "description": "Ambiente operativo (testnet/mainnet), source, base_url, max_leverage cap."},
{"name": "get_ticker", "description": "Ticker Bybit (spot/linear/inverse/option)."},
{"name": "get_ticker_batch", "description": "Ticker per più simboli."},
{"name": "get_orderbook", "description": "Orderbook profondità N."},
{"name": "get_historical", "description": "OHLCV candles Bybit."},
{"name": "get_indicators", "description": "Indicatori tecnici (RSI, ATR, MACD, ADX)."},
{"name": "get_funding_rate", "description": "Funding corrente perp."},
{"name": "get_funding_history", "description": "Funding storico perp."},
{"name": "get_open_interest", "description": "Open interest history perp."},
{"name": "get_instruments", "description": "Specs contratti."},
{"name": "get_option_chain", "description": "Option chain BTC/ETH/SOL."},
{"name": "get_positions", "description": "Posizioni aperte."},
{"name": "get_account_summary", "description": "Wallet balance e margine."},
{"name": "get_trade_history", "description": "Fills recenti."},
{"name": "get_open_orders", "description": "Ordini pending."},
{"name": "get_basis_spot_perp", "description": "Basis spot vs linear perp."},
{"name": "get_orderbook_imbalance", "description": "Microstructure: imbalance ratio + microprice + slope su top-N livelli book."},
{"name": "get_basis_term_structure", "description": "Basis curve futures dated vs spot, annualizzato."},
{"name": "place_order", "description": "Invia ordine (CORE only)."},
{"name": "place_combo_order", "description": "Multi-leg atomico via place_batch_order (solo category=option)."},
{"name": "amend_order", "description": "Modifica ordine esistente."},
{"name": "cancel_order", "description": "Cancella ordine."},
{"name": "cancel_all_orders", "description": "Cancella tutti ordini."},
{"name": "set_stop_loss", "description": "Setta stop loss su posizione."},
{"name": "set_take_profit", "description": "Setta take profit su posizione."},
{"name": "close_position", "description": "Chiude posizione aperta."},
{"name": "set_leverage", "description": "Leva buy+sell uniforme."},
{"name": "switch_position_mode", "description": "Hedge vs one-way."},
{"name": "transfer_asset", "description": "Trasferimento interno tra account types."},
],
)
return app
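The `tools` list passed to `mount_mcp_endpoint` mirrors the `/tools/*` routes registered above, one entry per endpoint. A minimal sketch of sanity-checking such a registry — the `/tools/{name}` path convention is taken from the routes above, but the helper itself is illustrative and not part of `mcp_common`:

```python
def check_tool_registry(tools: list[dict]) -> list[str]:
    """Return the internal endpoint path for each declared tool,
    raising if a name is duplicated or a description is missing."""
    seen: set[str] = set()
    paths: list[str] = []
    for tool in tools:
        name = tool["name"]
        if name in seen:
            raise ValueError(f"duplicate tool name: {name}")
        if not tool.get("description"):
            raise ValueError(f"missing description for tool: {name}")
        seen.add(name)
        paths.append(f"/tools/{name}")
    return paths

paths = check_tool_registry([
    {"name": "get_ticker", "description": "Bybit ticker."},
    {"name": "place_order", "description": "Submit an order (CORE only)."},
])
print(paths)  # ['/tools/get_ticker', '/tools/place_order']
```

A check like this can catch a registry drifting out of sync with the routes before the MCP bridge exposes a tool that 404s internally.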
@@ -1,54 +0,0 @@
from __future__ import annotations
from unittest.mock import AsyncMock, MagicMock
from fastapi.testclient import TestClient
from mcp_bybit.server import create_app
from mcp_common.auth import Principal, TokenStore
from mcp_common.environment import EnvironmentInfo
def _make_app(env_info, creds):
c = MagicMock()
c.testnet = True
c.set_leverage = AsyncMock(return_value={"state": "ok"})
store = TokenStore(tokens={
"ct": Principal("core", {"core"}),
"ot": Principal("observer", {"observer"}),
})
return create_app(client=c, token_store=store, creds=creds, env_info=env_info)
def test_environment_info_full_shape():
env = EnvironmentInfo(
exchange="bybit",
environment="testnet",
source="env",
env_value="true",
base_url="https://api-testnet.bybit.com",
)
app = _make_app(env, creds={"max_leverage": 3})
c = TestClient(app)
r = c.post(
"/tools/environment_info",
headers={"Authorization": "Bearer ot"},
)
assert r.status_code == 200
body = r.json()
assert body["exchange"] == "bybit"
assert body["environment"] == "testnet"
assert body["source"] == "env"
assert body["env_value"] == "true"
assert body["base_url"] == "https://api-testnet.bybit.com"
assert body["max_leverage"] == 3
def test_environment_info_requires_auth():
env = EnvironmentInfo(
exchange="bybit", environment="testnet", source="default",
env_value=None, base_url="https://api-testnet.bybit.com",
)
app = _make_app(env, creds={"max_leverage": 3})
c = TestClient(app)
r = c.post("/tools/environment_info")
assert r.status_code == 401
@@ -1,150 +0,0 @@
from __future__ import annotations
from unittest.mock import AsyncMock, MagicMock
import pytest
from fastapi.testclient import TestClient
from mcp_bybit.server import create_app
from mcp_common.auth import Principal, TokenStore
@pytest.fixture
def token_store():
return TokenStore(
tokens={
"core-tok": Principal("core", {"core"}),
"obs-tok": Principal("observer", {"observer"}),
}
)
@pytest.fixture
def mock_client():
c = MagicMock()
c.get_ticker = AsyncMock(return_value={"symbol": "BTCUSDT"})
c.get_ticker_batch = AsyncMock(return_value={"BTCUSDT": {}})
c.get_orderbook = AsyncMock(return_value={"bids": [], "asks": []})
c.get_historical = AsyncMock(return_value={"candles": []})
c.get_indicators = AsyncMock(return_value={"rsi": 50.0})
c.get_funding_rate = AsyncMock(return_value={"funding_rate": 0.0001})
c.get_funding_history = AsyncMock(return_value={"history": []})
c.get_open_interest = AsyncMock(return_value={"points": []})
c.get_instruments = AsyncMock(return_value={"instruments": []})
c.get_option_chain = AsyncMock(return_value={"options": []})
c.get_positions = AsyncMock(return_value=[])
c.get_account_summary = AsyncMock(return_value={"equity": 0})
c.get_trade_history = AsyncMock(return_value=[])
c.get_open_orders = AsyncMock(return_value=[])
c.get_basis_spot_perp = AsyncMock(return_value={"basis_pct": 0})
c.place_order = AsyncMock(return_value={"order_id": "x"})
c.amend_order = AsyncMock(return_value={"order_id": "x"})
c.cancel_order = AsyncMock(return_value={"status": "cancelled"})
c.cancel_all_orders = AsyncMock(return_value={"cancelled_ids": []})
c.set_stop_loss = AsyncMock(return_value={"status": "stop_loss_set"})
c.set_take_profit = AsyncMock(return_value={"status": "take_profit_set"})
c.close_position = AsyncMock(return_value={"status": "submitted"})
c.set_leverage = AsyncMock(return_value={"status": "leverage_set"})
c.switch_position_mode = AsyncMock(return_value={"status": "mode_switched"})
c.transfer_asset = AsyncMock(return_value={"transfer_id": "tx"})
c.place_combo_order = AsyncMock(return_value={"orders": [{"order_id": "ord-1"}, {"order_id": "ord-2"}]})
c.get_orderbook_imbalance = AsyncMock(return_value={"imbalance_ratio": 0.0, "microprice": 100.0})
c.get_basis_term_structure = AsyncMock(return_value={"asset": "BTC", "term_structure": []})
return c
@pytest.fixture
def http(mock_client, token_store):
app = create_app(client=mock_client, token_store=token_store, creds={"max_leverage": 5})
return TestClient(app)
CORE = {"Authorization": "Bearer core-tok"}
OBS = {"Authorization": "Bearer obs-tok"}
READ_ENDPOINTS = [
("/tools/get_ticker", {"symbol": "BTCUSDT"}),
("/tools/get_ticker_batch", {"symbols": ["BTCUSDT"]}),
("/tools/get_orderbook", {"symbol": "BTCUSDT"}),
("/tools/get_historical", {"symbol": "BTCUSDT"}),
("/tools/get_indicators", {"symbol": "BTCUSDT"}),
("/tools/get_funding_rate", {"symbol": "BTCUSDT"}),
("/tools/get_funding_history", {"symbol": "BTCUSDT"}),
("/tools/get_open_interest", {"symbol": "BTCUSDT"}),
("/tools/get_instruments", {}),
("/tools/get_option_chain", {"base_coin": "BTC"}),
("/tools/get_positions", {}),
("/tools/get_account_summary", {}),
("/tools/get_trade_history", {}),
("/tools/get_open_orders", {}),
("/tools/get_basis_spot_perp", {"asset": "BTC"}),
("/tools/get_orderbook_imbalance", {"symbol": "BTCUSDT"}),
("/tools/get_basis_term_structure", {"asset": "BTC"}),
]
WRITE_ENDPOINTS = [
("/tools/place_order", {"category": "linear", "symbol": "BTCUSDT", "side": "Buy", "qty": 0.01}),
("/tools/amend_order", {"category": "linear", "symbol": "BTCUSDT", "order_id": "o1"}),
("/tools/cancel_order", {"category": "linear", "symbol": "BTCUSDT", "order_id": "o1"}),
("/tools/cancel_all_orders", {"category": "linear"}),
("/tools/set_stop_loss", {"category": "linear", "symbol": "BTCUSDT", "stop_loss": 55000}),
("/tools/set_take_profit", {"category": "linear", "symbol": "BTCUSDT", "take_profit": 65000}),
("/tools/close_position", {"category": "linear", "symbol": "BTCUSDT"}),
("/tools/set_leverage", {"category": "linear", "symbol": "BTCUSDT", "leverage": 5}),
("/tools/switch_position_mode", {"category": "linear", "symbol": "BTCUSDT", "mode": "hedge"}),
("/tools/transfer_asset", {"coin": "USDT", "amount": 10.0, "from_type": "UNIFIED", "to_type": "FUND"}),
("/tools/place_combo_order", {
"category": "option",
"legs": [
{"symbol": "BTC-30APR26-75000-C-USDT", "side": "Buy", "qty": 0.01, "order_type": "Limit", "price": 5.0},
{"symbol": "BTC-30APR26-80000-C-USDT", "side": "Sell", "qty": 0.01, "order_type": "Limit", "price": 3.0},
],
}),
]
def test_place_combo_order_min_legs(http):
r = http.post(
"/tools/place_combo_order",
json={
"category": "option",
"legs": [{"symbol": "X", "side": "Buy", "qty": 1, "order_type": "Limit", "price": 1.0}],
},
headers=CORE,
)
assert r.status_code == 422
@pytest.mark.parametrize("path,payload", READ_ENDPOINTS)
def test_read_core_ok(http, path, payload):
r = http.post(path, json=payload, headers=CORE)
assert r.status_code == 200, (path, r.text)
@pytest.mark.parametrize("path,payload", READ_ENDPOINTS)
def test_read_observer_ok(http, path, payload):
r = http.post(path, json=payload, headers=OBS)
assert r.status_code == 200, (path, r.text)
@pytest.mark.parametrize("path,payload", READ_ENDPOINTS)
def test_read_no_auth_401(http, path, payload):
r = http.post(path, json=payload)
assert r.status_code == 401, (path, r.text)
@pytest.mark.parametrize("path,payload", WRITE_ENDPOINTS)
def test_write_core_ok(http, path, payload):
r = http.post(path, json=payload, headers=CORE)
assert r.status_code == 200, (path, r.text)
@pytest.mark.parametrize("path,payload", WRITE_ENDPOINTS)
def test_write_observer_403(http, path, payload):
r = http.post(path, json=payload, headers=OBS)
assert r.status_code == 403, (path, r.text)
@pytest.mark.parametrize("path,payload", WRITE_ENDPOINTS)
def test_write_no_auth_401(http, path, payload):
r = http.post(path, json=payload)
assert r.status_code == 401, (path, r.text)
@@ -1,27 +0,0 @@
[project]
name = "mcp-deribit"
version = "0.1.0"
requires-python = ">=3.11"
dependencies = [
"mcp-common",
"fastapi>=0.115",
"uvicorn[standard]>=0.30",
"httpx>=0.27",
"pydantic>=2.6",
]
[project.optional-dependencies]
dev = ["pytest>=8", "pytest-asyncio>=0.23", "pytest-httpx>=0.30"]
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
[tool.hatch.build.targets.wheel]
packages = ["src/mcp_deribit"]
[tool.uv.sources]
mcp-common = { workspace = true }
[project.scripts]
mcp-deribit = "mcp_deribit.__main__:main"
@@ -1,30 +0,0 @@
from __future__ import annotations
from mcp_common.app_factory import ExchangeAppSpec, run_exchange_main
from mcp_deribit.client import DeribitClient
from mcp_deribit.server import create_app
SPEC = ExchangeAppSpec(
exchange="deribit",
creds_env_var="CREDENTIALS_FILE",
env_var="DERIBIT_TESTNET",
flag_key="testnet",
default_base_url_live="https://www.deribit.com/api/v2",
default_base_url_testnet="https://test.deribit.com/api/v2",
default_port=9011,
build_client=lambda creds, env_info: DeribitClient(
client_id=creds["client_id"],
client_secret=creds["client_secret"],
testnet=(env_info.environment == "testnet"),
),
build_app=create_app,
)
def main():
run_exchange_main(SPEC)
if __name__ == "__main__":
main()
@@ -1,18 +0,0 @@
"""Re-export shim per backward-compat: la logica vive ora in
mcp_common.env_validation. Non aggiungere nuovo codice qui.
"""
from mcp_common.env_validation import (
MissingEnvError,
fail_fast_if_missing,
optional_env,
require_env,
summarize,
)
__all__ = [
"MissingEnvError",
"fail_fast_if_missing",
"optional_env",
"require_env",
"summarize",
]
@@ -1,695 +0,0 @@
from __future__ import annotations
import contextlib
import os
from fastapi import Depends, FastAPI, HTTPException
from mcp_common.audit import audit_write_op
from mcp_common.auth import Principal, TokenStore, require_principal
from mcp_common.environment import EnvironmentInfo
from mcp_common.mcp_bridge import mount_mcp_endpoint
from mcp_common.server import build_app
from pydantic import BaseModel, field_validator, model_validator
from mcp_deribit.client import DeribitClient
from mcp_deribit.leverage_cap import enforce_leverage as _enforce_leverage
from mcp_deribit.leverage_cap import get_max_leverage
# --- Body models ---
class GetTickerReq(BaseModel):
instrument_name: str | None = None
instrument: str | None = None
model_config = {"extra": "allow"}
@model_validator(mode="after")
def _normalize(self):
sym = self.instrument_name or self.instrument
if not sym:
raise ValueError("instrument_name (or instrument) is required")
self.instrument_name = sym
return self
class GetTickerBatchReq(BaseModel):
instrument_names: list[str] | None = None
instruments: list[str] | None = None
model_config = {"extra": "allow"}
@model_validator(mode="after")
def _normalize(self):
names = self.instrument_names or self.instruments
if not names:
raise ValueError("instrument_names (or instruments) is required")
self.instrument_names = names
return self
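Both request models above accept a legacy alias (`instrument` / `instruments`) and normalize it into the canonical field via an `after`-mode model validator. The same normalization, extracted into a plain function for clarity (the pydantic models above remain the source of truth):

```python
def normalize_instrument(body: dict) -> dict:
    """Copy the legacy 'instrument' alias into 'instrument_name',
    mirroring GetTickerReq._normalize above."""
    sym = body.get("instrument_name") or body.get("instrument")
    if not sym:
        raise ValueError("instrument_name (or instrument) is required")
    return {**body, "instrument_name": sym}

print(normalize_instrument({"instrument": "BTC-PERPETUAL"})["instrument_name"])
# → BTC-PERPETUAL
```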
class GetInstrumentsReq(BaseModel):
currency: str
kind: str | None = None
expiry_from: str | None = None
expiry_to: str | None = None
strike_min: float | None = None
strike_max: float | None = None
min_open_interest: float | None = None
limit: int = 100
offset: int = 0
class GetOrderbookReq(BaseModel):
instrument_name: str
depth: int = 10
class OrderbookImbalanceReq(BaseModel):
instrument_name: str
depth: int = 10
class GetPositionsReq(BaseModel):
currency: str = "USDC"
class GetAccountSummaryReq(BaseModel):
currency: str = "USDC"
class GetTradeHistoryReq(BaseModel):
limit: int = 100
instrument_name: str | None = None
class GetHistoricalReq(BaseModel):
instrument: str
start_date: str
end_date: str
resolution: str = "1h"
class GetDvolReq(BaseModel):
currency: str = "BTC"
start_date: str
end_date: str
resolution: str = "1D"
class GetDvolHistoryReq(BaseModel):
currency: str = "BTC"
lookback_days: int = 90
class GetIvRankReq(BaseModel):
instrument: str
class GetRealizedVolReq(BaseModel):
currency: str = "BTC"
windows: list[int] = [14, 30]
class GetGexReq(BaseModel):
currency: str
expiry_from: str | None = None
expiry_to: str | None = None
top_n_strikes: int = 50
class OptionFlowReq(BaseModel):
"""Body comune per indicatori option-flow (dealer gamma, vanna/charm,
OI-weighted skew, smile asymmetry, ATM vs wings)."""
currency: str
expiry_from: str | None = None
expiry_to: str | None = None
top_n_strikes: int = 100
class GetPcRatioReq(BaseModel):
currency: str
class GetSkew25dReq(BaseModel):
currency: str
expiry: str
class GetTermStructureReq(BaseModel):
currency: str
class CalculateSpreadPayoffReq(BaseModel):
legs: list[dict]
quote_currency: str = "USD"
class RunBacktestReq(BaseModel):
strategy_name: str
underlying: str = "BTC"
lookback_days: int = 30
resolution: str = "4h"
entry_rules: dict | None = None
exit_rules: dict | None = None
class FindByDeltaReq(BaseModel):
currency: str
expiry: str
target_delta: float
option_type: str
max_results: int = 3
min_open_interest: float = 100.0
min_volume_24h: float = 20.0
class GetIndicatorsReq(BaseModel):
instrument: str
indicators: list[str]
start_date: str
end_date: str
resolution: str = "1h"
@field_validator("indicators", mode="before")
@classmethod
def _coerce_indicators(cls, v):
if isinstance(v, str):
import json
s = v.strip()
if s.startswith("["):
try:
parsed = json.loads(s)
if isinstance(parsed, list):
return [str(x).strip() for x in parsed if str(x).strip()]
except json.JSONDecodeError:
pass
return [x.strip() for x in s.split(",") if x.strip()]
if isinstance(v, list):
return v
raise ValueError(
"indicators must be a list like ['rsi','atr','macd'] "
"or a comma-separated string like 'rsi,atr,macd'"
)
class PlaceOrderReq(BaseModel):
instrument_name: str
side: str # "buy" | "sell"
amount: float
type: str = "limit"
price: float | None = None
reduce_only: bool = False
post_only: bool = False
label: str | None = None
leverage: int | None = None # CER-016: None → default cap (3x)
class ComboLeg(BaseModel):
instrument_name: str
direction: str # "buy" | "sell"
ratio: int = 1
class PlaceComboOrderReq(BaseModel):
legs: list[ComboLeg]
side: str # "buy" | "sell"
amount: float
type: str = "limit"
price: float | None = None
label: str | None = None
leverage: int | None = None
@model_validator(mode="after")
def _at_least_two_legs(self):
if len(self.legs) < 2:
raise ValueError("combo requires at least 2 legs")
return self
class CancelOrderReq(BaseModel):
order_id: str
class SetStopLossReq(BaseModel):
order_id: str
stop_price: float
class SetTakeProfitReq(BaseModel):
order_id: str
tp_price: float
class ClosePositionReq(BaseModel):
instrument_name: str
# --- ACL helper ---
def _check(principal: Principal, *, core: bool = False, observer: bool = False) -> None:
allowed: set[str] = set()
if core:
allowed.add("core")
if observer:
allowed.add("observer")
if not (principal.capabilities & allowed):
raise HTTPException(403, f"capability required: {allowed}")
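`_check` grants access whenever the caller's capability set intersects the endpoint's allowed set. The same logic with a plain `PermissionError` standing in for `HTTPException`, for illustration:

```python
def check_capabilities(caller: set[str], *, core: bool = False,
                       observer: bool = False) -> None:
    """Raise PermissionError unless the caller holds an allowed capability
    (mirrors _check above; PermissionError stands in for HTTPException 403)."""
    allowed: set[str] = set()
    if core:
        allowed.add("core")
    if observer:
        allowed.add("observer")
    if not (caller & allowed):
        raise PermissionError(f"capability required: {allowed}")

check_capabilities({"core"}, core=True)               # write endpoint, core: ok
check_capabilities({"observer"}, core=True, observer=True)  # read endpoint: ok
try:
    check_capabilities({"observer"}, core=True)       # observer on a write
except PermissionError as e:
    print(e)
```

Set intersection means a principal holding several capabilities passes as long as any one of them is allowed.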
# --- App factory ---
def create_app(
*,
client: DeribitClient,
token_store: TokenStore,
creds: dict,
env_info: EnvironmentInfo | None = None,
) -> FastAPI:
from contextlib import asynccontextmanager
cap_default = get_max_leverage(creds)
# CER-016: pre-set the leverage cap on the main perps at boot (best-effort).
@asynccontextmanager
async def _lifespan(_app: FastAPI):
for inst in ("BTC-PERPETUAL", "ETH-PERPETUAL"):
with contextlib.suppress(Exception):
await client.set_leverage(inst, cap_default)
yield
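The lifespan hook applies the leverage cap best-effort: a failure on one instrument must not abort startup or skip the remaining instruments. The pattern in isolation, with a simulated failing call:

```python
import contextlib

def preset_all(instruments, setter):
    """Apply `setter` to each instrument, swallowing per-instrument failures
    (the same best-effort pattern as the lifespan hook above)."""
    applied = []
    for inst in instruments:
        with contextlib.suppress(Exception):
            setter(inst)
            applied.append(inst)
    return applied

def flaky_setter(inst):
    # Simulated API error on one instrument only.
    if inst == "ETH-PERPETUAL":
        raise RuntimeError("simulated API error")

print(preset_all(["BTC-PERPETUAL", "ETH-PERPETUAL"], flaky_setter))
# → ['BTC-PERPETUAL']
```

`contextlib.suppress` scopes the swallowing to the one call, so a bug elsewhere in the loop still surfaces.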
app = build_app(
name="mcp-deribit",
version="0.1.0",
token_store=token_store,
lifespan=_lifespan,
)
# --- Read tools: core + observer ---
@app.post("/tools/is_testnet", tags=["reads"])
async def t_is_testnet(principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return client.is_testnet()
@app.post("/tools/environment_info", tags=["reads"])
async def t_environment_info(principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
if env_info is None:
return {
"exchange": "deribit",
"environment": "testnet" if client.is_testnet().get("testnet") else "mainnet",
"source": "credentials",
"env_value": None,
"base_url": client.base_url,
"max_leverage": get_max_leverage(creds),
}
return {
"exchange": env_info.exchange,
"environment": env_info.environment,
"source": env_info.source,
"env_value": env_info.env_value,
"base_url": env_info.base_url,
"max_leverage": get_max_leverage(creds),
}
@app.post("/tools/get_ticker", tags=["reads"])
async def t_get_ticker(
body: GetTickerReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_ticker(body.instrument_name)
@app.post("/tools/get_ticker_batch", tags=["reads"])
async def t_get_ticker_batch(
body: GetTickerBatchReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_ticker_batch(body.instrument_names)
@app.post("/tools/get_instruments", tags=["reads"])
async def t_get_instruments(
body: GetInstrumentsReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_instruments(
currency=body.currency,
kind=body.kind,
expiry_from=body.expiry_from,
expiry_to=body.expiry_to,
strike_min=body.strike_min,
strike_max=body.strike_max,
min_open_interest=body.min_open_interest,
limit=body.limit,
offset=body.offset,
)
@app.post("/tools/get_orderbook", tags=["reads"])
async def t_get_orderbook(
body: GetOrderbookReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_orderbook(body.instrument_name, body.depth)
@app.post("/tools/get_orderbook_imbalance", tags=["reads"])
async def t_get_ob_imbalance(
body: OrderbookImbalanceReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_orderbook_imbalance(body.instrument_name, body.depth)
@app.post("/tools/get_positions", tags=["reads"])
async def t_get_positions(
body: GetPositionsReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_positions(body.currency)
@app.post("/tools/get_account_summary", tags=["reads"])
async def t_get_account_summary(
body: GetAccountSummaryReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_account_summary(body.currency)
@app.post("/tools/get_trade_history", tags=["reads"])
async def t_get_trade_history(
body: GetTradeHistoryReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_trade_history(body.limit, body.instrument_name)
@app.post("/tools/get_historical", tags=["reads"])
async def t_get_historical(
body: GetHistoricalReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_historical(
body.instrument, body.start_date, body.end_date, body.resolution
)
@app.post("/tools/get_dvol", tags=["reads"])
async def t_get_dvol(
body: GetDvolReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_dvol(
body.currency, body.start_date, body.end_date, body.resolution
)
@app.post("/tools/get_gex", tags=["reads"])
async def t_get_gex(
body: GetGexReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_gex(
body.currency, body.expiry_from, body.expiry_to, body.top_n_strikes
)
@app.post("/tools/get_dealer_gamma_profile", tags=["reads"])
async def t_get_dealer_gamma_profile(
body: OptionFlowReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_dealer_gamma_profile(
body.currency, body.expiry_from, body.expiry_to, body.top_n_strikes
)
@app.post("/tools/get_vanna_charm", tags=["reads"])
async def t_get_vanna_charm(
body: OptionFlowReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_vanna_charm(
body.currency, body.expiry_from, body.expiry_to, body.top_n_strikes
)
@app.post("/tools/get_oi_weighted_skew", tags=["reads"])
async def t_get_oi_weighted_skew(
body: OptionFlowReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_oi_weighted_skew(
body.currency, body.expiry_from, body.expiry_to, body.top_n_strikes
)
@app.post("/tools/get_smile_asymmetry", tags=["reads"])
async def t_get_smile_asymmetry(
body: OptionFlowReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_smile_asymmetry(
body.currency, body.expiry_from, body.expiry_to, body.top_n_strikes
)
@app.post("/tools/get_atm_vs_wings_vol", tags=["reads"])
async def t_get_atm_vs_wings_vol(
body: OptionFlowReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_atm_vs_wings_vol(
body.currency, body.expiry_from, body.expiry_to, body.top_n_strikes
)
@app.post("/tools/get_pc_ratio", tags=["reads"])
async def t_get_pc_ratio(
body: GetPcRatioReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_pc_ratio(body.currency)
@app.post("/tools/get_skew_25d", tags=["reads"])
async def t_get_skew_25d(
body: GetSkew25dReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_skew_25d(body.currency, body.expiry)
@app.post("/tools/get_term_structure", tags=["reads"])
async def t_get_term_structure(
body: GetTermStructureReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_term_structure(body.currency)
@app.post("/tools/run_backtest", tags=["writes"])
async def t_run_backtest(
body: RunBacktestReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.run_backtest(
strategy_name=body.strategy_name,
underlying=body.underlying,
lookback_days=body.lookback_days,
resolution=body.resolution,
entry_rules=body.entry_rules,
exit_rules=body.exit_rules,
)
@app.post("/tools/calculate_spread_payoff", tags=["writes"])
async def t_calculate_spread_payoff(
body: CalculateSpreadPayoffReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.calculate_spread_payoff(body.legs, body.quote_currency)
@app.post("/tools/find_by_delta", tags=["writes"])
async def t_find_by_delta(
body: FindByDeltaReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.find_by_delta(
currency=body.currency,
expiry=body.expiry,
target_delta=body.target_delta,
option_type=body.option_type,
max_results=body.max_results,
min_open_interest=body.min_open_interest,
min_volume_24h=body.min_volume_24h,
)
@app.post("/tools/get_iv_rank", tags=["reads"])
async def t_get_iv_rank(
body: GetIvRankReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_iv_rank(body.instrument)
@app.post("/tools/get_dvol_history", tags=["reads"])
async def t_get_dvol_history(
body: GetDvolHistoryReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_dvol_history(body.currency, body.lookback_days)
@app.post("/tools/get_realized_vol", tags=["reads"])
async def t_get_realized_vol(
body: GetRealizedVolReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_realized_vol(body.currency, body.windows)
@app.post("/tools/get_technical_indicators", tags=["reads"])
async def t_get_indicators(
body: GetIndicatorsReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_technical_indicators(
body.instrument,
body.indicators,
body.start_date,
body.end_date,
body.resolution,
)
# --- Write tools: core only ---
@app.post("/tools/place_order", tags=["writes"])
async def t_place_order(
body: PlaceOrderReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True)
lev = _enforce_leverage(body.leverage, creds=creds, exchange="deribit")
if lev != cap_default:
with contextlib.suppress(Exception):
await client.set_leverage(body.instrument_name, lev)
result = await client.place_order(
instrument_name=body.instrument_name,
side=body.side,
amount=body.amount,
type=body.type,
price=body.price,
reduce_only=body.reduce_only,
post_only=body.post_only,
label=body.label,
)
audit_write_op(
principal=principal, action="place_order", exchange="deribit",
target=body.instrument_name,
payload={"side": body.side, "amount": body.amount, "type": body.type,
"price": body.price, "leverage": lev, "label": body.label},
result=result,
)
return result
@app.post("/tools/place_combo_order", tags=["writes"])
async def t_place_combo_order(
body: PlaceComboOrderReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True)
lev = _enforce_leverage(body.leverage, creds=creds, exchange="deribit")
if lev != cap_default:
for leg in body.legs:
with contextlib.suppress(Exception):
await client.set_leverage(leg.instrument_name, lev)
result = await client.place_combo_order(
legs=[leg.model_dump() for leg in body.legs],
side=body.side,
amount=body.amount,
type=body.type,
price=body.price,
label=body.label,
)
audit_write_op(
principal=principal, action="place_combo_order", exchange="deribit",
target=result.get("combo_instrument") if isinstance(result, dict) else None,
payload={"legs": [leg.model_dump() for leg in body.legs],
"side": body.side, "amount": body.amount, "leverage": lev},
result=result if isinstance(result, dict) else None,
)
return result
@app.post("/tools/cancel_order", tags=["writes"])
async def t_cancel_order(
body: CancelOrderReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True)
result = await client.cancel_order(body.order_id)
audit_write_op(
principal=principal, action="cancel_order", exchange="deribit",
target=body.order_id, payload={}, result=result,
)
return result
@app.post("/tools/set_stop_loss", tags=["writes"])
async def t_set_sl(
body: SetStopLossReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True)
result = await client.set_stop_loss(body.order_id, body.stop_price)
audit_write_op(
principal=principal, action="set_stop_loss", exchange="deribit",
target=body.order_id, payload={"stop_price": body.stop_price}, result=result,
)
return result
@app.post("/tools/set_take_profit", tags=["writes"])
async def t_set_tp(
body: SetTakeProfitReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True)
result = await client.set_take_profit(body.order_id, body.tp_price)
audit_write_op(
principal=principal, action="set_take_profit", exchange="deribit",
target=body.order_id, payload={"tp_price": body.tp_price}, result=result,
)
return result
@app.post("/tools/close_position", tags=["writes"])
async def t_close_position(
body: ClosePositionReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True)
result = await client.close_position(body.instrument_name)
audit_write_op(
principal=principal, action="close_position", exchange="deribit",
target=body.instrument_name, payload={}, result=result,
)
return result
# ───── MCP endpoint (/mcp) — bridge to /tools/* ─────
port = int(os.environ.get("PORT", "9011"))
mount_mcp_endpoint(
app,
name="cerbero-deribit",
version="0.1.0",
token_store=token_store,
internal_base_url=f"http://localhost:{port}",
tools=[
{"name": "is_testnet", "description": "True se client Deribit è in modalità testnet."},
{"name": "environment_info", "description": "Ambiente operativo (testnet/mainnet), source, base_url, max_leverage cap."},
{"name": "get_ticker", "description": "Ticker di un instrument Deribit."},
{"name": "get_ticker_batch", "description": "Ticker per N instruments in parallelo (max 20)."},
{"name": "get_instruments", "description": "Lista instruments per currency."},
{"name": "get_orderbook", "description": "Orderbook L1/L2 per instrument."},
{"name": "get_orderbook_imbalance", "description": "Microstructure: imbalance ratio + microprice + slope."},
{"name": "get_positions", "description": "Posizioni aperte."},
{"name": "get_account_summary", "description": "Summary account (equity, balance)."},
{"name": "get_trade_history", "description": "Storia trade recenti."},
{"name": "get_historical", "description": "OHLCV storico."},
{"name": "get_dvol", "description": "Deribit Volatility Index (DVOL) OHLC per currency (BTC/ETH)."},
{"name": "get_dvol_history", "description": "DVOL time series + percentili su lookback_days."},
{"name": "get_iv_rank", "description": "IV rank 30/90/365d di un instrument vs DVOL storico della currency."},
{"name": "find_by_delta", "description": "Trova strike con delta più vicino a target, filtrato per liquidità (OI/vol)."},
{"name": "calculate_spread_payoff", "description": "Payoff/greci/max P-L/break-even/fee per struttura multi-leg."},
{"name": "run_backtest", "description": "Heuristic backtest RSI-based su storia OHLCV per threshold accept/marginal/reject."},
{"name": "get_term_structure", "description": "IV ATM per ogni expiry disponibile, detect contango/backwardation."},
{"name": "get_skew_25d", "description": "Skew 25-delta put/call IV + risk reversal + butterfly per expiry."},
{"name": "get_pc_ratio", "description": "Put/Call ratio aggregato su OI e volume 24h."},
{"name": "get_gex", "description": "Gamma exposure per strike + zero gamma level (top N strikes per OI)."},
{"name": "get_dealer_gamma_profile", "description": "Net dealer gamma per strike (short calls/long puts) + gamma flip level."},
{"name": "get_vanna_charm", "description": "Vanna (∂delta/∂IV) e Charm (∂delta/∂t) aggregati pesati OI."},
{"name": "get_oi_weighted_skew", "description": "Skew aggregato pesato per OI: IV puts - IV calls. Positivo = paura."},
{"name": "get_smile_asymmetry", "description": "Asymmetry IV otm-puts vs otm-calls + ATM IV reference."},
{"name": "get_atm_vs_wings_vol", "description": "IV ATM vs IV ali 25-delta. wing_richness > 0 = smile/kurtosis."},
{"name": "get_technical_indicators", "description": "Indicatori tecnici (RSI, MACD, ATR, ADX)."},
{"name": "get_realized_vol", "description": "Volatilità realizzata annualizzata (log-return std) BTC/ETH + spread IVRV."},
{"name": "place_order", "description": "Invia ordine (CORE only, testnet)."},
{"name": "place_combo_order", "description": "Crea combo via private/create_combo + piazza ordine sul combo (1 cross spread invece di N)."},
{"name": "cancel_order", "description": "Cancella ordine."},
{"name": "set_stop_loss", "description": "Setta stop loss su posizione."},
{"name": "set_take_profit", "description": "Setta take profit su posizione."},
{"name": "close_position", "description": "Chiude posizione aperta."},
],
)
return app
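The `_check` gate above is the whole ACL story for these routes: a read tool admits both `core` and `observer`, a write tool admits only `core`. A minimal standalone sketch of that pattern (names here are stand-ins, not the real `mcp_common` types):

```python
# Capability gate sketch: a principal passes if it holds at least one of
# the capabilities the route allows; otherwise the call is rejected.
class Forbidden(Exception):
    pass

def check(capabilities: set[str], *, core: bool = False, observer: bool = False) -> None:
    allowed: set[str] = set()
    if core:
        allowed.add("core")
    if observer:
        allowed.add("observer")
    if not (capabilities & allowed):  # set intersection: any overlap passes
        raise Forbidden(f"capability required: {allowed}")

# Read tool: observer is allowed.
check({"observer"}, core=True, observer=True)

# Write tool: observer is rejected (this is what yields the 403 in the tests).
write_rejected = False
try:
    check({"observer"}, core=True)
except Forbidden:
    write_rejected = True
```

In the real app the rejection is an `HTTPException(403, ...)`, which is why `test_place_order_observer_forbidden` asserts a 403 status code.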
@@ -1,71 +0,0 @@
"""CER-P5-010 env validation tests."""
from __future__ import annotations
import pytest
from mcp_deribit.env_validation import (
MissingEnvError,
fail_fast_if_missing,
optional_env,
require_env,
summarize,
)
def test_require_env_present(monkeypatch):
monkeypatch.setenv("FOO_KEY", "value1")
assert require_env("FOO_KEY") == "value1"
def test_require_env_missing_raises(monkeypatch):
monkeypatch.delenv("MISSING_REQ", raising=False)
with pytest.raises(MissingEnvError):
require_env("MISSING_REQ", "critical path")
def test_require_env_empty_raises(monkeypatch):
monkeypatch.setenv("EMPTY_REQ", "")
with pytest.raises(MissingEnvError):
require_env("EMPTY_REQ")
def test_require_env_whitespace_only_raises(monkeypatch):
monkeypatch.setenv("WS_REQ", " ")
with pytest.raises(MissingEnvError):
require_env("WS_REQ")
def test_optional_env_default(monkeypatch):
monkeypatch.delenv("OPT_A", raising=False)
assert optional_env("OPT_A", default="fallback") == "fallback"
def test_optional_env_set(monkeypatch):
monkeypatch.setenv("OPT_B", "xx")
assert optional_env("OPT_B", default="fallback") == "xx"
def test_fail_fast_all_present(monkeypatch):
monkeypatch.setenv("AA", "1")
monkeypatch.setenv("BB", "2")
fail_fast_if_missing(["AA", "BB"]) # no exit
def test_fail_fast_missing_exits(monkeypatch):
monkeypatch.setenv("HAVE_IT", "1")
monkeypatch.delenv("MISSING_X", raising=False)
with pytest.raises(SystemExit) as exc:
fail_fast_if_missing(["HAVE_IT", "MISSING_X"])
assert exc.value.code == 2
def test_summarize_does_not_leak_secrets(monkeypatch, caplog):
import logging
monkeypatch.setenv("API_KEY_FOO", "super-secret-token-123456")
monkeypatch.setenv("PORT", "9000")
with caplog.at_level(logging.INFO, logger="mcp_deribit.env_validation"):
summarize(["API_KEY_FOO", "PORT", "NOT_SET_XYZ"])
log_text = "\n".join(caplog.messages)
assert "super-secret-token-123456" not in log_text
assert "9000" in log_text
assert "<unset>" in log_text
@@ -1,77 +0,0 @@
from __future__ import annotations
from unittest.mock import AsyncMock
from fastapi.testclient import TestClient
from mcp_common.auth import Principal, TokenStore
from mcp_common.environment import EnvironmentInfo
from mcp_deribit.server import create_app
def _make_app(env_info, creds):
c = AsyncMock()
c.set_leverage = AsyncMock(return_value={"state": "ok"})
store = TokenStore(tokens={
"ct": Principal("core", {"core"}),
"ot": Principal("observer", {"observer"}),
})
return create_app(client=c, token_store=store, creds=creds, env_info=env_info)
def test_environment_info_full_shape():
env = EnvironmentInfo(
exchange="deribit",
environment="testnet",
source="env",
env_value="true",
base_url="https://test.deribit.com/api/v2",
)
app = _make_app(env, creds={"max_leverage": 3})
c = TestClient(app)
r = c.post(
"/tools/environment_info",
headers={"Authorization": "Bearer ot"},
)
assert r.status_code == 200
body = r.json()
assert body["exchange"] == "deribit"
assert body["environment"] == "testnet"
assert body["source"] == "env"
assert body["env_value"] == "true"
assert body["base_url"] == "https://test.deribit.com/api/v2"
assert body["max_leverage"] == 3
def test_environment_info_default_source():
env = EnvironmentInfo(
exchange="deribit",
environment="testnet",
source="default",
env_value=None,
base_url="https://test.deribit.com/api/v2",
)
app = _make_app(env, creds={"max_leverage": 1})
c = TestClient(app)
r = c.post(
"/tools/environment_info",
headers={"Authorization": "Bearer ct"},
)
assert r.status_code == 200
body = r.json()
assert body["source"] == "default"
assert body["env_value"] is None
assert body["max_leverage"] == 1
def test_environment_info_requires_auth():
env = EnvironmentInfo(
exchange="deribit",
environment="testnet",
source="default",
env_value=None,
base_url="https://test.deribit.com/api/v2",
)
app = _make_app(env, creds={"max_leverage": 3})
c = TestClient(app)
r = c.post("/tools/environment_info")
assert r.status_code == 401
@@ -1,269 +0,0 @@
from __future__ import annotations
from unittest.mock import AsyncMock, MagicMock
import pytest
from fastapi.testclient import TestClient
from mcp_common.auth import Principal, TokenStore
from mcp_deribit.server import create_app
@pytest.fixture
def mock_client():
c = MagicMock()
c.get_ticker = AsyncMock(return_value={"mark_price": 50000})
c.get_instruments = AsyncMock(return_value=[])
c.get_orderbook = AsyncMock(return_value={"bids": [], "asks": []})
c.get_positions = AsyncMock(return_value=[])
c.get_account_summary = AsyncMock(return_value={"equity": 1000})
c.get_trade_history = AsyncMock(return_value=[])
c.get_historical = AsyncMock(return_value={"candles": []})
c.get_technical_indicators = AsyncMock(return_value={"rsi": 55.0})
c.place_order = AsyncMock(return_value={"order_id": "x"})
c.place_combo_order = AsyncMock(return_value={"combo_instrument": "BTC-COMBO-1", "order": {"order_id": "x"}})
c.get_dealer_gamma_profile = AsyncMock(return_value={"by_strike": [], "total_net_dealer_gamma": 0})
c.get_vanna_charm = AsyncMock(return_value={"total_vanna": 0, "total_charm": 0, "legs_analyzed": 0})
c.get_oi_weighted_skew = AsyncMock(return_value={"skew": 0, "call_iv_weighted": None, "put_iv_weighted": None})
c.get_smile_asymmetry = AsyncMock(return_value={"atm_iv": 0.5, "asymmetry": 0.0})
c.get_atm_vs_wings_vol = AsyncMock(return_value={"atm_iv": 0.5, "wing_richness": 0.0})
c.get_orderbook_imbalance = AsyncMock(return_value={"imbalance_ratio": 0.0, "microprice": 50000})
c.cancel_order = AsyncMock(return_value={"order_id": "x", "state": "cancelled"})
c.set_stop_loss = AsyncMock(return_value={"order_id": "x", "stop_price": 45000})
c.set_take_profit = AsyncMock(return_value={"order_id": "x", "tp_price": 55000})
c.close_position = AsyncMock(return_value={"closed": True})
c.set_leverage = AsyncMock(return_value={"state": "ok"})
return c
@pytest.fixture
def http(mock_client):
store = TokenStore(tokens={
"ct": Principal("core", {"core"}),
"ot": Principal("observer", {"observer"}),
})
app = create_app(client=mock_client, token_store=store, creds={"max_leverage": 3})
return TestClient(app)
def test_health(http):
assert http.get("/health").status_code == 200
def test_get_ticker_core_ok(http):
r = http.post(
"/tools/get_ticker",
headers={"Authorization": "Bearer ct"},
json={"instrument_name": "BTC-PERPETUAL"},
)
assert r.status_code == 200
assert r.json()["mark_price"] == 50000
def test_get_ticker_observer_ok(http):
r = http.post(
"/tools/get_ticker",
headers={"Authorization": "Bearer ot"},
json={"instrument_name": "BTC-PERPETUAL"},
)
assert r.status_code == 200
def test_get_ticker_no_auth_401(http):
r = http.post("/tools/get_ticker", json={"instrument_name": "BTC-PERPETUAL"})
assert r.status_code == 401
def test_get_ticker_alias_instrument_ok(http, mock_client):
r = http.post(
"/tools/get_ticker",
headers={"Authorization": "Bearer ct"},
json={"instrument": "ETH"},
)
assert r.status_code == 200
mock_client.get_ticker.assert_awaited_with("ETH")
def test_place_order_core_ok(http):
r = http.post(
"/tools/place_order",
headers={"Authorization": "Bearer ct"},
json={"instrument_name": "BTC-PERPETUAL", "side": "buy", "amount": 10},
)
assert r.status_code == 200
def test_place_order_observer_forbidden(http):
r = http.post(
"/tools/place_order",
headers={"Authorization": "Bearer ot"},
json={"instrument_name": "BTC-PERPETUAL", "side": "buy", "amount": 10},
)
assert r.status_code == 403
def test_get_orderbook_imbalance_observer_ok(http):
r = http.post(
"/tools/get_orderbook_imbalance",
headers={"Authorization": "Bearer ot"},
json={"instrument_name": "BTC-PERPETUAL"},
)
assert r.status_code == 200
@pytest.mark.parametrize("path", [
"/tools/get_dealer_gamma_profile",
"/tools/get_vanna_charm",
"/tools/get_oi_weighted_skew",
"/tools/get_smile_asymmetry",
"/tools/get_atm_vs_wings_vol",
])
def test_option_flow_indicators_observer_ok(http, path):
r = http.post(path, headers={"Authorization": "Bearer ot"}, json={"currency": "BTC"})
assert r.status_code == 200, (path, r.text)
@pytest.mark.parametrize("path", [
"/tools/get_dealer_gamma_profile",
"/tools/get_vanna_charm",
"/tools/get_oi_weighted_skew",
"/tools/get_smile_asymmetry",
"/tools/get_atm_vs_wings_vol",
])
def test_option_flow_indicators_no_auth_401(http, path):
r = http.post(path, json={"currency": "BTC"})
assert r.status_code == 401, (path, r.text)
def test_place_combo_order_core_ok(http):
r = http.post(
"/tools/place_combo_order",
headers={"Authorization": "Bearer ct"},
json={
"legs": [
{"instrument_name": "BTC-30APR26-75000-C", "direction": "buy", "ratio": 1},
{"instrument_name": "BTC-30APR26-80000-C", "direction": "sell", "ratio": 1},
],
"side": "buy",
"amount": 1,
"type": "limit",
"price": 0.05,
},
)
assert r.status_code == 200
assert r.json()["combo_instrument"] == "BTC-COMBO-1"
def test_place_combo_order_observer_forbidden(http):
r = http.post(
"/tools/place_combo_order",
headers={"Authorization": "Bearer ot"},
json={
"legs": [
{"instrument_name": "BTC-X", "direction": "buy", "ratio": 1},
{"instrument_name": "BTC-Y", "direction": "sell", "ratio": 1},
],
"side": "buy",
"amount": 1,
},
)
assert r.status_code == 403
def test_place_combo_order_min_legs(http):
r = http.post(
"/tools/place_combo_order",
headers={"Authorization": "Bearer ct"},
json={
"legs": [{"instrument_name": "BTC-X", "direction": "buy", "ratio": 1}],
"side": "buy",
"amount": 1,
},
)
assert r.status_code == 422
def test_place_combo_order_leverage_cap_enforced(http):
r = http.post(
"/tools/place_combo_order",
headers={"Authorization": "Bearer ct"},
json={
"legs": [
{"instrument_name": "BTC-X", "direction": "buy", "ratio": 1},
{"instrument_name": "BTC-Y", "direction": "sell", "ratio": 1},
],
"side": "buy",
"amount": 1,
"leverage": 50,
},
)
assert r.status_code == 403
err = r.json()["error"]
assert err["code"] == "LEVERAGE_CAP_EXCEEDED"
def test_place_order_leverage_cap_enforced(http):
"""Reject leverage > max_leverage (da secret, default 3)."""
r = http.post(
"/tools/place_order",
headers={"Authorization": "Bearer ct"},
json={
"instrument_name": "BTC-PERPETUAL",
"side": "buy",
"amount": 50,
"leverage": 50,
},
)
assert r.status_code == 403
body = r.json()
err = body["error"]
assert err["code"] == "LEVERAGE_CAP_EXCEEDED"
details = err["details"]
assert details["exchange"] == "deribit"
assert details["requested"] == 50
assert details["max"] == 3
def test_close_position_core_ok(http):
r = http.post(
"/tools/close_position",
headers={"Authorization": "Bearer ct"},
json={"instrument_name": "BTC-PERPETUAL"},
)
assert r.status_code == 200
def test_close_position_observer_forbidden(http):
r = http.post(
"/tools/close_position",
headers={"Authorization": "Bearer ot"},
json={"instrument_name": "BTC-PERPETUAL"},
)
assert r.status_code == 403
def test_cancel_order_observer_forbidden(http):
r = http.post(
"/tools/cancel_order",
headers={"Authorization": "Bearer ot"},
json={"order_id": "abc123"},
)
assert r.status_code == 403
def test_set_stop_loss_observer_forbidden(http):
r = http.post(
"/tools/set_stop_loss",
headers={"Authorization": "Bearer ot"},
json={"order_id": "abc123", "stop_price": 45000.0},
)
assert r.status_code == 403
def test_get_account_summary_observer_ok(http):
r = http.post(
"/tools/get_account_summary",
headers={"Authorization": "Bearer ot"},
json={"currency": "USDC"},
)
assert r.status_code == 200
assert r.json()["equity"] == 1000
@@ -1,29 +0,0 @@
[project]
name = "mcp-hyperliquid"
version = "0.1.0"
requires-python = ">=3.11"
dependencies = [
"mcp-common",
"fastapi>=0.115",
"uvicorn[standard]>=0.30",
"httpx>=0.27",
"pydantic>=2.6",
"hyperliquid-python-sdk>=0.3",
"eth-account>=0.11",
]
[project.optional-dependencies]
dev = ["pytest>=8", "pytest-asyncio>=0.23", "pytest-httpx>=0.30"]
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
[tool.hatch.build.targets.wheel]
packages = ["src/mcp_hyperliquid"]
[tool.uv.sources]
mcp-common = { workspace = true }
[project.scripts]
mcp-hyperliquid = "mcp_hyperliquid.__main__:main"
@@ -1,31 +0,0 @@
from __future__ import annotations
from mcp_common.app_factory import ExchangeAppSpec, run_exchange_main
from mcp_hyperliquid.client import HyperliquidClient
from mcp_hyperliquid.server import create_app
SPEC = ExchangeAppSpec(
exchange="hyperliquid",
creds_env_var="HYPERLIQUID_WALLET_FILE",
env_var="HYPERLIQUID_TESTNET",
flag_key="testnet",
default_base_url_live="https://api.hyperliquid.xyz",
default_base_url_testnet="https://api.hyperliquid-testnet.xyz",
default_port=9012,
build_client=lambda creds, env_info: HyperliquidClient(
wallet_address=creds["wallet_address"],
private_key=creds["private_key"],
testnet=(env_info.environment == "testnet"),
api_wallet_address=creds.get("api_wallet_address"),
),
build_app=create_app,
)
def main():
run_exchange_main(SPEC)
if __name__ == "__main__":
main()
@@ -1,408 +0,0 @@
from __future__ import annotations
import os
from fastapi import Depends, FastAPI, HTTPException
from mcp_common.audit import audit_write_op
from mcp_common.auth import Principal, TokenStore, require_principal
from mcp_common.environment import EnvironmentInfo
from mcp_common.mcp_bridge import mount_mcp_endpoint
from mcp_common.server import build_app
from pydantic import BaseModel, field_validator, model_validator
from mcp_hyperliquid.client import HyperliquidClient
from mcp_hyperliquid.leverage_cap import enforce_leverage as _enforce_leverage
from mcp_hyperliquid.leverage_cap import get_max_leverage
# --- Body models ---
class GetMarketsReq(BaseModel):
pass
class GetTickerReq(BaseModel):
instrument: str
class GetOrderbookReq(BaseModel):
instrument: str
depth: int = 10
class GetPositionsReq(BaseModel):
pass
class GetAccountSummaryReq(BaseModel):
pass
class GetTradeHistoryReq(BaseModel):
limit: int = 100
class GetHistoricalReq(BaseModel):
instrument: str | None = None
asset: str | None = None
start_date: str | None = None
end_date: str | None = None
resolution: str = "1h"
interval: str | None = None
limit: int = 50
model_config = {"extra": "allow"}
@model_validator(mode="after")
def _normalize(self):
from datetime import UTC, datetime, timedelta
sym = self.instrument or self.asset
if not sym:
raise ValueError("instrument (or asset) is required")
self.instrument = sym
if self.interval:
self.resolution = self.interval
if not self.end_date:
self.end_date = datetime.now(UTC).strftime("%Y-%m-%dT%H:%M:%S")
if not self.start_date:
days = max(1, self.limit // 6)
self.start_date = (
datetime.now(UTC) - timedelta(days=days)
).strftime("%Y-%m-%dT%H:%M:%S")
return self
class GetOpenOrdersReq(BaseModel):
pass
class GetFundingRateReq(BaseModel):
instrument: str
class BasisSpotPerpReq(BaseModel):
asset: str
class GetIndicatorsReq(BaseModel):
instrument: str | None = None
asset: str | None = None
indicators: list[str] = ["rsi", "atr", "macd", "adx"]
start_date: str | None = None
end_date: str | None = None
resolution: str = "1h"
interval: str | None = None
limit: int = 50
model_config = {"extra": "allow"}
@model_validator(mode="after")
def _normalize(self):
from datetime import UTC, datetime, timedelta
sym = self.instrument or self.asset
if not sym:
raise ValueError("instrument (or asset) is required")
self.instrument = sym
if self.interval:
self.resolution = self.interval
if not self.end_date:
self.end_date = datetime.now(UTC).strftime("%Y-%m-%dT%H:%M:%S")
if not self.start_date:
days = max(2, self.limit // 6)
self.start_date = (
datetime.now(UTC) - timedelta(days=days)
).strftime("%Y-%m-%dT%H:%M:%S")
return self
@field_validator("indicators", mode="before")
@classmethod
def _coerce_indicators(cls, v):
if isinstance(v, str):
import json
s = v.strip()
if s.startswith("["):
try:
parsed = json.loads(s)
if isinstance(parsed, list):
return [str(x).strip() for x in parsed if str(x).strip()]
except json.JSONDecodeError:
pass
return [x.strip() for x in s.split(",") if x.strip()]
if isinstance(v, list):
return v
raise ValueError(
"indicators must be a list like ['rsi','atr','macd'] "
"or a comma-separated string like 'rsi,atr,macd'"
)
class PlaceOrderReq(BaseModel):
instrument: str
side: str # "buy" | "sell"
amount: float
type: str = "limit"
price: float | None = None
reduce_only: bool = False
leverage: int | None = None # CER-016: None → default cap (3x)
class CancelOrderReq(BaseModel):
order_id: str
instrument: str
class SetStopLossReq(BaseModel):
instrument: str
stop_price: float
size: float
class SetTakeProfitReq(BaseModel):
instrument: str
tp_price: float
size: float
class ClosePositionReq(BaseModel):
instrument: str
# --- ACL helper ---
def _check(principal: Principal, *, core: bool = False, observer: bool = False) -> None:
allowed: set[str] = set()
if core:
allowed.add("core")
if observer:
allowed.add("observer")
if not (principal.capabilities & allowed):
raise HTTPException(403, f"capability required: {allowed}")
# --- App factory ---
def create_app(
*,
client: HyperliquidClient,
token_store: TokenStore,
creds: dict | None = None,
env_info: EnvironmentInfo | None = None,
) -> FastAPI:
creds = creds or {}
app = build_app(name="mcp-hyperliquid", version="0.1.0", token_store=token_store)
# --- Read tools: core + observer ---
@app.post("/tools/environment_info", tags=["reads"])
async def t_environment_info(principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
if env_info is None:
return {
"exchange": "hyperliquid",
"environment": "testnet" if getattr(client, "testnet", True) else "mainnet",
"source": "credentials",
"env_value": None,
"base_url": getattr(client, "base_url", None),
"max_leverage": get_max_leverage(creds),
}
return {
"exchange": env_info.exchange,
"environment": env_info.environment,
"source": env_info.source,
"env_value": env_info.env_value,
"base_url": env_info.base_url,
"max_leverage": get_max_leverage(creds),
}
@app.post("/tools/get_markets", tags=["reads"])
async def t_get_markets(
body: GetMarketsReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_markets()
@app.post("/tools/get_ticker", tags=["reads"])
async def t_get_ticker(
body: GetTickerReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_ticker(body.instrument)
@app.post("/tools/get_orderbook", tags=["reads"])
async def t_get_orderbook(
body: GetOrderbookReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_orderbook(body.instrument, body.depth)
@app.post("/tools/get_positions", tags=["reads"])
async def t_get_positions(
body: GetPositionsReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_positions()
@app.post("/tools/get_account_summary", tags=["reads"])
async def t_get_account_summary(
body: GetAccountSummaryReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_account_summary()
@app.post("/tools/get_trade_history", tags=["reads"])
async def t_get_trade_history(
body: GetTradeHistoryReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_trade_history(body.limit)
@app.post("/tools/get_historical", tags=["reads"])
async def t_get_historical(
body: GetHistoricalReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_historical(
body.instrument, body.start_date, body.end_date, body.resolution
)
@app.post("/tools/get_open_orders", tags=["reads"])
async def t_get_open_orders(
body: GetOpenOrdersReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_open_orders()
@app.post("/tools/get_funding_rate", tags=["reads"])
async def t_get_funding_rate(
body: GetFundingRateReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_funding_rate(body.instrument)
@app.post("/tools/basis_spot_perp", tags=["writes"])
async def t_basis_spot_perp(
body: BasisSpotPerpReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.basis_spot_perp(body.asset)
@app.post("/tools/get_indicators", tags=["reads"])
async def t_get_indicators(
body: GetIndicatorsReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_indicators(
body.instrument,
body.indicators,
body.start_date,
body.end_date,
body.resolution,
)
# --- Write tools: core only ---
@app.post("/tools/place_order", tags=["writes"])
async def t_place_order(
body: PlaceOrderReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True)
_enforce_leverage(body.leverage, creds=creds, exchange="hyperliquid")
result = await client.place_order(
instrument=body.instrument,
side=body.side,
amount=body.amount,
type=body.type,
price=body.price,
reduce_only=body.reduce_only,
)
audit_write_op(
principal=principal, action="place_order", exchange="hyperliquid",
target=body.instrument,
payload={"side": body.side, "amount": body.amount, "type": body.type,
"price": body.price, "reduce_only": body.reduce_only,
"leverage": body.leverage},
result=result,
)
return result
@app.post("/tools/cancel_order", tags=["writes"])
async def t_cancel_order(
body: CancelOrderReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True)
result = await client.cancel_order(body.order_id, body.instrument)
audit_write_op(
principal=principal, action="cancel_order", exchange="hyperliquid",
target=body.order_id, payload={"instrument": body.instrument}, result=result,
)
return result
@app.post("/tools/set_stop_loss", tags=["writes"])
async def t_set_sl(
body: SetStopLossReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True)
result = await client.set_stop_loss(body.instrument, body.stop_price, body.size)
audit_write_op(
principal=principal, action="set_stop_loss", exchange="hyperliquid",
target=body.instrument,
payload={"stop_price": body.stop_price, "size": body.size},
result=result,
)
return result
@app.post("/tools/set_take_profit", tags=["writes"])
async def t_set_tp(
body: SetTakeProfitReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True)
result = await client.set_take_profit(body.instrument, body.tp_price, body.size)
audit_write_op(
principal=principal, action="set_take_profit", exchange="hyperliquid",
target=body.instrument,
payload={"tp_price": body.tp_price, "size": body.size},
result=result,
)
return result
@app.post("/tools/close_position", tags=["writes"])
async def t_close_position(
body: ClosePositionReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True)
result = await client.close_position(body.instrument)
audit_write_op(
principal=principal, action="close_position", exchange="hyperliquid",
target=body.instrument, payload={}, result=result,
)
return result
# ───── MCP endpoint (/mcp) — bridge to /tools/* ─────
port = int(os.environ.get("PORT", "9012"))
mount_mcp_endpoint(
app,
name="cerbero-hyperliquid",
version="0.1.0",
token_store=token_store,
internal_base_url=f"http://localhost:{port}",
tools=[
{"name": "environment_info", "description": "Ambiente operativo (testnet/mainnet), source, base_url, max_leverage cap."},
{"name": "get_markets", "description": "Lista mercati perp disponibili."},
{"name": "get_ticker", "description": "Ticker di un perp."},
{"name": "get_orderbook", "description": "Orderbook L2."},
{"name": "get_positions", "description": "Posizioni aperte."},
{"name": "get_account_summary", "description": "Account summary (spot + perp equity)."},
{"name": "get_trade_history", "description": "Storia trade."},
{"name": "get_historical", "description": "OHLCV storico."},
{"name": "get_open_orders", "description": "Ordini aperti."},
{"name": "get_funding_rate", "description": "Funding rate corrente per simbolo."},
{"name": "basis_spot_perp", "description": "Basis spot-perp annualizzato + carry opportunity detection."},
{"name": "get_indicators", "description": "Indicatori tecnici."},
{"name": "place_order", "description": "Invia ordine (CORE only)."},
{"name": "cancel_order", "description": "Cancella ordine."},
{"name": "set_stop_loss", "description": "Stop loss su posizione."},
{"name": "set_take_profit", "description": "Take profit su posizione."},
{"name": "close_position", "description": "Chiude posizione."},
],
)
return app
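The `_coerce_indicators` validator in `GetIndicatorsReq` above accepts three input shapes for `indicators`: a real list, a JSON-encoded list string, or a comma-separated string. A simplified, standalone version of that coercion (illustrative only, not the actual pydantic validator):

```python
import json

def coerce_indicators(v):
    if isinstance(v, str):
        s = v.strip()
        # A string starting with "[" is tried as a JSON list first.
        if s.startswith("["):
            try:
                parsed = json.loads(s)
                if isinstance(parsed, list):
                    return [str(x).strip() for x in parsed if str(x).strip()]
            except json.JSONDecodeError:
                pass  # fall through to CSV parsing
        # Otherwise (or on JSON failure): comma-separated, empty items dropped.
        return [x.strip() for x in s.split(",") if x.strip()]
    if isinstance(v, list):
        return v
    raise ValueError(
        "indicators must be a list like ['rsi','atr'] "
        "or a comma-separated string like 'rsi,atr'"
    )

assert coerce_indicators('["rsi","atr"]') == ["rsi", "atr"]
assert coerce_indicators("rsi, atr,macd") == ["rsi", "atr", "macd"]
```

This tolerance matters because MCP clients often serialize list arguments as strings; both spellings reach `client.get_indicators` as a clean `list[str]`.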
@@ -1,50 +0,0 @@
from __future__ import annotations
from unittest.mock import MagicMock
from fastapi.testclient import TestClient
from mcp_common.auth import Principal, TokenStore
from mcp_common.environment import EnvironmentInfo
from mcp_hyperliquid.server import create_app
def _make_app(env_info, creds):
c = MagicMock()
c.testnet = True
store = TokenStore(tokens={
"ct": Principal("core", {"core"}),
"ot": Principal("observer", {"observer"}),
})
return create_app(client=c, token_store=store, creds=creds, env_info=env_info)
def test_environment_info_full_shape():
env = EnvironmentInfo(
exchange="hyperliquid",
environment="testnet",
source="env",
env_value="true",
base_url="https://api.hyperliquid-testnet.xyz",
)
app = _make_app(env, creds={"max_leverage": 3})
c = TestClient(app)
r = c.post("/tools/environment_info", headers={"Authorization": "Bearer ot"})
assert r.status_code == 200
body = r.json()
assert body["exchange"] == "hyperliquid"
assert body["environment"] == "testnet"
assert body["source"] == "env"
assert body["env_value"] == "true"
assert body["base_url"] == "https://api.hyperliquid-testnet.xyz"
assert body["max_leverage"] == 3
def test_environment_info_requires_auth():
env = EnvironmentInfo(
exchange="hyperliquid", environment="testnet", source="default",
env_value=None, base_url="https://api.hyperliquid-testnet.xyz",
)
app = _make_app(env, creds={"max_leverage": 3})
c = TestClient(app)
r = c.post("/tools/environment_info")
assert r.status_code == 401
@@ -1,211 +0,0 @@
from __future__ import annotations
from unittest.mock import AsyncMock, MagicMock
import pytest
from fastapi.testclient import TestClient
from mcp_common.auth import Principal, TokenStore
from mcp_hyperliquid.server import create_app
@pytest.fixture
def mock_client():
c = MagicMock()
c.get_markets = AsyncMock(return_value=[{"asset": "BTC", "mark_price": 50000}])
c.get_ticker = AsyncMock(return_value={"asset": "BTC", "mark_price": 50000})
c.get_orderbook = AsyncMock(return_value={"bids": [], "asks": []})
c.get_positions = AsyncMock(return_value=[])
c.get_account_summary = AsyncMock(return_value={"equity": 1500, "perps_equity": 1000})
c.get_trade_history = AsyncMock(return_value=[])
c.get_historical = AsyncMock(return_value={"candles": []})
c.get_open_orders = AsyncMock(return_value=[])
c.get_funding_rate = AsyncMock(return_value={"asset": "BTC", "current_funding_rate": 0.0001})
c.get_indicators = AsyncMock(return_value={"rsi": 55.0})
c.place_order = AsyncMock(return_value={"order_id": "x", "status": "ok"})
c.cancel_order = AsyncMock(return_value={"order_id": "x", "status": "ok"})
c.set_stop_loss = AsyncMock(return_value={"order_id": "x", "status": "ok"})
c.set_take_profit = AsyncMock(return_value={"order_id": "x", "status": "ok"})
c.close_position = AsyncMock(return_value={"status": "ok", "asset": "BTC"})
return c
@pytest.fixture
def http(mock_client):
store = TokenStore(
tokens={
"ct": Principal("core", {"core"}),
"ot": Principal("observer", {"observer"}),
}
)
app = create_app(client=mock_client, token_store=store, creds={"max_leverage": 3})
return TestClient(app)
# --- Health ---
def test_health(http):
assert http.get("/health").status_code == 200
# --- Read tools: both core and observer allowed ---
def test_get_markets_core_ok(http):
r = http.post("/tools/get_markets", headers={"Authorization": "Bearer ct"}, json={})
assert r.status_code == 200
def test_get_markets_observer_ok(http):
r = http.post("/tools/get_markets", headers={"Authorization": "Bearer ot"}, json={})
assert r.status_code == 200
def test_get_ticker_core_ok(http):
r = http.post(
"/tools/get_ticker",
headers={"Authorization": "Bearer ct"},
json={"instrument": "BTC"},
)
assert r.status_code == 200
assert r.json()["mark_price"] == 50000
def test_get_ticker_observer_ok(http):
r = http.post(
"/tools/get_ticker",
headers={"Authorization": "Bearer ot"},
json={"instrument": "BTC"},
)
assert r.status_code == 200
def test_get_ticker_no_auth_401(http):
r = http.post("/tools/get_ticker", json={"instrument": "BTC"})
assert r.status_code == 401
def test_get_account_summary_observer_ok(http):
r = http.post(
"/tools/get_account_summary",
headers={"Authorization": "Bearer ot"},
json={},
)
assert r.status_code == 200
assert r.json()["equity"] == 1500
def test_get_funding_rate_observer_ok(http):
r = http.post(
"/tools/get_funding_rate",
headers={"Authorization": "Bearer ot"},
json={"instrument": "BTC"},
)
assert r.status_code == 200
def test_get_positions_no_auth_401(http):
r = http.post("/tools/get_positions", json={})
assert r.status_code == 401
# --- Write tools: core only ---
def test_place_order_core_ok(http):
# CER-016: amount * price = 150 < cap 200
r = http.post(
"/tools/place_order",
headers={"Authorization": "Bearer ct"},
json={"instrument": "BTC", "side": "buy", "amount": 0.003, "price": 50000},
)
assert r.status_code == 200
def test_place_order_observer_forbidden(http):
r = http.post(
"/tools/place_order",
headers={"Authorization": "Bearer ot"},
json={"instrument": "BTC", "side": "buy", "amount": 0.001, "price": 50000},
)
assert r.status_code == 403
def test_place_order_leverage_cap_enforced_hl(http):
"""Reject leverage > max_leverage (da secret, default 3)."""
r = http.post(
"/tools/place_order",
headers={"Authorization": "Bearer ct"},
json={
"instrument": "BTC",
"side": "buy",
"amount": 0.001,
"price": 50000,
"leverage": 10,
},
)
assert r.status_code == 403
body = r.json()
err = body["error"]
assert err["code"] == "LEVERAGE_CAP_EXCEEDED"
assert err["details"]["exchange"] == "hyperliquid"
def test_cancel_order_core_ok(http):
r = http.post(
"/tools/cancel_order",
headers={"Authorization": "Bearer ct"},
json={"order_id": "123", "instrument": "BTC"},
)
assert r.status_code == 200
def test_cancel_order_observer_forbidden(http):
r = http.post(
"/tools/cancel_order",
headers={"Authorization": "Bearer ot"},
json={"order_id": "123", "instrument": "BTC"},
)
assert r.status_code == 403
def test_set_stop_loss_core_ok(http):
r = http.post(
"/tools/set_stop_loss",
headers={"Authorization": "Bearer ct"},
json={"instrument": "BTC", "stop_price": 45000.0, "size": 0.1},
)
assert r.status_code == 200
def test_set_stop_loss_observer_forbidden(http):
r = http.post(
"/tools/set_stop_loss",
headers={"Authorization": "Bearer ot"},
json={"instrument": "BTC", "stop_price": 45000.0, "size": 0.1},
)
assert r.status_code == 403
def test_set_take_profit_observer_forbidden(http):
r = http.post(
"/tools/set_take_profit",
headers={"Authorization": "Bearer ot"},
json={"instrument": "BTC", "tp_price": 55000.0, "size": 0.1},
)
assert r.status_code == 403
def test_close_position_core_ok(http):
r = http.post(
"/tools/close_position",
headers={"Authorization": "Bearer ct"},
json={"instrument": "BTC"},
)
assert r.status_code == 200
def test_close_position_observer_forbidden(http):
r = http.post(
"/tools/close_position",
headers={"Authorization": "Bearer ot"},
json={"instrument": "BTC"},
)
assert r.status_code == 403
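The CER-016 notional cap referenced in the `place_order` tests above (`amount * price = 150 < cap 200`) can be sketched in isolation. The cap value of 200 is taken from the test comment; the real server-side enforcement is not shown in this diff, so its exact form is an assumption:

```python
# Hypothetical sketch of the CER-016 notional cap from the test comment above:
# an order passes only if amount * price stays below the configured cap.
def notional_ok(amount: float, price: float, cap: float = 200.0) -> bool:
    return amount * price < cap

assert notional_ok(0.003, 50000)       # 0.003 * 50000 = 150 < 200 -> accepted
assert not notional_ok(0.005, 50000)   # 0.005 * 50000 = 250 >= 200 -> rejected
```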
@@ -1,27 +0,0 @@
[project]
name = "mcp-macro"
version = "0.1.0"
requires-python = ">=3.11"
dependencies = [
"mcp-common",
"fastapi>=0.115",
"uvicorn[standard]>=0.30",
"httpx>=0.27",
"pydantic>=2.6",
]
[project.optional-dependencies]
dev = ["pytest>=8", "pytest-asyncio>=0.23", "pytest-httpx>=0.30"]
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
[tool.hatch.build.targets.wheel]
packages = ["src/mcp_macro"]
[tool.uv.sources]
mcp-common = { workspace = true }
[project.scripts]
mcp-macro = "mcp_macro.__main__:main"
@@ -1,37 +0,0 @@
from __future__ import annotations
import json
import os
import uvicorn
from mcp_common.auth import load_token_store_from_files
from mcp_common.logging import configure_root_logging
from mcp_macro.server import create_app
configure_root_logging() # CER-P5-009
def main():
creds_file = os.environ["MACRO_CREDENTIALS_FILE"]
with open(creds_file) as f:
creds = json.load(f)
token_store = load_token_store_from_files(
core_token_file=os.environ.get("CORE_TOKEN_FILE"),
observer_token_file=os.environ.get("OBSERVER_TOKEN_FILE"),
)
app = create_app(
fred_api_key=creds.get("fred_api_key", ""),
finnhub_api_key=creds.get("finnhub_api_key", ""),
token_store=token_store,
)
uvicorn.run(
app,
log_config=None, # CER-P5-009: delegate to the root JSON logger
host=os.environ.get("HOST", "0.0.0.0"),
port=int(os.environ.get("PORT", "9013")),
)
if __name__ == "__main__":
main()
@@ -1,203 +0,0 @@
from __future__ import annotations
import os
from fastapi import Depends, FastAPI, HTTPException
from mcp_common.auth import Principal, TokenStore, require_principal
from mcp_common.mcp_bridge import mount_mcp_endpoint
from mcp_common.server import build_app
from pydantic import BaseModel, Field
from mcp_macro.fetchers import (
fetch_asset_price,
fetch_breakeven_inflation,
fetch_cot_disaggregated,
fetch_cot_extreme_positioning,
fetch_cot_tff,
fetch_economic_indicators,
fetch_equity_futures,
fetch_macro_calendar,
fetch_market_overview,
fetch_treasury_yields,
fetch_yield_curve_slope,
)
# --- Body models ---
class GetEconomicIndicatorsReq(BaseModel):
indicators: list[str] | None = None
class GetMacroCalendarReq(BaseModel):
days: int = 7
country_filter: list[str] | None = None
importance_min: str | None = None
start: str | None = None
end: str | None = None
class GetMarketOverviewReq(BaseModel):
pass
class GetAssetPriceReq(BaseModel):
ticker: str
class GetTreasuryYieldsReq(BaseModel):
pass
class GetEquityFuturesReq(BaseModel):
pass
class GetYieldCurveSlopeReq(BaseModel):
pass
class GetBreakevenInflationReq(BaseModel):
pass
class GetCotTffReq(BaseModel):
symbol: str
lookback_weeks: int = Field(default=52, ge=4, le=520)
class GetCotDisaggregatedReq(BaseModel):
symbol: str
lookback_weeks: int = Field(default=52, ge=4, le=520)
class GetCotExtremeReq(BaseModel):
lookback_weeks: int = Field(default=156, ge=4, le=520)
# --- ACL helper ---
def _check(principal: Principal, *, core: bool = False, observer: bool = False) -> None:
allowed: set[str] = set()
if core:
allowed.add("core")
if observer:
allowed.add("observer")
if not (principal.capabilities & allowed):
raise HTTPException(403, f"capability required: {allowed}")
# --- App factory ---
def create_app(*, fred_api_key: str = "", finnhub_api_key: str = "", token_store: TokenStore) -> FastAPI:
app = build_app(name="mcp-macro", version="0.1.0", token_store=token_store)
@app.post("/tools/get_economic_indicators", tags=["reads"])
async def t_get_economic_indicators(
body: GetEconomicIndicatorsReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_economic_indicators(
fred_api_key=fred_api_key, indicators=body.indicators
)
@app.post("/tools/get_macro_calendar", tags=["reads"])
async def t_get_macro_calendar(
body: GetMacroCalendarReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_macro_calendar(
finnhub_api_key=finnhub_api_key,
days_ahead=body.days,
country_filter=body.country_filter,
importance_min=body.importance_min,
start=body.start,
end=body.end,
)
@app.post("/tools/get_market_overview", tags=["reads"])
async def t_get_market_overview(
body: GetMarketOverviewReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_market_overview()
@app.post("/tools/get_asset_price", tags=["reads"])
async def t_get_asset_price(
body: GetAssetPriceReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_asset_price(body.ticker)
@app.post("/tools/get_treasury_yields", tags=["reads"])
async def t_get_treasury_yields(
body: GetTreasuryYieldsReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_treasury_yields()
@app.post("/tools/get_equity_futures", tags=["reads"])
async def t_get_equity_futures(
body: GetEquityFuturesReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_equity_futures()
@app.post("/tools/get_yield_curve_slope", tags=["reads"])
async def t_get_yield_curve_slope(
body: GetYieldCurveSlopeReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_yield_curve_slope()
@app.post("/tools/get_breakeven_inflation", tags=["reads"])
async def t_get_breakeven_inflation(
body: GetBreakevenInflationReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_breakeven_inflation(fred_api_key=fred_api_key)
@app.post("/tools/get_cot_tff", tags=["reads"])
async def t_get_cot_tff(
body: GetCotTffReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_cot_tff(body.symbol, body.lookback_weeks)
@app.post("/tools/get_cot_disaggregated", tags=["reads"])
async def t_get_cot_disaggregated(
body: GetCotDisaggregatedReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_cot_disaggregated(body.symbol, body.lookback_weeks)
@app.post("/tools/get_cot_extreme_positioning", tags=["reads"])
async def t_get_cot_extreme(
body: GetCotExtremeReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_cot_extreme_positioning(body.lookback_weeks)
# ───── MCP endpoint (/mcp) — bridge to /tools/* ─────
port = int(os.environ.get("PORT", "9013"))
mount_mcp_endpoint(
app,
name="cerbero-macro",
version="0.1.0",
token_store=token_store,
internal_base_url=f"http://localhost:{port}",
tools=[
{"name": "get_economic_indicators", "description": "FRED economic indicators (Fed rate, CPI, ecc)."},
{"name": "get_macro_calendar", "description": "Eventi macro con filtri country/importance/date range."},
{"name": "get_market_overview", "description": "Snapshot overview mercato macro."},
{"name": "get_asset_price", "description": "Prezzo cross-asset: WTI, DXY, SPX, VIX, yields, FX, ecc."},
{"name": "get_treasury_yields", "description": "Curva US Treasury 2y/5y/10y/30y + shape detection."},
{"name": "get_equity_futures", "description": "Futures ES/NQ/YM/RTY con session status."},
{"name": "get_yield_curve_slope", "description": "Slope 2y10y/5y30y + butterfly + regime (steep/normal/flat/inverted)."},
{"name": "get_breakeven_inflation", "description": "Breakeven inflation 5Y/10Y + 5y5y forward (FRED T5YIE/T10YIE/T5YIFR)."},
{"name": "get_cot_tff", "description": "COT TFF report (CFTC) per equity/financial: ES/NQ/RTY/ZN/ZB/6E/6J/DX. Roles: dealer, asset manager, leveraged funds, other."},
{"name": "get_cot_disaggregated", "description": "COT Disaggregated report (CFTC) per commodities: CL/GC/SI/HG/ZW/ZC/ZS. Roles: producer/merchant, swap dealer, managed money, other."},
{"name": "get_cot_extreme_positioning", "description": "Scanner posizionamento estremo (percentile ≤5 o ≥95) sui simboli watchlist."},
],
)
return app
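Every tool route above funnels through the small `_check` capability gate. A minimal standalone sketch of that gate, with `Principal` re-declared locally as a stand-in for `mcp_common.auth.Principal` (whose exact shape is an assumption here):

```python
# Hypothetical standalone sketch of the _check capability gate used by the routes above.
# Principal is a local stand-in for mcp_common.auth.Principal (assumed shape).
from dataclasses import dataclass

@dataclass
class Principal:
    name: str
    capabilities: set[str]

class Forbidden(Exception):
    pass

def check(principal: Principal, *, core: bool = False, observer: bool = False) -> None:
    allowed: set[str] = set()
    if core:
        allowed.add("core")
    if observer:
        allowed.add("observer")
    # Access is granted if the principal holds at least one allowed capability.
    if not (principal.capabilities & allowed):
        raise Forbidden(f"capability required: {allowed}")

obs = Principal("observer", {"observer"})
check(obs, core=True, observer=True)  # read tool: both roles allowed, passes
try:
    check(obs, core=True)             # write tool: core only
    blocked = False
except Forbidden:
    blocked = True
```

The set intersection means a principal with multiple capabilities passes as soon as any one of them is allowed, which is why the read tools enable both `core=True` and `observer=True`.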
@@ -1,202 +0,0 @@
from __future__ import annotations
from unittest.mock import AsyncMock, patch
import pytest
from fastapi.testclient import TestClient
from mcp_common.auth import Principal, TokenStore
from mcp_macro.server import create_app
@pytest.fixture
def http():
store = TokenStore(
tokens={
"ct": Principal("core", {"core"}),
"ot": Principal("observer", {"observer"}),
}
)
app = create_app(fred_api_key="testfred", finnhub_api_key="testfinn", token_store=store)
return TestClient(app)
# --- Health ---
def test_health(http):
assert http.get("/health").status_code == 200
# --- get_economic_indicators ---
def test_get_economic_indicators_core_ok(http):
with patch(
"mcp_macro.server.fetch_economic_indicators",
new=AsyncMock(return_value={"fed_rate": 5.25, "updated_at": "2024-01-01T00:00:00+00:00"}),
):
r = http.post(
"/tools/get_economic_indicators",
headers={"Authorization": "Bearer ct"},
json={},
)
assert r.status_code == 200
assert r.json()["fed_rate"] == 5.25
def test_get_economic_indicators_observer_ok(http):
with patch(
"mcp_macro.server.fetch_economic_indicators",
new=AsyncMock(return_value={"fed_rate": 5.25}),
):
r = http.post(
"/tools/get_economic_indicators",
headers={"Authorization": "Bearer ot"},
json={},
)
assert r.status_code == 200
def test_get_economic_indicators_no_auth_401(http):
r = http.post("/tools/get_economic_indicators", json={})
assert r.status_code == 401
# --- get_macro_calendar ---
def test_get_macro_calendar_core_ok(http):
with patch(
"mcp_macro.server.fetch_macro_calendar",
new=AsyncMock(return_value={"events": []}),
):
r = http.post(
"/tools/get_macro_calendar",
headers={"Authorization": "Bearer ct"},
json={"days": 7},
)
assert r.status_code == 200
def test_get_macro_calendar_observer_ok(http):
with patch(
"mcp_macro.server.fetch_macro_calendar",
new=AsyncMock(return_value={"events": []}),
):
r = http.post(
"/tools/get_macro_calendar",
headers={"Authorization": "Bearer ot"},
json={},
)
assert r.status_code == 200
def test_get_macro_calendar_no_auth_401(http):
r = http.post("/tools/get_macro_calendar", json={})
assert r.status_code == 401
# --- get_market_overview ---
def test_get_market_overview_core_ok(http):
with patch(
"mcp_macro.server.fetch_market_overview",
new=AsyncMock(return_value={"btc_dominance": 52.0, "btc_price": 65000}),
):
r = http.post(
"/tools/get_market_overview",
headers={"Authorization": "Bearer ct"},
json={},
)
assert r.status_code == 200
assert r.json()["btc_price"] == 65000
def test_get_market_overview_observer_ok(http):
with patch(
"mcp_macro.server.fetch_market_overview",
new=AsyncMock(return_value={"btc_dominance": 52.0}),
):
r = http.post(
"/tools/get_market_overview",
headers={"Authorization": "Bearer ot"},
json={},
)
assert r.status_code == 200
def test_get_market_overview_no_auth_401(http):
r = http.post("/tools/get_market_overview", json={})
assert r.status_code == 401
def test_get_cot_tff_core_ok(http):
with patch(
"mcp_macro.server.fetch_cot_tff",
new=AsyncMock(return_value={"symbol": "ES", "rows": []}),
):
r = http.post(
"/tools/get_cot_tff",
headers={"Authorization": "Bearer ct"},
json={"symbol": "ES"},
)
assert r.status_code == 200
assert r.json()["symbol"] == "ES"
def test_get_cot_tff_observer_ok(http):
with patch(
"mcp_macro.server.fetch_cot_tff",
new=AsyncMock(return_value={"symbol": "ES", "rows": []}),
):
r = http.post(
"/tools/get_cot_tff",
headers={"Authorization": "Bearer ot"},
json={"symbol": "ES"},
)
assert r.status_code == 200
def test_get_cot_tff_no_auth_401(http):
r = http.post("/tools/get_cot_tff", json={"symbol": "ES"})
assert r.status_code == 401
def test_get_cot_disagg_observer_ok(http):
with patch(
"mcp_macro.server.fetch_cot_disaggregated",
new=AsyncMock(return_value={"symbol": "CL", "rows": []}),
):
r = http.post(
"/tools/get_cot_disaggregated",
headers={"Authorization": "Bearer ot"},
json={"symbol": "CL"},
)
assert r.status_code == 200
def test_get_cot_disagg_no_auth_401(http):
r = http.post("/tools/get_cot_disaggregated", json={"symbol": "CL"})
assert r.status_code == 401
def test_get_cot_extreme_positioning_ok(http):
with patch(
"mcp_macro.server.fetch_cot_extreme_positioning",
new=AsyncMock(return_value={"extremes": []}),
):
r = http.post(
"/tools/get_cot_extreme_positioning",
headers={"Authorization": "Bearer ot"},
json={},
)
assert r.status_code == 200
def test_get_cot_extreme_positioning_lookback_too_short(http):
"""Pydantic validation: lookback_weeks < 4 → 422."""
r = http.post(
"/tools/get_cot_extreme_positioning",
headers={"Authorization": "Bearer ct"},
json={"lookback_weeks": 2},
)
assert r.status_code == 422
@@ -1,27 +0,0 @@
[project]
name = "mcp-sentiment"
version = "0.1.0"
requires-python = ">=3.11"
dependencies = [
"mcp-common",
"fastapi>=0.115",
"uvicorn[standard]>=0.30",
"httpx>=0.27",
"pydantic>=2.6",
]
[project.optional-dependencies]
dev = ["pytest>=8", "pytest-asyncio>=0.23", "pytest-httpx>=0.30"]
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
[tool.hatch.build.targets.wheel]
packages = ["src/mcp_sentiment"]
[tool.uv.sources]
mcp-common = { workspace = true }
[project.scripts]
mcp-sentiment = "mcp_sentiment.__main__:main"
@@ -1,46 +0,0 @@
from __future__ import annotations
import json
import os
import uvicorn
from mcp_common.auth import load_token_store_from_files
from mcp_common.logging import configure_root_logging
from mcp_sentiment.server import create_app
def _load_cryptopanic_key() -> str:
"""CER-002: preferisci file secret, fallback a env CRYPTOPANIC_API_KEY."""
creds_file = os.environ.get("SENTIMENT_CREDENTIALS_FILE")
if creds_file and os.path.exists(creds_file):
try:
with open(creds_file) as f:
creds = json.load(f)
key = (creds.get("cryptopanic_key") or "").strip()
if key and key.lower() not in ("placeholder", "changeme", "none"):
return key
except (OSError, json.JSONDecodeError):
pass
return (os.environ.get("CRYPTOPANIC_API_KEY") or "").strip()
configure_root_logging() # CER-P5-009
def main():
key = _load_cryptopanic_key()
token_store = load_token_store_from_files(
core_token_file=os.environ.get("CORE_TOKEN_FILE"),
observer_token_file=os.environ.get("OBSERVER_TOKEN_FILE"),
)
app = create_app(cryptopanic_key=key, token_store=token_store)
uvicorn.run(
app,
log_config=None, # CER-P5-009: delegate to the root JSON logger
host=os.environ.get("HOST", "0.0.0.0"),
port=int(os.environ.get("PORT", "9014")),
)
if __name__ == "__main__":
main()
@@ -1,174 +0,0 @@
from __future__ import annotations
import logging
import os
from fastapi import Depends, FastAPI, HTTPException
from mcp_common.auth import Principal, TokenStore, require_principal
from mcp_common.mcp_bridge import mount_mcp_endpoint
from mcp_common.server import build_app
from pydantic import BaseModel
from mcp_sentiment.fetchers import (
fetch_cointegration_pairs,
fetch_cross_exchange_funding,
fetch_crypto_news,
fetch_funding_arb_spread,
fetch_funding_rates,
fetch_liquidation_heatmap,
fetch_oi_history,
fetch_social_sentiment,
fetch_world_news,
)
logger = logging.getLogger(__name__)
# --- Body models ---
class GetCryptoNewsReq(BaseModel):
limit: int = 20
class GetSocialSentimentReq(BaseModel):
symbol: str = "BTC"
class GetFundingRatesReq(BaseModel):
asset: str = "BTC"
class GetWorldNewsReq(BaseModel):
pass
class GetCrossExchangeFundingReq(BaseModel):
assets: list[str] | None = None
class GetFundingArbSpreadReq(BaseModel):
assets: list[str] | None = None
class GetLiquidationHeatmapReq(BaseModel):
asset: str = "BTC"
class GetCointegrationPairsReq(BaseModel):
pairs: list[list[str]] | None = None
lookback_hours: int = 24
class GetOiHistoryReq(BaseModel):
asset: str = "BTC"
period: str = "5m"
limit: int = 288
# --- ACL helper ---
def _check(principal: Principal, *, core: bool = False, observer: bool = False) -> None:
allowed: set[str] = set()
if core:
allowed.add("core")
if observer:
allowed.add("observer")
if not (principal.capabilities & allowed):
raise HTTPException(403, f"capability required: {allowed}")
# --- App factory ---
def create_app(*, cryptopanic_key: str = "", token_store: TokenStore) -> FastAPI:
app = build_app(name="mcp-sentiment", version="0.1.0", token_store=token_store)
if not cryptopanic_key or cryptopanic_key.lower() in ("placeholder", "none", "changeme"):
logger.warning(
"mcp-sentiment: cryptopanic_key mancante o placeholder — get_crypto_news "
"ritornerà headlines=[] con note diagnostica"
)
@app.post("/tools/get_crypto_news", tags=["reads"])
async def t_get_crypto_news(
body: GetCryptoNewsReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_crypto_news(api_key=cryptopanic_key, limit=body.limit)
@app.post("/tools/get_social_sentiment", tags=["reads"])
async def t_get_social_sentiment(
body: GetSocialSentimentReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_social_sentiment(body.symbol)
@app.post("/tools/get_funding_rates", tags=["reads"])
async def t_get_funding_rates(
body: GetFundingRatesReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_funding_rates(body.asset)
@app.post("/tools/get_world_news", tags=["reads"])
async def t_get_world_news(
body: GetWorldNewsReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_world_news()
@app.post("/tools/get_cross_exchange_funding", tags=["reads"])
async def t_get_cross_exchange_funding(
body: GetCrossExchangeFundingReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_cross_exchange_funding(body.assets)
@app.post("/tools/get_funding_arb_spread", tags=["reads"])
async def t_get_funding_arb_spread(
body: GetFundingArbSpreadReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_funding_arb_spread(body.assets)
@app.post("/tools/get_liquidation_heatmap", tags=["reads"])
async def t_get_liquidation_heatmap(
body: GetLiquidationHeatmapReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_liquidation_heatmap(body.asset)
@app.post("/tools/get_cointegration_pairs", tags=["reads"])
async def t_get_cointegration_pairs(
body: GetCointegrationPairsReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_cointegration_pairs(body.pairs, body.lookback_hours)
@app.post("/tools/get_oi_history", tags=["reads"])
async def t_get_oi_history(
body: GetOiHistoryReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_oi_history(body.asset, body.period, body.limit)
# ───── MCP endpoint (/mcp) — bridge to /tools/* ─────
port = int(os.environ.get("PORT", "9014"))
mount_mcp_endpoint(
app,
name="cerbero-sentiment",
version="0.1.0",
token_store=token_store,
internal_base_url=f"http://localhost:{port}",
tools=[
{"name": "get_crypto_news", "description": "News crypto da CryptoPanic."},
{"name": "get_social_sentiment", "description": "Sentiment aggregato social."},
{"name": "get_funding_rates", "description": "Funding rates aggregati."},
{"name": "get_world_news", "description": "News macro/world."},
{"name": "get_cross_exchange_funding", "description": "Funding multi-asset multi-exchange + arbitrage opportunities."},
{"name": "get_oi_history", "description": "Open interest history perp (Binance) + delta_pct 1h/4h/24h."},
{"name": "get_funding_arb_spread", "description": "Opportunità arbitrage funding cross-exchange in formato compatto + annualized %."},
{"name": "get_liquidation_heatmap", "description": "Pressione liquidazioni heuristica da OI delta + funding (long/short squeeze risk)."},
{"name": "get_cointegration_pairs", "description": "Engle-Granger cointegration test su coppie crypto Binance hourly."},
],
)
return app
@@ -1,216 +0,0 @@
from __future__ import annotations
from unittest.mock import AsyncMock, patch
import pytest
from fastapi.testclient import TestClient
from mcp_common.auth import Principal, TokenStore
from mcp_sentiment.server import create_app
@pytest.fixture
def http():
store = TokenStore(
tokens={
"ct": Principal("core", {"core"}),
"ot": Principal("observer", {"observer"}),
}
)
app = create_app(cryptopanic_key="testkey", token_store=store)
return TestClient(app)
# --- Health ---
def test_health(http):
assert http.get("/health").status_code == 200
# --- get_crypto_news ---
def test_get_crypto_news_core_ok(http):
with patch(
"mcp_sentiment.server.fetch_crypto_news",
new=AsyncMock(return_value={"headlines": []}),
):
r = http.post(
"/tools/get_crypto_news",
headers={"Authorization": "Bearer ct"},
json={"limit": 5},
)
assert r.status_code == 200
def test_get_crypto_news_observer_ok(http):
with patch(
"mcp_sentiment.server.fetch_crypto_news",
new=AsyncMock(return_value={"headlines": []}),
):
r = http.post(
"/tools/get_crypto_news",
headers={"Authorization": "Bearer ot"},
json={},
)
assert r.status_code == 200
def test_get_crypto_news_no_auth_401(http):
r = http.post("/tools/get_crypto_news", json={})
assert r.status_code == 401
# --- get_social_sentiment ---
def test_get_social_sentiment_core_ok(http):
with patch(
"mcp_sentiment.server.fetch_social_sentiment",
new=AsyncMock(return_value={"fear_greed_index": 65, "fear_greed_label": "Greed"}),
):
r = http.post(
"/tools/get_social_sentiment",
headers={"Authorization": "Bearer ct"},
json={},
)
assert r.status_code == 200
assert r.json()["fear_greed_index"] == 65
def test_get_social_sentiment_observer_ok(http):
with patch(
"mcp_sentiment.server.fetch_social_sentiment",
new=AsyncMock(return_value={"fear_greed_index": 65}),
):
r = http.post(
"/tools/get_social_sentiment",
headers={"Authorization": "Bearer ot"},
json={},
)
assert r.status_code == 200
def test_get_social_sentiment_no_auth_401(http):
r = http.post("/tools/get_social_sentiment", json={})
assert r.status_code == 401
# --- get_funding_rates ---
def test_get_funding_rates_core_ok(http):
with patch(
"mcp_sentiment.server.fetch_funding_rates",
new=AsyncMock(return_value={"rates": []}),
):
r = http.post(
"/tools/get_funding_rates",
headers={"Authorization": "Bearer ct"},
json={},
)
assert r.status_code == 200
def test_get_funding_rates_observer_ok(http):
with patch(
"mcp_sentiment.server.fetch_funding_rates",
new=AsyncMock(return_value={"rates": []}),
):
r = http.post(
"/tools/get_funding_rates",
headers={"Authorization": "Bearer ot"},
json={},
)
assert r.status_code == 200
def test_get_funding_rates_no_auth_401(http):
r = http.post("/tools/get_funding_rates", json={})
assert r.status_code == 401
# --- get_world_news ---
def test_get_world_news_core_ok(http):
with patch(
"mcp_sentiment.server.fetch_world_news",
new=AsyncMock(return_value={"articles": [], "count": 0}),
):
r = http.post(
"/tools/get_world_news",
headers={"Authorization": "Bearer ct"},
json={},
)
assert r.status_code == 200
def test_get_world_news_observer_ok(http):
with patch(
"mcp_sentiment.server.fetch_world_news",
new=AsyncMock(return_value={"articles": [], "count": 0}),
):
r = http.post(
"/tools/get_world_news",
headers={"Authorization": "Bearer ot"},
json={},
)
assert r.status_code == 200
def test_get_world_news_no_auth_401(http):
r = http.post("/tools/get_world_news", json={})
assert r.status_code == 401
# --- New indicators: funding_arb_spread, liquidation_heatmap, cointegration_pairs ---
def test_get_funding_arb_spread_ok(http):
with patch(
"mcp_sentiment.server.fetch_funding_arb_spread",
new=AsyncMock(return_value={"opportunities": []}),
):
r = http.post(
"/tools/get_funding_arb_spread",
headers={"Authorization": "Bearer ot"},
json={},
)
assert r.status_code == 200
def test_get_funding_arb_spread_no_auth_401(http):
r = http.post("/tools/get_funding_arb_spread", json={})
assert r.status_code == 401
def test_get_liquidation_heatmap_ok(http):
with patch(
"mcp_sentiment.server.fetch_liquidation_heatmap",
new=AsyncMock(return_value={"asset": "BTC", "long_squeeze_risk": "low"}),
):
r = http.post(
"/tools/get_liquidation_heatmap",
headers={"Authorization": "Bearer ct"},
json={"asset": "BTC"},
)
assert r.status_code == 200
def test_get_liquidation_heatmap_no_auth_401(http):
r = http.post("/tools/get_liquidation_heatmap", json={"asset": "BTC"})
assert r.status_code == 401
def test_get_cointegration_pairs_ok(http):
with patch(
"mcp_sentiment.server.fetch_cointegration_pairs",
new=AsyncMock(return_value={"results": []}),
):
r = http.post(
"/tools/get_cointegration_pairs",
headers={"Authorization": "Bearer ot"},
json={"pairs": [["BTC", "ETH"]]},
)
assert r.status_code == 200
def test_get_cointegration_pairs_no_auth_401(http):
r = http.post("/tools/get_cointegration_pairs", json={})
assert r.status_code == 401
@@ -0,0 +1,82 @@
"""Entrypoint cerbero-mcp.
Boot:
- carica Settings da .env
- costruisce app FastAPI con router per ogni exchange
- crea ClientRegistry con builder
- monta lifespan per chiusura pulita
- avvia uvicorn
"""
from __future__ import annotations
from contextlib import asynccontextmanager
from typing import Literal, cast
import uvicorn
from fastapi import FastAPI
from cerbero_mcp.client_registry import ClientRegistry
from cerbero_mcp.common.logging import configure_root_logging
from cerbero_mcp.exchanges import build_client
from cerbero_mcp.routers import (
alpaca,
bybit,
deribit,
hyperliquid,
macro,
sentiment,
)
from cerbero_mcp.server import build_app
from cerbero_mcp.settings import Settings
def _make_app(settings: Settings) -> FastAPI:
app = build_app(
testnet_token=settings.testnet_token.get_secret_value(),
mainnet_token=settings.mainnet_token.get_secret_value(),
title="Cerbero MCP",
version="2.0.0",
)
app.state.settings = settings
async def builder(exchange: str, env: str):
return await build_client(
settings, exchange, cast(Literal["testnet", "mainnet"], env)
)
app.state.registry = ClientRegistry(builder=builder)
@asynccontextmanager
async def lifespan(app: FastAPI):
try:
yield
finally:
await app.state.registry.aclose()
app.router.lifespan_context = lifespan
app.include_router(deribit.make_router())
app.include_router(bybit.make_router())
app.include_router(hyperliquid.make_router())
app.include_router(alpaca.make_router())
app.include_router(macro.make_router())
app.include_router(sentiment.make_router())
return app
def main() -> None:
configure_root_logging()
settings = Settings() # type: ignore[call-arg]
app = _make_app(settings)
uvicorn.run(
app,
log_config=None,
host=settings.host,
port=settings.port,
)
if __name__ == "__main__":
main()
@@ -0,0 +1,60 @@
"""Bearer auth middleware: bearer token → request.state.environment."""
from __future__ import annotations
import secrets
from typing import Literal
from fastapi import FastAPI, Request, status
from fastapi.responses import JSONResponse
Environment = Literal["testnet", "mainnet"]
WHITELIST_PATHS = frozenset({"/health", "/apidocs", "/openapi.json", "/docs", "/redoc"})
def _extract_bearer(auth_header: str) -> str | None:
if not auth_header.startswith("Bearer "):
return None
token = auth_header[len("Bearer "):].strip()
return token or None
def _check_token(
candidate: str, testnet_token: str, mainnet_token: str
) -> Environment | None:
if secrets.compare_digest(candidate, testnet_token):
return "testnet"
if secrets.compare_digest(candidate, mainnet_token):
return "mainnet"
return None
def install_auth_middleware(
app: FastAPI,
*,
testnet_token: str,
mainnet_token: str,
) -> None:
"""Register the bearer auth middleware on the FastAPI app."""
@app.middleware("http")
async def auth_middleware(request: Request, call_next):
if request.url.path in WHITELIST_PATHS:
return await call_next(request)
token = _extract_bearer(request.headers.get("Authorization", ""))
if token is None:
return JSONResponse(
status_code=status.HTTP_401_UNAUTHORIZED,
content={"error": {"code": "UNAUTHORIZED",
"message": "missing or malformed bearer token"}},
)
env = _check_token(token, testnet_token, mainnet_token)
if env is None:
return JSONResponse(
status_code=status.HTTP_401_UNAUTHORIZED,
content={"error": {"code": "UNAUTHORIZED",
"message": "invalid token"}},
)
request.state.environment = env
return await call_next(request)
@@ -0,0 +1,40 @@
"""Lazy cache of exchange clients, one instance per (exchange, env)."""
from __future__ import annotations
import asyncio
import contextlib
from collections import defaultdict
from collections.abc import Awaitable, Callable
from typing import Any, Literal
Environment = Literal["testnet", "mainnet"]
Builder = Callable[[str, Environment], Awaitable[Any]]
class ClientRegistry:
def __init__(self, *, builder: Builder) -> None:
self._builder = builder
self._clients: dict[tuple[str, Environment], Any] = {}
self._locks: dict[tuple[str, Environment], asyncio.Lock] = defaultdict(
asyncio.Lock
)
async def get(self, exchange: str, env: Environment) -> Any:
key = (exchange, env)
if key in self._clients:
return self._clients[key]
async with self._locks[key]:
if key in self._clients: # double-check
return self._clients[key]
client = await self._builder(exchange, env)
self._clients[key] = client
return client
async def aclose(self) -> None:
for client in self._clients.values():
close = getattr(client, "aclose", None)
if close is None:
continue
with contextlib.suppress(Exception):
await close()
self._clients.clear()
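The `get()` path above uses a per-key lock plus a double-checked cache so that concurrent callers build each client exactly once. A self-contained sketch of the pattern, simplified to a single string key, with a builder that counts its invocations:

```python
import asyncio
from collections import defaultdict

class Registry:
    def __init__(self, builder):
        self._builder = builder
        self._clients = {}
        self._locks = defaultdict(asyncio.Lock)

    async def get(self, key):
        if key in self._clients:          # fast path, no lock
            return self._clients[key]
        async with self._locks[key]:
            if key in self._clients:      # double-check after acquiring the lock
                return self._clients[key]
            client = await self._builder(key)
            self._clients[key] = client
            return client

calls = 0

async def builder(key):
    global calls
    calls += 1
    await asyncio.sleep(0)  # yield, so other waiters queue on the lock
    return f"client-{key}"

async def main():
    reg = Registry(builder)
    return await asyncio.gather(*(reg.get("deribit") for _ in range(5)))

results = asyncio.run(main())
```

Five concurrent `get()` calls for the same key trigger a single builder invocation; callers that lose the race find the cached client on the second check inside the lock.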
@@ -23,8 +23,7 @@ import os
from logging.handlers import TimedRotatingFileHandler
from typing import Any
from mcp_common.auth import Principal
from mcp_common.logging import SecretsFilter, get_json_logger
from cerbero_mcp.common.logging import SecretsFilter, get_json_logger
try:
from pythonjsonlogger.json import JsonFormatter as _JsonFormatter # noqa: N813
@@ -67,7 +66,7 @@ def _configure_audit_sink() -> None:
def audit_write_op(
*,
principal: Principal | None,
actor: str | None = None,
action: str,
exchange: str,
target: str | None = None,
@@ -77,8 +76,8 @@ def audit_write_op(
) -> None:
"""Emit a structured audit log record per write operation.
principal: the caller (None if anonymous, though _check normally
prevents reaching this point without a principal).
actor: identifier of the caller (e.g. "testnet", "mainnet",
or None for anonymous logging).
action: nome del tool (es. "place_order", "cancel_order").
exchange: identificatore servizio (deribit, bybit, alpaca, hyperliquid).
target: instrument/symbol/order_id su cui si agisce.
@@ -91,7 +90,7 @@ def audit_write_op(
"audit_event": "write_op",
"action": action,
"exchange": exchange,
"principal": principal.name if principal else None,
"actor": actor,
"target": target,
"payload": payload or {},
}
@@ -0,0 +1,51 @@
"""Standard error envelope for all MCP tools."""
from __future__ import annotations
import uuid
from datetime import UTC, datetime
from typing import Any
def error_envelope(
*,
type_: str,
code: str,
message: str,
retryable: bool,
suggested_fix: str | None = None,
details: dict | None = None,
request_id: str | None = None,
) -> dict:
env: dict[str, Any] = {
"error": {
"type": type_,
"code": code,
"message": message,
"retryable": retryable,
},
"request_id": request_id or uuid.uuid4().hex,
"data_timestamp": datetime.now(UTC).isoformat(),
}
if suggested_fix:
env["error"]["suggested_fix"] = suggested_fix
if details:
env["error"]["details"] = details
return env
HTTP_CODE_MAP = {
400: "BAD_REQUEST",
401: "UNAUTHORIZED",
403: "FORBIDDEN",
404: "NOT_FOUND",
408: "TIMEOUT",
409: "CONFLICT",
422: "VALIDATION_ERROR",
429: "RATE_LIMIT",
500: "INTERNAL_ERROR",
502: "UPSTREAM_ERROR",
503: "UNAVAILABLE",
504: "GATEWAY_TIMEOUT",
}
RETRYABLE_STATUSES = frozenset({408, 429, 502, 503, 504})
@@ -28,8 +28,6 @@ import httpx
from fastapi import FastAPI, Request
from fastapi.responses import JSONResponse
from mcp_common.auth import TokenStore
MCP_PROTOCOL_VERSION = "2024-11-05"
@@ -95,20 +93,22 @@ def mount_mcp_endpoint(
*,
name: str,
version: str,
token_store: TokenStore,
valid_tokens: set[str],
internal_base_url: str,
tools: list[dict],
) -> None:
"""Register an MCP JSON-RPC 2.0 endpoint on POST /mcp.
Each tool is proxied to POST {internal_base_url}/tools/<name> with the
MCP client's Bearer token (preserving the existing REST ACLs).
MCP client's Bearer token. Auth is already handled by the V2 middleware
(testnet/mainnet bearer); here the token is re-checked against
valid_tokens before proxying.
Args:
app: FastAPI instance of the service
name: MCP server name
version: service version
token_store: the same one used by the REST tools
valid_tokens: set of valid tokens (testnet + mainnet)
internal_base_url: internal base URL (e.g. "http://localhost:9015")
tools: list of {"name": str, "description": str, "input_schema"?: dict}
"""
@@ -207,8 +207,7 @@ def mount_mcp_endpoint(
if not auth.startswith("Bearer "):
return JSONResponse({"error": "missing bearer token"}, status_code=401)
token = auth[len("Bearer "):].strip()
principal = token_store.get(token)
if principal is None:
if token not in valid_tokens:
return JSONResponse({"error": "invalid token"}, status_code=403)
body = await request.json()
@@ -0,0 +1,74 @@
"""Centralized client builder for ClientRegistry."""
from __future__ import annotations
from typing import Literal
from cerbero_mcp.settings import Settings
Environment = Literal["testnet", "mainnet"]
async def build_client(
settings: Settings, exchange: str, env: Environment
):
if exchange == "deribit":
from cerbero_mcp.exchanges.deribit.client import DeribitClient
url = settings.deribit.url_testnet if env == "testnet" else settings.deribit.url_live
return DeribitClient(
client_id=settings.deribit.client_id,
client_secret=settings.deribit.client_secret.get_secret_value(),
testnet=(env == "testnet"),
base_url_override=url,
)
if exchange == "bybit":
from cerbero_mcp.exchanges.bybit.client import BybitClient
url = settings.bybit.url_testnet if env == "testnet" else settings.bybit.url_live
return BybitClient(
api_key=settings.bybit.api_key,
api_secret=settings.bybit.api_secret.get_secret_value(),
testnet=(env == "testnet"),
base_url=url,
)
if exchange == "hyperliquid":
from cerbero_mcp.exchanges.hyperliquid.client import HyperliquidClient
url = settings.hyperliquid.url_testnet if env == "testnet" else settings.hyperliquid.url_live
return HyperliquidClient(
wallet_address=settings.hyperliquid.wallet_address,
private_key=settings.hyperliquid.private_key.get_secret_value(),
testnet=(env == "testnet"),
api_wallet_address=settings.hyperliquid.api_wallet_address,
base_url=url,
)
if exchange == "alpaca":
from cerbero_mcp.exchanges.alpaca.client import AlpacaClient
url = settings.alpaca.url_testnet if env == "testnet" else settings.alpaca.url_live
return AlpacaClient(
api_key=settings.alpaca.api_key_id,
secret_key=settings.alpaca.secret_key.get_secret_value(),
paper=(env == "testnet"),
base_url=url,
)
if exchange == "macro":
# Read-only data provider; env is ignored. The registry still
# instantiates 2 entries (testnet/mainnet); the cost is negligible
# (stateless wrapper with no HTTP session).
from cerbero_mcp.exchanges.macro.client import MacroClient
return MacroClient(
fred_api_key=settings.macro.fred_api_key.get_secret_value(),
finnhub_api_key=settings.macro.finnhub_api_key.get_secret_value(),
)
if exchange == "sentiment":
# Read-only data provider; env is ignored (CryptoPanic, LunarCrush and
# the public multi-exchange funding/OI endpoints are single-instance).
from cerbero_mcp.exchanges.sentiment.client import SentimentClient
return SentimentClient(
cryptopanic_key=settings.sentiment.cryptopanic_key.get_secret_value(),
lunarcrush_key=settings.sentiment.lunarcrush_key.get_secret_value(),
)
raise ValueError(f"unsupported exchange: {exchange}")
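Each branch above repeats the same env-to-URL selection before constructing its client. That selection can be factored into a tiny helper (`pick_url` is hypothetical, shown only to make the pattern explicit):

```python
def pick_url(env: str, url_testnet: str, url_live: str) -> str:
    # Mirror the per-exchange URL selection in build_client: testnet maps
    # to the testnet override, mainnet to the live one; anything else is
    # a programming error, same as the unsupported-exchange branch.
    if env not in ("testnet", "mainnet"):
        raise ValueError(f"unsupported environment: {env}")
    return url_testnet if env == "testnet" else url_live

url = pick_url("testnet",
               "https://test.deribit.com/api/v2",
               "https://www.deribit.com/api/v2")
```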
@@ -92,6 +92,7 @@ class AlpacaClient:
api_key: str,
secret_key: str,
paper: bool = True,
base_url: str | None = None,
trading: Any | None = None,
stock_data: Any | None = None,
crypto_data: Any | None = None,
@@ -100,9 +101,18 @@ class AlpacaClient:
self.api_key = api_key
self.secret_key = secret_key
self.paper = paper
self._trading = trading or TradingClient(
api_key=api_key, secret_key=secret_key, paper=paper
)
self.base_url = base_url
# alpaca-py TradingClient accepts `url_override` to override the trading URL.
# Data clients (Stock/Crypto/Option) do not support url_override in their
# constructors; they use separate data endpoints (data.alpaca.markets),
# so `base_url` is ignored for them.
if trading is None:
trading_kwargs: dict[str, Any] = {
"api_key": api_key, "secret_key": secret_key, "paper": paper,
}
if base_url:
trading_kwargs["url_override"] = base_url
trading = TradingClient(**trading_kwargs)
self._trading = trading
self._stock = stock_data or StockHistoricalDataClient(
api_key=api_key, secret_key=secret_key
)
@@ -120,14 +130,14 @@ class AlpacaClient:
async def get_account(self) -> dict:
acc = await self._run(self._trading.get_account)
return _serialize(acc)
return _serialize(acc) # type: ignore[no-any-return]
async def get_positions(self) -> list[dict]:
pos = await self._run(self._trading.get_all_positions)
return [_serialize(p) for p in pos]
async def get_activities(self, limit: int = 50) -> list[dict]:
acts = await self._run(self._trading.get_account_activities)
acts = await self._run(self._trading.get_account_activities) # type: ignore[union-attr]
data = [_serialize(a) for a in acts]
return data[:limit]
@@ -138,7 +148,7 @@ class AlpacaClient:
) -> list[dict]:
req = GetAssetsRequest(
asset_class=_asset_class_enum(asset_class),
status=status,
status=status, # type: ignore[arg-type]
)
assets = await self._run(self._trading.get_all_assets, req)
return [_serialize(a) for a in assets[:500]]
@@ -165,10 +175,10 @@ class AlpacaClient:
"timestamp": _serialize(getattr(trade, "timestamp", None)),
}
if ac == "crypto":
req = CryptoLatestTradeRequest(symbol_or_symbols=symbol)
req = CryptoLatestTradeRequest(symbol_or_symbols=symbol) # type: ignore[assignment]
data = await self._run(self._crypto.get_crypto_latest_trade, req)
trade = data.get(symbol)
q_req = CryptoLatestQuoteRequest(symbol_or_symbols=symbol)
q_req = CryptoLatestQuoteRequest(symbol_or_symbols=symbol) # type: ignore[assignment]
qdata = await self._run(self._crypto.get_crypto_latest_quote, q_req)
quote = qdata.get(symbol)
return {
@@ -180,7 +190,7 @@ class AlpacaClient:
"timestamp": _serialize(getattr(trade, "timestamp", None)),
}
if ac == "options":
req = OptionLatestQuoteRequest(symbol_or_symbols=symbol)
req = OptionLatestQuoteRequest(symbol_or_symbols=symbol) # type: ignore[assignment]
data = await self._run(self._option.get_option_latest_quote, req)
quote = data.get(symbol)
return {
@@ -214,13 +224,13 @@ class AlpacaClient:
)
data = await self._run(self._stock.get_stock_bars, req)
elif ac == "crypto":
req = CryptoBarsRequest(
req = CryptoBarsRequest( # type: ignore[assignment]
symbol_or_symbols=symbol, timeframe=tf,
start=start_dt, end=end_dt, limit=limit,
)
data = await self._run(self._crypto.get_crypto_bars, req)
elif ac == "options":
req = OptionBarsRequest(
req = OptionBarsRequest( # type: ignore[assignment]
symbol_or_symbols=symbol, timeframe=tf,
start=start_dt, end=end_dt, limit=limit,
)
@@ -245,7 +255,7 @@ class AlpacaClient:
async def get_snapshot(self, symbol: str) -> dict:
req = StockSnapshotRequest(symbol_or_symbols=symbol)
data = await self._run(self._stock.get_stock_snapshot, req)
return _serialize(data.get(symbol))
return _serialize(data.get(symbol)) # type: ignore[no-any-return]
async def get_option_chain(
self,
@@ -291,23 +301,23 @@ class AlpacaClient:
"time_in_force": tif_enum,
}
if qty is not None:
common["qty"] = qty
common["qty"] = qty # type: ignore[assignment]
if notional is not None:
common["notional"] = notional
common["notional"] = notional # type: ignore[assignment]
if ot == "market":
req = MarketOrderRequest(**common)
elif ot == "limit":
if limit_price is None:
raise ValueError("limit_price required for limit order")
req = LimitOrderRequest(**common, limit_price=limit_price)
req = LimitOrderRequest(**common, limit_price=limit_price) # type: ignore[assignment]
elif ot == "stop":
if stop_price is None:
raise ValueError("stop_price required for stop order")
req = StopOrderRequest(**common, stop_price=stop_price)
req = StopOrderRequest(**common, stop_price=stop_price) # type: ignore[assignment]
else:
raise ValueError(f"unsupported order_type: {order_type}")
order = await self._run(self._trading.submit_order, req)
return _serialize(order)
return _serialize(order) # type: ignore[no-any-return]
async def amend_order(
self,
@@ -328,7 +338,7 @@ class AlpacaClient:
kwargs["time_in_force"] = TimeInForce(tif.lower())
req = ReplaceOrderRequest(**kwargs)
order = await self._run(self._trading.replace_order_by_id, order_id, req)
return _serialize(order)
return _serialize(order) # type: ignore[no-any-return]
async def cancel_order(self, order_id: str) -> dict:
await self._run(self._trading.cancel_order_by_id, order_id)
@@ -354,7 +364,7 @@ class AlpacaClient:
order = await self._run(
self._trading.close_position, symbol, close_options=req
)
return _serialize(order)
return _serialize(order) # type: ignore[no-any-return]
async def close_all_positions(self, cancel_orders: bool = True) -> list[dict]:
resp = await self._run(
@@ -366,7 +376,7 @@ class AlpacaClient:
async def get_clock(self) -> dict:
clock = await self._run(self._trading.get_clock)
return _serialize(clock)
return _serialize(clock) # type: ignore[no-any-return]
async def get_calendar(
self, start: str | None = None, end: str | None = None
@@ -0,0 +1,279 @@
"""Alpaca V2 tools: pydantic schemas + async functions.
Each function takes (client: AlpacaClient, params: <Req>) and returns
a dict (or list[dict]). Pure logic, no FastAPI dependency, no ACL.
Bearer auth is handled by the middleware in cerbero_mcp.auth;
audit will be wired by the router via request.state.environment.
"""
"""
from __future__ import annotations
from typing import Any
from pydantic import BaseModel
from cerbero_mcp.exchanges.alpaca.client import AlpacaClient
from cerbero_mcp.exchanges.alpaca.leverage_cap import get_max_leverage
# === Schemas: reads ===
class GetAccountReq(BaseModel):
pass
class GetPositionsReq(BaseModel):
pass
class GetActivitiesReq(BaseModel):
limit: int = 50
class GetAssetsReq(BaseModel):
asset_class: str = "stocks"
status: str = "active"
class GetTickerReq(BaseModel):
symbol: str
asset_class: str = "stocks"
class GetBarsReq(BaseModel):
symbol: str
asset_class: str = "stocks"
interval: str = "1d"
start: str | None = None
end: str | None = None
limit: int = 1000
class GetSnapshotReq(BaseModel):
symbol: str
class GetOptionChainReq(BaseModel):
underlying: str
expiry: str | None = None
class GetOpenOrdersReq(BaseModel):
limit: int = 50
class GetClockReq(BaseModel):
pass
class GetCalendarReq(BaseModel):
start: str | None = None
end: str | None = None
# === Schemas: writes ===
class PlaceOrderReq(BaseModel):
symbol: str
side: str # "buy" | "sell"
qty: float | None = None
notional: float | None = None
order_type: str = "market"
limit_price: float | None = None
stop_price: float | None = None
tif: str = "day"
asset_class: str = "stocks"
model_config = {
"json_schema_extra": {
"examples": [
{
"summary": "Market buy 1 share AAPL",
"value": {
"symbol": "AAPL",
"side": "buy",
"qty": 1,
"order_type": "market",
"asset_class": "stocks",
},
}
]
}
}
class AmendOrderReq(BaseModel):
order_id: str
qty: float | None = None
limit_price: float | None = None
stop_price: float | None = None
tif: str | None = None
class CancelOrderReq(BaseModel):
order_id: str
class CancelAllOrdersReq(BaseModel):
pass
class ClosePositionReq(BaseModel):
symbol: str
qty: float | None = None
percentage: float | None = None
class CloseAllPositionsReq(BaseModel):
cancel_orders: bool = True
# === Tools (reads) ===
async def environment_info(
client: AlpacaClient, *, creds: dict, env_info: Any | None = None
) -> dict:
if env_info is None:
return {
"exchange": "alpaca",
"environment": "testnet" if getattr(client, "paper", True) else "mainnet",
"source": "credentials",
"env_value": None,
"base_url": getattr(client, "base_url", None),
"max_leverage": get_max_leverage(creds),
}
return {
"exchange": env_info.exchange,
"environment": env_info.environment,
"source": env_info.source,
"env_value": env_info.env_value,
"base_url": env_info.base_url,
"max_leverage": get_max_leverage(creds),
}
async def get_account(client: AlpacaClient, params: GetAccountReq) -> dict:
return await client.get_account()
async def get_positions(
client: AlpacaClient, params: GetPositionsReq
) -> dict:
return {"positions": await client.get_positions()}
async def get_activities(
client: AlpacaClient, params: GetActivitiesReq
) -> dict:
return {"activities": await client.get_activities(params.limit)}
async def get_assets(client: AlpacaClient, params: GetAssetsReq) -> dict:
return {
"assets": await client.get_assets(params.asset_class, params.status)
}
async def get_ticker(client: AlpacaClient, params: GetTickerReq) -> dict:
return await client.get_ticker(params.symbol, params.asset_class)
async def get_bars(client: AlpacaClient, params: GetBarsReq) -> dict:
return await client.get_bars(
params.symbol,
params.asset_class,
params.interval,
params.start,
params.end,
params.limit,
)
async def get_snapshot(
client: AlpacaClient, params: GetSnapshotReq
) -> dict:
return await client.get_snapshot(params.symbol)
async def get_option_chain(
client: AlpacaClient, params: GetOptionChainReq
) -> dict:
return await client.get_option_chain(params.underlying, params.expiry)
async def get_open_orders(
client: AlpacaClient, params: GetOpenOrdersReq
) -> dict:
return {"orders": await client.get_open_orders(params.limit)}
async def get_clock(client: AlpacaClient, params: GetClockReq) -> dict:
return await client.get_clock()
async def get_calendar(
client: AlpacaClient, params: GetCalendarReq
) -> dict:
return {"calendar": await client.get_calendar(params.start, params.end)}
# === Tools (writes) ===
async def place_order(
client: AlpacaClient, params: PlaceOrderReq, *, creds: dict
) -> dict:
# Alpaca: default cap 1 (cash account). No leverage parameter;
# the cap exists for consistency with other exchanges and for audit.
return await client.place_order(
symbol=params.symbol,
side=params.side,
qty=params.qty,
notional=params.notional,
order_type=params.order_type,
limit_price=params.limit_price,
stop_price=params.stop_price,
tif=params.tif,
asset_class=params.asset_class,
)
async def amend_order(
client: AlpacaClient, params: AmendOrderReq
) -> dict:
return await client.amend_order(
params.order_id,
params.qty,
params.limit_price,
params.stop_price,
params.tif,
)
async def cancel_order(
client: AlpacaClient, params: CancelOrderReq
) -> dict:
return await client.cancel_order(params.order_id)
async def cancel_all_orders(
client: AlpacaClient, params: CancelAllOrdersReq
) -> dict:
return {"canceled": await client.cancel_all_orders()}
async def close_position(
client: AlpacaClient, params: ClosePositionReq
) -> dict:
return await client.close_position(
params.symbol, params.qty, params.percentage
)
async def close_all_positions(
client: AlpacaClient, params: CloseAllPositionsReq
) -> dict:
return {
"closed": await client.close_all_positions(params.cancel_orders)
}
@@ -3,10 +3,11 @@ from __future__ import annotations
import asyncio
from typing import Any
from mcp_common import indicators as ind
from mcp_common import microstructure as micro
from pybit.unified_trading import HTTP
from cerbero_mcp.common import indicators as ind
from cerbero_mcp.common import microstructure as micro
def _f(v: Any) -> float | None:
try:
@@ -29,15 +30,24 @@ class BybitClient:
api_secret: str,
testnet: bool = True,
http: Any | None = None,
base_url: str | None = None,
) -> None:
self.api_key = api_key
self.api_secret = api_secret
self.testnet = testnet
self._http = http or HTTP(
api_key=api_key,
api_secret=api_secret,
testnet=testnet,
)
# pybit HTTP does not accept `endpoint` as a kwarg (see _V5HTTPManager.__init__:
# only `domain`/`tld`/`testnet`). The URL override is applied post-init
# by overwriting the `endpoint` attribute of the HTTP instance.
self.base_url = base_url
if http is None:
http = HTTP(
api_key=api_key,
api_secret=api_secret,
testnet=testnet,
)
if base_url:
http.endpoint = base_url
self._http = http
async def _run(self, fn, /, **kwargs):
return await asyncio.to_thread(fn, **kwargs)
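Since pybit's constructor exposes no endpoint kwarg, the override has to happen after construction, as the comment above explains. A sketch of the post-init pattern with a stand-in class (`FakeHTTP` is hypothetical; the real pybit manager derives its endpoint from `testnet`/`domain`/`tld`):

```python
class FakeHTTP:
    # Stand-in for pybit's HTTP manager: the constructor picks the
    # endpoint from testnet, with no way to pass a custom URL directly.
    def __init__(self, testnet: bool = True):
        self.endpoint = ("https://api-testnet.bybit.com" if testnet
                         else "https://api.bybit.com")

http = FakeHTTP(testnet=True)
base_url = "http://localhost:9101"  # hypothetical local mock endpoint
if base_url:
    http.endpoint = base_url  # post-init override, as in BybitClient
```

The trade-off of overwriting an instance attribute is that it depends on pybit's internals keeping the `endpoint` name, which is why the comment in `BybitClient` pins down exactly where that assumption comes from.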
@@ -0,0 +1,442 @@
"""Bybit V2 tools: pydantic schemas + async functions.
Each function takes (client: BybitClient, params: <Req>) and returns
a dict (or a Pydantic model). Pure logic, no FastAPI dependency, no ACL.
Bearer auth is handled by the middleware in cerbero_mcp.auth;
audit will be wired by the router via request.state.environment.
"""
"""
from __future__ import annotations
from typing import Any
from pydantic import BaseModel, Field
from cerbero_mcp.exchanges.bybit.client import BybitClient
from cerbero_mcp.exchanges.bybit.leverage_cap import (
enforce_leverage as _enforce_leverage,
)
from cerbero_mcp.exchanges.bybit.leverage_cap import get_max_leverage
# === Schemas: reads ===
class TickerReq(BaseModel):
symbol: str
category: str = "linear"
class TickerBatchReq(BaseModel):
symbols: list[str]
category: str = "linear"
class OrderbookReq(BaseModel):
symbol: str
category: str = "linear"
limit: int = 50
class HistoricalReq(BaseModel):
symbol: str
category: str = "linear"
interval: str = "60"
start: int | None = None
end: int | None = None
limit: int = 1000
class IndicatorsReq(BaseModel):
symbol: str
category: str = "linear"
indicators: list[str] = ["rsi", "atr", "macd", "adx"]
interval: str = "60"
start: int | None = None
end: int | None = None
class FundingRateReq(BaseModel):
symbol: str
category: str = "linear"
class FundingHistoryReq(BaseModel):
symbol: str
category: str = "linear"
limit: int = 100
class OpenInterestReq(BaseModel):
symbol: str
category: str = "linear"
interval: str = "5min"
limit: int = 288
class InstrumentsReq(BaseModel):
category: str = "linear"
symbol: str | None = None
class OptionChainReq(BaseModel):
base_coin: str
expiry: str | None = None
class PositionsReq(BaseModel):
category: str = "linear"
class AccountSummaryReq(BaseModel):
pass
class TradeHistoryReq(BaseModel):
category: str = "linear"
limit: int = 50
class OpenOrdersReq(BaseModel):
category: str = "linear"
symbol: str | None = None
class BasisSpotPerpReq(BaseModel):
asset: str
class OrderbookImbalanceReq(BaseModel):
symbol: str
category: str = "linear"
depth: int = 10
class BasisTermStructureReq(BaseModel):
asset: str
# === Schemas: writes ===
class PlaceOrderReq(BaseModel):
category: str
symbol: str
side: str
qty: float
order_type: str = "Limit"
price: float | None = None
tif: str = "GTC"
reduce_only: bool = False
position_idx: int | None = None
model_config = {
"json_schema_extra": {
"examples": [
{
"summary": "Market buy 0.01 BTCUSDT linear perp",
"value": {
"category": "linear",
"symbol": "BTCUSDT",
"side": "Buy",
"qty": 0.01,
"order_type": "Market",
},
}
]
}
}
class ComboLegReq(BaseModel):
symbol: str
side: str
qty: float
order_type: str = "Limit"
price: float | None = None
tif: str = "GTC"
reduce_only: bool = False
class PlaceComboOrderReq(BaseModel):
category: str = "option"
legs: list[ComboLegReq] = Field(..., min_length=2)
class AmendOrderReq(BaseModel):
category: str
symbol: str
order_id: str
new_qty: float | None = None
new_price: float | None = None
class CancelOrderReq(BaseModel):
category: str
symbol: str
order_id: str
class CancelAllReq(BaseModel):
category: str
symbol: str | None = None
class SetStopLossReq(BaseModel):
category: str
symbol: str
stop_loss: float
position_idx: int = 0
class SetTakeProfitReq(BaseModel):
category: str
symbol: str
take_profit: float
position_idx: int = 0
class ClosePositionReq(BaseModel):
category: str
symbol: str
class SetLeverageReq(BaseModel):
category: str
symbol: str
leverage: int
class SwitchModeReq(BaseModel):
category: str
symbol: str
mode: str
class TransferReq(BaseModel):
coin: str
amount: float
from_type: str
to_type: str
# === Tools (reads) ===
async def environment_info(
client: BybitClient, *, creds: dict, env_info: Any | None = None
) -> dict:
if env_info is None:
return {
"exchange": "bybit",
"environment": "testnet" if client.testnet else "mainnet",
"source": "credentials",
"env_value": None,
"base_url": getattr(client, "base_url", None),
"max_leverage": get_max_leverage(creds),
}
return {
"exchange": env_info.exchange,
"environment": env_info.environment,
"source": env_info.source,
"env_value": env_info.env_value,
"base_url": env_info.base_url,
"max_leverage": get_max_leverage(creds),
}
async def get_ticker(client: BybitClient, params: TickerReq) -> dict:
return await client.get_ticker(params.symbol, params.category)
async def get_ticker_batch(client: BybitClient, params: TickerBatchReq) -> dict:
return await client.get_ticker_batch(params.symbols, params.category)
async def get_orderbook(client: BybitClient, params: OrderbookReq) -> dict:
return await client.get_orderbook(params.symbol, params.category, params.limit)
async def get_historical(client: BybitClient, params: HistoricalReq) -> dict:
return await client.get_historical(
params.symbol,
params.category,
params.interval,
params.start,
params.end,
params.limit,
)
async def get_indicators(client: BybitClient, params: IndicatorsReq) -> dict:
return await client.get_indicators(
params.symbol,
params.category,
params.indicators,
params.interval,
params.start,
params.end,
)
async def get_funding_rate(client: BybitClient, params: FundingRateReq) -> dict:
return await client.get_funding_rate(params.symbol, params.category)
async def get_funding_history(client: BybitClient, params: FundingHistoryReq) -> dict:
return await client.get_funding_history(
params.symbol, params.category, params.limit
)
async def get_open_interest(client: BybitClient, params: OpenInterestReq) -> dict:
return await client.get_open_interest(
params.symbol, params.category, params.interval, params.limit
)
async def get_instruments(client: BybitClient, params: InstrumentsReq) -> dict:
return await client.get_instruments(params.category, params.symbol)
async def get_option_chain(client: BybitClient, params: OptionChainReq) -> dict:
return await client.get_option_chain(params.base_coin, params.expiry)
async def get_positions(client: BybitClient, params: PositionsReq) -> dict:
return {"positions": await client.get_positions(params.category)}
async def get_account_summary(
client: BybitClient, params: AccountSummaryReq
) -> dict:
return await client.get_account_summary()
async def get_trade_history(client: BybitClient, params: TradeHistoryReq) -> dict:
return {
"trades": await client.get_trade_history(params.category, params.limit)
}
async def get_open_orders(client: BybitClient, params: OpenOrdersReq) -> dict:
return {
"orders": await client.get_open_orders(params.category, params.symbol)
}
async def get_basis_spot_perp(client: BybitClient, params: BasisSpotPerpReq) -> dict:
return await client.get_basis_spot_perp(params.asset)
async def get_orderbook_imbalance(
client: BybitClient, params: OrderbookImbalanceReq
) -> dict:
return await client.get_orderbook_imbalance(
params.symbol, params.category, params.depth
)
async def get_basis_term_structure(
client: BybitClient, params: BasisTermStructureReq
) -> dict:
return await client.get_basis_term_structure(params.asset)
# === Tools (writes) ===
async def place_order(
client: BybitClient, params: PlaceOrderReq, *, creds: dict
) -> dict:
# Bybit has no leverage cap parameter on place_order; the cap is applied via set_leverage.
result = await client.place_order(
category=params.category,
symbol=params.symbol,
side=params.side,
qty=params.qty,
order_type=params.order_type,
price=params.price,
tif=params.tif,
reduce_only=params.reduce_only,
position_idx=params.position_idx,
)
# TODO V2: wire audit via request.state.environment in router
return result
async def place_combo_order(
client: BybitClient, params: PlaceComboOrderReq, *, creds: dict
) -> dict:
result = await client.place_combo_order(
category=params.category,
legs=[leg.model_dump() for leg in params.legs],
)
# TODO V2: wire audit via request.state.environment in router
return result
async def amend_order(client: BybitClient, params: AmendOrderReq) -> dict:
result = await client.amend_order(
params.category,
params.symbol,
params.order_id,
params.new_qty,
params.new_price,
)
return result
async def cancel_order(client: BybitClient, params: CancelOrderReq) -> dict:
result = await client.cancel_order(
params.category, params.symbol, params.order_id
)
return result
async def cancel_all_orders(client: BybitClient, params: CancelAllReq) -> dict:
result = await client.cancel_all_orders(params.category, params.symbol)
return result
async def set_stop_loss(client: BybitClient, params: SetStopLossReq) -> dict:
result = await client.set_stop_loss(
params.category, params.symbol, params.stop_loss, params.position_idx
)
return result
async def set_take_profit(client: BybitClient, params: SetTakeProfitReq) -> dict:
result = await client.set_take_profit(
params.category, params.symbol, params.take_profit, params.position_idx
)
return result
async def close_position(client: BybitClient, params: ClosePositionReq) -> dict:
result = await client.close_position(params.category, params.symbol)
return result
async def set_leverage(
client: BybitClient, params: SetLeverageReq, *, creds: dict
) -> dict:
_enforce_leverage(params.leverage, creds=creds, exchange="bybit")
result = await client.set_leverage(
params.category, params.symbol, params.leverage
)
return result
async def switch_position_mode(
client: BybitClient, params: SwitchModeReq
) -> dict:
result = await client.switch_position_mode(
params.category, params.symbol, params.mode
)
return result
async def transfer_asset(client: BybitClient, params: TransferReq) -> dict:
result = await client.transfer_asset(
params.coin, params.amount, params.from_type, params.to_type
)
return result
@@ -5,10 +5,10 @@ import time
from dataclasses import dataclass, field
from typing import Any
from mcp_common import indicators as ind
from mcp_common import microstructure as micro
from mcp_common import options as opt
from mcp_common.http import async_client
from cerbero_mcp.common import indicators as ind
from cerbero_mcp.common import microstructure as micro
from cerbero_mcp.common import options as opt
from cerbero_mcp.common.http import async_client
BASE_LIVE = "https://www.deribit.com/api/v2"
BASE_TESTNET = "https://test.deribit.com/api/v2"
@@ -28,11 +28,14 @@ class DeribitClient:
client_id: str
client_secret: str
testnet: bool = True
base_url_override: str | None = None
_token: str | None = field(default=None, init=False, repr=False)
_token_expires_at: float = field(default=0.0, init=False, repr=False)
@property
def base_url(self) -> str:
if self.base_url_override:
return self.base_url_override
return BASE_TESTNET if self.testnet else BASE_LIVE
# ── Auth ─────────────────────────────────────────────────────
@@ -84,10 +87,10 @@ class DeribitClient:
resp = await http.get(url, params=request_params, headers=headers)
data = resp.json()
if "result" in data:
return data
return data # type: ignore[no-any-return]
return {"result": None, "error": error_msg}
return data
return data # type: ignore[no-any-return]
# ── Read tools ───────────────────────────────────────────────
@@ -187,9 +190,9 @@ class DeribitClient:
self._request("public/get_book_summary_by_currency", summary_params),
return_exceptions=True,
)
raw = instruments_raw if isinstance(instruments_raw, dict) else {}
raw = instruments_raw if isinstance(instruments_raw, dict) else {} # type: ignore[has-type]
summary_items = (
summary_raw.get("result") if isinstance(summary_raw, dict) else None
summary_raw.get("result") if isinstance(summary_raw, dict) else None # type: ignore[has-type]
) or []
oi_by_name: dict[str, float] = {}
for s in summary_items:
@@ -546,7 +549,7 @@ class DeribitClient:
top_n_strikes: int = 50,
) -> tuple[float, list[dict[str, Any]]]:
"""Fetch chain options + ticker per top-N strikes per OI; restituisce
(spot, legs[]) con campi normalizzati per le funzioni in mcp_common.options.
(spot, legs[]) con campi normalizzati per le funzioni in cerbero_mcp.common.options.
"""
import asyncio
@@ -926,7 +929,7 @@ class DeribitClient:
}
closes = [c["close"] for c in candles]
from mcp_common import indicators as _ind
from cerbero_mcp.common import indicators as _ind
rsi_thr_low = (entry_rules or {}).get("rsi_below", 35)
rsi_thr_high = (entry_rules or {}).get("rsi_above", 65)
@@ -1197,8 +1200,8 @@ class DeribitClient:
if len({l["expiry"] for l in legs}) == 2 and len(strikes) == 1:
return "calendar spread"
if n == 4:
types = [l["type"] for l in legs]
if types.count("P") == 2 and types.count("C") == 2:
types_list = [l["type"] for l in legs]
if types_list.count("P") == 2 and types_list.count("C") == 2:
return "iron condor"
return "custom"
@@ -1481,7 +1484,7 @@ class DeribitClient:
if not values_sorted:
return None
idx = int(round((len(values_sorted) - 1) * p))
return values_sorted[idx]
return values_sorted[idx] # type: ignore[no-any-return]
mean = sum(values) / len(values) if values else None
return {
@@ -1564,7 +1567,7 @@ class DeribitClient:
r = raw.get("result")
if r is None:
return {"error": raw.get("error", "unknown"), "state": "error"}
return r
return r # type: ignore[no-any-return]
async def place_combo_order(
self,
@@ -0,0 +1,530 @@
"""Tool deribit V2: pydantic schemas + async functions.
Ogni funzione prende (client: DeribitClient, params: <Req>) e restituisce
un dict (o un model Pydantic). Pure logica, no FastAPI dependency, no ACL.
L'autenticazione bearer è gestita dal middleware in cerbero_mcp.auth;
l'audit verrà cablato dal router via request.state.environment.
"""
from __future__ import annotations
import contextlib
from typing import Any
from pydantic import BaseModel, field_validator, model_validator
from cerbero_mcp.exchanges.deribit.client import DeribitClient
from cerbero_mcp.exchanges.deribit.leverage_cap import (
enforce_leverage as _enforce_leverage,
)
from cerbero_mcp.exchanges.deribit.leverage_cap import get_max_leverage
# === Schemas ===
class GetTickerReq(BaseModel):
instrument_name: str | None = None
instrument: str | None = None
model_config = {"extra": "allow"}
@model_validator(mode="after")
def _normalize(self):
sym = self.instrument_name or self.instrument
if not sym:
raise ValueError("instrument_name (or instrument) is required")
self.instrument_name = sym
return self
class GetTickerBatchReq(BaseModel):
instrument_names: list[str] | None = None
instruments: list[str] | None = None
model_config = {"extra": "allow"}
@model_validator(mode="after")
def _normalize(self):
names = self.instrument_names or self.instruments
if not names:
raise ValueError("instrument_names (or instruments) is required")
self.instrument_names = names
return self
class GetInstrumentsReq(BaseModel):
currency: str
kind: str | None = None
expiry_from: str | None = None
expiry_to: str | None = None
strike_min: float | None = None
strike_max: float | None = None
min_open_interest: float | None = None
limit: int = 100
offset: int = 0
class GetOrderbookReq(BaseModel):
instrument_name: str
depth: int = 10
class OrderbookImbalanceReq(BaseModel):
instrument_name: str
depth: int = 10
class GetPositionsReq(BaseModel):
currency: str = "USDC"
class GetAccountSummaryReq(BaseModel):
currency: str = "USDC"
class GetTradeHistoryReq(BaseModel):
limit: int = 100
instrument_name: str | None = None
class GetHistoricalReq(BaseModel):
instrument: str
start_date: str
end_date: str
resolution: str = "1h"
class GetDvolReq(BaseModel):
currency: str = "BTC"
start_date: str
end_date: str
resolution: str = "1D"
class GetDvolHistoryReq(BaseModel):
currency: str = "BTC"
lookback_days: int = 90
class GetIvRankReq(BaseModel):
instrument: str
class GetRealizedVolReq(BaseModel):
currency: str = "BTC"
windows: list[int] = [14, 30]
class GetGexReq(BaseModel):
currency: str
expiry_from: str | None = None
expiry_to: str | None = None
top_n_strikes: int = 50
class OptionFlowReq(BaseModel):
"""Body comune per indicatori option-flow (dealer gamma, vanna/charm,
OI-weighted skew, smile asymmetry, ATM vs wings)."""
currency: str
expiry_from: str | None = None
expiry_to: str | None = None
top_n_strikes: int = 100
class GetPcRatioReq(BaseModel):
currency: str
class GetSkew25dReq(BaseModel):
currency: str
expiry: str
class GetTermStructureReq(BaseModel):
currency: str
class CalculateSpreadPayoffReq(BaseModel):
legs: list[dict]
quote_currency: str = "USD"
class RunBacktestReq(BaseModel):
strategy_name: str
underlying: str = "BTC"
lookback_days: int = 30
resolution: str = "4h"
entry_rules: dict | None = None
exit_rules: dict | None = None
class FindByDeltaReq(BaseModel):
currency: str
expiry: str
target_delta: float
option_type: str
max_results: int = 3
min_open_interest: float = 100.0
min_volume_24h: float = 20.0
class GetIndicatorsReq(BaseModel):
instrument: str
indicators: list[str]
start_date: str
end_date: str
resolution: str = "1h"
@field_validator("indicators", mode="before")
@classmethod
def _coerce_indicators(cls, v):
if isinstance(v, str):
import json
s = v.strip()
if s.startswith("["):
try:
parsed = json.loads(s)
if isinstance(parsed, list):
return [str(x).strip() for x in parsed if str(x).strip()]
except json.JSONDecodeError:
pass
return [x.strip() for x in s.split(",") if x.strip()]
if isinstance(v, list):
return v
raise ValueError(
"indicators must be a list like ['rsi','atr','macd'] "
"or a comma-separated string like 'rsi,atr,macd'"
)
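The coercion rules above accept a JSON array string, a comma-separated string, or an actual list. As a standalone sketch (a hypothetical `coerce_indicators` mirroring the validator body):

```python
import json

def coerce_indicators(v):
    # JSON-looking strings are parsed first; on failure (or for plain
    # strings) fall back to comma-splitting. Lists pass through unchanged.
    if isinstance(v, str):
        s = v.strip()
        if s.startswith("["):
            try:
                parsed = json.loads(s)
                if isinstance(parsed, list):
                    return [str(x).strip() for x in parsed if str(x).strip()]
            except json.JSONDecodeError:
                pass
        return [x.strip() for x in s.split(",") if x.strip()]
    if isinstance(v, list):
        return v
    raise ValueError("indicators must be a list or a comma-separated string")
```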
class PlaceOrderReq(BaseModel):
instrument_name: str
side: str # "buy" | "sell"
amount: float
type: str = "limit"
price: float | None = None
reduce_only: bool = False
post_only: bool = False
label: str | None = None
leverage: int | None = None # CER-016: None → default cap (3x)
model_config = {
"json_schema_extra": {
"examples": [
{
"summary": "Market buy 0.1 BTC perpetual",
"value": {
"instrument_name": "BTC-PERPETUAL",
"type": "market",
"amount": 0.1,
"side": "buy",
},
}
]
}
}
class ComboLeg(BaseModel):
instrument_name: str
direction: str # "buy" | "sell"
ratio: int = 1
class PlaceComboOrderReq(BaseModel):
legs: list[ComboLeg]
side: str # "buy" | "sell"
amount: float
type: str = "limit"
price: float | None = None
label: str | None = None
leverage: int | None = None
@model_validator(mode="after")
def _at_least_two_legs(self):
if len(self.legs) < 2:
raise ValueError("combo requires at least 2 legs")
return self
class CancelOrderReq(BaseModel):
order_id: str
class SetStopLossReq(BaseModel):
order_id: str
stop_price: float
class SetTakeProfitReq(BaseModel):
order_id: str
tp_price: float
class ClosePositionReq(BaseModel):
instrument_name: str
# === Tools (reads) ===
async def is_testnet(client: DeribitClient) -> dict:
return client.is_testnet()
async def environment_info(
client: DeribitClient, *, creds: dict, env_info: Any | None = None
) -> dict:
if env_info is None:
return {
"exchange": "deribit",
"environment": "testnet" if client.is_testnet().get("testnet") else "mainnet",
"source": "credentials",
"env_value": None,
"base_url": client.base_url,
"max_leverage": get_max_leverage(creds),
}
return {
"exchange": env_info.exchange,
"environment": env_info.environment,
"source": env_info.source,
"env_value": env_info.env_value,
"base_url": env_info.base_url,
"max_leverage": get_max_leverage(creds),
}
async def get_ticker(client: DeribitClient, params: GetTickerReq) -> dict:
    assert params.instrument_name is not None  # validator guarantees non-None
return await client.get_ticker(params.instrument_name)
async def get_ticker_batch(client: DeribitClient, params: GetTickerBatchReq) -> dict:
    assert params.instrument_names is not None  # validator guarantees non-None
return await client.get_ticker_batch(params.instrument_names)
async def get_instruments(client: DeribitClient, params: GetInstrumentsReq) -> dict:
return await client.get_instruments(
currency=params.currency,
kind=params.kind,
expiry_from=params.expiry_from,
expiry_to=params.expiry_to,
strike_min=params.strike_min,
strike_max=params.strike_max,
min_open_interest=params.min_open_interest,
limit=params.limit,
offset=params.offset,
)
async def get_orderbook(client: DeribitClient, params: GetOrderbookReq) -> dict:
return await client.get_orderbook(params.instrument_name, params.depth)
async def get_orderbook_imbalance(
client: DeribitClient, params: OrderbookImbalanceReq
) -> dict:
return await client.get_orderbook_imbalance(params.instrument_name, params.depth)
async def get_positions(client: DeribitClient, params: GetPositionsReq) -> list:
return await client.get_positions(params.currency)
async def get_account_summary(
client: DeribitClient, params: GetAccountSummaryReq
) -> dict:
return await client.get_account_summary(params.currency)
async def get_trade_history(client: DeribitClient, params: GetTradeHistoryReq) -> list:
return await client.get_trade_history(params.limit, params.instrument_name)
async def get_historical(client: DeribitClient, params: GetHistoricalReq) -> dict:
return await client.get_historical(
params.instrument, params.start_date, params.end_date, params.resolution
)
async def get_dvol(client: DeribitClient, params: GetDvolReq) -> dict:
return await client.get_dvol(
params.currency, params.start_date, params.end_date, params.resolution
)
async def get_gex(client: DeribitClient, params: GetGexReq) -> dict:
return await client.get_gex(
params.currency, params.expiry_from, params.expiry_to, params.top_n_strikes
)
async def get_dealer_gamma_profile(
client: DeribitClient, params: OptionFlowReq
) -> dict:
return await client.get_dealer_gamma_profile(
params.currency, params.expiry_from, params.expiry_to, params.top_n_strikes
)
async def get_vanna_charm(client: DeribitClient, params: OptionFlowReq) -> dict:
return await client.get_vanna_charm(
params.currency, params.expiry_from, params.expiry_to, params.top_n_strikes
)
async def get_oi_weighted_skew(client: DeribitClient, params: OptionFlowReq) -> dict:
return await client.get_oi_weighted_skew(
params.currency, params.expiry_from, params.expiry_to, params.top_n_strikes
)
async def get_smile_asymmetry(client: DeribitClient, params: OptionFlowReq) -> dict:
return await client.get_smile_asymmetry(
params.currency, params.expiry_from, params.expiry_to, params.top_n_strikes
)
async def get_atm_vs_wings_vol(client: DeribitClient, params: OptionFlowReq) -> dict:
return await client.get_atm_vs_wings_vol(
params.currency, params.expiry_from, params.expiry_to, params.top_n_strikes
)
async def get_pc_ratio(client: DeribitClient, params: GetPcRatioReq) -> dict:
return await client.get_pc_ratio(params.currency)
async def get_skew_25d(client: DeribitClient, params: GetSkew25dReq) -> dict:
return await client.get_skew_25d(params.currency, params.expiry)
async def get_term_structure(
client: DeribitClient, params: GetTermStructureReq
) -> dict:
return await client.get_term_structure(params.currency)
async def run_backtest(client: DeribitClient, params: RunBacktestReq) -> dict:
return await client.run_backtest(
strategy_name=params.strategy_name,
underlying=params.underlying,
lookback_days=params.lookback_days,
resolution=params.resolution,
entry_rules=params.entry_rules,
exit_rules=params.exit_rules,
)
async def calculate_spread_payoff(
client: DeribitClient, params: CalculateSpreadPayoffReq
) -> dict:
return await client.calculate_spread_payoff(params.legs, params.quote_currency)
async def find_by_delta(client: DeribitClient, params: FindByDeltaReq) -> dict:
return await client.find_by_delta(
currency=params.currency,
expiry=params.expiry,
target_delta=params.target_delta,
option_type=params.option_type,
max_results=params.max_results,
min_open_interest=params.min_open_interest,
min_volume_24h=params.min_volume_24h,
)
async def get_iv_rank(client: DeribitClient, params: GetIvRankReq) -> dict:
return await client.get_iv_rank(params.instrument)
async def get_dvol_history(client: DeribitClient, params: GetDvolHistoryReq) -> dict:
return await client.get_dvol_history(params.currency, params.lookback_days)
async def get_realized_vol(client: DeribitClient, params: GetRealizedVolReq) -> dict:
return await client.get_realized_vol(params.currency, params.windows)
async def get_technical_indicators(
client: DeribitClient, params: GetIndicatorsReq
) -> dict:
return await client.get_technical_indicators(
params.instrument,
params.indicators,
params.start_date,
params.end_date,
params.resolution,
)
# === Tools (writes) ===
async def place_order(
client: DeribitClient, params: PlaceOrderReq, *, creds: dict
) -> dict:
cap_default = get_max_leverage(creds)
lev = _enforce_leverage(params.leverage, creds=creds, exchange="deribit")
if lev != cap_default:
with contextlib.suppress(Exception):
await client.set_leverage(params.instrument_name, lev)
result = await client.place_order(
instrument_name=params.instrument_name,
side=params.side,
amount=params.amount,
type=params.type,
price=params.price,
reduce_only=params.reduce_only,
post_only=params.post_only,
label=params.label,
)
# TODO V2: wire audit via request.state.environment in router
return result
async def place_combo_order(
client: DeribitClient, params: PlaceComboOrderReq, *, creds: dict
) -> dict:
cap_default = get_max_leverage(creds)
lev = _enforce_leverage(params.leverage, creds=creds, exchange="deribit")
if lev != cap_default:
for leg in params.legs:
with contextlib.suppress(Exception):
await client.set_leverage(leg.instrument_name, lev)
result = await client.place_combo_order(
legs=[leg.model_dump() for leg in params.legs],
side=params.side,
amount=params.amount,
type=params.type,
price=params.price,
label=params.label,
)
# TODO V2: wire audit via request.state.environment in router
return result
async def cancel_order(client: DeribitClient, params: CancelOrderReq) -> dict:
result = await client.cancel_order(params.order_id)
# TODO V2: wire audit via request.state.environment in router
return result
async def set_stop_loss(client: DeribitClient, params: SetStopLossReq) -> dict:
result = await client.set_stop_loss(params.order_id, params.stop_price)
# TODO V2: wire audit via request.state.environment in router
return result
async def set_take_profit(client: DeribitClient, params: SetTakeProfitReq) -> dict:
result = await client.set_take_profit(params.order_id, params.tp_price)
# TODO V2: wire audit via request.state.environment in router
return result
async def close_position(client: DeribitClient, params: ClosePositionReq) -> dict:
result = await client.close_position(params.instrument_name)
# TODO V2: wire audit via request.state.environment in router
return result
@@ -6,8 +6,8 @@ import asyncio
import datetime as _dt
from typing import Any
from mcp_common import indicators as ind
from mcp_common.http import async_client
from cerbero_mcp.common import indicators as ind
from cerbero_mcp.common.http import async_client
BASE_LIVE = "https://api.hyperliquid.xyz"
BASE_TESTNET = "https://api.hyperliquid-testnet.xyz"
@@ -52,12 +52,17 @@ class HyperliquidClient:
private_key: str,
testnet: bool = True,
api_wallet_address: str | None = None,
base_url: str | None = None,
):
self.wallet_address = wallet_address
self.private_key = private_key
self.testnet = testnet
self.api_wallet_address = api_wallet_address or wallet_address
self.base_url = BASE_TESTNET if testnet else BASE_LIVE
self._base_url_override = base_url
if base_url:
self.base_url = base_url
else:
self.base_url = BASE_TESTNET if testnet else BASE_LIVE
self._exchange: Any | None = None
# ── SDK exchange (lazy) ────────────────────────────────────
@@ -70,13 +75,16 @@ class HyperliquidClient:
)
if self._exchange is None:
account = Account.from_key(self.private_key)
base_url = (
hl_constants.TESTNET_API_URL if self.testnet else hl_constants.MAINNET_API_URL
)
if self._base_url_override:
sdk_base_url = self._base_url_override
else:
sdk_base_url = (
hl_constants.TESTNET_API_URL if self.testnet else hl_constants.MAINNET_API_URL
)
empty_spot_meta: dict[str, Any] = {"universe": [], "tokens": []}
self._exchange = Exchange(
account,
base_url,
sdk_base_url,
account_address=self.wallet_address,
spot_meta=empty_spot_meta,
)
@@ -309,7 +317,7 @@ class HyperliquidClient:
)
if resp.status_code == 200:
res = resp.json().get("result") or {}
first = next(iter(res.values()), {})
first: dict = next(iter(res.values()), {})
price = (first.get("c") or [None])[0]
spot_price = float(price) if price else None
spot_source = "kraken"
@@ -458,8 +466,10 @@ class HyperliquidClient:
mark = ticker.get("mark_price", 0)
price = round(mark * 1.03, 1) if is_buy else round(mark * 0.97, 1)
elif type in ("stop_market", "stop_loss"):
assert price is not None
ot = {"trigger": {"triggerPx": float(price), "isMarket": True, "tpsl": "sl"}}
elif type == "take_profit":
assert price is not None
ot = {"trigger": {"triggerPx": float(price), "isMarket": True, "tpsl": "tp"}}
else:
ot = {"limit": {"tif": "Gtc"}}
@@ -0,0 +1,335 @@
"""Tool hyperliquid V2: pydantic schemas + async functions.
Ogni funzione prende (client: HyperliquidClient, params: <Req>) e restituisce
un dict (o un model Pydantic). Pure logica, no FastAPI dependency, no ACL.
L'autenticazione bearer è gestita dal middleware in cerbero_mcp.auth;
l'audit verrà cablato dal router via request.state.environment.
"""
from __future__ import annotations
from typing import Any
from pydantic import BaseModel, field_validator, model_validator
from cerbero_mcp.exchanges.hyperliquid.client import HyperliquidClient
from cerbero_mcp.exchanges.hyperliquid.leverage_cap import (
enforce_leverage as _enforce_leverage,
)
from cerbero_mcp.exchanges.hyperliquid.leverage_cap import get_max_leverage
# === Schemas: reads ===
class GetMarketsReq(BaseModel):
pass
class GetTickerReq(BaseModel):
instrument: str
class GetOrderbookReq(BaseModel):
instrument: str
depth: int = 10
class GetPositionsReq(BaseModel):
pass
class GetAccountSummaryReq(BaseModel):
pass
class GetTradeHistoryReq(BaseModel):
limit: int = 100
class GetHistoricalReq(BaseModel):
instrument: str | None = None
asset: str | None = None
start_date: str | None = None
end_date: str | None = None
resolution: str = "1h"
interval: str | None = None
limit: int = 50
model_config = {"extra": "allow"}
@model_validator(mode="after")
def _normalize(self):
from datetime import UTC, datetime, timedelta
sym = self.instrument or self.asset
if not sym:
raise ValueError("instrument (or asset) is required")
self.instrument = sym
if self.interval:
self.resolution = self.interval
if not self.end_date:
self.end_date = datetime.now(UTC).strftime("%Y-%m-%dT%H:%M:%S")
if not self.start_date:
days = max(1, self.limit // 6)
self.start_date = (
datetime.now(UTC) - timedelta(days=days)
).strftime("%Y-%m-%dT%H:%M:%S")
return self
class GetOpenOrdersReq(BaseModel):
pass
class GetFundingRateReq(BaseModel):
instrument: str
class BasisSpotPerpReq(BaseModel):
asset: str
class GetIndicatorsReq(BaseModel):
instrument: str | None = None
asset: str | None = None
indicators: list[str] = ["rsi", "atr", "macd", "adx"]
start_date: str | None = None
end_date: str | None = None
resolution: str = "1h"
interval: str | None = None
limit: int = 50
model_config = {"extra": "allow"}
@model_validator(mode="after")
def _normalize(self):
from datetime import UTC, datetime, timedelta
sym = self.instrument or self.asset
if not sym:
raise ValueError("instrument (or asset) is required")
self.instrument = sym
if self.interval:
self.resolution = self.interval
if not self.end_date:
self.end_date = datetime.now(UTC).strftime("%Y-%m-%dT%H:%M:%S")
if not self.start_date:
days = max(2, self.limit // 6)
self.start_date = (
datetime.now(UTC) - timedelta(days=days)
).strftime("%Y-%m-%dT%H:%M:%S")
return self
@field_validator("indicators", mode="before")
@classmethod
def _coerce_indicators(cls, v):
if isinstance(v, str):
import json
s = v.strip()
if s.startswith("["):
try:
parsed = json.loads(s)
if isinstance(parsed, list):
return [str(x).strip() for x in parsed if str(x).strip()]
except json.JSONDecodeError:
pass
return [x.strip() for x in s.split(",") if x.strip()]
if isinstance(v, list):
return v
raise ValueError(
"indicators must be a list like ['rsi','atr','macd'] "
"or a comma-separated string like 'rsi,atr,macd'"
)
# === Schemas: writes ===
class PlaceOrderReq(BaseModel):
instrument: str
side: str # "buy" | "sell"
amount: float
type: str = "limit"
price: float | None = None
reduce_only: bool = False
leverage: int | None = None # CER-016: None → default cap (3x)
model_config = {
"json_schema_extra": {
"examples": [
{
"summary": "Market buy 0.01 BTC perp",
"value": {
"instrument": "BTC",
"side": "buy",
"amount": 0.01,
"type": "market",
},
}
]
}
}
class CancelOrderReq(BaseModel):
order_id: str
instrument: str
class SetStopLossReq(BaseModel):
instrument: str
stop_price: float
size: float
class SetTakeProfitReq(BaseModel):
instrument: str
tp_price: float
size: float
class ClosePositionReq(BaseModel):
instrument: str
# === Tools (reads) ===
async def environment_info(
client: HyperliquidClient, *, creds: dict, env_info: Any | None = None
) -> dict:
if env_info is None:
return {
"exchange": "hyperliquid",
"environment": "testnet" if getattr(client, "testnet", True) else "mainnet",
"source": "credentials",
"env_value": None,
"base_url": getattr(client, "base_url", None),
"max_leverage": get_max_leverage(creds),
}
return {
"exchange": env_info.exchange,
"environment": env_info.environment,
"source": env_info.source,
"env_value": env_info.env_value,
"base_url": env_info.base_url,
"max_leverage": get_max_leverage(creds),
}
async def get_markets(client: HyperliquidClient, params: GetMarketsReq) -> list[dict]:
return await client.get_markets()
async def get_ticker(client: HyperliquidClient, params: GetTickerReq) -> dict:
return await client.get_ticker(params.instrument)
async def get_orderbook(client: HyperliquidClient, params: GetOrderbookReq) -> dict:
return await client.get_orderbook(params.instrument, params.depth)
async def get_positions(
client: HyperliquidClient, params: GetPositionsReq
) -> list[dict]:
return await client.get_positions()
async def get_account_summary(
client: HyperliquidClient, params: GetAccountSummaryReq
) -> dict:
return await client.get_account_summary()
async def get_trade_history(
client: HyperliquidClient, params: GetTradeHistoryReq
) -> list[dict]:
return await client.get_trade_history(params.limit)
async def get_historical(
client: HyperliquidClient, params: GetHistoricalReq
) -> dict:
    assert params.instrument is not None  # validator guarantees non-None
    assert params.start_date is not None  # validator guarantees non-None
    assert params.end_date is not None  # validator guarantees non-None
return await client.get_historical(
params.instrument, params.start_date, params.end_date, params.resolution
)
async def get_open_orders(
client: HyperliquidClient, params: GetOpenOrdersReq
) -> list[dict]:
return await client.get_open_orders()
async def get_funding_rate(
client: HyperliquidClient, params: GetFundingRateReq
) -> dict:
return await client.get_funding_rate(params.instrument)
async def basis_spot_perp(
client: HyperliquidClient, params: BasisSpotPerpReq
) -> dict:
return await client.basis_spot_perp(params.asset)
async def get_indicators(
client: HyperliquidClient, params: GetIndicatorsReq
) -> dict:
    assert params.instrument is not None  # validator guarantees non-None
    assert params.start_date is not None  # validator guarantees non-None
    assert params.end_date is not None  # validator guarantees non-None
return await client.get_indicators(
params.instrument,
params.indicators,
params.start_date,
params.end_date,
params.resolution,
)
# === Tools (writes) ===
async def place_order(
client: HyperliquidClient, params: PlaceOrderReq, *, creds: dict
) -> dict:
_enforce_leverage(params.leverage, creds=creds, exchange="hyperliquid")
result = await client.place_order(
instrument=params.instrument,
side=params.side,
amount=params.amount,
type=params.type,
price=params.price,
reduce_only=params.reduce_only,
)
# TODO V2: wire audit via request.state.environment in router
return result
async def cancel_order(
client: HyperliquidClient, params: CancelOrderReq
) -> dict:
return await client.cancel_order(params.order_id, params.instrument)
async def set_stop_loss(
client: HyperliquidClient, params: SetStopLossReq
) -> dict:
return await client.set_stop_loss(
params.instrument, params.stop_price, params.size
)
async def set_take_profit(
client: HyperliquidClient, params: SetTakeProfitReq
) -> dict:
return await client.set_take_profit(
params.instrument, params.tp_price, params.size
)
async def close_position(
client: HyperliquidClient, params: ClosePositionReq
) -> dict:
return await client.close_position(params.instrument)
@@ -0,0 +1,20 @@
"""MacroClient: contenitore credenziali per data provider macro (FRED, Finnhub).
Macro è un read-only data provider: nessun testnet/mainnet, nessuna sessione
HTTP persistente — i `fetchers.py` aprono client httpx ad-hoc tramite
`cerbero_mcp.common.http.async_client`. Questo wrapper esiste solo per uniformità
con il pattern degli altri exchange (DeribitClient/BybitClient/...) e per essere
istanziato dal `ClientRegistry`.
"""
from __future__ import annotations
class MacroClient:
"""Wrapper credenziali FRED/Finnhub. Stateless, no HTTP session."""
def __init__(self, *, fred_api_key: str = "", finnhub_api_key: str = "") -> None:
self.fred_api_key = fred_api_key
self.finnhub_api_key = finnhub_api_key
async def aclose(self) -> None: # pragma: no cover - no-op, no resources
return None
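Since `MacroClient` holds no resources, its registry lifecycle is trivial; a minimal usage sketch (restating the class so the snippet is self-contained, with a placeholder key):

```python
import asyncio

class MacroClient:
    """Stateless credential container (mirrors the class above)."""
    def __init__(self, *, fred_api_key: str = "", finnhub_api_key: str = "") -> None:
        self.fred_api_key = fred_api_key
        self.finnhub_api_key = finnhub_api_key
    async def aclose(self) -> None:  # no-op: nothing to release
        return None

# Registry-style lifecycle: construct with creds, later close (a no-op here).
client = MacroClient(fred_api_key="demo-key")
asyncio.run(client.aclose())
```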
@@ -4,10 +4,15 @@ from datetime import UTC, datetime, timedelta
from typing import Any
import httpx
from mcp_common.http import async_client
from mcp_macro.cot import classify_extreme, compute_percentile, parse_disagg_row, parse_tff_row
from mcp_macro.cot_contracts import (
from cerbero_mcp.common.http import async_client
from cerbero_mcp.exchanges.macro.cot import (
classify_extreme,
compute_percentile,
parse_disagg_row,
parse_tff_row,
)
from cerbero_mcp.exchanges.macro.cot_contracts import (
ALL_DISAGG_SYMBOLS,
ALL_TFF_SYMBOLS,
CFTC_BASE_URL,
@@ -84,7 +89,7 @@ async def fetch_asset_price(ticker: str) -> dict[str, Any]:
now = time.monotonic()
cached = _ASSET_CACHE.get(key)
if cached and (now - cached["ts"]) < _ASSET_CACHE_TTL:
return cached["data"]
return cached["data"] # type: ignore[no-any-return]
mapping = ASSET_TICKER_MAP.get(key)
if not mapping:
@@ -132,7 +137,7 @@ async def fetch_treasury_yields() -> dict[str, Any]:
now = time.monotonic()
if _TREASURY_CACHE["data"] and (now - _TREASURY_CACHE["ts"]) < _TREASURY_TTL:
return _TREASURY_CACHE["data"]
return _TREASURY_CACHE["data"] # type: ignore[no-any-return]
symbols = [
("us2y", "^UST2YR"),
@@ -150,7 +155,7 @@ async def fetch_treasury_yields() -> dict[str, Any]:
spread = None
if yields.get("us10y") is not None and yields.get("us2y") is not None:
spread = round(yields["us10y"] - yields["us2y"], 3)
spread = round(yields["us10y"] - yields["us2y"], 3) # type: ignore[operator]
shape = "unknown"
if spread is not None:
if spread > 0.25:
@@ -581,7 +586,7 @@ async def fetch_market_overview() -> dict[str, Any]:
now = time.monotonic()
if _MARKET_CACHE["data"] is not None and (now - _MARKET_CACHE["ts"]) < _MARKET_CACHE_TTL:
return _MARKET_CACHE["data"]
return _MARKET_CACHE["data"] # type: ignore[no-any-return]
async with async_client(timeout=10.0) as client:
global_data: dict[str, Any] = {}
