Compare commits

...

76 Commits

Author SHA1 Message Date
root f8fb50cb83 fix(V2): map Deribit upstream 5xx / non-JSON to clean HTTPException 502
Cloudflare 5xx pages from Deribit testnet were leaking through the JSON
parser as JSONDecodeError → UNHANDLED_EXCEPTION. Wrap response parsing so
upstream errors surface as a retryable HTTP_502 envelope instead.
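The wrapping described above can be sketched roughly as follows (a minimal illustration; `parse_upstream` and `UpstreamError` are hypothetical names, not the project's actual symbols):

```python
import json


class UpstreamError(Exception):
    """Raised when the upstream body is not valid JSON (e.g. a Cloudflare 5xx page)."""

    def __init__(self, status_code: int, body: str) -> None:
        super().__init__(f"upstream returned {status_code} with a non-JSON body")
        self.status_code = status_code
        self.excerpt = body[:200]  # keep a short excerpt for logging, not the full page


def parse_upstream(status_code: int, body: str) -> dict:
    # Wrap JSON parsing so an HTML error page surfaces as a typed, retryable
    # error (mapped to a 502 envelope by the caller) instead of a raw
    # JSONDecodeError bubbling up as UNHANDLED_EXCEPTION.
    try:
        return json.loads(body)
    except json.JSONDecodeError as exc:
        raise UpstreamError(status_code, body) from exc
```

A FastAPI handler would then catch `UpstreamError` and re-raise it as an `HTTPException` with status 502.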

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-10 08:27:33 +00:00
root 880faa7fd4 refactor(V2): IBKR final review fixes — WS shutdown, conid match, clock note
Final code-review fixes:
- __main__: lifespan stops IBKRWebSocket singletons before registry close
- close_position: resolve symbol→conid first, match positions on conid
  (was matching contractDesc which is a long display string, not ticker)
- close_all_positions: prefer ticker field, fallback to contractDesc
- get_clock: explicit approximate=true + note about US holidays/half-days

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 21:46:11 +00:00
root cddf88afb4 feat(V2): IBKR OAuth setup script + docker secrets mount + docs
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 21:40:06 +00:00
root 55bfeca88e feat(V2): IBKR key rotation admin endpoints + health probe
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 21:37:29 +00:00
root bea37fd734 feat(V2): IBKR router wiring + build_client + WS singleton DI
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 21:35:28 +00:00
root 6940e2865b feat(V2): IBKR key rotation manager with auto-rollback
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 21:32:25 +00:00
root bdc40929d4 feat(V2): IBKR complex order tools (bracket/OCO/OTO)
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 21:30:05 +00:00
root 9bbc8c05f1 feat(V2): IBKR complex order payload builders (bracket/OCO/OTO)
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 21:27:26 +00:00
root 3510605fdd feat(V2): IBKR simple write tools (place/amend/cancel/close)
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 21:25:34 +00:00
root 8914d613ec feat(V2): IBKR streaming tools (tick/depth/subscribe)
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 21:23:39 +00:00
root 531b7b019c feat(V2): IBKR read tool schemas + dispatch functions
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 21:21:24 +00:00
root 6266708e15 refactor(V2): IBKR WebSocket — fix stop/start cycle, guard rails, log disconnect
Code review fixes (commit 17700d2):
- _stopped reset on start() (was stuck True after stop→start)
- _require_started guard on subscribe_*/unsubscribe (clear WSError vs AttributeError)
- _reader_loop logs disconnect via logger.warning + sets _ws=None for `connected` signal
- Class docstring documents stale-snapshot behavior + deferred reconnect
- New tests: subscribe-before-start, stop→start cycle resumption
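The guard rails in the bullets above reduce to a small state machine, sketched here with illustrative names (this is not the project's actual IBKRWebSocket class):

```python
class WSError(Exception):
    pass


class WSGuard:
    def __init__(self) -> None:
        self._started = False
        self._stopped = False

    def start(self) -> None:
        # Reset _stopped so a stop→start cycle can resume cleanly
        # (previously the flag stayed True and blocked restarts).
        self._stopped = False
        self._started = True

    def stop(self) -> None:
        self._stopped = True
        self._started = False

    def _require_started(self) -> None:
        # Fail with a clear typed error instead of a later AttributeError.
        if not self._started or self._stopped:
            raise WSError("websocket not started; call start() first")

    def subscribe(self, symbol: str) -> str:
        self._require_started()
        return f"subscribed:{symbol}"
```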

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 21:18:57 +00:00
root 17700d27a0 feat(V2): IBKR WebSocket layer + tick/depth snapshot cache
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 21:15:25 +00:00
root 12002642e5 refactor(V2): IBKR client — remove dead whitelist + max_cycles test
Code review polish (commit b9c58a3):
- Remove unused _AUTO_CONFIRM_WHITELIST (was scaffolding, never wired)
- Replace with policy comment documenting auto-confirm behavior
- New test: test_place_order_too_many_confirmations

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 20:37:37 +00:00
root b9c58a376f feat(V2): IBKR write methods + auto-confirm warning flow
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 20:34:43 +00:00
root ded4414b32 refactor(V2): IBKR client read methods — defensive conid + sec_type DRY
Code review fixes (commit 611a269):
- resolve_conid validates conid key presence (was raw KeyError on malformed)
- _SEC_TYPE_MAP module constant — reused in get_ticker + get_bars
  (also fixes get_bars previously missing "forex": "CASH")
- New tests: empty response + malformed response error paths

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 20:32:50 +00:00
root 611a2695a9 feat(V2): IBKR client read methods + conid LRU cache
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 20:29:11 +00:00
root f4f4e4efd7 refactor(V2): IBKR client — log tickle, type _http, retry once on 401
Code review fixes (commit 0c74691):
- _maybe_tickle logs failures via logger.debug instead of silent pass
- _http typed as httpx.AsyncClient | None
- 30s timeout commented (matches Alpaca, IBKR gateway latency)
- _request retries once on 401 with forced LST refresh (spec §4 IBKR_AUTH_FAILED)
- New tests: test_request_retries_once_on_401, test_request_raises_on_persistent_401
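The retry-once-on-401 flow can be sketched with stand-in callables for the real httpx request and LST refresh (names are illustrative, not the client's actual API):

```python
from typing import Callable


def request_with_auth_retry(
    do_request: Callable[[str], int],
    refresh_token: Callable[[], str],
    token: str,
) -> int:
    status = do_request(token)
    if status == 401:
        # One forced token refresh, then a single retry;
        # a persistent 401 surfaces as a typed auth failure.
        token = refresh_token()
        status = do_request(token)
        if status == 401:
            raise PermissionError("IBKR_AUTH_FAILED: 401 after forced LST refresh")
    return status
```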

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 20:27:37 +00:00
root 0c74691e7c feat(V2): IBKR client base + auth header + tickle keep-alive
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 20:22:51 +00:00
root b49b2b36e0 refactor(V2): IBKR OAuth — named constants, explicit raises, lifted import
Code review fixes (commit 92da6aa):
- LST refresh buffer / fallback TTL extracted as named module constants
- Replace `assert` with explicit `if/raise` (asserts stripped under -O)
- Move IBKRAuthError above OAuth1aSigner (forward declaration)
- async_client import lifted to module level
- Test uses actual prime (23) instead of composite (255)

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 20:20:40 +00:00
root 92da6aa842 feat(V2): IBKR live session token mint via DH key exchange
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 20:15:17 +00:00
root a90c5c4d6f refactor(V2): IBKR OAuth signer — type tightening + verify-based test
Code review polish:
- _signature_key/_encryption_key typed as RSAPrivateKey | None
- sign() uses assert instead of type: ignore
- test_oauth_signer_signs_with_rsa verifies signature against public key
- Clarifying comments on %3D/%26 manual encoding and Task 3 imports
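The %3D/%26 manual encoding mentioned above comes from OAuth 1.0a signature-base-string construction (RFC 5849 §3.4.1): each key=value pair is percent-encoded, then the whole normalized parameter string is percent-encoded again, turning `=` into `%3D` and `&` into `%26`. A minimal stdlib sketch (parameter names are illustrative):

```python
from urllib.parse import quote


def signature_base_string(method: str, url: str, params: dict[str, str]) -> str:
    # Keys and values are percent-encoded, sorted, joined with = and &;
    # the normalized string is then encoded again inside the base string,
    # which is where the literal %3D and %26 come from.
    encoded = sorted(
        (quote(k, safe=""), quote(v, safe="")) for k, v in params.items()
    )
    normalized = "&".join(f"{k}={v}" for k, v in encoded)
    return "&".join(
        [method.upper(), quote(url, safe=""), quote(normalized, safe="")]
    )
```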

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 20:12:48 +00:00
root ae63aaf69a feat(V2): IBKR OAuth1a signer + RSA-SHA256 signature
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 20:08:15 +00:00
root 92cc45c896 refactor(V2): IBKR settings — TypedDict return + docstrings
Code review polish:
- credentials() returns IBKRCredentials TypedDict (was bare dict)
- Method docstring matching Deribit pattern
- Inline comment explaining account_id env-only design

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 20:04:08 +00:00
root 3a85ff05e6 feat(V2): IBKR settings + env-specific credentials
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 20:00:15 +00:00
root 391f2c02e0 docs(V2): IBKR integration implementation plan
16 TDD-disciplined tasks with 94 checkbox steps, referencing the
2026-05-03 spec. Each task follows red-green-commit with the complete
code in the step.
Covers: settings, OAuth1a signer + DH LST mint, IBKRClient REST + conid
cache + tickle, IBKRWebSocket tick/depth snapshot-on-demand, simple +
complex orders (bracket/OCO/OTO), KeyRotationManager with auto-rollback,
admin endpoints, router wiring, OAuth setup script, docs.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 19:55:38 +00:00
root 109b8e4686 docs(V2): IBKR integration design spec
Approach A2 (Client Portal Web API + OAuth 1.0a Self-Service): fully
unattended REST with a one-time initial setup, no Java sidecar.
V1 scope includes: simple + complex orders (bracket/OCO/OTO),
snapshot-on-demand WebSocket streaming (tick + depth), semi-automatic
key rotation with auto-rollback. Atomic 8-commit plan, ~6-8 dev days.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-03 19:23:08 +00:00
root 1ca1687c9b feat(V2): Deribit per-env credentials (CLIENT_ID/SECRET _TESTNET / _LIVE)
DeribitSettings now supports distinct credential pairs for testnet and
mainnet via DERIBIT_CLIENT_ID_TESTNET/_LIVE and DERIBIT_CLIENT_SECRET_TESTNET/_LIVE.
The env-specific pairs take precedence over the base
DERIBIT_CLIENT_ID/DERIBIT_CLIENT_SECRET pair (kept for backward compat).

build_client resolves the right pair via settings.deribit.credentials(env);
an explicit ValueError is raised if no pair is configured for the requested env.

+4 tests (legacy single, per-env, override, missing). Also fixes isolation
from the real .env via monkeypatch.chdir(tmp_path).
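The resolution order can be sketched as follows (a simplified stand-in for the settings logic; the real code reads DeribitSettings fields rather than os.environ directly):

```python
import os


def deribit_credentials(env: str) -> tuple[str, str]:
    # Env-specific pair wins; the base pair is the backward-compatible fallback.
    suffix = "_TESTNET" if env == "testnet" else "_LIVE"
    client_id = os.environ.get(f"DERIBIT_CLIENT_ID{suffix}") or os.environ.get(
        "DERIBIT_CLIENT_ID"
    )
    secret = os.environ.get(f"DERIBIT_CLIENT_SECRET{suffix}") or os.environ.get(
        "DERIBIT_CLIENT_SECRET"
    )
    if not client_id or not secret:
        # Explicit error instead of a confusing downstream auth failure.
        raise ValueError(f"no Deribit credentials configured for env={env!r}")
    return client_id, secret
```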

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 14:46:47 +00:00
root 8a0f37ebc2 fix(V2): get_account_summary error path → numeric fields None instead of 0
On a Deribit error (failed auth, etc.) the equity/balance/margin/
available/unrealized_pnl/total_pnl fields are now None: a clear signal of
"value unknown" vs "balance actually zero". Resolves ambiguity for clients
that read equity=0 without noticing the error field.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 13:00:57 +00:00
root 6640ede3df fix(V2): Deribit _authenticate handles error envelope (no more KeyError: 'result')
When Deribit replies with {"error": {...}} on public/auth (wrong creds,
missing scope, env mismatch), the client blew up with KeyError: 'result' →
500 UNHANDLED_EXCEPTION on the private tools (get_account_summary,
get_positions). _authenticate now raises a typed DeribitAuthError, and
_request converts it into an error envelope consistent with the rest of the flow.
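The envelope handling reduces to a small pattern, sketched here with illustrative names (not the client's actual internals):

```python
class DeribitAuthError(Exception):
    def __init__(self, code: int, message: str) -> None:
        super().__init__(f"Deribit auth failed [{code}]: {message}")
        self.code = code


def extract_auth_result(payload: dict) -> dict:
    # A JSON-RPC error reply on public/auth raises a typed error
    # instead of crashing on payload["result"].
    if "error" in payload:
        err = payload["error"]
        raise DeribitAuthError(err.get("code", -1), err.get("message", "unknown"))
    return payload["result"]
```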

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 12:52:43 +00:00
root d8136713b9 feat(V2): Traefik integration with TLS + watchtower, direct port mapping removed
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 09:21:52 +00:00
AdrianoDev 9e7b98579b chore(V2): branch V2.0.0 as the default deploy branch (no merge into main)
deploy-vps.sh: BRANCH defaults to V2.0.0 instead of main.
README: clone with -b V2.0.0, note that the production branch is V2.0.0.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 10:31:58 +02:00
AdrianoDev 51081f4e18 feat(V2): deploy-vps.sh for clone-based deploy (no registry)
Deployment now works by cloning the repo directly on the VPS, building
the image in place and restarting the container. Replaces the
build & push to registry + Watchtower workflow.

The script automates:
- git fetch + reset --hard origin/<branch>
- docker compose build
- graceful restart (down 15s + up -d)
- healthcheck wait with configurable timeout
- automatic rollback to the previous SHA if /health fails

Variables: BRANCH, PORT, HEALTH_TIMEOUT_SECONDS, FORCE, SKIP_ROLLBACK.

Removed scripts/build-push.sh (the registry workflow is abandoned).
README updated with the new procedure.
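The healthcheck-wait + rollback decision can be sketched in miniature (the probe and rollback callables stand in for the `curl /health` loop and the git reset + rebuild; the real script is shell):

```python
import time
from typing import Callable


def wait_healthy(
    probe: Callable[[], bool],
    timeout_s: float,
    interval_s: float = 0.01,
) -> bool:
    # Poll the health probe until it succeeds or the deadline passes.
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if probe():
            return True
        time.sleep(interval_s)
    return False


def deploy_or_rollback(probe, rollback, timeout_s=0.1) -> str:
    # If /health never comes up within the timeout, roll back to the
    # previous SHA instead of leaving a broken container running.
    if wait_healthy(probe, timeout_s):
        return "deployed"
    rollback()
    return "rolled-back"
```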

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 09:05:26 +02:00
AdrianoDev 8ecc1a24a9 feat(V2): /health/ready with client pings + structured request-log middleware + request_id correlation
- /health/ready: pings all cached clients (exchange, env) with a 2s
  timeout, status ready|degraded|not_ready, opt-in 503 via
  READY_FAILS_ON_DEGRADED.
- mcp.request middleware: one JSON line per HTTP request with request_id,
  method, path, status_code, duration_ms, actor, bot_tag, exchange,
  tool, client_ip, user_agent.
- request_id propagated into request.state, the audit log and the error
  envelope for cross-cutting correlation.
- Added async health() as a minimal probe to bybit/alpaca/macro/
  sentiment/deribit (hyperliquid already had one).
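The ready|degraded|not_ready aggregation can be sketched as pure logic (client pinging, caching and the 2s timeout are omitted; function names are illustrative):

```python
def readiness_status(pings: dict[str, bool]) -> str:
    # All clients healthy → ready; none → not_ready; mixed → degraded.
    ok = sum(1 for healthy in pings.values() if healthy)
    if ok == len(pings):
        return "ready"
    if ok == 0:
        return "not_ready"
    return "degraded"


def http_status(status: str, fails_on_degraded: bool) -> int:
    # Degraded returns 200 by default; opt-in 503 via READY_FAILS_ON_DEGRADED.
    if status == "ready":
        return 200
    if status == "degraded" and not fails_on_degraded:
        return 200
    return 503
```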

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 09:03:28 +02:00
AdrianoDev 9afd087152 docs(V2): update test count 259 → 310 in README
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 08:52:11 +02:00
AdrianoDev 69ac878893 feat(V2): mandatory X-Bot-Tag header + /admin/audit endpoint with filters
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 08:51:40 +02:00
AdrianoDev bd6b03ce43 feat(V2): wire audit logging into the write endpoints of the 4 exchange routers
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 08:44:28 +02:00
AdrianoDev 43bf8fc461 chore(V2): remove obsolete SDKs (pybit, alpaca-py, hyperliquid-python-sdk)
Final sweep of task #14: all 4 exchange clients now use pure httpx
(deribit already did; bybit/alpaca/hyperliquid were rewritten in the
previous commits). Also removed the mypy override for the SDK modules
that are no longer needed.

Final quality gate:
- 292 tests passing (all)
- mypy: 0 issues
- ruff: clean
- No SDK imports in src/

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 01:39:53 +02:00
AdrianoDev c0b4cb5d5c refactor(V2): hyperliquid client from SDK to httpx + eth-account EIP-712 (V1 parity)
HyperliquidClient fully rewritten on pure httpx + eth-account for EIP-712
L1 signing (chainId 1337, phantom agent source 'a'/'b' for
mainnet/testnet). Bit-parity verified against hyperliquid.utils.signing
in test_signing_parity_with_canonical_sdk.

16 public methods, 26 passing tests. Added deps: eth-account, msgpack,
eth-utils. hyperliquid-python-sdk is still in pyproject; removed in the
final sweep.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 01:39:23 +02:00
AdrianoDev 44c7a18d3e refactor(V2): alpaca client from alpaca-py to pure httpx (V1 parity)
Rewrites `AlpacaClient` on `httpx.AsyncClient`, removing every runtime
dependency on `alpaca-py`. 4 distinct base endpoints (paper/live trading,
stock data, crypto data, options data) handled via a `_request` helper
with `APCA-API-KEY-ID` / `APCA-API-SECRET-KEY` headers. Constructor
signature and public attributes (`paper`, `base_url`) unchanged; the
`base_url` override applies to the trading endpoint only. New `aclose()`
for connection cleanup.

Tests rewritten on `pytest-httpx` (29 alpaca tests + leverage cap).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 01:38:23 +02:00
AdrianoDev 6097dde4e4 refactor(V2): bybit client from pybit to pure httpx (V1 parity) 2026-05-01 01:35:26 +02:00
AdrianoDev 95b8bcfe96 docs(V2): update README with .env URL overrides, src layout, quality gate
- Added features: upstream URL overrides via DERIBIT_URL_*, BYBIT_URL_*, etc.
- Added quality badges: 259 tests, mypy clean, ruff clean
- Added src/cerbero_mcp/ layout section
- Documented SDK quirks around URL overrides (Bybit pybit endpoint, Alpaca trading-only)
- Linked the plan as well as the spec

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 00:04:50 +02:00
AdrianoDev 697d118522 chore(V2): mypy clean — root-cause fixes in new V2 code + targeted suppressions in legacy V1
- settings.py: lambda factory + type:ignore[call-arg] for env-loaded models
- routers/*.py (6 files): explicit Environment / Client casts for request.state
- __main__.py: Literal env cast in the builder, type:ignore on Settings()
- server.py: type:ignore[method-assign] on app.openapi
- deribit/tools.py: asserts on validator-normalized fields, list return type
- deribit/client.py: targeted type:ignore for no-any-return / has-type, rename types→types_list
- hyperliquid/{client,tools}.py: asserts on validator-normalized fields, var-annotated
- alpaca/client.py: targeted type:ignore for SDK quirks (assignment, no-any-return, arg-type, union-attr)
- {macro,sentiment}/fetchers.py: targeted type:ignore for no-any-return / operator / union-attr

Mypy: 68 → 0 errors. Tests: 259 passing. Ruff: clean.
2026-04-30 20:43:03 +02:00
AdrianoDev 436dfd6f5a feat(V2): exchange URLs configurable from .env (DERIBIT_URL_*, BYBIT_URL_*, etc.) 2026-04-30 20:36:31 +02:00
AdrianoDev b71c66917c chore(V2): quality-gate cleanup
- ruff: contextlib.suppress instead of try/except/pass (client_registry, test_env_routing)
- removed legacy services/ (leftover from git rm)
- fixed integration test fixture: removed the sys.modules.pop that polluted module references in later tests (test_audit, test_client_init_default_http)

254 tests passing. Ruff: clean. Mypy: 68 pre-existing warnings from migrated V1 code (strict=false).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-30 19:02:55 +02:00
AdrianoDev b552127479 docs(V2): README rewritten for the V2.0.0 architecture
- Single Docker image, testnet/mainnet routing via bearer token
- Configuration entirely in .env
- Swagger at /apidocs
- V1 → V2 migration section with field-mapping table
- Reference to the design spec

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-30 19:00:33 +02:00
AdrianoDev 50bc6b64b4 chore(V2): build-push.sh builds a single V2.0.0 image; deploy-noclone.sh removed
The script now publishes a single cerbero-mcp:2.0.0 tag + :latest + :sha-<short>.
deploy-noclone.sh was specific to the V1 multi-image workflow.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-30 18:59:27 +02:00
AdrianoDev ec42d141bd chore(V2): remove V1 compose overlays (prod, local, traefik) and DEPLOYMENT.md
Useful content from DEPLOYMENT.md will be folded into the new V2 README.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-30 18:58:51 +02:00
AdrianoDev 6d19165d9e chore(V2): remove services/, gateway/, secrets/, docker/ (legacy V1)
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-30 18:58:11 +02:00
AdrianoDev 1c1b3e1570 test(V2): smoke script with testnet bearer 2026-04-30 18:57:07 +02:00
AdrianoDev cee7f7ca2f feat(V2): minimal docker-compose.yml (1 service, env_file .env) 2026-04-30 18:55:23 +02:00
AdrianoDev 6148461ac1 feat(V2): single multi-stage Dockerfile in root
Multi-stage builder/runtime build, uv sync --frozen, non-root user uid 1000,
healthcheck on /health, CMD cerbero-mcp. Smoke test passed: /health 200 OK,
cerbero-mcp:2.0.0 image 229MB content size.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-30 18:54:38 +02:00
AdrianoDev f34452b2dd test(V2): integration env routing for each exchange (constructor spy)
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-30 18:51:30 +02:00
AdrianoDev a53efb7a29 feat(V2): __main__ with lifespan + 6 routers + integration test
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-30 18:48:56 +02:00
AdrianoDev f56df197e1 feat(V2): sentiment migration complete (read-only, env ignored)
- exchanges/sentiment/{client,fetchers,tools}.py: stateless SentimentClient wrapper (cryptopanic_key, lunarcrush_key)
- routers/sentiment.py: 9 POST tools under /mcp-sentiment (news, social, funding, OI, liquidations, cointegration)
- exchanges/__init__.py: builder branch for sentiment (env ignored)
- tests/unit/exchanges/sentiment: migrated test_fetchers, dropped the V1-only test_server_acl
- tests/unit/test_exchanges_builder.py: added test_build_client_sentiment_no_env_distinction
- fetchers.py: env var lookup aligned to LUNARCRUSH_KEY (with LUNARCRUSH_API_KEY fallback)

241 tests passing.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-30 18:46:48 +02:00
AdrianoDev 88bd4e7bde feat(V2): macro migration complete (read-only, env ignored)
- exchanges/macro: cot.py + cot_contracts.py + fetchers.py copied 1:1 with
  import rewrites mcp_common -> cerbero_mcp.common, mcp_macro -> cerbero_mcp.exchanges.macro
- new stateless MacroClient wrapper: carries only fred_api_key/finnhub_api_key,
  no HTTP session (the fetchers use an ad-hoc async_client)
- tools.py: 11 tools (get_treasury_yields, get_yield_curve_slope,
  get_breakeven_inflation, get_economic_indicators, get_macro_calendar,
  get_market_overview, get_equity_futures, get_asset_price, get_cot_tff,
  get_cot_disaggregated, get_cot_extreme_positioning) — no writes,
  no leverage_cap
- routers/macro.py: prefix /mcp-macro, 11 POST /tools/* routes
- macro builder branch: same credentials for testnet/mainnet (env ignored);
  the registry instantiates 2 entries, negligible cost (stateless wrapper)
- tests migrated: test_cot.py + test_fetchers.py (test_server_acl.py skipped, V1-only)
- new test test_build_client_macro_no_env_distinction in test_exchanges_builder.py

Suite: 224 passed.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-30 18:42:55 +02:00
AdrianoDev 1b8ba0ef9c feat(V2): alpaca migration complete
Task 6.7: port alpaca from services/mcp-alpaca to src/cerbero_mcp.
client.py + leverage_cap.py copied 1:1 (default cap 1 cash).
tools.py: 17 tools without ACL/Principal/audit. /mcp-alpaca router with 18
routes (env_info + 17 tools). Alpaca builder branch: paper=(env=="testnet"),
api_key comes from settings.alpaca.api_key_id. Client + leverage_cap tests
migrated (15 alpaca tests pass). Builder test with a stubbed alpaca-py SDK.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-30 18:39:25 +02:00
AdrianoDev 8dbaf3a0e4 feat(V2): hyperliquid migration complete
- exchanges/hyperliquid/{client,leverage_cap,tools}.py
- routers/hyperliquid.py with 16 /mcp-hyperliquid/tools/* endpoints
- hyperliquid builder in exchanges/__init__.py
- tests migrated: test_client, test_leverage_cap (skipped V1-only: server_acl, environment_info)
- hyperliquid builder test (testnet vs mainnet base_url)

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-30 18:35:46 +02:00
AdrianoDev 5e42ce9c69 feat(V2): bybit migration complete (client, tools, router, tests, builder)
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-30 18:31:51 +02:00
AdrianoDev a8d970233e feat(V2): centralized client builder (deribit only for now)
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-30 18:27:50 +02:00
AdrianoDev d3ec2ee588 feat(V2): deribit router + migrated tests
The /mcp-deribit/* router mounts 34 tools (28 read + 6 write) as
POST /mcp-deribit/tools/{tool_name} endpoints, with DI for env (request.state)
and client (ClientRegistry). Write tools build minimal creds
{max_leverage, client_id} from settings for leverage-cap enforcement.

Deribit tests migrated: test_client.py + test_leverage_cap.py relocated
under tests/unit/exchanges/deribit/ with import rewrite mcp_* -> cerbero_mcp.*.
Skipped the legacy V1-only test_environment_info / test_server_acl / test_env_validation
(ACL and resolve_environment were removed in V2).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-30 18:26:34 +02:00
AdrianoDev daa4e02971 feat(V2): deribit migration (client, leverage_cap, tools)
Task 6.1 V2.0.0: copy client.py + leverage_cap.py from services/mcp-deribit
with imports rewritten (mcp_common -> cerbero_mcp.common, mcp_deribit ->
cerbero_mcp.exchanges.deribit). Extracted 34 async tools (28 endpoints +
is_testnet/environment_info + helpers) into tools.py: pure logic without
FastAPI/ACL. Audit calls removed for now (TODO: wiring via router on
request.state.environment).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-30 18:23:44 +02:00
AdrianoDev 2a268b3a33 feat(V2): build_app with swagger /apidocs + middleware + handlers
Adds /docs and /redoc to the auth whitelist (the paths are disabled, so no security risk).

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-30 18:20:17 +02:00
AdrianoDev 73f880e7f2 feat(V2): lazy ClientRegistry with per-key lock
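A lazy registry with one lock per key can be sketched as below (illustrative only; the real ClientRegistry API may differ). The per-key asyncio.Lock ensures two concurrent first requests for the same (exchange, env) build the client only once:

```python
import asyncio


class LazyRegistry:
    def __init__(self, factory):
        self._factory = factory
        self._clients: dict[tuple[str, str], object] = {}
        self._locks: dict[tuple[str, str], asyncio.Lock] = {}

    async def get(self, exchange: str, env: str):
        key = (exchange, env)
        # One lock per key: concurrent first calls serialize on the same
        # lock, so the factory runs at most once per (exchange, env).
        lock = self._locks.setdefault(key, asyncio.Lock())
        async with lock:
            if key not in self._clients:
                self._clients[key] = await self._factory(exchange, env)
            return self._clients[key]
```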
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-30 18:18:18 +02:00
AdrianoDev 80a4a88cb1 feat(V2): error envelope module extracted from server.py 2026-04-30 18:17:15 +02:00
AdrianoDev 993326136b test(V2): common/ test migration
Tests copied and updated from services/common/tests/ to tests/unit/common/.
Imports updated from mcp_common to cerbero_mcp.common. Removed tests for
V1-only functionality (app_factory, environment, auth/Principal, server_base).
Refactored test_audit.py (principal→actor str) and test_mcp_bridge.py
(TokenStore→valid_tokens set). 71/71 tests passing.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-30 18:16:26 +02:00
AdrianoDev 1a1f9c43ba refactor(V2): audit.py uses actor:str instead of Principal, remove legacy common/auth.py
- Removed src/cerbero_mcp/common/auth.py (V1 Principal/TokenStore/ACL)
- audit_write_op: principal:Principal parameter → actor:str|None
- mcp_bridge.py: TokenStore → valid_tokens:set[str] (V2 bearer model)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-30 18:14:10 +02:00
AdrianoDev 3868ba60ce feat(V2): common/ migration (indicators, options, microstructure, stats, http, audit, logging, mcp_bridge + auth) 2026-04-30 18:12:11 +02:00
AdrianoDev 04a34fc179 fix(V2): hoist fastapi Request import, restore importlib mode 2026-04-30 18:10:41 +02:00
AdrianoDev 2934a2d26a feat(V2): bearer auth middleware with compare_digest
Implements install_auth_middleware with whitelist /health /apidocs /openapi.json,
timing-safe token check via secrets.compare_digest, request.state.environment injection.
Fixes pyproject: --import-mode=prepend (importlib + PEP 563 breaks FastAPI Request injection).
Removed from __future__ import annotations from test_auth.py for the same reason.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-30 18:09:21 +02:00
AdrianoDev 97d93a5139 feat(V2): pydantic settings with secret str + tests
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-30 18:04:40 +02:00
AdrianoDev 005300205b chore(V2): consolidated .env.example, .env gitignored 2026-04-30 18:03:22 +02:00
AdrianoDev 8df64b5176 chore(V2): src/cerbero_mcp + tests/ skeleton
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-30 18:02:22 +02:00
AdrianoDev 8fd182e295 chore(V2): single-package cerbero-mcp pyproject, workspace removed
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-30 18:01:16 +02:00
AdrianoDev b8753afad2 docs(plan): V2.0.0 implementation plan task-by-task
Phase-by-phase plan (Phase 0–12) to implement V2.0.0:
- Phase 0: bootstrap the new structure
- Phase 1: settings + consolidated .env
- Phase 2: bearer auth middleware
- Phase 3: common/ migration
- Phase 4: lazy client_registry
- Phase 5: build_app + swagger /apidocs
- Phase 6: migration of the 6 exchanges (deribit template + 5 repetitions)
- Phase 7: __main__ entrypoint with lifespan
- Phase 8: integration test for env routing
- Phase 9: Dockerfile + minimal docker-compose
- Phase 10: V1 cleanup (services/, gateway/, secrets/, docker/)
- Phase 11: README rewritten, DEPLOYMENT removed
- Phase 12: final quality gate

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-30 17:58:13 +02:00
AdrianoDev 9a137563e8 docs(spec): V2.0.0 unified image + token-based env routing
Architectural spec for V2.0.0: collapses 7 Docker images (Caddy gateway
+ 6 MCP services) into a single multi-router image. The testnet/mainnet
switch becomes runtime per-request via bearer token (TESTNET_TOKEN
/ MAINNET_TOKEN). Configuration consolidated into a single .env, JSON
secrets eliminated. Swagger UI exposed at /apidocs.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-30 17:45:26 +02:00
190 changed files with 21197 additions and 10153 deletions
+92 -6
.env.example
@@ -1,7 +1,93 @@
GATEWAY_PORT=8080
# ============================================================
# CERBERO MCP — V2.0.0
# Copy to .env and fill in values. .env is gitignored.
# Generate tokens: python -c 'import secrets; print(secrets.token_urlsafe(32))'
# ============================================================
# Per-exchange environment override (precedence: env > secret > default)
DERIBIT_TESTNET=true
BYBIT_TESTNET=true
HYPERLIQUID_TESTNET=true
ALPACA_PAPER=true
# ─── SERVER ─────────────────────────────────────────────────
HOST=0.0.0.0
PORT=9000
LOG_LEVEL=info
# ─── AUTH — bearer tokens for env routing ─────────────────
# The bot sends Authorization: Bearer <TOKEN>:
# - TESTNET_TOKEN → request goes to base_url_testnet
# - MAINNET_TOKEN → request goes to base_url_live
TESTNET_TOKEN=
MAINNET_TOKEN=
# ─── EXCHANGE — DERIBIT ───────────────────────────────────
# Single pair (used for both testnet and mainnet):
DERIBIT_CLIENT_ID=
DERIBIT_CLIENT_SECRET=
# Or distinct per-env pairs (take precedence if set):
# DERIBIT_CLIENT_ID_TESTNET=
# DERIBIT_CLIENT_SECRET_TESTNET=
# DERIBIT_CLIENT_ID_LIVE=
# DERIBIT_CLIENT_SECRET_LIVE=
DERIBIT_URL_LIVE=https://www.deribit.com/api/v2
DERIBIT_URL_TESTNET=https://test.deribit.com/api/v2
DERIBIT_MAX_LEVERAGE=3
# ─── EXCHANGE — BYBIT ─────────────────────────────────────
BYBIT_API_KEY=
BYBIT_API_SECRET=
BYBIT_URL_LIVE=https://api.bybit.com
BYBIT_URL_TESTNET=https://api-testnet.bybit.com
BYBIT_MAX_LEVERAGE=3
# ─── EXCHANGE — HYPERLIQUID ───────────────────────────────
HYPERLIQUID_WALLET_ADDRESS=
HYPERLIQUID_API_WALLET_ADDRESS=
HYPERLIQUID_PRIVATE_KEY=
HYPERLIQUID_URL_LIVE=https://api.hyperliquid.xyz
HYPERLIQUID_URL_TESTNET=https://api.hyperliquid-testnet.xyz
HYPERLIQUID_MAX_LEVERAGE=3
# ─── EXCHANGE — ALPACA ────────────────────────────────────
ALPACA_API_KEY_ID=
ALPACA_SECRET_KEY=
ALPACA_URL_LIVE=https://api.alpaca.markets
ALPACA_URL_TESTNET=https://paper-api.alpaca.markets
ALPACA_MAX_LEVERAGE=1
# ─── EXCHANGE — IBKR ──────────────────────────────────────
# OAuth setup: see README "IBKR Setup" + scripts/ibkr_oauth_setup.py.
# RSA keys (PEM) do NOT go in .env: mount them as files and reference the path.
IBKR_CONSUMER_KEY=
IBKR_ACCESS_TOKEN=
IBKR_ACCESS_TOKEN_SECRET=
IBKR_SIGNATURE_KEY_PATH=/secrets/ibkr_signature.pem
IBKR_ENCRYPTION_KEY_PATH=/secrets/ibkr_encryption.pem
IBKR_DH_PRIME=
# Env-specific pairs (take precedence):
# IBKR_CONSUMER_KEY_TESTNET=
# IBKR_ACCESS_TOKEN_TESTNET=
# IBKR_ACCESS_TOKEN_SECRET_TESTNET=
# IBKR_SIGNATURE_KEY_PATH_TESTNET=/secrets/ibkr_signature_paper.pem
# IBKR_ENCRYPTION_KEY_PATH_TESTNET=/secrets/ibkr_encryption_paper.pem
# IBKR_ACCOUNT_ID_TESTNET=DU1234567
# IBKR_CONSUMER_KEY_LIVE=
# IBKR_ACCESS_TOKEN_LIVE=
# IBKR_ACCESS_TOKEN_SECRET_LIVE=
# IBKR_SIGNATURE_KEY_PATH_LIVE=/secrets/ibkr_signature_live.pem
# IBKR_ENCRYPTION_KEY_PATH_LIVE=/secrets/ibkr_encryption_live.pem
# IBKR_ACCOUNT_ID_LIVE=U1234567
IBKR_URL_LIVE=https://api.ibkr.com/v1/api
IBKR_URL_TESTNET=https://api.ibkr.com/v1/api
IBKR_WS_URL_LIVE=wss://api.ibkr.com/v1/api/ws
IBKR_WS_URL_TESTNET=wss://api.ibkr.com/v1/api/ws
IBKR_MAX_LEVERAGE=4
IBKR_WS_MAX_SUBSCRIPTIONS=80
IBKR_WS_IDLE_TIMEOUT_S=300
# ─── DATA PROVIDERS — MACRO ───────────────────────────────
FRED_API_KEY=
FINNHUB_API_KEY=
# ─── DATA PROVIDERS — SENTIMENT ───────────────────────────
CRYPTOPANIC_KEY=
LUNARCRUSH_KEY=
-448
DEPLOYMENT.md
@@ -1,448 +0,0 @@
# Deployment Cerbero_mcp
Operational guide for deploying the MCP suite to a public VPS.
The architecture: Gitea hosts the code + container registry; images are
built and pushed from the **development machine** (laptop) to the
registry; the production VPS builds nothing, it only pulls the prebuilt
containers and uses Watchtower for automatic rollover.
```
┌──────────────────────────┐ ┌─────────────────────────┐ ┌──────────────────────────────────┐
│ Laptop dev │ │ Gitea git.tielogic.xyz │ │ VPS produzione │
│ │ │ │ │ cerbero-mcp.tielogic.xyz │
│ build-push.sh ──push──▶ │───▶│ ┌────────────────────┐ │ │ │
│ (8 image) │ │ │ Container registry │ │ │ ┌────────────────────────────┐ │
│ git push ─────────────▶ │───▶│ └────────────────────┘ │◀──┼──┤ docker compose │ │
│ │ │ ┌────────────────────┐ │ pull │ (docker-compose.prod.yml) │ │
│ │ │ │ Cerbero-mcp repo │ │ │ │ gateway, mcp-* │ │
│ │ │ └────────────────────┘ │ │ │ watchtower (poll 5min) │ │
│ │ │ │ │ └────────────────────────────┘ │
└──────────────────────────┘ └─────────────────────────┘ └──────────────────────────────────┘
```
No CI/CD on Gitea — quality and builds are the laptop's responsibility
before pushing (lint/test locally, then `scripts/build-push.sh`).
## 1. Build & push image (dal laptop)
Lo script `scripts/build-push.sh` builda e pusha le 8 image al registry
Gitea, replicando il vecchio job CI ma in locale. Pre-requisiti:
- `docker` + `buildx` sul laptop.
- Personal Access Token Gitea con scope `write:package` (User Settings
→ Applications → Generate Token).
```bash
export GITEA_PAT='<PAT_write:package>'
export GITEA_USER=adriano
# All 8 images (base + gateway + 6 mcp-*)
./scripts/build-push.sh
# Only specific ones (e.g. after changing a single service)
./scripts/build-push.sh base mcp-bybit
```
The script:
- Runs `docker login git.tielogic.xyz`.
- Builds with `docker buildx build --push` (local buildx cache on the
  laptop, no registry cache: subsequent builds stay fast without
  burdening the registry).
- Tags `:latest` + `:sha-<short_HEAD>`.
- For the mcp-* images, passes `BASE_IMAGE`/`BASE_TAG` as build args so
  they inherit from the freshly pushed `base` image.
Recommended order: build `base` before the `mcp-*` images (the script does
this by default when called without arguments).
## 1b. Local quality gate (recommended before pushing)
Before `build-push.sh`, run locally the checks that CI used to run:
```bash
uv run ruff check services/
uv run mypy services/common/src/mcp_common
uv run pytest services/ --tb=short
docker compose -f docker-compose.prod.yml config -q
```
All of them must be green before pushing images to the registry.
## 2a. Topology: standalone vs behind-Traefik
Cerbero_mcp supports two deployment topologies:
### Standalone (Caddy terminates TLS directly)
```
Internet ──[443]──► Caddy gateway ──► mcp-* services
          (ACME Let's Encrypt)
```
Setup: `docker-compose.prod.yml` on its own. Caddy binds host ports
80/443 and obtains certificates automatically via ACME. Suitable for a
dedicated VPS with no other services on 80/443.
### Behind-Traefik (Traefik terminates TLS)
```
Internet ──[443]──► Traefik ──[traefik network]──► Caddy gateway ──► mcp-* services
          (TLS+ACME)          (rate limit, IP allowlist)
```
Setup: `docker-compose.prod.yml` + the `docker-compose.traefik.yml` overlay.
Caddy does not bind host ports; it listens on plain HTTP `:80` inside the
`traefik` network. Traefik handles routing for `Host(cerbero-mcp.tielogic.xyz)`,
TLS and ACME. Suitable for a VPS shared with other services (Gitea, etc.).
## 2. Automated deploy (no-clone script)
The quickest route is `scripts/deploy-noclone.sh`, which is idempotent. The
repo is **not** cloned on the VPS: the script downloads over raw HTTP only
the files strictly needed at runtime (compose files, Caddyfile, public
assets). Run on the VPS:
```bash
# Prerequisites
export GITEA_PAT="<PAT with read:package scope>"
export GITEA_USER=adriano
# Create the deploy dir and copy the secrets into it via scp from a safe location
sudo mkdir -p /docker/cerbero_mcp/secrets
sudo chown -R "$USER" /docker/cerbero_mcp
# scp deribit.json bybit.json hyperliquid.json alpaca.json \
#     macro.json sentiment.json core.token observer.token \
#     vps:/docker/cerbero_mcp/secrets/
# Behind Traefik (optional, only if the VPS is shared with Gitea or others)
# export BEHIND_TRAEFIK=true
# export TRAEFIK_NETWORK=gitea_traefik-public
curl -sL -o /tmp/deploy-noclone.sh \
  https://git.tielogic.xyz/Adriano/Cerbero-mcp/raw/branch/main/scripts/deploy-noclone.sh
chmod +x /tmp/deploy-noclone.sh
/tmp/deploy-noclone.sh
```
The script: logs in to the registry → downloads `docker-compose.prod.yml`,
`docker-compose.traefik.yml`, `gateway/Caddyfile` and `gateway/public/*` into
`/docker/cerbero_mcp/` → chmod 600 on the secrets → generates an initial
`.env` (testnet) → creates `/var/log/cerbero-mcp` owned `1000:1000` → pulls
the images from the registry → `docker compose up -d` → public smoke test.
To update later, re-run the same script (it preserves `.env` and the
secrets, and reloads the config from the updated `main` branch).
**Path overrides**: `DEPLOY_DIR` (default `/docker/cerbero_mcp`),
`SECRETS_SRC` (default `$DEPLOY_DIR/secrets`), `AUDIT_LOG_DIR` (default
`/var/log/cerbero-mcp`).
**Local compose override (`docker-compose.local.yml`)**: the script
automatically includes an optional `$DEPLOY_DIR/docker-compose.local.yml`
as the last `-f`. Useful for machine-specific fixes (e.g. forcing
`DOCKER_API_VERSION` on watchtower when the VPS daemon is older than the
expected API). The file is gitignored by design: it is not downloaded from
the repo; you create it by hand on the VPS. Example:
```yaml
# /docker/cerbero_mcp/docker-compose.local.yml
services:
watchtower:
environment:
DOCKER_API_VERSION: "1.44"
```
### Behind-Traefik mode
If a Traefik instance already runs on the VPS (e.g. the same VPS as Gitea),
add the following to your `.env` before launching the script:
```bash
BEHIND_TRAEFIK=true
TRAEFIK_NETWORK=gitea_traefik-public   # name of Traefik's external network
TRAEFIK_CERTRESOLVER=letsencrypt       # resolver name in Traefik
TRAEFIK_ENTRYPOINT=websecure           # Traefik HTTPS entrypoint
# Gateway ports no longer needed (Traefik binds 80/443):
# GATEWAY_HTTP_PORT and GATEWAY_HTTPS_PORT are not used.
```
The script detects `BEHIND_TRAEFIK=true` and uses
`docker compose -f docker-compose.prod.yml -f docker-compose.traefik.yml`.
The Caddy gateway does NOT bind host ports 80/443; it is exposed through
Traefik with labels for `Host(cerbero-mcp.tielogic.xyz)`.
Verifying the Traefik network:
```bash
docker network ls | grep -i traefik
# Typically you'll see: gitea_traefik-public, traefik_default, etc.
# Use the EXACT name as TRAEFIK_NETWORK in .env.
```
## 3. Safety: testnet → mainnet switch
`mcp_common.environment.consistency_check` (invoked by the
`run_exchange_main` boot) PREVENTS accidental switches:
- If the resolved environment is **mainnet** but the corresponding secret
  JSON does not contain an explicit `"environment": "mainnet"` → boot
  aborts with `EnvironmentMismatchError`.
- If the secret declares an environment different from the resolved one
  (e.g. `creds["environment"]="mainnet"` but the env var selects testnet)
  → boot aborts.
**To switch a specific exchange to mainnet** (e.g. bybit):
1. Edit `secrets/bybit.json`: add `"environment": "mainnet"`.
2. Edit `.env`: `BYBIT_TESTNET=false`.
3. `docker compose -f docker-compose.prod.yml --env-file .env restart mcp-bybit`.
Without the explicit flag in the secret, the mcp-bybit container fails at
boot, and Watchtower will NOT roll forward onto versions with broken
mainnet credentials.
Setting `STRICT_MAINNET=false` in `.env` allows mainnet without the
explicit confirmation (a safety downgrade, discouraged in production).
## 4. Persistent audit log
All write endpoints (`place_order`, `place_combo_order`, `cancel_*`,
`set_*`, `close_*`, `transfer_*`, `amend_*`, `switch_*`) emit a structured
JSON record on the `mcp.audit` logger.
**Sinks**:
- container stdout/stderr (always, visible via `docker logs`).
- a persistent JSONL file on a host volume:
  `${AUDIT_LOG_DIR:-/var/log/cerbero-mcp}/<service>.audit.jsonl`.
  Rotation at midnight UTC, with `AUDIT_LOG_BACKUP_DAYS` retention
  (default 30 days).
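The file sink described above maps naturally onto the standard library's `TimedRotatingFileHandler`. A minimal sketch of that wiring, assuming the path and logger setup (this is not the actual service code):

```python
import json
import logging
import logging.handlers
import os

# Hypothetical wiring for the JSONL audit sink: midnight-UTC rotation,
# AUDIT_LOG_BACKUP_DAYS rotated files of retention (default 30).
path = "/tmp/demo.audit.jsonl"
if os.path.exists(path):
    os.remove(path)  # keep the demo deterministic across runs

backup_days = int(os.environ.get("AUDIT_LOG_BACKUP_DAYS", "30"))
handler = logging.handlers.TimedRotatingFileHandler(
    path, when="midnight", utc=True, backupCount=backup_days
)
handler.setFormatter(logging.Formatter("%(message)s"))  # one JSON object per line

audit = logging.getLogger("mcp.audit")
audit.setLevel(logging.INFO)
audit.addHandler(handler)

audit.info(json.dumps({"audit_event": "write_op", "action": "place_order", "exchange": "bybit"}))
```

With `when="midnight"` and `utc=True`, rollover happens at midnight UTC regardless of the host timezone, and `backupCount` enforces the retention window.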
**Example record**:
```json
{
"audit_event": "write_op",
"action": "place_order",
"exchange": "bybit",
"principal": "core",
"target": "BTCUSDT",
"payload": {"side": "Buy", "qty": 0.01, "price": 60000, "leverage": 3},
"result": {"order_id": "abc123", "status": "submitted"}
}
```
**Operational queries**:
```bash
# Follow the full audit log as it grows
tail -f /var/log/cerbero-mcp/*.audit.jsonl
# Only place_order on bybit
jq -c 'select(.action=="place_order" and .exchange=="bybit")' \
  /var/log/cerbero-mcp/bybit.audit.jsonl
# Errors
jq -c 'select(.error)' /var/log/cerbero-mcp/*.audit.jsonl
# Operations by a given principal
jq -c 'select(.principal=="core")' /var/log/cerbero-mcp/*.audit.jsonl
```
Secrets (api_key, password) are filtered out automatically by
`SecretsFilter` before reaching the sinks.
## 5. Initial VPS setup (manual alternative to the script)
**Prerequisites**: Docker Engine ≥ 24, the `docker compose` plugin, SSH
access with sudo, a DNS A record `cerbero-mcp.tielogic.xyz` → VPS IP, and
ports 80 and 443 open on the firewall (for the ACME challenge + HTTPS traffic).
### a) Log in to the Gitea registry
Create a Personal Access Token on Gitea (`Settings → Applications →
Generate new token`) with the `read:package` scope. Then, on the VPS:
```bash
echo "$GITEA_PAT" | docker login git.tielogic.xyz -u <gitea-username> --password-stdin
```
The credentials are saved to `~/.docker/config.json`. Watchtower
bind-mounts it read-only to perform authenticated pulls.
### b) Create the deploy dir and download the config files
The repo does NOT need to be cloned on the VPS. The compose files, the
`Caddyfile` and the gateway's public assets are enough:
```bash
sudo mkdir -p /docker/cerbero_mcp/{secrets,gateway/public}
sudo chown -R "$USER" /docker/cerbero_mcp
cd /docker/cerbero_mcp
BASE=https://git.tielogic.xyz/Adriano/Cerbero-mcp/raw/branch/main
curl -fsSL -o docker-compose.prod.yml $BASE/docker-compose.prod.yml
curl -fsSL -o docker-compose.traefik.yml $BASE/docker-compose.traefik.yml
curl -fsSL -o gateway/Caddyfile $BASE/gateway/Caddyfile
curl -fsSL -o gateway/public/index.html $BASE/gateway/public/index.html
curl -fsSL -o gateway/public/status.js $BASE/gateway/public/status.js
curl -fsSL -o gateway/public/style.css $BASE/gateway/public/style.css
```
The VPS does not need to build anything; it uses `docker-compose.prod.yml`,
which only pulls from the registry.
### c) Prepare the secrets
```bash
mkdir -p secrets
# Copy (via scp) the JSON files with the real credentials:
#   secrets/deribit.json, bybit.json, alpaca.json, hyperliquid.json,
#   secrets/macro.json, sentiment.json
#   secrets/core.token, observer.token
chmod 600 secrets/*
```
### d) `.env` with the runtime configuration
Create `/docker/cerbero_mcp/.env`:
```bash
# Gateway
ACME_EMAIL=adrianodalpastro@tielogic.com
GATEWAY_HTTP_PORT=80
GATEWAY_HTTPS_PORT=443
WRITE_ALLOWLIST="127.0.0.1/32 ::1/128 172.16.0.0/12"
# Image tag: `latest` for Watchtower auto-updates, or pin to sha-XXXXXXX
IMAGE_TAG=latest
IMAGE_PREFIX=git.tielogic.xyz/adriano/cerbero-mcp
# Exchange environments (true=testnet, false=mainnet)
DERIBIT_TESTNET=true
BYBIT_TESTNET=true
HYPERLIQUID_TESTNET=true
ALPACA_PAPER=true
# Watchtower polling interval (seconds). 300=5min default.
WATCHTOWER_POLL_INTERVAL=300
```
### e) Start
```bash
docker compose -f docker-compose.prod.yml --env-file .env pull
docker compose -f docker-compose.prod.yml --env-file .env up -d
docker compose -f docker-compose.prod.yml logs -f gateway
```
Caddy automatically requests the Let's Encrypt certificate on first
contact at `https://cerbero-mcp.tielogic.xyz`.
## 6. Auto-update via Watchtower
Watchtower (the `watchtower` service in the compose file) polls the
registry every `WATCHTOWER_POLL_INTERVAL` seconds. When it finds a new
digest behind the `:latest` tag of a container labelled
`com.centurylinklabs.watchtower.enable=true`, it:
1. `docker pull`s the new image
2. gracefully `docker stop`s the old container
3. `docker rm`s it and starts the new container with the same config, secrets and volumes
4. cleans up the old image (`WATCHTOWER_CLEANUP=true`)
The labelled containers are: `gateway`, `mcp-deribit`, `mcp-bybit`,
`mcp-hyperliquid`, `mcp-alpaca`, `mcp-macro`, `mcp-sentiment`. The
`watchtower` container does not update itself (to avoid loops).
### Temporarily disabling auto-update
Pin a specific SHA in `.env`:
```bash
IMAGE_TAG=sha-6b7b3f7
docker compose -f docker-compose.prod.yml --env-file .env up -d
```
This stops `:latest` from being followed; to re-enable automatic rollover,
restore `IMAGE_TAG=latest`.
### Disabling auto-update for a single service
Remove the `com.centurylinklabs.watchtower.enable=true` label for that
service in the compose file (or set it to `false`). Watchtower will ignore
it while keeping the others up to date.
## 7. Rollback
```bash
# Find the SHA of the previous version
docker images "git.tielogic.xyz/adriano/cerbero-mcp/*" --format "{{.Tag}}"
# Pin it in .env
IMAGE_TAG=sha-XXXXXXX
docker compose -f docker-compose.prod.yml --env-file .env up -d
```
Watchtower will NOT downgrade it, because the pinned tag's digest matches
the local one.
## 8. Post-deploy smoke test
```bash
# From outside the VPS (laptop)
curl -s https://cerbero-mcp.tielogic.xyz/mcp-macro/health
# {"status":"ok",...}
# Test the write-endpoint allowlist (must return 403 from an external IP):
curl -X POST https://cerbero-mcp.tielogic.xyz/mcp-deribit/tools/place_order \
  -H "Authorization: Bearer $(cat secrets/core.token)" \
  -d '{"instrument_name":"BTC-PERPETUAL","side":"buy","amount":1}'
# 403 forbidden: source ip not in allowlist ← OK
# On the VPS:
GATEWAY=http://localhost bash tests/smoke/run.sh
```
## 9. VPS security
- `ufw` firewall: `allow 22, 80, 443`; deny everything else inbound.
- `fail2ban` on SSH and (optionally) on Caddy's 401 log.
- Manual secret rotation: update the `secrets/*.token` files, then
  `docker compose restart` (tokens are reloaded at boot by every MCP
  service).
- Audit log via `docker compose logs <service> | grep audit_event`; for
  production, better to redirect to syslog or a dedicated service.
## 10. Traefik / reverse-proxy notes in front of Gitea
Gitea is exposed via Traefik (ROOT_URL `https://git.tielogic.xyz`). To push
Docker images, the reverse proxy must allow large request bodies (a single
layer can exceed 100MB).
Traefik's defaults are fine, but if you see `413 Request Entity Too Large`
during `docker push`, raise the limit in the middleware:
```yaml
# traefik dynamic config
http:
middlewares:
gitea-upload:
buffering:
maxRequestBodyBytes: 524288000 # 500MB
```
Apply it as a middleware on the Gitea router.
## 11. Updating the compose files themselves (YAML)
Watchtower updates the **images**, not `docker-compose.prod.yml` or the
`Caddyfile`. If you change the structure (new services, new env vars,
gateway changes), re-run the no-clone script on the VPS; it re-downloads
the config files from Gitea's `main` branch and applies them:
```bash
/tmp/deploy-noclone.sh
```
The script is idempotent: it preserves `.env` and `secrets/`, updates only
the config files, then runs `pull` + `up -d`.
@@ -0,0 +1,27 @@
# syntax=docker/dockerfile:1.7
FROM python:3.11-slim AS builder
RUN apt-get update && apt-get install -y --no-install-recommends \
build-essential curl && rm -rf /var/lib/apt/lists/*
RUN pip install --no-cache-dir "uv>=0.5,<0.7"
WORKDIR /app
COPY pyproject.toml uv.lock ./
COPY src ./src
RUN uv sync --frozen --no-dev
FROM python:3.11-slim AS runtime
LABEL org.opencontainers.image.title="cerbero-mcp" \
org.opencontainers.image.version="2.0.0" \
org.opencontainers.image.source="https://github.com/AdrianoDev/cerbero"
WORKDIR /app
COPY --from=builder /app /app
ENV PATH="/app/.venv/bin:$PATH" \
HOST=0.0.0.0 \
PORT=9000 \
PYTHONUNBUFFERED=1
RUN useradd -m -u 1000 app && chown -R app:app /app
USER app
EXPOSE 9000
HEALTHCHECK --interval=30s --timeout=5s --retries=3 --start-period=10s \
CMD python -c "import os, urllib.request; urllib.request.urlopen(f'http://localhost:{os.environ.get(\"PORT\",\"9000\")}/health', timeout=3).close()"
CMD ["cerbero-mcp"]
@@ -1,153 +1,429 @@
# Cerbero_mcp
# Cerbero MCP — V2.0.0
Reusable MCP servers (exchange + market data) for the Cerbero suite.
Split out of `Cerbero/` (commit `pre-split-2026-04-27`) as part of the
split documented in `docs/superpowers/specs/2026-04-27-split-mcp-core-design.md`
(in the historical repo).
Unified multi-exchange MCP server for the Cerbero suite. Shipped as a
single Docker image; testnet and mainnet are reachable at the same time
through a per-request routing mechanism based on the bearer token supplied
by the client.
## Services
- `mcp-alpaca`, `mcp-bybit`, `mcp-deribit`, `mcp-hyperliquid`: exchanges
  with `place_order`, `environment_info`, server-side leverage cap
- `mcp-deribit` and `mcp-bybit` additionally expose `place_combo_order`:
  - Deribit: `private/create_combo` + an order on the combo → a single
    spread cross instead of N (lower expected slippage on liquid structures).
  - Bybit: `place_batch_order` with `category=option` → atomic multi-leg
    in a single API round-trip (no fee discount, just atomicity + lower latency).
- `mcp-macro`, `mcp-sentiment`: read-only market data
## Features
## Available quantitative indicators
- **A single Docker image** (`cerbero-mcp`) hosts every exchange router
  in one FastAPI process
- **Four exchanges** (Deribit, Bybit, Hyperliquid, Alpaca) and **two
  read-only data providers** (Macro, Sentiment)
- **Per-request testnet/mainnet switch** via the
  `Authorization: Bearer <TOKEN>` header: the same container serves both
  environments without restarts
- **Configuration entirely in `.env`**: no separate JSON credential files;
  each exchange's upstream URLs (live/testnet) can be overridden through
  dedicated variables (`DERIBIT_URL_*`,
  `BYBIT_URL_*`, `HYPERLIQUID_URL_*`, `ALPACA_URL_*`)
- **Interactive documentation**: OpenAPI/Swagger exposed at `/apidocs`
- **Verified quality**: 310 tests (unit + integration + smoke), clean
  mypy, clean ruff
### Common (`mcp_common.indicators` + `options` + `microstructure` + `stats`)
- Technical: `sma`, `rsi`, `macd`, `atr`, `adx`
- Volatility: `vol_cone` (multi-window RV with percentiles), `garch11_forecast`
- Statistical: `hurst_exponent`, `half_life_mean_reversion`, `autocorrelation`,
  `cointegration_test` (Engle-Granger)
- Risk: `rolling_sharpe` (Sharpe + Sortino), `var_cvar` (historical VaR/ES)
- Microstructure: `orderbook_imbalance` (ratio + microprice + slope)
- Options: `oi_weighted_skew`, `smile_asymmetry`, `atm_vs_wings_vol`,
  `dealer_gamma_profile`, `vanna_charm_aggregate`
## Quick start (development, without Docker)
### Deribit (exposed as MCP tools)
1. Copy the configuration template and fill it in:
```bash
cp .env.example .env
# edit .env with your credentials and the two tokens
```
2. Generate the bearer tokens:
```bash
python -c 'import secrets; print("TESTNET_TOKEN=" + secrets.token_urlsafe(32))'
python -c 'import secrets; print("MAINNET_TOKEN=" + secrets.token_urlsafe(32))'
```
3. Install the dependencies and start:
```bash
uv sync
uv run cerbero-mcp
```
4. Open the interactive documentation: <http://localhost:9000/apidocs>
## Running with Docker
```bash
cp .env.example .env   # fill in the values
docker compose up -d
```
The container exposes the port set by `PORT` in `.env` (default 9000).
## Bearer tokens and environments
| Token used | Upstream environment |
|---|---|
| `Authorization: Bearer $TESTNET_TOKEN` | each exchange's testnet URLs |
| `Authorization: Bearer $MAINNET_TOKEN` | mainnet (live) URLs |
| No token / unknown token | 401 Unauthorized |
The purely read-only tools (`/mcp-macro/*` and `/mcp-sentiment/*`) still
require a valid bearer, but which one (testnet or mainnet) makes no
difference, since they have no testnet endpoints.
### X-Bot-Tag header (bot identification)
All calls to `/mcp-*` additionally require the `X-Bot-Tag` header: a
string identifying the calling bot (64 characters max). The value is
logged in the audit records to trace which bot performed each write
operation. Example request headers:
```
Authorization: Bearer $MAINNET_TOKEN
X-Bot-Tag: scanner-alpha-prod
```
If the header is missing or empty, the response is `400 BAD_REQUEST`. The
header is not required on the public endpoints (`/health`, `/apidocs`,
`/openapi.json`) nor on the `/admin/audit` admin endpoint.
## Main endpoints
| Path | Description |
|---|---|
| `GET /health` | Liveness check (no auth) |
| `GET /health/ready` | Readiness check with exchange-client ping (no auth) |
| `GET /apidocs` | Swagger UI (no auth) |
| `GET /openapi.json` | OpenAPI 3.1 schema (no auth) |
| `POST /mcp-deribit/tools/{tool}` | Deribit exchange tools |
| `POST /mcp-bybit/tools/{tool}` | Bybit exchange tools |
| `POST /mcp-hyperliquid/tools/{tool}` | Hyperliquid exchange tools |
| `POST /mcp-alpaca/tools/{tool}` | Alpaca exchange tools |
| `POST /mcp-macro/tools/{tool}` | Macro/market-data tools |
| `POST /mcp-sentiment/tools/{tool}` | Sentiment/news tools |
| `GET /admin/audit` | JSONL audit-log query (bearer required, no X-Bot-Tag) |
## Observability
### Health checks
The application exposes two distinct monitoring endpoints:
- `GET /health`: simple liveness check. Requires no authentication and
  always returns HTTP 200 while the process is alive. Ideal as a
  Kubernetes liveness probe or a Traefik pinger.
- `GET /health/ready`: richer readiness check. It iterates over every
  exchange client in the registry and attempts a lightweight probe on each
  (`health()` if available, falling back to `is_testnet()`), with a
  2-second timeout per client. The response carries a `status` field with
  one of `ready` (all clients respond), `degraded` (at least one fails) or
  `not_ready` (empty registry), plus a `clients` array with one record per
  cached `(exchange, env)` pair. By default the endpoint always answers
  HTTP 200; setting the `READY_FAILS_ON_DEGRADED=true` environment
  variable forces HTTP 503 whenever the status is not `ready`, which is
  useful as a Kubernetes readiness probe.
### Request log
Every HTTP request passes through a middleware that emits a JSON line on
the `mcp.request` logger with these fields: `request_id`, `method`,
`path`, `status_code`, `duration_ms`, `actor` (`testnet` or `mainnet`,
only when authenticated), `bot_tag` (the `X-Bot-Tag` header if present),
`exchange` (extracted from the `/mcp-{exchange}/...` path), `tool` (tool
name when the path is `/mcp-X/tools/Y`), `client_ip`, `user_agent`. The
same `request_id` is also included in the `mcp.audit` audit-log records
and in the error envelope returned to the client, so the three traces of
a given request can be correlated.
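Because `request_id` appears in the request log, the audit log, and the error envelope, correlating the traces reduces to grouping JSON lines by that field. A small illustrative helper; the sample records below are fabricated for the demo and only mimic the field names listed above:

```python
import json
from collections import defaultdict

def correlate(jsonl_lines: list[str]) -> dict[str, list[dict]]:
    """Group JSON log lines by their shared request_id field."""
    by_id: dict[str, list[dict]] = defaultdict(list)
    for line in jsonl_lines:
        rec = json.loads(line)
        if "request_id" in rec:
            by_id[rec["request_id"]].append(rec)
    return dict(by_id)

# Fabricated example lines mimicking the mcp.request / mcp.audit shape:
lines = [
    '{"request_id": "r-1", "method": "POST", "path": "/mcp-bybit/tools/place_order", "status_code": 200}',
    '{"request_id": "r-1", "audit_event": "write_op", "action": "place_order", "exchange": "bybit"}',
    '{"request_id": "r-2", "method": "GET", "path": "/health", "status_code": 200}',
]
traces = correlate(lines)
print(len(traces["r-1"]))  # → 2: the request-log line plus the audit record
```

The same grouping works across `docker logs` output and the JSONL audit file, as long as both are parsed line by line.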
### Audit log
See the "Audit query" section below for consulting the structured log of
write operations.
## Audit query
`GET /admin/audit` reads the JSONL file pointed to by `AUDIT_LOG_FILE` and
returns the filtered records. It requires a valid bearer (testnet or
mainnet); the `X-Bot-Tag` header is not required.
Query parameters (all optional):
- `from`, `to`: ISO 8601 datetimes (e.g. `2026-05-01` or `2026-05-01T12:34:56Z`)
- `actor`: `testnet` | `mainnet`
- `exchange`: exchange name (`deribit`, `bybit`, `hyperliquid`, `alpaca`)
- `action`: tool name (e.g. `place_order`)
- `bot_tag`: bot identifier
- `limit`: maximum number of records returned, default `1000`, max `10000`
Example call:
```bash
curl -H "Authorization: Bearer $MAINNET_TOKEN" \
  "http://localhost:9000/admin/audit?from=2026-05-01&actor=mainnet&action=place_order&limit=100"
```
If `AUDIT_LOG_FILE` is not set, the endpoint answers with `count: 0` and a
`warning` field. To enable the persistent sink, set in `.env`:
```bash
AUDIT_LOG_FILE=/var/log/cerbero-mcp/audit.jsonl
AUDIT_LOG_BACKUP_DAYS=30
```
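The filtering `/admin/audit` applies can be reproduced offline on the JSONL file with a few lines of Python. This is a sketch of the query semantics described above, not the endpoint's actual implementation, and the sample records are fabricated:

```python
import json

def query_audit(lines, actor=None, exchange=None, action=None, bot_tag=None, limit=1000):
    """Mirror of the /admin/audit query parameters over raw JSONL lines (sketch)."""
    limit = min(limit, 10_000)  # server-side cap described above
    out = []
    for line in lines:
        rec = json.loads(line)
        if actor and rec.get("actor") != actor:
            continue
        if exchange and rec.get("exchange") != exchange:
            continue
        if action and rec.get("action") != action:
            continue
        if bot_tag and rec.get("bot_tag") != bot_tag:
            continue
        out.append(rec)
        if len(out) >= limit:
            break
    return out

# Fabricated sample records with the documented fields:
sample = [
    '{"actor": "mainnet", "exchange": "bybit", "action": "place_order", "bot_tag": "scanner-alpha-prod"}',
    '{"actor": "testnet", "exchange": "deribit", "action": "cancel_order", "bot_tag": "dev-bot"}',
]
print(len(query_audit(sample, actor="mainnet", action="place_order")))  # → 1
```

Time-range filtering (`from`/`to`) would add a timestamp comparison per record along the same lines.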
## Available tools
### Common (`cerbero_mcp.common.indicators` + `options` + `microstructure` + `stats`)
Technical (`sma`, `rsi`, `macd`, `atr`, `adx`), volatility (`vol_cone`,
`garch11_forecast`), statistical (`hurst_exponent`,
`half_life_mean_reversion`, `cointegration_test`), risk (`rolling_sharpe`,
`var_cvar`), microstructure (`orderbook_imbalance`), options
(`oi_weighted_skew`, `smile_asymmetry`, `dealer_gamma_profile`,
`vanna_charm_aggregate`).
### Deribit
DVOL, GEX, P/C ratio, skew_25d, term_structure, iv_rank, realized_vol,
technical indicators, find_by_delta, calculate_spread_payoff,
get_dealer_gamma_profile, get_vanna_charm, get_oi_weighted_skew,
get_smile_asymmetry, get_atm_vs_wings_vol, get_orderbook_imbalance,
place_combo_order.
### Bybit
Ticker, orderbook, OHLCV, funding rate (current + history), open interest,
spot/perp basis, technical indicators, place_batch_order,
get_orderbook_imbalance, get_basis_term_structure.
### Hyperliquid
Account summary, positions, orderbook, historical, indicators, funding
rate, basis spot/perp, place_order, set_stop_loss, set_take_profit.
### Alpaca
Account, positions, bars, snapshot, option chain, place_order,
amend_order, cancel_order, close_position.
### Macro
Treasury yields, FRED indicators, equity futures, asset prices, calendar,
`get_yield_curve_slope` (2y10y/5y30y slope + butterfly + regime),
`get_breakeven_inflation` (T5YIE/T10YIE/T5YIFR), `get_cot_tff` (CFTC TFF
report, equity/financial: ES/NQ/RTY/ZN/ZB/6E/6J/DX),
`get_cot_disaggregated` (CFTC Disaggregated report, commodities:
CL/GC/SI/HG/ZW/ZC/ZS), `get_cot_extreme_positioning` (percentile ≤5/≥95
scanner over a watchlist).
### Sentiment
News (CryptoPanic/CoinDesk), social (LunarCrush), multi-exchange funding,
OI history, `get_funding_arb_spread` (compact arb opportunities),
`get_liquidation_heatmap` (heuristic from OI deltas + funding extremes),
`get_cointegration_pairs` (Engle-Granger on crypto pairs).
## Deploying to a VPS with Traefik
On the VPS, the public network layer (TLS, IP allowlist, rate limiting) is
handled by a Traefik instance external to this repository. The
`cerbero-mcp` container exposes no ports to the outside: it joins
Traefik's docker network through labels added by an external compose
override (e.g. a `docker-compose.override.yml` versioned outside this
repo). The public security policy (IP allowlist on the write endpoints) is
Traefik's responsibility.
Minimal example labels for Traefik:
```yaml
labels:
- "traefik.enable=true"
- "traefik.http.routers.cerbero.rule=Host(`cerbero-mcp.tielogic.xyz`)"
- "traefik.http.routers.cerbero.entrypoints=websecure"
- "traefik.http.routers.cerbero.tls.certresolver=letsencrypt"
- "traefik.http.services.cerbero.loadbalancer.server.port=9000"
```
## Build & deploy pipeline
No CI/CD on Gitea: building the 8 images is the development machine's
responsibility, via `scripts/build-push.sh`.
VPS deployment happens **by cloning the repo directly**, with no container
registry in between. The `scripts/deploy-vps.sh` script automates the whole
flow: pull of the target branch, image rebuild on the VPS, service
restart, healthcheck, and automatic rollback on failure.
### Initial setup on the VPS (one time only)
```bash
# On the VPS:
sudo mkdir -p /opt/cerbero-mcp
sudo chown -R "$USER":"$USER" /opt/cerbero-mcp
cd /opt/cerbero-mcp
git clone -b V2.0.0 ssh://git@git.tielogic.xyz:222/Adriano/Cerbero-mcp.git .
cp .env.example .env
# edit .env with the real tokens and credentials
```
The production branch is `V2.0.0` (not `main`); `deploy-vps.sh` defaults
to it.
### Recurring deploys
From any machine with SSH access to the VPS:
```bash
ssh user@vps 'cd /opt/cerbero-mcp && bash scripts/deploy-vps.sh'
```
Or directly on the VPS:
```bash
cd /opt/cerbero-mcp
bash scripts/deploy-vps.sh
```
The script:
1. checks that the working tree is clean and `.env` is present;
2. runs `git fetch` + `reset --hard origin/V2.0.0`;
3. exits without doing anything if the SHA has not changed (override with
   `FORCE=1`);
4. rebuilds the Docker image (`docker compose build`);
5. gracefully restarts the container (`docker compose down --timeout 15`
   followed by `docker compose up -d`);
6. waits for `/health` (30 s timeout by default);
7. rolls back automatically to the previous SHA if the healthcheck fails.
Accepted environment variables: `BRANCH` (default `V2.0.0`), `PORT`
(default read from `.env`), `HEALTH_TIMEOUT_SECONDS`, `FORCE`,
`SKIP_ROLLBACK`.
### Post-deploy smoke test
```bash
PORT=9000 TESTNET_TOKEN="$TESTNET_TOKEN" bash tests/smoke/run.sh
```
## Development
```bash
uv sync
uv run pytest                       # full suite (310 tests expected)
uv run pytest tests/unit -v         # unit tests only
uv run pytest tests/integration -v
uv run ruff check src/ tests/
uv run mypy src/cerbero_mcp
```
All four commands must come back green before committing.
### Source layout
```
src/cerbero_mcp/
├── __main__.py          # cerbero-mcp entrypoint
├── settings.py          # Pydantic Settings (reads .env)
├── auth.py              # bearer middleware → request.state.environment
├── server.py            # build_app() + Swagger + middleware + handlers
├── client_registry.py   # lazy cache {(exchange, env): client}
├── routers/             # one file per exchange (deribit, bybit, ...)
├── exchanges/           # per-exchange logic: client + tools
└── common/              # indicators, options, microstructure, stats, ...
```
## Migrating from V1 (1.x → 2.0.0)
For deployments running V1 in production:
1. Back up `secrets/` (V2 will not use them, but they are the source to
   copy values from).
2. Generate the two new bearer tokens (see above).
3. Fill in `.env`, mapping the V1 fields to the V2 fields:

| V1 (JSON file) | V2 (`.env` variable) |
|---|---|
| `secrets/deribit.json` `client_id` / `client_secret` | `DERIBIT_CLIENT_ID` / `DERIBIT_CLIENT_SECRET` |
| `secrets/bybit.json` `api_key` / `api_secret` | `BYBIT_API_KEY` / `BYBIT_API_SECRET` |
| `secrets/hyperliquid.json` `wallet_address` / `private_key` | `HYPERLIQUID_WALLET_ADDRESS` / `HYPERLIQUID_PRIVATE_KEY` |
| `secrets/alpaca.json` `api_key_id` / `secret_key` | `ALPACA_API_KEY_ID` / `ALPACA_SECRET_KEY` |
| `secrets/macro.json` `fred_api_key` / `finnhub_api_key` | `FRED_API_KEY` / `FINNHUB_API_KEY` |
| `secrets/sentiment.json` `cryptopanic_key` / `lunarcrush_key` | `CRYPTOPANIC_KEY` / `LUNARCRUSH_KEY` |

4. Update the bot clients:
   - the API paths stay identical (`/mcp-{exchange}/tools/{tool}`)
   - replace `core.token` / `observer.token` with `TESTNET_TOKEN` or
     `MAINNET_TOKEN`, depending on the environment desired for each call
5. Shut down V1 (`docker compose -f <old compose> down`) and start V2
   (`docker compose up -d`).
6. Verify `/health` and `/apidocs`.
If needed, you can roll back by pulling the V1 image tags
(`cerbero-mcp-*:1.x`); note however that the `.env` and `secrets/` formats
are incompatible between V1 and V2, so keep separate backups.
## Architecture
Full design spec and implementation plan:
- [`docs/superpowers/specs/2026-04-30-V2.0.0-unified-image-token-routing-design.md`](docs/superpowers/specs/2026-04-30-V2.0.0-unified-image-token-routing-design.md)
- [`docs/superpowers/plans/2026-04-30-V2.0.0-unified-image-token-routing.md`](docs/superpowers/plans/2026-04-30-V2.0.0-unified-image-token-routing.md)
Runtime flow summary:
```
Bot → Authorization: Bearer <TESTNET|MAINNET>_TOKEN
FastAPI auth middleware → request.state.environment ∈ {testnet, mainnet}
Router /mcp-{exchange}/tools/{tool}
ClientRegistry.get(exchange, env) → lazily cached client (HTTP/WS pool reused)
Tool function (pure logic) → exchange API
```
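The `ClientRegistry.get(exchange, env)` step in the flow above is essentially a lazily populated dict keyed by `(exchange, env)`. A minimal sketch of the pattern (illustrative, not the actual `client_registry.py` code):

```python
from typing import Any, Callable

class ClientRegistry:
    """Lazy {(exchange, env): client} cache; a sketch of the pattern, not the real code."""

    def __init__(self, factory: Callable[[str, str], Any]):
        self._factory = factory                          # builds a client for (exchange, env)
        self._clients: dict[tuple[str, str], Any] = {}

    def get(self, exchange: str, env: str) -> Any:
        key = (exchange, env)
        if key not in self._clients:                     # first request builds the client
            self._clients[key] = self._factory(exchange, env)
        return self._clients[key]                        # later requests reuse it

# Demo with a trivial factory standing in for real exchange clients:
registry = ClientRegistry(lambda exchange, env: {"exchange": exchange, "env": env})
a = registry.get("deribit", "testnet")
b = registry.get("deribit", "testnet")
print(a is b)  # → True: the same cached client (and its pools) is reused
```

Keying on the pair is what lets a single container serve testnet and mainnet side by side: each environment gets its own client, and its HTTP/WS pools survive across requests.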
### Upstream URL overrides
Overriding upstream URLs from `.env` is complete for Deribit and
Hyperliquid. For Bybit it works through pybit's internal `endpoint`
attribute (a workaround documented in the client). For Alpaca the override
applies to the trading endpoint only: the data endpoints
(`data.alpaca.markets`) remain the SDK defaults.
## License
Private.
## IBKR Setup
IBKR uses OAuth 1.0a Self-Service for fully unattended runtime auth. Setup
is a one-time manual procedure per account (paper + live); afterwards the
container mints live session tokens autonomously.
### One-time setup
1. Login to https://www.interactivebrokers.com → User Settings → Self-Service OAuth
2. Generate keypairs locally:
```bash
uv run python scripts/ibkr_oauth_setup.py --env testnet
```
This writes RSA keys under `secrets/` and prints SHA-256 fingerprints.

The following build-pipeline notes belong to the general deploy flow (see also `DEPLOYMENT.md`):
1. **Local quality gate** (on the laptop, before pushing):
   - `uv run ruff check services/`
   - `uv run mypy services/common/src/mcp_common`
   - `uv run pytest services/`
   - `docker compose -f docker-compose.prod.yml config -q`
2. **Build & push** (on the laptop):
   ```bash
   export GITEA_PAT='<PAT_write:package>'
   ./scripts/build-push.sh                 # all 8 images
   ./scripts/build-push.sh base mcp-bybit  # only specific ones
   ```
   Tags `:latest` + `:sha-<short_HEAD>` for targeted rollbacks. Buildx
   cache via the registry itself (subsequent runs 5-10× faster).
3. **Auto-rollover on the VPS**: Watchtower polls the registry every 5 min
   and updates the containers when the digest behind the `:latest` tag changes.
See [`DEPLOYMENT.md`](DEPLOYMENT.md) for build & push, the no-clone VPS
deploy (`scripts/deploy-noclone.sh`), smoke tests, and rollback.
## Avvio locale (dev)
3. Register the two fingerprints in the IBKR portal. Receive a `consumer_key`.
4. Get a request token + authorization URL:
```bash
uv run python scripts/ibkr_oauth_setup.py --env testnet \
--consumer-key <K> --request-token
```
5. Open the URL, authorize, copy the `verifier_code`.
6. Exchange verifier for long-lived access token (~5 years validity):
```bash
uv run python scripts/ibkr_oauth_setup.py --env testnet --verifier <V>
```
7. Copy the printed values into `.env`:
- `IBKR_CONSUMER_KEY_TESTNET`
- `IBKR_ACCESS_TOKEN_TESTNET`
- `IBKR_ACCESS_TOKEN_SECRET_TESTNET`
- `IBKR_SIGNATURE_KEY_PATH_TESTNET`
- `IBKR_ENCRYPTION_KEY_PATH_TESTNET`
- `IBKR_ACCOUNT_ID_TESTNET` (e.g., `DU1234567` for paper)
- `IBKR_DH_PRIME` (hex from portal; shared paper/live)
8. Repeat with `--env mainnet` for live trading.
### Smoke test
```bash
docker compose up -d
bash tests/smoke/run.sh
curl https://cerbero-mcp.<dom>/mcp-ibkr/tools/get_account \
-H "Authorization: Bearer <TESTNET_TOKEN>" -X POST -d '{}'
```
## Configuration
See `secrets/*.json` and the `*_TESTNET` / `ALPACA_PAPER` variables in
`docker-compose.yml` for environment overrides.
### Deploying to a public VPS (`cerbero-mcp.tielogic.xyz`)
See [`DEPLOYMENT.md`](DEPLOYMENT.md) for the complete end-to-end runbook.
The Caddy gateway is configured for:
- Automatic TLS via Let's Encrypt (requires a DNS A/AAAA record pointing
  to the VPS and ports 80+443 reachable).
- HSTS preload and security headers (`X-Content-Type-Options`,
  `X-Frame-Options`, `Referrer-Policy`).
- Per-IP rate limiting (60 req/min on reads, 10 req/min on writes) via the
  `mholt/caddy-ratelimit` plugin.
- An IP allowlist on write endpoints (`place_*`, `cancel_*`, `set_*`,
  `close_*`, `transfer_*`, `amend_*`, `switch_*`): IPs not present in
  `WRITE_ALLOWLIST` receive `403 forbidden`.
Environment variables for the deploy:
```bash
# .env (on the VPS)
ACME_EMAIL=adrianodalpastro@tielogic.com
GATEWAY_HTTP_PORT=80
GATEWAY_HTTPS_PORT=443
# Write-endpoint allowlist (space-separated CIDRs). The default covers:
# - IPv4/IPv6 loopback (a bot on the VPS host calls http://localhost)
# - Docker bridge 172.16.0.0/12 (a bot container on the same compose network)
# Add the public IPs of your external bots, if any.
WRITE_ALLOWLIST="127.0.0.1/32 ::1/128 172.16.0.0/12 1.2.3.4/32"
```
### Key rotation
```bash
# 1. Generate new keypairs alongside the existing ones
uv run python scripts/ibkr_oauth_setup.py --env testnet --rotate
# 2. Register the new fingerprints in the IBKR portal, get new consumer_key + tokens
# 3. Confirm the rotation (atomic swap with auto-rollback on validation failure)
curl -X POST "https://cerbero-mcp.<dom>/admin/ibkr/rotate-keys/confirm?env=testnet" \
  -H "Authorization: Bearer <ADMIN_TOKEN>" -H "Content-Type: application/json" \
  -d '{"new_consumer_key":"...","new_access_token":"...","new_access_token_secret":"..."}'
```
Three scenarios for the trading bot:
1. Bot container on the same compose network → calls `http://gateway:80`
   internally. Source IP = Docker bridge → covered by the default.
2. Bot process on the VPS host → calls `http://localhost`. Source IP =
   `127.0.0.1` → covered by the default.
3. External bot (laptop, another server) → calls
   `https://cerbero-mcp.tielogic.xyz` over TLS. You must add the bot's
   public IP to `WRITE_ALLOWLIST`.
Without configuring `WRITE_ALLOWLIST`, the default is loopback + Docker bridge:
no external public IP can trigger orders.
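As an illustration of what the allowlist match does (the real check is performed by Caddy's `remote_ip` matcher; this standalone sketch only reproduces the CIDR logic with the standard library):

```python
import ipaddress

def is_write_allowed(client_ip: str, allowlist: str) -> bool:
    """True if client_ip falls inside any CIDR of the space-separated allowlist.
    Mixed IPv4/IPv6 entries are fine: a version mismatch simply doesn't match."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in ipaddress.ip_network(cidr) for cidr in allowlist.split())

ALLOW = "127.0.0.1/32 ::1/128 172.16.0.0/12"
print(is_write_allowed("172.17.0.5", ALLOW))   # Docker bridge → True
print(is_write_allowed("203.0.113.9", ALLOW))  # external IP → False
```

With the default allowlist, only loopback and the Docker bridge range pass; the fourth CIDR in the example above (`1.2.3.4/32`) is the slot for an external bot's public IP.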
On the VPS host, the secrets must have restrictive permissions:
```bash
chmod 600 secrets/*.json secrets/*.token
```
### Environment resolution (testnet/mainnet)
Every exchange service uses `mcp_common.environment.resolve_environment()`,
which applies this precedence:
1. override env var (`DERIBIT_TESTNET`, `BYBIT_TESTNET`,
   `HYPERLIQUID_TESTNET`, `ALPACA_PAPER`)
2. flag in the secret JSON (`testnet`, or `paper` for alpaca)
3. default `testnet`
The canonical live/testnet URLs are passed as kwargs
`default_base_url_live` / `default_base_url_testnet` directly to the
resolver; there is no need to duplicate them in the secret JSON, but if
present there they take precedence over the code defaults.
# docker-compose.prod.yml: production VPS deployment.
#
# Differences vs docker-compose.yml (dev):
# - No `build:`, only `image:` from the Gitea registry.
# - Tag `latest` (Watchtower polls for new versions).
# - Adds a `watchtower` service that auto-updates containers labelled
#   `com.centurylinklabs.watchtower.enable=true` when the latest tag changes.
# - Registry auth: `docker login git.tielogic.xyz` once on the host
#   (Watchtower reads ~/.docker/config.json bind-mounted at /config.json).
#
# Usage on the VPS:
#   docker login git.tielogic.xyz
#   docker compose -f docker-compose.prod.yml --env-file .env up -d
#
# Variable overrides in `.env` next to the compose file:
#   ACME_EMAIL=adrianodalpastro@tielogic.com
#   WRITE_ALLOWLIST="127.0.0.1/32 ::1/128 172.16.0.0/12"
#   GATEWAY_HTTP_PORT=80
#   GATEWAY_HTTPS_PORT=443
#   IMAGE_TAG=latest  # or sha-XXXXXXX to pin a specific build
networks:
  internal:
    driver: bridge

volumes:
  caddy-data:
  caddy-config:

secrets:
  deribit_credentials:
    file: ./secrets/deribit.json
  hyperliquid_wallet:
    file: ./secrets/hyperliquid.json
  bybit_credentials:
    file: ./secrets/bybit.json
  alpaca_credentials:
    file: ./secrets/alpaca.json
  macro_credentials:
    file: ./secrets/macro.json
  sentiment_credentials:
    file: ./secrets/sentiment.json
  core_token:
    file: ./secrets/core.token
  observer_token:
    file: ./secrets/observer.token

x-common-security: &common-security
  cap_drop: [ALL]
  security_opt:
    - no-new-privileges:true
  restart: unless-stopped
  networks: [internal]
  labels:
    com.centurylinklabs.watchtower.enable: "true"
  volumes:
    - ${AUDIT_LOG_DIR:-/var/log/cerbero-mcp}:/var/log/cerbero-mcp:rw

x-image-prefix: &image_prefix git.tielogic.xyz/adriano/cerbero-mcp

services:
  gateway:
    image: ${IMAGE_PREFIX:-git.tielogic.xyz/adriano/cerbero-mcp}/gateway:${IMAGE_TAG:-latest}
    restart: unless-stopped
    networks: [internal]
    security_opt:
      - no-new-privileges:true
    labels:
      com.centurylinklabs.watchtower.enable: "true"
    ports:
      - "${GATEWAY_HTTP_PORT:-80}:80"
      - "${GATEWAY_HTTPS_PORT:-443}:443"
    environment:
      ACME_EMAIL: ${ACME_EMAIL:-adrianodalpastro@tielogic.com}
      WRITE_ALLOWLIST: ${WRITE_ALLOWLIST:-127.0.0.1/32 ::1/128 172.16.0.0/12}
    volumes:
      - ./gateway/Caddyfile:/etc/caddy/Caddyfile:ro
      - ./gateway/public:/srv:ro
      - caddy-data:/data
      - caddy-config:/config
    depends_on:
      mcp-deribit: { condition: service_healthy }
      mcp-hyperliquid: { condition: service_healthy }
      mcp-bybit: { condition: service_healthy }
      mcp-alpaca: { condition: service_healthy }
      mcp-macro: { condition: service_healthy }
      mcp-sentiment: { condition: service_healthy }
    healthcheck:
      test: ["CMD", "wget", "-q", "--spider", "http://localhost/"]
      interval: 30s
      timeout: 5s
      retries: 3

  mcp-deribit:
    image: ${IMAGE_PREFIX:-git.tielogic.xyz/adriano/cerbero-mcp}/mcp-deribit:${IMAGE_TAG:-latest}
    <<: *common-security
    user: "1000:1000"
    read_only: true
    tmpfs:
      - /tmp:rw,size=64M,mode=1777
    secrets: [deribit_credentials, core_token, observer_token]
    environment:
      CREDENTIALS_FILE: /run/secrets/deribit_credentials
      CORE_TOKEN_FILE: /run/secrets/core_token
      OBSERVER_TOKEN_FILE: /run/secrets/observer_token
      DERIBIT_TESTNET: "${DERIBIT_TESTNET:-true}"
      ROOT_PATH: /mcp-deribit
      AUDIT_LOG_FILE: /var/log/cerbero-mcp/deribit.audit.jsonl

  mcp-hyperliquid:
    image: ${IMAGE_PREFIX:-git.tielogic.xyz/adriano/cerbero-mcp}/mcp-hyperliquid:${IMAGE_TAG:-latest}
    <<: *common-security
    user: "1000:1000"
    read_only: true
    tmpfs:
      - /tmp:rw,size=64M,mode=1777
    secrets: [hyperliquid_wallet, core_token, observer_token]
    environment:
      HYPERLIQUID_WALLET_FILE: /run/secrets/hyperliquid_wallet
      CORE_TOKEN_FILE: /run/secrets/core_token
      OBSERVER_TOKEN_FILE: /run/secrets/observer_token
      HYPERLIQUID_TESTNET: "${HYPERLIQUID_TESTNET:-true}"
      ROOT_PATH: /mcp-hyperliquid
      AUDIT_LOG_FILE: /var/log/cerbero-mcp/hyperliquid.audit.jsonl

  mcp-bybit:
    image: ${IMAGE_PREFIX:-git.tielogic.xyz/adriano/cerbero-mcp}/mcp-bybit:${IMAGE_TAG:-latest}
    <<: *common-security
    user: "1000:1000"
    read_only: true
    tmpfs:
      - /tmp:rw,size=64M,mode=1777
    secrets: [bybit_credentials, core_token, observer_token]
    environment:
      BYBIT_CREDENTIALS_FILE: /run/secrets/bybit_credentials
      CORE_TOKEN_FILE: /run/secrets/core_token
      OBSERVER_TOKEN_FILE: /run/secrets/observer_token
      BYBIT_TESTNET: "${BYBIT_TESTNET:-true}"
      ROOT_PATH: /mcp-bybit
      AUDIT_LOG_FILE: /var/log/cerbero-mcp/bybit.audit.jsonl
      PORT: "9019"

  mcp-alpaca:
    image: ${IMAGE_PREFIX:-git.tielogic.xyz/adriano/cerbero-mcp}/mcp-alpaca:${IMAGE_TAG:-latest}
    <<: *common-security
    user: "1000:1000"
    read_only: true
    tmpfs:
      - /tmp:rw,size=64M,mode=1777
    secrets: [alpaca_credentials, core_token, observer_token]
    environment:
      ALPACA_CREDENTIALS_FILE: /run/secrets/alpaca_credentials
      CORE_TOKEN_FILE: /run/secrets/core_token
      OBSERVER_TOKEN_FILE: /run/secrets/observer_token
      ALPACA_PAPER: "${ALPACA_PAPER:-true}"
      ROOT_PATH: /mcp-alpaca
      AUDIT_LOG_FILE: /var/log/cerbero-mcp/alpaca.audit.jsonl
      PORT: "9020"

  mcp-macro:
    image: ${IMAGE_PREFIX:-git.tielogic.xyz/adriano/cerbero-mcp}/mcp-macro:${IMAGE_TAG:-latest}
    <<: *common-security
    user: "1000:1000"
    read_only: true
    tmpfs:
      - /tmp:rw,size=64M,mode=1777
    secrets: [macro_credentials, core_token, observer_token]
    environment:
      MACRO_CREDENTIALS_FILE: /run/secrets/macro_credentials
      CORE_TOKEN_FILE: /run/secrets/core_token
      OBSERVER_TOKEN_FILE: /run/secrets/observer_token
      ROOT_PATH: /mcp-macro

  mcp-sentiment:
    image: ${IMAGE_PREFIX:-git.tielogic.xyz/adriano/cerbero-mcp}/mcp-sentiment:${IMAGE_TAG:-latest}
    <<: *common-security
    user: "1000:1000"
    read_only: true
    tmpfs:
      - /tmp:rw,size=64M,mode=1777
    secrets: [sentiment_credentials, core_token, observer_token]
    environment:
      SENTIMENT_CREDENTIALS_FILE: /run/secrets/sentiment_credentials
      CORE_TOKEN_FILE: /run/secrets/core_token
      OBSERVER_TOKEN_FILE: /run/secrets/observer_token
      ROOT_PATH: /mcp-sentiment

  # ========================================================
  # WATCHTOWER: auto-updates containers labelled enable=true
  # ========================================================
  watchtower:
    image: containrrr/watchtower:latest
    restart: unless-stopped
    networks: [internal]
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
      - ${HOME}/.docker/config.json:/config.json:ro
    environment:
      WATCHTOWER_LABEL_ENABLE: "true"
      WATCHTOWER_CLEANUP: "true"
      WATCHTOWER_POLL_INTERVAL: "${WATCHTOWER_POLL_INTERVAL:-300}"
      WATCHTOWER_INCLUDE_RESTARTING: "false"
      WATCHTOWER_NOTIFICATIONS_LEVEL: info
      WATCHTOWER_LOG_LEVEL: info
    command: --interval ${WATCHTOWER_POLL_INTERVAL:-300}
# docker-compose.traefik.yml: overlay to integrate Cerbero_mcp with a
# Traefik instance already running on the host (e.g. the same VPS hosting Gitea).
#
# USAGE:
#   docker compose -f docker-compose.prod.yml -f docker-compose.traefik.yml \
#     --env-file .env up -d
#
# Differences vs standalone docker-compose.prod.yml:
# - The Caddy gateway has NO host port bindings (Traefik is the public entry
#   point on 80/443).
# - The gateway is also connected to the external `traefik` network (override
#   the TRAEFIK_NETWORK env if it differs, e.g. `gitea_traefik-public`).
# - Caddy does NOT do auto-TLS: Traefik terminates TLS and handles ACME
#   Let's Encrypt. Caddy listens in plaintext on :80 inside the Docker network.
# - Trusted proxies: Caddy honours the X-Forwarded-For it receives from Traefik
#   for the `remote_ip` match (rate limit + WRITE_ALLOWLIST).
# - Traefik labels on the gateway: routing Host(`cerbero-mcp.tielogic.xyz`) +
#   automatic TLS.
#
# Additional required .env variables:
#   TRAEFIK_NETWORK=gitea_traefik-public  # name of the Traefik network
#   TRAEFIK_CERTRESOLVER=letsencrypt      # resolver name in your Traefik config
#   TRAEFIK_ENTRYPOINT=websecure          # Traefik HTTPS entrypoint
networks:
  traefik:
    external: true
    name: ${TRAEFIK_NETWORK:-gitea_traefik-public}

services:
  gateway:
    # Override: no host port binding; traffic flows only through Traefik
    ports: !reset []
    networks:
      - internal
      - traefik
    environment:
      ACME_EMAIL: ${ACME_EMAIL:-adrianodalpastro@tielogic.com}
      WRITE_ALLOWLIST: ${WRITE_ALLOWLIST:-127.0.0.1/32 ::1/128 172.16.0.0/12}
      # Behind-proxy mode: Caddy listens on plain HTTP :80, no auto_https
      LISTEN: ":80"
      AUTO_HTTPS: "off"
      # Traefik is the forwarding proxy; trust private ranges + optionally Traefik's CIDR
      TRUSTED_PROXIES: ${TRUSTED_PROXIES:-private_ranges}
    labels:
      com.centurylinklabs.watchtower.enable: "true"
      traefik.enable: "true"
      traefik.docker.network: ${TRAEFIK_NETWORK:-gitea_traefik-public}
      traefik.http.routers.cerbero-mcp.rule: "Host(`cerbero-mcp.tielogic.xyz`)"
      traefik.http.routers.cerbero-mcp.entrypoints: ${TRAEFIK_ENTRYPOINT:-websecure}
      traefik.http.routers.cerbero-mcp.tls: "true"
      traefik.http.routers.cerbero-mcp.tls.certresolver: ${TRAEFIK_CERTRESOLVER:-letsencrypt}
      traefik.http.services.cerbero-mcp.loadbalancer.server.port: "80"
      # Security headers at the Traefik level (redundant with Caddy, but useful
      # if Caddy is ever removed). Comment out to avoid the duplication.
      traefik.http.routers.cerbero-mcp.middlewares: cerbero-mcp-secheaders@docker
      traefik.http.middlewares.cerbero-mcp-secheaders.headers.stsSeconds: "31536000"
      traefik.http.middlewares.cerbero-mcp-secheaders.headers.stsIncludeSubdomains: "true"
      traefik.http.middlewares.cerbero-mcp-secheaders.headers.contentTypeNosniff: "true"
      traefik.http.middlewares.cerbero-mcp-secheaders.headers.referrerPolicy: "no-referrer"
# docker-compose.yml (dev): diff of the V1 multi-service file vs the V2 single-service file
networks:
internal:
driver: bridge
volumes:
caddy-data:
caddy-config:
secrets:
deribit_credentials:
file: ./secrets/deribit.json
hyperliquid_wallet:
file: ./secrets/hyperliquid.json
bybit_credentials:
file: ./secrets/bybit.json
alpaca_credentials:
file: ./secrets/alpaca.json
macro_credentials:
file: ./secrets/macro.json
sentiment_credentials:
file: ./secrets/sentiment.json
core_token:
file: ./secrets/core.token
observer_token:
file: ./secrets/observer.token
x-common-security: &common-security
cap_drop: [ALL]
security_opt:
- no-new-privileges:true
restart: unless-stopped
networks: [internal]
services:
# ========================================================
# GATEWAY: the only host port, reverse proxy + landing page
# ========================================================
gateway:
build:
context: ./gateway
dockerfile: Dockerfile
image: cerbero-gateway:dev
restart: unless-stopped
networks: [internal]
security_opt:
- no-new-privileges:true
ports:
- "${GATEWAY_HTTP_PORT:-80}:80"
- "${GATEWAY_HTTPS_PORT:-443}:443"
environment:
ACME_EMAIL: ${ACME_EMAIL:-adrianodalpastro@tielogic.com}
WRITE_ALLOWLIST: ${WRITE_ALLOWLIST:-127.0.0.1/32 ::1/128 172.16.0.0/12}
cerbero-mcp:
image: cerbero-mcp:2.0.0
build: .
container_name: cerbero-mcp
env_file: .env
volumes:
- ./gateway/Caddyfile:/etc/caddy/Caddyfile:ro
- ./gateway/public:/srv:ro
- caddy-data:/data
- caddy-config:/config
depends_on:
mcp-deribit: { condition: service_healthy }
mcp-hyperliquid: { condition: service_healthy }
mcp-bybit: { condition: service_healthy }
mcp-alpaca: { condition: service_healthy }
mcp-macro: { condition: service_healthy }
mcp-sentiment: { condition: service_healthy }
- ./secrets:/secrets:ro
restart: unless-stopped
healthcheck:
test: ["CMD", "wget", "-q", "--spider", "http://localhost/"]
test:
- "CMD"
- "python"
- "-c"
- "import os, urllib.request; urllib.request.urlopen(f'http://localhost:{os.environ.get(\"PORT\",\"9000\")}/health', timeout=3).close()"
interval: 30s
timeout: 5s
retries: 3
networks:
- traefik
labels:
- traefik.enable=true
- traefik.docker.network=traefik
- "traefik.http.routers.cerbero-mcp.rule=Host(`cerbero-mcp.${DOMAIN_NAME:-tielogic.xyz}`)"
- traefik.http.routers.cerbero-mcp.tls=true
- traefik.http.routers.cerbero-mcp.entrypoints=websecure
- traefik.http.routers.cerbero-mcp.tls.certresolver=mytlschallenge
- "traefik.http.services.cerbero-mcp.loadbalancer.server.port=${PORT:-9000}"
- "com.centurylinklabs.watchtower.enable=true"
# ========================================================
# MCP: reachable only via the gateway (no host ports)
# ========================================================
mcp-deribit:
image: cerbero-mcp-deribit:dev
build:
context: .
dockerfile: docker/mcp-deribit.Dockerfile
<<: *common-security
user: "1000:1000"
read_only: true
tmpfs:
- /tmp:rw,size=64M,mode=1777
secrets: [deribit_credentials, core_token, observer_token]
environment:
CREDENTIALS_FILE: /run/secrets/deribit_credentials
CORE_TOKEN_FILE: /run/secrets/core_token
OBSERVER_TOKEN_FILE: /run/secrets/observer_token
DERIBIT_TESTNET: "true" # override secrets/deribit.json testnet flag
ROOT_PATH: /mcp-deribit
mcp-hyperliquid:
image: cerbero-mcp-hyperliquid:dev
build:
context: .
dockerfile: docker/mcp-hyperliquid.Dockerfile
<<: *common-security
user: "1000:1000"
read_only: true
tmpfs:
- /tmp:rw,size=64M,mode=1777
secrets: [hyperliquid_wallet, core_token, observer_token]
environment:
HYPERLIQUID_WALLET_FILE: /run/secrets/hyperliquid_wallet
CORE_TOKEN_FILE: /run/secrets/core_token
OBSERVER_TOKEN_FILE: /run/secrets/observer_token
HYPERLIQUID_TESTNET: "true" # override secrets/hyperliquid.json testnet flag
ROOT_PATH: /mcp-hyperliquid
mcp-bybit:
image: cerbero-mcp-bybit:dev
build:
context: .
dockerfile: docker/mcp-bybit.Dockerfile
<<: *common-security
user: "1000:1000"
read_only: true
tmpfs:
- /tmp:rw,size=64M,mode=1777
secrets: [bybit_credentials, core_token, observer_token]
environment:
BYBIT_CREDENTIALS_FILE: /run/secrets/bybit_credentials
CORE_TOKEN_FILE: /run/secrets/core_token
OBSERVER_TOKEN_FILE: /run/secrets/observer_token
BYBIT_TESTNET: "true" # override secrets/bybit.json testnet flag
ROOT_PATH: /mcp-bybit
PORT: "9019"
mcp-alpaca:
image: cerbero-mcp-alpaca:dev
build:
context: .
dockerfile: docker/mcp-alpaca.Dockerfile
<<: *common-security
user: "1000:1000"
read_only: true
tmpfs:
- /tmp:rw,size=64M,mode=1777
secrets: [alpaca_credentials, core_token, observer_token]
environment:
ALPACA_CREDENTIALS_FILE: /run/secrets/alpaca_credentials
CORE_TOKEN_FILE: /run/secrets/core_token
OBSERVER_TOKEN_FILE: /run/secrets/observer_token
ALPACA_PAPER: "true"
ROOT_PATH: /mcp-alpaca
PORT: "9020"
mcp-macro:
image: cerbero-mcp-macro:dev
build:
context: .
dockerfile: docker/mcp-macro.Dockerfile
<<: *common-security
user: "1000:1000"
read_only: true
tmpfs:
- /tmp:rw,size=64M,mode=1777
secrets: [macro_credentials, core_token, observer_token]
environment:
MACRO_CREDENTIALS_FILE: /run/secrets/macro_credentials
CORE_TOKEN_FILE: /run/secrets/core_token
OBSERVER_TOKEN_FILE: /run/secrets/observer_token
ROOT_PATH: /mcp-macro
mcp-sentiment:
image: cerbero-mcp-sentiment:dev
build:
context: .
dockerfile: docker/mcp-sentiment.Dockerfile
<<: *common-security
user: "1000:1000"
read_only: true
tmpfs:
- /tmp:rw,size=64M,mode=1777
secrets: [sentiment_credentials, core_token, observer_token]
environment:
SENTIMENT_CREDENTIALS_FILE: /run/secrets/sentiment_credentials
CORE_TOKEN_FILE: /run/secrets/core_token
OBSERVER_TOKEN_FILE: /run/secrets/observer_token
ROOT_PATH: /mcp-sentiment
networks:
traefik:
external: true
FROM python:3.11-slim AS base
RUN apt-get update && apt-get install -y --no-install-recommends \
build-essential curl \
&& rm -rf /var/lib/apt/lists/*
RUN pip install --no-cache-dir "uv>=0.5,<0.7"
WORKDIR /app
COPY pyproject.toml uv.lock ./
COPY services/common ./services/common
RUN uv sync --frozen --no-dev --package mcp-common
ENV PATH="/app/.venv/bin:$PATH"
FROM base AS dev
RUN uv sync --frozen --package mcp-common
ARG BASE_IMAGE=cerbero-base
ARG BASE_TAG=latest
FROM ${BASE_IMAGE}:${BASE_TAG} AS builder
COPY services/mcp-alpaca ./services/mcp-alpaca
RUN uv sync --frozen --no-dev --package mcp-alpaca
FROM python:3.11-slim AS runtime
LABEL org.opencontainers.image.source="https://github.com/AdrianoDev/cerbero" \
cerbero.service="mcp-alpaca"
WORKDIR /app
COPY --from=builder /app /app
ENV PATH="/app/.venv/bin:$PATH"
RUN useradd -m -u 1000 app
USER app
ENV HOST=0.0.0.0 PORT=9020
EXPOSE 9020
HEALTHCHECK --interval=30s --timeout=5s --retries=3 --start-period=15s \
CMD python -c "import os, urllib.request; urllib.request.urlopen(f'http://localhost:{os.environ.get(\"PORT\",\"9020\")}/health', timeout=3).close()"
CMD ["mcp-alpaca"]
ARG BASE_IMAGE=cerbero-base
ARG BASE_TAG=latest
FROM ${BASE_IMAGE}:${BASE_TAG} AS builder
COPY services/mcp-bybit ./services/mcp-bybit
RUN uv sync --frozen --no-dev --package mcp-bybit
FROM python:3.11-slim AS runtime
LABEL org.opencontainers.image.source="https://github.com/AdrianoDev/cerbero" \
cerbero.service="mcp-bybit"
WORKDIR /app
COPY --from=builder /app /app
ENV PATH="/app/.venv/bin:$PATH"
RUN useradd -m -u 1000 app
USER app
ENV HOST=0.0.0.0 PORT=9019
EXPOSE 9019
HEALTHCHECK --interval=30s --timeout=5s --retries=3 --start-period=15s \
CMD python -c "import os, urllib.request; urllib.request.urlopen(f'http://localhost:{os.environ.get(\"PORT\",\"9019\")}/health', timeout=3).close()"
CMD ["mcp-bybit"]
# CER-P5-012 multi-stage slim: builder from cerbero-base (with uv + toolchain),
# runtime from python:3.11-slim (venv + source only).
ARG BASE_IMAGE=cerbero-base
ARG BASE_TAG=latest
FROM ${BASE_IMAGE}:${BASE_TAG} AS builder
COPY services/mcp-deribit ./services/mcp-deribit
RUN uv sync --frozen --no-dev --package mcp-deribit
FROM python:3.11-slim AS runtime
LABEL org.opencontainers.image.source="https://github.com/AdrianoDev/cerbero" \
cerbero.service="mcp-deribit"
WORKDIR /app
COPY --from=builder /app /app
ENV PATH="/app/.venv/bin:$PATH"
RUN useradd -m -u 1000 app
USER app
ENV HOST=0.0.0.0 PORT=9011
EXPOSE 9011
HEALTHCHECK --interval=30s --timeout=5s --retries=3 --start-period=15s \
CMD python -c "import os, urllib.request; urllib.request.urlopen(f'http://localhost:{os.environ.get(\"PORT\",\"9011\")}/health', timeout=3).close()"
CMD ["mcp-deribit"]
ARG BASE_IMAGE=cerbero-base
ARG BASE_TAG=latest
FROM ${BASE_IMAGE}:${BASE_TAG} AS builder
COPY services/mcp-hyperliquid ./services/mcp-hyperliquid
RUN uv sync --frozen --no-dev --package mcp-hyperliquid
FROM python:3.11-slim AS runtime
LABEL org.opencontainers.image.source="https://github.com/AdrianoDev/cerbero" \
cerbero.service="mcp-hyperliquid"
WORKDIR /app
COPY --from=builder /app /app
ENV PATH="/app/.venv/bin:$PATH"
RUN useradd -m -u 1000 app
USER app
ENV HOST=0.0.0.0 PORT=9012
EXPOSE 9012
HEALTHCHECK --interval=30s --timeout=5s --retries=3 --start-period=15s \
CMD python -c "import os, urllib.request; urllib.request.urlopen(f'http://localhost:{os.environ.get(\"PORT\",\"9012\")}/health', timeout=3).close()"
CMD ["mcp-hyperliquid"]
ARG BASE_IMAGE=cerbero-base
ARG BASE_TAG=latest
FROM ${BASE_IMAGE}:${BASE_TAG} AS builder
COPY services/mcp-macro ./services/mcp-macro
RUN uv sync --frozen --no-dev --package mcp-macro
FROM python:3.11-slim AS runtime
LABEL org.opencontainers.image.source="https://github.com/AdrianoDev/cerbero" \
cerbero.service="mcp-macro"
WORKDIR /app
COPY --from=builder /app /app
ENV PATH="/app/.venv/bin:$PATH"
RUN useradd -m -u 1000 app
USER app
ENV HOST=0.0.0.0 PORT=9013
EXPOSE 9013
HEALTHCHECK --interval=30s --timeout=5s --retries=3 --start-period=15s \
CMD python -c "import os, urllib.request; urllib.request.urlopen(f'http://localhost:{os.environ.get(\"PORT\",\"9013\")}/health', timeout=3).close()"
CMD ["mcp-macro"]
ARG BASE_IMAGE=cerbero-base
ARG BASE_TAG=latest
FROM ${BASE_IMAGE}:${BASE_TAG} AS builder
COPY services/mcp-sentiment ./services/mcp-sentiment
RUN uv sync --frozen --no-dev --package mcp-sentiment
FROM python:3.11-slim AS runtime
LABEL org.opencontainers.image.source="https://github.com/AdrianoDev/cerbero" \
cerbero.service="mcp-sentiment"
WORKDIR /app
COPY --from=builder /app /app
ENV PATH="/app/.venv/bin:$PATH"
RUN useradd -m -u 1000 app
USER app
ENV HOST=0.0.0.0 PORT=9014
EXPOSE 9014
HEALTHCHECK --interval=30s --timeout=5s --retries=3 --start-period=15s \
CMD python -c "import os, urllib.request; urllib.request.urlopen(f'http://localhost:{os.environ.get(\"PORT\",\"9014\")}/health', timeout=3).close()"
CMD ["mcp-sentiment"]
# Cerbero MCP — V2.0.0: Unified Image, Token-Based Environment Routing
**Branch:** `V2.0.0`
**Spec date:** 2026-04-30
**Author:** Adriano
## Executive summary
An architectural rewrite of Cerbero MCP to shrink the operational surface and
simplify testnet/mainnet management. We move from 7 Docker images
(Caddy gateway + 6 MCP services) to a **single image** hosting all the
exchange routers in one FastAPI process. The distinction between testnet and
mainnet, today decided at container boot via override variables and flags in
the secret JSONs, moves to **per-request runtime**: the bearer token
presented by the client (`Authorization: Bearer <token>`) determines which
upstream endpoint is contacted for that specific call. All configuration
converges into a single `.env` file, eliminating the separate secret JSONs.
Finally, interactive OpenAPI documentation is exposed via Swagger UI at
`/apidocs`.
## Motivation
- **Operations**: 7 containers, 8 secret files, 4 docker-compose overlays and
  a Caddy gateway with a rate-limit plugin are too much for the expected
  traffic volume. A single container is enough.
- **Environment flexibility**: today a bot that wants to read mainnet and
  write testnet has to coordinate two deployments. With per-request routing
  it just picks the right bearer for each call.
- **Configuration**: 8 secret JSONs + 2 token files + override variables
  across 4 compose files = distributed state that is hard to audit. A single
  `.env` makes it obvious what is configured.
- **Dev DX**: today even iterating requires `docker compose up`. V2 targets
  running `uv run cerbero-mcp` directly on a laptop, no Docker.
- **API discovery**: without Swagger, the only source of truth for the tools
  is the code. `/apidocs` makes the tools explorable from a browser, and
  `/openapi.json` makes them readable by LLMs without the full MCP protocol.
## Design decisions
| # | Question | Decision |
|---|---|---|
| 1 | Meaning of "pass the token to the function" | Per-request routing via bearer: the same container serves testnet and mainnet simultaneously, choosing the upstream URL at runtime |
| 2 | Token granularity | 2 global tokens (`TESTNET_TOKEN`, `MAINNET_TOKEN`) valid for all exchanges |
| 3 | core/observer ACL (read-only) | Removed. Write protection remains via the server-side leverage cap and the firewall (Traefik on the VPS) |
| 4 | "Single image" scope | One image, **one container** with an internal multi-router (one Python process) |
| 5 | L7 gateway | Removed from the repo. On the prod VPS, Traefik is managed externally. No gateway in dev |
| 6 | Configuration format | Everything in `.env`. No JSON. The port lives in `.env` |
| 7 | Swagger | `/apidocs` ON, `/openapi.json` ON, `/redoc` OFF, `/docs` OFF |
| 8a | Exchange dispatch | Path-based: `/mcp-{exchange}/tools/{tool}` (backward-compat with V1 bots) |
| 8b | Exchange client lifecycle | Lazy cache `(exchange, env) → client`, max 8 clients (4 exchanges × 2 envs) + 2 read-only clients (macro, sentiment) |
The following were also decided explicitly:
- **Macro and sentiment require a valid token**: even the purely read-only
  tools go through the auth middleware. A missing or unrecognized bearer
  → 401. The token's value (testnet or mainnet) is ignored for macro and
  sentiment because they have no testnet endpoints, but one of the two must
  be present.
- **No `ALLOW_MAINNET` policy**: protection against accidental mainnet use is
  delegated to (a) token custody (the mainnet token goes only to authorized
  bots), (b) the Traefik IP allowlist on the VPS, (c) the already-existing
  server-side leverage cap for each exchange.
- A **minimal `docker-compose.yml`** is kept for those who want to use Docker
  locally; `docker-compose.{prod,traefik,local}.yml` are removed.
## Architecture
### Runtime stack
```
Prod (VPS):
Traefik (TLS, allowlist) ──▶ container cerbero-mcp:9000
Dev (laptop):
uv run cerbero-mcp ──▶ http://localhost:9000
```
No gateway in the repo. Traefik is managed outside this project, with labels
added through an external compose override. In dev, FastAPI is exposed
directly via uvicorn.
### Source layout
```
Cerbero_mcp/
├── pyproject.toml          # single "cerbero-mcp" package
├── uv.lock
├── Dockerfile              # multi-stage builder + slim runtime
├── docker-compose.yml      # minimal: 1 service, env_file: .env
├── .env.example            # complete template, version-controlled
├── .gitignore              # .env excluded
├── README.md               # rewritten for V2
├── docs/
│   └── superpowers/
│       ├── specs/
│       │   ├── 2026-04-27-cot-report-design.md (historical)
│       │   └── 2026-04-30-V2.0.0-unified-image-...-design.md
│       └── plans/
│           ├── 2026-04-27-cot-report.md (historical)
│           └── 2026-04-30-V2.0.0-unified-image-...-plan.md
├── src/cerbero_mcp/
│   ├── __init__.py
│   ├── __main__.py         # entrypoint: uvicorn.run(app, ...)
│   ├── settings.py         # Pydantic Settings for .env
│   ├── auth.py             # bearer → environment
│   ├── server.py           # build_app(): FastAPI + middleware + handlers + swagger
│   ├── client_registry.py  # lazy cache {(exchange, env): client}
│   ├── routers/
│   │   ├── __init__.py
│   │   ├── deribit.py
│   │   ├── bybit.py
│   │   ├── hyperliquid.py
│   │   ├── alpaca.py
│   │   ├── macro.py
│   │   └── sentiment.py
│   ├── exchanges/
│   │   ├── deribit/{client.py, tools.py, leverage_cap.py}
│   │   ├── bybit/{client.py, tools.py, leverage_cap.py}
│   │   ├── hyperliquid/{client.py, tools.py, leverage_cap.py}
│   │   ├── alpaca/{client.py, tools.py, leverage_cap.py}
│   │   ├── macro/{client.py, tools.py}
│   │   └── sentiment/{client.py, tools.py}
│   └── common/
│       ├── indicators.py
│       ├── options.py
│       ├── microstructure.py
│       ├── stats.py
│       ├── http.py
│       ├── audit.py
│       ├── logging.py
│       └── errors.py       # error envelope
├── tests/
│   ├── unit/
│   ├── integration/
│   └── smoke/
└── scripts/
    └── build-push.sh       # builds a single image
```
**What gets removed:**
- `services/` (the whole 7-sub-package monorepo layout, replaced by
  `src/cerbero_mcp/`)
- `gateway/` (Caddy + Caddyfile + landing page)
- `secrets/` (8 JSON files + 2 token files)
- `docker/` (7 separate Dockerfiles, replaced by a root `Dockerfile`)
- `docker-compose.prod.yml`, `docker-compose.local.yml`,
  `docker-compose.traefik.yml`
- `DEPLOYMENT.md` (the still-valid content folds into `README.md`)
- `mcp_common.environment` (boot-time resolver, replaced by the runtime
  `auth.py`)
- `mcp_common.env_validation` (replaced by Pydantic Settings)
- `mcp_common.app_factory` (boot boilerplate, folded into `server.py`)
**What stays the same:**
- Path layout `/mcp-{exchange}/tools/{tool}` (backward-compat with V1 bots)
- The individual MCP tools: signatures, response shapes, error envelope,
  `X-Data-Timestamp` and `X-Duration-Ms` headers
- Quantitative indicator logic (`indicators`, `options`,
  `microstructure`, `stats`)
- `/health` healthcheck (identical format)
- Server-side leverage cap per exchange
- The MCP-bridge tool (if in use), preserved in `common/mcp_bridge.py`
### Request flow
```
1. Bot HTTP request
   POST /mcp-deribit/tools/place_order
   Authorization: Bearer tk_test_xxx
   { "symbol":"BTC-PERPETUAL", "side":"buy", "qty": 0.1 }
2. AuthBearer middleware (auth.py)
   - path whitelist: /apidocs, /openapi.json, /health → bypass
   - extracts the bearer
   - compares against settings.testnet_token / settings.mainnet_token
   - testnet match → request.state.environment = "testnet"
   - mainnet match → request.state.environment = "mainnet"
   - no match → 401 UNAUTHORIZED
3. Deribit router (routers/deribit.py)
   - FastAPI validates the body against the Pydantic schema
   - dependency get_env(request) -> "testnet"|"mainnet"
   - dependency get_client(env) -> DeribitClient
4. Client registry (client_registry.py)
   - key (exchange="deribit", env="testnet")
   - cache hit → return; miss → build client lazily + initial auth + cache
5. Tool impl (exchanges/deribit/tools.py)
   - leverage_cap.enforce(qty, max_leverage)
   - client.place_order(...)
   - standard response shape with data_timestamp, request_id
6. Response middleware
   - X-Duration-Ms header
   - data_timestamp injection if missing
7. 200 JSON
```
### Auth
```python
# src/cerbero_mcp/auth.py (synthetic example)
import secrets
from typing import Literal

from fastapi import Request
from fastapi.responses import JSONResponse

Environment = Literal["testnet", "mainnet"]
WHITELIST_PATHS = {"/health", "/apidocs", "/openapi.json"}

async def auth_middleware(request: Request, call_next):
    if request.url.path in WHITELIST_PATHS:
        return await call_next(request)
    auth = request.headers.get("Authorization", "")
    if not auth.startswith("Bearer "):
        # NB: an HTTPException raised inside middleware bypasses the exception
        # handlers, so the 401 is returned directly.
        return JSONResponse({"detail": "missing bearer token"}, status_code=401)
    token = auth[len("Bearer "):].strip()
    settings = request.app.state.settings
    if secrets.compare_digest(token, settings.testnet_token.get_secret_value()):
        request.state.environment = "testnet"
    elif secrets.compare_digest(token, settings.mainnet_token.get_secret_value()):
        request.state.environment = "mainnet"
    else:
        return JSONResponse({"detail": "invalid token"}, status_code=401)
    return await call_next(request)
```
Token comparison uses `secrets.compare_digest` to avoid timing attacks.
### Client registry
```python
# src/cerbero_mcp/client_registry.py (sketch)
import asyncio
from collections import defaultdict
from typing import Any

from cerbero_mcp.auth import Environment  # Literal["testnet", "mainnet"]

class ClientRegistry:
    def __init__(self, settings):
        self._settings = settings
        self._clients: dict[tuple[str, Environment], Any] = {}
        self._locks: dict[tuple[str, Environment], asyncio.Lock] = defaultdict(asyncio.Lock)

    async def get(self, exchange: str, env: Environment) -> Any:
        key = (exchange, env)
        if key in self._clients:
            return self._clients[key]
        async with self._locks[key]:
            if key in self._clients:  # double-check after acquiring the lock
                return self._clients[key]
            client = await self._build(exchange, env)
            self._clients[key] = client
            return client

    async def aclose(self):
        for c in self._clients.values():
            await c.aclose()
```
- **Lazy** construction on first use → fast boot, no auth against exchanges
  that are never used
- **Per-key lock** prevents double instantiation under a race
- Macro and sentiment use the same client for testnet and mainnet (the env is
  ignored), but for API uniformity they still receive the
  `(exchange, env)` key
- FastAPI lifespan: registry created on `startup`, closed on `shutdown`
  (closes HTTP pools, any websockets, sessions)
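The lifespan wiring described above can be sketched with stdlib stand-ins (`FakeClient` and the `SimpleNamespace` app are illustrative; with FastAPI you would pass `lifespan` to `FastAPI(lifespan=...)`):

```python
import asyncio
from contextlib import asynccontextmanager
from types import SimpleNamespace

class FakeClient:
    """Stand-in for an exchange client with an async close."""
    def __init__(self):
        self.closed = False
    async def aclose(self):
        self.closed = True

class ClientRegistry:
    def __init__(self):
        self._clients = {}
    async def get(self, exchange, env):
        key = (exchange, env)
        if key not in self._clients:          # lazy build on first use
            self._clients[key] = FakeClient()
        return self._clients[key]
    async def aclose(self):                   # closes pools/websockets/sessions
        for c in self._clients.values():
            await c.aclose()

@asynccontextmanager
async def lifespan(app):
    app.state = SimpleNamespace(registry=ClientRegistry())   # startup
    try:
        yield
    finally:
        await app.state.registry.aclose()                    # shutdown

async def main() -> bool:
    app = SimpleNamespace()
    async with lifespan(app):
        client = await app.state.registry.get("deribit", "testnet")
        assert client.closed is False         # open while the app runs
    return client.closed

closed_after_shutdown = asyncio.run(main())
print(closed_after_shutdown)  # True
```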
### Configuration: `.env`
Single source of truth, read by Pydantic Settings at boot. The file is
versioned as `.env.example` with empty placeholders; the real `.env` is
gitignored.
```bash
# SERVER
HOST=0.0.0.0
PORT=9000
LOG_LEVEL=info
# AUTH
TESTNET_TOKEN=
MAINNET_TOKEN=
# DERIBIT
DERIBIT_CLIENT_ID=
DERIBIT_CLIENT_SECRET=
DERIBIT_URL_LIVE=https://www.deribit.com/api/v2
DERIBIT_URL_TESTNET=https://test.deribit.com/api/v2
DERIBIT_MAX_LEVERAGE=3
# BYBIT
BYBIT_API_KEY=
BYBIT_API_SECRET=
BYBIT_URL_LIVE=https://api.bybit.com
BYBIT_URL_TESTNET=https://api-testnet.bybit.com
BYBIT_MAX_LEVERAGE=3
# HYPERLIQUID
HYPERLIQUID_WALLET_ADDRESS=
HYPERLIQUID_API_WALLET_ADDRESS=
HYPERLIQUID_PRIVATE_KEY=
HYPERLIQUID_URL_LIVE=https://api.hyperliquid.xyz
HYPERLIQUID_URL_TESTNET=https://api.hyperliquid-testnet.xyz
HYPERLIQUID_MAX_LEVERAGE=3
# ALPACA
ALPACA_API_KEY_ID=
ALPACA_SECRET_KEY=
ALPACA_URL_LIVE=https://api.alpaca.markets
ALPACA_URL_TESTNET=https://paper-api.alpaca.markets
ALPACA_MAX_LEVERAGE=1
# MACRO
FRED_API_KEY=
FINNHUB_API_KEY=
# SENTIMENT
CRYPTOPANIC_KEY=
LUNARCRUSH_KEY=
```
Pydantic Settings with `SecretStr` for sensitive values avoids leaks in
logs/repr. `extra="ignore"` tolerates extra env vars (system variables,
Docker) without crashing. Validation fails fast at boot.
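The two behaviors just described (masked repr, fail-fast on missing fields) can be shown with a stdlib sketch; in the real code this is pydantic-settings with `SecretStr`, so the class names below are stand-ins:

```python
class SecretStr:
    """Minimal stand-in for pydantic's SecretStr: masked in repr/logs."""
    def __init__(self, value: str):
        self._value = value
    def get_secret_value(self) -> str:
        return self._value
    def __repr__(self):
        return "SecretStr('**********')"

class Settings:
    REQUIRED = ("TESTNET_TOKEN", "MAINNET_TOKEN")
    def __init__(self, env: dict[str, str]):
        missing = [k for k in self.REQUIRED if not env.get(k)]
        if missing:  # fail-fast at boot with a clear message
            raise ValueError(f"missing required settings: {missing}")
        self.testnet_token = SecretStr(env["TESTNET_TOKEN"])
        self.mainnet_token = SecretStr(env["MAINNET_TOKEN"])

s = Settings({"TESTNET_TOKEN": "tk_test_xxx", "MAINNET_TOKEN": "tk_live_yyy"})
print(repr(s.testnet_token))  # SecretStr('**********')
```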
### Swagger / OpenAPI
```python
app = FastAPI(
title="Cerbero MCP",
version="2.0.0",
description="Multi-exchange MCP server. Bearer token decides environment (testnet/mainnet).",
docs_url="/apidocs",
redoc_url=None,
openapi_url="/openapi.json",
swagger_ui_parameters={
"persistAuthorization": True,
"displayRequestDuration": True,
"filter": True,
"tryItOutEnabled": True,
"tagsSorter": "alpha",
"operationsSorter": "alpha",
},
)
```
A `securityScheme = BearerAuth` is added to every endpoint under `/mcp-*`.
Click Authorize in Swagger → enter the bearer → every "Try it out" request
sends the header. Changing the token switches environment without reloading.
Tags are organized per exchange:
- `system` → `/health`
- `deribit`, `bybit`, `hyperliquid`, `alpaca`, `macro`, `sentiment` → their
  respective tools
Every Pydantic request body includes `examples=[...]` with at least one
realistic example. For complex response shapes (gamma profile, orderbook
imbalance) the Pydantic response models include examples too.
### Dockerfile and image
```dockerfile
# syntax=docker/dockerfile:1.7
FROM python:3.11-slim AS builder
RUN apt-get update && apt-get install -y --no-install-recommends \
build-essential curl && rm -rf /var/lib/apt/lists/*
RUN pip install --no-cache-dir "uv>=0.5,<0.7"
WORKDIR /app
COPY pyproject.toml uv.lock ./
COPY src ./src
RUN uv sync --frozen --no-dev
FROM python:3.11-slim AS runtime
LABEL org.opencontainers.image.title="cerbero-mcp" \
org.opencontainers.image.version="2.0.0"
WORKDIR /app
COPY --from=builder /app /app
ENV PATH="/app/.venv/bin:$PATH" \
HOST=0.0.0.0 \
PORT=9000 \
PYTHONUNBUFFERED=1
RUN useradd -m -u 1000 app && chown -R app:app /app
USER app
EXPOSE 9000
HEALTHCHECK --interval=30s --timeout=5s --retries=3 --start-period=10s \
CMD python -c "import os, urllib.request; urllib.request.urlopen(f'http://localhost:{os.environ.get(\"PORT\",\"9000\")}/health', timeout=3).close()"
CMD ["cerbero-mcp"]
```
- A single image, `cerbero-mcp:2.0.0` (+ `:latest`)
- Expected build time: ~2-3 min (vs ~12 min × 7 images in V1)
- Expected image size: ~200 MB
- Non-root user `app:1000`
- Healthcheck reads `PORT` from env (honors `.env` overrides)
### Minimal docker-compose.yml
```yaml
services:
cerbero-mcp:
image: cerbero-mcp:2.0.0
build: .
container_name: cerbero-mcp
ports:
- "${PORT:-9000}:${PORT:-9000}"
env_file: .env
restart: unless-stopped
healthcheck:
test: ["CMD", "python", "-c",
"import os, urllib.request; urllib.request.urlopen(f'http://localhost:{os.environ.get(\"PORT\",\"9000\")}/health', timeout=3).close()"]
interval: 30s
timeout: 5s
retries: 3
```
On the VPS this is extended with an external override that applies the
Traefik labels (file not versioned in this repo).
## Errors
| Case | Status | Code |
|---|---|---|
| Missing Authorization header | 401 | UNAUTHORIZED |
| Token not in `{testnet, mainnet}` | 401 | UNAUTHORIZED |
| Invalid body (Pydantic) | 422 | INVALID_INPUT |
| Upstream exchange 5xx | 502 | UPSTREAM_ERROR |
| Upstream rate limit | 429 | RATE_LIMIT |
| Unhandled exception | 500 | UNHANDLED_EXCEPTION |
Error envelope identical to V1: fields `error.{type, code, message,
retryable, suggested_fix?, details?}`, `request_id`, `data_timestamp`.
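For illustration, a 502 upstream-error envelope with the fields listed above might look like this (field names come from the spec; the concrete values are invented):

```python
# Illustrative V1-compatible error envelope (values are made up).
envelope = {
    "error": {
        "type": "upstream",
        "code": "UPSTREAM_ERROR",
        "message": "Deribit returned HTTP 502",
        "retryable": True,
        "suggested_fix": "retry with backoff",   # optional field
        "details": {"upstream_status": 502},     # optional field
    },
    "request_id": "req_0123456789",
    "data_timestamp": "2026-05-10T08:27:33Z",
}
print(sorted(envelope))  # ['data_timestamp', 'error', 'request_id']
```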
## Test plan
**Unit:**
- `auth.py`: five cases (no header, malformed header, testnet token,
  mainnet token, invalid token). Verify that `request.state.environment` is
  set correctly.
- `auth.py`: token comparison uses `secrets.compare_digest` (verified by
  tests exercising both branches).
- `auth.py`: whitelisted paths (`/health`, `/apidocs`, `/openapi.json`)
  bypass the middleware.
- `client_registry.py`: concurrency — 10 parallel `get(deribit, testnet)`
  tasks, `_build` called exactly once.
- `client_registry.py`: distinct keys instantiate distinct clients
  (deribit/testnet ≠ deribit/mainnet ≠ bybit/testnet).
- `settings.py`: a valid `.env` loads without errors; a missing mandatory
  field raises ValidationError at boot.
- `common/`: all existing tests for indicators, options,
  microstructure, stats migrate 1:1.
- For each `exchanges/{exchange}/tools.py`: each tool with a stub client
  returns the expected response shape (existing V1 tests migrated).
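The registry concurrency case can be sketched without pytest (the `Registry` below is a trimmed stand-in for `client_registry.py`; `_build` sleeps to widen the race window):

```python
import asyncio
from collections import defaultdict

class Registry:
    def __init__(self):
        self._clients = {}
        self._locks = defaultdict(asyncio.Lock)
        self.build_calls = 0

    async def _build(self, exchange, env):
        self.build_calls += 1
        await asyncio.sleep(0.01)   # simulate slow auth/bootstrap
        return object()

    async def get(self, exchange, env):
        key = (exchange, env)
        if key in self._clients:
            return self._clients[key]
        async with self._locks[key]:
            if key in self._clients:          # double-checked after the lock
                return self._clients[key]
            self._clients[key] = await self._build(exchange, env)
            return self._clients[key]

async def main() -> int:
    reg = Registry()
    clients = await asyncio.gather(*(reg.get("deribit", "testnet") for _ in range(10)))
    assert len({id(c) for c in clients}) == 1   # all ten tasks share one client
    return reg.build_calls

build_calls = asyncio.run(main())
print(build_calls)  # 1
```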
**Integration:**
- An HTTP stub intercepting requests to the Deribit URLs:
  - a request with `Bearer TESTNET_TOKEN` hits `DERIBIT_URL_TESTNET`
  - a request with `Bearer MAINNET_TOKEN` hits `DERIBIT_URL_LIVE`
- Same pattern for bybit, hyperliquid, alpaca.
- Macro and sentiment: requests with either the testnet or mainnet bearer
  both work; a request with no bearer → 401.
- Swagger UI: `GET /apidocs` returns HTML with the BearerAuth securityScheme
  present.
- OpenAPI: `GET /openapi.json` returns a valid OpenAPI 3.1 schema with a tag
  per exchange.
**Smoke (post-deploy):**
```bash
curl -s http://localhost:9000/health
curl -s http://localhost:9000/apidocs | grep -q "Cerbero MCP"
curl -s -H "Authorization: Bearer $TESTNET_TOKEN" \
http://localhost:9000/mcp-deribit/tools/get_ticker \
-d '{"instrument": "BTC-PERPETUAL"}'
```
## Migration from V1
For anyone running V1 in production:
1. **Backup**: `cp -r secrets/ secrets.v1.bak/`.
2. **Generate V2 tokens**:
   ```bash
   python -c 'import secrets; print("TESTNET_TOKEN=" + secrets.token_urlsafe(32))'
   python -c 'import secrets; print("MAINNET_TOKEN=" + secrets.token_urlsafe(32))'
   ```
3. **Fill in `.env`**: map V1 → V2 (the fields in the V1 JSON files have
   slightly different names; see the table in the V2 README, "Migration" section).
4. **Update the bot clients**:
   - URLs unchanged (`/mcp-{exchange}/tools/{tool}` stays the same)
   - The bearer changes: where you previously used `core.token` or
     `observer.token`, you now use `TESTNET_TOKEN` or `MAINNET_TOKEN`
     depending on the environment you want for that request.
5. **Stop V1**: `docker compose down` (old compose).
6. **Start V2**: `docker compose up -d` (new minimal compose), then
   verify with `curl /health` and `curl /apidocs`.
7. **Rollback** is available by pulling the V1 image tags
   (`cerbero-mcp-*:1.x`) if needed, but `.env` and `secrets/` are
   incompatible between V1 and V2 — keep separate backups.
## Documentation cleanup
| File | V1 | V2 action |
|---|---|---|
| `README.md` | describes 6 services + Caddy + 8 secrets + build-push of 8 images | Rewritten from scratch |
| `DEPLOYMENT.md` | 7-image runbook, Caddy gateway, Caddy allowlist, no-clone deploy | **Removed**. Useful content moved into `README.md` |
| `docs/superpowers/specs/2026-04-27-cot-report-design.md` | past feature spec | Kept (historical) |
| `docs/superpowers/plans/2026-04-27-cot-report.md` | past feature plan | Kept (historical) |
| `docs/superpowers/specs/2026-04-30-V2.0.0-...-design.md` | (new) | Created |
| `docs/superpowers/plans/2026-04-30-V2.0.0-...-plan.md` | (new) | Created after writing-plans |
Rationale for removing `DEPLOYMENT.md`: 16 KB of docs on 8-image
build-push, the Caddy gateway, secret mounts, the Caddy IP allowlist,
no-clone deploy — all obsolete in V2. The ~30 still-valid lines (smoke
tests, Watchtower rollback) are folded into the "Deploy" section of the new
`README.md`.
## Out of scope
To avoid scope creep within the same sprint, the following stay out:
- **HSTS / custom security headers**: handled by Caddy in V1. In V2 they are
  delegated to Traefik on the VPS (managed externally). No application-level
  implementation in V2.
- **Application-level rate limiting** (`slowapi` or similar): delegated to Traefik.
- **Prometheus metrics**: deferred to a later iteration.
- **Automatic token rotation**: out of scope for V2. Manual rotation by
  editing `.env` + restarting the container.
- **Audit trail telemetry**: `mcp_common.audit` is preserved, but further
  evolution (log structure, sinks) is deferred.
- **Multiple accounts per exchange**: V2 supports one account per exchange.
  Multiple accounts = future iteration.
## Risks and mitigations
| Risk | Likelihood | Mitigation |
|---|---|---|
| Auth bug → testnet token ends up on the mainnet URL | Low, high impact | Integration tests with an HTTP stub per exchange; the server-side leverage cap remains a second line of defense |
| `client_registry` cache fails to release connections → fd leak | Medium | FastAPI lifespan calls `aclose()` on shutdown; healthcheck monitors the process |
| Boot failure from a missing env var in `.env` | High in dev | Pydantic Settings fails fast with a clear message; `.env.example` is versioned |
| V1→V2 migration drifts out of sync with the bot | Medium | API paths unchanged; document the V1→V2 mapping in the README |
| Concurrency: first mainnet and testnet requests for the same exchange in parallel | Medium | Per-key lock in the registry prevents races on `_build` |
| Unexpected image size growth | Low | Multi-stage slim build, python:3.11-slim base, no unnecessary dependencies |
## Success criteria
- ✅ `docker compose up -d` starts one container answering on `/health`
  within 5 seconds
- ✅ `curl /apidocs` renders a navigable Swagger UI
- ✅ The V1 bot works with nothing but a bearer change (identical paths)
- ✅ The same bot can alternate testnet and mainnet on consecutive requests
  by changing only the bearer
- ✅ All migrated V1 tests pass in V2
- ✅ Image build time reduced from ~12 min to ~3 min
- ✅ `services/`, `gateway/`, `secrets/`, `docker/`, `DEPLOYMENT.md`, and the
  3 docker-compose overlays removed from the repo
# IBKR Integration — Design Spec
**Date:** 2026-05-03
**Branch:** V2.0.0
**Status:** Approved (pending implementation plan)
**Approach chosen:** A2 — Client Portal Web API with OAuth 1.0a Self-Service (fully unattended)
## 1. Goals & Non-Goals
### Goals
Add `ibkr` as a supported exchange in `cerbero-mcp`, reusing the established pattern (Alpaca/Deribit) for:
- account / positions / activities (read)
- simple orders: market, limit, stop, stop-limit (read + write)
- complex orders: bracket (entry + SL + TP with OCA), OCO (N legs, OCA type=1), OTO (parent → child, sequential)
- market data: REST snapshot + real-time tick/depth via WebSocket (snapshot-on-demand)
- options chain via OCC symbol
- semi-automatic key rotation via admin endpoint with auto-rollback
- testnet (paper account) / mainnet (live account) routing via bearer token, like the other exchanges
### Non-Goals (V1)
- Server-Sent Events / streaming HTTP responses (snapshot-on-demand is enough)
- Dynamic multi-account (a single `account_id` per env, configured in settings)
- Trailing stop, if-touched, advanced conditional orders (only fixed bracket, OCO, OTO)
- Fully automatic rotation of the consumer registration (the IBKR portal step cannot be automated)
- WebSocket streaming exposed directly to the bot (stays internal to the server, exposed as REST polling)
- TWS API socket protocol via `ib_insync` (rejected: requires a desktop gateway with Xvfb, fragile)
- Flex Web Service (rejected: read-only over historical reports, out of scope)
### Success criteria
1. `POST /mcp-ibkr/tools/get_account` with the testnet bearer returns the real paper-account balance
2. `POST /mcp-ibkr/tools/place_order` (1 share AAPL market) → order filled in paper, audit log present
3. `POST /mcp-ibkr/tools/place_bracket_order` → 3 orders linked via OCA group; the first fill cancels the others
4. `POST /mcp-ibkr/tools/get_depth` → 5 depth levels with data < 1s of latency
5. `POST /admin/ibkr/rotate-keys/{start,confirm}` → atomic swap, automatic rollback on validation failure
6. Container restart → first `get_account` < 5s (OAuth flow + first call), zero human input
7. Test suite green: 90% coverage on `oauth.py`/`client.py`/`ws.py`/`key_rotation.py`, 85% on `tools.py`/`orders_complex.py`
8. `/health/ready` reports IBKR healthy for both envs
## 2. Architecture
```
┌──────┐ Bearer testnet|mainnet ┌──────────────────┐
│ Bot │ ─────────────────────────▶│ cerbero-mcp │
└──────┘ │ (single FastAPI)│
│ │
│ IBKRClient │ ──HTTPS OAuth1a──▶ api.ibkr.com/v1/api
│ IBKRWebSocket │ ──WSS LST───────▶ api.ibkr.com/v1/api/ws
└──────────────────┘
```
**Key decisions:**
- **OAuth 1.0a Self-Service:** RSA-SHA256 signing + DH key exchange to mint the live session token (24h TTL) autonomously. One-time manual setup on the IBKR portal; runtime fully unattended.
- **Single container:** no Java sidecar (considered for the non-OAuth CP Gateway, discarded because it requires an interactive login).
- **Internal snapshot-on-demand WebSocket:** one `IBKRWebSocket` singleton per env keeps subs alive in the background; the REST tools `get_tick`/`get_depth` return the latest cached snapshot. The bot stays polling-based; no streaming toward the bot.
- **Paper vs live = separate accounts, same host:** IBKR uses `api.ibkr.com` for both; tests run against the paper account (with its own OAuth bundle), live against the live account. Two credential sets in settings (Deribit `_TESTNET` / `_LIVE` pattern).
- **Conid cache:** IBKR identifies instruments by a numeric `conid`; symbol→conid lookups are cached (LRU 1024, TTL 1h) to avoid repeated round-trips.
- **Two-level keep-alive:** the brokerage session dies after 5min idle (needs `POST /tickle`); the live session token dies after 24h (needs a DH re-mint). The client manages the two separately.
## 3. Components (file layout)
```
src/cerbero_mcp/exchanges/ibkr/
├── __init__.py
├── client.py # IBKRClient: REST httpx + tickle + conid cache
├── oauth.py # OAuth1aSigner: RSA sig + DH live session token mint/refresh
├── ws.py # IBKRWebSocket: persistent WSS, smd/sbd subs, snapshot cache
├── orders_complex.py # bracket/OCO/OTO payload builders
├── key_rotation.py # KeyRotationManager: stage/confirm/abort/rollback
├── tools.py # Pydantic schemas + async tool functions (read + write + complex + streaming)
└── leverage_cap.py # get_max_leverage(creds) — 1:1 copy from alpaca
src/cerbero_mcp/routers/ibkr.py # POST /mcp-ibkr/tools/*
scripts/ibkr_oauth_setup.py # one-shot: generate keypair, portal walkthrough, --rotate flag
tests/unit/exchanges/ibkr/
├── __init__.py
├── test_oauth.py
├── test_client.py
├── test_ws.py
├── test_orders_complex.py
├── test_key_rotation.py
└── test_tools.py
```
**Existing files modified:**
- `src/cerbero_mcp/settings.py` — add `IBKRSettings`
- `src/cerbero_mcp/exchanges/__init__.py` — `if exchange == "ibkr"` branch in `build_client`
- `src/cerbero_mcp/__main__.py` — `app.include_router(ibkr.make_router())`
- `src/cerbero_mcp/admin.py` — `/admin/ibkr/rotate-*` + `/admin/ibkr/health` endpoints
- `.env.example` — `# ─── EXCHANGE — IBKR ───` section
- `pyproject.toml` — add `cryptography>=43` (RSA + DH; may already be a transitive dependency)
- `docker-compose.yml` — bind mount `./secrets:/secrets:ro`
- `README.md` — "IBKR Setup" section
**Module boundaries:**
- `oauth.py` exposes `OAuth1aSigner.get_live_session_token() -> str` (cached). It knows nothing about application endpoints; it only knows the `/oauth/live_session_token` endpoint.
- `client.py` receives an `OAuth1aSigner` as a dependency and does not build keys. It knows nothing about WebSocket.
- `ws.py` receives an `OAuth1aSigner` and manages WSS independently. It exposes the async methods `subscribe_tick(conid)`, `subscribe_depth(conid, rows)`, `get_tick_snapshot(conid)`, `get_depth_snapshot(conid)`. It knows nothing about REST.
- `orders_complex.py` is a set of **pure** functions producing IBKR-ready JSON payloads. No HTTP. Deterministic tests.
- `key_rotation.py` operates on the filesystem + an `OAuth1aSigner` factory; it does not touch FastAPI routing directly.
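To illustrate the "pure payload builder" boundary, here is a sketch of what a bracket builder in `orders_complex.py` could return. Field names such as `cOID`, `parentId`, and `ocaGroup` are used loosely; treat the exact wire format as an assumption, not the verified IBKR schema:

```python
import uuid

def build_bracket(conid: int, side: str, qty: float, entry_price: float,
                  stop_loss: float, take_profit: float, tif: str = "GTC") -> list[dict]:
    """Pure function: three linked orders, children sharing one OCA group. No HTTP."""
    oca = f"oca-{uuid.uuid4().hex[:8]}"
    parent_id = f"ord-{uuid.uuid4().hex[:8]}"
    exit_side = "SELL" if side.upper() == "BUY" else "BUY"
    parent = {"conid": conid, "side": side.upper(), "quantity": qty,
              "orderType": "LMT", "price": entry_price, "tif": tif,
              "cOID": parent_id}
    stop = {"conid": conid, "side": exit_side, "quantity": qty,
            "orderType": "STP", "price": stop_loss, "tif": tif,
            "parentId": parent_id, "ocaGroup": oca}
    target = {"conid": conid, "side": exit_side, "quantity": qty,
              "orderType": "LMT", "price": take_profit, "tif": tif,
              "parentId": parent_id, "ocaGroup": oca}
    return [parent, stop, target]

orders = build_bracket(conid=265598, side="buy", qty=1,
                       entry_price=190.0, stop_loss=185.0, take_profit=200.0)
print([o["orderType"] for o in orders])  # ['LMT', 'STP', 'LMT']
```

Because the function is pure, the `test_orders_complex.py` assertions (same OCA group, parent/child relation) run without any mock transport.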
## 4. Settings & OAuth flow
### Pydantic settings
```python
class IBKRSettings(_Sub):
    model_config = SettingsConfigDict(
        env_file=".env", env_file_encoding="utf-8",
        env_prefix="IBKR_", extra="ignore",
    )
    # Single fallback pair (legacy / dev)
    consumer_key: str | None = None
    access_token: str | None = None
    access_token_secret: SecretStr | None = None
    signature_key_path: str | None = None
    encryption_key_path: str | None = None
    dh_prime: SecretStr | None = None
    # Env-specific pairs (take precedence when set)
    consumer_key_testnet: str | None = None
    access_token_testnet: str | None = None
    access_token_secret_testnet: SecretStr | None = None
    signature_key_path_testnet: str | None = None
    encryption_key_path_testnet: str | None = None
    account_id_testnet: str | None = None
    consumer_key_live: str | None = None
    access_token_live: str | None = None
    access_token_secret_live: SecretStr | None = None
    signature_key_path_live: str | None = None
    encryption_key_path_live: str | None = None
    account_id_live: str | None = None
    # URLs (paper and live share the same host)
    url_live: str = "https://api.ibkr.com/v1/api"
    url_testnet: str = "https://api.ibkr.com/v1/api"
    ws_url_live: str = "wss://api.ibkr.com/v1/api/ws"
    ws_url_testnet: str = "wss://api.ibkr.com/v1/api/ws"
    # Limits
    max_leverage: int = 4  # Reg-T default
    ws_max_subscriptions: int = 80
    ws_idle_timeout_s: int = 300

    def credentials(self, env: str) -> dict:
        """Return the full OAuth dict for the env. Raises ValueError on missing fields.

        For each field: prefer `<field>_<env>`; fall back to `<field>` (legacy);
        raise ValueError if both are absent for the required fields
        (consumer_key, access_token, access_token_secret, signature_key_path,
        encryption_key_path, account_id, dh_prime).
        Pattern identical to DeribitSettings.credentials().
        """
```
`dh_prime` is a hex string issued by IBKR at setup time, **constant** per consumer (shared by paper/live), not duplicated per env.
### `.env.example`
```env
# ─── EXCHANGE — IBKR ──────────────────────────────────────
# OAuth setup: see README "IBKR Setup" + scripts/ibkr_oauth_setup.py.
# The RSA keys (PEM) do NOT go in .env: mount them as files and reference the path.
IBKR_CONSUMER_KEY=
IBKR_ACCESS_TOKEN=
IBKR_ACCESS_TOKEN_SECRET=
IBKR_SIGNATURE_KEY_PATH=/secrets/ibkr_signature.pem
IBKR_ENCRYPTION_KEY_PATH=/secrets/ibkr_encryption.pem
IBKR_DH_PRIME=
# Env-specific pairs (take precedence):
# IBKR_CONSUMER_KEY_TESTNET=
# IBKR_ACCESS_TOKEN_TESTNET=
# IBKR_ACCESS_TOKEN_SECRET_TESTNET=
# IBKR_SIGNATURE_KEY_PATH_TESTNET=/secrets/ibkr_signature_paper.pem
# IBKR_ENCRYPTION_KEY_PATH_TESTNET=/secrets/ibkr_encryption_paper.pem
# IBKR_ACCOUNT_ID_TESTNET=DU1234567
# IBKR_CONSUMER_KEY_LIVE=
# IBKR_ACCESS_TOKEN_LIVE=
# IBKR_ACCESS_TOKEN_SECRET_LIVE=
# IBKR_SIGNATURE_KEY_PATH_LIVE=/secrets/ibkr_signature_live.pem
# IBKR_ENCRYPTION_KEY_PATH_LIVE=/secrets/ibkr_encryption_live.pem
# IBKR_ACCOUNT_ID_LIVE=U1234567
IBKR_URL_LIVE=https://api.ibkr.com/v1/api
IBKR_URL_TESTNET=https://api.ibkr.com/v1/api
IBKR_WS_URL_LIVE=wss://api.ibkr.com/v1/api/ws
IBKR_WS_URL_TESTNET=wss://api.ibkr.com/v1/api/ws
IBKR_MAX_LEVERAGE=4
IBKR_WS_MAX_SUBSCRIPTIONS=80
IBKR_WS_IDLE_TIMEOUT_S=300
```
### One-shot OAuth setup (manual, once per account)
1. Log in to the portal at `https://www.interactivebrokers.com` → "User Settings" → "Self-Service OAuth"
2. `python scripts/ibkr_oauth_setup.py --env testnet` → generates 2 RSA keypairs + prints their SHA-256 fingerprints
3. On the portal: register the 2 public keys, obtain the `consumer_key`
4. `python scripts/ibkr_oauth_setup.py --consumer-key <K> --request-token` → obtains a request token + authorization URL
5. Open the URL in a browser, authorize, copy the `verifier_code`
6. `python scripts/ibkr_oauth_setup.py --verifier <V>` → exchanges it for the `access_token` (long-lived, ~5 years) + `access_token_secret`
7. Copy the 3 values into `.env`. Repeat for the live env.
### Runtime flow (fully unattended)
- Container starts → `IBKRClient` lazily instantiated on the first request
- `OAuth1aSigner` loads the RSA private keys from disk (paths from settings)
- First private request → `_get_live_session_token()`:
  1. Generate nonce + timestamp
  2. Sign `POST /oauth/live_session_token` with RSA-SHA256
  3. Diffie-Hellman key exchange with `dh_prime`
  4. Receive the `lst` (live session token, valid 24h)
  5. Cache it in memory with expiry `now + 86000s`
- Subsequent requests: HMAC-SHA256 over the params with the `lst` as key
- Expired `lst` → automatic re-mint, retry once. No human input at runtime, ever.
- Every private request: if the last call was > 4min ago, call `POST /tickle` (brokerage session keep-alive) first.
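The caching step of the flow above can be sketched as follows; the signing and DH exchange are stubbed out behind `mint_fn`, so only the expiry/re-mint logic is shown:

```python
import time

class LSTCache:
    """Caches the live session token for just under 24h, re-minting on expiry."""
    TTL_S = 86_000   # slightly under the 24h (86_400s) IBKR validity

    def __init__(self, mint_fn):
        self._mint_fn = mint_fn       # performs the signed POST + DH exchange
        self._lst: str | None = None
        self._expires_at = 0.0

    def get(self) -> str:
        now = time.monotonic()
        if self._lst is None or now >= self._expires_at:
            self._lst = self._mint_fn()            # automatic re-mint
            self._expires_at = now + self.TTL_S
        return self._lst

mints = 0
def fake_mint() -> str:
    global mints
    mints += 1
    return f"lst-{mints}"

cache = LSTCache(fake_mint)
first, second = cache.get(), cache.get()
print(first == second, mints)  # True 1
```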
### Errors → error envelope
| Trigger | Code | retryable |
|---|---|---|
| RSA key file missing | `IBKR_KEY_NOT_FOUND` | false |
| RSA key file unreadable | `IBKR_KEY_INVALID` | false |
| Consumer revoked on the portal | `IBKR_CONSUMER_REVOKED` | false |
| Access token expired (~5 years) | `IBKR_ACCESS_TOKEN_EXPIRED` | false |
| LST mint failed (network) | `IBKR_SESSION_MINT_FAILED` | true |
| `401` on a signed request | `IBKR_AUTH_FAILED` | true (forces LST refresh + retry once) |
| Rate limit `429` | `IBKR_RATE_LIMITED` | true |
| Sunday maintenance window | `IBKR_MAINTENANCE` | true |
| Configured account not in `/iserver/accounts` | `IBKR_ACCOUNT_NOT_FOUND` | false |
| Market data subscription missing | `IBKR_NO_MARKET_DATA_SUBSCRIPTION` | false |
| Critical order warning (margin/suitability) | `IBKR_ORDER_REJECTED_WARNING` | false |
| WS sub limit exceeded | `IBKR_WS_SUB_LIMIT` | false |
| `get_tick` times out after 3s on an empty cache | `IBKR_TICK_TIMEOUT` | true |
| OTO second POST failed after the trigger was placed | `IBKR_OTO_PARTIAL_FAILURE` | false |
| Rotation validation failed | `IBKR_ROTATION_VALIDATION_FAILED` | false (automatic rollback) |
## 5. Tool API surface
Mirrors the Alpaca pattern wherever the abstraction holds: same tool name → the bot reuses cross-exchange logic.
### Reads (12 tools)
| Tool | IBKR endpoint |
|---|---|
| `environment_info` | local (env, paper, base_url, max_leverage) |
| `get_account` | `GET /portfolio/{accountId}/summary` |
| `get_positions` | `GET /portfolio/{accountId}/positions/0` (loop if >30) |
| `get_activities` | `GET /iserver/account/trades?days=N` (default 7, capped at 90) |
| `get_assets` | `GET /trsrv/secdef/search?symbol=...` (symbol required) |
| `get_ticker` | `GET /iserver/marketdata/snapshot?conids=X&fields=31,84,86,7295,7296` |
| `get_bars` | `GET /iserver/marketdata/history?conid=X&period=...&bar=...` |
| `get_snapshot` | `GET /iserver/marketdata/snapshot` (full fields) |
| `get_option_chain` | `GET /iserver/secdef/strikes` + `/info` |
| `get_open_orders` | `GET /iserver/account/orders?filters=Submitted,PreSubmitted` |
| `get_clock` | local (now + static market hours) |
| `search_contracts` | `GET /trsrv/secdef/search` (IBKR-specific: symbol+secType → conid) |
### Streaming (4 tools, snapshot-on-demand)
| Tool | Behavior |
|---|---|
| `get_tick` | Last cached tick (last/bid/ask/size/timestamp). If not subscribed: lazy sub + wait for the first tick (3s timeout) |
| `get_depth` | Order book depth (default 5 levels, max 10). IBKR `sbd+{conid}+{exchange}+{rows}` |
| `subscribe_tick` | Keeps the sub alive even without polling. Auto-unsub after `ws_idle_timeout_s` |
| `unsubscribe` | Forces the sub closed to free a slot |
### Simple writes (6 tools)
| Tool | Audit field |
|---|---|
| `place_order` | `symbol` |
| `amend_order` | `order_id` |
| `cancel_order` | `order_id` |
| `cancel_all_orders` | — (loop) |
| `close_position` | `symbol` |
| `close_all_positions` | — (loop) |
### Complex writes (3 tools)
| Tool | Schema essenziale | Endpoint |
|---|---|---|
| `place_bracket_order` | `symbol, side, qty, entry_price, stop_loss, take_profit, tif="gtc"` | `POST /iserver/account/{id}/orders` array `[parent, sl_child, tp_child]` con OCA group auto |
| `place_oco_order` | `legs: list[OrderLeg]` (2-N orders) | Stessa POST con `oca_group` + `oca_type=1` su ogni leg |
| `place_oto_order` | `trigger: OrderLeg, child: OrderLeg` | POST sequenziali: trigger prima, poi child con `parent_id=<trigger.order_id>`. **Non atomico:** se la seconda POST fallisce dopo che la prima è andata a buon fine, il tool cancella il trigger via `cancel_order` (best-effort) e ritorna `IBKR_OTO_PARTIAL_FAILURE` con `details.trigger_order_id` per audit |
**Audit:** complex orders tracciano `target_field=symbol` + `details.legs_count` + `details.oca_group` (se applicabile).
**Leverage cap su complex:** applicato sul **net notional** della struttura. Bracket = entry only (i child non aprono nuova esposizione). OCO = max(leg.notional). OTO = trigger + child se entrambi long, altrimenti max.
### IBKR specifics (internal to the client)
1. **`conid` resolution:** `place_order(symbol="AAPL")` → lookup `GET /trsrv/secdef/search?symbol=AAPL&secType=STK` → first match → LRU cache. For options: parse the OCC symbol (`AAPL 240119C00190000`) → `/iserver/secdef/info`.
2. **`accountId` validation:** at boot, `GET /iserver/accounts` → verify `account_id_<env>` is present. Otherwise `IBKR_ACCOUNT_NOT_FOUND`.
3. **Order confirmation flow:** IBKR returns a warnings array and requires a second POST with `confirmed: true`. Auto-confirm by default (max 3 cycles), but critical warnings (margin, suitability, hard rejects) are filtered out → error envelope.
4. **`tickle` keep-alive:** automatic when the last request was > 4min ago. Independent of the LST.
5. **Empty market data → error envelope:** an empty snapshot means a missing subscription; we return `IBKR_NO_MARKET_DATA_SUBSCRIPTION` instead of a silently empty dict.
6. **Leverage cap:** IBKR does not accept a per-order `leverage`. We check `notional / equity` ≤ `max_leverage` pre-submit by calling `get_account` for equity. Async pattern, but cached for 30s.
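The OCC parse in step 1 is deterministic given the standard OCC layout (root, YYMMDD expiry, C/P flag, strike × 1000 in 8 digits); here is a sketch of just the symbol decoding (the `/iserver/secdef/info` lookup that follows is out of scope):

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class OccOption:
    root: str
    expiry: date
    right: str      # "C" or "P"
    strike: float

def parse_occ(symbol: str) -> OccOption:
    root, tail = symbol.split()          # e.g. "AAPL", "240119C00190000"
    yy, mm, dd = int(tail[0:2]), int(tail[2:4]), int(tail[4:6])
    right = tail[6]
    if right not in ("C", "P"):
        raise ValueError(f"bad right flag: {right!r}")
    strike = int(tail[7:15]) / 1000      # strike encoded ×1000 over 8 digits
    return OccOption(root, date(2000 + yy, mm, dd), right, strike)

opt = parse_occ("AAPL 240119C00190000")
print(opt.root, opt.expiry.isoformat(), opt.right, opt.strike)
# AAPL 2024-01-19 C 190.0
```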
### `PlaceOrderReq` schema
```python
class PlaceOrderReq(BaseModel):
    symbol: str                     # "AAPL", or OCC format for options
    side: str                       # "buy" | "sell"
    qty: float
    order_type: str = "market"      # "market" | "limit" | "stop" | "stop_limit"
    limit_price: float | None = None
    stop_price: float | None = None
    tif: str = "day"                # "day" | "gtc" | "ioc"
    asset_class: str = "stocks"     # "stocks" | "options" | "futures" | "forex"
    sec_type: str | None = None     # IBKR override (STK/OPT/FUT/CASH); inferred from asset_class
    exchange: str = "SMART"         # IBKR routing
    outside_rth: bool = False
```
## 6. WebSocket layer
### Pattern
One `IBKRWebSocket` singleton per env, lazily started on the first sub. A single shared WSS connection serves all subs.
### Lifecycle
1. **Boot:** no connection until a streaming tool is called.
2. **First sub call:** opens the WSS, authenticates with the current LST (`Cookie: api=<lst>` header), sends the subscribe message (`smd+{conid}+{fields}` or `sbd+{conid}+{exchange}+{rows}`).
3. **Message dispatch:** each `smd-...` message updates `dict[conid, TickSnapshot]`; each `sbd-...` updates `dict[conid, DepthSnapshot]`.
4. **Heartbeat:** ping every 30s; no pong within 60s → force a reconnect.
5. **Reconnect:** exponential backoff 1s, 2s, 4s, capped at 30s. On reconnect: automatic re-subscribe to all active conids.
6. **Idle unsub:** track `last_polled_at[conid]`; if > `ws_idle_timeout_s` (default 300s) → send unsub, free the slot. A sub forced via `subscribe_tick` never expires until an explicit `unsubscribe`.
7. **Sub limit:** if active subs ≥ `ws_max_subscriptions` (default 80) → error envelope `IBKR_WS_SUB_LIMIT` on the new sub.
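The reconnect schedule in step 5 (1s, 2s, 4s, capped at 30s) is capped exponential backoff, which can be expressed as a tiny generator:

```python
def reconnect_delays(max_delay_s: float = 30.0):
    """Yields 1, 2, 4, 8, 16, 30, 30, ... seconds between reconnect attempts."""
    delay = 1.0
    while True:
        yield delay
        delay = min(delay * 2, max_delay_s)

gen = reconnect_delays()
delays = [next(gen) for _ in range(7)]
print(delays)  # [1.0, 2.0, 4.0, 8.0, 16.0, 30.0, 30.0]
```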
### Cache invariant
- A snapshot **always** represents the latest update received. No historical buffering.
- On disconnect: a conid's cache is invalidated if reconnect does not succeed within 5s.
- `get_tick(conid)` on an empty cache: waits up to 3s for the first tick, then `IBKR_TICK_TIMEOUT`.
### Health probe
`/health/ready` queries `IBKRWebSocket.connected` for each env. State is `degraded` if the WS is disconnected but the REST client is healthy.
### Feature flag
`IBKR_WS_ENABLED=false` (env var) disables the WS layer at runtime; the streaming tools fall back to HTTP `/marketdata/snapshot` (single shot, no depth). A mitigation for production emergencies.
## 7. Key rotation
### Admin endpoints
```
POST /admin/ibkr/rotate-keys/start?env=testnet
  → generates signature_key.pem.new + encryption_key.pem.new (RSA 2048)
  → returns {fingerprints: {sig: "SHA256:...", enc: "SHA256:..."},
             expires_at: <now+24h>}
  → the user pastes the fingerprints into the IBKR portal, obtains new_consumer_key
POST /admin/ibkr/rotate-keys/confirm?env=testnet
  body: {new_consumer_key, new_access_token, new_access_token_secret}
  → atomic swap: .new → primary, primary → secrets/.archive/<timestamp>/
  → probe: GET /iserver/auth/status with the new credentials
  → ok: returns {rotated_at, old_archived_at}
  → fail: automatic rollback (inverse swap), returns 500 IBKR_ROTATION_VALIDATION_FAILED
POST /admin/ibkr/rotate-keys/abort?env=testnet
  → deletes the .new files; no-op if start was never run or confirm already ran
```
### Authorization
Endpoints are protected by the existing `auth.py` middleware and additionally require `X-Bot-Tag: admin` (a header already supported by the admin router).
### Atomic swap
Implemented as:
1. Filesystem-level lock via `fcntl.flock` on `secrets/.lock`
2. Rename `signature.pem` → `secrets/.archive/<ts>/signature.pem.old`
3. Rename `signature.pem.new` → `signature.pem`
4. Same for `encryption.pem`
5. Update `IBKRSettings` in memory via `app.state.settings.ibkr.consumer_key_<env> = new_consumer_key` (live settings, no restart)
6. Probe `GET /iserver/auth/status`
7. On failure: rollback (inverse swap), restore the previous settings, raise
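Steps 1-4 can be sketched with stdlib primitives; the paths, lock file, and function name are illustrative, and `os.replace` supplies the atomic rename on POSIX:

```python
import fcntl
import os
import tempfile
import time
from pathlib import Path

def swap_keys(secrets_dir: str, names=("signature.pem", "encryption.pem")) -> Path:
    """Archive the current PEMs and promote the staged .new files, under a file lock."""
    root = Path(secrets_dir)
    archive = root / ".archive" / time.strftime("%Y%m%dT%H%M%S")
    archive.mkdir(parents=True, exist_ok=True)
    with open(root / ".lock", "w") as lockf:
        fcntl.flock(lockf, fcntl.LOCK_EX)     # exclusive: blocks concurrent rotations
        try:
            for name in names:
                cur, new = root / name, root / f"{name}.new"
                os.replace(cur, archive / f"{name}.old")   # atomic rename
                os.replace(new, cur)
        finally:
            fcntl.flock(lockf, fcntl.LOCK_UN)
    return archive

# Demo on a throwaway directory:
d = Path(tempfile.mkdtemp())
for n in ("signature.pem", "encryption.pem"):
    (d / n).write_text("OLD")
    (d / f"{n}.new").write_text("NEW")
arch = swap_keys(str(d))
print((d / "signature.pem").read_text(), (arch / "signature.pem.old").read_text())
# NEW OLD
```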
### Scheduled health check
Asyncio task created at lifespan startup:
- Every 6h: `GET /iserver/auth/status` on both envs
- If `competing=true` or `authenticated=false` for >2 consecutive cycles: log a warning and expose `degraded` state on `/admin/ibkr/health`
- Auto-trigger `/tickle` on degraded state before failing
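The loop above might look like this; `client.auth_status()`, `client.tickle()`, and the `state` dict shape are assumptions, not the real interfaces:

```python
import asyncio
import logging

log = logging.getLogger("ibkr.health")


async def auth_health_loop(client, state: dict, interval: float = 6 * 3600) -> None:
    """Periodic auth probe per the schedule above: flag degraded after >2
    consecutive bad cycles and try a /tickle before giving up."""
    degraded_cycles = 0
    while True:
        status = await client.auth_status()  # GET /iserver/auth/status
        healthy = status.get("authenticated") and not status.get("competing")
        degraded_cycles = 0 if healthy else degraded_cycles + 1
        if degraded_cycles > 2:
            log.warning("IBKR session degraded for %d cycles", degraded_cycles)
            state["ibkr_health"] = "degraded"
            await client.tickle()  # try to revive the session before failing
        else:
            state["ibkr_health"] = "ok"
        await asyncio.sleep(interval)
```

Created with `asyncio.create_task(...)` in the lifespan startup and cancelled on shutdown.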
### Encryption-at-rest
Key files on disk with `0600` permissions. Docker bind mount `:ro` on `/secrets`. Rotation preserves permissions; files stay readable only by the container process UID.
## 8. Testing
### Unit tests
| File | Critical coverage |
|---|---|
| `test_oauth.py` | Deterministic RSA-SHA256 signature (known vector from the IBKR docs); DH key exchange on a test prime; LST mint with httpx mock; refresh before expiry; error path for missing/unreadable key; 401 on revoked consumer |
| `test_client.py` | Authorization header construction; conid lookup + cache hit/miss; tickle keep-alive timing (last_request_at < 4min skip, > 4min trigger); place_order warning auto-confirmation flow + critical warning → error envelope; leverage cap pre-flight; account validation at boot; error mapping (401/429/maintenance/no-mkt-data) |
| `test_ws.py` | Mock `websockets.connect`; subscribe ack flow; message dispatch into the cache; reconnect after disconnect; idle timeout unsub; sub limit (>80 → IBKR_WS_SUB_LIMIT); feature flag disabled → HTTP fallback |
| `test_orders_complex.py` | Bracket: payload shape (3 orders, same OCA group, parent/child relation). OCO: N legs, OCA type=1 everywhere. OTO: two sequential POSTs, the second with the correct parent_id. Leverage cap on net notional |
| `test_key_rotation.py` | start generates a valid keypair; confirm swaps atomically; validation probe success/fail; automatic rollback on fail; abort cleans up the .new files; archived .old keeps its permissions |
| `test_tools.py` | Schema validation (qty required, side enum, OCC format for options); default values; leverage cap enforcement |
| `test_settings.py` (extend) | `IBKRSettings.credentials("testnet")` prefers testnet → falls back to base → ValueError if both are missing. Test isolation with recursive `monkeypatch.delenv` to avoid `.env` pollution (same pattern as Deribit) |
### Coverage target
- `oauth.py`, `client.py`, `ws.py`, `key_rotation.py`: 90%
- `orders_complex.py`, `tools.py`: 85%
- `routers/ibkr.py`, `admin.py` IBKR section: 75%
### Integration smoke (manual, post-deploy)
Not in CI (requires real credentials). Documented in the README:
```bash
curl https://cerbero-mcp.<dom>/mcp-ibkr/tools/get_account \
-H "Authorization: Bearer <TESTNET_TOKEN>" -X POST -d '{}'
# → paper account balance
curl .../mcp-ibkr/tools/place_order \
-H "Authorization: Bearer <TESTNET_TOKEN>" -X POST \
-d '{"symbol":"AAPL","side":"buy","qty":1,"order_type":"market"}'
# → order_id
curl .../mcp-ibkr/tools/place_bracket_order \
-d '{"symbol":"AAPL","side":"buy","qty":1,"entry_price":150,"stop_loss":145,"take_profit":160}'
# → 3 order_ids sharing the same oca_group
curl .../mcp-ibkr/tools/get_depth \
-d '{"symbol":"AAPL","rows":5}'
# → 5-level order book
```
### Verification gate (pre-merge)
- [ ] `uv run pytest tests/unit/exchanges/ibkr/ tests/unit/test_settings.py -v` green
- [ ] `uv run ruff check src/cerbero_mcp/exchanges/ibkr/ src/cerbero_mcp/routers/ibkr.py` with no warnings
- [ ] `uv run python -c "from cerbero_mcp.settings import Settings; Settings()"` with no validation error against the example `.env`
- [ ] `docker compose build && docker compose up -d` healthy in < 60s
- [ ] `curl /health/ready -H "Authorization: Bearer <TESTNET>"` reports `ibkr` as probed
- [ ] Full manual smoke (list above) against a real paper account
## 9. Deploy & ops
- **Branch:** `V2.0.0` (default deploy target, not merged into main)
- **Pipeline:** same pattern as last week's Deribit fix: commit + push → watchtower updates the container on the VPS in <2min
- **Traefik:** no changes (same Host rule)
- **Secrets:** RSA keys transferred manually to `/opt/docker/cerbero-mcp/secrets/` on the VPS, mode `0600`, owned by the container UID; bind mount `./secrets:/secrets:ro` added to `docker-compose.yml`
- **Rollback:** `git revert <commit>` of a single step leaves the other exchanges operational (commits are atomic by design)
## 10. Commit plan (8 atomic commits)
```
1. feat(V2): IBKR settings + OAuth signer scaffolding
   - settings.py: IBKRSettings with env-specific credentials
   - exchanges/ibkr/oauth.py: OAuth1aSigner + tests
   - .env.example: IBKR section
- pyproject.toml: cryptography>=43
2. feat(V2): IBKR client httpx + conid cache + tickle
- exchanges/ibkr/client.py: IBKRClient base
   - exchanges/ibkr/leverage_cap.py: copied from alpaca
- tests/unit/exchanges/ibkr/test_client.py
3. feat(V2): IBKR WebSocket layer + tick/depth snapshot cache
- exchanges/ibkr/ws.py: IBKRWebSocket singleton + reconnect
- tests/unit/exchanges/ibkr/test_ws.py
4. feat(V2): IBKR read tools (account/positions/marketdata/streaming)
- exchanges/ibkr/tools.py: schemas + read functions
- tests/unit/exchanges/ibkr/test_tools.py (read paths)
5. feat(V2): IBKR write tools simple (place/amend/cancel/close)
- exchanges/ibkr/tools.py: schemas + write functions
- tests/unit/exchanges/ibkr/test_tools.py (write paths + leverage cap)
6. feat(V2): IBKR complex orders (bracket/OCO/OTO)
- exchanges/ibkr/orders_complex.py
- exchanges/ibkr/tools.py: complex tool functions
- tests/unit/exchanges/ibkr/test_orders_complex.py
7. feat(V2): IBKR key rotation admin endpoints + scheduled health
- exchanges/ibkr/key_rotation.py: KeyRotationManager
- admin.py: rotate-keys/start|confirm|abort + ibkr/health
- tests/unit/exchanges/ibkr/test_key_rotation.py
8. feat(V2): IBKR router wiring + docker secrets + setup script + docs
- routers/ibkr.py
- exchanges/__init__.py: build_client branch ibkr
- __main__.py: include_router
- scripts/ibkr_oauth_setup.py
- docker-compose.yml: bind mount secrets
   - README.md: IBKR Setup section
```
Every commit leaves the repo green (tests passing + container buildable). `git revert` of any single commit does not break the other exchanges.
## 11. Risks & mitigations
| Risk | Likelihood | Mitigation |
|---|---|---|
| Unstable WebSocket reconnect in prod | Medium | Feature flag `IBKR_WS_ENABLED=false` + HTTP snapshot fallback |
| IBKR rate limit exceeded during a conid lookup burst | Low | 1h LRU cache + retry with backoff |
| Live session token mint fails on a network blip | Medium | 3x retry with exponential backoff; circuit breaker after 5 consecutive failures |
| Order auto-confirmation wrongly confirms a critical warning | Low | Explicit whitelist of auto-confirmable warnings (RTH, no-mkt-data); everything else → error envelope |
| Key rotation leaves the system in an inconsistent state | Low | Filesystem lock + atomic swap + auto-rollback on validation failure |
| Initial OAuth setup too complex for the ops team | Medium | Interactive `ibkr_oauth_setup.py` script + detailed README section + checklist |
| Leverage cap computed on stale equity | Low | 30s equity cache, forced refresh before submitting orders > 10% of equity |
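The stale-equity mitigation in the last row can be sketched as follows (`EquityCache` and `fetch` are illustrative stand-ins for the real account-summary call):

```python
import time


class EquityCache:
    """30s equity cache with a forced refresh before orders worth more than
    10% of the cached equity, so the leverage cap never uses very stale data."""

    def __init__(self, fetch, ttl: float = 30.0):
        self._fetch = fetch      # callable returning the current account equity
        self._ttl = ttl
        self._equity: float | None = None
        self._at = 0.0

    def equity(self, force: bool = False) -> float:
        stale = self._equity is None or time.monotonic() - self._at > self._ttl
        if force or stale:
            self._equity = self._fetch()
            self._at = time.monotonic()
        return self._equity

    def equity_for_order(self, notional: float) -> float:
        eq = self.equity()
        # Large order relative to cached equity: re-read before the cap check.
        return self.equity(force=notional > 0.10 * eq)
```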
## 12. Estimate
- Dev: **6-8 days** (was 3-4 in V0; complex orders + WS + rotation add ~3 days)
- Tests: included in the commits (TDD-friendly)
- Deploy + smoke: 0.5 days
- Documentation: 0.5 days
- **Total:** ~7-9 days of effective work
@@ -1,86 +0,0 @@
{
admin off
email {$ACME_EMAIL:adrianodalpastro@tielogic.com}
auto_https {$AUTO_HTTPS:on}
# Plugin mholt/caddy-ratelimit
order rate_limit before basicauth
# Trusted proxies: honor X-Forwarded-For when behind a reverse proxy
# (e.g. Traefik). Default = private ranges only.
servers {
trusted_proxies static {$TRUSTED_PROXIES:private_ranges}
}
}
{$LISTEN:cerbero-mcp.tielogic.xyz} {
log {
output stdout
format json
}
# ───── Security headers ─────
header {
Strict-Transport-Security "max-age=31536000; includeSubDomains; preload"
X-Content-Type-Options "nosniff"
X-Frame-Options "DENY"
Referrer-Policy "no-referrer"
-Server
}
# ───── IP allowlist on write endpoints ─────
# WRITE_ALLOWLIST: space-separated CIDRs (e.g. "1.2.3.4/32 5.6.7.0/24").
# Default 127.0.0.1/32 — fail-closed when not configured.
@writes_blocked {
path_regexp ^/mcp-[a-z]+/tools/(place_|cancel_|set_|close_|transfer_|amend_|switch_)
not remote_ip {$WRITE_ALLOWLIST:127.0.0.1/32 ::1/128 172.16.0.0/12}
}
respond @writes_blocked "forbidden: source ip not in allowlist" 403
# ───── Rate limit ─────
# Reads: 60 req/min/IP, writes: 10 req/min/IP (sliding window).
rate_limit {
zone reads {
match {
not path_regexp ^/mcp-[a-z]+/tools/(place_|cancel_|set_|close_|transfer_|amend_|switch_)
}
key {remote_ip}
events 60
window 1m
}
zone writes {
match {
path_regexp ^/mcp-[a-z]+/tools/(place_|cancel_|set_|close_|transfer_|amend_|switch_)
}
key {remote_ip}
events 10
window 1m
}
}
# ───── Reverse proxy ─────
handle_path /mcp-deribit/* {
reverse_proxy mcp-deribit:9011
}
handle_path /mcp-bybit/* {
reverse_proxy mcp-bybit:9019
}
handle_path /mcp-hyperliquid/* {
reverse_proxy mcp-hyperliquid:9012
}
handle_path /mcp-alpaca/* {
reverse_proxy mcp-alpaca:9020
}
handle_path /mcp-macro/* {
reverse_proxy mcp-macro:9013
}
handle_path /mcp-sentiment/* {
reverse_proxy mcp-sentiment:9014
}
# Static landing page
handle {
root * /srv
file_server
}
}
@@ -1,6 +0,0 @@
FROM caddy:2.8-builder-alpine AS builder
RUN xcaddy build \
--with github.com/mholt/caddy-ratelimit
FROM caddy:2.8-alpine
COPY --from=builder /usr/bin/caddy /usr/bin/caddy
@@ -1,97 +0,0 @@
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>Cerbero — MCP gateway</title>
<link rel="stylesheet" href="/style.css">
</head>
<body>
<header>
<h1>Cerbero</h1>
<p>Autonomous crypto trading system, MCP-only architecture.</p>
</header>
<main>
<table id="services">
<thead>
<tr>
<th>Status</th>
<th>Service</th>
<th>Int. port</th>
<th>Description</th>
<th>Link</th>
</tr>
</thead>
<tbody>
<tr data-path="/mcp-memory">
<td><span class="status" aria-label="unknown"></span></td>
<td>mcp-memory</td>
<td>9015</td>
<td>Store L1/L2, system prompt base + dyn</td>
<td><a href="/mcp-memory/health">health</a> · <a href="/mcp-memory/docs">docs</a></td>
</tr>
<tr data-path="/mcp-scheduler">
<td><span class="status" aria-label="unknown"></span></td>
<td>mcp-scheduler</td>
<td>9016</td>
<td>Recurring task + core agent runner</td>
<td><a href="/mcp-scheduler/health">health</a> · <a href="/mcp-scheduler/docs">docs</a></td>
</tr>
<tr data-path="/mcp-deribit">
<td><span class="status" aria-label="unknown"></span></td>
<td>mcp-deribit</td>
<td>9011</td>
<td>Options testnet order/market</td>
<td><a href="/mcp-deribit/health">health</a> · <a href="/mcp-deribit/docs">docs</a></td>
</tr>
<tr data-path="/mcp-hyperliquid">
<td><span class="status" aria-label="unknown"></span></td>
<td>mcp-hyperliquid</td>
<td>9012</td>
<td>Perp DEX testnet</td>
<td><a href="/mcp-hyperliquid/health">health</a> · <a href="/mcp-hyperliquid/docs">docs</a></td>
</tr>
<tr data-path="/mcp-macro">
<td><span class="status" aria-label="unknown"></span></td>
<td>mcp-macro</td>
<td>9013</td>
<td>FRED indicators + Finnhub calendar</td>
<td><a href="/mcp-macro/health">health</a> · <a href="/mcp-macro/docs">docs</a></td>
</tr>
<tr data-path="/mcp-sentiment">
<td><span class="status" aria-label="unknown"></span></td>
<td>mcp-sentiment</td>
<td>9014</td>
<td>CryptoPanic news feed</td>
<td><a href="/mcp-sentiment/health">health</a> · <a href="/mcp-sentiment/docs">docs</a></td>
</tr>
<tr data-path="/mcp-telegram">
<td><span class="status" aria-label="unknown"></span></td>
<td>mcp-telegram</td>
<td>9017</td>
<td>Bot commands + operator notifications</td>
<td><a href="/mcp-telegram/health">health</a> · <a href="/mcp-telegram/docs">docs</a></td>
</tr>
<tr data-path="/mcp-portfolio">
<td><span class="status" aria-label="unknown"></span></td>
<td>mcp-portfolio</td>
<td>9018</td>
<td>Holdings + yfinance + UI htmx</td>
<td><a href="/mcp-portfolio/health">health</a> · <a href="/gui">gui</a> · <a href="/mcp-portfolio/docs">docs</a></td>
</tr>
</tbody>
</table>
<section style="margin-top: 2rem;">
<h2 style="color: var(--accent); margin-bottom: 0.5rem;">Operations console</h2>
<p><a href="/console" style="font-size: 1.1rem;">/console</a> — core agent runs, stdout/stderr events, live L1, manual trigger.</p>
</section>
</main>
<footer>
<p>Status refreshed every 5 s. Caddy gateway on the port configured via <code>GATEWAY_PORT</code>.</p>
</footer>
<script src="/status.js"></script>
</body>
</html>
@@ -1,23 +0,0 @@
const rows = document.querySelectorAll("tr[data-path]");
async function poll() {
for (const row of rows) {
const dot = row.querySelector(".status");
try {
const r = await fetch(`${row.dataset.path}/health`, {
method: "GET",
cache: "no-store",
});
dot.classList.toggle("ok", r.ok);
dot.classList.toggle("err", !r.ok);
dot.setAttribute("aria-label", r.ok ? "ok" : "error");
} catch {
dot.classList.remove("ok");
dot.classList.add("err");
dot.setAttribute("aria-label", "unreachable");
}
}
}
poll();
setInterval(poll, 5000);
@@ -1,101 +0,0 @@
:root {
--bg: #0f172a;
--fg: #e2e8f0;
--muted: #94a3b8;
--card: #1e293b;
--border: #334155;
--ok: #22c55e;
--err: #ef4444;
--unknown: #64748b;
--accent: #38bdf8;
}
* { box-sizing: border-box; }
body {
margin: 0;
font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, sans-serif;
background: var(--bg);
color: var(--fg);
line-height: 1.5;
}
header, main, footer {
max-width: 960px;
margin: 0 auto;
padding: 1.5rem;
}
header h1 {
margin: 0 0 0.25rem;
color: var(--accent);
font-size: 2rem;
}
header p {
margin: 0;
color: var(--muted);
}
table {
width: 100%;
border-collapse: collapse;
background: var(--card);
border-radius: 8px;
overflow: hidden;
}
th, td {
padding: 0.75rem 1rem;
text-align: left;
border-bottom: 1px solid var(--border);
}
th {
background: #0f172a;
color: var(--muted);
font-weight: 600;
font-size: 0.85rem;
text-transform: uppercase;
letter-spacing: 0.05em;
}
tr:last-child td { border-bottom: none; }
td:nth-child(3) {
font-family: ui-monospace, "SF Mono", Menlo, monospace;
color: var(--muted);
}
a {
color: var(--accent);
text-decoration: none;
margin-right: 0.5rem;
}
a:hover { text-decoration: underline; }
.status {
display: inline-block;
width: 12px;
height: 12px;
border-radius: 50%;
background: var(--unknown);
transition: background 0.3s ease;
}
.status.ok { background: var(--ok); box-shadow: 0 0 8px var(--ok); }
.status.err { background: var(--err); box-shadow: 0 0 8px var(--err); }
footer {
color: var(--muted);
font-size: 0.85rem;
margin-top: 2rem;
}
code {
background: var(--border);
padding: 0.1rem 0.3rem;
border-radius: 3px;
font-size: 0.9em;
}
@@ -1,17 +1,40 @@
[tool.uv.workspace]
members = [
"services/common",
"services/mcp-alpaca",
"services/mcp-bybit",
"services/mcp-deribit",
"services/mcp-hyperliquid",
"services/mcp-macro",
"services/mcp-sentiment",
[project]
name = "cerbero-mcp"
version = "2.0.0"
description = "Unified multi-exchange MCP server with token-based testnet/mainnet routing"
requires-python = ">=3.11"
authors = [{ name = "Adriano", email = "adrianodalpastro@tielogic.com" }]
dependencies = [
"fastapi>=0.115",
"uvicorn[standard]>=0.32",
"pydantic>=2.9",
"pydantic-settings>=2.6",
"httpx>=0.27",
"python-json-logger>=2.0",
"websockets>=13",
"numpy>=1.26",
"scipy>=1.13",
"statsmodels>=0.14",
"pandas>=2.2",
"eth-account>=0.13.7",
"msgpack>=1.1.2",
"eth-utils>=5.3.1",
"cryptography>=43",
]
[project.scripts]
cerbero-mcp = "cerbero_mcp.__main__:main"
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
[tool.hatch.build.targets.wheel]
packages = ["src/cerbero_mcp"]
[tool.ruff]
line-length = 100
target-version = "py313"
target-version = "py311"
[tool.ruff.lint]
select = ["E", "F", "I", "W", "UP", "B", "SIM"]
@@ -35,39 +58,20 @@ extend-immutable-calls = [
[tool.pytest.ini_options]
asyncio_mode = "auto"
testpaths = ["services"]
testpaths = ["tests"]
addopts = "--import-mode=importlib"
consider_namespace_packages = true
[tool.mypy]
python_version = "3.13"
python_version = "3.11"
strict = false
warn_return_any = true
warn_unused_ignores = true
warn_redundant_casts = true
check_untyped_defs = true
ignore_missing_imports = true
mypy_path = [
"services/common/src",
"services/mcp-alpaca/src",
"services/mcp-bybit/src",
"services/mcp-deribit/src",
"services/mcp-hyperliquid/src",
"services/mcp-macro/src",
"services/mcp-sentiment/src",
]
exclude = [
"^.*tests/.*$",
"^.venv/.*$",
]
[[tool.mypy.overrides]]
module = [
"pybit.*",
"alpaca.*",
"hyperliquid.*",
"pythonjsonlogger.*",
]
module = ["pythonjsonlogger.*"]
ignore_missing_imports = true
[dependency-groups]
@@ -1,90 +0,0 @@
#!/usr/bin/env bash
# Cerbero_mcp — build & push images to the Gitea registry from the local machine.
#
# Replaces the `build-and-push` CI job in .gitea/workflows/ci.yml.
# Use it after `git push` (or without, if you want to push a "dirty" build).
# Watchtower on the VPS pulls automatically within WATCHTOWER_POLL_INTERVAL.
#
# Prerequisites:
#   - docker + buildx
#   - Gitea PAT with `write:package` scope in env $GITEA_PAT
#   - $GITEA_USER (default: adriano)
#
# Usage:
#   ./scripts/build-push.sh                # all images
#   ./scripts/build-push.sh base gateway   # only specific ones
set -euo pipefail
REGISTRY="${REGISTRY:-git.tielogic.xyz}"
IMAGE_PREFIX="${IMAGE_PREFIX:-$REGISTRY/adriano/cerbero-mcp}"
GITEA_USER="${GITEA_USER:-adriano}"
SHA="$(git rev-parse --short HEAD)"
# Build order: base first (parent of the mcp-* images), then the rest.
ALL_TARGETS=(base gateway mcp-deribit mcp-bybit mcp-hyperliquid mcp-alpaca mcp-macro mcp-sentiment)
TARGETS=("${@:-${ALL_TARGETS[@]}}")
command -v docker >/dev/null || { echo "FATAL: docker not installed"; exit 1; }
docker buildx version >/dev/null || { echo "FATAL: docker buildx not available"; exit 1; }
# Log in only if not already authenticated to the registry. For the first login run:
#   echo "<PAT>" | docker login $REGISTRY -u $GITEA_USER --password-stdin
if grep -q "\"$REGISTRY\"" ~/.docker/config.json 2>/dev/null; then
echo "=== docker already logged in to $REGISTRY (skipping login) ==="
elif [ -n "${GITEA_PAT:-}" ]; then
echo "=== docker login $REGISTRY ==="
echo "$GITEA_PAT" | docker login "$REGISTRY" -u "$GITEA_USER" --password-stdin
else
echo "FATAL: not authenticated to $REGISTRY and GITEA_PAT not set."
echo "       Run once: docker login $REGISTRY -u $GITEA_USER"
exit 1
fi
build_one() {
local name="$1"
local context file
case "$name" in
base)
context="."; file="docker/base.Dockerfile" ;;
gateway)
context="./gateway"; file="gateway/Dockerfile" ;;
mcp-*)
context="."; file="docker/${name}.Dockerfile" ;;
*)
echo "FATAL: unknown target '$name'"; exit 1 ;;
esac
if [ ! -f "$file" ]; then
echo "FATAL: Dockerfile not found: $file"; exit 1
fi
local tag_latest="$IMAGE_PREFIX/$name:latest"
local tag_sha="$IMAGE_PREFIX/$name:sha-$SHA"
echo "=== [$name] build & push ==="
local args=(buildx build --push
-f "$file"
-t "$tag_latest"
-t "$tag_sha"
)
if [[ "$name" == mcp-* ]]; then
args+=(--build-arg "BASE_IMAGE=$IMAGE_PREFIX/base"
--build-arg "BASE_TAG=latest")
fi
args+=("$context")
docker "${args[@]}"
echo " pushed: $tag_latest"
echo " pushed: $tag_sha"
}
for t in "${TARGETS[@]}"; do
build_one "$t"
done
echo
echo "=== Everything pushed (commit $SHA) ==="
echo "Watchtower on the VPS will pull within WATCHTOWER_POLL_INTERVAL (default 5min)."
echo "To force it right away:"
echo "  ssh <vps> 'cd /docker/cerbero_mcp && docker compose -f docker-compose.prod.yml pull && docker compose -f docker-compose.prod.yml up -d'"
@@ -1,202 +0,0 @@
#!/usr/bin/env bash
# Cerbero_mcp — deploy script for the production VPS.
#
# The repo is NOT cloned on the VPS: the script downloads only the files
# strictly needed at runtime (compose, Caddyfile, public assets)
# via raw HTTP from Gitea. Images are pulled pre-built from the Gitea
# registry (built on the dev laptop with scripts/build-push.sh).
#
# Prerequisites on the VPS (NOT managed by this script):
#   1. Docker Engine ≥ 24 + docker compose plugin installed.
#   2. DNS A record `cerbero-mcp.tielogic.xyz` → VPS IP (warn-only).
#   3. Ports 80 and 443 open on the firewall (for ACME + HTTPS traffic).
#   4. Gitea PAT with `read:package` scope, saved in env `$GITEA_PAT`.
#   5. Gitea username in env `$GITEA_USER` (default: adriano).
#   6. Exchange secret JSON + bearer tokens available in $SECRETS_SRC
#      (default: $DEPLOY_DIR/secrets/), which the script copies into
#      $DEPLOY_DIR/secrets/ with 600 permissions (skipped if SECRETS_SRC == DEPLOY_DIR/secrets).
#
# Idempotent: re-runnable for updates (re-downloads the config files
# from the current branch, does NOT touch an existing .env).
set -euo pipefail
DEPLOY_DIR="${DEPLOY_DIR:-/docker/cerbero_mcp}"
SECRETS_SRC="${SECRETS_SRC:-$DEPLOY_DIR/secrets}"
GITEA_USER="${GITEA_USER:-adriano}"
GITEA_RAW_BASE="${GITEA_RAW_BASE:-https://git.tielogic.xyz/Adriano/Cerbero-mcp/raw/branch/main}"
REGISTRY="${REGISTRY:-git.tielogic.xyz}"
DOMAIN="${DOMAIN:-cerbero-mcp.tielogic.xyz}"
AUDIT_LOG_DIR="${AUDIT_LOG_DIR:-/var/log/cerbero-mcp}"
echo "=== Cerbero_mcp deploy (no-clone) → $DEPLOY_DIR (domain $DOMAIN) ==="
# ──────────────────────────────────────────────────────────────
# 1. Check prerequisites
# ──────────────────────────────────────────────────────────────
command -v docker >/dev/null || { echo "FATAL: docker not installed"; exit 1; }
command -v curl >/dev/null || { echo "FATAL: curl not installed"; exit 1; }
docker compose version >/dev/null || { echo "FATAL: docker compose plugin missing"; exit 1; }
if [ -z "${GITEA_PAT:-}" ]; then
echo "FATAL: env GITEA_PAT not set. Export a PAT with read:package scope first."
exit 1
fi
# Check DNS resolution (warning only, does not block)
ip_resolved=$(getent hosts "$DOMAIN" | awk '{print $1}' | head -1 || true)
if [ -z "$ip_resolved" ]; then
echo "WARN: $DOMAIN does not resolve via DNS — Let's Encrypt TLS will fail until DNS propagates."
else
echo "DNS $DOMAIN → $ip_resolved"
fi
# ──────────────────────────────────────────────────────────────
# 2. Log in to the container registry
# ──────────────────────────────────────────────────────────────
echo "=== docker login $REGISTRY ==="
echo "$GITEA_PAT" | docker login "$REGISTRY" -u "$GITEA_USER" --password-stdin
# ──────────────────────────────────────────────────────────────
# 3. Set up dirs + download config files from the repo (no clone)
# ──────────────────────────────────────────────────────────────
sudo mkdir -p "$DEPLOY_DIR"
sudo chown "$USER:$USER" "$DEPLOY_DIR"
mkdir -p "$DEPLOY_DIR/secrets" "$DEPLOY_DIR/gateway/public"
# Config files needed at runtime, downloaded raw from Gitea.
# Idempotent: always re-fetches the version on main.
download() {
local rel="$1"
local dst="$DEPLOY_DIR/$rel"
echo " fetch: $rel"
curl -fsSL -o "$dst" "$GITEA_RAW_BASE/$rel" \
|| { echo "FATAL: download of $rel failed"; exit 1; }
}
echo "=== Downloading config from $GITEA_RAW_BASE ==="
download docker-compose.prod.yml
download docker-compose.traefik.yml
download gateway/Caddyfile
download gateway/public/index.html
download gateway/public/status.js
download gateway/public/style.css
cd "$DEPLOY_DIR"
# ──────────────────────────────────────────────────────────────
# 4. Copy secrets with 600 permissions
# ──────────────────────────────────────────────────────────────
if [ "$(realpath "$SECRETS_SRC")" != "$(realpath "$DEPLOY_DIR/secrets")" ]; then
if [ ! -d "$SECRETS_SRC" ]; then
echo "FATAL: secrets src dir $SECRETS_SRC does not exist."
echo "       Expected to contain: deribit.json bybit.json hyperliquid.json alpaca.json"
echo "       macro.json sentiment.json core.token observer.token"
exit 1
fi
echo "=== Copying secrets from $SECRETS_SRC ==="
for f in deribit.json bybit.json hyperliquid.json alpaca.json macro.json sentiment.json core.token observer.token; do
if [ -f "$SECRETS_SRC/$f" ]; then
cp "$SECRETS_SRC/$f" "secrets/$f"
chmod 600 "secrets/$f"
echo " ok: secrets/$f"
else
echo "  WARN: $SECRETS_SRC/$f missing — the corresponding service will fail at boot."
fi
done
else
echo "=== Secrets already in $DEPLOY_DIR/secrets — chmod 600 only ==="
for f in deribit.json bybit.json hyperliquid.json alpaca.json macro.json sentiment.json core.token observer.token; do
[ -f "secrets/$f" ] && chmod 600 "secrets/$f" && echo "  ok: secrets/$f" \
|| echo "  WARN: secrets/$f missing — the corresponding service will fail at boot."
done
fi
# ──────────────────────────────────────────────────────────────
# 5. Create/update .env (preserve existing)
# ──────────────────────────────────────────────────────────────
if [ ! -f .env ]; then
echo "=== Creating initial .env (testnet by default) ==="
cat > .env <<EOF
# Cerbero_mcp deploy config — edit to switch to mainnet
ACME_EMAIL=adrianodalpastro@tielogic.com
GATEWAY_HTTP_PORT=80
GATEWAY_HTTPS_PORT=443
WRITE_ALLOWLIST="127.0.0.1/32 ::1/128 172.16.0.0/12"
IMAGE_TAG=latest
IMAGE_PREFIX=git.tielogic.xyz/adriano/cerbero-mcp
# Exchange environments (true=testnet, false=mainnet).
# IMPORTANT: for mainnet also add "environment":"mainnet" to the matching
# secret JSON, otherwise boot aborts for safety (see consistency_check).
DERIBIT_TESTNET=true
BYBIT_TESTNET=true
HYPERLIQUID_TESTNET=true
ALPACA_PAPER=true
# Allows mainnet without an explicit creds["environment"]="mainnet" (discouraged).
STRICT_MAINNET=true
# Persistent audit log for write endpoints (place_order, cancel, etc.).
AUDIT_LOG_DIR=$AUDIT_LOG_DIR
# Watchtower auto-update polling interval (sec).
WATCHTOWER_POLL_INTERVAL=300
EOF
echo "  $DEPLOY_DIR/.env created. Review it before the first up."
else
echo "=== Pre-existing .env — left untouched ==="
fi
# ──────────────────────────────────────────────────────────────
# 6. Host audit log dir (bind volume)
# ──────────────────────────────────────────────────────────────
sudo mkdir -p "$AUDIT_LOG_DIR"
sudo chown 1000:1000 "$AUDIT_LOG_DIR"
echo "Audit log dir: $AUDIT_LOG_DIR (chown 1000:1000)"
# ──────────────────────────────────────────────────────────────
# 7. Pull images + up
# ──────────────────────────────────────────────────────────────
COMPOSE_FILES=("-f" "docker-compose.prod.yml")
if [ "${BEHIND_TRAEFIK:-false}" = "true" ]; then
echo "=== behind-traefik mode active (network ${TRAEFIK_NETWORK:-gitea_traefik-public}) ==="
COMPOSE_FILES+=("-f" "docker-compose.traefik.yml")
fi
# Machine-specific local override (e.g. the watchtower DOCKER_API_VERSION fix).
# Not versioned (in .gitignore); create it by hand on the VPS if needed.
if [ -f "docker-compose.local.yml" ]; then
echo "=== Local override detected: docker-compose.local.yml ==="
COMPOSE_FILES+=("-f" "docker-compose.local.yml")
fi
echo "=== docker compose pull + up ==="
docker compose "${COMPOSE_FILES[@]}" --env-file .env pull
docker compose "${COMPOSE_FILES[@]}" --env-file .env up -d
# ──────────────────────────────────────────────────────────────
# 8. Check status
# ──────────────────────────────────────────────────────────────
sleep 5
echo "=== Container status ==="
docker compose "${COMPOSE_FILES[@]}" --env-file .env ps
echo
echo "=== Smoke test (health check via the public gateway) ==="
sleep 10
if curl -sf -o /dev/null -m 10 "https://$DOMAIN/mcp-macro/health"; then
echo " OK: https://$DOMAIN/mcp-macro/health → 200"
else
echo "  WARN: https://$DOMAIN/mcp-macro/health not responding (DNS or cert not ready yet?)"
echo "        Retry in 30s or check: docker compose -f docker-compose.prod.yml logs gateway"
fi
echo
echo "=== Deploy complete ==="
echo "Useful commands (compose files: ${COMPOSE_FILES[*]}):"
echo "  Logs:    docker compose ${COMPOSE_FILES[*]} --env-file .env logs -f <service>"
echo "  Audit:   tail -f $AUDIT_LOG_DIR/*.audit.jsonl"
echo "  Restart: docker compose ${COMPOSE_FILES[*]} --env-file .env restart <service>"
echo "  Stop:    docker compose ${COMPOSE_FILES[*]} --env-file .env down"
echo "  Update:  re-run this script (re-downloads config + pulls images)"
@@ -0,0 +1,148 @@
#!/usr/bin/env bash
# deploy-vps.sh — deploy Cerbero MCP V2 on the VPS without going through a registry.
#
# Workflow:
#   1. git fetch + reset to the target branch
#   2. docker compose build (rebuilds the image if the SHA changed)
#   3. docker compose down (graceful, max 15s)
#   4. docker compose up -d
#   5. wait for the /health healthcheck
#   6. automatic rollback to the previous SHA if health fails
#
# Run ON THE VPS, inside the repo directory (e.g. /opt/cerbero-mcp).
#
# Usage (on the VPS):
#   cd /opt/cerbero-mcp
#   bash scripts/deploy-vps.sh
#
# Usage (from a dev machine, via SSH):
#   ssh user@vps 'cd /opt/cerbero-mcp && bash scripts/deploy-vps.sh'
#
# Env variables (optional):
#   BRANCH                  git branch to deploy (default: V2.0.0)
#   SERVICE                 docker compose service name (default: cerbero-mcp)
#   PORT                    /health port to ping (default: from .env, fallback 9000)
#   HEALTH_TIMEOUT_SECONDS  max health wait (default: 30)
#   HEALTH_INTERVAL         seconds between health retries (default: 2)
#   FORCE                   if "1", rebuild + restart even if the SHA is unchanged
#   SKIP_ROLLBACK           if "1", skip the rollback on health fail (for debugging)
set -euo pipefail
# ─── Config ──────────────────────────────────────────────────────────────
BRANCH="${BRANCH:-V2.0.0}"
SERVICE="${SERVICE:-cerbero-mcp}"
HEALTH_TIMEOUT_SECONDS="${HEALTH_TIMEOUT_SECONDS:-30}"
HEALTH_INTERVAL="${HEALTH_INTERVAL:-2}"
# Resolve PORT from .env when not passed
if [[ -z "${PORT:-}" ]]; then
if [[ -f .env ]] && grep -q '^PORT=' .env; then
PORT="$(grep '^PORT=' .env | head -1 | cut -d= -f2 | tr -d '[:space:]"')"
fi
fi
PORT="${PORT:-9000}"
HEALTH_URL="http://localhost:${PORT}/health"
# ─── Pre-check ───────────────────────────────────────────────────────────
command -v git >/dev/null || { echo "FATAL: git not installed"; exit 1; }
command -v docker >/dev/null || { echo "FATAL: docker not installed"; exit 1; }
command -v curl >/dev/null || { echo "FATAL: curl not installed"; exit 1; }
docker compose version >/dev/null 2>&1 || { echo "FATAL: docker compose not available"; exit 1; }
if [[ ! -f .env ]]; then
echo "FATAL: .env not found in $(pwd)."
echo "       Copy .env.example → .env and fill in the values before the first deploy."
exit 1
fi
if [[ ! -f docker-compose.yml ]]; then
echo "FATAL: docker-compose.yml not found in $(pwd)."
exit 1
fi
# Check the working tree is clean
if [[ -n "$(git status --porcelain)" ]]; then
echo "FATAL: working tree not clean. Unmanaged local changes:"
git status --short
echo "       Resolve them before deploying (e.g. git stash or git reset)."
exit 1
fi
# ─── Current state ───────────────────────────────────────────────────────
CURRENT_SHA="$(git rev-parse --short HEAD)"
echo "==> current SHA (rollback target): $CURRENT_SHA"
echo "==> branch: $BRANCH"
echo "==> port: $PORT"
# ─── Fetch + reset ───────────────────────────────────────────────────────
echo "==> git fetch + reset --hard origin/${BRANCH}"
git fetch --prune origin
git reset --hard "origin/${BRANCH}"
NEW_SHA="$(git rev-parse --short HEAD)"
echo "==> new SHA: $NEW_SHA"
if [[ "$CURRENT_SHA" == "$NEW_SHA" ]] && [[ "${FORCE:-0}" != "1" ]]; then
echo "==> Already up to date at $NEW_SHA. No deploy needed."
echo "    (export FORCE=1 to restart anyway)"
exit 0
fi
if [[ "$CURRENT_SHA" == "$NEW_SHA" ]]; then
echo "==> FORCE=1 → rebuilding and restarting even though the SHA is unchanged"
fi
# ─── Rollback function ───────────────────────────────────────────────────
rollback() {
if [[ "${SKIP_ROLLBACK:-0}" == "1" ]]; then
echo "==> SKIP_ROLLBACK=1 → skipping automatic rollback"
return
fi
if [[ "$CURRENT_SHA" == "$NEW_SHA" ]]; then
echo "==> SHA unchanged, nothing to roll back"
return
fi
echo "==> ROLLBACK to $CURRENT_SHA"
git reset --hard "$CURRENT_SHA"
docker compose build "$SERVICE"
docker compose up -d --force-recreate "$SERVICE"
echo "==> rollback done. Verify the state manually."
}
# ─── Build ───────────────────────────────────────────────────────────────
echo "==> docker compose build $SERVICE"
docker compose build "$SERVICE"
# ─── Down + up ───────────────────────────────────────────────────────────
echo "==> docker compose down --timeout 15"
docker compose down --timeout 15
echo "==> docker compose up -d"
docker compose up -d
# ─── Health check ────────────────────────────────────────────────────────
echo "==> waiting for /health (timeout ${HEALTH_TIMEOUT_SECONDS}s, retrying every ${HEALTH_INTERVAL}s)"
deadline=$(( $(date +%s) + HEALTH_TIMEOUT_SECONDS ))
while [[ $(date +%s) -lt $deadline ]]; do
if curl -fsS "$HEALTH_URL" >/dev/null 2>&1; then
echo
echo "==> health OK"
curl -s "$HEALTH_URL"
echo
echo
echo "==> deploy DONE (SHA $CURRENT_SHA$NEW_SHA, branch $BRANCH)"
exit 0
fi
printf "."
sleep "$HEALTH_INTERVAL"
done
echo
echo "==> FAIL: /health non risponde dopo ${HEALTH_TIMEOUT_SECONDS}s"
echo "==> log container (ultime 40 righe):"
docker compose logs --tail 40 "$SERVICE" || true
rollback
exit 1
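The health gate in the script above (absolute deadline, fixed retry interval, success short-circuits) can be sketched in Python; `wait_healthy` and `probe` are hypothetical names, stdlib only:

```python
import time

def wait_healthy(check, timeout_s=60, interval_s=2.0,
                 clock=time.monotonic, sleep=time.sleep):
    """Poll `check()` until it returns True or the deadline passes.

    Mirrors the shell loop: deadline = now + timeout, retry every interval.
    """
    deadline = clock() + timeout_s
    while clock() < deadline:
        if check():
            return True
        sleep(interval_s)
    return False

# Example: a probe that succeeds on the third attempt.
attempts = {"n": 0}
def probe():
    attempts["n"] += 1
    return attempts["n"] >= 3

ok = wait_healthy(probe, timeout_s=5, interval_s=0.01)
```

If the deadline elapses first, the caller (like the script) falls through to log collection and rollback.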
@@ -0,0 +1,132 @@
#!/usr/bin/env python3
"""IBKR OAuth 1.0a Self-Service setup helper.
Phases (run in order, providing flags as you progress):
1. python scripts/ibkr_oauth_setup.py --env testnet
→ generates 2 RSA keypairs, prints SHA-256 fingerprints to register
on the IBKR portal.
2. (manual) Login at https://www.interactivebrokers.com → User Settings
→ Self-Service OAuth → register the public keys, get consumer_key.
3. python scripts/ibkr_oauth_setup.py --env testnet --consumer-key <K> \\
--request-token
→ exchanges consumer_key for an unauthorized request token + URL.
4. (manual) Open the URL, approve, copy the verifier code.
5. python scripts/ibkr_oauth_setup.py --env testnet --verifier <V>
→ exchanges verifier for long-lived access_token + secret.
Copy the printed values into .env.
Repeat for --env mainnet using your live IBKR account.
"""
from __future__ import annotations
import argparse
import hashlib
import sys
from pathlib import Path
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa
def _gen_keypair(out: Path) -> str:
key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
pem = key.private_bytes(
encoding=serialization.Encoding.PEM,
format=serialization.PrivateFormat.TraditionalOpenSSL,
encryption_algorithm=serialization.NoEncryption(),
)
out.write_bytes(pem)
out.chmod(0o600)
pub = key.public_key().public_bytes(
encoding=serialization.Encoding.PEM,
format=serialization.PublicFormat.SubjectPublicKeyInfo,
)
pub_path = out.with_suffix(out.suffix + ".pub")
pub_path.write_bytes(pub)
return f"SHA256:{hashlib.sha256(pub).hexdigest()}"
def cmd_init(env: str, secrets_dir: Path) -> int:
secrets_dir.mkdir(parents=True, exist_ok=True)
sig = secrets_dir / f"ibkr_signature_{env}.pem"
enc = secrets_dir / f"ibkr_encryption_{env}.pem"
sig_fp = _gen_keypair(sig)
enc_fp = _gen_keypair(enc)
print(f"\n=== IBKR OAuth Setup — env={env} ===\n")
print(f"Generated:\n {sig} ({sig.stat().st_size} bytes)")
print(f" {enc} ({enc.stat().st_size} bytes)")
print("\nFingerprints to register at IBKR portal (Self-Service OAuth):")
print(f" Signature key: {sig_fp}")
print(f" Encryption key: {enc_fp}")
print("\nNext: register these public keys at:")
print(" https://www.interactivebrokers.com (User Settings → OAuth)")
print("\nAlso paste in .env:")
print(f" IBKR_SIGNATURE_KEY_PATH_{env.upper()}={sig}")
print(f" IBKR_ENCRYPTION_KEY_PATH_{env.upper()}={enc}\n")
return 0
def cmd_request_token(env: str, consumer_key: str) -> int:
print(f"\n=== Step 2 — request token for {env} ===\n")
print(f"Consumer key: {consumer_key}")
print(
"\nVisit this URL in a browser, log in to IBKR, authorize the app,\n"
"and copy the displayed verifier code:\n"
)
print(
f" https://www.interactivebrokers.com/sso/Authenticator?"
f"oauth_consumer_key={consumer_key}&action=request_token\n"
)
print("Then re-run with: --verifier <code>\n")
return 0
def cmd_verifier(env: str, verifier: str) -> int:
print(f"\n=== Step 3 — exchange verifier for {env} ===\n")
print(f"Verifier received: {verifier[:8]}...")
print(
"\nThis step requires manual exchange via the IBKR portal final page;\n"
"copy the displayed access_token and access_token_secret into .env:\n"
)
print(f" IBKR_ACCESS_TOKEN_{env.upper()}=<paste from portal>")
print(f" IBKR_ACCESS_TOKEN_SECRET_{env.upper()}=<paste from portal>\n")
print("Also set:")
print(f" IBKR_CONSUMER_KEY_{env.upper()}=<the consumer key from step 1>")
print(" IBKR_DH_PRIME=<paste DH prime hex from portal>\n")
return 0
def main() -> int:
p = argparse.ArgumentParser(description=__doc__)
p.add_argument("--env", choices=["testnet", "mainnet"], required=True)
p.add_argument("--secrets-dir", default="secrets")
p.add_argument("--consumer-key")
p.add_argument("--request-token", action="store_true")
p.add_argument("--verifier")
p.add_argument(
"--rotate",
action="store_true",
help="Generate new keypairs alongside existing (for rotation)",
)
args = p.parse_args()
sec_dir = Path(args.secrets_dir)
if args.verifier:
return cmd_verifier(args.env, args.verifier)
if args.consumer_key and args.request_token:
return cmd_request_token(args.env, args.consumer_key)
if args.rotate:
for kind in ("signature", "encryption"):
new = sec_dir / f"ibkr_{kind}_{args.env}.pem.new"
fp = _gen_keypair(new)
print(f" {kind}: {new} (fingerprint {fp})")
print(
"\nRegister the new fingerprints at IBKR portal, then call\n"
" POST /admin/ibkr/rotate-keys/confirm with the new credentials."
)
return 0
return cmd_init(args.env, sec_dir)
if __name__ == "__main__":
sys.exit(main())
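The fingerprint the script asks you to register is just `SHA256:` plus the hex digest of the public-key PEM bytes (see `_gen_keypair`). A stdlib-only sketch to recompute it from an existing `.pub` file; the dummy PEM body below stands in for a real key:

```python
import hashlib
import tempfile
from pathlib import Path

def fingerprint(pub_pem_path: Path) -> str:
    """Recompute the SHA-256 fingerprint of a public-key PEM file,
    in the same `SHA256:<hex>` format the setup script prints."""
    pub = pub_pem_path.read_bytes()
    return f"SHA256:{hashlib.sha256(pub).hexdigest()}"

# Demo with a placeholder PEM body (a real .pub is written by the setup script).
tmp = Path(tempfile.mkdtemp()) / "ibkr_signature_testnet.pem.pub"
tmp.write_bytes(b"-----BEGIN PUBLIC KEY-----\nMFkw...\n-----END PUBLIC KEY-----\n")
fp = fingerprint(tmp)
```

Useful as a sanity check that the value shown on the IBKR portal matches the key file you actually deployed.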
@@ -1,23 +0,0 @@
[project]
name = "mcp-common"
version = "0.1.0"
requires-python = ">=3.11"
dependencies = [
"fastapi>=0.115",
"uvicorn[standard]>=0.30",
"mcp>=1.0",
"httpx>=0.27",
"pydantic>=2.6",
"pydantic-settings>=2.3",
"python-json-logger>=2.0",
]
[project.optional-dependencies]
dev = ["pytest>=8", "pytest-asyncio>=0.23", "pytest-httpx>=0.30", "ruff>=0.5"]
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
[tool.hatch.build.targets.wheel]
packages = ["src/mcp_common"]
@@ -1 +0,0 @@
__all__ = []
@@ -1,95 +0,0 @@
"""App factory comune per i servizi mcp-{exchange}.
Centralizza il boilerplate dei `__main__.py`:
- configure_root_logging (JSON)
- fail_fast_if_missing su env mandatory
- summarize env
- load creds JSON
- resolve_environment con default URLs
- load token store
- delega creazione client + app a callback per-servizio
- uvicorn.run
Ogni servizio invoca `run_exchange_main(spec)` con uno spec dichiarativo.
"""
from __future__ import annotations
import json
import os
from collections.abc import Callable
from dataclasses import dataclass
from typing import Any
import uvicorn
from mcp_common.auth import load_token_store_from_files
from mcp_common.env_validation import fail_fast_if_missing, require_env, summarize
from mcp_common.environment import (
EnvironmentInfo,
consistency_check,
resolve_environment,
)
from mcp_common.logging import configure_root_logging
@dataclass(frozen=True)
class ExchangeAppSpec:
exchange: str
creds_env_var: str
env_var: str # e.g. "BYBIT_TESTNET", "ALPACA_PAPER"
flag_key: str # field in the secret JSON ("testnet" or "paper")
default_base_url_live: str
default_base_url_testnet: str
default_port: int
build_client: Callable[[dict, EnvironmentInfo], Any]
build_app: Callable[..., Any]
extra_summarize_envs: tuple[str, ...] = ()
def run_exchange_main(spec: ExchangeAppSpec) -> None:
configure_root_logging()
fail_fast_if_missing([spec.creds_env_var])
summarize([
spec.creds_env_var,
"CORE_TOKEN_FILE",
"OBSERVER_TOKEN_FILE",
"PORT",
"HOST",
spec.env_var,
*spec.extra_summarize_envs,
])
creds_file = require_env(spec.creds_env_var, f"{spec.exchange} credentials JSON path")
with open(creds_file) as f:
creds = json.load(f)
env_info = resolve_environment(
creds,
env_var=spec.env_var,
flag_key=spec.flag_key,
exchange=spec.exchange,
default_base_url_live=spec.default_base_url_live,
default_base_url_testnet=spec.default_base_url_testnet,
)
# Safety: prevents accidental switches to mainnet without explicit confirmation
# in the secret. Raises EnvironmentMismatchError → boot abort on mismatch.
strict_mainnet = os.environ.get("STRICT_MAINNET", "true").lower() not in ("0", "false", "no")
consistency_check(env_info, creds, strict_mainnet=strict_mainnet)
client = spec.build_client(creds, env_info)
token_store = load_token_store_from_files(
core_token_file=os.environ.get("CORE_TOKEN_FILE"),
observer_token_file=os.environ.get("OBSERVER_TOKEN_FILE"),
)
app = spec.build_app(client=client, token_store=token_store, creds=creds, env_info=env_info)
uvicorn.run(
app,
log_config=None,
host=os.environ.get("HOST", "0.0.0.0"),
port=int(os.environ.get("PORT", str(spec.default_port))),
)
@@ -1,98 +0,0 @@
from __future__ import annotations
from collections.abc import Callable
from dataclasses import dataclass, field
from functools import wraps
from fastapi import HTTPException, Request, status
@dataclass
class Principal:
name: str
capabilities: set[str] = field(default_factory=set)
@dataclass
class TokenStore:
tokens: dict[str, Principal]
def get(self, token: str) -> Principal | None:
return self.tokens.get(token)
def require_principal(request: Request) -> Principal:
auth = request.headers.get("Authorization", "")
if not auth.startswith("Bearer "):
raise HTTPException(status.HTTP_401_UNAUTHORIZED, "missing bearer token")
token = auth[len("Bearer "):].strip()
store: TokenStore = request.app.state.token_store
principal = store.get(token)
if principal is None:
raise HTTPException(status.HTTP_403_FORBIDDEN, "invalid token")
return principal
def acl_requires(*, core: bool = False, observer: bool = False) -> Callable:
"""Decorator: require at least one matching capability."""
allowed: set[str] = set()
if core:
allowed.add("core")
if observer:
allowed.add("observer")
def decorator(func: Callable) -> Callable:
@wraps(func)
async def async_wrapper(*args, **kwargs):
principal = kwargs.get("principal")
if principal is None:
for a in args:
if isinstance(a, Principal):
principal = a
break
if principal is None or not (principal.capabilities & allowed):
raise HTTPException(
status.HTTP_403_FORBIDDEN,
f"capability required: {allowed}",
)
return await func(*args, **kwargs)
@wraps(func)
def sync_wrapper(*args, **kwargs):
principal = kwargs.get("principal")
if principal is None:
for a in args:
if isinstance(a, Principal):
principal = a
break
if principal is None or not (principal.capabilities & allowed):
raise HTTPException(
status.HTTP_403_FORBIDDEN,
f"capability required: {allowed}",
)
return func(*args, **kwargs)
return async_wrapper if _is_coro(func) else sync_wrapper
return decorator
def _is_coro(func: Callable) -> bool:
import asyncio
return asyncio.iscoroutinefunction(func)
def load_token_store_from_files(
core_token_file: str | None,
observer_token_file: str | None,
) -> TokenStore:
tokens: dict[str, Principal] = {}
if core_token_file:
with open(core_token_file) as f:
tokens[f.read().strip()] = Principal(name="core", capabilities={"core"})
if observer_token_file:
with open(observer_token_file) as f:
tokens[f.read().strip()] = Principal(
name="observer", capabilities={"observer"}
)
return TokenStore(tokens=tokens)
@@ -1,69 +0,0 @@
"""Env validation policy: fail-fast per mandatory, soft per optional.
Usage al boot di ogni mcp `__main__.py`:
from mcp_common.env_validation import require_env, optional_env, summarize
creds_file = require_env("CREDENTIALS_FILE", "deribit credentials JSON path")
host = optional_env("HOST", default="0.0.0.0")
summarize(["CREDENTIALS_FILE", "HOST", "PORT"])
"""
from __future__ import annotations
import logging
import os
import sys
logger = logging.getLogger(__name__)
class MissingEnvError(RuntimeError):
"""Mandatory env var absent or empty."""
def require_env(name: str, description: str = "") -> str:
val = (os.environ.get(name) or "").strip()
if not val:
msg = f"missing mandatory env var: {name}"
if description:
msg += f" ({description})"
logger.error(msg)
raise MissingEnvError(msg)
return val
def optional_env(name: str, *, default: str = "") -> str:
val = (os.environ.get(name) or "").strip()
if not val:
if default:
logger.info("env %s not set, using default=%r", name, default)
return default
return val
def summarize(names: list[str]) -> None:
sensitive_tokens = ("SECRET", "KEY", "TOKEN", "PASSWORD", "CREDENTIAL", "WALLET")
for n in names:
val = os.environ.get(n)
if val is None:
logger.info("env[%s]: <unset>", n)
continue
if any(t in n.upper() for t in sensitive_tokens):
logger.info("env[%s]: <set, %d chars>", n, len(val))
else:
logger.info("env[%s]: %s", n, val)
def fail_fast_if_missing(names: list[str]) -> None:
missing: list[str] = []
for n in names:
if not (os.environ.get(n) or "").strip():
missing.append(n)
if missing:
logger.error("boot aborted: missing mandatory env vars: %s", missing)
print(
f"FATAL: missing mandatory env vars: {missing}",
file=sys.stderr,
)
sys.exit(2)
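The redaction rule in `summarize` never logs a value whose variable name looks sensitive; a standalone sketch of that predicate (helper name hypothetical):

```python
SENSITIVE = ("SECRET", "KEY", "TOKEN", "PASSWORD", "CREDENTIAL", "WALLET")

def render(name: str, value: str) -> str:
    """Mirror summarize(): show only the length for env vars whose
    name contains a sensitive marker, the raw value otherwise."""
    if any(t in name.upper() for t in SENSITIVE):
        return f"<set, {len(value)} chars>"
    return value

a = render("DERIBIT_API_KEY", "abc123")  # redacted
b = render("HOST", "0.0.0.0")            # logged as-is
```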
@@ -1,138 +0,0 @@
"""Resolver di ambiente (testnet/mainnet) per MCP exchange.
Precedenza: env var > campo secret > default True (testnet).
Safety: `consistency_check` previene switch accidentali a mainnet senza
conferma esplicita nel secret JSON.
"""
from __future__ import annotations
import logging
import os
from dataclasses import dataclass
from typing import Literal
logger = logging.getLogger(__name__)
Environment = Literal["testnet", "mainnet"]
Source = Literal["env", "credentials", "default"]
TRUTHY = {"1", "true", "yes", "on"}
# Tokens in the base_url that indicate a testnet endpoint (case-insensitive).
TESTNET_URL_HINTS = ("test", "testnet", "paper")
class EnvironmentMismatchError(RuntimeError):
"""Boot abort: ambiente risolto non matcha conferma esplicita nel secret."""
@dataclass(frozen=True)
class EnvironmentInfo:
exchange: str
environment: Environment
source: Source
env_value: str | None
base_url: str
def resolve_environment(
creds: dict,
*,
env_var: str,
flag_key: str,
exchange: str,
default_base_url_live: str | None = None,
default_base_url_testnet: str | None = None,
) -> EnvironmentInfo:
"""Risolvi l'ambiente per un MCP exchange.
creds: dict letto dal secret JSON. Può contenere base_url_live/base_url_testnet
per override; in assenza vengono usati i default kwargs.
env_var: nome della env var di override (es. DERIBIT_TESTNET).
flag_key: chiave booleana nel secret JSON (es. "testnet" o "paper" per alpaca).
exchange: nome exchange per logging/info.
default_base_url_live / default_base_url_testnet: URL canonici dell'exchange,
applicati se non presenti in creds.
"""
env_value = os.environ.get(env_var)
if env_value is not None:
is_test = env_value.strip().lower() in TRUTHY
environment: Environment = "testnet" if is_test else "mainnet"
source: Source = "env"
elif flag_key in creds:
environment = "testnet" if bool(creds[flag_key]) else "mainnet"
source = "credentials"
else:
environment = "testnet"
source = "default"
if default_base_url_live is not None:
creds.setdefault("base_url_live", default_base_url_live)
if default_base_url_testnet is not None:
creds.setdefault("base_url_testnet", default_base_url_testnet)
base_url = creds["base_url_testnet"] if environment == "testnet" else creds["base_url_live"]
return EnvironmentInfo(
exchange=exchange,
environment=environment,
source=source,
env_value=env_value,
base_url=base_url,
)
def consistency_check(
env_info: EnvironmentInfo,
creds: dict,
*,
strict_mainnet: bool = True,
) -> list[str]:
"""Verifica coerenza environment risolto vs secret JSON. Restituisce
lista di warning string. Solleva EnvironmentMismatchError per mismatch
bloccanti.
Regole:
- Se `creds["environment"]` è presente e DIVERSO da `env_info.environment`:
→ raise EnvironmentMismatchError (declared vs resolved mismatch).
- Se `env_info.environment == "mainnet"` e `creds.get("environment") !=
"mainnet"`: con `strict_mainnet=True` → raise (richiede conferma
esplicita). Con `strict_mainnet=False` → warning.
- Se `env_info.base_url` contiene token testnet ("test", "testnet",
"paper") ma `env_info.environment == "mainnet"` (o viceversa): warning
(URL/environment incoerenti).
"""
warnings: list[str] = []
declared = creds.get("environment")
if declared and declared != env_info.environment:
raise EnvironmentMismatchError(
f"{env_info.exchange}: secret declared environment={declared!r} "
f"but resolver resolved environment={env_info.environment!r}"
)
if env_info.environment == "mainnet" and declared != "mainnet":
msg = (
f"{env_info.exchange}: resolved mainnet without explicit confirmation "
"in secret. Add `\"environment\": \"mainnet\"` to the credentials JSON."
)
if strict_mainnet:
raise EnvironmentMismatchError(msg)
warnings.append(msg)
url_lower = (env_info.base_url or "").lower()
has_test_hint = any(token in url_lower for token in TESTNET_URL_HINTS)
if env_info.environment == "mainnet" and has_test_hint:
warnings.append(
f"{env_info.exchange}: environment=mainnet but base_url contains "
f"testnet hint ({env_info.base_url!r})"
)
if env_info.environment == "testnet" and not has_test_hint and url_lower:
warnings.append(
f"{env_info.exchange}: environment=testnet but base_url does not "
f"appear to be a testnet endpoint ({env_info.base_url!r})"
)
for w in warnings:
logger.warning("environment consistency: %s", w)
return warnings
@@ -1,220 +0,0 @@
from __future__ import annotations
import json
import os
import time
import uuid
from collections.abc import Callable
from contextlib import AbstractAsyncContextManager
from datetime import UTC, datetime
from typing import Any
from fastapi import FastAPI, HTTPException, Request
from fastapi.exceptions import RequestValidationError
from fastapi.responses import JSONResponse, Response
from starlette.middleware.base import BaseHTTPMiddleware
from mcp_common.auth import TokenStore
Lifespan = Callable[[FastAPI], AbstractAsyncContextManager[None]]
def _error_envelope(
*,
type_: str,
code: str,
message: str,
retryable: bool,
suggested_fix: str | None = None,
details: dict | None = None,
request_id: str | None = None,
) -> dict:
env: dict[str, Any] = {
"error": {
"type": type_,
"code": code,
"message": message,
"retryable": retryable,
},
"request_id": request_id or uuid.uuid4().hex,
"data_timestamp": datetime.now(UTC).isoformat(),
}
if suggested_fix:
env["error"]["suggested_fix"] = suggested_fix
if details:
env["error"]["details"] = details
return env
class _TimestampInjectorMiddleware(BaseHTTPMiddleware):
"""CER-P5-001: inietta data_timestamp nei response tool.
- Dict response: body gains `data_timestamp` se mancante.
- List of dicts: ogni item gains `data_timestamp` se mancante.
- Header `X-Data-Timestamp` sempre presente (universale per list primitive).
Skips /health (già popolato) e /mcp (JSON-RPC bridge) e non-JSON responses.
"""
async def dispatch(self, request: Request, call_next):
response = await call_next(request)
path = request.url.path
if not path.startswith("/tools/"):
return response
ctype = response.headers.get("content-type", "")
if "application/json" not in ctype:
return response
body = b""
async for chunk in response.body_iterator:
body += chunk
ts = datetime.now(UTC).isoformat()
try:
data = json.loads(body) if body else None
except Exception:
headers = dict(response.headers)
headers["X-Data-Timestamp"] = ts
return Response(
content=body,
status_code=response.status_code,
headers=headers,
media_type=response.media_type,
)
modified = False
if isinstance(data, dict) and "data_timestamp" not in data:
data["data_timestamp"] = ts
modified = True
elif isinstance(data, list):
for item in data:
if isinstance(item, dict) and "data_timestamp" not in item:
item["data_timestamp"] = ts
modified = True
headers = dict(response.headers)
headers["X-Data-Timestamp"] = ts
if modified:
new_body = json.dumps(data, default=str).encode()
headers.pop("content-length", None)
return Response(
content=new_body,
status_code=response.status_code,
headers=headers,
media_type="application/json",
)
return Response(
content=body,
status_code=response.status_code,
headers=headers,
media_type=response.media_type,
)
def build_app(
*,
name: str,
version: str,
token_store: TokenStore,
lifespan: Lifespan | None = None,
) -> FastAPI:
root_path = os.getenv("ROOT_PATH", "")
app = FastAPI(title=name, version=version, root_path=root_path, lifespan=lifespan)
app.state.token_store = token_store
app.state.boot_at = time.time()
app.add_middleware(_TimestampInjectorMiddleware)
@app.middleware("http")
async def _latency_header(request: Request, call_next):
t0 = time.perf_counter()
response = await call_next(request)
dur_ms = (time.perf_counter() - t0) * 1000
response.headers["X-Duration-Ms"] = f"{dur_ms:.2f}"
return response
# CER-P5-002 error envelope: global exception handlers
@app.exception_handler(HTTPException)
async def _http_exc(request: Request, exc: HTTPException):
retryable = exc.status_code in (408, 429, 502, 503, 504)
code_map = {
400: "BAD_REQUEST", 401: "UNAUTHORIZED", 403: "FORBIDDEN",
404: "NOT_FOUND", 408: "TIMEOUT", 409: "CONFLICT",
422: "VALIDATION_ERROR", 429: "RATE_LIMIT",
500: "INTERNAL_ERROR", 502: "UPSTREAM_ERROR",
503: "UNAVAILABLE", 504: "GATEWAY_TIMEOUT",
}
code = code_map.get(exc.status_code, f"HTTP_{exc.status_code}")
message = "HTTP error"
details: dict | None = None
detail = exc.detail
# Preserve rail-style detail {"error": "..", "message": ".."} as code
if isinstance(detail, dict):
if isinstance(detail.get("error"), str):
code = detail["error"].upper()
message = str(detail.get("message") or detail.get("error") or message)
details = detail
elif isinstance(detail, str):
message = detail
return JSONResponse(
status_code=exc.status_code,
content=_error_envelope(
type_="http_error",
code=code,
message=message,
retryable=retryable,
details=details,
),
)
@app.exception_handler(RequestValidationError)
async def _validation_exc(request: Request, exc: RequestValidationError):
errs = exc.errors()
first_loc = ".".join(str(x) for x in errs[0]["loc"]) if errs else "body"
suggestion = (
f"check field '{first_loc}': "
+ (errs[0]["msg"] if errs else "invalid input")
)
# Sanitize ctx values: pydantic v2 may put a ValueError in ctx['error'],
# which is not JSON-serializable. Reduce them to strings.
safe_errs: list[dict] = []
for e in errs[:5]:
ne: dict = {}
for k, v in e.items():
if k == "ctx" and isinstance(v, dict):
ne[k] = {ck: str(cv) for ck, cv in v.items()}
else:
ne[k] = v
safe_errs.append(ne)
return JSONResponse(
status_code=422,
content=_error_envelope(
type_="validation_error",
code="INVALID_INPUT",
message=f"request body validation failed on {first_loc}",
retryable=False,
suggested_fix=suggestion,
details={"errors": safe_errs},
),
)
@app.exception_handler(Exception)
async def _unhandled(request: Request, exc: Exception):
return JSONResponse(
status_code=500,
content=_error_envelope(
type_="internal_error",
code="UNHANDLED_EXCEPTION",
message=f"{type(exc).__name__}: {str(exc)[:300]}",
retryable=True,
),
)
@app.get("/health")
def health():
return {
"status": "healthy",
"name": name,
"version": version,
"uptime_seconds": int(time.time() - app.state.boot_at),
"data_timestamp": datetime.now(UTC).isoformat(),
}
return app
@@ -1,137 +0,0 @@
from __future__ import annotations
import json
from unittest.mock import MagicMock, patch
import pytest
from mcp_common.app_factory import ExchangeAppSpec, run_exchange_main
from mcp_common.environment import EnvironmentInfo
def _make_spec(build_client=None, build_app=None) -> ExchangeAppSpec:
return ExchangeAppSpec(
exchange="testex",
creds_env_var="TESTEX_CREDENTIALS_FILE",
env_var="TESTEX_TESTNET",
flag_key="testnet",
default_base_url_live="https://api.testex.com",
default_base_url_testnet="https://test.testex.com",
default_port=9999,
build_client=build_client or (lambda creds, env_info: MagicMock(name="client")),
build_app=build_app or (lambda **kwargs: MagicMock(name="app")),
)
def test_run_exchange_main_loads_creds_and_resolves_env(tmp_path, monkeypatch):
creds_file = tmp_path / "creds.json"
creds_file.write_text(json.dumps({"api_key": "k", "api_secret": "s"}))
monkeypatch.setenv("TESTEX_CREDENTIALS_FILE", str(creds_file))
monkeypatch.setenv("PORT", "10000")
monkeypatch.delenv("TESTEX_TESTNET", raising=False)
captured: dict = {}
def build_client(creds, env_info):
captured["creds"] = creds
captured["env_info"] = env_info
return MagicMock()
def build_app(**kwargs):
captured["app_kwargs"] = kwargs
return MagicMock()
spec = _make_spec(build_client=build_client, build_app=build_app)
with patch("mcp_common.app_factory.uvicorn.run") as mock_run:
run_exchange_main(spec)
assert captured["creds"]["api_key"] == "k"
assert captured["creds"]["base_url_live"] == "https://api.testex.com"
assert captured["creds"]["base_url_testnet"] == "https://test.testex.com"
assert isinstance(captured["env_info"], EnvironmentInfo)
assert captured["env_info"].environment == "testnet"
assert captured["env_info"].exchange == "testex"
assert "client" in captured["app_kwargs"]
assert "token_store" in captured["app_kwargs"]
assert "creds" in captured["app_kwargs"]
assert "env_info" in captured["app_kwargs"]
call_kwargs = mock_run.call_args.kwargs
assert call_kwargs["port"] == 10000 # PORT override
def test_run_exchange_main_uses_default_port(tmp_path, monkeypatch):
creds_file = tmp_path / "creds.json"
creds_file.write_text(json.dumps({}))
monkeypatch.setenv("TESTEX_CREDENTIALS_FILE", str(creds_file))
monkeypatch.delenv("PORT", raising=False)
spec = _make_spec()
with patch("mcp_common.app_factory.uvicorn.run") as mock_run:
run_exchange_main(spec)
assert mock_run.call_args.kwargs["port"] == 9999
def test_run_exchange_main_env_var_overrides_creds(tmp_path, monkeypatch):
creds_file = tmp_path / "creds.json"
# Explicit `environment: mainnet` because the env var override → mainnet
# and consistency_check requires confirmation to prevent accidental switches.
creds_file.write_text(json.dumps({"testnet": True, "environment": "mainnet"}))
monkeypatch.setenv("TESTEX_CREDENTIALS_FILE", str(creds_file))
monkeypatch.setenv("TESTEX_TESTNET", "false")
captured: dict = {}
def build_client(creds, env_info):
captured["env_info"] = env_info
return MagicMock()
spec = _make_spec(build_client=build_client)
with patch("mcp_common.app_factory.uvicorn.run"):
run_exchange_main(spec)
# env var "false" overrides creds.testnet=True → mainnet
assert captured["env_info"].environment == "mainnet"
assert captured["env_info"].source == "env"
def test_run_exchange_main_aborts_on_mainnet_without_confirmation(tmp_path, monkeypatch):
"""Mainnet senza creds['environment']='mainnet' → boot abort fail-fast."""
from mcp_common.environment import EnvironmentMismatchError
creds_file = tmp_path / "creds.json"
creds_file.write_text(json.dumps({"testnet": False}))
monkeypatch.setenv("TESTEX_CREDENTIALS_FILE", str(creds_file))
monkeypatch.delenv("TESTEX_TESTNET", raising=False)
monkeypatch.delenv("STRICT_MAINNET", raising=False)
spec = _make_spec()
with (
pytest.raises(EnvironmentMismatchError),
patch("mcp_common.app_factory.uvicorn.run"),
):
run_exchange_main(spec)
def test_run_exchange_main_strict_mainnet_disabled_via_env(tmp_path, monkeypatch):
"""STRICT_MAINNET=false permette mainnet senza conferma (warning soltanto)."""
creds_file = tmp_path / "creds.json"
creds_file.write_text(json.dumps({"testnet": False}))
monkeypatch.setenv("TESTEX_CREDENTIALS_FILE", str(creds_file))
monkeypatch.setenv("STRICT_MAINNET", "false")
spec = _make_spec()
with patch("mcp_common.app_factory.uvicorn.run"):
run_exchange_main(spec) # does not raise
def test_run_exchange_main_missing_creds_file_exits(monkeypatch):
monkeypatch.delenv("TESTEX_CREDENTIALS_FILE", raising=False)
spec = _make_spec()
with pytest.raises(SystemExit) as exc_info:
run_exchange_main(spec)
assert exc_info.value.code == 2
@@ -1,84 +0,0 @@
import pytest
from fastapi import Depends, FastAPI
from fastapi.testclient import TestClient
from mcp_common.auth import (
Principal,
TokenStore,
acl_requires,
require_principal,
)
@pytest.fixture
def token_store():
return TokenStore(tokens={
"token-core-123": Principal(name="core", capabilities={"core"}),
"token-obs-456": Principal(name="observer", capabilities={"observer"}),
})
@pytest.fixture
def app(token_store):
app = FastAPI()
app.state.token_store = token_store
@app.get("/public")
def public():
return {"ok": True}
@app.get("/private")
def private(principal: Principal = Depends(require_principal)):
return {"name": principal.name}
@app.post("/core-only")
@acl_requires(core=True, observer=False)
def core_only(principal: Principal = Depends(require_principal)):
return {"who": principal.name}
@app.post("/observer-only")
@acl_requires(core=False, observer=True)
def observer_only(principal: Principal = Depends(require_principal)):
return {"who": principal.name}
return app
def test_public_endpoint_no_auth(app):
client = TestClient(app)
assert client.get("/public").status_code == 200
def test_private_without_header_401(app):
client = TestClient(app)
assert client.get("/private").status_code == 401
def test_private_bad_token_403(app):
client = TestClient(app)
r = client.get("/private", headers={"Authorization": "Bearer nope"})
assert r.status_code == 403
def test_private_good_token_200(app):
client = TestClient(app)
r = client.get("/private", headers={"Authorization": "Bearer token-core-123"})
assert r.status_code == 200
assert r.json() == {"name": "core"}
def test_acl_core_token_on_core_only_endpoint(app):
client = TestClient(app)
r = client.post("/core-only", headers={"Authorization": "Bearer token-core-123"})
assert r.status_code == 200
def test_acl_observer_on_core_only_rejected(app):
client = TestClient(app)
r = client.post("/core-only", headers={"Authorization": "Bearer token-obs-456"})
assert r.status_code == 403
def test_acl_observer_on_observer_only_ok(app):
client = TestClient(app)
r = client.post("/observer-only", headers={"Authorization": "Bearer token-obs-456"})
assert r.status_code == 200
@@ -1,189 +0,0 @@
from __future__ import annotations
import pytest
from mcp_common.environment import (
EnvironmentInfo,
EnvironmentMismatchError,
consistency_check,
resolve_environment,
)
def test_env_var_overrides_secret(monkeypatch):
monkeypatch.setenv("DERIBIT_TESTNET", "false")
creds = {"testnet": True, "base_url_live": "L", "base_url_testnet": "T"}
info = resolve_environment(
creds,
env_var="DERIBIT_TESTNET",
flag_key="testnet",
exchange="deribit",
)
assert info.environment == "mainnet"
assert info.source == "env"
assert info.env_value == "false"
assert info.base_url == "L"
def test_secret_used_when_env_missing(monkeypatch):
monkeypatch.delenv("DERIBIT_TESTNET", raising=False)
creds = {"testnet": True, "base_url_live": "L", "base_url_testnet": "T"}
info = resolve_environment(
creds,
env_var="DERIBIT_TESTNET",
flag_key="testnet",
exchange="deribit",
)
assert info.environment == "testnet"
assert info.source == "credentials"
assert info.env_value is None
assert info.base_url == "T"
def test_default_when_both_missing(monkeypatch):
monkeypatch.delenv("FOO_TESTNET", raising=False)
creds = {"base_url_live": "L", "base_url_testnet": "T"}
info = resolve_environment(
creds,
env_var="FOO_TESTNET",
flag_key="testnet",
exchange="foo",
)
assert info.environment == "testnet"
assert info.source == "default"
assert info.env_value is None
@pytest.mark.parametrize("raw,expected", [
("1", "testnet"),
("true", "testnet"),
("yes", "testnet"),
("on", "testnet"),
("TRUE", "testnet"),
("0", "mainnet"),
("false", "mainnet"),
("no", "mainnet"),
("off", "mainnet"),
("garbage", "mainnet"),
])
def test_env_value_truthy_parsing(monkeypatch, raw, expected):
monkeypatch.setenv("X_TESTNET", raw)
info = resolve_environment(
{"base_url_live": "L", "base_url_testnet": "T"},
env_var="X_TESTNET",
flag_key="testnet",
exchange="x",
)
assert info.environment == expected
def test_default_base_urls_applied_when_creds_missing(monkeypatch):
monkeypatch.delenv("X_TESTNET", raising=False)
creds: dict = {}
info = resolve_environment(
creds,
env_var="X_TESTNET",
flag_key="testnet",
exchange="x",
default_base_url_live="https://live.example",
default_base_url_testnet="https://test.example",
)
assert info.base_url == "https://test.example"
assert creds["base_url_live"] == "https://live.example"
assert creds["base_url_testnet"] == "https://test.example"
def test_creds_base_urls_override_defaults(monkeypatch):
monkeypatch.delenv("X_TESTNET", raising=False)
creds = {"base_url_live": "L", "base_url_testnet": "T"}
info = resolve_environment(
creds,
env_var="X_TESTNET",
flag_key="testnet",
exchange="x",
default_base_url_live="https://live.example",
default_base_url_testnet="https://test.example",
)
assert info.base_url == "T"
assert creds["base_url_live"] == "L"
def test_alpaca_paper_flag_key(monkeypatch):
"""Alpaca uses 'paper' instead of 'testnet' in the secret."""
monkeypatch.delenv("ALPACA_PAPER", raising=False)
creds = {"paper": False, "base_url_live": "L", "base_url_testnet": "T"}
info = resolve_environment(
creds,
env_var="ALPACA_PAPER",
flag_key="paper",
exchange="alpaca",
)
assert info.environment == "mainnet"
assert info.source == "credentials"
# ───────── consistency_check ─────────
def _info(env: str, exchange: str = "deribit") -> EnvironmentInfo:
"""Helper that builds an EnvironmentInfo for tests."""
return EnvironmentInfo(
exchange=exchange,
environment=env,
source="env",
env_value="false" if env == "mainnet" else "true",
base_url=f"https://api.{exchange}.com" if env == "mainnet" else f"https://test.{exchange}.com",
)
def test_consistency_check_testnet_no_confirmation_ok():
"""Testnet without explicit confirmation → ok, returns []. Safe default."""
info = _info("testnet")
creds = {"api_key": "k", "api_secret": "s"}
warnings = consistency_check(info, creds)
assert warnings == []
def test_consistency_check_mainnet_no_confirmation_raises():
"""Mainnet without an explicit creds['environment']='mainnet' → fail-fast."""
info = _info("mainnet")
creds = {"api_key": "k", "api_secret": "s"}
with pytest.raises(EnvironmentMismatchError, match="mainnet.*explicit confirmation"):
consistency_check(info, creds)
def test_consistency_check_mainnet_with_confirmation_ok():
info = _info("mainnet")
creds = {"api_key": "k", "api_secret": "s", "environment": "mainnet"}
warnings = consistency_check(info, creds)
assert warnings == []
def test_consistency_check_explicit_mismatch_raises():
"""Secret declares mainnet but the resolver resolves testnet → fail-fast."""
info = _info("testnet")
creds = {"environment": "mainnet"}
with pytest.raises(EnvironmentMismatchError, match="declared.*resolved"):
consistency_check(info, creds)
def test_consistency_check_strict_mainnet_disabled():
"""With strict_mainnet=False, mainnet without confirmation logs a warning but does not raise."""
info = _info("mainnet")
creds = {"api_key": "k", "api_secret": "s"}
warnings = consistency_check(info, creds, strict_mainnet=False)
assert any("mainnet" in w for w in warnings)
def test_consistency_check_url_does_not_match_environment_warns():
"""Base URL contains 'test' but environment='mainnet' → warning."""
from mcp_common.environment import EnvironmentInfo
info = EnvironmentInfo(
exchange="bybit",
environment="mainnet",
source="env",
env_value="false",
base_url="https://api-testnet.bybit.com",  # URL says testnet but resolver says MAINNET
)
creds = {"environment": "mainnet"}
warnings = consistency_check(info, creds)
assert any("base_url" in w.lower() for w in warnings)
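Taken together, the tests above define the `consistency_check` contract: explicit declaration must agree with the resolved environment, mainnet needs explicit confirmation (fail-fast unless `strict_mainnet=False`), and a testnet-looking URL on mainnet only warns. A condensed re-implementation of that contract, for illustration only (the real function lives in `mcp_common.environment` and takes an `EnvironmentInfo`):

```python
# Hypothetical sketch of the behavior the tests above pin down; not the
# real mcp_common.environment.consistency_check.
class EnvironmentMismatchError(Exception):
    pass

def consistency_check_sketch(environment: str, base_url: str,
                             creds: dict, strict_mainnet: bool = True) -> list[str]:
    warnings: list[str] = []
    declared = creds.get("environment")
    # An explicit declaration in the secret must agree with the resolved env.
    if declared is not None and declared != environment:
        raise EnvironmentMismatchError(
            f"declared {declared!r} but resolved {environment!r}")
    # Mainnet requires explicit confirmation; fail-fast by default.
    if environment == "mainnet" and declared != "mainnet":
        msg = "mainnet requires explicit confirmation in credentials"
        if strict_mainnet:
            raise EnvironmentMismatchError(msg)
        warnings.append(msg)
    # URL/environment sanity: a 'test' URL while on mainnet is suspicious.
    if environment == "mainnet" and "test" in base_url:
        warnings.append(f"base_url {base_url!r} looks like a testnet URL")
    return warnings
```

Testnet with no confirmation returns `[]` (the safe default), while mainnet without `creds["environment"] == "mainnet"` raises unless strict mode is disabled.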
@@ -1,90 +0,0 @@
from fastapi.testclient import TestClient
from mcp_common.auth import Principal, TokenStore
from mcp_common.server import build_app
def test_build_app_health():
store = TokenStore(tokens={})
app = build_app(name="test-mcp", version="0.0.1", token_store=store)
client = TestClient(app)
r = client.get("/health")
assert r.status_code == 200
body = r.json()
assert body["status"] == "healthy"
assert body["name"] == "test-mcp"
assert body["version"] == "0.0.1"
assert "uptime_seconds" in body
assert "data_timestamp" in body
assert r.headers.get("X-Duration-Ms") is not None
def test_build_app_adds_token_store():
store = TokenStore(tokens={"t1": Principal("x", {"core"})})
app = build_app(name="t", version="v", token_store=store)
assert app.state.token_store is store
def test_timestamp_injector_dict_response():
"""CER-P5-001: dict response gets data_timestamp + X-Data-Timestamp header."""
store = TokenStore(tokens={})
app = build_app(name="t", version="v", token_store=store)
@app.post("/tools/foo")
def foo():
return {"ok": True}
client = TestClient(app)
r = client.post("/tools/foo")
assert r.status_code == 200
body = r.json()
assert body["ok"] is True
assert "data_timestamp" in body
assert r.headers.get("X-Data-Timestamp") is not None
def test_timestamp_injector_list_of_dicts():
"""CER-P5-001: list of dicts → each item gets data_timestamp."""
store = TokenStore(tokens={})
app = build_app(name="t", version="v", token_store=store)
@app.post("/tools/list_items")
def list_items():
return [{"x": 1}, {"x": 2}]
client = TestClient(app)
r = client.post("/tools/list_items")
body = r.json()
assert isinstance(body, list)
assert len(body) == 2
for item in body:
assert "data_timestamp" in item
assert r.headers.get("X-Data-Timestamp") is not None
def test_timestamp_injector_preserves_existing():
"""CER-P5-001: if already present, do not override."""
store = TokenStore(tokens={})
app = build_app(name="t", version="v", token_store=store)
@app.post("/tools/already")
def already():
return {"data_timestamp": "2020-01-01T00:00:00Z", "x": 1}
client = TestClient(app)
body = client.post("/tools/already").json()
assert body["data_timestamp"] == "2020-01-01T00:00:00Z"
def test_timestamp_injector_empty_list_gets_header_only():
"""CER-P5-001: empty list: body unchanged, but header still present."""
store = TokenStore(tokens={})
app = build_app(name="t", version="v", token_store=store)
@app.post("/tools/empty_list")
def empty_list():
return []
client = TestClient(app)
r = client.post("/tools/empty_list")
assert r.json() == []
assert r.headers.get("X-Data-Timestamp") is not None
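The four injector tests describe a simple body-level rule: add `data_timestamp` to a dict (or to each dict in a list) only when it is missing, and leave an empty list untouched. A framework-free sketch of that rule, assuming a hypothetical `inject_data_timestamp` helper (the real logic runs as middleware inside `build_app` and additionally sets the `X-Data-Timestamp` header):

```python
# Hypothetical body-level sketch of the CER-P5-001 injection rule; the real
# version is middleware in mcp_common.server.build_app.
import datetime as dt

def inject_data_timestamp(body, now=None):
    ts = now or dt.datetime.now(dt.timezone.utc).isoformat()
    if isinstance(body, dict):
        # setdefault never overrides an existing data_timestamp.
        body.setdefault("data_timestamp", ts)
    elif isinstance(body, list):
        for item in body:
            if isinstance(item, dict):
                item.setdefault("data_timestamp", ts)
    return body
```

An empty list passes through unchanged, which is why the corresponding test only asserts on the header.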
@@ -1,29 +0,0 @@
[project]
name = "mcp-alpaca"
version = "0.1.0"
requires-python = ">=3.11"
dependencies = [
"mcp-common",
"fastapi>=0.115",
"uvicorn[standard]>=0.30",
"httpx>=0.27",
"pydantic>=2.6",
"alpaca-py>=0.32",
"pytz>=2024.1",
]
[project.optional-dependencies]
dev = ["pytest>=8", "pytest-asyncio>=0.23"]
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
[tool.hatch.build.targets.wheel]
packages = ["src/mcp_alpaca"]
[tool.uv.sources]
mcp-common = { workspace = true }
[project.scripts]
mcp-alpaca = "mcp_alpaca.__main__:main"
@@ -1,30 +0,0 @@
from __future__ import annotations
from mcp_common.app_factory import ExchangeAppSpec, run_exchange_main
from mcp_alpaca.client import AlpacaClient
from mcp_alpaca.server import create_app
SPEC = ExchangeAppSpec(
exchange="alpaca",
creds_env_var="ALPACA_CREDENTIALS_FILE",
env_var="ALPACA_PAPER",
flag_key="paper",
default_base_url_live="https://api.alpaca.markets",
default_base_url_testnet="https://paper-api.alpaca.markets",
default_port=9020,
build_client=lambda creds, env_info: AlpacaClient(
api_key=creds["api_key_id"],
secret_key=creds["secret_key"],
paper=(env_info.environment == "testnet"),
),
build_app=create_app,
)
def main():
run_exchange_main(SPEC)
if __name__ == "__main__":
main()
@@ -1,385 +0,0 @@
from __future__ import annotations
import asyncio
import datetime as _dt
from typing import Any
from alpaca.data.historical import (
CryptoHistoricalDataClient,
OptionHistoricalDataClient,
StockHistoricalDataClient,
)
from alpaca.data.requests import (
CryptoBarsRequest,
CryptoLatestQuoteRequest,
CryptoLatestTradeRequest,
OptionBarsRequest,
OptionChainRequest,
OptionLatestQuoteRequest,
StockBarsRequest,
StockLatestQuoteRequest,
StockLatestTradeRequest,
StockSnapshotRequest,
)
from alpaca.data.timeframe import TimeFrame, TimeFrameUnit
from alpaca.trading.client import TradingClient
from alpaca.trading.enums import (
AssetClass,
OrderSide,
QueryOrderStatus,
TimeInForce,
)
from alpaca.trading.requests import (
ClosePositionRequest,
GetAssetsRequest,
GetOrdersRequest,
LimitOrderRequest,
MarketOrderRequest,
ReplaceOrderRequest,
StopOrderRequest,
)
_TF_MAP = {
"1min": TimeFrame(1, TimeFrameUnit.Minute),
"5min": TimeFrame(5, TimeFrameUnit.Minute),
"15min": TimeFrame(15, TimeFrameUnit.Minute),
"30min": TimeFrame(30, TimeFrameUnit.Minute),
"1h": TimeFrame(1, TimeFrameUnit.Hour),
"1d": TimeFrame(1, TimeFrameUnit.Day),
"1w": TimeFrame(1, TimeFrameUnit.Week),
}
_ASSET_CLASSES = {"stocks", "crypto", "options"}
def _tf(interval: str) -> TimeFrame:
if interval in _TF_MAP:
return _TF_MAP[interval]
raise ValueError(f"unsupported timeframe: {interval}")
def _asset_class_enum(ac: str) -> AssetClass:
ac = ac.lower()
if ac == "stocks":
return AssetClass.US_EQUITY
if ac == "crypto":
return AssetClass.CRYPTO
if ac == "options":
return AssetClass.US_OPTION
raise ValueError(f"invalid asset_class: {ac}")
def _serialize(obj: Any) -> Any:
"""Recursively convert pydantic/datetime objects → json-safe."""
if obj is None or isinstance(obj, str | int | float | bool):
return obj
if isinstance(obj, _dt.datetime | _dt.date):
return obj.isoformat()
if isinstance(obj, dict):
return {k: _serialize(v) for k, v in obj.items()}
if isinstance(obj, list | tuple):
return [_serialize(v) for v in obj]
if hasattr(obj, "model_dump"):
return _serialize(obj.model_dump())
if hasattr(obj, "__dict__"):
return _serialize(vars(obj))
return str(obj)
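To see `_serialize` in action, here is a standalone copy of the same recursion (duplicated only so the example is self-contained) applied to an SDK-style object: the `__dict__` fallback flattens the object, then datetimes collapse to ISO strings. `FakeOrder` is purely illustrative.

```python
# Self-contained copy of the _serialize recursion above, for illustration.
import datetime as dt

def serialize(obj):
    if obj is None or isinstance(obj, (str, int, float, bool)):
        return obj
    if isinstance(obj, (dt.datetime, dt.date)):
        return obj.isoformat()
    if isinstance(obj, dict):
        return {k: serialize(v) for k, v in obj.items()}
    if isinstance(obj, (list, tuple)):
        return [serialize(v) for v in obj]
    if hasattr(obj, "model_dump"):        # pydantic models
        return serialize(obj.model_dump())
    if hasattr(obj, "__dict__"):          # plain SDK objects
        return serialize(vars(obj))
    return str(obj)

class FakeOrder:
    # Stand-in for an alpaca-py response object without model_dump.
    def __init__(self):
        self.id = "o1"
        self.filled_at = dt.datetime(2026, 1, 2, tzinfo=dt.timezone.utc)
```

`serialize(FakeOrder())` produces `{"id": "o1", "filled_at": "2026-01-02T00:00:00+00:00"}`, i.e. a JSON-safe dict.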
class AlpacaClient:
def __init__(
self,
api_key: str,
secret_key: str,
paper: bool = True,
trading: Any | None = None,
stock_data: Any | None = None,
crypto_data: Any | None = None,
option_data: Any | None = None,
) -> None:
self.api_key = api_key
self.secret_key = secret_key
self.paper = paper
self._trading = trading or TradingClient(
api_key=api_key, secret_key=secret_key, paper=paper
)
self._stock = stock_data or StockHistoricalDataClient(
api_key=api_key, secret_key=secret_key
)
self._crypto = crypto_data or CryptoHistoricalDataClient(
api_key=api_key, secret_key=secret_key
)
self._option = option_data or OptionHistoricalDataClient(
api_key=api_key, secret_key=secret_key
)
async def _run(self, fn, /, *args, **kwargs):
return await asyncio.to_thread(fn, *args, **kwargs)
# ── Account / positions ──────────────────────────────────────
async def get_account(self) -> dict:
acc = await self._run(self._trading.get_account)
return _serialize(acc)
async def get_positions(self) -> list[dict]:
pos = await self._run(self._trading.get_all_positions)
return [_serialize(p) for p in pos]
async def get_activities(self, limit: int = 50) -> list[dict]:
acts = await self._run(self._trading.get_account_activities)
data = [_serialize(a) for a in acts]
return data[:limit]
# ── Assets ──────────────────────────────────────────────────
async def get_assets(
self, asset_class: str = "stocks", status: str = "active"
) -> list[dict]:
req = GetAssetsRequest(
asset_class=_asset_class_enum(asset_class),
status=status,
)
assets = await self._run(self._trading.get_all_assets, req)
return [_serialize(a) for a in assets[:500]]
# ── Market data ─────────────────────────────────────────────
async def get_ticker(self, symbol: str, asset_class: str = "stocks") -> dict:
ac = asset_class.lower()
if ac == "stocks":
req = StockLatestTradeRequest(symbol_or_symbols=symbol)
data = await self._run(self._stock.get_stock_latest_trade, req)
trade = data.get(symbol)
q_req = StockLatestQuoteRequest(symbol_or_symbols=symbol)
qdata = await self._run(self._stock.get_stock_latest_quote, q_req)
quote = qdata.get(symbol)
return {
"symbol": symbol,
"asset_class": "stocks",
"last_price": getattr(trade, "price", None),
"bid": getattr(quote, "bid_price", None),
"ask": getattr(quote, "ask_price", None),
"bid_size": getattr(quote, "bid_size", None),
"ask_size": getattr(quote, "ask_size", None),
"timestamp": _serialize(getattr(trade, "timestamp", None)),
}
if ac == "crypto":
req = CryptoLatestTradeRequest(symbol_or_symbols=symbol)
data = await self._run(self._crypto.get_crypto_latest_trade, req)
trade = data.get(symbol)
q_req = CryptoLatestQuoteRequest(symbol_or_symbols=symbol)
qdata = await self._run(self._crypto.get_crypto_latest_quote, q_req)
quote = qdata.get(symbol)
return {
"symbol": symbol,
"asset_class": "crypto",
"last_price": getattr(trade, "price", None),
"bid": getattr(quote, "bid_price", None),
"ask": getattr(quote, "ask_price", None),
"timestamp": _serialize(getattr(trade, "timestamp", None)),
}
if ac == "options":
req = OptionLatestQuoteRequest(symbol_or_symbols=symbol)
data = await self._run(self._option.get_option_latest_quote, req)
quote = data.get(symbol)
return {
"symbol": symbol,
"asset_class": "options",
"bid": getattr(quote, "bid_price", None),
"ask": getattr(quote, "ask_price", None),
"timestamp": _serialize(getattr(quote, "timestamp", None)),
}
raise ValueError(f"invalid asset_class: {asset_class}")
async def get_bars(
self,
symbol: str,
asset_class: str = "stocks",
interval: str = "1d",
start: str | None = None,
end: str | None = None,
limit: int = 1000,
) -> dict:
tf = _tf(interval)
start_dt = _dt.datetime.fromisoformat(start) if start else (
_dt.datetime.now(_dt.UTC) - _dt.timedelta(days=30)
)
end_dt = _dt.datetime.fromisoformat(end) if end else _dt.datetime.now(_dt.UTC)
ac = asset_class.lower()
if ac == "stocks":
req = StockBarsRequest(
symbol_or_symbols=symbol, timeframe=tf,
start=start_dt, end=end_dt, limit=limit,
)
data = await self._run(self._stock.get_stock_bars, req)
elif ac == "crypto":
req = CryptoBarsRequest(
symbol_or_symbols=symbol, timeframe=tf,
start=start_dt, end=end_dt, limit=limit,
)
data = await self._run(self._crypto.get_crypto_bars, req)
elif ac == "options":
req = OptionBarsRequest(
symbol_or_symbols=symbol, timeframe=tf,
start=start_dt, end=end_dt, limit=limit,
)
data = await self._run(self._option.get_option_bars, req)
else:
raise ValueError(f"invalid asset_class: {asset_class}")
bars_dict = getattr(data, "data", {}) or {}
rows = bars_dict.get(symbol, []) or []
bars = [
{
"timestamp": _serialize(getattr(b, "timestamp", None)),
"open": getattr(b, "open", None),
"high": getattr(b, "high", None),
"low": getattr(b, "low", None),
"close": getattr(b, "close", None),
"volume": getattr(b, "volume", None),
}
for b in rows
]
return {"symbol": symbol, "asset_class": ac, "interval": interval, "bars": bars}
async def get_snapshot(self, symbol: str) -> dict:
req = StockSnapshotRequest(symbol_or_symbols=symbol)
data = await self._run(self._stock.get_stock_snapshot, req)
return _serialize(data.get(symbol))
async def get_option_chain(
self,
underlying: str,
expiry: str | None = None,
) -> dict:
kwargs: dict[str, Any] = {"underlying_symbol": underlying}
if expiry:
kwargs["expiration_date"] = _dt.date.fromisoformat(expiry)
req = OptionChainRequest(**kwargs)
data = await self._run(self._option.get_option_chain, req)
return {
"underlying": underlying,
"expiry": expiry,
"contracts": _serialize(data),
}
# ── Orders ──────────────────────────────────────────────────
async def get_open_orders(self, limit: int = 50) -> list[dict]:
req = GetOrdersRequest(status=QueryOrderStatus.OPEN, limit=limit)
orders = await self._run(self._trading.get_orders, filter=req)
return [_serialize(o) for o in orders]
async def place_order(
self,
symbol: str,
side: str,
qty: float | None = None,
notional: float | None = None,
order_type: str = "market",
limit_price: float | None = None,
stop_price: float | None = None,
tif: str = "day",
asset_class: str = "stocks",
) -> dict:
side_enum = OrderSide.BUY if side.lower() == "buy" else OrderSide.SELL
tif_enum = TimeInForce(tif.lower())
ot = order_type.lower()
common = {
"symbol": symbol,
"side": side_enum,
"time_in_force": tif_enum,
}
if qty is not None:
common["qty"] = qty
if notional is not None:
common["notional"] = notional
if ot == "market":
req = MarketOrderRequest(**common)
elif ot == "limit":
if limit_price is None:
raise ValueError("limit_price required for limit order")
req = LimitOrderRequest(**common, limit_price=limit_price)
elif ot == "stop":
if stop_price is None:
raise ValueError("stop_price required for stop order")
req = StopOrderRequest(**common, stop_price=stop_price)
else:
raise ValueError(f"unsupported order_type: {order_type}")
order = await self._run(self._trading.submit_order, req)
return _serialize(order)
async def amend_order(
self,
order_id: str,
qty: float | None = None,
limit_price: float | None = None,
stop_price: float | None = None,
tif: str | None = None,
) -> dict:
kwargs: dict[str, Any] = {}
if qty is not None:
kwargs["qty"] = qty
if limit_price is not None:
kwargs["limit_price"] = limit_price
if stop_price is not None:
kwargs["stop_price"] = stop_price
if tif is not None:
kwargs["time_in_force"] = TimeInForce(tif.lower())
req = ReplaceOrderRequest(**kwargs)
order = await self._run(self._trading.replace_order_by_id, order_id, req)
return _serialize(order)
async def cancel_order(self, order_id: str) -> dict:
await self._run(self._trading.cancel_order_by_id, order_id)
return {"order_id": order_id, "canceled": True}
async def cancel_all_orders(self) -> list[dict]:
resp = await self._run(self._trading.cancel_orders)
return [_serialize(r) for r in resp]
# ── Position close ──────────────────────────────────────────
async def close_position(
self, symbol: str, qty: float | None = None, percentage: float | None = None
) -> dict:
req = None
if qty is not None or percentage is not None:
kwargs: dict[str, Any] = {}
if qty is not None:
kwargs["qty"] = str(qty)
if percentage is not None:
kwargs["percentage"] = str(percentage)
req = ClosePositionRequest(**kwargs)
order = await self._run(
self._trading.close_position, symbol, close_options=req
)
return _serialize(order)
async def close_all_positions(self, cancel_orders: bool = True) -> list[dict]:
resp = await self._run(
self._trading.close_all_positions, cancel_orders=cancel_orders
)
return [_serialize(r) for r in resp]
# ── Clock / calendar ────────────────────────────────────────
async def get_clock(self) -> dict:
clock = await self._run(self._trading.get_clock)
return _serialize(clock)
async def get_calendar(
self, start: str | None = None, end: str | None = None
) -> list[dict]:
from alpaca.trading.requests import GetCalendarRequest
kwargs: dict[str, Any] = {}
if start:
kwargs["start"] = _dt.date.fromisoformat(start)
if end:
kwargs["end"] = _dt.date.fromisoformat(end)
req = GetCalendarRequest(**kwargs) if kwargs else None
cal = await self._run(
self._trading.get_calendar, filters=req
) if req else await self._run(self._trading.get_calendar)
return [_serialize(c) for c in cal]
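Every public method of `AlpacaClient` funnels through `_run`, which offloads the blocking alpaca-py SDK call to a worker thread via `asyncio.to_thread` so the event loop stays free. A minimal self-contained sketch of that bridge, with a stub standing in for `TradingClient` (stub class and values are illustrative only):

```python
# Sketch of the sync→async bridge used by AlpacaClient._run: the blocking
# SDK call runs in a thread, the caller awaits it.
import asyncio

class StubTrading:
    def get_account(self):  # blocking in the real alpaca-py SDK
        return {"equity": 100000}

async def run_in_thread(fn, *args, **kwargs):
    return await asyncio.to_thread(fn, *args, **kwargs)

async def main():
    trading = StubTrading()
    return await run_in_thread(trading.get_account)
```

This is also why the constructor accepts pre-built `trading`/`stock_data`/`crypto_data`/`option_data` clients: the tests further down inject `MagicMock` instances the same way.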
@@ -1,321 +0,0 @@
from __future__ import annotations
import os
from fastapi import Depends, HTTPException
from mcp_common.audit import audit_write_op
from mcp_common.auth import Principal, TokenStore, require_principal
from mcp_common.environment import EnvironmentInfo
from mcp_common.mcp_bridge import mount_mcp_endpoint
from mcp_common.server import build_app
from pydantic import BaseModel
from mcp_alpaca.client import AlpacaClient
from mcp_alpaca.leverage_cap import get_max_leverage
# --- Body models: reads ---
class AccountReq(BaseModel):
pass
class PositionsReq(BaseModel):
pass
class ActivitiesReq(BaseModel):
limit: int = 50
class AssetsReq(BaseModel):
asset_class: str = "stocks"
status: str = "active"
class TickerReq(BaseModel):
symbol: str
asset_class: str = "stocks"
class BarsReq(BaseModel):
symbol: str
asset_class: str = "stocks"
interval: str = "1d"
start: str | None = None
end: str | None = None
limit: int = 1000
class SnapshotReq(BaseModel):
symbol: str
class OptionChainReq(BaseModel):
underlying: str
expiry: str | None = None
class OpenOrdersReq(BaseModel):
limit: int = 50
class ClockReq(BaseModel):
pass
class CalendarReq(BaseModel):
start: str | None = None
end: str | None = None
# --- Body models: writes ---
class PlaceOrderReq(BaseModel):
symbol: str
side: str
qty: float | None = None
notional: float | None = None
order_type: str = "market"
limit_price: float | None = None
stop_price: float | None = None
tif: str = "day"
asset_class: str = "stocks"
class AmendOrderReq(BaseModel):
order_id: str
qty: float | None = None
limit_price: float | None = None
stop_price: float | None = None
tif: str | None = None
class CancelOrderReq(BaseModel):
order_id: str
class CancelAllReq(BaseModel):
pass
class ClosePositionReq(BaseModel):
symbol: str
qty: float | None = None
percentage: float | None = None
class CloseAllPositionsReq(BaseModel):
cancel_orders: bool = True
# --- ACL helper ---
def _check(principal: Principal, *, core: bool = False, observer: bool = False) -> None:
allowed: set[str] = set()
if core:
allowed.add("core")
if observer:
allowed.add("observer")
if not (principal.capabilities & allowed):
raise HTTPException(status_code=403, detail="forbidden")
def create_app(
*,
client: AlpacaClient,
token_store: TokenStore,
creds: dict | None = None,
env_info: EnvironmentInfo | None = None,
):
creds = creds or {}
app = build_app(name="mcp-alpaca", version="0.1.0", token_store=token_store)
# ── Reads ──────────────────────────────────────────────
@app.post("/tools/environment_info", tags=["reads"])
async def t_environment_info(principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
if env_info is None:
return {
"exchange": "alpaca",
"environment": "testnet" if getattr(client, "paper", True) else "mainnet",
"source": "credentials",
"env_value": None,
"base_url": getattr(client, "base_url", None),
"max_leverage": get_max_leverage(creds),
}
return {
"exchange": env_info.exchange,
"environment": env_info.environment,
"source": env_info.source,
"env_value": env_info.env_value,
"base_url": env_info.base_url,
"max_leverage": get_max_leverage(creds),
}
@app.post("/tools/get_account", tags=["reads"])
async def t_get_account(body: AccountReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_account()
@app.post("/tools/get_positions", tags=["reads"])
async def t_get_positions(body: PositionsReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return {"positions": await client.get_positions()}
@app.post("/tools/get_activities", tags=["reads"])
async def t_get_activities(body: ActivitiesReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return {"activities": await client.get_activities(body.limit)}
@app.post("/tools/get_assets", tags=["reads"])
async def t_get_assets(body: AssetsReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return {"assets": await client.get_assets(body.asset_class, body.status)}
@app.post("/tools/get_ticker", tags=["reads"])
async def t_get_ticker(body: TickerReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_ticker(body.symbol, body.asset_class)
@app.post("/tools/get_bars", tags=["reads"])
async def t_get_bars(body: BarsReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_bars(
body.symbol, body.asset_class, body.interval, body.start, body.end, body.limit,
)
@app.post("/tools/get_snapshot", tags=["reads"])
async def t_get_snapshot(body: SnapshotReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_snapshot(body.symbol)
@app.post("/tools/get_option_chain", tags=["reads"])
async def t_get_option_chain(body: OptionChainReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_option_chain(body.underlying, body.expiry)
@app.post("/tools/get_open_orders", tags=["reads"])
async def t_get_open_orders(body: OpenOrdersReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return {"orders": await client.get_open_orders(body.limit)}
@app.post("/tools/get_clock", tags=["reads"])
async def t_get_clock(body: ClockReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_clock()
@app.post("/tools/get_calendar", tags=["reads"])
async def t_get_calendar(body: CalendarReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return {"calendar": await client.get_calendar(body.start, body.end)}
# ── Writes ─────────────────────────────────────────────
@app.post("/tools/place_order", tags=["writes"])
async def t_place_order(body: PlaceOrderReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = await client.place_order(
body.symbol, body.side, body.qty, body.notional,
body.order_type, body.limit_price, body.stop_price, body.tif, body.asset_class,
)
audit_write_op(
principal=principal, action="place_order", exchange="alpaca",
target=body.symbol,
payload={"side": body.side, "qty": body.qty, "notional": body.notional,
"order_type": body.order_type, "limit_price": body.limit_price,
"stop_price": body.stop_price, "tif": body.tif,
"asset_class": body.asset_class},
result=result,
)
return result
@app.post("/tools/amend_order", tags=["writes"])
async def t_amend_order(body: AmendOrderReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = await client.amend_order(
body.order_id, body.qty, body.limit_price, body.stop_price, body.tif,
)
audit_write_op(
principal=principal, action="amend_order", exchange="alpaca",
target=body.order_id,
payload={"qty": body.qty, "limit_price": body.limit_price,
"stop_price": body.stop_price, "tif": body.tif},
result=result,
)
return result
@app.post("/tools/cancel_order", tags=["writes"])
async def t_cancel_order(body: CancelOrderReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = await client.cancel_order(body.order_id)
audit_write_op(
principal=principal, action="cancel_order", exchange="alpaca",
target=body.order_id, payload={}, result=result,
)
return result
@app.post("/tools/cancel_all_orders", tags=["writes"])
async def t_cancel_all(body: CancelAllReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = {"canceled": await client.cancel_all_orders()}
audit_write_op(
principal=principal, action="cancel_all_orders", exchange="alpaca",
payload={}, result=result,
)
return result
@app.post("/tools/close_position", tags=["writes"])
async def t_close(body: ClosePositionReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = await client.close_position(body.symbol, body.qty, body.percentage)
audit_write_op(
principal=principal, action="close_position", exchange="alpaca",
target=body.symbol,
payload={"qty": body.qty, "percentage": body.percentage},
result=result,
)
return result
@app.post("/tools/close_all_positions", tags=["writes"])
async def t_close_all(body: CloseAllPositionsReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = {"closed": await client.close_all_positions(body.cancel_orders)}
audit_write_op(
principal=principal, action="close_all_positions", exchange="alpaca",
payload={"cancel_orders": body.cancel_orders}, result=result,
)
return result
# ── MCP mount ──────────────────────────────────────────
port = int(os.environ.get("PORT", "9020"))
mount_mcp_endpoint(
app,
name="cerbero-alpaca",
version="0.1.0",
token_store=token_store,
internal_base_url=f"http://localhost:{port}",
tools=[
{"name": "environment_info", "description": "Operating environment (paper/live), source, base_url, max_leverage cap."},
{"name": "get_account", "description": "Alpaca account summary (equity, cash, buying_power)."},
{"name": "get_positions", "description": "Open positions (stocks/crypto/options)."},
{"name": "get_activities", "description": "Activity log (fills, dividends, transfers)."},
{"name": "get_assets", "description": "Asset universe by asset_class."},
{"name": "get_ticker", "description": "Last trade + quote for a symbol (stocks/crypto/options)."},
{"name": "get_bars", "description": "OHLCV candles (stocks/crypto/options)."},
{"name": "get_snapshot", "description": "Full stock snapshot (last trade + quote + bar)."},
{"name": "get_option_chain", "description": "Option chain for an underlying."},
{"name": "get_open_orders", "description": "Pending orders."},
{"name": "get_clock", "description": "Market clock (open/close, next_open)."},
{"name": "get_calendar", "description": "Trading session calendar."},
{"name": "place_order", "description": "Submit an order (CORE only)."},
{"name": "amend_order", "description": "Replace an existing order."},
{"name": "cancel_order", "description": "Cancel an order."},
{"name": "cancel_all_orders", "description": "Cancel all open orders."},
{"name": "close_position", "description": "Close a position (full or partial)."},
{"name": "close_all_positions", "description": "Liquidate the entire portfolio."},
],
)
return app
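Every tool endpoint in `create_app` follows the same two-step gate: `require_principal` resolves the Bearer token to a `Principal` (401 on failure), then `_check` requires a non-empty capability overlap (403 on failure). A condensed self-contained sketch of that flow, with names mirroring `mcp_common.auth` but not the real implementation:

```python
# Hypothetical condensed sketch of the 401/403 auth flow behind the
# /tools/* endpoints; FastAPI dependency machinery is elided.
class AuthError(Exception):
    def __init__(self, status):
        self.status = status

def require_principal_sketch(tokens, auth_header):
    # 401: missing/malformed header or unknown token.
    if not auth_header or not auth_header.startswith("Bearer "):
        raise AuthError(401)
    principal = tokens.get(auth_header.removeprefix("Bearer "))
    if principal is None:
        raise AuthError(401)
    return principal

def check_sketch(capabilities, allowed):
    # 403: authenticated but no overlapping capability.
    if not (capabilities & allowed):
        raise AuthError(403)
```

Reads pass `allowed={"core", "observer"}`, writes pass `allowed={"core"}`, which is why an observer token can call `get_positions` but not `place_order`.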
@@ -1,39 +0,0 @@
from __future__ import annotations
from unittest.mock import MagicMock
import pytest
from mcp_alpaca.client import AlpacaClient
@pytest.fixture
def mock_trading():
return MagicMock(name="alpaca_TradingClient")
@pytest.fixture
def mock_stock():
return MagicMock(name="alpaca_StockHistoricalDataClient")
@pytest.fixture
def mock_crypto():
return MagicMock(name="alpaca_CryptoHistoricalDataClient")
@pytest.fixture
def mock_option():
return MagicMock(name="alpaca_OptionHistoricalDataClient")
@pytest.fixture
def client(mock_trading, mock_stock, mock_crypto, mock_option):
return AlpacaClient(
api_key="test_key",
secret_key="test_secret",
paper=True,
trading=mock_trading,
stock_data=mock_stock,
crypto_data=mock_crypto,
option_data=mock_option,
)
@@ -1,80 +0,0 @@
from __future__ import annotations
from unittest.mock import MagicMock
import pytest
@pytest.mark.asyncio
async def test_init_paper_mode(client, mock_trading):
assert client.paper is True
assert client._trading is mock_trading
@pytest.mark.asyncio
async def test_get_account_calls_trading(client, mock_trading):
mock_trading.get_account.return_value = MagicMock(
model_dump=lambda: {"equity": 100000, "cash": 50000}
)
result = await client.get_account()
mock_trading.get_account.assert_called_once()
assert result["equity"] == 100000
@pytest.mark.asyncio
async def test_get_positions_returns_list(client, mock_trading):
pos_mock = MagicMock(model_dump=lambda: {"symbol": "AAPL", "qty": 10})
mock_trading.get_all_positions.return_value = [pos_mock]
result = await client.get_positions()
assert len(result) == 1
assert result[0]["symbol"] == "AAPL"
@pytest.mark.asyncio
async def test_place_market_order_stocks(client, mock_trading):
order_mock = MagicMock(model_dump=lambda: {"id": "o123", "symbol": "AAPL"})
mock_trading.submit_order.return_value = order_mock
result = await client.place_order(
symbol="AAPL", side="buy", qty=1, order_type="market", asset_class="stocks",
)
assert result["id"] == "o123"
assert mock_trading.submit_order.called
@pytest.mark.asyncio
async def test_place_limit_order_requires_price(client):
with pytest.raises(ValueError, match="limit_price"):
await client.place_order(
symbol="AAPL", side="buy", qty=1, order_type="limit",
)
@pytest.mark.asyncio
async def test_cancel_order(client, mock_trading):
mock_trading.cancel_order_by_id.return_value = None
result = await client.cancel_order("o1")
mock_trading.cancel_order_by_id.assert_called_once_with("o1")
assert result == {"order_id": "o1", "canceled": True}
@pytest.mark.asyncio
async def test_close_position_no_options(client, mock_trading):
order_mock = MagicMock(model_dump=lambda: {"id": "close-1"})
mock_trading.close_position.return_value = order_mock
result = await client.close_position("AAPL")
assert mock_trading.close_position.called
assert result["id"] == "close-1"
@pytest.mark.asyncio
async def test_get_clock(client, mock_trading):
clock_mock = MagicMock(model_dump=lambda: {"is_open": True, "next_close": "2026-04-21T20:00:00Z"})
mock_trading.get_clock.return_value = clock_mock
result = await client.get_clock()
assert result["is_open"] is True
@pytest.mark.asyncio
async def test_invalid_asset_class(client):
with pytest.raises(ValueError, match="invalid asset_class"):
await client.get_ticker("AAPL", asset_class="forex")
@@ -1,50 +0,0 @@
from __future__ import annotations
from unittest.mock import MagicMock
from fastapi.testclient import TestClient
from mcp_alpaca.server import create_app
from mcp_common.auth import Principal, TokenStore
from mcp_common.environment import EnvironmentInfo
def _make_app(env_info, creds):
c = MagicMock()
c.paper = True
store = TokenStore(tokens={
"ct": Principal("core", {"core"}),
"ot": Principal("observer", {"observer"}),
})
return create_app(client=c, token_store=store, creds=creds, env_info=env_info)
def test_environment_info_paper_is_testnet():
"""Alpaca: 'paper' nel secret mappa a environment='testnet'."""
env = EnvironmentInfo(
exchange="alpaca",
environment="testnet",
source="env",
env_value="true",
base_url="https://paper-api.alpaca.markets",
)
app = _make_app(env, creds={"max_leverage": 1})
c = TestClient(app)
r = c.post("/tools/environment_info", headers={"Authorization": "Bearer ot"})
assert r.status_code == 200
body = r.json()
assert body["exchange"] == "alpaca"
assert body["environment"] == "testnet"
assert body["source"] == "env"
assert body["base_url"] == "https://paper-api.alpaca.markets"
assert body["max_leverage"] == 1
def test_environment_info_requires_auth():
env = EnvironmentInfo(
exchange="alpaca", environment="testnet", source="default",
env_value=None, base_url="https://paper-api.alpaca.markets",
)
app = _make_app(env, creds={"max_leverage": 1})
c = TestClient(app)
r = c.post("/tools/environment_info")
assert r.status_code == 401
@@ -1,110 +0,0 @@
from __future__ import annotations
from unittest.mock import AsyncMock, MagicMock
import pytest
from fastapi.testclient import TestClient
from mcp_alpaca.server import create_app
from mcp_common.auth import Principal, TokenStore
@pytest.fixture
def token_store():
return TokenStore(
tokens={
"core-tok": Principal("core", {"core"}),
"obs-tok": Principal("observer", {"observer"}),
}
)
@pytest.fixture
def mock_client():
c = MagicMock()
c.get_account = AsyncMock(return_value={"equity": 100000})
c.get_positions = AsyncMock(return_value=[])
c.get_activities = AsyncMock(return_value=[])
c.get_assets = AsyncMock(return_value=[])
c.get_ticker = AsyncMock(return_value={"symbol": "AAPL"})
c.get_bars = AsyncMock(return_value={"bars": []})
c.get_snapshot = AsyncMock(return_value={})
c.get_option_chain = AsyncMock(return_value={"contracts": []})
c.get_open_orders = AsyncMock(return_value=[])
c.get_clock = AsyncMock(return_value={"is_open": True})
c.get_calendar = AsyncMock(return_value=[])
c.place_order = AsyncMock(return_value={"id": "o1"})
c.amend_order = AsyncMock(return_value={"id": "o1"})
c.cancel_order = AsyncMock(return_value={"canceled": True})
c.cancel_all_orders = AsyncMock(return_value=[])
c.close_position = AsyncMock(return_value={"id": "close1"})
c.close_all_positions = AsyncMock(return_value=[])
return c
@pytest.fixture
def http(mock_client, token_store):
app = create_app(client=mock_client, token_store=token_store, creds={"max_leverage": 1})
return TestClient(app)
CORE = {"Authorization": "Bearer core-tok"}
OBS = {"Authorization": "Bearer obs-tok"}
READ_ENDPOINTS = [
("/tools/get_account", {}),
("/tools/get_positions", {}),
("/tools/get_activities", {}),
("/tools/get_assets", {}),
("/tools/get_ticker", {"symbol": "AAPL"}),
("/tools/get_bars", {"symbol": "AAPL"}),
("/tools/get_snapshot", {"symbol": "AAPL"}),
("/tools/get_option_chain", {"underlying": "AAPL"}),
("/tools/get_open_orders", {}),
("/tools/get_clock", {}),
("/tools/get_calendar", {}),
]
WRITE_ENDPOINTS = [
("/tools/place_order", {"symbol": "AAPL", "side": "buy", "qty": 1}),
("/tools/amend_order", {"order_id": "o1", "qty": 2}),
("/tools/cancel_order", {"order_id": "o1"}),
("/tools/cancel_all_orders", {}),
("/tools/close_position", {"symbol": "AAPL"}),
("/tools/close_all_positions", {}),
]
@pytest.mark.parametrize("path,payload", READ_ENDPOINTS)
def test_read_core_ok(http, path, payload):
r = http.post(path, json=payload, headers=CORE)
assert r.status_code == 200, (path, r.text)
@pytest.mark.parametrize("path,payload", READ_ENDPOINTS)
def test_read_observer_ok(http, path, payload):
r = http.post(path, json=payload, headers=OBS)
assert r.status_code == 200, (path, r.text)
@pytest.mark.parametrize("path,payload", READ_ENDPOINTS)
def test_read_no_auth_401(http, path, payload):
r = http.post(path, json=payload)
assert r.status_code == 401, (path, r.text)
@pytest.mark.parametrize("path,payload", WRITE_ENDPOINTS)
def test_write_core_ok(http, path, payload):
r = http.post(path, json=payload, headers=CORE)
assert r.status_code == 200, (path, r.text)
@pytest.mark.parametrize("path,payload", WRITE_ENDPOINTS)
def test_write_observer_403(http, path, payload):
r = http.post(path, json=payload, headers=OBS)
assert r.status_code == 403, (path, r.text)
@pytest.mark.parametrize("path,payload", WRITE_ENDPOINTS)
def test_write_no_auth_401(http, path, payload):
r = http.post(path, json=payload)
assert r.status_code == 401, (path, r.text)
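The parametrized suite above pins down a 2×3 access matrix (endpoint class × caller). Stated directly, mirroring the assertions:

```python
# Expected HTTP status per (endpoint class, caller) — mirrors the assertions above.
EXPECTED_STATUS = {
    ("read", "core"): 200,
    ("read", "observer"): 200,
    ("read", None): 401,         # no Authorization header
    ("write", "core"): 200,
    ("write", "observer"): 403,  # observer is read-only
    ("write", None): 401,
}
```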
@@ -1,28 +0,0 @@
[project]
name = "mcp-bybit"
version = "0.1.0"
requires-python = ">=3.11"
dependencies = [
"mcp-common",
"fastapi>=0.115",
"uvicorn[standard]>=0.30",
"httpx>=0.27",
"pydantic>=2.6",
"pybit>=5.8",
]
[project.optional-dependencies]
dev = ["pytest>=8", "pytest-asyncio>=0.23"]
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
[tool.hatch.build.targets.wheel]
packages = ["src/mcp_bybit"]
[tool.uv.sources]
mcp-common = { workspace = true }
[project.scripts]
mcp-bybit = "mcp_bybit.__main__:main"
@@ -1,30 +0,0 @@
from __future__ import annotations
from mcp_common.app_factory import ExchangeAppSpec, run_exchange_main
from mcp_bybit.client import BybitClient
from mcp_bybit.server import create_app
SPEC = ExchangeAppSpec(
exchange="bybit",
creds_env_var="BYBIT_CREDENTIALS_FILE",
env_var="BYBIT_TESTNET",
flag_key="testnet",
default_base_url_live="https://api.bybit.com",
default_base_url_testnet="https://api-testnet.bybit.com",
default_port=9019,
build_client=lambda creds, env_info: BybitClient(
api_key=creds["api_key"],
api_secret=creds["api_secret"],
testnet=(env_info.environment == "testnet"),
),
build_app=create_app,
)
def main():
run_exchange_main(SPEC)
if __name__ == "__main__":
main()
@@ -1,672 +0,0 @@
from __future__ import annotations
import asyncio
from typing import Any
from mcp_common import indicators as ind
from mcp_common import microstructure as micro
from pybit.unified_trading import HTTP
def _f(v: Any) -> float | None:
try:
return float(v)
except (TypeError, ValueError):
return None
def _i(v: Any) -> int | None:
try:
return int(v)
except (TypeError, ValueError):
return None
class BybitClient:
def __init__(
self,
api_key: str,
api_secret: str,
testnet: bool = True,
http: Any | None = None,
) -> None:
self.api_key = api_key
self.api_secret = api_secret
self.testnet = testnet
self._http = http or HTTP(
api_key=api_key,
api_secret=api_secret,
testnet=testnet,
)
async def _run(self, fn, /, **kwargs):
return await asyncio.to_thread(fn, **kwargs)
@staticmethod
def _parse_ticker(row: dict) -> dict:
return {
"symbol": row.get("symbol"),
"last_price": _f(row.get("lastPrice")),
"mark_price": _f(row.get("markPrice")),
"bid": _f(row.get("bid1Price")),
"ask": _f(row.get("ask1Price")),
"volume_24h": _f(row.get("volume24h")),
"turnover_24h": _f(row.get("turnover24h")),
"funding_rate": _f(row.get("fundingRate")),
"open_interest": _f(row.get("openInterest")),
}
async def get_ticker(self, symbol: str, category: str = "linear") -> dict:
resp = await self._run(
self._http.get_tickers, category=category, symbol=symbol
)
rows = (resp.get("result") or {}).get("list") or []
if not rows:
return {"symbol": symbol, "error": "not_found"}
return self._parse_ticker(rows[0])
async def get_ticker_batch(
self, symbols: list[str], category: str = "linear"
) -> dict[str, dict]:
out: dict[str, dict] = {}
for sym in symbols:
out[sym] = await self.get_ticker(sym, category=category)
return out
async def get_orderbook(
self, symbol: str, category: str = "linear", limit: int = 50
) -> dict:
resp = await self._run(
self._http.get_orderbook, category=category, symbol=symbol, limit=limit
)
r = resp.get("result") or {}
return {
"symbol": r.get("s"),
"bids": [[float(p), float(q)] for p, q in (r.get("b") or [])],
"asks": [[float(p), float(q)] for p, q in (r.get("a") or [])],
"timestamp": r.get("ts"),
}
async def get_historical(
self,
symbol: str,
category: str = "linear",
interval: str = "60",
start: int | None = None,
end: int | None = None,
limit: int = 1000,
) -> dict:
kwargs = dict(
category=category,
symbol=symbol,
interval=interval,
limit=limit,
)
if start is not None:
kwargs["start"] = start
if end is not None:
kwargs["end"] = end
resp = await self._run(self._http.get_kline, **kwargs)
rows = (resp.get("result") or {}).get("list") or []
rows_sorted = sorted(rows, key=lambda r: int(r[0]))
candles = [
{
"timestamp": int(r[0]),
"open": float(r[1]),
"high": float(r[2]),
"low": float(r[3]),
"close": float(r[4]),
"volume": float(r[5]),
}
for r in rows_sorted
]
return {"symbol": symbol, "candles": candles}
async def get_indicators(
self,
symbol: str,
category: str = "linear",
indicators: list[str] | None = None,
interval: str = "60",
start: int | None = None,
end: int | None = None,
) -> dict:
indicators = indicators or ["rsi", "atr", "macd", "adx"]
historical = await self.get_historical(
symbol, category=category, interval=interval, start=start, end=end
)
candles = historical.get("candles", [])
closes = [c["close"] for c in candles]
highs = [c["high"] for c in candles]
lows = [c["low"] for c in candles]
out: dict[str, Any] = {"symbol": symbol, "category": category}
for name in indicators:
n = name.lower()
if n == "sma":
out["sma"] = ind.sma(closes, 20)
elif n == "rsi":
out["rsi"] = ind.rsi(closes)
elif n == "atr":
out["atr"] = ind.atr(highs, lows, closes)
elif n == "macd":
out["macd"] = ind.macd(closes)
elif n == "adx":
out["adx"] = ind.adx(highs, lows, closes)
else:
out[n] = None
return out
async def get_funding_rate(self, symbol: str, category: str = "linear") -> dict:
resp = await self._run(
self._http.get_tickers, category=category, symbol=symbol
)
rows = (resp.get("result") or {}).get("list") or []
if not rows:
return {"symbol": symbol, "error": "not_found"}
row = rows[0]
return {
"symbol": row.get("symbol"),
"funding_rate": _f(row.get("fundingRate")),
"next_funding_time": _i(row.get("nextFundingTime")),
}
async def get_funding_history(
self, symbol: str, category: str = "linear", limit: int = 100
) -> dict:
resp = await self._run(
self._http.get_funding_rate_history,
category=category, symbol=symbol, limit=limit,
)
rows = (resp.get("result") or {}).get("list") or []
hist = [
{
"timestamp": int(r.get("fundingRateTimestamp", 0)),
"rate": float(r.get("fundingRate", 0)),
}
for r in rows
]
return {"symbol": symbol, "history": hist}
async def get_open_interest(
self,
symbol: str,
category: str = "linear",
interval: str = "5min",
limit: int = 288,
) -> dict:
resp = await self._run(
self._http.get_open_interest,
category=category, symbol=symbol, intervalTime=interval, limit=limit,
)
rows = (resp.get("result") or {}).get("list") or []
points = [
{
"timestamp": int(r.get("timestamp", 0)),
"oi": float(r.get("openInterest", 0)),
}
for r in rows
]
current_oi = points[0]["oi"] if points else None
return {
"symbol": symbol,
"category": category,
"interval": interval,
"current_oi": current_oi,
"points": points,
}
async def get_instruments(self, category: str = "linear", symbol: str | None = None) -> dict:
kwargs: dict[str, Any] = {"category": category}
if symbol:
kwargs["symbol"] = symbol
resp = await self._run(self._http.get_instruments_info, **kwargs)
rows = (resp.get("result") or {}).get("list") or []
instruments = []
for r in rows:
pf = r.get("priceFilter") or {}
lf = r.get("lotSizeFilter") or {}
instruments.append({
"symbol": r.get("symbol"),
"status": r.get("status"),
"base_coin": r.get("baseCoin"),
"quote_coin": r.get("quoteCoin"),
"tick_size": _f(pf.get("tickSize")),
"qty_step": _f(lf.get("qtyStep")),
"min_qty": _f(lf.get("minOrderQty")),
})
return {"category": category, "instruments": instruments}
async def get_option_chain(self, base_coin: str, expiry: str | None = None) -> dict:
kwargs: dict[str, Any] = {"category": "option", "baseCoin": base_coin.upper()}
resp = await self._run(self._http.get_instruments_info, **kwargs)
rows = (resp.get("result") or {}).get("list") or []
options = []
for r in rows:
delivery = r.get("deliveryTime")
if expiry and expiry not in r.get("symbol", ""):
continue
options.append({
"symbol": r.get("symbol"),
"base_coin": r.get("baseCoin"),
"settle_coin": r.get("settleCoin"),
"type": r.get("optionsType"),
"launch_time": int(r.get("launchTime", 0)),
"delivery_time": int(delivery) if delivery else None,
})
return {"base_coin": base_coin.upper(), "options": options}
async def get_positions(
self, category: str = "linear", settle_coin: str = "USDT"
) -> list[dict]:
kwargs: dict[str, Any] = {"category": category}
if category in ("linear", "inverse"):
kwargs["settleCoin"] = settle_coin
resp = await self._run(self._http.get_positions, **kwargs)
rows = (resp.get("result") or {}).get("list") or []
out = []
for r in rows:
out.append({
"symbol": r.get("symbol"),
"side": r.get("side"),
"size": _f(r.get("size")),
"entry_price": _f(r.get("avgPrice")),
"unrealized_pnl": _f(r.get("unrealisedPnl")),
"leverage": _f(r.get("leverage")),
"liquidation_price": _f(r.get("liqPrice")),
"position_value": _f(r.get("positionValue")),
})
return out
async def get_account_summary(self, account_type: str = "UNIFIED") -> dict:
resp = await self._run(
self._http.get_wallet_balance, accountType=account_type
)
rows = (resp.get("result") or {}).get("list") or []
if not rows:
return {"error": "no_account"}
a = rows[0]
coins = []
for c in a.get("coin") or []:
coins.append({
"coin": c.get("coin"),
"wallet_balance": _f(c.get("walletBalance")),
"equity": _f(c.get("equity")),
})
return {
"account_type": a.get("accountType"),
"equity": _f(a.get("totalEquity")),
"wallet_balance": _f(a.get("totalWalletBalance")),
"margin_balance": _f(a.get("totalMarginBalance")),
"available_balance": _f(a.get("totalAvailableBalance")),
"unrealized_pnl": _f(a.get("totalPerpUPL")),
"coins": coins,
}
async def get_trade_history(
self, category: str = "linear", limit: int = 50
) -> list[dict]:
resp = await self._run(
self._http.get_executions, category=category, limit=limit
)
rows = (resp.get("result") or {}).get("list") or []
return [
{
"symbol": r.get("symbol"),
"side": r.get("side"),
"size": _f(r.get("execQty")),
"price": _f(r.get("execPrice")),
"fee": _f(r.get("execFee")),
"timestamp": _i(r.get("execTime")),
"order_id": r.get("orderId"),
}
for r in rows
]
async def get_open_orders(
self,
category: str = "linear",
symbol: str | None = None,
settle_coin: str = "USDT",
) -> list[dict]:
kwargs: dict[str, Any] = {"category": category}
if category in ("linear", "inverse") and not symbol:
kwargs["settleCoin"] = settle_coin
if symbol:
kwargs["symbol"] = symbol
resp = await self._run(self._http.get_open_orders, **kwargs)
rows = (resp.get("result") or {}).get("list") or []
return [
{
"order_id": r.get("orderId"),
"symbol": r.get("symbol"),
"side": r.get("side"),
"qty": _f(r.get("qty")),
"price": _f(r.get("price")),
"type": r.get("orderType"),
"status": r.get("orderStatus"),
"reduce_only": bool(r.get("reduceOnly")),
}
for r in rows
]
async def get_orderbook_imbalance(
self,
symbol: str,
category: str = "linear",
depth: int = 10,
) -> dict:
"""Microstructure: bid/ask imbalance ratio + microprice + slope."""
ob = await self.get_orderbook(symbol=symbol, category=category, limit=max(depth, 50))
result = micro.orderbook_imbalance(ob.get("bids") or [], ob.get("asks") or [], depth=depth)
return {
"symbol": symbol,
"category": category,
"depth": depth,
**result,
"timestamp": ob.get("timestamp"),
}
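`mcp_common.microstructure` is not shown in this diff; a minimal sketch of the usual imbalance/microprice definitions it likely implements (book levels below are illustrative, and the real module may differ):

```python
def imbalance_sketch(bids, asks, depth=10):
    """Illustrative only — the real mcp_common.microstructure may differ."""
    bid_vol = sum(q for _, q in bids[:depth])
    ask_vol = sum(q for _, q in asks[:depth])
    total = bid_vol + ask_vol
    imb = (bid_vol - ask_vol) / total if total else None
    # Microprice: top-of-book mid weighted by opposite-side size.
    (bb, bq), (ba, aq) = bids[0], asks[0]
    micro = (ba * bq + bb * aq) / (bq + aq)
    return {"imbalance": imb, "microprice": micro}

book = imbalance_sketch([[99.0, 3.0], [98.5, 1.0]], [[100.0, 1.0], [100.5, 1.0]])
print(book)  # imbalance ≈ 0.333 (bid-heavy), microprice 99.75
```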
async def get_basis_term_structure(self, asset: str) -> dict:
"""Basis curve futures (dated) vs perp + spot. Filtra contratti future
BTCUSDT / ETHUSDT con scadenza, calcola annualized basis per ognuno.
"""
import datetime as _dt
asset = asset.upper()
spot = await self.get_ticker(f"{asset}USDT", category="spot")
perp = await self.get_ticker(f"{asset}USDT", category="linear")
sp = spot.get("last_price")
pp = perp.get("last_price")
# List dated futures (linear/inverse)
instr = await self.get_instruments(category="linear")
items = (instr.get("instruments") or [])
futures = [
x for x in items
if x.get("symbol", "").startswith(f"{asset}-") or x.get("symbol", "").startswith(f"{asset}USDT-")
]
rows: list[dict[str, Any]] = []
if sp:
now_ms = int(_dt.datetime.now(_dt.UTC).timestamp() * 1000)
for f in futures[:10]:
tk = await self.get_ticker(f["symbol"], category="linear")
fp = tk.get("last_price")
expiry_ms = f.get("delivery_time")
if not fp or not expiry_ms:
continue
days = max((int(expiry_ms) - now_ms) / 86_400_000, 1)
basis_pct = 100.0 * (fp - sp) / sp
annualized = basis_pct * 365.0 / days
rows.append({
"symbol": f["symbol"],
"expiry_ms": int(expiry_ms),
"days_to_expiry": round(days, 2),
"future_price": fp,
"basis_pct": round(basis_pct, 4),
"annualized_basis_pct": round(annualized, 4),
})
rows.sort(key=lambda r: r["days_to_expiry"])
return {
"asset": asset,
"spot_price": sp,
"perp_price": pp,
"perp_basis_pct": round(100.0 * (pp - sp) / sp, 4) if (sp and pp) else None,
"term_structure": rows,
"data_timestamp": _dt.datetime.now(_dt.UTC).isoformat(),
}
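The annualization step above is plain linear scaling (basis% × 365/days, with a one-day floor). Checked in isolation with illustrative prices:

```python
def annualized_basis_pct(spot: float, future: float, days_to_expiry: float) -> float:
    """Non-compounded annualization of the cash-and-carry basis, as above."""
    basis_pct = 100.0 * (future - spot) / spot
    return basis_pct * 365.0 / max(days_to_expiry, 1)

# A future 2% over spot with 73 days to expiry annualizes to 10%/yr.
print(round(annualized_basis_pct(100.0, 102.0, 73.0), 4))  # → 10.0
```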
async def get_basis_spot_perp(self, asset: str) -> dict:
asset = asset.upper()
symbol = f"{asset}USDT"
spot = await self.get_ticker(symbol, category="spot")
perp = await self.get_ticker(symbol, category="linear")
sp = spot.get("last_price")
pp = perp.get("last_price")
basis_abs = basis_pct = None
if sp and pp:
basis_abs = pp - sp
basis_pct = 100.0 * basis_abs / sp
return {
"asset": asset,
"symbol": symbol,
"spot_price": sp,
"perp_price": pp,
"basis_abs": basis_abs,
"basis_pct": basis_pct,
"funding_rate": perp.get("funding_rate"),
}
def _envelope(self, resp: dict, payload: dict) -> dict:
code = resp.get("retCode", 0)
if code != 0:
return {"error": resp.get("retMsg", "bybit_error"), "code": code}
return payload
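Bybit v5 signals errors in-band via `retCode` (0 = success); `_envelope` turns that into either the parsed payload or a compact error dict. Reproduced standalone:

```python
def envelope(resp: dict, payload: dict) -> dict:
    """Bybit retCode == 0 means success; anything else is an upstream error."""
    code = resp.get("retCode", 0)
    if code != 0:
        return {"error": resp.get("retMsg", "bybit_error"), "code": code}
    return payload

print(envelope({"retCode": 0, "retMsg": "OK"}, {"order_id": "o1"}))
# → {'order_id': 'o1'}
print(envelope({"retCode": 10001, "retMsg": "params error"}, {"order_id": None}))
# → {'error': 'params error', 'code': 10001}
```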
async def place_order(
self,
category: str,
symbol: str,
side: str,
qty: float,
order_type: str = "Limit",
price: float | None = None,
tif: str = "GTC",
reduce_only: bool = False,
position_idx: int | None = None,
) -> dict:
kwargs: dict[str, Any] = {
"category": category,
"symbol": symbol,
"side": side,
"qty": str(qty),
"orderType": order_type,
"timeInForce": tif,
"reduceOnly": reduce_only,
}
if price is not None:
kwargs["price"] = str(price)
if position_idx is not None:
kwargs["positionIdx"] = position_idx
if category == "option":
import uuid
kwargs["orderLinkId"] = f"cerbero-{uuid.uuid4().hex[:16]}"
resp = await self._run(self._http.place_order, **kwargs)
r = resp.get("result") or {}
return self._envelope(resp, {
"order_id": r.get("orderId"),
"order_link_id": r.get("orderLinkId"),
"status": "submitted",
})
async def place_combo_order(
self,
category: str,
legs: list[dict[str, Any]],
) -> dict:
"""Atomic multi-leg via /v5/order/create-batch (Bybit option only).
Bybit supports batch_order only for category='option'. For perp/linear,
loop over place_order instead (not atomic).
legs: [{symbol, side, qty, order_type, price?, tif?, reduce_only?}].
"""
if category != "option":
raise ValueError("place_combo_order: Bybit batch_order è disponibile solo su category='option'")
if len(legs) < 2:
raise ValueError("combo requires at least 2 legs")
import uuid
request: list[dict[str, Any]] = []
for leg in legs:
entry: dict[str, Any] = {
"symbol": leg["symbol"],
"side": leg["side"],
"qty": str(leg["qty"]),
"orderType": leg.get("order_type", "Limit"),
"timeInForce": leg.get("tif", "GTC"),
"reduceOnly": leg.get("reduce_only", False),
"orderLinkId": f"cerbero-{uuid.uuid4().hex[:16]}",
}
if leg.get("price") is not None:
entry["price"] = str(leg["price"])
request.append(entry)
resp = await self._run(self._http.place_batch_order, category=category, request=request)
result_list = (resp.get("result") or {}).get("list") or []
orders = [
{
"order_id": r.get("orderId"),
"order_link_id": r.get("orderLinkId"),
"status": "submitted",
}
for r in result_list
]
return self._envelope(resp, {"orders": orders})
async def amend_order(
self,
category: str,
symbol: str,
order_id: str,
new_qty: float | None = None,
new_price: float | None = None,
) -> dict:
kwargs: dict[str, Any] = {
"category": category,
"symbol": symbol,
"orderId": order_id,
}
if new_qty is not None:
kwargs["qty"] = str(new_qty)
if new_price is not None:
kwargs["price"] = str(new_price)
resp = await self._run(self._http.amend_order, **kwargs)
r = resp.get("result") or {}
return self._envelope(resp, {
"order_id": r.get("orderId", order_id),
"status": "amended",
})
async def cancel_order(
self, category: str, symbol: str, order_id: str
) -> dict:
resp = await self._run(
self._http.cancel_order,
category=category, symbol=symbol, orderId=order_id,
)
r = resp.get("result") or {}
return self._envelope(resp, {
"order_id": r.get("orderId", order_id),
"status": "cancelled",
})
async def cancel_all_orders(
self, category: str, symbol: str | None = None
) -> dict:
kwargs: dict[str, Any] = {"category": category}
if symbol:
kwargs["symbol"] = symbol
resp = await self._run(self._http.cancel_all_orders, **kwargs)
r = resp.get("result") or {}
ids = [x.get("orderId") for x in (r.get("list") or [])]
return self._envelope(resp, {
"cancelled_ids": ids,
"count": len(ids),
})
async def set_stop_loss(
self, category: str, symbol: str, stop_loss: float,
position_idx: int = 0,
) -> dict:
resp = await self._run(
self._http.set_trading_stop,
category=category, symbol=symbol,
stopLoss=str(stop_loss), positionIdx=position_idx,
)
return self._envelope(resp, {
"symbol": symbol, "stop_loss": stop_loss,
"status": "stop_loss_set",
})
async def set_take_profit(
self, category: str, symbol: str, take_profit: float,
position_idx: int = 0,
) -> dict:
resp = await self._run(
self._http.set_trading_stop,
category=category, symbol=symbol,
takeProfit=str(take_profit), positionIdx=position_idx,
)
return self._envelope(resp, {
"symbol": symbol, "take_profit": take_profit,
"status": "take_profit_set",
})
async def close_position(self, category: str, symbol: str) -> dict:
positions = await self.get_positions(category=category)
target = next((p for p in positions if p["symbol"] == symbol and (p["size"] or 0) > 0), None)
if not target:
return {"error": "no_open_position", "symbol": symbol}
close_side = "Sell" if target["side"] == "Buy" else "Buy"
return await self.place_order(
category=category,
symbol=symbol,
side=close_side,
qty=target["size"],
order_type="Market",
reduce_only=True,
tif="IOC",
)
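`close_position` inverts the position's side and sends a reduce-only IOC market order, so it can never open or flip a position. The side inversion on its own:

```python
def close_side(position_side: str) -> str:
    """A long ('Buy') position closes with a Sell, a short with a Buy."""
    return "Sell" if position_side == "Buy" else "Buy"

print(close_side("Buy"), close_side("Sell"))  # → Sell Buy
```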
async def set_leverage(
self, category: str, symbol: str, leverage: int
) -> dict:
resp = await self._run(
self._http.set_leverage,
category=category, symbol=symbol,
buyLeverage=str(leverage), sellLeverage=str(leverage),
)
return self._envelope(resp, {
"symbol": symbol, "leverage": leverage,
"status": "leverage_set",
})
async def switch_position_mode(
self, category: str, symbol: str, mode: str
) -> dict:
mode_code = 3 if mode.lower() == "hedge" else 0
resp = await self._run(
self._http.switch_position_mode,
category=category, symbol=symbol, mode=mode_code,
)
return self._envelope(resp, {
"symbol": symbol, "mode": mode,
"status": "mode_switched",
})
async def transfer_asset(
self,
coin: str,
amount: float,
from_type: str,
to_type: str,
) -> dict:
import uuid
resp = await self._run(
self._http.create_internal_transfer,
transferId=str(uuid.uuid4()),
coin=coin,
amount=str(amount),
fromAccountType=from_type,
toAccountType=to_type,
)
r = resp.get("result") or {}
return self._envelope(resp, {
"transfer_id": r.get("transferId"),
"coin": coin,
"amount": amount,
"status": "submitted",
})
@@ -1,522 +0,0 @@
from __future__ import annotations
import os
from fastapi import Depends, HTTPException
from mcp_common.audit import audit_write_op
from mcp_common.auth import Principal, TokenStore, require_principal
from mcp_common.environment import EnvironmentInfo
from mcp_common.mcp_bridge import mount_mcp_endpoint
from mcp_common.server import build_app
from pydantic import BaseModel, Field
from mcp_bybit.client import BybitClient
from mcp_bybit.leverage_cap import enforce_leverage as _enforce_leverage
from mcp_bybit.leverage_cap import get_max_leverage
# --- Body models: reads ---
class TickerReq(BaseModel):
symbol: str
category: str = "linear"
class TickerBatchReq(BaseModel):
symbols: list[str]
category: str = "linear"
class OrderbookReq(BaseModel):
symbol: str
category: str = "linear"
limit: int = 50
class HistoricalReq(BaseModel):
symbol: str
category: str = "linear"
interval: str = "60"
start: int | None = None
end: int | None = None
limit: int = 1000
class IndicatorsReq(BaseModel):
symbol: str
category: str = "linear"
indicators: list[str] = ["rsi", "atr", "macd", "adx"]
interval: str = "60"
start: int | None = None
end: int | None = None
class FundingRateReq(BaseModel):
symbol: str
category: str = "linear"
class FundingHistoryReq(BaseModel):
symbol: str
category: str = "linear"
limit: int = 100
class OpenInterestReq(BaseModel):
symbol: str
category: str = "linear"
interval: str = "5min"
limit: int = 288
class InstrumentsReq(BaseModel):
category: str = "linear"
symbol: str | None = None
class OptionChainReq(BaseModel):
base_coin: str
expiry: str | None = None
class PositionsReq(BaseModel):
category: str = "linear"
class AccountSummaryReq(BaseModel):
pass
class TradeHistoryReq(BaseModel):
category: str = "linear"
limit: int = 50
class OpenOrdersReq(BaseModel):
category: str = "linear"
symbol: str | None = None
class BasisSpotPerpReq(BaseModel):
asset: str
class OrderbookImbalanceReq(BaseModel):
symbol: str
category: str = "linear"
depth: int = 10
class BasisTermStructureReq(BaseModel):
asset: str
# --- Body models: writes ---
class PlaceOrderReq(BaseModel):
category: str
symbol: str
side: str
qty: float
order_type: str = "Limit"
price: float | None = None
tif: str = "GTC"
reduce_only: bool = False
position_idx: int | None = None
class ComboLegReq(BaseModel):
symbol: str
side: str
qty: float
order_type: str = "Limit"
price: float | None = None
tif: str = "GTC"
reduce_only: bool = False
class PlaceComboOrderReq(BaseModel):
category: str = "option"
legs: list[ComboLegReq] = Field(..., min_length=2)
class AmendOrderReq(BaseModel):
category: str
symbol: str
order_id: str
new_qty: float | None = None
new_price: float | None = None
class CancelOrderReq(BaseModel):
category: str
symbol: str
order_id: str
class CancelAllReq(BaseModel):
category: str
symbol: str | None = None
class SetStopLossReq(BaseModel):
category: str
symbol: str
stop_loss: float
position_idx: int = 0
class SetTakeProfitReq(BaseModel):
category: str
symbol: str
take_profit: float
position_idx: int = 0
class ClosePositionReq(BaseModel):
category: str
symbol: str
class SetLeverageReq(BaseModel):
category: str
symbol: str
leverage: int
class SwitchModeReq(BaseModel):
category: str
symbol: str
mode: str
class TransferReq(BaseModel):
coin: str
amount: float
from_type: str
to_type: str
# --- ACL helper ---
def _check(principal: Principal, *, core: bool = False, observer: bool = False) -> None:
allowed: set[str] = set()
if core:
allowed.add("core")
if observer:
allowed.add("observer")
if not (principal.capabilities & allowed):
raise HTTPException(status_code=403, detail="forbidden")
def create_app(
*,
client: BybitClient,
token_store: TokenStore,
creds: dict | None = None,
env_info: EnvironmentInfo | None = None,
):
creds = creds or {}
app = build_app(name="mcp-bybit", version="0.1.0", token_store=token_store)
# ── Reads ──────────────────────────────────────────────
@app.post("/tools/environment_info", tags=["reads"])
async def t_environment_info(principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
if env_info is None:
return {
"exchange": "bybit",
"environment": "testnet" if client.testnet else "mainnet",
"source": "credentials",
"env_value": None,
"base_url": getattr(client, "base_url", None),
"max_leverage": get_max_leverage(creds),
}
return {
"exchange": env_info.exchange,
"environment": env_info.environment,
"source": env_info.source,
"env_value": env_info.env_value,
"base_url": env_info.base_url,
"max_leverage": get_max_leverage(creds),
}
@app.post("/tools/get_ticker", tags=["reads"])
async def t_get_ticker(body: TickerReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_ticker(body.symbol, body.category)
@app.post("/tools/get_ticker_batch", tags=["reads"])
async def t_get_ticker_batch(body: TickerBatchReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_ticker_batch(body.symbols, body.category)
@app.post("/tools/get_orderbook", tags=["reads"])
async def t_get_orderbook(body: OrderbookReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_orderbook(body.symbol, body.category, body.limit)
@app.post("/tools/get_historical", tags=["reads"])
async def t_get_historical(body: HistoricalReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_historical(
body.symbol, body.category, body.interval, body.start, body.end, body.limit,
)
@app.post("/tools/get_indicators", tags=["reads"])
async def t_get_indicators(body: IndicatorsReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_indicators(
body.symbol, body.category, body.indicators,
body.interval, body.start, body.end,
)
@app.post("/tools/get_funding_rate", tags=["reads"])
async def t_get_funding_rate(body: FundingRateReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_funding_rate(body.symbol, body.category)
@app.post("/tools/get_funding_history", tags=["reads"])
async def t_get_funding_history(body: FundingHistoryReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_funding_history(body.symbol, body.category, body.limit)
@app.post("/tools/get_open_interest", tags=["reads"])
async def t_get_open_interest(body: OpenInterestReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_open_interest(body.symbol, body.category, body.interval, body.limit)
@app.post("/tools/get_instruments", tags=["reads"])
async def t_get_instruments(body: InstrumentsReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_instruments(body.category, body.symbol)
@app.post("/tools/get_option_chain", tags=["reads"])
async def t_get_option_chain(body: OptionChainReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_option_chain(body.base_coin, body.expiry)
@app.post("/tools/get_positions", tags=["reads"])
async def t_get_positions(body: PositionsReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return {"positions": await client.get_positions(body.category)}
@app.post("/tools/get_account_summary", tags=["reads"])
async def t_get_account_summary(body: AccountSummaryReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_account_summary()
@app.post("/tools/get_trade_history", tags=["reads"])
async def t_get_trade_history(body: TradeHistoryReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return {"trades": await client.get_trade_history(body.category, body.limit)}
@app.post("/tools/get_open_orders", tags=["reads"])
async def t_get_open_orders(body: OpenOrdersReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return {"orders": await client.get_open_orders(body.category, body.symbol)}
@app.post("/tools/get_basis_spot_perp", tags=["reads"])
async def t_get_basis_spot_perp(body: BasisSpotPerpReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_basis_spot_perp(body.asset)
@app.post("/tools/get_orderbook_imbalance", tags=["reads"])
async def t_get_ob_imbalance(body: OrderbookImbalanceReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_orderbook_imbalance(body.symbol, body.category, body.depth)
@app.post("/tools/get_basis_term_structure", tags=["reads"])
async def t_get_basis_term_structure(body: BasisTermStructureReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return await client.get_basis_term_structure(body.asset)
# ── Writes ─────────────────────────────────────────────
@app.post("/tools/place_order", tags=["writes"])
async def t_place_order(body: PlaceOrderReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = await client.place_order(
body.category, body.symbol, body.side, body.qty,
body.order_type, body.price, body.tif, body.reduce_only, body.position_idx,
)
audit_write_op(
principal=principal, action="place_order", exchange="bybit",
target=body.symbol,
payload={"category": body.category, "side": body.side, "qty": body.qty,
"order_type": body.order_type, "price": body.price, "tif": body.tif,
"reduce_only": body.reduce_only},
result=result,
)
return result
@app.post("/tools/place_combo_order", tags=["writes"])
async def t_place_combo_order(body: PlaceComboOrderReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = await client.place_combo_order(
category=body.category,
legs=[leg.model_dump() for leg in body.legs],
)
audit_write_op(
principal=principal, action="place_combo_order", exchange="bybit",
payload={"category": body.category,
"legs": [leg.model_dump() for leg in body.legs]},
result=result if isinstance(result, dict) else None,
)
return result
@app.post("/tools/amend_order", tags=["writes"])
async def t_amend_order(body: AmendOrderReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = await client.amend_order(
body.category, body.symbol, body.order_id, body.new_qty, body.new_price,
)
audit_write_op(
principal=principal, action="amend_order", exchange="bybit",
target=body.order_id,
payload={"category": body.category, "symbol": body.symbol,
"new_qty": body.new_qty, "new_price": body.new_price},
result=result,
)
return result
@app.post("/tools/cancel_order", tags=["writes"])
async def t_cancel_order(body: CancelOrderReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = await client.cancel_order(body.category, body.symbol, body.order_id)
audit_write_op(
principal=principal, action="cancel_order", exchange="bybit",
target=body.order_id,
payload={"category": body.category, "symbol": body.symbol},
result=result,
)
return result
@app.post("/tools/cancel_all_orders", tags=["writes"])
async def t_cancel_all(body: CancelAllReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = await client.cancel_all_orders(body.category, body.symbol)
audit_write_op(
principal=principal, action="cancel_all_orders", exchange="bybit",
target=body.symbol,
payload={"category": body.category},
result=result,
)
return result
@app.post("/tools/set_stop_loss", tags=["writes"])
async def t_set_sl(body: SetStopLossReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = await client.set_stop_loss(body.category, body.symbol, body.stop_loss, body.position_idx)
audit_write_op(
principal=principal, action="set_stop_loss", exchange="bybit",
target=body.symbol,
payload={"stop_loss": body.stop_loss, "position_idx": body.position_idx},
result=result,
)
return result
@app.post("/tools/set_take_profit", tags=["writes"])
async def t_set_tp(body: SetTakeProfitReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = await client.set_take_profit(body.category, body.symbol, body.take_profit, body.position_idx)
audit_write_op(
principal=principal, action="set_take_profit", exchange="bybit",
target=body.symbol,
payload={"take_profit": body.take_profit, "position_idx": body.position_idx},
result=result,
)
return result
@app.post("/tools/close_position", tags=["writes"])
async def t_close(body: ClosePositionReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = await client.close_position(body.category, body.symbol)
audit_write_op(
principal=principal, action="close_position", exchange="bybit",
target=body.symbol,
payload={"category": body.category},
result=result,
)
return result
@app.post("/tools/set_leverage", tags=["writes"])
async def t_set_leverage(body: SetLeverageReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
_enforce_leverage(body.leverage, creds=creds, exchange="bybit")
result = await client.set_leverage(body.category, body.symbol, body.leverage)
audit_write_op(
principal=principal, action="set_leverage", exchange="bybit",
target=body.symbol,
payload={"category": body.category, "leverage": body.leverage},
result=result,
)
return result
@app.post("/tools/switch_position_mode", tags=["writes"])
async def t_switch_mode(body: SwitchModeReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = await client.switch_position_mode(body.category, body.symbol, body.mode)
audit_write_op(
principal=principal, action="switch_position_mode", exchange="bybit",
target=body.symbol,
payload={"category": body.category, "mode": body.mode},
result=result,
)
return result
@app.post("/tools/transfer_asset", tags=["writes"])
async def t_transfer(body: TransferReq, principal: Principal = Depends(require_principal)):
_check(principal, core=True)
result = await client.transfer_asset(body.coin, body.amount, body.from_type, body.to_type)
audit_write_op(
principal=principal, action="transfer_asset", exchange="bybit",
payload={"coin": body.coin, "amount": body.amount,
"from_type": body.from_type, "to_type": body.to_type},
result=result,
)
return result
# ── MCP mount ──────────────────────────────────────────
port = int(os.environ.get("PORT", "9019"))
mount_mcp_endpoint(
app,
name="cerbero-bybit",
version="0.1.0",
token_store=token_store,
internal_base_url=f"http://localhost:{port}",
tools=[
{"name": "environment_info", "description": "Operating environment (testnet/mainnet), source, base_url, max_leverage cap."},
{"name": "get_ticker", "description": "Bybit ticker (spot/linear/inverse/option)."},
{"name": "get_ticker_batch", "description": "Tickers for multiple symbols."},
{"name": "get_orderbook", "description": "Orderbook to depth N."},
{"name": "get_historical", "description": "Bybit OHLCV candles."},
{"name": "get_indicators", "description": "Technical indicators (RSI, ATR, MACD, ADX)."},
{"name": "get_funding_rate", "description": "Current perp funding rate."},
{"name": "get_funding_history", "description": "Historical perp funding."},
{"name": "get_open_interest", "description": "Perp open interest history."},
{"name": "get_instruments", "description": "Contract specs."},
{"name": "get_option_chain", "description": "BTC/ETH/SOL option chain."},
{"name": "get_positions", "description": "Open positions."},
{"name": "get_account_summary", "description": "Wallet balance and margin."},
{"name": "get_trade_history", "description": "Recent fills."},
{"name": "get_open_orders", "description": "Pending orders."},
{"name": "get_basis_spot_perp", "description": "Basis, spot vs linear perp."},
{"name": "get_orderbook_imbalance", "description": "Microstructure: imbalance ratio + microprice + slope over the top-N book levels."},
{"name": "get_basis_term_structure", "description": "Basis curve, dated futures vs spot, annualized."},
{"name": "place_order", "description": "Submit an order (CORE only)."},
{"name": "place_combo_order", "description": "Atomic multi-leg via place_batch_order (category=option only)."},
{"name": "amend_order", "description": "Amend an existing order."},
{"name": "cancel_order", "description": "Cancel an order."},
{"name": "cancel_all_orders", "description": "Cancel all orders."},
{"name": "set_stop_loss", "description": "Set a stop loss on a position."},
{"name": "set_take_profit", "description": "Set a take profit on a position."},
{"name": "close_position", "description": "Close an open position."},
{"name": "set_leverage", "description": "Uniform buy+sell leverage."},
{"name": "switch_position_mode", "description": "Hedge vs one-way."},
{"name": "transfer_asset", "description": "Internal transfer between account types."},
],
)
return app
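For orientation, a minimal sketch of how a caller would invoke one of the mounted `/tools/` endpoints: every tool is a POST with a bearer token and a JSON body. The helper name, token value, and port default here are illustrative assumptions, not part of the server code.

```python
import json

def build_tool_request(tool: str, body: dict, token: str, port: int = 9019) -> dict:
    # Assemble URL, headers, and serialized body for a POST to a /tools/ endpoint.
    # Token value and port default are illustrative assumptions.
    return {
        "url": f"http://localhost:{port}/tools/{tool}",
        "headers": {
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        "data": json.dumps(body),
    }

req = build_tool_request(
    "place_order",
    {"category": "linear", "symbol": "BTCUSDT", "side": "Buy", "qty": 0.01},
    token="core-tok",
)
```

Any HTTP client (httpx, requests, curl) can then send the request; observer tokens pass `_check` only on the read endpoints, while write endpoints require a core principal.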
@@ -1,21 +0,0 @@
from __future__ import annotations
from unittest.mock import MagicMock
import pytest
from mcp_bybit.client import BybitClient
@pytest.fixture
def mock_http():
return MagicMock(name="pybit_HTTP")
@pytest.fixture
def client(mock_http):
return BybitClient(
api_key="test_key",
api_secret="test_secret",
testnet=True,
http=mock_http,
)
@@ -1,588 +0,0 @@
from __future__ import annotations
import pytest
from mcp_bybit.client import BybitClient
def test_client_init_stores_attrs(client, mock_http):
assert client.testnet is True
assert client._http is mock_http
def test_client_init_default_http(monkeypatch):
created = {}
class FakeHTTP:
def __init__(self, **kwargs):
created.update(kwargs)
monkeypatch.setattr("mcp_bybit.client.HTTP", FakeHTTP)
BybitClient(api_key="k", api_secret="s", testnet=False)
assert created["api_key"] == "k"
assert created["api_secret"] == "s"
assert created["testnet"] is False
@pytest.mark.asyncio
async def test_get_ticker(client, mock_http):
mock_http.get_tickers.return_value = {
"retCode": 0,
"result": {
"list": [{
"symbol": "BTCUSDT",
"lastPrice": "60000",
"markPrice": "60010",
"bid1Price": "59995",
"ask1Price": "60005",
"volume24h": "1500.5",
"turnover24h": "90000000",
"fundingRate": "0.0001",
"openInterest": "50000",
}]
},
}
t = await client.get_ticker("BTCUSDT", category="linear")
mock_http.get_tickers.assert_called_once_with(category="linear", symbol="BTCUSDT")
assert t["symbol"] == "BTCUSDT"
assert t["last_price"] == 60000.0
assert t["mark_price"] == 60010.0
assert t["bid"] == 59995.0
assert t["ask"] == 60005.0
assert t["volume_24h"] == 1500.5
assert t["funding_rate"] == 0.0001
assert t["open_interest"] == 50000.0
@pytest.mark.asyncio
async def test_get_ticker_batch(client, mock_http):
def side_effect(**kwargs):
symbol = kwargs["symbol"]
return {"retCode": 0, "result": {"list": [{
"symbol": symbol, "lastPrice": "1", "markPrice": "1",
"bid1Price": "1", "ask1Price": "1", "volume24h": "0",
"turnover24h": "0", "fundingRate": "0", "openInterest": "0",
}]}}
mock_http.get_tickers.side_effect = side_effect
out = await client.get_ticker_batch(["BTCUSDT", "ETHUSDT"], category="linear")
assert set(out.keys()) == {"BTCUSDT", "ETHUSDT"}
assert mock_http.get_tickers.call_count == 2
@pytest.mark.asyncio
async def test_get_ticker_not_found(client, mock_http):
mock_http.get_tickers.return_value = {"retCode": 0, "result": {"list": []}}
t = await client.get_ticker("UNKNOWNUSDT", category="linear")
assert t == {"symbol": "UNKNOWNUSDT", "error": "not_found"}
def test_parse_helpers():
from mcp_bybit.client import _f, _i
assert _f("1.5") == 1.5
assert _f("") is None
assert _f(None) is None
assert _i("42") == 42
assert _i("") is None
assert _i(None) is None
@pytest.mark.asyncio
async def test_get_orderbook(client, mock_http):
mock_http.get_orderbook.return_value = {
"retCode": 0,
"result": {
"s": "BTCUSDT",
"b": [["59990", "0.5"], ["59980", "1.0"]],
"a": [["60010", "0.3"], ["60020", "0.7"]],
"ts": 1700000000000,
},
}
ob = await client.get_orderbook("BTCUSDT", category="linear", limit=25)
mock_http.get_orderbook.assert_called_once_with(
category="linear", symbol="BTCUSDT", limit=25
)
assert ob["symbol"] == "BTCUSDT"
assert ob["bids"] == [[59990.0, 0.5], [59980.0, 1.0]]
assert ob["asks"] == [[60010.0, 0.3], [60020.0, 0.7]]
assert ob["timestamp"] == 1700000000000
@pytest.mark.asyncio
async def test_get_historical(client, mock_http):
mock_http.get_kline.return_value = {
"retCode": 0,
"result": {
"list": [
["1700000000000", "60000", "60500", "59500", "60200", "100", "6020000"],
["1700003600000", "60200", "60700", "60000", "60400", "80", "4832000"],
]
},
}
out = await client.get_historical(
"BTCUSDT", category="linear", interval="60",
start=1700000000000, end=1700003600000,
)
mock_http.get_kline.assert_called_once_with(
category="linear", symbol="BTCUSDT", interval="60",
start=1700000000000, end=1700003600000, limit=1000,
)
assert len(out["candles"]) == 2
c0 = out["candles"][0]
assert c0["timestamp"] == 1700000000000
assert c0["open"] == 60000.0
assert c0["high"] == 60500.0
assert c0["low"] == 59500.0
assert c0["close"] == 60200.0
assert c0["volume"] == 100.0
@pytest.mark.asyncio
async def test_get_indicators(client, mock_http):
rows = [
[str(1700000000000 + i * 3600_000),
str(60000 + i * 10), str(60000 + i * 10 + 5),
str(60000 + i * 10 - 5), str(60000 + i * 10 + 2),
"100", "6000000"]
for i in range(35)
]
mock_http.get_kline.return_value = {"retCode": 0, "result": {"list": rows}}
out = await client.get_indicators(
"BTCUSDT", category="linear",
indicators=["rsi", "atr", "macd", "adx"],
interval="60",
)
assert "rsi" in out and out["rsi"] is not None
assert "atr" in out and out["atr"] is not None
assert "macd" in out and out["macd"]["macd"] is not None
assert "adx" in out and out["adx"]["adx"] is not None
@pytest.mark.asyncio
async def test_get_funding_rate(client, mock_http):
mock_http.get_tickers.return_value = {
"retCode": 0,
"result": {"list": [{
"symbol": "BTCUSDT", "fundingRate": "0.0001",
"nextFundingTime": "1700003600000",
"lastPrice": "60000", "markPrice": "60000",
"bid1Price": "0", "ask1Price": "0",
"volume24h": "0", "turnover24h": "0", "openInterest": "0",
}]},
}
out = await client.get_funding_rate("BTCUSDT", category="linear")
assert out["symbol"] == "BTCUSDT"
assert out["funding_rate"] == 0.0001
assert out["next_funding_time"] == 1700003600000
@pytest.mark.asyncio
async def test_get_funding_history(client, mock_http):
mock_http.get_funding_rate_history.return_value = {
"retCode": 0,
"result": {"list": [
{"symbol": "BTCUSDT", "fundingRate": "0.0001", "fundingRateTimestamp": "1700000000000"},
{"symbol": "BTCUSDT", "fundingRate": "0.00008", "fundingRateTimestamp": "1699996400000"},
]},
}
out = await client.get_funding_history("BTCUSDT", category="linear", limit=50)
mock_http.get_funding_rate_history.assert_called_once_with(
category="linear", symbol="BTCUSDT", limit=50
)
assert len(out["history"]) == 2
assert out["history"][0]["rate"] == 0.0001
@pytest.mark.asyncio
async def test_get_open_interest(client, mock_http):
mock_http.get_open_interest.return_value = {
"retCode": 0,
"result": {"list": [
{"openInterest": "50000", "timestamp": "1700000000000"},
{"openInterest": "49000", "timestamp": "1699996400000"},
]},
}
out = await client.get_open_interest("BTCUSDT", category="linear", interval="5min", limit=100)
mock_http.get_open_interest.assert_called_once_with(
category="linear", symbol="BTCUSDT", intervalTime="5min", limit=100
)
assert len(out["points"]) == 2
assert out["current_oi"] == 50000.0
@pytest.mark.asyncio
async def test_get_instruments(client, mock_http):
mock_http.get_instruments_info.return_value = {
"retCode": 0,
"result": {"list": [
{"symbol": "BTCUSDT", "status": "Trading", "baseCoin": "BTC",
"quoteCoin": "USDT", "priceFilter": {"tickSize": "0.1"},
"lotSizeFilter": {"qtyStep": "0.001", "minOrderQty": "0.001"}},
]},
}
out = await client.get_instruments(category="linear")
mock_http.get_instruments_info.assert_called_once_with(category="linear")
assert len(out["instruments"]) == 1
inst = out["instruments"][0]
assert inst["symbol"] == "BTCUSDT"
assert inst["tick_size"] == 0.1
assert inst["qty_step"] == 0.001
@pytest.mark.asyncio
async def test_get_option_chain(client, mock_http):
mock_http.get_instruments_info.return_value = {
"retCode": 0,
"result": {"list": [
{"symbol": "BTC-30JUN25-50000-C", "baseCoin": "BTC",
"settleCoin": "USDC", "optionsType": "Call",
"launchTime": "1700000000000", "deliveryTime": "1719734400000"},
{"symbol": "BTC-30JUN25-50000-P", "baseCoin": "BTC",
"settleCoin": "USDC", "optionsType": "Put",
"launchTime": "1700000000000", "deliveryTime": "1719734400000"},
]},
}
out = await client.get_option_chain(base_coin="BTC")
mock_http.get_instruments_info.assert_called_once_with(category="option", baseCoin="BTC")
assert len(out["options"]) == 2
assert out["options"][0]["type"] == "Call"
@pytest.mark.asyncio
async def test_get_positions(client, mock_http):
mock_http.get_positions.return_value = {
"retCode": 0,
"result": {"list": [
{"symbol": "BTCUSDT", "side": "Buy", "size": "0.1",
"avgPrice": "60000", "unrealisedPnl": "50",
"leverage": "10", "liqPrice": "50000", "positionValue": "6000"},
]},
}
out = await client.get_positions(category="linear")
mock_http.get_positions.assert_called_once_with(category="linear", settleCoin="USDT")
assert len(out) == 1
p = out[0]
assert p["symbol"] == "BTCUSDT"
assert p["side"] == "Buy"
assert p["size"] == 0.1
assert p["entry_price"] == 60000.0
assert p["liquidation_price"] == 50000.0
@pytest.mark.asyncio
async def test_get_account_summary(client, mock_http):
mock_http.get_wallet_balance.return_value = {
"retCode": 0,
"result": {"list": [{
"accountType": "UNIFIED",
"totalEquity": "10000",
"totalWalletBalance": "9500",
"totalMarginBalance": "9800",
"totalAvailableBalance": "9000",
"totalPerpUPL": "200",
"coin": [
{"coin": "USDT", "walletBalance": "9500", "equity": "9700"}
],
}]},
}
out = await client.get_account_summary()
mock_http.get_wallet_balance.assert_called_once_with(accountType="UNIFIED")
assert out["equity"] == 10000.0
assert out["available_balance"] == 9000.0
assert out["unrealized_pnl"] == 200.0
assert len(out["coins"]) == 1
assert out["coins"][0]["coin"] == "USDT"
@pytest.mark.asyncio
async def test_get_trade_history(client, mock_http):
mock_http.get_executions.return_value = {
"retCode": 0,
"result": {"list": [
{"symbol": "BTCUSDT", "side": "Buy", "execQty": "0.01",
"execPrice": "60000", "execFee": "0.1",
"execTime": "1700000000000", "orderId": "abc"},
]},
}
out = await client.get_trade_history(category="linear", limit=50)
mock_http.get_executions.assert_called_once_with(category="linear", limit=50)
assert len(out) == 1
assert out[0]["symbol"] == "BTCUSDT"
assert out[0]["size"] == 0.01
assert out[0]["price"] == 60000.0
@pytest.mark.asyncio
async def test_get_open_orders(client, mock_http):
mock_http.get_open_orders.return_value = {
"retCode": 0,
"result": {"list": [
{"symbol": "BTCUSDT", "orderId": "o1", "side": "Buy",
"qty": "0.1", "price": "59000", "orderType": "Limit",
"orderStatus": "New", "reduceOnly": False},
]},
}
out = await client.get_open_orders(category="linear")
mock_http.get_open_orders.assert_called_once_with(category="linear", settleCoin="USDT")
assert len(out) == 1
assert out[0]["order_id"] == "o1"
assert out[0]["price"] == 59000.0
@pytest.mark.asyncio
async def test_get_basis_spot_perp(client, mock_http):
def side(**kwargs):
if kwargs["category"] == "spot":
return {"retCode": 0, "result": {"list": [{
"symbol": "BTCUSDT", "lastPrice": "60000", "markPrice": "60000",
"bid1Price": "59995", "ask1Price": "60005",
"volume24h": "0", "turnover24h": "0",
"fundingRate": "0", "openInterest": "0",
}]}}
else:
return {"retCode": 0, "result": {"list": [{
"symbol": "BTCUSDT", "lastPrice": "60120", "markPrice": "60120",
"bid1Price": "60115", "ask1Price": "60125",
"volume24h": "0", "turnover24h": "0",
"fundingRate": "0.0001", "openInterest": "0",
}]}}
mock_http.get_tickers.side_effect = side
out = await client.get_basis_spot_perp("BTC")
assert out["asset"] == "BTC"
assert out["spot_price"] == 60000.0
assert out["perp_price"] == 60120.0
assert out["basis_abs"] == 120.0
assert round(out["basis_pct"], 3) == 0.2
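The two assertions above encode the basis arithmetic directly; as a sketch (assumed formula, consistent with the mocked prices): the absolute basis is perp minus spot, and the percentage basis is that difference over spot, times 100.

```python
def basis(spot: float, perp: float) -> tuple[float, float]:
    # Absolute basis and percentage basis (perp premium over spot).
    diff = perp - spot
    return diff, 100.0 * diff / spot

# The same numbers as the mocked tickers: spot 60000, perp 60120.
abs_basis, pct_basis = basis(60000.0, 60120.0)
```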
@pytest.mark.asyncio
async def test_place_order_limit(client, mock_http):
mock_http.place_order.return_value = {
"retCode": 0,
"result": {"orderId": "ord123", "orderLinkId": ""},
}
out = await client.place_order(
category="linear", symbol="BTCUSDT", side="Buy",
qty=0.01, order_type="Limit", price=60000.0, tif="GTC",
)
assert out["order_id"] == "ord123"
kwargs = mock_http.place_order.call_args.kwargs
assert kwargs["category"] == "linear"
assert kwargs["symbol"] == "BTCUSDT"
assert kwargs["side"] == "Buy"
assert kwargs["qty"] == "0.01"
assert kwargs["orderType"] == "Limit"
assert kwargs["price"] == "60000.0"
assert kwargs["timeInForce"] == "GTC"
@pytest.mark.asyncio
async def test_place_order_error(client, mock_http):
mock_http.place_order.return_value = {"retCode": 10001, "retMsg": "insufficient balance"}
out = await client.place_order(
category="linear", symbol="BTCUSDT", side="Buy", qty=0.01, order_type="Market"
)
assert out.get("error") == "insufficient balance"
assert out.get("code") == 10001
@pytest.mark.asyncio
async def test_amend_order(client, mock_http):
mock_http.amend_order.return_value = {"retCode": 0, "result": {"orderId": "ord1"}}
out = await client.amend_order(
category="linear", symbol="BTCUSDT", order_id="ord1", new_qty=0.02
)
assert out["order_id"] == "ord1"
kwargs = mock_http.amend_order.call_args.kwargs
assert kwargs["orderId"] == "ord1"
assert kwargs["qty"] == "0.02"
assert "price" not in kwargs
@pytest.mark.asyncio
async def test_place_order_option_adds_link_id(client, mock_http):
mock_http.place_order.return_value = {
"retCode": 0,
"result": {"orderId": "opt1", "orderLinkId": "cerbero-abc"},
}
await client.place_order(
category="option", symbol="BTC-24APR26-96000-C-USDT",
side="Buy", qty=0.01, order_type="Limit", price=5.0,
)
kwargs = mock_http.place_order.call_args.kwargs
assert "orderLinkId" in kwargs
assert kwargs["orderLinkId"].startswith("cerbero-")
@pytest.mark.asyncio
async def test_place_order_linear_no_link_id(client, mock_http):
mock_http.place_order.return_value = {"retCode": 0, "result": {"orderId": "x"}}
await client.place_order(
category="linear", symbol="BTCUSDT", side="Buy", qty=0.01, order_type="Market"
)
kwargs = mock_http.place_order.call_args.kwargs
assert "orderLinkId" not in kwargs
@pytest.mark.asyncio
async def test_place_combo_order_batch_option(client, mock_http):
"""Combo order via place_batch_order on category=option (atomic, one round-trip)."""
mock_http.place_batch_order.return_value = {
"retCode": 0,
"result": {
"list": [
{"orderId": "ord-1", "orderLinkId": "cerbero-leg1"},
{"orderId": "ord-2", "orderLinkId": "cerbero-leg2"},
]
},
}
legs = [
{"symbol": "BTC-30APR26-75000-C-USDT", "side": "Buy", "qty": 0.01, "order_type": "Limit", "price": 5.0},
{"symbol": "BTC-30APR26-80000-C-USDT", "side": "Sell", "qty": 0.01, "order_type": "Limit", "price": 3.0},
]
out = await client.place_combo_order(category="option", legs=legs)
assert len(out["orders"]) == 2
assert out["orders"][0]["order_id"] == "ord-1"
kwargs = mock_http.place_batch_order.call_args.kwargs
assert kwargs["category"] == "option"
request = kwargs["request"]
assert len(request) == 2
assert request[0]["symbol"] == "BTC-30APR26-75000-C-USDT"
assert request[0]["qty"] == "0.01"
assert request[0]["orderType"] == "Limit"
# CER: orderLinkId is mandatory for option orders
assert "orderLinkId" in request[0]
@pytest.mark.asyncio
async def test_place_combo_order_error(client, mock_http):
mock_http.place_batch_order.return_value = {"retCode": 10001, "retMsg": "invalid leg"}
out = await client.place_combo_order(
category="option",
legs=[
{"symbol": "X", "side": "Buy", "qty": 1, "order_type": "Limit", "price": 1.0},
{"symbol": "Y", "side": "Sell", "qty": 1, "order_type": "Limit", "price": 1.0},
],
)
assert out["error"] == "invalid leg"
assert out["code"] == 10001
@pytest.mark.asyncio
async def test_place_combo_order_rejects_non_option(client, mock_http):
"""Bybit batch_order is only available for the option category."""
with pytest.raises(ValueError, match="option"):
await client.place_combo_order(
category="linear",
legs=[
{"symbol": "BTCUSDT", "side": "Buy", "qty": 0.01, "order_type": "Market"},
{"symbol": "ETHUSDT", "side": "Sell", "qty": 0.01, "order_type": "Market"},
],
)
@pytest.mark.asyncio
async def test_cancel_order(client, mock_http):
mock_http.cancel_order.return_value = {"retCode": 0, "result": {"orderId": "ord1"}}
out = await client.cancel_order(category="linear", symbol="BTCUSDT", order_id="ord1")
mock_http.cancel_order.assert_called_once_with(
category="linear", symbol="BTCUSDT", orderId="ord1"
)
assert out["order_id"] == "ord1"
assert out["status"] == "cancelled"
@pytest.mark.asyncio
async def test_cancel_all_orders(client, mock_http):
mock_http.cancel_all_orders.return_value = {
"retCode": 0,
"result": {"list": [{"orderId": "o1"}, {"orderId": "o2"}]},
}
out = await client.cancel_all_orders(category="linear", symbol="BTCUSDT")
mock_http.cancel_all_orders.assert_called_once_with(
category="linear", symbol="BTCUSDT"
)
assert out["cancelled_ids"] == ["o1", "o2"]
@pytest.mark.asyncio
async def test_set_stop_loss(client, mock_http):
mock_http.set_trading_stop.return_value = {"retCode": 0, "result": {}}
out = await client.set_stop_loss(
category="linear", symbol="BTCUSDT", stop_loss=55000.0
)
mock_http.set_trading_stop.assert_called_once()
kwargs = mock_http.set_trading_stop.call_args.kwargs
assert kwargs["category"] == "linear"
assert kwargs["symbol"] == "BTCUSDT"
assert kwargs["stopLoss"] == "55000.0"
assert kwargs.get("positionIdx", 0) == 0
assert out["status"] == "stop_loss_set"
@pytest.mark.asyncio
async def test_set_take_profit(client, mock_http):
mock_http.set_trading_stop.return_value = {"retCode": 0, "result": {}}
out = await client.set_take_profit(
category="linear", symbol="BTCUSDT", take_profit=65000.0
)
kwargs = mock_http.set_trading_stop.call_args.kwargs
assert kwargs["takeProfit"] == "65000.0"
assert out["status"] == "take_profit_set"
@pytest.mark.asyncio
async def test_close_position(client, mock_http):
mock_http.get_positions.return_value = {
"retCode": 0, "result": {"list": [
{"symbol": "BTCUSDT", "side": "Buy", "size": "0.1",
"avgPrice": "60000", "unrealisedPnl": "0",
"leverage": "10", "liqPrice": "0", "positionValue": "6000"},
]},
}
mock_http.place_order.return_value = {
"retCode": 0, "result": {"orderId": "closeord", "orderLinkId": ""},
}
out = await client.close_position(category="linear", symbol="BTCUSDT")
assert out["status"] == "submitted"
kwargs = mock_http.place_order.call_args.kwargs
assert kwargs["side"] == "Sell"
assert kwargs["qty"] == "0.1"
assert kwargs["reduceOnly"] is True
assert kwargs["orderType"] == "Market"
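The closing order asserted above inverts the position side; a sketch of that rule (an assumed helper for illustration, not part of the client API):

```python
def closing_side(position_side: str) -> str:
    # A long (Buy) position is flattened with a Sell, and vice versa;
    # the actual close is submitted reduce-only at market.
    return "Sell" if position_side == "Buy" else "Buy"
```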
@pytest.mark.asyncio
async def test_set_leverage(client, mock_http):
mock_http.set_leverage.return_value = {"retCode": 0, "result": {}}
out = await client.set_leverage(category="linear", symbol="BTCUSDT", leverage=5)
mock_http.set_leverage.assert_called_once_with(
category="linear", symbol="BTCUSDT", buyLeverage="5", sellLeverage="5"
)
assert out["status"] == "leverage_set"
@pytest.mark.asyncio
async def test_switch_position_mode(client, mock_http):
mock_http.switch_position_mode.return_value = {"retCode": 0, "result": {}}
out = await client.switch_position_mode(
category="linear", symbol="BTCUSDT", mode="hedge"
)
kwargs = mock_http.switch_position_mode.call_args.kwargs
assert kwargs["mode"] == 3
assert out["status"] == "mode_switched"
@pytest.mark.asyncio
async def test_transfer_asset(client, mock_http):
mock_http.create_internal_transfer.return_value = {
"retCode": 0, "result": {"transferId": "tx123"},
}
out = await client.transfer_asset(
coin="USDT", amount=100.0, from_type="UNIFIED", to_type="FUND"
)
kwargs = mock_http.create_internal_transfer.call_args.kwargs
assert kwargs["coin"] == "USDT"
assert kwargs["amount"] == "100.0"
assert kwargs["fromAccountType"] == "UNIFIED"
assert kwargs["toAccountType"] == "FUND"
assert out["transfer_id"] == "tx123"
@@ -1,54 +0,0 @@
from __future__ import annotations
from unittest.mock import AsyncMock, MagicMock
from fastapi.testclient import TestClient
from mcp_bybit.server import create_app
from mcp_common.auth import Principal, TokenStore
from mcp_common.environment import EnvironmentInfo
def _make_app(env_info, creds):
c = MagicMock()
c.testnet = True
c.set_leverage = AsyncMock(return_value={"state": "ok"})
store = TokenStore(tokens={
"ct": Principal("core", {"core"}),
"ot": Principal("observer", {"observer"}),
})
return create_app(client=c, token_store=store, creds=creds, env_info=env_info)
def test_environment_info_full_shape():
env = EnvironmentInfo(
exchange="bybit",
environment="testnet",
source="env",
env_value="true",
base_url="https://api-testnet.bybit.com",
)
app = _make_app(env, creds={"max_leverage": 3})
c = TestClient(app)
r = c.post(
"/tools/environment_info",
headers={"Authorization": "Bearer ot"},
)
assert r.status_code == 200
body = r.json()
assert body["exchange"] == "bybit"
assert body["environment"] == "testnet"
assert body["source"] == "env"
assert body["env_value"] == "true"
assert body["base_url"] == "https://api-testnet.bybit.com"
assert body["max_leverage"] == 3
def test_environment_info_requires_auth():
env = EnvironmentInfo(
exchange="bybit", environment="testnet", source="default",
env_value=None, base_url="https://api-testnet.bybit.com",
)
app = _make_app(env, creds={"max_leverage": 3})
c = TestClient(app)
r = c.post("/tools/environment_info")
assert r.status_code == 401
@@ -1,150 +0,0 @@
from __future__ import annotations
from unittest.mock import AsyncMock, MagicMock
import pytest
from fastapi.testclient import TestClient
from mcp_bybit.server import create_app
from mcp_common.auth import Principal, TokenStore
@pytest.fixture
def token_store():
return TokenStore(
tokens={
"core-tok": Principal("core", {"core"}),
"obs-tok": Principal("observer", {"observer"}),
}
)
@pytest.fixture
def mock_client():
c = MagicMock()
c.get_ticker = AsyncMock(return_value={"symbol": "BTCUSDT"})
c.get_ticker_batch = AsyncMock(return_value={"BTCUSDT": {}})
c.get_orderbook = AsyncMock(return_value={"bids": [], "asks": []})
c.get_historical = AsyncMock(return_value={"candles": []})
c.get_indicators = AsyncMock(return_value={"rsi": 50.0})
c.get_funding_rate = AsyncMock(return_value={"funding_rate": 0.0001})
c.get_funding_history = AsyncMock(return_value={"history": []})
c.get_open_interest = AsyncMock(return_value={"points": []})
c.get_instruments = AsyncMock(return_value={"instruments": []})
c.get_option_chain = AsyncMock(return_value={"options": []})
c.get_positions = AsyncMock(return_value=[])
c.get_account_summary = AsyncMock(return_value={"equity": 0})
c.get_trade_history = AsyncMock(return_value=[])
c.get_open_orders = AsyncMock(return_value=[])
c.get_basis_spot_perp = AsyncMock(return_value={"basis_pct": 0})
c.place_order = AsyncMock(return_value={"order_id": "x"})
c.amend_order = AsyncMock(return_value={"order_id": "x"})
c.cancel_order = AsyncMock(return_value={"status": "cancelled"})
c.cancel_all_orders = AsyncMock(return_value={"cancelled_ids": []})
c.set_stop_loss = AsyncMock(return_value={"status": "stop_loss_set"})
c.set_take_profit = AsyncMock(return_value={"status": "take_profit_set"})
c.close_position = AsyncMock(return_value={"status": "submitted"})
c.set_leverage = AsyncMock(return_value={"status": "leverage_set"})
c.switch_position_mode = AsyncMock(return_value={"status": "mode_switched"})
c.transfer_asset = AsyncMock(return_value={"transfer_id": "tx"})
c.place_combo_order = AsyncMock(return_value={"orders": [{"order_id": "ord-1"}, {"order_id": "ord-2"}]})
c.get_orderbook_imbalance = AsyncMock(return_value={"imbalance_ratio": 0.0, "microprice": 100.0})
c.get_basis_term_structure = AsyncMock(return_value={"asset": "BTC", "term_structure": []})
return c
@pytest.fixture
def http(mock_client, token_store):
app = create_app(client=mock_client, token_store=token_store, creds={"max_leverage": 5})
return TestClient(app)
CORE = {"Authorization": "Bearer core-tok"}
OBS = {"Authorization": "Bearer obs-tok"}
READ_ENDPOINTS = [
("/tools/get_ticker", {"symbol": "BTCUSDT"}),
("/tools/get_ticker_batch", {"symbols": ["BTCUSDT"]}),
("/tools/get_orderbook", {"symbol": "BTCUSDT"}),
("/tools/get_historical", {"symbol": "BTCUSDT"}),
("/tools/get_indicators", {"symbol": "BTCUSDT"}),
("/tools/get_funding_rate", {"symbol": "BTCUSDT"}),
("/tools/get_funding_history", {"symbol": "BTCUSDT"}),
("/tools/get_open_interest", {"symbol": "BTCUSDT"}),
("/tools/get_instruments", {}),
("/tools/get_option_chain", {"base_coin": "BTC"}),
("/tools/get_positions", {}),
("/tools/get_account_summary", {}),
("/tools/get_trade_history", {}),
("/tools/get_open_orders", {}),
("/tools/get_basis_spot_perp", {"asset": "BTC"}),
("/tools/get_orderbook_imbalance", {"symbol": "BTCUSDT"}),
("/tools/get_basis_term_structure", {"asset": "BTC"}),
]
WRITE_ENDPOINTS = [
("/tools/place_order", {"category": "linear", "symbol": "BTCUSDT", "side": "Buy", "qty": 0.01}),
("/tools/amend_order", {"category": "linear", "symbol": "BTCUSDT", "order_id": "o1"}),
("/tools/cancel_order", {"category": "linear", "symbol": "BTCUSDT", "order_id": "o1"}),
("/tools/cancel_all_orders", {"category": "linear"}),
("/tools/set_stop_loss", {"category": "linear", "symbol": "BTCUSDT", "stop_loss": 55000}),
("/tools/set_take_profit", {"category": "linear", "symbol": "BTCUSDT", "take_profit": 65000}),
("/tools/close_position", {"category": "linear", "symbol": "BTCUSDT"}),
("/tools/set_leverage", {"category": "linear", "symbol": "BTCUSDT", "leverage": 5}),
("/tools/switch_position_mode", {"category": "linear", "symbol": "BTCUSDT", "mode": "hedge"}),
("/tools/transfer_asset", {"coin": "USDT", "amount": 10.0, "from_type": "UNIFIED", "to_type": "FUND"}),
("/tools/place_combo_order", {
"category": "option",
"legs": [
{"symbol": "BTC-30APR26-75000-C-USDT", "side": "Buy", "qty": 0.01, "order_type": "Limit", "price": 5.0},
{"symbol": "BTC-30APR26-80000-C-USDT", "side": "Sell", "qty": 0.01, "order_type": "Limit", "price": 3.0},
],
}),
]
def test_place_combo_order_min_legs(http):
r = http.post(
"/tools/place_combo_order",
json={
"category": "option",
"legs": [{"symbol": "X", "side": "Buy", "qty": 1, "order_type": "Limit", "price": 1.0}],
},
headers=CORE,
)
assert r.status_code == 422
@pytest.mark.parametrize("path,payload", READ_ENDPOINTS)
def test_read_core_ok(http, path, payload):
r = http.post(path, json=payload, headers=CORE)
assert r.status_code == 200, (path, r.text)
@pytest.mark.parametrize("path,payload", READ_ENDPOINTS)
def test_read_observer_ok(http, path, payload):
r = http.post(path, json=payload, headers=OBS)
assert r.status_code == 200, (path, r.text)
@pytest.mark.parametrize("path,payload", READ_ENDPOINTS)
def test_read_no_auth_401(http, path, payload):
r = http.post(path, json=payload)
assert r.status_code == 401, (path, r.text)
@pytest.mark.parametrize("path,payload", WRITE_ENDPOINTS)
def test_write_core_ok(http, path, payload):
r = http.post(path, json=payload, headers=CORE)
assert r.status_code == 200, (path, r.text)
@pytest.mark.parametrize("path,payload", WRITE_ENDPOINTS)
def test_write_observer_403(http, path, payload):
r = http.post(path, json=payload, headers=OBS)
assert r.status_code == 403, (path, r.text)
@pytest.mark.parametrize("path,payload", WRITE_ENDPOINTS)
def test_write_no_auth_401(http, path, payload):
r = http.post(path, json=payload)
assert r.status_code == 401, (path, r.text)
@@ -1,27 +0,0 @@
[project]
name = "mcp-deribit"
version = "0.1.0"
requires-python = ">=3.11"
dependencies = [
"mcp-common",
"fastapi>=0.115",
"uvicorn[standard]>=0.30",
"httpx>=0.27",
"pydantic>=2.6",
]
[project.optional-dependencies]
dev = ["pytest>=8", "pytest-asyncio>=0.23", "pytest-httpx>=0.30"]
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
[tool.hatch.build.targets.wheel]
packages = ["src/mcp_deribit"]
[tool.uv.sources]
mcp-common = { workspace = true }
[project.scripts]
mcp-deribit = "mcp_deribit.__main__:main"
@@ -1,30 +0,0 @@
from __future__ import annotations
from mcp_common.app_factory import ExchangeAppSpec, run_exchange_main
from mcp_deribit.client import DeribitClient
from mcp_deribit.server import create_app
SPEC = ExchangeAppSpec(
exchange="deribit",
creds_env_var="CREDENTIALS_FILE",
env_var="DERIBIT_TESTNET",
flag_key="testnet",
default_base_url_live="https://www.deribit.com/api/v2",
default_base_url_testnet="https://test.deribit.com/api/v2",
default_port=9011,
build_client=lambda creds, env_info: DeribitClient(
client_id=creds["client_id"],
client_secret=creds["client_secret"],
testnet=(env_info.environment == "testnet"),
),
build_app=create_app,
)
def main():
run_exchange_main(SPEC)
if __name__ == "__main__":
main()
@@ -1,18 +0,0 @@
"""Re-export shim per backward-compat: la logica vive ora in
mcp_common.env_validation. Non aggiungere nuovo codice qui.
"""
from mcp_common.env_validation import (
MissingEnvError,
fail_fast_if_missing,
optional_env,
require_env,
summarize,
)
__all__ = [
"MissingEnvError",
"fail_fast_if_missing",
"optional_env",
"require_env",
"summarize",
]
@@ -1,695 +0,0 @@
from __future__ import annotations
import contextlib
import os
from fastapi import Depends, FastAPI, HTTPException
from mcp_common.audit import audit_write_op
from mcp_common.auth import Principal, TokenStore, require_principal
from mcp_common.environment import EnvironmentInfo
from mcp_common.mcp_bridge import mount_mcp_endpoint
from mcp_common.server import build_app
from pydantic import BaseModel, field_validator, model_validator
from mcp_deribit.client import DeribitClient
from mcp_deribit.leverage_cap import enforce_leverage as _enforce_leverage
from mcp_deribit.leverage_cap import get_max_leverage
# --- Body models ---
class GetTickerReq(BaseModel):
instrument_name: str | None = None
instrument: str | None = None
model_config = {"extra": "allow"}
@model_validator(mode="after")
def _normalize(self):
sym = self.instrument_name or self.instrument
if not sym:
raise ValueError("instrument_name (or instrument) is required")
self.instrument_name = sym
return self
class GetTickerBatchReq(BaseModel):
instrument_names: list[str] | None = None
instruments: list[str] | None = None
model_config = {"extra": "allow"}
@model_validator(mode="after")
def _normalize(self):
names = self.instrument_names or self.instruments
if not names:
raise ValueError("instrument_names (or instruments) is required")
self.instrument_names = names
return self
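The two request models above accept either the canonical field or its alias and normalize onto the canonical one. The same normalization, sketched outside Pydantic for illustration (function name hypothetical):

```python
def normalize_instrument(body: dict) -> str:
    """Accept either 'instrument_name' or the 'instrument' alias."""
    sym = body.get("instrument_name") or body.get("instrument")
    if not sym:
        raise ValueError("instrument_name (or instrument) is required")
    return sym

assert normalize_instrument({"instrument_name": "BTC-PERPETUAL"}) == "BTC-PERPETUAL"
assert normalize_instrument({"instrument": "ETH-PERPETUAL"}) == "ETH-PERPETUAL"
```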
class GetInstrumentsReq(BaseModel):
currency: str
kind: str | None = None
expiry_from: str | None = None
expiry_to: str | None = None
strike_min: float | None = None
strike_max: float | None = None
min_open_interest: float | None = None
limit: int = 100
offset: int = 0
class GetOrderbookReq(BaseModel):
instrument_name: str
depth: int = 10
class OrderbookImbalanceReq(BaseModel):
instrument_name: str
depth: int = 10
class GetPositionsReq(BaseModel):
currency: str = "USDC"
class GetAccountSummaryReq(BaseModel):
currency: str = "USDC"
class GetTradeHistoryReq(BaseModel):
limit: int = 100
instrument_name: str | None = None
class GetHistoricalReq(BaseModel):
instrument: str
start_date: str
end_date: str
resolution: str = "1h"
class GetDvolReq(BaseModel):
currency: str = "BTC"
start_date: str
end_date: str
resolution: str = "1D"
class GetDvolHistoryReq(BaseModel):
currency: str = "BTC"
lookback_days: int = 90
class GetIvRankReq(BaseModel):
instrument: str
class GetRealizedVolReq(BaseModel):
currency: str = "BTC"
windows: list[int] = [14, 30]
class GetGexReq(BaseModel):
currency: str
expiry_from: str | None = None
expiry_to: str | None = None
top_n_strikes: int = 50
class OptionFlowReq(BaseModel):
"""Body comune per indicatori option-flow (dealer gamma, vanna/charm,
OI-weighted skew, smile asymmetry, ATM vs wings)."""
currency: str
expiry_from: str | None = None
expiry_to: str | None = None
top_n_strikes: int = 100
class GetPcRatioReq(BaseModel):
currency: str
class GetSkew25dReq(BaseModel):
currency: str
expiry: str
class GetTermStructureReq(BaseModel):
currency: str
class CalculateSpreadPayoffReq(BaseModel):
legs: list[dict]
quote_currency: str = "USD"
class RunBacktestReq(BaseModel):
strategy_name: str
underlying: str = "BTC"
lookback_days: int = 30
resolution: str = "4h"
entry_rules: dict | None = None
exit_rules: dict | None = None
class FindByDeltaReq(BaseModel):
currency: str
expiry: str
target_delta: float
option_type: str
max_results: int = 3
min_open_interest: float = 100.0
min_volume_24h: float = 20.0
class GetIndicatorsReq(BaseModel):
instrument: str
indicators: list[str]
start_date: str
end_date: str
resolution: str = "1h"
@field_validator("indicators", mode="before")
@classmethod
def _coerce_indicators(cls, v):
if isinstance(v, str):
import json
s = v.strip()
if s.startswith("["):
try:
parsed = json.loads(s)
if isinstance(parsed, list):
return [str(x).strip() for x in parsed if str(x).strip()]
except json.JSONDecodeError:
pass
return [x.strip() for x in s.split(",") if x.strip()]
if isinstance(v, list):
return v
raise ValueError(
"indicators must be a list like ['rsi','atr','macd'] "
"or a comma-separated string like 'rsi,atr,macd'"
)
class PlaceOrderReq(BaseModel):
instrument_name: str
side: str # "buy" | "sell"
amount: float
type: str = "limit"
price: float | None = None
reduce_only: bool = False
post_only: bool = False
label: str | None = None
leverage: int | None = None # CER-016: None → default cap (3x)
class ComboLeg(BaseModel):
instrument_name: str
direction: str # "buy" | "sell"
ratio: int = 1
class PlaceComboOrderReq(BaseModel):
legs: list[ComboLeg]
side: str # "buy" | "sell"
amount: float
type: str = "limit"
price: float | None = None
label: str | None = None
leverage: int | None = None
@model_validator(mode="after")
def _at_least_two_legs(self):
if len(self.legs) < 2:
raise ValueError("combo requires at least 2 legs")
return self
class CancelOrderReq(BaseModel):
order_id: str
class SetStopLossReq(BaseModel):
order_id: str
stop_price: float
class SetTakeProfitReq(BaseModel):
order_id: str
tp_price: float
class ClosePositionReq(BaseModel):
instrument_name: str
# --- ACL helper ---
def _check(principal: Principal, *, core: bool = False, observer: bool = False) -> None:
allowed: set[str] = set()
if core:
allowed.add("core")
if observer:
allowed.add("observer")
if not (principal.capabilities & allowed):
raise HTTPException(403, f"capability required: {allowed}")
# --- App factory ---
def create_app(
*,
client: DeribitClient,
token_store: TokenStore,
creds: dict,
env_info: EnvironmentInfo | None = None,
) -> FastAPI:
from contextlib import asynccontextmanager
cap_default = get_max_leverage(creds)
# CER-016: pre-set the leverage cap on the main perps at boot (best-effort).
@asynccontextmanager
async def _lifespan(_app: FastAPI):
for inst in ("BTC-PERPETUAL", "ETH-PERPETUAL"):
with contextlib.suppress(Exception):
await client.set_leverage(inst, cap_default)
yield
app = build_app(
name="mcp-deribit",
version="0.1.0",
token_store=token_store,
lifespan=_lifespan,
)
# --- Read tools: core + observer ---
@app.post("/tools/is_testnet", tags=["reads"])
async def t_is_testnet(principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
return client.is_testnet()
@app.post("/tools/environment_info", tags=["reads"])
async def t_environment_info(principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
if env_info is None:
return {
"exchange": "deribit",
"environment": "testnet" if client.is_testnet().get("testnet") else "mainnet",
"source": "credentials",
"env_value": None,
"base_url": client.base_url,
"max_leverage": get_max_leverage(creds),
}
return {
"exchange": env_info.exchange,
"environment": env_info.environment,
"source": env_info.source,
"env_value": env_info.env_value,
"base_url": env_info.base_url,
"max_leverage": get_max_leverage(creds),
}
@app.post("/tools/get_ticker", tags=["reads"])
async def t_get_ticker(
body: GetTickerReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_ticker(body.instrument_name)
@app.post("/tools/get_ticker_batch", tags=["reads"])
async def t_get_ticker_batch(
body: GetTickerBatchReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_ticker_batch(body.instrument_names)
@app.post("/tools/get_instruments", tags=["reads"])
async def t_get_instruments(
body: GetInstrumentsReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_instruments(
currency=body.currency,
kind=body.kind,
expiry_from=body.expiry_from,
expiry_to=body.expiry_to,
strike_min=body.strike_min,
strike_max=body.strike_max,
min_open_interest=body.min_open_interest,
limit=body.limit,
offset=body.offset,
)
@app.post("/tools/get_orderbook", tags=["reads"])
async def t_get_orderbook(
body: GetOrderbookReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_orderbook(body.instrument_name, body.depth)
@app.post("/tools/get_orderbook_imbalance", tags=["reads"])
async def t_get_ob_imbalance(
body: OrderbookImbalanceReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_orderbook_imbalance(body.instrument_name, body.depth)
@app.post("/tools/get_positions", tags=["reads"])
async def t_get_positions(
body: GetPositionsReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_positions(body.currency)
@app.post("/tools/get_account_summary", tags=["reads"])
async def t_get_account_summary(
body: GetAccountSummaryReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_account_summary(body.currency)
@app.post("/tools/get_trade_history", tags=["reads"])
async def t_get_trade_history(
body: GetTradeHistoryReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_trade_history(body.limit, body.instrument_name)
@app.post("/tools/get_historical", tags=["reads"])
async def t_get_historical(
body: GetHistoricalReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_historical(
body.instrument, body.start_date, body.end_date, body.resolution
)
@app.post("/tools/get_dvol", tags=["reads"])
async def t_get_dvol(
body: GetDvolReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_dvol(
body.currency, body.start_date, body.end_date, body.resolution
)
@app.post("/tools/get_gex", tags=["reads"])
async def t_get_gex(
body: GetGexReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_gex(
body.currency, body.expiry_from, body.expiry_to, body.top_n_strikes
)
@app.post("/tools/get_dealer_gamma_profile", tags=["reads"])
async def t_get_dealer_gamma_profile(
body: OptionFlowReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_dealer_gamma_profile(
body.currency, body.expiry_from, body.expiry_to, body.top_n_strikes
)
@app.post("/tools/get_vanna_charm", tags=["reads"])
async def t_get_vanna_charm(
body: OptionFlowReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_vanna_charm(
body.currency, body.expiry_from, body.expiry_to, body.top_n_strikes
)
@app.post("/tools/get_oi_weighted_skew", tags=["reads"])
async def t_get_oi_weighted_skew(
body: OptionFlowReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_oi_weighted_skew(
body.currency, body.expiry_from, body.expiry_to, body.top_n_strikes
)
@app.post("/tools/get_smile_asymmetry", tags=["reads"])
async def t_get_smile_asymmetry(
body: OptionFlowReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_smile_asymmetry(
body.currency, body.expiry_from, body.expiry_to, body.top_n_strikes
)
@app.post("/tools/get_atm_vs_wings_vol", tags=["reads"])
async def t_get_atm_vs_wings_vol(
body: OptionFlowReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_atm_vs_wings_vol(
body.currency, body.expiry_from, body.expiry_to, body.top_n_strikes
)
@app.post("/tools/get_pc_ratio", tags=["reads"])
async def t_get_pc_ratio(
body: GetPcRatioReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_pc_ratio(body.currency)
@app.post("/tools/get_skew_25d", tags=["reads"])
async def t_get_skew_25d(
body: GetSkew25dReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_skew_25d(body.currency, body.expiry)
@app.post("/tools/get_term_structure", tags=["reads"])
async def t_get_term_structure(
body: GetTermStructureReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_term_structure(body.currency)
@app.post("/tools/run_backtest", tags=["writes"])
async def t_run_backtest(
body: RunBacktestReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.run_backtest(
strategy_name=body.strategy_name,
underlying=body.underlying,
lookback_days=body.lookback_days,
resolution=body.resolution,
entry_rules=body.entry_rules,
exit_rules=body.exit_rules,
)
@app.post("/tools/calculate_spread_payoff", tags=["writes"])
async def t_calculate_spread_payoff(
body: CalculateSpreadPayoffReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.calculate_spread_payoff(body.legs, body.quote_currency)
@app.post("/tools/find_by_delta", tags=["writes"])
async def t_find_by_delta(
body: FindByDeltaReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.find_by_delta(
currency=body.currency,
expiry=body.expiry,
target_delta=body.target_delta,
option_type=body.option_type,
max_results=body.max_results,
min_open_interest=body.min_open_interest,
min_volume_24h=body.min_volume_24h,
)
@app.post("/tools/get_iv_rank", tags=["reads"])
async def t_get_iv_rank(
body: GetIvRankReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_iv_rank(body.instrument)
@app.post("/tools/get_dvol_history", tags=["reads"])
async def t_get_dvol_history(
body: GetDvolHistoryReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_dvol_history(body.currency, body.lookback_days)
@app.post("/tools/get_realized_vol", tags=["reads"])
async def t_get_realized_vol(
body: GetRealizedVolReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_realized_vol(body.currency, body.windows)
@app.post("/tools/get_technical_indicators", tags=["reads"])
async def t_get_indicators(
body: GetIndicatorsReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_technical_indicators(
body.instrument,
body.indicators,
body.start_date,
body.end_date,
body.resolution,
)
# --- Write tools: core only ---
@app.post("/tools/place_order", tags=["writes"])
async def t_place_order(
body: PlaceOrderReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True)
lev = _enforce_leverage(body.leverage, creds=creds, exchange="deribit")
if lev != cap_default:
with contextlib.suppress(Exception):
await client.set_leverage(body.instrument_name, lev)
result = await client.place_order(
instrument_name=body.instrument_name,
side=body.side,
amount=body.amount,
type=body.type,
price=body.price,
reduce_only=body.reduce_only,
post_only=body.post_only,
label=body.label,
)
audit_write_op(
principal=principal, action="place_order", exchange="deribit",
target=body.instrument_name,
payload={"side": body.side, "amount": body.amount, "type": body.type,
"price": body.price, "leverage": lev, "label": body.label},
result=result,
)
return result
@app.post("/tools/place_combo_order", tags=["writes"])
async def t_place_combo_order(
body: PlaceComboOrderReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True)
lev = _enforce_leverage(body.leverage, creds=creds, exchange="deribit")
if lev != cap_default:
for leg in body.legs:
with contextlib.suppress(Exception):
await client.set_leverage(leg.instrument_name, lev)
result = await client.place_combo_order(
legs=[leg.model_dump() for leg in body.legs],
side=body.side,
amount=body.amount,
type=body.type,
price=body.price,
label=body.label,
)
audit_write_op(
principal=principal, action="place_combo_order", exchange="deribit",
target=result.get("combo_instrument") if isinstance(result, dict) else None,
payload={"legs": [leg.model_dump() for leg in body.legs],
"side": body.side, "amount": body.amount, "leverage": lev},
result=result if isinstance(result, dict) else None,
)
return result
@app.post("/tools/cancel_order", tags=["writes"])
async def t_cancel_order(
body: CancelOrderReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True)
result = await client.cancel_order(body.order_id)
audit_write_op(
principal=principal, action="cancel_order", exchange="deribit",
target=body.order_id, payload={}, result=result,
)
return result
@app.post("/tools/set_stop_loss", tags=["writes"])
async def t_set_sl(
body: SetStopLossReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True)
result = await client.set_stop_loss(body.order_id, body.stop_price)
audit_write_op(
principal=principal, action="set_stop_loss", exchange="deribit",
target=body.order_id, payload={"stop_price": body.stop_price}, result=result,
)
return result
@app.post("/tools/set_take_profit", tags=["writes"])
async def t_set_tp(
body: SetTakeProfitReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True)
result = await client.set_take_profit(body.order_id, body.tp_price)
audit_write_op(
principal=principal, action="set_take_profit", exchange="deribit",
target=body.order_id, payload={"tp_price": body.tp_price}, result=result,
)
return result
@app.post("/tools/close_position", tags=["writes"])
async def t_close_position(
body: ClosePositionReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True)
result = await client.close_position(body.instrument_name)
audit_write_op(
principal=principal, action="close_position", exchange="deribit",
target=body.instrument_name, payload={}, result=result,
)
return result
# ───── MCP endpoint (/mcp): bridge to /tools/* ─────
port = int(os.environ.get("PORT", "9011"))
mount_mcp_endpoint(
app,
name="cerbero-deribit",
version="0.1.0",
token_store=token_store,
internal_base_url=f"http://localhost:{port}",
tools=[
{"name": "is_testnet", "description": "True se client Deribit è in modalità testnet."},
{"name": "environment_info", "description": "Ambiente operativo (testnet/mainnet), source, base_url, max_leverage cap."},
{"name": "get_ticker", "description": "Ticker di un instrument Deribit."},
{"name": "get_ticker_batch", "description": "Ticker per N instruments in parallelo (max 20)."},
{"name": "get_instruments", "description": "Lista instruments per currency."},
{"name": "get_orderbook", "description": "Orderbook L1/L2 per instrument."},
{"name": "get_orderbook_imbalance", "description": "Microstructure: imbalance ratio + microprice + slope."},
{"name": "get_positions", "description": "Posizioni aperte."},
{"name": "get_account_summary", "description": "Summary account (equity, balance)."},
{"name": "get_trade_history", "description": "Storia trade recenti."},
{"name": "get_historical", "description": "OHLCV storico."},
{"name": "get_dvol", "description": "Deribit Volatility Index (DVOL) OHLC per currency (BTC/ETH)."},
{"name": "get_dvol_history", "description": "DVOL time series + percentili su lookback_days."},
{"name": "get_iv_rank", "description": "IV rank 30/90/365d di un instrument vs DVOL storico della currency."},
{"name": "find_by_delta", "description": "Trova strike con delta più vicino a target, filtrato per liquidità (OI/vol)."},
{"name": "calculate_spread_payoff", "description": "Payoff/greci/max P-L/break-even/fee per struttura multi-leg."},
{"name": "run_backtest", "description": "Heuristic backtest RSI-based su storia OHLCV per threshold accept/marginal/reject."},
{"name": "get_term_structure", "description": "IV ATM per ogni expiry disponibile, detect contango/backwardation."},
{"name": "get_skew_25d", "description": "Skew 25-delta put/call IV + risk reversal + butterfly per expiry."},
{"name": "get_pc_ratio", "description": "Put/Call ratio aggregato su OI e volume 24h."},
{"name": "get_gex", "description": "Gamma exposure per strike + zero gamma level (top N strikes per OI)."},
{"name": "get_dealer_gamma_profile", "description": "Net dealer gamma per strike (short calls/long puts) + gamma flip level."},
{"name": "get_vanna_charm", "description": "Vanna (∂delta/∂IV) e Charm (∂delta/∂t) aggregati pesati OI."},
{"name": "get_oi_weighted_skew", "description": "Skew aggregato pesato per OI: IV puts - IV calls. Positivo = paura."},
{"name": "get_smile_asymmetry", "description": "Asymmetry IV otm-puts vs otm-calls + ATM IV reference."},
{"name": "get_atm_vs_wings_vol", "description": "IV ATM vs IV ali 25-delta. wing_richness > 0 = smile/kurtosis."},
{"name": "get_technical_indicators", "description": "Indicatori tecnici (RSI, MACD, ATR, ADX)."},
{"name": "get_realized_vol", "description": "Volatilità realizzata annualizzata (log-return std) BTC/ETH + spread IVRV."},
{"name": "place_order", "description": "Invia ordine (CORE only, testnet)."},
{"name": "place_combo_order", "description": "Crea combo via private/create_combo + piazza ordine sul combo (1 cross spread invece di N)."},
{"name": "cancel_order", "description": "Cancella ordine."},
{"name": "set_stop_loss", "description": "Setta stop loss su posizione."},
{"name": "set_take_profit", "description": "Setta take profit su posizione."},
{"name": "close_position", "description": "Chiude posizione aperta."},
],
)
return app
@@ -1,71 +0,0 @@
"""CER-P5-010 env validation tests."""
from __future__ import annotations
import pytest
from mcp_deribit.env_validation import (
MissingEnvError,
fail_fast_if_missing,
optional_env,
require_env,
summarize,
)
def test_require_env_present(monkeypatch):
monkeypatch.setenv("FOO_KEY", "value1")
assert require_env("FOO_KEY") == "value1"
def test_require_env_missing_raises(monkeypatch):
monkeypatch.delenv("MISSING_REQ", raising=False)
with pytest.raises(MissingEnvError):
require_env("MISSING_REQ", "critical path")
def test_require_env_empty_raises(monkeypatch):
monkeypatch.setenv("EMPTY_REQ", "")
with pytest.raises(MissingEnvError):
require_env("EMPTY_REQ")
def test_require_env_whitespace_only_raises(monkeypatch):
monkeypatch.setenv("WS_REQ", " ")
with pytest.raises(MissingEnvError):
require_env("WS_REQ")
def test_optional_env_default(monkeypatch):
monkeypatch.delenv("OPT_A", raising=False)
assert optional_env("OPT_A", default="fallback") == "fallback"
def test_optional_env_set(monkeypatch):
monkeypatch.setenv("OPT_B", "xx")
assert optional_env("OPT_B", default="fallback") == "xx"
def test_fail_fast_all_present(monkeypatch):
monkeypatch.setenv("AA", "1")
monkeypatch.setenv("BB", "2")
fail_fast_if_missing(["AA", "BB"]) # no exit
def test_fail_fast_missing_exits(monkeypatch):
monkeypatch.setenv("HAVE_IT", "1")
monkeypatch.delenv("MISSING_X", raising=False)
with pytest.raises(SystemExit) as exc:
fail_fast_if_missing(["HAVE_IT", "MISSING_X"])
assert exc.value.code == 2
def test_summarize_does_not_leak_secrets(monkeypatch, caplog):
import logging
monkeypatch.setenv("API_KEY_FOO", "super-secret-token-123456")
monkeypatch.setenv("PORT", "9000")
with caplog.at_level(logging.INFO, logger="mcp_deribit.env_validation"):
summarize(["API_KEY_FOO", "PORT", "NOT_SET_XYZ"])
log_text = "\n".join(caplog.messages)
assert "super-secret-token-123456" not in log_text
assert "9000" in log_text
assert "<unset>" in log_text
@@ -1,77 +0,0 @@
from __future__ import annotations
from unittest.mock import AsyncMock
from fastapi.testclient import TestClient
from mcp_common.auth import Principal, TokenStore
from mcp_common.environment import EnvironmentInfo
from mcp_deribit.server import create_app
def _make_app(env_info, creds):
c = AsyncMock()
c.set_leverage = AsyncMock(return_value={"state": "ok"})
store = TokenStore(tokens={
"ct": Principal("core", {"core"}),
"ot": Principal("observer", {"observer"}),
})
return create_app(client=c, token_store=store, creds=creds, env_info=env_info)
def test_environment_info_full_shape():
env = EnvironmentInfo(
exchange="deribit",
environment="testnet",
source="env",
env_value="true",
base_url="https://test.deribit.com/api/v2",
)
app = _make_app(env, creds={"max_leverage": 3})
c = TestClient(app)
r = c.post(
"/tools/environment_info",
headers={"Authorization": "Bearer ot"},
)
assert r.status_code == 200
body = r.json()
assert body["exchange"] == "deribit"
assert body["environment"] == "testnet"
assert body["source"] == "env"
assert body["env_value"] == "true"
assert body["base_url"] == "https://test.deribit.com/api/v2"
assert body["max_leverage"] == 3
def test_environment_info_default_source():
env = EnvironmentInfo(
exchange="deribit",
environment="testnet",
source="default",
env_value=None,
base_url="https://test.deribit.com/api/v2",
)
app = _make_app(env, creds={"max_leverage": 1})
c = TestClient(app)
r = c.post(
"/tools/environment_info",
headers={"Authorization": "Bearer ct"},
)
assert r.status_code == 200
body = r.json()
assert body["source"] == "default"
assert body["env_value"] is None
assert body["max_leverage"] == 1
def test_environment_info_requires_auth():
env = EnvironmentInfo(
exchange="deribit",
environment="testnet",
source="default",
env_value=None,
base_url="https://test.deribit.com/api/v2",
)
app = _make_app(env, creds={"max_leverage": 3})
c = TestClient(app)
r = c.post("/tools/environment_info")
assert r.status_code == 401
@@ -1,269 +0,0 @@
from __future__ import annotations
from unittest.mock import AsyncMock, MagicMock
import pytest
from fastapi.testclient import TestClient
from mcp_common.auth import Principal, TokenStore
from mcp_deribit.server import create_app
@pytest.fixture
def mock_client():
c = MagicMock()
c.get_ticker = AsyncMock(return_value={"mark_price": 50000})
c.get_instruments = AsyncMock(return_value=[])
c.get_orderbook = AsyncMock(return_value={"bids": [], "asks": []})
c.get_positions = AsyncMock(return_value=[])
c.get_account_summary = AsyncMock(return_value={"equity": 1000})
c.get_trade_history = AsyncMock(return_value=[])
c.get_historical = AsyncMock(return_value={"candles": []})
c.get_technical_indicators = AsyncMock(return_value={"rsi": 55.0})
c.place_order = AsyncMock(return_value={"order_id": "x"})
c.place_combo_order = AsyncMock(return_value={"combo_instrument": "BTC-COMBO-1", "order": {"order_id": "x"}})
c.get_dealer_gamma_profile = AsyncMock(return_value={"by_strike": [], "total_net_dealer_gamma": 0})
c.get_vanna_charm = AsyncMock(return_value={"total_vanna": 0, "total_charm": 0, "legs_analyzed": 0})
c.get_oi_weighted_skew = AsyncMock(return_value={"skew": 0, "call_iv_weighted": None, "put_iv_weighted": None})
c.get_smile_asymmetry = AsyncMock(return_value={"atm_iv": 0.5, "asymmetry": 0.0})
c.get_atm_vs_wings_vol = AsyncMock(return_value={"atm_iv": 0.5, "wing_richness": 0.0})
c.get_orderbook_imbalance = AsyncMock(return_value={"imbalance_ratio": 0.0, "microprice": 50000})
c.cancel_order = AsyncMock(return_value={"order_id": "x", "state": "cancelled"})
c.set_stop_loss = AsyncMock(return_value={"order_id": "x", "stop_price": 45000})
c.set_take_profit = AsyncMock(return_value={"order_id": "x", "tp_price": 55000})
c.close_position = AsyncMock(return_value={"closed": True})
c.set_leverage = AsyncMock(return_value={"state": "ok"})
return c
@pytest.fixture
def http(mock_client):
store = TokenStore(tokens={
"ct": Principal("core", {"core"}),
"ot": Principal("observer", {"observer"}),
})
app = create_app(client=mock_client, token_store=store, creds={"max_leverage": 3})
return TestClient(app)
def test_health(http):
assert http.get("/health").status_code == 200
def test_get_ticker_core_ok(http):
r = http.post(
"/tools/get_ticker",
headers={"Authorization": "Bearer ct"},
json={"instrument_name": "BTC-PERPETUAL"},
)
assert r.status_code == 200
assert r.json()["mark_price"] == 50000
def test_get_ticker_observer_ok(http):
r = http.post(
"/tools/get_ticker",
headers={"Authorization": "Bearer ot"},
json={"instrument_name": "BTC-PERPETUAL"},
)
assert r.status_code == 200
def test_get_ticker_no_auth_401(http):
r = http.post("/tools/get_ticker", json={"instrument_name": "BTC-PERPETUAL"})
assert r.status_code == 401
def test_get_ticker_alias_instrument_ok(http, mock_client):
r = http.post(
"/tools/get_ticker",
headers={"Authorization": "Bearer ct"},
json={"instrument": "ETH"},
)
assert r.status_code == 200
mock_client.get_ticker.assert_awaited_with("ETH")
def test_place_order_core_ok(http):
r = http.post(
"/tools/place_order",
headers={"Authorization": "Bearer ct"},
json={"instrument_name": "BTC-PERPETUAL", "side": "buy", "amount": 10},
)
assert r.status_code == 200
def test_place_order_observer_forbidden(http):
r = http.post(
"/tools/place_order",
headers={"Authorization": "Bearer ot"},
json={"instrument_name": "BTC-PERPETUAL", "side": "buy", "amount": 10},
)
assert r.status_code == 403
def test_get_orderbook_imbalance_observer_ok(http):
r = http.post(
"/tools/get_orderbook_imbalance",
headers={"Authorization": "Bearer ot"},
json={"instrument_name": "BTC-PERPETUAL"},
)
assert r.status_code == 200
@pytest.mark.parametrize("path", [
"/tools/get_dealer_gamma_profile",
"/tools/get_vanna_charm",
"/tools/get_oi_weighted_skew",
"/tools/get_smile_asymmetry",
"/tools/get_atm_vs_wings_vol",
])
def test_option_flow_indicators_observer_ok(http, path):
r = http.post(path, headers={"Authorization": "Bearer ot"}, json={"currency": "BTC"})
assert r.status_code == 200, (path, r.text)
@pytest.mark.parametrize("path", [
"/tools/get_dealer_gamma_profile",
"/tools/get_vanna_charm",
"/tools/get_oi_weighted_skew",
"/tools/get_smile_asymmetry",
"/tools/get_atm_vs_wings_vol",
])
def test_option_flow_indicators_no_auth_401(http, path):
r = http.post(path, json={"currency": "BTC"})
assert r.status_code == 401, (path, r.text)
def test_place_combo_order_core_ok(http):
r = http.post(
"/tools/place_combo_order",
headers={"Authorization": "Bearer ct"},
json={
"legs": [
{"instrument_name": "BTC-30APR26-75000-C", "direction": "buy", "ratio": 1},
{"instrument_name": "BTC-30APR26-80000-C", "direction": "sell", "ratio": 1},
],
"side": "buy",
"amount": 1,
"type": "limit",
"price": 0.05,
},
)
assert r.status_code == 200
assert r.json()["combo_instrument"] == "BTC-COMBO-1"
def test_place_combo_order_observer_forbidden(http):
r = http.post(
"/tools/place_combo_order",
headers={"Authorization": "Bearer ot"},
json={
"legs": [
{"instrument_name": "BTC-X", "direction": "buy", "ratio": 1},
{"instrument_name": "BTC-Y", "direction": "sell", "ratio": 1},
],
"side": "buy",
"amount": 1,
},
)
assert r.status_code == 403
def test_place_combo_order_min_legs(http):
r = http.post(
"/tools/place_combo_order",
headers={"Authorization": "Bearer ct"},
json={
"legs": [{"instrument_name": "BTC-X", "direction": "buy", "ratio": 1}],
"side": "buy",
"amount": 1,
},
)
assert r.status_code == 422
def test_place_combo_order_leverage_cap_enforced(http):
r = http.post(
"/tools/place_combo_order",
headers={"Authorization": "Bearer ct"},
json={
"legs": [
{"instrument_name": "BTC-X", "direction": "buy", "ratio": 1},
{"instrument_name": "BTC-Y", "direction": "sell", "ratio": 1},
],
"side": "buy",
"amount": 1,
"leverage": 50,
},
)
assert r.status_code == 403
err = r.json()["error"]
assert err["code"] == "LEVERAGE_CAP_EXCEEDED"
def test_place_order_leverage_cap_enforced(http):
"""Reject leverage > max_leverage (da secret, default 3)."""
r = http.post(
"/tools/place_order",
headers={"Authorization": "Bearer ct"},
json={
"instrument_name": "BTC-PERPETUAL",
"side": "buy",
"amount": 50,
"leverage": 50,
},
)
assert r.status_code == 403
body = r.json()
err = body["error"]
assert err["code"] == "LEVERAGE_CAP_EXCEEDED"
details = err["details"]
assert details["exchange"] == "deribit"
assert details["requested"] == 50
assert details["max"] == 3
def test_close_position_core_ok(http):
r = http.post(
"/tools/close_position",
headers={"Authorization": "Bearer ct"},
json={"instrument_name": "BTC-PERPETUAL"},
)
assert r.status_code == 200
def test_close_position_observer_forbidden(http):
r = http.post(
"/tools/close_position",
headers={"Authorization": "Bearer ot"},
json={"instrument_name": "BTC-PERPETUAL"},
)
assert r.status_code == 403
def test_cancel_order_observer_forbidden(http):
r = http.post(
"/tools/cancel_order",
headers={"Authorization": "Bearer ot"},
json={"order_id": "abc123"},
)
assert r.status_code == 403
def test_set_stop_loss_observer_forbidden(http):
r = http.post(
"/tools/set_stop_loss",
headers={"Authorization": "Bearer ot"},
json={"order_id": "abc123", "stop_price": 45000.0},
)
assert r.status_code == 403
def test_get_account_summary_observer_ok(http):
r = http.post(
"/tools/get_account_summary",
headers={"Authorization": "Bearer ot"},
json={"currency": "USDC"},
)
assert r.status_code == 200
assert r.json()["equity"] == 1000
@@ -1,29 +0,0 @@
[project]
name = "mcp-hyperliquid"
version = "0.1.0"
requires-python = ">=3.11"
dependencies = [
"mcp-common",
"fastapi>=0.115",
"uvicorn[standard]>=0.30",
"httpx>=0.27",
"pydantic>=2.6",
"hyperliquid-python-sdk>=0.3",
"eth-account>=0.11",
]
[project.optional-dependencies]
dev = ["pytest>=8", "pytest-asyncio>=0.23", "pytest-httpx>=0.30"]
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
[tool.hatch.build.targets.wheel]
packages = ["src/mcp_hyperliquid"]
[tool.uv.sources]
mcp-common = { workspace = true }
[project.scripts]
mcp-hyperliquid = "mcp_hyperliquid.__main__:main"
@@ -1,31 +0,0 @@
from __future__ import annotations
from mcp_common.app_factory import ExchangeAppSpec, run_exchange_main
from mcp_hyperliquid.client import HyperliquidClient
from mcp_hyperliquid.server import create_app
SPEC = ExchangeAppSpec(
exchange="hyperliquid",
creds_env_var="HYPERLIQUID_WALLET_FILE",
env_var="HYPERLIQUID_TESTNET",
flag_key="testnet",
default_base_url_live="https://api.hyperliquid.xyz",
default_base_url_testnet="https://api.hyperliquid-testnet.xyz",
default_port=9012,
build_client=lambda creds, env_info: HyperliquidClient(
wallet_address=creds["wallet_address"],
private_key=creds["private_key"],
testnet=(env_info.environment == "testnet"),
api_wallet_address=creds.get("api_wallet_address"),
),
build_app=create_app,
)
def main():
run_exchange_main(SPEC)
if __name__ == "__main__":
main()
@@ -1,408 +0,0 @@
from __future__ import annotations
import os
from fastapi import Depends, FastAPI, HTTPException
from mcp_common.audit import audit_write_op
from mcp_common.auth import Principal, TokenStore, require_principal
from mcp_common.environment import EnvironmentInfo
from mcp_common.mcp_bridge import mount_mcp_endpoint
from mcp_common.server import build_app
from pydantic import BaseModel, field_validator, model_validator
from mcp_hyperliquid.client import HyperliquidClient
from mcp_hyperliquid.leverage_cap import enforce_leverage as _enforce_leverage
from mcp_hyperliquid.leverage_cap import get_max_leverage
# --- Body models ---
class GetMarketsReq(BaseModel):
pass
class GetTickerReq(BaseModel):
instrument: str
class GetOrderbookReq(BaseModel):
instrument: str
depth: int = 10
class GetPositionsReq(BaseModel):
pass
class GetAccountSummaryReq(BaseModel):
pass
class GetTradeHistoryReq(BaseModel):
limit: int = 100
class GetHistoricalReq(BaseModel):
instrument: str | None = None
asset: str | None = None
start_date: str | None = None
end_date: str | None = None
resolution: str = "1h"
interval: str | None = None
limit: int = 50
model_config = {"extra": "allow"}
@model_validator(mode="after")
def _normalize(self):
from datetime import UTC, datetime, timedelta
sym = self.instrument or self.asset
if not sym:
raise ValueError("instrument (or asset) is required")
self.instrument = sym
if self.interval:
self.resolution = self.interval
if not self.end_date:
self.end_date = datetime.now(UTC).strftime("%Y-%m-%dT%H:%M:%S")
if not self.start_date:
days = max(1, self.limit // 6)
self.start_date = (
datetime.now(UTC) - timedelta(days=days)
).strftime("%Y-%m-%dT%H:%M:%S")
return self
class GetOpenOrdersReq(BaseModel):
pass
class GetFundingRateReq(BaseModel):
instrument: str
class BasisSpotPerpReq(BaseModel):
asset: str
class GetIndicatorsReq(BaseModel):
instrument: str | None = None
asset: str | None = None
indicators: list[str] = ["rsi", "atr", "macd", "adx"]
start_date: str | None = None
end_date: str | None = None
resolution: str = "1h"
interval: str | None = None
limit: int = 50
model_config = {"extra": "allow"}
@model_validator(mode="after")
def _normalize(self):
from datetime import UTC, datetime, timedelta
sym = self.instrument or self.asset
if not sym:
raise ValueError("instrument (or asset) is required")
self.instrument = sym
if self.interval:
self.resolution = self.interval
if not self.end_date:
self.end_date = datetime.now(UTC).strftime("%Y-%m-%dT%H:%M:%S")
if not self.start_date:
days = max(2, self.limit // 6)
self.start_date = (
datetime.now(UTC) - timedelta(days=days)
).strftime("%Y-%m-%dT%H:%M:%S")
return self
@field_validator("indicators", mode="before")
@classmethod
def _coerce_indicators(cls, v):
if isinstance(v, str):
import json
s = v.strip()
if s.startswith("["):
try:
parsed = json.loads(s)
if isinstance(parsed, list):
return [str(x).strip() for x in parsed if str(x).strip()]
except json.JSONDecodeError:
pass
return [x.strip() for x in s.split(",") if x.strip()]
if isinstance(v, list):
return v
raise ValueError(
"indicators must be a list like ['rsi','atr','macd'] "
"or a comma-separated string like 'rsi,atr,macd'"
)
class PlaceOrderReq(BaseModel):
instrument: str
side: str # "buy" | "sell"
amount: float
type: str = "limit"
price: float | None = None
reduce_only: bool = False
leverage: int | None = None # CER-016: None → default cap (3x)
class CancelOrderReq(BaseModel):
order_id: str
instrument: str
class SetStopLossReq(BaseModel):
instrument: str
stop_price: float
size: float
class SetTakeProfitReq(BaseModel):
instrument: str
tp_price: float
size: float
class ClosePositionReq(BaseModel):
instrument: str
# --- ACL helper ---
def _check(principal: Principal, *, core: bool = False, observer: bool = False) -> None:
allowed: set[str] = set()
if core:
allowed.add("core")
if observer:
allowed.add("observer")
if not (principal.capabilities & allowed):
raise HTTPException(403, f"capability required: {allowed}")
# --- App factory ---
def create_app(
*,
client: HyperliquidClient,
token_store: TokenStore,
creds: dict | None = None,
env_info: EnvironmentInfo | None = None,
) -> FastAPI:
creds = creds or {}
app = build_app(name="mcp-hyperliquid", version="0.1.0", token_store=token_store)
# --- Read tools: core + observer ---
@app.post("/tools/environment_info", tags=["reads"])
async def t_environment_info(principal: Principal = Depends(require_principal)):
_check(principal, core=True, observer=True)
if env_info is None:
return {
"exchange": "hyperliquid",
"environment": "testnet" if getattr(client, "testnet", True) else "mainnet",
"source": "credentials",
"env_value": None,
"base_url": getattr(client, "base_url", None),
"max_leverage": get_max_leverage(creds),
}
return {
"exchange": env_info.exchange,
"environment": env_info.environment,
"source": env_info.source,
"env_value": env_info.env_value,
"base_url": env_info.base_url,
"max_leverage": get_max_leverage(creds),
}
@app.post("/tools/get_markets", tags=["reads"])
async def t_get_markets(
body: GetMarketsReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_markets()
@app.post("/tools/get_ticker", tags=["reads"])
async def t_get_ticker(
body: GetTickerReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_ticker(body.instrument)
@app.post("/tools/get_orderbook", tags=["reads"])
async def t_get_orderbook(
body: GetOrderbookReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_orderbook(body.instrument, body.depth)
@app.post("/tools/get_positions", tags=["reads"])
async def t_get_positions(
body: GetPositionsReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_positions()
@app.post("/tools/get_account_summary", tags=["reads"])
async def t_get_account_summary(
body: GetAccountSummaryReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_account_summary()
@app.post("/tools/get_trade_history", tags=["reads"])
async def t_get_trade_history(
body: GetTradeHistoryReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_trade_history(body.limit)
@app.post("/tools/get_historical", tags=["reads"])
async def t_get_historical(
body: GetHistoricalReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_historical(
body.instrument, body.start_date, body.end_date, body.resolution
)
@app.post("/tools/get_open_orders", tags=["reads"])
async def t_get_open_orders(
body: GetOpenOrdersReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_open_orders()
@app.post("/tools/get_funding_rate", tags=["reads"])
async def t_get_funding_rate(
body: GetFundingRateReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_funding_rate(body.instrument)
@app.post("/tools/basis_spot_perp", tags=["writes"])
async def t_basis_spot_perp(
body: BasisSpotPerpReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.basis_spot_perp(body.asset)
@app.post("/tools/get_indicators", tags=["reads"])
async def t_get_indicators(
body: GetIndicatorsReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await client.get_indicators(
body.instrument,
body.indicators,
body.start_date,
body.end_date,
body.resolution,
)
# --- Write tools: core only ---
@app.post("/tools/place_order", tags=["writes"])
async def t_place_order(
body: PlaceOrderReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True)
_enforce_leverage(body.leverage, creds=creds, exchange="hyperliquid")
result = await client.place_order(
instrument=body.instrument,
side=body.side,
amount=body.amount,
type=body.type,
price=body.price,
reduce_only=body.reduce_only,
)
audit_write_op(
principal=principal, action="place_order", exchange="hyperliquid",
target=body.instrument,
payload={"side": body.side, "amount": body.amount, "type": body.type,
"price": body.price, "reduce_only": body.reduce_only,
"leverage": body.leverage},
result=result,
)
return result
@app.post("/tools/cancel_order", tags=["writes"])
async def t_cancel_order(
body: CancelOrderReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True)
result = await client.cancel_order(body.order_id, body.instrument)
audit_write_op(
principal=principal, action="cancel_order", exchange="hyperliquid",
target=body.order_id, payload={"instrument": body.instrument}, result=result,
)
return result
@app.post("/tools/set_stop_loss", tags=["writes"])
async def t_set_sl(
body: SetStopLossReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True)
result = await client.set_stop_loss(body.instrument, body.stop_price, body.size)
audit_write_op(
principal=principal, action="set_stop_loss", exchange="hyperliquid",
target=body.instrument,
payload={"stop_price": body.stop_price, "size": body.size},
result=result,
)
return result
@app.post("/tools/set_take_profit", tags=["writes"])
async def t_set_tp(
body: SetTakeProfitReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True)
result = await client.set_take_profit(body.instrument, body.tp_price, body.size)
audit_write_op(
principal=principal, action="set_take_profit", exchange="hyperliquid",
target=body.instrument,
payload={"tp_price": body.tp_price, "size": body.size},
result=result,
)
return result
@app.post("/tools/close_position", tags=["writes"])
async def t_close_position(
body: ClosePositionReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True)
result = await client.close_position(body.instrument)
audit_write_op(
principal=principal, action="close_position", exchange="hyperliquid",
target=body.instrument, payload={}, result=result,
)
return result
# ───── MCP endpoint (/mcp) — bridge to /tools/* ─────
port = int(os.environ.get("PORT", "9012"))
mount_mcp_endpoint(
app,
name="cerbero-hyperliquid",
version="0.1.0",
token_store=token_store,
internal_base_url=f"http://localhost:{port}",
tools=[
{"name": "environment_info", "description": "Ambiente operativo (testnet/mainnet), source, base_url, max_leverage cap."},
{"name": "get_markets", "description": "Lista mercati perp disponibili."},
{"name": "get_ticker", "description": "Ticker di un perp."},
{"name": "get_orderbook", "description": "Orderbook L2."},
{"name": "get_positions", "description": "Posizioni aperte."},
{"name": "get_account_summary", "description": "Account summary (spot + perp equity)."},
{"name": "get_trade_history", "description": "Storia trade."},
{"name": "get_historical", "description": "OHLCV storico."},
{"name": "get_open_orders", "description": "Ordini aperti."},
{"name": "get_funding_rate", "description": "Funding rate corrente per simbolo."},
{"name": "basis_spot_perp", "description": "Basis spot-perp annualizzato + carry opportunity detection."},
{"name": "get_indicators", "description": "Indicatori tecnici."},
{"name": "place_order", "description": "Invia ordine (CORE only)."},
{"name": "cancel_order", "description": "Cancella ordine."},
{"name": "set_stop_loss", "description": "Stop loss su posizione."},
{"name": "set_take_profit", "description": "Take profit su posizione."},
{"name": "close_position", "description": "Chiude posizione."},
],
)
return app
@@ -1,227 +0,0 @@
from __future__ import annotations
import re
import pytest
from mcp_hyperliquid.client import HyperliquidClient
from pytest_httpx import HTTPXMock
@pytest.fixture
def client():
return HyperliquidClient(
wallet_address="0xDeadBeef",
private_key="0x" + "a" * 64,
testnet=True,
)
# Shared mock responses
META_AND_CTX = [
{
"universe": [
{"name": "BTC", "maxLeverage": 50},
{"name": "ETH", "maxLeverage": 25},
]
},
[
{
"markPx": "50000.0",
"funding": "0.0001",
"openInterest": "1000.0",
"dayNtlVlm": "500000.0",
},
{
"markPx": "3000.0",
"funding": "0.00005",
"openInterest": "500.0",
"dayNtlVlm": "200000.0",
},
],
]
CLEARINGHOUSE_STATE = {
"marginSummary": {
"accountValue": "1500.0",
"totalRawUsd": "1200.0",
"totalMarginUsed": "300.0",
"totalNtlPos": "50.0",
},
"assetPositions": [
{
"position": {
"coin": "BTC",
"szi": "0.1",
"entryPx": "48000.0",
"unrealizedPnl": "200.0",
"leverage": {"value": "10"},
"liquidationPx": "40000.0",
}
}
],
}
SPOT_STATE = {"balances": [{"coin": "USDC", "total": "500.0"}]}
@pytest.mark.asyncio
async def test_get_markets(httpx_mock: HTTPXMock, client: HyperliquidClient):
httpx_mock.add_response(
url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/info"),
json=META_AND_CTX,
)
markets = await client.get_markets()
assert len(markets) == 2
assert markets[0]["asset"] == "BTC"
assert markets[0]["mark_price"] == 50000.0
assert markets[0]["max_leverage"] == 50
@pytest.mark.asyncio
async def test_get_ticker(httpx_mock: HTTPXMock, client: HyperliquidClient):
httpx_mock.add_response(
url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/info"),
json=META_AND_CTX,
)
result = await client.get_ticker("BTC")
assert result["asset"] == "BTC"
assert result["mark_price"] == 50000.0
assert result["funding_rate"] == 0.0001
@pytest.mark.asyncio
async def test_get_ticker_not_found(httpx_mock: HTTPXMock, client: HyperliquidClient):
httpx_mock.add_response(
url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/info"),
json=META_AND_CTX,
)
result = await client.get_ticker("SOL")
assert "error" in result
@pytest.mark.asyncio
async def test_get_orderbook(httpx_mock: HTTPXMock, client: HyperliquidClient):
httpx_mock.add_response(
url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/info"),
json={
"levels": [
[{"px": "49990.0", "sz": "0.5"}, {"px": "49980.0", "sz": "1.0"}],
[{"px": "50010.0", "sz": "0.3"}, {"px": "50020.0", "sz": "0.8"}],
]
},
)
result = await client.get_orderbook("BTC", depth=2)
assert result["asset"] == "BTC"
assert len(result["bids"]) == 2
assert len(result["asks"]) == 2
assert result["bids"][0]["price"] == 49990.0
@pytest.mark.asyncio
async def test_get_positions(httpx_mock: HTTPXMock, client: HyperliquidClient):
httpx_mock.add_response(
url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/info"),
json=CLEARINGHOUSE_STATE,
)
positions = await client.get_positions()
assert len(positions) == 1
assert positions[0]["asset"] == "BTC"
assert positions[0]["direction"] == "long"
assert positions[0]["size"] == 0.1
assert positions[0]["leverage"] == 10.0
@pytest.mark.asyncio
async def test_get_account_summary(httpx_mock: HTTPXMock, client: HyperliquidClient):
# get_account_summary calls /info twice (perp + spot)
httpx_mock.add_response(
url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/info"),
json=CLEARINGHOUSE_STATE,
)
httpx_mock.add_response(
url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/info"),
json=SPOT_STATE,
)
result = await client.get_account_summary()
assert result["perps_equity"] == 1500.0
assert result["spot_usdc"] == 500.0
assert result["equity"] == 2000.0
@pytest.mark.asyncio
async def test_get_trade_history(httpx_mock: HTTPXMock, client: HyperliquidClient):
httpx_mock.add_response(
url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/info"),
json=[
{"coin": "BTC", "side": "B", "sz": "0.1", "px": "50000", "fee": "0.5", "time": 1000},
{"coin": "ETH", "side": "A", "sz": "1.0", "px": "3000", "fee": "0.3", "time": 2000},
],
)
trades = await client.get_trade_history(limit=10)
assert len(trades) == 2
assert trades[0]["asset"] == "BTC"
assert trades[0]["price"] == 50000.0
@pytest.mark.asyncio
async def test_get_open_orders(httpx_mock: HTTPXMock, client: HyperliquidClient):
httpx_mock.add_response(
url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/info"),
json=[
{
"oid": 12345,
"coin": "BTC",
"side": "B",
"sz": "0.05",
"limitPx": "49000",
"orderType": "Limit",
}
],
)
orders = await client.get_open_orders()
assert len(orders) == 1
assert orders[0]["oid"] == 12345
assert orders[0]["asset"] == "BTC"
@pytest.mark.asyncio
async def test_get_historical(httpx_mock: HTTPXMock, client: HyperliquidClient):
httpx_mock.add_response(
url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/info"),
json=[
{"t": 1000000, "o": "49000", "h": "51000", "l": "48500", "c": "50000", "v": "100"},
],
)
result = await client.get_historical("BTC", "2024-01-01", "2024-01-02", "1h")
assert len(result["candles"]) == 1
assert result["candles"][0]["close"] == 50000.0
@pytest.mark.asyncio
async def test_health_ok(httpx_mock: HTTPXMock, client: HyperliquidClient):
httpx_mock.add_response(
url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/info"),
json={"universe": []},
)
result = await client.health()
assert result["status"] in ("ok", "healthy")
assert result["testnet"] is True
@pytest.mark.asyncio
async def test_place_order_sdk_unavailable(client: HyperliquidClient):
"""place_order raises RuntimeError when SDK is not available (mocked)."""
import mcp_hyperliquid.client as mod
original = mod._SDK_AVAILABLE
mod._SDK_AVAILABLE = False
client._exchange = None
try:
result = await client.place_order("BTC", "buy", 0.1, price=50000.0)
# Should return error dict or raise RuntimeError
assert "error" in result or result.get("status") == "error"
except RuntimeError as exc:
assert "not installed" in str(exc).lower() or "sdk" in str(exc).lower()
finally:
mod._SDK_AVAILABLE = original
@@ -1,50 +0,0 @@
from __future__ import annotations
from unittest.mock import MagicMock
from fastapi.testclient import TestClient
from mcp_common.auth import Principal, TokenStore
from mcp_common.environment import EnvironmentInfo
from mcp_hyperliquid.server import create_app
def _make_app(env_info, creds):
c = MagicMock()
c.testnet = True
store = TokenStore(tokens={
"ct": Principal("core", {"core"}),
"ot": Principal("observer", {"observer"}),
})
return create_app(client=c, token_store=store, creds=creds, env_info=env_info)
def test_environment_info_full_shape():
env = EnvironmentInfo(
exchange="hyperliquid",
environment="testnet",
source="env",
env_value="true",
base_url="https://api.hyperliquid-testnet.xyz",
)
app = _make_app(env, creds={"max_leverage": 3})
c = TestClient(app)
r = c.post("/tools/environment_info", headers={"Authorization": "Bearer ot"})
assert r.status_code == 200
body = r.json()
assert body["exchange"] == "hyperliquid"
assert body["environment"] == "testnet"
assert body["source"] == "env"
assert body["env_value"] == "true"
assert body["base_url"] == "https://api.hyperliquid-testnet.xyz"
assert body["max_leverage"] == 3
def test_environment_info_requires_auth():
env = EnvironmentInfo(
exchange="hyperliquid", environment="testnet", source="default",
env_value=None, base_url="https://api.hyperliquid-testnet.xyz",
)
app = _make_app(env, creds={"max_leverage": 3})
c = TestClient(app)
r = c.post("/tools/environment_info")
assert r.status_code == 401
@@ -1,211 +0,0 @@
from __future__ import annotations
from unittest.mock import AsyncMock, MagicMock
import pytest
from fastapi.testclient import TestClient
from mcp_common.auth import Principal, TokenStore
from mcp_hyperliquid.server import create_app
@pytest.fixture
def mock_client():
c = MagicMock()
c.get_markets = AsyncMock(return_value=[{"asset": "BTC", "mark_price": 50000}])
c.get_ticker = AsyncMock(return_value={"asset": "BTC", "mark_price": 50000})
c.get_orderbook = AsyncMock(return_value={"bids": [], "asks": []})
c.get_positions = AsyncMock(return_value=[])
c.get_account_summary = AsyncMock(return_value={"equity": 1500, "perps_equity": 1000})
c.get_trade_history = AsyncMock(return_value=[])
c.get_historical = AsyncMock(return_value={"candles": []})
c.get_open_orders = AsyncMock(return_value=[])
c.get_funding_rate = AsyncMock(return_value={"asset": "BTC", "current_funding_rate": 0.0001})
c.get_indicators = AsyncMock(return_value={"rsi": 55.0})
c.place_order = AsyncMock(return_value={"order_id": "x", "status": "ok"})
c.cancel_order = AsyncMock(return_value={"order_id": "x", "status": "ok"})
c.set_stop_loss = AsyncMock(return_value={"order_id": "x", "status": "ok"})
c.set_take_profit = AsyncMock(return_value={"order_id": "x", "status": "ok"})
c.close_position = AsyncMock(return_value={"status": "ok", "asset": "BTC"})
return c
@pytest.fixture
def http(mock_client):
store = TokenStore(
tokens={
"ct": Principal("core", {"core"}),
"ot": Principal("observer", {"observer"}),
}
)
app = create_app(client=mock_client, token_store=store, creds={"max_leverage": 3})
return TestClient(app)
# --- Health ---
def test_health(http):
assert http.get("/health").status_code == 200
# --- Read tools: both core and observer allowed ---
def test_get_markets_core_ok(http):
r = http.post("/tools/get_markets", headers={"Authorization": "Bearer ct"}, json={})
assert r.status_code == 200
def test_get_markets_observer_ok(http):
r = http.post("/tools/get_markets", headers={"Authorization": "Bearer ot"}, json={})
assert r.status_code == 200
def test_get_ticker_core_ok(http):
r = http.post(
"/tools/get_ticker",
headers={"Authorization": "Bearer ct"},
json={"instrument": "BTC"},
)
assert r.status_code == 200
assert r.json()["mark_price"] == 50000
def test_get_ticker_observer_ok(http):
r = http.post(
"/tools/get_ticker",
headers={"Authorization": "Bearer ot"},
json={"instrument": "BTC"},
)
assert r.status_code == 200
def test_get_ticker_no_auth_401(http):
r = http.post("/tools/get_ticker", json={"instrument": "BTC"})
assert r.status_code == 401
def test_get_account_summary_observer_ok(http):
r = http.post(
"/tools/get_account_summary",
headers={"Authorization": "Bearer ot"},
json={},
)
assert r.status_code == 200
assert r.json()["equity"] == 1500
def test_get_funding_rate_observer_ok(http):
r = http.post(
"/tools/get_funding_rate",
headers={"Authorization": "Bearer ot"},
json={"instrument": "BTC"},
)
assert r.status_code == 200
def test_get_positions_no_auth_401(http):
r = http.post("/tools/get_positions", json={})
assert r.status_code == 401
# --- Write tools: core only ---
def test_place_order_core_ok(http):
# CER-016: amount * price = 150 < cap 200
r = http.post(
"/tools/place_order",
headers={"Authorization": "Bearer ct"},
json={"instrument": "BTC", "side": "buy", "amount": 0.003, "price": 50000},
)
assert r.status_code == 200
def test_place_order_observer_forbidden(http):
r = http.post(
"/tools/place_order",
headers={"Authorization": "Bearer ot"},
json={"instrument": "BTC", "side": "buy", "amount": 0.001, "price": 50000},
)
assert r.status_code == 403
def test_place_order_leverage_cap_enforced_hl(http):
"""Reject leverage > max_leverage (da secret, default 3)."""
r = http.post(
"/tools/place_order",
headers={"Authorization": "Bearer ct"},
json={
"instrument": "BTC",
"side": "buy",
"amount": 0.001,
"price": 50000,
"leverage": 10,
},
)
assert r.status_code == 403
body = r.json()
err = body["error"]
assert err["code"] == "LEVERAGE_CAP_EXCEEDED"
assert err["details"]["exchange"] == "hyperliquid"
def test_cancel_order_core_ok(http):
r = http.post(
"/tools/cancel_order",
headers={"Authorization": "Bearer ct"},
json={"order_id": "123", "instrument": "BTC"},
)
assert r.status_code == 200
def test_cancel_order_observer_forbidden(http):
r = http.post(
"/tools/cancel_order",
headers={"Authorization": "Bearer ot"},
json={"order_id": "123", "instrument": "BTC"},
)
assert r.status_code == 403
def test_set_stop_loss_core_ok(http):
r = http.post(
"/tools/set_stop_loss",
headers={"Authorization": "Bearer ct"},
json={"instrument": "BTC", "stop_price": 45000.0, "size": 0.1},
)
assert r.status_code == 200
def test_set_stop_loss_observer_forbidden(http):
r = http.post(
"/tools/set_stop_loss",
headers={"Authorization": "Bearer ot"},
json={"instrument": "BTC", "stop_price": 45000.0, "size": 0.1},
)
assert r.status_code == 403
def test_set_take_profit_observer_forbidden(http):
r = http.post(
"/tools/set_take_profit",
headers={"Authorization": "Bearer ot"},
json={"instrument": "BTC", "tp_price": 55000.0, "size": 0.1},
)
assert r.status_code == 403
def test_close_position_core_ok(http):
r = http.post(
"/tools/close_position",
headers={"Authorization": "Bearer ct"},
json={"instrument": "BTC"},
)
assert r.status_code == 200
def test_close_position_observer_forbidden(http):
r = http.post(
"/tools/close_position",
headers={"Authorization": "Bearer ot"},
json={"instrument": "BTC"},
)
assert r.status_code == 403
@@ -1,27 +0,0 @@
[project]
name = "mcp-macro"
version = "0.1.0"
requires-python = ">=3.11"
dependencies = [
"mcp-common",
"fastapi>=0.115",
"uvicorn[standard]>=0.30",
"httpx>=0.27",
"pydantic>=2.6",
]
[project.optional-dependencies]
dev = ["pytest>=8", "pytest-asyncio>=0.23", "pytest-httpx>=0.30"]
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
[tool.hatch.build.targets.wheel]
packages = ["src/mcp_macro"]
[tool.uv.sources]
mcp-common = { workspace = true }
[project.scripts]
mcp-macro = "mcp_macro.__main__:main"
@@ -1,37 +0,0 @@
from __future__ import annotations
import json
import os
import uvicorn
from mcp_common.auth import load_token_store_from_files
from mcp_common.logging import configure_root_logging
from mcp_macro.server import create_app
configure_root_logging() # CER-P5-009
def main():
creds_file = os.environ["MACRO_CREDENTIALS_FILE"]
with open(creds_file) as f:
creds = json.load(f)
token_store = load_token_store_from_files(
core_token_file=os.environ.get("CORE_TOKEN_FILE"),
observer_token_file=os.environ.get("OBSERVER_TOKEN_FILE"),
)
app = create_app(
fred_api_key=creds.get("fred_api_key", ""),
finnhub_api_key=creds.get("finnhub_api_key", ""),
token_store=token_store,
)
uvicorn.run(
app,
log_config=None, # CER-P5-009: delegate to the root JSON logger
host=os.environ.get("HOST", "0.0.0.0"),
port=int(os.environ.get("PORT", "9013")),
)
if __name__ == "__main__":
main()
@@ -1,203 +0,0 @@
from __future__ import annotations
import os
from fastapi import Depends, FastAPI, HTTPException
from mcp_common.auth import Principal, TokenStore, require_principal
from mcp_common.mcp_bridge import mount_mcp_endpoint
from mcp_common.server import build_app
from pydantic import BaseModel, Field
from mcp_macro.fetchers import (
fetch_asset_price,
fetch_breakeven_inflation,
fetch_cot_disaggregated,
fetch_cot_extreme_positioning,
fetch_cot_tff,
fetch_economic_indicators,
fetch_equity_futures,
fetch_macro_calendar,
fetch_market_overview,
fetch_treasury_yields,
fetch_yield_curve_slope,
)
# --- Body models ---
class GetEconomicIndicatorsReq(BaseModel):
indicators: list[str] | None = None
class GetMacroCalendarReq(BaseModel):
days: int = 7
country_filter: list[str] | None = None
importance_min: str | None = None
start: str | None = None
end: str | None = None
class GetMarketOverviewReq(BaseModel):
pass
class GetAssetPriceReq(BaseModel):
ticker: str
class GetTreasuryYieldsReq(BaseModel):
pass
class GetEquityFuturesReq(BaseModel):
pass
class GetYieldCurveSlopeReq(BaseModel):
pass
class GetBreakevenInflationReq(BaseModel):
pass
class GetCotTffReq(BaseModel):
symbol: str
lookback_weeks: int = Field(default=52, ge=4, le=520)
class GetCotDisaggregatedReq(BaseModel):
symbol: str
lookback_weeks: int = Field(default=52, ge=4, le=520)
class GetCotExtremeReq(BaseModel):
lookback_weeks: int = Field(default=156, ge=4, le=520)
# --- ACL helper ---
def _check(principal: Principal, *, core: bool = False, observer: bool = False) -> None:
allowed: set[str] = set()
if core:
allowed.add("core")
if observer:
allowed.add("observer")
if not (principal.capabilities & allowed):
raise HTTPException(403, f"capability required: {allowed}")
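The `_check` helper above reduces to a non-empty set intersection between the principal's capabilities and the tool's allowed set. That rule can be isolated as a small predicate (hypothetical name, not part of the module):

```python
def has_capability(capabilities: set[str], *, core: bool = False, observer: bool = False) -> bool:
    # A principal passes if it holds at least one of the allowed capabilities.
    allowed: set[str] = set()
    if core:
        allowed.add("core")
    if observer:
        allowed.add("observer")
    return bool(capabilities & allowed)
```

A core-only tool rejects a principal holding only `{"observer"}`, while a read tool opened to both accepts either.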
# --- App factory ---
def create_app(*, fred_api_key: str = "", finnhub_api_key: str = "", token_store: TokenStore) -> FastAPI:
app = build_app(name="mcp-macro", version="0.1.0", token_store=token_store)
@app.post("/tools/get_economic_indicators", tags=["reads"])
async def t_get_economic_indicators(
body: GetEconomicIndicatorsReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_economic_indicators(
fred_api_key=fred_api_key, indicators=body.indicators
)
@app.post("/tools/get_macro_calendar", tags=["reads"])
async def t_get_macro_calendar(
body: GetMacroCalendarReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_macro_calendar(
finnhub_api_key=finnhub_api_key,
days_ahead=body.days,
country_filter=body.country_filter,
importance_min=body.importance_min,
start=body.start,
end=body.end,
)
@app.post("/tools/get_market_overview", tags=["reads"])
async def t_get_market_overview(
body: GetMarketOverviewReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_market_overview()
@app.post("/tools/get_asset_price", tags=["reads"])
async def t_get_asset_price(
body: GetAssetPriceReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_asset_price(body.ticker)
@app.post("/tools/get_treasury_yields", tags=["reads"])
async def t_get_treasury_yields(
body: GetTreasuryYieldsReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_treasury_yields()
@app.post("/tools/get_equity_futures", tags=["reads"])
async def t_get_equity_futures(
body: GetEquityFuturesReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_equity_futures()
@app.post("/tools/get_yield_curve_slope", tags=["reads"])
async def t_get_yield_curve_slope(
body: GetYieldCurveSlopeReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_yield_curve_slope()
@app.post("/tools/get_breakeven_inflation", tags=["reads"])
async def t_get_breakeven_inflation(
body: GetBreakevenInflationReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_breakeven_inflation(fred_api_key=fred_api_key)
@app.post("/tools/get_cot_tff", tags=["reads"])
async def t_get_cot_tff(
body: GetCotTffReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_cot_tff(body.symbol, body.lookback_weeks)
@app.post("/tools/get_cot_disaggregated", tags=["reads"])
async def t_get_cot_disaggregated(
body: GetCotDisaggregatedReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_cot_disaggregated(body.symbol, body.lookback_weeks)
@app.post("/tools/get_cot_extreme_positioning", tags=["reads"])
async def t_get_cot_extreme(
body: GetCotExtremeReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_cot_extreme_positioning(body.lookback_weeks)
# ───── MCP endpoint (/mcp): bridge to /tools/* ─────
port = int(os.environ.get("PORT", "9013"))
mount_mcp_endpoint(
app,
name="cerbero-macro",
version="0.1.0",
token_store=token_store,
internal_base_url=f"http://localhost:{port}",
tools=[
{"name": "get_economic_indicators", "description": "FRED economic indicators (Fed rate, CPI, ecc)."},
{"name": "get_macro_calendar", "description": "Eventi macro con filtri country/importance/date range."},
{"name": "get_market_overview", "description": "Snapshot overview mercato macro."},
{"name": "get_asset_price", "description": "Prezzo cross-asset: WTI, DXY, SPX, VIX, yields, FX, ecc."},
{"name": "get_treasury_yields", "description": "Curva US Treasury 2y/5y/10y/30y + shape detection."},
{"name": "get_equity_futures", "description": "Futures ES/NQ/YM/RTY con session status."},
{"name": "get_yield_curve_slope", "description": "Slope 2y10y/5y30y + butterfly + regime (steep/normal/flat/inverted)."},
{"name": "get_breakeven_inflation", "description": "Breakeven inflation 5Y/10Y + 5y5y forward (FRED T5YIE/T10YIE/T5YIFR)."},
{"name": "get_cot_tff", "description": "COT TFF report (CFTC) per equity/financial: ES/NQ/RTY/ZN/ZB/6E/6J/DX. Roles: dealer, asset manager, leveraged funds, other."},
{"name": "get_cot_disaggregated", "description": "COT Disaggregated report (CFTC) per commodities: CL/GC/SI/HG/ZW/ZC/ZS. Roles: producer/merchant, swap dealer, managed money, other."},
{"name": "get_cot_extreme_positioning", "description": "Scanner posizionamento estremo (percentile ≤5 o ≥95) sui simboli watchlist."},
],
)
return app
@@ -1,202 +0,0 @@
from __future__ import annotations
from unittest.mock import AsyncMock, patch
import pytest
from fastapi.testclient import TestClient
from mcp_common.auth import Principal, TokenStore
from mcp_macro.server import create_app
@pytest.fixture
def http():
store = TokenStore(
tokens={
"ct": Principal("core", {"core"}),
"ot": Principal("observer", {"observer"}),
}
)
app = create_app(fred_api_key="testfred", finnhub_api_key="testfinn", token_store=store)
return TestClient(app)
# --- Health ---
def test_health(http):
assert http.get("/health").status_code == 200
# --- get_economic_indicators ---
def test_get_economic_indicators_core_ok(http):
with patch(
"mcp_macro.server.fetch_economic_indicators",
new=AsyncMock(return_value={"fed_rate": 5.25, "updated_at": "2024-01-01T00:00:00+00:00"}),
):
r = http.post(
"/tools/get_economic_indicators",
headers={"Authorization": "Bearer ct"},
json={},
)
assert r.status_code == 200
assert r.json()["fed_rate"] == 5.25
def test_get_economic_indicators_observer_ok(http):
with patch(
"mcp_macro.server.fetch_economic_indicators",
new=AsyncMock(return_value={"fed_rate": 5.25}),
):
r = http.post(
"/tools/get_economic_indicators",
headers={"Authorization": "Bearer ot"},
json={},
)
assert r.status_code == 200
def test_get_economic_indicators_no_auth_401(http):
r = http.post("/tools/get_economic_indicators", json={})
assert r.status_code == 401
# --- get_macro_calendar ---
def test_get_macro_calendar_core_ok(http):
with patch(
"mcp_macro.server.fetch_macro_calendar",
new=AsyncMock(return_value={"events": []}),
):
r = http.post(
"/tools/get_macro_calendar",
headers={"Authorization": "Bearer ct"},
json={"days": 7},
)
assert r.status_code == 200
def test_get_macro_calendar_observer_ok(http):
with patch(
"mcp_macro.server.fetch_macro_calendar",
new=AsyncMock(return_value={"events": []}),
):
r = http.post(
"/tools/get_macro_calendar",
headers={"Authorization": "Bearer ot"},
json={},
)
assert r.status_code == 200
def test_get_macro_calendar_no_auth_401(http):
r = http.post("/tools/get_macro_calendar", json={})
assert r.status_code == 401
# --- get_market_overview ---
def test_get_market_overview_core_ok(http):
with patch(
"mcp_macro.server.fetch_market_overview",
new=AsyncMock(return_value={"btc_dominance": 52.0, "btc_price": 65000}),
):
r = http.post(
"/tools/get_market_overview",
headers={"Authorization": "Bearer ct"},
json={},
)
assert r.status_code == 200
assert r.json()["btc_price"] == 65000
def test_get_market_overview_observer_ok(http):
with patch(
"mcp_macro.server.fetch_market_overview",
new=AsyncMock(return_value={"btc_dominance": 52.0}),
):
r = http.post(
"/tools/get_market_overview",
headers={"Authorization": "Bearer ot"},
json={},
)
assert r.status_code == 200
def test_get_market_overview_no_auth_401(http):
r = http.post("/tools/get_market_overview", json={})
assert r.status_code == 401
def test_get_cot_tff_core_ok(http):
with patch(
"mcp_macro.server.fetch_cot_tff",
new=AsyncMock(return_value={"symbol": "ES", "rows": []}),
):
r = http.post(
"/tools/get_cot_tff",
headers={"Authorization": "Bearer ct"},
json={"symbol": "ES"},
)
assert r.status_code == 200
assert r.json()["symbol"] == "ES"
def test_get_cot_tff_observer_ok(http):
with patch(
"mcp_macro.server.fetch_cot_tff",
new=AsyncMock(return_value={"symbol": "ES", "rows": []}),
):
r = http.post(
"/tools/get_cot_tff",
headers={"Authorization": "Bearer ot"},
json={"symbol": "ES"},
)
assert r.status_code == 200
def test_get_cot_tff_no_auth_401(http):
r = http.post("/tools/get_cot_tff", json={"symbol": "ES"})
assert r.status_code == 401
def test_get_cot_disagg_observer_ok(http):
with patch(
"mcp_macro.server.fetch_cot_disaggregated",
new=AsyncMock(return_value={"symbol": "CL", "rows": []}),
):
r = http.post(
"/tools/get_cot_disaggregated",
headers={"Authorization": "Bearer ot"},
json={"symbol": "CL"},
)
assert r.status_code == 200
def test_get_cot_disagg_no_auth_401(http):
r = http.post("/tools/get_cot_disaggregated", json={"symbol": "CL"})
assert r.status_code == 401
def test_get_cot_extreme_positioning_ok(http):
with patch(
"mcp_macro.server.fetch_cot_extreme_positioning",
new=AsyncMock(return_value={"extremes": []}),
):
r = http.post(
"/tools/get_cot_extreme_positioning",
headers={"Authorization": "Bearer ot"},
json={},
)
assert r.status_code == 200
def test_get_cot_extreme_positioning_lookback_too_short(http):
"""Pydantic validation: lookback_weeks < 4 → 422."""
r = http.post(
"/tools/get_cot_extreme_positioning",
headers={"Authorization": "Bearer ct"},
json={"lookback_weeks": 2},
)
assert r.status_code == 422
@@ -1,27 +0,0 @@
[project]
name = "mcp-sentiment"
version = "0.1.0"
requires-python = ">=3.11"
dependencies = [
"mcp-common",
"fastapi>=0.115",
"uvicorn[standard]>=0.30",
"httpx>=0.27",
"pydantic>=2.6",
]
[project.optional-dependencies]
dev = ["pytest>=8", "pytest-asyncio>=0.23", "pytest-httpx>=0.30"]
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
[tool.hatch.build.targets.wheel]
packages = ["src/mcp_sentiment"]
[tool.uv.sources]
mcp-common = { workspace = true }
[project.scripts]
mcp-sentiment = "mcp_sentiment.__main__:main"
@@ -1,46 +0,0 @@
from __future__ import annotations
import json
import os
import uvicorn
from mcp_common.auth import load_token_store_from_files
from mcp_common.logging import configure_root_logging
from mcp_sentiment.server import create_app
def _load_cryptopanic_key() -> str:
"""CER-002: preferisci file secret, fallback a env CRYPTOPANIC_API_KEY."""
creds_file = os.environ.get("SENTIMENT_CREDENTIALS_FILE")
if creds_file and os.path.exists(creds_file):
try:
with open(creds_file) as f:
creds = json.load(f)
key = (creds.get("cryptopanic_key") or "").strip()
if key and key.lower() not in ("placeholder", "changeme", "none"):
return key
except (OSError, json.JSONDecodeError):
pass
return (os.environ.get("CRYPTOPANIC_API_KEY") or "").strip()
configure_root_logging() # CER-P5-009
def main():
key = _load_cryptopanic_key()
token_store = load_token_store_from_files(
core_token_file=os.environ.get("CORE_TOKEN_FILE"),
observer_token_file=os.environ.get("OBSERVER_TOKEN_FILE"),
)
app = create_app(cryptopanic_key=key, token_store=token_store)
uvicorn.run(
app,
log_config=None, # CER-P5-009: delegate to the root JSON logger
host=os.environ.get("HOST", "0.0.0.0"),
port=int(os.environ.get("PORT", "9014")),
)
if __name__ == "__main__":
main()
@@ -1,174 +0,0 @@
from __future__ import annotations
import logging
import os
from fastapi import Depends, FastAPI, HTTPException
from mcp_common.auth import Principal, TokenStore, require_principal
from mcp_common.mcp_bridge import mount_mcp_endpoint
from mcp_common.server import build_app
from pydantic import BaseModel
from mcp_sentiment.fetchers import (
fetch_cointegration_pairs,
fetch_cross_exchange_funding,
fetch_crypto_news,
fetch_funding_arb_spread,
fetch_funding_rates,
fetch_liquidation_heatmap,
fetch_oi_history,
fetch_social_sentiment,
fetch_world_news,
)
logger = logging.getLogger(__name__)
# --- Body models ---
class GetCryptoNewsReq(BaseModel):
limit: int = 20
class GetSocialSentimentReq(BaseModel):
symbol: str = "BTC"
class GetFundingRatesReq(BaseModel):
asset: str = "BTC"
class GetWorldNewsReq(BaseModel):
pass
class GetCrossExchangeFundingReq(BaseModel):
assets: list[str] | None = None
class GetFundingArbSpreadReq(BaseModel):
assets: list[str] | None = None
class GetLiquidationHeatmapReq(BaseModel):
asset: str = "BTC"
class GetCointegrationPairsReq(BaseModel):
pairs: list[list[str]] | None = None
lookback_hours: int = 24
class GetOiHistoryReq(BaseModel):
asset: str = "BTC"
period: str = "5m"
limit: int = 288
# --- ACL helper ---
def _check(principal: Principal, *, core: bool = False, observer: bool = False) -> None:
allowed: set[str] = set()
if core:
allowed.add("core")
if observer:
allowed.add("observer")
if not (principal.capabilities & allowed):
raise HTTPException(403, f"capability required: {allowed}")
# --- App factory ---
def create_app(*, cryptopanic_key: str = "", token_store: TokenStore) -> FastAPI:
app = build_app(name="mcp-sentiment", version="0.1.0", token_store=token_store)
if not cryptopanic_key or cryptopanic_key.lower() in ("placeholder", "none", "changeme"):
logger.warning(
"mcp-sentiment: cryptopanic_key mancante o placeholder — get_crypto_news "
"ritornerà headlines=[] con note diagnostica"
)
@app.post("/tools/get_crypto_news", tags=["reads"])
async def t_get_crypto_news(
body: GetCryptoNewsReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_crypto_news(api_key=cryptopanic_key, limit=body.limit)
@app.post("/tools/get_social_sentiment", tags=["reads"])
async def t_get_social_sentiment(
body: GetSocialSentimentReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_social_sentiment(body.symbol)
@app.post("/tools/get_funding_rates", tags=["reads"])
async def t_get_funding_rates(
body: GetFundingRatesReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_funding_rates(body.asset)
@app.post("/tools/get_world_news", tags=["reads"])
async def t_get_world_news(
body: GetWorldNewsReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_world_news()
@app.post("/tools/get_cross_exchange_funding", tags=["reads"])
async def t_get_cross_exchange_funding(
body: GetCrossExchangeFundingReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_cross_exchange_funding(body.assets)
@app.post("/tools/get_funding_arb_spread", tags=["reads"])
async def t_get_funding_arb_spread(
body: GetFundingArbSpreadReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_funding_arb_spread(body.assets)
@app.post("/tools/get_liquidation_heatmap", tags=["reads"])
async def t_get_liquidation_heatmap(
body: GetLiquidationHeatmapReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_liquidation_heatmap(body.asset)
@app.post("/tools/get_cointegration_pairs", tags=["reads"])
async def t_get_cointegration_pairs(
body: GetCointegrationPairsReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_cointegration_pairs(body.pairs, body.lookback_hours)
@app.post("/tools/get_oi_history", tags=["reads"])
async def t_get_oi_history(
body: GetOiHistoryReq, principal: Principal = Depends(require_principal)
):
_check(principal, core=True, observer=True)
return await fetch_oi_history(body.asset, body.period, body.limit)
# ───── MCP endpoint (/mcp): bridge to /tools/* ─────
port = int(os.environ.get("PORT", "9014"))
mount_mcp_endpoint(
app,
name="cerbero-sentiment",
version="0.1.0",
token_store=token_store,
internal_base_url=f"http://localhost:{port}",
tools=[
{"name": "get_crypto_news", "description": "News crypto da CryptoPanic."},
{"name": "get_social_sentiment", "description": "Sentiment aggregato social."},
{"name": "get_funding_rates", "description": "Funding rates aggregati."},
{"name": "get_world_news", "description": "News macro/world."},
{"name": "get_cross_exchange_funding", "description": "Funding multi-asset multi-exchange + arbitrage opportunities."},
{"name": "get_oi_history", "description": "Open interest history perp (Binance) + delta_pct 1h/4h/24h."},
{"name": "get_funding_arb_spread", "description": "Opportunità arbitrage funding cross-exchange in formato compatto + annualized %."},
{"name": "get_liquidation_heatmap", "description": "Pressione liquidazioni heuristica da OI delta + funding (long/short squeeze risk)."},
{"name": "get_cointegration_pairs", "description": "Engle-Granger cointegration test su coppie crypto Binance hourly."},
],
)
return app
@@ -1,216 +0,0 @@
from __future__ import annotations
from unittest.mock import AsyncMock, patch
import pytest
from fastapi.testclient import TestClient
from mcp_common.auth import Principal, TokenStore
from mcp_sentiment.server import create_app
@pytest.fixture
def http():
store = TokenStore(
tokens={
"ct": Principal("core", {"core"}),
"ot": Principal("observer", {"observer"}),
}
)
app = create_app(cryptopanic_key="testkey", token_store=store)
return TestClient(app)
# --- Health ---
def test_health(http):
assert http.get("/health").status_code == 200
# --- get_crypto_news ---
def test_get_crypto_news_core_ok(http):
with patch(
"mcp_sentiment.server.fetch_crypto_news",
new=AsyncMock(return_value={"headlines": []}),
):
r = http.post(
"/tools/get_crypto_news",
headers={"Authorization": "Bearer ct"},
json={"limit": 5},
)
assert r.status_code == 200
def test_get_crypto_news_observer_ok(http):
with patch(
"mcp_sentiment.server.fetch_crypto_news",
new=AsyncMock(return_value={"headlines": []}),
):
r = http.post(
"/tools/get_crypto_news",
headers={"Authorization": "Bearer ot"},
json={},
)
assert r.status_code == 200
def test_get_crypto_news_no_auth_401(http):
r = http.post("/tools/get_crypto_news", json={})
assert r.status_code == 401
# --- get_social_sentiment ---
def test_get_social_sentiment_core_ok(http):
with patch(
"mcp_sentiment.server.fetch_social_sentiment",
new=AsyncMock(return_value={"fear_greed_index": 65, "fear_greed_label": "Greed"}),
):
r = http.post(
"/tools/get_social_sentiment",
headers={"Authorization": "Bearer ct"},
json={},
)
assert r.status_code == 200
assert r.json()["fear_greed_index"] == 65
def test_get_social_sentiment_observer_ok(http):
with patch(
"mcp_sentiment.server.fetch_social_sentiment",
new=AsyncMock(return_value={"fear_greed_index": 65}),
):
r = http.post(
"/tools/get_social_sentiment",
headers={"Authorization": "Bearer ot"},
json={},
)
assert r.status_code == 200
def test_get_social_sentiment_no_auth_401(http):
r = http.post("/tools/get_social_sentiment", json={})
assert r.status_code == 401
# --- get_funding_rates ---
def test_get_funding_rates_core_ok(http):
with patch(
"mcp_sentiment.server.fetch_funding_rates",
new=AsyncMock(return_value={"rates": []}),
):
r = http.post(
"/tools/get_funding_rates",
headers={"Authorization": "Bearer ct"},
json={},
)
assert r.status_code == 200
def test_get_funding_rates_observer_ok(http):
with patch(
"mcp_sentiment.server.fetch_funding_rates",
new=AsyncMock(return_value={"rates": []}),
):
r = http.post(
"/tools/get_funding_rates",
headers={"Authorization": "Bearer ot"},
json={},
)
assert r.status_code == 200
def test_get_funding_rates_no_auth_401(http):
r = http.post("/tools/get_funding_rates", json={})
assert r.status_code == 401
# --- get_world_news ---
def test_get_world_news_core_ok(http):
with patch(
"mcp_sentiment.server.fetch_world_news",
new=AsyncMock(return_value={"articles": [], "count": 0}),
):
r = http.post(
"/tools/get_world_news",
headers={"Authorization": "Bearer ct"},
json={},
)
assert r.status_code == 200
def test_get_world_news_observer_ok(http):
with patch(
"mcp_sentiment.server.fetch_world_news",
new=AsyncMock(return_value={"articles": [], "count": 0}),
):
r = http.post(
"/tools/get_world_news",
headers={"Authorization": "Bearer ot"},
json={},
)
assert r.status_code == 200
def test_get_world_news_no_auth_401(http):
r = http.post("/tools/get_world_news", json={})
assert r.status_code == 401
# --- New indicators: funding_arb_spread, liquidation_heatmap, cointegration_pairs ---
def test_get_funding_arb_spread_ok(http):
with patch(
"mcp_sentiment.server.fetch_funding_arb_spread",
new=AsyncMock(return_value={"opportunities": []}),
):
r = http.post(
"/tools/get_funding_arb_spread",
headers={"Authorization": "Bearer ot"},
json={},
)
assert r.status_code == 200
def test_get_funding_arb_spread_no_auth_401(http):
r = http.post("/tools/get_funding_arb_spread", json={})
assert r.status_code == 401
def test_get_liquidation_heatmap_ok(http):
with patch(
"mcp_sentiment.server.fetch_liquidation_heatmap",
new=AsyncMock(return_value={"asset": "BTC", "long_squeeze_risk": "low"}),
):
r = http.post(
"/tools/get_liquidation_heatmap",
headers={"Authorization": "Bearer ct"},
json={"asset": "BTC"},
)
assert r.status_code == 200
def test_get_liquidation_heatmap_no_auth_401(http):
r = http.post("/tools/get_liquidation_heatmap", json={"asset": "BTC"})
assert r.status_code == 401
def test_get_cointegration_pairs_ok(http):
with patch(
"mcp_sentiment.server.fetch_cointegration_pairs",
new=AsyncMock(return_value={"results": []}),
):
r = http.post(
"/tools/get_cointegration_pairs",
headers={"Authorization": "Bearer ot"},
json={"pairs": [["BTC", "ETH"]]},
)
assert r.status_code == 200
def test_get_cointegration_pairs_no_auth_401(http):
r = http.post("/tools/get_cointegration_pairs", json={})
assert r.status_code == 401
@@ -0,0 +1,92 @@
"""Entrypoint cerbero-mcp.
Boot:
- carica Settings da .env
- costruisce app FastAPI con router per ogni exchange
- crea ClientRegistry con builder
- monta lifespan per chiusura pulita
- avvia uvicorn
"""
from __future__ import annotations
import contextlib
from contextlib import asynccontextmanager
from typing import Literal, cast
import uvicorn
from fastapi import FastAPI
from cerbero_mcp import admin
from cerbero_mcp.client_registry import ClientRegistry
from cerbero_mcp.common.logging import configure_root_logging
from cerbero_mcp.exchanges import build_client
from cerbero_mcp.routers import (
alpaca,
bybit,
deribit,
hyperliquid,
ibkr,
macro,
sentiment,
)
from cerbero_mcp.server import build_app
from cerbero_mcp.settings import Settings
def _make_app(settings: Settings) -> FastAPI:
app = build_app(
testnet_token=settings.testnet_token.get_secret_value(),
mainnet_token=settings.mainnet_token.get_secret_value(),
title="Cerbero MCP",
version="2.0.0",
)
app.state.settings = settings
async def builder(exchange: str, env: str):
return await build_client(
settings, exchange, cast(Literal["testnet", "mainnet"], env)
)
app.state.registry = ClientRegistry(builder=builder)
@asynccontextmanager
async def lifespan(app: FastAPI):
try:
yield
finally:
# Stop any IBKR WebSocket singletons before closing client registry
ibkr_ws_dict = getattr(app.state, "ibkr_ws", {}) or {}
for ws in ibkr_ws_dict.values():
with contextlib.suppress(Exception):
await ws.stop()
await app.state.registry.aclose()
app.router.lifespan_context = lifespan
app.include_router(deribit.make_router())
app.include_router(bybit.make_router())
app.include_router(hyperliquid.make_router())
app.include_router(alpaca.make_router())
app.include_router(ibkr.make_router())
app.include_router(macro.make_router())
app.include_router(sentiment.make_router())
app.include_router(admin.make_admin_router())
return app
def main() -> None:
configure_root_logging()
settings = Settings() # type: ignore[call-arg]
app = _make_app(settings)
uvicorn.run(
app,
log_config=None,
host=settings.host,
port=settings.port,
)
if __name__ == "__main__":
main()
@@ -0,0 +1,243 @@
"""Endpoint admin: query audit log con filtri."""
from __future__ import annotations
import json
import os
from datetime import datetime
from pathlib import Path
from typing import Any, Literal
from fastapi import APIRouter, HTTPException, Query, Request
from pydantic import BaseModel, SecretStr
from cerbero_mcp.exchanges.ibkr.key_rotation import KeyRotationManager
MAX_RECORDS = 10000
DEFAULT_LIMIT = 1000
def _parse_iso(value: str | None) -> datetime | None:
if not value:
return None
try:
# accepts both "2026-05-01" and "2026-05-01T12:34:56Z"
return datetime.fromisoformat(value.replace("Z", "+00:00"))
except ValueError as e:
raise HTTPException(400, f"invalid datetime: {value}") from e
def _record_timestamp(rec: dict[str, Any]) -> datetime | None:
"""Estrae il timestamp da un record audit. JsonFormatter mette 'asctime'
in formato '2026-05-01 12:34:56,789'. Lo parsiamo come UTC.
"""
ts = rec.get("asctime") or rec.get("timestamp")
if not ts:
return None
try:
# asctime format default: 'YYYY-MM-DD HH:MM:SS,mmm'
ts_clean = ts.replace(",", ".")
return datetime.fromisoformat(ts_clean)
except ValueError:
return None
def _matches_filters(
rec: dict[str, Any],
*,
from_dt: datetime | None,
to_dt: datetime | None,
actor: str | None,
exchange: str | None,
action: str | None,
bot_tag: str | None,
) -> bool:
if rec.get("audit_event") != "write_op":
return False
if actor is not None and rec.get("actor") != actor:
return False
if exchange is not None and rec.get("exchange") != exchange:
return False
if action is not None and rec.get("action") != action:
return False
if bot_tag is not None and rec.get("bot_tag") != bot_tag:
return False
if from_dt is not None or to_dt is not None:
rec_ts = _record_timestamp(rec)
if rec_ts is None:
return False
if from_dt is not None and rec_ts < from_dt:
return False
if to_dt is not None and rec_ts > to_dt:
return False
return True
def _read_audit_records(file_path: Path) -> list[dict[str, Any]]:
if not file_path.exists():
return []
out: list[dict[str, Any]] = []
with file_path.open("r", encoding="utf-8") as f:
for line in f:
stripped = line.strip()
if not stripped:
continue
try:
out.append(json.loads(stripped))
except json.JSONDecodeError:
continue
return out
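`_read_audit_records` reads the JSONL file tolerantly: blank or malformed lines are skipped so one corrupt row cannot fail the whole query. The same policy applied to an in-memory string (hypothetical helper):

```python
import json
from typing import Any

def parse_jsonl(text: str) -> list[dict[str, Any]]:
    # Keep every line that parses as JSON; silently drop the rest.
    out: list[dict[str, Any]] = []
    for line in text.splitlines():
        stripped = line.strip()
        if not stripped:
            continue
        try:
            out.append(json.loads(stripped))
        except json.JSONDecodeError:
            continue
    return out
```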
def make_admin_router() -> APIRouter:
r = APIRouter(prefix="/admin", tags=["admin"])
@r.get("/audit")
async def query_audit(
request: Request,
from_: str | None = Query(None, alias="from"),
to: str | None = Query(None),
actor: Literal["testnet", "mainnet"] | None = Query(None),
exchange: str | None = Query(None),
action: str | None = Query(None),
bot_tag: str | None = Query(None),
limit: int = Query(DEFAULT_LIMIT, ge=1, le=MAX_RECORDS),
) -> dict[str, Any]:
"""Restituisce i record audit_write_op filtrati.
Param query (tutti opzionali):
- from / to: ISO 8601 datetime (es. 2026-05-01 oppure 2026-05-01T12:34:56)
- actor: testnet | mainnet
- exchange: deribit | bybit | hyperliquid | alpaca
- action: nome del tool (es. place_order)
- bot_tag: identificatore bot
- limit: max record da ritornare (default 1000, max 10000)
Source: AUDIT_LOG_FILE (env var). Se non settata, ritorna lista vuota
con warning.
"""
from_dt = _parse_iso(from_)
to_dt = _parse_iso(to)
file_str = os.environ.get("AUDIT_LOG_FILE", "").strip()
if not file_str:
return {
"records": [],
"count": 0,
"warning": "AUDIT_LOG_FILE not configured; no persistent audit log to query",
"from": from_,
"to": to,
}
file_path = Path(file_str)
all_records = _read_audit_records(file_path)
filtered = [
rec for rec in all_records
if _matches_filters(
rec,
from_dt=from_dt, to_dt=to_dt,
actor=actor, exchange=exchange, action=action,
bot_tag=bot_tag,
)
]
# sort descending by timestamp (newest first), then apply limit
filtered.sort(
key=lambda rec: _record_timestamp(rec) or datetime.min,
reverse=True,
)
if len(filtered) > limit:
filtered = filtered[:limit]
return {
"records": filtered,
"count": len(filtered),
"from": from_,
"to": to,
"filters": {
"actor": actor, "exchange": exchange,
"action": action, "bot_tag": bot_tag,
},
}
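The newest-first ordering in `query_audit` keys each record on its parsed timestamp, with `datetime.min` pushing unparseable records to the end before the limit is applied. A reduced sketch (the `ts` field is hypothetical; the endpoint derives it from `asctime`):

```python
from datetime import datetime
from typing import Any

def newest_first(records: list[dict[str, Any]], limit: int) -> list[dict[str, Any]]:
    # Records without a datetime sort as datetime.min, i.e. last.
    def key(rec: dict[str, Any]) -> datetime:
        ts = rec.get("ts")
        return ts if isinstance(ts, datetime) else datetime.min
    return sorted(records, key=key, reverse=True)[:limit]
```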
class _IBKRRotateConfirmReq(BaseModel):
new_consumer_key: str
new_access_token: str
new_access_token_secret: str
@r.post("/ibkr/rotate-keys/start")
async def _ibkr_rotate_start(env: str, request: Request):
if env not in ("testnet", "mainnet"):
raise HTTPException(400, detail={"error": "invalid env"})
settings = request.app.state.settings
creds = settings.ibkr.credentials(env)
mgr = KeyRotationManager(
signature_key_path=creds["signature_key_path"],
encryption_key_path=creds["encryption_key_path"],
)
rotations = getattr(request.app.state, "ibkr_rotations", None)
if rotations is None:
rotations = {}
request.app.state.ibkr_rotations = rotations
rotations[env] = mgr
return await mgr.start()
@r.post("/ibkr/rotate-keys/confirm")
async def _ibkr_rotate_confirm(
env: str, body: _IBKRRotateConfirmReq, request: Request,
):
if env not in ("testnet", "mainnet"):
raise HTTPException(400, detail={"error": "invalid env"})
rotations = getattr(request.app.state, "ibkr_rotations", {}) or {}
mgr = rotations.get(env)
if mgr is None:
raise HTTPException(409, detail={"error": "rotation not started"})
settings = request.app.state.settings
if env == "testnet":
settings.ibkr.consumer_key_testnet = body.new_consumer_key
settings.ibkr.access_token_testnet = body.new_access_token
settings.ibkr.access_token_secret_testnet = SecretStr(body.new_access_token_secret)
else:
settings.ibkr.consumer_key_live = body.new_consumer_key
settings.ibkr.access_token_live = body.new_access_token
settings.ibkr.access_token_secret_live = SecretStr(body.new_access_token_secret)
registry = request.app.state.registry
registry._clients.pop(("ibkr", env), None)
async def _validate() -> bool:
try:
client = await registry.get("ibkr", env)
await client._request("GET", "/iserver/auth/status", skip_tickle=True)
return True
except Exception:
return False
try:
return await mgr.confirm(validate=_validate)
finally:
rotations.pop(env, None)
@r.post("/ibkr/rotate-keys/abort")
async def _ibkr_rotate_abort(env: str, request: Request):
rotations = getattr(request.app.state, "ibkr_rotations", {}) or {}
mgr = rotations.pop(env, None)
if mgr is None:
return {"aborted": False, "reason": "no rotation in progress"}
return await mgr.abort()
@r.post("/ibkr/health")
async def _ibkr_health(request: Request):
registry = request.app.state.registry
out: dict[str, Any] = {}
for env in ("testnet", "mainnet"):
try:
client = await registry.get("ibkr", env)
status = await client._request(
"GET", "/iserver/auth/status", skip_tickle=True
)
out[env] = {"healthy": True, "status": status}
except Exception as e:
out[env] = {"healthy": False, "error": str(e)[:200]}
return out
return r
@@ -0,0 +1,106 @@
"""Bearer auth middleware: bearer token → request.state.environment.
It also requires an `X-Bot-Tag` header on every non-whitelisted call so
that the audit log can identify the calling bot.
"""
from __future__ import annotations
import secrets
from typing import Literal
from fastapi import FastAPI, Request, status
from fastapi.responses import JSONResponse
Environment = Literal["testnet", "mainnet"]
# Paths that bypass both bearer auth and the bot_tag check.
PATH_WHITELIST_FULL = frozenset(
{
"/health",
"/health/ready",
"/apidocs",
"/openapi.json",
"/docs",
"/redoc",
}
)
# Paths that require bearer auth but NOT the bot_tag (admin endpoints).
PATH_WHITELIST_BOT_TAG_ONLY = frozenset({"/admin/audit"})
# Backward-compat alias (legacy imports).
WHITELIST_PATHS = PATH_WHITELIST_FULL
MAX_BOT_TAG_LEN = 64
def _extract_bearer(auth_header: str) -> str | None:
if not auth_header.startswith("Bearer "):
return None
token = auth_header[len("Bearer "):].strip()
return token or None
def _check_token(
candidate: str, testnet_token: str, mainnet_token: str
) -> Environment | None:
if secrets.compare_digest(candidate, testnet_token):
return "testnet"
if secrets.compare_digest(candidate, mainnet_token):
return "mainnet"
return None
def install_auth_middleware(
app: FastAPI,
*,
testnet_token: str,
mainnet_token: str,
) -> None:
"""Register the bearer auth + bot_tag middleware on the FastAPI app."""
@app.middleware("http")
async def auth_middleware(request: Request, call_next):
path = request.url.path
# 1. Fully whitelisted: no checks.
if path in PATH_WHITELIST_FULL:
return await call_next(request)
# 2. Bearer auth (always required).
token = _extract_bearer(request.headers.get("Authorization", ""))
if token is None:
return JSONResponse(
status_code=status.HTTP_401_UNAUTHORIZED,
content={"error": {"code": "UNAUTHORIZED",
"message": "missing or malformed bearer token"}},
)
env = _check_token(token, testnet_token, mainnet_token)
if env is None:
return JSONResponse(
status_code=status.HTTP_401_UNAUTHORIZED,
content={"error": {"code": "UNAUTHORIZED",
"message": "invalid token"}},
)
request.state.environment = env
# 3. Partial whitelist (admin): bearer ok, skip the bot_tag check.
if path in PATH_WHITELIST_BOT_TAG_ONLY:
return await call_next(request)
# 4. X-Bot-Tag is mandatory.
raw_tag = request.headers.get("X-Bot-Tag", "")
tag = raw_tag.strip() if raw_tag else ""
if not tag:
return JSONResponse(
status_code=status.HTTP_400_BAD_REQUEST,
content={"error": {"code": "BAD_REQUEST",
"message": "missing X-Bot-Tag header"}},
)
if len(tag) > MAX_BOT_TAG_LEN:
return JSONResponse(
status_code=status.HTTP_400_BAD_REQUEST,
content={"error": {"code": "BAD_REQUEST",
"message": "X-Bot-Tag too long"}},
)
request.state.bot_tag = tag
return await call_next(request)
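The token-extraction and constant-time comparison steps of the middleware can be exercised in isolation. These are stand-alone copies of the two helpers above (typed loosely for illustration):

```python
import secrets

def extract_bearer(auth_header: str):
    # Mirror of _extract_bearer: require the "Bearer " prefix, reject empty tokens.
    if not auth_header.startswith("Bearer "):
        return None
    token = auth_header[len("Bearer "):].strip()
    return token or None

def check_token(candidate: str, testnet_token: str, mainnet_token: str):
    # Mirror of _check_token: compare_digest keeps the comparison constant-time.
    if secrets.compare_digest(candidate, testnet_token):
        return "testnet"
    if secrets.compare_digest(candidate, mainnet_token):
        return "mainnet"
    return None

token = extract_bearer("Bearer tok-test")
env = check_token(token, "tok-test", "tok-live")
```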
@@ -0,0 +1,40 @@
"""Lazy cache of exchange clients, one instance per (exchange, env)."""
from __future__ import annotations
import asyncio
import contextlib
from collections import defaultdict
from collections.abc import Awaitable, Callable
from typing import Any, Literal
Environment = Literal["testnet", "mainnet"]
Builder = Callable[[str, Environment], Awaitable[Any]]
class ClientRegistry:
def __init__(self, *, builder: Builder) -> None:
self._builder = builder
self._clients: dict[tuple[str, Environment], Any] = {}
self._locks: dict[tuple[str, Environment], asyncio.Lock] = defaultdict(
asyncio.Lock
)
async def get(self, exchange: str, env: Environment) -> Any:
key = (exchange, env)
if key in self._clients:
return self._clients[key]
async with self._locks[key]:
if key in self._clients: # double-check
return self._clients[key]
client = await self._builder(exchange, env)
self._clients[key] = client
return client
async def aclose(self) -> None:
for client in self._clients.values():
close = getattr(client, "aclose", None)
if close is None:
continue
with contextlib.suppress(Exception):
await close()
self._clients.clear()
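The double-checked locking in `get` guarantees a single builder call per key even under concurrent awaits. A minimal mirror of the class above, with a stub builder counting invocations (names here are illustrative):

```python
import asyncio
from collections import defaultdict

class MiniRegistry:
    """Minimal mirror of ClientRegistry, for illustration only."""
    def __init__(self, builder):
        self._builder = builder
        self._clients = {}
        self._locks = defaultdict(asyncio.Lock)

    async def get(self, exchange, env):
        key = (exchange, env)
        if key in self._clients:
            return self._clients[key]
        async with self._locks[key]:
            if key in self._clients:  # double-check after acquiring the lock
                return self._clients[key]
            client = await self._builder(exchange, env)
            self._clients[key] = client
            return client

calls = 0

async def builder(exchange, env):
    global calls
    calls += 1
    await asyncio.sleep(0)  # yield so concurrent gets overlap
    return object()

async def main():
    reg = MiniRegistry(builder)
    a, b = await asyncio.gather(reg.get("deribit", "testnet"),
                                reg.get("deribit", "testnet"))
    return a is b

same = asyncio.run(main())
```

Both concurrent `get` calls return the same instance and the builder runs exactly once.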
@@ -23,8 +23,7 @@ import os
from logging.handlers import TimedRotatingFileHandler
from typing import Any
from mcp_common.auth import Principal
from mcp_common.logging import SecretsFilter, get_json_logger
from cerbero_mcp.common.logging import SecretsFilter, get_json_logger
try:
from pythonjsonlogger.json import JsonFormatter as _JsonFormatter # noqa: N813
@@ -67,34 +66,42 @@ def _configure_audit_sink() -> None:
def audit_write_op(
*,
principal: Principal | None,
actor: str | None = None,
bot_tag: str | None = None,
action: str,
exchange: str,
target: str | None = None,
payload: dict[str, Any] | None = None,
result: dict[str, Any] | None = None,
error: str | None = None,
request_id: str | None = None,
) -> None:
"""Emit a structured audit log record per write operation.
principal: who invoked the call (None if anonymous, though normally
_check prevents reaching this point without a principal).
actor: identifier of the caller (e.g. "testnet", "mainnet",
or None for anonymous logging).
bot_tag: identifier of the calling bot (X-Bot-Tag header).
action: tool name (e.g. "place_order", "cancel_order").
exchange: service identifier (deribit, bybit, alpaca, hyperliquid).
target: instrument/symbol/order_id being acted on.
payload: non-sensitive input (qty, side, leverage, etc.).
result: client output (order_id, status, etc.).
error: error string if the operation failed.
request_id: id propagated from the request-log middleware to correlate
audit log entries with request log entries.
"""
_configure_audit_sink()
record: dict[str, Any] = {
"audit_event": "write_op",
"action": action,
"exchange": exchange,
"principal": principal.name if principal else None,
"actor": actor,
"bot_tag": bot_tag,
"target": target,
"payload": payload or {},
}
if request_id is not None:
record["request_id"] = request_id
if result is not None:
record["result"] = _summarize_result(result)
if error is not None:
@@ -0,0 +1,100 @@
"""Helper for wiring audit_write_op into the routers.
Usage pattern in a router::
@r.post("/tools/place_order")
async def _place_order(
params: t.PlaceOrderReq,
request: Request,
client: DeribitClient = Depends(get_deribit_client),
):
return await audit_call(
request=request,
exchange="deribit",
action="place_order",
target_field="instrument_name",
params=params,
tool_fn=lambda: t.place_order(client, params, creds=...),
)
"""
from __future__ import annotations
from collections.abc import Awaitable, Callable
from typing import Any
from fastapi import Request
from pydantic import BaseModel
from cerbero_mcp.common.audit import audit_write_op
def _extract_target(params: BaseModel | None, target_field: str | None) -> str | None:
if params is None or target_field is None:
return None
val = getattr(params, target_field, None)
if val is None:
return None
return str(val)
def _safe_dump(params: BaseModel | None) -> dict[str, Any]:
if params is None:
return {}
try:
return params.model_dump(mode="json", exclude_none=True)
except Exception:
return {}
async def audit_call(
*,
request: Request,
exchange: str,
action: str,
tool_fn: Callable[[], Awaitable[Any]],
params: BaseModel | None = None,
target_field: str | None = None,
) -> Any:
"""Run tool_fn and write an audit record (success or error). Re-raises exceptions."""
actor = getattr(request.state, "environment", None)
bot_tag = getattr(request.state, "bot_tag", None)
request_id = getattr(request.state, "request_id", None)
target = _extract_target(params, target_field)
payload = _safe_dump(params)
try:
result = await tool_fn()
except Exception as e:
audit_write_op(
actor=actor,
bot_tag=bot_tag,
action=action,
exchange=exchange,
target=target,
payload=payload,
error=f"{type(e).__name__}: {e}",
request_id=request_id,
)
raise
# If result is a dict, pass it raw; otherwise try to serialize it
audit_result: dict[str, Any] | None = None
if isinstance(result, dict):
audit_result = result
elif hasattr(result, "model_dump"):
try:
audit_result = result.model_dump(mode="json")
except Exception:
audit_result = None
audit_write_op(
actor=actor,
bot_tag=bot_tag,
action=action,
exchange=exchange,
target=target,
payload=payload,
result=audit_result,
request_id=request_id,
)
return result
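The log-on-error / log-on-success shape of `audit_call` can be sketched with an in-memory sink in place of `audit_write_op` (the `audit_write` collector and the simplified signature are assumptions for illustration):

```python
import asyncio

audit_log = []

def audit_write(**record):
    # Stand-in for audit_write_op: just collect records in memory.
    audit_log.append(record)

async def audit_call(*, action, exchange, tool_fn):
    # Minimal mirror of the wrapper above: on failure log the error and
    # re-raise; on success log the result and return it.
    try:
        result = await tool_fn()
    except Exception as e:
        audit_write(action=action, exchange=exchange,
                    error=f"{type(e).__name__}: {e}")
        raise
    audit_write(action=action, exchange=exchange, result=result)
    return result

async def ok():
    return {"order_id": "abc"}

res = asyncio.run(audit_call(action="place_order", exchange="deribit", tool_fn=ok))
```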
@@ -0,0 +1,51 @@
"""Standard error envelope for all MCP tools."""
from __future__ import annotations
import uuid
from datetime import UTC, datetime
from typing import Any
def error_envelope(
*,
type_: str,
code: str,
message: str,
retryable: bool,
suggested_fix: str | None = None,
details: dict | None = None,
request_id: str | None = None,
) -> dict:
env: dict[str, Any] = {
"error": {
"type": type_,
"code": code,
"message": message,
"retryable": retryable,
},
"request_id": request_id or uuid.uuid4().hex,
"data_timestamp": datetime.now(UTC).isoformat(),
}
if suggested_fix:
env["error"]["suggested_fix"] = suggested_fix
if details:
env["error"]["details"] = details
return env
HTTP_CODE_MAP = {
400: "BAD_REQUEST",
401: "UNAUTHORIZED",
403: "FORBIDDEN",
404: "NOT_FOUND",
408: "TIMEOUT",
409: "CONFLICT",
422: "VALIDATION_ERROR",
429: "RATE_LIMIT",
500: "INTERNAL_ERROR",
502: "UPSTREAM_ERROR",
503: "UNAVAILABLE",
504: "GATEWAY_TIMEOUT",
}
RETRYABLE_STATUSES = frozenset({408, 429, 502, 503, 504})
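Wiring `HTTP_CODE_MAP` and `RETRYABLE_STATUSES` into the envelope for an upstream failure looks roughly like this; the `from_status` helper is illustrative, not part of the module, and the maps are abbreviated copies:

```python
import uuid
from datetime import datetime, timezone

HTTP_CODE_MAP = {400: "BAD_REQUEST", 429: "RATE_LIMIT", 502: "UPSTREAM_ERROR"}
RETRYABLE_STATUSES = frozenset({408, 429, 502, 503, 504})

def from_status(status: int, message: str) -> dict:
    # Hypothetical helper: build the standard envelope from an HTTP status.
    return {
        "error": {
            "type": "http",
            "code": HTTP_CODE_MAP.get(status, "INTERNAL_ERROR"),
            "message": message,
            "retryable": status in RETRYABLE_STATUSES,
        },
        "request_id": uuid.uuid4().hex,
        "data_timestamp": datetime.now(timezone.utc).isoformat(),
    }

env = from_status(502, "Deribit returned a non-JSON 5xx page")
```

A 502 maps to a retryable `UPSTREAM_ERROR` envelope, matching the fix described in the commit message.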
@@ -28,8 +28,6 @@ import httpx
from fastapi import FastAPI, Request
from fastapi.responses import JSONResponse
from mcp_common.auth import TokenStore
MCP_PROTOCOL_VERSION = "2024-11-05"
@@ -95,20 +93,22 @@ def mount_mcp_endpoint(
*,
name: str,
version: str,
token_store: TokenStore,
valid_tokens: set[str],
internal_base_url: str,
tools: list[dict],
) -> None:
"""Register an MCP JSON-RPC 2.0 endpoint on POST /mcp.
Each tool is proxied to POST {internal_base_url}/tools/<name> with the
MCP client's Bearer token (preserving the existing REST ACLs).
MCP client's Bearer token. Auth is already handled by the V2 middleware
(testnet/mainnet bearer); this re-checks that the token is in
valid_tokens before proxying.
Args:
app: the service's FastAPI instance
name: MCP server name
version: service version
token_store: the same store used by the REST tools
valid_tokens: set of valid tokens (testnet + mainnet)
internal_base_url: internal base URL (e.g. "http://localhost:9015")
tools: list of {"name": str, "description": str, "input_schema"?: dict}
"""
@@ -207,8 +207,7 @@ def mount_mcp_endpoint(
if not auth.startswith("Bearer "):
return JSONResponse({"error": "missing bearer token"}, status_code=401)
token = auth[len("Bearer "):].strip()
principal = token_store.get(token)
if principal is None:
if token not in valid_tokens:
return JSONResponse({"error": "invalid token"}, status_code=403)
body = await request.json()
@@ -0,0 +1,104 @@
"""Middleware: structured JSON request log for every HTTP request.
Emits one JSON line on the ``mcp.request`` logger with fields that can
be correlated with the audit log via ``request_id``. Also exposes
``request_id`` on ``request.state`` so that downstream handlers and
exception handlers can include it in their payloads.
"""
from __future__ import annotations
import logging
import time
import uuid
from collections.abc import Awaitable, Callable
from datetime import UTC, datetime
from typing import Any
from fastapi import FastAPI, Request
from starlette.responses import Response
from cerbero_mcp.common.logging import get_json_logger
_logger = get_json_logger("mcp.request", level=logging.INFO)
def _extract_exchange(path: str) -> str | None:
"""Extract the exchange name from the path if it matches ``/mcp-{exchange}/...``."""
if not path.startswith("/mcp-"):
return None
rest = path[len("/mcp-"):]
end = rest.find("/")
if end < 0:
return rest or None
return rest[:end] or None
def _extract_tool(path: str) -> str | None:
"""Extract the tool name from a ``/mcp-X/tools/Y`` path."""
parts = path.split("/")
# ["", "mcp-deribit", "tools", "place_order"]
if len(parts) >= 4 and parts[2] == "tools":
return parts[3] or None
return None
def install_request_log_middleware(app: FastAPI) -> None:
"""Add an HTTP middleware that logs one JSON line per request."""
@app.middleware("http")
async def request_log(
request: Request,
call_next: Callable[[Request], Awaitable[Response]],
) -> Response:
request_id = uuid.uuid4().hex
# Expose request_id for downstream use (audit, error envelope)
request.state.request_id = request_id
t0 = time.perf_counter()
status_code = 500
error: str | None = None
response: Response | None = None
try:
response = await call_next(request)
status_code = response.status_code
except Exception as e:
error = f"{type(e).__name__}: {str(e)[:200]}"
raise
finally:
dur_ms = (time.perf_counter() - t0) * 1000
path = request.url.path
payload: dict[str, Any] = {
"event": "request",
"request_id": request_id,
"method": request.method,
"path": path,
"status_code": status_code,
"duration_ms": round(dur_ms, 2),
"timestamp": datetime.now(UTC).isoformat(),
}
ua = request.headers.get("user-agent")
if ua:
payload["user_agent"] = ua[:200]
client = request.client
if client is not None:
payload["client_ip"] = client.host
actor = getattr(request.state, "environment", None)
if actor:
payload["actor"] = actor
bot_tag = getattr(request.state, "bot_tag", None)
if bot_tag:
payload["bot_tag"] = bot_tag
exchange = _extract_exchange(path)
if exchange:
payload["exchange"] = exchange
tool = _extract_tool(path)
if tool:
payload["tool"] = tool
if error:
payload["error"] = error
_logger.error("request", extra=payload)
else:
_logger.info("request", extra=payload)
# response is set if no exception occurred (otherwise the
# exception has already been re-raised by the except block).
assert response is not None
return response
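The two pure path-parsing helpers in this middleware are easy to exercise on their own. Stand-alone copies of `_extract_exchange` and `_extract_tool`:

```python
def extract_exchange(path: str):
    # Mirror of _extract_exchange: pull "deribit" out of "/mcp-deribit/...".
    if not path.startswith("/mcp-"):
        return None
    rest = path[len("/mcp-"):]
    end = rest.find("/")
    if end < 0:
        return rest or None
    return rest[:end] or None

def extract_tool(path: str):
    # Mirror of _extract_tool: pull "place_order" out of "/mcp-X/tools/place_order".
    parts = path.split("/")
    # e.g. ["", "mcp-deribit", "tools", "place_order"]
    if len(parts) >= 4 and parts[2] == "tools":
        return parts[3] or None
    return None

exchange = extract_exchange("/mcp-deribit/tools/place_order")
tool = extract_tool("/mcp-deribit/tools/place_order")
```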
@@ -0,0 +1,95 @@
"""Centralized client builder for ClientRegistry."""
from __future__ import annotations
from typing import Literal
from cerbero_mcp.settings import Settings
Environment = Literal["testnet", "mainnet"]
async def build_client(
settings: Settings, exchange: str, env: Environment
):
if exchange == "deribit":
from cerbero_mcp.exchanges.deribit.client import DeribitClient
url = settings.deribit.url_testnet if env == "testnet" else settings.deribit.url_live
cid, csec = settings.deribit.credentials(env)
return DeribitClient(
client_id=cid,
client_secret=csec,
testnet=(env == "testnet"),
base_url_override=url,
)
if exchange == "bybit":
from cerbero_mcp.exchanges.bybit.client import BybitClient
url = settings.bybit.url_testnet if env == "testnet" else settings.bybit.url_live
return BybitClient(
api_key=settings.bybit.api_key,
api_secret=settings.bybit.api_secret.get_secret_value(),
testnet=(env == "testnet"),
base_url=url,
)
if exchange == "hyperliquid":
from cerbero_mcp.exchanges.hyperliquid.client import HyperliquidClient
url = settings.hyperliquid.url_testnet if env == "testnet" else settings.hyperliquid.url_live
return HyperliquidClient(
wallet_address=settings.hyperliquid.wallet_address,
private_key=settings.hyperliquid.private_key.get_secret_value(),
testnet=(env == "testnet"),
api_wallet_address=settings.hyperliquid.api_wallet_address,
base_url=url,
)
if exchange == "alpaca":
from cerbero_mcp.exchanges.alpaca.client import AlpacaClient
url = settings.alpaca.url_testnet if env == "testnet" else settings.alpaca.url_live
return AlpacaClient(
api_key=settings.alpaca.api_key_id,
secret_key=settings.alpaca.secret_key.get_secret_value(),
paper=(env == "testnet"),
base_url=url,
)
if exchange == "macro":
# Read-only data provider; env ignored. The registry still
# instantiates two entries (testnet/mainnet); the cost is negligible
# (stateless wrapper with no HTTP session).
from cerbero_mcp.exchanges.macro.client import MacroClient
return MacroClient(
fred_api_key=settings.macro.fred_api_key.get_secret_value(),
finnhub_api_key=settings.macro.finnhub_api_key.get_secret_value(),
)
if exchange == "sentiment":
# Read-only data provider; env ignored (CryptoPanic, LunarCrush and
# the public multi-exchange funding/OI endpoints are shared).
from cerbero_mcp.exchanges.sentiment.client import SentimentClient
return SentimentClient(
cryptopanic_key=settings.sentiment.cryptopanic_key.get_secret_value(),
lunarcrush_key=settings.sentiment.lunarcrush_key.get_secret_value(),
)
if exchange == "ibkr":
from cerbero_mcp.exchanges.ibkr.client import IBKRClient
from cerbero_mcp.exchanges.ibkr.oauth import OAuth1aSigner
creds = settings.ibkr.credentials(env)
url = settings.ibkr.url_testnet if env == "testnet" else settings.ibkr.url_live
signer = OAuth1aSigner(
consumer_key=creds["consumer_key"],
access_token=creds["access_token"],
access_token_secret=creds["access_token_secret"],
signature_key_path=creds["signature_key_path"],
encryption_key_path=creds["encryption_key_path"],
dh_prime=creds["dh_prime"],
)
return IBKRClient(
signer=signer,
account_id=creds["account_id"],
paper=(env == "testnet"),
base_url=url,
)
raise ValueError(f"unsupported exchange: {exchange}")
@@ -0,0 +1,510 @@
"""Alpaca client on pure httpx (V2.0.0).
Full-REST rewrite of the original `alpaca-py` client: 4 base endpoints
(trading, stock data, crypto data, options data), auth via the
APCA-API-KEY-ID / APCA-API-SECRET-KEY headers, full parity with the V1
version (same signatures, same shape of the returned dicts).
- The `base_url` override parameter applies ONLY to the trading endpoint
(consistent with `url_override` in alpaca-py.TradingClient). The data
endpoints stay hardcoded to `https://data.alpaca.markets`.
- Methods return `dict` / `list[dict]` straight from the REST JSON
(instead of serialized alpaca-py pydantic models). The keys are those
returned by the Alpaca API; they match the `model_dump()` of the
previous SDK models.
"""
from __future__ import annotations
import datetime as _dt
from typing import Any
import httpx
from cerbero_mcp.common.http import async_client
# ── Base endpoints ───────────────────────────────────────────────
_TRADING_LIVE = "https://api.alpaca.markets"
_TRADING_PAPER = "https://paper-api.alpaca.markets"
_DATA = "https://data.alpaca.markets"
# ── Timeframe map → Alpaca query param ───────────────────────────
# Alpaca v2 bars: timeframe = "1Min" / "5Min" / "15Min" / "30Min" / "1Hour" / "1Day" / "1Week"
_TF_MAP = {
"1min": "1Min",
"5min": "5Min",
"15min": "15Min",
"30min": "30Min",
"1h": "1Hour",
"1d": "1Day",
"1w": "1Week",
}
_ASSET_CLASS_MAP = {
"stocks": "us_equity",
"crypto": "crypto",
"options": "us_option",
}
def _tf(interval: str) -> str:
if interval in _TF_MAP:
return _TF_MAP[interval]
raise ValueError(f"unsupported timeframe: {interval}")
def _asset_class_param(ac: str) -> str:
ac = ac.lower()
if ac in _ASSET_CLASS_MAP:
return _ASSET_CLASS_MAP[ac]
raise ValueError(f"invalid asset_class: {ac}")
def _iso(value: _dt.datetime | _dt.date | None) -> str | None:
if value is None:
return None
return value.isoformat()
class AlpacaClient:
"""httpx-based client for the Alpaca REST API v2.
Auth via the `APCA-API-KEY-ID` / `APCA-API-SECRET-KEY` headers.
"""
def __init__(
self,
api_key: str,
secret_key: str,
paper: bool = True,
base_url: str | None = None,
http: httpx.AsyncClient | None = None,
) -> None:
self.api_key = api_key
self.secret_key = secret_key
self.paper = paper
# `base_url` kept as a public attribute (tests/build_client read it).
# It overrides only the trading endpoint; the data endpoints are always
# `data.alpaca.markets` (Alpaca offers no paper data feed).
self.base_url = base_url
if base_url:
self._trading_base = base_url
else:
self._trading_base = _TRADING_PAPER if paper else _TRADING_LIVE
self._data_base = _DATA
# Single long-lived AsyncClient → reuse connection pool.
self._http = http or async_client(timeout=30.0)
async def aclose(self) -> None:
"""Close HTTP connections. Idempotent."""
if not self._http.is_closed:
await self._http.aclose()
async def health(self) -> dict[str, Any]:
"""Minimal probe for /health/ready: no network call."""
return {"status": "ok", "paper": self.paper}
# ── Helpers ──────────────────────────────────────────────────
@property
def _headers(self) -> dict[str, str]:
return {
"APCA-API-KEY-ID": self.api_key,
"APCA-API-SECRET-KEY": self.secret_key,
"Accept": "application/json",
}
async def _request(
self,
method: str,
base: str,
path: str,
*,
params: dict[str, Any] | None = None,
json_body: dict[str, Any] | None = None,
) -> Any:
"""Perform an authenticated HTTP request and return the parsed JSON.
Returns `{}` for an empty response body (e.g. DELETE 204).
Raises `httpx.HTTPStatusError` on 4xx/5xx via raise_for_status.
"""
url = f"{base}{path}"
# httpx only discards None-valued query params automatically when they
# are passed as a list of tuples; with a dict we filter them up front.
clean_params: dict[str, Any] | None = None
if params is not None:
clean_params = {k: v for k, v in params.items() if v is not None}
if not clean_params:
clean_params = None
resp = await self._http.request(
method,
url,
params=clean_params,
json=json_body,
headers=self._headers,
)
resp.raise_for_status()
if not resp.content:
return {}
return resp.json()
# ── Account / positions ──────────────────────────────────────
async def get_account(self) -> dict:
data = await self._request("GET", self._trading_base, "/v2/account")
return dict(data) if data else {}
async def get_positions(self) -> list[dict]:
data = await self._request("GET", self._trading_base, "/v2/positions")
return list(data) if data else []
async def get_activities(self, limit: int = 50) -> list[dict]:
data = await self._request(
"GET",
self._trading_base,
"/v2/account/activities",
params={"page_size": limit},
)
items = list(data) if data else []
return items[:limit]
# ── Assets ──────────────────────────────────────────────────
async def get_assets(
self, asset_class: str = "stocks", status: str = "active"
) -> list[dict]:
data = await self._request(
"GET",
self._trading_base,
"/v2/assets",
params={
"status": status,
"asset_class": _asset_class_param(asset_class),
},
)
items = list(data) if data else []
return items[:500]
# ── Market data ─────────────────────────────────────────────
async def get_ticker(self, symbol: str, asset_class: str = "stocks") -> dict:
ac = asset_class.lower()
if ac == "stocks":
trade_resp = await self._request(
"GET",
self._data_base,
f"/v2/stocks/{symbol}/trades/latest",
)
quote_resp = await self._request(
"GET",
self._data_base,
f"/v2/stocks/{symbol}/quotes/latest",
)
trade = (trade_resp or {}).get("trade") or {}
quote = (quote_resp or {}).get("quote") or {}
return {
"symbol": symbol,
"asset_class": "stocks",
"last_price": trade.get("p"),
"bid": quote.get("bp"),
"ask": quote.get("ap"),
"bid_size": quote.get("bs"),
"ask_size": quote.get("as"),
"timestamp": trade.get("t"),
}
if ac == "crypto":
trade_resp = await self._request(
"GET",
self._data_base,
"/v1beta3/crypto/us/latest/trades",
params={"symbols": symbol},
)
quote_resp = await self._request(
"GET",
self._data_base,
"/v1beta3/crypto/us/latest/quotes",
params={"symbols": symbol},
)
trade = ((trade_resp or {}).get("trades") or {}).get(symbol) or {}
quote = ((quote_resp or {}).get("quotes") or {}).get(symbol) or {}
return {
"symbol": symbol,
"asset_class": "crypto",
"last_price": trade.get("p"),
"bid": quote.get("bp"),
"ask": quote.get("ap"),
"timestamp": trade.get("t"),
}
if ac == "options":
quote_resp = await self._request(
"GET",
self._data_base,
f"/v1beta1/options/{symbol}/quotes/latest",
)
quote = (quote_resp or {}).get("quote") or {}
return {
"symbol": symbol,
"asset_class": "options",
"bid": quote.get("bp"),
"ask": quote.get("ap"),
"timestamp": quote.get("t"),
}
raise ValueError(f"invalid asset_class: {asset_class}")
async def get_bars(
self,
symbol: str,
asset_class: str = "stocks",
interval: str = "1d",
start: str | None = None,
end: str | None = None,
limit: int = 1000,
) -> dict:
tf = _tf(interval)
start_dt = (
_dt.datetime.fromisoformat(start)
if start
else (_dt.datetime.now(_dt.UTC) - _dt.timedelta(days=30))
)
end_dt = _dt.datetime.fromisoformat(end) if end else _dt.datetime.now(_dt.UTC)
ac = asset_class.lower()
params: dict[str, Any] = {
"symbols": symbol,
"timeframe": tf,
"start": _iso(start_dt),
"end": _iso(end_dt),
"limit": limit,
}
if ac == "stocks":
# IEX feed by default; consistent with the alpaca-py free-tier default.
params["feed"] = "iex"
data = await self._request(
"GET", self._data_base, "/v2/stocks/bars", params=params
)
elif ac == "crypto":
data = await self._request(
"GET",
self._data_base,
"/v1beta3/crypto/us/bars",
params=params,
)
elif ac == "options":
data = await self._request(
"GET",
self._data_base,
"/v1beta1/options/bars",
params=params,
)
else:
raise ValueError(f"invalid asset_class: {asset_class}")
bars_dict = (data or {}).get("bars") or {}
rows = bars_dict.get(symbol) or []
bars = [
{
"timestamp": b.get("t"),
"open": b.get("o"),
"high": b.get("h"),
"low": b.get("l"),
"close": b.get("c"),
"volume": b.get("v"),
}
for b in rows
]
return {
"symbol": symbol,
"asset_class": ac,
"interval": interval,
"bars": bars,
}
async def get_snapshot(self, symbol: str) -> dict:
data = await self._request(
"GET",
self._data_base,
"/v2/stocks/snapshots",
params={"symbols": symbol},
)
# The API returns {"AAPL": {snapshot}} or {"snapshots": {...}}; handle
# both formats. v2/stocks/snapshots returns a top-level
# symbol→snapshot dict.
if data is None:
return {}
if symbol in data:
return data[symbol] or {}
snaps = data.get("snapshots") or {}
return snaps.get(symbol) or {}
async def get_option_chain(
self,
underlying: str,
expiry: str | None = None,
) -> dict:
params: dict[str, Any] = {}
if expiry:
# Date validation (raises ValueError on invalid input, parity with
# V1 which used _dt.date.fromisoformat).
_dt.date.fromisoformat(expiry)
params["expiration_date_gte"] = expiry
params["expiration_date_lte"] = expiry
data = await self._request(
"GET",
self._data_base,
f"/v1beta1/options/snapshots/{underlying}",
params=params or None,
)
contracts = (data or {}).get("snapshots") if data else None
return {
"underlying": underlying,
"expiry": expiry,
"contracts": contracts if contracts is not None else (data or {}),
}
# ── Orders ──────────────────────────────────────────────────
async def get_open_orders(self, limit: int = 50) -> list[dict]:
data = await self._request(
"GET",
self._trading_base,
"/v2/orders",
params={"status": "open", "limit": limit},
)
return list(data) if data else []
async def place_order(
self,
symbol: str,
side: str,
qty: float | None = None,
notional: float | None = None,
order_type: str = "market",
limit_price: float | None = None,
stop_price: float | None = None,
tif: str = "day",
asset_class: str = "stocks",
) -> dict:
ot = order_type.lower()
body: dict[str, Any] = {
"symbol": symbol,
"side": side.lower(),
"type": ot,
"time_in_force": tif.lower(),
}
if qty is not None:
body["qty"] = str(qty)
if notional is not None:
body["notional"] = str(notional)
if ot == "market":
pass
elif ot == "limit":
if limit_price is None:
raise ValueError("limit_price required for limit order")
body["limit_price"] = str(limit_price)
elif ot == "stop":
if stop_price is None:
raise ValueError("stop_price required for stop order")
body["stop_price"] = str(stop_price)
else:
raise ValueError(f"unsupported order_type: {order_type}")
# `asset_class` is not a REST parameter; kept in the signature for V1
# parity (the SDK only used it to pick the request model).
_ = asset_class
data = await self._request(
"POST",
self._trading_base,
"/v2/orders",
json_body=body,
)
return dict(data) if data else {}
async def amend_order(
self,
order_id: str,
qty: float | None = None,
limit_price: float | None = None,
stop_price: float | None = None,
tif: str | None = None,
) -> dict:
body: dict[str, Any] = {}
if qty is not None:
body["qty"] = str(qty)
if limit_price is not None:
body["limit_price"] = str(limit_price)
if stop_price is not None:
body["stop_price"] = str(stop_price)
if tif is not None:
body["time_in_force"] = tif.lower()
data = await self._request(
"PATCH",
self._trading_base,
f"/v2/orders/{order_id}",
json_body=body,
)
return dict(data) if data else {}
async def cancel_order(self, order_id: str) -> dict:
# DELETE /v2/orders/{id} → 204 No Content on success.
await self._request(
"DELETE", self._trading_base, f"/v2/orders/{order_id}"
)
return {"order_id": order_id, "canceled": True}
async def cancel_all_orders(self) -> list[dict]:
# DELETE /v2/orders → 207 Multi-Status with an array of {id, status}
data = await self._request(
"DELETE", self._trading_base, "/v2/orders"
)
return list(data) if data else []
# ── Position close ──────────────────────────────────────────
async def close_position(
self, symbol: str, qty: float | None = None, percentage: float | None = None
) -> dict:
# DELETE /v2/positions/{symbol}?qty=... or ?percentage=...
params: dict[str, Any] = {}
if qty is not None:
params["qty"] = str(qty)
if percentage is not None:
params["percentage"] = str(percentage)
data = await self._request(
"DELETE",
self._trading_base,
f"/v2/positions/{symbol}",
params=params or None,
)
return dict(data) if data else {}
async def close_all_positions(self, cancel_orders: bool = True) -> list[dict]:
data = await self._request(
"DELETE",
self._trading_base,
"/v2/positions",
params={"cancel_orders": "true" if cancel_orders else "false"},
)
return list(data) if data else []
# ── Clock / calendar ────────────────────────────────────────
async def get_clock(self) -> dict:
data = await self._request("GET", self._trading_base, "/v2/clock")
return dict(data) if data else {}
async def get_calendar(
self, start: str | None = None, end: str | None = None
) -> list[dict]:
params: dict[str, Any] = {}
if start:
_dt.date.fromisoformat(start)  # validation, parity with V1
params["start"] = start
if end:
_dt.date.fromisoformat(end)
params["end"] = end
data = await self._request(
"GET",
self._trading_base,
"/v2/calendar",
params=params or None,
)
return list(data) if data else []
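Two small behaviors of this client are worth pinning down: the timeframe translation done by `_tf`, and the None-filtering of query params done in `_request`. Stand-alone copies (the `_TF_MAP` is the same table as above; `clean` mirrors the filtering logic):

```python
_TF_MAP = {
    "1min": "1Min", "5min": "5Min", "15min": "15Min",
    "30min": "30Min", "1h": "1Hour", "1d": "1Day", "1w": "1Week",
}

def tf(interval: str) -> str:
    # Mirror of _tf: translate our interval names to Alpaca's timeframe strings.
    if interval in _TF_MAP:
        return _TF_MAP[interval]
    raise ValueError(f"unsupported timeframe: {interval}")

def clean(params):
    # Mirror of the filtering in _request: drop None values, and collapse
    # an all-None dict to None so no query string is sent at all.
    out = {k: v for k, v in params.items() if v is not None}
    return out or None

q = clean({"qty": "1", "percentage": None})
```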
@@ -0,0 +1,279 @@
"""Alpaca V2 tools: pydantic schemas + async functions.
Each function takes (client: AlpacaClient, params: <Req>) and returns a
dict (or list[dict]). Pure logic, no FastAPI dependencies, no ACLs.
Bearer auth is handled by the middleware in cerbero_mcp.auth; auditing
is wired by the router via request.state.environment.
"""
from __future__ import annotations
from typing import Any
from pydantic import BaseModel
from cerbero_mcp.exchanges.alpaca.client import AlpacaClient
from cerbero_mcp.exchanges.alpaca.leverage_cap import get_max_leverage
# === Schemas: reads ===
class GetAccountReq(BaseModel):
pass
class GetPositionsReq(BaseModel):
pass
class GetActivitiesReq(BaseModel):
limit: int = 50
class GetAssetsReq(BaseModel):
asset_class: str = "stocks"
status: str = "active"
class GetTickerReq(BaseModel):
symbol: str
asset_class: str = "stocks"
class GetBarsReq(BaseModel):
symbol: str
asset_class: str = "stocks"
interval: str = "1d"
start: str | None = None
end: str | None = None
limit: int = 1000
class GetSnapshotReq(BaseModel):
symbol: str
class GetOptionChainReq(BaseModel):
underlying: str
expiry: str | None = None
class GetOpenOrdersReq(BaseModel):
limit: int = 50
class GetClockReq(BaseModel):
pass
class GetCalendarReq(BaseModel):
start: str | None = None
end: str | None = None
# === Schemas: writes ===
class PlaceOrderReq(BaseModel):
symbol: str
side: str # "buy" | "sell"
qty: float | None = None
notional: float | None = None
order_type: str = "market"
limit_price: float | None = None
stop_price: float | None = None
tif: str = "day"
asset_class: str = "stocks"
model_config = {
"json_schema_extra": {
"examples": [
{
"summary": "Market buy 1 share AAPL",
"value": {
"symbol": "AAPL",
"side": "buy",
"qty": 1,
"order_type": "market",
"asset_class": "stocks",
},
}
]
}
}
class AmendOrderReq(BaseModel):
order_id: str
qty: float | None = None
limit_price: float | None = None
stop_price: float | None = None
tif: str | None = None
class CancelOrderReq(BaseModel):
order_id: str
class CancelAllOrdersReq(BaseModel):
pass
class ClosePositionReq(BaseModel):
symbol: str
qty: float | None = None
percentage: float | None = None
class CloseAllPositionsReq(BaseModel):
cancel_orders: bool = True
# === Tools (reads) ===
async def environment_info(
client: AlpacaClient, *, creds: dict, env_info: Any | None = None
) -> dict:
if env_info is None:
return {
"exchange": "alpaca",
"environment": "testnet" if getattr(client, "paper", True) else "mainnet",
"source": "credentials",
"env_value": None,
"base_url": getattr(client, "base_url", None),
"max_leverage": get_max_leverage(creds),
}
return {
"exchange": env_info.exchange,
"environment": env_info.environment,
"source": env_info.source,
"env_value": env_info.env_value,
"base_url": env_info.base_url,
"max_leverage": get_max_leverage(creds),
}
async def get_account(client: AlpacaClient, params: GetAccountReq) -> dict:
return await client.get_account()
async def get_positions(
client: AlpacaClient, params: GetPositionsReq
) -> dict:
return {"positions": await client.get_positions()}
async def get_activities(
client: AlpacaClient, params: GetActivitiesReq
) -> dict:
return {"activities": await client.get_activities(params.limit)}
async def get_assets(client: AlpacaClient, params: GetAssetsReq) -> dict:
return {
"assets": await client.get_assets(params.asset_class, params.status)
}
async def get_ticker(client: AlpacaClient, params: GetTickerReq) -> dict:
return await client.get_ticker(params.symbol, params.asset_class)
async def get_bars(client: AlpacaClient, params: GetBarsReq) -> dict:
return await client.get_bars(
params.symbol,
params.asset_class,
params.interval,
params.start,
params.end,
params.limit,
)
async def get_snapshot(
client: AlpacaClient, params: GetSnapshotReq
) -> dict:
return await client.get_snapshot(params.symbol)
async def get_option_chain(
client: AlpacaClient, params: GetOptionChainReq
) -> dict:
return await client.get_option_chain(params.underlying, params.expiry)
async def get_open_orders(
client: AlpacaClient, params: GetOpenOrdersReq
) -> dict:
return {"orders": await client.get_open_orders(params.limit)}
async def get_clock(client: AlpacaClient, params: GetClockReq) -> dict:
return await client.get_clock()
async def get_calendar(
client: AlpacaClient, params: GetCalendarReq
) -> dict:
return {"calendar": await client.get_calendar(params.start, params.end)}
# === Tools (writes) ===
async def place_order(
client: AlpacaClient, params: PlaceOrderReq, *, creds: dict
) -> dict:
# Alpaca: default cap 1 (cash account). No leverage parameter here;
# the cap is kept for consistency with other exchanges and for audit.
return await client.place_order(
symbol=params.symbol,
side=params.side,
qty=params.qty,
notional=params.notional,
order_type=params.order_type,
limit_price=params.limit_price,
stop_price=params.stop_price,
tif=params.tif,
asset_class=params.asset_class,
)
async def amend_order(
client: AlpacaClient, params: AmendOrderReq
) -> dict:
return await client.amend_order(
params.order_id,
params.qty,
params.limit_price,
params.stop_price,
params.tif,
)
async def cancel_order(
client: AlpacaClient, params: CancelOrderReq
) -> dict:
return await client.cancel_order(params.order_id)
async def cancel_all_orders(
client: AlpacaClient, params: CancelAllOrdersReq
) -> dict:
return {"canceled": await client.cancel_all_orders()}
async def close_position(
client: AlpacaClient, params: ClosePositionReq
) -> dict:
return await client.close_position(
params.symbol, params.qty, params.percentage
)
async def close_all_positions(
client: AlpacaClient, params: CloseAllPositionsReq
) -> dict:
return {
"closed": await client.close_all_positions(params.cancel_orders)
}
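Every tool above follows the same `(client, params)` shape, which makes the module trivial to unit-test against a stub. A minimal sketch under that assumption (`StubClient` and the dataclass are illustrative, not part of the codebase):

```python
import asyncio
from dataclasses import dataclass

class StubClient:
    """Hypothetical stand-in for AlpacaClient used only for testing."""
    async def cancel_order(self, order_id: str) -> dict:
        return {"order_id": order_id, "status": "canceled"}

@dataclass
class CancelOrderReq:
    order_id: str

async def cancel_order(client: StubClient, params: CancelOrderReq) -> dict:
    # Pure pass-through, exactly like the real tools: no auth, no ACL.
    return await client.cancel_order(params.order_id)

result = asyncio.run(cancel_order(StubClient(), CancelOrderReq(order_id="abc")))
print(result)  # {'order_id': 'abc', 'status': 'canceled'}
```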
@@ -0,0 +1,904 @@
"""Bybit V5 REST API client (httpx puro, no SDK).
Implementazione diretta su `httpx.AsyncClient` per i tool Cerbero MCP V2.
Mantiene parità di interfaccia con la versione precedente basata su
`pybit.unified_trading.HTTP` per non rompere `tools.py` né i router.
Auth Bybit V5:
Header X-BAPI-SIGN = HMAC_SHA256(secret,
timestamp + api_key + recv_window + (body_json | querystring))
"""
from __future__ import annotations
import hashlib
import hmac
import json
import time
import uuid
from typing import Any
from urllib.parse import urlencode
import httpx
from cerbero_mcp.common import indicators as ind
from cerbero_mcp.common import microstructure as micro
BASE_MAINNET = "https://api.bybit.com"
BASE_TESTNET = "https://api-testnet.bybit.com"
DEFAULT_RECV_WINDOW = "5000"
DEFAULT_TIMEOUT = 15.0
def _f(v: Any) -> float | None:
try:
return float(v)
except (TypeError, ValueError):
return None
def _i(v: Any) -> int | None:
try:
return int(v)
except (TypeError, ValueError):
return None
class BybitAPIError(RuntimeError):
"""Errore di trasporto Bybit V5 (non gestito a livello envelope)."""
class BybitClient:
"""Async REST client per Bybit V5 (linear/inverse/spot/option)."""
def __init__(
self,
api_key: str,
api_secret: str,
testnet: bool = True,
http: httpx.AsyncClient | None = None,
base_url: str | None = None,
) -> None:
self.api_key = api_key
self.api_secret = api_secret
self.testnet = testnet
self.base_url = base_url or (BASE_TESTNET if testnet else BASE_MAINNET)
self.recv_window = DEFAULT_RECV_WINDOW
# `http` injection is used by tests to mount an AsyncClient with
# `httpx.MockTransport`. In production we create a dedicated client.
self._owns_http = http is None
self._http: httpx.AsyncClient = http or httpx.AsyncClient(
timeout=DEFAULT_TIMEOUT
)
async def aclose(self) -> None:
"""Chiude l'AsyncClient httpx se di nostra proprietà."""
if self._owns_http:
await self._http.aclose()
async def health(self) -> dict[str, Any]:
"""Probe minimo per /health/ready: nessuna chiamata di rete."""
return {"status": "ok", "testnet": self.testnet}
# ── auth helpers ───────────────────────────────────────────
def _timestamp_ms(self) -> str:
return str(int(time.time() * 1000))
def _sign(self, timestamp: str, payload: str) -> str:
msg = timestamp + self.api_key + self.recv_window + payload
return hmac.new(
self.api_secret.encode("utf-8"),
msg.encode("utf-8"),
hashlib.sha256,
).hexdigest()
def _signed_headers(self, payload: str) -> dict[str, str]:
ts = self._timestamp_ms()
sig = self._sign(ts, payload)
return {
"X-BAPI-API-KEY": self.api_key,
"X-BAPI-TIMESTAMP": ts,
"X-BAPI-RECV-WINDOW": self.recv_window,
"X-BAPI-SIGN": sig,
"Content-Type": "application/json",
}
@staticmethod
def _clean_params(params: dict[str, Any] | None) -> dict[str, Any]:
if not params:
return {}
return {k: v for k, v in params.items() if v is not None}
@staticmethod
def _querystring(params: dict[str, Any]) -> str:
# Bybit accepts the querystring in whatever order the request is
# serialized. For the signature we use the same urlencode (dict
# insertion order). In Python 3.7+ dicts preserve insertion order,
# which keeps the signature payload consistent with the actual URL.
return urlencode(params)
# ── request primitives ─────────────────────────────────────
async def _request_public(
self,
method: str,
path: str,
params: dict[str, Any] | None = None,
) -> dict[str, Any]:
clean = self._clean_params(params)
url = self.base_url + path
resp = await self._http.request(
method, url, params=clean if clean else None
)
return self._parse_response(resp)
async def _request_signed(
self,
method: str,
path: str,
params: dict[str, Any] | None = None,
body: dict[str, Any] | None = None,
) -> dict[str, Any]:
url = self.base_url + path
method = method.upper()
if method == "GET":
clean = self._clean_params(params)
qs = self._querystring(clean)
headers = self._signed_headers(qs)
resp = await self._http.request(
method, url, params=clean if clean else None, headers=headers
)
else:
payload_body = body or {}
body_json = json.dumps(payload_body, separators=(",", ":"))
headers = self._signed_headers(body_json)
resp = await self._http.request(
method, url, content=body_json, headers=headers
)
return self._parse_response(resp)
@staticmethod
def _parse_response(resp: httpx.Response) -> dict[str, Any]:
try:
data = resp.json()
except Exception as e:  # pragma: no cover - hard to reach in practice
raise BybitAPIError(
f"invalid JSON from Bybit (status={resp.status_code}): {resp.text[:200]}"
) from e
if resp.status_code >= 500:
raise BybitAPIError(
f"bybit server error {resp.status_code}: "
f"{data.get('retMsg', resp.text[:200])}"
)
if not isinstance(data, dict):
raise BybitAPIError(f"unexpected payload type: {type(data).__name__}")
return data
def _envelope(self, resp: dict[str, Any], payload: dict[str, Any]) -> dict[str, Any]:
code = resp.get("retCode", 0)
if code != 0:
return {"error": resp.get("retMsg", "bybit_error"), "code": code}
return payload
# ── parsers shared ─────────────────────────────────────────
@staticmethod
def _parse_ticker(row: dict[str, Any]) -> dict[str, Any]:
return {
"symbol": row.get("symbol"),
"last_price": _f(row.get("lastPrice")),
"mark_price": _f(row.get("markPrice")),
"bid": _f(row.get("bid1Price")),
"ask": _f(row.get("ask1Price")),
"volume_24h": _f(row.get("volume24h")),
"turnover_24h": _f(row.get("turnover24h")),
"funding_rate": _f(row.get("fundingRate")),
"open_interest": _f(row.get("openInterest")),
}
# ── market data (public) ───────────────────────────────────
async def get_ticker(self, symbol: str, category: str = "linear") -> dict:
resp = await self._request_public(
"GET",
"/v5/market/tickers",
params={"category": category, "symbol": symbol},
)
rows = (resp.get("result") or {}).get("list") or []
if not rows:
return {"symbol": symbol, "error": "not_found"}
return self._parse_ticker(rows[0])
async def get_ticker_batch(
self, symbols: list[str], category: str = "linear"
) -> dict[str, dict]:
out: dict[str, dict] = {}
for sym in symbols:
out[sym] = await self.get_ticker(sym, category=category)
return out
async def get_orderbook(
self, symbol: str, category: str = "linear", limit: int = 50
) -> dict:
resp = await self._request_public(
"GET",
"/v5/market/orderbook",
params={"category": category, "symbol": symbol, "limit": limit},
)
r = resp.get("result") or {}
return {
"symbol": r.get("s"),
"bids": [[float(p), float(q)] for p, q in (r.get("b") or [])],
"asks": [[float(p), float(q)] for p, q in (r.get("a") or [])],
"timestamp": r.get("ts"),
}
async def get_historical(
self,
symbol: str,
category: str = "linear",
interval: str = "60",
start: int | None = None,
end: int | None = None,
limit: int = 1000,
) -> dict:
params: dict[str, Any] = {
"category": category,
"symbol": symbol,
"interval": interval,
"limit": limit,
}
if start is not None:
params["start"] = start
if end is not None:
params["end"] = end
resp = await self._request_public("GET", "/v5/market/kline", params=params)
rows = (resp.get("result") or {}).get("list") or []
rows_sorted = sorted(rows, key=lambda r: int(r[0]))
candles = [
{
"timestamp": int(r[0]),
"open": float(r[1]),
"high": float(r[2]),
"low": float(r[3]),
"close": float(r[4]),
"volume": float(r[5]),
}
for r in rows_sorted
]
return {"symbol": symbol, "candles": candles}
async def get_indicators(
self,
symbol: str,
category: str = "linear",
indicators: list[str] | None = None,
interval: str = "60",
start: int | None = None,
end: int | None = None,
) -> dict:
indicators = indicators or ["rsi", "atr", "macd", "adx"]
historical = await self.get_historical(
symbol, category=category, interval=interval, start=start, end=end
)
candles = historical.get("candles", [])
closes = [c["close"] for c in candles]
highs = [c["high"] for c in candles]
lows = [c["low"] for c in candles]
out: dict[str, Any] = {"symbol": symbol, "category": category}
for name in indicators:
n = name.lower()
if n == "sma":
out["sma"] = ind.sma(closes, 20)
elif n == "rsi":
out["rsi"] = ind.rsi(closes)
elif n == "atr":
out["atr"] = ind.atr(highs, lows, closes)
elif n == "macd":
out["macd"] = ind.macd(closes)
elif n == "adx":
out["adx"] = ind.adx(highs, lows, closes)
else:
out[n] = None
return out
async def get_funding_rate(self, symbol: str, category: str = "linear") -> dict:
resp = await self._request_public(
"GET",
"/v5/market/tickers",
params={"category": category, "symbol": symbol},
)
rows = (resp.get("result") or {}).get("list") or []
if not rows:
return {"symbol": symbol, "error": "not_found"}
row = rows[0]
return {
"symbol": row.get("symbol"),
"funding_rate": _f(row.get("fundingRate")),
"next_funding_time": _i(row.get("nextFundingTime")),
}
async def get_funding_history(
self, symbol: str, category: str = "linear", limit: int = 100
) -> dict:
resp = await self._request_public(
"GET",
"/v5/market/funding/history",
params={"category": category, "symbol": symbol, "limit": limit},
)
rows = (resp.get("result") or {}).get("list") or []
hist = [
{
"timestamp": int(r.get("fundingRateTimestamp", 0)),
"rate": float(r.get("fundingRate", 0)),
}
for r in rows
]
return {"symbol": symbol, "history": hist}
async def get_open_interest(
self,
symbol: str,
category: str = "linear",
interval: str = "5min",
limit: int = 288,
) -> dict:
resp = await self._request_public(
"GET",
"/v5/market/open-interest",
params={
"category": category,
"symbol": symbol,
"intervalTime": interval,
"limit": limit,
},
)
rows = (resp.get("result") or {}).get("list") or []
points = [
{
"timestamp": int(r.get("timestamp", 0)),
"oi": float(r.get("openInterest", 0)),
}
for r in rows
]
current_oi = points[0]["oi"] if points else None
return {
"symbol": symbol,
"category": category,
"interval": interval,
"current_oi": current_oi,
"points": points,
}
async def get_instruments(
self, category: str = "linear", symbol: str | None = None
) -> dict:
params: dict[str, Any] = {"category": category}
if symbol:
params["symbol"] = symbol
resp = await self._request_public(
"GET", "/v5/market/instruments-info", params=params
)
rows = (resp.get("result") or {}).get("list") or []
instruments = []
for r in rows:
pf = r.get("priceFilter") or {}
lf = r.get("lotSizeFilter") or {}
instruments.append(
{
"symbol": r.get("symbol"),
"status": r.get("status"),
"base_coin": r.get("baseCoin"),
"quote_coin": r.get("quoteCoin"),
"tick_size": _f(pf.get("tickSize")),
"qty_step": _f(lf.get("qtyStep")),
"min_qty": _f(lf.get("minOrderQty")),
}
)
return {"category": category, "instruments": instruments}
async def get_option_chain(self, base_coin: str, expiry: str | None = None) -> dict:
resp = await self._request_public(
"GET",
"/v5/market/instruments-info",
params={"category": "option", "baseCoin": base_coin.upper()},
)
rows = (resp.get("result") or {}).get("list") or []
options = []
for r in rows:
delivery = r.get("deliveryTime")
if expiry and expiry not in r.get("symbol", ""):
continue
options.append(
{
"symbol": r.get("symbol"),
"base_coin": r.get("baseCoin"),
"settle_coin": r.get("settleCoin"),
"type": r.get("optionsType"),
"launch_time": int(r.get("launchTime", 0)),
"delivery_time": int(delivery) if delivery else None,
}
)
return {"base_coin": base_coin.upper(), "options": options}
# ── account / positions / orders (signed) ─────────────────
async def get_positions(
self, category: str = "linear", settle_coin: str = "USDT"
) -> list[dict]:
params: dict[str, Any] = {"category": category}
if category in ("linear", "inverse"):
params["settleCoin"] = settle_coin
resp = await self._request_signed("GET", "/v5/position/list", params=params)
rows = (resp.get("result") or {}).get("list") or []
out = []
for r in rows:
out.append(
{
"symbol": r.get("symbol"),
"side": r.get("side"),
"size": _f(r.get("size")),
"entry_price": _f(r.get("avgPrice")),
"unrealized_pnl": _f(r.get("unrealisedPnl")),
"leverage": _f(r.get("leverage")),
"liquidation_price": _f(r.get("liqPrice")),
"position_value": _f(r.get("positionValue")),
}
)
return out
async def get_account_summary(self, account_type: str = "UNIFIED") -> dict:
resp = await self._request_signed(
"GET",
"/v5/account/wallet-balance",
params={"accountType": account_type},
)
rows = (resp.get("result") or {}).get("list") or []
if not rows:
return {"error": "no_account"}
a = rows[0]
coins = []
for c in a.get("coin") or []:
coins.append(
{
"coin": c.get("coin"),
"wallet_balance": _f(c.get("walletBalance")),
"equity": _f(c.get("equity")),
}
)
return {
"account_type": a.get("accountType"),
"equity": _f(a.get("totalEquity")),
"wallet_balance": _f(a.get("totalWalletBalance")),
"margin_balance": _f(a.get("totalMarginBalance")),
"available_balance": _f(a.get("totalAvailableBalance")),
"unrealized_pnl": _f(a.get("totalPerpUPL")),
"coins": coins,
}
async def get_trade_history(
self, category: str = "linear", limit: int = 50
) -> list[dict]:
resp = await self._request_signed(
"GET",
"/v5/execution/list",
params={"category": category, "limit": limit},
)
rows = (resp.get("result") or {}).get("list") or []
return [
{
"symbol": r.get("symbol"),
"side": r.get("side"),
"size": _f(r.get("execQty")),
"price": _f(r.get("execPrice")),
"fee": _f(r.get("execFee")),
"timestamp": _i(r.get("execTime")),
"order_id": r.get("orderId"),
}
for r in rows
]
async def get_open_orders(
self,
category: str = "linear",
symbol: str | None = None,
settle_coin: str = "USDT",
) -> list[dict]:
params: dict[str, Any] = {"category": category}
if category in ("linear", "inverse") and not symbol:
params["settleCoin"] = settle_coin
if symbol:
params["symbol"] = symbol
resp = await self._request_signed(
"GET", "/v5/order/realtime", params=params
)
rows = (resp.get("result") or {}).get("list") or []
return [
{
"order_id": r.get("orderId"),
"symbol": r.get("symbol"),
"side": r.get("side"),
"qty": _f(r.get("qty")),
"price": _f(r.get("price")),
"type": r.get("orderType"),
"status": r.get("orderStatus"),
"reduce_only": bool(r.get("reduceOnly")),
}
for r in rows
]
# ── microstructure / basis ─────────────────────────────────
async def get_orderbook_imbalance(
self,
symbol: str,
category: str = "linear",
depth: int = 10,
) -> dict:
ob = await self.get_orderbook(
symbol=symbol, category=category, limit=max(depth, 50)
)
result = micro.orderbook_imbalance(
ob.get("bids") or [], ob.get("asks") or [], depth=depth
)
return {
"symbol": symbol,
"category": category,
"depth": depth,
**result,
"timestamp": ob.get("timestamp"),
}
async def get_basis_term_structure(self, asset: str) -> dict:
import datetime as _dt
asset = asset.upper()
spot = await self.get_ticker(f"{asset}USDT", category="spot")
perp = await self.get_ticker(f"{asset}USDT", category="linear")
sp = spot.get("last_price")
pp = perp.get("last_price")
instr = await self.get_instruments(category="linear")
items = instr.get("instruments") or []
futures = [
x
for x in items
if x.get("symbol", "").startswith(f"{asset}-")
or x.get("symbol", "").startswith(f"{asset}USDT-")
]
rows: list[dict[str, Any]] = []
if sp:
now_ms = int(_dt.datetime.now(_dt.UTC).timestamp() * 1000)
for f in futures[:10]:
tk = await self.get_ticker(f["symbol"], category="linear")
fp = tk.get("last_price")
expiry_ms = f.get("delivery_time")
if not fp or not expiry_ms:
continue
days = max((int(expiry_ms) - now_ms) / 86_400_000, 1)
basis_pct = 100.0 * (fp - sp) / sp
annualized = basis_pct * 365.0 / days
rows.append(
{
"symbol": f["symbol"],
"expiry_ms": int(expiry_ms),
"days_to_expiry": round(days, 2),
"future_price": fp,
"basis_pct": round(basis_pct, 4),
"annualized_basis_pct": round(annualized, 4),
}
)
rows.sort(key=lambda r: r["days_to_expiry"])
return {
"asset": asset,
"spot_price": sp,
"perp_price": pp,
"perp_basis_pct": round(100.0 * (pp - sp) / sp, 4)
if (sp and pp)
else None,
"term_structure": rows,
"data_timestamp": _dt.datetime.now(_dt.UTC).isoformat(),
}
async def get_basis_spot_perp(self, asset: str) -> dict:
asset = asset.upper()
symbol = f"{asset}USDT"
spot = await self.get_ticker(symbol, category="spot")
perp = await self.get_ticker(symbol, category="linear")
sp = spot.get("last_price")
pp = perp.get("last_price")
basis_abs = basis_pct = None
if sp and pp:
basis_abs = pp - sp
basis_pct = 100.0 * basis_abs / sp
return {
"asset": asset,
"symbol": symbol,
"spot_price": sp,
"perp_price": pp,
"basis_abs": basis_abs,
"basis_pct": basis_pct,
"funding_rate": perp.get("funding_rate"),
}
# ── trading (signed, write) ────────────────────────────────
async def place_order(
self,
category: str,
symbol: str,
side: str,
qty: float,
order_type: str = "Limit",
price: float | None = None,
tif: str = "GTC",
reduce_only: bool = False,
position_idx: int | None = None,
) -> dict:
body: dict[str, Any] = {
"category": category,
"symbol": symbol,
"side": side,
"qty": str(qty),
"orderType": order_type,
"timeInForce": tif,
"reduceOnly": reduce_only,
}
if price is not None:
body["price"] = str(price)
if position_idx is not None:
body["positionIdx"] = position_idx
if category == "option":
body["orderLinkId"] = f"cerbero-{uuid.uuid4().hex[:16]}"
resp = await self._request_signed("POST", "/v5/order/create", body=body)
r = resp.get("result") or {}
return self._envelope(
resp,
{
"order_id": r.get("orderId"),
"order_link_id": r.get("orderLinkId"),
"status": "submitted",
},
)
async def place_combo_order(
self,
category: str,
legs: list[dict[str, Any]],
) -> dict:
if category != "option":
raise ValueError(
"place_combo_order: Bybit batch order is only available for category='option'"
)
if len(legs) < 2:
raise ValueError("combo requires at least 2 legs")
request: list[dict[str, Any]] = []
for leg in legs:
entry: dict[str, Any] = {
"symbol": leg["symbol"],
"side": leg["side"],
"qty": str(leg["qty"]),
"orderType": leg.get("order_type", "Limit"),
"timeInForce": leg.get("tif", "GTC"),
"reduceOnly": leg.get("reduce_only", False),
"orderLinkId": f"cerbero-{uuid.uuid4().hex[:16]}",
}
if leg.get("price") is not None:
entry["price"] = str(leg["price"])
request.append(entry)
body = {"category": category, "request": request}
resp = await self._request_signed(
"POST", "/v5/order/create-batch", body=body
)
result_list = (resp.get("result") or {}).get("list") or []
orders = [
{
"order_id": r.get("orderId"),
"order_link_id": r.get("orderLinkId"),
"status": "submitted",
}
for r in result_list
]
return self._envelope(resp, {"orders": orders})
async def amend_order(
self,
category: str,
symbol: str,
order_id: str,
new_qty: float | None = None,
new_price: float | None = None,
) -> dict:
body: dict[str, Any] = {
"category": category,
"symbol": symbol,
"orderId": order_id,
}
if new_qty is not None:
body["qty"] = str(new_qty)
if new_price is not None:
body["price"] = str(new_price)
resp = await self._request_signed("POST", "/v5/order/amend", body=body)
r = resp.get("result") or {}
return self._envelope(
resp,
{
"order_id": r.get("orderId", order_id),
"status": "amended",
},
)
async def cancel_order(self, category: str, symbol: str, order_id: str) -> dict:
body = {"category": category, "symbol": symbol, "orderId": order_id}
resp = await self._request_signed("POST", "/v5/order/cancel", body=body)
r = resp.get("result") or {}
return self._envelope(
resp,
{
"order_id": r.get("orderId", order_id),
"status": "cancelled",
},
)
async def cancel_all_orders(
self, category: str, symbol: str | None = None
) -> dict:
body: dict[str, Any] = {"category": category}
if symbol:
body["symbol"] = symbol
resp = await self._request_signed(
"POST", "/v5/order/cancel-all", body=body
)
r = resp.get("result") or {}
ids = [x.get("orderId") for x in (r.get("list") or [])]
return self._envelope(
resp,
{
"cancelled_ids": ids,
"count": len(ids),
},
)
async def set_stop_loss(
self,
category: str,
symbol: str,
stop_loss: float,
position_idx: int = 0,
) -> dict:
body = {
"category": category,
"symbol": symbol,
"stopLoss": str(stop_loss),
"positionIdx": position_idx,
}
resp = await self._request_signed(
"POST", "/v5/position/trading-stop", body=body
)
return self._envelope(
resp,
{
"symbol": symbol,
"stop_loss": stop_loss,
"status": "stop_loss_set",
},
)
async def set_take_profit(
self,
category: str,
symbol: str,
take_profit: float,
position_idx: int = 0,
) -> dict:
body = {
"category": category,
"symbol": symbol,
"takeProfit": str(take_profit),
"positionIdx": position_idx,
}
resp = await self._request_signed(
"POST", "/v5/position/trading-stop", body=body
)
return self._envelope(
resp,
{
"symbol": symbol,
"take_profit": take_profit,
"status": "take_profit_set",
},
)
async def close_position(self, category: str, symbol: str) -> dict:
positions = await self.get_positions(category=category)
target = next(
(p for p in positions if p["symbol"] == symbol and (p["size"] or 0) > 0),
None,
)
if not target:
return {"error": "no_open_position", "symbol": symbol}
close_side = "Sell" if target["side"] == "Buy" else "Buy"
return await self.place_order(
category=category,
symbol=symbol,
side=close_side,
qty=target["size"],
order_type="Market",
reduce_only=True,
tif="IOC",
)
async def set_leverage(
self, category: str, symbol: str, leverage: int
) -> dict:
body = {
"category": category,
"symbol": symbol,
"buyLeverage": str(leverage),
"sellLeverage": str(leverage),
}
resp = await self._request_signed(
"POST", "/v5/position/set-leverage", body=body
)
return self._envelope(
resp,
{
"symbol": symbol,
"leverage": leverage,
"status": "leverage_set",
},
)
async def switch_position_mode(
self, category: str, symbol: str, mode: str
) -> dict:
mode_code = 3 if mode.lower() == "hedge" else 0  # Bybit: 0 = one-way, 3 = hedge
body = {
"category": category,
"symbol": symbol,
"mode": mode_code,
}
resp = await self._request_signed(
"POST", "/v5/position/switch-mode", body=body
)
return self._envelope(
resp,
{
"symbol": symbol,
"mode": mode,
"status": "mode_switched",
},
)
async def transfer_asset(
self,
coin: str,
amount: float,
from_type: str,
to_type: str,
) -> dict:
body = {
"transferId": str(uuid.uuid4()),
"coin": coin,
"amount": str(amount),
"fromAccountType": from_type,
"toAccountType": to_type,
}
resp = await self._request_signed(
"POST", "/v5/asset/transfer/inter-transfer", body=body
)
r = resp.get("result") or {}
return self._envelope(
resp,
{
"transfer_id": r.get("transferId"),
"coin": coin,
"amount": amount,
"status": "submitted",
},
)
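The signing scheme implemented by `_sign` above (timestamp + api_key + recv_window + payload, HMAC-SHA256, hex digest) can be exercised stand-alone. A sketch with dummy credentials (the helper name is ours; the values are not real keys):

```python
import hashlib
import hmac

def sign_v5(secret: str, api_key: str, recv_window: str,
            timestamp: str, payload: str) -> str:
    """Bybit V5 signature: HMAC-SHA256 over ts + key + recv_window + payload."""
    msg = timestamp + api_key + recv_window + payload
    return hmac.new(secret.encode("utf-8"), msg.encode("utf-8"),
                    hashlib.sha256).hexdigest()

# Deterministic for fixed inputs: same payload → same signature.
sig = sign_v5("secret", "key", "5000", "1700000000000",
              '{"category":"linear","symbol":"BTCUSDT"}')
print(len(sig))  # 64 hex characters
```

For GET requests the `payload` is the urlencoded querystring; for POST it is the compact JSON body, matching `_request_signed` above.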