Compare commits

46 Commits

95b8bcfe96 .. V2.0.0

| SHA1 |
|---|
| 91aadaea6a |
| 0ba5a05219 |
| c94312d79f |
| 110ca7f5cf |
| a56baad3dd |
| f8fb50cb83 |
| 880faa7fd4 |
| cddf88afb4 |
| 55bfeca88e |
| bea37fd734 |
| 6940e2865b |
| bdc40929d4 |
| 9bbc8c05f1 |
| 3510605fdd |
| 8914d613ec |
| 531b7b019c |
| 6266708e15 |
| 17700d27a0 |
| 12002642e5 |
| b9c58a376f |
| ded4414b32 |
| 611a2695a9 |
| f4f4e4efd7 |
| 0c74691e7c |
| b49b2b36e0 |
| 92da6aa842 |
| a90c5c4d6f |
| ae63aaf69a |
| 92cc45c896 |
| 3a85ff05e6 |
| 391f2c02e0 |
| 109b8e4686 |
| 1ca1687c9b |
| 8a0f37ebc2 |
| 6640ede3df |
| d8136713b9 |
| 9e7b98579b |
| 51081f4e18 |
| 8ecc1a24a9 |
| 9afd087152 |
| 69ac878893 |
| bd6b03ce43 |
| 43bf8fc461 |
| c0b4cb5d5c |
| 44c7a18d3e |
| 6097dde4e4 |
@@ -17,8 +17,14 @@ TESTNET_TOKEN=
MAINNET_TOKEN=

# ─── EXCHANGE — DERIBIT ───────────────────────────────────
# Single pair (used for both testnet and mainnet):
DERIBIT_CLIENT_ID=
DERIBIT_CLIENT_SECRET=
# Or distinct pairs per env (take precedence when set):
# DERIBIT_CLIENT_ID_TESTNET=
# DERIBIT_CLIENT_SECRET_TESTNET=
# DERIBIT_CLIENT_ID_LIVE=
# DERIBIT_CLIENT_SECRET_LIVE=
DERIBIT_URL_LIVE=https://www.deribit.com/api/v2
DERIBIT_URL_TESTNET=https://test.deribit.com/api/v2
DERIBIT_MAX_LEVERAGE=3
@@ -45,6 +51,39 @@ ALPACA_URL_LIVE=https://api.alpaca.markets
ALPACA_URL_TESTNET=https://paper-api.alpaca.markets
ALPACA_MAX_LEVERAGE=1

# ─── EXCHANGE — IBKR ──────────────────────────────────────
# OAuth setup: see the README "IBKR Setup" section + scripts/ibkr_oauth_setup.py.
# The RSA keys (PEM) do NOT go in .env: mount them as files and reference the paths.

IBKR_CONSUMER_KEY=
IBKR_ACCESS_TOKEN=
IBKR_ACCESS_TOKEN_SECRET=
IBKR_SIGNATURE_KEY_PATH=/secrets/ibkr_signature.pem
IBKR_ENCRYPTION_KEY_PATH=/secrets/ibkr_encryption.pem
IBKR_DH_PRIME=

# Env-specific pairs (take precedence):
# IBKR_CONSUMER_KEY_TESTNET=
# IBKR_ACCESS_TOKEN_TESTNET=
# IBKR_ACCESS_TOKEN_SECRET_TESTNET=
# IBKR_SIGNATURE_KEY_PATH_TESTNET=/secrets/ibkr_signature_paper.pem
# IBKR_ENCRYPTION_KEY_PATH_TESTNET=/secrets/ibkr_encryption_paper.pem
# IBKR_ACCOUNT_ID_TESTNET=DU1234567
# IBKR_CONSUMER_KEY_LIVE=
# IBKR_ACCESS_TOKEN_LIVE=
# IBKR_ACCESS_TOKEN_SECRET_LIVE=
# IBKR_SIGNATURE_KEY_PATH_LIVE=/secrets/ibkr_signature_live.pem
# IBKR_ENCRYPTION_KEY_PATH_LIVE=/secrets/ibkr_encryption_live.pem
# IBKR_ACCOUNT_ID_LIVE=U1234567

IBKR_URL_LIVE=https://api.ibkr.com/v1/api
IBKR_URL_TESTNET=https://api.ibkr.com/v1/api
IBKR_WS_URL_LIVE=wss://api.ibkr.com/v1/api/ws
IBKR_WS_URL_TESTNET=wss://api.ibkr.com/v1/api/ws
IBKR_MAX_LEVERAGE=4
IBKR_WS_MAX_SUBSCRIPTIONS=80
IBKR_WS_IDLE_TIMEOUT_S=300

# ─── DATA PROVIDERS — MACRO ───────────────────────────────
FRED_API_KEY=
FINNHUB_API_KEY=

@@ -9,8 +9,8 @@ on the bearer token supplied by the client.

- **A single Docker image** (`cerbero-mcp`) hosts all the exchange
  routers in a single FastAPI process
-- **Four exchanges** (Deribit, Bybit, Hyperliquid, Alpaca) and **two
-  read-only data providers** (Macro, Sentiment)
+- **Five exchanges** (Deribit, Bybit, Hyperliquid, Alpaca, IBKR) and **two
+  read-only data providers** (Macro, Sentiment)
- **Per-request testnet/mainnet switch** via the
  `Authorization: Bearer <TOKEN>` header: the same container serves both
  environments without restarts
@@ -19,7 +19,10 @@ on the bearer token supplied by the client.
  overridable via dedicated variables (`DERIBIT_URL_*`,
  `BYBIT_URL_*`, `HYPERLIQUID_URL_*`, `ALPACA_URL_*`)
- **Interactive documentation**: OpenAPI/Swagger exposed at `/apidocs`
-- **Verified quality**: 259 tests (unit + integration + smoke), clean
-  mypy, clean ruff
+- **Unified cross-exchange endpoint** (`/mcp-cross/tools/get_historical`):
+  fan-out to every exchange that supports (symbol, asset_class), with
+  per-bar consensus (median OHLC + `div_pct` + `sources`)
+- **Verified quality**: 399 tests (unit + integration + smoke), clean
+  mypy, clean ruff

## Quick start (development, without Docker)

@@ -62,19 +65,101 @@ The purely read-only tools (`/mcp-macro/*` and `/mcp-sentiment/*`)
still require a valid bearer, but its value (testnet or mainnet) is
irrelevant because they have no testnet endpoints.

### X-Bot-Tag header (bot identification)

All calls to `/mcp-*` additionally require the `X-Bot-Tag` header with a
string identifying the calling bot (max 64 characters). The value is
logged in the audit records to trace which bot performed each write
operation. Example request:

    Authorization: Bearer $MAINNET_TOKEN
    X-Bot-Tag: scanner-alpha-prod

If the header is missing or empty the response is `400 BAD_REQUEST`. The
header is not required on the public endpoints (`/health`, `/apidocs`,
`/openapi.json`) nor on the admin endpoint `/admin/audit`.

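A minimal client-side sketch of the two required headers. The base URL,
token variable, tool name, and payload are illustrative assumptions, not
part of the documented API:

```python
import os

import requests

BASE = "http://localhost:9000"  # hypothetical deployment URL

resp = requests.post(
    f"{BASE}/mcp-deribit/tools/get_ticker",  # hypothetical tool call
    headers={
        # The bearer selects the environment (testnet vs mainnet token).
        "Authorization": f"Bearer {os.environ['MAINNET_TOKEN']}",
        # Required on every /mcp-* call; max 64 chars, logged in audit records.
        "X-Bot-Tag": "scanner-alpha-prod",
    },
    json={"symbol": "BTC-PERPETUAL"},
    timeout=10,
)
resp.raise_for_status()
print(resp.json())
```
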
## Main endpoints

| Path | Description |
|---|---|
-| `GET /health` | Healthcheck (no auth) |
+| `GET /health` | Liveness check (no auth) |
+| `GET /health/ready` | Readiness check with exchange-client pings (no auth) |
| `GET /apidocs` | Swagger UI (no auth) |
| `GET /openapi.json` | OpenAPI 3.1 schema (no auth) |
| `POST /mcp-deribit/tools/{tool}` | Deribit exchange tools |
| `POST /mcp-bybit/tools/{tool}` | Bybit exchange tools |
| `POST /mcp-hyperliquid/tools/{tool}` | Hyperliquid exchange tools |
| `POST /mcp-alpaca/tools/{tool}` | Alpaca exchange tools |
| `POST /mcp-ibkr/tools/{tool}` | Interactive Brokers exchange tools |
| `POST /mcp-macro/tools/{tool}` | Macro/market-data tools |
| `POST /mcp-sentiment/tools/{tool}` | Sentiment/news tools |
| `POST /mcp-cross/tools/get_historical` | Cross-exchange aggregated history with consensus + divergence |
| `GET /admin/audit` | Query of the JSONL audit log (bearer required, no X-Bot-Tag) |

## Observability

### Health check

The application exposes two distinct monitoring endpoints:

- `GET /health` — simple liveness check. Requires no authentication and
  always returns HTTP 200 as long as the process is alive. Ideal for a
  Kubernetes liveness probe or for Traefik's pinger.
- `GET /health/ready` — richer readiness check. It iterates over all the
  exchange clients in the registry and, for each one, attempts a
  lightweight probe (`health()` if available, falling back to
  `is_testnet()`), with a 2-second timeout per client. The response
  contains a `status` field with one of the values `ready` (all clients
  respond), `degraded` (at least one fails) or `not_ready` (empty
  registry), plus a `clients` array with one record per cached
  `(exchange, env)` pair. By default the endpoint always responds with
  HTTP 200; setting the environment variable
  `READY_FAILS_ON_DEGRADED=true` forces HTTP 503 whenever the status is
  not `ready`, which is useful for a Kubernetes readiness probe.

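A small polling sketch of how a monitor might consume `/health/ready`.
The URL and the per-client record shape (the `ok` flag) are assumptions;
the `status` values come from the description above:

```python
import time

import requests

READY_URL = "http://localhost:9000/health/ready"  # hypothetical URL

def check_ready() -> None:
    """Poll the readiness endpoint and report per-client failures."""
    body = requests.get(READY_URL, timeout=5).json()
    status = body["status"]  # "ready" | "degraded" | "not_ready"
    if status != "ready":
        # With READY_FAILS_ON_DEGRADED=true this request would instead
        # fail with HTTP 503 whenever the status is not "ready".
        failing = [c for c in body.get("clients", []) if not c.get("ok")]
        print(f"readiness={status}, failing clients: {failing}")

if __name__ == "__main__":
    while True:
        check_ready()
        time.sleep(30)
```
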
### Request log

Every HTTP request passes through a middleware that emits a JSON line on
the `mcp.request` logger with the following fields: `request_id`,
`method`, `path`, `status_code`, `duration_ms`, `actor` (`testnet` or
`mainnet`, only when authenticated), `bot_tag` (the `X-Bot-Tag` header
if present), `exchange` (extracted from the `/mcp-{exchange}/...` path),
`tool` (the tool name when the path is `/mcp-X/tools/Y`), `client_ip`,
`user_agent`. The same `request_id` is also included in the `mcp.audit`
audit-log records and in the error envelope returned to the client, so
the three traces of a given request can be correlated.

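An illustrative record with invented values, assembled from the field
list above (the exact serialization is not documented here):

```python
# Illustrative only: field names from the list above, values made up.
example_request_log = {
    "request_id": "9f1c2e0a",
    "method": "POST",
    "path": "/mcp-deribit/tools/place_order",
    "status_code": 200,
    "duration_ms": 41.7,
    "actor": "testnet",
    "bot_tag": "scanner-alpha-prod",
    "exchange": "deribit",
    "tool": "place_order",
    "client_ip": "10.0.0.5",
    "user_agent": "python-httpx/0.27",
}
```
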
### Audit log

See the "Audit query" section below for how to consult the structured
log of write operations.

## Audit query

`GET /admin/audit` reads the JSONL file pointed to by `AUDIT_LOG_FILE`
and returns the filtered records. It requires a valid bearer (testnet or
mainnet); it does not require the `X-Bot-Tag` header.

Query parameters (all optional):

- `from`, `to`: ISO 8601 datetimes (e.g. `2026-05-01` or `2026-05-01T12:34:56Z`)
- `actor`: `testnet` | `mainnet`
- `exchange`: exchange name (`deribit`, `bybit`, `hyperliquid`, `alpaca`, `ibkr`)
- `action`: tool name (e.g. `place_order`)
- `bot_tag`: bot identifier
- `limit`: maximum number of records returned, default `1000`, max `10000`

Example call:

    curl -H "Authorization: Bearer $MAINNET_TOKEN" \
      "http://localhost:9000/admin/audit?from=2026-05-01&actor=mainnet&action=place_order&limit=100"

If `AUDIT_LOG_FILE` is not configured the endpoint responds with
`count: 0` and a `warning` field. To enable the persistent sink, set in
`.env`:

    AUDIT_LOG_FILE=/var/log/cerbero-mcp/audit.jsonl
    AUDIT_LOG_BACKUP_DAYS=30

## Available tools

@@ -106,6 +191,13 @@ rate, spot/perp basis, place_order, set_stop_loss, set_take_profit.

Account, positions, bars, snapshot, option chain, place_order,
amend_order, cancel_order, close_position.

### IBKR (Interactive Brokers)
Account, positions, activities, ticker, bars, snapshot, option chain,
search_contracts, clock, streaming (tick + depth via a WebSocket
singleton), place_order, amend_order, cancel_order, close_position,
bracket/OCO/OTO orders. Auth via OAuth 1.0a Self-Service with unattended
session-token minting (see the "IBKR Setup" section further below).

### Macro
Treasury yields, FRED indicators, equity futures, asset prices, calendar,
get_yield_curve_slope, get_breakeven_inflation, get_cot_tff,
@@ -116,6 +208,16 @@ News (CryptoPanic/CoinDesk), social (LunarCrush), multi-exchange funding,
OI history, get_funding_arb_spread, get_liquidation_heatmap,
get_cointegration_pairs.

### Cross (unified history)
`get_historical` aggregates candles for the same symbol from all the
exchanges that support it and returns a consensus series: the close is
the median, `sources` is the number of exchanges that contributed to the
bar, and `div_pct = (max-min)/median` flags disagreement between
sources, serving as a quality gate for bots. Crypto: BTC/ETH/SOL via
Bybit + Hyperliquid + Deribit. Stocks: AAPL/SPY/QQQ/TSLA/NVDA via
Alpaca. On partial failure it returns the available data plus
`failed_sources`; if *all* upstreams fail → HTTP 502 retryable.

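A minimal sketch of the per-bar consensus math described above; the
input shape (a list of closes, one per exchange) is a simplification,
but the median and `div_pct` arithmetic follows the formula in the text:

```python
from statistics import median

def consensus_bar(closes: list[float]) -> dict:
    """Aggregate one bar's close across contributing exchanges."""
    med = median(closes)
    return {
        "close": med,
        "sources": len(closes),
        # div_pct = (max - min) / median: the divergence quality gate
        "div_pct": (max(closes) - min(closes)) / med if med else 0.0,
    }

# Closes for the same bar from three exchanges:
print(consensus_bar([60010.0, 60025.5, 59998.0]))
```
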
## VPS deploy with Traefik

On the VPS the public network (TLS, IP allowlist, rate limiting) is handled by
@@ -139,18 +241,58 @@ labels:

## Build & deploy pipeline

-Image build performed on the development machine:
+The VPS deploy works **by cloning the repo directly**, without going
+through a container registry. The `scripts/deploy-vps.sh` script
+automates the whole flow: pull of the target branch, image rebuild on
+the VPS machine, service restart, healthcheck, and automatic rollback
+on failure.

### Initial setup on the VPS (one time only)

```bash
-export GITEA_PAT='<PAT with write:package scope>'
-./scripts/build-push.sh
+# On the VPS:
+sudo mkdir -p /opt/cerbero-mcp
+sudo chown -R "$USER":"$USER" /opt/cerbero-mcp
+cd /opt/cerbero-mcp
+git clone -b V2.0.0 ssh://git@git.tielogic.xyz:222/Adriano/Cerbero-mcp.git .
+cp .env.example .env
+# edit .env with the real tokens and credentials
```

-The script tags `:2.0.0`, `:latest` and `:sha-<short>` for pinpoint
-rollbacks and publishes to the Gitea registry. On the VPS, Watchtower
-polls `:latest` and updates the container automatically.
+The production branch is `V2.0.0` (not `main`). The `deploy-vps.sh`
+script defaults to this branch.

-Post-deploy smoke test:
+### Recurring deploy

From any machine with SSH access to the VPS:

```bash
ssh user@vps 'cd /opt/cerbero-mcp && bash scripts/deploy-vps.sh'
```

Or directly on the VPS:

```bash
cd /opt/cerbero-mcp
bash scripts/deploy-vps.sh
```

The script:

1. checks that the working tree is clean and that `.env` is present;
2. runs `git fetch` + `reset --hard origin/V2.0.0`;
3. exits without doing anything if the SHA has not changed (override
   with `FORCE=1`);
4. rebuilds the Docker image (`docker compose build`);
5. gracefully restarts the container (`docker compose down --timeout 15`
   followed by `docker compose up -d`);
6. waits for `/health` (30 s timeout by default);
7. automatically rolls back to the previous SHA if the health check fails.

Accepted environment variables: `BRANCH` (default `V2.0.0`), `PORT`
(default read from `.env`), `HEALTH_TIMEOUT_SECONDS`, `FORCE`,
`SKIP_ROLLBACK`.

### Post-deploy smoke test

```bash
PORT=9000 TESTNET_TOKEN="$TESTNET_TOKEN" bash tests/smoke/run.sh
```

@@ -160,7 +302,7 @@ PORT=9000 TESTNET_TOKEN="$TESTNET_TOKEN" bash tests/smoke/run.sh

```bash
uv sync
-uv run pytest                      # the whole suite (259 expected tests)
+uv run pytest                      # the whole suite (399 expected tests)
uv run pytest tests/unit -v        # unit only
uv run pytest tests/integration -v
uv run ruff check src/ tests/
```

@@ -241,6 +383,69 @@ pybit (workaround documented in the client). For Alpaca the override is
applied to the trading endpoint only: the data endpoints
(`data.alpaca.markets`) keep the SDK defaults.

## IBKR Setup

IBKR uses OAuth 1.0a Self-Service for fully unattended runtime auth.
Setup is a one-time manual step per account (paper + live); afterwards
the container mints live session tokens autonomously.

### One-time setup

1. Log in to https://www.interactivebrokers.com → User Settings → Self-Service OAuth
2. Generate keypairs locally:

```bash
uv run python scripts/ibkr_oauth_setup.py --env testnet
```

This writes RSA keys under `secrets/` and prints SHA-256 fingerprints.

3. Register the two fingerprints in the IBKR portal. Receive a `consumer_key`.
4. Get a request token + authorization URL:

```bash
uv run python scripts/ibkr_oauth_setup.py --env testnet \
  --consumer-key <K> --request-token
```

5. Open the URL, authorize, copy the `verifier_code`.
6. Exchange the verifier for a long-lived access token (~5 years validity):

```bash
uv run python scripts/ibkr_oauth_setup.py --env testnet --verifier <V>
```

7. Copy the printed values into `.env`:
   - `IBKR_CONSUMER_KEY_TESTNET`
   - `IBKR_ACCESS_TOKEN_TESTNET`
   - `IBKR_ACCESS_TOKEN_SECRET_TESTNET`
   - `IBKR_SIGNATURE_KEY_PATH_TESTNET`
   - `IBKR_ENCRYPTION_KEY_PATH_TESTNET`
   - `IBKR_ACCOUNT_ID_TESTNET` (e.g., `DU1234567` for paper)
   - `IBKR_DH_PRIME` (hex from the portal; shared paper/live)
8. Repeat with `--env mainnet` for live trading.

### Smoke test

```bash
curl https://cerbero-mcp.<dom>/mcp-ibkr/tools/get_account \
  -H "Authorization: Bearer <TESTNET_TOKEN>" -X POST -d '{}'
```

### Key rotation

```bash
# 1. Generate new keypairs alongside the existing ones
uv run python scripts/ibkr_oauth_setup.py --env testnet --rotate

# 2. Register the new fingerprints in the IBKR portal, get the new consumer_key + tokens

# 3. Confirm the rotation (atomic swap with auto-rollback on validation failure)
curl -X POST "https://cerbero-mcp.<dom>/admin/ibkr/rotate-keys/confirm?env=testnet" \
  -H "Authorization: Bearer <ADMIN_TOKEN>" -H "Content-Type: application/json" \
  -d '{"new_consumer_key":"...","new_access_token":"...","new_access_token_secret":"..."}'
```

## License

Private.

@@ -3,9 +3,9 @@ services:
    image: cerbero-mcp:2.0.0
    build: .
    container_name: cerbero-mcp
    ports:
      - "${PORT:-9000}:${PORT:-9000}"
    env_file: .env
    volumes:
      - ./secrets:/secrets:ro
    restart: unless-stopped
    healthcheck:
      test:
@@ -16,3 +16,18 @@ services:
      interval: 30s
      timeout: 5s
      retries: 3
    networks:
      - traefik
    labels:
      - traefik.enable=true
      - traefik.docker.network=traefik
      - "traefik.http.routers.cerbero-mcp.rule=Host(`cerbero-mcp.${DOMAIN_NAME:-tielogic.xyz}`)"
      - traefik.http.routers.cerbero-mcp.tls=true
      - traefik.http.routers.cerbero-mcp.entrypoints=websecure
      - traefik.http.routers.cerbero-mcp.tls.certresolver=mytlschallenge
      - "traefik.http.services.cerbero-mcp.loadbalancer.server.port=${PORT:-9000}"
      - "com.centurylinklabs.watchtower.enable=true"

networks:
  traefik:
    external: true

@@ -0,0 +1,530 @@
# IBKR Integration — Design Spec

**Date:** 2026-05-03
**Branch:** V2.0.0
**Status:** Approved (pending implementation plan)
**Approach chosen:** A2 — Client Portal Web API with OAuth 1.0a Self-Service (fully unattended)

## 1. Goals & Non-Goals

### Goals

Add `ibkr` as a supported exchange in `cerbero-mcp`, reusing the consolidated pattern (Alpaca/Deribit) for:

- account / positions / activities (read)
- simple orders: market, limit, stop, stop-limit (read + write)
- complex orders: bracket (entry + SL + TP with OCA), OCO (N legs, OCA type=1), OTO (parent → sequential child)
- market data: REST snapshot + real-time tick/depth via WebSocket (snapshot-on-demand)
- options chain via OCC symbol
- semi-automatic key rotation via an admin endpoint with auto-rollback
- testnet (paper account) / mainnet (live account) routing via bearer token, like the other exchanges

### Non-Goals (V1)

- Server-Sent Events / streaming HTTP responses (snapshot-on-demand is sufficient)
- Dynamic multi-account support (a single `account_id` per env, configured in settings)
- Trailing stop, if-touched, advanced conditional orders (fixed bracket, OCO, OTO only)
- Fully automatic rotation of the consumer registration (the IBKR portal step cannot be automated)
- WebSocket streaming exposed directly to the bot (it stays internal to the server, exposed as REST polling)
- TWS API socket protocol via `ib_insync` (rejected: requires a desktop gateway with Xvfb, fragile)
- Flex Web Service (rejected: read-only over historical reports, out of scope)

### Success criteria

1. `POST /mcp-ibkr/tools/get_account` with a testnet bearer returns the real paper-account balance
2. `POST /mcp-ibkr/tools/place_order` (1 share AAPL market) → order filled in paper, audit log present
3. `POST /mcp-ibkr/tools/place_bracket_order` → 3 orders linked via an OCA group, the first fill cancels the others
4. `POST /mcp-ibkr/tools/get_depth` → 5 depth levels with < 1 s data latency
5. `POST /admin/ibkr/rotate-keys/{start,confirm}` → atomic swap, automatic rollback on validation failure
6. Container restart → first `get_account` in < 5 s (OAuth flow + first call), zero human input
7. Green test suite: 90% coverage on `oauth.py`/`client.py`/`ws.py`/`key_rotation.py`, 85% on `tools.py`/`orders_complex.py`
8. `/health/ready` reports IBKR healthy for both envs

## 2. Architecture

```
┌──────┐  Bearer testnet|mainnet   ┌──────────────────┐
│ Bot  │ ─────────────────────────▶│  cerbero-mcp     │
└──────┘                           │  (single FastAPI)│
                                   │                  │
                                   │  IBKRClient      │ ──HTTPS OAuth1a──▶ api.ibkr.com/v1/api
                                   │  IBKRWebSocket   │ ──WSS LST───────▶ api.ibkr.com/v1/api/ws
                                   └──────────────────┘
```

**Key decisions:**

- **OAuth 1.0a Self-Service:** RSA-SHA256 signing + DH key exchange to mint live session tokens (24 h TTL) autonomously. One-time manual setup on the IBKR portal; the runtime is fully unattended.
- **Single container:** no Java sidecar (considered for the non-OAuth CP Gateway, discarded because it required an interactive login).
- **Internal snapshot-on-demand WebSocket:** one `IBKRWebSocket` singleton per env keeps the subscriptions alive in the background; the REST tools `get_tick`/`get_depth` return the latest cached snapshot. The bot is polling-based; no streaming toward the bot.
- **Paper vs live = separate accounts, same host:** IBKR uses `api.ibkr.com` for both; tests run against a paper account (with its own OAuth bundle), live against a live account. Two credential sets in settings (the Deribit `_TESTNET` / `_LIVE` pattern).
- **Conid cache:** IBKR identifies instruments by a numeric `conid`; the symbol→conid lookup is cached (LRU 1024, TTL 1 h) to avoid repeated round-trips (see the sketch after this list).
- **Two-level keep-alive:** the brokerage session dies after 5 min of idling (requires `POST /tickle`); the live session token dies after 24 h (requires a DH re-mint). The client handles the two separately.

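A minimal sketch of the conid cache decision (LRU, max 1024 entries,
1 h TTL); the class name and integration point are assumptions:

```python
import time
from collections import OrderedDict

class ConidCache:
    """LRU cache with per-entry TTL for symbol→conid lookups."""

    def __init__(self, maxsize: int = 1024, ttl_s: float = 3600.0) -> None:
        self._data: OrderedDict[str, tuple[float, int]] = OrderedDict()
        self.maxsize, self.ttl_s = maxsize, ttl_s

    def get(self, symbol: str) -> int | None:
        hit = self._data.get(symbol)
        if hit is None:
            return None
        ts, conid = hit
        if time.monotonic() - ts > self.ttl_s:  # expired: treat as a miss
            del self._data[symbol]
            return None
        self._data.move_to_end(symbol)  # refresh the LRU position
        return conid

    def put(self, symbol: str, conid: int) -> None:
        self._data[symbol] = (time.monotonic(), conid)
        self._data.move_to_end(symbol)
        if len(self._data) > self.maxsize:  # evict the least recently used
            self._data.popitem(last=False)
```
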
## 3. Components (file layout)

```
src/cerbero_mcp/exchanges/ibkr/
├── __init__.py
├── client.py          # IBKRClient: httpx REST + tickle + conid cache
├── oauth.py           # OAuth1aSigner: RSA sig + DH live session token mint/refresh
├── ws.py              # IBKRWebSocket: persistent WSS, smd/sbd subs, snapshot cache
├── orders_complex.py  # bracket/OCO/OTO payload builders
├── key_rotation.py    # KeyRotationManager: stage/confirm/abort/rollback
├── tools.py           # Pydantic schemas + async tool functions (read + write + complex + streaming)
└── leverage_cap.py    # get_max_leverage(creds) — 1:1 copy from alpaca

src/cerbero_mcp/routers/ibkr.py   # POST /mcp-ibkr/tools/*

scripts/ibkr_oauth_setup.py       # one-shot: generate keypair, portal walkthrough, --rotate flag

tests/unit/exchanges/ibkr/
├── __init__.py
├── test_oauth.py
├── test_client.py
├── test_ws.py
├── test_orders_complex.py
├── test_key_rotation.py
└── test_tools.py
```

**Existing files modified:**

- `src/cerbero_mcp/settings.py` — add `IBKRSettings`
- `src/cerbero_mcp/exchanges/__init__.py` — `if exchange == "ibkr"` branch in `build_client`
- `src/cerbero_mcp/__main__.py` — `app.include_router(ibkr.make_router())`
- `src/cerbero_mcp/admin.py` — `/admin/ibkr/rotate-*` + `/admin/ibkr/health` endpoints
- `.env.example` — `# ─── EXCHANGE — IBKR ───` section
- `pyproject.toml` — add `cryptography>=43` (RSA + DH; may already be transitive)
- `docker-compose.yml` — bind mount `./secrets:/secrets:ro`
- `README.md` — "IBKR Setup" section

**Module boundaries:**

- `oauth.py` exposes `OAuth1aSigner.get_live_session_token() -> str` (cached). It knows nothing about application endpoints; it only knows the `/oauth/live_session_token` endpoint.
- `client.py` receives an `OAuth1aSigner` as a dependency and does not build keys. It knows nothing about WebSockets.
- `ws.py` receives an `OAuth1aSigner` and manages the WSS independently. It exposes the async methods `subscribe_tick(conid)`, `subscribe_depth(conid, rows)`, `get_tick_snapshot(conid)`, `get_depth_snapshot(conid)`. It knows nothing about REST.
- `orders_complex.py` is a set of **pure** functions producing IBKR-ready JSON payloads. No HTTP. Deterministic tests.
- `key_rotation.py` operates on the filesystem + an `OAuth1aSigner` factory; it does not touch FastAPI routing directly.

## 4. Settings & OAuth flow

### Pydantic settings

```python
class IBKRSettings(_Sub):
    model_config = SettingsConfigDict(
        env_file=".env", env_file_encoding="utf-8",
        env_prefix="IBKR_", extra="ignore",
    )
    # Single-pair fallback (legacy / dev)
    consumer_key: str | None = None
    access_token: str | None = None
    access_token_secret: SecretStr | None = None
    signature_key_path: str | None = None
    encryption_key_path: str | None = None
    dh_prime: SecretStr | None = None

    # Env-specific pairs (take precedence when set)
    consumer_key_testnet: str | None = None
    access_token_testnet: str | None = None
    access_token_secret_testnet: SecretStr | None = None
    signature_key_path_testnet: str | None = None
    encryption_key_path_testnet: str | None = None
    account_id_testnet: str | None = None

    consumer_key_live: str | None = None
    access_token_live: str | None = None
    access_token_secret_live: SecretStr | None = None
    signature_key_path_live: str | None = None
    encryption_key_path_live: str | None = None
    account_id_live: str | None = None

    # URLs (paper and live share the host)
    url_live: str = "https://api.ibkr.com/v1/api"
    url_testnet: str = "https://api.ibkr.com/v1/api"
    ws_url_live: str = "wss://api.ibkr.com/v1/api/ws"
    ws_url_testnet: str = "wss://api.ibkr.com/v1/api/ws"

    # Limits
    max_leverage: int = 4  # Reg-T default
    ws_max_subscriptions: int = 80
    ws_idle_timeout_s: int = 300

    def credentials(self, env: str) -> dict:
        """Return the full OAuth dict for env. ValueError on missing fields.

        For each field: prefer `<field>_<env>`; fall back to `<field>` (legacy);
        ValueError if both are absent for the required fields
        (consumer_key, access_token, access_token_secret, signature_key_path,
        encryption_key_path, account_id, dh_prime).
        Pattern identical to DeribitSettings.credentials().
        """
```

`dh_prime` is a hex string issued by IBKR at setup time, **constant** per consumer (shared paper/live), not duplicated per env.

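A standalone sketch of the resolution order the docstring describes
(prefer `<field>_<env>`, fall back to the legacy base field, raise when
a required field is missing everywhere); an approximation, not the
actual implementation:

```python
_REQUIRED = (
    "consumer_key", "access_token", "access_token_secret",
    "signature_key_path", "encryption_key_path", "account_id", "dh_prime",
)

def resolve_credentials(settings, env: str) -> dict:
    """env is "testnet" or "live"; ValueError on missing required fields."""
    creds: dict = {}
    for field in _REQUIRED:
        # Prefer the env-specific field; account_id has no legacy base
        # field and dh_prime has no env-specific variant, so the getattr
        # defaults cover both shapes.
        value = getattr(settings, f"{field}_{env}", None)
        if value is None:
            value = getattr(settings, field, None)  # legacy single pair
        if value is None:
            raise ValueError(f"IBKR credential missing for {env}: {field}")
        creds[field] = value
    return creds
```
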
### `.env.example`

```env
# ─── EXCHANGE — IBKR ──────────────────────────────────────
# OAuth setup: see the README "IBKR Setup" section + scripts/ibkr_oauth_setup.py.
# The RSA keys (PEM) do NOT go in .env: mount them as files and reference the paths.

IBKR_CONSUMER_KEY=
IBKR_ACCESS_TOKEN=
IBKR_ACCESS_TOKEN_SECRET=
IBKR_SIGNATURE_KEY_PATH=/secrets/ibkr_signature.pem
IBKR_ENCRYPTION_KEY_PATH=/secrets/ibkr_encryption.pem
IBKR_DH_PRIME=

# Env-specific pairs (take precedence):
# IBKR_CONSUMER_KEY_TESTNET=
# IBKR_ACCESS_TOKEN_TESTNET=
# IBKR_ACCESS_TOKEN_SECRET_TESTNET=
# IBKR_SIGNATURE_KEY_PATH_TESTNET=/secrets/ibkr_signature_paper.pem
# IBKR_ENCRYPTION_KEY_PATH_TESTNET=/secrets/ibkr_encryption_paper.pem
# IBKR_ACCOUNT_ID_TESTNET=DU1234567
# IBKR_CONSUMER_KEY_LIVE=
# IBKR_ACCESS_TOKEN_LIVE=
# IBKR_ACCESS_TOKEN_SECRET_LIVE=
# IBKR_SIGNATURE_KEY_PATH_LIVE=/secrets/ibkr_signature_live.pem
# IBKR_ENCRYPTION_KEY_PATH_LIVE=/secrets/ibkr_encryption_live.pem
# IBKR_ACCOUNT_ID_LIVE=U1234567

IBKR_URL_LIVE=https://api.ibkr.com/v1/api
IBKR_URL_TESTNET=https://api.ibkr.com/v1/api
IBKR_WS_URL_LIVE=wss://api.ibkr.com/v1/api/ws
IBKR_WS_URL_TESTNET=wss://api.ibkr.com/v1/api/ws
IBKR_MAX_LEVERAGE=4
IBKR_WS_MAX_SUBSCRIPTIONS=80
IBKR_WS_IDLE_TIMEOUT_S=300
```

### One-shot OAuth setup (manual, once per account)

1. Log in to the portal `https://www.interactivebrokers.com` → "User Settings" → "Self-Service OAuth"
2. `python scripts/ibkr_oauth_setup.py --env testnet` → generates 2 RSA keypairs + prints the SHA-256 fingerprints
3. On the portal: register the 2 public keys, obtain the `consumer_key`
4. `python scripts/ibkr_oauth_setup.py --consumer-key <K> --request-token` → obtains a request token + authorization URL
5. Open the URL in a browser, authorize, copy the `verifier_code`
6. `python scripts/ibkr_oauth_setup.py --verifier <V>` → exchanges it for an `access_token` (long-lived, ~5 years) + `access_token_secret`
7. Copy the 3 values into `.env`. Repeat for the live env.

### Runtime flow (fully unattended)

- Container starts → `IBKRClient` is lazily instantiated at the first request
- `OAuth1aSigner` loads the RSA private keys from disk (paths from settings)
- First private request → `_get_live_session_token()`:
  1. Generate nonce + timestamp
  2. Sign `POST /oauth/live_session_token` with RSA-SHA256
  3. Diffie-Hellman key exchange with `dh_prime`
  4. Receive the `lst` (live session token, valid 24 h)
  5. Cache it in memory with expiry `now + 86000s`
- Subsequent requests: HMAC-SHA256 of the params with the `lst` as key (see the sketch after this list)
- Expired `lst` → automatic re-mint, retry once. Never any human input at runtime.
- On every private request: if the last call was > 4 min ago, call `POST /tickle` (brokerage-session keep-alive) first.

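A heavily simplified sketch of that lifecycle: an in-memory LST cache
with the 86000 s expiry, re-mint on demand, and HMAC-SHA256 signing of
subsequent requests with the LST as key. The RSA/DH mint itself is
hidden behind a placeholder, and the OAuth 1.0a base-string construction
is omitted:

```python
import base64
import hashlib
import hmac
import time

class LstCache:
    """Mint-on-demand live session token, cached for 86000 s."""

    def __init__(self, mint_fn) -> None:
        self._mint_fn = mint_fn  # RSA-signed POST /oauth/live_session_token + DH
        self._lst: bytes | None = None
        self._expires_at = 0.0

    def get(self) -> bytes:
        if self._lst is None or time.monotonic() >= self._expires_at:
            self._lst = self._mint_fn()  # automatic re-mint when expired
            self._expires_at = time.monotonic() + 86_000
        return self._lst

def sign_request(lst: bytes, base_string: str) -> str:
    """Subsequent requests: HMAC-SHA256 of the OAuth base string, LST as key."""
    digest = hmac.new(lst, base_string.encode(), hashlib.sha256).digest()
    return base64.b64encode(digest).decode()
```
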
### Errors → error envelope

| Trigger | Code | retryable |
|---|---|---|
| RSA key file missing | `IBKR_KEY_NOT_FOUND` | false |
| RSA key file unreadable | `IBKR_KEY_INVALID` | false |
| Consumer revoked on the portal | `IBKR_CONSUMER_REVOKED` | false |
| Access token expired (~5 years) | `IBKR_ACCESS_TOKEN_EXPIRED` | false |
| LST mint failed (network) | `IBKR_SESSION_MINT_FAILED` | true |
| `401` on a signed request | `IBKR_AUTH_FAILED` | true (forces an LST refresh + retry once) |
| Rate limit `429` | `IBKR_RATE_LIMITED` | true |
| Sunday maintenance window | `IBKR_MAINTENANCE` | true |
| Configured account not in `/iserver/accounts` | `IBKR_ACCOUNT_NOT_FOUND` | false |
| Missing market-data subscription | `IBKR_NO_MARKET_DATA_SUBSCRIPTION` | false |
| Critical order warning (margin/suitability) | `IBKR_ORDER_REJECTED_WARNING` | false |
| WS sub limit exceeded | `IBKR_WS_SUB_LIMIT` | false |
| `get_tick` timeout, cache still empty after 3 s | `IBKR_TICK_TIMEOUT` | true |
| OTO second POST failed after the trigger was placed | `IBKR_OTO_PARTIAL_FAILURE` | false |
| Rotation validation failed | `IBKR_ROTATION_VALIDATION_FAILED` | false (automatic rollback) |

## 5. Tool API surface

Pattern symmetric to Alpaca wherever the abstraction holds: same tool name → the bot reuses its cross-exchange logic.

### Reads (12 tools)

| Tool | IBKR endpoint |
|---|---|
| `environment_info` | local (env, paper, base_url, max_leverage) |
| `get_account` | `GET /portfolio/{accountId}/summary` |
| `get_positions` | `GET /portfolio/{accountId}/positions/0` (loop if >30) |
| `get_activities` | `GET /iserver/account/trades?days=N` (default 7, capped at 90) |
| `get_assets` | `GET /trsrv/secdef/search?symbol=...` (requires a symbol) |
| `get_ticker` | `GET /iserver/marketdata/snapshot?conids=X&fields=31,84,86,7295,7296` |
| `get_bars` | `GET /iserver/marketdata/history?conid=X&period=...&bar=...` |
| `get_snapshot` | `GET /iserver/marketdata/snapshot` (full fields) |
| `get_option_chain` | `GET /iserver/secdef/strikes` + `/info` |
| `get_open_orders` | `GET /iserver/account/orders?filters=Submitted,PreSubmitted` |
| `get_clock` | local (now + static market hours) |
| `search_contracts` | `GET /trsrv/secdef/search` (IBKR-specific: symbol+secType → conid) |

### Streaming (4 tools, snapshot-on-demand)

| Tool | Behavior |
|---|---|
| `get_tick` | Latest cached tick (last/bid/ask/size/timestamp). If not subscribed: lazy sub + wait for the first tick (3 s timeout) |
| `get_depth` | Order book depth (default 5 levels, max 10). IBKR `sbd+{conid}+{exchange}+{rows}` |
| `subscribe_tick` | Keeps the sub alive even without polling. Auto-unsub after `ws_idle_timeout_s` |
| `unsubscribe` | Forces the sub closed to free a slot |

### Simple writes (6 tools)

| Tool | Audit field |
|---|---|
| `place_order` | `symbol` |
| `amend_order` | `order_id` |
| `cancel_order` | `order_id` |
| `cancel_all_orders` | — (loop) |
| `close_position` | `symbol` |
| `close_all_positions` | — (loop) |

### Complex writes (3 tools)

| Tool | Essential schema | Endpoint |
|---|---|---|
| `place_bracket_order` | `symbol, side, qty, entry_price, stop_loss, take_profit, tif="gtc"` | `POST /iserver/account/{id}/orders` with the array `[parent, sl_child, tp_child]` and an auto OCA group (see the builder sketch below) |
| `place_oco_order` | `legs: list[OrderLeg]` (2-N orders) | Same POST with `oca_group` + `oca_type=1` on every leg |
| `place_oto_order` | `trigger: OrderLeg, child: OrderLeg` | Sequential POSTs: trigger first, then the child with `parent_id=<trigger.order_id>`. **Not atomic:** if the second POST fails after the first succeeded, the tool cancels the trigger via `cancel_order` (best-effort) and returns `IBKR_OTO_PARTIAL_FAILURE` with `details.trigger_order_id` for audit |

**Audit:** complex orders track `target_field=symbol` + `details.legs_count` + `details.oca_group` (where applicable).

**Leverage cap on complex orders:** applied to the **net notional** of the structure. Bracket = entry only (the children open no new exposure). OCO = max(leg.notional). OTO = trigger + child if both are long, otherwise max.

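A sketch of a pure bracket builder in the spirit of
`orders_complex.py`. The field names (`conid`, `side`, `quantity`,
`order_type`, `price`, `tif`, `oca_group`) follow this spec's own schema
naming and are not guaranteed to match the IBKR wire format:

```python
import uuid

def build_bracket_payload(
    conid: int, side: str, qty: float,
    entry_price: float, stop_loss: float, take_profit: float,
    tif: str = "gtc",
) -> list[dict]:
    """Pure builder: limit entry + SL/TP children sharing one OCA group."""
    oca = f"oca-{uuid.uuid4().hex[:8]}"  # the first fill cancels the sibling leg
    exit_side = "sell" if side == "buy" else "buy"
    parent = {"conid": conid, "side": side, "quantity": qty,
              "order_type": "limit", "price": entry_price, "tif": tif}
    stop_leg = {"conid": conid, "side": exit_side, "quantity": qty,
                "order_type": "stop", "price": stop_loss, "tif": tif,
                "oca_group": oca}
    profit_leg = {"conid": conid, "side": exit_side, "quantity": qty,
                  "order_type": "limit", "price": take_profit, "tif": tif,
                  "oca_group": oca}
    return [parent, stop_leg, profit_leg]
```

Keeping the builder free of HTTP is exactly what makes the
`test_orders_complex.py` payload-shape tests deterministic.
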
### IBKR specifics (internal to the client)

1. **`conid` resolution:** `place_order(symbol="AAPL")` → lookup `GET /trsrv/secdef/search?symbol=AAPL&secType=STK` → first match → LRU cache. For options: parse the OCC symbol (`AAPL 240119C00190000`) → `/iserver/secdef/info` (see the parser sketch after this list).
2. **`accountId` validation:** at boot, `GET /iserver/accounts` → verify that `account_id_<env>` is present. Otherwise `IBKR_ACCOUNT_NOT_FOUND`.
3. **Order confirmation flow:** IBKR returns a warnings array and requires a second POST with `confirmed: true`. Auto-confirm by default (max 3 cycles), but critical warnings (margin, suitability, hard rejects) are filtered out → error envelope.
4. **`tickle` keep-alive:** automatic if the last request was > 4 min ago. Independent of the LST.
5. **Empty market data → error envelope:** an empty snapshot means a missing subscription; we return `IBKR_NO_MARKET_DATA_SUBSCRIPTION` instead of a silently empty dict.
6. **Leverage cap:** IBKR does not accept a per-order `leverage`. We check `notional / equity` ≤ `max_leverage` pre-submit by calling `get_account` for equity. Asynchronous pattern, but cached for 30 s.

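The OCC symbol in item 1 packs root, expiry (yymmdd), right (C/P) and
strike×1000 (8 digits) into fixed-width fields. A small parser sketch
(the function name and return shape are assumptions):

```python
from datetime import date

def parse_occ(symbol: str) -> dict:
    """Parse an OCC option symbol such as "AAPL 240119C00190000".

    The last 15 characters are yymmdd + right + 8-digit strike*1000;
    whatever precedes them, stripped of padding, is the root.
    """
    tail = symbol[-15:]
    root = symbol[:-15].strip()
    expiry = date(2000 + int(tail[0:2]), int(tail[2:4]), int(tail[4:6]))
    right = tail[6]  # "C" or "P"
    strike = int(tail[7:15]) / 1000.0
    return {"root": root, "expiry": expiry, "right": right, "strike": strike}

assert parse_occ("AAPL 240119C00190000") == {
    "root": "AAPL", "expiry": date(2024, 1, 19), "right": "C", "strike": 190.0,
}
```
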
### `PlaceOrderReq` schema

```python
class PlaceOrderReq(BaseModel):
    symbol: str                    # "AAPL" or OCC format for options
    side: str                      # "buy" | "sell"
    qty: float
    order_type: str = "market"     # "market" | "limit" | "stop" | "stop_limit"
    limit_price: float | None = None
    stop_price: float | None = None
    tif: str = "day"               # "day" | "gtc" | "ioc"
    asset_class: str = "stocks"    # "stocks" | "options" | "futures" | "forex"
    sec_type: str | None = None    # IBKR override (STK/OPT/FUT/CASH); inferred from asset_class
    exchange: str = "SMART"        # IBKR routing
    outside_rth: bool = False
```

## 6. WebSocket layer

### Pattern

One `IBKRWebSocket` singleton per env, lazy-started at the first sub. A single shared WSS connection for all subs.

### Lifecycle

1. **Boot:** does not connect until a streaming tool is called.
2. **First sub call:** opens the WSS, authenticates with the current LST (`Cookie: api=<lst>` header), sends the subscribe message (`smd+{conid}+{fields}` or `sbd+{conid}+{exchange}+{rows}`).
3. **Message dispatch:** every `smd-...` message updates `dict[conid, TickSnapshot]`; every `sbd-...` updates `dict[conid, DepthSnapshot]`.
4. **Heartbeat:** ping every 30 s; if no pong within 60 s → force a reconnect.
5. **Reconnect:** exponential backoff 1 s, 2 s, 4 s, max 30 s. On reconnect: automatic re-subscribe to all active conids (see the sketch after this list).
6. **Idle unsub:** track `last_polled_at[conid]`; if > `ws_idle_timeout_s` (default 300 s) → send an unsub, free the slot. A sub forced via `subscribe_tick` does not expire until an explicit `unsubscribe`.
7. **Sub limit:** if active subs ≥ `ws_max_subscriptions` (default 80) → `IBKR_WS_SUB_LIMIT` error envelope on any new sub.

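A sketch of the reconnect policy in steps 4-5: exponential backoff
capped at 30 s, then automatic re-subscription of every active conid.
The connect/subscribe coroutines are placeholders:

```python
import asyncio

async def reconnect_loop(connect, resubscribe, active_conids):
    """Reconnect with exponential backoff, then restore all subscriptions."""
    delay = 1.0
    while True:
        try:
            ws = await connect()  # open the WSS and authenticate with the LST
            for conid in active_conids:  # step 5: re-subscribe everything
                await resubscribe(ws, conid)
            return ws
        except OSError:
            await asyncio.sleep(delay)
            delay = min(delay * 2, 30.0)  # 1 s, 2 s, 4 s, ... capped at 30 s
```
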
### Cache invariant

- A snapshot **always** represents the latest update received. No historical buffering.
- On disconnect: a conid's cache is invalidated if the reconnect does not succeed within <5 s.
- `get_tick(conid)` with an empty cache: waits up to 3 s for the first tick, then `IBKR_TICK_TIMEOUT`.

### Health probe

`/health/ready` queries `IBKRWebSocket.connected` for each env. `degraded` state if the WS is disconnected but the REST client is ok.

### Feature flag

`IBKR_WS_ENABLED=false` (env var) disables the WS layer at runtime; the streaming tools fall back to HTTP `/marketdata/snapshot` (single shot, no depth). A mitigation for production emergencies.

## 7. Key rotation

### Admin endpoints

```
POST /admin/ibkr/rotate-keys/start?env=testnet
  → generates signature_key.pem.new + encryption_key.pem.new (RSA 2048)
  → returns {fingerprints: {sig: "SHA256:...", enc: "SHA256:..."},
             expires_at: <now+24h>}
  → the user pastes the fingerprints into the IBKR portal, gets a new_consumer_key

POST /admin/ibkr/rotate-keys/confirm?env=testnet
  body: {new_consumer_key, new_access_token, new_access_token_secret}
  → atomic swap: .new → primary, primary → secrets/.archive/<timestamp>/
  → probe: GET /iserver/auth/status with the new credentials
  → ok: returns {rotated_at, old_archived_at}
  → ko: automatic rollback (inverse swap), returns 500 IBKR_ROTATION_VALIDATION_FAILED

POST /admin/ibkr/rotate-keys/abort?env=testnet
  → deletes the .new files; no-op if start was not run or confirm already ran
```

### Authorization

The endpoints are protected by the existing `auth.py` middleware and require `X-Bot-Tag: admin` (a header already supported for the admin router).

### Atomic swap

Implemented as follows (a sketch follows the list):

1. Filesystem-level lock via `fcntl.flock` on `secrets/.lock`
2. Rename `signature.pem` → `secrets/.archive/<ts>/signature.pem.old`
3. Rename `signature.pem.new` → `signature.pem`
4. Same for `encryption.pem`
5. Update `IBKRSettings` in memory via `app.state.settings.ibkr.consumer_key_<env> = new_consumer_key` (live settings, no restart)
6. Probe `GET /iserver/auth/status`
7. On KO: rollback (inverse swap), restore the previous settings, raise

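A sketch of steps 1-4 (exclusive filesystem lock, archive the primary
keys, promote the staged `.new` files); the paths and the helper name
are illustrative:

```python
import fcntl
import time
from pathlib import Path

def atomic_key_swap(secrets: Path) -> Path:
    """Swap *.pem.new into place under a flock; return the archive dir."""
    archive = secrets / ".archive" / str(int(time.time()))
    archive.mkdir(parents=True, exist_ok=True)
    with open(secrets / ".lock", "w") as lock:
        fcntl.flock(lock, fcntl.LOCK_EX)  # step 1: filesystem-level lock
        for name in ("signature.pem", "encryption.pem"):
            current, staged = secrets / name, secrets / f"{name}.new"
            current.rename(archive / f"{name}.old")  # step 2: archive primary
            staged.rename(current)                   # step 3: promote .new
        # The caller then probes GET /iserver/auth/status and, on
        # failure, performs the inverse swap (step 7).
    return archive
```
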
### Scheduled health check

An asyncio task created at lifespan startup:

- Every 6 h: `GET /iserver/auth/status` on both envs
- If `competing=true` or `authenticated=false` for >2 consecutive cycles: log a warning + `/admin/ibkr/health` exposes a degraded state
- Auto-trigger `/tickle` on degraded before failing

### Encryption at rest

Key files on disk with `0600` permissions. Docker bind mount `:ro` on `/secrets`. Rotation preserves permissions, readable only by the container process UID.

## 8. Testing

### Unit tests

| File | Critical coverage |
|---|---|
| `test_oauth.py` | Deterministic RSA-SHA256 signature (known vector from the IBKR docs); DH key exchange on a test prime; LST mint with an httpx mock; refresh before expiry; error paths for a missing/unreadable key; 401 on a revoked consumer |
| `test_client.py` | Authorization header construction; conid lookup + cache hit/miss; tickle keep-alive timing (last_request_at < 4 min skip, > 4 min trigger); place_order warning auto-confirmation flow + critical warning → error envelope; leverage-cap pre-flight; account validation at boot; error mapping (401/429/maintenance/no-mkt-data) |
| `test_ws.py` | Mock `websockets.connect`; subscribe ack flow; message dispatch into the cache; reconnect after disconnect; idle-timeout unsub; sub limit (>80 → IBKR_WS_SUB_LIMIT); feature flag disabled → HTTP fallback |
| `test_orders_complex.py` | Bracket: payload shape (3 orders, same OCA group, parent/child relation). OCO: N legs, OCA type=1 everywhere. OTO: two sequential POSTs, the second with the correct parent_id. Leverage cap on net notional |
| `test_key_rotation.py` | start generates a valid keypair; confirm swaps atomically; validation probe success/fail; automatic rollback on fail; abort cleans up the .new files; the archived .old keeps its permissions |
| `test_tools.py` | Schema validation (qty required, side enum, OCC format for options); default values; leverage-cap enforcement |
| `test_settings.py` (extend) | `IBKRSettings.credentials("testnet")` prefers testnet → falls back to base → ValueError when both are missing. Test isolation via recursive `monkeypatch.delenv` to avoid `.env` pollution (Deribit pattern) |

### Coverage targets

- `oauth.py`, `client.py`, `ws.py`, `key_rotation.py`: 90%
- `orders_complex.py`, `tools.py`: 85%
- `routers/ibkr.py`, `admin.py` IBKR section: 75%

### Integration smoke (manual, post-deploy)

Not in CI (requires real credentials). Documented in the README:

```bash
curl https://cerbero-mcp.<dom>/mcp-ibkr/tools/get_account \
  -H "Authorization: Bearer <TESTNET_TOKEN>" -X POST -d '{}'
# → paper account balance

curl .../mcp-ibkr/tools/place_order \
  -H "Authorization: Bearer <TESTNET_TOKEN>" -X POST \
  -d '{"symbol":"AAPL","side":"buy","qty":1,"order_type":"market"}'
# → order_id

curl .../mcp-ibkr/tools/place_bracket_order \
  -d '{"symbol":"AAPL","side":"buy","qty":1,"entry_price":150,"stop_loss":145,"take_profit":160}'
# → 3 order_ids sharing the same oca_group

curl .../mcp-ibkr/tools/get_depth \
  -d '{"symbol":"AAPL","rows":5}'
# → 5-level order book
```

### Verification gate (pre-merge)

- [ ] `uv run pytest tests/unit/exchanges/ibkr/ tests/unit/test_settings.py -v` green
- [ ] `uv run ruff check src/cerbero_mcp/exchanges/ibkr/ src/cerbero_mcp/routers/ibkr.py` with no warnings
- [ ] `uv run python -c "from cerbero_mcp.settings import Settings; Settings()"` no validation error with the example `.env`
- [ ] `docker compose build && docker compose up -d` healthy in < 60 s
- [ ] `curl /health/ready -H "Authorization: Bearer <TESTNET>"` returns `ibkr` probed
- [ ] Full manual smoke (list above) against a real paper account

## 9. Deploy & ops

- **Branch:** `V2.0.0` (default deploy target, no merge into main)
- **Pipeline:** same pattern as last week's Deribit fix: commit + push → Watchtower updates the container on the VPS in <2 min
- **Traefik:** no changes (same Host rule)
- **Secrets:** RSA keys transferred manually to `/opt/docker/cerbero-mcp/secrets/` on the VPS, mode `0600`, owned by the container UID; bind mount `./secrets:/secrets:ro` added in `docker-compose.yml`
- **Rollback:** a `git revert <commit>` of a single step leaves the other exchanges operational (atomic commits by design)

## 10. Commit plan (8 atomic commits)

```
1. feat(V2): IBKR settings + OAuth signer scaffolding
   - settings.py: IBKRSettings with env-specific credentials
   - exchanges/ibkr/oauth.py: OAuth1aSigner + tests
   - .env.example: IBKR section
   - pyproject.toml: cryptography>=43

2. feat(V2): IBKR httpx client + conid cache + tickle
   - exchanges/ibkr/client.py: IBKRClient base
   - exchanges/ibkr/leverage_cap.py: copy from alpaca
   - tests/unit/exchanges/ibkr/test_client.py

3. feat(V2): IBKR WebSocket layer + tick/depth snapshot cache
   - exchanges/ibkr/ws.py: IBKRWebSocket singleton + reconnect
   - tests/unit/exchanges/ibkr/test_ws.py

4. feat(V2): IBKR read tools (account/positions/marketdata/streaming)
   - exchanges/ibkr/tools.py: schemas + read functions
   - tests/unit/exchanges/ibkr/test_tools.py (read paths)

5. feat(V2): IBKR simple write tools (place/amend/cancel/close)
   - exchanges/ibkr/tools.py: schemas + write functions
   - tests/unit/exchanges/ibkr/test_tools.py (write paths + leverage cap)

6. feat(V2): IBKR complex orders (bracket/OCO/OTO)
   - exchanges/ibkr/orders_complex.py
   - exchanges/ibkr/tools.py: complex tool functions
   - tests/unit/exchanges/ibkr/test_orders_complex.py

7. feat(V2): IBKR key rotation admin endpoints + scheduled health
   - exchanges/ibkr/key_rotation.py: KeyRotationManager
   - admin.py: rotate-keys/start|confirm|abort + ibkr/health
   - tests/unit/exchanges/ibkr/test_key_rotation.py

8. feat(V2): IBKR router wiring + docker secrets + setup script + docs
   - routers/ibkr.py
   - exchanges/__init__.py: build_client ibkr branch
   - __main__.py: include_router
   - scripts/ibkr_oauth_setup.py
   - docker-compose.yml: bind mount secrets
   - README.md: IBKR Setup section
```

Every commit leaves the repo green (tests passing + container buildable). A `git revert` of one commit does not break the other exchanges.

## 11. Risks & mitigations

| Risk | Likelihood | Mitigation |
|---|---|---|
| Unstable WebSocket reconnect in prod | Medium | `IBKR_WS_ENABLED=false` feature flag + HTTP snapshot fallback |
| IBKR rate limit exceeded during a conid lookup burst | Low | 1 h LRU cache + retry with backoff |
| Live session token mint fails on a network blip | Medium | 3x retry with exponential backoff; circuit breaker after 5 consecutive failures |
| Order auto-confirmation wrongly confirms a critical warning | Low | Explicit whitelist of auto-confirmable warnings (RTH, no-mkt-data); everything else → error envelope |
| Key rotation leaves the system in an inconsistent state | Low | Filesystem lock + atomic swap + auto-rollback on validation failure |
| Initial OAuth setup too complex for the ops team | Medium | Interactive `ibkr_oauth_setup.py` script + detailed README section + checklist |
| Leverage cap computed on stale equity | Low | 30 s equity cache, forced refresh pre-submit for orders > 10% of equity |

## 12. Estimate

- Dev: **6-8 days** (was 3-4 in V0; complex orders + WS + rotation add ~3 days)
- Tests: included in the commits (TDD-friendly)
- Deploy + smoke: 0.5 days
- Documentation: 0.5 days
- **Total:** ~7-9 days of effective work

@@ -16,9 +16,10 @@ dependencies = [
    "scipy>=1.13",
    "statsmodels>=0.14",
    "pandas>=2.2",
    "pybit>=5.7",
    "alpaca-py>=0.30",
    "hyperliquid-python-sdk>=0.6",
    "eth-account>=0.13.7",
    "msgpack>=1.1.2",
    "eth-utils>=5.3.1",
    "cryptography>=43",
]

[project.scripts]
@@ -70,7 +71,7 @@ check_untyped_defs = true
ignore_missing_imports = true

[[tool.mypy.overrides]]
-module = ["pybit.*", "alpaca.*", "hyperliquid.*", "pythonjsonlogger.*"]
+module = ["pythonjsonlogger.*"]
ignore_missing_imports = true

[dependency-groups]

@@ -1,50 +0,0 @@
#!/usr/bin/env bash
# Cerbero MCP — build & push the single V2.0.0 image to the Gitea registry.
#
# Prerequisites:
#   - docker
#   - Gitea PAT with `write:package` scope in env $GITEA_PAT
#   - $GITEA_USER (default: adriano)
#
# Usage:
#   ./scripts/build-push.sh
#   VERSION=2.0.1 ./scripts/build-push.sh

set -euo pipefail

REGISTRY="${REGISTRY:-git.tielogic.xyz}"
IMAGE_PREFIX="${IMAGE_PREFIX:-$REGISTRY/adriano/cerbero-mcp}"
GITEA_USER="${GITEA_USER:-adriano}"
VERSION="${VERSION:-2.0.0}"
SHA="$(git rev-parse --short HEAD)"

command -v docker >/dev/null || { echo "FATAL: docker not installed"; exit 1; }

# Log in only if not already authenticated on the registry.
if grep -q "\"$REGISTRY\"" ~/.docker/config.json 2>/dev/null; then
  echo "=== docker already logged in to $REGISTRY (skip login) ==="
elif [ -n "${GITEA_PAT:-}" ]; then
  echo "=== docker login $REGISTRY ==="
  echo "$GITEA_PAT" | docker login "$REGISTRY" -u "$GITEA_USER" --password-stdin
else
  echo "FATAL: not authenticated on $REGISTRY and GITEA_PAT not set."
  echo "       Run once: docker login $REGISTRY -u $GITEA_USER"
  exit 1
fi

TAG_VERSION="$IMAGE_PREFIX:$VERSION"
TAG_LATEST="$IMAGE_PREFIX:latest"
TAG_SHA="$IMAGE_PREFIX:sha-$SHA"

echo "=== build cerbero-mcp:$VERSION ==="
docker build -t "$TAG_VERSION" -t "$TAG_LATEST" -t "$TAG_SHA" .

echo "=== push ==="
for tag in "$TAG_VERSION" "$TAG_LATEST" "$TAG_SHA"; do
  docker push "$tag"
  echo "    pushed: $tag"
done

echo
echo "=== Done (commit $SHA, version $VERSION) ==="
echo "The VPS Watchtower will pull within WATCHTOWER_POLL_INTERVAL (default 5 min)."

@@ -0,0 +1,148 @@
#!/usr/bin/env bash
# deploy-vps.sh — deploy Cerbero MCP V2 to the VPS without going through a registry.
#
# Workflow:
#   1. git fetch + reset to the target branch
#   2. docker compose build (rebuild the image if the SHA changed)
#   3. docker compose down (graceful, max 15 s)
#   4. docker compose up -d
#   5. wait for the /health healthcheck
#   6. automatic rollback to the previous SHA if health fails
#
# Run ON THE VPS, inside the repo directory (e.g. /opt/cerbero-mcp).
#
# Usage (on the VPS):
#   cd /opt/cerbero-mcp
#   bash scripts/deploy-vps.sh
#
# Usage (from a dev machine, via SSH):
#   ssh user@vps 'cd /opt/cerbero-mcp && bash scripts/deploy-vps.sh'
#
# Env variables (optional):
#   BRANCH                  git branch to deploy (default: V2.0.0)
#   SERVICE                 docker compose service name (default: cerbero-mcp)
#   PORT                    /health port to ping (default: from .env, fallback 9000)
#   HEALTH_TIMEOUT_SECONDS  max health wait (default: 30)
#   HEALTH_INTERVAL         seconds between health retries (default: 2)
#   FORCE                   if "1", rebuild + restart even if the SHA is unchanged
#   SKIP_ROLLBACK           if "1", skip the rollback on health fail (for debugging)

set -euo pipefail

# ─── Config ──────────────────────────────────────────────────────────────
BRANCH="${BRANCH:-V2.0.0}"
SERVICE="${SERVICE:-cerbero-mcp}"
HEALTH_TIMEOUT_SECONDS="${HEALTH_TIMEOUT_SECONDS:-30}"
HEALTH_INTERVAL="${HEALTH_INTERVAL:-2}"

# Resolve PORT from .env if not passed in
if [[ -z "${PORT:-}" ]]; then
  if [[ -f .env ]] && grep -q '^PORT=' .env; then
    PORT="$(grep '^PORT=' .env | head -1 | cut -d= -f2 | tr -d '[:space:]"')"
  fi
fi
PORT="${PORT:-9000}"
HEALTH_URL="http://localhost:${PORT}/health"

# ─── Pre-check ───────────────────────────────────────────────────────────
command -v git >/dev/null || { echo "FATAL: git not installed"; exit 1; }
command -v docker >/dev/null || { echo "FATAL: docker not installed"; exit 1; }
command -v curl >/dev/null || { echo "FATAL: curl not installed"; exit 1; }
docker compose version >/dev/null 2>&1 || { echo "FATAL: docker compose not available"; exit 1; }

if [[ ! -f .env ]]; then
  echo "FATAL: .env not found in $(pwd)."
  echo "       Copy .env.example → .env and fill in the values before the first deploy."
  exit 1
fi

if [[ ! -f docker-compose.yml ]]; then
  echo "FATAL: docker-compose.yml not found in $(pwd)."
  exit 1
fi

# Verify the working tree is clean
if [[ -n "$(git status --porcelain)" ]]; then
  echo "FATAL: working tree not clean. Unhandled local changes:"
  git status --short
  echo "       Resolve them before deploying (e.g. git stash or git reset)."
  exit 1
fi

# ─── Current state ───────────────────────────────────────────────────────
CURRENT_SHA="$(git rev-parse --short HEAD)"
echo "==> current SHA (rollback target): $CURRENT_SHA"
echo "==> branch: $BRANCH"
echo "==> port:   $PORT"

# ─── Fetch + reset ───────────────────────────────────────────────────────
echo "==> git fetch + reset --hard origin/${BRANCH}"
git fetch --prune origin
git reset --hard "origin/${BRANCH}"

NEW_SHA="$(git rev-parse --short HEAD)"
echo "==> new SHA: $NEW_SHA"

if [[ "$CURRENT_SHA" == "$NEW_SHA" ]] && [[ "${FORCE:-0}" != "1" ]]; then
  echo "==> Already up to date at $NEW_SHA. No deploy needed."
  echo "    (export FORCE=1 to restart anyway)"
  exit 0
fi

if [[ "$CURRENT_SHA" == "$NEW_SHA" ]]; then
  echo "==> FORCE=1 → rebuild and restart even though the SHA is unchanged"
fi

# ─── Rollback function ───────────────────────────────────────────────────
rollback() {
  if [[ "${SKIP_ROLLBACK:-0}" == "1" ]]; then
    echo "==> SKIP_ROLLBACK=1 → no automatic rollback"
    return
  fi
  if [[ "$CURRENT_SHA" == "$NEW_SHA" ]]; then
    echo "==> SHA unchanged, nothing to roll back"
    return
  fi
  echo "==> ROLLBACK to $CURRENT_SHA"
  git reset --hard "$CURRENT_SHA"
  docker compose build "$SERVICE"
  docker compose up -d --force-recreate "$SERVICE"
  echo "==> rollback done. Verify the state manually."
}

# ─── Build ───────────────────────────────────────────────────────────────
echo "==> docker compose build $SERVICE"
docker compose build "$SERVICE"

# ─── Down + up ───────────────────────────────────────────────────────────
echo "==> docker compose down --timeout 15"
docker compose down --timeout 15

echo "==> docker compose up -d"
docker compose up -d

# ─── Health check ────────────────────────────────────────────────────────
echo "==> waiting for /health (timeout ${HEALTH_TIMEOUT_SECONDS}s, retry every ${HEALTH_INTERVAL}s)"
deadline=$(( $(date +%s) + HEALTH_TIMEOUT_SECONDS ))
while [[ $(date +%s) -lt $deadline ]]; do
  if curl -fsS "$HEALTH_URL" >/dev/null 2>&1; then
    echo
    echo "==> health OK"
    curl -s "$HEALTH_URL"
    echo
    echo
    echo "==> deploy DONE (SHA $CURRENT_SHA → $NEW_SHA, branch $BRANCH)"
    exit 0
  fi
  printf "."
  sleep "$HEALTH_INTERVAL"
done

echo
echo "==> FAIL: /health not responding after ${HEALTH_TIMEOUT_SECONDS}s"
echo "==> container logs (last 40 lines):"
docker compose logs --tail 40 "$SERVICE" || true

rollback

exit 1

@@ -0,0 +1,132 @@
#!/usr/bin/env python3
"""IBKR OAuth 1.0a Self-Service setup helper.

Phases (run in order, providing flags as you progress):
1. python scripts/ibkr_oauth_setup.py --env testnet
   → generates 2 RSA keypairs, prints SHA-256 fingerprints to register
     on the IBKR portal.
2. (manual) Login at https://www.interactivebrokers.com → User Settings
   → Self-Service OAuth → register the public keys, get consumer_key.
3. python scripts/ibkr_oauth_setup.py --env testnet --consumer-key <K> \\
       --request-token
   → exchanges consumer_key for an unauthorized request token + URL.
4. (manual) Open the URL, approve, copy the verifier code.
5. python scripts/ibkr_oauth_setup.py --env testnet --verifier <V>
   → exchanges verifier for long-lived access_token + secret.
   Copy the printed values into .env.

Repeat for --env mainnet using your live IBKR account.
"""
from __future__ import annotations

import argparse
import hashlib
import sys
from pathlib import Path

from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa


def _gen_keypair(out: Path) -> str:
    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    pem = key.private_bytes(
        encoding=serialization.Encoding.PEM,
        format=serialization.PrivateFormat.TraditionalOpenSSL,
        encryption_algorithm=serialization.NoEncryption(),
    )
    out.write_bytes(pem)
    out.chmod(0o600)
    pub = key.public_key().public_bytes(
        encoding=serialization.Encoding.PEM,
        format=serialization.PublicFormat.SubjectPublicKeyInfo,
    )
    pub_path = out.with_suffix(out.suffix + ".pub")
    pub_path.write_bytes(pub)
    return f"SHA256:{hashlib.sha256(pub).hexdigest()}"


def cmd_init(env: str, secrets_dir: Path) -> int:
    secrets_dir.mkdir(parents=True, exist_ok=True)
    sig = secrets_dir / f"ibkr_signature_{env}.pem"
    enc = secrets_dir / f"ibkr_encryption_{env}.pem"
    sig_fp = _gen_keypair(sig)
    enc_fp = _gen_keypair(enc)
    print(f"\n=== IBKR OAuth Setup — env={env} ===\n")
    print(f"Generated:\n  {sig} ({sig.stat().st_size} bytes)")
    print(f"  {enc} ({enc.stat().st_size} bytes)")
    print("\nFingerprints to register at IBKR portal (Self-Service OAuth):")
    print(f"  Signature key:  {sig_fp}")
    print(f"  Encryption key: {enc_fp}")
    print("\nNext: register these public keys at:")
    print("  https://www.interactivebrokers.com (User Settings → OAuth)")
    print("\nAlso paste in .env:")
    print(f"  IBKR_SIGNATURE_KEY_PATH_{env.upper()}={sig}")
    print(f"  IBKR_ENCRYPTION_KEY_PATH_{env.upper()}={enc}\n")
    return 0


def cmd_request_token(env: str, consumer_key: str) -> int:
    print(f"\n=== Step 2 — request token for {env} ===\n")
    print(f"Consumer key: {consumer_key}")
    print(
        "\nVisit this URL in a browser, log in to IBKR, authorize the app,\n"
        "and copy the displayed verifier code:\n"
    )
    print(
        f"  https://www.interactivebrokers.com/sso/Authenticator?"
        f"oauth_consumer_key={consumer_key}&action=request_token\n"
    )
    print("Then re-run with: --verifier <code>\n")
    return 0


def cmd_verifier(env: str, verifier: str) -> int:
    print(f"\n=== Step 3 — exchange verifier for {env} ===\n")
    print(f"Verifier received: {verifier[:8]}...")
    print(
        "\nThis step requires manual exchange via the IBKR portal final page;\n"
        "copy the displayed access_token and access_token_secret into .env:\n"
    )
    print(f"  IBKR_ACCESS_TOKEN_{env.upper()}=<paste from portal>")
    print(f"  IBKR_ACCESS_TOKEN_SECRET_{env.upper()}=<paste from portal>\n")
    print("Also set:")
    print(f"  IBKR_CONSUMER_KEY_{env.upper()}=<the consumer key from step 1>")
    print("  IBKR_DH_PRIME=<paste DH prime hex from portal>\n")
    return 0


def main() -> int:
    p = argparse.ArgumentParser(description=__doc__)
    p.add_argument("--env", choices=["testnet", "mainnet"], required=True)
    p.add_argument("--secrets-dir", default="secrets")
    p.add_argument("--consumer-key")
    p.add_argument("--request-token", action="store_true")
    p.add_argument("--verifier")
    p.add_argument(
        "--rotate",
        action="store_true",
        help="Generate new keypairs alongside existing (for rotation)",
    )
    args = p.parse_args()

    sec_dir = Path(args.secrets_dir)
    if args.verifier:
        return cmd_verifier(args.env, args.verifier)
    if args.consumer_key and args.request_token:
        return cmd_request_token(args.env, args.consumer_key)
    if args.rotate:
        for kind in ("signature", "encryption"):
            new = sec_dir / f"ibkr_{kind}_{args.env}.pem.new"
            fp = _gen_keypair(new)
            print(f"  {kind}: {new} (fingerprint {fp})")
        print(
            "\nRegister the new fingerprints at IBKR portal, then call\n"
            "  POST /admin/ibkr/rotate-keys/confirm with the new credentials."
        )
        return 0
    return cmd_init(args.env, sec_dir)


if __name__ == "__main__":
    sys.exit(main())
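For orientation, the signature PEM written by _gen_keypair is later consumed by the OAuth signer (see the OAuth1aSigner wiring in the build_client hunk below). A minimal, hypothetical sketch of producing an RSA-SHA256 signature over an OAuth base string with the same `cryptography` package; the file path and base string are placeholders:

import base64
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding

# Load the private key written by _gen_keypair (example path).
pem = open("secrets/ibkr_signature_testnet.pem", "rb").read()
key = serialization.load_pem_private_key(pem, password=None)
base_string = b"POST&https%3A%2F%2Fapi.ibkr.com%2F..."  # elided; built per OAuth 1.0a rules
signature = base64.b64encode(key.sign(base_string, padding.PKCS1v15(), hashes.SHA256()))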
@@ -9,20 +9,24 @@ Boot:
"""
from __future__ import annotations

import contextlib
from contextlib import asynccontextmanager
from typing import Literal, cast

import uvicorn
from fastapi import FastAPI

from cerbero_mcp import admin
from cerbero_mcp.client_registry import ClientRegistry
from cerbero_mcp.common.logging import configure_root_logging
from cerbero_mcp.exchanges import build_client
from cerbero_mcp.routers import (
    alpaca,
    bybit,
    cross,
    deribit,
    hyperliquid,
    ibkr,
    macro,
    sentiment,
)
@@ -52,6 +56,11 @@ def _make_app(settings: Settings) -> FastAPI:
        try:
            yield
        finally:
            # Stop any IBKR WebSocket singletons before closing the client registry
            ibkr_ws_dict = getattr(app.state, "ibkr_ws", {}) or {}
            for ws in ibkr_ws_dict.values():
                with contextlib.suppress(Exception):
                    await ws.stop()
            await app.state.registry.aclose()

    app.router.lifespan_context = lifespan
@@ -60,8 +69,11 @@ def _make_app(settings: Settings) -> FastAPI:
    app.include_router(bybit.make_router())
    app.include_router(hyperliquid.make_router())
    app.include_router(alpaca.make_router())
    app.include_router(ibkr.make_router())
    app.include_router(macro.make_router())
    app.include_router(sentiment.make_router())
    app.include_router(cross.make_router())
    app.include_router(admin.make_admin_router())

    return app
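A minimal boot sketch for the factory above; the Settings constructor and port are assumptions, and the repo's actual entrypoint may differ:

settings = Settings()          # pydantic settings, presumably loaded from .env
app = _make_app(settings)
uvicorn.run(app, host="0.0.0.0", port=8000)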
@@ -0,0 +1,244 @@
"""Admin endpoints: audit log queries with filters."""
from __future__ import annotations

import json
import os
from datetime import datetime
from pathlib import Path
from typing import Any, Literal

from fastapi import APIRouter, HTTPException, Query, Request
from pydantic import BaseModel, SecretStr

from cerbero_mcp.exchanges.ibkr.key_rotation import KeyRotationManager


MAX_RECORDS = 10000
DEFAULT_LIMIT = 1000


class _IBKRRotateConfirmReq(BaseModel):
    new_consumer_key: str
    new_access_token: str
    new_access_token_secret: str


def _parse_iso(value: str | None) -> datetime | None:
    if not value:
        return None
    try:
        # supports both "2026-05-01" and "2026-05-01T12:34:56Z"
        return datetime.fromisoformat(value.replace("Z", "+00:00"))
    except ValueError as e:
        raise HTTPException(400, f"invalid datetime: {value}") from e


def _record_timestamp(rec: dict[str, Any]) -> datetime | None:
    """Extract the timestamp from an audit record. JsonFormatter writes
    'asctime' in the format '2026-05-01 12:34:56,789'. We parse it as UTC.
    """
    ts = rec.get("asctime") or rec.get("timestamp")
    if not ts:
        return None
    try:
        # default asctime format: 'YYYY-MM-DD HH:MM:SS,mmm'
        ts_clean = ts.replace(",", ".")
        return datetime.fromisoformat(ts_clean)
    except ValueError:
        return None


def _matches_filters(
    rec: dict[str, Any],
    *,
    from_dt: datetime | None,
    to_dt: datetime | None,
    actor: str | None,
    exchange: str | None,
    action: str | None,
    bot_tag: str | None,
) -> bool:
    if rec.get("audit_event") != "write_op":
        return False
    if actor is not None and rec.get("actor") != actor:
        return False
    if exchange is not None and rec.get("exchange") != exchange:
        return False
    if action is not None and rec.get("action") != action:
        return False
    if bot_tag is not None and rec.get("bot_tag") != bot_tag:
        return False
    if from_dt is not None or to_dt is not None:
        rec_ts = _record_timestamp(rec)
        if rec_ts is None:
            return False
        if from_dt is not None and rec_ts < from_dt:
            return False
        if to_dt is not None and rec_ts > to_dt:
            return False
    return True


def _read_audit_records(file_path: Path) -> list[dict[str, Any]]:
    if not file_path.exists():
        return []
    out: list[dict[str, Any]] = []
    with file_path.open("r", encoding="utf-8") as f:
        for line in f:
            stripped = line.strip()
            if not stripped:
                continue
            try:
                out.append(json.loads(stripped))
            except json.JSONDecodeError:
                continue
    return out


def make_admin_router() -> APIRouter:
    r = APIRouter(prefix="/admin", tags=["admin"])

    @r.get("/audit")
    async def query_audit(
        request: Request,
        from_: str | None = Query(None, alias="from"),
        to: str | None = Query(None),
        actor: Literal["testnet", "mainnet"] | None = Query(None),
        exchange: str | None = Query(None),
        action: str | None = Query(None),
        bot_tag: str | None = Query(None),
        limit: int = Query(DEFAULT_LIMIT, ge=1, le=MAX_RECORDS),
    ) -> dict[str, Any]:
        """Return the filtered audit_write_op records.

        Query params (all optional):
        - from / to: ISO 8601 datetime (e.g. 2026-05-01 or 2026-05-01T12:34:56)
        - actor: testnet | mainnet
        - exchange: deribit | bybit | hyperliquid | alpaca | ibkr
        - action: tool name (e.g. place_order)
        - bot_tag: bot identifier
        - limit: max records to return (default 1000, max 10000)

        Source: AUDIT_LOG_FILE (env var). If unset, returns an empty list
        with a warning.
        """
        from_dt = _parse_iso(from_)
        to_dt = _parse_iso(to)

        file_str = os.environ.get("AUDIT_LOG_FILE", "").strip()
        if not file_str:
            return {
                "records": [],
                "count": 0,
                "warning": "AUDIT_LOG_FILE not configured; no persistent audit log to query",
                "from": from_,
                "to": to,
            }

        file_path = Path(file_str)
        all_records = _read_audit_records(file_path)
        filtered = [
            rec for rec in all_records
            if _matches_filters(
                rec,
                from_dt=from_dt, to_dt=to_dt,
                actor=actor, exchange=exchange, action=action,
                bot_tag=bot_tag,
            )
        ]
        # sort desc by timestamp (newest first) + limit
        filtered.sort(
            key=lambda rec: _record_timestamp(rec) or datetime.min,
            reverse=True,
        )
        if len(filtered) > limit:
            filtered = filtered[:limit]

        return {
            "records": filtered,
            "count": len(filtered),
            "from": from_,
            "to": to,
            "filters": {
                "actor": actor, "exchange": exchange,
                "action": action, "bot_tag": bot_tag,
            },
        }

    @r.post("/ibkr/rotate-keys/start")
    async def _ibkr_rotate_start(env: str, request: Request):
        if env not in ("testnet", "mainnet"):
            raise HTTPException(400, detail={"error": "invalid env"})
        settings = request.app.state.settings
        creds = settings.ibkr.credentials(env)
        mgr = KeyRotationManager(
            signature_key_path=creds["signature_key_path"],
            encryption_key_path=creds["encryption_key_path"],
        )
        rotations = getattr(request.app.state, "ibkr_rotations", None)
        if rotations is None:
            rotations = {}
            request.app.state.ibkr_rotations = rotations
        rotations[env] = mgr
        return await mgr.start()

    @r.post("/ibkr/rotate-keys/confirm")
    async def _ibkr_rotate_confirm(
        env: str, body: _IBKRRotateConfirmReq, request: Request,
    ):
        if env not in ("testnet", "mainnet"):
            raise HTTPException(400, detail={"error": "invalid env"})
        rotations = getattr(request.app.state, "ibkr_rotations", {}) or {}
        mgr = rotations.get(env)
        if mgr is None:
            raise HTTPException(409, detail={"error": "rotation not started"})

        settings = request.app.state.settings
        if env == "testnet":
            settings.ibkr.consumer_key_testnet = body.new_consumer_key
            settings.ibkr.access_token_testnet = body.new_access_token
            settings.ibkr.access_token_secret_testnet = SecretStr(body.new_access_token_secret)
        else:
            settings.ibkr.consumer_key_live = body.new_consumer_key
            settings.ibkr.access_token_live = body.new_access_token
            settings.ibkr.access_token_secret_live = SecretStr(body.new_access_token_secret)

        registry = request.app.state.registry
        registry._clients.pop(("ibkr", env), None)

        async def _validate() -> bool:
            try:
                client = await registry.get("ibkr", env)
                await client._request("GET", "/iserver/auth/status", skip_tickle=True)
                return True
            except Exception:
                return False

        try:
            return await mgr.confirm(validate=_validate)
        finally:
            rotations.pop(env, None)

    @r.post("/ibkr/rotate-keys/abort")
    async def _ibkr_rotate_abort(env: str, request: Request):
        rotations = getattr(request.app.state, "ibkr_rotations", {}) or {}
        mgr = rotations.pop(env, None)
        if mgr is None:
            return {"aborted": False, "reason": "no rotation in progress"}
        return await mgr.abort()

    @r.post("/ibkr/health")
    async def _ibkr_health(request: Request):
        registry = request.app.state.registry
        out: dict[str, Any] = {}
        for env in ("testnet", "mainnet"):
            try:
                client = await registry.get("ibkr", env)
                status = await client._request(
                    "GET", "/iserver/auth/status", skip_tickle=True
                )
                out[env] = {"healthy": True, "status": status}
            except Exception as e:
                out[env] = {"healthy": False, "error": str(e)[:200]}
        return out

    return r
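As an illustration, the audit endpoint can be queried with any HTTP client; a sketch using httpx (host and token are placeholders, and per the middleware below /admin/audit needs only the bearer token, not X-Bot-Tag):

import httpx

resp = httpx.get(
    "http://localhost:8000/admin/audit",   # hypothetical host
    headers={"Authorization": "Bearer <MAINNET_TOKEN>"},
    params={"exchange": "ibkr", "action": "place_order", "limit": 50},
)
resp.raise_for_status()
print(resp.json()["count"], "matching write ops")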
@@ -1,4 +1,8 @@
"""Bearer auth middleware: bearer token → request.state.environment."""
"""Bearer auth middleware: bearer token → request.state.environment.

It also requires an `X-Bot-Tag` header on every non-whitelisted call,
so that the audit log can identify the calling bot.
"""
from __future__ import annotations

import secrets
@@ -9,7 +13,24 @@ from fastapi.responses import JSONResponse
Environment = Literal["testnet", "mainnet"]

WHITELIST_PATHS = frozenset({"/health", "/apidocs", "/openapi.json", "/docs", "/redoc"})
# Paths that bypass both bearer auth and the bot_tag check.
PATH_WHITELIST_FULL = frozenset(
    {
        "/health",
        "/health/ready",
        "/apidocs",
        "/openapi.json",
        "/docs",
        "/redoc",
    }
)
# Paths that require the bearer but NOT the bot_tag (admin endpoints).
PATH_WHITELIST_BOT_TAG_ONLY = frozenset({"/admin/audit"})

# Backward-compat alias (old imports).
WHITELIST_PATHS = PATH_WHITELIST_FULL

MAX_BOT_TAG_LEN = 64


def _extract_bearer(auth_header: str) -> str | None:
@@ -35,13 +56,17 @@ def install_auth_middleware(
    testnet_token: str,
    mainnet_token: str,
) -> None:
    """Register the bearer auth middleware on the FastAPI app."""
    """Register the bearer auth + bot_tag middleware on the FastAPI app."""

    @app.middleware("http")
    async def auth_middleware(request: Request, call_next):
        if request.url.path in WHITELIST_PATHS:
        path = request.url.path

        # 1. Full whitelist: no checks at all.
        if path in PATH_WHITELIST_FULL:
            return await call_next(request)

        # 2. Bearer auth (always required).
        token = _extract_bearer(request.headers.get("Authorization", ""))
        if token is None:
            return JSONResponse(
@@ -57,4 +82,25 @@ def install_auth_middleware(
                         "message": "invalid token"}},
            )
        request.state.environment = env

        # 3. Partial whitelist (admin): bearer ok, no bot_tag check.
        if path in PATH_WHITELIST_BOT_TAG_ONLY:
            return await call_next(request)

        # 4. X-Bot-Tag is mandatory.
        raw_tag = request.headers.get("X-Bot-Tag", "")
        tag = raw_tag.strip() if raw_tag else ""
        if not tag:
            return JSONResponse(
                status_code=status.HTTP_400_BAD_REQUEST,
                content={"error": {"code": "BAD_REQUEST",
                                   "message": "missing X-Bot-Tag header"}},
            )
        if len(tag) > MAX_BOT_TAG_LEN:
            return JSONResponse(
                status_code=status.HTTP_400_BAD_REQUEST,
                content={"error": {"code": "BAD_REQUEST",
                                   "message": "X-Bot-Tag too long"}},
            )
        request.state.bot_tag = tag
        return await call_next(request)
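Concretely, a non-whitelisted tool call must carry both headers; a sketch (host, token, route, and body are placeholders):

import httpx

resp = httpx.post(
    "http://localhost:8000/mcp-deribit/tools/place_order",  # hypothetical route
    headers={
        "Authorization": "Bearer <TESTNET_TOKEN>",  # selects the environment
        "X-Bot-Tag": "momentum-bot-01",             # required, max 64 chars
    },
    json={"instrument_name": "BTC-PERPETUAL", "side": "buy", "amount": 10},
)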
@@ -67,23 +67,28 @@ def _configure_audit_sink() -> None:
def audit_write_op(
    *,
    actor: str | None = None,
    bot_tag: str | None = None,
    action: str,
    exchange: str,
    target: str | None = None,
    payload: dict[str, Any] | None = None,
    result: dict[str, Any] | None = None,
    error: str | None = None,
    request_id: str | None = None,
) -> None:
    """Emit a structured audit log record per write operation.

    actor: identifier of the caller (e.g. "testnet", "mainnet",
        or None for anonymous logging).
    bot_tag: identifier of the calling bot (X-Bot-Tag header).
    action: tool name (e.g. "place_order", "cancel_order").
    exchange: service identifier (deribit, bybit, alpaca, hyperliquid, ibkr).
    target: instrument/symbol/order_id being acted on.
    payload: non-sensitive input (qty, side, leverage, etc.).
    result: client output (order_id, status, etc.).
    error: error string if the operation failed.
    request_id: id propagated by the request-log middleware to correlate
        audit log and request log entries.
    """
    _configure_audit_sink()
    record: dict[str, Any] = {
@@ -91,9 +96,12 @@ def audit_write_op(
        "action": action,
        "exchange": exchange,
        "actor": actor,
        "bot_tag": bot_tag,
        "target": target,
        "payload": payload or {},
    }
    if request_id is not None:
        record["request_id"] = request_id
    if result is not None:
        record["result"] = _summarize_result(result)
    if error is not None:
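A direct call, for reference (all values illustrative):

audit_write_op(
    actor="testnet",
    bot_tag="momentum-bot-01",
    action="place_order",
    exchange="deribit",
    target="BTC-PERPETUAL",
    payload={"side": "buy", "qty": 10},
    result={"order_id": "abc123", "status": "open"},
    request_id="3f2a9c...",
)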
@@ -0,0 +1,100 @@
"""Helper to wire audit_write_op into the routers.

Usage pattern in a router::

    @r.post("/tools/place_order")
    async def _place_order(
        params: t.PlaceOrderReq,
        request: Request,
        client: DeribitClient = Depends(get_deribit_client),
    ):
        return await audit_call(
            request=request,
            exchange="deribit",
            action="place_order",
            target_field="instrument_name",
            params=params,
            tool_fn=lambda: t.place_order(client, params, creds=...),
        )
"""
from __future__ import annotations

from collections.abc import Awaitable, Callable
from typing import Any

from fastapi import Request
from pydantic import BaseModel

from cerbero_mcp.common.audit import audit_write_op


def _extract_target(params: BaseModel | None, target_field: str | None) -> str | None:
    if params is None or target_field is None:
        return None
    val = getattr(params, target_field, None)
    if val is None:
        return None
    return str(val)


def _safe_dump(params: BaseModel | None) -> dict[str, Any]:
    if params is None:
        return {}
    try:
        return params.model_dump(mode="json", exclude_none=True)
    except Exception:
        return {}


async def audit_call(
    *,
    request: Request,
    exchange: str,
    action: str,
    tool_fn: Callable[[], Awaitable[Any]],
    params: BaseModel | None = None,
    target_field: str | None = None,
) -> Any:
    """Run tool_fn and log the audit record (success or error). Re-raises exceptions."""
    actor = getattr(request.state, "environment", None)
    bot_tag = getattr(request.state, "bot_tag", None)
    request_id = getattr(request.state, "request_id", None)
    target = _extract_target(params, target_field)
    payload = _safe_dump(params)

    try:
        result = await tool_fn()
    except Exception as e:
        audit_write_op(
            actor=actor,
            bot_tag=bot_tag,
            action=action,
            exchange=exchange,
            target=target,
            payload=payload,
            error=f"{type(e).__name__}: {e}",
            request_id=request_id,
        )
        raise

    # If result is a dict, pass it raw; otherwise try to serialize it.
    audit_result: dict[str, Any] | None = None
    if isinstance(result, dict):
        audit_result = result
    elif hasattr(result, "model_dump"):
        try:
            audit_result = result.model_dump(mode="json")
        except Exception:
            audit_result = None

    audit_write_op(
        actor=actor,
        bot_tag=bot_tag,
        action=action,
        exchange=exchange,
        target=target,
        payload=payload,
        result=audit_result,
        request_id=request_id,
    )
    return result
@@ -0,0 +1,53 @@
"""Shared OHLCV candle model + validator for exchange historical endpoints."""
from __future__ import annotations

from typing import Any

from fastapi import HTTPException
from pydantic import BaseModel, ConfigDict, ValidationError, model_validator


class Candle(BaseModel):
    model_config = ConfigDict(extra="ignore")

    timestamp: int
    open: float
    high: float
    low: float
    close: float
    volume: float

    @model_validator(mode="after")
    def _check(self) -> Candle:
        if self.timestamp <= 0:
            raise ValueError(f"timestamp must be > 0, got {self.timestamp}")
        if self.volume < 0:
            raise ValueError(f"volume must be >= 0, got {self.volume}")
        if self.high < max(self.open, self.close, self.low):
            raise ValueError(
                f"high {self.high} < max(open={self.open}, "
                f"close={self.close}, low={self.low})"
            )
        if self.low > min(self.open, self.close, self.high):
            raise ValueError(
                f"low {self.low} > min(open={self.open}, "
                f"close={self.close}, high={self.high})"
            )
        return self


def validate_candles(raw: list[dict[str, Any]]) -> list[dict[str, Any]]:
    """Coerce upstream rows into validated candle dicts, sorted by timestamp.

    Raises HTTPException(502) if any row violates OHLC consistency or schema —
    upstream data corruption is mapped to a retryable error envelope.
    """
    try:
        candles = [Candle.model_validate(row) for row in raw]
    except ValidationError as e:
        raise HTTPException(
            status_code=502,
            detail=f"upstream returned malformed candle: {e.errors()[0]['msg']}",
        ) from e
    candles.sort(key=lambda c: c.timestamp)
    return [c.model_dump() for c in candles]
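A quick illustration of the contract (values made up):

rows = [
    {"timestamp": 1714560000000, "open": 2.0, "high": 3.0, "low": 1.5, "close": 2.5, "volume": 10.0},
    {"timestamp": 1714556400000, "open": 1.8, "high": 2.1, "low": 1.7, "close": 2.0, "volume": 4.0},
]
candles = validate_candles(rows)              # returned sorted ascending by timestamp
assert candles[0]["timestamp"] == 1714556400000
# A row with e.g. high < open would raise HTTPException(502) instead.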
@@ -0,0 +1,104 @@
"""Middleware: structured JSON request log for every HTTP request.

Emits one JSON line on the ``mcp.request`` logger, with fields that can be
correlated to the audit log via ``request_id``. It also exposes
``request_id`` on ``request.state`` so that downstream handlers/exception
handlers can include it in their own payloads.
"""
from __future__ import annotations

import logging
import time
import uuid
from collections.abc import Awaitable, Callable
from datetime import UTC, datetime
from typing import Any

from fastapi import FastAPI, Request
from starlette.responses import Response

from cerbero_mcp.common.logging import get_json_logger

_logger = get_json_logger("mcp.request", level=logging.INFO)


def _extract_exchange(path: str) -> str | None:
    """Extract the exchange name from the path if it is a ``/mcp-{exchange}/...``."""
    if not path.startswith("/mcp-"):
        return None
    rest = path[len("/mcp-"):]
    end = rest.find("/")
    if end < 0:
        return rest or None
    return rest[:end] or None


def _extract_tool(path: str) -> str | None:
    """Extract the tool name from a ``/mcp-X/tools/Y`` path."""
    parts = path.split("/")
    # ["", "mcp-deribit", "tools", "place_order"]
    if len(parts) >= 4 and parts[2] == "tools":
        return parts[3] or None
    return None


def install_request_log_middleware(app: FastAPI) -> None:
    """Add an HTTP middleware that logs one JSON line per request."""

    @app.middleware("http")
    async def request_log(
        request: Request,
        call_next: Callable[[Request], Awaitable[Response]],
    ) -> Response:
        request_id = uuid.uuid4().hex
        # Expose request_id for downstream use (audit, error envelope)
        request.state.request_id = request_id
        t0 = time.perf_counter()
        status_code = 500
        error: str | None = None
        response: Response | None = None
        try:
            response = await call_next(request)
            status_code = response.status_code
        except Exception as e:
            error = f"{type(e).__name__}: {str(e)[:200]}"
            raise
        finally:
            dur_ms = (time.perf_counter() - t0) * 1000
            path = request.url.path
            payload: dict[str, Any] = {
                "event": "request",
                "request_id": request_id,
                "method": request.method,
                "path": path,
                "status_code": status_code,
                "duration_ms": round(dur_ms, 2),
                "timestamp": datetime.now(UTC).isoformat(),
            }
            ua = request.headers.get("user-agent")
            if ua:
                payload["user_agent"] = ua[:200]
            client = request.client
            if client is not None:
                payload["client_ip"] = client.host
            actor = getattr(request.state, "environment", None)
            if actor:
                payload["actor"] = actor
            bot_tag = getattr(request.state, "bot_tag", None)
            if bot_tag:
                payload["bot_tag"] = bot_tag
            exchange = _extract_exchange(path)
            if exchange:
                payload["exchange"] = exchange
            tool = _extract_tool(path)
            if tool:
                payload["tool"] = tool
            if error:
                payload["error"] = error
                _logger.error("request", extra=payload)
            else:
                _logger.info("request", extra=payload)
        # response is set if no exception occurred (otherwise the exception
        # has already been re-raised by the except block).
        assert response is not None
        return response
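The path parsing is deliberately simple; for example:

assert _extract_exchange("/mcp-deribit/tools/place_order") == "deribit"
assert _extract_tool("/mcp-deribit/tools/place_order") == "place_order"
assert _extract_exchange("/health") is None   # non-MCP paths carry no exchange field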
@@ -15,9 +15,10 @@ async def build_client(
    from cerbero_mcp.exchanges.deribit.client import DeribitClient

    url = settings.deribit.url_testnet if env == "testnet" else settings.deribit.url_live
    cid, csec = settings.deribit.credentials(env)
    return DeribitClient(
        client_id=settings.deribit.client_id,
        client_secret=settings.deribit.client_secret.get_secret_value(),
        client_id=cid,
        client_secret=csec,
        testnet=(env == "testnet"),
        base_url_override=url,
    )
@@ -71,4 +72,24 @@ async def build_client(
        cryptopanic_key=settings.sentiment.cryptopanic_key.get_secret_value(),
        lunarcrush_key=settings.sentiment.lunarcrush_key.get_secret_value(),
    )
    if exchange == "ibkr":
        from cerbero_mcp.exchanges.ibkr.client import IBKRClient
        from cerbero_mcp.exchanges.ibkr.oauth import OAuth1aSigner

        creds = settings.ibkr.credentials(env)
        url = settings.ibkr.url_testnet if env == "testnet" else settings.ibkr.url_live
        signer = OAuth1aSigner(
            consumer_key=creds["consumer_key"],
            access_token=creds["access_token"],
            access_token_secret=creds["access_token_secret"],
            signature_key_path=creds["signature_key_path"],
            encryption_key_path=creds["encryption_key_path"],
            dh_prime=creds["dh_prime"],
        )
        return IBKRClient(
            signer=signer,
            account_id=creds["account_id"],
            paper=(env == "testnet"),
            base_url=url,
        )
    raise ValueError(f"unsupported exchange: {exchange}")
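Downstream, instances built here are cached per (exchange, env) pair by the ClientRegistry; a sketch of the lookup the admin endpoints above rely on (caching behavior inferred from registry._clients):

registry = app.state.registry
client = await registry.get("ibkr", "testnet")   # builds via build_client on first use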
@@ -1,204 +1,253 @@
"""Alpaca client on pure httpx (V2.0.0).

Full-REST rewrite of the original `alpaca-py` client: 4 base endpoints
(trading, stock data, crypto data, options data), auth via the
APCA-API-KEY-ID / APCA-API-SECRET-KEY headers, full parity with the V1
version (same signatures, same shape of the returned dicts).

- The `base_url` override parameter applies ONLY to the trading endpoint
  (consistent with `url_override` in alpaca-py.TradingClient). The data
  endpoints stay hardcoded to `https://data.alpaca.markets`.
- Methods return `dict` / `list[dict]` straight from the REST JSON
  (instead of serialized alpaca-py pydantic models). The keys are those
  returned by the Alpaca API; they match the `model_dump()` of the
  previous SDK models.
"""
from __future__ import annotations

import asyncio
import datetime as _dt
from typing import Any

from alpaca.data.historical import (
    CryptoHistoricalDataClient,
    OptionHistoricalDataClient,
    StockHistoricalDataClient,
)
from alpaca.data.requests import (
    CryptoBarsRequest,
    CryptoLatestQuoteRequest,
    CryptoLatestTradeRequest,
    OptionBarsRequest,
    OptionChainRequest,
    OptionLatestQuoteRequest,
    StockBarsRequest,
    StockLatestQuoteRequest,
    StockLatestTradeRequest,
    StockSnapshotRequest,
)
from alpaca.data.timeframe import TimeFrame, TimeFrameUnit
from alpaca.trading.client import TradingClient
from alpaca.trading.enums import (
    AssetClass,
    OrderSide,
    QueryOrderStatus,
    TimeInForce,
)
from alpaca.trading.requests import (
    ClosePositionRequest,
    GetAssetsRequest,
    GetOrdersRequest,
    LimitOrderRequest,
    MarketOrderRequest,
    ReplaceOrderRequest,
    StopOrderRequest,
)
import httpx

from cerbero_mcp.common.candles import validate_candles
from cerbero_mcp.common.http import async_client

# ── Base endpoints ───────────────────────────────────────────────
_TRADING_LIVE = "https://api.alpaca.markets"
_TRADING_PAPER = "https://paper-api.alpaca.markets"
_DATA = "https://data.alpaca.markets"

# ── Timeframe map → Alpaca query param ───────────────────────────
# Alpaca v2 bars: timeframe = "1Min" / "5Min" / "15Min" / "30Min" / "1Hour" / "1Day" / "1Week"
_TF_MAP = {
    "1min": TimeFrame(1, TimeFrameUnit.Minute),
    "5min": TimeFrame(5, TimeFrameUnit.Minute),
    "15min": TimeFrame(15, TimeFrameUnit.Minute),
    "30min": TimeFrame(30, TimeFrameUnit.Minute),
    "1h": TimeFrame(1, TimeFrameUnit.Hour),
    "1d": TimeFrame(1, TimeFrameUnit.Day),
    "1w": TimeFrame(1, TimeFrameUnit.Week),
    "1min": "1Min",
    "5min": "5Min",
    "15min": "15Min",
    "30min": "30Min",
    "1h": "1Hour",
    "1d": "1Day",
    "1w": "1Week",
}

_ASSET_CLASSES = {"stocks", "crypto", "options"}
_ASSET_CLASS_MAP = {
    "stocks": "us_equity",
    "crypto": "crypto",
    "options": "us_option",
}


def _tf(interval: str) -> TimeFrame:
def _tf(interval: str) -> str:
    if interval in _TF_MAP:
        return _TF_MAP[interval]
    raise ValueError(f"unsupported timeframe: {interval}")


def _asset_class_enum(ac: str) -> AssetClass:
def _asset_class_param(ac: str) -> str:
    ac = ac.lower()
    if ac == "stocks":
        return AssetClass.US_EQUITY
    if ac == "crypto":
        return AssetClass.CRYPTO
    if ac == "options":
        return AssetClass.US_OPTION
    if ac in _ASSET_CLASS_MAP:
        return _ASSET_CLASS_MAP[ac]
    raise ValueError(f"invalid asset_class: {ac}")


def _serialize(obj: Any) -> Any:
    """Recursively convert pydantic/datetime objects → json-safe."""
    if obj is None or isinstance(obj, str | int | float | bool):
        return obj
    if isinstance(obj, _dt.datetime | _dt.date):
        return obj.isoformat()
    if isinstance(obj, dict):
        return {k: _serialize(v) for k, v in obj.items()}
    if isinstance(obj, list | tuple):
        return [_serialize(v) for v in obj]
    if hasattr(obj, "model_dump"):
        return _serialize(obj.model_dump())
    if hasattr(obj, "__dict__"):
        return _serialize(vars(obj))
    return str(obj)
def _iso(value: _dt.datetime | _dt.date | None) -> str | None:
    if value is None:
        return None
    return value.isoformat()


class AlpacaClient:
    """httpx-based client for the Alpaca REST API v2.

    Auth via the `APCA-API-KEY-ID` / `APCA-API-SECRET-KEY` headers.
    """

    def __init__(
        self,
        api_key: str,
        secret_key: str,
        paper: bool = True,
        base_url: str | None = None,
        trading: Any | None = None,
        stock_data: Any | None = None,
        crypto_data: Any | None = None,
        option_data: Any | None = None,
        http: httpx.AsyncClient | None = None,
    ) -> None:
        self.api_key = api_key
        self.secret_key = secret_key
        self.paper = paper
        # `base_url` kept as a public attribute (tests/build_client read it).
        # Overrides the trading endpoint only; data endpoints are always
        # `data.alpaca.markets` (Alpaca offers no paper data feed).
        self.base_url = base_url
        # alpaca-py TradingClient accepts `url_override` to override the trading URL.
        # Data clients (Stock/Crypto/Option) don't support url_override in the constructor;
        # they use separate data endpoints (data.alpaca.markets), so `base_url` is ignored for them.
        if trading is None:
            trading_kwargs: dict[str, Any] = {
                "api_key": api_key, "secret_key": secret_key, "paper": paper,
            }
        if base_url:
            trading_kwargs["url_override"] = base_url
            trading = TradingClient(**trading_kwargs)
        self._trading = trading
        self._stock = stock_data or StockHistoricalDataClient(
            api_key=api_key, secret_key=secret_key
        )
        self._crypto = crypto_data or CryptoHistoricalDataClient(
            api_key=api_key, secret_key=secret_key
        )
        self._option = option_data or OptionHistoricalDataClient(
            api_key=api_key, secret_key=secret_key
        )
            self._trading_base = base_url
        else:
            self._trading_base = _TRADING_PAPER if paper else _TRADING_LIVE
        self._data_base = _DATA
        # Single long-lived AsyncClient → reuse connection pool.
        self._http = http or async_client(timeout=30.0)

    async def _run(self, fn, /, *args, **kwargs):
        return await asyncio.to_thread(fn, *args, **kwargs)
    async def aclose(self) -> None:
        """Close HTTP connections. Idempotent."""
        if not self._http.is_closed:
            await self._http.aclose()

    async def health(self) -> dict[str, Any]:
        """Minimal probe for /health/ready: no network call."""
        return {"status": "ok", "paper": self.paper}

    # ── Helpers ──────────────────────────────────────────────────

    @property
    def _headers(self) -> dict[str, str]:
        return {
            "APCA-API-KEY-ID": self.api_key,
            "APCA-API-SECRET-KEY": self.secret_key,
            "Accept": "application/json",
        }

    async def _request(
        self,
        method: str,
        base: str,
        path: str,
        *,
        params: dict[str, Any] | None = None,
        json_body: dict[str, Any] | None = None,
    ) -> Any:
        """Perform an authenticated HTTP request and return the parsed JSON.

        Returns `{}` for an empty response body (e.g. DELETE 204).
        Raises `httpx.HTTPStatusError` on 4xx/5xx via raise_for_status.
        """
        url = f"{base}{path}"
        # httpx only drops query params whose value is None automatically
        # when they are passed as a list of tuples; with a dict we must
        # filter them out up front.
        clean_params: dict[str, Any] | None = None
        if params is not None:
            clean_params = {k: v for k, v in params.items() if v is not None}
            if not clean_params:
                clean_params = None
        resp = await self._http.request(
            method,
            url,
            params=clean_params,
            json=json_body,
            headers=self._headers,
        )
        resp.raise_for_status()
        if not resp.content:
            return {}
        return resp.json()

    # ── Account / positions ──────────────────────────────────────

    async def get_account(self) -> dict:
        acc = await self._run(self._trading.get_account)
        return _serialize(acc)  # type: ignore[no-any-return]
        data = await self._request("GET", self._trading_base, "/v2/account")
        return dict(data) if data else {}

    async def get_positions(self) -> list[dict]:
        pos = await self._run(self._trading.get_all_positions)
        return [_serialize(p) for p in pos]
        data = await self._request("GET", self._trading_base, "/v2/positions")
        return list(data) if data else []

    async def get_activities(self, limit: int = 50) -> list[dict]:
        acts = await self._run(self._trading.get_account_activities)  # type: ignore[union-attr]
        data = [_serialize(a) for a in acts]
        return data[:limit]
        data = await self._request(
            "GET",
            self._trading_base,
            "/v2/account/activities",
            params={"page_size": limit},
        )
        items = list(data) if data else []
        return items[:limit]

    # ── Assets ──────────────────────────────────────────────────

    async def get_assets(
        self, asset_class: str = "stocks", status: str = "active"
    ) -> list[dict]:
        req = GetAssetsRequest(
            asset_class=_asset_class_enum(asset_class),
            status=status,  # type: ignore[arg-type]
        data = await self._request(
            "GET",
            self._trading_base,
            "/v2/assets",
            params={
                "status": status,
                "asset_class": _asset_class_param(asset_class),
            },
        )
        assets = await self._run(self._trading.get_all_assets, req)
        return [_serialize(a) for a in assets[:500]]
        items = list(data) if data else []
        return items[:500]

    # ── Market data ─────────────────────────────────────────────

    async def get_ticker(self, symbol: str, asset_class: str = "stocks") -> dict:
        ac = asset_class.lower()
        if ac == "stocks":
            req = StockLatestTradeRequest(symbol_or_symbols=symbol)
            data = await self._run(self._stock.get_stock_latest_trade, req)
            trade = data.get(symbol)
            q_req = StockLatestQuoteRequest(symbol_or_symbols=symbol)
            qdata = await self._run(self._stock.get_stock_latest_quote, q_req)
            quote = qdata.get(symbol)
            trade_resp = await self._request(
                "GET",
                self._data_base,
                f"/v2/stocks/{symbol}/trades/latest",
            )
            quote_resp = await self._request(
                "GET",
                self._data_base,
                f"/v2/stocks/{symbol}/quotes/latest",
            )
            trade = (trade_resp or {}).get("trade") or {}
            quote = (quote_resp or {}).get("quote") or {}
            return {
                "symbol": symbol,
                "asset_class": "stocks",
                "last_price": getattr(trade, "price", None),
                "bid": getattr(quote, "bid_price", None),
                "ask": getattr(quote, "ask_price", None),
                "bid_size": getattr(quote, "bid_size", None),
                "ask_size": getattr(quote, "ask_size", None),
                "timestamp": _serialize(getattr(trade, "timestamp", None)),
                "last_price": trade.get("p"),
                "bid": quote.get("bp"),
                "ask": quote.get("ap"),
                "bid_size": quote.get("bs"),
                "ask_size": quote.get("as"),
                "timestamp": trade.get("t"),
            }
        if ac == "crypto":
            req = CryptoLatestTradeRequest(symbol_or_symbols=symbol)  # type: ignore[assignment]
            data = await self._run(self._crypto.get_crypto_latest_trade, req)
            trade = data.get(symbol)
            q_req = CryptoLatestQuoteRequest(symbol_or_symbols=symbol)  # type: ignore[assignment]
            qdata = await self._run(self._crypto.get_crypto_latest_quote, q_req)
            quote = qdata.get(symbol)
            trade_resp = await self._request(
                "GET",
                self._data_base,
                "/v1beta3/crypto/us/latest/trades",
                params={"symbols": symbol},
            )
            quote_resp = await self._request(
                "GET",
                self._data_base,
                "/v1beta3/crypto/us/latest/quotes",
                params={"symbols": symbol},
            )
            trade = ((trade_resp or {}).get("trades") or {}).get(symbol) or {}
            quote = ((quote_resp or {}).get("quotes") or {}).get(symbol) or {}
            return {
                "symbol": symbol,
                "asset_class": "crypto",
                "last_price": getattr(trade, "price", None),
                "bid": getattr(quote, "bid_price", None),
                "ask": getattr(quote, "ask_price", None),
                "timestamp": _serialize(getattr(trade, "timestamp", None)),
                "last_price": trade.get("p"),
                "bid": quote.get("bp"),
                "ask": quote.get("ap"),
                "timestamp": trade.get("t"),
            }
        if ac == "options":
            req = OptionLatestQuoteRequest(symbol_or_symbols=symbol)  # type: ignore[assignment]
            data = await self._run(self._option.get_option_latest_quote, req)
            quote = data.get(symbol)
            quote_resp = await self._request(
                "GET",
                self._data_base,
                f"/v1beta1/options/{symbol}/quotes/latest",
            )
            quote = (quote_resp or {}).get("quote") or {}
            return {
                "symbol": symbol,
                "asset_class": "options",
                "bid": getattr(quote, "bid_price", None),
                "ask": getattr(quote, "ask_price", None),
                "timestamp": _serialize(getattr(quote, "timestamp", None)),
                "bid": quote.get("bp"),
                "ask": quote.get("ap"),
                "timestamp": quote.get("t"),
            }
        raise ValueError(f"invalid asset_class: {asset_class}")
@@ -212,73 +261,125 @@ class AlpacaClient:
        limit: int = 1000,
    ) -> dict:
        tf = _tf(interval)
        start_dt = _dt.datetime.fromisoformat(start) if start else (
            _dt.datetime.now(_dt.UTC) - _dt.timedelta(days=30)
        start_dt = (
            _dt.datetime.fromisoformat(start)
            if start
            else (_dt.datetime.now(_dt.UTC) - _dt.timedelta(days=30))
        )
        end_dt = _dt.datetime.fromisoformat(end) if end else _dt.datetime.now(_dt.UTC)
        ac = asset_class.lower()

        params: dict[str, Any] = {
            "symbols": symbol,
            "timeframe": tf,
            "start": _iso(start_dt),
            "end": _iso(end_dt),
            "limit": limit,
        }

        if ac == "stocks":
            req = StockBarsRequest(
                symbol_or_symbols=symbol, timeframe=tf,
                start=start_dt, end=end_dt, limit=limit,
            # IEX feed by default, consistent with the alpaca-py free-tier default.
            params["feed"] = "iex"
            data = await self._request(
                "GET", self._data_base, "/v2/stocks/bars", params=params
            )
            data = await self._run(self._stock.get_stock_bars, req)
        elif ac == "crypto":
            req = CryptoBarsRequest(  # type: ignore[assignment]
                symbol_or_symbols=symbol, timeframe=tf,
                start=start_dt, end=end_dt, limit=limit,
            data = await self._request(
                "GET",
                self._data_base,
                "/v1beta3/crypto/us/bars",
                params=params,
            )
            data = await self._run(self._crypto.get_crypto_bars, req)
        elif ac == "options":
            req = OptionBarsRequest(  # type: ignore[assignment]
                symbol_or_symbols=symbol, timeframe=tf,
                start=start_dt, end=end_dt, limit=limit,
            data = await self._request(
                "GET",
                self._data_base,
                "/v1beta1/options/bars",
                params=params,
            )
            data = await self._run(self._option.get_option_bars, req)
        else:
            raise ValueError(f"invalid asset_class: {asset_class}")
        bars_dict = getattr(data, "data", {}) or {}
        rows = bars_dict.get(symbol, []) or []
        bars = [

        bars_dict = (data or {}).get("bars") or {}
        rows = bars_dict.get(symbol) or []

        def _iso_to_ms(ts: str | int | None) -> int | None:
            if ts is None or isinstance(ts, int):
                return ts
            return int(_dt.datetime.fromisoformat(
                ts.replace("Z", "+00:00")
            ).timestamp() * 1000)

        candles = validate_candles([
            {
                "timestamp": _serialize(getattr(b, "timestamp", None)),
                "open": getattr(b, "open", None),
                "high": getattr(b, "high", None),
                "low": getattr(b, "low", None),
                "close": getattr(b, "close", None),
                "volume": getattr(b, "volume", None),
                "timestamp": _iso_to_ms(b.get("t")),
                "open": b.get("o"),
                "high": b.get("h"),
                "low": b.get("l"),
                "close": b.get("c"),
                "volume": b.get("v"),
            }
            for b in rows
        ]
        return {"symbol": symbol, "asset_class": ac, "interval": interval, "bars": bars}
        ])
        return {
            "symbol": symbol,
            "asset_class": ac,
            "interval": interval,
            "candles": candles,
        }

    async def get_snapshot(self, symbol: str) -> dict:
        req = StockSnapshotRequest(symbol_or_symbols=symbol)
        data = await self._run(self._stock.get_stock_snapshot, req)
        return _serialize(data.get(symbol))  # type: ignore[no-any-return]
        data = await self._request(
            "GET",
            self._data_base,
            "/v2/stocks/snapshots",
            params={"symbols": symbol},
        )
        # The API returns {"AAPL": {snapshot}} or {"snapshots": {...}}; handle
        # both formats. v2/stocks/snapshots returns a top-level
        # symbol→snapshot dict.
        if data is None:
            return {}
        if symbol in data:
            return data[symbol] or {}
        snaps = data.get("snapshots") or {}
        return snaps.get(symbol) or {}

    async def get_option_chain(
        self,
        underlying: str,
        expiry: str | None = None,
    ) -> dict:
        kwargs: dict[str, Any] = {"underlying_symbol": underlying}
        params: dict[str, Any] = {}
        if expiry:
            kwargs["expiration_date"] = _dt.date.fromisoformat(expiry)
        req = OptionChainRequest(**kwargs)
        data = await self._run(self._option.get_option_chain, req)
            # Date validation (raises ValueError on invalid input;
            # parity with V1, which used _dt.date.fromisoformat).
            _dt.date.fromisoformat(expiry)
            params["expiration_date_gte"] = expiry
            params["expiration_date_lte"] = expiry
        data = await self._request(
            "GET",
            self._data_base,
            f"/v1beta1/options/snapshots/{underlying}",
            params=params or None,
        )
        contracts = (data or {}).get("snapshots") if data else None
        return {
            "underlying": underlying,
            "expiry": expiry,
            "contracts": _serialize(data),
            "contracts": contracts if contracts is not None else (data or {}),
        }

    # ── Orders ──────────────────────────────────────────────────

    async def get_open_orders(self, limit: int = 50) -> list[dict]:
        req = GetOrdersRequest(status=QueryOrderStatus.OPEN, limit=limit)
        orders = await self._run(self._trading.get_orders, filter=req)
        return [_serialize(o) for o in orders]
        data = await self._request(
            "GET",
            self._trading_base,
            "/v2/orders",
            params={"status": "open", "limit": limit},
        )
        return list(data) if data else []

    async def place_order(
        self,
@@ -292,32 +393,39 @@ class AlpacaClient:
        tif: str = "day",
        asset_class: str = "stocks",
    ) -> dict:
        side_enum = OrderSide.BUY if side.lower() == "buy" else OrderSide.SELL
        tif_enum = TimeInForce(tif.lower())
        ot = order_type.lower()
        common = {
        body: dict[str, Any] = {
            "symbol": symbol,
            "side": side_enum,
            "time_in_force": tif_enum,
            "side": side.lower(),
            "type": ot,
            "time_in_force": tif.lower(),
        }
        if qty is not None:
            common["qty"] = qty  # type: ignore[assignment]
            body["qty"] = str(qty)
        if notional is not None:
            common["notional"] = notional  # type: ignore[assignment]
            body["notional"] = str(notional)
        if ot == "market":
            req = MarketOrderRequest(**common)
            pass
        elif ot == "limit":
            if limit_price is None:
                raise ValueError("limit_price required for limit order")
            req = LimitOrderRequest(**common, limit_price=limit_price)  # type: ignore[assignment]
            body["limit_price"] = str(limit_price)
        elif ot == "stop":
            if stop_price is None:
                raise ValueError("stop_price required for stop order")
            req = StopOrderRequest(**common, stop_price=stop_price)  # type: ignore[assignment]
            body["stop_price"] = str(stop_price)
        else:
            raise ValueError(f"unsupported order_type: {order_type}")
        order = await self._run(self._trading.submit_order, req)
        return _serialize(order)  # type: ignore[no-any-return]
        # `asset_class` is not a REST parameter; kept in the signature for
        # V1 parity (the SDK only used it to pick the request model).
        _ = asset_class
        data = await self._request(
            "POST",
            self._trading_base,
            "/v2/orders",
            json_body=body,
        )
        return dict(data) if data else {}

    async def amend_order(
        self,
@@ -327,69 +435,85 @@ class AlpacaClient:
        stop_price: float | None = None,
        tif: str | None = None,
    ) -> dict:
        kwargs: dict[str, Any] = {}
        body: dict[str, Any] = {}
        if qty is not None:
            kwargs["qty"] = qty
            body["qty"] = str(qty)
        if limit_price is not None:
            kwargs["limit_price"] = limit_price
            body["limit_price"] = str(limit_price)
        if stop_price is not None:
            kwargs["stop_price"] = stop_price
            body["stop_price"] = str(stop_price)
        if tif is not None:
            kwargs["time_in_force"] = TimeInForce(tif.lower())
        req = ReplaceOrderRequest(**kwargs)
        order = await self._run(self._trading.replace_order_by_id, order_id, req)
        return _serialize(order)  # type: ignore[no-any-return]
            body["time_in_force"] = tif.lower()
        data = await self._request(
            "PATCH",
            self._trading_base,
            f"/v2/orders/{order_id}",
            json_body=body,
        )
        return dict(data) if data else {}

    async def cancel_order(self, order_id: str) -> dict:
        await self._run(self._trading.cancel_order_by_id, order_id)
        # DELETE /v2/orders/{id} → 204 No Content on success.
        await self._request(
            "DELETE", self._trading_base, f"/v2/orders/{order_id}"
        )
        return {"order_id": order_id, "canceled": True}

    async def cancel_all_orders(self) -> list[dict]:
        resp = await self._run(self._trading.cancel_orders)
        return [_serialize(r) for r in resp]
        # DELETE /v2/orders → 207 Multi-Status with an array of {id, status}
        data = await self._request(
            "DELETE", self._trading_base, "/v2/orders"
        )
        return list(data) if data else []

    # ── Position close ──────────────────────────────────────────

    async def close_position(
        self, symbol: str, qty: float | None = None, percentage: float | None = None
    ) -> dict:
        req = None
        if qty is not None or percentage is not None:
            kwargs: dict[str, Any] = {}
        # DELETE /v2/positions/{symbol}?qty=... or ?percentage=...
        params: dict[str, Any] = {}
        if qty is not None:
            kwargs["qty"] = str(qty)
            params["qty"] = str(qty)
        if percentage is not None:
            kwargs["percentage"] = str(percentage)
            req = ClosePositionRequest(**kwargs)
        order = await self._run(
            self._trading.close_position, symbol, close_options=req
            params["percentage"] = str(percentage)
        data = await self._request(
            "DELETE",
            self._trading_base,
            f"/v2/positions/{symbol}",
            params=params or None,
        )
        return _serialize(order)  # type: ignore[no-any-return]
        return dict(data) if data else {}

    async def close_all_positions(self, cancel_orders: bool = True) -> list[dict]:
        resp = await self._run(
            self._trading.close_all_positions, cancel_orders=cancel_orders
        data = await self._request(
            "DELETE",
            self._trading_base,
            "/v2/positions",
            params={"cancel_orders": "true" if cancel_orders else "false"},
        )
        return [_serialize(r) for r in resp]
        return list(data) if data else []

    # ── Clock / calendar ────────────────────────────────────────

    async def get_clock(self) -> dict:
        clock = await self._run(self._trading.get_clock)
        return _serialize(clock)  # type: ignore[no-any-return]
        data = await self._request("GET", self._trading_base, "/v2/clock")
        return dict(data) if data else {}

    async def get_calendar(
        self, start: str | None = None, end: str | None = None
    ) -> list[dict]:
        from alpaca.trading.requests import GetCalendarRequest

        kwargs: dict[str, Any] = {}
        params: dict[str, Any] = {}
        if start:
            kwargs["start"] = _dt.date.fromisoformat(start)
            _dt.date.fromisoformat(start)  # validation, V1 parity
            params["start"] = start
        if end:
            kwargs["end"] = _dt.date.fromisoformat(end)
        req = GetCalendarRequest(**kwargs) if kwargs else None
        cal = await self._run(
            self._trading.get_calendar, filters=req
        ) if req else await self._run(self._trading.get_calendar)
        return [_serialize(c) for c in cal]
            _dt.date.fromisoformat(end)
            params["end"] = end
        data = await self._request(
            "GET",
            self._trading_base,
            "/v2/calendar",
            params=params or None,
        )
        return list(data) if data else []
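End to end, the rewritten client reads like a thin REST wrapper; a minimal sketch using only methods shown in the hunks above (keys are placeholders):

import asyncio

async def demo() -> None:
    client = AlpacaClient(api_key="<KEY>", secret_key="<SECRET>", paper=True)
    try:
        clock = await client.get_clock()                  # GET /v2/clock
        tick = await client.get_ticker("AAPL", "stocks")  # latest trade + quote
        print(clock.get("is_open"), tick["last_price"], tick["bid"], tick["ask"])
    finally:
        await client.aclose()

asyncio.run(demo())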
@@ -1,12 +1,33 @@
"""Bybit V5 REST API client (pure httpx, no SDK).

Direct implementation on `httpx.AsyncClient` for the Cerbero MCP V2 tools.
Keeps interface parity with the previous version based on
`pybit.unified_trading.HTTP`, so neither `tools.py` nor the routers break.

Bybit V5 auth:
    Header X-BAPI-SIGN = HMAC_SHA256(secret,
        timestamp + api_key + recv_window + (body_json | querystring))
"""
from __future__ import annotations

import asyncio
import hashlib
import hmac
import json
import time
import uuid
from typing import Any
from urllib.parse import urlencode

from pybit.unified_trading import HTTP
import httpx

from cerbero_mcp.common import indicators as ind
from cerbero_mcp.common import microstructure as micro
from cerbero_mcp.common.candles import validate_candles

BASE_MAINNET = "https://api.bybit.com"
BASE_TESTNET = "https://api-testnet.bybit.com"
DEFAULT_RECV_WINDOW = "5000"
DEFAULT_TIMEOUT = 15.0


def _f(v: Any) -> float | None:
@@ -23,37 +44,147 @@ def _i(v: Any) -> int | None:
|
||||
return None
|
||||
|
||||
|
||||
class BybitAPIError(RuntimeError):
|
||||
"""Errore di trasporto Bybit V5 (non gestito a livello envelope)."""
|
||||
|
||||
|
||||
class BybitClient:
|
||||
"""Async REST client per Bybit V5 (linear/inverse/spot/option)."""
|
||||
|
||||
def __init__(
|
||||
self,
|
||||
api_key: str,
|
||||
api_secret: str,
|
||||
testnet: bool = True,
|
||||
http: Any | None = None,
|
||||
http: httpx.AsyncClient | None = None,
|
||||
base_url: str | None = None,
|
||||
) -> None:
|
||||
self.api_key = api_key
|
||||
self.api_secret = api_secret
|
||||
self.testnet = testnet
|
||||
# pybit HTTP non accetta `endpoint` come kwarg (vedi _V5HTTPManager.__init__:
|
||||
# solo `domain`/`tld`/`testnet`). Override URL applicato post-init
|
||||
# sovrascrivendo l'attributo `endpoint` dell'istanza HTTP.
|
||||
self.base_url = base_url
|
||||
if http is None:
|
||||
http = HTTP(
|
||||
api_key=api_key,
|
||||
api_secret=api_secret,
|
||||
testnet=testnet,
|
||||
self.base_url = base_url or (BASE_TESTNET if testnet else BASE_MAINNET)
|
||||
self.recv_window = DEFAULT_RECV_WINDOW
|
||||
# `http` injection è usato dai test per montare un AsyncClient con
|
||||
# `httpx.MockTransport`. In produzione creiamo un client dedicato.
|
||||
self._owns_http = http is None
|
||||
self._http: httpx.AsyncClient = http or httpx.AsyncClient(
|
||||
timeout=DEFAULT_TIMEOUT
|
||||
)
|
||||
if base_url:
|
||||
http.endpoint = base_url
|
||||
self._http = http
|
||||
|
||||
async def _run(self, fn, /, **kwargs):
|
||||
return await asyncio.to_thread(fn, **kwargs)
|
||||
async def aclose(self) -> None:
|
||||
"""Chiude l'AsyncClient httpx se di nostra proprietà."""
|
||||
if self._owns_http:
|
||||
await self._http.aclose()
|
||||
|
||||
async def health(self) -> dict[str, Any]:
|
||||
"""Probe minimo per /health/ready: nessuna chiamata di rete."""
|
||||
return {"status": "ok", "testnet": self.testnet}
|
||||
|
||||
# ── auth helpers ───────────────────────────────────────────
|
||||
|
||||
def _timestamp_ms(self) -> str:
|
||||
return str(int(time.time() * 1000))
|
||||
|
||||
def _sign(self, timestamp: str, payload: str) -> str:
|
||||
msg = timestamp + self.api_key + self.recv_window + payload
|
||||
return hmac.new(
|
||||
self.api_secret.encode("utf-8"),
|
||||
msg.encode("utf-8"),
|
||||
hashlib.sha256,
|
||||
).hexdigest()
|
||||
|
||||
def _signed_headers(self, payload: str) -> dict[str, str]:
|
||||
ts = self._timestamp_ms()
|
||||
sig = self._sign(ts, payload)
|
||||
return {
|
||||
"X-BAPI-API-KEY": self.api_key,
|
||||
"X-BAPI-TIMESTAMP": ts,
|
||||
"X-BAPI-RECV-WINDOW": self.recv_window,
|
||||
"X-BAPI-SIGN": sig,
|
||||
"Content-Type": "application/json",
|
||||
}
|
||||
|
||||
@staticmethod
|
||||
def _parse_ticker(row: dict) -> dict:
|
||||
def _clean_params(params: dict[str, Any] | None) -> dict[str, Any]:
|
||||
if not params:
|
||||
return {}
|
||||
return {k: v for k, v in params.items() if v is not None}
|
||||
|
||||
@staticmethod
|
||||
def _querystring(params: dict[str, Any]) -> str:
|
||||
# Bybit accetta querystring nell'ordine in cui viene serializzata la
|
||||
# request. Per la signature usiamo lo stesso urlencode (ordine
|
||||
# inserzione dict). In Python 3.7+ dict mantiene insertion order:
|
||||
# mantenere coerenza tra signature payload e URL effettivo.
|
||||
return urlencode(params)
|
||||
|
||||
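The signature recipe above can be checked in isolation. A minimal sketch with placeholder credentials (`API_KEY`, `API_SECRET`, and the timestamp are made-up values, not real ones):

```python
import hashlib
import hmac
from urllib.parse import urlencode

API_KEY, API_SECRET, RECV_WINDOW = "key", "secret", "5000"

def bapi_sign(timestamp_ms: str, payload: str) -> str:
    # payload: urlencoded querystring for GET, compact JSON body for POST.
    msg = timestamp_ms + API_KEY + RECV_WINDOW + payload
    return hmac.new(API_SECRET.encode(), msg.encode(), hashlib.sha256).hexdigest()

qs = urlencode({"category": "linear", "symbol": "BTCUSDT"})
print(bapi_sign("1700000000000", qs))  # value of the X-BAPI-SIGN header
```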
+    # ── request primitives ─────────────────────────────────────
+
+    async def _request_public(
+        self,
+        method: str,
+        path: str,
+        params: dict[str, Any] | None = None,
+    ) -> dict[str, Any]:
+        clean = self._clean_params(params)
+        url = self.base_url + path
+        resp = await self._http.request(
+            method, url, params=clean if clean else None
+        )
+        return self._parse_response(resp)
+
+    async def _request_signed(
+        self,
+        method: str,
+        path: str,
+        params: dict[str, Any] | None = None,
+        body: dict[str, Any] | None = None,
+    ) -> dict[str, Any]:
+        url = self.base_url + path
+        method = method.upper()
+        if method == "GET":
+            clean = self._clean_params(params)
+            qs = self._querystring(clean)
+            headers = self._signed_headers(qs)
+            resp = await self._http.request(
+                method, url, params=clean if clean else None, headers=headers
+            )
+        else:
+            payload_body = body or {}
+            body_json = json.dumps(payload_body, separators=(",", ":"))
+            headers = self._signed_headers(body_json)
+            resp = await self._http.request(
+                method, url, content=body_json, headers=headers
+            )
+        return self._parse_response(resp)
+
+    @staticmethod
+    def _parse_response(resp: httpx.Response) -> dict[str, Any]:
+        try:
+            data = resp.json()
+        except Exception as e:  # pragma: no cover - hard to hit in practice
+            raise BybitAPIError(
+                f"invalid JSON from Bybit (status={resp.status_code}): {resp.text[:200]}"
+            ) from e
+        if resp.status_code >= 500:
+            raise BybitAPIError(
+                f"bybit server error {resp.status_code}: "
+                f"{data.get('retMsg', resp.text[:200])}"
+            )
+        if not isinstance(data, dict):
+            raise BybitAPIError(f"unexpected payload type: {type(data).__name__}")
+        return data
+
+    def _envelope(self, resp: dict[str, Any], payload: dict[str, Any]) -> dict[str, Any]:
+        code = resp.get("retCode", 0)
+        if code != 0:
+            return {"error": resp.get("retMsg", "bybit_error"), "code": code}
+        return payload

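The `http` injection hook mentioned in `__init__` makes the whole request/parse/envelope path testable offline. A minimal sketch (the canned envelope is illustrative):

```python
import asyncio
import httpx

def handler(request: httpx.Request) -> httpx.Response:
    # Canned V5 envelope: retCode 0 and an empty result list.
    return httpx.Response(
        200, json={"retCode": 0, "retMsg": "OK", "result": {"list": []}}
    )

async def main() -> None:
    client = BybitClient(
        "key", "secret", testnet=True,
        http=httpx.AsyncClient(transport=httpx.MockTransport(handler)),
    )
    print(await client.get_positions())  # → []
    await client.aclose()  # no-op: the injected client is not owned

asyncio.run(main())
```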
+    # ── parsers shared ─────────────────────────────────────────
+
+    @staticmethod
+    def _parse_ticker(row: dict[str, Any]) -> dict[str, Any]:
        return {
            "symbol": row.get("symbol"),
            "last_price": _f(row.get("lastPrice")),
@@ -66,9 +197,13 @@ class BybitClient:
            "open_interest": _f(row.get("openInterest")),
        }

    # ── market data (public) ───────────────────────────────────

    async def get_ticker(self, symbol: str, category: str = "linear") -> dict:
-        resp = await self._run(
-            self._http.get_tickers, category=category, symbol=symbol
+        resp = await self._request_public(
+            "GET",
+            "/v5/market/tickers",
+            params={"category": category, "symbol": symbol},
        )
        rows = (resp.get("result") or {}).get("list") or []
        if not rows:
@@ -86,8 +221,10 @@ class BybitClient:
    async def get_orderbook(
        self, symbol: str, category: str = "linear", limit: int = 50
    ) -> dict:
-        resp = await self._run(
-            self._http.get_orderbook, category=category, symbol=symbol, limit=limit
+        resp = await self._request_public(
+            "GET",
+            "/v5/market/orderbook",
+            params={"category": category, "symbol": symbol, "limit": limit},
        )
        r = resp.get("result") or {}
        return {
@@ -106,30 +243,29 @@ class BybitClient:
        end: int | None = None,
        limit: int = 1000,
    ) -> dict:
-        kwargs = dict(
-            category=category,
-            symbol=symbol,
-            interval=interval,
-            limit=limit,
-        )
+        params: dict[str, Any] = {
+            "category": category,
+            "symbol": symbol,
+            "interval": interval,
+            "limit": limit,
+        }
        if start is not None:
-            kwargs["start"] = start
+            params["start"] = start
        if end is not None:
-            kwargs["end"] = end
-        resp = await self._run(self._http.get_kline, **kwargs)
+            params["end"] = end
+        resp = await self._request_public("GET", "/v5/market/kline", params=params)
        rows = (resp.get("result") or {}).get("list") or []
-        rows_sorted = sorted(rows, key=lambda r: int(r[0]))
-        candles = [
+        candles = validate_candles([
            {
                "timestamp": int(r[0]),
-                "open": float(r[1]),
-                "high": float(r[2]),
-                "low": float(r[3]),
-                "close": float(r[4]),
-                "volume": float(r[5]),
+                "open": r[1],
+                "high": r[2],
+                "low": r[3],
+                "close": r[4],
+                "volume": r[5],
            }
-            for r in rows_sorted
-        ]
+            for r in rows
+        ])
        return {"symbol": symbol, "candles": candles}

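`validate_candles` itself is not shown in this diff; the call sites suggest it now owns the numeric coercion and ordering the old inline loop did. A plausible minimal version, purely as an assumption about its contract:

```python
def validate_candles_sketch(rows: list[dict]) -> list[dict]:
    # Assumed behavior: coerce numerics, drop malformed bars, sort by timestamp.
    out = []
    for r in rows:
        try:
            c = {
                "timestamp": int(r["timestamp"]),
                "open": float(r["open"]),
                "high": float(r["high"]),
                "low": float(r["low"]),
                "close": float(r["close"]),
                "volume": float(r["volume"]),
            }
        except (KeyError, TypeError, ValueError):
            continue
        if c["low"] <= c["high"]:  # discard bars with inverted ranges
            out.append(c)
    return sorted(out, key=lambda c: c["timestamp"])
```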
    async def get_indicators(
@@ -168,8 +304,10 @@ class BybitClient:
        return out

    async def get_funding_rate(self, symbol: str, category: str = "linear") -> dict:
-        resp = await self._run(
-            self._http.get_tickers, category=category, symbol=symbol
+        resp = await self._request_public(
+            "GET",
+            "/v5/market/tickers",
+            params={"category": category, "symbol": symbol},
        )
        rows = (resp.get("result") or {}).get("list") or []
        if not rows:
@@ -184,9 +322,10 @@ class BybitClient:
    async def get_funding_history(
        self, symbol: str, category: str = "linear", limit: int = 100
    ) -> dict:
-        resp = await self._run(
-            self._http.get_funding_rate_history,
-            category=category, symbol=symbol, limit=limit,
+        resp = await self._request_public(
+            "GET",
+            "/v5/market/funding/history",
+            params={"category": category, "symbol": symbol, "limit": limit},
        )
        rows = (resp.get("result") or {}).get("list") or []
        hist = [
@@ -205,9 +344,15 @@ class BybitClient:
        interval: str = "5min",
        limit: int = 288,
    ) -> dict:
-        resp = await self._run(
-            self._http.get_open_interest,
-            category=category, symbol=symbol, intervalTime=interval, limit=limit,
+        resp = await self._request_public(
+            "GET",
+            "/v5/market/open-interest",
+            params={
+                "category": category,
+                "symbol": symbol,
+                "intervalTime": interval,
+                "limit": limit,
+            },
        )
        rows = (resp.get("result") or {}).get("list") or []
        points = [
@@ -226,17 +371,22 @@ class BybitClient:
            "points": points,
        }

-    async def get_instruments(self, category: str = "linear", symbol: str | None = None) -> dict:
-        kwargs: dict[str, Any] = {"category": category}
+    async def get_instruments(
+        self, category: str = "linear", symbol: str | None = None
+    ) -> dict:
+        params: dict[str, Any] = {"category": category}
        if symbol:
-            kwargs["symbol"] = symbol
-        resp = await self._run(self._http.get_instruments_info, **kwargs)
+            params["symbol"] = symbol
+        resp = await self._request_public(
+            "GET", "/v5/market/instruments-info", params=params
+        )
        rows = (resp.get("result") or {}).get("list") or []
        instruments = []
        for r in rows:
            pf = r.get("priceFilter") or {}
            lf = r.get("lotSizeFilter") or {}
-            instruments.append({
+            instruments.append(
+                {
                    "symbol": r.get("symbol"),
                    "status": r.get("status"),
                    "base_coin": r.get("baseCoin"),
@@ -244,39 +394,48 @@ class BybitClient:
                    "tick_size": _f(pf.get("tickSize")),
                    "qty_step": _f(lf.get("qtyStep")),
                    "min_qty": _f(lf.get("minOrderQty")),
-            })
+                }
+            )
        return {"category": category, "instruments": instruments}

    async def get_option_chain(self, base_coin: str, expiry: str | None = None) -> dict:
-        kwargs: dict[str, Any] = {"category": "option", "baseCoin": base_coin.upper()}
-        resp = await self._run(self._http.get_instruments_info, **kwargs)
+        resp = await self._request_public(
+            "GET",
+            "/v5/market/instruments-info",
+            params={"category": "option", "baseCoin": base_coin.upper()},
+        )
        rows = (resp.get("result") or {}).get("list") or []
        options = []
        for r in rows:
            delivery = r.get("deliveryTime")
            if expiry and expiry not in r.get("symbol", ""):
                continue
-            options.append({
+            options.append(
+                {
                    "symbol": r.get("symbol"),
                    "base_coin": r.get("baseCoin"),
                    "settle_coin": r.get("settleCoin"),
                    "type": r.get("optionsType"),
                    "launch_time": int(r.get("launchTime", 0)),
                    "delivery_time": int(delivery) if delivery else None,
-            })
+                }
+            )
        return {"base_coin": base_coin.upper(), "options": options}

    # ── account / positions / orders (signed) ─────────────────

    async def get_positions(
        self, category: str = "linear", settle_coin: str = "USDT"
    ) -> list[dict]:
-        kwargs: dict[str, Any] = {"category": category}
+        params: dict[str, Any] = {"category": category}
        if category in ("linear", "inverse"):
-            kwargs["settleCoin"] = settle_coin
-        resp = await self._run(self._http.get_positions, **kwargs)
+            params["settleCoin"] = settle_coin
+        resp = await self._request_signed("GET", "/v5/position/list", params=params)
        rows = (resp.get("result") or {}).get("list") or []
        out = []
        for r in rows:
-            out.append({
+            out.append(
+                {
                    "symbol": r.get("symbol"),
                    "side": r.get("side"),
                    "size": _f(r.get("size")),
@@ -285,12 +444,15 @@ class BybitClient:
                    "leverage": _f(r.get("leverage")),
                    "liquidation_price": _f(r.get("liqPrice")),
                    "position_value": _f(r.get("positionValue")),
-            })
+                }
+            )
        return out

    async def get_account_summary(self, account_type: str = "UNIFIED") -> dict:
-        resp = await self._run(
-            self._http.get_wallet_balance, accountType=account_type
+        resp = await self._request_signed(
+            "GET",
+            "/v5/account/wallet-balance",
+            params={"accountType": account_type},
        )
        rows = (resp.get("result") or {}).get("list") or []
        if not rows:
@@ -298,11 +460,13 @@ class BybitClient:
        a = rows[0]
        coins = []
        for c in a.get("coin") or []:
-            coins.append({
+            coins.append(
+                {
                    "coin": c.get("coin"),
                    "wallet_balance": _f(c.get("walletBalance")),
                    "equity": _f(c.get("equity")),
-            })
+                }
+            )
        return {
            "account_type": a.get("accountType"),
            "equity": _f(a.get("totalEquity")),
@@ -316,8 +480,10 @@ class BybitClient:
    async def get_trade_history(
        self, category: str = "linear", limit: int = 50
    ) -> list[dict]:
-        resp = await self._run(
-            self._http.get_executions, category=category, limit=limit
+        resp = await self._request_signed(
+            "GET",
+            "/v5/execution/list",
+            params={"category": category, "limit": limit},
        )
        rows = (resp.get("result") or {}).get("list") or []
        return [
@@ -339,12 +505,14 @@ class BybitClient:
        symbol: str | None = None,
        settle_coin: str = "USDT",
    ) -> list[dict]:
-        kwargs: dict[str, Any] = {"category": category}
+        params: dict[str, Any] = {"category": category}
        if category in ("linear", "inverse") and not symbol:
-            kwargs["settleCoin"] = settle_coin
+            params["settleCoin"] = settle_coin
        if symbol:
-            kwargs["symbol"] = symbol
-        resp = await self._run(self._http.get_open_orders, **kwargs)
+            params["symbol"] = symbol
+        resp = await self._request_signed(
+            "GET", "/v5/order/realtime", params=params
+        )
        rows = (resp.get("result") or {}).get("list") or []
        return [
            {
@@ -360,15 +528,20 @@ class BybitClient:
            for r in rows
        ]

    # ── microstructure / basis ─────────────────────────────────

    async def get_orderbook_imbalance(
        self,
        symbol: str,
        category: str = "linear",
        depth: int = 10,
    ) -> dict:
        """Microstructure: bid/ask imbalance ratio + microprice + slope."""
-        ob = await self.get_orderbook(symbol=symbol, category=category, limit=max(depth, 50))
-        result = micro.orderbook_imbalance(ob.get("bids") or [], ob.get("asks") or [], depth=depth)
+        ob = await self.get_orderbook(
+            symbol=symbol, category=category, limit=max(depth, 50)
+        )
+        result = micro.orderbook_imbalance(
+            ob.get("bids") or [], ob.get("asks") or [], depth=depth
+        )
        return {
            "symbol": symbol,
            "category": category,
@@ -378,9 +551,6 @@ class BybitClient:
        }

    async def get_basis_term_structure(self, asset: str) -> dict:
        """Basis curve of dated futures vs perp + spot. Filters dated
        BTCUSDT / ETHUSDT futures contracts and computes the annualized
        basis for each one.
        """
        import datetime as _dt

        asset = asset.upper()
@@ -389,12 +559,13 @@ class BybitClient:
        sp = spot.get("last_price")
        pp = perp.get("last_price")

        # List of dated futures (linear/inverse)
        instr = await self.get_instruments(category="linear")
-        items = (instr.get("instruments") or [])
+        items = instr.get("instruments") or []
        futures = [
-            x for x in items
-            if x.get("symbol", "").startswith(f"{asset}-") or x.get("symbol", "").startswith(f"{asset}USDT-")
+            x
+            for x in items
+            if x.get("symbol", "").startswith(f"{asset}-")
+            or x.get("symbol", "").startswith(f"{asset}USDT-")
        ]

        rows: list[dict[str, Any]] = []
@@ -409,21 +580,25 @@ class BybitClient:
            days = max((int(expiry_ms) - now_ms) / 86_400_000, 1)
            basis_pct = 100.0 * (fp - sp) / sp
            annualized = basis_pct * 365.0 / days
-            rows.append({
+            rows.append(
+                {
                    "symbol": f["symbol"],
                    "expiry_ms": int(expiry_ms),
                    "days_to_expiry": round(days, 2),
                    "future_price": fp,
                    "basis_pct": round(basis_pct, 4),
                    "annualized_basis_pct": round(annualized, 4),
-            })
+                }
+            )

        rows.sort(key=lambda r: r["days_to_expiry"])
        return {
            "asset": asset,
            "spot_price": sp,
            "perp_price": pp,
-            "perp_basis_pct": round(100.0 * (pp - sp) / sp, 4) if (sp and pp) else None,
+            "perp_basis_pct": round(100.0 * (pp - sp) / sp, 4)
+            if (sp and pp)
+            else None,
            "term_structure": rows,
            "data_timestamp": _dt.datetime.now(_dt.UTC).isoformat(),
        }
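The annualization above is plain linear scaling of the cash-and-carry premium. Worked numbers (all inputs assumed):

```python
spot, fut, days = 60_000.0, 60_900.0, 45.0   # assumed prices and days to expiry
basis_pct = 100.0 * (fut - spot) / spot       # 1.5 % premium to spot
annualized = basis_pct * 365.0 / days         # ≈ 12.17 % annualized carry
```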
@@ -449,11 +624,7 @@ class BybitClient:
            "funding_rate": perp.get("funding_rate"),
        }

-    def _envelope(self, resp: dict, payload: dict) -> dict:
-        code = resp.get("retCode", 0)
-        if code != 0:
-            return {"error": resp.get("retMsg", "bybit_error"), "code": code}
-        return payload
+    # ── trading (signed, write) ────────────────────────────────

    async def place_order(
        self,
@@ -467,7 +638,7 @@ class BybitClient:
        reduce_only: bool = False,
        position_idx: int | None = None,
    ) -> dict:
-        kwargs: dict[str, Any] = {
+        body: dict[str, Any] = {
            "category": category,
            "symbol": symbol,
            "side": side,
@@ -477,38 +648,34 @@ class BybitClient:
            "reduceOnly": reduce_only,
        }
        if price is not None:
-            kwargs["price"] = str(price)
+            body["price"] = str(price)
        if position_idx is not None:
-            kwargs["positionIdx"] = position_idx
+            body["positionIdx"] = position_idx
        if category == "option":
-            import uuid
-            kwargs["orderLinkId"] = f"cerbero-{uuid.uuid4().hex[:16]}"
-        resp = await self._run(self._http.place_order, **kwargs)
+            body["orderLinkId"] = f"cerbero-{uuid.uuid4().hex[:16]}"
+        resp = await self._request_signed("POST", "/v5/order/create", body=body)
        r = resp.get("result") or {}
-        return self._envelope(resp, {
+        return self._envelope(
+            resp,
+            {
                "order_id": r.get("orderId"),
                "order_link_id": r.get("orderLinkId"),
                "status": "submitted",
-        })
+            },
+        )

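Illustrative call and the two envelope shapes a caller can get back (argument names inferred from the body keys above, values assumed):

```python
async def demo(client: BybitClient) -> None:
    resp = await client.place_order(
        category="linear", symbol="BTCUSDT", side="Buy",
        order_type="Limit", qty=0.01, price=60000.0,
    )
    # retCode == 0 → {"order_id": ..., "order_link_id": ..., "status": "submitted"}
    # retCode != 0 → {"error": "<retMsg>", "code": <retCode>} via _envelope
    print(resp)
```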
    async def place_combo_order(
        self,
        category: str,
        legs: list[dict[str, Any]],
    ) -> dict:
        """Atomic multi-leg via /v5/order/create-batch (Bybit option only).

        Bybit supports batch orders only on category='option'. For
        perp/linear, use a loop of place_order (not atomic).

        legs: [{symbol, side, qty, order_type, price?, tif?, reduce_only?}].
        """
        if category != "option":
-            raise ValueError("place_combo_order: Bybit batch_order is only available on category='option'")
+            raise ValueError(
+                "place_combo_order: Bybit batch_order is only available on category='option'"
+            )
        if len(legs) < 2:
            raise ValueError("combo requires at least 2 legs")

-        import uuid
        request: list[dict[str, Any]] = []
        for leg in legs:
            entry: dict[str, Any] = {
@@ -524,7 +691,10 @@ class BybitClient:
                entry["price"] = str(leg["price"])
            request.append(entry)

-        resp = await self._run(self._http.place_batch_order, category=category, request=request)
+        body = {"category": category, "request": request}
+        resp = await self._request_signed(
+            "POST", "/v5/order/create-batch", body=body
+        )
        result_list = (resp.get("result") or {}).get("list") or []
        orders = [
            {
@@ -544,80 +714,112 @@ class BybitClient:
        new_qty: float | None = None,
        new_price: float | None = None,
    ) -> dict:
-        kwargs: dict[str, Any] = {
+        body: dict[str, Any] = {
            "category": category,
            "symbol": symbol,
            "orderId": order_id,
        }
        if new_qty is not None:
-            kwargs["qty"] = str(new_qty)
+            body["qty"] = str(new_qty)
        if new_price is not None:
-            kwargs["price"] = str(new_price)
-        resp = await self._run(self._http.amend_order, **kwargs)
+            body["price"] = str(new_price)
+        resp = await self._request_signed("POST", "/v5/order/amend", body=body)
        r = resp.get("result") or {}
-        return self._envelope(resp, {
+        return self._envelope(
+            resp,
+            {
                "order_id": r.get("orderId", order_id),
                "status": "amended",
-        })
-
-    async def cancel_order(
-        self, category: str, symbol: str, order_id: str
-    ) -> dict:
-        resp = await self._run(
-            self._http.cancel_order,
-            category=category, symbol=symbol, orderId=order_id,
+            },
        )
+
+    async def cancel_order(self, category: str, symbol: str, order_id: str) -> dict:
+        body = {"category": category, "symbol": symbol, "orderId": order_id}
+        resp = await self._request_signed("POST", "/v5/order/cancel", body=body)
        r = resp.get("result") or {}
-        return self._envelope(resp, {
+        return self._envelope(
+            resp,
+            {
                "order_id": r.get("orderId", order_id),
                "status": "cancelled",
-        })
+            },
+        )

    async def cancel_all_orders(
        self, category: str, symbol: str | None = None
    ) -> dict:
-        kwargs: dict[str, Any] = {"category": category}
+        body: dict[str, Any] = {"category": category}
        if symbol:
-            kwargs["symbol"] = symbol
-        resp = await self._run(self._http.cancel_all_orders, **kwargs)
+            body["symbol"] = symbol
+        resp = await self._request_signed(
+            "POST", "/v5/order/cancel-all", body=body
+        )
        r = resp.get("result") or {}
        ids = [x.get("orderId") for x in (r.get("list") or [])]
-        return self._envelope(resp, {
+        return self._envelope(
+            resp,
+            {
                "cancelled_ids": ids,
                "count": len(ids),
-        })
+            },
+        )

    async def set_stop_loss(
-        self, category: str, symbol: str, stop_loss: float,
+        self,
+        category: str,
+        symbol: str,
+        stop_loss: float,
        position_idx: int = 0,
    ) -> dict:
-        resp = await self._run(
-            self._http.set_trading_stop,
-            category=category, symbol=symbol,
-            stopLoss=str(stop_loss), positionIdx=position_idx,
+        body = {
+            "category": category,
+            "symbol": symbol,
+            "stopLoss": str(stop_loss),
+            "positionIdx": position_idx,
+        }
+        resp = await self._request_signed(
+            "POST", "/v5/position/trading-stop", body=body
        )
-        return self._envelope(resp, {
-            "symbol": symbol, "stop_loss": stop_loss,
+        return self._envelope(
+            resp,
+            {
+                "symbol": symbol,
+                "stop_loss": stop_loss,
                "status": "stop_loss_set",
-        })
+            },
+        )

    async def set_take_profit(
-        self, category: str, symbol: str, take_profit: float,
+        self,
+        category: str,
+        symbol: str,
+        take_profit: float,
        position_idx: int = 0,
    ) -> dict:
-        resp = await self._run(
-            self._http.set_trading_stop,
-            category=category, symbol=symbol,
-            takeProfit=str(take_profit), positionIdx=position_idx,
+        body = {
+            "category": category,
+            "symbol": symbol,
+            "takeProfit": str(take_profit),
+            "positionIdx": position_idx,
+        }
+        resp = await self._request_signed(
+            "POST", "/v5/position/trading-stop", body=body
        )
-        return self._envelope(resp, {
-            "symbol": symbol, "take_profit": take_profit,
+        return self._envelope(
+            resp,
+            {
+                "symbol": symbol,
+                "take_profit": take_profit,
                "status": "take_profit_set",
-        })
+            },
+        )

    async def close_position(self, category: str, symbol: str) -> dict:
        positions = await self.get_positions(category=category)
-        target = next((p for p in positions if p["symbol"] == symbol and (p["size"] or 0) > 0), None)
+        target = next(
+            (p for p in positions if p["symbol"] == symbol and (p["size"] or 0) > 0),
+            None,
+        )
        if not target:
            return {"error": "no_open_position", "symbol": symbol}
        close_side = "Sell" if target["side"] == "Buy" else "Buy"
@@ -634,28 +836,44 @@ class BybitClient:
    async def set_leverage(
        self, category: str, symbol: str, leverage: int
    ) -> dict:
-        resp = await self._run(
-            self._http.set_leverage,
-            category=category, symbol=symbol,
-            buyLeverage=str(leverage), sellLeverage=str(leverage),
+        body = {
+            "category": category,
+            "symbol": symbol,
+            "buyLeverage": str(leverage),
+            "sellLeverage": str(leverage),
+        }
+        resp = await self._request_signed(
+            "POST", "/v5/position/set-leverage", body=body
        )
-        return self._envelope(resp, {
-            "symbol": symbol, "leverage": leverage,
+        return self._envelope(
+            resp,
+            {
+                "symbol": symbol,
+                "leverage": leverage,
                "status": "leverage_set",
-        })
+            },
+        )

    async def switch_position_mode(
        self, category: str, symbol: str, mode: str
    ) -> dict:
        mode_code = 3 if mode.lower() == "hedge" else 0
-        resp = await self._run(
-            self._http.switch_position_mode,
-            category=category, symbol=symbol, mode=mode_code,
+        body = {
+            "category": category,
+            "symbol": symbol,
+            "mode": mode_code,
+        }
+        resp = await self._request_signed(
+            "POST", "/v5/position/switch-mode", body=body
        )
-        return self._envelope(resp, {
-            "symbol": symbol, "mode": mode,
+        return self._envelope(
+            resp,
+            {
+                "symbol": symbol,
+                "mode": mode,
                "status": "mode_switched",
-        })
+            },
+        )

    async def transfer_asset(
        self,
@@ -664,19 +882,23 @@ class BybitClient:
        from_type: str,
        to_type: str,
    ) -> dict:
-        import uuid
-        resp = await self._run(
-            self._http.create_internal_transfer,
-            transferId=str(uuid.uuid4()),
-            coin=coin,
-            amount=str(amount),
-            fromAccountType=from_type,
-            toAccountType=to_type,
+        body = {
+            "transferId": str(uuid.uuid4()),
+            "coin": coin,
+            "amount": str(amount),
+            "fromAccountType": from_type,
+            "toAccountType": to_type,
+        }
+        resp = await self._request_signed(
+            "POST", "/v5/asset/transfer/inter-transfer", body=body
        )
        r = resp.get("result") or {}
-        return self._envelope(resp, {
+        return self._envelope(
+            resp,
+            {
                "transfer_id": r.get("transferId"),
                "coin": coin,
                "amount": amount,
                "status": "submitted",
-        })
+            },
+        )

@@ -359,7 +359,6 @@ async def place_order(
        reduce_only=params.reduce_only,
        position_idx=params.position_idx,
    )
-    # TODO V2: wire audit via request.state.environment in router
    return result


@@ -370,7 +369,6 @@ async def place_combo_order(
        category=params.category,
        legs=[leg.model_dump() for leg in params.legs],
    )
-    # TODO V2: wire audit via request.state.environment in router
    return result


@@ -0,0 +1,146 @@
"""Cross-exchange historical aggregator.

Fan out a canonical (symbol, asset_class, interval, start, end) request to
every active exchange that supports the pair, then merge the results into
a single consensus candle series with per-bar divergence metrics.
"""
from __future__ import annotations

import asyncio
import datetime as _dt
from typing import Any, Literal, Protocol

from fastapi import HTTPException

from cerbero_mcp.exchanges.cross.consensus import merge_candles
from cerbero_mcp.exchanges.cross.symbol_map import (
    get_sources,
    supported_intervals,
    to_native_interval,
    to_native_symbol,
)


Environment = Literal["testnet", "mainnet"]


class _Registry(Protocol):
    async def get(self, exchange: str, env: Environment) -> Any: ...


def _iso_to_ms(s: str) -> int:
    return int(_dt.datetime.fromisoformat(
        s.replace("Z", "+00:00")
    ).timestamp() * 1000)


async def _call_bybit(client: Any, sym: str, interval: str,
                      start: str, end: str) -> dict[str, Any]:
    resp: dict[str, Any] = await client.get_historical(
        symbol=sym, category="linear", interval=interval,
        start=_iso_to_ms(start), end=_iso_to_ms(end),
    )
    return resp


async def _call_hyperliquid(client: Any, sym: str, interval: str,
                            start: str, end: str) -> dict[str, Any]:
    resp: dict[str, Any] = await client.get_historical(
        instrument=sym, start_date=start, end_date=end, resolution=interval,
    )
    return resp


async def _call_deribit(client: Any, sym: str, interval: str,
                        start: str, end: str) -> dict[str, Any]:
    resp: dict[str, Any] = await client.get_historical(
        instrument=sym, start_date=start, end_date=end, resolution=interval,
    )
    return resp


async def _call_alpaca(client: Any, sym: str, interval: str,
                       start: str, end: str) -> dict[str, Any]:
    resp: dict[str, Any] = await client.get_bars(
        symbol=sym, asset_class="stocks", interval=interval,
        start=start, end=end,
    )
    return resp


_DISPATCH = {
    "bybit": _call_bybit,
    "hyperliquid": _call_hyperliquid,
    "deribit": _call_deribit,
    "alpaca": _call_alpaca,
}


class CrossClient:
    def __init__(self, registry: _Registry, *, env: Environment):
        self._registry = registry
        self._env = env

    async def _fetch_one(
        self, exchange: str, native_sym: str, native_interval: str,
        start: str, end: str,
    ) -> tuple[str, list[dict[str, Any]] | Exception]:
        try:
            client = await self._registry.get(exchange, self._env)
            resp = await _DISPATCH[exchange](
                client, native_sym, native_interval, start, end,
            )
            return exchange, resp.get("candles", [])
        except Exception as e:  # noqa: BLE001
            return exchange, e

    async def get_historical(
        self, *, symbol: str, asset_class: str, interval: str,
        start_date: str, end_date: str,
    ) -> dict[str, Any]:
        sources = get_sources(asset_class, symbol)
        if not sources:
            raise HTTPException(
                status_code=400,
                detail=f"unsupported symbol/asset_class: {symbol} ({asset_class})",
            )
        if interval not in supported_intervals():
            raise HTTPException(
                status_code=400,
                detail=f"unsupported interval: {interval}; "
                f"supported: {supported_intervals()}",
            )

        tasks = [
            self._fetch_one(
                ex,
                to_native_symbol(asset_class, symbol, ex),
                to_native_interval(interval, ex),
                start_date, end_date,
            )
            for ex in sources
        ]
        results = await asyncio.gather(*tasks)

        by_source: dict[str, list[dict[str, Any]]] = {}
        failed: list[dict[str, str]] = []
        for ex, payload in results:
            if isinstance(payload, Exception):
                failed.append(
                    {"exchange": ex, "error": f"{type(payload).__name__}: {payload}"}
                )
            else:
                by_source[ex] = payload

        if not by_source:
            raise HTTPException(
                status_code=502,
                detail={"error": "all sources failed", "failed_sources": failed},
            )

        return {
            "symbol": symbol.upper(),
            "asset_class": asset_class,
            "interval": interval,
            "candles": merge_candles(by_source),
            "sources_used": sorted(by_source.keys()),
            "failed_sources": failed,
        }
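A minimal wiring sketch: anything exposing `async def get(exchange, env)` satisfies `_Registry`, so the aggregator can be exercised against the app's own client registry (the `registry` object and date values here are assumptions):

```python
async def demo(registry) -> None:  # registry: the app's exchange registry
    cross = CrossClient(registry, env="testnet")
    out = await cross.get_historical(
        symbol="BTC", asset_class="crypto", interval="1h",
        start_date="2024-01-01T00:00:00Z", end_date="2024-01-02T00:00:00Z",
    )
    print(out["sources_used"], len(out["candles"]), out["failed_sources"])
```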
@@ -0,0 +1,37 @@
"""Pure consensus aggregation: merge per-source OHLCV candles by timestamp.

The output is a single time-series with the median OHLC across sources,
mean volume, the contributing source count, and a divergence % computed
on the close range. div_pct is a quick quality signal: 0 means full
agreement; larger values mean at least one source is suspect.
"""
from __future__ import annotations

from collections import defaultdict
from statistics import median
from typing import Any


def merge_candles(by_source: dict[str, list[dict[str, Any]]]) -> list[dict[str, Any]]:
    grouped: dict[int, list[dict[str, Any]]] = defaultdict(list)
    for candles in by_source.values():
        for c in candles:
            grouped[int(c["timestamp"])].append(c)

    out: list[dict[str, Any]] = []
    for ts in sorted(grouped):
        rows = grouped[ts]
        closes = [float(r["close"]) for r in rows]
        med_close = float(median(closes))
        div_pct = (max(closes) - min(closes)) / med_close if med_close else 0.0
        out.append({
            "timestamp": ts,
            "open": float(median(float(r["open"]) for r in rows)),
            "high": float(median(float(r["high"]) for r in rows)),
            "low": float(median(float(r["low"]) for r in rows)),
            "close": med_close,
            "volume": sum(float(r["volume"]) for r in rows) / len(rows),
            "sources": len(rows),
            "div_pct": div_pct,
        })
    return out
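Worked example with two sources reporting the same bar (numbers made up):

```python
by_source = {
    "bybit": [{"timestamp": 1700000000000, "open": "100", "high": "101",
               "low": "99", "close": "100.5", "volume": "10"}],
    "hyperliquid": [{"timestamp": 1700000000000, "open": 100.2, "high": 101.1,
                     "low": 99.1, "close": 100.7, "volume": 12.0}],
}
merged = merge_candles(by_source)
# One bar: close = median(100.5, 100.7) = 100.6, volume = (10 + 12) / 2 = 11.0,
# sources = 2, div_pct = (100.7 - 100.5) / 100.6 ≈ 0.002
```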
@@ -0,0 +1,60 @@
"""Routing table: canonical (asset_class, symbol, interval) → per-exchange native.

Crypto canonical symbols default to USD/USDT-quoted perpetuals on the most
liquid pair available. Equities currently route to Alpaca only — IBKR is
omitted from the cross MVP because its bars endpoint takes a relative
period instead of (start, end).
"""
from __future__ import annotations

AssetClass = str

_CRYPTO_SYMBOLS: dict[str, dict[str, str]] = {
    "BTC": {"bybit": "BTCUSDT", "hyperliquid": "BTC", "deribit": "BTC-PERPETUAL"},
    "ETH": {"bybit": "ETHUSDT", "hyperliquid": "ETH", "deribit": "ETH-PERPETUAL"},
    "SOL": {"bybit": "SOLUSDT", "hyperliquid": "SOL"},
}

_STOCK_SYMBOLS: dict[str, dict[str, str]] = {
    "AAPL": {"alpaca": "AAPL"},
    "SPY": {"alpaca": "SPY"},
    "QQQ": {"alpaca": "QQQ"},
    "TSLA": {"alpaca": "TSLA"},
    "NVDA": {"alpaca": "NVDA"},
}

_SYMBOLS: dict[AssetClass, dict[str, dict[str, str]]] = {
    "crypto": _CRYPTO_SYMBOLS,
    "stocks": _STOCK_SYMBOLS,
}

_INTERVALS: dict[str, dict[str, str]] = {
    "1m": {"bybit": "1", "hyperliquid": "1m", "deribit": "1m", "alpaca": "1m"},
    "5m": {"bybit": "5", "hyperliquid": "5m", "deribit": "5m", "alpaca": "5m"},
    "15m": {"bybit": "15", "hyperliquid": "15m", "deribit": "15m", "alpaca": "15m"},
    "1h": {"bybit": "60", "hyperliquid": "1h", "deribit": "1h", "alpaca": "1h"},
    "4h": {"bybit": "240", "hyperliquid": "4h", "deribit": "4h", "alpaca": "4h"},
    "1d": {"bybit": "D", "hyperliquid": "1d", "deribit": "1d", "alpaca": "1d"},
}


def get_sources(asset_class: AssetClass, symbol: str) -> list[str]:
    table = _SYMBOLS.get(asset_class, {})
    mapping = table.get(symbol.upper())
    if mapping is None:
        return []
    return list(mapping.keys())


def to_native_symbol(
    asset_class: AssetClass, symbol: str, exchange: str
) -> str:
    return _SYMBOLS[asset_class][symbol.upper()][exchange]


def to_native_interval(interval: str, exchange: str) -> str:
    return _INTERVALS[interval][exchange]


def supported_intervals() -> list[str]:
    return list(_INTERVALS.keys())
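Lookup examples straight from the tables above:

```python
get_sources("crypto", "btc")                  # ['bybit', 'hyperliquid', 'deribit']
to_native_symbol("crypto", "BTC", "deribit")  # 'BTC-PERPETUAL'
to_native_interval("4h", "bybit")             # '240'
supported_intervals()                         # ['1m', '5m', '15m', '1h', '4h', '1d']
```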
@@ -0,0 +1,28 @@
"""Pydantic schemas + thin tool wrappers for the /mcp-cross router."""
from __future__ import annotations

from typing import Literal

from pydantic import BaseModel

from cerbero_mcp.exchanges.cross.client import CrossClient

AssetClass = Literal["crypto", "stocks"]


class GetHistoricalReq(BaseModel):
    symbol: str
    asset_class: AssetClass = "crypto"
    interval: str = "1h"
    start_date: str
    end_date: str


async def get_historical(client: CrossClient, params: GetHistoricalReq) -> dict:
    return await client.get_historical(
        symbol=params.symbol,
        asset_class=params.asset_class,
        interval=params.interval,
        start_date=params.start_date,
        end_date=params.end_date,
    )
@@ -1,15 +1,37 @@
from __future__ import annotations

import contextlib
import json
import time
from dataclasses import dataclass, field
from typing import Any

+from fastapi import HTTPException
+
from cerbero_mcp.common import indicators as ind
from cerbero_mcp.common import microstructure as micro
from cerbero_mcp.common import options as opt
+from cerbero_mcp.common.candles import validate_candles
from cerbero_mcp.common.http import async_client


+def _parse_deribit_response(resp: Any) -> dict[str, Any]:
+    """Map Deribit upstream errors to a clean HTTP 502 (retryable) instead of
+    leaking JSONDecodeError when the body is HTML (e.g. Cloudflare 5xx page)."""
+    if resp.status_code >= 500:
+        raise HTTPException(
+            status_code=502,
+            detail=f"Deribit upstream HTTP {resp.status_code}",
+        )
+    try:
+        data: dict[str, Any] = resp.json()
+        return data
+    except json.JSONDecodeError as e:
+        raise HTTPException(
+            status_code=502,
+            detail=f"Deribit upstream returned non-JSON (status {resp.status_code})",
+        ) from e
+
BASE_LIVE = "https://www.deribit.com/api/v2"
BASE_TESTNET = "https://test.deribit.com/api/v2"

@@ -23,6 +45,10 @@ RESOLUTION_MAP = {
}


+class DeribitAuthError(Exception):
+    """Deribit auth failed (bad credentials, missing scope, env mismatch)."""
+
+
@dataclass
class DeribitClient:
    client_id: str
@@ -49,7 +75,14 @@ class DeribitClient:
        }
        async with async_client(timeout=15.0) as http:
            resp = await http.get(url, params=params)
-            data = resp.json()
+            data = _parse_deribit_response(resp)
+        if "result" not in data:
+            error = data.get("error", {})
+            msg = error.get("message", str(data)) if isinstance(error, dict) else str(error)
+            code = error.get("code") if isinstance(error, dict) else None
+            raise DeribitAuthError(
+                f"Deribit auth failed (code={code}, env={'testnet' if self.testnet else 'mainnet'}): {msg}"
+            )
        result = data["result"]
        self._token = result["access_token"]
        self._token_expires_at = time.monotonic() + result.get("expires_in", 900) - 30
@@ -63,7 +96,10 @@ class DeribitClient:
    async def _request(self, method: str, params: dict[str, Any] | None = None) -> dict:
        is_private = method.startswith("private/")
        if is_private:
-            await self._get_token()
+            try:
+                await self._get_token()
+            except DeribitAuthError as e:
+                return {"result": None, "error": str(e)}

        url = f"{self.base_url}/{method}"
        request_params = dict(params) if params else {}
@@ -73,7 +109,7 @@ class DeribitClient:

        async with async_client(timeout=15.0) as http:
            resp = await http.get(url, params=request_params, headers=headers)
-            data = resp.json()
+            data = _parse_deribit_response(resp)

        if "result" not in data:
            error = data.get("error", {})
@@ -85,18 +121,22 @@ class DeribitClient:
                await self._authenticate()
                headers["Authorization"] = f"Bearer {self._token}"
                resp = await http.get(url, params=request_params, headers=headers)
-                data = resp.json()
+                data = _parse_deribit_response(resp)
                if "result" in data:
-                    return data  # type: ignore[no-any-return]
+                    return data
            return {"result": None, "error": error_msg}

-        return data  # type: ignore[no-any-return]
+        return data

    # ── Read tools ───────────────────────────────────────────────

    def is_testnet(self) -> dict:
        return {"testnet": self.testnet, "base_url": self.base_url}

+    async def health(self) -> dict:
+        """Minimal probe for /health/ready: no network call."""
+        return {"status": "ok", "testnet": self.testnet}
+
    async def get_ticker(self, instrument_name: str) -> dict:
        import datetime as _dt
        raw = await self._request("public/ticker", {"instrument_name": instrument_name})
@@ -303,12 +343,12 @@ class DeribitClient:
        r = raw.get("result")
        if not r:
            return {
-                "equity": 0,
-                "balance": 0,
-                "margin_balance": 0,
-                "available_funds": 0,
-                "unrealized_pnl": 0,
-                "total_pnl": 0,
+                "equity": None,
+                "balance": None,
+                "margin_balance": None,
+                "available_funds": None,
+                "unrealized_pnl": None,
+                "total_pnl": None,
                "testnet": self.testnet,
                "error": raw.get("error", "no result"),
            }
@@ -380,24 +420,24 @@ class DeribitClient:
            },
        )
        r = raw.get("result") or {}
-        candles = []
        ticks = r.get("ticks", []) or []
        opens = r.get("open", []) or []
        highs = r.get("high", []) or []
        lows = r.get("low", []) or []
        closes = r.get("close", []) or []
        volumes = r.get("volume", []) or []
-        for idx, ts in enumerate(ticks):
-            if idx >= min(len(opens), len(highs), len(lows), len(closes), len(volumes)):
-                break
-            candles.append({
-                "timestamp": ts,
-                "open": opens[idx],
-                "high": highs[idx],
-                "low": lows[idx],
-                "close": closes[idx],
-                "volume": volumes[idx],
-            })
+        n = min(len(ticks), len(opens), len(highs), len(lows), len(closes), len(volumes))
+        candles = validate_candles([
+            {
+                "timestamp": ticks[i],
+                "open": opens[i],
+                "high": highs[i],
+                "low": lows[i],
+                "close": closes[i],
+                "volume": volumes[i],
+            }
+            for i in range(n)
+        ])
        return {"candles": candles}

    async def get_dvol(
@@ -481,7 +481,6 @@ async def place_order(
        post_only=params.post_only,
        label=params.label,
    )
-    # TODO V2: wire audit via request.state.environment in router
    return result


@@ -502,29 +501,24 @@ async def place_combo_order(
        price=params.price,
        label=params.label,
    )
-    # TODO V2: wire audit via request.state.environment in router
    return result


async def cancel_order(client: DeribitClient, params: CancelOrderReq) -> dict:
    result = await client.cancel_order(params.order_id)
-    # TODO V2: wire audit via request.state.environment in router
    return result


async def set_stop_loss(client: DeribitClient, params: SetStopLossReq) -> dict:
    result = await client.set_stop_loss(params.order_id, params.stop_price)
-    # TODO V2: wire audit via request.state.environment in router
    return result


async def set_take_profit(client: DeribitClient, params: SetTakeProfitReq) -> dict:
    result = await client.set_take_profit(params.order_id, params.tp_price)
-    # TODO V2: wire audit via request.state.environment in router
    return result


async def close_position(client: DeribitClient, params: ClosePositionReq) -> dict:
    result = await client.close_position(params.instrument_name)
-    # TODO V2: wire audit via request.state.environment in router
    return result

@@ -1,12 +1,33 @@
-"""Hyperliquid REST API client for perpetual futures trading."""
+"""Hyperliquid REST API client for perpetual futures trading.
+
+Pure ``httpx`` + ``eth-account`` implementation: no dependency on
+``hyperliquid-python-sdk``. Read endpoints hit ``POST /info`` (no auth);
+write endpoints hit ``POST /exchange`` and require an EIP-712 L1 signature.
+
+The signing scheme is bit-for-bit equivalent to the canonical SDK:
+
+    action_hash = keccak( msgpack(action) || nonce[u64 BE] || vault_marker
+                          || (expires_after marker || expires_after[u64 BE])? )
+    phantom = {"source": "a"|"b", "connectionId": action_hash}  # a=mainnet, b=testnet
+    EIP-712 domain: name="Exchange", version="1", chainId=1337,
+                    verifyingContract=0x0
+"""

from __future__ import annotations

import asyncio
import datetime as _dt
+import time as _time
+from decimal import Decimal
from typing import Any

+import httpx
+import msgpack
+from eth_account import Account
+from eth_account.messages import encode_typed_data
+from eth_utils import keccak, to_hex
+
from cerbero_mcp.common import indicators as ind
+from cerbero_mcp.common.candles import validate_candles
from cerbero_mcp.common.http import async_client

BASE_LIVE = "https://api.hyperliquid.xyz"
@@ -21,14 +42,8 @@ RESOLUTION_MAP = {
    "1d": "1d",
}

-try:
-    from eth_account import Account
-    from hyperliquid.exchange import Exchange
-    from hyperliquid.utils import constants as hl_constants
-
-    _SDK_AVAILABLE = True
-except ImportError:  # pragma: no cover
-    _SDK_AVAILABLE = False
+# Slippage used for market orders / market_close (parity with the SDK).
+DEFAULT_SLIPPAGE = 0.05


def _to_ms(date_str: str) -> int:
@@ -39,11 +54,91 @@ def _to_ms(date_str: str) -> int:
    return int(dt.timestamp() * 1000)


-class HyperliquidClient:
-    """Async client for the Hyperliquid API.
+def _float_to_wire(x: float) -> str:
+    """Convert a price/size float to Hyperliquid wire string format.

-    Read operations use direct HTTP calls via httpx against /info.
-    Write operations delegate to hyperliquid-python-sdk for EIP-712 signing.
+    8 decimal places, no trailing zeros (matching SDK ``float_to_wire``).
    """
+    rounded = f"{x:.8f}"
+    if abs(float(rounded) - x) >= 1e-12:
+        raise ValueError("float_to_wire causes rounding", x)
+    if rounded == "-0":
+        rounded = "0"
+    normalized = Decimal(rounded).normalize()
+    return f"{normalized:f}"
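Example values, easy to verify by hand:

```python
_float_to_wire(0.05)     # '0.05'   — trailing zeros stripped
_float_to_wire(100.0)    # '100'    — the ':f' format keeps normalize() from emitting '1E+2'
_float_to_wire(1234.5)   # '1234.5'
# _float_to_wire(0.123456789) raises ValueError: needs more than 8 decimals.
```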

+def _address_to_bytes(address: str) -> bytes:
+    return bytes.fromhex(address.removeprefix("0x"))
+
+
+def _action_hash(
+    action: Any,
+    vault_address: str | None,
+    nonce: int,
+    expires_after: int | None,
+) -> bytes:
+    """Deterministic action hash (msgpack + nonce + vault + expires)."""
+    data = msgpack.packb(action)
+    data += nonce.to_bytes(8, "big")
+    if vault_address is None:
+        data += b"\x00"
+    else:
+        data += b"\x01"
+        data += _address_to_bytes(vault_address)
+    if expires_after is not None:
+        data += b"\x00"
+        data += expires_after.to_bytes(8, "big")
+    return keccak(data)
+
+
+def _l1_payload(phantom_agent: dict[str, Any]) -> dict[str, Any]:
+    return {
+        "domain": {
+            "chainId": 1337,
+            "name": "Exchange",
+            "verifyingContract": "0x0000000000000000000000000000000000000000",
+            "version": "1",
+        },
+        "types": {
+            "Agent": [
+                {"name": "source", "type": "string"},
+                {"name": "connectionId", "type": "bytes32"},
+            ],
+            "EIP712Domain": [
+                {"name": "name", "type": "string"},
+                {"name": "version", "type": "string"},
+                {"name": "chainId", "type": "uint256"},
+                {"name": "verifyingContract", "type": "address"},
+            ],
+        },
+        "primaryType": "Agent",
+        "message": phantom_agent,
+    }
+
+
+def _sign_l1_action(
+    private_key: str,
+    action: Any,
+    vault_address: str | None,
+    nonce: int,
+    expires_after: int | None,
+    is_mainnet: bool,
+) -> dict[str, Any]:
+    h = _action_hash(action, vault_address, nonce, expires_after)
+    phantom_agent = {"source": "a" if is_mainnet else "b", "connectionId": h}
+    payload = _l1_payload(phantom_agent)
+    encoded = encode_typed_data(full_message=payload)
+    signed = Account.from_key(private_key).sign_message(encoded)
+    return {"r": to_hex(signed["r"]), "s": to_hex(signed["s"]), "v": signed["v"]}
||||
|
||||
class HyperliquidClient:
|
||||
"""Async client for the Hyperliquid REST API.
|
||||
|
||||
Read operations call ``POST /info`` directly via ``httpx``.
|
||||
Write operations build an EIP-712 L1 signature in-process (no SDK)
|
||||
and call ``POST /exchange``.
|
||||
"""
|
||||
|
||||
def __init__(
|
||||
@@ -63,53 +158,99 @@ class HyperliquidClient:
|
||||
self.base_url = base_url
|
||||
else:
|
||||
self.base_url = BASE_TESTNET if testnet else BASE_LIVE
|
||||
self._exchange: Any | None = None
|
||||
self._is_mainnet = self.base_url == BASE_LIVE
|
||||
self.vault_address: str | None = None
|
||||
# Persistent async client (riutilizzato per /exchange e /info).
|
||||
self._http: httpx.AsyncClient | None = None
|
||||
# Cache name → asset id (perp universe).
|
||||
self._name_to_asset: dict[str, int] | None = None
|
||||
|
||||
# ── SDK exchange (lazy) ────────────────────────────────────
|
||||
|
||||
def _get_exchange(self) -> Any:
|
||||
"""Return (and cache) an SDK Exchange instance for write ops."""
|
||||
if not _SDK_AVAILABLE:
|
||||
raise RuntimeError(
|
||||
"hyperliquid-python-sdk is not installed; write operations unavailable."
|
||||
)
|
||||
if self._exchange is None:
|
||||
account = Account.from_key(self.private_key)
|
||||
if self._base_url_override:
|
||||
sdk_base_url = self._base_url_override
|
||||
else:
|
||||
sdk_base_url = (
|
||||
hl_constants.TESTNET_API_URL if self.testnet else hl_constants.MAINNET_API_URL
|
||||
)
|
||||
empty_spot_meta: dict[str, Any] = {"universe": [], "tokens": []}
|
||||
self._exchange = Exchange(
|
||||
account,
|
||||
sdk_base_url,
|
||||
account_address=self.wallet_address,
|
||||
spot_meta=empty_spot_meta,
|
||||
)
|
||||
return self._exchange
|
||||
async def aclose(self) -> None:
|
||||
"""Close the underlying HTTP client (if any)."""
|
||||
if self._http is not None:
|
||||
await self._http.aclose()
|
||||
self._http = None
|
||||
|
||||
# ── Internal helpers ───────────────────────────────────────
|
||||
|
||||
async def _post(self, payload: dict[str, Any]) -> Any:
|
||||
"""POST JSON to the /info endpoint."""
|
||||
async def _post_info(self, payload: dict[str, Any]) -> Any:
|
||||
"""POST a JSON payload to ``/info`` (read-only, no auth)."""
|
||||
async with async_client(timeout=15.0) as http:
|
||||
resp = await http.post(f"{self.base_url}/info", json=payload)
|
||||
resp.raise_for_status()
|
||||
return resp.json()
|
||||
|
||||
# backward-compat alias (interno).
|
||||
async def _post(self, payload: dict[str, Any]) -> Any:
|
||||
return await self._post_info(payload)
|
||||
|
||||
    async def _post_exchange(
        self,
        action: dict[str, Any],
        nonce: int | None = None,
        vault_address: str | None = None,
    ) -> Any:
        """Sign and POST an action to ``/exchange``."""
        if nonce is None:
            nonce = int(_time.time() * 1000)
        vault = vault_address if vault_address is not None else self.vault_address
        signature = _sign_l1_action(
            self.private_key,
            action,
            vault,
            nonce,
            None,  # expires_after: not used here
            self._is_mainnet,
        )
        payload: dict[str, Any] = {
            "action": action,
            "nonce": nonce,
            "signature": signature,
            "vaultAddress": vault,
            "expiresAfter": None,
        }
        async with async_client(timeout=15.0) as http:
            resp = await http.post(f"{self.base_url}/exchange", json=payload)
            resp.raise_for_status()
            return resp.json()

    async def _name_to_asset_id(self, name: str) -> int:
        """Resolve a perp coin name (e.g. ``BTC``) to its asset id.

        The asset id is the index in the ``meta.universe`` array. Cached
        per-client; refreshed if the requested name is missing.
        """
        upper = name.upper()
        if self._name_to_asset is None or upper not in self._name_to_asset:
            meta = await self._post_info({"type": "meta"})
            universe = meta.get("universe", [])
            self._name_to_asset = {
                m["name"].upper(): idx for idx, m in enumerate(universe)
            }
        if upper not in self._name_to_asset:
            raise ValueError(f"Unknown asset: {name}")
        return self._name_to_asset[upper]

    @staticmethod
    async def _run_sync(func: Any, *args: Any, **kwargs: Any) -> Any:
        """Run a synchronous SDK call in the default executor."""
        loop = asyncio.get_running_loop()
        return await loop.run_in_executor(None, lambda: func(*args, **kwargs))

    @staticmethod
    def _order_type_to_wire(order_type: dict[str, Any]) -> dict[str, Any]:
        if "limit" in order_type:
            return {"limit": order_type["limit"]}
        if "trigger" in order_type:
            t = order_type["trigger"]
            return {
                "trigger": {
                    "isMarket": t["isMarket"],
                    "triggerPx": _float_to_wire(float(t["triggerPx"])),
                    "tpsl": t["tpsl"],
                }
            }
        raise ValueError("Invalid order type", order_type)
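    # Sketch of the wire conversion above (values illustrative, and assuming
    # ``_float_to_wire`` renders the float as its canonical decimal string):
    # a limit order type passes through unchanged, a trigger gets its price
    # normalized.
    #
    #   _order_type_to_wire({"limit": {"tif": "Gtc"}})
    #   # → {"limit": {"tif": "Gtc"}}
    #   _order_type_to_wire({"trigger": {"triggerPx": 64250.0, "isMarket": True, "tpsl": "sl"}})
    #   # → {"trigger": {"isMarket": True, "triggerPx": "64250", "tpsl": "sl"}}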
    # ── Read tools ─────────────────────────────────────────────

    async def get_markets(self) -> list[dict[str, Any]]:
        """List all perp markets with metadata and current stats."""
        data = await self._post({"type": "metaAndAssetCtxs"})
        data = await self._post_info({"type": "metaAndAssetCtxs"})
        universe = data[0]["universe"]
        ctx_list = data[1]
        markets = []
@@ -144,7 +285,7 @@ class HyperliquidClient:

    async def get_orderbook(self, instrument: str, depth: int = 10) -> dict[str, Any]:
        """Get L2 order book for an asset."""
        data = await self._post({"type": "l2Book", "coin": instrument.upper()})
        data = await self._post_info({"type": "l2Book", "coin": instrument.upper()})
        levels = data.get("levels", [[], []])
        bids = [{"price": float(b["px"]), "size": float(b["sz"])} for b in levels[0][:depth]]
        asks = [{"price": float(a["px"]), "size": float(a["sz"])} for a in levels[1][:depth]]
@@ -152,7 +293,7 @@ class HyperliquidClient:

    async def get_positions(self) -> list[dict[str, Any]]:
        """Get open positions for the wallet."""
        data = await self._post(
        data = await self._post_info(
            {"type": "clearinghouseState", "user": self.wallet_address}
        )
        positions = []
@@ -184,9 +325,9 @@ class HyperliquidClient:
        """Get account summary (equity, balance, margin) including spot balances.

        With a Unified Account, spot USDC and perps share collateral.
        `spot_fetch_ok` / `perps_fetch_ok` indicate whether the two sides were
        ``spot_fetch_ok`` / ``perps_fetch_ok`` indicate whether the two sides were
        read correctly: if either of the two is False the caller should
        treat `equity`/`available_balance` as a lower bound.
        treat ``equity``/``available_balance`` as a lower bound.
        """
        perps_fetch_ok = True
        perps_equity = 0.0
@@ -194,7 +335,7 @@ class HyperliquidClient:
        margin_used = 0.0
        unrealized_pnl = 0.0
        try:
            data = await self._post(
            data = await self._post_info(
                {"type": "clearinghouseState", "user": self.wallet_address}
            )
            margin = data.get("marginSummary") or {}
@@ -208,7 +349,7 @@ class HyperliquidClient:
        spot_fetch_ok = True
        spot_usdc = 0.0
        try:
            spot_data = await self._post(
            spot_data = await self._post_info(
                {"type": "spotClearinghouseState", "user": self.wallet_address}
            )
            for b in spot_data.get("balances", []) or []:
@@ -233,7 +374,9 @@ class HyperliquidClient:

    async def get_trade_history(self, limit: int = 100) -> list[dict[str, Any]]:
        """Get recent trade fills."""
        data = await self._post({"type": "userFills", "user": self.wallet_address})
        data = await self._post_info(
            {"type": "userFills", "user": self.wallet_address}
        )
        trades = []
        for t in data[:limit]:
            trades.append(
@@ -255,7 +398,7 @@ class HyperliquidClient:
        start_ms = _to_ms(start_date)
        end_ms = _to_ms(end_date)
        interval = RESOLUTION_MAP.get(resolution, resolution)
        data = await self._post(
        data = await self._post_info(
            {
                "type": "candleSnapshot",
                "req": {
@@ -266,23 +409,24 @@ class HyperliquidClient:
                },
            }
        )
        candles = []
        for c in data:
            candles.append(
        candles = validate_candles([
            {
                "timestamp": c.get("t", 0),
                "open": float(c.get("o", 0)),
                "high": float(c.get("h", 0)),
                "low": float(c.get("l", 0)),
                "close": float(c.get("c", 0)),
                "volume": float(c.get("v", 0)),
                "timestamp": c.get("t"),
                "open": c.get("o"),
                "high": c.get("h"),
                "low": c.get("l"),
                "close": c.get("c"),
                "volume": c.get("v"),
            }
            )
            for c in data
        ])
        return {"candles": candles}

    async def get_open_orders(self) -> list[dict[str, Any]]:
        """Get all open orders for the wallet."""
        data = await self._post({"type": "openOrders", "user": self.wallet_address})
        data = await self._post_info(
            {"type": "openOrders", "user": self.wallet_address}
        )
        orders = []
        for o in data:
            orders.append(
@@ -326,7 +470,7 @@ class HyperliquidClient:

        # Perp price + funding from HL
        try:
            ctx = await self._post({"type": "metaAndAssetCtxs"})
            ctx = await self._post_info({"type": "metaAndAssetCtxs"})
            universe = ctx[0]["universe"]
            ctx_list = ctx[1]
            perp_price = None
@@ -375,7 +519,7 @@ class HyperliquidClient:

    async def get_funding_rate(self, instrument: str) -> dict[str, Any]:
        """Get current and recent historical funding rates for an asset."""
        data = await self._post({"type": "metaAndAssetCtxs"})
        data = await self._post_info({"type": "metaAndAssetCtxs"})
        universe = data[0]["universe"]
        ctx_list = data[1]
        current_rate = None
@@ -389,7 +533,7 @@ class HyperliquidClient:
        # Fetch funding history (last 7 days)
        end_ms = int(_dt.datetime.utcnow().timestamp() * 1000)
        start_ms = end_ms - 7 * 24 * 3600 * 1000
        history_data = await self._post(
        history_data = await self._post_info(
            {
                "type": "fundingHistory",
                "coin": instrument.upper(),
@@ -443,44 +587,10 @@ class HyperliquidClient:
            result[name] = None
        return result

    # ── Write tools (via SDK) ──────────────────────────────────

    async def place_order(
        self,
        instrument: str,
        side: str,
        amount: float,
        type: str = "limit",
        price: float | None = None,
        reduce_only: bool = False,
    ) -> dict[str, Any]:
        """Place an order on Hyperliquid using the SDK for EIP-712 signing."""
        exchange = self._get_exchange()
        is_buy = side.lower() in ("buy", "long")
        coin = instrument.upper()

        if type == "market":
            ot: dict[str, Any] = {"limit": {"tif": "Ioc"}}
            if price is None:
                ticker = await self.get_ticker(coin)
                mark = ticker.get("mark_price", 0)
                price = round(mark * 1.03, 1) if is_buy else round(mark * 0.97, 1)
        elif type in ("stop_market", "stop_loss"):
            assert price is not None
            ot = {"trigger": {"triggerPx": float(price), "isMarket": True, "tpsl": "sl"}}
        elif type == "take_profit":
            assert price is not None
            ot = {"trigger": {"triggerPx": float(price), "isMarket": True, "tpsl": "tp"}}
        else:
            ot = {"limit": {"tif": "Gtc"}}

        if price is None:
            return {"error": "price is required for limit orders"}

        result = await self._run_sync(
            exchange.order, coin, is_buy, amount, price, ot, reduce_only
        )
    # ── Write tools (signed) ───────────────────────────────────

    @staticmethod
    def _parse_order_response(result: dict[str, Any]) -> dict[str, Any]:
        status = result.get("status", "unknown")
        response = result.get("response", {})
        if isinstance(response, str):
@@ -491,7 +601,6 @@ class HyperliquidClient:
                "filled_size": 0,
                "avg_fill_price": 0,
            }

        statuses = response.get("data", {}).get("statuses", [{}])
        first = statuses[0] if statuses else {}
        if isinstance(first, str):
@@ -511,12 +620,95 @@ class HyperliquidClient:
            "avg_fill_price": float(first.get("filled", {}).get("avgPx", 0)),
        }

    async def place_order(
        self,
        instrument: str,
        side: str,
        amount: float,
        type: str = "limit",
        price: float | None = None,
        reduce_only: bool = False,
    ) -> dict[str, Any]:
        """Place an order on Hyperliquid (signed via EIP-712)."""
        is_buy = side.lower() in ("buy", "long")
        coin = instrument.upper()

        if type == "market":
            order_type: dict[str, Any] = {"limit": {"tif": "Ioc"}}
            if price is None:
                ticker = await self.get_ticker(coin)
                mark = ticker.get("mark_price", 0)
                price = round(mark * 1.03, 1) if is_buy else round(mark * 0.97, 1)
        elif type in ("stop_market", "stop_loss"):
            assert price is not None
            order_type = {
                "trigger": {
                    "triggerPx": float(price),
                    "isMarket": True,
                    "tpsl": "sl",
                }
            }
        elif type == "take_profit":
            assert price is not None
            order_type = {
                "trigger": {
                    "triggerPx": float(price),
                    "isMarket": True,
                    "tpsl": "tp",
                }
            }
        else:
            order_type = {"limit": {"tif": "Gtc"}}

        if price is None:
            return {"error": "price is required for limit orders"}

        try:
            asset_id = await self._name_to_asset_id(coin)
        except ValueError as exc:
            return {"error": str(exc), "order_id": "", "filled_size": 0, "avg_fill_price": 0}

        order_wire: dict[str, Any] = {
            "a": asset_id,
            "b": is_buy,
            "p": _float_to_wire(float(price)),
            "s": _float_to_wire(float(amount)),
            "r": reduce_only,
            "t": self._order_type_to_wire(order_type),
        }
        action: dict[str, Any] = {
            "type": "order",
            "orders": [order_wire],
            "grouping": "na",
        }
        try:
            result = await self._post_exchange(action)
        except httpx.HTTPError as exc:
            return {
                "status": "error",
                "error": str(exc),
                "order_id": "",
                "filled_size": 0,
                "avg_fill_price": 0,
            }
        return self._parse_order_response(result)

    async def cancel_order(self, order_id: str, instrument: str) -> dict[str, Any]:
        """Cancel an existing order using the SDK."""
        exchange = self._get_exchange()
        result = await self._run_sync(
            exchange.cancel, instrument.upper(), int(order_id)
        )
        """Cancel an existing order via signed ``cancel`` action."""
        try:
            asset_id = await self._name_to_asset_id(instrument)
        except ValueError as exc:
            return {"order_id": order_id, "status": "error", "error": str(exc)}

        action: dict[str, Any] = {
            "type": "cancel",
            "cancels": [{"a": asset_id, "o": int(order_id)}],
        }
        try:
            result = await self._post_exchange(action)
        except httpx.HTTPError as exc:
            return {"order_id": order_id, "status": "error", "error": str(exc)}

        status = result.get("status", "unknown")
        response = result.get("response", "")
        if isinstance(response, str) and status == "err":
@@ -526,8 +718,7 @@ class HyperliquidClient:
    async def set_stop_loss(
        self, instrument: str, stop_price: float, size: float
    ) -> dict[str, Any]:
        """Set a stop-loss trigger order."""
        # Determine direction by checking open position
        """Set a stop-loss trigger order (reduce-only)."""
        positions = await self.get_positions()
        direction = "sell"  # default: assume long
        for pos in positions:
@@ -548,7 +739,7 @@ class HyperliquidClient:
    async def set_take_profit(
        self, instrument: str, tp_price: float, size: float
    ) -> dict[str, Any]:
        """Set a take-profit trigger order."""
        """Set a take-profit trigger order (reduce-only)."""
        positions = await self.get_positions()
        direction = "sell"  # default: assume long
        for pos in positions:
@@ -567,21 +758,55 @@ class HyperliquidClient:
        )

    async def close_position(self, instrument: str) -> dict[str, Any]:
        """Close an open position for the given asset using market_close."""
        exchange = self._get_exchange()
        """Close an open position using an aggressive IOC reduce-only order."""
        coin = instrument.upper()
        try:
            result = await self._run_sync(exchange.market_close, instrument.upper())
            data = await self._post_info(
                {"type": "clearinghouseState", "user": self.wallet_address}
            )
            target = None
            for ap in data.get("assetPositions", []):
                pos = ap.get("position", {})
                if (pos.get("coin") or "").upper() == coin:
                    target = pos
                    break
            if target is None:
                return {"error": f"No open position for {instrument}", "asset": instrument}

            szi = float(target.get("szi", 0) or 0)
            if szi == 0:
                return {"error": f"No open position for {instrument}", "asset": instrument}
            sz = abs(szi)
            is_buy = szi < 0  # short → buy to close

            # Slippage price: use mark price * (1 ± slippage), rounded to 5 sig figs.
            ticker = await self.get_ticker(coin)
            mark = float(ticker.get("mark_price", 0) or 0)
            if mark <= 0:
                return {"error": "missing mark price for slippage calc", "asset": instrument}
            px = mark * (1 + DEFAULT_SLIPPAGE) if is_buy else mark * (1 - DEFAULT_SLIPPAGE)
            px = round(float(f"{px:.5g}"), 6)

            result = await self.place_order(
                instrument=coin,
                side="buy" if is_buy else "sell",
                amount=sz,
                type="limit",
                price=px,
                reduce_only=True,
            )
            return {
                "status": result.get("status", "unknown"),
                "status": result.get("status", "ok"),
                "asset": instrument,
                **{k: v for k, v in result.items() if k != "status"},
            }
        except Exception as exc:
            return {"error": str(exc), "asset": instrument}

    async def health(self) -> dict[str, Any]:
        """Health check — ping /info for server status."""
        """Health check — ping ``/info`` for server status."""
        try:
            await self._post({"type": "meta"})
            await self._post_info({"type": "meta"})
            return {"status": "ok", "testnet": self.testnet}
        except Exception as exc:
            return {"status": "error", "error": str(exc)}
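    # Worked example of the slippage rounding in close_position (numbers are
    # illustrative, assuming DEFAULT_SLIPPAGE is 0.01): closing a long with
    # mark 2345.67 sells at 2345.67 * 0.99 = 2322.2133; f"{px:.5g}" keeps
    # 5 significant figures, and round(..., 6) is a no-op on the result.
    #
    #   px = 2345.67 * (1 - 0.01)          # 2322.2133
    #   px = round(float(f"{px:.5g}"), 6)  # 2322.2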

@@ -303,7 +303,6 @@ async def place_order(
        price=params.price,
        reduce_only=params.reduce_only,
    )
    # TODO V2: wire audit via request.state.environment in router
    return result


@@ -0,0 +1,433 @@
"""IBKR Client Portal Web API client (REST httpx + OAuth1a)."""
from __future__ import annotations

import logging
import time
from collections import OrderedDict
from dataclasses import dataclass, field
from typing import Any

import httpx

from cerbero_mcp.common.candles import validate_candles
from cerbero_mcp.common.http import async_client
from cerbero_mcp.exchanges.ibkr.oauth import (
    IBKRAuthError,
    OAuth1aSigner,
    _percent_encode,
)

logger = logging.getLogger(__name__)


class IBKRError(Exception):
    """Generic IBKR API error (non-auth)."""


_TICKLE_INTERVAL_S = 240.0  # tickle if last call > 4min ago

# Mapping asset_class (cerbero/MCP convention) → IBKR secType.
_SEC_TYPE_MAP: dict[str, str] = {
    "stocks": "STK",
    "options": "OPT",
    "futures": "FUT",
    "forex": "CASH",
}


@dataclass
class IBKRClient:
    signer: OAuth1aSigner
    account_id: str
    paper: bool = True
    base_url: str = "https://api.ibkr.com/v1/api"

    _conid_cache: OrderedDict[str, int] = field(
        default_factory=OrderedDict, init=False, repr=False
    )
    _last_request_at: float = field(default=0.0, init=False, repr=False)
    _http: httpx.AsyncClient | None = field(default=None, init=False, repr=False)

    _CONID_CACHE_MAX = 1024

    def __post_init__(self) -> None:
        # IBKR Client Portal gateway latency is higher than crypto exchanges
        # (lookup roundtrip + session validation); 30s matches Alpaca's choice.
        self._http = async_client(timeout=30.0)

    async def aclose(self) -> None:
        if self._http and not self._http.is_closed:
            await self._http.aclose()

    async def health(self) -> dict[str, Any]:
        return {"status": "ok", "paper": self.paper}

    def is_testnet(self) -> dict[str, Any]:
        return {"testnet": self.paper, "base_url": self.base_url}

    async def _build_auth_header(self, method: str, url: str) -> str:
        await self.signer.get_live_session_token(base_url=self.base_url)
        params = self.signer.make_oauth_params()
        params["oauth_signature_method"] = "HMAC-SHA256"
        sig = self.signer.sign_with_lst(method, url, params)
        params["oauth_signature"] = sig
        return "OAuth realm=\"limited_poa\", " + ", ".join(
            f'{k}="{_percent_encode(v)}"' for k, v in sorted(params.items())
        )

    async def _maybe_tickle(self) -> None:
        if time.monotonic() - self._last_request_at < _TICKLE_INTERVAL_S:
            return
        if self._http is None:  # pragma: no cover
            return
        try:
            url = f"{self.base_url}/tickle"
            auth = await self._build_auth_header("POST", url)
            await self._http.post(url, headers={"Authorization": auth})
        except Exception as exc:
            # Best-effort: failure shouldn't block the real request, but log
            # so misconfigured signer / dead session aren't invisible.
            logger.debug("ibkr tickle best-effort failed: %s", exc)

    async def _force_lst_refresh(self) -> None:
        """Invalidate the cached LST so it is re-minted on the next call."""
        self.signer._live_session_token = None
        self.signer._lst_expires_at = 0.0

    async def _request(
        self,
        method: str,
        path: str,
        *,
        params: dict[str, Any] | None = None,
        json_body: dict[str, Any] | None = None,
        skip_tickle: bool = False,
        _retried_auth: bool = False,
    ) -> Any:
        if self._http is None:  # pragma: no cover — set in __post_init__
            raise IBKRError("http client not initialized")
        if not skip_tickle:
            await self._maybe_tickle()
        url = f"{self.base_url}{path}"
        auth = await self._build_auth_header(method, url)
        clean_params = (
            {k: v for k, v in params.items() if v is not None}
            if params else None
        )
        resp = await self._http.request(
            method, url,
            params=clean_params or None,
            json=json_body,
            headers={"Authorization": auth, "User-Agent": "cerbero-mcp/2.0"},
        )
        self._last_request_at = time.monotonic()
        if resp.status_code == 401 and not _retried_auth:
            # Retry once with fresh LST (per spec: IBKR_AUTH_FAILED retryable)
            logger.info("ibkr 401 on %s %s — refreshing LST and retrying once", method, path)
            await self._force_lst_refresh()
            return await self._request(
                method, path, params=params, json_body=json_body,
                skip_tickle=skip_tickle, _retried_auth=True,
            )
        if resp.status_code == 401:
            raise IBKRAuthError(f"401 on {method} {path} (after retry): {resp.text[:200]}")
        if resp.status_code == 429:
            raise IBKRError(f"IBKR_RATE_LIMITED: {resp.text[:200]}")
        if resp.status_code >= 500:
            raise IBKRError(f"IBKR_SERVER_ERROR status={resp.status_code}")
        if resp.status_code >= 400:
            raise IBKRError(f"IBKR_HTTP_{resp.status_code}: {resp.text[:300]}")
        if not resp.content:
            return {}
        return resp.json()

    async def get_account(self) -> dict:
        return await self._request("GET", f"/portfolio/{self.account_id}/summary")

    # ── Conid resolution ────────────────────────────────────────

    async def resolve_conid(self, symbol: str, sec_type: str = "STK") -> int:
        key = f"{sec_type}:{symbol}"
        if key in self._conid_cache:
            self._conid_cache.move_to_end(key)
            return self._conid_cache[key]
        result = await self._request(
            "GET", "/trsrv/secdef/search",
            params={"symbol": symbol, "secType": sec_type},
        )
        if not result or not isinstance(result, list):
            raise IBKRError(f"IBKR_CONID_NOT_FOUND: {symbol}/{sec_type}")
        first = result[0]
        if not isinstance(first, dict) or "conid" not in first:
            raise IBKRError(
                f"IBKR_CONID_NOT_FOUND: {symbol}/{sec_type} (malformed response)"
            )
        conid = int(first["conid"])
        self._conid_cache[key] = conid
        if len(self._conid_cache) > self._CONID_CACHE_MAX:
            self._conid_cache.popitem(last=False)
        return conid
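    # Cache behavior sketch (symbols are hypothetical): the OrderedDict acts
    # as an LRU; hits are moved to the end, and once _CONID_CACHE_MAX entries
    # are exceeded, popitem(last=False) evicts the coldest key.
    #
    #   await client.resolve_conid("AAPL")  # miss → /trsrv/secdef/search, cached
    #   await client.resolve_conid("AAPL")  # hit → served from _conid_cache
    #   # after 1024 distinct "SECTYPE:SYMBOL" keys, the oldest is dropped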

    # ── Positions / orders / activities ─────────────────────────

    async def get_positions(self, page: int = 0) -> list[dict]:
        data = await self._request(
            "GET", f"/portfolio/{self.account_id}/positions/{page}"
        )
        return list(data) if isinstance(data, list) else []

    async def get_open_orders(self) -> list[dict]:
        data = await self._request(
            "GET", "/iserver/account/orders",
            params={"filters": "Submitted,PreSubmitted"},
        )
        if isinstance(data, dict):
            return list(data.get("orders") or [])
        return list(data) if isinstance(data, list) else []

    async def get_activities(self, days: int = 7) -> list[dict]:
        days = max(1, min(days, 90))
        data = await self._request(
            "GET", "/iserver/account/trades", params={"days": days},
        )
        return list(data) if isinstance(data, list) else []

    # ── Market data ─────────────────────────────────────────────

    _SNAPSHOT_FIELDS = "31,84,86,7295,7296"  # last,bid,ask,bid_size,ask_size

    async def get_ticker(self, symbol: str, asset_class: str = "stocks") -> dict:
        sec_type = _SEC_TYPE_MAP.get(asset_class.lower(), "STK")
        conid = await self.resolve_conid(symbol, sec_type)
        data = await self._request(
            "GET", "/iserver/marketdata/snapshot",
            params={"conids": str(conid), "fields": self._SNAPSHOT_FIELDS},
        )
        if not data or not isinstance(data, list):
            raise IBKRError("IBKR_NO_MARKET_DATA_SUBSCRIPTION")
        row = data[0]

        def _f(k: str) -> float | None:
            v = row.get(k)
            try:
                return float(v) if v not in (None, "") else None
            except (TypeError, ValueError):
                return None

        return {
            "symbol": symbol,
            "asset_class": asset_class,
            "last_price": _f("31"),
            "bid": _f("84"),
            "ask": _f("86"),
            "bid_size": _f("7295"),
            "ask_size": _f("7296"),
        }

    async def get_bars(
        self, symbol: str, asset_class: str = "stocks",
        period: str = "1d", bar: str = "5min",
    ) -> dict:
        sec_type = _SEC_TYPE_MAP.get(asset_class.lower(), "STK")
        conid = await self.resolve_conid(symbol, sec_type)
        data = await self._request(
            "GET", "/iserver/marketdata/history",
            params={"conid": str(conid), "period": period, "bar": bar},
        )
        rows = (data or {}).get("data") or []
        candles = validate_candles([
            {
                "timestamp": r.get("t"),
                "open": r.get("o"),
                "high": r.get("h"),
                "low": r.get("l"),
                "close": r.get("c"),
                "volume": r.get("v"),
            }
            for r in rows
        ])
        return {
            "symbol": symbol,
            "asset_class": asset_class,
            "interval": bar,
            "candles": candles,
        }

    async def get_option_chain(
        self, underlying: str, expiry: str | None = None
    ) -> dict:
        conid = await self.resolve_conid(underlying, "STK")
        params: dict[str, Any] = {"conid": str(conid), "secType": "OPT"}
        if expiry:
            params["month"] = expiry  # IBKR format: "JAN26"
        strikes = await self._request(
            "GET", "/iserver/secdef/strikes", params=params,
        )
        return {
            "underlying": underlying,
            "expiry": expiry,
            "strikes": strikes,
        }

    async def search_contracts(
        self, symbol: str, sec_type: str = "STK"
    ) -> list[dict]:
        data = await self._request(
            "GET", "/trsrv/secdef/search",
            params={"symbol": symbol, "secType": sec_type},
        )
        return list(data) if isinstance(data, list) else []

    # ── Order writes ────────────────────────────────────────────

    # Auto-confirm policy: any IBKR warning that is NOT in _CRITICAL_WARNINGS
    # is auto-confirmed up to _AUTO_CONFIRM_MAX_CYCLES times. Hardening to a
    # strict whitelist (allow-list) is deferred — V1 trades safety for UX.
    _CRITICAL_WARNINGS = (
        "margin", "suitability", "credit", "rejected", "insufficient",
    )
    _AUTO_CONFIRM_MAX_CYCLES = 3

    async def place_order(
        self, *,
        symbol: str, side: str, qty: float,
        order_type: str = "market",
        limit_price: float | None = None,
        stop_price: float | None = None,
        tif: str = "day",
        asset_class: str = "stocks",
        sec_type: str | None = None,
        exchange: str = "SMART",
        outside_rth: bool = False,
    ) -> dict:
        st = sec_type or _SEC_TYPE_MAP.get(asset_class.lower(), "STK")
        conid = await self.resolve_conid(symbol, st)

        order: dict[str, Any] = {
            "conid": conid,
            "secType": f"{conid}:{st}",
            "orderType": _ibkr_order_type(order_type),
            "side": side.upper(),
            "quantity": qty,
            "tif": tif.upper(),
            "outsideRTH": outside_rth,
            "listingExchange": exchange,
        }
        if limit_price is not None:
            order["price"] = limit_price
        if stop_price is not None:
            order["auxPrice"] = stop_price

        return await self._submit_order_with_confirmation({"orders": [order]})

    async def _submit_order_with_confirmation(
        self, payload: dict, *, cycles: int = 0
    ) -> dict:
        path = f"/iserver/account/{self.account_id}/orders"
        result = await self._request("POST", path, json_body=payload)
        return await self._handle_order_response(result, cycles)

    async def _handle_order_response(
        self, result: Any, cycles: int
    ) -> dict:
        if not isinstance(result, list) or not result:
            raise IBKRError(f"IBKR_ORDER_UNEXPECTED_RESPONSE: {result!r}")
        first = result[0]
        if "id" in first and "message" in first:
            messages = first.get("message") or []
            joined = " ".join(messages).lower()
            if any(crit in joined for crit in self._CRITICAL_WARNINGS):
                raise IBKRError(
                    f"IBKR_ORDER_REJECTED_WARNING: {messages}"
                )
            if cycles >= self._AUTO_CONFIRM_MAX_CYCLES:
                raise IBKRError(
                    f"IBKR_ORDER_TOO_MANY_CONFIRMATIONS: {messages}"
                )
            reply = await self._request(
                "POST", f"/iserver/reply/{first['id']}",
                json_body={"confirmed": True},
            )
            return await self._handle_order_response(reply, cycles + 1)
        if "order_id" in first:
            return {"order_id": first["order_id"], "status": first.get("order_status")}
        raise IBKRError(f"IBKR_ORDER_UNEXPECTED_RESPONSE: {first!r}")

    async def amend_order(
        self, order_id: str, *,
        qty: float | None = None,
        limit_price: float | None = None,
        stop_price: float | None = None,
        tif: str | None = None,
    ) -> dict:
        body: dict[str, Any] = {}
        if qty is not None:
            body["quantity"] = qty
        if limit_price is not None:
            body["price"] = limit_price
        if stop_price is not None:
            body["auxPrice"] = stop_price
        if tif is not None:
            body["tif"] = tif.upper()
        path = f"/iserver/account/{self.account_id}/order/{order_id}"
        result = await self._request("POST", path, json_body=body)
        return await self._handle_order_response(result, cycles=0)

    async def cancel_order(self, order_id: str) -> dict:
        path = f"/iserver/account/{self.account_id}/order/{order_id}"
        await self._request("DELETE", path)
        return {"order_id": order_id, "canceled": True}

    async def cancel_all_orders(self) -> list[dict]:
        orders = await self.get_open_orders()
        results = []
        for o in orders:
            oid = o.get("orderId") or o.get("order_id")
            if not oid:
                continue
            try:
                results.append(await self.cancel_order(str(oid)))
            except Exception as e:
                results.append({"order_id": str(oid), "canceled": False, "error": str(e)})
        return results

    async def close_position(
        self, symbol: str, qty: float | None = None
    ) -> dict:
        # Resolve symbol → conid, then match positions on conid (positions
        # return `contractDesc` as a long display string, not ticker).
        sec_type = _SEC_TYPE_MAP.get("stocks", "STK")
        conid = await self.resolve_conid(symbol, sec_type)
        positions = await self.get_positions()
        target = next(
            (p for p in positions if int(p.get("conid", 0)) == conid),
            None,
        )
        if not target:
            raise IBKRError(f"IBKR_NO_POSITION: {symbol} (conid={conid})")
        position_qty = float(target.get("position", 0))
        close_qty = abs(qty if qty is not None else position_qty)
        side = "SELL" if position_qty > 0 else "BUY"
        return await self.place_order(
            symbol=symbol, side=side, qty=close_qty, order_type="market",
        )

    async def close_all_positions(self) -> list[dict]:
        positions = await self.get_positions()
        results = []
        for p in positions:
            sym = p.get("ticker") or p.get("contractDesc")
            if not sym:
                continue
            try:
                results.append(await self.close_position(sym))
            except Exception as e:
                results.append({"symbol": sym, "error": str(e)})
        return results


def _ibkr_order_type(t: str) -> str:
    m = {"market": "MKT", "limit": "LMT", "stop": "STP", "stop_limit": "STP_LMT"}
    if t.lower() not in m:
        raise IBKRError(f"unsupported order_type: {t}")
    return m[t.lower()]
@@ -0,0 +1,106 @@
"""IBKR RSA key rotation: stage/confirm/abort with auto-rollback."""
from __future__ import annotations

import datetime as _dt
import hashlib
import os
import shutil
from collections.abc import Awaitable, Callable
from dataclasses import dataclass, field
from pathlib import Path

from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa


def _sha256_fingerprint(pem_path: Path) -> str:
    digest = hashlib.sha256(pem_path.read_bytes()).hexdigest()
    return f"SHA256:{digest}"


@dataclass
class KeyRotationManager:
    signature_key_path: str
    encryption_key_path: str

    _started: bool = field(default=False, init=False)

    def _sig(self) -> Path:
        return Path(self.signature_key_path)

    def _enc(self) -> Path:
        return Path(self.encryption_key_path)

    async def start(self) -> dict:
        sig_new = self._sig().with_suffix(self._sig().suffix + ".new")
        enc_new = self._enc().with_suffix(self._enc().suffix + ".new")

        for p in (sig_new, enc_new):
            key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
            p.write_bytes(key.private_bytes(
                encoding=serialization.Encoding.PEM,
                format=serialization.PrivateFormat.TraditionalOpenSSL,
                encryption_algorithm=serialization.NoEncryption(),
            ))
            os.chmod(p, 0o600)

        self._started = True
        return {
            "fingerprints": {
                "sig": _sha256_fingerprint(sig_new),
                "enc": _sha256_fingerprint(enc_new),
            },
            "expires_at": (
                _dt.datetime.now(_dt.UTC) + _dt.timedelta(hours=24)
            ).isoformat(),
        }

    async def confirm(
        self, *, validate: Callable[[], Awaitable[bool]],
    ) -> dict:
        sig = self._sig()
        enc = self._enc()
        sig_new = sig.with_suffix(sig.suffix + ".new")
        enc_new = enc.with_suffix(enc.suffix + ".new")
        if not (sig_new.exists() and enc_new.exists()):
            raise RuntimeError("IBKR_ROTATION_NOT_STARTED")

        archive = sig.parent / ".archive" / _dt.datetime.now(_dt.UTC).strftime("%Y%m%dT%H%M%S")
        archive.mkdir(parents=True, exist_ok=True)

        shutil.move(str(sig), str(archive / sig.name))
        shutil.move(str(enc), str(archive / enc.name))
        shutil.move(str(sig_new), str(sig))
        shutil.move(str(enc_new), str(enc))

        err: BaseException | None = None
        try:
            ok = await validate()
        except Exception as e:
            ok = False
            err = e

        if not ok:
            shutil.move(str(sig), str(sig.with_suffix(sig.suffix + ".new")))
            shutil.move(str(enc), str(enc.with_suffix(enc.suffix + ".new")))
            shutil.move(str(archive / sig.name), str(sig))
            shutil.move(str(archive / enc.name), str(enc))
            raise RuntimeError(
                f"IBKR_ROTATION_VALIDATION_FAILED: {err}" if err
                else "IBKR_ROTATION_VALIDATION_FAILED"
            )

        self._started = False
        return {
            "rotated_at": _dt.datetime.now(_dt.UTC).isoformat(),
            "old_archived_at": str(archive),
        }

    async def abort(self) -> dict:
        sig_new = self._sig().with_suffix(self._sig().suffix + ".new")
        enc_new = self._enc().with_suffix(self._enc().suffix + ".new")
        for p in (sig_new, enc_new):
            if p.exists():
                p.unlink()
        self._started = False
        return {"aborted": True}
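# Usage sketch (paths and the validator are hypothetical): stage new keys,
# then confirm with a validation callback; on False or an exception the
# manager swaps the archived pair back in.
#
#   mgr = KeyRotationManager(
#       signature_key_path="sig.pem",
#       encryption_key_path="enc.pem",
#   )
#   staged = await mgr.start()           # writes *.pem.new, returns fingerprints
#   await mgr.confirm(validate=probe)    # probe() -> bool, e.g. a test LST mint
#   # or: await mgr.abort()              # deletes *.pem.new, keeps current keys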
@@ -0,0 +1,57 @@
"""Server-side leverage cap for place_order (IBKR Reg-T context).

The cap is read from the secret JSON via the `max_leverage` field; defaults
to 1 (cash) when absent. IBKR margin accounts default to 4x intraday /
2x overnight (Reg-T).
"""
from __future__ import annotations

from fastapi import HTTPException


def get_max_leverage(creds: dict) -> int:
    """Read max_leverage from the secret. Defaults to 1 when missing."""
    raw = creds.get("max_leverage", 1)
    try:
        value = int(raw)
    except (TypeError, ValueError):
        value = 1
    return max(1, value)


def enforce_leverage(
    requested: int | float | None,
    *,
    creds: dict,
    exchange: str,
) -> int:
    """Validate and apply the leverage cap. Returns the applicable leverage.

    Raises HTTPException(403, LEVERAGE_CAP_EXCEEDED) if requested > cap.
    If requested is None, applies the cap as the default.
    """
    cap = get_max_leverage(creds)
    if requested is None:
        return cap
    lev = int(requested)
    if lev < 1:
        raise HTTPException(
            status_code=403,
            detail={
                "error": "LEVERAGE_CAP_EXCEEDED",
                "exchange": exchange,
                "requested": lev,
                "max": cap,
                "reason": "leverage must be >= 1",
            },
        )
    if lev > cap:
        raise HTTPException(
            status_code=403,
            detail={
                "error": "LEVERAGE_CAP_EXCEEDED",
                "exchange": exchange,
                "requested": lev,
                "max": cap,
            },
        )
    return lev
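# Usage sketch (values are illustrative): with a secret carrying
# {"max_leverage": 2}, a request for 3x is rejected with a 403
# LEVERAGE_CAP_EXCEEDED, while an unspecified leverage falls back to the cap.
#
#   creds = {"max_leverage": 2}
#   enforce_leverage(None, creds=creds, exchange="ibkr")  # -> 2 (cap as default)
#   enforce_leverage(2, creds=creds, exchange="ibkr")     # -> 2
#   enforce_leverage(3, creds=creds, exchange="ibkr")     # raises HTTPException(403)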
@@ -0,0 +1,181 @@
"""OAuth 1.0a Self-Service signer for IBKR Client Portal Web API.

Reference: https://www.interactivebrokers.com/api/doc.html (Self-Service OAuth)
"""
from __future__ import annotations

import base64
import hashlib
import hmac
import secrets
import time
from dataclasses import dataclass, field
from urllib.parse import quote

from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding
from cryptography.hazmat.primitives.asymmetric.rsa import RSAPrivateKey

from cerbero_mcp.common.http import async_client

# Refresh LST 9h before its 24h expiry (very conservative; trades token
# lifetime for safety margin against clock drift and slow re-auth flows).
_LST_REFRESH_BUFFER_S = 9 * 3600  # 32_400
_LST_FALLBACK_TTL_S = 15 * 3600  # 54_000 — used when server omits expiration


def _percent_encode(value: str) -> str:
    """RFC 3986 percent-encoding for OAuth (no `+` for space)."""
    return quote(str(value), safe="")


def build_signature_base_string(
    method: str, url: str, params: dict[str, str]
) -> str:
    """Build the OAuth 1.0a signature base string:
    `<METHOD>&<encoded-url>&<encoded-sorted-params>`
    """
    sorted_params = sorted(params.items())
    encoded_pairs = [
        f"{_percent_encode(k)}%3D{_percent_encode(v)}"
        for k, v in sorted_params
    ]
    # Manual %3D / %26 = double-encoding of '=' and '&' per RFC 5849 §3.4.1.3.2
    params_str = "%26".join(encoded_pairs)
    return f"{method.upper()}&{_percent_encode(url)}&{params_str}"
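# Worked example (toy values): params are sorted, each pair doubly encoded,
# the pairs joined with %26, then method and encoded URL are prepended.
#
#   build_signature_base_string(
#       "GET", "https://example.com/api", {"b": "2", "a": "1"}
#   )
#   # -> "GET&https%3A%2F%2Fexample.com%2Fapi&a%3D1%26b%3D2"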


class IBKRAuthError(Exception):
    """OAuth flow failed (key invalid, consumer revoked, mint failed)."""


@dataclass
class OAuth1aSigner:
    consumer_key: str
    access_token: str
    access_token_secret: str
    signature_key_path: str
    encryption_key_path: str
    dh_prime: str  # hex string

    _signature_key: RSAPrivateKey | None = field(default=None, init=False, repr=False)
    _encryption_key: RSAPrivateKey | None = field(default=None, init=False, repr=False)
    _live_session_token: str | None = field(default=None, init=False, repr=False)
    _lst_expires_at: float = field(default=0.0, init=False, repr=False)

    def __post_init__(self) -> None:
        with open(self.signature_key_path, "rb") as f:
            self._signature_key = serialization.load_pem_private_key(
                f.read(), password=None
            )
        with open(self.encryption_key_path, "rb") as f:
            self._encryption_key = serialization.load_pem_private_key(
                f.read(), password=None
            )

    def sign(self, method: str, url: str, params: dict[str, str]) -> str:
        """RSA-SHA256 signature of the signature base string. Returns base64."""
        assert self._signature_key is not None  # set in __post_init__
        base = build_signature_base_string(method, url, params)
        signature = self._signature_key.sign(
            base.encode("utf-8"),
            padding.PKCS1v15(),
            hashes.SHA256(),
        )
        return base64.b64encode(signature).decode("ascii")

    def make_oauth_params(self) -> dict[str, str]:
        """Generate the standard oauth_nonce/timestamp/version params."""
        return {
            "oauth_consumer_key": self.consumer_key,
            "oauth_token": self.access_token,
            "oauth_nonce": secrets.token_hex(16),
            "oauth_timestamp": str(int(time.time())),
            "oauth_signature_method": "RSA-SHA256",
            "oauth_version": "1.0",
        }

    async def get_live_session_token(self, *, base_url: str) -> str:
        """Return the cached LST; re-mint if missing or close to expiry."""
        if self._live_session_token and time.monotonic() < self._lst_expires_at:
            return self._live_session_token
        return await self._mint_live_session_token(base_url)

    async def _mint_live_session_token(self, base_url: str) -> str:
        """DH key exchange + RSA-signed POST /oauth/live_session_token.

        1. Generate random dh_random
        2. Compute dh_challenge = 2^dh_random mod dh_prime
        3. Decrypt access_token_secret via encryption RSA key
        4. POST signed request with diffie_hellman_challenge
        5. shared = dh_response^dh_random mod dh_prime
        6. LST = HMAC-SHA1(shared, decrypted_secret), base64
        """
        url = f"{base_url}/oauth/live_session_token"

        prime = int(self.dh_prime, 16)
        dh_random = secrets.randbits(256)
        dh_challenge = pow(2, dh_random, prime)
        dh_challenge_hex = format(dh_challenge, "x")

        if self._encryption_key is None:  # pragma: no cover — set in __post_init__
            raise IBKRAuthError("encryption key not loaded")
        try:
            encrypted = bytes.fromhex(self.access_token_secret)
            decrypted_secret = self._encryption_key.decrypt(
                encrypted, padding.PKCS1v15()
            )
        except Exception as e:
            # Intentionally broad: covers ValueError (bad hex), cryptography
            # errors (InvalidKey, padding decoding), and any RSA backend issue
            # — all map to the same user-facing failure ("bad credentials").
            raise IBKRAuthError(f"access_token_secret decrypt failed: {e}") from e

        oauth_params = self.make_oauth_params()
        oauth_params["diffie_hellman_challenge"] = dh_challenge_hex
        signature = self.sign("POST", url, oauth_params)
        oauth_params["oauth_signature"] = signature

        auth_header = "OAuth " + ", ".join(
            f'{k}="{_percent_encode(v)}"' for k, v in sorted(oauth_params.items())
        )

        async with async_client(timeout=15.0) as http:
            resp = await http.post(
                url,
                headers={"Authorization": auth_header, "User-Agent": "cerbero-mcp/2.0"},
            )
        if resp.status_code != 200:
            raise IBKRAuthError(
                f"LST mint failed status={resp.status_code} body={resp.text[:300]}"
            )
        data = resp.json()
        dh_response = int(data["diffie_hellman_response"], 16)
        expires_ms = data.get("live_session_token_expiration", 0)

        shared = pow(dh_response, dh_random, prime)
        shared_bytes = shared.to_bytes((shared.bit_length() + 7) // 8, "big")
        if shared_bytes and shared_bytes[0] & 0x80:
            shared_bytes = b"\x00" + shared_bytes

        lst_raw = hmac.new(shared_bytes, decrypted_secret, hashlib.sha1).digest()
        lst = base64.b64encode(lst_raw).decode("ascii")

        self._live_session_token = lst
        if expires_ms:
            ttl = max(60.0, (expires_ms / 1000) - time.time() - _LST_REFRESH_BUFFER_S)
        else:
            ttl = float(_LST_FALLBACK_TTL_S)
        # `expires_ms` is wall clock; convert to a monotonic deadline so the
        # cache check is unaffected by future clock adjustments.
        self._lst_expires_at = time.monotonic() + ttl
        return lst
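    # Toy-number sketch of the DH exchange above (real primes are far larger):
    # with prime p = 23 and our secret a = 6, the challenge is 2**6 % 23 = 18;
    # if the server's secret is b = 5, its response is 2**5 % 23 = 9, and both
    # sides derive the same shared secret: 9**6 % 23 == 18**5 % 23 == 3.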

    def sign_with_lst(self, method: str, url: str, params: dict[str, str]) -> str:
        """HMAC-SHA256 signature with the LST as key (for post-mint requests)."""
        if not self._live_session_token:
            raise IBKRAuthError("LST not minted yet; call get_live_session_token first")
        base = build_signature_base_string(method, url, params)
        lst_bytes = base64.b64decode(self._live_session_token)
        sig = hmac.new(lst_bytes, base.encode("utf-8"), hashlib.sha256).digest()
        return base64.b64encode(sig).decode("ascii")
@@ -0,0 +1,101 @@
"""Pure-function payload builders for IBKR complex orders (bracket/OCO/OTO).

No HTTP. Tests are deterministic.
"""
from __future__ import annotations

import secrets
from dataclasses import dataclass
from typing import Literal


@dataclass
class OrderSpec:
    conid: int
    sec_type: str  # "STK" | "OPT" | "FUT" | "CASH"
    side: Literal["BUY", "SELL"]
    qty: float
    order_type: Literal["MKT", "LMT", "STP", "STP_LMT"]
    price: float | None = None  # limit price
    aux_price: float | None = None  # stop price
    tif: str = "GTC"
    exchange: str = "SMART"


def _to_order_dict(spec: OrderSpec, *, oca_group: str | None = None,
                   oca_type: int | None = None,
                   parent_id: str | None = None) -> dict:
    o: dict = {
        "conid": spec.conid,
        "secType": f"{spec.conid}:{spec.sec_type}",
        "orderType": spec.order_type,
        "side": spec.side,
        "quantity": spec.qty,
        "tif": spec.tif,
        "listingExchange": spec.exchange,
    }
    if spec.price is not None:
        o["price"] = spec.price
    if spec.aux_price is not None:
        o["auxPrice"] = spec.aux_price
    if oca_group:
        o["ocaGroup"] = oca_group
    if oca_type is not None:
        o["ocaType"] = oca_type
    if parent_id:
        o["parentId"] = parent_id
    return o


def _new_oca_group() -> str:
    return f"oca-{secrets.token_hex(4)}"


def build_bracket_payload(
    *, conid: int, sec_type: str, side: str, qty: float,
    entry_price: float, stop_loss: float, take_profit: float,
    tif: str = "GTC", exchange: str = "SMART",
) -> dict:
    """Bracket: parent LMT entry + child STP (loss) + child LMT (profit), OCA-linked."""
    side = side.upper()
    opposite = "SELL" if side == "BUY" else "BUY"
    oca = _new_oca_group()

    parent = OrderSpec(conid=conid, sec_type=sec_type, side=side, qty=qty,  # type: ignore[arg-type]
                       order_type="LMT", price=entry_price,
                       tif=tif, exchange=exchange)
    sl = OrderSpec(conid=conid, sec_type=sec_type, side=opposite, qty=qty,  # type: ignore[arg-type]
                   order_type="STP", aux_price=stop_loss,
                   tif=tif, exchange=exchange)
    tp = OrderSpec(conid=conid, sec_type=sec_type, side=opposite, qty=qty,  # type: ignore[arg-type]
                   order_type="LMT", price=take_profit,
                   tif=tif, exchange=exchange)
    return {
        "orders": [
            _to_order_dict(parent, oca_group=oca, oca_type=2),
            _to_order_dict(sl, oca_group=oca, oca_type=2),
            _to_order_dict(tp, oca_group=oca, oca_type=2),
        ]
    }
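# Usage sketch (conid and prices are illustrative): a long bracket with
# entry 100, stop 95, target 110 yields three OCA-linked orders where the
# two children carry the opposite side.
#
#   payload = build_bracket_payload(
#       conid=265598, sec_type="STK", side="BUY", qty=10,
#       entry_price=100.0, stop_loss=95.0, take_profit=110.0,
#   )
#   # payload["orders"] -> [BUY LMT @100, SELL STP @95, SELL LMT @110],
#   # all sharing one ocaGroup with ocaType=2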


def build_oco_payload(legs: list[OrderSpec]) -> dict:
    """OCO: N legs, all sharing the same ocaGroup with ocaType=1 (one-cancels-all)."""
    if len(legs) < 2:
        raise ValueError("OCO requires at least 2 legs")
    oca = _new_oca_group()
    return {
        "orders": [
            _to_order_dict(leg, oca_group=oca, oca_type=1) for leg in legs
        ]
    }


def build_oto_first_payload(trigger: OrderSpec) -> dict:
    """OTO step 1: place the trigger as a standalone order."""
    return {"orders": [_to_order_dict(trigger)]}


def build_oto_child_payload(child: OrderSpec, parent_order_id: str) -> dict:
    """OTO step 2: the child references parentId from the step-1 order_id."""
    return {"orders": [_to_order_dict(child, parent_id=parent_order_id)]}
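# Two-step OTO sketch (specs and the order id are hypothetical): submit the
# trigger first, read its order_id from the response, then submit the child
# referencing that id as parentId.
#
#   first = build_oto_first_payload(trigger_spec)
#   # ... submit `first`, obtain order_id "123" from the response ...
#   child = build_oto_child_payload(child_spec, parent_order_id="123")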
@@ -0,0 +1,453 @@
"""IBKR tool functions: Pydantic schemas + async dispatch to client/ws."""
from __future__ import annotations

import asyncio
import contextlib
from typing import Any

from fastapi import HTTPException
from pydantic import BaseModel

from cerbero_mcp.exchanges.ibkr.client import _SEC_TYPE_MAP, IBKRClient, IBKRError
from cerbero_mcp.exchanges.ibkr.leverage_cap import get_max_leverage
from cerbero_mcp.exchanges.ibkr.orders_complex import (
    OrderSpec,
    build_bracket_payload,
    build_oco_payload,
    build_oto_child_payload,
    build_oto_first_payload,
)
from cerbero_mcp.exchanges.ibkr.ws import IBKRWebSocket

# === Schemas: reads ===

class GetAccountReq(BaseModel):
    pass

class GetPositionsReq(BaseModel):
    pass

class GetOpenOrdersReq(BaseModel):
    pass

class GetActivitiesReq(BaseModel):
    days: int = 7

class GetTickerReq(BaseModel):
    symbol: str
    asset_class: str = "stocks"

class GetBarsReq(BaseModel):
    symbol: str
    asset_class: str = "stocks"
    period: str = "1d"
    bar: str = "5min"

class GetSnapshotReq(BaseModel):
    symbol: str
    asset_class: str = "stocks"

class GetOptionChainReq(BaseModel):
    underlying: str
    expiry: str | None = None

class SearchContractsReq(BaseModel):
    symbol: str
    sec_type: str = "STK"

class GetClockReq(BaseModel):
    pass

# === Schemas: streaming ===

class GetTickReq(BaseModel):
    symbol: str
    asset_class: str = "stocks"

class GetDepthReq(BaseModel):
    symbol: str
    asset_class: str = "stocks"
    rows: int = 5
    exchange: str = "SMART"

class SubscribeTickReq(BaseModel):
    symbol: str
    asset_class: str = "stocks"

class UnsubscribeReq(BaseModel):
    symbol: str
    asset_class: str = "stocks"

# === Schemas: writes simple ===

class PlaceOrderReq(BaseModel):
    symbol: str
    side: str
    qty: float
    order_type: str = "market"
    limit_price: float | None = None
    stop_price: float | None = None
    tif: str = "day"
    asset_class: str = "stocks"
    sec_type: str | None = None
    exchange: str = "SMART"
    outside_rth: bool = False

class AmendOrderReq(BaseModel):
    order_id: str
    qty: float | None = None
    limit_price: float | None = None
    stop_price: float | None = None
    tif: str | None = None

class CancelOrderReq(BaseModel):
    order_id: str

class CancelAllOrdersReq(BaseModel):
    pass

class ClosePositionReq(BaseModel):
    symbol: str
    qty: float | None = None

class CloseAllPositionsReq(BaseModel):
    pass

# === Schemas: writes complex ===

class PlaceBracketOrderReq(BaseModel):
    symbol: str
    side: str
    qty: float
    entry_price: float
    stop_loss: float
    take_profit: float
    tif: str = "gtc"
    asset_class: str = "stocks"
    exchange: str = "SMART"

class OrderLeg(BaseModel):
    symbol: str
    side: str
    qty: float
    order_type: str = "limit"
    limit_price: float | None = None
    stop_price: float | None = None
    tif: str = "gtc"
    asset_class: str = "stocks"

class PlaceOcoOrderReq(BaseModel):
    legs: list[OrderLeg]

class PlaceOtoOrderReq(BaseModel):
    trigger: OrderLeg
    child: OrderLeg

# === Read tools ===

async def environment_info(
    client: IBKRClient, *, creds: dict, env_info: Any | None = None
) -> dict:
    return {
        "exchange": "ibkr",
        "environment": "testnet" if client.paper else "mainnet",
        "paper": client.paper,
        "base_url": client.base_url,
        "max_leverage": get_max_leverage(creds),
    }

async def get_account(client: IBKRClient, params: GetAccountReq) -> dict:
    return await client.get_account()

async def get_positions(client: IBKRClient, params: GetPositionsReq) -> dict:
    return {"positions": await client.get_positions()}

async def get_open_orders(client: IBKRClient, params: GetOpenOrdersReq) -> dict:
    return {"orders": await client.get_open_orders()}

async def get_activities(client: IBKRClient, params: GetActivitiesReq) -> dict:
    return {"activities": await client.get_activities(params.days)}

async def get_ticker(client: IBKRClient, params: GetTickerReq) -> dict:
    return await client.get_ticker(params.symbol, params.asset_class)

async def get_bars(client: IBKRClient, params: GetBarsReq) -> dict:
    return await client.get_bars(
        params.symbol, params.asset_class, params.period, params.bar,
    )

async def get_snapshot(client: IBKRClient, params: GetSnapshotReq) -> dict:
    return await client.get_ticker(params.symbol, params.asset_class)

async def get_option_chain(client: IBKRClient, params: GetOptionChainReq) -> dict:
    return await client.get_option_chain(params.underlying, params.expiry)

async def search_contracts(client: IBKRClient, params: SearchContractsReq) -> dict:
    return {"contracts": await client.search_contracts(params.symbol, params.sec_type)}

async def get_clock(client: IBKRClient, params: GetClockReq) -> dict:
    import datetime as _dt
    now = _dt.datetime.now(_dt.UTC)
    return {
        "timestamp": now.isoformat(),
        "is_open": _dt.time(13, 30) <= now.time() <= _dt.time(20, 0)
        and now.weekday() < 5,
        "approximate": True,
        "note": (
            "is_open is a UTC-based approximation; does not account for "
            "US market holidays or half-days. Use IBKR /trsrv/marketdata/calendar "
            "for authoritative schedule."
        ),
    }


# === Streaming tools ===


def _sec_type_for(asset_class: str) -> str:
    return _SEC_TYPE_MAP.get(asset_class.lower(), "STK")


async def get_tick(
    client: IBKRClient, params: GetTickReq,
    *, ws: IBKRWebSocket, timeout_s: float = 3.0,
) -> dict:
    sec = _sec_type_for(params.asset_class)
    conid = await client.resolve_conid(params.symbol, sec)
    snap = ws.get_tick_snapshot(conid)
    if snap:
        return {**snap, "symbol": params.symbol}
    await ws.subscribe_tick(conid)
    deadline = asyncio.get_running_loop().time() + timeout_s
    while asyncio.get_running_loop().time() < deadline:
        snap = ws.get_tick_snapshot(conid)
        if snap:
            return {**snap, "symbol": params.symbol}
        await asyncio.sleep(0.05)
    raise IBKRError(f"IBKR_TICK_TIMEOUT: {params.symbol}")


async def get_depth(
    client: IBKRClient, params: GetDepthReq,
    *, ws: IBKRWebSocket, timeout_s: float = 3.0,
) -> dict:
    sec = _sec_type_for(params.asset_class)
    conid = await client.resolve_conid(params.symbol, sec)
    snap = ws.get_depth_snapshot(conid)
    if snap:
        return {**snap, "symbol": params.symbol}
    await ws.subscribe_depth(conid, exchange=params.exchange, rows=params.rows)
    deadline = asyncio.get_running_loop().time() + timeout_s
    while asyncio.get_running_loop().time() < deadline:
        snap = ws.get_depth_snapshot(conid)
        if snap:
            return {**snap, "symbol": params.symbol}
        await asyncio.sleep(0.05)
    raise IBKRError(f"IBKR_DEPTH_TIMEOUT: {params.symbol}")


async def subscribe_tick(
    client: IBKRClient, params: SubscribeTickReq, *, ws: IBKRWebSocket,
) -> dict:
    sec = _sec_type_for(params.asset_class)
    conid = await client.resolve_conid(params.symbol, sec)
    await ws.subscribe_tick(conid, forced=True)
    return {"symbol": params.symbol, "conid": conid, "subscribed": True}


async def unsubscribe(
    client: IBKRClient, params: UnsubscribeReq, *, ws: IBKRWebSocket,
) -> dict:
    sec = _sec_type_for(params.asset_class)
    conid = await client.resolve_conid(params.symbol, sec)
    await ws.unsubscribe(conid)
    return {"symbol": params.symbol, "conid": conid, "unsubscribed": True}


# === Write tools: simple ===


async def place_order(
    client: IBKRClient, params: PlaceOrderReq,
    *, creds: dict, last_price: float | None = None,
) -> dict:
    cap = get_max_leverage(creds)
    if last_price is None:
        try:
            ticker = await client.get_ticker(params.symbol, params.asset_class)
            last_price = ticker.get("last_price") or ticker.get("ask")
        except Exception:
            last_price = None
    if last_price:
        notional = params.qty * float(last_price)
        try:
            account = await client.get_account()
            equity = float(
                (account.get("netliquidation") or {}).get("amount") or 0
            )
        except Exception:
            equity = 0.0
        if equity > 0 and notional / equity > cap:
            raise HTTPException(
                status_code=403,
                detail={
                    "error": "LEVERAGE_CAP_EXCEEDED",
                    "exchange": "ibkr",
                    "requested_ratio": notional / equity,
                    "max": cap,
                },
            )

    return await client.place_order(
        symbol=params.symbol,
        side=params.side,
        qty=params.qty,
        order_type=params.order_type,
        limit_price=params.limit_price,
        stop_price=params.stop_price,
        tif=params.tif,
        asset_class=params.asset_class,
        sec_type=params.sec_type,
        exchange=params.exchange,
        outside_rth=params.outside_rth,
    )
|
||||
|
||||
async def amend_order(client: IBKRClient, params: AmendOrderReq) -> dict:
|
||||
return await client.amend_order(
|
||||
params.order_id,
|
||||
qty=params.qty,
|
||||
limit_price=params.limit_price,
|
||||
stop_price=params.stop_price,
|
||||
tif=params.tif,
|
||||
)
|
||||
|
||||
|
||||
async def cancel_order(client: IBKRClient, params: CancelOrderReq) -> dict:
|
||||
return await client.cancel_order(params.order_id)
|
||||
|
||||
|
||||
async def cancel_all_orders(
|
||||
client: IBKRClient, params: CancelAllOrdersReq
|
||||
) -> dict:
|
||||
return {"canceled": await client.cancel_all_orders()}
|
||||
|
||||
|
||||
async def close_position(
|
||||
client: IBKRClient, params: ClosePositionReq
|
||||
) -> dict:
|
||||
return await client.close_position(params.symbol, params.qty)
|
||||
|
||||
|
||||
async def close_all_positions(
|
||||
client: IBKRClient, params: CloseAllPositionsReq
|
||||
) -> dict:
|
||||
return {"closed": await client.close_all_positions()}
|
||||
|
||||
|
||||
# === Write tools: complex orders ===
|
||||
|
||||
|
||||
def _leg_to_spec(leg: OrderLeg, conid: int) -> OrderSpec:
|
||||
return OrderSpec(
|
||||
conid=conid,
|
||||
sec_type=_sec_type_for(leg.asset_class),
|
||||
side=leg.side.upper(), # type: ignore[arg-type]
|
||||
qty=leg.qty,
|
||||
order_type={
|
||||
"market": "MKT", "limit": "LMT",
|
||||
"stop": "STP", "stop_limit": "STP_LMT",
|
||||
}[leg.order_type.lower()], # type: ignore[arg-type]
|
||||
price=leg.limit_price,
|
||||
aux_price=leg.stop_price,
|
||||
tif=leg.tif.upper(),
|
||||
)
|
||||
|
||||
|
||||
async def place_bracket_order(
|
||||
client: IBKRClient, params: PlaceBracketOrderReq, *, creds: dict,
|
||||
) -> dict:
|
||||
sec = _sec_type_for(params.asset_class)
|
||||
conid = await client.resolve_conid(params.symbol, sec)
|
||||
cap = get_max_leverage(creds)
|
||||
notional = params.qty * params.entry_price
|
||||
try:
|
||||
account = await client.get_account()
|
||||
equity = float((account.get("netliquidation") or {}).get("amount") or 0)
|
||||
except Exception:
|
||||
equity = 0.0
|
||||
if equity > 0 and notional / equity > cap:
|
||||
raise HTTPException(
|
||||
status_code=403,
|
||||
detail={"error": "LEVERAGE_CAP_EXCEEDED", "exchange": "ibkr",
|
||||
"requested_ratio": notional / equity, "max": cap},
|
||||
)
|
||||
payload = build_bracket_payload(
|
||||
conid=conid, sec_type=sec, side=params.side.upper(), qty=params.qty,
|
||||
entry_price=params.entry_price, stop_loss=params.stop_loss,
|
||||
take_profit=params.take_profit, tif=params.tif.upper(),
|
||||
exchange=params.exchange,
|
||||
)
|
||||
return await client._submit_order_with_confirmation(payload)
|
||||
|
||||
|
||||
async def place_oco_order(
|
||||
client: IBKRClient, params: PlaceOcoOrderReq, *, creds: dict,
|
||||
) -> dict:
|
||||
if len(params.legs) < 2:
|
||||
raise HTTPException(400, detail={"error": "OCO requires >=2 legs"})
|
||||
cap = get_max_leverage(creds)
|
||||
leg_notional = max(
|
||||
l.qty * (l.limit_price or l.stop_price or 0) for l in params.legs
|
||||
)
|
||||
try:
|
||||
account = await client.get_account()
|
||||
equity = float((account.get("netliquidation") or {}).get("amount") or 0)
|
||||
except Exception:
|
||||
equity = 0.0
|
||||
if equity > 0 and leg_notional / equity > cap:
|
||||
raise HTTPException(
|
||||
status_code=403,
|
||||
detail={"error": "LEVERAGE_CAP_EXCEEDED", "exchange": "ibkr",
|
||||
"requested_ratio": leg_notional / equity, "max": cap},
|
||||
)
|
||||
|
||||
specs = []
|
||||
for l in params.legs:
|
||||
sec = _sec_type_for(l.asset_class)
|
||||
conid = await client.resolve_conid(l.symbol, sec)
|
||||
specs.append(_leg_to_spec(l, conid))
|
||||
payload = build_oco_payload(specs)
|
||||
return await client._submit_order_with_confirmation(payload)
|
||||
|
||||
|
||||
async def place_oto_order(
|
||||
client: IBKRClient, params: PlaceOtoOrderReq, *, creds: dict,
|
||||
) -> dict:
|
||||
sec_t = _sec_type_for(params.trigger.asset_class)
|
||||
sec_c = _sec_type_for(params.child.asset_class)
|
||||
conid_t = await client.resolve_conid(params.trigger.symbol, sec_t)
|
||||
conid_c = await client.resolve_conid(params.child.symbol, sec_c)
|
||||
trig_spec = _leg_to_spec(params.trigger, conid_t)
|
||||
child_spec = _leg_to_spec(params.child, conid_c)
|
||||
|
||||
trig_payload = build_oto_first_payload(trig_spec)
|
||||
trig_res = await client._submit_order_with_confirmation(trig_payload)
|
||||
trigger_order_id = trig_res.get("order_id")
|
||||
if not trigger_order_id:
|
||||
raise IBKRError(f"IBKR_OTO_TRIGGER_NO_ID: {trig_res!r}")
|
||||
|
||||
try:
|
||||
child_payload = build_oto_child_payload(child_spec, str(trigger_order_id))
|
||||
child_res = await client._submit_order_with_confirmation(child_payload)
|
||||
except Exception as e:
|
||||
with contextlib.suppress(Exception):
|
||||
await client.cancel_order(str(trigger_order_id))
|
||||
raise IBKRError(
|
||||
f"IBKR_OTO_PARTIAL_FAILURE: trigger={trigger_order_id} reason={e}"
|
||||
) from e
|
||||
|
||||
return {
|
||||
"trigger_order_id": trigger_order_id,
|
||||
"child_order_id": child_res.get("order_id"),
|
||||
}
@@ -0,0 +1,247 @@
"""IBKR Client Portal WebSocket — persistent WSS, smd/sbd subs, snapshot cache."""
from __future__ import annotations

import asyncio
import contextlib
import json
import logging
import time
from dataclasses import dataclass, field
from typing import Any

from websockets import connect as websockets_connect  # exposed for tests

from cerbero_mcp.exchanges.ibkr.oauth import OAuth1aSigner

logger = logging.getLogger(__name__)


class WSError(Exception):
    """WebSocket layer error."""


@dataclass
class TickSnapshot:
    last_price: float | None
    bid: float | None
    ask: float | None
    bid_size: float | None
    ask_size: float | None
    timestamp_ms: int


@dataclass
class DepthSnapshot:
    bids: list[dict]
    asks: list[dict]
    timestamp_ms: int


_SMD_FIELDS = ["31", "84", "86", "7295", "7296"]


@dataclass
class IBKRWebSocket:
    """Persistent WSS to IBKR Client Portal with smd/sbd subs.

    Snapshot lifetime: each (tick|depth) cache entry is overwritten on every
    incoming message. On disconnect, the reader loop logs and exits leaving
    the existing cache intact. Consumers should check `connected` before
    trusting a stale snapshot, or compare `timestamp_ms` against wall clock.
    Automatic reconnect is deferred to a follow-up; V1 surfaces disconnects
    via `connected=False` so the higher-level tool layer can rebuild the WS.
    """
    signer: OAuth1aSigner
    ws_url: str
    base_url: str
    max_subs: int = 80
    idle_timeout_s: int = 300

    _ws: Any = field(default=None, init=False, repr=False)
    _tick_cache: dict[int, TickSnapshot] = field(default_factory=dict, init=False)
    _depth_cache: dict[int, DepthSnapshot] = field(default_factory=dict, init=False)
    _subs: set[int] = field(default_factory=set, init=False)
    _depth_subs: set[int] = field(default_factory=set, init=False)
    _last_polled_at: dict[int, float] = field(default_factory=dict, init=False)
    _forced_subs: set[int] = field(default_factory=set, init=False)
    _reader_task: asyncio.Task | None = field(default=None, init=False)
    _idle_task: asyncio.Task | None = field(default=None, init=False)
    _stopped: bool = field(default=False, init=False)

    @property
    def connected(self) -> bool:
        return self._ws is not None and not getattr(self._ws, "closed", True)

    async def start(self) -> None:
        if self.connected:
            return
        self._stopped = False  # reset on every start (supports stop→start cycles)
        lst = await self.signer.get_live_session_token(base_url=self.base_url)
        self._ws = await websockets_connect(
            self.ws_url,
            additional_headers={"Cookie": f"api={lst}"},
        )
        self._reader_task = asyncio.create_task(self._reader_loop())
        self._idle_task = asyncio.create_task(self._idle_sweeper())

    async def stop(self) -> None:
        self._stopped = True
        if self._idle_task:
            self._idle_task.cancel()
            with contextlib.suppress(BaseException):
                await self._idle_task
        if self._reader_task:
            self._reader_task.cancel()
            with contextlib.suppress(BaseException):
                await self._reader_task
        if self._ws:
            with contextlib.suppress(Exception):
                await self._ws.close()
        self._ws = None

    async def subscribe_tick(self, conid: int, *, forced: bool = False) -> None:
        self._require_started()
        await self._ensure_capacity(conid)
        if conid in self._subs:
            self._last_polled_at[conid] = time.monotonic()
            if forced:
                self._forced_subs.add(conid)
            return
        msg = "smd+" + str(conid) + "+" + json.dumps({"fields": _SMD_FIELDS})
        await self._ws.send(msg)
        self._subs.add(conid)
        self._last_polled_at[conid] = time.monotonic()
        if forced:
            self._forced_subs.add(conid)

    async def subscribe_depth(
        self, conid: int, *, exchange: str = "SMART", rows: int = 5
    ) -> None:
        self._require_started()
        await self._ensure_capacity(conid)
        if conid in self._depth_subs:
            self._last_polled_at[conid] = time.monotonic()
            return
        msg = f"sbd+{conid}+{exchange}+{rows}"
        await self._ws.send(msg)
        self._depth_subs.add(conid)
        self._last_polled_at[conid] = time.monotonic()

    async def unsubscribe(self, conid: int) -> None:
        self._require_started()
        if conid in self._subs:
            await self._ws.send(f"umd+{conid}+{{}}")
            self._subs.discard(conid)
        if conid in self._depth_subs:
            await self._ws.send(f"ubd+{conid}")
            self._depth_subs.discard(conid)
        self._tick_cache.pop(conid, None)
        self._depth_cache.pop(conid, None)
        self._last_polled_at.pop(conid, None)
        self._forced_subs.discard(conid)

    def get_tick_snapshot(self, conid: int) -> dict | None:
        snap = self._tick_cache.get(conid)
        if not snap:
            return None
        self._last_polled_at[conid] = time.monotonic()
        return {
            "conid": conid,
            "last_price": snap.last_price,
            "bid": snap.bid,
            "ask": snap.ask,
            "bid_size": snap.bid_size,
            "ask_size": snap.ask_size,
            "timestamp_ms": snap.timestamp_ms,
        }

    def get_depth_snapshot(self, conid: int) -> dict | None:
        snap = self._depth_cache.get(conid)
        if not snap:
            return None
        self._last_polled_at[conid] = time.monotonic()
        return {
            "conid": conid,
            "bids": snap.bids,
            "asks": snap.asks,
            "timestamp_ms": snap.timestamp_ms,
        }

    def _require_started(self) -> None:
        if self._ws is None:
            raise WSError("IBKR_WS_NOT_STARTED: call start() first")

    async def _ensure_capacity(self, conid: int) -> None:
        if (conid in self._subs) or (conid in self._depth_subs):
            return
        active = len(self._subs) + len(self._depth_subs)
        if active >= self.max_subs:
            raise WSError(f"IBKR_WS_SUB_LIMIT: {active}/{self.max_subs}")

    async def _reader_loop(self) -> None:
        try:
            while not self._stopped and self._ws:
                raw = await self._ws.recv()
                try:
                    msg = json.loads(raw)
                except json.JSONDecodeError:
                    continue
                topic = msg.get("topic", "")
                if topic.startswith("smd+"):
                    self._on_tick(topic, msg)
                elif topic.startswith("sbd+"):
                    self._on_depth(topic, msg)
        except asyncio.CancelledError:
            raise
        except Exception as exc:
            # Disconnect / parse error / network — leave cache as-is, mark dead.
            # V1: no automatic reconnect; consumers detect via stale timestamp_ms.
            logger.warning("ibkr ws reader exited: %s", exc)
            self._ws = None
            return

    def _on_tick(self, topic: str, msg: dict) -> None:
        try:
            conid = int(topic.split("+", 1)[1])
        except (ValueError, IndexError):
            return

        def _f(k: str) -> float | None:
            v = msg.get(k)
            try:
                return float(v) if v not in (None, "") else None
            except (TypeError, ValueError):
                return None

        self._tick_cache[conid] = TickSnapshot(
            last_price=_f("31"), bid=_f("84"), ask=_f("86"),
            bid_size=_f("7295"), ask_size=_f("7296"),
            timestamp_ms=int(time.time() * 1000),
        )

    def _on_depth(self, topic: str, msg: dict) -> None:
        try:
            conid = int(topic.split("+", 1)[1])
        except (ValueError, IndexError):
            return
        self._depth_cache[conid] = DepthSnapshot(
            bids=msg.get("bids") or [],
            asks=msg.get("asks") or [],
            timestamp_ms=int(time.time() * 1000),
        )

    async def _idle_sweeper(self) -> None:
        try:
            while not self._stopped:
                await asyncio.sleep(30)
                now = time.monotonic()
                expired = [
                    c for c in list(self._subs | self._depth_subs)
                    if c not in self._forced_subs
                    and now - self._last_polled_at.get(c, now) > self.idle_timeout_s
                ]
                for c in expired:
                    with contextlib.suppress(Exception):
                        await self.unsubscribe(c)
        except asyncio.CancelledError:
            raise
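
To make the snapshot-staleness contract from the class docstring concrete, here is a minimal consumer-side sketch (illustrative only, not part of the diff: `read_fresh_tick` is a hypothetical helper, and the `IBKRWebSocket` is assumed already started with a resolved conid):

import time

async def read_fresh_tick(ws: IBKRWebSocket, conid: int, max_age_ms: int = 2_000):
    # Subscribing is idempotent: when already active it only refreshes _last_polled_at.
    await ws.subscribe_tick(conid)
    snap = ws.get_tick_snapshot(conid)
    if snap is None:
        return None  # no message cached yet
    age_ms = int(time.time() * 1000) - snap["timestamp_ms"]
    if not ws.connected or age_ms > max_age_ms:
        return None  # dead socket or stale cache: caller should rebuild the WS
    return snap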
@@ -8,6 +8,8 @@ instantiated by the `ClientRegistry`.
 """
 from __future__ import annotations

+from typing import Any
+

 class MacroClient:
     """FRED/Finnhub credentials wrapper. Stateless, no HTTP session."""
@@ -18,3 +20,7 @@ class MacroClient:

     async def aclose(self) -> None:  # pragma: no cover - no-op, no resources
         return None
+
+    async def health(self) -> dict[str, Any]:
+        """Minimal probe for /health/ready: no network call."""
+        return {"status": "ok"}
@@ -9,6 +9,8 @@ and to be instantiated by the `ClientRegistry`.
 """
 from __future__ import annotations

+from typing import Any
+

 class SentimentClient:
     """CryptoPanic/LunarCrush credentials wrapper. Stateless, no HTTP session."""
@@ -19,3 +21,7 @@ class SentimentClient:

     async def aclose(self) -> None:  # pragma: no cover - no-op, no resources
         return None
+
+    async def health(self) -> dict[str, Any]:
+        """Minimal probe for /health/ready: no network call."""
+        return {"status": "ok"}
@@ -11,6 +11,7 @@ from typing import Literal, cast
 from fastapi import APIRouter, Depends, Request

 from cerbero_mcp.client_registry import ClientRegistry
+from cerbero_mcp.common.audit_helpers import audit_call
 from cerbero_mcp.exchanges.alpaca import tools as t
 from cerbero_mcp.exchanges.alpaca.client import AlpacaClient

@@ -136,41 +137,86 @@ def make_router() -> APIRouter:
         client: AlpacaClient = Depends(get_alpaca_client),
     ):
         creds = _build_creds(request)
-        return await t.place_order(client, params, creds=creds)
+        return await audit_call(
+            request=request,
+            exchange="alpaca",
+            action="place_order",
+            target_field="symbol",
+            params=params,
+            tool_fn=lambda: t.place_order(client, params, creds=creds),
+        )

     @r.post("/tools/amend_order")
     async def _amend_order(
         params: t.AmendOrderReq,
+        request: Request,
         client: AlpacaClient = Depends(get_alpaca_client),
     ):
-        return await t.amend_order(client, params)
+        return await audit_call(
+            request=request,
+            exchange="alpaca",
+            action="amend_order",
+            target_field="order_id",
+            params=params,
+            tool_fn=lambda: t.amend_order(client, params),
+        )

     @r.post("/tools/cancel_order")
     async def _cancel_order(
         params: t.CancelOrderReq,
+        request: Request,
         client: AlpacaClient = Depends(get_alpaca_client),
     ):
-        return await t.cancel_order(client, params)
+        return await audit_call(
+            request=request,
+            exchange="alpaca",
+            action="cancel_order",
+            target_field="order_id",
+            params=params,
+            tool_fn=lambda: t.cancel_order(client, params),
+        )

     @r.post("/tools/cancel_all_orders")
     async def _cancel_all_orders(
         params: t.CancelAllOrdersReq,
+        request: Request,
         client: AlpacaClient = Depends(get_alpaca_client),
     ):
-        return await t.cancel_all_orders(client, params)
+        return await audit_call(
+            request=request,
+            exchange="alpaca",
+            action="cancel_all_orders",
+            params=params,
+            tool_fn=lambda: t.cancel_all_orders(client, params),
+        )

     @r.post("/tools/close_position")
     async def _close_position(
         params: t.ClosePositionReq,
+        request: Request,
         client: AlpacaClient = Depends(get_alpaca_client),
     ):
-        return await t.close_position(client, params)
+        return await audit_call(
+            request=request,
+            exchange="alpaca",
+            action="close_position",
+            target_field="symbol",
+            params=params,
+            tool_fn=lambda: t.close_position(client, params),
+        )

     @r.post("/tools/close_all_positions")
     async def _close_all_positions(
         params: t.CloseAllPositionsReq,
+        request: Request,
         client: AlpacaClient = Depends(get_alpaca_client),
     ):
-        return await t.close_all_positions(client, params)
+        return await audit_call(
+            request=request,
+            exchange="alpaca",
+            action="close_all_positions",
+            params=params,
+            tool_fn=lambda: t.close_all_positions(client, params),
+        )

     return r
@@ -11,6 +11,7 @@ from typing import Literal, cast
 from fastapi import APIRouter, Depends, Request

 from cerbero_mcp.client_registry import ClientRegistry
+from cerbero_mcp.common.audit_helpers import audit_call
 from cerbero_mcp.exchanges.bybit import tools as t
 from cerbero_mcp.exchanges.bybit.client import BybitClient

@@ -182,7 +183,14 @@ def make_router() -> APIRouter:
         client: BybitClient = Depends(get_bybit_client),
     ):
         creds = _build_creds(request)
-        return await t.place_order(client, params, creds=creds)
+        return await audit_call(
+            request=request,
+            exchange="bybit",
+            action="place_order",
+            target_field="symbol",
+            params=params,
+            tool_fn=lambda: t.place_order(client, params, creds=creds),
+        )

     @r.post("/tools/place_combo_order")
     async def _place_combo_order(
@@ -191,49 +199,103 @@ def make_router() -> APIRouter:
         client: BybitClient = Depends(get_bybit_client),
     ):
         creds = _build_creds(request)
-        return await t.place_combo_order(client, params, creds=creds)
+        return await audit_call(
+            request=request,
+            exchange="bybit",
+            action="place_combo_order",
+            params=params,
+            tool_fn=lambda: t.place_combo_order(client, params, creds=creds),
+        )

     @r.post("/tools/amend_order")
     async def _amend_order(
         params: t.AmendOrderReq,
+        request: Request,
         client: BybitClient = Depends(get_bybit_client),
     ):
-        return await t.amend_order(client, params)
+        return await audit_call(
+            request=request,
+            exchange="bybit",
+            action="amend_order",
+            target_field="symbol",
+            params=params,
+            tool_fn=lambda: t.amend_order(client, params),
+        )

     @r.post("/tools/cancel_order")
     async def _cancel_order(
         params: t.CancelOrderReq,
+        request: Request,
         client: BybitClient = Depends(get_bybit_client),
     ):
-        return await t.cancel_order(client, params)
+        return await audit_call(
+            request=request,
+            exchange="bybit",
+            action="cancel_order",
+            target_field="order_id",
+            params=params,
+            tool_fn=lambda: t.cancel_order(client, params),
+        )

     @r.post("/tools/cancel_all_orders")
     async def _cancel_all_orders(
         params: t.CancelAllReq,
+        request: Request,
         client: BybitClient = Depends(get_bybit_client),
     ):
-        return await t.cancel_all_orders(client, params)
+        return await audit_call(
+            request=request,
+            exchange="bybit",
+            action="cancel_all_orders",
+            target_field="symbol",
+            params=params,
+            tool_fn=lambda: t.cancel_all_orders(client, params),
+        )

     @r.post("/tools/set_stop_loss")
     async def _set_stop_loss(
         params: t.SetStopLossReq,
+        request: Request,
         client: BybitClient = Depends(get_bybit_client),
     ):
-        return await t.set_stop_loss(client, params)
+        return await audit_call(
+            request=request,
+            exchange="bybit",
+            action="set_stop_loss",
+            target_field="symbol",
+            params=params,
+            tool_fn=lambda: t.set_stop_loss(client, params),
+        )

     @r.post("/tools/set_take_profit")
     async def _set_take_profit(
         params: t.SetTakeProfitReq,
+        request: Request,
         client: BybitClient = Depends(get_bybit_client),
     ):
-        return await t.set_take_profit(client, params)
+        return await audit_call(
+            request=request,
+            exchange="bybit",
+            action="set_take_profit",
+            target_field="symbol",
+            params=params,
+            tool_fn=lambda: t.set_take_profit(client, params),
+        )

     @r.post("/tools/close_position")
     async def _close_position(
         params: t.ClosePositionReq,
+        request: Request,
         client: BybitClient = Depends(get_bybit_client),
     ):
-        return await t.close_position(client, params)
+        return await audit_call(
+            request=request,
+            exchange="bybit",
+            action="close_position",
+            target_field="symbol",
+            params=params,
+            tool_fn=lambda: t.close_position(client, params),
+        )

     @r.post("/tools/set_leverage")
     async def _set_leverage(
@@ -242,20 +304,43 @@ def make_router() -> APIRouter:
         client: BybitClient = Depends(get_bybit_client),
     ):
         creds = _build_creds(request)
-        return await t.set_leverage(client, params, creds=creds)
+        return await audit_call(
+            request=request,
+            exchange="bybit",
+            action="set_leverage",
+            target_field="symbol",
+            params=params,
+            tool_fn=lambda: t.set_leverage(client, params, creds=creds),
+        )

     @r.post("/tools/switch_position_mode")
     async def _switch_position_mode(
         params: t.SwitchModeReq,
+        request: Request,
         client: BybitClient = Depends(get_bybit_client),
     ):
-        return await t.switch_position_mode(client, params)
+        return await audit_call(
+            request=request,
+            exchange="bybit",
+            action="switch_position_mode",
+            target_field="symbol",
+            params=params,
+            tool_fn=lambda: t.switch_position_mode(client, params),
+        )

     @r.post("/tools/transfer_asset")
     async def _transfer_asset(
         params: t.TransferReq,
+        request: Request,
         client: BybitClient = Depends(get_bybit_client),
     ):
-        return await t.transfer_asset(client, params)
+        return await audit_call(
+            request=request,
+            exchange="bybit",
+            action="transfer_asset",
+            target_field="coin",
+            params=params,
+            tool_fn=lambda: t.transfer_asset(client, params),
+        )

     return r
@@ -0,0 +1,36 @@
"""Router /mcp-cross/* — historical data with cross-exchange consensus."""
from __future__ import annotations

from typing import Literal, cast

from fastapi import APIRouter, Depends, Request

from cerbero_mcp.client_registry import ClientRegistry
from cerbero_mcp.exchanges.cross import tools as t
from cerbero_mcp.exchanges.cross.client import CrossClient

Environment = Literal["testnet", "mainnet"]


def get_environment(request: Request) -> Environment:
    return cast(Environment, request.state.environment)


def get_cross_client(
    request: Request, env: Environment = Depends(get_environment),
) -> CrossClient:
    registry: ClientRegistry = request.app.state.registry
    return CrossClient(registry, env=env)


def make_router() -> APIRouter:
    r = APIRouter(prefix="/mcp-cross", tags=["cross"])

    @r.post("/tools/get_historical")
    async def _get_historical(
        params: t.GetHistoricalReq,
        client: CrossClient = Depends(get_cross_client),
    ):
        return await t.get_historical(client, params)

    return r
@@ -11,6 +11,7 @@ from typing import Literal, cast
 from fastapi import APIRouter, Depends, Request

 from cerbero_mcp.client_registry import ClientRegistry
+from cerbero_mcp.common.audit_helpers import audit_call
 from cerbero_mcp.exchanges.deribit import tools as t
 from cerbero_mcp.exchanges.deribit.client import DeribitClient

@@ -249,7 +250,14 @@ def make_router() -> APIRouter:
         client: DeribitClient = Depends(get_deribit_client),
     ):
         creds = _build_creds(request)
-        return await t.place_order(client, params, creds=creds)
+        return await audit_call(
+            request=request,
+            exchange="deribit",
+            action="place_order",
+            target_field="instrument_name",
+            params=params,
+            tool_fn=lambda: t.place_order(client, params, creds=creds),
+        )

     @r.post("/tools/place_combo_order")
     async def _place_combo_order(
@@ -258,34 +266,72 @@ def make_router() -> APIRouter:
         client: DeribitClient = Depends(get_deribit_client),
     ):
         creds = _build_creds(request)
-        return await t.place_combo_order(client, params, creds=creds)
+        return await audit_call(
+            request=request,
+            exchange="deribit",
+            action="place_combo_order",
+            params=params,
+            tool_fn=lambda: t.place_combo_order(client, params, creds=creds),
+        )

     @r.post("/tools/cancel_order")
     async def _cancel_order(
         params: t.CancelOrderReq,
+        request: Request,
         client: DeribitClient = Depends(get_deribit_client),
     ):
-        return await t.cancel_order(client, params)
+        return await audit_call(
+            request=request,
+            exchange="deribit",
+            action="cancel_order",
+            target_field="order_id",
+            params=params,
+            tool_fn=lambda: t.cancel_order(client, params),
+        )

     @r.post("/tools/set_stop_loss")
     async def _set_stop_loss(
         params: t.SetStopLossReq,
+        request: Request,
         client: DeribitClient = Depends(get_deribit_client),
     ):
-        return await t.set_stop_loss(client, params)
+        return await audit_call(
+            request=request,
+            exchange="deribit",
+            action="set_stop_loss",
+            target_field="order_id",
+            params=params,
+            tool_fn=lambda: t.set_stop_loss(client, params),
+        )

     @r.post("/tools/set_take_profit")
     async def _set_take_profit(
         params: t.SetTakeProfitReq,
+        request: Request,
         client: DeribitClient = Depends(get_deribit_client),
     ):
-        return await t.set_take_profit(client, params)
+        return await audit_call(
+            request=request,
+            exchange="deribit",
+            action="set_take_profit",
+            target_field="order_id",
+            params=params,
+            tool_fn=lambda: t.set_take_profit(client, params),
+        )

     @r.post("/tools/close_position")
     async def _close_position(
         params: t.ClosePositionReq,
+        request: Request,
         client: DeribitClient = Depends(get_deribit_client),
     ):
-        return await t.close_position(client, params)
+        return await audit_call(
+            request=request,
+            exchange="deribit",
+            action="close_position",
+            target_field="instrument_name",
+            params=params,
+            tool_fn=lambda: t.close_position(client, params),
+        )

     return r
@@ -11,6 +11,7 @@ from typing import Literal, cast
 from fastapi import APIRouter, Depends, Request

 from cerbero_mcp.client_registry import ClientRegistry
+from cerbero_mcp.common.audit_helpers import audit_call
 from cerbero_mcp.exchanges.hyperliquid import tools as t
 from cerbero_mcp.exchanges.hyperliquid.client import HyperliquidClient

@@ -136,34 +137,73 @@ def make_router() -> APIRouter:
         client: HyperliquidClient = Depends(get_hyperliquid_client),
     ):
         creds = _build_creds(request)
-        return await t.place_order(client, params, creds=creds)
+        return await audit_call(
+            request=request,
+            exchange="hyperliquid",
+            action="place_order",
+            target_field="instrument",
+            params=params,
+            tool_fn=lambda: t.place_order(client, params, creds=creds),
+        )

     @r.post("/tools/cancel_order")
     async def _cancel_order(
         params: t.CancelOrderReq,
+        request: Request,
         client: HyperliquidClient = Depends(get_hyperliquid_client),
     ):
-        return await t.cancel_order(client, params)
+        return await audit_call(
+            request=request,
+            exchange="hyperliquid",
+            action="cancel_order",
+            target_field="order_id",
+            params=params,
+            tool_fn=lambda: t.cancel_order(client, params),
+        )

     @r.post("/tools/set_stop_loss")
     async def _set_stop_loss(
         params: t.SetStopLossReq,
+        request: Request,
         client: HyperliquidClient = Depends(get_hyperliquid_client),
     ):
-        return await t.set_stop_loss(client, params)
+        return await audit_call(
+            request=request,
+            exchange="hyperliquid",
+            action="set_stop_loss",
+            target_field="instrument",
+            params=params,
+            tool_fn=lambda: t.set_stop_loss(client, params),
+        )

     @r.post("/tools/set_take_profit")
     async def _set_take_profit(
         params: t.SetTakeProfitReq,
+        request: Request,
         client: HyperliquidClient = Depends(get_hyperliquid_client),
     ):
-        return await t.set_take_profit(client, params)
+        return await audit_call(
+            request=request,
+            exchange="hyperliquid",
+            action="set_take_profit",
+            target_field="instrument",
+            params=params,
+            tool_fn=lambda: t.set_take_profit(client, params),
+        )

     @r.post("/tools/close_position")
     async def _close_position(
         params: t.ClosePositionReq,
+        request: Request,
         client: HyperliquidClient = Depends(get_hyperliquid_client),
     ):
-        return await t.close_position(client, params)
+        return await audit_call(
+            request=request,
+            exchange="hyperliquid",
+            action="close_position",
+            target_field="instrument",
+            params=params,
+            tool_fn=lambda: t.close_position(client, params),
+        )

     return r
@@ -0,0 +1,248 @@
"""Router /mcp-ibkr/* — per-env DI for client and (write) creds."""
from __future__ import annotations

from typing import Literal, cast

from fastapi import APIRouter, Depends, Request

from cerbero_mcp.client_registry import ClientRegistry
from cerbero_mcp.common.audit_helpers import audit_call
from cerbero_mcp.exchanges.ibkr import tools as t
from cerbero_mcp.exchanges.ibkr.client import IBKRClient
from cerbero_mcp.exchanges.ibkr.ws import IBKRWebSocket

Environment = Literal["testnet", "mainnet"]


def get_environment(request: Request) -> Environment:
    return cast(Environment, request.state.environment)


async def get_ibkr_client(
    request: Request, env: Environment = Depends(get_environment),
) -> IBKRClient:
    registry: ClientRegistry = request.app.state.registry
    return cast(IBKRClient, await registry.get("ibkr", env))


async def get_ibkr_ws(
    request: Request, env: Environment = Depends(get_environment),
) -> IBKRWebSocket:
    """Lazy-create singleton WS per env on first streaming call."""
    ws_dict = getattr(request.app.state, "ibkr_ws", None)
    if ws_dict is None:
        ws_dict = {}
        request.app.state.ibkr_ws = ws_dict
    if env not in ws_dict:
        client = await get_ibkr_client(request, env)
        settings = request.app.state.settings
        ws_url = (
            settings.ibkr.ws_url_testnet if env == "testnet"
            else settings.ibkr.ws_url_live
        )
        ws = IBKRWebSocket(
            signer=client.signer,
            ws_url=ws_url,
            base_url=client.base_url,
            max_subs=settings.ibkr.ws_max_subscriptions,
            idle_timeout_s=settings.ibkr.ws_idle_timeout_s,
        )
        await ws.start()
        ws_dict[env] = ws
    return ws_dict[env]
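
A quick illustration of the per-env singleton behaviour (hypothetical `request` object, illustrative only, not part of the diff):

# Two streaming calls in the same env reuse one socket; envs never share one.
ws_a = await get_ibkr_ws(request, "testnet")
ws_b = await get_ibkr_ws(request, "testnet")
assert ws_a is ws_b                                   # cached on app.state.ibkr_ws
assert await get_ibkr_ws(request, "mainnet") is not ws_a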

def _build_creds(request: Request) -> dict:
    settings = request.app.state.settings
    return {"max_leverage": settings.ibkr.max_leverage}


def make_router() -> APIRouter:
    r = APIRouter(prefix="/mcp-ibkr", tags=["ibkr"])

    # === READ tools ===

    @r.post("/tools/environment_info")
    async def _ei(request: Request, client: IBKRClient = Depends(get_ibkr_client)):
        return await t.environment_info(client, creds=_build_creds(request))

    @r.post("/tools/get_account")
    async def _ga(params: t.GetAccountReq, client: IBKRClient = Depends(get_ibkr_client)):
        return await t.get_account(client, params)

    @r.post("/tools/get_positions")
    async def _gp(params: t.GetPositionsReq, client: IBKRClient = Depends(get_ibkr_client)):
        return await t.get_positions(client, params)

    @r.post("/tools/get_open_orders")
    async def _goo(params: t.GetOpenOrdersReq, client: IBKRClient = Depends(get_ibkr_client)):
        return await t.get_open_orders(client, params)

    @r.post("/tools/get_activities")
    async def _gact(params: t.GetActivitiesReq, client: IBKRClient = Depends(get_ibkr_client)):
        return await t.get_activities(client, params)

    @r.post("/tools/get_ticker")
    async def _gt(params: t.GetTickerReq, client: IBKRClient = Depends(get_ibkr_client)):
        return await t.get_ticker(client, params)

    @r.post("/tools/get_bars")
    async def _gb(params: t.GetBarsReq, client: IBKRClient = Depends(get_ibkr_client)):
        return await t.get_bars(client, params)

    @r.post("/tools/get_snapshot")
    async def _gs(params: t.GetSnapshotReq, client: IBKRClient = Depends(get_ibkr_client)):
        return await t.get_snapshot(client, params)

    @r.post("/tools/get_option_chain")
    async def _goc(params: t.GetOptionChainReq, client: IBKRClient = Depends(get_ibkr_client)):
        return await t.get_option_chain(client, params)

    @r.post("/tools/search_contracts")
    async def _sc(params: t.SearchContractsReq, client: IBKRClient = Depends(get_ibkr_client)):
        return await t.search_contracts(client, params)

    @r.post("/tools/get_clock")
    async def _gc(params: t.GetClockReq, client: IBKRClient = Depends(get_ibkr_client)):
        return await t.get_clock(client, params)

    # === STREAMING tools ===

    @r.post("/tools/get_tick")
    async def _gtk(
        params: t.GetTickReq,
        client: IBKRClient = Depends(get_ibkr_client),
        ws: IBKRWebSocket = Depends(get_ibkr_ws),
    ):
        return await t.get_tick(client, params, ws=ws)

    @r.post("/tools/get_depth")
    async def _gd(
        params: t.GetDepthReq,
        client: IBKRClient = Depends(get_ibkr_client),
        ws: IBKRWebSocket = Depends(get_ibkr_ws),
    ):
        return await t.get_depth(client, params, ws=ws)

    @r.post("/tools/subscribe_tick")
    async def _st(
        params: t.SubscribeTickReq,
        client: IBKRClient = Depends(get_ibkr_client),
        ws: IBKRWebSocket = Depends(get_ibkr_ws),
    ):
        return await t.subscribe_tick(client, params, ws=ws)

    @r.post("/tools/unsubscribe")
    async def _us(
        params: t.UnsubscribeReq,
        client: IBKRClient = Depends(get_ibkr_client),
        ws: IBKRWebSocket = Depends(get_ibkr_ws),
    ):
        return await t.unsubscribe(client, params, ws=ws)

    # === WRITE simple ===

    @r.post("/tools/place_order")
    async def _po(
        params: t.PlaceOrderReq, request: Request,
        client: IBKRClient = Depends(get_ibkr_client),
    ):
        creds = _build_creds(request)
        return await audit_call(
            request=request, exchange="ibkr", action="place_order",
            target_field="symbol", params=params,
            tool_fn=lambda: t.place_order(client, params, creds=creds),
        )

    @r.post("/tools/amend_order")
    async def _ao(
        params: t.AmendOrderReq, request: Request,
        client: IBKRClient = Depends(get_ibkr_client),
    ):
        return await audit_call(
            request=request, exchange="ibkr", action="amend_order",
            target_field="order_id", params=params,
            tool_fn=lambda: t.amend_order(client, params),
        )

    @r.post("/tools/cancel_order")
    async def _co(
        params: t.CancelOrderReq, request: Request,
        client: IBKRClient = Depends(get_ibkr_client),
    ):
        return await audit_call(
            request=request, exchange="ibkr", action="cancel_order",
            target_field="order_id", params=params,
            tool_fn=lambda: t.cancel_order(client, params),
        )

    @r.post("/tools/cancel_all_orders")
    async def _cao(
        params: t.CancelAllOrdersReq, request: Request,
        client: IBKRClient = Depends(get_ibkr_client),
    ):
        return await audit_call(
            request=request, exchange="ibkr", action="cancel_all_orders",
            params=params, tool_fn=lambda: t.cancel_all_orders(client, params),
        )

    @r.post("/tools/close_position")
    async def _cp(
        params: t.ClosePositionReq, request: Request,
        client: IBKRClient = Depends(get_ibkr_client),
    ):
        return await audit_call(
            request=request, exchange="ibkr", action="close_position",
            target_field="symbol", params=params,
            tool_fn=lambda: t.close_position(client, params),
        )

    @r.post("/tools/close_all_positions")
    async def _cap(
        params: t.CloseAllPositionsReq, request: Request,
        client: IBKRClient = Depends(get_ibkr_client),
    ):
        return await audit_call(
            request=request, exchange="ibkr", action="close_all_positions",
            params=params, tool_fn=lambda: t.close_all_positions(client, params),
        )

    # === WRITE complex ===

    @r.post("/tools/place_bracket_order")
    async def _pbo(
        params: t.PlaceBracketOrderReq, request: Request,
        client: IBKRClient = Depends(get_ibkr_client),
    ):
        creds = _build_creds(request)
        return await audit_call(
            request=request, exchange="ibkr", action="place_bracket_order",
            target_field="symbol", params=params,
            tool_fn=lambda: t.place_bracket_order(client, params, creds=creds),
        )

    @r.post("/tools/place_oco_order")
    async def _poco(
        params: t.PlaceOcoOrderReq, request: Request,
        client: IBKRClient = Depends(get_ibkr_client),
    ):
        creds = _build_creds(request)
        return await audit_call(
            request=request, exchange="ibkr", action="place_oco_order",
            params=params,
            tool_fn=lambda: t.place_oco_order(client, params, creds=creds),
        )

    @r.post("/tools/place_oto_order")
    async def _poto(
        params: t.PlaceOtoOrderReq, request: Request,
        client: IBKRClient = Depends(get_ibkr_client),
    ):
        creds = _build_creds(request)
        return await audit_call(
            request=request, exchange="ibkr", action="place_oto_order",
            params=params,
            tool_fn=lambda: t.place_oto_order(client, params, creds=creds),
        )

    return r
@@ -1,7 +1,9 @@
 """FastAPI app factory with middleware, swagger, and exception handlers."""
 from __future__ import annotations

+import asyncio
 import json
+import os
 import time
 from datetime import UTC, datetime
 from typing import Any
@@ -18,6 +20,7 @@ from cerbero_mcp.common.errors import (
     RETRYABLE_STATUSES,
     error_envelope,
 )
+from cerbero_mcp.common.request_log import install_request_log_middleware


 class _TimestampInjectorMiddleware(BaseHTTPMiddleware):
@@ -99,6 +102,11 @@ def build_app(
         app, testnet_token=testnet_token, mainnet_token=mainnet_token
     )

+    # Request-log middleware: registered AFTER auth → starlette runs
+    # middleware in reverse (LIFO) order → request_log is outermost,
+    # auth sits inside it and populates request.state.* before returning.
+    install_request_log_middleware(app)
+
     app.add_middleware(_TimestampInjectorMiddleware)
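
The LIFO registration order noted in the comment above is the whole trick. A self-contained sketch (toy middleware names, not project code) that demonstrates Starlette's ordering:

from starlette.applications import Starlette
from starlette.middleware.base import BaseHTTPMiddleware

seen: list[str] = []

class Tag(BaseHTTPMiddleware):
    def __init__(self, app, name: str):
        super().__init__(app)
        self.name = name

    async def dispatch(self, request, call_next):
        seen.append(self.name)  # records which layer sees the request first
        return await call_next(request)

demo = Starlette()
demo.add_middleware(Tag, name="added_first")  # ends up innermost
demo.add_middleware(Tag, name="added_last")   # ends up outermost
# After one request: seen == ["added_last", "added_first"]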

     @app.middleware("http")
@@ -128,6 +136,7 @@ def build_app(
             content=error_envelope(
                 type_="http_error", code=code, message=message,
                 retryable=retryable, details=details,
+                request_id=getattr(request.state, "request_id", None),
             ),
         )

@@ -155,6 +164,7 @@ def build_app(
                 message=f"request body validation failed on {first_loc}",
                 retryable=False, suggested_fix=suggestion,
                 details={"errors": safe_errs},
+                request_id=getattr(request.state, "request_id", None),
             ),
         )

@@ -166,6 +176,7 @@ def build_app(
                 type_="internal_error", code="UNHANDLED_EXCEPTION",
                 message=f"{type(exc).__name__}: {str(exc)[:300]}",
                 retryable=True,
+                request_id=getattr(request.state, "request_id", None),
             ),
         )

@@ -179,6 +190,79 @@ def build_app(
         "data_timestamp": datetime.now(UTC).isoformat(),
     }

+    @app.get("/health/ready", tags=["system"])
+    async def health_ready():
+        """Readiness check: pings every cached exchange client.
+
+        - Iterates ``app.state.registry._clients`` (if present).
+        - For each client, tries ``health()`` (preferred) or ``is_testnet()``.
+          If neither probe method exists, marks it with ``note: no probe method``.
+        - 2s timeout per client via ``asyncio.wait_for``.
+        - Global status: ``ready`` if all clients are ok, ``degraded`` if at
+          least one fails, ``not_ready`` if the registry is empty.
+        - HTTP 200 by default; with ``READY_FAILS_ON_DEGRADED=true`` it returns
+          503 whenever the status is not ``ready`` (useful for k8s probes).
+        """
+        registry = getattr(app.state, "registry", None)
+        clients_status: list[dict[str, Any]] = []
+        if registry is not None:
+            for (exchange, env), client in registry._clients.items():
+                t0 = time.perf_counter()
+                ping = (
+                    getattr(client, "health", None)
+                    or getattr(client, "is_testnet", None)
+                )
+                if ping is None:
+                    clients_status.append({
+                        "exchange": exchange,
+                        "env": env,
+                        "healthy": True,
+                        "note": "no probe method",
+                    })
+                    continue
+                try:
+                    res = ping()
+                    if asyncio.iscoroutine(res):
+                        await asyncio.wait_for(res, timeout=2.0)
+                    dur = (time.perf_counter() - t0) * 1000
+                    clients_status.append({
+                        "exchange": exchange,
+                        "env": env,
+                        "healthy": True,
+                        "duration_ms": round(dur, 2),
+                    })
+                except Exception as e:
+                    clients_status.append({
+                        "exchange": exchange,
+                        "env": env,
+                        "healthy": False,
+                        "error": f"{type(e).__name__}: {str(e)[:200]}",
+                    })
+
+        if not clients_status:
+            status_label = "not_ready"
+        elif all(c["healthy"] for c in clients_status):
+            status_label = "ready"
+        else:
+            status_label = "degraded"
+
+        fail_on_degraded = os.environ.get(
+            "READY_FAILS_ON_DEGRADED", "false"
+        ).lower() in ("1", "true", "yes")
+        http_code = 200
+        if fail_on_degraded and status_label != "ready":
+            http_code = 503
+
+        body = {
+            "status": status_label,
+            "name": title,
+            "version": version,
+            "uptime_seconds": int(time.time() - app.state.boot_at),
+            "data_timestamp": datetime.now(UTC).isoformat(),
+            "clients": clients_status,
+        }
+        return JSONResponse(status_code=http_code, content=body)
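
For reference, a degraded readiness response body might look like the following (all values hypothetical):

{
  "status": "degraded",
  "name": "cerbero-mcp",
  "version": "2.0.0",
  "uptime_seconds": 842,
  "data_timestamp": "2025-01-01T12:00:00+00:00",
  "clients": [
    {"exchange": "deribit", "env": "testnet", "healthy": true, "duration_ms": 3.1},
    {"exchange": "ibkr", "env": "mainnet", "healthy": false, "error": "TimeoutError: "}
  ]
}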

     def _custom_openapi() -> dict[str, Any]:
         if app.openapi_schema:
             return app.openapi_schema
@@ -1,10 +1,22 @@
 """Pydantic Settings: reads .env and environment variables."""
 from __future__ import annotations

+from typing import TypedDict
+
 from pydantic import Field, SecretStr
 from pydantic_settings import BaseSettings, SettingsConfigDict


+class IBKRCredentials(TypedDict):
+    consumer_key: str
+    access_token: str
+    access_token_secret: str
+    signature_key_path: str
+    encryption_key_path: str
+    account_id: str
+    dh_prime: str
+
+
 class _Sub(BaseSettings):
     """Base for sub-settings; shares model_config with env_file."""
     model_config = SettingsConfigDict(
@@ -21,12 +33,33 @@ class DeribitSettings(_Sub):
         env_prefix="DERIBIT_",
         extra="ignore",
     )
-    client_id: str
-    client_secret: SecretStr
+    client_id: str | None = None
+    client_secret: SecretStr | None = None
+    client_id_testnet: str | None = None
+    client_secret_testnet: SecretStr | None = None
+    client_id_live: str | None = None
+    client_secret_live: SecretStr | None = None
     url_live: str
     url_testnet: str
     max_leverage: int = 3

+    def credentials(self, env: str) -> tuple[str, str]:
+        """Return (client_id, client_secret) for the given env.
+
+        Prefers the env-specific (_TESTNET / _LIVE) pair; falls back to the
+        base (DERIBIT_CLIENT_ID / DERIBIT_CLIENT_SECRET) for legacy
+        single-pair setups.
+        """
+        if env == "testnet":
+            cid = self.client_id_testnet or self.client_id
+            csec = self.client_secret_testnet or self.client_secret
+        elif env == "mainnet":
+            cid = self.client_id_live or self.client_id
+            csec = self.client_secret_live or self.client_secret
+        else:
+            raise ValueError(f"unknown deribit env: {env}")
+        if not cid or csec is None:
+            raise ValueError(f"Deribit credentials not configured for env={env}")
+        return cid, csec.get_secret_value()
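
A small sketch of the resolution order (hypothetical values, assuming no overriding .env; pydantic coerces the plain strings to SecretStr):

s = DeribitSettings(
    client_id="base-id", client_secret="base-secret",
    client_id_testnet="tn-id", client_secret_testnet="tn-secret",
    url_live="https://www.deribit.com/api/v2",
    url_testnet="https://test.deribit.com/api/v2",
)
assert s.credentials("testnet") == ("tn-id", "tn-secret")      # env-specific wins
assert s.credentials("mainnet") == ("base-id", "base-secret")  # falls back to the base pair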

class BybitSettings(_Sub):
    model_config = SettingsConfigDict(
@@ -71,6 +104,89 @@ class AlpacaSettings(_Sub):
     max_leverage: int = 1


+class IBKRSettings(_Sub):
+    model_config = SettingsConfigDict(
+        env_file=".env",
+        env_file_encoding="utf-8",
+        env_prefix="IBKR_",
+        extra="ignore",
+    )
+    consumer_key: str | None = None
+    access_token: str | None = None
+    access_token_secret: SecretStr | None = None
+    signature_key_path: str | None = None
+    encryption_key_path: str | None = None
+    dh_prime: SecretStr | None = None
+
+    consumer_key_testnet: str | None = None
+    access_token_testnet: str | None = None
+    access_token_secret_testnet: SecretStr | None = None
+    signature_key_path_testnet: str | None = None
+    encryption_key_path_testnet: str | None = None
+    # account_id has no base variant: paper and live accounts are always distinct
+    account_id_testnet: str | None = None
+
+    consumer_key_live: str | None = None
+    access_token_live: str | None = None
+    access_token_secret_live: SecretStr | None = None
+    signature_key_path_live: str | None = None
+    encryption_key_path_live: str | None = None
+    account_id_live: str | None = None
+
+    url_live: str = "https://api.ibkr.com/v1/api"
+    url_testnet: str = "https://api.ibkr.com/v1/api"
+    ws_url_live: str = "wss://api.ibkr.com/v1/api/ws"
+    ws_url_testnet: str = "wss://api.ibkr.com/v1/api/ws"
+    max_leverage: int = 4
+    ws_max_subscriptions: int = 80
+    ws_idle_timeout_s: int = 300
+
+    def credentials(self, env: str) -> IBKRCredentials:
+        """Return the credential dict for the given env.
+
+        Prefers env-specific (_TESTNET / _LIVE) values; falls back to the base
+        (IBKR_CONSUMER_KEY etc.) for legacy single-pair setups. Raises
+        ValueError if any required field is missing.
+        """
+        if env == "testnet":
+            ck = self.consumer_key_testnet or self.consumer_key
+            at = self.access_token_testnet or self.access_token
+            ats = self.access_token_secret_testnet or self.access_token_secret
+            sigp = self.signature_key_path_testnet or self.signature_key_path
+            encp = self.encryption_key_path_testnet or self.encryption_key_path
+            acct = self.account_id_testnet
+        elif env == "mainnet":
+            ck = self.consumer_key_live or self.consumer_key
+            at = self.access_token_live or self.access_token
+            ats = self.access_token_secret_live or self.access_token_secret
+            sigp = self.signature_key_path_live or self.signature_key_path
+            encp = self.encryption_key_path_live or self.encryption_key_path
+            acct = self.account_id_live
+        else:
+            raise ValueError(f"unknown ibkr env: {env}")
+
+        missing = [
+            n for n, v in [
+                ("consumer_key", ck), ("access_token", at),
+                ("access_token_secret", ats), ("signature_key_path", sigp),
+                ("encryption_key_path", encp), ("account_id", acct),
+                ("dh_prime", self.dh_prime),
+            ] if not v
+        ]
+        if missing:
+            raise ValueError(
+                f"IBKR credentials not configured for env={env}: missing {missing}"
+            )
+        return {
+            "consumer_key": ck,
+            "access_token": at,
+            "access_token_secret": ats.get_secret_value(),  # type: ignore[union-attr]
+            "signature_key_path": sigp,
+            "encryption_key_path": encp,
+            "account_id": acct,
+            "dh_prime": self.dh_prime.get_secret_value(),  # type: ignore[union-attr]
+        }
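
And the failure mode, sketched (assuming nothing IBKR-related is configured in the environment or .env):

s = IBKRSettings()
try:
    s.credentials("mainnet")
except ValueError as e:
    # e.g. "IBKR credentials not configured for env=mainnet:
    #       missing ['consumer_key', 'access_token', ...]"
    print(e)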


class MacroSettings(_Sub):
    model_config = SettingsConfigDict(
        env_file=".env",
@@ -103,5 +219,6 @@ class Settings(_Sub):
     bybit: BybitSettings = Field(default_factory=lambda: BybitSettings())  # type: ignore[call-arg]
     hyperliquid: HyperliquidSettings = Field(default_factory=lambda: HyperliquidSettings())  # type: ignore[call-arg]
     alpaca: AlpacaSettings = Field(default_factory=lambda: AlpacaSettings())  # type: ignore[call-arg]
+    ibkr: IBKRSettings = Field(default_factory=lambda: IBKRSettings())  # type: ignore[call-arg]
     macro: MacroSettings = Field(default_factory=lambda: MacroSettings())  # type: ignore[call-arg]
     sentiment: SentimentSettings = Field(default_factory=lambda: SentimentSettings())  # type: ignore[call-arg]
@@ -29,11 +29,11 @@ def app(monkeypatch):


def _bearer_test():
-    return {"Authorization": "Bearer t_test_123"}
+    return {"Authorization": "Bearer t_test_123", "X-Bot-Tag": "test-bot"}


def _bearer_live():
-    return {"Authorization": "Bearer t_live_456"}
+    return {"Authorization": "Bearer t_live_456", "X-Bot-Tag": "test-bot"}


# ── Spy helpers ──────────────────────────────────────────────────────────────
@@ -0,0 +1,167 @@
|
||||
from __future__ import annotations
|
||||
|
||||
import pytest
|
||||
from pydantic import BaseModel
|
||||
|
||||
|
||||
class FakeReq(BaseModel):
|
||||
instrument_name: str
|
||||
qty: float
|
||||
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_audit_call_logs_success(monkeypatch):
|
||||
from cerbero_mcp.common.audit_helpers import audit_call
|
||||
|
||||
logged = []
|
||||
|
||||
def fake_audit(**kw):
|
||||
        logged.append(kw)

    monkeypatch.setattr("cerbero_mcp.common.audit_helpers.audit_write_op", fake_audit)

    class FakeRequest:
        class _State:
            environment = "testnet"
        state = _State()

    async def tool_fn():
        return {"order_id": "abc123", "state": "filled"}

    result = await audit_call(
        request=FakeRequest(),  # type: ignore[arg-type]
        exchange="deribit",
        action="place_order",
        target_field="instrument_name",
        params=FakeReq(instrument_name="BTC-PERPETUAL", qty=0.1),
        tool_fn=tool_fn,
    )
    assert result == {"order_id": "abc123", "state": "filled"}
    assert len(logged) == 1
    rec = logged[0]
    assert rec["actor"] == "testnet"
    assert rec["exchange"] == "deribit"
    assert rec["action"] == "place_order"
    assert rec["target"] == "BTC-PERPETUAL"
    assert rec["payload"]["qty"] == 0.1
    assert rec["result"]["order_id"] == "abc123"
    assert "error" not in rec or rec.get("error") is None


@pytest.mark.asyncio
async def test_audit_call_logs_error_and_reraises(monkeypatch):
    from cerbero_mcp.common.audit_helpers import audit_call

    logged = []

    def fake_audit(**kw):
        logged.append(kw)

    monkeypatch.setattr("cerbero_mcp.common.audit_helpers.audit_write_op", fake_audit)

    class FakeRequest:
        class _State:
            environment = "mainnet"
        state = _State()

    async def tool_fn():
        raise RuntimeError("upstream timeout")

    with pytest.raises(RuntimeError, match="upstream timeout"):
        await audit_call(
            request=FakeRequest(),  # type: ignore[arg-type]
            exchange="deribit",
            action="cancel_order",
            target_field="instrument_name",
            params=FakeReq(instrument_name="BTC-PERPETUAL", qty=0.0),
            tool_fn=tool_fn,
        )
    assert len(logged) == 1
    rec = logged[0]
    assert rec["actor"] == "mainnet"
    assert "RuntimeError: upstream timeout" in rec["error"]


@pytest.mark.asyncio
async def test_audit_call_no_params_no_target():
    from cerbero_mcp.common.audit_helpers import audit_call

    class FakeRequest:
        class _State:
            environment = "testnet"
        state = _State()

    async def tool_fn():
        return {"ok": True}

    result = await audit_call(
        request=FakeRequest(),  # type: ignore[arg-type]
        exchange="bybit",
        action="cancel_all_orders",
        tool_fn=tool_fn,
    )
    assert result == {"ok": True}


@pytest.mark.asyncio
async def test_audit_call_propagates_bot_tag(monkeypatch):
    """bot_tag is read from request.state and propagated to audit_write_op."""
    from cerbero_mcp.common.audit_helpers import audit_call

    logged = []

    def fake_audit(**kw):
        logged.append(kw)

    monkeypatch.setattr("cerbero_mcp.common.audit_helpers.audit_write_op", fake_audit)

    class FakeRequest:
        class _State:
            environment = "testnet"
            bot_tag = "scanner-alpha"
        state = _State()

    async def tool_fn():
        return {"order_id": "abc"}

    await audit_call(
        request=FakeRequest(),  # type: ignore[arg-type]
        exchange="deribit",
        action="place_order",
        target_field="instrument_name",
        params=FakeReq(instrument_name="BTC-PERPETUAL", qty=0.1),
        tool_fn=tool_fn,
    )
    assert len(logged) == 1
    assert logged[0]["bot_tag"] == "scanner-alpha"
    assert logged[0]["actor"] == "testnet"


@pytest.mark.asyncio
async def test_audit_call_bot_tag_none_when_missing(monkeypatch):
    """If request.state.bot_tag does not exist, audit receives None without raising."""
    from cerbero_mcp.common.audit_helpers import audit_call

    logged = []

    def fake_audit(**kw):
        logged.append(kw)

    monkeypatch.setattr("cerbero_mcp.common.audit_helpers.audit_write_op", fake_audit)

    class FakeRequest:
        class _State:
            environment = "testnet"
        state = _State()

    async def tool_fn():
        return {"ok": True}

    await audit_call(
        request=FakeRequest(),  # type: ignore[arg-type]
        exchange="bybit",
        action="cancel_all_orders",
        tool_fn=tool_fn,
    )
    assert len(logged) == 1
    assert logged[0]["bot_tag"] is None
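Editor's note: the tests above fully pin down the observable contract of `audit_call` without showing its body. The sketch below is a hypothetical reconstruction under those constraints only (actor from `request.state.environment`, optional `bot_tag`, target extracted via `target_field`, error recorded then re-raised); the signatures of `audit_write_op` fields are assumed from the assertions, not from the real module.

```python
# Hypothetical sketch of audit_call, inferred from the tests above.
from typing import Any, Awaitable, Callable

from cerbero_mcp.common import audit_helpers  # real module per the tests


async def audit_call(
    *,
    request: Any,
    exchange: str,
    action: str,
    tool_fn: Callable[[], Awaitable[Any]],
    target_field: str | None = None,
    params: Any = None,
) -> Any:
    actor = request.state.environment
    bot_tag = getattr(request.state, "bot_tag", None)  # None when absent
    target = getattr(params, target_field) if (params and target_field) else None
    payload = vars(params) if params is not None else None

    result, error = None, None
    try:
        result = await tool_fn()
        return result
    except Exception as exc:
        error = f"{type(exc).__name__}: {exc}"
        raise  # re-raise after recording, as the error test requires
    finally:
        audit_helpers.audit_write_op(
            actor=actor, bot_tag=bot_tag, exchange=exchange,
            action=action, target=target, payload=payload,
            result=result, error=error,
        )
```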
@@ -0,0 +1,72 @@
from __future__ import annotations

import pytest
from cerbero_mcp.common.candles import Candle, validate_candles
from fastapi import HTTPException


def test_valid_candle():
    c = Candle(timestamp=1_700_000_000_000, open=100.0, high=110.0,
               low=95.0, close=105.0, volume=12.5)
    assert c.high == 110.0


def test_high_below_close_rejected():
    with pytest.raises(ValueError):
        Candle(timestamp=1, open=100, high=90, low=80, close=95, volume=1)


def test_high_below_open_rejected():
    with pytest.raises(ValueError):
        Candle(timestamp=1, open=100, high=90, low=80, close=85, volume=1)


def test_low_above_close_rejected():
    with pytest.raises(ValueError):
        Candle(timestamp=1, open=100, high=110, low=105, close=102, volume=1)


def test_low_above_open_rejected():
    with pytest.raises(ValueError):
        Candle(timestamp=1, open=95, high=110, low=100, close=105, volume=1)


def test_negative_volume_rejected():
    with pytest.raises(ValueError):
        Candle(timestamp=1, open=100, high=110, low=90, close=105, volume=-1)


def test_non_positive_timestamp_rejected():
    with pytest.raises(ValueError):
        Candle(timestamp=0, open=100, high=110, low=90, close=105, volume=1)


def test_validate_candles_sorts_by_timestamp():
    raw = [
        {"timestamp": 3, "open": 1, "high": 2, "low": 1, "close": 1, "volume": 0},
        {"timestamp": 1, "open": 1, "high": 2, "low": 1, "close": 1, "volume": 0},
        {"timestamp": 2, "open": 1, "high": 2, "low": 1, "close": 1, "volume": 0},
    ]
    out = validate_candles(raw)
    assert [c["timestamp"] for c in out] == [1, 2, 3]


def test_validate_candles_coerces_string_numerics():
    raw = [{"timestamp": "1", "open": "100", "high": "110",
            "low": "90", "close": "105", "volume": "10"}]
    out = validate_candles(raw)
    assert out[0]["open"] == 100.0
    assert isinstance(out[0]["volume"], float)


def test_validate_candles_malformed_raises_http_502():
    raw = [{"timestamp": 1, "open": 100, "high": 50, "low": 90,
            "close": 105, "volume": 1}]
    with pytest.raises(HTTPException) as exc_info:
        validate_candles(raw)
    assert exc_info.value.status_code == 502
    assert "candle" in str(exc_info.value.detail).lower()


def test_validate_candles_empty_list():
    assert validate_candles([]) == []
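Editor's note: the module under test is not shown in this diff. A minimal sketch that satisfies every assertion above, assuming pydantic v2 (which also explains the string-to-float coercion for free); the shipped `cerbero_mcp.common.candles` may differ.

```python
# Hypothetical sketch of the candles module implied by the tests above.
from __future__ import annotations

from fastapi import HTTPException
from pydantic import BaseModel, model_validator


class Candle(BaseModel):
    timestamp: int
    open: float
    high: float
    low: float
    close: float
    volume: float

    @model_validator(mode="after")
    def _check_ohlc(self) -> "Candle":
        # OHLC invariants: high dominates, low floors, volume/timestamp sane
        if self.timestamp <= 0:
            raise ValueError("timestamp must be positive")
        if self.volume < 0:
            raise ValueError("volume must be non-negative")
        if self.high < max(self.open, self.close):
            raise ValueError("high below open/close")
        if self.low > min(self.open, self.close):
            raise ValueError("low above open/close")
        return self


def validate_candles(raw: list[dict]) -> list[dict]:
    """Coerce, validate and sort upstream candles; 502 on malformed data."""
    try:
        candles = [Candle(**r).model_dump() for r in raw]
    except ValueError as exc:
        raise HTTPException(status_code=502, detail=f"malformed candle from upstream: {exc}")
    return sorted(candles, key=lambda c: c["timestamp"])
```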
@@ -1,39 +1,23 @@
from __future__ import annotations

from unittest.mock import MagicMock

import pytest
from cerbero_mcp.exchanges.alpaca.client import AlpacaClient


@pytest.fixture
def mock_trading():
    return MagicMock(name="alpaca_TradingClient")
async def client():
    """Paper-mode AlpacaClient over httpx mocks (handled by pytest-httpx)."""
    c = AlpacaClient(api_key="test_key", secret_key="test_secret", paper=True)
    try:
        yield c
    finally:
        await c.aclose()


@pytest.fixture
def mock_stock():
    return MagicMock(name="alpaca_StockHistoricalDataClient")


@pytest.fixture
def mock_crypto():
    return MagicMock(name="alpaca_CryptoHistoricalDataClient")


@pytest.fixture
def mock_option():
    return MagicMock(name="alpaca_OptionHistoricalDataClient")


@pytest.fixture
def client(mock_trading, mock_stock, mock_crypto, mock_option):
    return AlpacaClient(
        api_key="test_key",
        secret_key="test_secret",
        paper=True,
        trading=mock_trading,
        stock_data=mock_stock,
        crypto_data=mock_crypto,
        option_data=mock_option,
    )
async def client_live():
    c = AlpacaClient(api_key="test_key", secret_key="test_secret", paper=False)
    try:
        yield c
    finally:
        await c.aclose()

@@ -1,80 +1,417 @@
"""Tests for the httpx-based AlpacaClient (V2.0.0).

REST endpoints are mocked via pytest-httpx. Coverage: account/positions,
orders (place/cancel/limit-error), close position, clock, and an invalid
asset class → ValueError.
"""
from __future__ import annotations

from unittest.mock import MagicMock
import re

import pytest
from cerbero_mcp.exchanges.alpaca.client import AlpacaClient
from pytest_httpx import HTTPXMock

PAPER = "https://paper-api.alpaca.markets"
DATA = "https://data.alpaca.markets"


@pytest.mark.asyncio
async def test_init_paper_mode(client, mock_trading):
async def test_init_paper_mode(client: AlpacaClient):
    assert client.paper is True
    assert client._trading is mock_trading
    assert client.base_url is None
    assert client._trading_base == PAPER


@pytest.mark.asyncio
async def test_get_account_calls_trading(client, mock_trading):
    mock_trading.get_account.return_value = MagicMock(
        model_dump=lambda: {"equity": 100000, "cash": 50000}
async def test_init_live_mode(client_live: AlpacaClient):
    assert client_live.paper is False
    assert client_live._trading_base == "https://api.alpaca.markets"


@pytest.mark.asyncio
async def test_init_base_url_override():
    c = AlpacaClient(
        api_key="k",
        secret_key="s",
        paper=True,
        base_url="https://alpaca-custom.example.com",
    )
    try:
        assert c.base_url == "https://alpaca-custom.example.com"
        assert c._trading_base == "https://alpaca-custom.example.com"
        # The data endpoint is NOT overridden
        assert c._data_base == DATA
    finally:
        await c.aclose()


@pytest.mark.asyncio
async def test_get_account(httpx_mock: HTTPXMock, client: AlpacaClient):
    httpx_mock.add_response(
        url=f"{PAPER}/v2/account",
        json={"id": "abc", "equity": "100000.00", "buying_power": "200000.00"},
    )
    result = await client.get_account()
    mock_trading.get_account.assert_called_once()
    assert result["equity"] == 100000
    assert result["id"] == "abc"
    assert result["equity"] == "100000.00"


@pytest.mark.asyncio
async def test_get_positions_returns_list(client, mock_trading):
    pos_mock = MagicMock(model_dump=lambda: {"symbol": "AAPL", "qty": 10})
    mock_trading.get_all_positions.return_value = [pos_mock]
async def test_get_account_sends_auth_headers(
    httpx_mock: HTTPXMock, client: AlpacaClient
):
    httpx_mock.add_response(url=f"{PAPER}/v2/account", json={"id": "x"})
    await client.get_account()
    req = httpx_mock.get_requests()[0]
    assert req.headers["APCA-API-KEY-ID"] == "test_key"
    assert req.headers["APCA-API-SECRET-KEY"] == "test_secret"
    assert req.headers["Accept"] == "application/json"


@pytest.mark.asyncio
async def test_get_positions_returns_list(
    httpx_mock: HTTPXMock, client: AlpacaClient
):
    httpx_mock.add_response(
        url=f"{PAPER}/v2/positions",
        json=[{"symbol": "AAPL", "qty": "10", "side": "long"}],
    )
    result = await client.get_positions()
    assert len(result) == 1
    assert result[0]["symbol"] == "AAPL"


@pytest.mark.asyncio
async def test_place_market_order_stocks(client, mock_trading):
    order_mock = MagicMock(model_dump=lambda: {"id": "o123", "symbol": "AAPL"})
    mock_trading.submit_order.return_value = order_mock
async def test_place_market_order_stocks(
    httpx_mock: HTTPXMock, client: AlpacaClient
):
    httpx_mock.add_response(
        method="POST",
        url=f"{PAPER}/v2/orders",
        json={"id": "o123", "symbol": "AAPL", "status": "accepted"},
    )
    result = await client.place_order(
        symbol="AAPL", side="buy", qty=1, order_type="market", asset_class="stocks",
        symbol="AAPL", side="buy", qty=1, order_type="market", asset_class="stocks"
    )
    assert result["id"] == "o123"
    assert mock_trading.submit_order.called
    # The POST body is correct
    req = httpx_mock.get_requests()[0]
    import json as _j

    body = _j.loads(req.content)
    assert body["symbol"] == "AAPL"
    assert body["side"] == "buy"
    assert body["type"] == "market"
    assert body["qty"] == "1"
    assert body["time_in_force"] == "day"


@pytest.mark.asyncio
async def test_place_limit_order_requires_price(client):
async def test_place_limit_order_requires_price(client: AlpacaClient):
    with pytest.raises(ValueError, match="limit_price"):
        await client.place_order(
            symbol="AAPL", side="buy", qty=1, order_type="limit",
            symbol="AAPL", side="buy", qty=1, order_type="limit"
        )


@pytest.mark.asyncio
async def test_cancel_order(client, mock_trading):
    mock_trading.cancel_order_by_id.return_value = None
async def test_place_stop_order_requires_price(client: AlpacaClient):
    with pytest.raises(ValueError, match="stop_price"):
        await client.place_order(
            symbol="AAPL", side="buy", qty=1, order_type="stop"
        )


@pytest.mark.asyncio
async def test_place_unsupported_order_type(client: AlpacaClient):
    with pytest.raises(ValueError, match="unsupported order_type"):
        await client.place_order(
            symbol="AAPL", side="buy", qty=1, order_type="trailing_stop"
        )


@pytest.mark.asyncio
async def test_cancel_order(httpx_mock: HTTPXMock, client: AlpacaClient):
    # 204 No Content on success
    httpx_mock.add_response(
        method="DELETE",
        url=f"{PAPER}/v2/orders/o1",
        status_code=204,
    )
    result = await client.cancel_order("o1")
    mock_trading.cancel_order_by_id.assert_called_once_with("o1")
    assert result == {"order_id": "o1", "canceled": True}


@pytest.mark.asyncio
async def test_close_position_no_options(client, mock_trading):
    order_mock = MagicMock(model_dump=lambda: {"id": "close-1"})
    mock_trading.close_position.return_value = order_mock
async def test_cancel_all_orders(httpx_mock: HTTPXMock, client: AlpacaClient):
    httpx_mock.add_response(
        method="DELETE",
        url=f"{PAPER}/v2/orders",
        json=[
            {"id": "a", "status": 200},
            {"id": "b", "status": 200},
        ],
    )
    result = await client.cancel_all_orders()
    assert len(result) == 2


@pytest.mark.asyncio
async def test_close_position_no_options(
    httpx_mock: HTTPXMock, client: AlpacaClient
):
    httpx_mock.add_response(
        method="DELETE",
        url=f"{PAPER}/v2/positions/AAPL",
        json={"id": "close-1", "symbol": "AAPL"},
    )
    result = await client.close_position("AAPL")
    assert mock_trading.close_position.called
    assert result["id"] == "close-1"


@pytest.mark.asyncio
async def test_get_clock(client, mock_trading):
    clock_mock = MagicMock(model_dump=lambda: {"is_open": True, "next_close": "2026-04-21T20:00:00Z"})
    mock_trading.get_clock.return_value = clock_mock
async def test_close_position_with_qty(
    httpx_mock: HTTPXMock, client: AlpacaClient
):
    httpx_mock.add_response(
        method="DELETE",
        url=f"{PAPER}/v2/positions/AAPL?qty=5.0",
        json={"id": "close-2"},
    )
    result = await client.close_position("AAPL", qty=5.0)
    assert result["id"] == "close-2"


@pytest.mark.asyncio
async def test_close_all_positions(
    httpx_mock: HTTPXMock, client: AlpacaClient
):
    httpx_mock.add_response(
        method="DELETE",
        url=f"{PAPER}/v2/positions?cancel_orders=true",
        json=[{"symbol": "AAPL", "status": 200}],
    )
    result = await client.close_all_positions(cancel_orders=True)
    assert len(result) == 1
    assert result[0]["symbol"] == "AAPL"


@pytest.mark.asyncio
async def test_get_clock(httpx_mock: HTTPXMock, client: AlpacaClient):
    httpx_mock.add_response(
        url=f"{PAPER}/v2/clock",
        json={"is_open": True, "next_close": "2026-04-21T20:00:00Z"},
    )
    result = await client.get_clock()
    assert result["is_open"] is True


@pytest.mark.asyncio
async def test_invalid_asset_class(client):
async def test_invalid_asset_class(client: AlpacaClient):
    with pytest.raises(ValueError, match="invalid asset_class"):
        await client.get_ticker("AAPL", asset_class="forex")


@pytest.mark.asyncio
async def test_get_ticker_stocks(httpx_mock: HTTPXMock, client: AlpacaClient):
    httpx_mock.add_response(
        url=f"{DATA}/v2/stocks/AAPL/trades/latest",
        json={
            "symbol": "AAPL",
            "trade": {"p": 175.50, "s": 100, "t": "2026-04-18T15:30:00Z"},
        },
    )
    httpx_mock.add_response(
        url=f"{DATA}/v2/stocks/AAPL/quotes/latest",
        json={
            "symbol": "AAPL",
            "quote": {
                "bp": 175.40,
                "ap": 175.55,
                "bs": 50,
                "as": 25,
                "t": "2026-04-18T15:30:00Z",
            },
        },
    )
    result = await client.get_ticker("AAPL", asset_class="stocks")
    assert result["asset_class"] == "stocks"
    assert result["last_price"] == 175.50
    assert result["bid"] == 175.40
    assert result["ask"] == 175.55


@pytest.mark.asyncio
async def test_get_bars_stocks(httpx_mock: HTTPXMock, client: AlpacaClient):
    httpx_mock.add_response(
        url=re.compile(rf"^{DATA}/v2/stocks/bars\?.*"),
        json={
            "bars": {
                "AAPL": [
                    {
                        "t": "2026-04-17T00:00:00Z",
                        "o": 170.0,
                        "h": 176.0,
                        "l": 169.5,
                        "c": 175.0,
                        "v": 1000000,
                    }
                ]
            }
        },
    )
    result = await client.get_bars(
        symbol="AAPL",
        asset_class="stocks",
        interval="1d",
        start="2026-04-17T00:00:00",
        end="2026-04-18T00:00:00",
        limit=10,
    )
    assert result["symbol"] == "AAPL"
    assert result["interval"] == "1d"
    assert len(result["candles"]) == 1
    assert result["candles"][0]["close"] == 175.0


@pytest.mark.asyncio
async def test_get_bars_unsupported_timeframe(client: AlpacaClient):
    with pytest.raises(ValueError, match="unsupported timeframe"):
        await client.get_bars(
            symbol="AAPL",
            asset_class="stocks",
            interval="3min",
        )


@pytest.mark.asyncio
async def test_get_bars_invalid_asset_class(client: AlpacaClient):
    with pytest.raises(ValueError, match="invalid asset_class"):
        await client.get_bars(symbol="AAPL", asset_class="forex")


@pytest.mark.asyncio
async def test_get_assets(httpx_mock: HTTPXMock, client: AlpacaClient):
    httpx_mock.add_response(
        url=re.compile(rf"^{PAPER}/v2/assets\?.*"),
        json=[
            {"symbol": "AAPL", "tradable": True, "class": "us_equity"},
            {"symbol": "GOOG", "tradable": True, "class": "us_equity"},
        ],
    )
    result = await client.get_assets(asset_class="stocks", status="active")
    assert len(result) == 2
    assert result[0]["symbol"] == "AAPL"


@pytest.mark.asyncio
async def test_get_assets_invalid_class(client: AlpacaClient):
    with pytest.raises(ValueError, match="invalid asset_class"):
        await client.get_assets(asset_class="forex")


@pytest.mark.asyncio
async def test_get_open_orders(httpx_mock: HTTPXMock, client: AlpacaClient):
    httpx_mock.add_response(
        url=re.compile(rf"^{PAPER}/v2/orders\?.*"),
        json=[{"id": "o1", "status": "open", "symbol": "AAPL"}],
    )
    result = await client.get_open_orders(limit=10)
    assert len(result) == 1
    assert result[0]["id"] == "o1"


@pytest.mark.asyncio
async def test_amend_order(httpx_mock: HTTPXMock, client: AlpacaClient):
    httpx_mock.add_response(
        method="PATCH",
        url=f"{PAPER}/v2/orders/o1",
        json={"id": "o1", "qty": "5", "limit_price": "180.0"},
    )
    result = await client.amend_order(
        "o1", qty=5, limit_price=180.0, tif="gtc"
    )
    assert result["id"] == "o1"
    req = httpx_mock.get_requests()[0]
    import json as _j

    body = _j.loads(req.content)
    assert body["qty"] == "5"
    assert body["limit_price"] == "180.0"
    assert body["time_in_force"] == "gtc"


@pytest.mark.asyncio
async def test_get_calendar(httpx_mock: HTTPXMock, client: AlpacaClient):
    httpx_mock.add_response(
        url=re.compile(rf"^{PAPER}/v2/calendar.*"),
        json=[{"date": "2026-04-20", "open": "09:30", "close": "16:00"}],
    )
    result = await client.get_calendar(start="2026-04-20", end="2026-04-20")
    assert len(result) == 1
    assert result[0]["date"] == "2026-04-20"


@pytest.mark.asyncio
async def test_get_calendar_no_filters(
    httpx_mock: HTTPXMock, client: AlpacaClient
):
    httpx_mock.add_response(
        url=f"{PAPER}/v2/calendar",
        json=[{"date": "2026-04-20"}],
    )
    result = await client.get_calendar()
    assert len(result) == 1


@pytest.mark.asyncio
async def test_get_snapshot(httpx_mock: HTTPXMock, client: AlpacaClient):
    httpx_mock.add_response(
        url=re.compile(rf"^{DATA}/v2/stocks/snapshots\?.*"),
        json={
            "AAPL": {
                "latestTrade": {"p": 175.0},
                "latestQuote": {"bp": 174.9, "ap": 175.1},
            }
        },
    )
    result = await client.get_snapshot("AAPL")
    assert result["latestTrade"]["p"] == 175.0


@pytest.mark.asyncio
async def test_get_option_chain(httpx_mock: HTTPXMock, client: AlpacaClient):
    httpx_mock.add_response(
        url=re.compile(rf"^{DATA}/v1beta1/options/snapshots/AAPL.*"),
        json={
            "snapshots": {
                "AAPL250620C00200000": {
                    "latestQuote": {"bp": 1.20, "ap": 1.30}
                }
            }
        },
    )
    result = await client.get_option_chain("AAPL", expiry="2026-06-20")
    assert result["underlying"] == "AAPL"
    assert result["expiry"] == "2026-06-20"
    assert "AAPL250620C00200000" in result["contracts"]


@pytest.mark.asyncio
async def test_get_activities(httpx_mock: HTTPXMock, client: AlpacaClient):
    httpx_mock.add_response(
        url=re.compile(rf"^{PAPER}/v2/account/activities.*"),
        json=[
            {"id": "1", "activity_type": "FILL"},
            {"id": "2", "activity_type": "TRANS"},
        ],
    )
    result = await client.get_activities(limit=10)
    assert len(result) == 2


@pytest.mark.asyncio
async def test_aclose_idempotent(client: AlpacaClient):
    await client.aclose()
    await client.aclose()  # no raise
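Editor's note: the header and base-URL assertions above pin down how the new client authenticates, even though its constructor is not in this diff. A plausible sketch under those constraints; the attribute and helper names here are assumptions, only the header names and URLs come from the tests.

```python
# Hypothetical sketch of AlpacaClient's httpx setup implied by the tests above.
import httpx


def _make_http(api_key: str, secret_key: str, paper: bool,
               base_url: str | None) -> tuple[httpx.AsyncClient, str]:
    # base_url (if given) overrides only the trading endpoint, never the data one
    trading_base = base_url or (
        "https://paper-api.alpaca.markets" if paper else "https://api.alpaca.markets"
    )
    http = httpx.AsyncClient(headers={
        "APCA-API-KEY-ID": api_key,
        "APCA-API-SECRET-KEY": secret_key,
        "Accept": "application/json",
    })
    return http, trading_base
```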
@@ -1,21 +1,18 @@
from __future__ import annotations

from unittest.mock import MagicMock

import pytest
from cerbero_mcp.exchanges.bybit.client import BybitClient


@pytest.fixture
def mock_http():
    return MagicMock(name="pybit_HTTP")
def client():
    """BybitClient with a testnet base_url and an internal AsyncClient.


@pytest.fixture
def client(mock_http):
    pytest-httpx intercepts the calls of the httpx AsyncClient created by
    the constructor (auto-mock), so no explicit injection is needed.
    """
    return BybitClient(
        api_key="test_key",
        api_secret="test_secret",
        testnet=True,
        http=mock_http,
    )

File diff suppressed because it is too large
@@ -0,0 +1,134 @@
from __future__ import annotations

from typing import Any

import pytest
from cerbero_mcp.exchanges.cross.client import CrossClient
from fastapi import HTTPException


class _Fake:
    def __init__(self, candles: list[dict[str, Any]] | None = None,
                 *, raises: Exception | None = None):
        self._candles = candles or []
        self._raises = raises
        self.calls: list[dict[str, Any]] = []

    async def get_historical(self, **kwargs: Any) -> dict[str, Any]:
        if self._raises:
            raise self._raises
        self.calls.append(kwargs)
        return {"candles": list(self._candles)}

    async def get_bars(self, **kwargs: Any) -> dict[str, Any]:
        if self._raises:
            raise self._raises
        self.calls.append(kwargs)
        return {"candles": list(self._candles)}


class _FakeRegistry:
    def __init__(self, clients: dict[str, _Fake]):
        self._clients = clients

    async def get(self, exchange: str, env: str) -> _Fake:
        if exchange not in self._clients:
            raise KeyError(exchange)
        return self._clients[exchange]


def _c(ts: int, close: float = 100.0) -> dict[str, Any]:
    return {"timestamp": ts, "open": close, "high": close, "low": close,
            "close": close, "volume": 1.0}


@pytest.mark.asyncio
async def test_crypto_three_sources_aggregates():
    fakes = {
        "bybit": _Fake([_c(1, 100), _c(2, 200)]),
        "hyperliquid": _Fake([_c(1, 100), _c(2, 200)]),
        "deribit": _Fake([_c(1, 100), _c(2, 200)]),
    }
    cc = CrossClient(_FakeRegistry(fakes), env="mainnet")
    out = await cc.get_historical(
        symbol="BTC", asset_class="crypto", interval="1h",
        start_date="2026-05-09T00:00:00", end_date="2026-05-10T00:00:00",
    )
    assert out["symbol"] == "BTC"
    assert out["asset_class"] == "crypto"
    assert len(out["candles"]) == 2
    assert out["candles"][0]["sources"] == 3
    assert out["candles"][0]["div_pct"] == 0.0
    assert set(out["sources_used"]) == {"bybit", "hyperliquid", "deribit"}
    assert out["failed_sources"] == []


@pytest.mark.asyncio
async def test_crypto_partial_failure_returns_partial_with_warning():
    fakes = {
        "bybit": _Fake([_c(1, 100)]),
        "hyperliquid": _Fake([_c(1, 100)]),
        "deribit": _Fake(raises=RuntimeError("upstream down")),
    }
    cc = CrossClient(_FakeRegistry(fakes), env="mainnet")
    out = await cc.get_historical(
        symbol="BTC", asset_class="crypto", interval="1h",
        start_date="2026-05-09T00:00:00", end_date="2026-05-10T00:00:00",
    )
    assert out["candles"][0]["sources"] == 2
    assert set(out["sources_used"]) == {"bybit", "hyperliquid"}
    assert len(out["failed_sources"]) == 1
    assert out["failed_sources"][0]["exchange"] == "deribit"
    assert "upstream down" in out["failed_sources"][0]["error"]


@pytest.mark.asyncio
async def test_all_sources_fail_raises_502():
    fakes = {
        "bybit": _Fake(raises=RuntimeError("a")),
        "hyperliquid": _Fake(raises=RuntimeError("b")),
        "deribit": _Fake(raises=RuntimeError("c")),
    }
    cc = CrossClient(_FakeRegistry(fakes), env="mainnet")
    with pytest.raises(HTTPException) as exc_info:
        await cc.get_historical(
            symbol="BTC", asset_class="crypto", interval="1h",
            start_date="2026-05-09T00:00:00", end_date="2026-05-10T00:00:00",
        )
    assert exc_info.value.status_code == 502


@pytest.mark.asyncio
async def test_unsupported_symbol_raises_400():
    cc = CrossClient(_FakeRegistry({}), env="mainnet")
    with pytest.raises(HTTPException) as exc_info:
        await cc.get_historical(
            symbol="NONEXISTENT", asset_class="crypto", interval="1h",
            start_date="2026-05-09T00:00:00", end_date="2026-05-10T00:00:00",
        )
    assert exc_info.value.status_code == 400


@pytest.mark.asyncio
async def test_stocks_routes_to_alpaca_only():
    fake = _Fake([_c(1, 175.0)])
    cc = CrossClient(_FakeRegistry({"alpaca": fake}), env="mainnet")
    out = await cc.get_historical(
        symbol="AAPL", asset_class="stocks", interval="1d",
        start_date="2026-04-09T00:00:00", end_date="2026-04-10T00:00:00",
    )
    assert out["sources_used"] == ["alpaca"]
    assert out["candles"][0]["close"] == 175.0
    # Alpaca was called with the native symbol
    assert fake.calls[0]["symbol"] == "AAPL"


@pytest.mark.asyncio
async def test_unsupported_interval_raises_400():
    cc = CrossClient(_FakeRegistry({}), env="mainnet")
    with pytest.raises(HTTPException) as exc_info:
        await cc.get_historical(
            symbol="BTC", asset_class="crypto", interval="3h",
            start_date="2026-05-09T00:00:00", end_date="2026-05-10T00:00:00",
        )
    assert exc_info.value.status_code == 400
@@ -0,0 +1,90 @@
from __future__ import annotations

from cerbero_mcp.exchanges.cross.consensus import merge_candles


def _c(ts, o, h, l, c, v):
    return {"timestamp": ts, "open": o, "high": h, "low": l, "close": c, "volume": v}


def test_empty_input():
    assert merge_candles({}) == []


def test_single_source_passthrough():
    out = merge_candles({"bybit": [_c(1, 100, 110, 90, 105, 5)]})
    assert len(out) == 1
    assert out[0]["timestamp"] == 1
    assert out[0]["close"] == 105
    assert out[0]["sources"] == 1
    assert out[0]["div_pct"] == 0.0


def test_three_sources_identical_no_divergence():
    src = {
        "bybit": [_c(1, 100, 110, 90, 105, 5)],
        "hyperliquid": [_c(1, 100, 110, 90, 105, 3)],
        "deribit": [_c(1, 100, 110, 90, 105, 7)],
    }
    out = merge_candles(src)
    assert len(out) == 1
    assert out[0]["close"] == 105.0
    assert out[0]["sources"] == 3
    assert out[0]["div_pct"] == 0.0
    # volume is the mean across sources
    assert abs(out[0]["volume"] - 5.0) < 1e-9


def test_three_sources_divergent_close():
    src = {
        "bybit": [_c(1, 100, 110, 90, 100, 1)],
        "hyperliquid": [_c(1, 100, 110, 90, 110, 1)],
        "deribit": [_c(1, 100, 110, 90, 105, 1)],
    }
    out = merge_candles(src)
    # median of [100, 110, 105] = 105
    assert out[0]["close"] == 105.0
    # div_pct = (110 - 100) / 105 ≈ 0.0952
    assert abs(out[0]["div_pct"] - 10 / 105) < 1e-6
    assert out[0]["sources"] == 3


def test_misaligned_timestamps():
    src = {
        "bybit": [_c(1, 100, 110, 90, 105, 1), _c(2, 100, 110, 90, 105, 1)],
        "hyperliquid": [_c(2, 100, 110, 90, 105, 1), _c(3, 100, 110, 90, 105, 1)],
    }
    out = merge_candles(src)
    timestamps = [c["timestamp"] for c in out]
    sources_by_ts = {c["timestamp"]: c["sources"] for c in out}
    assert timestamps == [1, 2, 3]
    assert sources_by_ts == {1: 1, 2: 2, 3: 1}


def test_two_sources_even_median():
    src = {
        "bybit": [_c(1, 100, 110, 90, 100, 1)],
        "hyperliquid": [_c(1, 100, 110, 90, 110, 1)],
    }
    out = merge_candles(src)
    # even-count median = mean of the two = 105
    assert out[0]["close"] == 105.0


def test_empty_source_ignored():
    src = {
        "bybit": [_c(1, 100, 110, 90, 105, 1)],
        "hyperliquid": [],
    }
    out = merge_candles(src)
    assert len(out) == 1
    assert out[0]["sources"] == 1


def test_output_sorted_by_timestamp():
    src = {
        "bybit": [_c(3, 100, 110, 90, 105, 1), _c(1, 100, 110, 90, 105, 1),
                  _c(2, 100, 110, 90, 105, 1)],
    }
    out = merge_candles(src)
    assert [c["timestamp"] for c in out] == [1, 2, 3]
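Editor's note: these tests fully determine the consensus semantics (per-timestamp median OHLC, mean volume, source count, relative close spread, sorted output), so a faithful sketch is possible even though the module body is not in the diff. Still, treat the following as an assumed reconstruction, not the shipped code.

```python
# Hypothetical sketch of merge_candles matching the behaviour pinned above.
from __future__ import annotations

from statistics import median  # mean of the two middles when count is even
from typing import Any


def merge_candles(per_source: dict[str, list[dict[str, Any]]]) -> list[dict[str, Any]]:
    # Bucket candles by timestamp across all sources (empty lists contribute nothing)
    buckets: dict[int, list[dict[str, Any]]] = {}
    for candles in per_source.values():
        for c in candles:
            buckets.setdefault(c["timestamp"], []).append(c)

    merged: list[dict[str, Any]] = []
    for ts in sorted(buckets):
        group = buckets[ts]
        closes = [c["close"] for c in group]
        mid = median(closes)
        merged.append({
            "timestamp": ts,
            "open": median(c["open"] for c in group),
            "high": median(c["high"] for c in group),
            "low": median(c["low"] for c in group),
            "close": mid,
            "volume": sum(c["volume"] for c in group) / len(group),  # mean
            "sources": len(group),
            # spread of the source closes relative to the consensus close
            "div_pct": (max(closes) - min(closes)) / mid if mid else 0.0,
        })
    return merged
```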
@@ -0,0 +1,47 @@
from __future__ import annotations

import pytest
from cerbero_mcp.exchanges.cross.symbol_map import (
    get_sources,
    to_native_interval,
    to_native_symbol,
)


def test_btc_crypto_sources():
    assert set(get_sources("crypto", "BTC")) == {"bybit", "hyperliquid", "deribit"}


def test_eth_crypto_sources():
    assert set(get_sources("crypto", "ETH")) == {"bybit", "hyperliquid", "deribit"}


def test_unknown_crypto_symbol_returns_empty():
    assert get_sources("crypto", "DOGEFAKE") == []


def test_stocks_aapl_sources():
    assert set(get_sources("stocks", "AAPL")) == {"alpaca"}


def test_native_symbol_btc():
    assert to_native_symbol("crypto", "BTC", "bybit") == "BTCUSDT"
    assert to_native_symbol("crypto", "BTC", "hyperliquid") == "BTC"
    assert to_native_symbol("crypto", "BTC", "deribit") == "BTC-PERPETUAL"


def test_native_symbol_unsupported_pair_raises():
    with pytest.raises(KeyError):
        to_native_symbol("crypto", "BTC", "alpaca")


def test_native_interval_1h():
    assert to_native_interval("1h", "bybit") == "60"
    assert to_native_interval("1h", "hyperliquid") == "1h"
    assert to_native_interval("1h", "deribit") == "1h"
    assert to_native_interval("1h", "alpaca") == "1h"


def test_native_interval_unknown_canonical_raises():
    with pytest.raises(KeyError):
        to_native_interval("3h", "bybit")
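Editor's note: the tests suggest symbol_map is a pair of static lookup tables whose KeyError doubles as the "unsupported" signal. A minimal table-driven sketch consistent with the assertions above; the ETH native symbols and the `1d` row are assumptions for illustration, and the real tables surely carry many more entries.

```python
# Hypothetical table-driven sketch of symbol_map implied by the tests above.
_SYMBOLS: dict[tuple[str, str], dict[str, str]] = {
    ("crypto", "BTC"): {"bybit": "BTCUSDT", "hyperliquid": "BTC", "deribit": "BTC-PERPETUAL"},
    ("crypto", "ETH"): {"bybit": "ETHUSDT", "hyperliquid": "ETH", "deribit": "ETH-PERPETUAL"},  # assumed
    ("stocks", "AAPL"): {"alpaca": "AAPL"},
}

_INTERVALS: dict[str, dict[str, str]] = {
    "1h": {"bybit": "60", "hyperliquid": "1h", "deribit": "1h", "alpaca": "1h"},
    "1d": {"bybit": "D", "hyperliquid": "1d", "deribit": "1D", "alpaca": "1d"},  # assumed values
}


def get_sources(asset_class: str, symbol: str) -> list[str]:
    return list(_SYMBOLS.get((asset_class, symbol), {}))


def to_native_symbol(asset_class: str, symbol: str, exchange: str) -> str:
    return _SYMBOLS[(asset_class, symbol)][exchange]  # KeyError if unsupported


def to_native_interval(interval: str, exchange: str) -> str:
    return _INTERVALS[interval][exchange]  # KeyError for an unknown canonical interval
```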
@@ -154,6 +154,47 @@ async def test_get_account_summary(httpx_mock: HTTPXMock, client: DeribitClient)
    assert result["balance"] == 900.0


@pytest.mark.asyncio
async def test_upstream_5xx_raises_clean_http_error(
    httpx_mock: HTTPXMock, client: DeribitClient
):
    """Upstream Deribit 5xx (non-JSON body) → HTTPException 502, not a JSONDecodeError."""
    from fastapi import HTTPException

    httpx_mock.add_response(
        url=re.compile(r"https://test\.deribit\.com/api/v2/public/get_tradingview_chart_data"),
        status_code=502,
        text="<html>Bad Gateway</html>",
    )
    with pytest.raises(HTTPException) as exc_info:
        await client.get_historical(
            instrument="BTC-PERPETUAL",
            start_date="2026-05-09T00:00:00",
            end_date="2026-05-10T00:00:00",
            resolution="1h",
        )
    assert exc_info.value.status_code == 502
    assert "Deribit upstream" in str(exc_info.value.detail)


@pytest.mark.asyncio
async def test_private_call_with_bad_auth_returns_error_envelope(
    httpx_mock: HTTPXMock, client: DeribitClient
):
    """Failed auth (wrong creds / missing scope) → error envelope, not a KeyError."""
    httpx_mock.add_response(
        url=re.compile(r"https://test\.deribit\.com/api/v2/public/auth"),
        json={"error": {"code": 13004, "message": "invalid_credentials"}},
        is_reusable=True,
    )
    summary = await client.get_account_summary("USDC")
    assert summary["equity"] is None
    assert summary["balance"] is None
    assert "invalid_credentials" in summary["error"]
    positions = await client.get_positions("USDC")
    assert positions == []


@pytest.mark.asyncio
async def test_place_order(httpx_mock: HTTPXMock, client: DeribitClient):
    httpx_mock.add_response(

@@ -6,12 +6,16 @@ import pytest
from cerbero_mcp.exchanges.hyperliquid.client import HyperliquidClient
from pytest_httpx import HTTPXMock

# Fixed private key: makes the EIP-712 signature deterministic for the write tests.
DUMMY_PRIVATE_KEY = "0x" + "01" * 32
DUMMY_WALLET = "0x1a642f0E3c3aF545E7AcBD38b07251B3990914F1"  # derived from the key above


@pytest.fixture
def client():
    return HyperliquidClient(
        wallet_address="0xDeadBeef",
        private_key="0x" + "a" * 64,
        wallet_address=DUMMY_WALLET,
        private_key=DUMMY_PRIVATE_KEY,
        testnet=True,
    )

@@ -41,6 +45,13 @@ META_AND_CTX = [
    ],
]

META = {
    "universe": [
        {"name": "BTC", "maxLeverage": 50},
        {"name": "ETH", "maxLeverage": 25},
    ]
}

CLEARINGHOUSE_STATE = {
    "marginSummary": {
        "accountValue": "1500.0",
@@ -65,6 +76,9 @@ CLEARINGHOUSE_STATE = {
SPOT_STATE = {"balances": [{"coin": "USDC", "total": "500.0"}]}


# ── Read endpoints ─────────────────────────────────────────────


@pytest.mark.asyncio
async def test_get_markets(httpx_mock: HTTPXMock, client: HyperliquidClient):
    httpx_mock.add_response(
@@ -209,19 +223,263 @@ async def test_health_ok(httpx_mock: HTTPXMock, client: HyperliquidClient):
    assert result["testnet"] is True


@pytest.mark.asyncio
async def test_place_order_sdk_unavailable(client: HyperliquidClient):
    """place_order raises RuntimeError when SDK is not available (mocked)."""
    import cerbero_mcp.exchanges.hyperliquid.client as mod
# ── Write endpoints (signed via EIP-712) ───────────────────────

    original = mod._SDK_AVAILABLE
    mod._SDK_AVAILABLE = False
    client._exchange = None
    try:
        result = await client.place_order("BTC", "buy", 0.1, price=50000.0)
        # Should return an error dict or raise RuntimeError
        assert "error" in result or result.get("status") == "error"
    except RuntimeError as exc:
        assert "not installed" in str(exc).lower() or "sdk" in str(exc).lower()
    finally:
        mod._SDK_AVAILABLE = original

@pytest.mark.asyncio
async def test_place_order_limit(httpx_mock: HTTPXMock, client: HyperliquidClient):
    """Limit order: signs and POSTs to /exchange with the correct payload shape."""
    # 1. /info type=meta for the asset id
    httpx_mock.add_response(
        url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/info"),
        json=META,
    )
    # 2. signed /exchange
    httpx_mock.add_response(
        url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/exchange"),
        json={
            "status": "ok",
            "response": {
                "type": "order",
                "data": {
                    "statuses": [{"resting": {"oid": 9999}}],
                },
            },
        },
    )

    result = await client.place_order(
        instrument="BTC", side="buy", amount=0.01, type="limit", price=50000.0
    )

    # Verify the normalized response shape
    assert result["status"] == "ok"
    assert result["order_id"] == 9999

    # Verify the request body of the POST to /exchange
    requests = [r for r in httpx_mock.get_requests() if r.url.path == "/exchange"]
    assert len(requests) == 1
    import json as _json

    body = _json.loads(requests[0].content)
    assert body["nonce"] > 0
    assert body["vaultAddress"] is None
    assert body["expiresAfter"] is None
    assert body["action"]["type"] == "order"
    assert body["action"]["grouping"] == "na"
    assert len(body["action"]["orders"]) == 1
    order = body["action"]["orders"][0]
    assert order["a"] == 0  # BTC is index 0 in META
    assert order["b"] is True  # buy
    assert order["p"] == "50000"
    assert order["s"] == "0.01"
    assert order["r"] is False
    assert order["t"] == {"limit": {"tif": "Gtc"}}
    sig = body["signature"]
    assert set(sig.keys()) == {"r", "s", "v"}
    assert sig["r"].startswith("0x") and len(sig["r"]) == 66
    assert sig["s"].startswith("0x") and len(sig["s"]) == 66
    assert sig["v"] in (27, 28)


@pytest.mark.asyncio
async def test_place_order_market(httpx_mock: HTTPXMock, client: HyperliquidClient):
    """Market order: uses mark_price + a buffer and tif=Ioc."""
    # market path: get_ticker → meta+ctxs, then meta for the asset id, then /exchange
    httpx_mock.add_response(
        url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/info"),
        json=META_AND_CTX,
    )
    httpx_mock.add_response(
        url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/info"),
        json=META,
    )
    httpx_mock.add_response(
        url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/exchange"),
        json={
            "status": "ok",
            "response": {
                "type": "order",
                "data": {
                    "statuses": [{"filled": {"oid": 1, "totalSz": "0.01", "avgPx": "51500"}}],
                },
            },
        },
    )

    result = await client.place_order(
        instrument="BTC", side="buy", amount=0.01, type="market"
    )
    assert result["status"] == "ok"
    assert result["filled_size"] == 0.01

    import json as _json

    requests = [r for r in httpx_mock.get_requests() if r.url.path == "/exchange"]
    assert len(requests) == 1
    body = _json.loads(requests[0].content)
    order = body["action"]["orders"][0]
    assert order["t"] == {"limit": {"tif": "Ioc"}}


@pytest.mark.asyncio
async def test_place_order_stop_loss(httpx_mock: HTTPXMock, client: HyperliquidClient):
    """Stop-loss: uses the trigger order type."""
    httpx_mock.add_response(
        url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/info"),
        json=META,
    )
    httpx_mock.add_response(
        url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/exchange"),
        json={
            "status": "ok",
            "response": {"type": "order", "data": {"statuses": [{"resting": {"oid": 7}}]}},
        },
    )

    result = await client.place_order(
        instrument="BTC",
        side="sell",
        amount=0.01,
        type="stop_loss",
        price=45000.0,
        reduce_only=True,
    )
    assert result["status"] == "ok"
    assert result["order_id"] == 7

    import json as _json

    requests = [r for r in httpx_mock.get_requests() if r.url.path == "/exchange"]
    body = _json.loads(requests[0].content)
    order = body["action"]["orders"][0]
    assert order["r"] is True
    assert order["t"] == {
        "trigger": {"isMarket": True, "triggerPx": "45000", "tpsl": "sl"}
    }


@pytest.mark.asyncio
async def test_place_order_unknown_asset(httpx_mock: HTTPXMock, client: HyperliquidClient):
    """Unknown asset → error dict, no POST to /exchange."""
    httpx_mock.add_response(
        url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/info"),
        json=META,
    )
    result = await client.place_order(
        instrument="DOGE", side="buy", amount=1.0, type="limit", price=0.1
    )
    assert "error" in result
    assert "DOGE" in result["error"]


@pytest.mark.asyncio
async def test_cancel_order(httpx_mock: HTTPXMock, client: HyperliquidClient):
    """Cancel: action.type=cancel with asset id + oid."""
    httpx_mock.add_response(
        url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/info"),
        json=META,
    )
    httpx_mock.add_response(
        url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/exchange"),
        json={"status": "ok", "response": {"type": "cancel", "data": {"statuses": ["success"]}}},
    )

    result = await client.cancel_order("12345", "BTC")
    assert result["status"] == "ok"
    assert result["order_id"] == "12345"

    import json as _json

    requests = [r for r in httpx_mock.get_requests() if r.url.path == "/exchange"]
    body = _json.loads(requests[0].content)
    assert body["action"]["type"] == "cancel"
    assert body["action"]["cancels"] == [{"a": 0, "o": 12345}]
    assert "r" in body["signature"] and "s" in body["signature"] and "v" in body["signature"]


@pytest.mark.asyncio
async def test_close_position(httpx_mock: HTTPXMock, client: HyperliquidClient):
    """close_position: reads state, computes slippage, places an IOC reduce-only order."""
    # 1. clearinghouseState for the position direction
    httpx_mock.add_response(
        url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/info"),
        json=CLEARINGHOUSE_STATE,
    )
    # 2. get_ticker → metaAndAssetCtxs
    httpx_mock.add_response(
        url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/info"),
        json=META_AND_CTX,
    )
    # 3. meta for the asset id
    httpx_mock.add_response(
        url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/info"),
        json=META,
    )
    # 4. /exchange
    httpx_mock.add_response(
        url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/exchange"),
        json={
            "status": "ok",
            "response": {
                "type": "order",
                "data": {
                    "statuses": [{"filled": {"oid": 5, "totalSz": "0.1", "avgPx": "47500"}}],
                },
            },
        },
    )

    result = await client.close_position("BTC")
    assert result["status"] == "ok"
    assert result["asset"] == "BTC"

    import json as _json

    requests = [r for r in httpx_mock.get_requests() if r.url.path == "/exchange"]
    body = _json.loads(requests[0].content)
    order = body["action"]["orders"][0]
    # Long position → side=sell to close
    assert order["b"] is False
    assert order["r"] is True


@pytest.mark.asyncio
async def test_close_position_no_position(
    httpx_mock: HTTPXMock, client: HyperliquidClient
):
    """close_position with no open position → error dict."""
    httpx_mock.add_response(
        url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/info"),
        json={"assetPositions": [], "marginSummary": {}},
    )
    result = await client.close_position("BTC")
    assert "error" in result
    assert result["asset"] == "BTC"


@pytest.mark.asyncio
async def test_signing_parity_with_canonical_sdk(client: HyperliquidClient):
    """Sanity check: the EIP-712 signature we produce is bit-for-bit identical to
    the one the official ``hyperliquid-python-sdk`` would generate for the same input.

    Isolated test (no httpx) to guarantee that removing the runtime SDK
    introduces no signing regressions.
    """
    from cerbero_mcp.exchanges.hyperliquid.client import _sign_l1_action

    action = {"type": "cancel", "cancels": [{"a": 0, "o": 12345}]}
    nonce = 1700000000000
    sig = _sign_l1_action(DUMMY_PRIVATE_KEY, action, None, nonce, None, False)
    assert sig == {
        "r": "0xab1150f8d695e015a07e3f79983a0a2a4e58dedec071dfa4177a0761f37e0485",
        "s": "0x208cb6370e5e56a3cefa451538c1e0096b70777d2bde172c7afb1e77c4d28d20",
        "v": 28,
    }


@pytest.mark.asyncio
async def test_aclose_idempotent(client: HyperliquidClient):
    """``aclose`` can be called even without an active http client."""
    await client.aclose()
    await client.aclose()

@@ -0,0 +1,232 @@
from __future__ import annotations

import re
from unittest.mock import AsyncMock, MagicMock

import pytest
from cerbero_mcp.exchanges.ibkr.client import IBKRClient, IBKRError
from pytest_httpx import HTTPXMock


@pytest.fixture
def fake_signer():
    s = MagicMock()
    s.consumer_key = "CK"
    s.access_token = "AT"
    s.get_live_session_token = AsyncMock(return_value="LSTBASE64==")
    s.sign_with_lst = MagicMock(return_value="SIG==")
    s.make_oauth_params = MagicMock(return_value={
        "oauth_consumer_key": "CK", "oauth_token": "AT",
        "oauth_nonce": "n", "oauth_timestamp": "1",
        "oauth_signature_method": "HMAC-SHA256", "oauth_version": "1.0",
    })
    return s


@pytest.fixture
def client(fake_signer):
    return IBKRClient(
        signer=fake_signer,
        account_id="DU1234",
        paper=True,
        base_url="https://api.ibkr.com/v1/api",
    )


@pytest.mark.asyncio
async def test_health_no_network(client):
    info = await client.health()
    assert info["status"] == "ok"
    assert info["paper"] is True


@pytest.mark.asyncio
async def test_get_account_summary(httpx_mock: HTTPXMock, client):
    httpx_mock.add_response(
        url=re.compile(r".*/portfolio/DU1234/summary"),
        json={"netliquidation": {"amount": 10000, "currency": "USD"}, "totalcashvalue": {"amount": 8000}},
    )
    httpx_mock.add_response(
        url=re.compile(r".*/tickle"),
        json={"session": "abc"},
    )
    data = await client.get_account()
    assert "netliquidation" in data


@pytest.mark.asyncio
async def test_request_retries_once_on_401(httpx_mock: HTTPXMock, client):
    httpx_mock.add_response(url=re.compile(r".*/tickle"), json={})
    # First call returns 401
    httpx_mock.add_response(
        url=re.compile(r".*/portfolio/DU1234/summary"),
        status_code=401, text="session expired",
    )
    # Second call (after LST refresh) succeeds
    httpx_mock.add_response(
        url=re.compile(r".*/portfolio/DU1234/summary"),
        json={"netliquidation": {"amount": 5000}},
    )
    data = await client.get_account()
    assert data["netliquidation"]["amount"] == 5000


@pytest.mark.asyncio
async def test_request_raises_on_persistent_401(httpx_mock: HTTPXMock, client):
    from cerbero_mcp.exchanges.ibkr.oauth import IBKRAuthError

    httpx_mock.add_response(url=re.compile(r".*/tickle"), json={})
    # Both attempts return 401
    httpx_mock.add_response(
        url=re.compile(r".*/portfolio/DU1234/summary"),
        status_code=401, text="bad creds",
    )
    httpx_mock.add_response(
        url=re.compile(r".*/portfolio/DU1234/summary"),
        status_code=401, text="bad creds",
    )
    with pytest.raises(IBKRAuthError, match="after retry"):
        await client.get_account()


@pytest.mark.asyncio
async def test_resolve_conid_caches(httpx_mock: HTTPXMock, client):
    httpx_mock.add_response(
        url=re.compile(r".*/tickle"), json={"session": "x"},
    )
    httpx_mock.add_response(
        url=re.compile(r".*/trsrv/secdef/search.*symbol=AAPL"),
        json=[{"conid": 265598, "symbol": "AAPL", "secType": "STK"}],
    )
    cid = await client.resolve_conid("AAPL", "STK")
    assert cid == 265598
    cid2 = await client.resolve_conid("AAPL", "STK")
    assert cid2 == 265598


@pytest.mark.asyncio
async def test_get_positions(httpx_mock: HTTPXMock, client):
    httpx_mock.add_response(url=re.compile(r".*/tickle"), json={})
    httpx_mock.add_response(
        url=re.compile(r".*/portfolio/DU1234/positions/0"),
        json=[{"conid": 265598, "position": 10, "mktPrice": 150}],
    )
    res = await client.get_positions()
    assert isinstance(res, list)
    assert res[0]["position"] == 10


@pytest.mark.asyncio
async def test_get_ticker_resolves_and_fetches(httpx_mock: HTTPXMock, client):
    httpx_mock.add_response(url=re.compile(r".*/tickle"), json={})
    httpx_mock.add_response(
        url=re.compile(r".*/trsrv/secdef/search.*symbol=AAPL"),
        json=[{"conid": 265598, "symbol": "AAPL", "secType": "STK"}],
    )
    httpx_mock.add_response(
        url=re.compile(r".*/iserver/marketdata/snapshot"),
        json=[{"31": "150.5", "84": "150.4", "86": "150.6", "conid": 265598}],
    )
    snap = await client.get_ticker("AAPL", "stocks")
    assert snap["last_price"] == 150.5
    assert snap["bid"] == 150.4
    assert snap["ask"] == 150.6


@pytest.mark.asyncio
async def test_resolve_conid_empty_response_raises(httpx_mock: HTTPXMock, client):
    httpx_mock.add_response(url=re.compile(r".*/tickle"), json={})
    httpx_mock.add_response(
        url=re.compile(r".*/trsrv/secdef/search.*symbol=NOPE"),
        json=[],
    )
    with pytest.raises(IBKRError, match="IBKR_CONID_NOT_FOUND"):
        await client.resolve_conid("NOPE", "STK")


@pytest.mark.asyncio
async def test_resolve_conid_malformed_response_raises(httpx_mock: HTTPXMock, client):
    httpx_mock.add_response(url=re.compile(r".*/tickle"), json={})
    httpx_mock.add_response(
        url=re.compile(r".*/trsrv/secdef/search.*symbol=BAD"),
        json=[{"symbol": "BAD"}],  # missing conid key
    )
    with pytest.raises(IBKRError, match="malformed"):
        await client.resolve_conid("BAD", "STK")


@pytest.mark.asyncio
async def test_place_order_auto_confirms_warning(httpx_mock: HTTPXMock, client):
    httpx_mock.add_response(url=re.compile(r".*/tickle"), json={})
    httpx_mock.add_response(
        url=re.compile(r".*/trsrv/secdef/search"),
        json=[{"conid": 265598, "symbol": "AAPL", "secType": "STK"}],
    )
    httpx_mock.add_response(
        url=re.compile(r".*/iserver/account/DU1234/orders$"),
        method="POST",
        json=[{"id": "msgid1", "message": ["outside RTH"]}],
    )
    httpx_mock.add_response(
        url=re.compile(r".*/iserver/reply/msgid1"),
        method="POST",
        json=[{"order_id": "OID42", "order_status": "Submitted"}],
    )
    res = await client.place_order(
        symbol="AAPL", side="buy", qty=1, order_type="market",
    )
    assert res["order_id"] == "OID42"


@pytest.mark.asyncio
async def test_place_order_rejects_critical_warning(httpx_mock: HTTPXMock, client):
    httpx_mock.add_response(url=re.compile(r".*/tickle"), json={})
    httpx_mock.add_response(
        url=re.compile(r".*/trsrv/secdef/search"),
        json=[{"conid": 265598, "symbol": "AAPL", "secType": "STK"}],
    )
    httpx_mock.add_response(
        url=re.compile(r".*/iserver/account/DU1234/orders$"),
        method="POST",
        json=[{"id": "msgid2", "message": ["Margin requirement exceeded"]}],
    )
    with pytest.raises(IBKRError, match="IBKR_ORDER_REJECTED_WARNING"):
        await client.place_order(
            symbol="AAPL", side="buy", qty=1000000, order_type="market",
        )


@pytest.mark.asyncio
async def test_cancel_order(httpx_mock: HTTPXMock, client):
    httpx_mock.add_response(url=re.compile(r".*/tickle"), json={})
    httpx_mock.add_response(
        url=re.compile(r".*/iserver/account/DU1234/order/OID42"),
        method="DELETE",
        json={"msg": "Request was submitted", "order_id": "OID42"},
    )
    res = await client.cancel_order("OID42")
    assert res["canceled"] is True


@pytest.mark.asyncio
async def test_place_order_too_many_confirmations(httpx_mock: HTTPXMock, client):
    httpx_mock.add_response(url=re.compile(r".*/tickle"), json={})
    httpx_mock.add_response(
        url=re.compile(r".*/trsrv/secdef/search"),
        json=[{"conid": 265598, "symbol": "AAPL", "secType": "STK"}],
    )
    # Initial place + 3 reply cycles all return new warnings: should fail at MAX_CYCLES
    httpx_mock.add_response(
        url=re.compile(r".*/iserver/account/DU1234/orders$"),
        method="POST",
        json=[{"id": "msg1", "message": ["outside RTH"]}],
    )
    for n in range(2, 5):
        httpx_mock.add_response(
            url=re.compile(rf".*/iserver/reply/msg{n-1}$"),
            method="POST",
            json=[{"id": f"msg{n}", "message": ["outside RTH"]}],
        )
    with pytest.raises(IBKRError, match="IBKR_ORDER_TOO_MANY_CONFIRMATIONS"):
        await client.place_order(
            symbol="AAPL", side="buy", qty=1, order_type="market",
        )
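Editor's note: two of the tests above pin a retry-once-on-401 behaviour around the live session token (LST). A minimal sketch of that control flow under assumed internal names (`_send_signed`, `_signer`); only `IBKRAuthError`, the LST refresh call, and the "after retry" wording are taken from the tests.

```python
# Hypothetical sketch of the retry-once-on-401 logic exercised above.
import httpx

from cerbero_mcp.exchanges.ibkr.oauth import IBKRAuthError  # real import per the tests


class _RetryOnceMixin:
    async def _request(self, method: str, path: str, **kw) -> dict:
        resp: httpx.Response = await self._send_signed(method, path, **kw)  # assumed helper
        if resp.status_code == 401:
            # Session likely expired: mint a fresh LST, then retry exactly once
            await self._signer.get_live_session_token()
            resp = await self._send_signed(method, path, **kw)
            if resp.status_code == 401:
                raise IBKRAuthError(f"401 on {path} after retry")
        resp.raise_for_status()
        return resp.json()
```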
@@ -0,0 +1,80 @@
from __future__ import annotations

import pytest
from cerbero_mcp.exchanges.ibkr.key_rotation import KeyRotationManager


@pytest.mark.asyncio
async def test_start_generates_new_keypair_files(tmp_path):
    sig_path = tmp_path / "sig.pem"
    enc_path = tmp_path / "enc.pem"
    sig_path.write_bytes(b"old-sig")
    enc_path.write_bytes(b"old-enc")

    mgr = KeyRotationManager(
        signature_key_path=str(sig_path),
        encryption_key_path=str(enc_path),
    )
    out = await mgr.start()
    assert "sig" in out["fingerprints"]
    assert "enc" in out["fingerprints"]
    assert (tmp_path / "sig.pem.new").exists()
    assert (tmp_path / "enc.pem.new").exists()


@pytest.mark.asyncio
async def test_confirm_swap_and_validate_ok(tmp_path):
    sig_path = tmp_path / "sig.pem"
    enc_path = tmp_path / "enc.pem"
    sig_path.write_bytes(b"old-sig")
    enc_path.write_bytes(b"old-enc")

    mgr = KeyRotationManager(
        signature_key_path=str(sig_path),
        encryption_key_path=str(enc_path),
    )
    await mgr.start()

    async def fake_validate() -> bool:
        return True

    out = await mgr.confirm(validate=fake_validate)
    assert "rotated_at" in out
    assert (tmp_path / ".archive").exists()


@pytest.mark.asyncio
async def test_confirm_validate_fail_rollbacks(tmp_path):
    sig_path = tmp_path / "sig.pem"
    enc_path = tmp_path / "enc.pem"
    sig_path.write_bytes(b"old-sig")
    enc_path.write_bytes(b"old-enc")

    mgr = KeyRotationManager(
        signature_key_path=str(sig_path),
        encryption_key_path=str(enc_path),
    )
    await mgr.start()

    async def fake_validate() -> bool:
        return False

    with pytest.raises(RuntimeError, match="IBKR_ROTATION_VALIDATION_FAILED"):
        await mgr.confirm(validate=fake_validate)
    assert sig_path.read_bytes() == b"old-sig"
    assert enc_path.read_bytes() == b"old-enc"


@pytest.mark.asyncio
async def test_abort_cleans_new_files(tmp_path):
    sig_path = tmp_path / "sig.pem"
    enc_path = tmp_path / "enc.pem"
    sig_path.write_bytes(b"old-sig")
    enc_path.write_bytes(b"old-enc")

    mgr = KeyRotationManager(
        signature_key_path=str(sig_path),
        encryption_key_path=str(enc_path),
    )
    await mgr.start()
    await mgr.abort()
    assert not (tmp_path / "sig.pem.new").exists()
    assert not (tmp_path / "enc.pem.new").exists()
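Editor's note: the tests above describe a two-phase rotation (stage `*.pem.new` keypairs, then confirm-with-validation or abort). The sketch below is an assumed reconstruction of those semantics only: every internal detail (RSA generation, fingerprinting, archive layout) is a guess made merely to keep the example runnable.

```python
# Hypothetical two-phase key rotation consistent with the tests above.
from __future__ import annotations

import hashlib
import shutil
from datetime import datetime, timezone
from pathlib import Path
from typing import Awaitable, Callable

from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa


def _generate_rsa_pem() -> bytes:
    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    return key.private_bytes(
        serialization.Encoding.PEM,
        serialization.PrivateFormat.TraditionalOpenSSL,
        serialization.NoEncryption(),
    )


def _fingerprint(pem: bytes) -> str:
    return hashlib.sha256(pem).hexdigest()[:16]


class KeyRotationManager:
    def __init__(self, signature_key_path: str, encryption_key_path: str):
        self._paths = [Path(signature_key_path), Path(encryption_key_path)]

    def _staged(self, p: Path) -> Path:
        return p.parent / (p.name + ".new")

    async def start(self) -> dict:
        fingerprints = {}
        for label, path in zip(("sig", "enc"), self._paths):
            new_pem = _generate_rsa_pem()
            self._staged(path).write_bytes(new_pem)  # stage, do not swap yet
            fingerprints[label] = _fingerprint(new_pem)
        return {"fingerprints": fingerprints}

    async def confirm(self, validate: Callable[[], Awaitable[bool]]) -> dict:
        archive = self._paths[0].parent / ".archive"
        archive.mkdir(exist_ok=True)
        backups = {p: p.read_bytes() for p in self._paths}
        for p in self._paths:  # swap staged keys in, archiving the old ones
            shutil.copy2(p, archive / p.name)
            p.write_bytes(self._staged(p).read_bytes())
        if not await validate():
            for p, old in backups.items():  # roll back on failed validation
                p.write_bytes(old)
            raise RuntimeError("IBKR_ROTATION_VALIDATION_FAILED")
        for p in self._paths:
            self._staged(p).unlink(missing_ok=True)
        return {"rotated_at": datetime.now(timezone.utc).isoformat()}

    async def abort(self) -> None:
        for p in self._paths:
            self._staged(p).unlink(missing_ok=True)
```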
@@ -0,0 +1,132 @@
from __future__ import annotations

import base64 as _b64
import re
import time

import pytest
from cerbero_mcp.exchanges.ibkr.oauth import (
    OAuth1aSigner,
    build_signature_base_string,
)
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding, rsa
from pytest_httpx import HTTPXMock


def test_signature_base_string_canonical_order():
    base = build_signature_base_string(
        method="POST",
        url="https://api.ibkr.com/v1/api/oauth/live_session_token",
        params={
            "oauth_consumer_key": "TEST_CONSUMER",
            "oauth_token": "TEST_TOKEN",
            "oauth_nonce": "abc123",
            "oauth_timestamp": "1700000000",
            "oauth_signature_method": "RSA-SHA256",
            "oauth_version": "1.0",
            "diffie_hellman_challenge": "ff00",
        },
    )
    assert base.startswith("POST&")
    assert "oauth_consumer_key%3DTEST_CONSUMER" in base
    idx_consumer = base.index("oauth_consumer_key")
    idx_token = base.index("oauth_token")
    assert idx_consumer < idx_token


def test_oauth_signer_signs_with_rsa(tmp_path):
    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    pem = key.private_bytes(
        encoding=serialization.Encoding.PEM,
        format=serialization.PrivateFormat.TraditionalOpenSSL,
        encryption_algorithm=serialization.NoEncryption(),
    )
    sig_path = tmp_path / "sig.pem"
    sig_path.write_bytes(pem)
    enc_path = tmp_path / "enc.pem"
    enc_path.write_bytes(pem)

    signer = OAuth1aSigner(
        consumer_key="TEST_CONSUMER",
        access_token="TEST_TOKEN",
        access_token_secret="TEST_SECRET",
        signature_key_path=str(sig_path),
        encryption_key_path=str(enc_path),
        dh_prime="FFFFFFFFFFFFFFFFC90FDAA22168C234C4C6628B80DC1CD129024E088A67CC74020BBEA63B139B22514A08798E3404DDEF9519B3CD3A431B302B0A6DF25F14374FE1356D6D51C245E485B576625E7EC6F44C42E9A637ED6B0BFF5CB6F406B7EDEE386BFB5A899FA5AE9F24117C4B1FE649286651ECE45B3DC2007CB8A163BF0598DA48361C55D39A69163FA8FD24CF5F83655D23DCA3AD961C62F356208552BB9ED529077096966D670C354E4ABC9804F1746C08CA18217C32905E462E36CE3BE39E772C180E86039B2783A2EC07A28FB5C55DF06F4C52C9DE2BCBF6955817183995497CEA956AE515D2261898FA051015728E5A8AACAA68FFFFFFFFFFFFFFFF",
    )
    sig = signer.sign(
        method="GET",
        url="https://api.ibkr.com/v1/api/iserver/auth/status",
        params={
            "oauth_consumer_key": "TEST_CONSUMER",
            "oauth_token": "TEST_TOKEN",
            "oauth_nonce": "abc",
            "oauth_timestamp": "1700000000",
            "oauth_signature_method": "RSA-SHA256",
            "oauth_version": "1.0",
        },
    )

    # Verify the signature against the public key: proves correctness, not just shape
    base = build_signature_base_string(
        method="GET",
        url="https://api.ibkr.com/v1/api/iserver/auth/status",
        params={
            "oauth_consumer_key": "TEST_CONSUMER",
            "oauth_token": "TEST_TOKEN",
            "oauth_nonce": "abc",
            "oauth_timestamp": "1700000000",
            "oauth_signature_method": "RSA-SHA256",
            "oauth_version": "1.0",
        },
    )
    key.public_key().verify(
        _b64.b64decode(sig),
        base.encode("utf-8"),
        padding.PKCS1v15(),
        hashes.SHA256(),
    )


@pytest.mark.asyncio
async def test_live_session_token_mint(httpx_mock: HTTPXMock, tmp_path):
    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    pem = key.private_bytes(
        encoding=serialization.Encoding.PEM,
        format=serialization.PrivateFormat.TraditionalOpenSSL,
        encryption_algorithm=serialization.NoEncryption(),
    )
    (tmp_path / "sig.pem").write_bytes(pem)
    (tmp_path / "enc.pem").write_bytes(pem)

    # Encrypt a fake "real" secret with our public key
    raw_secret = b"my_real_token_secret"
    encrypted = key.public_key().encrypt(raw_secret, padding.PKCS1v15())
    encrypted_hex = encrypted.hex()

    httpx_mock.add_response(
        url=re.compile(r".*/oauth/live_session_token"),
        json={
            "diffie_hellman_response": "ff",
            "live_session_token_signature": "00" * 20,
            "live_session_token_expiration": int(time.time() * 1000) + 86400000,
        },
    )

    signer = OAuth1aSigner(
        consumer_key="TEST_CK",
        access_token="TEST_AT",
        access_token_secret=encrypted_hex,
        signature_key_path=str(tmp_path / "sig.pem"),
        encryption_key_path=str(tmp_path / "enc.pem"),
        dh_prime="17",  # hex "17" = 23, a tiny prime so the DH modulus fits in one byte
    )
    lst = await signer.get_live_session_token(
        base_url="https://api.ibkr.com/v1/api"
    )
    assert isinstance(lst, str) and len(lst) > 0
    lst2 = await signer.get_live_session_token(
        base_url="https://api.ibkr.com/v1/api"
    )
    assert lst == lst2  # cached
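The canonical-order assertion in the first test holds because the OAuth 1.0a signature base string (RFC 5849 §3.4.1) is `METHOD & percent-encode(url) & percent-encode(sorted-params)`: parameters are percent-encoded first, then sorted by encoded name, so `oauth_consumer_key` always precedes `oauth_token`. A minimal standalone sketch of that construction, independent of the repo's `build_signature_base_string` (URL normalization details omitted):

```python
from urllib.parse import quote

def base_string(method: str, url: str, params: dict[str, str]) -> str:
    # RFC 5849: percent-encode each key/value, sort by encoded key,
    # join with "&", then join METHOD, URL and the parameter string.
    def enc(s: str) -> str:
        return quote(s, safe="")

    pairs = sorted((enc(k), enc(v)) for k, v in params.items())
    param_str = "&".join(f"{k}={v}" for k, v in pairs)
    return f"{method.upper()}&{enc(url)}&{enc(param_str)}"

# "oauth_consumer_key" sorts before "oauth_token", matching the test above.
```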
@@ -0,0 +1,44 @@
from __future__ import annotations

from cerbero_mcp.exchanges.ibkr.orders_complex import (
    OrderSpec,
    build_bracket_payload,
    build_oco_payload,
)


def test_bracket_three_legs_with_oca():
    payload = build_bracket_payload(
        conid=42, sec_type="STK", side="BUY", qty=10,
        entry_price=150.0, stop_loss=145.0, take_profit=160.0,
        tif="GTC", exchange="SMART",
    )
    assert "orders" in payload
    legs = payload["orders"]
    assert len(legs) == 3
    oca = legs[0].get("ocaGroup")
    assert oca and all(l.get("ocaGroup") == oca for l in legs[1:])
    assert legs[0]["orderType"] == "LMT"
    assert legs[0]["price"] == 150.0
    assert legs[0]["side"] == "BUY"
    assert legs[1]["side"] == "SELL"
    assert legs[2]["side"] == "SELL"
    assert legs[1]["orderType"] == "STP"
    assert legs[1]["auxPrice"] == 145.0
    assert legs[2]["orderType"] == "LMT"
    assert legs[2]["price"] == 160.0


def test_oco_oca_group_and_type():
    legs = [
        OrderSpec(conid=1, sec_type="STK", side="BUY", qty=1,
                  order_type="LMT", price=100),
        OrderSpec(conid=1, sec_type="STK", side="BUY", qty=1,
                  order_type="LMT", price=110),
    ]
    payload = build_oco_payload(legs)
    assert len(payload["orders"]) == 2
    oca = payload["orders"][0]["ocaGroup"]
    for o in payload["orders"]:
        assert o["ocaGroup"] == oca
        assert o["ocaType"] == 1
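Both builders hang their legs off a shared `ocaGroup`: IBKR treats orders in a one-cancels-all group as siblings and cancels the rest when one leg fills, with `ocaType == 1` selecting full cancellation. A minimal sketch of what an OCO payload plausibly looks like; field names follow the assertions above, everything else (the function name, legs as plain dicts) is an assumption for illustration.

```python
import uuid

def build_oco_payload_sketch(legs: list[dict]) -> dict:
    # One freshly generated group id is stamped onto every leg.
    oca = f"oca-{uuid.uuid4().hex[:8]}"
    return {
        "orders": [
            {**leg, "ocaGroup": oca, "ocaType": 1}  # 1 = cancel remaining legs
            for leg in legs
        ]
    }
```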
@@ -0,0 +1,137 @@
from __future__ import annotations

from unittest.mock import AsyncMock, MagicMock

import pytest
from cerbero_mcp.exchanges.ibkr import tools as t


def test_place_order_req_schema():
    req = t.PlaceOrderReq(symbol="AAPL", side="buy", qty=1)
    assert req.order_type == "market"
    assert req.tif == "day"
    assert req.exchange == "SMART"


def test_place_order_req_options_validates_occ():
    req = t.PlaceOrderReq(
        symbol="AAPL 240119C00190000", side="buy", qty=1, asset_class="options",
    )
    assert req.asset_class == "options"


@pytest.mark.asyncio
async def test_get_account_tool_calls_client():
    client = MagicMock()
    client.get_account = AsyncMock(return_value={"netliquidation": {"amount": 10000}})
    res = await t.get_account(client, t.GetAccountReq())
    assert res["netliquidation"]["amount"] == 10000


@pytest.mark.asyncio
async def test_get_tick_uses_cache_or_subscribes():
    client = MagicMock()
    client.resolve_conid = AsyncMock(return_value=42)
    ws = MagicMock()
    ws.get_tick_snapshot = MagicMock(side_effect=[
        None,
        {"conid": 42, "last_price": 99.5, "bid": 99.4, "ask": 99.6,
         "bid_size": 1, "ask_size": 1, "timestamp_ms": 1700000000000},
    ])
    ws.subscribe_tick = AsyncMock()

    res = await t.get_tick(
        client, t.GetTickReq(symbol="AAPL"), ws=ws, timeout_s=0.05,
    )
    assert res["last_price"] == 99.5
    ws.subscribe_tick.assert_awaited_once_with(42)


@pytest.mark.asyncio
async def test_place_order_enforces_leverage():
    client = MagicMock()
    client.get_account = AsyncMock(return_value={
        "netliquidation": {"amount": 10000},
    })
    client.place_order = AsyncMock(return_value={"order_id": "O1"})
    creds = {"max_leverage": 2}
    res = await t.place_order(
        client, t.PlaceOrderReq(symbol="AAPL", side="buy", qty=10),
        creds=creds, last_price=100.0,
    )
    assert res["order_id"] == "O1"


@pytest.mark.asyncio
async def test_cancel_order_calls_client():
    client = MagicMock()
    client.cancel_order = AsyncMock(return_value={"order_id": "O1", "canceled": True})
    res = await t.cancel_order(client, t.CancelOrderReq(order_id="O1"))
    assert res["canceled"] is True


@pytest.mark.asyncio
async def test_place_order_rejects_excessive_leverage():
    from fastapi import HTTPException
    client = MagicMock()
    client.get_account = AsyncMock(return_value={
        "netliquidation": {"amount": 1000},
    })
    creds = {"max_leverage": 2}
    # Order notional = 100*100 = 10000 vs equity 1000 → ratio 10x >> 2x cap → 403
    with pytest.raises(HTTPException) as exc_info:
        await t.place_order(
            client, t.PlaceOrderReq(symbol="AAPL", side="buy", qty=100),
            creds=creds, last_price=100.0,
        )
    assert exc_info.value.status_code == 403
    assert exc_info.value.detail["error"] == "LEVERAGE_CAP_EXCEEDED"


@pytest.mark.asyncio
async def test_place_bracket_order_calls_client_with_three_legs():
    client = MagicMock()
    client.resolve_conid = AsyncMock(return_value=42)
    client.account_id = "DU1"
    client.get_account = AsyncMock(return_value={"netliquidation": {"amount": 100000}})
    client._submit_order_with_confirmation = AsyncMock(
        return_value={"order_id": "OID-parent"}
    )
    res = await t.place_bracket_order(
        client,
        t.PlaceBracketOrderReq(
            symbol="AAPL", side="buy", qty=1,
            entry_price=150, stop_loss=145, take_profit=160,
        ),
        creds={"max_leverage": 4},
    )
    assert res["order_id"] == "OID-parent"
    payload = client._submit_order_with_confirmation.call_args[0][0]
    assert len(payload["orders"]) == 3


@pytest.mark.asyncio
async def test_place_oto_partial_failure_cancels_trigger():
    from cerbero_mcp.exchanges.ibkr.client import IBKRError
    client = MagicMock()
    client.resolve_conid = AsyncMock(return_value=42)
    client.account_id = "DU1"
    client._submit_order_with_confirmation = AsyncMock(
        side_effect=[
            {"order_id": "TRIG1"},
            IBKRError("network"),
        ]
    )
    client.cancel_order = AsyncMock(return_value={"canceled": True})
    with pytest.raises(IBKRError, match="IBKR_OTO_PARTIAL_FAILURE"):
        await t.place_oto_order(
            client,
            t.PlaceOtoOrderReq(
                trigger=t.OrderLeg(symbol="AAPL", side="buy", qty=1,
                                   order_type="limit", limit_price=150),
                child=t.OrderLeg(symbol="AAPL", side="sell", qty=1,
                                 order_type="limit", limit_price=160),
            ),
            creds={"max_leverage": 4},
        )
    client.cancel_order.assert_awaited_once_with("TRIG1")
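The two leverage tests bracket the same pre-trade gate: order notional (qty × last price) divided by account equity must not exceed the per-exchange cap, otherwise the tool answers 403 with `LEVERAGE_CAP_EXCEEDED` before anything reaches IBKR. A minimal sketch of that check; the function name and exact detail payload are assumptions, only the status code and error string come from the tests.

```python
from fastapi import HTTPException

def enforce_leverage(qty: float, last_price: float,
                     equity: float, max_leverage: float) -> None:
    notional = qty * last_price              # e.g. 100 * 100.0 = 10_000
    if notional > equity * max_leverage:     # 10_000 > 1_000 * 2 -> reject
        raise HTTPException(
            status_code=403,
            detail={"error": "LEVERAGE_CAP_EXCEEDED",
                    "notional": notional, "equity": equity},
        )
```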
@@ -0,0 +1,124 @@
from __future__ import annotations

import asyncio
import json
from unittest.mock import AsyncMock, MagicMock

import pytest
from cerbero_mcp.exchanges.ibkr.ws import IBKRWebSocket, WSError


class FakeWS:
    """Bidirectional async fake for WSS messages."""
    def __init__(self) -> None:
        self.sent: list[str] = []
        self._inbox: asyncio.Queue[str] = asyncio.Queue()
        self.closed = False
    async def send(self, msg: str) -> None:
        self.sent.append(msg)
    async def recv(self) -> str:
        return await self._inbox.get()
    async def close(self) -> None:
        self.closed = True
    async def push(self, payload: dict) -> None:
        await self._inbox.put(json.dumps(payload))


@pytest.fixture
def fake_signer():
    s = MagicMock()
    s.get_live_session_token = AsyncMock(return_value="LST==")
    return s


@pytest.mark.asyncio
async def test_subscribe_tick_caches_snapshot(fake_signer, monkeypatch):
    fake_ws = FakeWS()

    async def fake_connect(url, **kw):
        return fake_ws

    monkeypatch.setattr("cerbero_mcp.exchanges.ibkr.ws.websockets_connect", fake_connect)

    ws = IBKRWebSocket(
        signer=fake_signer,
        ws_url="wss://api.ibkr.com/v1/api/ws",
        base_url="https://api.ibkr.com/v1/api",
        max_subs=80, idle_timeout_s=300,
    )
    await ws.start()
    await ws.subscribe_tick(265598)

    await fake_ws.push({
        "topic": "smd+265598",
        "31": "150.5", "84": "150.4", "86": "150.6",
        "7295": "100", "7296": "200",
    })
    await asyncio.sleep(0.05)

    snap = ws.get_tick_snapshot(265598)
    assert snap is not None
    assert snap["last_price"] == 150.5
    assert snap["bid"] == 150.4

    await ws.stop()


@pytest.mark.asyncio
async def test_subscribe_limit(fake_signer, monkeypatch):
    fake_ws = FakeWS()

    async def fake_connect(url, **kw):
        return fake_ws

    monkeypatch.setattr("cerbero_mcp.exchanges.ibkr.ws.websockets_connect", fake_connect)

    ws = IBKRWebSocket(
        signer=fake_signer,
        ws_url="wss://x", base_url="https://x",
        max_subs=2, idle_timeout_s=300,
    )
    await ws.start()
    await ws.subscribe_tick(1)
    await ws.subscribe_tick(2)
    with pytest.raises(WSError, match="IBKR_WS_SUB_LIMIT"):
        await ws.subscribe_tick(3)
    await ws.stop()


@pytest.mark.asyncio
async def test_subscribe_before_start_raises(fake_signer):
    ws = IBKRWebSocket(
        signer=fake_signer,
        ws_url="wss://x", base_url="https://x",
        max_subs=10, idle_timeout_s=300,
    )
    with pytest.raises(WSError, match="IBKR_WS_NOT_STARTED"):
        await ws.subscribe_tick(1)


@pytest.mark.asyncio
async def test_start_after_stop_resumes_reader(fake_signer, monkeypatch):
    fake_ws_a = FakeWS()
    fake_ws_b = FakeWS()
    fakes = iter([fake_ws_a, fake_ws_b])

    async def fake_connect(url, **kw):
        return next(fakes)

    monkeypatch.setattr("cerbero_mcp.exchanges.ibkr.ws.websockets_connect", fake_connect)

    ws = IBKRWebSocket(
        signer=fake_signer,
        ws_url="wss://x", base_url="https://x",
        max_subs=10, idle_timeout_s=300,
    )
    await ws.start()
    await ws.stop()
    # Restart with fresh fake_ws_b
    await ws.start()
    await ws.subscribe_tick(42)
    await fake_ws_b.push({"topic": "smd+42", "31": "100"})
    await asyncio.sleep(0.05)
    assert ws.get_tick_snapshot(42) is not None
    await ws.stop()
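These WS tests encode three invariants: subscriptions are rejected before `start()`, capped at `max_subs`, and each `smd+<conid>` message refreshes a per-conid snapshot cache (31/84/86 are IBKR's last/bid/ask tick field codes). A minimal sketch of the cap-and-cache bookkeeping behind `subscribe_tick`/`get_tick_snapshot`, with hypothetical names; the real class also handles auth, reconnects, and idle timeouts.

```python
class TickCacheSketch:
    """Hypothetical bookkeeping behind subscribe_tick/get_tick_snapshot."""

    def __init__(self, max_subs: int) -> None:
        self.max_subs = max_subs
        self.started = False
        self._snapshots: dict[int, dict] = {}

    def subscribe(self, conid: int) -> None:
        if not self.started:
            raise RuntimeError("IBKR_WS_NOT_STARTED")
        if conid not in self._snapshots and len(self._snapshots) >= self.max_subs:
            raise RuntimeError("IBKR_WS_SUB_LIMIT")
        self._snapshots.setdefault(conid, {})

    def on_message(self, msg: dict) -> None:
        # "smd+265598" -> 265598; tick fields land in the snapshot.
        conid = int(msg["topic"].split("+", 1)[1])
        snap = self._snapshots.setdefault(conid, {})
        for field, key in (("31", "last_price"), ("84", "bid"), ("86", "ask")):
            if field in msg:
                snap[key] = float(msg[field])

    def get_tick_snapshot(self, conid: int) -> dict | None:
        return self._snapshots.get(conid) or None
```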
@@ -0,0 +1,155 @@
from __future__ import annotations

import json
from pathlib import Path

import pytest
from fastapi.testclient import TestClient


@pytest.fixture
def tmp_audit_file(tmp_path, monkeypatch):
    file_path = tmp_path / "audit.jsonl"
    monkeypatch.setenv("AUDIT_LOG_FILE", str(file_path))
    return file_path


@pytest.fixture
def app(monkeypatch, tmp_audit_file):
    from tests.unit.test_settings import _minimal_env
    for k, v in _minimal_env().items():
        monkeypatch.setenv(k, v)
    from cerbero_mcp.__main__ import _make_app
    from cerbero_mcp.settings import Settings
    return _make_app(Settings())


def _write_records(file_path: Path, records: list[dict]) -> None:
    with file_path.open("w") as f:
        for r in records:
            f.write(json.dumps(r) + "\n")


def _bearer_test():
    return {"Authorization": "Bearer t_test_123"}


def test_admin_audit_no_file(app):
    """Without AUDIT_LOG_FILE set, returns empty + a warning."""
    import os
    os.environ.pop("AUDIT_LOG_FILE", None)
    c = TestClient(app)
    r = c.get("/admin/audit", headers=_bearer_test())
    assert r.status_code == 200
    body = r.json()
    assert body["count"] == 0
    assert "warning" in body


def test_admin_audit_no_bearer_returns_401(app):
    c = TestClient(app)
    r = c.get("/admin/audit")
    assert r.status_code == 401


def test_admin_audit_no_bot_tag_required(app, tmp_audit_file):
    """The admin endpoint does NOT require X-Bot-Tag (bearer only)."""
    _write_records(tmp_audit_file, [])
    c = TestClient(app)
    r = c.get("/admin/audit", headers=_bearer_test())
    assert r.status_code == 200


def test_admin_audit_returns_records(app, tmp_audit_file):
    records = [
        {
            "audit_event": "write_op",
            "asctime": "2026-05-01 10:00:00,000",
            "actor": "testnet", "bot_tag": "alpha",
            "exchange": "deribit", "action": "place_order",
            "target": "BTC-PERPETUAL",
            "payload": {"qty": 0.1},
            "result": {"order_id": "abc"},
        },
        {
            "audit_event": "write_op",
            "asctime": "2026-05-01 11:00:00,000",
            "actor": "mainnet", "bot_tag": "beta",
            "exchange": "bybit", "action": "cancel_order",
            "target": "ord-1",
            "payload": {},
        },
    ]
    _write_records(tmp_audit_file, records)
    c = TestClient(app)
    r = c.get("/admin/audit", headers=_bearer_test())
    assert r.status_code == 200
    body = r.json()
    assert body["count"] == 2


def test_admin_audit_filter_by_actor(app, tmp_audit_file):
    records = [
        {"audit_event": "write_op", "asctime": "2026-05-01 10:00:00,000",
         "actor": "testnet", "bot_tag": "a", "exchange": "deribit", "action": "place_order"},
        {"audit_event": "write_op", "asctime": "2026-05-01 11:00:00,000",
         "actor": "mainnet", "bot_tag": "b", "exchange": "bybit", "action": "place_order"},
    ]
    _write_records(tmp_audit_file, records)
    c = TestClient(app)
    r = c.get("/admin/audit?actor=mainnet", headers=_bearer_test())
    assert r.status_code == 200
    body = r.json()
    assert body["count"] == 1
    assert body["records"][0]["actor"] == "mainnet"


def test_admin_audit_filter_by_date_range(app, tmp_audit_file):
    records = [
        {"audit_event": "write_op", "asctime": "2026-04-30 10:00:00,000",
         "actor": "testnet", "exchange": "deribit", "action": "place_order"},
        {"audit_event": "write_op", "asctime": "2026-05-01 10:00:00,000",
         "actor": "testnet", "exchange": "deribit", "action": "place_order"},
        {"audit_event": "write_op", "asctime": "2026-05-02 10:00:00,000",
         "actor": "testnet", "exchange": "deribit", "action": "place_order"},
    ]
    _write_records(tmp_audit_file, records)
    c = TestClient(app)
    r = c.get("/admin/audit?from=2026-05-01&to=2026-05-01T23:59:59", headers=_bearer_test())
    assert r.status_code == 200
    assert r.json()["count"] == 1


def test_admin_audit_filter_by_bot_tag(app, tmp_audit_file):
    records = [
        {"audit_event": "write_op", "asctime": "2026-05-01 10:00:00,000",
         "actor": "testnet", "bot_tag": "alpha", "exchange": "deribit", "action": "place_order"},
        {"audit_event": "write_op", "asctime": "2026-05-01 11:00:00,000",
         "actor": "testnet", "bot_tag": "beta", "exchange": "deribit", "action": "place_order"},
    ]
    _write_records(tmp_audit_file, records)
    c = TestClient(app)
    r = c.get("/admin/audit?bot_tag=alpha", headers=_bearer_test())
    assert r.status_code == 200
    assert r.json()["count"] == 1
    assert r.json()["records"][0]["bot_tag"] == "alpha"


def test_admin_audit_invalid_date(app, tmp_audit_file):
    _write_records(tmp_audit_file, [])
    c = TestClient(app)
    r = c.get("/admin/audit?from=not-a-date", headers=_bearer_test())
    assert r.status_code == 400


def test_admin_audit_limit(app, tmp_audit_file):
    records = [
        {"audit_event": "write_op", "asctime": f"2026-05-01 10:{i:02d}:00,000",
         "actor": "testnet", "exchange": "deribit", "action": "place_order"}
        for i in range(50)
    ]
    _write_records(tmp_audit_file, records)
    c = TestClient(app)
    r = c.get("/admin/audit?limit=10", headers=_bearer_test())
    assert r.status_code == 200
    assert r.json()["count"] == 10
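Behind these tests the endpoint is just a filtered read of a JSON-lines file: each line is one audit record, and `actor`, `bot_tag`, `from`/`to` (matched against `asctime`) and `limit` narrow the result. A minimal sketch of that read path, assuming nothing about the real handler beyond the observed query parameters and record shape:

```python
import json
from datetime import datetime
from pathlib import Path

def read_audit(path: Path, actor=None, bot_tag=None,
               dt_from=None, dt_to=None, limit=100) -> list[dict]:
    records = []
    for line in path.read_text().splitlines():
        rec = json.loads(line)
        # asctime uses the logging-style ",%f" millisecond separator
        ts = datetime.strptime(rec["asctime"], "%Y-%m-%d %H:%M:%S,%f")
        if actor and rec.get("actor") != actor:
            continue
        if bot_tag and rec.get("bot_tag") != bot_tag:
            continue
        if dt_from and ts < dt_from:
            continue
        if dt_to and ts > dt_to:
            continue
        records.append(rec)
    return records[:limit]
```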
+78 -2
@@ -67,7 +67,10 @@ def test_testnet_token_sets_env_testnet():
        return {"env": request.state.environment}

    c = TestClient(fa)
    r = c.get("/mcp-deribit/peek", headers={"Authorization": "Bearer tk_test"})
    r = c.get(
        "/mcp-deribit/peek",
        headers={"Authorization": "Bearer tk_test", "X-Bot-Tag": "test-bot"},
    )
    assert r.status_code == 200
    assert r.json() == {"env": "testnet"}

@@ -83,7 +86,10 @@ def test_mainnet_token_sets_env_mainnet():
        return {"env": request.state.environment}

    c = TestClient(fa)
    r = c.get("/mcp-deribit/peek", headers={"Authorization": "Bearer tk_live"})
    r = c.get(
        "/mcp-deribit/peek",
        headers={"Authorization": "Bearer tk_live", "X-Bot-Tag": "test-bot"},
    )
    assert r.status_code == 200
    assert r.json() == {"env": "mainnet"}

@@ -96,3 +102,73 @@ def test_uses_compare_digest():

    src = inspect.getsource(auth)
    assert "compare_digest" in src, "auth.py must use secrets.compare_digest"


# ── X-Bot-Tag header ─────────────────────────────────────────────────────────

def test_missing_bot_tag_returns_400():
    from cerbero_mcp.auth import install_auth_middleware
    fa = FastAPI()
    install_auth_middleware(fa, testnet_token="t", mainnet_token="m")

    @fa.get("/mcp-deribit/health")
    def h():
        return {"ok": True}

    c = TestClient(fa)
    r = c.get("/mcp-deribit/health", headers={"Authorization": "Bearer t"})
    assert r.status_code == 400
    assert "X-Bot-Tag" in r.json()["error"]["message"]


def test_bot_tag_accepted_and_set_on_state():
    from cerbero_mcp.auth import install_auth_middleware
    fa = FastAPI()
    install_auth_middleware(fa, testnet_token="t", mainnet_token="m")

    @fa.get("/mcp-deribit/peek")
    def peek(request: Request):
        return {
            "env": request.state.environment,
            "bot_tag": request.state.bot_tag,
        }

    c = TestClient(fa)
    r = c.get(
        "/mcp-deribit/peek",
        headers={"Authorization": "Bearer t", "X-Bot-Tag": "scanner-alpha"},
    )
    assert r.status_code == 200
    assert r.json() == {"env": "testnet", "bot_tag": "scanner-alpha"}


def test_bot_tag_too_long_returns_400():
    from cerbero_mcp.auth import install_auth_middleware
    fa = FastAPI()
    install_auth_middleware(fa, testnet_token="t", mainnet_token="m")

    @fa.get("/mcp-deribit/health")
    def h():
        return {"ok": True}

    c = TestClient(fa)
    r = c.get(
        "/mcp-deribit/health",
        headers={"Authorization": "Bearer t", "X-Bot-Tag": "x" * 65},
    )
    assert r.status_code == 400


def test_bot_tag_not_required_on_health():
    """The health endpoint must stay reachable without auth and without a bot tag."""
    from cerbero_mcp.auth import install_auth_middleware
    fa = FastAPI()
    install_auth_middleware(fa, testnet_token="t", mainnet_token="m")

    @fa.get("/health")
    def h():
        return {"ok": True}

    c = TestClient(fa)
    r = c.get("/health")
    assert r.status_code == 200
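The three new cases pin the header contract: authenticated routes must carry an `X-Bot-Tag` of at most 64 chars (missing or oversized → 400), the tag lands on `request.state.bot_tag`, and whitelisted paths like `/health` skip the check entirely. A minimal middleware sketch of just that validation, assuming a plain FastAPI HTTP middleware; the repo's `install_auth_middleware` also does the bearer matching, omitted here.

```python
from fastapi import FastAPI, Request
from fastapi.responses import JSONResponse

WHITELIST = {"/health", "/health/ready"}  # assumed whitelist, per the tests

def install_bot_tag_check(app: FastAPI) -> None:
    @app.middleware("http")
    async def _check(request: Request, call_next):
        if request.url.path not in WHITELIST:
            tag = request.headers.get("X-Bot-Tag")
            if not tag or len(tag) > 64:
                return JSONResponse(
                    status_code=400,
                    content={"error": {"message": "X-Bot-Tag header required (max 64 chars)"}},
                )
            request.state.bot_tag = tag
        return await call_next(request)
```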
@@ -29,15 +29,8 @@ async def test_build_client_bybit_returns_correct_env(monkeypatch):
    for k, v in _minimal_env().items():
        monkeypatch.setenv(k, v)

    # Stub pybit HTTP to avoid a real connection during __init__
    from cerbero_mcp.exchanges.bybit import client as bybit_client

    class _FakeHTTP:
        def __init__(self, **kwargs):
            self.kwargs = kwargs

    monkeypatch.setattr(bybit_client, "HTTP", _FakeHTTP)

    # BybitClient builds its httpx.AsyncClient internally: no real
    # connection is opened until a network method is invoked.
    from cerbero_mcp.exchanges import build_client
    from cerbero_mcp.settings import Settings

@@ -78,28 +71,22 @@ async def test_build_client_alpaca_returns_correct_env(monkeypatch):
    for k, v in _minimal_env().items():
        monkeypatch.setenv(k, v)

    # Stub the alpaca SDK clients to avoid real connections in __init__
    from cerbero_mcp.exchanges.alpaca import client as alpaca_client

    class _FakeSdk:
        def __init__(self, **kwargs):
            self.kwargs = kwargs

    monkeypatch.setattr(alpaca_client, "TradingClient", _FakeSdk)
    monkeypatch.setattr(alpaca_client, "StockHistoricalDataClient", _FakeSdk)
    monkeypatch.setattr(alpaca_client, "CryptoHistoricalDataClient", _FakeSdk)
    monkeypatch.setattr(alpaca_client, "OptionHistoricalDataClient", _FakeSdk)

    # AlpacaClient (V2) uses plain httpx: the constructor opens no real
    # connections (httpx.AsyncClient is lazy until the first request), so no
    # SDK stubs are needed.
    from cerbero_mcp.exchanges import build_client
    from cerbero_mcp.settings import Settings

    s = Settings()
    c_test = await build_client(s, "alpaca", "testnet")
    c_live = await build_client(s, "alpaca", "mainnet")

    try:
        assert c_test is not c_live
        assert c_test.paper is True
        assert c_live.paper is False
    finally:
        await c_test.aclose()
        await c_live.aclose()


@pytest.mark.asyncio
@@ -202,8 +189,8 @@ async def test_hyperliquid_url_from_env_overrides_default(monkeypatch):

@pytest.mark.asyncio
async def test_bybit_url_from_env_overrides_default(monkeypatch):
    """Bybit: pybit does not accept `endpoint` as a kwarg, but setting
    `_http.endpoint` post-init reflects the override."""
    """Bybit (httpx): the BYBIT_URL_TESTNET override is applied directly to
    `self.base_url`, used as the base of every V5 REST request."""
    from tests.unit.test_settings import _minimal_env

    env = _minimal_env(BYBIT_URL_TESTNET="https://bybit-custom.example.com")

@@ -216,14 +203,12 @@ async def test_bybit_url_from_env_overrides_default(monkeypatch):
    s = Settings()
    c = await build_client(s, "bybit", "testnet")
    assert c.base_url == "https://bybit-custom.example.com"
    # override applied to the pybit HTTP instance via the `endpoint` attribute
    assert getattr(c._http, "endpoint", None) == "https://bybit-custom.example.com"


@pytest.mark.asyncio
async def test_alpaca_url_from_env_overrides_default(monkeypatch):
    """Alpaca: TradingClient supports url_override for the trading API.
    Data clients (Stock/Crypto/Option) have no constructor override."""
    """Alpaca V2 (httpx): the `base_url` override applies only to the trading
    endpoint; data endpoints (data.alpaca.markets) stay hardcoded."""
    from tests.unit.test_settings import _minimal_env

    env = _minimal_env(ALPACA_URL_TESTNET="https://alpaca-custom.example.com")

@@ -235,7 +220,10 @@ async def test_alpaca_url_from_env_overrides_default(monkeypatch):

    s = Settings()
    c = await build_client(s, "alpaca", "testnet")
    try:
        assert c.base_url == "https://alpaca-custom.example.com"
    finally:
        await c.aclose()


@pytest.mark.asyncio
@@ -0,0 +1,121 @@
from __future__ import annotations

import logging

import pytest
from fastapi.testclient import TestClient
from starlette.requests import Request as StarletteRequest


@pytest.fixture
def app(monkeypatch):
    from tests.unit.test_settings import _minimal_env
    for k, v in _minimal_env().items():
        monkeypatch.setenv(k, v)
    from cerbero_mcp.__main__ import _make_app
    from cerbero_mcp.settings import Settings
    return _make_app(Settings())


@pytest.fixture
def request_log_caplog(caplog):
    """Caplog for the mcp.request logger: attaches the caplog handler directly
    to the logger, bypassing the ``propagate=False`` set by common/logging.py.
    """
    lg = logging.getLogger("mcp.request")
    lg.addHandler(caplog.handler)
    lg.setLevel(logging.INFO)
    caplog.set_level(logging.INFO, logger="mcp.request")
    try:
        yield caplog
    finally:
        lg.removeHandler(caplog.handler)


def _request_records(caplog):
    return [rec for rec in caplog.records if rec.name == "mcp.request"]


def test_request_log_emits_for_health(app, request_log_caplog):
    c = TestClient(app)
    r = c.get("/health")
    assert r.status_code == 200
    records = _request_records(request_log_caplog)
    assert any(getattr(rec, "path", None) == "/health" for rec in records)


def test_request_log_includes_request_id(app, request_log_caplog):
    c = TestClient(app)
    c.get("/health")
    records = _request_records(request_log_caplog)
    assert records, "expected at least one mcp.request record"
    for rec in records:
        rid = getattr(rec, "request_id", None)
        assert rid and isinstance(rid, str) and len(rid) >= 16


def test_request_log_includes_method_status_duration(app, request_log_caplog):
    c = TestClient(app)
    c.get("/health")
    records = _request_records(request_log_caplog)
    rec = next(rec for rec in records if getattr(rec, "path", None) == "/health")
    assert getattr(rec, "method", None) == "GET"
    assert getattr(rec, "status_code", None) == 200
    assert isinstance(getattr(rec, "duration_ms", None), int | float)


def test_request_log_includes_actor_and_bot_tag_on_protected(
    app, request_log_caplog
):
    """On an authenticated path, actor/bot_tag/exchange/tool are propagated."""
    c = TestClient(app)
    c.post(
        "/mcp-deribit/tools/is_testnet",
        headers={
            "Authorization": "Bearer t_test_123",
            "X-Bot-Tag": "scanner-x",
        },
        json={},
    )
    records = _request_records(request_log_caplog)
    rec = next(
        rec
        for rec in records
        if getattr(rec, "path", None) == "/mcp-deribit/tools/is_testnet"
    )
    assert getattr(rec, "actor", None) == "testnet"
    assert getattr(rec, "bot_tag", None) == "scanner-x"
    assert getattr(rec, "exchange", None) == "deribit"
    assert getattr(rec, "tool", None) == "is_testnet"


def test_request_log_unauthorized_does_not_have_actor(
    app, request_log_caplog
):
    """Without a bearer, the request log still emits, but without actor/bot_tag."""
    c = TestClient(app)
    c.post("/mcp-deribit/tools/is_testnet", json={})
    records = _request_records(request_log_caplog)
    rec = next(
        rec
        for rec in records
        if getattr(rec, "path", None) == "/mcp-deribit/tools/is_testnet"
    )
    assert getattr(rec, "status_code", None) == 401
    assert getattr(rec, "actor", None) is None
    assert getattr(rec, "exchange", None) == "deribit"


def test_request_id_in_state_for_handlers(app):
    """Verify that request.state.request_id is available to handlers."""
    @app.get("/__test_state")
    def _state_handler(request: StarletteRequest) -> dict:
        return {"rid": request.state.request_id}

    c = TestClient(app)
    r = c.get(
        "/__test_state",
        headers={"Authorization": "Bearer t_test_123", "X-Bot-Tag": "x"},
    )
    assert r.status_code == 200, f"got {r.status_code}: {r.text[:500]}"
    assert r.json()["rid"]
@@ -60,3 +60,123 @@ def test_x_duration_ms_header(app):
    c = TestClient(app)
    r = c.get("/health")
    assert "X-Duration-Ms" in r.headers


def test_health_ready_empty_registry(app):
    """Without a registry, readiness returns not_ready but HTTP 200."""
    c = TestClient(app)
    r = c.get("/health/ready")
    assert r.status_code == 200
    j = r.json()
    assert j["status"] == "not_ready"
    assert j["clients"] == []
    assert j["version"] == "2.0.0"


def test_health_ready_all_healthy(app):
    """Registry with healthy stub clients → status=ready."""
    from cerbero_mcp.client_registry import ClientRegistry

    class _StubOk:
        async def health(self):
            return {"status": "ok"}

    async def _builder(exchange, env):  # pragma: no cover - not called
        return _StubOk()

    reg = ClientRegistry(builder=_builder)
    reg._clients[("deribit", "testnet")] = _StubOk()
    reg._clients[("bybit", "mainnet")] = _StubOk()
    app.state.registry = reg

    c = TestClient(app)
    r = c.get("/health/ready")
    assert r.status_code == 200
    j = r.json()
    assert j["status"] == "ready"
    assert len(j["clients"]) == 2
    for entry in j["clients"]:
        assert entry["healthy"] is True
        assert "duration_ms" in entry


def test_health_ready_degraded_on_error(app):
    """Registry where at least one client raises → status=degraded."""
    from cerbero_mcp.client_registry import ClientRegistry

    class _StubOk:
        async def health(self):
            return {"status": "ok"}

    class _StubFail:
        async def health(self):
            raise RuntimeError("boom")

    async def _builder(exchange, env):  # pragma: no cover - not called
        return _StubOk()

    reg = ClientRegistry(builder=_builder)
    reg._clients[("deribit", "testnet")] = _StubOk()
    reg._clients[("bybit", "mainnet")] = _StubFail()
    app.state.registry = reg

    c = TestClient(app)
    r = c.get("/health/ready")
    assert r.status_code == 200
    j = r.json()
    assert j["status"] == "degraded"
    fail = next(c for c in j["clients"] if c["exchange"] == "bybit")
    assert fail["healthy"] is False
    assert "RuntimeError" in fail["error"]


def test_health_ready_503_when_fail_on_degraded(app, monkeypatch):
    """READY_FAILS_ON_DEGRADED=true → HTTP 503 when degraded."""
    from cerbero_mcp.client_registry import ClientRegistry

    class _StubFail:
        async def health(self):
            raise RuntimeError("boom")

    async def _builder(exchange, env):  # pragma: no cover - not called
        return _StubFail()

    reg = ClientRegistry(builder=_builder)
    reg._clients[("deribit", "testnet")] = _StubFail()
    app.state.registry = reg

    monkeypatch.setenv("READY_FAILS_ON_DEGRADED", "true")
    c = TestClient(app)
    r = c.get("/health/ready")
    assert r.status_code == 503
    assert r.json()["status"] == "degraded"


def test_health_ready_no_probe_method(app):
    """Client without health/is_testnet → marked healthy with a note."""
    from cerbero_mcp.client_registry import ClientRegistry

    class _StubBare:
        pass

    async def _builder(exchange, env):  # pragma: no cover - not called
        return _StubBare()

    reg = ClientRegistry(builder=_builder)
    reg._clients[("foo", "testnet")] = _StubBare()
    app.state.registry = reg

    c = TestClient(app)
    r = c.get("/health/ready")
    assert r.status_code == 200
    j = r.json()
    assert j["status"] == "ready"
    assert j["clients"][0]["note"] == "no probe method"


def test_health_ready_in_whitelist_no_auth(app):
    """/health/ready requires no bearer."""
    c = TestClient(app)
    # No Authorization header → 200 (whitelisted)
    r = c.get("/health/ready")
    assert r.status_code == 200
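This readiness matrix reduces to a simple aggregation: probe every registered client, mark it healthy/unhealthy, and derive `ready` (all healthy), `degraded` (some failed), or `not_ready` (no clients), returning HTTP 503 only behind the `READY_FAILS_ON_DEGRADED` opt-in. A minimal sketch of that reduction; names and the exact entry shape are illustrative, only the statuses and codes come from the tests.

```python
async def readiness(clients: dict, fail_on_degraded: bool = False) -> tuple[int, dict]:
    entries, any_fail = [], False
    for (exchange, env), client in clients.items():
        probe = getattr(client, "health", None)
        if probe is None:
            entries.append({"exchange": exchange, "env": env,
                            "healthy": True, "note": "no probe method"})
            continue
        try:
            await probe()
            entries.append({"exchange": exchange, "env": env, "healthy": True})
        except Exception as exc:  # readiness itself must never raise
            any_fail = True
            entries.append({"exchange": exchange, "env": env,
                            "healthy": False, "error": type(exc).__name__})
    status = "not_ready" if not entries else ("degraded" if any_fail else "ready")
    code = 503 if (status == "degraded" and fail_on_degraded) else 200
    return code, {"status": status, "clients": entries}
```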
+129 -1
@@ -1,5 +1,7 @@
from __future__ import annotations

import os

import pytest
from pydantic import ValidationError

@@ -51,7 +53,8 @@ def test_settings_load_minimal(monkeypatch):
    assert s.alpaca.max_leverage == 1


def test_settings_missing_token_fails(monkeypatch):
def test_settings_missing_token_fails(monkeypatch, tmp_path):
    monkeypatch.chdir(tmp_path)  # isolate from the real .env in the working dir
    env = _minimal_env()
    env.pop("TESTNET_TOKEN")
    for k, v in env.items():

@@ -84,3 +87,128 @@ def test_settings_secret_str_no_leak(monkeypatch):
    s = Settings()
    assert "t_test_123" not in repr(s)
    assert "t_live_456" not in repr(s)


def _isolated(monkeypatch, tmp_path, env: dict) -> None:
    """Isolate Settings from the real working-dir .env and set only the given env."""
    monkeypatch.chdir(tmp_path)
    for k in (
        "DERIBIT_CLIENT_ID", "DERIBIT_CLIENT_SECRET",
        "DERIBIT_CLIENT_ID_TESTNET", "DERIBIT_CLIENT_SECRET_TESTNET",
        "DERIBIT_CLIENT_ID_LIVE", "DERIBIT_CLIENT_SECRET_LIVE",
    ):
        monkeypatch.delenv(k, raising=False)
    for k, v in env.items():
        monkeypatch.setenv(k, v)


def test_deribit_credentials_legacy_single_pair(monkeypatch, tmp_path):
    """Only DERIBIT_CLIENT_ID/SECRET → both envs use the same pair."""
    _isolated(monkeypatch, tmp_path, _minimal_env())

    from cerbero_mcp.settings import Settings

    s = Settings()
    assert s.deribit.credentials("testnet") == ("id", "secret")
    assert s.deribit.credentials("mainnet") == ("id", "secret")


def test_deribit_credentials_per_env_pairs(monkeypatch, tmp_path):
    """_TESTNET and _LIVE pairs → each serves its corresponding env."""
    env = _minimal_env()
    env.pop("DERIBIT_CLIENT_ID")
    env.pop("DERIBIT_CLIENT_SECRET")
    env["DERIBIT_CLIENT_ID_TESTNET"] = "tid"
    env["DERIBIT_CLIENT_SECRET_TESTNET"] = "tsec"
    env["DERIBIT_CLIENT_ID_LIVE"] = "lid"
    env["DERIBIT_CLIENT_SECRET_LIVE"] = "lsec"
    _isolated(monkeypatch, tmp_path, env)

    from cerbero_mcp.settings import Settings

    s = Settings()
    assert s.deribit.credentials("testnet") == ("tid", "tsec")
    assert s.deribit.credentials("mainnet") == ("lid", "lsec")


def test_deribit_credentials_env_specific_overrides_fallback(monkeypatch, tmp_path):
    """A _LIVE pair overrides the base pair even when both are configured."""
    env = _minimal_env()
    env["DERIBIT_CLIENT_ID_LIVE"] = "lid"
    env["DERIBIT_CLIENT_SECRET_LIVE"] = "lsec"
    _isolated(monkeypatch, tmp_path, env)

    from cerbero_mcp.settings import Settings

    s = Settings()
    assert s.deribit.credentials("mainnet") == ("lid", "lsec")
    assert s.deribit.credentials("testnet") == ("id", "secret")  # fallback


def test_deribit_credentials_missing_raises(monkeypatch, tmp_path):
    """No pair configured → explicit ValueError."""
    env = _minimal_env()
    env.pop("DERIBIT_CLIENT_ID")
    env.pop("DERIBIT_CLIENT_SECRET")
    _isolated(monkeypatch, tmp_path, env)

    from cerbero_mcp.settings import Settings

    s = Settings()
    with pytest.raises(ValueError, match="not configured for env=mainnet"):
        s.deribit.credentials("mainnet")


def test_ibkr_settings_prefer_testnet_specific(monkeypatch, tmp_path):
    monkeypatch.chdir(tmp_path)
    for k in list(os.environ):
        if k.startswith("IBKR_"):
            monkeypatch.delenv(k, raising=False)

    monkeypatch.setenv("IBKR_CONSUMER_KEY", "base_consumer")
    monkeypatch.setenv("IBKR_CONSUMER_KEY_TESTNET", "paper_consumer")
    monkeypatch.setenv("IBKR_ACCESS_TOKEN_TESTNET", "paper_token")
    monkeypatch.setenv("IBKR_ACCESS_TOKEN_SECRET_TESTNET", "paper_secret")
    monkeypatch.setenv("IBKR_SIGNATURE_KEY_PATH_TESTNET", "/secrets/sig_paper.pem")
    monkeypatch.setenv("IBKR_ENCRYPTION_KEY_PATH_TESTNET", "/secrets/enc_paper.pem")
    monkeypatch.setenv("IBKR_ACCOUNT_ID_TESTNET", "DU1234567")
    monkeypatch.setenv("IBKR_DH_PRIME", "ffff")

    from cerbero_mcp.settings import IBKRSettings
    s = IBKRSettings()
    creds = s.credentials("testnet")
    assert creds["consumer_key"] == "paper_consumer"
    assert creds["access_token"] == "paper_token"
    assert creds["account_id"] == "DU1234567"


def test_ibkr_settings_fallback_to_base(monkeypatch, tmp_path):
    monkeypatch.chdir(tmp_path)
    for k in list(os.environ):
        if k.startswith("IBKR_"):
            monkeypatch.delenv(k, raising=False)

    monkeypatch.setenv("IBKR_CONSUMER_KEY", "base_consumer")
    monkeypatch.setenv("IBKR_ACCESS_TOKEN", "base_token")
    monkeypatch.setenv("IBKR_ACCESS_TOKEN_SECRET", "base_secret")
    monkeypatch.setenv("IBKR_SIGNATURE_KEY_PATH", "/secrets/sig.pem")
    monkeypatch.setenv("IBKR_ENCRYPTION_KEY_PATH", "/secrets/enc.pem")
    monkeypatch.setenv("IBKR_ACCOUNT_ID_TESTNET", "DU1234567")
    monkeypatch.setenv("IBKR_DH_PRIME", "ffff")

    from cerbero_mcp.settings import IBKRSettings
    s = IBKRSettings()
    creds = s.credentials("testnet")
    assert creds["consumer_key"] == "base_consumer"


def test_ibkr_settings_missing_raises(monkeypatch, tmp_path):
    monkeypatch.chdir(tmp_path)
    for k in list(os.environ):
        if k.startswith("IBKR_"):
            monkeypatch.delenv(k, raising=False)

    from cerbero_mcp.settings import IBKRSettings
    s = IBKRSettings()
    with pytest.raises(ValueError, match="IBKR credentials not configured"):
        s.credentials("testnet")
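All four Deribit cases (and their IBKR mirrors) follow one lookup rule: prefer the `_TESTNET`/`_LIVE` pair for the requested env, fall back to the base pair, and fail loudly when neither is set. A minimal sketch of that resolution order, assuming plain `os.environ` access rather than the repo's pydantic-settings models:

```python
import os

def resolve_pair(prefix: str, env: str) -> tuple[str, str]:
    # The env-specific suffix wins; the un-suffixed pair is the shared fallback.
    suffix = "_TESTNET" if env == "testnet" else "_LIVE"
    cid = os.getenv(f"{prefix}_CLIENT_ID{suffix}") or os.getenv(f"{prefix}_CLIENT_ID")
    sec = os.getenv(f"{prefix}_CLIENT_SECRET{suffix}") or os.getenv(f"{prefix}_CLIENT_SECRET")
    if not (cid and sec):
        raise ValueError(f"{prefix} credentials not configured for env={env}")
    return cid, sec
```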
@@ -13,24 +13,6 @@ resolution-markers = [
    "python_full_version < '3.14' and sys_platform != 'emscripten' and sys_platform != 'win32'",
]

[[package]]
name = "alpaca-py"
version = "0.43.4"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "msgpack" },
    { name = "pandas" },
    { name = "pydantic" },
    { name = "pytz" },
    { name = "requests" },
    { name = "sseclient-py" },
    { name = "websockets" },
]
sdist = { url = "https://files.pythonhosted.org/packages/2f/9d/3003f661c15b8003655c447c187aec10f0843647e5c98b391701b04ac3d8/alpaca_py-0.43.4.tar.gz", hash = "sha256:7d529b3654d4e817d9fd7ab461131c4f06a315c736b6a9e4a87d5406bb71114a", size = 97990, upload-time = "2026-04-29T08:41:48.775Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/3a/d5/1f57cc03e7b5925a927cb7f8e7ee5f873e22632633778d28d5d23681c871/alpaca_py-0.43.4-py3-none-any.whl", hash = "sha256:dd49ac30e0f2a8f38550ef1f27a58e7fd8f3f3875deaa4e757443cdbd033a1b4", size = 122534, upload-time = "2026-04-29T08:41:50.149Z" },
]

[[package]]
name = "annotated-doc"
version = "0.0.4"
@@ -140,13 +122,14 @@ name = "cerbero-mcp"
version = "2.0.0"
source = { editable = "." }
dependencies = [
    { name = "alpaca-py" },
    { name = "cryptography" },
    { name = "eth-account" },
    { name = "eth-utils" },
    { name = "fastapi" },
    { name = "httpx" },
    { name = "hyperliquid-python-sdk" },
    { name = "msgpack" },
    { name = "numpy" },
    { name = "pandas" },
    { name = "pybit" },
    { name = "pydantic" },
    { name = "pydantic-settings" },
    { name = "python-json-logger" },
@@ -167,13 +150,14 @@ dev = [

[package.metadata]
requires-dist = [
    { name = "alpaca-py", specifier = ">=0.30" },
    { name = "cryptography", specifier = ">=43" },
    { name = "eth-account", specifier = ">=0.13.7" },
    { name = "eth-utils", specifier = ">=5.3.1" },
    { name = "fastapi", specifier = ">=0.115" },
    { name = "httpx", specifier = ">=0.27" },
    { name = "hyperliquid-python-sdk", specifier = ">=0.6" },
    { name = "msgpack", specifier = ">=1.1.2" },
    { name = "numpy", specifier = ">=1.26" },
    { name = "pandas", specifier = ">=2.2" },
    { name = "pybit", specifier = ">=5.7" },
    { name = "pydantic", specifier = ">=2.9" },
    { name = "pydantic-settings", specifier = ">=2.6" },
    { name = "python-json-logger", specifier = ">=2.0" },
@@ -202,92 +186,73 @@ wheels = [
]

[[package]]
name = "charset-normalizer"
version = "3.4.7"
name = "cffi"
version = "2.0.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/e7/a1/67fe25fac3c7642725500a3f6cfe5821ad557c3abb11c9d20d12c7008d3e/charset_normalizer-3.4.7.tar.gz", hash = "sha256:ae89db9e5f98a11a4bf50407d4363e7b09b31e55bc117b4f7d80aab97ba009e5", size = 144271, upload-time = "2026-04-02T09:28:39.342Z" }
dependencies = [
    { name = "pycparser", marker = "implementation_name != 'PyPy'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/eb/56/b1ba7935a17738ae8453301356628e8147c79dbb825bcbc73dc7401f9846/cffi-2.0.0.tar.gz", hash = "sha256:44d1b5909021139fe36001ae048dbdde8214afa20200eda0f64c068cac5d5529", size = 523588, upload-time = "2025-09-08T23:24:04.541Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/c2/d7/b5b7020a0565c2e9fa8c09f4b5fa6232feb326b8c20081ccded47ea368fd/charset_normalizer-3.4.7-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:7641bb8895e77f921102f72833904dcd9901df5d6d72a2ab8f31d04b7e51e4e7", size = 309705, upload-time = "2026-04-02T09:26:02.191Z" },
{ url = "https://files.pythonhosted.org/packages/5a/53/58c29116c340e5456724ecd2fff4196d236b98f3da97b404bc5e51ac3493/charset_normalizer-3.4.7-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:202389074300232baeb53ae2569a60901f7efadd4245cf3a3bf0617d60b439d7", size = 206419, upload-time = "2026-04-02T09:26:03.583Z" },
{ url = "https://files.pythonhosted.org/packages/b2/02/e8146dc6591a37a00e5144c63f29fb7c97a734ea8a111190783c0e60ab63/charset_normalizer-3.4.7-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:30b8d1d8c52a48c2c5690e152c169b673487a2a58de1ec7393196753063fcd5e", size = 227901, upload-time = "2026-04-02T09:26:04.738Z" },
{ url = "https://files.pythonhosted.org/packages/fb/73/77486c4cd58f1267bf17db420e930c9afa1b3be3fe8c8b8ebbebc9624359/charset_normalizer-3.4.7-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:532bc9bf33a68613fd7d65e4b1c71a6a38d7d42604ecf239c77392e9b4e8998c", size = 222742, upload-time = "2026-04-02T09:26:06.36Z" },
{ url = "https://files.pythonhosted.org/packages/a1/fa/f74eb381a7d94ded44739e9d94de18dc5edc9c17fb8c11f0a6890696c0a9/charset_normalizer-3.4.7-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:2fe249cb4651fd12605b7288b24751d8bfd46d35f12a20b1ba33dea122e690df", size = 214061, upload-time = "2026-04-02T09:26:08.347Z" },
{ url = "https://files.pythonhosted.org/packages/dc/92/42bd3cefcf7687253fb86694b45f37b733c97f59af3724f356fa92b8c344/charset_normalizer-3.4.7-cp311-cp311-manylinux_2_31_armv7l.whl", hash = "sha256:65bcd23054beab4d166035cabbc868a09c1a49d1efe458fe8e4361215df40265", size = 199239, upload-time = "2026-04-02T09:26:09.823Z" },
{ url = "https://files.pythonhosted.org/packages/4c/3d/069e7184e2aa3b3cddc700e3dd267413dc259854adc3380421c805c6a17d/charset_normalizer-3.4.7-cp311-cp311-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:08e721811161356f97b4059a9ba7bafb23ea5ee2255402c42881c214e173c6b4", size = 210173, upload-time = "2026-04-02T09:26:10.953Z" },
{ url = "https://files.pythonhosted.org/packages/62/51/9d56feb5f2e7074c46f93e0ebdbe61f0848ee246e2f0d89f8e20b89ebb8f/charset_normalizer-3.4.7-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:e060d01aec0a910bdccb8be71faf34e7799ce36950f8294c8bf612cba65a2c9e", size = 209841, upload-time = "2026-04-02T09:26:12.142Z" },
{ url = "https://files.pythonhosted.org/packages/d2/59/893d8f99cc4c837dda1fe2f1139079703deb9f321aabcb032355de13b6c7/charset_normalizer-3.4.7-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:38c0109396c4cfc574d502df99742a45c72c08eff0a36158b6f04000043dbf38", size = 200304, upload-time = "2026-04-02T09:26:13.711Z" },
{ url = "https://files.pythonhosted.org/packages/7d/1d/ee6f3be3464247578d1ed5c46de545ccc3d3ff933695395c402c21fa6b77/charset_normalizer-3.4.7-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:1c2a768fdd44ee4a9339a9b0b130049139b8ce3c01d2ce09f67f5a68048d477c", size = 229455, upload-time = "2026-04-02T09:26:14.941Z" },
{ url = "https://files.pythonhosted.org/packages/54/bb/8fb0a946296ea96a488928bdce8ef99023998c48e4713af533e9bb98ef07/charset_normalizer-3.4.7-cp311-cp311-musllinux_1_2_riscv64.whl", hash = "sha256:1a87ca9d5df6fe460483d9a5bbf2b18f620cbed41b432e2bddb686228282d10b", size = 210036, upload-time = "2026-04-02T09:26:16.478Z" },
{ url = "https://files.pythonhosted.org/packages/9a/bc/015b2387f913749f82afd4fcba07846d05b6d784dd16123cb66860e0237d/charset_normalizer-3.4.7-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:d635aab80466bc95771bb78d5370e74d36d1fe31467b6b29b8b57b2a3cd7d22c", size = 224739, upload-time = "2026-04-02T09:26:17.751Z" },
{ url = "https://files.pythonhosted.org/packages/17/ab/63133691f56baae417493cba6b7c641571a2130eb7bceba6773367ab9ec5/charset_normalizer-3.4.7-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:ae196f021b5e7c78e918242d217db021ed2a6ace2bc6ae94c0fc596221c7f58d", size = 216277, upload-time = "2026-04-02T09:26:18.981Z" },
{ url = "https://files.pythonhosted.org/packages/06/6d/3be70e827977f20db77c12a97e6a9f973631a45b8d186c084527e53e77a4/charset_normalizer-3.4.7-cp311-cp311-win32.whl", hash = "sha256:adb2597b428735679446b46c8badf467b4ca5f5056aae4d51a19f9570301b1ad", size = 147819, upload-time = "2026-04-02T09:26:20.295Z" },
{ url = "https://files.pythonhosted.org/packages/20/d9/5f67790f06b735d7c7637171bbfd89882ad67201891b7275e51116ed8207/charset_normalizer-3.4.7-cp311-cp311-win_amd64.whl", hash = "sha256:8e385e4267ab76874ae30db04c627faaaf0b509e1ccc11a95b3fc3e83f855c00", size = 159281, upload-time = "2026-04-02T09:26:21.74Z" },
{ url = "https://files.pythonhosted.org/packages/ca/83/6413f36c5a34afead88ce6f66684d943d91f233d76dd083798f9602b75ae/charset_normalizer-3.4.7-cp311-cp311-win_arm64.whl", hash = "sha256:d4a48e5b3c2a489fae013b7589308a40146ee081f6f509e047e0e096084ceca1", size = 147843, upload-time = "2026-04-02T09:26:22.901Z" },
{ url = "https://files.pythonhosted.org/packages/0c/eb/4fc8d0a7110eb5fc9cc161723a34a8a6c200ce3b4fbf681bc86feee22308/charset_normalizer-3.4.7-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:eca9705049ad3c7345d574e3510665cb2cf844c2f2dcfe675332677f081cbd46", size = 311328, upload-time = "2026-04-02T09:26:24.331Z" },
{ url = "https://files.pythonhosted.org/packages/f8/e3/0fadc706008ac9d7b9b5be6dc767c05f9d3e5df51744ce4cc9605de7b9f4/charset_normalizer-3.4.7-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6178f72c5508bfc5fd446a5905e698c6212932f25bcdd4b47a757a50605a90e2", size = 208061, upload-time = "2026-04-02T09:26:25.568Z" },
{ url = "https://files.pythonhosted.org/packages/42/f0/3dd1045c47f4a4604df85ec18ad093912ae1344ac706993aff91d38773a2/charset_normalizer-3.4.7-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:e1421b502d83040e6d7fb2fb18dff63957f720da3d77b2fbd3187ceb63755d7b", size = 229031, upload-time = "2026-04-02T09:26:26.865Z" },
{ url = "https://files.pythonhosted.org/packages/dc/67/675a46eb016118a2fbde5a277a5d15f4f69d5f3f5f338e5ee2f8948fcf43/charset_normalizer-3.4.7-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:edac0f1ab77644605be2cbba52e6b7f630731fc42b34cb0f634be1a6eface56a", size = 225239, upload-time = "2026-04-02T09:26:28.044Z" },
{ url = "https://files.pythonhosted.org/packages/4b/f8/d0118a2f5f23b02cd166fa385c60f9b0d4f9194f574e2b31cef350ad7223/charset_normalizer-3.4.7-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:5649fd1c7bade02f320a462fdefd0b4bd3ce036065836d4f42e0de958038e116", size = 216589, upload-time = "2026-04-02T09:26:29.239Z" },
{ url = "https://files.pythonhosted.org/packages/b1/f1/6d2b0b261b6c4ceef0fcb0d17a01cc5bc53586c2d4796fa04b5c540bc13d/charset_normalizer-3.4.7-cp312-cp312-manylinux_2_31_armv7l.whl", hash = "sha256:203104ed3e428044fd943bc4bf45fa73c0730391f9621e37fe39ecf477b128cb", size = 202733, upload-time = "2026-04-02T09:26:30.5Z" },
{ url = "https://files.pythonhosted.org/packages/6f/c0/7b1f943f7e87cc3db9626ba17807d042c38645f0a1d4415c7a14afb5591f/charset_normalizer-3.4.7-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:298930cec56029e05497a76988377cbd7457ba864beeea92ad7e844fe74cd1f1", size = 212652, upload-time = "2026-04-02T09:26:31.709Z" },
{ url = "https://files.pythonhosted.org/packages/38/dd/5a9ab159fe45c6e72079398f277b7d2b523e7f716acc489726115a910097/charset_normalizer-3.4.7-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:708838739abf24b2ceb208d0e22403dd018faeef86ddac04319a62ae884c4f15", size = 211229, upload-time = "2026-04-02T09:26:33.282Z" },
{ url = "https://files.pythonhosted.org/packages/d5/ff/531a1cad5ca855d1c1a8b69cb71abfd6d85c0291580146fda7c82857caa1/charset_normalizer-3.4.7-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:0f7eb884681e3938906ed0434f20c63046eacd0111c4ba96f27b76084cd679f5", size = 203552, upload-time = "2026-04-02T09:26:34.845Z" },
{ url = "https://files.pythonhosted.org/packages/c1/4c/a5fb52d528a8ca41f7598cb619409ece30a169fbdf9cdce592e53b46c3a6/charset_normalizer-3.4.7-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:4dc1e73c36828f982bfe79fadf5919923f8a6f4df2860804db9a98c48824ce8d", size = 230806, upload-time = "2026-04-02T09:26:36.152Z" },
{ url = "https://files.pythonhosted.org/packages/59/7a/071feed8124111a32b316b33ae4de83d36923039ef8cf48120266844285b/charset_normalizer-3.4.7-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:aed52fea0513bac0ccde438c188c8a471c4e0f457c2dd20cdbf6ea7a450046c7", size = 212316, upload-time = "2026-04-02T09:26:37.672Z" },
{ url = "https://files.pythonhosted.org/packages/fd/35/f7dba3994312d7ba508e041eaac39a36b120f32d4c8662b8814dab876431/charset_normalizer-3.4.7-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:fea24543955a6a729c45a73fe90e08c743f0b3334bbf3201e6c4bc1b0c7fa464", size = 227274, upload-time = "2026-04-02T09:26:38.93Z" },
{ url = "https://files.pythonhosted.org/packages/8a/2d/a572df5c9204ab7688ec1edc895a73ebded3b023bb07364710b05dd1c9be/charset_normalizer-3.4.7-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:bb6d88045545b26da47aa879dd4a89a71d1dce0f0e549b1abcb31dfe4a8eac49", size = 218468, upload-time = "2026-04-02T09:26:40.17Z" },
{ url = "https://files.pythonhosted.org/packages/86/eb/890922a8b03a568ca2f336c36585a4713c55d4d67bf0f0c78924be6315ca/charset_normalizer-3.4.7-cp312-cp312-win32.whl", hash = "sha256:2257141f39fe65a3fdf38aeccae4b953e5f3b3324f4ff0daf9f15b8518666a2c", size = 148460, upload-time = "2026-04-02T09:26:41.416Z" },
{ url = "https://files.pythonhosted.org/packages/35/d9/0e7dffa06c5ab081f75b1b786f0aefc88365825dfcd0ac544bdb7b2b6853/charset_normalizer-3.4.7-cp312-cp312-win_amd64.whl", hash = "sha256:5ed6ab538499c8644b8a3e18debabcd7ce684f3fa91cf867521a7a0279cab2d6", size = 159330, upload-time = "2026-04-02T09:26:42.554Z" },
{ url = "https://files.pythonhosted.org/packages/9e/5d/481bcc2a7c88ea6b0878c299547843b2521ccbc40980cb406267088bc701/charset_normalizer-3.4.7-cp312-cp312-win_arm64.whl", hash = "sha256:56be790f86bfb2c98fb742ce566dfb4816e5a83384616ab59c49e0604d49c51d", size = 147828, upload-time = "2026-04-02T09:26:44.075Z" },
{ url = "https://files.pythonhosted.org/packages/c1/3b/66777e39d3ae1ddc77ee606be4ec6d8cbd4c801f65e5a1b6f2b11b8346dd/charset_normalizer-3.4.7-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:f496c9c3cc02230093d8330875c4c3cdfc3b73612a5fd921c65d39cbcef08063", size = 309627, upload-time = "2026-04-02T09:26:45.198Z" },
{ url = "https://files.pythonhosted.org/packages/2e/4e/b7f84e617b4854ade48a1b7915c8ccfadeba444d2a18c291f696e37f0d3b/charset_normalizer-3.4.7-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:0ea948db76d31190bf08bd371623927ee1339d5f2a0b4b1b4a4439a65298703c", size = 207008, upload-time = "2026-04-02T09:26:46.824Z" },
{ url = "https://files.pythonhosted.org/packages/c4/bb/ec73c0257c9e11b268f018f068f5d00aa0ef8c8b09f7753ebd5f2880e248/charset_normalizer-3.4.7-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:a277ab8928b9f299723bc1a2dabb1265911b1a76341f90a510368ca44ad9ab66", size = 228303, upload-time = "2026-04-02T09:26:48.397Z" },
{ url = "https://files.pythonhosted.org/packages/85/fb/32d1f5033484494619f701e719429c69b766bfc4dbc61aa9e9c8c166528b/charset_normalizer-3.4.7-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:3bec022aec2c514d9cf199522a802bd007cd588ab17ab2525f20f9c34d067c18", size = 224282, upload-time = "2026-04-02T09:26:49.684Z" },
{ url = "https://files.pythonhosted.org/packages/fa/07/330e3a0dda4c404d6da83b327270906e9654a24f6c546dc886a0eb0ffb23/charset_normalizer-3.4.7-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:e044c39e41b92c845bc815e5ae4230804e8e7bc29e399b0437d64222d92809dd", size = 215595, upload-time = "2026-04-02T09:26:50.915Z" },
{ url = "https://files.pythonhosted.org/packages/e3/7c/fc890655786e423f02556e0216d4b8c6bcb6bdfa890160dc66bf52dee468/charset_normalizer-3.4.7-cp313-cp313-manylinux_2_31_armv7l.whl", hash = "sha256:f495a1652cf3fbab2eb0639776dad966c2fb874d79d87ca07f9d5f059b8bd215", size = 201986, upload-time = "2026-04-02T09:26:52.197Z" },
{ url = "https://files.pythonhosted.org/packages/d8/97/bfb18b3db2aed3b90cf54dc292ad79fdd5ad65c4eae454099475cbeadd0d/charset_normalizer-3.4.7-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:e712b419df8ba5e42b226c510472b37bd57b38e897d3eca5e8cfd410a29fa859", size = 211711, upload-time = "2026-04-02T09:26:53.49Z" },
{ url = "https://files.pythonhosted.org/packages/6f/a5/a581c13798546a7fd557c82614a5c65a13df2157e9ad6373166d2a3e645d/charset_normalizer-3.4.7-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:7804338df6fcc08105c7745f1502ba68d900f45fd770d5bdd5288ddccb8a42d8", size = 210036, upload-time = "2026-04-02T09:26:54.975Z" },
{ url = "https://files.pythonhosted.org/packages/8c/bf/b3ab5bcb478e4193d517644b0fb2bf5497fbceeaa7a1bc0f4d5b50953861/charset_normalizer-3.4.7-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:481551899c856c704d58119b5025793fa6730adda3571971af568f66d2424bb5", size = 202998, upload-time = "2026-04-02T09:26:56.303Z" },
{ url = "https://files.pythonhosted.org/packages/e7/4e/23efd79b65d314fa320ec6017b4b5834d5c12a58ba4610aa353af2e2f577/charset_normalizer-3.4.7-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:f59099f9b66f0d7145115e6f80dd8b1d847176df89b234a5a6b3f00437aa0832", size = 230056, upload-time = "2026-04-02T09:26:57.554Z" },
{ url = "https://files.pythonhosted.org/packages/b9/9f/1e1941bc3f0e01df116e68dc37a55c4d249df5e6fa77f008841aef68264f/charset_normalizer-3.4.7-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:f59ad4c0e8f6bba240a9bb85504faa1ab438237199d4cce5f622761507b8f6a6", size = 211537, upload-time = "2026-04-02T09:26:58.843Z" },
{ url = "https://files.pythonhosted.org/packages/80/0f/088cbb3020d44428964a6c97fe1edfb1b9550396bf6d278330281e8b709c/charset_normalizer-3.4.7-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:3dedcc22d73ec993f42055eff4fcfed9318d1eeb9a6606c55892a26964964e48", size = 226176, upload-time = "2026-04-02T09:27:00.437Z" },
{ url = "https://files.pythonhosted.org/packages/6a/9f/130394f9bbe06f4f63e22641d32fc9b202b7e251c9aef4db044324dac493/charset_normalizer-3.4.7-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:64f02c6841d7d83f832cd97ccf8eb8a906d06eb95d5276069175c696b024b60a", size = 217723, upload-time = "2026-04-02T09:27:02.021Z" },
{ url = "https://files.pythonhosted.org/packages/73/55/c469897448a06e49f8fa03f6caae97074fde823f432a98f979cc42b90e69/charset_normalizer-3.4.7-cp313-cp313-win32.whl", hash = "sha256:4042d5c8f957e15221d423ba781e85d553722fc4113f523f2feb7b188cc34c5e", size = 148085, upload-time = "2026-04-02T09:27:03.192Z" },
{ url = "https://files.pythonhosted.org/packages/5d/78/1b74c5bbb3f99b77a1715c91b3e0b5bdb6fe302d95ace4f5b1bec37b0167/charset_normalizer-3.4.7-cp313-cp313-win_amd64.whl", hash = "sha256:3946fa46a0cf3e4c8cb1cc52f56bb536310d34f25f01ca9b6c16afa767dab110", size = 158819, upload-time = "2026-04-02T09:27:04.454Z" },
{ url = "https://files.pythonhosted.org/packages/68/86/46bd42279d323deb8687c4a5a811fd548cb7d1de10cf6535d099877a9a9f/charset_normalizer-3.4.7-cp313-cp313-win_arm64.whl", hash = "sha256:80d04837f55fc81da168b98de4f4b797ef007fc8a79ab71c6ec9bc4dd662b15b", size = 147915, upload-time = "2026-04-02T09:27:05.971Z" },
{ url = "https://files.pythonhosted.org/packages/97/c8/c67cb8c70e19ef1960b97b22ed2a1567711de46c4ddf19799923adc836c2/charset_normalizer-3.4.7-cp314-cp314-macosx_10_15_universal2.whl", hash = "sha256:c36c333c39be2dbca264d7803333c896ab8fa7d4d6f0ab7edb7dfd7aea6e98c0", size = 309234, upload-time = "2026-04-02T09:27:07.194Z" },
{ url = "https://files.pythonhosted.org/packages/99/85/c091fdee33f20de70d6c8b522743b6f831a2f1cd3ff86de4c6a827c48a76/charset_normalizer-3.4.7-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1c2aed2e5e41f24ea8ef1590b8e848a79b56f3a5564a65ceec43c9d692dc7d8a", size = 208042, upload-time = "2026-04-02T09:27:08.749Z" },
{ url = "https://files.pythonhosted.org/packages/87/1c/ab2ce611b984d2fd5d86a5a8a19c1ae26acac6bad967da4967562c75114d/charset_normalizer-3.4.7-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:54523e136b8948060c0fa0bc7b1b50c32c186f2fceee897a495406bb6e311d2b", size = 228706, upload-time = "2026-04-02T09:27:09.951Z" },
{ url = "https://files.pythonhosted.org/packages/a8/29/2b1d2cb00bf085f59d29eb773ce58ec2d325430f8c216804a0a5cd83cbca/charset_normalizer-3.4.7-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:715479b9a2802ecac752a3b0efa2b0b60285cf962ee38414211abdfccc233b41", size = 224727, upload-time = "2026-04-02T09:27:11.175Z" },
{ url = "https://files.pythonhosted.org/packages/47/5c/032c2d5a07fe4d4855fea851209cca2b6f03ebeb6d4e3afdb3358386a684/charset_normalizer-3.4.7-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bd6c2a1c7573c64738d716488d2cdd3c00e340e4835707d8fdb8dc1a66ef164e", size = 215882, upload-time = "2026-04-02T09:27:12.446Z" },
{ url = "https://files.pythonhosted.org/packages/2c/c2/356065d5a8b78ed04499cae5f339f091946a6a74f91e03476c33f0ab7100/charset_normalizer-3.4.7-cp314-cp314-manylinux_2_31_armv7l.whl", hash = "sha256:c45e9440fb78f8ddabcf714b68f936737a121355bf59f3907f4e17721b9d1aae", size = 200860, upload-time = "2026-04-02T09:27:13.721Z" },
{ url = "https://files.pythonhosted.org/packages/0c/cd/a32a84217ced5039f53b29f460962abb2d4420def55afabe45b1c3c7483d/charset_normalizer-3.4.7-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:3534e7dcbdcf757da6b85a0bbf5b6868786d5982dd959b065e65481644817a18", size = 211564, upload-time = "2026-04-02T09:27:15.272Z" },
{ url = "https://files.pythonhosted.org/packages/44/86/58e6f13ce26cc3b8f4a36b94a0f22ae2f00a72534520f4ae6857c4b81f89/charset_normalizer-3.4.7-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:e8ac484bf18ce6975760921bb6148041faa8fef0547200386ea0b52b5d27bf7b", size = 211276, upload-time = "2026-04-02T09:27:16.834Z" },
{ url = "https://files.pythonhosted.org/packages/8f/fe/d17c32dc72e17e155e06883efa84514ca375f8a528ba2546bee73fc4df81/charset_normalizer-3.4.7-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:a5fe03b42827c13cdccd08e6c0247b6a6d4b5e3cdc53fd1749f5896adcdc2356", size = 201238, upload-time = "2026-04-02T09:27:18.229Z" },
{ url = "https://files.pythonhosted.org/packages/6a/29/f33daa50b06525a237451cdb6c69da366c381a3dadcd833fa5676bc468b3/charset_normalizer-3.4.7-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:2d6eb928e13016cea4f1f21d1e10c1cebd5a421bc57ddf5b1142ae3f86824fab", size = 230189, upload-time = "2026-04-02T09:27:19.445Z" },
{ url = "https://files.pythonhosted.org/packages/b6/6e/52c84015394a6a0bdcd435210a7e944c5f94ea1055f5cc5d56c5fe368e7b/charset_normalizer-3.4.7-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:e74327fb75de8986940def6e8dee4f127cc9752bee7355bb323cc5b2659b6d46", size = 211352, upload-time = "2026-04-02T09:27:20.79Z" },
{ url = "https://files.pythonhosted.org/packages/8c/d7/4353be581b373033fb9198bf1da3cf8f09c1082561e8e922aa7b39bf9fe8/charset_normalizer-3.4.7-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:d6038d37043bced98a66e68d3aa2b6a35505dc01328cd65217cefe82f25def44", size = 227024, upload-time = "2026-04-02T09:27:22.063Z" },
{ url = "https://files.pythonhosted.org/packages/30/45/99d18aa925bd1740098ccd3060e238e21115fffbfdcb8f3ece837d0ace6c/charset_normalizer-3.4.7-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:7579e913a5339fb8fa133f6bbcfd8e6749696206cf05acdbdca71a1b436d8e72", size = 217869, upload-time = "2026-04-02T09:27:23.486Z" },
{ url = "https://files.pythonhosted.org/packages/5c/05/5ee478aa53f4bb7996482153d4bfe1b89e0f087f0ab6b294fcf92d595873/charset_normalizer-3.4.7-cp314-cp314-win32.whl", hash = "sha256:5b77459df20e08151cd6f8b9ef8ef1f961ef73d85c21a555c7eed5b79410ec10", size = 148541, upload-time = "2026-04-02T09:27:25.146Z" },
{ url = "https://files.pythonhosted.org/packages/48/77/72dcb0921b2ce86420b2d79d454c7022bf5be40202a2a07906b9f2a35c97/charset_normalizer-3.4.7-cp314-cp314-win_amd64.whl", hash = "sha256:92a0a01ead5e668468e952e4238cccd7c537364eb7d851ab144ab6627dbbe12f", size = 159634, upload-time = "2026-04-02T09:27:26.642Z" },
{ url = "https://files.pythonhosted.org/packages/c6/a3/c2369911cd72f02386e4e340770f6e158c7980267da16af8f668217abaa0/charset_normalizer-3.4.7-cp314-cp314-win_arm64.whl", hash = "sha256:67f6279d125ca0046a7fd386d01b311c6363844deac3e5b069b514ba3e63c246", size = 148384, upload-time = "2026-04-02T09:27:28.271Z" },
{ url = "https://files.pythonhosted.org/packages/94/09/7e8a7f73d24dba1f0035fbbf014d2c36828fc1bf9c88f84093e57d315935/charset_normalizer-3.4.7-cp314-cp314t-macosx_10_15_universal2.whl", hash = "sha256:effc3f449787117233702311a1b7d8f59cba9ced946ba727bdc329ec69028e24", size = 330133, upload-time = "2026-04-02T09:27:29.474Z" },
{ url = "https://files.pythonhosted.org/packages/8d/da/96975ddb11f8e977f706f45cddd8540fd8242f71ecdb5d18a80723dcf62c/charset_normalizer-3.4.7-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:fbccdc05410c9ee21bbf16a35f4c1d16123dcdeb8a1d38f33654fa21d0234f79", size = 216257, upload-time = "2026-04-02T09:27:30.793Z" },
{ url = "https://files.pythonhosted.org/packages/e5/e8/1d63bf8ef2d388e95c64b2098f45f84758f6d102a087552da1485912637b/charset_normalizer-3.4.7-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:733784b6d6def852c814bce5f318d25da2ee65dd4839a0718641c696e09a2960", size = 234851, upload-time = "2026-04-02T09:27:32.44Z" },
{ url = "https://files.pythonhosted.org/packages/9b/40/e5ff04233e70da2681fa43969ad6f66ca5611d7e669be0246c4c7aaf6dc8/charset_normalizer-3.4.7-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:a89c23ef8d2c6b27fd200a42aa4ac72786e7c60d40efdc76e6011260b6e949c4", size = 233393, upload-time = "2026-04-02T09:27:34.03Z" },
{ url = "https://files.pythonhosted.org/packages/be/c1/06c6c49d5a5450f76899992f1ee40b41d076aee9279b49cf9974d2f313d5/charset_normalizer-3.4.7-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:6c114670c45346afedc0d947faf3c7f701051d2518b943679c8ff88befe14f8e", size = 223251, upload-time = "2026-04-02T09:27:35.369Z" },
{ url = "https://files.pythonhosted.org/packages/2b/9f/f2ff16fb050946169e3e1f82134d107e5d4ae72647ec8a1b1446c148480f/charset_normalizer-3.4.7-cp314-cp314t-manylinux_2_31_armv7l.whl", hash = "sha256:a180c5e59792af262bf263b21a3c49353f25945d8d9f70628e73de370d55e1e1", size = 206609, upload-time = "2026-04-02T09:27:36.661Z" },
{ url = "https://files.pythonhosted.org/packages/69/d5/a527c0cd8d64d2eab7459784fb4169a0ac76e5a6fc5237337982fd61347e/charset_normalizer-3.4.7-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:3c9a494bc5ec77d43cea229c4f6db1e4d8fe7e1bbffa8b6f0f0032430ff8ab44", size = 220014, upload-time = "2026-04-02T09:27:38.019Z" },
{ url = "https://files.pythonhosted.org/packages/7e/80/8a7b8104a3e203074dc9aa2c613d4b726c0e136bad1cc734594b02867972/charset_normalizer-3.4.7-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:8d828b6667a32a728a1ad1d93957cdf37489c57b97ae6c4de2860fa749b8fc1e", size = 218979, upload-time = "2026-04-02T09:27:39.37Z" },
{ url = "https://files.pythonhosted.org/packages/02/9a/b759b503d507f375b2b5c153e4d2ee0a75aa215b7f2489cf314f4541f2c0/charset_normalizer-3.4.7-cp314-cp314t-musllinux_1_2_armv7l.whl", hash = "sha256:cf1493cd8607bec4d8a7b9b004e699fcf8f9103a9284cc94962cb73d20f9d4a3", size = 209238, upload-time = "2026-04-02T09:27:40.722Z" },
{ url = "https://files.pythonhosted.org/packages/c2/4e/0f3f5d47b86bdb79256e7290b26ac847a2832d9a4033f7eb2cd4bcf4bb5b/charset_normalizer-3.4.7-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:0c96c3b819b5c3e9e165495db84d41914d6894d55181d2d108cc1a69bfc9cce0", size = 236110, upload-time = "2026-04-02T09:27:42.33Z" },
{ url = "https://files.pythonhosted.org/packages/96/23/bce28734eb3ed2c91dcf93abeb8a5cf393a7b2749725030bb630e554fdd8/charset_normalizer-3.4.7-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:752a45dc4a6934060b3b0dab47e04edc3326575f82be64bc4fc293914566503e", size = 219824, upload-time = "2026-04-02T09:27:43.924Z" },
{ url = "https://files.pythonhosted.org/packages/2c/6f/6e897c6984cc4d41af319b077f2f600fc8214eb2fe2d6bcb79141b882400/charset_normalizer-3.4.7-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:8778f0c7a52e56f75d12dae53ae320fae900a8b9b4164b981b9c5ce059cd1fcb", size = 233103, upload-time = "2026-04-02T09:27:45.348Z" },
{ url = "https://files.pythonhosted.org/packages/76/22/ef7bd0fe480a0ae9b656189ec00744b60933f68b4f42a7bb06589f6f576a/charset_normalizer-3.4.7-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:ce3412fbe1e31eb81ea42f4169ed94861c56e643189e1e75f0041f3fe7020abe", size = 225194, upload-time = "2026-04-02T09:27:46.706Z" },
{ url = "https://files.pythonhosted.org/packages/c5/a7/0e0ab3e0b5bc1219bd80a6a0d4d72ca74d9250cb2382b7c699c147e06017/charset_normalizer-3.4.7-cp314-cp314t-win32.whl", hash = "sha256:c03a41a8784091e67a39648f70c5f97b5b6a37f216896d44d2cdcb82615339a0", size = 159827, upload-time = "2026-04-02T09:27:48.053Z" },
{ url = "https://files.pythonhosted.org/packages/7a/1d/29d32e0fb40864b1f878c7f5a0b343ae676c6e2b271a2d55cc3a152391da/charset_normalizer-3.4.7-cp314-cp314t-win_amd64.whl", hash = "sha256:03853ed82eeebbce3c2abfdbc98c96dc205f32a79627688ac9a27370ea61a49c", size = 174168, upload-time = "2026-04-02T09:27:49.795Z" },
{ url = "https://files.pythonhosted.org/packages/de/32/d92444ad05c7a6e41fb2036749777c163baf7a0301a040cb672d6b2b1ae9/charset_normalizer-3.4.7-cp314-cp314t-win_arm64.whl", hash = "sha256:c35abb8bfff0185efac5878da64c45dafd2b37fb0383add1be155a763c1f083d", size = 153018, upload-time = "2026-04-02T09:27:51.116Z" },
{ url = "https://files.pythonhosted.org/packages/db/8f/61959034484a4a7c527811f4721e75d02d653a35afb0b6054474d8185d4c/charset_normalizer-3.4.7-py3-none-any.whl", hash = "sha256:3dce51d0f5e7951f8bb4900c257dad282f49190fdbebecd4ba99bcc41fef404d", size = 61958, upload-time = "2026-04-02T09:28:37.794Z" },
{ url = "https://files.pythonhosted.org/packages/12/4a/3dfd5f7850cbf0d06dc84ba9aa00db766b52ca38d8b86e3a38314d52498c/cffi-2.0.0-cp311-cp311-macosx_10_13_x86_64.whl", hash = "sha256:b4c854ef3adc177950a8dfc81a86f5115d2abd545751a304c5bcf2c2c7283cfe", size = 184344, upload-time = "2025-09-08T23:22:26.456Z" },
{ url = "https://files.pythonhosted.org/packages/4f/8b/f0e4c441227ba756aafbe78f117485b25bb26b1c059d01f137fa6d14896b/cffi-2.0.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:2de9a304e27f7596cd03d16f1b7c72219bd944e99cc52b84d0145aefb07cbd3c", size = 180560, upload-time = "2025-09-08T23:22:28.197Z" },
{ url = "https://files.pythonhosted.org/packages/b1/b7/1200d354378ef52ec227395d95c2576330fd22a869f7a70e88e1447eb234/cffi-2.0.0-cp311-cp311-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:baf5215e0ab74c16e2dd324e8ec067ef59e41125d3eade2b863d294fd5035c92", size = 209613, upload-time = "2025-09-08T23:22:29.475Z" },
{ url = "https://files.pythonhosted.org/packages/b8/56/6033f5e86e8cc9bb629f0077ba71679508bdf54a9a5e112a3c0b91870332/cffi-2.0.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:730cacb21e1bdff3ce90babf007d0a0917cc3e6492f336c2f0134101e0944f93", size = 216476, upload-time = "2025-09-08T23:22:31.063Z" },
{ url = "https://files.pythonhosted.org/packages/dc/7f/55fecd70f7ece178db2f26128ec41430d8720f2d12ca97bf8f0a628207d5/cffi-2.0.0-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:6824f87845e3396029f3820c206e459ccc91760e8fa24422f8b0c3d1731cbec5", size = 203374, upload-time = "2025-09-08T23:22:32.507Z" },
{ url = "https://files.pythonhosted.org/packages/84/ef/a7b77c8bdc0f77adc3b46888f1ad54be8f3b7821697a7b89126e829e676a/cffi-2.0.0-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:9de40a7b0323d889cf8d23d1ef214f565ab154443c42737dfe52ff82cf857664", size = 202597, upload-time = "2025-09-08T23:22:34.132Z" },
{ url = "https://files.pythonhosted.org/packages/d7/91/500d892b2bf36529a75b77958edfcd5ad8e2ce4064ce2ecfeab2125d72d1/cffi-2.0.0-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:8941aaadaf67246224cee8c3803777eed332a19d909b47e29c9842ef1e79ac26", size = 215574, upload-time = "2025-09-08T23:22:35.443Z" },
{ url = "https://files.pythonhosted.org/packages/44/64/58f6255b62b101093d5df22dcb752596066c7e89dd725e0afaed242a61be/cffi-2.0.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:a05d0c237b3349096d3981b727493e22147f934b20f6f125a3eba8f994bec4a9", size = 218971, upload-time = "2025-09-08T23:22:36.805Z" },
{ url = "https://files.pythonhosted.org/packages/ab/49/fa72cebe2fd8a55fbe14956f9970fe8eb1ac59e5df042f603ef7c8ba0adc/cffi-2.0.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:94698a9c5f91f9d138526b48fe26a199609544591f859c870d477351dc7b2414", size = 211972, upload-time = "2025-09-08T23:22:38.436Z" },
{ url = "https://files.pythonhosted.org/packages/0b/28/dd0967a76aab36731b6ebfe64dec4e981aff7e0608f60c2d46b46982607d/cffi-2.0.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:5fed36fccc0612a53f1d4d9a816b50a36702c28a2aa880cb8a122b3466638743", size = 217078, upload-time = "2025-09-08T23:22:39.776Z" },
{ url = "https://files.pythonhosted.org/packages/2b/c0/015b25184413d7ab0a410775fdb4a50fca20f5589b5dab1dbbfa3baad8ce/cffi-2.0.0-cp311-cp311-win32.whl", hash = "sha256:c649e3a33450ec82378822b3dad03cc228b8f5963c0c12fc3b1e0ab940f768a5", size = 172076, upload-time = "2025-09-08T23:22:40.95Z" },
{ url = "https://files.pythonhosted.org/packages/ae/8f/dc5531155e7070361eb1b7e4c1a9d896d0cb21c49f807a6c03fd63fc877e/cffi-2.0.0-cp311-cp311-win_amd64.whl", hash = "sha256:66f011380d0e49ed280c789fbd08ff0d40968ee7b665575489afa95c98196ab5", size = 182820, upload-time = "2025-09-08T23:22:42.463Z" },
{ url = "https://files.pythonhosted.org/packages/95/5c/1b493356429f9aecfd56bc171285a4c4ac8697f76e9bbbbb105e537853a1/cffi-2.0.0-cp311-cp311-win_arm64.whl", hash = "sha256:c6638687455baf640e37344fe26d37c404db8b80d037c3d29f58fe8d1c3b194d", size = 177635, upload-time = "2025-09-08T23:22:43.623Z" },
{ url = "https://files.pythonhosted.org/packages/ea/47/4f61023ea636104d4f16ab488e268b93008c3d0bb76893b1b31db1f96802/cffi-2.0.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:6d02d6655b0e54f54c4ef0b94eb6be0607b70853c45ce98bd278dc7de718be5d", size = 185271, upload-time = "2025-09-08T23:22:44.795Z" },
{ url = "https://files.pythonhosted.org/packages/df/a2/781b623f57358e360d62cdd7a8c681f074a71d445418a776eef0aadb4ab4/cffi-2.0.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8eca2a813c1cb7ad4fb74d368c2ffbbb4789d377ee5bb8df98373c2cc0dee76c", size = 181048, upload-time = "2025-09-08T23:22:45.938Z" },
{ url = "https://files.pythonhosted.org/packages/ff/df/a4f0fbd47331ceeba3d37c2e51e9dfc9722498becbeec2bd8bc856c9538a/cffi-2.0.0-cp312-cp312-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:21d1152871b019407d8ac3985f6775c079416c282e431a4da6afe7aefd2bccbe", size = 212529, upload-time = "2025-09-08T23:22:47.349Z" },
{ url = "https://files.pythonhosted.org/packages/d5/72/12b5f8d3865bf0f87cf1404d8c374e7487dcf097a1c91c436e72e6badd83/cffi-2.0.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:b21e08af67b8a103c71a250401c78d5e0893beff75e28c53c98f4de42f774062", size = 220097, upload-time = "2025-09-08T23:22:48.677Z" },
{ url = "https://files.pythonhosted.org/packages/c2/95/7a135d52a50dfa7c882ab0ac17e8dc11cec9d55d2c18dda414c051c5e69e/cffi-2.0.0-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:1e3a615586f05fc4065a8b22b8152f0c1b00cdbc60596d187c2a74f9e3036e4e", size = 207983, upload-time = "2025-09-08T23:22:50.06Z" },
{ url = "https://files.pythonhosted.org/packages/3a/c8/15cb9ada8895957ea171c62dc78ff3e99159ee7adb13c0123c001a2546c1/cffi-2.0.0-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:81afed14892743bbe14dacb9e36d9e0e504cd204e0b165062c488942b9718037", size = 206519, upload-time = "2025-09-08T23:22:51.364Z" },
{ url = "https://files.pythonhosted.org/packages/78/2d/7fa73dfa841b5ac06c7b8855cfc18622132e365f5b81d02230333ff26e9e/cffi-2.0.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:3e17ed538242334bf70832644a32a7aae3d83b57567f9fd60a26257e992b79ba", size = 219572, upload-time = "2025-09-08T23:22:52.902Z" },
{ url = "https://files.pythonhosted.org/packages/07/e0/267e57e387b4ca276b90f0434ff88b2c2241ad72b16d31836adddfd6031b/cffi-2.0.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:3925dd22fa2b7699ed2617149842d2e6adde22b262fcbfada50e3d195e4b3a94", size = 222963, upload-time = "2025-09-08T23:22:54.518Z" },
{ url = "https://files.pythonhosted.org/packages/b6/75/1f2747525e06f53efbd878f4d03bac5b859cbc11c633d0fb81432d98a795/cffi-2.0.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:2c8f814d84194c9ea681642fd164267891702542f028a15fc97d4674b6206187", size = 221361, upload-time = "2025-09-08T23:22:55.867Z" },
{ url = "https://files.pythonhosted.org/packages/7b/2b/2b6435f76bfeb6bbf055596976da087377ede68df465419d192acf00c437/cffi-2.0.0-cp312-cp312-win32.whl", hash = "sha256:da902562c3e9c550df360bfa53c035b2f241fed6d9aef119048073680ace4a18", size = 172932, upload-time = "2025-09-08T23:22:57.188Z" },
{ url = "https://files.pythonhosted.org/packages/f8/ed/13bd4418627013bec4ed6e54283b1959cf6db888048c7cf4b4c3b5b36002/cffi-2.0.0-cp312-cp312-win_amd64.whl", hash = "sha256:da68248800ad6320861f129cd9c1bf96ca849a2771a59e0344e88681905916f5", size = 183557, upload-time = "2025-09-08T23:22:58.351Z" },
{ url = "https://files.pythonhosted.org/packages/95/31/9f7f93ad2f8eff1dbc1c3656d7ca5bfd8fb52c9d786b4dcf19b2d02217fa/cffi-2.0.0-cp312-cp312-win_arm64.whl", hash = "sha256:4671d9dd5ec934cb9a73e7ee9676f9362aba54f7f34910956b84d727b0d73fb6", size = 177762, upload-time = "2025-09-08T23:22:59.668Z" },
{ url = "https://files.pythonhosted.org/packages/4b/8d/a0a47a0c9e413a658623d014e91e74a50cdd2c423f7ccfd44086ef767f90/cffi-2.0.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:00bdf7acc5f795150faa6957054fbbca2439db2f775ce831222b66f192f03beb", size = 185230, upload-time = "2025-09-08T23:23:00.879Z" },
{ url = "https://files.pythonhosted.org/packages/4a/d2/a6c0296814556c68ee32009d9c2ad4f85f2707cdecfd7727951ec228005d/cffi-2.0.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:45d5e886156860dc35862657e1494b9bae8dfa63bf56796f2fb56e1679fc0bca", size = 181043, upload-time = "2025-09-08T23:23:02.231Z" },
{ url = "https://files.pythonhosted.org/packages/b0/1e/d22cc63332bd59b06481ceaac49d6c507598642e2230f201649058a7e704/cffi-2.0.0-cp313-cp313-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:07b271772c100085dd28b74fa0cd81c8fb1a3ba18b21e03d7c27f3436a10606b", size = 212446, upload-time = "2025-09-08T23:23:03.472Z" },
{ url = "https://files.pythonhosted.org/packages/a9/f5/a2c23eb03b61a0b8747f211eb716446c826ad66818ddc7810cc2cc19b3f2/cffi-2.0.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:d48a880098c96020b02d5a1f7d9251308510ce8858940e6fa99ece33f610838b", size = 220101, upload-time = "2025-09-08T23:23:04.792Z" },
{ url = "https://files.pythonhosted.org/packages/f2/7f/e6647792fc5850d634695bc0e6ab4111ae88e89981d35ac269956605feba/cffi-2.0.0-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:f93fd8e5c8c0a4aa1f424d6173f14a892044054871c771f8566e4008eaa359d2", size = 207948, upload-time = "2025-09-08T23:23:06.127Z" },
{ url = "https://files.pythonhosted.org/packages/cb/1e/a5a1bd6f1fb30f22573f76533de12a00bf274abcdc55c8edab639078abb6/cffi-2.0.0-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:dd4f05f54a52fb558f1ba9f528228066954fee3ebe629fc1660d874d040ae5a3", size = 206422, upload-time = "2025-09-08T23:23:07.753Z" },
{ url = "https://files.pythonhosted.org/packages/98/df/0a1755e750013a2081e863e7cd37e0cdd02664372c754e5560099eb7aa44/cffi-2.0.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:c8d3b5532fc71b7a77c09192b4a5a200ea992702734a2e9279a37f2478236f26", size = 219499, upload-time = "2025-09-08T23:23:09.648Z" },
{ url = "https://files.pythonhosted.org/packages/50/e1/a969e687fcf9ea58e6e2a928ad5e2dd88cc12f6f0ab477e9971f2309b57c/cffi-2.0.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:d9b29c1f0ae438d5ee9acb31cadee00a58c46cc9c0b2f9038c6b0b3470877a8c", size = 222928, upload-time = "2025-09-08T23:23:10.928Z" },
{ url = "https://files.pythonhosted.org/packages/36/54/0362578dd2c9e557a28ac77698ed67323ed5b9775ca9d3fe73fe191bb5d8/cffi-2.0.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:6d50360be4546678fc1b79ffe7a66265e28667840010348dd69a314145807a1b", size = 221302, upload-time = "2025-09-08T23:23:12.42Z" },
{ url = "https://files.pythonhosted.org/packages/eb/6d/bf9bda840d5f1dfdbf0feca87fbdb64a918a69bca42cfa0ba7b137c48cb8/cffi-2.0.0-cp313-cp313-win32.whl", hash = "sha256:74a03b9698e198d47562765773b4a8309919089150a0bb17d829ad7b44b60d27", size = 172909, upload-time = "2025-09-08T23:23:14.32Z" },
{ url = "https://files.pythonhosted.org/packages/37/18/6519e1ee6f5a1e579e04b9ddb6f1676c17368a7aba48299c3759bbc3c8b3/cffi-2.0.0-cp313-cp313-win_amd64.whl", hash = "sha256:19f705ada2530c1167abacb171925dd886168931e0a7b78f5bffcae5c6b5be75", size = 183402, upload-time = "2025-09-08T23:23:15.535Z" },
{ url = "https://files.pythonhosted.org/packages/cb/0e/02ceeec9a7d6ee63bb596121c2c8e9b3a9e150936f4fbef6ca1943e6137c/cffi-2.0.0-cp313-cp313-win_arm64.whl", hash = "sha256:256f80b80ca3853f90c21b23ee78cd008713787b1b1e93eae9f3d6a7134abd91", size = 177780, upload-time = "2025-09-08T23:23:16.761Z" },
{ url = "https://files.pythonhosted.org/packages/92/c4/3ce07396253a83250ee98564f8d7e9789fab8e58858f35d07a9a2c78de9f/cffi-2.0.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:fc33c5141b55ed366cfaad382df24fe7dcbc686de5be719b207bb248e3053dc5", size = 185320, upload-time = "2025-09-08T23:23:18.087Z" },
{ url = "https://files.pythonhosted.org/packages/59/dd/27e9fa567a23931c838c6b02d0764611c62290062a6d4e8ff7863daf9730/cffi-2.0.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c654de545946e0db659b3400168c9ad31b5d29593291482c43e3564effbcee13", size = 181487, upload-time = "2025-09-08T23:23:19.622Z" },
{ url = "https://files.pythonhosted.org/packages/d6/43/0e822876f87ea8a4ef95442c3d766a06a51fc5298823f884ef87aaad168c/cffi-2.0.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:24b6f81f1983e6df8db3adc38562c83f7d4a0c36162885ec7f7b77c7dcbec97b", size = 220049, upload-time = "2025-09-08T23:23:20.853Z" },
{ url = "https://files.pythonhosted.org/packages/b4/89/76799151d9c2d2d1ead63c2429da9ea9d7aac304603de0c6e8764e6e8e70/cffi-2.0.0-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:12873ca6cb9b0f0d3a0da705d6086fe911591737a59f28b7936bdfed27c0d47c", size = 207793, upload-time = "2025-09-08T23:23:22.08Z" },
{ url = "https://files.pythonhosted.org/packages/bb/dd/3465b14bb9e24ee24cb88c9e3730f6de63111fffe513492bf8c808a3547e/cffi-2.0.0-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:d9b97165e8aed9272a6bb17c01e3cc5871a594a446ebedc996e2397a1c1ea8ef", size = 206300, upload-time = "2025-09-08T23:23:23.314Z" },
{ url = "https://files.pythonhosted.org/packages/47/d9/d83e293854571c877a92da46fdec39158f8d7e68da75bf73581225d28e90/cffi-2.0.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:afb8db5439b81cf9c9d0c80404b60c3cc9c3add93e114dcae767f1477cb53775", size = 219244, upload-time = "2025-09-08T23:23:24.541Z" },
{ url = "https://files.pythonhosted.org/packages/2b/0f/1f177e3683aead2bb00f7679a16451d302c436b5cbf2505f0ea8146ef59e/cffi-2.0.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:737fe7d37e1a1bffe70bd5754ea763a62a066dc5913ca57e957824b72a85e205", size = 222828, upload-time = "2025-09-08T23:23:26.143Z" },
{ url = "https://files.pythonhosted.org/packages/c6/0f/cafacebd4b040e3119dcb32fed8bdef8dfe94da653155f9d0b9dc660166e/cffi-2.0.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:38100abb9d1b1435bc4cc340bb4489635dc2f0da7456590877030c9b3d40b0c1", size = 220926, upload-time = "2025-09-08T23:23:27.873Z" },
{ url = "https://files.pythonhosted.org/packages/3e/aa/df335faa45b395396fcbc03de2dfcab242cd61a9900e914fe682a59170b1/cffi-2.0.0-cp314-cp314-win32.whl", hash = "sha256:087067fa8953339c723661eda6b54bc98c5625757ea62e95eb4898ad5e776e9f", size = 175328, upload-time = "2025-09-08T23:23:44.61Z" },
{ url = "https://files.pythonhosted.org/packages/bb/92/882c2d30831744296ce713f0feb4c1cd30f346ef747b530b5318715cc367/cffi-2.0.0-cp314-cp314-win_amd64.whl", hash = "sha256:203a48d1fb583fc7d78a4c6655692963b860a417c0528492a6bc21f1aaefab25", size = 185650, upload-time = "2025-09-08T23:23:45.848Z" },
{ url = "https://files.pythonhosted.org/packages/9f/2c/98ece204b9d35a7366b5b2c6539c350313ca13932143e79dc133ba757104/cffi-2.0.0-cp314-cp314-win_arm64.whl", hash = "sha256:dbd5c7a25a7cb98f5ca55d258b103a2054f859a46ae11aaf23134f9cc0d356ad", size = 180687, upload-time = "2025-09-08T23:23:47.105Z" },
{ url = "https://files.pythonhosted.org/packages/3e/61/c768e4d548bfa607abcda77423448df8c471f25dbe64fb2ef6d555eae006/cffi-2.0.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:9a67fc9e8eb39039280526379fb3a70023d77caec1852002b4da7e8b270c4dd9", size = 188773, upload-time = "2025-09-08T23:23:29.347Z" },
{ url = "https://files.pythonhosted.org/packages/2c/ea/5f76bce7cf6fcd0ab1a1058b5af899bfbef198bea4d5686da88471ea0336/cffi-2.0.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:7a66c7204d8869299919db4d5069a82f1561581af12b11b3c9f48c584eb8743d", size = 185013, upload-time = "2025-09-08T23:23:30.63Z" },
{ url = "https://files.pythonhosted.org/packages/be/b4/c56878d0d1755cf9caa54ba71e5d049479c52f9e4afc230f06822162ab2f/cffi-2.0.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:7cc09976e8b56f8cebd752f7113ad07752461f48a58cbba644139015ac24954c", size = 221593, upload-time = "2025-09-08T23:23:31.91Z" },
{ url = "https://files.pythonhosted.org/packages/e0/0d/eb704606dfe8033e7128df5e90fee946bbcb64a04fcdaa97321309004000/cffi-2.0.0-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:92b68146a71df78564e4ef48af17551a5ddd142e5190cdf2c5624d0c3ff5b2e8", size = 209354, upload-time = "2025-09-08T23:23:33.214Z" },
{ url = "https://files.pythonhosted.org/packages/d8/19/3c435d727b368ca475fb8742ab97c9cb13a0de600ce86f62eab7fa3eea60/cffi-2.0.0-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:b1e74d11748e7e98e2f426ab176d4ed720a64412b6a15054378afdb71e0f37dc", size = 208480, upload-time = "2025-09-08T23:23:34.495Z" },
{ url = "https://files.pythonhosted.org/packages/d0/44/681604464ed9541673e486521497406fadcc15b5217c3e326b061696899a/cffi-2.0.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:28a3a209b96630bca57cce802da70c266eb08c6e97e5afd61a75611ee6c64592", size = 221584, upload-time = "2025-09-08T23:23:36.096Z" },
{ url = "https://files.pythonhosted.org/packages/25/8e/342a504ff018a2825d395d44d63a767dd8ebc927ebda557fecdaca3ac33a/cffi-2.0.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:7553fb2090d71822f02c629afe6042c299edf91ba1bf94951165613553984512", size = 224443, upload-time = "2025-09-08T23:23:37.328Z" },
{ url = "https://files.pythonhosted.org/packages/e1/5e/b666bacbbc60fbf415ba9988324a132c9a7a0448a9a8f125074671c0f2c3/cffi-2.0.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:6c6c373cfc5c83a975506110d17457138c8c63016b563cc9ed6e056a82f13ce4", size = 223437, upload-time = "2025-09-08T23:23:38.945Z" },
{ url = "https://files.pythonhosted.org/packages/a0/1d/ec1a60bd1a10daa292d3cd6bb0b359a81607154fb8165f3ec95fe003b85c/cffi-2.0.0-cp314-cp314t-win32.whl", hash = "sha256:1fc9ea04857caf665289b7a75923f2c6ed559b8298a1b8c49e59f7dd95c8481e", size = 180487, upload-time = "2025-09-08T23:23:40.423Z" },
{ url = "https://files.pythonhosted.org/packages/bf/41/4c1168c74fac325c0c8156f04b6749c8b6a8f405bbf91413ba088359f60d/cffi-2.0.0-cp314-cp314t-win_amd64.whl", hash = "sha256:d68b6cef7827e8641e8ef16f4494edda8b36104d79773a334beaa1e3521430f6", size = 191726, upload-time = "2025-09-08T23:23:41.742Z" },
{ url = "https://files.pythonhosted.org/packages/ae/3a/dbeec9d1ee0844c679f6bb5d6ad4e9f198b1224f4e7a32825f47f6192b0c/cffi-2.0.0-cp314-cp314t-win_arm64.whl", hash = "sha256:0a1527a803f0a659de1af2e1fd700213caba79377e27e4693648c2923da066f9", size = 184195, upload-time = "2025-09-08T23:23:43.004Z" },
]

[[package]]
@@ -359,6 +324,65 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335, upload-time = "2022-10-25T02:36:20.889Z" },
]

[[package]]
name = "cryptography"
version = "47.0.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "cffi", marker = "platform_python_implementation != 'PyPy'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/ef/b2/7ffa7fe8207a8c42147ffe70c3e360b228160c1d85dc3faff16aaa3244c0/cryptography-47.0.0.tar.gz", hash = "sha256:9f8e55fe4e63613a5e1cc5819030f27b97742d720203a087802ce4ce9ceb52bb", size = 830863, upload-time = "2026-04-24T19:54:57.056Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/a4/98/40dfe932134bdcae4f6ab5927c87488754bf9eb79297d7e0070b78dd58e9/cryptography-47.0.0-cp311-abi3-macosx_10_9_universal2.whl", hash = "sha256:160ad728f128972d362e714054f6ba0067cab7fb350c5202a9ae8ae4ce3ef1a0", size = 7912214, upload-time = "2026-04-24T19:53:03.864Z" },
{ url = "https://files.pythonhosted.org/packages/34/c6/2733531243fba725f58611b918056b277692f1033373dcc8bd01af1c05d4/cryptography-47.0.0-cp311-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:b9a8943e359b7615db1a3ba587994618e094ff3d6fa5a390c73d079ce18b3973", size = 4644617, upload-time = "2026-04-24T19:53:06.909Z" },
{ url = "https://files.pythonhosted.org/packages/00/e3/b27be1a670a9b87f855d211cf0e1174a5d721216b7616bd52d8581d912ed/cryptography-47.0.0-cp311-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:f5c15764f261394b22aef6b00252f5195f46f2ca300bec57149474e2538b31f8", size = 4668186, upload-time = "2026-04-24T19:53:09.053Z" },
{ url = "https://files.pythonhosted.org/packages/81/b9/8443cfe5d17d482d348cee7048acf502bb89a51b6382f06240fd290d4ca3/cryptography-47.0.0-cp311-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:9c59ab0e0fa3a180a5a9c59f3a5abe3ef90d474bc56d7fadfbe80359491b615b", size = 4651244, upload-time = "2026-04-24T19:53:11.217Z" },
{ url = "https://files.pythonhosted.org/packages/5d/5e/13ed0cdd0eb88ba159d6dd5ebfece8cb901dbcf1ae5ac4072e28b55d3153/cryptography-47.0.0-cp311-abi3-manylinux_2_28_ppc64le.whl", hash = "sha256:34b4358b925a5ea3e14384ca781a2c0ef7ac219b57bb9eacc4457078e2b19f92", size = 5252906, upload-time = "2026-04-24T19:53:13.532Z" },
{ url = "https://files.pythonhosted.org/packages/64/16/ed058e1df0f33d440217cd120d41d5dda9dd215a80b8187f68483185af82/cryptography-47.0.0-cp311-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:0024b87d47ae2399165a6bfb20d24888881eeab83ae2566d62467c5ff0030ce7", size = 4701842, upload-time = "2026-04-24T19:53:15.618Z" },
{ url = "https://files.pythonhosted.org/packages/02/e0/3d30986b30fdbd9e969abbdf8ba00ed0618615144341faeb57f395a084fe/cryptography-47.0.0-cp311-abi3-manylinux_2_31_armv7l.whl", hash = "sha256:1e47422b5557bb82d3fff997e8d92cff4e28b9789576984f08c248d2b3535d93", size = 4289313, upload-time = "2026-04-24T19:53:17.755Z" },
{ url = "https://files.pythonhosted.org/packages/df/fd/32db38e3ad0cb331f0691cb4c7a8a6f176f679124dee746b3af6633db4d9/cryptography-47.0.0-cp311-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:6f29f36582e6151d9686235e586dd35bb67491f024767d10b842e520dc6a07ac", size = 4650964, upload-time = "2026-04-24T19:53:20.062Z" },
{ url = "https://files.pythonhosted.org/packages/86/53/5395d944dfd48cb1f67917f533c609c34347185ef15eb4308024c876f274/cryptography-47.0.0-cp311-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:a9b761f012a943b7de0e828843c5688d0de94a0578d44d6c85a1bae32f87791f", size = 5207817, upload-time = "2026-04-24T19:53:22.498Z" },
{ url = "https://files.pythonhosted.org/packages/34/4f/e5711b28e1901f7d480a2b1b688b645aa4c77c73f10731ed17e7f7db3f0d/cryptography-47.0.0-cp311-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:4e1de79e047e25d6e9f8cea71c86b4a53aced64134f0f003bbcbf3655fd172c8", size = 4701544, upload-time = "2026-04-24T19:53:24.356Z" },
{ url = "https://files.pythonhosted.org/packages/22/22/c8ddc25de3010fc8da447648f5a092c40e7a8fadf01dd6d255d9c0b9373d/cryptography-47.0.0-cp311-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:ef6b3634087f18d2155b1e8ce264e5345a753da2c5fa9815e7d41315c90f8318", size = 4783536, upload-time = "2026-04-24T19:53:26.665Z" },
{ url = "https://files.pythonhosted.org/packages/66/b6/d4a68f4ea999c6d89e8498579cba1c5fcba4276284de7773b17e4fa69293/cryptography-47.0.0-cp311-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:11dbb9f50a0f1bb9757b3d8c27c1101780efb8f0bdecfb12439c22a74d64c001", size = 4926106, upload-time = "2026-04-24T19:53:28.686Z" },
{ url = "https://files.pythonhosted.org/packages/54/ed/5f524db1fade9c013aa618e1c99c6ed05e8ffc9ceee6cda22fed22dda3f4/cryptography-47.0.0-cp311-abi3-win32.whl", hash = "sha256:7fda2f02c9015db3f42bb8a22324a454516ed10a8c29ca6ece6cdbb5efe2a203", size = 3258581, upload-time = "2026-04-24T19:53:31.058Z" },
{ url = "https://files.pythonhosted.org/packages/b2/dc/1b901990b174786569029f67542b3edf72ac068b6c3c8683c17e6a2f5363/cryptography-47.0.0-cp311-abi3-win_amd64.whl", hash = "sha256:f5c3296dab66202f1b18a91fa266be93d6aa0c2806ea3d67762c69f60adc71aa", size = 3775309, upload-time = "2026-04-24T19:53:33.054Z" },
{ url = "https://files.pythonhosted.org/packages/14/88/7aa18ad9c11bc87689affa5ce4368d884b517502d75739d475fc6f4a03c7/cryptography-47.0.0-cp314-cp314t-macosx_10_9_universal2.whl", hash = "sha256:be12cb6a204f77ed968bcefe68086eb061695b540a3dd05edac507a3111b25f0", size = 7904299, upload-time = "2026-04-24T19:53:35.003Z" },
{ url = "https://files.pythonhosted.org/packages/07/55/c18f75724544872f234678fdedc871391722cb34a2aee19faa9f63100bb2/cryptography-47.0.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:2ebd84adf0728c039a3be2700289378e1c164afc6748df1a5ed456767bef9ba7", size = 4631180, upload-time = "2026-04-24T19:53:37.517Z" },
{ url = "https://files.pythonhosted.org/packages/ee/65/31a5cc0eaca99cec5bafffe155d407115d96136bb161e8b49e0ef73f09a7/cryptography-47.0.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:7f68d6fbc7fbbcfb0939fea72c3b96a9f9a6edfc0e1b1d29778a2066030418b1", size = 4653529, upload-time = "2026-04-24T19:53:39.775Z" },
{ url = "https://files.pythonhosted.org/packages/e5/bc/641c0519a495f3bfd0421b48d7cd325c4336578523ccd76ea322b6c29c7a/cryptography-47.0.0-cp314-cp314t-manylinux_2_28_aarch64.whl", hash = "sha256:6651d32eff255423503aa276739da98c30f26c40cbeffcc6048e0d54ef704c0c", size = 4638570, upload-time = "2026-04-24T19:53:42.129Z" },
{ url = "https://files.pythonhosted.org/packages/2b/f2/300327b0a47f6dc94dd8b71b57052aefe178bb51745073d73d80604f11ab/cryptography-47.0.0-cp314-cp314t-manylinux_2_28_ppc64le.whl", hash = "sha256:3fb8fa48075fad7193f2e5496135c6a76ac4b2aa5a38433df0a539296b377829", size = 5238019, upload-time = "2026-04-24T19:53:44.577Z" },
{ url = "https://files.pythonhosted.org/packages/e9/5a/5b5cf994391d4bf9d9c7efd4c66aabe4d95227256627f8fea6cff7dfadbd/cryptography-47.0.0-cp314-cp314t-manylinux_2_28_x86_64.whl", hash = "sha256:11438c7518132d95f354fa01a4aa2f806d172a061a7bed18cf18cbdacdb204d7", size = 4686832, upload-time = "2026-04-24T19:53:47.015Z" },
{ url = "https://files.pythonhosted.org/packages/dc/2c/ae950e28fd6475c852fc21a44db3e6b5bcc1261d1e370f2b6e42fa800fef/cryptography-47.0.0-cp314-cp314t-manylinux_2_31_armv7l.whl", hash = "sha256:8c1a736bbb3288005796c3f7ccb9453360d7fed483b13b9f468aea5171432923", size = 4269301, upload-time = "2026-04-24T19:53:48.97Z" },
{ url = "https://files.pythonhosted.org/packages/67/fb/6a39782e150ffe5cc1b0018cb6ddc48bf7ca62b498d7539ffc8a758e977d/cryptography-47.0.0-cp314-cp314t-manylinux_2_34_aarch64.whl", hash = "sha256:f1557695e5c2b86e204f6ce9470497848634100787935ab7adc5397c54abd7ab", size = 4638110, upload-time = "2026-04-24T19:53:51.011Z" },
{ url = "https://files.pythonhosted.org/packages/8e/d7/0b3c71090a76e5c203164a47688b697635ece006dcd2499ab3a4dbd3f0bd/cryptography-47.0.0-cp314-cp314t-manylinux_2_34_ppc64le.whl", hash = "sha256:f9a034b642b960767fb343766ae5ba6ad653f2e890ddd82955aef288ffea8736", size = 5194988, upload-time = "2026-04-24T19:53:52.962Z" },
{ url = "https://files.pythonhosted.org/packages/63/33/63a961498a9df51721ab578c5a2622661411fc520e00bd83b0cc64eb20c4/cryptography-47.0.0-cp314-cp314t-manylinux_2_34_x86_64.whl", hash = "sha256:b1c76fca783aa7698eb21eb14f9c4aa09452248ee54a627d125025a43f83e7a7", size = 4686563, upload-time = "2026-04-24T19:53:55.274Z" },
{ url = "https://files.pythonhosted.org/packages/b7/bf/5ee5b145248f92250de86145d1c1d6edebbd57a7fe7caa4dedb5d4cf06a1/cryptography-47.0.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:4f7722c97826770bab8ae92959a2e7b20a5e9e9bf4deae68fd86c3ca457bab52", size = 4770094, upload-time = "2026-04-24T19:53:57.753Z" },
{ url = "https://files.pythonhosted.org/packages/92/43/21d220b2da5d517773894dacdcdb5c682c28d3fffce65548cb06e87d5501/cryptography-47.0.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:09f6d7bf6724f8db8b32f11eccf23efc8e759924bc5603800335cf8859a3ddbd", size = 4913811, upload-time = "2026-04-24T19:54:00.236Z" },
{ url = "https://files.pythonhosted.org/packages/31/98/dc4ad376ac5f1a1a7d4a83f7b0c6f2bcad36b5d2d8f30aeb482d3a7d9582/cryptography-47.0.0-cp314-cp314t-win32.whl", hash = "sha256:6eebcaf0df1d21ce1f90605c9b432dd2c4f4ab665ac29a40d5e3fc68f51b5e63", size = 3237158, upload-time = "2026-04-24T19:54:02.606Z" },
{ url = "https://files.pythonhosted.org/packages/bc/da/97f62d18306b5133468bc3f8cc73a3111e8cdc8cf8d3e69474d6e5fd2d1b/cryptography-47.0.0-cp314-cp314t-win_amd64.whl", hash = "sha256:51c9313e90bd1690ec5a75ed047c27c0b8e6c570029712943d6116ef9a90620b", size = 3758706, upload-time = "2026-04-24T19:54:04.433Z" },
{ url = "https://files.pythonhosted.org/packages/e0/34/a4fae8ae7c3bc227460c9ae43f56abf1b911da0ec29e0ebac53bb0a4b6b7/cryptography-47.0.0-cp38-abi3-macosx_10_9_universal2.whl", hash = "sha256:14432c8a9bcb37009784f9594a62fae211a2ae9543e96c92b2a8e4c3cd5cd0c4", size = 7904072, upload-time = "2026-04-24T19:54:06.411Z" },
{ url = "https://files.pythonhosted.org/packages/01/64/d7b1e54fdb69f22d24a64bb3e88dc718b31c7fb10ef0b9691a3cf7eeea6e/cryptography-47.0.0-cp38-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:07efe86201817e7d3c18781ca9770bc0db04e1e48c994be384e4602bc38f8f27", size = 4635767, upload-time = "2026-04-24T19:54:08.519Z" },
{ url = "https://files.pythonhosted.org/packages/8b/7b/cca826391fb2a94efdcdfe4631eb69306ee1cff0b22f664a412c90713877/cryptography-47.0.0-cp38-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:2b45761c6ec22b7c726d6a829558777e32d0f1c8be7c3f3480f9c912d5ee8a10", size = 4654350, upload-time = "2026-04-24T19:54:10.795Z" },
{ url = "https://files.pythonhosted.org/packages/4c/65/4b57bcc823f42a991627c51c2f68c9fd6eb1393c1756aac876cba2accae2/cryptography-47.0.0-cp38-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:edd4da498015da5b9f26d38d3bfc2e90257bfa9cbed1f6767c282a0025ae649b", size = 4643394, upload-time = "2026-04-24T19:54:13.275Z" },
{ url = "https://files.pythonhosted.org/packages/f4/c4/2c5fbeea70adbbca2bbae865e1d605d6a4a7f8dbd9d33eaf69645087f06c/cryptography-47.0.0-cp38-abi3-manylinux_2_28_ppc64le.whl", hash = "sha256:9af828c0d5a65c70ec729cd7495a4bf1a67ecb66417b8f02ff125ab8a6326a74", size = 5225777, upload-time = "2026-04-24T19:54:15.18Z" },
{ url = "https://files.pythonhosted.org/packages/7e/b8/ac57107ef32749d2b244e36069bb688792a363aaaa3acc9e3cf84c130315/cryptography-47.0.0-cp38-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:256d07c78a04d6b276f5df935a9923275f53bd1522f214447fdf365494e2d515", size = 4688771, upload-time = "2026-04-24T19:54:17.835Z" },
{ url = "https://files.pythonhosted.org/packages/56/fc/9f1de22ff8be99d991f240a46863c52d475404c408886c5a38d2b5c3bb26/cryptography-47.0.0-cp38-abi3-manylinux_2_31_armv7l.whl", hash = "sha256:5d0e362ff51041b0c0d219cc7d6924d7b8996f57ce5712bdcef71eb3c65a59cc", size = 4270753, upload-time = "2026-04-24T19:54:19.963Z" },
{ url = "https://files.pythonhosted.org/packages/00/68/d70c852797aa68e8e48d12e5a87170c43f67bb4a59403627259dd57d15de/cryptography-47.0.0-cp38-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:1581aef4219f7ca2849d0250edaa3866212fb74bf5667284f46aa92f9e65c1ca", size = 4642911, upload-time = "2026-04-24T19:54:21.818Z" },
{ url = "https://files.pythonhosted.org/packages/a5/51/661cbee74f594c5d97ff82d34f10d5551c085ca4668645f4606ebd22bd5d/cryptography-47.0.0-cp38-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:a49a3eb5341b9503fa3000a9a0db033161db90d47285291f53c2a9d2cd1b7f76", size = 5181411, upload-time = "2026-04-24T19:54:24.376Z" },
{ url = "https://files.pythonhosted.org/packages/94/87/f2b6c374a82cf076cfa1416992ac8e8ec94d79facc37aec87c1a5cb72352/cryptography-47.0.0-cp38-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:2207a498b03275d0051589e326b79d4cf59985c99031b05bb292ac52631c37fe", size = 4688262, upload-time = "2026-04-24T19:54:26.946Z" },
{ url = "https://files.pythonhosted.org/packages/14/e2/8b7462f4acf21ec509616f0245018bb197194ab0b65c2ea21a0bdd53c0eb/cryptography-47.0.0-cp38-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:7a02675e2fabd0c0fc04c868b8781863cbf1967691543c22f5470500ff840b31", size = 4775506, upload-time = "2026-04-24T19:54:28.926Z" },
{ url = "https://files.pythonhosted.org/packages/70/75/158e494e4c08dc05e039da5bb48553826bd26c23930cf8d3cd5f21fa8921/cryptography-47.0.0-cp38-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:80887c5cbd1774683cb126f0ab4184567f080071d5acf62205acb354b4b753b7", size = 4912060, upload-time = "2026-04-24T19:54:30.869Z" },
{ url = "https://files.pythonhosted.org/packages/06/bd/0a9d3edbf5eadbac926d7b9b3cd0c4be584eeeae4a003d24d9eda4affbbd/cryptography-47.0.0-cp38-abi3-win32.whl", hash = "sha256:ed67ea4e0cfb5faa5bc7ecb6e2b8838f3807a03758eec239d6c21c8769355310", size = 3248487, upload-time = "2026-04-24T19:54:33.494Z" },
{ url = "https://files.pythonhosted.org/packages/60/80/5681af756d0da3a599b7bdb586fac5a1540f1bcefd2717a20e611ddade45/cryptography-47.0.0-cp38-abi3-win_amd64.whl", hash = "sha256:835d2d7f47cdc53b3224e90810fb1d36ca94ea29cc1801fb4c1bc43876735769", size = 3755737, upload-time = "2026-04-24T19:54:35.408Z" },
{ url = "https://files.pythonhosted.org/packages/1b/a0/928c9ce0d120a40a81aa99e3ba383e87337b9ac9ef9f6db02e4d7822424d/cryptography-47.0.0-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:7f1207974a904e005f762869996cf620e9bf79ecb4622f148550bb48e0eb35a7", size = 3909893, upload-time = "2026-04-24T19:54:38.334Z" },
{ url = "https://files.pythonhosted.org/packages/81/75/d691e284750df5d9569f2b1ce4a00a71e1d79566da83b2b3e5549c84917f/cryptography-47.0.0-pp311-pypy311_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:1a405c08857258c11016777e11c02bacbe7ef596faf259305d282272a3a05cbe", size = 4587867, upload-time = "2026-04-24T19:54:40.619Z" },
{ url = "https://files.pythonhosted.org/packages/07/d6/1b90f1a4e453009730b4545286f0b39bb348d805c11181fc31544e4f9a65/cryptography-47.0.0-pp311-pypy311_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:20fdbe3e38fb67c385d233c89371fa27f9909f6ebca1cecc20c13518dae65475", size = 4627192, upload-time = "2026-04-24T19:54:42.849Z" },
{ url = "https://files.pythonhosted.org/packages/dc/53/cb358a80e9e359529f496870dd08c102aa8a4b5b9f9064f00f0d6ed5b527/cryptography-47.0.0-pp311-pypy311_pp73-manylinux_2_34_aarch64.whl", hash = "sha256:f7db373287273d8af1414cf95dc4118b13ffdc62be521997b0f2b270771fef50", size = 4587486, upload-time = "2026-04-24T19:54:44.908Z" },
{ url = "https://files.pythonhosted.org/packages/8b/57/aaa3d53876467a226f9a7a82fd14dd48058ad2de1948493442dfa16e2ffd/cryptography-47.0.0-pp311-pypy311_pp73-manylinux_2_34_x86_64.whl", hash = "sha256:9fe6b7c64926c765f9dff301f9c1b867febcda5768868ca084e18589113732ab", size = 4626327, upload-time = "2026-04-24T19:54:47.813Z" },
{ url = "https://files.pythonhosted.org/packages/ab/9c/51f28c3550276bcf35660703ba0ab829a90b88be8cd98a71ef23c2413913/cryptography-47.0.0-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:cffbba3392df0fa8629bb7f43454ee2925059ee158e23c54620b9063912b86c8", size = 3698916, upload-time = "2026-04-24T19:54:49.782Z" },
]

[[package]]
name = "cytoolz"
version = "1.1.0"
@@ -713,22 +737,6 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/2a/39/e50c7c3a983047577ee07d2a9e53faf5a69493943ec3f6a384bdc792deb2/httpx-0.28.1-py3-none-any.whl", hash = "sha256:d909fcccc110f8c7faf814ca82a9a4d816bc5a6dbfea25d6591d6985b8ba59ad", size = 73517, upload-time = "2024-12-06T15:37:21.509Z" },
]

[[package]]
name = "hyperliquid-python-sdk"
version = "0.23.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "eth-account" },
{ name = "eth-utils" },
{ name = "msgpack" },
{ name = "requests" },
{ name = "websocket-client" },
]
sdist = { url = "https://files.pythonhosted.org/packages/80/ad/12b4559a953e26fc56677de5bf689023e11213196b802b991a6e6db94814/hyperliquid_python_sdk-0.23.0.tar.gz", hash = "sha256:14df0b62511a0cf08ca5a73f73f03656868ee67845ed3362539a79674511bb51", size = 25255, upload-time = "2026-04-14T21:51:24.646Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/94/e9/b7b23aefc319727f670992904b1defd7aee5fc3f59a51c141a87db05f7da/hyperliquid_python_sdk-0.23.0-py3-none-any.whl", hash = "sha256:5b4f9f7ab8c0b1ad9848f2222901dc047c8f97a6e6fe3fd7286b7b34337f80cb", size = 24638, upload-time = "2026-04-14T21:51:23.27Z" },
]

[[package]]
name = "idna"
version = "3.13"
@@ -1123,17 +1131,12 @@ wheels = [
]

[[package]]
-name = "pybit"
-version = "5.16.0"
+name = "pycparser"
+version = "3.0"
source = { registry = "https://pypi.org/simple" }
-dependencies = [
-{ name = "pycryptodome" },
-{ name = "requests" },
-{ name = "websocket-client" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/2c/92/c7c14df206de81d7acb7d7d92180482d8d54c18dc73bf8d1ab73498bdbfc/pybit-5.16.0.tar.gz", hash = "sha256:554a812982e0271ec3a9f9483767ad5ff440d3834f3d363b689c81a0a474ebc0", size = 66764, upload-time = "2026-04-18T13:55:33.058Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/1b/7d/92392ff7815c21062bea51aa7b87d45576f649f16458d78b7cf94b9ab2e6/pycparser-3.0.tar.gz", hash = "sha256:600f49d217304a5902ac3c37e1281c9fe94e4d0489de643a9504c5cdfdfc6b29", size = 103492, upload-time = "2026-01-21T14:26:51.89Z" }
wheels = [
-{ url = "https://files.pythonhosted.org/packages/d1/3b/9c37237af574f143cee806801ff37a83b85491cd654046c49caf1d2940a2/pybit-5.16.0-py2.py3-none-any.whl", hash = "sha256:d214c4987aabb25c10e8e031244973a3be4728e044575ab6c6f6f7873f8c1cab", size = 59230, upload-time = "2026-04-18T13:55:31.551Z" },
+{ url = "https://files.pythonhosted.org/packages/0c/c3/44f3fbbfa403ea2a7c779186dc20772604442dde72947e7d01069cbe98e3/pycparser-3.0-py3-none-any.whl", hash = "sha256:b727414169a36b7d524c1c3e31839a521725078d7b2ff038656844266160a992", size = 48172, upload-time = "2026-01-21T14:26:50.693Z" },
]

[[package]]
@@ -1378,15 +1381,6 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/27/be/0631a861af4d1c875f096c07d34e9a63639560a717130e7a87cbc82b7e3f/python_json_logger-4.1.0-py3-none-any.whl", hash = "sha256:132994765cf75bf44554be9aa49b06ef2345d23661a96720262716438141b6b2", size = 15021, upload-time = "2026-03-29T04:39:55.266Z" },
]

[[package]]
name = "pytz"
version = "2026.1.post1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/56/db/b8721d71d945e6a8ac63c0fc900b2067181dbb50805958d4d4661cf7d277/pytz-2026.1.post1.tar.gz", hash = "sha256:3378dde6a0c3d26719182142c56e60c7f9af7e968076f31aae569d72a0358ee1", size = 321088, upload-time = "2026-03-03T07:47:50.683Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/10/99/781fe0c827be2742bcc775efefccb3b048a3a9c6ce9aec0cbf4a101677e5/pytz-2026.1.post1-py2.py3-none-any.whl", hash = "sha256:f2fd16142fda348286a75e1a524be810bb05d444e5a081f37f7affc635035f7a", size = 510489, upload-time = "2026-03-03T07:47:49.167Z" },
]

[[package]]
name = "pyyaml"
version = "6.0.3"
@@ -1546,21 +1540,6 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/d5/e7/ec846d560ae6a597115153c02ca6138a7877a1748b2072d9521c10a93e58/regex-2026.4.4-cp314-cp314t-win_arm64.whl", hash = "sha256:af0384cb01a33600c49505c27c6c57ab0b27bf84a74e28524c92ca897ebdac9d", size = 275773, upload-time = "2026-04-03T20:56:26.07Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "requests"
|
||||
version = "2.33.1"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "certifi" },
|
||||
{ name = "charset-normalizer" },
|
||||
{ name = "idna" },
|
||||
{ name = "urllib3" },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/5f/a4/98b9c7c6428a668bf7e42ebb7c79d576a1c3c1e3ae2d47e674b468388871/requests-2.33.1.tar.gz", hash = "sha256:18817f8c57c6263968bc123d237e3b8b08ac046f5456bd1e307ee8f4250d3517", size = 134120, upload-time = "2026-03-30T16:09:15.531Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/d7/8e/7540e8a2036f79a125c1d2ebadf69ed7901608859186c856fa0388ef4197/requests-2.33.1-py3-none-any.whl", hash = "sha256:4e6d1ef462f3626a1f0a0a9c42dd93c63bad33f9f1c1937509b8c5c8718ab56a", size = 64947, upload-time = "2026-03-30T16:09:13.83Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "rlp"
|
||||
version = "4.1.0"
|
||||
@@ -1678,14 +1657,6 @@ wheels = [
    { url = "https://files.pythonhosted.org/packages/b7/ce/149a00dd41f10bc29e5921b496af8b574d8413afcd5e30dfa0ed46c2cc5e/six-1.17.0-py2.py3-none-any.whl", hash = "sha256:4721f391ed90541fddacab5acf947aa0d3dc7d27b2e1e8eda2be8970586c3274", size = 11050, upload-time = "2024-12-04T17:35:26.475Z" },
]

[[package]]
name = "sseclient-py"
version = "1.9.0"
source = { registry = "https://pypi.org/simple" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/4d/2e/59920f7d66b7f9932a3d83dd0ec53fab001be1e058bf582606fe414a5198/sseclient_py-1.9.0-py3-none-any.whl", hash = "sha256:340062b1587fc2880892811e2ab5b176d98ef3eee98b3672ff3a3ba1e8ed0f6f", size = 8351, upload-time = "2026-01-02T23:39:30.995Z" },
]

[[package]]
name = "starlette"
version = "1.0.0"
@@ -1777,15 +1748,6 @@ wheels = [
    { url = "https://files.pythonhosted.org/packages/ce/e4/dccd7f47c4b64213ac01ef921a1337ee6e30e8c6466046018326977efd95/tzdata-2026.2-py2.py3-none-any.whl", hash = "sha256:bbe9af844f658da81a5f95019480da3a89415801f6cc966806612cc7169bffe7", size = 349321, upload-time = "2026-04-24T15:22:05.876Z" },
]

[[package]]
name = "urllib3"
version = "2.6.3"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/c7/24/5f1b3bdffd70275f6661c76461e25f024d5a38a46f04aaca912426a2b1d3/urllib3-2.6.3.tar.gz", hash = "sha256:1b62b6884944a57dbe321509ab94fd4d3b307075e0c2eae991ac71ee15ad38ed", size = 435556, upload-time = "2026-01-07T16:24:43.925Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/39/08/aaaad47bc4e9dc8c725e68f9d04865dbcb2052843ff09c97b08904852d84/urllib3-2.6.3-py3-none-any.whl", hash = "sha256:bf272323e553dfb2e87d9bfd225ca7b0f467b919d7bbd355436d3fd37cb0acd4", size = 131584, upload-time = "2026-01-07T16:24:42.685Z" },
]

[[package]]
name = "uvicorn"
version = "0.46.0"
@@ -1935,15 +1897,6 @@ wheels = [
    { url = "https://files.pythonhosted.org/packages/6e/d4/ed38dd3b1767193de971e694aa544356e63353c33a85d948166b5ff58b9e/watchfiles-1.1.1-pp311-pypy311_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3e6f39af2eab0118338902798b5aa6664f46ff66bc0280de76fca67a7f262a49", size = 457546, upload-time = "2025-10-14T15:06:13.372Z" },
]

[[package]]
name = "websocket-client"
version = "1.9.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/2c/41/aa4bf9664e4cda14c3b39865b12251e8e7d239f4cd0e3cc1b6c2ccde25c1/websocket_client-1.9.0.tar.gz", hash = "sha256:9e813624b6eb619999a97dc7958469217c3176312b3a16a4bd1bc7e08a46ec98", size = 70576, upload-time = "2025-10-07T21:16:36.495Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/34/db/b10e48aa8fff7407e67470363eac595018441cf32d5e1001567a7aeba5d2/websocket_client-1.9.0-py3-none-any.whl", hash = "sha256:af248a825037ef591efbf6ed20cc5faa03d3b47b9e5a2230a529eeee1c1fc3ef", size = 82616, upload-time = "2025-10-07T21:16:34.951Z" },
]

[[package]]
name = "websockets"
version = "16.0"