Compare commits: 9e7b98579b...V2.0.0 (36 commits)
@@ -17,8 +17,14 @@ TESTNET_TOKEN=

MAINNET_TOKEN=

# ─── EXCHANGE — DERIBIT ───────────────────────────────────
# Single credential pair (used for both testnet and mainnet):
DERIBIT_CLIENT_ID=
DERIBIT_CLIENT_SECRET=
# Or per-env pairs (these take precedence when set):
# DERIBIT_CLIENT_ID_TESTNET=
# DERIBIT_CLIENT_SECRET_TESTNET=
# DERIBIT_CLIENT_ID_LIVE=
# DERIBIT_CLIENT_SECRET_LIVE=
DERIBIT_URL_LIVE=https://www.deribit.com/api/v2
DERIBIT_URL_TESTNET=https://test.deribit.com/api/v2
DERIBIT_MAX_LEVERAGE=3

@@ -45,6 +51,39 @@ ALPACA_URL_LIVE=https://api.alpaca.markets
ALPACA_URL_TESTNET=https://paper-api.alpaca.markets
ALPACA_MAX_LEVERAGE=1

# ─── EXCHANGE — IBKR ──────────────────────────────────────
# OAuth setup: see README "IBKR Setup" + scripts/ibkr_oauth_setup.py.
# RSA keys (PEM) do NOT go in .env: mount them as files and reference the paths.

IBKR_CONSUMER_KEY=
IBKR_ACCESS_TOKEN=
IBKR_ACCESS_TOKEN_SECRET=
IBKR_SIGNATURE_KEY_PATH=/secrets/ibkr_signature.pem
IBKR_ENCRYPTION_KEY_PATH=/secrets/ibkr_encryption.pem
IBKR_DH_PRIME=

# Per-env pairs (take precedence):
# IBKR_CONSUMER_KEY_TESTNET=
# IBKR_ACCESS_TOKEN_TESTNET=
# IBKR_ACCESS_TOKEN_SECRET_TESTNET=
# IBKR_SIGNATURE_KEY_PATH_TESTNET=/secrets/ibkr_signature_paper.pem
# IBKR_ENCRYPTION_KEY_PATH_TESTNET=/secrets/ibkr_encryption_paper.pem
# IBKR_ACCOUNT_ID_TESTNET=DU1234567
# IBKR_CONSUMER_KEY_LIVE=
# IBKR_ACCESS_TOKEN_LIVE=
# IBKR_ACCESS_TOKEN_SECRET_LIVE=
# IBKR_SIGNATURE_KEY_PATH_LIVE=/secrets/ibkr_signature_live.pem
# IBKR_ENCRYPTION_KEY_PATH_LIVE=/secrets/ibkr_encryption_live.pem
# IBKR_ACCOUNT_ID_LIVE=U1234567

IBKR_URL_LIVE=https://api.ibkr.com/v1/api
IBKR_URL_TESTNET=https://api.ibkr.com/v1/api
IBKR_WS_URL_LIVE=wss://api.ibkr.com/v1/api/ws
IBKR_WS_URL_TESTNET=wss://api.ibkr.com/v1/api/ws
IBKR_MAX_LEVERAGE=4
IBKR_WS_MAX_SUBSCRIPTIONS=80
IBKR_WS_IDLE_TIMEOUT_S=300

# ─── DATA PROVIDERS — MACRO ───────────────────────────────
FRED_API_KEY=
FINNHUB_API_KEY=
@@ -9,8 +9,8 @@ on the bearer token provided by the client.

- **A single Docker image** (`cerbero-mcp`) hosts all exchange routers
  in one FastAPI process
- **Five exchanges** (Deribit, Bybit, Hyperliquid, Alpaca, IBKR) and **two
  read-only data providers** (Macro, Sentiment)
- **Per-request testnet/mainnet switch** via the
  `Authorization: Bearer <TOKEN>` header: the same container serves both
  environments without restarts

@@ -19,7 +19,10 @@ on the bearer token provided by the client.

  overridable via dedicated variables (`DERIBIT_URL_*`,
  `BYBIT_URL_*`, `HYPERLIQUID_URL_*`, `ALPACA_URL_*`)
- **Interactive documentation**: OpenAPI/Swagger exposed at `/apidocs`
- **Unified cross-exchange endpoint** (`/mcp-cross/tools/get_historical`):
  fan-out to every exchange that supports (symbol, asset_class), with
  per-bar consensus (median OHLC + `div_pct` + `sources`)
- **Verified quality**: 399 tests (unit + integration + smoke), clean
  mypy, clean ruff

## Quick start (development, without Docker)
@@ -88,8 +91,10 @@ not required on the public endpoints (`/health`, `/apidocs`,

| `POST /mcp-bybit/tools/{tool}` | Bybit exchange tools |
| `POST /mcp-hyperliquid/tools/{tool}` | Hyperliquid exchange tools |
| `POST /mcp-alpaca/tools/{tool}` | Alpaca exchange tools |
| `POST /mcp-ibkr/tools/{tool}` | Interactive Brokers exchange tools |
| `POST /mcp-macro/tools/{tool}` | Macro/market data tools |
| `POST /mcp-sentiment/tools/{tool}` | Sentiment/news tools |
| `POST /mcp-cross/tools/get_historical` | Cross-exchange aggregated history with consensus + divergence |
| `GET /admin/audit` | Query of the JSONL audit log (bearer required, no X-Bot-Tag) |

## Observability
@@ -140,7 +145,7 @@ Query parameters (all optional):

- `from`, `to`: ISO 8601 datetime (e.g. `2026-05-01` or `2026-05-01T12:34:56Z`)
- `actor`: `testnet` | `mainnet`
- `exchange`: exchange name (`deribit`, `bybit`, `hyperliquid`, `alpaca`, `ibkr`)
- `action`: tool name (e.g. `place_order`)
- `bot_tag`: bot identifier
- `limit`: maximum number of records returned, default `1000`, maximum `10000`
@@ -186,6 +191,13 @@ rate, spot/perp basis, place_order, set_stop_loss, set_take_profit.

Account, positions, bars, snapshot, option chain, place_order,
amend_order, cancel_order, close_position.

### IBKR (Interactive Brokers)

Account, positions, activities, ticker, bars, snapshot, option chain,
search_contracts, clock, streaming (tick + depth via a WebSocket
singleton), place_order, amend_order, cancel_order, close_position,
bracket/OCO/OTO orders. Auth via OAuth 1.0a Self-Service with unattended
session token minting (see the "IBKR Setup" section below).

### Macro

Treasury yields, FRED indicators, equity futures, asset prices, calendar,
get_yield_curve_slope, get_breakeven_inflation, get_cot_tff,

@@ -196,6 +208,16 @@ News (CryptoPanic/CoinDesk), social (LunarCrush), multi-exchange funding,

OI history, get_funding_arb_spread, get_liquidation_heatmap,
get_cointegration_pairs.

### Cross (unified history)

`get_historical` aggregates candles for the same symbol from every
exchange that supports it and returns a consensus series: the close is
the median, `sources` is the number of exchanges that contributed to the
bar, and `div_pct = (max-min)/median` flags disagreement between sources,
acting as a quality gate for the bots. Crypto: BTC/ETH/SOL via Bybit +
Hyperliquid + Deribit. Stocks: AAPL/SPY/QQQ/TSLA/NVDA via Alpaca. On
partial failure it returns the available data plus `failed_sources`; if
*all* upstreams fail → retryable HTTP 502.
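The per-bar consensus described above can be sketched as a pure function (a minimal sketch; the dict layout is illustrative, not the actual response schema):

```python
from statistics import median

def consensus_bar(closes: list[float]) -> dict:
    """Collapse the same bar from N exchanges into one consensus bar.

    `div_pct = (max - min) / median` measures disagreement between
    sources; `sources` counts the contributing exchanges.
    """
    if not closes:
        raise ValueError("no sources contributed to this bar")
    med = median(closes)
    return {
        "close": med,
        "sources": len(closes),
        "div_pct": (max(closes) - min(closes)) / med if med else 0.0,
    }

# Three exchanges report slightly different closes for the same bar:
bar = consensus_bar([100.0, 101.0, 100.5])
```

A bot can then threshold `div_pct` to skip bars where the sources disagree too much.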
## Deploying on a VPS with Traefik

On the VPS the public network layer (TLS, IP allowlist, rate limiting) is handled by

@@ -280,7 +302,7 @@ PORT=9000 TESTNET_TOKEN="$TESTNET_TOKEN" bash tests/smoke/run.sh

```bash
uv sync
uv run pytest                      # full suite (399 tests expected)
uv run pytest tests/unit -v        # unit tests only
uv run pytest tests/integration -v
uv run ruff check src/ tests/
```

@@ -361,6 +383,69 @@ pybit (workaround documented in the client). For Alpaca the override is
applied to the trading endpoint only: the data endpoints
(`data.alpaca.markets`) keep the SDK defaults.
## IBKR Setup

IBKR uses OAuth 1.0a Self-Service for fully unattended runtime auth. Setup is
a manual, one-time step per account (paper + live); after that the container
mints live session tokens autonomously.

### One-time setup

1. Log in to https://www.interactivebrokers.com → User Settings → Self-Service OAuth
2. Generate keypairs locally:

   ```bash
   uv run python scripts/ibkr_oauth_setup.py --env testnet
   ```

   This writes RSA keys under `secrets/` and prints SHA-256 fingerprints.

3. Register the two fingerprints in the IBKR portal. Receive a `consumer_key`.
4. Get a request token + authorization URL:

   ```bash
   uv run python scripts/ibkr_oauth_setup.py --env testnet \
     --consumer-key <K> --request-token
   ```

5. Open the URL, authorize, copy the `verifier_code`.
6. Exchange the verifier for a long-lived access token (~5 years validity):

   ```bash
   uv run python scripts/ibkr_oauth_setup.py --env testnet --verifier <V>
   ```

7. Copy the printed values into `.env`:
   - `IBKR_CONSUMER_KEY_TESTNET`
   - `IBKR_ACCESS_TOKEN_TESTNET`
   - `IBKR_ACCESS_TOKEN_SECRET_TESTNET`
   - `IBKR_SIGNATURE_KEY_PATH_TESTNET`
   - `IBKR_ENCRYPTION_KEY_PATH_TESTNET`
   - `IBKR_ACCOUNT_ID_TESTNET` (e.g., `DU1234567` for paper)
   - `IBKR_DH_PRIME` (hex from the portal; shared between paper and live)
8. Repeat with `--env mainnet` for live trading.

### Smoke test

```bash
curl https://cerbero-mcp.<dom>/mcp-ibkr/tools/get_account \
  -X POST -H "Authorization: Bearer <TESTNET_TOKEN>" -d '{}'
```

### Key rotation

```bash
# 1. Generate new keypairs alongside the existing ones
uv run python scripts/ibkr_oauth_setup.py --env testnet --rotate

# 2. Register the new fingerprints in the IBKR portal, get the new consumer_key + tokens

# 3. Confirm the rotation (atomic swap with auto-rollback on validation failure)
curl -X POST "https://cerbero-mcp.<dom>/admin/ibkr/rotate-keys/confirm?env=testnet" \
  -H "Authorization: Bearer <ADMIN_TOKEN>" -H "Content-Type: application/json" \
  -d '{"new_consumer_key":"...","new_access_token":"...","new_access_token_secret":"..."}'
```

## License

Private.
@@ -3,9 +3,9 @@ services:
    image: cerbero-mcp:2.0.0
    build: .
    container_name: cerbero-mcp
    ports:
      - "${PORT:-9000}:${PORT:-9000}"
    env_file: .env
    volumes:
      - ./secrets:/secrets:ro
    restart: unless-stopped
    healthcheck:
      test:

@@ -16,3 +16,18 @@ services:
      interval: 30s
      timeout: 5s
      retries: 3
    networks:
      - traefik
    labels:
      - traefik.enable=true
      - traefik.docker.network=traefik
      - "traefik.http.routers.cerbero-mcp.rule=Host(`cerbero-mcp.${DOMAIN_NAME:-tielogic.xyz}`)"
      - traefik.http.routers.cerbero-mcp.tls=true
      - traefik.http.routers.cerbero-mcp.entrypoints=websecure
      - traefik.http.routers.cerbero-mcp.tls.certresolver=mytlschallenge
      - "traefik.http.services.cerbero-mcp.loadbalancer.server.port=${PORT:-9000}"
      - "com.centurylinklabs.watchtower.enable=true"

networks:
  traefik:
    external: true
@@ -0,0 +1,530 @@

# IBKR Integration — Design Spec

**Date:** 2026-05-03
**Branch:** V2.0.0
**Status:** Approved (pending implementation plan)
**Approach chosen:** A2 — Client Portal Web API with OAuth 1.0a Self-Service (fully unattended)

## 1. Goals & Non-Goals

### Goals

Add `ibkr` as a supported exchange in `cerbero-mcp`, reusing the established pattern (Alpaca/Deribit) for:

- account / positions / activities (read)
- simple orders: market, limit, stop, stop-limit (read + write)
- complex orders: bracket (entry + SL + TP with OCA), OCO (N legs, OCA type=1), OTO (parent → sequential child)
- market data: REST snapshot + real-time tick/depth via WebSocket (snapshot-on-demand)
- options chain via OCC symbol
- semi-automatic key rotation via an admin endpoint with auto-rollback
- testnet (paper account) / mainnet (live account) routing via bearer token, like the other exchanges

### Non-Goals (V1)

- Server-Sent Events / streaming HTTP responses (snapshot-on-demand is sufficient)
- Dynamic multi-account support (a single `account_id` per env, configured in settings)
- Trailing stop, if-touched, advanced conditional orders (only fixed bracket, OCO, OTO)
- Fully automatic rotation of the consumer registration (the IBKR portal step cannot be automated)
- WebSocket streaming exposed directly to the bot (it stays internal to the server, exposed as REST polling)
- TWS API socket protocol via `ib_insync` (rejected: requires a desktop gateway with Xvfb, fragile)
- Flex Web Service (rejected: read-only over historical reports, out of scope)

### Success criteria

1. `POST /mcp-ibkr/tools/get_account` with the testnet bearer returns the real paper-account balance
2. `POST /mcp-ibkr/tools/place_order` (1 share AAPL market) → order filled in paper, audit log entry present
3. `POST /mcp-ibkr/tools/place_bracket_order` → 3 orders linked via OCA group; the first fill cancels the others
4. `POST /mcp-ibkr/tools/get_depth` → 5 depth levels with data < 1 s of latency
5. `POST /admin/ibkr/rotate-keys/{start,confirm}` → atomic swap, automatic rollback on validation failure
6. Container restart → first `get_account` < 5 s (OAuth flow + first call), zero human input
7. Test suite green: 90% coverage on `oauth.py`/`client.py`/`ws.py`/`key_rotation.py`, 85% on `tools.py`/`orders_complex.py`
8. `/health/ready` reports IBKR healthy for both envs
## 2. Architecture

```
┌──────┐  Bearer testnet|mainnet   ┌──────────────────┐
│ Bot  │ ─────────────────────────▶│   cerbero-mcp    │
└──────┘                           │ (single FastAPI) │
                                   │                  │
                                   │  IBKRClient      │ ──HTTPS OAuth1a──▶ api.ibkr.com/v1/api
                                   │  IBKRWebSocket   │ ──WSS LST───────▶ api.ibkr.com/v1/api/ws
                                   └──────────────────┘
```

**Key decisions:**

- **OAuth 1.0a Self-Service:** RSA-SHA256 signing + DH key exchange to mint live session tokens (24 h TTL) autonomously. One-time manual setup on the IBKR portal; the runtime is fully unattended.
- **Single container:** no Java sidecar (considered for the non-OAuth CP Gateway, discarded because it required an interactive login).
- **Internal snapshot-on-demand WebSocket:** one `IBKRWebSocket` singleton per env keeps subscriptions alive in the background; the REST tools `get_tick`/`get_depth` return the latest cached snapshot. The bot is polling-based; no streaming toward the bot.
- **Paper vs live = separate accounts, same host:** IBKR uses `api.ibkr.com` for both; tests run against the paper account (with its own OAuth bundle), live against the live account. Two credential sets in settings (Deribit-style `_TESTNET` / `_LIVE` pattern).
- **Conid cache:** IBKR identifies instruments by a numeric `conid`; symbol→conid lookups are cached (LRU 1024, TTL 1 h) to avoid repeated round-trips.
- **Two-level keep-alive:** the brokerage session dies after 5 min idle (requires `POST /tickle`); the live session token dies after 24 h (requires a DH re-mint). The client manages them separately.
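The two-level keep-alive decision can be sketched as a pure function (illustrative names; the real client tracks these timestamps internally):

```python
import time

TICKLE_AFTER_S = 240   # brokerage session: tickle if idle for more than 4 min
LST_TTL_S = 86_000     # live session token: re-mint just under 24 h

def keepalive_actions(now: float, last_call_at: float, lst_minted_at: float) -> set[str]:
    """Return which keep-alive steps must run before the next private request."""
    actions: set[str] = set()
    if now - lst_minted_at >= LST_TTL_S:
        actions.add("mint_lst")    # DH exchange → new live session token
    if now - last_call_at > TICKLE_AFTER_S:
        actions.add("tickle")      # POST /tickle keeps the brokerage session alive
    return actions

# Idle for 5 minutes with a fresh LST → only a tickle is needed.
t0 = time.time()
acts = keepalive_actions(t0, last_call_at=t0 - 300, lst_minted_at=t0 - 60)
```

Keeping the decision pure makes the two timers trivially unit-testable, separate from the HTTP calls they trigger.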
## 3. Components (file layout)

```
src/cerbero_mcp/exchanges/ibkr/
├── __init__.py
├── client.py           # IBKRClient: httpx REST + tickle + conid cache
├── oauth.py            # OAuth1aSigner: RSA signing + DH live session token mint/refresh
├── ws.py               # IBKRWebSocket: persistent WSS, smd/sbd subs, snapshot cache
├── orders_complex.py   # bracket/OCO/OTO payload builders
├── key_rotation.py     # KeyRotationManager: stage/confirm/abort/rollback
├── tools.py            # Pydantic schemas + async tool functions (read + write + complex + streaming)
└── leverage_cap.py     # get_max_leverage(creds) — 1:1 copy from alpaca

src/cerbero_mcp/routers/ibkr.py   # POST /mcp-ibkr/tools/*

scripts/ibkr_oauth_setup.py       # one-shot: generate keypair, portal walkthrough, --rotate flag

tests/unit/exchanges/ibkr/
├── __init__.py
├── test_oauth.py
├── test_client.py
├── test_ws.py
├── test_orders_complex.py
├── test_key_rotation.py
└── test_tools.py
```

**Existing files modified:**

- `src/cerbero_mcp/settings.py` — add `IBKRSettings`
- `src/cerbero_mcp/exchanges/__init__.py` — `if exchange == "ibkr"` branch in `build_client`
- `src/cerbero_mcp/__main__.py` — `app.include_router(ibkr.make_router())`
- `src/cerbero_mcp/admin.py` — `/admin/ibkr/rotate-*` + `/admin/ibkr/health` endpoints
- `.env.example` — `# ─── EXCHANGE — IBKR ───` section
- `pyproject.toml` — add `cryptography>=43` (RSA + DH; may already be transitive)
- `docker-compose.yml` — bind mount `./secrets:/secrets:ro`
- `README.md` — "IBKR Setup" section

**Module boundaries:**

- `oauth.py` exposes `OAuth1aSigner.get_live_session_token() -> str` (cached). It knows nothing about application endpoints; it only knows the `/oauth/live_session_token` endpoint.
- `client.py` receives an `OAuth1aSigner` as a dependency and does not build keys. It knows nothing about WebSocket.
- `ws.py` receives an `OAuth1aSigner` and manages the WSS independently. It exposes the async methods `subscribe_tick(conid)`, `subscribe_depth(conid, rows)`, `get_tick_snapshot(conid)`, `get_depth_snapshot(conid)`. It knows nothing about REST.
- `orders_complex.py` is a set of **pure** functions producing IBKR-ready JSON payloads. No HTTP. Deterministic tests.
- `key_rotation.py` operates on the filesystem + an `OAuth1aSigner` factory; it does not touch FastAPI routing directly.
## 4. Settings & OAuth flow

### Pydantic settings

```python
class IBKRSettings(_Sub):
    model_config = SettingsConfigDict(
        env_file=".env", env_file_encoding="utf-8",
        env_prefix="IBKR_", extra="ignore",
    )
    # Single fallback pair (legacy / dev)
    consumer_key: str | None = None
    access_token: str | None = None
    access_token_secret: SecretStr | None = None
    signature_key_path: str | None = None
    encryption_key_path: str | None = None
    dh_prime: SecretStr | None = None

    # Env-specific pairs (take precedence when set)
    consumer_key_testnet: str | None = None
    access_token_testnet: str | None = None
    access_token_secret_testnet: SecretStr | None = None
    signature_key_path_testnet: str | None = None
    encryption_key_path_testnet: str | None = None
    account_id_testnet: str | None = None

    consumer_key_live: str | None = None
    access_token_live: str | None = None
    access_token_secret_live: SecretStr | None = None
    signature_key_path_live: str | None = None
    encryption_key_path_live: str | None = None
    account_id_live: str | None = None

    # URLs (paper and live share the same host)
    url_live: str = "https://api.ibkr.com/v1/api"
    url_testnet: str = "https://api.ibkr.com/v1/api"
    ws_url_live: str = "wss://api.ibkr.com/v1/api/ws"
    ws_url_testnet: str = "wss://api.ibkr.com/v1/api/ws"

    # Limits
    max_leverage: int = 4  # Reg-T default
    ws_max_subscriptions: int = 80
    ws_idle_timeout_s: int = 300

    def credentials(self, env: str) -> dict:
        """Return the full OAuth dict for env. ValueError on missing fields.

        For each field: prefer `<field>_<env>`; fall back to `<field>` (legacy);
        raise ValueError if both are absent for the required fields
        (consumer_key, access_token, access_token_secret, signature_key_path,
        encryption_key_path, account_id, dh_prime).
        Same pattern as DeribitSettings.credentials().
        """
```

`dh_prime` is a hex string issued by IBKR at setup time, **constant** per consumer (shared between paper and live), not duplicated per env.
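The per-field fallback described in the docstring can be sketched stand-alone with a plain dict in place of the Pydantic model (a sketch of the resolution order, not the actual implementation):

```python
REQUIRED = (
    "consumer_key", "access_token", "access_token_secret",
    "signature_key_path", "encryption_key_path", "account_id", "dh_prime",
)

def resolve_credentials(values: dict, env: str) -> dict:
    """Prefer `<field>_<env>`, fall back to legacy `<field>`, else ValueError."""
    out = {}
    for field in REQUIRED:
        val = values.get(f"{field}_{env}") or values.get(field)
        if val is None:
            raise ValueError(f"IBKR credential missing for {env!r}: {field}")
        out[field] = val
    return out

# The env-specific consumer_key wins; fields without an env variant fall back.
creds = resolve_credentials(
    {"consumer_key_testnet": "CK1", "consumer_key": "CK0",
     "access_token": "AT", "access_token_secret": "S",
     "signature_key_path": "/secrets/sig.pem",
     "encryption_key_path": "/secrets/enc.pem",
     "account_id_testnet": "DU1234567", "dh_prime": "ff"},
    "testnet",
)
```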
### `.env.example`

```env
# ─── EXCHANGE — IBKR ──────────────────────────────────────
# OAuth setup: see README "IBKR Setup" + scripts/ibkr_oauth_setup.py.
# RSA keys (PEM) do NOT go in .env: mount them as files and reference the paths.

IBKR_CONSUMER_KEY=
IBKR_ACCESS_TOKEN=
IBKR_ACCESS_TOKEN_SECRET=
IBKR_SIGNATURE_KEY_PATH=/secrets/ibkr_signature.pem
IBKR_ENCRYPTION_KEY_PATH=/secrets/ibkr_encryption.pem
IBKR_DH_PRIME=

# Per-env pairs (take precedence):
# IBKR_CONSUMER_KEY_TESTNET=
# IBKR_ACCESS_TOKEN_TESTNET=
# IBKR_ACCESS_TOKEN_SECRET_TESTNET=
# IBKR_SIGNATURE_KEY_PATH_TESTNET=/secrets/ibkr_signature_paper.pem
# IBKR_ENCRYPTION_KEY_PATH_TESTNET=/secrets/ibkr_encryption_paper.pem
# IBKR_ACCOUNT_ID_TESTNET=DU1234567
# IBKR_CONSUMER_KEY_LIVE=
# IBKR_ACCESS_TOKEN_LIVE=
# IBKR_ACCESS_TOKEN_SECRET_LIVE=
# IBKR_SIGNATURE_KEY_PATH_LIVE=/secrets/ibkr_signature_live.pem
# IBKR_ENCRYPTION_KEY_PATH_LIVE=/secrets/ibkr_encryption_live.pem
# IBKR_ACCOUNT_ID_LIVE=U1234567

IBKR_URL_LIVE=https://api.ibkr.com/v1/api
IBKR_URL_TESTNET=https://api.ibkr.com/v1/api
IBKR_WS_URL_LIVE=wss://api.ibkr.com/v1/api/ws
IBKR_WS_URL_TESTNET=wss://api.ibkr.com/v1/api/ws
IBKR_MAX_LEVERAGE=4
IBKR_WS_MAX_SUBSCRIPTIONS=80
IBKR_WS_IDLE_TIMEOUT_S=300
```
### One-shot OAuth setup (manual, once per account)

1. Log in to the portal `https://www.interactivebrokers.com` → "User Settings" → "Self-Service OAuth"
2. `python scripts/ibkr_oauth_setup.py --env testnet` → generates 2 RSA keypairs + prints SHA-256 fingerprints
3. In the portal: register the 2 public keys, obtain a `consumer_key`
4. `python scripts/ibkr_oauth_setup.py --consumer-key <K> --request-token` → obtains a request token + authorization URL
5. Open the URL in a browser, authorize, copy the `verifier_code`
6. `python scripts/ibkr_oauth_setup.py --verifier <V>` → exchanges it for an `access_token` (long-lived, ~5 years) + `access_token_secret`
7. Copy the 3 values into `.env`. Repeat for the live env.

### Runtime flow (fully unattended)

- Container starts → `IBKRClient` is lazily instantiated on the first request
- `OAuth1aSigner` loads the RSA private keys from disk (paths from settings)
- First private request → `_get_live_session_token()`:
  1. Generate nonce + timestamp
  2. Sign `POST /oauth/live_session_token` with RSA-SHA256
  3. Diffie-Hellman key exchange with `dh_prime`
  4. Receive `lst` (live session token, valid 24 h)
  5. Cache in memory with expiry `now + 86000s`
- Subsequent requests: HMAC-SHA256 of the params with `lst` as the key
- Expired `lst` → automatic re-mint, retry once. Never any human input at runtime.
- Every private request: if the last call was > 4 min ago, call `POST /tickle` (brokerage session keep-alive) first.
### Errors → error envelope

| Trigger | Code | retryable |
|---|---|---|
| RSA key file missing | `IBKR_KEY_NOT_FOUND` | false |
| RSA key file unreadable | `IBKR_KEY_INVALID` | false |
| Consumer revoked from the portal | `IBKR_CONSUMER_REVOKED` | false |
| Access token expired (~5 years) | `IBKR_ACCESS_TOKEN_EXPIRED` | false |
| LST mint failed (network) | `IBKR_SESSION_MINT_FAILED` | true |
| `401` on a signed request | `IBKR_AUTH_FAILED` | true (forces LST refresh + retry once) |
| Rate limit `429` | `IBKR_RATE_LIMITED` | true |
| Sunday maintenance window | `IBKR_MAINTENANCE` | true |
| Configured account not in `/iserver/accounts` | `IBKR_ACCOUNT_NOT_FOUND` | false |
| Missing market data subscription | `IBKR_NO_MARKET_DATA_SUBSCRIPTION` | false |
| Critical order warning (margin/suitability) | `IBKR_ORDER_REJECTED_WARNING` | false |
| WS sub limit exceeded | `IBKR_WS_SUB_LIMIT` | false |
| `get_tick` timeout, cache still empty after 3 s | `IBKR_TICK_TIMEOUT` | true |
| OTO second POST failed after trigger placed | `IBKR_OTO_PARTIAL_FAILURE` | false |
| Rotation validation failed | `IBKR_ROTATION_VALIDATION_FAILED` | false (automatic rollback) |
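A sketch of how auth-related HTTP statuses could map onto this table (illustrative; the actual envelope shape lives in the project's shared error module and is an assumption here):

```python
# Codes the table above marks retryable.
RETRYABLE = {"IBKR_SESSION_MINT_FAILED", "IBKR_AUTH_FAILED", "IBKR_RATE_LIMITED",
             "IBKR_MAINTENANCE", "IBKR_TICK_TIMEOUT"}

def envelope(code: str, message: str) -> dict:
    """Build an error envelope; `retryable` is derived from the code table."""
    return {"error": {"code": code, "message": message,
                      "retryable": code in RETRYABLE}}

def map_http_status(status: int) -> dict:
    if status == 401:
        return envelope("IBKR_AUTH_FAILED", "signed request rejected; refreshing LST")
    if status == 429:
        return envelope("IBKR_RATE_LIMITED", "rate limited by IBKR")
    return envelope("IBKR_SESSION_MINT_FAILED", f"unexpected upstream status {status}")

err = map_http_status(429)
```

Keeping `retryable` derived from a single set avoids the flag drifting out of sync with the code table.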
## 5. Tool API surface

The pattern mirrors Alpaca wherever the abstraction holds: same tool name → the bot reuses its cross-exchange logic.

### Reads (12 tools)

| Tool | IBKR endpoint |
|---|---|
| `environment_info` | local (env, paper, base_url, max_leverage) |
| `get_account` | `GET /portfolio/{accountId}/summary` |
| `get_positions` | `GET /portfolio/{accountId}/positions/0` (loop if >30) |
| `get_activities` | `GET /iserver/account/trades?days=N` (default 7, cap 90) |
| `get_assets` | `GET /trsrv/secdef/search?symbol=...` (requires symbol) |
| `get_ticker` | `GET /iserver/marketdata/snapshot?conids=X&fields=31,84,86,7295,7296` |
| `get_bars` | `GET /iserver/marketdata/history?conid=X&period=...&bar=...` |
| `get_snapshot` | `GET /iserver/marketdata/snapshot` (full fields) |
| `get_option_chain` | `GET /iserver/secdef/strikes` + `/info` |
| `get_open_orders` | `GET /iserver/account/orders?filters=Submitted,PreSubmitted` |
| `get_clock` | local (now + static market hours) |
| `search_contracts` | `GET /trsrv/secdef/search` (IBKR-specific: symbol+secType → conid) |
### Streaming (4 tools, snapshot-on-demand)

| Tool | Behavior |
|---|---|
| `get_tick` | Latest cached tick (last/bid/ask/size/timestamp). If not subscribed: lazy sub + wait for the first tick (3 s timeout) |
| `get_depth` | Order book depth (default 5 levels, max 10). IBKR `sbd+{conid}+{exchange}+{rows}` |
| `subscribe_tick` | Keeps the sub active even without polling. Auto-unsub after `ws_idle_timeout_s` |
| `unsubscribe` | Forces the sub closed to free a slot |

### Simple writes (6 tools)

| Tool | Audit field |
|---|---|
| `place_order` | `symbol` |
| `amend_order` | `order_id` |
| `cancel_order` | `order_id` |
| `cancel_all_orders` | — (loop) |
| `close_position` | `symbol` |
| `close_all_positions` | — (loop) |
### Complex writes (3 tools)

| Tool | Essential schema | Endpoint |
|---|---|---|
| `place_bracket_order` | `symbol, side, qty, entry_price, stop_loss, take_profit, tif="gtc"` | `POST /iserver/account/{id}/orders` with the array `[parent, sl_child, tp_child]` and an auto-generated OCA group |
| `place_oco_order` | `legs: list[OrderLeg]` (2-N orders) | Same POST with `oca_group` + `oca_type=1` on every leg |
| `place_oto_order` | `trigger: OrderLeg, child: OrderLeg` | Sequential POSTs: trigger first, then child with `parent_id=<trigger.order_id>`. **Not atomic:** if the second POST fails after the first succeeded, the tool cancels the trigger via `cancel_order` (best-effort) and returns `IBKR_OTO_PARTIAL_FAILURE` with `details.trigger_order_id` for the audit trail |

**Audit:** complex orders record `target_field=symbol` + `details.legs_count` + `details.oca_group` (when applicable).

**Leverage cap on complex orders:** applied to the **net notional** of the structure. Bracket = entry only (the children open no new exposure). OCO = max(leg.notional). OTO = trigger + child if both are long, otherwise the max.
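Since `orders_complex.py` is specified as pure payload builders, a bracket builder can be sketched like this (a sketch only: the payload field names are an assumption loosely modeled on IBKR order JSON, not verified against the API):

```python
import uuid

def build_bracket(conid: int, side: str, qty: float,
                  entry_price: float, stop_loss: float, take_profit: float,
                  tif: str = "GTC") -> list[dict]:
    """Pure builder: parent entry + SL/TP children sharing one OCA group."""
    oca = f"oca-{uuid.uuid4().hex[:8]}"
    exit_side = "SELL" if side.upper() == "BUY" else "BUY"
    parent = {"conid": conid, "side": side.upper(), "quantity": qty,
              "orderType": "LMT", "price": entry_price, "tif": tif}
    sl = {"conid": conid, "side": exit_side, "quantity": qty,
          "orderType": "STP", "price": stop_loss, "tif": tif, "ocaGroup": oca}
    tp = {"conid": conid, "side": exit_side, "quantity": qty,
          "orderType": "LMT", "price": take_profit, "tif": tif, "ocaGroup": oca}
    return [parent, sl, tp]

# 1 share long with a protective stop and a profit target (conid is hypothetical).
legs = build_bracket(conid=265598, side="buy", qty=1,
                     entry_price=190.0, stop_loss=180.0, take_profit=205.0)
```

Because the builder does no HTTP, the OCA invariants (shared group, opposite-side children) are testable deterministically, as the test layout in section 3 intends.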
### IBKR specifics (internal to the client)

1. **`conid` resolution:** `place_order(symbol="AAPL")` → lookup `GET /trsrv/secdef/search?symbol=AAPL&secType=STK` → first match → LRU cache. For options: parse the OCC symbol (`AAPL 240119C00190000`) → `/iserver/secdef/info`.
2. **`accountId` validation:** at boot, `GET /iserver/accounts` → verify `account_id_<env>` is present. Otherwise `IBKR_ACCOUNT_NOT_FOUND`.
3. **Order confirmation flow:** IBKR returns a warnings array and requires a second POST with `confirmed: true`. Auto-confirm by default (max 3 cycles), but filter critical warnings (margin, suitability, hard rejects) → error envelope.
4. **`tickle` keep-alive:** automatic when the last request was > 4 min ago. Independent of the LST.
5. **Empty market data → error envelope:** an empty snapshot means a missing subscription; we return `IBKR_NO_MARKET_DATA_SUBSCRIPTION` instead of a silently empty dict.
6. **Leverage cap:** IBKR does not accept a per-order `leverage`. We compute `notional / equity` ≤ `max_leverage` pre-submit by calling `get_account` for equity. Async pattern, but cached for 30 s.
### `PlaceOrderReq` schema

```python
class PlaceOrderReq(BaseModel):
    symbol: str                    # "AAPL" or OCC format for options
    side: str                      # "buy" | "sell"
    qty: float
    order_type: str = "market"     # "market" | "limit" | "stop" | "stop_limit"
    limit_price: float | None = None
    stop_price: float | None = None
    tif: str = "day"               # "day" | "gtc" | "ioc"
    asset_class: str = "stocks"    # "stocks" | "options" | "futures" | "forex"
    sec_type: str | None = None    # IBKR override (STK/OPT/FUT/CASH); inferred from asset_class
    exchange: str = "SMART"        # IBKR routing
    outside_rth: bool = False
```
## 6. WebSocket layer

### Pattern

One `IBKRWebSocket` singleton per env, lazily started on the first sub. A single shared WSS connection carries all subs.

### Lifecycle

1. **Boot:** does not connect until a streaming tool is called.
2. **First sub call:** opens the WSS, authenticates with the current LST (`Cookie: api=<lst>` header), sends the subscribe message (`smd+{conid}+{fields}` or `sbd+{conid}+{exchange}+{rows}`).
3. **Message dispatch:** every `smd-...` message updates `dict[conid, TickSnapshot]`; every `sbd-...` updates `dict[conid, DepthSnapshot]`.
4. **Heartbeat:** ping every 30 s; no pong within 60 s → force a reconnect.
5. **Reconnect:** exponential backoff 1 s, 2 s, 4 s, capped at 30 s. On reconnect: automatic re-subscribe to all active conids.
6. **Idle unsub:** track `last_polled_at[conid]`; if > `ws_idle_timeout_s` (default 300 s) → send unsub, free the slot. Subs forced via `subscribe_tick` never expire until an explicit `unsubscribe`.
7. **Sub limit:** if active subs ≥ `ws_max_subscriptions` (default 80) → error envelope `IBKR_WS_SUB_LIMIT` on the new sub.
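The backoff schedule in step 5 is a standard capped exponential; the delay sequence can be sketched as:

```python
def backoff_delays(base_s: float = 1.0, cap_s: float = 30.0, attempts: int = 8) -> list[float]:
    """Capped exponential backoff: 1 s, 2 s, 4 s, ... capped at 30 s."""
    return [min(base_s * (2 ** i), cap_s) for i in range(attempts)]

delays = backoff_delays()
```

After each successful reconnect the attempt counter resets, so a healthy link always recovers from the 1 s step again.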
### Cache invariant

- A snapshot **always** represents the latest update received. No historical buffering.
- On disconnect: a conid's cache is invalidated if the reconnect does not succeed within 5 s.
- `get_tick(conid)` with an empty cache: waits up to 3 s for the first tick, then `IBKR_TICK_TIMEOUT`.

### Health probe

`/health/ready` queries `IBKRWebSocket.connected` for each env. The state is `degraded` if the WS is disconnected but the REST client is ok.

### Feature flag

`IBKR_WS_ENABLED=false` (env var) disables the WS layer at runtime; the streaming tools fall back to HTTP `/marketdata/snapshot` (single shot, no depth). A mitigation for prod emergencies.
## 7. Key rotation

### Admin endpoints

```
POST /admin/ibkr/rotate-keys/start?env=testnet
  → generates signature_key.pem.new + encryption_key.pem.new (RSA 2048)
  → returns {fingerprints: {sig: "SHA256:...", enc: "SHA256:..."},
             expires_at: <now+24h>}
  → the user pastes the fingerprints into the IBKR portal and obtains new_consumer_key

POST /admin/ibkr/rotate-keys/confirm?env=testnet
  body: {new_consumer_key, new_access_token, new_access_token_secret}
  → atomic swap: .new → primary, primary → secrets/.archive/<timestamp>/
  → probe: GET /iserver/auth/status with the new credentials
  → ok: returns {rotated_at, old_archived_at}
  → ko: automatic rollback (reverse swap), returns 500 IBKR_ROTATION_VALIDATION_FAILED

POST /admin/ibkr/rotate-keys/abort?env=testnet
  → deletes the .new files; a no-op if start was never run or confirm already completed
```

### Authorization

Endpoints are protected by the existing `auth.py` middleware and require `X-Bot-Tag: admin` (a header the admin router already supports).

### Atomic swap

Implemented as:

1. Filesystem-level lock via `fcntl.flock` on `secrets/.lock`
2. Rename `signature.pem` → `secrets/.archive/<ts>/signature.pem.old`
3. Rename `signature.pem.new` → `signature.pem`
4. Same for `encryption.pem`
5. Update `IBKRSettings` in memory via `app.state.settings.ibkr.consumer_key_<env> = new_consumer_key` (live settings, no restart)
6. Probe `GET /iserver/auth/status`
7. On KO: rollback (reverse swap), restore the previous settings, raise an exception

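Steps 1-4 of the swap can be sketched with stdlib primitives (a minimal illustration assuming `<name>.pem.new` sits next to the primary file; the real `KeyRotationManager` also handles rollback and the settings update):

```python
import fcntl
import os
from contextlib import contextmanager
from pathlib import Path

@contextmanager
def secrets_lock(lock_path: Path):
    """Exclusive flock guarding the whole swap (step 1)."""
    fd = os.open(lock_path, os.O_CREAT | os.O_RDWR)
    try:
        fcntl.flock(fd, fcntl.LOCK_EX)
        yield
    finally:
        fcntl.flock(fd, fcntl.LOCK_UN)
        os.close(fd)

def swap_key(primary: Path, archive_dir: Path) -> None:
    """Archive the current key, then promote <primary>.new (steps 2-3)."""
    archive_dir.mkdir(parents=True, exist_ok=True)
    new = primary.parent / (primary.name + ".new")
    primary.rename(archive_dir / (primary.name + ".old"))
    new.rename(primary)
```

`rename` within the same filesystem is atomic on POSIX, which is what makes the per-file promotion safe; the lock serializes concurrent rotation attempts.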
### Scheduled health check

An asyncio task created at lifespan startup:

- Every 6h: `GET /iserver/auth/status` on both envs
- If `competing=true` or `authenticated=false` for more than 2 consecutive cycles: log a warning and expose a degraded state on `/admin/ibkr/health`
- Auto-trigger `/tickle` on a degraded session before reporting failure

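The "more than 2 consecutive cycles" rule reduces to a small pure check (an illustrative helper whose name is an assumption; the real task also fires `/tickle` before failing):

```python
def is_degraded(probe_history: list[bool], threshold: int = 2) -> bool:
    """True when more than `threshold` consecutive probes at the tail failed."""
    consecutive_failures = 0
    for ok in reversed(probe_history):
        if ok:
            break
        consecutive_failures += 1
    return consecutive_failures > threshold
```
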
### Encryption-at-rest

Key files live on disk with `0600` permissions. Docker bind mount `:ro` on `/secrets`. Rotation preserves permissions, with readability restricted to the container process UID.

## 8. Testing

### Unit tests

| File | Critical coverage |
|---|---|
| `test_oauth.py` | Deterministic RSA-SHA256 signature (known vector from the IBKR docs); DH key exchange over a test prime; LST mint with an httpx mock; refresh before expiry; error path for a missing/unreadable key; 401 on a revoked consumer |
| `test_client.py` | Authorization header construction; conid lookup + cache hit/miss; tickle keep-alive timing (last_request_at < 4min skips, > 4min triggers); place_order warning auto-confirmation flow + critical warning → error envelope; pre-flight leverage cap; account validation at boot; error mapping (401/429/maintenance/no-mkt-data) |
| `test_ws.py` | Mocked `websockets.connect`; subscribe ack flow; message dispatch into the cache; reconnect after disconnect; idle-timeout unsub; sub limit (>80 → IBKR_WS_SUB_LIMIT); feature flag disabled → HTTP fallback |
| `test_orders_complex.py` | Bracket: payload shape (3 orders, identical OCA group, parent/child relation). OCO: N legs, OCA type=1 everywhere. OTO: two sequential POSTs, the second with the correct parent_id. Leverage cap on net notional |
| `test_key_rotation.py` | start generates a valid keypair; confirm swaps atomically; validation probe success/fail; automatic rollback on failure; abort cleans up the .new files; the archived .old preserves permissions |
| `test_tools.py` | Schema validation (qty required, side enum, OCC format for options); default values; leverage cap enforcement |
| `test_settings.py` (extend) | `IBKRSettings.credentials("testnet")` prefers testnet → falls back to base → ValueError if both are missing. Test isolation via recursive `monkeypatch.delenv` to avoid `.env` pollution (Deribit pattern) |

### Coverage target

- `oauth.py`, `client.py`, `ws.py`, `key_rotation.py`: 90%
- `orders_complex.py`, `tools.py`: 85%
- `routers/ibkr.py`, `admin.py` IBKR section: 75%

### Integration smoke (manual, post-deploy)

Not in CI (requires real credentials). Documented in the README:

```bash
curl https://cerbero-mcp.<dom>/mcp-ibkr/tools/get_account \
  -H "Authorization: Bearer <TESTNET_TOKEN>" -X POST -d '{}'
# → paper account balance

curl .../mcp-ibkr/tools/place_order \
  -H "Authorization: Bearer <TESTNET_TOKEN>" -X POST \
  -d '{"symbol":"AAPL","side":"buy","qty":1,"order_type":"market"}'
# → order_id

curl .../mcp-ibkr/tools/place_bracket_order \
  -d '{"symbol":"AAPL","side":"buy","qty":1,"entry_price":150,"stop_loss":145,"take_profit":160}'
# → 3 order_ids sharing the same oca_group

curl .../mcp-ibkr/tools/get_depth \
  -d '{"symbol":"AAPL","rows":5}'
# → 5-level order book
```

### Verification gate (pre-merge)

- [ ] `uv run pytest tests/unit/exchanges/ibkr/ tests/unit/test_settings.py -v` green
- [ ] `uv run ruff check src/cerbero_mcp/exchanges/ibkr/ src/cerbero_mcp/routers/ibkr.py` clean, no warnings
- [ ] `uv run python -c "from cerbero_mcp.settings import Settings; Settings()"` raises no validation error with the example `.env`
- [ ] `docker compose build && docker compose up -d` healthy in < 60s
- [ ] `curl /health/ready -H "Authorization: Bearer <TESTNET>"` reports `ibkr` probed
- [ ] Full manual smoke (list above) against a real paper account

## 9. Deploy & ops

- **Branch:** `V2.0.0` (default deploy branch, no merge into main)
- **Pipeline:** same pattern as last week's Deribit fix: commit + push → watchtower updates the container on the VPS in < 2min
- **Traefik:** no changes (same Host rule)
- **Secrets:** RSA keys copied manually to `/opt/docker/cerbero-mcp/secrets/` on the VPS, mode `0600`, owned by the container UID; bind mount `./secrets:/secrets:ro` added to `docker-compose.yml`
- **Rollback:** `git revert <commit>` of a single step leaves the other exchanges operational (atomic commits by design)

## 10. Commit plan (8 atomic commits)

```
1. feat(V2): IBKR settings + OAuth signer scaffolding
   - settings.py: IBKRSettings with env-specific credentials
   - exchanges/ibkr/oauth.py: OAuth1aSigner + tests
   - .env.example: IBKR section
   - pyproject.toml: cryptography>=43

2. feat(V2): IBKR client httpx + conid cache + tickle
   - exchanges/ibkr/client.py: IBKRClient base
   - exchanges/ibkr/leverage_cap.py: copied from alpaca
   - tests/unit/exchanges/ibkr/test_client.py

3. feat(V2): IBKR WebSocket layer + tick/depth snapshot cache
   - exchanges/ibkr/ws.py: IBKRWebSocket singleton + reconnect
   - tests/unit/exchanges/ibkr/test_ws.py

4. feat(V2): IBKR read tools (account/positions/marketdata/streaming)
   - exchanges/ibkr/tools.py: schemas + read functions
   - tests/unit/exchanges/ibkr/test_tools.py (read paths)

5. feat(V2): IBKR write tools simple (place/amend/cancel/close)
   - exchanges/ibkr/tools.py: schemas + write functions
   - tests/unit/exchanges/ibkr/test_tools.py (write paths + leverage cap)

6. feat(V2): IBKR complex orders (bracket/OCO/OTO)
   - exchanges/ibkr/orders_complex.py
   - exchanges/ibkr/tools.py: complex tool functions
   - tests/unit/exchanges/ibkr/test_orders_complex.py

7. feat(V2): IBKR key rotation admin endpoints + scheduled health
   - exchanges/ibkr/key_rotation.py: KeyRotationManager
   - admin.py: rotate-keys/start|confirm|abort + ibkr/health
   - tests/unit/exchanges/ibkr/test_key_rotation.py

8. feat(V2): IBKR router wiring + docker secrets + setup script + docs
   - routers/ibkr.py
   - exchanges/__init__.py: build_client branch ibkr
   - __main__.py: include_router
   - scripts/ibkr_oauth_setup.py
   - docker-compose.yml: bind mount secrets
   - README.md: IBKR Setup section
```

Every commit leaves the repo green (tests passing + container buildable). `git revert` of any single commit does not break the other exchanges.

## 11. Risks & mitigations

| Risk | Likelihood | Mitigation |
|---|---|---|
| WebSocket reconnect unstable in prod | Medium | Feature flag `IBKR_WS_ENABLED=false` + HTTP snapshot fallback |
| IBKR rate limit exceeded during conid lookup bursts | Low | 1h LRU cache + retry with backoff |
| Live session token mint fails on a network blip | Medium | 3 retries with exponential backoff; circuit breaker after 5 consecutive failures |
| Order auto-confirmation wrongly confirms a critical warning | Low | Explicit whitelist of auto-confirmable warnings (RTH, no-mkt-data); everything else → error envelope |
| Key rotation leaves the system in an inconsistent state | Low | Filesystem lock + atomic swap + auto-rollback on validation failure |
| Initial OAuth setup too complex for the ops team | Medium | Interactive `ibkr_oauth_setup.py` script + detailed README section + checklist |
| Leverage cap computed on stale equity | Low | 30s equity cache, forced refresh before submitting orders > 10% of equity |

## 12. Estimate

- Dev: **6-8 days** (was 3-4 in V0; complex orders + WS + rotation add ~3 days)
- Testing: included in the commits (TDD-friendly)
- Deploy + smoke: 0.5 days
- Documentation: 0.5 days
- **Total:** ~7-9 actual working days
@@ -19,6 +19,7 @@ dependencies = [
    "eth-account>=0.13.7",
    "msgpack>=1.1.2",
    "eth-utils>=5.3.1",
    "cryptography>=43",
]

[project.scripts]

@@ -0,0 +1,132 @@
#!/usr/bin/env python3
"""IBKR OAuth 1.0a Self-Service setup helper.

Phases (run in order, providing flags as you progress):
1. python scripts/ibkr_oauth_setup.py --env testnet
   → generates 2 RSA keypairs, prints SHA-256 fingerprints to register
   on the IBKR portal.
2. (manual) Login at https://www.interactivebrokers.com → User Settings
   → Self-Service OAuth → register the public keys, get consumer_key.
3. python scripts/ibkr_oauth_setup.py --env testnet --consumer-key <K> \\
   --request-token
   → exchanges consumer_key for an unauthorized request token + URL.
4. (manual) Open the URL, approve, copy the verifier code.
5. python scripts/ibkr_oauth_setup.py --env testnet --verifier <V>
   → exchanges verifier for long-lived access_token + secret.
   Copy the printed values into .env.

Repeat for --env mainnet using your live IBKR account.
"""
from __future__ import annotations

import argparse
import hashlib
import sys
from pathlib import Path

from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa


def _gen_keypair(out: Path) -> str:
    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    pem = key.private_bytes(
        encoding=serialization.Encoding.PEM,
        format=serialization.PrivateFormat.TraditionalOpenSSL,
        encryption_algorithm=serialization.NoEncryption(),
    )
    out.write_bytes(pem)
    out.chmod(0o600)
    pub = key.public_key().public_bytes(
        encoding=serialization.Encoding.PEM,
        format=serialization.PublicFormat.SubjectPublicKeyInfo,
    )
    pub_path = out.with_suffix(out.suffix + ".pub")
    pub_path.write_bytes(pub)
    return f"SHA256:{hashlib.sha256(pub).hexdigest()}"


def cmd_init(env: str, secrets_dir: Path) -> int:
    secrets_dir.mkdir(parents=True, exist_ok=True)
    sig = secrets_dir / f"ibkr_signature_{env}.pem"
    enc = secrets_dir / f"ibkr_encryption_{env}.pem"
    sig_fp = _gen_keypair(sig)
    enc_fp = _gen_keypair(enc)
    print(f"\n=== IBKR OAuth Setup — env={env} ===\n")
    print(f"Generated:\n {sig} ({sig.stat().st_size} bytes)")
    print(f" {enc} ({enc.stat().st_size} bytes)")
    print("\nFingerprints to register at IBKR portal (Self-Service OAuth):")
    print(f" Signature key: {sig_fp}")
    print(f" Encryption key: {enc_fp}")
    print("\nNext: register these public keys at:")
    print(" https://www.interactivebrokers.com (User Settings → OAuth)")
    print("\nAlso paste in .env:")
    print(f" IBKR_SIGNATURE_KEY_PATH_{env.upper()}={sig}")
    print(f" IBKR_ENCRYPTION_KEY_PATH_{env.upper()}={enc}\n")
    return 0


def cmd_request_token(env: str, consumer_key: str) -> int:
    print(f"\n=== Step 2 — request token for {env} ===\n")
    print(f"Consumer key: {consumer_key}")
    print(
        "\nVisit this URL in a browser, log in to IBKR, authorize the app,\n"
        "and copy the displayed verifier code:\n"
    )
    print(
        f" https://www.interactivebrokers.com/sso/Authenticator?"
        f"oauth_consumer_key={consumer_key}&action=request_token\n"
    )
    print("Then re-run with: --verifier <code>\n")
    return 0


def cmd_verifier(env: str, verifier: str) -> int:
    print(f"\n=== Step 3 — exchange verifier for {env} ===\n")
    print(f"Verifier received: {verifier[:8]}...")
    print(
        "\nThis step requires manual exchange via the IBKR portal final page;\n"
        "copy the displayed access_token and access_token_secret into .env:\n"
    )
    print(f" IBKR_ACCESS_TOKEN_{env.upper()}=<paste from portal>")
    print(f" IBKR_ACCESS_TOKEN_SECRET_{env.upper()}=<paste from portal>\n")
    print("Also set:")
    print(f" IBKR_CONSUMER_KEY_{env.upper()}=<the consumer key from step 1>")
    print(" IBKR_DH_PRIME=<paste DH prime hex from portal>\n")
    return 0


def main() -> int:
    p = argparse.ArgumentParser(description=__doc__)
    p.add_argument("--env", choices=["testnet", "mainnet"], required=True)
    p.add_argument("--secrets-dir", default="secrets")
    p.add_argument("--consumer-key")
    p.add_argument("--request-token", action="store_true")
    p.add_argument("--verifier")
    p.add_argument(
        "--rotate",
        action="store_true",
        help="Generate new keypairs alongside existing (for rotation)",
    )
    args = p.parse_args()

    sec_dir = Path(args.secrets_dir)
    if args.verifier:
        return cmd_verifier(args.env, args.verifier)
    if args.consumer_key and args.request_token:
        return cmd_request_token(args.env, args.consumer_key)
    if args.rotate:
        for kind in ("signature", "encryption"):
            new = sec_dir / f"ibkr_{kind}_{args.env}.pem.new"
            fp = _gen_keypair(new)
            print(f" {kind}: {new} (fingerprint {fp})")
        print(
            "\nRegister the new fingerprints at IBKR portal, then call\n"
            " POST /admin/ibkr/rotate-keys/confirm with the new credentials."
        )
        return 0
    return cmd_init(args.env, sec_dir)


if __name__ == "__main__":
    sys.exit(main())

@@ -9,6 +9,7 @@ Boot:
"""
from __future__ import annotations

import contextlib
from contextlib import asynccontextmanager
from typing import Literal, cast

@@ -22,8 +23,10 @@ from cerbero_mcp.exchanges import build_client
from cerbero_mcp.routers import (
    alpaca,
    bybit,
    cross,
    deribit,
    hyperliquid,
    ibkr,
    macro,
    sentiment,
)

@@ -53,6 +56,11 @@ def _make_app(settings: Settings) -> FastAPI:
        try:
            yield
        finally:
            # Stop any IBKR WebSocket singletons before closing client registry
            ibkr_ws_dict = getattr(app.state, "ibkr_ws", {}) or {}
            for ws in ibkr_ws_dict.values():
                with contextlib.suppress(Exception):
                    await ws.stop()
            await app.state.registry.aclose()

    app.router.lifespan_context = lifespan

@@ -61,8 +69,10 @@ def _make_app(settings: Settings) -> FastAPI:
    app.include_router(bybit.make_router())
    app.include_router(hyperliquid.make_router())
    app.include_router(alpaca.make_router())
    app.include_router(ibkr.make_router())
    app.include_router(macro.make_router())
    app.include_router(sentiment.make_router())
    app.include_router(cross.make_router())
    app.include_router(admin.make_admin_router())

    return app

@@ -8,11 +8,20 @@ from pathlib import Path
from typing import Any, Literal

from fastapi import APIRouter, HTTPException, Query, Request
from pydantic import BaseModel, SecretStr

from cerbero_mcp.exchanges.ibkr.key_rotation import KeyRotationManager

MAX_RECORDS = 10000
DEFAULT_LIMIT = 1000


class _IBKRRotateConfirmReq(BaseModel):
    new_consumer_key: str
    new_access_token: str
    new_access_token_secret: str


def _parse_iso(value: str | None) -> datetime | None:
    if not value:
        return None

@@ -155,4 +164,81 @@ def make_admin_router() -> APIRouter:
            },
        }

    @r.post("/ibkr/rotate-keys/start")
    async def _ibkr_rotate_start(env: str, request: Request):
        if env not in ("testnet", "mainnet"):
            raise HTTPException(400, detail={"error": "invalid env"})
        settings = request.app.state.settings
        creds = settings.ibkr.credentials(env)
        mgr = KeyRotationManager(
            signature_key_path=creds["signature_key_path"],
            encryption_key_path=creds["encryption_key_path"],
        )
        rotations = getattr(request.app.state, "ibkr_rotations", None)
        if rotations is None:
            rotations = {}
            request.app.state.ibkr_rotations = rotations
        rotations[env] = mgr
        return await mgr.start()

    @r.post("/ibkr/rotate-keys/confirm")
    async def _ibkr_rotate_confirm(
        env: str, body: _IBKRRotateConfirmReq, request: Request,
    ):
        if env not in ("testnet", "mainnet"):
            raise HTTPException(400, detail={"error": "invalid env"})
        rotations = getattr(request.app.state, "ibkr_rotations", {}) or {}
        mgr = rotations.get(env)
        if mgr is None:
            raise HTTPException(409, detail={"error": "rotation not started"})

        settings = request.app.state.settings
        if env == "testnet":
            settings.ibkr.consumer_key_testnet = body.new_consumer_key
            settings.ibkr.access_token_testnet = body.new_access_token
            settings.ibkr.access_token_secret_testnet = SecretStr(body.new_access_token_secret)
        else:
            settings.ibkr.consumer_key_live = body.new_consumer_key
            settings.ibkr.access_token_live = body.new_access_token
            settings.ibkr.access_token_secret_live = SecretStr(body.new_access_token_secret)

        registry = request.app.state.registry
        registry._clients.pop(("ibkr", env), None)

        async def _validate() -> bool:
            try:
                client = await registry.get("ibkr", env)
                await client._request("GET", "/iserver/auth/status", skip_tickle=True)
                return True
            except Exception:
                return False

        try:
            return await mgr.confirm(validate=_validate)
        finally:
            rotations.pop(env, None)

    @r.post("/ibkr/rotate-keys/abort")
    async def _ibkr_rotate_abort(env: str, request: Request):
        rotations = getattr(request.app.state, "ibkr_rotations", {}) or {}
        mgr = rotations.pop(env, None)
        if mgr is None:
            return {"aborted": False, "reason": "no rotation in progress"}
        return await mgr.abort()

    @r.post("/ibkr/health")
    async def _ibkr_health(request: Request):
        registry = request.app.state.registry
        out: dict[str, Any] = {}
        for env in ("testnet", "mainnet"):
            try:
                client = await registry.get("ibkr", env)
                status = await client._request(
                    "GET", "/iserver/auth/status", skip_tickle=True
                )
                out[env] = {"healthy": True, "status": status}
            except Exception as e:
                out[env] = {"healthy": False, "error": str(e)[:200]}
        return out

    return r

@@ -0,0 +1,53 @@
"""Shared OHLCV candle model + validator for exchange historical endpoints."""
from __future__ import annotations

from typing import Any

from fastapi import HTTPException
from pydantic import BaseModel, ConfigDict, ValidationError, model_validator


class Candle(BaseModel):
    model_config = ConfigDict(extra="ignore")

    timestamp: int
    open: float
    high: float
    low: float
    close: float
    volume: float

    @model_validator(mode="after")
    def _check(self) -> Candle:
        if self.timestamp <= 0:
            raise ValueError(f"timestamp must be > 0, got {self.timestamp}")
        if self.volume < 0:
            raise ValueError(f"volume must be >= 0, got {self.volume}")
        if self.high < max(self.open, self.close, self.low):
            raise ValueError(
                f"high {self.high} < max(open={self.open}, "
                f"close={self.close}, low={self.low})"
            )
        if self.low > min(self.open, self.close, self.high):
            raise ValueError(
                f"low {self.low} > min(open={self.open}, "
                f"close={self.close}, high={self.high})"
            )
        return self


def validate_candles(raw: list[dict[str, Any]]) -> list[dict[str, Any]]:
    """Coerce upstream rows into validated candle dicts, sorted by timestamp.

    Raises HTTPException(502) if any row violates OHLC consistency or schema —
    upstream data corruption is mapped to a retryable error envelope.
    """
    try:
        candles = [Candle.model_validate(row) for row in raw]
    except ValidationError as e:
        raise HTTPException(
            status_code=502,
            detail=f"upstream returned malformed candle: {e.errors()[0]['msg']}",
        ) from e
    candles.sort(key=lambda c: c.timestamp)
    return [c.model_dump() for c in candles]

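The OHLC consistency rules the validator enforces can be restated as one pure predicate (a plain-python restatement for illustration, separate from the pydantic model above):

```python
def ohlc_consistent(o: float, h: float, l: float, c: float) -> bool:
    """A bar is consistent when high bounds all prices and low floors them."""
    return h >= max(o, c, l) and l <= min(o, c, h)
```

For instance `(open=100, high=105, low=99, close=103)` passes, while `high=102` with `close=103` would be rejected as upstream corruption.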
@@ -15,9 +15,10 @@ async def build_client(
        from cerbero_mcp.exchanges.deribit.client import DeribitClient

        url = settings.deribit.url_testnet if env == "testnet" else settings.deribit.url_live
+       cid, csec = settings.deribit.credentials(env)
        return DeribitClient(
-           client_id=settings.deribit.client_id,
-           client_secret=settings.deribit.client_secret.get_secret_value(),
+           client_id=cid,
+           client_secret=csec,
            testnet=(env == "testnet"),
            base_url_override=url,
        )

@@ -71,4 +72,24 @@ async def build_client(
            cryptopanic_key=settings.sentiment.cryptopanic_key.get_secret_value(),
            lunarcrush_key=settings.sentiment.lunarcrush_key.get_secret_value(),
        )
    if exchange == "ibkr":
        from cerbero_mcp.exchanges.ibkr.client import IBKRClient
        from cerbero_mcp.exchanges.ibkr.oauth import OAuth1aSigner

        creds = settings.ibkr.credentials(env)
        url = settings.ibkr.url_testnet if env == "testnet" else settings.ibkr.url_live
        signer = OAuth1aSigner(
            consumer_key=creds["consumer_key"],
            access_token=creds["access_token"],
            access_token_secret=creds["access_token_secret"],
            signature_key_path=creds["signature_key_path"],
            encryption_key_path=creds["encryption_key_path"],
            dh_prime=creds["dh_prime"],
        )
        return IBKRClient(
            signer=signer,
            account_id=creds["account_id"],
            paper=(env == "testnet"),
            base_url=url,
        )
    raise ValueError(f"unsupported exchange: {exchange}")

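The `credentials(env)` lookup used above follows the prefer-env-specific-then-base fallback described in the settings tests and the `.env` example; a sketch of that logic (the function and key names are assumptions mirroring the `.env` variables, not the real `IBKRSettings`/`DeribitSettings` code):

```python
def resolve_credentials(env: str, cfg: dict[str, str]) -> tuple[str, str]:
    """Env-specific pair wins; otherwise fall back to the shared base pair."""
    suffix = "TESTNET" if env == "testnet" else "LIVE"
    cid = cfg.get(f"DERIBIT_CLIENT_ID_{suffix}") or cfg.get("DERIBIT_CLIENT_ID")
    secret = cfg.get(f"DERIBIT_CLIENT_SECRET_{suffix}") or cfg.get("DERIBIT_CLIENT_SECRET")
    if not cid or not secret:
        raise ValueError(f"missing Deribit credentials for env={env}")
    return cid, secret
```
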
@@ -20,6 +20,7 @@ from typing import Any

import httpx

+from cerbero_mcp.common.candles import validate_candles
from cerbero_mcp.common.http import async_client

# ── Endpoint base ────────────────────────────────────────────────
@@ -301,9 +302,17 @@ class AlpacaClient:

        bars_dict = (data or {}).get("bars") or {}
        rows = bars_dict.get(symbol) or []
-       bars = [
+
+       def _iso_to_ms(ts: str | int | None) -> int | None:
+           if ts is None or isinstance(ts, int):
+               return ts
+           return int(_dt.datetime.fromisoformat(
+               ts.replace("Z", "+00:00")
+           ).timestamp() * 1000)
+
+       candles = validate_candles([
            {
-               "timestamp": b.get("t"),
+               "timestamp": _iso_to_ms(b.get("t")),
                "open": b.get("o"),
                "high": b.get("h"),
                "low": b.get("l"),
@@ -311,12 +320,12 @@ class AlpacaClient:
                "volume": b.get("v"),
            }
            for b in rows
-       ]
+       ])
        return {
            "symbol": symbol,
            "asset_class": ac,
            "interval": interval,
-           "bars": bars,
+           "candles": candles,
        }

    async def get_snapshot(self, symbol: str) -> dict:

@@ -22,6 +22,7 @@ import httpx

from cerbero_mcp.common import indicators as ind
from cerbero_mcp.common import microstructure as micro
+from cerbero_mcp.common.candles import validate_candles

BASE_MAINNET = "https://api.bybit.com"
BASE_TESTNET = "https://api-testnet.bybit.com"
@@ -254,18 +255,17 @@ class BybitClient:
            params["end"] = end
        resp = await self._request_public("GET", "/v5/market/kline", params=params)
        rows = (resp.get("result") or {}).get("list") or []
-       rows_sorted = sorted(rows, key=lambda r: int(r[0]))
-       candles = [
+       candles = validate_candles([
            {
                "timestamp": int(r[0]),
-               "open": float(r[1]),
-               "high": float(r[2]),
-               "low": float(r[3]),
-               "close": float(r[4]),
-               "volume": float(r[5]),
+               "open": r[1],
+               "high": r[2],
+               "low": r[3],
+               "close": r[4],
+               "volume": r[5],
            }
-           for r in rows_sorted
-       ]
+           for r in rows
+       ])
        return {"symbol": symbol, "candles": candles}

    async def get_indicators(

@@ -0,0 +1,146 @@
"""Cross-exchange historical aggregator.

Fan-out a canonical (symbol, asset_class, interval, start, end) request to
every active exchange that supports the pair, then merge the results into
a single consensus candle series with per-bar divergence metrics.
"""
from __future__ import annotations

import asyncio
import datetime as _dt
from typing import Any, Literal, Protocol

from fastapi import HTTPException

from cerbero_mcp.exchanges.cross.consensus import merge_candles
from cerbero_mcp.exchanges.cross.symbol_map import (
    get_sources,
    supported_intervals,
    to_native_interval,
    to_native_symbol,
)


Environment = Literal["testnet", "mainnet"]


class _Registry(Protocol):
    async def get(self, exchange: str, env: Environment) -> Any: ...


def _iso_to_ms(s: str) -> int:
    return int(_dt.datetime.fromisoformat(
        s.replace("Z", "+00:00")
    ).timestamp() * 1000)


async def _call_bybit(client: Any, sym: str, interval: str,
                      start: str, end: str) -> dict[str, Any]:
    resp: dict[str, Any] = await client.get_historical(
        symbol=sym, category="linear", interval=interval,
        start=_iso_to_ms(start), end=_iso_to_ms(end),
    )
    return resp


async def _call_hyperliquid(client: Any, sym: str, interval: str,
                            start: str, end: str) -> dict[str, Any]:
    resp: dict[str, Any] = await client.get_historical(
        instrument=sym, start_date=start, end_date=end, resolution=interval,
    )
    return resp


async def _call_deribit(client: Any, sym: str, interval: str,
                        start: str, end: str) -> dict[str, Any]:
    resp: dict[str, Any] = await client.get_historical(
        instrument=sym, start_date=start, end_date=end, resolution=interval,
    )
    return resp


async def _call_alpaca(client: Any, sym: str, interval: str,
                       start: str, end: str) -> dict[str, Any]:
    resp: dict[str, Any] = await client.get_bars(
        symbol=sym, asset_class="stocks", interval=interval,
        start=start, end=end,
    )
    return resp


_DISPATCH = {
    "bybit": _call_bybit,
    "hyperliquid": _call_hyperliquid,
    "deribit": _call_deribit,
    "alpaca": _call_alpaca,
}


class CrossClient:
    def __init__(self, registry: _Registry, *, env: Environment):
        self._registry = registry
        self._env = env

    async def _fetch_one(
        self, exchange: str, native_sym: str, native_interval: str,
        start: str, end: str,
    ) -> tuple[str, list[dict[str, Any]] | Exception]:
        try:
            client = await self._registry.get(exchange, self._env)
            resp = await _DISPATCH[exchange](
                client, native_sym, native_interval, start, end,
            )
            return exchange, resp.get("candles", [])
        except Exception as e:  # noqa: BLE001
            return exchange, e

    async def get_historical(
        self, *, symbol: str, asset_class: str, interval: str,
        start_date: str, end_date: str,
    ) -> dict[str, Any]:
        sources = get_sources(asset_class, symbol)
        if not sources:
            raise HTTPException(
                status_code=400,
                detail=f"unsupported symbol/asset_class: {symbol} ({asset_class})",
            )
        if interval not in supported_intervals():
            raise HTTPException(
                status_code=400,
                detail=f"unsupported interval: {interval}; "
                       f"supported: {supported_intervals()}",
            )

        tasks = [
            self._fetch_one(
                ex,
                to_native_symbol(asset_class, symbol, ex),
                to_native_interval(interval, ex),
                start_date, end_date,
            )
            for ex in sources
        ]
        results = await asyncio.gather(*tasks)

        by_source: dict[str, list[dict[str, Any]]] = {}
        failed: list[dict[str, str]] = []
        for ex, payload in results:
            if isinstance(payload, Exception):
                failed.append({"exchange": ex, "error": f"{type(payload).__name__}: {payload}"})
            else:
                by_source[ex] = payload

        if not by_source:
            raise HTTPException(
                status_code=502,
                detail={"error": "all sources failed", "failed_sources": failed},
            )

        return {
            "symbol": symbol.upper(),
            "asset_class": asset_class,
            "interval": interval,
            "candles": merge_candles(by_source),
            "sources_used": sorted(by_source.keys()),
            "failed_sources": failed,
        }

@@ -0,0 +1,37 @@
"""Pure consensus aggregation: merge per-source OHLCV candles by timestamp.

The output is a single time-series with the median OHLC across sources,
mean volume, the contributing source count, and a divergence % computed
on the close range. div_pct gives a quick quality signal: 0 means full
agreement; a value above your chosen threshold means at least one source
is suspect.
"""
from __future__ import annotations

from collections import defaultdict
from statistics import median
from typing import Any


def merge_candles(by_source: dict[str, list[dict[str, Any]]]) -> list[dict[str, Any]]:
    grouped: dict[int, list[dict[str, Any]]] = defaultdict(list)
    for candles in by_source.values():
        for c in candles:
            grouped[int(c["timestamp"])].append(c)

    out: list[dict[str, Any]] = []
    for ts in sorted(grouped):
        rows = grouped[ts]
        closes = [float(r["close"]) for r in rows]
        med_close = float(median(closes))
        div_pct = (max(closes) - min(closes)) / med_close if med_close else 0.0
        out.append({
            "timestamp": ts,
            "open": float(median(float(r["open"]) for r in rows)),
            "high": float(median(float(r["high"]) for r in rows)),
            "low": float(median(float(r["low"]) for r in rows)),
            "close": med_close,
            "volume": sum(float(r["volume"]) for r in rows) / len(rows),
            "sources": len(rows),
            "div_pct": div_pct,
        })
    return out
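To make the aggregation concrete, here is the merge logic rerun on two fabricated sources (a condensed, self-contained copy of `merge_candles`; the OHLCV numbers are invented for illustration):

```python
from collections import defaultdict
from statistics import median


def merge_candles(by_source):
    # Group candles from every source by timestamp, then take the median
    # close, mean volume, and close-range divergence per bucket.
    grouped = defaultdict(list)
    for candles in by_source.values():
        for c in candles:
            grouped[int(c["timestamp"])].append(c)
    out = []
    for ts in sorted(grouped):
        rows = grouped[ts]
        closes = [float(r["close"]) for r in rows]
        med_close = float(median(closes))
        out.append({
            "timestamp": ts,
            "close": med_close,
            "volume": sum(float(r["volume"]) for r in rows) / len(rows),
            "sources": len(rows),
            "div_pct": (max(closes) - min(closes)) / med_close if med_close else 0.0,
        })
    return out


merged = merge_candles({
    "bybit": [{"timestamp": 1700000000, "close": 100.0, "volume": 10.0}],
    "deribit": [{"timestamp": 1700000000, "close": 102.0, "volume": 30.0}],
})
# median close 101.0, mean volume 20.0, 2 sources, div_pct = 2/101 (~1.98%)
```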
@@ -0,0 +1,60 @@
"""Routing table: canonical (asset_class, symbol, interval) → per-exchange native.

Crypto canonical symbols default to USD/USDT-quoted perpetuals on the most
liquid pair available. Equities currently route to Alpaca only — IBKR is
omitted from the cross MVP because its bars endpoint takes a relative
period instead of (start, end).
"""
from __future__ import annotations

AssetClass = str

_CRYPTO_SYMBOLS: dict[str, dict[str, str]] = {
    "BTC": {"bybit": "BTCUSDT", "hyperliquid": "BTC", "deribit": "BTC-PERPETUAL"},
    "ETH": {"bybit": "ETHUSDT", "hyperliquid": "ETH", "deribit": "ETH-PERPETUAL"},
    "SOL": {"bybit": "SOLUSDT", "hyperliquid": "SOL"},
}

_STOCK_SYMBOLS: dict[str, dict[str, str]] = {
    "AAPL": {"alpaca": "AAPL"},
    "SPY": {"alpaca": "SPY"},
    "QQQ": {"alpaca": "QQQ"},
    "TSLA": {"alpaca": "TSLA"},
    "NVDA": {"alpaca": "NVDA"},
}

_SYMBOLS: dict[AssetClass, dict[str, dict[str, str]]] = {
    "crypto": _CRYPTO_SYMBOLS,
    "stocks": _STOCK_SYMBOLS,
}

_INTERVALS: dict[str, dict[str, str]] = {
    "1m": {"bybit": "1", "hyperliquid": "1m", "deribit": "1m", "alpaca": "1m"},
    "5m": {"bybit": "5", "hyperliquid": "5m", "deribit": "5m", "alpaca": "5m"},
    "15m": {"bybit": "15", "hyperliquid": "15m", "deribit": "15m", "alpaca": "15m"},
    "1h": {"bybit": "60", "hyperliquid": "1h", "deribit": "1h", "alpaca": "1h"},
    "4h": {"bybit": "240", "hyperliquid": "4h", "deribit": "4h", "alpaca": "4h"},
    "1d": {"bybit": "D", "hyperliquid": "1d", "deribit": "1d", "alpaca": "1d"},
}


def get_sources(asset_class: AssetClass, symbol: str) -> list[str]:
    table = _SYMBOLS.get(asset_class, {})
    mapping = table.get(symbol.upper())
    if mapping is None:
        return []
    return list(mapping.keys())


def to_native_symbol(
    asset_class: AssetClass, symbol: str, exchange: str
) -> str:
    return _SYMBOLS[asset_class][symbol.upper()][exchange]


def to_native_interval(interval: str, exchange: str) -> str:
    return _INTERVALS[interval][exchange]


def supported_intervals() -> list[str]:
    return list(_INTERVALS.keys())
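Resolving one request walks the two tables: canonical symbol → list of sources, then per-exchange native symbol and interval. A self-contained rerun with a trimmed copy of the tables (the trimming is mine; values match the module above):

```python
_CRYPTO_SYMBOLS = {
    "BTC": {"bybit": "BTCUSDT", "hyperliquid": "BTC", "deribit": "BTC-PERPETUAL"},
    "SOL": {"bybit": "SOLUSDT", "hyperliquid": "SOL"},
}
_INTERVALS = {"1h": {"bybit": "60", "hyperliquid": "1h", "deribit": "1h"}}


def get_sources(symbol):
    # Unknown symbols resolve to no sources (the API layer turns that into a 400).
    return list(_CRYPTO_SYMBOLS.get(symbol.upper(), {}).keys())


plan = [
    (ex, _CRYPTO_SYMBOLS["BTC"][ex], _INTERVALS["1h"][ex])
    for ex in get_sources("btc")
]
# e.g. ("bybit", "BTCUSDT", "60"): bybit expects bare-minute interval codes
```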
@@ -0,0 +1,28 @@
"""Pydantic schemas + thin tool wrappers for the /mcp-cross router."""
from __future__ import annotations

from typing import Literal

from pydantic import BaseModel

from cerbero_mcp.exchanges.cross.client import CrossClient

AssetClass = Literal["crypto", "stocks"]


class GetHistoricalReq(BaseModel):
    symbol: str
    asset_class: AssetClass = "crypto"
    interval: str = "1h"
    start_date: str
    end_date: str


async def get_historical(client: CrossClient, params: GetHistoricalReq) -> dict:
    return await client.get_historical(
        symbol=params.symbol,
        asset_class=params.asset_class,
        interval=params.interval,
        start_date=params.start_date,
        end_date=params.end_date,
    )
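In the real schema, pydantic rejects any `asset_class` outside the `Literal` at validation time. The allowed values are introspectable with stdlib typing alone; the checker function below is my stand-in for what pydantic does, not project code:

```python
from typing import Literal, get_args

AssetClass = Literal["crypto", "stocks"]


def check_asset_class(value: str) -> str:
    # Mirrors the Literal-field validation: reject anything outside the
    # declared alternatives.
    if value not in get_args(AssetClass):
        raise ValueError(f"unsupported asset_class: {value}")
    return value


assert check_asset_class("crypto") == "crypto"
```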
@@ -1,15 +1,37 @@
from __future__ import annotations

import contextlib
import json
import time
from dataclasses import dataclass, field
from typing import Any

from fastapi import HTTPException

from cerbero_mcp.common import indicators as ind
from cerbero_mcp.common import microstructure as micro
from cerbero_mcp.common import options as opt
from cerbero_mcp.common.candles import validate_candles
from cerbero_mcp.common.http import async_client


def _parse_deribit_response(resp: Any) -> dict[str, Any]:
    """Map Deribit upstream errors to a clean HTTP 502 (retryable) instead of
    leaking JSONDecodeError when the body is HTML (e.g. Cloudflare 5xx page)."""
    if resp.status_code >= 500:
        raise HTTPException(
            status_code=502,
            detail=f"Deribit upstream HTTP {resp.status_code}",
        )
    try:
        data: dict[str, Any] = resp.json()
        return data
    except json.JSONDecodeError as e:
        raise HTTPException(
            status_code=502,
            detail=f"Deribit upstream returned non-JSON (status {resp.status_code})",
        ) from e


BASE_LIVE = "https://www.deribit.com/api/v2"
BASE_TESTNET = "https://test.deribit.com/api/v2"
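The guard above can be exercised without FastAPI or a live endpoint. A sketch with a stand-in response object and a plain exception in place of `HTTPException` (both stand-ins are mine, not the project's):

```python
import json
from typing import Any


class UpstreamError(Exception):
    # Stand-in for HTTPException(502, ...) so the sketch stays stdlib-only.
    pass


class FakeResponse:
    def __init__(self, status_code: int, body: str):
        self.status_code = status_code
        self._body = body

    def json(self) -> Any:
        return json.loads(self._body)


def parse_upstream(resp: FakeResponse) -> dict:
    # Same shape as _parse_deribit_response: 5xx and non-JSON bodies both
    # become one retryable "upstream broken" error instead of leaking
    # JSONDecodeError to the caller.
    if resp.status_code >= 500:
        raise UpstreamError(f"upstream HTTP {resp.status_code}")
    try:
        return resp.json()
    except json.JSONDecodeError as e:
        raise UpstreamError(f"non-JSON body (status {resp.status_code})") from e


ok = parse_upstream(FakeResponse(200, '{"result": 1}'))
```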
@@ -23,6 +45,10 @@ RESOLUTION_MAP = {
}


class DeribitAuthError(Exception):
    """Deribit auth failed (bad credentials, missing scope, env mismatch)."""


@dataclass
class DeribitClient:
    client_id: str
@@ -49,7 +75,14 @@ class DeribitClient:
        }
        async with async_client(timeout=15.0) as http:
            resp = await http.get(url, params=params)
-            data = resp.json()
+            data = _parse_deribit_response(resp)
            if "result" not in data:
                error = data.get("error", {})
                msg = error.get("message", str(data)) if isinstance(error, dict) else str(error)
                code = error.get("code") if isinstance(error, dict) else None
                raise DeribitAuthError(
                    f"Deribit auth failed (code={code}, env={'testnet' if self.testnet else 'mainnet'}): {msg}"
                )
            result = data["result"]
            self._token = result["access_token"]
            self._token_expires_at = time.monotonic() + result.get("expires_in", 900) - 30
@@ -63,7 +96,10 @@ class DeribitClient:
    async def _request(self, method: str, params: dict[str, Any] | None = None) -> dict:
        is_private = method.startswith("private/")
        if is_private:
-            await self._get_token()
+            try:
+                await self._get_token()
+            except DeribitAuthError as e:
+                return {"result": None, "error": str(e)}

        url = f"{self.base_url}/{method}"
        request_params = dict(params) if params else {}
@@ -73,7 +109,7 @@ class DeribitClient:

        async with async_client(timeout=15.0) as http:
            resp = await http.get(url, params=request_params, headers=headers)
-            data = resp.json()
+            data = _parse_deribit_response(resp)

            if "result" not in data:
                error = data.get("error", {})
@@ -85,12 +121,12 @@ class DeribitClient:
                    await self._authenticate()
                    headers["Authorization"] = f"Bearer {self._token}"
                    resp = await http.get(url, params=request_params, headers=headers)
-                    data = resp.json()
+                    data = _parse_deribit_response(resp)
                    if "result" in data:
-                        return data  # type: ignore[no-any-return]
+                        return data
                return {"result": None, "error": error_msg}

-        return data  # type: ignore[no-any-return]
+        return data

    # ── Read tools ───────────────────────────────────────────────
@@ -307,12 +343,12 @@ class DeribitClient:
        r = raw.get("result")
        if not r:
            return {
-                "equity": 0,
-                "balance": 0,
-                "margin_balance": 0,
-                "available_funds": 0,
-                "unrealized_pnl": 0,
-                "total_pnl": 0,
+                "equity": None,
+                "balance": None,
+                "margin_balance": None,
+                "available_funds": None,
+                "unrealized_pnl": None,
+                "total_pnl": None,
                "testnet": self.testnet,
                "error": raw.get("error", "no result"),
            }
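The switch from `0` to `None` above matters for consumers: `0` is a legitimate balance, so it cannot double as a "no data" marker. A tiny illustration of the distinction (the helper name and states are mine, for illustration only):

```python
def equity_state(summary: dict) -> str:
    # None → upstream gave no result; 0.0 → account genuinely empty.
    equity = summary.get("equity")
    if equity is None:
        return "unknown"
    return "empty" if equity == 0 else "funded"


states = [
    equity_state({"equity": None, "error": "no result"}),
    equity_state({"equity": 0.0}),
    equity_state({"equity": 12.5}),
]
```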
@@ -384,24 +420,24 @@ class DeribitClient:
            },
        )
        r = raw.get("result") or {}
-        candles = []
        ticks = r.get("ticks", []) or []
        opens = r.get("open", []) or []
        highs = r.get("high", []) or []
        lows = r.get("low", []) or []
        closes = r.get("close", []) or []
        volumes = r.get("volume", []) or []
-        for idx, ts in enumerate(ticks):
-            if idx >= min(len(opens), len(highs), len(lows), len(closes), len(volumes)):
-                break
-            candles.append({
-                "timestamp": ts,
-                "open": opens[idx],
-                "high": highs[idx],
-                "low": lows[idx],
-                "close": closes[idx],
-                "volume": volumes[idx],
-            })
+        n = min(len(ticks), len(opens), len(highs), len(lows), len(closes), len(volumes))
+        candles = validate_candles([
+            {
+                "timestamp": ticks[i],
+                "open": opens[i],
+                "high": highs[i],
+                "low": lows[i],
+                "close": closes[i],
+                "volume": volumes[i],
+            }
+            for i in range(n)
+        ])
        return {"candles": candles}

    async def get_dvol(
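Deribit returns chart data as parallel columns; the rewrite above truncates to the shortest column before zipping, so a ragged response can never raise IndexError mid-build. The same transform in isolation (the helper name is mine, and `validate_candles` is left out since it lives elsewhere in the repo):

```python
def columns_to_candles(r: dict) -> list[dict]:
    # Truncate to the shortest column so ragged upstream arrays cannot
    # produce an IndexError while building rows.
    cols = [r.get(k) or [] for k in ("ticks", "open", "high", "low", "close", "volume")]
    n = min(len(c) for c in cols)
    ticks, opens, highs, lows, closes, volumes = cols
    return [
        {
            "timestamp": ticks[i],
            "open": opens[i],
            "high": highs[i],
            "low": lows[i],
            "close": closes[i],
            "volume": volumes[i],
        }
        for i in range(n)
    ]


candles = columns_to_candles({
    "ticks": [1, 2, 3],  # one more tick than the other columns
    "open": [10, 11], "high": [12, 13], "low": [9, 10],
    "close": [11, 12], "volume": [5, 6],
})
```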
@@ -27,6 +27,7 @@ from eth_account.messages import encode_typed_data
from eth_utils import keccak, to_hex

from cerbero_mcp.common import indicators as ind
+from cerbero_mcp.common.candles import validate_candles
from cerbero_mcp.common.http import async_client

BASE_LIVE = "https://api.hyperliquid.xyz"
@@ -408,18 +409,17 @@ class HyperliquidClient:
                },
            }
        )
-        candles = []
-        for c in data:
-            candles.append(
-                {
-                    "timestamp": c.get("t", 0),
-                    "open": float(c.get("o", 0)),
-                    "high": float(c.get("h", 0)),
-                    "low": float(c.get("l", 0)),
-                    "close": float(c.get("c", 0)),
-                    "volume": float(c.get("v", 0)),
-                }
-            )
+        candles = validate_candles([
+            {
+                "timestamp": c.get("t"),
+                "open": c.get("o"),
+                "high": c.get("h"),
+                "low": c.get("l"),
+                "close": c.get("c"),
+                "volume": c.get("v"),
+            }
+            for c in data
+        ])
        return {"candles": candles}

    async def get_open_orders(self) -> list[dict[str, Any]]:
@@ -0,0 +1,433 @@
"""IBKR Client Portal Web API client (REST httpx + OAuth1a)."""
from __future__ import annotations

import logging
import time
from collections import OrderedDict
from dataclasses import dataclass, field
from typing import Any

import httpx

from cerbero_mcp.common.candles import validate_candles
from cerbero_mcp.common.http import async_client
from cerbero_mcp.exchanges.ibkr.oauth import (
    IBKRAuthError,
    OAuth1aSigner,
    _percent_encode,
)

logger = logging.getLogger(__name__)


class IBKRError(Exception):
    """Generic IBKR API error (non-auth)."""


_TICKLE_INTERVAL_S = 240.0  # tickle if last call > 4min ago

# Mapping asset_class (cerbero/MCP convention) → IBKR secType.
_SEC_TYPE_MAP: dict[str, str] = {
    "stocks": "STK",
    "options": "OPT",
    "futures": "FUT",
    "forex": "CASH",
}


@dataclass
class IBKRClient:
    signer: OAuth1aSigner
    account_id: str
    paper: bool = True
    base_url: str = "https://api.ibkr.com/v1/api"

    _conid_cache: OrderedDict[str, int] = field(
        default_factory=OrderedDict, init=False, repr=False
    )
    _last_request_at: float = field(default=0.0, init=False, repr=False)
    _http: httpx.AsyncClient | None = field(default=None, init=False, repr=False)

    _CONID_CACHE_MAX = 1024

    def __post_init__(self) -> None:
        # IBKR Client Portal gateway latency is higher than crypto exchanges
        # (lookup roundtrip + session validation); 30s matches Alpaca's choice.
        self._http = async_client(timeout=30.0)

    async def aclose(self) -> None:
        if self._http and not self._http.is_closed:
            await self._http.aclose()

    async def health(self) -> dict[str, Any]:
        return {"status": "ok", "paper": self.paper}

    def is_testnet(self) -> dict[str, Any]:
        return {"testnet": self.paper, "base_url": self.base_url}
    async def _build_auth_header(self, method: str, url: str) -> str:
        await self.signer.get_live_session_token(base_url=self.base_url)
        params = self.signer.make_oauth_params()
        params["oauth_signature_method"] = "HMAC-SHA256"
        sig = self.signer.sign_with_lst(method, url, params)
        params["oauth_signature"] = sig
        return "OAuth realm=\"limited_poa\", " + ", ".join(
            f'{k}="{_percent_encode(v)}"' for k, v in sorted(params.items())
        )

    async def _maybe_tickle(self) -> None:
        if time.monotonic() - self._last_request_at < _TICKLE_INTERVAL_S:
            return
        if self._http is None:  # pragma: no cover
            return
        try:
            url = f"{self.base_url}/tickle"
            auth = await self._build_auth_header("POST", url)
            await self._http.post(url, headers={"Authorization": auth})
        except Exception as exc:
            # Best-effort: failure shouldn't block the real request, but log
            # so misconfigured signer / dead session aren't invisible.
            logger.debug("ibkr tickle best-effort failed: %s", exc)

    async def _force_lst_refresh(self) -> None:
        """Invalidate cached LST and remint on next call."""
        self.signer._live_session_token = None
        self.signer._lst_expires_at = 0.0

    async def _request(
        self,
        method: str,
        path: str,
        *,
        params: dict[str, Any] | None = None,
        json_body: dict[str, Any] | None = None,
        skip_tickle: bool = False,
        _retried_auth: bool = False,
    ) -> Any:
        if self._http is None:  # pragma: no cover — set in __post_init__
            raise IBKRError("http client not initialized")
        if not skip_tickle:
            await self._maybe_tickle()
        url = f"{self.base_url}{path}"
        auth = await self._build_auth_header(method, url)
        clean_params = (
            {k: v for k, v in params.items() if v is not None}
            if params else None
        )
        resp = await self._http.request(
            method, url,
            params=clean_params or None,
            json=json_body,
            headers={"Authorization": auth, "User-Agent": "cerbero-mcp/2.0"},
        )
        self._last_request_at = time.monotonic()
        if resp.status_code == 401 and not _retried_auth:
            # Retry once with fresh LST (per spec: IBKR_AUTH_FAILED retryable)
            logger.info("ibkr 401 on %s %s — refreshing LST and retrying once", method, path)
            await self._force_lst_refresh()
            return await self._request(
                method, path, params=params, json_body=json_body,
                skip_tickle=skip_tickle, _retried_auth=True,
            )
        if resp.status_code == 401:
            raise IBKRAuthError(f"401 on {method} {path} (after retry): {resp.text[:200]}")
        if resp.status_code == 429:
            raise IBKRError(f"IBKR_RATE_LIMITED: {resp.text[:200]}")
        if resp.status_code >= 500:
            raise IBKRError(f"IBKR_SERVER_ERROR status={resp.status_code}")
        if resp.status_code >= 400:
            raise IBKRError(f"IBKR_HTTP_{resp.status_code}: {resp.text[:300]}")
        if not resp.content:
            return {}
        return resp.json()

    async def get_account(self) -> dict:
        return await self._request("GET", f"/portfolio/{self.account_id}/summary")
    # ── Conid resolution ────────────────────────────────────────

    async def resolve_conid(self, symbol: str, sec_type: str = "STK") -> int:
        key = f"{sec_type}:{symbol}"
        if key in self._conid_cache:
            self._conid_cache.move_to_end(key)
            return self._conid_cache[key]
        result = await self._request(
            "GET", "/trsrv/secdef/search",
            params={"symbol": symbol, "secType": sec_type},
        )
        if not result or not isinstance(result, list):
            raise IBKRError(f"IBKR_CONID_NOT_FOUND: {symbol}/{sec_type}")
        first = result[0]
        if not isinstance(first, dict) or "conid" not in first:
            raise IBKRError(
                f"IBKR_CONID_NOT_FOUND: {symbol}/{sec_type} (malformed response)"
            )
        conid = int(first["conid"])
        self._conid_cache[key] = conid
        if len(self._conid_cache) > self._CONID_CACHE_MAX:
            self._conid_cache.popitem(last=False)
        return conid

    # ── Positions / orders / activities ─────────────────────────

    async def get_positions(self, page: int = 0) -> list[dict]:
        data = await self._request(
            "GET", f"/portfolio/{self.account_id}/positions/{page}"
        )
        return list(data) if isinstance(data, list) else []

    async def get_open_orders(self) -> list[dict]:
        data = await self._request(
            "GET", "/iserver/account/orders",
            params={"filters": "Submitted,PreSubmitted"},
        )
        if isinstance(data, dict):
            return list(data.get("orders") or [])
        return list(data) if isinstance(data, list) else []

    async def get_activities(self, days: int = 7) -> list[dict]:
        days = max(1, min(days, 90))
        data = await self._request(
            "GET", "/iserver/account/trades", params={"days": days},
        )
        return list(data) if isinstance(data, list) else []

    # ── Market data ─────────────────────────────────────────────

    _SNAPSHOT_FIELDS = "31,84,86,7295,7296"  # last,bid,ask,bid_size,ask_size

    async def get_ticker(self, symbol: str, asset_class: str = "stocks") -> dict:
        sec_type = _SEC_TYPE_MAP.get(asset_class.lower(), "STK")
        conid = await self.resolve_conid(symbol, sec_type)
        data = await self._request(
            "GET", "/iserver/marketdata/snapshot",
            params={"conids": str(conid), "fields": self._SNAPSHOT_FIELDS},
        )
        if not data or not isinstance(data, list):
            raise IBKRError("IBKR_NO_MARKET_DATA_SUBSCRIPTION")
        row = data[0]

        def _f(k: str) -> float | None:
            v = row.get(k)
            try:
                return float(v) if v not in (None, "") else None
            except (TypeError, ValueError):
                return None

        return {
            "symbol": symbol,
            "asset_class": asset_class,
            "last_price": _f("31"),
            "bid": _f("84"),
            "ask": _f("86"),
            "bid_size": _f("7295"),
            "ask_size": _f("7296"),
        }

    async def get_bars(
        self, symbol: str, asset_class: str = "stocks",
        period: str = "1d", bar: str = "5min",
    ) -> dict:
        sec_type = _SEC_TYPE_MAP.get(asset_class.lower(), "STK")
        conid = await self.resolve_conid(symbol, sec_type)
        data = await self._request(
            "GET", "/iserver/marketdata/history",
            params={"conid": str(conid), "period": period, "bar": bar},
        )
        rows = (data or {}).get("data") or []
        candles = validate_candles([
            {
                "timestamp": r.get("t"),
                "open": r.get("o"),
                "high": r.get("h"),
                "low": r.get("l"),
                "close": r.get("c"),
                "volume": r.get("v"),
            }
            for r in rows
        ])
        return {
            "symbol": symbol,
            "asset_class": asset_class,
            "interval": bar,
            "candles": candles,
        }

    async def get_option_chain(
        self, underlying: str, expiry: str | None = None
    ) -> dict:
        conid = await self.resolve_conid(underlying, "STK")
        params: dict[str, Any] = {"conid": str(conid), "secType": "OPT"}
        if expiry:
            params["month"] = expiry  # IBKR format: "JAN26"
        strikes = await self._request(
            "GET", "/iserver/secdef/strikes", params=params,
        )
        return {
            "underlying": underlying,
            "expiry": expiry,
            "strikes": strikes,
        }

    async def search_contracts(
        self, symbol: str, sec_type: str = "STK"
    ) -> list[dict]:
        data = await self._request(
            "GET", "/trsrv/secdef/search",
            params={"symbol": symbol, "secType": sec_type},
        )
        return list(data) if isinstance(data, list) else []
    # ── Order writes ────────────────────────────────────────────

    # Auto-confirm policy: any IBKR warning that is NOT in _CRITICAL_WARNINGS
    # is auto-confirmed up to _AUTO_CONFIRM_MAX_CYCLES times. Hardening to a
    # strict whitelist (allow-list) is deferred — V1 trades safety for UX.
    _CRITICAL_WARNINGS = (
        "margin", "suitability", "credit", "rejected", "insufficient",
    )
    _AUTO_CONFIRM_MAX_CYCLES = 3

    async def place_order(
        self, *,
        symbol: str, side: str, qty: float,
        order_type: str = "market",
        limit_price: float | None = None,
        stop_price: float | None = None,
        tif: str = "day",
        asset_class: str = "stocks",
        sec_type: str | None = None,
        exchange: str = "SMART",
        outside_rth: bool = False,
    ) -> dict:
        st = sec_type or _SEC_TYPE_MAP.get(asset_class.lower(), "STK")
        conid = await self.resolve_conid(symbol, st)

        order: dict[str, Any] = {
            "conid": conid,
            "secType": f"{conid}:{st}",
            "orderType": _ibkr_order_type(order_type),
            "side": side.upper(),
            "quantity": qty,
            "tif": tif.upper(),
            "outsideRTH": outside_rth,
            "listingExchange": exchange,
        }
        if limit_price is not None:
            order["price"] = limit_price
        if stop_price is not None:
            order["auxPrice"] = stop_price

        return await self._submit_order_with_confirmation({"orders": [order]})

    async def _submit_order_with_confirmation(
        self, payload: dict, *, cycles: int = 0
    ) -> dict:
        path = f"/iserver/account/{self.account_id}/orders"
        result = await self._request("POST", path, json_body=payload)
        return await self._handle_order_response(result, cycles)

    async def _handle_order_response(
        self, result: Any, cycles: int
    ) -> dict:
        if not isinstance(result, list) or not result:
            raise IBKRError(f"IBKR_ORDER_UNEXPECTED_RESPONSE: {result!r}")
        first = result[0]
        if "id" in first and "message" in first:
            messages = first.get("message") or []
            joined = " ".join(messages).lower()
            if any(crit in joined for crit in self._CRITICAL_WARNINGS):
                raise IBKRError(
                    f"IBKR_ORDER_REJECTED_WARNING: {messages}"
                )
            if cycles >= self._AUTO_CONFIRM_MAX_CYCLES:
                raise IBKRError(
                    f"IBKR_ORDER_TOO_MANY_CONFIRMATIONS: {messages}"
                )
            reply = await self._request(
                "POST", f"/iserver/reply/{first['id']}",
                json_body={"confirmed": True},
            )
            return await self._handle_order_response(reply, cycles + 1)
        if "order_id" in first:
            return {"order_id": first["order_id"], "status": first.get("order_status")}
        raise IBKRError(f"IBKR_ORDER_UNEXPECTED_RESPONSE: {first!r}")

    async def amend_order(
        self, order_id: str, *,
        qty: float | None = None,
        limit_price: float | None = None,
        stop_price: float | None = None,
        tif: str | None = None,
    ) -> dict:
        body: dict[str, Any] = {}
        if qty is not None:
            body["quantity"] = qty
        if limit_price is not None:
            body["price"] = limit_price
        if stop_price is not None:
            body["auxPrice"] = stop_price
        if tif is not None:
            body["tif"] = tif.upper()
        path = f"/iserver/account/{self.account_id}/order/{order_id}"
        result = await self._request("POST", path, json_body=body)
        return await self._handle_order_response(result, cycles=0)

    async def cancel_order(self, order_id: str) -> dict:
        path = f"/iserver/account/{self.account_id}/order/{order_id}"
        await self._request("DELETE", path)
        return {"order_id": order_id, "canceled": True}

    async def cancel_all_orders(self) -> list[dict]:
        orders = await self.get_open_orders()
        results = []
        for o in orders:
            oid = o.get("orderId") or o.get("order_id")
            if not oid:
                continue
            try:
                results.append(await self.cancel_order(str(oid)))
            except Exception as e:
                results.append({"order_id": str(oid), "canceled": False, "error": str(e)})
        return results

    async def close_position(
        self, symbol: str, qty: float | None = None
    ) -> dict:
        # Resolve symbol → conid, then match positions on conid (positions
        # return `contractDesc` as a long display string, not ticker).
        sec_type = _SEC_TYPE_MAP.get("stocks", "STK")
        conid = await self.resolve_conid(symbol, sec_type)
        positions = await self.get_positions()
        target = next(
            (p for p in positions if int(p.get("conid", 0)) == conid),
            None,
        )
        if not target:
            raise IBKRError(f"IBKR_NO_POSITION: {symbol} (conid={conid})")
        position_qty = float(target.get("position", 0))
        close_qty = abs(qty if qty is not None else position_qty)
        side = "SELL" if position_qty > 0 else "BUY"
        return await self.place_order(
            symbol=symbol, side=side, qty=close_qty, order_type="market",
        )

    async def close_all_positions(self) -> list[dict]:
        positions = await self.get_positions()
        results = []
        for p in positions:
            sym = p.get("ticker") or p.get("contractDesc")
            if not sym:
                continue
            try:
                results.append(await self.close_position(sym))
            except Exception as e:
                results.append({"symbol": sym, "error": str(e)})
        return results


def _ibkr_order_type(t: str) -> str:
    m = {"market": "MKT", "limit": "LMT", "stop": "STP", "stop_limit": "STP_LMT"}
    if t.lower() not in m:
        raise IBKRError(f"unsupported order_type: {t}")
    return m[t.lower()]
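`resolve_conid` uses an `OrderedDict` as a small LRU: cache hits are moved to the tail, and once the cache exceeds `_CONID_CACHE_MAX` the head (least recently used) entry is evicted. The mechanism in isolation, with a capacity of 2 and invented conids:

```python
from collections import OrderedDict

CACHE_MAX = 2
cache: OrderedDict[str, int] = OrderedDict()


def lookup(key: str, resolve) -> int:
    # Hit: refresh recency. Miss: resolve, insert, evict oldest when full.
    if key in cache:
        cache.move_to_end(key)
        return cache[key]
    cache[key] = resolve(key)
    if len(cache) > CACHE_MAX:
        cache.popitem(last=False)
    return cache[key]


conids = {"STK:AAPL": 1, "STK:SPY": 2, "STK:QQQ": 3}  # illustrative conids
lookup("STK:AAPL", conids.get)
lookup("STK:SPY", conids.get)
lookup("STK:AAPL", conids.get)  # refreshes AAPL; SPY is now oldest
lookup("STK:QQQ", conids.get)   # evicts SPY
```

This keeps one network roundtrip per (secType, symbol) pair without letting the cache grow without bound.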
@@ -0,0 +1,106 @@
"""IBKR RSA key rotation: stage/confirm/abort with auto-rollback."""
from __future__ import annotations

import datetime as _dt
import hashlib
import os
import shutil
from collections.abc import Awaitable, Callable
from dataclasses import dataclass, field
from pathlib import Path

from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa


def _sha256_fingerprint(pem_path: Path) -> str:
    digest = hashlib.sha256(pem_path.read_bytes()).hexdigest()
    return f"SHA256:{digest}"


@dataclass
class KeyRotationManager:
    signature_key_path: str
    encryption_key_path: str

    _started: bool = field(default=False, init=False)

    def _sig(self) -> Path:
        return Path(self.signature_key_path)

    def _enc(self) -> Path:
        return Path(self.encryption_key_path)

    async def start(self) -> dict:
        sig_new = self._sig().with_suffix(self._sig().suffix + ".new")
        enc_new = self._enc().with_suffix(self._enc().suffix + ".new")

        for p in (sig_new, enc_new):
            key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
            p.write_bytes(key.private_bytes(
                encoding=serialization.Encoding.PEM,
                format=serialization.PrivateFormat.TraditionalOpenSSL,
                encryption_algorithm=serialization.NoEncryption(),
            ))
            os.chmod(p, 0o600)

        self._started = True
        return {
            "fingerprints": {
                "sig": _sha256_fingerprint(sig_new),
                "enc": _sha256_fingerprint(enc_new),
            },
            "expires_at": (
                _dt.datetime.now(_dt.UTC) + _dt.timedelta(hours=24)
            ).isoformat(),
        }

    async def confirm(
        self, *, validate: Callable[[], Awaitable[bool]],
    ) -> dict:
        sig = self._sig()
        enc = self._enc()
        sig_new = sig.with_suffix(sig.suffix + ".new")
        enc_new = enc.with_suffix(enc.suffix + ".new")
        if not (sig_new.exists() and enc_new.exists()):
            raise RuntimeError("IBKR_ROTATION_NOT_STARTED")

        archive = sig.parent / ".archive" / _dt.datetime.now(_dt.UTC).strftime("%Y%m%dT%H%M%S")
        archive.mkdir(parents=True, exist_ok=True)

        shutil.move(str(sig), str(archive / sig.name))
        shutil.move(str(enc), str(archive / enc.name))
        shutil.move(str(sig_new), str(sig))
        shutil.move(str(enc_new), str(enc))

        err: BaseException | None = None
        try:
            ok = await validate()
        except Exception as e:
            ok = False
            err = e

        if not ok:
            shutil.move(str(sig), str(sig.with_suffix(sig.suffix + ".new")))
            shutil.move(str(enc), str(enc.with_suffix(enc.suffix + ".new")))
            shutil.move(str(archive / sig.name), str(sig))
            shutil.move(str(archive / enc.name), str(enc))
            raise RuntimeError(
                f"IBKR_ROTATION_VALIDATION_FAILED: {err}" if err
                else "IBKR_ROTATION_VALIDATION_FAILED"
            )

        self._started = False
        return {
            "rotated_at": _dt.datetime.now(_dt.UTC).isoformat(),
            "old_archived_at": str(archive),
        }

    async def abort(self) -> dict:
        sig_new = self._sig().with_suffix(self._sig().suffix + ".new")
        enc_new = self._enc().with_suffix(self._enc().suffix + ".new")
        for p in (sig_new, enc_new):
            if p.exists():
                p.unlink()
        self._started = False
        return {"aborted": True}
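The stage → archive-and-swap → validate → rollback choreography above can be traced with plain files standing in for the PEM keys (everything below is a sketch under that assumption; real keys come from `rsa.generate_private_key` and the validate step calls IBKR):

```python
import shutil
import tempfile
from pathlib import Path

root = Path(tempfile.mkdtemp())
key = root / "sig.pem"
key.write_text("OLD")
staged = key.with_suffix(key.suffix + ".new")
staged.write_text("NEW")                        # start(): stage new key next to old

archive = root / ".archive"
archive.mkdir()
shutil.move(str(key), str(archive / key.name))  # confirm(): park the old key
shutil.move(str(staged), str(key))              # promote the staged key

validated = key.read_text() == "NEW"            # validate() stand-in
if not validated:                               # rollback branch (not taken here)
    shutil.move(str(key), str(staged))
    shutil.move(str(archive / key.name), str(key))
```

The old key is only archived, never deleted, so a failed validation can restore the previous working pair atomically from the operator's point of view.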
@@ -0,0 +1,57 @@
"""Server-side leverage cap for place_order (IBKR Reg-T context).

The cap is read from the `max_leverage` field of the secret JSON; defaults to 1 (cash) if absent.
IBKR margin accounts default to 4x intraday / 2x overnight (Reg-T).
"""
from __future__ import annotations

from fastapi import HTTPException


def get_max_leverage(creds: dict) -> int:
    """Read max_leverage from the secret. Defaults to 1 if missing."""
    raw = creds.get("max_leverage", 1)
    try:
        value = int(raw)
    except (TypeError, ValueError):
        value = 1
    return max(1, value)


def enforce_leverage(
    requested: int | float | None,
    *,
    creds: dict,
    exchange: str,
) -> int:
    """Validate and apply the leverage cap. Returns the applicable leverage.

    Raises HTTPException(403, LEVERAGE_CAP_EXCEEDED) if requested > cap.
    If requested is None, applies the cap as the default.
    """
    cap = get_max_leverage(creds)
    if requested is None:
        return cap
    lev = int(requested)
    if lev < 1:
        raise HTTPException(
            status_code=403,
            detail={
                "error": "LEVERAGE_CAP_EXCEEDED",
                "exchange": exchange,
                "requested": lev,
                "max": cap,
                "reason": "leverage must be >= 1",
            },
        )
    if lev > cap:
        raise HTTPException(
            status_code=403,
            detail={
                "error": "LEVERAGE_CAP_EXCEEDED",
                "exchange": exchange,
                "requested": lev,
                "max": cap,
            },
        )
    return lev
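The cap logic above can be exercised without FastAPI; a minimal re-sketch (the function name `max_leverage` is mine, not from the module):

```python
def max_leverage(creds: dict) -> int:
    # mirrors get_max_leverage: coerce to int, fall back to 1, floor at 1
    try:
        return max(1, int(creds.get("max_leverage", 1)))
    except (TypeError, ValueError):
        return 1

print(max_leverage({"max_leverage": 3}))    # -> 3
print(max_leverage({"max_leverage": "x"}))  # -> 1 (malformed value)
print(max_leverage({}))                     # -> 1 (absent: cash account)
```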
@@ -0,0 +1,181 @@
"""OAuth 1.0a Self-Service signer for IBKR Client Portal Web API.

Reference: https://www.interactivebrokers.com/api/doc.html (Self-Service OAuth)
"""
from __future__ import annotations

import base64
import hashlib
import hmac
import secrets
import time
from dataclasses import dataclass, field
from urllib.parse import quote

from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding
from cryptography.hazmat.primitives.asymmetric.rsa import RSAPrivateKey

from cerbero_mcp.common.http import async_client

# Refresh the LST 9h before its 24h expiry (very conservative; trades token
# lifetime for safety margin against clock drift and slow re-auth flows).
_LST_REFRESH_BUFFER_S = 9 * 3600  # 32_400
_LST_FALLBACK_TTL_S = 15 * 3600  # 54_000 — used when the server omits expiration


def _percent_encode(value: str) -> str:
    """RFC 3986 percent-encoding for OAuth (no `+` for space)."""
    return quote(str(value), safe="")


def build_signature_base_string(
    method: str, url: str, params: dict[str, str]
) -> str:
    """Build the OAuth 1.0a signature base string:
    `<METHOD>&<encoded-url>&<encoded-sorted-params>`
    """
    sorted_params = sorted(params.items())
    encoded_pairs = [
        f"{_percent_encode(k)}%3D{_percent_encode(v)}"
        for k, v in sorted_params
    ]
    # Manual %3D / %26 = double-encoding of '=' and '&' per RFC 5849 §3.4.1.3.2
    params_str = "%26".join(encoded_pairs)
    return f"{method.upper()}&{_percent_encode(url)}&{params_str}"


class IBKRAuthError(Exception):
    """OAuth flow failed (key invalid, consumer revoked, mint failed)."""


@dataclass
class OAuth1aSigner:
    consumer_key: str
    access_token: str
    access_token_secret: str
    signature_key_path: str
    encryption_key_path: str
    dh_prime: str  # hex string

    _signature_key: RSAPrivateKey | None = field(default=None, init=False, repr=False)
    _encryption_key: RSAPrivateKey | None = field(default=None, init=False, repr=False)
    _live_session_token: str | None = field(default=None, init=False, repr=False)
    _lst_expires_at: float = field(default=0.0, init=False, repr=False)

    def __post_init__(self) -> None:
        with open(self.signature_key_path, "rb") as f:
            self._signature_key = serialization.load_pem_private_key(
                f.read(), password=None
            )
        with open(self.encryption_key_path, "rb") as f:
            self._encryption_key = serialization.load_pem_private_key(
                f.read(), password=None
            )

    def sign(self, method: str, url: str, params: dict[str, str]) -> str:
        """RSA-SHA256 signature of the signature base string. Returns base64."""
        assert self._signature_key is not None  # set in __post_init__
        base = build_signature_base_string(method, url, params)
        signature = self._signature_key.sign(
            base.encode("utf-8"),
            padding.PKCS1v15(),
            hashes.SHA256(),
        )
        return base64.b64encode(signature).decode("ascii")

    def make_oauth_params(self) -> dict[str, str]:
        """Generate the standard oauth_nonce/timestamp/version parameters."""
        return {
            "oauth_consumer_key": self.consumer_key,
            "oauth_token": self.access_token,
            "oauth_nonce": secrets.token_hex(16),
            "oauth_timestamp": str(int(time.time())),
            "oauth_signature_method": "RSA-SHA256",
            "oauth_version": "1.0",
        }

    async def get_live_session_token(self, *, base_url: str) -> str:
        """Return the cached LST; re-mint if missing or close to expiry."""
        if self._live_session_token and time.monotonic() < self._lst_expires_at:
            return self._live_session_token
        return await self._mint_live_session_token(base_url)

    async def _mint_live_session_token(self, base_url: str) -> str:
        """DH key exchange + RSA-signed POST /oauth/live_session_token.

        1. Generate random dh_random
        2. Compute dh_challenge = 2^dh_random mod dh_prime
        3. Decrypt access_token_secret via the encryption RSA key
        4. POST signed request with diffie_hellman_challenge
        5. shared = dh_response^dh_random mod dh_prime
        6. LST = HMAC-SHA1(shared, decrypted_secret), base64
        """
        url = f"{base_url}/oauth/live_session_token"

        prime = int(self.dh_prime, 16)
        dh_random = secrets.randbits(256)
        dh_challenge = pow(2, dh_random, prime)
        dh_challenge_hex = format(dh_challenge, "x")

        if self._encryption_key is None:  # pragma: no cover — set in __post_init__
            raise IBKRAuthError("encryption key not loaded")
        try:
            encrypted = bytes.fromhex(self.access_token_secret)
            decrypted_secret = self._encryption_key.decrypt(
                encrypted, padding.PKCS1v15()
            )
        except Exception as e:
            # Intentionally broad: covers ValueError (bad hex), cryptography
            # errors (InvalidKey, padding decoding), and any RSA backend issue
            # — all map to the same user-facing failure ("bad credentials").
            raise IBKRAuthError(f"access_token_secret decrypt failed: {e}") from e

        oauth_params = self.make_oauth_params()
        oauth_params["diffie_hellman_challenge"] = dh_challenge_hex
        signature = self.sign("POST", url, oauth_params)
        oauth_params["oauth_signature"] = signature

        auth_header = "OAuth " + ", ".join(
            f'{k}="{_percent_encode(v)}"' for k, v in sorted(oauth_params.items())
        )

        async with async_client(timeout=15.0) as http:
            resp = await http.post(
                url,
                headers={"Authorization": auth_header, "User-Agent": "cerbero-mcp/2.0"},
            )
        if resp.status_code != 200:
            raise IBKRAuthError(
                f"LST mint failed status={resp.status_code} body={resp.text[:300]}"
            )
        data = resp.json()
        dh_response = int(data["diffie_hellman_response"], 16)
        expires_ms = data.get("live_session_token_expiration", 0)

        shared = pow(dh_response, dh_random, prime)
        shared_bytes = shared.to_bytes((shared.bit_length() + 7) // 8, "big")
        if shared_bytes and shared_bytes[0] & 0x80:
            shared_bytes = b"\x00" + shared_bytes

        lst_raw = hmac.new(shared_bytes, decrypted_secret, hashlib.sha1).digest()
        lst = base64.b64encode(lst_raw).decode("ascii")

        self._live_session_token = lst
        if expires_ms:
            ttl = max(60.0, (expires_ms / 1000) - time.time() - _LST_REFRESH_BUFFER_S)
        else:
            ttl = float(_LST_FALLBACK_TTL_S)
        # `expires_ms` is wall clock; convert to a monotonic deadline so the
        # cache check is unaffected by future clock adjustments.
        self._lst_expires_at = time.monotonic() + ttl
        return lst

    def sign_with_lst(self, method: str, url: str, params: dict[str, str]) -> str:
        """HMAC-SHA256 signature keyed with the LST (for post-mint requests)."""
        if not self._live_session_token:
            raise IBKRAuthError("LST not minted yet; call get_live_session_token first")
        base = build_signature_base_string(method, url, params)
        lst_bytes = base64.b64decode(self._live_session_token)
        sig = hmac.new(lst_bytes, base.encode("utf-8"), hashlib.sha256).digest()
        return base64.b64encode(sig).decode("ascii")
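The base-string construction is deterministic and easy to check by hand. A self-contained sketch of the same encoding rules (standalone names of my own; assumes RFC 5849 parameter sorting and RFC 3986 encoding as in `build_signature_base_string`):

```python
from urllib.parse import quote

def percent_encode(value: str) -> str:
    # RFC 3986: encode everything outside the unreserved set, never '+' for space
    return quote(str(value), safe="")

def base_string(method: str, url: str, params: dict[str, str]) -> str:
    # sorted params, with '=' and '&' double-encoded as %3D / %26
    pairs = [f"{percent_encode(k)}%3D{percent_encode(v)}"
             for k, v in sorted(params.items())]
    return f"{method.upper()}&{percent_encode(url)}&" + "%26".join(pairs)

print(base_string("POST", "https://test.example/oauth",
                  {"oauth_version": "1.0", "oauth_nonce": "abc"}))
# -> POST&https%3A%2F%2Ftest.example%2Foauth&oauth_nonce%3Dabc%26oauth_version%3D1.0
```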
@@ -0,0 +1,101 @@
"""Pure-function payload builders for IBKR complex orders (bracket/OCO/OTO).

No HTTP. Tests are deterministic.
"""
from __future__ import annotations

import secrets
from dataclasses import dataclass
from typing import Literal


@dataclass
class OrderSpec:
    conid: int
    sec_type: str  # "STK" | "OPT" | "FUT" | "CASH"
    side: Literal["BUY", "SELL"]
    qty: float
    order_type: Literal["MKT", "LMT", "STP", "STP_LMT"]
    price: float | None = None  # limit price
    aux_price: float | None = None  # stop price
    tif: str = "GTC"
    exchange: str = "SMART"


def _to_order_dict(spec: OrderSpec, *, oca_group: str | None = None,
                   oca_type: int | None = None,
                   parent_id: str | None = None) -> dict:
    o: dict = {
        "conid": spec.conid,
        "secType": f"{spec.conid}:{spec.sec_type}",
        "orderType": spec.order_type,
        "side": spec.side,
        "quantity": spec.qty,
        "tif": spec.tif,
        "listingExchange": spec.exchange,
    }
    if spec.price is not None:
        o["price"] = spec.price
    if spec.aux_price is not None:
        o["auxPrice"] = spec.aux_price
    if oca_group:
        o["ocaGroup"] = oca_group
    if oca_type is not None:
        o["ocaType"] = oca_type
    if parent_id:
        o["parentId"] = parent_id
    return o


def _new_oca_group() -> str:
    return f"oca-{secrets.token_hex(4)}"


def build_bracket_payload(
    *, conid: int, sec_type: str, side: str, qty: float,
    entry_price: float, stop_loss: float, take_profit: float,
    tif: str = "GTC", exchange: str = "SMART",
) -> dict:
    """Bracket: parent LMT entry + child STP (loss) + child LMT (profit), OCA-linked."""
    side = side.upper()
    opposite = "SELL" if side == "BUY" else "BUY"
    oca = _new_oca_group()

    parent = OrderSpec(conid=conid, sec_type=sec_type, side=side, qty=qty,  # type: ignore[arg-type]
                       order_type="LMT", price=entry_price,
                       tif=tif, exchange=exchange)
    sl = OrderSpec(conid=conid, sec_type=sec_type, side=opposite, qty=qty,  # type: ignore[arg-type]
                   order_type="STP", aux_price=stop_loss,
                   tif=tif, exchange=exchange)
    tp = OrderSpec(conid=conid, sec_type=sec_type, side=opposite, qty=qty,  # type: ignore[arg-type]
                   order_type="LMT", price=take_profit,
                   tif=tif, exchange=exchange)
    return {
        "orders": [
            _to_order_dict(parent, oca_group=oca, oca_type=2),
            _to_order_dict(sl, oca_group=oca, oca_type=2),
            _to_order_dict(tp, oca_group=oca, oca_type=2),
        ]
    }


def build_oco_payload(legs: list[OrderSpec]) -> dict:
    """OCO: N legs, all sharing the same ocaGroup with ocaType=1 (one-cancels-all)."""
    if len(legs) < 2:
        raise ValueError("OCO requires at least 2 legs")
    oca = _new_oca_group()
    return {
        "orders": [
            _to_order_dict(leg, oca_group=oca, oca_type=1) for leg in legs
        ]
    }


def build_oto_first_payload(trigger: OrderSpec) -> dict:
    """OTO step 1: place the trigger as a standalone order."""
    return {"orders": [_to_order_dict(trigger)]}


def build_oto_child_payload(child: OrderSpec, parent_order_id: str) -> dict:
    """OTO step 2: the child references parentId from the step-1 order_id."""
    return {"orders": [_to_order_dict(child, parent_id=parent_order_id)]}
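Because the builders are pure functions, the payload shape is easy to pin down. A sketch of the bracket invariants (conid 265598 and all prices are made-up values; `new_oca_group` is my stand-in for the module's helper):

```python
import secrets

def new_oca_group() -> str:
    # same scheme as _new_oca_group above
    return f"oca-{secrets.token_hex(4)}"

oca = new_oca_group()
# BUY bracket: parent LMT entry plus protective SELL children, OCA-linked.
payload = {
    "orders": [
        {"conid": 265598, "side": "BUY",  "orderType": "LMT", "price": 100.0,
         "quantity": 10, "ocaGroup": oca, "ocaType": 2},
        {"conid": 265598, "side": "SELL", "orderType": "STP", "auxPrice": 95.0,
         "quantity": 10, "ocaGroup": oca, "ocaType": 2},
        {"conid": 265598, "side": "SELL", "orderType": "LMT", "price": 110.0,
         "quantity": 10, "ocaGroup": oca, "ocaType": 2},
    ]
}
# every leg shares one OCA group; children take the opposite side
assert len({o["ocaGroup"] for o in payload["orders"]}) == 1
assert [o["side"] for o in payload["orders"]] == ["BUY", "SELL", "SELL"]
```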
@@ -0,0 +1,453 @@
"""IBKR tool functions: Pydantic schemas + async dispatch to client/ws."""
from __future__ import annotations

import asyncio
import contextlib
from typing import Any

from fastapi import HTTPException
from pydantic import BaseModel

from cerbero_mcp.exchanges.ibkr.client import _SEC_TYPE_MAP, IBKRClient, IBKRError
from cerbero_mcp.exchanges.ibkr.leverage_cap import get_max_leverage
from cerbero_mcp.exchanges.ibkr.orders_complex import (
    OrderSpec,
    build_bracket_payload,
    build_oco_payload,
    build_oto_child_payload,
    build_oto_first_payload,
)
from cerbero_mcp.exchanges.ibkr.ws import IBKRWebSocket

# === Schemas: reads ===

class GetAccountReq(BaseModel):
    pass

class GetPositionsReq(BaseModel):
    pass

class GetOpenOrdersReq(BaseModel):
    pass

class GetActivitiesReq(BaseModel):
    days: int = 7

class GetTickerReq(BaseModel):
    symbol: str
    asset_class: str = "stocks"

class GetBarsReq(BaseModel):
    symbol: str
    asset_class: str = "stocks"
    period: str = "1d"
    bar: str = "5min"

class GetSnapshotReq(BaseModel):
    symbol: str
    asset_class: str = "stocks"

class GetOptionChainReq(BaseModel):
    underlying: str
    expiry: str | None = None

class SearchContractsReq(BaseModel):
    symbol: str
    sec_type: str = "STK"

class GetClockReq(BaseModel):
    pass

# === Schemas: streaming ===

class GetTickReq(BaseModel):
    symbol: str
    asset_class: str = "stocks"

class GetDepthReq(BaseModel):
    symbol: str
    asset_class: str = "stocks"
    rows: int = 5
    exchange: str = "SMART"

class SubscribeTickReq(BaseModel):
    symbol: str
    asset_class: str = "stocks"

class UnsubscribeReq(BaseModel):
    symbol: str
    asset_class: str = "stocks"

# === Schemas: writes simple ===

class PlaceOrderReq(BaseModel):
    symbol: str
    side: str
    qty: float
    order_type: str = "market"
    limit_price: float | None = None
    stop_price: float | None = None
    tif: str = "day"
    asset_class: str = "stocks"
    sec_type: str | None = None
    exchange: str = "SMART"
    outside_rth: bool = False

class AmendOrderReq(BaseModel):
    order_id: str
    qty: float | None = None
    limit_price: float | None = None
    stop_price: float | None = None
    tif: str | None = None

class CancelOrderReq(BaseModel):
    order_id: str

class CancelAllOrdersReq(BaseModel):
    pass

class ClosePositionReq(BaseModel):
    symbol: str
    qty: float | None = None

class CloseAllPositionsReq(BaseModel):
    pass

# === Schemas: writes complex ===

class PlaceBracketOrderReq(BaseModel):
    symbol: str
    side: str
    qty: float
    entry_price: float
    stop_loss: float
    take_profit: float
    tif: str = "gtc"
    asset_class: str = "stocks"
    exchange: str = "SMART"

class OrderLeg(BaseModel):
    symbol: str
    side: str
    qty: float
    order_type: str = "limit"
    limit_price: float | None = None
    stop_price: float | None = None
    tif: str = "gtc"
    asset_class: str = "stocks"

class PlaceOcoOrderReq(BaseModel):
    legs: list[OrderLeg]

class PlaceOtoOrderReq(BaseModel):
    trigger: OrderLeg
    child: OrderLeg

# === Read tools ===

async def environment_info(
    client: IBKRClient, *, creds: dict, env_info: Any | None = None
) -> dict:
    return {
        "exchange": "ibkr",
        "environment": "testnet" if client.paper else "mainnet",
        "paper": client.paper,
        "base_url": client.base_url,
        "max_leverage": get_max_leverage(creds),
    }

async def get_account(client: IBKRClient, params: GetAccountReq) -> dict:
    return await client.get_account()

async def get_positions(client: IBKRClient, params: GetPositionsReq) -> dict:
    return {"positions": await client.get_positions()}

async def get_open_orders(client: IBKRClient, params: GetOpenOrdersReq) -> dict:
    return {"orders": await client.get_open_orders()}

async def get_activities(client: IBKRClient, params: GetActivitiesReq) -> dict:
    return {"activities": await client.get_activities(params.days)}

async def get_ticker(client: IBKRClient, params: GetTickerReq) -> dict:
    return await client.get_ticker(params.symbol, params.asset_class)

async def get_bars(client: IBKRClient, params: GetBarsReq) -> dict:
    return await client.get_bars(
        params.symbol, params.asset_class, params.period, params.bar,
    )

async def get_snapshot(client: IBKRClient, params: GetSnapshotReq) -> dict:
    return await client.get_ticker(params.symbol, params.asset_class)

async def get_option_chain(client: IBKRClient, params: GetOptionChainReq) -> dict:
    return await client.get_option_chain(params.underlying, params.expiry)

async def search_contracts(client: IBKRClient, params: SearchContractsReq) -> dict:
    return {"contracts": await client.search_contracts(params.symbol, params.sec_type)}

async def get_clock(client: IBKRClient, params: GetClockReq) -> dict:
    import datetime as _dt
    now = _dt.datetime.now(_dt.UTC)
    return {
        "timestamp": now.isoformat(),
        "is_open": _dt.time(13, 30) <= now.time() <= _dt.time(20, 0)
        and now.weekday() < 5,
        "approximate": True,
        "note": (
            "is_open is a UTC-based approximation; does not account for "
            "US market holidays or half-days. Use IBKR /trsrv/marketdata/calendar "
            "for authoritative schedule."
        ),
    }


# === Streaming tools ===


def _sec_type_for(asset_class: str) -> str:
    return _SEC_TYPE_MAP.get(asset_class.lower(), "STK")


async def get_tick(
    client: IBKRClient, params: GetTickReq,
    *, ws: IBKRWebSocket, timeout_s: float = 3.0,
) -> dict:
    sec = _sec_type_for(params.asset_class)
    conid = await client.resolve_conid(params.symbol, sec)
    snap = ws.get_tick_snapshot(conid)
    if snap:
        return {**snap, "symbol": params.symbol}
    await ws.subscribe_tick(conid)
    deadline = asyncio.get_event_loop().time() + timeout_s
    while asyncio.get_event_loop().time() < deadline:
        snap = ws.get_tick_snapshot(conid)
        if snap:
            return {**snap, "symbol": params.symbol}
        await asyncio.sleep(0.05)
    raise IBKRError(f"IBKR_TICK_TIMEOUT: {params.symbol}")


async def get_depth(
    client: IBKRClient, params: GetDepthReq,
    *, ws: IBKRWebSocket, timeout_s: float = 3.0,
) -> dict:
    sec = _sec_type_for(params.asset_class)
    conid = await client.resolve_conid(params.symbol, sec)
    snap = ws.get_depth_snapshot(conid)
    if snap:
        return {**snap, "symbol": params.symbol}
    await ws.subscribe_depth(conid, exchange=params.exchange, rows=params.rows)
    deadline = asyncio.get_event_loop().time() + timeout_s
    while asyncio.get_event_loop().time() < deadline:
        snap = ws.get_depth_snapshot(conid)
        if snap:
            return {**snap, "symbol": params.symbol}
        await asyncio.sleep(0.05)
    raise IBKRError(f"IBKR_DEPTH_TIMEOUT: {params.symbol}")


async def subscribe_tick(
    client: IBKRClient, params: SubscribeTickReq, *, ws: IBKRWebSocket,
) -> dict:
    sec = _sec_type_for(params.asset_class)
    conid = await client.resolve_conid(params.symbol, sec)
    await ws.subscribe_tick(conid, forced=True)
    return {"symbol": params.symbol, "conid": conid, "subscribed": True}


async def unsubscribe(
    client: IBKRClient, params: UnsubscribeReq, *, ws: IBKRWebSocket,
) -> dict:
    sec = _sec_type_for(params.asset_class)
    conid = await client.resolve_conid(params.symbol, sec)
    await ws.unsubscribe(conid)
    return {"symbol": params.symbol, "conid": conid, "unsubscribed": True}


# === Write tools: simple ===


async def place_order(
    client: IBKRClient, params: PlaceOrderReq,
    *, creds: dict, last_price: float | None = None,
) -> dict:
    cap = get_max_leverage(creds)
    if last_price is None:
        try:
            ticker = await client.get_ticker(params.symbol, params.asset_class)
            last_price = ticker.get("last_price") or ticker.get("ask")
        except Exception:
            last_price = None
    if last_price:
        notional = params.qty * float(last_price)
        try:
            account = await client.get_account()
            equity = float(
                (account.get("netliquidation") or {}).get("amount") or 0
            )
        except Exception:
            equity = 0.0
        if equity > 0 and notional / equity > cap:
            raise HTTPException(
                status_code=403,
                detail={
                    "error": "LEVERAGE_CAP_EXCEEDED",
                    "exchange": "ibkr",
                    "requested_ratio": notional / equity,
                    "max": cap,
                },
            )

    return await client.place_order(
        symbol=params.symbol,
        side=params.side,
        qty=params.qty,
        order_type=params.order_type,
        limit_price=params.limit_price,
        stop_price=params.stop_price,
        tif=params.tif,
        asset_class=params.asset_class,
        sec_type=params.sec_type,
        exchange=params.exchange,
        outside_rth=params.outside_rth,
    )


async def amend_order(client: IBKRClient, params: AmendOrderReq) -> dict:
    return await client.amend_order(
        params.order_id,
        qty=params.qty,
        limit_price=params.limit_price,
        stop_price=params.stop_price,
        tif=params.tif,
    )


async def cancel_order(client: IBKRClient, params: CancelOrderReq) -> dict:
    return await client.cancel_order(params.order_id)


async def cancel_all_orders(
    client: IBKRClient, params: CancelAllOrdersReq
) -> dict:
    return {"canceled": await client.cancel_all_orders()}


async def close_position(
    client: IBKRClient, params: ClosePositionReq
) -> dict:
    return await client.close_position(params.symbol, params.qty)


async def close_all_positions(
    client: IBKRClient, params: CloseAllPositionsReq
) -> dict:
    return {"closed": await client.close_all_positions()}


# === Write tools: complex orders ===


def _leg_to_spec(leg: OrderLeg, conid: int) -> OrderSpec:
    return OrderSpec(
        conid=conid,
        sec_type=_sec_type_for(leg.asset_class),
        side=leg.side.upper(),  # type: ignore[arg-type]
        qty=leg.qty,
        order_type={
            "market": "MKT", "limit": "LMT",
            "stop": "STP", "stop_limit": "STP_LMT",
        }[leg.order_type.lower()],  # type: ignore[arg-type]
        price=leg.limit_price,
        aux_price=leg.stop_price,
        tif=leg.tif.upper(),
    )


async def place_bracket_order(
    client: IBKRClient, params: PlaceBracketOrderReq, *, creds: dict,
) -> dict:
    sec = _sec_type_for(params.asset_class)
    conid = await client.resolve_conid(params.symbol, sec)
    cap = get_max_leverage(creds)
    notional = params.qty * params.entry_price
    try:
        account = await client.get_account()
        equity = float((account.get("netliquidation") or {}).get("amount") or 0)
    except Exception:
        equity = 0.0
    if equity > 0 and notional / equity > cap:
        raise HTTPException(
            status_code=403,
            detail={"error": "LEVERAGE_CAP_EXCEEDED", "exchange": "ibkr",
                    "requested_ratio": notional / equity, "max": cap},
        )
    payload = build_bracket_payload(
        conid=conid, sec_type=sec, side=params.side.upper(), qty=params.qty,
        entry_price=params.entry_price, stop_loss=params.stop_loss,
        take_profit=params.take_profit, tif=params.tif.upper(),
        exchange=params.exchange,
    )
    return await client._submit_order_with_confirmation(payload)


async def place_oco_order(
    client: IBKRClient, params: PlaceOcoOrderReq, *, creds: dict,
) -> dict:
    if len(params.legs) < 2:
        raise HTTPException(400, detail={"error": "OCO requires >=2 legs"})
    cap = get_max_leverage(creds)
    leg_notional = max(
        leg.qty * (leg.limit_price or leg.stop_price or 0) for leg in params.legs
    )
    try:
        account = await client.get_account()
        equity = float((account.get("netliquidation") or {}).get("amount") or 0)
    except Exception:
        equity = 0.0
    if equity > 0 and leg_notional / equity > cap:
        raise HTTPException(
            status_code=403,
            detail={"error": "LEVERAGE_CAP_EXCEEDED", "exchange": "ibkr",
                    "requested_ratio": leg_notional / equity, "max": cap},
        )

    specs = []
    for leg in params.legs:
        sec = _sec_type_for(leg.asset_class)
        conid = await client.resolve_conid(leg.symbol, sec)
        specs.append(_leg_to_spec(leg, conid))
    payload = build_oco_payload(specs)
    return await client._submit_order_with_confirmation(payload)


async def place_oto_order(
    client: IBKRClient, params: PlaceOtoOrderReq, *, creds: dict,
) -> dict:
    sec_t = _sec_type_for(params.trigger.asset_class)
    sec_c = _sec_type_for(params.child.asset_class)
    conid_t = await client.resolve_conid(params.trigger.symbol, sec_t)
    conid_c = await client.resolve_conid(params.child.symbol, sec_c)
    trig_spec = _leg_to_spec(params.trigger, conid_t)
    child_spec = _leg_to_spec(params.child, conid_c)

    trig_payload = build_oto_first_payload(trig_spec)
    trig_res = await client._submit_order_with_confirmation(trig_payload)
    trigger_order_id = trig_res.get("order_id")
    if not trigger_order_id:
        raise IBKRError(f"IBKR_OTO_TRIGGER_NO_ID: {trig_res!r}")

    try:
        child_payload = build_oto_child_payload(child_spec, str(trigger_order_id))
        child_res = await client._submit_order_with_confirmation(child_payload)
    except Exception as e:
        with contextlib.suppress(Exception):
            await client.cancel_order(str(trigger_order_id))
        raise IBKRError(
            f"IBKR_OTO_PARTIAL_FAILURE: trigger={trigger_order_id} reason={e}"
        ) from e

    return {
        "trigger_order_id": trigger_order_id,
        "child_order_id": child_res.get("order_id"),
    }
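get_tick and get_depth share the same cache-then-subscribe-then-poll pattern. A standalone sketch of that deadline loop (names are mine; the fake cache accessor stands in for the WS snapshot cache):

```python
import asyncio

async def wait_for_snapshot(get, timeout_s: float = 0.5, interval_s: float = 0.01):
    # poll the cache accessor until it yields a snapshot or the deadline passes
    deadline = asyncio.get_running_loop().time() + timeout_s
    while asyncio.get_running_loop().time() < deadline:
        snap = get()
        if snap:
            return snap
        await asyncio.sleep(interval_s)
    raise TimeoutError("snapshot not ready before deadline")

# demo: the snapshot shows up on the third poll
state = {"calls": 0}
def fake_cache():
    state["calls"] += 1
    return {"last_price": 42.0} if state["calls"] >= 3 else None

snap = asyncio.run(wait_for_snapshot(fake_cache))
print(snap)  # -> {'last_price': 42.0}
```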
@@ -0,0 +1,247 @@
"""IBKR Client Portal WebSocket — persistent WSS, smd/sbd subs, snapshot cache."""
from __future__ import annotations

import asyncio
import contextlib
import json
import logging
import time
from dataclasses import dataclass, field
from typing import Any

from websockets import connect as websockets_connect  # exposed for tests

from cerbero_mcp.exchanges.ibkr.oauth import OAuth1aSigner

logger = logging.getLogger(__name__)


class WSError(Exception):
    """WebSocket layer error."""


@dataclass
class TickSnapshot:
    last_price: float | None
    bid: float | None
    ask: float | None
    bid_size: float | None
    ask_size: float | None
    timestamp_ms: int


@dataclass
class DepthSnapshot:
    bids: list[dict]
    asks: list[dict]
    timestamp_ms: int


_SMD_FIELDS = ["31", "84", "86", "7295", "7296"]


@dataclass
class IBKRWebSocket:
    """Persistent WSS to IBKR Client Portal with smd/sbd subs.

    Snapshot lifetime: each (tick|depth) cache entry is overwritten on every
    incoming message. On disconnect, the reader loop logs and exits, leaving
    the existing cache intact. Consumers should check `connected` before
    trusting a stale snapshot, or compare `timestamp_ms` against wall clock.
    Automatic reconnect is deferred to a follow-up; V1 surfaces disconnects
    via `connected=False` so the higher-level tool layer can rebuild the WS.
    """
    signer: OAuth1aSigner
    ws_url: str
    base_url: str
    max_subs: int = 80
    idle_timeout_s: int = 300

    _ws: Any = field(default=None, init=False, repr=False)
    _tick_cache: dict[int, TickSnapshot] = field(default_factory=dict, init=False)
    _depth_cache: dict[int, DepthSnapshot] = field(default_factory=dict, init=False)
    _subs: set[int] = field(default_factory=set, init=False)
    _depth_subs: set[int] = field(default_factory=set, init=False)
    _last_polled_at: dict[int, float] = field(default_factory=dict, init=False)
    _forced_subs: set[int] = field(default_factory=set, init=False)
    _reader_task: asyncio.Task | None = field(default=None, init=False)
    _idle_task: asyncio.Task | None = field(default=None, init=False)
    _stopped: bool = field(default=False, init=False)

    @property
    def connected(self) -> bool:
        return self._ws is not None and not getattr(self._ws, "closed", True)

    async def start(self) -> None:
        if self.connected:
            return
        self._stopped = False  # reset on every start (supports stop→start cycles)
        lst = await self.signer.get_live_session_token(base_url=self.base_url)
        self._ws = await websockets_connect(
            self.ws_url,
            additional_headers={"Cookie": f"api={lst}"},
        )
        self._reader_task = asyncio.create_task(self._reader_loop())
        self._idle_task = asyncio.create_task(self._idle_sweeper())

    async def stop(self) -> None:
        self._stopped = True
        if self._idle_task:
            self._idle_task.cancel()
            with contextlib.suppress(BaseException):
                await self._idle_task
        if self._reader_task:
            self._reader_task.cancel()
            with contextlib.suppress(BaseException):
                await self._reader_task
        if self._ws:
            with contextlib.suppress(Exception):
                await self._ws.close()
        self._ws = None

    async def subscribe_tick(self, conid: int, *, forced: bool = False) -> None:
        self._require_started()
        await self._ensure_capacity(conid)
        if conid in self._subs:
            self._last_polled_at[conid] = time.monotonic()
            if forced:
                self._forced_subs.add(conid)
            return
        msg = "smd+" + str(conid) + "+" + json.dumps({"fields": _SMD_FIELDS})
        await self._ws.send(msg)
        self._subs.add(conid)
        self._last_polled_at[conid] = time.monotonic()
        if forced:
            self._forced_subs.add(conid)

    async def subscribe_depth(
        self, conid: int, *, exchange: str = "SMART", rows: int = 5
    ) -> None:
        self._require_started()
        await self._ensure_capacity(conid)
        if conid in self._depth_subs:
            self._last_polled_at[conid] = time.monotonic()
            return
        msg = f"sbd+{conid}+{exchange}+{rows}"
        await self._ws.send(msg)
        self._depth_subs.add(conid)
        self._last_polled_at[conid] = time.monotonic()

    async def unsubscribe(self, conid: int) -> None:
        self._require_started()
        if conid in self._subs:
            await self._ws.send(f"umd+{conid}+{{}}")
            self._subs.discard(conid)
        if conid in self._depth_subs:
            await self._ws.send(f"ubd+{conid}")
            self._depth_subs.discard(conid)
        self._tick_cache.pop(conid, None)
        self._depth_cache.pop(conid, None)
        self._last_polled_at.pop(conid, None)
        self._forced_subs.discard(conid)

    def get_tick_snapshot(self, conid: int) -> dict | None:
        snap = self._tick_cache.get(conid)
        if not snap:
            return None
        self._last_polled_at[conid] = time.monotonic()
        return {
            "conid": conid,
            "last_price": snap.last_price,
            "bid": snap.bid,
            "ask": snap.ask,
            "bid_size": snap.bid_size,
            "ask_size": snap.ask_size,
            "timestamp_ms": snap.timestamp_ms,
        }

    def get_depth_snapshot(self, conid: int) -> dict | None:
        snap = self._depth_cache.get(conid)
        if not snap:
            return None
        self._last_polled_at[conid] = time.monotonic()
        return {
            "conid": conid,
            "bids": snap.bids,
            "asks": snap.asks,
            "timestamp_ms": snap.timestamp_ms,
        }

    def _require_started(self) -> None:
        if self._ws is None:
            raise WSError("IBKR_WS_NOT_STARTED: call start() first")

    async def _ensure_capacity(self, conid: int) -> None:
        if (conid in self._subs) or (conid in self._depth_subs):
            return
        active = len(self._subs) + len(self._depth_subs)
        if active >= self.max_subs:
            raise WSError(f"IBKR_WS_SUB_LIMIT: {active}/{self.max_subs}")

    async def _reader_loop(self) -> None:
        try:
            while not self._stopped and self._ws:
                raw = await self._ws.recv()
                try:
                    msg = json.loads(raw)
                except json.JSONDecodeError:
                    continue
                topic = msg.get("topic", "")
                if topic.startswith("smd+"):
                    self._on_tick(topic, msg)
                elif topic.startswith("sbd+"):
                    self._on_depth(topic, msg)
        except asyncio.CancelledError:
            raise
        except Exception as exc:
            # Disconnect / parse error / network — leave cache as-is, mark dead.
            # V1: no automatic reconnect; consumers detect via stale timestamp_ms.
            logger.warning("ibkr ws reader exited: %s", exc)
            self._ws = None
            return

    def _on_tick(self, topic: str, msg: dict) -> None:
        try:
            conid = int(topic.split("+", 1)[1])
||||
except (ValueError, IndexError):
|
||||
return
|
||||
|
||||
def _f(k: str) -> float | None:
|
||||
v = msg.get(k)
|
||||
try:
|
||||
return float(v) if v not in (None, "") else None
|
||||
except (TypeError, ValueError):
|
||||
return None
|
||||
|
||||
self._tick_cache[conid] = TickSnapshot(
|
||||
last_price=_f("31"), bid=_f("84"), ask=_f("86"),
|
||||
bid_size=_f("7295"), ask_size=_f("7296"),
|
||||
timestamp_ms=int(time.time() * 1000),
|
||||
)
|
||||
|
||||
def _on_depth(self, topic: str, msg: dict) -> None:
|
||||
try:
|
||||
conid = int(topic.split("+", 1)[1])
|
||||
except (ValueError, IndexError):
|
||||
return
|
||||
self._depth_cache[conid] = DepthSnapshot(
|
||||
bids=msg.get("bids") or [],
|
||||
asks=msg.get("asks") or [],
|
||||
timestamp_ms=int(time.time() * 1000),
|
||||
)
|
||||
|
||||
async def _idle_sweeper(self) -> None:
|
||||
try:
|
||||
while not self._stopped:
|
||||
await asyncio.sleep(30)
|
||||
now = time.monotonic()
|
||||
expired = [
|
||||
c for c in list(self._subs | self._depth_subs)
|
||||
if c not in self._forced_subs
|
||||
and now - self._last_polled_at.get(c, now) > self.idle_timeout_s
|
||||
]
|
||||
for c in expired:
|
||||
with contextlib.suppress(Exception):
|
||||
await self.unsubscribe(c)
|
||||
except asyncio.CancelledError:
|
||||
raise
|
||||
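The idle sweeper above drops any subscription that nobody has polled within `idle_timeout_s`, while skipping conids in the forced set. The selection step can be sketched standalone (names mirror the class attributes; this is an illustration, not the module's code):

```python
import time


def select_expired(subs: set[int], forced: set[int],
                   last_polled_at: dict[int, float],
                   idle_timeout_s: float, now: float) -> list[int]:
    """Return conids idle longer than idle_timeout_s, excluding forced ones.

    A conid with no recorded poll time defaults to `now` (never expired),
    matching the `last_polled_at.get(c, now)` fallback in the sweeper.
    """
    return [
        c for c in subs
        if c not in forced
        and now - last_polled_at.get(c, now) > idle_timeout_s
    ]


now = time.monotonic()
expired = select_expired(
    {1, 2, 3}, forced={2},
    last_polled_at={1: now - 400.0, 2: now - 400.0, 3: now - 10.0},
    idle_timeout_s=300.0, now=now,
)
# conid 1 is stale, 2 is stale but forced, 3 was polled recently → [1]
```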
@@ -0,0 +1,36 @@
"""Router /mcp-cross/* — historical data with cross-exchange consensus."""
from __future__ import annotations

from typing import Literal, cast

from fastapi import APIRouter, Depends, Request

from cerbero_mcp.client_registry import ClientRegistry
from cerbero_mcp.exchanges.cross import tools as t
from cerbero_mcp.exchanges.cross.client import CrossClient

Environment = Literal["testnet", "mainnet"]


def get_environment(request: Request) -> Environment:
    return cast(Environment, request.state.environment)


def get_cross_client(
    request: Request, env: Environment = Depends(get_environment),
) -> CrossClient:
    registry: ClientRegistry = request.app.state.registry
    return CrossClient(registry, env=env)


def make_router() -> APIRouter:
    r = APIRouter(prefix="/mcp-cross", tags=["cross"])

    @r.post("/tools/get_historical")
    async def _get_historical(
        params: t.GetHistoricalReq,
        client: CrossClient = Depends(get_cross_client),
    ):
        return await t.get_historical(client, params)

    return r
@@ -0,0 +1,248 @@
"""Router /mcp-ibkr/* — per-env DI for client and (write) creds."""
from __future__ import annotations

from typing import Literal, cast

from fastapi import APIRouter, Depends, Request

from cerbero_mcp.client_registry import ClientRegistry
from cerbero_mcp.common.audit_helpers import audit_call
from cerbero_mcp.exchanges.ibkr import tools as t
from cerbero_mcp.exchanges.ibkr.client import IBKRClient
from cerbero_mcp.exchanges.ibkr.ws import IBKRWebSocket

Environment = Literal["testnet", "mainnet"]


def get_environment(request: Request) -> Environment:
    return cast(Environment, request.state.environment)


async def get_ibkr_client(
    request: Request, env: Environment = Depends(get_environment),
) -> IBKRClient:
    registry: ClientRegistry = request.app.state.registry
    return cast(IBKRClient, await registry.get("ibkr", env))


async def get_ibkr_ws(
    request: Request, env: Environment = Depends(get_environment),
) -> IBKRWebSocket:
    """Lazy-create singleton WS per env on first streaming call."""
    ws_dict = getattr(request.app.state, "ibkr_ws", None)
    if ws_dict is None:
        ws_dict = {}
        request.app.state.ibkr_ws = ws_dict
    if env not in ws_dict:
        client = await get_ibkr_client(request, env)
        settings = request.app.state.settings
        ws_url = (
            settings.ibkr.ws_url_testnet if env == "testnet"
            else settings.ibkr.ws_url_live
        )
        ws = IBKRWebSocket(
            signer=client.signer,
            ws_url=ws_url,
            base_url=client.base_url,
            max_subs=settings.ibkr.ws_max_subscriptions,
            idle_timeout_s=settings.ibkr.ws_idle_timeout_s,
        )
        await ws.start()
        ws_dict[env] = ws
    return ws_dict[env]


def _build_creds(request: Request) -> dict:
    settings = request.app.state.settings
    return {"max_leverage": settings.ibkr.max_leverage}


def make_router() -> APIRouter:
    r = APIRouter(prefix="/mcp-ibkr", tags=["ibkr"])

    # === READ tools ===

    @r.post("/tools/environment_info")
    async def _ei(request: Request, client: IBKRClient = Depends(get_ibkr_client)):
        return await t.environment_info(client, creds=_build_creds(request))

    @r.post("/tools/get_account")
    async def _ga(params: t.GetAccountReq, client: IBKRClient = Depends(get_ibkr_client)):
        return await t.get_account(client, params)

    @r.post("/tools/get_positions")
    async def _gp(params: t.GetPositionsReq, client: IBKRClient = Depends(get_ibkr_client)):
        return await t.get_positions(client, params)

    @r.post("/tools/get_open_orders")
    async def _goo(params: t.GetOpenOrdersReq, client: IBKRClient = Depends(get_ibkr_client)):
        return await t.get_open_orders(client, params)

    @r.post("/tools/get_activities")
    async def _gact(params: t.GetActivitiesReq, client: IBKRClient = Depends(get_ibkr_client)):
        return await t.get_activities(client, params)

    @r.post("/tools/get_ticker")
    async def _gt(params: t.GetTickerReq, client: IBKRClient = Depends(get_ibkr_client)):
        return await t.get_ticker(client, params)

    @r.post("/tools/get_bars")
    async def _gb(params: t.GetBarsReq, client: IBKRClient = Depends(get_ibkr_client)):
        return await t.get_bars(client, params)

    @r.post("/tools/get_snapshot")
    async def _gs(params: t.GetSnapshotReq, client: IBKRClient = Depends(get_ibkr_client)):
        return await t.get_snapshot(client, params)

    @r.post("/tools/get_option_chain")
    async def _goc(params: t.GetOptionChainReq, client: IBKRClient = Depends(get_ibkr_client)):
        return await t.get_option_chain(client, params)

    @r.post("/tools/search_contracts")
    async def _sc(params: t.SearchContractsReq, client: IBKRClient = Depends(get_ibkr_client)):
        return await t.search_contracts(client, params)

    @r.post("/tools/get_clock")
    async def _gc(params: t.GetClockReq, client: IBKRClient = Depends(get_ibkr_client)):
        return await t.get_clock(client, params)

    # === STREAMING tools ===

    @r.post("/tools/get_tick")
    async def _gtk(
        params: t.GetTickReq,
        client: IBKRClient = Depends(get_ibkr_client),
        ws: IBKRWebSocket = Depends(get_ibkr_ws),
    ):
        return await t.get_tick(client, params, ws=ws)

    @r.post("/tools/get_depth")
    async def _gd(
        params: t.GetDepthReq,
        client: IBKRClient = Depends(get_ibkr_client),
        ws: IBKRWebSocket = Depends(get_ibkr_ws),
    ):
        return await t.get_depth(client, params, ws=ws)

    @r.post("/tools/subscribe_tick")
    async def _st(
        params: t.SubscribeTickReq,
        client: IBKRClient = Depends(get_ibkr_client),
        ws: IBKRWebSocket = Depends(get_ibkr_ws),
    ):
        return await t.subscribe_tick(client, params, ws=ws)

    @r.post("/tools/unsubscribe")
    async def _us(
        params: t.UnsubscribeReq,
        client: IBKRClient = Depends(get_ibkr_client),
        ws: IBKRWebSocket = Depends(get_ibkr_ws),
    ):
        return await t.unsubscribe(client, params, ws=ws)

    # === WRITE simple ===

    @r.post("/tools/place_order")
    async def _po(
        params: t.PlaceOrderReq, request: Request,
        client: IBKRClient = Depends(get_ibkr_client),
    ):
        creds = _build_creds(request)
        return await audit_call(
            request=request, exchange="ibkr", action="place_order",
            target_field="symbol", params=params,
            tool_fn=lambda: t.place_order(client, params, creds=creds),
        )

    @r.post("/tools/amend_order")
    async def _ao(
        params: t.AmendOrderReq, request: Request,
        client: IBKRClient = Depends(get_ibkr_client),
    ):
        return await audit_call(
            request=request, exchange="ibkr", action="amend_order",
            target_field="order_id", params=params,
            tool_fn=lambda: t.amend_order(client, params),
        )

    @r.post("/tools/cancel_order")
    async def _co(
        params: t.CancelOrderReq, request: Request,
        client: IBKRClient = Depends(get_ibkr_client),
    ):
        return await audit_call(
            request=request, exchange="ibkr", action="cancel_order",
            target_field="order_id", params=params,
            tool_fn=lambda: t.cancel_order(client, params),
        )

    @r.post("/tools/cancel_all_orders")
    async def _cao(
        params: t.CancelAllOrdersReq, request: Request,
        client: IBKRClient = Depends(get_ibkr_client),
    ):
        return await audit_call(
            request=request, exchange="ibkr", action="cancel_all_orders",
            params=params, tool_fn=lambda: t.cancel_all_orders(client, params),
        )

    @r.post("/tools/close_position")
    async def _cp(
        params: t.ClosePositionReq, request: Request,
        client: IBKRClient = Depends(get_ibkr_client),
    ):
        return await audit_call(
            request=request, exchange="ibkr", action="close_position",
            target_field="symbol", params=params,
            tool_fn=lambda: t.close_position(client, params),
        )

    @r.post("/tools/close_all_positions")
    async def _cap(
        params: t.CloseAllPositionsReq, request: Request,
        client: IBKRClient = Depends(get_ibkr_client),
    ):
        return await audit_call(
            request=request, exchange="ibkr", action="close_all_positions",
            params=params, tool_fn=lambda: t.close_all_positions(client, params),
        )

    # === WRITE complex ===

    @r.post("/tools/place_bracket_order")
    async def _pbo(
        params: t.PlaceBracketOrderReq, request: Request,
        client: IBKRClient = Depends(get_ibkr_client),
    ):
        creds = _build_creds(request)
        return await audit_call(
            request=request, exchange="ibkr", action="place_bracket_order",
            target_field="symbol", params=params,
            tool_fn=lambda: t.place_bracket_order(client, params, creds=creds),
        )

    @r.post("/tools/place_oco_order")
    async def _poco(
        params: t.PlaceOcoOrderReq, request: Request,
        client: IBKRClient = Depends(get_ibkr_client),
    ):
        creds = _build_creds(request)
        return await audit_call(
            request=request, exchange="ibkr", action="place_oco_order",
            params=params,
            tool_fn=lambda: t.place_oco_order(client, params, creds=creds),
        )

    @r.post("/tools/place_oto_order")
    async def _poto(
        params: t.PlaceOtoOrderReq, request: Request,
        client: IBKRClient = Depends(get_ibkr_client),
    ):
        creds = _build_creds(request)
        return await audit_call(
            request=request, exchange="ibkr", action="place_oto_order",
            params=params,
            tool_fn=lambda: t.place_oto_order(client, params, creds=creds),
        )

    return r
+119
-2
@@ -1,10 +1,22 @@
"""Pydantic Settings: reads .env and environment variables."""
from __future__ import annotations

from typing import TypedDict

from pydantic import Field, SecretStr
from pydantic_settings import BaseSettings, SettingsConfigDict


class IBKRCredentials(TypedDict):
    consumer_key: str
    access_token: str
    access_token_secret: str
    signature_key_path: str
    encryption_key_path: str
    account_id: str
    dh_prime: str


class _Sub(BaseSettings):
    """Base for sub-settings; shares model_config with env_file."""
    model_config = SettingsConfigDict(
@@ -21,12 +33,33 @@ class DeribitSettings(_Sub):
        env_prefix="DERIBIT_",
        extra="ignore",
    )
    client_id: str
    client_secret: SecretStr
    client_id: str | None = None
    client_secret: SecretStr | None = None
    client_id_testnet: str | None = None
    client_secret_testnet: SecretStr | None = None
    client_id_live: str | None = None
    client_secret_live: SecretStr | None = None
    url_live: str
    url_testnet: str
    max_leverage: int = 3

    def credentials(self, env: str) -> tuple[str, str]:
        """Return (client_id, client_secret) for the given env.
        Prefers the env-specific (_TESTNET / _LIVE) pair; falls back to the base
        (DERIBIT_CLIENT_ID / DERIBIT_CLIENT_SECRET) for legacy single-pair setups.
        """
        if env == "testnet":
            cid = self.client_id_testnet or self.client_id
            csec = self.client_secret_testnet or self.client_secret
        elif env == "mainnet":
            cid = self.client_id_live or self.client_id
            csec = self.client_secret_live or self.client_secret
        else:
            raise ValueError(f"unknown deribit env: {env}")
        if not cid or csec is None:
            raise ValueError(f"Deribit credentials not configured for env={env}")
        return cid, csec.get_secret_value()


class BybitSettings(_Sub):
    model_config = SettingsConfigDict(
@@ -71,6 +104,89 @@ class AlpacaSettings(_Sub):
    max_leverage: int = 1


class IBKRSettings(_Sub):
    model_config = SettingsConfigDict(
        env_file=".env",
        env_file_encoding="utf-8",
        env_prefix="IBKR_",
        extra="ignore",
    )
    consumer_key: str | None = None
    access_token: str | None = None
    access_token_secret: SecretStr | None = None
    signature_key_path: str | None = None
    encryption_key_path: str | None = None
    dh_prime: SecretStr | None = None

    consumer_key_testnet: str | None = None
    access_token_testnet: str | None = None
    access_token_secret_testnet: SecretStr | None = None
    signature_key_path_testnet: str | None = None
    encryption_key_path_testnet: str | None = None
    # account_id has no base variant: paper and live accounts are always distinct
    account_id_testnet: str | None = None

    consumer_key_live: str | None = None
    access_token_live: str | None = None
    access_token_secret_live: SecretStr | None = None
    signature_key_path_live: str | None = None
    encryption_key_path_live: str | None = None
    account_id_live: str | None = None

    url_live: str = "https://api.ibkr.com/v1/api"
    url_testnet: str = "https://api.ibkr.com/v1/api"
    ws_url_live: str = "wss://api.ibkr.com/v1/api/ws"
    ws_url_testnet: str = "wss://api.ibkr.com/v1/api/ws"
    max_leverage: int = 4
    ws_max_subscriptions: int = 80
    ws_idle_timeout_s: int = 300

    def credentials(self, env: str) -> IBKRCredentials:
        """Return the credential dict for the given env.
        Prefers env-specific (_TESTNET / _LIVE) values; falls back to base
        (IBKR_CONSUMER_KEY etc.) for legacy single-pair setups.
        Raises ValueError if any required field is missing.
        """
        if env == "testnet":
            ck = self.consumer_key_testnet or self.consumer_key
            at = self.access_token_testnet or self.access_token
            ats = self.access_token_secret_testnet or self.access_token_secret
            sigp = self.signature_key_path_testnet or self.signature_key_path
            encp = self.encryption_key_path_testnet or self.encryption_key_path
            acct = self.account_id_testnet
        elif env == "mainnet":
            ck = self.consumer_key_live or self.consumer_key
            at = self.access_token_live or self.access_token
            ats = self.access_token_secret_live or self.access_token_secret
            sigp = self.signature_key_path_live or self.signature_key_path
            encp = self.encryption_key_path_live or self.encryption_key_path
            acct = self.account_id_live
        else:
            raise ValueError(f"unknown ibkr env: {env}")

        missing = [
            n for n, v in [
                ("consumer_key", ck), ("access_token", at),
                ("access_token_secret", ats), ("signature_key_path", sigp),
                ("encryption_key_path", encp), ("account_id", acct),
                ("dh_prime", self.dh_prime),
            ] if not v
        ]
        if missing:
            raise ValueError(
                f"IBKR credentials not configured for env={env}: missing {missing}"
            )
        return {
            "consumer_key": ck,
            "access_token": at,
            "access_token_secret": ats.get_secret_value(),  # type: ignore[union-attr]
            "signature_key_path": sigp,
            "encryption_key_path": encp,
            "account_id": acct,
            "dh_prime": self.dh_prime.get_secret_value(),  # type: ignore[union-attr]
        }


class MacroSettings(_Sub):
    model_config = SettingsConfigDict(
        env_file=".env",
@@ -103,5 +219,6 @@ class Settings(_Sub):
    bybit: BybitSettings = Field(default_factory=lambda: BybitSettings())  # type: ignore[call-arg]
    hyperliquid: HyperliquidSettings = Field(default_factory=lambda: HyperliquidSettings())  # type: ignore[call-arg]
    alpaca: AlpacaSettings = Field(default_factory=lambda: AlpacaSettings())  # type: ignore[call-arg]
    ibkr: IBKRSettings = Field(default_factory=lambda: IBKRSettings())  # type: ignore[call-arg]
    macro: MacroSettings = Field(default_factory=lambda: MacroSettings())  # type: ignore[call-arg]
    sentiment: SentimentSettings = Field(default_factory=lambda: SentimentSettings())  # type: ignore[call-arg]
@@ -0,0 +1,72 @@
from __future__ import annotations

import pytest
from cerbero_mcp.common.candles import Candle, validate_candles
from fastapi import HTTPException


def test_valid_candle():
    c = Candle(timestamp=1_700_000_000_000, open=100.0, high=110.0,
               low=95.0, close=105.0, volume=12.5)
    assert c.high == 110.0


def test_high_below_close_rejected():
    with pytest.raises(ValueError):
        Candle(timestamp=1, open=100, high=90, low=80, close=95, volume=1)


def test_high_below_open_rejected():
    with pytest.raises(ValueError):
        Candle(timestamp=1, open=100, high=90, low=80, close=85, volume=1)


def test_low_above_close_rejected():
    with pytest.raises(ValueError):
        Candle(timestamp=1, open=100, high=110, low=105, close=102, volume=1)


def test_low_above_open_rejected():
    with pytest.raises(ValueError):
        Candle(timestamp=1, open=95, high=110, low=100, close=105, volume=1)


def test_negative_volume_rejected():
    with pytest.raises(ValueError):
        Candle(timestamp=1, open=100, high=110, low=90, close=105, volume=-1)


def test_non_positive_timestamp_rejected():
    with pytest.raises(ValueError):
        Candle(timestamp=0, open=100, high=110, low=90, close=105, volume=1)


def test_validate_candles_sorts_by_timestamp():
    raw = [
        {"timestamp": 3, "open": 1, "high": 2, "low": 1, "close": 1, "volume": 0},
        {"timestamp": 1, "open": 1, "high": 2, "low": 1, "close": 1, "volume": 0},
        {"timestamp": 2, "open": 1, "high": 2, "low": 1, "close": 1, "volume": 0},
    ]
    out = validate_candles(raw)
    assert [c["timestamp"] for c in out] == [1, 2, 3]


def test_validate_candles_coerces_string_numerics():
    raw = [{"timestamp": "1", "open": "100", "high": "110",
            "low": "90", "close": "105", "volume": "10"}]
    out = validate_candles(raw)
    assert out[0]["open"] == 100.0
    assert isinstance(out[0]["volume"], float)


def test_validate_candles_malformed_raises_http_502():
    raw = [{"timestamp": 1, "open": 100, "high": 50, "low": 90,
            "close": 105, "volume": 1}]
    with pytest.raises(HTTPException) as exc_info:
        validate_candles(raw)
    assert exc_info.value.status_code == 502
    assert "candle" in str(exc_info.value.detail).lower()


def test_validate_candles_empty_list():
    assert validate_candles([]) == []
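The candle tests above pin down the `Candle` invariants: `high >= max(open, close)`, `low <= min(open, close)`, `volume >= 0`, and `timestamp > 0`. A standalone sketch of the same checks (an assumed plain-dict shape; the real model lives in `cerbero_mcp.common.candles`):

```python
def check_candle(c: dict) -> bool:
    """Return True iff the dict satisfies the OHLCV invariants tested above."""
    try:
        ts = int(c["timestamp"])
        o, h, l, cl, v = (float(c[k]) for k in ("open", "high", "low", "close", "volume"))
    except (KeyError, TypeError, ValueError):
        return False
    return ts > 0 and v >= 0 and h >= max(o, cl) and l <= min(o, cl)


assert check_candle({"timestamp": 1, "open": 100, "high": 110, "low": 90,
                     "close": 105, "volume": 1})
# high below close → rejected, mirroring test_high_below_close_rejected
assert not check_candle({"timestamp": 1, "open": 100, "high": 90, "low": 80,
                         "close": 95, "volume": 1})
```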
@@ -271,8 +271,8 @@ async def test_get_bars_stocks(httpx_mock: HTTPXMock, client: AlpacaClient):
    )
    assert result["symbol"] == "AAPL"
    assert result["interval"] == "1d"
    assert len(result["bars"]) == 1
    assert result["bars"][0]["close"] == 175.0
    assert len(result["candles"]) == 1
    assert result["candles"][0]["close"] == 175.0


@pytest.mark.asyncio
@@ -0,0 +1,134 @@
from __future__ import annotations

from typing import Any

import pytest
from cerbero_mcp.exchanges.cross.client import CrossClient
from fastapi import HTTPException


class _Fake:
    def __init__(self, candles: list[dict[str, Any]] | None = None,
                 *, raises: Exception | None = None):
        self._candles = candles or []
        self._raises = raises
        self.calls: list[dict[str, Any]] = []

    async def get_historical(self, **kwargs: Any) -> dict[str, Any]:
        if self._raises:
            raise self._raises
        self.calls.append(kwargs)
        return {"candles": list(self._candles)}

    async def get_bars(self, **kwargs: Any) -> dict[str, Any]:
        if self._raises:
            raise self._raises
        self.calls.append(kwargs)
        return {"candles": list(self._candles)}


class _FakeRegistry:
    def __init__(self, clients: dict[str, _Fake]):
        self._clients = clients

    async def get(self, exchange: str, env: str) -> _Fake:
        if exchange not in self._clients:
            raise KeyError(exchange)
        return self._clients[exchange]


def _c(ts: int, close: float = 100.0) -> dict[str, Any]:
    return {"timestamp": ts, "open": close, "high": close, "low": close,
            "close": close, "volume": 1.0}


@pytest.mark.asyncio
async def test_crypto_three_sources_aggregates():
    fakes = {
        "bybit": _Fake([_c(1, 100), _c(2, 200)]),
        "hyperliquid": _Fake([_c(1, 100), _c(2, 200)]),
        "deribit": _Fake([_c(1, 100), _c(2, 200)]),
    }
    cc = CrossClient(_FakeRegistry(fakes), env="mainnet")
    out = await cc.get_historical(
        symbol="BTC", asset_class="crypto", interval="1h",
        start_date="2026-05-09T00:00:00", end_date="2026-05-10T00:00:00",
    )
    assert out["symbol"] == "BTC"
    assert out["asset_class"] == "crypto"
    assert len(out["candles"]) == 2
    assert out["candles"][0]["sources"] == 3
    assert out["candles"][0]["div_pct"] == 0.0
    assert set(out["sources_used"]) == {"bybit", "hyperliquid", "deribit"}
    assert out["failed_sources"] == []


@pytest.mark.asyncio
async def test_crypto_partial_failure_returns_partial_with_warning():
    fakes = {
        "bybit": _Fake([_c(1, 100)]),
        "hyperliquid": _Fake([_c(1, 100)]),
        "deribit": _Fake(raises=RuntimeError("upstream down")),
    }
    cc = CrossClient(_FakeRegistry(fakes), env="mainnet")
    out = await cc.get_historical(
        symbol="BTC", asset_class="crypto", interval="1h",
        start_date="2026-05-09T00:00:00", end_date="2026-05-10T00:00:00",
    )
    assert out["candles"][0]["sources"] == 2
    assert set(out["sources_used"]) == {"bybit", "hyperliquid"}
    assert len(out["failed_sources"]) == 1
    assert out["failed_sources"][0]["exchange"] == "deribit"
    assert "upstream down" in out["failed_sources"][0]["error"]


@pytest.mark.asyncio
async def test_all_sources_fail_raises_502():
    fakes = {
        "bybit": _Fake(raises=RuntimeError("a")),
        "hyperliquid": _Fake(raises=RuntimeError("b")),
        "deribit": _Fake(raises=RuntimeError("c")),
    }
    cc = CrossClient(_FakeRegistry(fakes), env="mainnet")
    with pytest.raises(HTTPException) as exc_info:
        await cc.get_historical(
            symbol="BTC", asset_class="crypto", interval="1h",
            start_date="2026-05-09T00:00:00", end_date="2026-05-10T00:00:00",
        )
    assert exc_info.value.status_code == 502


@pytest.mark.asyncio
async def test_unsupported_symbol_raises_400():
    cc = CrossClient(_FakeRegistry({}), env="mainnet")
    with pytest.raises(HTTPException) as exc_info:
        await cc.get_historical(
            symbol="NONEXISTENT", asset_class="crypto", interval="1h",
            start_date="2026-05-09T00:00:00", end_date="2026-05-10T00:00:00",
        )
    assert exc_info.value.status_code == 400


@pytest.mark.asyncio
async def test_stocks_routes_to_alpaca_only():
    fake = _Fake([_c(1, 175.0)])
    cc = CrossClient(_FakeRegistry({"alpaca": fake}), env="mainnet")
    out = await cc.get_historical(
        symbol="AAPL", asset_class="stocks", interval="1d",
        start_date="2026-04-09T00:00:00", end_date="2026-04-10T00:00:00",
    )
    assert out["sources_used"] == ["alpaca"]
    assert out["candles"][0]["close"] == 175.0
    # Alpaca was called with native symbol
    assert fake.calls[0]["symbol"] == "AAPL"


@pytest.mark.asyncio
async def test_unsupported_interval_raises_400():
    cc = CrossClient(_FakeRegistry({}), env="mainnet")
    with pytest.raises(HTTPException) as exc_info:
        await cc.get_historical(
            symbol="BTC", asset_class="crypto", interval="3h",
            start_date="2026-05-09T00:00:00", end_date="2026-05-10T00:00:00",
        )
    assert exc_info.value.status_code == 400
@@ -0,0 +1,90 @@
from __future__ import annotations

from cerbero_mcp.exchanges.cross.consensus import merge_candles


def _c(ts, o, h, l, c, v):
    return {"timestamp": ts, "open": o, "high": h, "low": l, "close": c, "volume": v}


def test_empty_input():
    assert merge_candles({}) == []


def test_single_source_passthrough():
    out = merge_candles({"bybit": [_c(1, 100, 110, 90, 105, 5)]})
    assert len(out) == 1
    assert out[0]["timestamp"] == 1
    assert out[0]["close"] == 105
    assert out[0]["sources"] == 1
    assert out[0]["div_pct"] == 0.0


def test_three_sources_identical_no_divergence():
    src = {
        "bybit": [_c(1, 100, 110, 90, 105, 5)],
        "hyperliquid": [_c(1, 100, 110, 90, 105, 3)],
        "deribit": [_c(1, 100, 110, 90, 105, 7)],
    }
    out = merge_candles(src)
    assert len(out) == 1
    assert out[0]["close"] == 105.0
    assert out[0]["sources"] == 3
    assert out[0]["div_pct"] == 0.0
    # volume is mean across sources
    assert abs(out[0]["volume"] - 5.0) < 1e-9


def test_three_sources_divergent_close():
    src = {
        "bybit": [_c(1, 100, 110, 90, 100, 1)],
        "hyperliquid": [_c(1, 100, 110, 90, 110, 1)],
        "deribit": [_c(1, 100, 110, 90, 105, 1)],
    }
    out = merge_candles(src)
    # median of [100, 110, 105] = 105
    assert out[0]["close"] == 105.0
    # div_pct = (110 - 100) / 105 ≈ 0.0952
    assert abs(out[0]["div_pct"] - 10 / 105) < 1e-6
    assert out[0]["sources"] == 3


def test_misaligned_timestamps():
    src = {
        "bybit": [_c(1, 100, 110, 90, 105, 1), _c(2, 100, 110, 90, 105, 1)],
        "hyperliquid": [_c(2, 100, 110, 90, 105, 1), _c(3, 100, 110, 90, 105, 1)],
    }
    out = merge_candles(src)
    timestamps = [c["timestamp"] for c in out]
    sources_by_ts = {c["timestamp"]: c["sources"] for c in out}
    assert timestamps == [1, 2, 3]
    assert sources_by_ts == {1: 1, 2: 2, 3: 1}


def test_two_sources_even_median():
    src = {
        "bybit": [_c(1, 100, 110, 90, 100, 1)],
        "hyperliquid": [_c(1, 100, 110, 90, 110, 1)],
    }
    out = merge_candles(src)
    # even median = mean of two = 105
    assert out[0]["close"] == 105.0


def test_empty_source_ignored():
    src = {
        "bybit": [_c(1, 100, 110, 90, 105, 1)],
        "hyperliquid": [],
    }
    out = merge_candles(src)
    assert len(out) == 1
    assert out[0]["sources"] == 1


def test_output_sorted_by_timestamp():
    src = {
        "bybit": [_c(3, 100, 110, 90, 105, 1), _c(1, 100, 110, 90, 105, 1),
                  _c(2, 100, 110, 90, 105, 1)],
    }
    out = merge_candles(src)
    assert [c["timestamp"] for c in out] == [1, 2, 3]
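The consensus behavior these tests specify — per-timestamp median of each OHLC field, mean volume, `div_pct = (max_close - min_close) / median_close`, a source count, and timestamp-sorted output — can be sketched as follows. This is an illustration consistent with the tests above, not the actual `consensus` module:

```python
from statistics import mean, median


def merge_candles_sketch(sources: dict[str, list[dict]]) -> list[dict]:
    # Bucket candles by timestamp across all (non-empty) sources.
    by_ts: dict[int, list[dict]] = {}
    for candles in sources.values():
        for c in candles:
            by_ts.setdefault(c["timestamp"], []).append(c)

    out = []
    for ts in sorted(by_ts):
        group = by_ts[ts]
        closes = [c["close"] for c in group]
        med_close = median(closes)
        out.append({
            "timestamp": ts,
            "open": median(c["open"] for c in group),
            "high": median(c["high"] for c in group),
            "low": median(c["low"] for c in group),
            "close": med_close,
            "volume": mean(c["volume"] for c in group),  # mean, not median
            "sources": len(group),
            # spread of closes relative to the consensus close
            "div_pct": (max(closes) - min(closes)) / med_close if med_close else 0.0,
        })
    return out
```

With two sources disagreeing on close (100 vs 110), this yields the even-element median 105.0 and `div_pct = 10/105`, matching `test_two_sources_even_median` and `test_three_sources_divergent_close` above.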
@@ -0,0 +1,47 @@
from __future__ import annotations

import pytest
from cerbero_mcp.exchanges.cross.symbol_map import (
    get_sources,
    to_native_interval,
    to_native_symbol,
)


def test_btc_crypto_sources():
    assert set(get_sources("crypto", "BTC")) == {"bybit", "hyperliquid", "deribit"}


def test_eth_crypto_sources():
    assert set(get_sources("crypto", "ETH")) == {"bybit", "hyperliquid", "deribit"}


def test_unknown_crypto_symbol_returns_empty():
    assert get_sources("crypto", "DOGEFAKE") == []


def test_stocks_aapl_sources():
    assert set(get_sources("stocks", "AAPL")) == {"alpaca"}


def test_native_symbol_btc():
    assert to_native_symbol("crypto", "BTC", "bybit") == "BTCUSDT"
    assert to_native_symbol("crypto", "BTC", "hyperliquid") == "BTC"
    assert to_native_symbol("crypto", "BTC", "deribit") == "BTC-PERPETUAL"


def test_native_symbol_unsupported_pair_raises():
    with pytest.raises(KeyError):
        to_native_symbol("crypto", "BTC", "alpaca")


def test_native_interval_1h():
    assert to_native_interval("1h", "bybit") == "60"
    assert to_native_interval("1h", "hyperliquid") == "1h"
    assert to_native_interval("1h", "deribit") == "1h"
    assert to_native_interval("1h", "alpaca") == "1h"


def test_native_interval_unknown_canonical_raises():
    with pytest.raises(KeyError):
        to_native_interval("3h", "bybit")
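The symbol/interval mapping these tests exercise is plain table lookup: a canonical `(asset_class, symbol)` pair maps to per-exchange native names, and unsupported combinations surface as `KeyError`. A minimal sketch, with hypothetical tables seeded only from the values the tests assert:

```python
# Hypothetical mapping tables mirroring the test expectations; the real
# symbol_map module presumably carries many more entries.
SYMBOLS: dict[tuple[str, str], dict[str, str]] = {
    ("crypto", "BTC"): {"bybit": "BTCUSDT", "hyperliquid": "BTC", "deribit": "BTC-PERPETUAL"},
    ("stocks", "AAPL"): {"alpaca": "AAPL"},
}
INTERVALS: dict[str, dict[str, str]] = {
    "1h": {"bybit": "60", "hyperliquid": "1h", "deribit": "1h", "alpaca": "1h"},
}


def get_sources_sketch(asset_class: str, symbol: str) -> list[str]:
    # Unknown symbols yield an empty list rather than raising.
    return list(SYMBOLS.get((asset_class, symbol), {}))


def to_native_symbol_sketch(asset_class: str, symbol: str, source: str) -> str:
    # KeyError propagates for unsupported (symbol, source) pairs.
    return SYMBOLS[(asset_class, symbol)][source]


def to_native_interval_sketch(interval: str, source: str) -> str:
    # KeyError propagates for unknown canonical intervals.
    return INTERVALS[interval][source]
```

Letting `KeyError` propagate keeps the mapping layer thin; callers that need a softer failure mode use `get_sources_sketch`, which degrades to an empty list.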
@@ -154,6 +154,47 @@ async def test_get_account_summary(httpx_mock: HTTPXMock, client: DeribitClient)
    assert result["balance"] == 900.0


@pytest.mark.asyncio
async def test_upstream_5xx_raises_clean_http_error(
    httpx_mock: HTTPXMock, client: DeribitClient
):
    """Upstream Deribit 5xx (non-JSON body) → HTTPException 502, not a JSONDecodeError."""
    from fastapi import HTTPException

    httpx_mock.add_response(
        url=re.compile(r"https://test\.deribit\.com/api/v2/public/get_tradingview_chart_data"),
        status_code=502,
        text="<html>Bad Gateway</html>",
    )
    with pytest.raises(HTTPException) as exc_info:
        await client.get_historical(
            instrument="BTC-PERPETUAL",
            start_date="2026-05-09T00:00:00",
            end_date="2026-05-10T00:00:00",
            resolution="1h",
        )
    assert exc_info.value.status_code == 502
    assert "Deribit upstream" in str(exc_info.value.detail)


@pytest.mark.asyncio
async def test_private_call_with_bad_auth_returns_error_envelope(
    httpx_mock: HTTPXMock, client: DeribitClient
):
    """Failed auth (bad creds / missing scope) → error envelope, not a KeyError."""
    httpx_mock.add_response(
        url=re.compile(r"https://test\.deribit\.com/api/v2/public/auth"),
        json={"error": {"code": 13004, "message": "invalid_credentials"}},
        is_reusable=True,
    )
    summary = await client.get_account_summary("USDC")
    assert summary["equity"] is None
    assert summary["balance"] is None
    assert "invalid_credentials" in summary["error"]
    positions = await client.get_positions("USDC")
    assert positions == []


@pytest.mark.asyncio
async def test_place_order(httpx_mock: HTTPXMock, client: DeribitClient):
    httpx_mock.add_response(
@@ -0,0 +1,232 @@
from __future__ import annotations

import re
from unittest.mock import AsyncMock, MagicMock

import pytest
from cerbero_mcp.exchanges.ibkr.client import IBKRClient, IBKRError
from pytest_httpx import HTTPXMock


@pytest.fixture
def fake_signer():
    s = MagicMock()
    s.consumer_key = "CK"
    s.access_token = "AT"
    s.get_live_session_token = AsyncMock(return_value="LSTBASE64==")
    s.sign_with_lst = MagicMock(return_value="SIG==")
    s.make_oauth_params = MagicMock(return_value={
        "oauth_consumer_key": "CK", "oauth_token": "AT",
        "oauth_nonce": "n", "oauth_timestamp": "1",
        "oauth_signature_method": "HMAC-SHA256", "oauth_version": "1.0",
    })
    return s


@pytest.fixture
def client(fake_signer):
    return IBKRClient(
        signer=fake_signer,
        account_id="DU1234",
        paper=True,
        base_url="https://api.ibkr.com/v1/api",
    )


@pytest.mark.asyncio
async def test_health_no_network(client):
    info = await client.health()
    assert info["status"] == "ok"
    assert info["paper"] is True


@pytest.mark.asyncio
async def test_get_account_summary(httpx_mock: HTTPXMock, client):
    httpx_mock.add_response(
        url=re.compile(r".*/portfolio/DU1234/summary"),
        json={"netliquidation": {"amount": 10000, "currency": "USD"}, "totalcashvalue": {"amount": 8000}},
    )
    httpx_mock.add_response(
        url=re.compile(r".*/tickle"),
        json={"session": "abc"},
    )
    data = await client.get_account()
    assert "netliquidation" in data


@pytest.mark.asyncio
async def test_request_retries_once_on_401(httpx_mock: HTTPXMock, client):
    httpx_mock.add_response(url=re.compile(r".*/tickle"), json={})
    # First call returns 401
    httpx_mock.add_response(
        url=re.compile(r".*/portfolio/DU1234/summary"),
        status_code=401, text="session expired",
    )
    # Second call (after LST refresh) succeeds
    httpx_mock.add_response(
        url=re.compile(r".*/portfolio/DU1234/summary"),
        json={"netliquidation": {"amount": 5000}},
    )
    data = await client.get_account()
    assert data["netliquidation"]["amount"] == 5000


@pytest.mark.asyncio
async def test_request_raises_on_persistent_401(httpx_mock: HTTPXMock, client):
    from cerbero_mcp.exchanges.ibkr.oauth import IBKRAuthError
    httpx_mock.add_response(url=re.compile(r".*/tickle"), json={})
    # Both attempts return 401
    httpx_mock.add_response(
        url=re.compile(r".*/portfolio/DU1234/summary"),
        status_code=401, text="bad creds",
    )
    httpx_mock.add_response(
        url=re.compile(r".*/portfolio/DU1234/summary"),
        status_code=401, text="bad creds",
    )
    with pytest.raises(IBKRAuthError, match="after retry"):
        await client.get_account()


@pytest.mark.asyncio
async def test_resolve_conid_caches(httpx_mock: HTTPXMock, client):
    httpx_mock.add_response(
        url=re.compile(r".*/tickle"), json={"session": "x"},
    )
    httpx_mock.add_response(
        url=re.compile(r".*/trsrv/secdef/search.*symbol=AAPL"),
        json=[{"conid": 265598, "symbol": "AAPL", "secType": "STK"}],
    )
    cid = await client.resolve_conid("AAPL", "STK")
    assert cid == 265598
    cid2 = await client.resolve_conid("AAPL", "STK")
    assert cid2 == 265598


@pytest.mark.asyncio
async def test_get_positions(httpx_mock: HTTPXMock, client):
    httpx_mock.add_response(url=re.compile(r".*/tickle"), json={})
    httpx_mock.add_response(
        url=re.compile(r".*/portfolio/DU1234/positions/0"),
        json=[{"conid": 265598, "position": 10, "mktPrice": 150}],
    )
    res = await client.get_positions()
    assert isinstance(res, list)
    assert res[0]["position"] == 10


@pytest.mark.asyncio
async def test_get_ticker_resolves_and_fetches(httpx_mock: HTTPXMock, client):
    httpx_mock.add_response(url=re.compile(r".*/tickle"), json={})
    httpx_mock.add_response(
        url=re.compile(r".*/trsrv/secdef/search.*symbol=AAPL"),
        json=[{"conid": 265598, "symbol": "AAPL", "secType": "STK"}],
    )
    httpx_mock.add_response(
        url=re.compile(r".*/iserver/marketdata/snapshot"),
        json=[{"31": "150.5", "84": "150.4", "86": "150.6", "conid": 265598}],
    )
    snap = await client.get_ticker("AAPL", "stocks")
    assert snap["last_price"] == 150.5
    assert snap["bid"] == 150.4
    assert snap["ask"] == 150.6


@pytest.mark.asyncio
async def test_resolve_conid_empty_response_raises(httpx_mock: HTTPXMock, client):
    httpx_mock.add_response(url=re.compile(r".*/tickle"), json={})
    httpx_mock.add_response(
        url=re.compile(r".*/trsrv/secdef/search.*symbol=NOPE"),
        json=[],
    )
    with pytest.raises(IBKRError, match="IBKR_CONID_NOT_FOUND"):
        await client.resolve_conid("NOPE", "STK")


@pytest.mark.asyncio
async def test_resolve_conid_malformed_response_raises(httpx_mock: HTTPXMock, client):
    httpx_mock.add_response(url=re.compile(r".*/tickle"), json={})
    httpx_mock.add_response(
        url=re.compile(r".*/trsrv/secdef/search.*symbol=BAD"),
        json=[{"symbol": "BAD"}],  # missing conid key
    )
    with pytest.raises(IBKRError, match="malformed"):
        await client.resolve_conid("BAD", "STK")


@pytest.mark.asyncio
async def test_place_order_auto_confirms_warning(httpx_mock: HTTPXMock, client):
    httpx_mock.add_response(url=re.compile(r".*/tickle"), json={})
    httpx_mock.add_response(
        url=re.compile(r".*/trsrv/secdef/search"),
        json=[{"conid": 265598, "symbol": "AAPL", "secType": "STK"}],
    )
    httpx_mock.add_response(
        url=re.compile(r".*/iserver/account/DU1234/orders$"),
        method="POST",
        json=[{"id": "msgid1", "message": ["outside RTH"]}],
    )
    httpx_mock.add_response(
        url=re.compile(r".*/iserver/reply/msgid1"),
        method="POST",
        json=[{"order_id": "OID42", "order_status": "Submitted"}],
    )
    res = await client.place_order(
        symbol="AAPL", side="buy", qty=1, order_type="market",
    )
    assert res["order_id"] == "OID42"


@pytest.mark.asyncio
async def test_place_order_rejects_critical_warning(httpx_mock: HTTPXMock, client):
    httpx_mock.add_response(url=re.compile(r".*/tickle"), json={})
    httpx_mock.add_response(
        url=re.compile(r".*/trsrv/secdef/search"),
        json=[{"conid": 265598, "symbol": "AAPL", "secType": "STK"}],
    )
    httpx_mock.add_response(
        url=re.compile(r".*/iserver/account/DU1234/orders$"),
        method="POST",
        json=[{"id": "msgid2", "message": ["Margin requirement exceeded"]}],
    )
    with pytest.raises(IBKRError, match="IBKR_ORDER_REJECTED_WARNING"):
        await client.place_order(
            symbol="AAPL", side="buy", qty=1000000, order_type="market",
        )


@pytest.mark.asyncio
async def test_cancel_order(httpx_mock: HTTPXMock, client):
    httpx_mock.add_response(url=re.compile(r".*/tickle"), json={})
    httpx_mock.add_response(
        url=re.compile(r".*/iserver/account/DU1234/order/OID42"),
        method="DELETE",
        json={"msg": "Request was submitted", "order_id": "OID42"},
    )
    res = await client.cancel_order("OID42")
    assert res["canceled"] is True


@pytest.mark.asyncio
async def test_place_order_too_many_confirmations(httpx_mock: HTTPXMock, client):
    httpx_mock.add_response(url=re.compile(r".*/tickle"), json={})
    httpx_mock.add_response(
        url=re.compile(r".*/trsrv/secdef/search"),
        json=[{"conid": 265598, "symbol": "AAPL", "secType": "STK"}],
    )
    # Initial place + 3 reply cycles all return new warnings — should fail at MAX_CYCLES
    httpx_mock.add_response(
        url=re.compile(r".*/iserver/account/DU1234/orders$"),
        method="POST",
        json=[{"id": "msg1", "message": ["outside RTH"]}],
    )
    for n in range(2, 5):
        httpx_mock.add_response(
            url=re.compile(rf".*/iserver/reply/msg{n-1}$"),
            method="POST",
            json=[{"id": f"msg{n}", "message": ["outside RTH"]}],
        )
    with pytest.raises(IBKRError, match="IBKR_ORDER_TOO_MANY_CONFIRMATIONS"):
        await client.place_order(
            symbol="AAPL", side="buy", qty=1, order_type="market",
        )
@@ -0,0 +1,80 @@
from __future__ import annotations

import pytest
from cerbero_mcp.exchanges.ibkr.key_rotation import KeyRotationManager


@pytest.mark.asyncio
async def test_start_generates_new_keypair_files(tmp_path):
    sig_path = tmp_path / "sig.pem"
    enc_path = tmp_path / "enc.pem"
    sig_path.write_bytes(b"old-sig")
    enc_path.write_bytes(b"old-enc")

    mgr = KeyRotationManager(
        signature_key_path=str(sig_path),
        encryption_key_path=str(enc_path),
    )
    out = await mgr.start()
    assert "sig" in out["fingerprints"]
    assert "enc" in out["fingerprints"]
    assert (tmp_path / "sig.pem.new").exists()
    assert (tmp_path / "enc.pem.new").exists()


@pytest.mark.asyncio
async def test_confirm_swap_and_validate_ok(tmp_path):
    sig_path = tmp_path / "sig.pem"
    enc_path = tmp_path / "enc.pem"
    sig_path.write_bytes(b"old-sig")
    enc_path.write_bytes(b"old-enc")

    mgr = KeyRotationManager(
        signature_key_path=str(sig_path),
        encryption_key_path=str(enc_path),
    )
    await mgr.start()

    async def fake_validate() -> bool:
        return True

    out = await mgr.confirm(validate=fake_validate)
    assert "rotated_at" in out
    assert (tmp_path / ".archive").exists()


@pytest.mark.asyncio
async def test_confirm_validate_fail_rollbacks(tmp_path):
    sig_path = tmp_path / "sig.pem"
    enc_path = tmp_path / "enc.pem"
    sig_path.write_bytes(b"old-sig")
    enc_path.write_bytes(b"old-enc")

    mgr = KeyRotationManager(
        signature_key_path=str(sig_path),
        encryption_key_path=str(enc_path),
    )
    await mgr.start()

    async def fake_validate() -> bool:
        return False

    with pytest.raises(RuntimeError, match="IBKR_ROTATION_VALIDATION_FAILED"):
        await mgr.confirm(validate=fake_validate)
    assert sig_path.read_bytes() == b"old-sig"
    assert enc_path.read_bytes() == b"old-enc"


@pytest.mark.asyncio
async def test_abort_cleans_new_files(tmp_path):
    sig_path = tmp_path / "sig.pem"
    enc_path = tmp_path / "enc.pem"
    sig_path.write_bytes(b"old-sig")
    enc_path.write_bytes(b"old-enc")

    mgr = KeyRotationManager(
        signature_key_path=str(sig_path),
        encryption_key_path=str(enc_path),
    )
    await mgr.start()
    await mgr.abort()
    assert not (tmp_path / "sig.pem.new").exists()
    assert not (tmp_path / "enc.pem.new").exists()
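The two-phase protocol these rotation tests describe (stage `.new` files, swap only after a validation callback succeeds, leave the old keys untouched on failure or abort) can be sketched roughly like this. Hypothetical sketch only: the real `KeyRotationManager` also archives old keys under `.archive` and reports fingerprints, which are omitted here.

```python
import asyncio
import os
import tempfile


class KeyRotationSketch:
    """Minimal two-phase key rotation: stage, validate, then swap or roll back."""

    def __init__(self, *paths: str) -> None:
        self.paths = paths

    async def start(self) -> None:
        # Stage replacement material as '<path>.new'; old keys stay in service.
        for p in self.paths:
            with open(p + ".new", "wb") as f:
                f.write(os.urandom(32))  # stand-in for a freshly generated PEM keypair

    async def confirm(self, validate) -> None:
        # Swap only if the caller-supplied validation (e.g. a signed test call) passes.
        if not await validate():
            await self.abort()  # roll back: the old keys were never touched
            raise RuntimeError("IBKR_ROTATION_VALIDATION_FAILED")
        for p in self.paths:
            os.replace(p + ".new", p)  # atomic promotion of the new key

    async def abort(self) -> None:
        for p in self.paths:
            if os.path.exists(p + ".new"):
                os.remove(p + ".new")
```

Because the live key file is only ever replaced after validation, a crash or failed check at any point leaves the previous credentials fully usable.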
@@ -0,0 +1,132 @@
from __future__ import annotations

import base64 as _b64
import re
import time

import pytest
from cerbero_mcp.exchanges.ibkr.oauth import (
    OAuth1aSigner,
    build_signature_base_string,
)
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding, rsa
from pytest_httpx import HTTPXMock


def test_signature_base_string_canonical_order():
    base = build_signature_base_string(
        method="POST",
        url="https://api.ibkr.com/v1/api/oauth/live_session_token",
        params={
            "oauth_consumer_key": "TEST_CONSUMER",
            "oauth_token": "TEST_TOKEN",
            "oauth_nonce": "abc123",
            "oauth_timestamp": "1700000000",
            "oauth_signature_method": "RSA-SHA256",
            "oauth_version": "1.0",
            "diffie_hellman_challenge": "ff00",
        },
    )
    assert base.startswith("POST&")
    assert "oauth_consumer_key%3DTEST_CONSUMER" in base
    idx_consumer = base.index("oauth_consumer_key")
    idx_token = base.index("oauth_token")
    assert idx_consumer < idx_token


def test_oauth_signer_signs_with_rsa(tmp_path):
    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    pem = key.private_bytes(
        encoding=serialization.Encoding.PEM,
        format=serialization.PrivateFormat.TraditionalOpenSSL,
        encryption_algorithm=serialization.NoEncryption(),
    )
    sig_path = tmp_path / "sig.pem"
    sig_path.write_bytes(pem)
    enc_path = tmp_path / "enc.pem"
    enc_path.write_bytes(pem)

    signer = OAuth1aSigner(
        consumer_key="TEST_CONSUMER",
        access_token="TEST_TOKEN",
        access_token_secret="TEST_SECRET",
        signature_key_path=str(sig_path),
        encryption_key_path=str(enc_path),
        dh_prime="FFFFFFFFFFFFFFFFC90FDAA22168C234C4C6628B80DC1CD129024E088A67CC74020BBEA63B139B22514A08798E3404DDEF9519B3CD3A431B302B0A6DF25F14374FE1356D6D51C245E485B576625E7EC6F44C42E9A637ED6B0BFF5CB6F406B7EDEE386BFB5A899FA5AE9F24117C4B1FE649286651ECE45B3DC2007CB8A163BF0598DA48361C55D39A69163FA8FD24CF5F83655D23DCA3AD961C62F356208552BB9ED529077096966D670C354E4ABC9804F1746C08CA18217C32905E462E36CE3BE39E772C180E86039B2783A2EC07A28FB5C55DF06F4C52C9DE2BCBF6955817183995497CEA956AE515D2261898FA051015728E5A8AACAA68FFFFFFFFFFFFFFFF",
    )
    sig = signer.sign(
        method="GET",
        url="https://api.ibkr.com/v1/api/iserver/auth/status",
        params={
            "oauth_consumer_key": "TEST_CONSUMER",
            "oauth_token": "TEST_TOKEN",
            "oauth_nonce": "abc",
            "oauth_timestamp": "1700000000",
            "oauth_signature_method": "RSA-SHA256",
            "oauth_version": "1.0",
        },
    )

    # Verify signature against the public key — proves correctness, not just shape
    base = build_signature_base_string(
        method="GET",
        url="https://api.ibkr.com/v1/api/iserver/auth/status",
        params={
            "oauth_consumer_key": "TEST_CONSUMER",
            "oauth_token": "TEST_TOKEN",
            "oauth_nonce": "abc",
            "oauth_timestamp": "1700000000",
            "oauth_signature_method": "RSA-SHA256",
            "oauth_version": "1.0",
        },
    )
    key.public_key().verify(
        _b64.b64decode(sig),
        base.encode("utf-8"),
        padding.PKCS1v15(),
        hashes.SHA256(),
    )


@pytest.mark.asyncio
async def test_live_session_token_mint(httpx_mock: HTTPXMock, tmp_path):
    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    pem = key.private_bytes(
        encoding=serialization.Encoding.PEM,
        format=serialization.PrivateFormat.TraditionalOpenSSL,
        encryption_algorithm=serialization.NoEncryption(),
    )
    (tmp_path / "sig.pem").write_bytes(pem)
    (tmp_path / "enc.pem").write_bytes(pem)

    # Encrypt a fake "real" secret with our public key
    raw_secret = b"my_real_token_secret"
    encrypted = key.public_key().encrypt(raw_secret, padding.PKCS1v15())
    encrypted_hex = encrypted.hex()

    httpx_mock.add_response(
        url=re.compile(r".*/oauth/live_session_token"),
        json={
            "diffie_hellman_response": "ff",
            "live_session_token_signature": "00" * 20,
            "live_session_token_expiration": int(time.time() * 1000) + 86400000,
        },
    )

    signer = OAuth1aSigner(
        consumer_key="TEST_CK",
        access_token="TEST_AT",
        access_token_secret=encrypted_hex,
        signature_key_path=str(tmp_path / "sig.pem"),
        encryption_key_path=str(tmp_path / "enc.pem"),
        dh_prime="17",  # hex 0x17 = 23: a tiny prime keeps the DH exchange trivial in tests
    )
    lst = await signer.get_live_session_token(
        base_url="https://api.ibkr.com/v1/api"
    )
    assert isinstance(lst, str) and len(lst) > 0
    lst2 = await signer.get_live_session_token(
        base_url="https://api.ibkr.com/v1/api"
    )
    assert lst == lst2  # cached
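The canonical-order test above pins down the RFC 5849 signature base string: percent-encode each key and value, sort lexicographically, join with `=`/`&`, then percent-encode the whole parameter string once more. A minimal sketch of that construction (hypothetical, not the repo's `build_signature_base_string`):

```python
from urllib.parse import quote


def build_base_string_sketch(method: str, url: str, params: dict[str, str]) -> str:
    """RFC 5849 §3.4.1 signature base string: METHOD & enc(url) & enc(sorted params)."""
    def enc(s: str) -> str:
        return quote(str(s), safe="")  # strict percent-encoding, no reserved chars

    # Encode first, then sort on the encoded pairs (the canonical ordering).
    pairs = sorted((enc(k), enc(v)) for k, v in params.items())
    param_str = "&".join(f"{k}={v}" for k, v in pairs)
    # The parameter string is encoded again, so '=' becomes %3D and '&' becomes %26.
    return f"{method.upper()}&{enc(url)}&{enc(param_str)}"
```

The double encoding is why the test looks for `oauth_consumer_key%3DTEST_CONSUMER` rather than a literal `=`, and sorting on encoded keys is what puts `oauth_consumer_key` before `oauth_token`.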
@@ -0,0 +1,44 @@
from __future__ import annotations

from cerbero_mcp.exchanges.ibkr.orders_complex import (
    OrderSpec,
    build_bracket_payload,
    build_oco_payload,
)


def test_bracket_three_legs_with_oca():
    payload = build_bracket_payload(
        conid=42, sec_type="STK", side="BUY", qty=10,
        entry_price=150.0, stop_loss=145.0, take_profit=160.0,
        tif="GTC", exchange="SMART",
    )
    assert "orders" in payload
    legs = payload["orders"]
    assert len(legs) == 3
    oca = legs[0].get("ocaGroup")
    assert oca and all(l.get("ocaGroup") == oca for l in legs[1:])
    assert legs[0]["orderType"] == "LMT"
    assert legs[0]["price"] == 150.0
    assert legs[0]["side"] == "BUY"
    assert legs[1]["side"] == "SELL"
    assert legs[2]["side"] == "SELL"
    assert legs[1]["orderType"] == "STP"
    assert legs[1]["auxPrice"] == 145.0
    assert legs[2]["orderType"] == "LMT"
    assert legs[2]["price"] == 160.0


def test_oco_oca_group_and_type():
    legs = [
        OrderSpec(conid=1, sec_type="STK", side="BUY", qty=1,
                  order_type="LMT", price=100),
        OrderSpec(conid=1, sec_type="STK", side="BUY", qty=1,
                  order_type="LMT", price=110),
    ]
    payload = build_oco_payload(legs)
    assert len(payload["orders"]) == 2
    oca = payload["orders"][0]["ocaGroup"]
    for o in payload["orders"]:
        assert o["ocaGroup"] == oca
        assert o["ocaType"] == 1
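The bracket shape these tests lock in is a limit entry plus two opposite-side exits (stop and take-profit) sharing one OCA group, so filling either exit cancels the other. A hypothetical sketch of such a builder, covering only the fields the tests assert on:

```python
import uuid


def build_bracket_sketch(conid: int, side: str, qty: float,
                         entry_price: float, stop_loss: float,
                         take_profit: float) -> dict:
    """Hypothetical bracket builder: entry LMT + STP and LMT exits in one OCA group."""
    oca = f"bracket-{uuid.uuid4().hex[:8]}"  # shared group: one exit cancels the other
    exit_side = "SELL" if side == "BUY" else "BUY"
    common = {"conid": conid, "quantity": qty, "ocaGroup": oca}
    return {"orders": [
        {**common, "side": side, "orderType": "LMT", "price": entry_price},
        {**common, "side": exit_side, "orderType": "STP", "auxPrice": stop_loss},
        {**common, "side": exit_side, "orderType": "LMT", "price": take_profit},
    ]}
```

Note the IBKR convention the tests encode: a stop's trigger goes in `auxPrice`, while limit orders carry `price`.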
@@ -0,0 +1,137 @@
from __future__ import annotations

from unittest.mock import AsyncMock, MagicMock

import pytest
from cerbero_mcp.exchanges.ibkr import tools as t


def test_place_order_req_schema():
    req = t.PlaceOrderReq(symbol="AAPL", side="buy", qty=1)
    assert req.order_type == "market"
    assert req.tif == "day"
    assert req.exchange == "SMART"


def test_place_order_req_options_validates_occ():
    req = t.PlaceOrderReq(
        symbol="AAPL 240119C00190000", side="buy", qty=1, asset_class="options",
    )
    assert req.asset_class == "options"


@pytest.mark.asyncio
async def test_get_account_tool_calls_client():
    client = MagicMock()
    client.get_account = AsyncMock(return_value={"netliquidation": {"amount": 10000}})
    res = await t.get_account(client, t.GetAccountReq())
    assert res["netliquidation"]["amount"] == 10000


@pytest.mark.asyncio
async def test_get_tick_uses_cache_or_subscribes():
    client = MagicMock()
    client.resolve_conid = AsyncMock(return_value=42)
    ws = MagicMock()
    ws.get_tick_snapshot = MagicMock(side_effect=[
        None,
        {"conid": 42, "last_price": 99.5, "bid": 99.4, "ask": 99.6,
         "bid_size": 1, "ask_size": 1, "timestamp_ms": 1700000000000},
    ])
    ws.subscribe_tick = AsyncMock()

    res = await t.get_tick(
        client, t.GetTickReq(symbol="AAPL"), ws=ws, timeout_s=0.05,
    )
    assert res["last_price"] == 99.5
    ws.subscribe_tick.assert_awaited_once_with(42)


@pytest.mark.asyncio
async def test_place_order_enforces_leverage():
    client = MagicMock()
    client.get_account = AsyncMock(return_value={
        "netliquidation": {"amount": 10000},
    })
    client.place_order = AsyncMock(return_value={"order_id": "O1"})
    creds = {"max_leverage": 2}
    res = await t.place_order(
        client, t.PlaceOrderReq(symbol="AAPL", side="buy", qty=10),
        creds=creds, last_price=100.0,
    )
    assert res["order_id"] == "O1"


@pytest.mark.asyncio
async def test_cancel_order_calls_client():
    client = MagicMock()
    client.cancel_order = AsyncMock(return_value={"order_id": "O1", "canceled": True})
    res = await t.cancel_order(client, t.CancelOrderReq(order_id="O1"))
    assert res["canceled"] is True


@pytest.mark.asyncio
async def test_place_order_rejects_excessive_leverage():
    from fastapi import HTTPException
    client = MagicMock()
    client.get_account = AsyncMock(return_value={
        "netliquidation": {"amount": 1000},
    })
    creds = {"max_leverage": 2}
    # Order notional = 100*100 = 10000 vs equity 1000 → ratio 10x >> 2x cap → 403
    with pytest.raises(HTTPException) as exc_info:
        await t.place_order(
            client, t.PlaceOrderReq(symbol="AAPL", side="buy", qty=100),
            creds=creds, last_price=100.0,
        )
    assert exc_info.value.status_code == 403
    assert exc_info.value.detail["error"] == "LEVERAGE_CAP_EXCEEDED"


@pytest.mark.asyncio
async def test_place_bracket_order_calls_client_with_three_legs():
    client = MagicMock()
    client.resolve_conid = AsyncMock(return_value=42)
    client.account_id = "DU1"
    client.get_account = AsyncMock(return_value={"netliquidation": {"amount": 100000}})
    client._submit_order_with_confirmation = AsyncMock(
        return_value={"order_id": "OID-parent"}
    )
    res = await t.place_bracket_order(
        client,
        t.PlaceBracketOrderReq(
            symbol="AAPL", side="buy", qty=1,
            entry_price=150, stop_loss=145, take_profit=160,
        ),
        creds={"max_leverage": 4},
    )
    assert res["order_id"] == "OID-parent"
    payload = client._submit_order_with_confirmation.call_args[0][0]
    assert len(payload["orders"]) == 3


@pytest.mark.asyncio
async def test_place_oto_partial_failure_cancels_trigger():
    from cerbero_mcp.exchanges.ibkr.client import IBKRError
    client = MagicMock()
    client.resolve_conid = AsyncMock(return_value=42)
    client.account_id = "DU1"
    client._submit_order_with_confirmation = AsyncMock(
        side_effect=[
            {"order_id": "TRIG1"},
            IBKRError("network"),
        ]
    )
    client.cancel_order = AsyncMock(return_value={"canceled": True})
    with pytest.raises(IBKRError, match="IBKR_OTO_PARTIAL_FAILURE"):
        await t.place_oto_order(
            client,
            t.PlaceOtoOrderReq(
                trigger=t.OrderLeg(symbol="AAPL", side="buy", qty=1,
                                   order_type="limit", limit_price=150),
                child=t.OrderLeg(symbol="AAPL", side="sell", qty=1,
                                 order_type="limit", limit_price=160),
            ),
            creds={"max_leverage": 4},
        )
    client.cancel_order.assert_awaited_once_with("TRIG1")
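The leverage tests above reduce to one check: reject when order notional divided by account equity exceeds the configured cap. A hypothetical sketch of that gate (the real `place_order` tool raises an `HTTPException` with status 403; a plain exception stands in here):

```python
def check_leverage_sketch(equity: float, qty: float, last_price: float,
                          max_leverage: float) -> None:
    """Reject an order whose notional/equity ratio exceeds the leverage cap.

    Hypothetical helper illustrating the check; the real tool returns an
    HTTP 403 with detail {"error": "LEVERAGE_CAP_EXCEEDED"}.
    """
    notional = qty * last_price
    # Non-positive equity can never support an order; otherwise compare ratios.
    if equity <= 0 or notional / equity > max_leverage:
        raise PermissionError("LEVERAGE_CAP_EXCEEDED")
```

With the test's numbers: qty 10 at 100 against 10000 equity is 0.1x and passes, while qty 100 at 100 against 1000 equity is 10x and trips a 2x cap.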
@@ -0,0 +1,124 @@
from __future__ import annotations

import asyncio
import json
from unittest.mock import AsyncMock, MagicMock

import pytest
from cerbero_mcp.exchanges.ibkr.ws import IBKRWebSocket, WSError


class FakeWS:
    """Bidirectional async fake for WSS messages."""
    def __init__(self) -> None:
        self.sent: list[str] = []
        self._inbox: asyncio.Queue[str] = asyncio.Queue()
        self.closed = False
    async def send(self, msg: str) -> None:
        self.sent.append(msg)
    async def recv(self) -> str:
        return await self._inbox.get()
    async def close(self) -> None:
        self.closed = True
    async def push(self, payload: dict) -> None:
        await self._inbox.put(json.dumps(payload))


@pytest.fixture
def fake_signer():
    s = MagicMock()
    s.get_live_session_token = AsyncMock(return_value="LST==")
    return s


@pytest.mark.asyncio
async def test_subscribe_tick_caches_snapshot(fake_signer, monkeypatch):
    fake_ws = FakeWS()

    async def fake_connect(url, **kw):
        return fake_ws

    monkeypatch.setattr("cerbero_mcp.exchanges.ibkr.ws.websockets_connect", fake_connect)

    ws = IBKRWebSocket(
        signer=fake_signer,
        ws_url="wss://api.ibkr.com/v1/api/ws",
        base_url="https://api.ibkr.com/v1/api",
        max_subs=80, idle_timeout_s=300,
    )
    await ws.start()
    await ws.subscribe_tick(265598)

    await fake_ws.push({
        "topic": "smd+265598",
        "31": "150.5", "84": "150.4", "86": "150.6",
        "7295": "100", "7296": "200",
    })
    await asyncio.sleep(0.05)

    snap = ws.get_tick_snapshot(265598)
    assert snap is not None
    assert snap["last_price"] == 150.5
    assert snap["bid"] == 150.4

    await ws.stop()


@pytest.mark.asyncio
async def test_subscribe_limit(fake_signer, monkeypatch):
    fake_ws = FakeWS()

    async def fake_connect(url, **kw):
        return fake_ws

    monkeypatch.setattr("cerbero_mcp.exchanges.ibkr.ws.websockets_connect", fake_connect)

    ws = IBKRWebSocket(
        signer=fake_signer,
        ws_url="wss://x", base_url="https://x",
        max_subs=2, idle_timeout_s=300,
    )
    await ws.start()
    await ws.subscribe_tick(1)
    await ws.subscribe_tick(2)
    with pytest.raises(WSError, match="IBKR_WS_SUB_LIMIT"):
        await ws.subscribe_tick(3)
    await ws.stop()


@pytest.mark.asyncio
async def test_subscribe_before_start_raises(fake_signer):
    ws = IBKRWebSocket(
        signer=fake_signer,
        ws_url="wss://x", base_url="https://x",
        max_subs=10, idle_timeout_s=300,
    )
    with pytest.raises(WSError, match="IBKR_WS_NOT_STARTED"):
        await ws.subscribe_tick(1)


@pytest.mark.asyncio
async def test_start_after_stop_resumes_reader(fake_signer, monkeypatch):
    fake_ws_a = FakeWS()
    fake_ws_b = FakeWS()
    fakes = iter([fake_ws_a, fake_ws_b])

    async def fake_connect(url, **kw):
        return next(fakes)

    monkeypatch.setattr("cerbero_mcp.exchanges.ibkr.ws.websockets_connect", fake_connect)

    ws = IBKRWebSocket(
        signer=fake_signer,
        ws_url="wss://x", base_url="https://x",
        max_subs=10, idle_timeout_s=300,
    )
    await ws.start()
    await ws.stop()
    # Restart with fresh fake_ws_b
    await ws.start()
    await ws.subscribe_tick(42)
    await fake_ws_b.push({"topic": "smd+42", "31": "100"})
    await asyncio.sleep(0.05)
    assert ws.get_tick_snapshot(42) is not None
    await ws.stop()
@@ -1,5 +1,7 @@
|
||||
from __future__ import annotations
|
||||
|
||||
import os
|
||||
|
||||
import pytest
|
||||
from pydantic import ValidationError
|
||||
|
||||
@@ -51,7 +53,8 @@ def test_settings_load_minimal(monkeypatch):
|
||||
assert s.alpaca.max_leverage == 1
|
||||
|
||||
|
||||
def test_settings_missing_token_fails(monkeypatch):
|
||||
def test_settings_missing_token_fails(monkeypatch, tmp_path):
|
||||
monkeypatch.chdir(tmp_path) # isola dal .env reale del working dir
|
||||
env = _minimal_env()
|
||||
env.pop("TESTNET_TOKEN")
|
||||
for k, v in env.items():
|
||||
@@ -84,3 +87,128 @@ def test_settings_secret_str_no_leak(monkeypatch):
     s = Settings()
     assert "t_test_123" not in repr(s)
     assert "t_live_456" not in repr(s)
+
+
+def _isolated(monkeypatch, tmp_path, env: dict) -> None:
+    """Isolate Settings from the real .env in the working dir and set only the given env vars."""
+    monkeypatch.chdir(tmp_path)
+    for k in (
+        "DERIBIT_CLIENT_ID", "DERIBIT_CLIENT_SECRET",
+        "DERIBIT_CLIENT_ID_TESTNET", "DERIBIT_CLIENT_SECRET_TESTNET",
+        "DERIBIT_CLIENT_ID_LIVE", "DERIBIT_CLIENT_SECRET_LIVE",
+    ):
+        monkeypatch.delenv(k, raising=False)
+    for k, v in env.items():
+        monkeypatch.setenv(k, v)
+
+
+def test_deribit_credentials_legacy_single_pair(monkeypatch, tmp_path):
+    """Only DERIBIT_CLIENT_ID/SECRET set → both envs use the same pair."""
+    _isolated(monkeypatch, tmp_path, _minimal_env())
+
+    from cerbero_mcp.settings import Settings
+
+    s = Settings()
+    assert s.deribit.credentials("testnet") == ("id", "secret")
+    assert s.deribit.credentials("mainnet") == ("id", "secret")
+
+
+def test_deribit_credentials_per_env_pairs(monkeypatch, tmp_path):
+    """_TESTNET and _LIVE pairs → each serves its corresponding env."""
+    env = _minimal_env()
+    env.pop("DERIBIT_CLIENT_ID")
+    env.pop("DERIBIT_CLIENT_SECRET")
+    env["DERIBIT_CLIENT_ID_TESTNET"] = "tid"
+    env["DERIBIT_CLIENT_SECRET_TESTNET"] = "tsec"
+    env["DERIBIT_CLIENT_ID_LIVE"] = "lid"
+    env["DERIBIT_CLIENT_SECRET_LIVE"] = "lsec"
+    _isolated(monkeypatch, tmp_path, env)
+
+    from cerbero_mcp.settings import Settings
+
+    s = Settings()
+    assert s.deribit.credentials("testnet") == ("tid", "tsec")
+    assert s.deribit.credentials("mainnet") == ("lid", "lsec")
+
+
+def test_deribit_credentials_env_specific_overrides_fallback(monkeypatch, tmp_path):
+    """A configured _LIVE pair takes precedence over the base pair even when both are set."""
+    env = _minimal_env()
+    env["DERIBIT_CLIENT_ID_LIVE"] = "lid"
+    env["DERIBIT_CLIENT_SECRET_LIVE"] = "lsec"
+    _isolated(monkeypatch, tmp_path, env)
+
+    from cerbero_mcp.settings import Settings
+
+    s = Settings()
+    assert s.deribit.credentials("mainnet") == ("lid", "lsec")
+    assert s.deribit.credentials("testnet") == ("id", "secret")  # fallback
+
+
+def test_deribit_credentials_missing_raises(monkeypatch, tmp_path):
+    """No pair configured → explicit ValueError."""
+    env = _minimal_env()
+    env.pop("DERIBIT_CLIENT_ID")
+    env.pop("DERIBIT_CLIENT_SECRET")
+    _isolated(monkeypatch, tmp_path, env)
+
+    from cerbero_mcp.settings import Settings
+
+    s = Settings()
+    with pytest.raises(ValueError, match="not configured for env=mainnet"):
+        s.deribit.credentials("mainnet")
+
+
+def test_ibkr_settings_prefer_testnet_specific(monkeypatch, tmp_path):
+    monkeypatch.chdir(tmp_path)
+    for k in list(os.environ):
+        if k.startswith("IBKR_"):
+            monkeypatch.delenv(k, raising=False)
+
+    monkeypatch.setenv("IBKR_CONSUMER_KEY", "base_consumer")
+    monkeypatch.setenv("IBKR_CONSUMER_KEY_TESTNET", "paper_consumer")
+    monkeypatch.setenv("IBKR_ACCESS_TOKEN_TESTNET", "paper_token")
+    monkeypatch.setenv("IBKR_ACCESS_TOKEN_SECRET_TESTNET", "paper_secret")
+    monkeypatch.setenv("IBKR_SIGNATURE_KEY_PATH_TESTNET", "/secrets/sig_paper.pem")
+    monkeypatch.setenv("IBKR_ENCRYPTION_KEY_PATH_TESTNET", "/secrets/enc_paper.pem")
+    monkeypatch.setenv("IBKR_ACCOUNT_ID_TESTNET", "DU1234567")
+    monkeypatch.setenv("IBKR_DH_PRIME", "ffff")
+
+    from cerbero_mcp.settings import IBKRSettings
+    s = IBKRSettings()
+    creds = s.credentials("testnet")
+    assert creds["consumer_key"] == "paper_consumer"
+    assert creds["access_token"] == "paper_token"
+    assert creds["account_id"] == "DU1234567"
+
+
+def test_ibkr_settings_fallback_to_base(monkeypatch, tmp_path):
+    monkeypatch.chdir(tmp_path)
+    for k in list(os.environ):
+        if k.startswith("IBKR_"):
+            monkeypatch.delenv(k, raising=False)
+
+    monkeypatch.setenv("IBKR_CONSUMER_KEY", "base_consumer")
+    monkeypatch.setenv("IBKR_ACCESS_TOKEN", "base_token")
+    monkeypatch.setenv("IBKR_ACCESS_TOKEN_SECRET", "base_secret")
+    monkeypatch.setenv("IBKR_SIGNATURE_KEY_PATH", "/secrets/sig.pem")
+    monkeypatch.setenv("IBKR_ENCRYPTION_KEY_PATH", "/secrets/enc.pem")
+    monkeypatch.setenv("IBKR_ACCOUNT_ID_TESTNET", "DU1234567")
+    monkeypatch.setenv("IBKR_DH_PRIME", "ffff")
+
+    from cerbero_mcp.settings import IBKRSettings
+    s = IBKRSettings()
+    creds = s.credentials("testnet")
+    assert creds["consumer_key"] == "base_consumer"
+
+
+def test_ibkr_settings_missing_raises(monkeypatch, tmp_path):
+    monkeypatch.chdir(tmp_path)
+    for k in list(os.environ):
+        if k.startswith("IBKR_"):
+            monkeypatch.delenv(k, raising=False)
+
+    from cerbero_mcp.settings import IBKRSettings
+    s = IBKRSettings()
+    with pytest.raises(ValueError, match="IBKR credentials not configured"):
+        s.credentials("testnet")
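The precedence the Deribit tests above pin down — an env-specific pair wins when set, the base pair is the fallback, and a missing pair raises a `ValueError` naming the env — can be sketched as follows. This is an illustrative sketch only; `deribit_credentials` and its signature are hypothetical names, not the project's actual `Settings` implementation.

```python
import os
from typing import Mapping, Optional, Tuple

def deribit_credentials(env: str, environ: Optional[Mapping[str, str]] = None) -> Tuple[str, str]:
    """Resolve a (client_id, client_secret) pair for 'testnet' or 'mainnet'.

    Env-specific variables (DERIBIT_CLIENT_ID_TESTNET / _LIVE) take precedence;
    the base DERIBIT_CLIENT_ID / DERIBIT_CLIENT_SECRET pair is the fallback.
    """
    environ = os.environ if environ is None else environ
    suffix = "TESTNET" if env == "testnet" else "LIVE"
    # Env-specific value first, base pair as fallback.
    cid = environ.get(f"DERIBIT_CLIENT_ID_{suffix}") or environ.get("DERIBIT_CLIENT_ID")
    secret = environ.get(f"DERIBIT_CLIENT_SECRET_{suffix}") or environ.get("DERIBIT_CLIENT_SECRET")
    if not cid or not secret:
        raise ValueError(f"Deribit credentials not configured for env={env}")
    return cid, secret
```

For example, with only the base pair set, both envs resolve to the same credentials; adding `DERIBIT_CLIENT_ID_LIVE`/`DERIBIT_CLIENT_SECRET_LIVE` redirects only `mainnet`, which is exactly the behavior `test_deribit_credentials_env_specific_overrides_fallback` asserts.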
@@ -122,6 +122,7 @@ name = "cerbero-mcp"
version = "2.0.0"
source = { editable = "." }
dependencies = [
{ name = "cryptography" },
{ name = "eth-account" },
{ name = "eth-utils" },
{ name = "fastapi" },
@@ -149,6 +150,7 @@ dev = [
[package.metadata]
requires-dist = [
{ name = "cryptography", specifier = ">=43" },
{ name = "eth-account", specifier = ">=0.13.7" },
{ name = "eth-utils", specifier = ">=5.3.1" },
{ name = "fastapi", specifier = ">=0.115" },
@@ -183,6 +185,76 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/22/30/7cd8fdcdfbc5b869528b079bfb76dcdf6056b1a2097a662e5e8c04f42965/certifi-2026.4.22-py3-none-any.whl", hash = "sha256:3cb2210c8f88ba2318d29b0388d1023c8492ff72ecdde4ebdaddbb13a31b1c4a", size = 135707, upload-time = "2026-04-22T11:26:09.372Z" },
]
[[package]]
name = "cffi"
version = "2.0.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "pycparser", marker = "implementation_name != 'PyPy'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/eb/56/b1ba7935a17738ae8453301356628e8147c79dbb825bcbc73dc7401f9846/cffi-2.0.0.tar.gz", hash = "sha256:44d1b5909021139fe36001ae048dbdde8214afa20200eda0f64c068cac5d5529", size = 523588, upload-time = "2025-09-08T23:24:04.541Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/12/4a/3dfd5f7850cbf0d06dc84ba9aa00db766b52ca38d8b86e3a38314d52498c/cffi-2.0.0-cp311-cp311-macosx_10_13_x86_64.whl", hash = "sha256:b4c854ef3adc177950a8dfc81a86f5115d2abd545751a304c5bcf2c2c7283cfe", size = 184344, upload-time = "2025-09-08T23:22:26.456Z" },
{ url = "https://files.pythonhosted.org/packages/4f/8b/f0e4c441227ba756aafbe78f117485b25bb26b1c059d01f137fa6d14896b/cffi-2.0.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:2de9a304e27f7596cd03d16f1b7c72219bd944e99cc52b84d0145aefb07cbd3c", size = 180560, upload-time = "2025-09-08T23:22:28.197Z" },
{ url = "https://files.pythonhosted.org/packages/b1/b7/1200d354378ef52ec227395d95c2576330fd22a869f7a70e88e1447eb234/cffi-2.0.0-cp311-cp311-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:baf5215e0ab74c16e2dd324e8ec067ef59e41125d3eade2b863d294fd5035c92", size = 209613, upload-time = "2025-09-08T23:22:29.475Z" },
{ url = "https://files.pythonhosted.org/packages/b8/56/6033f5e86e8cc9bb629f0077ba71679508bdf54a9a5e112a3c0b91870332/cffi-2.0.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:730cacb21e1bdff3ce90babf007d0a0917cc3e6492f336c2f0134101e0944f93", size = 216476, upload-time = "2025-09-08T23:22:31.063Z" },
{ url = "https://files.pythonhosted.org/packages/dc/7f/55fecd70f7ece178db2f26128ec41430d8720f2d12ca97bf8f0a628207d5/cffi-2.0.0-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:6824f87845e3396029f3820c206e459ccc91760e8fa24422f8b0c3d1731cbec5", size = 203374, upload-time = "2025-09-08T23:22:32.507Z" },
{ url = "https://files.pythonhosted.org/packages/84/ef/a7b77c8bdc0f77adc3b46888f1ad54be8f3b7821697a7b89126e829e676a/cffi-2.0.0-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:9de40a7b0323d889cf8d23d1ef214f565ab154443c42737dfe52ff82cf857664", size = 202597, upload-time = "2025-09-08T23:22:34.132Z" },
{ url = "https://files.pythonhosted.org/packages/d7/91/500d892b2bf36529a75b77958edfcd5ad8e2ce4064ce2ecfeab2125d72d1/cffi-2.0.0-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:8941aaadaf67246224cee8c3803777eed332a19d909b47e29c9842ef1e79ac26", size = 215574, upload-time = "2025-09-08T23:22:35.443Z" },
{ url = "https://files.pythonhosted.org/packages/44/64/58f6255b62b101093d5df22dcb752596066c7e89dd725e0afaed242a61be/cffi-2.0.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:a05d0c237b3349096d3981b727493e22147f934b20f6f125a3eba8f994bec4a9", size = 218971, upload-time = "2025-09-08T23:22:36.805Z" },
{ url = "https://files.pythonhosted.org/packages/ab/49/fa72cebe2fd8a55fbe14956f9970fe8eb1ac59e5df042f603ef7c8ba0adc/cffi-2.0.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:94698a9c5f91f9d138526b48fe26a199609544591f859c870d477351dc7b2414", size = 211972, upload-time = "2025-09-08T23:22:38.436Z" },
{ url = "https://files.pythonhosted.org/packages/0b/28/dd0967a76aab36731b6ebfe64dec4e981aff7e0608f60c2d46b46982607d/cffi-2.0.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:5fed36fccc0612a53f1d4d9a816b50a36702c28a2aa880cb8a122b3466638743", size = 217078, upload-time = "2025-09-08T23:22:39.776Z" },
{ url = "https://files.pythonhosted.org/packages/2b/c0/015b25184413d7ab0a410775fdb4a50fca20f5589b5dab1dbbfa3baad8ce/cffi-2.0.0-cp311-cp311-win32.whl", hash = "sha256:c649e3a33450ec82378822b3dad03cc228b8f5963c0c12fc3b1e0ab940f768a5", size = 172076, upload-time = "2025-09-08T23:22:40.95Z" },
{ url = "https://files.pythonhosted.org/packages/ae/8f/dc5531155e7070361eb1b7e4c1a9d896d0cb21c49f807a6c03fd63fc877e/cffi-2.0.0-cp311-cp311-win_amd64.whl", hash = "sha256:66f011380d0e49ed280c789fbd08ff0d40968ee7b665575489afa95c98196ab5", size = 182820, upload-time = "2025-09-08T23:22:42.463Z" },
{ url = "https://files.pythonhosted.org/packages/95/5c/1b493356429f9aecfd56bc171285a4c4ac8697f76e9bbbbb105e537853a1/cffi-2.0.0-cp311-cp311-win_arm64.whl", hash = "sha256:c6638687455baf640e37344fe26d37c404db8b80d037c3d29f58fe8d1c3b194d", size = 177635, upload-time = "2025-09-08T23:22:43.623Z" },
{ url = "https://files.pythonhosted.org/packages/ea/47/4f61023ea636104d4f16ab488e268b93008c3d0bb76893b1b31db1f96802/cffi-2.0.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:6d02d6655b0e54f54c4ef0b94eb6be0607b70853c45ce98bd278dc7de718be5d", size = 185271, upload-time = "2025-09-08T23:22:44.795Z" },
{ url = "https://files.pythonhosted.org/packages/df/a2/781b623f57358e360d62cdd7a8c681f074a71d445418a776eef0aadb4ab4/cffi-2.0.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8eca2a813c1cb7ad4fb74d368c2ffbbb4789d377ee5bb8df98373c2cc0dee76c", size = 181048, upload-time = "2025-09-08T23:22:45.938Z" },
{ url = "https://files.pythonhosted.org/packages/ff/df/a4f0fbd47331ceeba3d37c2e51e9dfc9722498becbeec2bd8bc856c9538a/cffi-2.0.0-cp312-cp312-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:21d1152871b019407d8ac3985f6775c079416c282e431a4da6afe7aefd2bccbe", size = 212529, upload-time = "2025-09-08T23:22:47.349Z" },
{ url = "https://files.pythonhosted.org/packages/d5/72/12b5f8d3865bf0f87cf1404d8c374e7487dcf097a1c91c436e72e6badd83/cffi-2.0.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:b21e08af67b8a103c71a250401c78d5e0893beff75e28c53c98f4de42f774062", size = 220097, upload-time = "2025-09-08T23:22:48.677Z" },
{ url = "https://files.pythonhosted.org/packages/c2/95/7a135d52a50dfa7c882ab0ac17e8dc11cec9d55d2c18dda414c051c5e69e/cffi-2.0.0-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:1e3a615586f05fc4065a8b22b8152f0c1b00cdbc60596d187c2a74f9e3036e4e", size = 207983, upload-time = "2025-09-08T23:22:50.06Z" },
{ url = "https://files.pythonhosted.org/packages/3a/c8/15cb9ada8895957ea171c62dc78ff3e99159ee7adb13c0123c001a2546c1/cffi-2.0.0-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:81afed14892743bbe14dacb9e36d9e0e504cd204e0b165062c488942b9718037", size = 206519, upload-time = "2025-09-08T23:22:51.364Z" },
{ url = "https://files.pythonhosted.org/packages/78/2d/7fa73dfa841b5ac06c7b8855cfc18622132e365f5b81d02230333ff26e9e/cffi-2.0.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:3e17ed538242334bf70832644a32a7aae3d83b57567f9fd60a26257e992b79ba", size = 219572, upload-time = "2025-09-08T23:22:52.902Z" },
{ url = "https://files.pythonhosted.org/packages/07/e0/267e57e387b4ca276b90f0434ff88b2c2241ad72b16d31836adddfd6031b/cffi-2.0.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:3925dd22fa2b7699ed2617149842d2e6adde22b262fcbfada50e3d195e4b3a94", size = 222963, upload-time = "2025-09-08T23:22:54.518Z" },
{ url = "https://files.pythonhosted.org/packages/b6/75/1f2747525e06f53efbd878f4d03bac5b859cbc11c633d0fb81432d98a795/cffi-2.0.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:2c8f814d84194c9ea681642fd164267891702542f028a15fc97d4674b6206187", size = 221361, upload-time = "2025-09-08T23:22:55.867Z" },
{ url = "https://files.pythonhosted.org/packages/7b/2b/2b6435f76bfeb6bbf055596976da087377ede68df465419d192acf00c437/cffi-2.0.0-cp312-cp312-win32.whl", hash = "sha256:da902562c3e9c550df360bfa53c035b2f241fed6d9aef119048073680ace4a18", size = 172932, upload-time = "2025-09-08T23:22:57.188Z" },
{ url = "https://files.pythonhosted.org/packages/f8/ed/13bd4418627013bec4ed6e54283b1959cf6db888048c7cf4b4c3b5b36002/cffi-2.0.0-cp312-cp312-win_amd64.whl", hash = "sha256:da68248800ad6320861f129cd9c1bf96ca849a2771a59e0344e88681905916f5", size = 183557, upload-time = "2025-09-08T23:22:58.351Z" },
{ url = "https://files.pythonhosted.org/packages/95/31/9f7f93ad2f8eff1dbc1c3656d7ca5bfd8fb52c9d786b4dcf19b2d02217fa/cffi-2.0.0-cp312-cp312-win_arm64.whl", hash = "sha256:4671d9dd5ec934cb9a73e7ee9676f9362aba54f7f34910956b84d727b0d73fb6", size = 177762, upload-time = "2025-09-08T23:22:59.668Z" },
{ url = "https://files.pythonhosted.org/packages/4b/8d/a0a47a0c9e413a658623d014e91e74a50cdd2c423f7ccfd44086ef767f90/cffi-2.0.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:00bdf7acc5f795150faa6957054fbbca2439db2f775ce831222b66f192f03beb", size = 185230, upload-time = "2025-09-08T23:23:00.879Z" },
{ url = "https://files.pythonhosted.org/packages/4a/d2/a6c0296814556c68ee32009d9c2ad4f85f2707cdecfd7727951ec228005d/cffi-2.0.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:45d5e886156860dc35862657e1494b9bae8dfa63bf56796f2fb56e1679fc0bca", size = 181043, upload-time = "2025-09-08T23:23:02.231Z" },
{ url = "https://files.pythonhosted.org/packages/b0/1e/d22cc63332bd59b06481ceaac49d6c507598642e2230f201649058a7e704/cffi-2.0.0-cp313-cp313-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:07b271772c100085dd28b74fa0cd81c8fb1a3ba18b21e03d7c27f3436a10606b", size = 212446, upload-time = "2025-09-08T23:23:03.472Z" },
{ url = "https://files.pythonhosted.org/packages/a9/f5/a2c23eb03b61a0b8747f211eb716446c826ad66818ddc7810cc2cc19b3f2/cffi-2.0.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:d48a880098c96020b02d5a1f7d9251308510ce8858940e6fa99ece33f610838b", size = 220101, upload-time = "2025-09-08T23:23:04.792Z" },
{ url = "https://files.pythonhosted.org/packages/f2/7f/e6647792fc5850d634695bc0e6ab4111ae88e89981d35ac269956605feba/cffi-2.0.0-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:f93fd8e5c8c0a4aa1f424d6173f14a892044054871c771f8566e4008eaa359d2", size = 207948, upload-time = "2025-09-08T23:23:06.127Z" },
{ url = "https://files.pythonhosted.org/packages/cb/1e/a5a1bd6f1fb30f22573f76533de12a00bf274abcdc55c8edab639078abb6/cffi-2.0.0-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:dd4f05f54a52fb558f1ba9f528228066954fee3ebe629fc1660d874d040ae5a3", size = 206422, upload-time = "2025-09-08T23:23:07.753Z" },
{ url = "https://files.pythonhosted.org/packages/98/df/0a1755e750013a2081e863e7cd37e0cdd02664372c754e5560099eb7aa44/cffi-2.0.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:c8d3b5532fc71b7a77c09192b4a5a200ea992702734a2e9279a37f2478236f26", size = 219499, upload-time = "2025-09-08T23:23:09.648Z" },
{ url = "https://files.pythonhosted.org/packages/50/e1/a969e687fcf9ea58e6e2a928ad5e2dd88cc12f6f0ab477e9971f2309b57c/cffi-2.0.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:d9b29c1f0ae438d5ee9acb31cadee00a58c46cc9c0b2f9038c6b0b3470877a8c", size = 222928, upload-time = "2025-09-08T23:23:10.928Z" },
{ url = "https://files.pythonhosted.org/packages/36/54/0362578dd2c9e557a28ac77698ed67323ed5b9775ca9d3fe73fe191bb5d8/cffi-2.0.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:6d50360be4546678fc1b79ffe7a66265e28667840010348dd69a314145807a1b", size = 221302, upload-time = "2025-09-08T23:23:12.42Z" },
{ url = "https://files.pythonhosted.org/packages/eb/6d/bf9bda840d5f1dfdbf0feca87fbdb64a918a69bca42cfa0ba7b137c48cb8/cffi-2.0.0-cp313-cp313-win32.whl", hash = "sha256:74a03b9698e198d47562765773b4a8309919089150a0bb17d829ad7b44b60d27", size = 172909, upload-time = "2025-09-08T23:23:14.32Z" },
{ url = "https://files.pythonhosted.org/packages/37/18/6519e1ee6f5a1e579e04b9ddb6f1676c17368a7aba48299c3759bbc3c8b3/cffi-2.0.0-cp313-cp313-win_amd64.whl", hash = "sha256:19f705ada2530c1167abacb171925dd886168931e0a7b78f5bffcae5c6b5be75", size = 183402, upload-time = "2025-09-08T23:23:15.535Z" },
{ url = "https://files.pythonhosted.org/packages/cb/0e/02ceeec9a7d6ee63bb596121c2c8e9b3a9e150936f4fbef6ca1943e6137c/cffi-2.0.0-cp313-cp313-win_arm64.whl", hash = "sha256:256f80b80ca3853f90c21b23ee78cd008713787b1b1e93eae9f3d6a7134abd91", size = 177780, upload-time = "2025-09-08T23:23:16.761Z" },
{ url = "https://files.pythonhosted.org/packages/92/c4/3ce07396253a83250ee98564f8d7e9789fab8e58858f35d07a9a2c78de9f/cffi-2.0.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:fc33c5141b55ed366cfaad382df24fe7dcbc686de5be719b207bb248e3053dc5", size = 185320, upload-time = "2025-09-08T23:23:18.087Z" },
{ url = "https://files.pythonhosted.org/packages/59/dd/27e9fa567a23931c838c6b02d0764611c62290062a6d4e8ff7863daf9730/cffi-2.0.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c654de545946e0db659b3400168c9ad31b5d29593291482c43e3564effbcee13", size = 181487, upload-time = "2025-09-08T23:23:19.622Z" },
{ url = "https://files.pythonhosted.org/packages/d6/43/0e822876f87ea8a4ef95442c3d766a06a51fc5298823f884ef87aaad168c/cffi-2.0.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:24b6f81f1983e6df8db3adc38562c83f7d4a0c36162885ec7f7b77c7dcbec97b", size = 220049, upload-time = "2025-09-08T23:23:20.853Z" },
{ url = "https://files.pythonhosted.org/packages/b4/89/76799151d9c2d2d1ead63c2429da9ea9d7aac304603de0c6e8764e6e8e70/cffi-2.0.0-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:12873ca6cb9b0f0d3a0da705d6086fe911591737a59f28b7936bdfed27c0d47c", size = 207793, upload-time = "2025-09-08T23:23:22.08Z" },
{ url = "https://files.pythonhosted.org/packages/bb/dd/3465b14bb9e24ee24cb88c9e3730f6de63111fffe513492bf8c808a3547e/cffi-2.0.0-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:d9b97165e8aed9272a6bb17c01e3cc5871a594a446ebedc996e2397a1c1ea8ef", size = 206300, upload-time = "2025-09-08T23:23:23.314Z" },
{ url = "https://files.pythonhosted.org/packages/47/d9/d83e293854571c877a92da46fdec39158f8d7e68da75bf73581225d28e90/cffi-2.0.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:afb8db5439b81cf9c9d0c80404b60c3cc9c3add93e114dcae767f1477cb53775", size = 219244, upload-time = "2025-09-08T23:23:24.541Z" },
{ url = "https://files.pythonhosted.org/packages/2b/0f/1f177e3683aead2bb00f7679a16451d302c436b5cbf2505f0ea8146ef59e/cffi-2.0.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:737fe7d37e1a1bffe70bd5754ea763a62a066dc5913ca57e957824b72a85e205", size = 222828, upload-time = "2025-09-08T23:23:26.143Z" },
{ url = "https://files.pythonhosted.org/packages/c6/0f/cafacebd4b040e3119dcb32fed8bdef8dfe94da653155f9d0b9dc660166e/cffi-2.0.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:38100abb9d1b1435bc4cc340bb4489635dc2f0da7456590877030c9b3d40b0c1", size = 220926, upload-time = "2025-09-08T23:23:27.873Z" },
{ url = "https://files.pythonhosted.org/packages/3e/aa/df335faa45b395396fcbc03de2dfcab242cd61a9900e914fe682a59170b1/cffi-2.0.0-cp314-cp314-win32.whl", hash = "sha256:087067fa8953339c723661eda6b54bc98c5625757ea62e95eb4898ad5e776e9f", size = 175328, upload-time = "2025-09-08T23:23:44.61Z" },
{ url = "https://files.pythonhosted.org/packages/bb/92/882c2d30831744296ce713f0feb4c1cd30f346ef747b530b5318715cc367/cffi-2.0.0-cp314-cp314-win_amd64.whl", hash = "sha256:203a48d1fb583fc7d78a4c6655692963b860a417c0528492a6bc21f1aaefab25", size = 185650, upload-time = "2025-09-08T23:23:45.848Z" },
{ url = "https://files.pythonhosted.org/packages/9f/2c/98ece204b9d35a7366b5b2c6539c350313ca13932143e79dc133ba757104/cffi-2.0.0-cp314-cp314-win_arm64.whl", hash = "sha256:dbd5c7a25a7cb98f5ca55d258b103a2054f859a46ae11aaf23134f9cc0d356ad", size = 180687, upload-time = "2025-09-08T23:23:47.105Z" },
{ url = "https://files.pythonhosted.org/packages/3e/61/c768e4d548bfa607abcda77423448df8c471f25dbe64fb2ef6d555eae006/cffi-2.0.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:9a67fc9e8eb39039280526379fb3a70023d77caec1852002b4da7e8b270c4dd9", size = 188773, upload-time = "2025-09-08T23:23:29.347Z" },
{ url = "https://files.pythonhosted.org/packages/2c/ea/5f76bce7cf6fcd0ab1a1058b5af899bfbef198bea4d5686da88471ea0336/cffi-2.0.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:7a66c7204d8869299919db4d5069a82f1561581af12b11b3c9f48c584eb8743d", size = 185013, upload-time = "2025-09-08T23:23:30.63Z" },
{ url = "https://files.pythonhosted.org/packages/be/b4/c56878d0d1755cf9caa54ba71e5d049479c52f9e4afc230f06822162ab2f/cffi-2.0.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:7cc09976e8b56f8cebd752f7113ad07752461f48a58cbba644139015ac24954c", size = 221593, upload-time = "2025-09-08T23:23:31.91Z" },
{ url = "https://files.pythonhosted.org/packages/e0/0d/eb704606dfe8033e7128df5e90fee946bbcb64a04fcdaa97321309004000/cffi-2.0.0-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:92b68146a71df78564e4ef48af17551a5ddd142e5190cdf2c5624d0c3ff5b2e8", size = 209354, upload-time = "2025-09-08T23:23:33.214Z" },
{ url = "https://files.pythonhosted.org/packages/d8/19/3c435d727b368ca475fb8742ab97c9cb13a0de600ce86f62eab7fa3eea60/cffi-2.0.0-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:b1e74d11748e7e98e2f426ab176d4ed720a64412b6a15054378afdb71e0f37dc", size = 208480, upload-time = "2025-09-08T23:23:34.495Z" },
{ url = "https://files.pythonhosted.org/packages/d0/44/681604464ed9541673e486521497406fadcc15b5217c3e326b061696899a/cffi-2.0.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:28a3a209b96630bca57cce802da70c266eb08c6e97e5afd61a75611ee6c64592", size = 221584, upload-time = "2025-09-08T23:23:36.096Z" },
{ url = "https://files.pythonhosted.org/packages/25/8e/342a504ff018a2825d395d44d63a767dd8ebc927ebda557fecdaca3ac33a/cffi-2.0.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:7553fb2090d71822f02c629afe6042c299edf91ba1bf94951165613553984512", size = 224443, upload-time = "2025-09-08T23:23:37.328Z" },
{ url = "https://files.pythonhosted.org/packages/e1/5e/b666bacbbc60fbf415ba9988324a132c9a7a0448a9a8f125074671c0f2c3/cffi-2.0.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:6c6c373cfc5c83a975506110d17457138c8c63016b563cc9ed6e056a82f13ce4", size = 223437, upload-time = "2025-09-08T23:23:38.945Z" },
{ url = "https://files.pythonhosted.org/packages/a0/1d/ec1a60bd1a10daa292d3cd6bb0b359a81607154fb8165f3ec95fe003b85c/cffi-2.0.0-cp314-cp314t-win32.whl", hash = "sha256:1fc9ea04857caf665289b7a75923f2c6ed559b8298a1b8c49e59f7dd95c8481e", size = 180487, upload-time = "2025-09-08T23:23:40.423Z" },
{ url = "https://files.pythonhosted.org/packages/bf/41/4c1168c74fac325c0c8156f04b6749c8b6a8f405bbf91413ba088359f60d/cffi-2.0.0-cp314-cp314t-win_amd64.whl", hash = "sha256:d68b6cef7827e8641e8ef16f4494edda8b36104d79773a334beaa1e3521430f6", size = 191726, upload-time = "2025-09-08T23:23:41.742Z" },
{ url = "https://files.pythonhosted.org/packages/ae/3a/dbeec9d1ee0844c679f6bb5d6ad4e9f198b1224f4e7a32825f47f6192b0c/cffi-2.0.0-cp314-cp314t-win_arm64.whl", hash = "sha256:0a1527a803f0a659de1af2e1fd700213caba79377e27e4693648c2923da066f9", size = 184195, upload-time = "2025-09-08T23:23:43.004Z" },
]
[[package]]
name = "ckzg"
version = "2.1.7"
@@ -252,6 +324,65 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335, upload-time = "2022-10-25T02:36:20.889Z" },
]
[[package]]
name = "cryptography"
version = "47.0.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "cffi", marker = "platform_python_implementation != 'PyPy'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/ef/b2/7ffa7fe8207a8c42147ffe70c3e360b228160c1d85dc3faff16aaa3244c0/cryptography-47.0.0.tar.gz", hash = "sha256:9f8e55fe4e63613a5e1cc5819030f27b97742d720203a087802ce4ce9ceb52bb", size = 830863, upload-time = "2026-04-24T19:54:57.056Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/a4/98/40dfe932134bdcae4f6ab5927c87488754bf9eb79297d7e0070b78dd58e9/cryptography-47.0.0-cp311-abi3-macosx_10_9_universal2.whl", hash = "sha256:160ad728f128972d362e714054f6ba0067cab7fb350c5202a9ae8ae4ce3ef1a0", size = 7912214, upload-time = "2026-04-24T19:53:03.864Z" },
{ url = "https://files.pythonhosted.org/packages/34/c6/2733531243fba725f58611b918056b277692f1033373dcc8bd01af1c05d4/cryptography-47.0.0-cp311-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:b9a8943e359b7615db1a3ba587994618e094ff3d6fa5a390c73d079ce18b3973", size = 4644617, upload-time = "2026-04-24T19:53:06.909Z" },
{ url = "https://files.pythonhosted.org/packages/00/e3/b27be1a670a9b87f855d211cf0e1174a5d721216b7616bd52d8581d912ed/cryptography-47.0.0-cp311-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:f5c15764f261394b22aef6b00252f5195f46f2ca300bec57149474e2538b31f8", size = 4668186, upload-time = "2026-04-24T19:53:09.053Z" },
{ url = "https://files.pythonhosted.org/packages/81/b9/8443cfe5d17d482d348cee7048acf502bb89a51b6382f06240fd290d4ca3/cryptography-47.0.0-cp311-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:9c59ab0e0fa3a180a5a9c59f3a5abe3ef90d474bc56d7fadfbe80359491b615b", size = 4651244, upload-time = "2026-04-24T19:53:11.217Z" },
{ url = "https://files.pythonhosted.org/packages/5d/5e/13ed0cdd0eb88ba159d6dd5ebfece8cb901dbcf1ae5ac4072e28b55d3153/cryptography-47.0.0-cp311-abi3-manylinux_2_28_ppc64le.whl", hash = "sha256:34b4358b925a5ea3e14384ca781a2c0ef7ac219b57bb9eacc4457078e2b19f92", size = 5252906, upload-time = "2026-04-24T19:53:13.532Z" },
{ url = "https://files.pythonhosted.org/packages/64/16/ed058e1df0f33d440217cd120d41d5dda9dd215a80b8187f68483185af82/cryptography-47.0.0-cp311-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:0024b87d47ae2399165a6bfb20d24888881eeab83ae2566d62467c5ff0030ce7", size = 4701842, upload-time = "2026-04-24T19:53:15.618Z" },
{ url = "https://files.pythonhosted.org/packages/02/e0/3d30986b30fdbd9e969abbdf8ba00ed0618615144341faeb57f395a084fe/cryptography-47.0.0-cp311-abi3-manylinux_2_31_armv7l.whl", hash = "sha256:1e47422b5557bb82d3fff997e8d92cff4e28b9789576984f08c248d2b3535d93", size = 4289313, upload-time = "2026-04-24T19:53:17.755Z" },
{ url = "https://files.pythonhosted.org/packages/df/fd/32db38e3ad0cb331f0691cb4c7a8a6f176f679124dee746b3af6633db4d9/cryptography-47.0.0-cp311-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:6f29f36582e6151d9686235e586dd35bb67491f024767d10b842e520dc6a07ac", size = 4650964, upload-time = "2026-04-24T19:53:20.062Z" },
{ url = "https://files.pythonhosted.org/packages/86/53/5395d944dfd48cb1f67917f533c609c34347185ef15eb4308024c876f274/cryptography-47.0.0-cp311-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:a9b761f012a943b7de0e828843c5688d0de94a0578d44d6c85a1bae32f87791f", size = 5207817, upload-time = "2026-04-24T19:53:22.498Z" },
{ url = "https://files.pythonhosted.org/packages/34/4f/e5711b28e1901f7d480a2b1b688b645aa4c77c73f10731ed17e7f7db3f0d/cryptography-47.0.0-cp311-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:4e1de79e047e25d6e9f8cea71c86b4a53aced64134f0f003bbcbf3655fd172c8", size = 4701544, upload-time = "2026-04-24T19:53:24.356Z" },
{ url = "https://files.pythonhosted.org/packages/22/22/c8ddc25de3010fc8da447648f5a092c40e7a8fadf01dd6d255d9c0b9373d/cryptography-47.0.0-cp311-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:ef6b3634087f18d2155b1e8ce264e5345a753da2c5fa9815e7d41315c90f8318", size = 4783536, upload-time = "2026-04-24T19:53:26.665Z" },
{ url = "https://files.pythonhosted.org/packages/66/b6/d4a68f4ea999c6d89e8498579cba1c5fcba4276284de7773b17e4fa69293/cryptography-47.0.0-cp311-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:11dbb9f50a0f1bb9757b3d8c27c1101780efb8f0bdecfb12439c22a74d64c001", size = 4926106, upload-time = "2026-04-24T19:53:28.686Z" },
{ url = "https://files.pythonhosted.org/packages/54/ed/5f524db1fade9c013aa618e1c99c6ed05e8ffc9ceee6cda22fed22dda3f4/cryptography-47.0.0-cp311-abi3-win32.whl", hash = "sha256:7fda2f02c9015db3f42bb8a22324a454516ed10a8c29ca6ece6cdbb5efe2a203", size = 3258581, upload-time = "2026-04-24T19:53:31.058Z" },
{ url = "https://files.pythonhosted.org/packages/b2/dc/1b901990b174786569029f67542b3edf72ac068b6c3c8683c17e6a2f5363/cryptography-47.0.0-cp311-abi3-win_amd64.whl", hash = "sha256:f5c3296dab66202f1b18a91fa266be93d6aa0c2806ea3d67762c69f60adc71aa", size = 3775309, upload-time = "2026-04-24T19:53:33.054Z" },
{ url = "https://files.pythonhosted.org/packages/14/88/7aa18ad9c11bc87689affa5ce4368d884b517502d75739d475fc6f4a03c7/cryptography-47.0.0-cp314-cp314t-macosx_10_9_universal2.whl", hash = "sha256:be12cb6a204f77ed968bcefe68086eb061695b540a3dd05edac507a3111b25f0", size = 7904299, upload-time = "2026-04-24T19:53:35.003Z" },
{ url = "https://files.pythonhosted.org/packages/07/55/c18f75724544872f234678fdedc871391722cb34a2aee19faa9f63100bb2/cryptography-47.0.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:2ebd84adf0728c039a3be2700289378e1c164afc6748df1a5ed456767bef9ba7", size = 4631180, upload-time = "2026-04-24T19:53:37.517Z" },
{ url = "https://files.pythonhosted.org/packages/ee/65/31a5cc0eaca99cec5bafffe155d407115d96136bb161e8b49e0ef73f09a7/cryptography-47.0.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:7f68d6fbc7fbbcfb0939fea72c3b96a9f9a6edfc0e1b1d29778a2066030418b1", size = 4653529, upload-time = "2026-04-24T19:53:39.775Z" },
{ url = "https://files.pythonhosted.org/packages/e5/bc/641c0519a495f3bfd0421b48d7cd325c4336578523ccd76ea322b6c29c7a/cryptography-47.0.0-cp314-cp314t-manylinux_2_28_aarch64.whl", hash = "sha256:6651d32eff255423503aa276739da98c30f26c40cbeffcc6048e0d54ef704c0c", size = 4638570, upload-time = "2026-04-24T19:53:42.129Z" },
{ url = "https://files.pythonhosted.org/packages/2b/f2/300327b0a47f6dc94dd8b71b57052aefe178bb51745073d73d80604f11ab/cryptography-47.0.0-cp314-cp314t-manylinux_2_28_ppc64le.whl", hash = "sha256:3fb8fa48075fad7193f2e5496135c6a76ac4b2aa5a38433df0a539296b377829", size = 5238019, upload-time = "2026-04-24T19:53:44.577Z" },
{ url = "https://files.pythonhosted.org/packages/e9/5a/5b5cf994391d4bf9d9c7efd4c66aabe4d95227256627f8fea6cff7dfadbd/cryptography-47.0.0-cp314-cp314t-manylinux_2_28_x86_64.whl", hash = "sha256:11438c7518132d95f354fa01a4aa2f806d172a061a7bed18cf18cbdacdb204d7", size = 4686832, upload-time = "2026-04-24T19:53:47.015Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/dc/2c/ae950e28fd6475c852fc21a44db3e6b5bcc1261d1e370f2b6e42fa800fef/cryptography-47.0.0-cp314-cp314t-manylinux_2_31_armv7l.whl", hash = "sha256:8c1a736bbb3288005796c3f7ccb9453360d7fed483b13b9f468aea5171432923", size = 4269301, upload-time = "2026-04-24T19:53:48.97Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/67/fb/6a39782e150ffe5cc1b0018cb6ddc48bf7ca62b498d7539ffc8a758e977d/cryptography-47.0.0-cp314-cp314t-manylinux_2_34_aarch64.whl", hash = "sha256:f1557695e5c2b86e204f6ce9470497848634100787935ab7adc5397c54abd7ab", size = 4638110, upload-time = "2026-04-24T19:53:51.011Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/8e/d7/0b3c71090a76e5c203164a47688b697635ece006dcd2499ab3a4dbd3f0bd/cryptography-47.0.0-cp314-cp314t-manylinux_2_34_ppc64le.whl", hash = "sha256:f9a034b642b960767fb343766ae5ba6ad653f2e890ddd82955aef288ffea8736", size = 5194988, upload-time = "2026-04-24T19:53:52.962Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/63/33/63a961498a9df51721ab578c5a2622661411fc520e00bd83b0cc64eb20c4/cryptography-47.0.0-cp314-cp314t-manylinux_2_34_x86_64.whl", hash = "sha256:b1c76fca783aa7698eb21eb14f9c4aa09452248ee54a627d125025a43f83e7a7", size = 4686563, upload-time = "2026-04-24T19:53:55.274Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/b7/bf/5ee5b145248f92250de86145d1c1d6edebbd57a7fe7caa4dedb5d4cf06a1/cryptography-47.0.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:4f7722c97826770bab8ae92959a2e7b20a5e9e9bf4deae68fd86c3ca457bab52", size = 4770094, upload-time = "2026-04-24T19:53:57.753Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/92/43/21d220b2da5d517773894dacdcdb5c682c28d3fffce65548cb06e87d5501/cryptography-47.0.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:09f6d7bf6724f8db8b32f11eccf23efc8e759924bc5603800335cf8859a3ddbd", size = 4913811, upload-time = "2026-04-24T19:54:00.236Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/31/98/dc4ad376ac5f1a1a7d4a83f7b0c6f2bcad36b5d2d8f30aeb482d3a7d9582/cryptography-47.0.0-cp314-cp314t-win32.whl", hash = "sha256:6eebcaf0df1d21ce1f90605c9b432dd2c4f4ab665ac29a40d5e3fc68f51b5e63", size = 3237158, upload-time = "2026-04-24T19:54:02.606Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/bc/da/97f62d18306b5133468bc3f8cc73a3111e8cdc8cf8d3e69474d6e5fd2d1b/cryptography-47.0.0-cp314-cp314t-win_amd64.whl", hash = "sha256:51c9313e90bd1690ec5a75ed047c27c0b8e6c570029712943d6116ef9a90620b", size = 3758706, upload-time = "2026-04-24T19:54:04.433Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/e0/34/a4fae8ae7c3bc227460c9ae43f56abf1b911da0ec29e0ebac53bb0a4b6b7/cryptography-47.0.0-cp38-abi3-macosx_10_9_universal2.whl", hash = "sha256:14432c8a9bcb37009784f9594a62fae211a2ae9543e96c92b2a8e4c3cd5cd0c4", size = 7904072, upload-time = "2026-04-24T19:54:06.411Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/01/64/d7b1e54fdb69f22d24a64bb3e88dc718b31c7fb10ef0b9691a3cf7eeea6e/cryptography-47.0.0-cp38-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:07efe86201817e7d3c18781ca9770bc0db04e1e48c994be384e4602bc38f8f27", size = 4635767, upload-time = "2026-04-24T19:54:08.519Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/8b/7b/cca826391fb2a94efdcdfe4631eb69306ee1cff0b22f664a412c90713877/cryptography-47.0.0-cp38-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:2b45761c6ec22b7c726d6a829558777e32d0f1c8be7c3f3480f9c912d5ee8a10", size = 4654350, upload-time = "2026-04-24T19:54:10.795Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/4c/65/4b57bcc823f42a991627c51c2f68c9fd6eb1393c1756aac876cba2accae2/cryptography-47.0.0-cp38-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:edd4da498015da5b9f26d38d3bfc2e90257bfa9cbed1f6767c282a0025ae649b", size = 4643394, upload-time = "2026-04-24T19:54:13.275Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/f4/c4/2c5fbeea70adbbca2bbae865e1d605d6a4a7f8dbd9d33eaf69645087f06c/cryptography-47.0.0-cp38-abi3-manylinux_2_28_ppc64le.whl", hash = "sha256:9af828c0d5a65c70ec729cd7495a4bf1a67ecb66417b8f02ff125ab8a6326a74", size = 5225777, upload-time = "2026-04-24T19:54:15.18Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/7e/b8/ac57107ef32749d2b244e36069bb688792a363aaaa3acc9e3cf84c130315/cryptography-47.0.0-cp38-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:256d07c78a04d6b276f5df935a9923275f53bd1522f214447fdf365494e2d515", size = 4688771, upload-time = "2026-04-24T19:54:17.835Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/56/fc/9f1de22ff8be99d991f240a46863c52d475404c408886c5a38d2b5c3bb26/cryptography-47.0.0-cp38-abi3-manylinux_2_31_armv7l.whl", hash = "sha256:5d0e362ff51041b0c0d219cc7d6924d7b8996f57ce5712bdcef71eb3c65a59cc", size = 4270753, upload-time = "2026-04-24T19:54:19.963Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/00/68/d70c852797aa68e8e48d12e5a87170c43f67bb4a59403627259dd57d15de/cryptography-47.0.0-cp38-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:1581aef4219f7ca2849d0250edaa3866212fb74bf5667284f46aa92f9e65c1ca", size = 4642911, upload-time = "2026-04-24T19:54:21.818Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/a5/51/661cbee74f594c5d97ff82d34f10d5551c085ca4668645f4606ebd22bd5d/cryptography-47.0.0-cp38-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:a49a3eb5341b9503fa3000a9a0db033161db90d47285291f53c2a9d2cd1b7f76", size = 5181411, upload-time = "2026-04-24T19:54:24.376Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/94/87/f2b6c374a82cf076cfa1416992ac8e8ec94d79facc37aec87c1a5cb72352/cryptography-47.0.0-cp38-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:2207a498b03275d0051589e326b79d4cf59985c99031b05bb292ac52631c37fe", size = 4688262, upload-time = "2026-04-24T19:54:26.946Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/14/e2/8b7462f4acf21ec509616f0245018bb197194ab0b65c2ea21a0bdd53c0eb/cryptography-47.0.0-cp38-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:7a02675e2fabd0c0fc04c868b8781863cbf1967691543c22f5470500ff840b31", size = 4775506, upload-time = "2026-04-24T19:54:28.926Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/70/75/158e494e4c08dc05e039da5bb48553826bd26c23930cf8d3cd5f21fa8921/cryptography-47.0.0-cp38-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:80887c5cbd1774683cb126f0ab4184567f080071d5acf62205acb354b4b753b7", size = 4912060, upload-time = "2026-04-24T19:54:30.869Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/06/bd/0a9d3edbf5eadbac926d7b9b3cd0c4be584eeeae4a003d24d9eda4affbbd/cryptography-47.0.0-cp38-abi3-win32.whl", hash = "sha256:ed67ea4e0cfb5faa5bc7ecb6e2b8838f3807a03758eec239d6c21c8769355310", size = 3248487, upload-time = "2026-04-24T19:54:33.494Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/60/80/5681af756d0da3a599b7bdb586fac5a1540f1bcefd2717a20e611ddade45/cryptography-47.0.0-cp38-abi3-win_amd64.whl", hash = "sha256:835d2d7f47cdc53b3224e90810fb1d36ca94ea29cc1801fb4c1bc43876735769", size = 3755737, upload-time = "2026-04-24T19:54:35.408Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/1b/a0/928c9ce0d120a40a81aa99e3ba383e87337b9ac9ef9f6db02e4d7822424d/cryptography-47.0.0-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:7f1207974a904e005f762869996cf620e9bf79ecb4622f148550bb48e0eb35a7", size = 3909893, upload-time = "2026-04-24T19:54:38.334Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/81/75/d691e284750df5d9569f2b1ce4a00a71e1d79566da83b2b3e5549c84917f/cryptography-47.0.0-pp311-pypy311_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:1a405c08857258c11016777e11c02bacbe7ef596faf259305d282272a3a05cbe", size = 4587867, upload-time = "2026-04-24T19:54:40.619Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/07/d6/1b90f1a4e453009730b4545286f0b39bb348d805c11181fc31544e4f9a65/cryptography-47.0.0-pp311-pypy311_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:20fdbe3e38fb67c385d233c89371fa27f9909f6ebca1cecc20c13518dae65475", size = 4627192, upload-time = "2026-04-24T19:54:42.849Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/dc/53/cb358a80e9e359529f496870dd08c102aa8a4b5b9f9064f00f0d6ed5b527/cryptography-47.0.0-pp311-pypy311_pp73-manylinux_2_34_aarch64.whl", hash = "sha256:f7db373287273d8af1414cf95dc4118b13ffdc62be521997b0f2b270771fef50", size = 4587486, upload-time = "2026-04-24T19:54:44.908Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/8b/57/aaa3d53876467a226f9a7a82fd14dd48058ad2de1948493442dfa16e2ffd/cryptography-47.0.0-pp311-pypy311_pp73-manylinux_2_34_x86_64.whl", hash = "sha256:9fe6b7c64926c765f9dff301f9c1b867febcda5768868ca084e18589113732ab", size = 4626327, upload-time = "2026-04-24T19:54:47.813Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/ab/9c/51f28c3550276bcf35660703ba0ab829a90b88be8cd98a71ef23c2413913/cryptography-47.0.0-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:cffbba3392df0fa8629bb7f43454ee2925059ee158e23c54620b9063912b86c8", size = 3698916, upload-time = "2026-04-24T19:54:49.782Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "cytoolz"
|
||||
version = "1.1.0"
|
||||
@@ -999,6 +1130,15 @@ wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/54/20/4d324d65cc6d9205fabedc306948156824eb9f0ee1633355a8f7ec5c66bf/pluggy-1.6.0-py3-none-any.whl", hash = "sha256:e920276dd6813095e9377c0bc5566d94c932c33b27a3e3945d8389c374dd4746", size = 20538, upload-time = "2025-05-15T12:30:06.134Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "pycparser"
|
||||
version = "3.0"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/1b/7d/92392ff7815c21062bea51aa7b87d45576f649f16458d78b7cf94b9ab2e6/pycparser-3.0.tar.gz", hash = "sha256:600f49d217304a5902ac3c37e1281c9fe94e4d0489de643a9504c5cdfdfc6b29", size = 103492, upload-time = "2026-01-21T14:26:51.89Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/0c/c3/44f3fbbfa403ea2a7c779186dc20772604442dde72947e7d01069cbe98e3/pycparser-3.0-py3-none-any.whl", hash = "sha256:b727414169a36b7d524c1c3e31839a521725078d7b2ff038656844266160a992", size = 48172, upload-time = "2026-01-21T14:26:50.693Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "pycryptodome"
|
||||
version = "3.23.0"
|
||||