Phase 3: MCP HTTP clients + Dockerization

Typed async wrappers over the six MCP HTTP services that Cerbero Bite
consumes autonomously. 277 tests pass, 93% coverage on clients, mypy
strict clean, ruff clean.

Base layer:
- clients/_base.py: HttpToolClient on httpx + tenacity (exponential
  retry, 3 attempts, 8 s timeout, HTTP → typed exception mapping).
- clients/_exceptions.py: McpAuthError, McpServerError, McpToolError,
  McpDataAnomalyError, McpNotFoundError, McpTimeoutError.
- config/mcp_endpoints.py: URL resolution via Docker DNS
  (mcp-deribit:9011, ...) with a per-service env-var override;
  bearer token loaded from secrets/core.token or
  CERBERO_BITE_CORE_TOKEN_FILE.
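The per-service override can be sketched roughly as follows (`DEFAULT_ENDPOINTS` and `resolve_url` here are illustrative stand-ins, not the actual `config/mcp_endpoints.py` API):

```python
import os

# Hypothetical defaults mirroring the Docker DNS names above.
DEFAULT_ENDPOINTS = {
    "deribit": "http://mcp-deribit:9011",
    "macro": "http://mcp-macro:9013",
}

def resolve_url(service: str) -> str:
    """Per-service override via CERBERO_BITE_MCP_<NAME>_URL, else the default."""
    env_key = f"CERBERO_BITE_MCP_{service.upper()}_URL"
    return os.environ.get(env_key, DEFAULT_ENDPOINTS[service])
```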

Wrappers:
- clients/macro.py: next_high_severity_within() for the §2.5 entry
  filter.
- clients/sentiment.py: funding_cross_median_annualized() with
  annualization of the native funding period per exchange
  (Binance/Bybit/OKX 1095, Hyperliquid 8760).
- clients/hyperliquid.py: funding_rate_annualized() for the §2.6
  filter.
- clients/portfolio.py: total_equity_eur(), asset_pct_of_portfolio()
  for the sizing engine + §2.7 filter.
- clients/telegram.py: notify-only (no callback queue, no
  confirmations: Bite auto-executes).
- clients/deribit.py: environment_info, index_price_eth,
  latest_dvol, options_chain, get_tickers, orderbook_depth_top3,
  get_account_summary, get_positions, place_combo_order (atomic
  combo), cancel_order.
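The per-exchange annualization mentioned for clients/sentiment.py amounts to multiplying the native per-period funding rate by the number of funding periods per year; a minimal sketch (names are illustrative, not the real method surface):

```python
from decimal import Decimal

# Funding periods per year: 8h funding -> 3 x 365 = 1095; hourly -> 8760.
PERIODS_PER_YEAR = {
    "binance": 1095,
    "bybit": 1095,
    "okx": 1095,
    "hyperliquid": 8760,
}

def annualize_funding(exchange: str, rate_per_period: Decimal) -> Decimal:
    """Annualize one exchange's native per-period funding rate."""
    return rate_per_period * PERIODS_PER_YEAR[exchange]
```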

CLI:
- cerbero-bite ping: parallel health check of every MCP service with
  a rich table (OK/FAIL/SKIPPED).

Docker:
- Multi-stage Dockerfile on Python 3.13 + uv, non-root user.
- docker-compose.yml with the external "cerbero-suite" network, the
  core_token secret mounted at /run/secrets/core_token, and one env
  var per MCP service.
- secrets/README.md documents the token setup.

Intervention documentation:
- docs/12-mcp-deribit-changes.md: spec of the changes to apply to the
  mcp-deribit server (place_combo_order + testnet override via
  DERIBIT_TESTNET).

Dependencies:
- added pytest-httpx for the HTTP tests.
- removed mcp>=1.0 (we do not use the MCP SDK; we talk plain HTTP
  REST).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-27 23:36:30 +02:00
parent 263470786d
commit 466e63dc19
29 changed files with 2988 additions and 235 deletions
.dockerignore (+13)
@@ -0,0 +1,13 @@
.git/
.venv/
.mypy_cache/
.ruff_cache/
.pytest_cache/
__pycache__/
data/
docs/
tests/
.coverage
htmlcov/
*.local.yaml
secrets/
.gitignore (+3)
@@ -43,3 +43,6 @@ data/
.env
.env.*
!.env.example
secrets/*
!secrets/.gitkeep
!secrets/README.md
Dockerfile (+52)
@@ -0,0 +1,52 @@
# syntax=docker/dockerfile:1.7
FROM python:3.13-slim AS builder

# uv ships static binaries; pin a version for reproducibility.
COPY --from=ghcr.io/astral-sh/uv:0.4.27 /uv /usr/local/bin/uv

WORKDIR /app
ENV UV_PROJECT_ENVIRONMENT=/opt/venv \
    UV_LINK_MODE=copy \
    PYTHONDONTWRITEBYTECODE=1 \
    PYTHONUNBUFFERED=1

# Install only the dependencies first so the layer is cached when the
# source tree changes.
COPY pyproject.toml uv.lock ./
RUN uv sync --frozen --no-dev --no-install-project

# Now copy the source tree and install the project itself.
COPY src ./src
COPY README.md ./
RUN uv sync --frozen --no-dev

FROM python:3.13-slim AS runtime
RUN apt-get update \
    && apt-get install -y --no-install-recommends sqlite3 ca-certificates \
    && rm -rf /var/lib/apt/lists/*

# Non-root user with a stable UID for volume permissions.
RUN useradd --system --uid 10001 --home-dir /app --shell /usr/sbin/nologin bite

WORKDIR /app
ENV PATH=/opt/venv/bin:$PATH \
    PYTHONDONTWRITEBYTECODE=1 \
    PYTHONUNBUFFERED=1 \
    CERBERO_BITE_CORE_TOKEN_FILE=/run/secrets/core_token

COPY --from=builder /opt/venv /opt/venv
COPY --from=builder /app/src /app/src
COPY scripts /app/scripts
COPY strategy.yaml /app/strategy.yaml

# Persistent state + audit go into /app/data, mounted as a volume in
# docker-compose.yml.
RUN mkdir -p /app/data/log /app/data/backups \
    && chown -R bite:bite /app

USER bite
ENTRYPOINT ["cerbero-bite"]
CMD ["status"]
docker-compose.yml (+53)
@@ -0,0 +1,53 @@
# docker-compose.yml — Cerbero Bite
#
# Bite runs in its own Compose project but joins the same Docker
# network used by Cerbero_mcp so it can resolve `mcp-deribit`,
# `mcp-macro` and friends by their service name (see the gateway
# Caddyfile in Cerbero_mcp).
#
# The shared network is declared as external here. Create it once on
# the host with `docker network create cerbero-suite` (or rename the
# Cerbero_mcp network to `cerbero-suite` and mark it external).
#
# Secrets are read from ./secrets/, which is .gitignore'd.
networks:
  cerbero-suite:
    external: true

secrets:
  core_token:
    file: ./secrets/core.token

volumes:
  bite-data:

services:
  cerbero-bite:
    build:
      context: .
      dockerfile: Dockerfile
    image: cerbero-bite:dev
    restart: unless-stopped
    networks: [cerbero-suite]
    cap_drop: [ALL]
    security_opt:
      - no-new-privileges:true
    secrets:
      - core_token
    environment:
      CERBERO_BITE_CORE_TOKEN_FILE: /run/secrets/core_token
      # Service URLs — the defaults below match the cerbero-suite
      # network DNS. Override per service if you need to point at a
      # different host (dev only).
      CERBERO_BITE_MCP_DERIBIT_URL: http://mcp-deribit:9011
      CERBERO_BITE_MCP_HYPERLIQUID_URL: http://mcp-hyperliquid:9012
      CERBERO_BITE_MCP_MACRO_URL: http://mcp-macro:9013
      CERBERO_BITE_MCP_SENTIMENT_URL: http://mcp-sentiment:9014
      CERBERO_BITE_MCP_TELEGRAM_URL: http://mcp-telegram:9017
      CERBERO_BITE_MCP_PORTFOLIO_URL: http://mcp-portfolio:9018
    volumes:
      - bite-data:/app/data
    # Default command runs the engine status check; override with the
    # CLI subcommand of choice (start, ping, dry-run, ...).
    command: ["status"]
docs/12-mcp-deribit-changes.md (+282)
@@ -0,0 +1,282 @@
# 12 — Changes to apply to `mcp-deribit`

## Background

Starting with Phase 3, Cerbero Bite operates fully autonomously: no
interactive confirmation from Adriano during the initial period, no
Telegram buttons, no manual approval queue. When the rule engine
decides that the entry and exit conditions are met, the engine must be
able to submit the order without human intermediation. The first
period of operation will run on the Deribit **testnet** environment,
so that real slippage and fill data can be collected without exposing
capital.

Supporting this flow requires two changes to the `mcp-deribit`
service hosted in `CerberoSuite/Cerbero/services/mcp-deribit`. Both
changes are additive and do not break the existing tools.

The current state of the service is as follows:

* `place_order` exists, but is limited to a single instrument per
  invocation (`PlaceOrderReq.instrument_name`); it is not atomic for
  a two-leg spread.
* `client.py` exposes the `testnet` attribute, read from the
  credentials file (`secrets/deribit.json`, `testnet` field). The
  `is_testnet` tool reports the state correctly. There is currently
  no way to override that choice at the container environment level.
## Change 1 — New `place_combo_order` tool

### Goal

Send Deribit a single combo order with two or more legs atomically,
avoiding the leg risk typical of credit spreads (the first leg fills
and the second does not, or fills at a price that invalidates the
credit/width ratio).

### Expected behavior

The endpoint receives the list of legs, creates or reuses the
corresponding combo instrument on the Deribit side (the
`public/create_combo` API), then submits the net order on the combo
(`private/buy` with type `limit` or `market`). The limit price is
expressed as the net price of the combo (in ETH for Deribit inverse
options), i.e. the signed algebraic sum of the legs' mid prices.
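Under that convention, bought legs count positive and sold legs negative, so a credit spread nets out negative. A minimal sketch of the signed sum (the helper name is illustrative):

```python
from decimal import Decimal

def combo_net_price(legs: list[tuple[str, Decimal]]) -> Decimal:
    """Signed sum of leg mid prices: buys positive, sells negative."""
    signs = {"buy": Decimal(1), "sell": Decimal(-1)}
    return sum((signs[d] * mid for d, mid in legs), Decimal(0))
```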
The order is subject to the same guard rails as `place_order`: the
`core` capability check, `enforce_single_notional`,
`enforce_aggregate`, `enforce_leverage`. For options the notional is
computed as `width × n_contracts × spot_index` (the combo's maximum
loss).
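The notional formula above can be sketched as follows (the helper name is illustrative; it simply instantiates `width × n_contracts × spot_index` as stated):

```python
from decimal import Decimal

def spread_notional(
    short_strike: Decimal,
    long_strike: Decimal,
    n_contracts: Decimal,
    spot_index: Decimal,
) -> Decimal:
    """Guard-rail notional: width x n_contracts x spot_index."""
    width = abs(short_strike - long_strike)
    return width * n_contracts * spot_index
```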
### Request schema

```python
class ComboLegReq(BaseModel):
    instrument_name: str                 # ETH-15MAY26-2475-P
    direction: Literal["buy", "sell"]    # direction of the single leg
    ratio: int = 1                       # multiple of the combo size (Deribit uses direction/ratio)

class PlaceComboOrderReq(BaseModel):
    legs: list[ComboLegReq]              # 2..4 legs
    side: Literal["buy", "sell"]         # "sell" = collect credit
    amount: float                        # number of combo contracts
    type: Literal["limit", "market"] = "limit"
    price: float | None = None           # net price in ETH; required if type=limit
    label: str | None = None             # propagated to Deribit for reconciliation
    time_in_force: Literal["good_til_cancelled", "good_til_day", "fill_or_kill", "immediate_or_cancel"] = "good_til_cancelled"
    post_only: bool = False
    reduce_only: bool = False            # true for closing combos
```
Validation constraints:

* `len(legs)` between 2 and 4 (naked legs and unbalanced ratios are
  excluded).
* All legs share the same expiry (`expiry`). The payload must reject
  mixed-expiry combos.
* If `type == "limit"`, `price` is required.
* The `direction` of each leg and the `side` of the combo must be
  consistent with the Deribit convention (see the
  [Deribit `public/create_combo` docs](https://docs.deribit.com/#public-create_combo)).
### Response schema

```python
class PlaceComboOrderResp(BaseModel):
    combo_id: str                   # ETH-15MAY26-2475P_2350P
    order_id: str                   # ID of the order on the combo
    state: str                      # open, filled, rejected, etc.
    average_price: float | None     # in ETH
    filled_amount: float
    legs: list[ComboLegFill]        # per leg: instrument, direction, fill price, fees
    raw: dict                       # full Deribit response for audit
```
### Endpoint

```python
@app.post("/tools/place_combo_order", tags=["writes"])
async def t_place_combo_order(
    body: PlaceComboOrderReq,
    principal: Principal = Depends(require_principal),
):
    _check(principal, core=True)
    return await client.place_combo_order(...)
```
### Client extensions (`mcp_deribit/client.py`)

Add a `place_combo_order` method that wraps two Deribit calls:

1. `public/create_combo` with the list of legs; it returns
   `combo_id` (Deribit is idempotent for an identical pair, so
   repeated calls with the same parameters return the same id).
2. `private/buy` or `private/sell` (depending on `side`) on the
   newly created combo, with `amount`, `price`, `type`,
   `time_in_force`, `post_only`, `reduce_only`, `label`.

In addition: `cancel_combo_order(order_id)`, which today can simply
delegate to `cancel_order` (combos are cancellable like regular
orders); it is worth keeping it as a distinct method to make the
audit log more readable.
### MCP tag

Add the `place_combo_order` entry to the `tools` list of
`mount_mcp_endpoint`, with the description *"Submit an atomic
multi-leg combo order (Bull Put, Bear Call, Iron Condor)"*.

### Tests

* A client mock that simulates `create_combo` + `buy` and verifies
  that the body propagated to Deribit follows the direction/ratio
  convention.
* A 403 test when the principal lacks the `core` capability.
* Rejection tests for mixed-expiry legs, `len(legs) < 2`,
  `len(legs) > 4`, and a missing `price` with `type=limit`.
## Change 2 — `testnet` override via environment variable

### Goal

Allow forcing the execution environment (testnet or mainnet) without
rewriting `secrets/deribit.json`. This flexibility is essential for
Cerbero Bite, which runs in a dedicated Docker container and must be
able to move from paper trading to soft launch by changing a single
environment variable.

### Expected behavior

In `__main__.py`, after reading the credentials file, apply the
following resolution precedence:

1. If the `DERIBIT_TESTNET` environment variable is set, its boolean
   conversion (`true|false`, `1|0`, case-insensitive) overrides the
   `testnet` field of the credentials file.
2. Otherwise the `testnet` field of the JSON applies.
3. If neither is present, default to `True` (precaution: better
   testnet than mainnet by mistake).
### Change to the main module

```python
def _resolve_testnet(creds: dict) -> bool:
    env = os.environ.get("DERIBIT_TESTNET")
    if env is not None:
        return env.strip().lower() in {"1", "true", "yes", "on"}
    return bool(creds.get("testnet", True))

client = DeribitClient(
    client_id=creds["client_id"],
    client_secret=creds["client_secret"],
    testnet=_resolve_testnet(creds),
)
```
### `is_testnet` tool extension

The tool currently returns `{"testnet": bool, "base_url": str}`. Add:

* `source`: `"env"`, `"credentials"` or `"default"`, indicating where
  the choice came from.
* `env_value`: the raw value read from `DERIBIT_TESTNET` (useful for
  diagnosing typos in the docker-compose file).
This way Cerbero Bite can call `is_testnet` at startup, log the
result and, if it disagrees with its own expected configuration
(`strategy.yaml` or equivalent), arm the kill switch before any entry
cycle starts.

### Documentation and compose

* In `docker-compose.yml`, add `environment: DERIBIT_TESTNET: "true"`
  to the `mcp-deribit` service with the comment *"override
  secrets/deribit.json testnet flag"*.
* In `services/mcp-deribit/README.md` (create it if it does not
  exist), document the precedence and the three accepted boolean
  value forms.
### Tests

* A test that provides both the credentials file (with
  `testnet: false`) and the env var (`DERIBIT_TESTNET=true`) and
  verifies that `is_testnet` reports `True` with `source="env"`.
* A test of the `True` default when both sources are missing (file
  without the field, env var unset).
## Impact on Cerbero Bite

The two changes enable the following flows in Phase 3 of Cerbero
Bite:

* The `clients/deribit.py` wrapper exposes a
  `place_combo_order(short, long, side, n_contracts, limit_price)`
  method that forwards directly to the new tool. No more sequence of
  two `place_order` calls.
* At startup, the orchestrator (Phase 4) calls
  `cerbero-deribit.is_testnet` and:
  * blocks the boot if the environment does not match what
    `strategy.yaml` expects (proposed new field:
    `execution.environment: "testnet" | "mainnet"`);
  * writes the outcome into `system_state.config_version` or into a
    dedicated `system_state.environment` field to make it visible to
    the GUI.
* The operational flow document (`docs/06-operational-flow.md`) is
  updated to remove the user confirmation step and introduce
  auto-execution conditioned on the rules already codified in
  `core/`.
## Related changes to the Cerbero Bite documents

Once the `mcp-deribit` server changes are merged, the following
Cerbero Bite project documents need to be aligned (editorial changes
only, no further architectural implications):

* `docs/04-mcp-integration.md`: remove the `cerbero-memory` and
  `cerbero-brain-bridge` sections; add the `place_combo_order` entry
  to the `cerbero-deribit` table; update the degradation matrix so
  that `cerbero-deribit` remains the only execution hard-fail.
* `docs/06-operational-flow.md`: replace the Telegram confirmation
  flow with a notify-only, auto-execute flow.
* `docs/02-architecture.md`: remove the `cerbero-memory ⇆ Cerbero
  core` block from the diagram.
* `docs/05-data-model.md`: the `instructions` table becomes the log
  of Deribit orders (combo `combo_id`, `order_id`, fill price, fees)
  instead of tracking `push_user_instruction`.
* `docs/07-risk-controls.md`: add a kill-switch trigger for a
  testnet/mainnet mismatch to the matrix.
* `strategy.yaml`: add the `execution.environment: "testnet"` block
  with an updated `last_review`.
## Out of scope for this document

The following are not part of the `mcp-deribit` server changes:

* Handling combo orders with more than four legs (the Cerbero Bite
  products remain two or four legs for Bull Put, Bear Call, Iron
  Condor).
* Repricing logic (stepping one tick towards the combined ask): that
  is Cerbero Bite's responsibility via `cancel_combo_order` +
  `place_combo_order` with an updated price.
* Integration with `cerbero-memory` or `cerbero-brain-bridge`: no
  link, not even indirect.
## Suggested work sequence

1. Open a `feat/place-combo-order` branch on `CerberoSuite/Cerbero`.
2. Implement the `client.place_combo_order` method with its client
   unit tests (mocks of `create_combo` + `buy`).
3. Expose the `place_combo_order` tool in `server.py` with the guard
   rails and the FastAPI tests.
4. Open a separate `feat/deribit-testnet-env` branch for change 2
   (smaller, independent from the rest).
5. Update `docker-compose.yml` with the environment variable.
6. Once merged to main, implement the `clients/deribit.py` wrapper in
   Cerbero Bite with the `place_combo_order` and `cancel_combo_order`
   methods, write the matching fakes in
   `tests/fixtures/fakes/deribit.py` and the integration tests in
   `tests/integration/test_deribit_combo.py`.
pyproject.toml (+2 -2)
@@ -18,7 +18,6 @@ dependencies = [
     "aiosqlite>=0.20",
     "pyyaml>=6.0",
     "httpx>=0.27",
-    "mcp>=1.0",
     "tenacity>=9.0",
     "python-dateutil>=2.9",
 ]
@@ -45,6 +44,7 @@ dev = [
     "pytest>=8.3",
     "pytest-asyncio>=0.24",
     "pytest-cov>=5.0",
+    "pytest-httpx>=0.33",
     "hypothesis>=6.115",
     "mypy>=1.13",
     "ruff>=0.7",
@@ -111,7 +111,7 @@ no_implicit_reexport = true
 files = ["src/cerbero_bite"]
 [[tool.mypy.overrides]]
-module = ["apscheduler.*", "mcp.*"]
+module = ["apscheduler.*"]
 ignore_missing_imports = true
 [tool.pytest.ini_options]
secrets/README.md (+28)
@@ -0,0 +1,28 @@
# `secrets/`

Runtime folder for sensitive credentials. Every file in this
directory is `.gitignore`d except this README and `.gitkeep`.

## Expected contents

| File | Origin | Use |
|---|---|---|
| `core.token` | copy of `Cerbero_mcp/secrets/core.token` | bearer token with the `core` capability for calling the MCP tools. Read once at container boot. |

## Setup

```bash
cp /path/to/Cerbero_mcp/secrets/core.token secrets/core.token
chmod 600 secrets/core.token
```
Cerbero Bite's `docker-compose.yml` mounts `secrets/core.token` as a
Docker secret at `/run/secrets/core_token` inside the container, and
the `CERBERO_BITE_CORE_TOKEN_FILE` environment variable points there
by default.
## Rotation

When the core token is rotated on the Cerbero_mcp cluster, replace
the local copy as well. The container must be restarted, because the
token is read only at startup.
+102
@@ -9,6 +9,7 @@ without changing the surface.
from __future__ import annotations
import asyncio
import sys
from datetime import UTC, datetime
from pathlib import Path
@@ -18,7 +19,18 @@ from rich.console import Console
from rich.table import Table
from cerbero_bite import __version__
from cerbero_bite.clients import HttpToolClient, McpError
from cerbero_bite.clients.deribit import DeribitClient
from cerbero_bite.clients.hyperliquid import HyperliquidClient
from cerbero_bite.clients.macro import MacroClient
from cerbero_bite.clients.portfolio import PortfolioClient
from cerbero_bite.clients.sentiment import SentimentClient
from cerbero_bite.config.loader import compute_config_hash, load_strategy
from cerbero_bite.config.mcp_endpoints import (
    DEFAULT_ENDPOINTS,
    load_endpoints,
    load_token,
)
from cerbero_bite.logging import configure as configure_logging
from cerbero_bite.logging import get_logger
from cerbero_bite.safety.audit_log import AuditChainError, AuditLog
@@ -225,6 +237,96 @@ def kill_switch_status(db: Path) -> None:
    )
@main.command()
@click.option(
    "--token-file",
    type=click.Path(dir_okay=False, path_type=Path),
    default=None,
    help="Path to the bearer token file (default: secrets/core_token).",
)
@click.option(
    "--timeout",
    type=float,
    default=4.0,
    show_default=True,
    help="Per-service timeout in seconds for the ping call.",
)
def ping(token_file: Path | None, timeout: float) -> None:
    """Print health status for every MCP service Cerbero Bite uses."""
    try:
        token = load_token(path=token_file)
    except (FileNotFoundError, ValueError) as exc:
        console.print(f"[red]token error[/red]: {exc}")
        sys.exit(1)
    endpoints = load_endpoints()
    rows = asyncio.run(_ping_all(endpoints, token=token, timeout=timeout))
    table = Table(title="MCP services")
    table.add_column("service")
    table.add_column("url")
    table.add_column("status")
    table.add_column("detail")
    for service, url, status, detail in rows:
        colour = {"ok": "green", "fail": "red", "skipped": "yellow"}.get(status, "white")
        table.add_row(service, url, f"[{colour}]{status.upper()}[/{colour}]", detail)
    console.print(table)


async def _ping_one(
    *,
    service: str,
    url: str,
    token: str,
    timeout: float,
) -> tuple[str, str]:
    """Return ``(status, detail)`` for one service health check."""
    http = HttpToolClient(
        service=service,
        base_url=url,
        token=token,
        retry_max=1,
        timeout_s=timeout,
    )
    try:
        if service == "deribit":
            info = await DeribitClient(http).environment_info()
            return "ok", f"environment={info.environment}"
        if service == "macro":
            await MacroClient(http).get_calendar(days=1, importance_min="high")
            return "ok", "calendar reachable"
        if service == "sentiment":
            await SentimentClient(http).funding_cross_median_annualized("ETH")
            return "ok", "funding reachable"
        if service == "hyperliquid":
            await HyperliquidClient(http).funding_rate_annualized("ETH")
            return "ok", "ETH-PERP reachable"
        if service == "portfolio":
            await PortfolioClient(http).total_equity_eur()
            return "ok", "portfolio reachable"
        if service == "telegram":
            # Notify-only: no read tool. Skip without hitting the bot.
            return "skipped", "notify-only client (no health probe)"
        return "skipped", "no probe defined"  # pragma: no cover
    except McpError as exc:
        return "fail", f"{type(exc).__name__}: {exc}"
    except Exception as exc:  # surface any unexpected error for the operator
        return "fail", f"{type(exc).__name__}: {exc}"


async def _ping_all(
    endpoints: object, *, token: str, timeout: float
) -> list[tuple[str, str, str, str]]:
    rows: list[tuple[str, str, str, str]] = []
    for service in DEFAULT_ENDPOINTS:
        url = endpoints.for_service(service)  # type: ignore[attr-defined]
        status, detail = await _ping_one(
            service=service, url=url, token=token, timeout=timeout
        )
        rows.append((service, url, status, detail))
    return rows
@main.command()
def gui() -> None:
    """Launch the Streamlit dashboard."""
src/cerbero_bite/clients/__init__.py (+28)
@@ -0,0 +1,28 @@
"""Async wrappers over the Cerbero MCP HTTP services.
Every concrete client extends :class:`HttpToolClient` and exposes a
typed surface that returns the Pydantic records consumed by the
``core/`` algorithms.
"""
from cerbero_bite.clients._base import HttpToolClient
from cerbero_bite.clients._exceptions import (
McpAuthError,
McpDataAnomalyError,
McpError,
McpNotFoundError,
McpServerError,
McpTimeoutError,
McpToolError,
)
__all__ = [
"HttpToolClient",
"McpAuthError",
"McpDataAnomalyError",
"McpError",
"McpNotFoundError",
"McpServerError",
"McpTimeoutError",
"McpToolError",
]
src/cerbero_bite/clients/_base.py (+228)
@@ -0,0 +1,228 @@
"""HTTP tool client common to every MCP wrapper.
Each MCP service exposes ``POST <base_url>/tools/<tool_name>`` with a
JSON body and a ``Bearer <core_token>`` header. ``HttpToolClient`` is a
thin wrapper around :class:`httpx.AsyncClient` that:
* Adds the auth header.
* Applies the project-wide timeout (default 8 s, see
``docs/10-config-spec.md`` ``mcp.call_timeout_s``).
* Retries the call on transient failures with exponential backoff
(1 s, 5 s, 30 s) — at most 3 attempts in total.
* Maps HTTP errors and ``state == "error"`` envelopes into the typed
exceptions in :mod:`cerbero_bite.clients._exceptions`.
The wrapper does *not* hold a long-lived ``AsyncClient`` by default:
each call opens and closes its own connection so a transient DNS issue
on one MCP server does not corrupt connection pooling for the others.
A shared pool can still be passed in via ``transport`` / ``client``
when the orchestrator wants connection reuse.
"""
from __future__ import annotations
import json
import logging
from collections.abc import Awaitable, Callable
from typing import Any
import httpx
from tenacity import (
AsyncRetrying,
RetryError,
retry_if_exception_type,
stop_after_attempt,
wait_exponential,
)
from cerbero_bite.clients._exceptions import (
McpAuthError,
McpError,
McpNotFoundError,
McpServerError,
McpTimeoutError,
McpToolError,
)
__all__ = ["HttpToolClient"]
_log = logging.getLogger("cerbero_bite.clients")
_RETRYABLE: tuple[type[BaseException], ...] = (
McpTimeoutError,
McpServerError,
)
class HttpToolClient:
"""Async client for ``POST <base>/tools/<tool>`` style MCP services.
Args:
service: short service identifier (``"deribit"``, ``"macro"`` …).
base_url: e.g. ``"http://mcp-deribit:9011"``. Trailing slash
is stripped.
token: bearer token for the ``Authorization`` header.
timeout_s: per-request timeout, default 8 seconds.
retry_max: max number of attempts (1 = no retry).
retry_base_delay: base delay for exponential backoff.
sleep: hook for tests to skip real waits.
"""
def __init__(
self,
*,
service: str,
base_url: str,
token: str,
timeout_s: float = 8.0,
retry_max: int = 3,
retry_base_delay: float = 1.0,
sleep: Callable[[int | float], Awaitable[None] | None] | None = None,
) -> None:
self._service = service
self._base_url = base_url.rstrip("/")
self._token = token
self._timeout = httpx.Timeout(timeout_s)
self._retry_max = max(1, retry_max)
self._retry_base_delay = retry_base_delay
self._sleep = sleep
@property
def service(self) -> str:
return self._service
@property
def base_url(self) -> str:
return self._base_url
async def call(
self,
tool: str,
body: dict[str, Any] | None = None,
*,
client: httpx.AsyncClient | None = None,
) -> Any:
"""Invoke ``tool`` with ``body`` and return the parsed JSON response.
Returns whatever shape the server replies with (typically ``dict``,
sometimes ``list``). The wrapper checks ``state == "error"`` only on
``dict`` responses; list/scalar responses are passed through unchanged.
"""
url = f"{self._base_url}/tools/{tool}"
headers = {
"Authorization": f"Bearer {self._token}",
"Content-Type": "application/json",
}
payload = body or {}
async def _attempt() -> Any:
return await self._do_request(
url=url,
headers=headers,
payload=payload,
tool=tool,
client=client,
)
if self._retry_max <= 1:
return await _attempt()
retry_kwargs: dict[str, Any] = {
"stop": stop_after_attempt(self._retry_max),
"wait": wait_exponential(multiplier=self._retry_base_delay, min=1, max=30),
"retry": retry_if_exception_type(_RETRYABLE),
"reraise": True,
}
if self._sleep is not None:
retry_kwargs["sleep"] = self._sleep
retrier = AsyncRetrying(**retry_kwargs)
try:
async for attempt in retrier:
with attempt:
return await _attempt()
except RetryError as exc: # pragma: no cover — reraise=True covers it
raise exc.last_attempt.exception() or McpError(
"retry exhausted", service=self._service, tool=tool
) from exc
# mypy needs an explicit fall-through — retry never falls out of the loop
raise McpError(
"unreachable retry loop exit", service=self._service, tool=tool
) # pragma: no cover
async def _do_request(
self,
*,
url: str,
headers: dict[str, str],
payload: dict[str, Any],
tool: str,
client: httpx.AsyncClient | None,
) -> Any:
request_client = client or httpx.AsyncClient(timeout=self._timeout)
owned = client is None
try:
try:
response = await request_client.post(url, json=payload, headers=headers)
except httpx.TimeoutException as exc:
raise McpTimeoutError(
f"timeout calling {self._service}.{tool}",
service=self._service,
tool=tool,
) from exc
except httpx.HTTPError as exc:
raise McpServerError(
f"HTTP error calling {self._service}.{tool}: {exc}",
service=self._service,
tool=tool,
) from exc
self._raise_for_status(response, tool=tool)
try:
data: Any = response.json()
except json.JSONDecodeError as exc:
raise McpServerError(
f"{self._service}.{tool}: response is not JSON",
service=self._service,
tool=tool,
) from exc
if isinstance(data, dict) and data.get("state") == "error":
raise McpToolError(
f"{self._service}.{tool} returned error: "
f"{data.get('error', 'unknown')}",
service=self._service,
tool=tool,
payload=data,
)
return data
finally:
if owned:
await request_client.aclose()
def _raise_for_status(self, response: httpx.Response, *, tool: str) -> None:
status = response.status_code
if 200 <= status < 300:
return
if status in (401, 403):
raise McpAuthError(
f"{self._service}.{tool} authentication failed (HTTP {status})",
service=self._service,
tool=tool,
)
if status == 404:
raise McpNotFoundError(
f"{self._service}.{tool} not found (HTTP 404)",
service=self._service,
tool=tool,
)
# 4xx other than auth/404 → tool error from server side; do not retry.
# 5xx → server fault, retry-eligible.
message = (
f"{self._service}.{tool} HTTP {status}: "
f"{(response.text or '')[:200]!r}"
)
if 500 <= status < 600:
raise McpServerError(message, service=self._service, tool=tool)
raise McpToolError(message, service=self._service, tool=tool)
src/cerbero_bite/clients/_exceptions.py (+73)
@@ -0,0 +1,73 @@
"""Typed exceptions raised by the MCP client wrappers.
Every wrapper translates HTTP status codes and JSON error envelopes into
one of these classes so the orchestrator can decide its degradation
strategy without inspecting raw HTTP traces.
"""
from __future__ import annotations
__all__ = [
"McpAuthError",
"McpDataAnomalyError",
"McpError",
"McpNotFoundError",
"McpServerError",
"McpTimeoutError",
"McpToolError",
]
class McpError(Exception):
"""Base class for every MCP-side failure surfaced to the engine."""
def __init__(
self,
message: str,
*,
service: str | None = None,
tool: str | None = None,
) -> None:
super().__init__(message)
self.service = service
self.tool = tool
class McpTimeoutError(McpError):
"""The MCP service did not respond within the configured timeout."""
class McpAuthError(McpError):
"""The bearer token was rejected (HTTP 401/403)."""
class McpNotFoundError(McpError):
"""Endpoint missing on the server (HTTP 404). Indicates schema drift."""
class McpServerError(McpError):
"""5xx or unexpected HTTP error returned by the MCP service."""
class McpToolError(McpError):
"""The tool returned a structured error envelope (e.g. ``{state: "error"}``)."""
def __init__(
self,
message: str,
*,
service: str | None = None,
tool: str | None = None,
payload: dict[str, object] | None = None,
) -> None:
super().__init__(message, service=service, tool=tool)
self.payload = payload or {}
class McpDataAnomalyError(McpError):
"""The response shape was valid but the values are nonsensical.
Example: every option in the chain has ``mark_iv == 7%`` (the
Deribit testnet placeholder), or every bid is zero. The orchestrator
skips the cycle and alerts.
"""
src/cerbero_bite/clients/deribit.py (+336)
@@ -0,0 +1,336 @@
"""Wrapper around ``mcp-deribit``.
Exposes the read tools Cerbero Bite needs to evaluate entry/exit and
the ``place_combo_order`` write path that submits the credit spread
atomically. Everything is converted to ``Decimal`` at the boundary so
the ``core/`` algorithms stay in their preferred numeric domain.
"""
from __future__ import annotations
import re
from datetime import UTC, datetime
from decimal import Decimal
from typing import Any, Literal
from pydantic import BaseModel, ConfigDict
from cerbero_bite.clients._base import HttpToolClient
from cerbero_bite.clients._exceptions import McpDataAnomalyError
from cerbero_bite.core.types import PutOrCall
__all__ = [
"ComboLegOrder",
"ComboOrderResult",
"DeribitClient",
"DeribitEnvironment",
"InstrumentMeta",
]
_INSTRUMENT_RE = re.compile(
r"^(?P<asset>[A-Z]+)-"
r"(?P<expiry>\d{1,2}[A-Z]{3}\d{2})-"
r"(?P<strike>\d+)-"
r"(?P<type>[PC])$"
)
class DeribitEnvironment(BaseModel):
"""Result of the ``environment_info`` tool."""
model_config = ConfigDict(frozen=True, extra="ignore")
exchange: str
environment: Literal["testnet", "mainnet"]
source: str
env_value: str | None
base_url: str
max_leverage: int | None = None
class InstrumentMeta(BaseModel):
"""Static metadata of a Deribit option instrument."""
model_config = ConfigDict(frozen=True, extra="ignore")
name: str
strike: Decimal
expiry: datetime
option_type: PutOrCall
open_interest: Decimal | None
tick_size: Decimal | None
min_trade_amount: Decimal | None
class ComboLegOrder(BaseModel):
"""One leg of a combo order request."""
model_config = ConfigDict(frozen=True, extra="forbid")
instrument_name: str
direction: Literal["buy", "sell"]
ratio: int = 1
class ComboOrderResult(BaseModel):
"""Outcome of a ``place_combo_order`` invocation."""
model_config = ConfigDict(frozen=True, extra="ignore")
combo_instrument: str
order_id: str | None
state: str
average_price_eth: Decimal | None
filled_amount: Decimal | None
raw: dict[str, Any]
def _parse_instrument(name: str) -> tuple[Decimal, datetime, PutOrCall]:
"""Return ``(strike, expiry, option_type)`` parsed from a Deribit instrument."""
match = _INSTRUMENT_RE.match(name)
if not match:
raise McpDataAnomalyError(
f"deribit instrument name '{name}' does not match expected pattern"
)
expiry = datetime.strptime(match.group("expiry"), "%d%b%y").replace(
hour=8, minute=0, second=0, tzinfo=UTC
)
return Decimal(match.group("strike")), expiry, match.group("type") # type: ignore[return-value]
def _to_decimal(value: Any) -> Decimal | None:
if value is None:
return None
return Decimal(str(value))
class DeribitClient:
SERVICE = "deribit"
def __init__(self, http: HttpToolClient) -> None:
if http.service != self.SERVICE:
raise ValueError(
f"DeribitClient requires service '{self.SERVICE}', got '{http.service}'"
)
self._http = http
# ------------------------------------------------------------------
# Environment / health
# ------------------------------------------------------------------
async def environment_info(self) -> DeribitEnvironment:
raw = await self._http.call("environment_info", {})
return DeribitEnvironment(**raw)
# ------------------------------------------------------------------
# Market data
# ------------------------------------------------------------------
async def index_price_eth(self) -> Decimal:
"""Return the ETH spot proxy used by combo selection.
Deribit does not expose the index price as its own MCP tool, so
we use the ``ETH-PERPETUAL`` mark price as a proxy. On testnet
this is good enough; on mainnet we will swap in a dedicated
index call when the server exposes it.
"""
raw = await self._http.call(
"get_ticker", {"instrument_name": "ETH-PERPETUAL"}
)
mark = raw.get("mark_price")
if mark is None:
raise McpDataAnomalyError(
"deribit ETH-PERPETUAL mark_price missing",
service=self.SERVICE,
tool="get_ticker",
)
return Decimal(str(mark))
async def latest_dvol(
self,
*,
currency: str = "ETH",
now: datetime | None = None,
) -> Decimal:
"""Return the latest DVOL value for ``currency``."""
when = (now or datetime.now(UTC)).astimezone(UTC)
body = {
"currency": currency,
"start_date": (when.date()).isoformat(),
"end_date": when.date().isoformat(),
"resolution": "1D",
}
raw = await self._http.call("get_dvol", body)
latest = raw.get("latest")
if latest is None:
candles = raw.get("candles") or []
if not candles:
raise McpDataAnomalyError(
"deribit DVOL response has neither 'latest' nor 'candles'",
service=self.SERVICE,
tool="get_dvol",
)
tail = candles[-1]
latest = tail.get("close") if isinstance(tail, dict) else None
if latest is None:
raise McpDataAnomalyError(
"deribit DVOL last candle missing 'close'",
service=self.SERVICE,
tool="get_dvol",
)
return Decimal(str(latest))
async def options_chain(
self,
*,
currency: str = "ETH",
expiry_from: datetime | None = None,
expiry_to: datetime | None = None,
min_open_interest: int | None = None,
limit: int = 500,
) -> list[InstrumentMeta]:
"""Return option instruments matching the filters as typed metadata."""
body: dict[str, Any] = {"currency": currency, "kind": "option", "limit": limit}
if expiry_from is not None:
body["expiry_from"] = expiry_from.date().isoformat()
if expiry_to is not None:
body["expiry_to"] = expiry_to.date().isoformat()
if min_open_interest is not None:
body["min_open_interest"] = min_open_interest
raw = await self._http.call("get_instruments", body)
instruments = raw.get("instruments") or []
out: list[InstrumentMeta] = []
for entry in instruments:
if not isinstance(entry, dict):
continue
name = entry.get("name")
if not isinstance(name, str):
continue
try:
strike, expiry, option_type = _parse_instrument(name)
except McpDataAnomalyError:
continue
out.append(
InstrumentMeta(
name=name,
strike=strike,
expiry=expiry,
option_type=option_type,
open_interest=_to_decimal(entry.get("open_interest")),
tick_size=_to_decimal(entry.get("tick_size")),
min_trade_amount=_to_decimal(entry.get("min_trade_amount")),
)
)
return out
async def get_tickers(self, instrument_names: list[str]) -> list[dict[str, Any]]:
"""Fetch full ticker data for up to 20 instruments at once."""
if not instrument_names:
return []
if len(instrument_names) > 20:
raise ValueError("get_tickers: max 20 instruments per call")
raw = await self._http.call(
"get_ticker_batch", {"instrument_names": instrument_names}
)
if isinstance(raw, dict) and raw.get("error"):
raise McpDataAnomalyError(
f"deribit get_ticker_batch error: {raw['error']}",
service=self.SERVICE,
tool="get_ticker_batch",
)
return list(raw.get("tickers") or [])
async def orderbook_depth_top3(self, instrument_name: str) -> int:
"""Sum of size on the top-3 bid + top-3 ask levels."""
raw = await self._http.call(
"get_orderbook", {"instrument_name": instrument_name, "depth": 3}
)
bids = raw.get("bids") or []
asks = raw.get("asks") or []
def _sum(rows: list[Any]) -> Decimal:
total = Decimal("0")
for row in rows[:3]:
if isinstance(row, list | tuple) and len(row) >= 2:
total += Decimal(str(row[1]))
elif isinstance(row, dict):
size = row.get("amount") or row.get("size") or 0
total += Decimal(str(size))
return total
return int(_sum(bids) + _sum(asks))
async def get_account_summary(self, currency: str = "USDC") -> dict[str, Any]:
result: Any = await self._http.call(
"get_account_summary", {"currency": currency}
)
return result if isinstance(result, dict) else {}
async def get_positions(self, currency: str = "USDC") -> list[dict[str, Any]]:
raw = await self._http.call("get_positions", {"currency": currency})
if isinstance(raw, list):
return raw
# Server may also wrap the list under a key — defensive only.
return list(raw.get("positions") or []) # pragma: no cover
# ------------------------------------------------------------------
# Execution
# ------------------------------------------------------------------
async def place_combo_order(
self,
*,
legs: list[ComboLegOrder],
side: Literal["buy", "sell"],
n_contracts: int,
limit_price_eth: Decimal | None = None,
order_type: Literal["limit", "market"] = "limit",
label: str | None = None,
) -> ComboOrderResult:
"""Submit a combo order atomically."""
if len(legs) < 2:
raise ValueError("place_combo_order requires at least 2 legs")
if n_contracts <= 0:
raise ValueError("place_combo_order: n_contracts must be > 0")
if order_type == "limit" and limit_price_eth is None:
raise ValueError("place_combo_order: limit price required for type=limit")
body: dict[str, Any] = {
"legs": [leg.model_dump() for leg in legs],
"side": side,
"amount": n_contracts,
"type": order_type,
}
if limit_price_eth is not None:
body["price"] = float(limit_price_eth)
if label is not None:
body["label"] = label
raw = await self._http.call("place_combo_order", body)
if not isinstance(raw, dict):
raise McpDataAnomalyError(
"place_combo_order: server returned non-object",
service=self.SERVICE,
tool="place_combo_order",
)
combo_instrument = raw.get("combo_instrument")
if not isinstance(combo_instrument, str):
raise McpDataAnomalyError(
"place_combo_order: missing 'combo_instrument' in response",
service=self.SERVICE,
tool="place_combo_order",
)
return ComboOrderResult(
combo_instrument=combo_instrument,
order_id=raw.get("order_id"),
state=str(raw.get("state") or "unknown"),
average_price_eth=_to_decimal(raw.get("average_price")),
filled_amount=_to_decimal(raw.get("filled_amount")),
raw=raw,
)
async def cancel_order(self, order_id: str) -> dict[str, Any]:
result: Any = await self._http.call("cancel_order", {"order_id": order_id})
return result if isinstance(result, dict) else {}
# ── clients/hyperliquid.py (new file, +49 lines) ──
"""Wrapper around ``mcp-hyperliquid``.
Cerbero Bite consumes a single tool: ``get_funding_rate`` for ETH-PERP,
used by entry filter §2.6 of ``docs/01-strategy-rules.md`` (cap on the
absolute annualised funding rate).
"""
from __future__ import annotations
from decimal import Decimal
from cerbero_bite.clients._base import HttpToolClient
from cerbero_bite.clients._exceptions import McpDataAnomalyError
__all__ = ["HOURLY_FUNDING_PERIODS_PER_YEAR", "HyperliquidClient"]
HOURLY_FUNDING_PERIODS_PER_YEAR = 24 * 365 # = 8760
class HyperliquidClient:
SERVICE = "hyperliquid"
def __init__(self, http: HttpToolClient) -> None:
if http.service != self.SERVICE:
raise ValueError(
f"HyperliquidClient requires service '{self.SERVICE}', got '{http.service}'"
)
self._http = http
async def funding_rate_annualized(self, asset: str) -> Decimal:
"""Return the latest funding rate of ``asset`` as an annualised fraction."""
raw = await self._http.call(
"get_funding_rate", {"instrument": asset.upper()}
)
if raw.get("error"):
raise McpDataAnomalyError(
f"hyperliquid get_funding_rate error: {raw['error']}",
service=self.SERVICE,
tool="get_funding_rate",
)
rate = raw.get("current_funding_rate")
if rate is None:
raise McpDataAnomalyError(
"hyperliquid response missing 'current_funding_rate'",
service=self.SERVICE,
tool="get_funding_rate",
)
return Decimal(str(rate)) * Decimal(HOURLY_FUNDING_PERIODS_PER_YEAR)
# ── clients/macro.py (new file, +115 lines) ──
"""Wrapper around ``mcp-macro`` (``docs/04-mcp-integration.md``).
Exposes a single use case relevant to Cerbero Bite: how many days
separate the current moment from the next high-severity macro event in
the requested window. The orchestrator feeds the result straight into
``entry_validator.EntryContext.next_macro_event_in_days``.
"""
from __future__ import annotations
from datetime import UTC, datetime
from typing import Any
from pydantic import BaseModel, ConfigDict
from cerbero_bite.clients._base import HttpToolClient
__all__ = ["MacroClient", "MacroEvent"]
class MacroEvent(BaseModel):
"""One row of the macro calendar."""
model_config = ConfigDict(frozen=True, extra="ignore")
name: str
country_code: str
importance: str
datetime_utc: datetime | None
class MacroClient:
"""High-level wrapper that returns typed macro events."""
SERVICE = "macro"
def __init__(self, http: HttpToolClient) -> None:
if http.service != self.SERVICE:
raise ValueError(
f"MacroClient requires service '{self.SERVICE}', got '{http.service}'"
)
self._http = http
async def get_calendar(
self,
*,
days: int,
country_filter: list[str] | None = None,
importance_min: str | None = None,
) -> list[MacroEvent]:
"""Return the events in the next ``days`` matching the filters."""
body: dict[str, Any] = {"days": days}
if country_filter is not None:
body["country_filter"] = country_filter
if importance_min is not None:
body["importance_min"] = importance_min
raw = await self._http.call("get_macro_calendar", body)
events = raw.get("events") or []
out: list[MacroEvent] = []
for entry in events:
if not isinstance(entry, dict):
continue
out.append(
MacroEvent(
name=str(entry.get("name") or entry.get("event") or ""),
country_code=str(entry.get("country_code") or ""),
importance=str(entry.get("importance") or "medium"),
datetime_utc=_parse_dt(entry.get("datetime_utc"))
or _parse_dt(entry.get("date")),
)
)
return out
async def next_high_severity_within(
self,
*,
days: int,
countries: list[str] | None = None,
now: datetime | None = None,
) -> int | None:
"""Days until the first high-severity event within ``days``, else None.
``now`` is taken from the parameter (default: now in UTC) so the
decision is reproducible in tests; the result is rounded up to
whole days because the strategy filter compares with DTE in
days.
"""
events = await self.get_calendar(
days=days,
country_filter=countries,
importance_min="high",
)
reference = (now or datetime.now(UTC)).astimezone(UTC)
deltas: list[int] = []
for event in events:
if event.datetime_utc is None:
continue
delta = event.datetime_utc - reference
seconds = delta.total_seconds()
if seconds < 0:
continue
deltas.append(int(seconds // 86400))
return min(deltas) if deltas else None
def _parse_dt(value: Any) -> datetime | None:
if not isinstance(value, str) or not value:
return None
try:
out = datetime.fromisoformat(value.replace("Z", "+00:00"))
except ValueError:
return None
if out.tzinfo is None:
out = out.replace(tzinfo=UTC)
return out
# ── clients/portfolio.py (new file, +92 lines) ──
"""Wrapper around ``mcp-portfolio``.
Cerbero Bite uses two pieces of information from this service:
* total portfolio value (EUR) — fed to the sizing engine after FX
conversion to USD;
* exposure of a specific asset as percentage of the total portfolio —
used by entry filter §2.7 (``eth_holdings_pct_max``).
The portfolio service stores everything in EUR. The orchestrator is
responsible for the EUR→USD conversion using a live FX rate.
"""
from __future__ import annotations
from decimal import Decimal
from typing import Any
from cerbero_bite.clients._base import HttpToolClient
from cerbero_bite.clients._exceptions import McpDataAnomalyError
__all__ = ["PortfolioClient"]
class PortfolioClient:
SERVICE = "portfolio"
def __init__(self, http: HttpToolClient) -> None:
if http.service != self.SERVICE:
raise ValueError(
f"PortfolioClient requires service '{self.SERVICE}', got '{http.service}'"
)
self._http = http
async def total_equity_eur(self) -> Decimal:
"""Return the aggregate portfolio value in EUR."""
raw = await self._http.call(
"get_total_portfolio_value", {"currency": "EUR"}
)
if not isinstance(raw, dict):
raise McpDataAnomalyError(
f"portfolio total_value_eur unexpected shape: {type(raw).__name__}",
service=self.SERVICE,
tool="get_total_portfolio_value",
)
value = raw.get("total_value_eur")
if value is None:
raise McpDataAnomalyError(
"portfolio response missing 'total_value_eur'",
service=self.SERVICE,
tool="get_total_portfolio_value",
)
return Decimal(str(value))
async def asset_pct_of_portfolio(self, ticker: str) -> Decimal:
"""Return the fraction (0..1) of the portfolio held in ``ticker``.
Iterates the holdings list and aggregates ``current_value_eur``
for any holding whose ticker contains ``ticker`` (case-insensitive).
Empty portfolio → 0.
"""
holdings = await self._http.call("get_holdings", {"min_value_eur": 0})
if not isinstance(holdings, list):
raise McpDataAnomalyError(
f"portfolio get_holdings unexpected shape: {type(holdings).__name__}",
service=self.SERVICE,
tool="get_holdings",
)
target = ticker.upper()
matching_value = Decimal("0")
total_value = Decimal("0")
for entry in holdings:
if not isinstance(entry, dict):
continue
value = entry.get("current_value_eur")
if value is None:
continue
value_dec = Decimal(str(value))
total_value += value_dec
entry_ticker = str(entry.get("ticker") or "").upper()
if target in entry_ticker:
matching_value += value_dec
if total_value == 0:
return Decimal("0")
return matching_value / total_value
async def health(self) -> dict[str, Any]:
"""Lightweight call used by ``cerbero-bite ping``."""
result: Any = await self._http.call("get_last_update_info", {})
return result if isinstance(result, dict) else {}
# ── clients/sentiment.py (new file, +79 lines) ──
"""Wrapper around ``mcp-sentiment``.
Cerbero Bite uses one tool from this service: ``get_cross_exchange_funding``,
the input to the directional bias of ``compute_bias`` (see
``docs/01-strategy-rules.md §3.1``).
The MCP server returns the *raw period funding rate* of each exchange
(Binance/Bybit/OKX use an 8-hour funding period; Hyperliquid uses 1
hour). The wrapper converts each value to an annualised fraction
using the period that exchange actually settles on, then returns the
median across the available venues. Exchanges that did not return a
quote (``None``) are skipped.
"""
from __future__ import annotations
import statistics
from decimal import Decimal
from cerbero_bite.clients._base import HttpToolClient
from cerbero_bite.clients._exceptions import McpDataAnomalyError
__all__ = ["EXCHANGE_PERIODS_PER_YEAR", "SentimentClient"]
# Funding settlement frequency per year. 1095 = 365 × 3 (8-hour funding).
EXCHANGE_PERIODS_PER_YEAR: dict[str, int] = {
"binance": 1095,
"bybit": 1095,
"okx": 1095,
"hyperliquid": 8760, # hourly funding
}
class SentimentClient:
SERVICE = "sentiment"
def __init__(self, http: HttpToolClient) -> None:
if http.service != self.SERVICE:
raise ValueError(
f"SentimentClient requires service '{self.SERVICE}', got '{http.service}'"
)
self._http = http
async def funding_cross_median_annualized(self, asset: str) -> Decimal:
"""Return the median annualised funding rate across known venues.
Raises :class:`McpDataAnomalyError` when the snapshot lacks any
quote — without funding data Bite cannot compute a directional
bias and must skip the cycle.
"""
raw = await self._http.call(
"get_cross_exchange_funding", {"assets": [asset.upper()]}
)
snapshot = (raw.get("snapshot") or {}).get(asset.upper())
if not isinstance(snapshot, dict):
raise McpDataAnomalyError(
f"sentiment snapshot missing for {asset}",
service=self.SERVICE,
tool="get_cross_exchange_funding",
)
annualized: list[Decimal] = []
for venue, periods in EXCHANGE_PERIODS_PER_YEAR.items():
value = snapshot.get(venue)
if value is None:
continue
annualized.append(Decimal(str(value)) * Decimal(periods))
if not annualized:
raise McpDataAnomalyError(
f"no funding venues responded for {asset}",
service=self.SERVICE,
tool="get_cross_exchange_funding",
)
# statistics.median works on Decimal: it returns an averaged
# Decimal for even counts, which is exactly what we want.
return Decimal(str(statistics.median(annualized)))
# ── clients/telegram.py (new file, +112 lines) ──
"""Wrapper around ``mcp-telegram`` (notify-only mode).
Cerbero Bite during the testnet phase (and through the soft launch) is
fully autonomous: Telegram is used purely to *notify* Adriano of what
the engine has done, never to gate execution. As a consequence:
* No ``send_with_buttons`` and no callback queue.
* Confirmation timeouts are handled inside the orchestrator's own
state machine, not by waiting on Telegram replies.
* All notifications go through one of the typed endpoints
(``notify``, ``notify_position_opened``, ``notify_position_closed``,
``notify_alert``, ``notify_system_error``) — the formatting lives
on the server side.
"""
from __future__ import annotations
from decimal import Decimal
from typing import Any
from cerbero_bite.clients._base import HttpToolClient
__all__ = ["TelegramClient"]
def _to_float(value: Decimal | float) -> float:
return float(value) if isinstance(value, Decimal) else value
class TelegramClient:
SERVICE = "telegram"
def __init__(self, http: HttpToolClient) -> None:
if http.service != self.SERVICE:
raise ValueError(
f"TelegramClient requires service '{self.SERVICE}', got '{http.service}'"
)
self._http = http
async def notify(
self,
message: str,
*,
priority: str = "normal",
tag: str | None = None,
) -> None:
body: dict[str, Any] = {"message": message, "priority": priority}
if tag is not None:
body["tag"] = tag
await self._http.call("notify", body)
async def notify_position_opened(
self,
*,
instrument: str,
side: str,
size: int,
strategy: str,
greeks: dict[str, Decimal | float] | None = None,
expected_pnl_usd: Decimal | float | None = None,
) -> None:
body: dict[str, Any] = {
"instrument": instrument,
"side": side,
"size": float(size),
"strategy": strategy,
}
if greeks is not None:
body["greeks"] = {k: _to_float(v) for k, v in greeks.items()}
if expected_pnl_usd is not None:
body["expected_pnl"] = _to_float(expected_pnl_usd)
await self._http.call("notify_position_opened", body)
async def notify_position_closed(
self,
*,
instrument: str,
realized_pnl_usd: Decimal | float,
reason: str,
) -> None:
await self._http.call(
"notify_position_closed",
{
"instrument": instrument,
"realized_pnl": _to_float(realized_pnl_usd),
"reason": reason,
},
)
async def notify_alert(
self,
*,
source: str,
message: str,
priority: str = "high",
) -> None:
await self._http.call(
"notify_alert",
{"source": source, "message": message, "priority": priority},
)
async def notify_system_error(
self,
*,
message: str,
component: str | None = None,
priority: str = "critical",
) -> None:
body: dict[str, Any] = {"message": message, "priority": priority}
if component is not None:
body["component"] = component
await self._http.call("notify_system_error", body)
# ── config/mcp_endpoints.py (new file, +108 lines) ──
"""Resolve MCP service URLs and the bearer token.
Cerbero Bite runs in its own Docker container that joins the
``cerbero-suite`` network: every MCP service is reachable by the
container DNS name plus its internal port (``mcp-deribit:9011`` etc.).
The resolver supports two layers of override:
1. Per-service environment variables (``CERBERO_BITE_MCP_DERIBIT_URL``,
``CERBERO_BITE_MCP_MACRO_URL``…). Useful for dev when running
outside Docker — point at ``http://localhost:9011`` etc.
2. ``CERBERO_BITE_CORE_TOKEN_FILE`` env var: path to the file that
stores the bearer token (default
``/run/secrets/core_token``). The file is read at boot, trailing
whitespace is stripped, and the value is *not* logged.
"""
from __future__ import annotations
import os
from dataclasses import dataclass
from pathlib import Path
__all__ = [
"DEFAULT_ENDPOINTS",
"MCP_SERVICES",
"McpEndpoints",
"load_endpoints",
"load_token",
]
# Service identifier → (default Docker DNS host, default port, env var name)
MCP_SERVICES: dict[str, tuple[str, int, str]] = {
"deribit": ("mcp-deribit", 9011, "CERBERO_BITE_MCP_DERIBIT_URL"),
"hyperliquid": ("mcp-hyperliquid", 9012, "CERBERO_BITE_MCP_HYPERLIQUID_URL"),
"macro": ("mcp-macro", 9013, "CERBERO_BITE_MCP_MACRO_URL"),
"sentiment": ("mcp-sentiment", 9014, "CERBERO_BITE_MCP_SENTIMENT_URL"),
"telegram": ("mcp-telegram", 9017, "CERBERO_BITE_MCP_TELEGRAM_URL"),
"portfolio": ("mcp-portfolio", 9018, "CERBERO_BITE_MCP_PORTFOLIO_URL"),
}
def _default_url(host: str, port: int) -> str:
return f"http://{host}:{port}"
DEFAULT_ENDPOINTS: dict[str, str] = {
name: _default_url(host, port) for name, (host, port, _) in MCP_SERVICES.items()
}
@dataclass(frozen=True)
class McpEndpoints:
"""Resolved per-service URLs."""
deribit: str
hyperliquid: str
macro: str
sentiment: str
telegram: str
portfolio: str
def for_service(self, name: str) -> str:
try:
return getattr(self, name) # type: ignore[no-any-return]
except AttributeError as exc:
raise KeyError(f"unknown MCP service '{name}'") from exc
def load_endpoints(env: dict[str, str] | None = None) -> McpEndpoints:
"""Build an :class:`McpEndpoints` honouring env-var overrides."""
e = env if env is not None else os.environ
resolved: dict[str, str] = {}
for name, (host, port, env_var) in MCP_SERVICES.items():
override = e.get(env_var)
resolved[name] = override.rstrip("/") if override else _default_url(host, port)
return McpEndpoints(**resolved)
_DEFAULT_TOKEN_FILE = "/run/secrets/core_token"
_TOKEN_FILE_ENV = "CERBERO_BITE_CORE_TOKEN_FILE"
def load_token(
*,
path: str | Path | None = None,
env: dict[str, str] | None = None,
) -> str:
"""Read the bearer token from disk and return it stripped.
Resolution order:
1. explicit ``path`` argument;
2. ``CERBERO_BITE_CORE_TOKEN_FILE`` env var;
3. ``/run/secrets/core_token`` (Docker secrets default).
"""
e = env if env is not None else os.environ
target = (
Path(path)
if path is not None
else Path(e.get(_TOKEN_FILE_ENV, _DEFAULT_TOKEN_FILE))
)
if not target.is_file():
raise FileNotFoundError(f"core token file not found: {target}")
token = target.read_text(encoding="utf-8").strip()
if not token:
raise ValueError(f"core token file is empty: {target}")
return token
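The override resolution above boils down to a per-service dict lookup; a standalone sketch over two of the six services (copied from the table above):

```python
# Service identifier → (default Docker DNS host, default port, env var name).
MCP_SERVICES: dict[str, tuple[str, int, str]] = {
    "deribit": ("mcp-deribit", 9011, "CERBERO_BITE_MCP_DERIBIT_URL"),
    "macro": ("mcp-macro", 9013, "CERBERO_BITE_MCP_MACRO_URL"),
}

def resolve(env: dict[str, str]) -> dict[str, str]:
    # Env var wins when set; trailing slashes are stripped from overrides.
    return {
        name: env[var].rstrip("/") if var in env else f"http://{host}:{port}"
        for name, (host, port, var) in MCP_SERVICES.items()
    }

# Dev outside Docker: point deribit at localhost, macro keeps its default.
print(resolve({"CERBERO_BITE_MCP_DERIBIT_URL": "http://localhost:9011/"}))
# → {'deribit': 'http://localhost:9011', 'macro': 'http://mcp-macro:9013'}
```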
# ── test module: cerbero-bite ping end-to-end (new file, +110 lines) ──
"""End-to-end test for ``cerbero-bite ping``.
The CLI uses the production code paths, so we set up an HTTP mock that
matches every URL Bite is going to hit and assert the rendered output
contains the expected statuses.
"""
from __future__ import annotations
from pathlib import Path
from click.testing import CliRunner
from pytest_httpx import HTTPXMock
from cerbero_bite.cli import main as cli_main
def _seed_token(tmp_path: Path) -> Path:
target = tmp_path / "core_token"
target.write_text("super-secret\n", encoding="utf-8")
return target
def test_ping_reports_each_service(
tmp_path: Path, httpx_mock: HTTPXMock
) -> None:
token_file = _seed_token(tmp_path)
httpx_mock.add_response(
url="http://mcp-deribit:9011/tools/environment_info",
json={
"exchange": "deribit",
"environment": "testnet",
"source": "env",
"env_value": "true",
"base_url": "https://test.deribit.com/api/v2",
"max_leverage": 3,
},
)
httpx_mock.add_response(
url="http://mcp-hyperliquid:9012/tools/get_funding_rate",
json={"asset": "ETH", "current_funding_rate": 0.0001},
)
httpx_mock.add_response(
url="http://mcp-macro:9013/tools/get_macro_calendar",
json={"events": []},
)
httpx_mock.add_response(
url="http://mcp-sentiment:9014/tools/get_cross_exchange_funding",
json={"snapshot": {"ETH": {"binance": 0.0001}}},
)
httpx_mock.add_response(
url="http://mcp-portfolio:9018/tools/get_total_portfolio_value",
json={"total_value_eur": 5000.0},
)
result = CliRunner().invoke(
cli_main, ["ping", "--token-file", str(token_file), "--timeout", "1.0"]
)
assert result.exit_code == 0, result.output
assert "deribit" in result.output
assert "hyperliquid" in result.output
assert "macro" in result.output
assert "sentiment" in result.output
assert "portfolio" in result.output
assert "telegram" in result.output # listed even if skipped
# at least 5 OK statuses
assert result.output.count("OK") >= 5
def test_ping_reports_failure_when_service_unreachable(
tmp_path: Path, httpx_mock: HTTPXMock
) -> None:
token_file = _seed_token(tmp_path)
httpx_mock.add_response(
url="http://mcp-deribit:9011/tools/environment_info",
status_code=500,
text="boom",
)
# Provide successful stubs for the others so only deribit fails.
httpx_mock.add_response(
url="http://mcp-hyperliquid:9012/tools/get_funding_rate",
json={"asset": "ETH", "current_funding_rate": 0.0001},
)
httpx_mock.add_response(
url="http://mcp-macro:9013/tools/get_macro_calendar",
json={"events": []},
)
httpx_mock.add_response(
url="http://mcp-sentiment:9014/tools/get_cross_exchange_funding",
json={"snapshot": {"ETH": {"binance": 0.0001}}},
)
httpx_mock.add_response(
url="http://mcp-portfolio:9018/tools/get_total_portfolio_value",
json={"total_value_eur": 0.0},
)
result = CliRunner().invoke(
cli_main, ["ping", "--token-file", str(token_file), "--timeout", "1.0"]
)
assert result.exit_code == 0
assert "FAIL" in result.output
def test_ping_token_missing_exits_nonzero(tmp_path: Path) -> None:
result = CliRunner().invoke(
cli_main, ["ping", "--token-file", str(tmp_path / "nope")]
)
assert result.exit_code == 1
assert "token error" in result.output
# ── test module: HttpToolClient (new file, +168 lines) ──
"""Tests for :class:`HttpToolClient`."""
from __future__ import annotations
import httpx
import pytest
from pytest_httpx import HTTPXMock
from cerbero_bite.clients._base import HttpToolClient
from cerbero_bite.clients._exceptions import (
McpAuthError,
McpNotFoundError,
McpServerError,
McpTimeoutError,
McpToolError,
)
def _make_client(**overrides: object) -> HttpToolClient:
base: dict[str, object] = {
"service": "test",
"base_url": "http://mcp-test:9000",
"token": "tok",
"retry_max": 1,
}
base.update(overrides)
return HttpToolClient(**base) # type: ignore[arg-type]
@pytest.mark.asyncio
async def test_call_returns_parsed_payload(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(
url="http://mcp-test:9000/tools/get_thing",
json={"result": [1, 2, 3]},
)
client = _make_client()
out = await client.call("get_thing", {"x": 1})
assert out == {"result": [1, 2, 3]}
@pytest.mark.asyncio
async def test_call_attaches_bearer_token(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(json={"ok": True})
client = _make_client(token="abc123")
await client.call("any")
request = httpx_mock.get_request()
assert request is not None
assert request.headers["Authorization"] == "Bearer abc123"
assert request.headers["Content-Type"] == "application/json"
@pytest.mark.asyncio
async def test_call_raises_auth_error_on_401(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(status_code=401, json={"detail": "nope"})
client = _make_client()
with pytest.raises(McpAuthError):
await client.call("any")
@pytest.mark.asyncio
async def test_call_raises_auth_error_on_403(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(status_code=403, json={"detail": "forbidden"})
client = _make_client()
with pytest.raises(McpAuthError):
await client.call("any")
@pytest.mark.asyncio
async def test_call_raises_not_found_on_404(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(status_code=404, text="missing")
client = _make_client()
with pytest.raises(McpNotFoundError):
await client.call("any")
@pytest.mark.asyncio
async def test_call_raises_server_error_on_500(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(status_code=500, text="boom")
client = _make_client()
with pytest.raises(McpServerError):
await client.call("any")
@pytest.mark.asyncio
async def test_call_raises_tool_error_on_state_error_envelope(
httpx_mock: HTTPXMock,
) -> None:
httpx_mock.add_response(json={"state": "error", "error": "bad params"})
client = _make_client()
with pytest.raises(McpToolError) as info:
await client.call("any")
assert "bad params" in str(info.value)
assert info.value.payload == {"state": "error", "error": "bad params"}
@pytest.mark.asyncio
async def test_call_raises_tool_error_on_4xx_other(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(status_code=422, text="invalid")
client = _make_client()
    with pytest.raises(McpToolError):
        await client.call("any")


@pytest.mark.asyncio
async def test_call_raises_server_error_when_response_not_json(
    httpx_mock: HTTPXMock,
) -> None:
    httpx_mock.add_response(content=b"not-json", headers={"content-type": "text/plain"})
    client = _make_client()
    with pytest.raises(McpServerError, match="not JSON"):
        await client.call("any")


@pytest.mark.asyncio
async def test_call_passes_through_list_response(httpx_mock: HTTPXMock) -> None:
    """Some MCP tools (e.g. portfolio.get_holdings) return a list."""
    httpx_mock.add_response(json=[1, 2, 3])
    client = _make_client()
    out = await client.call("any")
    assert out == [1, 2, 3]


@pytest.mark.asyncio
async def test_call_raises_timeout(httpx_mock: HTTPXMock) -> None:
    httpx_mock.add_exception(httpx.ReadTimeout("slow"))
    client = _make_client()
    with pytest.raises(McpTimeoutError):
        await client.call("any")


@pytest.mark.asyncio
async def test_call_retries_on_server_error_then_succeeds(
    httpx_mock: HTTPXMock,
) -> None:
    httpx_mock.add_response(status_code=500, text="down")
    httpx_mock.add_response(json={"ok": True})
    sleeps: list[float] = []

    async def _fake_sleep(seconds: float) -> None:
        sleeps.append(seconds)

    client = _make_client(retry_max=3, sleep=_fake_sleep)
    out = await client.call("any")
    assert out == {"ok": True}
    # one retry → at least one sleep recorded
    assert sleeps, "expected the retrier to sleep before retry"


@pytest.mark.asyncio
async def test_call_does_not_retry_auth_error(httpx_mock: HTTPXMock) -> None:
    httpx_mock.add_response(status_code=401, json={"detail": "no"})
    client = _make_client(retry_max=3)
    with pytest.raises(McpAuthError):
        await client.call("any")
    # only one request made; no retry on auth
    requests = httpx_mock.get_requests()
    assert len(requests) == 1


@pytest.mark.asyncio
async def test_base_url_trailing_slash_is_stripped(httpx_mock: HTTPXMock) -> None:
    httpx_mock.add_response(
        url="http://mcp-test:9000/tools/foo",
        json={"ok": True},
    )
    client = _make_client(base_url="http://mcp-test:9000///")
    await client.call("foo")
@@ -0,0 +1,341 @@
"""Tests for DeribitClient."""
from __future__ import annotations
import json
from datetime import UTC, datetime
from decimal import Decimal
import pytest
from pytest_httpx import HTTPXMock
from cerbero_bite.clients._base import HttpToolClient
from cerbero_bite.clients._exceptions import McpDataAnomalyError
from cerbero_bite.clients.deribit import (
ComboLegOrder,
DeribitClient,
)
def _client() -> DeribitClient:
http = HttpToolClient(
service="deribit",
base_url="http://mcp-deribit:9011",
token="t",
retry_max=1,
)
return DeribitClient(http)
def _request_body(httpx_mock: HTTPXMock, *, index: int = -1) -> dict:
requests = httpx_mock.get_requests()
assert requests, "expected at least one request"
return json.loads(requests[index].read())
# ---------------------------------------------------------------------------
# environment_info
# ---------------------------------------------------------------------------
@pytest.mark.asyncio
async def test_environment_info_parses_payload(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(
url="http://mcp-deribit:9011/tools/environment_info",
json={
"exchange": "deribit",
"environment": "testnet",
"source": "env",
"env_value": "true",
"base_url": "https://test.deribit.com/api/v2",
"max_leverage": 3,
},
)
info = await _client().environment_info()
assert info.environment == "testnet"
assert info.source == "env"
assert info.max_leverage == 3
# ---------------------------------------------------------------------------
# index_price_eth
# ---------------------------------------------------------------------------
@pytest.mark.asyncio
async def test_index_price_eth_uses_perpetual_mark(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(
url="http://mcp-deribit:9011/tools/get_ticker",
json={"instrument_name": "ETH-PERPETUAL", "mark_price": 3024.5, "bid": 3024.3},
)
out = await _client().index_price_eth()
assert out == Decimal("3024.5")
body = _request_body(httpx_mock)
assert body == {"instrument_name": "ETH-PERPETUAL"}
@pytest.mark.asyncio
async def test_index_price_eth_anomaly_when_mark_missing(
httpx_mock: HTTPXMock,
) -> None:
httpx_mock.add_response(json={"instrument_name": "ETH-PERPETUAL"})
with pytest.raises(McpDataAnomalyError, match="mark_price"):
await _client().index_price_eth()
# ---------------------------------------------------------------------------
# latest_dvol
# ---------------------------------------------------------------------------
@pytest.mark.asyncio
async def test_latest_dvol_returns_latest_field(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(
url="http://mcp-deribit:9011/tools/get_dvol",
json={"currency": "ETH", "latest": 52.4, "candles": []},
)
out = await _client().latest_dvol(now=datetime(2026, 4, 27, 14, 0, tzinfo=UTC))
assert out == Decimal("52.4")
@pytest.mark.asyncio
async def test_latest_dvol_falls_back_to_candle_close(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(
json={
"currency": "ETH",
"candles": [
{"close": 50.0},
{"close": 51.7},
],
}
)
out = await _client().latest_dvol(now=datetime(2026, 4, 27, 14, 0, tzinfo=UTC))
assert out == Decimal("51.7")
@pytest.mark.asyncio
async def test_latest_dvol_anomaly_when_empty(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(json={"currency": "ETH", "candles": []})
with pytest.raises(McpDataAnomalyError):
await _client().latest_dvol(now=datetime(2026, 4, 27, 14, 0, tzinfo=UTC))
# ---------------------------------------------------------------------------
# options_chain
# ---------------------------------------------------------------------------
@pytest.mark.asyncio
async def test_options_chain_parses_instrument_names(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(
url="http://mcp-deribit:9011/tools/get_instruments",
json={
"instruments": [
{
"name": "ETH-15MAY26-2475-P",
"strike": 2475,
"open_interest": 200,
"tick_size": 0.0005,
"min_trade_amount": 1,
},
{
"name": "ETH-15MAY26-2350-P",
"strike": 2350,
"open_interest": 150,
},
{"name": "MALFORMED-INSTRUMENT"},
]
},
)
chain = await _client().options_chain(currency="ETH")
names = [m.name for m in chain]
assert names == ["ETH-15MAY26-2475-P", "ETH-15MAY26-2350-P"]
assert chain[0].strike == Decimal("2475")
assert chain[0].expiry == datetime(2026, 5, 15, 8, 0, tzinfo=UTC)
assert chain[0].option_type == "P"
assert chain[0].open_interest == Decimal("200")
# ---------------------------------------------------------------------------
# get_tickers / orderbook
# ---------------------------------------------------------------------------
@pytest.mark.asyncio
async def test_get_tickers_returns_list(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(
url="http://mcp-deribit:9011/tools/get_ticker_batch",
json={"tickers": [{"instrument_name": "X"}], "errors": []},
)
tickers = await _client().get_tickers(["X"])
assert tickers == [{"instrument_name": "X"}]
@pytest.mark.asyncio
async def test_get_tickers_empty_short_circuits(httpx_mock: HTTPXMock) -> None:
out = await _client().get_tickers([])
assert out == []
assert httpx_mock.get_requests() == []
@pytest.mark.asyncio
async def test_get_tickers_rejects_more_than_twenty() -> None:
with pytest.raises(ValueError, match="max 20"):
await _client().get_tickers(["x"] * 21)
@pytest.mark.asyncio
async def test_get_tickers_anomaly_on_error_envelope(
httpx_mock: HTTPXMock,
) -> None:
httpx_mock.add_response(json={"error": "max 20 instruments per batch"})
with pytest.raises(McpDataAnomalyError):
await _client().get_tickers(["x"])
@pytest.mark.asyncio
async def test_orderbook_depth_top3_sums_levels(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(
url="http://mcp-deribit:9011/tools/get_orderbook",
json={
"bids": [[3000, 5], [2999, 7], [2998, 3], [2997, 100]],
"asks": [[3001, 4], [3002, 6], [3003, 2]],
},
)
depth = await _client().orderbook_depth_top3("ETH-15MAY26-2475-P")
# bids 5+7+3=15; asks 4+6+2=12 → 27
assert depth == 27
# ---------------------------------------------------------------------------
# place_combo_order
# ---------------------------------------------------------------------------
@pytest.mark.asyncio
async def test_place_combo_order_submits_correct_body(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(
url="http://mcp-deribit:9011/tools/place_combo_order",
json={
"combo_instrument": "ETH-15MAY26-2475P_2350P",
"order_id": "ord-1",
"state": "open",
"average_price": None,
"filled_amount": 0,
},
)
legs = [
ComboLegOrder(instrument_name="ETH-15MAY26-2475-P", direction="sell"),
ComboLegOrder(instrument_name="ETH-15MAY26-2350-P", direction="buy"),
]
res = await _client().place_combo_order(
legs=legs,
side="sell",
n_contracts=2,
limit_price_eth=Decimal("0.005"),
label="weekly-open",
)
body = _request_body(httpx_mock)
assert body["side"] == "sell"
assert body["amount"] == 2
assert body["price"] == 0.005
assert body["label"] == "weekly-open"
assert body["legs"] == [
{"instrument_name": "ETH-15MAY26-2475-P", "direction": "sell", "ratio": 1},
{"instrument_name": "ETH-15MAY26-2350-P", "direction": "buy", "ratio": 1},
]
assert res.combo_instrument == "ETH-15MAY26-2475P_2350P"
assert res.order_id == "ord-1"
assert res.state == "open"
@pytest.mark.asyncio
async def test_place_combo_rejects_single_leg() -> None:
with pytest.raises(ValueError, match="at least 2 legs"):
await _client().place_combo_order(
legs=[ComboLegOrder(instrument_name="X", direction="sell")],
side="sell",
n_contracts=1,
limit_price_eth=Decimal("0.001"),
)
@pytest.mark.asyncio
async def test_place_combo_rejects_zero_contracts() -> None:
legs = [
ComboLegOrder(instrument_name="A", direction="sell"),
ComboLegOrder(instrument_name="B", direction="buy"),
]
with pytest.raises(ValueError, match="n_contracts"):
await _client().place_combo_order(
legs=legs, side="sell", n_contracts=0, limit_price_eth=Decimal("0.001")
)
@pytest.mark.asyncio
async def test_place_combo_rejects_limit_without_price() -> None:
legs = [
ComboLegOrder(instrument_name="A", direction="sell"),
ComboLegOrder(instrument_name="B", direction="buy"),
]
with pytest.raises(ValueError, match="limit price"):
await _client().place_combo_order(
legs=legs, side="sell", n_contracts=1, limit_price_eth=None
)
@pytest.mark.asyncio
async def test_place_combo_anomaly_when_combo_instrument_missing(
httpx_mock: HTTPXMock,
) -> None:
httpx_mock.add_response(json={"order_id": "x"})
legs = [
ComboLegOrder(instrument_name="A", direction="sell"),
ComboLegOrder(instrument_name="B", direction="buy"),
]
with pytest.raises(McpDataAnomalyError, match="combo_instrument"):
await _client().place_combo_order(
legs=legs, side="sell", n_contracts=1, limit_price_eth=Decimal("0.001")
)
# ---------------------------------------------------------------------------
# cancel + accounts
# ---------------------------------------------------------------------------
@pytest.mark.asyncio
async def test_cancel_order_passes_id(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(
url="http://mcp-deribit:9011/tools/cancel_order",
json={"order_id": "ord-1", "state": "cancelled"},
)
out = await _client().cancel_order("ord-1")
assert out["state"] == "cancelled"
@pytest.mark.asyncio
async def test_get_account_summary_passes_currency(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(json={"equity": 1000, "balance": 1000})
out = await _client().get_account_summary("USDC")
assert out["equity"] == 1000
@pytest.mark.asyncio
async def test_get_positions_returns_list(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(json=[{"instrument": "ETH-PERPETUAL", "size": 1}])
out = await _client().get_positions()
assert out == [{"instrument": "ETH-PERPETUAL", "size": 1}]
# ---------------------------------------------------------------------------
# misc
# ---------------------------------------------------------------------------
def test_deribit_client_rejects_wrong_service() -> None:
bad = HttpToolClient(
service="macro", base_url="http://x:1", token="t", retry_max=1
)
with pytest.raises(ValueError, match="requires service 'deribit'"):
DeribitClient(bad)
@@ -0,0 +1,57 @@
"""Tests for HyperliquidClient."""
from __future__ import annotations
from decimal import Decimal
import pytest
from pytest_httpx import HTTPXMock
from cerbero_bite.clients._base import HttpToolClient
from cerbero_bite.clients._exceptions import McpDataAnomalyError
from cerbero_bite.clients.hyperliquid import (
HOURLY_FUNDING_PERIODS_PER_YEAR,
HyperliquidClient,
)
def _client() -> HyperliquidClient:
http = HttpToolClient(
service="hyperliquid",
base_url="http://mcp-hyperliquid:9012",
token="t",
retry_max=1,
)
return HyperliquidClient(http)
@pytest.mark.asyncio
async def test_funding_rate_annualized(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(
url="http://mcp-hyperliquid:9012/tools/get_funding_rate",
json={"asset": "ETH", "current_funding_rate": 0.00005},
)
out = await _client().funding_rate_annualized("eth")
assert out == Decimal("0.00005") * Decimal(HOURLY_FUNDING_PERIODS_PER_YEAR)
@pytest.mark.asyncio
async def test_funding_rate_anomaly_on_error_envelope(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(json={"error": "Asset XYZ not found"})
with pytest.raises(McpDataAnomalyError, match="Asset XYZ"):
await _client().funding_rate_annualized("XYZ")
@pytest.mark.asyncio
async def test_funding_rate_anomaly_when_rate_missing(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(json={"asset": "ETH"})
with pytest.raises(McpDataAnomalyError, match="current_funding_rate"):
await _client().funding_rate_annualized("ETH")
def test_hyperliquid_client_rejects_wrong_service() -> None:
bad = HttpToolClient(
service="macro", base_url="http://x:1", token="t", retry_max=1
)
with pytest.raises(ValueError, match="requires service 'hyperliquid'"):
HyperliquidClient(bad)
@@ -0,0 +1,139 @@
"""Tests for MacroClient."""
from __future__ import annotations
from datetime import UTC, datetime, timedelta
import pytest
from pytest_httpx import HTTPXMock
from cerbero_bite.clients._base import HttpToolClient
from cerbero_bite.clients.macro import MacroClient
def _client() -> MacroClient:
http = HttpToolClient(
service="macro",
base_url="http://mcp-macro:9013",
token="t",
retry_max=1,
)
return MacroClient(http)
@pytest.mark.asyncio
async def test_get_calendar_parses_events(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(
url="http://mcp-macro:9013/tools/get_macro_calendar",
json={
"events": [
{
"name": "FOMC Meeting",
"country_code": "US",
"importance": "high",
"datetime_utc": "2026-05-05T18:00:00+00:00",
},
{
"event": "ECB rate decision",
"country_code": "EU",
"importance": "high",
"datetime_utc": "2026-05-08T12:00:00Z",
},
]
},
)
events = await _client().get_calendar(days=18)
assert [e.name for e in events] == ["FOMC Meeting", "ECB rate decision"]
assert events[0].country_code == "US"
assert events[0].importance == "high"
assert events[0].datetime_utc == datetime(2026, 5, 5, 18, 0, tzinfo=UTC)
@pytest.mark.asyncio
async def test_get_calendar_skips_non_dict_entries(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(json={"events": ["not-a-dict", None]})
assert await _client().get_calendar(days=18) == []
@pytest.mark.asyncio
async def test_next_high_severity_returns_min_days(httpx_mock: HTTPXMock) -> None:
now = datetime(2026, 4, 27, 14, 0, tzinfo=UTC)
httpx_mock.add_response(
json={
"events": [
{
"name": "FOMC",
"country_code": "US",
"importance": "high",
"datetime_utc": (now + timedelta(days=10, hours=4)).isoformat(),
},
{
"name": "ECB",
"country_code": "EU",
"importance": "high",
"datetime_utc": (now + timedelta(days=3, hours=1)).isoformat(),
},
]
}
)
days = await _client().next_high_severity_within(
days=18, countries=["US", "EU"], now=now
)
assert days == 3
@pytest.mark.asyncio
async def test_next_high_severity_ignores_past_events(httpx_mock: HTTPXMock) -> None:
now = datetime(2026, 4, 27, 14, 0, tzinfo=UTC)
httpx_mock.add_response(
json={
"events": [
{
"name": "stale",
"country_code": "US",
"importance": "high",
"datetime_utc": (now - timedelta(days=2)).isoformat(),
},
]
}
)
assert (
await _client().next_high_severity_within(days=18, countries=None, now=now)
is None
)
@pytest.mark.asyncio
async def test_next_high_severity_no_events_returns_none(
httpx_mock: HTTPXMock,
) -> None:
httpx_mock.add_response(json={"events": []})
out = await _client().next_high_severity_within(
days=18, countries=["US"], now=datetime(2026, 4, 27, 14, 0, tzinfo=UTC)
)
assert out is None
@pytest.mark.asyncio
async def test_get_calendar_passes_filters(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(json={"events": []})
await _client().get_calendar(
days=18, country_filter=["US", "EU"], importance_min="high"
)
request = httpx_mock.get_request()
assert request is not None
body = request.read().decode()
assert '"days":18' in body
assert '"country_filter":["US","EU"]' in body
assert '"importance_min":"high"' in body
def test_macro_client_rejects_wrong_service() -> None:
bad = HttpToolClient(
service="deribit",
base_url="http://x:1",
token="t",
retry_max=1,
)
with pytest.raises(ValueError, match="requires service 'macro'"):
MacroClient(bad)
@@ -0,0 +1,95 @@
"""Tests for PortfolioClient."""
from __future__ import annotations
from decimal import Decimal
import pytest
from pytest_httpx import HTTPXMock
from cerbero_bite.clients._base import HttpToolClient
from cerbero_bite.clients._exceptions import McpDataAnomalyError
from cerbero_bite.clients.portfolio import PortfolioClient
def _client() -> PortfolioClient:
http = HttpToolClient(
service="portfolio",
base_url="http://mcp-portfolio:9018",
token="t",
retry_max=1,
)
return PortfolioClient(http)
@pytest.mark.asyncio
async def test_total_equity_eur(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(
url="http://mcp-portfolio:9018/tools/get_total_portfolio_value",
json={"total_value_eur": 12345.67},
)
out = await _client().total_equity_eur()
assert out == Decimal("12345.67")
@pytest.mark.asyncio
async def test_total_equity_anomaly_when_missing(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(json={})
with pytest.raises(McpDataAnomalyError, match="total_value_eur"):
await _client().total_equity_eur()
@pytest.mark.asyncio
async def test_total_equity_anomaly_on_unexpected_shape(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(json=[1, 2, 3])
with pytest.raises(McpDataAnomalyError, match="unexpected shape"):
await _client().total_equity_eur()
@pytest.mark.asyncio
async def test_asset_pct_aggregates_matching_tickers(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(
url="http://mcp-portfolio:9018/tools/get_holdings",
json=[
{"ticker": "ETH-USD", "current_value_eur": 3000.0},
{"ticker": "ETHE", "current_value_eur": 1000.0}, # ETH ticker variant
{"ticker": "AAPL", "current_value_eur": 6000.0},
],
)
pct = await _client().asset_pct_of_portfolio("ETH")
# 4000 / 10000 = 0.4
assert pct == Decimal("0.4")
@pytest.mark.asyncio
async def test_asset_pct_returns_zero_for_empty_portfolio(
httpx_mock: HTTPXMock,
) -> None:
httpx_mock.add_response(json=[])
assert await _client().asset_pct_of_portfolio("ETH") == Decimal("0")
@pytest.mark.asyncio
async def test_asset_pct_skips_entries_without_value(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(
json=[
{"ticker": "ETH", "current_value_eur": None},
{"ticker": "AAPL", "current_value_eur": 1000.0},
]
)
assert await _client().asset_pct_of_portfolio("ETH") == Decimal("0")
@pytest.mark.asyncio
async def test_asset_pct_anomaly_when_response_not_list(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(json={"holdings": []})
with pytest.raises(McpDataAnomalyError, match="unexpected shape"):
await _client().asset_pct_of_portfolio("ETH")
def test_portfolio_client_rejects_wrong_service() -> None:
bad = HttpToolClient(
service="macro", base_url="http://x:1", token="t", retry_max=1
)
with pytest.raises(ValueError, match="requires service 'portfolio'"):
PortfolioClient(bad)
@@ -0,0 +1,110 @@
"""Tests for SentimentClient."""
from __future__ import annotations
from decimal import Decimal
import pytest
from pytest_httpx import HTTPXMock
from cerbero_bite.clients._base import HttpToolClient
from cerbero_bite.clients._exceptions import McpDataAnomalyError
from cerbero_bite.clients.sentiment import (
EXCHANGE_PERIODS_PER_YEAR,
SentimentClient,
)
def _client() -> SentimentClient:
http = HttpToolClient(
service="sentiment",
base_url="http://mcp-sentiment:9014",
token="t",
retry_max=1,
)
return SentimentClient(http)
@pytest.mark.asyncio
async def test_median_annualises_per_exchange_period(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(
url="http://mcp-sentiment:9014/tools/get_cross_exchange_funding",
json={
"snapshot": {
"ETH": {
"binance": 0.0001,
"bybit": 0.0002,
"okx": 0.00015,
"hyperliquid": 0.00002,
}
}
},
)
median = await _client().funding_cross_median_annualized("eth")
# Annualised values (Decimal):
# binance: 0.0001 * 1095 = 0.1095
# bybit: 0.0002 * 1095 = 0.2190
# okx: 0.00015 * 1095 = 0.16425
# hyperliquid:0.00002 * 8760 = 0.17520
# sorted: 0.1095, 0.16425, 0.17520, 0.2190 → median = (0.16425+0.17520)/2 = 0.169725
assert median == Decimal("0.169725")
@pytest.mark.asyncio
async def test_median_skips_missing_venues(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(
json={
"snapshot": {
"ETH": {
"binance": 0.0001,
"bybit": None,
"okx": None,
"hyperliquid": None,
}
}
}
)
median = await _client().funding_cross_median_annualized("ETH")
assert median == Decimal("0.0001") * Decimal("1095")
@pytest.mark.asyncio
async def test_anomaly_when_snapshot_missing(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(json={"snapshot": {}})
with pytest.raises(McpDataAnomalyError, match="snapshot missing"):
await _client().funding_cross_median_annualized("ETH")
@pytest.mark.asyncio
async def test_anomaly_when_no_venue_responded(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(
json={
"snapshot": {
"ETH": {
"binance": None,
"bybit": None,
"okx": None,
"hyperliquid": None,
}
}
}
)
with pytest.raises(McpDataAnomalyError, match="no funding venues"):
await _client().funding_cross_median_annualized("ETH")
def test_periods_table_covers_documented_venues() -> None:
assert set(EXCHANGE_PERIODS_PER_YEAR) == {
"binance", "bybit", "okx", "hyperliquid"
}
def test_sentiment_client_rejects_wrong_service() -> None:
bad = HttpToolClient(
service="macro",
base_url="http://x:1",
token="t",
retry_max=1,
)
with pytest.raises(ValueError, match="requires service 'sentiment'"):
SentimentClient(bad)
@@ -0,0 +1,122 @@
"""Tests for TelegramClient (notify-only mode)."""
from __future__ import annotations
import json
from decimal import Decimal
import pytest
from pytest_httpx import HTTPXMock
from cerbero_bite.clients._base import HttpToolClient
from cerbero_bite.clients.telegram import TelegramClient
def _client() -> TelegramClient:
http = HttpToolClient(
service="telegram",
base_url="http://mcp-telegram:9017",
token="t",
retry_max=1,
)
return TelegramClient(http)
def _request_body(httpx_mock: HTTPXMock) -> dict:
request = httpx_mock.get_request()
assert request is not None
return json.loads(request.read())
@pytest.mark.asyncio
async def test_notify_sends_message_with_priority(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(
url="http://mcp-telegram:9017/tools/notify",
json={"ok": True},
)
await _client().notify("hello", priority="high", tag="entry")
body = _request_body(httpx_mock)
assert body == {"message": "hello", "priority": "high", "tag": "entry"}
@pytest.mark.asyncio
async def test_notify_default_priority_normal(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(json={"ok": True})
await _client().notify("plain")
body = _request_body(httpx_mock)
assert body["priority"] == "normal"
assert "tag" not in body
@pytest.mark.asyncio
async def test_notify_position_opened_serialises_decimals(
httpx_mock: HTTPXMock,
) -> None:
httpx_mock.add_response(
url="http://mcp-telegram:9017/tools/notify_position_opened",
json={"ok": True},
)
await _client().notify_position_opened(
instrument="ETH-15MAY26-2475-P",
side="SELL",
size=2,
strategy="bull_put",
greeks={"delta": Decimal("-0.04"), "vega": Decimal("0.20")},
expected_pnl_usd=Decimal("45.00"),
)
body = _request_body(httpx_mock)
assert body["instrument"] == "ETH-15MAY26-2475-P"
assert body["greeks"] == {"delta": -0.04, "vega": 0.20}
assert body["expected_pnl"] == 45.0
assert body["size"] == 2.0
@pytest.mark.asyncio
async def test_notify_position_closed(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(json={"ok": True})
await _client().notify_position_closed(
instrument="ETH-15MAY26-2475-P_2350-P",
realized_pnl_usd=Decimal("32.50"),
reason="CLOSE_PROFIT",
)
body = _request_body(httpx_mock)
assert body == {
"instrument": "ETH-15MAY26-2475-P_2350-P",
"realized_pnl": 32.5,
"reason": "CLOSE_PROFIT",
}
@pytest.mark.asyncio
async def test_notify_alert(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(json={"ok": True})
await _client().notify_alert(
source="kill_switch", message="armed manually", priority="critical"
)
body = _request_body(httpx_mock)
assert body == {
"source": "kill_switch",
"message": "armed manually",
"priority": "critical",
}
@pytest.mark.asyncio
async def test_notify_system_error(httpx_mock: HTTPXMock) -> None:
httpx_mock.add_response(json={"ok": True})
await _client().notify_system_error(
message="deribit feed anomaly",
component="clients.deribit",
)
body = _request_body(httpx_mock)
assert body["message"] == "deribit feed anomaly"
assert body["component"] == "clients.deribit"
assert body["priority"] == "critical"
def test_telegram_client_rejects_wrong_service() -> None:
bad = HttpToolClient(
service="macro", base_url="http://x:1", token="t", retry_max=1
)
with pytest.raises(ValueError, match="requires service 'telegram'"):
TelegramClient(bad)
@@ -0,0 +1,76 @@
"""Tests for the MCP endpoint and token resolver."""
from __future__ import annotations
from pathlib import Path
import pytest
from cerbero_bite.config.mcp_endpoints import (
DEFAULT_ENDPOINTS,
MCP_SERVICES,
load_endpoints,
load_token,
)
def test_defaults_match_known_docker_dns() -> None:
assert DEFAULT_ENDPOINTS["deribit"] == "http://mcp-deribit:9011"
assert DEFAULT_ENDPOINTS["telegram"] == "http://mcp-telegram:9017"
def test_load_endpoints_uses_defaults_when_env_empty() -> None:
endpoints = load_endpoints(env={})
for name, url in DEFAULT_ENDPOINTS.items():
assert endpoints.for_service(name) == url
def test_per_service_env_var_override() -> None:
endpoints = load_endpoints(
env={"CERBERO_BITE_MCP_DERIBIT_URL": "http://localhost:9911"}
)
assert endpoints.deribit == "http://localhost:9911"
assert endpoints.macro == DEFAULT_ENDPOINTS["macro"]
def test_per_service_env_var_strips_trailing_slash() -> None:
endpoints = load_endpoints(
env={"CERBERO_BITE_MCP_DERIBIT_URL": "http://localhost:9911///"}
)
assert endpoints.deribit == "http://localhost:9911"
def test_for_service_unknown_raises_key_error() -> None:
endpoints = load_endpoints(env={})
with pytest.raises(KeyError, match="unknown MCP service"):
endpoints.for_service("nope")
def test_load_token_uses_explicit_path(tmp_path: Path) -> None:
target = tmp_path / "core.token"
target.write_text("abcdef\n", encoding="utf-8")
assert load_token(path=target) == "abcdef"
def test_load_token_uses_env_var(tmp_path: Path) -> None:
target = tmp_path / "core.token"
target.write_text("xyz", encoding="utf-8")
token = load_token(env={"CERBERO_BITE_CORE_TOKEN_FILE": str(target)})
assert token == "xyz"
def test_load_token_raises_when_file_missing(tmp_path: Path) -> None:
with pytest.raises(FileNotFoundError):
load_token(path=tmp_path / "missing")
def test_load_token_raises_when_file_empty(tmp_path: Path) -> None:
target = tmp_path / "empty"
target.write_text("", encoding="utf-8")
with pytest.raises(ValueError, match="empty"):
load_token(path=target)
def test_mcp_services_table_is_complete() -> None:
expected = {"deribit", "hyperliquid", "macro", "sentiment", "telegram", "portfolio"}
assert set(MCP_SERVICES) == expected
@@ -108,7 +108,6 @@ dependencies = [
     { name = "apscheduler" },
     { name = "click" },
     { name = "httpx" },
-    { name = "mcp" },
     { name = "pydantic" },
     { name = "pydantic-settings" },
     { name = "python-dateutil" },
@@ -141,6 +140,7 @@ dev = [
     { name = "pytest" },
     { name = "pytest-asyncio" },
     { name = "pytest-cov" },
+    { name = "pytest-httpx" },
     { name = "ruff" },
     { name = "types-python-dateutil" },
     { name = "types-pyyaml" },
@@ -153,7 +153,6 @@ requires-dist = [
     { name = "click", specifier = ">=8.1" },
     { name = "httpx", specifier = ">=0.27" },
     { name = "matplotlib", marker = "extra == 'backtest'", specifier = ">=3.9" },
-    { name = "mcp", specifier = ">=1.0" },
     { name = "numpy", marker = "extra == 'backtest'", specifier = ">=2.0" },
     { name = "numpy", marker = "extra == 'gui'", specifier = ">=2.0" },
     { name = "pandas", marker = "extra == 'backtest'", specifier = ">=2.2" },
@@ -180,6 +179,7 @@ dev = [
     { name = "pytest", specifier = ">=8.3" },
     { name = "pytest-asyncio", specifier = ">=0.24" },
     { name = "pytest-cov", specifier = ">=5.0" },
+    { name = "pytest-httpx", specifier = ">=0.33" },
     { name = "ruff", specifier = ">=0.7" },
     { name = "types-python-dateutil", specifier = ">=2.9" },
     { name = "types-pyyaml", specifier = ">=6.0" },
@@ -194,63 +194,6 @@ wheels = [
     { url = "https://files.pythonhosted.org/packages/22/30/7cd8fdcdfbc5b869528b079bfb76dcdf6056b1a2097a662e5e8c04f42965/certifi-2026.4.22-py3-none-any.whl", hash = "sha256:3cb2210c8f88ba2318d29b0388d1023c8492ff72ecdde4ebdaddbb13a31b1c4a", size = 135707, upload-time = "2026-04-22T11:26:09.372Z" },
 ]
[[package]]
name = "cffi"
version = "2.0.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "pycparser", marker = "implementation_name != 'PyPy'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/eb/56/b1ba7935a17738ae8453301356628e8147c79dbb825bcbc73dc7401f9846/cffi-2.0.0.tar.gz", hash = "sha256:44d1b5909021139fe36001ae048dbdde8214afa20200eda0f64c068cac5d5529", size = 523588, upload-time = "2025-09-08T23:24:04.541Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/ea/47/4f61023ea636104d4f16ab488e268b93008c3d0bb76893b1b31db1f96802/cffi-2.0.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:6d02d6655b0e54f54c4ef0b94eb6be0607b70853c45ce98bd278dc7de718be5d", size = 185271, upload-time = "2025-09-08T23:22:44.795Z" },
{ url = "https://files.pythonhosted.org/packages/df/a2/781b623f57358e360d62cdd7a8c681f074a71d445418a776eef0aadb4ab4/cffi-2.0.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8eca2a813c1cb7ad4fb74d368c2ffbbb4789d377ee5bb8df98373c2cc0dee76c", size = 181048, upload-time = "2025-09-08T23:22:45.938Z" },
{ url = "https://files.pythonhosted.org/packages/ff/df/a4f0fbd47331ceeba3d37c2e51e9dfc9722498becbeec2bd8bc856c9538a/cffi-2.0.0-cp312-cp312-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:21d1152871b019407d8ac3985f6775c079416c282e431a4da6afe7aefd2bccbe", size = 212529, upload-time = "2025-09-08T23:22:47.349Z" },
{ url = "https://files.pythonhosted.org/packages/d5/72/12b5f8d3865bf0f87cf1404d8c374e7487dcf097a1c91c436e72e6badd83/cffi-2.0.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:b21e08af67b8a103c71a250401c78d5e0893beff75e28c53c98f4de42f774062", size = 220097, upload-time = "2025-09-08T23:22:48.677Z" },
{ url = "https://files.pythonhosted.org/packages/c2/95/7a135d52a50dfa7c882ab0ac17e8dc11cec9d55d2c18dda414c051c5e69e/cffi-2.0.0-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:1e3a615586f05fc4065a8b22b8152f0c1b00cdbc60596d187c2a74f9e3036e4e", size = 207983, upload-time = "2025-09-08T23:22:50.06Z" },
{ url = "https://files.pythonhosted.org/packages/3a/c8/15cb9ada8895957ea171c62dc78ff3e99159ee7adb13c0123c001a2546c1/cffi-2.0.0-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:81afed14892743bbe14dacb9e36d9e0e504cd204e0b165062c488942b9718037", size = 206519, upload-time = "2025-09-08T23:22:51.364Z" },
{ url = "https://files.pythonhosted.org/packages/78/2d/7fa73dfa841b5ac06c7b8855cfc18622132e365f5b81d02230333ff26e9e/cffi-2.0.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:3e17ed538242334bf70832644a32a7aae3d83b57567f9fd60a26257e992b79ba", size = 219572, upload-time = "2025-09-08T23:22:52.902Z" },
{ url = "https://files.pythonhosted.org/packages/07/e0/267e57e387b4ca276b90f0434ff88b2c2241ad72b16d31836adddfd6031b/cffi-2.0.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:3925dd22fa2b7699ed2617149842d2e6adde22b262fcbfada50e3d195e4b3a94", size = 222963, upload-time = "2025-09-08T23:22:54.518Z" },
{ url = "https://files.pythonhosted.org/packages/b6/75/1f2747525e06f53efbd878f4d03bac5b859cbc11c633d0fb81432d98a795/cffi-2.0.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:2c8f814d84194c9ea681642fd164267891702542f028a15fc97d4674b6206187", size = 221361, upload-time = "2025-09-08T23:22:55.867Z" },
{ url = "https://files.pythonhosted.org/packages/7b/2b/2b6435f76bfeb6bbf055596976da087377ede68df465419d192acf00c437/cffi-2.0.0-cp312-cp312-win32.whl", hash = "sha256:da902562c3e9c550df360bfa53c035b2f241fed6d9aef119048073680ace4a18", size = 172932, upload-time = "2025-09-08T23:22:57.188Z" },
{ url = "https://files.pythonhosted.org/packages/f8/ed/13bd4418627013bec4ed6e54283b1959cf6db888048c7cf4b4c3b5b36002/cffi-2.0.0-cp312-cp312-win_amd64.whl", hash = "sha256:da68248800ad6320861f129cd9c1bf96ca849a2771a59e0344e88681905916f5", size = 183557, upload-time = "2025-09-08T23:22:58.351Z" },
{ url = "https://files.pythonhosted.org/packages/95/31/9f7f93ad2f8eff1dbc1c3656d7ca5bfd8fb52c9d786b4dcf19b2d02217fa/cffi-2.0.0-cp312-cp312-win_arm64.whl", hash = "sha256:4671d9dd5ec934cb9a73e7ee9676f9362aba54f7f34910956b84d727b0d73fb6", size = 177762, upload-time = "2025-09-08T23:22:59.668Z" },
{ url = "https://files.pythonhosted.org/packages/4b/8d/a0a47a0c9e413a658623d014e91e74a50cdd2c423f7ccfd44086ef767f90/cffi-2.0.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:00bdf7acc5f795150faa6957054fbbca2439db2f775ce831222b66f192f03beb", size = 185230, upload-time = "2025-09-08T23:23:00.879Z" },
{ url = "https://files.pythonhosted.org/packages/4a/d2/a6c0296814556c68ee32009d9c2ad4f85f2707cdecfd7727951ec228005d/cffi-2.0.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:45d5e886156860dc35862657e1494b9bae8dfa63bf56796f2fb56e1679fc0bca", size = 181043, upload-time = "2025-09-08T23:23:02.231Z" },
{ url = "https://files.pythonhosted.org/packages/b0/1e/d22cc63332bd59b06481ceaac49d6c507598642e2230f201649058a7e704/cffi-2.0.0-cp313-cp313-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:07b271772c100085dd28b74fa0cd81c8fb1a3ba18b21e03d7c27f3436a10606b", size = 212446, upload-time = "2025-09-08T23:23:03.472Z" },
{ url = "https://files.pythonhosted.org/packages/a9/f5/a2c23eb03b61a0b8747f211eb716446c826ad66818ddc7810cc2cc19b3f2/cffi-2.0.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:d48a880098c96020b02d5a1f7d9251308510ce8858940e6fa99ece33f610838b", size = 220101, upload-time = "2025-09-08T23:23:04.792Z" },
{ url = "https://files.pythonhosted.org/packages/f2/7f/e6647792fc5850d634695bc0e6ab4111ae88e89981d35ac269956605feba/cffi-2.0.0-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:f93fd8e5c8c0a4aa1f424d6173f14a892044054871c771f8566e4008eaa359d2", size = 207948, upload-time = "2025-09-08T23:23:06.127Z" },
{ url = "https://files.pythonhosted.org/packages/cb/1e/a5a1bd6f1fb30f22573f76533de12a00bf274abcdc55c8edab639078abb6/cffi-2.0.0-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:dd4f05f54a52fb558f1ba9f528228066954fee3ebe629fc1660d874d040ae5a3", size = 206422, upload-time = "2025-09-08T23:23:07.753Z" },
{ url = "https://files.pythonhosted.org/packages/98/df/0a1755e750013a2081e863e7cd37e0cdd02664372c754e5560099eb7aa44/cffi-2.0.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:c8d3b5532fc71b7a77c09192b4a5a200ea992702734a2e9279a37f2478236f26", size = 219499, upload-time = "2025-09-08T23:23:09.648Z" },
{ url = "https://files.pythonhosted.org/packages/50/e1/a969e687fcf9ea58e6e2a928ad5e2dd88cc12f6f0ab477e9971f2309b57c/cffi-2.0.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:d9b29c1f0ae438d5ee9acb31cadee00a58c46cc9c0b2f9038c6b0b3470877a8c", size = 222928, upload-time = "2025-09-08T23:23:10.928Z" },
{ url = "https://files.pythonhosted.org/packages/36/54/0362578dd2c9e557a28ac77698ed67323ed5b9775ca9d3fe73fe191bb5d8/cffi-2.0.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:6d50360be4546678fc1b79ffe7a66265e28667840010348dd69a314145807a1b", size = 221302, upload-time = "2025-09-08T23:23:12.42Z" },
{ url = "https://files.pythonhosted.org/packages/eb/6d/bf9bda840d5f1dfdbf0feca87fbdb64a918a69bca42cfa0ba7b137c48cb8/cffi-2.0.0-cp313-cp313-win32.whl", hash = "sha256:74a03b9698e198d47562765773b4a8309919089150a0bb17d829ad7b44b60d27", size = 172909, upload-time = "2025-09-08T23:23:14.32Z" },
{ url = "https://files.pythonhosted.org/packages/37/18/6519e1ee6f5a1e579e04b9ddb6f1676c17368a7aba48299c3759bbc3c8b3/cffi-2.0.0-cp313-cp313-win_amd64.whl", hash = "sha256:19f705ada2530c1167abacb171925dd886168931e0a7b78f5bffcae5c6b5be75", size = 183402, upload-time = "2025-09-08T23:23:15.535Z" },
{ url = "https://files.pythonhosted.org/packages/cb/0e/02ceeec9a7d6ee63bb596121c2c8e9b3a9e150936f4fbef6ca1943e6137c/cffi-2.0.0-cp313-cp313-win_arm64.whl", hash = "sha256:256f80b80ca3853f90c21b23ee78cd008713787b1b1e93eae9f3d6a7134abd91", size = 177780, upload-time = "2025-09-08T23:23:16.761Z" },
{ url = "https://files.pythonhosted.org/packages/92/c4/3ce07396253a83250ee98564f8d7e9789fab8e58858f35d07a9a2c78de9f/cffi-2.0.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:fc33c5141b55ed366cfaad382df24fe7dcbc686de5be719b207bb248e3053dc5", size = 185320, upload-time = "2025-09-08T23:23:18.087Z" },
{ url = "https://files.pythonhosted.org/packages/59/dd/27e9fa567a23931c838c6b02d0764611c62290062a6d4e8ff7863daf9730/cffi-2.0.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c654de545946e0db659b3400168c9ad31b5d29593291482c43e3564effbcee13", size = 181487, upload-time = "2025-09-08T23:23:19.622Z" },
{ url = "https://files.pythonhosted.org/packages/d6/43/0e822876f87ea8a4ef95442c3d766a06a51fc5298823f884ef87aaad168c/cffi-2.0.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:24b6f81f1983e6df8db3adc38562c83f7d4a0c36162885ec7f7b77c7dcbec97b", size = 220049, upload-time = "2025-09-08T23:23:20.853Z" },
{ url = "https://files.pythonhosted.org/packages/b4/89/76799151d9c2d2d1ead63c2429da9ea9d7aac304603de0c6e8764e6e8e70/cffi-2.0.0-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:12873ca6cb9b0f0d3a0da705d6086fe911591737a59f28b7936bdfed27c0d47c", size = 207793, upload-time = "2025-09-08T23:23:22.08Z" },
{ url = "https://files.pythonhosted.org/packages/bb/dd/3465b14bb9e24ee24cb88c9e3730f6de63111fffe513492bf8c808a3547e/cffi-2.0.0-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:d9b97165e8aed9272a6bb17c01e3cc5871a594a446ebedc996e2397a1c1ea8ef", size = 206300, upload-time = "2025-09-08T23:23:23.314Z" },
{ url = "https://files.pythonhosted.org/packages/47/d9/d83e293854571c877a92da46fdec39158f8d7e68da75bf73581225d28e90/cffi-2.0.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:afb8db5439b81cf9c9d0c80404b60c3cc9c3add93e114dcae767f1477cb53775", size = 219244, upload-time = "2025-09-08T23:23:24.541Z" },
{ url = "https://files.pythonhosted.org/packages/2b/0f/1f177e3683aead2bb00f7679a16451d302c436b5cbf2505f0ea8146ef59e/cffi-2.0.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:737fe7d37e1a1bffe70bd5754ea763a62a066dc5913ca57e957824b72a85e205", size = 222828, upload-time = "2025-09-08T23:23:26.143Z" },
{ url = "https://files.pythonhosted.org/packages/c6/0f/cafacebd4b040e3119dcb32fed8bdef8dfe94da653155f9d0b9dc660166e/cffi-2.0.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:38100abb9d1b1435bc4cc340bb4489635dc2f0da7456590877030c9b3d40b0c1", size = 220926, upload-time = "2025-09-08T23:23:27.873Z" },
{ url = "https://files.pythonhosted.org/packages/3e/aa/df335faa45b395396fcbc03de2dfcab242cd61a9900e914fe682a59170b1/cffi-2.0.0-cp314-cp314-win32.whl", hash = "sha256:087067fa8953339c723661eda6b54bc98c5625757ea62e95eb4898ad5e776e9f", size = 175328, upload-time = "2025-09-08T23:23:44.61Z" },
{ url = "https://files.pythonhosted.org/packages/bb/92/882c2d30831744296ce713f0feb4c1cd30f346ef747b530b5318715cc367/cffi-2.0.0-cp314-cp314-win_amd64.whl", hash = "sha256:203a48d1fb583fc7d78a4c6655692963b860a417c0528492a6bc21f1aaefab25", size = 185650, upload-time = "2025-09-08T23:23:45.848Z" },
{ url = "https://files.pythonhosted.org/packages/9f/2c/98ece204b9d35a7366b5b2c6539c350313ca13932143e79dc133ba757104/cffi-2.0.0-cp314-cp314-win_arm64.whl", hash = "sha256:dbd5c7a25a7cb98f5ca55d258b103a2054f859a46ae11aaf23134f9cc0d356ad", size = 180687, upload-time = "2025-09-08T23:23:47.105Z" },
{ url = "https://files.pythonhosted.org/packages/3e/61/c768e4d548bfa607abcda77423448df8c471f25dbe64fb2ef6d555eae006/cffi-2.0.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:9a67fc9e8eb39039280526379fb3a70023d77caec1852002b4da7e8b270c4dd9", size = 188773, upload-time = "2025-09-08T23:23:29.347Z" },
{ url = "https://files.pythonhosted.org/packages/2c/ea/5f76bce7cf6fcd0ab1a1058b5af899bfbef198bea4d5686da88471ea0336/cffi-2.0.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:7a66c7204d8869299919db4d5069a82f1561581af12b11b3c9f48c584eb8743d", size = 185013, upload-time = "2025-09-08T23:23:30.63Z" },
{ url = "https://files.pythonhosted.org/packages/be/b4/c56878d0d1755cf9caa54ba71e5d049479c52f9e4afc230f06822162ab2f/cffi-2.0.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:7cc09976e8b56f8cebd752f7113ad07752461f48a58cbba644139015ac24954c", size = 221593, upload-time = "2025-09-08T23:23:31.91Z" },
{ url = "https://files.pythonhosted.org/packages/e0/0d/eb704606dfe8033e7128df5e90fee946bbcb64a04fcdaa97321309004000/cffi-2.0.0-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:92b68146a71df78564e4ef48af17551a5ddd142e5190cdf2c5624d0c3ff5b2e8", size = 209354, upload-time = "2025-09-08T23:23:33.214Z" },
{ url = "https://files.pythonhosted.org/packages/d8/19/3c435d727b368ca475fb8742ab97c9cb13a0de600ce86f62eab7fa3eea60/cffi-2.0.0-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:b1e74d11748e7e98e2f426ab176d4ed720a64412b6a15054378afdb71e0f37dc", size = 208480, upload-time = "2025-09-08T23:23:34.495Z" },
{ url = "https://files.pythonhosted.org/packages/d0/44/681604464ed9541673e486521497406fadcc15b5217c3e326b061696899a/cffi-2.0.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:28a3a209b96630bca57cce802da70c266eb08c6e97e5afd61a75611ee6c64592", size = 221584, upload-time = "2025-09-08T23:23:36.096Z" },
{ url = "https://files.pythonhosted.org/packages/25/8e/342a504ff018a2825d395d44d63a767dd8ebc927ebda557fecdaca3ac33a/cffi-2.0.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:7553fb2090d71822f02c629afe6042c299edf91ba1bf94951165613553984512", size = 224443, upload-time = "2025-09-08T23:23:37.328Z" },
{ url = "https://files.pythonhosted.org/packages/e1/5e/b666bacbbc60fbf415ba9988324a132c9a7a0448a9a8f125074671c0f2c3/cffi-2.0.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:6c6c373cfc5c83a975506110d17457138c8c63016b563cc9ed6e056a82f13ce4", size = 223437, upload-time = "2025-09-08T23:23:38.945Z" },
{ url = "https://files.pythonhosted.org/packages/a0/1d/ec1a60bd1a10daa292d3cd6bb0b359a81607154fb8165f3ec95fe003b85c/cffi-2.0.0-cp314-cp314t-win32.whl", hash = "sha256:1fc9ea04857caf665289b7a75923f2c6ed559b8298a1b8c49e59f7dd95c8481e", size = 180487, upload-time = "2025-09-08T23:23:40.423Z" },
{ url = "https://files.pythonhosted.org/packages/bf/41/4c1168c74fac325c0c8156f04b6749c8b6a8f405bbf91413ba088359f60d/cffi-2.0.0-cp314-cp314t-win_amd64.whl", hash = "sha256:d68b6cef7827e8641e8ef16f4494edda8b36104d79773a334beaa1e3521430f6", size = 191726, upload-time = "2025-09-08T23:23:41.742Z" },
{ url = "https://files.pythonhosted.org/packages/ae/3a/dbeec9d1ee0844c679f6bb5d6ad4e9f198b1224f4e7a32825f47f6192b0c/cffi-2.0.0-cp314-cp314t-win_arm64.whl", hash = "sha256:0a1527a803f0a659de1af2e1fd700213caba79377e27e4693648c2923da066f9", size = 184195, upload-time = "2025-09-08T23:23:43.004Z" },
]

[[package]]
name = "cfgv"
version = "3.5.0"
@@ -504,59 +447,6 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/9e/ee/a4cf96b8ce1e566ed238f0659ac2d3f007ed1d14b181bcb684e19561a69a/coverage-7.13.5-py3-none-any.whl", hash = "sha256:34b02417cf070e173989b3db962f7ed56d2f644307b2cf9d5a0f258e13084a61", size = 211346, upload-time = "2026-03-17T10:33:15.691Z" },
]

[[package]]
name = "cryptography"
version = "47.0.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "cffi", marker = "platform_python_implementation != 'PyPy'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/ef/b2/7ffa7fe8207a8c42147ffe70c3e360b228160c1d85dc3faff16aaa3244c0/cryptography-47.0.0.tar.gz", hash = "sha256:9f8e55fe4e63613a5e1cc5819030f27b97742d720203a087802ce4ce9ceb52bb", size = 830863, upload-time = "2026-04-24T19:54:57.056Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/a4/98/40dfe932134bdcae4f6ab5927c87488754bf9eb79297d7e0070b78dd58e9/cryptography-47.0.0-cp311-abi3-macosx_10_9_universal2.whl", hash = "sha256:160ad728f128972d362e714054f6ba0067cab7fb350c5202a9ae8ae4ce3ef1a0", size = 7912214, upload-time = "2026-04-24T19:53:03.864Z" },
{ url = "https://files.pythonhosted.org/packages/34/c6/2733531243fba725f58611b918056b277692f1033373dcc8bd01af1c05d4/cryptography-47.0.0-cp311-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:b9a8943e359b7615db1a3ba587994618e094ff3d6fa5a390c73d079ce18b3973", size = 4644617, upload-time = "2026-04-24T19:53:06.909Z" },
{ url = "https://files.pythonhosted.org/packages/00/e3/b27be1a670a9b87f855d211cf0e1174a5d721216b7616bd52d8581d912ed/cryptography-47.0.0-cp311-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:f5c15764f261394b22aef6b00252f5195f46f2ca300bec57149474e2538b31f8", size = 4668186, upload-time = "2026-04-24T19:53:09.053Z" },
{ url = "https://files.pythonhosted.org/packages/81/b9/8443cfe5d17d482d348cee7048acf502bb89a51b6382f06240fd290d4ca3/cryptography-47.0.0-cp311-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:9c59ab0e0fa3a180a5a9c59f3a5abe3ef90d474bc56d7fadfbe80359491b615b", size = 4651244, upload-time = "2026-04-24T19:53:11.217Z" },
{ url = "https://files.pythonhosted.org/packages/5d/5e/13ed0cdd0eb88ba159d6dd5ebfece8cb901dbcf1ae5ac4072e28b55d3153/cryptography-47.0.0-cp311-abi3-manylinux_2_28_ppc64le.whl", hash = "sha256:34b4358b925a5ea3e14384ca781a2c0ef7ac219b57bb9eacc4457078e2b19f92", size = 5252906, upload-time = "2026-04-24T19:53:13.532Z" },
{ url = "https://files.pythonhosted.org/packages/64/16/ed058e1df0f33d440217cd120d41d5dda9dd215a80b8187f68483185af82/cryptography-47.0.0-cp311-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:0024b87d47ae2399165a6bfb20d24888881eeab83ae2566d62467c5ff0030ce7", size = 4701842, upload-time = "2026-04-24T19:53:15.618Z" },
{ url = "https://files.pythonhosted.org/packages/02/e0/3d30986b30fdbd9e969abbdf8ba00ed0618615144341faeb57f395a084fe/cryptography-47.0.0-cp311-abi3-manylinux_2_31_armv7l.whl", hash = "sha256:1e47422b5557bb82d3fff997e8d92cff4e28b9789576984f08c248d2b3535d93", size = 4289313, upload-time = "2026-04-24T19:53:17.755Z" },
{ url = "https://files.pythonhosted.org/packages/df/fd/32db38e3ad0cb331f0691cb4c7a8a6f176f679124dee746b3af6633db4d9/cryptography-47.0.0-cp311-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:6f29f36582e6151d9686235e586dd35bb67491f024767d10b842e520dc6a07ac", size = 4650964, upload-time = "2026-04-24T19:53:20.062Z" },
{ url = "https://files.pythonhosted.org/packages/86/53/5395d944dfd48cb1f67917f533c609c34347185ef15eb4308024c876f274/cryptography-47.0.0-cp311-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:a9b761f012a943b7de0e828843c5688d0de94a0578d44d6c85a1bae32f87791f", size = 5207817, upload-time = "2026-04-24T19:53:22.498Z" },
{ url = "https://files.pythonhosted.org/packages/34/4f/e5711b28e1901f7d480a2b1b688b645aa4c77c73f10731ed17e7f7db3f0d/cryptography-47.0.0-cp311-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:4e1de79e047e25d6e9f8cea71c86b4a53aced64134f0f003bbcbf3655fd172c8", size = 4701544, upload-time = "2026-04-24T19:53:24.356Z" },
{ url = "https://files.pythonhosted.org/packages/22/22/c8ddc25de3010fc8da447648f5a092c40e7a8fadf01dd6d255d9c0b9373d/cryptography-47.0.0-cp311-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:ef6b3634087f18d2155b1e8ce264e5345a753da2c5fa9815e7d41315c90f8318", size = 4783536, upload-time = "2026-04-24T19:53:26.665Z" },
{ url = "https://files.pythonhosted.org/packages/66/b6/d4a68f4ea999c6d89e8498579cba1c5fcba4276284de7773b17e4fa69293/cryptography-47.0.0-cp311-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:11dbb9f50a0f1bb9757b3d8c27c1101780efb8f0bdecfb12439c22a74d64c001", size = 4926106, upload-time = "2026-04-24T19:53:28.686Z" },
{ url = "https://files.pythonhosted.org/packages/54/ed/5f524db1fade9c013aa618e1c99c6ed05e8ffc9ceee6cda22fed22dda3f4/cryptography-47.0.0-cp311-abi3-win32.whl", hash = "sha256:7fda2f02c9015db3f42bb8a22324a454516ed10a8c29ca6ece6cdbb5efe2a203", size = 3258581, upload-time = "2026-04-24T19:53:31.058Z" },
{ url = "https://files.pythonhosted.org/packages/b2/dc/1b901990b174786569029f67542b3edf72ac068b6c3c8683c17e6a2f5363/cryptography-47.0.0-cp311-abi3-win_amd64.whl", hash = "sha256:f5c3296dab66202f1b18a91fa266be93d6aa0c2806ea3d67762c69f60adc71aa", size = 3775309, upload-time = "2026-04-24T19:53:33.054Z" },
{ url = "https://files.pythonhosted.org/packages/14/88/7aa18ad9c11bc87689affa5ce4368d884b517502d75739d475fc6f4a03c7/cryptography-47.0.0-cp314-cp314t-macosx_10_9_universal2.whl", hash = "sha256:be12cb6a204f77ed968bcefe68086eb061695b540a3dd05edac507a3111b25f0", size = 7904299, upload-time = "2026-04-24T19:53:35.003Z" },
{ url = "https://files.pythonhosted.org/packages/07/55/c18f75724544872f234678fdedc871391722cb34a2aee19faa9f63100bb2/cryptography-47.0.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:2ebd84adf0728c039a3be2700289378e1c164afc6748df1a5ed456767bef9ba7", size = 4631180, upload-time = "2026-04-24T19:53:37.517Z" },
{ url = "https://files.pythonhosted.org/packages/ee/65/31a5cc0eaca99cec5bafffe155d407115d96136bb161e8b49e0ef73f09a7/cryptography-47.0.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:7f68d6fbc7fbbcfb0939fea72c3b96a9f9a6edfc0e1b1d29778a2066030418b1", size = 4653529, upload-time = "2026-04-24T19:53:39.775Z" },
{ url = "https://files.pythonhosted.org/packages/e5/bc/641c0519a495f3bfd0421b48d7cd325c4336578523ccd76ea322b6c29c7a/cryptography-47.0.0-cp314-cp314t-manylinux_2_28_aarch64.whl", hash = "sha256:6651d32eff255423503aa276739da98c30f26c40cbeffcc6048e0d54ef704c0c", size = 4638570, upload-time = "2026-04-24T19:53:42.129Z" },
{ url = "https://files.pythonhosted.org/packages/2b/f2/300327b0a47f6dc94dd8b71b57052aefe178bb51745073d73d80604f11ab/cryptography-47.0.0-cp314-cp314t-manylinux_2_28_ppc64le.whl", hash = "sha256:3fb8fa48075fad7193f2e5496135c6a76ac4b2aa5a38433df0a539296b377829", size = 5238019, upload-time = "2026-04-24T19:53:44.577Z" },
{ url = "https://files.pythonhosted.org/packages/e9/5a/5b5cf994391d4bf9d9c7efd4c66aabe4d95227256627f8fea6cff7dfadbd/cryptography-47.0.0-cp314-cp314t-manylinux_2_28_x86_64.whl", hash = "sha256:11438c7518132d95f354fa01a4aa2f806d172a061a7bed18cf18cbdacdb204d7", size = 4686832, upload-time = "2026-04-24T19:53:47.015Z" },
{ url = "https://files.pythonhosted.org/packages/dc/2c/ae950e28fd6475c852fc21a44db3e6b5bcc1261d1e370f2b6e42fa800fef/cryptography-47.0.0-cp314-cp314t-manylinux_2_31_armv7l.whl", hash = "sha256:8c1a736bbb3288005796c3f7ccb9453360d7fed483b13b9f468aea5171432923", size = 4269301, upload-time = "2026-04-24T19:53:48.97Z" },
{ url = "https://files.pythonhosted.org/packages/67/fb/6a39782e150ffe5cc1b0018cb6ddc48bf7ca62b498d7539ffc8a758e977d/cryptography-47.0.0-cp314-cp314t-manylinux_2_34_aarch64.whl", hash = "sha256:f1557695e5c2b86e204f6ce9470497848634100787935ab7adc5397c54abd7ab", size = 4638110, upload-time = "2026-04-24T19:53:51.011Z" },
{ url = "https://files.pythonhosted.org/packages/8e/d7/0b3c71090a76e5c203164a47688b697635ece006dcd2499ab3a4dbd3f0bd/cryptography-47.0.0-cp314-cp314t-manylinux_2_34_ppc64le.whl", hash = "sha256:f9a034b642b960767fb343766ae5ba6ad653f2e890ddd82955aef288ffea8736", size = 5194988, upload-time = "2026-04-24T19:53:52.962Z" },
{ url = "https://files.pythonhosted.org/packages/63/33/63a961498a9df51721ab578c5a2622661411fc520e00bd83b0cc64eb20c4/cryptography-47.0.0-cp314-cp314t-manylinux_2_34_x86_64.whl", hash = "sha256:b1c76fca783aa7698eb21eb14f9c4aa09452248ee54a627d125025a43f83e7a7", size = 4686563, upload-time = "2026-04-24T19:53:55.274Z" },
{ url = "https://files.pythonhosted.org/packages/b7/bf/5ee5b145248f92250de86145d1c1d6edebbd57a7fe7caa4dedb5d4cf06a1/cryptography-47.0.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:4f7722c97826770bab8ae92959a2e7b20a5e9e9bf4deae68fd86c3ca457bab52", size = 4770094, upload-time = "2026-04-24T19:53:57.753Z" },
{ url = "https://files.pythonhosted.org/packages/92/43/21d220b2da5d517773894dacdcdb5c682c28d3fffce65548cb06e87d5501/cryptography-47.0.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:09f6d7bf6724f8db8b32f11eccf23efc8e759924bc5603800335cf8859a3ddbd", size = 4913811, upload-time = "2026-04-24T19:54:00.236Z" },
{ url = "https://files.pythonhosted.org/packages/31/98/dc4ad376ac5f1a1a7d4a83f7b0c6f2bcad36b5d2d8f30aeb482d3a7d9582/cryptography-47.0.0-cp314-cp314t-win32.whl", hash = "sha256:6eebcaf0df1d21ce1f90605c9b432dd2c4f4ab665ac29a40d5e3fc68f51b5e63", size = 3237158, upload-time = "2026-04-24T19:54:02.606Z" },
{ url = "https://files.pythonhosted.org/packages/bc/da/97f62d18306b5133468bc3f8cc73a3111e8cdc8cf8d3e69474d6e5fd2d1b/cryptography-47.0.0-cp314-cp314t-win_amd64.whl", hash = "sha256:51c9313e90bd1690ec5a75ed047c27c0b8e6c570029712943d6116ef9a90620b", size = 3758706, upload-time = "2026-04-24T19:54:04.433Z" },
{ url = "https://files.pythonhosted.org/packages/e0/34/a4fae8ae7c3bc227460c9ae43f56abf1b911da0ec29e0ebac53bb0a4b6b7/cryptography-47.0.0-cp38-abi3-macosx_10_9_universal2.whl", hash = "sha256:14432c8a9bcb37009784f9594a62fae211a2ae9543e96c92b2a8e4c3cd5cd0c4", size = 7904072, upload-time = "2026-04-24T19:54:06.411Z" },
{ url = "https://files.pythonhosted.org/packages/01/64/d7b1e54fdb69f22d24a64bb3e88dc718b31c7fb10ef0b9691a3cf7eeea6e/cryptography-47.0.0-cp38-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:07efe86201817e7d3c18781ca9770bc0db04e1e48c994be384e4602bc38f8f27", size = 4635767, upload-time = "2026-04-24T19:54:08.519Z" },
{ url = "https://files.pythonhosted.org/packages/8b/7b/cca826391fb2a94efdcdfe4631eb69306ee1cff0b22f664a412c90713877/cryptography-47.0.0-cp38-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:2b45761c6ec22b7c726d6a829558777e32d0f1c8be7c3f3480f9c912d5ee8a10", size = 4654350, upload-time = "2026-04-24T19:54:10.795Z" },
{ url = "https://files.pythonhosted.org/packages/4c/65/4b57bcc823f42a991627c51c2f68c9fd6eb1393c1756aac876cba2accae2/cryptography-47.0.0-cp38-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:edd4da498015da5b9f26d38d3bfc2e90257bfa9cbed1f6767c282a0025ae649b", size = 4643394, upload-time = "2026-04-24T19:54:13.275Z" },
{ url = "https://files.pythonhosted.org/packages/f4/c4/2c5fbeea70adbbca2bbae865e1d605d6a4a7f8dbd9d33eaf69645087f06c/cryptography-47.0.0-cp38-abi3-manylinux_2_28_ppc64le.whl", hash = "sha256:9af828c0d5a65c70ec729cd7495a4bf1a67ecb66417b8f02ff125ab8a6326a74", size = 5225777, upload-time = "2026-04-24T19:54:15.18Z" },
{ url = "https://files.pythonhosted.org/packages/7e/b8/ac57107ef32749d2b244e36069bb688792a363aaaa3acc9e3cf84c130315/cryptography-47.0.0-cp38-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:256d07c78a04d6b276f5df935a9923275f53bd1522f214447fdf365494e2d515", size = 4688771, upload-time = "2026-04-24T19:54:17.835Z" },
{ url = "https://files.pythonhosted.org/packages/56/fc/9f1de22ff8be99d991f240a46863c52d475404c408886c5a38d2b5c3bb26/cryptography-47.0.0-cp38-abi3-manylinux_2_31_armv7l.whl", hash = "sha256:5d0e362ff51041b0c0d219cc7d6924d7b8996f57ce5712bdcef71eb3c65a59cc", size = 4270753, upload-time = "2026-04-24T19:54:19.963Z" },
{ url = "https://files.pythonhosted.org/packages/00/68/d70c852797aa68e8e48d12e5a87170c43f67bb4a59403627259dd57d15de/cryptography-47.0.0-cp38-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:1581aef4219f7ca2849d0250edaa3866212fb74bf5667284f46aa92f9e65c1ca", size = 4642911, upload-time = "2026-04-24T19:54:21.818Z" },
{ url = "https://files.pythonhosted.org/packages/a5/51/661cbee74f594c5d97ff82d34f10d5551c085ca4668645f4606ebd22bd5d/cryptography-47.0.0-cp38-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:a49a3eb5341b9503fa3000a9a0db033161db90d47285291f53c2a9d2cd1b7f76", size = 5181411, upload-time = "2026-04-24T19:54:24.376Z" },
{ url = "https://files.pythonhosted.org/packages/94/87/f2b6c374a82cf076cfa1416992ac8e8ec94d79facc37aec87c1a5cb72352/cryptography-47.0.0-cp38-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:2207a498b03275d0051589e326b79d4cf59985c99031b05bb292ac52631c37fe", size = 4688262, upload-time = "2026-04-24T19:54:26.946Z" },
{ url = "https://files.pythonhosted.org/packages/14/e2/8b7462f4acf21ec509616f0245018bb197194ab0b65c2ea21a0bdd53c0eb/cryptography-47.0.0-cp38-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:7a02675e2fabd0c0fc04c868b8781863cbf1967691543c22f5470500ff840b31", size = 4775506, upload-time = "2026-04-24T19:54:28.926Z" },
{ url = "https://files.pythonhosted.org/packages/70/75/158e494e4c08dc05e039da5bb48553826bd26c23930cf8d3cd5f21fa8921/cryptography-47.0.0-cp38-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:80887c5cbd1774683cb126f0ab4184567f080071d5acf62205acb354b4b753b7", size = 4912060, upload-time = "2026-04-24T19:54:30.869Z" },
{ url = "https://files.pythonhosted.org/packages/06/bd/0a9d3edbf5eadbac926d7b9b3cd0c4be584eeeae4a003d24d9eda4affbbd/cryptography-47.0.0-cp38-abi3-win32.whl", hash = "sha256:ed67ea4e0cfb5faa5bc7ecb6e2b8838f3807a03758eec239d6c21c8769355310", size = 3248487, upload-time = "2026-04-24T19:54:33.494Z" },
{ url = "https://files.pythonhosted.org/packages/60/80/5681af756d0da3a599b7bdb586fac5a1540f1bcefd2717a20e611ddade45/cryptography-47.0.0-cp38-abi3-win_amd64.whl", hash = "sha256:835d2d7f47cdc53b3224e90810fb1d36ca94ea29cc1801fb4c1bc43876735769", size = 3755737, upload-time = "2026-04-24T19:54:35.408Z" },
]

[[package]]
name = "cycler"
version = "0.12.1"
@@ -725,15 +615,6 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/2a/39/e50c7c3a983047577ee07d2a9e53faf5a69493943ec3f6a384bdc792deb2/httpx-0.28.1-py3-none-any.whl", hash = "sha256:d909fcccc110f8c7faf814ca82a9a4d816bc5a6dbfea25d6591d6985b8ba59ad", size = 73517, upload-time = "2024-12-06T15:37:21.509Z" },
]

[[package]]
name = "httpx-sse"
version = "0.4.3"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/0f/4c/751061ffa58615a32c31b2d82e8482be8dd4a89154f003147acee90f2be9/httpx_sse-0.4.3.tar.gz", hash = "sha256:9b1ed0127459a66014aec3c56bebd93da3c1bc8bb6618c8082039a44889a755d", size = 15943, upload-time = "2025-10-10T21:48:22.271Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/d2/fd/6668e5aec43ab844de6fc74927e155a3b37bf40d7c3790e49fc0406b6578/httpx_sse-0.4.3-py3-none-any.whl", hash = "sha256:0ac1c9fe3c0afad2e0ebb25a934a59f4c7823b60792691f779fad2c5568830fc", size = 8960, upload-time = "2025-10-10T21:48:21.158Z" },
]

[[package]]
name = "hypothesis"
version = "6.152.3"
@@ -1087,31 +968,6 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/78/23/92493c3e6e1b635ccfff146f7b99e674808787915420373ac399283764c2/matplotlib-3.10.9-cp314-cp314t-win_arm64.whl", hash = "sha256:a49f1eadc84ca85fd72fa4e89e70e61bf86452df6f971af04b12c60761a0772c", size = 8324785, upload-time = "2026-04-24T00:13:53.633Z" },
]

[[package]]
name = "mcp"
version = "1.27.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "anyio" },
{ name = "httpx" },
{ name = "httpx-sse" },
{ name = "jsonschema" },
{ name = "pydantic" },
{ name = "pydantic-settings" },
{ name = "pyjwt", extra = ["crypto"] },
{ name = "python-multipart" },
{ name = "pywin32", marker = "sys_platform == 'win32'" },
{ name = "sse-starlette" },
{ name = "starlette" },
{ name = "typing-extensions" },
{ name = "typing-inspection" },
{ name = "uvicorn", marker = "sys_platform != 'emscripten'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/8b/eb/c0cfc62075dc6e1ec1c64d352ae09ac051d9334311ed226f1f425312848a/mcp-1.27.0.tar.gz", hash = "sha256:d3dc35a7eec0d458c1da4976a48f982097ddaab87e278c5511d5a4a56e852b83", size = 607509, upload-time = "2026-04-02T14:48:08.88Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/9c/46/f6b4ad632c67ef35209a66127e4bddc95759649dd595f71f13fba11bdf9a/mcp-1.27.0-py3-none-any.whl", hash = "sha256:5ce1fa81614958e267b21fb2aa34e0aea8e2c6ede60d52aba45fd47246b4d741", size = 215967, upload-time = "2026-04-02T14:48:07.24Z" },
]

[[package]]
name = "mdurl"
version = "0.1.2"
@@ -1496,15 +1352,6 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/51/be/6f79d55816d5c22557cf27533543d5d70dfe692adfbee4b99f2760674f38/pyarrow-24.0.0-cp314-cp314t-win_amd64.whl", hash = "sha256:c91d00057f23b8d353039520dc3a6c09d8608164c692e9f59a175a42b2ae0c19", size = 28131282, upload-time = "2026-04-21T10:51:16.815Z" },
]

[[package]]
name = "pycparser"
version = "3.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/1b/7d/92392ff7815c21062bea51aa7b87d45576f649f16458d78b7cf94b9ab2e6/pycparser-3.0.tar.gz", hash = "sha256:600f49d217304a5902ac3c37e1281c9fe94e4d0489de643a9504c5cdfdfc6b29", size = 103492, upload-time = "2026-01-21T14:26:51.89Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/0c/c3/44f3fbbfa403ea2a7c779186dc20772604442dde72947e7d01069cbe98e3/pycparser-3.0-py3-none-any.whl", hash = "sha256:b727414169a36b7d524c1c3e31839a521725078d7b2ff038656844266160a992", size = 48172, upload-time = "2026-01-21T14:26:50.693Z" },
]

[[package]]
name = "pydantic"
version = "2.13.3"
@@ -1631,20 +1478,6 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/f4/7e/a72dd26f3b0f4f2bf1dd8923c85f7ceb43172af56d63c7383eb62b332364/pygments-2.20.0-py3-none-any.whl", hash = "sha256:81a9e26dd42fd28a23a2d169d86d7ac03b46e2f8b59ed4698fb4785f946d0176", size = 1231151, upload-time = "2026-03-29T13:29:30.038Z" },
]

-[[package]]
-name = "pyjwt"
-version = "2.12.1"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/c2/27/a3b6e5bf6ff856d2509292e95c8f57f0df7017cf5394921fc4e4ef40308a/pyjwt-2.12.1.tar.gz", hash = "sha256:c74a7a2adf861c04d002db713dd85f84beb242228e671280bf709d765b03672b", size = 102564, upload-time = "2026-03-13T19:27:37.25Z" }
-wheels = [
-    { url = "https://files.pythonhosted.org/packages/e5/7a/8dd906bd22e79e47397a61742927f6747fe93242ef86645ee9092e610244/pyjwt-2.12.1-py3-none-any.whl", hash = "sha256:28ca37c070cad8ba8cd9790cd940535d40274d22f80ab87f3ac6a713e6e8454c", size = 29726, upload-time = "2026-03-13T19:27:35.677Z" },
-]
-
-[package.optional-dependencies]
-crypto = [
-    { name = "cryptography" },
-]
-
[[package]]
name = "pyparsing"
version = "3.3.2"
@@ -1697,6 +1530,19 @@ wheels = [
    { url = "https://files.pythonhosted.org/packages/9d/7a/d968e294073affff457b041c2be9868a40c1c71f4a35fcc1e45e5493067b/pytest_cov-7.1.0-py3-none-any.whl", hash = "sha256:a0461110b7865f9a271aa1b51e516c9a95de9d696734a2f71e3e78f46e1d4678", size = 22876, upload-time = "2026-03-21T20:11:14.438Z" },
]
+[[package]]
+name = "pytest-httpx"
+version = "0.36.2"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+    { name = "httpx" },
+    { name = "pytest" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/4e/42/f53c58570e80d503ade9dd42ce57f2915d14bcbe25f6308138143950d1d6/pytest_httpx-0.36.2.tar.gz", hash = "sha256:05a56527484f7f4e8c856419ea379b8dc359c36801c4992fdb330f294c690356", size = 57683, upload-time = "2026-04-09T13:57:19.837Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/1e/55/1fa65f8e4fceb19dd6daa867c162ad845d547f6058cd92b4b02384a44777/pytest_httpx-0.36.2-py3-none-any.whl", hash = "sha256:d42ebd5679442dc7bfb0c48e0767b6562e9bc4534d805127b0084171886a5e22", size = 20315, upload-time = "2026-04-09T13:57:18.587Z" },
+]
+
[[package]]
name = "python-dateutil"
version = "2.9.0.post0"
@@ -1731,31 +1577,6 @@ wheels = [
    { url = "https://files.pythonhosted.org/packages/0b/d7/1959b9648791274998a9c3526f6d0ec8fd2233e4d4acce81bbae76b44b2a/python_dotenv-1.2.2-py3-none-any.whl", hash = "sha256:1d8214789a24de455a8b8bd8ae6fe3c6b69a5e3d64aa8a8e5d68e694bbcb285a", size = 22101, upload-time = "2026-03-01T16:00:25.09Z" },
]
-[[package]]
-name = "python-multipart"
-version = "0.0.26"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/88/71/b145a380824a960ebd60e1014256dbb7d2253f2316ff2d73dfd8928ec2c3/python_multipart-0.0.26.tar.gz", hash = "sha256:08fadc45918cd615e26846437f50c5d6d23304da32c341f289a617127b081f17", size = 43501, upload-time = "2026-04-10T14:09:59.473Z" }
-wheels = [
-    { url = "https://files.pythonhosted.org/packages/9a/22/f1925cdda983ab66fc8ec6ec8014b959262747e58bdca26a4e3d1da29d56/python_multipart-0.0.26-py3-none-any.whl", hash = "sha256:c0b169f8c4484c13b0dcf2ef0ec3a4adb255c4b7d18d8e420477d2b1dd03f185", size = 28847, upload-time = "2026-04-10T14:09:58.131Z" },
-]
-
-[[package]]
-name = "pywin32"
-version = "311"
-source = { registry = "https://pypi.org/simple" }
-wheels = [
-    { url = "https://files.pythonhosted.org/packages/e7/ab/01ea1943d4eba0f850c3c61e78e8dd59757ff815ff3ccd0a84de5f541f42/pywin32-311-cp312-cp312-win32.whl", hash = "sha256:750ec6e621af2b948540032557b10a2d43b0cee2ae9758c54154d711cc852d31", size = 8706543, upload-time = "2025-07-14T20:13:20.765Z" },
-    { url = "https://files.pythonhosted.org/packages/d1/a8/a0e8d07d4d051ec7502cd58b291ec98dcc0c3fff027caad0470b72cfcc2f/pywin32-311-cp312-cp312-win_amd64.whl", hash = "sha256:b8c095edad5c211ff31c05223658e71bf7116daa0ecf3ad85f3201ea3190d067", size = 9495040, upload-time = "2025-07-14T20:13:22.543Z" },
-    { url = "https://files.pythonhosted.org/packages/ba/3a/2ae996277b4b50f17d61f0603efd8253cb2d79cc7ae159468007b586396d/pywin32-311-cp312-cp312-win_arm64.whl", hash = "sha256:e286f46a9a39c4a18b319c28f59b61de793654af2f395c102b4f819e584b5852", size = 8710102, upload-time = "2025-07-14T20:13:24.682Z" },
-    { url = "https://files.pythonhosted.org/packages/a5/be/3fd5de0979fcb3994bfee0d65ed8ca9506a8a1260651b86174f6a86f52b3/pywin32-311-cp313-cp313-win32.whl", hash = "sha256:f95ba5a847cba10dd8c4d8fefa9f2a6cf283b8b88ed6178fa8a6c1ab16054d0d", size = 8705700, upload-time = "2025-07-14T20:13:26.471Z" },
-    { url = "https://files.pythonhosted.org/packages/e3/28/e0a1909523c6890208295a29e05c2adb2126364e289826c0a8bc7297bd5c/pywin32-311-cp313-cp313-win_amd64.whl", hash = "sha256:718a38f7e5b058e76aee1c56ddd06908116d35147e133427e59a3983f703a20d", size = 9494700, upload-time = "2025-07-14T20:13:28.243Z" },
-    { url = "https://files.pythonhosted.org/packages/04/bf/90339ac0f55726dce7d794e6d79a18a91265bdf3aa70b6b9ca52f35e022a/pywin32-311-cp313-cp313-win_arm64.whl", hash = "sha256:7b4075d959648406202d92a2310cb990fea19b535c7f4a78d3f5e10b926eeb8a", size = 8709318, upload-time = "2025-07-14T20:13:30.348Z" },
-    { url = "https://files.pythonhosted.org/packages/c9/31/097f2e132c4f16d99a22bfb777e0fd88bd8e1c634304e102f313af69ace5/pywin32-311-cp314-cp314-win32.whl", hash = "sha256:b7a2c10b93f8986666d0c803ee19b5990885872a7de910fc460f9b0c2fbf92ee", size = 8840714, upload-time = "2025-07-14T20:13:32.449Z" },
-    { url = "https://files.pythonhosted.org/packages/90/4b/07c77d8ba0e01349358082713400435347df8426208171ce297da32c313d/pywin32-311-cp314-cp314-win_amd64.whl", hash = "sha256:3aca44c046bd2ed8c90de9cb8427f581c479e594e99b5c0bb19b29c10fd6cb87", size = 9656800, upload-time = "2025-07-14T20:13:34.312Z" },
-    { url = "https://files.pythonhosted.org/packages/c0/d2/21af5c535501a7233e734b8af901574572da66fcc254cb35d0609c9080dd/pywin32-311-cp314-cp314-win_arm64.whl", hash = "sha256:a508e2d9025764a8270f93111a970e1d0fbfc33f4153b388bb649b7eec4f9b42", size = 8932540, upload-time = "2025-07-14T20:13:36.379Z" },
-]
-
[[package]]
name = "pyyaml"
version = "6.0.3"
@@ -2084,32 +1905,6 @@ wheels = [
    { url = "https://files.pythonhosted.org/packages/e5/30/8519fdde58a7bdf155b714359791ad1dc018b47d60269d5d160d311fdc36/sqlalchemy-2.0.49-py3-none-any.whl", hash = "sha256:ec44cfa7ef1a728e88ad41674de50f6db8cfdb3e2af84af86e0041aaf02d43d0", size = 1942158, upload-time = "2026-04-03T16:53:44.135Z" },
]
-[[package]]
-name = "sse-starlette"
-version = "3.4.1"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
-    { name = "anyio" },
-    { name = "starlette" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/e1/9a/f35932a8c0eb6b2287b66fa65a0321df8c84e4e355a659c1841a37c39fdb/sse_starlette-3.4.1.tar.gz", hash = "sha256:f780bebcf6c8997fe514e3bd8e8c648d8284976b391c8bed0bcb1f611632b555", size = 35127, upload-time = "2026-04-26T13:32:32.292Z" }
-wheels = [
-    { url = "https://files.pythonhosted.org/packages/ff/07/45c21ed03d708c477367305726b89919b020a3a2a01f72aaf5ad941caf35/sse_starlette-3.4.1-py3-none-any.whl", hash = "sha256:6b43cf21f1d574d582a6e1b0cfbde1c94dc86a32a701a7168c99c4475c6bd1d0", size = 16487, upload-time = "2026-04-26T13:32:30.819Z" },
-]
-
-[[package]]
-name = "starlette"
-version = "1.0.0"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
-    { name = "anyio" },
-    { name = "typing-extensions", marker = "python_full_version < '3.13'" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/81/69/17425771797c36cded50b7fe44e850315d039f28b15901ab44839e70b593/starlette-1.0.0.tar.gz", hash = "sha256:6a4beaf1f81bb472fd19ea9b918b50dc3a77a6f2e190a12954b25e6ed5eea149", size = 2655289, upload-time = "2026-03-22T18:29:46.779Z" }
-wheels = [
-    { url = "https://files.pythonhosted.org/packages/0b/c9/584bc9651441b4ba60cc4d557d8a547b5aff901af35bda3a4ee30c819b82/starlette-1.0.0-py3-none-any.whl", hash = "sha256:d3ec55e0bb321692d275455ddfd3df75fff145d009685eb40dc91fc66b03d38b", size = 72651, upload-time = "2026-03-22T18:29:45.111Z" },
-]
-
[[package]]
name = "streamlit"
version = "1.56.0"
@@ -2252,19 +2047,6 @@ wheels = [
    { url = "https://files.pythonhosted.org/packages/39/08/aaaad47bc4e9dc8c725e68f9d04865dbcb2052843ff09c97b08904852d84/urllib3-2.6.3-py3-none-any.whl", hash = "sha256:bf272323e553dfb2e87d9bfd225ca7b0f467b919d7bbd355436d3fd37cb0acd4", size = 131584, upload-time = "2026-01-07T16:24:42.685Z" },
]
-[[package]]
-name = "uvicorn"
-version = "0.46.0"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
-    { name = "click" },
-    { name = "h11" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/1f/93/041fca8274050e40e6791f267d82e0e2e27dd165627bd640d3e0e378d877/uvicorn-0.46.0.tar.gz", hash = "sha256:fb9da0926999cc6cb22dc7cd71a94a632f078e6ae47ff683c5c420750fb7413d", size = 88758, upload-time = "2026-04-23T07:16:00.151Z" }
-wheels = [
-    { url = "https://files.pythonhosted.org/packages/31/a3/5b1562db76a5a488274b2332a97199b32d0442aca0ed193697fd47786316/uvicorn-0.46.0-py3-none-any.whl", hash = "sha256:bbebbcbed972d162afca128605223022bedd345b7bc7855ce66deb31487a9048", size = 70926, upload-time = "2026-04-23T07:15:58.355Z" },
-]
-
[[package]]
name = "virtualenv"
version = "21.2.4"