Compare commits

...

10 Commits

Author SHA1 Message Date
AdrianoDev 9e7b98579b chore(V2): make V2.0.0 the default deploy branch (no merge into main)
deploy-vps.sh: BRANCH defaults to V2.0.0 instead of main.
README: clone with -b V2.0.0; note that the production branch is V2.0.0.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 10:31:58 +02:00
AdrianoDev 51081f4e18 feat(V2): deploy-vps.sh for clone-based deploys (no registry)
Deploys now happen by cloning the repo directly on the VPS, building
the image in place and restarting the container. This replaces the
build & push to registry + Watchtower workflow.

The script automates:
- git fetch + reset --hard origin/<branch>
- docker compose build
- graceful restart (down 15s + up -d)
- healthcheck wait with a configurable timeout
- automatic rollback to the previous SHA if /health fails

Variables: BRANCH, PORT, HEALTH_TIMEOUT_SECONDS, FORCE, SKIP_ROLLBACK.

Removed scripts/build-push.sh (registry workflow abandoned).
README updated with the new procedure.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 09:05:26 +02:00
AdrianoDev 8ecc1a24a9 feat(V2): /health/ready with client pings + structured request-log middleware + request_id correlation
- /health/ready: pings all cached clients (exchange, env) with a 2s
  timeout, status ready|degraded|not_ready, opt-in 503 via
  READY_FAILS_ON_DEGRADED.
- mcp.request middleware: one JSON line per HTTP request with request_id,
  method, path, status_code, duration_ms, actor, bot_tag, exchange,
  tool, client_ip, user_agent.
- request_id propagated into request.state, the audit log and the error
  envelope for cross-cutting correlation.
- Added async health() as a minimal probe to bybit/alpaca/macro/
  sentiment/deribit (hyperliquid already had it).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 09:03:28 +02:00
AdrianoDev 9afd087152 docs(V2): update test count 259 → 310 in the README
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 08:52:11 +02:00
AdrianoDev 69ac878893 feat(V2): mandatory X-Bot-Tag header + /admin/audit endpoint with filters
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 08:51:40 +02:00
AdrianoDev bd6b03ce43 feat(V2): wire audit logging into the write endpoints of the 4 exchange routers
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 08:44:28 +02:00
AdrianoDev 43bf8fc461 chore(V2): remove obsolete SDKs (pybit, alpaca-py, hyperliquid-python-sdk)
Final sweep of task #14: all 4 exchange clients now use plain httpx
(deribit already did; bybit/alpaca/hyperliquid were rewritten in the
previous commits). Also removed the mypy override for the SDK modules
that are no longer needed.

Final quality gate:
- 292 tests pass (all of them)
- mypy: 0 issues
- ruff: clean
- no SDK imports in src/

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 01:39:53 +02:00
AdrianoDev c0b4cb5d5c refactor(V2): hyperliquid client from SDK to httpx + eth-account EIP-712 (V1 parity)
HyperliquidClient rewritten entirely on plain httpx + eth-account for the
EIP-712 L1 signing (chainId 1337, phantom agent source 'a'/'b' for
mainnet/testnet). Bit-parity verified against hyperliquid.utils.signing
in test_signing_parity_with_canonical_sdk.

16 public methods, 26 passing tests. Added deps: eth-account, msgpack,
eth-utils. hyperliquid-python-sdk is still present in pyproject; it is
removed in the final sweep.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 01:39:23 +02:00
AdrianoDev 44c7a18d3e refactor(V2): alpaca client from alpaca-py to plain httpx (V1 parity)
Rewrites `AlpacaClient` on `httpx.AsyncClient`, removing every runtime
dependency on `alpaca-py`. 4 distinct base endpoints (paper/live trading,
stock data, crypto data, options data) handled via a `_request` helper
with the `APCA-API-KEY-ID` / `APCA-API-SECRET-KEY` headers. Constructor
signature and public attributes (`paper`, `base_url`) are unchanged; the
`base_url` override applies to the trading endpoint only. New `aclose()`
for connection cleanup.

Tests rewritten on `pytest-httpx` (29 alpaca + leverage cap tests).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-01 01:38:23 +02:00
AdrianoDev 6097dde4e4 refactor(V2): bybit client from pybit to plain httpx (V1 parity) 2026-05-01 01:35:26 +02:00
37 changed files with 4170 additions and 1322 deletions
+130 -10
@@ -19,7 +19,7 @@ on the bearer token supplied by the client.
overridable via dedicated variables (`DERIBIT_URL_*`,
`BYBIT_URL_*`, `HYPERLIQUID_URL_*`, `ALPACA_URL_*`)
- **Interactive documentation**: OpenAPI/Swagger exposed at `/apidocs`
- **Verified quality**: 259 tests (unit + integration + smoke), mypy
- **Verified quality**: 310 tests (unit + integration + smoke), mypy
clean, ruff clean
## Quick start (development, without Docker)
@@ -62,11 +62,26 @@ Purely read-only tools (`/mcp-macro/*` and `/mcp-sentiment/*`)
still require a valid bearer, but its value (testnet or mainnet) is
irrelevant because they have no testnet endpoints.
### X-Bot-Tag header (bot identification)
All calls to `/mcp-*` additionally require the `X-Bot-Tag` header, a
string identifying the calling bot (64 characters max). The value is
logged in the audit records to trace which bot performed each write
operation. Example request:
Authorization: Bearer $MAINNET_TOKEN
X-Bot-Tag: scanner-alpha-prod
If the header is missing or empty, the response is `400 BAD_REQUEST`. The
header is not required on the public endpoints (`/health`, `/apidocs`,
`/openapi.json`) or on the admin endpoint `/admin/audit`.
## Main endpoints
| Path | Description |
|---|---|
| `GET /health` | Healthcheck (no auth) |
| `GET /health` | Liveness check (no auth) |
| `GET /health/ready` | Readiness check with exchange client pings (no auth) |
| `GET /apidocs` | Swagger UI (no auth) |
| `GET /openapi.json` | OpenAPI 3.1 schema (no auth) |
| `POST /mcp-deribit/tools/{tool}` | Deribit exchange tools |
@@ -75,6 +90,71 @@ irrelevant because they have no testnet endpoints.
| `POST /mcp-alpaca/tools/{tool}` | Alpaca exchange tools |
| `POST /mcp-macro/tools/{tool}` | Macro/market data tools |
| `POST /mcp-sentiment/tools/{tool}` | Sentiment/news tools |
| `GET /admin/audit` | Query the JSONL audit log (bearer required, no X-Bot-Tag) |
## Observability
### Health check
The application exposes two distinct endpoints for monitoring:
- `GET /health` — simple liveness check. Requires no authentication and
always returns HTTP 200 as long as the process is alive. Ideal for a
Kubernetes liveness probe or the Traefik pinger.
- `GET /health/ready` — richer readiness check. Iterates over all the
exchange clients in the registry and attempts a lightweight probe on
each (`health()` if available, falling back to `is_testnet()`), with a
2-second timeout per client. The response contains a `status` field
with one of the values `ready` (all clients respond), `degraded` (at
least one fails) or `not_ready` (empty registry), plus a `clients` array
with one record for every cached `(exchange, env)` pair. By default the
endpoint always responds with HTTP 200; setting the environment
variable `READY_FAILS_ON_DEGRADED=true` forces HTTP 503 whenever the
status is not `ready`, which is useful for a Kubernetes readiness
probe.
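The status aggregation described above reduces to a small pure function. This is a sketch of the documented semantics, not the actual handler:

```python
def aggregate_ready_status(probe_ok: list[bool]) -> str:
    """Map per-client probe results to the documented status values."""
    if not probe_ok:       # empty registry
        return "not_ready"
    if all(probe_ok):      # every client answered within the timeout
        return "ready"
    return "degraded"      # at least one probe failed


def ready_http_status(status: str, fails_on_degraded: bool) -> int:
    """READY_FAILS_ON_DEGRADED=true -> 503 whenever status != ready."""
    if fails_on_degraded and status != "ready":
        return 503
    return 200
```

Note that with the default setting even a fully `degraded` response is still HTTP 200, so monitors must inspect the `status` field rather than the status code.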
### Request log
Every HTTP request passes through a middleware that emits one JSON line
on the `mcp.request` logger with the following fields: `request_id`,
`method`, `path`, `status_code`, `duration_ms`, `actor` (`testnet` or
`mainnet`, only when authenticated), `bot_tag` (the `X-Bot-Tag` header if
present), `exchange` (extracted from the `/mcp-{exchange}/...` path),
`tool` (tool name when the path is `/mcp-X/tools/Y`), `client_ip`,
`user_agent`. The same `request_id` is also included in the `mcp.audit`
audit-log records and in the error envelope returned to the client, so
the three traces can be correlated for any given request.
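Correlating the traces then amounts to grouping JSON lines by `request_id`. A minimal sketch (the helper name is illustrative, and it assumes you have already read the log lines from wherever they are shipped):

```python
import json
from collections import defaultdict


def correlate_by_request_id(lines: list[str]) -> dict[str, list[dict]]:
    """Group JSON log lines (request log, audit log, errors) by request_id."""
    by_id: dict[str, list[dict]] = defaultdict(list)
    for line in lines:
        try:
            rec = json.loads(line)
        except json.JSONDecodeError:
            continue  # skip non-JSON lines
        rid = rec.get("request_id")
        if rid:
            by_id[rid].append(rec)
    return dict(by_id)
```

Feeding it a mixed stream of `mcp.request` and `mcp.audit` lines yields one bucket per request.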
### Audit log
See the "Audit query" section below for how to consult the structured
log of write operations.
## Audit query
`GET /admin/audit` reads the JSONL file pointed to by `AUDIT_LOG_FILE`
and returns the filtered records. It requires a valid bearer (testnet or
mainnet) but does not require the `X-Bot-Tag` header.
Query parameters (all optional):
- `from`, `to`: ISO 8601 datetime (e.g. `2026-05-01` or `2026-05-01T12:34:56Z`)
- `actor`: `testnet` | `mainnet`
- `exchange`: exchange name (`deribit`, `bybit`, `hyperliquid`, `alpaca`)
- `action`: tool name (e.g. `place_order`)
- `bot_tag`: bot identifier
- `limit`: maximum records returned, default `1000`, maximum `10000`
Example call:
curl -H "Authorization: Bearer $MAINNET_TOKEN" \
"http://localhost:9000/admin/audit?from=2026-05-01&actor=mainnet&action=place_order&limit=100"
If `AUDIT_LOG_FILE` is not configured, the endpoint responds with `count: 0`
and a `warning` field. To enable the persistent sink, set in `.env`:
AUDIT_LOG_FILE=/var/log/cerbero-mcp/audit.jsonl
AUDIT_LOG_BACKUP_DAYS=30
## Available tools
@@ -139,18 +219,58 @@ labels:
## Build & deploy pipeline
The image build runs on the development machine:
Deploys to the VPS happen **by cloning the repo directly**, without going
through a container registry. The `scripts/deploy-vps.sh` script automates
the whole flow: pull the target branch, rebuild the image on the VPS
machine, restart the service, healthcheck and automatic rollback on
failure.
### Initial setup on the VPS (one-off)
```bash
export GITEA_PAT='<PAT with write:package scope>'
./scripts/build-push.sh
# On the VPS:
sudo mkdir -p /opt/cerbero-mcp
sudo chown -R "$USER":"$USER" /opt/cerbero-mcp
cd /opt/cerbero-mcp
git clone -b V2.0.0 ssh://git@git.tielogic.xyz:222/Adriano/Cerbero-mcp.git .
cp .env.example .env
# edit .env with the real tokens and credentials
```
The script tags `:2.0.0`, `:latest` and `:sha-<short>` for pinpoint
rollbacks and pushes to the Gitea registry. On the VPS, Watchtower polls
`:latest` and updates the container automatically.
The production branch is `V2.0.0` (not `main`). The `deploy-vps.sh`
script defaults to this branch.
Post-deploy smoke test:
### Recurring deploy
From any machine with SSH access to the VPS:
```bash
ssh user@vps 'cd /opt/cerbero-mcp && bash scripts/deploy-vps.sh'
```
Or directly on the VPS:
```bash
cd /opt/cerbero-mcp
bash scripts/deploy-vps.sh
```
The script:
1. checks that the working tree is clean and that `.env` is present;
2. runs `git fetch + reset --hard origin/V2.0.0`;
3. exits without doing anything if the SHA has not changed (override with
`FORCE=1`);
4. rebuilds the Docker image (`docker compose build`);
5. restarts the container gracefully (`docker compose down --timeout 15`
followed by `docker compose up -d`);
6. waits for `/health` (30 s timeout by default);
7. rolls back automatically to the previous SHA if the healthcheck fails.
Accepted environment variables: `BRANCH` (default `V2.0.0`), `PORT`
(default read from `.env`), `HEALTH_TIMEOUT_SECONDS`, `FORCE`,
`SKIP_ROLLBACK`.
### Post-deploy smoke test
```bash
PORT=9000 TESTNET_TOKEN="$TESTNET_TOKEN" bash tests/smoke/run.sh
@@ -160,7 +280,7 @@ PORT=9000 TESTNET_TOKEN="$TESTNET_TOKEN" bash tests/smoke/run.sh
```bash
uv sync
uv run pytest                      # whole suite (259 tests expected)
uv run pytest                      # whole suite (310 tests expected)
uv run pytest tests/unit -v        # unit tests only
uv run pytest tests/integration -v
uv run ruff check src/ tests/
+4 -4
@@ -16,9 +16,9 @@ dependencies = [
"scipy>=1.13",
"statsmodels>=0.14",
"pandas>=2.2",
"pybit>=5.7",
"alpaca-py>=0.30",
"hyperliquid-python-sdk>=0.6",
"eth-account>=0.13.7",
"msgpack>=1.1.2",
"eth-utils>=5.3.1",
]
[project.scripts]
@@ -70,7 +70,7 @@ check_untyped_defs = true
ignore_missing_imports = true
[[tool.mypy.overrides]]
module = ["pybit.*", "alpaca.*", "hyperliquid.*", "pythonjsonlogger.*"]
module = ["pythonjsonlogger.*"]
ignore_missing_imports = true
[dependency-groups]
-50
@@ -1,50 +0,0 @@
#!/usr/bin/env bash
# Cerbero MCP — build & push the single V2.0.0 image to the Gitea registry.
#
# Prerequisites:
# - docker
# - Gitea PAT with `write:package` scope in env $GITEA_PAT
# - $GITEA_USER (default: adriano)
#
# Usage:
#   ./scripts/build-push.sh
#   VERSION=2.0.1 ./scripts/build-push.sh
set -euo pipefail
REGISTRY="${REGISTRY:-git.tielogic.xyz}"
IMAGE_PREFIX="${IMAGE_PREFIX:-$REGISTRY/adriano/cerbero-mcp}"
GITEA_USER="${GITEA_USER:-adriano}"
VERSION="${VERSION:-2.0.0}"
SHA="$(git rev-parse --short HEAD)"
command -v docker >/dev/null || { echo "FATAL: docker is not installed"; exit 1; }
# Log in only if not already authenticated to the registry.
if grep -q "\"$REGISTRY\"" ~/.docker/config.json 2>/dev/null; then
echo "=== docker already logged in to $REGISTRY (skipping login) ==="
elif [ -n "${GITEA_PAT:-}" ]; then
echo "=== docker login $REGISTRY ==="
echo "$GITEA_PAT" | docker login "$REGISTRY" -u "$GITEA_USER" --password-stdin
else
echo "FATAL: not authenticated to $REGISTRY and GITEA_PAT is not set."
echo "       Run once: docker login $REGISTRY -u $GITEA_USER"
exit 1
fi
TAG_VERSION="$IMAGE_PREFIX:$VERSION"
TAG_LATEST="$IMAGE_PREFIX:latest"
TAG_SHA="$IMAGE_PREFIX:sha-$SHA"
echo "=== build cerbero-mcp:$VERSION ==="
docker build -t "$TAG_VERSION" -t "$TAG_LATEST" -t "$TAG_SHA" .
echo "=== push ==="
for tag in "$TAG_VERSION" "$TAG_LATEST" "$TAG_SHA"; do
docker push "$tag"
echo " pushed: $tag"
done
echo
echo "=== Done (commit $SHA, version $VERSION) ==="
echo "The VPS Watchtower will pull within WATCHTOWER_POLL_INTERVAL (default 5min)."
+148
@@ -0,0 +1,148 @@
#!/usr/bin/env bash
# deploy-vps.sh — deploy Cerbero MCP V2 on the VPS without going through a registry.
#
# Workflow:
#   1. git fetch + reset to the target branch
#   2. docker compose build (rebuild the image if the SHA changed)
#   3. docker compose down (graceful, max 15s)
#   4. docker compose up -d
#   5. wait for the healthcheck on /health
#   6. automatic rollback to the previous SHA if health fails
#
# Run ON THE VPS, inside the repo directory (e.g. /opt/cerbero-mcp).
#
# Usage (on the VPS):
#   cd /opt/cerbero-mcp
#   bash scripts/deploy-vps.sh
#
# Usage (from a dev machine, via SSH):
#   ssh user@vps 'cd /opt/cerbero-mcp && bash scripts/deploy-vps.sh'
#
# Env variables (optional):
#   BRANCH                  git branch to deploy (default: V2.0.0)
#   SERVICE                 docker compose service name (default: cerbero-mcp)
#   PORT                    /health port to ping (default: from .env, fallback 9000)
#   HEALTH_TIMEOUT_SECONDS  max health wait (default: 30)
#   HEALTH_INTERVAL         seconds between health retries (default: 2)
#   FORCE                   if "1", rebuild + restart even if the SHA is unchanged
#   SKIP_ROLLBACK           if "1", skip rollback on health failure (for debugging)
set -euo pipefail
# ─── Config ──────────────────────────────────────────────────────────────
BRANCH="${BRANCH:-V2.0.0}"
SERVICE="${SERVICE:-cerbero-mcp}"
HEALTH_TIMEOUT_SECONDS="${HEALTH_TIMEOUT_SECONDS:-30}"
HEALTH_INTERVAL="${HEALTH_INTERVAL:-2}"
# Resolve PORT from .env if not passed in
if [[ -z "${PORT:-}" ]]; then
if [[ -f .env ]] && grep -q '^PORT=' .env; then
PORT="$(grep '^PORT=' .env | head -1 | cut -d= -f2 | tr -d '[:space:]"')"
fi
fi
PORT="${PORT:-9000}"
HEALTH_URL="http://localhost:${PORT}/health"
# ─── Pre-check ───────────────────────────────────────────────────────────
command -v git >/dev/null || { echo "FATAL: git is not installed"; exit 1; }
command -v docker >/dev/null || { echo "FATAL: docker is not installed"; exit 1; }
command -v curl >/dev/null || { echo "FATAL: curl is not installed"; exit 1; }
docker compose version >/dev/null 2>&1 || { echo "FATAL: docker compose is not available"; exit 1; }
if [[ ! -f .env ]]; then
echo "FATAL: .env not found in $(pwd)."
echo "       Copy .env.example → .env and fill in the values before the first deploy."
exit 1
fi
if [[ ! -f docker-compose.yml ]]; then
echo "FATAL: docker-compose.yml not found in $(pwd)."
exit 1
fi
# Verify the working tree is clean
if [[ -n "$(git status --porcelain)" ]]; then
echo "FATAL: working tree not clean. Unhandled local changes:"
git status --short
echo "       Resolve them before deploying (e.g. git stash or git reset)."
exit 1
fi
# ─── Current state ──────────────────────────────────────────────────────
CURRENT_SHA="$(git rev-parse --short HEAD)"
echo "==> current SHA (rollback target): $CURRENT_SHA"
echo "==> branch: $BRANCH"
echo "==> port: $PORT"
# ─── Fetch + reset ───────────────────────────────────────────────────────
echo "==> git fetch + reset --hard origin/${BRANCH}"
git fetch --prune origin
git reset --hard "origin/${BRANCH}"
NEW_SHA="$(git rev-parse --short HEAD)"
echo "==> new SHA: $NEW_SHA"
if [[ "$CURRENT_SHA" == "$NEW_SHA" ]] && [[ "${FORCE:-0}" != "1" ]]; then
echo "==> Already up to date at $NEW_SHA. No deploy needed."
echo "    (export FORCE=1 to restart anyway)"
exit 0
fi
if [[ "$CURRENT_SHA" == "$NEW_SHA" ]]; then
echo "==> FORCE=1 → rebuild and restart even though the SHA is unchanged"
fi
# ─── Rollback function ───────────────────────────────────────────────────
rollback() {
if [[ "${SKIP_ROLLBACK:-0}" == "1" ]]; then
echo "==> SKIP_ROLLBACK=1 → skipping automatic rollback"
return
fi
if [[ "$CURRENT_SHA" == "$NEW_SHA" ]]; then
echo "==> SHA unchanged, nothing to roll back"
return
fi
echo "==> ROLLBACK to $CURRENT_SHA"
git reset --hard "$CURRENT_SHA"
docker compose build "$SERVICE"
docker compose up -d --force-recreate "$SERVICE"
echo "==> rollback done. Verify the service state manually."
}
# ─── Build ───────────────────────────────────────────────────────────────
echo "==> docker compose build $SERVICE"
docker compose build "$SERVICE"
# ─── Down + up ───────────────────────────────────────────────────────────
echo "==> docker compose down --timeout 15"
docker compose down --timeout 15
echo "==> docker compose up -d"
docker compose up -d
# ─── Health check ────────────────────────────────────────────────────────
echo "==> waiting for /health (timeout ${HEALTH_TIMEOUT_SECONDS}s, retry every ${HEALTH_INTERVAL}s)"
deadline=$(( $(date +%s) + HEALTH_TIMEOUT_SECONDS ))
while [[ $(date +%s) -lt $deadline ]]; do
if curl -fsS "$HEALTH_URL" >/dev/null 2>&1; then
echo
echo "==> health OK"
curl -s "$HEALTH_URL"
echo
echo
echo "==> deploy DONE (SHA $CURRENT_SHA → $NEW_SHA, branch $BRANCH)"
exit 0
fi
printf "."
sleep "$HEALTH_INTERVAL"
done
echo
echo "==> FAIL: /health not responding after ${HEALTH_TIMEOUT_SECONDS}s"
echo "==> container logs (last 40 lines):"
docker compose logs --tail 40 "$SERVICE" || true
rollback
exit 1
+2
@@ -15,6 +15,7 @@ from typing import Literal, cast
import uvicorn
from fastapi import FastAPI
from cerbero_mcp import admin
from cerbero_mcp.client_registry import ClientRegistry
from cerbero_mcp.common.logging import configure_root_logging
from cerbero_mcp.exchanges import build_client
@@ -62,6 +63,7 @@ def _make_app(settings: Settings) -> FastAPI:
app.include_router(alpaca.make_router())
app.include_router(macro.make_router())
app.include_router(sentiment.make_router())
app.include_router(admin.make_admin_router())
return app
+158
@@ -0,0 +1,158 @@
"""Admin endpoint: audit log query with filters."""
from __future__ import annotations
import json
import os
from datetime import datetime
from pathlib import Path
from typing import Any, Literal
from fastapi import APIRouter, HTTPException, Query, Request
MAX_RECORDS = 10000
DEFAULT_LIMIT = 1000
def _parse_iso(value: str | None) -> datetime | None:
if not value:
return None
try:
# supports both "2026-05-01" and "2026-05-01T12:34:56Z"
return datetime.fromisoformat(value.replace("Z", "+00:00"))
except ValueError as e:
raise HTTPException(400, f"invalid datetime: {value}") from e
def _record_timestamp(rec: dict[str, Any]) -> datetime | None:
"""Extract the timestamp from an audit record. JsonFormatter stores
'asctime' in the format '2026-05-01 12:34:56,789'. We parse it as UTC.
"""
ts = rec.get("asctime") or rec.get("timestamp")
if not ts:
return None
try:
# asctime format default: 'YYYY-MM-DD HH:MM:SS,mmm'
ts_clean = ts.replace(",", ".")
return datetime.fromisoformat(ts_clean)
except ValueError:
return None
def _matches_filters(
rec: dict[str, Any],
*,
from_dt: datetime | None,
to_dt: datetime | None,
actor: str | None,
exchange: str | None,
action: str | None,
bot_tag: str | None,
) -> bool:
if rec.get("audit_event") != "write_op":
return False
if actor is not None and rec.get("actor") != actor:
return False
if exchange is not None and rec.get("exchange") != exchange:
return False
if action is not None and rec.get("action") != action:
return False
if bot_tag is not None and rec.get("bot_tag") != bot_tag:
return False
if from_dt is not None or to_dt is not None:
rec_ts = _record_timestamp(rec)
if rec_ts is None:
return False
if from_dt is not None and rec_ts < from_dt:
return False
if to_dt is not None and rec_ts > to_dt:
return False
return True
def _read_audit_records(file_path: Path) -> list[dict[str, Any]]:
if not file_path.exists():
return []
out: list[dict[str, Any]] = []
with file_path.open("r", encoding="utf-8") as f:
for line in f:
stripped = line.strip()
if not stripped:
continue
try:
out.append(json.loads(stripped))
except json.JSONDecodeError:
continue
return out
def make_admin_router() -> APIRouter:
r = APIRouter(prefix="/admin", tags=["admin"])
@r.get("/audit")
async def query_audit(
request: Request,
from_: str | None = Query(None, alias="from"),
to: str | None = Query(None),
actor: Literal["testnet", "mainnet"] | None = Query(None),
exchange: str | None = Query(None),
action: str | None = Query(None),
bot_tag: str | None = Query(None),
limit: int = Query(DEFAULT_LIMIT, ge=1, le=MAX_RECORDS),
) -> dict[str, Any]:
"""Return the filtered audit_write_op records.
Query params (all optional):
- from / to: ISO 8601 datetime (e.g. 2026-05-01 or 2026-05-01T12:34:56)
- actor: testnet | mainnet
- exchange: deribit | bybit | hyperliquid | alpaca
- action: tool name (e.g. place_order)
- bot_tag: bot identifier
- limit: max records to return (default 1000, max 10000)
Source: AUDIT_LOG_FILE (env var). If unset, returns an empty list
with a warning.
"""
from_dt = _parse_iso(from_)
to_dt = _parse_iso(to)
file_str = os.environ.get("AUDIT_LOG_FILE", "").strip()
if not file_str:
return {
"records": [],
"count": 0,
"warning": "AUDIT_LOG_FILE not configured; no persistent audit log to query",
"from": from_,
"to": to,
}
file_path = Path(file_str)
all_records = _read_audit_records(file_path)
filtered = [
rec for rec in all_records
if _matches_filters(
rec,
from_dt=from_dt, to_dt=to_dt,
actor=actor, exchange=exchange, action=action,
bot_tag=bot_tag,
)
]
# sort by timestamp descending (most recent first) + limit
filtered.sort(
key=lambda rec: _record_timestamp(rec) or datetime.min,
reverse=True,
)
if len(filtered) > limit:
filtered = filtered[:limit]
return {
"records": filtered,
"count": len(filtered),
"from": from_,
"to": to,
"filters": {
"actor": actor, "exchange": exchange,
"action": action, "bot_tag": bot_tag,
},
}
return r
+50 -4
@@ -1,4 +1,8 @@
"""Bearer auth middleware: bearer token → request.state.environment."""
"""Bearer auth middleware: bearer token → request.state.environment.
It additionally requires the `X-Bot-Tag` header on all non-whitelisted
calls, so the audit log can identify the calling bot.
"""
from __future__ import annotations
import secrets
@@ -9,7 +13,24 @@ from fastapi.responses import JSONResponse
Environment = Literal["testnet", "mainnet"]
WHITELIST_PATHS = frozenset({"/health", "/apidocs", "/openapi.json", "/docs", "/redoc"})
# Paths that bypass both bearer auth and the bot_tag check.
PATH_WHITELIST_FULL = frozenset(
{
"/health",
"/health/ready",
"/apidocs",
"/openapi.json",
"/docs",
"/redoc",
}
)
# Paths that require a bearer but NOT the bot_tag (admin endpoint).
PATH_WHITELIST_BOT_TAG_ONLY = frozenset({"/admin/audit"})
# Backward-compat alias (old imports).
WHITELIST_PATHS = PATH_WHITELIST_FULL
MAX_BOT_TAG_LEN = 64
def _extract_bearer(auth_header: str) -> str | None:
@@ -35,13 +56,17 @@ def install_auth_middleware(
testnet_token: str,
mainnet_token: str,
) -> None:
"""Register the bearer auth middleware on the FastAPI app."""
"""Register the bearer auth + bot_tag middleware on the FastAPI app."""
@app.middleware("http")
async def auth_middleware(request: Request, call_next):
if request.url.path in WHITELIST_PATHS:
path = request.url.path
# 1. Full whitelist: no checks at all.
if path in PATH_WHITELIST_FULL:
return await call_next(request)
# 2. Bearer auth (always required).
token = _extract_bearer(request.headers.get("Authorization", ""))
if token is None:
return JSONResponse(
@@ -57,4 +82,25 @@ def install_auth_middleware(
"message": "invalid token"}},
)
request.state.environment = env
# 3. Partial whitelist (admin): bearer ok, no bot_tag check.
if path in PATH_WHITELIST_BOT_TAG_ONLY:
return await call_next(request)
# 4. X-Bot-Tag is mandatory.
raw_tag = request.headers.get("X-Bot-Tag", "")
tag = raw_tag.strip() if raw_tag else ""
if not tag:
return JSONResponse(
status_code=status.HTTP_400_BAD_REQUEST,
content={"error": {"code": "BAD_REQUEST",
"message": "missing X-Bot-Tag header"}},
)
if len(tag) > MAX_BOT_TAG_LEN:
return JSONResponse(
status_code=status.HTTP_400_BAD_REQUEST,
content={"error": {"code": "BAD_REQUEST",
"message": "X-Bot-Tag too long"}},
)
request.state.bot_tag = tag
return await call_next(request)
+8
@@ -67,23 +67,28 @@ def _configure_audit_sink() -> None:
def audit_write_op(
*,
actor: str | None = None,
bot_tag: str | None = None,
action: str,
exchange: str,
target: str | None = None,
payload: dict[str, Any] | None = None,
result: dict[str, Any] | None = None,
error: str | None = None,
request_id: str | None = None,
) -> None:
"""Emit a structured audit log record per write operation.
actor: identifier of the caller (e.g. "testnet", "mainnet",
or None for anonymous logging).
bot_tag: identifier of the calling bot (X-Bot-Tag header).
action: tool name (e.g. "place_order", "cancel_order").
exchange: service identifier (deribit, bybit, alpaca, hyperliquid).
target: instrument/symbol/order_id being acted on.
payload: non-sensitive input (qty, side, leverage, etc.).
result: client output (order_id, status, etc.).
error: error string if the operation failed.
request_id: id propagated from the request-log middleware to
correlate the audit log with the request log.
"""
_configure_audit_sink()
record: dict[str, Any] = {
@@ -91,9 +96,12 @@ def audit_write_op(
"action": action,
"exchange": exchange,
"actor": actor,
"bot_tag": bot_tag,
"target": target,
"payload": payload or {},
}
if request_id is not None:
record["request_id"] = request_id
if result is not None:
record["result"] = _summarize_result(result)
if error is not None:
+100
@@ -0,0 +1,100 @@
"""Helper for wiring audit_write_op into the routers.
Usage pattern in a router::
@r.post("/tools/place_order")
async def _place_order(
params: t.PlaceOrderReq,
request: Request,
client: DeribitClient = Depends(get_deribit_client),
):
return await audit_call(
request=request,
exchange="deribit",
action="place_order",
target_field="instrument_name",
params=params,
tool_fn=lambda: t.place_order(client, params, creds=...),
)
"""
from __future__ import annotations
from collections.abc import Awaitable, Callable
from typing import Any
from fastapi import Request
from pydantic import BaseModel
from cerbero_mcp.common.audit import audit_write_op
def _extract_target(params: BaseModel | None, target_field: str | None) -> str | None:
if params is None or target_field is None:
return None
val = getattr(params, target_field, None)
if val is None:
return None
return str(val)
def _safe_dump(params: BaseModel | None) -> dict[str, Any]:
if params is None:
return {}
try:
return params.model_dump(mode="json", exclude_none=True)
except Exception:
return {}
async def audit_call(
*,
request: Request,
exchange: str,
action: str,
tool_fn: Callable[[], Awaitable[Any]],
params: BaseModel | None = None,
target_field: str | None = None,
) -> Any:
"""Run tool_fn and log an audit record (success or error). Re-raises exceptions."""
actor = getattr(request.state, "environment", None)
bot_tag = getattr(request.state, "bot_tag", None)
request_id = getattr(request.state, "request_id", None)
target = _extract_target(params, target_field)
payload = _safe_dump(params)
try:
result = await tool_fn()
except Exception as e:
audit_write_op(
actor=actor,
bot_tag=bot_tag,
action=action,
exchange=exchange,
target=target,
payload=payload,
error=f"{type(e).__name__}: {e}",
request_id=request_id,
)
raise
# If result is a dict, pass it raw; otherwise try to serialize it
audit_result: dict[str, Any] | None = None
if isinstance(result, dict):
audit_result = result
elif hasattr(result, "model_dump"):
try:
audit_result = result.model_dump(mode="json")
except Exception:
audit_result = None
audit_write_op(
actor=actor,
bot_tag=bot_tag,
action=action,
exchange=exchange,
target=target,
payload=payload,
result=audit_result,
request_id=request_id,
)
return result
+104
@@ -0,0 +1,104 @@
"""Middleware: structured JSON request log for every HTTP request.
Emits one JSON line on the ``mcp.request`` logger with fields that can be
correlated to the audit log via ``request_id``. It also exposes
``request_id`` on ``request.state`` so that downstream handlers and
exception handlers can include it in their own payloads.
"""
from __future__ import annotations
import logging
import time
import uuid
from collections.abc import Awaitable, Callable
from datetime import UTC, datetime
from typing import Any
from fastapi import FastAPI, Request
from starlette.responses import Response
from cerbero_mcp.common.logging import get_json_logger
_logger = get_json_logger("mcp.request", level=logging.INFO)
def _extract_exchange(path: str) -> str | None:
"""Extract the exchange name from the path if it is ``/mcp-{exchange}/...``."""
if not path.startswith("/mcp-"):
return None
rest = path[len("/mcp-"):]
end = rest.find("/")
if end < 0:
return rest or None
return rest[:end] or None
def _extract_tool(path: str) -> str | None:
"""Extract the tool name from a ``/mcp-X/tools/Y`` path."""
parts = path.split("/")
# ["", "mcp-deribit", "tools", "place_order"]
if len(parts) >= 4 and parts[2] == "tools":
return parts[3] or None
return None
def install_request_log_middleware(app: FastAPI) -> None:
"""Add an HTTP middleware that logs one JSON line per request."""
@app.middleware("http")
async def request_log(
request: Request,
call_next: Callable[[Request], Awaitable[Response]],
) -> Response:
request_id = uuid.uuid4().hex
# Expose request_id for downstream use (audit, error envelope)
request.state.request_id = request_id
t0 = time.perf_counter()
status_code = 500
error: str | None = None
response: Response | None = None
try:
response = await call_next(request)
status_code = response.status_code
except Exception as e:
error = f"{type(e).__name__}: {str(e)[:200]}"
raise
finally:
dur_ms = (time.perf_counter() - t0) * 1000
path = request.url.path
payload: dict[str, Any] = {
"event": "request",
"request_id": request_id,
"method": request.method,
"path": path,
"status_code": status_code,
"duration_ms": round(dur_ms, 2),
"timestamp": datetime.now(UTC).isoformat(),
}
ua = request.headers.get("user-agent")
if ua:
payload["user_agent"] = ua[:200]
client = request.client
if client is not None:
payload["client_ip"] = client.host
actor = getattr(request.state, "environment", None)
if actor:
payload["actor"] = actor
bot_tag = getattr(request.state, "bot_tag", None)
if bot_tag:
payload["bot_tag"] = bot_tag
exchange = _extract_exchange(path)
if exchange:
payload["exchange"] = exchange
tool = _extract_tool(path)
if tool:
payload["tool"] = tool
if error:
payload["error"] = error
_logger.error("request", extra=payload)
else:
_logger.info("request", extra=payload)
# response is set if there was no exception (otherwise the
# exception has already been re-raised by the except block).
assert response is not None
return response
+330 -215
@@ -1,204 +1,252 @@
"""Alpaca client on plain httpx (V2.0.0).
Full-REST rewrite of the original `alpaca-py` client: 4 base endpoints
(trading, stock data, crypto data, options data), auth via the
APCA-API-KEY-ID / APCA-API-SECRET-KEY headers, full parity with the V1
version (same signatures, same shape of the returned dicts).
- The `base_url` override parameter applies ONLY to the trading endpoint
(consistent with alpaca-py TradingClient's `url_override`). The data
endpoints stay hardcoded to `https://data.alpaca.markets`.
- Methods return `dict` / `list[dict]` straight from the REST JSON
(instead of the serialized alpaca-py pydantic models). The keys are
those returned by the Alpaca API; they match the `model_dump()` of the
previous SDK models.
"""
from __future__ import annotations
import asyncio
import datetime as _dt
from typing import Any
from alpaca.data.historical import (
CryptoHistoricalDataClient,
OptionHistoricalDataClient,
StockHistoricalDataClient,
)
from alpaca.data.requests import (
CryptoBarsRequest,
CryptoLatestQuoteRequest,
CryptoLatestTradeRequest,
OptionBarsRequest,
OptionChainRequest,
OptionLatestQuoteRequest,
StockBarsRequest,
StockLatestQuoteRequest,
StockLatestTradeRequest,
StockSnapshotRequest,
)
from alpaca.data.timeframe import TimeFrame, TimeFrameUnit
from alpaca.trading.client import TradingClient
from alpaca.trading.enums import (
AssetClass,
OrderSide,
QueryOrderStatus,
TimeInForce,
)
from alpaca.trading.requests import (
ClosePositionRequest,
GetAssetsRequest,
GetOrdersRequest,
LimitOrderRequest,
MarketOrderRequest,
ReplaceOrderRequest,
StopOrderRequest,
)
import httpx
from cerbero_mcp.common.http import async_client
# ── Base endpoints ───────────────────────────────────────────────
_TRADING_LIVE = "https://api.alpaca.markets"
_TRADING_PAPER = "https://paper-api.alpaca.markets"
_DATA = "https://data.alpaca.markets"
# ── Timeframe map → Alpaca query param ───────────────────────────
# Alpaca v2 bars: timeframe = "1Min" / "5Min" / "15Min" / "30Min" / "1Hour" / "1Day" / "1Week"
_TF_MAP = {
"1min": TimeFrame(1, TimeFrameUnit.Minute),
"5min": TimeFrame(5, TimeFrameUnit.Minute),
"15min": TimeFrame(15, TimeFrameUnit.Minute),
"30min": TimeFrame(30, TimeFrameUnit.Minute),
"1h": TimeFrame(1, TimeFrameUnit.Hour),
"1d": TimeFrame(1, TimeFrameUnit.Day),
"1w": TimeFrame(1, TimeFrameUnit.Week),
"1min": "1Min",
"5min": "5Min",
"15min": "15Min",
"30min": "30Min",
"1h": "1Hour",
"1d": "1Day",
"1w": "1Week",
}
_ASSET_CLASSES = {"stocks", "crypto", "options"}
_ASSET_CLASS_MAP = {
"stocks": "us_equity",
"crypto": "crypto",
"options": "us_option",
}
def _tf(interval: str) -> TimeFrame:
def _tf(interval: str) -> str:
if interval in _TF_MAP:
return _TF_MAP[interval]
raise ValueError(f"unsupported timeframe: {interval}")
def _asset_class_enum(ac: str) -> AssetClass:
def _asset_class_param(ac: str) -> str:
ac = ac.lower()
if ac == "stocks":
return AssetClass.US_EQUITY
if ac == "crypto":
return AssetClass.CRYPTO
if ac == "options":
return AssetClass.US_OPTION
if ac in _ASSET_CLASS_MAP:
return _ASSET_CLASS_MAP[ac]
raise ValueError(f"invalid asset_class: {ac}")
def _serialize(obj: Any) -> Any:
"""Recursively convert pydantic/datetime objects → json-safe."""
if obj is None or isinstance(obj, str | int | float | bool):
return obj
if isinstance(obj, _dt.datetime | _dt.date):
return obj.isoformat()
if isinstance(obj, dict):
return {k: _serialize(v) for k, v in obj.items()}
if isinstance(obj, list | tuple):
return [_serialize(v) for v in obj]
if hasattr(obj, "model_dump"):
return _serialize(obj.model_dump())
if hasattr(obj, "__dict__"):
return _serialize(vars(obj))
return str(obj)
def _iso(value: _dt.datetime | _dt.date | None) -> str | None:
if value is None:
return None
return value.isoformat()
class AlpacaClient:
"""Client httpx-based per Alpaca REST API v2.
Auth via header `APCA-API-KEY-ID` / `APCA-API-SECRET-KEY`.
"""
def __init__(
self,
api_key: str,
secret_key: str,
paper: bool = True,
base_url: str | None = None,
trading: Any | None = None,
stock_data: Any | None = None,
crypto_data: Any | None = None,
option_data: Any | None = None,
http: httpx.AsyncClient | None = None,
) -> None:
self.api_key = api_key
self.secret_key = secret_key
self.paper = paper
# `base_url` kept as a public attribute (tests/build_client read it).
# Overrides the trading endpoint only; the data endpoints are always
# `data.alpaca.markets` (Alpaca has no paper data feed).
self.base_url = base_url
# alpaca-py TradingClient accepts `url_override` to override the trading URL.
# The data clients (Stock/Crypto/Option) do not support url_override in their
# constructors; they use the separate data endpoint (data.alpaca.markets), so
# `base_url` is ignored for them.
if trading is None:
trading_kwargs: dict[str, Any] = {
"api_key": api_key, "secret_key": secret_key, "paper": paper,
}
if base_url:
trading_kwargs["url_override"] = base_url
trading = TradingClient(**trading_kwargs)
self._trading = trading
self._stock = stock_data or StockHistoricalDataClient(
api_key=api_key, secret_key=secret_key
)
self._crypto = crypto_data or CryptoHistoricalDataClient(
api_key=api_key, secret_key=secret_key
)
self._option = option_data or OptionHistoricalDataClient(
api_key=api_key, secret_key=secret_key
)
if base_url:
self._trading_base = base_url
else:
self._trading_base = _TRADING_PAPER if paper else _TRADING_LIVE
self._data_base = _DATA
# Single long-lived AsyncClient → reuse connection pool.
self._http = http or async_client(timeout=30.0)
async def _run(self, fn, /, *args, **kwargs):
return await asyncio.to_thread(fn, *args, **kwargs)
async def aclose(self) -> None:
"""Chiudi connessioni HTTP. Idempotente."""
if not self._http.is_closed:
await self._http.aclose()
async def health(self) -> dict[str, Any]:
"""Probe minimo per /health/ready: nessuna chiamata di rete."""
return {"status": "ok", "paper": self.paper}
# ── Helpers ──────────────────────────────────────────────────
@property
def _headers(self) -> dict[str, str]:
return {
"APCA-API-KEY-ID": self.api_key,
"APCA-API-SECRET-KEY": self.secret_key,
"Accept": "application/json",
}
async def _request(
self,
method: str,
base: str,
path: str,
*,
params: dict[str, Any] | None = None,
json_body: dict[str, Any] | None = None,
) -> Any:
"""Esegue una richiesta HTTP autenticata e ritorna il JSON parsato.
Per response body vuoto (es. DELETE 204) ritorna `{}`.
Solleva `httpx.HTTPStatusError` su 4xx/5xx tramite raise_for_status.
"""
url = f"{base}{path}"
# httpx drops None-valued query params automatically only when they are
# passed as a list of tuples; with a dict we must filter them up front.
clean_params: dict[str, Any] | None = None
if params is not None:
clean_params = {k: v for k, v in params.items() if v is not None}
if not clean_params:
clean_params = None
resp = await self._http.request(
method,
url,
params=clean_params,
json=json_body,
headers=self._headers,
)
resp.raise_for_status()
if not resp.content:
return {}
return resp.json()
# ── Account / positions ──────────────────────────────────────
async def get_account(self) -> dict:
acc = await self._run(self._trading.get_account)
return _serialize(acc) # type: ignore[no-any-return]
data = await self._request("GET", self._trading_base, "/v2/account")
return dict(data) if data else {}
async def get_positions(self) -> list[dict]:
pos = await self._run(self._trading.get_all_positions)
return [_serialize(p) for p in pos]
data = await self._request("GET", self._trading_base, "/v2/positions")
return list(data) if data else []
async def get_activities(self, limit: int = 50) -> list[dict]:
acts = await self._run(self._trading.get_account_activities) # type: ignore[union-attr]
data = [_serialize(a) for a in acts]
return data[:limit]
data = await self._request(
"GET",
self._trading_base,
"/v2/account/activities",
params={"page_size": limit},
)
items = list(data) if data else []
return items[:limit]
# ── Assets ──────────────────────────────────────────────────
async def get_assets(
self, asset_class: str = "stocks", status: str = "active"
) -> list[dict]:
req = GetAssetsRequest(
asset_class=_asset_class_enum(asset_class),
status=status, # type: ignore[arg-type]
data = await self._request(
"GET",
self._trading_base,
"/v2/assets",
params={
"status": status,
"asset_class": _asset_class_param(asset_class),
},
)
assets = await self._run(self._trading.get_all_assets, req)
return [_serialize(a) for a in assets[:500]]
items = list(data) if data else []
return items[:500]
# ── Market data ─────────────────────────────────────────────
async def get_ticker(self, symbol: str, asset_class: str = "stocks") -> dict:
ac = asset_class.lower()
if ac == "stocks":
req = StockLatestTradeRequest(symbol_or_symbols=symbol)
data = await self._run(self._stock.get_stock_latest_trade, req)
trade = data.get(symbol)
q_req = StockLatestQuoteRequest(symbol_or_symbols=symbol)
qdata = await self._run(self._stock.get_stock_latest_quote, q_req)
quote = qdata.get(symbol)
trade_resp = await self._request(
"GET",
self._data_base,
f"/v2/stocks/{symbol}/trades/latest",
)
quote_resp = await self._request(
"GET",
self._data_base,
f"/v2/stocks/{symbol}/quotes/latest",
)
trade = (trade_resp or {}).get("trade") or {}
quote = (quote_resp or {}).get("quote") or {}
return {
"symbol": symbol,
"asset_class": "stocks",
"last_price": getattr(trade, "price", None),
"bid": getattr(quote, "bid_price", None),
"ask": getattr(quote, "ask_price", None),
"bid_size": getattr(quote, "bid_size", None),
"ask_size": getattr(quote, "ask_size", None),
"timestamp": _serialize(getattr(trade, "timestamp", None)),
"last_price": trade.get("p"),
"bid": quote.get("bp"),
"ask": quote.get("ap"),
"bid_size": quote.get("bs"),
"ask_size": quote.get("as"),
"timestamp": trade.get("t"),
}
if ac == "crypto":
req = CryptoLatestTradeRequest(symbol_or_symbols=symbol) # type: ignore[assignment]
data = await self._run(self._crypto.get_crypto_latest_trade, req)
trade = data.get(symbol)
q_req = CryptoLatestQuoteRequest(symbol_or_symbols=symbol) # type: ignore[assignment]
qdata = await self._run(self._crypto.get_crypto_latest_quote, q_req)
quote = qdata.get(symbol)
trade_resp = await self._request(
"GET",
self._data_base,
"/v1beta3/crypto/us/latest/trades",
params={"symbols": symbol},
)
quote_resp = await self._request(
"GET",
self._data_base,
"/v1beta3/crypto/us/latest/quotes",
params={"symbols": symbol},
)
trade = ((trade_resp or {}).get("trades") or {}).get(symbol) or {}
quote = ((quote_resp or {}).get("quotes") or {}).get(symbol) or {}
return {
"symbol": symbol,
"asset_class": "crypto",
"last_price": getattr(trade, "price", None),
"bid": getattr(quote, "bid_price", None),
"ask": getattr(quote, "ask_price", None),
"timestamp": _serialize(getattr(trade, "timestamp", None)),
"last_price": trade.get("p"),
"bid": quote.get("bp"),
"ask": quote.get("ap"),
"timestamp": trade.get("t"),
}
if ac == "options":
req = OptionLatestQuoteRequest(symbol_or_symbols=symbol) # type: ignore[assignment]
data = await self._run(self._option.get_option_latest_quote, req)
quote = data.get(symbol)
quote_resp = await self._request(
"GET",
self._data_base,
f"/v1beta1/options/{symbol}/quotes/latest",
)
quote = (quote_resp or {}).get("quote") or {}
return {
"symbol": symbol,
"asset_class": "options",
"bid": getattr(quote, "bid_price", None),
"ask": getattr(quote, "ask_price", None),
"timestamp": _serialize(getattr(quote, "timestamp", None)),
"bid": quote.get("bp"),
"ask": quote.get("ap"),
"timestamp": quote.get("t"),
}
raise ValueError(f"invalid asset_class: {asset_class}")
@@ -212,73 +260,117 @@ class AlpacaClient:
limit: int = 1000,
) -> dict:
tf = _tf(interval)
start_dt = _dt.datetime.fromisoformat(start) if start else (
_dt.datetime.now(_dt.UTC) - _dt.timedelta(days=30)
start_dt = (
_dt.datetime.fromisoformat(start)
if start
else (_dt.datetime.now(_dt.UTC) - _dt.timedelta(days=30))
)
end_dt = _dt.datetime.fromisoformat(end) if end else _dt.datetime.now(_dt.UTC)
ac = asset_class.lower()
params: dict[str, Any] = {
"symbols": symbol,
"timeframe": tf,
"start": _iso(start_dt),
"end": _iso(end_dt),
"limit": limit,
}
if ac == "stocks":
req = StockBarsRequest(
symbol_or_symbols=symbol, timeframe=tf,
start=start_dt, end=end_dt, limit=limit,
# IEX feed by default, consistent with the alpaca-py free-tier default.
params["feed"] = "iex"
data = await self._request(
"GET", self._data_base, "/v2/stocks/bars", params=params
)
data = await self._run(self._stock.get_stock_bars, req)
elif ac == "crypto":
req = CryptoBarsRequest( # type: ignore[assignment]
symbol_or_symbols=symbol, timeframe=tf,
start=start_dt, end=end_dt, limit=limit,
data = await self._request(
"GET",
self._data_base,
"/v1beta3/crypto/us/bars",
params=params,
)
data = await self._run(self._crypto.get_crypto_bars, req)
elif ac == "options":
req = OptionBarsRequest( # type: ignore[assignment]
symbol_or_symbols=symbol, timeframe=tf,
start=start_dt, end=end_dt, limit=limit,
data = await self._request(
"GET",
self._data_base,
"/v1beta1/options/bars",
params=params,
)
data = await self._run(self._option.get_option_bars, req)
else:
raise ValueError(f"invalid asset_class: {asset_class}")
bars_dict = getattr(data, "data", {}) or {}
rows = bars_dict.get(symbol, []) or []
bars_dict = (data or {}).get("bars") or {}
rows = bars_dict.get(symbol) or []
bars = [
{
"timestamp": _serialize(getattr(b, "timestamp", None)),
"open": getattr(b, "open", None),
"high": getattr(b, "high", None),
"low": getattr(b, "low", None),
"close": getattr(b, "close", None),
"volume": getattr(b, "volume", None),
"timestamp": b.get("t"),
"open": b.get("o"),
"high": b.get("h"),
"low": b.get("l"),
"close": b.get("c"),
"volume": b.get("v"),
}
for b in rows
]
return {"symbol": symbol, "asset_class": ac, "interval": interval, "bars": bars}
return {
"symbol": symbol,
"asset_class": ac,
"interval": interval,
"bars": bars,
}
async def get_snapshot(self, symbol: str) -> dict:
req = StockSnapshotRequest(symbol_or_symbols=symbol)
data = await self._run(self._stock.get_stock_snapshot, req)
return _serialize(data.get(symbol)) # type: ignore[no-any-return]
data = await self._request(
"GET",
self._data_base,
"/v2/stocks/snapshots",
params={"symbols": symbol},
)
# The API returns {"AAPL": {snapshot}} or {"snapshots": {...}}; handle
# both formats. /v2/stocks/snapshots returns a top-level
# symbol→snapshot dict.
if data is None:
return {}
if symbol in data:
return data[symbol] or {}
snaps = data.get("snapshots") or {}
return snaps.get(symbol) or {}
async def get_option_chain(
self,
underlying: str,
expiry: str | None = None,
) -> dict:
kwargs: dict[str, Any] = {"underlying_symbol": underlying}
params: dict[str, Any] = {}
if expiry:
kwargs["expiration_date"] = _dt.date.fromisoformat(expiry)
req = OptionChainRequest(**kwargs)
data = await self._run(self._option.get_option_chain, req)
# Date validation (raises ValueError on invalid input, matching V1,
# which used _dt.date.fromisoformat).
_dt.date.fromisoformat(expiry)
params["expiration_date_gte"] = expiry
params["expiration_date_lte"] = expiry
data = await self._request(
"GET",
self._data_base,
f"/v1beta1/options/snapshots/{underlying}",
params=params or None,
)
contracts = (data or {}).get("snapshots") if data else None
return {
"underlying": underlying,
"expiry": expiry,
"contracts": _serialize(data),
"contracts": contracts if contracts is not None else (data or {}),
}
# ── Orders ──────────────────────────────────────────────────
async def get_open_orders(self, limit: int = 50) -> list[dict]:
req = GetOrdersRequest(status=QueryOrderStatus.OPEN, limit=limit)
orders = await self._run(self._trading.get_orders, filter=req)
return [_serialize(o) for o in orders]
data = await self._request(
"GET",
self._trading_base,
"/v2/orders",
params={"status": "open", "limit": limit},
)
return list(data) if data else []
async def place_order(
self,
@@ -292,32 +384,39 @@ class AlpacaClient:
tif: str = "day",
asset_class: str = "stocks",
) -> dict:
side_enum = OrderSide.BUY if side.lower() == "buy" else OrderSide.SELL
tif_enum = TimeInForce(tif.lower())
ot = order_type.lower()
common = {
body: dict[str, Any] = {
"symbol": symbol,
"side": side_enum,
"time_in_force": tif_enum,
"side": side.lower(),
"type": ot,
"time_in_force": tif.lower(),
}
if qty is not None:
common["qty"] = qty # type: ignore[assignment]
body["qty"] = str(qty)
if notional is not None:
common["notional"] = notional # type: ignore[assignment]
body["notional"] = str(notional)
if ot == "market":
req = MarketOrderRequest(**common)
pass
elif ot == "limit":
if limit_price is None:
raise ValueError("limit_price required for limit order")
req = LimitOrderRequest(**common, limit_price=limit_price) # type: ignore[assignment]
body["limit_price"] = str(limit_price)
elif ot == "stop":
if stop_price is None:
raise ValueError("stop_price required for stop order")
req = StopOrderRequest(**common, stop_price=stop_price) # type: ignore[assignment]
body["stop_price"] = str(stop_price)
else:
raise ValueError(f"unsupported order_type: {order_type}")
order = await self._run(self._trading.submit_order, req)
return _serialize(order) # type: ignore[no-any-return]
# `asset_class` is not a REST parameter; kept in the signature for V1
# parity (the SDK only used it to pick the request model).
_ = asset_class
data = await self._request(
"POST",
self._trading_base,
"/v2/orders",
json_body=body,
)
return dict(data) if data else {}
async def amend_order(
self,
@@ -327,69 +426,85 @@ class AlpacaClient:
stop_price: float | None = None,
tif: str | None = None,
) -> dict:
kwargs: dict[str, Any] = {}
body: dict[str, Any] = {}
if qty is not None:
kwargs["qty"] = qty
body["qty"] = str(qty)
if limit_price is not None:
kwargs["limit_price"] = limit_price
body["limit_price"] = str(limit_price)
if stop_price is not None:
kwargs["stop_price"] = stop_price
body["stop_price"] = str(stop_price)
if tif is not None:
kwargs["time_in_force"] = TimeInForce(tif.lower())
req = ReplaceOrderRequest(**kwargs)
order = await self._run(self._trading.replace_order_by_id, order_id, req)
return _serialize(order) # type: ignore[no-any-return]
body["time_in_force"] = tif.lower()
data = await self._request(
"PATCH",
self._trading_base,
f"/v2/orders/{order_id}",
json_body=body,
)
return dict(data) if data else {}
async def cancel_order(self, order_id: str) -> dict:
await self._run(self._trading.cancel_order_by_id, order_id)
# DELETE /v2/orders/{id} → 204 No Content on success.
await self._request(
"DELETE", self._trading_base, f"/v2/orders/{order_id}"
)
return {"order_id": order_id, "canceled": True}
async def cancel_all_orders(self) -> list[dict]:
resp = await self._run(self._trading.cancel_orders)
return [_serialize(r) for r in resp]
# DELETE /v2/orders → 207 Multi-Status with an array of {id, status}
data = await self._request(
"DELETE", self._trading_base, "/v2/orders"
)
return list(data) if data else []
# ── Position close ──────────────────────────────────────────
async def close_position(
self, symbol: str, qty: float | None = None, percentage: float | None = None
) -> dict:
req = None
if qty is not None or percentage is not None:
kwargs: dict[str, Any] = {}
if qty is not None:
kwargs["qty"] = str(qty)
if percentage is not None:
kwargs["percentage"] = str(percentage)
req = ClosePositionRequest(**kwargs)
order = await self._run(
self._trading.close_position, symbol, close_options=req
# DELETE /v2/positions/{symbol}?qty=... or ?percentage=...
params: dict[str, Any] = {}
if qty is not None:
params["qty"] = str(qty)
if percentage is not None:
params["percentage"] = str(percentage)
data = await self._request(
"DELETE",
self._trading_base,
f"/v2/positions/{symbol}",
params=params or None,
)
return _serialize(order) # type: ignore[no-any-return]
return dict(data) if data else {}
async def close_all_positions(self, cancel_orders: bool = True) -> list[dict]:
resp = await self._run(
self._trading.close_all_positions, cancel_orders=cancel_orders
data = await self._request(
"DELETE",
self._trading_base,
"/v2/positions",
params={"cancel_orders": "true" if cancel_orders else "false"},
)
return [_serialize(r) for r in resp]
return list(data) if data else []
# ── Clock / calendar ────────────────────────────────────────
async def get_clock(self) -> dict:
clock = await self._run(self._trading.get_clock)
return _serialize(clock) # type: ignore[no-any-return]
data = await self._request("GET", self._trading_base, "/v2/clock")
return dict(data) if data else {}
async def get_calendar(
self, start: str | None = None, end: str | None = None
) -> list[dict]:
from alpaca.trading.requests import GetCalendarRequest
kwargs: dict[str, Any] = {}
params: dict[str, Any] = {}
if start:
kwargs["start"] = _dt.date.fromisoformat(start)
_dt.date.fromisoformat(start)  # validation, V1 parity
params["start"] = start
if end:
kwargs["end"] = _dt.date.fromisoformat(end)
req = GetCalendarRequest(**kwargs) if kwargs else None
cal = await self._run(
self._trading.get_calendar, filters=req
) if req else await self._run(self._trading.get_calendar)
return [_serialize(c) for c in cal]
_dt.date.fromisoformat(end)
params["end"] = end
data = await self._request(
"GET",
self._trading_base,
"/v2/calendar",
params=params or None,
)
return list(data) if data else []
@@ -1,13 +1,33 @@
"""Bybit V5 REST API client (httpx puro, no SDK).
Implementazione diretta su `httpx.AsyncClient` per i tool Cerbero MCP V2.
Mantiene parità di interfaccia con la versione precedente basata su
`pybit.unified_trading.HTTP` per non rompere `tools.py` né i router.
Auth Bybit V5:
Header X-BAPI-SIGN = HMAC_SHA256(secret,
timestamp + api_key + recv_window + (body_json | querystring))
"""
from __future__ import annotations
import asyncio
import hashlib
import hmac
import json
import time
import uuid
from typing import Any
from urllib.parse import urlencode
from pybit.unified_trading import HTTP
import httpx
from cerbero_mcp.common import indicators as ind
from cerbero_mcp.common import microstructure as micro
BASE_MAINNET = "https://api.bybit.com"
BASE_TESTNET = "https://api-testnet.bybit.com"
DEFAULT_RECV_WINDOW = "5000"
DEFAULT_TIMEOUT = 15.0
def _f(v: Any) -> float | None:
try:
@@ -23,37 +43,147 @@ def _i(v: Any) -> int | None:
return None
class BybitAPIError(RuntimeError):
"""Errore di trasporto Bybit V5 (non gestito a livello envelope)."""
class BybitClient:
"""Async REST client per Bybit V5 (linear/inverse/spot/option)."""
def __init__(
self,
api_key: str,
api_secret: str,
testnet: bool = True,
http: Any | None = None,
http: httpx.AsyncClient | None = None,
base_url: str | None = None,
) -> None:
self.api_key = api_key
self.api_secret = api_secret
self.testnet = testnet
# pybit HTTP does not accept `endpoint` as a kwarg (see _V5HTTPManager.__init__:
# only `domain`/`tld`/`testnet`). The URL override is applied post-init by
# overwriting the `endpoint` attribute on the HTTP instance.
self.base_url = base_url
if http is None:
http = HTTP(
api_key=api_key,
api_secret=api_secret,
testnet=testnet,
)
if base_url:
http.endpoint = base_url
self._http = http
self.base_url = base_url or (BASE_TESTNET if testnet else BASE_MAINNET)
self.recv_window = DEFAULT_RECV_WINDOW
# `http` injection is used by the tests to mount an AsyncClient with
# `httpx.MockTransport`. In production we create a dedicated client.
self._owns_http = http is None
self._http: httpx.AsyncClient = http or httpx.AsyncClient(
timeout=DEFAULT_TIMEOUT
)
async def _run(self, fn, /, **kwargs):
return await asyncio.to_thread(fn, **kwargs)
async def aclose(self) -> None:
"""Chiude l'AsyncClient httpx se di nostra proprietà."""
if self._owns_http:
await self._http.aclose()
async def health(self) -> dict[str, Any]:
"""Probe minimo per /health/ready: nessuna chiamata di rete."""
return {"status": "ok", "testnet": self.testnet}
# ── auth helpers ───────────────────────────────────────────
def _timestamp_ms(self) -> str:
return str(int(time.time() * 1000))
def _sign(self, timestamp: str, payload: str) -> str:
msg = timestamp + self.api_key + self.recv_window + payload
return hmac.new(
self.api_secret.encode("utf-8"),
msg.encode("utf-8"),
hashlib.sha256,
).hexdigest()
def _signed_headers(self, payload: str) -> dict[str, str]:
ts = self._timestamp_ms()
sig = self._sign(ts, payload)
return {
"X-BAPI-API-KEY": self.api_key,
"X-BAPI-TIMESTAMP": ts,
"X-BAPI-RECV-WINDOW": self.recv_window,
"X-BAPI-SIGN": sig,
"Content-Type": "application/json",
}
@staticmethod
def _parse_ticker(row: dict) -> dict:
def _clean_params(params: dict[str, Any] | None) -> dict[str, Any]:
if not params:
return {}
return {k: v for k, v in params.items() if v is not None}
@staticmethod
def _querystring(params: dict[str, Any]) -> str:
# Bybit accepts the querystring in whatever order the request serializes
# it. For the signature we use the same urlencode (dict insertion order).
# Python 3.7+ dicts preserve insertion order, which keeps the signature
# payload consistent with the actual URL.
return urlencode(params)
# ── request primitives ─────────────────────────────────────
async def _request_public(
self,
method: str,
path: str,
params: dict[str, Any] | None = None,
) -> dict[str, Any]:
clean = self._clean_params(params)
url = self.base_url + path
resp = await self._http.request(
method, url, params=clean if clean else None
)
return self._parse_response(resp)
async def _request_signed(
self,
method: str,
path: str,
params: dict[str, Any] | None = None,
body: dict[str, Any] | None = None,
) -> dict[str, Any]:
url = self.base_url + path
method = method.upper()
if method == "GET":
clean = self._clean_params(params)
qs = self._querystring(clean)
headers = self._signed_headers(qs)
resp = await self._http.request(
method, url, params=clean if clean else None, headers=headers
)
else:
payload_body = body or {}
body_json = json.dumps(payload_body, separators=(",", ":"))
headers = self._signed_headers(body_json)
resp = await self._http.request(
method, url, content=body_json, headers=headers
)
return self._parse_response(resp)
@staticmethod
def _parse_response(resp: httpx.Response) -> dict[str, Any]:
try:
data = resp.json()
except Exception as e:  # pragma: no cover - hard to reach
raise BybitAPIError(
f"invalid JSON from Bybit (status={resp.status_code}): {resp.text[:200]}"
) from e
if resp.status_code >= 500:
raise BybitAPIError(
f"bybit server error {resp.status_code}: "
f"{data.get('retMsg', resp.text[:200])}"
)
if not isinstance(data, dict):
raise BybitAPIError(f"unexpected payload type: {type(data).__name__}")
return data
def _envelope(self, resp: dict[str, Any], payload: dict[str, Any]) -> dict[str, Any]:
code = resp.get("retCode", 0)
if code != 0:
return {"error": resp.get("retMsg", "bybit_error"), "code": code}
return payload
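The `retCode` envelope check above can be exercised with illustrative payloads (Bybit V5 wraps every response in `{retCode, retMsg, result}`, and a non-zero `retCode` signals an application-level error):

```python
from typing import Any

def envelope(resp: dict[str, Any], payload: dict[str, Any]) -> dict[str, Any]:
    """Return payload on success, or an error dict on non-zero retCode."""
    code = resp.get("retCode", 0)
    if code != 0:
        return {"error": resp.get("retMsg", "bybit_error"), "code": code}
    return payload

ok = envelope({"retCode": 0, "result": {}}, {"price": 42.0})
err = envelope({"retCode": 10001, "retMsg": "params error"}, {})
assert ok == {"price": 42.0}
assert err == {"error": "params error", "code": 10001}
```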
# ── parsers shared ─────────────────────────────────────────
@staticmethod
def _parse_ticker(row: dict[str, Any]) -> dict[str, Any]:
return {
"symbol": row.get("symbol"),
"last_price": _f(row.get("lastPrice")),
@@ -66,9 +196,13 @@ class BybitClient:
"open_interest": _f(row.get("openInterest")),
}
# ── market data (public) ───────────────────────────────────
async def get_ticker(self, symbol: str, category: str = "linear") -> dict:
resp = await self._run(
self._http.get_tickers, category=category, symbol=symbol
resp = await self._request_public(
"GET",
"/v5/market/tickers",
params={"category": category, "symbol": symbol},
)
rows = (resp.get("result") or {}).get("list") or []
if not rows:
@@ -86,8 +220,10 @@ class BybitClient:
async def get_orderbook(
self, symbol: str, category: str = "linear", limit: int = 50
) -> dict:
resp = await self._run(
self._http.get_orderbook, category=category, symbol=symbol, limit=limit
resp = await self._request_public(
"GET",
"/v5/market/orderbook",
params={"category": category, "symbol": symbol, "limit": limit},
)
r = resp.get("result") or {}
return {
@@ -106,17 +242,17 @@ class BybitClient:
end: int | None = None,
limit: int = 1000,
) -> dict:
kwargs = dict(
category=category,
symbol=symbol,
interval=interval,
limit=limit,
)
params: dict[str, Any] = {
"category": category,
"symbol": symbol,
"interval": interval,
"limit": limit,
}
if start is not None:
kwargs["start"] = start
params["start"] = start
if end is not None:
kwargs["end"] = end
resp = await self._run(self._http.get_kline, **kwargs)
params["end"] = end
resp = await self._request_public("GET", "/v5/market/kline", params=params)
rows = (resp.get("result") or {}).get("list") or []
rows_sorted = sorted(rows, key=lambda r: int(r[0]))
candles = [
@@ -168,8 +304,10 @@ class BybitClient:
return out
async def get_funding_rate(self, symbol: str, category: str = "linear") -> dict:
resp = await self._run(
self._http.get_tickers, category=category, symbol=symbol
resp = await self._request_public(
"GET",
"/v5/market/tickers",
params={"category": category, "symbol": symbol},
)
rows = (resp.get("result") or {}).get("list") or []
if not rows:
@@ -184,9 +322,10 @@ class BybitClient:
async def get_funding_history(
self, symbol: str, category: str = "linear", limit: int = 100
) -> dict:
resp = await self._run(
self._http.get_funding_rate_history,
category=category, symbol=symbol, limit=limit,
resp = await self._request_public(
"GET",
"/v5/market/funding/history",
params={"category": category, "symbol": symbol, "limit": limit},
)
rows = (resp.get("result") or {}).get("list") or []
hist = [
@@ -205,9 +344,15 @@ class BybitClient:
interval: str = "5min",
limit: int = 288,
) -> dict:
resp = await self._run(
self._http.get_open_interest,
category=category, symbol=symbol, intervalTime=interval, limit=limit,
resp = await self._request_public(
"GET",
"/v5/market/open-interest",
params={
"category": category,
"symbol": symbol,
"intervalTime": interval,
"limit": limit,
},
)
rows = (resp.get("result") or {}).get("list") or []
points = [
@@ -226,71 +371,88 @@ class BybitClient:
"points": points,
}
async def get_instruments(self, category: str = "linear", symbol: str | None = None) -> dict:
kwargs: dict[str, Any] = {"category": category}
async def get_instruments(
self, category: str = "linear", symbol: str | None = None
) -> dict:
params: dict[str, Any] = {"category": category}
if symbol:
kwargs["symbol"] = symbol
resp = await self._run(self._http.get_instruments_info, **kwargs)
params["symbol"] = symbol
resp = await self._request_public(
"GET", "/v5/market/instruments-info", params=params
)
rows = (resp.get("result") or {}).get("list") or []
instruments = []
for r in rows:
pf = r.get("priceFilter") or {}
lf = r.get("lotSizeFilter") or {}
instruments.append({
"symbol": r.get("symbol"),
"status": r.get("status"),
"base_coin": r.get("baseCoin"),
"quote_coin": r.get("quoteCoin"),
"tick_size": _f(pf.get("tickSize")),
"qty_step": _f(lf.get("qtyStep")),
"min_qty": _f(lf.get("minOrderQty")),
})
instruments.append(
{
"symbol": r.get("symbol"),
"status": r.get("status"),
"base_coin": r.get("baseCoin"),
"quote_coin": r.get("quoteCoin"),
"tick_size": _f(pf.get("tickSize")),
"qty_step": _f(lf.get("qtyStep")),
"min_qty": _f(lf.get("minOrderQty")),
}
)
return {"category": category, "instruments": instruments}
async def get_option_chain(self, base_coin: str, expiry: str | None = None) -> dict:
kwargs: dict[str, Any] = {"category": "option", "baseCoin": base_coin.upper()}
resp = await self._run(self._http.get_instruments_info, **kwargs)
resp = await self._request_public(
"GET",
"/v5/market/instruments-info",
params={"category": "option", "baseCoin": base_coin.upper()},
)
rows = (resp.get("result") or {}).get("list") or []
options = []
for r in rows:
delivery = r.get("deliveryTime")
if expiry and expiry not in r.get("symbol", ""):
continue
options.append({
"symbol": r.get("symbol"),
"base_coin": r.get("baseCoin"),
"settle_coin": r.get("settleCoin"),
"type": r.get("optionsType"),
"launch_time": int(r.get("launchTime", 0)),
"delivery_time": int(delivery) if delivery else None,
})
options.append(
{
"symbol": r.get("symbol"),
"base_coin": r.get("baseCoin"),
"settle_coin": r.get("settleCoin"),
"type": r.get("optionsType"),
"launch_time": int(r.get("launchTime", 0)),
"delivery_time": int(delivery) if delivery else None,
}
)
return {"base_coin": base_coin.upper(), "options": options}
# ── account / positions / orders (signed) ─────────────────
async def get_positions(
self, category: str = "linear", settle_coin: str = "USDT"
) -> list[dict]:
params: dict[str, Any] = {"category": category}
if category in ("linear", "inverse"):
params["settleCoin"] = settle_coin
resp = await self._request_signed("GET", "/v5/position/list", params=params)
rows = (resp.get("result") or {}).get("list") or []
out = []
for r in rows:
out.append(
{
"symbol": r.get("symbol"),
"side": r.get("side"),
"size": _f(r.get("size")),
"entry_price": _f(r.get("avgPrice")),
"unrealized_pnl": _f(r.get("unrealisedPnl")),
"leverage": _f(r.get("leverage")),
"liquidation_price": _f(r.get("liqPrice")),
"position_value": _f(r.get("positionValue")),
}
)
return out
async def get_account_summary(self, account_type: str = "UNIFIED") -> dict:
resp = await self._request_signed(
"GET",
"/v5/account/wallet-balance",
params={"accountType": account_type},
)
rows = (resp.get("result") or {}).get("list") or []
if not rows:
@@ -298,11 +460,13 @@ class BybitClient:
a = rows[0]
coins = []
for c in a.get("coin") or []:
coins.append(
{
"coin": c.get("coin"),
"wallet_balance": _f(c.get("walletBalance")),
"equity": _f(c.get("equity")),
}
)
return {
"account_type": a.get("accountType"),
"equity": _f(a.get("totalEquity")),
@@ -316,8 +480,10 @@ class BybitClient:
async def get_trade_history(
self, category: str = "linear", limit: int = 50
) -> list[dict]:
resp = await self._request_signed(
"GET",
"/v5/execution/list",
params={"category": category, "limit": limit},
)
rows = (resp.get("result") or {}).get("list") or []
return [
@@ -339,12 +505,14 @@ class BybitClient:
symbol: str | None = None,
settle_coin: str = "USDT",
) -> list[dict]:
params: dict[str, Any] = {"category": category}
if category in ("linear", "inverse") and not symbol:
params["settleCoin"] = settle_coin
if symbol:
params["symbol"] = symbol
resp = await self._request_signed(
"GET", "/v5/order/realtime", params=params
)
rows = (resp.get("result") or {}).get("list") or []
return [
{
@@ -360,15 +528,20 @@ class BybitClient:
for r in rows
]
# ── microstructure / basis ─────────────────────────────────
async def get_orderbook_imbalance(
self,
symbol: str,
category: str = "linear",
depth: int = 10,
) -> dict:
"""Microstructure: bid/ask imbalance ratio + microprice + slope."""
ob = await self.get_orderbook(
symbol=symbol, category=category, limit=max(depth, 50)
)
result = micro.orderbook_imbalance(
ob.get("bids") or [], ob.get("asks") or [], depth=depth
)
return {
"symbol": symbol,
"category": category,
@@ -378,9 +551,6 @@ class BybitClient:
}
async def get_basis_term_structure(self, asset: str) -> dict:
"""Basis curve of dated futures vs perp + spot. Filters dated
BTCUSDT / ETHUSDT futures contracts and computes the annualized basis
for each.
"""
import datetime as _dt
asset = asset.upper()
@@ -389,12 +559,13 @@ class BybitClient:
sp = spot.get("last_price")
pp = perp.get("last_price")
# List of dated futures (linear/inverse)
instr = await self.get_instruments(category="linear")
items = instr.get("instruments") or []
futures = [
x
for x in items
if x.get("symbol", "").startswith(f"{asset}-")
or x.get("symbol", "").startswith(f"{asset}USDT-")
]
rows: list[dict[str, Any]] = []
@@ -409,21 +580,25 @@ class BybitClient:
days = max((int(expiry_ms) - now_ms) / 86_400_000, 1)
basis_pct = 100.0 * (fp - sp) / sp
annualized = basis_pct * 365.0 / days
rows.append(
{
"symbol": f["symbol"],
"expiry_ms": int(expiry_ms),
"days_to_expiry": round(days, 2),
"future_price": fp,
"basis_pct": round(basis_pct, 4),
"annualized_basis_pct": round(annualized, 4),
}
)
rows.sort(key=lambda r: r["days_to_expiry"])
return {
"asset": asset,
"spot_price": sp,
"perp_price": pp,
"perp_basis_pct": round(100.0 * (pp - sp) / sp, 4)
if (sp and pp)
else None,
"term_structure": rows,
"data_timestamp": _dt.datetime.now(_dt.UTC).isoformat(),
}
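As a sanity check, the basis/annualization arithmetic used above can be reproduced standalone with illustrative numbers (not real market data):

```python
# Illustrative numbers only: a dated future trading 2% above spot,
# expiring in 73 days, carries a 10% annualized basis.
spot, future, days_to_expiry = 60_000.0, 61_200.0, 73.0

basis_pct = 100.0 * (future - spot) / spot       # simple basis in percent
annualized = basis_pct * 365.0 / days_to_expiry  # linear annualization

print(round(basis_pct, 4), round(annualized, 4))  # -> 2.0 10.0
```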
@@ -449,11 +624,7 @@ class BybitClient:
"funding_rate": perp.get("funding_rate"),
}
def _envelope(self, resp: dict, payload: dict) -> dict:
code = resp.get("retCode", 0)
if code != 0:
return {"error": resp.get("retMsg", "bybit_error"), "code": code}
return payload
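A minimal standalone sketch of the envelope behaviour above, exercised with hypothetical responses rather than real Bybit payloads:

```python
def envelope(resp: dict, payload: dict) -> dict:
    # Bybit V5 convention: retCode == 0 means success; any other code is an
    # error described by retMsg.
    code = resp.get("retCode", 0)
    if code != 0:
        return {"error": resp.get("retMsg", "bybit_error"), "code": code}
    return payload

ok = envelope({"retCode": 0, "retMsg": "OK"}, {"status": "submitted"})
err = envelope({"retCode": 10001, "retMsg": "params error"}, {"status": "submitted"})
# ok  == {"status": "submitted"}
# err == {"error": "params error", "code": 10001}
```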
# ── trading (signed, write) ────────────────────────────────
async def place_order(
self,
@@ -467,7 +638,7 @@ class BybitClient:
reduce_only: bool = False,
position_idx: int | None = None,
) -> dict:
body: dict[str, Any] = {
"category": category,
"symbol": symbol,
"side": side,
@@ -477,38 +648,34 @@ class BybitClient:
"reduceOnly": reduce_only,
}
if price is not None:
body["price"] = str(price)
if position_idx is not None:
body["positionIdx"] = position_idx
if category == "option":
import uuid
body["orderLinkId"] = f"cerbero-{uuid.uuid4().hex[:16]}"
resp = await self._request_signed("POST", "/v5/order/create", body=body)
r = resp.get("result") or {}
return self._envelope(
resp,
{
"order_id": r.get("orderId"),
"order_link_id": r.get("orderLinkId"),
"status": "submitted",
},
)
async def place_combo_order(
self,
category: str,
legs: list[dict[str, Any]],
) -> dict:
"""Atomic multi-leg via /v5/order/create-batch (Bybit option only).
Bybit supports batch_order only for category='option'. For perp/linear
use a loop of place_order calls (not atomic).
legs: [{symbol, side, qty, order_type, price?, tif?, reduce_only?}].
"""
if category != "option":
raise ValueError(
"place_combo_order: Bybit batch_order is only available for category='option'"
)
if len(legs) < 2:
raise ValueError("combo requires at least 2 legs")
import uuid
request: list[dict[str, Any]] = []
for leg in legs:
entry: dict[str, Any] = {
@@ -524,7 +691,10 @@ class BybitClient:
entry["price"] = str(leg["price"])
request.append(entry)
body = {"category": category, "request": request}
resp = await self._request_signed(
"POST", "/v5/order/create-batch", body=body
)
result_list = (resp.get("result") or {}).get("list") or []
orders = [
{
@@ -544,80 +714,112 @@ class BybitClient:
new_qty: float | None = None,
new_price: float | None = None,
) -> dict:
body: dict[str, Any] = {
"category": category,
"symbol": symbol,
"orderId": order_id,
}
if new_qty is not None:
body["qty"] = str(new_qty)
if new_price is not None:
body["price"] = str(new_price)
resp = await self._request_signed("POST", "/v5/order/amend", body=body)
r = resp.get("result") or {}
return self._envelope(
resp,
{
"order_id": r.get("orderId", order_id),
"status": "amended",
},
)
async def cancel_order(self, category: str, symbol: str, order_id: str) -> dict:
body = {"category": category, "symbol": symbol, "orderId": order_id}
resp = await self._request_signed("POST", "/v5/order/cancel", body=body)
r = resp.get("result") or {}
return self._envelope(
resp,
{
"order_id": r.get("orderId", order_id),
"status": "cancelled",
},
)
async def cancel_all_orders(
self, category: str, symbol: str | None = None
) -> dict:
body: dict[str, Any] = {"category": category}
if symbol:
body["symbol"] = symbol
resp = await self._request_signed(
"POST", "/v5/order/cancel-all", body=body
)
r = resp.get("result") or {}
ids = [x.get("orderId") for x in (r.get("list") or [])]
return self._envelope(
resp,
{
"cancelled_ids": ids,
"count": len(ids),
},
)
async def set_stop_loss(
self,
category: str,
symbol: str,
stop_loss: float,
position_idx: int = 0,
) -> dict:
body = {
"category": category,
"symbol": symbol,
"stopLoss": str(stop_loss),
"positionIdx": position_idx,
}
resp = await self._request_signed(
"POST", "/v5/position/trading-stop", body=body
)
return self._envelope(
resp,
{
"symbol": symbol,
"stop_loss": stop_loss,
"status": "stop_loss_set",
},
)
async def set_take_profit(
self,
category: str,
symbol: str,
take_profit: float,
position_idx: int = 0,
) -> dict:
body = {
"category": category,
"symbol": symbol,
"takeProfit": str(take_profit),
"positionIdx": position_idx,
}
resp = await self._request_signed(
"POST", "/v5/position/trading-stop", body=body
)
return self._envelope(
resp,
{
"symbol": symbol,
"take_profit": take_profit,
"status": "take_profit_set",
},
)
async def close_position(self, category: str, symbol: str) -> dict:
positions = await self.get_positions(category=category)
target = next(
(p for p in positions if p["symbol"] == symbol and (p["size"] or 0) > 0),
None,
)
if not target:
return {"error": "no_open_position", "symbol": symbol}
close_side = "Sell" if target["side"] == "Buy" else "Buy"
@@ -634,28 +836,44 @@ class BybitClient:
async def set_leverage(
self, category: str, symbol: str, leverage: int
) -> dict:
body = {
"category": category,
"symbol": symbol,
"buyLeverage": str(leverage),
"sellLeverage": str(leverage),
}
resp = await self._request_signed(
"POST", "/v5/position/set-leverage", body=body
)
return self._envelope(
resp,
{
"symbol": symbol,
"leverage": leverage,
"status": "leverage_set",
},
)
async def switch_position_mode(
self, category: str, symbol: str, mode: str
) -> dict:
mode_code = 3 if mode.lower() == "hedge" else 0
body = {
"category": category,
"symbol": symbol,
"mode": mode_code,
}
resp = await self._request_signed(
"POST", "/v5/position/switch-mode", body=body
)
return self._envelope(
resp,
{
"symbol": symbol,
"mode": mode,
"status": "mode_switched",
},
)
async def transfer_asset(
self,
@@ -664,19 +882,23 @@ class BybitClient:
from_type: str,
to_type: str,
) -> dict:
import uuid
body = {
"transferId": str(uuid.uuid4()),
"coin": coin,
"amount": str(amount),
"fromAccountType": from_type,
"toAccountType": to_type,
}
resp = await self._request_signed(
"POST", "/v5/asset/transfer/inter-transfer", body=body
)
r = resp.get("result") or {}
return self._envelope(
resp,
{
"transfer_id": r.get("transferId"),
"coin": coin,
"amount": amount,
"status": "submitted",
},
)
@@ -359,7 +359,6 @@ async def place_order(
reduce_only=params.reduce_only,
position_idx=params.position_idx,
)
return result
@@ -370,7 +369,6 @@ async def place_combo_order(
category=params.category,
legs=[leg.model_dump() for leg in params.legs],
)
return result
@@ -97,6 +97,10 @@ class DeribitClient:
def is_testnet(self) -> dict:
return {"testnet": self.testnet, "base_url": self.base_url}
async def health(self) -> dict:
"""Minimal probe for /health/ready: no network calls."""
return {"status": "ok", "testnet": self.testnet}
async def get_ticker(self, instrument_name: str) -> dict:
import datetime as _dt
raw = await self._request("public/ticker", {"instrument_name": instrument_name})
@@ -481,7 +481,6 @@ async def place_order(
post_only=params.post_only,
label=params.label,
)
return result
@@ -502,29 +501,24 @@ async def place_combo_order(
price=params.price,
label=params.label,
)
return result
async def cancel_order(client: DeribitClient, params: CancelOrderReq) -> dict:
result = await client.cancel_order(params.order_id)
return result
async def set_stop_loss(client: DeribitClient, params: SetStopLossReq) -> dict:
result = await client.set_stop_loss(params.order_id, params.stop_price)
return result
async def set_take_profit(client: DeribitClient, params: SetTakeProfitReq) -> dict:
result = await client.set_take_profit(params.order_id, params.tp_price)
return result
async def close_position(client: DeribitClient, params: ClosePositionReq) -> dict:
result = await client.close_position(params.instrument_name)
return result
@@ -1,11 +1,31 @@
"""Hyperliquid REST API client for perpetual futures trading.
Pure ``httpx`` + ``eth-account`` implementation: no dependency on
``hyperliquid-python-sdk``. Read endpoints hit ``POST /info`` (no auth);
write endpoints hit ``POST /exchange`` and require an EIP-712 L1 signature.
The signing scheme is bit-for-bit equivalent to the canonical SDK:
action_hash = keccak( msgpack(action) || nonce[u64 BE] || vault_marker
|| (expires_after marker || expires_after[u64 BE])? )
phantom = {"source": "a"|"b", "connectionId": action_hash} # a=mainnet, b=testnet
EIP-712 domain: name="Exchange", version="1", chainId=1337,
verifyingContract=0x0
"""
from __future__ import annotations
import asyncio
import datetime as _dt
import time as _time
from decimal import Decimal
from typing import Any
import httpx
import msgpack
from eth_account import Account
from eth_account.messages import encode_typed_data
from eth_utils import keccak, to_hex
from cerbero_mcp.common import indicators as ind
from cerbero_mcp.common.http import async_client
@@ -21,14 +41,8 @@ RESOLUTION_MAP = {
"1d": "1d",
}
# Slippage used for market orders / market_close (parity with the SDK).
DEFAULT_SLIPPAGE = 0.05
def _to_ms(date_str: str) -> int:
@@ -39,11 +53,91 @@ def _to_ms(date_str: str) -> int:
return int(dt.timestamp() * 1000)
def _float_to_wire(x: float) -> str:
"""Convert a price/size float to Hyperliquid wire string format.
8 decimal places, no trailing zeros (matching SDK ``float_to_wire``).
"""
rounded = f"{x:.8f}"
if abs(float(rounded) - x) >= 1e-12:
raise ValueError("float_to_wire causes rounding", x)
if rounded == "-0":
rounded = "0"
normalized = Decimal(rounded).normalize()
return f"{normalized:f}"
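The wire formatting above can be sketched standalone (stdlib `decimal` only); this mirrors the function's behaviour on a couple of values:

```python
from decimal import Decimal

def float_to_wire(x: float) -> str:
    # Fix to 8 decimal places, reject inputs that would silently round,
    # then strip trailing zeros via Decimal.normalize().
    rounded = f"{x:.8f}"
    if abs(float(rounded) - x) >= 1e-12:
        raise ValueError("float_to_wire causes rounding", x)
    if rounded == "-0":
        rounded = "0"
    return f"{Decimal(rounded).normalize():f}"

print(float_to_wire(1234.50))   # -> 1234.5
print(float_to_wire(0.000123))  # -> 0.000123
```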
def _address_to_bytes(address: str) -> bytes:
return bytes.fromhex(address.removeprefix("0x"))
def _action_hash(
action: Any,
vault_address: str | None,
nonce: int,
expires_after: int | None,
) -> bytes:
"""Deterministic action hash (msgpack + nonce + vault + expires)."""
data = msgpack.packb(action)
data += nonce.to_bytes(8, "big")
if vault_address is None:
data += b"\x00"
else:
data += b"\x01"
data += _address_to_bytes(vault_address)
if expires_after is not None:
data += b"\x00"
data += expires_after.to_bytes(8, "big")
return keccak(data)
def _l1_payload(phantom_agent: dict[str, Any]) -> dict[str, Any]:
return {
"domain": {
"chainId": 1337,
"name": "Exchange",
"verifyingContract": "0x0000000000000000000000000000000000000000",
"version": "1",
},
"types": {
"Agent": [
{"name": "source", "type": "string"},
{"name": "connectionId", "type": "bytes32"},
],
"EIP712Domain": [
{"name": "name", "type": "string"},
{"name": "version", "type": "string"},
{"name": "chainId", "type": "uint256"},
{"name": "verifyingContract", "type": "address"},
],
},
"primaryType": "Agent",
"message": phantom_agent,
}
def _sign_l1_action(
private_key: str,
action: Any,
vault_address: str | None,
nonce: int,
expires_after: int | None,
is_mainnet: bool,
) -> dict[str, Any]:
h = _action_hash(action, vault_address, nonce, expires_after)
phantom_agent = {"source": "a" if is_mainnet else "b", "connectionId": h}
payload = _l1_payload(phantom_agent)
encoded = encode_typed_data(full_message=payload)
signed = Account.from_key(private_key).sign_message(encoded)
return {"r": to_hex(signed["r"]), "s": to_hex(signed["s"]), "v": signed["v"]}
class HyperliquidClient:
"""Async client for the Hyperliquid REST API.
Read operations call ``POST /info`` directly via ``httpx``.
Write operations build an EIP-712 L1 signature in-process (no SDK)
and call ``POST /exchange``.
"""
def __init__(
@@ -63,53 +157,99 @@ class HyperliquidClient:
self.base_url = base_url
else:
self.base_url = BASE_TESTNET if testnet else BASE_LIVE
self._is_mainnet = self.base_url == BASE_LIVE
self.vault_address: str | None = None
# Persistent async client (reused for /exchange and /info).
self._http: httpx.AsyncClient | None = None
# Cache name → asset id (perp universe).
self._name_to_asset: dict[str, int] | None = None
async def aclose(self) -> None:
"""Close the underlying HTTP client (if any)."""
if self._http is not None:
await self._http.aclose()
self._http = None
# ── Internal helpers ───────────────────────────────────────
async def _post_info(self, payload: dict[str, Any]) -> Any:
"""POST a JSON payload to ``/info`` (read-only, no auth)."""
async with async_client(timeout=15.0) as http:
resp = await http.post(f"{self.base_url}/info", json=payload)
resp.raise_for_status()
return resp.json()
# Backward-compat alias (internal).
async def _post(self, payload: dict[str, Any]) -> Any:
return await self._post_info(payload)
async def _post_exchange(
self,
action: dict[str, Any],
nonce: int | None = None,
vault_address: str | None = None,
) -> Any:
"""Sign and POST an action to ``/exchange``."""
if nonce is None:
nonce = int(_time.time() * 1000)
vault = vault_address if vault_address is not None else self.vault_address
signature = _sign_l1_action(
self.private_key,
action,
vault,
nonce,
None, # expires_after: not used here
self._is_mainnet,
)
payload: dict[str, Any] = {
"action": action,
"nonce": nonce,
"signature": signature,
"vaultAddress": vault,
"expiresAfter": None,
}
async with async_client(timeout=15.0) as http:
resp = await http.post(f"{self.base_url}/exchange", json=payload)
resp.raise_for_status()
return resp.json()
async def _name_to_asset_id(self, name: str) -> int:
"""Resolve a perp coin name (e.g. ``BTC``) to its asset id.
The asset id is the index in the ``meta.universe`` array. Cached
per-client; refreshed if the requested name is missing.
"""
upper = name.upper()
if self._name_to_asset is None or upper not in self._name_to_asset:
meta = await self._post_info({"type": "meta"})
universe = meta.get("universe", [])
self._name_to_asset = {
m["name"].upper(): idx for idx, m in enumerate(universe)
}
if upper not in self._name_to_asset:
raise ValueError(f"Unknown asset: {name}")
return self._name_to_asset[upper]
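The name-to-asset-id resolution above is just an index lookup into `meta.universe`; a sketch with a stubbed (hypothetical) meta response:

```python
def build_asset_map(meta: dict) -> dict[str, int]:
    # Asset id = position of the coin in meta["universe"].
    return {m["name"].upper(): idx
            for idx, m in enumerate(meta.get("universe", []))}

meta_stub = {"universe": [{"name": "BTC"}, {"name": "ETH"}, {"name": "SOL"}]}
assets = build_asset_map(meta_stub)
print(assets["ETH"])  # -> 1
```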
@staticmethod
def _order_type_to_wire(order_type: dict[str, Any]) -> dict[str, Any]:
if "limit" in order_type:
return {"limit": order_type["limit"]}
if "trigger" in order_type:
t = order_type["trigger"]
return {
"trigger": {
"isMarket": t["isMarket"],
"triggerPx": _float_to_wire(float(t["triggerPx"])),
"tpsl": t["tpsl"],
}
}
raise ValueError("Invalid order type", order_type)
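The conversion above can be mirrored in isolation; this sketch simplifies the trigger-price formatting to plain `:g` formatting instead of the Decimal-based wire helper:

```python
def order_type_to_wire(order_type: dict) -> dict:
    # Pass limit orders through unchanged; rewrite trigger orders with the
    # trigger price as a wire string (simplified formatting here).
    if "limit" in order_type:
        return {"limit": order_type["limit"]}
    if "trigger" in order_type:
        t = order_type["trigger"]
        return {
            "trigger": {
                "isMarket": t["isMarket"],
                "triggerPx": f"{float(t['triggerPx']):g}",
                "tpsl": t["tpsl"],
            }
        }
    raise ValueError("Invalid order type", order_type)

wire = order_type_to_wire(
    {"trigger": {"triggerPx": 60000.0, "isMarket": True, "tpsl": "sl"}}
)
# -> {"trigger": {"isMarket": True, "triggerPx": "60000", "tpsl": "sl"}}
```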
# ── Read tools ─────────────────────────────────────────────
async def get_markets(self) -> list[dict[str, Any]]:
"""List all perp markets with metadata and current stats."""
data = await self._post_info({"type": "metaAndAssetCtxs"})
universe = data[0]["universe"]
ctx_list = data[1]
markets = []
@@ -144,7 +284,7 @@ class HyperliquidClient:
async def get_orderbook(self, instrument: str, depth: int = 10) -> dict[str, Any]:
"""Get L2 order book for an asset."""
data = await self._post_info({"type": "l2Book", "coin": instrument.upper()})
levels = data.get("levels", [[], []])
bids = [{"price": float(b["px"]), "size": float(b["sz"])} for b in levels[0][:depth]]
asks = [{"price": float(a["px"]), "size": float(a["sz"])} for a in levels[1][:depth]]
@@ -152,7 +292,7 @@ class HyperliquidClient:
async def get_positions(self) -> list[dict[str, Any]]:
"""Get open positions for the wallet."""
data = await self._post_info(
{"type": "clearinghouseState", "user": self.wallet_address}
)
positions = []
@@ -184,9 +324,9 @@ class HyperliquidClient:
"""Get account summary (equity, balance, margin) including spot balances.
With a Unified Account, spot USDC and perps share collateral.
``spot_fetch_ok`` / ``perps_fetch_ok`` indicate whether the two sides were
read successfully: if either is False, the caller should treat
``equity``/``available_balance`` as a lower bound.
"""
perps_fetch_ok = True
perps_equity = 0.0
@@ -194,7 +334,7 @@ class HyperliquidClient:
margin_used = 0.0
unrealized_pnl = 0.0
try:
data = await self._post_info(
{"type": "clearinghouseState", "user": self.wallet_address}
)
margin = data.get("marginSummary") or {}
@@ -208,7 +348,7 @@ class HyperliquidClient:
spot_fetch_ok = True
spot_usdc = 0.0
try:
spot_data = await self._post_info(
{"type": "spotClearinghouseState", "user": self.wallet_address}
)
for b in spot_data.get("balances", []) or []:
@@ -233,7 +373,9 @@ class HyperliquidClient:
async def get_trade_history(self, limit: int = 100) -> list[dict[str, Any]]:
"""Get recent trade fills."""
data = await self._post_info(
{"type": "userFills", "user": self.wallet_address}
)
trades = []
for t in data[:limit]:
trades.append(
@@ -255,7 +397,7 @@ class HyperliquidClient:
start_ms = _to_ms(start_date)
end_ms = _to_ms(end_date)
interval = RESOLUTION_MAP.get(resolution, resolution)
data = await self._post_info(
{
"type": "candleSnapshot",
"req": {
@@ -282,7 +424,9 @@ class HyperliquidClient:
async def get_open_orders(self) -> list[dict[str, Any]]:
"""Get all open orders for the wallet."""
data = await self._post_info(
{"type": "openOrders", "user": self.wallet_address}
)
orders = []
for o in data:
orders.append(
@@ -326,7 +470,7 @@ class HyperliquidClient:
# Perp price + funding from HL
try:
ctx = await self._post_info({"type": "metaAndAssetCtxs"})
universe = ctx[0]["universe"]
ctx_list = ctx[1]
perp_price = None
@@ -375,7 +519,7 @@ class HyperliquidClient:
async def get_funding_rate(self, instrument: str) -> dict[str, Any]:
"""Get current and recent historical funding rates for an asset."""
data = await self._post_info({"type": "metaAndAssetCtxs"})
universe = data[0]["universe"]
ctx_list = data[1]
current_rate = None
@@ -389,7 +533,7 @@ class HyperliquidClient:
# Fetch funding history (last 7 days)
end_ms = int(_dt.datetime.utcnow().timestamp() * 1000)
start_ms = end_ms - 7 * 24 * 3600 * 1000
history_data = await self._post_info(
{
"type": "fundingHistory",
"coin": instrument.upper(),
@@ -443,44 +587,10 @@ class HyperliquidClient:
result[name] = None
return result
# ── Write tools (signed) ──────────────────────────────────
@staticmethod
def _parse_order_response(result: dict[str, Any]) -> dict[str, Any]:
status = result.get("status", "unknown")
response = result.get("response", {})
if isinstance(response, str):
@@ -491,7 +601,6 @@ class HyperliquidClient:
"filled_size": 0,
"avg_fill_price": 0,
}
statuses = response.get("data", {}).get("statuses", [{}])
first = statuses[0] if statuses else {}
if isinstance(first, str):
@@ -511,12 +620,95 @@ class HyperliquidClient:
"avg_fill_price": float(first.get("filled", {}).get("avgPx", 0)),
}
async def place_order(
self,
instrument: str,
side: str,
amount: float,
type: str = "limit",
price: float | None = None,
reduce_only: bool = False,
) -> dict[str, Any]:
"""Place an order on Hyperliquid (signed via EIP-712)."""
is_buy = side.lower() in ("buy", "long")
coin = instrument.upper()
if type == "market":
order_type: dict[str, Any] = {"limit": {"tif": "Ioc"}}
if price is None:
ticker = await self.get_ticker(coin)
mark = ticker.get("mark_price", 0)
price = round(mark * 1.03, 1) if is_buy else round(mark * 0.97, 1)
elif type in ("stop_market", "stop_loss"):
assert price is not None
order_type = {
"trigger": {
"triggerPx": float(price),
"isMarket": True,
"tpsl": "sl",
}
}
elif type == "take_profit":
assert price is not None
order_type = {
"trigger": {
"triggerPx": float(price),
"isMarket": True,
"tpsl": "tp",
}
}
else:
order_type = {"limit": {"tif": "Gtc"}}
if price is None:
return {"error": "price is required for limit orders"}
try:
asset_id = await self._name_to_asset_id(coin)
except ValueError as exc:
return {"error": str(exc), "order_id": "", "filled_size": 0, "avg_fill_price": 0}
order_wire: dict[str, Any] = {
"a": asset_id,
"b": is_buy,
"p": _float_to_wire(float(price)),
"s": _float_to_wire(float(amount)),
"r": reduce_only,
"t": self._order_type_to_wire(order_type),
}
action: dict[str, Any] = {
"type": "order",
"orders": [order_wire],
"grouping": "na",
}
try:
result = await self._post_exchange(action)
except httpx.HTTPError as exc:
return {
"status": "error",
"error": str(exc),
"order_id": "",
"filled_size": 0,
"avg_fill_price": 0,
}
return self._parse_order_response(result)
async def cancel_order(self, order_id: str, instrument: str) -> dict[str, Any]:
"""Cancel an existing order via signed ``cancel`` action."""
try:
asset_id = await self._name_to_asset_id(instrument)
except ValueError as exc:
return {"order_id": order_id, "status": "error", "error": str(exc)}
action: dict[str, Any] = {
"type": "cancel",
"cancels": [{"a": asset_id, "o": int(order_id)}],
}
try:
result = await self._post_exchange(action)
except httpx.HTTPError as exc:
return {"order_id": order_id, "status": "error", "error": str(exc)}
status = result.get("status", "unknown")
response = result.get("response", "")
if isinstance(response, str) and status == "err":
@@ -526,8 +718,7 @@ class HyperliquidClient:
async def set_stop_loss(
self, instrument: str, stop_price: float, size: float
) -> dict[str, Any]:
"""Set a stop-loss trigger order."""
# Determine direction by checking open position
"""Set a stop-loss trigger order (reduce-only)."""
positions = await self.get_positions()
direction = "sell" # default: assume long
for pos in positions:
@@ -548,7 +739,7 @@ class HyperliquidClient:
async def set_take_profit(
self, instrument: str, tp_price: float, size: float
) -> dict[str, Any]:
"""Set a take-profit trigger order."""
"""Set a take-profit trigger order (reduce-only)."""
positions = await self.get_positions()
direction = "sell" # default: assume long
for pos in positions:
@@ -567,21 +758,55 @@ class HyperliquidClient:
)
async def close_position(self, instrument: str) -> dict[str, Any]:
"""Close an open position for the given asset using market_close."""
exchange = self._get_exchange()
"""Close an open position using an aggressive IOC reduce-only order."""
coin = instrument.upper()
try:
result = await self._run_sync(exchange.market_close, instrument.upper())
data = await self._post_info(
{"type": "clearinghouseState", "user": self.wallet_address}
)
target = None
for ap in data.get("assetPositions", []):
pos = ap.get("position", {})
if (pos.get("coin") or "").upper() == coin:
target = pos
break
if target is None:
return {"error": f"No open position for {instrument}", "asset": instrument}
szi = float(target.get("szi", 0) or 0)
if szi == 0:
return {"error": f"No open position for {instrument}", "asset": instrument}
sz = abs(szi)
is_buy = szi < 0 # short → buy to close
# Slippage price: use mark price * (1 ± slippage), rounded to 5 significant figures.
ticker = await self.get_ticker(coin)
mark = float(ticker.get("mark_price", 0) or 0)
if mark <= 0:
return {"error": "missing mark price for slippage calc", "asset": instrument}
px = mark * (1 + DEFAULT_SLIPPAGE) if is_buy else mark * (1 - DEFAULT_SLIPPAGE)
px = round(float(f"{px:.5g}"), 6)
result = await self.place_order(
instrument=coin,
side="buy" if is_buy else "sell",
amount=sz,
type="limit",
price=px,
reduce_only=True,
)
return {
"status": result.get("status", "unknown"),
"status": result.get("status", "ok"),
"asset": instrument,
**{k: v for k, v in result.items() if k != "status"},
}
except Exception as exc:
return {"error": str(exc), "asset": instrument}
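The slippage-price computation used by `close_position` (shift the mark by `DEFAULT_SLIPPAGE` in the aggressive direction, round to 5 significant figures) can be isolated as a small helper; the constant's value here is an assumption, the real one lives elsewhere in the module:

```python
DEFAULT_SLIPPAGE = 0.05  # assumed value for illustration

def slippage_price(mark: float, is_buy: bool, slippage: float = DEFAULT_SLIPPAGE) -> float:
    """Aggressive limit price intended to fill like a market order:
    buy above / sell below the mark, rounded to 5 significant figures."""
    px = mark * (1 + slippage) if is_buy else mark * (1 - slippage)
    # 5 significant figures via the %g formatter, then cap decimals at 6
    return round(float(f"{px:.5g}"), 6)
```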
async def health(self) -> dict[str, Any]:
"""Health check — ping /info for server status."""
"""Health check — ping ``/info`` for server status."""
try:
await self._post({"type": "meta"})
await self._post_info({"type": "meta"})
return {"status": "ok", "testnet": self.testnet}
except Exception as exc:
return {"status": "error", "error": str(exc)}
@@ -303,7 +303,6 @@ async def place_order(
price=params.price,
reduce_only=params.reduce_only,
)
# TODO V2: wire audit via request.state.environment in router
return result
@@ -8,6 +8,8 @@ istanziato dal `ClientRegistry`.
"""
from __future__ import annotations
from typing import Any
class MacroClient:
"""Credentials wrapper for FRED/Finnhub. Stateless, no HTTP session."""
@@ -18,3 +20,7 @@ class MacroClient:
async def aclose(self) -> None: # pragma: no cover - no-op, no resources
return None
async def health(self) -> dict[str, Any]:
"""Minimal probe for /health/ready: no network call."""
return {"status": "ok"}
@@ -9,6 +9,8 @@ e per essere istanziato dal `ClientRegistry`.
"""
from __future__ import annotations
from typing import Any
class SentimentClient:
"""Credentials wrapper for CryptoPanic/LunarCrush. Stateless, no HTTP session."""
@@ -19,3 +21,7 @@ class SentimentClient:
async def aclose(self) -> None: # pragma: no cover - no-op, no resources
return None
async def health(self) -> dict[str, Any]:
"""Minimal probe for /health/ready: no network call."""
return {"status": "ok"}
+52 -6
@@ -11,6 +11,7 @@ from typing import Literal, cast
from fastapi import APIRouter, Depends, Request
from cerbero_mcp.client_registry import ClientRegistry
from cerbero_mcp.common.audit_helpers import audit_call
from cerbero_mcp.exchanges.alpaca import tools as t
from cerbero_mcp.exchanges.alpaca.client import AlpacaClient
@@ -136,41 +137,86 @@ def make_router() -> APIRouter:
client: AlpacaClient = Depends(get_alpaca_client),
):
creds = _build_creds(request)
return await t.place_order(client, params, creds=creds)
return await audit_call(
request=request,
exchange="alpaca",
action="place_order",
target_field="symbol",
params=params,
tool_fn=lambda: t.place_order(client, params, creds=creds),
)
@r.post("/tools/amend_order")
async def _amend_order(
params: t.AmendOrderReq,
request: Request,
client: AlpacaClient = Depends(get_alpaca_client),
):
return await t.amend_order(client, params)
return await audit_call(
request=request,
exchange="alpaca",
action="amend_order",
target_field="order_id",
params=params,
tool_fn=lambda: t.amend_order(client, params),
)
@r.post("/tools/cancel_order")
async def _cancel_order(
params: t.CancelOrderReq,
request: Request,
client: AlpacaClient = Depends(get_alpaca_client),
):
return await t.cancel_order(client, params)
return await audit_call(
request=request,
exchange="alpaca",
action="cancel_order",
target_field="order_id",
params=params,
tool_fn=lambda: t.cancel_order(client, params),
)
@r.post("/tools/cancel_all_orders")
async def _cancel_all_orders(
params: t.CancelAllOrdersReq,
request: Request,
client: AlpacaClient = Depends(get_alpaca_client),
):
return await t.cancel_all_orders(client, params)
return await audit_call(
request=request,
exchange="alpaca",
action="cancel_all_orders",
params=params,
tool_fn=lambda: t.cancel_all_orders(client, params),
)
@r.post("/tools/close_position")
async def _close_position(
params: t.ClosePositionReq,
request: Request,
client: AlpacaClient = Depends(get_alpaca_client),
):
return await t.close_position(client, params)
return await audit_call(
request=request,
exchange="alpaca",
action="close_position",
target_field="symbol",
params=params,
tool_fn=lambda: t.close_position(client, params),
)
@r.post("/tools/close_all_positions")
async def _close_all_positions(
params: t.CloseAllPositionsReq,
request: Request,
client: AlpacaClient = Depends(get_alpaca_client),
):
return await t.close_all_positions(client, params)
return await audit_call(
request=request,
exchange="alpaca",
action="close_all_positions",
params=params,
tool_fn=lambda: t.close_all_positions(client, params),
)
return r
+96 -11
@@ -11,6 +11,7 @@ from typing import Literal, cast
from fastapi import APIRouter, Depends, Request
from cerbero_mcp.client_registry import ClientRegistry
from cerbero_mcp.common.audit_helpers import audit_call
from cerbero_mcp.exchanges.bybit import tools as t
from cerbero_mcp.exchanges.bybit.client import BybitClient
@@ -182,7 +183,14 @@ def make_router() -> APIRouter:
client: BybitClient = Depends(get_bybit_client),
):
creds = _build_creds(request)
return await t.place_order(client, params, creds=creds)
return await audit_call(
request=request,
exchange="bybit",
action="place_order",
target_field="symbol",
params=params,
tool_fn=lambda: t.place_order(client, params, creds=creds),
)
@r.post("/tools/place_combo_order")
async def _place_combo_order(
@@ -191,49 +199,103 @@ def make_router() -> APIRouter:
client: BybitClient = Depends(get_bybit_client),
):
creds = _build_creds(request)
return await t.place_combo_order(client, params, creds=creds)
return await audit_call(
request=request,
exchange="bybit",
action="place_combo_order",
params=params,
tool_fn=lambda: t.place_combo_order(client, params, creds=creds),
)
@r.post("/tools/amend_order")
async def _amend_order(
params: t.AmendOrderReq,
request: Request,
client: BybitClient = Depends(get_bybit_client),
):
return await t.amend_order(client, params)
return await audit_call(
request=request,
exchange="bybit",
action="amend_order",
target_field="symbol",
params=params,
tool_fn=lambda: t.amend_order(client, params),
)
@r.post("/tools/cancel_order")
async def _cancel_order(
params: t.CancelOrderReq,
request: Request,
client: BybitClient = Depends(get_bybit_client),
):
return await t.cancel_order(client, params)
return await audit_call(
request=request,
exchange="bybit",
action="cancel_order",
target_field="order_id",
params=params,
tool_fn=lambda: t.cancel_order(client, params),
)
@r.post("/tools/cancel_all_orders")
async def _cancel_all_orders(
params: t.CancelAllReq,
request: Request,
client: BybitClient = Depends(get_bybit_client),
):
return await t.cancel_all_orders(client, params)
return await audit_call(
request=request,
exchange="bybit",
action="cancel_all_orders",
target_field="symbol",
params=params,
tool_fn=lambda: t.cancel_all_orders(client, params),
)
@r.post("/tools/set_stop_loss")
async def _set_stop_loss(
params: t.SetStopLossReq,
request: Request,
client: BybitClient = Depends(get_bybit_client),
):
return await t.set_stop_loss(client, params)
return await audit_call(
request=request,
exchange="bybit",
action="set_stop_loss",
target_field="symbol",
params=params,
tool_fn=lambda: t.set_stop_loss(client, params),
)
@r.post("/tools/set_take_profit")
async def _set_take_profit(
params: t.SetTakeProfitReq,
request: Request,
client: BybitClient = Depends(get_bybit_client),
):
return await t.set_take_profit(client, params)
return await audit_call(
request=request,
exchange="bybit",
action="set_take_profit",
target_field="symbol",
params=params,
tool_fn=lambda: t.set_take_profit(client, params),
)
@r.post("/tools/close_position")
async def _close_position(
params: t.ClosePositionReq,
request: Request,
client: BybitClient = Depends(get_bybit_client),
):
return await t.close_position(client, params)
return await audit_call(
request=request,
exchange="bybit",
action="close_position",
target_field="symbol",
params=params,
tool_fn=lambda: t.close_position(client, params),
)
@r.post("/tools/set_leverage")
async def _set_leverage(
@@ -242,20 +304,43 @@ def make_router() -> APIRouter:
client: BybitClient = Depends(get_bybit_client),
):
creds = _build_creds(request)
return await t.set_leverage(client, params, creds=creds)
return await audit_call(
request=request,
exchange="bybit",
action="set_leverage",
target_field="symbol",
params=params,
tool_fn=lambda: t.set_leverage(client, params, creds=creds),
)
@r.post("/tools/switch_position_mode")
async def _switch_position_mode(
params: t.SwitchModeReq,
request: Request,
client: BybitClient = Depends(get_bybit_client),
):
return await t.switch_position_mode(client, params)
return await audit_call(
request=request,
exchange="bybit",
action="switch_position_mode",
target_field="symbol",
params=params,
tool_fn=lambda: t.switch_position_mode(client, params),
)
@r.post("/tools/transfer_asset")
async def _transfer_asset(
params: t.TransferReq,
request: Request,
client: BybitClient = Depends(get_bybit_client),
):
return await t.transfer_asset(client, params)
return await audit_call(
request=request,
exchange="bybit",
action="transfer_asset",
target_field="coin",
params=params,
tool_fn=lambda: t.transfer_asset(client, params),
)
return r
+52 -6
@@ -11,6 +11,7 @@ from typing import Literal, cast
from fastapi import APIRouter, Depends, Request
from cerbero_mcp.client_registry import ClientRegistry
from cerbero_mcp.common.audit_helpers import audit_call
from cerbero_mcp.exchanges.deribit import tools as t
from cerbero_mcp.exchanges.deribit.client import DeribitClient
@@ -249,7 +250,14 @@ def make_router() -> APIRouter:
client: DeribitClient = Depends(get_deribit_client),
):
creds = _build_creds(request)
return await t.place_order(client, params, creds=creds)
return await audit_call(
request=request,
exchange="deribit",
action="place_order",
target_field="instrument_name",
params=params,
tool_fn=lambda: t.place_order(client, params, creds=creds),
)
@r.post("/tools/place_combo_order")
async def _place_combo_order(
@@ -258,34 +266,72 @@ def make_router() -> APIRouter:
client: DeribitClient = Depends(get_deribit_client),
):
creds = _build_creds(request)
return await t.place_combo_order(client, params, creds=creds)
return await audit_call(
request=request,
exchange="deribit",
action="place_combo_order",
params=params,
tool_fn=lambda: t.place_combo_order(client, params, creds=creds),
)
@r.post("/tools/cancel_order")
async def _cancel_order(
params: t.CancelOrderReq,
request: Request,
client: DeribitClient = Depends(get_deribit_client),
):
return await t.cancel_order(client, params)
return await audit_call(
request=request,
exchange="deribit",
action="cancel_order",
target_field="order_id",
params=params,
tool_fn=lambda: t.cancel_order(client, params),
)
@r.post("/tools/set_stop_loss")
async def _set_stop_loss(
params: t.SetStopLossReq,
request: Request,
client: DeribitClient = Depends(get_deribit_client),
):
return await t.set_stop_loss(client, params)
return await audit_call(
request=request,
exchange="deribit",
action="set_stop_loss",
target_field="order_id",
params=params,
tool_fn=lambda: t.set_stop_loss(client, params),
)
@r.post("/tools/set_take_profit")
async def _set_take_profit(
params: t.SetTakeProfitReq,
request: Request,
client: DeribitClient = Depends(get_deribit_client),
):
return await t.set_take_profit(client, params)
return await audit_call(
request=request,
exchange="deribit",
action="set_take_profit",
target_field="order_id",
params=params,
tool_fn=lambda: t.set_take_profit(client, params),
)
@r.post("/tools/close_position")
async def _close_position(
params: t.ClosePositionReq,
request: Request,
client: DeribitClient = Depends(get_deribit_client),
):
return await t.close_position(client, params)
return await audit_call(
request=request,
exchange="deribit",
action="close_position",
target_field="instrument_name",
params=params,
tool_fn=lambda: t.close_position(client, params),
)
return r
+45 -5
@@ -11,6 +11,7 @@ from typing import Literal, cast
from fastapi import APIRouter, Depends, Request
from cerbero_mcp.client_registry import ClientRegistry
from cerbero_mcp.common.audit_helpers import audit_call
from cerbero_mcp.exchanges.hyperliquid import tools as t
from cerbero_mcp.exchanges.hyperliquid.client import HyperliquidClient
@@ -136,34 +137,73 @@ def make_router() -> APIRouter:
client: HyperliquidClient = Depends(get_hyperliquid_client),
):
creds = _build_creds(request)
return await t.place_order(client, params, creds=creds)
return await audit_call(
request=request,
exchange="hyperliquid",
action="place_order",
target_field="instrument",
params=params,
tool_fn=lambda: t.place_order(client, params, creds=creds),
)
@r.post("/tools/cancel_order")
async def _cancel_order(
params: t.CancelOrderReq,
request: Request,
client: HyperliquidClient = Depends(get_hyperliquid_client),
):
return await t.cancel_order(client, params)
return await audit_call(
request=request,
exchange="hyperliquid",
action="cancel_order",
target_field="order_id",
params=params,
tool_fn=lambda: t.cancel_order(client, params),
)
@r.post("/tools/set_stop_loss")
async def _set_stop_loss(
params: t.SetStopLossReq,
request: Request,
client: HyperliquidClient = Depends(get_hyperliquid_client),
):
return await t.set_stop_loss(client, params)
return await audit_call(
request=request,
exchange="hyperliquid",
action="set_stop_loss",
target_field="instrument",
params=params,
tool_fn=lambda: t.set_stop_loss(client, params),
)
@r.post("/tools/set_take_profit")
async def _set_take_profit(
params: t.SetTakeProfitReq,
request: Request,
client: HyperliquidClient = Depends(get_hyperliquid_client),
):
return await t.set_take_profit(client, params)
return await audit_call(
request=request,
exchange="hyperliquid",
action="set_take_profit",
target_field="instrument",
params=params,
tool_fn=lambda: t.set_take_profit(client, params),
)
@r.post("/tools/close_position")
async def _close_position(
params: t.ClosePositionReq,
request: Request,
client: HyperliquidClient = Depends(get_hyperliquid_client),
):
return await t.close_position(client, params)
return await audit_call(
request=request,
exchange="hyperliquid",
action="close_position",
target_field="instrument",
params=params,
tool_fn=lambda: t.close_position(client, params),
)
return r
+84
@@ -1,7 +1,9 @@
"""Factory FastAPI app con middleware, swagger, exception handlers."""
from __future__ import annotations
import asyncio
import json
import os
import time
from datetime import UTC, datetime
from typing import Any
@@ -18,6 +20,7 @@ from cerbero_mcp.common.errors import (
RETRYABLE_STATUSES,
error_envelope,
)
from cerbero_mcp.common.request_log import install_request_log_middleware
class _TimestampInjectorMiddleware(BaseHTTPMiddleware):
@@ -99,6 +102,11 @@ def build_app(
app, testnet_token=testnet_token, mainnet_token=mainnet_token
)
# Request log middleware: registered AFTER auth → Starlette runs
# middleware in reverse (LIFO) registration order → request_log is
# outermost, auth sits inside it and populates request.state.* first.
install_request_log_middleware(app)
app.add_middleware(_TimestampInjectorMiddleware)
@app.middleware("http")
@@ -128,6 +136,7 @@ def build_app(
content=error_envelope(
type_="http_error", code=code, message=message,
retryable=retryable, details=details,
request_id=getattr(request.state, "request_id", None),
),
)
@@ -155,6 +164,7 @@ def build_app(
message=f"request body validation failed on {first_loc}",
retryable=False, suggested_fix=suggestion,
details={"errors": safe_errs},
request_id=getattr(request.state, "request_id", None),
),
)
@@ -166,6 +176,7 @@ def build_app(
type_="internal_error", code="UNHANDLED_EXCEPTION",
message=f"{type(exc).__name__}: {str(exc)[:300]}",
retryable=True,
request_id=getattr(request.state, "request_id", None),
),
)
@@ -179,6 +190,79 @@ def build_app(
"data_timestamp": datetime.now(UTC).isoformat(),
}
@app.get("/health/ready", tags=["system"])
async def health_ready():
"""Readiness check: ping every cached exchange client.
- Iterates ``app.state.registry._clients`` (if present).
- For each client, tries ``health()`` (preferred) or ``is_testnet()``.
  With neither method available, marks the entry ``note: no probe method``.
- 2s per-client timeout via ``asyncio.wait_for``.
- Global status: ``ready`` if all pass, ``degraded`` if at least
  one fails, ``not_ready`` if the registry is empty.
- HTTP 200 by default; with ``READY_FAILS_ON_DEGRADED=true`` it returns
  503 when the status is not ``ready`` (useful for k8s probes).
"""
registry = getattr(app.state, "registry", None)
clients_status: list[dict[str, Any]] = []
if registry is not None:
for (exchange, env), client in registry._clients.items():
t0 = time.perf_counter()
ping = (
getattr(client, "health", None)
or getattr(client, "is_testnet", None)
)
if ping is None:
clients_status.append({
"exchange": exchange,
"env": env,
"healthy": True,
"note": "no probe method",
})
continue
try:
res = ping()
if asyncio.iscoroutine(res):
await asyncio.wait_for(res, timeout=2.0)
dur = (time.perf_counter() - t0) * 1000
clients_status.append({
"exchange": exchange,
"env": env,
"healthy": True,
"duration_ms": round(dur, 2),
})
except Exception as e:
clients_status.append({
"exchange": exchange,
"env": env,
"healthy": False,
"error": f"{type(e).__name__}: {str(e)[:200]}",
})
if not clients_status:
status_label = "not_ready"
elif all(c["healthy"] for c in clients_status):
status_label = "ready"
else:
status_label = "degraded"
fail_on_degraded = os.environ.get(
"READY_FAILS_ON_DEGRADED", "false"
).lower() in ("1", "true", "yes")
http_code = 200
if fail_on_degraded and status_label != "ready":
http_code = 503
body = {
"status": status_label,
"name": title,
"version": version,
"uptime_seconds": int(time.time() - app.state.boot_at),
"data_timestamp": datetime.now(UTC).isoformat(),
"clients": clients_status,
}
return JSONResponse(status_code=http_code, content=body)
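The three-way status aggregation in `health_ready` can be exercised in isolation; a direct restatement of the rules above:

```python
def aggregate_ready(clients_status: list[dict]) -> str:
    """not_ready: nothing probed; ready: all healthy; degraded: otherwise."""
    if not clients_status:
        return "not_ready"
    if all(c["healthy"] for c in clients_status):
        return "ready"
    return "degraded"
```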
def _custom_openapi() -> dict[str, Any]:
if app.openapi_schema:
return app.openapi_schema
+2 -2
@@ -29,11 +29,11 @@ def app(monkeypatch):
def _bearer_test():
return {"Authorization": "Bearer t_test_123"}
return {"Authorization": "Bearer t_test_123", "X-Bot-Tag": "test-bot"}
def _bearer_live():
return {"Authorization": "Bearer t_live_456"}
return {"Authorization": "Bearer t_live_456", "X-Bot-Tag": "test-bot"}
# ── Spy helpers ──────────────────────────────────────────────────────────────
+167
@@ -0,0 +1,167 @@
from __future__ import annotations
import pytest
from pydantic import BaseModel
class FakeReq(BaseModel):
instrument_name: str
qty: float
@pytest.mark.asyncio
async def test_audit_call_logs_success(monkeypatch):
from cerbero_mcp.common.audit_helpers import audit_call
logged = []
def fake_audit(**kw):
logged.append(kw)
monkeypatch.setattr("cerbero_mcp.common.audit_helpers.audit_write_op", fake_audit)
class FakeRequest:
class _State:
environment = "testnet"
state = _State()
async def tool_fn():
return {"order_id": "abc123", "state": "filled"}
result = await audit_call(
request=FakeRequest(), # type: ignore[arg-type]
exchange="deribit",
action="place_order",
target_field="instrument_name",
params=FakeReq(instrument_name="BTC-PERPETUAL", qty=0.1),
tool_fn=tool_fn,
)
assert result == {"order_id": "abc123", "state": "filled"}
assert len(logged) == 1
rec = logged[0]
assert rec["actor"] == "testnet"
assert rec["exchange"] == "deribit"
assert rec["action"] == "place_order"
assert rec["target"] == "BTC-PERPETUAL"
assert rec["payload"]["qty"] == 0.1
assert rec["result"]["order_id"] == "abc123"
assert "error" not in rec or rec.get("error") is None
@pytest.mark.asyncio
async def test_audit_call_logs_error_and_reraises(monkeypatch):
from cerbero_mcp.common.audit_helpers import audit_call
logged = []
def fake_audit(**kw):
logged.append(kw)
monkeypatch.setattr("cerbero_mcp.common.audit_helpers.audit_write_op", fake_audit)
class FakeRequest:
class _State:
environment = "mainnet"
state = _State()
async def tool_fn():
raise RuntimeError("upstream timeout")
with pytest.raises(RuntimeError, match="upstream timeout"):
await audit_call(
request=FakeRequest(), # type: ignore[arg-type]
exchange="deribit",
action="cancel_order",
target_field="instrument_name",
params=FakeReq(instrument_name="BTC-PERPETUAL", qty=0.0),
tool_fn=tool_fn,
)
assert len(logged) == 1
rec = logged[0]
assert rec["actor"] == "mainnet"
assert "RuntimeError: upstream timeout" in rec["error"]
@pytest.mark.asyncio
async def test_audit_call_no_params_no_target():
from cerbero_mcp.common.audit_helpers import audit_call
class FakeRequest:
class _State:
environment = "testnet"
state = _State()
async def tool_fn():
return {"ok": True}
result = await audit_call(
request=FakeRequest(), # type: ignore[arg-type]
exchange="bybit",
action="cancel_all_orders",
tool_fn=tool_fn,
)
assert result == {"ok": True}
@pytest.mark.asyncio
async def test_audit_call_propagates_bot_tag(monkeypatch):
"""bot_tag is read from request.state and propagated to audit_write_op."""
from cerbero_mcp.common.audit_helpers import audit_call
logged = []
def fake_audit(**kw):
logged.append(kw)
monkeypatch.setattr("cerbero_mcp.common.audit_helpers.audit_write_op", fake_audit)
class FakeRequest:
class _State:
environment = "testnet"
bot_tag = "scanner-alpha"
state = _State()
async def tool_fn():
return {"order_id": "abc"}
await audit_call(
request=FakeRequest(), # type: ignore[arg-type]
exchange="deribit",
action="place_order",
target_field="instrument_name",
params=FakeReq(instrument_name="BTC-PERPETUAL", qty=0.1),
tool_fn=tool_fn,
)
assert len(logged) == 1
assert logged[0]["bot_tag"] == "scanner-alpha"
assert logged[0]["actor"] == "testnet"
@pytest.mark.asyncio
async def test_audit_call_bot_tag_none_when_missing(monkeypatch):
"""If request.state.bot_tag does not exist, audit receives None without error."""
from cerbero_mcp.common.audit_helpers import audit_call
logged = []
def fake_audit(**kw):
logged.append(kw)
monkeypatch.setattr("cerbero_mcp.common.audit_helpers.audit_write_op", fake_audit)
class FakeRequest:
class _State:
environment = "testnet"
state = _State()
async def tool_fn():
return {"ok": True}
await audit_call(
request=FakeRequest(), # type: ignore[arg-type]
exchange="bybit",
action="cancel_all_orders",
tool_fn=tool_fn,
)
assert len(logged) == 1
assert logged[0]["bot_tag"] is None
+13 -29
@@ -1,39 +1,23 @@
from __future__ import annotations
from unittest.mock import MagicMock
import pytest
from cerbero_mcp.exchanges.alpaca.client import AlpacaClient
@pytest.fixture
def mock_trading():
return MagicMock(name="alpaca_TradingClient")
async def client():
"""Paper-mode AlpacaClient over an httpx mock (managed by pytest-httpx)."""
c = AlpacaClient(api_key="test_key", secret_key="test_secret", paper=True)
try:
yield c
finally:
await c.aclose()
@pytest.fixture
def mock_stock():
return MagicMock(name="alpaca_StockHistoricalDataClient")
@pytest.fixture
def mock_crypto():
return MagicMock(name="alpaca_CryptoHistoricalDataClient")
@pytest.fixture
def mock_option():
return MagicMock(name="alpaca_OptionHistoricalDataClient")
@pytest.fixture
def client(mock_trading, mock_stock, mock_crypto, mock_option):
return AlpacaClient(
api_key="test_key",
secret_key="test_secret",
paper=True,
trading=mock_trading,
stock_data=mock_stock,
crypto_data=mock_crypto,
option_data=mock_option,
)
async def client_live():
c = AlpacaClient(api_key="test_key", secret_key="test_secret", paper=False)
try:
yield c
finally:
await c.aclose()
+366 -29
@@ -1,80 +1,417 @@
"""Tests for the httpx-based AlpacaClient (V2.0.0).
The REST endpoints are mocked via pytest-httpx. Coverage: account/positions,
orders (place/cancel/limit-error), close position, clock, and ValueError on
an invalid asset class.
"""
from __future__ import annotations
from unittest.mock import MagicMock
import re
import pytest
from cerbero_mcp.exchanges.alpaca.client import AlpacaClient
from pytest_httpx import HTTPXMock
PAPER = "https://paper-api.alpaca.markets"
DATA = "https://data.alpaca.markets"
@pytest.mark.asyncio
async def test_init_paper_mode(client, mock_trading):
async def test_init_paper_mode(client: AlpacaClient):
assert client.paper is True
assert client._trading is mock_trading
assert client.base_url is None
assert client._trading_base == PAPER
@pytest.mark.asyncio
async def test_get_account_calls_trading(client, mock_trading):
mock_trading.get_account.return_value = MagicMock(
model_dump=lambda: {"equity": 100000, "cash": 50000}
async def test_init_live_mode(client_live: AlpacaClient):
assert client_live.paper is False
assert client_live._trading_base == "https://api.alpaca.markets"
@pytest.mark.asyncio
async def test_init_base_url_override():
c = AlpacaClient(
api_key="k",
secret_key="s",
paper=True,
base_url="https://alpaca-custom.example.com",
)
try:
assert c.base_url == "https://alpaca-custom.example.com"
assert c._trading_base == "https://alpaca-custom.example.com"
# The data endpoint is NOT overridden
assert c._data_base == DATA
finally:
await c.aclose()
@pytest.mark.asyncio
async def test_get_account(httpx_mock: HTTPXMock, client: AlpacaClient):
httpx_mock.add_response(
url=f"{PAPER}/v2/account",
json={"id": "abc", "equity": "100000.00", "buying_power": "200000.00"},
)
result = await client.get_account()
mock_trading.get_account.assert_called_once()
assert result["equity"] == 100000
assert result["id"] == "abc"
assert result["equity"] == "100000.00"
@pytest.mark.asyncio
async def test_get_positions_returns_list(client, mock_trading):
pos_mock = MagicMock(model_dump=lambda: {"symbol": "AAPL", "qty": 10})
mock_trading.get_all_positions.return_value = [pos_mock]
async def test_get_account_sends_auth_headers(
httpx_mock: HTTPXMock, client: AlpacaClient
):
httpx_mock.add_response(url=f"{PAPER}/v2/account", json={"id": "x"})
await client.get_account()
req = httpx_mock.get_requests()[0]
assert req.headers["APCA-API-KEY-ID"] == "test_key"
assert req.headers["APCA-API-SECRET-KEY"] == "test_secret"
assert req.headers["Accept"] == "application/json"
@pytest.mark.asyncio
async def test_get_positions_returns_list(
httpx_mock: HTTPXMock, client: AlpacaClient
):
httpx_mock.add_response(
url=f"{PAPER}/v2/positions",
json=[{"symbol": "AAPL", "qty": "10", "side": "long"}],
)
result = await client.get_positions()
assert len(result) == 1
assert result[0]["symbol"] == "AAPL"
@pytest.mark.asyncio
async def test_place_market_order_stocks(client, mock_trading):
order_mock = MagicMock(model_dump=lambda: {"id": "o123", "symbol": "AAPL"})
mock_trading.submit_order.return_value = order_mock
async def test_place_market_order_stocks(
httpx_mock: HTTPXMock, client: AlpacaClient
):
httpx_mock.add_response(
method="POST",
url=f"{PAPER}/v2/orders",
json={"id": "o123", "symbol": "AAPL", "status": "accepted"},
)
result = await client.place_order(
symbol="AAPL", side="buy", qty=1, order_type="market", asset_class="stocks",
symbol="AAPL", side="buy", qty=1, order_type="market", asset_class="stocks"
)
assert result["id"] == "o123"
assert mock_trading.submit_order.called
# assert the POST body is correct
req = httpx_mock.get_requests()[0]
import json as _j
body = _j.loads(req.content)
assert body["symbol"] == "AAPL"
assert body["side"] == "buy"
assert body["type"] == "market"
assert body["qty"] == "1"
assert body["time_in_force"] == "day"
@pytest.mark.asyncio
async def test_place_limit_order_requires_price(client):
async def test_place_limit_order_requires_price(client: AlpacaClient):
with pytest.raises(ValueError, match="limit_price"):
await client.place_order(
symbol="AAPL", side="buy", qty=1, order_type="limit",
symbol="AAPL", side="buy", qty=1, order_type="limit"
)
@pytest.mark.asyncio
async def test_cancel_order(client, mock_trading):
mock_trading.cancel_order_by_id.return_value = None
async def test_place_stop_order_requires_price(client: AlpacaClient):
with pytest.raises(ValueError, match="stop_price"):
await client.place_order(
symbol="AAPL", side="buy", qty=1, order_type="stop"
)
@pytest.mark.asyncio
async def test_place_unsupported_order_type(client: AlpacaClient):
with pytest.raises(ValueError, match="unsupported order_type"):
await client.place_order(
symbol="AAPL", side="buy", qty=1, order_type="trailing_stop"
)
@pytest.mark.asyncio
async def test_cancel_order(httpx_mock: HTTPXMock, client: AlpacaClient):
# 204 No Content on success
httpx_mock.add_response(
method="DELETE",
url=f"{PAPER}/v2/orders/o1",
status_code=204,
)
result = await client.cancel_order("o1")
mock_trading.cancel_order_by_id.assert_called_once_with("o1")
assert result == {"order_id": "o1", "canceled": True}
@pytest.mark.asyncio
async def test_close_position_no_options(client, mock_trading):
order_mock = MagicMock(model_dump=lambda: {"id": "close-1"})
mock_trading.close_position.return_value = order_mock
async def test_cancel_all_orders(httpx_mock: HTTPXMock, client: AlpacaClient):
httpx_mock.add_response(
method="DELETE",
url=f"{PAPER}/v2/orders",
json=[
{"id": "a", "status": 200},
{"id": "b", "status": 200},
],
)
result = await client.cancel_all_orders()
assert len(result) == 2
@pytest.mark.asyncio
async def test_close_position_no_options(
httpx_mock: HTTPXMock, client: AlpacaClient
):
httpx_mock.add_response(
method="DELETE",
url=f"{PAPER}/v2/positions/AAPL",
json={"id": "close-1", "symbol": "AAPL"},
)
result = await client.close_position("AAPL")
assert mock_trading.close_position.called
assert result["id"] == "close-1"
@pytest.mark.asyncio
async def test_get_clock(client, mock_trading):
clock_mock = MagicMock(model_dump=lambda: {"is_open": True, "next_close": "2026-04-21T20:00:00Z"})
mock_trading.get_clock.return_value = clock_mock
async def test_close_position_with_qty(
httpx_mock: HTTPXMock, client: AlpacaClient
):
httpx_mock.add_response(
method="DELETE",
url=f"{PAPER}/v2/positions/AAPL?qty=5.0",
json={"id": "close-2"},
)
result = await client.close_position("AAPL", qty=5.0)
assert result["id"] == "close-2"
@pytest.mark.asyncio
async def test_close_all_positions(
httpx_mock: HTTPXMock, client: AlpacaClient
):
httpx_mock.add_response(
method="DELETE",
url=f"{PAPER}/v2/positions?cancel_orders=true",
json=[{"symbol": "AAPL", "status": 200}],
)
result = await client.close_all_positions(cancel_orders=True)
assert len(result) == 1
assert result[0]["symbol"] == "AAPL"
@pytest.mark.asyncio
async def test_get_clock(httpx_mock: HTTPXMock, client: AlpacaClient):
httpx_mock.add_response(
url=f"{PAPER}/v2/clock",
json={"is_open": True, "next_close": "2026-04-21T20:00:00Z"},
)
result = await client.get_clock()
assert result["is_open"] is True
@pytest.mark.asyncio
async def test_invalid_asset_class(client: AlpacaClient):
with pytest.raises(ValueError, match="invalid asset_class"):
await client.get_ticker("AAPL", asset_class="forex")
@pytest.mark.asyncio
async def test_get_ticker_stocks(httpx_mock: HTTPXMock, client: AlpacaClient):
httpx_mock.add_response(
url=f"{DATA}/v2/stocks/AAPL/trades/latest",
json={
"symbol": "AAPL",
"trade": {"p": 175.50, "s": 100, "t": "2026-04-18T15:30:00Z"},
},
)
httpx_mock.add_response(
url=f"{DATA}/v2/stocks/AAPL/quotes/latest",
json={
"symbol": "AAPL",
"quote": {
"bp": 175.40,
"ap": 175.55,
"bs": 50,
"as": 25,
"t": "2026-04-18T15:30:00Z",
},
},
)
result = await client.get_ticker("AAPL", asset_class="stocks")
assert result["asset_class"] == "stocks"
assert result["last_price"] == 175.50
assert result["bid"] == 175.40
assert result["ask"] == 175.55
@pytest.mark.asyncio
async def test_get_bars_stocks(httpx_mock: HTTPXMock, client: AlpacaClient):
httpx_mock.add_response(
url=re.compile(rf"^{DATA}/v2/stocks/bars\?.*"),
json={
"bars": {
"AAPL": [
{
"t": "2026-04-17T00:00:00Z",
"o": 170.0,
"h": 176.0,
"l": 169.5,
"c": 175.0,
"v": 1000000,
}
]
}
},
)
result = await client.get_bars(
symbol="AAPL",
asset_class="stocks",
interval="1d",
start="2026-04-17T00:00:00",
end="2026-04-18T00:00:00",
limit=10,
)
assert result["symbol"] == "AAPL"
assert result["interval"] == "1d"
assert len(result["bars"]) == 1
assert result["bars"][0]["close"] == 175.0
@pytest.mark.asyncio
async def test_get_bars_unsupported_timeframe(client: AlpacaClient):
with pytest.raises(ValueError, match="unsupported timeframe"):
await client.get_bars(
symbol="AAPL",
asset_class="stocks",
interval="3min",
)
@pytest.mark.asyncio
async def test_get_bars_invalid_asset_class(client: AlpacaClient):
with pytest.raises(ValueError, match="invalid asset_class"):
await client.get_bars(symbol="AAPL", asset_class="forex")
@pytest.mark.asyncio
async def test_get_assets(httpx_mock: HTTPXMock, client: AlpacaClient):
httpx_mock.add_response(
url=re.compile(rf"^{PAPER}/v2/assets\?.*"),
json=[
{"symbol": "AAPL", "tradable": True, "class": "us_equity"},
{"symbol": "GOOG", "tradable": True, "class": "us_equity"},
],
)
result = await client.get_assets(asset_class="stocks", status="active")
assert len(result) == 2
assert result[0]["symbol"] == "AAPL"
@pytest.mark.asyncio
async def test_get_assets_invalid_class(client: AlpacaClient):
with pytest.raises(ValueError, match="invalid asset_class"):
await client.get_assets(asset_class="forex")
@pytest.mark.asyncio
async def test_get_open_orders(httpx_mock: HTTPXMock, client: AlpacaClient):
httpx_mock.add_response(
url=re.compile(rf"^{PAPER}/v2/orders\?.*"),
json=[{"id": "o1", "status": "open", "symbol": "AAPL"}],
)
result = await client.get_open_orders(limit=10)
assert len(result) == 1
assert result[0]["id"] == "o1"
@pytest.mark.asyncio
async def test_amend_order(httpx_mock: HTTPXMock, client: AlpacaClient):
httpx_mock.add_response(
method="PATCH",
url=f"{PAPER}/v2/orders/o1",
json={"id": "o1", "qty": "5", "limit_price": "180.0"},
)
result = await client.amend_order(
"o1", qty=5, limit_price=180.0, tif="gtc"
)
assert result["id"] == "o1"
req = httpx_mock.get_requests()[0]
import json as _j
body = _j.loads(req.content)
assert body["qty"] == "5"
assert body["limit_price"] == "180.0"
assert body["time_in_force"] == "gtc"
@pytest.mark.asyncio
async def test_get_calendar(httpx_mock: HTTPXMock, client: AlpacaClient):
httpx_mock.add_response(
url=re.compile(rf"^{PAPER}/v2/calendar.*"),
json=[{"date": "2026-04-20", "open": "09:30", "close": "16:00"}],
)
result = await client.get_calendar(start="2026-04-20", end="2026-04-20")
assert len(result) == 1
assert result[0]["date"] == "2026-04-20"
@pytest.mark.asyncio
async def test_get_calendar_no_filters(
httpx_mock: HTTPXMock, client: AlpacaClient
):
httpx_mock.add_response(
url=f"{PAPER}/v2/calendar",
json=[{"date": "2026-04-20"}],
)
result = await client.get_calendar()
assert len(result) == 1
@pytest.mark.asyncio
async def test_get_snapshot(httpx_mock: HTTPXMock, client: AlpacaClient):
httpx_mock.add_response(
url=re.compile(rf"^{DATA}/v2/stocks/snapshots\?.*"),
json={
"AAPL": {
"latestTrade": {"p": 175.0},
"latestQuote": {"bp": 174.9, "ap": 175.1},
}
},
)
result = await client.get_snapshot("AAPL")
assert result["latestTrade"]["p"] == 175.0
@pytest.mark.asyncio
async def test_get_option_chain(httpx_mock: HTTPXMock, client: AlpacaClient):
httpx_mock.add_response(
url=re.compile(rf"^{DATA}/v1beta1/options/snapshots/AAPL.*"),
json={
"snapshots": {
"AAPL250620C00200000": {
"latestQuote": {"bp": 1.20, "ap": 1.30}
}
}
},
)
result = await client.get_option_chain("AAPL", expiry="2026-06-20")
assert result["underlying"] == "AAPL"
assert result["expiry"] == "2026-06-20"
assert "AAPL250620C00200000" in result["contracts"]
@pytest.mark.asyncio
async def test_get_activities(httpx_mock: HTTPXMock, client: AlpacaClient):
httpx_mock.add_response(
url=re.compile(rf"^{PAPER}/v2/account/activities.*"),
json=[
{"id": "1", "activity_type": "FILL"},
{"id": "2", "activity_type": "TRANS"},
],
)
result = await client.get_activities(limit=10)
assert len(result) == 2
@pytest.mark.asyncio
async def test_aclose_idempotent(client: AlpacaClient):
await client.aclose()
await client.aclose()  # must not raise
+5 -8
@@ -1,21 +1,18 @@
from __future__ import annotations
import pytest
from cerbero_mcp.exchanges.bybit.client import BybitClient
@pytest.fixture
def client():
"""BybitClient with testnet base_url and internal AsyncClient.
pytest-httpx intercepts the calls of the httpx AsyncClient created by the
constructor (auto-mock), so no explicit injection is needed.
"""
return BybitClient(
api_key="test_key",
api_secret="test_secret",
testnet=True,
)
File diff suppressed because it is too large
+275 -17
@@ -6,12 +6,16 @@ import pytest
from cerbero_mcp.exchanges.hyperliquid.client import HyperliquidClient
from pytest_httpx import HTTPXMock
# Fixed private key: makes the EIP-712 signature deterministic for the write tests.
DUMMY_PRIVATE_KEY = "0x" + "01" * 32
DUMMY_WALLET = "0x1a642f0E3c3aF545E7AcBD38b07251B3990914F1" # derived from key above
@pytest.fixture
def client():
return HyperliquidClient(
wallet_address=DUMMY_WALLET,
private_key=DUMMY_PRIVATE_KEY,
testnet=True,
)
@@ -41,6 +45,13 @@ META_AND_CTX = [
],
]
META = {
"universe": [
{"name": "BTC", "maxLeverage": 50},
{"name": "ETH", "maxLeverage": 25},
]
}
CLEARINGHOUSE_STATE = {
"marginSummary": {
"accountValue": "1500.0",
@@ -65,6 +76,9 @@ CLEARINGHOUSE_STATE = {
SPOT_STATE = {"balances": [{"coin": "USDC", "total": "500.0"}]}
# ── Read endpoints ─────────────────────────────────────────────
@pytest.mark.asyncio
async def test_get_markets(httpx_mock: HTTPXMock, client: HyperliquidClient):
httpx_mock.add_response(
@@ -209,19 +223,263 @@ async def test_health_ok(httpx_mock: HTTPXMock, client: HyperliquidClient):
assert result["testnet"] is True
# ── Write endpoints (signed via EIP-712) ───────────────────────
@pytest.mark.asyncio
async def test_place_order_limit(httpx_mock: HTTPXMock, client: HyperliquidClient):
"""Limit order: signs and POSTs to /exchange with correct payload shape."""
# 1. /info type=meta for the asset id
httpx_mock.add_response(
url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/info"),
json=META,
)
# 2. signed /exchange
httpx_mock.add_response(
url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/exchange"),
json={
"status": "ok",
"response": {
"type": "order",
"data": {
"statuses": [{"resting": {"oid": 9999}}],
},
},
},
)
result = await client.place_order(
instrument="BTC", side="buy", amount=0.01, type="limit", price=50000.0
)
# Verify the normalized response shape
assert result["status"] == "ok"
assert result["order_id"] == 9999
# Verify the request body of the POST /exchange
requests = [r for r in httpx_mock.get_requests() if r.url.path == "/exchange"]
assert len(requests) == 1
import json as _json
body = _json.loads(requests[0].content)
assert body["nonce"] > 0
assert body["vaultAddress"] is None
assert body["expiresAfter"] is None
assert body["action"]["type"] == "order"
assert body["action"]["grouping"] == "na"
assert len(body["action"]["orders"]) == 1
order = body["action"]["orders"][0]
assert order["a"] == 0  # BTC is index 0 in META
assert order["b"] is True # buy
assert order["p"] == "50000"
assert order["s"] == "0.01"
assert order["r"] is False
assert order["t"] == {"limit": {"tif": "Gtc"}}
sig = body["signature"]
assert set(sig.keys()) == {"r", "s", "v"}
assert sig["r"].startswith("0x") and len(sig["r"]) == 66
assert sig["s"].startswith("0x") and len(sig["s"]) == 66
assert sig["v"] in (27, 28)
@pytest.mark.asyncio
async def test_place_order_market(httpx_mock: HTTPXMock, client: HyperliquidClient):
"""Market order: uses mark_price + buffer and tif=Ioc."""
# market path: get_ticker → meta+ctxs, then meta for the asset id, then /exchange
httpx_mock.add_response(
url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/info"),
json=META_AND_CTX,
)
httpx_mock.add_response(
url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/info"),
json=META,
)
httpx_mock.add_response(
url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/exchange"),
json={
"status": "ok",
"response": {
"type": "order",
"data": {
"statuses": [{"filled": {"oid": 1, "totalSz": "0.01", "avgPx": "51500"}}],
},
},
},
)
result = await client.place_order(
instrument="BTC", side="buy", amount=0.01, type="market"
)
assert result["status"] == "ok"
assert result["filled_size"] == 0.01
import json as _json
requests = [r for r in httpx_mock.get_requests() if r.url.path == "/exchange"]
assert len(requests) == 1
body = _json.loads(requests[0].content)
order = body["action"]["orders"][0]
assert order["t"] == {"limit": {"tif": "Ioc"}}
@pytest.mark.asyncio
async def test_place_order_stop_loss(httpx_mock: HTTPXMock, client: HyperliquidClient):
"""Stop-loss: uses a trigger order type."""
httpx_mock.add_response(
url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/info"),
json=META,
)
httpx_mock.add_response(
url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/exchange"),
json={
"status": "ok",
"response": {"type": "order", "data": {"statuses": [{"resting": {"oid": 7}}]}},
},
)
result = await client.place_order(
instrument="BTC",
side="sell",
amount=0.01,
type="stop_loss",
price=45000.0,
reduce_only=True,
)
assert result["status"] == "ok"
assert result["order_id"] == 7
import json as _json
requests = [r for r in httpx_mock.get_requests() if r.url.path == "/exchange"]
body = _json.loads(requests[0].content)
order = body["action"]["orders"][0]
assert order["r"] is True
assert order["t"] == {
"trigger": {"isMarket": True, "triggerPx": "45000", "tpsl": "sl"}
}
@pytest.mark.asyncio
async def test_place_order_unknown_asset(httpx_mock: HTTPXMock, client: HyperliquidClient):
"""Unknown asset → error dict, no POST /exchange."""
httpx_mock.add_response(
url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/info"),
json=META,
)
result = await client.place_order(
instrument="DOGE", side="buy", amount=1.0, type="limit", price=0.1
)
assert "error" in result
assert "DOGE" in result["error"]
@pytest.mark.asyncio
async def test_cancel_order(httpx_mock: HTTPXMock, client: HyperliquidClient):
"""Cancel: action.type=cancel with asset id + oid."""
httpx_mock.add_response(
url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/info"),
json=META,
)
httpx_mock.add_response(
url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/exchange"),
json={"status": "ok", "response": {"type": "cancel", "data": {"statuses": ["success"]}}},
)
result = await client.cancel_order("12345", "BTC")
assert result["status"] == "ok"
assert result["order_id"] == "12345"
import json as _json
requests = [r for r in httpx_mock.get_requests() if r.url.path == "/exchange"]
body = _json.loads(requests[0].content)
assert body["action"]["type"] == "cancel"
assert body["action"]["cancels"] == [{"a": 0, "o": 12345}]
assert "r" in body["signature"] and "s" in body["signature"] and "v" in body["signature"]
@pytest.mark.asyncio
async def test_close_position(httpx_mock: HTTPXMock, client: HyperliquidClient):
"""close_position: reads state, computes slippage, places an IOC reduce-only order."""
# 1. clearinghouseState for the position direction
httpx_mock.add_response(
url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/info"),
json=CLEARINGHOUSE_STATE,
)
# 2. get_ticker → metaAndAssetCtxs
httpx_mock.add_response(
url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/info"),
json=META_AND_CTX,
)
# 3. meta for the asset id
httpx_mock.add_response(
url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/info"),
json=META,
)
# 4. /exchange
httpx_mock.add_response(
url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/exchange"),
json={
"status": "ok",
"response": {
"type": "order",
"data": {
"statuses": [{"filled": {"oid": 5, "totalSz": "0.1", "avgPx": "47500"}}],
},
},
},
)
result = await client.close_position("BTC")
assert result["status"] == "ok"
assert result["asset"] == "BTC"
import json as _json
requests = [r for r in httpx_mock.get_requests() if r.url.path == "/exchange"]
body = _json.loads(requests[0].content)
order = body["action"]["orders"][0]
# Long position → side=sell to close
assert order["b"] is False
assert order["r"] is True
@pytest.mark.asyncio
async def test_close_position_no_position(
httpx_mock: HTTPXMock, client: HyperliquidClient
):
"""close_position with no open position → error dict."""
httpx_mock.add_response(
url=re.compile(r"https://api\.hyperliquid-testnet\.xyz/info"),
json={"assetPositions": [], "marginSummary": {}},
)
result = await client.close_position("BTC")
assert "error" in result
assert result["asset"] == "BTC"
@pytest.mark.asyncio
async def test_signing_parity_with_canonical_sdk(client: HyperliquidClient):
"""Sanity: the EIP-712 signature produced is bit-for-bit identical to the
one the official ``hyperliquid-python-sdk`` would generate for the same input.
Isolated test (no httpx) to guarantee that removing the runtime SDK does not
introduce signing regressions.
"""
from cerbero_mcp.exchanges.hyperliquid.client import _sign_l1_action
action = {"type": "cancel", "cancels": [{"a": 0, "o": 12345}]}
nonce = 1700000000000
sig = _sign_l1_action(DUMMY_PRIVATE_KEY, action, None, nonce, None, False)
assert sig == {
"r": "0xab1150f8d695e015a07e3f79983a0a2a4e58dedec071dfa4177a0761f37e0485",
"s": "0x208cb6370e5e56a3cefa451538c1e0096b70777d2bde172c7afb1e77c4d28d20",
"v": 28,
}
@pytest.mark.asyncio
async def test_aclose_idempotent(client: HyperliquidClient):
"""``aclose`` can be called even without an active http client."""
await client.aclose()
await client.aclose()
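The body assertions in these write tests (`order["p"] == "50000"`, `order["s"] == "0.01"`) imply that prices and sizes travel as trailing-zero-free decimal strings. A minimal sketch of such a normalizer, under that assumption — `to_wire` is a hypothetical name, not the client's actual helper:

```python
def to_wire(x: float) -> str:
    """Format a float as a wire string with no scientific notation,
    no trailing zeros, and no trailing dot ("50000", "0.01")."""
    # Hypothetical helper inferred from the test assertions above.
    s = f"{x:.8f}".rstrip("0").rstrip(".")
    return s or "0"

print(to_wire(50000.0))  # 50000
print(to_wire(0.01))     # 0.01
```

Formatting through fixed-point first (`:.8f`) avoids `repr`-style artifacts like `5e-05` for small sizes.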
+155
@@ -0,0 +1,155 @@
from __future__ import annotations
import json
from pathlib import Path
import pytest
from fastapi.testclient import TestClient
@pytest.fixture
def tmp_audit_file(tmp_path, monkeypatch):
file_path = tmp_path / "audit.jsonl"
monkeypatch.setenv("AUDIT_LOG_FILE", str(file_path))
return file_path
@pytest.fixture
def app(monkeypatch, tmp_audit_file):
from tests.unit.test_settings import _minimal_env
for k, v in _minimal_env().items():
monkeypatch.setenv(k, v)
from cerbero_mcp.__main__ import _make_app
from cerbero_mcp.settings import Settings
return _make_app(Settings())
def _write_records(file_path: Path, records: list[dict]) -> None:
with file_path.open("w") as f:
for r in records:
f.write(json.dumps(r) + "\n")
def _bearer_test():
return {"Authorization": "Bearer t_test_123"}
def test_admin_audit_no_file(app):
"""Without AUDIT_LOG_FILE set, returns empty + a warning."""
import os
os.environ.pop("AUDIT_LOG_FILE", None)
c = TestClient(app)
r = c.get("/admin/audit", headers=_bearer_test())
assert r.status_code == 200
body = r.json()
assert body["count"] == 0
assert "warning" in body
def test_admin_audit_no_bearer_returns_401(app):
c = TestClient(app)
r = c.get("/admin/audit")
assert r.status_code == 401
def test_admin_audit_no_bot_tag_required(app, tmp_audit_file):
"""The admin endpoint does NOT require X-Bot-Tag (bearer only)."""
_write_records(tmp_audit_file, [])
c = TestClient(app)
r = c.get("/admin/audit", headers=_bearer_test())
assert r.status_code == 200
def test_admin_audit_returns_records(app, tmp_audit_file):
records = [
{
"audit_event": "write_op",
"asctime": "2026-05-01 10:00:00,000",
"actor": "testnet", "bot_tag": "alpha",
"exchange": "deribit", "action": "place_order",
"target": "BTC-PERPETUAL",
"payload": {"qty": 0.1},
"result": {"order_id": "abc"},
},
{
"audit_event": "write_op",
"asctime": "2026-05-01 11:00:00,000",
"actor": "mainnet", "bot_tag": "beta",
"exchange": "bybit", "action": "cancel_order",
"target": "ord-1",
"payload": {},
},
]
_write_records(tmp_audit_file, records)
c = TestClient(app)
r = c.get("/admin/audit", headers=_bearer_test())
assert r.status_code == 200
body = r.json()
assert body["count"] == 2
def test_admin_audit_filter_by_actor(app, tmp_audit_file):
records = [
{"audit_event": "write_op", "asctime": "2026-05-01 10:00:00,000",
"actor": "testnet", "bot_tag": "a", "exchange": "deribit", "action": "place_order"},
{"audit_event": "write_op", "asctime": "2026-05-01 11:00:00,000",
"actor": "mainnet", "bot_tag": "b", "exchange": "bybit", "action": "place_order"},
]
_write_records(tmp_audit_file, records)
c = TestClient(app)
r = c.get("/admin/audit?actor=mainnet", headers=_bearer_test())
assert r.status_code == 200
body = r.json()
assert body["count"] == 1
assert body["records"][0]["actor"] == "mainnet"
def test_admin_audit_filter_by_date_range(app, tmp_audit_file):
records = [
{"audit_event": "write_op", "asctime": "2026-04-30 10:00:00,000",
"actor": "testnet", "exchange": "deribit", "action": "place_order"},
{"audit_event": "write_op", "asctime": "2026-05-01 10:00:00,000",
"actor": "testnet", "exchange": "deribit", "action": "place_order"},
{"audit_event": "write_op", "asctime": "2026-05-02 10:00:00,000",
"actor": "testnet", "exchange": "deribit", "action": "place_order"},
]
_write_records(tmp_audit_file, records)
c = TestClient(app)
r = c.get("/admin/audit?from=2026-05-01&to=2026-05-01T23:59:59", headers=_bearer_test())
assert r.status_code == 200
assert r.json()["count"] == 1
def test_admin_audit_filter_by_bot_tag(app, tmp_audit_file):
records = [
{"audit_event": "write_op", "asctime": "2026-05-01 10:00:00,000",
"actor": "testnet", "bot_tag": "alpha", "exchange": "deribit", "action": "place_order"},
{"audit_event": "write_op", "asctime": "2026-05-01 11:00:00,000",
"actor": "testnet", "bot_tag": "beta", "exchange": "deribit", "action": "place_order"},
]
_write_records(tmp_audit_file, records)
c = TestClient(app)
r = c.get("/admin/audit?bot_tag=alpha", headers=_bearer_test())
assert r.status_code == 200
assert r.json()["count"] == 1
assert r.json()["records"][0]["bot_tag"] == "alpha"
def test_admin_audit_invalid_date(app, tmp_audit_file):
_write_records(tmp_audit_file, [])
c = TestClient(app)
r = c.get("/admin/audit?from=not-a-date", headers=_bearer_test())
assert r.status_code == 400
def test_admin_audit_limit(app, tmp_audit_file):
records = [
{"audit_event": "write_op", "asctime": f"2026-05-01 10:{i:02d}:00,000",
"actor": "testnet", "exchange": "deribit", "action": "place_order"}
for i in range(50)
]
_write_records(tmp_audit_file, records)
c = TestClient(app)
r = c.get("/admin/audit?limit=10", headers=_bearer_test())
assert r.status_code == 200
assert r.json()["count"] == 10
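The filters exercised above (actor, bot_tag, date range, limit) amount to record-by-record predicate matching over the JSONL file. A minimal sketch under that assumption — `filter_audit` is illustrative, not the endpoint's real implementation:

```python
from datetime import datetime

def filter_audit(records, actor=None, bot_tag=None,
                 dt_from=None, dt_to=None, limit=100):
    """Filter audit records the way /admin/audit is tested to behave.
    `asctime` uses the logging format '2026-05-01 10:00:00,000'."""
    out = []
    for r in records:
        if actor and r.get("actor") != actor:
            continue
        if bot_tag and r.get("bot_tag") != bot_tag:
            continue
        ts = datetime.strptime(r["asctime"], "%Y-%m-%d %H:%M:%S,%f")
        if dt_from and ts < dt_from:
            continue
        if dt_to and ts > dt_to:
            continue
        out.append(r)
    return out[:limit]

records = [
    {"asctime": "2026-04-30 10:00:00,000", "actor": "testnet"},
    {"asctime": "2026-05-01 10:00:00,000", "actor": "mainnet"},
]
hit = filter_audit(records, actor="mainnet", dt_from=datetime(2026, 5, 1))
print(len(hit))  # 1
```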
+78 -2
@@ -67,7 +67,10 @@ def test_testnet_token_sets_env_testnet():
return {"env": request.state.environment}
c = TestClient(fa)
r = c.get("/mcp-deribit/peek", headers={"Authorization": "Bearer tk_test"})
r = c.get(
"/mcp-deribit/peek",
headers={"Authorization": "Bearer tk_test", "X-Bot-Tag": "test-bot"},
)
assert r.status_code == 200
assert r.json() == {"env": "testnet"}
@@ -83,7 +86,10 @@ def test_mainnet_token_sets_env_mainnet():
return {"env": request.state.environment}
c = TestClient(fa)
r = c.get("/mcp-deribit/peek", headers={"Authorization": "Bearer tk_live"})
r = c.get(
"/mcp-deribit/peek",
headers={"Authorization": "Bearer tk_live", "X-Bot-Tag": "test-bot"},
)
assert r.status_code == 200
assert r.json() == {"env": "mainnet"}
@@ -96,3 +102,73 @@ def test_uses_compare_digest():
src = inspect.getsource(auth)
assert "compare_digest" in src, "auth.py must use secrets.compare_digest"
# ── X-Bot-Tag header ─────────────────────────────────────────────────────────
def test_missing_bot_tag_returns_400():
from cerbero_mcp.auth import install_auth_middleware
fa = FastAPI()
install_auth_middleware(fa, testnet_token="t", mainnet_token="m")
@fa.get("/mcp-deribit/health")
def h():
return {"ok": True}
c = TestClient(fa)
r = c.get("/mcp-deribit/health", headers={"Authorization": "Bearer t"})
assert r.status_code == 400
assert "X-Bot-Tag" in r.json()["error"]["message"]
def test_bot_tag_accepted_and_set_on_state():
from cerbero_mcp.auth import install_auth_middleware
fa = FastAPI()
install_auth_middleware(fa, testnet_token="t", mainnet_token="m")
@fa.get("/mcp-deribit/peek")
def peek(request: Request):
return {
"env": request.state.environment,
"bot_tag": request.state.bot_tag,
}
c = TestClient(fa)
r = c.get(
"/mcp-deribit/peek",
headers={"Authorization": "Bearer t", "X-Bot-Tag": "scanner-alpha"},
)
assert r.status_code == 200
assert r.json() == {"env": "testnet", "bot_tag": "scanner-alpha"}
def test_bot_tag_too_long_returns_400():
from cerbero_mcp.auth import install_auth_middleware
fa = FastAPI()
install_auth_middleware(fa, testnet_token="t", mainnet_token="m")
@fa.get("/mcp-deribit/health")
def h():
return {"ok": True}
c = TestClient(fa)
r = c.get(
"/mcp-deribit/health",
headers={"Authorization": "Bearer t", "X-Bot-Tag": "x" * 65},
)
assert r.status_code == 400
def test_bot_tag_not_required_on_health():
"""The health endpoint must stay without auth and without a bot tag."""
from cerbero_mcp.auth import install_auth_middleware
fa = FastAPI()
install_auth_middleware(fa, testnet_token="t", mainnet_token="m")
@fa.get("/health")
def h():
return {"ok": True}
c = TestClient(fa)
r = c.get("/health")
assert r.status_code == 200
+20 -32
@@ -29,15 +29,8 @@ async def test_build_client_bybit_returns_correct_env(monkeypatch):
for k, v in _minimal_env().items():
monkeypatch.setenv(k, v)
# BybitClient builds an httpx.AsyncClient internally: no real
# connection happens until a network method is invoked.
from cerbero_mcp.exchanges import build_client
from cerbero_mcp.settings import Settings
@@ -78,28 +71,22 @@ async def test_build_client_alpaca_returns_correct_env(monkeypatch):
for k, v in _minimal_env().items():
monkeypatch.setenv(k, v)
# AlpacaClient (V2) uses plain httpx: the constructor opens no real
# connections (httpx.AsyncClient is lazy until the first request), so
# no SDK stub is needed.
from cerbero_mcp.exchanges import build_client
from cerbero_mcp.settings import Settings
s = Settings()
c_test = await build_client(s, "alpaca", "testnet")
c_live = await build_client(s, "alpaca", "mainnet")
try:
assert c_test is not c_live
assert c_test.paper is True
assert c_live.paper is False
finally:
await c_test.aclose()
await c_live.aclose()
@pytest.mark.asyncio
@@ -202,8 +189,8 @@ async def test_hyperliquid_url_from_env_overrides_default(monkeypatch):
@pytest.mark.asyncio
async def test_bybit_url_from_env_overrides_default(monkeypatch):
"""Bybit (httpx): the BYBIT_URL_TESTNET override applies directly to
`self.base_url`, used as the base of every REST V5 request."""
from tests.unit.test_settings import _minimal_env
env = _minimal_env(BYBIT_URL_TESTNET="https://bybit-custom.example.com")
@@ -216,14 +203,12 @@ async def test_bybit_url_from_env_overrides_default(monkeypatch):
s = Settings()
c = await build_client(s, "bybit", "testnet")
assert c.base_url == "https://bybit-custom.example.com"
@pytest.mark.asyncio
async def test_alpaca_url_from_env_overrides_default(monkeypatch):
"""Alpaca V2 (httpx): the `base_url` override applies only to the trading
endpoint; data endpoints (data.alpaca.markets) remain hardcoded."""
from tests.unit.test_settings import _minimal_env
env = _minimal_env(ALPACA_URL_TESTNET="https://alpaca-custom.example.com")
@@ -235,7 +220,10 @@ async def test_alpaca_url_from_env_overrides_default(monkeypatch):
s = Settings()
c = await build_client(s, "alpaca", "testnet")
try:
assert c.base_url == "https://alpaca-custom.example.com"
finally:
await c.aclose()
@pytest.mark.asyncio
+121
@@ -0,0 +1,121 @@
from __future__ import annotations
import logging
import pytest
from fastapi.testclient import TestClient
from starlette.requests import Request as StarletteRequest
@pytest.fixture
def app(monkeypatch):
from tests.unit.test_settings import _minimal_env
for k, v in _minimal_env().items():
monkeypatch.setenv(k, v)
from cerbero_mcp.__main__ import _make_app
from cerbero_mcp.settings import Settings
return _make_app(Settings())
@pytest.fixture
def request_log_caplog(caplog):
"""Caplog for the mcp.request logger: attaches the caplog handler directly
to the logger, bypassing the ``propagate=False`` set by common/logging.py.
"""
lg = logging.getLogger("mcp.request")
lg.addHandler(caplog.handler)
lg.setLevel(logging.INFO)
caplog.set_level(logging.INFO, logger="mcp.request")
try:
yield caplog
finally:
lg.removeHandler(caplog.handler)
def _request_records(caplog):
return [rec for rec in caplog.records if rec.name == "mcp.request"]
def test_request_log_emits_for_health(app, request_log_caplog):
c = TestClient(app)
r = c.get("/health")
assert r.status_code == 200
records = _request_records(request_log_caplog)
assert any(getattr(rec, "path", None) == "/health" for rec in records)
def test_request_log_includes_request_id(app, request_log_caplog):
c = TestClient(app)
c.get("/health")
records = _request_records(request_log_caplog)
assert records, "expected at least one mcp.request record"
for rec in records:
rid = getattr(rec, "request_id", None)
assert rid and isinstance(rid, str) and len(rid) >= 16
def test_request_log_includes_method_status_duration(app, request_log_caplog):
c = TestClient(app)
c.get("/health")
records = _request_records(request_log_caplog)
rec = next(rec for rec in records if getattr(rec, "path", None) == "/health")
assert getattr(rec, "method", None) == "GET"
assert getattr(rec, "status_code", None) == 200
assert isinstance(getattr(rec, "duration_ms", None), int | float)
def test_request_log_includes_actor_and_bot_tag_on_protected(
app, request_log_caplog
):
"""On an authenticated path, actor/bot_tag/exchange/tool are propagated."""
c = TestClient(app)
c.post(
"/mcp-deribit/tools/is_testnet",
headers={
"Authorization": "Bearer t_test_123",
"X-Bot-Tag": "scanner-x",
},
json={},
)
records = _request_records(request_log_caplog)
rec = next(
rec
for rec in records
if getattr(rec, "path", None) == "/mcp-deribit/tools/is_testnet"
)
assert getattr(rec, "actor", None) == "testnet"
assert getattr(rec, "bot_tag", None) == "scanner-x"
assert getattr(rec, "exchange", None) == "deribit"
assert getattr(rec, "tool", None) == "is_testnet"
def test_request_log_unauthorized_does_not_have_actor(
app, request_log_caplog
):
"""Without a bearer, the request log still emits, but without actor/bot_tag."""
c = TestClient(app)
c.post("/mcp-deribit/tools/is_testnet", json={})
records = _request_records(request_log_caplog)
rec = next(
rec
for rec in records
if getattr(rec, "path", None) == "/mcp-deribit/tools/is_testnet"
)
assert getattr(rec, "status_code", None) == 401
assert getattr(rec, "actor", None) is None
assert getattr(rec, "exchange", None) == "deribit"
def test_request_id_in_state_for_handlers(app):
"""Verify that request.state.request_id is available to handlers."""
@app.get("/__test_state")
def _state_handler(request: StarletteRequest) -> dict:
return {"rid": request.state.request_id}
c = TestClient(app)
r = c.get(
"/__test_state",
headers={"Authorization": "Bearer t_test_123", "X-Bot-Tag": "x"},
)
assert r.status_code == 200, f"got {r.status_code}: {r.text[:500]}"
assert r.json()["rid"]
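A one-JSON-line-per-request record with the fields asserted above (request_id, method, path, status_code, duration_ms, actor, bot_tag) could look like this sketch; the serialization details are assumptions, only the field set comes from the tests:

```python
import json
import time
import uuid

def request_log_line(method: str, path: str, status_code: int,
                     started: float, *, actor=None, bot_tag=None) -> str:
    """One JSON line per HTTP request, mirroring the fields the
    mcp.request middleware tests assert on (names from the tests,
    encoding is an illustrative assumption)."""
    return json.dumps({
        "request_id": uuid.uuid4().hex,  # 32 hex chars, satisfies len >= 16
        "method": method,
        "path": path,
        "status_code": status_code,
        "duration_ms": round((time.monotonic() - started) * 1000, 2),
        "actor": actor,
        "bot_tag": bot_tag,
    })

t0 = time.monotonic()
rec = json.loads(request_log_line("GET", "/health", 200, t0))
print(rec["method"], rec["status_code"])  # GET 200
```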
+120
@@ -60,3 +60,123 @@ def test_x_duration_ms_header(app):
c = TestClient(app)
r = c.get("/health")
assert "X-Duration-Ms" in r.headers
def test_health_ready_empty_registry(app):
"""Without a registry, readiness returns not_ready but HTTP 200."""
c = TestClient(app)
r = c.get("/health/ready")
assert r.status_code == 200
j = r.json()
assert j["status"] == "not_ready"
assert j["clients"] == []
assert j["version"] == "2.0.0"
def test_health_ready_all_healthy(app):
"""Registry with healthy stub clients → status=ready."""
from cerbero_mcp.client_registry import ClientRegistry
class _StubOk:
async def health(self):
return {"status": "ok"}
async def _builder(exchange, env):  # pragma: no cover - not called
return _StubOk()
reg = ClientRegistry(builder=_builder)
reg._clients[("deribit", "testnet")] = _StubOk()
reg._clients[("bybit", "mainnet")] = _StubOk()
app.state.registry = reg
c = TestClient(app)
r = c.get("/health/ready")
assert r.status_code == 200
j = r.json()
assert j["status"] == "ready"
assert len(j["clients"]) == 2
for entry in j["clients"]:
assert entry["healthy"] is True
assert "duration_ms" in entry
def test_health_ready_degraded_on_error(app):
"""Registry with at least one client that raises → status=degraded."""
from cerbero_mcp.client_registry import ClientRegistry
class _StubOk:
async def health(self):
return {"status": "ok"}
class _StubFail:
async def health(self):
raise RuntimeError("boom")
async def _builder(exchange, env):  # pragma: no cover - not called
return _StubOk()
reg = ClientRegistry(builder=_builder)
reg._clients[("deribit", "testnet")] = _StubOk()
reg._clients[("bybit", "mainnet")] = _StubFail()
app.state.registry = reg
c = TestClient(app)
r = c.get("/health/ready")
assert r.status_code == 200
j = r.json()
assert j["status"] == "degraded"
fail = next(c for c in j["clients"] if c["exchange"] == "bybit")
assert fail["healthy"] is False
assert "RuntimeError" in fail["error"]
def test_health_ready_503_when_fail_on_degraded(app, monkeypatch):
    """READY_FAILS_ON_DEGRADED=true → HTTP 503 when degraded."""
    from cerbero_mcp.client_registry import ClientRegistry

    class _StubFail:
        async def health(self):
            raise RuntimeError("boom")

    async def _builder(exchange, env):  # pragma: no cover - not called
        return _StubFail()

    reg = ClientRegistry(builder=_builder)
    reg._clients[("deribit", "testnet")] = _StubFail()
    app.state.registry = reg
    monkeypatch.setenv("READY_FAILS_ON_DEGRADED", "true")
    c = TestClient(app)
    r = c.get("/health/ready")
    assert r.status_code == 503
    assert r.json()["status"] == "degraded"
def test_health_ready_no_probe_method(app):
    """Client without health/is_testnet → marked healthy with a note."""
    from cerbero_mcp.client_registry import ClientRegistry

    class _StubBare:
        pass

    async def _builder(exchange, env):  # pragma: no cover - not called
        return _StubBare()

    reg = ClientRegistry(builder=_builder)
    reg._clients[("foo", "testnet")] = _StubBare()
    app.state.registry = reg
    c = TestClient(app)
    r = c.get("/health/ready")
    assert r.status_code == 200
    j = r.json()
    assert j["status"] == "ready"
    assert j["clients"][0]["note"] == "no probe method"
def test_health_ready_in_whitelist_no_auth(app):
    """/health/ready does not require a bearer token."""
    c = TestClient(app)
    # No Authorization header → 200 (whitelist)
    r = c.get("/health/ready")
    assert r.status_code == 200
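The aggregation behavior these tests pin down (per-client probe, 2s timeout, ready/degraded/not_ready classification, a note for clients with no probe method) can be sketched as a standalone helper. This is an illustrative reconstruction, not the actual `cerbero_mcp` code — the function name `probe_clients` and the dict shapes are assumptions inferred from the assertions above:

```python
import asyncio

async def probe_clients(clients: dict) -> dict:
    """Probe each (exchange, env) -> client entry and aggregate a status.

    Illustrative sketch of the readiness logic exercised by the tests:
    - no clients registered        -> "not_ready"
    - all probes succeed           -> "ready"
    - at least one probe fails     -> "degraded"
    """
    entries, any_fail = [], False
    for (exchange, env), client in clients.items():
        probe = getattr(client, "health", None)
        if probe is None:
            # Client exposes no probe method: assume healthy, record a note.
            entries.append({"exchange": exchange, "env": env,
                            "healthy": True, "note": "no probe method"})
            continue
        try:
            # Cap each probe at 2 seconds so one slow client
            # cannot stall the whole readiness check.
            await asyncio.wait_for(probe(), timeout=2.0)
            entries.append({"exchange": exchange, "env": env, "healthy": True})
        except Exception as exc:
            any_fail = True
            entries.append({"exchange": exchange, "env": env, "healthy": False,
                            "error": f"{type(exc).__name__}: {exc}"})
    if not entries:
        status = "not_ready"
    elif any_fail:
        status = "degraded"
    else:
        status = "ready"
    return {"status": status, "clients": entries}
```

The key design point the tests encode: a degraded backend still answers HTTP 200 by default, so orchestrators only pull the instance out of rotation when `READY_FAILS_ON_DEGRADED` explicitly opts into 503s.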
@@ -13,24 +13,6 @@ resolution-markers = [
"python_full_version < '3.14' and sys_platform != 'emscripten' and sys_platform != 'win32'",
]
[[package]]
name = "alpaca-py"
version = "0.43.4"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "msgpack" },
{ name = "pandas" },
{ name = "pydantic" },
{ name = "pytz" },
{ name = "requests" },
{ name = "sseclient-py" },
{ name = "websockets" },
]
sdist = { url = "https://files.pythonhosted.org/packages/2f/9d/3003f661c15b8003655c447c187aec10f0843647e5c98b391701b04ac3d8/alpaca_py-0.43.4.tar.gz", hash = "sha256:7d529b3654d4e817d9fd7ab461131c4f06a315c736b6a9e4a87d5406bb71114a", size = 97990, upload-time = "2026-04-29T08:41:48.775Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/3a/d5/1f57cc03e7b5925a927cb7f8e7ee5f873e22632633778d28d5d23681c871/alpaca_py-0.43.4-py3-none-any.whl", hash = "sha256:dd49ac30e0f2a8f38550ef1f27a58e7fd8f3f3875deaa4e757443cdbd033a1b4", size = 122534, upload-time = "2026-04-29T08:41:50.149Z" },
]
[[package]]
name = "annotated-doc"
version = "0.0.4"
@@ -140,13 +122,13 @@ name = "cerbero-mcp"
version = "2.0.0"
source = { editable = "." }
dependencies = [
{ name = "alpaca-py" },
{ name = "eth-account" },
{ name = "eth-utils" },
{ name = "fastapi" },
{ name = "httpx" },
{ name = "hyperliquid-python-sdk" },
{ name = "msgpack" },
{ name = "numpy" },
{ name = "pandas" },
{ name = "pybit" },
{ name = "pydantic" },
{ name = "pydantic-settings" },
{ name = "python-json-logger" },
@@ -167,13 +149,13 @@ dev = [
[package.metadata]
requires-dist = [
{ name = "alpaca-py", specifier = ">=0.30" },
{ name = "eth-account", specifier = ">=0.13.7" },
{ name = "eth-utils", specifier = ">=5.3.1" },
{ name = "fastapi", specifier = ">=0.115" },
{ name = "httpx", specifier = ">=0.27" },
{ name = "hyperliquid-python-sdk", specifier = ">=0.6" },
{ name = "msgpack", specifier = ">=1.1.2" },
{ name = "numpy", specifier = ">=1.26" },
{ name = "pandas", specifier = ">=2.2" },
{ name = "pybit", specifier = ">=5.7" },
{ name = "pydantic", specifier = ">=2.9" },
{ name = "pydantic-settings", specifier = ">=2.6" },
{ name = "python-json-logger", specifier = ">=2.0" },
@@ -201,95 +183,6 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/22/30/7cd8fdcdfbc5b869528b079bfb76dcdf6056b1a2097a662e5e8c04f42965/certifi-2026.4.22-py3-none-any.whl", hash = "sha256:3cb2210c8f88ba2318d29b0388d1023c8492ff72ecdde4ebdaddbb13a31b1c4a", size = 135707, upload-time = "2026-04-22T11:26:09.372Z" },
]
[[package]]
name = "charset-normalizer"
version = "3.4.7"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/e7/a1/67fe25fac3c7642725500a3f6cfe5821ad557c3abb11c9d20d12c7008d3e/charset_normalizer-3.4.7.tar.gz", hash = "sha256:ae89db9e5f98a11a4bf50407d4363e7b09b31e55bc117b4f7d80aab97ba009e5", size = 144271, upload-time = "2026-04-02T09:28:39.342Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/c2/d7/b5b7020a0565c2e9fa8c09f4b5fa6232feb326b8c20081ccded47ea368fd/charset_normalizer-3.4.7-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:7641bb8895e77f921102f72833904dcd9901df5d6d72a2ab8f31d04b7e51e4e7", size = 309705, upload-time = "2026-04-02T09:26:02.191Z" },
{ url = "https://files.pythonhosted.org/packages/5a/53/58c29116c340e5456724ecd2fff4196d236b98f3da97b404bc5e51ac3493/charset_normalizer-3.4.7-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:202389074300232baeb53ae2569a60901f7efadd4245cf3a3bf0617d60b439d7", size = 206419, upload-time = "2026-04-02T09:26:03.583Z" },
{ url = "https://files.pythonhosted.org/packages/b2/02/e8146dc6591a37a00e5144c63f29fb7c97a734ea8a111190783c0e60ab63/charset_normalizer-3.4.7-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:30b8d1d8c52a48c2c5690e152c169b673487a2a58de1ec7393196753063fcd5e", size = 227901, upload-time = "2026-04-02T09:26:04.738Z" },
{ url = "https://files.pythonhosted.org/packages/fb/73/77486c4cd58f1267bf17db420e930c9afa1b3be3fe8c8b8ebbebc9624359/charset_normalizer-3.4.7-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:532bc9bf33a68613fd7d65e4b1c71a6a38d7d42604ecf239c77392e9b4e8998c", size = 222742, upload-time = "2026-04-02T09:26:06.36Z" },
{ url = "https://files.pythonhosted.org/packages/a1/fa/f74eb381a7d94ded44739e9d94de18dc5edc9c17fb8c11f0a6890696c0a9/charset_normalizer-3.4.7-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:2fe249cb4651fd12605b7288b24751d8bfd46d35f12a20b1ba33dea122e690df", size = 214061, upload-time = "2026-04-02T09:26:08.347Z" },
{ url = "https://files.pythonhosted.org/packages/dc/92/42bd3cefcf7687253fb86694b45f37b733c97f59af3724f356fa92b8c344/charset_normalizer-3.4.7-cp311-cp311-manylinux_2_31_armv7l.whl", hash = "sha256:65bcd23054beab4d166035cabbc868a09c1a49d1efe458fe8e4361215df40265", size = 199239, upload-time = "2026-04-02T09:26:09.823Z" },
{ url = "https://files.pythonhosted.org/packages/4c/3d/069e7184e2aa3b3cddc700e3dd267413dc259854adc3380421c805c6a17d/charset_normalizer-3.4.7-cp311-cp311-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:08e721811161356f97b4059a9ba7bafb23ea5ee2255402c42881c214e173c6b4", size = 210173, upload-time = "2026-04-02T09:26:10.953Z" },
{ url = "https://files.pythonhosted.org/packages/62/51/9d56feb5f2e7074c46f93e0ebdbe61f0848ee246e2f0d89f8e20b89ebb8f/charset_normalizer-3.4.7-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:e060d01aec0a910bdccb8be71faf34e7799ce36950f8294c8bf612cba65a2c9e", size = 209841, upload-time = "2026-04-02T09:26:12.142Z" },
{ url = "https://files.pythonhosted.org/packages/d2/59/893d8f99cc4c837dda1fe2f1139079703deb9f321aabcb032355de13b6c7/charset_normalizer-3.4.7-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:38c0109396c4cfc574d502df99742a45c72c08eff0a36158b6f04000043dbf38", size = 200304, upload-time = "2026-04-02T09:26:13.711Z" },
{ url = "https://files.pythonhosted.org/packages/7d/1d/ee6f3be3464247578d1ed5c46de545ccc3d3ff933695395c402c21fa6b77/charset_normalizer-3.4.7-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:1c2a768fdd44ee4a9339a9b0b130049139b8ce3c01d2ce09f67f5a68048d477c", size = 229455, upload-time = "2026-04-02T09:26:14.941Z" },
{ url = "https://files.pythonhosted.org/packages/54/bb/8fb0a946296ea96a488928bdce8ef99023998c48e4713af533e9bb98ef07/charset_normalizer-3.4.7-cp311-cp311-musllinux_1_2_riscv64.whl", hash = "sha256:1a87ca9d5df6fe460483d9a5bbf2b18f620cbed41b432e2bddb686228282d10b", size = 210036, upload-time = "2026-04-02T09:26:16.478Z" },
{ url = "https://files.pythonhosted.org/packages/9a/bc/015b2387f913749f82afd4fcba07846d05b6d784dd16123cb66860e0237d/charset_normalizer-3.4.7-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:d635aab80466bc95771bb78d5370e74d36d1fe31467b6b29b8b57b2a3cd7d22c", size = 224739, upload-time = "2026-04-02T09:26:17.751Z" },
{ url = "https://files.pythonhosted.org/packages/17/ab/63133691f56baae417493cba6b7c641571a2130eb7bceba6773367ab9ec5/charset_normalizer-3.4.7-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:ae196f021b5e7c78e918242d217db021ed2a6ace2bc6ae94c0fc596221c7f58d", size = 216277, upload-time = "2026-04-02T09:26:18.981Z" },
{ url = "https://files.pythonhosted.org/packages/06/6d/3be70e827977f20db77c12a97e6a9f973631a45b8d186c084527e53e77a4/charset_normalizer-3.4.7-cp311-cp311-win32.whl", hash = "sha256:adb2597b428735679446b46c8badf467b4ca5f5056aae4d51a19f9570301b1ad", size = 147819, upload-time = "2026-04-02T09:26:20.295Z" },
{ url = "https://files.pythonhosted.org/packages/20/d9/5f67790f06b735d7c7637171bbfd89882ad67201891b7275e51116ed8207/charset_normalizer-3.4.7-cp311-cp311-win_amd64.whl", hash = "sha256:8e385e4267ab76874ae30db04c627faaaf0b509e1ccc11a95b3fc3e83f855c00", size = 159281, upload-time = "2026-04-02T09:26:21.74Z" },
{ url = "https://files.pythonhosted.org/packages/ca/83/6413f36c5a34afead88ce6f66684d943d91f233d76dd083798f9602b75ae/charset_normalizer-3.4.7-cp311-cp311-win_arm64.whl", hash = "sha256:d4a48e5b3c2a489fae013b7589308a40146ee081f6f509e047e0e096084ceca1", size = 147843, upload-time = "2026-04-02T09:26:22.901Z" },
{ url = "https://files.pythonhosted.org/packages/0c/eb/4fc8d0a7110eb5fc9cc161723a34a8a6c200ce3b4fbf681bc86feee22308/charset_normalizer-3.4.7-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:eca9705049ad3c7345d574e3510665cb2cf844c2f2dcfe675332677f081cbd46", size = 311328, upload-time = "2026-04-02T09:26:24.331Z" },
{ url = "https://files.pythonhosted.org/packages/f8/e3/0fadc706008ac9d7b9b5be6dc767c05f9d3e5df51744ce4cc9605de7b9f4/charset_normalizer-3.4.7-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6178f72c5508bfc5fd446a5905e698c6212932f25bcdd4b47a757a50605a90e2", size = 208061, upload-time = "2026-04-02T09:26:25.568Z" },
{ url = "https://files.pythonhosted.org/packages/42/f0/3dd1045c47f4a4604df85ec18ad093912ae1344ac706993aff91d38773a2/charset_normalizer-3.4.7-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:e1421b502d83040e6d7fb2fb18dff63957f720da3d77b2fbd3187ceb63755d7b", size = 229031, upload-time = "2026-04-02T09:26:26.865Z" },
{ url = "https://files.pythonhosted.org/packages/dc/67/675a46eb016118a2fbde5a277a5d15f4f69d5f3f5f338e5ee2f8948fcf43/charset_normalizer-3.4.7-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:edac0f1ab77644605be2cbba52e6b7f630731fc42b34cb0f634be1a6eface56a", size = 225239, upload-time = "2026-04-02T09:26:28.044Z" },
{ url = "https://files.pythonhosted.org/packages/4b/f8/d0118a2f5f23b02cd166fa385c60f9b0d4f9194f574e2b31cef350ad7223/charset_normalizer-3.4.7-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:5649fd1c7bade02f320a462fdefd0b4bd3ce036065836d4f42e0de958038e116", size = 216589, upload-time = "2026-04-02T09:26:29.239Z" },
{ url = "https://files.pythonhosted.org/packages/b1/f1/6d2b0b261b6c4ceef0fcb0d17a01cc5bc53586c2d4796fa04b5c540bc13d/charset_normalizer-3.4.7-cp312-cp312-manylinux_2_31_armv7l.whl", hash = "sha256:203104ed3e428044fd943bc4bf45fa73c0730391f9621e37fe39ecf477b128cb", size = 202733, upload-time = "2026-04-02T09:26:30.5Z" },
{ url = "https://files.pythonhosted.org/packages/6f/c0/7b1f943f7e87cc3db9626ba17807d042c38645f0a1d4415c7a14afb5591f/charset_normalizer-3.4.7-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:298930cec56029e05497a76988377cbd7457ba864beeea92ad7e844fe74cd1f1", size = 212652, upload-time = "2026-04-02T09:26:31.709Z" },
{ url = "https://files.pythonhosted.org/packages/38/dd/5a9ab159fe45c6e72079398f277b7d2b523e7f716acc489726115a910097/charset_normalizer-3.4.7-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:708838739abf24b2ceb208d0e22403dd018faeef86ddac04319a62ae884c4f15", size = 211229, upload-time = "2026-04-02T09:26:33.282Z" },
{ url = "https://files.pythonhosted.org/packages/d5/ff/531a1cad5ca855d1c1a8b69cb71abfd6d85c0291580146fda7c82857caa1/charset_normalizer-3.4.7-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:0f7eb884681e3938906ed0434f20c63046eacd0111c4ba96f27b76084cd679f5", size = 203552, upload-time = "2026-04-02T09:26:34.845Z" },
{ url = "https://files.pythonhosted.org/packages/c1/4c/a5fb52d528a8ca41f7598cb619409ece30a169fbdf9cdce592e53b46c3a6/charset_normalizer-3.4.7-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:4dc1e73c36828f982bfe79fadf5919923f8a6f4df2860804db9a98c48824ce8d", size = 230806, upload-time = "2026-04-02T09:26:36.152Z" },
{ url = "https://files.pythonhosted.org/packages/59/7a/071feed8124111a32b316b33ae4de83d36923039ef8cf48120266844285b/charset_normalizer-3.4.7-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:aed52fea0513bac0ccde438c188c8a471c4e0f457c2dd20cdbf6ea7a450046c7", size = 212316, upload-time = "2026-04-02T09:26:37.672Z" },
{ url = "https://files.pythonhosted.org/packages/fd/35/f7dba3994312d7ba508e041eaac39a36b120f32d4c8662b8814dab876431/charset_normalizer-3.4.7-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:fea24543955a6a729c45a73fe90e08c743f0b3334bbf3201e6c4bc1b0c7fa464", size = 227274, upload-time = "2026-04-02T09:26:38.93Z" },
{ url = "https://files.pythonhosted.org/packages/8a/2d/a572df5c9204ab7688ec1edc895a73ebded3b023bb07364710b05dd1c9be/charset_normalizer-3.4.7-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:bb6d88045545b26da47aa879dd4a89a71d1dce0f0e549b1abcb31dfe4a8eac49", size = 218468, upload-time = "2026-04-02T09:26:40.17Z" },
{ url = "https://files.pythonhosted.org/packages/86/eb/890922a8b03a568ca2f336c36585a4713c55d4d67bf0f0c78924be6315ca/charset_normalizer-3.4.7-cp312-cp312-win32.whl", hash = "sha256:2257141f39fe65a3fdf38aeccae4b953e5f3b3324f4ff0daf9f15b8518666a2c", size = 148460, upload-time = "2026-04-02T09:26:41.416Z" },
{ url = "https://files.pythonhosted.org/packages/35/d9/0e7dffa06c5ab081f75b1b786f0aefc88365825dfcd0ac544bdb7b2b6853/charset_normalizer-3.4.7-cp312-cp312-win_amd64.whl", hash = "sha256:5ed6ab538499c8644b8a3e18debabcd7ce684f3fa91cf867521a7a0279cab2d6", size = 159330, upload-time = "2026-04-02T09:26:42.554Z" },
{ url = "https://files.pythonhosted.org/packages/9e/5d/481bcc2a7c88ea6b0878c299547843b2521ccbc40980cb406267088bc701/charset_normalizer-3.4.7-cp312-cp312-win_arm64.whl", hash = "sha256:56be790f86bfb2c98fb742ce566dfb4816e5a83384616ab59c49e0604d49c51d", size = 147828, upload-time = "2026-04-02T09:26:44.075Z" },
{ url = "https://files.pythonhosted.org/packages/c1/3b/66777e39d3ae1ddc77ee606be4ec6d8cbd4c801f65e5a1b6f2b11b8346dd/charset_normalizer-3.4.7-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:f496c9c3cc02230093d8330875c4c3cdfc3b73612a5fd921c65d39cbcef08063", size = 309627, upload-time = "2026-04-02T09:26:45.198Z" },
{ url = "https://files.pythonhosted.org/packages/2e/4e/b7f84e617b4854ade48a1b7915c8ccfadeba444d2a18c291f696e37f0d3b/charset_normalizer-3.4.7-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:0ea948db76d31190bf08bd371623927ee1339d5f2a0b4b1b4a4439a65298703c", size = 207008, upload-time = "2026-04-02T09:26:46.824Z" },
{ url = "https://files.pythonhosted.org/packages/c4/bb/ec73c0257c9e11b268f018f068f5d00aa0ef8c8b09f7753ebd5f2880e248/charset_normalizer-3.4.7-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:a277ab8928b9f299723bc1a2dabb1265911b1a76341f90a510368ca44ad9ab66", size = 228303, upload-time = "2026-04-02T09:26:48.397Z" },
{ url = "https://files.pythonhosted.org/packages/85/fb/32d1f5033484494619f701e719429c69b766bfc4dbc61aa9e9c8c166528b/charset_normalizer-3.4.7-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:3bec022aec2c514d9cf199522a802bd007cd588ab17ab2525f20f9c34d067c18", size = 224282, upload-time = "2026-04-02T09:26:49.684Z" },
{ url = "https://files.pythonhosted.org/packages/fa/07/330e3a0dda4c404d6da83b327270906e9654a24f6c546dc886a0eb0ffb23/charset_normalizer-3.4.7-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:e044c39e41b92c845bc815e5ae4230804e8e7bc29e399b0437d64222d92809dd", size = 215595, upload-time = "2026-04-02T09:26:50.915Z" },
{ url = "https://files.pythonhosted.org/packages/e3/7c/fc890655786e423f02556e0216d4b8c6bcb6bdfa890160dc66bf52dee468/charset_normalizer-3.4.7-cp313-cp313-manylinux_2_31_armv7l.whl", hash = "sha256:f495a1652cf3fbab2eb0639776dad966c2fb874d79d87ca07f9d5f059b8bd215", size = 201986, upload-time = "2026-04-02T09:26:52.197Z" },
{ url = "https://files.pythonhosted.org/packages/d8/97/bfb18b3db2aed3b90cf54dc292ad79fdd5ad65c4eae454099475cbeadd0d/charset_normalizer-3.4.7-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:e712b419df8ba5e42b226c510472b37bd57b38e897d3eca5e8cfd410a29fa859", size = 211711, upload-time = "2026-04-02T09:26:53.49Z" },
{ url = "https://files.pythonhosted.org/packages/6f/a5/a581c13798546a7fd557c82614a5c65a13df2157e9ad6373166d2a3e645d/charset_normalizer-3.4.7-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:7804338df6fcc08105c7745f1502ba68d900f45fd770d5bdd5288ddccb8a42d8", size = 210036, upload-time = "2026-04-02T09:26:54.975Z" },
{ url = "https://files.pythonhosted.org/packages/8c/bf/b3ab5bcb478e4193d517644b0fb2bf5497fbceeaa7a1bc0f4d5b50953861/charset_normalizer-3.4.7-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:481551899c856c704d58119b5025793fa6730adda3571971af568f66d2424bb5", size = 202998, upload-time = "2026-04-02T09:26:56.303Z" },
{ url = "https://files.pythonhosted.org/packages/e7/4e/23efd79b65d314fa320ec6017b4b5834d5c12a58ba4610aa353af2e2f577/charset_normalizer-3.4.7-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:f59099f9b66f0d7145115e6f80dd8b1d847176df89b234a5a6b3f00437aa0832", size = 230056, upload-time = "2026-04-02T09:26:57.554Z" },
{ url = "https://files.pythonhosted.org/packages/b9/9f/1e1941bc3f0e01df116e68dc37a55c4d249df5e6fa77f008841aef68264f/charset_normalizer-3.4.7-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:f59ad4c0e8f6bba240a9bb85504faa1ab438237199d4cce5f622761507b8f6a6", size = 211537, upload-time = "2026-04-02T09:26:58.843Z" },
{ url = "https://files.pythonhosted.org/packages/80/0f/088cbb3020d44428964a6c97fe1edfb1b9550396bf6d278330281e8b709c/charset_normalizer-3.4.7-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:3dedcc22d73ec993f42055eff4fcfed9318d1eeb9a6606c55892a26964964e48", size = 226176, upload-time = "2026-04-02T09:27:00.437Z" },
{ url = "https://files.pythonhosted.org/packages/6a/9f/130394f9bbe06f4f63e22641d32fc9b202b7e251c9aef4db044324dac493/charset_normalizer-3.4.7-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:64f02c6841d7d83f832cd97ccf8eb8a906d06eb95d5276069175c696b024b60a", size = 217723, upload-time = "2026-04-02T09:27:02.021Z" },
{ url = "https://files.pythonhosted.org/packages/73/55/c469897448a06e49f8fa03f6caae97074fde823f432a98f979cc42b90e69/charset_normalizer-3.4.7-cp313-cp313-win32.whl", hash = "sha256:4042d5c8f957e15221d423ba781e85d553722fc4113f523f2feb7b188cc34c5e", size = 148085, upload-time = "2026-04-02T09:27:03.192Z" },
{ url = "https://files.pythonhosted.org/packages/5d/78/1b74c5bbb3f99b77a1715c91b3e0b5bdb6fe302d95ace4f5b1bec37b0167/charset_normalizer-3.4.7-cp313-cp313-win_amd64.whl", hash = "sha256:3946fa46a0cf3e4c8cb1cc52f56bb536310d34f25f01ca9b6c16afa767dab110", size = 158819, upload-time = "2026-04-02T09:27:04.454Z" },
{ url = "https://files.pythonhosted.org/packages/68/86/46bd42279d323deb8687c4a5a811fd548cb7d1de10cf6535d099877a9a9f/charset_normalizer-3.4.7-cp313-cp313-win_arm64.whl", hash = "sha256:80d04837f55fc81da168b98de4f4b797ef007fc8a79ab71c6ec9bc4dd662b15b", size = 147915, upload-time = "2026-04-02T09:27:05.971Z" },
{ url = "https://files.pythonhosted.org/packages/97/c8/c67cb8c70e19ef1960b97b22ed2a1567711de46c4ddf19799923adc836c2/charset_normalizer-3.4.7-cp314-cp314-macosx_10_15_universal2.whl", hash = "sha256:c36c333c39be2dbca264d7803333c896ab8fa7d4d6f0ab7edb7dfd7aea6e98c0", size = 309234, upload-time = "2026-04-02T09:27:07.194Z" },
{ url = "https://files.pythonhosted.org/packages/99/85/c091fdee33f20de70d6c8b522743b6f831a2f1cd3ff86de4c6a827c48a76/charset_normalizer-3.4.7-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1c2aed2e5e41f24ea8ef1590b8e848a79b56f3a5564a65ceec43c9d692dc7d8a", size = 208042, upload-time = "2026-04-02T09:27:08.749Z" },
{ url = "https://files.pythonhosted.org/packages/87/1c/ab2ce611b984d2fd5d86a5a8a19c1ae26acac6bad967da4967562c75114d/charset_normalizer-3.4.7-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:54523e136b8948060c0fa0bc7b1b50c32c186f2fceee897a495406bb6e311d2b", size = 228706, upload-time = "2026-04-02T09:27:09.951Z" },
{ url = "https://files.pythonhosted.org/packages/a8/29/2b1d2cb00bf085f59d29eb773ce58ec2d325430f8c216804a0a5cd83cbca/charset_normalizer-3.4.7-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:715479b9a2802ecac752a3b0efa2b0b60285cf962ee38414211abdfccc233b41", size = 224727, upload-time = "2026-04-02T09:27:11.175Z" },
{ url = "https://files.pythonhosted.org/packages/47/5c/032c2d5a07fe4d4855fea851209cca2b6f03ebeb6d4e3afdb3358386a684/charset_normalizer-3.4.7-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bd6c2a1c7573c64738d716488d2cdd3c00e340e4835707d8fdb8dc1a66ef164e", size = 215882, upload-time = "2026-04-02T09:27:12.446Z" },
{ url = "https://files.pythonhosted.org/packages/2c/c2/356065d5a8b78ed04499cae5f339f091946a6a74f91e03476c33f0ab7100/charset_normalizer-3.4.7-cp314-cp314-manylinux_2_31_armv7l.whl", hash = "sha256:c45e9440fb78f8ddabcf714b68f936737a121355bf59f3907f4e17721b9d1aae", size = 200860, upload-time = "2026-04-02T09:27:13.721Z" },
{ url = "https://files.pythonhosted.org/packages/0c/cd/a32a84217ced5039f53b29f460962abb2d4420def55afabe45b1c3c7483d/charset_normalizer-3.4.7-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:3534e7dcbdcf757da6b85a0bbf5b6868786d5982dd959b065e65481644817a18", size = 211564, upload-time = "2026-04-02T09:27:15.272Z" },
{ url = "https://files.pythonhosted.org/packages/44/86/58e6f13ce26cc3b8f4a36b94a0f22ae2f00a72534520f4ae6857c4b81f89/charset_normalizer-3.4.7-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:e8ac484bf18ce6975760921bb6148041faa8fef0547200386ea0b52b5d27bf7b", size = 211276, upload-time = "2026-04-02T09:27:16.834Z" },
{ url = "https://files.pythonhosted.org/packages/8f/fe/d17c32dc72e17e155e06883efa84514ca375f8a528ba2546bee73fc4df81/charset_normalizer-3.4.7-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:a5fe03b42827c13cdccd08e6c0247b6a6d4b5e3cdc53fd1749f5896adcdc2356", size = 201238, upload-time = "2026-04-02T09:27:18.229Z" },
{ url = "https://files.pythonhosted.org/packages/6a/29/f33daa50b06525a237451cdb6c69da366c381a3dadcd833fa5676bc468b3/charset_normalizer-3.4.7-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:2d6eb928e13016cea4f1f21d1e10c1cebd5a421bc57ddf5b1142ae3f86824fab", size = 230189, upload-time = "2026-04-02T09:27:19.445Z" },
{ url = "https://files.pythonhosted.org/packages/b6/6e/52c84015394a6a0bdcd435210a7e944c5f94ea1055f5cc5d56c5fe368e7b/charset_normalizer-3.4.7-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:e74327fb75de8986940def6e8dee4f127cc9752bee7355bb323cc5b2659b6d46", size = 211352, upload-time = "2026-04-02T09:27:20.79Z" },
{ url = "https://files.pythonhosted.org/packages/8c/d7/4353be581b373033fb9198bf1da3cf8f09c1082561e8e922aa7b39bf9fe8/charset_normalizer-3.4.7-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:d6038d37043bced98a66e68d3aa2b6a35505dc01328cd65217cefe82f25def44", size = 227024, upload-time = "2026-04-02T09:27:22.063Z" },
{ url = "https://files.pythonhosted.org/packages/30/45/99d18aa925bd1740098ccd3060e238e21115fffbfdcb8f3ece837d0ace6c/charset_normalizer-3.4.7-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:7579e913a5339fb8fa133f6bbcfd8e6749696206cf05acdbdca71a1b436d8e72", size = 217869, upload-time = "2026-04-02T09:27:23.486Z" },
{ url = "https://files.pythonhosted.org/packages/5c/05/5ee478aa53f4bb7996482153d4bfe1b89e0f087f0ab6b294fcf92d595873/charset_normalizer-3.4.7-cp314-cp314-win32.whl", hash = "sha256:5b77459df20e08151cd6f8b9ef8ef1f961ef73d85c21a555c7eed5b79410ec10", size = 148541, upload-time = "2026-04-02T09:27:25.146Z" },
{ url = "https://files.pythonhosted.org/packages/48/77/72dcb0921b2ce86420b2d79d454c7022bf5be40202a2a07906b9f2a35c97/charset_normalizer-3.4.7-cp314-cp314-win_amd64.whl", hash = "sha256:92a0a01ead5e668468e952e4238cccd7c537364eb7d851ab144ab6627dbbe12f", size = 159634, upload-time = "2026-04-02T09:27:26.642Z" },
{ url = "https://files.pythonhosted.org/packages/c6/a3/c2369911cd72f02386e4e340770f6e158c7980267da16af8f668217abaa0/charset_normalizer-3.4.7-cp314-cp314-win_arm64.whl", hash = "sha256:67f6279d125ca0046a7fd386d01b311c6363844deac3e5b069b514ba3e63c246", size = 148384, upload-time = "2026-04-02T09:27:28.271Z" },
{ url = "https://files.pythonhosted.org/packages/94/09/7e8a7f73d24dba1f0035fbbf014d2c36828fc1bf9c88f84093e57d315935/charset_normalizer-3.4.7-cp314-cp314t-macosx_10_15_universal2.whl", hash = "sha256:effc3f449787117233702311a1b7d8f59cba9ced946ba727bdc329ec69028e24", size = 330133, upload-time = "2026-04-02T09:27:29.474Z" },
{ url = "https://files.pythonhosted.org/packages/8d/da/96975ddb11f8e977f706f45cddd8540fd8242f71ecdb5d18a80723dcf62c/charset_normalizer-3.4.7-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:fbccdc05410c9ee21bbf16a35f4c1d16123dcdeb8a1d38f33654fa21d0234f79", size = 216257, upload-time = "2026-04-02T09:27:30.793Z" },
{ url = "https://files.pythonhosted.org/packages/e5/e8/1d63bf8ef2d388e95c64b2098f45f84758f6d102a087552da1485912637b/charset_normalizer-3.4.7-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:733784b6d6def852c814bce5f318d25da2ee65dd4839a0718641c696e09a2960", size = 234851, upload-time = "2026-04-02T09:27:32.44Z" },
{ url = "https://files.pythonhosted.org/packages/9b/40/e5ff04233e70da2681fa43969ad6f66ca5611d7e669be0246c4c7aaf6dc8/charset_normalizer-3.4.7-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:a89c23ef8d2c6b27fd200a42aa4ac72786e7c60d40efdc76e6011260b6e949c4", size = 233393, upload-time = "2026-04-02T09:27:34.03Z" },
{ url = "https://files.pythonhosted.org/packages/be/c1/06c6c49d5a5450f76899992f1ee40b41d076aee9279b49cf9974d2f313d5/charset_normalizer-3.4.7-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:6c114670c45346afedc0d947faf3c7f701051d2518b943679c8ff88befe14f8e", size = 223251, upload-time = "2026-04-02T09:27:35.369Z" },
{ url = "https://files.pythonhosted.org/packages/2b/9f/f2ff16fb050946169e3e1f82134d107e5d4ae72647ec8a1b1446c148480f/charset_normalizer-3.4.7-cp314-cp314t-manylinux_2_31_armv7l.whl", hash = "sha256:a180c5e59792af262bf263b21a3c49353f25945d8d9f70628e73de370d55e1e1", size = 206609, upload-time = "2026-04-02T09:27:36.661Z" },
{ url = "https://files.pythonhosted.org/packages/69/d5/a527c0cd8d64d2eab7459784fb4169a0ac76e5a6fc5237337982fd61347e/charset_normalizer-3.4.7-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:3c9a494bc5ec77d43cea229c4f6db1e4d8fe7e1bbffa8b6f0f0032430ff8ab44", size = 220014, upload-time = "2026-04-02T09:27:38.019Z" },
{ url = "https://files.pythonhosted.org/packages/7e/80/8a7b8104a3e203074dc9aa2c613d4b726c0e136bad1cc734594b02867972/charset_normalizer-3.4.7-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:8d828b6667a32a728a1ad1d93957cdf37489c57b97ae6c4de2860fa749b8fc1e", size = 218979, upload-time = "2026-04-02T09:27:39.37Z" },
{ url = "https://files.pythonhosted.org/packages/02/9a/b759b503d507f375b2b5c153e4d2ee0a75aa215b7f2489cf314f4541f2c0/charset_normalizer-3.4.7-cp314-cp314t-musllinux_1_2_armv7l.whl", hash = "sha256:cf1493cd8607bec4d8a7b9b004e699fcf8f9103a9284cc94962cb73d20f9d4a3", size = 209238, upload-time = "2026-04-02T09:27:40.722Z" },
{ url = "https://files.pythonhosted.org/packages/c2/4e/0f3f5d47b86bdb79256e7290b26ac847a2832d9a4033f7eb2cd4bcf4bb5b/charset_normalizer-3.4.7-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:0c96c3b819b5c3e9e165495db84d41914d6894d55181d2d108cc1a69bfc9cce0", size = 236110, upload-time = "2026-04-02T09:27:42.33Z" },
{ url = "https://files.pythonhosted.org/packages/96/23/bce28734eb3ed2c91dcf93abeb8a5cf393a7b2749725030bb630e554fdd8/charset_normalizer-3.4.7-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:752a45dc4a6934060b3b0dab47e04edc3326575f82be64bc4fc293914566503e", size = 219824, upload-time = "2026-04-02T09:27:43.924Z" },
{ url = "https://files.pythonhosted.org/packages/2c/6f/6e897c6984cc4d41af319b077f2f600fc8214eb2fe2d6bcb79141b882400/charset_normalizer-3.4.7-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:8778f0c7a52e56f75d12dae53ae320fae900a8b9b4164b981b9c5ce059cd1fcb", size = 233103, upload-time = "2026-04-02T09:27:45.348Z" },
{ url = "https://files.pythonhosted.org/packages/76/22/ef7bd0fe480a0ae9b656189ec00744b60933f68b4f42a7bb06589f6f576a/charset_normalizer-3.4.7-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:ce3412fbe1e31eb81ea42f4169ed94861c56e643189e1e75f0041f3fe7020abe", size = 225194, upload-time = "2026-04-02T09:27:46.706Z" },
{ url = "https://files.pythonhosted.org/packages/c5/a7/0e0ab3e0b5bc1219bd80a6a0d4d72ca74d9250cb2382b7c699c147e06017/charset_normalizer-3.4.7-cp314-cp314t-win32.whl", hash = "sha256:c03a41a8784091e67a39648f70c5f97b5b6a37f216896d44d2cdcb82615339a0", size = 159827, upload-time = "2026-04-02T09:27:48.053Z" },
{ url = "https://files.pythonhosted.org/packages/7a/1d/29d32e0fb40864b1f878c7f5a0b343ae676c6e2b271a2d55cc3a152391da/charset_normalizer-3.4.7-cp314-cp314t-win_amd64.whl", hash = "sha256:03853ed82eeebbce3c2abfdbc98c96dc205f32a79627688ac9a27370ea61a49c", size = 174168, upload-time = "2026-04-02T09:27:49.795Z" },
{ url = "https://files.pythonhosted.org/packages/de/32/d92444ad05c7a6e41fb2036749777c163baf7a0301a040cb672d6b2b1ae9/charset_normalizer-3.4.7-cp314-cp314t-win_arm64.whl", hash = "sha256:c35abb8bfff0185efac5878da64c45dafd2b37fb0383add1be155a763c1f083d", size = 153018, upload-time = "2026-04-02T09:27:51.116Z" },
{ url = "https://files.pythonhosted.org/packages/db/8f/61959034484a4a7c527811f4721e75d02d653a35afb0b6054474d8185d4c/charset_normalizer-3.4.7-py3-none-any.whl", hash = "sha256:3dce51d0f5e7951f8bb4900c257dad282f49190fdbebecd4ba99bcc41fef404d", size = 61958, upload-time = "2026-04-02T09:28:37.794Z" },
]
[[package]]
name = "ckzg"
version = "2.1.7"
@@ -713,22 +606,6 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/2a/39/e50c7c3a983047577ee07d2a9e53faf5a69493943ec3f6a384bdc792deb2/httpx-0.28.1-py3-none-any.whl", hash = "sha256:d909fcccc110f8c7faf814ca82a9a4d816bc5a6dbfea25d6591d6985b8ba59ad", size = 73517, upload-time = "2024-12-06T15:37:21.509Z" },
]
[[package]]
name = "hyperliquid-python-sdk"
version = "0.23.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "eth-account" },
{ name = "eth-utils" },
{ name = "msgpack" },
{ name = "requests" },
{ name = "websocket-client" },
]
sdist = { url = "https://files.pythonhosted.org/packages/80/ad/12b4559a953e26fc56677de5bf689023e11213196b802b991a6e6db94814/hyperliquid_python_sdk-0.23.0.tar.gz", hash = "sha256:14df0b62511a0cf08ca5a73f73f03656868ee67845ed3362539a79674511bb51", size = 25255, upload-time = "2026-04-14T21:51:24.646Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/94/e9/b7b23aefc319727f670992904b1defd7aee5fc3f59a51c141a87db05f7da/hyperliquid_python_sdk-0.23.0-py3-none-any.whl", hash = "sha256:5b4f9f7ab8c0b1ad9848f2222901dc047c8f97a6e6fe3fd7286b7b34337f80cb", size = 24638, upload-time = "2026-04-14T21:51:23.27Z" },
]
[[package]]
name = "idna"
version = "3.13"
@@ -1122,20 +999,6 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/54/20/4d324d65cc6d9205fabedc306948156824eb9f0ee1633355a8f7ec5c66bf/pluggy-1.6.0-py3-none-any.whl", hash = "sha256:e920276dd6813095e9377c0bc5566d94c932c33b27a3e3945d8389c374dd4746", size = 20538, upload-time = "2025-05-15T12:30:06.134Z" },
]
[[package]]
name = "pybit"
version = "5.16.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "pycryptodome" },
{ name = "requests" },
{ name = "websocket-client" },
]
sdist = { url = "https://files.pythonhosted.org/packages/2c/92/c7c14df206de81d7acb7d7d92180482d8d54c18dc73bf8d1ab73498bdbfc/pybit-5.16.0.tar.gz", hash = "sha256:554a812982e0271ec3a9f9483767ad5ff440d3834f3d363b689c81a0a474ebc0", size = 66764, upload-time = "2026-04-18T13:55:33.058Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/d1/3b/9c37237af574f143cee806801ff37a83b85491cd654046c49caf1d2940a2/pybit-5.16.0-py2.py3-none-any.whl", hash = "sha256:d214c4987aabb25c10e8e031244973a3be4728e044575ab6c6f6f7873f8c1cab", size = 59230, upload-time = "2026-04-18T13:55:31.551Z" },
]
[[package]]
name = "pycryptodome"
version = "3.23.0"
@@ -1378,15 +1241,6 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/27/be/0631a861af4d1c875f096c07d34e9a63639560a717130e7a87cbc82b7e3f/python_json_logger-4.1.0-py3-none-any.whl", hash = "sha256:132994765cf75bf44554be9aa49b06ef2345d23661a96720262716438141b6b2", size = 15021, upload-time = "2026-03-29T04:39:55.266Z" },
]
[[package]]
name = "pytz"
version = "2026.1.post1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/56/db/b8721d71d945e6a8ac63c0fc900b2067181dbb50805958d4d4661cf7d277/pytz-2026.1.post1.tar.gz", hash = "sha256:3378dde6a0c3d26719182142c56e60c7f9af7e968076f31aae569d72a0358ee1", size = 321088, upload-time = "2026-03-03T07:47:50.683Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/10/99/781fe0c827be2742bcc775efefccb3b048a3a9c6ce9aec0cbf4a101677e5/pytz-2026.1.post1-py2.py3-none-any.whl", hash = "sha256:f2fd16142fda348286a75e1a524be810bb05d444e5a081f37f7affc635035f7a", size = 510489, upload-time = "2026-03-03T07:47:49.167Z" },
]
[[package]]
name = "pyyaml"
version = "6.0.3"
@@ -1546,21 +1400,6 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/d5/e7/ec846d560ae6a597115153c02ca6138a7877a1748b2072d9521c10a93e58/regex-2026.4.4-cp314-cp314t-win_arm64.whl", hash = "sha256:af0384cb01a33600c49505c27c6c57ab0b27bf84a74e28524c92ca897ebdac9d", size = 275773, upload-time = "2026-04-03T20:56:26.07Z" },
]
[[package]]
name = "requests"
version = "2.33.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "certifi" },
{ name = "charset-normalizer" },
{ name = "idna" },
{ name = "urllib3" },
]
sdist = { url = "https://files.pythonhosted.org/packages/5f/a4/98b9c7c6428a668bf7e42ebb7c79d576a1c3c1e3ae2d47e674b468388871/requests-2.33.1.tar.gz", hash = "sha256:18817f8c57c6263968bc123d237e3b8b08ac046f5456bd1e307ee8f4250d3517", size = 134120, upload-time = "2026-03-30T16:09:15.531Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/d7/8e/7540e8a2036f79a125c1d2ebadf69ed7901608859186c856fa0388ef4197/requests-2.33.1-py3-none-any.whl", hash = "sha256:4e6d1ef462f3626a1f0a0a9c42dd93c63bad33f9f1c1937509b8c5c8718ab56a", size = 64947, upload-time = "2026-03-30T16:09:13.83Z" },
]
[[package]]
name = "rlp"
version = "4.1.0"
@@ -1678,14 +1517,6 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/b7/ce/149a00dd41f10bc29e5921b496af8b574d8413afcd5e30dfa0ed46c2cc5e/six-1.17.0-py2.py3-none-any.whl", hash = "sha256:4721f391ed90541fddacab5acf947aa0d3dc7d27b2e1e8eda2be8970586c3274", size = 11050, upload-time = "2024-12-04T17:35:26.475Z" },
]
[[package]]
name = "sseclient-py"
version = "1.9.0"
source = { registry = "https://pypi.org/simple" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/4d/2e/59920f7d66b7f9932a3d83dd0ec53fab001be1e058bf582606fe414a5198/sseclient_py-1.9.0-py3-none-any.whl", hash = "sha256:340062b1587fc2880892811e2ab5b176d98ef3eee98b3672ff3a3ba1e8ed0f6f", size = 8351, upload-time = "2026-01-02T23:39:30.995Z" },
]
[[package]]
name = "starlette"
version = "1.0.0"
@@ -1777,15 +1608,6 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/ce/e4/dccd7f47c4b64213ac01ef921a1337ee6e30e8c6466046018326977efd95/tzdata-2026.2-py2.py3-none-any.whl", hash = "sha256:bbe9af844f658da81a5f95019480da3a89415801f6cc966806612cc7169bffe7", size = 349321, upload-time = "2026-04-24T15:22:05.876Z" },
]
[[package]]
name = "urllib3"
version = "2.6.3"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/c7/24/5f1b3bdffd70275f6661c76461e25f024d5a38a46f04aaca912426a2b1d3/urllib3-2.6.3.tar.gz", hash = "sha256:1b62b6884944a57dbe321509ab94fd4d3b307075e0c2eae991ac71ee15ad38ed", size = 435556, upload-time = "2026-01-07T16:24:43.925Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/39/08/aaaad47bc4e9dc8c725e68f9d04865dbcb2052843ff09c97b08904852d84/urllib3-2.6.3-py3-none-any.whl", hash = "sha256:bf272323e553dfb2e87d9bfd225ca7b0f467b919d7bbd355436d3fd37cb0acd4", size = 131584, upload-time = "2026-01-07T16:24:42.685Z" },
]
[[package]]
name = "uvicorn"
version = "0.46.0"
@@ -1935,15 +1757,6 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/6e/d4/ed38dd3b1767193de971e694aa544356e63353c33a85d948166b5ff58b9e/watchfiles-1.1.1-pp311-pypy311_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3e6f39af2eab0118338902798b5aa6664f46ff66bc0280de76fca67a7f262a49", size = 457546, upload-time = "2025-10-14T15:06:13.372Z" },
]
[[package]]
name = "websocket-client"
version = "1.9.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/2c/41/aa4bf9664e4cda14c3b39865b12251e8e7d239f4cd0e3cc1b6c2ccde25c1/websocket_client-1.9.0.tar.gz", hash = "sha256:9e813624b6eb619999a97dc7958469217c3176312b3a16a4bd1bc7e08a46ec98", size = 70576, upload-time = "2025-10-07T21:16:36.495Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/34/db/b10e48aa8fff7407e67470363eac595018441cf32d5e1001567a7aeba5d2/websocket_client-1.9.0-py3-none-any.whl", hash = "sha256:af248a825037ef591efbf6ed20cc5faa03d3b47b9e5a2230a529eeee1c1fc3ef", size = 82616, upload-time = "2025-10-07T21:16:34.951Z" },
]
[[package]]
name = "websockets"
version = "16.0"