Cerbero-mcp/docs/superpowers/plans/2026-04-30-V2.0.0-unified-image-token-routing.md
AdrianoDev b8753afad2 docs(plan): V2.0.0 implementation plan task-by-task
A plan in 13 phases (0–12) to implement V2.0.0:
- Phase 0: bootstrap the new structure
- Phase 1: settings + consolidated .env
- Phase 2: bearer auth middleware
- Phase 3: common/ migration
- Phase 4: lazy client_registry
- Phase 5: build_app + Swagger at /apidocs
- Phase 6: migration of the 6 exchanges (deribit as template + 5 repetitions)
- Phase 7: __main__ entrypoint with lifespan
- Phase 8: integration tests for env routing
- Phase 9: Dockerfile + minimal docker-compose
- Phase 10: V1 cleanup (services/, gateway/, secrets/, docker/)
- Phase 11: README rewritten, DEPLOYMENT removed
- Phase 12: final quality gate

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-30 17:58:13 +02:00


Cerbero MCP V2.0.0 — Unified Image, Token-Based Env Routing — Implementation Plan

For agentic workers: REQUIRED SUB-SKILL: Use superpowers:subagent-driven-development (recommended) or superpowers:executing-plans to implement this plan task-by-task. Steps use checkbox (- [ ]) syntax for tracking.

Goal: Collapse 7 Docker images into a single one, replace boot-time testnet/mainnet resolution with per-request routing via bearer token, consolidate all configuration into .env, and expose Swagger UI at /apidocs.

Architecture: A single Python package cerbero_mcp under src/. At boot it reads .env via Pydantic Settings and builds one FastAPI app with one router per exchange (/mcp-{exchange}/tools/{tool}), a bearer-auth middleware that sets request.state.environment from TESTNET_TOKEN/MAINNET_TOKEN, and a lazy ClientRegistry that keeps one client per (exchange, env) pair. Swagger UI is exposed at /apidocs with a BearerAuth securityScheme. Single Docker build, deployed with docker compose up -d. On the VPS, Traefik handles TLS/allowlisting externally.

Tech Stack: Python 3.11, FastAPI, uvicorn, pydantic-settings, httpx, uv (package manager), pytest, pytest-asyncio, pytest-httpx, ruff, mypy, Docker.


Reference spec

docs/superpowers/specs/2026-04-30-V2.0.0-unified-image-token-routing-design.md

Prerequisites

  • Branch V2.0.0 already created and checked out
  • Clean git status (only the spec committed)
  • uv >= 0.5 installed
  • Docker installed (for the final build tasks)

PHASE 0 — Bootstrap the new structure

The goal of this phase is a new src/cerbero_mcp/ structure that coexists with the old services/ until all the code has been migrated. The old structure is removed only in the final phases.

Task 0.1: Update pyproject.toml to a single package

Files:

  • Modify: pyproject.toml

  • Step 1: Replace pyproject.toml entirely

Overwrite it with the following content (removes the workspace, adds [project]):

[project]
name = "cerbero-mcp"
version = "2.0.0"
description = "Unified multi-exchange MCP server with token-based testnet/mainnet routing"
requires-python = ">=3.11"
authors = [{ name = "Adriano", email = "adrianodalpastro@tielogic.com" }]
dependencies = [
    "fastapi>=0.115",
    "uvicorn[standard]>=0.32",
    "pydantic>=2.9",
    "pydantic-settings>=2.6",
    "httpx>=0.27",
    "python-json-logger>=2.0",
    "websockets>=13",
    "numpy>=1.26",
    "scipy>=1.13",
    "statsmodels>=0.14",
    "pandas>=2.2",
    "pybit>=5.7",
    "alpaca-py>=0.30",
    "hyperliquid-python-sdk>=0.6",
]

[project.scripts]
cerbero-mcp = "cerbero_mcp.__main__:main"

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[tool.hatch.build.targets.wheel]
packages = ["src/cerbero_mcp"]

[tool.ruff]
line-length = 100
target-version = "py311"

[tool.ruff.lint]
select = ["E", "F", "I", "W", "UP", "B", "SIM"]
ignore = ["E501", "E741"]

[tool.ruff.lint.flake8-bugbear]
extend-immutable-calls = [
    "fastapi.Depends",
    "fastapi.Query",
    "fastapi.Body",
    "fastapi.Header",
    "fastapi.Path",
    "fastapi.Cookie",
    "fastapi.Form",
    "fastapi.File",
    "fastapi.Security",
]

[tool.ruff.lint.per-file-ignores]
"**/test_*.py" = ["B008"]

[tool.pytest.ini_options]
asyncio_mode = "auto"
testpaths = ["tests"]
addopts = "--import-mode=importlib"

[tool.mypy]
python_version = "3.11"
strict = false
warn_return_any = true
warn_unused_ignores = true
warn_redundant_casts = true
check_untyped_defs = true
ignore_missing_imports = true

[[tool.mypy.overrides]]
module = ["pybit.*", "alpaca.*", "hyperliquid.*", "pythonjsonlogger.*"]
ignore_missing_imports = true

[dependency-groups]
dev = [
    "pytest>=9.0.3",
    "pytest-asyncio>=1.3.0",
    "pytest-httpx>=0.36.2",
    "mypy>=1.13",
    "ruff>=0.5,<0.6",
]
  • Step 2: Verify with uv sync (clean lock)
rm -f uv.lock
uv sync

Expected: no errors, .venv/ updated, uv.lock regenerated.

  • Step 3: Commit
git add pyproject.toml uv.lock
git commit -m "chore(V2): single-package cerbero-mcp pyproject, workspace removed"

Task 0.2: Create the src/cerbero_mcp/ skeleton

Files:

  • Create: src/cerbero_mcp/__init__.py

  • Create: src/cerbero_mcp/__main__.py

  • Create: src/cerbero_mcp/common/__init__.py

  • Create: src/cerbero_mcp/exchanges/__init__.py

  • Create: src/cerbero_mcp/routers/__init__.py

  • Create: tests/__init__.py

  • Create: tests/unit/__init__.py

  • Create: tests/integration/__init__.py

  • Step 1: Create the directories and empty files

mkdir -p src/cerbero_mcp/{common,exchanges,routers} tests/{unit,integration,smoke}
touch src/cerbero_mcp/__init__.py
touch src/cerbero_mcp/common/__init__.py
touch src/cerbero_mcp/exchanges/__init__.py
touch src/cerbero_mcp/routers/__init__.py
touch tests/__init__.py tests/unit/__init__.py tests/integration/__init__.py
  • Step 2: Create a working src/cerbero_mcp/__main__.py placeholder
"""cerbero-mcp entrypoint."""
from __future__ import annotations


def main() -> None:
    raise NotImplementedError("server to be implemented in later phases")


if __name__ == "__main__":
    main()
  • Step 3: Verify the console script is registered
uv sync
uv run cerbero-mcp 2>&1 || true

Expected: NotImplementedError raised (script registered correctly).

  • Step 4: Commit
git add src/ tests/
git commit -m "chore(V2): src/cerbero_mcp skeleton + tests/"

PHASE 1 — Settings and configuration

Task 1.1: Consolidated .env.example

Files:

  • Create: .env.example

  • Modify: .gitignore

  • Step 1: Write .env.example

# ============================================================
# CERBERO MCP — V2.0.0
# Copy to .env and fill in values. .env is gitignored.
# Generate tokens: python -c 'import secrets; print(secrets.token_urlsafe(32))'
# ============================================================

# ─── SERVER ─────────────────────────────────────────────────
HOST=0.0.0.0
PORT=9000
LOG_LEVEL=info

# ─── AUTH — bearer tokens for env routing ─────────────────
# The bot sends Authorization: Bearer <TOKEN>:
#   - TESTNET_TOKEN → request goes to base_url_testnet
#   - MAINNET_TOKEN → request goes to base_url_live
TESTNET_TOKEN=
MAINNET_TOKEN=

# ─── EXCHANGE — DERIBIT ───────────────────────────────────
DERIBIT_CLIENT_ID=
DERIBIT_CLIENT_SECRET=
DERIBIT_URL_LIVE=https://www.deribit.com/api/v2
DERIBIT_URL_TESTNET=https://test.deribit.com/api/v2
DERIBIT_MAX_LEVERAGE=3

# ─── EXCHANGE — BYBIT ─────────────────────────────────────
BYBIT_API_KEY=
BYBIT_API_SECRET=
BYBIT_URL_LIVE=https://api.bybit.com
BYBIT_URL_TESTNET=https://api-testnet.bybit.com
BYBIT_MAX_LEVERAGE=3

# ─── EXCHANGE — HYPERLIQUID ───────────────────────────────
HYPERLIQUID_WALLET_ADDRESS=
HYPERLIQUID_API_WALLET_ADDRESS=
HYPERLIQUID_PRIVATE_KEY=
HYPERLIQUID_URL_LIVE=https://api.hyperliquid.xyz
HYPERLIQUID_URL_TESTNET=https://api.hyperliquid-testnet.xyz
HYPERLIQUID_MAX_LEVERAGE=3

# ─── EXCHANGE — ALPACA ────────────────────────────────────
ALPACA_API_KEY_ID=
ALPACA_SECRET_KEY=
ALPACA_URL_LIVE=https://api.alpaca.markets
ALPACA_URL_TESTNET=https://paper-api.alpaca.markets
ALPACA_MAX_LEVERAGE=1

# ─── DATA PROVIDERS — MACRO ───────────────────────────────
FRED_API_KEY=
FINNHUB_API_KEY=

# ─── DATA PROVIDERS — SENTIMENT ───────────────────────────
CRYPTOPANIC_KEY=
LUNARCRUSH_KEY=
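As the header comment suggests, the two bearer tokens can be generated with the standard library; for instance:

```python
import secrets

# Generate two independent, URL-safe tokens (one per environment).
testnet_token = secrets.token_urlsafe(32)
mainnet_token = secrets.token_urlsafe(32)
```

Paste the two values into TESTNET_TOKEN and MAINNET_TOKEN in .env.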
  • Step 2: Update .gitignore

Add (if not already present):

.env
secrets/

Remove any lines tied to obsolete V1 structures (docker-compose.local.yml is already committed, but .gitignore remains valid).

  • Step 3: Commit
git add .env.example .gitignore
git commit -m "chore(V2): consolidated .env.example, .env gitignored"

Task 1.2: Pydantic Settings tests

Files:

  • Create: tests/unit/test_settings.py

  • Step 1: Write the tests

from __future__ import annotations

import pytest
from pydantic import ValidationError


def _minimal_env(**overrides) -> dict:
    base = {
        "TESTNET_TOKEN": "t_test_123",
        "MAINNET_TOKEN": "t_live_456",
        "DERIBIT_CLIENT_ID": "id",
        "DERIBIT_CLIENT_SECRET": "secret",
        "DERIBIT_URL_LIVE": "https://www.deribit.com/api/v2",
        "DERIBIT_URL_TESTNET": "https://test.deribit.com/api/v2",
        "BYBIT_API_KEY": "k",
        "BYBIT_API_SECRET": "s",
        "BYBIT_URL_LIVE": "https://api.bybit.com",
        "BYBIT_URL_TESTNET": "https://api-testnet.bybit.com",
        "HYPERLIQUID_WALLET_ADDRESS": "0xabc",
        "HYPERLIQUID_API_WALLET_ADDRESS": "0xdef",
        "HYPERLIQUID_PRIVATE_KEY": "0x123",
        "HYPERLIQUID_URL_LIVE": "https://api.hyperliquid.xyz",
        "HYPERLIQUID_URL_TESTNET": "https://api.hyperliquid-testnet.xyz",
        "ALPACA_API_KEY_ID": "k",
        "ALPACA_SECRET_KEY": "s",
        "ALPACA_URL_LIVE": "https://api.alpaca.markets",
        "ALPACA_URL_TESTNET": "https://paper-api.alpaca.markets",
        "FRED_API_KEY": "x",
        "FINNHUB_API_KEY": "y",
        "CRYPTOPANIC_KEY": "z",
        "LUNARCRUSH_KEY": "w",
    }
    base.update(overrides)
    return base


def test_settings_load_minimal(monkeypatch):
    for k, v in _minimal_env().items():
        monkeypatch.setenv(k, v)
    monkeypatch.setenv("PORT", "9123")

    from cerbero_mcp.settings import Settings

    s = Settings()
    assert s.port == 9123
    assert s.host == "0.0.0.0"
    assert s.testnet_token.get_secret_value() == "t_test_123"
    assert s.mainnet_token.get_secret_value() == "t_live_456"
    assert s.deribit.url_testnet.endswith("test.deribit.com/api/v2")
    assert s.bybit.max_leverage == 3
    assert s.alpaca.max_leverage == 1


def test_settings_missing_token_fails(monkeypatch):
    env = _minimal_env()
    env.pop("TESTNET_TOKEN")
    for k, v in env.items():
        monkeypatch.setenv(k, v)
    monkeypatch.delenv("TESTNET_TOKEN", raising=False)

    from cerbero_mcp.settings import Settings

    with pytest.raises(ValidationError):
        Settings()


def test_settings_extras_ignored(monkeypatch):
    for k, v in _minimal_env().items():
        monkeypatch.setenv(k, v)
    monkeypatch.setenv("UNRELATED_VAR", "ignored")

    from cerbero_mcp.settings import Settings

    s = Settings()
    assert s.testnet_token.get_secret_value() == "t_test_123"


def test_settings_secret_str_no_leak(monkeypatch):
    for k, v in _minimal_env().items():
        monkeypatch.setenv(k, v)

    from cerbero_mcp.settings import Settings

    s = Settings()
    assert "t_test_123" not in repr(s)
    assert "t_live_456" not in repr(s)
  • Step 2: Verify the tests fail (module doesn't exist yet)
uv run pytest tests/unit/test_settings.py -v

Expected: ImportError on cerbero_mcp.settings.

  • Step 3: Implement src/cerbero_mcp/settings.py
"""Pydantic Settings: reads .env and environment variables."""
from __future__ import annotations

from pydantic import Field, SecretStr
from pydantic_settings import BaseSettings, SettingsConfigDict


class _Sub(BaseSettings):
    """Base for sub-settings; shares the env_file model_config."""
    model_config = SettingsConfigDict(
        env_file=".env",
        env_file_encoding="utf-8",
        extra="ignore",
    )


class DeribitSettings(_Sub):
    model_config = SettingsConfigDict(
        env_file=".env",
        env_file_encoding="utf-8",
        env_prefix="DERIBIT_",
        extra="ignore",
    )
    client_id: str
    client_secret: SecretStr
    url_live: str
    url_testnet: str
    max_leverage: int = 3


class BybitSettings(_Sub):
    model_config = SettingsConfigDict(
        env_file=".env",
        env_file_encoding="utf-8",
        env_prefix="BYBIT_",
        extra="ignore",
    )
    api_key: str
    api_secret: SecretStr
    url_live: str
    url_testnet: str
    max_leverage: int = 3


class HyperliquidSettings(_Sub):
    model_config = SettingsConfigDict(
        env_file=".env",
        env_file_encoding="utf-8",
        env_prefix="HYPERLIQUID_",
        extra="ignore",
    )
    wallet_address: str
    api_wallet_address: str
    private_key: SecretStr
    url_live: str
    url_testnet: str
    max_leverage: int = 3


class AlpacaSettings(_Sub):
    model_config = SettingsConfigDict(
        env_file=".env",
        env_file_encoding="utf-8",
        env_prefix="ALPACA_",
        extra="ignore",
    )
    api_key_id: str
    secret_key: SecretStr
    url_live: str
    url_testnet: str
    max_leverage: int = 1


class MacroSettings(_Sub):
    model_config = SettingsConfigDict(
        env_file=".env",
        env_file_encoding="utf-8",
        extra="ignore",
    )
    fred_api_key: SecretStr
    finnhub_api_key: SecretStr


class SentimentSettings(_Sub):
    model_config = SettingsConfigDict(
        env_file=".env",
        env_file_encoding="utf-8",
        extra="ignore",
    )
    cryptopanic_key: SecretStr
    lunarcrush_key: SecretStr


class Settings(_Sub):
    host: str = "0.0.0.0"
    port: int = 9000
    log_level: str = "info"

    testnet_token: SecretStr
    mainnet_token: SecretStr

    deribit: DeribitSettings = Field(default_factory=DeribitSettings)
    bybit: BybitSettings = Field(default_factory=BybitSettings)
    hyperliquid: HyperliquidSettings = Field(default_factory=HyperliquidSettings)
    alpaca: AlpacaSettings = Field(default_factory=AlpacaSettings)
    macro: MacroSettings = Field(default_factory=MacroSettings)
    sentiment: SentimentSettings = Field(default_factory=SentimentSettings)
  • Step 4: Verify the tests pass
uv run pytest tests/unit/test_settings.py -v

Expected: 4 PASSED.

  • Step 5: Commit
git add src/cerbero_mcp/settings.py tests/unit/test_settings.py
git commit -m "feat(V2): pydantic settings with SecretStr + tests"
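The env_prefix convention used by the sub-settings above maps prefixed variables to lowercase field names (DERIBIT_CLIENT_ID → deribit.client_id). A stdlib-only illustration of that naming rule (not pydantic-settings' actual implementation):

```python
def collect_prefixed(environ: dict[str, str], prefix: str) -> dict[str, str]:
    """Keep only variables starting with `prefix`, strip it, lowercase the rest."""
    return {
        key[len(prefix):].lower(): value
        for key, value in environ.items()
        if key.startswith(prefix)
    }

fields = collect_prefixed({"DERIBIT_CLIENT_ID": "id", "PORT": "9000"}, "DERIBIT_")
# fields == {"client_id": "id"}  — PORT is left for the top-level Settings
```

This is why the same .env file can feed both the top-level Settings and each exchange's sub-settings without collisions.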

PHASE 2 — Bearer auth

Task 2.1: Auth middleware tests

Files:

  • Create: tests/unit/test_auth.py

  • Step 1: Write the tests

from __future__ import annotations

from fastapi import FastAPI
from fastapi.testclient import TestClient


def test_health_no_auth_required():
    from cerbero_mcp.auth import install_auth_middleware
    fa = FastAPI()
    install_auth_middleware(fa, testnet_token="t", mainnet_token="m")

    @fa.get("/health")
    def h():
        return {"ok": True}

    c = TestClient(fa)
    r = c.get("/health")
    assert r.status_code == 200


def test_apidocs_no_auth_required():
    from cerbero_mcp.auth import install_auth_middleware
    fa = FastAPI(docs_url="/apidocs")
    install_auth_middleware(fa, testnet_token="t", mainnet_token="m")

    c = TestClient(fa)
    r = c.get("/apidocs")
    assert r.status_code == 200
    r = c.get("/openapi.json")
    assert r.status_code == 200


def test_missing_authorization_header_401():
    from cerbero_mcp.auth import install_auth_middleware
    fa = FastAPI()
    install_auth_middleware(fa, testnet_token="t", mainnet_token="m")

    @fa.get("/mcp-deribit/health")
    def h():
        return {"ok": True}

    c = TestClient(fa)
    r = c.get("/mcp-deribit/health")
    assert r.status_code == 401


def test_invalid_bearer_401():
    from cerbero_mcp.auth import install_auth_middleware
    fa = FastAPI()
    install_auth_middleware(fa, testnet_token="t", mainnet_token="m")

    @fa.get("/mcp-deribit/health")
    def h():
        return {"ok": True}

    c = TestClient(fa)
    r = c.get("/mcp-deribit/health", headers={"Authorization": "Bearer wrong"})
    assert r.status_code == 401


def test_testnet_token_sets_env_testnet():
    from cerbero_mcp.auth import install_auth_middleware
    from fastapi import Request

    fa = FastAPI()
    install_auth_middleware(fa, testnet_token="tk_test", mainnet_token="tk_live")

    @fa.get("/mcp-deribit/peek")
    def peek(request: Request):
        return {"env": request.state.environment}

    c = TestClient(fa)
    r = c.get("/mcp-deribit/peek", headers={"Authorization": "Bearer tk_test"})
    assert r.status_code == 200
    assert r.json() == {"env": "testnet"}


def test_mainnet_token_sets_env_mainnet():
    from cerbero_mcp.auth import install_auth_middleware
    from fastapi import Request

    fa = FastAPI()
    install_auth_middleware(fa, testnet_token="tk_test", mainnet_token="tk_live")

    @fa.get("/mcp-deribit/peek")
    def peek(request: Request):
        return {"env": request.state.environment}

    c = TestClient(fa)
    r = c.get("/mcp-deribit/peek", headers={"Authorization": "Bearer tk_live"})
    assert r.status_code == 200
    assert r.json() == {"env": "mainnet"}


def test_uses_compare_digest():
    """Verify that _check_token uses secrets.compare_digest (timing-safe)."""
    import inspect
    from cerbero_mcp import auth

    src = inspect.getsource(auth)
    assert "compare_digest" in src, "auth.py must use secrets.compare_digest"
  • Step 2: Verify the tests fail
uv run pytest tests/unit/test_auth.py -v

Expected: ImportError on cerbero_mcp.auth.

  • Step 3: Implement src/cerbero_mcp/auth.py
"""Bearer auth middleware: bearer token → request.state.environment."""
from __future__ import annotations

import secrets
from typing import Literal

from fastapi import FastAPI, Request, status
from fastapi.responses import JSONResponse

Environment = Literal["testnet", "mainnet"]

WHITELIST_PATHS = frozenset({"/health", "/apidocs", "/openapi.json"})


def _extract_bearer(auth_header: str) -> str | None:
    if not auth_header.startswith("Bearer "):
        return None
    token = auth_header[len("Bearer "):].strip()
    return token or None


def _check_token(
    candidate: str, testnet_token: str, mainnet_token: str
) -> Environment | None:
    if secrets.compare_digest(candidate, testnet_token):
        return "testnet"
    if secrets.compare_digest(candidate, mainnet_token):
        return "mainnet"
    return None


def install_auth_middleware(
    app: FastAPI,
    *,
    testnet_token: str,
    mainnet_token: str,
) -> None:
    """Register the bearer auth middleware on the FastAPI app."""

    @app.middleware("http")
    async def auth_middleware(request: Request, call_next):
        if request.url.path in WHITELIST_PATHS:
            return await call_next(request)

        token = _extract_bearer(request.headers.get("Authorization", ""))
        if token is None:
            return JSONResponse(
                status_code=status.HTTP_401_UNAUTHORIZED,
                content={"error": {"code": "UNAUTHORIZED",
                                   "message": "missing or malformed bearer token"}},
            )
        env = _check_token(token, testnet_token, mainnet_token)
        if env is None:
            return JSONResponse(
                status_code=status.HTTP_401_UNAUTHORIZED,
                content={"error": {"code": "UNAUTHORIZED",
                                   "message": "invalid token"}},
            )
        request.state.environment = env
        return await call_next(request)
  • Step 4: Verify the tests pass
uv run pytest tests/unit/test_auth.py -v

Expected: 7 PASSED.

  • Step 5: Commit
git add src/cerbero_mcp/auth.py tests/unit/test_auth.py
git commit -m "feat(V2): bearer auth middleware with compare_digest"

PHASE 3 — common/ migration (reusable code)

The goal of this phase is to copy the mcp_common.{indicators,options,microstructure,stats,http,audit,logging,mcp_bridge} modules from services/common/src/mcp_common/ to src/cerbero_mcp/common/, updating only the imports. The business logic stays identical.

Task 3.1: Copy the common modules 1:1

Files:

  • Create: src/cerbero_mcp/common/{indicators,options,microstructure,stats,http,audit,logging,mcp_bridge}.py

  • Step 1: Copy indicators.py

cp services/common/src/mcp_common/indicators.py src/cerbero_mcp/common/indicators.py

Update the imports at the top of the file: replace every from mcp_common.X with from cerbero_mcp.common.X. Use:

sed -i 's|from mcp_common\.|from cerbero_mcp.common.|g' src/cerbero_mcp/common/indicators.py
sed -i 's|^import mcp_common\.|import cerbero_mcp.common.|g' src/cerbero_mcp/common/indicators.py
  • Step 2: Copy options.py, microstructure.py, stats.py
for f in options microstructure stats; do
  cp "services/common/src/mcp_common/$f.py" "src/cerbero_mcp/common/$f.py"
  sed -i 's|from mcp_common\.|from cerbero_mcp.common.|g' "src/cerbero_mcp/common/$f.py"
done
  • Step 3: Copy http.py, audit.py, logging.py, mcp_bridge.py
for f in http audit logging mcp_bridge; do
  cp "services/common/src/mcp_common/$f.py" "src/cerbero_mcp/common/$f.py"
  sed -i 's|from mcp_common\.|from cerbero_mcp.common.|g' "src/cerbero_mcp/common/$f.py"
done
  • Step 4: Verify imports are clean
grep -rn "mcp_common" src/cerbero_mcp/common/ || echo "OK: no mcp_common leftovers"

Expected: output OK: no mcp_common leftovers.

  • Step 5: Smoke import
uv run python -c "from cerbero_mcp.common import indicators, options, microstructure, stats, http, audit, logging, mcp_bridge; print('all imports OK')"

Expected: all imports OK.

  • Step 6: Commit
git add src/cerbero_mcp/common/
git commit -m "feat(V2): migrate common/ (indicators, options, microstructure, stats, http, audit, logging, mcp_bridge)"

Task 3.2: Migrate the common tests

Files:

  • Create: tests/unit/common/test_indicators.py (and analogous files)

  • Step 1: Copy all the existing common tests

mkdir -p tests/unit/common
cp -r services/common/tests/* tests/unit/common/
  • Step 2: Update the imports in the tests
find tests/unit/common -name '*.py' -exec sed -i 's|from mcp_common\.|from cerbero_mcp.common.|g' {} \;
find tests/unit/common -name '*.py' -exec sed -i 's|^import mcp_common\.|import cerbero_mcp.common.|g' {} \;
  • Step 3: Run the tests
uv run pytest tests/unit/common/ -v

Expected: all tests PASS (same pass rate as in V1).

If a test fails because of a stale import, rerun sed. If it fails because of a missing dependency, add it to [project].dependencies and run uv sync.

  • Step 4: Commit
git add tests/unit/common/
git commit -m "test(V2): migrate common/ tests"

Task 3.3: Extract errors.py (error envelope) from mcp_common.server

Files:

  • Create: src/cerbero_mcp/common/errors.py

  • Step 1: Create errors.py with the error envelope logic

"""Standard error envelope for all MCP tools."""
from __future__ import annotations

import uuid
from datetime import UTC, datetime
from typing import Any


def error_envelope(
    *,
    type_: str,
    code: str,
    message: str,
    retryable: bool,
    suggested_fix: str | None = None,
    details: dict | None = None,
    request_id: str | None = None,
) -> dict:
    env: dict[str, Any] = {
        "error": {
            "type": type_,
            "code": code,
            "message": message,
            "retryable": retryable,
        },
        "request_id": request_id or uuid.uuid4().hex,
        "data_timestamp": datetime.now(UTC).isoformat(),
    }
    if suggested_fix:
        env["error"]["suggested_fix"] = suggested_fix
    if details:
        env["error"]["details"] = details
    return env


HTTP_CODE_MAP = {
    400: "BAD_REQUEST",
    401: "UNAUTHORIZED",
    403: "FORBIDDEN",
    404: "NOT_FOUND",
    408: "TIMEOUT",
    409: "CONFLICT",
    422: "VALIDATION_ERROR",
    429: "RATE_LIMIT",
    500: "INTERNAL_ERROR",
    502: "UPSTREAM_ERROR",
    503: "UNAVAILABLE",
    504: "GATEWAY_TIMEOUT",
}

RETRYABLE_STATUSES = frozenset({408, 429, 502, 503, 504})
  • Step 2: Envelope tests

Create tests/unit/common/test_errors.py:

from __future__ import annotations

from cerbero_mcp.common.errors import error_envelope, HTTP_CODE_MAP


def test_envelope_minimal():
    e = error_envelope(type_="x", code="C", message="m", retryable=False)
    assert e["error"]["code"] == "C"
    assert e["error"]["retryable"] is False
    assert "request_id" in e
    assert "data_timestamp" in e


def test_envelope_with_suggested_fix_and_details():
    e = error_envelope(
        type_="validation_error",
        code="INVALID_INPUT",
        message="bad",
        retryable=False,
        suggested_fix="check field x",
        details={"field": "x"},
    )
    assert e["error"]["suggested_fix"] == "check field x"
    assert e["error"]["details"] == {"field": "x"}


def test_http_code_map_has_common_codes():
    assert HTTP_CODE_MAP[401] == "UNAUTHORIZED"
    assert HTTP_CODE_MAP[502] == "UPSTREAM_ERROR"
uv run pytest tests/unit/common/test_errors.py -v

Expected: 3 PASSED.

  • Step 3: Commit
git add src/cerbero_mcp/common/errors.py tests/unit/common/test_errors.py
git commit -m "feat(V2): error envelope module extracted from server.py"
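When a tool call fails against an upstream, the two maps above are typically combined to classify the failure before wrapping it in an envelope. A self-contained sketch (constants duplicated from errors.py for illustration):

```python
# Subset of the maps defined in errors.py, restated so this sketch runs standalone.
HTTP_CODE_MAP = {408: "TIMEOUT", 429: "RATE_LIMIT", 500: "INTERNAL_ERROR",
                 502: "UPSTREAM_ERROR", 503: "UNAVAILABLE", 504: "GATEWAY_TIMEOUT"}
RETRYABLE_STATUSES = frozenset({408, 429, 502, 503, 504})

def classify(status: int) -> tuple[str, bool]:
    """Return (error code, retryable?) for an upstream HTTP status."""
    return HTTP_CODE_MAP.get(status, "INTERNAL_ERROR"), status in RETRYABLE_STATUSES

# classify(429) → ("RATE_LIMIT", True); classify(500) → ("INTERNAL_ERROR", False)
```

The `retryable` flag is what lets the bot decide whether to back off and retry or surface the error immediately.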

PHASE 4 — Client registry

Task 4.1: ClientRegistry tests

Files:

  • Create: tests/unit/test_client_registry.py

  • Step 1: Write the tests

from __future__ import annotations

import asyncio

import pytest


@pytest.mark.asyncio
async def test_registry_lazy_build():
    from cerbero_mcp.client_registry import ClientRegistry

    builds: list[tuple[str, str]] = []

    async def fake_build(exchange: str, env: str):
        builds.append((exchange, env))

        class C:
            async def aclose(self):
                pass
        return C()

    reg = ClientRegistry(builder=fake_build)
    c = await reg.get("deribit", "testnet")
    assert builds == [("deribit", "testnet")]
    assert (await reg.get("deribit", "testnet")) is c  # cached
    assert builds == [("deribit", "testnet")]


@pytest.mark.asyncio
async def test_registry_different_keys_different_clients():
    from cerbero_mcp.client_registry import ClientRegistry

    async def fake_build(exchange: str, env: str):
        class C:
            tag = (exchange, env)
            async def aclose(self): ...
        return C()

    reg = ClientRegistry(builder=fake_build)
    a = await reg.get("deribit", "testnet")
    b = await reg.get("deribit", "mainnet")
    c = await reg.get("bybit", "testnet")
    assert a is not b
    assert a.tag == ("deribit", "testnet")
    assert b.tag == ("deribit", "mainnet")
    assert c.tag == ("bybit", "testnet")


@pytest.mark.asyncio
async def test_registry_concurrent_get_one_build():
    from cerbero_mcp.client_registry import ClientRegistry

    counter = {"calls": 0}

    async def fake_build(exchange: str, env: str):
        counter["calls"] += 1
        await asyncio.sleep(0.05)

        class C:
            async def aclose(self): ...
        return C()

    reg = ClientRegistry(builder=fake_build)
    results = await asyncio.gather(
        *[reg.get("deribit", "testnet") for _ in range(10)]
    )
    assert counter["calls"] == 1
    assert all(r is results[0] for r in results)


@pytest.mark.asyncio
async def test_registry_aclose_calls_all():
    from cerbero_mcp.client_registry import ClientRegistry

    closed: list[tuple[str, str]] = []

    async def fake_build(exchange: str, env: str):
        class C:
            async def aclose(self):
                closed.append((exchange, env))
        return C()

    reg = ClientRegistry(builder=fake_build)
    await reg.get("deribit", "testnet")
    await reg.get("bybit", "mainnet")
    await reg.aclose()
    assert sorted(closed) == sorted([("deribit", "testnet"), ("bybit", "mainnet")])
  • Step 2: Verify the tests fail
uv run pytest tests/unit/test_client_registry.py -v

Expected: ImportError on cerbero_mcp.client_registry.

  • Step 3: Implement src/cerbero_mcp/client_registry.py
"""Lazy cache of exchange clients, one instance per (exchange, env)."""
from __future__ import annotations

import asyncio
from collections import defaultdict
from collections.abc import Awaitable, Callable
from typing import Any, Literal

Environment = Literal["testnet", "mainnet"]
Builder = Callable[[str, Environment], Awaitable[Any]]


class ClientRegistry:
    def __init__(self, *, builder: Builder) -> None:
        self._builder = builder
        self._clients: dict[tuple[str, Environment], Any] = {}
        self._locks: dict[tuple[str, Environment], asyncio.Lock] = defaultdict(
            asyncio.Lock
        )

    async def get(self, exchange: str, env: Environment) -> Any:
        key = (exchange, env)
        if key in self._clients:
            return self._clients[key]
        async with self._locks[key]:
            if key in self._clients:  # double-check
                return self._clients[key]
            client = await self._builder(exchange, env)
            self._clients[key] = client
            return client

    async def aclose(self) -> None:
        for client in self._clients.values():
            close = getattr(client, "aclose", None)
            if close is None:
                continue
            try:
                await close()
            except Exception:
                pass
        self._clients.clear()
  • Step 4: Verify the tests pass
uv run pytest tests/unit/test_client_registry.py -v

Expected: 4 PASSED.

  • Step 5: Commit
git add src/cerbero_mcp/client_registry.py tests/unit/test_client_registry.py
git commit -m "feat(V2): lazy ClientRegistry with per-key lock"
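The double-checked locking in get() is what guarantees a single build per key under concurrency; a self-contained miniature of that pattern (not the registry itself):

```python
import asyncio
from collections import defaultdict

class LazyCache:
    """Miniature of ClientRegistry's locking: one build per key, even under races."""
    def __init__(self, builder):
        self._builder = builder
        self._items: dict = {}
        self._locks: dict = defaultdict(asyncio.Lock)

    async def get(self, key):
        if key in self._items:           # fast path, no lock
            return self._items[key]
        async with self._locks[key]:
            if key not in self._items:   # re-check after acquiring the lock
                self._items[key] = await self._builder(key)
            return self._items[key]

builds = 0

async def build(key):
    global builds
    builds += 1
    await asyncio.sleep(0.01)            # widen the race window
    return object()

async def main():
    cache = LazyCache(build)
    results = await asyncio.gather(*[cache.get("deribit:testnet") for _ in range(10)])
    return all(r is results[0] for r in results)

all_identical = asyncio.run(main())
```

Without the re-check inside the lock, every coroutine that missed the fast path would build its own client once it acquired the lock in turn.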

PHASE 5 — Server skeleton + Swagger

Task 5.1: build_app with Swagger

Files:

  • Create: src/cerbero_mcp/server.py

  • Create: tests/unit/test_server.py

  • Step 1: build_app tests

from __future__ import annotations

import pytest
from fastapi.testclient import TestClient


@pytest.fixture
def app():
    from cerbero_mcp.server import build_app
    return build_app(
        testnet_token="tk_test",
        mainnet_token="tk_live",
        title="Test",
        version="2.0.0",
    )


def test_apidocs_served(app):
    c = TestClient(app)
    r = c.get("/apidocs")
    assert r.status_code == 200
    assert "swagger" in r.text.lower()


def test_openapi_json_served(app):
    c = TestClient(app)
    r = c.get("/openapi.json")
    assert r.status_code == 200
    spec = r.json()
    assert spec["info"]["title"] == "Test"
    # BearerAuth securityScheme present
    assert "BearerAuth" in spec["components"]["securitySchemes"]
    assert spec["components"]["securitySchemes"]["BearerAuth"]["scheme"] == "bearer"


def test_redoc_disabled(app):
    c = TestClient(app)
    r = c.get("/redoc")
    assert r.status_code == 404


def test_default_docs_path_disabled(app):
    c = TestClient(app)
    r = c.get("/docs")
    assert r.status_code == 404


def test_health_endpoint(app):
    c = TestClient(app)
    r = c.get("/health")
    assert r.status_code == 200
    j = r.json()
    assert j["status"] == "healthy"
    assert j["version"] == "2.0.0"
    assert "uptime_seconds" in j
    assert "data_timestamp" in j


def test_x_duration_ms_header(app):
    c = TestClient(app)
    r = c.get("/health")
    assert "X-Duration-Ms" in r.headers
  • Step 2: Verify the tests fail
uv run pytest tests/unit/test_server.py -v

Expected: ImportError.

  • Step 3: Implement src/cerbero_mcp/server.py
"""FastAPI app factory with middleware, Swagger, and exception handlers."""
from __future__ import annotations

import json
import time
from datetime import UTC, datetime
from typing import Any

from fastapi import FastAPI, HTTPException, Request
from fastapi.exceptions import RequestValidationError
from fastapi.openapi.utils import get_openapi
from fastapi.responses import JSONResponse, Response
from starlette.middleware.base import BaseHTTPMiddleware

from cerbero_mcp.auth import install_auth_middleware
from cerbero_mcp.common.errors import (
    HTTP_CODE_MAP,
    RETRYABLE_STATUSES,
    error_envelope,
)


class _TimestampInjectorMiddleware(BaseHTTPMiddleware):
    async def dispatch(self, request: Request, call_next):
        response = await call_next(request)
        path = request.url.path
        if "/tools/" not in path:
            return response
        ctype = response.headers.get("content-type", "")
        if "application/json" not in ctype:
            return response
        body = b""
        async for chunk in response.body_iterator:
            body += chunk
        ts = datetime.now(UTC).isoformat()
        try:
            data = json.loads(body) if body else None
        except Exception:
            headers = dict(response.headers)
            headers["X-Data-Timestamp"] = ts
            return Response(
                content=body, status_code=response.status_code,
                headers=headers, media_type=response.media_type,
            )
        modified = False
        if isinstance(data, dict) and "data_timestamp" not in data:
            data["data_timestamp"] = ts
            modified = True
        elif isinstance(data, list):
            for item in data:
                if isinstance(item, dict) and "data_timestamp" not in item:
                    item["data_timestamp"] = ts
                    modified = True
        headers = dict(response.headers)
        headers["X-Data-Timestamp"] = ts
        if modified:
            new_body = json.dumps(data, default=str).encode()
            headers.pop("content-length", None)
            return Response(
                content=new_body, status_code=response.status_code,
                headers=headers, media_type="application/json",
            )
        return Response(
            content=body, status_code=response.status_code,
            headers=headers, media_type=response.media_type,
        )


def build_app(
    *,
    testnet_token: str,
    mainnet_token: str,
    title: str = "Cerbero MCP",
    version: str = "2.0.0",
    description: str = (
        "Multi-exchange MCP server. "
        "Bearer token decides environment (testnet/mainnet)."
    ),
) -> FastAPI:
    app = FastAPI(
        title=title,
        version=version,
        description=description,
        docs_url="/apidocs",
        redoc_url=None,
        openapi_url="/openapi.json",
        swagger_ui_parameters={
            "persistAuthorization": True,
            "displayRequestDuration": True,
            "filter": True,
            "tryItOutEnabled": True,
            "tagsSorter": "alpha",
            "operationsSorter": "alpha",
        },
    )
    app.state.boot_at = time.time()

    install_auth_middleware(
        app, testnet_token=testnet_token, mainnet_token=mainnet_token
    )

    app.add_middleware(_TimestampInjectorMiddleware)

    @app.middleware("http")
    async def latency_header(request: Request, call_next):
        t0 = time.perf_counter()
        response = await call_next(request)
        dur_ms = (time.perf_counter() - t0) * 1000
        response.headers["X-Duration-Ms"] = f"{dur_ms:.2f}"
        return response

    @app.exception_handler(HTTPException)
    async def _http_exc(request: Request, exc: HTTPException):
        retryable = exc.status_code in RETRYABLE_STATUSES
        code = HTTP_CODE_MAP.get(exc.status_code, f"HTTP_{exc.status_code}")
        message = "HTTP error"
        details: dict | None = None
        detail = exc.detail
        if isinstance(detail, dict):
            if isinstance(detail.get("error"), str):
                code = detail["error"].upper()
            message = str(detail.get("message") or detail.get("error") or message)
            details = detail
        elif isinstance(detail, str):
            message = detail
        return JSONResponse(
            status_code=exc.status_code,
            content=error_envelope(
                type_="http_error", code=code, message=message,
                retryable=retryable, details=details,
            ),
        )

    @app.exception_handler(RequestValidationError)
    async def _val_exc(request: Request, exc: RequestValidationError):
        errs = exc.errors()
        first_loc = ".".join(str(x) for x in errs[0]["loc"]) if errs else "body"
        suggestion = (
            f"check field '{first_loc}': "
            + (errs[0]["msg"] if errs else "invalid input")
        )
        safe_errs: list[dict] = []
        for e in errs[:5]:
            ne: dict = {}
            for k, v in e.items():
                if k == "ctx" and isinstance(v, dict):
                    ne[k] = {ck: str(cv) for ck, cv in v.items()}
                else:
                    ne[k] = v
            safe_errs.append(ne)
        return JSONResponse(
            status_code=422,
            content=error_envelope(
                type_="validation_error", code="INVALID_INPUT",
                message=f"request body validation failed on {first_loc}",
                retryable=False, suggested_fix=suggestion,
                details={"errors": safe_errs},
            ),
        )

    @app.exception_handler(Exception)
    async def _unhandled(request: Request, exc: Exception):
        return JSONResponse(
            status_code=500,
            content=error_envelope(
                type_="internal_error", code="UNHANDLED_EXCEPTION",
                message=f"{type(exc).__name__}: {str(exc)[:300]}",
                retryable=True,
            ),
        )

    @app.get("/health", tags=["system"])
    def health():
        return {
            "status": "healthy",
            "name": title,
            "version": version,
            "uptime_seconds": int(time.time() - app.state.boot_at),
            "data_timestamp": datetime.now(UTC).isoformat(),
        }

    def _custom_openapi() -> dict[str, Any]:
        if app.openapi_schema:
            return app.openapi_schema
        schema = get_openapi(
            title=app.title, version=app.version,
            description=app.description, routes=app.routes,
        )
        schema.setdefault("components", {})
        schema["components"]["securitySchemes"] = {
            "BearerAuth": {
                "type": "http",
                "scheme": "bearer",
                "description": (
                    "Use TESTNET_TOKEN for testnet routing, "
                    "MAINNET_TOKEN for mainnet."
                ),
            }
        }
        schema["security"] = [{"BearerAuth": []}]
        app.openapi_schema = schema
        return schema

    app.openapi = _custom_openapi
    return app
  • Step 4: Verify the tests
uv run pytest tests/unit/test_server.py -v

Expected: 6 PASSED.

  • Step 5: Commit
git add src/cerbero_mcp/server.py tests/unit/test_server.py
git commit -m "feat(V2): build_app con swagger /apidocs + middleware + handlers"

PHASE 6 — Exchange migration (deribit template + 5 repetitions)

The same pattern applies to each of the 6 services (deribit, bybit, hyperliquid, alpaca, macro, sentiment). The pattern is spelled out in detail for deribit (Tasks 6.1–6.4); the remaining tasks (6.5–6.10) follow EXACTLY the same template, substituting only the names.

Pattern for each exchange:

  1. Create src/cerbero_mcp/exchanges/{exchange}/{__init__,client,leverage_cap,tools}.py
  2. Create src/cerbero_mcp/routers/{exchange}.py
  3. Migrate the service's tests into tests/unit/exchanges/{exchange}/
  4. Add an entry to the client_registry builder

Task 6.1: Deribit migration — code

Files:

  • Create: src/cerbero_mcp/exchanges/deribit/{__init__,client,leverage_cap,tools}.py

  • Step 1: Copy the files from the old service

mkdir -p src/cerbero_mcp/exchanges/deribit
touch src/cerbero_mcp/exchanges/deribit/__init__.py
cp services/mcp-deribit/src/mcp_deribit/client.py src/cerbero_mcp/exchanges/deribit/client.py
cp services/mcp-deribit/src/mcp_deribit/leverage_cap.py src/cerbero_mcp/exchanges/deribit/leverage_cap.py
  • Step 2: Update the imports
for f in src/cerbero_mcp/exchanges/deribit/*.py; do
  sed -i 's|from mcp_common\.|from cerbero_mcp.common.|g' "$f"
  sed -i 's|from mcp_deribit\.|from cerbero_mcp.exchanges.deribit.|g' "$f"
  sed -i 's|^import mcp_common\.|import cerbero_mcp.common.|g' "$f"
  sed -i 's|^import mcp_deribit\.|import cerbero_mcp.exchanges.deribit.|g' "$f"
done
  • Step 3: Extract the tools from the old server.py

The file services/mcp-deribit/src/mcp_deribit/server.py (695 lines) contains both the FastAPI registration and the tool logic. Create src/cerbero_mcp/exchanges/deribit/tools.py with ONLY the logic (async functions that receive a client and params, not endpoints):

Open services/mcp-deribit/src/mcp_deribit/server.py. For each handler decorated with @app.post("/tools/<NAME>") (e.g. place_order, get_ticker, get_dvol, ...):

  • Extract the function body into an async function async def NAME(client: DeribitClient, params: ToolNameRequest) -> ToolNameResponse
  • Move the ToolNameRequest/ToolNameResponse Pydantic models to the top of tools.py
  • Add examples=[...] to the model_config of every Request model (at least 1 realistic example)

Example (excerpt):

# src/cerbero_mcp/exchanges/deribit/tools.py
from __future__ import annotations

from pydantic import BaseModel
from typing import Literal

from cerbero_mcp.exchanges.deribit.client import DeribitClient


class PlaceOrderRequest(BaseModel):
    instrument_name: str
    side: Literal["buy", "sell"]
    amount: float
    order_type: Literal["market", "limit"] = "market"
    price: float | None = None

    model_config = {
        "json_schema_extra": {
            "examples": [
                {
                    "summary": "Market buy 0.1 BTC perpetual",
                    "value": {
                        "instrument_name": "BTC-PERPETUAL",
                        "side": "buy",
                        "amount": 0.1,
                    },
                }
            ]
        }
    }


class PlaceOrderResponse(BaseModel):
    order_id: str
    status: str
    filled_amount: float | None = None


async def place_order(
    client: DeribitClient, params: PlaceOrderRequest
) -> PlaceOrderResponse:
    res = await client.place_order(
        instrument_name=params.instrument_name,
        side=params.side,
        amount=params.amount,
        type_=params.order_type,
        price=params.price,
    )
    return PlaceOrderResponse(
        order_id=res["order"]["order_id"],
        status=res["order"]["order_state"],
        filled_amount=res["order"].get("filled_amount"),
    )

Repeat for every deribit tool (the list is in README.md, section "Deribit").

  • Step 4: Verify the imports
uv run python -c "from cerbero_mcp.exchanges.deribit import client, leverage_cap, tools; print('OK')"

Expected: OK.

  • Step 5: Commit
git add src/cerbero_mcp/exchanges/deribit/
git commit -m "feat(V2): migrazione deribit (client, leverage_cap, tools)"

Task 6.2: Deribit migration — router

Files:

  • Create: src/cerbero_mcp/routers/deribit.py

  • Step 1: Implement the router

"""Router for /mcp-deribit/* with dependency injection for env and client."""
from __future__ import annotations

from typing import Literal

from fastapi import APIRouter, Depends, Request

from cerbero_mcp.client_registry import ClientRegistry
from cerbero_mcp.exchanges.deribit import tools as deribit_tools
from cerbero_mcp.exchanges.deribit.client import DeribitClient

Environment = Literal["testnet", "mainnet"]


def get_environment(request: Request) -> Environment:
    return request.state.environment


async def get_deribit_client(
    request: Request, env: Environment = Depends(get_environment)
) -> DeribitClient:
    registry: ClientRegistry = request.app.state.registry
    return await registry.get("deribit", env)


def make_router() -> APIRouter:
    r = APIRouter(prefix="/mcp-deribit", tags=["deribit"])

    @r.post("/tools/place_order", response_model=deribit_tools.PlaceOrderResponse)
    async def place_order(
        params: deribit_tools.PlaceOrderRequest,
        client: DeribitClient = Depends(get_deribit_client),
    ):
        return await deribit_tools.place_order(client, params)

    # ... repeat for every deribit tool (~28 endpoints)
    # EVERY endpoint follows the same pattern:
    # @r.post("/tools/<name>", response_model=...Response)
    # async def <name>(params: ...Request, client = Depends(get_deribit_client)):
    #     return await deribit_tools.<name>(client, params)

    return r
  • Step 2: Router smoke test

Create tests/unit/routers/test_deribit.py:

from __future__ import annotations

from fastapi.testclient import TestClient


def test_router_mounts_correctly(monkeypatch):
    """Verifica che il router deribit risponda 401 senza bearer e che i path esistano."""
    monkeypatch.setenv("TESTNET_TOKEN", "tk_test")
    monkeypatch.setenv("MAINNET_TOKEN", "tk_live")
    # set up the minimal env vars for Settings (see test_settings.py); reuse the shared helper
    from tests.unit.test_settings import _minimal_env
    for k, v in _minimal_env().items():
        monkeypatch.setenv(k, v)

    from cerbero_mcp.server import build_app
    from cerbero_mcp.routers.deribit import make_router

    app = build_app(testnet_token="tk_test", mainnet_token="tk_live")
    app.include_router(make_router())

    c = TestClient(app)
    r = c.post("/mcp-deribit/tools/place_order", json={})
    assert r.status_code == 401
  • Step 3: Run the test
uv run pytest tests/unit/routers/test_deribit.py -v

Expected: PASS.

  • Step 4: Commit
git add src/cerbero_mcp/routers/deribit.py tests/unit/routers/
git commit -m "feat(V2): router /mcp-deribit/tools/* con bearer auth"

Task 6.3: Deribit migration — tests

Files:

  • Create: tests/unit/exchanges/deribit/

  • Step 1: Copy the existing tests

mkdir -p tests/unit/exchanges/deribit
cp services/mcp-deribit/tests/test_client.py tests/unit/exchanges/deribit/
cp services/mcp-deribit/tests/test_leverage_cap.py tests/unit/exchanges/deribit/
# skip test_environment_info (legacy boot resolver, replaced by auth)
# skip test_server_acl (old core/observer ACL)
# skip test_env_validation (legacy)
  • Step 2: Update the imports
find tests/unit/exchanges/deribit -name '*.py' -exec sed -i 's|from mcp_common\.|from cerbero_mcp.common.|g' {} \;
find tests/unit/exchanges/deribit -name '*.py' -exec sed -i 's|from mcp_deribit\.|from cerbero_mcp.exchanges.deribit.|g' {} \;
  • Step 3: Run the tests
uv run pytest tests/unit/exchanges/deribit/ -v

Expected: every test that passed in V1 passes in V2. Tests referencing the core/observer ACL have been dropped.

  • Step 4: Commit
git add tests/unit/exchanges/deribit/
git commit -m "test(V2): migrazione test deribit (client, leverage_cap)"

Task 6.4: Add deribit to the ClientRegistry builder

Files:

  • Create: src/cerbero_mcp/exchanges/__init__.py (builder)

  • Step 1: Implement the build_client function

Edit src/cerbero_mcp/exchanges/__init__.py:

"""Centralized client builder for the ClientRegistry."""
from __future__ import annotations

from typing import Literal

from cerbero_mcp.settings import Settings

Environment = Literal["testnet", "mainnet"]


async def build_client(
    settings: Settings, exchange: str, env: Environment
):
    if exchange == "deribit":
        from cerbero_mcp.exchanges.deribit.client import DeribitClient
        url = (
            settings.deribit.url_testnet
            if env == "testnet" else settings.deribit.url_live
        )
        return DeribitClient(
            client_id=settings.deribit.client_id,
            client_secret=settings.deribit.client_secret.get_secret_value(),
            base_url=url,
            max_leverage=settings.deribit.max_leverage,
        )
    raise ValueError(f"unsupported exchange: {exchange}")

(The other exchanges are added in Tasks 6.5–6.10 by extending this function.)

  • Step 2: Builder smoke test

Create tests/unit/test_exchanges_builder.py:

from __future__ import annotations

import pytest
from tests.unit.test_settings import _minimal_env


@pytest.mark.asyncio
async def test_build_client_deribit_returns_correct_url(monkeypatch):
    for k, v in _minimal_env().items():
        monkeypatch.setenv(k, v)
    from cerbero_mcp.settings import Settings
    from cerbero_mcp.exchanges import build_client

    s = Settings()
    c_test = await build_client(s, "deribit", "testnet")
    c_live = await build_client(s, "deribit", "mainnet")
    assert "test.deribit.com" in c_test.base_url
    assert "test.deribit.com" not in c_live.base_url


@pytest.mark.asyncio
async def test_build_client_unknown_exchange_raises(monkeypatch):
    for k, v in _minimal_env().items():
        monkeypatch.setenv(k, v)
    from cerbero_mcp.settings import Settings
    from cerbero_mcp.exchanges import build_client

    s = Settings()
    with pytest.raises(ValueError):
        await build_client(s, "ftx", "testnet")
uv run pytest tests/unit/test_exchanges_builder.py -v

Expected: 2 PASSED.

  • Step 3: Commit
git add src/cerbero_mcp/exchanges/__init__.py tests/unit/test_exchanges_builder.py
git commit -m "feat(V2): builder client centralizzato (solo deribit per ora)"

Task 6.5: Migrate bybit (same pattern as Tasks 6.1–6.4)

Apply exactly the same procedure as Tasks 6.1–6.4 to bybit. Substitute everywhere:

  • deribit → bybit
  • mcp_deribit → mcp_bybit
  • DeribitClient → BybitClient
  • router prefix /mcp-deribit → /mcp-bybit
  • swagger tag "deribit" → "bybit"

Bybit-specific points:

  • Settings fields: settings.bybit.api_key, settings.bybit.api_secret.get_secret_value()
  • Add to build_client:
elif exchange == "bybit":
    from cerbero_mcp.exchanges.bybit.client import BybitClient
    url = settings.bybit.url_testnet if env == "testnet" else settings.bybit.url_live
    return BybitClient(
        api_key=settings.bybit.api_key,
        api_secret=settings.bybit.api_secret.get_secret_value(),
        base_url=url,
        max_leverage=settings.bybit.max_leverage,
    )
  • Existing tests to migrate: services/mcp-bybit/tests/*.py (exclude the legacy ACL tests)
  • Bybit-specific tools: place_order, place_batch_order (combo), get_ticker, get_orderbook, get_kline, get_funding_rate, get_funding_rate_history, get_open_interest, get_basis_term_structure, get_orderbook_imbalance, technical indicators

5 steps:

  1. Copy + sed the imports (Task 6.1)
  2. Extract the tools (Task 6.1 step 3)
  3. Router (Task 6.2)
  4. Tests (Task 6.3)
  5. Builder + commit (Task 6.4)

Final commit: feat(V2): migrazione bybit completa.


Task 6.6: Migrate hyperliquid

Same procedure. Differences:

  • Auth: wallet_address + private_key (EIP-712 signing, not HMAC)
  • Settings: settings.hyperliquid.{wallet_address, api_wallet_address, private_key, url_live, url_testnet, max_leverage}
  • Add to the builder:
elif exchange == "hyperliquid":
    from cerbero_mcp.exchanges.hyperliquid.client import HyperliquidClient
    url = (
        settings.hyperliquid.url_testnet if env == "testnet"
        else settings.hyperliquid.url_live
    )
    return HyperliquidClient(
        wallet_address=settings.hyperliquid.wallet_address,
        api_wallet_address=settings.hyperliquid.api_wallet_address,
        private_key=settings.hyperliquid.private_key.get_secret_value(),
        base_url=url,
        max_leverage=settings.hyperliquid.max_leverage,
    )

Hyperliquid-specific tools: place_order, cancel_order, close_position, get_account_summary, get_positions, get_open_orders, get_ticker, get_orderbook, get_historical, get_indicators, get_funding_rate, basis_spot_perp, set_stop_loss, set_take_profit, get_markets, get_trade_history.

Final commit: feat(V2): migrazione hyperliquid completa.


Task 6.7: Migrate alpaca

Same procedure. Differences:

  • Settings: settings.alpaca.{api_key_id, secret_key, url_live, url_testnet, max_leverage}
  • Builder:
elif exchange == "alpaca":
    from cerbero_mcp.exchanges.alpaca.client import AlpacaClient
    url = (
        settings.alpaca.url_testnet if env == "testnet"
        else settings.alpaca.url_live
    )
    return AlpacaClient(
        api_key_id=settings.alpaca.api_key_id,
        secret_key=settings.alpaca.secret_key.get_secret_value(),
        base_url=url,
        max_leverage=settings.alpaca.max_leverage,
    )

Tools: place_order, amend_order, cancel_order, cancel_all_orders, close_position, close_all_positions, get_account, get_positions, get_open_orders, get_activities, get_assets, get_bars, get_calendar, get_clock, get_option_chain, get_snapshot, get_ticker.

Final commit: feat(V2): migrazione alpaca completa.


Task 6.8: Migrate macro

Same procedure. Important differences:

  • Macro has NO testnet/mainnet endpoints. There is a single client.
  • The builder ignores env:
elif exchange == "macro":
    from cerbero_mcp.exchanges.macro.client import MacroClient
    return MacroClient(
        fred_api_key=settings.macro.fred_api_key.get_secret_value(),
        finnhub_api_key=settings.macro.finnhub_api_key.get_secret_value(),
    )
  • A valid bearer is still required: the auth middleware filters before the router.
  • Cache key: ("macro", env) creates 2 separate entries even though the client is env-independent (alternatively, add registry-key normalization for macro/sentiment, see below).

Implementation decision: use both keys (("macro", "testnet") and ("macro", "mainnet")) and let the builder return two separate instances. Cost: 2 httpx client pools instead of 1, negligible. Benefit: no special case in the registry.
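This trade-off can be sketched with a minimal, hypothetical model of the Phase 4 registry cache (FakeMacroClient and the bare dict are illustrative stand-ins, not the plan's actual ClientRegistry):

```python
import asyncio


class FakeMacroClient:
    """Illustrative stand-in: the real MacroClient is env-independent."""


# Minimal model of the registry cache keyed by (exchange, env).
cache: dict[tuple[str, str], FakeMacroClient] = {}


async def get(exchange: str, env: str) -> FakeMacroClient:
    key = (exchange, env)
    if key not in cache:
        # The builder ignores env for macro, but the registry does not
        # special-case it: each key simply gets its own instance.
        cache[key] = FakeMacroClient()
    return cache[key]


async def demo() -> tuple[FakeMacroClient, FakeMacroClient]:
    return await get("macro", "testnet"), await get("macro", "mainnet")


a, b = asyncio.run(demo())
assert a is not b       # two separate instances (two httpx pools)
assert len(cache) == 2  # no registry special-casing required
```

The registry stays a dumb (exchange, env) → client map; env-independence is a property of the builder branch, not of the cache.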

Tools: get_asset_price, get_breakeven_inflation, get_cot_disaggregated, get_cot_extreme_positioning, get_cot_tff, get_economic_indicators, get_equity_futures, get_macro_calendar, get_market_overview, get_treasury_yields, get_yield_curve_slope.

Final commit: feat(V2): migrazione macro completa.


Task 6.9: Migrate sentiment

Same procedure as macro (no env distinction). Builder:

elif exchange == "sentiment":
    from cerbero_mcp.exchanges.sentiment.client import SentimentClient
    return SentimentClient(
        cryptopanic_key=settings.sentiment.cryptopanic_key.get_secret_value(),
        lunarcrush_key=settings.sentiment.lunarcrush_key.get_secret_value(),
    )

Tools: get_cointegration_pairs, get_cross_exchange_funding, get_crypto_news, get_funding_arb_spread, get_funding_rates, get_liquidation_heatmap, get_oi_history, get_social_sentiment, get_world_news.

Final commit: feat(V2): migrazione sentiment completa.


Task 6.10: Default catch-all in the builder

Files:

  • Modify: src/cerbero_mcp/exchanges/__init__.py

  • Step 1: Verify the builder branches

Open src/cerbero_mcp/exchanges/__init__.py. It must have 6 if/elif branches (deribit, bybit, hyperliquid, alpaca, macro, sentiment) plus the final raise. All branches were implemented in the previous tasks.

  • Step 2: Run the whole test suite
uv run pytest tests/ -v

Expected: all tests PASS.

  • Step 3: Consolidation commit (if changes remain)
git status
# if there are modified files:
git add -A && git commit -m "chore(V2): consolidamento builder con tutti i 6 exchange"

PHASE 7 — __main__ entrypoint and final wiring

Task 7.1: Implement the real __main__.py

Files:

  • Modify: src/cerbero_mcp/__main__.py

  • Step 1: Replace the placeholder

"""cerbero-mcp entrypoint.

Boot:
- load Settings from .env
- build the FastAPI app with one router per exchange
- create the ClientRegistry with the builder
- mount a lifespan for clean shutdown
- start uvicorn
"""
from __future__ import annotations

from contextlib import asynccontextmanager

import uvicorn
from fastapi import FastAPI

from cerbero_mcp.client_registry import ClientRegistry
from cerbero_mcp.common.logging import configure_root_logging
from cerbero_mcp.exchanges import build_client
from cerbero_mcp.routers import (
    alpaca, bybit, deribit, hyperliquid, macro, sentiment,
)
from cerbero_mcp.server import build_app
from cerbero_mcp.settings import Settings


def _make_app(settings: Settings) -> FastAPI:
    app = build_app(
        testnet_token=settings.testnet_token.get_secret_value(),
        mainnet_token=settings.mainnet_token.get_secret_value(),
        title="Cerbero MCP",
        version="2.0.0",
    )

    app.state.settings = settings

    async def builder(exchange: str, env: str):
        return await build_client(settings, exchange, env)

    app.state.registry = ClientRegistry(builder=builder)

    @asynccontextmanager
    async def lifespan(app: FastAPI):
        try:
            yield
        finally:
            await app.state.registry.aclose()

    app.router.lifespan_context = lifespan

    app.include_router(deribit.make_router())
    app.include_router(bybit.make_router())
    app.include_router(hyperliquid.make_router())
    app.include_router(alpaca.make_router())
    app.include_router(macro.make_router())
    app.include_router(sentiment.make_router())

    return app


def main() -> None:
    configure_root_logging()
    settings = Settings()
    app = _make_app(settings)
    uvicorn.run(
        app,
        log_config=None,
        host=settings.host,
        port=settings.port,
    )


if __name__ == "__main__":
    main()
  • Step 2: Smoke run without real requests

Create a development .env (copy .env.example and set at least TESTNET_TOKEN=test, MAINNET_TOKEN=test2, plus dummy values for the exchange credentials):

cp .env.example .env
# fill the placeholders with dummy values
python -c 'import secrets; print(secrets.token_urlsafe(32))'  # generate 2 tokens
# edit .env manually with dummy credential values (boot only; do not call the tools)
timeout 3 uv run cerbero-mcp || true

Expected: the server starts, listens on 0.0.0.0:9000, then is killed by the timeout. No config errors.
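For orientation, a dummy development .env might look like the sketch below. Only TESTNET_TOKEN, MAINNET_TOKEN, HOST and PORT are confirmed by this plan; the credential variable names are hypothetical and must match whatever env mapping the Phase 1 Settings actually defines (check .env.example):

```dotenv
HOST=0.0.0.0
PORT=9000
TESTNET_TOKEN=tk_test_dummy
MAINNET_TOKEN=tk_live_dummy
# hypothetical credential names; use the real ones from .env.example
DERIBIT_CLIENT_ID=dummy
DERIBIT_CLIENT_SECRET=dummy
```

Dummy values are enough for boot, since clients are built lazily on first tool call.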

  • Step 3: Integration test for app boot

Create tests/integration/test_app_boot.py:

from __future__ import annotations

from fastapi.testclient import TestClient


def test_app_boots_and_health_responds(monkeypatch):
    from tests.unit.test_settings import _minimal_env
    for k, v in _minimal_env().items():
        monkeypatch.setenv(k, v)

    from cerbero_mcp.__main__ import _make_app
    from cerbero_mcp.settings import Settings

    app = _make_app(Settings())
    c = TestClient(app)

    r = c.get("/health")
    assert r.status_code == 200
    assert r.json()["status"] == "healthy"

    r = c.get("/openapi.json")
    assert r.status_code == 200
    spec = r.json()
    paths = spec["paths"].keys()
    assert any(p.startswith("/mcp-deribit/") for p in paths)
    assert any(p.startswith("/mcp-bybit/") for p in paths)
    assert any(p.startswith("/mcp-hyperliquid/") for p in paths)
    assert any(p.startswith("/mcp-alpaca/") for p in paths)
    assert any(p.startswith("/mcp-macro/") for p in paths)
    assert any(p.startswith("/mcp-sentiment/") for p in paths)


def test_apidocs_available_after_boot(monkeypatch):
    from tests.unit.test_settings import _minimal_env
    for k, v in _minimal_env().items():
        monkeypatch.setenv(k, v)

    from cerbero_mcp.__main__ import _make_app
    from cerbero_mcp.settings import Settings

    c = TestClient(_make_app(Settings()))
    r = c.get("/apidocs")
    assert r.status_code == 200
    assert "Cerbero MCP" in r.text
uv run pytest tests/integration/test_app_boot.py -v

Expected: 2 PASSED.

  • Step 4: Commit
git add src/cerbero_mcp/__main__.py tests/integration/
git commit -m "feat(V2): __main__ con lifespan + 6 router + integration test"

PHASE 8 — Env routing integration tests

Task 8.1: HTTP stubs verifying testnet/mainnet routing

Files:

  • Create: tests/integration/test_env_routing.py

  • Step 1: Write the pytest-httpx tests

"""Integration tests: testnet bearer → testnet URL, mainnet bearer → live URL."""
from __future__ import annotations

import pytest
from fastapi.testclient import TestClient
from pytest_httpx import HTTPXMock


@pytest.fixture
def app(monkeypatch):
    from tests.unit.test_settings import _minimal_env
    for k, v in _minimal_env().items():
        monkeypatch.setenv(k, v)

    from cerbero_mcp.__main__ import _make_app
    from cerbero_mcp.settings import Settings
    return _make_app(Settings())


def test_testnet_bearer_routes_to_testnet_url(app, httpx_mock: HTTPXMock):
    """A POST /mcp-deribit/tools/get_ticker request carrying TESTNET_TOKEN
    must produce an httpx request to DERIBIT_URL_TESTNET."""
    httpx_mock.add_response(
        url="https://test.deribit.com/api/v2/public/ticker",
        json={"result": {"last_price": 60000.0}},
    )

    c = TestClient(app)
    c.post(
        "/mcp-deribit/tools/get_ticker",
        headers={"Authorization": "Bearer t_test_123"},
        json={"instrument_name": "BTC-PERPETUAL"},
    )
    # 200 or 502 are both acceptable here; what matters is that the URL hit is testnet
    requests = httpx_mock.get_requests()
    assert any("test.deribit.com" in str(req.url) for req in requests)


def test_mainnet_bearer_routes_to_live_url(app, httpx_mock: HTTPXMock):
    httpx_mock.add_response(
        url="https://www.deribit.com/api/v2/public/ticker",
        json={"result": {"last_price": 60000.0}},
    )

    c = TestClient(app)
    c.post(
        "/mcp-deribit/tools/get_ticker",
        headers={"Authorization": "Bearer t_live_456"},
        json={"instrument_name": "BTC-PERPETUAL"},
    )
    requests = httpx_mock.get_requests()
    assert any("www.deribit.com" in str(req.url) for req in requests)

(Repeat the pattern for bybit, hyperliquid, alpaca: at least 1 test per exchange.)

  • Step 2: Run the tests
uv run pytest tests/integration/test_env_routing.py -v

Expected: all PASS. If an exchange client does not use httpx but a custom SDK (e.g. pybit, alpaca-py, the hyperliquid SDK), pytest-httpx may not intercept the calls. In that case monkeypatch the SDK client to capture base_url at construction time.

Fallback example for opaque SDKs:

def test_bybit_testnet_via_constructor(app, monkeypatch):
    from cerbero_mcp.exchanges.bybit import client as bybit_client_mod

    captured = {}
    real_init = bybit_client_mod.BybitClient.__init__

    def spy_init(self, *args, **kwargs):
        captured["base_url"] = kwargs.get("base_url")
        real_init(self, *args, **kwargs)

    monkeypatch.setattr(bybit_client_mod.BybitClient, "__init__", spy_init)

    # force client re-creation
    app.state.registry._clients.clear()

    c = TestClient(app)
    c.post(
        "/mcp-bybit/tools/get_ticker",
        headers={"Authorization": "Bearer t_test_123"},
        json={"symbol": "BTCUSDT"},
    )
    assert "testnet" in captured["base_url"]
  • Step 3: Commit
git add tests/integration/test_env_routing.py
git commit -m "test(V2): integration env routing per ogni exchange"

PHASE 9 — Docker and compose

Task 9.1: New Dockerfile

Files:

  • Create: Dockerfile

  • Delete (later): docker/

  • Step 1: Write the Dockerfile in the repo root

# syntax=docker/dockerfile:1.7

FROM python:3.11-slim AS builder
RUN apt-get update && apt-get install -y --no-install-recommends \
        build-essential curl && rm -rf /var/lib/apt/lists/*
RUN pip install --no-cache-dir "uv>=0.5,<0.7"
WORKDIR /app
COPY pyproject.toml uv.lock ./
COPY src ./src
RUN uv sync --frozen --no-dev

FROM python:3.11-slim AS runtime
LABEL org.opencontainers.image.title="cerbero-mcp" \
      org.opencontainers.image.version="2.0.0" \
      org.opencontainers.image.source="https://github.com/AdrianoDev/cerbero"
WORKDIR /app
COPY --from=builder /app /app
ENV PATH="/app/.venv/bin:$PATH" \
    HOST=0.0.0.0 \
    PORT=9000 \
    PYTHONUNBUFFERED=1
RUN useradd -m -u 1000 app && chown -R app:app /app
USER app
EXPOSE 9000
HEALTHCHECK --interval=30s --timeout=5s --retries=3 --start-period=10s \
    CMD python -c "import os, urllib.request; urllib.request.urlopen(f'http://localhost:{os.environ.get(\"PORT\",\"9000\")}/health', timeout=3).close()"
CMD ["cerbero-mcp"]
- [ ] **Step 2: Local build**

```bash
docker build -t cerbero-mcp:2.0.0 .
docker images cerbero-mcp:2.0.0
```

Expected: build succeeds, image listed (~200 MB).

- [ ] **Step 3: Smoke-run the container**

```bash
docker run --rm -d --name cerbero-test --env-file .env -p 9000:9000 cerbero-mcp:2.0.0
sleep 3
curl -s http://localhost:9000/health | python -m json.tool
docker logs cerbero-test
docker stop cerbero-test
```

Expected: `{"status":"healthy",...}`.

- [ ] **Step 4: Commit**

```bash
git add Dockerfile
git commit -m "feat(V2): Dockerfile unico multi-stage in root"
```

### Task 9.2: Rewrite docker-compose.yml as a minimal file

**Files:**

- Modify: `docker-compose.yml`

- [ ] **Step 1: Overwrite**

```yaml
services:
  cerbero-mcp:
    image: cerbero-mcp:2.0.0
    build: .
    container_name: cerbero-mcp
    ports:
      - "${PORT:-9000}:${PORT:-9000}"
    env_file: .env
    restart: unless-stopped
    healthcheck:
      test:
        - "CMD"
        - "python"
        - "-c"
        - "import os, urllib.request; urllib.request.urlopen(f'http://localhost:{os.environ.get(\"PORT\",\"9000\")}/health', timeout=3).close()"
      interval: 30s
      timeout: 5s
      retries: 3
```
- [ ] **Step 2: Verify**

```bash
docker compose config -q
docker compose up -d --build
sleep 3
curl -s http://localhost:9000/health | python -m json.tool
docker compose down
```

Expected: parse OK, container healthy.

- [ ] **Step 3: Commit**

```bash
git add docker-compose.yml
git commit -m "feat(V2): docker-compose.yml minimo (1 servizio, env_file .env)"
```

### Task 9.3: End-to-end smoke test with bearer

**Files:**

- Create: `tests/smoke/run.sh`

- [ ] **Step 1: Write the smoke script**

```bash
#!/usr/bin/env bash
# V2 smoke test: start the container, check /health, /apidocs, and one tool call with the testnet bearer.
set -euo pipefail

PORT="${PORT:-9000}"
BASE="http://localhost:${PORT}"

echo "==> healthcheck"
curl -fsS "${BASE}/health" | python -m json.tool

echo "==> apidocs"
curl -fsS "${BASE}/apidocs" | grep -q "Cerbero MCP" && echo "  swagger OK"

echo "==> openapi.json"
curl -fsS "${BASE}/openapi.json" | python -c "import sys,json;d=json.load(sys.stdin);assert 'BearerAuth' in d['components']['securitySchemes'];print('  bearer scheme OK')"

if [[ -z "${TESTNET_TOKEN:-}" ]]; then
  echo "==> skipping tool call (TESTNET_TOKEN not set)"
  exit 0
fi

echo "==> tool without bearer → 401"
status=$(curl -s -o /dev/null -w "%{http_code}" -X POST "${BASE}/mcp-deribit/tools/get_ticker" \
  -H "Content-Type: application/json" -d '{"instrument_name":"BTC-PERPETUAL"}')
if [[ "$status" == "401" ]]; then echo "  401 OK"; else echo "  expected 401 got $status"; exit 1; fi

echo "==> tool with testnet bearer → testnet routing (may return 5xx if the testnet is down)"
curl -s -o /dev/null -w "  http %{http_code}\n" -X POST "${BASE}/mcp-deribit/tools/get_ticker" \
  -H "Authorization: Bearer ${TESTNET_TOKEN}" \
  -H "Content-Type: application/json" \
  -d '{"instrument_name":"BTC-PERPETUAL"}'

echo "==> SMOKE OK"
```
- [ ] **Step 2: Make it executable**

```bash
chmod +x tests/smoke/run.sh
```

- [ ] **Step 3: Run against the container**

```bash
docker compose up -d --build
sleep 3
PORT=9000 TESTNET_TOKEN=$(grep '^TESTNET_TOKEN=' .env | cut -d= -f2) bash tests/smoke/run.sh
docker compose down
```

Expected: SMOKE OK.

- [ ] **Step 4: Commit**

```bash
git add tests/smoke/run.sh
git commit -m "test(V2): smoke script con bearer testnet"
```

## PHASE 10 — Repo cleanup (V1 → V2)

At this point the new V2 structure works end-to-end. Time to remove the old one.

### Task 10.1: Remove services/, gateway/, secrets/, docker/

**Files:**

- Delete: `services/`, `gateway/`, `secrets/`, `docker/`

- [ ] **Step 1: Verify there are no residual imports from services/ or mcp_***

```bash
grep -rn "from mcp_common\|from mcp_deribit\|from mcp_bybit\|from mcp_hyperliquid\|from mcp_alpaca\|from mcp_macro\|from mcp_sentiment" src/ tests/ || echo "OK: no residual imports"
grep -rn "services/" src/ tests/ pyproject.toml || echo "OK: no path refs"
```

Expected: OK for both.

- [ ] **Step 2: Remove**

```bash
git rm -rf services/ gateway/ docker/
rm -rf secrets/  # not versioned in V2 (gitignored)
```

- [ ] **Step 3: Run the whole test suite**

```bash
uv run pytest tests/ -v
```

Expected: all PASS.

- [ ] **Step 4: Image build still OK**

```bash
docker build -t cerbero-mcp:2.0.0 .
```

Expected: build succeeds.

- [ ] **Step 5: Commit**

```bash
git add -A
git commit -m "chore(V2): rimuovi services/, gateway/, secrets/, docker/ (legacy V1)"
```

### Task 10.2: Remove the V1 compose overlays

**Files:**

- Delete: `docker-compose.prod.yml`, `docker-compose.local.yml`, `docker-compose.traefik.yml`

- [ ] **Step 1: Remove**

```bash
git rm docker-compose.prod.yml docker-compose.local.yml docker-compose.traefik.yml
```

- [ ] **Step 2: Verify docker-compose.yml is still valid**

```bash
docker compose config -q
```

Expected: no errors.

- [ ] **Step 3: Commit**

```bash
git commit -m "chore(V2): rimuovi compose overlay V1 (prod, local, traefik)"
```

### Task 10.3: Update scripts/build-push.sh for a single image

**Files:**

- Modify: `scripts/build-push.sh`

- [ ] **Step 1: Overwrite**

```bash
#!/usr/bin/env bash
# Build & push the cerbero-mcp:2.0.0 + :latest image to the Gitea registry.
set -euo pipefail

: "${GITEA_PAT:?GITEA_PAT not set}"
: "${REGISTRY:=gitea.tielogic.xyz/adriano}"

VERSION="${VERSION:-2.0.0}"
SHA="$(git rev-parse --short HEAD)"

echo "==> docker login"
echo "${GITEA_PAT}" | docker login "${REGISTRY%%/*}" -u adriano --password-stdin

echo "==> build"
docker build -t "cerbero-mcp:${VERSION}" -t cerbero-mcp:latest -t "cerbero-mcp:sha-${SHA}" .

echo "==> tag for registry"
for tag in "${VERSION}" latest "sha-${SHA}"; do
  docker tag "cerbero-mcp:${tag}" "${REGISTRY}/cerbero-mcp:${tag}"
done

echo "==> push"
for tag in "${VERSION}" latest "sha-${SHA}"; do
  docker push "${REGISTRY}/cerbero-mcp:${tag}"
done

echo "==> DONE"
```
- [ ] **Step 2: Make it executable**

```bash
chmod +x scripts/build-push.sh
```

- [ ] **Step 3: Commit**

```bash
git add scripts/build-push.sh
git commit -m "chore(V2): build-push.sh costruisce 1 sola immagine"
```

## PHASE 11 — Documentation

### Task 11.1: Rewrite README.md

**Files:**

- Modify: `README.md`

- [ ] **Step 1: Overwrite**

# Cerbero MCP — V2.0.0

Unified multi-exchange MCP server for the Cerbero suite. Distributed as a
single Docker image; testnet and mainnet are reachable at the same time
through per-request routing based on the bearer token supplied by the
client.

## Features

- **One single Docker image** (`cerbero-mcp`) hosts all exchange routers in
  a single FastAPI process
- **Four exchanges** (Deribit, Bybit, Hyperliquid, Alpaca) and **two
  read-only data providers** (Macro, Sentiment)
- **Per-request testnet/mainnet switch** via the
  `Authorization: Bearer <TOKEN>` header: the same container serves both
  environments without restarts
- **Configuration entirely in `.env`**: no separate JSON credentials file
- **Interactive OpenAPI/Swagger documentation** exposed at `/apidocs`
## Quick start (development, without Docker)

1. Copy the configuration template and fill it in:

   ```bash
   cp .env.example .env
   # edit .env with your credentials and the two tokens
   ```

2. Generate the bearer tokens:

   ```bash
   python -c 'import secrets; print("TESTNET_TOKEN=" + secrets.token_urlsafe(32))'
   python -c 'import secrets; print("MAINNET_TOKEN=" + secrets.token_urlsafe(32))'
   ```

3. Install the dependencies and start:

   ```bash
   uv sync
   uv run cerbero-mcp
   ```

4. Open the interactive docs: http://localhost:9000/apidocs

## Running with Docker

```bash
cp .env.example .env  # fill in values
docker compose up -d
```

The container exposes the port set by `PORT` in `.env` (default 9000).

## Bearer tokens and environments

| Token used                             | Upstream environment         |
|----------------------------------------|------------------------------|
| `Authorization: Bearer $TESTNET_TOKEN` | Testnet URL of each exchange |
| `Authorization: Bearer $MAINNET_TOKEN` | Mainnet URL (live)           |
| No token / unknown token               | `401 Unauthorized`           |

The purely read-only tools (`/mcp-macro/*` and `/mcp-sentiment/*`) still require a valid bearer, but which one you use (testnet or mainnet) makes no difference, because they have no testnet endpoints.
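For bot clients, switching environment amounts to choosing which token goes into the `Authorization` header. A minimal client-side sketch (the `auth_headers` helper and the `httpx` call are illustrative, not part of the server):

```python
import os


def auth_headers(environment: str) -> dict[str, str]:
    """Build request headers for 'testnet' or 'mainnet' from the .env token names."""
    key = {"testnet": "TESTNET_TOKEN", "mainnet": "MAINNET_TOKEN"}[environment]
    return {
        "Authorization": f"Bearer {os.environ[key]}",
        "Content-Type": "application/json",
    }


# Example call (base URL is deployment-specific):
# import httpx
# r = httpx.post(
#     "http://localhost:9000/mcp-deribit/tools/get_ticker",
#     headers=auth_headers("testnet"),
#     json={"instrument_name": "BTC-PERPETUAL"},
# )
```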

## Main endpoints

| Path                                 | Description                  |
|--------------------------------------|------------------------------|
| `GET /health`                        | Healthcheck (no auth)        |
| `GET /apidocs`                       | Swagger UI (no auth)         |
| `GET /openapi.json`                  | OpenAPI 3.1 schema (no auth) |
| `POST /mcp-deribit/tools/{tool}`     | Deribit exchange tools       |
| `POST /mcp-bybit/tools/{tool}`       | Bybit exchange tools         |
| `POST /mcp-hyperliquid/tools/{tool}` | Hyperliquid exchange tools   |
| `POST /mcp-alpaca/tools/{tool}`      | Alpaca exchange tools        |
| `POST /mcp-macro/tools/{tool}`       | Macro/market data tools      |
| `POST /mcp-sentiment/tools/{tool}`   | Sentiment/news tools         |

## Available tools

### Common (`cerbero_mcp.common.indicators` + `options` + `microstructure` + `stats`)

Technical (sma, rsi, macd, atr, adx), volatility (vol_cone, garch11_forecast), statistical (hurst_exponent, half_life_mean_reversion, cointegration_test), risk (rolling_sharpe, var_cvar), microstructure (orderbook_imbalance), options (oi_weighted_skew, smile_asymmetry, dealer_gamma_profile, vanna_charm_aggregate).

### Deribit

DVOL, GEX, P/C ratio, skew_25d, term_structure, iv_rank, realized_vol, technical indicators, find_by_delta, calculate_spread_payoff, get_dealer_gamma_profile, get_vanna_charm, get_oi_weighted_skew, get_smile_asymmetry, get_atm_vs_wings_vol, get_orderbook_imbalance, place_combo_order.

### Bybit

Ticker, orderbook, OHLCV, funding rate, open interest, spot/perp basis, technical indicators, place_batch_order, get_orderbook_imbalance, get_basis_term_structure.

### Hyperliquid

Account summary, positions, orderbook, historical, indicators, funding rate, spot/perp basis, place_order, set_stop_loss, set_take_profit.

### Alpaca

Account, positions, bars, snapshot, option chain, place_order, amend_order, cancel_order, close_position.

### Macro

Treasury yields, FRED indicators, equity futures, asset prices, calendar, get_yield_curve_slope, get_breakeven_inflation, get_cot_tff, get_cot_disaggregated, get_cot_extreme_positioning.

### Sentiment

News (CryptoPanic/CoinDesk), social (LunarCrush), multi-exchange funding, OI history, get_funding_arb_spread, get_liquidation_heatmap, get_cointegration_pairs.

## Deploying on a VPS with Traefik

On the VPS the public network (TLS, IP allowlist, rate limiting) is handled by a Traefik instance external to this repository. The `cerbero-mcp` container does not expose ports to the outside: it joins Traefik's docker network through labels added by an external compose override (e.g. a `docker-compose.override.yml` versioned outside this repo). The public security policy (IP allowlist for the write endpoints) is Traefik's responsibility.

Minimal example labels for Traefik:

```yaml
labels:
  - "traefik.enable=true"
  - "traefik.http.routers.cerbero.rule=Host(`cerbero-mcp.tielogic.xyz`)"
  - "traefik.http.routers.cerbero.entrypoints=websecure"
  - "traefik.http.routers.cerbero.tls.certresolver=letsencrypt"
  - "traefik.http.services.cerbero.loadbalancer.server.port=9000"
```

## Build & deploy pipeline

The image is built on the development machine:

```bash
export GITEA_PAT='<PAT with write:package scope>'
./scripts/build-push.sh
```

The script tags `:2.0.0`, `:latest` and `:sha-<short>` for pinpoint rollbacks and pushes to the Gitea registry. On the VPS, Watchtower polls `:latest` and updates the container automatically.

## Development

```bash
uv sync
uv run pytest                   # full suite
uv run pytest tests/unit -v     # unit only
uv run ruff check src/
uv run mypy src/cerbero_mcp
```

## Migrating from V1 (1.x → 2.0.0)

If you run V1 in production:

1. Back up `secrets/` (V2 will not use them, but they are the source to copy values from).
2. Generate the two new bearer tokens (see above).
3. Fill in `.env`, mapping the V1 fields to the V2 fields:
   - `secrets/deribit.json` → `DERIBIT_*`
   - `secrets/bybit.json` → `BYBIT_*`
   - `secrets/hyperliquid.json` → `HYPERLIQUID_*`
   - `secrets/alpaca.json` → `ALPACA_*`
   - `secrets/macro.json` → `FRED_API_KEY`, `FINNHUB_API_KEY`
   - `secrets/sentiment.json` → `CRYPTOPANIC_KEY`, `LUNARCRUSH_KEY`
4. Update the bot clients:
   - the API paths stay identical (`/mcp-{exchange}/tools/{tool}`)
   - replace `core.token` / `observer.token` with `TESTNET_TOKEN` or `MAINNET_TOKEN`, depending on the environment desired for each call
5. Shut down V1 (`docker compose -f <old compose> down`) and start V2 (`docker compose up -d`).
6. Verify `/health` and `/apidocs`.
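The field mapping in step 3 can be scripted. A hedged sketch (the flat `{"api_key": ..., "api_secret": ...}` shape of the V1 secrets files is an assumption; adapt the field names to the actual JSON):

```python
import json
from pathlib import Path


def secrets_to_env_lines(path: Path) -> list[str]:
    """Turn secrets/<name>.json into <NAME>_<FIELD>=<value> lines for .env."""
    prefix = path.stem.upper()  # e.g. deribit.json -> DERIBIT
    data = json.loads(path.read_text())
    return [f"{prefix}_{field.upper()}={value}" for field, value in sorted(data.items())]


# for p in sorted(Path("secrets").glob("*.json")):
#     print("\n".join(secrets_to_env_lines(p)))
```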

## License

Private.

- [ ] **Step 2: Commit**

```bash
git add README.md
git commit -m "docs(V2): README riscritto per architettura V2.0.0"
```
### Task 11.2: Delete DEPLOYMENT.md

**Files:**

- Delete: `DEPLOYMENT.md`

- [ ] **Step 1: Remove**

```bash
git rm DEPLOYMENT.md
```

- [ ] **Step 2: Verify there are no broken links**

```bash
grep -rn "DEPLOYMENT.md" . --include='*.md' --include='*.py' --include='*.yml' --include='*.yaml' || echo "OK: no refs"
```

Expected: `OK: no refs`. If there are references, point them at the README.

- [ ] **Step 3: Commit**

```bash
git commit -m "docs(V2): rimuovi DEPLOYMENT.md (contenuti integrati in README)"
```

## PHASE 12 — Final verification

### Task 12.1: Full quality gate

- [ ] **Step 1: Lint + type check + tests**

```bash
uv run ruff check src/ tests/
uv run mypy src/cerbero_mcp
uv run pytest tests/ -v
```

Expected: every command exits with code 0.

- [ ] **Step 2: Build & smoke against the container**

```bash
docker build -t cerbero-mcp:2.0.0 .
docker compose up -d
sleep 5
PORT=9000 TESTNET_TOKEN=$(grep '^TESTNET_TOKEN=' .env | cut -d= -f2) bash tests/smoke/run.sh
docker compose down
```

Expected: SMOKE OK.

- [ ] **Step 3: Verify removed files**

```bash
test ! -d services/ && echo "services/ removed"
test ! -d gateway/ && echo "gateway/ removed"
test ! -d docker/ && echo "docker/ removed"
test ! -f DEPLOYMENT.md && echo "DEPLOYMENT.md removed"
test ! -f docker-compose.prod.yml && echo "compose.prod removed"
test ! -f docker-compose.traefik.yml && echo "compose.traefik removed"
test ! -f docker-compose.local.yml && echo "compose.local removed"
```

Expected: 7 removal confirmations.

- [ ] **Step 4: Verify present files**

```bash
test -f Dockerfile && echo "Dockerfile present"
test -f docker-compose.yml && echo "compose present"
test -f .env.example && echo ".env.example present"
test -d src/cerbero_mcp && echo "src package present"
test -f src/cerbero_mcp/__main__.py && echo "main present"
```

Expected: 5 confirmations.

- [ ] **Step 5: Final tag (optional, if everything is OK)**

```bash
git tag -a v2.0.0 -m "V2.0.0: unified image, token-based env routing"
```

- [ ] **Step 6: Closing commit (if changes remain)**

```bash
git status
# if clean → branch ready
```
## Final success criteria

- `uv run cerbero-mcp` starts on a laptop without Docker
- `docker compose up -d` starts and `/health` responds within 5 s
- `/apidocs` shows Swagger UI with a tag per exchange
- `/openapi.json` contains the `BearerAuth` securityScheme
- V1 bots work by changing only the bearer (identical paths)
- Integration tests: testnet bearer → testnet URL, mainnet bearer → live URL
- The whole test suite passes
- `services/`, `gateway/`, `secrets/`, `docker/`, `DEPLOYMENT.md` and the 3 compose overlays removed
- Image build < 5 min (vs ~12 min in V1)
- Image size ~200 MB