refactor(llm): route all tiers via OpenRouter, drop Anthropic SDK

All tiers (S/A/B/C/D) now go through OpenRouter via the OpenAI SDK.
Anthropic models remain reachable via the `anthropic/...` prefix.

- pyproject: removed `anthropic>=0.39` from deps + uv.lock
- config: removed the `anthropic_api_key` field
- LLMClient: single dispatch path, one OpenAI client with the OpenRouter base_url
- defaults S/A/B → `anthropic/claude-{opus-4-7,sonnet-4-6}`
- retry exceptions: openai.* only (dropped anthropic.*)
- tests renamed and adapted: tiers S/A/B mock OpenAI with the `anthropic/` prefix
- removed the `tier_S_without_anthropic_key_raises` test (no longer relevant)
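The single-dispatch design above can be sketched roughly as follows. This is a hypothetical illustration, not the repo's actual code: the names `TIER_DEFAULTS`, `resolve_model`, and `make_client` are assumptions; only the model IDs and the OpenRouter base URL come from the diff.

```python
# Hypothetical sketch of the single-client dispatch; names are illustrative,
# not taken from the repository.
TIER_DEFAULTS = {
    "S": "anthropic/claude-opus-4-7",
    "A": "anthropic/claude-sonnet-4-6",
    "B": "anthropic/claude-sonnet-4-6",
    "C": "qwen/qwen-2.5-72b-instruct",
    "D": "meta-llama/llama-3.3-70b-instruct",
}

def resolve_model(tier: str) -> str:
    """Map a tier letter to its OpenRouter model ID (defaults from the diff)."""
    return TIER_DEFAULTS[tier]

def make_client(api_key: str, base_url: str = "https://openrouter.ai/api/v1"):
    # One OpenAI client for every tier: OpenRouter speaks the OpenAI API,
    # so Anthropic models are just model IDs with an "anthropic/" prefix
    # and no separate Anthropic SDK is needed.
    from openai import OpenAI  # imported lazily; mapping above works without the SDK
    return OpenAI(api_key=api_key, base_url=base_url)
```

With this shape, tier-specific branching collapses to a model-ID lookup, which is why the Anthropic-only code paths (separate client, separate retry exceptions) could be deleted.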

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-10 09:39:23 +02:00
parent 70b8bc3d6c
commit 4dad8be36b
8 changed files with 84 additions and 177 deletions
+6 -7
@@ -4,17 +4,16 @@ CERBERO_TESTNET_TOKEN=
 CERBERO_MAINNET_TOKEN=
 CERBERO_BOT_TAG=swarm-poc-phase1
-# LLM providers
+# LLM provider (single endpoint via OpenRouter — supports anthropic/openai/qwen/llama models)
 OPENROUTER_API_KEY=
-ANTHROPIC_API_KEY=
+OPENROUTER_BASE_URL=https://openrouter.ai/api/v1
-# LLM models (override Phase 1 defaults if needed)
-LLM_MODEL_TIER_S=claude-opus-4-7
-LLM_MODEL_TIER_A=claude-sonnet-4-6
-LLM_MODEL_TIER_B=claude-sonnet-4-6
+# Models per tier (override Phase 1 defaults if needed)
+LLM_MODEL_TIER_S=anthropic/claude-opus-4-7
+LLM_MODEL_TIER_A=anthropic/claude-sonnet-4-6
+LLM_MODEL_TIER_B=anthropic/claude-sonnet-4-6
 LLM_MODEL_TIER_C=qwen/qwen-2.5-72b-instruct
 LLM_MODEL_TIER_D=meta-llama/llama-3.3-70b-instruct
-OPENROUTER_BASE_URL=https://openrouter.ai/api/v1
 # Run config
 RUN_NAME=phase1-spike-001