4dad8be36b
All tiers (S/A/B/C/D) now go through OpenRouter via the OpenAI SDK.
Anthropic models are reachable via the `anthropic/...` prefix.
- pyproject: removed `anthropic>=0.39` from deps + uv.lock
- config: removed the `anthropic_api_key` field
- LLMClient: single dispatch path, one OpenAI client with the OpenRouter base_url
- defaults S/A/B → `anthropic/claude-{opus-4-7,sonnet-4-6}`
- retry exceptions: openai.* only (drop anthropic.*)
- tests renamed and adapted: tiers S/A/B mock OpenAI with the `anthropic/` prefix
- removed the `tier_S_without_anthropic_key_raises` test (no longer relevant)
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
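The bullets above collapse five per-tier code paths into one OpenAI client pointed at the OpenRouter base URL. A minimal sketch of that shape, assuming hypothetical names (`LLMClient`, `complete`, `TIER_MODELS`); the tier map mirrors the Phase 1 defaults in the .env below:

```python
# Phase 1 default models per tier (see the .env file below).
# Anthropic models need no dedicated SDK: they are plain "anthropic/..." model ids.
TIER_MODELS = {
    "S": "anthropic/claude-opus-4-7",
    "A": "anthropic/claude-sonnet-4-6",
    "B": "anthropic/claude-sonnet-4-6",
    "C": "qwen/qwen-2.5-72b-instruct",
    "D": "meta-llama/llama-3.3-70b-instruct",
}


class LLMClient:
    """Illustrative single-dispatch client; not the project's actual API."""

    def __init__(self, api_key: str, base_url: str = "https://openrouter.ai/api/v1"):
        # Imported lazily here so the sketch reads without the SDK installed.
        from openai import OpenAI

        # One client for every tier; no Anthropic-specific branch remains.
        self._client = OpenAI(api_key=api_key, base_url=base_url)

    def complete(self, tier: str, prompt: str) -> str:
        resp = self._client.chat.completions.create(
            model=TIER_MODELS[tier],
            messages=[{"role": "user", "content": prompt}],
        )
        return resp.choices[0].message.content
```

With this shape, the retry layer only needs to catch `openai.*` exception classes, which is why the Anthropic ones could be dropped from the retry list.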
23 lines
702 B
Bash
# Cerbero MCP (local during Phase 1)
CERBERO_BASE_URL=http://localhost:9000
CERBERO_TESTNET_TOKEN=
CERBERO_MAINNET_TOKEN=
CERBERO_BOT_TAG=swarm-poc-phase1

# LLM provider (single endpoint via OpenRouter; supports anthropic/openai/qwen/llama models)
OPENROUTER_API_KEY=
OPENROUTER_BASE_URL=https://openrouter.ai/api/v1

# Models per tier (override Phase 1 defaults if needed)
LLM_MODEL_TIER_S=anthropic/claude-opus-4-7
LLM_MODEL_TIER_A=anthropic/claude-sonnet-4-6
LLM_MODEL_TIER_B=anthropic/claude-sonnet-4-6
LLM_MODEL_TIER_C=qwen/qwen-2.5-72b-instruct
LLM_MODEL_TIER_D=meta-llama/llama-3.3-70b-instruct

# Run config
RUN_NAME=phase1-spike-001
DATA_DIR=./data
SERIES_DIR=./series
DB_PATH=./runs.db
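The commit removed the `anthropic_api_key` config field, so the settings object only needs the OpenRouter values from this file. A stdlib-only sketch of how such a config might be read, assuming a hypothetical `Settings` class (the real project's config module is not shown here); defaults mirror the .env above:

```python
import os
from dataclasses import dataclass, field


def _env(name: str, default: str = "") -> str:
    """Read an environment variable, falling back to the Phase 1 default."""
    return os.environ.get(name, default)


@dataclass
class Settings:
    # Only OpenRouter credentials remain; no anthropic_api_key field.
    openrouter_api_key: str = field(default_factory=lambda: _env("OPENROUTER_API_KEY"))
    openrouter_base_url: str = field(
        default_factory=lambda: _env("OPENROUTER_BASE_URL", "https://openrouter.ai/api/v1")
    )
    run_name: str = field(default_factory=lambda: _env("RUN_NAME", "phase1-spike-001"))
    data_dir: str = field(default_factory=lambda: _env("DATA_DIR", "./data"))
    db_path: str = field(default_factory=lambda: _env("DB_PATH", "./runs.db"))


settings = Settings()
```

Because every tier routes through the same endpoint, a missing `OPENROUTER_API_KEY` is the only credential failure mode left, which is why the old `tier_S_without_anthropic_key_raises` test no longer applies.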