diff --git a/docs/superpowers/plans/2026-04-17-rev04-master-roadmap.md b/docs/superpowers/plans/2026-04-17-rev04-master-roadmap.md
new file mode 100644
index 0000000..05df7b5
--- /dev/null
+++ b/docs/superpowers/plans/2026-04-17-rev04-master-roadmap.md
@@ -0,0 +1,377 @@
# TieMeasureFlow V1.0.7 → V1.1.0 (rev04-2026) — Master Roadmap

> **Document type:** Master plan / roadmap. Contains no executable TDD tasks — it defers to the per-phase sub-plans.
> **Source spec:** `Schema sviluppo SW TieFlow_rev04-2026.docx` + `PIANO_IMPLEMENTAZIONE.md` v1.0.
> **Deployment target:** Scenario B — a `tmflow-client` container deployed to every industrial tablet/PC, one centralized `tmflow-server` on a VPS.

**Goal:** Take TieMeasureFlow from V1.0.7 (base features: recipes, measurements, SPC) to V1.1.0 aligned with rev04: GAIA integration, per-tablet stations, shift-leader role, block-based recipe editor, strict operator workflow (retry/timer/buzzer/production start), per-tablet Docker deployment with a private registry.

## Release Strategy (2026-04-17 revision)

Updated priority from the sponsor: **deliver a working test system to the customer as soon as possible**. The Scenario B deployment (registry + Watchtower + published images) is explicitly **postponed to the end of development**.

The program is therefore split into two milestones:

| Milestone | Content | Objective |
|---|---|---|
| **M1 — Customer demo** | Phases 1-5 of the plan (data model, shift leader, block editor, operator workflow, GAIA **import-only with real data**) + a simple "demo" deployment (single docker-compose on a demo VPS, no registry) | A system the customer can test end-to-end with their own data, to gather feedback on the operator workflow before integrating live GAIA and going to production. |
| **M2 — Production** | Phase 6 (industrial Deploy B) + Phase 7 (hardening, E2E, pilot rollout) + any adjustments following M1 feedback | Rollout on the customer's real tablets/PCs. |

Operational consequences:
- **Real data, not mocks:** M1 starts with the customer's actual content (articles, recipes, stations). No `MockGaiaClient`: in Phase 5 for M1 we build only the **one-shot import** channel (an admin endpoint that accepts a GAIA export in CSV/JSON, plus a Maker UI for manual editing). Live GAIA integration (polling, real-time production commands) slips to M2. The customer will provide a data snapshot at the start of M1.
- The abstract `GaiaClient` interface is still designed in M1 (to avoid accruing debt), but in M1 it is implemented by an `ImportOnlyGaiaClient` that reads from file/DB without contacting GAIA. In M1 the production commands (`start/pause/resume/end`) write to local `production_events` — in M2 we add the implementation that forwards them to GAIA.
- The M1 deployment uses a single `docker-compose.yml` on a test VPS with ports exposed on a demo domain (HTTPS with Let's Encrypt via the Traefik already present in V1.0.7). No private registry, no Watchtower; local build and bring-up with `docker compose up --build`.
- During M1 the customer tests from a browser directly against the demo VPS; the "tablets" at that stage are simply browsers pointed at the demo URL.
- M1 feedback may reshape the M2 phases (e.g. discovering that an extra role is needed, that the buzzer must change, or that the GAIA protocol needs revising).

**Architecture:**
- Centralized FastAPI server on the VPS — the only node that talks to GAIA, owns the MySQL DB, generates reports, computes SPC.
- Flask client packaged as `tmflow-client:` published on the VPS's private registry; each industrial tablet/PC pulls its own copy and identifies itself via `STATION_ID` in its `.env`.
- Shift leader as a separate role with a short-lived-token override flow.
- Recipe extended to typed blocks: document-style preparation + measurement bound to tolerances.
- Volatile recipe parameters (timer, max_retries) isolated from the immutable version.
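
The short-lived override token mentioned in the architecture above (Phase 2 specifies "JWT-like, signed, 60s TTL") can be sketched with the standard library alone. This is an illustrative sketch, not the planned implementation: the payload fields, the `demo-secret` key, and the helper names are invented for the example.

```python
import base64
import hashlib
import hmac
import json
import time

SECRET = b"demo-secret"  # illustrative; the real key would live in server config


def mint_override_token(override_type: str, authorized_by: int, ttl_s: int = 60) -> str:
    """Sign a short-lived override payload with HMAC-SHA256 (JWT-like)."""
    payload = json.dumps(
        {"type": override_type, "by": authorized_by, "exp": time.time() + ttl_s}
    ).encode()
    sig = hmac.new(SECRET, payload, hashlib.sha256).digest()
    # '.' never occurs in the base64 alphabet, so it is a safe separator
    return (
        base64.urlsafe_b64encode(payload).decode()
        + "."
        + base64.urlsafe_b64encode(sig).decode()
    )


def verify_override_token(token: str, expected_type: str) -> bool:
    """Accept the token only if signature, TTL and override type all match."""
    payload_b64, _, sig_b64 = token.partition(".")
    payload = base64.urlsafe_b64decode(payload_b64)
    sig = base64.urlsafe_b64decode(sig_b64)
    if not hmac.compare_digest(sig, hmac.new(SECRET, payload, hashlib.sha256).digest()):
        return False
    data = json.loads(payload)
    return data["type"] == expected_type and time.time() < data["exp"]
```

Marking the override as consumed (`consumed_at` in `supervisor_overrides`) stays a separate server-side step: single-use enforcement belongs in the DB, not inside the token.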

**Tech Stack (unchanged from V1.0.7):**
- Server: FastAPI, SQLAlchemy 2.0 async, MySQL 8, Alembic, Pydantic v2, WeasyPrint, Plotly+Kaleido, APScheduler (new, for GAIA sync).
- Client: Flask, Jinja2, Alpine.js, TailwindCSS, Fabric.js, Plotly.js, html5-qrcode, Web Serial, Flask-Babel.
- New integrations: GAIA adapter (protocol TBD with the customer).
- Deploy: private Docker registry (`registry:2`) on the VPS + Watchtower on the tablets for auto-update.

---

## 0. Preconditions and Open Decisions (RESOLVE BEFORE STARTING)

These choices are **prerequisites**: starting without resolving them guarantees rework.

| # | Decision | Options | Decided by | Status |
|---|---|---|---|---|
| D-0.1 | GAIA integration protocol | REST, shared DB, OPC-UA, file exchange | Customer + us | **Open** |
| D-0.2 | GAIA credentials and network | VPN, firewall, IP whitelist | Customer IT | **Open** |
| D-0.3 | Target "tablet" hardware | Consumer Windows tablets, industrial Linux touch PC, Android tablets | Customer | **Open** (critical for B) |
| D-0.4 | Buzzer/warning light | HTML5 browser audio, local USB hardware, both | Customer | **Postponed to M2** (not blocking M1) |
| D-0.5 | Runtime-editable parameters vs immutable version | A: every change creates a version; B: separate volatile from structural | Us (B recommended) | **Open** |
| D-0.6 | Shift-leader authentication during override | Overlaid modal login, quick PIN, RFID badge | Customer | **Open** |
| D-0.7 | Auto-logout timeout | Default value + configurability | Customer | **Resolved:** configurable via `system_settings` (key `auto_logout_minutes`, default 15), with an optional per-user override in the `users.auto_logout_minutes` column (NULL = use the global value). |
| D-0.8 | Shift-leader role naming | `Supervisor`, `ShiftLeader`, `Capoturno` | Us | Proposal: `Supervisor` (already English like the others) |
| D-0.9 | Docker image version tag | `latest`, SemVer, git sha | Us | Proposal: SemVer + `latest` |
| D-0.10 | Registry exposed on the Internet or VPN-only | Internet+auth, VPN only | Customer + us | Proposal: customer VPN only |

**Action required:** hold an alignment meeting with the customer on D-0.1, D-0.2, D-0.3, D-0.4 and D-0.6 before Phase 1.

---

## Phases (recommended order)

Shift leader (Phase 2) is a precondition for the operator workflow (Phase 4). GAIA (Phase 5) can start in parallel with Phases 2-3 once Phase 1 is done. Phase 6 (industrial Deploy B) starts **only after customer feedback on M1**.

```
 ┌─────────────────────────── M1 — Customer demo ───────────────────────────────┐
 │                                                                              │
 │  ┌───────────────────────────────────────────────────┐                       │
 │  │ Phase 1 — Stations + per-tablet identity          │ 1.5-2 weeks           │
 │  └───────────────┬───────────────────────────────────┘                       │
 │                  │                                                           │
 │      ┌───────────┼─────────────────────┐                                     │
 │      │           │                     │                                     │
 │      ▼           ▼                     ▼                                     │
 │   Phase 2     Phase 3        Phase 5 (import-only)                           │
 │   Supervisor  Block editor   ImportOnlyGaiaClient + import UI                │
 │      │           │                     │  (in parallel, ~2-3 wks)            │
 │      └─────┬─────┘                     │                                     │
 │            ▼                           │                                     │
 │   Phase 4 — Operator workflow          │  2-3 weeks                          │
 │   (retry/timer/autologout/start)       │                                     │
 │            │                           │                                     │
 │            └──────────┬────────────────┘                                     │
 │                       ▼                                                      │
 │          Demo deployment on test VPS      ~2 days                            │
 │          (single-file docker-compose,                                        │
 │           Traefik+LE, no registry,                                           │
 │           real customer data imported)                                       │
 │                       │                                                      │
 └───────────────────────┼──────────────────────────────────────────────────────┘
                         │
          ═══ CHECKPOINT: customer feedback on M1 ═══
                         │
 ┌─────────────────────── M2 — Production ──────────────────────────────────────┐
 │                       │                                                      │
 │                       ▼                                                      │
 │   Post-feedback adjustments             variable                             │
 │   (workflow, UX, real-GAIA changes)                                          │
 │                       │                                                      │
 │                       ▼                                                      │
 │   Phase 5 live — GAIA integration         1-2 weeks                          │
 │   (final protocol, credentials,                                              │
 │    customer network, feature flag)                                           │
 │                       ▼                                                      │
 │   Phase 6 — Industrial Deploy B           1 week                             │
 │   (registry + watchtower + compose                                           │
 │    split server/client, STATION_ID,                                          │
 │    CI release)                                                               │
 │                       ▼                                                      │
 │   Phase 7 — Hardening & rollout           1-2 weeks                          │
 │   (security review, pilot-site e2e,                                          │
 │    docs updated, i18n delta)                                                 │
 │                                                                              │
 └──────────────────────────────────────────────────────────────────────────────┘
```

**Estimate to M1 (testable demo system):** ~6-7 weeks full-time; ~4-5 weeks with 2 phases in parallel.
**Total estimate to M2:** ~9-11 weeks full-time (includes a feedback cycle estimated at 1-2 weeks).

---

## Phase 1 — Stations and Per-tablet Identity

**Objective:** model the concept of a "control station", associate recipes with stations, and make each client load only the recipes of its own station, identified by `STATION_ID`.

**Sub-plan:** `docs/superpowers/plans/2026-04-17-rev04-phase1-stations.md`

**Deliverables:**
- `stations` and `station_recipe_assignments` tables + Alembic migration.
- Endpoints `GET /api/stations`, `POST /api/stations`, `PUT /api/stations/{id}`, `GET /api/stations/{id}/recipes`.
- Auth scheme: the client's API key is tied to a station → every request carries an implicit station identity.
- `STATION_ID` environment variable in the Flask client, read at startup and used to filter `/api/recipes` in `select_recipe`.
- Admin UI to manage stations + assignments.
- Test: a client with STATION_ID=A sees only the recipes assigned to A.

**Main files affected:**
- New: `server/models/station.py`, `server/routers/stations.py`, `server/schemas/station.py`, `server/services/station_service.py`, `client/templates/admin/stations.html`, `server/migrations/versions/_add_stations.py`.
- Modified: `server/main.py` (registers the router), `server/models/__init__.py`, `server/routers/recipes.py` (filter by station), `client/config.py` (`STATION_ID`), `client/blueprints/measure.py` (passes STATION_ID to queries), `client/blueprints/admin.py` (station CRUD), `client/blueprints/auth.py` (uses the station-specific API key).

**Risks:** breaking API backward compatibility if we change existing endpoint signatures → mitigate with an optional `station_id` parameter defaulting to None = backward compatible.

---

## Phase 2 — Shift-leader Role (Supervisor) and Override

**Objective:** introduce the `Supervisor` role, short-lived-token override endpoints, and an overlaid-login UI for one-off authorizations.

**Sub-plan:** `docs/superpowers/plans/2026-04-17-rev04-phase2-supervisor.md`

**Deliverables:**
- Role enum extended with `Supervisor` + `require_supervisor` dependency.
- `supervisor_overrides` table (id, requested_by, authorized_by, override_type, target_ref, created_at, consumed_at, reason).
- `POST /api/supervisor/override` endpoint that validates supervisor credentials and returns a JWT-like `override_token` (signed, 60s TTL).
- `POST /api/supervisor/override/verify` endpoint used by other routers before executing protected actions.
- Reusable Alpine.js `supervisor-modal` component: opens the modal, logs in, invokes a callback with the token.
- Audit trail: every consumed override logs `consumed_at` + context.

**Planned override types:**
- `FORCE_OUT_OF_TOLERANCE` — the operator wants to confirm an out-of-tolerance measurement.
- `RESET_RETRY_COUNTER` — unblock after N failed attempts.
- `PAUSE_PRODUCTION` — line stop (will send a GAIA command in Phase 5).
- `END_PRODUCTION` — end of production (closes GAIA timers in Phase 5).
- `EDIT_RUNTIME_PARAMS` — edit the running recipe's timer/max_retries.

**Main files affected:**
- New: `server/models/supervisor_override.py`, `server/routers/supervisor.py`, `server/services/supervisor_service.py`, `server/schemas/supervisor.py`, `client/templates/components/supervisor_modal.html`, `client/static/js/supervisor-modal.js`.
- Modified: `server/models/user.py` (role comment), `server/middleware/api_key.py` (new `require_supervisor` dep), `client/blueprints/measure.py` (consumes the token).

---

## Phase 3 — Block-based Recipe Editor (Preparation + Measurement)

**Objective:** extend the `RecipeTask` concept to typed blocks: `preparation` (document-style: materials, packaging, temperature) and `measurement` (with subtasks + tolerances + interval timer).

**Sub-plan:** `docs/superpowers/plans/2026-04-17-rev04-phase3-block-editor.md`

**Deliverables:**
- Migration: adds `block_type ENUM('preparation', 'measurement')`, `measurement_interval_seconds INT NULL`, `max_retries INT NOT NULL DEFAULT 3` and `preparation_content LONGTEXT NULL` to `recipe_tasks`.
- Backfill: all existing tasks → `block_type='measurement'`.
- Editor UI: block-type toggle in the task editor; preparation blocks use a rich-text editor (TipTap light, or a textarea with markdown + preview) and accept an optional PDF/image upload.
- MeasurementTec UI: preparation blocks rendered as a "document card" with an "I have read this, proceed" button; measurement blocks unchanged.
- Volatile parameters: the new `measurement_interval_seconds` and `max_retries` columns are editable without creating a new version (Decision D-0.5, option B).

**Main files affected:**
- Alembic migration.
- Modified: `server/models/task.py`, `server/schemas/task.py`, `server/services/recipe_service.py` (copy-on-write also copies the new fields), `client/templates/maker/task_editor.html`, `client/static/js/annotation-editor.js` (skipped for prep blocks), `client/templates/measure/task_execute.html` (branch per block type), new `client/templates/measure/preparation_view.html`.

---

## Phase 4 — Operator Workflow: Retry, Timer, Autologout, Production Start

> Note: audible feedback (measurement-start buzzer + steady light) is **postponed to M2**. M1 uses visual notifications only.

**Objective:** implement the strict rev04 rules for the operator flow.

**Sub-plan:** `docs/superpowers/plans/2026-04-17-rev04-phase4-workflow.md`

**Deliverables:**

| Feature | Implementation |
|---|---|
| Max N attempts, then Supervisor | Counter in the Flask session per `(measurement_session_id, subtask_id)`. When exceeded: Supervisor modal (Phase 2) for the reset. |
| Auto-advance only on pass | JS logic: if `pass_fail != 'pass'`, the "Next" button stays disabled until the operator requests Supervisor confirmation (FORCE_OUT_OF_TOLERANCE override). |
| Logout blocked during activity | `measurement_session.is_active` flag. "Logout" link in the navbar hidden/disabled while active. `POST /api/auth/logout` rejects with 409 if a session is active and no override is given. |
| Auto-logout on inactivity | Alpine `idle-timer` store with `pointermove/keydown` event listeners. On expiry (default 15m, per-user or global config) → redirect to `/logout`. |
| ~~Measurement-start buzzer~~ | **Postponed to M2.** No audible feedback in M1: at timer tick 0 only a visual notification is shown (toast + subtask highlight). |
| ~~Steady light/buzzer~~ | **Postponed to M2.** |
| Interval timer between measurements | JS countdown reading `measurement_interval_seconds`. At tick 0: visual notification + show the subtask + unlock caliper input. |
| "Start production" after first measurement | UI button visible only after the first confirmed measurement. Calls `POST /api/gaia/production/start` (Phase 5). |
| "Line stop" / "End production" | Two buttons protected by Supervisor override. GAIA calls (Phase 5). |

**Main files affected:**
- Modified: `client/blueprints/measure.py`, `client/templates/measure/task_execute.html`, `client/static/js/numpad.js` (extended with the retry counter), new `client/static/js/idle-timer.js`, new `client/static/js/production-timer.js`, `client/static/js/caliper.js` (explicit-confirmation hook).
- Server: adds a `measurement_sessions` table (id, user_id, recipe_version_id, station_id, started_at, ended_at, is_active) and start/end endpoints.

**Dependency:** Phase 2 (Supervisor) and Phase 3 (blocks with timers) must be complete.

---

## Phase 5 — GAIA Adapter

**Objective:** integrate the GAIA MES for recipe-station sync and production commands.

**Sub-plan:** `docs/superpowers/plans/2026-04-17-rev04-phase5-gaia.md`

**Scope per milestone:**
- **M1 (real data, no live GAIA):**
  - Abstract `GaiaClient` interface designed in full (all planned methods).
  - `ImportOnlyGaiaClient` implementation: `fetch_station_articles` reads from a DB populated via one-shot import; the `start/pause/resume/end` commands write to a new local `production_events` table (source of truth for the demo, later replicated to GAIA in M2).
  - Admin endpoint `POST /api/admin/gaia-import` accepting a file (CSV or JSON) exported by the customer, populating `station_recipe_assignments` + articles. Tolerant parser with a row-by-row error report.
  - Admin UI `/admin/gaia-import` to upload the file, preview the data, and confirm the import.
  - Backward-compatible export: `GET /api/admin/production-events/export` returns the accumulated `production_events` as JSON/CSV (in M1 the customer can load it into GAIA manually if needed).
  - No network traffic towards GAIA; no GAIA credentials required for M1.
- **M2:** concrete `LiveGaiaClient` implementation of the final protocol (REST/DB/other) based on D-0.1, credentials, customer network, circuit breaker, monitoring. The `production_events` accumulated during M1 become the queue that `LiveGaiaClient` drains towards GAIA at first startup.

**Shared contract (designed in M1, used by both implementations):**

```python
# server/services/gaia_client.py
class GaiaClient(Protocol):
    async def fetch_station_articles(self, station_code: str) -> list[StationArticle]: ...
    async def start_production(self, article_code: str, station_code: str, operator_id: str) -> None: ...
    async def pause_production(self, article_code: str, station_code: str, reason: str) -> None: ...
    async def resume_production(self, article_code: str, station_code: str) -> None: ...
    async def end_production(self, article_code: str, station_code: str, stats: ProductionStats) -> None: ...
```

**M1 deliverables — `ImportOnlyGaiaClient`:**
- Articles/recipes read from the internal DB populated via one-shot import.
- Production commands persisted in the `production_events` table (no external network).
- `POST /api/admin/gaia-import` endpoint (CSV/JSON GAIA export upload) + `/admin/gaia-import` UI with preview and confirmation.
- `GET /api/admin/production-events/export` endpoint to export the accumulated events (JSON/CSV) for manual upload to GAIA if needed.
- `POST /api/gaia/production/{start,pause,resume,end}` endpoints used by the Flask client.
- `.env` config: `GAIA_MODE=import_only` (default in M1).

**M2 deliverables — `LiveGaiaClient`:**
- Concrete implementation of the final protocol (REST/DB/other) based on D-0.1.
- APScheduler job every 60s: for each active station, calls `fetch_station_articles` and updates `station_recipe_assignments`.
- Initial drain of the accumulated `production_events` towards GAIA.
- `.env` config: `GAIA_MODE=live`, `GAIA_BASE_URL`, `GAIA_AUTH_TOKEN`, `GAIA_POLL_INTERVAL_SECONDS`.
- Circuit breaker: if GAIA is down, do not block internal operations; queue on `production_events` and degrade to a warning in the UI.

**Main files affected:**
- New in M1: `server/services/gaia_client.py` (Protocol + `ImportOnlyGaiaClient`), `server/services/gaia_import.py`, `server/routers/gaia.py` (production events), `server/routers/admin_gaia.py` (import), `server/schemas/gaia.py`, `server/models/production_event.py`, `client/blueprints/admin.py` (adds the import route), `client/templates/admin/gaia_import.html`, `server/tests/test_gaia_import_only.py`.
- Added in M2: `server/services/gaia_live_client.py`, `server/services/gaia_sync.py` (scheduler), `server/tests/test_gaia_live.py`.
- Modified: `server/main.py` (lifespan starts APScheduler only in live mode), `server/config.py` (new variables).

---

## Phase M1-Demo — Customer Demo Deployment (between Phase 4 and the feedback checkpoint)

**Objective:** after Phases 1-4 and import-only Phase 5, put a demo online, reachable by the customer over HTTPS on a test domain, **with the customer's real data** (articles, recipes, stations).

**Deliverables:**
- `docker-compose.demo.yml`: extension of the existing prod `docker-compose.yml` with Traefik + Let's Encrypt pointed at the `demo.tielogic.` domain (or a dedicated subdomain), an empty MySQL ready for the import, server and client built locally (no registry).
- Minimal seed: only sample users for each role (MeasurementTec, Maker, Metrologist, Supervisor, Admin) + one "default" station. Articles and recipes are not seeded: they are imported by the customer (or by us on their behalf) via `/admin/gaia-import` with the real GAIA export file.
- `/admin/demo-reset` page (protected by SETUP_PASSWORD) that clears measurements + production events while keeping users, articles and recipes (so the customer can re-test repeatedly without losing the imported data).
- `scripts/deploy-demo.sh` script that builds, pushes over SSH to the demo VPS, and restarts with `docker compose up -d --build`.
- Demo credentials handed to the customer together with a test guide (guided scenarios: "try it as operator", "try it as shift leader", "try it as maker", etc.).

**Not included (postponed to M2/Phase 6):** private registry, Watchtower, split server/client compose, CI release automation, deployment on customer hardware, real-time `LiveGaiaClient`.

---

## Phase 6 — Scenario B Deployment (Per-tablet Docker + Registry + Watchtower) [M2]

**Objective:** package the client as an image published on the VPS's private registry; each tablet pulls it and auto-updates via Watchtower. **Starts only after the M1 feedback checkpoint.**

**Sub-plan:** `docs/superpowers/plans/2026-04-17-rev04-phase6-deploy-b.md`

**Deliverables:**
- `docker-compose.server.yml` (VPS): `mysql` + `server` + `traefik` + `registry` + `watchtower-server`.
- `docker-compose.client.yml` (per-tablet): `client` + `watchtower-client`, `image: vps.tielogic.local:5000/tielogic/tmflow-client:latest`.
- `.env.client.example` with only the variables the client needs: `STATION_ID`, `STATION_API_KEY`, `API_SERVER_URL`, `CLIENT_SECRET_KEY`, `LANG_DEFAULT`, etc.
- Optimized `Dockerfile.client`: multi-stage, Tailwind build + Babel compile + gunicorn.
- GitHub Actions `.github/workflows/release.yml`: on a `v*.*.*` tag, builds and pushes `tmflow-server` and `tmflow-client` to the registry.
- `scripts/provision-tablet.sh` provisioning script: installs docker, creates the `.env`, generates the STATION_ID, pulls and starts.
- `docs/DEPLOYMENT_B.md` document with the manual procedure for the customer's IT operator.
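
As a sketch of what the provisioning step has to produce, the per-tablet `.env` can be rendered like this. The variable names follow `.env.client.example` above; the function name, the generated-secret policy and the `it` default are assumptions for the example:

```python
import secrets
from pathlib import Path


def write_client_env(path: Path, station_id: str, api_server_url: str) -> str:
    """Render a minimal per-tablet .env; secrets are generated locally so
    every tablet gets its own STATION_API_KEY and Flask secret key."""
    content = "\n".join([
        f"STATION_ID={station_id}",
        f"STATION_API_KEY={secrets.token_urlsafe(32)}",
        f"API_SERVER_URL={api_server_url}",
        f"CLIENT_SECRET_KEY={secrets.token_urlsafe(32)}",
        "LANG_DEFAULT=it",
    ]) + "\n"
    path.write_text(content)
    return content
```

Generating the secrets on the tablet (rather than copying a shared key) keeps a compromised device from impersonating other stations.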

**Technical choices:**
- Self-hosted registry on the VPS with basic auth (htpasswd) on `registry:2`, exposed only over the customer VPN (D-0.10).
- Watchtower with `WATCHTOWER_POLL_INTERVAL=300`, `WATCHTOWER_ROLLING_RESTART=true`, label-filtered (`com.centurylinklabs.watchtower.enable=true`) to avoid unwanted updates.
- Tag strategy: each release pushes `tmflow-client:1.1.0` + `tmflow-client:latest`. Tablets track `latest`; rollback = retag `latest` to the previous version.

**Caveat D-0.3:** if the target tablets are consumer Windows devices, B is fragile (Docker Desktop + licensing). Document an industrial Linux mini-PC as a prerequisite, or fall back to C (which is essentially B + LAN). This blocker must be cleared first.

---

## Phase 7 — Hardening, E2E, Rollout [M2]

**Objective:** make the system production-robust and release it to a pilot site before the general rollout.

**Sub-plan:** `docs/superpowers/plans/2026-04-17-rev04-phase7-hardening.md`

**Deliverables:**
- Full security review (`oh-my-claudecode:security-reviewer`): new GAIA endpoints, Supervisor override, exposed registry.
- Manual E2E tests + Playwright/Selenium scripts covering: full operator flow + shift-leader override + GAIA sync + production start/end.
- Rollback plan: procedure to return to V1.0.7 if needed (pre-migration DB backup, previous image tag).
- IT/EN i18n string updates for all new messages (shift leader, buzzer, timers, GAIA errors).
- Updates to `CLAUDE.md` + `README.md` + `API.md` + `USER_GUIDE.md`.
- Pilot site: a single workstation for one week before rolling out to the others.
- Monitoring: centralized logging + alerts on persistent GAIA errors, excessive Supervisor overrides per station, out-of-tolerance measurements per recipe.

---

## Cross-cutting Constraints (apply in every phase)

1. **Data backward compatibility:** every migration must have a tested `downgrade`. Never DROP columns holding data without a deprecation step.
2. **Feature flags:** new integrations (GAIA, Supervisor) behind config flags (`GAIA_ENABLED`, `SUPERVISOR_ENABLED`) → gradual rollout and fast rollback.
3. **Tests first:** follow TDD in the sub-plans. Every endpoint has at least one happy path + one negative path + one permission test.
4. **Response envelope:** start aligning new endpoints to the `{success, data, error}` standard of the 2026-04-01 spec, even though V1.0.7 does not use it. Minimizes future rework.
5. **Audit log:** every critical operation (override, GAIA command, station assignment) writes to `access_logs`.
6. **i18n:** every new string goes through `{{ _('...') }}` or the client-side equivalent. No hardcoded Italian text in templates.
7. **Granular commits:** one commit per TDD task (red test → green → commit). Use the git-master agent for the commit format.

---

## Architectural Choices Confirmed / Deviating from the 2026-04-01 Spec

| Choice | 2026-04-01 spec | V1.1.0 | Rationale |
|---|---|---|---|
| Folder structure | `src/backend/` + `src/frontend/` | `server/` + `client/` | V1.0.7 has already cemented this structure. Rework > cost-benefit. |
| Package manager | `uv` | `requirements.txt` | Migration to uv postponed to V1.2.0 (dedicated task). |
| Frontend | Gradio or React | Server-side Flask | Customer constraint: kiosk tablets, server-side i18n, no SPA. Documented deviation. |
| Messaging | NATS mandatory | None | Two-process monolith; NATS postponed to v2.x (when camera/ML modules appear). |
| API envelope `{success,data,error}` | Mandatory | Incremental adoption on new rev04 endpoints | Avoids a total rework while aligning new code. |
| Auth header `X-Api-Key` | Mandatory | ✓ Already present (`X-API-Key`) | Aligned (case-insensitive in HTTP). |
| Private registry + Watchtower | Recommended | ✓ Adopted in Phase 6 | Required for Scenario B. |

---

## Risks and Mitigations

| Risk | Probability | Impact | Mitigation |
|---|---|---|---|
| Live GAIA protocol cannot be settled before M2 | Medium | Medium | M1 does not depend on the live protocol: it uses `ImportOnlyGaiaClient` with imported real data. The final protocol is settled during M1, ready for live Phase 5 in M2. |
| GAIA export supplied by the customer in an undocumented format | Medium | Medium | Tolerant `gaia_import.py` parser + UI preview + repeatable uploads; budget 1-2 days of initial parser fitting to the specific format supplied. |
| Consumer Windows tablets make B impractical | Medium | High | Document the hardware prerequisite; plan fallback to C (edge NUC per site). |
| Customer changes their mind about the shift-leader role | Low | Medium | Generic override abstraction: the role can be renamed or split without touching the flow. |
| Watchtower performance with N tablets | Low | Low | `poll_interval=300s`, staggered updates (`rolling_restart`), limited but acceptable registry bandwidth. |
| i18n delta piles up and reaches production untranslated | Medium | Medium | CI lint that fails when `_('...')` strings are missing from `messages.po`. |
| Alembic migration breaks the production DB | Low | Critical | Mandatory pre-deploy backup, staging with a production dump, tested downgrade. |

---

## Next Steps (M1 focus)

1. **Closed decisions:** D-0.4 (buzzer → M2), D-0.7 (configurable auto-logout), choice (b) "real data without live GAIA" in M1.
2. **Decisions still open for M1:** D-0.5 (volatile parameters separated from the version — B recommended) and D-0.6 (shift-leader UX: overlaid modal login by default). To be confirmed with the customer on the fly, or defaulted by us.
3. **Decisions deferrable to M2:** D-0.1 (live GAIA protocol), D-0.2 (GAIA network/credentials), D-0.3 (target tablet hardware), D-0.10 (registry exposed or VPN-only). To be closed during M1; they do not block the demo.
4. **Request to the customer to unblock import-only Phase 5:** obtain a **sample GAIA export** (CSV/JSON) of articles + recipes for one or two stations, so we can adapt the parser.
5. Detail the `phase1-stations` sub-plan into executable TDD (the first phase, unblocking all the other M1 phases).
6. Execute the M1 phases (1 → 2/3/5-import in parallel → 4 → demo deployment) via `superpowers:subagent-driven-development` or `superpowers:executing-plans`, with review + verifier between phases.
7. Deliver the demo to the customer → collect feedback → open M2.
8. In M2: close the remaining decisions, build `LiveGaiaClient` + the buzzer + Phase 6 (Deploy B) + Phase 7 (hardening/rollout).
diff --git a/docs/superpowers/plans/2026-04-17-rev04-phase1-stations.md b/docs/superpowers/plans/2026-04-17-rev04-phase1-stations.md
new file mode 100644
index 0000000..8155c36
--- /dev/null
+++ b/docs/superpowers/plans/2026-04-17-rev04-phase1-stations.md
@@ -0,0 +1,2195 @@
# Phase 1 — Stations and Per-tablet Identity (Implementation Plan)

> **For agentic workers:** REQUIRED SUB-SKILL: Use superpowers:subagent-driven-development (recommended) or superpowers:executing-plans to implement this plan task-by-task. Steps use checkbox (`- [ ]`) syntax for tracking.

**Goal:** Introduce the "control station" concept as a first-class domain entity: every Flask client container starts with a `STATION_CODE` in its `.env`, the server knows the stations and the station-recipe assignments, and each client sees only its own station's recipes.

**Architecture:**
- Two new SQL tables: `stations` (registry) and `station_recipe_assignments` (many-to-many station↔recipe).
- Admin CRUD on `/api/stations` (admin only).
- New `GET /api/stations/by-code/{code}/recipes` endpoint returning only the recipes assigned to the requested station; authenticated with the logged-in operator's API key (no station-specific auth needed in M1).
- The Flask client reads `STATION_CODE` from the `.env` and uses it when calling the server in `measure` (select_recipe). If absent → configuration-error page (fail fast).
- Admin UI to create stations and manage recipe→station assignments.
- Seed: a default `ST-DEFAULT` station with all active recipes assigned, so the existing system keeps working after the migration.

**Tech Stack:** SQLAlchemy 2.0 async, Alembic, Pydantic v2, FastAPI, Flask, Jinja2+Tailwind+Alpine, pytest-asyncio, httpx.

**Conventions unchanged from the existing code:**
- Alembic migrations in `server/migrations/versions/`, next ID `002_add_stations`.
- Models use `Mapped[...]` + `mapped_column()`, `Base` from `database`.
- Pydantic schemas with `ConfigDict(from_attributes=True)` in Responses.
- Routers: `APIRouter(prefix="/api/...", tags=[...])`, `Depends(get_db)`, `Depends(require_admin_user)`, direct `response_model` (no envelope).
- Services: async business logic receiving an `AsyncSession`, raising `HTTPException` on conflicts.
- Tests: `client`, `db_session`, user fixtures (`admin_user`, `maker_user`, `measurement_tec_user`), `auth_headers(user)` helper, `create_test_recipe(session, user_id)` helper.
- Client admin blueprint: Flask routes proxying JSON to the server via `api_client.get/post/put/delete()`, error normalization `{"error": True, "status_code": ..., "detail": "..."}`.
- i18n: `{{ _('...') }}` in templates.
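
The fail-fast convention above can be sketched as follows. The helper name and default base URL are illustrative, while the endpoint path is the one this plan defines:

```python
import os


def station_recipes_url(api_base: str = "http://server:8000") -> str:
    """Resolve the station-scoped recipes endpoint from STATION_CODE,
    failing fast at startup when the tablet is misconfigured."""
    code = os.environ.get("STATION_CODE", "").strip()
    if not code:
        raise RuntimeError("STATION_CODE missing from .env: refusing to start")
    return f"{api_base}/api/stations/by-code/{code}/recipes"
```

In the real client this check would run once at Flask startup, so a misconfigured container fails immediately instead of silently showing every recipe.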
+ +--- + +## Task 1: Alembic Migration 002 — Create `stations` and `station_recipe_assignments` Tables + +**Files:** +- Create: `server/migrations/versions/002_add_stations.py` + +- [ ] **Step 1: Write migration file** + +```python +# server/migrations/versions/002_add_stations.py +"""add stations and station_recipe_assignments tables + +Revision ID: 002_add_stations +Revises: 001_image_path +Create Date: 2026-04-17 + +""" +from typing import Sequence, Union + +from alembic import op +import sqlalchemy as sa + +revision: str = '002_add_stations' +down_revision: Union[str, None] = '001_image_path' +branch_labels: Union[str, Sequence[str], None] = None +depends_on: Union[str, Sequence[str], None] = None + + +def upgrade() -> None: + op.create_table( + 'stations', + sa.Column('id', sa.Integer, primary_key=True, autoincrement=True), + sa.Column('code', sa.String(100), nullable=False), + sa.Column('name', sa.String(255), nullable=False), + sa.Column('location', sa.String(255), nullable=True), + sa.Column('notes', sa.Text, nullable=True), + sa.Column('active', sa.Boolean, nullable=False, server_default=sa.true()), + sa.Column('created_by', sa.Integer, sa.ForeignKey('users.id'), nullable=False), + sa.Column('created_at', sa.DateTime, nullable=False, server_default=sa.func.now()), + sa.UniqueConstraint('code', name='uq_stations_code'), + sa.Index('ix_stations_code', 'code'), + sa.Index('ix_stations_active', 'active'), + mysql_engine='InnoDB', + mysql_charset='utf8mb4', + ) + + op.create_table( + 'station_recipe_assignments', + sa.Column('id', sa.Integer, primary_key=True, autoincrement=True), + sa.Column('station_id', sa.Integer, sa.ForeignKey('stations.id', ondelete='CASCADE'), nullable=False), + sa.Column('recipe_id', sa.Integer, sa.ForeignKey('recipes.id', ondelete='CASCADE'), nullable=False), + sa.Column('assigned_by', sa.Integer, sa.ForeignKey('users.id'), nullable=False), + sa.Column('assigned_at', sa.DateTime, nullable=False, server_default=sa.func.now()), + 
sa.UniqueConstraint('station_id', 'recipe_id', name='uq_station_recipe'), + sa.Index('ix_sra_station', 'station_id'), + sa.Index('ix_sra_recipe', 'recipe_id'), + mysql_engine='InnoDB', + mysql_charset='utf8mb4', + ) + + +def downgrade() -> None: + op.drop_table('station_recipe_assignments') + op.drop_table('stations') +``` + +- [ ] **Step 2: Verify migration is detected** + +Run: `cd server && alembic -c migrations/alembic.ini history` +Expected output includes both `001_image_path` and `002_add_stations`. + +- [ ] **Step 3: Commit** + +```bash +git add server/migrations/versions/002_add_stations.py +git commit -m "feat(db): add migration 002 for stations and assignments" +``` + +--- + +## Task 2: Station and StationRecipeAssignment Models + +**Files:** +- Create: `server/models/station.py` +- Modify: `server/tests/conftest.py` (import new models so `Base.metadata.create_all` picks them up) +- Test: `server/tests/test_station_model.py` + +- [ ] **Step 1: Write failing test** + +```python +# server/tests/test_station_model.py +"""Test the Station and StationRecipeAssignment ORM models.""" +import pytest +from sqlalchemy import select +from sqlalchemy.ext.asyncio import AsyncSession + +from models.station import Station, StationRecipeAssignment +from tests.conftest import _create_user, create_test_recipe + + +@pytest.mark.asyncio +async def test_create_station(db_session: AsyncSession): + admin = await _create_user(db_session, username="admin1", is_admin=True) + station = Station( + code="ST-001", + name="Linea 1", + location="Reparto Nord", + created_by=admin.id, + ) + db_session.add(station) + await db_session.flush() + await db_session.refresh(station) + assert station.id is not None + assert station.active is True + assert station.created_at is not None + + +@pytest.mark.asyncio +async def test_station_code_is_unique(db_session: AsyncSession): + from sqlalchemy.exc import IntegrityError + admin = await _create_user(db_session, username="admin2", is_admin=True) + 
db_session.add(Station(code="ST-DUP", name="A", created_by=admin.id)) + await db_session.flush() + db_session.add(Station(code="ST-DUP", name="B", created_by=admin.id)) + with pytest.raises(IntegrityError): + await db_session.flush() + + +@pytest.mark.asyncio +async def test_assign_recipe_to_station(db_session: AsyncSession): + admin = await _create_user(db_session, username="admin3", is_admin=True) + station = Station(code="ST-002", name="Linea 2", created_by=admin.id) + db_session.add(station) + await db_session.flush() + recipe = await create_test_recipe(db_session, user_id=admin.id, code="REC-X") + assignment = StationRecipeAssignment( + station_id=station.id, recipe_id=recipe.id, assigned_by=admin.id, + ) + db_session.add(assignment) + await db_session.flush() + result = await db_session.execute( + select(StationRecipeAssignment).where( + StationRecipeAssignment.station_id == station.id + ) + ) + assignments = result.scalars().all() + assert len(assignments) == 1 + assert assignments[0].recipe_id == recipe.id +``` + +- [ ] **Step 2: Run test to verify it fails** + +Run: `cd server && pytest tests/test_station_model.py -v` +Expected: FAIL with `ModuleNotFoundError: No module named 'models.station'`. + +- [ ] **Step 3: Write the model file** + +```python +# server/models/station.py +"""Station and StationRecipeAssignment models. + +A Station represents a physical control point (typically one per tablet/PC). +Recipes are assigned to stations so that each station only sees the products +it is supposed to inspect. 
+""" +from datetime import datetime +from typing import TYPE_CHECKING, Optional + +from sqlalchemy import Boolean, DateTime, ForeignKey, Index, Integer, String, Text, UniqueConstraint, func +from sqlalchemy.orm import Mapped, mapped_column, relationship + +from database import Base + +if TYPE_CHECKING: + from models.recipe import Recipe + + +class Station(Base): + __tablename__ = "stations" + + id: Mapped[int] = mapped_column(Integer, primary_key=True, autoincrement=True) + code: Mapped[str] = mapped_column(String(100), unique=True, nullable=False, index=True) + name: Mapped[str] = mapped_column(String(255), nullable=False) + location: Mapped[Optional[str]] = mapped_column(String(255), nullable=True) + notes: Mapped[Optional[str]] = mapped_column(Text, nullable=True) + active: Mapped[bool] = mapped_column(Boolean, nullable=False, default=True, index=True) + created_by: Mapped[int] = mapped_column(Integer, ForeignKey("users.id"), nullable=False) + created_at: Mapped[datetime] = mapped_column( + DateTime, nullable=False, server_default=func.now() + ) + + assignments: Mapped[list["StationRecipeAssignment"]] = relationship( + back_populates="station", cascade="all, delete-orphan", lazy="selectin" + ) + + __table_args__ = ( + {"mysql_engine": "InnoDB", "mysql_charset": "utf8mb4"}, + ) + + def __repr__(self) -> str: + return f"" + + +class StationRecipeAssignment(Base): + __tablename__ = "station_recipe_assignments" + + id: Mapped[int] = mapped_column(Integer, primary_key=True, autoincrement=True) + station_id: Mapped[int] = mapped_column( + Integer, ForeignKey("stations.id", ondelete="CASCADE"), nullable=False, index=True + ) + recipe_id: Mapped[int] = mapped_column( + Integer, ForeignKey("recipes.id", ondelete="CASCADE"), nullable=False, index=True + ) + assigned_by: Mapped[int] = mapped_column(Integer, ForeignKey("users.id"), nullable=False) + assigned_at: Mapped[datetime] = mapped_column( + DateTime, nullable=False, server_default=func.now() + ) + + station: 
Mapped["Station"] = relationship(back_populates="assignments") + recipe: Mapped["Recipe"] = relationship(lazy="selectin") + + __table_args__ = ( + UniqueConstraint("station_id", "recipe_id", name="uq_station_recipe"), + {"mysql_engine": "InnoDB", "mysql_charset": "utf8mb4"}, + ) + + def __repr__(self) -> str: + return f"" +``` + +- [ ] **Step 4: Import the new models in conftest** + +Modify `server/tests/conftest.py` — add this import near the existing model imports (around line 41): + +```python +from models.station import Station, StationRecipeAssignment +``` + +- [ ] **Step 5: Run test to verify it passes** + +Run: `cd server && pytest tests/test_station_model.py -v` +Expected: 3 passed. + +- [ ] **Step 6: Commit** + +```bash +git add server/models/station.py server/tests/test_station_model.py server/tests/conftest.py +git commit -m "feat(models): add Station and StationRecipeAssignment models" +``` + +--- + +## Task 3: Pydantic Schemas for Station + +**Files:** +- Create: `server/schemas/station.py` +- Test: `server/tests/test_station_schemas.py` + +- [ ] **Step 1: Write failing test** + +```python +# server/tests/test_station_schemas.py +"""Tests for Station Pydantic schemas.""" +import pytest +from pydantic import ValidationError + +from schemas.station import ( + StationCreate, StationUpdate, StationResponse, + StationRecipeAssignmentCreate, StationRecipeAssignmentResponse, + StationWithRecipesResponse, +) + + +def test_station_create_valid(): + data = StationCreate(code="ST-001", name="Linea 1", location="Reparto Nord") + assert data.code == "ST-001" + assert data.active is True + + +def test_station_create_rejects_empty_code(): + with pytest.raises(ValidationError): + StationCreate(code="", name="X") + + +def test_station_create_rejects_too_long_code(): + with pytest.raises(ValidationError): + StationCreate(code="A" * 101, name="X") + + +def test_station_update_all_optional(): + data = StationUpdate() + assert data.name is None + + +def 
test_station_assignment_create(): + data = StationRecipeAssignmentCreate(recipe_id=42) + assert data.recipe_id == 42 +``` + +- [ ] **Step 2: Run test to verify it fails** + +Run: `cd server && pytest tests/test_station_schemas.py -v` +Expected: FAIL with `ModuleNotFoundError: No module named 'schemas.station'`. + +- [ ] **Step 3: Write the schemas** + +```python +# server/schemas/station.py +"""Pydantic schemas for Station and StationRecipeAssignment.""" +from datetime import datetime +from typing import Optional + +from pydantic import BaseModel, ConfigDict, Field + + +class StationCreate(BaseModel): + code: str = Field(..., min_length=1, max_length=100) + name: str = Field(..., min_length=1, max_length=255) + location: Optional[str] = Field(default=None, max_length=255) + notes: Optional[str] = None + active: bool = True + + +class StationUpdate(BaseModel): + name: Optional[str] = Field(default=None, min_length=1, max_length=255) + location: Optional[str] = Field(default=None, max_length=255) + notes: Optional[str] = None + active: Optional[bool] = None + + +class StationResponse(BaseModel): + model_config = ConfigDict(from_attributes=True) + id: int + code: str + name: str + location: Optional[str] + notes: Optional[str] + active: bool + created_by: int + created_at: datetime + + +class StationRecipeAssignmentCreate(BaseModel): + recipe_id: int = Field(..., gt=0) + + +class StationRecipeAssignmentResponse(BaseModel): + model_config = ConfigDict(from_attributes=True) + id: int + station_id: int + recipe_id: int + assigned_by: int + assigned_at: datetime + + +class _RecipeSummary(BaseModel): + model_config = ConfigDict(from_attributes=True) + id: int + code: str + name: str + active: bool + + +class StationWithRecipesResponse(StationResponse): + recipes: list[_RecipeSummary] = Field(default_factory=list) +``` + +- [ ] **Step 4: Run test to verify it passes** + +Run: `cd server && pytest tests/test_station_schemas.py -v` +Expected: 5 passed. 
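Worth noting why `StationUpdate` makes every field optional: the service layer (Task 4) applies partial updates via `model_dump(exclude_unset=True)`, so only fields the caller actually sent reach the database row. A standalone sketch of that Pydantic v2 behavior (the inline model mirrors the schema above, for illustration only):

```python
from typing import Optional

from pydantic import BaseModel, Field


class StationUpdate(BaseModel):
    """Local mirror of the StationUpdate schema, for illustration only."""
    name: Optional[str] = Field(default=None, min_length=1, max_length=255)
    active: Optional[bool] = None


# exclude_unset drops fields the caller never provided, so a partial
# update like {"name": "New"} cannot accidentally null out `active`.
payload = StationUpdate(name="New").model_dump(exclude_unset=True)
assert payload == {"name": "New"}
assert StationUpdate().model_dump(exclude_unset=True) == {}
```

This is the property the service's `setattr` loop relies on; using `exclude_none=True` instead would also make it impossible to deliberately clear a nullable field.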
+ +- [ ] **Step 5: Commit** + +```bash +git add server/schemas/station.py server/tests/test_station_schemas.py +git commit -m "feat(schemas): add Station and assignment Pydantic schemas" +``` + +--- + +## Task 4: Station Service (Business Logic) + +**Files:** +- Create: `server/services/station_service.py` +- Test: `server/tests/test_station_service.py` + +- [ ] **Step 1: Write failing test** + +```python +# server/tests/test_station_service.py +"""Tests for station_service business logic.""" +import pytest +from fastapi import HTTPException +from sqlalchemy.ext.asyncio import AsyncSession + +from models.station import Station +from schemas.station import StationCreate, StationUpdate +from services.station_service import ( + create_station, update_station, delete_station, + assign_recipe, unassign_recipe, list_station_recipes, + get_station_by_code, +) +from tests.conftest import _create_user, create_test_recipe + + +@pytest.mark.asyncio +async def test_create_station_ok(db_session: AsyncSession): + admin = await _create_user(db_session, username="a1", is_admin=True) + station = await create_station( + db_session, StationCreate(code="ST-100", name="Pilot"), admin + ) + assert station.id is not None + assert station.code == "ST-100" + + +@pytest.mark.asyncio +async def test_create_station_duplicate_code(db_session: AsyncSession): + admin = await _create_user(db_session, username="a2", is_admin=True) + await create_station(db_session, StationCreate(code="ST-DUP", name="A"), admin) + with pytest.raises(HTTPException) as exc: + await create_station(db_session, StationCreate(code="ST-DUP", name="B"), admin) + assert exc.value.status_code == 409 + + +@pytest.mark.asyncio +async def test_update_station(db_session: AsyncSession): + admin = await _create_user(db_session, username="a3", is_admin=True) + station = await create_station(db_session, StationCreate(code="ST-U", name="Old"), admin) + updated = await update_station( + db_session, station.id, StationUpdate(name="New 
name"), + ) + assert updated.name == "New name" + assert updated.code == "ST-U" + + +@pytest.mark.asyncio +async def test_update_missing_station(db_session: AsyncSession): + with pytest.raises(HTTPException) as exc: + await update_station(db_session, 9999, StationUpdate(name="x")) + assert exc.value.status_code == 404 + + +@pytest.mark.asyncio +async def test_assign_and_list_recipes(db_session: AsyncSession): + admin = await _create_user(db_session, username="a4", is_admin=True) + station = await create_station(db_session, StationCreate(code="ST-R", name="R"), admin) + r1 = await create_test_recipe(db_session, user_id=admin.id, code="REC-R1") + r2 = await create_test_recipe(db_session, user_id=admin.id, code="REC-R2") + await assign_recipe(db_session, station.id, r1.id, admin) + await assign_recipe(db_session, station.id, r2.id, admin) + recipes = await list_station_recipes(db_session, station.id) + assert {r.code for r in recipes} == {"REC-R1", "REC-R2"} + + +@pytest.mark.asyncio +async def test_assign_same_recipe_twice_is_409(db_session: AsyncSession): + admin = await _create_user(db_session, username="a5", is_admin=True) + station = await create_station(db_session, StationCreate(code="ST-D", name="D"), admin) + r = await create_test_recipe(db_session, user_id=admin.id, code="REC-D") + await assign_recipe(db_session, station.id, r.id, admin) + with pytest.raises(HTTPException) as exc: + await assign_recipe(db_session, station.id, r.id, admin) + assert exc.value.status_code == 409 + + +@pytest.mark.asyncio +async def test_unassign_recipe(db_session: AsyncSession): + admin = await _create_user(db_session, username="a6", is_admin=True) + station = await create_station(db_session, StationCreate(code="ST-UN", name="UN"), admin) + r = await create_test_recipe(db_session, user_id=admin.id, code="REC-UN") + await assign_recipe(db_session, station.id, r.id, admin) + await unassign_recipe(db_session, station.id, r.id) + recipes = await list_station_recipes(db_session, 
station.id) + assert recipes == [] + + +@pytest.mark.asyncio +async def test_get_station_by_code(db_session: AsyncSession): + admin = await _create_user(db_session, username="a7", is_admin=True) + await create_station(db_session, StationCreate(code="ST-FIND", name="F"), admin) + found = await get_station_by_code(db_session, "ST-FIND") + assert found is not None + assert found.name == "F" + missing = await get_station_by_code(db_session, "ST-NOPE") + assert missing is None + + +@pytest.mark.asyncio +async def test_list_recipes_only_returns_active(db_session: AsyncSession): + admin = await _create_user(db_session, username="a8", is_admin=True) + station = await create_station(db_session, StationCreate(code="ST-A", name="A"), admin) + active = await create_test_recipe(db_session, user_id=admin.id, code="REC-AC") + inactive = await create_test_recipe(db_session, user_id=admin.id, code="REC-IN") + inactive.active = False + await db_session.flush() + await assign_recipe(db_session, station.id, active.id, admin) + await assign_recipe(db_session, station.id, inactive.id, admin) + recipes = await list_station_recipes(db_session, station.id) + assert [r.code for r in recipes] == ["REC-AC"] + + +@pytest.mark.asyncio +async def test_delete_station_cascades_assignments(db_session: AsyncSession): + from sqlalchemy import select + from models.station import StationRecipeAssignment + admin = await _create_user(db_session, username="a9", is_admin=True) + station = await create_station(db_session, StationCreate(code="ST-DEL", name="D"), admin) + r = await create_test_recipe(db_session, user_id=admin.id, code="REC-DEL") + await assign_recipe(db_session, station.id, r.id, admin) + await delete_station(db_session, station.id) + remaining = await db_session.execute( + select(StationRecipeAssignment).where(StationRecipeAssignment.station_id == station.id) + ) + assert remaining.scalars().all() == [] +``` + +- [ ] **Step 2: Run test to verify it fails** + +Run: `cd server && pytest 
tests/test_station_service.py -v` +Expected: FAIL with `ModuleNotFoundError: No module named 'services.station_service'`. + +- [ ] **Step 3: Write the service** + +```python +# server/services/station_service.py +"""Business logic for stations and recipe assignments. + +Routers must call into these functions rather than manipulating models directly. +All functions are async and accept an AsyncSession; they flush but do NOT commit +(commit is handled by the FastAPI get_db dependency). +""" +from typing import Optional + +from fastapi import HTTPException, status +from sqlalchemy import select +from sqlalchemy.ext.asyncio import AsyncSession + +from models.recipe import Recipe +from models.station import Station, StationRecipeAssignment +from models.user import User +from schemas.station import StationCreate, StationUpdate + + +async def create_station( + db: AsyncSession, data: StationCreate, creator: User, +) -> Station: + existing = await db.execute(select(Station).where(Station.code == data.code)) + if existing.scalar_one_or_none() is not None: + raise HTTPException( + status_code=status.HTTP_409_CONFLICT, + detail=f"Station code '{data.code}' already exists", + ) + station = Station( + code=data.code, + name=data.name, + location=data.location, + notes=data.notes, + active=data.active, + created_by=creator.id, + ) + db.add(station) + await db.flush() + await db.refresh(station) + return station + + +async def get_station(db: AsyncSession, station_id: int) -> Station: + result = await db.execute(select(Station).where(Station.id == station_id)) + station = result.scalar_one_or_none() + if station is None: + raise HTTPException( + status_code=status.HTTP_404_NOT_FOUND, detail="Station not found", + ) + return station + + +async def get_station_by_code(db: AsyncSession, code: str) -> Optional[Station]: + result = await db.execute(select(Station).where(Station.code == code)) + return result.scalar_one_or_none() + + +async def list_stations(db: AsyncSession, 
active_only: bool = False) -> list[Station]: + query = select(Station).order_by(Station.code) + if active_only: + query = query.where(Station.active == True) + result = await db.execute(query) + return list(result.scalars().all()) + + +async def update_station( + db: AsyncSession, station_id: int, data: StationUpdate, +) -> Station: + station = await get_station(db, station_id) + for field, value in data.model_dump(exclude_unset=True).items(): + setattr(station, field, value) + await db.flush() + await db.refresh(station) + return station + + +async def delete_station(db: AsyncSession, station_id: int) -> None: + station = await get_station(db, station_id) + await db.delete(station) + await db.flush() + + +async def assign_recipe( + db: AsyncSession, station_id: int, recipe_id: int, assigner: User, +) -> StationRecipeAssignment: + await get_station(db, station_id) + recipe_row = await db.execute(select(Recipe).where(Recipe.id == recipe_id)) + if recipe_row.scalar_one_or_none() is None: + raise HTTPException( + status_code=status.HTTP_404_NOT_FOUND, detail="Recipe not found", + ) + existing = await db.execute( + select(StationRecipeAssignment).where( + StationRecipeAssignment.station_id == station_id, + StationRecipeAssignment.recipe_id == recipe_id, + ) + ) + if existing.scalar_one_or_none() is not None: + raise HTTPException( + status_code=status.HTTP_409_CONFLICT, + detail="Recipe already assigned to this station", + ) + assignment = StationRecipeAssignment( + station_id=station_id, recipe_id=recipe_id, assigned_by=assigner.id, + ) + db.add(assignment) + await db.flush() + await db.refresh(assignment) + return assignment + + +async def unassign_recipe( + db: AsyncSession, station_id: int, recipe_id: int, +) -> None: + result = await db.execute( + select(StationRecipeAssignment).where( + StationRecipeAssignment.station_id == station_id, + StationRecipeAssignment.recipe_id == recipe_id, + ) + ) + assignment = result.scalar_one_or_none() + if assignment is None: + 
raise HTTPException( + status_code=status.HTTP_404_NOT_FOUND, + detail="Assignment not found", + ) + await db.delete(assignment) + await db.flush() + + +async def list_station_recipes( + db: AsyncSession, station_id: int, +) -> list[Recipe]: + """Return active recipes assigned to this station, ordered by code.""" + await get_station(db, station_id) + result = await db.execute( + select(Recipe) + .join(StationRecipeAssignment, StationRecipeAssignment.recipe_id == Recipe.id) + .where( + StationRecipeAssignment.station_id == station_id, + Recipe.active == True, + ) + .order_by(Recipe.code) + ) + return list(result.scalars().all()) +``` + +- [ ] **Step 4: Run test to verify it passes** + +Run: `cd server && pytest tests/test_station_service.py -v` +Expected: 10 passed. + +- [ ] **Step 5: Commit** + +```bash +git add server/services/station_service.py server/tests/test_station_service.py +git commit -m "feat(services): add station_service with CRUD and assignment logic" +``` + +--- + +## Task 5: Router `/api/stations` (CRUD, Admin Only) + +**Files:** +- Create: `server/routers/stations.py` +- Modify: `server/main.py` (include the router) +- Test: `server/tests/test_stations_api.py` + +- [ ] **Step 1: Write failing test** + +```python +# server/tests/test_stations_api.py +"""Integration tests for /api/stations endpoints.""" +import pytest +from httpx import AsyncClient + +from tests.conftest import auth_headers, create_test_recipe + + +@pytest.mark.asyncio +async def test_list_stations_requires_auth(client: AsyncClient): + resp = await client.get("/api/stations") + assert resp.status_code == 401 + + +@pytest.mark.asyncio +async def test_create_station_as_admin(client: AsyncClient, admin_user): + resp = await client.post( + "/api/stations", + headers=auth_headers(admin_user), + json={"code": "ST-API", "name": "Via API"}, + ) + assert resp.status_code == 201, resp.text + body = resp.json() + assert body["code"] == "ST-API" + assert body["active"] is True + assert 
body["id"] > 0 + + +@pytest.mark.asyncio +async def test_create_station_non_admin_is_403(client: AsyncClient, maker_user): + resp = await client.post( + "/api/stations", + headers=auth_headers(maker_user), + json={"code": "ST-NO", "name": "No"}, + ) + assert resp.status_code == 403 + + +@pytest.mark.asyncio +async def test_update_station(client: AsyncClient, admin_user): + created = await client.post( + "/api/stations", + headers=auth_headers(admin_user), + json={"code": "ST-UP", "name": "Old"}, + ) + station_id = created.json()["id"] + resp = await client.put( + f"/api/stations/{station_id}", + headers=auth_headers(admin_user), + json={"name": "New", "active": False}, + ) + assert resp.status_code == 200 + body = resp.json() + assert body["name"] == "New" + assert body["active"] is False + + +@pytest.mark.asyncio +async def test_delete_station(client: AsyncClient, admin_user): + created = await client.post( + "/api/stations", + headers=auth_headers(admin_user), + json={"code": "ST-D", "name": "D"}, + ) + sid = created.json()["id"] + resp = await client.delete( + f"/api/stations/{sid}", headers=auth_headers(admin_user), + ) + assert resp.status_code == 204 + again = await client.get( + f"/api/stations/{sid}", headers=auth_headers(admin_user), + ) + assert again.status_code == 404 + + +@pytest.mark.asyncio +async def test_assign_and_unassign_recipe( + client: AsyncClient, admin_user, db_session, +): + recipe = await create_test_recipe(db_session, user_id=admin_user.id, code="REC-AS") + await db_session.commit() + created = await client.post( + "/api/stations", + headers=auth_headers(admin_user), + json={"code": "ST-ASSIGN", "name": "A"}, + ) + sid = created.json()["id"] + a = await client.post( + f"/api/stations/{sid}/recipes", + headers=auth_headers(admin_user), + json={"recipe_id": recipe.id}, + ) + assert a.status_code == 201 + r = await client.get( + f"/api/stations/{sid}/recipes", + headers=auth_headers(admin_user), + ) + assert r.status_code == 200 + assert 
[rec["code"] for rec in r.json()] == ["REC-AS"] + u = await client.delete( + f"/api/stations/{sid}/recipes/{recipe.id}", + headers=auth_headers(admin_user), + ) + assert u.status_code == 204 + r2 = await client.get( + f"/api/stations/{sid}/recipes", + headers=auth_headers(admin_user), + ) + assert r2.json() == [] + + +@pytest.mark.asyncio +async def test_list_recipes_by_station_code( + client: AsyncClient, admin_user, measurement_tec_user, db_session, +): + recipe = await create_test_recipe(db_session, user_id=admin_user.id, code="REC-BC") + await db_session.commit() + created = await client.post( + "/api/stations", + headers=auth_headers(admin_user), + json={"code": "ST-BC", "name": "BC"}, + ) + sid = created.json()["id"] + await client.post( + f"/api/stations/{sid}/recipes", + headers=auth_headers(admin_user), + json={"recipe_id": recipe.id}, + ) + resp = await client.get( + "/api/stations/by-code/ST-BC/recipes", + headers=auth_headers(measurement_tec_user), + ) + assert resp.status_code == 200 + assert [r["code"] for r in resp.json()] == ["REC-BC"] + + +@pytest.mark.asyncio +async def test_list_recipes_by_unknown_code_404( + client: AsyncClient, measurement_tec_user, +): + resp = await client.get( + "/api/stations/by-code/ST-DOES-NOT-EXIST/recipes", + headers=auth_headers(measurement_tec_user), + ) + assert resp.status_code == 404 +``` + +- [ ] **Step 2: Run test to verify it fails** + +Run: `cd server && pytest tests/test_stations_api.py -v` +Expected: all fail with 404 because the router isn't registered. 
+ +- [ ] **Step 3: Write the router** + +```python +# server/routers/stations.py +"""Stations router - CRUD + recipe assignments.""" +from fastapi import APIRouter, Depends, HTTPException, status +from sqlalchemy.ext.asyncio import AsyncSession + +from database import get_db +from middleware.api_key import get_current_user, require_admin_user +from models.user import User +from schemas.station import ( + StationCreate, StationUpdate, StationResponse, + StationRecipeAssignmentCreate, StationRecipeAssignmentResponse, + _RecipeSummary, +) +from services import station_service + +router = APIRouter(prefix="/api/stations", tags=["stations"]) + + +@router.get("", response_model=list[StationResponse]) +async def list_stations( + active_only: bool = False, + admin: User = Depends(require_admin_user), + db: AsyncSession = Depends(get_db), +): + """List all stations (admin only).""" + stations = await station_service.list_stations(db, active_only=active_only) + return [StationResponse.model_validate(s) for s in stations] + + +@router.post("", response_model=StationResponse, status_code=status.HTTP_201_CREATED) +async def create_new_station( + data: StationCreate, + admin: User = Depends(require_admin_user), + db: AsyncSession = Depends(get_db), +): + """Create a station (admin only).""" + station = await station_service.create_station(db, data, admin) + return StationResponse.model_validate(station) + + +@router.get("/{station_id}", response_model=StationResponse) +async def get_single_station( + station_id: int, + admin: User = Depends(require_admin_user), + db: AsyncSession = Depends(get_db), +): + """Get a station by id (admin only).""" + station = await station_service.get_station(db, station_id) + return StationResponse.model_validate(station) + + +@router.put("/{station_id}", response_model=StationResponse) +async def update_existing_station( + station_id: int, + data: StationUpdate, + admin: User = Depends(require_admin_user), + db: AsyncSession = Depends(get_db), +): 
+ """Update a station (admin only).""" + station = await station_service.update_station(db, station_id, data) + return StationResponse.model_validate(station) + + +@router.delete("/{station_id}", status_code=status.HTTP_204_NO_CONTENT) +async def remove_station( + station_id: int, + admin: User = Depends(require_admin_user), + db: AsyncSession = Depends(get_db), +): + """Delete a station (admin only). Cascades to assignments.""" + await station_service.delete_station(db, station_id) + + +@router.get("/{station_id}/recipes", response_model=list[_RecipeSummary]) +async def list_assigned_recipes( + station_id: int, + admin: User = Depends(require_admin_user), + db: AsyncSession = Depends(get_db), +): + """Admin view: recipes assigned to this station (active only).""" + recipes = await station_service.list_station_recipes(db, station_id) + return [_RecipeSummary.model_validate(r) for r in recipes] + + +@router.post( + "/{station_id}/recipes", + response_model=StationRecipeAssignmentResponse, + status_code=status.HTTP_201_CREATED, +) +async def assign_recipe_to_station( + station_id: int, + data: StationRecipeAssignmentCreate, + admin: User = Depends(require_admin_user), + db: AsyncSession = Depends(get_db), +): + """Assign a recipe to a station (admin only).""" + assignment = await station_service.assign_recipe( + db, station_id, data.recipe_id, admin, + ) + return StationRecipeAssignmentResponse.model_validate(assignment) + + +@router.delete( + "/{station_id}/recipes/{recipe_id}", + status_code=status.HTTP_204_NO_CONTENT, +) +async def unassign_recipe_from_station( + station_id: int, + recipe_id: int, + admin: User = Depends(require_admin_user), + db: AsyncSession = Depends(get_db), +): + """Remove a recipe assignment (admin only).""" + await station_service.unassign_recipe(db, station_id, recipe_id) + + +@router.get( + "/by-code/{code}/recipes", + response_model=list[_RecipeSummary], +) +async def list_recipes_by_station_code( + code: str, + user: User = 
Depends(get_current_user), + db: AsyncSession = Depends(get_db), +): + """Operator view: active recipes assigned to the station with this code. + + Used by the Flask client at startup / on select_recipe page. + Any authenticated user can call this; filtering is by station code from + the client's STATION_CODE environment variable. + """ + station = await station_service.get_station_by_code(db, code) + if station is None or not station.active: + raise HTTPException( + status_code=status.HTTP_404_NOT_FOUND, + detail=f"Station '{code}' not found or inactive", + ) + recipes = await station_service.list_station_recipes(db, station.id) + return [_RecipeSummary.model_validate(r) for r in recipes] +``` + +- [ ] **Step 4: Register the router in main.py** + +Modify `server/main.py` — find the section where other routers are included (e.g. `app.include_router(users_router)`) and add: + +```python +from routers.stations import router as stations_router +# ... +app.include_router(stations_router) +``` + +Place the import alongside the existing router imports and the `include_router` call alongside the others; follow local style. + +- [ ] **Step 5: Run test to verify it passes** + +Run: `cd server && pytest tests/test_stations_api.py -v` +Expected: 8 passed. + +- [ ] **Step 6: Run full server test suite — no regressions** + +Run: `cd server && pytest -q` +Expected: all previous tests still pass. + +- [ ] **Step 7: Commit** + +```bash +git add server/routers/stations.py server/main.py server/tests/test_stations_api.py +git commit -m "feat(api): add /api/stations router with CRUD and assignments" +``` + +--- + +## Task 6: Seed Default Station on Setup + +**Rationale:** when the migration runs against an existing DB, there are already recipes but no stations — clients would break. The `/api/setup` endpoint (used for initial seeding) and the dev `init_db()` path must create a `ST-DEFAULT` station and auto-assign all active recipes to it, so existing installations keep working. 
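For context on how clients would break: at startup the Flask client will ask the server for its station's recipes via the by-code endpoint from Task 5. A hypothetical sketch (`fetch_station_recipes` is an illustrative name; `api_client` stands in for the existing HTTP wrapper):

```python
def fetch_station_recipes(api_client, station_code: str) -> list[dict]:
    """Fetch active recipes assigned to this client's station.

    Illustrative helper: a 404 means the station is missing or inactive,
    which is exactly what the ST-DEFAULT seed prevents for legacy installs
    that predate the stations migration.
    """
    resp = api_client.get(f"/api/stations/by-code/{station_code}/recipes")
    if resp.status_code == 404:
        return []  # station not found or inactive: nothing to show
    resp.raise_for_status()
    return resp.json()
```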
+
+**Files:**
+- Modify: `server/routers/setup.py`
+- Test: `server/tests/test_station_seed.py`
+
+- [ ] **Step 1: Read the current setup flow**
+
+Skim `server/routers/setup.py` to find the function that seeds demo data. Identify where recipes are created.
+
+- [ ] **Step 2: Write failing test**
+
+```python
+# server/tests/test_station_seed.py
+"""Verify /api/setup creates a default station with all recipes assigned."""
+import pytest
+from httpx import AsyncClient
+from sqlalchemy import select
+from sqlalchemy.ext.asyncio import AsyncSession
+
+from models.station import Station, StationRecipeAssignment
+from models.recipe import Recipe
+
+
+@pytest.mark.asyncio
+async def test_setup_creates_default_station(
+    client: AsyncClient, db_session: AsyncSession, monkeypatch,
+):
+    # Use monkeypatch so SETUP_PASSWORD does not leak into other tests.
+    monkeypatch.setenv("SETUP_PASSWORD", "test-setup-pwd")
+
+    resp = await client.post(
+        "/api/setup/initialize",
+        json={"password": "test-setup-pwd", "load_demo_data": True},
+    )
+    assert resp.status_code == 200, resp.text
+
+    result = await db_session.execute(select(Station).where(Station.code == "ST-DEFAULT"))
+    default = result.scalar_one_or_none()
+    assert default is not None
+    assert default.active is True
+
+    recipes = await db_session.execute(select(Recipe).where(Recipe.active == True))
+    n_recipes = len(recipes.scalars().all())
+
+    assignments = await db_session.execute(
+        select(StationRecipeAssignment).where(
+            StationRecipeAssignment.station_id == default.id
+        )
+    )
+    n_assignments = len(assignments.scalars().all())
+    assert n_assignments == n_recipes
+```
+
+- [ ] **Step 3: Run test to verify it fails**
+
+Run: `cd server && pytest tests/test_station_seed.py -v`
+Expected: FAIL (no `ST-DEFAULT` exists).
+ +- [ ] **Step 4: Modify the setup seed** + +In `server/routers/setup.py`, at the end of the demo seed function (after recipes are created and committed) add: + +```python +# Create default station and assign all active recipes to it +from models.station import Station, StationRecipeAssignment +from sqlalchemy import select + +result = await db.execute(select(Station).where(Station.code == "ST-DEFAULT")) +if result.scalar_one_or_none() is None: + default_station = Station( + code="ST-DEFAULT", + name="Default Station", + location="Initial seed - change me", + created_by=admin_user.id, + ) + db.add(default_station) + await db.flush() + await db.refresh(default_station) + + recipes_result = await db.execute(select(Recipe).where(Recipe.active == True)) + for r in recipes_result.scalars().all(): + db.add(StationRecipeAssignment( + station_id=default_station.id, + recipe_id=r.id, + assigned_by=admin_user.id, + )) + await db.flush() +``` + +Adapt variable names (`db`, `admin_user`, `Recipe`) to match the existing ones in `setup.py`. + +- [ ] **Step 5: Run test to verify it passes** + +Run: `cd server && pytest tests/test_station_seed.py -v` +Expected: passed. 
+ +- [ ] **Step 6: Commit** + +```bash +git add server/routers/setup.py server/tests/test_station_seed.py +git commit -m "feat(setup): seed ST-DEFAULT station and assign existing recipes" +``` + +--- + +## Task 7: Client `STATION_CODE` Configuration + +**Files:** +- Modify: `client/config.py` +- Modify: `.env.example` +- Test: `client/tests/test_config_station.py` + +- [ ] **Step 1: Write failing test** + +```python +# client/tests/test_config_station.py +"""Tests that STATION_CODE is loaded from env and exposed on the client config.""" +import importlib +import os + + +def test_station_code_read_from_env(monkeypatch): + monkeypatch.setenv("STATION_CODE", "ST-TEST") + import config + importlib.reload(config) + assert config.STATION_CODE == "ST-TEST" + + +def test_station_code_defaults_to_none_when_missing(monkeypatch): + monkeypatch.delenv("STATION_CODE", raising=False) + import config + importlib.reload(config) + assert config.STATION_CODE is None +``` + +- [ ] **Step 2: Run test to verify it fails** + +Run: `cd client && pytest tests/test_config_station.py -v` +Expected: FAIL (no `STATION_CODE` attribute). + +- [ ] **Step 3: Add STATION_CODE to client/config.py** + +Open `client/config.py` and, alongside the other `os.getenv` lines (near `API_SERVER_URL`), add: + +```python +# Station identity: each deployed client container sets this to the station code +# it belongs to. Empty/None means "not configured" → client will show a config +# error page on select_recipe. +STATION_CODE: str | None = os.getenv("STATION_CODE") or None +``` + +- [ ] **Step 4: Add STATION_CODE to .env.example** + +Open `.env.example` (repo root) and add, near the other client variables: + +``` +# Station code this client container belongs to (e.g. ST-001). +# Each physical tablet/PC deployment must set this unique per-station value. +# Leave empty only for a single-station all-in-one demo using ST-DEFAULT. 
+STATION_CODE=ST-DEFAULT +``` + +- [ ] **Step 5: Run test to verify it passes** + +Run: `cd client && pytest tests/test_config_station.py -v` +Expected: 2 passed. + +- [ ] **Step 6: Commit** + +```bash +git add client/config.py client/tests/test_config_station.py .env.example +git commit -m "feat(client): add STATION_CODE env var and config attribute" +``` + +--- + +## Task 8: Client `api_client` Helper — `get_station_recipes()` + +**Files:** +- Modify: `client/services/api_client.py` +- Test: `client/tests/test_api_client_stations.py` + +- [ ] **Step 1: Write failing test** + +```python +# client/tests/test_api_client_stations.py +"""Tests for the station-related helpers in api_client.""" +from unittest.mock import patch, MagicMock + +from services.api_client import APIClient + + +def test_get_station_recipes_calls_correct_endpoint(): + client = APIClient() + with patch.object(client, "get") as mock_get: + mock_get.return_value = [{"id": 1, "code": "R1", "name": "R1", "active": True}] + result = client.get_station_recipes("ST-001", api_key="abc") + mock_get.assert_called_once_with( + "/api/stations/by-code/ST-001/recipes", api_key="abc", + ) + assert result == [{"id": 1, "code": "R1", "name": "R1", "active": True}] +``` + +- [ ] **Step 2: Run test to verify it fails** + +Run: `cd client && pytest tests/test_api_client_stations.py -v` +Expected: FAIL with `AttributeError: 'APIClient' object has no attribute 'get_station_recipes'`. 
+ +- [ ] **Step 3: Add the helper method** + +In `client/services/api_client.py` find the class `APIClient` and add a method (following the style of any existing helpers like `get_recipe`): + +```python +def get_station_recipes(self, station_code: str, api_key: str): + """Return the list of active recipes assigned to the given station.""" + return self.get(f"/api/stations/by-code/{station_code}/recipes", api_key=api_key) +``` + +- [ ] **Step 4: Run test to verify it passes** + +Run: `cd client && pytest tests/test_api_client_stations.py -v` +Expected: passed. + +- [ ] **Step 5: Commit** + +```bash +git add client/services/api_client.py client/tests/test_api_client_stations.py +git commit -m "feat(client): add get_station_recipes helper on APIClient" +``` + +--- + +## Task 9: Client `measure.select_recipe` — Filter by STATION_CODE + +**Goal:** when the operator reaches the recipe-selection page, the list is filtered to only the recipes assigned to this tablet's station. If `STATION_CODE` is unset, show an explicit configuration error page. 
+ +**Files:** +- Modify: `client/blueprints/measure.py` +- Modify: `client/templates/measure/select_recipe.html` +- Create: `client/templates/errors/station_not_configured.html` +- Test: `client/tests/test_measure_station_filter.py` + +- [ ] **Step 1: Write failing test** + +```python +# client/tests/test_measure_station_filter.py +"""Verify that /measure/select reads STATION_CODE and filters recipes via the server.""" +from unittest.mock import patch + + +def test_select_recipe_calls_station_endpoint(logged_in_client, monkeypatch): + monkeypatch.setenv("STATION_CODE", "ST-TEST") + import config; import importlib; importlib.reload(config) + import blueprints.measure; importlib.reload(blueprints.measure) + + with patch("blueprints.measure.api_client") as mock_api: + mock_api.get_station_recipes.return_value = [ + {"id": 1, "code": "R1", "name": "Recipe 1", "active": True}, + ] + resp = logged_in_client.get("/measure/select") + assert resp.status_code == 200 + mock_api.get_station_recipes.assert_called_once() + args, kwargs = mock_api.get_station_recipes.call_args + assert args[0] == "ST-TEST" or kwargs.get("station_code") == "ST-TEST" + + +def test_select_recipe_without_station_code_shows_error(logged_in_client, monkeypatch): + monkeypatch.delenv("STATION_CODE", raising=False) + import config; import importlib; importlib.reload(config) + import blueprints.measure; importlib.reload(blueprints.measure) + + resp = logged_in_client.get("/measure/select") + assert resp.status_code == 503 + assert b"STATION_CODE" in resp.data or b"stazione" in resp.data.lower() +``` + +- [ ] **Step 2: Run test to verify it fails** + +Run: `cd client && pytest tests/test_measure_station_filter.py -v` +Expected: FAIL (either wrong API called or no error page). + +- [ ] **Step 3: Modify `select_recipe` view** + +Open `client/blueprints/measure.py`. Find the `select_recipe` view (handles `GET /measure/select`). 
Replace the call that currently lists recipes with: + +```python +import config +from flask import render_template, session + +@measure_bp.route("/select", methods=["GET"]) +def select_recipe(): + if not config.STATION_CODE: + return render_template( + "errors/station_not_configured.html", + ), 503 + api_key = session.get("api_key") + try: + recipes = api_client.get_station_recipes(config.STATION_CODE, api_key=api_key) + except Exception as e: + return render_template( + "errors/station_not_configured.html", + error=str(e), + ), 502 + return render_template( + "measure/select_recipe.html", + recipes=recipes, + station_code=config.STATION_CODE, + ) +``` + +Adapt to match the existing function signature and parameter names (query search, barcode, lot, serial). Keep all pre-existing URL params (`?recipe=&lot=&serial=`). + +- [ ] **Step 4: Create the error template** + +```html +{# client/templates/errors/station_not_configured.html #} +{% extends "base.html" %} +{% block content %} +
<div class="container my-4">
  <div class="alert alert-danger">
    <h4 class="alert-heading">{{ _('Stazione non configurata') }}</h4>
    <p>{{ _('Questo client non ha impostato la variabile di ambiente STATION_CODE.') }}</p>
    <p>{{ _('Contattare il responsabile IT: il file .env del container deve contenere STATION_CODE con il codice della stazione assegnata.') }}</p>
    {% if error %}
    <pre class="mb-0">{{ error }}</pre>
    {% endif %}
  </div>
</div>
{% endblock %}
```

- [ ] **Step 5: Show station_code in the select_recipe template**

In `client/templates/measure/select_recipe.html` add near the top, inside the header block:

```html
<div class="station-code text-muted">
  {{ _('Stazione') }}: {{ station_code }}
</div>
+``` + +Leave the rest of the template intact. + +- [ ] **Step 6: Run test to verify it passes** + +Run: `cd client && pytest tests/test_measure_station_filter.py -v` +Expected: 2 passed. + +- [ ] **Step 7: Full client test suite — no regressions** + +Run: `cd client && pytest -q` +Expected: all previous tests still pass. + +- [ ] **Step 8: Commit** + +```bash +git add client/blueprints/measure.py client/templates/measure/select_recipe.html client/templates/errors/station_not_configured.html client/tests/test_measure_station_filter.py +git commit -m "feat(client): filter select_recipe by STATION_CODE with error fallback" +``` + +--- + +## Task 10: Admin Blueprint Routes for Stations + +**Files:** +- Modify: `client/blueprints/admin.py` +- Test: `client/tests/test_admin_stations.py` + +- [ ] **Step 1: Write failing test** + +```python +# client/tests/test_admin_stations.py +"""Tests that the admin blueprint exposes /admin/stations CRUD routes.""" +from unittest.mock import patch + + +def test_admin_stations_page_renders(logged_in_admin_client): + with patch("blueprints.admin.APIClient") as MockClient: + MockClient.return_value.get.return_value = [ + {"id": 1, "code": "ST-1", "name": "One", "active": True, + "location": None, "notes": None, "created_by": 1, + "created_at": "2026-04-17T00:00:00"}, + ] + resp = logged_in_admin_client.get("/admin/stations") + assert resp.status_code == 200 + assert b"ST-1" in resp.data + + +def test_admin_create_station_posts_to_server(logged_in_admin_client): + with patch("blueprints.admin.APIClient") as MockClient: + MockClient.return_value.post.return_value = { + "id": 5, "code": "ST-NEW", "name": "N", "active": True, + "location": None, "notes": None, "created_by": 1, + "created_at": "2026-04-17T00:00:00", + } + resp = logged_in_admin_client.post( + "/admin/api/stations", + json={"code": "ST-NEW", "name": "N"}, + ) + assert resp.status_code == 201 + MockClient.return_value.post.assert_called_once() + + +def 
test_admin_assign_recipe_to_station(logged_in_admin_client): + with patch("blueprints.admin.APIClient") as MockClient: + MockClient.return_value.post.return_value = { + "id": 10, "station_id": 1, "recipe_id": 7, + "assigned_by": 1, "assigned_at": "2026-04-17T00:00:00", + } + resp = logged_in_admin_client.post( + "/admin/api/stations/1/recipes", + json={"recipe_id": 7}, + ) + assert resp.status_code == 201 +``` + +The fixture `logged_in_admin_client` is a session with `is_admin=True`; add it to `client/tests/conftest.py` if not present — see step 2. + +- [ ] **Step 2: Add fixture `logged_in_admin_client` in `client/tests/conftest.py`** + +If the file already has `logged_in_client` but not `logged_in_admin_client`, add (near `logged_in_client`): + +```python +@pytest.fixture +def logged_in_admin_client(client): + with client.session_transaction() as sess: + sess["api_key"] = "admin-test-key" + sess["user_id"] = 1 + sess["language"] = "en" + sess["theme"] = "light" + sess["user"] = { + "id": 1, "username": "admin", "display_name": "Admin", + "roles": ["Maker", "MeasurementTec", "Metrologist"], + "is_admin": True, "language_pref": "en", "theme_pref": "light", + "active": True, "email": None, + } + return client +``` + +- [ ] **Step 3: Run test to verify it fails** + +Run: `cd client && pytest tests/test_admin_stations.py -v` +Expected: FAIL (routes don't exist). + +- [ ] **Step 4: Add the admin routes** + +Open `client/blueprints/admin.py`. 
Add, following the style of the existing user-management routes:

```python
from flask import jsonify, render_template, request, session

@admin_bp.route("/stations", methods=["GET"])
def stations_page():
    api_key = session.get("api_key")
    client = APIClient()
    try:
        stations = client.get("/api/stations", api_key=api_key)
    except Exception:
        stations = []
    return render_template("admin/stations.html", stations=stations)


@admin_bp.route("/stations/<int:station_id>", methods=["GET"])
def station_detail_page(station_id: int):
    api_key = session.get("api_key")
    client = APIClient()
    try:
        station = client.get(f"/api/stations/{station_id}", api_key=api_key)
        recipes = client.get(f"/api/stations/{station_id}/recipes", api_key=api_key)
        all_recipes = client.get("/api/recipes", api_key=api_key)
    except Exception:
        station, recipes, all_recipes = None, [], []
    return render_template(
        "admin/station_detail.html",
        station=station, assigned_recipes=recipes, all_recipes=all_recipes,
    )


@admin_bp.route("/api/stations", methods=["POST"])
def api_create_station():
    api_key = session.get("api_key")
    client = APIClient()
    data = request.get_json(silent=True) or {}
    try:
        created = client.post("/api/stations", data=data, api_key=api_key)
        return jsonify(created), 201
    except Exception as e:
        return jsonify({"error": True, "detail": str(e)}), 500


@admin_bp.route("/api/stations/<int:station_id>", methods=["PUT"])
def api_update_station(station_id: int):
    api_key = session.get("api_key")
    client = APIClient()
    data = request.get_json(silent=True) or {}
    try:
        updated = client.put(f"/api/stations/{station_id}", data=data, api_key=api_key)
        return jsonify(updated), 200
    except Exception as e:
        return jsonify({"error": True, "detail": str(e)}), 500


@admin_bp.route("/api/stations/<int:station_id>", methods=["DELETE"])
def api_delete_station(station_id: int):
    api_key = session.get("api_key")
    client = APIClient()
    try:
        client.delete(f"/api/stations/{station_id}", api_key=api_key)
        return "", 204
    except Exception as e:
        return jsonify({"error": True, "detail": str(e)}), 500


@admin_bp.route("/api/stations/<int:station_id>/recipes", methods=["POST"])
def api_assign_recipe(station_id: int):
    api_key = session.get("api_key")
    client = APIClient()
    data = request.get_json(silent=True) or {}
    try:
        created = client.post(
            f"/api/stations/{station_id}/recipes", data=data, api_key=api_key,
        )
        return jsonify(created), 201
    except Exception as e:
        return jsonify({"error": True, "detail": str(e)}), 500


@admin_bp.route("/api/stations/<int:station_id>/recipes/<int:recipe_id>", methods=["DELETE"])
def api_unassign_recipe(station_id: int, recipe_id: int):
    api_key = session.get("api_key")
    client = APIClient()
    try:
        client.delete(
            f"/api/stations/{station_id}/recipes/{recipe_id}", api_key=api_key,
        )
        return "", 204
    except Exception as e:
        return jsonify({"error": True, "detail": str(e)}), 500
```

Adapt `session`, `APIClient` import, and error-normalization helpers to match the existing style in the same file.

- [ ] **Step 5: Run test to verify it passes**

Run: `cd client && pytest tests/test_admin_stations.py -v`
Expected: 3 passed.

- [ ] **Step 6: Commit**

```bash
git add client/blueprints/admin.py client/tests/test_admin_stations.py client/tests/conftest.py
git commit -m "feat(client): add admin station CRUD routes and recipe assignment proxy"
```

---

## Task 11: Admin Template — Stations List Page

**Files:**
- Create: `client/templates/admin/stations.html`

- [ ] **Step 1: Write the template**

```html
{# client/templates/admin/stations.html #}
{% extends "base.html" %}
{% block content %}
<div class="container my-4">
  <div class="d-flex justify-content-between align-items-center mb-3">
    <h2>{{ _('Gestione Stazioni') }}</h2>
    <button class="btn btn-primary" data-bs-toggle="modal" data-bs-target="#createStationModal">
      {{ _('Nuova Stazione') }}
    </button>
  </div>

  <table class="table table-striped align-middle">
    <thead>
      <tr>
        <th>{{ _('Codice') }}</th>
        <th>{{ _('Nome') }}</th>
        <th>{{ _('Ubicazione') }}</th>
        <th>{{ _('Stato') }}</th>
        <th>{{ _('Azioni') }}</th>
      </tr>
    </thead>
    <tbody>
      {% for s in stations %}
      <tr>
        <td>{{ s.code }}</td>
        <td>{{ s.name }}</td>
        <td>{{ s.location or '' }}</td>
        <td>{{ _('Attiva') if s.active else _('Disattivata') }}</td>
        <td>
          <a class="btn btn-sm btn-outline-secondary" href="/admin/stations/{{ s.id }}">{{ _('Dettagli') }}</a>
          <button class="btn btn-sm btn-outline-danger" onclick="deleteStation({{ s.id }})">{{ _('Elimina') }}</button>
        </td>
      </tr>
      {% else %}
      <tr><td colspan="5">{{ _('Nessuna stazione configurata.') }}</td></tr>
      {% endfor %}
    </tbody>
  </table>

  {# Create modal #}
  <div class="modal fade" id="createStationModal" tabindex="-1">
    <div class="modal-dialog">
      <div class="modal-content">
        <div class="modal-header">
          <h5 class="modal-title">{{ _('Nuova Stazione') }}</h5>
          <button type="button" class="btn-close" data-bs-dismiss="modal"></button>
        </div>
        <div class="modal-body">
          <div class="mb-3">
            <label class="form-label" for="station-code">{{ _('Codice') }}</label>
            <input class="form-control" id="station-code" required>
          </div>
          <div class="mb-3">
            <label class="form-label" for="station-name">{{ _('Nome') }}</label>
            <input class="form-control" id="station-name" required>
          </div>
          <div class="mb-3">
            <label class="form-label" for="station-location">{{ _('Ubicazione') }}</label>
            <input class="form-control" id="station-location">
          </div>
          <div class="mb-3">
            <label class="form-label" for="station-notes">{{ _('Note') }}</label>
            <input class="form-control" id="station-notes">
          </div>
        </div>
        <div class="modal-footer">
          <button type="button" class="btn btn-primary" onclick="createStation()">{{ _('Crea') }}</button>
        </div>
      </div>
    </div>
  </div>
</div>
{% endblock %}

{% block extra_js %}
<script>
async function createStation() {
  const body = {
    code: document.getElementById('station-code').value,
    name: document.getElementById('station-name').value,
    location: document.getElementById('station-location').value || null,
    notes: document.getElementById('station-notes').value || null,
  };
  const resp = await fetch('/admin/api/stations', {
    method: 'POST',
    headers: {'Content-Type': 'application/json'},
    body: JSON.stringify(body),
  });
  if (resp.ok) window.location.reload();
}

async function deleteStation(id) {
  const resp = await fetch(`/admin/api/stations/${id}`, {method: 'DELETE'});
  if (resp.ok) window.location.reload();
}
</script>
{% endblock %}
```

- [ ] **Step 2: Manual smoke test**

Start client + server, login as admin, navigate to `/admin/stations`.
Expected: the page renders, the "Nuova Stazione" modal creates a new station, delete removes it.

- [ ] **Step 3: Commit**

```bash
git add client/templates/admin/stations.html
git commit -m "feat(client): add admin stations list and create modal"
```

---

## Task 12: Admin Template — Station Detail with Recipe Assignment

**Files:**
- Create: `client/templates/admin/station_detail.html`

- [ ] **Step 1: Write the template**

```html
{# client/templates/admin/station_detail.html #}
{% extends "base.html" %}
{% block content %}
<div class="container my-4">
  <a href="/admin/stations" class="d-inline-block mb-3">&larr; {{ _('Tutte le stazioni') }}</a>

  <h2>
    {{ station.name if station else '' }}
    <small class="text-muted">— {{ station.code if station else '' }}</small>
  </h2>

  <div class="row mt-4">
    {# Assigned recipes #}
    <div class="col-md-6">
      <h4>{{ _('Ricette assegnate') }}</h4>
      <ul class="list-group" id="assigned-list">
        {% for r in assigned_recipes %}
        <li class="list-group-item d-flex justify-content-between">
          {{ r.code }} – {{ r.name }}
          <button class="btn btn-sm btn-outline-danger" onclick="unassign({{ r.id }})">&times;</button>
        </li>
        {% else %}
        <li class="list-group-item text-muted">{{ _('Nessuna ricetta assegnata.') }}</li>
        {% endfor %}
      </ul>
    </div>

    {# Available recipes to assign #}
    <div class="col-md-6">
      <h4>{{ _('Ricette disponibili') }}</h4>
      <input class="form-control mb-2" id="recipe-filter" placeholder="{{ _('Filtra') }}" oninput="applyFilter()">
      <ul class="list-group" id="available-list">
        {% for r in all_recipes %}
        <li class="list-group-item d-flex justify-content-between" data-code="{{ r.code }} {{ r.name }}">
          {{ r.code }} – {{ r.name }}
          <button class="btn btn-sm btn-outline-primary" onclick="assign({{ r.id }})">+</button>
        </li>
        {% endfor %}
      </ul>
    </div>
  </div>
</div>
{% endblock %}

{% block extra_js %}
<script>
const stationId = {{ station.id if station else 'null' }};

async function assign(recipeId) {
  const resp = await fetch(`/admin/api/stations/${stationId}/recipes`, {
    method: 'POST',
    headers: {'Content-Type': 'application/json'},
    body: JSON.stringify({recipe_id: recipeId}),
  });
  if (resp.ok) window.location.reload();
}

async function unassign(recipeId) {
  const resp = await fetch(`/admin/api/stations/${stationId}/recipes/${recipeId}`, {method: 'DELETE'});
  if (resp.ok) window.location.reload();
}

function applyFilter() {
  const q = document.getElementById('recipe-filter').value.toLowerCase();
  document.querySelectorAll('#available-list li').forEach(li => {
    li.style.display = (li.dataset.code || '').toLowerCase().includes(q) ? '' : 'none';
  });
}
</script>
{% endblock %}
```

- [ ] **Step 2: Manual smoke test**

Navigate to `/admin/stations/<station_id>` as admin.
Expected: two panels, assign/unassign works without drag-and-drop, the filter narrows the available list.

- [ ] **Step 3: Commit**

```bash
git add client/templates/admin/station_detail.html
git commit -m "feat(client): add admin station detail with recipe assignment UI"
```

---

## Task 13: Navbar Link to Stations Admin

**Files:**
- Modify: `client/templates/components/navbar.html`

- [ ] **Step 1: Find the admin section of the navbar**

Open `client/templates/components/navbar.html`. Find the block that renders links visible only to `is_admin` users (there's probably already a "Utenti" / "Users" link).

- [ ] **Step 2: Add the Stations link**

Next to the existing admin links, add:

```html
{% if current_user.is_admin %}
<a class="nav-link" href="{{ url_for('admin.stations_page') }}">
  {{ _('Stazioni') }}
</a>
{% endif %}
```

Adapt the conditional syntax (`current_user.is_admin` vs `session.user.is_admin`) to match the existing navbar pattern.

- [ ] **Step 3: Manual smoke test**

Login as admin → verify navbar shows "Stazioni" link, clicking navigates to /admin/stations.

- [ ] **Step 4: Commit**

```bash
git add client/templates/components/navbar.html
git commit -m "feat(client): add Stazioni link in admin navbar"
```

---

## Task 14: i18n Catalog Update

**Files:**
- Modify: `client/translations/messages.pot`, `client/translations/it/LC_MESSAGES/messages.po`, `client/translations/en/LC_MESSAGES/messages.po`
- Run: `pybabel extract` + `pybabel update` + `pybabel compile`

- [ ] **Step 1: Extract strings**

```bash
cd client && pybabel extract -F babel.cfg -k _ -o translations/messages.pot .
+``` + +- [ ] **Step 2: Update per-locale .po files** + +```bash +cd client && pybabel update -i translations/messages.pot -d translations +``` + +- [ ] **Step 3: Translate new strings** + +Open `client/translations/it/LC_MESSAGES/messages.po` and `client/translations/en/LC_MESSAGES/messages.po`. Fill in translations for newly extracted strings added in Tasks 9, 11, 12, 13 (e.g. `Gestione Stazioni`, `Nuova Stazione`, `Stazione non configurata`, `Ricette assegnate`, etc.). + +- [ ] **Step 4: Compile** + +```bash +cd client && pybabel compile -d translations +``` + +- [ ] **Step 5: Manual smoke test — language switch** + +Switch language between IT and EN in the UI; verify Stations pages translate correctly. + +- [ ] **Step 6: Commit** + +```bash +git add client/translations/ +git commit -m "i18n: add translations for stations management UI" +``` + +--- + +## Task 15: End-to-End Integration Test + +**Files:** +- Create: `server/tests/test_stations_e2e.py` + +- [ ] **Step 1: Write the E2E scenario** + +```python +# server/tests/test_stations_e2e.py +"""End-to-end: two stations see only their own assigned recipes.""" +import pytest +from httpx import AsyncClient + +from tests.conftest import auth_headers, create_test_recipe + + +@pytest.mark.asyncio +async def test_two_stations_see_only_their_recipes( + client: AsyncClient, admin_user, measurement_tec_user, db_session, +): + r_a = await create_test_recipe(db_session, user_id=admin_user.id, code="REC-A") + r_b = await create_test_recipe(db_session, user_id=admin_user.id, code="REC-B") + r_shared = await create_test_recipe(db_session, user_id=admin_user.id, code="REC-S") + await db_session.commit() + + for code, name in [("ST-A", "Alfa"), ("ST-B", "Beta")]: + resp = await client.post( + "/api/stations", + headers=auth_headers(admin_user), + json={"code": code, "name": name}, + ) + assert resp.status_code == 201 + + list_resp = await client.get( + "/api/stations", headers=auth_headers(admin_user), + ) + by_code = 
{s["code"]: s for s in list_resp.json()} + + await client.post( + f"/api/stations/{by_code['ST-A']['id']}/recipes", + headers=auth_headers(admin_user), + json={"recipe_id": r_a.id}, + ) + await client.post( + f"/api/stations/{by_code['ST-A']['id']}/recipes", + headers=auth_headers(admin_user), + json={"recipe_id": r_shared.id}, + ) + await client.post( + f"/api/stations/{by_code['ST-B']['id']}/recipes", + headers=auth_headers(admin_user), + json={"recipe_id": r_b.id}, + ) + await client.post( + f"/api/stations/{by_code['ST-B']['id']}/recipes", + headers=auth_headers(admin_user), + json={"recipe_id": r_shared.id}, + ) + + ra = await client.get( + "/api/stations/by-code/ST-A/recipes", + headers=auth_headers(measurement_tec_user), + ) + rb = await client.get( + "/api/stations/by-code/ST-B/recipes", + headers=auth_headers(measurement_tec_user), + ) + assert ra.status_code == 200 and rb.status_code == 200 + codes_a = {r["code"] for r in ra.json()} + codes_b = {r["code"] for r in rb.json()} + assert codes_a == {"REC-A", "REC-S"} + assert codes_b == {"REC-B", "REC-S"} + assert "REC-B" not in codes_a + assert "REC-A" not in codes_b +``` + +- [ ] **Step 2: Run the test** + +Run: `cd server && pytest tests/test_stations_e2e.py -v` +Expected: passed. + +- [ ] **Step 3: Run full server suite** + +Run: `cd server && pytest -q` +Expected: all tests pass. + +- [ ] **Step 4: Commit** + +```bash +git add server/tests/test_stations_e2e.py +git commit -m "test(e2e): two stations see only assigned recipes" +``` + +--- + +## Task 16: Apply Migration Against Dev DB and Verify + +**Files:** none (operational) + +- [ ] **Step 1: Apply the migration** + +```bash +cd server && alembic -c migrations/alembic.ini upgrade head +``` +Expected: migration `002_add_stations` applied without errors. 
Check MySQL: +```sql +SHOW TABLES LIKE 'station%'; +-- expected: stations, station_recipe_assignments +``` + +- [ ] **Step 2: Re-run the demo setup to trigger the seed** + +With SETUP_PASSWORD set in `.env`, call `POST /api/setup/initialize` (via Swagger or curl). Expected response status 200. + +Verify: +```sql +SELECT code, name, active FROM stations; +-- expected: one row with ST-DEFAULT +SELECT COUNT(*) FROM station_recipe_assignments WHERE station_id = (SELECT id FROM stations WHERE code = 'ST-DEFAULT'); +-- expected: equal to COUNT(*) FROM recipes WHERE active = 1 +``` + +- [ ] **Step 3: Smoke test the client flow** + +1. Start dev stack: `docker compose -f docker-compose.dev.yml up -d` (assumes `STATION_CODE=ST-DEFAULT` in client `.env`). +2. Open browser → login as MeasurementTec → `/measure/select`. +3. Expected: recipes list, "Stazione: ST-DEFAULT" header visible. +4. Remove `STATION_CODE` from `.env`, restart client. +5. Expected: error page "Stazione non configurata" with 503. + +- [ ] **Step 4: Create a pilot station in admin UI** + +1. Login as admin → `/admin/stations` → Nuova Stazione → `ST-PILOT`, `Pilot`. +2. Click Dettagli → assign 1 recipe from the available list. +3. Change `STATION_CODE=ST-PILOT` in client `.env`, restart client. +4. Expected: select_recipe shows only that 1 recipe. + +- [ ] **Step 5: Downgrade check (rollback safety)** + +In a scratch environment: +```bash +cd server && alembic -c migrations/alembic.ini downgrade -1 +``` +Expected: both tables dropped without error. + +Then re-upgrade: +```bash +cd server && alembic -c migrations/alembic.ini upgrade head +``` + +- [ ] **Step 6: No commit required** (operational task). 
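The browser smoke tests above can also be scripted for repeatable checks. A minimal sketch using only the standard library; the base URL, the `ST-PILOT` code, and the `X-API-Key` header name are assumptions here — adapt them to the project's real auth scheme:

```python
import json
from urllib.request import Request, urlopen


def recipe_codes(payload):
    """Codes of the recipes in a /api/stations/by-code/{code}/recipes response."""
    return {r["code"] for r in payload}


def fetch_station_recipes(base_url, station_code, api_key):
    """Fetch the recipe list the client would see for one station.

    The X-API-Key header name is an assumption -- adapt to the project's
    real authentication scheme.
    """
    req = Request(
        f"{base_url}/api/stations/by-code/{station_code}/recipes",
        headers={"X-API-Key": api_key},
    )
    with urlopen(req, timeout=10) as resp:
        return json.load(resp)


# Usage against a running dev stack (not executed here):
#   codes = recipe_codes(fetch_station_recipes("http://localhost:8000", "ST-PILOT", key))
# After Step 4, a ST-PILOT client should list exactly these recipe codes.
```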

---

## Task 17: Documentation Update

**Files:**
- Modify: `CLAUDE.md`, `README.md`, `docs/API.md`, `docs/DEPLOYMENT.md`

- [ ] **Step 1: Update CLAUDE.md**

In the "Configurazione" section, add to the client env vars list:

```
- Client: CLIENT_HOST, CLIENT_PORT, CLIENT_SECRET_KEY, API_SERVER_URL, **STATION_CODE**
```

Add a new section right after "Client (Flask)" explaining the Station concept:

```
### Station Identity
Il client Flask identifica la propria stazione fisica tramite `STATION_CODE` nel `.env`.
All'avvio di `/measure/select` chiama `GET /api/stations/by-code/{code}/recipes` per ottenere
solo le ricette assegnate alla propria stazione. Se `STATION_CODE` manca, la pagina mostra
un errore 503 "Stazione non configurata".

Le stazioni e le assegnazioni ricetta↔stazione sono gestite dagli admin in `/admin/stations`.
Il seed iniziale crea una stazione `ST-DEFAULT` con tutte le ricette assegnate.
```

- [ ] **Step 2: Update README.md**

Add a sentence to the "Configurazione" (or equivalent) section documenting the new `STATION_CODE` variable required by the client.

- [ ] **Step 3: Update docs/API.md**

Add a "Stations" section covering the endpoints from Task 5 and their request/response schemas.

- [ ] **Step 4: Update docs/DEPLOYMENT.md**

In the client configuration section, make clear that every container/tablet deployment must set its own unique `STATION_CODE`.

- [ ] **Step 5: Commit**

```bash
git add CLAUDE.md README.md docs/API.md docs/DEPLOYMENT.md
git commit -m "docs: describe Station identity and admin management"
```

---

## Definition of Done (entire Phase 1)

- [ ] All new and pre-existing tests pass in `server` and `client` (`pytest -q` clean).
- [ ] Migration `002_add_stations` applied and its downgrade tested.
- [ ] The `ST-DEFAULT` seed creates assignments for every existing active recipe.
- [ ] Admins can create/edit/delete stations in `/admin/stations`.
- [ ] Admins can assign/remove recipes for each station in `/admin/stations/<station_id>`.
- [ ] A client with `STATION_CODE=ST-A` sees only ST-A's recipes; with `STATION_CODE=ST-B`, only ST-B's.
- [ ] A client without `STATION_CODE` shows a clear error page.
- [ ] New strings translated in IT + EN, catalog compiled.
- [ ] Documentation (`CLAUDE.md`, `README.md`, `API.md`, `DEPLOYMENT.md`) updated.

---

## Notes for the Executor

1. **Mandatory order:** Tasks 1 → 2 → 3 → 4 → 5 → 6 are sequential (each depends on the previous one). Tasks 7-13 (client side) can be done after Task 5. Tasks 14-17 close the phase.
2. **Commit rules:** one commit per completed task, never amend, format `feat(...)` / `test(...)` / `docs(...)` / `i18n(...)`. If a pre-commit hook fails, analyze and fix the cause; do not bypass it with `--no-verify`.
3. **Response envelope:** the V1.0.7 project does not use the `{success,data,error}` envelope. Stay consistent with the existing pattern (direct return) for all endpoints in this phase; any migration to an envelope is deferred to a global refactor task.
4. **Optional tablet identity header (M2):** for now we do not add a universal `X-Station-Id` header; filtering is explicit per path (`/by-code/...`). The header will be introduced in M2 if needed by audit middleware.
5. **Performance:** `list_station_recipes` joins two small tables with indexed FKs; no concern at current scale.
6. **Backward compatibility:** the original `/api/recipes` endpoints keep working unchanged; legacy clients are not broken.
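The join behind note 5 can be sanity-checked with a self-contained sketch (in-memory SQLite, simplified schema — the table and column names are assumed from this phase's models, and the SQL mirrors what `list_station_recipes` presumably executes via SQLAlchemy):

```python
import sqlite3

# Simplified schema: table/column names assumed from the Phase 1 models.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE stations (id INTEGER PRIMARY KEY, code TEXT UNIQUE);
CREATE TABLE recipes (id INTEGER PRIMARY KEY, code TEXT UNIQUE, active INTEGER);
CREATE TABLE station_recipe_assignments (
    id INTEGER PRIMARY KEY,
    station_id INTEGER REFERENCES stations(id),
    recipe_id INTEGER REFERENCES recipes(id)
);
CREATE INDEX ix_sra_station ON station_recipe_assignments(station_id);
INSERT INTO stations VALUES (1, 'ST-A'), (2, 'ST-B');
INSERT INTO recipes VALUES (1, 'REC-A', 1), (2, 'REC-B', 1), (3, 'REC-OLD', 0);
INSERT INTO station_recipe_assignments VALUES (1, 1, 1), (2, 2, 2), (3, 1, 3);
""")


def station_recipe_codes(conn, station_code):
    """Active recipes assigned to one station: the join behind list_station_recipes."""
    rows = conn.execute(
        """
        SELECT r.code
        FROM recipes r
        JOIN station_recipe_assignments a ON a.recipe_id = r.id
        JOIN stations s ON s.id = a.station_id
        WHERE s.code = ? AND r.active = 1
        """,
        (station_code,),
    ).fetchall()
    return {code for (code,) in rows}


print(station_recipe_codes(conn, "ST-A"))  # → {'REC-A'}: REC-OLD is assigned but inactive
```

With three rows per table the query plan is trivial; the FK indexes keep it cheap even as assignments grow.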