Risk Log



Known unknowns and assumptions that could impact the MVP design and implementation.

| ID | Risk | Probability | Impact | Mitigation |
|----|------|-------------|--------|------------|
| R1 | Debezium requires Kafka; Kafka may be too heavy for a quick MVP install | Medium | High | Consider Debezium Server with a direct HTTP sink, or the embedded Debezium Engine; alternatively use Maxwell's Daemon (lighter footprint) |
| R2 | On-prem LLM quality insufficient for meaningful summaries | Medium | Medium | Start with a cloud LLM option behind an allowlist; later add a local model (e.g. via Ollama) as an alternative |
| R3 | CDC connector performance impact on production MySQL (CPU, IO) | Medium | High | Recommend a read replica; test with synthetic load; use incremental snapshots; monitor replication lag |
| R4 | Schema evolution in source DB breaks event normalization | High | Medium | Design a flexible schema registry; version transformations; fall back to raw capture |
| R5 | Data privacy: raw DB changes may contain PII; LLM context may leak | High | High | Mask at capture time; redact sensitive columns before any LLM call; prefer an on-prem LLM for the pilot |
| R6 | Customer IT security rejects any software that needs elevated DB privileges | Medium | High | Document minimum required privileges (for Debezium's MySQL connector: SELECT, RELOAD, SHOW DATABASES, REPLICATION SLAVE, REPLICATION CLIENT); run from a read replica if possible |
| R7 | Multi-table transaction correlation is complex (eventual consistency, ordering) | Medium | Medium | Design transaction boundary detection (GTIDs or commit timestamps); tolerate some out-of-order delivery; document the limitations |
| R8 | MVP perceived as too complex to install (many moving parts) | High | High | Provide a single-command Docker Compose setup and a detailed installation guide; offer a hosted trial if needed |
| R9 | Synthetic data generator fails to produce realistic anomalies | Low | Medium | Build the generator with configurable anomaly patterns; let the customer tune them |
| R10 | Regulations (Peru data localization) may prohibit cloud LLM even with minimization | Medium | High | Default to on-prem LLM in MVP; architecture supports pluggable LLM backends |
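The R5 mitigation (redact sensitive columns before any LLM call) can be sketched as a small filter over a Debezium-style change event. This is a minimal illustration, not the MVP's actual redaction layer: the column names and the `before`/`after` event shape are assumptions for the example.

```python
# Sketch of R5: mask sensitive values in a row-change event before it
# reaches the LLM context. Column names here are illustrative only.
SENSITIVE_COLUMNS = {"email", "phone", "national_id", "card_number"}

def redact_event(event: dict) -> dict:
    """Return a copy of a change event with sensitive column values masked."""
    redacted = dict(event)
    for section in ("before", "after"):  # Debezium-style row images
        row = event.get(section)
        if row:
            redacted[section] = {
                col: ("[REDACTED]" if col in SENSITIVE_COLUMNS else val)
                for col, val in row.items()
            }
    return redacted

event = {
    "table": "customers",
    "op": "u",
    "before": {"id": 7, "email": "ana@example.pe", "city": "Lima"},
    "after": {"id": 7, "email": "ana@example.com", "city": "Lima"},
}
safe = redact_event(event)
print(safe["after"]["email"])  # [REDACTED]
```

In the real pipeline this step would sit at capture time (per the R5 row), so unmasked values never leave the database host.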
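The transaction boundary detection mentioned in R7 can be sketched as grouping change events by GTID and ordering by commit timestamp. The field names (`gtid`, `ts`) are assumptions, not a fixed wire format, and real GTID handling would also need set-interval parsing and gap tolerance.

```python
from collections import defaultdict

# Sketch of R7: correlate multi-table change events into transactions
# by a shared GTID, ordering within and across transactions by timestamp.
def group_by_transaction(events):
    """Group events sharing a GTID; return transactions ordered by first commit time."""
    txns = defaultdict(list)
    for ev in events:
        txns[ev["gtid"]].append(ev)
    return sorted(
        (sorted(evs, key=lambda e: e["ts"]) for evs in txns.values()),
        key=lambda evs: evs[0]["ts"],
    )

events = [
    {"gtid": "uuid:11", "table": "orders", "ts": 100},
    {"gtid": "uuid:12", "table": "orders", "ts": 105},
    {"gtid": "uuid:11", "table": "order_items", "ts": 101},
]
groups = group_by_transaction(events)
print([len(g) for g in groups])  # [2, 1]
```

Tolerating some out-of-order delivery (as the R7 row allows) would mean buffering events for a short window before closing a transaction group.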
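The configurable anomaly patterns from R9's mitigation might look like the sketch below: a generator that injects a tunable fraction of anomalous rows. The pattern names, rate parameter, and row shape are all illustrative assumptions.

```python
import random

# Sketch of R9: synthetic order generator with configurable anomaly
# patterns that a customer could tune (pattern names are hypothetical).
def generate_orders(n, anomaly_rate=0.05,
                    patterns=("negative_amount", "huge_amount"), seed=42):
    rng = random.Random(seed)  # deterministic for repeatable demos
    rows = []
    for i in range(n):
        amount = round(rng.uniform(10, 500), 2)
        anomaly = None
        if rng.random() < anomaly_rate:
            anomaly = rng.choice(patterns)
            if anomaly == "negative_amount":
                amount = -amount
            elif anomaly == "huge_amount":
                amount *= 1000
        rows.append({"order_id": i, "amount": amount, "anomaly": anomaly})
    return rows

rows = generate_orders(1000, anomaly_rate=0.1)
flagged = [r for r in rows if r["anomaly"]]
print(len(rows), len(flagged) > 0)
```

Exposing `anomaly_rate` and `patterns` as configuration is what lets the customer "tweak" realism, per the R9 row.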


Active Assumptions



These assumptions should be validated early.