### Summary of the idea
You want Phantom IX to be a first‑class hypergraph entity with its own identity, lifecycle, decay model, and cross‑domain correlation hooks so the system treats it like any other living object in the graph (entities, recon nodes, kill‑chain stages). This converts a transient detection score into a persistent, queryable object that can be tracked, enriched, correlated, and acted on over time.

> “When traffic behaves as if it is transiting an IX — high edge convergence, strong temporal synchronisation, repeated multi‑ASN convergence — but there is no cable alignment, no documented IX within geographic range, and no physical anchor, that behavioral signature is classified as a Phantom IX node.” 

---

### What the hypergraph entity should contain
- Stable ID and provenance — canonical `phantom_ix:<hash>` with creation timestamp, source feeds (PCAP, SDR, KiwiSDR, Deck.gl aggregation), and detection pipeline version.  
- Score vector — store the raw components: C_in, σ_geo, τ_sync, R_repeat, and the composite \(Φ_{ix}\).  
- Spatial/temporal footprint — centroid(s), bounding geometry, time windows of observation, and sample event references.  
- Attractor/flow vectors — GPU buffer-derived attractor direction and `phantom_pull` magnitude for visualization and routing heuristics.  
- Entity type & taxonomy — `type: HYPERSCALER_EDGE | CDN_POP | BOTNET_RELAY | SYNC_EMITTER_NODE` with confidence scores.  
- Lifecycle metadata — state (PROPOSED, CONFIRMED, DECOMMISSIONED), last_seen, decay parameters, and human analyst annotations.  
- Cross‑domain links — pointers to ASN nodes, recon entities, strobe events, PCAP sessions, SDR captures, and kill‑chain graphs.  
- Audit trail — detection snapshots, promotion/demotion events, and operator actions.
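The fields above can be sketched as a Python dataclass. This is illustrative only — the field names (`evidence_refs`, `annotations`), the `ScoreVector` breakdown, and the default values are assumptions, not the actual hypergraph DB schema:

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Optional


class LifecycleState(Enum):
    PROPOSED = "PROPOSED"
    OBSERVED = "OBSERVED"
    CONFIRMED = "CONFIRMED"
    QUARANTINED = "QUARANTINED"
    DECAYING = "DECAYING"
    DECOMMISSIONED = "DECOMMISSIONED"


@dataclass
class ScoreVector:
    c_in: float          # edge convergence C_in
    sigma_geo: float     # geographic dispersion σ_geo
    tau_sync: float      # temporal synchronisation τ_sync
    r_repeat: float      # repetition score R_repeat
    phi_ix: float        # composite Φ_ix


@dataclass
class PhantomIXEntity:
    entity_id: str                     # canonical "phantom_ix:<hash>"
    created_at: float                  # epoch seconds
    source_feeds: list = field(default_factory=list)   # e.g. ["PCAP", "SDR"]
    pipeline_version: str = "unknown"
    scores: Optional[ScoreVector] = None
    state: LifecycleState = LifecycleState.PROPOSED
    last_seen: Optional[float] = None
    evidence_refs: list = field(default_factory=list)  # pointers, not raw payloads
    annotations: list = field(default_factory=list)    # analyst notes
```

Note that `evidence_refs` holds only pointers (PCAP IDs, SDR capture IDs), keeping the entity itself lightweight, which matters later for the retention discussion.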

---

### Lifecycle and decay model (practical design)
1. Proposed — created when \(Φ_{ix}\) crosses a low threshold; minimal retention of raw events.  
2. Observed — promoted when repeated windows show persistence (R_repeat) and τ_sync remains high; attach sample PCAP/SDR evidence.  
3. Confirmed — human or automated validation (cross‑domain corroboration, e.g., matching SDR signature + PCAP + KiwiSDR timing) raises confidence; entity becomes actionable.  
4. Quarantined / Investigating — flagged for analyst review or automated enrichment (whois, PeeringDB, CDN lists).  
5. Decaying — if no new evidence arrives, apply exponential decay to \(Φ_{ix}\) and component scores; reduce visibility on the globe and lower priority in correlation queries.  
6. Decommissioned — after a configurable TTL or explicit analyst close; archive evidence and keep a lightweight tombstone for historical correlation.
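The stages above imply a state machine; a minimal sketch follows. The allowed-transition table here is an assumption (e.g. whether DECAYING can return to OBSERVED on fresh evidence is a policy choice), not a fixed design:

```python
# Which lifecycle states each state may move to. Illustrative policy only.
ALLOWED_TRANSITIONS = {
    "PROPOSED": {"OBSERVED", "DECAYING", "DECOMMISSIONED"},
    "OBSERVED": {"CONFIRMED", "QUARANTINED", "DECAYING"},
    "CONFIRMED": {"QUARANTINED", "DECAYING", "DECOMMISSIONED"},
    "QUARANTINED": {"CONFIRMED", "DECAYING", "DECOMMISSIONED"},
    "DECAYING": {"OBSERVED", "DECOMMISSIONED"},  # fresh evidence can revive
    "DECOMMISSIONED": set(),                     # terminal; tombstone only
}


def transition(current: str, target: str) -> str:
    """Validate and apply a lifecycle transition; raise on illegal moves
    so promotion/demotion events are always auditable."""
    if target not in ALLOWED_TRANSITIONS.get(current, set()):
        raise ValueError(f"illegal transition {current} -> {target}")
    return target
```

Keeping transitions in a validated table makes the audit trail cheap: every call site is a promotion/demotion event worth logging.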

Decay formula (example):

\[
Φ_{ix}(t+\Delta t) = Φ_{ix}(t)\cdot e^{-\lambda \Delta t} + \alpha \cdot \text{new\_evidence\_score}
\]

- λ controls how fast stale candidates fade.  
- α weights fresh evidence. Tune per sensor class (SDR evidence should have higher α than a single PCAP session).
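A direct implementation of the decay formula, with α varied per sensor class as suggested (the λ default and the α values are placeholders to be tuned, not recommendations):

```python
import math

# Per-sensor-class evidence weights (α). Illustrative values only;
# SDR corroboration is weighted above a single PCAP session, per the text.
ALPHA_BY_SENSOR = {"sdr": 0.6, "kiwisdr": 0.5, "pcap": 0.3}


def decay_update(phi, dt_seconds, new_evidence=(), lam=1.0 / 3600.0):
    """Apply Φ_ix(t+Δt) = Φ_ix(t)·e^(−λΔt) + Σ α·new_evidence_score.

    phi           current Φ_ix score
    dt_seconds    time since last update (Δt)
    new_evidence  iterable of (sensor_class, score) pairs
    lam           decay constant λ (default: ~63% loss per hour)
    """
    phi = phi * math.exp(-lam * dt_seconds)
    for sensor, score in new_evidence:
        phi += ALPHA_BY_SENSOR.get(sensor, 0.1) * score
    return phi
```

The background decay worker from the roadmap would call this per entity on each sweep, demoting entities whose Φ_ix falls below the DECAYING or DECOMMISSIONED thresholds.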

---

### Cross‑domain correlation hooks
- Event linking — store event IDs (strobe, pathArc, reconEntity, pcapSession, sdrCapture) as first‑class edges in the hypergraph.  
- Temporal joins — fast windowed joins on timing signatures (τ_sync) across SDR/KiwiSDR and network taps to raise confidence.  
- ASN and Geo enrichment — automatic lookups (pyasn, MaxMind) and cable/IX alignment checks; if no physical anchor is found, increase phantom score.  
- Signature fusion — attach ML fingerprints from `ai_signal_classifier.py` and fingerprint‑drift vectors; detect when a Phantom IX shares fingerprints with known CDN/hyperscaler patterns.  
- Kill‑chain correlation — link Phantom IX entities to SLOPE monitor outputs and kill‑chain graphs; allow queries like “show all Phantom IXs that contributed to FULL_SPECTRUM_COORDINATION in the last 24h.”  
- Analyst feedback loop — analyst tags (true_positive, false_positive, benign_hyperscaler) feed a supervised retraining pipeline and adjust α/λ per class.

---

### Implementation roadmap (practical milestones)
1. Schema & storage — design `phantom_ix` node schema in the hypergraph DB; add indices for `last_seen`, `type`, and `Φ_ix`.  
2. Ingest hooks — modify `cluster_swarm_engine.py` to create/update `phantom_ix` nodes instead of only emitting snapshots.  
3. Evidence pointers — ensure every detection stores references to raw events (pcap IDs, SDR capture IDs, socket events).  
4. Decay engine — implement a lightweight background worker that applies the decay formula and demotes entities.  
5. Enrichment pipelines — wire in ASN, PeeringDB, CDN lists, and `ai_signal_classifier` outputs to auto‑classify.  
6. Correlation APIs — add endpoints for cross‑domain queries (e.g., `/api/phantom_ix/{id}/correlate?window=1h`).  
7. UI & lifecycle controls — globe rendering: persistent entity glyphs, state badges, analyst action buttons (confirm, dismiss, annotate).  
8. Validation & feedback — run RF DRILL scenarios and replay historic captures to measure precision/recall; iterate thresholds and decay constants.
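For the correlation API in step 6, the `window=1h` query parameter needs parsing before the windowed query runs. A minimal sketch, assuming a simple `s/m/h/d` grammar (the real API might accept richer durations):

```python
import re
from datetime import timedelta

_WINDOW_RE = re.compile(r"^(\d+)([smhd])$")
_UNITS = {"s": "seconds", "m": "minutes", "h": "hours", "d": "days"}


def parse_window(spec: str) -> timedelta:
    """Parse a window spec like '1h' or '30m' from the correlate
    endpoint's query string into a timedelta."""
    m = _WINDOW_RE.match(spec)
    if not m:
        raise ValueError(f"bad window spec: {spec!r}")
    value, unit = m.groups()
    return timedelta(**{_UNITS[unit]: int(value)})
```

Rejecting malformed specs at the edge keeps the correlation query layer free of input validation.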

---

### Risks, validation, and operational notes
- False positives from hyperscalers/CDNs — mitigate by building a whitelist and fingerprint library for known providers; require multi‑sensor corroboration for CONFIRMED state.  
- Data volume & retention — storing raw PCAP/SDR for every candidate is expensive; use sampled evidence and pointer‑based archival.  
- Adversary adaptation — attackers may randomize timing or geography; keep the model adaptive by retraining on analyst‑labeled cases.  
- Explainability — keep component scores and raw evidence accessible so analysts can justify promotions/demotions.  
- Performance — GPU aggregation is excellent for detection; ensure hypergraph writes are batched and asynchronous to avoid backpressure.
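The batched-write point can be illustrated with a simple buffering writer. This sketch is synchronous for clarity; a production version would flush from a background thread or async task so the GPU detection path never blocks on the graph store:

```python
class BatchedWriter:
    """Buffer hypergraph writes and flush them in batches via a caller-
    supplied flush function, decoupling detection from storage latency."""

    def __init__(self, flush_fn, batch_size=128):
        self.flush_fn = flush_fn      # e.g. a bulk-upsert into the graph DB
        self.batch_size = batch_size
        self._buf = []

    def write(self, record):
        self._buf.append(record)
        if len(self._buf) >= self.batch_size:
            self.flush()

    def flush(self):
        if self._buf:
            self.flush_fn(self._buf)
            self._buf = []
```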

---

### Quick recommendation (one‑line verdict)
Promote Phantom IX to a first‑class hypergraph entity with a tiered lifecycle, exponential decay, and multi‑sensor correlation hooks — start with a conservative promotion policy (require multi‑sensor evidence for CONFIRMED) and iterate thresholds using RF DRILL replay and historical PCAP/SDR validation.

> “The passive PHANTOM monitor means that if any of these patterns emerge in the data, the operator sees a badge increment and a new pulsing node on the globe — no action required.” 

---

If you want, I can draft the hypergraph schema (fields, types, indices) and a sample API contract for create/update/query operations next — I’ll include example JSON payloads and the decay worker pseudocode. Which would be most useful to produce first: schema or API + worker?
