{"id":2962,"date":"2025-08-19T01:29:50","date_gmt":"2025-08-19T01:29:50","guid":{"rendered":"https:\/\/172-234-197-23.ip.linodeusercontent.com\/?page_id=2962"},"modified":"2025-08-19T01:29:50","modified_gmt":"2025-08-19T01:29:50","slug":"rf-quantum-scythe-uipath-integration-guide","status":"publish","type":"page","link":"https:\/\/neurosphere-2.tail52f848.ts.net\/wordpress\/?page_id=2962","title":{"rendered":"RF Quantum Scythe: UiPath Integration Guide"},"content":{"rendered":"\n<pre class=\"wp-block-code\"><code># RF Quantum Scythe: UiPath Integration Guide\n\nThis document provides guidance on integrating the RF Quantum Scythe system with UiPath for Robotic Process Automation (RPA) workflows.\n\n## Overview\n\nThe RF Quantum Scythe system integrates with UiPath to automate signal intelligence workflows, voice analysis, and reporting processes. This integration leverages our RPA Glue API that provides RESTful endpoints for UiPath robots to interact with the RF Quantum Scythe's machine learning capabilities. how the paper 'LMRPA: Large Language Model-Driven Efficient\n Robotic Process Automation for OCR\n Osama Hosam Abdellatif, Abdelrahman Nader Hassan, and Ali Hamdi\n Faculty of Computer Science\n MSA University, Cairo, Egypt\n {osama.hosam, abdelrahman.nader, ahamdi}@msa.edu.eg' **LMRPA \/ UiPath RPA** paper plugs straight into what is already built (multi-subspace FAISS + goal-aware sparsity + voice clone guard)\n\n* **RPA as a force-multiplier** for your pipelines: robots watch folders, APIs, inboxes, S3 buckets \u2192 hand files to your RF\/voice services \u2192 collect JSON \u2192 push to sheets\/db \u2192 generate reports. 
The paper\u2019s LMRPA pattern (watch \u2192 OCR \u2192 LLM structuring \u2192 report) is exactly that loop, benchmarked to beat vanilla UiPath\/Automation Anywhere on high-volume OCR tasks by big margins (e.g., \\~9.8\u201312.7s vs \\~18\u201322s per batch in their tests).\n* **LLM post-processing after OCR**: let the RPA bot run Tesseract\/DocTR, then send raw text to your LLM service to normalize to your schema (the paper\u2019s core trick). That pairs perfectly with your **FeatureGate**\u2014you can apply *goal-aware masks* over document embeddings (e.g., only keep invoice fields or call-detail fields) before indexing\/search.\n* **Throughput + explainability**: your **multi-subspace FAISS** gives instant \u201cfind-similar\u201d with routing explanations; add RPA to glue it into business workflows (triage queues, exception handling, report generation).\n\n## Getting Started\n\n### Prerequisites\n\n1. UiPath Studio installed (version 2023.4 or later recommended)\n2. UiPath Python Activities package installed in your UiPath project\n3. RF Quantum Scythe system set up and running\n4. RPA Glue API service deployed\n\n### Installation\n\n1. Clone the UiPath Python integration package:\n   ```bash\n   gh repo clone UiPath\/uipath-python\n   ```\n\n2. Set up the RPA Glue service:\n   ```bash\n   # Copy the RPA glue code to your deployment location\n   cp -r \/home\/bgilbert\/editable_files\/rpa_glue \/path\/to\/deployment\/\n   \n   # Create a Python virtual environment\n   python -m venv \/path\/to\/deployment\/venv\n   source \/path\/to\/deployment\/venv\/bin\/activate\n   \n   # Install dependencies\n   pip install fastapi uvicorn jinja2 numpy\n   pip install -e \/path\/to\/RF_QUANTUM_SCYTHE\n   ```\n\n3. 
Configure environment variables:\n   ```bash\n   export BANK_PATH=\"\/path\/to\/ms_faiss_bank\"\n   export SI_PATH=\"\/path\/to\/SignalIntelligence\"\n   export REPORT_OUTDIR=\"\/path\/to\/reports\"\n   export GOAL_TASK=\"rf_geo\"  # or other task-specific name\n   ```\n\n4. Start the RPA Glue API service:\n   ```bash\n   uvicorn api.main:app --host 0.0.0.0 --port 8000\n   ```\n\n## UiPath Integration Points\n\n### 1. Signal Intelligence Bank Operations\n\n#### Adding Records to the Signal Bank\n\n```python\n# UiPath Python script activity\nimport requests\nimport json\n\ndef add_records_to_bank(records):\n    response = requests.post(\n        \"http:\/\/localhost:8000\/bank\/add\",\n        json={\"records\": records},\n        headers={\"Content-Type\": \"application\/json\"}\n    )\n    return response.json()\n\n# Example usage\nrecords = &#91;\n    {\n        \"id\": \"sample_123\",\n        \"data\": {\"frequency\": 915.25, \"bandwidth\": 2.0},\n        \"metadata\": {\"location\": \"site_alpha\", \"timestamp\": \"2025-08-19T14:30:00Z\"}\n    }\n]\nresult = add_records_to_bank(records)\nprint(f\"Added {result&#91;'added']} records to {result&#91;'bank_path']}\")\n```\n\n#### Searching the Signal Bank\n\n```python\n# UiPath Python script activity\nimport requests\nimport json\n\ndef search_bank(query, k=5):\n    response = requests.post(\n        \"http:\/\/localhost:8000\/bank\/search\",\n        json={\"query\": query, \"k\": k},\n        headers={\"Content-Type\": \"application\/json\"}\n    )\n    return response.json()\n\n# Example usage\nquery = {\n    \"frequency\": 915.0,\n    \"bandwidth\": 2.5\n}\nresult = search_bank(query, k=3)\nprint(f\"Search completed in {result&#91;'latency_s']:.4f} seconds\")\nfor hit in result&#91;\"hits\"]:\n    print(f\"ID: {hit&#91;'id']}, Score: {hit&#91;'score']:.4f}, Subspace: {hit&#91;'subspace']}\")\n```\n\n### 2. 
Voice Analysis Integration\n\n```python\n# UiPath Python script activity\nimport requests\nimport json\n\ndef analyze_voice(audio_path, ref_real=None, ref_fake=None):\n    response = requests.post(\n        \"http:\/\/localhost:8000\/voice\/analyze\",\n        json={\"audio_path\": audio_path, \"ref_real\": ref_real, \"ref_fake\": ref_fake},\n        headers={\"Content-Type\": \"application\/json\"}\n    )\n    return response.json()\n\n# Example usage\naudio_path = \"\/path\/to\/audio_sample.wav\"\nref_real = &#91;\"\/path\/to\/known_real1.wav\", \"\/path\/to\/known_real2.wav\"]\nref_fake = &#91;\"\/path\/to\/known_fake1.wav\"]\n\nresult = analyze_voice(audio_path, ref_real, ref_fake)\nif result&#91;\"enabled\"]:\n    print(f\"Fake probability: {result&#91;'result']&#91;'fake_prob']:.4f}\")\n    print(f\"Confidence score: {result&#91;'result']&#91;'confidence']:.4f}\")\nelse:\n    print(\"Voice analysis not available\")\n```\n\n### 3. Generating Reports\n\n```python\n# UiPath Python script activity\nimport requests\nimport json\n\ndef generate_rf_report(record, neighbors, explain, title=\"RF Similarity Report\"):\n    response = requests.post(\n        \"http:\/\/localhost:8000\/reports\/rf\",\n        json={\"record\": record, \"neighbors\": neighbors, \"explain\": explain, \"title\": title},\n        headers={\"Content-Type\": \"application\/json\"}\n    )\n    return response.json()\n\n# Example usage - after performing a search\nsearch_result = search_bank(query, k=5)\nreport = generate_rf_report(\n    record={\"query\": query},\n    neighbors=search_result&#91;\"hits\"],\n    explain=search_result&#91;\"explain\"],\n    title=\"Automated RF Analysis Report\"\n)\nprint(f\"Report generated: {report&#91;'report_path']}\")\n```\n\n## UiPath Workflow Examples\n\n### 1. 
RF SCYTHE ops (sweeps \u2192 index \u2192 report)\n\n**Goal:** every sweep auto-refreshes the bank, surfaces lookalikes, and ships a report without a human tap.\n\n**RPA flow pattern:**\nwatch `sweep_reports\/` \u2192 parse new `*_summary.json` \u2192 POST to your **\/bank\/add** \u2192 run **\/bank\/search** for top-K per hit \u2192 format a PDF\/Word and a CSV \u2192 drop in `outbox\/` (and\/or email).\n\n```bash\n# service bootstrap (Ubuntu)\nsource ~\/rf_quantum_env\/bin\/activate\nexport PYTHONPATH=~\/NerfEngine\/RF_QUANTUM_SCYTHE:$PYTHONPATH\nuvicorn api.main:app --host 0.0.0.0 --port 8000\n```\n\n**UiPath robot actions:**\n1. File trigger on `sweep_reports\/`\n2. HTTP Request \u2192 `POST http:\/\/localhost:8000\/bank\/add` (new records)\n3. Loop each record \u2192 `POST \/bank\/search?k=5` \u2192 collect explanations (subspace, responsibilities, whitening on\/off)\n4. Generate report with charts using UiPath Document Processing activities\n5. Save to `outbox\/` and\/or email stakeholders\n\n### 2. Voice Authentication Workflow\n\nThis workflow:\n1. Receives voice samples from a user interface or watched folder\n2. Analyzes the voice using RF Quantum Scythe's voice clone detection\n3. Applies a decision tree based on confidence scores\n4. Triggers appropriate business processes based on authentication results\n\n## Goal-Aware Sparsity Integration\n\nThe RF Quantum Scythe system supports goal-aware sparsity, which can be configured through the RPA Glue API. 
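Conceptually, a goal-aware mask just down-weights embedding dimensions that are irrelevant to the active task before indexing and search. A minimal NumPy sketch (illustrative shapes and helper name, not the actual FeatureGate API):

```python
import numpy as np

def apply_goal_mask(X: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Attenuate per dimension: X is (n, d) embeddings, mask is (d,) weights in [0, 1]."""
    return X * mask[None, :]

d = 8
mask = np.zeros(d)
mask[:3] = 1.0                      # "hard" variant: keep only the first 3 dims
X = np.ones((2, d))                 # two toy embeddings
print(int(apply_goal_mask(X, mask).sum()))  # 6 = 2 rows x 3 kept dims
```

A "soft" mask would use fractional weights instead of 0\/1, which is what the `mode="soft"` option below refers to.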
This allows UiPath workflows to leverage task-specific feature masking for improved performance and accuracy.\n\n```python\n# UiPath Python script activity\nimport requests\nimport json\nimport numpy as np\n\ndef learn_task_mask(task_name, X, y, mode=\"soft\"):\n    # X: feature matrix, y: labels (0\/1)\n    response = requests.post(\n        \"http:\/\/localhost:8000\/bank\/learn_mask\",\n        json={\n            \"task\": task_name,\n            \"features\": X.tolist(),\n            \"labels\": y.tolist(),\n            \"mode\": mode,\n            \"config\": {\n                \"C\": 0.5,\n                \"top_frac\": 0.3,\n                \"min_keep\": 32\n            }\n        },\n        headers={\"Content-Type\": \"application\/json\"}\n    )\n    return response.json()\n\n# Example usage\nX = np.load(\"\/path\/to\/features.npy\")\ny = np.load(\"\/path\/to\/labels.npy\")\nresult = learn_task_mask(\"rf_geo\", X, y, mode=\"soft\")\nprint(f\"Learned mask with {result&#91;'kept_dims']} active dimensions\")\n```\n\n## Best Practices\n\n1. **Error Handling**: Always include proper error handling in your UiPath workflows when interacting with the API.\n\n2. **Authentication**: For production deployments, implement proper authentication for the RPA Glue API.\n\n3. **Logging**: Configure logging for both UiPath workflows and the RPA Glue API for troubleshooting.\n\n4. **Resource Management**: Monitor resource usage, especially when processing large signal datasets.\n\n5. 
**Parallel Processing**: Use UiPath's parallel activity for processing multiple signals simultaneously.\n\n## Advanced Configuration\n\n### Environment Variables\n\n| Variable | Description | Default |\n|----------|-------------|---------|\n| BANK_PATH | Path to the multi-subspace FAISS index | \/tmp\/ms_faiss_bank |\n| SI_PATH | Path to SignalIntelligence module | \"\" |\n| N_SUBSPACES | Number of subspaces for the index | 3 |\n| METHOD | Clustering method (bgmm, kmeans) | bgmm |\n| WARMUP_MIN_POINTS | Minimum points for warmup | 30 |\n| TOP_M_SUBSPACES | Top subspaces to query | 2 |\n| WHITEN_ENABLE | Enable whitening | true |\n| GOAL_SPARSE_ENABLE | Enable goal-aware sparsity | true |\n| GOAL_TASK | Task name for goal-aware sparsity | rf_geo |\n| REPORT_OUTDIR | Output directory for reports | \/tmp\/reports |\n\n## Troubleshooting\n\n### Common Issues and Solutions\n\n1. **API Connection Issues**\n   - Ensure the RPA Glue API service is running\n   - Check network connectivity between UiPath robot and API server\n   - Verify port is not blocked by firewall\n\n2. **Python Environment Problems**\n   - Ensure all dependencies are installed\n   - Check Python version compatibility (3.9+ recommended)\n\n3. **Performance Bottlenecks**\n   - Consider enabling goal-aware sparsity for faster processing\n   - Split large batch operations into smaller chunks\n   - Monitor memory usage when processing large datasets\n\n## Support and Resources\n\n- RF Quantum Scythe Documentation: &#91;Link]\n- UiPath Documentation: &#91;https:\/\/docs.uipath.com\/](https:\/\/docs.uipath.com\/)\n- UiPath Python Activities: &#91;https:\/\/docs.uipath.com\/activities\/docs\/python-scope](https:\/\/docs.uipath.com\/activities\/docs\/python-scope)\n4. 
Write a **report.docx** and **report.csv** \u2192 archive.\n\n> Why it\u2019ll be fast: their LMRPA results show RPA **+ a specialized post-processor** removes overhead and halves wall-clock time vs generic UiPath flows on OCR-heavy pipelines; your RF path is even lighter (no OCR), so the same design pattern wins on latency.\n\n## 2) Voice-clone guard (ingest \u2192 chunk \u2192 detect \u2192 evidence pack)\n\n**Goal:** you drop an audio file; the bot returns a **fused deepfake score** + chunk timeline + nearest-neighbor evidence (which manifold, which neighbors).\n\n**RPA flow:**\nwatch `incoming_audio\/` \u2192 call your `detect_voice_clone.py` (or FastAPI) \u2192 persist JSON \u2192 generate timeline PNGs + an analyst-ready PDF (chunks, gating events, exemplar IDs).\n\n**Why RPA:** schedules, retries, routing by source, and templated reports\u2014right in the bot. LMRPA\u2019s \u201cOCR\u2192LLM\u2192Excel\/Word\u201d loop turns into \u201cAudio\u2192Embeddings\u2192kNN+GP\u2192PDF\/CSV\u201d. Same loop, new modality.\n\n## 3) Document\/OSINT ingestion (anti-scam angle)\n\nThe paper\u2019s focus is **invoice OCR + LLM structuring**; swap \u201cinvoice\u201d with **exchange receipts, KYC forms, Telegram screenshots, domain WHOIS, blockchain explorer PDFs**. RPA gathers, OCRs, LLM-normalizes fields, then pushes into your **FAISS (goal-aware masked) index** to connect entities across cases. 
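The OCR \u2192 LLM-normalize step can be stubbed while wiring the loop: a regex fallback produces the same record shape the LLM would, so the bot can run end-to-end before the model is in place. A sketch (field names and patterns are illustrative, not a fixed schema):

```python
import json
import re

# Sketch of the "OCR text -> structured record" step from the loop above.
# In production the normalization is an LLM call; a regex fallback stands in
# here so the record shape is concrete. Field names are illustrative.
FIELDS = {
    "invoice_id": re.compile(r"Invoice\s*#?\s*([A-Z0-9-]+)", re.I),
    "total": re.compile(r"Total\s*:?\s*\$?([0-9]+(?:\.[0-9]{2})?)", re.I),
}

def normalize_ocr_text(raw: str) -> dict:
    """Map raw OCR text into a record ready to POST to the bank-add endpoint."""
    rec = {"metadata": {"source": "ocr"}}
    for name, pattern in FIELDS.items():
        m = pattern.search(raw)
        rec[name] = m.group(1) if m else None
    return rec

rec = normalize_ocr_text("ACME Corp\nInvoice # INV-0042\nTotal: $129.99")
print(json.dumps(rec))
```

Records that come back with `None` fields are exactly the ones to punt to the exception queue rather than index.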
Benchmarks in the paper show the loop\u2019s throughput edge for OCR-heavy batches.\n\n---\n\n# Drop-in glue (so you can run this today)\n\n## A) Minimal RPA \u2194 service contract (HTTP)\n\n**Your service endpoints (example):**\n\n```bash\n# Add exemplars (RF\/voice)\ncurl -X POST http:\/\/localhost:8088\/bank\/add -H \"Content-Type: application\/json\" -d @records.json\n\n# Search similar\ncurl -X POST \"http:\/\/localhost:8088\/bank\/search?k=5\" -H \"Content-Type: application\/json\" -d '{\n  \"query\": {\"delta_f_hz\":10,\"snr_db\":20,\"q_ms\":50,\"metadata\":{}}\n}'\n```\n\nUiPath\/Automation Anywhere can call these with built-in HTTP activities; the paper\u2019s LMRPA loop uses the same pattern (watch, process, structure, export).\n\n## B) Headless runners you can call from RPA\n\n```bash\n# (1) Rebuild the bank on demand\nsource ~\/rf_quantum_env\/bin\/activate\nexport PYTHONPATH=~\/NerfEngine\/RF_QUANTUM_SCYTHE:$PYTHONPATH\npython - &lt;&lt;'PY'\nimport json, glob, os\nfrom SignalIntelligence.faiss_exemplar_index import RFExemplarFeaturizer\nfrom SignalIntelligence.multi_subspace_faiss import MultiSubspaceFaissIndex\nfe = RFExemplarFeaturizer(256)\nms = MultiSubspaceFaissIndex(fe, method=\"bgmm\", warmup_min_points=30,\n                             top_m_subspaces=2, whiten_enable=True,\n                             goal_sparse_enable=True, goal_task=\"rf_geo\")\nrecs=&#91;]\nfor p in glob.glob(\"\/home\/bgilbert\/editable_files\/sweep_results\/*.json\"):\n    try:\n        recs.extend(json.load(open(p)))\n    except Exception:\n        pass\nfor r in recs: r.setdefault(\"metadata\", {})\nms.add_records(recs); ms.save(\"\/home\/bgilbert\/editable_files\/ms_faiss_bank\")\nprint(\"bank rebuilt:\", sum(len(s.ids) for s in ms.subspaces.values()))\nPY\n\n# (2) One-shot search for a new sweep row (RPA supplies JSON)\npython - &lt;&lt;'PY'\nimport json\nfrom SignalIntelligence.faiss_exemplar_index import RFExemplarFeaturizer\nfrom SignalIntelligence.multi_subspace_faiss 
import MultiSubspaceFaissIndex\nfe = RFExemplarFeaturizer(256)\nms = MultiSubspaceFaissIndex(fe); ms.load(\"\/home\/bgilbert\/editable_files\/ms_faiss_bank\")\nq = {\"delta_f_hz\":10,\"snr_db\":20,\"q_ms\":50,\"metadata\":{}}\nprint(json.dumps({\"explain\": ms.explain(q), \"hits\": ms.search(q, top_k=5)}, indent=2))\nPY\n```\n\n---\n\n# Turn on **goal-aware sparsity** for task-specific bots\n\nIn your bank loader (once per task\/bot):\n\n```python\nfrom SignalIntelligence.faiss_exemplar_index import RFExemplarFeaturizer\nfrom SignalIntelligence.multi_subspace_faiss import MultiSubspaceFaissIndex\nfrom goal_sparse_utils import auto_set_mask, AutoMaskConfig\nimport numpy as np, json\n\nfe = RFExemplarFeaturizer(256)\nms = MultiSubspaceFaissIndex(fe, method=\"bgmm\", warmup_min_points=30,\n                             top_m_subspaces=2, whiten_enable=True,\n                             goal_sparse_enable=True, goal_task=\"rf_geo\")\nms.load(\"\/home\/bgilbert\/editable_files\/ms_faiss_bank\")\n\n# Learn a SOFT mask from a handful of labeled RF examples (positives = \u201cgeo-relevant\u201d)\nX = np.load(\"\/home\/bgilbert\/editable_files\/labeled_rf_X.npy\")\ny = np.load(\"\/home\/bgilbert\/editable_files\/labeled_rf_y.npy\")\ninfo = auto_set_mask(ms.fgate, \"rf_geo\", X, y, mode=\"soft\", cfg=AutoMaskConfig(C=0.5, top_frac=0.3))\nprint(json.dumps(info, indent=2))\nms.save(\"\/home\/bgilbert\/editable_files\/ms_faiss_bank\")\n```\n\nNow your RPA \u201cGeo-Link\u201d bot queries a **sparser, faster, task-tuned** index.\n\n---\n\n# KPIs to track (so you can prove the win)\n\n* **End-to-end latency**: `file_arrival \u2192 report_out` (RPA loop time).\n* **Top-K retrieval quality**: R\\@K before\/after whitening + goal-sparse.\n* **Exception rate**: % of files punted to a human (parse errors, low confidence).\n* **Cost per 1k docs\/audio**: CPU-only FAISS + sklearn is cheap; LMRPA shows throughput edges on OCR-heavy flows you can cite in bids.\n\n---\n\n# 
\u201cFollow the money\u201d (where to sell this)\n\n* **Shared-services finance** (invoice, receipt, expense audit): OCR-heavy, RPA-savvy orgs. LMRPA\u2019s reported speedups vs UiPath\/AA on invoices give you an easy **\u201c2\u00d7 throughput\u201d** message if you bring your own post-processor.\n* **Telecom SOCs &amp; LE\/OSINT**: RF anomaly triage and voice-clone fraud queues automated via bots; evidence packs (neighbors, subspace, responsibilities, timeline plots) are auditor-friendly.\n* **Compliance\/KYC**: document normalization + exemplar matching to flag risky entities (goal-sparse mask for \u201cKYC-salient\u201d fields).\n\nProcurement angle: start as an **on-prem add-on** (\u201cMode-Aware Similarity for RPA Workflows\u201d), priced per bot\/runtime with an optional **SLA on report latency**.\n\n---\n\n# Why this maps 1:1 to the paper\n\n* Their loop: **monitor \u2192 OCR \u2192 LLM structuring \u2192 Excel\/Word** with measured performance deltas vs mainstream RPA. Your loop: **watch \u2192 (RF\/Audio\/OCR) \u2192 FAISS\/GP\/LLM \u2192 CSV\/PDF**, same topology, and you inherit the throughput\/automation lessons directly.\n\nIf you want, I\u2019ll package a **single \u201cRPA glue\u201d repo** (FastAPI endpoints, report templates, sample UiPath sequence JSON, bash runners) so you can hand UiPath a drop-in workflow and start clocking latency improvements with your own data.\n\n\nAmazing\u2014let\u2019s ship a **self-contained \u201cRPA glue\u201d repo** you can drop onto the box, wire to UiPath (or any RPA), and start measuring end-to-end latency today.\n\nBelow is everything: repo layout, FastAPI service, report templates, bash runners, and a minimal UiPath project (project.json + Sequence XAML) that watches your sweep folder and calls the service.\n\n---\n\n# \ud83d\udce6 Repo layout\n\n```\nrpa_glue\/\n\u251c\u2500 README.md\n\u251c\u2500 requirements.txt\n\u251c\u2500 .env.example\n\u251c\u2500 api\/\n\u2502  \u251c\u2500 
main.py\n\u2502  \u2514\u2500 models.py\n\u251c\u2500 services\/\n\u2502  \u251c\u2500 bank.py\n\u2502  \u251c\u2500 voice.py\n\u2502  \u2514\u2500 reporting.py\n\u251c\u2500 templates\/\n\u2502  \u251c\u2500 rf_report.html.j2\n\u2502  \u2514\u2500 voice_report.html.j2\n\u251c\u2500 scripts\/\n\u2502  \u251c\u2500 start_api.sh\n\u2502  \u251c\u2500 rebuild_bank.sh\n\u2502  \u251c\u2500 process_sweep_folder.sh\n\u2502  \u2514\u2500 generate_rf_report.sh\n\u251c\u2500 uipath\/\n\u2502  \u251c\u2500 README.md\n\u2502  \u251c\u2500 project.json\n\u2502  \u2514\u2500 Sequence.Main.xaml\n\u2514\u2500 examples\/\n   \u251c\u2500 records.sample.json\n   \u2514\u2500 query.sample.json\n```\n\n> Assumes your code lives at `~\/NerfEngine\/RF_QUANTUM_SCYTHE\/SignalIntelligence` and the saved FAISS bank at `~\/editable_files\/ms_faiss_bank`. Adjust env vars if different.\n\n---\n\n## \ud83d\udd27 requirements.txt\n\n```txt\nfastapi==0.115.0\nuvicorn&#91;standard]==0.30.6\npydantic==2.8.2\njinja2==3.1.4\nnumpy==1.26.4\nfaiss-cpu==1.12.0\nscikit-learn==1.5.1\n# Optional (voice):\n# torch torchaudio librosa transformers\n```\n\n---\n\n## \ud83d\udd10 .env.example\n\n```env\n# Where your MultiSubspaceFaissIndex is persisted\nBANK_PATH=\/home\/bgilbert\/editable_files\/ms_faiss_bank\n\n# Where finished HTML\/PDF reports go\nREPORT_OUTDIR=\/home\/bgilbert\/editable_files\/reports\n\n# Glob for sweep result JSONs to rebuild the bank\nSWEEP_GLOB=\/home\/bgilbert\/editable_files\/sweep_results\/*.json\n\n# Your SignalIntelligence path so imports work\nSI_PATH=\/home\/bgilbert\/NerfEngine\/RF_QUANTUM_SCYTHE\n\n# Index init knobs (optional)\nN_SUBSPACES=3\nMETHOD=bgmm\nWARMUP_MIN_POINTS=30\nTOP_M_SUBSPACES=2\nWHITEN_ENABLE=true\nGOAL_SPARSE_ENABLE=true\nGOAL_TASK=rf_geo\n```\n\n---\n\n## \ud83e\udde0 api\/models.py\n\n```python\n# rpa_glue\/api\/models.py\nfrom pydantic import BaseModel, Field\nfrom typing import Any, Dict, List, Optional\n\nclass RecordsIn(BaseModel):\n    records: 
List&#91;Dict&#91;str, Any]]\n\nclass SearchIn(BaseModel):\n    query: Dict&#91;str, Any]\n    k: int = Field(5, ge=1, le=100)\n\nclass RebuildIn(BaseModel):\n    glob: Optional&#91;str] = None\n\nclass ReportIn(BaseModel):\n    record: Dict&#91;str, Any]\n    neighbors: List&#91;Dict&#91;str, Any]]  # &#91;{id, score, subspace}]\n    explain: Dict&#91;str, Any]\n    title: str = \"RF Similarity Report\"\n\nclass VoiceIn(BaseModel):\n    # If using bytes, you could Base64; for now we pass a path\n    audio_path: str\n    ref_real: Optional&#91;List&#91;str]] = None\n    ref_fake: Optional&#91;List&#91;str]] = None\n```\n\n---\n\n## \ud83e\udde9 services\/bank.py\n\n```python\n# rpa_glue\/services\/bank.py\nimport os, json, glob, time\nfrom typing import Any, Dict, List, Tuple\nimport numpy as np\n\nfrom SignalIntelligence.faiss_exemplar_index import RFExemplarFeaturizer\nfrom SignalIntelligence.multi_subspace_faiss import MultiSubspaceFaissIndex\n\ndef _bool(s: str, default=False):\n    if s is None: return default\n    return s.lower() in (\"1\",\"true\",\"yes\",\"y\",\"on\")\n\nclass BankManager:\n    def __init__(self):\n        self.bank_path = os.environ.get(\"BANK_PATH\", \"\/tmp\/ms_faiss_bank\")\n        self.si_path = os.environ.get(\"SI_PATH\", \"\")\n        self.feat = RFExemplarFeaturizer(256)\n        self.index = MultiSubspaceFaissIndex(\n            featurizer=self.feat,\n            n_subspaces=int(os.environ.get(\"N_SUBSPACES\",\"3\")),\n            method=os.environ.get(\"METHOD\",\"bgmm\"),\n            warmup_min_points=int(os.environ.get(\"WARMUP_MIN_POINTS\",\"30\")),\n            top_m_subspaces=int(os.environ.get(\"TOP_M_SUBSPACES\",\"2\")),\n            whiten_enable=_bool(os.environ.get(\"WHITEN_ENABLE\",\"true\")),\n            goal_sparse_enable=_bool(os.environ.get(\"GOAL_SPARSE_ENABLE\",\"true\")),\n            goal_task=os.environ.get(\"GOAL_TASK\",\"rf_geo\"),\n        )\n        self.metrics = {\"records_added\":0, \"searches\":0, 
\"last_rebuild_s\":None}\n        # Lazy-load when first needed\n        self._loaded = False\n\n    def _ensure_loaded(self):\n        if not self._loaded and os.path.isdir(self.bank_path):\n            self.index.load(self.bank_path)\n            self._loaded = True\n\n    def save(self):\n        os.makedirs(self.bank_path, exist_ok=True)\n        self.index.save(self.bank_path)\n\n    def add_records(self, records: List&#91;Dict&#91;str,Any]]) -> int:\n        self._ensure_loaded()\n        # guard for metadata\n        for r in records: r.setdefault(\"metadata\", {})\n        self.index.add_records(records)\n        self.metrics&#91;\"records_added\"] += len(records)\n        self.save()\n        return len(records)\n\n    def search(self, query: Dict&#91;str,Any], k:int=5):\n        self._ensure_loaded()\n        t0 = time.monotonic()\n        hits = self.index.search(query, top_k=k)\n        self.metrics&#91;\"searches\"] += 1\n        dt = time.monotonic() - t0\n        return hits, dt\n\n    def explain(self, query: Dict&#91;str,Any]):\n        self._ensure_loaded()\n        return self.index.explain(query)\n\n    def rebuild_from_glob(self, g: str) -> int:\n        self._ensure_loaded()\n        recs=&#91;]\n        for p in glob.glob(g):\n            try:\n                obj = json.load(open(p))\n                if isinstance(obj, list):\n                    recs.extend(obj)\n                elif isinstance(obj, dict) and \"results\" in obj:\n                    recs.extend(obj&#91;\"results\"])\n            except Exception:\n                continue\n        for r in recs: r.setdefault(\"metadata\", {})\n        # Fresh index: reinit to avoid accumulating old state\n        self.index = MultiSubspaceFaissIndex(\n            featurizer=self.feat,\n            n_subspaces=int(os.environ.get(\"N_SUBSPACES\",\"3\")),\n            method=os.environ.get(\"METHOD\",\"bgmm\"),\n            warmup_min_points=int(os.environ.get(\"WARMUP_MIN_POINTS\",\"30\")),\n 
           top_m_subspaces=int(os.environ.get(\"TOP_M_SUBSPACES\",\"2\")),\n            whiten_enable=_bool(os.environ.get(\"WHITEN_ENABLE\",\"true\")),\n            goal_sparse_enable=_bool(os.environ.get(\"GOAL_SPARSE_ENABLE\",\"true\")),\n            goal_task=os.environ.get(\"GOAL_TASK\",\"rf_geo\"),\n        )\n        if recs:\n            self.index.add_records(recs)\n        self.save()\n        self.metrics&#91;\"last_rebuild_s\"] = int(time.time())\n        return len(recs)\n\n    def metrics_json(self):\n        return {\n            \"records_added\": self.metrics&#91;\"records_added\"],\n            \"searches\": self.metrics&#91;\"searches\"],\n            \"last_rebuild_s\": self.metrics&#91;\"last_rebuild_s\"],\n            \"bank_path\": self.bank_path,\n            \"subspaces\": {k: len(v.ids) for k,v in self.index.subspaces.items()},\n            \"method\": self.index.method,\n            \"goal_sparse\": {\n                \"enabled\": self.index.goal_sparse_enable,\n                \"task\": self.index.goal_task,\n            },\n            \"whitening\": bool(self.index.whiten_enable),\n        }\n```\n\n---\n\n## \ud83c\udf99 services\/voice.py (optional)\n\n```python\n# rpa_glue\/services\/voice.py\nfrom typing import Any, Dict\nimport os, json\n\ntry:\n    # Your previously built enhanced detector (optional)\n    from voice_clone_guard_plus import VoiceDeepfakeDetectorPlus\n    from voice_clone_guard_ext import XLSREmbedderChunked, EmbedConfig\n    HAVE_VOICE=True\nexcept Exception:\n    HAVE_VOICE=False\n\nclass VoiceService:\n    def __init__(self):\n        self.enabled = HAVE_VOICE\n        if self.enabled:\n            self.embedder = XLSREmbedderChunked(cfg=EmbedConfig())\n            self.detector = VoiceDeepfakeDetectorPlus(gp_length_scale=1.5, k=7)\n\n    def score(self, audio_path: str, ref_real=None, ref_fake=None):\n        if not self.enabled:\n            return {\"enabled\": False, \"message\": \"Voice pipeline not 
installed\"}\n        ex=&#91;]\n        for p in ref_real or &#91;]:\n            V = self.embedder.embed_file(p)\n            if V is not None:\n                ex.append((f\"real::{os.path.basename(p)}\", V.mean(axis=0), 0, {\"path\":p}))\n        for p in ref_fake or &#91;]:\n            V = self.embedder.embed_file(p)\n            if V is not None:\n                ex.append((f\"fake::{os.path.basename(p)}\", V.mean(axis=0), 1, {\"path\":p}))\n        if ex:\n            self.detector.add_exemplars(ex)\n        Vt = self.embedder.embed_file(audio_path)\n        out = self.detector.score_chunks(Vt)\n        return {\"enabled\": True, \"result\": out}\n```\n\n---\n\n## \ud83e\uddfe services\/reporting.py\n\n```python\n# rpa_glue\/services\/reporting.py\nimport os, datetime\nfrom typing import Dict, Any, List\nfrom jinja2 import Environment, FileSystemLoader, select_autoescape\n\nclass Reporter:\n    def __init__(self, template_dir: str, outdir: str):\n        self.env = Environment(\n            loader=FileSystemLoader(template_dir),\n            autoescape=select_autoescape(&#91;\"html\", \"xml\"])\n        )\n        self.outdir = outdir\n        os.makedirs(outdir, exist_ok=True)\n\n    def render_rf(self, record: Dict&#91;str,Any], neighbors: List&#91;Dict&#91;str,Any]], explain: Dict&#91;str,Any], title=\"RF Similarity Report\"):\n        tpl = self.env.get_template(\"rf_report.html.j2\")\n        html = tpl.render(\n            title=title,\n            now=datetime.datetime.utcnow().isoformat(),\n            record=record,\n            neighbors=neighbors,\n            explain=explain\n        )\n        fname = f\"rf_report_{datetime.datetime.utcnow().strftime('%Y%m%d_%H%M%S')}.html\"\n        path = os.path.join(self.outdir, fname)\n        with open(path, \"w\") as f:\n            f.write(html)\n        return path\n\n    def render_voice(self, audio_path: str, result: Dict&#91;str,Any], title=\"Voice Clone Report\"):\n        tpl = 
self.env.get_template(\"voice_report.html.j2\")\n        html = tpl.render(title=title, now=datetime.datetime.utcnow().isoformat(),\n                          audio_path=audio_path, result=result)\n        fname = f\"voice_report_{datetime.datetime.utcnow().strftime('%Y%m%d_%H%M%S')}.html\"\n        path = os.path.join(self.outdir, fname)\n        with open(path, \"w\") as f:\n            f.write(html)\n        return path\n```\n\n---\n\n## \ud83c\udf10 api\/main.py\n\n```python\n# rpa_glue\/api\/main.py\nimport os, time\nfrom fastapi import FastAPI\nfrom fastapi.responses import JSONResponse\nfrom api.models import RecordsIn, SearchIn, RebuildIn, ReportIn, VoiceIn\nfrom services.bank import BankManager\nfrom services.reporting import Reporter\nfrom services.voice import VoiceService\n\nBANK = BankManager()\nREPORTER = Reporter(template_dir=os.path.join(os.path.dirname(__file__), \"..\", \"templates\"),\n                    outdir=os.environ.get(\"REPORT_OUTDIR\", \"\/tmp\/reports\"))\nVOICE = VoiceService()\n\napp = FastAPI(title=\"RPA Glue API\", version=\"1.0.0\")\n\n@app.get(\"\/health\")\ndef health():\n    return {\"ok\": True, \"time\": int(time.time())}\n\n@app.get(\"\/metrics\")\ndef metrics():\n    return BANK.metrics_json()\n\n@app.post(\"\/bank\/add\")\ndef bank_add(payload: RecordsIn):\n    n = BANK.add_records(payload.records)\n    return {\"added\": n, \"bank_path\": BANK.bank_path}\n\n@app.post(\"\/bank\/search\")\ndef bank_search(payload: SearchIn):\n    hits, dt = BANK.search(payload.query, payload.k)\n    # normalize to plain dict list for reporting\n    out = &#91;{\"id\": sid, \"score\": score, \"subspace\": subk} for (sid, score, subk) in hits]\n    explain = BANK.explain(payload.query)\n    return {\"latency_s\": dt, \"hits\": out, \"explain\": explain}\n\n@app.post(\"\/bank\/save\")\ndef bank_save():\n    BANK.save()\n    return {\"saved_to\": BANK.bank_path}\n\n@app.post(\"\/bank\/load\")\ndef bank_load():\n    BANK._ensure_loaded()\n    
return {\"loaded_from\": BANK.bank_path}\n\n@app.post(\"\/rf\/rebuild\")\ndef rf_rebuild(payload: RebuildIn):\n    g = payload.glob or os.environ.get(\"SWEEP_GLOB\", \"\/tmp\/*.json\")\n    n = BANK.rebuild_from_glob(g)\n    return {\"reindexed\": n, \"bank_path\": BANK.bank_path}\n\n@app.post(\"\/report\/rf\")\ndef report_rf(payload: ReportIn):\n    path = REPORTER.render_rf(payload.record, payload.neighbors, payload.explain, payload.title)\n    return {\"report_path\": path}\n\n@app.post(\"\/voice\/score\")\ndef voice_score(payload: VoiceIn):\n    res = VOICE.score(payload.audio_path, payload.ref_real, payload.ref_fake)\n    if not res.get(\"enabled\", False):\n        return JSONResponse(res, status_code=501)\n    path = REPORTER.render_voice(payload.audio_path, res&#91;\"result\"])\n    return {\"report_path\": path, \"result\": res&#91;\"result\"]}\n```\n\n---\n\n## \ud83d\uddbc templates\/rf\\_report.html.j2\n\n```html\n&lt;!doctype html>\n&lt;html>\n&lt;head>\n  &lt;meta charset=\"utf-8\"\/>\n  &lt;title>{{ title }}&lt;\/title>\n  &lt;style>\n    body { font-family: ui-sans-serif, system-ui, -apple-system; margin: 24px; }\n    code, pre { background:#f6f6f6; padding:4px 6px; border-radius:6px; }\n    table { border-collapse: collapse; width: 100%; margin-top: 12px; }\n    th, td { border: 1px solid #ddd; padding: 8px; font-size: 14px; }\n    th { background:#fafafa; text-align: left; }\n    .pill { display:inline-block; padding:2px 8px; border-radius:999px; background:#eef; }\n  &lt;\/style>\n&lt;\/head>\n&lt;body>\n  &lt;h1>{{ title }}&lt;\/h1>\n  &lt;div class=\"pill\">Generated: {{ now }}&lt;\/div>\n\n  &lt;h2>Query Record&lt;\/h2>\n  &lt;pre>{{ record | tojson(indent=2) }}&lt;\/pre>\n\n  &lt;h2>Routing &amp;amp; Explanation&lt;\/h2>\n  &lt;pre>{{ explain | tojson(indent=2) }}&lt;\/pre>\n\n  &lt;h2>Top Neighbors&lt;\/h2>\n  &lt;table>\n    
&lt;thead>&lt;tr>&lt;th>#&lt;\/th>&lt;th>ID&lt;\/th>&lt;th>Score&lt;\/th>&lt;th>Subspace&lt;\/th>&lt;\/tr>&lt;\/thead>\n    &lt;tbody>\n      {% for n in neighbors %}\n      &lt;tr>&lt;td>{{ loop.index }}&lt;\/td>&lt;td>{{ n.id }}&lt;\/td>&lt;td>{{ \"%.4f\"|format(n.score) }}&lt;\/td>&lt;td>{{ n.subspace }}&lt;\/td>&lt;\/tr>\n      {% endfor %}\n    &lt;\/tbody>\n  &lt;\/table>\n&lt;\/body>\n&lt;\/html>\n```\n\n---\n\n## \ud83d\uddbc templates\/voice\\_report.html.j2\n\n```html\n&lt;!doctype html>\n&lt;html>\n&lt;head>\n  &lt;meta charset=\"utf-8\"\/>\n  &lt;title>{{ title }}&lt;\/title>\n  &lt;style>\n    body { font-family: ui-sans-serif, system-ui, -apple-system; margin: 24px; }\n    code, pre { background:#f6f6f6; padding:4px 6px; border-radius:6px; }\n  &lt;\/style>\n&lt;\/head>\n&lt;body>\n  &lt;h1>{{ title }}&lt;\/h1>\n  &lt;p>&lt;b>Audio:&lt;\/b> {{ audio_path }}&lt;\/p>\n  &lt;pre>{{ result | tojson(indent=2) }}&lt;\/pre>\n&lt;\/body>\n&lt;\/html>\n```\n\n---\n\n## \ud83e\uddea examples\/records.sample.json\n\n```json\n&#91;\n  {\"id\":\"ex1\",\"delta_f_hz\":10.0,\"snr_db\":20.0,\"q_ms\":50.0,\"am_depth_pct\":0.0,\"fm_dev_hz\":0.0,\"metadata\":{}},\n  {\"id\":\"ex2\",\"delta_f_hz\":15.0,\"snr_db\":15.0,\"q_ms\":20.0,\"am_depth_pct\":0.0,\"fm_dev_hz\":0.0,\"metadata\":{}}\n]\n```\n\n## \ud83e\uddea examples\/query.sample.json\n\n```json\n{\n  \"query\": {\"delta_f_hz\":10.0,\"snr_db\":20.0,\"q_ms\":50.0,\"am_depth_pct\":0.0,\"fm_dev_hz\":0.0,\"metadata\":{}},\n  \"k\": 5\n}\n```\n\n---\n\n## \ud83d\udda5 scripts\/start\\_api.sh\n\n```bash\n#!\/usr\/bin\/env bash\nset -euo pipefail\ncd \"$(dirname \"$0\")\/..\"\n\n# Load env\nif &#91; -f .env ]; then set -a; source .env; set +a; fi\n\n# PYTHONPATH for SignalIntelligence\nexport PYTHONPATH=\"${SI_PATH:-$HOME\/NerfEngine\/RF_QUANTUM_SCYTHE}:$PYTHONPATH\"\n\n# Venv (edit if needed)\nsource ~\/rf_quantum_env\/bin\/activate\n\n# Install (first run)\npip -q install -r requirements.txt\n\n# Start 
API\nuvicorn api.main:app --host 0.0.0.0 --port 8088 --workers 2\n```\n\n## \ud83d\udda5 scripts\/rebuild\\_bank.sh\n\n```bash\n#!\/usr\/bin\/env bash\nset -euo pipefail\ncd \"$(dirname \"$0\")\/..\"\n&#91; -f .env ] &amp;&amp; set -a &amp;&amp; source .env &amp;&amp; set +a\ncurl -s -X POST \"http:\/\/localhost:8088\/rf\/rebuild\" \\\n  -H \"Content-Type: application\/json\" \\\n  -d \"{\\\"glob\\\":\\\"${SWEEP_GLOB:-\/tmp\/*.json}\\\"}\" | jq .\n```\n\n## \ud83d\udda5 scripts\/process\\_sweep\\_folder.sh\n\n```bash\n#!\/usr\/bin\/env bash\nset -euo pipefail\ncd \"$(dirname \"$0\")\/..\"\n&#91; -f .env ] &amp;&amp; set -a &amp;&amp; source .env &amp;&amp; set +a\n\n# Add all new records in one shot (RPA would call per-file; this is a batch example)\nFILE=\"${1:-examples\/records.sample.json}\"\ncurl -s -X POST \"http:\/\/localhost:8088\/bank\/add\" \\\n  -H \"Content-Type: application\/json\" \\\n  -d @\"${FILE}\" | jq .\n```\n\n## \ud83d\udda5 scripts\/generate\\_rf\\_report.sh\n\n```bash\n#!\/usr\/bin\/env bash\nset -euo pipefail\ncd \"$(dirname \"$0\")\/..\"\n&#91; -f .env ] &amp;&amp; set -a &amp;&amp; source .env &amp;&amp; set +a\n\nQUERY_FILE=\"${1:-examples\/query.sample.json}\"\n\n# Search\nRES=$(curl -s -X POST \"http:\/\/localhost:8088\/bank\/search\" -H \"Content-Type: application\/json\" -d @\"${QUERY_FILE}\")\nHITS=$(echo \"$RES\" | jq '.hits')\nEXPL=$(echo \"$RES\" | jq '.explain')\nREC=$(jq '.query' \"${QUERY_FILE}\")\n\n# Report\ncurl -s -X POST \"http:\/\/localhost:8088\/report\/rf\" \\\n  -H \"Content-Type: application\/json\" \\\n  -d \"{\\\"record\\\":${REC},\\\"neighbors\\\":${HITS},\\\"explain\\\":${EXPL},\\\"title\\\":\\\"RF Search Report\\\"}\" | jq .\n```\n\nMake scripts executable:\n\n```bash\nchmod +x scripts\/*.sh\n```\n\n---\n\n## \u25b6\ufe0f Run it\n\n```bash\ncd rpa_glue\ncp .env.example .env   # edit paths if needed\n.\/scripts\/start_api.sh 
&amp;\n.\/scripts\/rebuild_bank.sh\n.\/scripts\/process_sweep_folder.sh\n.\/scripts\/generate_rf_report.sh\n# open the printed report_path in your browser\n```\n\n---\n\n# \ud83e\udd16 UiPath (drop-in)\n\n### uipath\/project.json (minimal)\n\n```json\n{\n  \"name\": \"RF_RPA_Glue\",\n  \"description\": \"Watches sweep folder, calls RPA Glue API, writes reports\",\n  \"main\": \"Sequence.Main.xaml\",\n  \"dependencies\": {\n    \"UiPath.System.Activities\": \"&#91;23.10.6]\",\n    \"UiPath.WebAPI.Activities\": \"&#91;1.16.3]\",\n    \"UiPath.Excel.Activities\": \"&#91;2.22.3]\"\n  },\n  \"schemas\": &#91;\"https:\/\/schemas.uipath.com\/workflow\/Project.json\"]\n}\n```\n\n### uipath\/Sequence.Main.xaml (core activities)\n\n```xml\n&lt;Activity x:Class=\"Sequence_Main\" xmlns=\"http:\/\/schemas.microsoft.com\/netfx\/2009\/xaml\/activities\"\n xmlns:x=\"http:\/\/schemas.microsoft.com\/winfx\/2006\/xaml\"\n xmlns:ui=\"http:\/\/schemas.uipath.com\/workflow\/activities\"\n xmlns:mc=\"http:\/\/schemas.openxmlformats.org\/markup-compatibility\/2006\">\n  &lt;Sequence DisplayName=\"RF RPA Glue\">\n    &lt;!-- 1) File Change Trigger on sweep_reports folder -->\n    &lt;ui:FileChangeTrigger DisplayName=\"Watch Sweep Folder\" Path=\"C:\\sweeps\" ChangeType=\"Created\">\n      &lt;ui:FileChangeTrigger.Body>\n        &lt;Sequence>\n          &lt;!-- 2) HTTP POST \/bank\/add with file contents -->\n          &lt;ui:HTTP RequestMethod=\"POST\" Endpoint=\"http:\/\/localhost:8088\/bank\/add\"\n                   Headers=\"&#91;new Dictionary(Of String, String) From {{\"\"Content-Type\"\",\"\"application\/json\"\"}}]\"\n                   Body=\"&#91;new System.IO.StreamReader(new System.IO.FileStream(triggerFile, System.IO.FileMode.Open, System.IO.FileAccess.Read, System.IO.FileShare.Read)).ReadToEnd()]\"\n                   Result=\"&#91;apiAddResult]\" \/>\n          &lt;!-- 3) HTTP POST \/bank\/search using a composed query (example) -->\n          &lt;Assign>\n            
&lt;Assign.To>\n              &lt;OutArgument x:TypeArguments=\"x:String\">queryJson&lt;\/OutArgument>\n            &lt;\/Assign.To>\n            &lt;Assign.Value>\n              &lt;InArgument x:TypeArguments=\"x:String\">\n                {\"query\":{\"delta_f_hz\":10.0,\"snr_db\":20.0,\"q_ms\":50.0,\"metadata\":{}},\"k\":5}\n              &lt;\/InArgument>\n            &lt;\/Assign.Value>\n          &lt;\/Assign>\n          &lt;ui:HTTP RequestMethod=\"POST\" Endpoint=\"http:\/\/localhost:8088\/bank\/search\"\n                   Headers=\"&#91;new Dictionary(Of String, String) From {{\"\"Content-Type\"\",\"\"application\/json\"\"}}]\"\n                   Body=\"&#91;queryJson]\" Result=\"&#91;apiSearchResult]\" \/>\n          &lt;!-- 4) Write hits into CSV (or call \/report\/rf to get HTML path) -->\n          &lt;ui:HTTP RequestMethod=\"POST\" Endpoint=\"http:\/\/localhost:8088\/report\/rf\"\n                   Headers=\"&#91;new Dictionary(Of String, String) From {{\"\"Content-Type\"\",\"\"application\/json\"\"}}]\"\n                   Body=\"&#91;String.Format(\"\"{{\\\"\"record\\\"\":{{\\\"\"delta_f_hz\\\"\":10.0,\\\"\"snr_db\\\"\":20.0,\\\"\"q_ms\\\"\":50.0,\\\"\"metadata\\\"\":{{}}}},\\\"\"neighbors\\\"\":{0},\\\"\"explain\\\"\":{1},\\\"\"title\\\"\":\\\"\"RF Report from UiPath\\\"\"}}\" ,\n                       apiSearchResult.SelectToken(\"\"$.hits\"\").ToString(), apiSearchResult.SelectToken(\"\"$.explain\"\").ToString())]\"\n                   Result=\"&#91;apiReportResult]\" \/>\n        &lt;\/Sequence>\n      &lt;\/ui:FileChangeTrigger.Body>\n    &lt;\/ui:FileChangeTrigger>\n  &lt;\/Sequence>\n&lt;\/Activity>\n```\n\n> If you prefer JSON-only, skip the HTML and use `\/bank\/search` \u2192 write CSV via Excel activities. 
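The JSON-only route in that note can also be exercised without UiPath at all, which is handy for testing the loop from the bash side. A minimal Python sketch, assuming the API is reachable at `http://localhost:8088` and returns the `{latency_s, hits, explain}` shape defined in `api/main.py`; the `hits_to_csv` and `run_once` helpers are illustrative, not part of the shipped code:

```python
import csv
import io
import json
import urllib.request

API = "http://localhost:8088"  # RPA Glue API base URL (adjust to your deployment)

def hits_to_csv(hits):
    """Flatten /bank/search hits into CSV text (rank, id, score, subspace)."""
    buf = io.StringIO()
    w = csv.writer(buf)
    w.writerow(["rank", "id", "score", "subspace"])
    for rank, h in enumerate(hits, start=1):
        w.writerow([rank, h["id"], "%.4f" % h["score"], h["subspace"]])
    return buf.getvalue()

def search(query, k=5):
    """POST /bank/search; the response carries latency_s, hits, and explain."""
    req = urllib.request.Request(
        API + "/bank/search",
        data=json.dumps({"query": query, "k": k}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def run_once(out_path="/tmp/rf_hits.csv"):
    """One robot-style pass: search the bank, dump hits to CSV for Excel activities."""
    query = {"delta_f_hz": 10.0, "snr_db": 20.0, "q_ms": 50.0,
             "am_depth_pct": 0.0, "fm_dev_hz": 0.0, "metadata": {}}
    res = search(query)
    with open(out_path, "w", newline="") as f:
        f.write(hits_to_csv(res["hits"]))
    return res["latency_s"]
```

`run_once()` needs the live API; `hits_to_csv` is pure, so any RPA harness (or a unit test) can reuse it without network access.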
The XAML above uses UiPath **HTTP Request** and **File Change Trigger** activities.\n\n---\n\n# \ud83d\udcc8 Start clocking latency\n\n* **Service metrics:** `GET \/metrics` returns record counts, method, whitening, and subspace populations.\n* **Search latency:** `\/bank\/search` returns `latency_s`.\n* **RPA timing:** add `Log Message` around each HTTP call and write durations to an Excel log. Aim to compare:\n\n  * single-space vs multi-subspace+whitening,\n  * goal-sparse off vs on,\n  * batch vs per-file ingestion.\n\n---\n\n# \u2705 What you get out of the box\n\n* A clean FastAPI surface your robots can call for **add\/search\/report**.\n* **Explainable** output (responsibilities, gating, whitening, goal-sparse kept dims).\n* Lightweight HTML reports (RPA can attach to emails or push to SharePoint).\n* Bash runners so you can test without UiPath.\n* A minimal UiPath project to prove the loop.\n\nIf you want me to add a **CSV\/Excel report** or a **PDF export** (wkhtmltopdf hook), say the word and I\u2019ll extend `reporting.py` with one more 
function.\n<\/code><\/pre>\n","protected":false},"excerpt":{"rendered":"","protected":false},"author":2,"featured_media":2959,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"neve_meta_sidebar":"","neve_meta_container":"","neve_meta_enable_content_width":"","neve_meta_content_width":0,"neve_meta_title_alignment":"","neve_meta_author_avatar":"","neve_post_elements_order":"","neve_meta_disable_header":"","neve_meta_disable_footer":"","neve_meta_disable_title":"","footnotes":""},"class_list":["post-2962","page","type-page","status-publish","has-post-thumbnail","hentry"],"_links":{"self":[{"href":"https:\/\/neurosphere-2.tail52f848.ts.net\/wordpress\/index.php?rest_route=\/wp\/v2\/pages\/2962","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/neurosphere-2.tail52f848.ts.net\/wordpress\/index.php?rest_route=\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/neurosphere-2.tail52f848.ts.net\/wordpress\/index.php?rest_route=\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/neurosphere-2.tail52f848.ts.net\/wordpress\/index.php?rest_route=\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/neurosphere-2.tail52f848.ts.net\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=2962"}],"version-history":[{"count":0,"href":"https:\/\/neurosphere-2.tail52f848.ts.net\/wordpress\/index.php?rest_route=\/wp\/v2\/pages\/2962\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/neurosphere-2.tail52f848.ts.net\/wordpress\/index.php?rest_route=\/wp\/v2\/media\/2959"}],"wp:attachment":[{"href":"https:\/\/neurosphere-2.tail52f848.ts.net\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=2962"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}