Nicolas Cantu f39de69e55 Add SSH deploy script for ia.enso nginx on proxy
**Motivations:**
- Apply proxy configuration from a workstation without manual scp steps.

**Root causes:**
- No automation existed for pushing nginx files to 192.168.1.100.

**Fixes:**
- N/A.

**Evolutions:**
- deploy-ia-enso-to-proxy.sh: pushes the nginx configuration over SSH with ProxyJump, optionally generates a Bearer token, and retries without duplicating the websocket map block.
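A minimal sketch of what such a deploy script might look like. The jump host, vhost filename, and remote paths below are assumptions for illustration, not the actual contents of deploy-ia-enso-to-proxy.sh; the retry/websocket-map deduplication logic is omitted. It defaults to a dry run (`RUN=echo`) so the commands are printed rather than executed:

```shell
#!/usr/bin/env bash
# Hypothetical sketch of an SSH deploy script for an nginx vhost.
# PROXY matches the host mentioned above; JUMP and CONF are assumed names.
set -euo pipefail

PROXY="${PROXY:-192.168.1.100}"
JUMP="${JUMP:-user@jumphost}"    # assumed ProxyJump host
CONF="${CONF:-ia.enso.conf}"     # assumed nginx vhost file
RUN="${RUN:-echo}"               # defaults to dry-run; set RUN= to execute

# Optionally generate a Bearer token for the proxied API.
gen_token() { openssl rand -hex 32; }

deploy() {
  # Copy the vhost through the jump host, then test and reload nginx.
  $RUN scp -o ProxyJump="$JUMP" "$CONF" "root@$PROXY:/etc/nginx/conf.d/"
  $RUN ssh -o ProxyJump="$JUMP" "root@$PROXY" 'nginx -t && systemctl reload nginx'
}
```

Keeping the remote commands behind a single `deploy` function makes it easy to add a retry loop around just the upload step without repeating the ProxyJump options.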

**Affected pages:**
- deploy/nginx/deploy-ia-enso-to-proxy.sh
- deploy/nginx/README-ia-enso.md
- docs/features/ia-enso-nginx-proxy-ollama-anythingllm.md
2026-03-23 01:03:22 +01:00

smart_ide — documentation

Operational, architectural, and UX-design notes for the local-AI IDE initiative and the host tooling in this repository.

| Document | Content |
| --- | --- |
| ../README.md | Project overview (French): vision, Lapce, AnythingLLM per project |
| deployment-target.md | First target: Linux client + SSH remote server (AI stack + repos) |
| infrastructure.md | Host inventory (LAN), SSH key workflow, host scripts |
| services.md | Ollama, AnythingLLM (Docker), Desktop installer, Ollama ↔ Docker |
| anythingllm-workspaces.md | One AnythingLLM workspace per project; sync pipeline |
| ux-navigation-model.md | Beyond the file explorer: intentions, graph, palette, risks, expert mode |
| system-architecture.md | Layers, modules, agent gateway, OpenShell, events, Lapce |

Author: 4NK

Related external docs