**Motivations:**
- Apply the proxy configuration from a workstation without manual scp steps.

**Root causes:**
- No automation existed for pushing nginx files to 192.168.1.100.

**Fixes:**
- N/A.

**Enhancements:**
- deploy-ia-enso-to-proxy.sh: pushes via ProxyJump, optionally generates a Bearer token, and retries without duplicating the websocket `map` block.

**Affected pages:**
- deploy/nginx/deploy-ia-enso-to-proxy.sh
- deploy/nginx/README-ia-enso.md
- docs/features/ia-enso-nginx-proxy-ollama-anythingllm.md
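The deploy flow summarized above can be sketched as a short script. This is a minimal illustration, not the actual contents of deploy-ia-enso-to-proxy.sh: the jump-host alias (`bastion`), the site file name (`ia-enso.conf`), and the remote paths are assumptions; only the proxy address 192.168.1.100 comes from this document.

```shell
#!/usr/bin/env bash
# Sketch: push an nginx site file to the proxy host through an SSH jump
# host, then validate and reload nginx remotely in one step.
set -euo pipefail

JUMP_HOST="${JUMP_HOST:-bastion}"        # assumption: jump-host alias
PROXY_HOST="${PROXY_HOST:-192.168.1.100}"
SITE_FILE="${SITE_FILE:-ia-enso.conf}"   # assumption: site file name
RUN="${RUN:-echo}"   # dry-run by default; set RUN="" to execute for real

# Optional Bearer token for the upstream API, generated when not supplied.
TOKEN="${TOKEN:-$(openssl rand -hex 16)}"

# Copy the config in one hop via ProxyJump instead of two manual scp steps.
$RUN scp -o ProxyJump="$JUMP_HOST" "$SITE_FILE" \
  "root@$PROXY_HOST:/etc/nginx/conf.d/"

# Validate before reloading. On retry the script must not append the
# websocket `map` block again: a duplicate `map` makes `nginx -t` fail.
$RUN ssh -J "$JUMP_HOST" "root@$PROXY_HOST" \
  'nginx -t && systemctl reload nginx'
```

The `RUN=echo` guard keeps the sketch safe to run locally; it prints the scp and ssh commands instead of executing them against the proxy.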
# smart_ide — documentation
Operational, architectural, and UX-design notes for the local-AI IDE initiative and the host tooling in this repository.
| Document | Content |
|---|---|
| ../README.md | Project overview (French): vision, Lapce, AnythingLLM per project |
| deployment-target.md | First target: Linux client + SSH remote server (AI stack + repos) |
| infrastructure.md | Host inventory (LAN), SSH key workflow, host scripts |
| services.md | Ollama, AnythingLLM (Docker), Desktop installer, Ollama ↔ Docker |
| anythingllm-workspaces.md | One AnythingLLM workspace per project; sync pipeline |
| ux-navigation-model.md | Beyond file explorer: intentions, graph, palette, risks, expert mode |
| system-architecture.md | Layers, modules, agent gateway, OpenShell, events, Lapce |
Author: 4NK
## Related external docs
- AnythingLLM Docker: https://docs.anythingllm.com/installation-docker/local-docker
- Ollama: https://github.com/ollama/ollama/blob/main/docs/linux.md
- Lapce: https://lapce.dev/