**Motivations:**
- Expose Ollama and AnythingLLM via HTTPS paths on the LAN proxy, with Bearer auth for Ollama.

**Root causes:**
- Cursor blocks direct requests to private IPs (SSRF policy).

**Fixes:**
- N/A (new configuration artifacts).

**Changes:**
- Nginx site template, HTTP map for Bearer validation, websocket map example, deployment README, services doc link, feature documentation.

**Affected pages:**
- deploy/nginx/http-maps/ia-enso-ollama-bearer.map.conf.example
- deploy/nginx/http-maps/websocket-connection.map.conf.example
- deploy/nginx/sites/ia.enso.4nkweb.com.conf
- deploy/nginx/README-ia-enso.md
- docs/features/ia-enso-nginx-proxy-ollama-anythingllm.md
- docs/services.md
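The Bearer-validation map and site described above can be sketched roughly as follows. The server name comes from the listed site file and 11434 is Ollama's default port, but the token value, location path, and map variable names are illustrative placeholders, not the repository's actual configuration:

```nginx
# Hypothetical sketch: validate a static Bearer token via an HTTP-level map.
# Replace CHANGE_ME_TOKEN with a secret; variable names are assumptions.
map $http_authorization $ollama_auth_ok {
    default                 0;
    "Bearer CHANGE_ME_TOKEN" 1;
}

server {
    listen 443 ssl;
    server_name ia.enso.4nkweb.com;

    # Proxy to the local Ollama daemon, rejecting requests without the token.
    location /ollama/ {
        if ($ollama_auth_ok = 0) { return 401; }
        proxy_pass http://127.0.0.1:11434/;
    }
}
```

The `map` block must live at the `http` level (hence the separate `http-maps/` example files), while the `server` block belongs in the site template.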
# smart_ide — documentation
Operational, architectural, and UX-design notes for the local-AI IDE initiative and the host tooling in this repository.
| Document | Content |
|---|---|
| ../README.md | Project overview (French): vision, Lapce, AnythingLLM per project |
| deployment-target.md | First target: Linux client + SSH remote server (AI stack + repos) |
| infrastructure.md | Host inventory (LAN), SSH key workflow, host scripts |
| services.md | Ollama, AnythingLLM (Docker), Desktop installer, Ollama ↔ Docker |
| anythingllm-workspaces.md | One AnythingLLM workspace per project; sync pipeline |
| ux-navigation-model.md | Beyond file explorer: intentions, graph, palette, risks, expert mode |
| system-architecture.md | Layers, modules, agent gateway, OpenShell, events, Lapce |
Author: 4NK
## Related external docs
- AnythingLLM Docker: https://docs.anythingllm.com/installation-docker/local-docker
- Ollama: https://github.com/ollama/ollama/blob/main/docs/linux.md
- Lapce: https://lapce.dev/