**Change note: drop Bearer enforcement on /ollama/**

- **Motivation:** simplify Cursor and custom clients; the Bearer requirement caused confusion with the Cursor user API key.
- **Root cause:** N/A.
- **Fixes:** drop the `if` map check and the Authorization-header stripping on `/ollama/`; the deploy script no longer emits the Bearer map.
- **Changes:** optional Bearer documented in the http-maps example; README, services, feature, and infrastructure docs updated; proxy redeployed.
- **Affected files:**
  - deploy/nginx/sites/ia.enso.4nkweb.com.conf
  - deploy/nginx/deploy-ia-enso-to-proxy.sh
  - deploy/nginx/README-ia-enso.md
  - deploy/nginx/http-maps/ia-enso-ollama-bearer.map.conf.example
  - docs/features/ia-enso-nginx-proxy-ollama-anythingllm.md
  - docs/services.md
  - docs/infrastructure.md
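The simplified `/ollama/` location might look like the sketch below. This is an illustration, not the deployed config: the upstream address `127.0.0.1:11434` (Ollama's default port) and the timeout value are assumptions, and the actual file is `deploy/nginx/sites/ia.enso.4nkweb.com.conf`.

```nginx
# Hypothetical sketch: /ollama/ after dropping the Bearer map check.
# No "if" guard, no Authorization stripping — the header passes through
# untouched, so clients such as Cursor can send their own API key.
location /ollama/ {
    proxy_pass http://127.0.0.1:11434/;   # assumed local Ollama upstream
    proxy_http_version 1.1;
    proxy_set_header Host $host;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto $scheme;
    proxy_read_timeout 300s;              # illustrative; model responses can be slow
}
```

With the map removed, re-enabling Bearer checks is opt-in via the example map file listed under affected files.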
# smart_ide — documentation
Operational, architectural, and UX-design notes for the local-AI IDE initiative and the host tooling in this repository.
| Document | Content |
|---|---|
| ../README.md | Project overview (French): vision, Lapce, AnythingLLM per project |
| deployment-target.md | First target: Linux client + SSH remote server (AI stack + repos) |
| ia_dev-submodule.md | Git submodule ia_dev (clone, update, SSH URL) |
| lecoffre_ng-checkout.md | Plain clone lecoffre_ng next to smart_ide (/home/ncantu/code) |
| split-lecoffre-repos.md | Split monorepo into five Gitea repos (setup/split-lecoffre-ng-to-five-repos.sh) |
| infrastructure.md | Host inventory (LAN), SSH key workflow, host scripts |
| services.md | Ollama, AnythingLLM (Docker), Desktop installer, Ollama ↔ Docker |
| ../deploy/nginx/README-ia-enso.md | HTTPS proxy ia.enso.4nkweb.com → Ollama / AnythingLLM (Bearer, SSH deploy script, troubleshooting) |
| features/ia-enso-nginx-proxy-ollama-anythingllm.md | Change sheet: objectives, impacts, and details of the ia.enso reverse proxy |
| anythingllm-workspaces.md | One AnythingLLM workspace per project; sync pipeline |
| ux-navigation-model.md | Beyond file explorer: intentions, graph, palette, risks, expert mode |
| system-architecture.md | Layers, modules, agent gateway, OpenShell, events, Lapce |
Author: 4NK
## Related external docs
- AnythingLLM Docker: https://docs.anythingllm.com/installation-docker/local-docker
- Ollama: https://github.com/ollama/ollama/blob/main/docs/linux.md
- Lapce: https://lapce.dev/