# smart_ide — documentation

Operational, architectural, and UX-design notes for the local-AI IDE initiative and the host tooling in this repository.

| Document | Content |
|----------|---------|
| [../README.md](../README.md) | Project overview (French): vision, Lapce, AnythingLLM per project |
| [deployment-target.md](./deployment-target.md) | First target: Linux client + SSH remote server (AI stack + repos) |
| [ia_dev-submodule.md](./ia_dev-submodule.md) | Git submodule `ia_dev` (clone, update, SSH URL) |
| [lecoffre_ng-checkout.md](./lecoffre_ng-checkout.md) | Plain clone of `lecoffre_ng` next to `smart_ide` (`/home/ncantu/code`) |
| [split-lecoffre-repos.md](./split-lecoffre-repos.md) | Split of the monorepo into five Gitea repos (`setup/split-lecoffre-ng-to-five-repos.sh`) |
| [infrastructure.md](./infrastructure.md) | Host inventory (LAN), SSH key workflow, host scripts |
| [services.md](./services.md) | Ollama, AnythingLLM (Docker), desktop installer, Ollama ↔ Docker |
| [../deploy/nginx/README-ia-enso.md](../deploy/nginx/README-ia-enso.md) | HTTPS proxy `ia.enso.4nkweb.com` → Ollama / AnythingLLM (Bearer auth, SSH script, troubleshooting) |
| [features/ia-enso-nginx-proxy-ollama-anythingllm.md](./features/ia-enso-nginx-proxy-ollama-anythingllm.md) | Change record: objectives, impacts, and implementation details of the ia.enso reverse proxy |
| [anythingllm-workspaces.md](./anythingllm-workspaces.md) | One AnythingLLM workspace per project; sync pipeline |
| [ux-navigation-model.md](./ux-navigation-model.md) | Beyond the file explorer: intentions, graph, palette, risks, expert mode |
| [system-architecture.md](./system-architecture.md) | Layers, modules, agent gateway, OpenShell, events, Lapce |

**Author:** 4NK

**Related external docs**

- AnythingLLM Docker:
- Ollama:
- Lapce: