# smart_ide — documentation

Operational, architectural, and UX-design notes for the local-AI IDE initiative and the host tooling in this repository.

| Document | Content |
|----------|---------|
| [../README.md](../README.md) | Project overview (French): vision, Lapce, AnythingLLM per project |
| [deployment-target.md](./deployment-target.md) | First target: Linux client + SSH remote server (AI stack + repos) |
| [ia_dev-submodule.md](./ia_dev-submodule.md) | Git submodule `ia_dev` (clone, update, SSH URL) |
| [lecoffre_ng-checkout.md](./lecoffre_ng-checkout.md) | Plain clone of `lecoffre_ng` next to `smart_ide` (`/home/ncantu/code`) |
| [split-lecoffre-repos.md](./split-lecoffre-repos.md) | Split the monorepo into five Gitea repos (`setup/split-lecoffre-ng-to-five-repos.sh`) |
| [infrastructure.md](./infrastructure.md) | Host inventory (LAN), SSH key workflow, host scripts |
| [services.md](./services.md) | Ollama, AnythingLLM (Docker), desktop installer, Ollama ↔ Docker |
| [../deploy/nginx/README-ia-enso.md](../deploy/nginx/README-ia-enso.md) | HTTPS proxy `ia.enso.4nkweb.com` → Ollama / AnythingLLM (Bearer auth, SSH script, troubleshooting) |
| [../extensions/anythingllm-workspaces/README.md](../extensions/anythingllm-workspaces/README.md) | VS Code / Cursor extension: list AnythingLLM workspaces (API) and open the UI |
| [features/anythingllm-vscode-extension.md](./features/anythingllm-vscode-extension.md) | Feature brief: AnythingLLM extension, impacts, implementation details |
| [features/repos-devtools-server-and-dev-panel.md](./features/repos-devtools-server-and-dev-panel.md) | Local repos API + dev tools panel (clone, AnythingLLM workspace) |
| [../services/repos-devtools-server/README.md](../services/repos-devtools-server/README.md) | Local HTTP server: clone/list/load under `REPOS_DEVTOOLS_ROOT` |
| [fixKnowledge/anythingllm-extension-403-api-key.md](./fixKnowledge/anythingllm-extension-403-api-key.md) | AnythingLLM API 403: nginx Ollama key vs key from the UI API Keys page |
| [features/ia-enso-nginx-proxy-ollama-anythingllm.md](./features/ia-enso-nginx-proxy-ollama-anythingllm.md) | Feature brief: goals, impacts, and implementation details of the ia.enso reverse proxy |
| [anythingllm-workspaces.md](./anythingllm-workspaces.md) | One AnythingLLM workspace per project; sync pipeline |
| [ux-navigation-model.md](./ux-navigation-model.md) | Beyond the file explorer: intentions, graph, palette, risks, expert mode |
| [system-architecture.md](./system-architecture.md) | Layers, modules, agent gateway, OpenShell, events, Lapce |

**Author:** 4NK

**Related external docs**

- AnythingLLM Docker:
- Ollama:
- Lapce:
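Several documents above (the workspaces extension, the ia.enso proxy, and the 403 fix note) revolve around calling the AnythingLLM API with a Bearer token. As orientation, here is a minimal sketch of building such a request; the base URL is an assumed example for this deployment, and `REPLACE_WITH_UI_API_KEY` is a placeholder for a key generated in the AnythingLLM UI (API Keys page), not the nginx Ollama key.

```python
import urllib.request

# Assumed example values; substitute your deployment's proxy URL and a key
# created in the AnythingLLM UI under "API Keys" (the nginx Ollama key will
# be rejected with a 403 — see the fixKnowledge note above).
BASE_URL = "https://ia.enso.4nkweb.com"
API_KEY = "REPLACE_WITH_UI_API_KEY"

def build_workspaces_request(base_url: str, api_key: str) -> urllib.request.Request:
    """Build a GET request for the AnythingLLM workspaces list endpoint."""
    return urllib.request.Request(
        f"{base_url}/api/v1/workspaces",
        headers={"Authorization": f"Bearer {api_key}"},
    )

req = build_workspaces_request(BASE_URL, API_KEY)
print(req.full_url)
print(req.get_header("Authorization"))
```

Sending the request (e.g. with `urllib.request.urlopen(req)`) returns JSON listing the workspaces, one per project as described in `anythingllm-workspaces.md`.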