**Motivations:**
- Seed the AnythingLLM workspace from a cloned repo using gitignore-style filters

**Root causes:**
- N/A

**Fixes:**
- N/A

**Changes:**
- Template 4nkaiignore.default; the server copies it after clone; the extension uploads via POST /api/v1/document/upload
- New commands /workspace-sync; settings initialSync*; new dependency ignore

**Affected pages:**
- extensions/anythingllm-workspaces/*
- services/repos-devtools-server/*
- docs/features/initial-rag-sync-4nkaiignore.md
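The seeding step above filters the cloned tree through gitignore-style patterns before uploading. A minimal sketch of such a filter, assuming a reduced pattern subset (`*` wildcard, trailing `/` for directories, `!` negation); the example patterns are hypothetical stand-ins for the actual 4nkaiignore.default contents, and the real implementation uses the `ignore` dependency rather than this hand-rolled matcher:

```typescript
// Sketch: decide which cloned files to send to the workspace seed upload,
// using a reduced gitignore-style pattern subset. The patterns below are
// assumptions, not the real 4nkaiignore.default contents.

function toRegex(pattern: string): RegExp {
  // Escape regex metacharacters, then expand "*" to "anything except /"
  const escaped = pattern
    .replace(/[.+^${}()|[\]\\]/g, "\\$&")
    .replace(/\*/g, "[^/]*");
  // A trailing "/" means: this directory and everything under it
  if (pattern.endsWith("/")) return new RegExp(`^${escaped}`);
  return new RegExp(`^${escaped}$`);
}

function shouldUpload(path: string, patterns: string[]): boolean {
  let ignored = false;
  for (const raw of patterns) {
    const negated = raw.startsWith("!");
    const pat = negated ? raw.slice(1) : raw;
    // Last matching pattern wins, as in gitignore semantics
    if (toRegex(pat).test(path)) ignored = !negated;
  }
  return !ignored;
}

// Hypothetical patterns standing in for 4nkaiignore.default:
const patterns = ["node_modules/", "*.log", "!important.log"];

console.log(shouldUpload("src/index.ts", patterns));        // true
console.log(shouldUpload("node_modules/x/y.js", patterns)); // false
console.log(shouldUpload("debug.log", patterns));           // false
console.log(shouldUpload("important.log", patterns));       // true
```

Files that pass the filter would then be posted one by one to POST /api/v1/document/upload and attached to the project's workspace.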
smart_ide — documentation
Operational, architectural, and UX-design notes for the local-AI IDE initiative and the host tooling in this repository.
| Document | Content |
|---|---|
| ../README.md | Project overview (French): vision, Lapce, AnythingLLM per project |
| deployment-target.md | First target: Linux client + SSH remote server (AI stack + repos) |
| ia_dev-submodule.md | Git submodule ia_dev (clone, update, SSH URL) |
| lecoffre_ng-checkout.md | Plain clone lecoffre_ng next to smart_ide (/home/ncantu/code) |
| split-lecoffre-repos.md | Split monorepo into five Gitea repos (setup/split-lecoffre-ng-to-five-repos.sh) |
| infrastructure.md | Host inventory (LAN), SSH key workflow, host scripts |
| services.md | Ollama, AnythingLLM (Docker), Desktop installer, Ollama ↔ Docker |
| ../deploy/nginx/README-ia-enso.md | HTTPS proxy ia.enso.4nkweb.com → Ollama / AnythingLLM (Bearer auth, SSH script, troubleshooting) |
| ../extensions/anythingllm-workspaces/README.md | VS Code / Cursor extension: list AnythingLLM workspaces (API) and open the UI |
| features/anythingllm-vscode-extension.md | Feature note: AnythingLLM extension, impacts, implementation details |
| features/repos-devtools-server-and-dev-panel.md | Local repos API + dev-tools panel (clone, AnythingLLM workspace) |
| ../services/repos-devtools-server/README.md | Local HTTP server: clone/list/load under REPOS_DEVTOOLS_ROOT |
| fixKnowledge/anythingllm-extension-403-api-key.md | AnythingLLM API 403: nginx Ollama key vs key from the UI API Keys page |
| features/ia-enso-nginx-proxy-ollama-anythingllm.md | Feature note: goals, impacts, and details of the ia.enso reverse proxy |
| anythingllm-workspaces.md | One AnythingLLM workspace per project; sync pipeline |
| ux-navigation-model.md | Beyond file explorer: intentions, graph, palette, risks, expert mode |
| system-architecture.md | Layers, modules, agent gateway, OpenShell, events, Lapce |
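Several entries above (the 403 troubleshooting note, the workspaces extension) revolve around calling the AnythingLLM API with a Bearer key generated in the AnythingLLM UI, not the nginx/Ollama proxy key. A minimal sketch of such a call; the base URL, port, and environment variable names are assumptions:

```shell
#!/bin/sh
# Sketch: list AnythingLLM workspaces with the developer API key.
# ANYTHINGLLM_URL and ANYTHINGLLM_API_KEY are placeholder names; the key
# must be one generated in the AnythingLLM UI (API Keys), not the key used
# by the nginx/Ollama proxy, or the call returns 403.
ANYTHINGLLM_URL="${ANYTHINGLLM_URL:-http://localhost:3001}"

# Build the curl invocation (printed rather than executed, so the sketch
# can be inspected without a running server)
list_workspaces_cmd() {
  printf 'curl -sf -H "Authorization: Bearer %s" %s/api/v1/workspaces' \
    "$1" "$2"
}

list_workspaces_cmd "$ANYTHINGLLM_API_KEY" "$ANYTHINGLLM_URL"
```

Running the printed command against a live instance should return JSON listing the workspaces; a 403 response points back to the fixKnowledge note above.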
Author: 4NK
Related external docs
- AnythingLLM Docker: https://docs.anythingllm.com/installation-docker/local-docker
- Ollama: https://github.com/ollama/ollama/blob/main/docs/linux.md
- Lapce: https://lapce.dev/