# AnythingLLM Workspaces (VS Code / Cursor extension)

Minimal extension to call the **AnythingLLM developer API** and open a workspace in the browser.

## Prerequisites

- AnythingLLM reachable at your public base URL (e.g. `https://ia.enso.4nkweb.com/anythingllm`).
- An **API key** created in AnythingLLM: **Settings → API Keys**.

**Do not** use the **nginx Bearer secret** for `/ollama/` (see `deploy/nginx/README-ia-enso.md`). That value is only for the Ollama reverse proxy. AnythingLLM validates API keys against its **own** database; a wrong secret yields `403` with `{"error":"No valid api key found."}`.

## Configuration

| Setting | Description |
|--------|-------------|
| `anythingllm.baseUrl` | Base URL without a trailing slash (default matches `deploy/nginx/README-ia-enso.md`). |
| `anythingllm.apiKey` | Secret from AnythingLLM **Settings → API Keys** (an optional leading `Bearer ` is stripped). Use **User** settings to avoid committing secrets. |

## Commands

- **AnythingLLM: List workspaces** — Fetches workspaces, then opens the selected one in the default browser (`/workspace/` under your base URL).
- **AnythingLLM: Open web UI** — Opens the AnythingLLM base URL.

## Ollama

This extension targets **AnythingLLM** only. For OpenAI-compatible Ollama behind the same proxy, use Cursor’s model settings with `https://ia.enso.4nkweb.com/ollama/v1` and the nginx Bearer secret (see `deploy/nginx/README-ia-enso.md`).

## Build

```bash
cd extensions/anythingllm-workspaces
npm install
npm run compile
```

Load the folder in VS Code / Cursor with **Run Extension**, or install the packaged `.vsix` after `vsce package`.

## API reference

Upstream routes (mounted under `/api`): `GET /v1/workspaces` — see Mintplex-Labs anything-llm, `server/endpoints/api/workspace/index.js`.
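The Configuration table notes that an optional leading `Bearer ` on `anythingllm.apiKey` is stripped. A minimal sketch of that normalization, assuming the key is ultimately sent as an `Authorization: Bearer <key>` header (the helper names here are illustrative, not the extension's actual API):

```typescript
// Sketch: normalize the configured API key and build the Authorization
// header value. An optional leading "Bearer " prefix is stripped, matching
// the behavior described for the anythingllm.apiKey setting. Helper names
// (normalizeApiKey, authorizationHeader) are hypothetical.
function normalizeApiKey(raw: string): string {
  // Trim whitespace, then drop a leading "Bearer " if present (case-insensitive).
  return raw.trim().replace(/^Bearer\s+/i, "");
}

function authorizationHeader(raw: string): string {
  return `Bearer ${normalizeApiKey(raw)}`;
}

// Both forms of the setting produce the same header value:
console.log(authorizationHeader("Bearer ABC-123")); // Bearer ABC-123
console.log(authorizationHeader("ABC-123"));        // Bearer ABC-123
```

Accepting either form avoids the common pitfall of users pasting the full header value (with the `Bearer ` prefix) into the setting and getting a doubled prefix.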
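The **List workspaces** command opens the selected workspace at `/workspace/` under the configured base URL. A sketch of how that URL could be assembled, assuming workspaces are identified by a slug (the function name and slug value are illustrative):

```typescript
// Sketch: build the browser URL for a workspace from the configured base URL
// and a workspace slug. The anythingllm.baseUrl setting is documented as
// having no trailing slash, but a stray one is tolerated here. The helper
// name (workspaceUrl) is hypothetical.
function workspaceUrl(baseUrl: string, slug: string): string {
  // Strip any trailing slashes so we never emit "//workspace/...".
  const base = baseUrl.replace(/\/+$/, "");
  return `${base}/workspace/${encodeURIComponent(slug)}`;
}

console.log(workspaceUrl("https://ia.enso.4nkweb.com/anythingllm", "my-notes"));
// → https://ia.enso.4nkweb.com/anythingllm/workspace/my-notes
```

`encodeURIComponent` guards against slugs containing characters that are not URL-safe; if the API guarantees URL-safe slugs, plain string concatenation would also work.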