Nicolas Cantu 564b9d5576 AnythingLLM extension: clarify API key vs nginx bearer, normalize Bearer prefix
**Motivations:**
- 403 `No valid api key` when users paste the Ollama nginx secret into the extension

**Root causes:**
- AnythingLLM validates keys only from its DB; nginx Bearer is unrelated

**Fixes:**
- README and fixKnowledge doc; strip optional Bearer prefix in client

**Changes:**
- Extension version 0.1.1

**Affected pages:**
- extensions/anythingllm-workspaces/*
- docs/fixKnowledge/anythingllm-extension-403-api-key.md
- docs/README.md
2026-03-23 14:23:09 +01:00

AnythingLLM Workspaces (VS Code / Cursor extension)

Minimal extension to call the AnythingLLM developer API and open a workspace in the browser.

Prerequisites

  • AnythingLLM reachable at your public base URL (e.g. https://ia.enso.4nkweb.com/anythingllm).
  • An API key created in AnythingLLM: Settings → API Keys.

Do not use the nginx Bearer secret for /ollama/ (see deploy/nginx/README-ia-enso.md). That value only authenticates requests at the Ollama reverse proxy. AnythingLLM validates API keys against its own database, so any other secret yields a 403 with {"error":"No valid api key found."}.
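To make the distinction concrete, here is a minimal sketch of the request the extension sends to the developer API. The helper name and base URL are illustrative, not the extension's actual code; only a key created in AnythingLLM Settings → API Keys will pass validation, since nginx never sees the /anythingllm/ Bearer value.

```typescript
// Hypothetical helper showing which header the AnythingLLM developer API
// expects. The upstream routes are mounted under /api, so the workspaces
// endpoint is <baseUrl>/api/v1/workspaces.
function workspacesRequest(baseUrl: string, apiKey: string) {
  return {
    url: `${baseUrl}/api/v1/workspaces`,
    headers: {
      Authorization: `Bearer ${apiKey}`, // key from Settings → API Keys
      Accept: "application/json",
    },
  };
}

// Example (base URL taken from this README):
const req = workspacesRequest("https://ia.enso.4nkweb.com/anythingllm", "YOUR-KEY");
// Pass req.url and req.headers to fetch(); a 403 with
// {"error":"No valid api key found."} means the key is not in AnythingLLM's DB.
```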

Configuration

| Setting | Description |
| --- | --- |
| `anythingllm.baseUrl` | Base URL without a trailing slash (the default matches deploy/nginx/README-ia-enso.md). |
| `anythingllm.apiKey` | Secret from AnythingLLM Settings → API Keys (an optional leading Bearer prefix is stripped). Set it in User settings to avoid committing secrets. |
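The "optional leading Bearer is stripped" behavior can be sketched as follows; the helper name is illustrative, not the extension's actual code.

```typescript
// Normalize a pasted API key: users sometimes copy the full
// "Bearer <key>" header value, so strip that prefix case-insensitively
// and trim surrounding whitespace before building the Authorization header.
function normalizeApiKey(raw: string): string {
  return raw.trim().replace(/^Bearer\s+/i, "");
}
```

With this, both `abc123` and `Bearer abc123` in settings produce the same `Authorization: Bearer abc123` header.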

Commands

  • AnythingLLM: List workspaces — Fetches workspaces, then opens the selected one in the default browser (/workspace/<slug> under your base URL).
  • AnythingLLM: Open web UI — Opens the AnythingLLM base URL.
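The URL the "List workspaces" command opens can be sketched as below; the trailing-slash handling and URL-encoding are assumptions, not taken from the extension source.

```typescript
// Build the browser URL for a workspace: /workspace/<slug> under the
// configured base URL. Trailing slashes on baseUrl are dropped so the
// result never contains "//".
function workspaceUrl(baseUrl: string, slug: string): string {
  return `${baseUrl.replace(/\/+$/, "")}/workspace/${encodeURIComponent(slug)}`;
}
```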

Ollama

This extension targets AnythingLLM only. For OpenAI-compatible Ollama behind the same proxy, point Cursor's model settings at https://ia.enso.4nkweb.com/ollama/v1 and use the nginx Bearer secret (see deploy/nginx/README-ia-enso.md).
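For contrast with the AnythingLLM key above, here is a sketch of an OpenAI-compatible chat request to Ollama behind the proxy. The Bearer value here is the nginx secret from deploy/nginx/README-ia-enso.md, not an AnythingLLM API key; the model name and helper are placeholders.

```typescript
// Build an OpenAI-compatible chat completion request against the
// /ollama/v1 proxy path. The Authorization header is checked by nginx,
// not by Ollama or AnythingLLM.
function ollamaChatRequest(nginxSecret: string, model: string, prompt: string) {
  return {
    url: "https://ia.enso.4nkweb.com/ollama/v1/chat/completions",
    method: "POST" as const,
    headers: {
      Authorization: `Bearer ${nginxSecret}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model,
      messages: [{ role: "user", content: prompt }],
    }),
  };
}
```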

Build

cd extensions/anythingllm-workspaces
npm install
npm run compile

Open the folder in VS Code / Cursor and start it with the Run Extension launch configuration, or install the .vsix produced by vsce package.

API reference

Upstream routes are mounted under /api: GET /v1/workspaces — see server/endpoints/api/workspace/index.js in Mintplex-Labs/anything-llm.
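An assumed shape for the GET /v1/workspaces response, based on the upstream endpoint referenced above; the field list is illustrative, not exhaustive, and the guard is a hypothetical helper for defensive parsing.

```typescript
// Assumed response shape: an object with a "workspaces" array whose
// entries at least carry a slug (used to build /workspace/<slug> URLs).
interface Workspace {
  id: number;
  name: string;
  slug: string;
}

interface WorkspacesResponse {
  workspaces: Workspace[];
}

// Runtime guard so a surprising response fails loudly instead of
// producing broken browser URLs.
function isWorkspacesResponse(v: any): v is WorkspacesResponse {
  return (
    Array.isArray(v?.workspaces) &&
    v.workspaces.every((w: any) => typeof w?.slug === "string")
  );
}
```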