**Motivations:**

- Expose Ollama and AnythingLLM via HTTPS paths on the LAN proxy, with Bearer auth for Ollama.

**Root causes:**

- Cursor blocks direct requests to private IPs (SSRF policy).

**Fixes:**

- N/A (new configuration artifacts).

**Evolutions:**

- Nginx site template, HTTP map for Bearer validation, WebSocket map example, deployment README, services doc link, feature documentation.

**Affected pages:**

- deploy/nginx/http-maps/ia-enso-ollama-bearer.map.conf.example
- deploy/nginx/http-maps/websocket-connection.map.conf.example
- deploy/nginx/sites/ia.enso.4nkweb.com.conf
- deploy/nginx/README-ia-enso.md
- docs/features/ia-enso-nginx-proxy-ollama-anythingllm.md
- docs/services.md
# Feature: Reverse proxy ia.enso.4nkweb.com for Ollama and AnythingLLM
**Author:** 4NK team
## Objective
Expose Ollama and AnythingLLM on the public proxy hostname with HTTPS, path prefixes `/ollama` and `/anythingllm`, and **gate Ollama** with a **Bearer token** checked at the proxy (compatible with Cursor’s OpenAI base URL + API key).
## Impacts
- **Proxy (nginx):** new `server_name`, TLS, locations, HTTP `map` for Bearer validation; optional new includes under `/etc/nginx/http-maps/`.
- **Backend (192.168.1.164):** must accept connections from the proxy on `11434` and `3001`; Ollama must not rely on the client `Authorization` header (nginx clears it after validation).
- **Clients:** Cursor uses `https://ia.enso.4nkweb.com/ollama/v1` and the shared secret as API key; avoids private-IP SSRF blocks in Cursor when the hostname resolves publicly.
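The Ollama gate described above can be sketched as a `location` block on the proxy. This is a minimal illustration, not the shipped site file: the backend address comes from this document, while the exact directive set in `deploy/nginx/sites/ia.enso.4nkweb.com.conf` may differ. `$ia_enso_ollama_authorized` is the variable produced by the Bearer map include.

```nginx
# Sketch of the /ollama location on the proxy (illustrative, not the shipped file).
location /ollama/ {
    # Reject requests whose Authorization header did not match the Bearer map.
    if ($ia_enso_ollama_authorized = 0) {
        return 401;
    }
    # Clear the header so the backend never sees the shared secret
    # (Ollama must not rely on client Authorization, as noted above).
    proxy_set_header Authorization "";
    proxy_http_version 1.1;
    proxy_set_header Host $host;
    # Trailing slash on proxy_pass strips the /ollama/ prefix before forwarding.
    proxy_pass http://192.168.1.164:11434/;
}
```

With this shape, Cursor's base URL `https://ia.enso.4nkweb.com/ollama/v1` maps to `http://192.168.1.164:11434/v1` on the backend.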
## Modifications (repository)
- `deploy/nginx/http-maps/ia-enso-ollama-bearer.map.conf.example` — `map` for `$ia_enso_ollama_authorized`.
- `deploy/nginx/http-maps/websocket-connection.map.conf.example` — `map` for `$connection_upgrade` (AnythingLLM WebSocket).
- `deploy/nginx/sites/ia.enso.4nkweb.com.conf` — `server` blocks and upstreams.
- `deploy/nginx/README-ia-enso.md` — installation and verification on the proxy.
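The two map examples could look like the following minimal sketch; the secret value is a placeholder to be replaced at install time, and the WebSocket map is the standard nginx upgrade idiom.

```nginx
# ia-enso-ollama-bearer.map.conf.example — sketch.
# Exact-match the Authorization header; "change-me" is a placeholder secret.
map $http_authorization $ia_enso_ollama_authorized {
    default            0;
    "Bearer change-me" 1;
}

# websocket-connection.map.conf.example — sketch.
# Standard idiom for proxying WebSocket upgrades (used by AnythingLLM).
map $http_upgrade $connection_upgrade {
    default upgrade;
    ""      close;
}
```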
## Deployment steps
1. Point DNS for `ia.enso.4nkweb.com` at the proxy host that terminates HTTPS.
2. Obtain TLS certificates (e.g. certbot) for that name.
3. Install map files under `/etc/nginx/http-maps/`, set the Bearer secret, include maps inside `http { }`.
4. Install the site file under `sites-available` / `sites-enabled`, `nginx -t`, reload nginx.
5. Restrict backend ports at the firewall to the proxy source where applicable.
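Assuming a Debian-style nginx layout, step 3 amounts to something like the fragment below; `map` directives are only valid in the `http { }` context, so the includes must sit there, not inside a `server` block.

```nginx
# Sketch of the http-level includes in /etc/nginx/nginx.conf (Debian-style layout assumed).
http {
    # Maps must be declared at http level, before the server blocks that use them.
    include /etc/nginx/http-maps/*.conf;
    include /etc/nginx/sites-enabled/*;
    # ... existing http configuration ...
}
```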
## Verification
- `curl` to `/ollama/v1/models` with and without `Authorization: Bearer <secret>`; expect 200 with the token and 401 without.
- Browser access to `/anythingllm/` and application login.
- Cursor connectivity after configuration change (no `ssrf_blocked` if hostname resolves to a public IP from Cursor’s perspective).
## Security notes
- The Bearer secret is equivalent to an API key; rotate by updating the map file and client configs together.
- AnythingLLM remains protected by **its own** application authentication; the `/anythingllm` location does not add the Ollama Bearer gate.