**Motivations:**
- Simplify Cursor/custom clients; the nginx Bearer caused confusion with the Cursor user API key.

**Root causes:**
- N/A.

**Fixes:**
- Drop the `if`/`map` check and `Authorization` stripping on `/ollama/`; the deploy script no longer emits the Bearer `map`.

**Evolutions:**
- Optional Bearer documented in the `http-maps` example; README, services, feature, and infrastructure docs updated; proxy redeployed.

**Affected pages:**
- deploy/nginx/sites/ia.enso.4nkweb.com.conf
- deploy/nginx/deploy-ia-enso-to-proxy.sh
- deploy/nginx/README-ia-enso.md
- deploy/nginx/http-maps/ia-enso-ollama-bearer.map.conf.example
- docs/features/ia-enso-nginx-proxy-ollama-anythingllm.md
- docs/services.md
- docs/infrastructure.md
# Feature: Reverse proxy ia.enso.4nkweb.com for Ollama and AnythingLLM
**Author:** 4NK team

## Objective
Expose Ollama and AnythingLLM on the public proxy hostname over HTTPS, under the path prefixes `/ollama` and `/anythingllm`. **Default:** no nginx Bearer on `/ollama/` (an optional `map` in `http-maps/ia-enso-ollama-bearer.map.conf.example` re-enables it).
## Public URLs (HTTPS)
- AnythingLLM UI: `https://ia.enso.4nkweb.com/anythingllm/`
- Ollama native API (example): `https://ia.enso.4nkweb.com/ollama/api/tags`
- OpenAI-compatible base (Cursor): `https://ia.enso.4nkweb.com/ollama/v1`
## Impacts
- **Proxy (nginx):** `server_name`, TLS, and `location` blocks; `conf.d/ia-enso-http-maps.conf` holds the WebSocket `map` when deployed by the script (or a stub if a duplicate `map` already exists elsewhere).
- **Backend (192.168.1.164):** must accept connections from the proxy on `11434` and `3001`.
- **Clients:** Cursor can use `https://ia.enso.4nkweb.com/ollama/v1` without a matching nginx secret if Bearer is disabled; hostname may avoid private-IP SSRF blocks when DNS resolves publicly.
## Repository layout
| Path | Purpose |
|------|---------|
| `deploy/nginx/sites/ia.enso.4nkweb.com.conf` | `server` blocks; upstreams use `__IA_ENSO_BACKEND_IP__` (default `192.168.1.164`, substituted by `deploy-ia-enso-to-proxy.sh` or a manual `sed`) |
| `deploy/nginx/http-maps/ia-enso-ollama-bearer.map.conf.example` | **Optional** Bearer `map` + `location /ollama/` `if` to re-enable auth |
| `deploy/nginx/http-maps/websocket-connection.map.conf.example` | Example WebSocket `map` (manual install) |
| `deploy/nginx/deploy-ia-enso-to-proxy.sh` | SSH deploy: maps + site, `nginx -t`, reload; retries with a stub if the WebSocket `map` already exists |
| `deploy/nginx/sites/ia.enso.4nkweb.com.http-only.conf` | Temporary HTTP-only vhost for the first Let’s Encrypt `webroot` issuance when `live/ia.enso…` is missing |
| `deploy/nginx/README-ia-enso.md` | **Operator reference:** automated and manual steps, env vars, checks, troubleshooting, TLS bootstrap |
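The `__IA_ENSO_BACKEND_IP__` substitution performed by `deploy-ia-enso-to-proxy.sh` can be reproduced by hand with `sed`; a minimal sketch using an inline stand-in line (in practice the command runs over the site config file above):

```shell
# Demonstrate the placeholder substitution the deploy script performs.
# The template line here is a stand-in for the real site config.
template='proxy_pass http://__IA_ENSO_BACKEND_IP__:11434;'
rendered=$(printf '%s' "$template" | sed 's/__IA_ENSO_BACKEND_IP__/192.168.1.164/g')
echo "$rendered"  # → proxy_pass http://192.168.1.164:11434;
```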
## Deployment modalities
**Preferred:** run `./deploy/nginx/deploy-ia-enso-to-proxy.sh` from `smart_ide` on a host with SSH access (see `README-ia-enso.md`).
**Manual:** DNS → TLS (certbot) → WebSocket `map` if needed → install site → `nginx -t` → reload. Details: `deploy/nginx/README-ia-enso.md`.
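The manual sequence can be sketched as a shell function run on the proxy host; the webroot path and target directories below are assumptions, not taken from the README:

```shell
# Manual deployment sketch. The webroot path and nginx directories are
# assumptions; adapt to the actual proxy host layout.
manual_deploy() {
  # 1. TLS: first issuance via webroot (DNS must already point at the proxy)
  certbot certonly --webroot -w /var/www/html -d ia.enso.4nkweb.com
  # 2. Install the WebSocket map (if needed) and the site config
  install -m 644 http-maps/websocket-connection.map.conf.example \
    /etc/nginx/conf.d/websocket-connection.map.conf
  install -m 644 sites/ia.enso.4nkweb.com.conf /etc/nginx/sites-enabled/
  # 3. Validate and reload
  nginx -t && systemctl reload nginx
}
```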
When a host firewall is used on `192.168.1.164`, restrict the backend ports (`11434`, `3001`) to the proxy's source address.
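Such a restriction could look like the following `ufw` sketch; the proxy address `203.0.113.10` is a placeholder, not the real proxy IP:

```shell
# Hypothetical ufw rules on 192.168.1.164: allow only the proxy to reach
# Ollama (11434) and AnythingLLM (3001). The source IP is a placeholder.
restrict_backend_ports() {
  ufw allow from 203.0.113.10 to any port 11434 proto tcp
  ufw allow from 203.0.113.10 to any port 3001 proto tcp
  ufw deny 11434/tcp
  ufw deny 3001/tcp
}
```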
## Analysis modalities
- `curl` to `/ollama/v1/models` and `/ollama/api/tags` without `Authorization` (expect **200** when Bearer is off).
- Browser access to `/anythingllm/` and application login.
- Cursor connectivity; `ERROR_BAD_USER_API_KEY` may still be a Cursor client issue (see README forum link).
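The curl checks above can be wrapped in a small helper; the expected **200** responses assume the default (Bearer off) configuration:

```shell
# Smoke-check helper for the proxied Ollama endpoints. With the nginx
# Bearer disabled (the default), both requests should return HTTP 200.
BASE="https://ia.enso.4nkweb.com"

smoke_check() {
  curl -fsS -o /dev/null -w 'api/tags:  %{http_code}\n' "$BASE/ollama/api/tags"
  curl -fsS -o /dev/null -w 'v1/models: %{http_code}\n' "$BASE/ollama/v1/models"
}
```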
## Security notes
- **Default `/ollama/` is unauthenticated at nginx:** anyone who can reach the URL can call Ollama unless restricted by firewall or Ollama-level controls. Re-add Bearer using the example `map` if needed.
- AnythingLLM remains protected by **its own** application authentication.
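If the optional Bearer `map` is re-enabled, clients must send the matching token. A hedged sketch of the client side (the `OLLAMA_PROXY_TOKEN` variable name is illustrative; the real secret lives in the deployed `map` file):

```shell
# Call Ollama through the proxy with a Bearer token, for use when the
# optional nginx map is enabled. The token variable name is illustrative.
auth_check() {
  curl -fsS -H "Authorization: Bearer ${OLLAMA_PROXY_TOKEN:?set a token first}" \
    https://ia.enso.4nkweb.com/ollama/api/tags
}
```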