diff --git a/deploy/nginx/README-ia-enso.md b/deploy/nginx/README-ia-enso.md
index fe0f016..9b75a70 100644
--- a/deploy/nginx/README-ia-enso.md
+++ b/deploy/nginx/README-ia-enso.md
@@ -1,11 +1,21 @@
 # ia.enso.4nkweb.com — Nginx on the proxy (192.168.1.100)
 
-Reverse TLS proxy to the LAN host **`192.168.1.164`** (Ollama + AnythingLLM; adjust in `sites/ia.enso.4nkweb.com.conf` if the IP changes):
+Reverse TLS proxy to the LAN host **`192.168.1.164`** (Ollama + AnythingLLM; the IP is substituted at deploy time via `__IA_ENSO_BACKEND_IP__` / `IA_ENSO_BACKEND_IP`).
 
-| Public path | Backend | Port | Protection |
-|---------------|---------|------|------------|
-| `/ollama/` | Ollama API | `11434` | **Bearer** checked by nginx; `Authorization` header **removed** before reaching Ollama |
-| `/anythingllm/` | AnythingLLM | `3001` | AnythingLLM **application** auth (not the Ollama Bearer) |
+## Full public URLs (HTTPS)
+
+| Service | URL |
+|---------|-----|
+| **AnythingLLM** (UI) | `https://ia.enso.4nkweb.com/anythingllm/` |
+| **Ollama** native API (e.g. model list) | `https://ia.enso.4nkweb.com/ollama/api/tags` |
+| **Ollama** OpenAI-compatible API (Cursor, etc.) | base URL `https://ia.enso.4nkweb.com/ollama/v1` — e.g. `https://ia.enso.4nkweb.com/ollama/v1/models` |
+
+**nginx Bearer**: everything under `/ollama/` requires `Authorization: Bearer <secret>` (unless you change the `map`). The secret is **not** forwarded downstream to Ollama. AnythingLLM under `/anythingllm/` uses the **application** auth, not this Bearer.
+
+| Path (relative) | Backend | LAN port | Protection |
+|------------------|---------|----------|------------|
+| `/ollama/` | Ollama | `11434` | nginx **Bearer**, then `Authorization` cleared toward Ollama |
+| `/anythingllm/` | AnythingLLM | `3001` | AnythingLLM login |
 
 **Cursor context:** a private-IP URL (e.g. `http://192.168.1.164:11434`) may be rejected by Cursor (`ssrf_blocked`).
 A public HTTPS **hostname** pointing at the proxy avoids this block, provided the name does not resolve to an RFC1918 IP from the Internet.
diff --git a/deploy/nginx/deploy-ia-enso-to-proxy.sh b/deploy/nginx/deploy-ia-enso-to-proxy.sh
index 862737c..eaaf12c 100755
--- a/deploy/nginx/deploy-ia-enso-to-proxy.sh
+++ b/deploy/nginx/deploy-ia-enso-to-proxy.sh
@@ -123,4 +123,7 @@ if ! try_install 1; then
   fi
 fi
 
-echo "Done. Cursor: base URL https://ia.enso.4nkweb.com/ollama/v1 and API key = token printed above."
+echo "Done. Public URLs:"
+echo "  AnythingLLM:        https://ia.enso.4nkweb.com/anythingllm/"
+echo "  Ollama API:         https://ia.enso.4nkweb.com/ollama/api/tags (native) — Bearer required"
+echo "  Cursor/OpenAI base: https://ia.enso.4nkweb.com/ollama/v1 — API key = Bearer secret (see token above if generated)."
diff --git a/deploy/nginx/sites/ia.enso.4nkweb.com.conf b/deploy/nginx/sites/ia.enso.4nkweb.com.conf
index b10c44a..97544a6 100644
--- a/deploy/nginx/sites/ia.enso.4nkweb.com.conf
+++ b/deploy/nginx/sites/ia.enso.4nkweb.com.conf
@@ -1,5 +1,11 @@
 # ia.enso.4nkweb.com — reverse proxy to LAN host (Ollama + AnythingLLM).
 #
+# Public HTTPS URLs (after TLS + nginx reload):
+#   AnythingLLM UI:     https://ia.enso.4nkweb.com/anythingllm/
+#   Ollama OpenAI API:  https://ia.enso.4nkweb.com/ollama/v1/  (e.g. .../v1/models, .../v1/chat/completions)
+#   Ollama native API:  https://ia.enso.4nkweb.com/ollama/api/tags  (and other /api/* paths)
+# /ollama/* requires Authorization: Bearer <secret> at nginx (see map); Cursor base URL: .../ollama/v1
+#
 # Prerequisites on the proxy host:
 # - TLS certificate for ia.enso.4nkweb.com (e.g. certbot).
 # - In the main nginx `http { }` block, include the Bearer map (see http-maps/ia-enso-ollama-bearer.map.conf.example).
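The Bearer map itself is only referenced above, not shown. As an illustration, a minimal sketch of such a map and the matching location check might look like the following; the variable name, the `CHANGE_ME` secret, and the exact directives are assumptions, not the repository's actual `http-maps/ia-enso-ollama-bearer.map.conf.example`:

```nginx
# Hypothetical sketch of the Bearer gate; adapt names to the real
# http-maps/ia-enso-ollama-bearer.map.conf.example.
# Goes in the main http { } block:
map $http_authorization $ia_enso_ollama_ok {
    default            0;
    "Bearer CHANGE_ME" 1;   # the secret Cursor uses as its API key
}

# Inside the ia.enso.4nkweb.com server block:
location /ollama/ {
    if ($ia_enso_ollama_ok = 0) { return 401; }
    proxy_set_header Authorization "";          # never forward the secret to Ollama
    proxy_pass http://192.168.1.164:11434/;     # LAN backend (IP substituted at deploy time)
}
```

This mirrors the behavior described in the README table: the Bearer is validated at the proxy and the `Authorization` header is cleared before the request reaches Ollama.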
diff --git a/docs/features/ia-enso-nginx-proxy-ollama-anythingllm.md b/docs/features/ia-enso-nginx-proxy-ollama-anythingllm.md
index 9d86850..6290760 100644
--- a/docs/features/ia-enso-nginx-proxy-ollama-anythingllm.md
+++ b/docs/features/ia-enso-nginx-proxy-ollama-anythingllm.md
@@ -6,6 +6,12 @@
 Expose Ollama and AnythingLLM on the public proxy hostname with HTTPS, path prefixes `/ollama` and `/anythingllm`, and **gate Ollama** with a **Bearer token** checked at the proxy (compatible with Cursor’s OpenAI base URL + API key).
 
+## Public URLs (HTTPS)
+
+- AnythingLLM UI: `https://ia.enso.4nkweb.com/anythingllm/`
+- Ollama native API (example): `https://ia.enso.4nkweb.com/ollama/api/tags` — `Authorization: Bearer <secret>` at nginx
+- OpenAI-compatible base (Cursor): `https://ia.enso.4nkweb.com/ollama/v1`
+
 ## Impacts
 
 - **Proxy (nginx):** new `server_name`, TLS, locations, HTTP `map` for Bearer validation; maps deployed under `/etc/nginx/conf.d/` when using the provided script.
diff --git a/docs/services.md b/docs/services.md
index 08cf78b..01e85ba 100644
--- a/docs/services.md
+++ b/docs/services.md
@@ -99,9 +99,13 @@
 The last command must succeed after `OLLAMA_HOST=0.0.0.0:11434` and `host.docker.internal` are in place.
 
 ## Public reverse proxy (ia.enso.4nkweb.com)
 
-When Ollama runs on a LAN host (e.g. `192.168.1.164` in `deploy/nginx/sites/ia.enso.4nkweb.com.conf`) and must be reached via the **proxy** with HTTPS and a **Bearer** gate (for clients such as Cursor that block private IPs), use `deploy/nginx/` and **[deploy/nginx/README-ia-enso.md](../deploy/nginx/README-ia-enso.md)** (script `deploy-ia-enso-to-proxy.sh`, checks, troubleshooting).
+When Ollama runs on a LAN host (e.g. `192.168.1.164` via `IA_ENSO_BACKEND_IP` / `deploy/nginx/sites/ia.enso.4nkweb.com.conf`) and must be reached via the **proxy** with HTTPS and a **Bearer** gate (for clients such as Cursor that block private IPs), use `deploy/nginx/` and **[deploy/nginx/README-ia-enso.md](../deploy/nginx/README-ia-enso.md)** (script `deploy-ia-enso-to-proxy.sh`, checks, troubleshooting).
 
-- Cursor base URL: `https://ia.enso.4nkweb.com/ollama/v1`
+**Full URLs**
+
+- AnythingLLM UI: `https://ia.enso.4nkweb.com/anythingllm/`
+- Ollama native API example: `https://ia.enso.4nkweb.com/ollama/api/tags` (header `Authorization: Bearer <secret>`)
+- Cursor / OpenAI-compatible base URL: `https://ia.enso.4nkweb.com/ollama/v1`
 - Cursor API key: same value as the Bearer secret configured on the proxy
 
 Feature note: [ia-enso-nginx-proxy-ollama-anythingllm.md](./features/ia-enso-nginx-proxy-ollama-anythingllm.md).
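The access rules documented above reduce to a plain string comparison at the proxy. The following shell sketch emulates what the nginx `map` plus location check do for `/ollama/` requests; `EXPECTED_SECRET` and the echoed messages are hypothetical placeholders, not values from the repository:

```shell
# Emulation of the proxy's decision for a request under /ollama/:
# accept only the exact configured Bearer, and note that nginx clears
# the Authorization header before proxying to Ollama on the LAN.
EXPECTED_SECRET="CHANGE_ME"   # hypothetical; matches the nginx map entry

gate_ollama() {
  # $1: value of the incoming Authorization header
  if [ "$1" = "Bearer ${EXPECTED_SECRET}" ]; then
    echo "proxy to 192.168.1.164:11434 (Authorization stripped)"
  else
    echo "401"
  fi
}

gate_ollama "Bearer CHANGE_ME"   # accepted, forwarded without the secret
gate_ollama "Bearer wrong"       # rejected at the proxy
```

In practice this is why Cursor works with base URL `https://ia.enso.4nkweb.com/ollama/v1` and the Bearer secret as its API key, while AnythingLLM under `/anythingllm/` is reached without any Bearer at all.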