**Motivations:**
- Apply the proxy configuration from a workstation without manual scp steps.

**Root causes:**
- No automation existed for pushing nginx files to 192.168.1.100.

**Fixes:**
- N/A.

**Enhancements:**
- `deploy-ia-enso-to-proxy.sh` with ProxyJump support, an optional generated Bearer token, and a retry path that avoids duplicating the websocket map.

**Affected files:**
- `deploy/nginx/deploy-ia-enso-to-proxy.sh`
- `deploy/nginx/README-ia-enso.md`
- `docs/features/ia-enso-nginx-proxy-ollama-anythingllm.md`
Feature: Reverse proxy ia.enso.4nkweb.com for Ollama and AnythingLLM
Author: 4NK team
Objective
Expose Ollama and AnythingLLM on the public proxy hostname over HTTPS, under the path prefixes /ollama and /anythingllm, and gate Ollama behind a Bearer token validated at the proxy (compatible with Cursor's OpenAI base URL + API key configuration).
Impacts
- Proxy (nginx): new `server_name`, TLS, locations, and an HTTP `map` for Bearer validation; optional new includes under `/etc/nginx/http-maps/`.
- Backend (192.168.1.164): must accept connections from the proxy on `11434` and `3001`; Ollama must not rely on the client `Authorization` header (nginx clears it after validation).
- Clients: Cursor uses `https://ia.enso.4nkweb.com/ollama/v1` and the shared secret as the API key; this avoids private-IP SSRF blocks in Cursor when the hostname resolves publicly.
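The Bearer gate described above can be sketched as the following nginx fragments. This is a minimal illustration, not the shipped configuration: the `REPLACE_WITH_SECRET` placeholder, port numbers, and location layout are assumptions drawn from this document; the authoritative files live under `deploy/nginx/` in the repository.

```nginx
# Sketch of /etc/nginx/http-maps/ia-enso-ollama-bearer.map.conf (included inside http { })
map $http_authorization $ia_enso_ollama_authorized {
    default                       0;
    "Bearer REPLACE_WITH_SECRET"  1;
}

# Sketch of the Ollama location inside the ia.enso.4nkweb.com server block
location /ollama/ {
    if ($ia_enso_ollama_authorized = 0) {
        return 401;
    }
    proxy_set_header Authorization "";   # clear the client header after validation
    proxy_pass http://192.168.1.164:11434/;
}
```

Clearing `Authorization` before `proxy_pass` is why the backend must not depend on that header: the secret never reaches Ollama.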
Modifications (repository)
- `deploy/nginx/http-maps/ia-enso-ollama-bearer.map.conf.example` — `map` for `$ia_enso_ollama_authorized`.
- `deploy/nginx/http-maps/websocket-connection.map.conf.example` — `map` for `$connection_upgrade` (AnythingLLM WebSocket).
- `deploy/nginx/sites/ia.enso.4nkweb.com.conf` — `server` blocks and upstreams.
- `deploy/nginx/deploy-ia-enso-to-proxy.sh` — push maps + site over SSH, `nginx -t`, reload (Bearer-only retry if the websocket `map` already exists).
- `deploy/nginx/README-ia-enso.md` — installation and verification on the proxy.
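The deploy script's overall flow can be sketched as below. This is a simplified dry-run sketch, not the actual `deploy-ia-enso-to-proxy.sh`: the SSH target, destination paths, and reload command are assumptions, and the sketch prints each command instead of executing it.

```shell
#!/usr/bin/env sh
# Dry-run sketch of the deploy flow: push maps + site, then test and reload nginx.
PROXY_HOST="root@192.168.1.100"               # assumed SSH target for the proxy
MAPS="deploy/nginx/http-maps"
SITE="deploy/nginx/sites/ia.enso.4nkweb.com.conf"

run() { echo "+ $*"; }                        # dry run: replace the echo with "$@" to execute

run scp "$MAPS/ia-enso-ollama-bearer.map.conf.example" \
    "$PROXY_HOST:/etc/nginx/http-maps/ia-enso-ollama-bearer.map.conf"
run scp "$MAPS/websocket-connection.map.conf.example" \
    "$PROXY_HOST:/etc/nginx/http-maps/websocket-connection.map.conf"
run scp "$SITE" "$PROXY_HOST:/etc/nginx/sites-available/ia.enso.4nkweb.com.conf"
run ssh "$PROXY_HOST" "nginx -t && systemctl reload nginx"
```

The real script adds ProxyJump handling and the Bearer-only retry when the websocket map is already installed on the proxy.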
Deployment modalities
- DNS for `ia.enso.4nkweb.com` points to the proxy entry used for HTTPS.
- Obtain TLS certificates (e.g. certbot) for that name.
- Install the map files under `/etc/nginx/http-maps/`, set the Bearer secret, and include the maps inside `http { }`.
- Install the site file under `sites-available`/`sites-enabled`, run `nginx -t`, and reload nginx.
- Restrict backend ports at the firewall to the proxy source where applicable.
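Setting the Bearer secret (third step above) can be sketched as follows. This is a hypothetical helper assuming `openssl` is available; it writes to `/tmp/http-maps` as a stand-in for `/etc/nginx/http-maps/`, and the map file name matches the example shipped in the repository.

```shell
#!/usr/bin/env sh
# Generate a random Bearer secret and render the map file from scratch.
SECRET=$(openssl rand -hex 32)
mkdir -p /tmp/http-maps                      # stand-in for /etc/nginx/http-maps

cat > /tmp/http-maps/ia-enso-ollama-bearer.map.conf <<EOF
map \$http_authorization \$ia_enso_ollama_authorized {
    default            0;
    "Bearer $SECRET"   1;
}
EOF

echo "Configure clients with API key: $SECRET"
```

The same secret must then be entered as the API key in Cursor (and any other client); rotating it means regenerating the map file and updating clients together.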
Analysis modalities
- `curl` to `/ollama/v1/models` with and without `Authorization: Bearer <secret>` (expect 200 / 401).
- Browser access to `/anythingllm/` and application login.
- Cursor connectivity after the configuration change (no `ssrf_blocked` if the hostname resolves to a public IP from Cursor's perspective).
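The curl checks above can be scripted as follows. Shown as a dry run that prints the commands (drop the leading `echo` to execute them once DNS and TLS are live); `SECRET` is a placeholder for the value installed in the map file.

```shell
#!/usr/bin/env sh
BASE="https://ia.enso.4nkweb.com"
SECRET="REPLACE_WITH_SECRET"

# Without the token: expect HTTP 401 from the proxy.
echo curl -s -o /dev/null -w '%{http_code}' "$BASE/ollama/v1/models"

# With the token: expect HTTP 200 from Ollama behind the proxy.
echo curl -s -o /dev/null -w '%{http_code}' \
    -H "Authorization: Bearer $SECRET" "$BASE/ollama/v1/models"

# AnythingLLM UI (its own application login applies).
echo curl -s -o /dev/null -w '%{http_code}' "$BASE/anythingllm/"
```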
Security notes
- The Bearer secret is equivalent to an API key; rotate by updating the map file and client configs together.
- AnythingLLM remains protected by its own application authentication; the `/anythingllm` location does not add the Ollama Bearer gate.