Generic project config, deploy scripts, gitea-issues, no reverse dependency

**Motivations:**
- Single config file per project (projects/<id>/conf.json)
- The project must not depend on ia_dev; only ia_dev calls into the project

**Root causes:**
- N/A (evolution)

**Fixes:**
- N/A

**Evolutions:**
- lib/project_config.sh: resolve PROJECT_SLUG from IA_PROJECT, .ia_project, ai_project_id; PROJECT_CONFIG_PATH = projects/<id>/conf.json
- projects/lecoffreio.json moved to projects/lecoffreio/conf.json; projects/ia_dev/conf.json added
- deploy: branch-align, bump-version, change-to-all-branches, pousse, deploy-by-script-to use PROJECT_ROOT/IA_DEV_ROOT and project_config.sh; SCRIPT_REAL for symlink-safe paths
- deploy/_lib: shared colors, env-map, ssh, git-flow
- gitea-issues: mail list/mark-read/get-thread/send-reply, thread log, agent-loop, wiki scripts; lib.sh loads project config
- README: principle no dependency from host project; invoke ./ia_dev/deploy/bump-version.sh etc. from repo root

**Affected pages:**
- README.md, projects/README.md, lib/, deploy/, gitea-issues/, projects/lecoffreio/, projects/ia_dev/
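The project-id resolution described in the evolutions above (IA_PROJECT, then `.ia_project`, then `ai_project_id`, yielding `projects/<id>/conf.json`) could be sketched as follows; the function name and the exact whitespace trimming are illustrative, only the lookup order comes from this commit:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Hypothetical sketch of lib/project_config.sh's lookup order.
resolve_project_slug() {
  local root="${1:-.}"
  local slug=""
  if [[ -n "${IA_PROJECT:-}" ]]; then
    # 1. Environment variable wins
    slug="$IA_PROJECT"
  elif [[ -f "$root/.ia_project" ]]; then
    # 2. Marker file at the host repo root
    slug="$(tr -d '[:space:]' < "$root/.ia_project")"
  elif [[ -f "$root/ai_project_id" ]]; then
    # 3. Legacy marker file
    slug="$(tr -d '[:space:]' < "$root/ai_project_id")"
  fi
  [[ -n "$slug" ]] && echo "$slug"
}

# The config path then derives from the slug:
# PROJECT_CONFIG_PATH="ia_dev/projects/$(resolve_project_slug .)/conf.json"
IA_PROJECT=demo resolve_project_slug .   # prints: demo
```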
Nicolas Cantu 2026-03-12 22:35:15 +01:00
parent 07e0341c1d
commit 907807f4d6
46 changed files with 2839 additions and 38 deletions

.editorconfig (new file, 40 lines)
@@ -0,0 +1,40 @@
# EditorConfig is awesome: https://EditorConfig.org
# top-most EditorConfig file
root = true
# Unix-style newlines with a newline ending every file
[*]
charset = utf-8
end_of_line = lf
insert_final_newline = true
trim_trailing_whitespace = true
# TypeScript/JavaScript files
[*.{ts,tsx,js,jsx}]
indent_style = tab
indent_size = 2
# JSON files
[*.json]
indent_style = tab
indent_size = 2
# YAML files
[*.{yml,yaml}]
indent_style = space
indent_size = 2
# Markdown files
[*.md]
trim_trailing_whitespace = false
# Shell scripts
[*.sh]
indent_style = tab
indent_size = 2
# Prisma schema
[*.prisma]
indent_style = space
indent_size = 2

README.md
@@ -2,9 +2,26 @@
AI-driven steering repository for projects (rules, agents, deployment and push scripts).
**Principle**: the host project must have **no dependency** on ia_dev (no project script calls ia_dev). Only ia_dev, depending on its configuration (`.ia_project`, `ai_project_id`, `projects/<id>/conf.json`), calls into the project (reading its config, invoking `deploy/scripts_v2/deploy.sh`, etc.).
## Usage
- **As a submodule**: this repository is included as a Git submodule in each project. Project-specific settings live in `projects/<slug>.json`. The host project sets the slug via a `.ia_project` file at its root or via the `IA_PROJECT` environment variable.
- **Scripts**: run them from the root of the project repository (e.g. `./ia_dev/deploy/pousse.sh`, or `./deploy/pousse.sh` if `deploy` is a link to `ia_dev/deploy`).
See `projects/README.md` for the configuration schema and examples.
## Execution directories
Scripts are invoked from the **host repository root**. They change to it (or re-exec from it) before continuing.
- **deploy/**: `PROJECT_ROOT` = git toplevel; re-executes from the root if needed. The script path is resolved (`readlink -f` / `realpath`) so that `IA_DEV_ROOT` is correct even when `deploy` is a symlink to `ia_dev/deploy`.
- **gitea-issues/**: `ROOT` = git toplevel (otherwise the parent of `GITEA_ISSUES_DIR`); `cd "$ROOT"` and `export REPO_ROOT` so the Python scripts use the host root for `.secrets/` and `logs/` (including when gitea-issues lives in `ia_dev/gitea-issues`).
## Centralized scripts (submodule)
The following scripts are centralized in `ia_dev/deploy/`. The project does not need to provide wrappers (keeping it free of dependencies on ia_dev); invoke them from the root: `./ia_dev/deploy/bump-version.sh`, `./ia_dev/deploy/pousse.sh`, etc.
- **bump-version.sh**: reads `projects/<id>/conf.json` (version.package_json_paths, version.splash_app_name). Invocation: `./ia_dev/deploy/bump-version.sh <version> [message]` from the repository root.
- **deploy-by-script-to.sh**: chains change-to-all-branches (ia_dev/deploy), then checkout/pull/deploy via the project's `deploy/scripts_v2/deploy.sh`. The project may keep a `deploy/deploy-by-script-to.sh` wrapper that does `exec …/ia_dev/deploy/deploy-by-script-to.sh "$@"`.
- **deploy/_lib/**: shared library for the deployment scripts (`colors.sh`, `env-map.sh`, `ssh.sh`, `git-flow.sh`), centralized in the submodule. For the project's `deploy/scripts_v2/` to use this version without duplication: from the project root, `rm -rf deploy/scripts_v2/_lib` then `ln -s ../../ia_dev/deploy/_lib deploy/scripts_v2/_lib` (the scripts do `source "$SCRIPT_DIR/_lib/…"`).
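As a quick sanity check of the `_lib` symlink setup above, the following self-contained sketch replays the `rm -rf` / `ln -s` steps in a throwaway directory (the library content is a dummy):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Simulate a host project sharing ia_dev/deploy/_lib via the symlink above.
root="$(mktemp -d)"
mkdir -p "$root/ia_dev/deploy/_lib" "$root/deploy/scripts_v2"
printf 'LIB_OK=yes\n' > "$root/ia_dev/deploy/_lib/colors.sh"   # dummy content

cd "$root"
rm -rf deploy/scripts_v2/_lib
ln -s ../../ia_dev/deploy/_lib deploy/scripts_v2/_lib

# A script in deploy/scripts_v2 would do: source "$SCRIPT_DIR/_lib/colors.sh"
SCRIPT_DIR="$root/deploy/scripts_v2"
# shellcheck disable=SC1091
source "$SCRIPT_DIR/_lib/colors.sh"
echo "$LIB_OK"   # prints: yes
```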

deploy/_lib/colors.sh (new file, 17 lines)
@@ -0,0 +1,17 @@
#!/usr/bin/env bash
set -euo pipefail
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
BLUE='\033[0;34m'
CYAN='\033[0;36m'
NC='\033[0m'
log_ts_utc() { date -u '+%Y-%m-%dT%H:%M:%SZ'; }
success() { echo -e "[$(log_ts_utc)] ${GREEN}✓${NC} $1"; }
error() { echo -e "[$(log_ts_utc)] ${RED}✗${NC} $1"; }
warning() { echo -e "[$(log_ts_utc)] ${YELLOW}⚠${NC} $1"; }
info() { echo -e "[$(log_ts_utc)] ${BLUE}ℹ${NC} $1"; }

deploy/_lib/env-map.sh (new file, 66 lines)
@@ -0,0 +1,66 @@
#!/usr/bin/env bash
set -euo pipefail
#
# Environment mapping for LeCoffre.io v2 deployments (proxy-based infra).
# - Proxy (jump/orchestrator): 192.168.1.100 (4nk.myftp.biz)
# - Targets: test=192.168.1.101, pprod=192.168.1.102, prod=192.168.1.103, services=192.168.1.104
#
get_env_target_ip() {
local env="$1"
case "$env" in
test) echo "192.168.1.101" ;;
pprod) echo "192.168.1.102" ;;
prod) echo "192.168.1.103" ;;
services) echo "192.168.1.104" ;;
*) return 1 ;;
esac
}
get_env_domain() {
local env="$1"
case "$env" in
test) echo "test.lecoffreio.4nkweb.com" ;;
pprod) echo "pprod.lecoffreio.4nkweb.com" ;;
prod) echo "prod.lecoffreio.4nkweb.com" ;;
*) return 1 ;;
esac
}
# Repository path on each target host (infra standard: /srv/4NK/<domain>/)
get_env_remote_app_root() {
local env="$1"
local domain
domain="$(get_env_domain "$env")"
echo "/srv/4NK/${domain}"
}
# Public service port (proxied by nginx on proxy).
# This port is reserved for LeCoffre.io in the infra ports map.
get_env_service_port() {
local env="$1"
case "$env" in
test|pprod|prod) echo "3009" ;;
*) return 1 ;;
esac
}
# Internal frontend port (served by Next.js, proxied by local router).
get_env_frontend_internal_port() {
local env="$1"
case "$env" in
test|pprod|prod) echo "3100" ;;
*) return 1 ;;
esac
}
# Internal backend port (served by Express, proxied by local router).
get_env_backend_internal_port() {
local env="$1"
case "$env" in
test|pprod|prod) echo "3101" ;;
*) return 1 ;;
esac
}
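For reference, composing the helpers above gives the remote app root per environment; the two functions are restated from the file so the snippet runs standalone:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Restated from deploy/_lib/env-map.sh above.
get_env_domain() {
  case "$1" in
    test)  echo "test.lecoffreio.4nkweb.com" ;;
    pprod) echo "pprod.lecoffreio.4nkweb.com" ;;
    prod)  echo "prod.lecoffreio.4nkweb.com" ;;
    *) return 1 ;;
  esac
}
get_env_remote_app_root() {
  local domain
  domain="$(get_env_domain "$1")"
  echo "/srv/4NK/${domain}"
}

for env in test pprod prod; do
  echo "$env -> $(get_env_remote_app_root "$env")"
done
```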

deploy/_lib/git-flow.sh (new file, 293 lines)
@@ -0,0 +1,293 @@
#!/usr/bin/env bash
# Git flow functions for automatic branch promotion and verification
#
# Prerequisites: This file must be sourced after env-map.sh and ssh.sh
# Functions used: get_env_target_ip, get_env_service_port, get_env_backend_internal_port, ssh_run
# Variables used: DEPLOY_SSH_KEY, DEPLOY_SSH_USER
# Verify that a deployment succeeded
verify_deployment_success() {
local env="$1"
local domain="$2"
local ssh_key="${DEPLOY_SSH_KEY:-$HOME/.ssh/id_ed25519}"
local ssh_user="${DEPLOY_SSH_USER:-ncantu}"
local target_ip
local service_port
local backend_internal_port
# These functions should be available from env-map.sh (sourced before this file)
target_ip="$(get_env_target_ip "$env")"
service_port="$(get_env_service_port "$env")"
backend_internal_port="$(get_env_backend_internal_port "$env")"
# 1. Wait a few seconds for the services to start
info "[verify] Waiting for services to start (15 seconds)..."
sleep 15
# 2. HTTP health check with retries
info "[verify] Checking health endpoint via router (port ${service_port})..."
local health_status
local max_retries=3
local retry_count=0
# Check via the router on the remote server (over SSH)
# The router forwards /api/ to the backend
while [[ $retry_count -lt $max_retries ]]; do
health_status=$(ssh_run "$ssh_key" "$ssh_user" "$target_ip" \
"curl -s -o /dev/null -w '%{http_code}' --max-time 10 --connect-timeout 5 'http://localhost:${service_port}/api/v1/public/health' 2>/dev/null || echo '000'")
if [[ "$health_status" == "200" ]]; then
info "[verify] Health check passed via router (HTTP $health_status)"
break
fi
retry_count=$((retry_count + 1))
if [[ $retry_count -lt $max_retries ]]; then
info "[verify] Health check attempt $retry_count failed (HTTP $health_status), retrying in 5 seconds..."
sleep 5
fi
done
if [[ "$health_status" != "200" ]]; then
# Fall back to querying the backend directly
info "[verify] Router check failed (HTTP $health_status), trying backend directly (port ${backend_internal_port})..."
health_status=$(ssh_run "$ssh_key" "$ssh_user" "$target_ip" \
"curl -s -o /dev/null -w '%{http_code}' --max-time 10 --connect-timeout 5 'http://localhost:${backend_internal_port}/api/v1/public/health' 2>/dev/null || echo '000'")
# If 404, backend may mount API at root (API_ROOT_URL=/); try path without /api prefix
if [[ "$health_status" == "404" ]]; then
info "[verify] Backend returned 404 for /api/v1/public/health, trying /v1/public/health..."
health_status=$(ssh_run "$ssh_key" "$ssh_user" "$target_ip" \
"curl -s -o /dev/null -w '%{http_code}' --max-time 10 --connect-timeout 5 'http://localhost:${backend_internal_port}/v1/public/health' 2>/dev/null || echo '000'")
fi
if [[ "$health_status" != "200" ]]; then
error "[verify] Health check failed: HTTP $health_status"
# Show backend logs for diagnostics
info "[verify] Backend logs (last 50 lines):"
ssh_run "$ssh_key" "$ssh_user" "$target_ip" \
"journalctl -u lecoffreio-backend@${domain}.service --no-pager -n 50 2>/dev/null || true" | sed 's/^/ /'
# Show service status
info "[verify] Service status:"
ssh_run "$ssh_key" "$ssh_user" "$target_ip" \
"systemctl status lecoffreio-backend@${domain}.service lecoffreio-router@${domain}.service --no-pager -l 2>/dev/null || true" | sed 's/^/ /'
# Check whether the port is listening
info "[verify] Checking if backend port ${backend_internal_port} is listening:"
ssh_run "$ssh_key" "$ssh_user" "$target_ip" \
"ss -tlnp | grep ':${backend_internal_port}' || echo ' Port ${backend_internal_port} is not listening'" | sed 's/^/ /'
error "[verify] Backend may not be fully started yet. Check logs: journalctl -u lecoffreio-backend@${domain}.service -n 50"
error "[verify] Router status: systemctl status lecoffreio-router@${domain}.service"
return 1
fi
info "[verify] Health check passed via direct backend (HTTP $health_status)"
fi
# 3. Check systemd services with retries (the frontend can take longer)
info "[verify] Checking systemd services..."
local services_status
local max_service_retries=10
local service_retry_count=0
local all_active=false
while [[ $service_retry_count -lt $max_service_retries && "$all_active" != "true" ]]; do
services_status=$(ssh_run "$ssh_key" "$ssh_user" "$target_ip" \
"systemctl is-active lecoffreio-backend@${domain}.service lecoffreio-frontend@${domain}.service lecoffreio-router@${domain}.service 2>/dev/null | grep -vE '^(active|activating)$' || true")
if [[ -z "$services_status" ]]; then
# Check that all services are truly "active" (not "activating")
local all_status
all_status=$(ssh_run "$ssh_key" "$ssh_user" "$target_ip" \
"systemctl is-active lecoffreio-backend@${domain}.service lecoffreio-frontend@${domain}.service lecoffreio-router@${domain}.service 2>/dev/null")
# Check the frontend logs for errors (if still "activating")
if echo "$all_status" | grep -q "activating"; then
# Inspect the frontend logs for an error
local frontend_errors
frontend_errors=$(ssh_run "$ssh_key" "$ssh_user" "$target_ip" \
"journalctl -u lecoffreio-frontend@${domain}.service --since '2 minutes ago' --no-pager 2>/dev/null | { grep -iE '(error|fatal|failed)' || true; } | tail -5")
if [[ -n "$frontend_errors" ]]; then
error "[verify] Frontend errors detected while activating:"
echo "$frontend_errors" | sed 's/^/ /'
error "[verify] Check frontend logs: journalctl -u lecoffreio-frontend@${domain}.service -n 50"
return 1
fi
service_retry_count=$((service_retry_count + 1))
if [[ $service_retry_count -lt $max_service_retries ]]; then
info "[verify] Some services still activating, waiting 10 seconds (attempt $service_retry_count/$max_service_retries)..."
sleep 10
fi
else
all_active=true
fi
else
service_retry_count=$((service_retry_count + 1))
if [[ $service_retry_count -lt $max_service_retries ]]; then
info "[verify] Some services not active, waiting 10 seconds (attempt $service_retry_count/$max_service_retries)..."
echo "$services_status" | sed 's/^/ /'
sleep 10
fi
fi
done
if [[ "$all_active" != "true" ]]; then
# Final check to report the final state and logs
services_status=$(ssh_run "$ssh_key" "$ssh_user" "$target_ip" \
"systemctl is-active lecoffreio-backend@${domain}.service lecoffreio-frontend@${domain}.service lecoffreio-router@${domain}.service 2>/dev/null || echo 'unknown'")
error "[verify] Some services are not active after $max_service_retries attempts:"
echo "$services_status" | sed 's/^/ /'
# Show frontend logs if still activating
if echo "$services_status" | grep -q "activating.*frontend"; then
info "[verify] Frontend logs (last 30 lines):"
ssh_run "$ssh_key" "$ssh_user" "$target_ip" \
"journalctl -u lecoffreio-frontend@${domain}.service --no-pager -n 30 2>/dev/null || true" | sed 's/^/ /'
fi
error "[verify] Check service status: systemctl status lecoffreio-backend@${domain}.service lecoffreio-frontend@${domain}.service lecoffreio-router@${domain}.service"
return 1
fi
info "[verify] All systemd services are active"
# 4. Check logs for recent critical errors
info "[verify] Checking for critical errors in logs..."
local critical_errors
critical_errors=$(ssh_run "$ssh_key" "$ssh_user" "$target_ip" \
"journalctl -u lecoffreio-backend@${domain}.service --since '5 minutes ago' --no-pager 2>/dev/null | { grep -iE '(error|fatal|critical)' || true; } | tail -10")
if [[ -n "$critical_errors" ]]; then
warning "[verify] Critical errors found in recent logs:"
echo "$critical_errors" | sed 's/^/ /'
# Do not block on warnings; only fatal errors should block
# Finer-grained logic could be added here
fi
info "[verify] Deployment verification passed"
return 0
}
# Determine the next environment in the chain
get_next_env() {
local current_env="$1"
case "$current_env" in
dev) echo "test" ;;
test) echo "pprod" ;;
pprod) echo "prod" ;;
prod) echo "" ;;
*) echo "" ;;
esac
}
# Automatic promotion to the next environment
auto_promote_to_next_env() {
local current_env="$1"
local current_branch="$2"
local project_root="$3"
local deploy_git_remote="${4:-lecoffre_ng}"
local next_env
local next_branch
# No promotion unless we are on dev
if [[ "$current_branch" != "dev" ]]; then
return 0
fi
next_env=$(get_next_env "$current_env")
if [[ -z "$next_env" ]]; then
info "[promote] No next environment (already at prod)"
return 0
fi
# Determine the target branch
case "$next_env" in
test) next_branch="test" ;;
pprod) next_branch="pprod" ;;
prod) next_branch="prod" ;;
*) return 0 ;;
esac
info "[promote] Auto-promoting dev → $next_branch for $next_env environment..."
# 1. Fetch the target branch
git -C "$project_root" fetch "$deploy_git_remote" "$next_branch" || true
# 2. Check out the target branch
git -C "$project_root" checkout "$next_branch" || {
# Branch doesn't exist locally, create it from remote
git -C "$project_root" checkout -b "$next_branch" "${deploy_git_remote}/${next_branch}" 2>/dev/null || {
# Remote branch doesn't exist, create new branch
git -C "$project_root" checkout -b "$next_branch"
}
}
# 3. Merge dev into target branch
if ! git -C "$project_root" merge dev --allow-unrelated-histories --no-edit; then
error "[promote] Merge dev → $next_branch failed. Resolve conflicts manually."
git -C "$project_root" checkout dev
return 1
fi
# 4. Push
info "[promote] Pushing $next_branch..."
git -C "$project_root" push "$deploy_git_remote" "$next_branch"
# 5. Return to dev
info "[promote] Returning to dev branch..."
git -C "$project_root" checkout dev
success "[promote] Successfully promoted dev → $next_branch"
info "[promote] Next step: deploy to $next_env with: ./deploy/scripts_v2/deploy.sh $next_env"
return 0
}
# Stage all changes, commit with message, and push current branch
# Usage: git_add_commit_push <project_root> <commit_message> [remote]
# Example: git_add_commit_push /path/to/repo "fix: something"
git_add_commit_push() {
local project_root="${1:-.}"
local commit_message="$2"
local deploy_git_remote="${3:-lecoffre_ng}"
local current_branch
if [[ -z "$commit_message" ]]; then
error "[git] Commit message required"
return 1
fi
# Lint --fix on all projects before staging (resources, backend, frontend). Non-blocking.
info "[lint] Running lint --fix on lecoffre-ressources-dev, lecoffre-back-main, lecoffre-front-main..."
(cd "${project_root}/lecoffre-ressources-dev" && npm run lint:fix) || warning "[lint] lecoffre-ressources-dev lint:fix failed (non-blocking)"
(cd "${project_root}/lecoffre-back-main" && npm run lint:fix) || warning "[lint] lecoffre-back-main lint:fix failed (non-blocking)"
(cd "${project_root}/lecoffre-front-main" && npm run lint:fix) || warning "[lint] lecoffre-front-main lint:fix failed (non-blocking)"
info "[lint] Lint:fix step done"
info "[git] Staging all changes (add -A)..."
git -C "$project_root" add -A || {
error "[git] git add -A failed"
return 1
}
info "[git] Committing..."
git -C "$project_root" commit -m "$commit_message" || {
error "[git] commit failed"
return 1
}
current_branch=$(git -C "$project_root" branch --show-current)
info "[git] Pushing to $deploy_git_remote $current_branch..."
git -C "$project_root" push "$deploy_git_remote" "$current_branch" || {
error "[git] push failed"
return 1
}
success "[git] add -A, commit, push done"
return 0
}
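The promotion chain encoded by `get_next_env` above can be walked end to end; the function body is copied from the file so the snippet is standalone:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Copied from get_next_env above; prod (and anything unknown) ends the chain.
get_next_env() {
  case "$1" in
    dev)   echo "test" ;;
    test)  echo "pprod" ;;
    pprod) echo "prod" ;;
    *)     echo "" ;;
  esac
}

env="dev"
chain="$env"
while next="$(get_next_env "$env")" && [[ -n "$next" ]]; do
  chain="$chain -> $next"
  env="$next"
done
echo "$chain"   # dev -> test -> pprod -> prod
```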

deploy/_lib/ssh.sh (new file, 95 lines)
@@ -0,0 +1,95 @@
#!/usr/bin/env bash
set -euo pipefail
require_ssh_key() {
local key_path="$1"
if [[ -z "$key_path" ]]; then
echo "SSH key path is required" >&2
return 1
fi
if [[ ! -f "$key_path" ]]; then
echo "SSH key not found: $key_path" >&2
return 1
fi
}
ssh_common_opts() {
local ssh_user="$1"
local ssh_host="$2"
# Keepalive to reduce flakiness through ProxyJump
# (observed: "Connection reset by peer" during scp/ssh).
#
# Notes:
# - Avoid SSH multiplexing here: ControlPath/ControlMaster can be flaky on Windows OpenSSH + MSYS paths.
# - Increased timeouts and keepalive settings to handle network instability
# - Compression disabled to reduce overhead and potential connection issues
echo \
-o BatchMode=yes \
-o StrictHostKeyChecking=accept-new \
-o ConnectTimeout=30 \
-o ServerAliveInterval=10 \
-o ServerAliveCountMax=6 \
-o TCPKeepAlive=yes \
-o Compression=no
}
ssh_run() {
local ssh_key="$1"
local ssh_user="$2"
local ssh_host="$3"
shift 3
require_ssh_key "$ssh_key"
local proxy_host="${DEPLOY_SSH_PROXY_HOST:-}"
local proxy_user="${DEPLOY_SSH_PROXY_USER:-$ssh_user}"
local proxy_args=()
if [[ -n "$proxy_host" ]]; then
proxy_args=(-J "$proxy_user@$proxy_host")
fi
# shellcheck disable=SC2207
local common_opts=($(ssh_common_opts "$ssh_user" "$ssh_host"))
ssh -i "$ssh_key" \
"${common_opts[@]}" \
"${proxy_args[@]}" \
"$ssh_user@$ssh_host" "$@"
}
scp_copy() {
local ssh_key="$1"
local src="$2"
local ssh_user="$3"
local ssh_host="$4"
local dst="$5"
local recursive="${6:-false}"
require_ssh_key "$ssh_key"
local proxy_host="${DEPLOY_SSH_PROXY_HOST:-}"
local proxy_user="${DEPLOY_SSH_PROXY_USER:-$ssh_user}"
local proxy_args=()
if [[ -n "$proxy_host" ]]; then
proxy_args=(-o "ProxyJump=$proxy_user@$proxy_host")
fi
# shellcheck disable=SC2207
local common_opts=($(ssh_common_opts "$ssh_user" "$ssh_host"))
local scp_opts=()
# Add -r for recursive copy if requested or if source is a directory
if [[ "$recursive" == "true" ]] || [[ -d "$src" ]]; then
scp_opts=(-r)
fi
scp -i "$ssh_key" \
"${scp_opts[@]}" \
"${common_opts[@]}" \
"${proxy_args[@]}" \
"$src" "$ssh_user@$ssh_host:$dst"
}
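`ssh_run` and `scp_copy` above split the `ssh_common_opts` output with `common_opts=($(...))`, hence the SC2207 pragmas. A standalone check of that splitting, using `read -a` as a glob-safe alternative (the option list is abbreviated, and the two unused arguments of the original function are dropped):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Abbreviated copy of ssh_common_opts above.
ssh_common_opts() {
  echo \
    -o BatchMode=yes \
    -o StrictHostKeyChecking=accept-new \
    -o ConnectTimeout=30
}

# read -a splits on whitespace without pathname expansion, which is why it
# is often preferred over arr=($(cmd)), the pattern shellcheck flags as SC2207.
read -r -a common_opts <<< "$(ssh_common_opts)"
echo "${#common_opts[@]}"   # prints: 6
```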

deploy/branch-align.sh
@@ -8,7 +8,8 @@ if ! git rev-parse --is-inside-work-tree >/dev/null 2>&1; then
fi
PROJECT_ROOT="$(git rev-parse --show-toplevel)"
DEPLOY_DIR="$(cd "$(dirname "${BASH_SOURCE[0]:-$0}")" && pwd)"
SCRIPT_REAL="$(readlink -f "${BASH_SOURCE[0]:-$0}" 2>/dev/null || realpath "${BASH_SOURCE[0]:-$0}" 2>/dev/null || echo "${BASH_SOURCE[0]:-$0}")"
DEPLOY_DIR="$(cd "$(dirname "$SCRIPT_REAL")" && pwd)"
if [[ "$(pwd)" != "$PROJECT_ROOT" ]]; then
cd "$PROJECT_ROOT" && exec "${DEPLOY_DIR}/$(basename "${BASH_SOURCE[0]:-$0}")" "$@"
fi
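The `SCRIPT_REAL` pattern above can be verified in isolation: a script reached through a symlinked `deploy` directory still resolves its real directory. The layout below is a throwaway simulation:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Throwaway layout: real script in ia_dev/deploy, reached via a "deploy" symlink.
root="$(mktemp -d)"
mkdir -p "$root/ia_dev/deploy"
cat > "$root/ia_dev/deploy/tool.sh" <<'EOF'
#!/usr/bin/env bash
# Resolve the real path so the script's directory is correct behind symlinks.
SCRIPT_REAL="$(readlink -f "${BASH_SOURCE[0]:-$0}")"
cd "$(dirname "$SCRIPT_REAL")" && pwd
EOF
chmod +x "$root/ia_dev/deploy/tool.sh"
ln -s ia_dev/deploy "$root/deploy"

resolved="$("$root/deploy/tool.sh")"   # invoked through the symlink
echo "$resolved"                       # the real ia_dev/deploy directory
```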

deploy/bump-version.sh
@@ -1,9 +1,9 @@
#!/usr/bin/env bash
set -euo pipefail
# Bump version and optional package.json files from project config (projects/<slug>.json).
# Bump version and optional package.json files from project config (projects/<id>/conf.json).
# Usage: ./bump-version.sh <version> [message_court]
# Requires: run from repo root; IA_PROJECT or .ia_project for project slug; jq if using version.package_json_paths.
# Requires: run from repo root; project id from IA_PROJECT, .ia_project, or ai_project_id; jq if using version.package_json_paths.
VERSION="${1:-}"
SHORT_MSG="${2:-Nouvelles fonctionnalités et améliorations}"
@@ -25,20 +25,16 @@ if ! git rev-parse --is-inside-work-tree >/dev/null 2>&1; then
fi
PROJECT_ROOT="$(git rev-parse --show-toplevel)"
DEPLOY_DIR="$(cd "$(dirname "${BASH_SOURCE[0]:-$0}")" && pwd)"
SCRIPT_REAL="$(readlink -f "${BASH_SOURCE[0]:-$0}" 2>/dev/null || realpath "${BASH_SOURCE[0]:-$0}" 2>/dev/null || echo "${BASH_SOURCE[0]:-$0}")"
DEPLOY_DIR="$(cd "$(dirname "$SCRIPT_REAL")" && pwd)"
IA_DEV_ROOT="$(cd "$DEPLOY_DIR/.." && pwd)"
if [[ "$(pwd)" != "$PROJECT_ROOT" ]]; then
SCRIPT_ABS="${DEPLOY_DIR}/$(basename "${BASH_SOURCE[0]:-$0}")"
cd "$PROJECT_ROOT" && exec "$SCRIPT_ABS" "$@"
fi
PROJECT_SLUG=""
if [[ -f "$PROJECT_ROOT/.ia_project" ]]; then
PROJECT_SLUG="$(cat "$PROJECT_ROOT/.ia_project" | sed 's/[[:space:]]//g')"
fi
if [[ -z "$PROJECT_SLUG" && -n "${IA_PROJECT:-}" ]]; then
PROJECT_SLUG="$IA_PROJECT"
fi
# shellcheck source=../lib/project_config.sh
source "${IA_DEV_ROOT}/lib/project_config.sh"
echo "🔄 Mise à jour vers v${VERSION}..."
@@ -47,11 +43,11 @@ echo "✅ VERSION → ${VERSION}"
package_paths=()
splash_name="Application"
if [[ -n "$PROJECT_SLUG" && -f "$IA_DEV_ROOT/projects/${PROJECT_SLUG}.json" ]] && command -v jq >/dev/null 2>&1; then
if [[ -n "${PROJECT_CONFIG_PATH:-}" && -f "$PROJECT_CONFIG_PATH" ]] && command -v jq >/dev/null 2>&1; then
while IFS= read -r p; do
[[ -n "$p" ]] && package_paths+=( "$p" )
done < <(jq -r '.version.package_json_paths[]? // empty' "$IA_DEV_ROOT/projects/${PROJECT_SLUG}.json" 2>/dev/null)
splash_name="$(jq -r '.version.splash_app_name // "Application"' "$IA_DEV_ROOT/projects/${PROJECT_SLUG}.json" 2>/dev/null)"
done < <(jq -r '.version.package_json_paths[]? // empty' "$PROJECT_CONFIG_PATH" 2>/dev/null)
splash_name="$(jq -r '.version.splash_app_name // "Application"' "$PROJECT_CONFIG_PATH" 2>/dev/null)"
fi
for p in "${package_paths[@]}"; do

deploy/change-to-all-branches.sh
@@ -9,7 +9,8 @@ if ! git rev-parse --is-inside-work-tree >/dev/null 2>&1; then
fi
PROJECT_ROOT="$(git rev-parse --show-toplevel)"
DEPLOY_DIR="$(cd "$(dirname "${BASH_SOURCE[0]:-$0}")" && pwd)"
SCRIPT_REAL="$(readlink -f "${BASH_SOURCE[0]:-$0}" 2>/dev/null || realpath "${BASH_SOURCE[0]:-$0}" 2>/dev/null || echo "${BASH_SOURCE[0]:-$0}")"
DEPLOY_DIR="$(cd "$(dirname "$SCRIPT_REAL")" && pwd)"
if [[ "$(pwd)" != "$PROJECT_ROOT" ]]; then
cd "$PROJECT_ROOT" && exec "${DEPLOY_DIR}/$(basename "${BASH_SOURCE[0]:-$0}")" "$@"
fi
@@ -23,7 +24,9 @@ fi
echo "[change-to-all-branches] Aligning branches..."
"$DEPLOY_DIR/branch-align.sh" test
# scripts_v2 lives in the host project's deploy/ (not necessarily under ia_dev)
DEPLOY_SCRIPTS_V2="${PROJECT_ROOT}/deploy/scripts_v2"
echo "[change-to-all-branches] Deploying test (--import-v1 --skipSetupHost, --no-sync-origin because we just pushed)..."
"$DEPLOY_DIR/scripts_v2/deploy.sh" test --import-v1 --skipSetupHost --no-sync-origin
"${DEPLOY_SCRIPTS_V2}/deploy.sh" test --import-v1 --skipSetupHost --no-sync-origin
echo "[change-to-all-branches] OK"

deploy/deploy-by-script-to.sh (new executable file, 61 lines)
@@ -0,0 +1,61 @@
#!/usr/bin/env bash
# deploy-by-script-to <target_branch>: run change-to-all-branches (align + deploy test), then checkout target, pull, deploy target.
# Centralized in ia_dev. Requires: start on branch test (after /push-by-script). Target: test | pprod | prod.
set -euo pipefail
if ! git rev-parse --is-inside-work-tree >/dev/null 2>&1; then
echo "[deploy-by-script-to][ERROR] Not in a git repository" >&2
exit 1
fi
PROJECT_ROOT="$(git rev-parse --show-toplevel)"
SCRIPT_REAL="$(readlink -f "${BASH_SOURCE[0]:-$0}" 2>/dev/null || realpath "${BASH_SOURCE[0]:-$0}" 2>/dev/null || echo "${BASH_SOURCE[0]:-$0}")"
DEPLOY_IA="$(cd "$(dirname "$SCRIPT_REAL")" && pwd)"
if [[ "$(pwd)" != "$PROJECT_ROOT" ]]; then
cd "$PROJECT_ROOT" && exec "$SCRIPT_REAL" "$@"
fi
TARGET_BRANCH="${1:-}"
if [[ -z "$TARGET_BRANCH" ]]; then
echo "[deploy-by-script-to][ERROR] Missing <target_branch> argument (expected: test | pprod | prod)" >&2
echo "Usage: ./ia_dev/deploy/deploy-by-script-to.sh <target_branch> (or ./deploy/deploy-by-script-to.sh if deploy wraps ia_dev)" >&2
exit 1
fi
if [[ ! "$TARGET_BRANCH" =~ ^(test|pprod|prod)$ ]]; then
echo "[deploy-by-script-to][ERROR] Invalid target branch: must be test, pprod or prod (got: '${TARGET_BRANCH}')" >&2
echo "Usage: ./ia_dev/deploy/deploy-by-script-to.sh <target_branch>" >&2
exit 1
fi
current="$(git rev-parse --abbrev-ref HEAD)"
if [[ "$current" != "test" ]]; then
echo "[deploy-by-script-to][ERROR] Must be on branch 'test' to run change-to-all-branches first (current: '${current}')" >&2
exit 1
fi
echo "[deploy-by-script-to] Step 1/5: change-to-all-branches (align + deploy test)..."
"$DEPLOY_IA/change-to-all-branches.sh"
echo "[deploy-by-script-to] Step 2/5: checkout ${TARGET_BRANCH}..."
if [[ "$(git rev-parse --abbrev-ref HEAD)" != "$TARGET_BRANCH" ]]; then
git checkout "$TARGET_BRANCH"
fi
echo "[deploy-by-script-to] Step 3/5: fetch and sync local branch with origin/${TARGET_BRANCH}..."
git fetch origin
if [[ "$TARGET_BRANCH" == "test" ]]; then
git pull --rebase origin test || {
echo "[deploy-by-script-to][ERROR] Pull from origin/test failed. Resolve conflicts or run manually." >&2
exit 1
}
else
git reset --hard "origin/${TARGET_BRANCH}"
fi
echo "[deploy-by-script-to] Step 4/5: deploy ${TARGET_BRANCH} (--import-v1 --skipSetupHost)..."
"$PROJECT_ROOT/deploy/scripts_v2/deploy.sh" "$TARGET_BRANCH" --import-v1 --skipSetupHost
echo "[deploy-by-script-to] Step 5/5: checkout test..."
git checkout test
echo "[deploy-by-script-to] OK: aligned, synced, deployed to ${TARGET_BRANCH}, back on test"

deploy/pousse.sh
@@ -7,21 +7,17 @@ if ! git rev-parse --is-inside-work-tree >/dev/null 2>&1; then
fi
PROJECT_ROOT="$(git rev-parse --show-toplevel)"
DEPLOY_DIR="$(cd "$(dirname "${BASH_SOURCE[0]:-$0}")" && pwd)"
SCRIPT_REAL="$(readlink -f "${BASH_SOURCE[0]:-$0}" 2>/dev/null || realpath "${BASH_SOURCE[0]:-$0}" 2>/dev/null || echo "${BASH_SOURCE[0]:-$0}")"
DEPLOY_DIR="$(cd "$(dirname "$SCRIPT_REAL")" && pwd)"
IA_DEV_ROOT="$(cd "$DEPLOY_DIR/.." && pwd)"
if [[ "$(pwd)" != "$PROJECT_ROOT" ]]; then
SCRIPT_ABS="${DEPLOY_DIR}/$(basename "${BASH_SOURCE[0]:-$0}")"
cd "$PROJECT_ROOT" && exec "$SCRIPT_ABS" "$@"
fi
# Resolve project slug: .ia_project in repo root or IA_PROJECT env
PROJECT_SLUG=""
if [[ -f "$PROJECT_ROOT/.ia_project" ]]; then
PROJECT_SLUG="$(cat "$PROJECT_ROOT/.ia_project" | sed 's/[[:space:]]//g')"
fi
if [[ -z "$PROJECT_SLUG" && -n "${IA_PROJECT:-}" ]]; then
PROJECT_SLUG="$IA_PROJECT"
fi
# Resolve project id and config path: IA_PROJECT, .ia_project, or ai_project_id → projects/<id>/conf.json
# shellcheck source=../lib/project_config.sh
source "${IA_DEV_ROOT}/lib/project_config.sh"
remote="origin"
bump_version=false
@@ -35,7 +31,7 @@ Usage:
Reads a full multi-line commit message from STDIN, then:
- if not in repo root: re-exec from repo root (standardized execution)
- build check (npm run build in each directory listed in projects/<slug>.json build_dirs, if any; exit on failure)
- build check (npm run build in each directory listed in projects/<id>/conf.json build_dirs, if any; exit on failure)
- git add -A
- git commit -F <message>
- git push -u <remote> HEAD
@@ -99,12 +95,12 @@ if [[ "$author_name" != "4NK" && "$author_name" != "Nicolas Cantu" ]]; then
fi
repo_root="$(git rev-parse --show-toplevel)"
# Build dirs from project config (projects/<slug>.json); skip if no config or no build_dirs
# Build dirs from project config (projects/<id>/conf.json); skip if no config or no build_dirs
build_dirs=()
if [[ -n "$PROJECT_SLUG" && -f "$IA_DEV_ROOT/projects/${PROJECT_SLUG}.json" ]] && command -v jq >/dev/null 2>&1; then
if [[ -n "${PROJECT_CONFIG_PATH:-}" && -f "$PROJECT_CONFIG_PATH" ]] && command -v jq >/dev/null 2>&1; then
while IFS= read -r d; do
[[ -n "$d" ]] && build_dirs+=( "$d" )
done < <(jq -r '.build_dirs[]? // empty' "$IA_DEV_ROOT/projects/${PROJECT_SLUG}.json" 2>/dev/null)
done < <(jq -r '.build_dirs[]? // empty' "$PROJECT_CONFIG_PATH" 2>/dev/null)
fi
if [[ ${#build_dirs[@]} -gt 0 ]]; then
echo "[pousse] Build check (${#build_dirs[@]} dirs from project config)..."
@@ -121,7 +117,7 @@ if [[ ${#build_dirs[@]} -gt 0 ]]; then
done
echo "[pousse] Build check OK"
else
echo "[pousse] No build_dirs in project config (or no projects/<slug>.json / jq); skipping build check"
echo "[pousse] No build_dirs in project config (or no projects/<id>/conf.json / jq); skipping build check"
fi
msg_file="$(mktemp -t pousse-commit-msg.XXXXXX)"

gitea-issues/AGENT_LOOP.md (new file, 113 lines)
@@ -0,0 +1,113 @@
# Boucle agent (agent-loop) surveillance des mails et fichier témoin
Script qui tourne en boucle dans lenvironnement Cursor de ce projet pour surveiller les mails non lus et maintenir un **fichier témoin** indiquant si la boucle est active.
## Rôle du script
- Exécuter périodiquement `mail-list-unread.sh` (sans modifier létat des mails).
- Mettre à jour à chaque tour un **fichier témoin** (statut + horodatage) pour savoir si la boucle est active.
- Quand des mails non lus sont détectés, écrire un fichier **pending** et afficher un message invitant à lancer lagent dans Cursor.
Le script **ne traite pas** les mails luimême : le traitement (réponse, issues, commits) est fait par l**agent gitea-issues-process**. Vous pouvez soit lancer lagent à la main dans Cursor, soit faire lancer lagent par la boucle en activant `AGENT_LOOP_RUN_AGENT=1` (voir cidessous) si la **Cursor Agent CLI** est installée.
## Environnement au démarrage
Au démarrage, le script **source systématiquement** `~/.bashrc` (si le fichier existe et est lisible), puis ajoute `~/.local/bin` au `PATH` si ce répertoire existe. Ainsi, la commande `agent` (Cursor Agent CLI) est trouvée même si la boucle est lancée depuis un contexte où le shell na pas chargé le profil (nohup, cron, etc.).
## Lancement
Si le fichier `.secrets/gitea-issues/agent-loop.env` existe, il est sourcé au démarrage (voir `agent-loop.env.example`).
Depuis la **racine du dépôt** :
```bash
./gitea-issues/agent-loop.sh
```
Avec un intervalle en secondes (défaut 60) :
```bash
./gitea-issues/agent-loop.sh 120
```
Ou via une variable denvironnement :
```bash
AGENT_LOOP_INTERVAL_SEC=120 ./gitea-issues/agent-loop.sh
```
Pour lexécuter en arrière-plan et garder la boucle active après fermeture du terminal, utiliser `nohup` ou un gestionnaire de processus (systemd, screen, tmux) :
```bash
nohup ./gitea-issues/agent-loop.sh 60 >> logs/gitea-issues/agent-loop.log 2>&1 &
```
## Witness file (active / inactive)
- **Location**: `logs/gitea-issues/agent-loop.status` (or `AGENT_LOOP_STATUS_FILE` if set).
- **Content**: three lines
  1. ISO 8601 timestamp of the last iteration.
  2. Status: `idle` | `mails_pending` | `running` | `error`.
  3. Optional detail (e.g. a message for the user).
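For illustration, a witness file written by the loop might look like this (hypothetical values):

```
2026-03-12T22:35:15+01:00
mails_pending
Des mails non lus. Lancer l'agent gitea-issues-process dans Cursor.
```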
**Consider the loop active** if the file was modified less than **2 × interval** ago (e.g. less than 120 s with a 60 s interval). Beyond that, the loop is considered stopped.
Check example (60 s interval):
```bash
# Modified less than 120 s ago?
[ $(($(date +%s) - $(stat -c %Y logs/gitea-issues/agent-loop.status 2>/dev/null || echo 0))) -lt 120 ] && echo "Active" || echo "Inactive"
```
Or simply read the file's first line (date of the last iteration) and compare it to the current time.
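The first-line check can be wrapped in a small helper. `loop_active` below is a sketch (the function name is illustrative, not part of the shipped scripts) that relies on GNU `date -d` to parse the ISO 8601 timestamp:

```shell
# Returns 0 if the status file's first line (ISO 8601 timestamp) is recent enough.
loop_active() {  # usage: loop_active <status_file> <max_age_seconds>
  local last
  last="$(head -n1 "$1" 2>/dev/null)" || return 1
  [ -n "$last" ] || return 1
  [ $(( $(date +%s) - $(date -d "$last" +%s) )) -lt "$2" ]
}
```

For example: `loop_active logs/gitea-issues/agent-loop.status 120 && echo "Active"`.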
## Pending file (mails awaiting processing)
- **Location**: `logs/gitea-issues/agent-loop.pending`.
- **Role**: when unread mails are detected, the script writes a block to it (timestamp, `mails_pending` status, then the output of `mail-list-unread.sh`). This shows which mails are waiting to be processed by the agent.
- When there are no more unread mails, the script empties this file on the next iteration.
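A minimal check of whether anything is waiting (a sketch; `-s` tests for a non-empty file):

```shell
# Non-empty pending file means mails are waiting for the agent.
if [ -s logs/gitea-issues/agent-loop.pending ]; then
  echo "Mails pending"
else
  echo "Nothing pending"
fi
```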
## Environment variables
Variables that can be set in `.secrets/gitea-issues/agent-loop.env` or exported in the shell:
| Variable | Default | Description |
|----------|--------|-------------|
| `AGENT_LOOP_INTERVAL_SEC` | 60 | Interval between two checks (seconds). Can also be passed as the script's first argument. |
| `AGENT_LOOP_RUN_AGENT` | 0 | If set to `1`, the loop launches the **Cursor Agent CLI** (`agent`) when unread mails are detected, with a prompt that runs the gitea-issues-process mail workflow. Requires the `agent` command to be installed (see [Cursor CLI](https://cursor.com/docs/cli/using)). If `agent` is not in the PATH, the loop only updates the status and pending files. |
| `AGENT_LOOP_MODEL` | `sonnet-4.6` | Model used by the CLI (`agent --model ...`). Defaults to `sonnet-4.6` to limit blocking on Opus quotas. E.g. `AGENT_LOOP_MODEL=gpt-5.4-low`; list: `agent models`. |
| `AGENT_LOOP_STATUS_FILE` | `logs/gitea-issues/agent-loop.status` | Path of the witness file. |
| `AGENT_LOOP_PENDING_FILE` | `logs/gitea-issues/agent-loop.pending` | Path of the pending file. |
| `GITEA_ISSUES_DIR` | script directory | Root of the gitea-issues scripts (used to call `mail-list-unread.sh`). |
## Processing mails
As soon as the status is `mails_pending` or mails appear in `agent-loop.pending`:
1. **Option A (manual)**: open the project in **Cursor** and launch the **gitea-issues-process agent** (command `/gitea-issues-process` or via the agents UI).
2. **Option B (automatic)**: run the loop with `AGENT_LOOP_RUN_AGENT=1` and the **Cursor Agent CLI** installed (`agent` in the PATH). When unread mails are detected, the script invokes `agent -p "..." -f` to run the mail workflow (thread, log, reply, mark read). See [Cursor CLI](https://cursor.com/docs/cli/using) for installation.
The agent reads the unread mails, reviews the threads, replies by mail, creates issues if needed, and marks the mails as read.
After the agent has run, the next loop iteration resets the status to `idle` and empties the pending file (if no unread mails remain).
## Logs
The script writes no structured log by default. To keep a trace of iterations and displayed messages:
```bash
./gitea-issues/agent-loop.sh 60 2>&1 | tee -a logs/gitea-issues/agent-loop.log
```
Or in the background:
```bash
nohup ./gitea-issues/agent-loop.sh 60 >> logs/gitea-issues/agent-loop.log 2>&1 &
```
## Stopping the loop
- If the script runs in the foreground: `Ctrl+C`.
- If it was launched in the background: `kill <pid>` (or `pkill -f agent-loop.sh`).
After stopping, the witness file is no longer updated; after 2 × interval, it should be considered inactive.

gitea-issues/README.md
# Gitea issues scripts and agent
Directory dedicated to processing Gitea issues for the **4nk/lecoffre_ng** repository (https://git.4nkweb.com/4nk/lecoffre_ng/issues). All API and Git logic lives in the scripts; the agent orchestrates and calls /fix or /evol.
## Prerequisites
- **jq**: `apt install jq` or `brew install jq`
- **Gitea token**: `GITEA_TOKEN` environment variable or the file `.secrets/gitea-issues/token` (content = the token, not versioned). Create the token in Gitea: Settings → Applications → Generate New Token (scopes `read:issue`, plus `write:issue` for comments).
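One-time setup might look like this (a sketch; replace the placeholder with a real token):

```shell
# Store the Gitea token where the scripts expect it; keep it out of version control.
mkdir -p .secrets/gitea-issues
printf '%s\n' "REPLACE_WITH_YOUR_TOKEN" > .secrets/gitea-issues/token
chmod 600 .secrets/gitea-issues/token
```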
## Scripts (from the repo root)
| Script | Usage | Description |
|--------|--------|-------------|
| `list-open-issues.sh` | `./gitea-issues/list-open-issues.sh [--lines] [--limit N]` | Lists open issues (JSON or `number\|title\|state` lines). |
| `get-issue.sh` | `./gitea-issues/get-issue.sh <num> [--summary]` | Details of one issue (JSON or text summary). |
| `print-issue-prompt.sh` | `./gitea-issues/print-issue-prompt.sh <num>` | Prints title + body to provide the instruction to the agent. |
| `create-branch-for-issue.sh` | `./gitea-issues/create-branch-for-issue.sh <num> [base]` | Creates and checks out branch `issue/<num>` from `base` (default `test`). |
| `comment-issue.sh` | `./gitea-issues/comment-issue.sh <num> <message>` or `echo "msg" \| ./gitea-issues/comment-issue.sh <num> -` | Adds a comment to the issue. |
| `mail-list-unread.sh` | `./gitea-issues/mail-list-unread.sh` | Lists **unread mails sent to the alias** (MAIL_FILTER_TO, default ai.support.lecoffreio@4nkweb.com); read-only; output: UID, Message-ID, From, To, Subject, Date, Body. No other mail is listed. |
| `mail-get-thread.sh` | `./gitea-issues/mail-get-thread.sh <uid>` | Fetches the **full thread** (conversation) of the given mail: every message linked by References/In-Reply-To, in chronological order (oldest → newest). Same output format as mail-list-unread. Use before deciding on or replying to a mail. |
| `mail-send-reply.sh` | `./gitea-issues/mail-send-reply.sh --to <addr> --subject "..." [--body "..." \| stdin] [--in-reply-to "<msg-id>" [--references "..."]]` | Sends a reply by mail via the Bridge (SMTP); the signature "Support IA du projet Lecoffre.io" / ai.support.lecoffreio@4nkweb.com is added automatically. |
| `mail-create-issue-from-email.sh` | `./gitea-issues/mail-create-issue-from-email.sh --uid <uid> [--title "..."] [--body "..."]` | Creates an issue from a mail (UID), with optional formalized title/body; marks the mail as read. |
| `mail-mark-read.sh` | `./gitea-issues/mail-mark-read.sh <uid>` | Marks a mail as read. |
| `mail-thread-log.sh` | `./gitea-issues/mail-thread-log.sh get-id \| init \| append-sent \| append-issue \| append-commit ...` | **Per-thread log**: one file per conversation in `logs/gitea-issues/threads/` (received/sent exchanges, issues, commits). `get-id --uid <uid>` prints `THREAD_ID=...`; `init --uid <uid>` creates/updates the file; `append-sent/issue/commit` record a reply, an issue, or a commit. |
| `mail-to-issue.sh` | `./gitea-issues/mail-to-issue.sh` | **Batch**: creates one issue per unread mail (title = subject, body = text + From), marks them read. Avoid when following the agent workflow (see below). |
| `agent-loop.sh` | `./gitea-issues/agent-loop.sh [interval_sec]` | **Watch loop**: periodically runs `mail-list-unread.sh`, updates a witness file (`logs/gitea-issues/agent-loop.status`) to indicate whether the loop is active, and writes pending mails to `agent-loop.pending`. See `gitea-issues/AGENT_LOOP.md`. |
Optional variables: `GITEA_API_URL`, `GITEA_REPO_OWNER`, `GITEA_REPO_NAME`, `GITEA_ISSUES_DIR`.
### Creating issues from mails (IMAP): agent workflow
**Do not chain directly**: the agent must first read the unread mails, formalize the issue or reply by mail, and only create/process once the request is ready.
1. **Read unread mails**: `./gitea-issues/mail-list-unread.sh` (does not mark mails as read).
2. **For each mail**: review the **full thread history** with `./gitea-issues/mail-get-thread.sh <uid>`, create/update the **thread log** with `./gitea-issues/mail-thread-log.sh init --uid <uid>` (keep the `THREAD_ID=...` output), then decide either to send a direct reply (request for information) via `mail-send-reply.sh`, or to formalize and create the issue with `mail-create-issue-from-email.sh` (optional formalized `--title` / `--body`). If the request is a fix/evolution ready to go: create the issue, process it (fix/evol), comment on the issue, and reply to the mail via `mail-send-reply.sh` (with `--in-reply-to` for threading).
3. **Mail replies**: always via the Bridge with `mail-send-reply.sh`. Each send is recorded in the thread log with `mail-thread-log.sh append-sent`.
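The steps above can be sketched as a shell loop (illustrative only; the UID extraction assumes the documented `UID=<n>` lines in the `mail-list-unread.sh` output):

```shell
# Sketch of the agent workflow: one pass over unread mails, thread first, then decide.
process_unread() {
  ./gitea-issues/mail-list-unread.sh | sed -n 's/^.*UID=\([0-9]*\).*/\1/p' |
  while read -r uid; do
    ./gitea-issues/mail-get-thread.sh "$uid"            # read the full thread first
    ./gitea-issues/mail-thread-log.sh init --uid "$uid" # prints THREAD_ID=...
    # Decide here: reply via mail-send-reply.sh (then append-sent),
    # or create an issue via mail-create-issue-from-email.sh --uid "$uid"
  done
}
```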
**Prerequisites:**
- Python 3 (stdlib: imaplib, email, smtplib, json, urllib).
- Gitea token: same as the other scripts (`GITEA_TOKEN` or `.secrets/gitea-issues/token`).
- IMAP/SMTP config: copy `gitea-issues/imap-bridge.env.example` to `.secrets/gitea-issues/imap-bridge.env`. For the agent loop (optional): `agent-loop.env.example` to `.secrets/gitea-issues/agent-loop.env` (see AGENT_LOOP.md). Set `IMAP_USER` and `IMAP_PASSWORD` (and optionally the `SMTP_*` variables for sending; by default SMTP reuses the same Bridge host/port 1025). Optional: `MAIL_FILTER_TO=ai.support.lecoffreio@4nkweb.com` (only mails sent to this address are listed).
- Proton Mail Bridge (or an IMAP/SMTP server) running.
**Scripts:** `mail-list-unread.sh`, `mail-get-thread.sh`, `mail-thread-log.sh`, `mail-send-reply.sh`, `mail-create-issue-from-email.sh`, `mail-mark-read.sh`. The batch script `mail-to-issue.sh` remains available but must not be used within the agent workflow (list → read thread → thread log → decision → creation/reply). The **`agent-loop.sh`** script runs a mail watch loop with a witness file; see `gitea-issues/AGENT_LOOP.md`.
## Wiki API (preliminary tests)
Test script for the Gitea Wiki API on the same repository (prerequisite for a possible migration of `docs/` to the wiki):
| Script | Usage | Description |
|--------|--------|-------------|
| `wiki-api-test.sh` | `./gitea-issues/wiki-api-test.sh [--create]` | Tests GET list pages and GET page Home; with `--create`: POSTs a test page then DELETEs it. |
**Prerequisites:** same token as for issues (`GITEA_TOKEN` or `.secrets/gitea-issues/token`). For writes (page creation/deletion), the token must have write access to the repository.
**Endpoints used (Gitea API 1.25 reference):**
- `GET /repos/{owner}/{repo}/wiki/pages` — list pages
- `GET /repos/{owner}/{repo}/wiki/page/{pageName}` — content of a page (e.g. `Home`)
- `POST /repos/{owner}/{repo}/wiki/new` — create a page (body: `title`, `content_base64`, `message`)
- `PATCH /repos/{owner}/{repo}/wiki/page/{pageName}` — edit a page
- `DELETE /repos/{owner}/{repo}/wiki/page/{pageName}` — delete a page
If the wiki has never been initialized (no page created via the UI), GETs may return 404 or an empty list. **Initialize the wiki**: go to https://git.4nkweb.com/4nk/lecoffre_ng/wiki, create at least one page (e.g. "Home") via the UI, then rerun the script with a valid token.
**Default wiki branch:** if the API returns `object does not exist [id: refs/heads/master]` while the wiki repo's default branch is something else (e.g. `prod`), this is a known bug in some Gitea versions (the API assumes `master`). Possible workarounds: (1) **upgrade Gitea** (fixed in recent versions, e.g. PR #34244); (2) **change the wiki's default branch** to `master` in the repository settings (Settings → Branches). Optional variable `GITEA_WIKI_REF=master` (default when the wiki is configured on master).
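A quick manual check of the list endpoint can be wrapped as below (a sketch; `list_wiki_pages` is illustrative and assumes `jq` plus the same token file the scripts use):

```shell
# Lists wiki page titles for the configured repo.
list_wiki_pages() {
  local token
  token="${GITEA_TOKEN:-$(cat .secrets/gitea-issues/token)}"
  curl -sS -H "Authorization: token ${token}" \
    "${GITEA_API_URL:-https://git.4nkweb.com/api/v1}/repos/4nk/lecoffre_ng/wiki/pages" \
    | jq -r '.[].title'
}
```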
### Migrating docs/ → wiki
**Decision:** all content of `docs/` (repo root) is migrated to the wiki; no CI on the wiki.
**Migration script:**
| Script | Usage | Description |
|--------|--------|-------------|
| `wiki-migrate-docs.sh` | `./gitea-issues/wiki-migrate-docs.sh [--dry-run] [file.md ...]` | Migrates `docs/*.md` to the wiki. `--dry-run` prints the mapping without API calls. If files are passed as arguments, migrates only those. |
| `wiki-put-page.sh` | `./gitea-issues/wiki-put-page.sh <page_name> <file_path>` | Updates or creates a wiki page from a local file (e.g. `Home docs/README.md`). |
| `wiki-get-page.sh` | `./gitea-issues/wiki-get-page.sh <page_name>` | Prints the raw markdown of a wiki page (for scripts or agents). |
**File → wiki page mapping:** file name without `.md`, `_` replaced by `-`, title case per segment. E.g. OPERATIONS.md → Operations, README.md → Readme.
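The mapping rule can be expressed as a small helper (a sketch; `to_wiki_page` is illustrative, not one of the shipped scripts):

```shell
# File name -> wiki page name: strip .md, "_" -> "-", title-case each segment.
to_wiki_page() {
  local name="${1%.md}"
  name="${name//_/-}"
  awk -F'-' -v OFS='-' '{ for (i = 1; i <= NF; i++) $i = toupper(substr($i, 1, 1)) tolower(substr($i, 2)); print }' <<<"$name"
}
to_wiki_page OPERATIONS.md   # -> Operations
```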
All 17 files in `docs/` have been migrated; the pages are visible at https://git.4nkweb.com/4nk/lecoffre_ng/wiki. The **Home** page contains the content of `docs/README.md` (index and mapping). **`docs/` is excluded from version control** (`.gitignore`): keep `docs/` locally (do not delete it), push changes to the wiki with `wiki-migrate-docs.sh` or `wiki-put-page.sh`; do not commit `docs/`.
### After a clone
The `docs/` directory is not versioned. To get a local copy (edit, then sync to the wiki), recreate the content from the wiki: e.g. `./gitea-issues/wiki-get-page.sh Home > docs/README.md`, or create the files manually from the wiki pages listed in the Migration section above.
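Such a restore might be sketched like this (the page list is illustrative; adjust it to the pages actually present on the wiki):

```shell
# Recreate a local, unversioned docs/ copy from wiki pages.
restore_docs() {
  mkdir -p docs
  ./gitea-issues/wiki-get-page.sh Home > docs/README.md
  ./gitea-issues/wiki-get-page.sh Operations > docs/OPERATIONS.md
  ./gitea-issues/wiki-get-page.sh Code-Standards > docs/CODE_STANDARDS.md
}
```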
### Wiki-only usage for agents
Project knowledge can rely **solely on the wiki** (without reading `docs/`): agents can run `./gitea-issues/wiki-get-page.sh <PageName>` to fetch a page's markdown content and use it as a reference. Examples: `./gitea-issues/wiki-get-page.sh Home`, `./gitea-issues/wiki-get-page.sh Operations`, `./gitea-issues/wiki-get-page.sh Code-Standards`. Prerequisite: a Gitea token (as for the other wiki scripts). Agents can thus consult the project documentation on demand from the wiki, without depending on the local `docs/` files.
## Agent
The **/gitea-issues-process** command (agent `.cursor/agents/gitea-issues-process.md`) processes one or more issues relying only on these scripts, then calls /fix or /evol and /push-by-script. See the agent file for the exact workflow.
## Reference
- Wiki : https://git.4nkweb.com/4nk/lecoffre_ng/wiki
- Operational documentation (e.g. `docs/OPERATIONS.md`): wiki page **Operations** (after migration).

gitea-issues/agent-loop.env.example
# Agent-loop parameters (Cursor Agent CLI, model, interval).
# Copy to .secrets/gitea-issues/agent-loop.env and set as needed.
# Do not commit .secrets/gitea-issues/agent-loop.env (directory is gitignored).
#
# Run Cursor Agent when unread mails are detected (0 or 1)
# AGENT_LOOP_RUN_AGENT=1
#
# Model used by the CLI (default: sonnet-4.6 to avoid Opus usage limits)
# List: agent models
# AGENT_LOOP_MODEL=sonnet-4.6
#
# Polling interval in seconds (default: 60)
# AGENT_LOOP_INTERVAL_SEC=60
#
# Optional: custom paths for status and pending files
# AGENT_LOOP_STATUS_FILE=logs/gitea-issues/agent-loop.status
# AGENT_LOOP_PENDING_FILE=logs/gitea-issues/agent-loop.pending

gitea-issues/agent-loop.sh
#!/usr/bin/env bash
# Agent loop: poll for unread mails periodically and maintain a witness file.
# Run from repo root. A witness file (status) indicates whether the loop is active.
#
# Usage:
# ./gitea-issues/agent-loop.sh [interval_seconds]
# AGENT_LOOP_INTERVAL_SEC=120 ./gitea-issues/agent-loop.sh
#
# Witness file: logs/gitea-issues/agent-loop.status
# Updated every iteration. If mtime is older than 2*interval, consider the loop stopped.
# Pending file: logs/gitea-issues/agent-loop.pending
# Written when unread mails exist; contains timestamp and mail list. Clear after agent run.
#
# Optional: set AGENT_LOOP_RUN_AGENT=1 to run the Cursor Agent CLI when mails are detected.
# Requires Cursor Agent CLI (https://cursor.com/docs/cli/using). If "agent" is not in PATH, the loop only updates status/pending.
#
# Optional: AGENT_LOOP_MODEL=<model> to force the model (e.g. sonnet-4.6, gpt-5.4-low). Default: sonnet-4.6 to avoid Opus usage limits when running unattended.
#
set -euo pipefail
# Source user env so PATH includes ~/.local/bin (Cursor Agent CLI, etc.)
if [ -n "${HOME:-}" ] && [ -r "$HOME/.bashrc" ]; then
set +u
# shellcheck source=/dev/null
source "$HOME/.bashrc" 2>/dev/null || true
set -u
fi
[ -n "${HOME:-}" ] && [ -d "$HOME/.local/bin" ] && export PATH="$HOME/.local/bin:$PATH"
GITEA_ISSUES_DIR="${GITEA_ISSUES_DIR:-$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)}"
ROOT="$(git rev-parse --show-toplevel 2>/dev/null)" || ROOT="$(cd "${GITEA_ISSUES_DIR}/.." && pwd)"
export GITEA_ISSUES_DIR
export REPO_ROOT="${ROOT}"
cd "$ROOT"
# Load agent-loop parameters from .secrets (optional)
AGENT_LOOP_ENV="$ROOT/.secrets/gitea-issues/agent-loop.env"
if [ -r "$AGENT_LOOP_ENV" ]; then
set +u
# shellcheck source=/dev/null
source "$AGENT_LOOP_ENV"
set -u
fi
INTERVAL="${1:-${AGENT_LOOP_INTERVAL_SEC:-60}}"
STATUS_FILE="${AGENT_LOOP_STATUS_FILE:-$ROOT/logs/gitea-issues/agent-loop.status}"
PENDING_FILE="${AGENT_LOOP_PENDING_FILE:-$ROOT/logs/gitea-issues/agent-loop.pending}"
mkdir -p "$(dirname "$STATUS_FILE")"
write_status() {
local status="$1"
local detail="${2:-}"
printf "%s\n%s\n%s\n" "$(date -Iseconds)" "$status" "$detail" > "$STATUS_FILE"
}
while true; do
write_status "running" "interval=${INTERVAL}s"
out=""
if out=$(./gitea-issues/mail-list-unread.sh 2>&1); then
if echo "$out" | grep -q "UID="; then
write_status "mails_pending" "Des mails non lus. Lancer l'agent gitea-issues-process dans Cursor."
printf "%s\n%s\n%s\n%s\n" "$(date -Iseconds)" "mails_pending" "---" "$out" > "$PENDING_FILE"
echo "[agent-loop] $(date -Iseconds) — Mails non lus détectés. Lancer l'agent gitea-issues-process dans Cursor."
if [ "${AGENT_LOOP_RUN_AGENT:-0}" = "1" ] && command -v agent >/dev/null 2>&1; then
write_status "running_agent" "Lancement de l'agent Cursor pour traiter les mails."
echo "[agent-loop] $(date -Iseconds) — Lancement de l'agent Cursor (workflow gitea-issues-process mails)."
AGENT_MODEL="${AGENT_LOOP_MODEL:-sonnet-4.6}"
echo "[agent-loop] $(date -Iseconds) — Modèle: $AGENT_MODEL"
AGENT_OPTS=(-p "Exécute le workflow mails entrants de l'agent gitea-issues-process : les mails non lus viennent d'être détectés. 1) Pour chaque mail listé (voir contenu dans logs/gitea-issues/agent-loop.pending) : exécuter ./gitea-issues/mail-get-thread.sh <uid>, puis ./gitea-issues/mail-thread-log.sh init --uid <uid>, conserver THREAD_ID. 2) Pour chaque mail : rédiger une réponse (non technique, didactique, contexte LeCoffre.io ; pour bug demander l'environnement, pour évolution considérer test), envoyer avec ./gitea-issues/mail-send-reply.sh, puis ./gitea-issues/mail-thread-log.sh append-sent, puis ./gitea-issues/mail-mark-read.sh <uid>. 3) Uniquement branche test, ne pas modifier les agents ni scripts d'agents. Répondre à tous les mails avant de marquer comme lu." -f --model "$AGENT_MODEL")
if agent "${AGENT_OPTS[@]}" 2>&1; then
write_status "agent_done" "Agent terminé."
else
write_status "mails_pending" "Agent terminé avec erreur ou interruption. Relancer l'agent manuellement si besoin."
fi
fi
else
write_status "idle" "Aucun mail non lu."
if [ -f "$PENDING_FILE" ]; then
: > "$PENDING_FILE"
fi
fi
else
write_status "error" "mail-list-unread a échoué"
echo "[agent-loop] $(date -Iseconds) — Erreur mail-list-unread" >&2
fi
sleep "$INTERVAL"
done

gitea-issues/comment-issue.sh
#!/usr/bin/env bash
#
# Add a comment to an issue via Gitea API. The comment body is automatically
# signed with: Support IA du projet Lecoffre.io / ai.support.lecoffreio@4nkweb.com
# Usage: ./comment-issue.sh <issue_number> <message>
# Or: echo "message" | ./comment-issue.sh <issue_number> -
#
set -euo pipefail
GITEA_ISSUES_DIR="${GITEA_ISSUES_DIR:-$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)}"
# shellcheck source=lib.sh
source "${GITEA_ISSUES_DIR}/lib.sh"
# Signature appended to every comment (same as mail replies)
COMMENT_SIGNATURE=$'\n\n--\nSupport IA du projet Lecoffre.io\nai.support.lecoffreio@4nkweb.com'
if [[ -n "${GITEA_COMMENT_SIGNATURE:-}" ]]; then
COMMENT_SIGNATURE="${GITEA_COMMENT_SIGNATURE}"
fi
require_jq || exit 1
if [[ $# -lt 1 ]]; then
log_err "Usage: $0 <issue_number> <message>"
log_err " Or: $0 <issue_number> - (read message from stdin)"
exit 1
fi
ISSUE_NUM="$1"
if [[ "${2:-}" == "-" ]]; then
BODY="$(cat)"
else
BODY="${2:-}"
fi
if [[ -z "$BODY" ]]; then
log_err "Comment body is empty."
exit 1
fi
BODY="${BODY}${COMMENT_SIGNATURE}"
# Escape for JSON: jq -Rs . handles newlines and quotes
BODY_JSON="$(echo "$BODY" | jq -Rs .)"
RESPONSE="$(gitea_api_post "/repos/${GITEA_REPO_OWNER}/${GITEA_REPO_NAME}/issues/${ISSUE_NUM}/comments" "{\"body\":${BODY_JSON}}")"
if ! echo "$RESPONSE" | jq -e . &>/dev/null; then
log_err "API error posting comment: ${RESPONSE:0:200}"
exit 1
fi
log_info "Comment added to issue #${ISSUE_NUM}."

gitea-issues/create-branch-for-issue.sh
#!/usr/bin/env bash
#
# Create a local branch for an issue. Branch name: issue/<number> (safe, short).
# Base branch defaults to "test"; ensure it is up to date (fetch + reset to origin/base).
# Usage: ./create-branch-for-issue.sh <issue_number> [base_branch]
#
set -euo pipefail
GITEA_ISSUES_DIR="${GITEA_ISSUES_DIR:-$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)}"
# shellcheck source=lib.sh
source "${GITEA_ISSUES_DIR}/lib.sh"
if [[ $# -lt 1 ]]; then
log_err "Usage: $0 <issue_number> [base_branch]"
exit 1
fi
ISSUE_NUM="$1"
BASE="${2:-test}"
require_git_root || exit 1
if git show-ref --quiet "refs/heads/issue/${ISSUE_NUM}"; then
log_info "Branch issue/${ISSUE_NUM} already exists. Checking it out."
git checkout "issue/${ISSUE_NUM}"
echo "issue/${ISSUE_NUM}"
exit 0
fi
if ! git show-ref --quiet "refs/heads/${BASE}"; then
log_err "Base branch ${BASE} does not exist locally."
exit 1
fi
git fetch origin
if git show-ref --quiet "refs/remotes/origin/${BASE}"; then
git checkout "${BASE}"
git reset --hard "origin/${BASE}"
else
git checkout "${BASE}"
fi
git checkout -b "issue/${ISSUE_NUM}"
log_info "Created and checked out branch issue/${ISSUE_NUM} from ${BASE}."
echo "issue/${ISSUE_NUM}"

gitea-issues/get-issue.sh
#!/usr/bin/env bash
#
# Get one issue by number. Output: JSON (default) or plain text summary (--summary).
# Usage: ./get-issue.sh <issue_number> [--summary]
#
set -euo pipefail
GITEA_ISSUES_DIR="${GITEA_ISSUES_DIR:-$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)}"
# shellcheck source=lib.sh
source "${GITEA_ISSUES_DIR}/lib.sh"
require_jq || exit 1
if [[ $# -lt 1 ]]; then
log_err "Usage: $0 <issue_number> [--summary]"
exit 1
fi
ISSUE_NUM="$1"
SUMMARY=false
[[ "${2:-}" == "--summary" ]] && SUMMARY=true
RESPONSE="$(gitea_api_get "/repos/${GITEA_REPO_OWNER}/${GITEA_REPO_NAME}/issues/${ISSUE_NUM}")"
if ! echo "$RESPONSE" | jq -e . &>/dev/null; then
log_err "API error or invalid JSON (issue ${ISSUE_NUM}): ${RESPONSE:0:200}"
exit 1
fi
if [[ "$SUMMARY" == true ]]; then
echo "--- Issue #${ISSUE_NUM} ---"
echo "Title: $(echo "$RESPONSE" | jq -r '.title')"
echo "State: $(echo "$RESPONSE" | jq -r '.state')"
echo "Labels: $(echo "$RESPONSE" | jq -r '[.labels[].name] | join(", ")')"
echo "Body:"
echo "$RESPONSE" | jq -r '.body // "(empty)"'
echo "---"
else
echo "$RESPONSE"
fi

gitea-issues/imap-bridge.env.example
# IMAP config for mail-to-issue (e.g. Proton Mail Bridge).
# Copy to .secrets/gitea-issues/imap-bridge.env and set real values.
# Do not commit .secrets/gitea-issues/imap-bridge.env (directory is gitignored).
#
# IMAP (read)
# IMAP_HOST=127.0.0.1
# IMAP_PORT=1143
# IMAP_USER=your-address@pm.me
# IMAP_PASSWORD=your-bridge-password
# IMAP_USE_STARTTLS=true
# For local Proton Bridge with self-signed cert, set to false to skip SSL verification (localhost only).
# IMAP_SSL_VERIFY=false
#
# SMTP (send replies; same Bridge account)
# SMTP_HOST=127.0.0.1
# SMTP_PORT=1025
# SMTP_USER=your-address@pm.me
# SMTP_PASSWORD=your-bridge-password
# SMTP_USE_STARTTLS=true
#
# Restrict listed mails to those sent to this address (default: ai.support.lecoffreio@4nkweb.com)
# MAIL_FILTER_TO=ai.support.lecoffreio@4nkweb.com
#
# Signature appended to every reply (default: Support IA du projet Lecoffre.io + ai.support.lecoffreio@4nkweb.com)
# MAIL_REPLY_SIGNATURE=--\\nSupport IA du projet Lecoffre.io\\nai.support.lecoffreio@4nkweb.com
#
# Signature for Gitea issue comments (optional; comment-issue.sh uses same default as mail signature)
# GITEA_COMMENT_SIGNATURE=
IMAP_HOST=127.0.0.1
IMAP_PORT=1143
IMAP_USER=
IMAP_PASSWORD=
IMAP_USE_STARTTLS=true
SMTP_HOST=127.0.0.1
SMTP_PORT=1025
SMTP_USER=
SMTP_PASSWORD=
SMTP_USE_STARTTLS=true

gitea-issues/lib.sh
#!/usr/bin/env bash
#
# Shared config and helpers for Gitea issues scripts.
# Source from gitea-issues/*.sh after cd to project root.
#
set -euo pipefail
GITEA_ISSUES_DIR="${GITEA_ISSUES_DIR:-$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)}"
GITEA_API_URL="${GITEA_API_URL:-https://git.4nkweb.com/api/v1}"
GITEA_REPO_OWNER="${GITEA_REPO_OWNER:-4nk}"
GITEA_REPO_NAME="${GITEA_REPO_NAME:-lecoffre_ng}"
# Optional: load project config from ia_dev (projects/<id>/conf.json) when gitea-issues is inside ia_dev
PROJECT_CONFIG_PATH=""
if [[ -f "${GITEA_ISSUES_DIR}/../lib/project_config.sh" ]]; then
PROJECT_ROOT="$(git rev-parse --show-toplevel 2>/dev/null)" || true
IA_DEV_ROOT="$(cd "$GITEA_ISSUES_DIR/.." && pwd)"
if [[ -n "${PROJECT_ROOT:-}" ]]; then
# shellcheck source=../lib/project_config.sh
source "${GITEA_ISSUES_DIR}/../lib/project_config.sh"
fi
fi
# Load token: GITEA_TOKEN env, then project config git.token_file, then default .secrets path
load_gitea_token() {
if [[ -n "${GITEA_TOKEN:-}" ]]; then
return 0
fi
local token_file=""
if [[ -n "${PROJECT_CONFIG_PATH:-}" && -f "$PROJECT_CONFIG_PATH" ]] && command -v jq >/dev/null 2>&1; then
local rel_path
rel_path="$(jq -r '.git.token_file // empty' "$PROJECT_CONFIG_PATH" 2>/dev/null)"
if [[ -n "$rel_path" && -n "${PROJECT_ROOT:-}" && -f "${PROJECT_ROOT}/${rel_path}" ]]; then
token_file="${PROJECT_ROOT}/${rel_path}"
fi
fi
if [[ -z "$token_file" ]]; then
token_file="${GITEA_ISSUES_DIR}/../.secrets/gitea-issues/token"
fi
if [[ -f "$token_file" ]]; then
GITEA_TOKEN="$(cat "$token_file")"
return 0
fi
echo "[gitea-issues] ERROR: GITEA_TOKEN not set and ${token_file} not found" >&2
echo "[gitea-issues] Set GITEA_TOKEN or create the token file with a Gitea Personal Access Token." >&2
return 1
}
# curl wrapper for Gitea API (GET). Usage: gitea_api_get "/repos/owner/repo/issues"
gitea_api_get() {
local path="$1"
load_gitea_token || return 1
curl -sS -H "Accept: application/json" \
-H "Authorization: token ${GITEA_TOKEN}" \
"${GITEA_API_URL}${path}"
}
# curl wrapper for Gitea API (POST). Usage: gitea_api_post "/repos/owner/repo/issues/123/comments" '{"body":"..."}'
gitea_api_post() {
local path="$1"
local data="${2:-}"
load_gitea_token || return 1
curl -sS -X POST -H "Accept: application/json" -H "Content-Type: application/json" \
-H "Authorization: token ${GITEA_TOKEN}" \
-d "$data" \
"${GITEA_API_URL}${path}"
}
# curl wrapper for Gitea API (PATCH). Usage: gitea_api_patch "/repos/owner/repo/wiki/page/Foo" '{"content_base64":"..."}'
gitea_api_patch() {
local path="$1"
local data="${2:-}"
load_gitea_token || return 1
curl -sS -X PATCH -H "Accept: application/json" -H "Content-Type: application/json" \
-H "Authorization: token ${GITEA_TOKEN}" \
-d "$data" \
"${GITEA_API_URL}${path}"
}
# curl wrapper for Gitea API (DELETE). Usage: gitea_api_delete "/repos/owner/repo/wiki/page/Foo"
gitea_api_delete() {
local path="$1"
load_gitea_token || return 1
curl -sS -X DELETE -H "Accept: application/json" \
-H "Authorization: token ${GITEA_TOKEN}" \
"${GITEA_API_URL}${path}"
}
log_ts() { date -u '+%Y-%m-%dT%H:%M:%SZ'; }
log_info() { echo "[$(log_ts)] [gitea-issues] $*"; }
log_err() { echo "[$(log_ts)] [gitea-issues] $*" >&2; }
# Require jq for JSON output
require_jq() {
if ! command -v jq &>/dev/null; then
log_err "jq is required. Install with: apt install jq / brew install jq"
return 1
fi
}
# Ensure we are in the git repo root (for create-branch, etc.)
require_git_root() {
local root
root="$(git rev-parse --show-toplevel 2>/dev/null)" || true
if [[ -z "$root" ]]; then
log_err "Not inside a git repository."
return 1
fi
cd "$root"
}

gitea-issues/list-open-issues.sh
#!/usr/bin/env bash
#
# List open issues for the configured Gitea repo.
# Output: JSON array (default) or one line per issue "number|title|state" with --lines.
# Usage: ./list-open-issues.sh [--lines] [--limit N]
#
set -euo pipefail
GITEA_ISSUES_DIR="${GITEA_ISSUES_DIR:-$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)}"
# shellcheck source=lib.sh
source "${GITEA_ISSUES_DIR}/lib.sh"
require_jq || exit 1
LINES=false
LIMIT=50
while [[ $# -gt 0 ]]; do
case "$1" in
--lines) LINES=true; shift ;;
--limit) LIMIT="$2"; shift 2 ;;
*) log_err "Unknown option: $1"; exit 1 ;;
esac
done
RESPONSE="$(gitea_api_get "/repos/${GITEA_REPO_OWNER}/${GITEA_REPO_NAME}/issues?state=open&page=1&limit=${LIMIT}")"
if ! echo "$RESPONSE" | jq -e . &>/dev/null; then
log_err "API error or invalid JSON: ${RESPONSE:0:200}"
exit 1
fi
if [[ "$LINES" == true ]]; then
echo "$RESPONSE" | jq -r '.[] | "\(.number)|\(.title)|\(.state)"'
else
echo "$RESPONSE"
fi

gitea-issues/mail-create-issue-from-email.py
#!/usr/bin/env python3
"""
Create one Gitea issue from one email (by UID), then mark the email as read.
If --title and/or --body are provided (formalized by agent), use them; else use subject and body from the email.
Usage: ./gitea-issues/mail-create-issue-from-email.sh --uid <uid> [--title "..." ] [--body "..." ]
"""
from __future__ import annotations
import argparse
import email
import imaplib
import ssl
import sys
from email.header import decode_header
from pathlib import Path
sys.path.insert(0, str(Path(__file__).resolve().parent))
from mail_common import (
create_gitea_issue,
load_gitea_config,
load_imap_config,
repo_root,
sanitize_title,
)
def decode_header_value(header: str | None) -> str:
    if not header:
        return ""
    result = []
    for part, charset in decode_header(header):
        if isinstance(part, bytes):
            result.append(part.decode(charset or "utf-8", errors="replace"))
        else:
            result.append(part)
    return "".join(result)
def get_text_body(msg: email.message.Message) -> str:
if msg.is_multipart():
for part in msg.walk():
if part.get_content_type() == "text/plain":
payload = part.get_payload(decode=True)
if payload:
return payload.decode(part.get_content_charset() or "utf-8", errors="replace")
return ""
payload = msg.get_payload(decode=True)
if not payload:
return ""
return payload.decode(msg.get_content_charset() or "utf-8", errors="replace")
def main() -> None:
ap = argparse.ArgumentParser(description="Create one Gitea issue from one email by UID")
ap.add_argument("--uid", required=True, help="IMAP message UID")
ap.add_argument("--title", default="", help="Formalized issue title (else use subject)")
ap.add_argument("--body", default="", help="Formalized issue body (else use email body + From)")
args = ap.parse_args()
cfg = load_imap_config()
    if not cfg["user"] or not cfg["password"]:
        env_path = repo_root() / ".secrets" / "gitea-issues" / "imap-bridge.env"
        print(f"[gitea-issues] ERROR: IMAP_USER and IMAP_PASSWORD required (see {env_path}).", file=sys.stderr)
        sys.exit(1)
gitea = load_gitea_config()
if not gitea["token"]:
print("[gitea-issues] ERROR: GITEA_TOKEN not set.", file=sys.stderr)
sys.exit(1)
mail = imaplib.IMAP4(cfg["host"], int(cfg["port"]))
if cfg["use_starttls"]:
mail.starttls(ssl.create_default_context())
mail.login(cfg["user"], cfg["password"])
mail.select("INBOX")
    _, data = mail.uid("fetch", args.uid, "(RFC822)")
if not data or not data[0]:
print("[gitea-issues] ERROR: Message UID not found.", file=sys.stderr)
mail.logout()
sys.exit(1)
msg = email.message_from_bytes(data[0][1])
from_ = decode_header_value(msg.get("From"))
subject = decode_header_value(msg.get("Subject"))
body_text = get_text_body(msg)
body_for_issue = f"**From:** {from_}\n\n{body_text}".strip()
title = args.title.strip() if args.title else sanitize_title(subject)
body = args.body.strip() if args.body else body_for_issue
issue = create_gitea_issue(title, body)
if not issue:
print("[gitea-issues] ERROR: Failed to create issue.", file=sys.stderr)
mail.logout()
sys.exit(1)
mail.store(args.uid, "+FLAGS", "\\Seen")
mail.logout()
num = issue.get("number", "?")
print(f"[gitea-issues] Created issue #{num}: {title[:60]}")
print(f"ISSUE_NUMBER={num}")
if __name__ == "__main__":
main()


@@ -0,0 +1,10 @@
#!/usr/bin/env bash
# Create one Gitea issue from one email (by UID), mark email read. Run from repo root.
# Usage: ./gitea-issues/mail-create-issue-from-email.sh --uid <uid> [--title "..." ] [--body "..." ]
set -euo pipefail
GITEA_ISSUES_DIR="${GITEA_ISSUES_DIR:-$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)}"
ROOT="$(git rev-parse --show-toplevel 2>/dev/null)" || ROOT="$(cd "${GITEA_ISSUES_DIR}/.." && pwd)"
export GITEA_ISSUES_DIR
export REPO_ROOT="${ROOT}"
cd "$ROOT"
exec python3 "${GITEA_ISSUES_DIR}/mail-create-issue-from-email.py" "$@"


@@ -0,0 +1,207 @@
#!/usr/bin/env python3
"""
Fetch the full email thread (conversation) for a given message UID.
Uses Message-ID, References and In-Reply-To to find all messages in the thread.
Output format: same as mail-list-unread (--- MAIL UID=... --- ... --- END MAIL ---), chronological order.
Usage: mail-get-thread.py <uid>
or: ./gitea-issues/mail-get-thread.sh <uid>
"""
from __future__ import annotations
import email
import imaplib
import re
import sys
from email.header import decode_header
from pathlib import Path
sys.path.insert(0, str(Path(__file__).resolve().parent))
from mail_common import load_imap_config, repo_root, imap_ssl_context
def decode_header_value(header: str | None) -> str:
if not header:
return ""
parts = decode_header(header)
result = []
for part, charset in parts:
if isinstance(part, bytes):
result.append(part.decode(charset or "utf-8", errors="replace"))
else:
result.append(part)
return "".join(result)
def get_text_body(msg: email.message.Message) -> str:
if msg.is_multipart():
for part in msg.walk():
if part.get_content_type() == "text/plain":
payload = part.get_payload(decode=True)
if payload:
return payload.decode(
part.get_content_charset() or "utf-8", errors="replace"
)
return ""
payload = msg.get_payload(decode=True)
if not payload:
return ""
return payload.decode(
msg.get_content_charset() or "utf-8", errors="replace"
)
def parse_message_ids(refs: str | None, in_reply_to: str | None) -> set[str]:
"""Extract Message-ID values from References and In-Reply-To headers."""
ids: set[str] = set()
for raw in (refs or "", in_reply_to or ""):
for part in re.split(r"\s+", raw.strip()):
part = part.strip()
if part.startswith("<") and ">" in part:
ids.add(part)
elif part and "@" in part and part not in ("<", ">"):
ids.add(part if part.startswith("<") else f"<{part}>")
return ids
def find_message_ids_from_msg(msg: email.message.Message) -> set[str]:
mid = (msg.get("Message-ID") or "").strip()
refs = (msg.get("References") or "").strip()
in_reply = (msg.get("In-Reply-To") or "").strip()
ids = {mid} if mid else set()
ids |= parse_message_ids(refs, in_reply)
return ids
def search_by_message_id(mail: imaplib.IMAP4, msg_id: str) -> list[str]:
"""Return list of UIDs (as strings) for messages with given Message-ID."""
if not msg_id:
return []
if not msg_id.startswith("<"):
msg_id = f"<{msg_id}>"
if not msg_id.endswith(">"):
msg_id = msg_id + ">"
criterion = f'HEADER Message-ID "{msg_id}"'
try:
_, data = mail.search(None, criterion)
except Exception:
return []
if not data or not data[0]:
return []
return [u.decode("ascii") for u in data[0].split() if u]
def fetch_message_by_uid(
mail: imaplib.IMAP4, uid: str
) -> email.message.Message | None:
"""Fetch a single message by UID. Returns parsed email or None."""
try:
_, data = mail.fetch(uid.encode("ascii"), "(RFC822)")
except Exception:
return None
if not data or not data[0] or len(data[0]) < 2:
return None
raw = data[0][1]
if isinstance(raw, bytes):
return email.message_from_bytes(raw)
return None
def format_message(uid: str, msg: email.message.Message) -> str:
mid = (msg.get("Message-ID") or "").strip()
from_ = decode_header_value(msg.get("From"))
to_ = decode_header_value(msg.get("To"))
subj = decode_header_value(msg.get("Subject"))
date_h = decode_header_value(msg.get("Date"))
body = get_text_body(msg)
lines = [
"--- MAIL",
f"UID={uid}",
"---",
"Message-ID: " + (mid or "(none)"),
"From: " + from_,
"To: " + (to_ or ""),
"Subject: " + subj,
"Date: " + (date_h or ""),
"Body:",
body or "(empty)",
"--- END MAIL ---",
]
return "\n".join(lines)
def main() -> int:
if len(sys.argv) < 2:
print("Usage: mail-get-thread.py <uid>", file=sys.stderr)
return 1
uid0 = sys.argv[1].strip()
if not uid0:
print("[gitea-issues] ERROR: UID required.", file=sys.stderr)
return 1
cfg = load_imap_config()
if not cfg["user"] or not cfg["password"]:
root = repo_root()
env_path = root / ".secrets" / "gitea-issues" / "imap-bridge.env"
print(
"[gitea-issues] ERROR: IMAP_USER and IMAP_PASSWORD required.",
file=sys.stderr,
)
print(f"[gitea-issues] Set env or create {env_path}", file=sys.stderr)
return 1
mail = imaplib.IMAP4(cfg["host"], int(cfg["port"]))
if cfg["use_starttls"]:
mail.starttls(imap_ssl_context(cfg.get("ssl_verify", True)))
mail.login(cfg["user"], cfg["password"])
mail.select("INBOX")
msg0 = fetch_message_by_uid(mail, uid0)
if not msg0:
print(f"[gitea-issues] No message found for UID={uid0}.", file=sys.stderr)
mail.logout()
return 1
to_fetch: set[str] = find_message_ids_from_msg(msg0)
seen_ids: set[str] = set()
uids_by_mid: dict[str, str] = {}
while to_fetch:
mid = to_fetch.pop()
if not mid or mid in seen_ids:
continue
seen_ids.add(mid)
uids = search_by_message_id(mail, mid)
if uids:
uids_by_mid[mid] = uids[0]
msg = fetch_message_by_uid(mail, uids[0])
if msg:
to_fetch |= find_message_ids_from_msg(msg)
mid0 = (msg0.get("Message-ID") or "").strip()
if mid0 and mid0 not in uids_by_mid:
uids_by_mid[mid0] = uid0
collected: list[tuple[str, str, email.message.Message]] = []
for _mid, uid in uids_by_mid.items():
msg = fetch_message_by_uid(mail, uid)
if not msg:
continue
date_h = (msg.get("Date") or "").strip()
collected.append((date_h, uid, msg))
if uid0 not in uids_by_mid.values():
date0 = (msg0.get("Date") or "").strip()
collected.append((date0, uid0, msg0))
# Raw Date headers are strings and do not sort chronologically; parse them first.
def _date_key(item: tuple[str, str, email.message.Message]) -> float:
    from email.utils import parsedate_to_datetime
    try:
        dt = parsedate_to_datetime(item[0])
    except (TypeError, ValueError):
        dt = None
    return dt.timestamp() if dt else 0.0
collected.sort(key=_date_key)
for _date, uid, msg in collected:
print(format_message(uid, msg))
mail.logout()
return 0
if __name__ == "__main__":
sys.exit(main())
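The thread walk above hinges on `parse_message_ids` pulling IDs out of References/In-Reply-To headers. A standalone sketch of that normalization (the sample header values are made up):

```python
import re

def parse_message_ids(refs: str, in_reply_to: str) -> set[str]:
    # Same logic as the parser above: keep bracketed IDs, bracket bare addr-spec tokens.
    ids: set[str] = set()
    for raw in (refs or "", in_reply_to or ""):
        for part in re.split(r"\s+", raw.strip()):
            part = part.strip()
            if part.startswith("<") and ">" in part:
                ids.add(part)
            elif part and "@" in part and part not in ("<", ">"):
                ids.add(part if part.startswith("<") else f"<{part}>")
    return ids

print(sorted(parse_message_ids("<first@mail.example> <second@mail.example>", "second@mail.example")))
# → ['<first@mail.example>', '<second@mail.example>']
```

Note the bare `second@mail.example` from In-Reply-To is bracketed and deduplicated against the same ID from References, so each message is fetched once.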

gitea-issues/mail-get-thread.sh Executable file

@@ -0,0 +1,14 @@
#!/usr/bin/env bash
# Fetch full email thread for a given UID. Run from repo root.
# Usage: ./gitea-issues/mail-get-thread.sh <uid>
set -euo pipefail
GITEA_ISSUES_DIR="${GITEA_ISSUES_DIR:-$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)}"
ROOT="$(git rev-parse --show-toplevel 2>/dev/null)" || ROOT="$(cd "${GITEA_ISSUES_DIR}/.." && pwd)"
export GITEA_ISSUES_DIR
export REPO_ROOT="${ROOT}"
cd "$ROOT"
if [ $# -lt 1 ]; then
echo "Usage: $0 <uid>" >&2
exit 1
fi
exec python3 "${GITEA_ISSUES_DIR}/mail-get-thread.py" "$1"

gitea-issues/mail-list-unread.py Executable file

@@ -0,0 +1,117 @@
#!/usr/bin/env python3
"""
List unread emails via IMAP (e.g. Proton Mail Bridge). Read-only; does not mark as read.
Only lists messages sent to the configured alias (MAIL_FILTER_TO, default ai.support.lecoffreio@4nkweb.com).
Output is for the agent: each mail with UID, Message-ID, From, To, Subject, Date, body.
Usage: ./gitea-issues/mail-list-unread.sh
"""
from __future__ import annotations
import email
import imaplib
import sys
from email.header import decode_header
from pathlib import Path
# Add gitea-issues to path for mail_common
sys.path.insert(0, str(Path(__file__).resolve().parent))
from mail_common import load_imap_config, repo_root, imap_ssl_context
def decode_header_value(header: str | None) -> str:
if not header:
return ""
parts = decode_header(header)
result = []
for part, charset in parts:
if isinstance(part, bytes):
result.append(part.decode(charset or "utf-8", errors="replace"))
else:
result.append(part)
return "".join(result)
def get_text_body(msg: email.message.Message) -> str:
if msg.is_multipart():
for part in msg.walk():
if part.get_content_type() == "text/plain":
payload = part.get_payload(decode=True)
if payload:
return payload.decode(part.get_content_charset() or "utf-8", errors="replace")
return ""
payload = msg.get_payload(decode=True)
if not payload:
return ""
return payload.decode(msg.get_content_charset() or "utf-8", errors="replace")
def is_sent_to_alias(msg: email.message.Message, filter_to: str) -> bool:
"""True if any To/Delivered-To/X-Original-To/Cc header contains the filter address."""
if not filter_to:
return True
headers_to_check = ("To", "Delivered-To", "X-Original-To", "Cc", "Envelope-To")
for name in headers_to_check:
value = msg.get(name)
if value:
decoded = decode_header_value(value).lower()
if filter_to in decoded:
return True
return False
def main() -> None:
cfg = load_imap_config()
if not cfg["user"] or not cfg["password"]:
root = repo_root()
env_path = root / ".secrets" / "gitea-issues" / "imap-bridge.env"
print("[gitea-issues] ERROR: IMAP_USER and IMAP_PASSWORD required.", file=sys.stderr)
print(f"[gitea-issues] Set env or create {env_path}", file=sys.stderr)
sys.exit(1)
mail = imaplib.IMAP4(cfg["host"], int(cfg["port"]))
if cfg["use_starttls"]:
mail.starttls(imap_ssl_context(cfg.get("ssl_verify", True)))
mail.login(cfg["user"], cfg["password"])
mail.select("INBOX")
_, nums = mail.search(None, "UNSEEN")
ids = nums[0].split()
if not ids:
print("[gitea-issues] No unread messages.")
mail.logout()
return
shown = 0
for uid in ids:
uid_s = uid.decode("ascii")
_, data = mail.fetch(uid, "(RFC822)")
if not data or not data[0]:
continue
msg = email.message_from_bytes(data[0][1])
if not is_sent_to_alias(msg, cfg.get("filter_to", "")):
continue
mid = msg.get("Message-ID", "").strip()
from_ = decode_header_value(msg.get("From"))
to_ = decode_header_value(msg.get("To"))
subj = decode_header_value(msg.get("Subject"))
date_h = decode_header_value(msg.get("Date"))
body = get_text_body(msg)
print("--- MAIL", f"UID={uid_s}", "---")
print("Message-ID:", mid or "(none)")
print("From:", from_)
print("To:", to_ or "")
print("Subject:", subj)
print("Date:", date_h or "")
print("Body:")
print(body or "(empty)")
print("--- END MAIL ---")
shown += 1
if shown == 0:
print("[gitea-issues] No unread messages sent to the configured alias (MAIL_FILTER_TO).")
mail.logout()
if __name__ == "__main__":
main()


@@ -0,0 +1,9 @@
#!/usr/bin/env bash
# List unread emails (read-only). Run from repo root.
set -euo pipefail
GITEA_ISSUES_DIR="${GITEA_ISSUES_DIR:-$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)}"
ROOT="$(git rev-parse --show-toplevel 2>/dev/null)" || ROOT="$(cd "${GITEA_ISSUES_DIR}/.." && pwd)"
export GITEA_ISSUES_DIR
export REPO_ROOT="${ROOT}"
cd "$ROOT"
exec python3 "${GITEA_ISSUES_DIR}/mail-list-unread.py"

gitea-issues/mail-mark-read.py Executable file

@@ -0,0 +1,41 @@
#!/usr/bin/env python3
"""
Mark one email as read by UID (e.g. after replying without creating an issue).
Usage: ./gitea-issues/mail-mark-read.sh <uid>
"""
from __future__ import annotations
import imaplib
import sys
from pathlib import Path
sys.path.insert(0, str(Path(__file__).resolve().parent))
from mail_common import load_imap_config, repo_root, imap_ssl_context
def main() -> None:
if len(sys.argv) < 2:
print("[gitea-issues] Usage: mail-mark-read.sh <uid>", file=sys.stderr)
sys.exit(1)
uid = sys.argv[1].strip()
cfg = load_imap_config()
if not cfg["user"] or not cfg["password"]:
root = repo_root()
env_path = root / ".secrets" / "gitea-issues" / "imap-bridge.env"
print("[gitea-issues] ERROR: IMAP_USER and IMAP_PASSWORD required.", file=sys.stderr)
sys.exit(1)
mail = imaplib.IMAP4(cfg["host"], int(cfg["port"]))
if cfg["use_starttls"]:
mail.starttls(imap_ssl_context(cfg.get("ssl_verify", True)))
mail.login(cfg["user"], cfg["password"])
mail.select("INBOX")
mail.store(uid, "+FLAGS", "\\Seen")
mail.logout()
print("[gitea-issues] Marked as read.")
if __name__ == "__main__":
main()

gitea-issues/mail-mark-read.sh Executable file

@@ -0,0 +1,10 @@
#!/usr/bin/env bash
# Mark one email as read by UID. Run from repo root.
# Usage: ./gitea-issues/mail-mark-read.sh <uid>
set -euo pipefail
GITEA_ISSUES_DIR="${GITEA_ISSUES_DIR:-$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)}"
ROOT="$(git rev-parse --show-toplevel 2>/dev/null)" || ROOT="$(cd "${GITEA_ISSUES_DIR}/.." && pwd)"
export GITEA_ISSUES_DIR
export REPO_ROOT="${ROOT}"
cd "$ROOT"
exec python3 "${GITEA_ISSUES_DIR}/mail-mark-read.py" "$@"

gitea-issues/mail-send-reply.py Executable file

@@ -0,0 +1,73 @@
#!/usr/bin/env python3
"""
Send a reply email via SMTP (e.g. Proton Mail Bridge).
Usage: ./gitea-issues/mail-send-reply.sh --to addr@example.com --subject "..." --body "..." [--in-reply-to "<msg-id>" [--references "<refs>"]]
Or: echo "body" | ./gitea-issues/mail-send-reply.sh --to addr@example.com --subject "..." [--in-reply-to "<msg-id>"]
"""
from __future__ import annotations
import argparse
import os
import smtplib
import sys
from email.mime.text import MIMEText
from pathlib import Path
sys.path.insert(0, str(Path(__file__).resolve().parent))
from mail_common import load_smtp_config, repo_root, imap_ssl_context
DEFAULT_SIGNATURE = """--
Support IA du projet Lecoffre.io
ai.support.lecoffreio@4nkweb.com"""
def get_reply_signature() -> str:
sig = os.environ.get("MAIL_REPLY_SIGNATURE", "").strip()
if sig:
return "\n\n" + sig.replace("\\n", "\n")
return "\n\n" + DEFAULT_SIGNATURE
def main() -> None:
ap = argparse.ArgumentParser(description="Send reply email via Bridge SMTP")
ap.add_argument("--to", required=True, help="To address")
ap.add_argument("--subject", required=True, help="Subject")
ap.add_argument("--body", default="", help="Body (or use stdin)")
ap.add_argument("--in-reply-to", default="", help="Message-ID of the message we reply to")
ap.add_argument("--references", default="", help="References header for threading")
args = ap.parse_args()
cfg = load_smtp_config()
if not cfg["user"] or not cfg["password"]:
root = repo_root()
env_path = root / ".secrets" / "gitea-issues" / "imap-bridge.env"
print("[gitea-issues] ERROR: SMTP_USER and SMTP_PASSWORD required.", file=sys.stderr)
print(f"[gitea-issues] Set env or create {env_path}", file=sys.stderr)
sys.exit(1)
body = args.body
if not body and not sys.stdin.isatty():
body = sys.stdin.read()
body = (body.rstrip() + get_reply_signature()).strip()
msg = MIMEText(body, "plain", "utf-8")
msg["Subject"] = args.subject
msg["From"] = cfg["user"]
msg["To"] = args.to
if args.in_reply_to:
msg["In-Reply-To"] = args.in_reply_to
if args.references:
msg["References"] = args.references
with smtplib.SMTP(cfg["host"], int(cfg["port"])) as smtp:
if cfg["use_starttls"]:
smtp.starttls(context=imap_ssl_context(cfg.get("ssl_verify", True)))
smtp.login(cfg["user"], cfg["password"])
smtp.sendmail(cfg["user"], [args.to], msg.as_string())
print("[gitea-issues] Reply sent.")
if __name__ == "__main__":
main()
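Threading in the reply above comes purely from the In-Reply-To and References headers. A minimal sketch of the message construction (addresses and Message-IDs are made up; signature appending and SMTP delivery are omitted):

```python
from email.mime.text import MIMEText

msg = MIMEText("Merci, correction livrée.", "plain", "utf-8")
msg["Subject"] = "Re: Bug report"
msg["From"] = "ai.support.lecoffreio@4nkweb.com"
msg["To"] = "alice@example.com"
# Quoting the original Message-ID keeps the reply in the same thread,
# so mail-get-thread can later find it via its References walk.
msg["In-Reply-To"] = "<original-id@mail.example>"
msg["References"] = "<original-id@mail.example>"
print(msg["In-Reply-To"])  # → <original-id@mail.example>
```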

gitea-issues/mail-send-reply.sh Executable file

@@ -0,0 +1,10 @@
#!/usr/bin/env bash
# Send reply email via Bridge SMTP. Run from repo root.
# Usage: ./gitea-issues/mail-send-reply.sh --to addr --subject "..." [--body "..." | stdin] [--in-reply-to "<msg-id>" [--references "..." ]]
set -euo pipefail
GITEA_ISSUES_DIR="${GITEA_ISSUES_DIR:-$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)}"
ROOT="$(git rev-parse --show-toplevel 2>/dev/null)" || ROOT="$(cd "${GITEA_ISSUES_DIR}/.." && pwd)"
export GITEA_ISSUES_DIR
export REPO_ROOT="${ROOT}"
cd "$ROOT"
exec python3 "${GITEA_ISSUES_DIR}/mail-send-reply.py" "$@"


@@ -0,0 +1,332 @@
#!/usr/bin/env python3
"""
Thread log: one file per email thread under logs/gitea-issues/threads/.
Content: exchanges (received + sent), tickets (issues), commits.
Usage:
mail-thread-log.py get-id --uid <uid> # print THREAD_ID=...
mail-thread-log.py init --uid <uid> # create/update log from thread
mail-thread-log.py append-sent --thread-id <id> --to <addr> --subject "..." [--body "..."] [--date "..."]
mail-thread-log.py append-issue --thread-id <id> --issue <num> [--title "..."]
mail-thread-log.py append-commit --thread-id <id> --hash <hash> --message "..." [--branch "..."]
"""
from __future__ import annotations
import argparse
import re
import subprocess
import sys
from datetime import datetime, timezone
from pathlib import Path
sys.path.insert(0, str(Path(__file__).resolve().parent))
from mail_common import load_gitea_config, load_imap_config, repo_root
def threads_dir() -> Path:
root = repo_root()
d = root / "logs" / "gitea-issues" / "threads"
d.mkdir(parents=True, exist_ok=True)
return d
def sanitize_thread_id(raw: str, max_len: int = 80) -> str:
s = re.sub(r"[^a-zA-Z0-9._-]", "_", raw).strip("_")
return s[:max_len] if s else "thread_unknown"
def get_thread_output(uid: str) -> str:
gitea_dir = Path(__file__).resolve().parent
root = gitea_dir.parent
env = {"GITEA_ISSUES_DIR": str(gitea_dir)}
result = subprocess.run(
[sys.executable, str(gitea_dir / "mail-get-thread.py"), uid],
cwd=str(root),
capture_output=True,
text=True,
env={**__import__("os").environ, **env},
timeout=60,
)
if result.returncode != 0:
raise RuntimeError(
f"mail-get-thread failed: {result.stderr or result.stdout or 'unknown'}"
)
return result.stdout
def parse_thread_blocks(text: str) -> list[dict[str, str]]:
"""Parse --- MAIL UID=... --- ... --- END MAIL --- blocks."""
blocks: list[dict[str, str]] = []
pattern = re.compile(
r"--- MAIL\s+UID=(\S+)\s+---\s*\n"
r"(?:Message-ID:\s*(.*?)\n)?"
r"From:\s*(.*?)\n"
r"To:\s*(.*?)\n"
r"Subject:\s*(.*?)\n"
r"Date:\s*(.*?)\n"
r"Body:\s*\n(.*?)--- END MAIL ---",
re.DOTALL,
)
for m in pattern.finditer(text):
blocks.append({
"uid": m.group(1).strip(),
"message_id": (m.group(2) or "").strip(),
"from": (m.group(3) or "").strip(),
"to": (m.group(4) or "").strip(),
"subject": (m.group(5) or "").strip(),
"date": (m.group(6) or "").strip(),
"body": (m.group(7) or "").strip(),
})
return blocks
def get_thread_id_from_uid(uid: str) -> str:
out = get_thread_output(uid)
blocks = parse_thread_blocks(out)
if not blocks:
return sanitize_thread_id(f"thread_uid_{uid}")
first_msg_id = (blocks[0].get("message_id") or "").strip() or blocks[0].get("uid", "")
return sanitize_thread_id(first_msg_id)
def format_exchange_received(block: dict[str, str]) -> str:
return (
f"### {block.get('date', '')} — Reçu\n"
f"- **De:** {block.get('from', '')}\n"
f"- **À:** {block.get('to', '')}\n"
f"- **Sujet:** {block.get('subject', '')}\n\n"
f"{block.get('body', '')}\n\n"
)
def format_exchange_sent(block: dict[str, str]) -> str:
return (
f"### {block.get('date', '')} — Envoyé\n"
f"- **À:** {block.get('to', '')}\n"
f"- **Sujet:** {block.get('subject', '')}\n\n"
f"{block.get('body', '')}\n\n"
)
def init_log(uid: str) -> str:
cfg = load_imap_config()
our_address = (cfg.get("filter_to") or "").strip().lower()
if not our_address:
our_address = (cfg.get("user") or "").strip().lower()
out = get_thread_output(uid)
blocks = parse_thread_blocks(out)
thread_id = get_thread_id_from_uid(uid)
log_path = threads_dir() / f"{thread_id}.md"
received_blocks: list[dict[str, str]] = []
sent_blocks: list[dict[str, str]] = []
for b in blocks:
from_ = (b.get("from") or "").lower()
if our_address and our_address in from_:
sent_blocks.append(b)
else:
received_blocks.append(b)
existing_tickets = ""
existing_commits = ""
if log_path.exists():
content = log_path.read_text(encoding="utf-8")
if "## Tickets (issues)" in content:
idx = content.index("## Tickets (issues)")
end = content.find("\n## ", idx + 1)
if end == -1:
end = len(content)
existing_tickets = content[idx:end].strip()
if "## Commits" in content:
idx = content.index("## Commits")
end = content.find("\n## ", idx + 1)
if end == -1:
end = len(content)
existing_commits = content[idx:end].strip()
lines = [
f"# Fil — {thread_id}",
"",
"## Échanges reçus",
"",
]
for b in received_blocks:
lines.append(format_exchange_received(b))
lines.append("## Échanges envoyés")
lines.append("")
for b in sent_blocks:
lines.append(format_exchange_sent(b))
if existing_tickets:
lines.append(existing_tickets)
lines.append("")
else:
lines.append("## Tickets (issues)")
lines.append("")
lines.append("(aucun)")
lines.append("")
if existing_commits:
lines.append(existing_commits)
lines.append("")
else:
lines.append("## Commits")
lines.append("")
lines.append("(aucun)")
lines.append("")
log_path.write_text("\n".join(lines), encoding="utf-8")
return thread_id
def append_sent(
thread_id: str,
to_addr: str,
subject: str,
body: str = "",
date_str: str | None = None,
) -> None:
if not date_str:
date_str = datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S +0000")
log_path = threads_dir() / f"{sanitize_thread_id(thread_id)}.md"
block = {
"date": date_str,
"to": to_addr,
"subject": subject,
"body": body,
}
section = format_exchange_sent(block)
if not log_path.exists():
log_path.write_text(
f"# Fil — {thread_id}\n\n## Échanges reçus\n\n(aucun)\n\n"
"## Échanges envoyés\n\n" + section + "\n## Tickets (issues)\n\n(aucun)\n\n## Commits\n\n(aucun)\n",
encoding="utf-8",
)
return
content = log_path.read_text(encoding="utf-8")
insert_marker = "## Échanges envoyés"
idx = content.find(insert_marker)
if idx == -1:
content += "\n\n## Échanges envoyés\n\n" + section
else:
next_section = content.find("\n## ", idx + 1)
if next_section == -1:
content = content.rstrip() + "\n\n" + section
else:
content = (
content[:next_section].rstrip() + "\n\n" + section + content[next_section:]
)
log_path.write_text(content, encoding="utf-8")
def append_issue(thread_id: str, issue_num: str, title: str = "") -> None:
gitea = load_gitea_config()
base = f"{gitea['api_url'].replace('/api/v1', '')}/{gitea['owner']}/{gitea['repo']}/issues/{issue_num}"
line = f"- #{issue_num}" + (f"{title}" if title else "") + f" — <{base}>\n"
log_path = threads_dir() / f"{sanitize_thread_id(thread_id)}.md"
if not log_path.exists():
log_path.write_text(
f"# Fil — {thread_id}\n\n## Échanges reçus\n\n(aucun)\n\n"
"## Échanges envoyés\n\n(aucun)\n\n## Tickets (issues)\n\n" + line + "\n## Commits\n\n(aucun)\n",
encoding="utf-8",
)
return
content = log_path.read_text(encoding="utf-8")
marker = "## Tickets (issues)"
idx = content.find(marker)
if idx == -1:
content += "\n\n" + marker + "\n\n" + line
else:
end = idx + len(marker)
rest = content[end:]
if "(aucun)" in rest.split("\n## ")[0]:
content = content[:end] + "\n\n" + line + rest.replace("(aucun)\n", "", 1)
else:
content = content[:end] + "\n\n" + line + content[end:]
log_path.write_text(content, encoding="utf-8")
def append_commit(
thread_id: str,
commit_hash: str,
message: str,
branch: str = "",
) -> None:
line = f"- `{commit_hash[:12]}`"
if branch:
line += f" ({branch})"
line += f"{message.strip()}\n"
log_path = threads_dir() / f"{sanitize_thread_id(thread_id)}.md"
if not log_path.exists():
log_path.write_text(
f"# Fil — {thread_id}\n\n## Échanges reçus\n\n(aucun)\n\n"
"## Échanges envoyés\n\n(aucun)\n\n## Tickets (issues)\n\n(aucun)\n\n## Commits\n\n" + line,
encoding="utf-8",
)
return
content = log_path.read_text(encoding="utf-8")
marker = "## Commits"
idx = content.find(marker)
if idx == -1:
content += "\n\n" + marker + "\n\n" + line
else:
end = idx + len(marker)
rest = content[end:]
if "(aucun)" in rest.split("\n## ")[0]:
content = content[:end] + "\n\n" + line + rest.replace("(aucun)\n", "", 1)
else:
content = content[:end] + "\n\n" + line + content[end:]
log_path.write_text(content, encoding="utf-8")
def main() -> int:
ap = argparse.ArgumentParser(prog="mail-thread-log.py")
sub = ap.add_subparsers(dest="cmd", required=True)
p_get = sub.add_parser("get-id")
p_get.add_argument("--uid", required=True, help="Mail UID")
p_init = sub.add_parser("init")
p_init.add_argument("--uid", required=True, help="Mail UID")
p_sent = sub.add_parser("append-sent")
p_sent.add_argument("--thread-id", required=True)
p_sent.add_argument("--to", required=True, dest="to_addr")
p_sent.add_argument("--subject", required=True)
p_sent.add_argument("--body", default="")
p_sent.add_argument("--date", default=None)
p_issue = sub.add_parser("append-issue")
p_issue.add_argument("--thread-id", required=True)
p_issue.add_argument("--issue", required=True)
p_issue.add_argument("--title", default="")
p_commit = sub.add_parser("append-commit")
p_commit.add_argument("--thread-id", required=True)
p_commit.add_argument("--hash", required=True)
p_commit.add_argument("--message", required=True)
p_commit.add_argument("--branch", default="")
args = ap.parse_args()
if args.cmd == "get-id":
tid = get_thread_id_from_uid(args.uid)
print(f"THREAD_ID={tid}")
return 0
if args.cmd == "init":
tid = init_log(args.uid)
print(f"THREAD_ID={tid}")
return 0
if args.cmd == "append-sent":
append_sent(
args.thread_id,
args.to_addr,
args.subject,
args.body,
args.date,
)
return 0
if args.cmd == "append-issue":
append_issue(args.thread_id, args.issue, args.title)
return 0
if args.cmd == "append-commit":
append_commit(args.thread_id, args.hash, args.message, args.branch)
return 0
return 1
if __name__ == "__main__":
sys.exit(main())
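`parse_thread_blocks` above consumes the exact block format emitted by mail-get-thread and mail-list-unread. A round-trip check of the same regex on a hand-written sample block:

```python
import re

sample = """--- MAIL UID=42 ---
Message-ID: <first@mail.example>
From: Alice <alice@example.com>
To: ai.support.lecoffreio@4nkweb.com
Subject: Bug report
Date: Mon, 01 Jan 2024 10:00:00 +0000
Body:
hello
--- END MAIL ---"""

# Same pattern as parse_thread_blocks; \s+ also accepts the multi-line
# "--- MAIL\nUID=42\n---" variant printed by format_message.
pattern = re.compile(
    r"--- MAIL\s+UID=(\S+)\s+---\s*\n"
    r"(?:Message-ID:\s*(.*?)\n)?"
    r"From:\s*(.*?)\n"
    r"To:\s*(.*?)\n"
    r"Subject:\s*(.*?)\n"
    r"Date:\s*(.*?)\n"
    r"Body:\s*\n(.*?)--- END MAIL ---",
    re.DOTALL,
)
m = pattern.search(sample)
print(m.group(1), m.group(5))  # → 42 Bug report
```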

gitea-issues/mail-thread-log.sh Executable file

@@ -0,0 +1,15 @@
#!/usr/bin/env bash
# Thread log: one file per thread under logs/gitea-issues/threads/. Run from repo root.
# Usage:
# ./gitea-issues/mail-thread-log.sh get-id --uid <uid>
# ./gitea-issues/mail-thread-log.sh init --uid <uid>
# ./gitea-issues/mail-thread-log.sh append-sent --thread-id <id> --to <addr> --subject "..." [--body "..."] [--date "..."]
# ./gitea-issues/mail-thread-log.sh append-issue --thread-id <id> --issue <num> [--title "..."]
# ./gitea-issues/mail-thread-log.sh append-commit --thread-id <id> --hash <hash> --message "..." [--branch "..."]
set -euo pipefail
GITEA_ISSUES_DIR="${GITEA_ISSUES_DIR:-$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)}"
ROOT="$(git rev-parse --show-toplevel 2>/dev/null)" || ROOT="$(cd "${GITEA_ISSUES_DIR}/.." && pwd)"
export GITEA_ISSUES_DIR
export REPO_ROOT="${ROOT}"
cd "$ROOT"
exec python3 "${GITEA_ISSUES_DIR}/mail-thread-log.py" "$@"

gitea-issues/mail-to-issue.py Executable file

@@ -0,0 +1,114 @@
#!/usr/bin/env python3
"""
Create Gitea issues from unread emails via IMAP (e.g. Proton Mail Bridge).
**Preferred flow (agent-driven):** do not chain directly. Use mail-list-unread.sh
to list unread emails, then for each: formalize the issue or send a reply (mail-send-reply.sh);
only when a correction/evolution is ready, create the issue (mail-create-issue-from-email.sh
with optional formalized title/body), treat it (fix/evol), then comment on the issue and
reply to the email via the Bridge.
This script (mail-to-issue) is a **batch** fallback: it creates one issue per unread
message with title=subject and body=text+From, then marks messages as read. Use only
when the agent-driven flow is not used.
Reads IMAP config from .secrets/gitea-issues/imap-bridge.env (or env vars).
Reads Gitea token from GITEA_TOKEN or .secrets/gitea-issues/token.
"""
from __future__ import annotations
import email
import imaplib
import ssl
import sys
from email.header import decode_header
from pathlib import Path
sys.path.insert(0, str(Path(__file__).resolve().parent))
from mail_common import (
create_gitea_issue,
load_gitea_config,
load_imap_config,
repo_root,
sanitize_title,
)
def _decode_header_value(header: str | None) -> str:
if not header:
return ""
parts = decode_header(header)
result = []
for part, charset in parts:
if isinstance(part, bytes):
result.append(part.decode(charset or "utf-8", errors="replace"))
else:
result.append(part)
return "".join(result)
def _get_text_body(msg: email.message.Message) -> str:
if msg.is_multipart():
for part in msg.walk():
if part.get_content_type() == "text/plain":
payload = part.get_payload(decode=True)
if payload:
return payload.decode(part.get_content_charset() or "utf-8", errors="replace")
return ""
payload = msg.get_payload(decode=True)
if not payload:
return ""
return payload.decode(msg.get_content_charset() or "utf-8", errors="replace")
def main() -> None:
imap_cfg = load_imap_config()
if not imap_cfg["user"] or not imap_cfg["password"]:
root = repo_root()
env_path = root / ".secrets" / "gitea-issues" / "imap-bridge.env"
print("[gitea-issues] ERROR: IMAP_USER and IMAP_PASSWORD required.", file=sys.stderr)
sys.exit(1)
gitea_cfg = load_gitea_config()
if not gitea_cfg["token"]:
print("[gitea-issues] ERROR: GITEA_TOKEN not set.", file=sys.stderr)
sys.exit(1)
mail = imaplib.IMAP4(imap_cfg["host"], int(imap_cfg["port"]))
if imap_cfg["use_starttls"]:
mail.starttls(ssl.create_default_context())
mail.login(imap_cfg["user"], imap_cfg["password"])
mail.select("INBOX")
_, nums = mail.search(None, "UNSEEN")
ids = nums[0].split()
if not ids:
print("[gitea-issues] No unread messages.")
mail.logout()
return
created = 0
for uid in ids:
uid_s = uid.decode("ascii")
_, data = mail.fetch(uid, "(RFC822)")
if not data or not data[0]:
continue
msg = email.message_from_bytes(data[0][1])
subject = _decode_header_value(msg.get("Subject"))
from_ = _decode_header_value(msg.get("From"))
body_text = _get_text_body(msg)
body_for_issue = f"**From:** {from_}\n\n{body_text}".strip()
title = sanitize_title(subject)
issue = create_gitea_issue(title, body_for_issue)
if issue:
created += 1
print(f"[gitea-issues] Created issue #{issue.get('number', '?')}: {title[:60]}")
mail.store(uid_s, "+FLAGS", "\\Seen")
else:
print(f"[gitea-issues] Skipped (API failed): {title[:60]}", file=sys.stderr)
mail.logout()
print(f"[gitea-issues] Done. Created {created} issue(s).")
if __name__ == "__main__":
main()

gitea-issues/mail-to-issue.sh Executable file

@@ -0,0 +1,14 @@
#!/usr/bin/env bash
#
# Create Gitea issues from unread emails (IMAP). Requires Proton Mail Bridge
# or any IMAP server. Config: .secrets/gitea-issues/imap-bridge.env and token.
# Usage: ./gitea-issues/mail-to-issue.sh
#
set -euo pipefail
GITEA_ISSUES_DIR="${GITEA_ISSUES_DIR:-$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)}"
ROOT="$(git rev-parse --show-toplevel 2>/dev/null)" || ROOT="$(cd "${GITEA_ISSUES_DIR}/.." && pwd)"
export GITEA_ISSUES_DIR
export REPO_ROOT="${ROOT}"
cd "$ROOT"
exec python3 "${GITEA_ISSUES_DIR}/mail-to-issue.py"

gitea-issues/mail_common.py Normal file

@@ -0,0 +1,125 @@
# Shared config and helpers for gitea-issues mail scripts (IMAP/SMTP, Gitea).
# Used by mail-list-unread, mail-send-reply, mail-create-issue-from-email, mail-mark-read.
from __future__ import annotations
import json
import os
import re
import ssl
from pathlib import Path
from urllib.error import HTTPError, URLError
from urllib.request import Request, urlopen
def repo_root() -> Path:
# When set by shell (e.g. when gitea-issues is inside ia_dev), use host repo root for .secrets, logs
env_root = os.environ.get("REPO_ROOT")
if env_root:
return Path(env_root).resolve()
issues_dir = os.environ.get("GITEA_ISSUES_DIR")
if issues_dir:
return Path(issues_dir).resolve().parent
return Path(__file__).resolve().parent.parent
def load_env_file(path: Path) -> None:
if not path.is_file():
return
with open(path, encoding="utf-8") as f:
for line in f:
line = line.strip()
if not line or line.startswith("#"):
continue
if "=" in line:
key, _, value = line.partition("=")
key = key.strip()
value = value.strip().strip("'\"")
if key and key not in os.environ:
os.environ[key] = value
def load_imap_config() -> dict[str, str]:
root = repo_root()
env_path = root / ".secrets" / "gitea-issues" / "imap-bridge.env"
load_env_file(env_path)
ssl_verify_raw = os.environ.get("IMAP_SSL_VERIFY", "true").lower()
ssl_verify = ssl_verify_raw not in ("0", "false", "no")
return {
"host": os.environ.get("IMAP_HOST", "127.0.0.1"),
"port": os.environ.get("IMAP_PORT", "1143"),
"user": os.environ.get("IMAP_USER", ""),
"password": os.environ.get("IMAP_PASSWORD", ""),
"use_starttls": os.environ.get("IMAP_USE_STARTTLS", "true").lower() in ("1", "true", "yes"),
"ssl_verify": ssl_verify,
"filter_to": os.environ.get("MAIL_FILTER_TO", "ai.support.lecoffreio@4nkweb.com").strip().lower(),
}
def imap_ssl_context(ssl_verify: bool = True) -> ssl.SSLContext:
"""Return SSL context for IMAP STARTTLS. Use ssl_verify=False only for local Bridge with self-signed cert."""
if ssl_verify:
return ssl.create_default_context()
ctx = ssl.create_default_context()
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE
return ctx
def load_smtp_config() -> dict[str, str]:
root = repo_root()
env_path = root / ".secrets" / "gitea-issues" / "imap-bridge.env"
load_env_file(env_path)
ssl_verify_raw = os.environ.get("IMAP_SSL_VERIFY", os.environ.get("SMTP_SSL_VERIFY", "true")).lower()
ssl_verify = ssl_verify_raw not in ("0", "false", "no")
return {
"host": os.environ.get("SMTP_HOST", os.environ.get("IMAP_HOST", "127.0.0.1")),
"port": os.environ.get("SMTP_PORT", "1025"),
"user": os.environ.get("SMTP_USER", os.environ.get("IMAP_USER", "")),
"password": os.environ.get("SMTP_PASSWORD", os.environ.get("IMAP_PASSWORD", "")),
"use_starttls": os.environ.get("SMTP_USE_STARTTLS", "true").lower() in ("1", "true", "yes"),
"ssl_verify": ssl_verify,
}
def load_gitea_config() -> dict[str, str]:
root = repo_root()
token = os.environ.get("GITEA_TOKEN")
if not token:
token_path = root / ".secrets" / "gitea-issues" / "token"
if token_path.is_file():
token = token_path.read_text(encoding="utf-8").strip()
return {
"api_url": os.environ.get("GITEA_API_URL", "https://git.4nkweb.com/api/v1").rstrip("/"),
"owner": os.environ.get("GITEA_REPO_OWNER", "4nk"),
"repo": os.environ.get("GITEA_REPO_NAME", "lecoffre_ng"),
"token": token or "",
}
def sanitize_title(raw: str, max_len: int = 200) -> str:
one_line = re.sub(r"\s+", " ", raw).strip()
return one_line[:max_len] if one_line else "(no subject)"
def create_gitea_issue(title: str, body: str) -> dict | None:
gitea = load_gitea_config()
if not gitea["token"]:
return None
url = f"{gitea['api_url']}/repos/{gitea['owner']}/{gitea['repo']}/issues"
payload = json.dumps({"title": title, "body": body}).encode("utf-8")
req = Request(
url,
data=payload,
method="POST",
headers={
"Accept": "application/json",
"Content-Type": "application/json",
"Authorization": f"token {gitea['token']}",
},
)
try:
with urlopen(req, timeout=30) as resp:
return json.loads(resp.read().decode("utf-8"))
except (HTTPError, URLError):
return None
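The loaders above read `.secrets/gitea-issues/imap-bridge.env` via `load_env_file`, which accepts plain `KEY=value` lines (blank lines and `#`-prefixed lines skipped, surrounding quotes stripped, existing environment variables taking precedence). A sketch of such a file — the keys are the ones the loaders look up, the values are placeholders only; note the parser does not strip trailing comments, so comments must sit on their own lines:

```shell
# .secrets/gitea-issues/imap-bridge.env — illustrative values, not real credentials
IMAP_HOST=127.0.0.1
IMAP_PORT=1143
IMAP_USER=bridge-user@example.com
IMAP_PASSWORD=change-me
IMAP_USE_STARTTLS=true
# false only for a local Proton Mail Bridge with a self-signed certificate
IMAP_SSL_VERIFY=false
SMTP_HOST=127.0.0.1
SMTP_PORT=1025
MAIL_FILTER_TO=ai.support.lecoffreio@4nkweb.com
```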


@@ -0,0 +1,35 @@
#!/usr/bin/env bash
#
# Print issue number, title and body in a single block for agent consumption.
# Used by the gitea-issues-process agent to get the ticket content before calling /fix or /evol.
# Usage: ./print-issue-prompt.sh <issue_number>
#
set -euo pipefail
GITEA_ISSUES_DIR="${GITEA_ISSUES_DIR:-$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)}"
# shellcheck source=lib.sh
source "${GITEA_ISSUES_DIR}/lib.sh"
require_jq || exit 1
if [[ $# -lt 1 ]]; then
log_err "Usage: $0 <issue_number>"
exit 1
fi
ISSUE_NUM="$1"
RESPONSE="$(gitea_api_get "/repos/${GITEA_REPO_OWNER}/${GITEA_REPO_NAME}/issues/${ISSUE_NUM}")"
if ! echo "$RESPONSE" | jq -e . &>/dev/null; then
log_err "API error (issue ${ISSUE_NUM}): ${RESPONSE:0:200}"
exit 1
fi
TITLE="$(echo "$RESPONSE" | jq -r '.title')"
BODY="$(echo "$RESPONSE" | jq -r '.body // "(no description)"')"
LABELS="$(echo "$RESPONSE" | jq -r '[.labels[].name] | join(", ")')"
echo "Issue #${ISSUE_NUM}"
echo "Title: ${TITLE}"
echo "Labels: ${LABELS}"
echo ""
echo "${BODY}"

gitea-issues/wiki-api-test.sh Executable file

@@ -0,0 +1,88 @@
#!/usr/bin/env bash
#
# Test Gitea Wiki API for repo 4nk/lecoffre_ng.
# Requires GITEA_TOKEN or .secrets/gitea-issues/token (same as issues scripts).
# Usage: ./wiki-api-test.sh [--create]
# --create: create a test page then delete it (checks write access).
#
set -euo pipefail
GITEA_ISSUES_DIR="${GITEA_ISSUES_DIR:-$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)}"
# shellcheck source=lib.sh
source "${GITEA_ISSUES_DIR}/lib.sh"
REPO_PATH="/repos/${GITEA_REPO_OWNER}/${GITEA_REPO_NAME}"
# Branch ref for wiki (default branch of wiki repo; use master when wiki is configured on master)
GITEA_WIKI_REF="${GITEA_WIKI_REF:-master}"
WIKI_PAGES="${REPO_PATH}/wiki/pages?ref=${GITEA_WIKI_REF}"
WIKI_PAGE="${REPO_PATH}/wiki/page"
WIKI_NEW="${REPO_PATH}/wiki/new"
do_create=false
while [[ $# -gt 0 ]]; do
case "$1" in
--create) do_create=true; shift ;;
*) log_err "Unknown option: $1"; exit 1 ;;
esac
done
if ! load_gitea_token 2>/dev/null; then
log_err "No GITEA_TOKEN and no .secrets/gitea-issues/token. Set token to run wiki API tests."
exit 1
fi
require_jq || exit 1
echo "=== 1. GET ${WIKI_PAGES} (list wiki pages) ==="
RESPONSE="$(gitea_api_get "${WIKI_PAGES}")"
if echo "$RESPONSE" | jq -e . &>/dev/null; then
if echo "$RESPONSE" | jq -e 'type == "array"' &>/dev/null; then
COUNT="$(echo "$RESPONSE" | jq 'length')"
log_info "List OK: ${COUNT} page(s)"
echo "$RESPONSE" | jq -r '.[] | " - \(.title)"' 2>/dev/null || echo "$RESPONSE" | jq .
else
log_info "Response: $(echo "$RESPONSE" | jq -c . 2>/dev/null || echo "$RESPONSE")"
fi
else
log_err "Response (first 300 chars): ${RESPONSE:0:300}"
fi
echo ""
echo "=== 2. GET ${WIKI_PAGE}/Home (get one page, ref=${GITEA_WIKI_REF}) ==="
RESPONSE="$(gitea_api_get "${WIKI_PAGE}/Home?ref=${GITEA_WIKI_REF}")"
if echo "$RESPONSE" | jq -e .title &>/dev/null; then
log_info "Page OK: title=$(echo "$RESPONSE" | jq -r .title)"
echo "$RESPONSE" | jq '{ title, html_url, commit_count }'
else
log_info "Response: $(echo "$RESPONSE" | jq -c . 2>/dev/null || echo "${RESPONSE:0:200}")"
fi
if [[ "$do_create" != true ]]; then
log_info "Done. Use --create to test POST wiki page and DELETE."
exit 0
fi
echo ""
echo "=== 3. POST ${WIKI_NEW} (create test page) ==="
TEST_TITLE="Api-test-$(date +%s)"
CONTENT=$'# Test\nCreated by wiki-api-test.sh. Safe to delete.'
CONTENT_B64="$(printf '%s' "$CONTENT" | base64 -w 0)"
BODY="$(jq -n --arg title "$TEST_TITLE" --arg content "$CONTENT_B64" --arg msg "wiki-api-test.sh" \
'{ title: $title, content_base64: $content, message: $msg }')"
RESPONSE="$(gitea_api_post "${WIKI_NEW}" "$BODY")"
if echo "$RESPONSE" | jq -e .title &>/dev/null; then
log_info "Create OK: $(echo "$RESPONSE" | jq -r .title)"
CREATED_TITLE="$TEST_TITLE"
else
log_err "Create failed: ${RESPONSE:0:300}"
exit 1
fi
echo ""
echo "=== 4. DELETE ${WIKI_PAGE}/${CREATED_TITLE} (remove test page) ==="
RESPONSE="$(gitea_api_delete "${WIKI_PAGE}/${CREATED_TITLE}")"
# DELETE often returns 204 No Content
log_info "Delete sent (204 or empty body = success)."
echo ""
log_info "All wiki API tests completed."

gitea-issues/wiki-get-page.sh Executable file

@@ -0,0 +1,34 @@
#!/usr/bin/env bash
#
# Output the raw markdown of a wiki page (for agents or scripts).
# Usage: ./wiki-get-page.sh <page_name>
# Example: ./wiki-get-page.sh Home
# Requires GITEA_TOKEN or .secrets/gitea-issues/token.
#
set -euo pipefail
GITEA_ISSUES_DIR="${GITEA_ISSUES_DIR:-$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)}"
# shellcheck source=lib.sh
source "${GITEA_ISSUES_DIR}/lib.sh"
REPO_PATH="/repos/${GITEA_REPO_OWNER}/${GITEA_REPO_NAME}"
GITEA_WIKI_REF="${GITEA_WIKI_REF:-master}"
if [[ $# -lt 1 ]]; then
log_err "Usage: $0 <page_name>"
exit 1
fi
PAGE_NAME="$1"
load_gitea_token || exit 1
require_jq || exit 1
resp="$(gitea_api_get "${REPO_PATH}/wiki/page/${PAGE_NAME}?ref=${GITEA_WIKI_REF}")"
if ! echo "$resp" | jq -e .content_base64 &>/dev/null; then
log_err "Page not found or error: ${PAGE_NAME}"
echo "$resp" | jq . 2>/dev/null || echo "$resp"
exit 1
fi
echo "$resp" | jq -r '.content_base64' | base64 -d
echo
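The wiki scripts all share the same `content_base64` convention: encode with GNU `base64 -w 0` (single line, no wrapping), decode with `base64 -d`. A self-contained round-trip check of that convention (assumes GNU coreutils; macOS `base64` has no `-w` flag):

```shell
# Round-trip the content_base64 convention used by the wiki scripts.
page_md=$'# Home\nWelcome.'
b64="$(printf '%s' "$page_md" | base64 -w 0)"        # single-line base64, as the API expects
decoded="$(printf '%s' "$b64" | base64 -d)"          # what wiki-get-page.sh does on output
[ "$decoded" = "$page_md" ] && echo "round-trip ok"  # prints: round-trip ok
```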

gitea-issues/wiki-migrate-docs.sh Executable file

@@ -0,0 +1,100 @@
#!/usr/bin/env bash
#
# Migrate all docs/*.md (repo root) to Gitea wiki as pages.
# Mapping: docs/FILE.md → page "File" (stem with _ → -, first letter upper per segment).
# Requires GITEA_TOKEN or .secrets/gitea-issues/token.
# Usage: ./wiki-migrate-docs.sh [--dry-run] [file.md ...]
# --dry-run: print mapping and skip API calls.
# If file(s) given: migrate only those; else migrate all docs/*.md.
#
set -euo pipefail
GITEA_ISSUES_DIR="${GITEA_ISSUES_DIR:-$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)}"
REPO_ROOT="${GITEA_ISSUES_DIR}/.."
DOCS_DIR="${REPO_ROOT}/docs"
# shellcheck source=lib.sh
source "${GITEA_ISSUES_DIR}/lib.sh"
REPO_PATH="/repos/${GITEA_REPO_OWNER}/${GITEA_REPO_NAME}"
GITEA_WIKI_REF="${GITEA_WIKI_REF:-master}"
WIKI_PAGE="${REPO_PATH}/wiki/page"
WIKI_NEW="${REPO_PATH}/wiki/new"
# docs/FILE.md → page name for wiki (stem: _ → -, title-case: First-Letter-Of-Each-Segment)
file_to_page_name() {
local base="$1"
local stem="${base%.md}"
echo "$stem" | tr '_' '-' | awk -F- '{
for(i=1;i<=NF;i++) {
s = $i; l = length(s)
if (l > 0) $i = toupper(substr(s,1,1)) tolower(substr(s,2))
}
}1' OFS='-'
}
dry_run=false
files=()
while [[ $# -gt 0 ]]; do
case "$1" in
--dry-run) dry_run=true; shift ;;
*.md) files+=("$1"); shift ;;
*) log_err "Unknown option or not .md: $1"; exit 1 ;;
esac
done
if [[ ${#files[@]} -eq 0 ]]; then
while IFS= read -r -d '' f; do
files+=("$f")
done < <(find "$DOCS_DIR" -maxdepth 1 -name '*.md' -print0 | sort -z)
else
# Resolve args to full paths under DOCS_DIR
for i in "${!files[@]}"; do
u="${files[$i]}"
if [[ "$u" != */* ]] && [[ -f "${DOCS_DIR}/${u}" ]]; then
files[$i]="${DOCS_DIR}/${u}"
fi
done
fi
if [[ ${#files[@]} -eq 0 ]]; then
log_err "No .md files found in ${DOCS_DIR}"
exit 1
fi
if [[ "$dry_run" == true ]]; then
log_info "Dry run: would migrate ${#files[@]} file(s)"
for f in "${files[@]}"; do
base="$(basename "$f")"
page="$(file_to_page_name "$base")"
echo "  $f → $page"
done
exit 0
fi
load_gitea_token || exit 1
require_jq || exit 1
for f in "${files[@]}"; do
base="$(basename "$f")"
page="$(file_to_page_name "$base")"
if [[ ! -f "$f" ]]; then
log_err "Skip (not a file): $f"
continue
fi
content="$(cat "$f")"
content_b64="$(echo -n "$content" | base64 -w 0)"
body="$(jq -n --arg title "$page" --arg content "$content_b64" --arg msg "Migrate from docs/$base" \
'{ title: $title, content_base64: $content, message: $msg }')"
# Check if page exists (GET); if 200 use PATCH else POST
resp="$(gitea_api_get "${REPO_PATH}/wiki/page/${page}?ref=${GITEA_WIKI_REF}")"
if echo "$resp" | jq -e .title &>/dev/null; then
log_info "Update: $base → $page"
patch_body="$(jq -n --arg content "$content_b64" --arg msg "Update from docs/$base" '{ content_base64: $content, message: $msg }')"
gitea_api_patch "${WIKI_PAGE}/${page}?ref=${GITEA_WIKI_REF}" "$patch_body" >/dev/null || true
else
log_info "Create: $base → $page"
gitea_api_post "${WIKI_NEW}" "$body" >/dev/null || true
fi
done
log_info "Migration done: ${#files[@]} file(s)."
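The `file_to_page_name` mapping can be exercised on its own; here is a standalone copy of the function with a couple of sample file names (the stems are made up for illustration):

```shell
# Standalone check of the docs → wiki page-name mapping used by
# wiki-migrate-docs.sh (stem: "_" → "-", Title-Case per "-" segment).
file_to_page_name() {
	local base="$1"
	local stem="${base%.md}"
	echo "$stem" | tr '_' '-' | awk -F- '{
		for(i=1;i<=NF;i++) {
			s = $i; l = length(s)
			if (l > 0) $i = toupper(substr(s,1,1)) tolower(substr(s,2))
		}
	}1' OFS='-'
}
file_to_page_name "api_reference.md"  # → Api-Reference
file_to_page_name "DEPLOYMENT.md"     # → Deployment
```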

gitea-issues/wiki-put-page.sh Executable file

@@ -0,0 +1,46 @@
#!/usr/bin/env bash
#
# Update a single wiki page from a local file.
# Usage: ./wiki-put-page.sh <page_name> <file_path>
# Example: ./wiki-put-page.sh Home docs/README.md
# Requires GITEA_TOKEN or .secrets/gitea-issues/token.
#
set -euo pipefail
GITEA_ISSUES_DIR="${GITEA_ISSUES_DIR:-$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)}"
# shellcheck source=lib.sh
source "${GITEA_ISSUES_DIR}/lib.sh"
REPO_PATH="/repos/${GITEA_REPO_OWNER}/${GITEA_REPO_NAME}"
GITEA_WIKI_REF="${GITEA_WIKI_REF:-master}"
WIKI_PAGE="${REPO_PATH}/wiki/page"
WIKI_NEW="${REPO_PATH}/wiki/new"
if [[ $# -lt 2 ]]; then
log_err "Usage: $0 <page_name> <file_path>"
exit 1
fi
PAGE_NAME="$1"
FILE_PATH="$2"
[[ -f "$FILE_PATH" ]] || { log_err "File not found: $FILE_PATH"; exit 1; }
load_gitea_token || exit 1
require_jq || exit 1
content="$(cat "$FILE_PATH")"
content_b64="$(echo -n "$content" | base64 -w 0)"
msg="Update from $FILE_PATH"
resp="$(gitea_api_get "${REPO_PATH}/wiki/page/${PAGE_NAME}?ref=${GITEA_WIKI_REF}")"
if echo "$resp" | jq -e .title &>/dev/null; then
log_info "PATCH ${PAGE_NAME}"
body="$(jq -n --arg title "$PAGE_NAME" --arg content "$content_b64" --arg msg "$msg" '{ title: $title, content_base64: $content, message: $msg }')"
gitea_api_patch "${WIKI_PAGE}/${PAGE_NAME}?ref=${GITEA_WIKI_REF}" "$body"
else
log_info "POST ${PAGE_NAME}"
body="$(jq -n --arg title "$PAGE_NAME" --arg content "$content_b64" --arg msg "$msg" '{ title: $title, content_base64: $content, message: $msg }')"
gitea_api_post "${WIKI_NEW}" "$body"
fi
log_info "Done: ${PAGE_NAME}"

lib/README.md Normal file

@@ -0,0 +1,13 @@
# ia_dev shared lib
## project_config.sh
Sourced by deploy scripts and gitea-issues to resolve the current project **id** and the path to its JSON config.
**Before sourcing:** set `PROJECT_ROOT` (git repo root, where `ai_project_id` or `.ia_project` lives) and `IA_DEV_ROOT` (path to the `ia_dev` directory).
**After sourcing:** `PROJECT_SLUG` and `PROJECT_CONFIG_PATH` are set (and exported). Config path is `projects/<id>/conf.json`.
**Project id resolution order:** `IA_PROJECT` env → `.ia_project` at PROJECT_ROOT → `ai_project_id` at PROJECT_ROOT.
See `projects/README.md` for the config schema.

lib/project_config.sh Normal file

@@ -0,0 +1,35 @@
#
# Project config resolution for ia_dev scripts.
# Source this after setting PROJECT_ROOT and IA_DEV_ROOT.
# Resolves PROJECT_SLUG (id) and PROJECT_CONFIG_PATH (projects/<id>/conf.json).
#
# Project id resolution order:
# 1. IA_PROJECT (env)
# 2. .ia_project at PROJECT_ROOT (one line, slug)
# 3. ai_project_id at PROJECT_ROOT (one line, id = directory name in projects/)
#
# Config file: projects/<id>/conf.json (e.g. projects/lecoffreio/conf.json).
#
set -euo pipefail
PROJECT_SLUG=""
if [[ -n "${IA_PROJECT:-}" ]]; then
PROJECT_SLUG="${IA_PROJECT//[[:space:]]/}"
fi
if [[ -z "$PROJECT_SLUG" && -n "${PROJECT_ROOT:-}" && -f "$PROJECT_ROOT/.ia_project" ]]; then
PROJECT_SLUG="$(tr -d '[:space:]' < "$PROJECT_ROOT/.ia_project")"
fi
if [[ -z "$PROJECT_SLUG" && -n "${PROJECT_ROOT:-}" && -f "$PROJECT_ROOT/ai_project_id" ]]; then
PROJECT_SLUG="$(tr -d '[:space:]' < "$PROJECT_ROOT/ai_project_id")"
fi
PROJECT_CONFIG_PATH=""
if [[ -n "$PROJECT_SLUG" && -n "${IA_DEV_ROOT:-}" ]]; then
PROJECT_CONFIG_PATH="${IA_DEV_ROOT}/projects/${PROJECT_SLUG}/conf.json"
if [[ ! -f "$PROJECT_CONFIG_PATH" ]]; then
PROJECT_CONFIG_PATH=""
fi
fi
export PROJECT_SLUG
export PROJECT_CONFIG_PATH


@@ -1,17 +1,20 @@
# Project-specific configuration
This repo (`ia_dev`) is intended to be used as a **git submodule** inside each project. Project-specific parameters are stored here in `projects/<slug>.json`.
This repo (`ia_dev`) is intended to be used as a **git submodule** inside each project. Project-specific parameters are stored in `projects/<id>/conf.json` (e.g. `projects/lecoffreio/conf.json`). The `<id>` is the project identifier and the name of the directory under `projects/`.
## Current project selection
- **`IA_PROJECT`** (environment variable), or
- **`.ia_project`** file at the repository root (one line: the project slug, e.g. `lecoffreio`). Do not use angle brackets in the file.
Scripts resolve the project **id** (used as the directory name in `projects/`) in this order:
1. **`IA_PROJECT`** (environment variable)
2. **`.ia_project`** file at the repository root (one line: the project id, e.g. `lecoffreio`)
3. **`ai_project_id`** file at the repository root (one line: the project id). When `ia_dev` is a submodule, this file lives at the host repo root (parent of `ia_dev`).
When running from a repo that has `ia_dev` as a submodule, the root is the parent repo; the script resolves `ia_dev` either as `./ia_dev` or `./deploy` (symlink to `ia_dev/deploy`).
## Schema
One JSON file per project in `projects/` named by the slug (e.g. `projects/lecoffreio.json`).
One JSON file per project: `projects/<id>/conf.json` (e.g. `projects/lecoffreio/conf.json`). The `<id>` is the directory name; the config file is always named `conf.json`.
| Field | Required | Description |
|-------|----------|-------------|
@@ -22,7 +25,7 @@ One JSON file per project in `projects/` named by the slug (e.g. `projects/lecof
| `version.package_json_paths` | no | List of paths (relative to repo root) to `package.json` files to update on bump |
| `version.splash_app_name` | no | App name used in splash message template |
| `mail` | no | Mail/imap bridge config |
| `git` | no | Git hosting: `wiki_url`, `ticketing_url`, `token_file` |
| `git` | no | Git hosting: `wiki_url`, `ticketing_url`, `token_file` (path relative to repo root for token file) |
## Example (minimal)
@@ -35,4 +38,4 @@ One JSON file per project in `projects/` named by the slug (e.g. `projects/lecof
## Example (full)
See `projects/lecoffreio.json`.
See `projects/lecoffreio/conf.json`.

projects/ia_dev/conf.json Normal file

@@ -0,0 +1,20 @@
{
"id": "ia_dev",
"name": "ia_dev",
"project_path": "../",
"build_dirs": [],
"deploy": {},
"version": {
"package_json_paths": [],
"splash_app_name": "ia_dev"
},
"mail": {
"email": "ai.support.ia_dev@4nkweb.com",
"imap_bridge_env": ".secrets/gitea-issues/imap-bridge.env"
},
"git": {
"wiki_url": "https://git.4nkweb.com/4nk/ia_dev/wiki",
"ticketing_url": "https://git.4nkweb.com/4nk/ia_dev/issues",
"token_file": ".secrets/gitea-issues/token"
}
}


@@ -1,11 +1,17 @@
{
"id": "lecoffreio",
"name": "Lecoffre.io",
"project_path": "../lecoffre_ng_test",
"project_path": "../",
"build_dirs": [
"lecoffre-ressources-dev",
"lecoffre-back-main",
"lecoffre-front-main"
],
"deploy": {
"scripts_path": "../deploy/scripts_v2",
"deploy_script_path": "../deploy/scripts_v2/deploy.sh",
"secrets_path": "../.secrets"
},
"version": {
"package_json_paths": [
"lecoffre-back-main/package.json",