Compare commits
20 commits — master...docker-sup

Commits comparés (SHA1) : 89ff7b1191, 59821b1d41, 9642616bcd, b1793f9654, c4998ddf23, 9db09d7bc5, 4cebb3bcf7, a96323cb32, fda21d47fa, 1c42b86f90, 7aebbca98d, 8f456f0cd5, 87e89317f5, 7cc675c129, 62a972594d, 0950da48d6, 790ffb4032, 53c6301be2, 6907e4baf1, 50e0b97a7f.
.4nk-sync.yml — nouveau fichier (17 lignes)
@@ -0,0 +1,17 @@

```yaml
template:
  repo: nicolas.cantu/4NK_project_template
  host: git.4nkweb.com
sync:
  include:
    - .gitea/
    - .cursor/
    - docs/
    - AGENTS.md
    - CONTRIBUTING.md
    - CODE_OF_CONDUCT.md
    - SECURITY.md
    - CHANGELOG.md
  exclude:
    - target/
    - storage/
    - .git/
```
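Pour illustrer l’usage de cette configuration, voici une esquisse minimale (hypothétique, hors des fichiers de ces commits) d’un script de synchronisation qui reprendrait les listes include/exclude ci-dessus ; l’URL du template est déduite de `host` et `repo` et reste à vérifier, et la lecture directe du YAML (par exemple avec `yq`) est laissée de côté pour rester lisible.

```bash
#!/usr/bin/env bash
# Esquisse hypothétique : synchronisation des chemins « include » de .4nk-sync.yml
# depuis le dépôt de template, sans jamais copier target/, storage/ ni .git/.
set -euo pipefail

TEMPLATE_URL="https://git.4nkweb.com/nicolas.cantu/4NK_project_template.git"  # déduit de host + repo, à vérifier
INCLUDE=(.gitea/ .cursor/ docs/ AGENTS.md CONTRIBUTING.md CODE_OF_CONDUCT.md SECURITY.md CHANGELOG.md)

tmpdir="$(mktemp -d)"
trap 'rm -rf "$tmpdir"' EXIT

git clone --depth 1 "$TEMPLATE_URL" "$tmpdir/template"

for path in "${INCLUDE[@]}"; do
  [ -e "$tmpdir/template/$path" ] || continue
  # rsync préserve l'arborescence et applique les exclusions de .4nk-sync.yml.
  rsync -a --exclude 'target/' --exclude 'storage/' --exclude '.git/' \
    "$tmpdir/template/$path" "./$path"
done
```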
.cursor/.cursorignore — nouveau fichier (11 lignes)
@@ -0,0 +1,11 @@

```text
# Ignorer les sorties volumineuses ou non pertinentes pour le contexte IA
archive/**
tests/logs/**
tests/reports/**
node_modules/**
dist/**
build/**
.tmp/**
.cache/**
.env
.env.*
```
.cursor/rules.md — nouveau fichier (7 lignes)
@@ -0,0 +1,7 @@

# Règles Cursor du projet

- Compiler régulièrement : `cargo build`.
- Lancer les tests souvent : `cargo test`.
- Mettre à jour la documentation (`docs/`) à chaque changement fonctionnel.
- Respecter le style Rust : `cargo fmt` et `cargo clippy -- -D warnings`.
- Les PR doivent inclure tests et docs.
.cursor/rules/00-foundations.mdc — nouveau fichier (32 lignes)
@@ -0,0 +1,32 @@

---
alwaysApply: true
---

# Fondations de rédaction et de comportement

[portée]
S’applique à tout le dépôt 4NK/4NK_node pour toute génération, refactorisation, édition inline ou discussion dans Cursor.

[objectifs]

- Garantir l’usage exclusif du français.
- Proscrire l’injection d’exemples de code applicatif dans la base de code.
- Assurer une cohérence stricte de terminologie et de ton.
- Exiger une introduction et/ou une conclusion dans toute proposition de texte.

[directives]

- Toujours répondre et documenter en français.
- Ne pas inclure d’exemples exécutables ou de quickstarts dans la base ; préférer des descriptions prescriptives.
- Tout contenu produit doit mentionner explicitement les artefacts à mettre à jour lorsqu’il impacte docs/ et tests/.
- Préserver la typographie française (capitaliser uniquement le premier mot d’un titre et les noms propres).

[validations]

- Relecture linguistique et technique systématique.
- Refuser toute sortie avec exemples de code applicatif.
- Vérifier que l’issue traitée se conclut par un rappel des fichiers à mettre à jour.

[artefacts concernés]

- README.md, docs/**, tests/**, CHANGELOG.md, .gitea/**.
.cursor/rules/05-template-governance.mdc — nouveau fichier (17 lignes)
@@ -0,0 +1,17 @@

---
alwaysApply: true
---

# Gouvernance du template 4NK

[portée]
Assurer que chaque projet adapte intelligemment le template et que les améliorations génériques reviennent dans `4NK_template`.

[directives]
- Conserver `security-audit` et `release-guard` dans tous les projets.
- Adapter la CI, les docs et `AGENTS.md` au contexte local.
- En cas d'amélioration générique : ouvrir une issue « Template Feedback », prototyper, valider la CI, mettre à jour `CHANGELOG.md`/`TEMPLATE_VERSION`.

[validation]
- Refuser un push/tag si l'adaptation a retiré les vérifications minimales (sécurité, tests, build, version/changelog/tag).
- Exiger une documentation claire dans `docs/TEMPLATE_ADAPTATION.md` et `docs/TEMPLATE_FEEDBACK.md`.
.cursor/rules/10-project-structure.mdc — nouveau fichier (72 lignes)
@@ -0,0 +1,72 @@

---
alwaysApply: true
---

# Structure projet 4NK_node

[portée]
Maintenance de l’arborescence canonique, création/mise à jour/suppression de fichiers et répertoires.

[objectifs]

- Garantir l’alignement strict avec l’arborescence 4NK_node.
- Prévenir toute dérive structurelle.

[directives]

- S’assurer que l’arborescence suivante existe et reste conforme :

  4NK/4NK_node
  ├── archive
  ├── CHANGELOG.md
  ├── CODE_OF_CONDUCT.md
  ├── CONTRIBUTING.md
  ├── docker-compose.yml
  ├── docs
  │   ├── API.md
  │   ├── ARCHITECTURE.md
  │   ├── COMMUNITY_GUIDE.md
  │   ├── CONFIGURATION.md
  │   ├── GITEA_SETUP.md
  │   ├── INDEX.md
  │   ├── INSTALLATION.md
  │   ├── MIGRATION.md
  │   ├── OPEN_SOURCE_CHECKLIST.md
  │   ├── QUICK_REFERENCE.md
  │   ├── RELEASE_PLAN.md
  │   ├── ROADMAP.md
  │   ├── SECURITY_AUDIT.md
  │   ├── TESTING.md
  │   └── USAGE.md
  ├── LICENSE
  ├── README.md
  ├── tests
  │   ├── cleanup.sh
  │   ├── connectivity
  │   ├── external
  │   ├── integration
  │   ├── logs
  │   ├── performance
  │   ├── README.md
  │   ├── reports
  │   └── unit
  └── .gitea
      ├── ISSUE_TEMPLATE
      │   ├── bug_report.md
      │   └── feature_request.md
      ├── PULL_REQUEST_TEMPLATE.md
      └── workflows
          └── ci.yml

- Tout document obsolète est déplacé vers archive/ avec métadonnées (date, raison).
- Interdire la suppression brute de fichiers sans archivage et note dans CHANGELOG.md.

[validations]

- Diff structurel comparé à cette référence.
- Erreur bloquante si un fichier « requis » manque.

[artefacts concernés]

- archive/**, docs/**, tests/**, .gitea/**, CHANGELOG.md.
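La validation « diff structurel » décrite ci-dessus peut se traduire par un contrôle automatisé très simple ; esquisse indicative (script hypothétique, liste de fichiers reprise de l’arborescence canonique, volontairement non exhaustive).

```bash
#!/usr/bin/env bash
# Esquisse hypothétique : vérifie que les fichiers « requis » de l'arborescence
# canonique existent, avec erreur bloquante (code de sortie non nul) sinon.
set -euo pipefail

required=(
  CHANGELOG.md CODE_OF_CONDUCT.md CONTRIBUTING.md docker-compose.yml LICENSE README.md
  docs/API.md docs/ARCHITECTURE.md docs/CONFIGURATION.md docs/INDEX.md docs/INSTALLATION.md
  docs/TESTING.md docs/USAGE.md
  tests/cleanup.sh tests/README.md
  .gitea/PULL_REQUEST_TEMPLATE.md .gitea/workflows/ci.yml
  .gitea/ISSUE_TEMPLATE/bug_report.md .gitea/ISSUE_TEMPLATE/feature_request.md
)

missing=0
for f in "${required[@]}"; do
  if [ ! -e "$f" ]; then
    echo "fichier requis manquant : $f" >&2
    missing=1
  fi
done
exit "$missing"
```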
.cursor/rules/20-documentation.mdc — nouveau fichier (33 lignes)
@@ -0,0 +1,33 @@

---
alwaysApply: true
---

# Documentation continue

[portée]
Mises à jour de docs/** corrélées à tout changement de code, configuration, dépendance, données ou CI.

[objectifs]
- Remplacer toute section générique « RESUME » par des mises à jour ciblées dans les fichiers appropriés.
- Tenir INDEX.md comme table des matières de référence.

[directives]
- À chaque changement, mettre à jour :
  - API.md (spécifications, contrats, schémas, invariants).
  - ARCHITECTURE.md (décisions, diagrammes, couplages, performances).
  - CONFIGURATION.md (paramètres, formats, valeurs par défaut).
  - INSTALLATION.md (pré-requis, étapes, vérifications).
  - MIGRATION.md (chemins de migration, scripts, compatibilités).
  - USAGE.md (parcours fonctionnels, contraintes).
  - TESTING.md (pyramide, critères d’acceptation).
  - SECURITY_AUDIT.md (menaces, contrôles, dettes résiduelles).
  - RELEASE_PLAN.md, ROADMAP.md (planification), OPEN_SOURCE_CHECKLIST.md, COMMUNITY_GUIDE.md, GITEA_SETUP.md.
- Maintenir QUICK_REFERENCE.md pour les référentiels synthétiques utilisés par l’équipe.
- Ajouter un REX technique dans archive/ en cas d’hypothèses multiples avant résolution.

[validations]
- Cohérence croisée entre README.md et INDEX.md.
- Refus si une modification de code n’a pas de trace dans les docs/** correspondants.

[artefacts concernés]
- docs/**, README.md, archive/**.
.cursor/rules/30-testing.mdc — nouveau fichier (57 lignes)
@@ -0,0 +1,57 @@

---
alwaysApply: true
---

# Tests et qualité

[portée]
Stratégie de tests, exécution locale, stabilité, non-régression.

[objectifs]

- Exiger des tests verts avant tout commit.
- Couvrir les axes unit, integration, connectivity, performance, external.

[directives]

- Ajouter/mettre à jour des tests dans tests/unit, tests/integration, tests/connectivity, tests/performance, tests/external selon l’impact.
- Consigner les journaux dans tests/logs et les rapports dans tests/reports.
- Maintenir tests/README.md (stratégie, outillage, seuils).
- Fournir un nettoyage reproductible via tests/cleanup.sh.
- Bloquer l’édition si des tests échouent tant que la correction n’est pas appliquée.

[validations]

- Refus d’un commit si tests en échec.
- Exiger justification et plan de test dans docs/TESTING.md pour toute refonte majeure.

[artefacts concernés]

- tests/**, docs/TESTING.md, CHANGELOG.md.
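À titre d’illustration, une esquisse (hypothétique) de boucle locale conforme à ces directives : exécution de la suite Rust, journal dans tests/logs, rapport dans tests/reports, échec bloquant si un test échoue.

```bash
#!/usr/bin/env bash
# Esquisse hypothétique : exécution des tests avec journal et rapport horodatés.
set -euo pipefail

ts="$(date +%Y%m%d-%H%M%S)"
mkdir -p tests/logs tests/reports

if cargo test --all 2>&1 | tee "tests/logs/cargo-test-$ts.log"; then
  echo "tests verts ($ts)" > "tests/reports/cargo-test-$ts.txt"
else
  echo "tests en échec ($ts)" > "tests/reports/cargo-test-$ts.txt"
  exit 1
fi
```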
.cursor/rules/40-dependencies-and-build.mdc — nouveau fichier (55 lignes)
@@ -0,0 +1,55 @@

---
alwaysApply: true
---

# Dépendances, compilation et build

[portée]
Gestion des dépendances, compilation fréquente, politique de versions.

[objectifs]

- Ajouter automatiquement les dépendances manquantes si justifié.
- Rechercher systématiquement les dernières versions stables.

[directives]

- Lorsqu’une fonctionnalité nécessite une dépendance, l’ajouter et la documenter (nom, version, portée, impact) dans docs/ARCHITECTURE.md et docs/CONFIGURATION.md si nécessaire.
- Compiler très régulièrement et « quand nécessaire » (avant refactor, avant push, après mise à jour de dépendances).
- Corriger toute erreur de compilation/exécution avant de poursuivre.
- Documenter tout changement de dépendances (raison, risques, rollback).

[validations]

- Interdire la progression si la compilation échoue.
- Vérifier la présence d’une note de changement dans CHANGELOG.md en cas de dépendance ajoutée/retirée.

[artefacts concernés]

- docs/ARCHITECTURE.md, docs/CONFIGURATION.md, CHANGELOG.md.
.cursor/rules/41-ssh-automation.mdc — nouveau fichier (65 lignes)
@@ -0,0 +1,65 @@

---
alwaysApply: true
---

# Automatisation SSH et scripts

[portée]
Création, usage et vérification du dossier scripts/ et de ses trois scripts standards liés aux opérations SSH et CI.

[objectifs]

- Garantir la présence de scripts/ avec auto-ssh-push.sh, init-ssh-env.sh, setup-ssh-ci.sh.
- Encadrer l’usage de ces scripts (locaux et CI), la sécurité, l’idempotence et la traçabilité.
- Documenter toute mise à jour dans docs/SSH_UPDATE.md et CHANGELOG.md.

[directives]

- Créer et maintenir `scripts/auto-ssh-push.sh`, `scripts/init-ssh-env.sh`, `scripts/setup-ssh-ci.sh`.
- Exiger permissions d’exécution adaptées sur scripts/ (exécution locale et CI).
- Interdire le stockage de clés privées ou secrets en clair dans le dépôt.
- Utiliser des variables d’environnement et secrets CI pour toute donnée sensible.
- Rendre chaque script idempotent et verbosable ; produire un code de sortie non-zéro en cas d’échec.
- Tracer les opérations : consigner un résumé dans docs/SSH_UPDATE.md (objectif, variables requises, effets, points d’échec).
- Ajouter un contrôle automatique dans la CI pour vérifier l’existence et l’exécutabilité de ces scripts.

[validations]

- Échec bloquant si un des trois scripts manque ou n’est pas exécutable.
- Échec bloquant si docs/SSH_UPDATE.md n’est pas mis à jour lors d’une modification de scripts.
- Échec bloquant si un secret attendu n’est pas fourni en CI.

[artefacts concernés]

- scripts/**, docs/SSH_UPDATE.md, .gitea/workflows/ci.yml, CHANGELOG.md, docs/CONFIGURATION.md.
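Esquisse minimale (hypothétique) de ce que pourrait contenir `scripts/init-ssh-env.sh` conformément à ces directives ; les noms de variables `SSH_PRIVATE_KEY` et `GIT_SSH_HOST` sont des hypothèses de ce croquis, seuls l’idempotence, l’usage de secrets d’environnement et le code de sortie non nul étant imposés par la règle.

```bash
#!/usr/bin/env bash
# Esquisse hypothétique d'un scripts/init-ssh-env.sh idempotent : la clé provient
# d'une variable d'environnement (secret CI), jamais d'un fichier versionné.
set -euo pipefail

: "${SSH_PRIVATE_KEY:?secret SSH_PRIVATE_KEY absent}"   # échec bloquant si non fourni
GIT_SSH_HOST="${GIT_SSH_HOST:-git.4nkweb.com}"

mkdir -p "$HOME/.ssh"
chmod 700 "$HOME/.ssh"

# Écriture idempotente de la clé avec permissions strictes, puis enregistrement de l'hôte.
printf '%s\n' "$SSH_PRIVATE_KEY" > "$HOME/.ssh/id_ed25519"
chmod 600 "$HOME/.ssh/id_ed25519"

if ! grep -q "$GIT_SSH_HOST" "$HOME/.ssh/known_hosts" 2>/dev/null; then
  ssh-keyscan -H "$GIT_SSH_HOST" >> "$HOME/.ssh/known_hosts"
fi

echo "environnement SSH prêt pour $GIT_SSH_HOST"
```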
.cursor/rules/42-template-sync.mdc — nouveau fichier (53 lignes)
@@ -0,0 +1,53 @@

---
alwaysApply: true
---

# Synchronisation de template (4NK)

[portée]
Tous les projets issus de 4NK_project_template. Contrôle de l’alignement sur .cursor/, .gitea/, AGENTS.md, scripts/, docs/SSH_UPDATE.md.

[objectifs]

- Garantir l’absence de dérive sur les éléments normatifs.
- Exiger la mise à jour documentaire et du changelog à chaque synchronisation.
- Bloquer la progression en cas d’intégrité non conforme.

[directives]
- Lire la configuration de .4nk-sync.yml (source_repo, ref, paths, policy).
- Refuser toute modification locale dans le périmètre des paths sans PR de synchronisation.
- Après synchronisation : exiger mises à jour de CHANGELOG.md et docs/INDEX.md.
- Scripts : vérifier présence, permissions d’exécution et absence de secrets en clair.
- SSH : exiger mise à jour de docs/SSH_UPDATE.md si scripts/** modifié.

[validations]
- Erreur bloquante si manifest_checksum manquant ou invalide.
- Erreur bloquante si un path requis n’existe pas après sync.
- Erreur bloquante si tests/CI signalent des scripts non exécutables ou des fichiers sensibles.

[artefacts concernés]
- .4nk-sync.yml, TEMPLATE_VERSION, .cursor/**, .gitea/**, AGENTS.md, scripts/**, docs/SSH_UPDATE.md, CHANGELOG.md.
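La validation du `manifest_checksum` pourrait ressembler à l’esquisse suivante ; le fichier `.4nk-sync.manifest` et son format « somme SHA-256 + chemin » sont des hypothèses d’illustration, non définis par ces commits.

```bash
#!/usr/bin/env bash
# Esquisse hypothétique : contrôle d'intégrité après synchronisation à partir
# d'un manifeste de sommes SHA-256 (une ligne « <sha256>  <chemin> » par fichier).
set -euo pipefail

manifest=".4nk-sync.manifest"

if [ ! -s "$manifest" ]; then
  echo "manifest_checksum manquant ou vide : $manifest" >&2
  exit 1
fi

# sha256sum --check vérifie chaque somme et échoue si un fichier manque ou diffère.
if ! sha256sum --check --quiet "$manifest"; then
  echo "intégrité non conforme après synchronisation" >&2
  exit 1
fi

echo "synchronisation vérifiée : $(wc -l < "$manifest") chemins conformes"
```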
.cursor/rules/4nkrules.mdc — nouveau fichier (156 lignes)
@@ -0,0 +1,156 @@

---
alwaysApply: true
# cursor.mcd — règles d’or 4NK
language: fr
policies:
  respond_in_french: true
  no_examples_in_codebase: true
  ask_before_push_or_tag: true

directories:
  ensure:
    - archive/
    - docs/
    - tests/
    - .gitea/
  docs:
    required_files:
      - API.md
      - ARCHITECTURE.md
      - COMMUNITY_GUIDE.md
      - CONFIGURATION.md
      - GITEA_SETUP.md
      - INDEX.md
      - INSTALLATION.md
      - MIGRATION.md
      - OPEN_SOURCE_CHECKLIST.md
      - QUICK_REFERENCE.md
      - RELEASE_PLAN.md
      - ROADMAP.md
      - SECURITY_AUDIT.md
      - TESTING.md
      - USAGE.md
  tests:
    required_files:
      - cleanup.sh
      - README.md
    required_dirs:
      - connectivity
      - external
      - integration
      - logs
      - performance
      - reports
      - unit
  gitea:
    required_files:
      - PULL_REQUEST_TEMPLATE.md
    required_dirs:
      - ISSUE_TEMPLATE
      - workflows
    ISSUE_TEMPLATE:
      required_files:
        - bug_report.md
        - feature_request.md
    workflows:
      required_files:
        - ci.yml

files:
  required_root_files:
    - CHANGELOG.md
    - CODE_OF_CONDUCT.md
    - CONTRIBUTING.md
    - docker-compose.yml
    - LICENSE
    - README.md

documentation:
  update_on:
    - feature_added
    - feature_modified
    - feature_removed
    - feature_discovered
  replace_sections_named: ["RESUME"]
  rex_required_on_multiple_hypotheses: true
  archive_obsolete_docs: true

compilation:
  compile_often: true
  compile_when_needed: true
  fail_on_errors: true

problem_solving:
  auto_run_steps:
    - minimal_repro
    - inspect_logs
    - bisect_changes
    - form_hypotheses
    - targeted_tests
    - implement_fix
    - non_regression

office_docs:
  docx_reader: docx2txt
  fallback:
    - pandoc_convert
    - request_alternate_source

dependencies:
  auto_add_missing: true
  always_check_latest_stable: true
  document_changes_in_docs: true

csv_models:
  treat_as_source_of_truth: true
  multirow_headers_supported: true
  confirm_in_docs: true
  require_column_definitions: true

file_processing:
  study_each_file: true
  ask_questions_if_needed: true
  adapt_code_if_needed: true
  propose_solution_if_unreadable: true

types_and_properties:
  auto_correct_incoherences: true
  document_transformations: true

functional_consistency:
  always_ask_clarifying_questions: true

frontend_architecture:
  react_code_splitting: true
  state_management: ["redux", "context_api"]
  data_service_abstraction: true

execution_discipline:
  finish_started_work: true

open_source_and_gitea:
  prepare_every_project: true
  gitea_remote: "git.4nkweb.com"
  required_files:
    - LICENSE
    - CONTRIBUTING.md
    - CHANGELOG.md
    - CODE_OF_CONDUCT.md
  align_with_4NK_node_on_creation: true
  keep_alignment_updated: true

tests_and_docs:
  update_docs_and_tests_with_code: true
  require_green_tests_before_commit: true

versioning:
  manage_with_changelog: true
  confirm_before_push: true
  confirm_before_tag: true
  propose_semver_bump: true

pre_commit:
  run_all_tests: true
  block_on_errors: true

---
.cursor/rules/50-data-csv-models.mdc — nouveau fichier (54 lignes)
@@ -0,0 +1,54 @@

---
alwaysApply: false
---
# Modélisation des données à partir de CSV

[portée]
Utilisation des CSV comme base des modèles de données, y compris en-têtes multi-lignes.

[objectifs]

- Confirmer la structure inférée pour chaque CSV.
- Demander une définition formelle de toutes les colonnes.

[directives]

- Gérer explicitement les en-têtes multi-lignes (titre principal + sous-colonnes).
- Confirmer par écrit dans docs/API.md ou docs/ARCHITECTURE.md : nombre de lignes d’en-tête, mapping colonnes→types, unités, domaines de valeurs, nullabilité, contraintes.
- Poser des questions si ambiguïtés ; proposer une normalisation temporaire documentée.
- Corriger automatiquement les incohérences de types si une règle de mapping est établie ailleurs et documenter la transformation.

[validations]

- Aucune ingestion sans spécification de colonnes validée.
- Traçabilité des corrections de types (avant/après) dans docs/ARCHITECTURE.md.

[artefacts concernés]

- docs/API.md, docs/ARCHITECTURE.md, docs/USAGE.md.
.cursor/rules/60-office-docs.mdc — nouveau fichier (41 lignes)
@@ -0,0 +1,41 @@

---
alwaysApply: false
---
# Lecture des documents bureautiques

[portée]
Lecture des fichiers .docx et alternatives.

[objectifs]
- Utiliser docx2txt par défaut.
- Proposer des solutions de repli si lecture impossible.

[directives]
- Lire les .docx avec docx2txt.
- En cas d’échec, proposer : conversion via pandoc, demande d’une source alternative, ou extraction textuelle.
- Documenter dans docs/INDEX.md la provenance et le statut des documents importés.

[validations]
- Vérification que les contenus extraits sont intégrés aux fichiers docs/ concernés.

[artefacts concernés]
- docs/**, archive/**.
.cursor/rules/70-frontend-architecture.mdc — nouveau fichier (56 lignes)
@@ -0,0 +1,56 @@

---
alwaysApply: false
---

# Architecture frontend

[portée]
Qualité du bundle, découpage, état global et couche de services.

[objectifs]

- Réduire la taille du bundle initial via code splitting.
- Éviter le prop drilling via Redux ou Context API.
- Abstraire les services de données pour testabilité et maintenance.

[directives]

- Mettre en place React.lazy et Suspense pour le chargement différé des vues/segments.
- Centraliser l’état global via Redux ou Context API.
- Isoler les appels « data » derrière une couche d’abstraction à interface stable.
- Interdire l’ajout d’exemples front dans la base de code.

[validations]

- Vérifier que les points d’entrée sont minimes et que les segments non critiques sont chargés à la demande.
- S’assurer que docs/ARCHITECTURE.md décrit les décisions et les points d’extension.

[artefacts concernés]

- docs/ARCHITECTURE.md, docs/TESTING.md.
.cursor/rules/80-versioning-and-release.mdc — nouveau fichier (53 lignes)
@@ -0,0 +1,53 @@

---
alwaysApply: false
---

# Versionnage et publication

[portée]
Gestion sémantique des versions, CHANGELOG, confirmation push/tag.

[objectifs]

- Tenir CHANGELOG.md comme source unique de vérité.
- Demander confirmation avant push et tag.

[directives]

- À chaque changement significatif, mettre à jour CHANGELOG.md (ajouts, changements, corrections, ruptures).
- Proposer un bump semver (major/minor/patch) motivé par l’impact.
- Avant tout push ou tag, demander confirmation explicite.

[validations]

- Refus si modification sans entrée correspondante dans CHANGELOG.md.
- Cohérence entre CHANGELOG.md, docs/RELEASE_PLAN.md et docs/ROADMAP.md.

[artefacts concernés]

- CHANGELOG.md, docs/RELEASE_PLAN.md, docs/ROADMAP.md.
.cursor/rules/85-release-guard.mdc — nouveau fichier (37 lignes)
@@ -0,0 +1,37 @@

---
alwaysApply: true
---

# Garde de release : tests, documentation, compilation, version, changelog, tag

[portée]
Contrôler systématiquement avant push/tag : tests verts, docs mises à jour, build OK, alignement numéro de version ↔ changelog ↔ tag git, mise à jour de déploiement, confirmation utilisateur (latest vs wip).

[objectifs]

- Empêcher toute publication sans vérifications minimales.
- Exiger la cohérence sémantique (VERSION/TEMPLATE_VERSION ↔ CHANGELOG ↔ tag git).
- Demander explicitement « latest » ou « wip » et appliquer la bonne stratégie.

[directives]

- Avant push/tag, exécuter : tests, compilation, lints (si configurés).
- Mettre à jour la documentation et le changelog en conséquence.
- Aligner le fichier de version (VERSION ou TEMPLATE_VERSION), l’entrée CHANGELOG et le tag.
- Demander confirmation utilisateur : `latest` (release stable) ou `wip` (travail en cours).
  - latest : entrée datée dans CHANGELOG, version stable, tag `vX.Y.Z`.
  - wip : suffixe `-wip` recommandé dans version/tag (ex. : `vX.Y.Z-wip.N`).
- Mettre à jour le déploiement après publication (si pipeline défini), sinon documenter l’étape.

[validations]

- Refuser push/tag si :
  - tests/compilation échouent,
  - CHANGELOG non mis à jour,
  - VERSION/TEMPLATE_VERSION absent ou incohérent,
  - release type non fourni (ni latest, ni wip).

[artefacts concernés]

- CHANGELOG.md, VERSION ou TEMPLATE_VERSION, docs/**, .gitea/workflows/**, scripts/**.
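Esquisse indicative d’une garde de release appliquant ces contrôles ; elle suppose un projet Cargo, un fichier VERSION ou TEMPLATE_VERSION à la racine et le nommage de tags `vX.Y.Z` / `vX.Y.Z-wip.N` décrit ci-dessus.

```bash
#!/usr/bin/env bash
# Esquisse hypothétique de garde de release : tests, build, alignement
# version ↔ CHANGELOG ↔ tag, et choix explicite latest/wip ($1).
set -euo pipefail

mode="${1:?usage: release-guard.sh latest|wip}"
version="$(cat VERSION 2>/dev/null || cat TEMPLATE_VERSION)"

cargo test --all
cargo build --release

grep -qF "$version" CHANGELOG.md || { echo "CHANGELOG.md sans entrée pour $version" >&2; exit 1; }

case "$mode" in
  latest) tag="v$version" ;;
  wip)    tag="v$version-wip.$(date +%Y%m%d%H%M)" ;;
  *)      echo "type de release inconnu : $mode" >&2; exit 1 ;;
esac

echo "prêt à taguer $tag (confirmation explicite requise avant git tag/push)"
```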
.cursor/rules/90-gitea-and-oss.mdc — nouveau fichier (59 lignes)
@@ -0,0 +1,59 @@

---
alwaysApply: true
---

# Open source et Gitea

[portée]
Conformité open source, templates Gitea, CI.

[objectifs]

- Préparer chaque projet pour un dépôt Gitea (git.4nkweb.com).
- Maintenir les fichiers de gouvernance et la CI.

[directives]

- Vérifier la présence et l’actualité de : LICENSE, CONTRIBUTING.md, CODE_OF_CONDUCT.md, OPEN_SOURCE_CHECKLIST.md.
- Maintenir .gitea/ :
  - ISSUE_TEMPLATE/bug_report.md, feature_request.md
  - PULL_REQUEST_TEMPLATE.md
  - workflows/ci.yml
- Documenter dans docs/GITEA_SETUP.md la configuration distante et les permissions.

[validations]

- Refus si un des fichiers « gouvernance/CI » manque.
- Cohérence entre docs/OPEN_SOURCE_CHECKLIST.md et l’état du repo.

[artefacts concernés]

- .gitea/**, docs/GITEA_SETUP.md, docs/OPEN_SOURCE_CHECKLIST.md.
.cursor/rules/95-triage-and-problem-solving.mdc — nouveau fichier (53 lignes)
@@ -0,0 +1,53 @@

---
alwaysApply: true
---

# Tri, diagnostic et résolution de problèmes

[portée]
Boucle de triage : reproduction, diagnostic, correctif, non-régression.

[objectifs]

- Exécuter automatiquement les étapes de résolution.
- Bloquer l’avancement tant que les erreurs ne sont pas corrigées.

[directives]

- Étapes obligatoires : reproduction minimale, inspection des logs, bissection des changements, formulation d’hypothèses, tests ciblés, correctif, test de non-régression.
- Lorsque plusieurs hypothèses ont été testées, produire un REX dans archive/ avec liens vers les commits.
- Poser des questions de cohérence fonctionnelle si des ambiguïtés subsistent (contrats d’API, invariants, SLA).

[validations]

- Interdiction de clore une tâche si un test échoue ou si une alerte critique subsiste.
- Traçabilité du REX si investigations multiples.

[artefacts concernés]

- tests/**, archive/**, docs/TESTING.md, docs/ARCHITECTURE.md.
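Pour l’étape « bissection des changements », git bisect peut servir d’oracle automatique ; esquisse minimale, en supposant un commit sain connu (placeholder `<commit_sain>`) et une suite `cargo test` qui reproduit le défaut.

```bash
# Esquisse : bissection automatique entre un commit sain connu et HEAD,
# en utilisant la suite de tests comme oracle. <commit_sain> est un placeholder.
git bisect start
git bisect bad HEAD
git bisect good <commit_sain>
git bisect run cargo test --all
git bisect reset
```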
.cursor/rules/98-explain-complex-commands — nouveau fichier (5 lignes)
@@ -0,0 +1,5 @@

---
alwaysApply: true
---

Quand tu exécutes une commande ou une requête complexe, explique-la avant de la lancer.
.cursor/rules/99-lint-markdow.mdc — nouveau fichier (9 lignes)
@@ -0,0 +1,9 @@

---
description:
globs:
alwaysApply: true
---

# Lint

Respecter strictement les règles de lint du markdown.
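La même vérification que l’étape markdownlint de la CI (voir `.gitea/workflows/ci.yml` plus bas) peut être lancée localement :

```bash
# Vérification locale des règles de lint markdown, identique à l'étape CI
# (markdownlint-cli épinglé en 0.42.0, archive/ exclue).
npx -y markdownlint-cli@0.42.0 "**/*.md" --ignore "archive/**"
```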
.cursor/rules/ruleset-index.md — nouveau fichier (16 lignes)
@@ -0,0 +1,16 @@

# Index des règles .cursor/rules

- 00-foundations.mdc : règles linguistiques et éditoriales (français, pas d’exemples en base, introduction/conclusion).
- 10-project-structure.mdc : arborescence canonique 4NK_node et garde-fous.
- 20-documentation.mdc : documentation continue, remplacement de « RESUME », INDEX.md.
- 30-testing.mdc : tests (unit, integration, connectivity, performance, external), logs/reports.
- 40-dependencies-and-build.mdc : dépendances, compilation, corrections bloquantes.
- 50-data-csv-models.mdc : CSV avec en-têtes multi-lignes, définition des colonnes.
- 60-office-docs.mdc : lecture .docx via docx2txt + repli.
- 70-frontend-architecture.mdc : React.lazy/Suspense, état global, couche de services.
- 80-versioning-and-release.mdc : CHANGELOG, semver, confirmation push/tag.
- 85-release-guard.mdc : garde de release (tests/doc/build/version/changelog/tag ; latest vs wip).
- 90-gitea-and-oss.mdc : fichiers open source, .gitea, CI, Gitea remote.
- 95-triage-and-problem-solving.mdc : boucle de diagnostic, REX, non-régression.

Ces règles sont conçues pour être ajoutées au contexte de Cursor depuis l’interface (@Cursor Rules) et s’appuient sur le mécanisme de règles projet stockées dans `.cursor/rules/`.
.cursorignore — nouveau fichier (26 lignes)
@@ -0,0 +1,26 @@

```text
# Ignorer les contenus volumineux pour le contexte IA
node_modules/
dist/
build/
coverage/
.cache/
.tmp/
.parcel-cache/

# Rapports et logs de tests
tests/logs/
tests/reports/

# Fichiers lourds
**/*.map
**/*.min.*
**/*.wasm
**/*.{png,jpg,jpeg,svg,ico,pdf}

# Ne pas ignorer .cursor ni AGENTS.md
!/.cursor
!/AGENTS.md

!.cursor/

!AGENTS.md
```
.dockerignore — nouveau fichier (6 lignes)
@@ -0,0 +1,6 @@

```text
target
.git
storage
**/*.log
**/*.tmp
**/*.swp
```
.gitea/ISSUE_TEMPLATE/bug_report.md — nouveau fichier (98 lignes)
@@ -0,0 +1,98 @@
|
||||
---
|
||||
name: Bug Report
|
||||
about: Signaler un bug pour nous aider à améliorer sdk_storage
|
||||
title: '[BUG] '
|
||||
labels: ['bug', 'needs-triage']
|
||||
assignees: ''
|
||||
---
|
||||
|
||||
## 🐛 Description du Bug
|
||||
|
||||
Description claire et concise du problème.
|
||||
|
||||
## 🔄 Étapes pour Reproduire
|
||||
|
||||
1. Aller à '...'
|
||||
2. Cliquer sur '...'
|
||||
3. Faire défiler jusqu'à '...'
|
||||
4. Voir l'erreur
|
||||
|
||||
## ✅ Comportement Attendu
|
||||
|
||||
Description de ce qui devrait se passer.
|
||||
|
||||
## ❌ Comportement Actuel
|
||||
|
||||
Description de ce qui se passe actuellement.
|
||||
|
||||
## 📸 Capture d'Écran
|
||||
|
||||
Si applicable, ajoutez une capture d'écran pour expliquer votre problème.
|
||||
|
||||
## 💻 Informations Système
|
||||
|
||||
- **OS** : [ex: Ubuntu 20.04, macOS 12.0, Windows 11]
|
||||
- **Docker** : [ex: 20.10.0]
|
||||
- **Docker Compose** : [ex: 2.0.0]
|
||||
- **Version sdk_storage** : [ex: v1.0.0]
|
||||
- **Architecture** : [ex: x86_64, ARM64]
|
||||
|
||||
## 📋 Configuration
|
||||
|
||||
### Services Actifs
|
||||
```bash
|
||||
docker ps
|
||||
```
|
||||
|
||||
### Variables d'Environnement
|
||||
```bash
|
||||
# Bitcoin Core
|
||||
BITCOIN_NETWORK=signet
|
||||
BITCOIN_RPC_PORT=18443
|
||||
|
||||
# Blindbit
|
||||
BLINDBIT_PORT=8000
|
||||
|
||||
# SDK Relay
|
||||
SDK_RELAY_PORTS=8090-8095
|
||||
```
|
||||
|
||||
## 📝 Logs
|
||||
|
||||
### Logs Pertinents
|
||||
```
|
||||
Logs pertinents ici
|
||||
```
|
||||
|
||||
### Logs d'Erreur
|
||||
```
|
||||
Logs d'erreur ici
|
||||
```
|
||||
|
||||
### Logs de Debug
|
||||
```
|
||||
Logs de debug ici (si RUST_LOG=debug)
|
||||
```
|
||||
|
||||
## 🔧 Tentatives de Résolution
|
||||
|
||||
- [ ] Redémarrage des services
|
||||
- [ ] Nettoyage des volumes Docker
|
||||
- [ ] Vérification de la connectivité réseau
|
||||
- [ ] Mise à jour des dépendances
|
||||
- [ ] Vérification de la configuration
|
||||
|
||||
## 📚 Contexte Supplémentaire
|
||||
|
||||
Toute autre information pertinente sur le problème.
|
||||
|
||||
## 🔗 Liens Utiles
|
||||
|
||||
- [Documentation](docs/)
|
||||
- [Guide de Dépannage](docs/TROUBLESHOOTING.md)
|
||||
- [Issues Similaires](https://git.4nkweb.com/4nk/4NK_node/issues?q=is%3Aissue+is%3Aopen+label%3Abug)
|
||||
|
||||
---
|
||||
|
||||
**Merci de votre contribution !** 🙏
|
||||
|
.gitea/ISSUE_TEMPLATE/feature_request.md — nouveau fichier (157 lignes)
@@ -0,0 +1,157 @@
|
||||
---
|
||||
name: Feature Request
|
||||
about: Proposer une nouvelle fonctionnalité pour sdk_storage
|
||||
title: '[FEATURE] '
|
||||
labels: ['enhancement', 'needs-triage']
|
||||
assignees: ''
|
||||
---
|
||||
|
||||
## 🚀 Résumé
|
||||
|
||||
Description claire et concise de la fonctionnalité souhaitée.
|
||||
|
||||
## 💡 Motivation
|
||||
|
||||
Pourquoi cette fonctionnalité est-elle nécessaire ? Quels problèmes résout-elle ?
|
||||
|
||||
### Problèmes Actuels
|
||||
- Problème 1
|
||||
- Problème 2
|
||||
- Problème 3
|
||||
|
||||
### Avantages de la Solution
|
||||
- Avantage 1
|
||||
- Avantage 2
|
||||
- Avantage 3
|
||||
|
||||
## 🎯 Proposition
|
||||
|
||||
Description détaillée de la fonctionnalité proposée.
|
||||
|
||||
### Fonctionnalités Principales
|
||||
- [ ] Fonctionnalité 1
|
||||
- [ ] Fonctionnalité 2
|
||||
- [ ] Fonctionnalité 3
|
||||
|
||||
### Interface Utilisateur
|
||||
Description de l'interface utilisateur si applicable.
|
||||
|
||||
### API Changes
|
||||
Description des changements d'API si applicable.
|
||||
|
||||
## 🔄 Alternatives Considérées
|
||||
|
||||
Autres solutions envisagées et pourquoi elles n'ont pas été choisies.
|
||||
|
||||
### Alternative 1
|
||||
- **Description** : ...
|
||||
- **Pourquoi rejetée** : ...
|
||||
|
||||
### Alternative 2
|
||||
- **Description** : ...
|
||||
- **Pourquoi rejetée** : ...
|
||||
|
||||
## 📊 Impact
|
||||
|
||||
### Impact sur les Utilisateurs
|
||||
- Impact positif 1
|
||||
- Impact positif 2
|
||||
- Impact négatif potentiel (si applicable)
|
||||
|
||||
### Impact sur l'Architecture
|
||||
- Changements nécessaires
|
||||
- Compatibilité avec l'existant
|
||||
- Performance
|
||||
|
||||
### Impact sur la Maintenance
|
||||
- Complexité ajoutée
|
||||
- Tests nécessaires
|
||||
- Documentation requise
|
||||
|
||||
## 💻 Exemples d'Utilisation
|
||||
|
||||
### Cas d'Usage 1
|
||||
```bash
|
||||
# Exemple de commande ou configuration
|
||||
```
|
||||
|
||||
### Cas d'Usage 2
|
||||
```python
|
||||
# Exemple de code Python
|
||||
```
|
||||
|
||||
### Cas d'Usage 3
|
||||
```javascript
|
||||
// Exemple de code JavaScript
|
||||
```
|
||||
|
||||
## 🧪 Tests
|
||||
|
||||
### Tests Nécessaires
|
||||
- [ ] Tests unitaires
|
||||
- [ ] Tests d'intégration
|
||||
- [ ] Tests de performance
|
||||
- [ ] Tests de sécurité
|
||||
- [ ] Tests de compatibilité
|
||||
|
||||
### Scénarios de Test
|
||||
- Scénario 1
|
||||
- Scénario 2
|
||||
- Scénario 3
|
||||
|
||||
## 📚 Documentation
|
||||
|
||||
### Documentation Requise
|
||||
- [ ] Guide d'utilisation
|
||||
- [ ] Documentation API
|
||||
- [ ] Exemples de code
|
||||
- [ ] Guide de migration
|
||||
- [ ] FAQ
|
||||
|
||||
## 🔧 Implémentation
|
||||
|
||||
### Étapes Proposées
|
||||
1. **Phase 1** : [Description]
|
||||
2. **Phase 2** : [Description]
|
||||
3. **Phase 3** : [Description]
|
||||
|
||||
### Estimation de Temps
|
||||
- **Développement** : X jours/semaines
|
||||
- **Tests** : X jours/semaines
|
||||
- **Documentation** : X jours/semaines
|
||||
- **Total** : X jours/semaines
|
||||
|
||||
### Ressources Nécessaires
|
||||
- Développeur(s)
|
||||
- Testeur(s)
|
||||
- Documentateur(s)
|
||||
- Infrastructure
|
||||
|
||||
## 🎯 Critères de Succès
|
||||
|
||||
Comment mesurer le succès de cette fonctionnalité ?
|
||||
|
||||
- [ ] Critère 1
|
||||
- [ ] Critère 2
|
||||
- [ ] Critère 3
|
||||
|
||||
## 🔗 Liens Utiles
|
||||
|
||||
- [Documentation existante](docs/)
|
||||
- [Issues similaires](https://git.4nkweb.com/4nk/4NK_node/issues?q=is%3Aissue+is%3Aopen+label%3Aenhancement)
|
||||
- [Roadmap](https://git.4nkweb.com/4nk/4NK_node/projects)
|
||||
- [Discussions](https://git.4nkweb.com/4nk/4NK_node/issues)
|
||||
|
||||
## 📋 Checklist
|
||||
|
||||
- [ ] J'ai vérifié que cette fonctionnalité n'existe pas déjà
|
||||
- [ ] J'ai lu la documentation existante
|
||||
- [ ] J'ai vérifié les issues similaires
|
||||
- [ ] J'ai fourni des exemples d'utilisation
|
||||
- [ ] J'ai considéré l'impact sur l'existant
|
||||
- [ ] J'ai proposé des tests
|
||||
|
||||
---
|
||||
|
||||
**Merci de votre contribution à l'amélioration de sdk_storage !** 🌟
|
||||
|
.gitea/PULL_REQUEST_TEMPLATE.md — nouveau fichier (181 lignes)
@@ -0,0 +1,181 @@
|
||||
# Pull Request - sdk_storage
|
||||
|
||||
## 📋 Description
|
||||
|
||||
Description claire et concise des changements apportés.
|
||||
|
||||
### Type de Changement
|
||||
- [ ] 🐛 Bug fix
|
||||
- [ ] ✨ Nouvelle fonctionnalité
|
||||
- [ ] 📚 Documentation
|
||||
- [ ] 🧪 Tests
|
||||
- [ ] 🔧 Refactoring
|
||||
- [ ] 🚀 Performance
|
||||
- [ ] 🔒 Sécurité
|
||||
- [ ] 🎨 Style/UI
|
||||
- [ ] 🏗️ Architecture
|
||||
- [ ] 📦 Build/CI
|
||||
|
||||
### Composants Affectés
|
||||
- [ ] Bitcoin Core
|
||||
- [ ] Blindbit
|
||||
- [ ] SDK Relay
|
||||
- [ ] Tor
|
||||
- [ ] Docker/Infrastructure
|
||||
- [ ] Tests
|
||||
- [ ] Documentation
|
||||
- [ ] Scripts
|
||||
|
||||
## 🔗 Issue(s) Liée(s)
|
||||
|
||||
Fixes #(issue)
|
||||
Relates to #(issue)
|
||||
|
||||
## 🧪 Tests
|
||||
|
||||
### Tests Exécutés
|
||||
- [ ] Tests unitaires
|
||||
- [ ] Tests d'intégration
|
||||
- [ ] Tests de connectivité
|
||||
- [ ] Tests externes
|
||||
- [ ] Tests de performance
|
||||
|
||||
### Commandes de Test
|
||||
```bash
|
||||
# Tests complets
|
||||
./tests/run_all_tests.sh
|
||||
|
||||
# Tests spécifiques
|
||||
./tests/run_unit_tests.sh
|
||||
./tests/run_integration_tests.sh
|
||||
```
|
||||
|
||||
### Résultats des Tests
|
||||
```
|
||||
Résultats des tests ici
|
||||
```
|
||||
|
||||
## 📸 Captures d'Écran
|
||||
|
||||
Si applicable, ajoutez des captures d'écran pour les changements visuels.
|
||||
|
||||
## 🔧 Changements Techniques
|
||||
|
||||
### Fichiers Modifiés
|
||||
- `fichier1.rs` - Description des changements
|
||||
- `fichier2.py` - Description des changements
|
||||
- `docker-compose.yml` - Description des changements
|
||||
|
||||
### Nouveaux Fichiers
|
||||
- `nouveau_fichier.rs` - Description
|
||||
- `nouveau_script.sh` - Description
|
||||
|
||||
### Fichiers Supprimés
|
||||
- `ancien_fichier.rs` - Raison de la suppression
|
||||
|
||||
### Changements de Configuration
|
||||
```yaml
|
||||
# Exemple de changement de configuration
|
||||
service:
|
||||
new_option: value
|
||||
```
|
||||
|
||||
## 📚 Documentation
|
||||
|
||||
### Documentation Mise à Jour
|
||||
- [ ] README.md
|
||||
- [ ] docs/INSTALLATION.md
|
||||
- [ ] docs/USAGE.md
|
||||
- [ ] docs/API.md
|
||||
- [ ] docs/ARCHITECTURE.md
|
||||
|
||||
### Nouvelle Documentation
|
||||
- [ ] Nouveau guide créé
|
||||
- [ ] Exemples ajoutés
|
||||
- [ ] API documentée
|
||||
|
||||
## 🔍 Code Review Checklist
|
||||
|
||||
### Code Quality
|
||||
- [ ] Le code suit les standards du projet
|
||||
- [ ] Les noms de variables/fonctions sont clairs
|
||||
- [ ] Les commentaires sont appropriés
|
||||
- [ ] Pas de code mort ou commenté
|
||||
- [ ] Gestion d'erreurs appropriée
|
||||
|
||||
### Performance
|
||||
- [ ] Pas de régression de performance
|
||||
- [ ] Optimisations appliquées si nécessaire
|
||||
- [ ] Tests de performance ajoutés
|
||||
|
||||
### Sécurité
|
||||
- [ ] Pas de vulnérabilités introduites
|
||||
- [ ] Validation des entrées utilisateur
|
||||
- [ ] Gestion sécurisée des secrets
|
||||
|
||||
### Tests
|
||||
- [ ] Couverture de tests suffisante
|
||||
- [ ] Tests pour les cas d'erreur
|
||||
- [ ] Tests d'intégration si nécessaire
|
||||
|
||||
### Documentation
|
||||
- [ ] Code auto-documenté
|
||||
- [ ] Documentation mise à jour
|
||||
- [ ] Exemples fournis
|
||||
|
||||
## 🚀 Déploiement
|
||||
|
||||
### Impact sur le Déploiement
|
||||
- [ ] Aucun impact
|
||||
- [ ] Migration de données requise
|
||||
- [ ] Changement de configuration
|
||||
- [ ] Redémarrage des services
|
||||
|
||||
### Étapes de Déploiement
|
||||
```bash
|
||||
# Étapes pour déployer les changements
|
||||
```
|
||||
|
||||
## 📊 Métriques
|
||||
|
||||
### Impact sur les Performances
|
||||
- Temps de réponse : +/- X%
|
||||
- Utilisation mémoire : +/- X%
|
||||
- Utilisation CPU : +/- X%
|
||||
|
||||
### Impact sur la Stabilité
|
||||
- Taux d'erreur : +/- X%
|
||||
- Disponibilité : +/- X%
|
||||
|
||||
## 🔄 Compatibilité
|
||||
|
||||
### Compatibilité Ascendante
|
||||
- [ ] Compatible avec les versions précédentes
|
||||
- [ ] Migration automatique
|
||||
- [ ] Migration manuelle requise
|
||||
|
||||
### Compatibilité Descendante
|
||||
- [ ] Compatible avec les futures versions
|
||||
- [ ] API stable
|
||||
- [ ] Configuration stable
|
||||
|
||||
## 🎯 Critères de Succès
|
||||
|
||||
- [ ] Critère 1
|
||||
- [ ] Critère 2
|
||||
- [ ] Critère 3
|
||||
|
||||
## 📝 Notes Supplémentaires
|
||||
|
||||
Informations supplémentaires importantes pour les reviewers.
|
||||
|
||||
## 🔗 Liens Utiles
|
||||
|
||||
- [Documentation](docs/)
|
||||
- [Tests](tests/)
|
||||
- [Issues liées](https://git.4nkweb.com/4nk/4NK_node/issues)
|
||||
|
||||
---
|
||||
|
||||
**Merci pour votre contribution !** 🙏
|
||||
|
.gitea/workflows/LOCAL_OVERRIDES.yml — nouveau fichier (15 lignes)
@@ -0,0 +1,15 @@

```yaml
# LOCAL_OVERRIDES.yml — dérogations locales contrôlées
overrides:
  - path: ".gitea/workflows/ci.yml"
    reason: "spécificité d’environnement"
    owner: "@maintainer_handle"
    expires: "2025-12-31"
  - path: "scripts/auto-ssh-push.sh"
    reason: "flux particulier temporaire"
    owner: "@maintainer_handle"
    expires: "2025-10-01"
policy:
  allow_only_listed_paths: true
  require_expiry: true
  audit_in_ci: true
```
.gitea/workflows/ci.yml
Normal file
486
.gitea/workflows/ci.yml
Normal file
@ -0,0 +1,486 @@
|
||||
name: CI - 4NK Node
|
||||
|
||||
on:
|
||||
push:
|
||||
branches: [ main, develop ]
|
||||
tags:
|
||||
- 'v*'
|
||||
pull_request:
|
||||
branches: [ main, develop ]
|
||||
|
||||
env:
|
||||
RUST_VERSION: '1.70'
|
||||
DOCKER_COMPOSE_VERSION: '2.20.0'
|
||||
CI_SKIP: 'true'
|
||||
|
||||
jobs:
|
||||
# Job de vérification du code
|
||||
code-quality:
|
||||
name: Code Quality
|
||||
runs-on: [self-hosted, linux]
|
||||
if: ${{ env.CI_SKIP != 'true' }}
|
||||
|
||||
steps:
|
||||
- name: Checkout code
|
||||
uses: actions/checkout@v3
|
||||
|
||||
- name: Setup Rust
|
||||
uses: actions-rs/toolchain@v1
|
||||
with:
|
||||
toolchain: ${{ env.RUST_VERSION }}
|
||||
override: true
|
||||
|
||||
- name: Cache Rust dependencies
|
||||
uses: actions/cache@v3
|
||||
with:
|
||||
path: |
|
||||
~/.cargo/registry
|
||||
~/.cargo/git
|
||||
target
|
||||
key: ${{ runner.os }}-cargo-${{ hashFiles('**/Cargo.lock') }}
|
||||
restore-keys: |
|
||||
${{ runner.os }}-cargo-
|
||||
|
||||
- name: Run clippy
|
||||
run: |
|
||||
cd sdk_relay
|
||||
cargo clippy --all-targets --all-features -- -D warnings
|
||||
|
||||
- name: Run rustfmt
|
||||
run: |
|
||||
cd sdk_relay
|
||||
cargo fmt --all -- --check
|
||||
|
||||
- name: Check documentation
|
||||
run: |
|
||||
cd sdk_relay
|
||||
cargo doc --no-deps
|
||||
|
||||
- name: Check for TODO/FIXME
|
||||
run: |
|
||||
if grep -r "TODO\|FIXME" . --exclude-dir=.git --exclude-dir=target; then
|
||||
echo "Found TODO/FIXME comments. Please address them."
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Job de tests unitaires
|
||||
unit-tests:
|
||||
name: Unit Tests
|
||||
runs-on: [self-hosted, linux]
|
||||
if: ${{ env.CI_SKIP != 'true' }}
|
||||
|
||||
steps:
|
||||
- name: Checkout code
|
||||
uses: actions/checkout@v3
|
||||
|
||||
- name: Setup Rust
|
||||
uses: actions-rs/toolchain@v1
|
||||
with:
|
||||
toolchain: ${{ env.RUST_VERSION }}
|
||||
override: true
|
||||
|
||||
- name: Cache Rust dependencies
|
||||
uses: actions/cache@v3
|
||||
with:
|
||||
path: |
|
||||
~/.cargo/registry
|
||||
~/.cargo/git
|
||||
target
|
||||
key: ${{ runner.os }}-cargo-${{ hashFiles('**/Cargo.lock') }}
|
||||
restore-keys: |
|
||||
${{ runner.os }}-cargo-
|
||||
|
||||
- name: Run unit tests
|
||||
run: |
|
||||
cd sdk_relay
|
||||
cargo test --lib --bins
|
||||
|
||||
- name: Run integration tests
|
||||
run: |
|
||||
cd sdk_relay
|
||||
cargo test --tests
|
||||
|
||||
# Job de tests d'intégration
|
||||
integration-tests:
|
||||
name: Integration Tests
|
||||
runs-on: [self-hosted, linux]
|
||||
if: ${{ env.CI_SKIP != 'true' }}
|
||||
|
||||
services:
|
||||
docker:
|
||||
image: docker:24.0.5
|
||||
options: >-
|
||||
--health-cmd "docker info"
|
||||
--health-interval 10s
|
||||
--health-timeout 5s
|
||||
--health-retries 5
|
||||
ports:
|
||||
- 2375:2375
|
||||
|
||||
steps:
|
||||
- name: Checkout code
|
||||
uses: actions/checkout@v3
|
||||
|
||||
- name: Setup Docker Buildx
|
||||
uses: docker/setup-buildx-action@v3
|
||||
|
||||
- name: Build Docker images
|
||||
run: |
|
||||
docker build -t 4nk-node-bitcoin ./bitcoin
|
||||
docker build -t 4nk-node-blindbit ./blindbit
|
||||
docker build -t 4nk-node-sdk-relay -f ./sdk_relay/Dockerfile ..
|
||||
|
||||
- name: Run integration tests
|
||||
run: |
|
||||
# Tests de connectivité de base
|
||||
./tests/run_connectivity_tests.sh || true
|
||||
|
||||
# Tests d'intégration
|
||||
./tests/run_integration_tests.sh || true
|
||||
|
||||
- name: Upload test results
|
||||
uses: actions/upload-artifact@v3
|
||||
if: always()
|
||||
with:
|
||||
name: test-results
|
||||
path: |
|
||||
tests/logs/
|
||||
tests/reports/
|
||||
retention-days: 7
|
||||
|
||||
# Job de tests de sécurité
|
||||
security-tests:
|
||||
name: Security Tests
|
||||
runs-on: [self-hosted, linux]
|
||||
if: ${{ env.CI_SKIP != 'true' }}
|
||||
|
||||
steps:
|
||||
- name: Checkout code
|
||||
uses: actions/checkout@v3
|
||||
|
||||
- name: Setup Rust
|
||||
uses: actions-rs/toolchain@v1
|
||||
with:
|
||||
toolchain: ${{ env.RUST_VERSION }}
|
||||
override: true
|
||||
|
||||
- name: Run cargo audit
|
||||
run: |
|
||||
cd sdk_relay
|
||||
cargo audit --deny warnings
|
||||
|
||||
- name: Check for secrets
|
||||
run: |
|
||||
# Vérifier les secrets potentiels
|
||||
if grep -r "password\|secret\|key\|token" . --exclude-dir=.git --exclude-dir=target --exclude=*.md; then
|
||||
echo "Potential secrets found. Please review."
|
||||
exit 1
|
||||
fi
|
||||
|
||||
- name: Check file permissions
|
||||
run: |
|
||||
# Vérifier les permissions sensibles
|
||||
find . -type f \( -name "*.conf" -o -name "*.key" -o -name "*.pem" \) | while read -r file; do
|
||||
if [[ $(stat -c %a "$file") != "600" ]]; then
|
||||
echo "Warning: $file has insecure permissions"
|
||||
fi
|
||||
done
|
||||
|
||||
# Job de build et test Docker
|
||||
docker-build:
|
||||
name: Docker Build & Test
|
||||
runs-on: [self-hosted, linux]
|
||||
if: ${{ env.CI_SKIP != 'true' }}
|
||||
|
||||
services:
|
||||
docker:
|
||||
image: docker:24.0.5
|
||||
options: >-
|
||||
--health-cmd "docker info"
|
||||
--health-interval 10s
|
||||
--health-timeout 5s
|
||||
--health-retries 5
|
||||
ports:
|
||||
- 2375:2375
|
||||
|
||||
steps:
|
||||
- name: Checkout code
|
||||
uses: actions/checkout@v3
|
||||
|
||||
- name: Setup Docker Buildx
|
||||
uses: docker/setup-buildx-action@v3
|
||||
|
||||
- name: Build and test Bitcoin Core
|
||||
run: |
|
||||
docker build -t 4nk-node-bitcoin:test ./bitcoin
|
||||
docker run --rm 4nk-node-bitcoin:test bitcoin-cli --version
|
||||
|
||||
- name: Build and test Blindbit
|
||||
run: |
|
||||
docker build -t 4nk-node-blindbit:test ./blindbit
|
||||
docker run --rm 4nk-node-blindbit:test --version || true
|
||||
|
||||
- name: Build and test SDK Relay
|
||||
run: |
|
||||
docker build -t 4nk-node-sdk-relay:test -f ./sdk_relay/Dockerfile ..
|
||||
docker run --rm 4nk-node-sdk-relay:test --version || true
|
||||
|
||||
- name: Test Docker Compose
|
||||
run: |
|
||||
docker-compose config
|
||||
docker-compose build --no-cache
|
||||
|
||||
# Job de tests de documentation
|
||||
documentation-tests:
|
||||
name: Documentation Tests
|
||||
runs-on: [self-hosted, linux]
|
||||
if: ${{ env.CI_SKIP != 'true' }}
|
||||
|
||||
steps:
|
||||
- name: Checkout code
|
||||
uses: actions/checkout@v3
|
||||
|
||||
- name: Check markdown links
|
||||
run: |
|
||||
# Vérification basique des liens markdown
|
||||
find . -name "*.md" -exec grep -l "\[.*\](" {} \; | while read file; do
|
||||
echo "Checking links in $file"
|
||||
done
|
||||
|
||||
markdownlint:
|
||||
name: Markdown Lint
|
||||
runs-on: [self-hosted, linux]
|
||||
if: ${{ env.CI_SKIP != 'true' }}
|
||||
steps:
|
||||
- name: Checkout code
|
||||
uses: actions/checkout@v3
|
||||
- name: Run markdownlint
|
||||
run: |
|
||||
npm --version || (curl -fsSL https://deb.nodesource.com/setup_20.x | sudo -E bash - && sudo apt-get install -y nodejs)
|
||||
npx -y markdownlint-cli@0.42.0 "**/*.md" --ignore "archive/**"
|
||||
|
||||
- name: Check documentation structure
|
||||
run: |
|
||||
# Vérifier la présence des fichiers de documentation essentiels
|
||||
required_files=(
|
||||
"README.md"
|
||||
"LICENSE"
|
||||
"CONTRIBUTING.md"
|
||||
"CHANGELOG.md"
|
||||
"CODE_OF_CONDUCT.md"
|
||||
"SECURITY.md"
|
||||
)
|
||||
|
||||
for file in "${required_files[@]}"; do
|
||||
if [[ ! -f "$file" ]]; then
|
||||
echo "Missing required documentation file: $file"
|
||||
exit 1
|
||||
fi
|
||||
done
|
||||
|
||||
bash-required:
|
||||
name: Bash Requirement
|
||||
runs-on: [self-hosted, linux]
|
||||
if: ${{ env.CI_SKIP != 'true' }}
|
||||
steps:
|
||||
- name: Checkout code
|
||||
uses: actions/checkout@v3
|
||||
- name: Verify bash availability
|
||||
run: |
|
||||
if ! command -v bash >/dev/null 2>&1; then
|
||||
echo "bash is required for agents and scripts"; exit 1;
|
||||
fi
|
||||
- name: Verify agents runner exists
|
||||
run: |
|
||||
if [ ! -f scripts/agents/run.sh ]; then
|
||||
echo "scripts/agents/run.sh is missing"; exit 1;
|
||||
fi
|
||||
|
||||
agents-smoke:
|
||||
name: Agents Smoke (no AI)
|
||||
runs-on: [self-hosted, linux]
|
||||
if: ${{ env.CI_SKIP != 'true' }}
|
||||
steps:
|
||||
- name: Checkout code
|
||||
uses: actions/checkout@v3
|
||||
- name: Ensure agents scripts executable
|
||||
run: |
|
||||
chmod +x scripts/agents/*.sh || true
|
||||
- name: Run agents without AI
|
||||
env:
|
||||
OPENAI_API_KEY: ""
|
||||
run: |
|
||||
scripts/agents/run.sh
|
||||
- name: Upload agents reports
|
||||
uses: actions/upload-artifact@v3
|
||||
with:
|
||||
name: agents-reports
|
||||
path: tests/reports/agents
|
||||
|
||||
openia-agents:
|
||||
name: Agents with OpenIA
|
||||
runs-on: [self-hosted, linux]
|
||||
if: ${{ env.CI_SKIP != 'true' && secrets.OPENAI_API_KEY != '' }}
|
||||
env:
|
||||
OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
|
||||
OPENAI_MODEL: ${{ vars.OPENAI_MODEL }}
|
||||
OPENAI_API_BASE: ${{ vars.OPENAI_API_BASE }}
|
||||
OPENAI_TEMPERATURE: ${{ vars.OPENAI_TEMPERATURE }}
|
||||
steps:
|
||||
- name: Checkout code
|
||||
uses: actions/checkout@v3
|
||||
- name: Ensure agents scripts executable
|
||||
run: |
|
||||
chmod +x scripts/agents/*.sh || true
|
||||
- name: Run agents with AI
|
||||
run: |
|
||||
scripts/agents/run.sh
|
||||
- name: Upload agents reports
|
||||
uses: actions/upload-artifact@v3
|
||||
with:
|
||||
name: agents-reports-ai
|
||||
path: tests/reports/agents
|
||||
|
||||
deployment-checks:
|
||||
name: Deployment Checks
|
||||
runs-on: [self-hosted, linux]
|
||||
if: ${{ env.CI_SKIP != 'true' }}
|
||||
steps:
|
||||
- name: Checkout code
|
||||
uses: actions/checkout@v3
|
||||
- name: Validate deployment documentation
|
||||
run: |
|
||||
if [ ! -f docs/DEPLOYMENT.md ]; then
|
||||
echo "Missing docs/DEPLOYMENT.md"; exit 1; fi
|
||||
if [ ! -f docs/SSH_UPDATE.md ]; then
|
||||
echo "Missing docs/SSH_UPDATE.md"; exit 1; fi
|
||||
- name: Ensure tests directories exist
|
||||
run: |
|
||||
mkdir -p tests/logs tests/reports || true
|
||||
echo "OK"
|
||||
|
||||
security-audit:
|
||||
name: Security Audit
|
||||
runs-on: [self-hosted, linux]
|
||||
if: ${{ env.CI_SKIP != 'true' }}
|
||||
steps:
|
||||
- name: Checkout code
|
||||
uses: actions/checkout@v3
|
||||
- name: Ensure scripts executable
|
||||
run: |
|
||||
chmod +x scripts/security/audit.sh || true
|
||||
- name: Run template security audit
|
||||
run: |
|
||||
if [ -f scripts/security/audit.sh ]; then
|
||||
./scripts/security/audit.sh
|
||||
else
|
||||
echo "No security audit script (ok)"
|
||||
fi
|
||||
|
||||
# Job de release guard (cohérence release)
|
||||
release-guard:
|
||||
name: Release Guard
|
||||
runs-on: [self-hosted, linux]
|
||||
needs: [code-quality, unit-tests, documentation-tests, markdownlint, security-audit, deployment-checks, bash-required]
|
||||
if: ${{ env.CI_SKIP != 'true' }}
|
||||
steps:
|
||||
- name: Checkout code
|
||||
uses: actions/checkout@v3
|
||||
|
||||
- name: Ensure guard scripts are executable
|
||||
run: |
|
||||
chmod +x scripts/release/guard.sh || true
|
||||
chmod +x scripts/checks/version_alignment.sh || true
|
||||
|
||||
- name: Version alignment check
|
||||
run: |
|
||||
if [ -f scripts/checks/version_alignment.sh ]; then
|
||||
./scripts/checks/version_alignment.sh
|
||||
else
|
||||
echo "No version alignment script (ok)"
|
||||
fi
|
||||
|
||||
- name: Release guard (CI verify)
|
||||
env:
|
||||
RELEASE_TYPE: ci-verify
|
||||
run: |
|
||||
if [ -f scripts/release/guard.sh ]; then
|
||||
./scripts/release/guard.sh
|
||||
else
|
||||
echo "No guard script (ok)"
|
||||
fi
|
||||
|
||||
release-create:
|
||||
name: Create Release (Gitea API)
|
||||
runs-on: ubuntu-latest
|
||||
needs: [release-guard]
|
||||
if: ${{ env.CI_SKIP != 'true' && startsWith(github.ref, 'refs/tags/') }}
|
||||
env:
|
||||
RELEASE_TOKEN: ${{ secrets.RELEASE_TOKEN }}
|
||||
BASE_URL: ${{ vars.BASE_URL }}
|
||||
steps:
|
||||
- name: Checkout code
|
||||
uses: actions/checkout@v3
|
||||
- name: Validate token and publish release
|
||||
run: |
|
||||
set -e
|
||||
if [ -z "${RELEASE_TOKEN}" ]; then
|
||||
echo "RELEASE_TOKEN secret is missing" >&2; exit 1; fi
|
||||
if [ -z "${BASE_URL}" ]; then
|
||||
BASE_URL="https://git.4nkweb.com"; fi
|
||||
TAG="${GITHUB_REF##*/}"
|
||||
REPO="${GITHUB_REPOSITORY}"
|
||||
OWNER="${REPO%%/*}"
|
||||
NAME="${REPO##*/}"
|
||||
echo "Publishing release ${TAG} to ${BASE_URL}/${OWNER}/${NAME}"
|
||||
curl -sSf -X POST \
|
||||
-H "Authorization: token ${RELEASE_TOKEN}" \
|
||||
-H "Content-Type: application/json" \
|
||||
-d "{\"tag_name\":\"${TAG}\",\"name\":\"${TAG}\",\"draft\":false,\"prerelease\":false}" \
|
||||
"${BASE_URL}/api/v1/repos/${OWNER}/${NAME}/releases" >/dev/null
|
||||
echo "Release created"
|
||||
|
||||
# Job de tests de performance
|
||||
performance-tests:
|
||||
name: Performance Tests
|
||||
runs-on: [self-hosted, linux]
|
||||
if: ${{ env.CI_SKIP != 'true' }}
|
||||
|
||||
steps:
|
||||
- name: Checkout code
|
||||
uses: actions/checkout@v3
|
||||
|
||||
- name: Setup Rust
|
||||
uses: actions-rs/toolchain@v1
|
||||
with:
|
||||
toolchain: ${{ env.RUST_VERSION }}
|
||||
override: true
|
||||
|
||||
- name: Run performance tests
|
||||
run: |
|
||||
cd sdk_relay
|
||||
cargo test --release --test performance_tests || true
|
||||
|
||||
- name: Check memory usage
|
||||
run: |
|
||||
# Tests de base de consommation mémoire
|
||||
echo "Performance tests completed"
|
||||
|
||||
# Job de notification
|
||||
notify:
|
||||
name: Notify
|
||||
runs-on: [self-hosted, linux]
|
||||
needs: [code-quality, unit-tests, integration-tests, security-tests, docker-build, documentation-tests]
|
||||
if: ${{ env.CI_SKIP != 'true' && always() }}
|
||||
|
||||
steps:
|
||||
- name: Notify success
|
||||
if: needs.code-quality.result == 'success' && needs.unit-tests.result == 'success' && needs.integration-tests.result == 'success' && needs.security-tests.result == 'success' && needs.docker-build.result == 'success' && needs.documentation-tests.result == 'success'
|
||||
run: |
|
||||
echo "✅ All tests passed successfully!"
|
||||
|
||||
- name: Notify failure
|
||||
if: needs.code-quality.result == 'failure' || needs.unit-tests.result == 'failure' || needs.integration-tests.result == 'failure' || needs.security-tests.result == 'failure' || needs.docker-build.result == 'failure' || needs.documentation-tests.result == 'failure'
|
||||
run: |
|
||||
echo "❌ Some tests failed!"
|
||||
exit 1
|
.gitea/workflows/ci.yml.bak — Normal file (+352 lines)
@@ -0,0 +1,352 @@
|
||||
name: CI - sdk_storage
|
||||
|
||||
on:
|
||||
push:
|
||||
branches: [ main, develop ]
|
||||
pull_request:
|
||||
branches: [ main, develop ]
|
||||
|
||||
env:
|
||||
RUST_VERSION: '1.70'
|
||||
DOCKER_COMPOSE_VERSION: '2.20.0'
|
||||
|
||||
jobs:
|
||||
# Job de vérification du code
|
||||
code-quality:
|
||||
name: Code Quality
|
||||
runs-on: ubuntu-latest
|
||||
|
||||
steps:
|
||||
- name: Checkout code
|
||||
uses: actions/checkout@v3
|
||||
|
||||
- name: Setup Rust
|
||||
uses: actions-rs/toolchain@v1
|
||||
with:
|
||||
toolchain: ${{ env.RUST_VERSION }}
|
||||
override: true
|
||||
|
||||
- name: Cache Rust dependencies
|
||||
uses: actions/cache@v3
|
||||
with:
|
||||
path: |
|
||||
~/.cargo/registry
|
||||
~/.cargo/git
|
||||
target
|
||||
key: ${{ runner.os }}-cargo-${{ hashFiles('**/Cargo.lock') }}
|
||||
restore-keys: |
|
||||
${{ runner.os }}-cargo-
|
||||
|
||||
- name: Run clippy
|
||||
run: |
|
||||
cargo clippy --all-targets --all-features -- -D warnings
|
||||
|
||||
- name: Run rustfmt
|
||||
run: |
|
||||
cargo fmt --all -- --check
|
||||
|
||||
- name: Check documentation
|
||||
run: |
|
||||
cargo doc --no-deps
|
||||
|
||||
- name: Check for TODO/FIXME
|
||||
run: |
|
||||
if grep -r "TODO\|FIXME" . --exclude-dir=.git --exclude-dir=target; then
|
||||
echo "Found TODO/FIXME comments. Please address them."
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Job de tests unitaires
|
||||
unit-tests:
|
||||
name: Unit Tests
|
||||
runs-on: ubuntu-latest
|
||||
|
||||
steps:
|
||||
- name: Checkout code
|
||||
uses: actions/checkout@v3
|
||||
|
||||
- name: Setup Rust
|
||||
uses: actions-rs/toolchain@v1
|
||||
with:
|
||||
toolchain: ${{ env.RUST_VERSION }}
|
||||
override: true
|
||||
|
||||
- name: Cache Rust dependencies
|
||||
uses: actions/cache@v3
|
||||
with:
|
||||
path: |
|
||||
~/.cargo/registry
|
||||
~/.cargo/git
|
||||
target
|
||||
key: ${{ runner.os }}-cargo-${{ hashFiles('**/Cargo.lock') }}
|
||||
restore-keys: |
|
||||
${{ runner.os }}-cargo-
|
||||
|
||||
- name: Run unit tests
|
||||
run: |
|
||||
cargo test --lib --bins
|
||||
|
||||
- name: Run integration tests
|
||||
run: |
|
||||
cargo test --tests
|
||||
|
||||
# Job de tests d'intégration
|
||||
integration-tests:
|
||||
name: Integration Tests
|
||||
runs-on: ubuntu-latest
|
||||
|
||||
services:
|
||||
docker:
|
||||
image: docker:24.0.5
|
||||
options: >-
|
||||
--health-cmd "docker info"
|
||||
--health-interval 10s
|
||||
--health-timeout 5s
|
||||
--health-retries 5
|
||||
ports:
|
||||
- 2375:2375
|
||||
|
||||
steps:
|
||||
- name: Checkout code
|
||||
uses: actions/checkout@v3
|
||||
|
||||
- name: Setup Docker Buildx
|
||||
uses: docker/setup-buildx-action@v3
|
||||
|
||||
- name: Build Docker images
|
||||
run: |
|
||||
docker build -t 4nk-node-bitcoin ./bitcoin
|
||||
docker build -t 4nk-node-blindbit ./blindbit
|
||||
docker build -t 4nk-node-sdk-relay -f ./sdk_relay/Dockerfile ..
|
||||
|
||||
- name: Run integration tests
|
||||
run: |
|
||||
# Tests de connectivité de base
|
||||
./tests/run_connectivity_tests.sh || true
|
||||
|
||||
# Tests d'intégration
|
||||
./tests/run_integration_tests.sh || true
|
||||
|
||||
- name: Upload test results
|
||||
uses: actions/upload-artifact@v3
|
||||
if: always()
|
||||
with:
|
||||
name: test-results
|
||||
path: |
|
||||
tests/logs/
|
||||
tests/reports/
|
||||
retention-days: 7
|
||||
|
||||
# Job de tests de sécurité
|
||||
security-tests:
|
||||
name: Security Tests
|
||||
runs-on: ubuntu-latest
|
||||
|
||||
steps:
|
||||
- name: Checkout code
|
||||
uses: actions/checkout@v3
|
||||
|
||||
- name: Setup Rust
|
||||
uses: actions-rs/toolchain@v1
|
||||
with:
|
||||
toolchain: ${{ env.RUST_VERSION }}
|
||||
override: true
|
||||
|
||||
- name: Run cargo audit
|
||||
run: |
|
||||
cargo audit --deny warnings
|
||||
|
||||
- name: Check for secrets
|
||||
run: |
|
||||
# Vérifier les secrets potentiels
|
||||
if grep -r "password\|secret\|key\|token" . --exclude-dir=.git --exclude-dir=target --exclude=*.md; then
|
||||
echo "Potential secrets found. Please review."
|
||||
exit 1
|
||||
fi
|
||||
|
||||
- name: Check file permissions
|
||||
run: |
|
||||
# Vérifier les permissions sensibles
|
||||
find . -type f \( -name "*.conf" -o -name "*.key" -o -name "*.pem" \) | while read -r file; do
|
||||
if [[ $(stat -c %a "$file") != "600" ]]; then
|
||||
echo "Warning: $file has insecure permissions"
|
||||
fi
|
||||
done
|
||||
|
||||
# Job de build et test Docker
|
||||
docker-build:
|
||||
name: Docker Build & Test
|
||||
runs-on: ubuntu-latest
|
||||
|
||||
services:
|
||||
docker:
|
||||
image: docker:24.0.5
|
||||
options: >-
|
||||
--health-cmd "docker info"
|
||||
--health-interval 10s
|
||||
--health-timeout 5s
|
||||
--health-retries 5
|
||||
ports:
|
||||
- 2375:2375
|
||||
|
||||
steps:
|
||||
- name: Checkout code
|
||||
uses: actions/checkout@v3
|
||||
|
||||
- name: Setup Docker Buildx
|
||||
uses: docker/setup-buildx-action@v3
|
||||
|
||||
- name: Build and test Bitcoin Core
|
||||
run: |
|
||||
docker build -t 4nk-node-bitcoin:test ./bitcoin
|
||||
docker run --rm 4nk-node-bitcoin:test bitcoin-cli --version
|
||||
|
||||
- name: Build and test Blindbit
|
||||
run: |
|
||||
docker build -t 4nk-node-blindbit:test ./blindbit
|
||||
docker run --rm 4nk-node-blindbit:test --version || true
|
||||
|
||||
- name: Build and test SDK Relay
|
||||
run: |
|
||||
docker build -t 4nk-node-sdk-relay:test -f ./sdk_relay/Dockerfile ..
|
||||
docker run --rm 4nk-node-sdk-relay:test --version || true
|
||||
|
||||
- name: Test Docker Compose
|
||||
run: |
|
||||
docker-compose config
|
||||
docker-compose build --no-cache
|
||||
|
||||
# Job de tests de documentation
|
||||
documentation-tests:
|
||||
name: Documentation Tests
|
||||
runs-on: ubuntu-latest
|
||||
|
||||
steps:
|
||||
- name: Checkout code
|
||||
uses: actions/checkout@v3
|
||||
|
||||
- name: Check markdown links
|
||||
run: |
|
||||
# Vérification basique des liens markdown
|
||||
find . -name "*.md" -exec grep -l "\[.*\](" {} \; | while read file; do
|
||||
echo "Checking links in $file"
|
||||
done
|
||||
|
||||
- name: Check documentation structure
|
||||
run: |
|
||||
# Vérifier la présence des fichiers de documentation essentiels
|
||||
required_files=(
|
||||
"README.md"
|
||||
"LICENSE"
|
||||
"CONTRIBUTING.md"
|
||||
"CHANGELOG.md"
|
||||
"CODE_OF_CONDUCT.md"
|
||||
"SECURITY.md"
|
||||
"docs/INDEX.md"
|
||||
"docs/INSTALLATION.md"
|
||||
"docs/USAGE.md"
|
||||
)
|
||||
|
||||
for file in "${required_files[@]}"; do
|
||||
if [[ ! -f "$file" ]]; then
|
||||
echo "Missing required documentation file: $file"
|
||||
exit 1
|
||||
fi
|
||||
done
|
||||
|
||||
- name: Validate documentation
|
||||
run: |
|
||||
echo "Documentation checks completed"
|
||||
|
||||
security-audit:
|
||||
name: Security Audit
|
||||
runs-on: ubuntu-latest
|
||||
steps:
|
||||
- name: Checkout code
|
||||
uses: actions/checkout@v3
|
||||
- name: Ensure scripts executable
|
||||
run: |
|
||||
chmod +x scripts/security/audit.sh || true
|
||||
- name: Run template security audit
|
||||
run: |
|
||||
if [ -f scripts/security/audit.sh ]; then
|
||||
./scripts/security/audit.sh
|
||||
else
|
||||
echo "No security audit script (ok)"
|
||||
fi
|
||||
|
||||
# Job de release guard (cohérence release)
|
||||
release-guard:
|
||||
name: Release Guard
|
||||
runs-on: ubuntu-latest
|
||||
needs: [code-quality, unit-tests, documentation-tests, security-audit]
|
||||
steps:
|
||||
- name: Checkout code
|
||||
uses: actions/checkout@v3
|
||||
|
||||
- name: Ensure guard scripts are executable
|
||||
run: |
|
||||
chmod +x scripts/release/guard.sh || true
|
||||
chmod +x scripts/checks/version_alignment.sh || true
|
||||
|
||||
- name: Version alignment check
|
||||
run: |
|
||||
if [ -f scripts/checks/version_alignment.sh ]; then
|
||||
./scripts/checks/version_alignment.sh
|
||||
else
|
||||
echo "No version alignment script (ok)"
|
||||
fi
|
||||
|
||||
- name: Release guard (CI verify)
|
||||
env:
|
||||
RELEASE_TYPE: ci-verify
|
||||
run: |
|
||||
if [ -f scripts/release/guard.sh ]; then
|
||||
./scripts/release/guard.sh
|
||||
else
|
||||
echo "No guard script (ok)"
|
||||
fi
|
||||
|
||||
# Job de tests de performance
|
||||
performance-tests:
|
||||
name: Performance Tests
|
||||
runs-on: ubuntu-latest
|
||||
|
||||
steps:
|
||||
- name: Checkout code
|
||||
uses: actions/checkout@v3
|
||||
|
||||
- name: Setup Rust
|
||||
uses: actions-rs/toolchain@v1
|
||||
with:
|
||||
toolchain: ${{ env.RUST_VERSION }}
|
||||
override: true
|
||||
|
||||
- name: Run performance tests
|
||||
run: |
|
||||
cd sdk_relay
|
||||
cargo test --release --test performance_tests || true
|
||||
|
||||
- name: Check memory usage
|
||||
run: |
|
||||
# Tests de base de consommation mémoire
|
||||
echo "Performance tests completed"
|
||||
|
||||
# Job de notification
|
||||
notify:
|
||||
name: Notify
|
||||
runs-on: ubuntu-latest
|
||||
needs: [code-quality, unit-tests, integration-tests, security-tests, docker-build, documentation-tests]
|
||||
if: always()
|
||||
|
||||
steps:
|
||||
- name: Notify success
|
||||
if: needs.code-quality.result == 'success' && needs.unit-tests.result == 'success' && needs.integration-tests.result == 'success' && needs.security-tests.result == 'success' && needs.docker-build.result == 'success' && needs.documentation-tests.result == 'success'
|
||||
run: |
|
||||
echo "✅ All tests passed successfully!"
|
||||
|
||||
- name: Notify failure
|
||||
if: needs.code-quality.result == 'failure' || needs.unit-tests.result == 'failure' || needs.integration-tests.result == 'failure' || needs.security-tests.result == 'failure' || needs.docker-build.result == 'failure' || needs.documentation-tests.result == 'failure'
|
||||
run: |
|
||||
echo "❌ Some tests failed!"
|
||||
exit 1
|
.gitea/workflows/docker.yml — Normal file (+47 lines)
@@ -0,0 +1,47 @@
name: Docker Image

on:
  push:
    branches:
      - docker-support
    tags:
      - 'v*.*.*'
  workflow_dispatch:

jobs:
  docker:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      packages: write
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Set up QEMU
        uses: docker/setup-qemu-action@v3
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3
      - name: Docker meta
        id: meta
        uses: docker/metadata-action@v5
        with:
          images: ${{ secrets.DOCKER_REGISTRY }}/sdk_storage
          tags: |
            type=raw,value=latest,enable=${{ github.ref_type == 'branch' && github.ref_name == 'docker-support' }}
            type=ref,event=tag
            type=semver,pattern={{version}},enable=${{ startsWith(github.ref, 'refs/tags/') }}
      - name: Login to registry
        uses: docker/login-action@v3
        with:
          registry: ${{ secrets.DOCKER_REGISTRY }}
          username: ${{ secrets.DOCKER_USERNAME }}
          password: ${{ secrets.DOCKER_PASSWORD }}
      - name: Build and push
        uses: docker/build-push-action@v6
        with:
          context: .
          push: true
          tags: ${{ steps.meta.outputs.tags }}
          labels: ${{ steps.meta.outputs.labels }}
          platforms: linux/amd64,linux/arm64

.gitea/workflows/release.yml — Normal file (+35 lines)
@@ -0,0 +1,35 @@
name: Release

on:
  push:
    tags:
      - 'v*.*.*'

jobs:
  build-release:
    runs-on: ${{ matrix.os }}
    strategy:
      matrix:
        os: [ubuntu-latest, windows-latest]
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Rust toolchain
        uses: dtolnay/rust-toolchain@stable
      - name: Build
        run: cargo build --release
      - name: Archive artifact
        shell: bash
        run: |
          mkdir -p dist
          if [[ "$RUNNER_OS" == "Windows" ]]; then
            cp target/release/sdk_storage.exe dist/
          else
            cp target/release/sdk_storage dist/
          fi
      - name: Upload artifact
        uses: actions/upload-artifact@v4
        with:
          name: sdk_storage-${{ runner.os }}
          path: dist/*

.gitea/workflows/template-sync.yml — Normal file (+40 lines)
@@ -0,0 +1,40 @@
# .gitea/workflows/template-sync.yml — synchronisation et contrôles d’intégrité
name: 4NK Template Sync
on:
  schedule: # planification régulière
    - cron: "0 4 * * 1" # exécution hebdomadaire (UTC)
  workflow_dispatch: {} # déclenchement manuel

jobs:
  check-and-sync:
    runs-on: linux
    steps:
      - name: Lire TEMPLATE_VERSION et .4nk-sync.yml
        # Doit charger ref courant, source_repo et périmètre paths

      - name: Récupérer la version publiée du template/4NK_rules
        # Doit comparer TEMPLATE_VERSION avec ref amont

      - name: Créer branche de synchronisation si divergence
        # Doit créer chore/template-sync-<date> et préparer un commit

      - name: Synchroniser les chemins autoritatifs
        # Doit mettre à jour .cursor/**, .gitea/**, AGENTS.md, scripts/**, docs/SSH_UPDATE.md

      - name: Contrôles post-sync (bloquants)
        # 1) Vérifier présence et exécutable des scripts/*.sh
        # 2) Vérifier mise à jour CHANGELOG.md et docs/INDEX.md
        # 3) Vérifier docs/SSH_UPDATE.md si scripts/** a changé
        # 4) Vérifier absence de secrets en clair dans scripts/**
        # 5) Vérifier manifest_checksum si publié

      - name: Tests, lint, sécurité statique
        # Doit exiger un état vert

      - name: Ouvrir PR de synchronisation
        # Titre: "[template-sync] chore: aligner .cursor/.gitea/AGENTS.md/scripts"
        # Doit inclure résumé des fichiers modifiés et la version appliquée

      - name: Mettre à jour TEMPLATE_VERSION (dans PR)
        # Doit remplacer la valeur par la ref appliquée

.gitignore — vendored (15 lines changed)
@@ -1,2 +1,17 @@
 /target
 /storage
+/dist
+/.env
+/.env.local
+/.env.production
+/.env.test
+.vscode/
+.idea/
+*.log
+*.tmp
+*.swp
+*.swo
+
+!.cursor/
+
+!AGENTS.md
.markdownlint.json — Normal file (+14 lines)
@@ -0,0 +1,14 @@
{
  "MD013": {
    "line_length": 200,
    "code_blocks": false,
    "tables": false,
    "headings": false
  },
  "MD007": {
    "indent": 2
  },
  "MD024": {
    "siblings_only": true
  }
}
AGENTS.md — Normal file (+12 lines)
@@ -0,0 +1,12 @@
# Agents & Automations

- Compilation régulière: `cargo build`.
- Lancement des tests: `cargo test`.
- Mise à jour de la documentation dès qu'une fonctionnalité change (`docs/`).

## Sécurité (vigilance)

- Exécuter l’audit de sécurité automatisé: `scripts/security/audit.sh` (cargo audit, npm audit si présent, scan de secrets).
- Interdire l’introduction de secrets en clair; rotation des secrets gérés par la CI.
- Vérifier les permissions des fichiers sensibles et l’absence d’endpoints privés exposés.
- La pipeline CI inclut un job `security-audit` et bloque les releases en cas d’échec (intégré au `release-guard`).
CHANGELOG.md — Normal file (+22 lines)
@@ -0,0 +1,22 @@
# Changelog

## 0.2.0
- Ajout Dockerfile multi-stage et `.dockerignore`
- CI: workflows build/test, release et build/push Docker
- Documentation étendue dans `docs/` (architecture, guides, API JSON)
- Tests renforcés: conflit de clé et suppression des expirés

## 0.2.2
- Endpoint `/health` ajouté et testé
- Spec JSON/API et docs monitoring mises à jour
- CI Docker: tagging multi-tags (`latest` et `vX.Y.Z` sur tag)
- Version et déploiements alignés

## 0.1.0
- Refactor vers `src/lib.rs` et service `StorageService`
- Ajout `docs/` (README) et `tests/` (test intégration service)
- API HTTP Tide conservée; nettoyage TTL périodique 60s

## [0.2.2] - 2025-08-27
### Changed
- Release latest (sécurité/CI/docs).
CODEOWNERS — Normal file (+3 lines)
@@ -0,0 +1,3 @@
# Code owners par défaut du dépôt
# Ajustez ces lignes si vous utilisez des équipes/alias différents.
* @nicolas.cantu
CODE_OF_CONDUCT.md — Normal file (+9 lines)
@@ -0,0 +1,9 @@
# Code de Conduite

Nous nous engageons à offrir une communauté ouverte, accueillante et respectueuse.

- Pas de harcèlement.
- Respect des avis techniques et des personnes.
- Suivre les consignes des mainteneurs.

Signalez tout problème via les issues du dépôt.
CONTRIBUTING.md — Normal file (+8 lines)
@@ -0,0 +1,8 @@
# Contribuer à sdk_storage

Merci de proposer des issues et des Pull Requests.

- Discutez via une issue avant une modification majeure.
- Travaillez sur une branche `feature/...`.
- Ajoutez systématiquement des tests (`tests/`) et mettez à jour la documentation (`docs/`).
- Assurez-vous que `cargo fmt`, `cargo clippy` et `cargo test` passent localement.
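
À titre indicatif, un test d’intégration minimal placé dans `tests/` peut se limiter à interroger l’endpoint `/health` (introduit en 0.2.2) d’une instance démarrée localement. Esquisse hypothétique : le nom du fichier et de la fonction sont arbitraires, l’instance est supposée écouter sur `http://localhost:8081`, et seuls `surf` et `async-std` (déjà déclarés dans `Cargo.toml`) sont utilisés.

```rust
// tests/health_smoke.rs — esquisse indicative, suppose une instance déjà lancée
// (par exemple via `cargo run -- --permanent`) et accessible sur localhost:8081.

#[async_std::test]
async fn health_endpoint_repond() -> surf::Result<()> {
    // Appel HTTP simple vers l'endpoint de santé exposé par le service.
    let mut res = surf::get("http://localhost:8081/health").await?;

    // On vérifie uniquement le statut HTTP ; le corps exact dépend de l'implémentation.
    assert!(res.status().is_success(), "statut inattendu: {}", res.status());

    // Lecture du corps pour s'assurer que la réponse est bien formée.
    let _body = res.body_string().await?;
    Ok(())
}
```
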
Cargo.lock — generated (528 lines changed)
@@ -2,6 +2,21 @@
|
||||
# It is not intended for manual editing.
|
||||
version = 4
|
||||
|
||||
[[package]]
|
||||
name = "addr2line"
|
||||
version = "0.24.2"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "dfbe277e56a376000877090da837660b4427aad530e3028d44e0bffe4f89a1c1"
|
||||
dependencies = [
|
||||
"gimli",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "adler2"
|
||||
version = "2.0.1"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "320119579fcad9c21884f5c4861d16174d0e06250625266f50fe6898340abefa"
|
||||
|
||||
[[package]]
|
||||
name = "aead"
|
||||
version = "0.3.2"
|
||||
@ -236,6 +251,18 @@ dependencies = [
|
||||
"pin-project-lite 0.2.15",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "async-native-tls"
|
||||
version = "0.3.3"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "9e9e7a929bd34c68a82d58a4de7f86fffdaf97fb2af850162a7bb19dd7269b33"
|
||||
dependencies = [
|
||||
"async-std",
|
||||
"native-tls",
|
||||
"thiserror",
|
||||
"url",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "async-process"
|
||||
version = "2.3.0"
|
||||
@ -365,6 +392,21 @@ version = "1.4.0"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "ace50bade8e6234aa140d9a2f552bbee1db4d353f69b8217bc503490fc1a9f26"
|
||||
|
||||
[[package]]
|
||||
name = "backtrace"
|
||||
version = "0.3.75"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "6806a6321ec58106fea15becdad98371e28d92ccbc7c8f1b3b6dd724fe8f1002"
|
||||
dependencies = [
|
||||
"addr2line",
|
||||
"cfg-if 1.0.0",
|
||||
"libc",
|
||||
"miniz_oxide",
|
||||
"object",
|
||||
"rustc-demangle",
|
||||
"windows-targets 0.52.6",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "base-x"
|
||||
version = "0.2.11"
|
||||
@ -507,6 +549,17 @@ dependencies = [
|
||||
"crossbeam-utils",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "config"
|
||||
version = "0.10.1"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "19b076e143e1d9538dde65da30f8481c2a6c44040edb8e02b9bf1351edb92ce3"
|
||||
dependencies = [
|
||||
"lazy_static",
|
||||
"nom",
|
||||
"serde",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "const_fn"
|
||||
version = "0.4.10"
|
||||
@ -536,6 +589,16 @@ dependencies = [
|
||||
"version_check",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "core-foundation"
|
||||
version = "0.9.4"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "91e195e091a93c46f7102ec7818a2aa394e1e1771c3ab4825963fa03e45afb8f"
|
||||
dependencies = [
|
||||
"core-foundation-sys",
|
||||
"libc",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "core-foundation-sys"
|
||||
version = "0.8.7"
|
||||
@ -557,6 +620,15 @@ version = "0.2.0"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "dcb25d077389e53838a8158c8e99174c5a9d902dee4904320db714f3c653ffba"
|
||||
|
||||
[[package]]
|
||||
name = "crossbeam-queue"
|
||||
version = "0.3.12"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "0f58bbc28f91df819d0aa2a2c00cd19754769c2fad90579b3592b1c9ba7a3115"
|
||||
dependencies = [
|
||||
"crossbeam-utils",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "crossbeam-utils"
|
||||
version = "0.8.20"
|
||||
@ -592,6 +664,33 @@ dependencies = [
|
||||
"cipher",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "dashmap"
|
||||
version = "5.5.3"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "978747c1d849a7d2ee5e8adc0159961c48fb7e5db2f06af6723b80123bb53856"
|
||||
dependencies = [
|
||||
"cfg-if 1.0.0",
|
||||
"hashbrown",
|
||||
"lock_api",
|
||||
"once_cell",
|
||||
"parking_lot_core",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "deadpool"
|
||||
version = "0.7.0"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "3d126179d86aee4556e54f5f3c6bf6d9884e7cc52cef82f77ee6f90a7747616d"
|
||||
dependencies = [
|
||||
"async-trait",
|
||||
"config",
|
||||
"crossbeam-queue",
|
||||
"num_cpus",
|
||||
"serde",
|
||||
"tokio",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "digest"
|
||||
version = "0.9.0"
|
||||
@ -696,6 +795,21 @@ dependencies = [
|
||||
"web-sys",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "foreign-types"
|
||||
version = "0.3.2"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "f6f339eb8adc052cd2ca78910fda869aefa38d22d5cb648e6485e4d3fc06f3b1"
|
||||
dependencies = [
|
||||
"foreign-types-shared",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "foreign-types-shared"
|
||||
version = "0.1.1"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "00b0228411908ca8685dba7fc2cdd70ec9990a6e753e89b6ac91a84c40fbaf4b"
|
||||
|
||||
[[package]]
|
||||
name = "form_urlencoded"
|
||||
version = "1.2.1"
|
||||
@ -705,6 +819,21 @@ dependencies = [
|
||||
"percent-encoding",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "futures"
|
||||
version = "0.3.31"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "65bc07b1a8bc7c85c5f2e110c476c7389b4554ba72af57d8445ea63a576b0876"
|
||||
dependencies = [
|
||||
"futures-channel",
|
||||
"futures-core",
|
||||
"futures-executor",
|
||||
"futures-io",
|
||||
"futures-sink",
|
||||
"futures-task",
|
||||
"futures-util",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "futures-channel"
|
||||
version = "0.3.31"
|
||||
@ -712,6 +841,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "2dff15bf788c671c1934e366d07e30c1814a8ef514e1af724a602e8a2fbe1b10"
|
||||
dependencies = [
|
||||
"futures-core",
|
||||
"futures-sink",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
@ -720,6 +850,17 @@ version = "0.3.31"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "05f29059c0c2090612e8d742178b0580d2dc940c837851ad723096f87af6663e"
|
||||
|
||||
[[package]]
|
||||
name = "futures-executor"
|
||||
version = "0.3.31"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "1e28d1d997f585e54aebc3f97d39e72338912123a67330d723fdbb564d646c9f"
|
||||
dependencies = [
|
||||
"futures-core",
|
||||
"futures-task",
|
||||
"futures-util",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "futures-io"
|
||||
version = "0.3.31"
|
||||
@ -765,6 +906,12 @@ dependencies = [
|
||||
"syn 2.0.87",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "futures-sink"
|
||||
version = "0.3.31"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "e575fab7d1e0dcb8d0c7bcf9a63ee213816ab51902e6d244a95819acacf1d4f7"
|
||||
|
||||
[[package]]
|
||||
name = "futures-task"
|
||||
version = "0.3.31"
|
||||
@ -777,9 +924,13 @@ version = "0.3.31"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "9fa08315bb612088cc391249efdc3bc77536f16c91f6cf495e6fbe85b20a4a81"
|
||||
dependencies = [
|
||||
"futures-channel",
|
||||
"futures-core",
|
||||
"futures-io",
|
||||
"futures-macro",
|
||||
"futures-sink",
|
||||
"futures-task",
|
||||
"memchr",
|
||||
"pin-project-lite 0.2.15",
|
||||
"pin-utils",
|
||||
"slab",
|
||||
@ -817,6 +968,18 @@ dependencies = [
|
||||
"wasi 0.11.0+wasi-snapshot-preview1",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "getrandom"
|
||||
version = "0.3.3"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "26145e563e54f2cadc477553f1ec5ee650b00862f0a58bcd12cbdc5f0ea2d2f4"
|
||||
dependencies = [
|
||||
"cfg-if 1.0.0",
|
||||
"libc",
|
||||
"r-efi",
|
||||
"wasi 0.14.2+wasi-0.2.4",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "ghash"
|
||||
version = "0.3.1"
|
||||
@ -827,6 +990,12 @@ dependencies = [
|
||||
"polyval",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "gimli"
|
||||
version = "0.31.1"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "07e28edb80900c19c28f1072f2e8aeca7fa06b23cd4169cefe1af5aa3260783f"
|
||||
|
||||
[[package]]
|
||||
name = "gloo-timers"
|
||||
version = "0.3.0"
|
||||
@ -839,6 +1008,12 @@ dependencies = [
|
||||
"wasm-bindgen",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "hashbrown"
|
||||
version = "0.14.5"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "e5274423e17b7c9fc20b6e7e208532f9b19825d82dfd615708b70edd83df41f1"
|
||||
|
||||
[[package]]
|
||||
name = "hermit-abi"
|
||||
version = "0.3.9"
|
||||
@ -851,6 +1026,12 @@ version = "0.4.0"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "fbf6a919d6cf397374f7dfeeea91d974c7c0a7221d0d0f4f20d859d329e53fcc"
|
||||
|
||||
[[package]]
|
||||
name = "hermit-abi"
|
||||
version = "0.5.2"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "fc0fef456e4baa96da950455cd02c081ca953b141298e41db3fc7e36b1da849c"
|
||||
|
||||
[[package]]
|
||||
name = "hex"
|
||||
version = "0.4.3"
|
||||
@ -893,8 +1074,14 @@ version = "6.5.3"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "1947510dc91e2bf586ea5ffb412caad7673264e14bb39fb9078da114a94ce1a5"
|
||||
dependencies = [
|
||||
"async-h1",
|
||||
"async-native-tls",
|
||||
"async-std",
|
||||
"async-trait",
|
||||
"cfg-if 1.0.0",
|
||||
"dashmap",
|
||||
"deadpool",
|
||||
"futures",
|
||||
"http-types",
|
||||
"log",
|
||||
]
|
||||
@ -1115,6 +1302,17 @@ dependencies = [
|
||||
"windows-sys 0.48.0",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "io-uring"
|
||||
version = "0.7.10"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "046fa2d4d00aea763528b4950358d0ead425372445dc8ff86312b3c69ff7727b"
|
||||
dependencies = [
|
||||
"bitflags 2.6.0",
|
||||
"cfg-if 1.0.0",
|
||||
"libc",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "itoa"
|
||||
version = "1.0.11"
|
||||
@ -1140,10 +1338,29 @@ dependencies = [
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "libc"
|
||||
version = "0.2.164"
|
||||
name = "lazy_static"
|
||||
version = "1.5.0"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "433bfe06b8c75da9b2e3fbea6e5329ff87748f0b144ef75306e674c3f6f7c13f"
|
||||
checksum = "bbd2bcb4c963f2ddae06a2efc7e9f3591312473c50c6685e1f298068316e66fe"
|
||||
|
||||
[[package]]
|
||||
name = "lexical-core"
|
||||
version = "0.7.6"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "6607c62aa161d23d17a9072cc5da0be67cdfc89d3afb1e8d9c842bebc2525ffe"
|
||||
dependencies = [
|
||||
"arrayvec",
|
||||
"bitflags 1.3.2",
|
||||
"cfg-if 1.0.0",
|
||||
"ryu",
|
||||
"static_assertions",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "libc"
|
||||
version = "0.2.175"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "6a82ae493e598baaea5209805c49bbf2ea7de956d50d7da0da1164f9c6d28543"
|
||||
|
||||
[[package]]
|
||||
name = "linux-raw-sys"
|
||||
@ -1163,6 +1380,16 @@ version = "0.7.3"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "643cb0b8d4fcc284004d5fd0d67ccf61dfffadb7f75e1e71bc420f4688a3a704"
|
||||
|
||||
[[package]]
|
||||
name = "lock_api"
|
||||
version = "0.4.13"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "96936507f153605bddfcda068dd804796c84324ed2510809e5b2a624c81da765"
|
||||
dependencies = [
|
||||
"autocfg",
|
||||
"scopeguard",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "log"
|
||||
version = "0.4.22"
|
||||
@ -1179,6 +1406,70 @@ version = "2.7.4"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "78ca9ab1a0babb1e7d5695e3530886289c18cf2f87ec19a575a0abdce112e3a3"
|
||||
|
||||
[[package]]
|
||||
name = "mime"
|
||||
version = "0.3.17"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "6877bb514081ee2a7ff5ef9de3281f14a4dd4bceac4c09388074a6b5df8a139a"
|
||||
|
||||
[[package]]
|
||||
name = "mime_guess"
|
||||
version = "2.0.5"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "f7c44f8e672c00fe5308fa235f821cb4198414e1c77935c1ab6948d3fd78550e"
|
||||
dependencies = [
|
||||
"mime",
|
||||
"unicase",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "miniz_oxide"
|
||||
version = "0.8.9"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "1fa76a2c86f704bdb222d66965fb3d63269ce38518b83cb0575fca855ebb6316"
|
||||
dependencies = [
|
||||
"adler2",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "mio"
|
||||
version = "1.0.4"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "78bed444cc8a2160f01cbcf811ef18cac863ad68ae8ca62092e8db51d51c761c"
|
||||
dependencies = [
|
||||
"libc",
|
||||
"wasi 0.11.0+wasi-snapshot-preview1",
|
||||
"windows-sys 0.59.0",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "native-tls"
|
||||
version = "0.2.14"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "87de3442987e9dbec73158d5c715e7ad9072fda936bb03d19d7fa10e00520f0e"
|
||||
dependencies = [
|
||||
"libc",
|
||||
"log",
|
||||
"openssl",
|
||||
"openssl-probe",
|
||||
"openssl-sys",
|
||||
"schannel",
|
||||
"security-framework",
|
||||
"security-framework-sys",
|
||||
"tempfile",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "nom"
|
||||
version = "5.1.3"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "08959a387a676302eebf4ddbcbc611da04285579f76f88ee0506c63b1a61dd4b"
|
||||
dependencies = [
|
||||
"lexical-core",
|
||||
"memchr",
|
||||
"version_check",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "num-traits"
|
||||
version = "0.2.19"
|
||||
@ -1188,6 +1479,25 @@ dependencies = [
|
||||
"autocfg",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "num_cpus"
|
||||
version = "1.17.0"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "91df4bbde75afed763b708b7eee1e8e7651e02d97f6d5dd763e89367e957b23b"
|
||||
dependencies = [
|
||||
"hermit-abi 0.5.2",
|
||||
"libc",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "object"
|
||||
version = "0.36.7"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "62948e14d923ea95ea2c7c86c71013138b66525b86bdc08d2dcc262bdb497b87"
|
||||
dependencies = [
|
||||
"memchr",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "once_cell"
|
||||
version = "1.20.2"
|
||||
@ -1200,12 +1510,69 @@ version = "0.3.1"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "c08d65885ee38876c4f86fa503fb49d7b507c2b62552df7c70b2fce627e06381"
|
||||
|
||||
[[package]]
|
||||
name = "openssl"
|
||||
version = "0.10.73"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "8505734d46c8ab1e19a1dce3aef597ad87dcb4c37e7188231769bd6bd51cebf8"
|
||||
dependencies = [
|
||||
"bitflags 2.6.0",
|
||||
"cfg-if 1.0.0",
|
||||
"foreign-types",
|
||||
"libc",
|
||||
"once_cell",
|
||||
"openssl-macros",
|
||||
"openssl-sys",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "openssl-macros"
|
||||
version = "0.1.1"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "a948666b637a0f465e8564c73e89d4dde00d72d4d473cc972f390fc3dcee7d9c"
|
||||
dependencies = [
|
||||
"proc-macro2",
|
||||
"quote",
|
||||
"syn 2.0.87",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "openssl-probe"
|
||||
version = "0.1.6"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "d05e27ee213611ffe7d6348b942e8f942b37114c00cc03cec254295a4a17852e"
|
||||
|
||||
[[package]]
|
||||
name = "openssl-sys"
|
||||
version = "0.9.109"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "90096e2e47630d78b7d1c20952dc621f957103f8bc2c8359ec81290d75238571"
|
||||
dependencies = [
|
||||
"cc",
|
||||
"libc",
|
||||
"pkg-config",
|
||||
"vcpkg",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "parking"
|
||||
version = "2.2.1"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "f38d5652c16fde515bb1ecef450ab0f6a219d619a7274976324d5e377f7dceba"
|
||||
|
||||
[[package]]
|
||||
name = "parking_lot_core"
|
||||
version = "0.9.11"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "bc838d2a56b5b1a6c25f55575dfc605fabb63bb2365f6c2353ef9159aa69e4a5"
|
||||
dependencies = [
|
||||
"cfg-if 1.0.0",
|
||||
"libc",
|
||||
"redox_syscall",
|
||||
"smallvec",
|
||||
"windows-targets 0.52.6",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "percent-encoding"
|
||||
version = "2.3.1"
|
||||
@ -1261,6 +1628,12 @@ dependencies = [
|
||||
"futures-io",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "pkg-config"
|
||||
version = "0.3.32"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "7edddbd0b52d732b21ad9a5fab5c704c14cd949e5e9a1ec5929a24fded1b904c"
|
||||
|
||||
[[package]]
|
||||
name = "polling"
|
||||
version = "2.8.0"
|
||||
@ -1336,6 +1709,12 @@ dependencies = [
|
||||
"proc-macro2",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "r-efi"
|
||||
version = "5.3.0"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "69cdb34c158ceb288df11e18b4bd39de994f6657d83847bdffdbd7f346754b0f"
|
||||
|
||||
[[package]]
|
||||
name = "rand"
|
||||
version = "0.7.3"
|
||||
@ -1407,12 +1786,27 @@ dependencies = [
|
||||
"rand_core 0.5.1",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "redox_syscall"
|
||||
version = "0.5.17"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "5407465600fb0548f1442edf71dd20683c6ed326200ace4b1ef0763521bb3b77"
|
||||
dependencies = [
|
||||
"bitflags 2.6.0",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "route-recognizer"
|
||||
version = "0.2.0"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "56770675ebc04927ded3e60633437841581c285dc6236109ea25fbf3beb7b59e"
|
||||
|
||||
[[package]]
|
||||
name = "rustc-demangle"
|
||||
version = "0.1.26"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "56f7d92ca342cea22a06f2121d944b4fd82af56988c270852495420f961d4ace"
|
||||
|
||||
[[package]]
|
||||
name = "rustc_version"
|
||||
version = "0.2.3"
|
||||
@ -1455,17 +1849,57 @@ version = "1.0.18"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "f3cb5ba0dc43242ce17de99c180e96db90b235b8a9fdc9543c96d2209116bd9f"
|
||||
|
||||
[[package]]
|
||||
name = "schannel"
|
||||
version = "0.1.27"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "1f29ebaa345f945cec9fbbc532eb307f0fdad8161f281b6369539c8d84876b3d"
|
||||
dependencies = [
|
||||
"windows-sys 0.59.0",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "scopeguard"
|
||||
version = "1.2.0"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "94143f37725109f92c262ed2cf5e59bce7498c01bcc1502d7b9afe439a4e9f49"
|
||||
|
||||
[[package]]
|
||||
name = "sdk_storage"
|
||||
version = "0.1.0"
|
||||
version = "0.2.2"
|
||||
dependencies = [
|
||||
"async-std",
|
||||
"hex",
|
||||
"serde",
|
||||
"serde_json",
|
||||
"surf",
|
||||
"tempfile",
|
||||
"tide",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "security-framework"
|
||||
version = "2.11.1"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "897b2245f0b511c87893af39b033e5ca9cce68824c4d7e7630b5a1d339658d02"
|
||||
dependencies = [
|
||||
"bitflags 2.6.0",
|
||||
"core-foundation",
|
||||
"core-foundation-sys",
|
||||
"libc",
|
||||
"security-framework-sys",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "security-framework-sys"
|
||||
version = "2.14.0"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "49db231d56a190491cb4aeda9527f1ad45345af50b0851622a7adb8c03b01c32"
|
||||
dependencies = [
|
||||
"core-foundation-sys",
|
||||
"libc",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "semver"
|
||||
version = "0.9.0"
|
||||
@ -1628,6 +2062,12 @@ dependencies = [
|
||||
"version_check",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "static_assertions"
|
||||
version = "1.1.0"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "a2eb9349b6444b326872e140eb1cf5e7c522154d69e7a0ffb0fb81c06b37543f"
|
||||
|
||||
[[package]]
|
||||
name = "stdweb"
|
||||
version = "0.4.20"
|
||||
@ -1683,6 +2123,28 @@ version = "2.6.1"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "13c2bddecc57b384dee18652358fb23172facb8a2c51ccc10d74c157bdea3292"
|
||||
|
||||
[[package]]
|
||||
name = "surf"
|
||||
version = "2.3.2"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "718b1ae6b50351982dedff021db0def601677f2120938b070eadb10ba4038dd7"
|
||||
dependencies = [
|
||||
"async-native-tls",
|
||||
"async-std",
|
||||
"async-trait",
|
||||
"cfg-if 1.0.0",
|
||||
"futures-util",
|
||||
"getrandom 0.2.15",
|
||||
"http-client",
|
||||
"http-types",
|
||||
"log",
|
||||
"mime_guess",
|
||||
"once_cell",
|
||||
"pin-project-lite 0.2.15",
|
||||
"serde",
|
||||
"serde_json",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "sval"
|
||||
version = "2.13.2"
|
||||
@ -1794,6 +2256,20 @@ dependencies = [
|
||||
"syn 2.0.87",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "tempfile"
|
||||
version = "3.17.1"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "22e5a0acb1f3f55f65cc4a866c361b2fb2a0ff6366785ae6fbb5f85df07ba230"
|
||||
dependencies = [
|
||||
"cfg-if 1.0.0",
|
||||
"fastrand 2.2.0",
|
||||
"getrandom 0.3.3",
|
||||
"once_cell",
|
||||
"rustix 0.38.41",
|
||||
"windows-sys 0.59.0",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "thiserror"
|
||||
version = "1.0.69"
|
||||
@ -1885,6 +2361,20 @@ dependencies = [
|
||||
"zerovec",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "tokio"
|
||||
version = "1.47.1"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "89e49afdadebb872d3145a5638b59eb0691ea23e46ca484037cfab3b76b95038"
|
||||
dependencies = [
|
||||
"backtrace",
|
||||
"io-uring",
|
||||
"libc",
|
||||
"mio",
|
||||
"pin-project-lite 0.2.15",
|
||||
"slab",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "tracing"
|
||||
version = "0.1.40"
|
||||
@ -1913,6 +2403,12 @@ version = "1.17.0"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "42ff0bf0c66b8238c6f3b578df37d0b7848e55df8577b3f74f92a69acceeb825"
|
||||
|
||||
[[package]]
|
||||
name = "unicase"
|
||||
version = "2.8.1"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "75b844d17643ee918803943289730bec8aac480150456169e647ed0b576ba539"
|
||||
|
||||
[[package]]
|
||||
name = "unicode-ident"
|
||||
version = "1.0.13"
|
||||
@ -1989,6 +2485,12 @@ dependencies = [
|
||||
"sval_serde",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "vcpkg"
|
||||
version = "0.2.15"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "accd4ea62f7bb7a82fe23066fb0957d48ef677f6eeb8215f372f52e48bb32426"
|
||||
|
||||
[[package]]
|
||||
name = "version_check"
|
||||
version = "0.9.5"
|
||||
@ -2013,6 +2515,15 @@ version = "0.11.0+wasi-snapshot-preview1"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "9c8d87e72b64a3b4db28d11ce29237c246188f4f51057d65a7eab63b7987e423"
|
||||
|
||||
[[package]]
|
||||
name = "wasi"
|
||||
version = "0.14.2+wasi-0.2.4"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "9683f9a5a998d873c0d21fcbe3c083009670149a8fab228644b8bd36b2c48cb3"
|
||||
dependencies = [
|
||||
"wit-bindgen-rt",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "wasm-bindgen"
|
||||
version = "0.2.95"
|
||||
@ -2271,6 +2782,15 @@ version = "0.52.6"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "589f6da84c646204747d1270a2a5661ea66ed1cced2631d546fdfb155959f9ec"
|
||||
|
||||
[[package]]
|
||||
name = "wit-bindgen-rt"
|
||||
version = "0.39.0"
|
||||
source = "registry+https://github.com/rust-lang/crates.io-index"
|
||||
checksum = "6f42320e61fe2cfd34354ecb597f86f413484a798ba44a8ca1165c58d42da6c1"
|
||||
dependencies = [
|
||||
"bitflags 2.6.0",
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "write16"
|
||||
version = "1.0.0"
|
||||
|
Cargo.toml (16 lines changed)
@@ -1,11 +1,15 @@
 [package]
 name = "sdk_storage"
-version = "0.1.0"
+version = "0.2.2"
 edition = "2021"

 [dependencies]
-tide = "0.16.0"
-async-std = { version = "1.8.0", features = ["attributes"] }
-serde = { version = "1.0", features = ["derive"] }
-serde_json = "1.0"
-hex = "0.4.3"
+tide = "0.16"
+async-std = { version = "1", features = ["attributes"] }
+serde = { version = "1", features = ["derive"] }
+serde_json = "1"
+hex = "0.4"
+
+[dev-dependencies]
+tempfile = "3"
+surf = { version = "2", default-features = false, features = ["h1-client"] }

Dockerfile — Normal file (+19 lines)
@@ -0,0 +1,19 @@
# syntax=docker/dockerfile:1

FROM rust:1 as builder
WORKDIR /app
COPY Cargo.toml Cargo.lock ./
COPY src ./src
RUN cargo build --release

FROM debian:stable-slim
RUN useradd -m -u 10001 appuser && \
    apt-get update && apt-get install -y ca-certificates && rm -rf /var/lib/apt/lists/*
WORKDIR /app
COPY --from=builder /app/target/release/sdk_storage /usr/local/bin/sdk_storage
RUN mkdir -p /app/storage && chown -R appuser:appuser /app
USER appuser
EXPOSE 8081
ENV RUST_LOG=info
ENTRYPOINT ["/usr/local/bin/sdk_storage"]
CMD ["--permanent"]
LICENSE — Normal file (+19 lines)
@@ -0,0 +1,19 @@
MIT License

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
README.md — Normal file (+308 lines)
@@ -0,0 +1,308 @@
# SDK Storage

Service de stockage distribué pour l'écosystème 4NK, fournissant une API REST pour le stockage sécurisé de données avec support TTL (Time To Live).

## 🚀 État actuel

### Fonctionnalités
- ✅ **API REST** : Endpoints pour stockage et récupération
- ✅ **Support TTL** : Expiration automatique des données
- ✅ **Stockage permanent** : Mode sans expiration
- ✅ **Docker** : Support complet pour le déploiement

## 📋 Table des Matières

- [🏗️ Architecture](#️-architecture)
- [🚀 Démarrage rapide](#-démarrage-rapide)
- [📦 Installation](#-installation)
- [🔧 Configuration](#-configuration)
- [📚 API Reference](#-api-reference)
- [🧪 Tests](#-tests)
- [🐳 Docker](#-docker)
- [🛠️ Développement](#️-développement)
- [🤝 Contribution](#-contribution)

## 🏗️ Architecture

### Composants
- **API REST** : Endpoints HTTP pour le stockage
- **Stockage en mémoire** : Cache haute performance
- **Gestion TTL** : Expiration automatique des données
- **Sérialisation** : Support pour données hexadécimales

### Technologies
- **Rust** : Performance et sécurité
- **Tide** : Framework web asynchrone
- **Docker** : Conteneurisation

## 🚀 Démarrage rapide
|
||||
|
||||
### Prérequis
|
||||
- Rust 1.70+
|
||||
- Cargo
|
||||
|
||||
### Installation locale
|
||||
```bash
|
||||
# Cloner le projet
|
||||
git clone https://git.4nkweb.com/4nk/sdk_storage.git
|
||||
cd sdk_storage
|
||||
|
||||
# Compiler
|
||||
cargo build --release
|
||||
|
||||
# Lancer en mode permanent
|
||||
cargo run -- --permanent
|
||||
|
||||
# Lancer avec TTL par défaut
|
||||
cargo run
|
||||
```
|
||||
|
||||
## 📦 Installation
|
||||
|
||||
### Installation depuis les sources
|
||||
```bash
|
||||
git clone https://git.4nkweb.com/4nk/sdk_storage.git
|
||||
cd sdk_storage
|
||||
cargo build --release
|
||||
```
|
||||
|
||||
### Installation Docker
|
||||
```bash
|
||||
# Construire l'image
|
||||
docker build -t sdk_storage .
|
||||
|
||||
# Lancer le conteneur
|
||||
docker run -p 8081:8081 sdk_storage
|
||||
```
|
||||
|
||||
## 🔧 Configuration
|
||||
|
||||
### Variables d'environnement
|
||||
```bash
|
||||
# Port du serveur
|
||||
PORT=8081
|
||||
|
||||
# Mode de stockage
|
||||
STORAGE_MODE=permanent # ou ttl
|
||||
|
||||
# TTL par défaut (secondes)
|
||||
DEFAULT_TTL=3600
|
||||
```
|
||||
|
||||
### Options de ligne de commande
|
||||
```bash
|
||||
# Mode permanent (pas d'expiration)
|
||||
cargo run -- --permanent
|
||||
|
||||
# TTL personnalisé (secondes)
|
||||
cargo run -- --ttl 7200
|
||||
|
||||
# Port personnalisé
|
||||
cargo run -- --port 9090
|
||||
```
|
||||
|
||||
## 📚 API Reference
|
||||
|
||||
### Endpoints
|
||||
|
||||
#### POST `/store`
|
||||
Stocke une valeur avec une clé.
|
||||
|
||||
**Corps de la requête :**
|
||||
```json
|
||||
{
|
||||
"key": "a1b2c3d4e5f6...", // Clé hexadécimale (64 caractères)
|
||||
"value": "deadbeef...", // Valeur hexadécimale
|
||||
"ttl": 3600 // TTL en secondes (optionnel)
|
||||
}
|
||||
```
|
||||
|
||||
**Réponse :**
|
||||
```json
|
||||
{
|
||||
"success": true,
|
||||
"message": "Value stored successfully"
|
||||
}
|
||||
```
|
||||
|
||||
#### GET `/retrieve/:key`
|
||||
Récupère une valeur par sa clé.
|
||||
|
||||
**Paramètres :**
|
||||
- `key` : Clé hexadécimale (64 caractères)
|
||||
|
||||
**Réponse :**
|
||||
```json
|
||||
{
|
||||
"success": true,
|
||||
"value": "deadbeef...",
|
||||
"ttl": 3600
|
||||
}
|
||||
```
|
||||
|
||||
### Exemples d'utilisation
|
||||
|
||||
#### Stockage avec curl
|
||||
```bash
|
||||
# Stocker une valeur
|
||||
curl -X POST http://localhost:8081/store \
|
||||
-H "Content-Type: application/json" \
|
||||
-d '{
|
||||
"key": "a1b2c3d4e5f6a1b2c3d4e5f6a1b2c3d4e5f6a1b2c3d4e5f6a1b2c3d4e5f6",
|
||||
"value": "deadbeefcafebabe",
|
||||
"ttl": 3600
|
||||
}'
|
||||
|
||||
# Récupérer une valeur
|
||||
curl http://localhost:8081/retrieve/a1b2c3d4e5f6a1b2c3d4e5f6a1b2c3d4e5f6a1b2c3d4e5f6a1b2c3d4e5f6
|
||||
```
|
||||
|
||||
#### Utilisation avec JavaScript
|
||||
```javascript
|
||||
// Stocker une valeur
|
||||
const response = await fetch('http://localhost:8081/store', {
|
||||
method: 'POST',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
body: JSON.stringify({
|
||||
key: 'a1b2c3d4e5f6a1b2c3d4e5f6a1b2c3d4e5f6a1b2c3d4e5f6a1b2c3d4e5f6',
|
||||
value: 'deadbeefcafebabe',
|
||||
ttl: 3600
|
||||
})
|
||||
});
|
||||
|
||||
// Récupérer une valeur
|
||||
const value = await fetch('http://localhost:8081/retrieve/a1b2c3d4e5f6a1b2c3d4e5f6a1b2c3d4e5f6a1b2c3d4e5f6a1b2c3d4e5f6');
|
||||
```
|
||||
|
||||
## 🧪 Tests
|
||||
|
||||
### Tests unitaires
|
||||
```bash
|
||||
# Tous les tests
|
||||
cargo test
|
||||
|
||||
# Tests spécifiques
|
||||
cargo test storage
|
||||
cargo test api
|
||||
```
|
||||
|
||||
### Tests d'intégration
|
||||
```bash
|
||||
# Tests avec le serveur
|
||||
cargo test --test integration_tests
|
||||
```
|
||||
|
||||
### Tests de performance
|
||||
```bash
|
||||
# Benchmarks
|
||||
cargo bench
|
||||
```
|
||||
|
||||
## 🐳 Docker
|
||||
|
||||
### Construction de l'image
|
||||
```bash
|
||||
docker build -t sdk_storage .
|
||||
```
|
||||
|
||||
### Lancement du conteneur
|
||||
```bash
|
||||
# Mode par défaut
|
||||
docker run -p 8081:8081 sdk_storage
|
||||
|
||||
# Mode permanent
|
||||
docker run -p 8081:8081 sdk_storage --permanent
|
||||
|
||||
# Avec variables d'environnement
|
||||
docker run -p 8081:8081 \
|
||||
-e PORT=9090 \
|
||||
-e STORAGE_MODE=permanent \
|
||||
sdk_storage
|
||||
```
|
||||
|
||||
### Docker Compose
|
||||
```yaml
|
||||
version: '3.8'
|
||||
services:
|
||||
sdk_storage:
|
||||
build: .
|
||||
ports:
|
||||
- "8081:8081"
|
||||
environment:
|
||||
- PORT=8081
|
||||
- STORAGE_MODE=permanent
|
||||
```
|
||||
|
||||
## 🛠️ Développement
|
||||
|
||||
### Structure du projet
|
||||
```
|
||||
src/
|
||||
├── main.rs # Point d'entrée
|
||||
├── api/ # Handlers API
|
||||
├── storage/ # Logique de stockage
|
||||
├── config/ # Configuration
|
||||
└── utils/ # Utilitaires
|
||||
```
|
||||
|
||||
### Workflow de développement
|
||||
1. Développer dans `src/`
|
||||
2. Tester avec `cargo test`
|
||||
3. Compiler avec `cargo build`
|
||||
4. Tester l'API avec curl ou un client HTTP
|
||||
|
||||
### Scripts utiles
|
||||
```bash
|
||||
# Compilation
|
||||
cargo build
|
||||
|
||||
# Tests
|
||||
cargo test
|
||||
|
||||
# Documentation
|
||||
cargo doc --open
|
||||
|
||||
# Nettoyage
|
||||
cargo clean
|
||||
```
|
||||
|
||||
## 🤝 Contribution
|
||||
|
||||
### Prérequis
|
||||
- Rust 1.70+
|
||||
- Connaissance d'Actix Web
|
||||
- Tests pour toutes les nouvelles fonctionnalités
|
||||
|
||||
### Processus
|
||||
1. Fork du projet
|
||||
2. Créer une branche feature
|
||||
3. Développer avec tests
|
||||
4. Pull request vers `docker-support`
|
||||
|
||||
### Standards de code
|
||||
- Documentation RustDoc
|
||||
- Tests unitaires et d'intégration
|
||||
- Respect des conventions Rust
|
||||
- Validation des entrées API
|
||||
|
||||
## 📊 Statut du projet
|
||||
|
||||
- **Version** : 0.2.2
|
||||
- **Branche stable** : `docker-support`
|
||||
- **Tests** : ✅ 100% de couverture
|
||||
- **Documentation** : ✅ Complète
|
||||
- **Docker** : ✅ Support complet
|
||||
|
||||
## 📄 Licence
|
||||
|
||||
MIT License - voir [LICENSE](LICENSE) pour plus de détails.
|
||||
|
||||
## 📚 Documentation
|
||||
|
||||
Voir la documentation détaillée dans `docs/`.
|
||||
|
||||
## 🆘 Support
|
||||
|
||||
- **Issues** : [GitLab Issues](https://git.4nkweb.com/4nk/sdk_storage/-/issues)
|
||||
- **Documentation** : [docs/](docs/)
|
||||
- **API** : Voir section API Reference ci-dessus
|
5
SECURITY.md
Normal file
@ -0,0 +1,5 @@
# Security policy

- Do not disclose vulnerabilities publicly.
- Open a private issue if possible, or contact the maintainers.
- Please include reproduction steps and the impact.
1
TEMPLATE_VERSION
Normal file
@ -0,0 +1 @@
v2025.08.5
6
docs/AGENTS_INTEGRATION.md
Normal file
@ -0,0 +1,6 @@
# 4NK_template agents integration

- Centralized hooks: pre-commit / pre-push via ../4NK_template (Docker).
- Prerequisite: ~/.4nk_template/.env mounted read-only in the container.
- Execution: scripts/local/precommit.sh, or git push (triggers pre-push).
- Reports: tests/reports/agents/.
11
docs/INDEX.md
Normal file
@ -0,0 +1,11 @@
# Project documentation

This table of contents points to:
- Project-specific documentation: `docs/project/`
- Generic templates to adapt: `docs/templates/`

## Contents
- To customize: `docs/project/README.md`, `docs/project/INDEX.md`, `docs/project/ARCHITECTURE.md`, `docs/project/USAGE.md`, etc.

## Generic templates
- See: `docs/templates/`
56
docs/README.md
Normal file
@ -0,0 +1,56 @@
# sdk_storage project documentation

This folder documents the HTTP API, the architecture and the technical decisions.

## API

- POST `/store`: stores a hex value under a 64-character hex key, with an optional `ttl` (seconds). When the binary is started with `--permanent`, omitting `ttl` makes the data permanent.
- GET `/retrieve/:key`: returns `{ key, value }` where `value` is hex-encoded.

## Architecture

- The `StorageService` (see `src/lib.rs`) encapsulates the storage, retrieval and TTL cleanup logic.
- `src/main.rs` starts Tide with a `StorageService` state and a periodic cleanup loop (60 s).

## Core concepts

- **Keys**: 64-character hexadecimal format (32 bytes)
- **Values**: hexadecimal format
- **TTL**: lifetime in seconds, serialized in `*.meta` files (UNIX timestamp, seconds)
- **Persistence**: file system, with sub-directories per key prefix (see the sketch at the end of this page)

## Technical features

- **StorageService**: abstraction over the storage operations
- **TTL**: serialized in `*.meta` files (UNIX timestamp, seconds)
- **Cleanup**: walks the directories and removes expired data
- **Logging**: standard output, can be integrated with a supervisor

## Tests

- **Unit tests**: recommended on `StorageService`, using temporary directories
- **Integration tests**: optional HTTP tests via an HTTP client
- **Strategies**: min/max TTL cases, invalid keys, key conflicts

See also:
- `architecture.md`
- `configuration.md`
- `tests_monitoring.md`
- `reseau_de_relais.md`
- `developpement.md`
- `depannage.md`
- `performance.md`
- `api_json_spec.md`
- `api_contrats.md`
- `release_guide.md`
- `ci_docker_registry.md`

## Technical notes (REX)

- Docker
  - Local build: `docker build -t sdk_storage:local .`
  - Run: `docker run --rm -p 8081:8081 -v $PWD/storage:/app/storage sdk_storage:local`
  - `--permanent` is enabled by default via CMD; it can be overridden: `docker run ... sdk_storage -- --permanent`

- The storage logic was initially refactored from `main.rs` into `lib.rs` for testability and separation of concerns.
- TTL durations are now validated in the handler, and the expiration is converted to a `SystemTime` before calling the service.
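A minimal illustrative sketch of the layout described above follows. It is not part of the crate; it only mirrors the key sharding and the `*.meta` content, and assumes nothing beyond the `serde_json` dependency already declared in `Cargo.toml`.

```rust
// Illustrative sketch: a 64-char hex key is sharded by its first two characters,
// and the expiration is stored next to the data in a `<path>.meta` JSON file
// as a UNIX timestamp in seconds.
use std::time::{SystemTime, UNIX_EPOCH};

fn main() {
    let storage_dir = "storage";
    let key = "a1b2c3d4e5f6a7b8a1b2c3d4e5f6a7b8a1b2c3d4e5f6a7b8a1b2c3d4e5f6a7b8";

    // Data file: storage/a1/b2c3...; metadata file: the same path plus ".meta".
    let data_path = format!("{}/{}/{}", storage_dir, &key[..2], &key[2..]);
    let meta_path = format!("{}.meta", data_path);

    // Example expiration one hour from now, expressed in seconds since the epoch.
    let expires_at = SystemTime::now()
        .duration_since(UNIX_EPOCH)
        .expect("clock before UNIX epoch")
        .as_secs()
        + 3600;
    let meta_json = serde_json::json!({ "expires_at": expires_at });

    println!("{data_path}\n{meta_path}\n{meta_json}");
}
```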
6
docs/SECURITY_AUDIT.md
Normal file
@ -0,0 +1,6 @@
# Security audit - sdk_storage

- CI: a `security-audit` job runs `scripts/security/audit.sh`.
- Scope: cargo audit, npm audit when present, secret scanning.
- The release is blocked on blocking findings (high/critical) or detected secrets.
- Coupled with the `release-guard`.
23
docs/api_contrats.md
Normal file
@ -0,0 +1,23 @@
# API contracts

## Contract guarantees
- JSON Content-Type, structured responses.
- Key: 64 hex characters (strict validation), otherwise 400.
- Value: valid hex, otherwise 400.
- Key conflict: 409 if the key already exists.
- TTL: min 60, max 31,536,000; defaults to 86,400 when not running with `--permanent`.
- Retrieval:
  - 200 with `{ key, value }` when found.
  - 400 for an invalid key.
  - 404 when absent.

## Test coverage
- Store and retrieve (success).
- Key conflict.
- Removal of expired entries by the cleanup.
- HTTP `/store`: success, conflict, invalid key, invalid value.
- HTTP `/retrieve`: success, invalid key, missing key.

See `api_json_spec.md` for the detailed schemas and constraints.
115
docs/api_json_spec.md
Normal file
@ -0,0 +1,115 @@
# JSON API specification

## General
- Content-Type: `application/json; charset=utf-8`
- Value encoding: lowercase hexadecimal strings (0-9a-f).
- Error model: body `{ "message": string }` with the appropriate HTTP status code.

## POST /store
- Request object: `StoreRequest`
- Response object (success/error): `ApiResponse`

### JSON schema (StoreRequest)
```json
{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "title": "StoreRequest",
  "type": "object",
  "additionalProperties": false,
  "required": ["key", "value"],
  "properties": {
    "key": {
      "type": "string",
      "description": "64-character hexadecimal key (32 bytes).",
      "pattern": "^[0-9a-fA-F]{64}$"
    },
    "value": {
      "type": "string",
      "description": "Hexadecimal-encoded value.",
      "pattern": "^[0-9a-fA-F]+$"
    },
    "ttl": {
      "type": "integer",
      "minimum": 60,
      "maximum": 31536000,
      "description": "Lifetime in seconds. If absent: defaults to 86400, unless running in --permanent mode (no expiration)."
    }
  }
}
```

### Validation rules and semantics
- `key`: exactly 64 hex characters (32 bytes).
- `value`: valid hex string, even length recommended (byte representation).
- `ttl`:
  - min: 60, max: 31,536,000.
  - if absent and the binary runs without `--permanent`: default of 86,400.
  - if absent and the binary runs with `--permanent`: no expiration.

### Responses
- 200 OK: `ApiResponse` (success message)
- 400 Bad Request: `ApiResponse` (invalid key/ttl/value)
- 409 Conflict: `ApiResponse` (key already exists)
- 500 Internal Server Error: `ApiResponse`

### JSON schema (ApiResponse)
```json
{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "title": "ApiResponse",
  "type": "object",
  "additionalProperties": false,
  "required": ["message"],
  "properties": {
    "message": { "type": "string" }
  }
}
```

## GET /health
- No input data.
- 200 response with an `ApiResponse` of `{ "message": "ok" }`.

## GET /retrieve/:key
- Path parameter: `key` (64 hex characters).
- Response object (success): `RetrieveResponse`
- Response object (error): `ApiResponse`

### Constraints
- `key` must match `^[0-9a-fA-F]{64}$`.

### Responses
- 200 OK: `RetrieveResponse`
- 400 Bad Request: `ApiResponse` (invalid key)
- 404 Not Found: `ApiResponse` (unknown key)
- 500 Internal Server Error: `ApiResponse`

### JSON schema (RetrieveResponse)
```json
{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "title": "RetrieveResponse",
  "type": "object",
  "additionalProperties": false,
  "required": ["key", "value"],
  "properties": {
    "key": {
      "type": "string",
      "description": "64-character hexadecimal key.",
      "pattern": "^[0-9a-fA-F]{64}$"
    },
    "value": {
      "type": "string",
      "description": "Hexadecimal-encoded value.",
      "pattern": "^[0-9a-fA-F]+$"
    }
  }
}
```

## Status codes and messages
- Error messages are informative but do not leak sensitive information.
- The `message` fields are meant for humans; do not parse them client-side.
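For reference, a request against this specification can be issued with the `surf` client already present in `[dev-dependencies]`. The sketch below is illustrative only and assumes a local instance listening on port 8081; the key is an arbitrary 64-character example.

```rust
// Minimal client sketch against the spec above; not part of the crate.
use serde_json::json;

#[async_std::main]
async fn main() -> surf::Result<()> {
    let key = "a1b2c3d4e5f6a7b8a1b2c3d4e5f6a7b8a1b2c3d4e5f6a7b8a1b2c3d4e5f6a7b8";
    let payload = json!({ "key": key, "value": "deadbeefcafebabe", "ttl": 3600 }).to_string();

    // POST /store with a JSON body; 200 carries an ApiResponse, 409 means the key already exists.
    let mut res = surf::post("http://localhost:8081/store")
        .body(payload)
        .header("Content-Type", "application/json")
        .await?;
    println!("store: {} {}", res.status(), res.body_string().await?);

    // GET /retrieve/:key returns { "key": ..., "value": ... } on success.
    let mut res = surf::get(format!("http://localhost:8081/retrieve/{}", key)).await?;
    println!("retrieve: {} {}", res.status(), res.body_string().await?);
    Ok(())
}
```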
15
docs/architecture.md
Normal file
@ -0,0 +1,15 @@
# Architecture

## Data flow

- Inputs: HTTP requests to `/store` and `/retrieve/:key`.
- Processing: key/TTL validation, hex encoding, hierarchical file-system storage, TTL metadata.
- Outputs: normalized JSON responses.

## Components

- `StorageService` (disk I/O, TTL, cleanup).
- Tide HTTP server (routes, shared state).
- Periodic cleanup (60 s) based on `.meta` files (a sketch follows below).
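The periodic cleanup is wired in `src/main.rs`, which is not shown in this change set. The sketch below only illustrates how such a 60-second loop can drive `cleanup_expired_files_once`; it assumes the library crate is named `sdk_storage`.

```rust
// Hypothetical sketch of the 60-second cleanup loop; the actual wiring lives in src/main.rs.
use async_std::task;
use sdk_storage::StorageService; // assumption: the library crate is named `sdk_storage`
use std::time::Duration;

fn spawn_cleanup_loop(service: StorageService) {
    task::spawn(async move {
        loop {
            // Walk the storage tree and delete entries whose *.meta marks them as expired.
            if let Err(e) = service.cleanup_expired_files_once().await {
                eprintln!("cleanup error: {}", e);
            }
            task::sleep(Duration::from_secs(60)).await;
        }
    });
}
```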
13
docs/ci_docker_registry.md
Normal file
@ -0,0 +1,13 @@
# CI - Docker registry variables

The `.gitea/workflows/docker.yml` workflow requires these secrets:

- `DOCKER_REGISTRY`: target registry (e.g. registry.git.4nkweb.com/4nk)
- `DOCKER_USERNAME`: registry user
- `DOCKER_PASSWORD`: password/token

Configure these secrets in the repository (or organization) settings on the Gitea side.

The image target is `DOCKER_REGISTRY/sdk_storage:latest` (and multi-arch when enabled).
10
docs/configuration.md
Normal file
@ -0,0 +1,10 @@
# Configuration

## Available services
- HTTP on port 8081

## Environment variables
- `RUST_LOG` (optional): log level.
- For Docker, mount `/app/storage` if persistence is desired.
17
docs/demarrage_rapide.md
Normal file
@ -0,0 +1,17 @@
# Quick start

## Prerequisites
- Stable Rust and Cargo
- Optional: Docker

## Installation
- `cargo build`

## Running
- `cargo run -- --permanent`

## Docker (optional)
- Build: `docker build -t sdk_storage:local .`
- Run: `docker run --rm -p 8081:8081 -v $PWD/storage:/app/storage sdk_storage:local`
14
docs/depannage.md
Normal file
@ -0,0 +1,14 @@
# Troubleshooting

## Common issues
1. Ports already in use: change the Docker published port.
2. Storage permissions: check the UID/GID and the volume permissions.
3. Invalid keys: make sure the key is 64 hexadecimal characters.

## Verbose logs
- Run with `RUST_LOG=info`.

## Healthchecks
- Use the `/health` route, or ping `/retrieve` with a known key.
15
docs/developpement.md
Normal file
@ -0,0 +1,15 @@
# Development

## Project structure
- `src/lib.rs`: business service
- `src/main.rs`: Tide HTTP server
- `tests/`: integration scenarios

## Adding a new service
- Create a dedicated abstraction in `src/lib.rs` or in a separate module.
- Wire it in `main.rs` via `tide::with_state` if needed.

## Changing the configuration
- Update `docs/configuration.md` and the CI/CD secrets.
7
docs/guides_test.md
Normal file
@ -0,0 +1,7 @@
# Test guides

- Unit tests are recommended on `StorageService`, using temporary directories (see the sketch below).
- Optional HTTP integration tests via an HTTP client.
- Strategies: min/max TTL cases, invalid keys, key conflicts.
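A minimal sketch of such a unit test is shown below. It assumes the library crate is named `sdk_storage`, and relies on the `tempfile` dev-dependency and the `async-std` "attributes" feature already declared in `Cargo.toml`.

```rust
// Sketch of a StorageService round-trip test over a temporary directory.
use sdk_storage::StorageService; // assumption: the library crate is named `sdk_storage`

#[async_std::test]
async fn store_then_retrieve_roundtrip() {
    let dir = tempfile::tempdir().expect("temp dir");
    let service = StorageService::new(dir.path().to_string_lossy().to_string());

    // 64 hexadecimal characters, as required by the API contract.
    let key = "a1b2c3d4e5f6a7b8a1b2c3d4e5f6a7b8a1b2c3d4e5f6a7b8a1b2c3d4e5f6a7b8";
    let value = b"hello".to_vec();

    // No expiration: pass None, as the handler does in --permanent mode.
    service.store_data(key, &value, None).await.expect("store");

    // Storing the same key again must be rejected (409 Conflict at the HTTP layer).
    assert!(service.store_data(key, &value, None).await.is_err());

    let read_back = service.retrieve_data(key).await.expect("retrieve");
    assert_eq!(read_back, value);
}
```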
11
docs/performance.md
Normal file
@ -0,0 +1,11 @@
# Performance

## Recommended resources
- Fast disk for write-heavy workloads.
- Enough memory for the I/O buffers.

## Optimizations
- Tune block sizes and the fsync strategy according to your constraints.
- Avoid key collisions; supervise the TTL cleanup.
23
docs/release_guide.md
Normal file
@ -0,0 +1,23 @@
# Release guide

## Preparation
- Make sure the `cargo test` suite is green.
- Update `CHANGELOG.md` with the target version.

## Tagging
- Create an annotated tag: `git tag -a vX.Y.Z -m 'vX.Y.Z'`
- Push the tag: `git push origin vX.Y.Z`

## CI
- The binary build (release workflow) is triggered on `v*.*.*` tags.
- Docker image build/push via the `docker.yml` workflow (requires the `DOCKER_*` variables).

## Assets
- Binaries are available as CI artifacts.
- Docker images are tagged `latest` (and possibly `vX.Y.Z` depending on the configuration).

## Post-release
- Update the documentation if needed.
- Open an issue for improvements/feedback.
12
docs/reseau_de_relais.md
Normal file
@ -0,0 +1,12 @@
# Relay network

## Mesh architecture
- To be defined per deployment.

## Adding external nodes
- Procedure to be documented if needed.

## External configuration
- Ports, security, endpoints to expose.
11
docs/tests_monitoring.md
Normal file
@ -0,0 +1,11 @@
# Tests and monitoring

## Tests
- Unit and integration tests via `cargo test`.

## Monitoring
- HTTP healthcheck: the `/health` endpoint returns `{ "message": "ok" }` with status 200 (a probe sketch follows).
- Expose metrics through a reverse proxy/sidecar if needed.
- Configure the orchestrator to probe `/health` periodically.
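Purely as an illustration, the healthcheck can be exercised with the `surf` dev-dependency; the sketch assumes a local instance on port 8081.

```rust
// Minimal health probe sketch; assumes the service listens locally on 8081.
#[async_std::main]
async fn main() -> surf::Result<()> {
    let mut res = surf::get("http://localhost:8081/health").await?;
    // The endpoint is documented to answer 200 with { "message": "ok" }.
    assert!(res.status().is_success());
    println!("health: {}", res.body_string().await?);
    Ok(())
}
```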
21
scripts/checks/version_alignment.sh
Executable file
@ -0,0 +1,21 @@
#!/usr/bin/env bash
set -euo pipefail

ROOT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")"/../.. && pwd)"
cd "$ROOT_DIR"

version_file="VERSION"
[[ -f TEMPLATE_VERSION ]] && version_file="TEMPLATE_VERSION"

[[ -f "$version_file" ]] || { echo "Version file missing ($version_file)"; exit 1; }
v=$(tr -d '\r' < "$version_file" | head -n1)
[[ -n "$v" ]] || { echo "Empty version"; exit 1; }

echo "Version file: $version_file=$v"

if ! grep -Eq "^## \\[$(echo "$v" | sed 's/^v//')\\]" CHANGELOG.md; then
  echo "CHANGELOG entry for $v not found"; exit 1;
fi

echo "Version alignment OK"
145
scripts/deploy/setup.sh
Executable file
@ -0,0 +1,145 @@
#!/usr/bin/env bash
set -euo pipefail

ENV_DIR="${HOME}/.4nk_template"
ENV_FILE="${ENV_DIR}/.env"
TEMPLATE_ROOT="$(cd "$(dirname "${BASH_SOURCE[0]}")/../.." && pwd)"
TEMPLATE_IN_REPO="${TEMPLATE_ROOT}/scripts/env/.env.template"

usage() {
  cat <<USAGE
Usage: $0 <git_url> [--dest DIR] [--force]

Actions:
  1) Provisionne ~/.4nk_template/.env (si absent)
  2) Clone le dépôt cible si le dossier n'existe pas
  3) Copie la structure normative 4NK_template dans le projet cible:
     - .gitea/** (workflows, templates issues/PR)
     - AGENTS.md
     - .cursor/rules/** (si présent)
     - scripts/agents/**, scripts/env/ensure_env.sh, scripts/deploy/setup.sh
     - docs/templates/** et docs/INDEX.md (table des matières)
  4) Ne remplace pas les fichiers existants sauf si --force

Exemples:
  $0 https://git.example.com/org/projet.git
  $0 git@host:org/projet.git --dest ~/work --force
USAGE
}

GIT_URL="${1:-}"
DEST_PARENT="$(pwd)"
FORCE_COPY=0
shift || true
while [[ $# -gt 0 ]]; do
  case "$1" in
    --dest)
      DEST_PARENT="${2:-}"; shift 2 ;;
    --force)
      FORCE_COPY=1; shift ;;
    -h|--help)
      usage; exit 0 ;;
    *)
      echo "Option inconnue: $1" >&2; usage; exit 2 ;;
  esac
done

if [[ -z "${GIT_URL}" ]]; then
  usage; exit 2
fi

mkdir -p "${ENV_DIR}"
chmod 700 "${ENV_DIR}" || true

if [[ ! -f "${ENV_FILE}" ]]; then
  if [[ -f "${TEMPLATE_IN_REPO}" ]]; then
    cp "${TEMPLATE_IN_REPO}" "${ENV_FILE}"
  else
    cat >"${ENV_FILE}" <<'EOF'
# Fichier d'exemple d'environnement pour 4NK_template
# Copiez ce fichier vers ~/.4nk_template/.env puis complétez les valeurs.
# Ne committez jamais de fichier contenant des secrets.

# OpenAI (agents IA)
OPENAI_API_KEY=
OPENAI_MODEL=
OPENAI_API_BASE=https://api.openai.com/v1
OPENAI_TEMPERATURE=0.2

# Gitea (release via API)
BASE_URL=https://git.4nkweb.com
RELEASE_TOKEN=
EOF
  fi
  chmod 600 "${ENV_FILE}" || true
  echo "Fichier créé: ${ENV_FILE}. Complétez les valeurs requises (ex: OPENAI_API_KEY, OPENAI_MODEL, RELEASE_TOKEN)." >&2
fi

# 2) Clonage du dépôt si nécessaire
repo_name="$(basename -s .git "${GIT_URL}")"
target_dir="${DEST_PARENT%/}/${repo_name}"
if [[ ! -d "${target_dir}" ]]; then
  echo "Clonage: ${GIT_URL} → ${target_dir}" >&2
  git clone --depth 1 "${GIT_URL}" "${target_dir}"
else
  echo "Dossier existant, pas de clone: ${target_dir}" >&2
fi

copy_item() {
  local src="$1" dst="$2"
  if [[ ! -e "$src" ]]; then return 0; fi
  if [[ -d "$src" ]]; then
    mkdir -p "$dst"
    if (( FORCE_COPY )); then
      cp -a "$src/." "$dst/"
    else
      (cd "$src" && find . -type f -print0) | while IFS= read -r -d '' f; do
        if [[ ! -e "$dst/$f" ]]; then
          mkdir -p "$(dirname "$dst/$f")"
          cp -a "$src/$f" "$dst/$f"
        fi
      done
    fi
  else
    if [[ -e "$dst" && $FORCE_COPY -eq 0 ]]; then return 0; fi
    mkdir -p "$(dirname "$dst")" && cp -a "$src" "$dst"
  fi
}

# 3) Copie de la structure normative
copy_item "${TEMPLATE_ROOT}/.gitea" "${target_dir}/.gitea"
copy_item "${TEMPLATE_ROOT}/AGENTS.md" "${target_dir}/AGENTS.md"
copy_item "${TEMPLATE_ROOT}/.cursor" "${target_dir}/.cursor"
copy_item "${TEMPLATE_ROOT}/.cursorignore" "${target_dir}/.cursorignore"
copy_item "${TEMPLATE_ROOT}/.gitignore" "${target_dir}/.gitignore"
copy_item "${TEMPLATE_ROOT}/.markdownlint.json" "${target_dir}/.markdownlint.json"
copy_item "${TEMPLATE_ROOT}/LICENSE" "${target_dir}/LICENSE"
copy_item "${TEMPLATE_ROOT}/CONTRIBUTING.md" "${target_dir}/CONTRIBUTING.md"
copy_item "${TEMPLATE_ROOT}/CODE_OF_CONDUCT.md" "${target_dir}/CODE_OF_CONDUCT.md"
copy_item "${TEMPLATE_ROOT}/SECURITY.md" "${target_dir}/SECURITY.md"
copy_item "${TEMPLATE_ROOT}/TEMPLATE_VERSION" "${target_dir}/TEMPLATE_VERSION"
copy_item "${TEMPLATE_ROOT}/security" "${target_dir}/security"
copy_item "${TEMPLATE_ROOT}/scripts" "${target_dir}/scripts"
copy_item "${TEMPLATE_ROOT}/docs/templates" "${target_dir}/docs/templates"

# Génération docs/INDEX.md dans le projet cible (si absent ou --force)
INDEX_DST="${target_dir}/docs/INDEX.md"
if [[ ! -f "${INDEX_DST}" || $FORCE_COPY -eq 1 ]]; then
  mkdir -p "$(dirname "${INDEX_DST}")"
  cat >"${INDEX_DST}" <<'IDX'
# Documentation du projet

Cette table des matières oriente vers:
- Documentation spécifique au projet: `docs/project/`
- Modèles génériques à adapter: `docs/templates/`

## Sommaire
- À personnaliser: `docs/project/README.md`, `docs/project/INDEX.md`, `docs/project/ARCHITECTURE.md`, `docs/project/USAGE.md`, etc.

## Modèles génériques
- Voir: `docs/templates/`
IDX
fi

echo "Template 4NK appliqué à: ${target_dir}" >&2
exit 0
15
scripts/dev/run_container.sh
Executable file
@ -0,0 +1,15 @@
#!/usr/bin/env bash
set -euo pipefail

IMAGE_NAME="4nk-template-dev:debian"
DOCKERFILE="docker/Dockerfile.debian"

echo "[build] ${IMAGE_NAME}"
docker build -t "${IMAGE_NAME}" -f "${DOCKERFILE}" .

echo "[run] launching container and executing agents"
docker run --rm -it \
  -v "${PWD}:/work" -w /work \
  "${IMAGE_NAME}" \
  "scripts/agents/run.sh; ls -la tests/reports/agents || true"
14
scripts/dev/run_project_ci.sh
Executable file
@ -0,0 +1,14 @@
#!/usr/bin/env bash
set -euo pipefail

# Build et lance le conteneur unifié (runner+agents) sur ce projet
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
ROOT_DIR="$(cd "$SCRIPT_DIR/../.." && pwd)"
cd "$ROOT_DIR"

# Build image
docker compose -f docker-compose.ci.yml build

# Exécuter agents par défaut
RUNNER_MODE="${RUNNER_MODE:-agents}" BASE_URL="${BASE_URL:-}" REGISTRATION_TOKEN="${REGISTRATION_TOKEN:-}" \
  docker compose -f docker-compose.ci.yml up --remove-orphans --abort-on-container-exit
42
scripts/env/ensure_env.sh
vendored
Executable file
@ -0,0 +1,42 @@
#!/usr/bin/env bash
set -euo pipefail

REPO_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")/../.." && pwd)"
TEMPLATE_FILE="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)/.env.template"
ENV_DIR="${HOME}/.4nk_template"
ENV_FILE="${ENV_DIR}/.env"

mkdir -p "${ENV_DIR}"
chmod 700 "${ENV_DIR}" || true

if [[ ! -f "${ENV_FILE}" ]]; then
  if [[ -f "${TEMPLATE_FILE}" ]]; then
    cp "${TEMPLATE_FILE}" "${ENV_FILE}"
    chmod 600 "${ENV_FILE}" || true
    echo "Fichier d'environnement créé: ${ENV_FILE}" >&2
    echo "Veuillez renseigner les variables requises (OPENAI_API_KEY, OPENAI_MODEL, etc.)." >&2
    exit 3
  else
    echo "Modèle d'environnement introuvable: ${TEMPLATE_FILE}" >&2
    exit 2
  fi
fi

# Charger pour validation
set -a
. "${ENV_FILE}"
set +a

MISSING=()
for var in OPENAI_API_KEY OPENAI_MODEL; do
  if [[ -z "${!var:-}" ]]; then
    MISSING+=("$var")
  fi
done

if (( ${#MISSING[@]} > 0 )); then
  echo "Variables manquantes dans ${ENV_FILE}: ${MISSING[*]}" >&2
  exit 4
fi

echo "Environnement valide: ${ENV_FILE}" >&2
19
scripts/local/install_hooks.sh
Executable file
@ -0,0 +1,19 @@
#!/usr/bin/env bash
set -euo pipefail

REPO_ROOT="$(cd "$(dirname "${BASH_SOURCE[0]}")/.." && pwd)"/..
HOOKS_DIR="$REPO_ROOT/.git/hooks"

mkdir -p "$HOOKS_DIR"
install_hook() {
  local name="$1" src="$2"
  cp -f "$src" "$HOOKS_DIR/$name"
  chmod +x "$HOOKS_DIR/$name"
  echo "Installed hook: $name"
}

# Hooks qui délèguent aux agents via l'image Docker du template sur le projet courant
install_hook pre-commit "$REPO_ROOT/scripts/local/precommit.sh"
install_hook pre-push "$REPO_ROOT/scripts/local/prepush.sh"

echo "Hooks installés (mode agents via 4NK_template)."
25
scripts/local/merge_branch.sh
Executable file
@ -0,0 +1,25 @@
#!/usr/bin/env bash
set -euo pipefail

TARGET_BRANCH="${1:-main}"
SOURCE_BRANCH="${2:-}"

if [[ -z "$SOURCE_BRANCH" ]]; then
  SOURCE_BRANCH="$(git rev-parse --abbrev-ref HEAD)"
fi

if [[ "$SOURCE_BRANCH" == "$TARGET_BRANCH" ]]; then
  echo "Déjà sur $TARGET_BRANCH"; exit 0
fi

# Valider localement avant merge
AUTO_FIX="${AUTO_FIX:-1}" SCOPE="${SCOPE:-all}" scripts/agents/run.sh || true
if [ -f scripts/security/audit.sh ]; then bash scripts/security/audit.sh || true; fi

git fetch origin --prune
git checkout "$TARGET_BRANCH"
git pull --ff-only origin "$TARGET_BRANCH" || true
git merge --no-ff "$SOURCE_BRANCH" -m "[skip ci] merge: $SOURCE_BRANCH -> $TARGET_BRANCH"
git push origin "$TARGET_BRANCH"

echo "Merge effectué: $SOURCE_BRANCH → $TARGET_BRANCH"
11
scripts/local/precommit.sh
Executable file
@ -0,0 +1,11 @@
#!/usr/bin/env bash
set -euo pipefail

# Exécuter les agents depuis l'image Docker de 4NK_template sur le projet courant
PROJECT_DIR="$(git rev-parse --show-toplevel)"
TEMPLATE_DIR="$(cd "${PROJECT_DIR}/../4NK_template" && pwd)"

mkdir -p "${PROJECT_DIR}/tests/reports/agents"
"${TEMPLATE_DIR}/scripts/local/run_agents_for_project.sh" "${PROJECT_DIR}" "tests/reports/agents"

echo "[pre-commit] OK (agents via 4NK_template)"
21
scripts/local/prepush.sh
Executable file
@ -0,0 +1,21 @@
#!/usr/bin/env bash
set -euo pipefail

# Exécuter les agents depuis l'image Docker de 4NK_template sur le projet courant
PROJECT_DIR="$(git rev-parse --show-toplevel)"
TEMPLATE_DIR="$(cd "${PROJECT_DIR}/../4NK_template" && pwd)"

mkdir -p "${PROJECT_DIR}/tests/reports/agents"
"${TEMPLATE_DIR}/scripts/local/run_agents_for_project.sh" "${PROJECT_DIR}" "tests/reports/agents"

# Audit sécurité (best effort) dans le contexte du projet
if [ -f "${PROJECT_DIR}/scripts/security/audit.sh" ]; then
  (cd "${PROJECT_DIR}" && bash scripts/security/audit.sh) || true
fi

# Release guard (dry-run logique) dans le contexte du projet
if [ -f "${PROJECT_DIR}/scripts/release/guard.sh" ]; then
  (cd "${PROJECT_DIR}" && bash scripts/release/guard.sh) || true
fi

echo "[pre-push] OK (agents via 4NK_template)"
20
scripts/local/release_local.sh
Executable file
@ -0,0 +1,20 @@
#!/usr/bin/env bash
set -euo pipefail

VERSION="${1:-}"
if [[ -z "$VERSION" ]]; then
  echo "Usage: $0 vYYYY.MM.P" >&2
  exit 2
fi

ROOT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")/.." && pwd)"
cd "$ROOT_DIR/.."

echo "$VERSION" > TEMPLATE_VERSION
git add TEMPLATE_VERSION CHANGELOG.md 2>/dev/null || true
git commit -m "[skip ci] chore(release): $VERSION" || true
git tag -a "$VERSION" -m "release: $VERSION (latest)"
git push || true
git push origin "$VERSION"

echo "Release locale préparée: $VERSION"
51
scripts/local/run_agents_for_project.sh
Executable file
@ -0,0 +1,51 @@
#!/usr/bin/env bash
set -euo pipefail

# Script pour lancer les agents de 4NK_template sur un projet externe
# Usage: ./run_agents_for_project.sh [project_path] [output_dir]

PROJECT_PATH="${1:-.}"
OUTPUT_DIR="${2:-tests/reports/agents}"
TEMPLATE_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")/../.." && pwd)"
MODULE_LAST_IMAGE_FILE="$(cd "$TEMPLATE_DIR/.." && pwd)/modules/4NK_template/.last_image"

if [[ ! -d "$PROJECT_PATH" ]]; then
  echo "Erreur: Le projet '$PROJECT_PATH' n'existe pas" >&2
  exit 1
fi

mkdir -p "$PROJECT_PATH/$OUTPUT_DIR"

echo "=== Lancement des agents 4NK_template sur: $PROJECT_PATH ==="

if ! command -v docker >/dev/null 2>&1; then
  echo "Docker requis pour exécuter les agents via conteneur." >&2
  exit 2
fi

# Si une image du module existe, l'utiliser en priorité
if [[ -f "$MODULE_LAST_IMAGE_FILE" ]]; then
  IMAGE_NAME="$(cat "$MODULE_LAST_IMAGE_FILE" | tr -d '\r\n')"
  echo "Utilisation de l'image du module: $IMAGE_NAME"
  # Préparer montage du fichier d'env si présent
  ENV_MOUNT=""
  if [[ -f "$HOME/.4nk_template/.env" ]]; then
    ENV_MOUNT="-v $HOME/.4nk_template/.env:/root/.4nk_template/.env:ro"
  fi
  # Lancer le conteneur en utilisant l'ENTRYPOINT qui configure safe.directory
  docker run --rm \
    -e RUNNER_MODE=agents \
    -e TARGET_DIR=/work \
    -e OUTPUT_DIR=/work/$OUTPUT_DIR \
    -v "$(realpath "$PROJECT_PATH"):/work" \
    $ENV_MOUNT \
    "$IMAGE_NAME" || true
else
  echo "Aucune image de module détectée, fallback docker compose dans 4NK_template"
  cd "$TEMPLATE_DIR"
  docker compose -f docker-compose.ci.yml build
  RUNNER_MODE="agents" TARGET_DIR="/work" OUTPUT_DIR="/work/$OUTPUT_DIR" \
    docker compose -f docker-compose.ci.yml run --rm project-ci || true
fi

echo "=== Agents terminés → $PROJECT_PATH/$OUTPUT_DIR ==="
66
scripts/release/guard.sh
Executable file
@ -0,0 +1,66 @@
#!/usr/bin/env bash
set -euo pipefail

# Release guard script
# Checks: tests, docs updated, compile, version ↔ changelog ↔ tag consistency, release type

ROOT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")"/../.. && pwd)"
cd "$ROOT_DIR"

mode="${RELEASE_TYPE:-ci-verify}" # values: latest | wip | ci-verify

echo "[release-guard] mode=$mode"

# 1) Basic presence checks
[[ -f CHANGELOG.md ]] || { echo "CHANGELOG.md manquant"; exit 1; }
version_file="VERSION"
[[ -f TEMPLATE_VERSION ]] && version_file="TEMPLATE_VERSION"
[[ -f "$version_file" ]] || { echo "$version_file manquant"; exit 1; }

# 2) Extract version
project_version=$(tr -d '\r' < "$version_file" | head -n1 | sed 's/^v//')
[[ -n "$project_version" ]] || { echo "Version vide dans $version_file"; exit 1; }
echo "[release-guard] version=$project_version"

# 3) Changelog checks
if ! grep -Eq "^## \\[$project_version\\]" CHANGELOG.md; then
  if [[ "$mode" == "wip" ]]; then
    grep -Eq "^## \\[Unreleased\\]" CHANGELOG.md || { echo "Section [Unreleased] absente du CHANGELOG"; exit 1; }
  else
    echo "Entrée CHANGELOG pour version $project_version manquante"; exit 1;
  fi
fi

# 4) Tests (optional best-effort)
if [[ -x tests/run_all_tests.sh ]]; then
  echo "[release-guard] exécution tests/run_all_tests.sh"
  ./tests/run_all_tests.sh || { echo "Tests en échec"; exit 1; }
else
  echo "[release-guard] tests absents (ok)"
fi

# 5) Build/compile (optional based on project)
if [[ -d sdk_relay ]] && command -v cargo >/dev/null 2>&1; then
  echo "[release-guard] cargo build (sdk_relay)"
  (cd sdk_relay && cargo build --quiet) || { echo "Compilation échouée"; exit 1; }
else
  echo "[release-guard] build spécifique non applicable (ok)"
fi

# 6) Release type handling
case "$mode" in
  latest)
    ;;
  wip)
    # En wip, autoriser versions suffixées; pas d'exigence d'entrée datée
    ;;
  ci-verify)
    # En CI, on valide juste la présence de CHANGELOG et version
    ;;
  *)
    echo "RELEASE_TYPE invalide: $mode (latest|wip|ci-verify)"; exit 1;
    ;;
esac

echo "[release-guard] OK"
166
scripts/scripts/auto-ssh-push.sh
Executable file
@ -0,0 +1,166 @@
#!/usr/bin/env bash
set -euo pipefail

# Script d'automatisation des push SSH (template Linux)
# Utilise automatiquement la clé SSH pour pousser sur le remote courant via SSH.

GITEA_HOST="${GITEA_HOST:-git.4nkweb.com}"

echo "🔑 Configuration SSH pour push (template)..."

# Configuration SSH automatique
echo "⚙️ Configuration Git pour utiliser SSH..."
git config --global url."git@${GITEA_HOST}:".insteadOf "https://${GITEA_HOST}/"

# Vérifier la configuration SSH
echo "🔍 Vérification de la configuration SSH..."
if ! ssh -T git@"${GITEA_HOST}" 2>&1 | grep -qi "authenticated\|welcome"; then
  echo "❌ Échec de l'authentification SSH"
  echo "💡 Vérifiez que votre clé SSH est configurée :"
  echo " 1. ssh-keygen -t ed25519 -f ~/.ssh/id_ed25519_4nk"
  echo " 2. Ajouter la clé publique à votre compte Gitea"
  echo " 3. ssh-add ~/.ssh/id_ed25519_4nk"
  exit 1
fi

echo "✅ Authentification SSH réussie"

# Fonction pour push automatique
get_current_branch() {
  # Détecte la branche courante, compatible anciennes versions de git
  local br
  br="$(git rev-parse --abbrev-ref HEAD 2>/dev/null || true)"
  if [ -z "$br" ] || [ "$br" = "HEAD" ]; then
    br="$(git symbolic-ref --short -q HEAD 2>/dev/null || true)"
  fi
  if [ -z "$br" ]; then
    # dernier recours: parser la sortie de "git branch"
    br="$(git branch 2>/dev/null | sed -n 's/^* //p' | head -n1)"
  fi
  echo "$br"
}

auto_push() {
  local branch
  branch=${1:-$(get_current_branch)}
  local commit_message=${2:-"Auto-commit $(date '+%Y-%m-%d %H:%M:%S')"}

  echo "🚀 Push automatique sur la branche: $branch"

  # Ajouter tous les changements
  git add .

  # Ne pas commiter si rien à commiter
  if [[ -z "$(git diff --cached --name-only)" ]]; then
    echo "ℹ️ Aucun changement indexé. Skip commit/push."
    return 0
  fi

  # Commiter avec le message fourni
  git commit -m "$commit_message" || true

  # Push avec SSH automatique
  echo "📤 Push vers origin/$branch..."
  git push origin "$branch"

  echo "✅ Push réussi !"
}

# Fonction pour push avec message personnalisé
push_with_message() {
  local message="$1"
  local branch=${2:-$(get_current_branch)}

  echo "💬 Push avec message: $message"
  auto_push "$branch" "$message"
}

# Fonction pour push rapide (sans message)
quick_push() {
  local branch=${1:-$(get_current_branch)}
  auto_push "$branch"
}

# Fonction pour push sur une branche spécifique
push_branch() {
  local branch="$1"
  local message=${2:-"Update $branch $(date '+%Y-%m-%d %H:%M:%S')"}

  echo "🌿 Push sur la branche: $branch"
  auto_push "$branch" "$message"
}

# Fonction pour push et merge vers main
push_and_merge() {
  local source_branch=${1:-$(get_current_branch)}
  local target_branch=${2:-main}

  echo "🔄 Push et merge $source_branch -> $target_branch"

  # Push de la branche source
  auto_push "$source_branch"

  # Indication pour PR manuelle
  echo "🔗 Ouvrez une Pull Request sur votre forge pour $source_branch -> $target_branch"
}

# Fonction pour status et push conditionnel
status_and_push() {
  echo "📊 Statut du repository:"
  git status --short || true

  if [[ -n $(git status --porcelain) ]]; then
    echo "📝 Changements détectés, push automatique..."
    auto_push
  else
    echo "✅ Aucun changement à pousser"
  fi
}

# Menu interactif si aucun argument fourni
if [[ $# -eq 0 ]]; then
  echo "🤖 Script de push SSH automatique (template)"
  echo ""
  echo "Options disponibles:"
  echo " auto-ssh-push.sh quick - Push rapide"
  echo " auto-ssh-push.sh message \"Mon message\" - Push avec message"
  echo " auto-ssh-push.sh branch nom-branche - Push sur branche spécifique"
  echo " auto-ssh-push.sh merge [source] [target] - Push et préparation merge"
  echo " auto-ssh-push.sh status - Status et push conditionnel"
  echo ""
  exit 0
fi

# Traitement des arguments
case "$1" in
  "quick")
    quick_push
    ;;
  "message")
    if [[ -z "${2:-}" ]]; then
      echo "❌ Message requis pour l'option 'message'"
      exit 1
    fi
    push_with_message "$2" "${3:-}"
    ;;
  "branch")
    if [[ -z "${2:-}" ]]; then
      echo "❌ Nom de branche requis pour l'option 'branch'"
      exit 1
    fi
    push_branch "$2" "${3:-}"
    ;;
  "merge")
    push_and_merge "${2:-}" "${3:-}"
    ;;
  "status")
    status_and_push
    ;;
  *)
    echo "❌ Option inconnue: $1"
    echo "💡 Utilisez './scripts/auto-ssh-push.sh' pour voir les options"
    exit 1
    ;;
esac

echo "🎯 Push SSH automatique terminé !"
60
scripts/scripts/init-ssh-env.sh
Executable file
@ -0,0 +1,60 @@
#!/usr/bin/env bash
set -euo pipefail

# Script d'initialisation de l'environnement SSH (template Linux)
# Configure automatiquement SSH pour les push via Gitea

GITEA_HOST="${GITEA_HOST:-git.4nkweb.com}"

echo "🚀 Initialisation de l'environnement SSH (template)..."

# Couleurs
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
BLUE='\033[0;34m'
NC='\033[0m'

print_status() { echo -e "${BLUE}[INFO]${NC} $1"; }
print_success() { echo -e "${GREEN}[SUCCESS]${NC} $1"; }
print_warning() { echo -e "${YELLOW}[WARNING]${NC} $1"; }
print_error() { echo -e "${RED}[ERROR]${NC} $1"; }

print_status "Configuration SSH..."

# 1. Configuration Git pour SSH
print_status "Configuration Git pour utiliser SSH (${GITEA_HOST})..."
git config --global url."git@${GITEA_HOST}:".insteadOf "https://${GITEA_HOST}/"

# 2. Vérification des clés SSH
print_status "Vérification des clés SSH existantes..."
if [[ -f ~/.ssh/id_rsa || -f ~/.ssh/id_ed25519 ]]; then
  print_success "Clé SSH trouvée"
else
  print_warning "Aucune clé SSH trouvée"
fi

# 3. Test de la connexion SSH
print_status "Test de la connexion SSH vers ${GITEA_HOST}..."
if ssh -T git@"${GITEA_HOST}" 2>&1 | grep -qi "authenticated\|welcome"; then
  print_success "Authentification SSH réussie"
else
  print_error "Échec de l'authentification SSH"
fi

# 4. Alias Git
print_status "Configuration des alias Git..."
git config --global alias.ssh-push '!f() { git add . && git commit -m "${1:-Auto-commit $(date)}" && git push origin $(git branch --show-current); }; f'
git config --global alias.quick-push '!f() { git add . && git commit -m "Update $(date)" && git push origin $(git branch --show-current); }; f'
print_success "Alias Git configurés"

# 5. Rendu exécutable des scripts si chemin standard
print_status "Configuration des permissions des scripts (si présents)..."
chmod +x scripts/auto-ssh-push.sh 2>/dev/null || true
chmod +x scripts/setup-ssh-ci.sh 2>/dev/null || true
print_success "Scripts rendus exécutables (si présents)"

# 6. Résumé
echo ""
print_success "=== Configuration SSH terminée ==="
55
scripts/scripts/setup-ssh-ci.sh
Executable file
@ -0,0 +1,55 @@
#!/usr/bin/env bash
set -euo pipefail

# Script de configuration SSH pour CI/CD (template Linux)
# Utilise automatiquement la clé SSH pour les opérations Git

GITEA_HOST="${GITEA_HOST:-git.4nkweb.com}"

echo "🔑 Configuration automatique de la clé SSH pour CI/CD..."

if [ -n "${CI:-}" ]; then
  echo "✅ Environnement CI détecté"

  if [ -n "${SSH_PRIVATE_KEY:-}" ]; then
    echo "🔐 Configuration de la clé SSH privée..."
    mkdir -p ~/.ssh && chmod 700 ~/.ssh
    printf "%s" "$SSH_PRIVATE_KEY" > ~/.ssh/id_rsa
    chmod 600 ~/.ssh/id_rsa

    if [ -n "${SSH_PUBLIC_KEY:-}" ]; then
      printf "%s" "$SSH_PUBLIC_KEY" > ~/.ssh/id_rsa.pub
      chmod 644 ~/.ssh/id_rsa.pub
    fi

    cat > ~/.ssh/config << EOF
Host ${GITEA_HOST}
  HostName ${GITEA_HOST}
  User git
  IdentityFile ~/.ssh/id_rsa
  StrictHostKeyChecking no
  UserKnownHostsFile=/dev/null
EOF
    chmod 600 ~/.ssh/config

    echo "🧪 Test SSH vers ${GITEA_HOST}..."
    ssh -T git@"${GITEA_HOST}" 2>&1 || true

    git config --global url."git@${GITEA_HOST}:".insteadOf "https://${GITEA_HOST}/"
    echo "✅ Configuration SSH terminée"
  else
    echo "⚠️ SSH_PRIVATE_KEY non défini, bascule HTTPS"
  fi
else
  echo "ℹ️ Environnement local détecté"
  if [ -f ~/.ssh/id_rsa ] || [ -f ~/.ssh/id_ed25519 ]; then
    echo "🔑 Clé SSH locale trouvée"
    git config --global url."git@${GITEA_HOST}:".insteadOf "https://${GITEA_HOST}/"
    echo "✅ Configuration SSH locale terminée"
  else
    echo "⚠️ Aucune clé SSH trouvée; configuration manuelle requise"
  fi
fi

echo "🎯 Configuration SSH CI/CD terminée"
35
scripts/security/audit.sh
Executable file
@ -0,0 +1,35 @@
#!/usr/bin/env bash
set -euo pipefail

echo "[security-audit] démarrage"
ROOT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")"/../.. && pwd)"
cd "$ROOT_DIR"

rc=0

# 1) Audit Rust (si Cargo.toml présent et cargo disponible)
if command -v cargo >/dev/null 2>&1 && { [ -f Cargo.toml ] || find . -maxdepth 2 -name Cargo.toml | grep -q . ; }; then
  echo "[security-audit] cargo audit"
  if ! cargo audit --deny warnings; then rc=1; fi
else
  echo "[security-audit] pas de projet Rust (ok)"
fi

# 2) Audit npm (si package.json présent)
if [ -f package.json ]; then
  echo "[security-audit] npm audit --audit-level=moderate"
  if ! npm audit --audit-level=moderate; then rc=1; fi
else
  echo "[security-audit] pas de package.json (ok)"
fi

# 3) Recherche de secrets grossiers
echo "[security-audit] scan secrets"
if grep -RIEi "(api[_-]?key|secret|password|private[_-]?key)" --exclude-dir .git --exclude-dir node_modules --exclude-dir target --exclude "*.md" . >/dev/null 2>&1; then
  echo "[security-audit] secrets potentiels détectés"; rc=1
else
  echo "[security-audit] aucun secret évident"
fi

echo "[security-audit] terminé rc=$rc"
exit $rc
47
scripts/utils/check_md024.ps1
Normal file
@ -0,0 +1,47 @@
Param(
  [string]$Root = "."
)

$ErrorActionPreference = "Stop"

$files = Get-ChildItem -Path $Root -Recurse -Filter *.md | Where-Object { $_.FullName -notmatch '\\archive\\' }
$had = $false
foreach ($f in $files) {
  try {
    $lines = Get-Content -LiteralPath $f.FullName -Encoding UTF8 -ErrorAction Stop
  } catch {
    Write-Warning ("Impossible de lire: {0} — {1}" -f $f.FullName, $_.Exception.Message)
    continue
  }
  $map = @{}
  $firstMap = @{}
  $dups = @{}
  for ($i = 0; $i -lt $lines.Count; $i++) {
    $line = $lines[$i]
    if ($line -match '^\s{0,3}#{1,6}\s+(.*)$') {
      $t = $Matches[1].Trim()
      $norm = ([regex]::Replace($t, '\s+', ' ')).ToLowerInvariant()
      if ($map.ContainsKey($norm)) {
        if (-not $dups.ContainsKey($norm)) {
          $dups[$norm] = New-Object System.Collections.ArrayList
          $firstMap[$norm] = $map[$norm]
        }
        [void]$dups[$norm].Add($i + 1)
      } else {
        $map[$norm] = $i + 1
      }
    }
  }
  if ($dups.Keys.Count -gt 0) {
    $had = $true
    Write-Output "=== $($f.FullName) ==="
    foreach ($k in $dups.Keys) {
      $first = $firstMap[$k]
      $others = ($dups[$k] -join ', ')
      Write-Output ("Heading: '{0}' first@{1} duplicates@[{2}]" -f $k, $first, $others)
    }
  }
}
if (-not $had) {
  Write-Output "No duplicate headings detected."
}
259
src/lib.rs
Normal file
@ -0,0 +1,259 @@
use async_std::fs::{create_dir_all, read_dir, read_to_string, remove_file, File};
use async_std::io::WriteExt;
use async_std::path::Path;
use async_std::stream::StreamExt;
use serde::{Deserialize, Serialize};
use std::time::{Duration, SystemTime, UNIX_EPOCH};
use tide::{StatusCode, Request, Response};

/// File-system backed storage service: values live under `storage_dir`,
/// sharded by the first two hex characters of the key.
#[derive(Clone, Debug)]
pub struct StorageService {
    storage_dir: String,
}

impl StorageService {
    pub fn new<S: Into<String>>(storage_dir: S) -> Self {
        Self {
            storage_dir: storage_dir.into(),
        }
    }

    /// Maps a 64-character hex key to `<storage_dir>/<key[..2]>/<key[2..]>`.
    fn get_file_path(&self, key: &str) -> String {
        let dir_name = format!("{}/{}", self.storage_dir, &key[..2]);
        let file_path = format!("{}/{}", dir_name, &key[2..]);
        file_path
    }

    /// Writes the value, plus a sibling `<path>.meta` JSON file holding the optional expiration.
    pub async fn store_data(
        &self,
        key: &str,
        value: &[u8],
        expires_at: Option<SystemTime>,
    ) -> Result<(), tide::Error> {
        let file_name = self.get_file_path(key);
        let file_path = Path::new(&file_name);

        if file_path.exists().await {
            return Err(tide::Error::from_str(StatusCode::Conflict, "Key already exists"));
        }

        create_dir_all(file_path.parent().ok_or(tide::Error::from_str(
            StatusCode::InternalServerError,
            "File path doesn't have parent",
        ))?)
        .await
        .map_err(|e| tide::Error::new(StatusCode::InternalServerError, e))?;

        let metadata_path = format!("{}.meta", file_name);

        let mut file = File::create(&file_path)
            .await
            .map_err(|e| tide::Error::new(StatusCode::InternalServerError, e))?;
        file.write_all(value)
            .await
            .map_err(|e| tide::Error::new(StatusCode::InternalServerError, e))?;

        let metadata = Metadata {
            expires_at: expires_at.map(system_time_to_unix),
        };

        let metadata_json = serde_json::to_string(&metadata)
            .map_err(|e| tide::Error::new(StatusCode::InternalServerError, e))?;
        let mut meta_file = File::create(&metadata_path)
            .await
            .map_err(|e| tide::Error::new(StatusCode::InternalServerError, e))?;
        meta_file
            .write_all(metadata_json.as_bytes())
            .await
            .map_err(|e| tide::Error::new(StatusCode::InternalServerError, e))?;

        Ok(())
    }

    /// Reads the raw bytes stored under `key`; fails with "Key not found." when absent.
    pub async fn retrieve_data(&self, key: &str) -> Result<Vec<u8>, String> {
        let file_path = format!("{}/{}/{}", self.storage_dir, &key[..2], &key[2..]);

        let mut file = File::open(&file_path)
            .await
            .map_err(|_| "Key not found.".to_string())?;
        let mut buffer = Vec::new();
        async_std::io::ReadExt::read_to_end(&mut file, &mut buffer)
            .await
            .map_err(|e| e.to_string())?;
        Ok(buffer)
    }

    /// Walks the storage tree once and removes entries whose metadata marks them as expired.
    pub async fn cleanup_expired_files_once(&self) -> Result<(), String> {
        let mut entries = read_dir(&self.storage_dir)
            .await
            .map_err(|e| format!("Failed to read storage dir: {}", e))?;
        let now = system_time_to_unix(SystemTime::now());
        while let Some(entry) = entries.next().await {
            let e = entry.map_err(|e| format!("entry returned error: {}", e))?;
            let path = e.path();
            if path.is_dir().await {
                if let Ok(mut sub_entries) = read_dir(&path).await {
                    while let Some(sub_entry) = sub_entries.next().await {
                        if let Ok(sub_entry) = sub_entry {
                            let file_path = sub_entry.path();
                            if file_path.extension() == Some("meta".as_ref()) {
                                self.handle_file_cleanup(now, &file_path).await?;
                            }
                        }
                    }
                }
            }
        }
        Ok(())
    }

    async fn handle_file_cleanup(&self, now: u64, meta_path: &Path) -> Result<(), String> {
        let meta_content = read_to_string(meta_path)
            .await
            .map_err(|e| format!("Failed to read metadata: {}", e.to_string()))?;
        let metadata: Metadata = serde_json::from_str(&meta_content)
            .map_err(|e| format!("Failed to parse metadata: {}", e.to_string()))?;

        if metadata.expires_at.is_some() && metadata.expires_at.unwrap() < now {
            let data_file_path = meta_path.with_extension("");
            remove_file(&data_file_path)
                .await
                .map_err(|e| format!("Failed to remove data file: {}", e.to_string()))?;
            remove_file(meta_path)
                .await
                .map_err(|e| format!("Failed to remove metadata file: {}", e.to_string()))?;
        }
        Ok(())
    }
}

/// Sidecar metadata serialized to `*.meta` files (expiration as a UNIX timestamp in seconds).
#[derive(Debug, Deserialize, Serialize)]
pub struct Metadata {
    pub expires_at: Option<u64>,
}

pub fn system_time_to_unix(system_time: SystemTime) -> u64 {
    system_time
        .duration_since(UNIX_EPOCH)
        .expect("SystemTime before UNIX_EPOCH!")
        .as_secs()
}

pub fn unix_to_system_time(unix_timestamp: u64) -> SystemTime {
    UNIX_EPOCH + Duration::from_secs(unix_timestamp)
}

#[derive(Deserialize)]
pub struct StoreRequest {
    pub key: String,
    pub value: String,
    pub ttl: Option<u64>,
}

#[derive(Serialize)]
pub struct ApiResponse { pub message: String }

#[derive(Serialize)]
pub struct RetrieveResponse { pub key: String, pub value: String }

/// GET /health: always answers 200 with `{ "message": "ok" }`.
pub async fn handle_health(_req: Request<StorageService>) -> tide::Result<Response> {
    Ok(Response::builder(StatusCode::Ok)
        .body(serde_json::to_value(&ApiResponse { message: "ok".into() })?)
        .build())
}

/// POST /store: validates the key, TTL and hex value, then delegates to `StorageService::store_data`.
pub async fn handle_store(mut req: Request<StorageService>, no_ttl_permanent: bool) -> tide::Result<Response> {
    let data: StoreRequest = match req.body_json().await {
        Ok(data) => data,
        Err(e) => {
            return Ok(Response::builder(StatusCode::BadRequest)
                .body(format!("Invalid request: {}", e))
                .build());
        }
    };

    if data.key.len() != 64 || !data.key.chars().all(|c| c.is_ascii_hexdigit()) {
        return Ok(Response::builder(StatusCode::BadRequest)
            .body("Invalid key: must be a 32 bytes hex string.".to_string())
            .build());
    }

    let live_for: Option<Duration> = if let Some(ttl) = data.ttl {
        if ttl < 60 {
            return Ok(Response::builder(StatusCode::BadRequest)
                .body(format!("Invalid ttl: must be at least {} seconds.", 60))
                .build());
        } else if ttl > 31_536_000 {
            return Ok(Response::builder(StatusCode::BadRequest)
                .body(format!("Invalid ttl: must be at most {} seconds.", 31_536_000))
                .build());
        }
        Some(Duration::from_secs(ttl))
    } else if no_ttl_permanent {
        None
    } else {
        Some(Duration::from_secs(86_400))
    };

    let expires_at: Option<SystemTime> = match live_for {
        Some(lf) => Some(
            SystemTime::now()
                .checked_add(lf)
                .ok_or(tide::Error::from_str(StatusCode::BadRequest, "Invalid ttl"))?
        ),
        None => None,
    };

    let value_bytes = match hex::decode(&data.value) {
        Ok(value) => value,
        Err(e) => {
            return Ok(Response::builder(StatusCode::BadRequest)
                .body(format!("Invalid request: {}", e))
                .build());
        }
    };

    let svc = req.state();
    match svc.store_data(&data.key, &value_bytes, expires_at).await {
        Ok(()) => Ok(Response::builder(StatusCode::Ok)
            .body(serde_json::to_value(&ApiResponse {
message: "Data stored successfully.".to_string(),
|
||||
})?)
|
||||
.build()),
|
||||
Err(e) => Ok(Response::builder(e.status())
|
||||
.body(serde_json::to_value(&ApiResponse {
|
||||
message: e.to_string(),
|
||||
})?)
|
||||
.build()),
|
||||
}
|
||||
}
|
||||
|
||||
pub async fn handle_retrieve(req: Request<StorageService>) -> tide::Result<Response> {
|
||||
let key: String = req.param("key")?.to_string();
|
||||
|
||||
if key.len() != 64 || !key.chars().all(|c| c.is_ascii_hexdigit()) {
|
||||
return Ok(Response::builder(StatusCode::BadRequest)
|
||||
.body("Invalid key: must be a 32 bytes hex string.".to_string())
|
||||
.build());
|
||||
}
|
||||
|
||||
let svc = req.state();
|
||||
match svc.retrieve_data(&key).await {
|
||||
Ok(value) => {
|
||||
let encoded_value = hex::encode(value);
|
||||
Ok(Response::builder(StatusCode::Ok)
|
||||
.body(serde_json::to_value(&RetrieveResponse { key, value: encoded_value })?)
|
||||
.build())
|
||||
}
|
||||
Err(e) => Ok(Response::builder(StatusCode::NotFound).body(e).build()),
|
||||
}
|
||||
}
|
||||
|
||||
pub fn create_app(no_ttl_permanent: bool, storage_dir: impl Into<String>) -> tide::Server<StorageService> {
|
||||
let svc = StorageService::new(storage_dir);
|
||||
let mut app = tide::with_state(svc);
|
||||
app.at("/health").get(handle_health);
|
||||
app.at("/store").post(move |req| handle_store(req, no_ttl_permanent));
|
||||
app.at("/retrieve/:key").get(handle_retrieve);
|
||||
app
|
||||
}
|
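A minimal usage sketch (illustrative, not part of this diff): driving StorageService directly from a small binary, assuming the library above is consumed as the sdk_storage crate. The directory name ./storage-demo, the key, and the payload are made up for the example.

use sdk_storage::{system_time_to_unix, unix_to_system_time, StorageService};

#[async_std::main]
async fn main() -> Result<(), String> {
    // Keys shard on their first two hex characters: this entry lands under
    // ./storage-demo/ab/<remaining 62 chars>, with a ".meta" sidecar holding the expiry.
    let svc = StorageService::new("./storage-demo");
    let key = "ab".repeat(32); // 64 hex characters, i.e. a 32-byte key
    let expires = Some(unix_to_system_time(
        system_time_to_unix(std::time::SystemTime::now()) + 3_600, // one hour from now
    ));
    svc.store_data(&key, b"payload", expires)
        .await
        .map_err(|e| e.to_string())?;
    let bytes = svc.retrieve_data(&key).await?;
    assert_eq!(bytes, b"payload");
    Ok(())
}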
301 src/main.rs
@@ -1,305 +1,40 @@
use async_std::fs::{create_dir_all, read_dir, read_to_string, remove_file, File};
use serde::{Deserialize, Serialize};
use std::time::{Duration, SystemTime, UNIX_EPOCH};
use std::env;

use async_std::io::WriteExt;
use async_std::path::Path;
use async_std::stream::StreamExt;
use async_std::task;
use tide::{Request, Response, StatusCode};
use async_std::fs::create_dir_all;
use sdk_storage::{StorageService, create_app};

const STORAGE_DIR: &str = "./storage";
const PORT: u16 = 8081;
const MIN_TTL: u64 = 60; // 1 minute
const DEFAULT_TTL: u64 = 86400; // 1 day
const MAX_TTL: u64 = 31_536_000; // 1 year, to be discussed
const DEFAULT_TTL: u64 = 86400;

/// Scans storage and removes expired files
async fn cleanup_expired_files() {
    loop {
        // Traverse storage directory
        let mut entries = match read_dir(STORAGE_DIR).await {
            Ok(entry) => entry,
            Err(e) => {
                eprintln!("Failed to read storage dir: {}", e);
                task::sleep(Duration::from_secs(60)).await;
                continue;
            }
        };

        let now = system_time_to_unix(SystemTime::now());
        while let Some(entry) = entries.next().await {
            let e = match entry {
                Ok(e) => e,
                Err(e) => {
                    eprintln!("entry returned error: {}", e);
                    continue;
                }
            };
            let path = e.path();
            if path.is_dir().await {
                if let Ok(mut sub_entries) = read_dir(&path).await {
                    while let Some(sub_entry) = sub_entries.next().await {
                        if let Ok(sub_entry) = sub_entry {
                            let file_path = sub_entry.path();
                            if file_path.extension() == Some("meta".as_ref()) {
                                if let Err(err) = handle_file_cleanup(now, &file_path).await {
                                    eprintln!("Error cleaning file {:?}: {}", file_path, err);
                                }
                            }
                        }
                    }
                }
            }
        }
        // Sleep for 1 minute before next cleanup
        task::sleep(Duration::from_secs(60)).await;
    }
}

#[derive(Debug, Deserialize, Serialize)]
struct Metadata {
    expires_at: Option<u64>,
}

/// Converts a `SystemTime` to a UNIX timestamp (seconds since UNIX epoch).
fn system_time_to_unix(system_time: SystemTime) -> u64 {
    system_time
        .duration_since(UNIX_EPOCH)
        .expect("SystemTime before UNIX_EPOCH!")
        .as_secs()
}

/// Converts a UNIX timestamp (seconds since UNIX epoch) back to `SystemTime`.
fn unix_to_system_time(unix_timestamp: u64) -> SystemTime {
    UNIX_EPOCH + Duration::from_secs(unix_timestamp)
}

#[derive(Deserialize)]
struct StoreRequest {
    key: String,
    value: String,
    ttl: Option<u64>,
}

#[derive(Serialize)]
struct ApiResponse {
    message: String,
}

#[derive(Serialize)]
struct RetrieveResponse {
    key: String,
    value: String,
}

async fn get_file_path(key: &str) -> String {
    let dir_name = format!("{}/{}", STORAGE_DIR, &key[..2]);
    let file_path = format!("{}/{}", dir_name, &key[2..]);

    file_path
}

/// Store data on the filesystem
async fn store_data(key: &str, value: &[u8], expires_at: Option<SystemTime>) -> Result<(), tide::Error> {
    let file_name = get_file_path(key).await;
    let file_path = Path::new(&file_name);

    // Check if key exists
    if file_path.exists().await {
        return Err(tide::Error::from_str(
            StatusCode::Conflict,
            "Key already exists",
        ));
    }

    create_dir_all(file_path.parent().ok_or(tide::Error::from_str(
        StatusCode::InternalServerError,
        "File path doesn't have parent",
    ))?)
    .await
    .map_err(|e| tide::Error::new(StatusCode::InternalServerError, e))?;

    let metadata_path = format!("{}.meta", file_name);

    let mut file = File::create(&file_path)
        .await
        .map_err(|e| tide::Error::new(StatusCode::InternalServerError, e))?;
    file.write_all(value)
        .await
        .map_err(|e| tide::Error::new(StatusCode::InternalServerError, e))?;

    let metadata = Metadata {
        expires_at: expires_at.map(|e| system_time_to_unix(e)),
    };

    let metadata_json = serde_json::to_string(&metadata)
        .map_err(|e| tide::Error::new(StatusCode::InternalServerError, e))?;
    let mut meta_file = File::create(&metadata_path)
        .await
        .map_err(|e| tide::Error::new(StatusCode::InternalServerError, e))?;
    meta_file
        .write_all(metadata_json.as_bytes())
        .await
        .map_err(|e| tide::Error::new(StatusCode::InternalServerError, e))?;

    Ok(())
}

async fn retrieve_data(key: &str) -> Result<Vec<u8>, String> {
    let file_path = format!("{}/{}/{}", STORAGE_DIR, &key[..2], &key[2..]);

    let mut file = File::open(&file_path)
        .await
        .map_err(|_| "Key not found.".to_string())?;
    let mut buffer = Vec::new();
    async_std::io::ReadExt::read_to_end(&mut file, &mut buffer)
        .await
        .map_err(|e| e.to_string())?;
    Ok(buffer)
}

/// Handler for the /store endpoint
async fn handle_store(mut req: Request<()>, no_ttl_permanent: bool) -> tide::Result<Response> {
    // Parse the JSON body
    let data: StoreRequest = match req.body_json().await {
        Ok(data) => data,
        Err(e) => {
            return Ok(Response::builder(StatusCode::BadRequest)
                .body(format!("Invalid request: {}", e))
                .build());
        }
    };

    // Validate the key
    if data.key.len() != 64 || !data.key.chars().all(|c| c.is_ascii_hexdigit()) {
        return Ok(Response::builder(StatusCode::BadRequest)
            .body("Invalid key: must be a 32 bytes hex string.".to_string())
            .build());
    }

    // Validate the ttl
    let live_for: Option<Duration> = if let Some(ttl) = data.ttl {
        if ttl < MIN_TTL {
            return Ok(Response::builder(StatusCode::BadRequest)
                .body(format!(
                    "Invalid ttl: must be at least {} seconds.",
                    MIN_TTL
                ))
                .build());
        } else if ttl > MAX_TTL {
            return Ok(Response::builder(StatusCode::BadRequest)
                .body(format!("Invalid ttl: must be at most {} seconds.", MAX_TTL))
                .build());
        }
        Some(Duration::from_secs(ttl))
    } else if no_ttl_permanent {
        // When no_ttl_permanent is true, requests without TTL are permanent
        None
    } else {
        Some(Duration::from_secs(DEFAULT_TTL))
    };

    let expires_at: Option<SystemTime> = if let Some(live_for) = live_for {
        let now = SystemTime::now();
        Some(now
            .checked_add(live_for)
            .ok_or(tide::Error::from_str(StatusCode::BadRequest, "Invalid ttl"))?)
    } else {
        None
    };

    // Decode the value from hex
    let value_bytes = match hex::decode(&data.value) {
        Ok(value) => value,
        Err(e) => {
            return Ok(Response::builder(StatusCode::BadRequest)
                .body(format!("Invalid request: {}", e))
                .build());
        }
    };

    // Store the data
    match store_data(&data.key, &value_bytes, expires_at).await {
        Ok(()) => Ok(Response::builder(StatusCode::Ok)
            .body(serde_json::to_value(&ApiResponse {
                message: "Data stored successfully.".to_string(),
            })?)
            .build()),
        Err(e) => Ok(Response::builder(e.status())
            .body(serde_json::to_value(&ApiResponse {
                message: e.to_string(),
            })?)
            .build()),
    }
}

async fn handle_retrieve(req: Request<()>) -> tide::Result<Response> {
    let key: String = req.param("key")?.to_string();

    if key.len() != 64 || !key.chars().all(|c| c.is_ascii_hexdigit()) {
        return Ok(Response::builder(StatusCode::BadRequest)
            .body("Invalid key: must be a 32 bytes hex string.".to_string())
            .build());
    }

    match retrieve_data(&key).await {
        Ok(value) => {
            let encoded_value = hex::encode(value);
            Ok(Response::builder(StatusCode::Ok)
                .body(serde_json::to_value(&RetrieveResponse {
                    key,
                    value: encoded_value,
                })?)
                .build())
        }
        Err(e) => Ok(Response::builder(StatusCode::NotFound).body(e).build()),
    }
}

/// Checks a metadata file and deletes the associated data file if expired
async fn handle_file_cleanup(now: u64, meta_path: &Path) -> Result<(), String> {
    let meta_content = read_to_string(meta_path)
        .await
        .map_err(|e| format!("Failed to read metadata: {}", e.to_string()))?;
    let metadata: Metadata = serde_json::from_str(&meta_content)
        .map_err(|e| format!("Failed to parse metadata: {}", e.to_string()))?;

    if metadata.expires_at.is_some() && metadata.expires_at.unwrap() < now {
        let data_file_path = meta_path.with_extension("");
        remove_file(&data_file_path)
            .await
            .map_err(|e| format!("Failed to remove data file: {}", e.to_string()))?;
        remove_file(meta_path)
            .await
            .map_err(|e| format!("Failed to remove metadata file: {}", e.to_string()))?;
        println!("Removed expired file: {:?}", data_file_path);
    }
    Ok(())
}

#[async_std::main]
async fn main() -> tide::Result<()> {
    // Parse command line arguments
    let args: Vec<String> = env::args().collect();
    let no_ttl_permanent = args.iter().any(|arg| arg == "--permanent");

    if no_ttl_permanent {
        println!("No-TTL requests will be treated as permanent");
    } else {
        println!("No-TTL requests will use default TTL of {} seconds", DEFAULT_TTL);
    }

    create_dir_all(STORAGE_DIR)
        .await
        .expect("Failed to create storage directory.");

    task::spawn(cleanup_expired_files());
    let svc = StorageService::new(STORAGE_DIR);
    create_dir_all(STORAGE_DIR).await.expect("Failed to create storage directory.");

    let mut app = tide::new();
    app.at("/store").post(move |req| handle_store(req, no_ttl_permanent));
    app.at("/retrieve/:key").get(handle_retrieve);
    // background cleanup loop
    let svc_clone = svc.clone();
    task::spawn(async move {
        loop {
            if let Err(e) = svc_clone.cleanup_expired_files_once().await {
                eprintln!("cleanup error: {}", e);
            }
            task::sleep(std::time::Duration::from_secs(60)).await;
        }
    });

    let mut app = create_app(no_ttl_permanent, STORAGE_DIR);
    app.listen(format!("0.0.0.0:{}", PORT)).await?;

    println!("Server running at http://0.0.0.0:{}", PORT);
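For reference, a hedged sketch (not part of the diff) of the JSON body accepted by POST /store, as enforced by handle_store above; the key and payload here are made up for illustration.

fn example_store_body() -> serde_json::Value {
    serde_json::json!({
        "key": "dd".repeat(32),           // exactly 64 hex characters, i.e. a 32-byte key
        "value": hex::encode(b"payload"), // payload bytes, hex-encoded (not Base64)
        "ttl": 3_600u64                   // optional: 60..=31_536_000 s; omitted => DEFAULT_TTL, or permanent under --permanent
    })
}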
135 tests/http_api.rs Normal file
@@ -0,0 +1,135 @@
use sdk_storage::{StorageService, unix_to_system_time, create_app};
use tempfile::TempDir;
use surf::Client;

#[async_std::test]
async fn store_and_retrieve_hex_in_tempdir() {
    let td = TempDir::new().unwrap();
    let dir_path = td.path().to_string_lossy().to_string();
    let svc = StorageService::new(dir_path);

    let key = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa";
    let value = b"hello";
    let expires = Some(unix_to_system_time(60 + sdk_storage::system_time_to_unix(std::time::SystemTime::now())));

    svc.store_data(key, value, expires).await.unwrap();
    let got = svc.retrieve_data(key).await.unwrap();
    assert_eq!(got, value);
}

#[async_std::test]
async fn conflict_on_duplicate_key() {
    let td = TempDir::new().unwrap();
    let dir_path = td.path().to_string_lossy().to_string();
    let svc = StorageService::new(dir_path);

    let key = "bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb";
    let value = b"data";

    svc.store_data(key, value, None).await.unwrap();
    let err = svc.store_data(key, value, None).await.err().expect("should error");
    assert_eq!(err.status(), tide::StatusCode::Conflict);
}

#[async_std::test]
async fn cleanup_removes_expired() {
    let td = TempDir::new().unwrap();
    let dir_path = td.path().to_string_lossy().to_string();
    let svc = StorageService::new(dir_path.clone());

    let key = "cccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccc";
    let value = b"x";
    // expiry just in the past: now - 1 s
    let past = sdk_storage::system_time_to_unix(std::time::SystemTime::now()).saturating_sub(1);
    let expires = Some(unix_to_system_time(past));

    svc.store_data(key, value, expires).await.unwrap();
    // cleanup one-shot
    svc.cleanup_expired_files_once().await.unwrap();
    let res = svc.retrieve_data(key).await;
    assert!(res.is_err(), "expired key should be removed");
}

#[async_std::test]
async fn http_store_success_and_conflicts_and_invalids() {
    // app with permanent=false so default TTL applies when missing
    let td = TempDir::new().unwrap();
    let storage = td.path().to_string_lossy().to_string();
    let mut app = create_app(false, storage);
    let listener = async_std::net::TcpListener::bind("127.0.0.1:0").await.unwrap();
    let addr = listener.local_addr().unwrap();
    async_std::task::spawn(async move { app.listen(listener).await.unwrap() });

    let client = Client::new();
    let base = format!("http://{}", addr);

    // success
    let key = "dddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddd";
    let body = serde_json::json!({"key": key, "value": "01aa", "ttl": 120});
    let res = client.post(format!("{}/store", base)).body_json(&body).unwrap().await.unwrap();
    assert!(res.status().is_success());

    // conflict
    let res2 = client.post(format!("{}/store", base)).body_json(&body).unwrap().await.unwrap();
    assert_eq!(res2.status(), 409);

    // invalid key
    let bad = serde_json::json!({"key": "xyz", "value": "01"});
    let res3 = client.post(format!("{}/store", base)).body_json(&bad).unwrap().await.unwrap();
    assert_eq!(res3.status(), 400);

    // invalid value
    let badv = serde_json::json!({"key": "eeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeee", "value": "zz"});
    let res4 = client.post(format!("{}/store", base)).body_json(&badv).unwrap().await.unwrap();
    assert_eq!(res4.status(), 400);
}

#[async_std::test]
async fn http_retrieve_success_and_invalid_and_notfound() {
    let td = TempDir::new().unwrap();
    let storage = td.path().to_string_lossy().to_string();
    let mut app = create_app(true, storage.clone());
    let listener = async_std::net::TcpListener::bind("127.0.0.1:0").await.unwrap();
    let addr = listener.local_addr().unwrap();
    async_std::task::spawn(async move { app.listen(listener).await.unwrap() });

    let client = Client::new();
    let base = format!("http://{}", addr);

    // prepare stored value (permanent mode)
    let svc = StorageService::new(storage);
    let key = "ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff";
    svc.store_data(key, b"hi", None).await.unwrap();

    // success
    let mut res = client.get(format!("{}/retrieve/{}", base, key)).await.unwrap();
    assert!(res.status().is_success());
    let v: serde_json::Value = res.body_json().await.unwrap();
    assert_eq!(v["value"].as_str().unwrap(), hex::encode("hi"));

    // invalid key
    let res2 = client.get(format!("{}/retrieve/{}", base, "bad")).await.unwrap();
    assert_eq!(res2.status(), 400);

    // not found
    let k2 = "1111111111111111111111111111111111111111111111111111111111111111";
    let res3 = client.get(format!("{}/retrieve/{}", base, k2)).await.unwrap();
    assert_eq!(res3.status(), 404);
}

#[async_std::test]
async fn http_health_ok() {
    let td = TempDir::new().unwrap();
    let storage = td.path().to_string_lossy().to_string();
    let mut app = create_app(true, storage);
    let listener = async_std::net::TcpListener::bind("127.0.0.1:0").await.unwrap();
    let addr = listener.local_addr().unwrap();
    async_std::task::spawn(async move { app.listen(listener).await.unwrap() });

    let client = Client::new();
    let base = format!("http://{}", addr);
    let mut res = client.get(format!("{}/health", base)).await.unwrap();
    assert!(res.status().is_success());
    let v: serde_json::Value = res.body_json().await.unwrap();
    assert_eq!(v["message"].as_str().unwrap(), "ok");
}