Compare commits


7 Commits
master...dev

| Author | SHA1 | Message | CI status | Date |
| --- | --- | --- | --- | --- |
| Sosthene | 83b21cb815 | Fix unconsistent async patterns + add a few missing try/catch | Build and Push to Registry / build-and-push: successful in 1m53s | 2025-09-08 21:57:11 +02:00 |
| omaroughriss | 8449083237 | Add build js to Dockerfile | Build and Push to Registry / build-and-push: successful in 1m56s | 2025-09-08 19:42:55 +02:00 |
| omaroughriss | f4bf9791c0 | Add CICD for the signer | Build and Push to Registry / build-and-push: successful in 2m0s | 2025-09-08 19:22:33 +02:00 |
| omaroughriss | 95763806c8 | Update Dockerfile to use signer image with sdk_client pkg | n/a | 2025-09-08 19:22:11 +02:00 |
| Sosthene | fe60812c53 | Update changelog | n/a | 2025-09-08 18:12:52 +02:00 |
| Sosthene | 26f985195d | [bug] More robust retrieveData | n/a | 2025-09-08 18:12:38 +02:00 |
| Sosthene | 34a65ec079 | Add background-sync service | n/a | 2025-09-08 18:11:56 +02:00 |
18 changed files with 634 additions and 1409 deletions

@@ -1,98 +0,0 @@
---
name: Bug Report
about: Report a bug to help us improve sdk_signer
title: '[BUG] '
labels: ['bug', 'needs-triage']
assignees: ''
---
## 🐛 Bug Description
A clear and concise description of the problem.
## 🔄 Steps to Reproduce
1. Go to '...'
2. Click on '...'
3. Scroll down to '...'
4. See the error
## ✅ Expected Behavior
A description of what should happen.
## ❌ Actual Behavior
A description of what currently happens.
## 📸 Screenshot
If applicable, add a screenshot to illustrate your problem.
## 💻 System Information
- **OS**: [e.g. Ubuntu 20.04, macOS 12.0, Windows 11]
- **Docker**: [e.g. 20.10.0]
- **Docker Compose**: [e.g. 2.0.0]
- **sdk_signer version**: [e.g. v1.0.0]
- **Architecture**: [e.g. x86_64, ARM64]
## 📋 Configuration
### Running Services
```bash
docker ps
```
### Environment Variables
```bash
# Bitcoin Core
BITCOIN_NETWORK=signet
BITCOIN_RPC_PORT=18443
# Blindbit
BLINDBIT_PORT=8000
# SDK Relay
SDK_RELAY_PORTS=8090-8095
```
## 📝 Logs
### Relevant Logs
```
Relevant logs here
```
### Error Logs
```
Error logs here
```
### Debug Logs
```
Debug logs here (if RUST_LOG=debug)
```
## 🔧 Attempted Fixes
- [ ] Restarted the services
- [ ] Cleaned the Docker volumes
- [ ] Checked network connectivity
- [ ] Updated the dependencies
- [ ] Checked the configuration
## 📚 Additional Context
Any other information relevant to the problem.
## 🔗 Useful Links
- [Documentation](docs/)
- [Troubleshooting Guide](docs/TROUBLESHOOTING.md)
- [Similar Issues](https://git.4nkweb.com/4nk/4NK_node/issues?q=is%3Aissue+is%3Aopen+label%3Abug)
---
**Thank you for your contribution!** 🙏

@@ -1,157 +0,0 @@
---
name: Feature Request
about: Propose a new feature for sdk_signer
title: '[FEATURE] '
labels: ['enhancement', 'needs-triage']
assignees: ''
---
## 🚀 Summary
A clear and concise description of the desired feature.
## 💡 Motivation
Why is this feature needed? Which problems does it solve?
### Current Problems
- Problem 1
- Problem 2
- Problem 3
### Benefits of the Solution
- Benefit 1
- Benefit 2
- Benefit 3
## 🎯 Proposal
A detailed description of the proposed feature.
### Main Features
- [ ] Feature 1
- [ ] Feature 2
- [ ] Feature 3
### User Interface
A description of the user interface, if applicable.
### API Changes
A description of the API changes, if applicable.
## 🔄 Alternatives Considered
Other solutions considered and why they were not chosen.
### Alternative 1
- **Description**: ...
- **Why it was rejected**: ...
### Alternative 2
- **Description**: ...
- **Why it was rejected**: ...
## 📊 Impact
### Impact on Users
- Positive impact 1
- Positive impact 2
- Potential negative impact (if applicable)
### Impact on the Architecture
- Required changes
- Compatibility with the existing code
- Performance
### Impact on Maintenance
- Added complexity
- Required tests
- Required documentation
## 💻 Usage Examples
### Use Case 1
```bash
# Example command or configuration
```
### Use Case 2
```python
# Example Python code
```
### Use Case 3
```javascript
// Example JavaScript code
```
## 🧪 Tests
### Required Tests
- [ ] Unit tests
- [ ] Integration tests
- [ ] Performance tests
- [ ] Security tests
- [ ] Compatibility tests
### Test Scenarios
- Scenario 1
- Scenario 2
- Scenario 3
## 📚 Documentation
### Required Documentation
- [ ] Usage guide
- [ ] API documentation
- [ ] Code examples
- [ ] Migration guide
- [ ] FAQ
## 🔧 Implementation
### Proposed Steps
1. **Phase 1**: [Description]
2. **Phase 2**: [Description]
3. **Phase 3**: [Description]
### Time Estimate
- **Development**: X days/weeks
- **Tests**: X days/weeks
- **Documentation**: X days/weeks
- **Total**: X days/weeks
### Required Resources
- Developer(s)
- Tester(s)
- Technical writer(s)
- Infrastructure
## 🎯 Success Criteria
How do we measure the success of this feature?
- [ ] Criterion 1
- [ ] Criterion 2
- [ ] Criterion 3
## 🔗 Useful Links
- [Existing documentation](docs/)
- [Similar issues](https://git.4nkweb.com/4nk/4NK_node/issues?q=is%3Aissue+is%3Aopen+label%3Aenhancement)
- [Roadmap](https://git.4nkweb.com/4nk/4NK_node/projects)
- [Discussions](https://git.4nkweb.com/4nk/4NK_node/issues)
## 📋 Checklist
- [ ] I checked that this feature does not already exist
- [ ] I read the existing documentation
- [ ] I checked for similar issues
- [ ] I provided usage examples
- [ ] I considered the impact on the existing code
- [ ] I proposed tests
---
**Thank you for contributing to improving sdk_signer!** 🌟

@@ -1,181 +0,0 @@
# Pull Request - sdk_signer
## 📋 Description
A clear and concise description of the changes.
### Type of Change
- [ ] 🐛 Bug fix
- [ ] ✨ New feature
- [ ] 📚 Documentation
- [ ] 🧪 Tests
- [ ] 🔧 Refactoring
- [ ] 🚀 Performance
- [ ] 🔒 Security
- [ ] 🎨 Style/UI
- [ ] 🏗️ Architecture
- [ ] 📦 Build/CI
### Affected Components
- [ ] Bitcoin Core
- [ ] Blindbit
- [ ] SDK Relay
- [ ] Tor
- [ ] Docker/Infrastructure
- [ ] Tests
- [ ] Documentation
- [ ] Scripts
## 🔗 Related Issue(s)
Fixes #(issue)
Relates to #(issue)
## 🧪 Tests
### Tests Run
- [ ] Unit tests
- [ ] Integration tests
- [ ] Connectivity tests
- [ ] External tests
- [ ] Performance tests
### Test Commands
```bash
# Full test suite
./tests/run_all_tests.sh
# Specific tests
./tests/run_unit_tests.sh
./tests/run_integration_tests.sh
```
### Test Results
```
Test results here
```
## 📸 Screenshots
If applicable, add screenshots for visual changes.
## 🔧 Technical Changes
### Modified Files
- `file1.rs` - Description of the changes
- `file2.py` - Description of the changes
- `docker-compose.yml` - Description of the changes
### New Files
- `new_file.rs` - Description
- `new_script.sh` - Description
### Deleted Files
- `old_file.rs` - Reason for the deletion
### Configuration Changes
```yaml
# Example configuration change
service:
  new_option: value
```
## 📚 Documentation
### Updated Documentation
- [ ] README.md
- [ ] docs/INSTALLATION.md
- [ ] docs/USAGE.md
- [ ] docs/API.md
- [ ] docs/ARCHITECTURE.md
### New Documentation
- [ ] New guide created
- [ ] Examples added
- [ ] API documented
## 🔍 Code Review Checklist
### Code Quality
- [ ] The code follows the project standards
- [ ] Variable/function names are clear
- [ ] Comments are appropriate
- [ ] No dead or commented-out code
- [ ] Appropriate error handling
### Performance
- [ ] No performance regression
- [ ] Optimizations applied where needed
- [ ] Performance tests added
### Security
- [ ] No vulnerabilities introduced
- [ ] User input validated
- [ ] Secrets handled securely
### Tests
- [ ] Sufficient test coverage
- [ ] Tests for the error cases
- [ ] Integration tests where needed
### Documentation
- [ ] Self-documenting code
- [ ] Documentation updated
- [ ] Examples provided
## 🚀 Deployment
### Deployment Impact
- [ ] No impact
- [ ] Data migration required
- [ ] Configuration change
- [ ] Service restart
### Deployment Steps
```bash
# Steps to deploy the changes
```
## 📊 Metrics
### Performance Impact
- Response time: +/- X%
- Memory usage: +/- X%
- CPU usage: +/- X%
### Stability Impact
- Error rate: +/- X%
- Availability: +/- X%
## 🔄 Compatibility
### Backward Compatibility
- [ ] Compatible with previous versions
- [ ] Automatic migration
- [ ] Manual migration required
### Forward Compatibility
- [ ] Compatible with future versions
- [ ] Stable API
- [ ] Stable configuration
## 🎯 Success Criteria
- [ ] Criterion 1
- [ ] Criterion 2
- [ ] Criterion 3
## 📝 Additional Notes
Additional information important for the reviewers.
## 🔗 Useful Links
- [Documentation](docs/)
- [Tests](tests/)
- [Related issues](https://git.4nkweb.com/4nk/4NK_node/issues)
---
**Thank you for your contribution!** 🙏

@@ -1,4 +0,0 @@
# .gitea
Gitea configuration files (issues, templates, workflows), to be added as needed.

@@ -1,15 +0,0 @@
# LOCAL_OVERRIDES.yml — controlled local overrides
overrides:
  - path: ".gitea/workflows/ci.yml"
    reason: "environment-specific setting"
    owner: "@maintainer_handle"
    expires: "2025-12-31"
  - path: "scripts/auto-ssh-push.sh"
    reason: "temporary one-off workflow"
    owner: "@maintainer_handle"
    expires: "2025-10-01"
policy:
  allow_only_listed_paths: true
  require_expiry: true
  audit_in_ci: true
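The `require_expiry` policy above implies a date check during the CI audit. A minimal sketch, assuming `expires` values use ISO `YYYY-MM-DD` dates (the function name is illustrative, not part of the policy file):

```shell
# ISO YYYY-MM-DD dates sort lexically, so a plain string comparison is enough.
not_expired() {
  expires="$1"
  today=$(date -u +%F)
  [ "$expires" = "$today" ] || [ "$expires" \> "$today" ]
}

not_expired "2999-01-01" && echo "override still valid"   # prints: override still valid
not_expired "2000-01-01" || echo "override expired"       # prints: override expired
```

An `audit_in_ci` job could run this over every `expires` value and fail the build on any expired entry.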

@@ -1,486 +0,0 @@
name: CI - 4NK Node
on:
push:
branches: [ main, develop ]
tags:
- 'v*'
pull_request:
branches: [ main, develop ]
env:
RUST_VERSION: '1.70'
DOCKER_COMPOSE_VERSION: '2.20.0'
CI_SKIP: 'true'
jobs:
# Code quality job
code-quality:
name: Code Quality
runs-on: [self-hosted, linux]
if: ${{ env.CI_SKIP != 'true' }}
steps:
- name: Checkout code
uses: actions/checkout@v3
- name: Setup Rust
uses: actions-rs/toolchain@v1
with:
toolchain: ${{ env.RUST_VERSION }}
override: true
- name: Cache Rust dependencies
uses: actions/cache@v3
with:
path: |
~/.cargo/registry
~/.cargo/git
target
key: ${{ runner.os }}-cargo-${{ hashFiles('**/Cargo.lock') }}
restore-keys: |
${{ runner.os }}-cargo-
- name: Run clippy
run: |
cd sdk_relay
cargo clippy --all-targets --all-features -- -D warnings
- name: Run rustfmt
run: |
cd sdk_relay
cargo fmt --all -- --check
- name: Check documentation
run: |
cd sdk_relay
cargo doc --no-deps
- name: Check for TODO/FIXME
run: |
if grep -r "TODO\|FIXME" . --exclude-dir=.git --exclude-dir=target; then
echo "Found TODO/FIXME comments. Please address them."
exit 1
fi
# Unit tests job
unit-tests:
name: Unit Tests
runs-on: [self-hosted, linux]
if: ${{ env.CI_SKIP != 'true' }}
steps:
- name: Checkout code
uses: actions/checkout@v3
- name: Setup Rust
uses: actions-rs/toolchain@v1
with:
toolchain: ${{ env.RUST_VERSION }}
override: true
- name: Cache Rust dependencies
uses: actions/cache@v3
with:
path: |
~/.cargo/registry
~/.cargo/git
target
key: ${{ runner.os }}-cargo-${{ hashFiles('**/Cargo.lock') }}
restore-keys: |
${{ runner.os }}-cargo-
- name: Run unit tests
run: |
cd sdk_relay
cargo test --lib --bins
- name: Run integration tests
run: |
cd sdk_relay
cargo test --tests
# Integration tests job
integration-tests:
name: Integration Tests
runs-on: [self-hosted, linux]
if: ${{ env.CI_SKIP != 'true' }}
services:
docker:
image: docker:24.0.5
options: >-
--health-cmd "docker info"
--health-interval 10s
--health-timeout 5s
--health-retries 5
ports:
- 2375:2375
steps:
- name: Checkout code
uses: actions/checkout@v3
- name: Setup Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Build Docker images
run: |
docker build -t 4nk-node-bitcoin ./bitcoin
docker build -t 4nk-node-blindbit ./blindbit
docker build -t 4nk-node-sdk-relay -f ./sdk_relay/Dockerfile ..
- name: Run integration tests
run: |
# Basic connectivity tests
./tests/run_connectivity_tests.sh || true
# Integration tests
./tests/run_integration_tests.sh || true
- name: Upload test results
uses: actions/upload-artifact@v3
if: always()
with:
name: test-results
path: |
tests/logs/
tests/reports/
retention-days: 7
# Security tests job
security-tests:
name: Security Tests
runs-on: [self-hosted, linux]
if: ${{ env.CI_SKIP != 'true' }}
steps:
- name: Checkout code
uses: actions/checkout@v3
- name: Setup Rust
uses: actions-rs/toolchain@v1
with:
toolchain: ${{ env.RUST_VERSION }}
override: true
- name: Run cargo audit
run: |
cd sdk_relay
cargo audit --deny warnings
- name: Check for secrets
run: |
# Look for potential secrets
if grep -r "password\|secret\|key\|token" . --exclude-dir=.git --exclude-dir=target --exclude=*.md; then
echo "Potential secrets found. Please review."
exit 1
fi
- name: Check file permissions
run: |
# Check permissions on sensitive files (parentheses keep -o from detaching the name tests)
find . -type f \( -name "*.conf" -o -name "*.key" -o -name "*.pem" \) | while read -r file; do
if [[ $(stat -c %a "$file") != "600" ]]; then
echo "Warning: $file has insecure permissions"
fi
done
# Docker build and test job
docker-build:
name: Docker Build & Test
runs-on: [self-hosted, linux]
if: ${{ env.CI_SKIP != 'true' }}
services:
docker:
image: docker:24.0.5
options: >-
--health-cmd "docker info"
--health-interval 10s
--health-timeout 5s
--health-retries 5
ports:
- 2375:2375
steps:
- name: Checkout code
uses: actions/checkout@v3
- name: Setup Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Build and test Bitcoin Core
run: |
docker build -t 4nk-node-bitcoin:test ./bitcoin
docker run --rm 4nk-node-bitcoin:test bitcoin-cli --version
- name: Build and test Blindbit
run: |
docker build -t 4nk-node-blindbit:test ./blindbit
docker run --rm 4nk-node-blindbit:test --version || true
- name: Build and test SDK Relay
run: |
docker build -t 4nk-node-sdk-relay:test -f ./sdk_relay/Dockerfile ..
docker run --rm 4nk-node-sdk-relay:test --version || true
- name: Test Docker Compose
run: |
docker-compose config
docker-compose build --no-cache
# Documentation tests job
documentation-tests:
name: Documentation Tests
runs-on: [self-hosted, linux]
if: ${{ env.CI_SKIP != 'true' }}
steps:
- name: Checkout code
uses: actions/checkout@v3
- name: Check markdown links
run: |
# Basic markdown link check
find . -name "*.md" -exec grep -l "\[.*\](" {} \; | while read file; do
echo "Checking links in $file"
done
markdownlint:
name: Markdown Lint
runs-on: [self-hosted, linux]
if: ${{ env.CI_SKIP != 'true' }}
steps:
- name: Checkout code
uses: actions/checkout@v3
- name: Run markdownlint
run: |
npm --version || (curl -fsSL https://deb.nodesource.com/setup_20.x | sudo -E bash - && sudo apt-get install -y nodejs)
npx -y markdownlint-cli@0.42.0 "**/*.md" --ignore "archive/**"
- name: Check documentation structure
run: |
# Check that the essential documentation files are present
required_files=(
"README.md"
"LICENSE"
"CONTRIBUTING.md"
"CHANGELOG.md"
"CODE_OF_CONDUCT.md"
"SECURITY.md"
)
for file in "${required_files[@]}"; do
if [[ ! -f "$file" ]]; then
echo "Missing required documentation file: $file"
exit 1
fi
done
bash-required:
name: Bash Requirement
runs-on: [self-hosted, linux]
if: ${{ env.CI_SKIP != 'true' }}
steps:
- name: Checkout code
uses: actions/checkout@v3
- name: Verify bash availability
run: |
if ! command -v bash >/dev/null 2>&1; then
echo "bash is required for agents and scripts"; exit 1;
fi
- name: Verify agents runner exists
run: |
if [ ! -f scripts/agents/run.sh ]; then
echo "scripts/agents/run.sh is missing"; exit 1;
fi
agents-smoke:
name: Agents Smoke (no AI)
runs-on: [self-hosted, linux]
if: ${{ env.CI_SKIP != 'true' }}
steps:
- name: Checkout code
uses: actions/checkout@v3
- name: Ensure agents scripts executable
run: |
chmod +x scripts/agents/*.sh || true
- name: Run agents without AI
env:
OPENAI_API_KEY: ""
run: |
scripts/agents/run.sh
- name: Upload agents reports
uses: actions/upload-artifact@v3
with:
name: agents-reports
path: tests/reports/agents
openia-agents:
name: Agents with OpenAI
runs-on: [self-hosted, linux]
if: ${{ env.CI_SKIP != 'true' && secrets.OPENAI_API_KEY != '' }}
env:
OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
OPENAI_MODEL: ${{ vars.OPENAI_MODEL }}
OPENAI_API_BASE: ${{ vars.OPENAI_API_BASE }}
OPENAI_TEMPERATURE: ${{ vars.OPENAI_TEMPERATURE }}
steps:
- name: Checkout code
uses: actions/checkout@v3
- name: Ensure agents scripts executable
run: |
chmod +x scripts/agents/*.sh || true
- name: Run agents with AI
run: |
scripts/agents/run.sh
- name: Upload agents reports
uses: actions/upload-artifact@v3
with:
name: agents-reports-ai
path: tests/reports/agents
deployment-checks:
name: Deployment Checks
runs-on: [self-hosted, linux]
if: ${{ env.CI_SKIP != 'true' }}
steps:
- name: Checkout code
uses: actions/checkout@v3
- name: Validate deployment documentation
run: |
if [ ! -f docs/DEPLOYMENT.md ]; then
echo "Missing docs/DEPLOYMENT.md"; exit 1; fi
if [ ! -f docs/SSH_UPDATE.md ]; then
echo "Missing docs/SSH_UPDATE.md"; exit 1; fi
- name: Ensure tests directories exist
run: |
mkdir -p tests/logs tests/reports || true
echo "OK"
security-audit:
name: Security Audit
runs-on: [self-hosted, linux]
if: ${{ env.CI_SKIP != 'true' }}
steps:
- name: Checkout code
uses: actions/checkout@v3
- name: Ensure scripts executable
run: |
chmod +x scripts/security/audit.sh || true
- name: Run template security audit
run: |
if [ -f scripts/security/audit.sh ]; then
./scripts/security/audit.sh
else
echo "No security audit script (ok)"
fi
# Release guard job (release consistency)
release-guard:
name: Release Guard
runs-on: [self-hosted, linux]
needs: [code-quality, unit-tests, documentation-tests, markdownlint, security-audit, deployment-checks, bash-required]
if: ${{ env.CI_SKIP != 'true' }}
steps:
- name: Checkout code
uses: actions/checkout@v3
- name: Ensure guard scripts are executable
run: |
chmod +x scripts/release/guard.sh || true
chmod +x scripts/checks/version_alignment.sh || true
- name: Version alignment check
run: |
if [ -f scripts/checks/version_alignment.sh ]; then
./scripts/checks/version_alignment.sh
else
echo "No version alignment script (ok)"
fi
- name: Release guard (CI verify)
env:
RELEASE_TYPE: ci-verify
run: |
if [ -f scripts/release/guard.sh ]; then
./scripts/release/guard.sh
else
echo "No guard script (ok)"
fi
release-create:
name: Create Release (Gitea API)
runs-on: ubuntu-latest
needs: [release-guard]
if: ${{ env.CI_SKIP != 'true' && startsWith(github.ref, 'refs/tags/') }}
env:
RELEASE_TOKEN: ${{ secrets.RELEASE_TOKEN }}
BASE_URL: ${{ vars.BASE_URL }}
steps:
- name: Checkout code
uses: actions/checkout@v3
- name: Validate token and publish release
run: |
set -e
if [ -z "${RELEASE_TOKEN}" ]; then
echo "RELEASE_TOKEN secret is missing" >&2; exit 1; fi
if [ -z "${BASE_URL}" ]; then
BASE_URL="https://git.4nkweb.com"; fi
TAG="${GITHUB_REF##*/}"
REPO="${GITHUB_REPOSITORY}"
OWNER="${REPO%%/*}"
NAME="${REPO##*/}"
echo "Publishing release ${TAG} to ${BASE_URL}/${OWNER}/${NAME}"
curl -sSf -X POST \
-H "Authorization: token ${RELEASE_TOKEN}" \
-H "Content-Type: application/json" \
-d "{\"tag_name\":\"${TAG}\",\"name\":\"${TAG}\",\"draft\":false,\"prerelease\":false}" \
"${BASE_URL}/api/v1/repos/${OWNER}/${NAME}/releases" >/dev/null
echo "Release created"
# Performance tests job
performance-tests:
name: Performance Tests
runs-on: [self-hosted, linux]
if: ${{ env.CI_SKIP != 'true' }}
steps:
- name: Checkout code
uses: actions/checkout@v3
- name: Setup Rust
uses: actions-rs/toolchain@v1
with:
toolchain: ${{ env.RUST_VERSION }}
override: true
- name: Run performance tests
run: |
cd sdk_relay
cargo test --release --test performance_tests || true
- name: Check memory usage
run: |
# Basic memory-consumption check
echo "Performance tests completed"
# Notification job
notify:
name: Notify
runs-on: [self-hosted, linux]
needs: [code-quality, unit-tests, integration-tests, security-tests, docker-build, documentation-tests]
if: ${{ env.CI_SKIP != 'true' && always() }}
steps:
- name: Notify success
if: needs.code-quality.result == 'success' && needs.unit-tests.result == 'success' && needs.integration-tests.result == 'success' && needs.security-tests.result == 'success' && needs.docker-build.result == 'success' && needs.documentation-tests.result == 'success'
run: |
echo "✅ All tests passed successfully!"
- name: Notify failure
if: needs.code-quality.result == 'failure' || needs.unit-tests.result == 'failure' || needs.integration-tests.result == 'failure' || needs.security-tests.result == 'failure' || needs.docker-build.result == 'failure' || needs.documentation-tests.result == 'failure'
run: |
echo "❌ Some tests failed!"
exit 1

@@ -1,352 +0,0 @@
name: CI - sdk_signer
on:
push:
branches: [ main, develop ]
pull_request:
branches: [ main, develop ]
env:
RUST_VERSION: '1.70'
DOCKER_COMPOSE_VERSION: '2.20.0'
jobs:
# Code quality job
code-quality:
name: Code Quality
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v3
- name: Setup Rust
uses: actions-rs/toolchain@v1
with:
toolchain: ${{ env.RUST_VERSION }}
override: true
- name: Cache Rust dependencies
uses: actions/cache@v3
with:
path: |
~/.cargo/registry
~/.cargo/git
target
key: ${{ runner.os }}-cargo-${{ hashFiles('**/Cargo.lock') }}
restore-keys: |
${{ runner.os }}-cargo-
- name: Run clippy
run: |
cargo clippy --all-targets --all-features -- -D warnings
- name: Run rustfmt
run: |
cargo fmt --all -- --check
- name: Check documentation
run: |
cargo doc --no-deps
- name: Check for TODO/FIXME
run: |
if grep -r "TODO\|FIXME" . --exclude-dir=.git --exclude-dir=target; then
echo "Found TODO/FIXME comments. Please address them."
exit 1
fi
# Unit tests job
unit-tests:
name: Unit Tests
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v3
- name: Setup Rust
uses: actions-rs/toolchain@v1
with:
toolchain: ${{ env.RUST_VERSION }}
override: true
- name: Cache Rust dependencies
uses: actions/cache@v3
with:
path: |
~/.cargo/registry
~/.cargo/git
target
key: ${{ runner.os }}-cargo-${{ hashFiles('**/Cargo.lock') }}
restore-keys: |
${{ runner.os }}-cargo-
- name: Run unit tests
run: |
cargo test --lib --bins
- name: Run integration tests
run: |
cargo test --tests
# Integration tests job
integration-tests:
name: Integration Tests
runs-on: ubuntu-latest
services:
docker:
image: docker:24.0.5
options: >-
--health-cmd "docker info"
--health-interval 10s
--health-timeout 5s
--health-retries 5
ports:
- 2375:2375
steps:
- name: Checkout code
uses: actions/checkout@v3
- name: Setup Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Build Docker images
run: |
docker build -t 4nk-node-bitcoin ./bitcoin
docker build -t 4nk-node-blindbit ./blindbit
docker build -t 4nk-node-sdk-relay -f ./sdk_relay/Dockerfile ..
- name: Run integration tests
run: |
# Basic connectivity tests
./tests/run_connectivity_tests.sh || true
# Integration tests
./tests/run_integration_tests.sh || true
- name: Upload test results
uses: actions/upload-artifact@v3
if: always()
with:
name: test-results
path: |
tests/logs/
tests/reports/
retention-days: 7
# Security tests job
security-tests:
name: Security Tests
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v3
- name: Setup Rust
uses: actions-rs/toolchain@v1
with:
toolchain: ${{ env.RUST_VERSION }}
override: true
- name: Run cargo audit
run: |
cargo audit --deny warnings
- name: Check for secrets
run: |
# Look for potential secrets
if grep -r "password\|secret\|key\|token" . --exclude-dir=.git --exclude-dir=target --exclude=*.md; then
echo "Potential secrets found. Please review."
exit 1
fi
- name: Check file permissions
run: |
# Check permissions on sensitive files (parentheses keep -o from detaching the name tests)
find . -type f \( -name "*.conf" -o -name "*.key" -o -name "*.pem" \) | while read -r file; do
if [[ $(stat -c %a "$file") != "600" ]]; then
echo "Warning: $file has insecure permissions"
fi
done
# Docker build and test job
docker-build:
name: Docker Build & Test
runs-on: ubuntu-latest
services:
docker:
image: docker:24.0.5
options: >-
--health-cmd "docker info"
--health-interval 10s
--health-timeout 5s
--health-retries 5
ports:
- 2375:2375
steps:
- name: Checkout code
uses: actions/checkout@v3
- name: Setup Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Build and test Bitcoin Core
run: |
docker build -t 4nk-node-bitcoin:test ./bitcoin
docker run --rm 4nk-node-bitcoin:test bitcoin-cli --version
- name: Build and test Blindbit
run: |
docker build -t 4nk-node-blindbit:test ./blindbit
docker run --rm 4nk-node-blindbit:test --version || true
- name: Build and test SDK Relay
run: |
docker build -t 4nk-node-sdk-relay:test -f ./sdk_relay/Dockerfile ..
docker run --rm 4nk-node-sdk-relay:test --version || true
- name: Test Docker Compose
run: |
docker-compose config
docker-compose build --no-cache
# Documentation tests job
documentation-tests:
name: Documentation Tests
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v3
- name: Check markdown links
run: |
# Basic markdown link check
find . -name "*.md" -exec grep -l "\[.*\](" {} \; | while read file; do
echo "Checking links in $file"
done
- name: Check documentation structure
run: |
# Check that the essential documentation files are present
required_files=(
"README.md"
"LICENSE"
"CONTRIBUTING.md"
"CHANGELOG.md"
"CODE_OF_CONDUCT.md"
"SECURITY.md"
"docs/INDEX.md"
"docs/INSTALLATION.md"
"docs/USAGE.md"
)
for file in "${required_files[@]}"; do
if [[ ! -f "$file" ]]; then
echo "Missing required documentation file: $file"
exit 1
fi
done
- name: Validate documentation
run: |
echo "Documentation checks completed"
security-audit:
name: Security Audit
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v3
- name: Ensure scripts executable
run: |
chmod +x scripts/security/audit.sh || true
- name: Run template security audit
run: |
if [ -f scripts/security/audit.sh ]; then
./scripts/security/audit.sh
else
echo "No security audit script (ok)"
fi
# Release guard job (release consistency)
release-guard:
name: Release Guard
runs-on: ubuntu-latest
needs: [code-quality, unit-tests, documentation-tests, security-audit]
steps:
- name: Checkout code
uses: actions/checkout@v3
- name: Ensure guard scripts are executable
run: |
chmod +x scripts/release/guard.sh || true
chmod +x scripts/checks/version_alignment.sh || true
- name: Version alignment check
run: |
if [ -f scripts/checks/version_alignment.sh ]; then
./scripts/checks/version_alignment.sh
else
echo "No version alignment script (ok)"
fi
- name: Release guard (CI verify)
env:
RELEASE_TYPE: ci-verify
run: |
if [ -f scripts/release/guard.sh ]; then
./scripts/release/guard.sh
else
echo "No guard script (ok)"
fi
# Performance tests job
performance-tests:
name: Performance Tests
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v3
- name: Setup Rust
uses: actions-rs/toolchain@v1
with:
toolchain: ${{ env.RUST_VERSION }}
override: true
- name: Run performance tests
run: |
cd sdk_relay
cargo test --release --test performance_tests || true
- name: Check memory usage
run: |
# Basic memory-consumption check
echo "Performance tests completed"
# Notification job
notify:
name: Notify
runs-on: ubuntu-latest
needs: [code-quality, unit-tests, integration-tests, security-tests, docker-build, documentation-tests]
if: always()
steps:
- name: Notify success
if: needs.code-quality.result == 'success' && needs.unit-tests.result == 'success' && needs.integration-tests.result == 'success' && needs.security-tests.result == 'success' && needs.docker-build.result == 'success' && needs.documentation-tests.result == 'success'
run: |
echo "✅ All tests passed successfully!"
- name: Notify failure
if: needs.code-quality.result == 'failure' || needs.unit-tests.result == 'failure' || needs.integration-tests.result == 'failure' || needs.security-tests.result == 'failure' || needs.docker-build.result == 'failure' || needs.documentation-tests.result == 'failure'
run: |
echo "❌ Some tests failed!"
exit 1

@@ -1,36 +0,0 @@
name: Release
on:
  push:
    tags:
      - 'v*.*.*'
jobs:
  docker-release:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Setup Node
        uses: actions/setup-node@v4
        with:
          node-version: '20'
      - name: Login to DockerHub
        if: ${{ secrets.DOCKERHUB_USERNAME && secrets.DOCKERHUB_TOKEN }}
        uses: docker/login-action@v3
        with:
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}
      - name: Extract version
        id: vars
        run: echo "version=${GITHUB_REF##*/}" >> $GITHUB_OUTPUT
      - name: Build image
        run: docker build -t ${DOCKER_IMAGE:-sdk-signer}:${{ steps.vars.outputs.version }} .
      - name: Push image
        if: ${{ secrets.DOCKERHUB_USERNAME && secrets.DOCKERHUB_TOKEN }}
        run: |
          IMAGE=${DOCKER_IMAGE:-sdk-signer}
          docker tag $IMAGE:${{ steps.vars.outputs.version }} $IMAGE:latest
          docker push $IMAGE:${{ steps.vars.outputs.version }}
          docker push $IMAGE:latest
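The `Extract version` step relies on shell parameter expansion; its behavior can be checked locally with a sample ref value:

```shell
# "##*/" removes the longest prefix ending in '/', leaving only the tag name.
ref="refs/tags/v1.2.3"    # sample value of GITHUB_REF on a tag push
version="${ref##*/}"
echo "$version"           # prints: v1.2.3
```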

@@ -1,40 +0,0 @@
# .gitea/workflows/template-sync.yml — synchronization and integrity checks
name: 4NK Template Sync
on:
  schedule:               # regular scheduling
    - cron: "0 4 * * 1"   # weekly run (UTC)
  workflow_dispatch: {}   # manual trigger
jobs:
  check-and-sync:
    runs-on: linux
    steps:
      - name: Read TEMPLATE_VERSION and .4nk-sync.yml
        # Must load the current ref, source_repo, and the paths scope
      - name: Fetch the published version of template/4NK_rules
        # Must compare TEMPLATE_VERSION with the upstream ref
      - name: Create a synchronization branch on divergence
        # Must create chore/template-sync-<date> and prepare a commit
      - name: Synchronize the authoritative paths
        # Must update .cursor/**, .gitea/**, AGENTS.md, scripts/**, docs/SSH_UPDATE.md
      - name: Post-sync checks (blocking)
        # 1) Check that scripts/*.sh are present and executable
        # 2) Check that CHANGELOG.md and docs/INDEX.md were updated
        # 3) Check docs/SSH_UPDATE.md if scripts/** changed
        # 4) Check that scripts/** contains no plaintext secrets
        # 5) Check manifest_checksum if published
      - name: Tests, lint, static security
        # Must require a green state
      - name: Open the synchronization PR
        # Title: "[template-sync] chore: align .cursor/.gitea/AGENTS.md/scripts"
        # Must include a summary of the modified files and the applied version
      - name: Update TEMPLATE_VERSION (in the PR)
        # Must replace the value with the applied ref
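The divergence check sketched in the step comments needs to pick the newest upstream ref. One way to do that, assuming version-style tags (the input list here is hypothetical; in the workflow it would come from the upstream repository, e.g. via `git ls-remote --tags`):

```shell
# Highest version-style tag wins; sort -V understands that v1.10.0 > v1.9.3.
latest_tag() {
  sort -V | tail -n 1
}

current="v1.9.2"   # hypothetical pinned TEMPLATE_VERSION
latest=$(printf 'v1.9.2\nv1.10.0\nv1.9.3\n' | latest_tag)
if [ "$current" != "$latest" ]; then
  echo "divergence: $current -> $latest"   # prints: divergence: v1.9.2 -> v1.10.0
fi
```

A divergence would then trigger the `chore/template-sync-<date>` branch creation described above.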

.github/workflows/dev.yml (vendored, new file)
@@ -0,0 +1,44 @@
name: Build and Push to Registry
on:
  push:
    branches: [ dev ]
env:
  REGISTRY: git.4nkweb.com
  IMAGE_NAME: 4nk/sdk_signer
jobs:
  build-and-push:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
      - name: Set up SSH agent
        uses: webfactory/ssh-agent@v0.9.1
        with:
          ssh-private-key: ${{ secrets.SSH_PRIVATE_KEY }}
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3
      - name: Login to Container Registry
        uses: docker/login-action@v3
        with:
          registry: ${{ env.REGISTRY }}
          username: ${{ secrets.USER }}
          password: ${{ secrets.TOKEN }}
      - name: Build and push
        uses: docker/build-push-action@v5
        with:
          context: .
          push: true
          ssh: default
          build-args: |
            ENV_VARS=${{ secrets.ENV_VARS }}
          tags: |
            ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:latest
            ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:dev
            ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:${{ gitea.sha }}

@@ -4,6 +4,24 @@ All notable changes to this project are documented here.
## [Unreleased]
### Added
- Service de Background Sync pour la détection automatique des données manquantes
- Surveillance périodique des `pcd_commitment` (toutes les 30 secondes)
- API WebSocket pour contrôler le background sync (`FORCE_DATA_SCAN`, `GET_BACKGROUND_SYNC_STATUS`)
- Gestion automatique des entrées `diff` pour le tracking des données manquantes
- Récupération en deux étapes des données manquantes :
- Essai depuis les serveurs de stockage (retrieveData) en priorité
- Fallback vers les pairs (requestDataFromPeers) si non trouvé
- Extraction automatique des URLs de stockage depuis les rôles
- Scripts de test et validation du service de background sync
- Test spécifique pour la récupération depuis le storage
- Documentation complète dans `docs/BACKGROUND_SYNC.md`
### Changed
- Intégration du background sync dans le service principal
- Démarrage automatique du background sync avec le serveur
- Arrêt propre du background sync lors de l'arrêt du serveur
## [0.1.1] - 2025-08-26
- Bump version package.json à 0.1.1
- Documentation déploiement mise à jour (exemples tag)
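The new WebSocket control messages can be exercised with a small client helper. This is a sketch: the message shapes mirror the handlers in this changeset, while the endpoint, API key, and helper names are placeholders:

```typescript
// Hypothetical client-side helpers for the background-sync WebSocket API.
type ForceDataScanRequest = {
  type: 'FORCE_DATA_SCAN';
  apiKey: string;
  messageId: string;
};

type DataScanResult = {
  type: 'DATA_SCAN_RESULT';
  missingHashes: string[];
  count: number;
  messageId: string;
};

// Build the request the server's handleForceDataScan expects.
function buildForceScanRequest(apiKey: string, messageId: string): ForceDataScanRequest {
  return { type: 'FORCE_DATA_SCAN', apiKey, messageId };
}

// Narrow an incoming message to the scan-result shape.
function isDataScanResult(msg: unknown): msg is DataScanResult {
  const m = msg as DataScanResult;
  return (
    !!m &&
    m.type === 'DATA_SCAN_RESULT' &&
    Array.isArray(m.missingHashes) &&
    typeof m.count === 'number'
  );
}

// Usage with the `ws` package (placeholder URL and key):
//   ws.send(JSON.stringify(buildForceScanRequest('my-api-key', 'msg-1')));
//   ws.on('message', raw => { if (isDataScanResult(JSON.parse(raw.toString()))) { /* ... */ } });
```

A rejected API key surfaces as an error response rather than a `DATA_SCAN_RESULT`, so the type guard doubles as a success check.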

Dockerfile

@ -1,35 +1,56 @@
FROM node:20-alpine AS base
# syntax=docker/dockerfile:1.4
FROM rust:1.82-alpine AS wasm-builder
WORKDIR /build
# Install production dependencies only by default
ENV NODE_ENV=production
# Install the dependencies required for compilation
RUN apk update && apk add --no-cache \
git \
openssh-client \
curl \
nodejs \
npm \
build-base \
pkgconfig \
clang \
llvm \
musl-dev \
nginx
# Install wasm-pack
RUN curl https://rustwasm.github.io/wasm-pack/installer/init.sh -sSf | sh
# Basic SSH configuration
RUN mkdir -p /root/.ssh && \
ssh-keyscan git.4nkweb.com >> /root/.ssh/known_hosts
# Move into the parent build directory
WORKDIR /build
# Copy the sdk_signer project
COPY . sdk_signer/
# Clone sdk_client next to sdk_signer using the mounted SSH key
RUN --mount=type=ssh git clone -b dev ssh://git@git.4nkweb.com/4nk/sdk_client.git
# Build the WebAssembly with SSH access for the dependencies
WORKDIR /build/sdk_client
RUN --mount=type=ssh wasm-pack build --out-dir ../sdk_signer/pkg --target nodejs --dev
FROM node:20-alpine
WORKDIR /app
# Install build dependencies
FROM base AS deps
ENV NODE_ENV=development
RUN apk add --no-cache python3 make g++
COPY package.json package-lock.json* ./
RUN npm ci
# Install the required dependencies
RUN apk update && apk add --no-cache git
# Copy the project files
COPY --from=wasm-builder /build/sdk_signer/pkg ./pkg
COPY . .
# Install the Node.js dependencies
RUN npm install
# Build TypeScript
FROM deps AS build
COPY tsconfig.json ./
COPY src ./src
COPY pkg ./pkg
RUN npm run build
# Runtime image
FROM base AS runner
WORKDIR /app
ENV NODE_ENV=production
RUN addgroup -S nodejs && adduser -S nodejs -G nodejs
COPY --from=deps /app/node_modules ./node_modules
COPY --from=build /app/dist ./dist
COPY --from=build /app/pkg ./pkg
EXPOSE 9090
USER nodejs
CMD ["node", "dist/index.js"]
CMD ["npm", "start"]

src/background-sync.service.ts (new file)

@ -0,0 +1,352 @@
// Background monitoring service for Node.js
// Equivalent of the service worker for detecting missing data
import { Service } from './service';
import Database from './database.service';
import { config } from './config';
import { retrieveData } from './storage.service';
export class BackgroundSyncService {
private static instance: BackgroundSyncService;
private service: Service;
private db: Database | null = null;
private scanInterval: NodeJS.Timeout | null = null;
private isRunning: boolean = false;
private readonly SCAN_INTERVAL_MS = 30000; // 30 seconds
private readonly EMPTY32BYTES = '0'.repeat(64);
private constructor() {
this.service = Service.getInstance();
console.log('🔧 BackgroundSyncService initialized');
}
static async getInstance(): Promise<BackgroundSyncService> {
if (!BackgroundSyncService.instance) {
BackgroundSyncService.instance = new BackgroundSyncService();
await BackgroundSyncService.instance.init();
}
return BackgroundSyncService.instance;
}
private async init(): Promise<void> {
this.db = await Database.getInstance();
}
/**
* Starts the background monitoring service
*/
public async start(): Promise<void> {
if (this.isRunning) {
console.log('⚠️ BackgroundSyncService already running');
return;
}
console.log('🚀 Starting background sync service...');
this.isRunning = true;
// Immediate scan at startup
await this.scanMissingData();
// Then scan periodically
this.scanInterval = setInterval(async () => {
try {
await this.scanMissingData();
} catch (error) {
console.error('❌ Error in background scan:', error);
}
}, this.SCAN_INTERVAL_MS);
console.log('✅ Background sync service started');
}
/**
* Stops the monitoring service
*/
public stop(): void {
if (!this.isRunning) {
return;
}
console.log('🛑 Stopping background sync service...');
this.isRunning = false;
if (this.scanInterval) {
clearInterval(this.scanInterval);
this.scanInterval = null;
}
console.log('✅ Background sync service stopped');
}
/**
* Manual scan for missing data (equivalent to the service worker's scanMissingData)
*/
public async scanMissingData(): Promise<string[]> {
console.log('🔍 Scanning for missing data...');
try {
const myProcesses = await this.getMyProcesses();
if (!myProcesses || myProcesses.length === 0) {
console.log('No processes to scan');
return [];
}
const toDownload = new Set<string>();
for (const processId of myProcesses) {
const process = await this.service.getProcess(processId);
if (!process) continue;
for (const state of process.states) {
if (state.state_id === this.EMPTY32BYTES) continue;
// Check each pcd_commitment
for (const [field, hash] of Object.entries(state.pcd_commitment)) {
// Skip public fields
if (state.public_data[field] !== undefined || field === 'roles') continue;
// Make sure hash is a string
if (typeof hash !== 'string') continue;
// Check whether we already have the data
const existingData = await this.getBlob(hash);
if (!existingData) {
toDownload.add(hash);
// Add a diff entry if one does not exist yet
await this.addDiff(processId, state.state_id, hash, state.roles, field);
}
}
}
}
const missingHashes = Array.from(toDownload);
if (missingHashes.length > 0) {
console.log(`📥 Found ${missingHashes.length} missing data hashes:`, missingHashes);
await this.requestMissingData(missingHashes);
} else {
console.log('✅ No missing data found');
}
return missingHashes;
} catch (error) {
console.error('❌ Error scanning missing data:', error);
throw error;
}
}
/**
* Retrieves the user's processes
*/
private async getMyProcesses(): Promise<string[]> {
try {
return await this.service.getMyProcesses() || [];
} catch (error) {
console.error('Error getting my processes:', error);
return [];
}
}
/**
* Checks whether a blob exists in the database
*/
private async getBlob(hash: string): Promise<Buffer | null> {
try {
return await this.service.getBufferFromDb(hash);
} catch (error) {
return null;
}
}
/**
* Adds a diff entry for tracking
*/
private async addDiff(
processId: string,
stateId: string,
hash: string,
roles: any,
field: string
): Promise<void> {
try {
if (!this.db) {
console.error('Database not initialized');
return;
}
const existingDiff = await this.db.getObject('diffs', hash);
if (!existingDiff) {
const newDiff = {
process_id: processId,
state_id: stateId,
value_commitment: hash,
roles: roles,
field: field,
description: null,
previous_value: null,
new_value: null,
notify_user: false,
need_validation: false,
validation_status: 'None'
};
await this.db.addObject({
storeName: 'diffs',
object: newDiff,
key: hash
});
console.log(`📝 Added diff entry for hash: ${hash}`);
}
} catch (error) {
console.error('Error adding diff:', error);
}
}
/**
* Requests missing data from peers
*/
private async requestMissingData(hashes: string[]): Promise<void> {
try {
console.log('🔄 Requesting missing data from peers...');
// Fetch all processes to determine the roles
const myProcesses = await this.getMyProcesses();
const processesToRequest: Record<string, any> = {};
for (const processId of myProcesses) {
const process = await this.service.getProcess(processId);
if (process) {
processesToRequest[processId] = process;
}
}
// For each missing hash, try to retrieve it
for (const hash of hashes) {
try {
// Find the matching diff
const diffs = await this.service.getDiffsFromDb();
const diff = Object.values(diffs).find((d: any) => d.value_commitment === hash);
if (diff) {
const processId = diff.process_id;
const stateId = diff.state_id;
const roles = diff.roles;
if (processesToRequest[processId]) {
// First, try to retrieve from the storage servers
const retrievedFromStorage = await this.tryRetrieveFromStorage(hash, roles);
if (retrievedFromStorage) {
console.log(`✅ Data retrieved from storage for hash ${hash}`);
// Save the retrieved data to the database
await this.service.saveBufferToDb(hash, Buffer.from(retrievedFromStorage));
continue; // Move on to the next hash
}
// If not found in storage, ask the peers
console.log(`🔄 Requesting data for hash ${hash} from process ${processId}`);
await this.service.requestDataFromPeers(processId, [stateId], [roles]);
}
}
} catch (error) {
console.error(`Error requesting data for hash ${hash}:`, error);
}
}
} catch (error) {
console.error('Error requesting missing data:', error);
}
}
/**
* Tries to retrieve the data from the storage servers
*/
private async tryRetrieveFromStorage(hash: string, roles: any): Promise<ArrayBuffer | null> {
try {
// Extract the storage URLs from the roles
const storageUrls = this.extractStorageUrls(roles);
if (storageUrls.length === 0) {
console.log(`No storage URLs found for hash ${hash}`);
return null;
}
console.log(`🔍 Trying to retrieve hash ${hash} from storage servers:`, storageUrls);
// Try to retrieve from the storage servers
const data = await retrieveData(storageUrls, hash);
if (data) {
console.log(`✅ Successfully retrieved data for hash ${hash} from storage`);
return data;
} else {
console.log(`❌ Data not found in storage for hash ${hash}`);
return null;
}
} catch (error) {
console.error(`Error retrieving data from storage for hash ${hash}:`, error);
return null;
}
}
/**
* Extracts the storage URLs from the roles
*/
private extractStorageUrls(roles: any): string[] {
const storageUrls = new Set<string>();
try {
if (roles && typeof roles === 'object') {
for (const role of Object.values(roles)) {
if (role && typeof role === 'object' && 'storages' in role && Array.isArray((role as any).storages)) {
for (const storageUrl of (role as any).storages) {
if (typeof storageUrl === 'string' && storageUrl.trim()) {
storageUrls.add(storageUrl.trim());
}
}
}
}
}
} catch (error) {
console.error('Error extracting storage URLs from roles:', error);
}
return Array.from(storageUrls);
}
/**
* Forces an immediate scan (for tests or manual use)
*/
public async forceScan(): Promise<string[]> {
console.log('🔄 Forcing immediate scan...');
return await this.scanMissingData();
}
/**
* Gets the service status
*/
public getStatus(): { isRunning: boolean; scanInterval: number } {
return {
isRunning: this.isRunning,
scanInterval: this.SCAN_INTERVAL_MS
};
}
/**
* Updates the scan interval
*/
public setScanInterval(intervalMs: number): void {
if (this.isRunning && this.scanInterval) {
clearInterval(this.scanInterval);
this.scanInterval = setInterval(async () => {
try {
await this.scanMissingData();
} catch (error) {
console.error('❌ Error in background scan:', error);
}
}, intervalMs);
}
}
}


@ -37,6 +37,11 @@ export enum MessageType {
// Account management
ADD_DEVICE = 'ADD_DEVICE',
DEVICE_ADDED = 'DEVICE_ADDED',
// Background sync
FORCE_DATA_SCAN = 'FORCE_DATA_SCAN',
DATA_SCAN_RESULT = 'DATA_SCAN_RESULT',
GET_BACKGROUND_SYNC_STATUS = 'GET_BACKGROUND_SYNC_STATUS',
BACKGROUND_SYNC_STATUS = 'BACKGROUND_SYNC_STATUS',
}
// Re-export AnkFlag from WASM for relay message typing


@ -309,7 +309,7 @@ export class RelayManager {
}
// Relay Message Handling
private handleRelayMessage(relayId: string, message: any): void {
private async handleRelayMessage(relayId: string, message: any): Promise<void> {
console.log(`📨 Received message from relay ${relayId}:`);
if (message.flag === 'Handshake') {
@ -321,7 +321,7 @@ export class RelayManager {
// Handle different types of relay responses
if (message.flag) {
// Handle protocol-specific responses
this.handleProtocolMessage(relayId, message);
await this.handleProtocolMessage(relayId, message);
} else if (message.type === 'heartbeat') {
// Update heartbeat
const relay = this.relays.get(relayId);
@ -337,14 +337,16 @@ export class RelayManager {
}
}
private handleProtocolMessage(relayId: string, message: any): void {
private async handleProtocolMessage(relayId: string, message: any): Promise<void> {
// Handle different AnkFlag responses
switch (message.flag) {
case "NewTx":
console.log(`📨 NewTx response from relay ${relayId}`);
setImmediate(() => {
Service.getInstance().parseNewTx(message.content);
});
try {
await Service.getInstance().parseNewTx(message.content);
} catch (error) {
console.error(`❌ Error parsing NewTx from relay ${relayId}:`, error);
}
break;
case "Commit":
console.log(`📨 Commit response from relay ${relayId}`);
@ -353,9 +355,11 @@ export class RelayManager {
break;
case "Cipher":
console.log(`📨 Cipher response from relay ${relayId}`);
setImmediate(() => {
Service.getInstance().parseCipher(message.content);
});
try {
await Service.getInstance().parseCipher(message.content);
} catch (error) {
console.error(`❌ Error parsing Cipher from relay ${relayId}:`, error);
}
break;
case "Handshake":
console.log(`📨 Handshake response from relay ${relayId}`);

src/service.ts

@ -16,6 +16,7 @@ export class Service {
private membersList: any = {};
private relayManager: RelayManager;
private storages: string[] = []; // storage urls
private backgroundSync: any = null; // BackgroundSyncService
private constructor() {
console.log('🔧 Service initialized');
@ -1158,8 +1159,12 @@ export class Service {
}
if (apiReturn.new_tx_to_send && apiReturn.new_tx_to_send.transaction.length != 0) {
await this.sendNewTxMessage(JSON.stringify(apiReturn.new_tx_to_send));
await new Promise(r => setTimeout(r, 500));
try {
await this.sendNewTxMessage(JSON.stringify(apiReturn.new_tx_to_send));
await new Promise(r => setTimeout(r, 500));
} catch (error) {
console.error('❌ Error sending NewTx message:', error);
}
}
if (apiReturn.secrets) {
@ -1244,11 +1249,19 @@ export class Service {
if (apiReturn.commit_to_send) {
const commit = apiReturn.commit_to_send;
await this.sendCommitMessage(JSON.stringify(commit));
try {
await this.sendCommitMessage(JSON.stringify(commit));
} catch (error) {
console.error('❌ Error sending Commit message:', error);
}
}
if (apiReturn.ciphers_to_send && apiReturn.ciphers_to_send.length != 0) {
await this.sendCipherMessages(apiReturn.ciphers_to_send);
try {
await this.sendCipherMessages(apiReturn.ciphers_to_send);
} catch (error) {
console.error('❌ Error sending Cipher messages:', error);
}
}
}
@ -1448,4 +1461,54 @@ export class Service {
throw new Error(`Failed to dump device: ${e}`);
}
}
/**
* Starts the background monitoring service
*/
public async startBackgroundSync(): Promise<void> {
if (!this.backgroundSync) {
const { BackgroundSyncService } = await import('./background-sync.service');
this.backgroundSync = await BackgroundSyncService.getInstance();
}
await this.backgroundSync.start();
}
/**
* Stops the monitoring service
*/
public stopBackgroundSync(): void {
if (this.backgroundSync) {
this.backgroundSync.stop();
}
}
/**
* Forces a manual scan for missing data
*/
public async forceDataScan(): Promise<string[]> {
if (!this.backgroundSync) {
const { BackgroundSyncService } = await import('./background-sync.service');
this.backgroundSync = await BackgroundSyncService.getInstance();
}
return await this.backgroundSync.forceScan();
}
/**
* Gets the background sync service status
*/
public getBackgroundSyncStatus(): { isRunning: boolean; scanInterval: number } | null {
if (!this.backgroundSync) {
return null;
}
return this.backgroundSync.getStatus();
}
/**
* Configures the background sync scan interval
*/
public setBackgroundSyncInterval(intervalMs: number): void {
if (this.backgroundSync) {
this.backgroundSync.setScanInterval(intervalMs);
}
}
}


@ -321,6 +321,57 @@ class SimpleProcessHandlers {
}
}
async handleForceDataScan(event: ServerMessageEvent): Promise<ServerResponse> {
if (event.data.type !== MessageType.FORCE_DATA_SCAN) {
throw new Error('Invalid message type');
}
try {
const { apiKey } = event.data;
if (!apiKey || !this.validateApiKey(apiKey)) {
throw new Error('Invalid API key');
}
const missingHashes = await this.service.forceDataScan();
return {
type: MessageType.DATA_SCAN_RESULT,
missingHashes,
count: missingHashes.length,
messageId: event.data.messageId
};
} catch (e) {
const errorMessage = e instanceof Error ? e.message : String(e || 'Unknown error');
throw new Error(errorMessage);
}
}
async handleGetBackgroundSyncStatus(event: ServerMessageEvent): Promise<ServerResponse> {
if (event.data.type !== MessageType.GET_BACKGROUND_SYNC_STATUS) {
throw new Error('Invalid message type');
}
try {
const { apiKey } = event.data;
if (!apiKey || !this.validateApiKey(apiKey)) {
throw new Error('Invalid API key');
}
const status = this.service.getBackgroundSyncStatus();
return {
type: MessageType.BACKGROUND_SYNC_STATUS,
status,
messageId: event.data.messageId
};
} catch (e) {
const errorMessage = e instanceof Error ? e.message : String(e || 'Unknown error');
throw new Error(errorMessage);
}
}
async handleMessage(event: ServerMessageEvent): Promise<ServerResponse> {
try {
switch (event.data.type) {
@ -336,6 +387,10 @@ class SimpleProcessHandlers {
return await this.handleGetMyProcesses(event);
case MessageType.GET_PAIRING_ID:
return await this.handleGetPairingId(event);
case MessageType.FORCE_DATA_SCAN:
return await this.handleForceDataScan(event);
case MessageType.GET_BACKGROUND_SYNC_STATUS:
return await this.handleGetBackgroundSyncStatus(event);
default:
throw new Error(`Unhandled message type: ${event.data.type}`);
}
@ -433,10 +488,20 @@ export class Server {
// Connect to relays
await service.connectToRelaysAndWaitForHandshake();
// Start background sync service for missing data detection
try {
await service.startBackgroundSync();
console.log('🔄 Background sync service started');
} catch (error) {
console.error('❌ Failed to start background sync service:', error);
// Don't exit, continue without background sync
}
console.log(`✅ Simple server running on port ${this.wss.options.port}`);
console.log('📋 Supported operations: UPDATE_PROCESS, NOTIFY_UPDATE, VALIDATE_STATE');
console.log('🔑 Authentication: API key required for all operations');
console.log('🔧 Services: Integrated with SimpleService protocol logic');
console.log('🔄 Background sync: Automatic missing data detection enabled');
} catch (error) {
console.error('❌ Failed to initialize server:', error);
@ -509,6 +574,15 @@ export class Server {
public shutdown() {
console.log('🛑 Shutting down server...');
// Stop background sync service
try {
const service = Service.getInstance();
service.stopBackgroundSync();
console.log('🔄 Background sync service stopped');
} catch (error) {
console.error('❌ Error stopping background sync service:', error);
}
// Close all active client connections first
for (const [ws, clientId] of this.clients.entries()) {
console.log(`🔌 Closing connection to ${clientId}...`);

src/storage.service.ts

@ -50,11 +50,24 @@ export async function retrieveData(servers: string[], key: string): Promise<Arra
});
if (response.status === 200) {
// Validate that we received an ArrayBuffer
// Handle both ArrayBuffer and Buffer (Node.js)
if (response.data instanceof ArrayBuffer) {
return response.data;
} else if (Buffer.isBuffer(response.data)) {
// Convert Buffer to ArrayBuffer
return response.data.buffer.slice(
response.data.byteOffset,
response.data.byteOffset + response.data.byteLength
);
} else if (response.data && typeof response.data === 'object' && 'buffer' in response.data) {
// Handle Uint8Array or similar typed arrays
const buffer = response.data.buffer;
return buffer.slice(
response.data.byteOffset,
response.data.byteOffset + response.data.byteLength
);
} else {
console.error('Server returned non-ArrayBuffer data:', typeof response.data);
console.error('Server returned unsupported data type:', typeof response.data, response.data?.constructor?.name);
continue;
}
} else {