lint fix wip

parent 7fdd71278e
commit b2e6250ff5

@@ -456,6 +456,78 @@ No library may be introduced if an equivalent capability already exists
Names, interfaces, types, and contracts must be stable, explicit, and intention-oriented.
### Software architecture and separation of responsibilities

The project architecture follows a strict model of separation of responsibilities and layer isolation. Every change must respect these fundamental architectural principles.

#### Data persistence

* **IndexedDB as the local source of truth**: All data is persisted in IndexedDB on the user's side. IndexedDB is the local source of truth for all application data.

* **Read-only IndexedDB for the interface**: The user interface (React components, pages) only reads from IndexedDB. No direct write to IndexedDB may be performed from the user interface.

* **Writes through the Nostr network**: All data writes and updates go through the Nostr network. The interface must never write directly to IndexedDB.

* **Publication status**: Every database write (IndexedDB) must include a `published` status so that publication (ok/ko) can be handled later. This status tracks the synchronization and publication state of the data.
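A minimal sketch of what the `published` status could look like when a record is prepared for persistence. The `PersistedRecord` shape and the `withPublishStatus` helper are illustrative assumptions, not the project's actual types:

```typescript
// Illustrative only: the real record shape and helper names are assumptions.
type PublishState = 'pending' | 'ok' | 'ko'

interface PersistedRecord<T> {
  data: T
  published: PublishState // lets a retry worker find unpublished rows later
  updatedAt: number
}

function withPublishStatus<T>(data: T, state: PublishState = 'pending'): PersistedRecord<T> {
  return { data, published: state, updatedAt: Date.now() }
}

// A background retry worker would then scan for rows whose status is not 'ok'.
const record = withPublishStatus({ id: 'article-1', title: 'Hello' })
```

The point of the wrapper is that no write can reach IndexedDB without a publication state attached.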
#### Service Workers and background tasks

* **Service Workers for background tasks**: All background work on data is performed with Service Workers. The Service Workers handle synchronization, updates, and asynchronous data processing.

* **Service Worker for notifications**: A Service Worker feeds the notifications table. It watches the events and updates IndexedDB with the new notifications.

* **Independent change detection**: The notification Service Worker detects changes on its own. It does not depend on the write service and receives no direct events from it. It watches IndexedDB and detects new notifications by itself.

* **Notification reader service**: Notifications are read by a dedicated service that detects new notifications in the database (IndexedDB). This service is separate from the Service Worker that feeds the notifications.
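The independent detection described above can be sketched as a cursor-based scan: the detector remembers the timestamp of the last event it has seen and reports anything newer, without receiving any message from the write service. The `EventRow` shape is an assumption; the real scan reads IndexedDB rather than an in-memory array:

```typescript
// Illustrative cursor-based detector; the real implementation polls IndexedDB.
interface EventRow {
  id: string
  createdAt: number
}

class ChangeDetector {
  private lastSeen = 0

  // Called on each poll tick; returns only the rows that appeared since the last tick.
  detect(rows: EventRow[]): EventRow[] {
    const fresh = rows.filter((r) => r.createdAt > this.lastSeen)
    if (fresh.length > 0) {
      this.lastSeen = Math.max(...fresh.map((r) => r.createdAt))
    }
    return fresh
  }
}
```

Because the cursor lives inside the detector, the writer never needs to signal it: anything committed to the store is picked up on the next tick.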
#### Web Workers and writes

* **Write Web Worker**: A dedicated Web Worker handles all writes and modifications in IndexedDB. The user interface must never write directly to IndexedDB; it must go through this Web Worker.

* **Data access**: The write Web Worker receives the complete data, not just the changes. Partial or differential data must not be used.

* **Multi-table transactions**: The Web Worker handles multi-table writes through several transactions. The transaction-splitting logic lives in the Web Worker, not in the caller.

* **Synchronization and conflict handling**: The Web Worker is designed to manage a queue of writes. It processes writes sequentially to avoid conflicts when several writes arrive at the same time. No write may bypass this queue.

* **Write management service**: A service handles the writes/updates to the WebSockets/API and to the write Web Worker. This service orchestrates the communication between the interface, the network (Nostr/WebSockets), and the write Web Worker.

* **Orchestration**: The write management service coordinates the WebSockets → Web Worker → IndexedDB flow. It orchestrates network and local writes independently.

* **Operation ordering**: Network and local writes are performed in parallel and independently. No ordering is imposed between the network write and the local write.

* **Error handling**: If the network fails but the local write succeeds, no immediate action is required. Another Service Worker will retry the publication later, based on the `published` status of the data in IndexedDB.
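The sequential write queue can be sketched as a promise chain: every write waits for the previous one, so concurrent callers can never interleave their transactions. The names here are assumptions; in the real worker the `write` function would be the IndexedDB transaction itself:

```typescript
// Illustrative write queue: real code would run inside the write Web Worker.
type WriteFn<T> = (item: T) => Promise<void>

class WriteQueue<T> {
  private tail: Promise<void> = Promise.resolve()
  private readonly write: WriteFn<T>

  constructor(write: WriteFn<T>) {
    this.write = write
  }

  // Serialize every write behind the previous one.
  enqueue(item: T): Promise<void> {
    const next = this.tail.then(() => this.write(item))
    // Keep the chain alive even if one write rejects.
    this.tail = next.catch(() => undefined)
    return next
  }
}
```

Callers still get a per-item promise back, so the interface can await its own write while the queue guarantees global ordering.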
#### WebSockets and communication

* **WebSocket management service**: The WebSockets are managed by a dedicated service that communicates with the Service Workers. This service isolates the WebSocket communication logic and centralizes connection management.

* **Primary and secondary roles**: The WebSocket service's primary role is connection management. Its secondary roles include message routing and connection-state management.

* **Communication with Service Workers**: Communication between the WebSocket service and the Service Workers goes through `postMessage`. No other communication mechanism may be used.

* **Reconnection handling**: The WebSocket service must handle automatic reconnections and maintain the connection state. Reconnections must be transparent to the Service Workers.

* **Separation of responsibilities**: The WebSocket service must not write directly to IndexedDB. It communicates with the Service Workers, which handle persistence.
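The `postMessage`-only contract is easiest to keep when both sides share a typed message envelope. A minimal sketch, where the message types and the router are assumptions rather than the project's real protocol:

```typescript
// Illustrative message envelope for WebSocket service ↔ Service Worker traffic.
type WsMessage =
  | { type: 'ws-event'; relay: string; payload: string }
  | { type: 'ws-status'; relay: string; connected: boolean }

type Handler = (msg: WsMessage) => void

// Tiny dispatcher: the Service Worker side registers one handler per message
// type, mirroring dispatch inside a `message` event listener.
function makeRouter(handlers: Record<WsMessage['type'], Handler>): Handler {
  return (msg) => handlers[msg.type](msg)
}
```

Because every payload is a plain serializable object, the same envelope works unchanged across the `postMessage` boundary.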
#### Architectural compliance principles

Every change must respect these principles:

* **No direct IndexedDB writes from the interface**: The user interface must never write directly to IndexedDB. All writes go through the Nostr network and the write Web Worker.

* **Strict layer separation**: The layers must remain isolated:

  * User interface (React): reads IndexedDB only
  * Write management service: orchestrates network and Web Worker writes
  * Write Web Worker: writes to IndexedDB
  * Service Workers: background tasks, synchronization, notifications
  * WebSocket service: network communication with the Service Workers

* **Mandatory publication status**: Every database write must include a `published` status for publication tracking.

* **No workarounds**: No workaround of this architecture may be introduced, even for special cases. Exceptions must be handled within this architecture.
### Mandatory validation and caution

Before any implementation or modification, certain critical decisions must be explicitly validated by the user.
@@ -487,7 +559,8 @@ This applies to:
* **API contracts**: Changes to the data structures exchanged through the API
* **Storage formats**: Changes to file formats, JSON structures, validation schemas

**Every data-model change must be documented with:**

* The impact on existing data
* The required migrations
* The regression risks
@@ -24,7 +24,7 @@ function SuccessMessage(): React.ReactElement {
 export function ArticleEditor({ onPublishSuccess, onCancel, seriesOptions, onSelectSeries, defaultSeriesId }: ArticleEditorProps): React.ReactElement {
   const { connected, pubkey, connect } = useNostrAuth()
-  const { loading, error, success, publishArticle } = useArticlePublishing(pubkey ?? null)
+  const { loading, error, success, relayStatuses, publishArticle } = useArticlePublishing(pubkey ?? null)
   const [draft, setDraft] = useState<ArticleDraft>({
     title: '',
     preview: '',
@@ -50,6 +50,7 @@ export function ArticleEditor({ onPublishSuccess, onCancel, seriesOptions, onSel
         }}
         loading={loading}
         error={error}
+        relayStatuses={relayStatuses}
         {...(onCancel ? { onCancel } : {})}
         {...(seriesOptions ? { seriesOptions } : {})}
         {...(onSelectSeries ? { onSelectSeries } : {})}
@@ -9,12 +9,15 @@ import { MarkdownEditorTwoColumns } from './MarkdownEditorTwoColumns'
 import type { MediaRef } from '@/types/nostr'
 import { t } from '@/lib/i18n'
+
+import type { RelayPublishStatus } from '@/lib/publishResult'
 
 interface ArticleEditorFormProps {
   draft: ArticleDraft
   onDraftChange: (draft: ArticleDraft) => void
   onSubmit: (e: React.FormEvent) => void
   loading: boolean
   error: string | null
+  relayStatuses?: RelayPublishStatus[]
   onCancel?: () => void
   seriesOptions?: { id: string; title: string }[] | undefined
   onSelectSeries?: ((seriesId: string | undefined) => void) | undefined
@@ -249,6 +252,7 @@ export function ArticleEditorForm({
   onSubmit,
   loading,
   error,
+  relayStatuses,
   onCancel,
   seriesOptions,
   onSelectSeries,
@@ -266,7 +270,7 @@ export function ArticleEditorForm({
         <ArticleFieldsRight draft={draft} onDraftChange={onDraftChange} />
       </div>
       <ErrorAlert error={error} />
-      <ArticleFormButtons loading={loading} {...(onCancel ? { onCancel } : {})} />
+      <ArticleFormButtons loading={loading} relayStatuses={relayStatuses} {...(onCancel ? { onCancel } : {})} />
     </form>
   )
 }
@@ -2,28 +2,31 @@ import { t } from '@/lib/i18n'
 
 interface ArticleFormButtonsProps {
   loading: boolean
+  relayStatuses?: unknown // Kept for backward compatibility but not displayed
   onCancel?: () => void
 }
 
 export function ArticleFormButtons({ loading, onCancel }: ArticleFormButtonsProps): React.ReactElement {
   return (
-    <div className="flex gap-3 pt-4">
-      <button
-        type="submit"
-        disabled={loading}
-        className="flex-1 px-4 py-2 bg-neon-cyan/20 hover:bg-neon-cyan/30 text-neon-cyan rounded-lg font-medium transition-all border border-neon-cyan/50 hover:shadow-glow-cyan disabled:opacity-50 disabled:cursor-not-allowed"
-      >
-        {loading ? t('publish.publishing') : t('publish.button')}
-      </button>
-      {onCancel && (
+    <div className="space-y-3 pt-4">
+      <div className="flex gap-3">
         <button
-          type="button"
-          onClick={onCancel}
-          className="px-4 py-2 bg-cyber-dark hover:bg-cyber-dark/80 text-cyber-accent rounded-lg font-medium transition-colors border border-cyber-accent/30 hover:border-neon-cyan/50"
+          type="submit"
+          disabled={loading}
+          className="flex-1 px-4 py-2 bg-neon-cyan/20 hover:bg-neon-cyan/30 text-neon-cyan rounded-lg font-medium transition-all border border-neon-cyan/50 hover:shadow-glow-cyan disabled:opacity-50 disabled:cursor-not-allowed"
         >
-          {t('common.back')}
+          {loading ? t('publish.publishing') : t('publish.button')}
         </button>
-      )}
+        {onCancel && (
+          <button
+            type="button"
+            onClick={onCancel}
+            className="px-4 py-2 bg-cyber-dark hover:bg-cyber-dark/80 text-cyber-accent rounded-lg font-medium transition-colors border border-cyber-accent/30 hover:border-neon-cyan/50"
+          >
+            {t('common.back')}
+          </button>
+        )}
+      </div>
     </div>
   )
 }
@@ -59,7 +59,7 @@ export function KeyManagementManager(): React.ReactElement {
       const urlObj = new URL(url)
       // Check if it's a nostr:// URL with nsec
       if (urlObj.protocol === 'nostr:' || urlObj.protocol === 'nostr://') {
-        const path = urlObj.pathname || urlObj.href.replace(/^nostr:?\/\//, '')
+        const path = urlObj.pathname ?? urlObj.href.replace(/^nostr:?\/\//, '')
         if (path.startsWith('nsec')) {
           return path
         }
@@ -329,7 +329,7 @@ export function KeyManagementManager(): React.ReactElement {
       </button>
       <button
         onClick={() => {
-          void performImport(extractKeyFromUrl(importKey.trim()) || importKey.trim())
+          void performImport(extractKeyFromUrl(importKey.trim()) ?? importKey.trim())
         }}
         disabled={importing}
         className="flex-1 py-2 px-4 bg-red-600/20 hover:bg-red-600/30 text-red-400 rounded-lg font-medium transition-all border border-red-400/50 hover:shadow-glow-red disabled:opacity-50"
@@ -26,9 +26,12 @@ export function NotificationBadgeButton({ unreadCount, onClick }: NotificationBa
         />
       </svg>
       {unreadCount > 0 && (
-        <span className="absolute top-0 right-0 inline-flex items-center justify-center px-2 py-1 text-xs font-bold leading-none text-white transform translate-x-1/2 -translate-y-1/2 bg-red-600 rounded-full">
-          {unreadCount > 99 ? '99+' : unreadCount}
-        </span>
+        <>
+          <span className="absolute top-0 right-0 w-2 h-2 bg-orange-500 rounded-full transform translate-x-1/2 -translate-y-1/2" />
+          <span className="absolute top-0 right-0 inline-flex items-center justify-center px-2 py-1 text-xs font-bold leading-none text-white transform translate-x-1/2 -translate-y-1/2 bg-red-600 rounded-full">
+            {unreadCount > 99 ? '99+' : unreadCount}
+          </span>
+        </>
       )}
     </button>
   )
@@ -1,5 +1,5 @@
 import Link from 'next/link'
-import type { Notification } from '@/types/notifications'
+import type { Notification } from '@/lib/notificationService'
 
 interface NotificationContentProps {
   notification: Notification
@@ -1,4 +1,4 @@
-import type { Notification } from '@/types/notifications'
+import type { Notification } from '@/lib/notificationService'
 import { NotificationContent } from './NotificationContent'
 import { NotificationActions } from './NotificationActions'
@@ -19,11 +19,14 @@ export function NotificationItem({
 
   return (
     <div
-      className={`p-4 hover:bg-gray-50 transition-colors cursor-pointer ${
+      className={`p-4 hover:bg-gray-50 transition-colors cursor-pointer relative ${
         !notification.read ? 'bg-blue-50' : ''
       }`}
      onClick={() => onNotificationClick(notification)}
     >
+      {!notification.read && (
+        <div className="absolute top-2 right-2 w-2 h-2 bg-orange-500 rounded-full" />
+      )}
       <div className="flex items-start justify-between">
         <NotificationContent notification={notification} />
         <NotificationActions timestamp={notification.timestamp} onDelete={handleDelete} />
@@ -1,4 +1,4 @@
-import type { Notification } from '@/types/notifications'
+import type { Notification } from '@/lib/notificationService'
 import { NotificationItem } from './NotificationItem'
 import { NotificationPanelHeader } from './NotificationPanelHeader'
 import { t } from '@/lib/i18n'
@@ -3,7 +3,6 @@ import { ConditionalPublishButton } from './ConditionalPublishButton'
 import { LanguageSelector } from './LanguageSelector'
 import { t } from '@/lib/i18n'
 import { KeyIndicator } from './KeyIndicator'
-import { SyncStatus } from './GlobalSyncProgressBar'
 
 function GitIcon(): React.ReactElement {
   return (
@@ -90,7 +89,6 @@ export function PageHeader(): React.ReactElement {
         <GitIcon />
       </a>
       <KeyIndicator />
-      <SyncStatus />
     </div>
     <div className="flex items-center gap-4">
       <LanguageSelector />
@@ -377,6 +377,11 @@ export function RelayManager({ onConfigChange }: RelayManagerProps): React.React
           </div>
         )}
       </div>
+      {relay.lastSyncDate && (
+        <div className="text-xs text-cyber-accent/70 mt-1">
+          {t('settings.relay.list.lastSync')}: {new Date(relay.lastSyncDate).toLocaleString()}
+        </div>
+      )}
     </div>
     <div className="flex items-center gap-2">
       <label className="flex items-center gap-2 cursor-pointer">
docs/architecture-implementation-status.md — new file, 105 lines
@@ -0,0 +1,105 @@
# Architecture implementation status

**Author**: Équipe 4NK

**Date**: 2025-01-27

## Overview

This document describes the current implementation state of the architecture principles and the migrations still required.

## Services created

### ✅ Services that comply with the principles

1. **`lib/websocketService.ts`** - WebSocket management service
   - Manages the WebSocket connections
   - Communicates with the Service Workers through `swClient`
   - ✅ Compliant with the principles

2. **`lib/writeOrchestrator.ts`** - Write management service
   - Orchestrates writes to the WebSockets/API and to the write Web Worker
   - ✅ Compliant with the principles

3. **`lib/writeService.ts`** - Write Web Worker management service
   - Already exists and routes writes to the Web Worker
   - ⚠️ The `writeWorker.js` file must be created and adapted for Next.js

4. **`lib/notificationReader.ts`** - Notification reader service
   - Detects new notifications in IndexedDB
   - Separate from the Service Worker that feeds the notifications
   - ✅ Compliant with the principles

5. **`lib/notificationService.ts`** - Notification management service
   - Stores the notifications in IndexedDB
   - ✅ Compliant with the principles

6. **`lib/notificationDetector.ts`** - Notification detector
   - Scans IndexedDB to detect new events
   - ⚠️ Must be integrated into the Service Worker that feeds the notifications

## Required migrations

### 🔄 Priority migrations

1. **Create and adapt `writeWorker.js`**
   - The `public/writeWorker.js` file exists but must be adapted for Next.js
   - Web Workers in Next.js require special configuration
   - **Action**: Adapt the worker to run under Next.js, or use an alternative approach

2. **Migrate direct writes to `writeService`**
   - Several services call `objectCache.set` and `objectCache.updatePublished` directly
   - **Affected files**:
     - `lib/platformSync.ts` - platform synchronization
     - `lib/userContentSync.ts` - user-content synchronization
     - `lib/articlePublisher.ts` - article publishing
     - `lib/purchaseQueries.ts` - purchase queries
     - `lib/reviewTipQueries.ts` - tip queries
     - `lib/sponsoringQueries.ts` - sponsoring queries
   - **Action**: Replace every direct call with `writeService.writeObject()` and `writeService.updatePublished()`
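A hedged sketch of what this migration looks like at a call site. Only the call names `objectCache.set`, `writeService.writeObject()`, and `writeService.updatePublished()` come from this document; the signatures and the in-memory store below are stand-ins for illustration:

```typescript
// Hypothetical shapes: the real writeService/objectCache signatures are not
// shown here, so these stubs only illustrate the call-site change.
interface StoredObject { id: string; published: 'pending' | 'ok' | 'ko' }

const db = new Map<string, StoredObject>()

// Before: services wrote to the cache directly, bypassing the write queue.
const objectCache = {
  set(obj: StoredObject): void { db.set(obj.id, obj) },
}

// After: everything funnels through writeService, which owns the queue and
// forces a publication status on every row.
const writeService = {
  async writeObject(obj: { id: string }): Promise<void> {
    db.set(obj.id, { ...obj, published: 'pending' })
  },
  async updatePublished(id: string, ok: boolean): Promise<void> {
    const row = db.get(id)
    if (row) row.published = ok ? 'ok' : 'ko'
  },
}
```

The migration is mechanical: each `objectCache.set(x)` call becomes `await writeService.writeObject(x)`, and publication results flow back through `updatePublished`.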
3. **Integrate the notification Service Worker**
   - The Service Worker must feed the notifications table directly through the write Web Worker
   - The detector currently runs in the main thread
   - **Action**: Modify `public/sw.js` so the Service Worker feeds the notifications through `writeService.createNotification()`

4. **Use `websocketService` instead of `nostrService.pool`**
   - `nostrService` uses `SimplePool` directly for the WebSockets
   - **Action**: Migrate to `websocketService`, which communicates with the Service Workers

5. **Use `writeOrchestrator` for publications**
   - Publications must go through `writeOrchestrator`, which orchestrates WebSocket + Web Worker
   - **Action**: Modify `nostrService.publishEvent()` to use `writeOrchestrator`

## Target architecture

```
User interface (React)
  ↓ (read only)
IndexedDB
  ↑ (writes via Web Worker)
Write Web Worker (writeWorker.js)
  ↑ (orchestration)
Write management service (writeOrchestrator)
  ↓ (network)
WebSocket service (websocketService)
  ↓ (communication)
Service Workers (sw.js)
  ↓ (persistence via Web Worker)
Write Web Worker
  ↓
IndexedDB
```

## Priorities

1. **High priority**: Create and adapt `writeWorker.js` for Next.js
2. **High priority**: Migrate the synchronization writes to `writeService`
3. **Medium priority**: Integrate the Service Worker that feeds the notifications
4. **Medium priority**: Migrate to `websocketService`
5. **Low priority**: Use `writeOrchestrator` for publications

## Notes

- The configuration services (`configStorage`, `settingsCache`) may keep writing directly to IndexedDB, since they do not hold business data
- The log services (`publishLog`) may keep writing directly, since they only store logs
- The migration must be gradual, to avoid breaking the application
docs/architecture-principles.md — new file, 84 lines
@@ -0,0 +1,84 @@
# Software architecture principles

**Author**: Équipe 4NK

## Overview

This document describes the project's software architecture principles. Every change must respect these principles.

## Fundamental principles

### Data persistence

* **IndexedDB as the local source of truth**: All data is persisted in IndexedDB on the user's side. IndexedDB is the local source of truth for all application data.

* **Read-only IndexedDB for the interface**: The user interface (React components, pages) only reads from IndexedDB. No direct write to IndexedDB may be performed from the user interface.

* **Writes through the Nostr network**: All data writes and updates go through the Nostr network. The interface must never write directly to IndexedDB.

* **Publication status**: Every database write (IndexedDB) must include a `published` status so that publication (ok/ko) can be handled later. This status tracks the synchronization and publication state of the data.

### Service Workers and background tasks

* **Service Workers for background tasks**: All background work on data is performed with Service Workers. The Service Workers handle synchronization, updates, and asynchronous data processing.

* **Service Worker for notifications**: A Service Worker feeds the notifications table. It watches the events and updates IndexedDB with the new notifications.

* **Notification reader service**: Notifications are read by a dedicated service that detects new notifications in the database (IndexedDB). This service is separate from the Service Worker that feeds the notifications.

### Web Workers and writes

* **Write Web Worker**: A dedicated Web Worker handles all writes and modifications in IndexedDB. The user interface must never write directly to IndexedDB; it must go through this Web Worker.

* **Write management service**: A service handles the writes/updates to the WebSockets/API and to the write Web Worker. This service orchestrates the communication between the interface, the network (Nostr/WebSockets), and the write Web Worker.

### WebSockets and communication

* **WebSocket management service**: The WebSockets are managed by a dedicated service that communicates with the Service Workers. This service isolates the WebSocket communication logic and centralizes connection management.

* **Separation of responsibilities**: The WebSocket service must not write directly to IndexedDB. It communicates with the Service Workers, which handle persistence.

## Component architecture

```
User interface (React)
  ↓ (read only)
IndexedDB
  ↑ (writes via Web Worker)
Write Web Worker
  ↑ (orchestration)
Write management service
  ↓ (network)
WebSocket service
  ↓ (communication)
Service Workers
  ↓ (persistence)
IndexedDB
```

## Write flow

1. **User interface**: Requests a write/update
2. **Write management service**: Receives the request
3. **WebSocket service**: Publishes to the Nostr network
4. **Write management service**: Sends the data to the write Web Worker
5. **Write Web Worker**: Writes to IndexedDB with a `published` status
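The write flow above can be sketched as an orchestrator that fires the network publish and the local write in parallel, with no ordering imposed between them. All names here are illustrative, not the project's actual API:

```typescript
// Illustrative orchestrator: network and local writes run independently.
type PublishResult = { network: boolean; local: boolean }

async function orchestrateWrite(
  publishToNetwork: () => Promise<void>,
  writeLocally: () => Promise<void>,
): Promise<PublishResult> {
  const [network, local] = await Promise.allSettled([publishToNetwork(), writeLocally()])
  // A network failure needs no immediate action: a retry worker republishes
  // later, based on the `published` status stored locally.
  return {
    network: network.status === 'fulfilled',
    local: local.status === 'fulfilled',
  }
}
```

`Promise.allSettled` (rather than `Promise.all`) matters here: a relay failure must not prevent the local write from being recorded.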
## Synchronization flow

1. **Service Worker**: Triggers the periodic synchronization
2. **WebSocket service**: Connects to the Nostr relays
3. **Service Worker**: Receives the events
4. **Write Web Worker**: Writes the new events to IndexedDB

## Notification flow

1. **Service Worker**: Watches the events in IndexedDB
2. **Service Worker**: Detects new events concerning the user
3. **Write Web Worker**: Writes the notifications to IndexedDB
4. **Notification reader service**: Detects the new notifications
5. **User interface**: Displays the notifications

## Compliance

Every code change must respect these principles. When in doubt, consult this document and the quality rules in `.cursor/rules/quality.mdc`.
docs/service-worker-sync.md — new file, 113 lines
@@ -0,0 +1,113 @@
# Service Worker pour la synchronisation
|
||||||
|
|
||||||
|
**Auteur** : Équipe 4NK
|
||||||
|
|
||||||
|
## Vue d'ensemble
|
||||||
|
|
||||||
|
La synchronisation utilise maintenant un Service Worker pour fonctionner en arrière-plan, même lorsque l'onglet est fermé. Le système fonctionne en mode hybride : il utilise le Service Worker si disponible, sinon il revient à l'implémentation classique avec `setInterval`.
|
||||||
|
|
||||||
|
## Architecture
|
||||||
|
|
||||||
|
### Composants
|
||||||
|
|
||||||
|
1. **Service Worker** (`public/sw.js`)
|
||||||
|
- Gère les événements de synchronisation périodique
|
||||||
|
- Communique avec le thread principal via `postMessage`
|
||||||
|
- Utilise l'API Periodic Background Sync si disponible
|
||||||
|
|
||||||
|
2. **Service Worker Client** (`lib/swClient.ts`)
|
||||||
|
- Interface de communication avec le Service Worker
|
||||||
|
- Enregistre et gère le Service Worker
|
||||||
|
- Envoie des messages au Service Worker
|
||||||
|
|
||||||
|
3. **Service Worker Sync Handler** (`lib/swSyncHandler.ts`)
|
||||||
|
- Écoute les requêtes du Service Worker
|
||||||
|
- Exécute les opérations de synchronisation dans le thread principal
|
||||||
|
- Accède à `nostrService`, `IndexedDB`, etc.
|
||||||
|
|
||||||
|
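The client side can be sketched as follows. Only the file paths come from the component list above; the message names (`SYNC_PLATFORM`, `SYNC_USER_CONTENT`) are assumptions for illustration, not the app's actual protocol:

```typescript
type SwMessage =
  | { type: 'SYNC_PLATFORM' }
  | { type: 'SYNC_USER_CONTENT'; pubkey: string }

// Returns false when Service Workers are unavailable (SSR in Next.js, or an
// unsupported browser), so callers can fall back to the classic setInterval path.
async function sendToSw(msg: SwMessage): Promise<boolean> {
  if (typeof navigator === 'undefined' || !('serviceWorker' in navigator)) {
    return false
  }
  const sw = (navigator as unknown as {
    serviceWorker: { register(url: string): Promise<{ active?: { postMessage(m: unknown): void } }> }
  }).serviceWorker
  const registration = await sw.register('/sw.js')
  registration.active?.postMessage(msg)
  return true
}
```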
### Synchronization flow

```
Service Worker (sw.js)
    ↓ (postMessage)
Main thread (swSyncHandler)
    ↓ (execution)
Services (platformSync, userContentSync, publishWorker)
    ↓ (update)
IndexedDB
```
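The worker side of this flow can be sketched as below, assuming the worker only maps sync events to messages and forwards them to the open tabs (the tag names are hypothetical):

```typescript
function tagToMessage(tag: string): { type: string } | null {
  switch (tag) {
    case 'platform-sync':
      return { type: 'SYNC_PLATFORM' }
    case 'publish-retry':
      return { type: 'RETRY_PUBLISH' }
    default:
      return null
  }
}

// In the real worker this would be wired roughly as:
//   self.addEventListener('periodicsync', (event) => {
//     const msg = tagToMessage(event.tag)
//     if (msg) event.waitUntil(broadcast(msg))
//   })
async function broadcast(msg: { type: string }): Promise<number> {
  const scope = globalThis as unknown as {
    clients?: { matchAll(): Promise<Array<{ postMessage(m: unknown): void }>> }
  }
  const clients = (await scope.clients?.matchAll()) ?? []
  for (const client of clients) {
    client.postMessage(msg)
  }
  return clients.length // number of tabs notified
}
```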
## Features

### Platform synchronization

- **Trigger**: every 60 seconds
- **Service Worker**: uses `periodicSync` when available
- **Fallback**: `setInterval` in the main thread

### User content synchronization

- **Trigger**: on login or navigation
- **Service Worker**: can run in the background
- **Fallback**: direct execution in the main thread

### Republish worker

- **Trigger**: every 30 seconds
- **Service Worker**: uses `periodicSync` when available
- **Fallback**: `setInterval` in the main thread
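The "`periodicSync` if available, `setInterval` otherwise" rule above can be sketched like this. The intervals (60 s platform, 30 s republish) come from the lists above; the function names and tag parameter are illustrative:

```typescript
type SyncStrategy = 'periodic-sync' | 'set-interval'

function chooseStrategy(registration: object | null): SyncStrategy {
  // Periodic Background Sync is only exposed in some Chromium-based browsers
  // (behind a permission), so the fallback path must always work.
  return registration !== null && 'periodicSync' in registration
    ? 'periodic-sync'
    : 'set-interval'
}

async function schedule(
  registration: object | null,
  tag: string,
  minIntervalMs: number,
  run: () => void
): Promise<SyncStrategy> {
  if (chooseStrategy(registration) === 'periodic-sync') {
    const ps = (registration as {
      periodicSync: { register(tag: string, opts: { minInterval: number }): Promise<void> }
    }).periodicSync
    // minInterval is only a hint: the browser may fire less often
    await ps.register(tag, { minInterval: minIntervalMs })
    return 'periodic-sync'
  }
  setInterval(run, minIntervalMs)
  return 'set-interval'
}
```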
## Benefits

1. **Runs in the background**: synchronization continues even when the tab is closed
2. **Reduced main-thread load**: heavy operations are moved off the main thread
3. **Better user experience**: transparent synchronization that does not block the interface

## Limitations

1. **WebSockets**: WebSocket connections must be managed in the main thread
2. **IndexedDB**: the Service Worker can access IndexedDB directly, but complex operations run in the main thread
3. **Compatibility**: requires a browser that supports Service Workers and, optionally, the Periodic Background Sync API
## Configuration

The Service Worker is registered automatically at application startup in `pages/_app.tsx`.

### Disabling the Service Worker

To disable the Service Worker and fall back to the classic implementation:

```typescript
// In lib/platformSync.ts, lib/publishWorker.ts, etc.
// Remove the calls to swClient
```
## Troubleshooting

### The Service Worker does not load

1. Check that the file `public/sw.js` exists
2. Check the browser console for errors
3. Check that HTTPS is used (required for Service Workers in production)

### Synchronization does not work

1. Check that `swSyncHandler.initialize()` is called
2. Check the console messages: `[SW]`, `[SWClient]`, `[SWSyncHandler]`
3. Check that the synchronization services are correctly initialized

### Updating the Service Worker

The Service Worker updates itself automatically. When a new version is detected, the application reloads automatically.
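The reload-on-update behaviour can be sketched with a one-shot guard, assuming the common pattern of reloading when the new worker takes control (a `controllerchange` listener); the wiring comment is illustrative:

```typescript
let reloaded = false

function onControllerChange(reload: () => void): void {
  if (reloaded) {
    return // already reloading for this update cycle; avoid reload loops
  }
  reloaded = true
  reload()
}

// Real wiring (browser only):
// navigator.serviceWorker.addEventListener('controllerchange', () =>
//   onControllerChange(() => window.location.reload()))
```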
## Deployment

1. The file `public/sw.js` is served automatically by Next.js
2. The Service Worker is registered on the first load of the application
3. Updates are handled automatically

## Monitoring

- Browser console: `[SW]`, `[SWClient]`, `[SWSyncHandler]` messages
- DevTools > Application > Service Workers: state of the Service Worker
- DevTools > Network: synchronization requests
```diff
@@ -2,16 +2,19 @@ import { useState } from 'react'
 import { articlePublisher } from '@/lib/articlePublisher'
 import { nostrService } from '@/lib/nostr'
 import type { ArticleDraft } from '@/lib/articlePublisher'
+import type { RelayPublishStatus } from '@/lib/publishResult'
 
 export function useArticlePublishing(pubkey: string | null): {
   loading: boolean
   error: string | null
   success: boolean
+  relayStatuses: RelayPublishStatus[]
   publishArticle: (draft: ArticleDraft) => Promise<string | null>
 } {
   const [loading, setLoading] = useState(false)
   const [error, setError] = useState<string | null>(null)
   const [success, setSuccess] = useState(false)
+  const [relayStatuses, setRelayStatuses] = useState<RelayPublishStatus[]>([])
 
   const publishArticle = async (draft: ArticleDraft): Promise<string | null> => {
     if (!pubkey) {
@@ -26,6 +29,7 @@ export function useArticlePublishing(pubkey: string | null): {
     setLoading(true)
     setError(null)
+    setRelayStatuses([])
 
     try {
       const privateKey = nostrService.getPrivateKey()
@@ -33,6 +37,7 @@ export function useArticlePublishing(pubkey: string | null): {
       if (result.success) {
         setSuccess(true)
+        setRelayStatuses(result.relayStatuses ?? [])
         return result.articleId
       }
 
@@ -50,6 +55,7 @@ export function useArticlePublishing(pubkey: string | null): {
     loading,
     error,
     success,
+    relayStatuses,
     publishArticle,
   }
 }
```
```diff
@@ -85,45 +85,15 @@ export function useArticles(searchQuery: string = '', filters: ArticleFilters |
       }
     }
 
-    let unsubscribe: (() => void) | null = null
-    let timeout: NodeJS.Timeout | null = null
+    // Read only from IndexedDB cache - no network subscription
 
     void loadAuthorsFromCache().then((hasCachedAuthors) => {
-      // Only subscribe to network if cache is empty (to fetch new content)
-      // If cache has authors, we can skip network subscription for faster load
       if (!hasCachedAuthors) {
-        unsubscribe = nostrService.subscribeToArticles(
-          (article) => {
-            setArticles((prev) => {
-              if (prev.some((a) => a.id === article.id)) {
-                return prev
-              }
-              const next = [article, ...prev].sort((a, b) => b.createdAt - a.createdAt)
-              hasArticlesRef.current = next.length > 0
-              return next
-            })
-            setLoading(false)
-          },
-          50
-        )
-
-        // Shorter timeout if cache is empty (5 seconds instead of 10)
-        timeout = setTimeout(() => {
-          setLoading(false)
-          if (!hasArticlesRef.current) {
-            setError(t('common.error.noContent'))
-          }
-        }, 5000)
+        setError(t('common.error.noContent'))
       }
     })
 
     return () => {
-      if (unsubscribe) {
-        unsubscribe()
-      }
-      if (timeout) {
-        clearTimeout(timeout)
-      }
+      // No cleanup needed - no network subscription
     }
   }, [])
```
```diff
@@ -1,5 +1,5 @@
 import { useState } from 'react'
-import type { Notification } from '@/types/notifications'
+import type { Notification } from '@/lib/notificationService'
 
 export function useNotificationCenter(
   markAsRead: (id: string) => void,
```
```diff
@@ -1,6 +1,6 @@
 import { useState, useEffect, useCallback } from 'react'
-import { notificationService, loadStoredNotifications, saveNotifications, markNotificationAsRead, markAllAsRead, deleteNotification } from '@/lib/notifications'
-import type { Notification } from '@/types/notifications'
+import { notificationService } from '@/lib/notificationService'
+import type { Notification } from '@/lib/notificationService'
 
 export function useNotifications(userPubkey: string | null): {
   notifications: Notification[]
@@ -13,7 +13,7 @@ export function useNotifications(userPubkey: string | null): {
   const [notifications, setNotifications] = useState<Notification[]>([])
   const [loading, setLoading] = useState(true)
 
-  // Load stored notifications on mount
+  // Load stored notifications on mount and refresh periodically
   useEffect(() => {
     if (!userPubkey) {
       setNotifications([])
@@ -21,66 +21,75 @@ export function useNotifications(userPubkey: string | null): {
       return
     }
 
-    const loadStored = async (): Promise<void> => {
-      const storedNotifications = await loadStoredNotifications(userPubkey)
-      setNotifications(storedNotifications)
-    }
-    void loadStored()
-  }, [userPubkey])
-
-  // Subscribe to new notifications
-  useEffect(() => {
-    if (!userPubkey) {
-      return
-    }
-
-    const unsubscribe = notificationService.subscribeToPayments(userPubkey, (newNotification) => {
-      setNotifications((prev) => {
-        // Check if notification already exists
-        if (prev.some((n) => n.id === newNotification.id)) {
-          return prev
-        }
-
-        // Add new notification at the beginning
-        const updated = [newNotification, ...prev]
-
-        // Keep only last 100 notifications
-        const trimmed = updated.slice(0, 100)
-
-        // Save to IndexedDB
-        void saveNotifications(userPubkey, trimmed)
-
-        return trimmed
-      })
-    })
+    const loadNotifications = async (): Promise<void> => {
+      try {
+        setLoading(true)
+        const storedNotifications = await notificationService.getAllNotifications(100)
+        setNotifications(storedNotifications)
+      } catch (error) {
+        console.error('[useNotifications] Error loading notifications:', error)
+      } finally {
+        setLoading(false)
+      }
+    }
+
+    void loadNotifications()
+
+    // Refresh notifications every 30 seconds
+    const interval = setInterval(() => {
+      void loadNotifications()
+    }, 30000)
 
     return () => {
-      unsubscribe()
+      clearInterval(interval)
     }
   }, [userPubkey])
 
   const unreadCount = notifications.filter((n) => !n.read).length
 
   const markAsRead = useCallback(
-    (notificationId: string): void => {
-      if (!userPubkey) {return}
-
-      setNotifications((prev) => markNotificationAsRead(userPubkey, notificationId, prev))
+    async (notificationId: string): Promise<void> => {
+      if (!userPubkey) {
+        return
+      }
+
+      try {
+        await notificationService.markAsRead(notificationId)
+        setNotifications((prev) =>
+          prev.map((n) => (n.id === notificationId ? { ...n, read: true } : n))
+        )
+      } catch (error) {
+        console.error('[useNotifications] Error marking notification as read:', error)
+      }
     },
     [userPubkey]
   )
 
-  const markAllAsReadHandler = useCallback((): void => {
-    if (!userPubkey) {return}
-
-    setNotifications((prev) => markAllAsRead(userPubkey, prev))
-  }, [userPubkey])
+  const markAllAsReadHandler = useCallback(async (): Promise<void> => {
+    if (!userPubkey) {
+      return
+    }
+
+    try {
+      await notificationService.markAllAsRead()
+      setNotifications((prev) => prev.map((n) => ({ ...n, read: true })))
+    } catch (error) {
+      console.error('[useNotifications] Error marking all as read:', error)
+    }
+  }, [userPubkey])
 
   const deleteNotificationHandler = useCallback(
-    (notificationId: string): void => {
-      if (!userPubkey) {return}
-
-      setNotifications((prev) => deleteNotification(userPubkey, notificationId, prev))
+    async (notificationId: string): Promise<void> => {
+      if (!userPubkey) {
+        return
+      }
+
+      try {
+        await notificationService.deleteNotification(notificationId)
+        setNotifications((prev) => prev.filter((n) => n.id !== notificationId))
+      } catch (error) {
+        console.error('[useNotifications] Error deleting notification:', error)
+      }
     },
     [userPubkey]
   )
```
```diff
@@ -3,6 +3,7 @@ import { nostrService } from '@/lib/nostr'
 import type { Article } from '@/types/nostr'
 import { applyFiltersAndSort } from '@/lib/articleFiltering'
 import type { ArticleFilters } from '@/components/ArticleFilters'
+import { objectCache } from '@/lib/objectCache'
 
 /**
  * Hook to fetch articles published by a specific user
@@ -32,31 +33,34 @@ export function useUserArticles(
     setLoading(true)
     setError(null)
 
-    const unsubscribe = nostrService.subscribeToArticles(
-      (article) => {
-        if (article.pubkey === userPubkey) {
-          setArticles((prev) => {
-            if (prev.some((a) => a.id === article.id)) {
-              return prev
-            }
-            const next = [article, ...prev].sort((a, b) => b.createdAt - a.createdAt)
-            hasArticlesRef.current = next.length > 0
-            return next
-          })
-          setLoading(false)
-        }
-      },
-      100
-    )
-
-    // Timeout after 10 seconds
-    const timeout = setTimeout(() => {
-      setLoading(false)
-    }, 10000)
+    // Read only from IndexedDB cache - no network subscription
+    void (async (): Promise<void> => {
+      try {
+        const allPublications = await objectCache.getAll('publication')
+        const allAuthors = await objectCache.getAll('author')
+        const allArticles = [...allPublications, ...allAuthors] as Article[]
+
+        // Filter by user pubkey
+        const userArticles = allArticles.filter((article) => article.pubkey === userPubkey)
+
+        // Sort by creation date descending
+        const sortedArticles = userArticles.sort((a, b) => b.createdAt - a.createdAt)
+
+        setArticles(sortedArticles)
+        hasArticlesRef.current = sortedArticles.length > 0
+        if (sortedArticles.length === 0) {
+          setError('Aucun contenu trouvé')
+        }
+      } catch (error) {
+        console.error('Error loading user articles from cache:', error)
+        setError('Erreur lors du chargement des articles')
+      } finally {
+        setLoading(false)
+      }
+    })()
 
     return () => {
-      unsubscribe()
-      clearTimeout(timeout)
+      // No cleanup needed - no network subscription
     }
   }, [userPubkey])
```
```diff
@@ -182,16 +182,17 @@ export class ArticlePublisher {
       return buildFailure('Failed to publish presentation article')
     }
 
-    // Parse and cache the published presentation immediately
+    // Parse and cache the published presentation immediately with published: false
+    // The published status will be updated asynchronously by publishEvent
     const { parsePresentationEvent } = await import('./articlePublisherHelpers')
     const { extractTagsFromEvent } = await import('./nostrTagSystem')
-    const { objectCache } = await import('./objectCache')
     const parsed = await parsePresentationEvent(publishedEvent)
     if (parsed) {
       const tags = extractTagsFromEvent(publishedEvent)
       const { id: tagId, version: tagVersion, hidden: tagHidden } = tags
       if (tagId) {
-        await objectCache.set('author', tagId, publishedEvent, parsed, tagVersion ?? 0, tagHidden ?? false)
+        const { writeService } = await import('./writeService')
+        await writeService.writeObject('author', tagId, publishedEvent, parsed, tagVersion ?? 0, tagHidden ?? false, undefined, false)
       }
     }
```
```diff
@@ -28,7 +28,17 @@ export async function buildPresentationEvent(
   const separator = '\n\n---\n\nDescription du contenu :\n'
   const separatorIndex = draft.content.indexOf(separator)
   const presentation = separatorIndex !== -1 ? draft.content.substring(0, separatorIndex) : draft.presentation
-  const contentDescription = separatorIndex !== -1 ? draft.content.substring(separatorIndex + separator.length) : draft.contentDescription
+  let contentDescription = separatorIndex !== -1 ? draft.content.substring(separatorIndex + separator.length) : draft.contentDescription
+
+  // Remove Bitcoin address from contentDescription if present (should not be visible in note content)
+  // Remove lines matching "Adresse Bitcoin mainnet (pour le sponsoring) : ..."
+  if (contentDescription) {
+    contentDescription = contentDescription
+      .split('\n')
+      .filter((line) => !line.includes('Adresse Bitcoin mainnet (pour le sponsoring)'))
+      .join('\n')
+      .trim()
+  }
 
   // Generate hash ID from author data first (needed for URL)
   const hashId = await generateAuthorHashId({
@@ -55,10 +65,10 @@ export async function buildPresentationEvent(
     : profileUrl
 
   const visibleContent = [
-    'Nouveau profil publié sur zapwall.fr',
+    'Nouveau profil auteur publié sur zapwall.fr (plateforme de publications scientifiques)',
     linkWithPreview,
     `Présentation personnelle : ${presentation}`,
-    `Description de votre contenu : ${contentDescription}`,
+    ...(contentDescription ? [`Description de votre contenu : ${contentDescription}`] : []),
   ].join('\n')
 
   // Build profile JSON for metadata (stored in tag, not in content)
@@ -200,7 +210,15 @@ export async function parsePresentationEvent(event: Event): Promise<import('@/ty
     preview: tags.preview ?? event.content.substring(0, 200),
     content: event.content,
     description: profileData?.presentation ?? tags.description ?? '', // Required field
-    contentDescription: profileData?.contentDescription ?? tags.description ?? '', // Required field
+    contentDescription: ((): string => {
+      const raw = profileData?.contentDescription ?? tags.description ?? ''
+      // Remove Bitcoin address from contentDescription if present (should not be visible)
+      return raw
+        .split('\n')
+        .filter((line) => !line.includes('Adresse Bitcoin mainnet (pour le sponsoring)'))
+        .join('\n')
+        .trim()
+    })(), // Required field
     thumbnailUrl: (typeof profileData?.pictureUrl === 'string' ? profileData.pictureUrl : typeof tags.pictureUrl === 'string' ? tags.pictureUrl : ''), // Required field
     createdAt: event.created_at,
     zapAmount: 0,
@@ -271,7 +289,8 @@ export async function fetchAuthorPresentationFromPool(
           // Calculate totalSponsoring from cache before storing
           const { getAuthorSponsoring } = await import('./sponsoring')
           value.totalSponsoring = await getAuthorSponsoring(value.pubkey)
-          await objectCache.set('author', value.hash, event, value, tags.version ?? 0, tags.hidden, value.index)
+          const { writeService } = await import('./writeService')
+          await writeService.writeObject('author', value.hash, event, value, tags.version ?? 0, tags.hidden, value.index, false)
         }
       }
     }
@@ -301,6 +320,7 @@ export async function fetchAuthorPresentationFromPool(
         await finalize(null)
       })()
     })
+    // Reduced timeout for faster feedback when cache is empty
     setTimeout((): void => {
       void (async (): Promise<void> => {
         // Get the latest version from all collected events
@@ -314,6 +334,6 @@ export async function fetchAuthorPresentationFromPool(
         }
         await finalize(null)
       })()
-    }, 5000).unref?.()
+    }, 2000).unref?.()
   })
 }
```
```diff
@@ -4,6 +4,7 @@ import type { AlbyInvoice } from '@/types/alby'
 import { createArticleInvoice, createPreviewEvent } from './articleInvoice'
 import { encryptArticleContent, encryptDecryptionKey } from './articleEncryption'
 import { storePrivateContent } from './articleStorage'
+import type { PublishResult } from './publishResult'
 
 export function buildFailure(error?: string): PublishedArticle {
   const base: PublishedArticle = {
@@ -21,10 +22,14 @@ export async function publishPreview(
   presentationId: string,
   extraTags?: string[][],
   encryptedContent?: string,
-  encryptedKey?: string
-): Promise<import('nostr-tools').Event | null> {
+  encryptedKey?: string,
+  returnStatus?: boolean
+): Promise<import('nostr-tools').Event | null | PublishResult> {
   const previewEvent = await createPreviewEvent(draft, invoice, authorPubkey, presentationId, extraTags, encryptedContent, encryptedKey)
-  const publishedEvent = await nostrService.publishEvent(previewEvent)
+  if (returnStatus) {
+    return await nostrService.publishEvent(previewEvent, true)
+  }
+  const publishedEvent = await nostrService.publishEvent(previewEvent, false)
   return publishedEvent ?? null
 }
 
@@ -50,18 +55,42 @@ export async function encryptAndPublish(
   const encryptedKey = await encryptDecryptionKey(key, iv, authorPrivateKeyForEncryption, authorPubkey)
   const invoice = await createArticleInvoice(draft)
   const extraTags = buildArticleExtraTags(draft, category)
-  const publishedEvent = await publishPreview(draft, invoice, authorPubkey, presentationId, extraTags, encryptedContent, encryptedKey)
+  const publishResult = await publishPreview(draft, invoice, authorPubkey, presentationId, extraTags, encryptedContent, encryptedKey, true)
 
-  if (!publishedEvent) {
+  if (!publishResult) {
     return buildFailure('Failed to publish article')
   }
 
-  await storePrivateContent(publishedEvent.id, draft.content, authorPubkey, invoice, key, iv)
+  // Handle both old format (Event | null) and new format (PublishResult)
+  let event: import('nostr-tools').Event | null = null
+  let relayStatuses: import('./publishResult').RelayPublishStatus[] | undefined
+
+  if (publishResult && 'event' in publishResult && 'relayStatuses' in publishResult) {
+    // New format with statuses
+    event = publishResult.event
+    relayStatuses = publishResult.relayStatuses
+  } else if (publishResult && 'id' in publishResult) {
+    // Old format (Event)
+    event = publishResult
+  }
+
+  if (!event) {
+    return buildFailure('Failed to publish article')
+  }
+
+  await storePrivateContent(event.id, draft.content, authorPubkey, invoice, key, iv)
   console.warn('Article published with encrypted content', {
-    articleId: publishedEvent.id,
+    articleId: event.id,
     authorPubkey,
     timestamp: new Date().toISOString(),
+    relayStatuses,
   })
 
-  return { articleId: publishedEvent.id, previewEventId: publishedEvent.id, invoice, success: true }
+  return {
+    articleId: event.id,
+    previewEventId: event.id,
+    invoice,
+    success: true,
+    relayStatuses,
+  }
 }
```
```diff
@@ -1,5 +1,6 @@
 import type { AlbyInvoice } from '@/types/alby'
 import type { MediaRef, Page } from '@/types/nostr'
+import type { RelayPublishStatus } from './publishResult'
 
 export interface ArticleDraft {
   title: string
@@ -29,4 +30,5 @@ export interface PublishedArticle {
   invoice?: AlbyInvoice // Invoice created by author (required if success)
   success: boolean
   error?: string
+  relayStatuses?: RelayPublishStatus[] | undefined // Status of publication to each relay
 }
```
@@ -1,61 +1,14 @@
-import type { Event } from 'nostr-tools'
-import { nostrService } from './nostr'
-import { SimplePool } from 'nostr-tools'
-import { createSubscription } from '@/types/nostr-tools-extended'
 import type { Article } from '@/types/nostr'
-import { parseArticleFromEvent } from './nostrEventParsing'
-import { buildTagFilter } from './nostrTagSystem'
-import { getPrimaryRelaySync } from './config'
-import { PLATFORM_SERVICE, MIN_EVENT_DATE } from './platformConfig'
+import { objectCache } from './objectCache'
 
-function createSeriesSubscription(pool: SimplePool, seriesId: string, limit: number): ReturnType<typeof createSubscription> {
-  const filters = [
-    {
-      ...buildTagFilter({
-        type: 'publication',
-        seriesId,
-        service: PLATFORM_SERVICE,
-      }),
-      since: MIN_EVENT_DATE,
-      limit,
-    },
-  ]
-  const relayUrl = getPrimaryRelaySync()
-  return createSubscription(pool, [relayUrl], filters)
-}
-
-export function getArticlesBySeries(seriesId: string, timeoutMs: number = 5000, limit: number = 100): Promise<Article[]> {
-  const pool = nostrService.getPool()
-  if (!pool) {
-    throw new Error('Pool not initialized')
-  }
-  const sub = createSeriesSubscription(pool, seriesId, limit)
-
-  return new Promise<Article[]>((resolve) => {
-    const results: Article[] = []
-    let finished = false
-
-    const done = (): void => {
-      if (finished) {
-        return
-      }
-      finished = true
-      sub.unsub()
-      resolve(results)
-    }
-
-    sub.on('event', (event: Event): void => {
-      void (async (): Promise<void> => {
-        const parsed = await parseArticleFromEvent(event)
-        if (parsed) {
-          results.push(parsed)
-        }
-      })()
-    })
-
-    sub.on('eose', (): void => {
-      done()
-    })
-    setTimeout(() => done(), timeoutMs).unref?.()
-  })
+export async function getArticlesBySeries(seriesId: string, _timeoutMs: number = 5000, _limit: number = 100): Promise<Article[]> {
+  // Read only from IndexedDB cache
+  const allPublications = await objectCache.getAll('publication')
+  const articles = allPublications as Article[]
+
+  // Filter by seriesId
+  const seriesArticles = articles.filter((article) => article.seriesId === seriesId)
+
+  // Sort by creation date descending
+  return seriesArticles.sort((a, b) => b.createdAt - a.createdAt)
 }
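The rewritten `getArticlesBySeries` reduces to a filter-then-sort over the cached list. The same logic, isolated with the `Article` shape cut down to the two fields it actually touches:

```typescript
// Minimal stand-in for the cached Article records in this diff.
interface ArticleLike {
  seriesId?: string
  createdAt: number
}

function articlesBySeries(all: ArticleLike[], seriesId: string): ArticleLike[] {
  return all
    .filter((a) => a.seriesId === seriesId)
    // Descending by creation date: newest article first
    .sort((a, b) => b.createdAt - a.createdAt)
}

const cache: ArticleLike[] = [
  { seriesId: 's1', createdAt: 100 },
  { seriesId: 's2', createdAt: 300 },
  { seriesId: 's1', createdAt: 200 },
]
console.log(articlesBySeries(cache, 's1').map((a) => a.createdAt)) // [200, 100]
```

Note that `Array.prototype.sort` mutates in place; here that is harmless because it operates on the fresh array returned by `filter`, not the cache itself.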
@@ -1,33 +1,39 @@
 /**
  * Query authors by hash ID or pubkey (for backward compatibility)
+ * Read only from IndexedDB - no network requests
  */
 
-import type { Event } from 'nostr-tools'
 import type { SimplePoolWithSub } from '@/types/nostr-tools-extended'
-import { buildTagFilter, extractTagsFromEvent } from './nostrTagSystem'
-import { getPrimaryRelaySync } from './config'
-import { PLATFORM_SERVICE, MIN_EVENT_DATE } from './platformConfig'
-import { parsePresentationEvent, fetchAuthorPresentationFromPool } from './articlePublisherHelpersPresentation'
-import { getLatestVersion } from './versionManager'
 import { objectCache } from './objectCache'
 
 /**
  * Fetch author presentation by hash ID or pubkey
+ * Read only from IndexedDB - no network requests
  * If the parameter looks like a pubkey (64 hex chars), it uses pubkey lookup
  * Otherwise, it uses hash ID lookup
  */
 export async function fetchAuthorByHashId(
-  pool: SimplePoolWithSub,
+  _pool: SimplePoolWithSub,
   hashIdOrPubkey: string
 ): Promise<import('@/types/nostr').AuthorPresentationArticle | null> {
   // Check if it's a pubkey (64 hex characters) for backward compatibility
   if (/^[a-f0-9]{64}$/i.test(hashIdOrPubkey)) {
-    return fetchAuthorPresentationFromPool(pool, hashIdOrPubkey)
+    // Read only from IndexedDB cache
+    const cached = await objectCache.getAuthorByPubkey(hashIdOrPubkey)
+    if (cached) {
+      const presentation = cached as import('@/types/nostr').AuthorPresentationArticle
+      // Calculate totalSponsoring from cache
+      const { getAuthorSponsoring } = await import('./sponsoring')
+      presentation.totalSponsoring = await getAuthorSponsoring(presentation.pubkey)
+      return presentation
+    }
+    // Not found in cache - return null (no network request)
+    return null
   }
 
   // Otherwise, treat as hash ID
   const hashId = hashIdOrPubkey
-  // Check cache first - this is the primary source
+  // Read only from IndexedDB cache
   const cached = await objectCache.get('author', hashId)
   if (cached) {
     const presentation = cached as import('@/types/nostr').AuthorPresentationArticle
@@ -37,85 +43,16 @@ export async function fetchAuthorByHashId(
     return presentation
   }
 
-  const filters = [
-    {
-      ...buildTagFilter({
-        type: 'author',
-        id: hashId,
-        service: PLATFORM_SERVICE,
-      }),
-      since: MIN_EVENT_DATE,
-      limit: 100, // Get all versions to find the latest
-    },
-  ]
-
-  return new Promise<import('@/types/nostr').AuthorPresentationArticle | null>((resolve) => {
-    let resolved = false
-    const relayUrl = getPrimaryRelaySync()
-    const { createSubscription } = require('@/types/nostr-tools-extended')
-    const sub = createSubscription(pool, [relayUrl], filters)
-
-    const events: Event[] = []
-
-    const finalize = async (value: import('@/types/nostr').AuthorPresentationArticle | null): Promise<void> => {
-      if (resolved) {
-        return
-      }
-      resolved = true
-      sub.unsub()
-
-      // Cache the result if found
-      if (value && events.length > 0) {
-        const event = events.find(e => e.id === value.id) || events[0]
-        if (event) {
-          const tags = extractTagsFromEvent(event)
-          if (value.hash) {
-            // Calculate totalSponsoring from cache before storing
-            const { getAuthorSponsoring } = await import('./sponsoring')
-            value.totalSponsoring = await getAuthorSponsoring(value.pubkey)
-            await objectCache.set('author', value.hash, event, value, tags.version ?? 0, tags.hidden, value.index)
-          }
-        }
-      }
-
-      resolve(value)
-    }
-
-    sub.on('event', (event: Event): void => {
-      // Collect all events first
-      const tags = extractTagsFromEvent(event)
-      if (tags.type === 'author' && !tags.hidden && tags.id === hashId) {
-        events.push(event)
-      }
-    })
-
-    sub.on('eose', (): void => {
-      void (async (): Promise<void> => {
-        // Get the latest version from all collected events
-        const latestEvent = getLatestVersion(events)
-        if (latestEvent) {
-          const parsed = await parsePresentationEvent(latestEvent)
-          if (parsed) {
-            await finalize(parsed)
-            return
-          }
-        }
-        await finalize(null)
-      })()
-    })
-    setTimeout((): void => {
-      void (async (): Promise<void> => {
-        // Get the latest version from all collected events
-        const timeoutLatestEvent = getLatestVersion(events)
-        if (timeoutLatestEvent) {
-          const timeoutParsed = await parsePresentationEvent(timeoutLatestEvent)
-          if (timeoutParsed) {
-            await finalize(timeoutParsed)
-            return
-          }
-        }
-        await finalize(null)
-      })()
-    }, 5000).unref?.()
-  })
+  // Also try by ID if hash lookup failed
+  const cachedById = await objectCache.getById('author', hashId)
+  if (cachedById) {
+    const presentation = cachedById as import('@/types/nostr').AuthorPresentationArticle
+    // Calculate totalSponsoring from cache
+    const { getAuthorSponsoring } = await import('./sponsoring')
+    presentation.totalSponsoring = await getAuthorSponsoring(presentation.pubkey)
+    return presentation
+  }
+
+  // Not found in cache - return null (no network request)
+  return null
 }
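The dual-lookup dispatch in `fetchAuthorByHashId` hinges on one regex: a 64-character hex string is treated as a Nostr pubkey, anything else as a hash ID. Its behavior in isolation:

```typescript
// Exactly 64 hex characters, case-insensitive - the pubkey check from the diff.
const isPubkey = (s: string): boolean => /^[a-f0-9]{64}$/i.test(s)

console.log(isPubkey('a'.repeat(64))) // true
console.log(isPubkey('A'.repeat(64))) // true (the /i flag accepts uppercase hex)
console.log(isPubkey('a'.repeat(63))) // false (length is anchored by ^ and $)
console.log(isPubkey('g'.repeat(64))) // false (not a hex digit)
```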
@@ -8,6 +8,7 @@ export interface RelayConfig {
   enabled: boolean
   priority: number
   createdAt: number
+  lastSyncDate?: number // Timestamp of last successful sync with this relay
 }
 
 export interface Nip95Config {
@@ -157,9 +157,7 @@ export async function extractAuthorFromEvent(event: Event): Promise<ExtractedAut
   let metadata = extractMetadataJsonFromTag(event)
 
   // Fallback to content format (for backward compatibility)
-  if (!metadata) {
-    metadata = extractMetadataJson(event.content)
-  }
+  metadata ??= extractMetadataJson(event.content)
 
   if (metadata?.type === 'author') {
     const authorData = {
@@ -207,9 +205,7 @@ export async function extractSeriesFromEvent(event: Event): Promise<ExtractedSer
   let metadata = extractMetadataJsonFromTag(event)
 
   // Fallback to content format (for backward compatibility)
-  if (!metadata) {
-    metadata = extractMetadataJson(event.content)
-  }
+  metadata ??= extractMetadataJson(event.content)
 
   if (metadata?.type === 'series') {
     const seriesData = {
@@ -278,9 +274,7 @@ export async function extractPublicationFromEvent(event: Event): Promise<Extract
   let metadata = extractMetadataJsonFromTag(event)
 
   // Fallback to content format (for backward compatibility)
-  if (!metadata) {
-    metadata = extractMetadataJson(event.content)
-  }
+  metadata ??= extractMetadataJson(event.content)
 
   if (metadata?.type === 'publication') {
     const publicationData = {
@@ -357,9 +351,7 @@ export async function extractReviewFromEvent(event: Event): Promise<ExtractedRev
   let metadata = extractMetadataJsonFromTag(event)
 
   // Fallback to content format (for backward compatibility)
-  if (!metadata) {
-    metadata = extractMetadataJson(event.content)
-  }
+  metadata ??= extractMetadataJson(event.content)
 
   if (metadata?.type === 'review') {
     const reviewData = {
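The four hunks above replace an explicit null check with the nullish coalescing assignment operator: `x ??= y` assigns only when `x` is `null` or `undefined`, which is what the removed `if (!metadata)` intended, with one subtle difference worth noting:

```typescript
// `??=` assigns only when the left side is null or undefined.
let metadata: { type?: string } | null = null
metadata ??= { type: 'author' } // assigned: metadata was null
console.log(metadata.type) // 'author'

let already: { type?: string } | null = { type: 'series' }
already ??= { type: 'author' } // skipped: already non-nullish
console.log(already.type) // 'series'

// Caveat: `!x` is also true for '', 0 and false, while `??=` only fires on
// null/undefined. Here metadata is object-or-null, so the refactor is safe;
// it would change behavior for falsy-but-defined values.
```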
273
lib/nostr.ts
@@ -10,10 +10,11 @@ import {
   decryptArticleContentWithKey,
 } from './nostrPrivateMessages'
 import { checkZapReceipt as checkZapReceiptHelper } from './nostrZapVerification'
-import { subscribeWithTimeout } from './nostrSubscription'
 import { getPrimaryRelay, getPrimaryRelaySync } from './config'
 import { buildTagFilter } from './nostrTagSystem'
 import { PLATFORM_SERVICE, MIN_EVENT_DATE } from './platformConfig'
+import type { PublishResult, RelayPublishStatus } from './publishResult'
+import { objectCache } from './objectCache'
 
 class NostrService {
   private pool: SimplePool | null = null
@@ -62,7 +63,10 @@ class NostrService {
     }
   }
 
-  async publishEvent(eventTemplate: EventTemplate): Promise<Event | null> {
+  async publishEvent(eventTemplate: EventTemplate): Promise<Event | null>
+  async publishEvent(eventTemplate: EventTemplate, returnStatus: false): Promise<Event | null>
+  async publishEvent(eventTemplate: EventTemplate, returnStatus: true): Promise<PublishResult>
+  async publishEvent(eventTemplate: EventTemplate, returnStatus: boolean = false): Promise<Event | null | PublishResult> {
     if (!this.privateKey || !this.pool) {
       throw new Error('Private key not set or pool not initialized')
     }
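The hunk above uses TypeScript function overloads so that a boolean flag selects the return type at the call site: existing callers keep getting the plain value, while callers passing `true` get the richer result. A standalone sketch of the same pattern (names are illustrative):

```typescript
interface Detailed { value: string; meta: string[] }

// Overload signatures: the literal types `false` / `true` pick the return type.
function fetchValue(key: string): string
function fetchValue(key: string, detailed: false): string
function fetchValue(key: string, detailed: true): Detailed
// Single implementation signature, wide enough to cover every overload.
function fetchValue(key: string, detailed: boolean = false): string | Detailed {
  const value = key.toUpperCase()
  return detailed ? { value, meta: [] } : value
}

const plain = fetchValue('id')       // typed as string
const rich = fetchValue('id', true)  // typed as Detailed
console.log(plain)      // 'ID'
console.log(rich.value) // 'ID'
```

The implementation signature itself is not callable from outside; only the listed overloads are, which is why the diff keeps the original zero-argument form as the first overload for backward compatibility.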
@@ -75,42 +79,108 @@ class NostrService {
     const secretKey = hexToBytes(this.privateKey)
     const event = finalizeEvent(unsignedEvent, secretKey)
 
-    try {
-      // Publish to all active relays (enabled and not marked inactive for this session)
-      // Each event has a unique ID based on content, so publishing to multiple relays
-      // doesn't create duplicates - it's the same event stored redundantly
-      const { relaySessionManager } = await import('./relaySessionManager')
-      const activeRelays = await relaySessionManager.getActiveRelays()
-
-      if (activeRelays.length === 0) {
-        // Fallback to primary relay if no active relays
-        const relayUrl = await getPrimaryRelay()
-        const pubs = this.pool.publish([relayUrl], event)
-        await Promise.all(pubs)
-      } else {
-        // Publish to all active relays
-        console.warn(`[NostrService] Publishing event ${event.id} to ${activeRelays.length} active relay(s)`)
-        const pubs = this.pool.publish(activeRelays, event)
-
-        // Track failed relays and mark them inactive for the session
-        const results = await Promise.allSettled(pubs)
-        results.forEach((result, index) => {
-          const relayUrl = activeRelays[index]
-          if (!relayUrl) {
-            return
-          }
-          if (result.status === 'rejected') {
-            const error = result.reason
-            console.error(`[NostrService] Relay ${relayUrl} failed during publish:`, error)
-            relaySessionManager.markRelayFailed(relayUrl)
-          }
-        })
-      }
-
-      return event
-    } catch (publishError) {
-      throw new Error(`Publish failed: ${publishError}`)
-    }
+    // Publish to all active relays (enabled and not marked inactive for this session)
+    // Each event has a unique ID based on content, so publishing to multiple relays
+    // doesn't create duplicates - it's the same event stored redundantly
+    // Network failures do not block - we return the event and status immediately
+    const { relaySessionManager } = await import('./relaySessionManager')
+    const activeRelays = await relaySessionManager.getActiveRelays()
+
+    const relayStatuses: RelayPublishStatus[] = []
+
+    // Start publishing asynchronously (don't await - non-blocking)
+    void (async (): Promise<void> => {
+      try {
+        if (activeRelays.length === 0) {
+          // Fallback to primary relay if no active relays
+          const relayUrl = await getPrimaryRelay()
+          if (!this.pool) {
+            throw new Error('Pool not initialized')
+          }
+          const pubs = this.pool.publish([relayUrl], event)
+          const results = await Promise.allSettled(pubs)
+
+          const successfulRelays: string[] = []
+          const { publishLog } = await import('./publishLog')
+
+          results.forEach((result) => {
+            if (result.status === 'fulfilled') {
+              successfulRelays.push(relayUrl)
+              // Log successful publication
+              void publishLog.logPublication(event.id, relayUrl, true, undefined)
+            } else {
+              const error = result.reason
+              const errorMessage = error instanceof Error ? error.message : String(error)
+              console.error(`[NostrService] Relay ${relayUrl} failed during publish:`, error)
+              relaySessionManager.markRelayFailed(relayUrl)
+              // Log failed publication
+              void publishLog.logPublication(event.id, relayUrl, false, errorMessage)
+            }
+          })
+
+          // Update published status in IndexedDB
+          await this.updatePublishedStatus(event.id, successfulRelays.length > 0 ? successfulRelays : false)
+        } else {
+          // Publish to all active relays
+          console.warn(`[NostrService] Publishing event ${event.id} to ${activeRelays.length} active relay(s)`)
+          if (!this.pool) {
+            throw new Error('Pool not initialized')
+          }
+          const pubs = this.pool.publish(activeRelays, event)
+
+          // Track failed relays and mark them inactive for the session
+          const results = await Promise.allSettled(pubs)
+          const successfulRelays: string[] = []
+
+          const { publishLog } = await import('./publishLog')
+
+          results.forEach((result, index) => {
+            const relayUrl = activeRelays[index]
+            if (!relayUrl) {
+              return
+            }
+
+            if (result.status === 'fulfilled') {
+              successfulRelays.push(relayUrl)
+              // Log successful publication
+              void publishLog.logPublication(event.id, relayUrl, true, undefined)
+            } else {
+              const error = result.reason
+              const errorMessage = error instanceof Error ? error.message : String(error)
+              console.error(`[NostrService] Relay ${relayUrl} failed during publish:`, error)
+              relaySessionManager.markRelayFailed(relayUrl)
+              // Log failed publication
+              void publishLog.logPublication(event.id, relayUrl, false, errorMessage)
+            }
+          })
+
+          // Update published status in IndexedDB
+          await this.updatePublishedStatus(event.id, successfulRelays.length > 0 ? successfulRelays : false)
+        }
+      } catch (publishError) {
+        console.error(`[NostrService] Error during publish (non-blocking):`, publishError)
+        // Mark as not published if all relays failed
+        await this.updatePublishedStatus(event.id, false)
+      }
+    })()
+
+    // Build statuses for return (synchronous, before network completes)
+    if (returnStatus) {
+      // Return statuses immediately (will be updated asynchronously)
+      activeRelays.forEach((relayUrl) => {
+        relayStatuses.push({
+          relayUrl,
+          success: false, // Will be updated asynchronously
+        })
+      })
+      return {
+        event,
+        relayStatuses,
+      }
+    }
+
+    // Return event immediately (non-blocking)
+    return event
   }
 
   private createArticleSubscription(pool: SimplePool, limit: number): ReturnType<typeof createSubscription> {
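The core of the rewritten publish path is `Promise.allSettled`: every relay send settles (fulfilled or rejected), the results are partitioned, and the caller never sees a rejection. A self-contained model of that loop, with relay URLs and outcomes fabricated for illustration:

```typescript
// Settle all relay sends and partition by outcome; never reject the caller.
async function publishAll(
  sends: Array<{ relay: string; send: () => Promise<void> }>
): Promise<{ ok: string[]; failed: string[] }> {
  const results = await Promise.allSettled(sends.map((s) => s.send()))
  const ok: string[] = []
  const failed: string[] = []
  results.forEach((result, index) => {
    // allSettled preserves input order, so index maps back to the relay.
    const relay = sends[index]?.relay
    if (!relay) {
      return
    }
    if (result.status === 'fulfilled') {
      ok.push(relay)
    } else {
      failed.push(relay) // a rejected send is recorded, not rethrown
    }
  })
  return { ok, failed }
}

void publishAll([
  { relay: 'wss://a', send: async () => {} },
  { relay: 'wss://b', send: async () => { throw new Error('down') } },
]).then((r) => console.log(r.ok, r.failed)) // [ 'wss://a' ] [ 'wss://b' ]
```

This is why the diff can drop the outer `try`/`throw` around the relay loop: `allSettled` absorbs individual failures, and the surrounding `void (async () => { ... })()` wrapper makes the whole thing fire-and-forget so `publishEvent` can return the signed event immediately.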
@@ -181,13 +251,33 @@ class NostrService {
     }
   }
 
-  getArticleById(eventId: string): Promise<Article | null> {
-    if (!this.pool) {
-      throw new Error('Pool not initialized')
-    }
-    const filters = [{ ids: [eventId], kinds: [1] }]
-    return subscribeWithTimeout(this.pool, filters, parseArticleFromEvent, 5000)
+  async getArticleById(eventId: string): Promise<Article | null> {
+    // Read only from IndexedDB cache
+    // Try by ID first
+    const cachedById = await objectCache.getById('publication', eventId)
+    if (cachedById) {
+      return cachedById as Article
+    }
+
+    // Also try by hash if eventId is a hash
+    const cachedByHash = await objectCache.get('publication', eventId)
+    if (cachedByHash) {
+      return cachedByHash as Article
+    }
+
+    // Also check author presentations
+    const cachedAuthor = await objectCache.getById('author', eventId)
+    if (cachedAuthor) {
+      return cachedAuthor as Article
+    }
+
+    const cachedAuthorByHash = await objectCache.get('author', eventId)
+    if (cachedAuthorByHash) {
+      return cachedAuthorByHash as Article
+    }
+
+    // Not found in cache - return null (no network request)
+    return null
   }
 
   getPrivateContent(eventId: string, authorPubkey: string): Promise<string | null> {
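The rewritten `getArticleById` is a first-hit-wins chain over several cache lookups, ending in `null` rather than a network fallback. The same structure, modeled with `Map`s standing in for the IndexedDB stores:

```typescript
type Store = Map<string, unknown>

// Walk the stores in priority order and return the first hit.
async function lookupChain(key: string, stores: Store[]): Promise<unknown | null> {
  for (const store of stores) {
    const hit = store.get(key)
    if (hit !== undefined) {
      return hit // stop at the first store that knows the key
    }
  }
  return null // nothing cached: no network fallback by design
}

const publicationsById: Store = new Map([['e1', { title: 'hello' }]])
const authorsById: Store = new Map([['a1', { name: 'alice' }]])
void lookupChain('a1', [publicationsById, authorsById]).then((v) => console.log(v))
void lookupChain('nope', [publicationsById, authorsById]).then((v) => console.log(v)) // null
```

The order of the chain matters: in the diff, publication lookups run before author lookups, so an ID colliding across stores resolves as a publication.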
@@ -242,40 +332,32 @@ class NostrService {
 
   /**
    * Get event by ID (helper method)
+   * Reads only from IndexedDB cache
    */
   private async getEventById(eventId: string): Promise<Event | null> {
-    if (!this.pool) {
-      throw new Error('Pool not initialized')
-    }
-    const filters = [{ ids: [eventId], kinds: [1] }]
-    return subscribeWithTimeout(this.pool, filters, (event: Event): Event => event, 5000)
+    // Read only from IndexedDB cache
+    // Try publication cache first
+    const eventFromPublication = await objectCache.getEventById('publication', eventId)
+    if (eventFromPublication) {
+      return eventFromPublication
+    }
+
+    // Try author cache
+    const eventFromAuthor = await objectCache.getEventById('author', eventId)
+    if (eventFromAuthor) {
+      return eventFromAuthor
+    }
+
+    // Not found in cache - return null (no network request)
+    return null
   }
 
-  getProfile(pubkey: string): Promise<NostrProfile | null> {
-    if (!this.pool) {
-      throw new Error('Pool not initialized')
-    }
-    const filters = [
-      {
-        kinds: [0],
-        authors: [pubkey],
-        limit: 1,
-      },
-    ]
-
-    const parseProfile = (event: Event): NostrProfile | null => {
-      try {
-        const profile = JSON.parse(event.content) as NostrProfile
-        return { ...profile, pubkey }
-      } catch (parseProfileError) {
-        console.error('Error parsing profile:', parseProfileError)
-        return null
-      }
-    }
-
-    return subscribeWithTimeout(this.pool, filters, parseProfile, 5000)
+  async getProfile(_pubkey: string): Promise<NostrProfile | null> {
+    // Read only from IndexedDB cache
+    // Profiles (kind 0) are not currently cached in objectCache
+    // Return null if profile is not in cache (no network request)
+    // TODO: Add profile caching to objectCache if needed
+    return null
   }
 
   /**
@@ -374,6 +456,69 @@ class NostrService {
   getPool(): SimplePool | null {
     return this.pool
   }
+
+  /**
+   * Update published status in IndexedDB for an event
+   * Searches all object types to find and update the event
+   */
+  private async updatePublishedStatus(eventId: string, published: false | string[]): Promise<void> {
+    const { objectCache } = await import('./objectCache')
+    const objectTypes: Array<import('./objectCache').ObjectType> = ['author', 'series', 'publication', 'review', 'purchase', 'sponsoring', 'review_tip', 'payment_note']
+
+    // First try to find in unpublished objects (faster)
+    for (const objectType of objectTypes) {
+      try {
+        const unpublished = await objectCache.getUnpublished(objectType)
+        const matching = unpublished.find((obj) => obj.event.id === eventId)
+        if (matching) {
+          const { writeService } = await import('./writeService')
+          await writeService.updatePublished(objectType, matching.id, published)
+          return
+        }
+      } catch (error) {
+        console.warn(`[NostrService] Error checking unpublished in ${objectType}:`, error)
+        continue
+      }
+    }
+
+    // If not found in unpublished, search all objects
+    for (const objectType of objectTypes) {
+      try {
+        const db = await objectCache['initDB'](objectType)
+        const transaction = db.transaction(['objects'], 'readonly')
+        const store = transaction.objectStore('objects')
+        const request = store.openCursor()
+
+        const found = await new Promise<string | null>((resolve, reject) => {
+          request.onsuccess = (event: globalThis.Event): void => {
+            const cursor = (event.target as IDBRequest<IDBCursorWithValue>).result
+            if (cursor) {
+              const obj = cursor.value as { id: string; event: { id: string } }
+              if (obj.event.id === eventId) {
+                resolve(obj.id)
+                return
+              }
+              cursor.continue()
+            } else {
+              resolve(null)
+            }
+          }
+          request.onerror = (): void => {
+            reject(request.error)
+          }
+        })
+
+        if (found) {
+          const { writeService } = await import('./writeService')
+          await writeService.updatePublished(objectType, found, published)
+          return
+        }
+      } catch (error) {
+        console.warn(`[NostrService] Error searching for event in ${objectType}:`, error)
+        continue
+      }
+    }
+  }
 }
 
 export const nostrService = new NostrService()
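`updatePublishedStatus` above does a two-phase search: a cheap pass over the small "unpublished" subset first, then a full cursor scan of every store. The control flow, modeled with arrays instead of IndexedDB cursors (the record shape mimics the cached objects in the diff):

```typescript
interface Cached { id: string; event: { id: string }; published: false | string[] }

function findByEventId(
  unpublished: Cached[],
  everything: Cached[],
  eventId: string
): Cached | null {
  // Phase 1: cheap pass over unpublished objects (the common case right
  // after publishing, since the event just transitioned states)
  const fast = unpublished.find((o) => o.event.id === eventId)
  if (fast) return fast
  // Phase 2: full scan (the openCursor walk in the real code)
  return everything.find((o) => o.event.id === eventId) ?? null
}

const everything: Cached[] = [
  { id: 'p:1', event: { id: 'e1' }, published: ['wss://a'] },
  { id: 'p:2', event: { id: 'e2' }, published: false },
]
const unpublished = everything.filter((o) => o.published === false)
console.log(findByEventId(unpublished, everything, 'e2')?.id) // 'p:2'
console.log(findByEventId(unpublished, everything, 'e1')?.id) // 'p:1'
```

In the real method both phases additionally loop over all eight object types, returning on the first store that contains the event.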
288
lib/notificationDetector.ts
Normal file
@@ -0,0 +1,288 @@
|
/**
|
||||||
|
* Notification detector - scans IndexedDB for new events and creates notifications
|
||||||
|
* Runs in a service worker or main thread
|
||||||
|
*/
|
||||||
|
|
||||||
|
import { objectCache } from './objectCache'
|
||||||
|
import { notificationService, type NotificationType } from './notificationService'
|
||||||
|
import type { CachedObject } from './objectCache'
|
||||||
|
|
||||||
|
interface ObjectChange {
|
||||||
|
objectType: string
|
||||||
|
objectId: string
|
||||||
|
eventId: string
|
||||||
|
oldPublished: false | string[]
|
||||||
|
newPublished: false | string[]
|
||||||
|
}
|
||||||
|
|
||||||
|
class NotificationDetector {
|
||||||
|
private lastScanTime: number = 0
|
||||||
|
private scanInterval: number | null = null
|
||||||
|
private isScanning = false
|
||||||
|
private userPubkey: string | null = null
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Start scanning for notifications
|
||||||
|
*/
|
||||||
|
start(userPubkey: string): void {
|
||||||
|
if (this.scanInterval) {
|
||||||
|
return // Already started
|
||||||
|
}
|
||||||
|
|
||||||
|
this.userPubkey = userPubkey
|
||||||
|
this.lastScanTime = Date.now()
|
||||||
|
|
||||||
|
// Scan immediately
|
||||||
|
void this.scan()
|
||||||
|
|
||||||
|
// Then scan periodically (every 30 seconds)
|
||||||
|
this.scanInterval = window.setInterval(() => {
|
||||||
|
void this.scan()
|
||||||
|
}, 30000)
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Stop scanning
|
||||||
|
*/
|
||||||
|
stop(): void {
|
||||||
|
if (this.scanInterval) {
|
||||||
|
clearInterval(this.scanInterval)
|
||||||
|
this.scanInterval = null
|
||||||
|
}
|
||||||
|
this.userPubkey = null
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Scan IndexedDB for new events that should trigger notifications
|
||||||
|
*/
|
||||||
|
async scan(): Promise<void> {
|
||||||
|
if (this.isScanning || !this.userPubkey) {
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
this.isScanning = true
|
||||||
|
|
||||||
|
try {
|
||||||
|
// Scan for user-related objects
|
||||||
|
await this.scanUserObjects()
|
||||||
|
|
||||||
|
// Scan for published status changes
|
||||||
|
await this.scanPublishedStatusChanges()
|
||||||
|
|
||||||
|
this.lastScanTime = Date.now()
|
||||||
|
} catch (error) {
|
||||||
|
console.error('[NotificationDetector] Error scanning:', error)
|
||||||
|
} finally {
|
||||||
|
this.isScanning = false
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Scan for user-related objects (purchases, reviews, sponsoring, review_tips, payment_notes)
|
||||||
|
*/
|
||||||
|
private async scanUserObjects(): Promise<void> {
|
||||||
|
if (!this.userPubkey) {
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
const objectTypes: Array<{ type: string; notificationType: NotificationType }> = [
|
||||||
|
{ type: 'purchase', notificationType: 'purchase' },
|
||||||
|
{ type: 'review', notificationType: 'review' },
|
||||||
|
{ type: 'sponsoring', notificationType: 'sponsoring' },
|
||||||
|
{ type: 'review_tip', notificationType: 'review_tip' },
|
||||||
|
{ type: 'payment_note', notificationType: 'payment_note' },
|
||||||
|
]
|
||||||
|
|
||||||
|
for (const { type, notificationType } of objectTypes) {
|
||||||
|
try {
|
||||||
|
const allObjects = await objectCache.getAll(type as Parameters<typeof objectCache.getAll>[0])
|
||||||
|
const userObjects = (allObjects as CachedObject[]).filter((obj: CachedObject) => {
|
||||||
|
// Check if object is related to the user
|
||||||
|
// For purchases: targetPubkey === userPubkey
|
||||||
|
// For reviews: targetEventId points to user's article
|
||||||
|
// For sponsoring: targetPubkey === userPubkey
|
||||||
|
// For review_tips: targetEventId points to user's review
|
||||||
|
// For payment_notes: targetPubkey === userPubkey
|
||||||
|
|
||||||
|
if (type === 'purchase' || type === 'sponsoring' || type === 'payment_note') {
|
||||||
|
return (obj as { targetPubkey?: string }).targetPubkey === this.userPubkey
|
||||||
|
}
|
||||||
|
|
||||||
|
if (type === 'review' || type === 'review_tip') {
|
||||||
|
// Need to check if the target event belongs to the user
|
||||||
|
// This is more complex and may require checking the article/review
|
||||||
|
// For now, we'll create notifications for all reviews/tips
|
||||||
|
// The UI can filter them if needed
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
|
||||||
|
return false
|
||||||
|
})
|
||||||
|
|
||||||
|
// Create notifications for objects created after last scan
|
||||||
|
for (const obj of userObjects) {
|
||||||
|
const cachedObj = obj as CachedObject
|
||||||
|
if (cachedObj.createdAt * 1000 > this.lastScanTime) {
|
||||||
|
const eventId = cachedObj.id.split(':')[1] ?? cachedObj.id
|
||||||
|
await notificationService.createNotification({
|
||||||
|
type: notificationType,
|
||||||
|
objectType: type,
|
||||||
|
objectId: cachedObj.id,
|
||||||
|
eventId,
|
||||||
|
data: { object: obj },
|
||||||
|
})
|
||||||
|
}
|
||||||
|
}
|
||||||
|
} catch (error) {
|
||||||
|
console.error(`[NotificationDetector] Error scanning ${type}:`, error)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
  /**
   * Scan for published status changes (published: false -> list of relays)
   */
  private async scanPublishedStatusChanges(): Promise<void> {
    if (!this.userPubkey) {
      return
    }

    try {
      // Get all object types that can be published
      const objectTypes = ['author', 'series', 'publication', 'review', 'purchase', 'sponsoring', 'review_tip', 'payment_note']

      for (const objectType of objectTypes) {
        try {
          const allObjects = await objectCache.getAll(objectType as Parameters<typeof objectCache.getAll>[0])
          const userObjects = (allObjects as CachedObject[]).filter((obj: CachedObject) => {
            // Check if the object belongs to the user via the parsed object's pubkey
            const parsed = obj.parsed as { pubkey?: string } | undefined
            return parsed?.pubkey === this.userPubkey
          })

          for (const obj of userObjects) {
            // Check if the object was recently published (published changed from false to an array)
            if (Array.isArray(obj.published) && obj.published.length > 0) {
              // Check if we already created a notification for this
              const eventId = obj.id.split(':')[1] ?? obj.id
              const existing = await notificationService.getNotificationByEventId(eventId)

              if (existing?.type !== 'published') {
                // We can't easily track the old/new state, so we create a notification
                // if the object was published recently (created within the last hour)
                const oneHourAgo = Date.now() - 60 * 60 * 1000
                const cachedObj = obj as CachedObject
                if (cachedObj.createdAt * 1000 > oneHourAgo) {
                  const relays = cachedObj.published as string[]
                  await notificationService.createNotification({
                    type: 'published',
                    objectType,
                    objectId: cachedObj.id,
                    eventId,
                    data: {
                      relays,
                      object: obj,
                      title: 'Publication réussie',
                      message: `Votre contenu a été publié sur ${relays.length} relais`,
                    },
                  })
                }
              }
            }
          }
        } catch (error) {
          console.error(`[NotificationDetector] Error scanning published status for ${objectType}:`, error)
        }
      }
    } catch (error) {
      console.error('[NotificationDetector] Error scanning published status changes:', error)
    }
  }

  /**
   * Manually check for a specific object change
   */
  async checkObjectChange(change: ObjectChange): Promise<void> {
    if (!this.userPubkey) {
      return
    }

    try {
      // Check if the published status changed from false to an array
      if (change.oldPublished === false && Array.isArray(change.newPublished) && change.newPublished.length > 0) {
        // Get the object to check if it belongs to the user
        const obj = await objectCache.get(change.objectType as Parameters<typeof objectCache.get>[0], change.objectId)
        if (obj) {
          const cachedObj = obj as CachedObject
          const parsed = cachedObj.parsed as { pubkey?: string } | undefined
          if (parsed?.pubkey === this.userPubkey) {
            const relays = change.newPublished
            await notificationService.createNotification({
              type: 'published',
              objectType: change.objectType,
              objectId: change.objectId,
              eventId: change.eventId,
              data: {
                relays,
                object: obj,
                title: 'Publication réussie',
                message: `Votre contenu a été publié sur ${relays.length} relais`,
              },
            })
          }
        }
      }
    } catch (error) {
      console.error('[NotificationDetector] Error checking object change:', error)
    }
  }

  /**
   * Get the notification title based on type
   */
  private getNotificationTitle(type: NotificationType, _obj: CachedObject): string {
    switch (type) {
      case 'purchase':
        return 'Nouvel achat'
      case 'review':
        return 'Nouvel avis'
      case 'sponsoring':
        return 'Nouveau sponsoring'
      case 'review_tip':
        return 'Nouveau remerciement'
      case 'payment_note':
        return 'Nouvelle note de paiement'
      case 'published':
        return 'Publication réussie'
      default:
        return 'Nouvelle notification'
    }
  }

  /**
   * Get the notification message based on type
   */
  private getNotificationMessage(type: NotificationType, _obj: CachedObject): string {
    switch (type) {
      case 'purchase':
        return 'Vous avez acheté un article'
      case 'review':
        return 'Un nouvel avis a été publié'
      case 'sponsoring':
        return 'Vous avez reçu un sponsoring'
      case 'review_tip':
        return 'Vous avez reçu un remerciement'
      case 'payment_note':
        return 'Une note de paiement a été ajoutée'
      case 'published': {
        // Braces scope the lexical declaration to this case; _obj is already a CachedObject
        const relays = Array.isArray(_obj.published) ? _obj.published : []
        return `Votre contenu a été publié sur ${relays.length} relais`
      }
      default:
        return 'Nouvelle notification'
    }
  }
}

export const notificationDetector = new NotificationDetector()
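The detection rule above can be reduced to a single predicate: a `published` notification fires exactly when an object's `published` field transitions from `false` to a non-empty relay list. A minimal standalone sketch of that rule (the function name is illustrative; the real detector reads `CachedObject` rows from IndexedDB):

```typescript
type Published = false | string[]

// True exactly when an object transitions from "not published" to
// "published on at least one relay".
function shouldNotifyPublished(oldPublished: Published, newPublished: Published): boolean {
  return oldPublished === false && Array.isArray(newPublished) && newPublished.length > 0
}

console.log(shouldNotifyPublished(false, ['wss://relay.example'])) // true
console.log(shouldNotifyPublished(false, []))                      // false
console.log(shouldNotifyPublished(['wss://relay.example'], ['wss://relay.example'])) // false
```

Keeping the check `Array.isArray(newPublished)` rather than a truthiness test matters because `false` and `[]` are both "not published" in this scheme.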
82 lib/notificationReader.ts Normal file
@@ -0,0 +1,82 @@
/**
 * Notification reader service - detects new notifications in IndexedDB
 * This service is separate from the Service Worker that populates notifications
 */

import { notificationService } from './notificationService'
import type { Notification } from './notificationService'

class NotificationReader {
  private lastReadTime: number = 0
  private subscribers: Array<(notifications: Notification[]) => void> = []

  /**
   * Subscribe to notification updates
   */
  subscribe(callback: (notifications: Notification[]) => void): () => void {
    this.subscribers.push(callback)

    // Return an unsubscribe function
    return () => {
      const index = this.subscribers.indexOf(callback)
      if (index > -1) {
        this.subscribers.splice(index, 1)
      }
    }
  }

  /**
   * Notify subscribers of new notifications
   */
  private notifySubscribers(notifications: Notification[]): void {
    this.subscribers.forEach((callback) => {
      try {
        callback(notifications)
      } catch (error) {
        console.error('[NotificationReader] Error in subscriber callback:', error)
      }
    })
  }

  /**
   * Read notifications from IndexedDB
   */
  async readNotifications(limit: number = 100): Promise<Notification[]> {
    try {
      const notifications = await notificationService.getAllNotifications(limit)
      this.lastReadTime = Date.now()
      this.notifySubscribers(notifications)
      return notifications
    } catch (error) {
      console.error('[NotificationReader] Error reading notifications:', error)
      return []
    }
  }

  /**
   * Get unread notifications count
   */
  async getUnreadCount(): Promise<number> {
    try {
      return await notificationService.getUnreadCount()
    } catch (error) {
      console.error('[NotificationReader] Error getting unread count:', error)
      return 0
    }
  }

  /**
   * Get new notifications since last read
   */
  async getNewNotifications(since: number = this.lastReadTime): Promise<Notification[]> {
    try {
      const allNotifications = await notificationService.getAllNotifications(1000)
      return allNotifications.filter((n) => n.timestamp > since)
    } catch (error) {
      console.error('[NotificationReader] Error getting new notifications:', error)
      return []
    }
  }
}

export const notificationReader = new NotificationReader()
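The reader's `subscribe` method returns a closure that removes exactly the registered callback, and `notifySubscribers` isolates callback errors so one failing listener cannot break the others. A minimal standalone sketch of that contract (class and names are illustrative; the real reader additionally pulls data from IndexedDB via notificationService):

```typescript
type Listener<T> = (value: T) => void

class Broadcaster<T> {
  private subscribers: Array<Listener<T>> = []

  subscribe(callback: Listener<T>): () => void {
    this.subscribers.push(callback)
    // The returned closure removes exactly this callback
    return () => {
      const index = this.subscribers.indexOf(callback)
      if (index > -1) {
        this.subscribers.splice(index, 1)
      }
    }
  }

  emit(value: T): void {
    // Isolate subscriber errors, as the reader does with try/catch + logging
    this.subscribers.forEach((cb) => {
      try {
        cb(value)
      } catch {
        /* logged and ignored in the real service */
      }
    })
  }
}

const bus = new Broadcaster<number>()
const seen: number[] = []
const unsubscribe = bus.subscribe((n) => seen.push(n))
bus.emit(1)
unsubscribe()
bus.emit(2)
console.log(seen) // [1]
```

This is the same pattern a React component would use in `useEffect`: call `subscribe` on mount and return the unsubscribe closure as the cleanup function.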
345 lib/notificationService.ts Normal file
@@ -0,0 +1,345 @@
/**
 * Notification service - stores and manages notifications in IndexedDB
 */

const DB_NAME = 'nostr_notifications'
const DB_VERSION = 1
const STORE_NAME = 'notifications'

export type NotificationType =
  | 'purchase' // Purchase
  | 'review' // Review
  | 'sponsoring' // Sponsoring
  | 'review_tip' // Thank-you (tip on a review)
  | 'payment_note' // Payment note
  | 'published' // Object successfully published (went from published: false to a list of relays)

export interface Notification {
  id: string // Unique notification ID
  type: NotificationType
  objectType: string // Object type (author, series, publication, etc.)
  objectId: string // Object ID in objectCache
  eventId: string // Nostr event ID
  timestamp: number // Notification creation time (milliseconds)
  read: boolean // Whether the notification has been read
  data?: Record<string, unknown> // Additional data (published relays, amount, etc.)
  // Backward compatibility with the old format
  title?: string
  message?: string
  articleId?: string
  articleTitle?: string
  amount?: number
  fromPubkey?: string
}

class NotificationService {
  private db: IDBDatabase | null = null
  private initPromise: Promise<void> | null = null

  private async init(): Promise<void> {
    if (this.db) {
      return
    }

    if (this.initPromise) {
      return this.initPromise
    }

    this.initPromise = this.openDatabase()

    try {
      await this.initPromise
    } catch (error) {
      this.initPromise = null
      throw error
    }
  }

  private openDatabase(): Promise<void> {
    return new Promise((resolve, reject) => {
      if (typeof window === 'undefined' || !window.indexedDB) {
        reject(new Error('IndexedDB is not available'))
        return
      }

      const request = window.indexedDB.open(DB_NAME, DB_VERSION)

      request.onerror = (): void => {
        reject(new Error(`Failed to open IndexedDB: ${request.error}`))
      }

      request.onsuccess = (): void => {
        this.db = request.result
        resolve()
      }

      request.onupgradeneeded = (event: IDBVersionChangeEvent): void => {
        const db = (event.target as IDBOpenDBRequest).result
        if (!db.objectStoreNames.contains(STORE_NAME)) {
          const store = db.createObjectStore(STORE_NAME, { keyPath: 'id' })
          store.createIndex('type', 'type', { unique: false })
          store.createIndex('objectId', 'objectId', { unique: false })
          store.createIndex('eventId', 'eventId', { unique: false })
          store.createIndex('timestamp', 'timestamp', { unique: false })
          store.createIndex('read', 'read', { unique: false })
          store.createIndex('objectType', 'objectType', { unique: false })
        }
      }
    })
  }

  /**
   * Create a new notification
   * Uses writeService to write via a Web Worker
   */
  async createNotification(params: {
    type: NotificationType
    objectType: string
    objectId: string
    eventId: string
    data?: Record<string, unknown>
  }): Promise<void> {
    try {
      const { type, objectType, objectId, eventId, data } = params
      // Check if a notification already exists for this event
      const existing = await this.getNotificationByEventId(eventId)
      if (existing) {
        return // Notification already exists
      }

      // Use writeService to create the notification via the Web Worker
      const { writeService } = await import('./writeService')
      await writeService.createNotification(type, objectType, objectId, eventId, data)
    } catch (error) {
      console.error('[NotificationService] Error creating notification:', error)
    }
  }

  /**
   * Get a notification by event ID
   */
  async getNotificationByEventId(eventId: string): Promise<Notification | null> {
    try {
      await this.init()

      if (!this.db) {
        return null
      }

      const transaction = this.db.transaction([STORE_NAME], 'readonly')
      const store = transaction.objectStore(STORE_NAME)
      const index = store.index('eventId')

      return new Promise((resolve, reject) => {
        const request = index.get(eventId)
        request.onsuccess = (): void => {
          resolve((request.result as Notification) ?? null)
        }
        request.onerror = (): void => {
          reject(request.error)
        }
      })
    } catch (error) {
      console.error('[NotificationService] Error getting notification by event ID:', error)
      return null
    }
  }

  /**
   * Get all notifications for a user
   */
  async getAllNotifications(limit: number = 100): Promise<Notification[]> {
    try {
      await this.init()

      if (!this.db) {
        return []
      }

      const transaction = this.db.transaction([STORE_NAME], 'readonly')
      const store = transaction.objectStore(STORE_NAME)
      const index = store.index('timestamp')

      return new Promise((resolve, reject) => {
        const request = index.openCursor(null, 'prev') // Descending order (newest first)
        const notifications: Notification[] = []

        request.onsuccess = (event: globalThis.Event): void => {
          const cursor = (event.target as IDBRequest<IDBCursorWithValue>).result
          if (cursor) {
            notifications.push(cursor.value as Notification)
            if (notifications.length < limit) {
              cursor.continue()
            } else {
              resolve(notifications)
            }
          } else {
            resolve(notifications)
          }
        }
        request.onerror = (): void => {
          reject(request.error)
        }
      })
    } catch (error) {
      console.error('[NotificationService] Error getting all notifications:', error)
      return []
    }
  }

  /**
   * Get unread notifications count
   */
  async getUnreadCount(): Promise<number> {
    try {
      await this.init()

      if (!this.db) {
        return 0
      }

      const transaction = this.db.transaction([STORE_NAME], 'readonly')
      const store = transaction.objectStore(STORE_NAME)
      const index = store.index('read')

      return new Promise((resolve, reject) => {
        const request = index.openCursor(IDBKeyRange.only(false))
        let count = 0

        request.onsuccess = (event: globalThis.Event): void => {
          const cursor = (event.target as IDBRequest<IDBCursorWithValue>).result
          if (cursor) {
            count++
            cursor.continue()
          } else {
            resolve(count)
          }
        }
        request.onerror = (): void => {
          reject(request.error)
        }
      })
    } catch (error) {
      console.error('[NotificationService] Error getting unread count:', error)
      return 0
    }
  }

  /**
   * Mark a notification as read
   */
  async markAsRead(notificationId: string): Promise<void> {
    try {
      await this.init()

      if (!this.db) {
        throw new Error('Database not initialized')
      }

      const transaction = this.db.transaction([STORE_NAME], 'readwrite')
      const store = transaction.objectStore(STORE_NAME)
      const request = store.get(notificationId)

      await new Promise<void>((resolve, reject) => {
        request.onsuccess = (): void => {
          const notification = request.result as Notification | undefined
          if (!notification) {
            reject(new Error('Notification not found'))
            return
          }

          const updatedNotification: Notification = {
            ...notification,
            read: true,
          }

          const updateRequest = store.put(updatedNotification)
          updateRequest.onsuccess = (): void => {
            resolve()
          }
          updateRequest.onerror = (): void => {
            reject(new Error(`Failed to update notification: ${updateRequest.error}`))
          }
        }
        request.onerror = (): void => {
          reject(request.error)
        }
      })
    } catch (error) {
      console.error('[NotificationService] Error marking notification as read:', error)
      throw error
    }
  }

  /**
   * Mark all notifications as read
   */
  async markAllAsRead(): Promise<void> {
    try {
      await this.init()

      if (!this.db) {
        throw new Error('Database not initialized')
      }

      const notifications = await this.getAllNotifications(10000)
      const transaction = this.db.transaction([STORE_NAME], 'readwrite')
      const store = transaction.objectStore(STORE_NAME)

      await Promise.all(
        notifications
          .filter((n) => !n.read)
          .map(
            (notification) =>
              new Promise<void>((resolve, reject) => {
                const updatedNotification: Notification = {
                  ...notification,
                  read: true,
                }
                const request = store.put(updatedNotification)
                request.onsuccess = (): void => {
                  resolve()
                }
                request.onerror = (): void => {
                  reject(request.error)
                }
              })
          )
      )
    } catch (error) {
      console.error('[NotificationService] Error marking all notifications as read:', error)
      throw error
    }
  }

  /**
   * Delete a notification
   */
  async deleteNotification(notificationId: string): Promise<void> {
    try {
      await this.init()

      if (!this.db) {
        throw new Error('Database not initialized')
      }

      const transaction = this.db.transaction([STORE_NAME], 'readwrite')
      const store = transaction.objectStore(STORE_NAME)

      await new Promise<void>((resolve, reject) => {
        const request = store.delete(notificationId)
        request.onsuccess = (): void => {
          resolve()
        }
        request.onerror = (): void => {
          reject(new Error(`Failed to delete notification: ${request.error}`))
        }
      })
    } catch (error) {
      console.error('[NotificationService] Error deleting notification:', error)
      throw error
    }
  }
}

export const notificationService = new NotificationService()
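`createNotification` is idempotent per Nostr event: it looks up the `eventId` index first and silently skips duplicates, which lets the detector re-scan the cache without flooding the user. An illustrative in-memory model of that dedup rule (names are hypothetical; the real service checks IndexedDB and then delegates the write to writeService):

```typescript
interface Note {
  eventId: string
  type: string
}

class NoteStore {
  private byEventId = new Map<string, Note>()

  // Returns true if stored, false if a note for this event already exists
  create(note: Note): boolean {
    if (this.byEventId.has(note.eventId)) {
      return false // duplicate: same Nostr event, skip
    }
    this.byEventId.set(note.eventId, note)
    return true
  }

  count(): number {
    return this.byEventId.size
  }
}

const store = new NoteStore()
store.create({ eventId: 'ev1', type: 'purchase' })
store.create({ eventId: 'ev1', type: 'purchase' }) // ignored duplicate
console.log(store.count()) // 1
```

The same keying is why the store defines a non-unique `eventId` index: lookup must be cheap because it runs before every write.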
@@ -10,7 +10,7 @@ import { buildObjectId } from './urlGenerator'
 export type ObjectType = 'author' | 'series' | 'publication' | 'review' | 'purchase' | 'sponsoring' | 'review_tip' | 'payment_note'
 
-interface CachedObject {
+export interface CachedObject {
   id: string // Format: <hash>_<index>_<version>
   hash: string // SHA-256 hash of the object
   hashId: string // Legacy field for backward compatibility
@@ -21,10 +21,11 @@ interface CachedObject {
   hidden: boolean
   createdAt: number
   cachedAt: number
+  published: false | string[] // false if not published, array of relay URLs that successfully published
 }
 
 const DB_PREFIX = 'nostr_objects_'
-const DB_VERSION = 2 // Incremented to add id, hash, index fields
+const DB_VERSION = 3 // Incremented to add published field
 
 class ObjectCacheService {
   private dbs: Map<ObjectType, IDBDatabase> = new Map()
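The new `published` field is a discriminated union rather than a boolean: `false` means "not yet accepted by any relay", and a string array lists the relays that accepted the event, so a single field carries both the flag and the ok/ko detail. A short sketch of how consumers are expected to narrow it (the helper name is illustrative):

```typescript
type Published = false | string[]

function describePublished(published: Published): string {
  // Array.isArray narrows the union: inside the branch, `published` is string[]
  if (Array.isArray(published)) {
    return `published on ${published.length} relay(s)`
  }
  return 'not published'
}

console.log(describePublished(false)) // not published
console.log(describePublished(['wss://relay.example'])) // published on 1 relay(s)
```

Note that an empty array is representable but should not occur: "published nowhere" is encoded as `false`, not `[]`.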
@@ -63,6 +64,7 @@ class ObjectCacheService {
           store.createIndex('index', 'index', { unique: false })
           store.createIndex('hidden', 'hidden', { unique: false })
           store.createIndex('cachedAt', 'cachedAt', { unique: false })
+          store.createIndex('published', 'published', { unique: false })
         } else {
           // Migration: add new indexes if they don't exist
           const {transaction} = (event.target as IDBOpenDBRequest)
@@ -74,6 +76,9 @@ class ObjectCacheService {
           if (!store.indexNames.contains('index')) {
             store.createIndex('index', 'index', { unique: false })
           }
+          if (!store.indexNames.contains('published')) {
+            store.createIndex('published', 'published', { unique: false })
+          }
         }
       }
     }
@@ -108,6 +113,7 @@ class ObjectCacheService {
   /**
    * Store an object in cache
    * Verifies and sets the index before insertion
+   * @param published - false if not published, or array of relay URLs that successfully published
    */
   async set(
     objectType: ObjectType,
@@ -116,7 +122,8 @@ class ObjectCacheService {
     parsed: unknown,
     version: number,
     hidden: boolean,
-    index?: number
+    index?: number,
+    published: false | string[] = false
   ): Promise<void> {
     try {
       const db = await this.initDB(objectType)
@@ -133,6 +140,20 @@ class ObjectCacheService {
       const transaction = db.transaction(['objects'], 'readwrite')
       const store = transaction.objectStore('objects')
+
+      // Check if the object already exists, to preserve published status when updating
+      const existing = await new Promise<CachedObject | null>((resolve, reject) => {
+        const getRequest = store.get(id)
+        getRequest.onsuccess = (): void => {
+          resolve((getRequest.result as CachedObject) ?? null)
+        }
+        getRequest.onerror = (): void => {
+          reject(getRequest.error)
+        }
+      }).catch(() => null)
+
+      // If updating and published is not provided, preserve the existing published status
+      const finalPublished = existing && published === false ? existing.published : published
+
       const cached: CachedObject = {
         id,
         hash,
@@ -144,6 +165,7 @@ class ObjectCacheService {
         hidden,
         createdAt: event.created_at,
         cachedAt: Date.now(),
+        published: finalPublished,
       }
 
       await new Promise<void>((resolve, reject) => {
@@ -160,6 +182,104 @@ class ObjectCacheService {
     }
   }
+
+  /**
+   * Update the published status for an object
+   */
+  async updatePublished(
+    objectType: ObjectType,
+    id: string,
+    published: false | string[]
+  ): Promise<void> {
+    try {
+      const db = await this.initDB(objectType)
+      const transaction = db.transaction(['objects'], 'readwrite')
+      const store = transaction.objectStore('objects')
+
+      const existing = await new Promise<CachedObject | null>((resolve, reject) => {
+        const request = store.get(id)
+        request.onsuccess = (): void => {
+          resolve((request.result as CachedObject) ?? null)
+        }
+        request.onerror = (): void => {
+          reject(request.error)
+        }
+      })
+
+      if (!existing) {
+        console.warn(`Object ${id} not found in cache, cannot update published status`)
+        return
+      }
+
+      const oldPublished = existing.published
+      const updated: CachedObject = {
+        ...existing,
+        published,
+      }
+
+      await new Promise<void>((resolve, reject) => {
+        const request = store.put(updated)
+        request.onsuccess = (): void => {
+          resolve()
+        }
+        request.onerror = (): void => {
+          reject(request.error)
+        }
+      })
+
+      // Notify about the published status change (false -> array of relays)
+      if (oldPublished === false && Array.isArray(published) && published.length > 0) {
+        const eventId = id.split(':')[1] ?? id
+        // Dynamic import instead of require(): this is an ES module and avoids a circular dependency
+        const { notificationDetector } = await import('./notificationDetector')
+        void notificationDetector.checkObjectChange({
+          objectType,
+          objectId: id,
+          eventId,
+          oldPublished,
+          newPublished: published,
+        })
+      }
+    } catch (updateError) {
+      console.error(`Error updating published status for ${objectType} object:`, updateError)
+    }
+  }
+
+  /**
+   * Get all objects that are not published (published === false)
+   */
+  async getUnpublished(objectType: ObjectType): Promise<Array<{ id: string; event: NostrEvent }>> {
+    try {
+      const db = await this.initDB(objectType)
+      const transaction = db.transaction(['objects'], 'readonly')
+      const store = transaction.objectStore('objects')
+
+      return new Promise((resolve, reject) => {
+        const request = store.openCursor()
+        const unpublished: Array<{ id: string; event: NostrEvent }> = []
+
+        request.onsuccess = (event: globalThis.Event): void => {
+          const cursor = (event.target as IDBRequest<IDBCursorWithValue>).result
+          if (cursor) {
+            const obj = cursor.value as CachedObject
+            // Check that published is false (not an array)
+            if (obj.published === false && !obj.hidden) {
+              unpublished.push({ id: obj.id, event: obj.event })
+            }
+            cursor.continue()
+          } else {
+            resolve(unpublished)
+          }
+        }
+
+        request.onerror = (): void => {
+          reject(request.error)
+        }
+      })
+    } catch (getUnpublishedError) {
+      console.error(`Error retrieving unpublished ${objectType} objects:`, getUnpublishedError)
+      return []
+    }
+  }
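The merge rule in `set()` deserves a second look, since it is what keeps re-caching from losing data: because `published` defaults to `false`, a caller that merely refreshes a cached object must not wipe out a relay list recorded earlier. A standalone restatement of that rule (using `undefined` for "no existing row", where the real code uses `null`):

```typescript
type Published = false | string[]

// Mirrors: existing && published === false ? existing.published : published
// Only an explicit relay list (or a genuinely new object) sets the value;
// the default `false` on an update preserves the existing state.
function mergePublished(existing: Published | undefined, incoming: Published): Published {
  return existing !== undefined && incoming === false ? existing : incoming
}

console.log(mergePublished(['wss://relay.example'], false)) // [ 'wss://relay.example' ]
console.log(mergePublished(undefined, false)) // false
console.log(mergePublished(false, ['wss://relay.example'])) // [ 'wss://relay.example' ]
```

A side effect of this choice is that `set()` alone cannot move an object back to "unpublished"; that transition has to go through `updatePublished`, which also fires the change notification.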
 /**
  * Get an object from cache by hash
  * Returns the latest non-hidden version
@@ -233,6 +353,35 @@ class ObjectCacheService {
     }
   }
+
+  /**
+   * Get the raw event from cache by ID
+   */
+  async getEventById(objectType: ObjectType, id: string): Promise<NostrEvent | null> {
+    try {
+      const db = await this.initDB(objectType)
+      const transaction = db.transaction(['objects'], 'readonly')
+      const store = transaction.objectStore('objects')
+
+      return new Promise<NostrEvent | null>((resolve, reject) => {
+        const request = store.get(id)
+        request.onsuccess = (): void => {
+          const obj = request.result as CachedObject | undefined
+          if (obj && !obj.hidden) {
+            resolve(obj.event)
+          } else {
+            resolve(null)
+          }
+        }
+        request.onerror = (): void => {
+          reject(request.error)
+        }
+      })
+    } catch (retrieveByIdError) {
+      console.error(`Error retrieving ${objectType} event by ID from cache:`, retrieveByIdError)
+      return null
+    }
+  }
+
   /**
    * Get an author presentation by pubkey (searches all cached authors)
    */
|||||||
@ -9,7 +9,6 @@ import type { SimplePoolWithSub } from '@/types/nostr-tools-extended'
|
|||||||
import { nostrService } from './nostr'
|
import { nostrService } from './nostr'
|
||||||
import { PLATFORM_SERVICE, MIN_EVENT_DATE } from './platformConfig'
|
import { PLATFORM_SERVICE, MIN_EVENT_DATE } from './platformConfig'
|
||||||
import { extractTagsFromEvent } from './nostrTagSystem'
|
import { extractTagsFromEvent } from './nostrTagSystem'
|
||||||
import { objectCache } from './objectCache'
|
|
||||||
import { parsePresentationEvent } from './articlePublisherHelpersPresentation'
|
import { parsePresentationEvent } from './articlePublisherHelpersPresentation'
|
||||||
import { parseArticleFromEvent, parseSeriesFromEvent, parseReviewFromEvent, parsePurchaseFromEvent, parseReviewTipFromEvent, parseSponsoringFromEvent } from './nostrEventParsing'
|
import { parseArticleFromEvent, parseSeriesFromEvent, parseReviewFromEvent, parsePurchaseFromEvent, parseReviewTipFromEvent, parseSponsoringFromEvent } from './nostrEventParsing'
|
||||||
|
|
||||||
@@ -120,7 +119,7 @@ class PlatformSyncService {
    let resolved = false
    let eventCount = 0

-    const finalize = (): void => {
+    const finalize = async (): Promise<void> => {
      if (resolved) {
        return
      }
@@ -136,6 +135,16 @@ class PlatformSyncService {
      }

      console.warn(`[PlatformSync] Relay ${relayUrl} completed: received ${eventCount} total events from relay, ${relayEvents.length} filtered with service='${PLATFORM_SERVICE}'`)

+      // Update lastSyncDate for this relay on successful sync
+      if (relayEvents.length > 0 || eventCount > 0) {
+        const { configStorage } = await import('./configStorage')
+        const config = await configStorage.getConfig()
+        const relayConfig = config.relays.find((r) => r.url === relayUrl)
+        if (relayConfig) {
+          await configStorage.updateRelay(relayConfig.id, { lastSyncDate: Date.now() })
+        }
+      }
    }
    await new Promise<void>((resolve) => {
      sub.on('event', (event: Event): void => {
@@ -191,7 +200,7 @@ class PlatformSyncService {
      // Timeout after SYNC_TIMEOUT_MS
      const timeoutId = setTimeout((): void => {
        console.warn(`[PlatformSync] Relay ${relayUrl} timeout after ${this.SYNC_TIMEOUT_MS}ms`)
-        finalize()
+        void finalize()
        resolve()
      }, this.SYNC_TIMEOUT_MS)
      timeoutId.unref?.()
@@ -281,7 +290,8 @@ class PlatformSyncService {
          })
        }
        if (parsed && parsed.hash) {
-          await objectCache.set('author', parsed.hash, event, parsed, tags.version ?? 0, tags.hidden, parsed.index)
+          const { writeService } = await import('./writeService')
+          await writeService.writeObject('author', parsed.hash, event, parsed, tags.version ?? 0, tags.hidden, parsed.index, false)
          if (event.id === '527d83e0af20bf23c3e104974090ccc21536ece72c24eb784b3642890f63b763') {
            console.warn(`[PlatformSync] Target event cached successfully as author with hash:`, parsed.hash)
          }
@@ -291,31 +301,37 @@ class PlatformSyncService {
      } else if (tags.type === 'series') {
        const parsed = await parseSeriesFromEvent(event)
        if (parsed && parsed.hash) {
-          await objectCache.set('series', parsed.hash, event, parsed, tags.version ?? 0, tags.hidden, parsed.index)
+          const { writeService } = await import('./writeService')
+          await writeService.writeObject('series', parsed.hash, event, parsed, tags.version ?? 0, tags.hidden, parsed.index, false)
        }
      } else if (tags.type === 'publication') {
        const parsed = await parseArticleFromEvent(event)
        if (parsed && parsed.hash) {
-          await objectCache.set('publication', parsed.hash, event, parsed, tags.version ?? 0, tags.hidden, parsed.index)
+          const { writeService } = await import('./writeService')
+          await writeService.writeObject('publication', parsed.hash, event, parsed, tags.version ?? 0, tags.hidden, parsed.index, false)
        }
      } else if (tags.type === 'quote') {
        const parsed = await parseReviewFromEvent(event)
        if (parsed && parsed.hash) {
-          await objectCache.set('review', parsed.hash, event, parsed, tags.version ?? 0, tags.hidden, parsed.index)
+          const { writeService } = await import('./writeService')
+          await writeService.writeObject('review', parsed.hash, event, parsed, tags.version ?? 0, tags.hidden, parsed.index, false)
        }
      } else if (event.kind === 9735) {
        // Zap receipts (kind 9735) can be sponsoring, purchase, or review_tip
        const sponsoring = await parseSponsoringFromEvent(event)
        if (sponsoring && sponsoring.hash) {
-          await objectCache.set('sponsoring', sponsoring.hash, event, sponsoring, 0, false, sponsoring.index)
+          const { writeService } = await import('./writeService')
+          await writeService.writeObject('sponsoring', sponsoring.hash, event, sponsoring, 0, false, sponsoring.index, false)
        } else {
          const purchase = await parsePurchaseFromEvent(event)
          if (purchase && purchase.hash) {
-            await objectCache.set('purchase', purchase.hash, event, purchase, 0, false, purchase.index)
+            const { writeService } = await import('./writeService')
+            await writeService.writeObject('purchase', purchase.hash, event, purchase, 0, false, purchase.index, false)
          } else {
            const reviewTip = await parseReviewTipFromEvent(event)
            if (reviewTip && reviewTip.hash) {
-              await objectCache.set('review_tip', reviewTip.hash, event, reviewTip, 0, false, reviewTip.index)
+              const { writeService } = await import('./writeService')
+              await writeService.writeObject('review_tip', reviewTip.hash, event, reviewTip, 0, false, reviewTip.index, false)
            }
          }
        }
      }
@@ -324,8 +340,27 @@ class PlatformSyncService {

  /**
   * Start continuous sync (runs periodically)
+   * Can use Service Worker if available, otherwise falls back to setInterval
   */
-  startContinuousSync(): void {
+  async startContinuousSync(): Promise<void> {
+    // Try to use Service Worker for background sync
+    if (typeof window !== 'undefined') {
+      try {
+        const { swClient } = await import('./swClient')
+        const isReady = await swClient.isReady()
+        if (isReady) {
+          console.warn('[PlatformSync] Using Service Worker for background sync')
+          await swClient.startPlatformSync()
+          // Still start initial sync in main thread
+          void this.startSync()
+          return
+        }
+      } catch (error) {
+        console.warn('[PlatformSync] Service Worker not available, using setInterval:', error)
+      }
+    }
+
+    // Fallback to setInterval if Service Worker not available
    // Start initial sync
    void this.startSync()
@@ -340,7 +375,21 @@ class PlatformSyncService {
  /**
   * Stop sync
   */
-  stopSync(): void {
+  async stopSync(): Promise<void> {
+    // Stop Service Worker sync if active
+    if (typeof window !== 'undefined') {
+      try {
+        const { swClient } = await import('./swClient')
+        const isReady = await swClient.isReady()
+        if (isReady) {
+          await swClient.stopPlatformSync()
+        }
+      } catch (error) {
+        // Ignore errors
+      }
+    }
+
+    // Stop local sync
    if (this.syncSubscription) {
      this.syncSubscription.unsub()
      this.syncSubscription = null
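A recurring pattern in this diff is a `finalize` guard: a collection Promise must resolve exactly once, whichever of EOSE, a timeout, or cleanup fires first, and later calls must be no-ops. A minimal standalone sketch of that guard (names are illustrative, not from the codebase):

```typescript
// Resolve-once guard: whichever of several completion paths fires first wins.
function collectUntilDone<T>(onDone: (items: T[]) => void): { push: (item: T) => void; finalize: () => void } {
  const items: T[] = []
  let resolved = false
  const finalize = (): void => {
    if (resolved) {
      return // later calls (timeout firing after EOSE, etc.) are no-ops
    }
    resolved = true
    onDone(items)
  }
  return { push: (item) => items.push(item), finalize }
}

// Both completion paths call finalize(); only the first has any effect.
const collector = collectUntilDone<number>((items) => console.log('done with', items.length, 'items'))
collector.push(1)
collector.push(2)
collector.finalize() // prints: done with 2 items
collector.finalize() // no-op
```

This is why the timeout callbacks in the diff can call `finalize()` unconditionally: the `resolved` flag makes double completion harmless.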
@@ -3,8 +3,8 @@ import { nostrService } from './nostr'
import { PLATFORM_NPUB } from './platformConfig'
import type { SimplePoolWithSub } from '@/types/nostr-tools-extended'
import type { ContentDeliveryTracking } from './platformTrackingTypes'
-import { buildTrackingEvent } from './platformTrackingEvents'
+import { buildTrackingEvent, getTrackingKind } from './platformTrackingEvents'
-import { parseTrackingEvent, createArticleDeliveriesSubscription, createRecipientDeliveriesSubscription } from './platformTrackingQueries'
+import { parseTrackingEvent } from './platformTrackingQueries'

export type { ContentDeliveryTracking } from './platformTrackingTypes'
@@ -17,12 +17,8 @@ export class PlatformTrackingService {
  private readonly platformPubkey: string = PLATFORM_NPUB

  private async publishTrackingEvent(event: Event): Promise<void> {
-    const pool = nostrService.getPool()
-    if (!pool) {
-      throw new Error('Pool not initialized')
-    }
-
-    // Publish to all active relays (enabled and not marked inactive for this session)
+    // Publish to all active relays via websocketService (routes to Service Worker)
+    const { websocketService } = await import('./websocketService')
    const { relaySessionManager } = await import('./relaySessionManager')
    const activeRelays = await relaySessionManager.getActiveRelays()

@@ -30,26 +26,11 @@ export class PlatformTrackingService {
      // Fallback to primary relay if no active relays
      const { getPrimaryRelaySync } = await import('./config')
      const relayUrl = getPrimaryRelaySync()
-      const pubs = pool.publish([relayUrl], event)
-      await Promise.all(pubs)
+      await websocketService.publishEvent(event, [relayUrl])
    } else {
      // Publish to all active relays
      console.warn(`[PlatformTracking] Publishing tracking event ${event.id} to ${activeRelays.length} active relay(s)`)
-      const pubs = pool.publish(activeRelays, event)
-
-      // Track failed relays and mark them inactive for the session
-      const results = await Promise.allSettled(pubs)
-      results.forEach((result, index) => {
-        const relayUrl = activeRelays[index]
-        if (!relayUrl) {
-          return
-        }
-        if (result.status === 'rejected') {
-          const error = result.reason
-          console.error(`[PlatformTracking] Relay ${relayUrl} failed during publish:`, error)
-          relaySessionManager.markRelayFailed(relayUrl)
-        }
-      })
+      await websocketService.publishEvent(event, activeRelays)
    }
  }

@@ -113,37 +94,66 @@ export class PlatformTrackingService {
  /**
   * Query tracking events for an article
   * Returns all delivery tracking events for a specific article
+   * Uses websocketService to route events to Service Worker
   */
  async getArticleDeliveries(articleId: string): Promise<ContentDeliveryTracking[]> {
    try {
-      const pool = nostrService.getPool()
-      if (!pool) {
-        return []
-      }
+      const { websocketService } = await import('./websocketService')
+      const { getPrimaryRelaySync } = await import('./config')
+      const { swClient } = await import('./swClient')
+
+      const filters = [
+        {
+          kinds: [getTrackingKind()],
+          '#p': [this.platformPubkey],
+          '#article': [articleId],
+          limit: 100,
+        },
+      ]
+
+      const relayUrl = getPrimaryRelaySync()
+
      return new Promise((resolve) => {
        const deliveries: ContentDeliveryTracking[] = []
        let resolved = false
-        const sub = createArticleDeliveriesSubscription(pool, articleId, this.platformPubkey)
+        let unsubscribe: (() => void) | null = null
+        let eoseReceived = false

        const finalize = (): void => {
          if (resolved) {
            return
          }
          resolved = true
-          sub.unsub()
+          if (unsubscribe) {
+            unsubscribe()
+          }
          resolve(deliveries)
        }

-        sub.on('event', (event: Event) => {
+        // Subscribe via websocketService (routes to Service Worker)
+        void websocketService.subscribe([relayUrl], filters, (event: Event) => {
          const delivery = parseTrackingEvent(event)
          if (delivery) {
            deliveries.push(delivery)
          }
+        }).then((unsub) => {
+          unsubscribe = unsub
        })

-        sub.on('eose', finalize)
-        setTimeout(finalize, 5000)
+        // Listen for EOSE via Service Worker messages
+        const handleEOSE = (data: { relays: string[] }): void => {
+          if (data.relays.includes(relayUrl) && !eoseReceived) {
+            eoseReceived = true
+            finalize()
+          }
+        }
+        swClient.onMessage('WEBSOCKET_EOSE', handleEOSE)
+
+        setTimeout(() => {
+          if (!eoseReceived) {
+            finalize()
+          }
+        }, 5000)
      })
    } catch (error) {
      console.error('Error querying article deliveries', {
@@ -156,37 +166,67 @@ export class PlatformTrackingService {

  /**
   * Query all deliveries for a recipient
+   * Uses websocketService to route events to Service Worker
   */
  async getRecipientDeliveries(recipientPubkey: string): Promise<ContentDeliveryTracking[]> {
    try {
-      const pool = nostrService.getPool()
-      if (!pool) {
-        return []
-      }
+      const { websocketService } = await import('./websocketService')
+      const { getPrimaryRelaySync } = await import('./config')
+      const { getTrackingKind } = await import('./platformTrackingEvents')
+      const { swClient } = await import('./swClient')
+
+      const filters = [
+        {
+          kinds: [getTrackingKind()],
+          '#p': [this.platformPubkey],
+          '#recipient': [recipientPubkey],
+          limit: 100,
+        },
+      ]
+
+      const relayUrl = getPrimaryRelaySync()
+
      return new Promise((resolve) => {
        const deliveries: ContentDeliveryTracking[] = []
        let resolved = false
-        const sub = createRecipientDeliveriesSubscription(pool, recipientPubkey, this.platformPubkey)
+        let unsubscribe: (() => void) | null = null
+        let eoseReceived = false

        const finalize = (): void => {
          if (resolved) {
            return
          }
          resolved = true
-          sub.unsub()
+          if (unsubscribe) {
+            unsubscribe()
+          }
          resolve(deliveries)
        }

-        sub.on('event', (event: Event) => {
+        // Subscribe via websocketService (routes to Service Worker)
+        void websocketService.subscribe([relayUrl], filters, (event: Event) => {
          const delivery = parseTrackingEvent(event)
          if (delivery) {
            deliveries.push(delivery)
          }
+        }).then((unsub) => {
+          unsubscribe = unsub
        })

-        sub.on('eose', finalize)
-        setTimeout(finalize, 5000)
+        // Listen for EOSE via Service Worker messages
+        const handleEOSE = (data: { relays: string[] }): void => {
+          if (data.relays.includes(relayUrl) && !eoseReceived) {
+            eoseReceived = true
+            finalize()
+          }
+        }
+        swClient.onMessage('WEBSOCKET_EOSE', handleEOSE)
+
+        setTimeout(() => {
+          if (!eoseReceived) {
+            finalize()
+          }
+        }, 5000)
      })
    } catch (error) {
      console.error('Error querying recipient deliveries', {
lib/publishLog.ts (new file, 266 lines)

/**
 * Publication log service - stores publication attempts and results in IndexedDB
 */

const DB_NAME = 'nostr_publish_log'
const DB_VERSION = 1
const STORE_NAME = 'publications'

interface PublicationLogEntry {
  id: string // Event ID
  eventId: string // Event ID (duplicate for easier querying)
  relayUrl: string
  success: boolean
  error?: string
  timestamp: number
  objectType?: string // Type of object being published (author, series, publication, etc.)
  objectId?: string // ID of the object in cache
}

class PublishLogService {
  private db: IDBDatabase | null = null
  private initPromise: Promise<void> | null = null

  private async init(): Promise<void> {
    if (this.db) {
      return
    }

    if (this.initPromise) {
      return this.initPromise
    }

    this.initPromise = this.openDatabase()

    try {
      await this.initPromise
    } catch (error) {
      this.initPromise = null
      throw error
    }
  }

  private openDatabase(): Promise<void> {
    return new Promise((resolve, reject) => {
      if (typeof window === 'undefined' || !window.indexedDB) {
        reject(new Error('IndexedDB is not available'))
        return
      }

      const request = indexedDB.open(DB_NAME, DB_VERSION)

      request.onerror = (): void => {
        reject(new Error(`Failed to open IndexedDB: ${request.error}`))
      }

      request.onsuccess = (): void => {
        this.db = request.result
        resolve()
      }

      request.onupgradeneeded = (event: IDBVersionChangeEvent): void => {
        const db = (event.target as IDBOpenDBRequest).result
        if (!db.objectStoreNames.contains(STORE_NAME)) {
          const store = db.createObjectStore(STORE_NAME, { keyPath: 'id', autoIncrement: true })
          store.createIndex('eventId', 'eventId', { unique: false })
          store.createIndex('relayUrl', 'relayUrl', { unique: false })
          store.createIndex('timestamp', 'timestamp', { unique: false })
          store.createIndex('success', 'success', { unique: false })
        }
      }
    })
  }

  /**
   * Log a publication attempt
   * Uses writeService to write via the Web Worker
   */
  async logPublication(
    eventId: string,
    relayUrl: string,
    success: boolean,
    error?: string,
    objectType?: string,
    objectId?: string
  ): Promise<void> {
    // Route the write through writeService (Web Worker)
    const { writeService } = await import('./writeService')
    await writeService.logPublication(eventId, relayUrl, success, error, objectType, objectId)
  }

  /**
   * Log a publication attempt (legacy method, kept as a fallback for writeService)
   * @deprecated Use logPublication, which goes through writeService
   * @internal Only used by writeService as a fallback
   */
  async logPublicationDirect(
    eventId: string,
    relayUrl: string,
    success: boolean,
    error?: string,
    objectType?: string,
    objectId?: string
  ): Promise<void> {
    try {
      await this.init()

      if (!this.db) {
        throw new Error('Database not initialized')
      }

      const entry: PublicationLogEntry = {
        id: `${eventId}_${relayUrl}_${Date.now()}`, // Unique ID
        eventId,
        relayUrl,
        success,
        ...(error !== undefined ? { error } : {}),
        timestamp: Date.now(),
        ...(objectType !== undefined ? { objectType } : {}),
        ...(objectId !== undefined ? { objectId } : {}),
      }

      const transaction = this.db.transaction([STORE_NAME], 'readwrite')
      const store = transaction.objectStore(STORE_NAME)

      await new Promise<void>((resolve, reject) => {
        const request = store.add(entry)
        request.onsuccess = (): void => {
          resolve()
        }
        request.onerror = (): void => {
          reject(new Error(`Failed to log publication: ${request.error}`))
        }
      })
    } catch (logError) {
      console.error('[PublishLog] Error logging publication:', logError)
    }
  }

  /**
   * Get publication logs for an event
   */
  async getLogsForEvent(eventId: string): Promise<PublicationLogEntry[]> {
    try {
      await this.init()

      if (!this.db) {
        return []
      }

      const transaction = this.db.transaction([STORE_NAME], 'readonly')
      const store = transaction.objectStore(STORE_NAME)
      const index = store.index('eventId')

      return new Promise((resolve, reject) => {
        const request = index.getAll(eventId)
        request.onsuccess = (): void => {
          resolve((request.result as PublicationLogEntry[]) ?? [])
        }
        request.onerror = (): void => {
          reject(request.error)
        }
      })
    } catch (error) {
      console.error('[PublishLog] Error getting logs for event:', error)
      return []
    }
  }

  /**
   * Get publication logs for a relay
   */
  async getLogsForRelay(relayUrl: string, limit: number = 100): Promise<PublicationLogEntry[]> {
    try {
      await this.init()

      if (!this.db) {
        return []
      }

      const transaction = this.db.transaction([STORE_NAME], 'readonly')
      const store = transaction.objectStore(STORE_NAME)
      const index = store.index('relayUrl')

      return new Promise((resolve, reject) => {
        const request = index.openCursor(IDBKeyRange.only(relayUrl))
        const entries: PublicationLogEntry[] = []

        request.onsuccess = (event: globalThis.Event): void => {
          const cursor = (event.target as IDBRequest<IDBCursorWithValue>).result
          if (cursor) {
            entries.push(cursor.value as PublicationLogEntry)
            if (entries.length < limit) {
              cursor.continue()
            } else {
              resolve(entries.sort((a, b) => b.timestamp - a.timestamp))
            }
          } else {
            resolve(entries.sort((a, b) => b.timestamp - a.timestamp))
          }
        }
        request.onerror = (): void => {
          reject(request.error)
        }
      })
    } catch (error) {
      console.error('[PublishLog] Error getting logs for relay:', error)
      return []
    }
  }

  /**
   * Get all publication logs (successful and failed)
   */
  async getAllLogs(limit: number = 1000): Promise<PublicationLogEntry[]> {
    try {
      await this.init()

      if (!this.db) {
        return []
      }

      const transaction = this.db.transaction([STORE_NAME], 'readonly')
      const store = transaction.objectStore(STORE_NAME)
      const index = store.index('timestamp')

      return new Promise((resolve, reject) => {
        const request = index.openCursor(null, 'prev') // Descending order
        const entries: PublicationLogEntry[] = []

        request.onsuccess = (event: globalThis.Event): void => {
          const cursor = (event.target as IDBRequest<IDBCursorWithValue>).result
          if (cursor) {
            entries.push(cursor.value as PublicationLogEntry)
            if (entries.length < limit) {
              cursor.continue()
            } else {
              resolve(entries)
            }
          } else {
            resolve(entries)
          }
        }
        request.onerror = (): void => {
          reject(request.error)
        }
      })
    } catch (error) {
      console.error('[PublishLog] Error getting all logs:', error)
      return []
    }
  }

  /**
   * Get statistics for a relay
   */
  async getRelayStats(relayUrl: string): Promise<{ total: number; success: number; failed: number }> {
    const logs = await this.getLogsForRelay(relayUrl, 10000)
    return {
      total: logs.length,
      success: logs.filter((log) => log.success).length,
      failed: logs.filter((log) => !log.success).length,
    }
  }
}

export const publishLog = new PublishLogService()
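The `getRelayStats` method above reduces per-relay log entries to success/failure counts. The reduction itself is a pure function and can be exercised without IndexedDB; here is a sketch with the entry shape trimmed to the one field the computation inspects:

```typescript
// Trimmed entry shape: only the field the stats computation actually uses.
interface LogEntryLite {
  success: boolean
}

// Same reduction as PublishLogService.getRelayStats, extracted as a pure function.
function computeRelayStats(logs: LogEntryLite[]): { total: number; success: number; failed: number } {
  return {
    total: logs.length,
    success: logs.filter((log) => log.success).length,
    failed: logs.filter((log) => !log.success).length,
  }
}

const stats = computeRelayStats([{ success: true }, { success: false }, { success: true }])
console.log(stats) // { total: 3, success: 2, failed: 1 }
```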
lib/publishResult.ts (new file, 29 lines)

import type { Event } from 'nostr-tools'

/**
 * Relay publish status
 */
export interface RelayPublishStatus {
  relayUrl: string
  success: boolean
  error?: string | undefined
}

/**
 * Result of publishing an event to relays
 */
export interface PublishResult {
  event: Event | null
  relayStatuses: RelayPublishStatus[]
}

/**
 * Extract relay display name from URL
 */
export function getRelayDisplayName(relayUrl: string): string {
  const cleaned = relayUrl.replace(/^wss?:\/\//, '').replace(/\/$/, '')
  if (cleaned.length > 30) {
    return `${cleaned.substring(0, 27)}...`
  }
  return cleaned
}
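The new `getRelayDisplayName` helper is fully self-contained; copied here verbatim for a quick check of its scheme-stripping and truncation behaviour:

```typescript
// Copied from the new lib/publishResult.ts: strips the ws(s):// scheme and a
// trailing slash, then truncates anything longer than 30 characters to 27 + "...".
function getRelayDisplayName(relayUrl: string): string {
  const cleaned = relayUrl.replace(/^wss?:\/\//, '').replace(/\/$/, '')
  if (cleaned.length > 30) {
    return `${cleaned.substring(0, 27)}...`
  }
  return cleaned
}

console.log(getRelayDisplayName('wss://relay.example.com/')) // relay.example.com
console.log(getRelayDisplayName('wss://a-very-long-relay-host-name.example.org')) // a-very-long-relay-host-name...
```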
lib/publishWorker.ts (new file, 226 lines)

/**
 * Worker service for republishing unpublished objects to relays
 * Continuously attempts to publish objects with published === false
 */

import { nostrService } from './nostr'
import { objectCache, type ObjectType } from './objectCache'
import { relaySessionManager } from './relaySessionManager'
import { publishLog } from './publishLog'
import { writeService } from './writeService'

const REPUBLISH_INTERVAL_MS = 30000 // 30 seconds
const MAX_RETRIES_PER_OBJECT = 10
const RETRY_DELAY_MS = 5000 // 5 seconds between retries for same object

interface UnpublishedObject {
  objectType: ObjectType
  id: string
  event: import('nostr-tools').Event
  retryCount: number
  lastRetryAt: number
}

class PublishWorkerService {
  private isRunning = false
  private intervalId: NodeJS.Timeout | null = null
  private unpublishedObjects: Map<string, UnpublishedObject> = new Map()
  private processing = false

  /**
   * Start the publish worker
   * Can use Service Worker if available, otherwise falls back to setInterval
   */
  async start(): Promise<void> {
    if (this.isRunning) {
      return
    }

    this.isRunning = true
    console.warn('[PublishWorker] Starting publish worker')

    // Try to use Service Worker for background processing
    if (typeof window !== 'undefined') {
      try {
        const { swClient } = await import('./swClient')
        const isReady = await swClient.isReady()
        if (isReady) {
          console.warn('[PublishWorker] Using Service Worker for background processing')
          await swClient.startPublishWorker()
          // Still process immediately in main thread
          void this.processUnpublished()
          return
        }
      } catch (error) {
        console.warn('[PublishWorker] Service Worker not available, using setInterval:', error)
      }
    }

    // Fallback to setInterval if Service Worker not available
    // Start processing immediately
    void this.processUnpublished()

    // Then process periodically
    this.intervalId = setInterval(() => {
      void this.processUnpublished()
    }, REPUBLISH_INTERVAL_MS)
  }

  /**
   * Stop the publish worker
   */
  async stop(): Promise<void> {
    if (!this.isRunning) {
      return
    }

    this.isRunning = false

    // Stop Service Worker if active
    if (typeof window !== 'undefined') {
      try {
        const { swClient } = await import('./swClient')
        const isReady = await swClient.isReady()
        if (isReady) {
          await swClient.stopPublishWorker()
        }
      } catch (error) {
        // Ignore errors
      }
    }

    // Stop local interval
    if (this.intervalId) {
      clearInterval(this.intervalId)
      this.intervalId = null
    }
    console.warn('[PublishWorker] Stopped publish worker')
  }

  /**
   * Process all unpublished objects
   * Made public for Service Worker access
   */
  async processUnpublished(): Promise<void> {
    if (this.processing) {
      return
    }

    this.processing = true

    try {
      // Load unpublished objects from all object types
      const objectTypes: ObjectType[] = ['author', 'series', 'publication', 'review', 'purchase', 'sponsoring', 'review_tip', 'payment_note']

      for (const objectType of objectTypes) {
        const unpublished = await objectCache.getUnpublished(objectType)

        for (const { id, event } of unpublished) {
          const key = `${objectType}:${id}`
          const existing = this.unpublishedObjects.get(key)

          // Skip if recently retried
          if (existing && Date.now() - existing.lastRetryAt < RETRY_DELAY_MS) {
            continue
          }

          // Skip if max retries reached
          if (existing && existing.retryCount >= MAX_RETRIES_PER_OBJECT) {
            console.warn(`[PublishWorker] Max retries reached for ${objectType}:${id}, skipping`)
            continue
          }

          // Add or update in map
          this.unpublishedObjects.set(key, {
            objectType,
            id,
            event,
            retryCount: existing?.retryCount ?? 0,
            lastRetryAt: Date.now(),
          })
        }
      }

      // Process all unpublished objects
|
||||||
|
const objectsToProcess = Array.from(this.unpublishedObjects.values())
|
||||||
|
for (const obj of objectsToProcess) {
|
||||||
|
await this.attemptPublish(obj)
|
||||||
|
}
|
||||||
|
} catch (error) {
|
||||||
|
console.error('[PublishWorker] Error processing unpublished objects:', error)
|
||||||
|
} finally {
|
||||||
|
this.processing = false
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Attempt to publish an unpublished object
|
||||||
|
* Uses websocketService to route events to Service Worker
|
||||||
|
*/
|
||||||
|
private async attemptPublish(obj: UnpublishedObject): Promise<void> {
|
||||||
|
try {
|
||||||
|
const { websocketService } = await import('./websocketService')
|
||||||
|
|
||||||
|
const activeRelays = await relaySessionManager.getActiveRelays()
|
||||||
|
if (activeRelays.length === 0) {
|
||||||
|
const { getPrimaryRelaySync } = await import('./config')
|
||||||
|
const relayUrl = getPrimaryRelaySync()
|
||||||
|
activeRelays.push(relayUrl)
|
||||||
|
}
|
||||||
|
|
||||||
|
console.warn(`[PublishWorker] Attempting to publish ${obj.objectType}:${obj.id} to ${activeRelays.length} relay(s)`)
|
||||||
|
|
||||||
|
// Publish to all active relays via websocketService (routes to Service Worker)
|
||||||
|
const statuses = await websocketService.publishEvent(obj.event, activeRelays)
|
||||||
|
|
||||||
|
const successfulRelays: string[] = []
|
||||||
|
statuses.forEach((status, index) => {
|
||||||
|
const relayUrl = activeRelays[index]
|
||||||
|
if (!relayUrl) {
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
if (status.success) {
|
||||||
|
successfulRelays.push(relayUrl)
|
||||||
|
// Log successful publication
|
||||||
|
void publishLog.logPublication(obj.event.id, relayUrl, true, undefined, obj.objectType, obj.id)
|
||||||
|
} else {
|
||||||
|
const errorMessage = status.error ?? 'Unknown error'
|
||||||
|
console.warn(`[PublishWorker] Relay ${relayUrl} failed for ${obj.objectType}:${obj.id}:`, errorMessage)
|
||||||
|
relaySessionManager.markRelayFailed(relayUrl)
|
||||||
|
// Log failed publication
|
||||||
|
void publishLog.logPublication(obj.event.id, relayUrl, false, errorMessage, obj.objectType, obj.id)
|
||||||
|
}
|
||||||
|
})
|
||||||
|
|
||||||
|
// Update published status via writeService
|
||||||
|
if (successfulRelays.length > 0) {
|
||||||
|
await writeService.updatePublished(obj.objectType, obj.id, successfulRelays)
|
||||||
|
console.warn(`[PublishWorker] Successfully published ${obj.objectType}:${obj.id} to ${successfulRelays.length} relay(s)`)
|
||||||
|
// Remove from unpublished map
|
||||||
|
this.unpublishedObjects.delete(`${obj.objectType}:${obj.id}`)
|
||||||
|
} else {
|
||||||
|
// All relays failed, increment retry count
|
||||||
|
obj.retryCount++
|
||||||
|
obj.lastRetryAt = Date.now()
|
||||||
|
console.warn(`[PublishWorker] All relays failed for ${obj.objectType}:${obj.id}, retry count: ${obj.retryCount}/${MAX_RETRIES_PER_OBJECT}`)
|
||||||
|
|
||||||
|
// Remove if max retries reached
|
||||||
|
if (obj.retryCount >= MAX_RETRIES_PER_OBJECT) {
|
||||||
|
this.unpublishedObjects.delete(`${obj.objectType}:${obj.id}`)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
} catch (error) {
|
||||||
|
console.error(`[PublishWorker] Error publishing ${obj.objectType}:${obj.id}:`, error)
|
||||||
|
// Increment retry count on error
|
||||||
|
obj.retryCount++
|
||||||
|
obj.lastRetryAt = Date.now()
|
||||||
|
|
||||||
|
if (obj.retryCount >= MAX_RETRIES_PER_OBJECT) {
|
||||||
|
this.unpublishedObjects.delete(`${obj.objectType}:${obj.id}`)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
export const publishWorker = new PublishWorkerService()
|
||||||
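The retry gating that `PublishWorkerService.processUnpublished` applies (skip an object retried too recently, drop it after too many attempts) can be sketched standalone. The `shouldRetry` helper name and the constant values below are illustrative assumptions; the real `RETRY_DELAY_MS` and `MAX_RETRIES_PER_OBJECT` are defined elsewhere in this file and are not shown in this diff.

```typescript
// Assumed values for illustration only; the real constants live in publishWorker's module.
const RETRY_DELAY_MS = 30_000
const MAX_RETRIES_PER_OBJECT = 5

interface RetryState { retryCount: number; lastRetryAt: number }

// Hypothetical helper mirroring the two skip conditions in processUnpublished.
function shouldRetry(state: RetryState | undefined, now: number): boolean {
  if (!state) return true                                        // never attempted: publish immediately
  if (now - state.lastRetryAt < RETRY_DELAY_MS) return false     // retried too recently
  if (state.retryCount >= MAX_RETRIES_PER_OBJECT) return false   // retry budget exhausted
  return true
}

console.log(shouldRetry(undefined, 0))                               // true
console.log(shouldRetry({ retryCount: 1, lastRetryAt: 0 }, 10_000))  // false
console.log(shouldRetry({ retryCount: 1, lastRetryAt: 0 }, 60_000))  // true
console.log(shouldRetry({ retryCount: 5, lastRetryAt: 0 }, 60_000))  // false
```

This keeps the decision pure and testable, while the worker itself owns the clock and the map of retry states.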
@@ -1,200 +1,47 @@
-import type { Event } from 'nostr-tools'
-import { nostrService } from './nostr'
 import type { Purchase } from '@/types/nostr'
-import { parsePurchaseFromEvent } from './nostrEventParsing'
 import { objectCache } from './objectCache'
-import { getPrimaryRelaySync } from './config'
-import { MIN_EVENT_DATE } from './platformConfig'
 import { parseObjectId } from './urlGenerator'
 
-function buildPurchaseFilters(articleId?: string, payerPubkey?: string, authorPubkey?: string): Array<{
-  kinds: number[]
-  since?: number
-  authors?: string[]
-  '#p'?: string[]
-  '#e'?: string[]
-  '#kind_type'?: string[]
-}> {
-  const filters: Array<{
-    kinds: number[]
-    since?: number
-    authors?: string[]
-    '#p'?: string[]
-    '#e'?: string[]
-    '#kind_type'?: string[]
-  }> = []
-
-  const baseFilter: {
-    kinds: number[]
-    since: number
-    '#kind_type': string[]
-    authors?: string[]
-    '#p'?: string[]
-    '#e'?: string[]
-  } = {
-    kinds: [9735], // Zap receipt
-    since: MIN_EVENT_DATE,
-    '#kind_type': ['purchase'],
-  }
-
-  if (payerPubkey) {
-    baseFilter.authors = [payerPubkey]
-  }
-
-  if (authorPubkey) {
-    baseFilter['#p'] = [authorPubkey]
-  }
-
-  if (articleId) {
-    baseFilter['#e'] = [articleId]
-  }
-
-  filters.push(baseFilter)
-  return filters
-}
-
-export async function getPurchaseById(purchaseId: string, timeoutMs: number = 5000): Promise<Purchase | null> {
+export async function getPurchaseById(purchaseId: string, _timeoutMs: number = 5000): Promise<Purchase | null> {
   const parsed = parseObjectId(purchaseId)
   const hash = parsed.hash ?? purchaseId
 
-  // Check cache first
+  // Read only from IndexedDB cache
   const cached = await objectCache.get('purchase', hash)
   if (cached) {
     return cached as Purchase
   }
 
-  const pool = nostrService.getPool()
-  if (!pool) {
-    throw new Error('Pool not initialized')
-  }
-
-  const filters = buildPurchaseFilters()
-  const { createSubscription } = require('@/types/nostr-tools-extended')
-
-  return new Promise<Purchase | null>((resolve) => {
-    const relayUrl = getPrimaryRelaySync()
-    const sub = createSubscription(pool, [relayUrl], filters)
-    let finished = false
-
-    const done = async (value: Purchase | null): Promise<void> => {
-      if (finished) {
-        return
-      }
-      finished = true
-      sub.unsub()
-      resolve(value)
-    }
-
-    sub.on('event', (event: Event): void => {
-      void (async (): Promise<void> => {
-        const purchaseParsed = await parsePurchaseFromEvent(event)
-        if (purchaseParsed?.id === purchaseId) {
-          // Cache the parsed purchase
-          if (purchaseParsed.hash) {
-            await objectCache.set('purchase', purchaseParsed.hash, event, purchaseParsed, 0, false, purchaseParsed.index)
-          }
-          done(purchaseParsed)
-        }
-      })()
-    })
-
-    sub.on('eose', (): void => {
-      void done(null)
-    })
-    setTimeout((): void => {
-      void done(null)
-    }, timeoutMs).unref?.()
-  })
+  // Also try by ID if hash lookup failed
+  const cachedById = await objectCache.getById('purchase', purchaseId)
+  if (cachedById) {
+    return cachedById as Purchase
+  }
+
+  // Not found in cache - return null (no network request)
+  return null
 }
 
-export function getPurchasesForArticle(articleId: string, timeoutMs: number = 5000): Promise<Purchase[]> {
-  const pool = nostrService.getPool()
-  if (!pool) {
-    throw new Error('Pool not initialized')
-  }
-
-  const filters = buildPurchaseFilters(articleId)
-  const { createSubscription } = require('@/types/nostr-tools-extended')
-
-  return new Promise<Purchase[]>((resolve) => {
-    const results: Purchase[] = []
-    const relayUrl = getPrimaryRelaySync()
-    const sub = createSubscription(pool, [relayUrl], filters)
-    let finished = false
-
-    const done = async (): Promise<void> => {
-      if (finished) {
-        return
-      }
-      finished = true
-      sub.unsub()
-      resolve(results)
-    }
-
-    sub.on('event', (event: Event): void => {
-      void (async (): Promise<void> => {
-        const purchaseParsed = await parsePurchaseFromEvent(event)
-        if (purchaseParsed?.articleId === articleId) {
-          // Cache the parsed purchase
-          if (purchaseParsed.hash) {
-            await objectCache.set('purchase', purchaseParsed.hash, event, purchaseParsed, 0, false, purchaseParsed.index)
-          }
-          results.push(purchaseParsed)
-        }
-      })()
-    })
-
-    sub.on('eose', (): void => {
-      void done()
-    })
-    setTimeout((): void => {
-      void done()
-    }, timeoutMs).unref?.()
-  })
+export async function getPurchasesForArticle(articleId: string, _timeoutMs: number = 5000): Promise<Purchase[]> {
+  // Read only from IndexedDB cache
+  const allPurchases = await objectCache.getAll('purchase')
+  const purchases = allPurchases as Purchase[]
+
+  // Filter by articleId
+  const articlePurchases = purchases.filter((purchase) => purchase.articleId === articleId)
+
+  // Sort by creation date descending
+  return articlePurchases.sort((a, b) => b.createdAt - a.createdAt)
 }
 
-export function getPurchasesByPayer(payerPubkey: string, timeoutMs: number = 5000): Promise<Purchase[]> {
-  const pool = nostrService.getPool()
-  if (!pool) {
-    throw new Error('Pool not initialized')
-  }
-
-  const filters = buildPurchaseFilters(undefined, payerPubkey)
-  const { createSubscription } = require('@/types/nostr-tools-extended')
-
-  return new Promise<Purchase[]>((resolve) => {
-    const results: Purchase[] = []
-    const relayUrl = getPrimaryRelaySync()
-    const sub = createSubscription(pool, [relayUrl], filters)
-    let finished = false
-
-    const done = async (): Promise<void> => {
-      if (finished) {
-        return
-      }
-      finished = true
-      sub.unsub()
-      resolve(results)
-    }
-
-    sub.on('event', (event: Event): void => {
-      void (async (): Promise<void> => {
-        const purchaseParsed = await parsePurchaseFromEvent(event)
-        if (purchaseParsed) {
-          // Cache the parsed purchase
-          if (purchaseParsed.hash) {
-            await objectCache.set('purchase', purchaseParsed.hash, event, purchaseParsed, 0, false, purchaseParsed.index ?? 0)
-          }
-          results.push(purchaseParsed)
-        }
-      })()
-    })
-
-    sub.on('eose', (): void => {
-      void done()
-    })
-    setTimeout((): void => {
-      void done()
-    }, timeoutMs).unref?.()
-  })
+export async function getPurchasesByPayer(payerPubkey: string, _timeoutMs: number = 5000): Promise<Purchase[]> {
+  // Read only from IndexedDB cache
+  const allPurchases = await objectCache.getAll('purchase')
+  const purchases = allPurchases as Purchase[]
+
+  // Filter by payerPubkey
+  const payerPurchases = purchases.filter((purchase) => purchase.payerPubkey === payerPubkey)
+
+  // Sort by creation date descending
+  return payerPurchases.sort((a, b) => b.createdAt - a.createdAt)
 }
@@ -1,207 +1,47 @@
-import type { Event } from 'nostr-tools'
-import { nostrService } from './nostr'
 import type { ReviewTip } from '@/types/nostr'
-import { parseReviewTipFromEvent } from './nostrEventParsing'
 import { objectCache } from './objectCache'
-import { getPrimaryRelaySync } from './config'
-import { MIN_EVENT_DATE } from './platformConfig'
 import { parseObjectId } from './urlGenerator'
 
-function buildReviewTipFilters(articleId?: string, reviewId?: string, authorPubkey?: string, reviewerPubkey?: string): Array<{
-  kinds: number[]
-  since?: number
-  '#p'?: string[]
-  '#e'?: string[]
-  '#review_id'?: string[]
-  '#reviewer'?: string[]
-  '#kind_type'?: string[]
-}> {
-  const filters: Array<{
-    kinds: number[]
-    since?: number
-    '#p'?: string[]
-    '#e'?: string[]
-    '#review_id'?: string[]
-    '#reviewer'?: string[]
-    '#kind_type'?: string[]
-  }> = []
-
-  const baseFilter: {
-    kinds: number[]
-    since: number
-    '#kind_type': string[]
-    '#p'?: string[]
-    '#e'?: string[]
-    '#review_id'?: string[]
-    '#reviewer'?: string[]
-  } = {
-    kinds: [9735], // Zap receipt
-    since: MIN_EVENT_DATE,
-    '#kind_type': ['review_tip'],
-  }
-
-  if (authorPubkey) {
-    baseFilter['#p'] = [authorPubkey]
-  }
-
-  if (articleId) {
-    baseFilter['#e'] = [articleId]
-  }
-
-  if (reviewId) {
-    baseFilter['#review_id'] = [reviewId]
-  }
-
-  if (reviewerPubkey) {
-    baseFilter['#reviewer'] = [reviewerPubkey]
-  }
-
-  filters.push(baseFilter)
-  return filters
-}
-
-export async function getReviewTipById(reviewTipId: string, timeoutMs: number = 5000): Promise<ReviewTip | null> {
+export async function getReviewTipById(reviewTipId: string, _timeoutMs: number = 5000): Promise<ReviewTip | null> {
   const parsed = parseObjectId(reviewTipId)
   const hash = parsed.hash ?? reviewTipId
 
-  // Check cache first
+  // Read only from IndexedDB cache
   const cached = await objectCache.get('review_tip', hash)
   if (cached) {
     return cached as ReviewTip
   }
 
-  const pool = nostrService.getPool()
-  if (!pool) {
-    throw new Error('Pool not initialized')
-  }
-
-  const filters = buildReviewTipFilters()
-  const { createSubscription } = require('@/types/nostr-tools-extended')
-
-  return new Promise<ReviewTip | null>((resolve) => {
-    const relayUrl = getPrimaryRelaySync()
-    const sub = createSubscription(pool, [relayUrl], filters)
-    let finished = false
-
-    const done = async (value: ReviewTip | null): Promise<void> => {
-      if (finished) {
-        return
-      }
-      finished = true
-      sub.unsub()
-      resolve(value)
-    }
-
-    sub.on('event', (event: Event): void => {
-      void (async (): Promise<void> => {
-        const reviewTipParsed = await parseReviewTipFromEvent(event)
-        if (reviewTipParsed?.id === reviewTipId) {
-          // Cache the parsed review tip
-          if (reviewTipParsed.hash) {
-            await objectCache.set('review_tip', reviewTipParsed.hash, event, reviewTipParsed, 0, false, reviewTipParsed.index ?? 0)
-          }
-          done(reviewTipParsed)
-        }
-      })()
-    })
-
-    sub.on('eose', (): void => {
-      void done(null)
-    })
-    setTimeout((): void => {
-      void done(null)
-    }, timeoutMs).unref?.()
-  })
+  // Also try by ID if hash lookup failed
+  const cachedById = await objectCache.getById('review_tip', reviewTipId)
+  if (cachedById) {
+    return cachedById as ReviewTip
+  }
+
+  // Not found in cache - return null (no network request)
+  return null
 }
 
-export function getReviewTipsForArticle(articleId: string, timeoutMs: number = 5000): Promise<ReviewTip[]> {
-  const pool = nostrService.getPool()
-  if (!pool) {
-    throw new Error('Pool not initialized')
-  }
-
-  const filters = buildReviewTipFilters(articleId)
-  const { createSubscription } = require('@/types/nostr-tools-extended')
-
-  return new Promise<ReviewTip[]>((resolve) => {
-    const results: ReviewTip[] = []
-    const relayUrl = getPrimaryRelaySync()
-    const sub = createSubscription(pool, [relayUrl], filters)
-    let finished = false
-
-    const done = async (): Promise<void> => {
-      if (finished) {
-        return
-      }
-      finished = true
-      sub.unsub()
-      resolve(results)
-    }
-
-    sub.on('event', (event: Event): void => {
-      void (async (): Promise<void> => {
-        const reviewTipParsed = await parseReviewTipFromEvent(event)
-        if (reviewTipParsed?.articleId === articleId) {
-          // Cache the parsed review tip
-          if (reviewTipParsed.hash) {
-            await objectCache.set('review_tip', reviewTipParsed.hash, event, reviewTipParsed, 0, false, reviewTipParsed.index ?? 0)
-          }
-          results.push(reviewTipParsed)
-        }
-      })()
-    })
-
-    sub.on('eose', (): void => {
-      void done()
-    })
-    setTimeout((): void => {
-      void done()
-    }, timeoutMs).unref?.()
-  })
+export async function getReviewTipsForArticle(articleId: string, _timeoutMs: number = 5000): Promise<ReviewTip[]> {
+  // Read only from IndexedDB cache
+  const allReviewTips = await objectCache.getAll('review_tip')
+  const reviewTips = allReviewTips as ReviewTip[]
+
+  // Filter by articleId
+  const articleReviewTips = reviewTips.filter((tip) => tip.articleId === articleId)
+
+  // Sort by creation date descending
+  return articleReviewTips.sort((a, b) => b.createdAt - a.createdAt)
 }
 
-export function getReviewTipsForReview(reviewId: string, timeoutMs: number = 5000): Promise<ReviewTip[]> {
-  const pool = nostrService.getPool()
-  if (!pool) {
-    throw new Error('Pool not initialized')
-  }
-
-  const filters = buildReviewTipFilters(undefined, reviewId)
-  const { createSubscription } = require('@/types/nostr-tools-extended')
-
-  return new Promise<ReviewTip[]>((resolve) => {
-    const results: ReviewTip[] = []
-    const relayUrl = getPrimaryRelaySync()
-    const sub = createSubscription(pool, [relayUrl], filters)
-    let finished = false
-
-    const done = async (): Promise<void> => {
-      if (finished) {
-        return
-      }
-      finished = true
-      sub.unsub()
-      resolve(results)
-    }
-
-    sub.on('event', (event: Event): void => {
-      void (async (): Promise<void> => {
-        const reviewTipParsed = await parseReviewTipFromEvent(event)
-        if (reviewTipParsed?.reviewId === reviewId) {
-          // Cache the parsed review tip
-          if (reviewTipParsed.hash) {
-            await objectCache.set('review_tip', reviewTipParsed.hash, event, reviewTipParsed, 0, false, reviewTipParsed.index ?? 0)
-          }
-          results.push(reviewTipParsed)
-        }
-      })()
-    })
-
-    sub.on('eose', (): void => {
-      void done()
-    })
-    setTimeout((): void => {
-      void done()
-    }, timeoutMs).unref?.()
-  })
+export async function getReviewTipsForReview(reviewId: string, _timeoutMs: number = 5000): Promise<ReviewTip[]> {
+  // Read only from IndexedDB cache
+  const allReviewTips = await objectCache.getAll('review_tip')
+  const reviewTips = allReviewTips as ReviewTip[]
+
+  // Filter by reviewId
+  const reviewReviewTips = reviewTips.filter((tip) => tip.reviewId === reviewId)
+
+  // Sort by creation date descending
+  return reviewReviewTips.sort((a, b) => b.createdAt - a.createdAt)
 }
@@ -1,76 +1,14 @@
-import type { Event } from 'nostr-tools'
-import { nostrService } from './nostr'
 import type { Review } from '@/types/nostr'
-import { parseReviewFromEvent } from './nostrEventParsing'
-import { buildTagFilter } from './nostrTagSystem'
-import { getPrimaryRelaySync } from './config'
-import { PLATFORM_SERVICE, MIN_EVENT_DATE } from './platformConfig'
+import { objectCache } from './objectCache'
 
-function buildReviewFilters(articleId: string): Array<{
-  kinds: number[]
-  '#quote'?: string[]
-  '#article'?: string[]
-  since?: number
-}> {
-  const tagFilter = buildTagFilter({
-    type: 'quote',
-    articleId,
-    service: PLATFORM_SERVICE,
-  })
-
-  const filterObj: {
-    kinds: number[]
-    '#quote'?: string[]
-    '#article'?: string[]
-    since?: number
-  } = {
-    kinds: Array.isArray(tagFilter.kinds) ? tagFilter.kinds as number[] : [1],
-    since: MIN_EVENT_DATE,
-  }
-  if (tagFilter['#quote']) {
-    filterObj['#quote'] = tagFilter['#quote'] as string[]
-  }
-  if (tagFilter['#article']) {
-    filterObj['#article'] = tagFilter['#article'] as string[]
-  }
-  return [filterObj]
-}
-
-export function getReviewsForArticle(articleId: string, timeoutMs: number = 5000): Promise<Review[]> {
-  const pool = nostrService.getPool()
-  if (!pool) {
-    throw new Error('Pool not initialized')
-  }
-  const filters = buildReviewFilters(articleId)
-  const { createSubscription } = require('@/types/nostr-tools-extended')
-
-  return new Promise<Review[]>((resolve) => {
-    const results: Review[] = []
-    const relayUrl = getPrimaryRelaySync()
-    const sub = createSubscription(pool, [relayUrl], filters)
-    let finished = false
-
-    const done = (): void => {
-      if (finished) {
-        return
-      }
-      finished = true
-      sub.unsub()
-      resolve(results)
-    }
-
-    sub.on('event', (event: Event): void => {
-      void (async (): Promise<void> => {
-        const parsed = await parseReviewFromEvent(event)
-        if (parsed) {
-          results.push(parsed)
-        }
-      })()
-    })
-
-    sub.on('eose', (): void => {
-      done()
-    })
-    setTimeout(() => done(), timeoutMs).unref?.()
-  })
+export async function getReviewsForArticle(articleId: string, _timeoutMs: number = 5000): Promise<Review[]> {
+  // Read only from IndexedDB cache
+  const allReviews = await objectCache.getAll('review')
+  const reviews = allReviews as Review[]
+
+  // Filter by articleId
+  const articleReviews = reviews.filter((review) => review.articleId === articleId)
+
+  // Sort by creation date descending
+  return articleReviews.sort((a, b) => b.createdAt - a.createdAt)
 }
@ -1,152 +1,43 @@
|
|||||||
import type { Event } from 'nostr-tools'
|
|
||||||
import { nostrService } from './nostr'
|
|
||||||
import type { Series } from '@/types/nostr'
|
import type { Series } from '@/types/nostr'
|
||||||
import { parseSeriesFromEvent } from './nostrEventParsing'
|
|
||||||
import { buildTagFilter, extractTagsFromEvent } from './nostrTagSystem'
|
|
||||||
import { getPrimaryRelaySync } from './config'
|
|
||||||
import { PLATFORM_SERVICE, MIN_EVENT_DATE } from './platformConfig'
|
|
||||||
import { objectCache } from './objectCache'
|
import { objectCache } from './objectCache'
|
||||||
import { parseObjectId } from './urlGenerator'
|
import { parseObjectId } from './urlGenerator'
|
||||||
|
|
||||||
function buildSeriesFilters(authorPubkey: string): Array<{
|
export async function getSeriesByAuthor(authorPubkey: string, _timeoutMs: number = 5000): Promise<Series[]> {
|
||||||
kinds: number[]
|
// Read only from IndexedDB cache
|
||||||
authors?: string[]
|
const allSeries = await objectCache.getAll('series')
|
||||||
'#series'?: string[]
|
const series = allSeries as Series[]
|
||||||
since: number
|
|
||||||
}> {
|
|
||||||
const tagFilter = buildTagFilter({
|
|
||||||
type: 'series',
|
|
||||||
authorPubkey,
|
|
||||||
service: PLATFORM_SERVICE,
|
|
||||||
})
|
|
||||||
|
|
||||||
return [
|
// Filter by author pubkey
|
||||||
{
|
const authorSeries = series.filter((s) => s.pubkey === authorPubkey)
|
||||||
kinds: tagFilter.kinds as number[],
|
|
||||||
...(tagFilter.authors ? { authors: tagFilter.authors as string[] } : {}),
|
|
||||||
...(tagFilter['#series'] ? { '#series': tagFilter['#series'] as string[] } : {}),
|
|
||||||
since: MIN_EVENT_DATE,
|
|
||||||
},
|
|
||||||
]
|
|
||||||
}
|
|
||||||
|
|
||||||
export function getSeriesByAuthor(authorPubkey: string, timeoutMs: number = 5000): Promise<Series[]> {
|
// Sort by hash (newest first - hash contains timestamp information)
|
||||||
const pool = nostrService.getPool()
|
// Since Series doesn't have createdAt, we sort by id which contains version/index
|
||||||
if (!pool) {
|
return authorSeries.sort((a, b) => {
|
||||||
throw new Error('Pool not initialized')
|
// Sort by version descending, then by index descending
|
||||||
}
|
if (b.version !== a.version) {
|
||||||
const filters = buildSeriesFilters(authorPubkey)
|
return b.version - a.version
|
||||||
const { createSubscription } = require('@/types/nostr-tools-extended')
|
|
||||||
|
|
||||||
return new Promise<Series[]>((resolve) => {
|
|
||||||
const results: Series[] = []
|
|
||||||
const relayUrl = getPrimaryRelaySync()
|
|
||||||
const sub = createSubscription(pool, [relayUrl], filters)
|
|
||||||
let finished = false
|
|
||||||
|
|
||||||
const done = (): void => {
|
|
||||||
if (finished) {
|
|
||||||
return
|
|
||||||
}
|
|
||||||
finished = true
|
|
||||||
sub.unsub()
|
|
||||||
resolve(results)
|
|
||||||
}
|
}
|
||||||
|
return b.index - a.index
|
||||||
sub.on('event', (event: Event): void => {
|
|
||||||
void (async (): Promise<void> => {
|
|
||||||
const parsed = await parseSeriesFromEvent(event)
|
|
||||||
if (parsed) {
|
|
||||||
results.push(parsed)
|
|
||||||
}
|
|
||||||
})()
|
|
||||||
})
|
|
||||||
|
|
||||||
sub.on('eose', (): void => {
|
|
||||||
done()
|
|
||||||
})
|
|
||||||
setTimeout(() => done(), timeoutMs).unref?.()
|
|
||||||
})
|
})
|
||||||
}
|
}
|
||||||
|
|
||||||
function buildSeriesByIdFilters(seriesId: string): Array<{
|
export async function getSeriesById(seriesId: string, _timeoutMs: number = 2000): Promise<Series | null> {
|
||||||
kinds: number[]
|
|
||||||
ids: string[]
|
|
||||||
since: number
|
|
||||||
[key: string]: unknown
|
|
||||||
}> {
|
|
||||||
return [
|
|
||||||
{
|
|
||||||
kinds: [1],
|
|
||||||
ids: [seriesId],
|
|
||||||
...buildTagFilter({
|
|
||||||
type: 'series',
|
|
||||||
service: PLATFORM_SERVICE,
|
|
||||||
}),
|
|
||||||
since: MIN_EVENT_DATE,
|
|
||||||
},
|
|
||||||
]
|
|
||||||
}
|
|
||||||
|
|
-export async function getSeriesById(seriesId: string, timeoutMs: number = 5000): Promise<Series | null> {
+export async function getSeriesById(seriesId: string, _timeoutMs: number = 5000): Promise<Series | null> {
   // Try to parse seriesId as id format (<hash>_<index>_<version>) or use it as hash
   const parsed = parseObjectId(seriesId)
   const hash = parsed.hash ?? seriesId

-  // Check cache first
+  // Read only from IndexedDB cache
   const cached = await objectCache.get('series', hash)
   if (cached) {
     return cached as Series
   }

-  const pool = nostrService.getPool()
-  if (!pool) {
-    throw new Error('Pool not initialized')
-  }
+  // Also try by ID if hash lookup failed
+  const cachedById = await objectCache.getById('series', seriesId)
+  if (cachedById) {
+    return cachedById as Series
+  }

-  const filters = buildSeriesByIdFilters(seriesId)
-  const { createSubscription } = require('@/types/nostr-tools-extended')
-
-  return new Promise<Series | null>((resolve) => {
-    const relayUrl = getPrimaryRelaySync()
-    const sub = createSubscription(pool, [relayUrl], filters)
-    let finished = false
-
-    const done = async (value: Series | null): Promise<void> => {
-      if (finished) {
-        return
-      }
-      finished = true
-      sub.unsub()
-
-      // Cache the result if found
-      if (value && value.hash) {
-        // Find the event to cache it
-        // Note: We would need the event here, but for now we cache what we have
-        // The event will be cached when it's parsed from the subscription
-      }
-
-      resolve(value)
-    }
-
-    sub.on('event', (event: Event): void => {
-      void (async (): Promise<void> => {
-        const parsed = await parseSeriesFromEvent(event)
-        if (parsed) {
-          // Cache the parsed series
-          const tags = extractTagsFromEvent(event)
-          if (parsed.hash) {
-            await objectCache.set('series', parsed.hash, event, parsed, tags.version ?? 0, tags.hidden ?? false, parsed.index)
-          }
-          await done(parsed)
-        }
-      })()
-    })
-
-    sub.on('eose', (): void => {
-      void done(null)
-    })
-    setTimeout((): void => {
-      void done(null)
-    }, timeoutMs).unref?.()
-  })
+  // Not found in cache - return null (no network request)
+  return null
 }
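The rewritten `getSeriesById` above resolves entirely from the local cache: look up by hash, fall back to the full object id, and return `null` instead of opening a relay subscription. A minimal sketch of that lookup order, with synchronous `Map`s standing in for the async IndexedDB-backed `objectCache` (the `Series` shape and `parseHash` helper are simplified stand-ins, not the project's real types):

```typescript
// Cache-only lookup order: hash key first, then full id, then null (no network).
interface Series { id: string; hash: string; title: string }

const byHash = new Map<string, Series>()
const byId = new Map<string, Series>()

function parseHash(objectId: string): string {
  // Real code: parseObjectId(objectId).hash ?? objectId (format <hash>_<index>_<version>)
  const [head] = objectId.split('_')
  return head ?? objectId
}

function getSeriesByIdSketch(seriesId: string): Series | null {
  const hash = parseHash(seriesId)
  const cached = byHash.get(hash)
  if (cached) return cached
  // Fall back to a lookup by the full id if the hash key misses
  const cachedById = byId.get(seriesId)
  if (cachedById) return cachedById
  // Not found in cache: return null rather than querying a relay
  return null
}

const s: Series = { id: 'abc_0_1', hash: 'abc', title: 'demo' }
byHash.set(s.hash, s)
```

This is what makes the UI read path synchronous-feeling and offline-safe: misses are cheap and never block on the network.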
@@ -1,161 +1,35 @@
-import type { Event } from 'nostr-tools'
-import { nostrService } from './nostr'
 import type { Sponsoring } from '@/types/nostr'
-import { parseSponsoringFromEvent } from './nostrEventParsing'
 import { objectCache } from './objectCache'
-import { getPrimaryRelaySync } from './config'
-import { MIN_EVENT_DATE } from './platformConfig'
 import { parseObjectId } from './urlGenerator'

-function buildSponsoringFilters(authorPubkey?: string, payerPubkey?: string, seriesId?: string, articleId?: string): Array<{
-  kinds: number[]
-  since?: number
-  authors?: string[]
-  '#p'?: string[]
-  '#series'?: string[]
-  '#article'?: string[]
-  '#kind_type'?: string[]
-}> {
-  const filters: Array<{
-    kinds: number[]
-    since?: number
-    authors?: string[]
-    '#p'?: string[]
-    '#series'?: string[]
-    '#article'?: string[]
-    '#kind_type'?: string[]
-  }> = []
-
-  const baseFilter: {
-    kinds: number[]
-    since: number
-    '#kind_type': string[]
-    authors?: string[]
-    '#p'?: string[]
-    '#series'?: string[]
-    '#article'?: string[]
-  } = {
-    kinds: [9735], // Zap receipt
-    since: MIN_EVENT_DATE,
-    '#kind_type': ['sponsoring'],
-  }
-
-  if (payerPubkey) {
-    baseFilter.authors = [payerPubkey]
-  }
-
-  if (authorPubkey) {
-    baseFilter['#p'] = [authorPubkey]
-  }
-
-  if (seriesId) {
-    baseFilter['#series'] = [seriesId]
-  }
-
-  if (articleId) {
-    baseFilter['#article'] = [articleId]
-  }
-
-  filters.push(baseFilter)
-  return filters
-}
-
-export async function getSponsoringById(sponsoringId: string, timeoutMs: number = 5000): Promise<Sponsoring | null> {
+export async function getSponsoringById(sponsoringId: string, _timeoutMs: number = 5000): Promise<Sponsoring | null> {
   const parsed = parseObjectId(sponsoringId)
   const hash = parsed.hash ?? sponsoringId

-  // Check cache first
+  // Read only from IndexedDB cache
   const cached = await objectCache.get('sponsoring', hash)
   if (cached) {
     return cached as Sponsoring
   }

-  const pool = nostrService.getPool()
-  if (!pool) {
-    throw new Error('Pool not initialized')
-  }
-
-  const filters = buildSponsoringFilters()
-  const { createSubscription } = require('@/types/nostr-tools-extended')
-
-  return new Promise<Sponsoring | null>((resolve) => {
-    const relayUrl = getPrimaryRelaySync()
-    const sub = createSubscription(pool, [relayUrl], filters)
-    let finished = false
-
-    const done = async (value: Sponsoring | null): Promise<void> => {
-      if (finished) {
-        return
-      }
-      finished = true
-      sub.unsub()
-      resolve(value)
-    }
-
-    sub.on('event', (event: Event): void => {
-      void (async (): Promise<void> => {
-        const sponsoringParsed = await parseSponsoringFromEvent(event)
-        if (sponsoringParsed?.id === sponsoringId) {
-          // Cache the parsed sponsoring
-          if (sponsoringParsed.hash) {
-            await objectCache.set('sponsoring', sponsoringParsed.hash, event, sponsoringParsed, 0, false, sponsoringParsed.index ?? 0)
-          }
-          done(sponsoringParsed)
-        }
-      })()
-    })
-
-    sub.on('eose', (): void => {
-      void done(null)
-    })
-    setTimeout((): void => {
-      void done(null)
-    }, timeoutMs).unref?.()
-  })
+  // Also try by ID if hash lookup failed
+  const cachedById = await objectCache.getById('sponsoring', sponsoringId)
+  if (cachedById) {
+    return cachedById as Sponsoring
+  }
+
+  // Not found in cache - return null (no network request)
+  return null
 }
-export function getSponsoringByAuthor(authorPubkey: string, timeoutMs: number = 5000): Promise<Sponsoring[]> {
-  const pool = nostrService.getPool()
-  if (!pool) {
-    throw new Error('Pool not initialized')
-  }
-
-  const filters = buildSponsoringFilters(authorPubkey)
-  const { createSubscription } = require('@/types/nostr-tools-extended')
-
-  return new Promise<Sponsoring[]>((resolve) => {
-    const results: Sponsoring[] = []
-    const relayUrl = getPrimaryRelaySync()
-    const sub = createSubscription(pool, [relayUrl], filters)
-    let finished = false
-
-    const done = async (): Promise<void> => {
-      if (finished) {
-        return
-      }
-      finished = true
-      sub.unsub()
-      resolve(results)
-    }
-
-    sub.on('event', (event: Event): void => {
-      void (async (): Promise<void> => {
-        const sponsoringParsed = await parseSponsoringFromEvent(event)
-        if (sponsoringParsed?.authorPubkey === authorPubkey) {
-          // Cache the parsed sponsoring
-          if (sponsoringParsed.hash) {
-            await objectCache.set('sponsoring', sponsoringParsed.hash, event, sponsoringParsed, 0, false, sponsoringParsed.index ?? 0)
-          }
-          results.push(sponsoringParsed)
-        }
-      })()
-    })
-
-    sub.on('eose', (): void => {
-      void done()
-    })
-    setTimeout((): void => {
-      void done()
-    }, timeoutMs).unref?.()
-  })
+export async function getSponsoringByAuthor(authorPubkey: string, _timeoutMs: number = 5000): Promise<Sponsoring[]> {
+  // Read only from IndexedDB cache
+  const allSponsoring = await objectCache.getAll('sponsoring')
+  const sponsoring = allSponsoring as Sponsoring[]
+
+  // Filter by authorPubkey
+  const authorSponsoring = sponsoring.filter((s) => s.authorPubkey === authorPubkey)
+
+  // Sort by creation date descending
+  return authorSponsoring.sort((a, b) => b.createdAt - a.createdAt)
 }
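The new `getSponsoringByAuthor` reduces the relay subscription to a simple read-filter-sort over the local cache. A self-contained sketch of that pattern, where a plain array stands in for `objectCache.getAll('sponsoring')` and `SponsoringLite` is a trimmed stand-in for the real `Sponsoring` type:

```typescript
// Read everything from the cache, keep one author's rows, newest first.
interface SponsoringLite { authorPubkey: string; createdAt: number }

function sponsoringByAuthor(all: SponsoringLite[], authorPubkey: string): SponsoringLite[] {
  return all
    .filter((s) => s.authorPubkey === authorPubkey) // keep this author only
    .sort((a, b) => b.createdAt - a.createdAt)      // creation date descending
}

const rows: SponsoringLite[] = [
  { authorPubkey: 'alice', createdAt: 10 },
  { authorPubkey: 'bob', createdAt: 30 },
  { authorPubkey: 'alice', createdAt: 20 },
]
```

Because `filter` returns a fresh array, the in-place `sort` never mutates the cached source list.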
@@ -70,38 +70,19 @@ export class SponsoringTrackingService {
   }

   private async publishSponsoringTrackingEvent(event: Event): Promise<void> {
-    const pool = nostrService.getPool()
-    if (!pool) {
-      throw new Error('Pool not initialized')
-    }
-
-    // Publish to all active relays (enabled and not marked inactive for this session)
+    // Publish to all active relays via websocketService (routes to Service Worker)
+    const { websocketService } = await import('./websocketService')
     const { relaySessionManager } = await import('./relaySessionManager')
     const activeRelays = await relaySessionManager.getActiveRelays()

     if (activeRelays.length === 0) {
       // Fallback to primary relay if no active relays
       const relayUrl = getPrimaryRelaySync()
-      const pubs = pool.publish([relayUrl], event)
-      await Promise.all(pubs)
+      await websocketService.publishEvent(event, [relayUrl])
     } else {
       // Publish to all active relays
       console.warn(`[SponsoringTracking] Publishing tracking event ${event.id} to ${activeRelays.length} active relay(s)`)
-      const pubs = pool.publish(activeRelays, event)
-
-      // Track failed relays and mark them inactive for the session
-      const results = await Promise.allSettled(pubs)
-      results.forEach((result, index) => {
-        const relayUrl = activeRelays[index]
-        if (!relayUrl) {
-          return
-        }
-        if (result.status === 'rejected') {
-          const error = result.reason
-          console.error(`[SponsoringTracking] Relay ${relayUrl} failed during publish:`, error)
-          relaySessionManager.markRelayFailed(relayUrl)
-        }
-      })
+      await websocketService.publishEvent(event, activeRelays)
     }
   }
 }
lib/swClient.ts (new file, 221 lines)
@@ -0,0 +1,221 @@
+/**
+ * Service Worker client - communication bridge between main thread and Service Worker
+ */
+
+interface SWMessage {
+  type: string
+  data?: unknown
+}
+
+interface SWResponse {
+  type: string
+  data?: unknown
+}
+
+class ServiceWorkerClient {
+  private registration: ServiceWorkerRegistration | null = null
+  private messageHandlers: Map<string, Array<(data: unknown) => void>> = new Map()
+  private readyPromise: Promise<ServiceWorkerRegistration> | null = null
+
+  /**
+   * Initialize and register the Service Worker
+   */
+  async register(): Promise<ServiceWorkerRegistration> {
+    if (this.readyPromise) {
+      return this.readyPromise
+    }
+
+    if (typeof window === 'undefined' || !('serviceWorker' in navigator)) {
+      throw new Error('Service Workers are not supported in this browser')
+    }
+
+    this.readyPromise = navigator.serviceWorker
+      .register('/sw.js', { scope: '/' })
+      .then((registration) => {
+        this.registration = registration
+        console.warn('[SWClient] Service Worker registered:', registration.scope)
+
+        // Listen for messages from Service Worker
+        navigator.serviceWorker.addEventListener('message', (event) => {
+          this.handleMessage(event.data)
+        })
+
+        // Handle updates
+        registration.addEventListener('updatefound', () => {
+          const newWorker = registration.installing
+          if (newWorker) {
+            newWorker.addEventListener('statechange', () => {
+              if (newWorker.state === 'installed' && navigator.serviceWorker.controller) {
+                console.warn('[SWClient] New Service Worker available, reloading...')
+                window.location.reload()
+              }
+            })
+          }
+        })
+
+        return registration
+      })
+      .catch((error) => {
+        console.error('[SWClient] Service Worker registration failed:', error)
+        throw error
+      })
+
+    return this.readyPromise
+  }
+
+  /**
+   * Unregister the Service Worker
+   */
+  async unregister(): Promise<boolean> {
+    if (!this.registration) {
+      return false
+    }
+
+    const result = await this.registration.unregister()
+    this.registration = null
+    this.readyPromise = null
+    return result
+  }
+
+  /**
+   * Send message to Service Worker
+   */
+  async sendMessage(message: SWMessage): Promise<void> {
+    if (!this.registration?.active) {
+      await this.register()
+    }
+
+    if (!this.registration?.active) {
+      throw new Error('Service Worker is not active')
+    }
+
+    this.registration.active.postMessage(message)
+  }
+
+  /**
+   * Send message and wait for response
+   */
+  async sendMessageWithResponse(message: SWMessage, timeout: number = 5000): Promise<SWResponse> {
+    if (!this.registration?.active) {
+      await this.register()
+    }
+
+    if (!this.registration?.active) {
+      throw new Error('Service Worker is not active')
+    }
+
+    return new Promise((resolve, reject) => {
+      const messageChannel = new MessageChannel()
+      const timeoutId = setTimeout(() => {
+        reject(new Error('Service Worker response timeout'))
+      }, timeout)
+
+      messageChannel.port1.onmessage = (event) => {
+        clearTimeout(timeoutId)
+        resolve(event.data)
+      }
+
+      this.registration!.active!.postMessage(message, [messageChannel.port2])
+    })
+  }
+
+  /**
+   * Handle messages from Service Worker
+   */
+  private handleMessage(message: SWResponse): void {
+    const handlers = this.messageHandlers.get(message.type)
+    if (handlers) {
+      handlers.forEach((handler) => {
+        try {
+          handler(message.data)
+        } catch (error) {
+          console.error(`[SWClient] Error handling message ${message.type}:`, error)
+        }
+      })
+    }
+  }
+
+  /**
+   * Register a message handler
+   */
+  onMessage(type: string, handler: (data: unknown) => void): () => void {
+    if (!this.messageHandlers.has(type)) {
+      this.messageHandlers.set(type, [])
+    }
+    this.messageHandlers.get(type)!.push(handler)
+
+    // Return unsubscribe function
+    return () => {
+      const handlers = this.messageHandlers.get(type)
+      if (handlers) {
+        const index = handlers.indexOf(handler)
+        if (index > -1) {
+          handlers.splice(index, 1)
+        }
+      }
+    }
+  }
+
+  /**
+   * Start platform sync
+   */
+  async startPlatformSync(): Promise<void> {
+    await this.sendMessage({ type: 'START_PLATFORM_SYNC' })
+  }
+
+  /**
+   * Stop platform sync
+   */
+  async stopPlatformSync(): Promise<void> {
+    await this.sendMessage({ type: 'STOP_PLATFORM_SYNC' })
+  }
+
+  /**
+   * Start user content sync
+   */
+  async startUserSync(userPubkey: string): Promise<void> {
+    await this.sendMessage({ type: 'START_USER_SYNC', data: { userPubkey } })
+  }
+
+  /**
+   * Start publish worker
+   */
+  async startPublishWorker(): Promise<void> {
+    await this.sendMessage({ type: 'START_PUBLISH_WORKER' })
+  }
+
+  /**
+   * Stop publish worker
+   */
+  async stopPublishWorker(): Promise<void> {
+    await this.sendMessage({ type: 'STOP_PUBLISH_WORKER' })
+  }
+
+  /**
+   * Start notification detector
+   */
+  async startNotificationDetector(userPubkey: string): Promise<void> {
+    await this.sendMessage({ type: 'START_NOTIFICATION_DETECTOR', data: { userPubkey } })
+  }
+
+  /**
+   * Stop notification detector
+   */
+  async stopNotificationDetector(): Promise<void> {
+    await this.sendMessage({ type: 'STOP_NOTIFICATION_DETECTOR' })
+  }
+
+  /**
+   * Check if Service Worker is ready
+   */
+  async isReady(): Promise<boolean> {
+    try {
+      await this.register()
+      return this.registration?.active !== null
+    } catch {
+      return false
+    }
+  }
+}
+
+export const swClient = new ServiceWorkerClient()
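The heart of `swClient` is its handler registry: `onMessage()` stores callbacks per message type and hands back an unsubscribe closure, and `handleMessage()` fans an incoming message out to every handler of its type while isolating handler errors. That registry is pure logic, so it can be sketched and exercised outside the browser (the `Handler` alias and message shape are simplified stand-ins for `SWMessage`/`SWResponse`):

```typescript
// Per-type handler registry with unsubscribe, as used by swClient.onMessage.
type Handler = (data: unknown) => void

const handlers = new Map<string, Handler[]>()

function onMessage(type: string, handler: Handler): () => void {
  if (!handlers.has(type)) handlers.set(type, [])
  handlers.get(type)!.push(handler)
  // Unsubscribe removes exactly this handler instance
  return () => {
    const list = handlers.get(type)
    if (!list) return
    const i = list.indexOf(handler)
    if (i > -1) list.splice(i, 1)
  }
}

function handleMessage(message: { type: string; data?: unknown }): void {
  for (const h of handlers.get(message.type) ?? []) {
    try {
      h(message.data)
    } catch {
      // A throwing handler must not break the other handlers (mirrors the real code)
    }
  }
}

const seen: unknown[] = []
const unsub = onMessage('SYNC_DONE', (d) => seen.push(d))
handleMessage({ type: 'SYNC_DONE', data: 1 })
unsub()
handleMessage({ type: 'SYNC_DONE', data: 2 }) // no handler left, message dropped
```

Returning the unsubscribe function from the registration call keeps teardown symmetric with setup, which is convenient in React effect cleanups.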
lib/swSyncHandler.ts (new file, 185 lines)
@@ -0,0 +1,185 @@
+/**
+ * Service Worker sync handler - executes sync operations in main thread
+ * Listens to requests from Service Worker and executes them
+ */
+
+import { platformSyncService } from './platformSync'
+import { syncUserContentToCache } from './userContentSync'
+import { publishWorker } from './publishWorker'
+import { nostrService } from './nostr'
+import { swClient } from './swClient'
+import type { Event } from 'nostr-tools'
+
+class ServiceWorkerSyncHandler {
+  private initialized = false
+
+  /**
+   * Initialize the sync handler
+   * Sets up message listeners from Service Worker
+   */
+  async initialize(): Promise<void> {
+    if (this.initialized) {
+      return
+    }
+
+    if (typeof window === 'undefined') {
+      return
+    }
+
+    try {
+      await swClient.register()
+
+      // Listen for sync requests from Service Worker
+      swClient.onMessage('SYNC_REQUEST', (data: unknown) => {
+        void (async (): Promise<void> => {
+          const syncData = data as { syncType: string; userPubkey?: string }
+          if (syncData.syncType === 'platform') {
+            await this.handlePlatformSyncRequest()
+          } else if (syncData.syncType === 'user' && syncData.userPubkey) {
+            await this.handleUserSyncRequest(syncData.userPubkey)
+          }
+        })()
+      })
+
+      // Listen for publish worker requests
+      swClient.onMessage('PUBLISH_WORKER_REQUEST', () => {
+        void (async (): Promise<void> => {
+          await this.handlePublishWorkerRequest()
+        })()
+      })
+
+      // Listen for publish requests
+      swClient.onMessage('PUBLISH_REQUEST', (data: unknown) => {
+        void (async (): Promise<void> => {
+          const publishData = data as { event: Event; relays: string[] }
+          await this.handlePublishRequest(publishData.event, publishData.relays)
+        })()
+      })
+
+      // Listen for notification detection requests
+      swClient.onMessage('NOTIFICATION_DETECT_REQUEST', (data: unknown) => {
+        void (async (): Promise<void> => {
+          const detectData = data as { userPubkey: string }
+          await this.handleNotificationDetectRequest(detectData.userPubkey)
+        })()
+      })
+
+      this.initialized = true
+      console.warn('[SWSyncHandler] Initialized')
+    } catch (error) {
+      console.error('[SWSyncHandler] Initialization failed:', error)
+    }
+  }
+
+  /**
+   * Handle platform sync request from Service Worker
+   */
+  private async handlePlatformSyncRequest(): Promise<void> {
+    try {
+      console.warn('[SWSyncHandler] Executing platform sync request')
+      await platformSyncService.startSync()
+    } catch (error) {
+      console.error('[SWSyncHandler] Error in platform sync:', error)
+    }
+  }
+
+  /**
+   * Handle user sync request from Service Worker
+   */
+  private async handleUserSyncRequest(userPubkey: string): Promise<void> {
+    try {
+      console.warn('[SWSyncHandler] Executing user sync request for:', userPubkey)
+      await syncUserContentToCache(userPubkey)
+    } catch (error) {
+      console.error('[SWSyncHandler] Error in user sync:', error)
+    }
+  }
+
+  /**
+   * Handle publish worker request from Service Worker
+   */
+  private async handlePublishWorkerRequest(): Promise<void> {
+    try {
+      console.warn('[SWSyncHandler] Executing publish worker request')
+      // Trigger publish worker processing
+      // The publishWorker will process unpublished objects
+      await publishWorker['processUnpublished']()
+    } catch (error) {
+      console.error('[SWSyncHandler] Error in publish worker:', error)
+    }
+  }
+
+  /**
+   * Handle notification detection request from Service Worker
+   * The Service Worker detects changes independently.
+   * This handler runs the detection in the main thread and creates notifications via writeService.
+   */
+  private async handleNotificationDetectRequest(userPubkey: string): Promise<void> {
+    try {
+      console.warn('[SWSyncHandler] Executing notification detection request for:', userPubkey)
+      const { notificationDetector } = await import('./notificationDetector')
+      const { writeService } = await import('./writeService')
+
+      // Scan IndexedDB to detect new events
+      await notificationDetector.scan()
+
+      // Notifications are created via notificationDetector, which uses notificationService,
+      // which now uses writeService to write through a Web Worker
+    } catch (error) {
+      console.error('[SWSyncHandler] Error in notification detection:', error)
+    }
+  }
+
+  /**
+   * Handle publish request from Service Worker
+   */
+  private async handlePublishRequest(event: Event, relays: string[]): Promise<void> {
+    try {
+      console.warn('[SWSyncHandler] Executing publish request for event:', event.id)
+      const pool = nostrService.getPool()
+      if (!pool) {
+        throw new Error('Pool not initialized')
+      }
+
+      // Publish to specified relays
+      const pubs = pool.publish(relays, event)
+      const results = await Promise.allSettled(pubs)
+
+      const { publishLog } = await import('./publishLog')
+      const successfulRelays: string[] = []
+
+      results.forEach((result, index) => {
+        const relayUrl = relays[index]
+        if (!relayUrl) {
+          return
+        }
+
+        if (result.status === 'fulfilled') {
+          successfulRelays.push(relayUrl)
+          // Log successful publication
+          void publishLog.logPublication(event.id, relayUrl, true, undefined)
+        } else {
+          const error = result.reason
+          const errorMessage = error instanceof Error ? error.message : String(error)
+          console.error(`[SWSyncHandler] Relay ${relayUrl} failed:`, error)
+          // Log failed publication
+          void publishLog.logPublication(event.id, relayUrl, false, errorMessage)
+        }
+      })
+
+      // Update published status in IndexedDB
+      // Access private method via type assertion
+      const nostrServiceAny = nostrService as unknown as {
+        updatePublishedStatus: (eventId: string, published: false | string[]) => Promise<void>
+      }
+      await nostrServiceAny.updatePublishedStatus(
+        event.id,
+        successfulRelays.length > 0 ? successfulRelays : false
+      )
+    } catch (error) {
+      console.error('[SWSyncHandler] Error in publish request:', error)
+    }
+  }
+}
+
+export const swSyncHandler = new ServiceWorkerSyncHandler()
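`handlePublishRequest` above derives the event's stored `published` status from per-relay results: the list of relays that accepted it, or `false` when none did. Because `Promise.allSettled` preserves input order, `results[i]` corresponds to `relays[i]`. A sketch of that partitioning step in isolation (the `publishedStatus` name is illustrative, not a project function; the settled results are constructed by hand so the sketch stays synchronous):

```typescript
// Turn per-relay publish outcomes into the published status stored in IndexedDB:
// string[] of acking relays, or false when every relay rejected.
function publishedStatus(
  relays: string[],
  results: PromiseSettledResult<unknown>[]
): false | string[] {
  const successful: string[] = []
  results.forEach((result, index) => {
    const relayUrl = relays[index]
    if (!relayUrl) return // defensive: ignore results with no matching relay
    if (result.status === 'fulfilled') successful.push(relayUrl)
  })
  return successful.length > 0 ? successful : false
}

const relays = ['wss://a', 'wss://b']
const results: PromiseSettledResult<unknown>[] = [
  { status: 'fulfilled', value: 'ok' },
  { status: 'rejected', reason: new Error('relay down') },
]
const status = publishedStatus(relays, results)
```

Storing the relay list (rather than a boolean) lets a later retry pass target only the relays that have not yet acknowledged the event.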
@@ -8,7 +8,6 @@ import { nostrService } from './nostr'
 import { fetchAuthorPresentationFromPool } from './articlePublisherHelpersPresentation'
 import { extractTagsFromEvent } from './nostrTagSystem'
 import { extractSeriesFromEvent, extractPublicationFromEvent, extractPurchaseFromEvent, extractSponsoringFromEvent, extractReviewTipFromEvent } from './metadataExtractor'
-import { objectCache } from './objectCache'
 import { getLatestVersion } from './versionManager'
 import { buildTagFilter } from './nostrTagSystemFilter'
 import { getPrimaryRelaySync } from './config'
|
|||||||
const extractedHash = publicationParsed.hash ?? extracted.id
|
const extractedHash = publicationParsed.hash ?? extracted.id
|
||||||
const extractedIndex = publicationParsed.index ?? 0
|
const extractedIndex = publicationParsed.index ?? 0
|
||||||
const tags = extractTagsFromEvent(latestEvent)
|
const tags = extractTagsFromEvent(latestEvent)
|
||||||
await objectCache.set(
|
const { writeService } = await import('./writeService')
|
||||||
|
await writeService.writeObject(
|
||||||
'publication',
|
'publication',
|
||||||
extractedHash,
|
extractedHash,
|
||||||
latestEvent,
|
latestEvent,
|
||||||
extracted,
|
extracted,
|
||||||
tags.version ?? 0,
|
tags.version ?? 0,
|
||||||
tags.hidden ?? false,
|
tags.hidden ?? false,
|
||||||
extractedIndex
|
extractedIndex,
|
||||||
|
false
|
||||||
)
|
)
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
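The extra trailing `false` argument in these `writeService.writeObject` calls is the initial `published` status the project's architecture rules require on every IndexedDB write: objects land unpublished, and the publish worker later flips the status to the list of relays that acknowledged them. A minimal sketch of that contract, assuming a simplified signature and a `Map` standing in for IndexedDB (`markPublished` is a hypothetical helper, not the project's API):

```typescript
// Every local write carries a published status; a worker later resolves it.
interface StoredObject {
  hash: string
  data: unknown
  published: false | string[] // false = pending or failed, string[] = relays that acked
}

const store = new Map<string, StoredObject>()

function writeObject(type: string, hash: string, data: unknown, published: false | string[] = false): void {
  store.set(`${type}:${hash}`, { hash, data, published })
}

function markPublished(type: string, hash: string, relays: string[]): void {
  const obj = store.get(`${type}:${hash}`)
  if (obj) obj.published = relays
}

writeObject('publication', 'h1', { title: 'draft' }) // starts unpublished
markPublished('publication', 'h1', ['wss://relay.example'])
```

Keeping the flag in the stored record means the publish worker can find pending work with a single index scan instead of a separate queue.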
@@ -239,14 +240,16 @@ async function fetchAndCacheSeries(
   const extractedHash = publicationParsed.hash ?? extracted.id
   const extractedIndex = publicationParsed.index ?? 0
   const tags = extractTagsFromEvent(latestEvent)
-  await objectCache.set(
+  const { writeService } = await import('./writeService')
+  await writeService.writeObject(
     'series',
     extractedHash,
     latestEvent,
     extracted,
     tags.version ?? 0,
     tags.hidden ?? false,
-    extractedIndex
+    extractedIndex,
+    false
   )
 }
 }
@@ -351,7 +354,8 @@ async function fetchAndCachePurchases(
   const { parsePurchaseFromEvent } = await import('./nostrEventParsing')
   const purchase = await parsePurchaseFromEvent(event)
   if (purchase) {
-    await objectCache.set('purchase', purchase.hash, event, purchase, 0, false, purchase.index)
+    const { writeService } = await import('./writeService')
+    await writeService.writeObject('purchase', purchase.hash, event, purchase, 0, false, purchase.index, false)
   }
 }
 }
@@ -452,7 +456,8 @@ async function fetchAndCacheSponsoring(
   const { parseSponsoringFromEvent } = await import('./nostrEventParsing')
   const sponsoring = await parseSponsoringFromEvent(event)
   if (sponsoring) {
-    await objectCache.set('sponsoring', sponsoring.hash, event, sponsoring, 0, false, sponsoring.index)
+    const { writeService } = await import('./writeService')
+    await writeService.writeObject('sponsoring', sponsoring.hash, event, sponsoring, 0, false, sponsoring.index, false)
   }
 }
 }
@@ -553,7 +558,8 @@ async function fetchAndCacheReviewTips(
   const { parseReviewTipFromEvent } = await import('./nostrEventParsing')
   const reviewTip = await parseReviewTipFromEvent(event)
   if (reviewTip) {
-    await objectCache.set('review_tip', reviewTip.hash, event, reviewTip, 0, false, reviewTip.index)
+    const { writeService } = await import('./writeService')
+    await writeService.writeObject('review_tip', reviewTip.hash, event, reviewTip, 0, false, reviewTip.index, false)
   }
 }
 }
@@ -672,11 +678,12 @@ async function fetchAndCachePaymentNotes(
   if (tags.type === 'payment' && tags.payment) {
     // Cache the payment note event
     // Use event.id as hash since payment notes don't have a separate hash system
-    await objectCache.set('payment_note', event.id, event, {
+    const { writeService } = await import('./writeService')
+    await writeService.writeObject('payment_note', event.id, event, {
       id: event.id,
       type: 'payment_note',
       eventId: event.id,
-    }, 0, false, 0)
+    }, 0, false, 0, false)
   }
 }
@@ -807,6 +814,18 @@ export async function syncUserContentToCache(

     // Store the current timestamp as last sync date
     await setLastSyncDate(currentTimestamp)
+
+    // Update lastSyncDate for all relays used during sync
+    const { configStorage } = await import('./configStorage')
+    const config = await configStorage.getConfig()
+    const currentRelay = syncProgressManager.getProgress()?.currentRelay
+    if (currentRelay) {
+      const relayConfig = config.relays.find((r) => r.url === currentRelay)
+      if (relayConfig) {
+        await configStorage.updateRelay(relayConfig.id, { lastSyncDate: Date.now() })
+      }
+    }
+
     console.warn('[Sync] Synchronization completed successfully')
   } catch (syncError) {
     console.error('Error syncing user content to cache:', syncError)
lib/websocketService.ts (new file, +277 lines)

/**
 * WebSocket service - manages WebSocket connections and communicates with Service Workers
 *
 * Primary role: WebSocket connection management
 * Secondary roles: message routing, connection state management
 * Communication with Service Workers: via postMessage
 */

import { SimplePool } from 'nostr-tools'
import { swClient } from './swClient'
import type { Event } from 'nostr-tools'

interface ConnectionState {
  relayUrl: string
  connected: boolean
  lastConnectedAt: number | null
  lastDisconnectedAt: number | null
  reconnectAttempts: number
}

class WebSocketService {
  private pool: SimplePool | null = null
  private initialized = false
  private connectionStates: Map<string, ConnectionState> = new Map()
  private reconnectIntervals: Map<string, number> = new Map()
  private readonly MAX_RECONNECT_ATTEMPTS = 5
  private readonly RECONNECT_DELAY_MS = 5000

  /**
   * Initialize the WebSocket service
   * Primary role: connection management
   */
  async initialize(): Promise<void> {
    if (this.initialized) {
      return
    }

    if (typeof window === 'undefined') {
      return
    }

    this.pool = new SimplePool()
    this.initialized = true

    // Notify Service Worker of initialization
    void swClient.sendMessage({
      type: 'WEBSOCKET_SERVICE_INITIALIZED',
    })

    console.warn('[WebSocketService] Initialized')
  }

  /**
   * Get the pool instance
   */
  getPool(): SimplePool | null {
    return this.pool
  }

  /**
   * Get connection state for a relay
   * Secondary role: connection state management
   */
  getConnectionState(relayUrl: string): ConnectionState | null {
    return this.connectionStates.get(relayUrl) ?? null
  }

  /**
   * Update connection state
   * Secondary role: connection state management
   */
  private updateConnectionState(relayUrl: string, connected: boolean): void {
    const current = this.connectionStates.get(relayUrl) ?? {
      relayUrl,
      connected: false,
      lastConnectedAt: null,
      lastDisconnectedAt: null,
      reconnectAttempts: 0,
    }

    const now = Date.now()
    const updated: ConnectionState = {
      ...current,
      connected,
      lastConnectedAt: connected ? now : current.lastConnectedAt,
      lastDisconnectedAt: connected ? null : now,
      reconnectAttempts: connected ? 0 : current.reconnectAttempts,
    }

    this.connectionStates.set(relayUrl, updated)

    // Notify Service Worker of connection state change via postMessage
    void swClient.sendMessage({
      type: 'WEBSOCKET_CONNECTION_STATE_CHANGED',
      data: { relayUrl, connected, state: updated },
    })
  }

  /**
   * Handle reconnection for a relay
   * Primary role: reconnection management
   */
  private async handleReconnection(relayUrl: string): Promise<void> {
    const state = this.connectionStates.get(relayUrl)
    if (!state || state.connected) {
      return
    }

    if (state.reconnectAttempts >= this.MAX_RECONNECT_ATTEMPTS) {
      console.warn(`[WebSocketService] Max reconnect attempts reached for ${relayUrl}`)
      return
    }

    const updated: ConnectionState = {
      ...state,
      reconnectAttempts: state.reconnectAttempts + 1,
    }
    this.connectionStates.set(relayUrl, updated)

    // Notify Service Worker of reconnection attempt via postMessage
    void swClient.sendMessage({
      type: 'WEBSOCKET_RECONNECT_ATTEMPT',
      data: { relayUrl, attempt: updated.reconnectAttempts },
    })

    // Schedule reconnection
    const timeoutId = window.setTimeout(() => {
      void this.attemptReconnection(relayUrl)
    }, this.RECONNECT_DELAY_MS)

    this.reconnectIntervals.set(relayUrl, timeoutId)
  }

  /**
   * Attempt to reconnect to a relay
   */
  private async attemptReconnection(relayUrl: string): Promise<void> {
    // The pool will handle reconnection automatically
    // We just need to track the state
    console.warn(`[WebSocketService] Attempting to reconnect to ${relayUrl}`)
  }

  /**
   * Publish event to relays via WebSocket
   * Secondary role: message routing
   * Communicates with Service Worker via postMessage
   */
  async publishEvent(event: Event, relays: string[]): Promise<{ success: boolean; error?: string }[]> {
    if (!this.pool) {
      await this.initialize()
    }

    if (!this.pool) {
      throw new Error('WebSocket service not initialized')
    }

    // Update connection states
    relays.forEach((relayUrl) => {
      if (!this.connectionStates.has(relayUrl)) {
        this.updateConnectionState(relayUrl, true) // Assume connected when publishing
      }
    })

    // Publish to relays
    const pubs = this.pool.publish(relays, event)
    const results = await Promise.allSettled(pubs)

    const statuses: Array<{ success: boolean; error?: string }> = []
    results.forEach((result, index) => {
      const relayUrl = relays[index]
      if (!relayUrl) {
        return
      }

      if (result.status === 'fulfilled') {
        statuses.push({ success: true })
        this.updateConnectionState(relayUrl, true)
        // Notify Service Worker of successful publication via postMessage
        void swClient.sendMessage({
          type: 'WEBSOCKET_PUBLISH_SUCCESS',
          data: { eventId: event.id, relayUrl },
        })
      } else {
        const error = result.reason instanceof Error ? result.reason.message : String(result.reason)
        statuses.push({ success: false, error })
        this.updateConnectionState(relayUrl, false)
        // Notify Service Worker of failed publication via postMessage
        void swClient.sendMessage({
          type: 'WEBSOCKET_PUBLISH_FAILED',
          data: { eventId: event.id, relayUrl, error },
        })
        // Trigger reconnection
        void this.handleReconnection(relayUrl)
      }
    })

    return statuses
  }

  /**
   * Subscribe to events from relays
   * Secondary role: message routing
   * Communicates with Service Worker via postMessage
   */
  async subscribe(
    relays: string[],
    filters: Array<Record<string, unknown>>,
    onEvent: (event: Event) => void
  ): Promise<() => void> {
    if (!this.pool) {
      await this.initialize()
    }

    if (!this.pool) {
      throw new Error('WebSocket service not initialized')
    }

    // Update connection states
    relays.forEach((relayUrl) => {
      this.updateConnectionState(relayUrl, true) // Assume connected when subscribing
    })

    // Create subscription
    const sub = this.pool.subscribe(relays, filters, {
      onevent: (event: Event): void => {
        // Notify Service Worker of new event via postMessage
        void swClient.sendMessage({
          type: 'WEBSOCKET_EVENT_RECEIVED',
          data: { event },
        })
        onEvent(event)
      },
      oneose: (): void => {
        // Notify Service Worker that subscription is complete via postMessage
        void swClient.sendMessage({
          type: 'WEBSOCKET_EOSE',
          data: { relays },
        })
      },
    })

    return (): void => {
      sub.close()
    }
  }

  /**
   * Close all connections
   * Primary role: connection management
   */
  close(): void {
    // Clear reconnect intervals
    this.reconnectIntervals.forEach((timeoutId) => {
      clearTimeout(timeoutId)
    })
    this.reconnectIntervals.clear()

    if (this.pool) {
      this.pool.close()
      this.pool = null
    }

    // Update all connection states
    this.connectionStates.forEach((state, relayUrl) => {
      this.updateConnectionState(relayUrl, false)
    })

    this.initialized = false

    // Notify Service Worker via postMessage
    void swClient.sendMessage({
      type: 'WEBSOCKET_SERVICE_CLOSED',
    })
  }
}

export const websocketService = new WebSocketService()
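The status collection in `publishEvent` above hinges on `Promise.allSettled` pairing each relay's publish promise with its index. A minimal, self-contained sketch of that pattern (the `pubs` array stands in for the per-relay promises returned by `SimplePool.publish`; `collectStatuses` is a hypothetical name, not part of the service):

```typescript
// Sketch: turn one promise per relay into per-relay success/error statuses,
// mirroring the results.forEach loop in publishEvent. Hypothetical helper.
async function collectStatuses(
  relays: string[],
  pubs: Promise<unknown>[]
): Promise<Array<{ relay: string; success: boolean; error?: string }>> {
  // allSettled never rejects: each entry is fulfilled or rejected independently
  const results = await Promise.allSettled(pubs)
  return results.map((result, i) => {
    if (result.status === 'fulfilled') {
      return { relay: relays[i], success: true }
    }
    const error = result.reason instanceof Error ? result.reason.message : String(result.reason)
    return { relay: relays[i], success: false, error }
  })
}
```

Because the promises are settled independently, one refusing relay never masks an accepting one, which is what lets the service record a partial `published` list.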
lib/writeOrchestrator.ts (new file, +144 lines)

/**
 * Write orchestrator service - manages writes/updates to WebSockets/API and to write Web Worker
 * Orchestrates communication between interface, network (Nostr/WebSockets) and write Web Worker
 */

import { websocketService } from './websocketService'
import { writeService } from './writeService'
import type { Event, EventTemplate } from 'nostr-tools'
import { finalizeEvent } from 'nostr-tools'
import { hexToBytes } from 'nostr-tools/utils'
import type { ObjectType } from './objectCache'
import type { NostrEvent } from 'nostr-tools'

interface WriteObjectParams {
  objectType: ObjectType
  hash: string
  event: NostrEvent
  parsed: unknown
  version: number
  hidden: boolean
  index?: number
  published?: false | string[]
}

class WriteOrchestrator {
  private privateKey: string | null = null

  /**
   * Set private key for signing events
   */
  setPrivateKey(privateKey: string): void {
    this.privateKey = privateKey
  }

  /**
   * Write object to IndexedDB and publish to network
   * Orchestrates: WebSockets → Web Worker → IndexedDB
   * Network and local writes run in parallel, independently
   * If the network fails but the local write succeeds, do nothing (another service worker will retry)
   */
  async writeAndPublish(
    params: WriteObjectParams,
    relays: string[]
  ): Promise<{ success: boolean; eventId: string; published: false | string[] }> {
    const { objectType, hash, event, parsed, version, hidden, index, published = false } = params

    // Parallel writes: network and local, independently
    const [networkResult, localResult] = await Promise.allSettled([
      // 1. Publish to network via WebSocket service (in parallel)
      websocketService.publishEvent(event, relays).then((statuses) => {
        return statuses
          .map((status, index) => (status.success ? relays[index] : null))
          .filter((relay): relay is string => relay !== null)
      }),
      // 2. Write to IndexedDB via Web Worker (in parallel, with published: false initially)
      writeService.writeObject(objectType, hash, event, parsed, version, hidden, index, false),
    ])

    // Handle the network result
    let publishedRelays: string[] = []
    if (networkResult.status === 'fulfilled') {
      publishedRelays = networkResult.value
    } else {
      // If the network fails, do nothing: another service worker will retry
      console.warn('[WriteOrchestrator] Network publish failed, will retry later:', networkResult.reason)
    }

    // Handle the local result
    if (localResult.status === 'rejected') {
      console.error('[WriteOrchestrator] Local write failed:', localResult.reason)
      throw new Error(`Failed to write to IndexedDB: ${localResult.reason}`)
    }

    // 3. Update published status in IndexedDB via Web Worker (even if the network failed)
    const publishedStatus: false | string[] = publishedRelays.length > 0 ? publishedRelays : false
    await writeService.updatePublished(objectType, hash, publishedStatus)

    return {
      success: publishedRelays.length > 0,
      eventId: event.id,
      published: publishedStatus,
    }
  }

  /**
   * Create and publish event from template
   */
  async createAndPublishEvent(
    eventTemplate: EventTemplate,
    relays: string[],
    objectType: ObjectType,
    hash: string,
    parsed: unknown,
    version: number,
    hidden: boolean,
    index?: number
  ): Promise<{ success: boolean; event: Event; published: false | string[] }> {
    if (!this.privateKey) {
      throw new Error('Private key not set')
    }

    // Create event
    const unsignedEvent: EventTemplate = {
      ...eventTemplate,
      created_at: eventTemplate.created_at ?? Math.floor(Date.now() / 1000),
    }

    const secretKey = hexToBytes(this.privateKey)
    const event = finalizeEvent(unsignedEvent, secretKey)

    // Write and publish
    const result = await this.writeAndPublish(
      {
        objectType,
        hash,
        event,
        parsed,
        version,
        hidden,
        index,
      },
      relays
    )

    return {
      success: result.success,
      event,
      published: result.published,
    }
  }

  /**
   * Update published status (for republishing)
   */
  async updatePublishedStatus(
    objectType: ObjectType,
    id: string,
    published: false | string[]
  ): Promise<void> {
    await writeService.updatePublished(objectType, id, published)
  }
}

export const writeOrchestrator = new WriteOrchestrator()
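The core of `writeAndPublish` — a best-effort network publish and a mandatory local write settling independently — can be isolated as a small sketch. All names here (`writeAndPublishSketch` and its callback parameters) are hypothetical stand-ins, not the project's real services:

```typescript
// Sketch of the writeAndPublish orchestration: network and local writes run in
// parallel via Promise.allSettled; a network failure is tolerated (a retry
// worker will republish later), while a local-write failure aborts.
type SketchResult = { success: boolean; published: false | string[] }

async function writeAndPublishSketch(
  publishToRelays: (relays: string[]) => Promise<string[]>, // resolves with relays that accepted
  writeLocal: () => Promise<void>, // persists with published=false initially
  relays: string[]
): Promise<SketchResult> {
  const [network, local] = await Promise.allSettled([publishToRelays(relays), writeLocal()])

  // Local persistence is mandatory: a rejected local write aborts the operation.
  if (local.status === 'rejected') {
    throw new Error(`local write failed: ${String(local.reason)}`)
  }

  // Network is best-effort: on failure, published stays false so a retry picks it up.
  const publishedRelays = network.status === 'fulfilled' ? network.value : []
  const published: false | string[] = publishedRelays.length > 0 ? publishedRelays : false
  return { success: publishedRelays.length > 0, published }
}
```

Using `Promise.allSettled` rather than `Promise.all` is the key design choice: a relay outage must not prevent the local write from completing, and the `published` status field is exactly what records which half succeeded.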
lib/writeService.ts (new file, +301 lines)

/**
 * Write service - manages all write operations to IndexedDB
 * Routes writes through Web Worker for IndexedDB operations
 * Routes network operations through WebSocket service
 */

import type { NostrEvent } from 'nostr-tools'
import type { ObjectType } from './objectCache'

class WriteService {
  private writeWorker: Worker | null = null
  private initPromise: Promise<void> | null = null

  /**
   * Initialize the write worker
   */
  private async init(): Promise<void> {
    if (this.writeWorker) {
      return
    }

    if (this.initPromise) {
      return this.initPromise
    }

    this.initPromise = this.createWorker()

    try {
      await this.initPromise
    } catch (error) {
      this.initPromise = null
      throw error
    }
  }

  private createWorker(): Promise<void> {
    return new Promise((resolve, reject) => {
      if (typeof window === 'undefined' || !window.Worker) {
        // Fallback: write directly if Worker not available
        console.warn('[WriteService] Web Workers not available, using direct writes')
        resolve()
        return
      }

      try {
        // The worker lives in public/ for Next.js
        this.writeWorker = new Worker('/writeWorker.js', { type: 'classic' })

        this.writeWorker.addEventListener('message', (event) => {
          const { type, data } = event.data
          if (type === 'ERROR') {
            console.error('[WriteService] Worker error:', data)
          }
        })

        this.writeWorker.addEventListener('error', (error) => {
          console.error('[WriteService] Worker error:', error)
          // Do not reject; fall back instead
          console.warn('[WriteService] Falling back to direct writes')
          this.writeWorker = null
          resolve()
        })

        // Wait for the worker to be ready
        const readyTimeout = setTimeout(() => {
          console.warn('[WriteService] Worker ready timeout, using direct writes')
          if (this.writeWorker) {
            this.writeWorker.terminate()
            this.writeWorker = null
          }
          resolve()
        }, 2000)

        // The worker is ready once it responds
        const readyHandler = (event: MessageEvent): void => {
          if (event.data?.type === 'WORKER_READY') {
            clearTimeout(readyTimeout)
            this.writeWorker?.removeEventListener('message', readyHandler)
            resolve()
          }
        }

        this.writeWorker.addEventListener('message', readyHandler)
      } catch (error) {
        console.warn('[WriteService] Failed to create worker, using direct writes:', error)
        resolve() // Fallback to direct writes
      }
    })
  }

  /**
   * Write object to IndexedDB (via Web Worker)
   */
  async writeObject(
    objectType: ObjectType,
    hash: string,
    event: NostrEvent,
    parsed: unknown,
    version: number,
    hidden: boolean,
    index?: number,
    published: false | string[] = false
  ): Promise<void> {
    try {
      await this.init()

      if (this.writeWorker) {
        // Send to worker
        return new Promise((resolve, reject) => {
          const timeout = setTimeout(() => {
            reject(new Error('Write operation timeout'))
          }, 10000)

          const handler = (event: MessageEvent): void => {
            const { type, data } = event.data
            if (type === 'WRITE_OBJECT_SUCCESS' && data.hash === hash) {
              clearTimeout(timeout)
              this.writeWorker?.removeEventListener('message', handler)
              resolve()
            } else if (type === 'ERROR' && data.originalType === 'WRITE_OBJECT') {
              clearTimeout(timeout)
              this.writeWorker?.removeEventListener('message', handler)
              reject(new Error(data.error))
            }
          }

          this.writeWorker!.addEventListener('message', handler)
          this.writeWorker!.postMessage({
            type: 'WRITE_OBJECT',
            data: {
              objectType,
              hash,
              event,
              parsed,
              version,
              hidden,
              index,
              published,
            },
          })
        })
      }
      // Fallback: direct write
      const { objectCache } = await import('./objectCache')
      await objectCache.set(objectType, hash, event, parsed, version, hidden, index, published)

    } catch (error) {
      console.error('[WriteService] Error writing object:', error)
      throw error
    }
  }

  /**
   * Update published status (via Web Worker)
   */
  async updatePublished(
    objectType: ObjectType,
    id: string,
    published: false | string[]
  ): Promise<void> {
    try {
      await this.init()

      if (this.writeWorker) {
        // Send to worker
        return new Promise((resolve, reject) => {
          const timeout = setTimeout(() => {
            reject(new Error('Update published operation timeout'))
          }, 10000)

          const handler = (event: MessageEvent): void => {
            const { type, data } = event.data
            if (type === 'UPDATE_PUBLISHED_SUCCESS' && data.id === id) {
              clearTimeout(timeout)
              this.writeWorker?.removeEventListener('message', handler)
              resolve()
            } else if (type === 'ERROR' && (data.originalType === 'UPDATE_PUBLISHED' || data.taskId?.startsWith('UPDATE_PUBLISHED'))) {
              clearTimeout(timeout)
              this.writeWorker?.removeEventListener('message', handler)
              reject(new Error(data.error))
            }
          }

          this.writeWorker!.addEventListener('message', handler)
          this.writeWorker!.postMessage({
            type: 'UPDATE_PUBLISHED',
            data: { objectType, id, published },
          })
        })
      }
      // Fallback: direct write
      const { objectCache } = await import('./objectCache')
      await objectCache.updatePublished(objectType, id, published)

    } catch (error) {
      console.error('[WriteService] Error updating published status:', error)
      throw error
    }
  }

  /**
   * Create notification (via Web Worker)
   */
  async createNotification(
    type: string,
    objectType: string,
    objectId: string,
    eventId: string,
    data?: Record<string, unknown>
  ): Promise<void> {
    try {
      await this.init()

      if (this.writeWorker) {
        // Send to worker
        return new Promise((resolve, reject) => {
          const timeout = setTimeout(() => {
            reject(new Error('Create notification operation timeout'))
          }, 10000)

          const handler = (event: MessageEvent): void => {
            const { type: responseType, data: responseData } = event.data
            if (responseType === 'CREATE_NOTIFICATION_SUCCESS' && responseData.eventId === eventId) {
              clearTimeout(timeout)
              this.writeWorker?.removeEventListener('message', handler)
              resolve()
            } else if (responseType === 'ERROR' && (responseData.originalType === 'CREATE_NOTIFICATION' || responseData.taskId?.startsWith('CREATE_NOTIFICATION'))) {
              clearTimeout(timeout)
              this.writeWorker?.removeEventListener('message', handler)
              reject(new Error(responseData.error))
            }
          }

          this.writeWorker!.addEventListener('message', handler)
          this.writeWorker!.postMessage({
            type: 'CREATE_NOTIFICATION',
            data: { type, objectType, objectId, eventId, notificationData: data },
          })
        })
      }
      // Fallback: direct write
      const { notificationService } = await import('./notificationService')
      await notificationService.createNotificationDirect(
        type as Parameters<typeof notificationService.createNotificationDirect>[0],
        objectType,
        objectId,
        eventId,
        data
      )

    } catch (error) {
      console.error('[WriteService] Error creating notification:', error)
      throw error
    }
  }

  /**
   * Log publication (via Web Worker)
   */
  async logPublication(
    eventId: string,
    relayUrl: string,
    success: boolean,
    error?: string,
    objectType?: string,
    objectId?: string
  ): Promise<void> {
    try {
      await this.init()

      if (this.writeWorker) {
        // Send to worker
        this.writeWorker.postMessage({
          type: 'LOG_PUBLICATION',
          data: { eventId, relayUrl, success, error, objectType, objectId },
        })
        // Don't wait for response for logs (fire and forget)
      } else {
        // Fallback: direct write
        const { publishLog } = await import('./publishLog')
        await publishLog.logPublicationDirect(eventId, relayUrl, success, error, objectType, objectId)
      }
    } catch (error) {
      console.error('[WriteService] Error logging publication:', error)
      // Don't throw for logs
    }
  }

  /**
   * Terminate the worker
   */
  terminate(): void {
    if (this.writeWorker) {
      this.writeWorker.terminate()
      this.writeWorker = null
      this.initPromise = null
    }
  }
}

export const writeService = new WriteService()
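Each write method above follows the same request/reply pattern over `postMessage`: register a handler, post the request, and resolve on a matching `*_SUCCESS` reply (correlated by `hash`/`id`), reject on a matching `ERROR`, or time out. A self-contained sketch of that pattern, with `WorkerLike` and `requestReply` as hypothetical stand-ins so it can run without a real Worker:

```typescript
// Sketch of WriteService's worker correlation pattern: one pending promise per
// request, matched to a reply by payload hash, guarded by a timeout.
interface WorkerLike {
  postMessage(msg: unknown): void
  addEventListener(type: 'message', handler: (event: { data: { type: string; data: any } }) => void): void
  removeEventListener(type: 'message', handler: (event: { data: { type: string; data: any } }) => void): void
}

function requestReply(worker: WorkerLike, hash: string, timeoutMs = 10000): Promise<void> {
  return new Promise((resolve, reject) => {
    // Reject if no matching reply arrives in time
    const timeout = setTimeout(() => reject(new Error('Write operation timeout')), timeoutMs)

    const handler = (event: { data: { type: string; data: any } }): void => {
      const { type, data } = event.data
      if (type === 'WRITE_OBJECT_SUCCESS' && data.hash === hash) {
        clearTimeout(timeout)
        worker.removeEventListener('message', handler)
        resolve()
      } else if (type === 'ERROR' && data.originalType === 'WRITE_OBJECT') {
        clearTimeout(timeout)
        worker.removeEventListener('message', handler)
        reject(new Error(data.error))
      }
    }

    worker.addEventListener('message', handler)
    worker.postMessage({ type: 'WRITE_OBJECT', data: { hash } })
  })
}
```

Removing the handler on both the success and error paths (and clearing the timeout) is what keeps long-lived workers from leaking listeners as writes accumulate.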
@@ -7,6 +7,9 @@ import { nostrAuthService } from '@/lib/nostrAuth'
 import { syncUserContentToCache } from '@/lib/userContentSync'
 import { syncProgressManager } from '@/lib/syncProgressManager'
 import { relaySessionManager } from '@/lib/relaySessionManager'
+import { publishWorker } from '@/lib/publishWorker'
+import { swSyncHandler } from '@/lib/swSyncHandler'
+import { swClient } from '@/lib/swClient'

 function I18nProvider({ children }: { children: React.ReactNode }): React.ReactElement {
   // Get saved locale from localStorage or default to French
@@ -51,10 +54,25 @@ export default function App({ Component, pageProps }: AppProps): React.ReactElem
     void relaySessionManager.initialize()
   }, [])

+  // Initialize Service Worker and sync handler
+  React.useEffect(() => {
+    if (typeof window !== 'undefined') {
+      void swSyncHandler.initialize()
+    }
+  }, [])
+
+  // Start publish worker on app mount to republish unpublished objects
+  React.useEffect(() => {
+    void publishWorker.start()
+    return () => {
+      void publishWorker.stop()
+    }
+  }, [])
+
   // Start platform sync on app mount and resume on each page navigation
   React.useEffect(() => {
-    // Start continuous sync (runs periodically in background)
-    platformSyncService.startContinuousSync()
+    // Start continuous sync (runs periodically in background, uses Service Worker if available)
+    void platformSyncService.startContinuousSync()

     // Also trigger a sync on each page navigation
     const handleRouteChange = (): void => {
@@ -69,7 +87,7 @@ export default function App({ Component, pageProps }: AppProps): React.ReactElem

     return () => {
       router.events?.off('routeChangeComplete', handleRouteChange)
-      platformSyncService.stopSync()
+      void platformSyncService.stopSync()
     }
   }, [])

@@ -93,6 +111,18 @@ export default function App({ Component, pageProps }: AppProps): React.ReactElem
       syncInProgress = true
       console.warn('[App] Starting user content sync...')
+
+      // Try to use Service Worker for background sync
+      try {
+        const isReady = await swClient.isReady()
+        if (isReady) {
+          console.warn('[App] Using Service Worker for user content sync')
+          await swClient.startUserSync(state.pubkey)
+          // Still sync immediately in main thread for UI feedback
+        }
+      } catch (error) {
+        console.warn('[App] Service Worker not available, using direct sync:', error)
+      }
+
       try {
         await syncUserContentToCache(state.pubkey, (progress) => {
           syncProgressManager.setProgress(progress)
@ -25,7 +25,7 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse)
|
|||||||
}
|
}
|
||||||
|
|
||||||
// Get target endpoint and auth token from query
|
// Get target endpoint and auth token from query
|
||||||
const targetEndpoint = (req.query.endpoint as string) || 'https://void.cat/upload'
|
const targetEndpoint = (req.query.endpoint as string) ?? 'https://void.cat/upload'
|
||||||
const authToken = req.query.auth as string | undefined
|
const authToken = req.query.auth as string | undefined
|
||||||
|
|
||||||
try {
|
try {
|
||||||
@ -115,7 +115,7 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse)
|
|||||||
|
|
||||||
const requestOptions = {
|
const requestOptions = {
|
||||||
hostname: url.hostname,
|
hostname: url.hostname,
|
||||||
port: url.port || (isHttps ? 443 : 80),
|
port: url.port ?? (isHttps ? 443 : 80),
|
||||||
path: url.pathname + url.search,
|
path: url.pathname + url.search,
|
||||||
method: 'POST',
|
method: 'POST',
|
||||||
headers,
|
headers,
|
||||||
@@ -124,7 +124,7 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse)
 
     const proxyRequest = clientModule.request(requestOptions, (proxyResponse) => {
       // Handle redirects (301, 302, 307, 308)
-      const statusCode = proxyResponse.statusCode || 500
+      const statusCode = proxyResponse.statusCode ?? 500
       if ((statusCode === 301 || statusCode === 302 || statusCode === 307 || statusCode === 308) && proxyResponse.headers.location) {
         const {location} = proxyResponse.headers
         let redirectUrl: URL
@@ -176,7 +176,7 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse)
 
         resolve({
           statusCode,
-          statusMessage: proxyResponse.statusMessage || 'Internal Server Error',
+          statusMessage: proxyResponse.statusMessage ?? 'Internal Server Error',
           body,
         })
       })
@@ -23,7 +23,8 @@ function AuthorPageHeader({ presentation }: { presentation: AuthorPresentationAr
   }
 
   // Extract author name from title (format: "Présentation de <name>")
-  const authorName = presentation.title.replace(/^Présentation de /, '').trim() || presentation.title
+  const trimmed = presentation.title.replace(/^Présentation de /, '').trim()
+  const authorName = trimmed || presentation.title
 
   return (
     <div className="space-y-4 mb-8">
@@ -293,14 +294,14 @@ export default function AuthorPage(): React.ReactElement {
     }
   }
 
-  const { presentation, series, totalSponsoring, loading, error, reload } = useAuthorData(hashIdOrPubkey || '')
+  const { presentation, series, totalSponsoring, loading, error, reload } = useAuthorData(hashIdOrPubkey ?? '')
 
   if (!hashIdOrPubkey) {
-    return <></>
+    return null
   }
 
   // Get the actual pubkey from presentation
-  const actualAuthorPubkey = presentation?.pubkey || ''
+  const actualAuthorPubkey = presentation?.pubkey ?? ''
 
   return (
     <>
@@ -137,13 +137,15 @@ function useSeriesPageData(seriesId: string): {
     setLoading(true)
     setError(null)
     try {
-      const s = await getSeriesById(seriesId)
+      // First try cache (should be very fast)
+      const s = await getSeriesById(seriesId, 2000)
       if (!s) {
         setError('Série introuvable')
         setLoading(false)
         return
       }
       setSeries(s)
+      // Load aggregates and articles in parallel
       const [agg, seriesArticles] = await Promise.all([
         getSeriesAggregates({ authorPubkey: s.pubkey, seriesId: s.id }),
         getArticlesBySeries(s.id),
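The hunk above passes a second argument, `2000`, to `getSeriesById` alongside a "first try cache" comment. Assuming that argument is a timeout in milliseconds (an assumption — the signature of `getSeriesById` is not shown in this diff), a common shape for a cache read with a deadline is a promise race:

```javascript
// Resolve with the lookup result, or with `fallback` (null by default)
// if the lookup takes longer than `ms` milliseconds.
function withTimeout(promise, ms, fallback = null) {
  return Promise.race([
    promise,
    new Promise((resolve) => setTimeout(() => resolve(fallback), ms)),
  ])
}
```

With this shape, a slow cache lookup degrades to the "not found" path instead of blocking the loading state indefinitely.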
@@ -302,6 +302,7 @@ settings.relay.list.disabled=Disabled
 settings.relay.list.priorityLabel=Priority {{priority}}
 settings.relay.list.editUrl=Click to edit URL
 settings.relay.list.remove=Remove
+settings.relay.list.lastSync=Last sync
 settings.relay.remove.confirm=Are you sure you want to remove this relay?
 settings.relay.empty=No relays configured
 settings.relay.note.title=Note:
@@ -307,6 +307,7 @@ settings.relay.list.disabled=Désactivé
 settings.relay.list.priorityLabel=Priorité {{priority}}
 settings.relay.list.editUrl=Cliquer pour modifier l'URL
 settings.relay.list.remove=Supprimer
+settings.relay.list.lastSync=Dernière synchronisation
 settings.relay.remove.confirm=Êtes-vous sûr de vouloir supprimer ce relais ?
 settings.relay.empty=Aucun relais configuré
 settings.relay.note.title=Note :
public/sw.js (new file, 293 lines)
@@ -0,0 +1,293 @@
/**
 * Service Worker for background synchronization
 * Handles platform sync, user content sync, and publish worker
 */

const CACHE_NAME = 'zapwall-sync-v1'
const SYNC_INTERVAL_MS = 60000 // 1 minute
const REPUBLISH_INTERVAL_MS = 30000 // 30 seconds

let syncInProgress = false
let publishWorkerInterval = null
let notificationDetectorInterval = null

// Install event - cache resources
self.addEventListener('install', (event) => {
  console.log('[SW] Service Worker installing')
  self.skipWaiting() // Activate immediately
})

// Activate event - clean up old caches
self.addEventListener('activate', (event) => {
  console.log('[SW] Service Worker activating')
  event.waitUntil(
    caches.keys().then((cacheNames) => {
      return Promise.all(
        cacheNames
          .filter((name) => name !== CACHE_NAME)
          .map((name) => caches.delete(name))
      )
    })
  )
  return self.clients.claim() // Take control of all pages immediately
})

// Message handler - communication with main thread
self.addEventListener('message', (event) => {
  const { type, data } = event.data

  switch (type) {
    case 'START_PLATFORM_SYNC':
      handleStartPlatformSync()
      break
    case 'STOP_PLATFORM_SYNC':
      handleStopPlatformSync()
      break
    case 'START_USER_SYNC':
      handleStartUserSync(data?.userPubkey)
      break
    case 'START_PUBLISH_WORKER':
      handleStartPublishWorker()
      break
    case 'STOP_PUBLISH_WORKER':
      handleStopPublishWorker()
      break
    case 'PUBLISH_EVENT':
      handlePublishEvent(data?.event, data?.relays)
      break
    case 'START_NOTIFICATION_DETECTOR':
      handleStartNotificationDetector(data?.userPubkey)
      break
    case 'STOP_NOTIFICATION_DETECTOR':
      handleStopNotificationDetector()
      break
    case 'PING':
      event.ports[0]?.postMessage({ type: 'PONG' })
      break
    default:
      console.warn('[SW] Unknown message type:', type)
  }
})

// Periodic sync event (if supported)
self.addEventListener('periodicsync', (event) => {
  if (event.tag === 'platform-sync') {
    event.waitUntil(handleStartPlatformSync())
  } else if (event.tag === 'publish-worker') {
    event.waitUntil(handlePublishWorker())
  }
})

/**
 * Start platform sync
 */
async function handleStartPlatformSync() {
  if (syncInProgress) {
    console.log('[SW] Platform sync already in progress')
    return
  }

  syncInProgress = true
  console.log('[SW] Starting platform sync')

  try {
    // Request sync from main thread (we need access to nostrService)
    broadcastToClients({ type: 'SYNC_REQUEST', data: { syncType: 'platform' } })

    // Set up periodic sync
    if ('periodicSync' in self.registration) {
      try {
        await self.registration.periodicSync.register('platform-sync', {
          minInterval: SYNC_INTERVAL_MS,
        })
      } catch (error) {
        console.warn('[SW] Periodic sync not supported:', error)
      }
    }
  } catch (error) {
    console.error('[SW] Error starting platform sync:', error)
  } finally {
    syncInProgress = false
  }
}

/**
 * Stop platform sync
 */
function handleStopPlatformSync() {
  console.log('[SW] Stopping platform sync')
  if ('periodicSync' in self.registration) {
    self.registration.periodicSync
      .unregister('platform-sync')
      .catch((error) => {
        console.warn('[SW] Error unregistering periodic sync:', error)
      })
  }
}

/**
 * Start user content sync
 */
async function handleStartUserSync(userPubkey) {
  if (!userPubkey) {
    console.warn('[SW] User sync requires pubkey')
    return
  }

  console.log('[SW] Starting user content sync for:', userPubkey)

  try {
    // Request sync from main thread
    broadcastToClients({
      type: 'SYNC_REQUEST',
      data: { syncType: 'user', userPubkey },
    })
  } catch (error) {
    console.error('[SW] Error starting user sync:', error)
  }
}

/**
 * Start publish worker
 */
function handleStartPublishWorker() {
  console.log('[SW] Starting publish worker')

  // Clear existing interval
  if (publishWorkerInterval) {
    clearInterval(publishWorkerInterval)
  }

  // Process unpublished objects periodically
  publishWorkerInterval = setInterval(() => {
    handlePublishWorker()
  }, REPUBLISH_INTERVAL_MS)

  // Process immediately
  handlePublishWorker()
}

/**
 * Stop publish worker
 */
function handleStopPublishWorker() {
  console.log('[SW] Stopping publish worker')
  if (publishWorkerInterval) {
    clearInterval(publishWorkerInterval)
    publishWorkerInterval = null
  }
}

/**
 * Process unpublished objects
 */
async function handlePublishWorker() {
  console.log('[SW] Processing unpublished objects')

  try {
    // Request publish worker execution from main thread
    broadcastToClients({ type: 'PUBLISH_WORKER_REQUEST' })
  } catch (error) {
    console.error('[SW] Error in publish worker:', error)
  }
}

/**
 * Publish event to relays
 */
async function handlePublishEvent(event, relays) {
  if (!event || !relays || relays.length === 0) {
    console.warn('[SW] Invalid publish event data')
    return
  }

  console.log('[SW] Publishing event:', event.id, 'to', relays.length, 'relays')

  try {
    // Request publish from main thread (we need access to nostrService)
    broadcastToClients({
      type: 'PUBLISH_REQUEST',
      data: { event, relays },
    })
  } catch (error) {
    console.error('[SW] Error publishing event:', error)
  }
}

/**
 * Broadcast message to all clients
 */
function broadcastToClients(message) {
  self.clients.matchAll().then((clients) => {
    clients.forEach((client) => {
      client.postMessage(message)
    })
  })
}

// Background sync event (fallback for browsers without periodic sync)
self.addEventListener('sync', (event) => {
  if (event.tag === 'platform-sync') {
    event.waitUntil(handleStartPlatformSync())
  } else if (event.tag === 'publish-worker') {
    event.waitUntil(handlePublishWorker())
  }
})

/**
 * Start notification detector
 * Service Worker directly populates notifications table
 */
function handleStartNotificationDetector(userPubkey) {
  console.log('[SW] Starting notification detector for:', userPubkey)

  // Clear existing interval
  if (notificationDetectorInterval) {
    clearInterval(notificationDetectorInterval)
  }

  // Service Worker directly scans IndexedDB and creates notifications
  // This runs in the Service Worker context
  notificationDetectorInterval = setInterval(() => {
    void scanAndCreateNotifications(userPubkey)
  }, 30000) // Every 30 seconds

  // Process immediately
  void scanAndCreateNotifications(userPubkey)
}

/**
 * Scan IndexedDB and create notifications
 * The Service Worker detects changes independently;
 * it does not depend on the write service
 */
async function scanAndCreateNotifications(userPubkey) {
  try {
    // The Service Worker detects changes on its own: it scans IndexedDB
    // directly for new events and creates the notifications through the
    // write Web Worker.

    // For now, ask the main thread to do the scan, because the Service
    // Worker has no direct access to objectCache.
    // TODO: implement direct detection in the Service Worker if possible
    broadcastToClients({
      type: 'NOTIFICATION_DETECT_REQUEST',
      data: { userPubkey },
    })
  } catch (error) {
    console.error('[SW] Error in notification detection:', error)
  }
}

/**
 * Stop notification detector
 */
function handleStopNotificationDetector() {
  console.log('[SW] Stopping notification detector')
  if (notificationDetectorInterval) {
    clearInterval(notificationDetectorInterval)
    notificationDetectorInterval = null
  }
}

console.log('[SW] Service Worker loaded')
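`sw.js` cannot reach `nostrService` or `objectCache` directly, so it broadcasts request messages (`SYNC_REQUEST`, `PUBLISH_WORKER_REQUEST`, `PUBLISH_REQUEST`, `NOTIFICATION_DETECT_REQUEST`) back to the pages it controls. A minimal sketch of the main-thread side of that protocol — the handler names are illustrative, not the project's actual code:

```javascript
// Route a message broadcast by the Service Worker to main-thread handlers.
// Returns true when the message was handled, false for unknown types.
function routeSwMessage(message, handlers) {
  const { type, data } = message
  switch (type) {
    case 'SYNC_REQUEST':
      handlers.onSyncRequest(data) // { syncType: 'platform' | 'user', userPubkey? }
      return true
    case 'PUBLISH_WORKER_REQUEST':
      handlers.onPublishWorkerRequest()
      return true
    case 'PUBLISH_REQUEST':
      handlers.onPublishRequest(data.event, data.relays)
      return true
    case 'NOTIFICATION_DETECT_REQUEST':
      handlers.onNotificationDetectRequest(data.userPubkey)
      return true
    default:
      return false
  }
}

// In the app this would presumably be wired as:
// navigator.serviceWorker.addEventListener('message', (e) => routeSwMessage(e.data, handlers))
```

This keeps the Service Worker a pure scheduler while all relay and cache access stays in the main thread.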
484
public/writeWorker.js
Normal file
484
public/writeWorker.js
Normal file
@ -0,0 +1,484 @@
|
|||||||
|
/**
|
||||||
|
* Web Worker for writing to IndexedDB
|
||||||
|
* Handles all write operations to IndexedDB to keep main thread responsive
|
||||||
|
*
|
||||||
|
* - Reçoit les données complètes (pas seulement les modifications)
|
||||||
|
* - Gère une pile d'écritures pour éviter les conflits
|
||||||
|
* - Transactions multi-tables : plusieurs transactions, logique de découpage côté worker
|
||||||
|
*/
|
||||||
|
|
||||||
|
const DB_VERSIONS = {
|
||||||
|
author: 3,
|
||||||
|
series: 3,
|
||||||
|
publication: 3,
|
||||||
|
review: 3,
|
||||||
|
purchase: 3,
|
||||||
|
sponsoring: 3,
|
||||||
|
review_tip: 3,
|
||||||
|
payment_note: 3,
|
||||||
|
}
|
||||||
|
|
||||||
|
// Pile d'écritures pour éviter les conflits
|
||||||
|
const writeQueue = []
|
||||||
|
let processingQueue = false
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Process the write queue sequentially
|
||||||
|
*/
|
||||||
|
async function processQueue() {
|
||||||
|
if (processingQueue || writeQueue.length === 0) {
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
processingQueue = true
|
||||||
|
|
||||||
|
while (writeQueue.length > 0) {
|
||||||
|
const task = writeQueue.shift()
|
||||||
|
if (!task) {
|
||||||
|
continue
|
||||||
|
}
|
||||||
|
|
||||||
|
try {
|
||||||
|
await executeWriteTask(task)
|
||||||
|
} catch (error) {
|
||||||
|
self.postMessage({
|
||||||
|
type: 'ERROR',
|
||||||
|
data: {
|
||||||
|
error: error instanceof Error ? error.message : String(error),
|
||||||
|
originalType: task.type,
|
||||||
|
taskId: task.id,
|
||||||
|
},
|
||||||
|
})
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
processingQueue = false
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Execute a write task
|
||||||
|
*/
|
||||||
|
async function executeWriteTask(task) {
|
||||||
|
const { type, data, id } = task
|
||||||
|
|
||||||
|
switch (type) {
|
||||||
|
case 'WRITE_OBJECT':
|
||||||
|
await handleWriteObject(data, id)
|
||||||
|
break
|
||||||
|
case 'UPDATE_PUBLISHED':
|
||||||
|
await handleUpdatePublished(data, id)
|
||||||
|
break
|
||||||
|
case 'CREATE_NOTIFICATION':
|
||||||
|
await handleCreateNotification(data, id)
|
||||||
|
break
|
||||||
|
case 'LOG_PUBLICATION':
|
||||||
|
await handleLogPublication(data, id)
|
||||||
|
break
|
||||||
|
case 'WRITE_MULTI_TABLE':
|
||||||
|
await handleWriteMultiTable(data, id)
|
||||||
|
break
|
||||||
|
default:
|
||||||
|
throw new Error(`Unknown message type: ${type}`)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Listen for messages from main thread
|
||||||
|
self.addEventListener('message', (event) => {
|
||||||
|
const { type, data } = event.data
|
||||||
|
|
||||||
|
// Add to queue with unique ID
|
||||||
|
const taskId = `${type}_${Date.now()}_${Math.random().toString(36).substr(2, 9)}`
|
||||||
|
writeQueue.push({ type, data, id: taskId })
|
||||||
|
|
||||||
|
// Process queue
|
||||||
|
void processQueue()
|
||||||
|
})
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Handle write object request
|
||||||
|
* Reçoit les données complètes
|
||||||
|
*/
|
||||||
|
async function handleWriteObject(data, taskId) {
|
||||||
|
const { objectType, hash, event, parsed, version, hidden, index, published } = data
|
||||||
|
|
||||||
|
try {
|
||||||
|
// Open IndexedDB
|
||||||
|
const db = await openDB(objectType)
|
||||||
|
|
||||||
|
// Calculer l'ID complet si nécessaire
|
||||||
|
let finalId = hash
|
||||||
|
if (index !== undefined && index !== null) {
|
||||||
|
// Construire l'ID avec hash, index et version
|
||||||
|
finalId = `${hash}:${index}:${version}`
|
||||||
|
} else {
|
||||||
|
// Compter les objets avec le même hash pour déterminer l'index
|
||||||
|
const count = await countObjectsWithHash(db, hash)
|
||||||
|
finalId = `${hash}:${count}:${version}`
|
||||||
|
}
|
||||||
|
|
||||||
|
const transaction = db.transaction(['objects'], 'readwrite')
|
||||||
|
const store = transaction.objectStore('objects')
|
||||||
|
|
||||||
|
// Vérifier si l'objet existe déjà pour préserver published
|
||||||
|
const existing = await new Promise((resolve, reject) => {
|
||||||
|
const request = store.get(finalId)
|
||||||
|
request.onsuccess = () => resolve(request.result)
|
||||||
|
request.onerror = () => reject(request.error)
|
||||||
|
}).catch(() => null)
|
||||||
|
|
||||||
|
// Préserver published si existant et non fourni
|
||||||
|
const finalPublished = existing && published === false ? existing.published : (published ?? false)
|
||||||
|
|
||||||
|
const object = {
|
||||||
|
id: finalId,
|
||||||
|
hash,
|
||||||
|
hashId: hash, // Legacy field
|
||||||
|
index: index ?? (existing?.index ?? (finalId.split(':')[1] ? parseInt(finalId.split(':')[1]) : 0)),
|
||||||
|
event,
|
||||||
|
parsed,
|
||||||
|
version,
|
||||||
|
hidden,
|
||||||
|
published: finalPublished,
|
||||||
|
createdAt: event.created_at,
|
||||||
|
cachedAt: Date.now(),
|
||||||
|
pubkey: event.pubkey,
|
||||||
|
}
|
||||||
|
|
||||||
|
await new Promise((resolve, reject) => {
|
||||||
|
const request = store.put(object)
|
||||||
|
request.onsuccess = () => resolve()
|
||||||
|
request.onerror = () => reject(request.error)
|
||||||
|
})
|
||||||
|
|
||||||
|
self.postMessage({
|
||||||
|
type: 'WRITE_OBJECT_SUCCESS',
|
||||||
|
data: { hash, id: finalId, taskId },
|
||||||
|
})
|
||||||
|
} catch (error) {
|
||||||
|
throw new Error(`Failed to write object: ${error.message}`)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Count objects with same hash
|
||||||
|
*/
|
||||||
|
function countObjectsWithHash(db, hash) {
|
||||||
|
return new Promise((resolve, reject) => {
|
||||||
|
const transaction = db.transaction(['objects'], 'readonly')
|
||||||
|
const store = transaction.objectStore('objects')
|
||||||
|
const index = store.index('hash')
|
||||||
|
const request = index.openCursor(IDBKeyRange.only(hash))
|
||||||
|
let count = 0
|
||||||
|
|
||||||
|
request.onsuccess = (event) => {
|
||||||
|
const cursor = event.target.result
|
||||||
|
if (cursor) {
|
||||||
|
count++
|
||||||
|
cursor.continue()
|
||||||
|
} else {
|
||||||
|
resolve(count)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
request.onerror = () => reject(request.error)
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Handle update published status request
|
||||||
|
*/
|
||||||
|
async function handleUpdatePublished(data, taskId) {
|
||||||
|
const { objectType, id, published } = data
|
||||||
|
|
||||||
|
try {
|
||||||
|
const db = await openDB(objectType)
|
||||||
|
const transaction = db.transaction(['objects'], 'readwrite')
|
||||||
|
const store = transaction.objectStore('objects')
|
||||||
|
|
||||||
|
const existing = await new Promise((resolve, reject) => {
|
||||||
|
const request = store.get(id)
|
||||||
|
request.onsuccess = () => resolve(request.result)
|
||||||
|
request.onerror = () => reject(request.error)
|
||||||
|
})
|
||||||
|
|
||||||
|
if (!existing) {
|
||||||
|
throw new Error(`Object ${id} not found`)
|
||||||
|
}
|
||||||
|
|
||||||
|
const updated = {
|
||||||
|
...existing,
|
||||||
|
published,
|
||||||
|
}
|
||||||
|
|
||||||
|
await new Promise((resolve, reject) => {
|
||||||
|
const request = store.put(updated)
|
||||||
|
request.onsuccess = () => resolve()
|
||||||
|
request.onerror = () => reject(request.error)
|
||||||
|
})
|
||||||
|
|
||||||
|
self.postMessage({
|
||||||
|
type: 'UPDATE_PUBLISHED_SUCCESS',
|
||||||
|
data: { id, taskId },
|
||||||
|
})
|
||||||
|
} catch (error) {
|
||||||
|
throw new Error(`Failed to update published status: ${error.message}`)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Handle write multi-table request
|
||||||
|
* Transactions multi-tables : plusieurs transactions, logique de découpage côté worker
|
||||||
|
*/
|
||||||
|
async function handleWriteMultiTable(data, taskId) {
|
||||||
|
const { writes } = data // Array of { objectType, hash, event, parsed, version, hidden, index, published }
|
||||||
|
|
||||||
|
try {
|
||||||
|
// Grouper par objectType pour créer des transactions par type
|
||||||
|
const writesByType = new Map()
|
||||||
|
for (const write of writes) {
|
||||||
|
const { objectType } = write
|
||||||
|
if (!writesByType.has(objectType)) {
|
||||||
|
writesByType.set(objectType, [])
|
||||||
|
}
|
||||||
|
writesByType.get(objectType).push(write)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Exécuter les transactions par type (plusieurs transactions)
|
||||||
|
const results = []
|
||||||
|
for (const [objectType, typeWrites] of writesByType) {
|
||||||
|
const db = await openDB(objectType)
|
||||||
|
const transaction = db.transaction(['objects'], 'readwrite')
|
||||||
|
const store = transaction.objectStore('objects')
|
||||||
|
|
||||||
|
for (const write of typeWrites) {
|
||||||
|
const { hash, event, parsed, version, hidden, index, published } = write
|
||||||
|
|
||||||
|
// Calculer l'ID (logique de découpage côté worker)
|
||||||
|
let finalId = hash
|
||||||
|
if (index !== undefined && index !== null) {
|
||||||
|
finalId = `${hash}:${index}:${version}`
|
||||||
|
} else {
|
||||||
|
const count = await countObjectsWithHash(db, hash)
|
||||||
|
finalId = `${hash}:${count}:${version}`
|
||||||
|
}
|
||||||
|
|
||||||
|
const existing = await new Promise((resolve, reject) => {
|
||||||
|
const request = store.get(finalId)
|
||||||
|
request.onsuccess = () => resolve(request.result)
|
||||||
|
request.onerror = () => reject(request.error)
|
||||||
|
}).catch(() => null)
|
||||||
|
|
||||||
|
const finalPublished = existing && published === false ? existing.published : (published ?? false)
|
||||||
|
|
||||||
|
const object = {
|
||||||
|
id: finalId,
|
||||||
|
hash,
|
||||||
|
hashId: hash,
|
||||||
|
index: index ?? (existing?.index ?? 0),
|
||||||
|
event,
|
||||||
|
parsed,
|
||||||
|
version,
|
||||||
|
hidden,
|
||||||
|
published: finalPublished,
|
||||||
|
createdAt: event.created_at,
|
||||||
|
cachedAt: Date.now(),
|
||||||
|
pubkey: event.pubkey,
|
||||||
|
}
|
||||||
|
|
||||||
|
await new Promise((resolve, reject) => {
|
||||||
|
const request = store.put(object)
|
||||||
|
request.onsuccess = () => resolve()
|
||||||
|
request.onerror = () => reject(request.error)
|
||||||
|
})
|
||||||
|
|
||||||
|
results.push({ objectType, id: finalId, hash })
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
self.postMessage({
|
||||||
|
type: 'WRITE_MULTI_TABLE_SUCCESS',
|
||||||
|
data: { results, taskId },
|
||||||
|
})
|
||||||
|
} catch (error) {
|
||||||
|
throw new Error(`Failed to write multi-table: ${error.message}`)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Handle create notification request
|
||||||
|
*/
|
||||||
|
async function handleCreateNotification(data, taskId) {
|
||||||
|
const { type, objectType, objectId, eventId, notificationData } = data
|
||||||
|
|
||||||
|
try {
|
||||||
|
const db = await openNotificationDB()
|
||||||
|
const transaction = db.transaction(['notifications'], 'readwrite')
|
||||||
|
const store = transaction.objectStore('notifications')
|
||||||
|
|
||||||
|
// Vérifier si la notification existe déjà
|
||||||
|
const index = store.index('eventId')
|
||||||
|
const existing = await new Promise((resolve, reject) => {
|
||||||
|
const request = index.get(eventId)
|
||||||
|
request.onsuccess = () => resolve(request.result)
|
||||||
|
request.onerror = () => reject(request.error)
|
||||||
|
}).catch(() => null)
|
||||||
|
|
||||||
|
if (existing) {
|
||||||
|
// Notification déjà existante
|
||||||
|
self.postMessage({
|
||||||
|
type: 'CREATE_NOTIFICATION_SUCCESS',
|
||||||
|
data: { eventId, taskId, alreadyExists: true },
|
||||||
|
})
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
const notification = {
|
||||||
|
id: `${type}_${objectType}_${objectId}_${eventId}_${Date.now()}`,
|
||||||
|
type,
|
||||||
|
objectType,
|
||||||
|
objectId,
|
||||||
|
eventId,
|
||||||
|
timestamp: Date.now(),
|
||||||
|
read: false,
|
||||||
|
data: notificationData,
|
||||||
|
title: notificationData?.title,
|
||||||
|
message: notificationData?.message,
|
||||||
|
articleId: notificationData?.articleId,
|
||||||
|
articleTitle: notificationData?.articleTitle,
|
||||||
|
amount: notificationData?.amount,
|
||||||
|
fromPubkey: notificationData?.fromPubkey,
|
||||||
|
}
|
||||||
|
|
||||||
|
await new Promise((resolve, reject) => {
|
||||||
|
const request = store.add(notification)
|
||||||
|
request.onsuccess = () => resolve()
|
||||||
|
request.onerror = () => reject(request.error)
|
||||||
|
})
|
||||||
|
|
||||||
|
self.postMessage({
|
||||||
|
type: 'CREATE_NOTIFICATION_SUCCESS',
|
||||||
|
data: { eventId, taskId },
|
||||||
|
})
|
||||||
|
} catch (error) {
|
||||||
|
throw new Error(`Failed to create notification: ${error.message}`)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Handle log publication request
|
||||||
|
*/
|
||||||
|
async function handleLogPublication(data, taskId) {
|
||||||
|
const { eventId, relayUrl, success, error, objectType, objectId } = data
|
||||||
|
|
||||||
|
try {
|
||||||
|
const db = await openPublishLogDB()
|
||||||
|
const transaction = db.transaction(['publications'], 'readwrite')
|
||||||
|
const store = transaction.objectStore('publications')
|
||||||
|
|
||||||
|
const entry = {
|
||||||
|
id: `${eventId}_${relayUrl}_${Date.now()}`,
|
||||||
|
eventId,
|
||||||
|
relayUrl,
|
||||||
|
success,
|
||||||
|
error,
|
||||||
|
timestamp: Date.now(),
|
||||||
|
objectType,
|
||||||
|
objectId,
|
||||||
|
}
|
||||||
|
|
||||||
|
await new Promise((resolve, reject) => {
|
||||||
|
const request = store.add(entry)
|
||||||
|
request.onsuccess = () => resolve()
|
||||||
|
request.onerror = () => reject(request.error)
|
||||||
|
})
|
||||||
|
// Pas de réponse pour les logs (fire and forget)
|
||||||
|
} catch (error) {
|
||||||
|
// Don't throw for logs, just log the error
|
||||||
|
console.error('[WriteWorker] Error logging publication:', error)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Open IndexedDB for object type
|
||||||
|
*/
|
||||||
|
function openDB(objectType) {
|
||||||
|
return new Promise((resolve, reject) => {
|
||||||
|
const dbName = `nostr_${objectType}_cache`
|
||||||
|
const version = DB_VERSIONS[objectType] ?? 1
|
||||||
|
|
||||||
|
const request = indexedDB.open(dbName, version)
|
||||||
|
|
||||||
|
request.onerror = () => reject(request.error)
|
||||||
|
request.onsuccess = () => resolve(request.result)
|
||||||
|
|
||||||
|
request.onupgradeneeded = (event) => {
|
||||||
|
const db = event.target.result
|
||||||
|
if (!db.objectStoreNames.contains('objects')) {
|
||||||
|
const store = db.createObjectStore('objects', { keyPath: 'id' })
|
||||||
|
store.createIndex('hash', 'hash', { unique: false })
|
||||||
|
store.createIndex('index', 'index', { unique: false })
|
||||||
|
store.createIndex('published', 'published', { unique: false })
|
||||||
|
} else {
|
||||||
|
// Migration : ajouter l'index published si nécessaire
|
||||||
|
const transaction = event.target.transaction
|
||||||
|
const store = transaction.objectStore('objects')
|
||||||
|
if (!store.indexNames.contains('published')) {
|
||||||
|
store.createIndex('published', 'published', { unique: false })
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Open IndexedDB for notifications
|
||||||
|
*/
|
||||||
|
function openNotificationDB() {
|
||||||
|
return new Promise((resolve, reject) => {
|
||||||
|
const request = indexedDB.open('nostr_notifications', 1)
|
||||||
|
|
||||||
|
request.onerror = () => reject(request.error)
|
||||||
|
request.onsuccess = () => resolve(request.result)
|
||||||
|
|
||||||
|
request.onupgradeneeded = (event) => {
|
||||||
|
const db = event.target.result
|
||||||
|
if (!db.objectStoreNames.contains('notifications')) {
|
||||||
|
const store = db.createObjectStore('notifications', { keyPath: 'id' })
|
||||||
|
store.createIndex('type', 'type', { unique: false })
|
||||||
|
store.createIndex('objectId', 'objectId', { unique: false })
|
||||||
|
store.createIndex('eventId', 'eventId', { unique: false })
|
||||||
|
store.createIndex('timestamp', 'timestamp', { unique: false })
|
||||||
|
store.createIndex('read', 'read', { unique: false })
|
||||||
|
store.createIndex('objectType', 'objectType', { unique: false })
|
||||||
|
}
|
||||||
|
}
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
 * Open IndexedDB for publish log
 */
function openPublishLogDB() {
  return new Promise((resolve, reject) => {
    const request = indexedDB.open('nostr_publish_log', 1)

    request.onerror = () => reject(request.error)
    request.onsuccess = () => resolve(request.result)

    request.onupgradeneeded = (event) => {
      const db = event.target.result
      if (!db.objectStoreNames.contains('publications')) {
        const store = db.createObjectStore('publications', { keyPath: 'id', autoIncrement: true })
        store.createIndex('eventId', 'eventId', { unique: false })
        store.createIndex('relayUrl', 'relayUrl', { unique: false })
        store.createIndex('timestamp', 'timestamp', { unique: false })
        store.createIndex('success', 'success', { unique: false })
      }
    }
  })
}

// Notify main thread that worker is ready
self.postMessage({ type: 'WORKER_READY' })

console.log('[WriteWorker] Worker loaded')
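The `publications` store records one row per (event, relay) attempt, which is what makes the ok/ko publication status computable: fold the rows into a per-event summary of how many relays accepted the event. A sketch assuming only the indexed fields shown above (`summarizePublishLog` itself is a hypothetical helper, not part of the worker):

```javascript
// Fold publish-log rows into a per-event summary: for each eventId, count the
// relays that accepted (ok) or rejected (ko) the event.
function summarizePublishLog(rows) {
  const summary = new Map()
  for (const { eventId, success } of rows) {
    const entry = summary.get(eventId) || { ok: 0, ko: 0 }
    if (success) entry.ok += 1
    else entry.ko += 1
    summary.set(eventId, entry)
  }
  return summary
}
```

A summary like this is enough to drive the `published` status on the main records: an event with `ok >= 1` can be marked `'ok'`, while one with only failures stays `'ko'` and is retried later.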
@ -1,43 +0,0 @@
# Summary of the update via Git

## ✅ Actions performed

1. **Initialized the Git repository** on the remote server
2. **Stashed local changes** (including untracked files)
3. **Pulled from the main branch** of the remote repository
4. **Installed dependencies** (`npm ci`)
5. **Built the application** (`npm run build`)
6. **Restarted the** zapwall **service**

## 📊 Final state

- ✅ **zapwall service**: active and running
- ✅ **Port 3001**: listening
- ✅ **Application**: built and ready
- ✅ **Latest commit**: `8f855fa Add NIP-95 upload endpoint documentation to README`

## 🔄 For future updates

Simply use the script:

```bash
./update-remote-git.sh
```

Or manually on the server:

```bash
ssh debian@92.243.27.35
cd /var/www/zapwall.fr
git stash
git pull origin main
npm ci
npm run build
sudo systemctl restart zapwall
```

## 📝 Notes

- Local changes were saved in the stash
- To list stashes: `ssh debian@92.243.27.35 'cd /var/www/zapwall.fr && git stash list'`
- To restore a stash: `ssh debian@92.243.27.35 'cd /var/www/zapwall.fr && git stash pop'`