Compare commits

4 Commits: 5fd83944fc ... e69ed0e867

| Author | SHA1 | Date |
|---|---|---|
|  | e69ed0e867 |  |
|  | de770a4f91 |  |
|  | 6507adc36d |  |
|  | 7919d34dfa |  |

OFFLINE_STORAGE_FEASIBILITY.md
@@ -0,0 +1,236 @@

# Offline Storage Feasibility Assessment

## Summary

**Difficulty: Medium** - The implementation is straightforward thanks to Automerge's built-in support for storage adapters, but requires careful integration with the existing sync architecture.

## Current Architecture

1. **Client-side**: Uses `@automerge/automerge-repo` with `CloudflareNetworkAdapter` for WebSocket sync
2. **Server-side**: `AutomergeDurableObject` stores documents in R2 and handles WebSocket connections
3. **Persistence flow**:
   - Client saves to worker via POST `/room/:roomId`
   - Worker persists to R2 (throttled to every 2 seconds)
   - Client loads initial data from server via GET `/room/:roomId`

## What's Needed

### 1. Add IndexedDB Storage Adapter (Easy)

Automerge Repo supports storage adapters out of the box. You'll need to:

- Install `@automerge/automerge-repo-storage-indexeddb` (if available) or create a custom IndexedDB adapter
- Add the storage adapter to the Repo configuration alongside the network adapter
- The Repo will automatically persist document changes to IndexedDB

**Code changes needed:**

```typescript
// In useAutomergeSyncRepo.ts
import { IndexedDBStorageAdapter } from "@automerge/automerge-repo-storage-indexeddb"

const [repo] = useState(() => {
  const adapter = new CloudflareNetworkAdapter(workerUrl, roomId, applyJsonSyncData)
  const storageAdapter = new IndexedDBStorageAdapter() // Add this
  return new Repo({
    network: [adapter],
    storage: storageAdapter // Add this (Repo takes a single storage adapter, not an array)
  })
})
```

### 2. Load from Local Storage on Startup (Medium)

Modify the initialization logic to:
- Check IndexedDB for existing document data
- Load from IndexedDB first (for instant offline access)
- Then sync with server when online
- Automerge will automatically merge local and remote changes

**Code changes needed:**

```typescript
// In useAutomergeSyncRepo.ts - modify initializeHandle
const initializeHandle = async () => {
  // Check if document exists in IndexedDB first
  const localDoc = await repo.find(roomId) // This will load from IndexedDB if available

  // Then sync with server (if online)
  if (navigator.onLine) {
    // Existing server sync logic
  }
}
```

### 3. Handle Online/Offline Transitions (Medium)

- Detect network status changes
- When coming online, ensure sync happens
- The existing `CloudflareNetworkAdapter` already handles reconnection, but you may want to add explicit sync triggers

**Code changes needed:**

```typescript
// Add network status listener
useEffect(() => {
  const handleOnline = () => {
    console.log('🌐 Back online - syncing with server')
    // Trigger sync - Automerge will handle merging automatically
    if (handle) {
      // The network adapter will automatically reconnect and sync
    }
  }

  window.addEventListener('online', handleOnline)
  return () => window.removeEventListener('online', handleOnline)
}, [handle])
```

### 4. Document ID Consistency (Important)

Currently, the code creates a new document handle each time (`repo.create()`). For local storage to work properly, you need:
- Consistent document IDs per room
- The challenge: Automerge requires specific document ID formats (like `automerge:xxxxx`)
- **Solution options:**
  1. Use `repo.find()` with a properly formatted Automerge document ID (derive from roomId)
  2. Store a mapping of roomId → documentId in IndexedDB
  3. Use a deterministic way to generate document IDs from roomId

**Code changes needed:**

```typescript
// Option 1: Generate a deterministic Automerge document ID from roomId
const documentId = `automerge:${roomId}` // May need proper formatting
const handle = repo.find(documentId) // This will load from IndexedDB or create new

// Option 2: Store a mapping in IndexedDB
const storedMapping = await getDocumentIdMapping(roomId)
const mappedId = storedMapping || generateNewDocumentId()
const mappedHandle = repo.find(mappedId)
await saveDocumentIdMapping(roomId, mappedId)
```

**Note**: The current code comment says "We can't use repo.find() with a custom ID because Automerge requires specific document ID formats" - this needs to be resolved. You may need to:
- Use Automerge's document ID generation but store the mapping (a sketch of the mapping helpers follows)
- Or use a deterministic algorithm to convert roomId to a valid Automerge document ID format

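A minimal sketch of the `getDocumentIdMapping` / `saveDocumentIdMapping` helpers referenced in Option 2 above; the database and store names are illustrative:

```typescript
// Persist the roomId → documentId mapping in a small IndexedDB store so the
// same Automerge document is reopened on every visit.
const MAP_DB = 'canvas-doc-id-map'

function openMapDB(): Promise<IDBDatabase> {
  return new Promise((resolve, reject) => {
    const req = indexedDB.open(MAP_DB, 1)
    req.onupgradeneeded = () => req.result.createObjectStore('mappings', { keyPath: 'roomId' })
    req.onsuccess = () => resolve(req.result)
    req.onerror = () => reject(req.error)
  })
}

async function getDocumentIdMapping(roomId: string): Promise<string | undefined> {
  const db = await openMapDB()
  return new Promise(resolve => {
    const req = db.transaction('mappings').objectStore('mappings').get(roomId)
    req.onsuccess = () => resolve(req.result?.documentId)
  })
}

async function saveDocumentIdMapping(roomId: string, documentId: string): Promise<void> {
  const db = await openMapDB()
  db.transaction('mappings', 'readwrite').objectStore('mappings').put({ roomId, documentId })
}
```
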
## Benefits

1. **Instant Offline Access**: Users can immediately see and edit their data without waiting for server response
2. **Automatic Merging**: Automerge's CRDT nature means local and remote changes merge automatically without conflicts
3. **Better UX**: No loading spinners when offline - data is instantly available
4. **Resilience**: Works even if server is temporarily unavailable

## Challenges & Considerations

### 1. Storage Quota Limits
- IndexedDB has browser-specific limits (typically 50% of disk space)
- Large documents could hit quota limits
- **Solution**: Monitor storage usage and implement cleanup for old documents (see the sketch below)

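A minimal monitoring sketch using the Storage API; the 80% threshold is an arbitrary illustration:

```typescript
// Warn when IndexedDB usage approaches the browser's quota.
async function warnIfNearQuota(): Promise<void> {
  if (!('storage' in navigator) || !navigator.storage.estimate) return
  const { usage = 0, quota = 0 } = await navigator.storage.estimate()
  if (quota > 0 && usage / quota > 0.8) {
    console.warn(`Storage nearly full: ${(usage / 1e6).toFixed(1)} MB of ${(quota / 1e6).toFixed(1)} MB used`)
    // Candidate cleanup: delete documents that haven't been accessed recently
  }
}
```
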
### 2. Document ID Management
- Need to ensure consistent document IDs per room
- Current code uses `repo.create()` which generates new IDs
- **Solution**: Use `repo.find(roomId)` with a consistent ID format

### 3. Initial Load Strategy
- Should load from IndexedDB first (fast) or server first (fresh)?
- **Recommendation**: Load from IndexedDB first for instant UI, then sync with server in background

### 4. Conflict Resolution
- Automerge handles this automatically, but you may want to show users when their offline changes were merged
- **Solution**: Use Automerge's change tracking to show merge notifications

### 5. Storage Adapter Availability
- Need to verify if `@automerge/automerge-repo-storage-indexeddb` exists
- If not, you'll need to create a custom adapter (still straightforward)

## Implementation Steps

1. **Research**: Check if `@automerge/automerge-repo-storage-indexeddb` package exists
2. **Install**: Add storage adapter package or create custom adapter
3. **Modify Repo Setup**: Add storage adapter to Repo configuration
4. **Update Document Loading**: Use `repo.find()` instead of `repo.create()` for consistent IDs
5. **Add Network Detection**: Listen for online/offline events
6. **Test**: Verify offline editing works and syncs correctly when back online
7. **Handle Edge Cases**: Storage quota, document size limits, etc.

## Estimated Effort

- **Research & Setup**: 1-2 hours
- **Implementation**: 4-6 hours
- **Testing**: 2-3 hours
- **Total**: ~1 day of focused work

## Code Locations to Modify

1. `src/automerge/useAutomergeSyncRepo.ts` - Main sync hook (add storage adapter, modify initialization)
2. `src/automerge/CloudflareAdapter.ts` - Network adapter (may need minor changes for offline detection)
3. Potentially create: `src/automerge/IndexedDBStorageAdapter.ts` - If custom adapter needed

## Conclusion

This is a **medium-complexity** feature that's very feasible. Automerge's architecture is designed for this exact use case, and the main work is:
1. Adding the storage adapter (straightforward)
2. Ensuring consistent document IDs (important fix)
3. Handling online/offline transitions (moderate complexity)

The biggest benefit is that Automerge's CRDT nature means you don't need to write complex merge logic - it handles conflict resolution automatically.

---

## Related: Google Data Sovereignty

Beyond canvas document storage, we also support importing and securely storing Google Workspace data locally. See **[docs/GOOGLE_DATA_SOVEREIGNTY.md](./docs/GOOGLE_DATA_SOVEREIGNTY.md)** for the complete architecture covering:

- **Gmail** - Import and encrypt emails locally
- **Drive** - Import and encrypt documents locally
- **Photos** - Import thumbnails with on-demand full resolution
- **Calendar** - Import and encrypt events locally

Key principles:
1. **Local-first**: All data stored in encrypted IndexedDB
2. **User-controlled encryption**: Keys derived from WebCrypto auth, never leave browser
3. **Selective sharing**: Choose what to share to canvas boards
4. **Optional R2 backup**: Encrypted cloud backup (you hold the keys)

This builds on the same IndexedDB + Automerge foundation described above.

docs/GOOGLE_DATA_SOVEREIGNTY.md
@@ -0,0 +1,913 @@

# Google Data Sovereignty: Local-First Secure Storage

This document outlines the architecture for securely importing, storing, and optionally sharing Google Workspace data (Gmail, Drive, Photos, Calendar) using a **local-first, data sovereign** approach.

## Overview

**Philosophy**: Your data should be yours. Import it locally, encrypt it client-side, and choose when/what to share.

```
┌──────────────────────────────────────────────────────────────────────────┐
│                   USER'S BROWSER (Data Sovereign Zone)                    │
├──────────────────────────────────────────────────────────────────────────┤
│                                                                           │
│  ┌─────────────┐    ┌────────────────────────────────────────────┐       │
│  │ Google APIs │───>│           Local Processing Layer           │       │
│  │ (OAuth 2.0) │    │  ├── Fetch data                            │       │
│  └─────────────┘    │  ├── Encrypt with user's WebCrypto keys    │       │
│                     │  └── Store to IndexedDB                    │       │
│                     └──────────────────────┬─────────────────────┘       │
│                                            │                              │
│  ┌─────────────────────────────────────────┴────────────────────┐        │
│  │                 IndexedDB Encrypted Storage                   │       │
│  │  ├── gmail_messages      (encrypted blobs)                    │       │
│  │  ├── drive_documents     (encrypted blobs)                    │       │
│  │  ├── photos_media        (encrypted references)               │       │
│  │  ├── calendar_events     (encrypted data)                     │       │
│  │  └── encryption_metadata (key derivation info)                │       │
│  └─────────────────────────┬──────────────────────────────────────┘      │
│                            │                                              │
│  ┌─────────────────────────┴──────────────────────┐                      │
│  │    Share Decision Layer (User Controlled)      │                      │
│  │  ├── Keep Private   (local only)               │                      │
│  │  ├── Share to Board (Automerge sync)           │                      │
│  │  └── Backup to R2   (encrypted cloud backup)   │                      │
│  └────────────────────────────────────────────────┘                      │
│                                                                           │
└──────────────────────────────────────────────────────────────────────────┘
```

## Browser Storage Capabilities & Limitations

### IndexedDB Storage

| Browser | Default Quota | Max Quota | Persistence |
|---------|--------------|-----------|-------------|
| Chrome/Edge | 60% of disk | Unlimited* | Persistent with permission |
| Firefox | 10% up to 10GB | 50% of disk | Persistent with permission |
| Safari | 1GB (lax) | ~1GB per origin | Non-persistent (7-day eviction) |

\*Chrome "Unlimited" requires `navigator.storage.persist()` permission

### Storage API Persistence

```typescript
// Request persistent storage (prevents automatic eviction)
async function requestPersistentStorage(): Promise<boolean> {
  if (navigator.storage && navigator.storage.persist) {
    const isPersisted = await navigator.storage.persist();
    console.log(`Persistent storage ${isPersisted ? 'granted' : 'denied'}`);
    return isPersisted;
  }
  return false;
}

// Check current storage quota
async function checkStorageQuota(): Promise<{used: number, quota: number}> {
  if (navigator.storage && navigator.storage.estimate) {
    const estimate = await navigator.storage.estimate();
    return {
      used: estimate.usage || 0,
      quota: estimate.quota || 0
    };
  }
  return { used: 0, quota: 0 };
}
```

### Safari's 7-Day Eviction Rule

**CRITICAL for Safari users**: Safari evicts IndexedDB data after 7 days of non-use.

**Mitigations**:
1. Use a Service Worker with periodic background sync to "touch" data
2. Prompt Safari users to add to Home Screen (PWA mode bypasses some restrictions)
3. Automatically sync important data to R2 backup
4. Show clear warnings about Safari limitations

```typescript
// Detect Safari's storage limitations
function hasSafariLimitations(): boolean {
  const isSafari = /^((?!chrome|android).)*safari/i.test(navigator.userAgent);
  const isIOS = /iPad|iPhone|iPod/.test(navigator.userAgent);
  return isSafari || isIOS;
}

// Register touch activity to prevent eviction
// (openDatabase() stands in for the app's DB opener, e.g. initGoogleDataDB below)
async function touchLocalData(): Promise<void> {
  const db = await openDatabase();
  const tx = db.transaction('metadata', 'readwrite');
  tx.objectStore('metadata').put({
    key: 'last_accessed',
    timestamp: Date.now()
  });
}
```

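Short of a full Service Worker with background sync, one hedged way to wire this in is to touch the data on startup and whenever the tab regains visibility (the function name is illustrative):

```typescript
// Keep Safari's "last used" clock resetting while the app is in use.
export function registerEvictionGuard(): void {
  void touchLocalData();
  document.addEventListener('visibilitychange', () => {
    if (document.visibilityState === 'visible') void touchLocalData();
  });
}
```
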
## Data Types & Storage Strategies

### 1. Gmail Messages

```typescript
interface EncryptedEmailStore {
  id: string;                      // Gmail message ID
  threadId: string;                // Thread ID for grouping
  encryptedSubject: ArrayBuffer;   // AES-GCM encrypted
  encryptedBody: ArrayBuffer;      // AES-GCM encrypted
  encryptedFrom: ArrayBuffer;      // Sender info
  encryptedTo: ArrayBuffer[];      // Recipients
  date: number;                    // Timestamp (unencrypted for sorting)
  labels: string[];                // Gmail labels (encrypted or not based on sensitivity)
  hasAttachments: boolean;         // Flag only, attachments stored separately
  snippet: ArrayBuffer;            // Encrypted preview

  // Metadata for search (encrypted bloom filter or encrypted index)
  searchIndex: ArrayBuffer;

  // Sync metadata
  syncedAt: number;
  localOnly: boolean;              // Not yet synced to any external storage
}

// Storage estimate per email:
// - Average email: ~20KB raw → ~25KB encrypted
// - With attachments: varies, but reference stored, not full attachment
// - 10,000 emails ≈ 250MB
```

### 2. Google Drive Documents

```typescript
interface EncryptedDriveDocument {
  id: string;                      // Drive file ID
  encryptedName: ArrayBuffer;
  encryptedMimeType: ArrayBuffer;
  encryptedContent: ArrayBuffer;   // For text-based docs
  encryptedPreview: ArrayBuffer;   // Thumbnail or preview

  // Large files: store reference, not content
  contentStrategy: 'inline' | 'reference' | 'chunked';
  chunks?: string[];               // IDs of content chunks if chunked

  // Hierarchy
  parentId: string | null;
  path: ArrayBuffer;               // Encrypted path string

  // Sharing & permissions (for UI display)
  isShared: boolean;

  modifiedTime: number;
  size: number;                    // Unencrypted for quota management

  syncedAt: number;
}

// Storage considerations:
// - Google Docs: Convert to markdown/HTML, typically 10-100KB
// - Spreadsheets: JSON export, 100KB-10MB depending on size
// - PDFs: Store reference only, load on demand
// - Images: Thumbnail locally, full resolution on demand
```

### 3. Google Photos

```typescript
interface EncryptedPhotoReference {
  id: string;                      // Photos media item ID
  encryptedFilename: ArrayBuffer;
  encryptedDescription: ArrayBuffer;

  // Thumbnails stored locally (encrypted)
  thumbnail: {
    width: number;
    height: number;
    encryptedData: ArrayBuffer;    // Base64 or blob
  };

  // Full resolution: reference only (fetch on demand)
  fullResolution: {
    width: number;
    height: number;
    // NOT storing full image - too large
    // Fetch via API when user requests
  };

  mediaType: 'image' | 'video';
  creationTime: number;

  // Album associations
  albumIds: string[];

  // Location data (highly sensitive - always encrypted)
  encryptedLocation?: ArrayBuffer;

  syncedAt: number;
}

// Storage strategy:
// - Thumbnails: ~50KB each, store locally
// - Full images: NOT stored locally (too large)
// - 1,000 photos thumbnails ≈ 50MB
// - Full resolution loaded via API on demand
```

### 4. Google Calendar Events

```typescript
interface EncryptedCalendarEvent {
  id: string;                      // Calendar event ID
  calendarId: string;

  encryptedSummary: ArrayBuffer;
  encryptedDescription: ArrayBuffer;
  encryptedLocation: ArrayBuffer;

  // Time data (unencrypted for query/sort performance)
  startTime: number;
  endTime: number;
  isAllDay: boolean;
  timezone: string;

  // Recurrence
  isRecurring: boolean;
  encryptedRecurrence?: ArrayBuffer;

  // Attendees (encrypted)
  encryptedAttendees: ArrayBuffer;

  // Reminders
  reminders: { method: string; minutes: number }[];

  // Meeting links (encrypted - sensitive)
  encryptedMeetingLink?: ArrayBuffer;

  syncedAt: number;
}

// Storage estimate:
// - Average event: ~5KB encrypted
// - 2 years of events (~3000): ~15MB
```

## Encryption Strategy

### Key Derivation

Using the existing WebCrypto infrastructure, derive data encryption keys from the user's master key:

```typescript
// Derive a data-specific encryption key from the master key.
// Note: exporting the master key here requires it to have been created as extractable.
async function deriveDataEncryptionKey(
  masterKey: CryptoKey,
  purpose: 'gmail' | 'drive' | 'photos' | 'calendar'
): Promise<CryptoKey> {
  const encoder = new TextEncoder();
  const purposeBytes = encoder.encode(`canvas-data-${purpose}`);

  // Import master key for HKDF
  const baseKey = await crypto.subtle.importKey(
    'raw',
    await crypto.subtle.exportKey('raw', masterKey),
    'HKDF',
    false,
    ['deriveKey']
  );

  // Derive purpose-specific key
  return await crypto.subtle.deriveKey(
    {
      name: 'HKDF',
      hash: 'SHA-256',
      salt: purposeBytes,
      info: new ArrayBuffer(0)
    },
    baseKey,
    { name: 'AES-GCM', length: 256 },
    false,
    ['encrypt', 'decrypt']
  );
}
```

### Encryption/Decryption

```typescript
// Encrypt data before storing
async function encryptData(
  data: string | ArrayBuffer,
  key: CryptoKey
): Promise<{encrypted: ArrayBuffer, iv: Uint8Array}> {
  const iv = crypto.getRandomValues(new Uint8Array(12)); // 96-bit IV for AES-GCM

  const dataBuffer = typeof data === 'string'
    ? new TextEncoder().encode(data)
    : data;

  const encrypted = await crypto.subtle.encrypt(
    { name: 'AES-GCM', iv },
    key,
    dataBuffer
  );

  return { encrypted, iv };
}

// Decrypt data when reading
async function decryptData(
  encrypted: ArrayBuffer,
  iv: Uint8Array,
  key: CryptoKey
): Promise<ArrayBuffer> {
  return await crypto.subtle.decrypt(
    { name: 'AES-GCM', iv },
    key,
    encrypted
  );
}
```

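For orientation, a round trip looks like this; the derived key comes from `deriveDataEncryptionKey` above, and the IV must be stored alongside each ciphertext:

```typescript
// Encrypt a string, then decrypt it back - the IV travels with the ciphertext.
async function roundTripExample(masterKey: CryptoKey): Promise<void> {
  const key = await deriveDataEncryptionKey(masterKey, 'gmail');
  const { encrypted, iv } = await encryptData('Subject: hello', key);
  const plainBuffer = await decryptData(encrypted, iv, key);
  console.log(new TextDecoder().decode(plainBuffer)); // "Subject: hello"
}
```
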
## IndexedDB Schema

```typescript
// Database schema for encrypted Google data
const GOOGLE_DATA_DB = 'canvas-google-data';
const DB_VERSION = 1;

interface GoogleDataSchema {
  gmail: {
    key: string; // message ID
    indexes: ['threadId', 'date', 'syncedAt'];
  };
  drive: {
    key: string; // file ID
    indexes: ['parentId', 'modifiedTime', 'mimeType'];
  };
  photos: {
    key: string; // media item ID
    indexes: ['creationTime', 'mediaType'];
  };
  calendar: {
    key: string; // event ID
    indexes: ['calendarId', 'startTime', 'endTime'];
  };
  syncMetadata: {
    key: string; // 'gmail' | 'drive' | 'photos' | 'calendar'
    // Stores last sync token, sync progress, etc.
  };
  encryptionMeta: { // matches the 'encryptionMeta' store created below
    key: string; // purpose
    // Stores IV, salt for key derivation
  };
}

async function initGoogleDataDB(): Promise<IDBDatabase> {
  return new Promise((resolve, reject) => {
    const request = indexedDB.open(GOOGLE_DATA_DB, DB_VERSION);

    request.onerror = () => reject(request.error);
    request.onsuccess = () => resolve(request.result);

    request.onupgradeneeded = (event) => {
      const db = (event.target as IDBOpenDBRequest).result;

      // Gmail store
      if (!db.objectStoreNames.contains('gmail')) {
        const gmailStore = db.createObjectStore('gmail', { keyPath: 'id' });
        gmailStore.createIndex('threadId', 'threadId', { unique: false });
        gmailStore.createIndex('date', 'date', { unique: false });
        gmailStore.createIndex('syncedAt', 'syncedAt', { unique: false });
      }

      // Drive store
      if (!db.objectStoreNames.contains('drive')) {
        const driveStore = db.createObjectStore('drive', { keyPath: 'id' });
        driveStore.createIndex('parentId', 'parentId', { unique: false });
        driveStore.createIndex('modifiedTime', 'modifiedTime', { unique: false });
      }

      // Photos store
      if (!db.objectStoreNames.contains('photos')) {
        const photosStore = db.createObjectStore('photos', { keyPath: 'id' });
        photosStore.createIndex('creationTime', 'creationTime', { unique: false });
        photosStore.createIndex('mediaType', 'mediaType', { unique: false });
      }

      // Calendar store
      if (!db.objectStoreNames.contains('calendar')) {
        const calendarStore = db.createObjectStore('calendar', { keyPath: 'id' });
        calendarStore.createIndex('calendarId', 'calendarId', { unique: false });
        calendarStore.createIndex('startTime', 'startTime', { unique: false });
      }

      // Sync metadata
      if (!db.objectStoreNames.contains('syncMetadata')) {
        db.createObjectStore('syncMetadata', { keyPath: 'service' });
      }

      // Encryption metadata
      if (!db.objectStoreNames.contains('encryptionMeta')) {
        db.createObjectStore('encryptionMeta', { keyPath: 'purpose' });
      }
    };
  });
}
```

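The code in this document wraps raw `IDBRequest`s in promises in several places; a small helper (my addition, not in the original code) keeps that pattern consistent and adds error propagation:

```typescript
// Promisify an IDBRequest, propagating errors instead of silently hanging.
function idbRequest<T>(req: IDBRequest<T>): Promise<T> {
  return new Promise((resolve, reject) => {
    req.onsuccess = () => resolve(req.result);
    req.onerror = () => reject(req.error);
  });
}

// Example: read sync metadata for a service
async function getSyncMetadata(db: IDBDatabase, service: string) {
  return idbRequest(db.transaction('syncMetadata').objectStore('syncMetadata').get(service));
}
```
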
## Google OAuth & API Integration

### OAuth 2.0 Scopes

```typescript
const GOOGLE_SCOPES = {
  // Read-only access (data sovereignty - we import, not modify)
  gmail: 'https://www.googleapis.com/auth/gmail.readonly',
  drive: 'https://www.googleapis.com/auth/drive.readonly',
  photos: 'https://www.googleapis.com/auth/photoslibrary.readonly',
  calendar: 'https://www.googleapis.com/auth/calendar.readonly',

  // Profile for user identification
  profile: 'https://www.googleapis.com/auth/userinfo.profile',
  email: 'https://www.googleapis.com/auth/userinfo.email'
};

// Selective scope request - user chooses what to import
function getRequestedScopes(services: string[]): string {
  const scopes = [GOOGLE_SCOPES.profile, GOOGLE_SCOPES.email];

  services.forEach(service => {
    if (GOOGLE_SCOPES[service as keyof typeof GOOGLE_SCOPES]) {
      scopes.push(GOOGLE_SCOPES[service as keyof typeof GOOGLE_SCOPES]);
    }
  });

  return scopes.join(' ');
}
```

### OAuth Flow with PKCE

```typescript
interface GoogleAuthState {
  codeVerifier: string;
  redirectUri: string;
  state: string;
}

async function initiateGoogleAuth(services: string[]): Promise<void> {
  const codeVerifier = generateCodeVerifier();
  const codeChallenge = await generateCodeChallenge(codeVerifier);
  const state = crypto.randomUUID();

  // Store state for verification
  sessionStorage.setItem('google_auth_state', JSON.stringify({
    codeVerifier,
    state,
    redirectUri: window.location.origin + '/oauth/google/callback'
  }));

  const params = new URLSearchParams({
    client_id: import.meta.env.VITE_GOOGLE_CLIENT_ID,
    redirect_uri: window.location.origin + '/oauth/google/callback',
    response_type: 'code',
    scope: getRequestedScopes(services),
    access_type: 'offline', // Get refresh token
    prompt: 'consent',
    code_challenge: codeChallenge,
    code_challenge_method: 'S256',
    state
  });

  window.location.href = `https://accounts.google.com/o/oauth2/v2/auth?${params}`;
}

// PKCE helpers
function generateCodeVerifier(): string {
  const array = new Uint8Array(32);
  crypto.getRandomValues(array);
  return base64UrlEncode(array);
}

async function generateCodeChallenge(verifier: string): Promise<string> {
  const encoder = new TextEncoder();
  const data = encoder.encode(verifier);
  const hash = await crypto.subtle.digest('SHA-256', data);
  return base64UrlEncode(new Uint8Array(hash));
}

// Base64url encoding (referenced above but previously undefined)
function base64UrlEncode(bytes: Uint8Array): string {
  return btoa(String.fromCharCode(...bytes))
    .replace(/\+/g, '-')
    .replace(/\//g, '_')
    .replace(/=+$/, '');
}
```

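The callback half of the flow isn't shown in the original; a hedged sketch of exchanging the authorization code at Google's standard token endpoint, with the PKCE verifier retrieved from sessionStorage:

```typescript
// At /oauth/google/callback: exchange the authorization code for tokens.
async function handleGoogleCallback(): Promise<void> {
  const url = new URL(window.location.href);
  const code = url.searchParams.get('code');
  const returnedState = url.searchParams.get('state');
  const saved = JSON.parse(sessionStorage.getItem('google_auth_state') || '{}');

  if (!code || returnedState !== saved.state) throw new Error('OAuth state mismatch');

  const response = await fetch('https://oauth2.googleapis.com/token', {
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: new URLSearchParams({
      client_id: import.meta.env.VITE_GOOGLE_CLIENT_ID,
      code,
      code_verifier: saved.codeVerifier,
      grant_type: 'authorization_code',
      redirect_uri: saved.redirectUri
    })
  });
  const tokens = await response.json(); // { access_token, refresh_token?, expires_in, ... }
  // Encrypt and persist via storeGoogleTokens (below)
}
```
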
### Token Storage (Encrypted)

```typescript
interface EncryptedTokens {
  accessToken: ArrayBuffer;   // Encrypted
  refreshToken: ArrayBuffer;  // Encrypted
  accessTokenIv: Uint8Array;
  refreshTokenIv: Uint8Array;
  expiresAt: number;          // Unencrypted for refresh logic
  scopes: string[];           // Unencrypted for UI display
}

async function storeGoogleTokens(
  tokens: { access_token: string; refresh_token?: string; expires_in: number },
  encryptionKey: CryptoKey
): Promise<void> {
  const { encrypted: encAccessToken, iv: accessIv } = await encryptData(
    tokens.access_token,
    encryptionKey
  );

  const encryptedTokens: Partial<EncryptedTokens> = {
    accessToken: encAccessToken,
    accessTokenIv: accessIv,
    expiresAt: Date.now() + (tokens.expires_in * 1000)
  };

  if (tokens.refresh_token) {
    const { encrypted: encRefreshToken, iv: refreshIv } = await encryptData(
      tokens.refresh_token,
      encryptionKey
    );
    encryptedTokens.refreshToken = encRefreshToken;
    encryptedTokens.refreshTokenIv = refreshIv;
  }

  const db = await initGoogleDataDB();
  const tx = db.transaction('encryptionMeta', 'readwrite');
  tx.objectStore('encryptionMeta').put({
    purpose: 'google_tokens',
    ...encryptedTokens
  });
}
```

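Phase 2 below lists token refresh logic; a hedged sketch using the stored `expiresAt` and Google's token endpoint. `loadGoogleTokens` is a hypothetical counterpart to `storeGoogleTokens` that decrypts the stored record:

```typescript
// Refresh the access token when it is near expiry.
async function getFreshAccessToken(encryptionKey: CryptoKey): Promise<string> {
  const tokens = await loadGoogleTokens(encryptionKey); // hypothetical: decrypts EncryptedTokens
  if (Date.now() < tokens.expiresAt - 60_000) return tokens.accessToken;

  const response = await fetch('https://oauth2.googleapis.com/token', {
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: new URLSearchParams({
      client_id: import.meta.env.VITE_GOOGLE_CLIENT_ID,
      refresh_token: tokens.refreshToken,
      grant_type: 'refresh_token'
    })
  });
  const refreshed = await response.json();
  await storeGoogleTokens(refreshed, encryptionKey); // re-encrypt and persist
  return refreshed.access_token;
}
```
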
## Data Import Workflow

### Progressive Import with Background Sync

```typescript
interface ImportProgress {
  service: 'gmail' | 'drive' | 'photos' | 'calendar';
  total: number;
  imported: number;
  lastSyncToken?: string;
  status: 'idle' | 'importing' | 'paused' | 'error';
  errorMessage?: string;
}

class GoogleDataImporter {
  // getAccessToken, fetchGmailMessage, extractHeader, extractBody and
  // updateProgress are class helpers elided from this excerpt
  private encryptionKey: CryptoKey;
  private db: IDBDatabase;

  async importGmail(options: {
    maxMessages?: number;
    labelsFilter?: string[];
    dateAfter?: Date;
  }): Promise<void> {
    const accessToken = await this.getAccessToken();

    // Use pagination for large mailboxes
    let pageToken: string | undefined;
    let imported = 0;

    do {
      const response = await fetch(
        `https://gmail.googleapis.com/gmail/v1/users/me/messages?${new URLSearchParams({
          maxResults: '100',
          ...(pageToken && { pageToken }),
          // Note: the API expects repeated labelIds params; comma-joining is a simplification
          ...(options.labelsFilter && { labelIds: options.labelsFilter.join(',') }),
          ...(options.dateAfter && { q: `after:${Math.floor(options.dateAfter.getTime() / 1000)}` })
        })}`,
        { headers: { Authorization: `Bearer ${accessToken}` } }
      );

      const data = await response.json();

      // Fetch and encrypt each message
      for (const msg of data.messages || []) {
        const fullMessage = await this.fetchGmailMessage(msg.id, accessToken);
        await this.storeEncryptedEmail(fullMessage);
        imported++;

        // Update progress
        this.updateProgress('gmail', imported);

        // Yield to UI periodically
        if (imported % 10 === 0) {
          await new Promise(r => setTimeout(r, 0));
        }
      }

      pageToken = data.nextPageToken;
    } while (pageToken && (!options.maxMessages || imported < options.maxMessages));
  }

  private async storeEncryptedEmail(message: any): Promise<void> {
    const emailKey = await deriveDataEncryptionKey(this.encryptionKey, 'gmail');

    const encrypted: EncryptedEmailStore = {
      id: message.id,
      threadId: message.threadId,
      encryptedSubject: (await encryptData(
        this.extractHeader(message, 'Subject') || '',
        emailKey
      )).encrypted,
      encryptedBody: (await encryptData(
        this.extractBody(message),
        emailKey
      )).encrypted,
      // ... other fields
      date: parseInt(message.internalDate),
      syncedAt: Date.now(),
      localOnly: true
    };

    const tx = this.db.transaction('gmail', 'readwrite');
    tx.objectStore('gmail').put(encrypted);
  }
}
```

## Sharing to Canvas Board

### Selective Sharing Model

```typescript
interface ShareableItem {
  type: 'email' | 'document' | 'photo' | 'event';
  id: string;
  // Decrypted data for sharing
  decryptedData: any;
}

class DataSharingService {
  /**
   * Share a specific item to the current board.
   * This decrypts the item and adds it to the Automerge document.
   */
  async shareToBoard(
    item: ShareableItem,
    boardHandle: DocHandle<CanvasDoc>, // automerge-repo's document handle type
    userKey: CryptoKey
  ): Promise<void> {
    // 1. Decrypt the item
    const decrypted = await this.decryptItem(item, userKey);

    // 2. Create a canvas shape representation
    const shape = this.createShapeFromItem(decrypted, item.type);

    // 3. Add to Automerge document (syncs to other board users)
    boardHandle.change(doc => {
      doc.shapes[shape.id] = shape;
    });

    // 4. Mark item as shared (no longer localOnly)
    await this.markAsShared(item.id, item.type);
  }

  /**
   * Create a visual shape from data
   */
  private createShapeFromItem(data: any, type: string): TLShape {
    switch (type) {
      case 'email':
        return {
          id: createShapeId(),
          type: 'email-card',
          props: {
            subject: data.subject,
            from: data.from,
            date: data.date,
            snippet: data.snippet
          }
        };
      case 'event':
        return {
          id: createShapeId(),
          type: 'calendar-event',
          props: {
            title: data.summary,
            startTime: data.startTime,
            endTime: data.endTime,
            location: data.location
          }
        };
      // ... other types
      default:
        throw new Error(`Unsupported share type: ${type}`);
    }
  }
}
```

## R2 Encrypted Backup

### Backup Architecture

```
User Browser                       Cloudflare Worker               R2 Storage
     │                                  │                               │
     │ 1. Encrypt data locally          │                               │
     │    (already encrypted in IndexedDB)                              │
     │                                  │                               │
     │ 2. Generate backup key           │                               │
     │    (derived from master key)     │                               │
     │                                  │                               │
     │ 3. POST encrypted blob ─────────>│ 4. Validate user              │
     │                                  │    (CryptID auth)             │
     │                                  │                               │
     │                                  │ 5. Store blob ───────────────>│
     │                                  │    (already encrypted,        │
     │                                  │     worker can't read)        │
     │                                  │                               │
     │<─────────────────────────────────│ 6. Return backup ID           │
```

### Backup Implementation

```typescript
interface BackupMetadata {
  id: string;
  createdAt: number;
  services: ('gmail' | 'drive' | 'photos' | 'calendar')[];
  itemCount: number;
  sizeBytes: number;
  // Encrypted with user's key - only they can read
  encryptedManifest: ArrayBuffer;
}

class R2BackupService {
  private workerUrl = '/api/backup';

  async createBackup(
    services: string[],
    encryptionKey: CryptoKey
  ): Promise<BackupMetadata> {
    // 1. Gather all encrypted data from IndexedDB
    const dataToBackup = await this.gatherData(services);

    // 2. Create a manifest (encrypted)
    const manifest = {
      version: 1,
      createdAt: Date.now(),
      services,
      itemCounts: dataToBackup.counts
    };
    const { encrypted: encManifest } = await encryptData(
      JSON.stringify(manifest),
      encryptionKey
    );

    // 3. Serialize and chunk if large
    const blob = await this.serializeForBackup(dataToBackup);

    // 4. Upload to R2 via worker
    const response = await fetch(this.workerUrl, {
      method: 'POST',
      headers: {
        'Content-Type': 'application/octet-stream',
        'X-Backup-Manifest': base64Encode(encManifest)
      },
      body: blob
    });

    const { backupId } = await response.json();

    return {
      id: backupId,
      createdAt: Date.now(),
      services: services as any,
      itemCount: Object.values(dataToBackup.counts).reduce((a, b) => a + b, 0),
      sizeBytes: blob.size,
      encryptedManifest: encManifest
    };
  }

  async restoreBackup(
    backupId: string,
    encryptionKey: CryptoKey
  ): Promise<void> {
    // 1. Fetch encrypted blob from R2
    const response = await fetch(`${this.workerUrl}/${backupId}`);
    const encryptedBlob = await response.arrayBuffer();

    // 2. Data is already encrypted with user's key,
    //    so just write it directly to IndexedDB
    await this.writeToIndexedDB(encryptedBlob);
  }
}
```

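For reference, the worker side of steps 4-5 can stay very small, since the blob is opaque to it. A hedged sketch; the R2 binding name `BACKUP_BUCKET` is an assumption and the auth check is left as a comment:

```typescript
// Cloudflare Worker: store/retrieve opaque encrypted backup blobs in R2.
export default {
  async fetch(request: Request, env: { BACKUP_BUCKET: R2Bucket }): Promise<Response> {
    const url = new URL(request.url);

    if (request.method === 'POST' && url.pathname === '/api/backup') {
      // Validate the user here (e.g. CryptID auth) before accepting the upload
      const backupId = crypto.randomUUID();
      const data = await request.arrayBuffer();
      await env.BACKUP_BUCKET.put(`backups/${backupId}`, data, {
        customMetadata: { manifest: request.headers.get('X-Backup-Manifest') ?? '' }
      });
      return Response.json({ backupId });
    }

    if (request.method === 'GET' && url.pathname.startsWith('/api/backup/')) {
      const backupId = url.pathname.split('/').pop()!;
      const object = await env.BACKUP_BUCKET.get(`backups/${backupId}`);
      if (!object) return new Response('Not found', { status: 404 });
      return new Response(object.body, {
        headers: { 'Content-Type': 'application/octet-stream' }
      });
    }

    return new Response('Not found', { status: 404 });
  }
};
```
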
## Privacy & Security Guarantees

### What Never Leaves the Browser (Unencrypted)

1. **Email content** - body, subject, attachments
2. **Document content** - file contents, names
3. **Photo data** - images, location metadata
4. **Calendar details** - event descriptions, attendee info
5. **OAuth tokens** - access/refresh tokens

### What the Server Never Sees

1. **Encryption keys** - derived locally, never transmitted
2. **Plaintext data** - all API calls are client-side
3. **User's Google account data** - we use read-only scopes

### Data Flow Summary

```
                    ┌─────────────────────┐
                    │     Google APIs     │
                    │   (authenticated)   │
                    └──────────┬──────────┘
                               │
                     ┌─────────▼─────────┐
                     │   Browser Fetch   │
                     │   (client-side)   │
                     └─────────┬─────────┘
                               │
                     ┌─────────▼─────────┐
                     │   Encrypt with    │
                     │     WebCrypto     │
                     │   (AES-256-GCM)   │
                     └─────────┬─────────┘
                               │
          ┌────────────────────┼────────────────────┐
          │                    │                    │
┌─────────▼─────────┐  ┌───────▼────────┐  ┌────────▼───────┐
│     IndexedDB     │  │   Share to     │  │   R2 Backup    │
│   (local only)    │  │     Board      │  │  (encrypted)   │
│                   │  │  (Automerge)   │  │                │
└───────────────────┘  └────────────────┘  └────────────────┘
          │                    │                    │
          ▼                    ▼                    ▼
  Only you can read      Board members        Only you can
     (your keys)        see shared items     decrypt backup
```

## Implementation Phases

### Phase 1: Foundation
- [ ] IndexedDB schema for encrypted data
- [ ] Key derivation from existing WebCrypto keys
- [ ] Encrypt/decrypt utility functions
- [ ] Storage quota monitoring

### Phase 2: Google OAuth
- [ ] OAuth 2.0 with PKCE flow
- [ ] Token encryption and storage
- [ ] Token refresh logic
- [ ] Scope selection UI

### Phase 3: Data Import
- [ ] Gmail import with pagination
- [ ] Drive document import
- [ ] Photos thumbnail import
- [ ] Calendar event import
- [ ] Progress tracking UI

### Phase 4: Canvas Integration
- [ ] Email card shape
- [ ] Document preview shape
- [ ] Photo thumbnail shape
- [ ] Calendar event shape
- [ ] Share to board functionality

### Phase 5: R2 Backup
- [ ] Encrypted backup creation
- [ ] Backup restore
- [ ] Backup management UI
- [ ] Automatic backup scheduling

### Phase 6: Polish
- [ ] Safari storage warnings
- [ ] Offline data access
- [ ] Search within encrypted data
- [ ] Data export (Google Takeout style)

## Security Checklist

- [ ] All data encrypted before storage
- [ ] Keys never leave browser unencrypted
- [ ] OAuth tokens encrypted at rest
- [ ] PKCE used for OAuth flow
- [ ] Read-only Google API scopes
- [ ] Safari 7-day eviction handled
- [ ] Storage quota warnings
- [ ] Secure context required (HTTPS)
- [ ] CSP headers configured
- [ ] No sensitive data in console logs

## Related Documents

- [Local File Upload](./LOCAL_FILE_UPLOAD.md) - Multi-item upload with same encryption model
- [Offline Storage Feasibility](../OFFLINE_STORAGE_FEASIBILITY.md) - IndexedDB + Automerge foundation

## References

- [IndexedDB API](https://developer.mozilla.org/en-US/docs/Web/API/IndexedDB_API)
- [Web Crypto API](https://developer.mozilla.org/en-US/docs/Web/API/Web_Crypto_API)
- [Storage API](https://developer.mozilla.org/en-US/docs/Web/API/Storage_API)
- [Google OAuth 2.0](https://developers.google.com/identity/protocols/oauth2)
- [Gmail API](https://developers.google.com/gmail/api)
- [Drive API](https://developers.google.com/drive/api)
- [Photos Library API](https://developers.google.com/photos/library/reference/rest)
- [Calendar API](https://developers.google.com/calendar/api)

docs/LOCAL_FILE_UPLOAD.md
@@ -0,0 +1,862 @@

# Local File Upload: Multi-Item Encrypted Import

A simpler, more broadly compatible approach to importing local files into the canvas with the same privacy-first, encrypted storage model.

## Overview

Instead of maintaining persistent folder connections (which have browser compatibility issues), provide a **drag-and-drop / file picker** interface for batch importing files into encrypted local storage.

```
┌──────────────────────────────────────────────────────────────────────┐
│                           UPLOAD INTERFACE                            │
├──────────────────────────────────────────────────────────────────────┤
│                                                                       │
│  ┌────────────────────────────────────────────────────────────────┐  │
│  │                                                                │  │
│  │           📁 Drop files here or click to browse                │  │
│  │                                                                │  │
│  │     Supports: Images, PDFs, Documents, Text, Audio, Video      │  │
│  │                                                                │  │
│  └────────────────────────────────────────────────────────────────┘  │
│                                                                       │
│  ┌────────────────────────────────────────────────────────────────┐  │
│  │  Import Queue                                        [Upload]  │  │
│  ├────────────────────────────────────────────────────────────────┤  │
│  │  ☑ photo_001.jpg       (2.4 MB)   🔒 Encrypt   📤 Share        │  │
│  │  ☑ meeting_notes.pdf   (450 KB)   🔒 Encrypt   ☐ Private       │  │
│  │  ☑ project_plan.md     (12 KB)    🔒 Encrypt   ☐ Private       │  │
│  │  ☐ sensitive_doc.docx  (1.2 MB)   🔒 Encrypt   ☐ Private       │  │
│  └────────────────────────────────────────────────────────────────┘  │
│                                                                       │
│  Storage: 247 MB used / ~5 GB available                               │
│                                                                       │
└──────────────────────────────────────────────────────────────────────┘
```

## Why Multi-Item Upload vs. Folder Connection

| Feature | Folder Connection | Multi-Item Upload |
|---------|------------------|-------------------|
| Browser Support | Chrome/Edge only | All browsers |
| Persistent Access | Yes (with permission) | No (one-time import) |
| Implementation | Complex | Simple |
| User Control | Less explicit | Very explicit |
| Privacy UX | Hidden | Clear per-file choices |

**Recommendation**: Multi-item upload is better for privacy-conscious users who want explicit control over what enters the system.

## Supported File Types

### Documents
| Type | Extension | Processing | Storage Strategy |
|------|-----------|-----------|------------------|
| Markdown | `.md` | Parse frontmatter, render | Full content |
| PDF | `.pdf` | Extract text, thumbnail | Text + thumbnail |
| Word | `.docx` | Convert to markdown | Converted content |
| Text | `.txt`, `.csv`, `.json` | Direct | Full content |
| Code | `.js`, `.ts`, `.py`, etc. | Syntax highlight | Full content |

### Images
| Type | Extension | Processing | Storage Strategy |
|------|-----------|-----------|------------------|
| Photos | `.jpg`, `.png`, `.webp` | Generate thumbnail | Thumbnail + full |
| Vector | `.svg` | Direct | Full content |
| GIF | `.gif` | First frame thumb | Thumbnail + full |

### Media
| Type | Extension | Processing | Storage Strategy |
|------|-----------|-----------|------------------|
| Audio | `.mp3`, `.wav`, `.m4a` | Waveform preview | Reference + metadata |
| Video | `.mp4`, `.webm` | Frame thumbnail | Reference + metadata |

### Archives (Future)
| Type | Extension | Processing |
|------|-----------|-----------|
| ZIP | `.zip` | List contents, selective extract |
| Obsidian Export | `.zip` | Vault structure import |

## Architecture

```typescript
interface UploadedFile {
  id: string;                       // Generated UUID
  originalName: string;             // User's filename
  mimeType: string;
  size: number;

  // Processing results
  processed: {
    thumbnail?: ArrayBuffer;        // For images/PDFs/videos
    extractedText?: string;         // For searchable docs
    metadata?: Record<string, any>; // EXIF, frontmatter, etc.
  };

  // Encryption
  encrypted: {
    content: ArrayBuffer;           // Encrypted file content
    iv: Uint8Array;
    keyId: string;                  // Reference to encryption key
  };

  // User choices
  sharing: {
    localOnly: boolean;             // Default true
    sharedToBoard?: string;         // Board ID if shared
    backedUpToR2?: boolean;
  };

  // Timestamps
  importedAt: number;
  lastAccessedAt: number;
}
```

## Implementation

### 1. File Input Component

```typescript
import React, { useCallback, useState } from 'react';

interface FileUploadProps {
  onFilesSelected: (files: File[]) => void;
  maxFileSize?: number; // bytes
  maxFiles?: number;
  acceptedTypes?: string[];
}

export function FileUploadZone({
  onFilesSelected,
  maxFileSize = 100 * 1024 * 1024, // 100MB default
  maxFiles = 50,
  acceptedTypes
}: FileUploadProps) {
  const [isDragging, setIsDragging] = useState(false);
  const [errors, setErrors] = useState<string[]>([]);

  const validateAndProcess = (files: File[]) => {
    const newErrors: string[] = []; // renamed to avoid shadowing the `errors` state
    const validFiles: File[] = [];

    for (const file of files.slice(0, maxFiles)) {
      if (file.size > maxFileSize) {
        newErrors.push(`${file.name}: exceeds ${maxFileSize / 1024 / 1024}MB limit`);
        continue;
      }

      if (acceptedTypes && !acceptedTypes.some(t => file.type.match(t))) {
        newErrors.push(`${file.name}: unsupported file type`);
        continue;
      }

      validFiles.push(file);
    }

    if (files.length > maxFiles) {
      newErrors.push(`Only first ${maxFiles} files will be imported`);
    }

    setErrors(newErrors);
    if (validFiles.length > 0) {
      onFilesSelected(validFiles);
    }
  };

  const handleDrop = useCallback((e: React.DragEvent) => {
    e.preventDefault();
    setIsDragging(false);
    validateAndProcess(Array.from(e.dataTransfer.files));
  }, []);

  const handleFileInput = useCallback((e: React.ChangeEvent<HTMLInputElement>) => {
    validateAndProcess(Array.from(e.target.files || []));
  }, []);

  return (
    <div
      onDrop={handleDrop}
      onDragOver={(e) => { e.preventDefault(); setIsDragging(true); }}
      onDragLeave={() => setIsDragging(false)}
      className={`upload-zone ${isDragging ? 'dragging' : ''}`}
    >
      <input
        type="file"
        multiple
        onChange={handleFileInput}
        accept={acceptedTypes?.join(',')}
        id="file-upload"
        hidden
      />
      <label htmlFor="file-upload">
        <span className="upload-icon">📁</span>
        <span>Drop files here or click to browse</span>
        <span className="upload-hint">
          Images, PDFs, Documents, Text files
        </span>
      </label>

      {errors.length > 0 && (
        <div className="upload-errors">
          {errors.map((err, i) => <div key={i}>{err}</div>)}
        </div>
      )}
    </div>
  );
}
```

### 2. File Processing Pipeline

```typescript
interface ProcessedFile {
  file: File;
  thumbnail?: Blob;
  extractedText?: string;
  metadata?: Record<string, any>;
}

class FileProcessor {

  async process(file: File): Promise<ProcessedFile> {
    const result: ProcessedFile = { file };

    // Route based on MIME type
    if (file.type.startsWith('image/')) {
      return this.processImage(file, result);
    } else if (file.type === 'application/pdf') {
      return this.processPDF(file, result);
    } else if (file.type.startsWith('text/') || this.isTextFile(file)) {
      return this.processText(file, result);
    } else if (file.type.startsWith('video/')) {
      return this.processVideo(file, result);
    } else if (file.type.startsWith('audio/')) {
      return this.processAudio(file, result);
    }

    // Default: store as-is
    return result;
  }

  private async processImage(file: File, result: ProcessedFile): Promise<ProcessedFile> {
    // Generate thumbnail
    const img = await createImageBitmap(file);
    const canvas = new OffscreenCanvas(200, 200);
    const ctx = canvas.getContext('2d')!;

    // Calculate aspect-ratio preserving dimensions
    const scale = Math.min(200 / img.width, 200 / img.height);
    const w = img.width * scale;
    const h = img.height * scale;

    ctx.drawImage(img, (200 - w) / 2, (200 - h) / 2, w, h);
    result.thumbnail = await canvas.convertToBlob({ type: 'image/webp', quality: 0.8 });

    // Extract EXIF if available
    if (file.type === 'image/jpeg') {
      result.metadata = await this.extractExif(file);
    }

    return result;
  }

  private async processPDF(file: File, result: ProcessedFile): Promise<ProcessedFile> {
    // Use pdf.js for text extraction and thumbnail
    const pdfjsLib = await import('pdfjs-dist');
    const arrayBuffer = await file.arrayBuffer();
    const pdf = await pdfjsLib.getDocument({ data: arrayBuffer }).promise;

    // Get first page as thumbnail
    const page = await pdf.getPage(1);
    const viewport = page.getViewport({ scale: 0.5 });
    const canvas = new OffscreenCanvas(viewport.width, viewport.height);
    const ctx = canvas.getContext('2d')!;

    await page.render({ canvasContext: ctx, viewport }).promise;
    result.thumbnail = await canvas.convertToBlob({ type: 'image/webp' });

    // Extract text from all pages
    let text = '';
    for (let i = 1; i <= pdf.numPages; i++) {
      const page = await pdf.getPage(i);
      const content = await page.getTextContent();
      text += content.items.map((item: any) => item.str).join(' ') + '\n';
    }
    result.extractedText = text;

    result.metadata = { pageCount: pdf.numPages };

    return result;
  }

  private async processText(file: File, result: ProcessedFile): Promise<ProcessedFile> {
    result.extractedText = await file.text();

    // Parse markdown frontmatter if applicable
    if (file.name.endsWith('.md')) {
      const frontmatter = this.parseFrontmatter(result.extractedText);
      if (frontmatter) {
        result.metadata = frontmatter;
      }
    }

    return result;
  }

  private async processVideo(file: File, result: ProcessedFile): Promise<ProcessedFile> {
    // Generate thumbnail from first frame
    const video = document.createElement('video');
    video.preload = 'metadata';
    video.src = URL.createObjectURL(file);

    await new Promise(resolve => video.addEventListener('loadedmetadata', resolve));
    video.currentTime = 1; // First second
    await new Promise(resolve => video.addEventListener('seeked', resolve));

    const canvas = new OffscreenCanvas(200, 200);
    const ctx = canvas.getContext('2d')!;
    const scale = Math.min(200 / video.videoWidth, 200 / video.videoHeight);
    ctx.drawImage(video, 0, 0, video.videoWidth * scale, video.videoHeight * scale);

    result.thumbnail = await canvas.convertToBlob({ type: 'image/webp' });
    result.metadata = {
      duration: video.duration,
      width: video.videoWidth,
      height: video.videoHeight
    };

    URL.revokeObjectURL(video.src);
    return result;
  }

  private async processAudio(file: File, result: ProcessedFile): Promise<ProcessedFile> {
    // Extract duration and basic metadata
    const audio = document.createElement('audio');
    audio.src = URL.createObjectURL(file);

    await new Promise(resolve => audio.addEventListener('loadedmetadata', resolve));

    result.metadata = {
      duration: audio.duration
    };

    URL.revokeObjectURL(audio.src);
    return result;
  }

  private isTextFile(file: File): boolean {
    const textExtensions = ['.md', '.txt', '.json', '.csv', '.yaml', '.yml', '.xml', '.html', '.css', '.js', '.ts', '.py', '.sh'];
    return textExtensions.some(ext => file.name.toLowerCase().endsWith(ext));
  }

  private parseFrontmatter(content: string): Record<string, any> | null {
    const match = content.match(/^---\n([\s\S]*?)\n---/);
    if (!match) return null;

    try {
      // Simple YAML-like parsing (or use a proper YAML parser)
      const lines = match[1].split('\n');
      const result: Record<string, any> = {};
      for (const line of lines) {
        const [key, ...valueParts] = line.split(':');
        if (key && valueParts.length) {
          result[key.trim()] = valueParts.join(':').trim();
        }
      }
      return result;
    } catch {
      return null;
    }
  }

  private async extractExif(file: File): Promise<Record<string, any>> {
    // Would use exif-js or similar library
    return {};
  }
}
```

### 3. Encryption & Storage

```typescript
class LocalFileStore {
  private db: IDBDatabase;
  private encryptionKey: CryptoKey;

  async storeFile(processed: ProcessedFile, options: {
    shareToBoard?: boolean;
  } = {}): Promise<UploadedFile> {
    const fileId = crypto.randomUUID();

    // Read file content
    const content = await processed.file.arrayBuffer();

    // Encrypt content
    const iv = crypto.getRandomValues(new Uint8Array(12));
    const encryptedContent = await crypto.subtle.encrypt(
      { name: 'AES-GCM', iv },
      this.encryptionKey,
      content
    );

    // Encrypt thumbnail if present
    let encryptedThumbnail: ArrayBuffer | undefined;
    let thumbnailIv: Uint8Array | undefined;
    if (processed.thumbnail) {
      thumbnailIv = crypto.getRandomValues(new Uint8Array(12));
      const thumbBuffer = await processed.thumbnail.arrayBuffer();
      encryptedThumbnail = await crypto.subtle.encrypt(
        { name: 'AES-GCM', iv: thumbnailIv },
        this.encryptionKey,
        thumbBuffer
      );
    }

    const uploadedFile: UploadedFile = {
      id: fileId,
      originalName: processed.file.name,
      mimeType: processed.file.type,
      size: processed.file.size,
      processed: {
        extractedText: processed.extractedText,
        metadata: processed.metadata
      },
      encrypted: {
        content: encryptedContent,
        iv,
        keyId: 'user-master-key'
      },
      sharing: {
        localOnly: !options.shareToBoard,
        sharedToBoard: options.shareToBoard ? getCurrentBoardId() : undefined
      },
      importedAt: Date.now(),
      lastAccessedAt: Date.now()
    };

    // Store encrypted thumbnail separately (for faster listing)
    // (storeThumbnail and getCurrentBoardId are app helpers elided here)
    if (encryptedThumbnail && thumbnailIv) {
      await this.storeThumbnail(fileId, encryptedThumbnail, thumbnailIv);
    }

    // Store to IndexedDB
    const tx = this.db.transaction('files', 'readwrite');
    tx.objectStore('files').put(uploadedFile);

    return uploadedFile;
  }

  async getFile(fileId: string): Promise<{
    file: UploadedFile;
    decryptedContent: ArrayBuffer;
  } | null> {
    const tx = this.db.transaction('files', 'readonly');
    const file = await new Promise<UploadedFile | undefined>(resolve => {
      const req = tx.objectStore('files').get(fileId);
      req.onsuccess = () => resolve(req.result);
    });

    if (!file) return null;

    // Decrypt content
    const decryptedContent = await crypto.subtle.decrypt(
      { name: 'AES-GCM', iv: file.encrypted.iv },
      this.encryptionKey,
      file.encrypted.content
    );

    return { file, decryptedContent };
  }

  async listFiles(options?: {
    mimeTypeFilter?: string;
    limit?: number;
    offset?: number;
  }): Promise<UploadedFile[]> {
    const tx = this.db.transaction('files', 'readonly');
    const store = tx.objectStore('files');

    return new Promise(resolve => {
      const files: UploadedFile[] = [];
      const req = store.openCursor();

      req.onsuccess = (e) => {
        const cursor = (e.target as IDBRequest).result;
        if (cursor) {
          const file = cursor.value as UploadedFile;

          // Filter by MIME type if specified
          if (!options?.mimeTypeFilter || file.mimeType.startsWith(options.mimeTypeFilter)) {
            files.push(file);
          }

          cursor.continue();
        } else {
          resolve(files);
        }
      };
    });
  }
}
```

|
||||
|
||||
### 4. IndexedDB Schema

```typescript
const LOCAL_FILES_DB = 'canvas-local-files';
const DB_VERSION = 1;

async function initLocalFilesDB(): Promise<IDBDatabase> {
  return new Promise((resolve, reject) => {
    const request = indexedDB.open(LOCAL_FILES_DB, DB_VERSION);

    request.onerror = () => reject(request.error);
    request.onsuccess = () => resolve(request.result);

    request.onupgradeneeded = (event) => {
      const db = (event.target as IDBOpenDBRequest).result;

      // Main files store
      if (!db.objectStoreNames.contains('files')) {
        const store = db.createObjectStore('files', { keyPath: 'id' });
        store.createIndex('mimeType', 'mimeType', { unique: false });
        store.createIndex('importedAt', 'importedAt', { unique: false });
        store.createIndex('originalName', 'originalName', { unique: false });
        store.createIndex('sharedToBoard', 'sharing.sharedToBoard', { unique: false });
      }

      // Thumbnails store (separate for faster listing)
      if (!db.objectStoreNames.contains('thumbnails')) {
        db.createObjectStore('thumbnails', { keyPath: 'fileId' });
      }

      // Search index (encrypted full-text search)
      if (!db.objectStoreNames.contains('searchIndex')) {
        const searchStore = db.createObjectStore('searchIndex', { keyPath: 'fileId' });
        searchStore.createIndex('tokens', 'tokens', { unique: false, multiEntry: true });
      }
    };
  });
}
```

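Wiring the schema and store together might look like the following sketch. Note that the `LocalFileStore` constructor signature is an assumption — the class above does not show one:

```typescript
// Illustrative wiring only — LocalFileStore's constructor is assumed to
// accept the opened database and a previously derived AES-GCM key.
async function createLocalFileStore(encryptionKey: CryptoKey): Promise<LocalFileStore> {
  const db = await initLocalFilesDB();
  return new LocalFileStore(db, encryptionKey);
}
```
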
## UI Components

### Import Dialog

```tsx
function ImportFilesDialog({ isOpen, onClose }: { isOpen: boolean; onClose: () => void }) {
  const [selectedFiles, setSelectedFiles] = useState<ProcessedFile[]>([]);
  const [importing, setImporting] = useState(false);
  const [progress, setProgress] = useState(0);
  const fileStore = useLocalFileStore();

  const handleFilesSelected = async (files: File[]) => {
    const processor = new FileProcessor();
    const processed: ProcessedFile[] = [];

    for (const file of files) {
      processed.push(await processor.process(file));
    }

    setSelectedFiles(prev => [...prev, ...processed]);
  };

  const handleImport = async () => {
    setImporting(true);

    for (let i = 0; i < selectedFiles.length; i++) {
      await fileStore.storeFile(selectedFiles[i]);
      setProgress((i + 1) / selectedFiles.length * 100);
    }

    setImporting(false);
    onClose();
  };

  return (
    <Dialog open={isOpen} onClose={onClose}>
      <DialogTitle>Import Files</DialogTitle>

      <FileUploadZone onFilesSelected={handleFilesSelected} />

      {selectedFiles.length > 0 && (
        <div className="file-list">
          {selectedFiles.map((pf, i) => (
            <FilePreviewRow
              key={i}
              file={pf}
              onRemove={() => setSelectedFiles(prev => prev.filter((_, j) => j !== i))}
            />
          ))}
        </div>
      )}

      {importing && (
        <progress value={progress} max={100} />
      )}

      <DialogActions>
        <button onClick={onClose}>Cancel</button>
        <button
          onClick={handleImport}
          disabled={selectedFiles.length === 0 || importing}
        >
          Import {selectedFiles.length} files
        </button>
      </DialogActions>
    </Dialog>
  );
}
```

### File Browser Panel

```tsx
function LocalFilesBrowser() {
  const [files, setFiles] = useState<UploadedFile[]>([]);
  const [filter, setFilter] = useState<string>('all');
  const fileStore = useLocalFileStore();

  useEffect(() => {
    loadFiles();
  }, [filter]);

  const loadFiles = async () => {
    const mimeFilter = filter === 'all' ? undefined : filter;
    setFiles(await fileStore.listFiles({ mimeTypeFilter: mimeFilter }));
  };

  const handleDragToCanvas = (file: UploadedFile) => {
    // Create a shape from the file and add to canvas
  };

  return (
    <div className="local-files-browser">
      <div className="filter-bar">
        <button onClick={() => setFilter('all')}>All</button>
        <button onClick={() => setFilter('image/')}>Images</button>
        <button onClick={() => setFilter('application/pdf')}>PDFs</button>
        <button onClick={() => setFilter('text/')}>Documents</button>
      </div>

      <div className="files-grid">
        {files.map(file => (
          <FileCard
            key={file.id}
            file={file}
            onDragStart={() => handleDragToCanvas(file)}
          />
        ))}
      </div>
    </div>
  );
}
```

## Canvas Integration

### Drag Files to Canvas

```typescript
// When user drags a local file onto the canvas
async function createShapeFromLocalFile(
  file: UploadedFile,
  position: { x: number; y: number },
  editor: Editor
): Promise<TLShapeId> {
  const fileStore = getLocalFileStore();
  const entry = await fileStore.getFile(file.id);
  if (!entry) throw new Error(`File ${file.id} not found in local store`);
  const { decryptedContent } = entry;

  if (file.mimeType.startsWith('image/')) {
    // Create image shape
    const blob = new Blob([decryptedContent], { type: file.mimeType });
    const assetId = AssetRecordType.createId();

    await editor.createAssets([{
      id: assetId,
      type: 'image',
      typeName: 'asset',
      props: {
        name: file.originalName,
        src: URL.createObjectURL(blob),
        w: 400,
        h: 300,
        mimeType: file.mimeType,
        isAnimated: file.mimeType === 'image/gif'
      }
    }]);

    return editor.createShape({
      type: 'image',
      x: position.x,
      y: position.y,
      props: { assetId, w: 400, h: 300 }
    }).id;

  } else if (file.mimeType === 'application/pdf') {
    // Create PDF embed or preview shape
    return editor.createShape({
      type: 'pdf-preview',
      x: position.x,
      y: position.y,
      props: {
        fileId: file.id,
        name: file.originalName,
        pageCount: file.processed.metadata?.pageCount
      }
    }).id;

  } else if (file.mimeType.startsWith('text/') || file.originalName.endsWith('.md')) {
    // Create note shape with content
    const text = new TextDecoder().decode(decryptedContent);
    return editor.createShape({
      type: 'note',
      x: position.x,
      y: position.y,
      props: {
        text: text.slice(0, 1000), // Truncate for display
        fileId: file.id,
        fullContentAvailable: text.length > 1000
      }
    }).id;
  }

  // Default: generic file card
  return editor.createShape({
    type: 'file-card',
    x: position.x,
    y: position.y,
    props: {
      fileId: file.id,
      name: file.originalName,
      size: file.size,
      mimeType: file.mimeType
    }
  }).id;
}
```

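Hooked into a drop handler, usage could look like this sketch — the drag payload key and the `getLocalFileStore` helper are assumptions carried over from the snippet above:

```typescript
// Sketch of a canvas drop handler; assumes the file browser sets the
// dragged file's id under a custom data-transfer key.
async function onCanvasDrop(e: DragEvent, editor: Editor) {
  const fileId = e.dataTransfer?.getData('application/x-local-file-id');
  if (!fileId) return;

  const fileStore = getLocalFileStore();
  const entry = await fileStore.getFile(fileId);
  if (!entry) return;

  // Convert screen coordinates to page coordinates before placing the shape
  const point = editor.screenToPage({ x: e.clientX, y: e.clientY });
  await createShapeFromLocalFile(entry.file, point, editor);
}
```
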
## Storage Considerations

### Size Limits & Recommendations

| File Type | Max Recommended | Notes |
|-----------|----------------|-------|
| Images | 20MB each | Larger images get resized |
| PDFs | 50MB each | Text extracted for search |
| Videos | 100MB each | Store reference, thumbnail only |
| Audio | 50MB each | Store with waveform preview |
| Documents | 10MB each | Full content stored |

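These are guidelines rather than hard limits; an import-time guard could mirror the table (a minimal sketch — the threshold map below is just the table restated in code):

```typescript
// Recommended per-type ceilings, keyed by MIME prefix (mirrors the table above).
const MAX_RECOMMENDED_SIZE: Record<string, number> = {
  'image/': 20 * 1024 * 1024,
  'application/pdf': 50 * 1024 * 1024,
  'video/': 100 * 1024 * 1024,
  'audio/': 50 * 1024 * 1024,
  'text/': 10 * 1024 * 1024
};

function exceedsRecommendedSize(file: File): boolean {
  const match = Object.entries(MAX_RECOMMENDED_SIZE)
    .find(([prefix]) => file.type.startsWith(prefix));
  return match ? file.size > match[1] : false;
}
```
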
### Total Storage Budget

```typescript
const STORAGE_CONFIG = {
  // Soft warning at 500MB
  warningThreshold: 500 * 1024 * 1024,

  // Hard limit at 2GB (leaves room for other data)
  maxStorage: 2 * 1024 * 1024 * 1024,

  // Auto-cleanup: remove thumbnails for files not accessed in 30 days
  thumbnailRetentionDays: 30
};

async function checkStorageQuota(): Promise<{
  used: number;
  available: number;
  warning: boolean;
}> {
  const estimate = await navigator.storage.estimate();
  const used = estimate.usage || 0;
  const quota = estimate.quota || 0;

  return {
    used,
    // Clamp at zero so callers never see a negative budget
    available: Math.max(0, Math.min(quota - used, STORAGE_CONFIG.maxStorage - used)),
    warning: used > STORAGE_CONFIG.warningThreshold
  };
}
```

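An import flow would consult this before writing (a sketch; how the warning surfaces in the UI is left open):

```typescript
// Gate an import on the remaining budget and surface the soft warning.
async function canStoreFile(file: File): Promise<boolean> {
  const { available, warning } = await checkStorageQuota();
  if (warning) {
    console.warn('Local file storage is above the 500MB soft threshold');
  }
  return file.size <= available;
}
```
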
## Privacy Features

### Per-File Privacy Controls

```typescript
interface FilePrivacySettings {
  // Encryption is always on - this is about sharing
  localOnly: boolean;            // Never leaves browser
  shareableToBoard: boolean;     // Can be added to shared board
  includeInR2Backup: boolean;    // Include in cloud backup

  // Metadata privacy
  stripExif: boolean;            // Remove location/camera data from images
  anonymizeFilename: boolean;    // Use generated name instead of original
}

const DEFAULT_PRIVACY: FilePrivacySettings = {
  localOnly: true,
  shareableToBoard: false,
  includeInR2Backup: true,
  stripExif: true,
  anonymizeFilename: false
};
```

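The `stripExif` option can be implemented by re-encoding the image through a canvas, which drops EXIF and other metadata blocks (a minimal sketch; output format and quality would need tuning per file type):

```typescript
// Re-encoding via a canvas discards EXIF (location, camera data) because
// only pixel data survives the round trip.
async function stripImageMetadata(file: File): Promise<Blob> {
  const bitmap = await createImageBitmap(file);
  const canvas = new OffscreenCanvas(bitmap.width, bitmap.height);
  canvas.getContext('2d')!.drawImage(bitmap, 0, 0);
  bitmap.close();
  return canvas.convertToBlob({ type: 'image/jpeg', quality: 0.95 });
}
```
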
### Sharing Flow

```
┌─────────────────────────────────────────────────────────────────┐
│ User drags local file onto shared board                         │
├─────────────────────────────────────────────────────────────────┤
│                                                                 │
│ ⚠️ Share "meeting_notes.pdf" to this board?                     │
│                                                                 │
│ This file is currently private. Sharing it will:                │
│ • Make it visible to all board members                          │
│ • Upload an encrypted copy to sync storage                      │
│ • Keep the original encrypted on your device                    │
│                                                                 │
│ [Keep Private]                         [Share to Board]         │
│                                                                 │
└─────────────────────────────────────────────────────────────────┘
```

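In code, the flow above might reduce to something like this sketch — `confirmShare` and `uploadEncryptedCopy` are hypothetical names for the dialog and sync plumbing:

```typescript
// Hypothetical helpers; declared here only to make the sketch type-check.
declare function confirmShare(fileName: string): Promise<boolean>;
declare function uploadEncryptedCopy(file: UploadedFile, boardId: string): Promise<void>;

async function shareLocalFileToBoard(file: UploadedFile, boardId: string): Promise<boolean> {
  const confirmed = await confirmShare(file.originalName);
  if (!confirmed) return false; // "Keep Private"

  // Upload an encrypted copy; the original stays encrypted on-device.
  await uploadEncryptedCopy(file, boardId);
  file.sharing = { localOnly: false, sharedToBoard: boardId };
  return true;
}
```
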
## Implementation Checklist

### Phase 1: Core Upload
- [ ] File drop zone component
- [ ] File type detection
- [ ] Image thumbnail generation
- [ ] PDF text extraction & thumbnail
- [ ] Encryption before storage
- [ ] IndexedDB schema & storage

### Phase 2: File Management
- [ ] File browser panel
- [ ] Filter by type
- [ ] Search within files
- [ ] Delete files
- [ ] Storage quota display

### Phase 3: Canvas Integration
- [ ] Drag files to canvas
- [ ] Image shape from file
- [ ] PDF preview shape
- [ ] Document/note shape
- [ ] Generic file card shape

### Phase 4: Sharing & Backup
- [ ] Share confirmation dialog
- [ ] Upload to Automerge sync
- [ ] Include in R2 backup
- [ ] Privacy settings per file

## Related Documents

- [Google Data Sovereignty](./GOOGLE_DATA_SOVEREIGNTY.md) - Same encryption model for Google imports
- [Offline Storage Feasibility](../OFFLINE_STORAGE_FEASIBILITY.md) - IndexedDB + Automerge foundation

@@ -13,6 +13,7 @@
    "@automerge/automerge": "^3.1.1",
    "@automerge/automerge-repo": "^2.2.0",
    "@automerge/automerge-repo-react-hooks": "^2.2.0",
    "@automerge/automerge-repo-storage-indexeddb": "^2.5.0",
    "@chengsokdara/use-whisper": "^0.2.0",
    "@daily-co/daily-js": "^0.60.0",
    "@daily-co/daily-react": "^0.20.0",

@@ -211,6 +212,31 @@
        "react-dom": "^18.0.0 || ^19.0.0"
      }
    },
    "node_modules/@automerge/automerge-repo-storage-indexeddb": {
      "version": "2.5.0",
      "resolved": "https://registry.npmjs.org/@automerge/automerge-repo-storage-indexeddb/-/automerge-repo-storage-indexeddb-2.5.0.tgz",
      "integrity": "sha512-7MJYJ5S6K7dHlbvs5/u/v9iexqOeprU/qQonup28r2IoVqwzjuN5ezaoVk6JRBMDI/ZxWfU4rNrqVrVlB49yXA==",
      "license": "MIT",
      "dependencies": {
        "@automerge/automerge-repo": "2.5.0"
      }
    },
    "node_modules/@automerge/automerge-repo-storage-indexeddb/node_modules/@automerge/automerge-repo": {
      "version": "2.5.0",
      "resolved": "https://registry.npmjs.org/@automerge/automerge-repo/-/automerge-repo-2.5.0.tgz",
      "integrity": "sha512-bdxuMuKmxw0ZjwQXecrIX1VrHXf445bYCftNJJ5vqgGWVvINB5ZKFYAbtgPIyu1Y0TXQKvc6eqESaDeL+g8MmA==",
      "license": "MIT",
      "dependencies": {
        "@automerge/automerge": "2.2.8 - 3",
        "bs58check": "^3.0.1",
        "cbor-x": "^1.3.0",
        "debug": "^4.3.4",
        "eventemitter3": "^5.0.1",
        "fast-sha256": "^1.3.0",
        "uuid": "^9.0.0",
        "xstate": "^5.9.1"
      }
    },
    "node_modules/@babel/code-frame": {
      "version": "7.27.1",
      "resolved": "https://registry.npmjs.org/@babel/code-frame/-/code-frame-7.27.1.tgz",

@@ -25,6 +25,7 @@
    "@automerge/automerge": "^3.1.1",
    "@automerge/automerge-repo": "^2.2.0",
    "@automerge/automerge-repo-react-hooks": "^2.2.0",
    "@automerge/automerge-repo-storage-indexeddb": "^2.5.0",
    "@chengsokdara/use-whisper": "^0.2.0",
    "@daily-co/daily-js": "^0.60.0",
    "@daily-co/daily-react": "^0.20.0",

@@ -35,6 +35,9 @@ import { ErrorBoundary } from './components/ErrorBoundary';
import CryptoLogin from './components/auth/CryptoLogin';
import CryptoDebug from './components/auth/CryptoDebug';

// Import Google Data test component
import { GoogleDataTest } from './components/GoogleDataTest';

inject();

// Initialize Daily.co call object with error handling

@@ -168,6 +171,9 @@ const AppWithProviders = () => {
            <LocationDashboardRoute />
          </OptionalAuthRoute>
        } />
        {/* Google Data routes */}
        <Route path="/google" element={<GoogleDataTest />} />
        <Route path="/oauth/google/callback" element={<GoogleDataTest />} />
      </Routes>
    </BrowserRouter>
  </DailyProvider>

@@ -0,0 +1,250 @@
/**
 * Document ID Mapping Utility
 *
 * Manages the mapping between room IDs (human-readable slugs) and
 * Automerge document IDs (automerge:xxxx format).
 *
 * This is necessary because:
 * - Automerge requires specific document ID formats
 * - We want to persist documents in IndexedDB with consistent IDs
 * - Room IDs are user-friendly slugs that may not match Automerge's format
 */

const DB_NAME = 'canvas-document-mappings'
const STORE_NAME = 'mappings'
const DB_VERSION = 1

interface DocumentMapping {
  roomId: string
  documentId: string
  createdAt: number
  lastAccessedAt: number
}

let dbInstance: IDBDatabase | null = null

/**
 * Open the IndexedDB database for document ID mappings
 */
async function openDatabase(): Promise<IDBDatabase> {
  if (dbInstance) return dbInstance

  return new Promise((resolve, reject) => {
    const request = indexedDB.open(DB_NAME, DB_VERSION)

    request.onerror = () => {
      console.error('Failed to open document mapping database:', request.error)
      reject(request.error)
    }

    request.onsuccess = () => {
      dbInstance = request.result
      resolve(request.result)
    }

    request.onupgradeneeded = (event) => {
      const db = (event.target as IDBOpenDBRequest).result

      if (!db.objectStoreNames.contains(STORE_NAME)) {
        const store = db.createObjectStore(STORE_NAME, { keyPath: 'roomId' })
        store.createIndex('documentId', 'documentId', { unique: true })
        store.createIndex('lastAccessedAt', 'lastAccessedAt', { unique: false })
      }
    }
  })
}

/**
 * Get the Automerge document ID for a given room ID
 * Returns null if no mapping exists
 */
export async function getDocumentId(roomId: string): Promise<string | null> {
  try {
    const db = await openDatabase()

    return new Promise((resolve, reject) => {
      const transaction = db.transaction(STORE_NAME, 'readonly')
      const store = transaction.objectStore(STORE_NAME)
      const request = store.get(roomId)

      request.onerror = () => {
        console.error('Failed to get document mapping:', request.error)
        reject(request.error)
      }

      request.onsuccess = () => {
        const mapping = request.result as DocumentMapping | undefined
        if (mapping) {
          // Update last accessed time in background
          updateLastAccessed(roomId).catch(console.error)
          resolve(mapping.documentId)
        } else {
          resolve(null)
        }
      }
    })
  } catch (error) {
    console.error('Error getting document ID:', error)
    return null
  }
}

/**
 * Save a mapping between room ID and Automerge document ID
 */
export async function saveDocumentId(roomId: string, documentId: string): Promise<void> {
  try {
    const db = await openDatabase()

    return new Promise((resolve, reject) => {
      const transaction = db.transaction(STORE_NAME, 'readwrite')
      const store = transaction.objectStore(STORE_NAME)

      const mapping: DocumentMapping = {
        roomId,
        documentId,
        createdAt: Date.now(),
        lastAccessedAt: Date.now()
      }

      const request = store.put(mapping)

      request.onerror = () => {
        console.error('Failed to save document mapping:', request.error)
        reject(request.error)
      }

      request.onsuccess = () => {
        console.log(`📝 Saved document mapping: ${roomId} → ${documentId}`)
        resolve()
      }
    })
  } catch (error) {
    console.error('Error saving document ID:', error)
    throw error
  }
}

/**
 * Update the last accessed timestamp for a room
 */
async function updateLastAccessed(roomId: string): Promise<void> {
  try {
    const db = await openDatabase()

    return new Promise((resolve, reject) => {
      const transaction = db.transaction(STORE_NAME, 'readwrite')
      const store = transaction.objectStore(STORE_NAME)
      const getRequest = store.get(roomId)

      getRequest.onerror = () => reject(getRequest.error)

      getRequest.onsuccess = () => {
        const mapping = getRequest.result as DocumentMapping | undefined
        if (mapping) {
          mapping.lastAccessedAt = Date.now()
          store.put(mapping)
        }
        resolve()
      }
    })
  } catch (error) {
    // Silent fail for background update
  }
}

/**
 * Delete a document mapping (useful for cleanup)
 */
export async function deleteDocumentMapping(roomId: string): Promise<void> {
  try {
    const db = await openDatabase()

    return new Promise((resolve, reject) => {
      const transaction = db.transaction(STORE_NAME, 'readwrite')
      const store = transaction.objectStore(STORE_NAME)
      const request = store.delete(roomId)

      request.onerror = () => {
        console.error('Failed to delete document mapping:', request.error)
        reject(request.error)
      }

      request.onsuccess = () => {
        console.log(`🗑️ Deleted document mapping for: ${roomId}`)
        resolve()
      }
    })
  } catch (error) {
    console.error('Error deleting document mapping:', error)
    throw error
  }
}

/**
 * Get all document mappings (useful for debugging/management)
 */
export async function getAllMappings(): Promise<DocumentMapping[]> {
  try {
    const db = await openDatabase()

    return new Promise((resolve, reject) => {
      const transaction = db.transaction(STORE_NAME, 'readonly')
      const store = transaction.objectStore(STORE_NAME)
      const request = store.getAll()

      request.onerror = () => {
        console.error('Failed to get all document mappings:', request.error)
        reject(request.error)
      }

      request.onsuccess = () => {
        resolve(request.result as DocumentMapping[])
      }
    })
  } catch (error) {
    console.error('Error getting all mappings:', error)
    return []
  }
}

/**
 * Clean up old document mappings (documents not accessed in X days)
 * This helps manage storage quota
 */
export async function cleanupOldMappings(maxAgeDays: number = 30): Promise<number> {
  try {
    const db = await openDatabase()
    const cutoffTime = Date.now() - (maxAgeDays * 24 * 60 * 60 * 1000)

    return new Promise((resolve, reject) => {
      const transaction = db.transaction(STORE_NAME, 'readwrite')
      const store = transaction.objectStore(STORE_NAME)
      const index = store.index('lastAccessedAt')
      const range = IDBKeyRange.upperBound(cutoffTime)
      const request = index.openCursor(range)

      let deletedCount = 0

      request.onerror = () => {
        console.error('Failed to cleanup old mappings:', request.error)
        reject(request.error)
      }

      request.onsuccess = (event) => {
        const cursor = (event.target as IDBRequest).result
        if (cursor) {
          cursor.delete()
          deletedCount++
          cursor.continue()
        } else {
          console.log(`🧹 Cleaned up ${deletedCount} old document mappings`)
          resolve(deletedCount)
        }
      }
    })
  } catch (error) {
    console.error('Error cleaning up old mappings:', error)
    return 0
  }
}

@ -3,8 +3,9 @@ import { TLStoreSnapshot } from "@tldraw/tldraw"
|
|||
import { CloudflareNetworkAdapter } from "./CloudflareAdapter"
|
||||
import { useAutomergeStoreV2, useAutomergePresence } from "./useAutomergeStoreV2"
|
||||
import { TLStoreWithStatus } from "@tldraw/tldraw"
|
||||
import { Repo } from "@automerge/automerge-repo"
|
||||
import { DocHandle } from "@automerge/automerge-repo"
|
||||
import { Repo, DocHandle, DocumentId } from "@automerge/automerge-repo"
|
||||
import { IndexedDBStorageAdapter } from "@automerge/automerge-repo-storage-indexeddb"
|
||||
import { getDocumentId, saveDocumentId } from "./documentIdMapping"
|
||||
|
||||
interface AutomergeSyncConfig {
|
||||
uri: string
|
||||
|
|
@ -17,9 +18,23 @@ interface AutomergeSyncConfig {
|
|||
}
|
||||
}
|
||||
|
||||
export function useAutomergeSync(config: AutomergeSyncConfig): TLStoreWithStatus & { handle: DocHandle<any> | null; presence: ReturnType<typeof useAutomergePresence> } {
|
||||
// Track online/offline status
|
||||
export type ConnectionStatus = 'online' | 'offline' | 'syncing'
|
||||
|
||||
// Return type for useAutomergeSync - extends TLStoreWithStatus with offline capabilities
|
||||
export interface AutomergeSyncResult {
|
||||
store?: TLStoreWithStatus['store']
|
||||
status: TLStoreWithStatus['status']
|
||||
error?: TLStoreWithStatus['error']
|
||||
handle: DocHandle<any> | null
|
||||
presence: ReturnType<typeof useAutomergePresence>
|
||||
connectionStatus: ConnectionStatus
|
||||
isOfflineReady: boolean
|
||||
}
|
||||
|
||||
export function useAutomergeSync(config: AutomergeSyncConfig): AutomergeSyncResult {
|
||||
const { uri, user } = config
|
||||
|
||||
|
||||
// Extract roomId from URI (e.g., "https://worker.com/connect/room123" -> "room123")
|
||||
const roomId = useMemo(() => {
|
||||
const match = uri.match(/\/connect\/([^\/]+)$/)
|
||||
|
|
@ -33,14 +48,18 @@ export function useAutomergeSync(config: AutomergeSyncConfig): TLStoreWithStatus
|
|||
|
||||
const [handle, setHandle] = useState<any>(null)
|
||||
const [isLoading, setIsLoading] = useState(true)
|
||||
const [connectionStatus, setConnectionStatus] = useState<ConnectionStatus>(
|
||||
typeof navigator !== 'undefined' && navigator.onLine ? 'online' : 'offline'
|
||||
)
|
||||
const [isOfflineReady, setIsOfflineReady] = useState(false)
|
||||
const handleRef = useRef<any>(null)
|
||||
const storeRef = useRef<any>(null)
|
||||
|
||||
|
||||
// Update refs when handle/store changes
|
||||
useEffect(() => {
|
||||
handleRef.current = handle
|
||||
}, [handle])
|
||||
|
||||
|
||||
// JSON sync is deprecated - all data now flows through Automerge sync protocol
|
||||
// Old format content is converted server-side and saved to R2 in Automerge format
|
||||
// This callback is kept for backwards compatibility but should not be used
|
||||
|
|
@ -49,92 +68,199 @@ export function useAutomergeSync(config: AutomergeSyncConfig): TLStoreWithStatus
|
|||
// Don't apply JSON sync - let Automerge sync handle everything
|
||||
return
|
||||
}, [])
|
||||
|
||||
|
||||
// Create Repo with both network AND storage adapters for offline support
|
||||
const [repo] = useState(() => {
|
||||
const adapter = new CloudflareNetworkAdapter(workerUrl, roomId, applyJsonSyncData)
|
||||
const networkAdapter = new CloudflareNetworkAdapter(workerUrl, roomId, applyJsonSyncData)
|
||||
const storageAdapter = new IndexedDBStorageAdapter()
|
||||
|
||||
console.log('🗄️ Creating Automerge Repo with IndexedDB storage adapter for offline support')
|
||||
|
||||
return new Repo({
|
||||
network: [adapter]
|
||||
network: [networkAdapter],
|
||||
storage: storageAdapter
|
||||
})
|
||||
})
|
||||
|
||||
// Initialize Automerge document handle
|
||||
// Listen for online/offline events
|
||||
useEffect(() => {
|
||||
const handleOnline = () => {
|
||||
console.log('🌐 Network: Back online')
|
||||
setConnectionStatus('syncing')
|
||||
// The network adapter will automatically reconnect and sync
|
||||
// After a short delay, assume we're synced if no errors
|
||||
setTimeout(() => {
|
||||
setConnectionStatus('online')
|
||||
}, 2000)
|
||||
}
|
||||
|
||||
const handleOffline = () => {
|
||||
console.log('📴 Network: Gone offline')
|
||||
setConnectionStatus('offline')
|
||||
}
|
||||
|
||||
window.addEventListener('online', handleOnline)
|
||||
window.addEventListener('offline', handleOffline)
|
||||
|
||||
return () => {
|
||||
window.removeEventListener('online', handleOnline)
|
||||
window.removeEventListener('offline', handleOffline)
|
||||
}
|
||||
}, [])
|
||||
|
||||
// Initialize Automerge document handle with offline-first approach
|
||||
useEffect(() => {
|
||||
let mounted = true
|
||||
|
||||
const initializeHandle = async () => {
|
||||
try {
|
||||
console.log("🔌 Initializing Automerge Repo with NetworkAdapter for room:", roomId)
|
||||
|
||||
if (mounted) {
|
||||
// CRITICAL: Create a new Automerge document (repo.create() generates a proper document ID)
|
||||
// We can't use repo.find() with a custom ID because Automerge requires specific document ID formats
|
||||
// Instead, we'll create a new document and load initial data from the server
|
||||
const handle = repo.create()
|
||||
|
||||
console.log("Created Automerge handle via Repo:", {
|
||||
handleId: handle.documentId,
|
||||
isReady: handle.isReady()
|
||||
})
|
||||
|
||||
// Wait for the handle to be ready
|
||||
console.log("🔌 Initializing Automerge Repo with offline support for room:", roomId)
|
||||
|
||||
if (!mounted) return
|
||||
|
||||
let handle: DocHandle<any>
|
||||
let existingDocId: string | null = null
|
||||
let loadedFromLocal = false
|
||||
|
||||
// Step 1: Check if we have a stored document ID for this room
|
||||
try {
|
||||
existingDocId = await getDocumentId(roomId)
|
||||
if (existingDocId) {
|
||||
console.log(`📦 Found existing document ID in IndexedDB: ${existingDocId}`)
|
||||
}
|
||||
} catch (error) {
|
||||
console.warn('⚠️ Could not check IndexedDB for existing document:', error)
|
||||
}
|
||||
|
||||
// Step 2: Try to load from local storage first (offline-first approach)
|
||||
if (existingDocId) {
|
||||
try {
|
||||
console.log(`🔍 Attempting to load document from IndexedDB: ${existingDocId}`)
|
||||
// Use repo.find() which will check IndexedDB storage adapter first
|
||||
// In automerge-repo v2.x, find() can return a Promise
|
||||
const foundHandle = await Promise.resolve(repo.find(existingDocId as DocumentId))
|
||||
handle = foundHandle as DocHandle<any>
|
||||
|
||||
// Wait for the handle to be ready (will load from IndexedDB if available)
|
||||
await handle.whenReady()
|
||||
|
||||
const localDoc = handle.doc() as any
|
||||
const localRecordCount = localDoc?.store ? Object.keys(localDoc.store).length : 0
|
||||
|
||||
if (localRecordCount > 0) {
|
||||
console.log(`✅ Loaded ${localRecordCount} records from IndexedDB (offline-first)`)
|
||||
loadedFromLocal = true
|
||||
setIsOfflineReady(true)
|
||||
} else {
|
||||
console.log('📦 Document exists in IndexedDB but is empty')
|
||||
}
|
||||
} catch (error) {
|
||||
console.warn('⚠️ Could not load from IndexedDB, will create new document:', error)
|
||||
existingDocId = null
|
||||
}
|
||||
}
|
||||
|
||||
// Step 3: If no local document, create a new one
|
||||
if (!existingDocId || !handle!) {
|
||||
console.log('📝 Creating new Automerge document')
|
||||
handle = repo.create()
|
||||
|
||||
// Save the mapping for future offline access
|
||||
await saveDocumentId(roomId, handle.documentId)
|
||||
console.log(`📝 Saved new document mapping: ${roomId} → ${handle.documentId}`)
|
||||
|
||||
await handle.whenReady()
|
||||
|
||||
// CRITICAL: Always load initial data from the server
|
||||
// The server stores documents in R2 as JSON, so we need to load and initialize the Automerge document
|
||||
console.log("📥 Loading initial data from server...")
|
||||
}
|
||||
|
||||
// Step 4: Sync with server if online (background sync)
|
||||
if (navigator.onLine) {
|
||||
setConnectionStatus('syncing')
|
||||
console.log("📥 Syncing with server...")
|
||||
|
||||
try {
|
||||
const response = await fetch(`${workerUrl}/room/${roomId}`)
|
||||
if (response.ok) {
|
||||
const serverDoc = await response.json() as TLStoreSnapshot
|
||||
const serverShapeCount = serverDoc.store ? Object.values(serverDoc.store).filter((r: any) => r?.typeName === 'shape').length : 0
|
||||
const serverRecordCount = Object.keys(serverDoc.store || {}).length
|
||||
|
||||
console.log(`📥 Loaded document from server: ${serverRecordCount} records, ${serverShapeCount} shapes`)
|
||||
|
||||
// Initialize the Automerge document with server data
|
||||
const serverShapeCount = serverDoc.store
|
||||
? Object.values(serverDoc.store).filter((r: any) => r?.typeName === 'shape').length
|
||||
: 0
|
||||
|
||||
console.log(`📥 Server has: ${serverRecordCount} records, ${serverShapeCount} shapes`)
|
||||
|
||||
// Merge server data into local document
|
||||
if (serverDoc.store && serverRecordCount > 0) {
|
||||
handle.change((doc: any) => {
|
||||
// Initialize store if it doesn't exist
|
||||
if (!doc.store) {
|
||||
doc.store = {}
|
||||
}
|
||||
// Copy all records from server document
|
||||
Object.entries(serverDoc.store).forEach(([id, record]) => {
|
||||
doc.store[id] = record
|
||||
const localDoc = handle.doc() as any
|
||||
const localRecordCount = localDoc?.store ? Object.keys(localDoc.store).length : 0
|
||||
|
||||
// If server has more data or local is empty, merge server data
|
||||
if (serverRecordCount > 0) {
|
||||
handle.change((doc: any) => {
|
||||
if (!doc.store) {
|
||||
doc.store = {}
|
||||
}
|
||||
// Merge server records (Automerge will handle conflicts)
|
||||
Object.entries(serverDoc.store).forEach(([id, record]) => {
|
||||
// Only add if not already present locally, or if this is first load
|
||||
if (!doc.store[id] || !loadedFromLocal) {
|
||||
doc.store[id] = record
|
||||
}
|
||||
})
|
||||
})
|
||||
})
|
||||
|
||||
console.log(`✅ Initialized Automerge document with ${serverRecordCount} records from server`)
|
||||
} else {
|
||||
console.log("📥 Server document is empty - starting with empty Automerge document")
|
||||
|
||||
const mergedDoc = handle.doc() as any
|
||||
const mergedCount = mergedDoc?.store ? Object.keys(mergedDoc.store).length : 0
|
||||
console.log(`✅ Merged server data. Total records: ${mergedCount}`)
|
||||
}
|
||||
} else if (response.status !== 404) {
|
||||
console.log("📥 Server document is empty")
|
||||
}
|
||||
|
||||
setConnectionStatus('online')
|
||||
} else if (response.status === 404) {
|
||||
console.log("📥 No document found on server (404) - starting with empty document")
|
||||
console.log("📥 No document on server yet - local document will be synced when saved")
|
||||
setConnectionStatus('online')
|
||||
} else {
|
||||
console.warn(`⚠️ Failed to load document from server: ${response.status} ${response.statusText}`)
|
||||
console.warn(`⚠️ Server sync failed: ${response.status}`)
|
||||
setConnectionStatus(loadedFromLocal ? 'offline' : 'online')
|
||||
}
|
||||
} catch (error) {
|
||||
console.error("❌ Error loading initial document from server:", error)
|
||||
// Continue anyway - user can still create new content
|
||||
console.error("❌ Error syncing with server:", error)
|
||||
// If we loaded from local, we're still functional in offline mode
|
||||
setConnectionStatus(loadedFromLocal ? 'offline' : 'online')
|
||||
}
|
||||
|
||||
const finalDoc = handle.doc() as any
|
||||
const finalStoreKeys = finalDoc?.store ? Object.keys(finalDoc.store).length : 0
|
||||
const finalShapeCount = finalDoc?.store ? Object.values(finalDoc.store).filter((r: any) => r?.typeName === 'shape').length : 0
|
||||
|
||||
console.log("Automerge handle initialized:", {
|
||||
hasDoc: !!finalDoc,
|
||||
storeKeys: finalStoreKeys,
|
||||
shapeCount: finalShapeCount
|
||||
})
|
||||
|
||||
} else {
|
||||
console.log("📴 Offline - using local data only")
|
||||
setConnectionStatus('offline')
|
||||
}
|
||||
|
||||
// Mark as offline-ready once we have any document loaded
|
||||
setIsOfflineReady(true)
|
||||
|
||||
const finalDoc = handle.doc() as any
|
||||
const finalStoreKeys = finalDoc?.store ? Object.keys(finalDoc.store).length : 0
|
||||
const finalShapeCount = finalDoc?.store
|
||||
? Object.values(finalDoc.store).filter((r: any) => r?.typeName === 'shape').length
|
||||
: 0
|
||||
|
||||
console.log("✅ Automerge handle initialized:", {
|
||||
documentId: handle.documentId,
|
||||
hasDoc: !!finalDoc,
|
||||
storeKeys: finalStoreKeys,
|
||||
shapeCount: finalShapeCount,
|
||||
loadedFromLocal,
|
||||
isOnline: navigator.onLine
|
||||
})
|
||||
|
||||
if (mounted) {
|
||||
setHandle(handle)
|
||||
setIsLoading(false)
|
||||
}
|
||||
} catch (error) {
|
||||
console.error("Error initializing Automerge handle:", error)
|
||||
console.error("❌ Error initializing Automerge handle:", error)
|
||||
if (mounted) {
|
||||
setIsLoading(false)
|
||||
setConnectionStatus('offline')
|
||||
}
|
||||
}
|
||||
}
|
||||
|
|
@ -144,7 +270,7 @@ export function useAutomergeSync(config: AutomergeSyncConfig): TLStoreWithStatus
|
|||
return () => {
|
||||
mounted = false
|
||||
}
|
||||
}, [repo, roomId])
|
||||
}, [repo, roomId, workerUrl])
|
||||
|
||||
// Auto-save to Cloudflare on every change (with debouncing to prevent excessive calls)
|
||||
// CRITICAL: This ensures new shapes are persisted to R2
|
||||
|
|
@ -279,6 +405,8 @@ export function useAutomergeSync(config: AutomergeSyncConfig): TLStoreWithStatus
|
|||
return {
|
||||
...storeWithStatus,
|
||||
handle,
|
||||
presence
|
||||
presence,
|
||||
connectionStatus,
|
||||
isOfflineReady
|
||||
}
|
||||
}
|
||||
|
|
|
|||
|
|
@@ -0,0 +1,468 @@
// Simple test component for Google Data Sovereignty OAuth flow
import { useState, useEffect } from 'react';
import {
  initiateGoogleAuth,
  handleGoogleCallback,
  parseCallbackParams,
  isGoogleAuthenticated,
  getGrantedScopes,
  generateMasterKey,
  importGmail,
  importDrive,
  importPhotos,
  importCalendar,
  gmailStore,
  driveStore,
  photosStore,
  calendarStore,
  deleteDatabase,
  createShareService,
  type GoogleService,
  type ImportProgress,
  type ShareableItem
} from '../lib/google';

export function GoogleDataTest() {
  const [status, setStatus] = useState<string>('Initializing...');
  const [isAuthed, setIsAuthed] = useState(false);
  const [scopes, setScopes] = useState<string[]>([]);
  const [masterKey, setMasterKey] = useState<CryptoKey | null>(null);
  const [error, setError] = useState<string | null>(null);
  const [importProgress, setImportProgress] = useState<ImportProgress | null>(null);
  const [storedCounts, setStoredCounts] = useState<{gmail: number; drive: number; photos: number; calendar: number}>({
    gmail: 0, drive: 0, photos: 0, calendar: 0
  });
  const [logs, setLogs] = useState<string[]>([]);
  const [viewingService, setViewingService] = useState<GoogleService | null>(null);
  const [viewItems, setViewItems] = useState<ShareableItem[]>([]);

  const addLog = (msg: string) => {
    console.log(msg);
    setLogs(prev => [...prev.slice(-20), `${new Date().toLocaleTimeString()}: ${msg}`]);
  };

  // Initialize on mount
  useEffect(() => {
    initializeService();
  }, []);

  // Check for OAuth callback - wait for masterKey to be ready
  useEffect(() => {
    const url = window.location.href;
    if (url.includes('/oauth/google/callback') && masterKey) {
      handleCallback(url);
    }
  }, [masterKey]); // Re-run when masterKey becomes available

  async function initializeService() {
    try {
      // Generate or load master key
      const key = await generateMasterKey();
      setMasterKey(key);

      // Check if already authenticated
      const authed = await isGoogleAuthenticated();
      setIsAuthed(authed);

      if (authed) {
        const grantedScopes = await getGrantedScopes();
        setScopes(grantedScopes);
        setStatus('Authenticated with Google');
      } else {
        setStatus('Ready to connect to Google');
      }
    } catch (err) {
      setError(err instanceof Error ? err.message : 'Initialization failed');
      setStatus('Error');
    }
  }

  async function handleCallback(url: string) {
    setStatus('Processing OAuth callback...');

    const params = parseCallbackParams(url);

    if (params.error) {
      setError(`OAuth error: ${params.error_description || params.error}`);
      setStatus('Error');
      return;
    }

    if (params.code && params.state && masterKey) {
      const result = await handleGoogleCallback(params.code, params.state, masterKey);

      if (result.success) {
        setIsAuthed(true);
        setScopes(result.scopes);
        setStatus('Successfully connected to Google!');
        // Clean up URL
        window.history.replaceState({}, '', '/');
      } else {
        setError(result.error || 'Callback failed');
        setStatus('Error');
      }
    }
  }

  async function connectGoogle() {
    setStatus('Redirecting to Google...');
    const services: GoogleService[] = ['gmail', 'drive', 'photos', 'calendar'];
    await initiateGoogleAuth(services);
  }

  async function resetAndReconnect() {
    addLog('Resetting: Clearing all data...');
    try {
      await deleteDatabase();
      addLog('Resetting: Database cleared');
      setIsAuthed(false);
      setScopes([]);
      setStoredCounts({ gmail: 0, drive: 0, photos: 0, calendar: 0 });
      setError(null);
      setStatus('Database cleared. Click Connect to re-authenticate.');
      addLog('Resetting: Done. Please re-connect to Google.');
    } catch (err) {
      addLog(`Resetting: ERROR - ${err}`);
    }
  }

  async function viewData(service: GoogleService) {
    if (!masterKey) return;
    addLog(`Viewing ${service} data...`);
    try {
      const shareService = createShareService(masterKey);
      const items = await shareService.listShareableItems(service, 20);
      addLog(`Found ${items.length} ${service} items`);
      setViewItems(items);
      setViewingService(service);
    } catch (err) {
      addLog(`View error: ${err}`);
      setError(err instanceof Error ? err.message : String(err));
    }
  }

  async function refreshCounts() {
    const [gmail, drive, photos, calendar] = await Promise.all([
      gmailStore.count(),
      driveStore.count(),
      photosStore.count(),
      calendarStore.count()
    ]);
    setStoredCounts({ gmail, drive, photos, calendar });
  }

  async function testImportGmail() {
    addLog('Gmail: Starting...');
    if (!masterKey) {
      addLog('Gmail: ERROR - No master key');
      setError('No master key available');
      return;
    }
    setError(null);
    setImportProgress(null);
    setStatus('Importing Gmail (max 10 messages)...');
    try {
      addLog('Gmail: Calling importGmail...');
      const result = await importGmail(masterKey, {
        maxMessages: 10,
        onProgress: (p) => {
          addLog(`Gmail: Progress ${p.imported}/${p.total} - ${p.status}`);
          setImportProgress(p);
        }
      });
      addLog(`Gmail: Result - ${result.status}, ${result.imported} items`);
      setImportProgress(result);
      if (result.status === 'error') {
        addLog(`Gmail: ERROR - ${result.errorMessage}`);
        setError(result.errorMessage || 'Unknown error');
        setStatus('Gmail import failed');
      } else {
        setStatus(`Gmail import ${result.status}: ${result.imported} messages`);
      }
      await refreshCounts();
    } catch (err) {
      const errorMsg = err instanceof Error ? `${err.name}: ${err.message}` : String(err);
      addLog(`Gmail: EXCEPTION - ${errorMsg}`);
      setError(errorMsg);
      setStatus('Gmail import error');
    }
  }

  async function testImportDrive() {
    if (!masterKey) return;
    setError(null);
    setStatus('Importing Drive (max 10 files)...');
    try {
      const result = await importDrive(masterKey, {
        maxFiles: 10,
        onProgress: (p) => setImportProgress(p)
      });
      setImportProgress(result);
      setStatus(`Drive import ${result.status}: ${result.imported} files`);
      await refreshCounts();
    } catch (err) {
      setError(err instanceof Error ? err.message : 'Import failed');
      setStatus('Error');
    }
  }

  async function testImportPhotos() {
    if (!masterKey) return;
    setError(null);
    setStatus('Importing Photos (max 10 thumbnails)...');
    try {
      const result = await importPhotos(masterKey, {
        maxPhotos: 10,
        onProgress: (p) => setImportProgress(p)
      });
      setImportProgress(result);
      setStatus(`Photos import ${result.status}: ${result.imported} photos`);
      await refreshCounts();
    } catch (err) {
      setError(err instanceof Error ? err.message : 'Import failed');
      setStatus('Error');
    }
  }

  async function testImportCalendar() {
    if (!masterKey) return;
    setError(null);
    setStatus('Importing Calendar (max 20 events)...');
    try {
      const result = await importCalendar(masterKey, {
        maxEvents: 20,
        onProgress: (p) => setImportProgress(p)
      });
      setImportProgress(result);
      setStatus(`Calendar import ${result.status}: ${result.imported} events`);
      await refreshCounts();
    } catch (err) {
      setError(err instanceof Error ? err.message : 'Import failed');
      setStatus('Error');
    }
  }

  const buttonStyle = {
    padding: '10px 16px',
    fontSize: '14px',
    background: '#1a73e8',
    color: 'white',
    border: 'none',
    borderRadius: '4px',
    cursor: 'pointer',
    marginRight: '10px',
    marginBottom: '10px'
  };

  return (
    <div style={{
      padding: '20px',
      fontFamily: 'system-ui, sans-serif',
      maxWidth: '600px',
      margin: '40px auto'
    }}>
      <h1>Google Data Sovereignty Test</h1>

      <div style={{
        padding: '15px',
        background: error ? '#fee' : '#f0f0f0',
        borderRadius: '8px',
        marginBottom: '20px'
      }}>
        <strong>Status:</strong> {status}
        {error && (
          <div style={{
            color: 'red',
            marginTop: '10px',
            padding: '10px',
            background: '#fdd',
            borderRadius: '4px',
            fontFamily: 'monospace',
            fontSize: '12px',
            whiteSpace: 'pre-wrap',
            wordBreak: 'break-all'
          }}>
            <strong>Error:</strong> {error}
          </div>
        )}
      </div>

      {!isAuthed ? (
        <button
          onClick={connectGoogle}
          style={{
            padding: '12px 24px',
            fontSize: '16px',
            background: '#4285f4',
            color: 'white',
            border: 'none',
            borderRadius: '4px',
            cursor: 'pointer'
          }}
        >
          Connect Google Account
        </button>
      ) : (
        <div>
          <h3 style={{ color: 'green' }}>Connected!</h3>
          <p><strong>Granted scopes:</strong></p>
          <ul>
            {scopes.map(scope => (
              <li key={scope} style={{ fontSize: '12px', fontFamily: 'monospace' }}>
                {scope.replace('https://www.googleapis.com/auth/', '')}
              </li>
            ))}
          </ul>

          <h3>Test Import (Small Batches)</h3>
          <div style={{ marginBottom: '20px' }}>
            <button style={buttonStyle} onClick={testImportGmail}>
              Import Gmail (10)
            </button>
            <button style={buttonStyle} onClick={testImportDrive}>
              Import Drive (10)
            </button>
            <button style={buttonStyle} onClick={testImportPhotos}>
              Import Photos (10)
            </button>
            <button style={buttonStyle} onClick={testImportCalendar}>
              Import Calendar (20)
            </button>
          </div>

          {importProgress && (
            <div style={{
              padding: '10px',
              background: importProgress.status === 'error' ? '#fee' :
                importProgress.status === 'completed' ? '#efe' : '#fff3e0',
              borderRadius: '4px',
              marginBottom: '15px'
            }}>
              <strong>{importProgress.service}:</strong> {importProgress.status}
              {importProgress.status === 'importing' && (
                <span> - {importProgress.imported}/{importProgress.total}</span>
              )}
              {importProgress.status === 'completed' && (
                <span> - {importProgress.imported} items imported</span>
              )}
              {importProgress.errorMessage && (
                <div style={{ color: 'red', marginTop: '5px' }}>{importProgress.errorMessage}</div>
              )}
            </div>
          )}

          <h3>Stored Data (Encrypted in IndexedDB)</h3>
          <table style={{ width: '100%', borderCollapse: 'collapse' }}>
            <tbody>
              <tr>
                <td style={{ padding: '8px', borderBottom: '1px solid #ddd' }}>Gmail</td>
                <td style={{ padding: '8px', borderBottom: '1px solid #ddd', textAlign: 'right' }}>{storedCounts.gmail} messages</td>
                <td style={{ padding: '8px', borderBottom: '1px solid #ddd', textAlign: 'right' }}>
                  {storedCounts.gmail > 0 && <button onClick={() => viewData('gmail')} style={{ fontSize: '12px', padding: '4px 8px' }}>View</button>}
                </td>
              </tr>
              <tr>
                <td style={{ padding: '8px', borderBottom: '1px solid #ddd' }}>Drive</td>
                <td style={{ padding: '8px', borderBottom: '1px solid #ddd', textAlign: 'right' }}>{storedCounts.drive} files</td>
                <td style={{ padding: '8px', borderBottom: '1px solid #ddd', textAlign: 'right' }}>
                  {storedCounts.drive > 0 && <button onClick={() => viewData('drive')} style={{ fontSize: '12px', padding: '4px 8px' }}>View</button>}
                </td>
              </tr>
              <tr>
                <td style={{ padding: '8px', borderBottom: '1px solid #ddd' }}>Photos</td>
                <td style={{ padding: '8px', borderBottom: '1px solid #ddd', textAlign: 'right' }}>{storedCounts.photos} photos</td>
                <td style={{ padding: '8px', borderBottom: '1px solid #ddd', textAlign: 'right' }}>
                  {storedCounts.photos > 0 && <button onClick={() => viewData('photos')} style={{ fontSize: '12px', padding: '4px 8px' }}>View</button>}
                </td>
              </tr>
              <tr>
                <td style={{ padding: '8px', borderBottom: '1px solid #ddd' }}>Calendar</td>
                <td style={{ padding: '8px', borderBottom: '1px solid #ddd', textAlign: 'right' }}>{storedCounts.calendar} events</td>
                <td style={{ padding: '8px', borderBottom: '1px solid #ddd', textAlign: 'right' }}>
                  {storedCounts.calendar > 0 && <button onClick={() => viewData('calendar')} style={{ fontSize: '12px', padding: '4px 8px' }}>View</button>}
                </td>
              </tr>
            </tbody>
          </table>

          {viewingService && viewItems.length > 0 && (
            <div style={{ marginTop: '20px' }}>
              <h4>
                {viewingService.charAt(0).toUpperCase() + viewingService.slice(1)} Items (Decrypted)
                <button onClick={() => { setViewingService(null); setViewItems([]); }} style={{ marginLeft: '10px', fontSize: '12px' }}>Close</button>
              </h4>
              <div style={{ maxHeight: '300px', overflow: 'auto', border: '1px solid #ddd', borderRadius: '4px' }}>
                {viewItems.map((item, i) => (
                  <div key={item.id} style={{
                    padding: '10px',
                    borderBottom: '1px solid #eee',
                    background: i % 2 === 0 ? '#fff' : '#f9f9f9'
                  }}>
                    <strong>{item.title}</strong>
                    <div style={{ fontSize: '12px', color: '#666' }}>
                      {new Date(item.date).toLocaleString()}
                    </div>
                    {item.preview && (
                      <div style={{ fontSize: '12px', color: '#888', marginTop: '4px' }}>
                        {item.preview.substring(0, 100)}...
                      </div>
                    )}
                  </div>
                ))}
              </div>
            </div>
          )}
          <button
            onClick={refreshCounts}
            style={{ ...buttonStyle, background: '#666', marginTop: '10px' }}
          >
            Refresh Counts
          </button>
          <button
            onClick={resetAndReconnect}
            style={{ ...buttonStyle, background: '#c00', marginTop: '10px' }}
          >
            Reset & Clear All Data
          </button>
        </div>
      )}

      <hr style={{ margin: '30px 0' }} />

      <h3>Activity Log</h3>
      <div style={{
        background: '#1a1a1a',
        color: '#0f0',
        padding: '10px',
        borderRadius: '4px',
        fontFamily: 'monospace',
        fontSize: '11px',
        height: '150px',
        overflow: 'auto',
        marginBottom: '20px'
      }}>
        {logs.length === 0 ? (
          <span style={{ color: '#666' }}>Click an import button to see activity...</span>
        ) : (
          logs.map((log, i) => <div key={i}>{log}</div>)
        )}
      </div>

      <details>
        <summary style={{ cursor: 'pointer' }}>Debug Info</summary>
        <pre style={{ fontSize: '11px', background: '#f5f5f5', padding: '10px', overflow: 'auto' }}>
          {JSON.stringify({
            isAuthed,
            hasMasterKey: !!masterKey,
            scopeCount: scopes.length,
            storedCounts,
            importProgress,
            currentUrl: typeof window !== 'undefined' ? window.location.href : 'N/A'
          }, null, 2)}
        </pre>
      </details>
    </div>
  );
}

export default GoogleDataTest;

@@ -0,0 +1,66 @@
import { ConnectionStatus } from '@/automerge/useAutomergeSyncRepo'

interface OfflineIndicatorProps {
  connectionStatus: ConnectionStatus
  isOfflineReady: boolean
}

export function OfflineIndicator({ connectionStatus, isOfflineReady }: OfflineIndicatorProps) {
  // Don't show indicator when online and everything is working normally
  if (connectionStatus === 'online') {
    return null
  }

  const getStatusConfig = () => {
    switch (connectionStatus) {
      case 'offline':
        return {
          icon: '📴',
          text: isOfflineReady ? 'Offline (changes saved locally)' : 'Offline',
          bgColor: '#fef3c7', // warm yellow
          textColor: '#92400e',
          borderColor: '#f59e0b'
        }
      case 'syncing':
        return {
          icon: '🔄',
          text: 'Syncing...',
          bgColor: '#dbeafe', // light blue
          textColor: '#1e40af',
          borderColor: '#3b82f6'
        }
      default:
        return null
    }
  }

  const config = getStatusConfig()
  if (!config) return null

  return (
    <div
      style={{
        position: 'fixed',
        bottom: '16px',
        left: '50%',
        transform: 'translateX(-50%)',
        backgroundColor: config.bgColor,
        color: config.textColor,
        padding: '8px 16px',
        borderRadius: '8px',
        border: `1px solid ${config.borderColor}`,
        boxShadow: '0 2px 8px rgba(0, 0, 0, 0.15)',
        display: 'flex',
        alignItems: 'center',
        gap: '8px',
        zIndex: 9999,
        fontSize: '14px',
        fontFamily: 'system-ui, -apple-system, sans-serif',
        pointerEvents: 'none'
      }}
    >
      <span style={{ fontSize: '16px' }}>{config.icon}</span>
      <span>{config.text}</span>
    </div>
  )
}

@ -0,0 +1,356 @@
|
|||
// R2 encrypted backup service
|
||||
// Data is already encrypted in IndexedDB, uploaded as-is to R2
|
||||
|
||||
import type {
|
||||
GoogleService,
|
||||
EncryptedEmailStore,
|
||||
EncryptedDriveDocument,
|
||||
EncryptedPhotoReference,
|
||||
EncryptedCalendarEvent
|
||||
} from './types';
|
||||
import { exportAllData, clearServiceData } from './database';
|
||||
import {
|
||||
encryptData,
|
||||
decryptData,
|
||||
deriveServiceKey,
|
||||
encryptMasterKeyWithPassword,
|
||||
decryptMasterKeyWithPassword,
|
||||
base64UrlEncode,
|
||||
base64UrlDecode
|
||||
} from './encryption';
|
||||
|
||||
// Backup metadata stored with the backup
|
||||
export interface BackupMetadata {
|
||||
id: string;
|
||||
createdAt: number;
|
||||
services: GoogleService[];
|
||||
itemCounts: {
|
||||
gmail: number;
|
||||
drive: number;
|
||||
photos: number;
|
||||
calendar: number;
|
||||
};
|
||||
sizeBytes: number;
|
||||
version: number;
|
||||
}
|
||||
|
||||
// Backup manifest (encrypted, stored in R2)
|
||||
interface BackupManifest {
|
||||
version: 1;
|
||||
createdAt: number;
|
||||
services: GoogleService[];
|
||||
itemCounts: {
|
||||
gmail: number;
|
||||
drive: number;
|
||||
photos: number;
|
||||
calendar: number;
|
||||
};
|
||||
checksum: string;
|
||||
}
|
||||
|
||||
// R2 backup service
|
||||
export class R2BackupService {
|
||||
private backupApiUrl: string;
|
||||
|
||||
constructor(
|
||||
private masterKey: CryptoKey,
|
||||
backupApiUrl?: string
|
||||
) {
|
||||
// Default to the canvas worker backup endpoint
|
||||
this.backupApiUrl = backupApiUrl || '/api/backup';
|
||||
}
|
||||
|
||||
// Create a backup of all Google data
|
||||
async createBackup(
|
||||
options: {
|
||||
services?: GoogleService[];
|
||||
onProgress?: (progress: { stage: string; percent: number }) => void;
|
||||
} = {}
|
||||
): Promise<BackupMetadata | null> {
|
||||
const services = options.services || ['gmail', 'drive', 'photos', 'calendar'];
|
||||
|
||||
try {
|
||||
options.onProgress?.({ stage: 'Gathering data', percent: 0 });
|
||||
|
||||
// Export all data from IndexedDB
|
||||
const data = await exportAllData();
|
||||
|
||||
// Filter to requested services
|
||||
const filteredData = {
|
||||
gmail: services.includes('gmail') ? data.gmail : [],
|
||||
drive: services.includes('drive') ? data.drive : [],
|
||||
photos: services.includes('photos') ? data.photos : [],
|
||||
calendar: services.includes('calendar') ? data.calendar : [],
|
||||
syncMetadata: data.syncMetadata.filter(m =>
|
||||
services.includes(m.service as GoogleService)
|
||||
),
|
||||
encryptionMeta: data.encryptionMeta
|
||||
};
|
||||
|
||||
options.onProgress?.({ stage: 'Preparing backup', percent: 20 });
|
||||
|
||||
// Create manifest
|
||||
const manifest: BackupManifest = {
|
||||
version: 1,
|
||||
createdAt: Date.now(),
|
||||
services,
|
||||
itemCounts: {
|
||||
gmail: filteredData.gmail.length,
|
||||
drive: filteredData.drive.length,
|
||||
photos: filteredData.photos.length,
|
||||
calendar: filteredData.calendar.length
|
||||
},
|
||||
checksum: await this.createChecksum(filteredData)
|
||||
};
|
||||
|
||||
options.onProgress?.({ stage: 'Encrypting manifest', percent: 30 });
|
||||
|
||||
// Encrypt manifest with backup key
|
||||
const backupKey = await deriveServiceKey(this.masterKey, 'backup');
|
||||
const encryptedManifest = await encryptData(
|
||||
JSON.stringify(manifest),
|
||||
backupKey
|
||||
);
|
||||
|
||||
options.onProgress?.({ stage: 'Serializing data', percent: 40 });
|
||||
|
||||
// Serialize data (already encrypted in IndexedDB). Caveat: JSON.stringify
// drops raw ArrayBuffer contents, so EncryptedData fields should be passed
// through serializeEncryptedData from './encryption' before this step.
const serializedData = JSON.stringify(filteredData);
|
||||
const dataBlob = new Blob([serializedData], { type: 'application/json' });
|
||||
|
||||
options.onProgress?.({ stage: 'Uploading backup', percent: 50 });
|
||||
|
||||
// Upload to R2 via worker
|
||||
const backupId = crypto.randomUUID();
|
||||
const response = await fetch(this.backupApiUrl, {
|
||||
method: 'POST',
|
||||
headers: {
|
||||
'Content-Type': 'application/octet-stream',
|
||||
'X-Backup-Id': backupId,
|
||||
'X-Backup-Manifest': base64UrlEncode(
|
||||
new Uint8Array(encryptedManifest.encrypted)
|
||||
),
|
||||
'X-Backup-Manifest-IV': base64UrlEncode(encryptedManifest.iv)
|
||||
},
|
||||
body: dataBlob
|
||||
});
|
||||
|
||||
if (!response.ok) {
|
||||
const error = await response.text();
|
||||
throw new Error(`Backup upload failed: ${error}`);
|
||||
}
|
||||
|
||||
options.onProgress?.({ stage: 'Complete', percent: 100 });
|
||||
|
||||
return {
|
||||
id: backupId,
|
||||
createdAt: manifest.createdAt,
|
||||
services,
|
||||
itemCounts: manifest.itemCounts,
|
||||
sizeBytes: dataBlob.size,
|
||||
version: manifest.version
|
||||
};
|
||||
|
||||
} catch (error) {
|
||||
console.error('Backup creation failed:', error);
|
||||
return null;
|
||||
}
|
||||
}
|
||||
|
||||
// List available backups
|
||||
async listBackups(): Promise<BackupMetadata[]> {
|
||||
try {
|
||||
const response = await fetch(`${this.backupApiUrl}/list`, {
|
||||
method: 'GET'
|
||||
});
|
||||
|
||||
if (!response.ok) {
|
||||
throw new Error('Failed to list backups');
|
||||
}
|
||||
|
||||
const backups = await response.json() as BackupMetadata[];
|
||||
return backups;
|
||||
|
||||
} catch (error) {
|
||||
console.error('List backups failed:', error);
|
||||
return [];
|
||||
}
|
||||
}
|
||||
|
||||
// Restore a backup
|
||||
async restoreBackup(
|
||||
backupId: string,
|
||||
options: {
|
||||
services?: GoogleService[];
|
||||
clearExisting?: boolean;
|
||||
onProgress?: (progress: { stage: string; percent: number }) => void;
|
||||
} = {}
|
||||
): Promise<boolean> {
|
||||
try {
|
||||
options.onProgress?.({ stage: 'Fetching backup', percent: 0 });
|
||||
|
||||
// Fetch backup from R2
|
||||
const response = await fetch(`${this.backupApiUrl}/${backupId}`, {
|
||||
method: 'GET'
|
||||
});
|
||||
|
||||
if (!response.ok) {
|
||||
throw new Error('Backup not found');
|
||||
}
|
||||
|
||||
options.onProgress?.({ stage: 'Parsing backup', percent: 20 });
|
||||
|
||||
// Get encrypted manifest from headers
|
||||
const manifestBase64 = response.headers.get('X-Backup-Manifest');
|
||||
const manifestIvBase64 = response.headers.get('X-Backup-Manifest-IV');
|
||||
|
||||
if (!manifestBase64 || !manifestIvBase64) {
|
||||
throw new Error('Invalid backup: missing manifest');
|
||||
}
|
||||
|
||||
// Decrypt manifest
|
||||
const backupKey = await deriveServiceKey(this.masterKey, 'backup');
|
||||
const manifestIv = base64UrlDecode(manifestIvBase64);
|
||||
const manifestEncrypted = base64UrlDecode(manifestBase64);
|
||||
const manifestData = await decryptData(
|
||||
{
|
||||
encrypted: manifestEncrypted.buffer as ArrayBuffer,
|
||||
iv: manifestIv
|
||||
},
|
||||
backupKey
|
||||
);
|
||||
const manifest: BackupManifest = JSON.parse(
|
||||
new TextDecoder().decode(manifestData)
|
||||
);
|
||||
|
||||
options.onProgress?.({ stage: 'Verifying backup', percent: 30 });
|
||||
|
||||
// Parse backup data
|
||||
interface BackupDataStructure {
|
||||
gmail?: EncryptedEmailStore[];
|
||||
drive?: EncryptedDriveDocument[];
|
||||
photos?: EncryptedPhotoReference[];
|
||||
calendar?: EncryptedCalendarEvent[];
|
||||
}
|
||||
const backupData = await response.json() as BackupDataStructure;
|
||||
|
||||
// Verify checksum
|
||||
const checksum = await this.createChecksum(backupData);
|
||||
if (checksum !== manifest.checksum) {
|
||||
throw new Error('Backup verification failed: checksum mismatch');
|
||||
}
|
||||
|
||||
options.onProgress?.({ stage: 'Restoring data', percent: 50 });
|
||||
|
||||
// Clear existing data if requested
|
||||
const servicesToRestore = options.services || manifest.services;
|
||||
if (options.clearExisting) {
|
||||
for (const service of servicesToRestore) {
|
||||
await clearServiceData(service);
|
||||
}
|
||||
}
|
||||
|
||||
// Restore data to IndexedDB
|
||||
// Note: Data is already encrypted, just need to write it
|
||||
const { gmailStore, driveStore, photosStore, calendarStore } = await import('./database');
|
||||
|
||||
if (servicesToRestore.includes('gmail') && backupData.gmail?.length) {
|
||||
await gmailStore.putBatch(backupData.gmail);
|
||||
}
|
||||
if (servicesToRestore.includes('drive') && backupData.drive?.length) {
|
||||
await driveStore.putBatch(backupData.drive);
|
||||
}
|
||||
if (servicesToRestore.includes('photos') && backupData.photos?.length) {
|
||||
await photosStore.putBatch(backupData.photos);
|
||||
}
|
||||
if (servicesToRestore.includes('calendar') && backupData.calendar?.length) {
|
||||
await calendarStore.putBatch(backupData.calendar);
|
||||
}
|
||||
|
||||
options.onProgress?.({ stage: 'Complete', percent: 100 });
|
||||
|
||||
return true;
|
||||
|
||||
} catch (error) {
|
||||
console.error('Backup restore failed:', error);
|
||||
return false;
|
||||
}
|
||||
}
|
||||
|
||||
// Delete a backup
|
||||
async deleteBackup(backupId: string): Promise<boolean> {
|
||||
try {
|
||||
const response = await fetch(`${this.backupApiUrl}/${backupId}`, {
|
||||
method: 'DELETE'
|
||||
});
|
||||
|
||||
return response.ok;
|
||||
|
||||
} catch (error) {
|
||||
console.error('Delete backup failed:', error);
|
||||
return false;
|
||||
}
|
||||
}
|
||||
|
||||
// Create checksum for data verification
|
||||
private async createChecksum(data: unknown): Promise<string> {
|
||||
const serialized = JSON.stringify(data);
|
||||
const encoder = new TextEncoder();
|
||||
const dataBuffer = encoder.encode(serialized);
|
||||
const hashBuffer = await crypto.subtle.digest('SHA-256', dataBuffer);
|
||||
return base64UrlEncode(new Uint8Array(hashBuffer));
|
||||
}
|
||||
|
||||
// Export master key encrypted with password (for backup recovery)
|
||||
async exportMasterKeyBackup(password: string): Promise<{
|
||||
encryptedKey: string;
|
||||
salt: string;
|
||||
}> {
|
||||
const { encryptedKey, salt } = await encryptMasterKeyWithPassword(
|
||||
this.masterKey,
|
||||
password
|
||||
);
|
||||
|
||||
return {
|
||||
encryptedKey: base64UrlEncode(new Uint8Array(encryptedKey.encrypted)) +
|
||||
'.' + base64UrlEncode(encryptedKey.iv),
|
||||
salt: base64UrlEncode(salt)
|
||||
};
|
||||
}
|
||||
|
||||
// Import master key from password-protected backup
|
||||
static async importMasterKeyBackup(
|
||||
encryptedKeyString: string,
|
||||
salt: string,
|
||||
password: string
|
||||
): Promise<CryptoKey> {
|
||||
const [keyBase64, ivBase64] = encryptedKeyString.split('.');
|
||||
|
||||
const encryptedKey = {
|
||||
encrypted: base64UrlDecode(keyBase64).buffer as ArrayBuffer,
|
||||
iv: base64UrlDecode(ivBase64)
|
||||
};
|
||||
|
||||
return decryptMasterKeyWithPassword(
|
||||
encryptedKey,
|
||||
password,
|
||||
base64UrlDecode(salt)
|
||||
);
|
||||
}
|
||||
}
|
||||
|
||||
// Progress callback for backups
|
||||
export interface BackupProgress {
|
||||
service: 'gmail' | 'drive' | 'photos' | 'calendar' | 'all';
|
||||
status: 'idle' | 'backing_up' | 'restoring' | 'completed' | 'error';
|
||||
progress: number; // 0-100
|
||||
errorMessage?: string;
|
||||
}
|
||||
|
||||
// Convenience function
|
||||
export function createBackupService(
|
||||
masterKey: CryptoKey,
|
||||
backupApiUrl?: string
|
||||
): R2BackupService {
|
||||
return new R2BackupService(masterKey, backupApiUrl);
|
||||
}
|
||||
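// Usage sketch (assumption: the caller already holds the unlocked master key;
// nothing below is part of the worker API):
async function runBackupExample(masterKey: CryptoKey): Promise<void> {
  const service = createBackupService(masterKey);
  const result = await service.createBackup({
    services: ['gmail', 'calendar'],
    onProgress: ({ stage, percent }) => console.log(`${stage}: ${percent}%`)
  });
  if (result) {
    console.log(`Backup ${result.id} stored (${result.sizeBytes} bytes)`);
  }
}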
|
|
@@ -0,0 +1,567 @@
|
|||
// IndexedDB database for encrypted Google data storage
|
||||
// All data stored here is already encrypted client-side
|
||||
|
||||
import type {
|
||||
EncryptedEmailStore,
|
||||
EncryptedDriveDocument,
|
||||
EncryptedPhotoReference,
|
||||
EncryptedCalendarEvent,
|
||||
SyncMetadata,
|
||||
EncryptionMetadata,
|
||||
EncryptedTokens,
|
||||
GoogleService,
|
||||
StorageQuotaInfo
|
||||
} from './types';
|
||||
import { DB_STORES } from './types';
|
||||
|
||||
const DB_NAME = 'canvas-google-data';
|
||||
const DB_VERSION = 1;
|
||||
|
||||
let dbInstance: IDBDatabase | null = null;
|
||||
|
||||
// Open or create the database
|
||||
export async function openDatabase(): Promise<IDBDatabase> {
|
||||
if (dbInstance) {
|
||||
return dbInstance;
|
||||
}
|
||||
|
||||
return new Promise((resolve, reject) => {
|
||||
const request = indexedDB.open(DB_NAME, DB_VERSION);
|
||||
|
||||
request.onerror = () => {
|
||||
console.error('Failed to open Google data database:', request.error);
|
||||
reject(request.error);
|
||||
};
|
||||
|
||||
request.onsuccess = () => {
|
||||
dbInstance = request.result;
|
||||
resolve(dbInstance);
|
||||
};
|
||||
|
||||
request.onupgradeneeded = (event) => {
|
||||
const db = (event.target as IDBOpenDBRequest).result;
|
||||
createStores(db);
|
||||
};
|
||||
});
|
||||
}
|
||||
|
||||
// Create all object stores
|
||||
function createStores(db: IDBDatabase): void {
|
||||
// Gmail messages store
|
||||
if (!db.objectStoreNames.contains(DB_STORES.gmail)) {
|
||||
const gmailStore = db.createObjectStore(DB_STORES.gmail, { keyPath: 'id' });
|
||||
gmailStore.createIndex('threadId', 'threadId', { unique: false });
|
||||
gmailStore.createIndex('date', 'date', { unique: false });
|
||||
gmailStore.createIndex('syncedAt', 'syncedAt', { unique: false });
|
||||
gmailStore.createIndex('localOnly', 'localOnly', { unique: false });
|
||||
}
|
||||
|
||||
// Drive documents store
|
||||
if (!db.objectStoreNames.contains(DB_STORES.drive)) {
|
||||
const driveStore = db.createObjectStore(DB_STORES.drive, { keyPath: 'id' });
|
||||
driveStore.createIndex('parentId', 'parentId', { unique: false });
|
||||
driveStore.createIndex('modifiedTime', 'modifiedTime', { unique: false });
|
||||
driveStore.createIndex('syncedAt', 'syncedAt', { unique: false });
|
||||
}
|
||||
|
||||
// Photos store
|
||||
if (!db.objectStoreNames.contains(DB_STORES.photos)) {
|
||||
const photosStore = db.createObjectStore(DB_STORES.photos, { keyPath: 'id' });
|
||||
photosStore.createIndex('creationTime', 'creationTime', { unique: false });
|
||||
photosStore.createIndex('mediaType', 'mediaType', { unique: false });
|
||||
photosStore.createIndex('syncedAt', 'syncedAt', { unique: false });
|
||||
}
|
||||
|
||||
// Calendar events store
|
||||
if (!db.objectStoreNames.contains(DB_STORES.calendar)) {
|
||||
const calendarStore = db.createObjectStore(DB_STORES.calendar, { keyPath: 'id' });
|
||||
calendarStore.createIndex('calendarId', 'calendarId', { unique: false });
|
||||
calendarStore.createIndex('startTime', 'startTime', { unique: false });
|
||||
calendarStore.createIndex('endTime', 'endTime', { unique: false });
|
||||
calendarStore.createIndex('syncedAt', 'syncedAt', { unique: false });
|
||||
}
|
||||
|
||||
// Sync metadata store
|
||||
if (!db.objectStoreNames.contains(DB_STORES.syncMetadata)) {
|
||||
db.createObjectStore(DB_STORES.syncMetadata, { keyPath: 'service' });
|
||||
}
|
||||
|
||||
// Encryption metadata store
|
||||
if (!db.objectStoreNames.contains(DB_STORES.encryptionMeta)) {
|
||||
db.createObjectStore(DB_STORES.encryptionMeta, { keyPath: 'purpose' });
|
||||
}
|
||||
|
||||
// Tokens store
|
||||
if (!db.objectStoreNames.contains(DB_STORES.tokens)) {
|
||||
db.createObjectStore(DB_STORES.tokens, { keyPath: 'id' });
|
||||
}
|
||||
}
|
||||
|
||||
// Close the database connection
|
||||
export function closeDatabase(): void {
|
||||
if (dbInstance) {
|
||||
dbInstance.close();
|
||||
dbInstance = null;
|
||||
}
|
||||
}
|
||||
|
||||
// Delete the entire database (for user data wipe)
|
||||
export async function deleteDatabase(): Promise<void> {
|
||||
closeDatabase();
|
||||
|
||||
return new Promise((resolve, reject) => {
|
||||
const request = indexedDB.deleteDatabase(DB_NAME);
|
||||
request.onsuccess = () => resolve();
|
||||
request.onerror = () => reject(request.error);
|
||||
});
|
||||
}
|
||||
|
||||
// Generic put operation
|
||||
async function putItem<T>(storeName: string, item: T): Promise<void> {
|
||||
const db = await openDatabase();
|
||||
|
||||
return new Promise((resolve, reject) => {
|
||||
const tx = db.transaction(storeName, 'readwrite');
|
||||
const store = tx.objectStore(storeName);
|
||||
const request = store.put(item);
|
||||
|
||||
request.onsuccess = () => resolve();
|
||||
request.onerror = () => reject(request.error);
|
||||
});
|
||||
}
|
||||
|
||||
// Generic get operation
|
||||
async function getItem<T>(storeName: string, key: string): Promise<T | null> {
|
||||
const db = await openDatabase();
|
||||
|
||||
return new Promise((resolve, reject) => {
|
||||
const tx = db.transaction(storeName, 'readonly');
|
||||
const store = tx.objectStore(storeName);
|
||||
const request = store.get(key);
|
||||
|
||||
request.onsuccess = () => resolve(request.result || null);
|
||||
request.onerror = () => reject(request.error);
|
||||
});
|
||||
}
|
||||
|
||||
// Generic delete operation
|
||||
async function deleteItem(storeName: string, key: string): Promise<void> {
|
||||
const db = await openDatabase();
|
||||
|
||||
return new Promise((resolve, reject) => {
|
||||
const tx = db.transaction(storeName, 'readwrite');
|
||||
const store = tx.objectStore(storeName);
|
||||
const request = store.delete(key);
|
||||
|
||||
request.onsuccess = () => resolve();
|
||||
request.onerror = () => reject(request.error);
|
||||
});
|
||||
}
|
||||
|
||||
// Generic getAll operation
|
||||
async function getAllItems<T>(storeName: string): Promise<T[]> {
|
||||
const db = await openDatabase();
|
||||
|
||||
return new Promise((resolve, reject) => {
|
||||
const tx = db.transaction(storeName, 'readonly');
|
||||
const store = tx.objectStore(storeName);
|
||||
const request = store.getAll();
|
||||
|
||||
request.onsuccess = () => resolve(request.result || []);
|
||||
request.onerror = () => reject(request.error);
|
||||
});
|
||||
}
|
||||
|
||||
// Generic count operation
|
||||
async function countItems(storeName: string): Promise<number> {
|
||||
const db = await openDatabase();
|
||||
|
||||
return new Promise((resolve, reject) => {
|
||||
const tx = db.transaction(storeName, 'readonly');
|
||||
const store = tx.objectStore(storeName);
|
||||
const request = store.count();
|
||||
|
||||
request.onsuccess = () => resolve(request.result);
|
||||
request.onerror = () => reject(request.error);
|
||||
});
|
||||
}
|
||||
|
||||
// Get items by index with optional range
|
||||
async function getItemsByIndex<T>(
|
||||
storeName: string,
|
||||
indexName: string,
|
||||
query?: IDBKeyRange | IDBValidKey
|
||||
): Promise<T[]> {
|
||||
const db = await openDatabase();
|
||||
|
||||
return new Promise((resolve, reject) => {
|
||||
const tx = db.transaction(storeName, 'readonly');
|
||||
const store = tx.objectStore(storeName);
|
||||
const index = store.index(indexName);
|
||||
const request = query ? index.getAll(query) : index.getAll();
|
||||
|
||||
request.onsuccess = () => resolve(request.result || []);
|
||||
request.onerror = () => reject(request.error);
|
||||
});
|
||||
}
|
||||
|
||||
// Gmail operations
|
||||
export const gmailStore = {
|
||||
put: (email: EncryptedEmailStore) => putItem(DB_STORES.gmail, email),
|
||||
get: (id: string) => getItem<EncryptedEmailStore>(DB_STORES.gmail, id),
|
||||
delete: (id: string) => deleteItem(DB_STORES.gmail, id),
|
||||
getAll: () => getAllItems<EncryptedEmailStore>(DB_STORES.gmail),
|
||||
count: () => countItems(DB_STORES.gmail),
|
||||
|
||||
getByThread: (threadId: string) =>
|
||||
getItemsByIndex<EncryptedEmailStore>(DB_STORES.gmail, 'threadId', threadId),
|
||||
|
||||
getByDateRange: (startDate: number, endDate: number) =>
|
||||
getItemsByIndex<EncryptedEmailStore>(
|
||||
DB_STORES.gmail,
|
||||
'date',
|
||||
IDBKeyRange.bound(startDate, endDate)
|
||||
),
|
||||
|
||||
getLocalOnly: async () => {
|
||||
const all = await getAllItems<EncryptedEmailStore>(DB_STORES.gmail);
|
||||
return all.filter(email => email.localOnly === true);
|
||||
},
|
||||
|
||||
async putBatch(emails: EncryptedEmailStore[]): Promise<void> {
|
||||
const db = await openDatabase();
|
||||
|
||||
return new Promise((resolve, reject) => {
|
||||
const tx = db.transaction(DB_STORES.gmail, 'readwrite');
|
||||
const store = tx.objectStore(DB_STORES.gmail);
|
||||
|
||||
tx.oncomplete = () => resolve();
|
||||
tx.onerror = () => reject(tx.error);
|
||||
|
||||
for (const email of emails) {
|
||||
store.put(email);
|
||||
}
|
||||
});
|
||||
}
|
||||
};
|
||||
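// Query sketch using the index helpers above: the last 30 days of messages,
// still encrypted (decryption happens at render time, not here).
export async function recentEmails(): Promise<EncryptedEmailStore[]> {
  const since = Date.now() - 30 * 24 * 60 * 60 * 1000;
  return gmailStore.getByDateRange(since, Date.now());
}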
|
||||
// Drive operations
|
||||
export const driveStore = {
|
||||
put: (doc: EncryptedDriveDocument) => putItem(DB_STORES.drive, doc),
|
||||
get: (id: string) => getItem<EncryptedDriveDocument>(DB_STORES.drive, id),
|
||||
delete: (id: string) => deleteItem(DB_STORES.drive, id),
|
||||
getAll: () => getAllItems<EncryptedDriveDocument>(DB_STORES.drive),
|
||||
count: () => countItems(DB_STORES.drive),
|
||||
|
||||
getByParent: (parentId: string | null) =>
|
||||
getItemsByIndex<EncryptedDriveDocument>(
|
||||
DB_STORES.drive,
|
||||
'parentId',
|
||||
parentId ?? ''
|
||||
),
|
||||
|
||||
getRecent: (limit: number = 50) =>
|
||||
getItemsByIndex<EncryptedDriveDocument>(DB_STORES.drive, 'modifiedTime')
|
||||
.then(items => items.sort((a, b) => b.modifiedTime - a.modifiedTime).slice(0, limit)),
|
||||
|
||||
async putBatch(docs: EncryptedDriveDocument[]): Promise<void> {
|
||||
const db = await openDatabase();
|
||||
|
||||
return new Promise((resolve, reject) => {
|
||||
const tx = db.transaction(DB_STORES.drive, 'readwrite');
|
||||
const store = tx.objectStore(DB_STORES.drive);
|
||||
|
||||
tx.oncomplete = () => resolve();
|
||||
tx.onerror = () => reject(tx.error);
|
||||
|
||||
for (const doc of docs) {
|
||||
store.put(doc);
|
||||
}
|
||||
});
|
||||
}
|
||||
};
|
||||
|
||||
// Photos operations
|
||||
export const photosStore = {
|
||||
put: (photo: EncryptedPhotoReference) => putItem(DB_STORES.photos, photo),
|
||||
get: (id: string) => getItem<EncryptedPhotoReference>(DB_STORES.photos, id),
|
||||
delete: (id: string) => deleteItem(DB_STORES.photos, id),
|
||||
getAll: () => getAllItems<EncryptedPhotoReference>(DB_STORES.photos),
|
||||
count: () => countItems(DB_STORES.photos),
|
||||
|
||||
getByMediaType: (mediaType: 'image' | 'video') =>
|
||||
getItemsByIndex<EncryptedPhotoReference>(DB_STORES.photos, 'mediaType', mediaType),
|
||||
|
||||
getByDateRange: (startDate: number, endDate: number) =>
|
||||
getItemsByIndex<EncryptedPhotoReference>(
|
||||
DB_STORES.photos,
|
||||
'creationTime',
|
||||
IDBKeyRange.bound(startDate, endDate)
|
||||
),
|
||||
|
||||
async putBatch(photos: EncryptedPhotoReference[]): Promise<void> {
|
||||
const db = await openDatabase();
|
||||
|
||||
return new Promise((resolve, reject) => {
|
||||
const tx = db.transaction(DB_STORES.photos, 'readwrite');
|
||||
const store = tx.objectStore(DB_STORES.photos);
|
||||
|
||||
tx.oncomplete = () => resolve();
|
||||
tx.onerror = () => reject(tx.error);
|
||||
|
||||
for (const photo of photos) {
|
||||
store.put(photo);
|
||||
}
|
||||
});
|
||||
}
|
||||
};
|
||||
|
||||
// Calendar operations
|
||||
export const calendarStore = {
|
||||
put: (event: EncryptedCalendarEvent) => putItem(DB_STORES.calendar, event),
|
||||
get: (id: string) => getItem<EncryptedCalendarEvent>(DB_STORES.calendar, id),
|
||||
delete: (id: string) => deleteItem(DB_STORES.calendar, id),
|
||||
getAll: () => getAllItems<EncryptedCalendarEvent>(DB_STORES.calendar),
|
||||
count: () => countItems(DB_STORES.calendar),
|
||||
|
||||
getByCalendar: (calendarId: string) =>
|
||||
getItemsByIndex<EncryptedCalendarEvent>(DB_STORES.calendar, 'calendarId', calendarId),
|
||||
|
||||
getByDateRange: (startTime: number, endTime: number) =>
|
||||
getItemsByIndex<EncryptedCalendarEvent>(
|
||||
DB_STORES.calendar,
|
||||
'startTime',
|
||||
IDBKeyRange.bound(startTime, endTime)
|
||||
),
|
||||
|
||||
getUpcoming: (fromTime: number = Date.now(), limit: number = 50) =>
|
||||
getItemsByIndex<EncryptedCalendarEvent>(
|
||||
DB_STORES.calendar,
|
||||
'startTime',
|
||||
IDBKeyRange.lowerBound(fromTime)
|
||||
).then(items => items.slice(0, limit)),
|
||||
|
||||
async putBatch(events: EncryptedCalendarEvent[]): Promise<void> {
|
||||
const db = await openDatabase();
|
||||
|
||||
return new Promise((resolve, reject) => {
|
||||
const tx = db.transaction(DB_STORES.calendar, 'readwrite');
|
||||
const store = tx.objectStore(DB_STORES.calendar);
|
||||
|
||||
tx.oncomplete = () => resolve();
|
||||
tx.onerror = () => reject(tx.error);
|
||||
|
||||
for (const event of events) {
|
||||
store.put(event);
|
||||
}
|
||||
});
|
||||
}
|
||||
};
|
||||
|
||||
// Sync metadata operations
|
||||
export const syncMetadataStore = {
|
||||
put: (metadata: SyncMetadata) => putItem(DB_STORES.syncMetadata, metadata),
|
||||
get: (service: GoogleService) => getItem<SyncMetadata>(DB_STORES.syncMetadata, service),
|
||||
getAll: () => getAllItems<SyncMetadata>(DB_STORES.syncMetadata),
|
||||
|
||||
async updateProgress(
|
||||
service: GoogleService,
|
||||
current: number,
|
||||
total: number
|
||||
): Promise<void> {
|
||||
const existing = await this.get(service);
|
||||
await this.put({
|
||||
...existing,
|
||||
service,
|
||||
status: 'syncing',
|
||||
progressCurrent: current,
|
||||
progressTotal: total,
|
||||
lastSyncTime: existing?.lastSyncTime ?? Date.now()
|
||||
} as SyncMetadata);
|
||||
},
|
||||
|
||||
async markComplete(service: GoogleService, itemCount: number): Promise<void> {
|
||||
const existing = await this.get(service);
|
||||
await this.put({
|
||||
...existing,
|
||||
service,
|
||||
status: 'idle',
|
||||
itemCount,
|
||||
lastSyncTime: Date.now(),
|
||||
progressCurrent: undefined,
|
||||
progressTotal: undefined
|
||||
} as SyncMetadata);
|
||||
},
|
||||
|
||||
async markError(service: GoogleService, errorMessage: string): Promise<void> {
|
||||
const existing = await this.get(service);
|
||||
await this.put({
|
||||
...existing,
|
||||
service,
|
||||
status: 'error',
|
||||
errorMessage,
|
||||
lastSyncTime: existing?.lastSyncTime ?? Date.now()
|
||||
} as SyncMetadata);
|
||||
}
|
||||
};
|
||||
|
||||
// Encryption metadata operations
|
||||
export const encryptionMetaStore = {
|
||||
put: (metadata: EncryptionMetadata) => putItem(DB_STORES.encryptionMeta, metadata),
|
||||
get: (purpose: string) => getItem<EncryptionMetadata>(DB_STORES.encryptionMeta, purpose),
|
||||
getAll: () => getAllItems<EncryptionMetadata>(DB_STORES.encryptionMeta)
|
||||
};
|
||||
|
||||
// Token operations
|
||||
export const tokensStore = {
|
||||
async put(tokens: EncryptedTokens): Promise<void> {
|
||||
await putItem(DB_STORES.tokens, { id: 'google', ...tokens });
|
||||
},
|
||||
|
||||
async get(): Promise<EncryptedTokens | null> {
|
||||
const result = await getItem<EncryptedTokens & { id: string }>(DB_STORES.tokens, 'google');
|
||||
if (result) {
|
||||
// eslint-disable-next-line @typescript-eslint/no-unused-vars
|
||||
const { id, ...tokens } = result;
|
||||
return tokens;
|
||||
}
|
||||
return null;
|
||||
},
|
||||
|
||||
async delete(): Promise<void> {
|
||||
await deleteItem(DB_STORES.tokens, 'google');
|
||||
},
|
||||
|
||||
async isExpired(): Promise<boolean> {
|
||||
const tokens = await this.get();
|
||||
if (!tokens) return true;
|
||||
// Treat tokens within a 5-minute window of expiry as already expired
|
||||
return tokens.expiresAt <= Date.now() + 5 * 60 * 1000;
|
||||
}
|
||||
};
|
||||
|
||||
// Storage quota utilities
|
||||
export async function requestPersistentStorage(): Promise<boolean> {
|
||||
if (navigator.storage && navigator.storage.persist) {
|
||||
const isPersisted = await navigator.storage.persist();
|
||||
console.log(`Persistent storage ${isPersisted ? 'granted' : 'denied'}`);
|
||||
return isPersisted;
|
||||
}
|
||||
return false;
|
||||
}
|
||||
|
||||
export async function checkStorageQuota(): Promise<StorageQuotaInfo> {
|
||||
const defaultQuota: StorageQuotaInfo = {
|
||||
used: 0,
|
||||
quota: 0,
|
||||
isPersistent: false,
|
||||
byService: { gmail: 0, drive: 0, photos: 0, calendar: 0 }
|
||||
};
|
||||
|
||||
if (!navigator.storage || !navigator.storage.estimate) {
|
||||
return defaultQuota;
|
||||
}
|
||||
|
||||
const estimate = await navigator.storage.estimate();
|
||||
const isPersistent = navigator.storage.persisted
|
||||
? await navigator.storage.persisted()
|
||||
: false;
|
||||
|
||||
// Estimate per-service usage based on item counts
|
||||
// (rough approximation - actual size would require iterating all items)
|
||||
const [gmailCount, driveCount, photosCount, calendarCount] = await Promise.all([
|
||||
gmailStore.count(),
|
||||
driveStore.count(),
|
||||
photosStore.count(),
|
||||
calendarStore.count()
|
||||
]);
|
||||
|
||||
// Rough size estimates per item (in bytes)
|
||||
const AVG_EMAIL_SIZE = 25000; // 25KB
|
||||
const AVG_DOC_SIZE = 50000; // 50KB
|
||||
const AVG_PHOTO_SIZE = 50000; // 50KB (thumbnail only)
|
||||
const AVG_EVENT_SIZE = 5000; // 5KB
|
||||
|
||||
return {
|
||||
used: estimate.usage || 0,
|
||||
quota: estimate.quota || 0,
|
||||
isPersistent,
|
||||
byService: {
|
||||
gmail: gmailCount * AVG_EMAIL_SIZE,
|
||||
drive: driveCount * AVG_DOC_SIZE,
|
||||
photos: photosCount * AVG_PHOTO_SIZE,
|
||||
calendar: calendarCount * AVG_EVENT_SIZE
|
||||
}
|
||||
};
|
||||
}
|
||||
|
||||
// Safari-specific handling
|
||||
export function hasSafariLimitations(): boolean {
|
||||
const isSafari = /^((?!chrome|android).)*safari/i.test(navigator.userAgent);
|
||||
const isIOS = /iPad|iPhone|iPod/.test(navigator.userAgent);
|
||||
return isSafari || isIOS;
|
||||
}
|
||||
|
||||
// Touch data to prevent Safari 7-day eviction
|
||||
export async function touchLocalData(): Promise<void> {
|
||||
const db = await openDatabase();
|
||||
|
||||
return new Promise((resolve, reject) => {
|
||||
const tx = db.transaction(DB_STORES.encryptionMeta, 'readwrite');
|
||||
const store = tx.objectStore(DB_STORES.encryptionMeta);
|
||||
|
||||
// Write a timestamp under a dedicated key; using 'master' here would
// overwrite the real master-key record (keyPath is 'purpose') and lose its salt
store.put({
  purpose: 'keepalive',
  salt: new Uint8Array(0),
  createdAt: Date.now()
} as EncryptionMetadata);
|
||||
|
||||
tx.oncomplete = () => resolve();
|
||||
tx.onerror = () => reject(tx.error);
|
||||
});
|
||||
}
|
||||
|
||||
// Clear all data for a specific service
|
||||
export async function clearServiceData(service: GoogleService): Promise<void> {
|
||||
const db = await openDatabase();
|
||||
|
||||
return new Promise((resolve, reject) => {
|
||||
// Map the service name through DB_STORES so a renamed store can't break this
const storeName = DB_STORES[service];
const tx = db.transaction(storeName, 'readwrite');
const store = tx.objectStore(storeName);
|
||||
const request = store.clear();
|
||||
|
||||
request.onsuccess = async () => {
|
||||
// Also clear sync metadata for this service
|
||||
await syncMetadataStore.put({
|
||||
service,
|
||||
lastSyncTime: Date.now(),
|
||||
itemCount: 0,
|
||||
status: 'idle'
|
||||
});
|
||||
resolve();
|
||||
};
|
||||
request.onerror = () => reject(request.error);
|
||||
});
|
||||
}
|
||||
|
||||
// Export all data for backup
|
||||
export async function exportAllData(): Promise<{
|
||||
gmail: EncryptedEmailStore[];
|
||||
drive: EncryptedDriveDocument[];
|
||||
photos: EncryptedPhotoReference[];
|
||||
calendar: EncryptedCalendarEvent[];
|
||||
syncMetadata: SyncMetadata[];
|
||||
encryptionMeta: EncryptionMetadata[];
|
||||
}> {
|
||||
const [gmail, drive, photos, calendar, syncMetadata, encryptionMeta] = await Promise.all([
|
||||
gmailStore.getAll(),
|
||||
driveStore.getAll(),
|
||||
photosStore.getAll(),
|
||||
calendarStore.getAll(),
|
||||
syncMetadataStore.getAll(),
|
||||
encryptionMetaStore.getAll()
|
||||
]);
|
||||
|
||||
return { gmail, drive, photos, calendar, syncMetadata, encryptionMeta };
|
||||
}
|
||||
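// Startup wiring sketch (an assumption about app init, not part of this
// module's required API): request persistence, report usage, and keep Safari
// from evicting the database.
export async function initLocalStorage(): Promise<void> {
  const persisted = await requestPersistentStorage();
  const quota = await checkStorageQuota();
  console.log(`Storage ${persisted ? 'persistent' : 'best-effort'}: ${quota.used}/${quota.quota} bytes`);
  if (hasSafariLimitations()) {
    await touchLocalData(); // refresh the keepalive record
  }
}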
|
|
@@ -0,0 +1,292 @@
|
|||
// WebCrypto encryption utilities for Google Data Sovereignty
|
||||
// Uses AES-256-GCM for symmetric encryption and HKDF for key derivation
|
||||
|
||||
import type { EncryptedData, GoogleService } from './types';
|
||||
|
||||
// Check if we're in a browser environment with WebCrypto
|
||||
export const hasWebCrypto = (): boolean => {
|
||||
return typeof window !== 'undefined' &&
|
||||
window.crypto !== undefined &&
|
||||
window.crypto.subtle !== undefined;
|
||||
};
|
||||
|
||||
// Generate a random master key for new users
|
||||
export async function generateMasterKey(): Promise<CryptoKey> {
|
||||
if (!hasWebCrypto()) {
|
||||
throw new Error('WebCrypto not available');
|
||||
}
|
||||
|
||||
return await crypto.subtle.generateKey(
|
||||
{ name: 'AES-GCM', length: 256 },
|
||||
true, // extractable for backup
|
||||
['encrypt', 'decrypt']
|
||||
);
|
||||
}
|
||||
|
||||
// Export master key to raw format for backup
|
||||
export async function exportMasterKey(key: CryptoKey): Promise<ArrayBuffer> {
|
||||
if (!hasWebCrypto()) {
|
||||
throw new Error('WebCrypto not available');
|
||||
}
|
||||
|
||||
return await crypto.subtle.exportKey('raw', key);
|
||||
}
|
||||
|
||||
// Import master key from raw format (for restore)
|
||||
export async function importMasterKey(keyData: ArrayBuffer): Promise<CryptoKey> {
|
||||
if (!hasWebCrypto()) {
|
||||
throw new Error('WebCrypto not available');
|
||||
}
|
||||
|
||||
return await crypto.subtle.importKey(
|
||||
'raw',
|
||||
keyData,
|
||||
{ name: 'AES-GCM', length: 256 },
|
||||
true,
|
||||
['encrypt', 'decrypt']
|
||||
);
|
||||
}
|
||||
|
||||
// Derive a service-specific encryption key from master key using HKDF
|
||||
export async function deriveServiceKey(
|
||||
masterKey: CryptoKey,
|
||||
service: GoogleService | 'tokens' | 'backup'
|
||||
): Promise<CryptoKey> {
|
||||
if (!hasWebCrypto()) {
|
||||
throw new Error('WebCrypto not available');
|
||||
}
|
||||
|
||||
const encoder = new TextEncoder();
|
||||
const info = encoder.encode(`canvas-google-data-${service}`);
|
||||
|
||||
// Export master key to use as HKDF base
|
||||
const masterKeyRaw = await crypto.subtle.exportKey('raw', masterKey);
|
||||
|
||||
// Import as HKDF key
|
||||
const hkdfKey = await crypto.subtle.importKey(
|
||||
'raw',
|
||||
masterKeyRaw,
|
||||
'HKDF',
|
||||
false,
|
||||
['deriveKey']
|
||||
);
|
||||
|
||||
// Generate a deterministic salt based on service
|
||||
const salt = encoder.encode(`canvas-salt-${service}`);
|
||||
|
||||
// Derive the service-specific key
|
||||
return await crypto.subtle.deriveKey(
|
||||
{
|
||||
name: 'HKDF',
|
||||
hash: 'SHA-256',
|
||||
salt: salt,
|
||||
info: info
|
||||
},
|
||||
hkdfKey,
|
||||
{ name: 'AES-GCM', length: 256 },
|
||||
false, // not extractable for security
|
||||
['encrypt', 'decrypt']
|
||||
);
|
||||
}
|
||||
|
||||
// Encrypt data with AES-256-GCM
|
||||
export async function encryptData(
|
||||
data: string | ArrayBuffer,
|
||||
key: CryptoKey
|
||||
): Promise<EncryptedData> {
|
||||
if (!hasWebCrypto()) {
|
||||
throw new Error('WebCrypto not available');
|
||||
}
|
||||
|
||||
// Generate random 96-bit IV (recommended for AES-GCM)
|
||||
const iv = crypto.getRandomValues(new Uint8Array(12));
|
||||
|
||||
// Convert string to ArrayBuffer if needed
|
||||
const dataBuffer = typeof data === 'string'
|
||||
? new TextEncoder().encode(data)
|
||||
: data;
|
||||
|
||||
const encrypted = await crypto.subtle.encrypt(
|
||||
{ name: 'AES-GCM', iv },
|
||||
key,
|
||||
dataBuffer
|
||||
);
|
||||
|
||||
return { encrypted, iv };
|
||||
}
|
||||
|
||||
// Decrypt data with AES-256-GCM
|
||||
export async function decryptData(
|
||||
encryptedData: EncryptedData,
|
||||
key: CryptoKey
|
||||
): Promise<ArrayBuffer> {
|
||||
if (!hasWebCrypto()) {
|
||||
throw new Error('WebCrypto not available');
|
||||
}
|
||||
|
||||
return await crypto.subtle.decrypt(
|
||||
{ name: 'AES-GCM', iv: new Uint8Array(encryptedData.iv) as Uint8Array<ArrayBuffer> },
|
||||
key,
|
||||
encryptedData.encrypted
|
||||
);
|
||||
}
|
||||
|
||||
// Decrypt data to string (convenience method)
|
||||
export async function decryptDataToString(
|
||||
encryptedData: EncryptedData,
|
||||
key: CryptoKey
|
||||
): Promise<string> {
|
||||
const decrypted = await decryptData(encryptedData, key);
|
||||
return new TextDecoder().decode(decrypted);
|
||||
}
|
||||
|
||||
// Encrypt multiple fields of an object
|
||||
export async function encryptFields<T extends Record<string, unknown>>(
|
||||
obj: T,
|
||||
fieldsToEncrypt: (keyof T)[],
|
||||
key: CryptoKey
|
||||
): Promise<Record<string, EncryptedData | unknown>> {
|
||||
const result: Record<string, EncryptedData | unknown> = {};
|
||||
|
||||
for (const [field, value] of Object.entries(obj)) {
|
||||
if (fieldsToEncrypt.includes(field as keyof T) && value !== null && value !== undefined) {
|
||||
const strValue = typeof value === 'string' ? value : JSON.stringify(value);
|
||||
result[`encrypted${field.charAt(0).toUpperCase()}${field.slice(1)}`] =
|
||||
await encryptData(strValue, key);
|
||||
} else if (!fieldsToEncrypt.includes(field as keyof T)) {
|
||||
result[field] = value;
|
||||
}
|
||||
}
|
||||
|
||||
return result;
|
||||
}
|
||||
|
||||
// Serialize EncryptedData for IndexedDB storage
|
||||
export function serializeEncryptedData(data: EncryptedData): { encrypted: ArrayBuffer; iv: number[] } {
|
||||
return {
|
||||
encrypted: data.encrypted,
|
||||
iv: Array.from(data.iv)
|
||||
};
|
||||
}
|
||||
|
||||
// Deserialize EncryptedData from IndexedDB
|
||||
export function deserializeEncryptedData(data: { encrypted: ArrayBuffer; iv: number[] }): EncryptedData {
|
||||
return {
|
||||
encrypted: data.encrypted,
|
||||
iv: new Uint8Array(data.iv)
|
||||
};
|
||||
}
|
||||
|
||||
// Base64 URL encoding for PKCE
|
||||
export function base64UrlEncode(buffer: ArrayBuffer | Uint8Array): string {
|
||||
const bytes = buffer instanceof Uint8Array ? buffer : new Uint8Array(buffer);
|
||||
let binary = '';
|
||||
for (let i = 0; i < bytes.length; i++) {
|
||||
binary += String.fromCharCode(bytes[i]);
|
||||
}
|
||||
return btoa(binary)
|
||||
.replace(/\+/g, '-')
|
||||
.replace(/\//g, '_')
|
||||
.replace(/=+$/, '');
|
||||
}
|
||||
|
||||
// Base64 URL decoding
|
||||
export function base64UrlDecode(str: string): Uint8Array {
|
||||
// Add padding if needed
|
||||
let base64 = str.replace(/-/g, '+').replace(/_/g, '/');
|
||||
const padding = base64.length % 4;
|
||||
if (padding) {
|
||||
base64 += '='.repeat(4 - padding);
|
||||
}
|
||||
|
||||
const binary = atob(base64);
|
||||
const bytes = new Uint8Array(binary.length);
|
||||
for (let i = 0; i < binary.length; i++) {
|
||||
bytes[i] = binary.charCodeAt(i);
|
||||
}
|
||||
return bytes;
|
||||
}
|
||||
|
||||
// Generate PKCE code verifier (43-128 chars, URL-safe)
|
||||
export function generateCodeVerifier(): string {
|
||||
const array = new Uint8Array(32);
|
||||
crypto.getRandomValues(array);
|
||||
return base64UrlEncode(array);
|
||||
}
|
||||
|
||||
// Generate PKCE code challenge from verifier
|
||||
export async function generateCodeChallenge(verifier: string): Promise<string> {
|
||||
if (!hasWebCrypto()) {
|
||||
throw new Error('WebCrypto not available');
|
||||
}
|
||||
|
||||
const encoder = new TextEncoder();
|
||||
const data = encoder.encode(verifier);
|
||||
const hash = await crypto.subtle.digest('SHA-256', data);
|
||||
return base64UrlEncode(hash);
|
||||
}
|
||||
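// Flow sketch: the verifier/challenge pair generated at the start of the OAuth
// redirect (S256 method, matching generateCodeChallenge above). The verifier
// must be kept (e.g. in sessionStorage) until the token exchange.
export async function createPkcePair(): Promise<{ verifier: string; challenge: string }> {
  const verifier = generateCodeVerifier();
  const challenge = await generateCodeChallenge(verifier);
  return { verifier, challenge };
}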
|
||||
// Derive a key from password for master key encryption (for backup)
|
||||
export async function deriveKeyFromPassword(
|
||||
password: string,
|
||||
salt: Uint8Array
|
||||
): Promise<CryptoKey> {
|
||||
if (!hasWebCrypto()) {
|
||||
throw new Error('WebCrypto not available');
|
||||
}
|
||||
|
||||
const encoder = new TextEncoder();
|
||||
const passwordBuffer = encoder.encode(password);
|
||||
|
||||
// Import password as raw key for PBKDF2
|
||||
const passwordKey = await crypto.subtle.importKey(
|
||||
'raw',
|
||||
passwordBuffer,
|
||||
'PBKDF2',
|
||||
false,
|
||||
['deriveKey']
|
||||
);
|
||||
|
||||
// Derive encryption key using PBKDF2
|
||||
return await crypto.subtle.deriveKey(
|
||||
{
|
||||
name: 'PBKDF2',
|
||||
salt: new Uint8Array(salt) as Uint8Array<ArrayBuffer>,
|
||||
iterations: 100000, // PBKDF2 work factor; OWASP now suggests ~600k for SHA-256, so consider raising this
|
||||
hash: 'SHA-256'
|
||||
},
|
||||
passwordKey,
|
||||
{ name: 'AES-GCM', length: 256 },
|
||||
false,
|
||||
['encrypt', 'decrypt']
|
||||
);
|
||||
}
|
||||
|
||||
// Generate random salt for password derivation
|
||||
export function generateSalt(): Uint8Array {
|
||||
return crypto.getRandomValues(new Uint8Array(16));
|
||||
}
|
||||
|
||||
// Encrypt master key with password-derived key for backup
|
||||
export async function encryptMasterKeyWithPassword(
|
||||
masterKey: CryptoKey,
|
||||
password: string
|
||||
): Promise<{ encryptedKey: EncryptedData; salt: Uint8Array }> {
|
||||
const salt = generateSalt();
|
||||
const passwordKey = await deriveKeyFromPassword(password, salt);
|
||||
const masterKeyRaw = await exportMasterKey(masterKey);
|
||||
const encryptedKey = await encryptData(masterKeyRaw, passwordKey);
|
||||
|
||||
return { encryptedKey, salt };
|
||||
}
|
||||
|
||||
// Decrypt master key with password
|
||||
export async function decryptMasterKeyWithPassword(
|
||||
encryptedKey: EncryptedData,
|
||||
password: string,
|
||||
salt: Uint8Array
|
||||
): Promise<CryptoKey> {
|
||||
const passwordKey = await deriveKeyFromPassword(password, salt);
|
||||
const masterKeyRaw = await decryptData(encryptedKey, passwordKey);
|
||||
return await importMasterKey(masterKeyRaw);
|
||||
}
|
||||
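// Round-trip sketch built only from the helpers above: derive a service key,
// encrypt, decrypt, and compare. Useful as a smoke test after key import.
export async function selfTestEncryption(masterKey: CryptoKey): Promise<boolean> {
  const key = await deriveServiceKey(masterKey, 'gmail');
  const original = 'round-trip check';
  const encrypted = await encryptData(original, key);
  return (await decryptDataToString(encrypted, key)) === original;
}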
|
|
@@ -0,0 +1,425 @@
|
|||
// Google Calendar import with event encryption
|
||||
// All data is encrypted before storage
|
||||
|
||||
import type { EncryptedCalendarEvent, ImportProgress, EncryptedData } from '../types';
|
||||
import { encryptData, deriveServiceKey } from '../encryption';
|
||||
import { calendarStore, syncMetadataStore } from '../database';
|
||||
import { getAccessToken } from '../oauth';
|
||||
|
||||
const CALENDAR_API_BASE = 'https://www.googleapis.com/calendar/v3';
|
||||
|
||||
// Import options
|
||||
export interface CalendarImportOptions {
|
||||
maxEvents?: number; // Limit total events to import
|
||||
calendarIds?: string[]; // Specific calendars (omit for the primary calendar)
|
||||
timeMin?: Date; // Only import events after this date
|
||||
timeMax?: Date; // Only import events before this date
|
||||
includeDeleted?: boolean; // Include deleted events
|
||||
onProgress?: (progress: ImportProgress) => void;
|
||||
}
|
||||
|
||||
// Calendar API response types
|
||||
interface CalendarListResponse {
|
||||
items?: CalendarListEntry[];
|
||||
nextPageToken?: string;
|
||||
}
|
||||
|
||||
interface CalendarListEntry {
|
||||
id: string;
|
||||
summary?: string;
|
||||
description?: string;
|
||||
primary?: boolean;
|
||||
backgroundColor?: string;
|
||||
foregroundColor?: string;
|
||||
accessRole?: string;
|
||||
}
|
||||
|
||||
interface EventsListResponse {
|
||||
items?: CalendarEvent[];
|
||||
nextPageToken?: string;
|
||||
nextSyncToken?: string;
|
||||
}
|
||||
|
||||
interface CalendarEvent {
|
||||
id: string;
|
||||
status?: string;
|
||||
htmlLink?: string;
|
||||
created?: string;
|
||||
updated?: string;
|
||||
summary?: string;
|
||||
description?: string;
|
||||
location?: string;
|
||||
colorId?: string;
|
||||
creator?: { email?: string; displayName?: string };
|
||||
organizer?: { email?: string; displayName?: string };
|
||||
start?: { date?: string; dateTime?: string; timeZone?: string };
|
||||
end?: { date?: string; dateTime?: string; timeZone?: string };
|
||||
recurrence?: string[];
|
||||
recurringEventId?: string;
|
||||
attendees?: { email?: string; displayName?: string; responseStatus?: string }[];
|
||||
hangoutLink?: string;
|
||||
conferenceData?: {
|
||||
entryPoints?: { entryPointType?: string; uri?: string; label?: string }[];
|
||||
conferenceSolution?: { name?: string };
|
||||
};
|
||||
reminders?: {
|
||||
useDefault?: boolean;
|
||||
overrides?: { method: string; minutes: number }[];
|
||||
};
|
||||
}
|
||||
|
||||
// Parse event time to timestamp
|
||||
function parseEventTime(eventTime?: { date?: string; dateTime?: string }): number {
|
||||
if (!eventTime) return 0;
|
||||
|
||||
if (eventTime.dateTime) {
|
||||
return new Date(eventTime.dateTime).getTime();
|
||||
}
|
||||
if (eventTime.date) {
|
||||
return new Date(eventTime.date).getTime();
|
||||
}
|
||||
return 0;
|
||||
}
|
||||
|
||||
// Check if event is all-day
|
||||
function isAllDayEvent(event: CalendarEvent): boolean {
|
||||
return !!(event.start?.date && !event.start?.dateTime);
|
||||
}
|
||||
|
||||
// Get meeting link from event
|
||||
function getMeetingLink(event: CalendarEvent): string | null {
|
||||
// Check hangouts link
|
||||
if (event.hangoutLink) {
|
||||
return event.hangoutLink;
|
||||
}
|
||||
|
||||
// Check conference data
|
||||
const videoEntry = event.conferenceData?.entryPoints?.find(
|
||||
e => e.entryPointType === 'video'
|
||||
);
|
||||
if (videoEntry?.uri) {
|
||||
return videoEntry.uri;
|
||||
}
|
||||
|
||||
return null;
|
||||
}
|
||||
|
||||
// Main Calendar import class
|
||||
export class CalendarImporter {
|
||||
private accessToken: string | null = null;
|
||||
private encryptionKey: CryptoKey | null = null;
|
||||
private abortController: AbortController | null = null;
|
||||
|
||||
constructor(
|
||||
private masterKey: CryptoKey
|
||||
) {}
|
||||
|
||||
// Initialize importer
|
||||
async initialize(): Promise<boolean> {
|
||||
this.accessToken = await getAccessToken(this.masterKey);
|
||||
if (!this.accessToken) {
|
||||
console.error('No access token available for Calendar');
|
||||
return false;
|
||||
}
|
||||
|
||||
this.encryptionKey = await deriveServiceKey(this.masterKey, 'calendar');
|
||||
return true;
|
||||
}
|
||||
|
||||
// Abort current import
|
||||
abort(): void {
|
||||
this.abortController?.abort();
|
||||
}
|
||||
|
||||
// Import calendar events
|
||||
async import(options: CalendarImportOptions = {}): Promise<ImportProgress> {
|
||||
const progress: ImportProgress = {
|
||||
service: 'calendar',
|
||||
total: 0,
|
||||
imported: 0,
|
||||
status: 'importing'
|
||||
};
|
||||
|
||||
if (!await this.initialize()) {
|
||||
progress.status = 'error';
|
||||
progress.errorMessage = 'Failed to initialize Calendar importer';
|
||||
return progress;
|
||||
}
|
||||
|
||||
this.abortController = new AbortController();
|
||||
progress.startedAt = Date.now();
|
||||
|
||||
try {
|
||||
// Get calendars to import from
|
||||
const calendarIds = options.calendarIds?.length
|
||||
? options.calendarIds
|
||||
: ['primary'];
|
||||
|
||||
// Default time range: 2 years back, 1 year forward
|
||||
const timeMin = options.timeMin || new Date(Date.now() - 2 * 365 * 24 * 60 * 60 * 1000);
|
||||
const timeMax = options.timeMax || new Date(Date.now() + 365 * 24 * 60 * 60 * 1000);
|
||||
|
||||
const eventBatch: EncryptedCalendarEvent[] = [];
|
||||
|
||||
for (const calendarId of calendarIds) {
|
||||
if (this.abortController.signal.aborted) {
|
||||
progress.status = 'paused';
|
||||
break;
|
||||
}
|
||||
|
||||
let pageToken: string | undefined;
|
||||
|
||||
do {
|
||||
if (this.abortController.signal.aborted) break;
|
||||
|
||||
const params: Record<string, string> = {
|
||||
maxResults: '250',
|
||||
singleEvents: 'true', // Expand recurring events
|
||||
orderBy: 'startTime',
|
||||
timeMin: timeMin.toISOString(),
|
||||
timeMax: timeMax.toISOString()
|
||||
};
|
||||
if (pageToken) {
|
||||
params.pageToken = pageToken;
|
||||
}
|
||||
if (options.includeDeleted) {
|
||||
params.showDeleted = 'true';
|
||||
}
|
||||
|
||||
const response = await this.fetchEvents(calendarId, params);
|
||||
|
||||
if (!response.items?.length) {
|
||||
break;
|
||||
}
|
||||
|
||||
// Update total
|
||||
progress.total += response.items.length;
|
||||
|
||||
// Process events
|
||||
for (const event of response.items) {
|
||||
if (this.abortController.signal.aborted) break;
|
||||
|
||||
// Skip cancelled events unless including deleted
|
||||
if (event.status === 'cancelled' && !options.includeDeleted) {
|
||||
continue;
|
||||
}
|
||||
|
||||
const encrypted = await this.processEvent(event, calendarId);
|
||||
if (encrypted) {
|
||||
eventBatch.push(encrypted);
|
||||
progress.imported++;
|
||||
|
||||
// Save batch every 50 events
|
||||
if (eventBatch.length >= 50) {
|
||||
await calendarStore.putBatch(eventBatch);
|
||||
eventBatch.length = 0;
|
||||
}
|
||||
|
||||
options.onProgress?.(progress);
|
||||
}
|
||||
|
||||
// Check limit
|
||||
if (options.maxEvents && progress.imported >= options.maxEvents) {
|
||||
break;
|
||||
}
|
||||
}
|
||||
|
||||
pageToken = response.nextPageToken;
|
||||
|
||||
// Check limit
|
||||
if (options.maxEvents && progress.imported >= options.maxEvents) {
|
||||
break;
|
||||
}
|
||||
|
||||
} while (pageToken);
|
||||
|
||||
// Check limit
|
||||
if (options.maxEvents && progress.imported >= options.maxEvents) {
|
||||
break;
|
||||
}
|
||||
}
|
||||
|
||||
// Save remaining events
|
||||
if (eventBatch.length > 0) {
|
||||
await calendarStore.putBatch(eventBatch);
|
||||
}
|
||||
|
||||
progress.status = 'completed';
|
||||
progress.completedAt = Date.now();
|
||||
await syncMetadataStore.markComplete('calendar', progress.imported);
|
||||
|
||||
} catch (error) {
|
||||
console.error('Calendar import error:', error);
|
||||
progress.status = 'error';
|
||||
progress.errorMessage = error instanceof Error ? error.message : 'Unknown error';
|
||||
await syncMetadataStore.markError('calendar', progress.errorMessage);
|
||||
}
|
||||
|
||||
options.onProgress?.(progress);
|
||||
return progress;
|
||||
}
|
||||
|
||||
// Fetch events from Calendar API
|
||||
private async fetchEvents(
|
||||
calendarId: string,
|
||||
params: Record<string, string>
|
||||
): Promise<EventsListResponse> {
|
||||
const url = new URL(`${CALENDAR_API_BASE}/calendars/${encodeURIComponent(calendarId)}/events`);
|
||||
for (const [key, value] of Object.entries(params)) {
|
||||
url.searchParams.set(key, value);
|
||||
}
|
||||
|
||||
const response = await fetch(url.toString(), {
|
||||
headers: {
|
||||
Authorization: `Bearer ${this.accessToken}`
|
||||
},
|
||||
signal: this.abortController?.signal
|
||||
});
|
||||
|
||||
if (!response.ok) {
|
||||
throw new Error(`Calendar API error: ${response.status} ${response.statusText}`);
|
||||
}
|
||||
|
||||
return response.json();
|
||||
}
|
||||
|
||||
// Process a single event
|
||||
private async processEvent(
|
||||
event: CalendarEvent,
|
||||
calendarId: string
|
||||
): Promise<EncryptedCalendarEvent | null> {
|
||||
if (!this.encryptionKey) {
|
||||
throw new Error('Encryption key not initialized');
|
||||
}
|
||||
|
||||
// Helper to encrypt
|
||||
const encrypt = async (data: string): Promise<EncryptedData> => {
|
||||
return encryptData(data, this.encryptionKey!);
|
||||
};
|
||||
|
||||
const startTime = parseEventTime(event.start);
|
||||
const endTime = parseEventTime(event.end);
|
||||
const timezone = event.start?.timeZone || Intl.DateTimeFormat().resolvedOptions().timeZone;
|
||||
const meetingLink = getMeetingLink(event);
|
||||
|
||||
// Serialize attendees for encryption
|
||||
const attendeesData = event.attendees
|
||||
? JSON.stringify(event.attendees)
|
||||
: null;
|
||||
|
||||
// Serialize recurrence for encryption
|
||||
const recurrenceData = event.recurrence
|
||||
? JSON.stringify(event.recurrence)
|
||||
: null;
|
||||
|
||||
// Get reminders
|
||||
const reminders: { method: string; minutes: number }[] = [];
|
||||
if (event.reminders?.overrides) {
|
||||
reminders.push(...event.reminders.overrides);
|
||||
} else if (event.reminders?.useDefault) {
|
||||
// The API does not expand the user's default reminders; assume a 10-minute popup
reminders.push({ method: 'popup', minutes: 10 });
|
||||
}
|
||||
|
||||
return {
|
||||
id: event.id,
|
||||
calendarId,
|
||||
encryptedSummary: await encrypt(event.summary || ''),
|
||||
encryptedDescription: event.description ? await encrypt(event.description) : null,
|
||||
encryptedLocation: event.location ? await encrypt(event.location) : null,
|
||||
startTime,
|
||||
endTime,
|
||||
isAllDay: isAllDayEvent(event),
|
||||
timezone,
|
||||
isRecurring: !!event.recurringEventId || !!event.recurrence?.length,
|
||||
encryptedRecurrence: recurrenceData ? await encrypt(recurrenceData) : null,
|
||||
encryptedAttendees: attendeesData ? await encrypt(attendeesData) : null,
|
||||
reminders,
|
||||
encryptedMeetingLink: meetingLink ? await encrypt(meetingLink) : null,
|
||||
syncedAt: Date.now()
|
||||
};
|
||||
}
|
||||
|
||||
// List available calendars
|
||||
async listCalendars(): Promise<{
|
||||
id: string;
|
||||
name: string;
|
||||
primary: boolean;
|
||||
accessRole: string;
|
||||
}[]> {
|
||||
if (!await this.initialize()) {
|
||||
return [];
|
||||
}
|
||||
|
||||
try {
|
||||
const calendars: CalendarListEntry[] = [];
|
||||
let pageToken: string | undefined;
|
||||
|
||||
do {
|
||||
const url = new URL(`${CALENDAR_API_BASE}/users/me/calendarList`);
|
||||
url.searchParams.set('maxResults', '100');
|
||||
if (pageToken) {
|
||||
url.searchParams.set('pageToken', pageToken);
|
||||
}
|
||||
|
||||
const response = await fetch(url.toString(), {
|
||||
headers: {
|
||||
Authorization: `Bearer ${this.accessToken}`
|
||||
}
|
||||
});
|
||||
|
||||
if (!response.ok) break;
|
||||
|
||||
const data: CalendarListResponse = await response.json();
|
||||
if (data.items) {
|
||||
calendars.push(...data.items);
|
||||
}
|
||||
pageToken = data.nextPageToken;
|
||||
|
||||
} while (pageToken);
|
||||
|
||||
return calendars.map(c => ({
|
||||
id: c.id,
|
||||
name: c.summary || 'Untitled',
|
||||
primary: c.primary || false,
|
||||
accessRole: c.accessRole || 'reader'
|
||||
}));
|
||||
|
||||
} catch (error) {
|
||||
console.error('List calendars error:', error);
|
||||
return [];
|
||||
}
|
||||
}
|
||||
|
||||
// Get upcoming events directly from the API (plaintext, for quick display; not read from the encrypted store)
|
||||
async getUpcomingEvents(limit: number = 10): Promise<CalendarEvent[]> {
|
||||
if (!await this.initialize()) {
|
||||
return [];
|
||||
}
|
||||
|
||||
try {
|
||||
const params: Record<string, string> = {
|
||||
maxResults: String(limit),
|
||||
singleEvents: 'true',
|
||||
orderBy: 'startTime',
|
||||
timeMin: new Date().toISOString()
|
||||
};
|
||||
|
||||
const response = await this.fetchEvents('primary', params);
|
||||
return response.items || [];
|
||||
|
||||
} catch (error) {
|
||||
console.error('Get upcoming events error:', error);
|
||||
return [];
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Convenience function
|
||||
export async function importCalendar(
|
||||
masterKey: CryptoKey,
|
||||
options: CalendarImportOptions = {}
|
||||
): Promise<ImportProgress> {
|
||||
const importer = new CalendarImporter(masterKey);
|
||||
return importer.import(options);
|
||||
}
|
||||
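// Usage sketch (option values are illustrative, not defaults):
async function runCalendarImportExample(masterKey: CryptoKey): Promise<void> {
  const result = await importCalendar(masterKey, {
    maxEvents: 1000,
    onProgress: p => console.log(`calendar: ${p.imported}/${p.total}`)
  });
  console.log(`Calendar import finished: ${result.status}`);
}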
|
|
@@ -0,0 +1,406 @@
|
|||
// Google Drive import with folder navigation and progress tracking
|
||||
// All data is encrypted before storage
|
||||
|
||||
import type { EncryptedDriveDocument, ImportProgress, EncryptedData } from '../types';
|
||||
import { encryptData, deriveServiceKey } from '../encryption';
|
||||
import { driveStore, syncMetadataStore } from '../database';
|
||||
import { getAccessToken } from '../oauth';
|
||||
|
||||
const DRIVE_API_BASE = 'https://www.googleapis.com/drive/v3';
|
||||
|
||||
// Import options
|
||||
export interface DriveImportOptions {
|
||||
maxFiles?: number; // Limit total files to import
|
||||
folderId?: string; // Start from a specific folder (omit for the root)
|
||||
mimeTypesFilter?: string[]; // Only import these MIME types
|
||||
includeShared?: boolean; // Include shared files
|
||||
includeTrashed?: boolean; // Include trashed files
|
||||
exportFormats?: Record<string, string>; // Google Docs export formats
|
||||
onProgress?: (progress: ImportProgress) => void;
|
||||
}
|
||||
|
||||
// Drive file list response
|
||||
interface DriveFileListResponse {
|
||||
files?: DriveFile[];
|
||||
nextPageToken?: string;
|
||||
}
|
||||
|
||||
// Drive file metadata
|
||||
interface DriveFile {
|
||||
id: string;
|
||||
name: string;
|
||||
mimeType: string;
|
||||
size?: string;
|
||||
modifiedTime?: string;
|
||||
createdTime?: string;
|
||||
parents?: string[];
|
||||
shared?: boolean;
|
||||
trashed?: boolean;
|
||||
webViewLink?: string;
|
||||
thumbnailLink?: string;
|
||||
}
|
||||
|
||||
// Default export formats for Google Docs
|
||||
const DEFAULT_EXPORT_FORMATS: Record<string, string> = {
|
||||
'application/vnd.google-apps.document': 'text/markdown',
|
||||
'application/vnd.google-apps.spreadsheet': 'text/csv',
|
||||
'application/vnd.google-apps.presentation': 'application/pdf',
|
||||
'application/vnd.google-apps.drawing': 'image/png'
|
||||
};
|
||||
|
||||
// Determine content strategy based on file size and type
|
||||
function getContentStrategy(file: DriveFile): 'inline' | 'reference' | 'chunked' {
|
||||
const size = parseInt(file.size || '0', 10);
|
||||
|
||||
// Google Workspace files report no size; their exports are always inlined
|
||||
if (file.mimeType.startsWith('application/vnd.google-apps.')) {
|
||||
return 'inline';
|
||||
}
|
||||
|
||||
// Small files (< 1MB) inline
|
||||
if (size < 1024 * 1024) {
|
||||
return 'inline';
|
||||
}
|
||||
|
||||
// Medium files (1-10MB) chunked
|
||||
if (size < 10 * 1024 * 1024) {
|
||||
return 'chunked';
|
||||
}
|
||||
|
||||
// Large files just store reference
|
||||
return 'reference';
|
||||
}
|
||||
|
||||
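
// Example (hypothetical files): a 200 KB PDF -> 'inline', a 4 MB image ->
// 'chunked', a 50 MB video -> 'reference'. Google Workspace files report no
// size, so they are always exported inline.
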
// Check if file is a Google Workspace file
function isGoogleWorkspaceFile(mimeType: string): boolean {
  return mimeType.startsWith('application/vnd.google-apps.');
}

// Main Drive import class
export class DriveImporter {
  private accessToken: string | null = null;
  private encryptionKey: CryptoKey | null = null;
  private abortController: AbortController | null = null;

  constructor(
    private masterKey: CryptoKey
  ) {}

  // Initialize importer
  async initialize(): Promise<boolean> {
    this.accessToken = await getAccessToken(this.masterKey);
    if (!this.accessToken) {
      console.error('No access token available for Drive');
      return false;
    }

    this.encryptionKey = await deriveServiceKey(this.masterKey, 'drive');
    return true;
  }

  // Abort current import
  abort(): void {
    this.abortController?.abort();
  }

  // Import Drive files
  async import(options: DriveImportOptions = {}): Promise<ImportProgress> {
    const progress: ImportProgress = {
      service: 'drive',
      total: 0,
      imported: 0,
      status: 'importing'
    };

    if (!await this.initialize()) {
      progress.status = 'error';
      progress.errorMessage = 'Failed to initialize Drive importer';
      return progress;
    }

    this.abortController = new AbortController();
    progress.startedAt = Date.now();

    const exportFormats = options.exportFormats || DEFAULT_EXPORT_FORMATS;

    try {
      // Build query
      const queryParts: string[] = [];
      if (options.folderId) {
        queryParts.push(`'${options.folderId}' in parents`);
      }
      if (options.mimeTypesFilter?.length) {
        const mimeQuery = options.mimeTypesFilter
          .map(m => `mimeType='${m}'`)
          .join(' or ');
        queryParts.push(`(${mimeQuery})`);
      }
      if (!options.includeTrashed) {
        queryParts.push('trashed=false');
      }

      // Get file list
      let pageToken: string | undefined;
      const batchSize = 100;
      const fileBatch: EncryptedDriveDocument[] = [];

      do {
        if (this.abortController.signal.aborted) {
          progress.status = 'paused';
          break;
        }

        const params: Record<string, string> = {
          pageSize: String(batchSize),
          fields: 'nextPageToken,files(id,name,mimeType,size,modifiedTime,parents,shared,trashed,thumbnailLink)',
          q: queryParts.join(' and ') || 'trashed=false'
        };
        if (pageToken) {
          params.pageToken = pageToken;
        }

        const listResponse = await this.fetchApi('/files', params);

        if (!listResponse.files?.length) {
          break;
        }

        // Update total on first page
        if (progress.total === 0) {
          progress.total = listResponse.files.length;
        }

        // Process files
        for (const file of listResponse.files) {
          if (this.abortController.signal.aborted) break;

          // Skip shared files if not requested
          if (file.shared && !options.includeShared) {
            continue;
          }

          const encrypted = await this.processFile(file, exportFormats);
          if (encrypted) {
            fileBatch.push(encrypted);
            progress.imported++;

            // Save batch every 25 files
            if (fileBatch.length >= 25) {
              await driveStore.putBatch(fileBatch);
              fileBatch.length = 0;
            }

            options.onProgress?.(progress);
          }

          // Check limit
          if (options.maxFiles && progress.imported >= options.maxFiles) {
            break;
          }
        }

        pageToken = listResponse.nextPageToken;

        // Check limit
        if (options.maxFiles && progress.imported >= options.maxFiles) {
          break;
        }

      } while (pageToken);

      // Save remaining files
      if (fileBatch.length > 0) {
        await driveStore.putBatch(fileBatch);
      }

      progress.status = 'completed';
      progress.completedAt = Date.now();
      await syncMetadataStore.markComplete('drive', progress.imported);

    } catch (error) {
      console.error('Drive import error:', error);
      progress.status = 'error';
      progress.errorMessage = error instanceof Error ? error.message : 'Unknown error';
      await syncMetadataStore.markError('drive', progress.errorMessage);
    }

    options.onProgress?.(progress);
    return progress;
  }

  // Fetch from Drive API
  private async fetchApi(
    endpoint: string,
    params: Record<string, string> = {}
  ): Promise<DriveFileListResponse> {
    const url = new URL(`${DRIVE_API_BASE}${endpoint}`);
    for (const [key, value] of Object.entries(params)) {
      url.searchParams.set(key, value);
    }

    const response = await fetch(url.toString(), {
      headers: {
        Authorization: `Bearer ${this.accessToken}`
      },
      signal: this.abortController?.signal
    });

    if (!response.ok) {
      throw new Error(`Drive API error: ${response.status} ${response.statusText}`);
    }

    return response.json();
  }

  // Process a single file
  private async processFile(
    file: DriveFile,
    exportFormats: Record<string, string>
  ): Promise<EncryptedDriveDocument | null> {
    if (!this.encryptionKey) {
      throw new Error('Encryption key not initialized');
    }

    const strategy = getContentStrategy(file);
    let content: string | null = null;
    let preview: ArrayBuffer | null = null;

    try {
      // Get content based on strategy
      if (strategy === 'inline' || strategy === 'chunked') {
        if (isGoogleWorkspaceFile(file.mimeType)) {
          // Export Google Workspace file
          const exportFormat = exportFormats[file.mimeType];
          if (exportFormat) {
            content = await this.exportFile(file.id, exportFormat);
          }
        } else {
          // Download regular file
          content = await this.downloadFile(file.id);
        }
      }

      // Get thumbnail if available
      if (file.thumbnailLink) {
        try {
          preview = await this.fetchThumbnail(file.thumbnailLink);
        } catch {
          // Thumbnail fetch failed, continue without it
        }
      }

    } catch (error) {
      console.warn(`Failed to get content for file ${file.name}:`, error);
      // Continue with reference-only storage
    }

    // Helper to encrypt
    const encrypt = async (data: string): Promise<EncryptedData> => {
      return encryptData(data, this.encryptionKey!);
    };

    return {
      id: file.id,
      encryptedName: await encrypt(file.name),
      encryptedMimeType: await encrypt(file.mimeType),
      encryptedContent: content ? await encrypt(content) : null,
      encryptedPreview: preview ? await encryptData(preview, this.encryptionKey) : null,
      contentStrategy: strategy,
      parentId: file.parents?.[0] || null,
      encryptedPath: await encrypt(file.name), // TODO: build full path
      isShared: file.shared || false,
      modifiedTime: new Date(file.modifiedTime || 0).getTime(),
      size: parseInt(file.size || '0'),
      syncedAt: Date.now()
    };
  }

  // Export a Google Workspace file
  private async exportFile(fileId: string, mimeType: string): Promise<string> {
    const response = await fetch(
      `${DRIVE_API_BASE}/files/${fileId}/export?mimeType=${encodeURIComponent(mimeType)}`,
      {
        headers: {
          Authorization: `Bearer ${this.accessToken}`
        },
        signal: this.abortController?.signal
      }
    );

    if (!response.ok) {
      throw new Error(`Export failed: ${response.status}`);
    }

    return response.text();
  }

  // Download a regular file
  private async downloadFile(fileId: string): Promise<string> {
    const response = await fetch(
      `${DRIVE_API_BASE}/files/${fileId}?alt=media`,
      {
        headers: {
          Authorization: `Bearer ${this.accessToken}`
        },
        signal: this.abortController?.signal
      }
    );

    if (!response.ok) {
      throw new Error(`Download failed: ${response.status}`);
    }

    return response.text();
  }

  // Fetch thumbnail
  private async fetchThumbnail(thumbnailLink: string): Promise<ArrayBuffer> {
    const response = await fetch(thumbnailLink, {
      headers: {
        Authorization: `Bearer ${this.accessToken}`
      },
      signal: this.abortController?.signal
    });

    if (!response.ok) {
      throw new Error(`Thumbnail fetch failed: ${response.status}`);
    }

    return response.arrayBuffer();
  }

  // List folders for navigation
  async listFolders(parentId?: string): Promise<{ id: string; name: string }[]> {
    if (!await this.initialize()) {
      return [];
    }

    const query = [
      "mimeType='application/vnd.google-apps.folder'",
      'trashed=false',
      parentId ? `'${parentId}' in parents` : "'root' in parents"
    ].join(' and ');

    try {
      const response = await this.fetchApi('/files', {
        q: query,
        fields: 'files(id,name)',
        pageSize: '100'
      });

      return response.files?.map(f => ({ id: f.id, name: f.name })) || [];
    } catch (error) {
      console.error('List folders error:', error);
      return [];
    }
  }
}

// Convenience function
export async function importDrive(
  masterKey: CryptoKey,
  options: DriveImportOptions = {}
): Promise<ImportProgress> {
  const importer = new DriveImporter(masterKey);
  return importer.import(options);
}
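
A usage sketch for a folder-scoped Drive import (illustrative; the import path and the 'Reports' folder name are assumptions):

```ts
import { DriveImporter } from './importers'; // path is an assumption

async function importReportsFolder(masterKey: CryptoKey) {
  const importer = new DriveImporter(masterKey);

  // List root-level folders, then scope the import to one of them.
  const folders = await importer.listFolders();
  const reports = folders.find(f => f.name === 'Reports'); // hypothetical folder
  if (!reports) return null;

  return importer.import({
    folderId: reports.id,
    maxFiles: 200,
    mimeTypesFilter: ['application/pdf'],
    onProgress: p => console.log(`drive: ${p.imported}/${p.total}`)
  });
}
```
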
@@ -0,0 +1,409 @@

// Gmail import with pagination and progress tracking
// All data is encrypted before storage

import type { EncryptedEmailStore, ImportProgress, EncryptedData } from '../types';
import { encryptData, deriveServiceKey } from '../encryption';
import { gmailStore, syncMetadataStore } from '../database';
import { getAccessToken } from '../oauth';

const GMAIL_API_BASE = 'https://gmail.googleapis.com/gmail/v1/users/me';

// Import options
export interface GmailImportOptions {
  maxMessages?: number;    // Limit total messages to import
  labelsFilter?: string[]; // Only import from these labels
  dateAfter?: Date;        // Only import messages after this date
  dateBefore?: Date;       // Only import messages before this date
  includeSpam?: boolean;   // Include spam folder
  includeTrash?: boolean;  // Include trash folder
  onProgress?: (progress: ImportProgress) => void; // Progress callback
}

// Gmail message list response
interface GmailMessageListResponse {
  messages?: { id: string; threadId: string }[];
  nextPageToken?: string;
  resultSizeEstimate?: number;
}

// Gmail message response
interface GmailMessageResponse {
  id: string;
  threadId: string;
  labelIds?: string[];
  snippet?: string;
  historyId?: string;
  internalDate?: string;
  payload?: {
    mimeType?: string;
    headers?: { name: string; value: string }[];
    body?: { data?: string; size?: number };
    parts?: GmailMessagePart[];
  };
}

interface GmailMessagePart {
  mimeType?: string;
  body?: { data?: string; size?: number };
  parts?: GmailMessagePart[];
}

// Extract header value from message
function getHeader(message: GmailMessageResponse, name: string): string {
  const header = message.payload?.headers?.find(
    h => h.name.toLowerCase() === name.toLowerCase()
  );
  return header?.value || '';
}

// Decode base64url encoded content
function decodeBase64Url(data: string): string {
  try {
    // Replace URL-safe characters and add padding
    const base64 = data.replace(/-/g, '+').replace(/_/g, '/');
    const padding = base64.length % 4;
    const paddedBase64 = padding ? base64 + '='.repeat(4 - padding) : base64;
    return atob(paddedBase64);
  } catch {
    return '';
  }
}
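
// Example: decodeBase64Url('SGVsbG8_IQ') === 'Hello?!' -- '-' and '_' are
// mapped back to '+' and '/', '==' padding is restored, then atob() decodes.
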
// Extract message body from parts
function extractBody(message: GmailMessageResponse): string {
  const payload = message.payload;
  if (!payload) return '';

  // Check direct body
  if (payload.body?.data) {
    return decodeBase64Url(payload.body.data);
  }

  // Check parts for text/plain or text/html
  if (payload.parts) {
    return extractBodyFromParts(payload.parts);
  }

  return '';
}

function extractBodyFromParts(parts: GmailMessagePart[]): string {
  // Prefer text/plain, fall back to text/html
  let plainText = '';
  let htmlText = '';

  for (const part of parts) {
    if (part.mimeType === 'text/plain' && part.body?.data) {
      plainText = decodeBase64Url(part.body.data);
    } else if (part.mimeType === 'text/html' && part.body?.data) {
      htmlText = decodeBase64Url(part.body.data);
    } else if (part.parts) {
      // Recursively check nested parts
      const nested = extractBodyFromParts(part.parts);
      if (nested) return nested;
    }
  }

  return plainText || htmlText;
}

// Check if message has attachments
function hasAttachments(message: GmailMessageResponse): boolean {
  const parts = message.payload?.parts || [];
  return parts.some(part =>
    part.body?.size && part.body.size > 0 &&
    part.mimeType !== 'text/plain' && part.mimeType !== 'text/html'
  );
}

// Build query string from options
function buildQuery(options: GmailImportOptions): string {
  const queryParts: string[] = [];

  if (options.dateAfter) {
    queryParts.push(`after:${Math.floor(options.dateAfter.getTime() / 1000)}`);
  }
  if (options.dateBefore) {
    queryParts.push(`before:${Math.floor(options.dateBefore.getTime() / 1000)}`);
  }
  if (!options.includeSpam) {
    queryParts.push('-in:spam');
  }
  if (!options.includeTrash) {
    queryParts.push('-in:trash');
  }

  return queryParts.join(' ');
}
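
// Example: buildQuery({ dateAfter: new Date('2024-01-01T00:00:00Z') })
// returns 'after:1704067200 -in:spam -in:trash' (spam and trash are
// excluded unless explicitly opted in).
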
// Main Gmail import class
export class GmailImporter {
  private accessToken: string | null = null;
  private encryptionKey: CryptoKey | null = null;
  private abortController: AbortController | null = null;

  constructor(
    private masterKey: CryptoKey
  ) {}

  // Initialize importer (get token and derive key)
  async initialize(): Promise<boolean> {
    this.accessToken = await getAccessToken(this.masterKey);
    if (!this.accessToken) {
      console.error('No access token available for Gmail');
      return false;
    }

    this.encryptionKey = await deriveServiceKey(this.masterKey, 'gmail');
    return true;
  }

  // Abort current import
  abort(): void {
    this.abortController?.abort();
  }

  // Import Gmail messages
  async import(options: GmailImportOptions = {}): Promise<ImportProgress> {
    const progress: ImportProgress = {
      service: 'gmail',
      total: 0,
      imported: 0,
      status: 'importing'
    };

    if (!await this.initialize()) {
      progress.status = 'error';
      progress.errorMessage = 'Failed to initialize Gmail importer';
      return progress;
    }

    this.abortController = new AbortController();
    progress.startedAt = Date.now();

    try {
      // First, get total count
      const countResponse = await this.fetchApi('/messages', {
        maxResults: '1',
        q: buildQuery(options)
      });

      progress.total = countResponse.resultSizeEstimate || 0;
      if (options.maxMessages) {
        progress.total = Math.min(progress.total, options.maxMessages);
      }

      options.onProgress?.(progress);

      // Fetch messages with pagination
      let pageToken: string | undefined;
      const batchSize = 100;
      const messageBatch: EncryptedEmailStore[] = [];

      do {
        // Check for abort
        if (this.abortController.signal.aborted) {
          progress.status = 'paused';
          break;
        }

        // Fetch message list
        const listParams: Record<string, string> = {
          maxResults: String(batchSize),
          q: buildQuery(options)
        };
        if (pageToken) {
          listParams.pageToken = pageToken;
        }
        if (options.labelsFilter?.length) {
          listParams.labelIds = options.labelsFilter.join(',');
        }

        const listResponse: GmailMessageListResponse = await this.fetchApi('/messages', listParams);

        if (!listResponse.messages?.length) {
          break;
        }

        // Fetch full message details in parallel (batches of 10)
        const messages = listResponse.messages;
        for (let i = 0; i < messages.length; i += 10) {
          if (this.abortController.signal.aborted) break;

          const batch = messages.slice(i, i + 10);
          const fullMessages = await Promise.all(
            batch.map(msg => this.fetchMessage(msg.id))
          );

          // Encrypt and store each message
          for (const message of fullMessages) {
            if (message) {
              const encrypted = await this.encryptMessage(message);
              messageBatch.push(encrypted);
              progress.imported++;

              // Save batch every 50 messages
              if (messageBatch.length >= 50) {
                await gmailStore.putBatch(messageBatch);
                messageBatch.length = 0;
              }

              options.onProgress?.(progress);

              // Check max messages limit
              if (options.maxMessages && progress.imported >= options.maxMessages) {
                break;
              }
            }
          }

          // Small delay to avoid rate limiting
          await new Promise(r => setTimeout(r, 50));
        }

        pageToken = listResponse.nextPageToken;

        // Check max messages limit
        if (options.maxMessages && progress.imported >= options.maxMessages) {
          break;
        }

      } while (pageToken);

      // Save remaining messages
      if (messageBatch.length > 0) {
        await gmailStore.putBatch(messageBatch);
      }

      // Update sync metadata
      progress.status = 'completed';
      progress.completedAt = Date.now();
      await syncMetadataStore.markComplete('gmail', progress.imported);

    } catch (error) {
      console.error('Gmail import error:', error);
      progress.status = 'error';
      progress.errorMessage = error instanceof Error ? error.message : 'Unknown error';
      await syncMetadataStore.markError('gmail', progress.errorMessage);
    }

    options.onProgress?.(progress);
    return progress;
  }

  // Fetch from Gmail API
  private async fetchApi(
    endpoint: string,
    params: Record<string, string> = {}
  ): Promise<GmailMessageListResponse> {
    const url = new URL(`${GMAIL_API_BASE}${endpoint}`);
    for (const [key, value] of Object.entries(params)) {
      url.searchParams.set(key, value);
    }

    const response = await fetch(url.toString(), {
      headers: {
        Authorization: `Bearer ${this.accessToken}`
      },
      signal: this.abortController?.signal
    });

    if (!response.ok) {
      throw new Error(`Gmail API error: ${response.status} ${response.statusText}`);
    }

    return response.json();
  }

  // Fetch a single message with full content
  private async fetchMessage(messageId: string): Promise<GmailMessageResponse | null> {
    try {
      const response = await fetch(
        `${GMAIL_API_BASE}/messages/${messageId}?format=full`,
        {
          headers: {
            Authorization: `Bearer ${this.accessToken}`
          },
          signal: this.abortController?.signal
        }
      );

      if (!response.ok) {
        console.warn(`Failed to fetch message ${messageId}`);
        return null;
      }

      return response.json();
    } catch (error) {
      console.warn(`Error fetching message ${messageId}:`, error);
      return null;
    }
  }

  // Encrypt a message for storage
  private async encryptMessage(message: GmailMessageResponse): Promise<EncryptedEmailStore> {
    if (!this.encryptionKey) {
      throw new Error('Encryption key not initialized');
    }

    const subject = getHeader(message, 'Subject');
    const from = getHeader(message, 'From');
    const to = getHeader(message, 'To');
    const body = extractBody(message);
    const snippet = message.snippet || '';

    // Helper to encrypt string fields
    const encrypt = async (data: string): Promise<EncryptedData> => {
      return encryptData(data, this.encryptionKey!);
    };

    return {
      id: message.id,
      threadId: message.threadId,
      encryptedSubject: await encrypt(subject),
      encryptedBody: await encrypt(body),
      encryptedFrom: await encrypt(from),
      encryptedTo: await encrypt(to),
      date: parseInt(message.internalDate || '0'),
      labels: message.labelIds || [],
      hasAttachments: hasAttachments(message),
      encryptedSnippet: await encrypt(snippet),
      syncedAt: Date.now(),
      localOnly: true
    };
  }

  // Get Gmail labels
  async getLabels(): Promise<{ id: string; name: string; type: string }[]> {
    if (!await this.initialize()) {
      return [];
    }

    try {
      const response = await fetch(`${GMAIL_API_BASE}/labels`, {
        headers: {
          Authorization: `Bearer ${this.accessToken}`
        }
      });

      if (!response.ok) {
        return [];
      }

      const data = await response.json() as { labels?: { id: string; name: string; type: string }[] };
      return data.labels || [];
    } catch (error) {
      console.error('Get labels error:', error);
      return [];
    }
  }
}

// Convenience function to create and run importer
export async function importGmail(
  masterKey: CryptoKey,
  options: GmailImportOptions = {}
): Promise<ImportProgress> {
  const importer = new GmailImporter(masterKey);
  return importer.import(options);
}
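
A usage sketch for the Gmail importer above (illustrative; the import path is an assumption):

```ts
import { GmailImporter } from './importers'; // path is an assumption

async function importRecentMail(masterKey: CryptoKey) {
  const importer = new GmailImporter(masterKey);

  // Optionally inspect labels first, e.g. to build a labelsFilter.
  const labels = await importer.getLabels();
  console.log('Labels:', labels.map(l => l.name).join(', '));

  // Import up to 1000 messages from the last 30 days.
  return importer.import({
    maxMessages: 1000,
    dateAfter: new Date(Date.now() - 30 * 24 * 60 * 60 * 1000),
    onProgress: p => console.log(`gmail: ${p.imported}/${p.total}`)
  });
}
```
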
@@ -0,0 +1,5 @@

// Export all importers
export { GmailImporter, importGmail, type GmailImportOptions } from './gmail';
export { DriveImporter, importDrive, type DriveImportOptions } from './drive';
export { PhotosImporter, importPhotos, type PhotosImportOptions } from './photos';
export { CalendarImporter, importCalendar, type CalendarImportOptions } from './calendar';
@@ -0,0 +1,424 @@

// Google Photos import with thumbnail storage
// Full resolution images are NOT stored locally - fetch on demand
// All data is encrypted before storage

import type { EncryptedPhotoReference, ImportProgress, EncryptedData } from '../types';
import { encryptData, deriveServiceKey } from '../encryption';
import { photosStore, syncMetadataStore } from '../database';
import { getAccessToken } from '../oauth';

const PHOTOS_API_BASE = 'https://photoslibrary.googleapis.com/v1';

// Import options
export interface PhotosImportOptions {
  maxPhotos?: number;      // Limit total photos to import
  albumId?: string;        // Only import from specific album
  dateAfter?: Date;        // Only import photos after this date
  dateBefore?: Date;       // Only import photos before this date
  mediaTypes?: ('image' | 'video')[]; // Filter by media type
  thumbnailSize?: number;  // Thumbnail width (default 256)
  onProgress?: (progress: ImportProgress) => void;
}

// Photos API response types
interface PhotosListResponse {
  mediaItems?: PhotosMediaItem[];
  nextPageToken?: string;
}

interface PhotosMediaItem {
  id: string;
  productUrl?: string;
  baseUrl?: string;
  mimeType?: string;
  filename?: string;
  description?: string;
  mediaMetadata?: {
    creationTime?: string;
    width?: string;
    height?: string;
    photo?: {
      cameraMake?: string;
      cameraModel?: string;
      focalLength?: number;
      apertureFNumber?: number;
      isoEquivalent?: number;
    };
    video?: {
      fps?: number;
      status?: string;
    };
  };
  contributorInfo?: {
    profilePictureBaseUrl?: string;
    displayName?: string;
  };
}

interface PhotosAlbum {
  id: string;
  title?: string;
  productUrl?: string;
  mediaItemsCount?: string;
  coverPhotoBaseUrl?: string;
  coverPhotoMediaItemId?: string;
}

// Main Photos import class
export class PhotosImporter {
  private accessToken: string | null = null;
  private encryptionKey: CryptoKey | null = null;
  private abortController: AbortController | null = null;

  constructor(
    private masterKey: CryptoKey
  ) {}

  // Initialize importer
  async initialize(): Promise<boolean> {
    this.accessToken = await getAccessToken(this.masterKey);
    if (!this.accessToken) {
      console.error('No access token available for Photos');
      return false;
    }

    this.encryptionKey = await deriveServiceKey(this.masterKey, 'photos');
    return true;
  }

  // Abort current import
  abort(): void {
    this.abortController?.abort();
  }

  // Import photos
  async import(options: PhotosImportOptions = {}): Promise<ImportProgress> {
    const progress: ImportProgress = {
      service: 'photos',
      total: 0,
      imported: 0,
      status: 'importing'
    };

    if (!await this.initialize()) {
      progress.status = 'error';
      progress.errorMessage = 'Failed to initialize Photos importer';
      return progress;
    }

    this.abortController = new AbortController();
    progress.startedAt = Date.now();

    const thumbnailSize = options.thumbnailSize || 256;

    try {
      let pageToken: string | undefined;
      const batchSize = 100;
      const photoBatch: EncryptedPhotoReference[] = [];

      do {
        if (this.abortController.signal.aborted) {
          progress.status = 'paused';
          break;
        }

        // Fetch media items
        const listResponse = await this.fetchMediaItems(options, pageToken, batchSize);

        if (!listResponse.mediaItems?.length) {
          break;
        }

        // Update total on first page
        if (progress.total === 0) {
          progress.total = listResponse.mediaItems.length;
        }

        // Process media items
        for (const item of listResponse.mediaItems) {
          if (this.abortController.signal.aborted) break;

          // Filter by media type if specified
          const isVideo = !!item.mediaMetadata?.video;
          const mediaType = isVideo ? 'video' : 'image';

          if (options.mediaTypes?.length && !options.mediaTypes.includes(mediaType)) {
            continue;
          }

          // Filter by date if specified
          const creationTime = item.mediaMetadata?.creationTime
            ? new Date(item.mediaMetadata.creationTime).getTime()
            : 0;

          if (options.dateAfter && creationTime < options.dateAfter.getTime()) {
            continue;
          }
          if (options.dateBefore && creationTime > options.dateBefore.getTime()) {
            continue;
          }

          const encrypted = await this.processMediaItem(item, thumbnailSize);
          if (encrypted) {
            photoBatch.push(encrypted);
            progress.imported++;

            // Save batch every 25 items
            if (photoBatch.length >= 25) {
              await photosStore.putBatch(photoBatch);
              photoBatch.length = 0;
            }

            options.onProgress?.(progress);
          }

          // Check limit
          if (options.maxPhotos && progress.imported >= options.maxPhotos) {
            break;
          }

          // Small delay for rate limiting
          await new Promise(r => setTimeout(r, 20));
        }

        pageToken = listResponse.nextPageToken;

        // Check limit
        if (options.maxPhotos && progress.imported >= options.maxPhotos) {
          break;
        }

      } while (pageToken);

      // Save remaining photos
      if (photoBatch.length > 0) {
        await photosStore.putBatch(photoBatch);
      }

      progress.status = 'completed';
      progress.completedAt = Date.now();
      await syncMetadataStore.markComplete('photos', progress.imported);

    } catch (error) {
      console.error('Photos import error:', error);
      progress.status = 'error';
      progress.errorMessage = error instanceof Error ? error.message : 'Unknown error';
      await syncMetadataStore.markError('photos', progress.errorMessage);
    }

    options.onProgress?.(progress);
    return progress;
  }

  // Fetch media items from API
  private async fetchMediaItems(
    options: PhotosImportOptions,
    pageToken: string | undefined,
    pageSize: number
  ): Promise<PhotosListResponse> {
    // If album specified, use album search
    if (options.albumId) {
      return this.searchByAlbum(options.albumId, pageToken, pageSize);
    }

    // Otherwise use list all
    const url = new URL(`${PHOTOS_API_BASE}/mediaItems`);
    url.searchParams.set('pageSize', String(pageSize));
    if (pageToken) {
      url.searchParams.set('pageToken', pageToken);
    }

    const response = await fetch(url.toString(), {
      headers: {
        Authorization: `Bearer ${this.accessToken}`
      },
      signal: this.abortController?.signal
    });

    if (!response.ok) {
      throw new Error(`Photos API error: ${response.status} ${response.statusText}`);
    }

    return response.json();
  }

  // Search by album
  private async searchByAlbum(
    albumId: string,
    pageToken: string | undefined,
    pageSize: number
  ): Promise<PhotosListResponse> {
    const body: Record<string, unknown> = {
      albumId,
      pageSize
    };
    if (pageToken) {
      body.pageToken = pageToken;
    }

    const response = await fetch(`${PHOTOS_API_BASE}/mediaItems:search`, {
      method: 'POST',
      headers: {
        Authorization: `Bearer ${this.accessToken}`,
        'Content-Type': 'application/json'
      },
      body: JSON.stringify(body),
      signal: this.abortController?.signal
    });

    if (!response.ok) {
      throw new Error(`Photos search error: ${response.status}`);
    }

    return response.json();
  }

  // Process a single media item
  private async processMediaItem(
    item: PhotosMediaItem,
    thumbnailSize: number
  ): Promise<EncryptedPhotoReference | null> {
    if (!this.encryptionKey) {
      throw new Error('Encryption key not initialized');
    }

    const isVideo = !!item.mediaMetadata?.video;
    const mediaType: 'image' | 'video' = isVideo ? 'video' : 'image';

    // Fetch thumbnail
    let thumbnailData: EncryptedData | null = null;
    if (item.baseUrl) {
      try {
        const thumbnailUrl = isVideo
          ? `${item.baseUrl}=w${thumbnailSize}-h${thumbnailSize}` // Video thumbnail
          : `${item.baseUrl}=w${thumbnailSize}-h${thumbnailSize}-c`; // Image thumbnail (cropped)

        const thumbResponse = await fetch(thumbnailUrl, {
          signal: this.abortController?.signal
        });

        if (thumbResponse.ok) {
          const thumbBuffer = await thumbResponse.arrayBuffer();
          thumbnailData = await encryptData(thumbBuffer, this.encryptionKey);
        }
      } catch (error) {
        console.warn(`Failed to fetch thumbnail for ${item.id}:`, error);
      }
    }

    // Helper to encrypt
    const encrypt = async (data: string): Promise<EncryptedData> => {
      return encryptData(data, this.encryptionKey!);
    };

    const width = parseInt(item.mediaMetadata?.width || '0');
    const height = parseInt(item.mediaMetadata?.height || '0');
    const creationTime = item.mediaMetadata?.creationTime
      ? new Date(item.mediaMetadata.creationTime).getTime()
      : Date.now();

    return {
      id: item.id,
      encryptedFilename: await encrypt(item.filename || ''),
      encryptedDescription: item.description ? await encrypt(item.description) : null,
      thumbnail: thumbnailData ? {
        width: Math.min(thumbnailSize, width),
        height: Math.min(thumbnailSize, height),
        encryptedData: thumbnailData
      } : null,
      fullResolution: {
        width,
        height
      },
      mediaType,
      creationTime,
      albumIds: [], // Would need separate album lookup
      encryptedLocation: null, // Location data not available in basic API
      syncedAt: Date.now()
    };
  }

  // List albums
  async listAlbums(): Promise<{ id: string; title: string; count: number }[]> {
    if (!await this.initialize()) {
      return [];
    }

    try {
      const albums: PhotosAlbum[] = [];
      let pageToken: string | undefined;

      do {
        const url = new URL(`${PHOTOS_API_BASE}/albums`);
        url.searchParams.set('pageSize', '50');
        if (pageToken) {
          url.searchParams.set('pageToken', pageToken);
        }

        const response = await fetch(url.toString(), {
          headers: {
            Authorization: `Bearer ${this.accessToken}`
          }
        });

        if (!response.ok) break;

        const data = await response.json() as { albums?: PhotosAlbum[]; nextPageToken?: string };
        if (data.albums) {
          albums.push(...data.albums);
        }
        pageToken = data.nextPageToken;

      } while (pageToken);

      return albums.map(a => ({
        id: a.id,
        title: a.title || 'Untitled',
        count: parseInt(a.mediaItemsCount || '0')
      }));

    } catch (error) {
      console.error('List albums error:', error);
      return [];
    }
  }

  // Get full resolution URL for a photo (requires fresh baseUrl)
  async getFullResolutionUrl(mediaItemId: string): Promise<string | null> {
    if (!await this.initialize()) {
      return null;
    }

    try {
      const response = await fetch(`${PHOTOS_API_BASE}/mediaItems/${mediaItemId}`, {
        headers: {
          Authorization: `Bearer ${this.accessToken}`
        }
      });

      if (!response.ok) return null;

      const item: PhotosMediaItem = await response.json();

      if (!item.baseUrl) return null;

      // Full resolution URL with download parameter
      const isVideo = !!item.mediaMetadata?.video;
      return isVideo
        ? `${item.baseUrl}=dv` // Download video
        : `${item.baseUrl}=d`; // Download image
    } catch (error) {
      console.error('Get full resolution error:', error);
      return null;
    }
  }
}

// Convenience function
export async function importPhotos(
  masterKey: CryptoKey,
  options: PhotosImportOptions = {}
): Promise<ImportProgress> {
  const importer = new PhotosImporter(masterKey);
  return importer.import(options);
}
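
A usage sketch for album-scoped Photos import (illustrative; the import path is an assumption):

```ts
import { PhotosImporter } from './importers'; // path is an assumption

async function importFirstAlbum(masterKey: CryptoKey) {
  const importer = new PhotosImporter(masterKey);

  const albums = await importer.listAlbums();
  if (albums.length === 0) return null;

  // Only image thumbnails from the first album, at 512px.
  return importer.import({
    albumId: albums[0].id,
    mediaTypes: ['image'],
    thumbnailSize: 512,
    onProgress: p => console.log(`photos: ${p.imported}/${p.total}`)
  });
}
```
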
@@ -0,0 +1,287 @@

// Google Data Sovereignty Module
// Local-first, encrypted storage for Google Workspace data

// Types
export type {
  EncryptedData,
  EncryptedEmailStore,
  EncryptedDriveDocument,
  EncryptedPhotoReference,
  EncryptedCalendarEvent,
  SyncMetadata,
  EncryptionMetadata,
  EncryptedTokens,
  ImportProgress,
  StorageQuotaInfo,
  ShareableItem,
  GoogleService
} from './types';

export { GOOGLE_SCOPES, DB_STORES } from './types';

// Encryption utilities
export {
  hasWebCrypto,
  generateMasterKey,
  exportMasterKey,
  importMasterKey,
  deriveServiceKey,
  encryptData,
  decryptData,
  decryptDataToString,
  generateCodeVerifier,
  generateCodeChallenge,
  generateSalt,
  encryptMasterKeyWithPassword,
  decryptMasterKeyWithPassword
} from './encryption';

// Database operations
export {
  openDatabase,
  closeDatabase,
  deleteDatabase,
  gmailStore,
  driveStore,
  photosStore,
  calendarStore,
  syncMetadataStore,
  encryptionMetaStore,
  tokensStore,
  requestPersistentStorage,
  checkStorageQuota,
  hasSafariLimitations,
  touchLocalData,
  clearServiceData,
  exportAllData
} from './database';

// OAuth
export {
  initiateGoogleAuth,
  handleGoogleCallback,
  getAccessToken,
  isGoogleAuthenticated,
  getGrantedScopes,
  isServiceAuthorized,
  revokeGoogleAccess,
  getGoogleUserInfo,
  parseCallbackParams
} from './oauth';

// Importers
export {
  GmailImporter,
  importGmail,
  DriveImporter,
  importDrive,
  PhotosImporter,
  importPhotos,
  CalendarImporter,
  importCalendar
} from './importers';

export type {
  GmailImportOptions,
  DriveImportOptions,
  PhotosImportOptions,
  CalendarImportOptions
} from './importers';

// Share to board
export {
  ShareService,
  createShareService
} from './share';

export type {
  EmailCardShape,
  DocumentCardShape,
  PhotoCardShape,
  EventCardShape,
  GoogleDataShape
} from './share';

// R2 Backup
export {
  R2BackupService,
  createBackupService
} from './backup';

export type {
  BackupMetadata,
  BackupProgress
} from './backup';

// Main service class that ties everything together
import { generateMasterKey, importMasterKey, exportMasterKey } from './encryption';
import { openDatabase, checkStorageQuota, touchLocalData, hasSafariLimitations, requestPersistentStorage } from './database';
import { isGoogleAuthenticated, getGoogleUserInfo, initiateGoogleAuth, revokeGoogleAccess } from './oauth';
import { importGmail, importDrive, importPhotos, importCalendar } from './importers';
import type { GmailImportOptions, DriveImportOptions, PhotosImportOptions, CalendarImportOptions } from './importers';
import { createShareService, ShareService } from './share';
import { createBackupService, R2BackupService } from './backup';
import type { GoogleService, ImportProgress } from './types';

export class GoogleDataService {
  private masterKey: CryptoKey | null = null;
  private shareService: ShareService | null = null;
  private backupService: R2BackupService | null = null;
  private initialized = false;

  // Initialize the service with an existing master key or generate new one
  async initialize(existingKeyData?: ArrayBuffer): Promise<boolean> {
    try {
      // Open database
      await openDatabase();

      // Set up master key
      if (existingKeyData) {
        this.masterKey = await importMasterKey(existingKeyData);
      } else {
        this.masterKey = await generateMasterKey();
      }

      // Request persistent storage (especially important for Safari)
      if (hasSafariLimitations()) {
        console.warn('Safari detected: Data may be evicted after 7 days of non-use');
        await requestPersistentStorage();
        // Schedule periodic touch to prevent eviction
        this.scheduleTouchInterval();
      }

      // Initialize sub-services
      this.shareService = createShareService(this.masterKey);
      this.backupService = createBackupService(this.masterKey);

      this.initialized = true;
      return true;

    } catch (error) {
      console.error('Failed to initialize GoogleDataService:', error);
      return false;
    }
  }

  // Check if initialized
  isInitialized(): boolean {
    return this.initialized && this.masterKey !== null;
  }

  // Export master key for backup
  async exportKey(): Promise<ArrayBuffer | null> {
    if (!this.masterKey) return null;
    return await exportMasterKey(this.masterKey);
  }

  // Check Google authentication status
  async isAuthenticated(): Promise<boolean> {
    return await isGoogleAuthenticated();
  }

  // Get Google user info
  async getUserInfo(): Promise<{ email: string; name: string; picture: string } | null> {
    if (!this.masterKey) return null;
    return await getGoogleUserInfo(this.masterKey);
  }

  // Start Google OAuth flow
  async authenticate(services: GoogleService[]): Promise<void> {
    await initiateGoogleAuth(services);
  }

  // Revoke Google access
  async signOut(): Promise<boolean> {
    if (!this.masterKey) return false;
    return await revokeGoogleAccess(this.masterKey);
  }

  // Import data from Google services
  async importData(
    service: GoogleService,
    options: {
      gmail?: GmailImportOptions;
      drive?: DriveImportOptions;
      photos?: PhotosImportOptions;
      calendar?: CalendarImportOptions;
    } = {}
  ): Promise<ImportProgress> {
    if (!this.masterKey) {
      return {
        service,
        total: 0,
        imported: 0,
        status: 'error',
        errorMessage: 'Service not initialized'
      };
    }

    switch (service) {
      case 'gmail':
        return await importGmail(this.masterKey, options.gmail || {});
      case 'drive':
        return await importDrive(this.masterKey, options.drive || {});
      case 'photos':
        return await importPhotos(this.masterKey, options.photos || {});
      case 'calendar':
        return await importCalendar(this.masterKey, options.calendar || {});
      default:
        return {
          service,
          total: 0,
          imported: 0,
          status: 'error',
          errorMessage: 'Unknown service'
        };
    }
  }

  // Get share service for board integration
  getShareService(): ShareService | null {
    return this.shareService;
  }

  // Get backup service for R2 operations
  getBackupService(): R2BackupService | null {
    return this.backupService;
  }

  // Get storage quota info
  async getStorageInfo(): Promise<{
    used: number;
    quota: number;
    isPersistent: boolean;
    byService: { gmail: number; drive: number; photos: number; calendar: number };
  }> {
    return await checkStorageQuota();
  }

  // Schedule periodic touch for Safari
  private scheduleTouchInterval(): void {
    // Touch data every 6 hours to prevent 7-day eviction
    const TOUCH_INTERVAL = 6 * 60 * 60 * 1000;

    setInterval(async () => {
      try {
        await touchLocalData();
        console.log('Touched local data to prevent Safari eviction');
      } catch (error) {
        console.warn('Failed to touch local data:', error);
      }
    }, TOUCH_INTERVAL);
  }
}

// Singleton instance
let serviceInstance: GoogleDataService | null = null;

export function getGoogleDataService(): GoogleDataService {
  if (!serviceInstance) {
    serviceInstance = new GoogleDataService();
  }
  return serviceInstance;
}

export function resetGoogleDataService(): void {
  serviceInstance = null;
}
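
An end-to-end sketch of the service facade above (illustrative; the import path is a placeholder, and how the exported key is persisted is an assumption):

```ts
import { getGoogleDataService } from './google'; // path is a placeholder

async function bootstrap(storedKey?: ArrayBuffer) {
  const service = getGoogleDataService();

  // Reuse a stored key if available; otherwise a fresh one is generated.
  if (!await service.initialize(storedKey)) {
    throw new Error('GoogleDataService failed to initialize');
  }

  // Persist the exported key somewhere safe: without it,
  // locally stored data cannot be decrypted again.
  const keyBytes = await service.exportKey();
  void keyBytes; // e.g. wrap with encryptMasterKeyWithPassword and store

  if (!await service.isAuthenticated()) {
    await service.authenticate(['gmail', 'drive']); // redirects to Google
    return;
  }

  const progress = await service.importData('gmail', { gmail: { maxMessages: 500 } });
  console.log(progress.status, progress.imported);
}
```
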
@ -0,0 +1,382 @@
|
|||
// Google OAuth 2.0 with PKCE flow
|
||||
// All tokens are encrypted before storage
|
||||
|
||||
import { GOOGLE_SCOPES, type GoogleService } from './types';
|
||||
import {
|
||||
generateCodeVerifier,
|
||||
generateCodeChallenge,
|
||||
encryptData,
|
||||
decryptDataToString,
|
||||
deriveServiceKey
|
||||
} from './encryption';
|
||||
import { tokensStore } from './database';
|
||||
|
||||
// OAuth configuration
|
||||
const GOOGLE_AUTH_URL = 'https://accounts.google.com/o/oauth2/v2/auth';
|
||||
const GOOGLE_TOKEN_URL = 'https://oauth2.googleapis.com/token';
|
||||
|
||||
// Auth state stored in sessionStorage during OAuth flow
|
||||
interface GoogleAuthState {
|
||||
codeVerifier: string;
|
||||
redirectUri: string;
|
||||
state: string;
|
||||
requestedServices: GoogleService[];
|
||||
}
|
||||
|
||||
// Get the Google Client ID from environment
|
||||
function getGoogleClientId(): string {
|
||||
const clientId = import.meta.env.VITE_GOOGLE_CLIENT_ID;
|
||||
if (!clientId) {
|
||||
throw new Error('VITE_GOOGLE_CLIENT_ID environment variable is not set');
|
||||
}
|
||||
return clientId;
|
||||
}
|
||||
|
||||
// Get the Google Client Secret from environment
|
||||
function getGoogleClientSecret(): string {
|
||||
const clientSecret = import.meta.env.VITE_GOOGLE_CLIENT_SECRET;
|
||||
if (!clientSecret) {
|
||||
throw new Error('VITE_GOOGLE_CLIENT_SECRET environment variable is not set');
|
||||
}
|
||||
return clientSecret;
|
||||
}
|
||||
|
||||
// Build the OAuth redirect URI
|
||||
function getRedirectUri(): string {
|
||||
return `${window.location.origin}/oauth/google/callback`;
|
||||
}
|
||||
|
||||
// Get requested scopes based on selected services
|
||||
function getRequestedScopes(services: GoogleService[]): string {
|
||||
const scopes: string[] = [GOOGLE_SCOPES.profile, GOOGLE_SCOPES.email];
|
||||
|
||||
for (const service of services) {
|
||||
const scope = GOOGLE_SCOPES[service];
|
||||
if (scope) {
|
||||
scopes.push(scope);
|
||||
}
|
||||
}
|
||||
|
||||
return scopes.join(' ');
|
||||
}
|
||||
|
||||
// Initiate the Google OAuth flow
|
||||
export async function initiateGoogleAuth(services: GoogleService[]): Promise<void> {
|
||||
if (services.length === 0) {
|
||||
throw new Error('At least one service must be selected');
|
||||
}
|
||||
|
||||
const codeVerifier = generateCodeVerifier();
|
||||
const codeChallenge = await generateCodeChallenge(codeVerifier);
|
||||
const state = crypto.randomUUID();
|
||||
const redirectUri = getRedirectUri();
|
||||
|
||||
// Store auth state for callback verification
|
||||
const authState: GoogleAuthState = {
|
||||
codeVerifier,
|
||||
redirectUri,
|
||||
state,
|
||||
requestedServices: services
|
||||
};
|
||||
sessionStorage.setItem('google_auth_state', JSON.stringify(authState));
|
||||
|
||||
// Build authorization URL
|
||||
const params = new URLSearchParams({
|
||||
client_id: getGoogleClientId(),
|
||||
redirect_uri: redirectUri,
|
||||
response_type: 'code',
|
||||
scope: getRequestedScopes(services),
|
||||
access_type: 'offline', // Get refresh token
|
||||
prompt: 'consent', // Always show consent to get refresh token
|
||||
code_challenge: codeChallenge,
|
||||
code_challenge_method: 'S256',
|
||||
state
|
||||
});
|
||||
|
||||
// Redirect to Google OAuth
|
||||
window.location.href = `${GOOGLE_AUTH_URL}?${params.toString()}`;
|
||||
}
|
||||
|
||||
// Handle the OAuth callback
|
||||
export async function handleGoogleCallback(
|
||||
code: string,
|
||||
state: string,
|
||||
masterKey: CryptoKey
|
||||
): Promise<{
|
||||
success: boolean;
|
||||
scopes: string[];
|
||||
error?: string;
|
||||
}> {
|
||||
// Retrieve and validate stored state
|
||||
const storedStateJson = sessionStorage.getItem('google_auth_state');
|
||||
if (!storedStateJson) {
|
||||
return { success: false, scopes: [], error: 'No auth state found' };
|
||||
}
|
||||
|
||||
const storedState: GoogleAuthState = JSON.parse(storedStateJson);
|
||||
|
||||
// Verify state matches
|
||||
if (storedState.state !== state) {
|
||||
return { success: false, scopes: [], error: 'State mismatch - possible CSRF attack' };
|
||||
}
|
||||
|
||||
// Clean up session storage
|
||||
sessionStorage.removeItem('google_auth_state');
|
||||
|
||||
try {
|
||||
// Exchange code for tokens
|
||||
const tokenResponse = await fetch(GOOGLE_TOKEN_URL, {
|
||||
method: 'POST',
|
||||
headers: {
|
||||
'Content-Type': 'application/x-www-form-urlencoded'
|
||||
},
|
||||
body: new URLSearchParams({
|
||||
client_id: getGoogleClientId(),
|
||||
client_secret: getGoogleClientSecret(),
|
||||
code,
|
||||
code_verifier: storedState.codeVerifier,
|
||||
grant_type: 'authorization_code',
|
||||
redirect_uri: storedState.redirectUri
|
||||
})
|
||||
});
|
||||
|
||||
if (!tokenResponse.ok) {
|
||||
const error = await tokenResponse.json() as { error_description?: string };
|
||||
return {
|
||||
success: false,
|
||||
scopes: [],
|
||||
error: error.error_description || 'Token exchange failed'
|
||||
};
|
||||
}
|
||||
|
||||
const tokens = await tokenResponse.json() as {
|
||||
access_token: string;
|
||||
refresh_token?: string;
|
||||
expires_in: number;
|
||||
scope?: string;
|
||||
};
|
||||
|
||||
// Encrypt and store tokens
|
||||
await storeEncryptedTokens(tokens, masterKey);
|
||||
|
||||
// Parse scopes from response
|
||||
const grantedScopes = (tokens.scope || '').split(' ');
|
||||
|
||||
return {
|
||||
success: true,
|
||||
scopes: grantedScopes
|
||||
};
|
||||
|
||||
} catch (error) {
|
||||
console.error('OAuth callback error:', error);
|
||||
return {
|
||||
success: false,
|
||||
scopes: [],
|
||||
error: error instanceof Error ? error.message : 'Unknown error'
|
||||
};
|
||||
}
|
||||
}
|
||||
|
||||
// Store encrypted tokens
|
||||
async function storeEncryptedTokens(
|
||||
tokens: {
|
||||
access_token: string;
|
||||
refresh_token?: string;
|
||||
expires_in: number;
|
||||
scope?: string;
|
||||
},
|
||||
masterKey: CryptoKey
|
||||
): Promise<void> {
|
||||
const tokenKey = await deriveServiceKey(masterKey, 'tokens');
|
||||
|
||||
const encryptedAccessToken = await encryptData(tokens.access_token, tokenKey);
|
||||
|
||||
let encryptedRefreshToken = null;
|
||||
if (tokens.refresh_token) {
|
||||
encryptedRefreshToken = await encryptData(tokens.refresh_token, tokenKey);
|
||||
}
|
||||
|
||||
await tokensStore.put({
|
||||
encryptedAccessToken,
|
||||
encryptedRefreshToken,
|
||||
expiresAt: Date.now() + tokens.expires_in * 1000,
|
||||
scopes: (tokens.scope || '').split(' ')
|
||||
});
|
||||
}
|
||||
|
||||
// Get decrypted access token (refreshing if needed)
|
||||
export async function getAccessToken(masterKey: CryptoKey): Promise<string | null> {
|
||||
const tokens = await tokensStore.get();
|
||||
if (!tokens) {
|
||||
return null;
|
||||
}
|
||||
|
||||
const tokenKey = await deriveServiceKey(masterKey, 'tokens');
|
||||
|
||||
// Check if token is expired
|
||||
if (await tokensStore.isExpired()) {
|
||||
// Try to refresh
|
||||
if (tokens.encryptedRefreshToken) {
|
||||
const refreshed = await refreshAccessToken(
|
||||
tokens.encryptedRefreshToken,
|
||||
tokenKey,
|
||||
masterKey
|
||||
);
|
||||
if (refreshed) {
|
||||
return refreshed;
|
||||
}
|
||||
}
|
||||
return null; // Token expired and can't refresh
|
||||
}
|
||||
|
||||
// Decrypt and return access token
|
||||
return await decryptDataToString(tokens.encryptedAccessToken, tokenKey);
|
||||
}
|
||||
|
||||
// Refresh access token using refresh token
|
||||
async function refreshAccessToken(
|
||||
encryptedRefreshToken: { encrypted: ArrayBuffer; iv: Uint8Array },
|
||||
tokenKey: CryptoKey,
|
||||
masterKey: CryptoKey
|
||||
): Promise<string | null> {
|
||||
try {
|
||||
const refreshToken = await decryptDataToString(encryptedRefreshToken, tokenKey);
|
||||
|
||||
const response = await fetch(GOOGLE_TOKEN_URL, {
|
||||
method: 'POST',
|
||||
headers: {
|
||||
'Content-Type': 'application/x-www-form-urlencoded'
|
||||
},
|
||||
body: new URLSearchParams({
|
||||
client_id: getGoogleClientId(),
|
||||
client_secret: getGoogleClientSecret(),
|
||||
refresh_token: refreshToken,
|
||||
grant_type: 'refresh_token'
|
||||
})
|
||||
});
|
||||
|
||||
if (!response.ok) {
|
||||
console.error('Token refresh failed:', await response.text());
|
||||
return null;
|
||||
}
|
||||
|
||||
const tokens = await response.json() as {
|
||||
access_token: string;
|
||||
expires_in: number;
|
||||
};
|
||||
|
||||
// Store new tokens (refresh token may not be returned on refresh)
|
||||
const newTokenKey = await deriveServiceKey(masterKey, 'tokens');
|
||||
const encryptedAccessToken = await encryptData(tokens.access_token, newTokenKey);
|
||||
|
||||
const existingTokens = await tokensStore.get();
|
||||
await tokensStore.put({
|
||||
encryptedAccessToken,
|
||||
encryptedRefreshToken: existingTokens?.encryptedRefreshToken || null,
|
||||
expiresAt: Date.now() + tokens.expires_in * 1000,
|
||||
scopes: existingTokens?.scopes || []
|
||||
});
|
||||
|
||||
return tokens.access_token;
|
||||
|
||||
} catch (error) {
|
||||
console.error('Token refresh error:', error);
|
||||
return null;
|
||||
}
|
||||
}
|
||||
|
||||
// Check if user is authenticated with Google
|
||||
export async function isGoogleAuthenticated(): Promise<boolean> {
|
||||
const tokens = await tokensStore.get();
|
||||
return tokens !== null;
|
||||
}
|
||||
|
||||
// Get granted scopes
|
||||
export async function getGrantedScopes(): Promise<string[]> {
|
||||
const tokens = await tokensStore.get();
|
||||
return tokens?.scopes || [];
|
||||
}
|
||||
|
||||
// Check if a specific service is authorized
|
||||
export async function isServiceAuthorized(service: GoogleService): Promise<boolean> {
|
||||
const scopes = await getGrantedScopes();
|
||||
return scopes.includes(GOOGLE_SCOPES[service]);
|
||||
}
// Revoke Google access
export async function revokeGoogleAccess(masterKey: CryptoKey): Promise<boolean> {
  try {
    const accessToken = await getAccessToken(masterKey);

    if (accessToken) {
      // Revoke token with Google
      await fetch(`https://oauth2.googleapis.com/revoke?token=${accessToken}`, {
        method: 'POST'
      });
    }

    // Clear stored tokens
    await tokensStore.delete();

    return true;
  } catch (error) {
    console.error('Revoke error:', error);
    // Still delete local tokens even if revocation fails
    await tokensStore.delete();
    return false;
  }
}

// Get user info from Google
export async function getGoogleUserInfo(masterKey: CryptoKey): Promise<{
  email: string;
  name: string;
  picture: string;
} | null> {
  const accessToken = await getAccessToken(masterKey);
  if (!accessToken) {
    return null;
  }

  try {
    const response = await fetch('https://www.googleapis.com/oauth2/v2/userinfo', {
      headers: {
        Authorization: `Bearer ${accessToken}`
      }
    });

    if (!response.ok) {
      return null;
    }

    const userInfo = await response.json() as {
      email: string;
      name: string;
      picture: string;
    };
    return {
      email: userInfo.email,
      name: userInfo.name,
      picture: userInfo.picture
    };
  } catch (error) {
    console.error('Get user info error:', error);
    return null;
  }
}

// Parse callback URL parameters
export function parseCallbackParams(url: string): {
  code?: string;
  state?: string;
  error?: string;
  error_description?: string;
} {
  const urlObj = new URL(url);
  return {
    code: urlObj.searchParams.get('code') || undefined,
    state: urlObj.searchParams.get('state') || undefined,
    error: urlObj.searchParams.get('error') || undefined,
    error_description: urlObj.searchParams.get('error_description') || undefined
  };
}
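A hedged sketch of tying `parseCallbackParams` into the OAuth redirect flow. `exchangeCodeForTokens` stands in for this module's code-exchange step (not shown in this diff), and the `state` comparison guards against CSRF:

```typescript
// Hypothetical redirect handler. exchangeCodeForTokens is assumed to exist
// elsewhere in this module; expectedState is the value sent with the auth request.
async function handleOAuthCallback(url: string, expectedState: string): Promise<boolean> {
  const { code, state, error, error_description } = parseCallbackParams(url);
  if (error) {
    console.error('OAuth error:', error, error_description);
    return false;
  }
  if (!code || state !== expectedState) {
    console.error('OAuth callback missing code or state mismatch');
    return false;
  }
  await exchangeCodeForTokens(code); // assumed code-exchange step
  return true;
}
```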
@ -0,0 +1,555 @@
// Share encrypted data to the canvas board
// Decrypts items and creates tldraw shapes

import type {
  EncryptedEmailStore,
  EncryptedDriveDocument,
  EncryptedPhotoReference,
  EncryptedCalendarEvent,
  ShareableItem,
  GoogleService
} from './types';
import {
  decryptDataToString,
  deriveServiceKey
} from './encryption';
import {
  gmailStore,
  driveStore,
  photosStore,
  calendarStore
} from './database';
import type { TLShapeId } from 'tldraw';
import { createShapeId } from 'tldraw';

// Shape types for canvas
export interface EmailCardShape {
  id: TLShapeId;
  type: 'email-card';
  x: number;
  y: number;
  props: {
    subject: string;
    from: string;
    date: number;
    snippet: string;
    messageId: string;
    hasAttachments: boolean;
  };
}

export interface DocumentCardShape {
  id: TLShapeId;
  type: 'document-card';
  x: number;
  y: number;
  props: {
    name: string;
    mimeType: string;
    content: string | null;
    documentId: string;
    size: number;
    modifiedTime: number;
  };
}

export interface PhotoCardShape {
  id: TLShapeId;
  type: 'photo-card';
  x: number;
  y: number;
  props: {
    filename: string;
    description: string | null;
    thumbnailDataUrl: string | null;
    mediaItemId: string;
    mediaType: 'image' | 'video';
    width: number;
    height: number;
    creationTime: number;
  };
}

export interface EventCardShape {
  id: TLShapeId;
  type: 'event-card';
  x: number;
  y: number;
  props: {
    summary: string;
    description: string | null;
    location: string | null;
    startTime: number;
    endTime: number;
    isAllDay: boolean;
    eventId: string;
    calendarId: string;
    meetingLink: string | null;
  };
}

export type GoogleDataShape =
  | EmailCardShape
  | DocumentCardShape
  | PhotoCardShape
  | EventCardShape;
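Because `GoogleDataShape` is a discriminated union on `type`, downstream code can switch exhaustively and let the compiler flag any future card types. A small sketch (the actual rendering would live in custom tldraw `ShapeUtil`s, which this diff does not show):

```typescript
// Sketch: exhaustive dispatch over the shape union. The never-typed default
// branch makes the compiler error if a new card type is added but unhandled.
function describeShape(shape: GoogleDataShape): string {
  switch (shape.type) {
    case 'email-card':
      return `Email: ${shape.props.subject}`;
    case 'document-card':
      return `Document: ${shape.props.name}`;
    case 'photo-card':
      return `Photo: ${shape.props.filename}`;
    case 'event-card':
      return `Event: ${shape.props.summary}`;
    default: {
      const _exhaustive: never = shape;
      return _exhaustive;
    }
  }
}
```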
// Service to manage sharing to board
export class ShareService {
  private serviceKeys: Map<GoogleService, CryptoKey> = new Map();

  constructor(private masterKey: CryptoKey) {}

  // Initialize service keys for decryption
  private async getServiceKey(service: GoogleService): Promise<CryptoKey> {
    let key = this.serviceKeys.get(service);
    if (!key) {
      key = await deriveServiceKey(this.masterKey, service);
      this.serviceKeys.set(service, key);
    }
    return key;
  }

  // List items available for sharing (with decrypted previews)
  async listShareableItems(
    service: GoogleService,
    limit: number = 50
  ): Promise<ShareableItem[]> {
    const key = await this.getServiceKey(service);

    switch (service) {
      case 'gmail':
        return this.listShareableEmails(key, limit);
      case 'drive':
        return this.listShareableDocuments(key, limit);
      case 'photos':
        return this.listShareablePhotos(key, limit);
      case 'calendar':
        return this.listShareableEvents(key, limit);
      default:
        return [];
    }
  }

  // List shareable emails
  private async listShareableEmails(
    key: CryptoKey,
    limit: number
  ): Promise<ShareableItem[]> {
    const emails = await gmailStore.getAll();
    const items: ShareableItem[] = [];

    for (const email of emails.slice(0, limit)) {
      try {
        const subject = await decryptDataToString(email.encryptedSubject, key);
        const snippet = await decryptDataToString(email.encryptedSnippet, key);

        items.push({
          type: 'email',
          id: email.id,
          title: subject || '(No Subject)',
          preview: snippet,
          date: email.date
        });
      } catch (error) {
        console.warn(`Failed to decrypt email ${email.id}:`, error);
      }
    }

    return items.sort((a, b) => b.date - a.date);
  }

  // List shareable documents
  private async listShareableDocuments(
    key: CryptoKey,
    limit: number
  ): Promise<ShareableItem[]> {
    const docs = await driveStore.getRecent(limit);
    const items: ShareableItem[] = [];

    for (const doc of docs) {
      try {
        const name = await decryptDataToString(doc.encryptedName, key);

        items.push({
          type: 'document',
          id: doc.id,
          title: name || 'Untitled',
          date: doc.modifiedTime
        });
      } catch (error) {
        console.warn(`Failed to decrypt document ${doc.id}:`, error);
      }
    }

    return items;
  }

  // List shareable photos
  private async listShareablePhotos(
    key: CryptoKey,
    limit: number
  ): Promise<ShareableItem[]> {
    const photos = await photosStore.getAll();
    const items: ShareableItem[] = [];

    for (const photo of photos.slice(0, limit)) {
      try {
        const filename = await decryptDataToString(photo.encryptedFilename, key);

        items.push({
          type: 'photo',
          id: photo.id,
          title: filename || 'Untitled',
          date: photo.creationTime
        });
      } catch (error) {
        console.warn(`Failed to decrypt photo ${photo.id}:`, error);
      }
    }

    return items.sort((a, b) => b.date - a.date);
  }

  // List shareable events
  private async listShareableEvents(
    key: CryptoKey,
    limit: number
  ): Promise<ShareableItem[]> {
    // Get all events, not just upcoming
    const events = await calendarStore.getAll();
    const items: ShareableItem[] = [];

    for (const event of events.slice(0, limit)) {
      try {
        const summary = await decryptDataToString(event.encryptedSummary, key);

        items.push({
          type: 'event',
          id: event.id,
          title: summary || 'Untitled Event',
          date: event.startTime
        });
      } catch (error) {
        console.warn(`Failed to decrypt event ${event.id}:`, error);
      }
    }

    return items;
  }

  // Create a shape from an item for the board
  async createShapeFromItem(
    itemId: string,
    itemType: ShareableItem['type'],
    position: { x: number; y: number }
  ): Promise<GoogleDataShape | null> {
    switch (itemType) {
      case 'email':
        return this.createEmailShape(itemId, position);
      case 'document':
        return this.createDocumentShape(itemId, position);
      case 'photo':
        return this.createPhotoShape(itemId, position);
      case 'event':
        return this.createEventShape(itemId, position);
      default:
        return null;
    }
  }

  // Create email shape
  private async createEmailShape(
    emailId: string,
    position: { x: number; y: number }
  ): Promise<EmailCardShape | null> {
    const email = await gmailStore.get(emailId);
    if (!email) return null;

    const key = await this.getServiceKey('gmail');

    try {
      const subject = await decryptDataToString(email.encryptedSubject, key);
      const from = await decryptDataToString(email.encryptedFrom, key);
      const snippet = await decryptDataToString(email.encryptedSnippet, key);

      return {
        id: createShapeId(),
        type: 'email-card',
        x: position.x,
        y: position.y,
        props: {
          subject: subject || '(No Subject)',
          from,
          date: email.date,
          snippet,
          messageId: email.id,
          hasAttachments: email.hasAttachments
        }
      };
    } catch (error) {
      console.error('Failed to create email shape:', error);
      return null;
    }
  }

  // Create document shape
  private async createDocumentShape(
    docId: string,
    position: { x: number; y: number }
  ): Promise<DocumentCardShape | null> {
    const doc = await driveStore.get(docId);
    if (!doc) return null;

    const key = await this.getServiceKey('drive');

    try {
      const name = await decryptDataToString(doc.encryptedName, key);
      const mimeType = await decryptDataToString(doc.encryptedMimeType, key);
      const content = doc.encryptedContent
        ? await decryptDataToString(doc.encryptedContent, key)
        : null;

      return {
        id: createShapeId(),
        type: 'document-card',
        x: position.x,
        y: position.y,
        props: {
          name: name || 'Untitled',
          mimeType,
          content,
          documentId: doc.id,
          size: doc.size,
          modifiedTime: doc.modifiedTime
        }
      };
    } catch (error) {
      console.error('Failed to create document shape:', error);
      return null;
    }
  }

  // Create photo shape
  private async createPhotoShape(
    photoId: string,
    position: { x: number; y: number }
  ): Promise<PhotoCardShape | null> {
    const photo = await photosStore.get(photoId);
    if (!photo) return null;

    const key = await this.getServiceKey('photos');

    try {
      const filename = await decryptDataToString(photo.encryptedFilename, key);
      const description = photo.encryptedDescription
        ? await decryptDataToString(photo.encryptedDescription, key)
        : null;

      // Convert thumbnail to a data URL if available.
      // TODO: decrypt photo.thumbnail.encryptedData and base64-encode it.
      // This is simplified for now; a real implementation needs proper blob handling.
      let thumbnailDataUrl: string | null = null;

      return {
        id: createShapeId(),
        type: 'photo-card',
        x: position.x,
        y: position.y,
        props: {
          filename: filename || 'Untitled',
          description,
          thumbnailDataUrl,
          mediaItemId: photo.id,
          mediaType: photo.mediaType,
          width: photo.fullResolution.width,
          height: photo.fullResolution.height,
          creationTime: photo.creationTime
        }
      };
    } catch (error) {
      console.error('Failed to create photo shape:', error);
      return null;
    }
  }

  // Create event shape
  private async createEventShape(
    eventId: string,
    position: { x: number; y: number }
  ): Promise<EventCardShape | null> {
    const event = await calendarStore.get(eventId);
    if (!event) return null;

    const key = await this.getServiceKey('calendar');

    try {
      const summary = await decryptDataToString(event.encryptedSummary, key);
      const description = event.encryptedDescription
        ? await decryptDataToString(event.encryptedDescription, key)
        : null;
      const location = event.encryptedLocation
        ? await decryptDataToString(event.encryptedLocation, key)
        : null;
      const meetingLink = event.encryptedMeetingLink
        ? await decryptDataToString(event.encryptedMeetingLink, key)
        : null;

      return {
        id: createShapeId(),
        type: 'event-card',
        x: position.x,
        y: position.y,
        props: {
          summary: summary || 'Untitled Event',
          description,
          location,
          startTime: event.startTime,
          endTime: event.endTime,
          isAllDay: event.isAllDay,
          eventId: event.id,
          calendarId: event.calendarId,
          meetingLink
        }
      };
    } catch (error) {
      console.error('Failed to create event shape:', error);
      return null;
    }
  }

  // Mark an item as shared (no longer local-only)
  async markAsShared(itemId: string, itemType: ShareableItem['type']): Promise<void> {
    switch (itemType) {
      case 'email': {
        const email = await gmailStore.get(itemId);
        if (email) {
          email.localOnly = false;
          await gmailStore.put(email);
        }
        break;
      }
      // Drive, Photos, and Calendar don't have a localOnly flag in the current
      // schema; one would need to be added if share tracking is required for them.
    }
  }

  // Get full decrypted content for an item
  async getFullContent(
    itemId: string,
    itemType: ShareableItem['type']
  ): Promise<Record<string, unknown> | null> {
    switch (itemType) {
      case 'email':
        return this.getFullEmailContent(itemId);
      case 'document':
        return this.getFullDocumentContent(itemId);
      case 'event':
        return this.getFullEventContent(itemId);
      default:
        return null;
    }
  }

  // Get full email content
  private async getFullEmailContent(
    emailId: string
  ): Promise<Record<string, unknown> | null> {
    const email = await gmailStore.get(emailId);
    if (!email) return null;

    const key = await this.getServiceKey('gmail');

    try {
      return {
        id: email.id,
        threadId: email.threadId,
        subject: await decryptDataToString(email.encryptedSubject, key),
        body: await decryptDataToString(email.encryptedBody, key),
        from: await decryptDataToString(email.encryptedFrom, key),
        to: await decryptDataToString(email.encryptedTo, key),
        date: email.date,
        labels: email.labels,
        hasAttachments: email.hasAttachments
      };
    } catch (error) {
      console.error('Failed to get full email content:', error);
      return null;
    }
  }

  // Get full document content
  private async getFullDocumentContent(
    docId: string
  ): Promise<Record<string, unknown> | null> {
    const doc = await driveStore.get(docId);
    if (!doc) return null;

    const key = await this.getServiceKey('drive');

    try {
      return {
        id: doc.id,
        name: await decryptDataToString(doc.encryptedName, key),
        mimeType: await decryptDataToString(doc.encryptedMimeType, key),
        content: doc.encryptedContent
          ? await decryptDataToString(doc.encryptedContent, key)
          : null,
        size: doc.size,
        modifiedTime: doc.modifiedTime,
        isShared: doc.isShared
      };
    } catch (error) {
      console.error('Failed to get full document content:', error);
      return null;
    }
  }

  // Get full event content
  private async getFullEventContent(
    eventId: string
  ): Promise<Record<string, unknown> | null> {
    const event = await calendarStore.get(eventId);
    if (!event) return null;

    const key = await this.getServiceKey('calendar');

    try {
      return {
        id: event.id,
        calendarId: event.calendarId,
        summary: await decryptDataToString(event.encryptedSummary, key),
        description: event.encryptedDescription
          ? await decryptDataToString(event.encryptedDescription, key)
          : null,
        location: event.encryptedLocation
          ? await decryptDataToString(event.encryptedLocation, key)
          : null,
        startTime: event.startTime,
        endTime: event.endTime,
        isAllDay: event.isAllDay,
        timezone: event.timezone,
        isRecurring: event.isRecurring,
        attendees: event.encryptedAttendees
          ? JSON.parse(await decryptDataToString(event.encryptedAttendees, key))
          : [],
        reminders: event.reminders,
        meetingLink: event.encryptedMeetingLink
          ? await decryptDataToString(event.encryptedMeetingLink, key)
          : null
      };
    } catch (error) {
      console.error('Failed to get full event content:', error);
      return null;
    }
  }
}

// Convenience function
export function createShareService(masterKey: CryptoKey): ShareService {
  return new ShareService(masterKey);
}
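A sketch of the intended end-to-end flow: create a `ShareService`, materialize a shape at the viewport center, and mark the item shared. This assumes the custom card shape types are registered with tldraw separately; `getViewportPageBounds` and `createShape` are standard tldraw `Editor` methods, but the cast is needed because these shapes are not built-ins:

```typescript
import type { Editor } from 'tldraw';
import { createShareService } from './share';
import type { ShareableItem } from './types';

// Hypothetical flow: decrypt one stored item and drop it onto the board.
async function shareToBoard(
  masterKey: CryptoKey,
  editor: Editor,
  itemId: string,
  itemType: ShareableItem['type']
): Promise<void> {
  const service = createShareService(masterKey);
  const { x, y } = editor.getViewportPageBounds().center;
  const shape = await service.createShapeFromItem(itemId, itemType, { x, y });
  if (shape) {
    // Cast because 'email-card' etc. are custom shape types, not tldraw built-ins.
    editor.createShape(shape as any);
    await service.markAsShared(itemId, itemType);
  }
}
```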
@ -0,0 +1,165 @@
// Type definitions for Google Data Sovereignty module
// All data is encrypted client-side before storage

// Base interface for encrypted data
export interface EncryptedData {
  encrypted: ArrayBuffer;
  iv: Uint8Array;
}
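`EncryptedData` pairs a ciphertext with its IV. The real helpers live in `encryption.ts`; as a sketch of the contract, assuming AES-GCM via WebCrypto (the names here are illustrative, not the module's actual API):

```typescript
// Sketch: AES-GCM round trip producing/consuming EncryptedData.
// The real encryptData/decryptDataToString live in encryption.ts;
// this only illustrates the shape contract.
async function encryptString(plaintext: string, key: CryptoKey): Promise<EncryptedData> {
  const iv = crypto.getRandomValues(new Uint8Array(12)); // 96-bit IV, AES-GCM convention
  const encrypted = await crypto.subtle.encrypt(
    { name: 'AES-GCM', iv },
    key,
    new TextEncoder().encode(plaintext)
  );
  return { encrypted, iv };
}

async function decryptString(data: EncryptedData, key: CryptoKey): Promise<string> {
  const plaintext = await crypto.subtle.decrypt(
    { name: 'AES-GCM', iv: data.iv },
    key,
    data.encrypted
  );
  return new TextDecoder().decode(plaintext);
}
```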

// Encrypted Email Storage
export interface EncryptedEmailStore {
  id: string; // Gmail message ID
  threadId: string; // Thread ID for grouping
  encryptedSubject: EncryptedData;
  encryptedBody: EncryptedData;
  encryptedFrom: EncryptedData;
  encryptedTo: EncryptedData;
  date: number; // Timestamp (unencrypted for sorting)
  labels: string[]; // Gmail labels
  hasAttachments: boolean;
  encryptedSnippet: EncryptedData;
  syncedAt: number;
  localOnly: boolean; // Not yet shared to board
}

// Encrypted Drive Document Storage
export interface EncryptedDriveDocument {
  id: string; // Drive file ID
  encryptedName: EncryptedData;
  encryptedMimeType: EncryptedData;
  encryptedContent: EncryptedData | null; // For text-based docs
  encryptedPreview: EncryptedData | null; // Thumbnail or preview
  contentStrategy: 'inline' | 'reference' | 'chunked';
  chunks?: string[]; // IDs of content chunks if chunked
  parentId: string | null;
  encryptedPath: EncryptedData;
  isShared: boolean;
  modifiedTime: number;
  size: number; // Unencrypted for quota management
  syncedAt: number;
}

// Encrypted Photo Reference Storage
export interface EncryptedPhotoReference {
  id: string; // Photos media item ID
  encryptedFilename: EncryptedData;
  encryptedDescription: EncryptedData | null;
  thumbnail: {
    width: number;
    height: number;
    encryptedData: EncryptedData; // Base64 or blob
  } | null;
  fullResolution: {
    width: number;
    height: number;
  };
  mediaType: 'image' | 'video';
  creationTime: number;
  albumIds: string[];
  encryptedLocation: EncryptedData | null; // Location data (highly sensitive)
  syncedAt: number;
}

// Encrypted Calendar Event Storage
export interface EncryptedCalendarEvent {
  id: string; // Calendar event ID
  calendarId: string;
  encryptedSummary: EncryptedData;
  encryptedDescription: EncryptedData | null;
  encryptedLocation: EncryptedData | null;
  startTime: number; // Unencrypted for query/sort
  endTime: number;
  isAllDay: boolean;
  timezone: string;
  isRecurring: boolean;
  encryptedRecurrence: EncryptedData | null;
  encryptedAttendees: EncryptedData | null;
  reminders: { method: string; minutes: number }[];
  encryptedMeetingLink: EncryptedData | null;
  syncedAt: number;
}

// Sync Metadata
export interface SyncMetadata {
  service: 'gmail' | 'drive' | 'photos' | 'calendar';
  lastSyncToken?: string;
  lastSyncTime: number;
  itemCount: number;
  status: 'idle' | 'syncing' | 'error';
  errorMessage?: string;
  progressCurrent?: number;
  progressTotal?: number;
}

// Encryption Metadata
export interface EncryptionMetadata {
  purpose: 'gmail' | 'drive' | 'photos' | 'calendar' | 'google_tokens' | 'master';
  salt: Uint8Array;
  createdAt: number;
}

// OAuth Token Storage (encrypted)
export interface EncryptedTokens {
  encryptedAccessToken: EncryptedData;
  encryptedRefreshToken: EncryptedData | null;
  expiresAt: number;
  scopes: string[];
}

// Import Progress
export interface ImportProgress {
  service: 'gmail' | 'drive' | 'photos' | 'calendar';
  total: number;
  imported: number;
  status: 'idle' | 'importing' | 'paused' | 'completed' | 'error';
  errorMessage?: string;
  startedAt?: number;
  completedAt?: number;
}

// Storage Quota Info
export interface StorageQuotaInfo {
  used: number;
  quota: number;
  isPersistent: boolean;
  byService: {
    gmail: number;
    drive: number;
    photos: number;
    calendar: number;
  };
}

// Share Item for Board
export interface ShareableItem {
  type: 'email' | 'document' | 'photo' | 'event';
  id: string;
  title: string; // Decrypted for display
  preview?: string; // Decrypted snippet/preview
  date: number;
}

// Google Service Types
export type GoogleService = 'gmail' | 'drive' | 'photos' | 'calendar';

// OAuth Scopes
export const GOOGLE_SCOPES = {
  gmail: 'https://www.googleapis.com/auth/gmail.readonly',
  drive: 'https://www.googleapis.com/auth/drive.readonly',
  photos: 'https://www.googleapis.com/auth/photoslibrary.readonly',
  calendar: 'https://www.googleapis.com/auth/calendar.readonly',
  profile: 'https://www.googleapis.com/auth/userinfo.profile',
  email: 'https://www.googleapis.com/auth/userinfo.email'
} as const;
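`GOOGLE_SCOPES` feeds the consent screen. A sketch of building the authorization URL from it; `clientId`, `redirectUri`, and the `state` handling are assumptions, while the endpoint and parameters follow Google's standard authorization-code flow:

```typescript
// Sketch: consent URL assembled from GOOGLE_SCOPES. Hypothetical caller
// supplies clientId/redirectUri/state; the endpoint is Google's standard one.
function buildAuthUrl(
  clientId: string,
  redirectUri: string,
  services: GoogleService[],
  state: string
): string {
  const scopes = [
    ...services.map((s) => GOOGLE_SCOPES[s]),
    GOOGLE_SCOPES.profile,
    GOOGLE_SCOPES.email
  ];
  const params = new URLSearchParams({
    client_id: clientId,
    redirect_uri: redirectUri,
    response_type: 'code',
    access_type: 'offline', // request a refresh token
    prompt: 'consent',
    scope: scopes.join(' '),
    state
  });
  return `https://accounts.google.com/o/oauth2/v2/auth?${params}`;
}
```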

// Database Store Names
export const DB_STORES = {
  gmail: 'gmail',
  drive: 'drive',
  photos: 'photos',
  calendar: 'calendar',
  syncMetadata: 'syncMetadata',
  encryptionMeta: 'encryptionMeta',
  tokens: 'tokens'
} as const;
@ -56,6 +56,7 @@ import { Collection, initializeGlobalCollections } from "@/collections"
import { GraphLayoutCollection } from "@/graph/GraphLayoutCollection"
import { GestureTool } from "@/GestureTool"
import { CmdK } from "@/CmdK"
import { OfflineIndicator } from "@/components/OfflineIndicator"

import "react-cmdk/dist/cmdk.css"
@ -203,13 +204,14 @@ export function Board() {

  // Use Automerge sync for all environments
  const storeWithHandle = useAutomergeSync(storeConfig)
  const store = {
    store: storeWithHandle.store,
    status: storeWithHandle.status,
    ...('connectionStatus' in storeWithHandle ? { connectionStatus: storeWithHandle.connectionStatus } : {}),
    error: storeWithHandle.error
  }
  const automergeHandle = storeWithHandle.handle
  const connectionStatus = storeWithHandle.connectionStatus
  const isOfflineReady = storeWithHandle.isOfflineReady
  const [editor, setEditor] = useState<Editor | null>(null)

  useEffect(() => {
@ -658,6 +660,7 @@ export function Board() {
      <div style={{ position: "fixed", inset: 0, display: "flex", alignItems: "center", justifyContent: "center" }}>
        <div>Loading canvas...</div>
      </div>
      <OfflineIndicator connectionStatus={connectionStatus} isOfflineReady={isOfflineReady} />
    </AutomergeHandleProvider>
  )
}
@ -740,6 +743,7 @@ export function Board() {
        >
          <CmdK />
        </Tldraw>
        <OfflineIndicator connectionStatus={connectionStatus} isOfflineReady={isOfflineReady} />
      </div>
    </AutomergeHandleProvider>
  )
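The `OfflineIndicator` component itself is not part of this diff; a minimal sketch of a component with the prop signature `Board` passes in (the `connectionStatus` union values and the copy are assumptions):

```tsx
// Sketch only: the real component lives at @/components/OfflineIndicator.
// Styling and copy are placeholders; the status union is an assumption.
export function OfflineIndicator({
  connectionStatus,
  isOfflineReady
}: {
  connectionStatus: 'connected' | 'connecting' | 'disconnected';
  isOfflineReady: boolean;
}) {
  if (connectionStatus === 'connected') return null;
  return (
    <div style={{ position: 'fixed', bottom: 12, right: 12 }}>
      {isOfflineReady ? 'Offline: changes are saved locally' : 'Reconnecting...'}
    </div>
  );
}
```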