docs: add data sovereignty architecture for Google imports and local file uploads
- Add GOOGLE_DATA_SOVEREIGNTY.md: comprehensive plan for secure local storage of Gmail, Drive, Photos, Calendar data with client-side encryption
- Add LOCAL_FILE_UPLOAD.md: multi-item upload tool with the same encryption model for local files (images, PDFs, documents, audio, video)
- Update OFFLINE_STORAGE_FEASIBILITY.md to reference new docs

Key features:

- IndexedDB encrypted storage with AES-256-GCM
- Keys derived from WebCrypto auth (never leave browser)
- Safari 7-day eviction mitigations
- Selective sharing to boards via Automerge
- Optional encrypted R2 backup

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
parent b502a08c62
commit f8790c9934
@@ -215,3 +215,22 @@ This is a **medium-complexity** feature that's very feasible. Automerge's archit
The biggest benefit is that Automerge's CRDT nature means you don't need to write complex merge logic - it handles conflict resolution automatically.

---

## Related: Google Data Sovereignty

Beyond canvas document storage, we also support importing and securely storing Google Workspace data locally. See **[docs/GOOGLE_DATA_SOVEREIGNTY.md](./docs/GOOGLE_DATA_SOVEREIGNTY.md)** for the complete architecture covering:

- **Gmail** - Import and encrypt emails locally
- **Drive** - Import and encrypt documents locally
- **Photos** - Import thumbnails with on-demand full resolution
- **Calendar** - Import and encrypt events locally

Key principles:

1. **Local-first**: All data stored in encrypted IndexedDB
2. **User-controlled encryption**: Keys derived from WebCrypto auth, never leave browser
3. **Selective sharing**: Choose what to share to canvas boards
4. **Optional R2 backup**: Encrypted cloud backup (you hold the keys)

This builds on the same IndexedDB + Automerge foundation described above.
@@ -0,0 +1,913 @@
# Google Data Sovereignty: Local-First Secure Storage

This document outlines the architecture for securely importing, storing, and optionally sharing Google Workspace data (Gmail, Drive, Photos, Calendar) using a **local-first, data sovereign** approach.

## Overview

**Philosophy**: Your data should be yours. Import it locally, encrypt it client-side, and choose when and what to share.
```
┌──────────────────────────────────────────────────────────────┐
│             USER'S BROWSER (Data Sovereign Zone)             │
├──────────────────────────────────────────────────────────────┤
│                                                              │
│  ┌─────────────┐     ┌──────────────────────────────────┐    │
│  │ Google APIs │────>│ Local Processing Layer           │    │
│  │ (OAuth 2.0) │     │  ├── Fetch data                  │    │
│  └─────────────┘     │  ├── Encrypt with user's         │    │
│                      │  │   WebCrypto keys              │    │
│                      │  └── Store to IndexedDB          │    │
│                      └────────────────┬─────────────────┘    │
│                                       │                      │
│  ┌────────────────────────────────────┴─────────────────┐    │
│  │ IndexedDB Encrypted Storage                          │    │
│  │  ├── gmail_messages (encrypted blobs)                │    │
│  │  ├── drive_documents (encrypted blobs)               │    │
│  │  ├── photos_media (encrypted references)             │    │
│  │  ├── calendar_events (encrypted data)                │    │
│  │  └── encryption_metadata (key derivation info)       │    │
│  └────────────────────┬─────────────────────────────────┘    │
│                       │                                      │
│  ┌────────────────────┴─────────────────────────────────┐    │
│  │ Share Decision Layer (User Controlled)               │    │
│  │  ├── Keep Private (local only)                       │    │
│  │  ├── Share to Board (Automerge sync)                 │    │
│  │  └── Backup to R2 (encrypted cloud backup)           │    │
│  └──────────────────────────────────────────────────────┘    │
│                                                              │
└──────────────────────────────────────────────────────────────┘
```
## Browser Storage Capabilities & Limitations

### IndexedDB Storage

| Browser | Default Quota | Max Quota | Persistence |
|---------|--------------|-----------|-------------|
| Chrome/Edge | 60% of disk | Unlimited* | Persistent with permission |
| Firefox | 10% of disk, up to 10GB | 50% of disk | Persistent with permission |
| Safari | ~1GB | ~1GB per origin | Non-persistent (7-day eviction) |

\*Chrome's "Unlimited" quota requires the `navigator.storage.persist()` permission

### Storage API Persistence
```typescript
// Request persistent storage (prevents automatic eviction)
async function requestPersistentStorage(): Promise<boolean> {
  if (navigator.storage && navigator.storage.persist) {
    const isPersisted = await navigator.storage.persist();
    console.log(`Persistent storage ${isPersisted ? 'granted' : 'denied'}`);
    return isPersisted;
  }
  return false;
}

// Check current storage usage against the quota
async function checkStorageQuota(): Promise<{ used: number; quota: number }> {
  if (navigator.storage && navigator.storage.estimate) {
    const estimate = await navigator.storage.estimate();
    return {
      used: estimate.usage || 0,
      quota: estimate.quota || 0
    };
  }
  return { used: 0, quota: 0 };
}
```
### Safari's 7-Day Eviction Rule

**CRITICAL for Safari users**: Safari evicts IndexedDB data after seven days of Safari use without the user interacting with the site.

**Mitigations**:

1. Use a Service Worker with periodic background sync to "touch" data
2. Prompt Safari users to add the app to their Home Screen (PWA mode bypasses some restrictions)
3. Automatically sync important data to R2 backup
4. Show clear warnings about Safari limitations
```typescript
// Detect Safari's storage limitations
function hasSafariLimitations(): boolean {
  const isSafari = /^((?!chrome|android).)*safari/i.test(navigator.userAgent);
  const isIOS = /iPad|iPhone|iPod/.test(navigator.userAgent);
  return isSafari || isIOS;
}

// Record access activity to help prevent eviction
// (assumes an openDatabase() helper that returns the app's IDBDatabase)
async function touchLocalData(): Promise<void> {
  const db = await openDatabase();
  const tx = db.transaction('metadata', 'readwrite');
  tx.objectStore('metadata').put({
    key: 'last_accessed',
    timestamp: Date.now()
  });
  // Wait for the transaction to commit before resolving
  await new Promise<void>((resolve, reject) => {
    tx.oncomplete = () => resolve();
    tx.onerror = () => reject(tx.error);
  });
}
```
## Data Types & Storage Strategies

### 1. Gmail Messages

```typescript
interface EncryptedEmailStore {
  id: string;                      // Gmail message ID
  threadId: string;                // Thread ID for grouping
  encryptedSubject: ArrayBuffer;   // AES-GCM encrypted
  encryptedBody: ArrayBuffer;      // AES-GCM encrypted
  encryptedFrom: ArrayBuffer;      // Sender info
  encryptedTo: ArrayBuffer[];      // Recipients
  fieldIvs: Record<string, Uint8Array>; // Per-field AES-GCM IVs - required for decryption
  date: number;                    // Timestamp (unencrypted for sorting)
  labels: string[];                // Gmail labels (encrypted or not based on sensitivity)
  hasAttachments: boolean;         // Flag only; attachments stored separately
  snippet: ArrayBuffer;            // Encrypted preview

  // Metadata for search (encrypted bloom filter or encrypted index)
  searchIndex: ArrayBuffer;

  // Sync metadata
  syncedAt: number;
  localOnly: boolean;              // Not yet synced to any external storage
}

// Storage estimate per email:
// - Average email: ~20KB raw → ~25KB encrypted
// - With attachments: varies, but only a reference is stored, not the full attachment
// - 10,000 emails ≈ 250MB
```
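These per-item estimates can be turned into a quick storage budget before an import and compared against `navigator.storage.estimate()`. A minimal sketch; the per-item sizes are the rough figures quoted in this document, not measurements:

```typescript
// Rough per-item sizes (KB), taken from the estimates in this document
const ITEM_KB = { email: 25, photoThumbnail: 50, calendarEvent: 5 } as const;

// Estimate total IndexedDB usage, in KB, for a planned import
function estimateImportKB(counts: {
  emails?: number;
  photoThumbnails?: number;
  calendarEvents?: number;
}): number {
  return (
    (counts.emails ?? 0) * ITEM_KB.email +
    (counts.photoThumbnails ?? 0) * ITEM_KB.photoThumbnail +
    (counts.calendarEvents ?? 0) * ITEM_KB.calendarEvent
  );
}
```

For example, 10,000 emails plus 1,000 thumbnails comes out to ~300,000KB (~300MB), in line with the per-section estimates below.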
### 2. Google Drive Documents

```typescript
interface EncryptedDriveDocument {
  id: string;                      // Drive file ID
  encryptedName: ArrayBuffer;
  encryptedMimeType: ArrayBuffer;
  encryptedContent: ArrayBuffer;   // For text-based docs
  encryptedPreview: ArrayBuffer;   // Thumbnail or preview

  // Large files: store a reference, not the content
  contentStrategy: 'inline' | 'reference' | 'chunked';
  chunks?: string[];               // IDs of content chunks if chunked

  // Hierarchy
  parentId: string | null;
  path: ArrayBuffer;               // Encrypted path string

  // Sharing & permissions (for UI display)
  isShared: boolean;

  modifiedTime: number;
  size: number;                    // Unencrypted for quota management

  syncedAt: number;
}

// Storage considerations:
// - Google Docs: Convert to markdown/HTML, typically 10-100KB
// - Spreadsheets: JSON export, 100KB-10MB depending on size
// - PDFs: Store reference only, load on demand
// - Images: Thumbnail locally, full resolution on demand
```
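The `contentStrategy` field implies a size-based decision plus a chunk plan for large inline content. One way this could be sketched; the 256KB/10MB thresholds and 1MB chunk size are illustrative assumptions, not values from this document:

```typescript
type ContentStrategy = 'inline' | 'reference' | 'chunked';

const INLINE_LIMIT = 256 * 1024;        // assumed: small docs stored whole
const CHUNKED_LIMIT = 10 * 1024 * 1024; // assumed: beyond this, store a reference only
const CHUNK_SIZE = 1024 * 1024;         // assumed: 1MB encrypted chunks

function chooseContentStrategy(sizeBytes: number): ContentStrategy {
  if (sizeBytes <= INLINE_LIMIT) return 'inline';
  if (sizeBytes <= CHUNKED_LIMIT) return 'chunked';
  return 'reference';
}

// Split a byte length into [start, end) ranges, one per chunk
function planChunks(sizeBytes: number, chunkSize = CHUNK_SIZE): Array<[number, number]> {
  const ranges: Array<[number, number]> = [];
  for (let start = 0; start < sizeBytes; start += chunkSize) {
    ranges.push([start, Math.min(start + chunkSize, sizeBytes)]);
  }
  return ranges;
}
```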
### 3. Google Photos

```typescript
interface EncryptedPhotoReference {
  id: string;                      // Photos media item ID
  encryptedFilename: ArrayBuffer;
  encryptedDescription: ArrayBuffer;

  // Thumbnails stored locally (encrypted)
  thumbnail: {
    width: number;
    height: number;
    encryptedData: ArrayBuffer;    // Base64 or blob
  };

  // Full resolution: reference only (fetched on demand)
  fullResolution: {
    width: number;
    height: number;
    // NOT storing the full image - too large
    // Fetch via the API when the user requests it
  };

  mediaType: 'image' | 'video';
  creationTime: number;

  // Album associations
  albumIds: string[];

  // Location data (highly sensitive - always encrypted)
  encryptedLocation?: ArrayBuffer;

  syncedAt: number;
}

// Storage strategy:
// - Thumbnails: ~50KB each, stored locally
// - Full images: NOT stored locally (too large)
// - 1,000 photo thumbnails ≈ 50MB
// - Full resolution loaded via the API on demand
```
### 4. Google Calendar Events

```typescript
interface EncryptedCalendarEvent {
  id: string;                      // Calendar event ID
  calendarId: string;

  encryptedSummary: ArrayBuffer;
  encryptedDescription: ArrayBuffer;
  encryptedLocation: ArrayBuffer;

  // Time data (unencrypted for query/sort performance)
  startTime: number;
  endTime: number;
  isAllDay: boolean;
  timezone: string;

  // Recurrence
  isRecurring: boolean;
  encryptedRecurrence?: ArrayBuffer;

  // Attendees (encrypted)
  encryptedAttendees: ArrayBuffer;

  // Reminders
  reminders: { method: string; minutes: number }[];

  // Meeting links (encrypted - sensitive)
  encryptedMeetingLink?: ArrayBuffer;

  syncedAt: number;
}

// Storage estimate:
// - Average event: ~5KB encrypted
// - 2 years of events (~3,000): ~15MB
```
## Encryption Strategy

### Key Derivation

Using the existing WebCrypto infrastructure, derive data-specific encryption keys from the user's master key:

```typescript
// Derive a data-specific encryption key from the master key
async function deriveDataEncryptionKey(
  masterKey: CryptoKey,
  purpose: 'gmail' | 'drive' | 'photos' | 'calendar'
): Promise<CryptoKey> {
  const encoder = new TextEncoder();
  const purposeBytes = encoder.encode(`canvas-data-${purpose}`);

  // Re-import the master key for HKDF
  // (the master key must have been created with extractable: true)
  const baseKey = await crypto.subtle.importKey(
    'raw',
    await crypto.subtle.exportKey('raw', masterKey),
    'HKDF',
    false,
    ['deriveKey']
  );

  // Derive the purpose-specific key
  return await crypto.subtle.deriveKey(
    {
      name: 'HKDF',
      hash: 'SHA-256',
      salt: purposeBytes,
      info: new ArrayBuffer(0)
    },
    baseKey,
    { name: 'AES-GCM', length: 256 },
    false,
    ['encrypt', 'decrypt']
  );
}
```
### Encryption/Decryption

```typescript
// Encrypt data before storing.
// NOTE: the caller must persist the returned IV alongside the ciphertext;
// AES-GCM cannot decrypt without it.
async function encryptData(
  data: string | ArrayBuffer,
  key: CryptoKey
): Promise<{ encrypted: ArrayBuffer; iv: Uint8Array }> {
  const iv = crypto.getRandomValues(new Uint8Array(12)); // 96-bit IV for AES-GCM

  const dataBuffer = typeof data === 'string'
    ? new TextEncoder().encode(data)
    : data;

  const encrypted = await crypto.subtle.encrypt(
    { name: 'AES-GCM', iv },
    key,
    dataBuffer
  );

  return { encrypted, iv };
}

// Decrypt data when reading
async function decryptData(
  encrypted: ArrayBuffer,
  iv: Uint8Array,
  key: CryptoKey
): Promise<ArrayBuffer> {
  return await crypto.subtle.decrypt(
    { name: 'AES-GCM', iv },
    key,
    encrypted
  );
}
```
## IndexedDB Schema

```typescript
// Database schema for encrypted Google data
const GOOGLE_DATA_DB = 'canvas-google-data';
const DB_VERSION = 1;

interface GoogleDataSchema {
  gmail: {
    key: string; // message ID
    indexes: ['threadId', 'date', 'syncedAt'];
  };
  drive: {
    key: string; // file ID
    indexes: ['parentId', 'modifiedTime'];
  };
  photos: {
    key: string; // media item ID
    indexes: ['creationTime', 'mediaType'];
  };
  calendar: {
    key: string; // event ID
    indexes: ['calendarId', 'startTime', 'endTime'];
  };
  syncMetadata: {
    key: string; // service name: 'gmail' | 'drive' | 'photos' | 'calendar'
    // Stores last sync token, sync progress, etc.
  };
  encryptionMeta: {
    key: string; // purpose
    // Stores IV, salt for key derivation
  };
}

async function initGoogleDataDB(): Promise<IDBDatabase> {
  return new Promise((resolve, reject) => {
    const request = indexedDB.open(GOOGLE_DATA_DB, DB_VERSION);

    request.onerror = () => reject(request.error);
    request.onsuccess = () => resolve(request.result);

    request.onupgradeneeded = (event) => {
      const db = (event.target as IDBOpenDBRequest).result;

      // Gmail store
      if (!db.objectStoreNames.contains('gmail')) {
        const gmailStore = db.createObjectStore('gmail', { keyPath: 'id' });
        gmailStore.createIndex('threadId', 'threadId', { unique: false });
        gmailStore.createIndex('date', 'date', { unique: false });
        gmailStore.createIndex('syncedAt', 'syncedAt', { unique: false });
      }

      // Drive store
      if (!db.objectStoreNames.contains('drive')) {
        const driveStore = db.createObjectStore('drive', { keyPath: 'id' });
        driveStore.createIndex('parentId', 'parentId', { unique: false });
        driveStore.createIndex('modifiedTime', 'modifiedTime', { unique: false });
      }

      // Photos store
      if (!db.objectStoreNames.contains('photos')) {
        const photosStore = db.createObjectStore('photos', { keyPath: 'id' });
        photosStore.createIndex('creationTime', 'creationTime', { unique: false });
        photosStore.createIndex('mediaType', 'mediaType', { unique: false });
      }

      // Calendar store
      if (!db.objectStoreNames.contains('calendar')) {
        const calendarStore = db.createObjectStore('calendar', { keyPath: 'id' });
        calendarStore.createIndex('calendarId', 'calendarId', { unique: false });
        calendarStore.createIndex('startTime', 'startTime', { unique: false });
        calendarStore.createIndex('endTime', 'endTime', { unique: false });
      }

      // Sync metadata
      if (!db.objectStoreNames.contains('syncMetadata')) {
        db.createObjectStore('syncMetadata', { keyPath: 'service' });
      }

      // Encryption metadata
      if (!db.objectStoreNames.contains('encryptionMeta')) {
        db.createObjectStore('encryptionMeta', { keyPath: 'purpose' });
      }
    };
  });
}
```
## Google OAuth & API Integration

### OAuth 2.0 Scopes

```typescript
const GOOGLE_SCOPES = {
  // Read-only access (data sovereignty - we import, not modify)
  gmail: 'https://www.googleapis.com/auth/gmail.readonly',
  drive: 'https://www.googleapis.com/auth/drive.readonly',
  photos: 'https://www.googleapis.com/auth/photoslibrary.readonly',
  calendar: 'https://www.googleapis.com/auth/calendar.readonly',

  // Profile for user identification
  profile: 'https://www.googleapis.com/auth/userinfo.profile',
  email: 'https://www.googleapis.com/auth/userinfo.email'
};

// Selective scope request - the user chooses what to import
function getRequestedScopes(services: string[]): string {
  const scopes = [GOOGLE_SCOPES.profile, GOOGLE_SCOPES.email];

  services.forEach(service => {
    if (GOOGLE_SCOPES[service as keyof typeof GOOGLE_SCOPES]) {
      scopes.push(GOOGLE_SCOPES[service as keyof typeof GOOGLE_SCOPES]);
    }
  });

  return scopes.join(' ');
}
```
### OAuth Flow with PKCE

```typescript
interface GoogleAuthState {
  codeVerifier: string;
  redirectUri: string;
  state: string;
}

async function initiateGoogleAuth(services: string[]): Promise<void> {
  const codeVerifier = generateCodeVerifier();
  const codeChallenge = await generateCodeChallenge(codeVerifier);
  const state = crypto.randomUUID();
  const redirectUri = window.location.origin + '/oauth/google/callback';

  // Store state for verification on the callback
  sessionStorage.setItem('google_auth_state', JSON.stringify({
    codeVerifier,
    state,
    redirectUri
  }));

  const params = new URLSearchParams({
    client_id: import.meta.env.VITE_GOOGLE_CLIENT_ID,
    redirect_uri: redirectUri,
    response_type: 'code',
    scope: getRequestedScopes(services),
    access_type: 'offline', // Get a refresh token
    prompt: 'consent',
    code_challenge: codeChallenge,
    code_challenge_method: 'S256',
    state
  });

  window.location.href = `https://accounts.google.com/o/oauth2/v2/auth?${params}`;
}

// PKCE helpers
function generateCodeVerifier(): string {
  const array = new Uint8Array(32);
  crypto.getRandomValues(array);
  return base64UrlEncode(array);
}

async function generateCodeChallenge(verifier: string): Promise<string> {
  const encoder = new TextEncoder();
  const data = encoder.encode(verifier);
  const hash = await crypto.subtle.digest('SHA-256', data);
  return base64UrlEncode(new Uint8Array(hash));
}
```
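The helpers above assume a `base64UrlEncode` function. One possible implementation (RFC 4648 §5: standard base64 with `+`→`-`, `/`→`_`, and padding stripped, as PKCE requires):

```typescript
// Base64url (RFC 4648 §5): URL-safe base64 without padding
function base64UrlEncode(bytes: Uint8Array): string {
  let binary = '';
  for (let i = 0; i < bytes.length; i++) {
    binary += String.fromCharCode(bytes[i]);
  }
  return btoa(binary)
    .replace(/\+/g, '-')
    .replace(/\//g, '_')
    .replace(/=+$/, '');
}
```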
### Token Storage (Encrypted)

```typescript
interface EncryptedTokens {
  accessToken: ArrayBuffer;  // Encrypted
  refreshToken: ArrayBuffer; // Encrypted
  accessTokenIv: Uint8Array;
  refreshTokenIv: Uint8Array;
  expiresAt: number;         // Unencrypted for refresh logic
  scopes: string[];          // Unencrypted for UI display
}

async function storeGoogleTokens(
  tokens: { access_token: string; refresh_token?: string; expires_in: number },
  encryptionKey: CryptoKey
): Promise<void> {
  const { encrypted: encAccessToken, iv: accessIv } = await encryptData(
    tokens.access_token,
    encryptionKey
  );

  const encryptedTokens: Partial<EncryptedTokens> = {
    accessToken: encAccessToken,
    accessTokenIv: accessIv,
    expiresAt: Date.now() + (tokens.expires_in * 1000)
  };

  if (tokens.refresh_token) {
    const { encrypted: encRefreshToken, iv: refreshIv } = await encryptData(
      tokens.refresh_token,
      encryptionKey
    );
    encryptedTokens.refreshToken = encRefreshToken;
    encryptedTokens.refreshTokenIv = refreshIv;
  }

  const db = await initGoogleDataDB();
  const tx = db.transaction('encryptionMeta', 'readwrite');
  tx.objectStore('encryptionMeta').put({
    purpose: 'google_tokens',
    ...encryptedTokens
  });
}
```
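`expiresAt` exists so the client can refresh the access token before it lapses. A sketch of that check and of the refresh request body; the endpoint and parameter names are Google's standard token-endpoint contract, while the 60-second safety margin is an illustrative assumption:

```typescript
const GOOGLE_TOKEN_ENDPOINT = 'https://oauth2.googleapis.com/token';

// Build the form body for a refresh_token grant
function buildRefreshBody(clientId: string, refreshToken: string): URLSearchParams {
  return new URLSearchParams({
    client_id: clientId,
    refresh_token: refreshToken,
    grant_type: 'refresh_token'
  });
}

// Refresh when the stored token is within a safety margin of expiry
function needsRefresh(expiresAt: number, now = Date.now(), marginMs = 60_000): boolean {
  return now >= expiresAt - marginMs;
}
```

The decrypted refresh token would be POSTed to `GOOGLE_TOKEN_ENDPOINT` as `application/x-www-form-urlencoded`, and the new access token re-encrypted via `storeGoogleTokens` before it touches IndexedDB.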
## Data Import Workflow

### Progressive Import with Background Sync

```typescript
interface ImportProgress {
  service: 'gmail' | 'drive' | 'photos' | 'calendar';
  total: number;
  imported: number;
  lastSyncToken?: string;
  status: 'idle' | 'importing' | 'paused' | 'error';
  errorMessage?: string;
}

class GoogleDataImporter {
  private encryptionKey: CryptoKey;
  private db: IDBDatabase;

  // getAccessToken, fetchGmailMessage, updateProgress, extractHeader,
  // and extractBody are helpers elided for brevity

  async importGmail(options: {
    maxMessages?: number;
    labelsFilter?: string[];
    dateAfter?: Date;
  }): Promise<void> {
    const accessToken = await this.getAccessToken();

    // Use pagination for large mailboxes
    let pageToken: string | undefined;
    let imported = 0;

    do {
      const params = new URLSearchParams({ maxResults: '100' });
      if (pageToken) params.set('pageToken', pageToken);
      // labelIds is a repeated query parameter, one entry per label
      options.labelsFilter?.forEach(label => params.append('labelIds', label));
      if (options.dateAfter) {
        params.set('q', `after:${Math.floor(options.dateAfter.getTime() / 1000)}`);
      }

      const response = await fetch(
        `https://gmail.googleapis.com/gmail/v1/users/me/messages?${params}`,
        { headers: { Authorization: `Bearer ${accessToken}` } }
      );

      const data = await response.json();

      // Fetch and encrypt each message
      for (const msg of data.messages || []) {
        const fullMessage = await this.fetchGmailMessage(msg.id, accessToken);
        await this.storeEncryptedEmail(fullMessage);
        imported++;

        // Update progress
        this.updateProgress('gmail', imported);

        // Yield to the UI periodically
        if (imported % 10 === 0) {
          await new Promise(r => setTimeout(r, 0));
        }
      }

      pageToken = data.nextPageToken;
    } while (pageToken && (!options.maxMessages || imported < options.maxMessages));
  }

  private async storeEncryptedEmail(message: any): Promise<void> {
    const emailKey = await deriveDataEncryptionKey(this.encryptionKey, 'gmail');

    // Keep each field's IV - AES-GCM decryption requires it
    const subject = await encryptData(this.extractHeader(message, 'Subject') || '', emailKey);
    const body = await encryptData(this.extractBody(message), emailKey);

    const encrypted: EncryptedEmailStore = {
      id: message.id,
      threadId: message.threadId,
      encryptedSubject: subject.encrypted,
      encryptedBody: body.encrypted,
      fieldIvs: { subject: subject.iv, body: body.iv },
      // ... other fields
      date: parseInt(message.internalDate),
      syncedAt: Date.now(),
      localOnly: true
    };

    const tx = this.db.transaction('gmail', 'readwrite');
    tx.objectStore('gmail').put(encrypted);
  }
}
```
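Google APIs rate-limit bulk reads, so an importer like the one above typically retries 429/5xx responses with exponential backoff. A minimal deterministic sketch; the 500ms base, 30s cap, and 5-attempt limit are illustrative assumptions:

```typescript
// Exponential backoff delay: base * 2^attempt, capped
function retryDelayMs(attempt: number, baseMs = 500, capMs = 30_000): number {
  return Math.min(baseMs * 2 ** attempt, capMs);
}

// Wrap a fetch-like call with retries on rate-limit and server errors
async function fetchWithRetry(
  doFetch: () => Promise<Response>,
  maxAttempts = 5
): Promise<Response> {
  for (let attempt = 0; ; attempt++) {
    const response = await doFetch();
    const retryable = response.status === 429 || response.status >= 500;
    if (!retryable || attempt + 1 >= maxAttempts) return response;
    await new Promise(r => setTimeout(r, retryDelayMs(attempt)));
  }
}
```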
## Sharing to Canvas Board

### Selective Sharing Model

```typescript
// DocumentHandle, CanvasDoc, TLShape, and createShapeId come from the
// app's Automerge/canvas stack
interface ShareableItem {
  type: 'email' | 'document' | 'photo' | 'event';
  id: string;
  // Decrypted data for sharing
  decryptedData: any;
}

class DataSharingService {
  /**
   * Share a specific item to the current board.
   * This decrypts the item and adds it to the Automerge document.
   */
  async shareToBoard(
    item: ShareableItem,
    boardHandle: DocumentHandle<CanvasDoc>,
    userKey: CryptoKey
  ): Promise<void> {
    // 1. Decrypt the item
    const decrypted = await this.decryptItem(item, userKey);

    // 2. Create a canvas shape representation
    const shape = this.createShapeFromItem(decrypted, item.type);

    // 3. Add to the Automerge document (syncs to other board users)
    boardHandle.change(doc => {
      doc.shapes[shape.id] = shape;
    });

    // 4. Mark the item as shared (no longer localOnly)
    await this.markAsShared(item.id, item.type);
  }

  /**
   * Create a visual shape from data
   */
  private createShapeFromItem(data: any, type: string): TLShape {
    switch (type) {
      case 'email':
        return {
          id: createShapeId(),
          type: 'email-card',
          props: {
            subject: data.subject,
            from: data.from,
            date: data.date,
            snippet: data.snippet
          }
        };
      case 'event':
        return {
          id: createShapeId(),
          type: 'calendar-event',
          props: {
            title: data.summary,
            startTime: data.startTime,
            endTime: data.endTime,
            location: data.location
          }
        };
      // ... other types
      default:
        throw new Error(`Unsupported item type: ${type}`);
    }
  }
}
```
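When several items are shared at once, the resulting shapes also need canvas positions. A small layout sketch; the card size, gap, and column count are illustrative assumptions, not values from this document:

```typescript
// Lay out n cards in a grid, returning an {x, y} position per card
function gridPositions(
  count: number,
  opts = { cardWidth: 320, cardHeight: 180, gap: 24, columns: 4 }
): Array<{ x: number; y: number }> {
  const positions: Array<{ x: number; y: number }> = [];
  for (let i = 0; i < count; i++) {
    const col = i % opts.columns;
    const row = Math.floor(i / opts.columns);
    positions.push({
      x: col * (opts.cardWidth + opts.gap),
      y: row * (opts.cardHeight + opts.gap)
    });
  }
  return positions;
}
```

Each position would be applied to the shape (e.g. offset from the viewport center) before the Automerge `change` call.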
## R2 Encrypted Backup

### Backup Architecture

```
User Browser                       Cloudflare Worker               R2 Storage
     │                                     │                           │
     │ 1. Encrypt data locally             │                           │
     │    (already encrypted in IndexedDB) │                           │
     │                                     │                           │
     │ 2. Generate backup key              │                           │
     │    (derived from master key)        │                           │
     │                                     │                           │
     │ 3. POST encrypted blob ────────────>│ 4. Validate user          │
     │                                     │    (CryptID auth)         │
     │                                     │                           │
     │                                     │ 5. Store blob ───────────>│
     │                                     │    (already encrypted,    │
     │                                     │     worker can't read)    │
     │                                     │                           │
     │<────────────────────────────────────│ 6. Return backup ID       │
```
### Backup Implementation

```typescript
interface BackupMetadata {
  id: string;
  createdAt: number;
  services: ('gmail' | 'drive' | 'photos' | 'calendar')[];
  itemCount: number;
  sizeBytes: number;
  // Encrypted with the user's key - only they can read it
  encryptedManifest: ArrayBuffer;
}

class R2BackupService {
  private workerUrl = '/api/backup';

  // gatherData, serializeForBackup, writeToIndexedDB, and base64Encode
  // are helpers elided for brevity

  async createBackup(
    services: string[],
    encryptionKey: CryptoKey
  ): Promise<BackupMetadata> {
    // 1. Gather all encrypted data from IndexedDB
    const dataToBackup = await this.gatherData(services);

    // 2. Create a manifest (encrypted)
    const manifest = {
      version: 1,
      createdAt: Date.now(),
      services,
      itemCounts: dataToBackup.counts
    };
    const { encrypted: encManifest } = await encryptData(
      JSON.stringify(manifest),
      encryptionKey
    );

    // 3. Serialize and chunk if large
    const blob = await this.serializeForBackup(dataToBackup);

    // 4. Upload to R2 via the worker
    const response = await fetch(this.workerUrl, {
      method: 'POST',
      headers: {
        'Content-Type': 'application/octet-stream',
        'X-Backup-Manifest': base64Encode(encManifest)
      },
      body: blob
    });

    const { backupId } = await response.json();

    return {
      id: backupId,
      createdAt: Date.now(),
      services: services as any,
      itemCount: Object.values(dataToBackup.counts).reduce((a, b) => a + b, 0),
      sizeBytes: blob.size,
      encryptedManifest: encManifest
    };
  }

  async restoreBackup(
    backupId: string,
    encryptionKey: CryptoKey
  ): Promise<void> {
    // 1. Fetch the encrypted blob from R2
    const response = await fetch(`${this.workerUrl}/${backupId}`);
    const encryptedBlob = await response.arrayBuffer();

    // 2. The data is already encrypted with the user's key,
    //    so it can be written directly back to IndexedDB
    await this.writeToIndexedDB(encryptedBlob);
  }
}
```
## Privacy & Security Guarantees

### What Never Leaves the Browser (Unencrypted)

1. **Email content** - body, subject, attachments
2. **Document content** - file contents, names
3. **Photo data** - images, location metadata
4. **Calendar details** - event descriptions, attendee info
5. **OAuth tokens** - access/refresh tokens

### What the Server Never Sees

1. **Encryption keys** - derived locally, never transmitted
2. **Plaintext data** - all Google API calls happen client-side
3. **The user's Google account** - OAuth runs directly between the browser and Google, with read-only scopes

### Data Flow Summary
```
              ┌─────────────────────┐
              │     Google APIs     │
              │   (authenticated)   │
              └──────────┬──────────┘
                         │
              ┌──────────▼──────────┐
              │    Browser Fetch    │
              │    (client-side)    │
              └──────────┬──────────┘
                         │
              ┌──────────▼──────────┐
              │    Encrypt with     │
              │      WebCrypto      │
              │    (AES-256-GCM)    │
              └──────────┬──────────┘
                         │
        ┌────────────────┼────────────────┐
        │                │                │
 ┌──────▼──────┐  ┌──────▼───────┐ ┌──────▼───────┐
 │  IndexedDB  │  │   Share to   │ │  R2 Backup   │
 │ (local only)│  │    Board     │ │ (encrypted)  │
 │             │  │  (Automerge) │ │              │
 └──────┬──────┘  └──────┬───────┘ └──────┬───────┘
        │                │                │
        ▼                ▼                ▼
  Only you can      Board members    Only you can
  read (your keys)  see shared items decrypt backup
```
## Implementation Phases

### Phase 1: Foundation

- [ ] IndexedDB schema for encrypted data
- [ ] Key derivation from existing WebCrypto keys
- [ ] Encrypt/decrypt utility functions
- [ ] Storage quota monitoring

### Phase 2: Google OAuth

- [ ] OAuth 2.0 with PKCE flow
- [ ] Token encryption and storage
- [ ] Token refresh logic
- [ ] Scope selection UI

### Phase 3: Data Import

- [ ] Gmail import with pagination
- [ ] Drive document import
- [ ] Photos thumbnail import
- [ ] Calendar event import
- [ ] Progress tracking UI

### Phase 4: Canvas Integration

- [ ] Email card shape
- [ ] Document preview shape
- [ ] Photo thumbnail shape
- [ ] Calendar event shape
- [ ] Share to board functionality

### Phase 5: R2 Backup

- [ ] Encrypted backup creation
- [ ] Backup restore
- [ ] Backup management UI
- [ ] Automatic backup scheduling

### Phase 6: Polish

- [ ] Safari storage warnings
- [ ] Offline data access
- [ ] Search within encrypted data
- [ ] Data export (Google Takeout style)

## Security Checklist

- [ ] All data encrypted before storage
- [ ] Keys never leave browser unencrypted
- [ ] OAuth tokens encrypted at rest
- [ ] PKCE used for OAuth flow
- [ ] Read-only Google API scopes
- [ ] Safari 7-day eviction handled
- [ ] Storage quota warnings
- [ ] Secure context required (HTTPS)
- [ ] CSP headers configured
- [ ] No sensitive data in console logs

## Related Documents

- [Local File Upload](./LOCAL_FILE_UPLOAD.md) - Multi-item upload with the same encryption model
- [Offline Storage Feasibility](../OFFLINE_STORAGE_FEASIBILITY.md) - IndexedDB + Automerge foundation

## References

- [IndexedDB API](https://developer.mozilla.org/en-US/docs/Web/API/IndexedDB_API)
- [Web Crypto API](https://developer.mozilla.org/en-US/docs/Web/API/Web_Crypto_API)
- [Storage API](https://developer.mozilla.org/en-US/docs/Web/API/Storage_API)
- [Google OAuth 2.0](https://developers.google.com/identity/protocols/oauth2)
- [Gmail API](https://developers.google.com/gmail/api)
- [Drive API](https://developers.google.com/drive/api)
- [Photos Library API](https://developers.google.com/photos/library/reference/rest)
- [Calendar API](https://developers.google.com/calendar/api)
|
@ -0,0 +1,862 @@
|
|||
# Local File Upload: Multi-Item Encrypted Import

A simpler, more broadly compatible approach to importing local files into the canvas with the same privacy-first, encrypted storage model.

## Overview

Instead of maintaining persistent folder connections (which have browser compatibility issues), provide a **drag-and-drop / file picker** interface for batch importing files into encrypted local storage.

```
┌─────────────────────────────────────────────────────────────────────────┐
│ UPLOAD INTERFACE │
├─────────────────────────────────────────────────────────────────────────┤
│ │
│ ┌─────────────────────────────────────────────────────────────────┐ │
│ │ │ │
│ │ 📁 Drop files here or click to browse │ │
│ │ │ │
│ │ Supports: Images, PDFs, Documents, Text, Audio, Video │ │
│ │ │ │
│ └─────────────────────────────────────────────────────────────────┘ │
│ │
│ ┌──────────────────────────────────────────────────────────────────┐ │
│ │ Import Queue [Upload] │ │
│ ├──────────────────────────────────────────────────────────────────┤ │
│ │ ☑ photo_001.jpg (2.4 MB) 🔒 Encrypt 📤 Share │ │
│ │ ☑ meeting_notes.pdf (450 KB) 🔒 Encrypt ☐ Private │ │
│ │ ☑ project_plan.md (12 KB) 🔒 Encrypt ☐ Private │ │
│ │ ☐ sensitive_doc.docx (1.2 MB) 🔒 Encrypt ☐ Private │ │
│ └──────────────────────────────────────────────────────────────────┘ │
│ │
│ Storage: 247 MB used / ~5 GB available │
│ │
└─────────────────────────────────────────────────────────────────────────┘
```

## Why Multi-Item Upload vs. Folder Connection

| Feature | Folder Connection | Multi-Item Upload |
|---------|------------------|-------------------|
| Browser Support | Chrome/Edge only | All browsers |
| Persistent Access | Yes (with permission) | No (one-time import) |
| Implementation | Complex | Simple |
| User Control | Less explicit | Very explicit |
| Privacy UX | Hidden | Clear per-file choices |

**Recommendation**: Multi-item upload is better for privacy-conscious users who want explicit control over what enters the system.

## Supported File Types

### Documents

| Type | Extension | Processing | Storage Strategy |
|------|-----------|-----------|------------------|
| Markdown | `.md` | Parse frontmatter, render | Full content |
| PDF | `.pdf` | Extract text, thumbnail | Text + thumbnail |
| Word | `.docx` | Convert to markdown | Converted content |
| Text | `.txt`, `.csv`, `.json` | Direct | Full content |
| Code | `.js`, `.ts`, `.py`, etc. | Syntax highlight | Full content |

### Images

| Type | Extension | Processing | Storage Strategy |
|------|-----------|-----------|------------------|
| Photos | `.jpg`, `.png`, `.webp` | Generate thumbnail | Thumbnail + full |
| Vector | `.svg` | Direct | Full content |
| GIF | `.gif` | First frame thumb | Thumbnail + full |

### Media

| Type | Extension | Processing | Storage Strategy |
|------|-----------|-----------|------------------|
| Audio | `.mp3`, `.wav`, `.m4a` | Waveform preview | Reference + metadata |
| Video | `.mp4`, `.webm` | Frame thumbnail | Reference + metadata |

### Archives (Future)

| Type | Extension | Processing |
|------|-----------|-----------|
| ZIP | `.zip` | List contents, selective extract |
| Obsidian Export | `.zip` | Vault structure import |

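The routing implied by these tables can be captured in one small helper. This is a sketch — the category names are illustrative, and the extension fallback covers text-like files that browsers sometimes report with an empty MIME type:

```typescript
// Sketch: route an incoming file to a processing category per the tables
// above. Falls back to extension matching when the MIME type is missing.
type FileCategory = 'document' | 'image' | 'audio' | 'video' | 'archive' | 'other';

const TEXT_EXTENSIONS = ['.md', '.txt', '.csv', '.json', '.js', '.ts', '.py'];

function classifyFile(name: string, mimeType: string): FileCategory {
  if (mimeType.startsWith('image/')) return 'image';
  if (mimeType.startsWith('audio/')) return 'audio';
  if (mimeType.startsWith('video/')) return 'video';
  if (mimeType === 'application/pdf' || mimeType.startsWith('text/')) return 'document';
  if (mimeType === 'application/zip') return 'archive';

  // Browsers report an empty MIME type for many plain-text extensions
  const lower = name.toLowerCase();
  if (TEXT_EXTENSIONS.some(ext => lower.endsWith(ext))) return 'document';
  return 'other';
}
```

The `FileProcessor` below does the same routing inline on `file.type`; a shared helper like this keeps the upload validation and the processing pipeline in agreement.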
## Architecture

```typescript
interface UploadedFile {
  id: string;                 // Generated UUID
  originalName: string;       // User's filename
  mimeType: string;
  size: number;

  // Processing results
  processed: {
    thumbnail?: ArrayBuffer;        // For images/PDFs/videos
    extractedText?: string;         // For searchable docs
    metadata?: Record<string, any>; // EXIF, frontmatter, etc.
  };

  // Encryption
  encrypted: {
    content: ArrayBuffer;     // Encrypted file content
    iv: Uint8Array;
    keyId: string;            // Reference to encryption key
  };

  // User choices
  sharing: {
    localOnly: boolean;       // Default true
    sharedToBoard?: string;   // Board ID if shared
    backedUpToR2?: boolean;
  };

  // Timestamps
  importedAt: number;
  lastAccessedAt: number;
}
```

## Implementation

### 1. File Input Component

```typescript
import React, { useState } from 'react';

interface FileUploadProps {
  onFilesSelected: (files: File[]) => void;
  maxFileSize?: number; // bytes
  maxFiles?: number;
  acceptedTypes?: string[];
}

export function FileUploadZone({
  onFilesSelected,
  maxFileSize = 100 * 1024 * 1024, // 100MB default
  maxFiles = 50,
  acceptedTypes
}: FileUploadProps) {
  const [isDragging, setIsDragging] = useState(false);
  const [errors, setErrors] = useState<string[]>([]);

  const validateAndProcess = (files: File[]) => {
    const nextErrors: string[] = [];
    const validFiles: File[] = [];

    for (const file of files.slice(0, maxFiles)) {
      if (file.size > maxFileSize) {
        nextErrors.push(`${file.name}: exceeds ${maxFileSize / 1024 / 1024}MB limit`);
        continue;
      }

      if (acceptedTypes && !acceptedTypes.some(t => file.type.match(t))) {
        nextErrors.push(`${file.name}: unsupported file type`);
        continue;
      }

      validFiles.push(file);
    }

    if (files.length > maxFiles) {
      nextErrors.push(`Only the first ${maxFiles} files will be imported`);
    }

    setErrors(nextErrors);
    if (validFiles.length > 0) {
      onFilesSelected(validFiles);
    }
  };

  // Plain handlers (rather than useCallback with an empty deps array)
  // avoid stale closures over onFilesSelected and the size/type props
  const handleDrop = (e: React.DragEvent) => {
    e.preventDefault();
    setIsDragging(false);
    validateAndProcess(Array.from(e.dataTransfer.files));
  };

  const handleFileInput = (e: React.ChangeEvent<HTMLInputElement>) => {
    validateAndProcess(Array.from(e.target.files || []));
  };

  return (
    <div
      onDrop={handleDrop}
      onDragOver={(e) => { e.preventDefault(); setIsDragging(true); }}
      onDragLeave={() => setIsDragging(false)}
      className={`upload-zone ${isDragging ? 'dragging' : ''}`}
    >
      <input
        type="file"
        multiple
        onChange={handleFileInput}
        accept={acceptedTypes?.join(',')}
        id="file-upload"
        hidden
      />
      <label htmlFor="file-upload">
        <span className="upload-icon">📁</span>
        <span>Drop files here or click to browse</span>
        <span className="upload-hint">
          Images, PDFs, Documents, Text files
        </span>
      </label>

      {errors.length > 0 && (
        <div className="upload-errors">
          {errors.map((err, i) => <div key={i}>{err}</div>)}
        </div>
      )}
    </div>
  );
}
```

### 2. File Processing Pipeline

```typescript
interface ProcessedFile {
  file: File;
  thumbnail?: Blob;
  extractedText?: string;
  metadata?: Record<string, any>;
}

class FileProcessor {

  async process(file: File): Promise<ProcessedFile> {
    const result: ProcessedFile = { file };

    // Route based on MIME type
    if (file.type.startsWith('image/')) {
      return this.processImage(file, result);
    } else if (file.type === 'application/pdf') {
      return this.processPDF(file, result);
    } else if (file.type.startsWith('text/') || this.isTextFile(file)) {
      return this.processText(file, result);
    } else if (file.type.startsWith('video/')) {
      return this.processVideo(file, result);
    } else if (file.type.startsWith('audio/')) {
      return this.processAudio(file, result);
    }

    // Default: store as-is
    return result;
  }

  private async processImage(file: File, result: ProcessedFile): Promise<ProcessedFile> {
    // Generate thumbnail
    const img = await createImageBitmap(file);
    const canvas = new OffscreenCanvas(200, 200);
    const ctx = canvas.getContext('2d')!;

    // Calculate aspect-ratio preserving dimensions
    const scale = Math.min(200 / img.width, 200 / img.height);
    const w = img.width * scale;
    const h = img.height * scale;

    ctx.drawImage(img, (200 - w) / 2, (200 - h) / 2, w, h);
    result.thumbnail = await canvas.convertToBlob({ type: 'image/webp', quality: 0.8 });

    // Extract EXIF if available
    if (file.type === 'image/jpeg') {
      result.metadata = await this.extractExif(file);
    }

    return result;
  }

  private async processPDF(file: File, result: ProcessedFile): Promise<ProcessedFile> {
    // Use pdf.js for text extraction and thumbnail
    // (the app must also configure pdfjsLib.GlobalWorkerOptions.workerSrc)
    const pdfjsLib = await import('pdfjs-dist');
    const arrayBuffer = await file.arrayBuffer();
    const pdf = await pdfjsLib.getDocument({ data: arrayBuffer }).promise;

    // Get first page as thumbnail
    const page = await pdf.getPage(1);
    const viewport = page.getViewport({ scale: 0.5 });
    const canvas = new OffscreenCanvas(viewport.width, viewport.height);
    const ctx = canvas.getContext('2d')!;

    await page.render({ canvasContext: ctx, viewport }).promise;
    result.thumbnail = await canvas.convertToBlob({ type: 'image/webp' });

    // Extract text from all pages
    let text = '';
    for (let i = 1; i <= pdf.numPages; i++) {
      const page = await pdf.getPage(i);
      const content = await page.getTextContent();
      text += content.items.map((item: any) => item.str).join(' ') + '\n';
    }
    result.extractedText = text;

    result.metadata = { pageCount: pdf.numPages };

    return result;
  }

  private async processText(file: File, result: ProcessedFile): Promise<ProcessedFile> {
    result.extractedText = await file.text();

    // Parse markdown frontmatter if applicable
    if (file.name.endsWith('.md')) {
      const frontmatter = this.parseFrontmatter(result.extractedText);
      if (frontmatter) {
        result.metadata = frontmatter;
      }
    }

    return result;
  }

  private async processVideo(file: File, result: ProcessedFile): Promise<ProcessedFile> {
    // Generate thumbnail from first frame
    const video = document.createElement('video');
    video.preload = 'metadata';
    video.src = URL.createObjectURL(file);

    await new Promise(resolve => video.addEventListener('loadedmetadata', resolve, { once: true }));
    video.currentTime = 1; // First second
    await new Promise(resolve => video.addEventListener('seeked', resolve, { once: true }));

    const canvas = new OffscreenCanvas(200, 200);
    const ctx = canvas.getContext('2d')!;
    const scale = Math.min(200 / video.videoWidth, 200 / video.videoHeight);
    ctx.drawImage(video, 0, 0, video.videoWidth * scale, video.videoHeight * scale);

    result.thumbnail = await canvas.convertToBlob({ type: 'image/webp' });
    result.metadata = {
      duration: video.duration,
      width: video.videoWidth,
      height: video.videoHeight
    };

    URL.revokeObjectURL(video.src);
    return result;
  }

  private async processAudio(file: File, result: ProcessedFile): Promise<ProcessedFile> {
    // Extract duration and basic metadata
    const audio = document.createElement('audio');
    audio.src = URL.createObjectURL(file);

    await new Promise(resolve => audio.addEventListener('loadedmetadata', resolve, { once: true }));

    result.metadata = {
      duration: audio.duration
    };

    URL.revokeObjectURL(audio.src);
    return result;
  }

  private isTextFile(file: File): boolean {
    const textExtensions = ['.md', '.txt', '.json', '.csv', '.yaml', '.yml', '.xml', '.html', '.css', '.js', '.ts', '.py', '.sh'];
    return textExtensions.some(ext => file.name.toLowerCase().endsWith(ext));
  }

  private parseFrontmatter(content: string): Record<string, any> | null {
    const match = content.match(/^---\n([\s\S]*?)\n---/);
    if (!match) return null;

    try {
      // Simple YAML-like parsing (or use a proper YAML parser)
      const lines = match[1].split('\n');
      const result: Record<string, any> = {};
      for (const line of lines) {
        const [key, ...valueParts] = line.split(':');
        if (key && valueParts.length) {
          result[key.trim()] = valueParts.join(':').trim();
        }
      }
      return result;
    } catch {
      return null;
    }
  }

  private async extractExif(file: File): Promise<Record<string, any>> {
    // Would use exif-js or similar library
    return {};
  }
}
```

### 3. Encryption & Storage

```typescript
class LocalFileStore {
  private db: IDBDatabase;
  private encryptionKey: CryptoKey;

  async storeFile(processed: ProcessedFile, options: {
    shareToBoard?: boolean;
  } = {}): Promise<UploadedFile> {
    const fileId = crypto.randomUUID();

    // Read file content
    const content = await processed.file.arrayBuffer();

    // Encrypt content
    const iv = crypto.getRandomValues(new Uint8Array(12));
    const encryptedContent = await crypto.subtle.encrypt(
      { name: 'AES-GCM', iv },
      this.encryptionKey,
      content
    );

    // Encrypt thumbnail if present
    let encryptedThumbnail: ArrayBuffer | undefined;
    let thumbnailIv: Uint8Array | undefined;
    if (processed.thumbnail) {
      thumbnailIv = crypto.getRandomValues(new Uint8Array(12));
      const thumbBuffer = await processed.thumbnail.arrayBuffer();
      encryptedThumbnail = await crypto.subtle.encrypt(
        { name: 'AES-GCM', iv: thumbnailIv },
        this.encryptionKey,
        thumbBuffer
      );
    }

    const uploadedFile: UploadedFile = {
      id: fileId,
      originalName: processed.file.name,
      mimeType: processed.file.type,
      size: processed.file.size,
      processed: {
        extractedText: processed.extractedText,
        metadata: processed.metadata
      },
      encrypted: {
        content: encryptedContent,
        iv,
        keyId: 'user-master-key'
      },
      sharing: {
        localOnly: !options.shareToBoard,
        sharedToBoard: options.shareToBoard ? getCurrentBoardId() : undefined
      },
      importedAt: Date.now(),
      lastAccessedAt: Date.now()
    };

    // Store encrypted thumbnail separately (for faster listing)
    if (encryptedThumbnail && thumbnailIv) {
      await this.storeThumbnail(fileId, encryptedThumbnail, thumbnailIv);
    }

    // Store to IndexedDB and wait for the transaction to commit
    const tx = this.db.transaction('files', 'readwrite');
    tx.objectStore('files').put(uploadedFile);
    await new Promise<void>((resolve, reject) => {
      tx.oncomplete = () => resolve();
      tx.onerror = () => reject(tx.error);
    });

    return uploadedFile;
  }

  async getFile(fileId: string): Promise<{
    file: UploadedFile;
    decryptedContent: ArrayBuffer;
  } | null> {
    const tx = this.db.transaction('files', 'readonly');
    const file = await new Promise<UploadedFile | undefined>((resolve, reject) => {
      const req = tx.objectStore('files').get(fileId);
      req.onsuccess = () => resolve(req.result);
      req.onerror = () => reject(req.error);
    });

    if (!file) return null;

    // Decrypt content
    const decryptedContent = await crypto.subtle.decrypt(
      { name: 'AES-GCM', iv: file.encrypted.iv },
      this.encryptionKey,
      file.encrypted.content
    );

    return { file, decryptedContent };
  }

  async listFiles(options?: {
    mimeTypeFilter?: string;
    limit?: number;
  }): Promise<UploadedFile[]> {
    const tx = this.db.transaction('files', 'readonly');
    const store = tx.objectStore('files');

    return new Promise(resolve => {
      const files: UploadedFile[] = [];
      const req = store.openCursor();

      req.onsuccess = (e) => {
        const cursor = (e.target as IDBRequest).result;
        if (cursor && (!options?.limit || files.length < options.limit)) {
          const file = cursor.value as UploadedFile;

          // Filter by MIME type if specified
          if (!options?.mimeTypeFilter || file.mimeType.startsWith(options.mimeTypeFilter)) {
            files.push(file);
          }

          cursor.continue();
        } else {
          resolve(files);
        }
      };
    });
  }
}
```

### 4. IndexedDB Schema

```typescript
const LOCAL_FILES_DB = 'canvas-local-files';
const DB_VERSION = 1;

async function initLocalFilesDB(): Promise<IDBDatabase> {
  return new Promise((resolve, reject) => {
    const request = indexedDB.open(LOCAL_FILES_DB, DB_VERSION);

    request.onerror = () => reject(request.error);
    request.onsuccess = () => resolve(request.result);

    request.onupgradeneeded = (event) => {
      const db = (event.target as IDBOpenDBRequest).result;

      // Main files store
      if (!db.objectStoreNames.contains('files')) {
        const store = db.createObjectStore('files', { keyPath: 'id' });
        store.createIndex('mimeType', 'mimeType', { unique: false });
        store.createIndex('importedAt', 'importedAt', { unique: false });
        store.createIndex('originalName', 'originalName', { unique: false });
        store.createIndex('sharedToBoard', 'sharing.sharedToBoard', { unique: false });
      }

      // Thumbnails store (separate for faster listing)
      if (!db.objectStoreNames.contains('thumbnails')) {
        db.createObjectStore('thumbnails', { keyPath: 'fileId' });
      }

      // Search index (encrypted full-text search)
      if (!db.objectStoreNames.contains('searchIndex')) {
        const searchStore = db.createObjectStore('searchIndex', { keyPath: 'fileId' });
        searchStore.createIndex('tokens', 'tokens', { unique: false, multiEntry: true });
      }
    };
  });
}
```

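The `searchIndex` store relies on the multiEntry `tokens` index: each entry stores an array of tokens, and IndexedDB can then look up files by any single token. A sketch of producing that token array from extracted text — lowercasing, the minimum token length, and deduplication are assumed conventions, not requirements:

```typescript
// Sketch: build the token array stored in the `searchIndex` object store.
// The multiEntry `tokens` index then lets us query files by any one token.
function tokenizeForIndex(text: string, minLength = 3): string[] {
  const tokens = text
    .toLowerCase()
    .split(/[^\p{L}\p{N}]+/u)          // split on anything non-alphanumeric
    .filter(t => t.length >= minLength); // drop very short tokens
  return [...new Set(tokens)];          // dedupe before indexing
}
```

Note the privacy trade-off: tokens sit in the index as plaintext, so for truly private search the token list would itself need to be hashed or encrypted per token before indexing.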
## UI Components

### Import Dialog

```tsx
function ImportFilesDialog({ isOpen, onClose }: { isOpen: boolean; onClose: () => void }) {
  const [selectedFiles, setSelectedFiles] = useState<ProcessedFile[]>([]);
  const [importing, setImporting] = useState(false);
  const [progress, setProgress] = useState(0);
  const fileStore = useLocalFileStore();

  const handleFilesSelected = async (files: File[]) => {
    const processor = new FileProcessor();
    const processed: ProcessedFile[] = [];

    for (const file of files) {
      processed.push(await processor.process(file));
    }

    setSelectedFiles(prev => [...prev, ...processed]);
  };

  const handleImport = async () => {
    setImporting(true);

    for (let i = 0; i < selectedFiles.length; i++) {
      await fileStore.storeFile(selectedFiles[i]);
      setProgress((i + 1) / selectedFiles.length * 100);
    }

    setImporting(false);
    onClose();
  };

  return (
    <Dialog open={isOpen} onClose={onClose}>
      <DialogTitle>Import Files</DialogTitle>

      <FileUploadZone onFilesSelected={handleFilesSelected} />

      {selectedFiles.length > 0 && (
        <div className="file-list">
          {selectedFiles.map((pf, i) => (
            <FilePreviewRow
              key={i}
              file={pf}
              onRemove={() => setSelectedFiles(prev => prev.filter((_, j) => j !== i))}
            />
          ))}
        </div>
      )}

      {importing && (
        <progress value={progress} max={100} />
      )}

      <DialogActions>
        <button onClick={onClose}>Cancel</button>
        <button
          onClick={handleImport}
          disabled={selectedFiles.length === 0 || importing}
        >
          Import {selectedFiles.length} files
        </button>
      </DialogActions>
    </Dialog>
  );
}
```

### File Browser Panel

```tsx
function LocalFilesBrowser() {
  const [files, setFiles] = useState<UploadedFile[]>([]);
  const [filter, setFilter] = useState<string>('all');
  const fileStore = useLocalFileStore();

  useEffect(() => {
    loadFiles();
  }, [filter]);

  const loadFiles = async () => {
    const mimeFilter = filter === 'all' ? undefined : filter;
    setFiles(await fileStore.listFiles({ mimeTypeFilter: mimeFilter }));
  };

  const handleDragToCanvas = (file: UploadedFile) => {
    // Create a shape from the file and add to canvas
  };

  return (
    <div className="local-files-browser">
      <div className="filter-bar">
        <button onClick={() => setFilter('all')}>All</button>
        <button onClick={() => setFilter('image/')}>Images</button>
        <button onClick={() => setFilter('application/pdf')}>PDFs</button>
        <button onClick={() => setFilter('text/')}>Documents</button>
      </div>

      <div className="files-grid">
        {files.map(file => (
          <FileCard
            key={file.id}
            file={file}
            onDragStart={() => handleDragToCanvas(file)}
          />
        ))}
      </div>
    </div>
  );
}
```

## Canvas Integration

### Drag Files to Canvas

```typescript
import { AssetRecordType, Editor, TLShapeId, createShapeId } from 'tldraw';

// When the user drags a local file onto the canvas
async function createShapeFromLocalFile(
  file: UploadedFile,
  position: { x: number; y: number },
  editor: Editor
): Promise<TLShapeId> {
  const fileStore = getLocalFileStore();
  const result = await fileStore.getFile(file.id);
  if (!result) throw new Error(`File ${file.id} is missing from local storage`);
  const { decryptedContent } = result;

  // createShape returns the editor (for chaining), so generate the id up front
  const shapeId = createShapeId();

  if (file.mimeType.startsWith('image/')) {
    // Create image shape backed by an asset
    const blob = new Blob([decryptedContent], { type: file.mimeType });
    const assetId = AssetRecordType.createId();

    editor.createAssets([{
      id: assetId,
      type: 'image',
      typeName: 'asset',
      props: {
        name: file.originalName,
        src: URL.createObjectURL(blob),
        w: 400,
        h: 300,
        mimeType: file.mimeType,
        isAnimated: file.mimeType === 'image/gif'
      }
    }]);

    editor.createShape({
      id: shapeId,
      type: 'image',
      x: position.x,
      y: position.y,
      props: { assetId, w: 400, h: 300 }
    });
    return shapeId;
  }

  if (file.mimeType === 'application/pdf') {
    // Create PDF embed or preview shape
    editor.createShape({
      id: shapeId,
      type: 'pdf-preview',
      x: position.x,
      y: position.y,
      props: {
        fileId: file.id,
        name: file.originalName,
        pageCount: file.processed.metadata?.pageCount
      }
    });
    return shapeId;
  }

  if (file.mimeType.startsWith('text/') || file.originalName.endsWith('.md')) {
    // Create note shape with content
    const text = new TextDecoder().decode(decryptedContent);
    editor.createShape({
      id: shapeId,
      type: 'note',
      x: position.x,
      y: position.y,
      props: {
        text: text.slice(0, 1000), // Truncate for display
        fileId: file.id,
        fullContentAvailable: text.length > 1000
      }
    });
    return shapeId;
  }

  // Default: generic file card
  editor.createShape({
    id: shapeId,
    type: 'file-card',
    x: position.x,
    y: position.y,
    props: {
      fileId: file.id,
      name: file.originalName,
      size: file.size,
      mimeType: file.mimeType
    }
  });
  return shapeId;
}
```

## Storage Considerations

### Size Limits & Recommendations

| File Type | Max Recommended | Notes |
|-----------|----------------|-------|
| Images | 20MB each | Larger images get resized |
| PDFs | 50MB each | Text extracted for search |
| Videos | 100MB each | Store reference, thumbnail only |
| Audio | 50MB each | Store with waveform preview |
| Documents | 10MB each | Full content stored |

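These per-type limits can be enforced at import time alongside the global `maxFileSize` check. A minimal lookup — the numbers mirror the table and should be treated as tunable defaults:

```typescript
// Sketch: per-type size limits matching the recommendations table above.
const MB = 1024 * 1024;

function recommendedMaxSize(mimeType: string): number {
  if (mimeType.startsWith('image/')) return 20 * MB;
  if (mimeType === 'application/pdf') return 50 * MB;
  if (mimeType.startsWith('video/')) return 100 * MB;
  if (mimeType.startsWith('audio/')) return 50 * MB;
  return 10 * MB; // documents and everything else
}
```

`validateAndProcess` in the upload component could call this per file instead of (or in addition to) the single `maxFileSize` prop.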
### Total Storage Budget

```typescript
const STORAGE_CONFIG = {
  // Soft warning at 500MB
  warningThreshold: 500 * 1024 * 1024,

  // Hard limit at 2GB (leaves room for other data)
  maxStorage: 2 * 1024 * 1024 * 1024,

  // Auto-cleanup: remove thumbnails for files not accessed in 30 days
  thumbnailRetentionDays: 30
};

async function checkStorageQuota(): Promise<{
  used: number;
  available: number;
  warning: boolean;
}> {
  const estimate = await navigator.storage.estimate();
  const used = estimate.usage || 0;
  const quota = estimate.quota || 0;

  return {
    used,
    available: Math.min(quota - used, STORAGE_CONFIG.maxStorage - used),
    warning: used > STORAGE_CONFIG.warningThreshold
  };
}
```

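The byte counts from `checkStorageQuota()` feed the "Storage: 247 MB used / ~5 GB available" line in the upload UI. A small formatter sketch (the rounding rules are a presentation choice, not a requirement):

```typescript
// Sketch: human-readable byte formatting for the storage status line.
function formatBytes(bytes: number): string {
  const units = ['B', 'KB', 'MB', 'GB'];
  let value = bytes;
  let unit = 0;
  while (value >= 1024 && unit < units.length - 1) {
    value /= 1024;
    unit++;
  }
  // One decimal for small values, whole numbers otherwise
  return `${value < 10 && unit > 0 ? value.toFixed(1) : Math.round(value)} ${units[unit]}`;
}
```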
## Privacy Features

### Per-File Privacy Controls

```typescript
interface FilePrivacySettings {
  // Encryption is always on - this is about sharing
  localOnly: boolean;          // Never leaves browser
  shareableToBoard: boolean;   // Can be added to shared board
  includeInR2Backup: boolean;  // Include in cloud backup

  // Metadata privacy
  stripExif: boolean;          // Remove location/camera data from images
  anonymizeFilename: boolean;  // Use generated name instead of original
}

const DEFAULT_PRIVACY: FilePrivacySettings = {
  localOnly: true,
  shareableToBoard: false,
  includeInR2Backup: true,
  stripExif: true,
  anonymizeFilename: false
};
```

### Sharing Flow

```
┌─────────────────────────────────────────────────────────────────┐
│ User drags local file onto shared board │
├─────────────────────────────────────────────────────────────────┤
│ │
│ ⚠️ Share "meeting_notes.pdf" to this board? │
│ │
│ This file is currently private. Sharing it will: │
│ • Make it visible to all board members │
│ • Upload an encrypted copy to sync storage │
│ • Keep the original encrypted on your device │
│ │
│ [Keep Private] [Share to Board] │
│ │
└─────────────────────────────────────────────────────────────────┘
```

## Implementation Checklist

### Phase 1: Core Upload
- [ ] File drop zone component
- [ ] File type detection
- [ ] Image thumbnail generation
- [ ] PDF text extraction & thumbnail
- [ ] Encryption before storage
- [ ] IndexedDB schema & storage

### Phase 2: File Management
- [ ] File browser panel
- [ ] Filter by type
- [ ] Search within files
- [ ] Delete files
- [ ] Storage quota display

### Phase 3: Canvas Integration
- [ ] Drag files to canvas
- [ ] Image shape from file
- [ ] PDF preview shape
- [ ] Document/note shape
- [ ] Generic file card shape

### Phase 4: Sharing & Backup
- [ ] Share confirmation dialog
- [ ] Upload to Automerge sync
- [ ] Include in R2 backup
- [ ] Privacy settings per file

## Related Documents

- [Google Data Sovereignty](./GOOGLE_DATA_SOVEREIGNTY.md) - Same encryption model for Google imports
- [Offline Storage Feasibility](../OFFLINE_STORAGE_FEASIBILITY.md) - IndexedDB + Automerge foundation