Merge origin/main into feature/google-export
Bring in all the latest changes from main, including:

- Index validation and migration for tldraw shapes
- UserSettingsModal with integrations tab
- CryptID authentication updates
- AI services (image gen, video gen, mycelial intelligence)
- Automerge sync improvements
- Various UI improvements

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
This commit is contained in: commit 8411211ca6

@@ -0,0 +1,4 @@
# Ignore Cloudflare Worker configuration files during Pages deployment
# These are only used for separate Worker deployments
worker/
*.toml
14 .env.example

@@ -4,10 +4,22 @@ VITE_GOOGLE_MAPS_API_KEY='your_google_maps_api_key'
VITE_DAILY_DOMAIN='your_daily_domain'
VITE_TLDRAW_WORKER_URL='your_worker_url'

# AI Configuration
# AI Orchestrator with Ollama (FREE local AI - highest priority)
VITE_OLLAMA_URL='https://ai.jeffemmett.com'

# RunPod API (Primary AI provider when Ollama unavailable)
# Users don't need their own API keys - RunPod is pre-configured
VITE_RUNPOD_API_KEY='your_runpod_api_key_here'
VITE_RUNPOD_TEXT_ENDPOINT_ID='your_text_endpoint_id' # vLLM for chat/text
VITE_RUNPOD_IMAGE_ENDPOINT_ID='your_image_endpoint_id' # Automatic1111/SD
VITE_RUNPOD_VIDEO_ENDPOINT_ID='your_video_endpoint_id' # Wan2.2
VITE_RUNPOD_WHISPER_ENDPOINT_ID='your_whisper_endpoint_id' # WhisperX

# Worker-only Variables (Do not prefix with VITE_)
CLOUDFLARE_API_TOKEN='your_cloudflare_token'
CLOUDFLARE_ACCOUNT_ID='your_account_id'
CLOUDFLARE_ZONE_ID='your_zone_id'
R2_BUCKET_NAME='your_bucket_name'
R2_PREVIEW_BUCKET_NAME='your_preview_bucket_name'
DAILY_API_KEY=your_daily_api_key_here

@@ -0,0 +1,28 @@
name: Mirror to Gitea

on:
  push:
    branches:
      - main
      - master
  workflow_dispatch:

jobs:
  mirror:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 0

      - name: Mirror to Gitea
        env:
          GITEA_TOKEN: ${{ secrets.GITEA_TOKEN }}
          GITEA_USERNAME: ${{ secrets.GITEA_USERNAME }}
        run: |
          REPO_NAME=$(basename $GITHUB_REPOSITORY)
          git remote add gitea https://$GITEA_USERNAME:$GITEA_TOKEN@gitea.jeffemmett.com/jeffemmett/$REPO_NAME.git || true
          git push gitea --all --force
          git push gitea --tags --force

@@ -175,3 +175,4 @@ dist
.env.*.local
.dev.vars
.env.production
.aider*
2 .npmrc

@@ -1,3 +1,3 @@
legacy-peer-deps=true
strict-peer-dependencies=false
auto-install-peers=true

@@ -0,0 +1,626 @@
# AI Services Deployment & Testing Guide

A complete guide for deploying and testing the AI services integration in canvas-website with the Netcup RS 8000 and RunPod.

---

## 🎯 Overview

This project integrates multiple AI services with smart routing:

**Smart Routing Strategy:**
- **Text/Code (70-80% of workload)**: Local Ollama on the RS 8000 → **FREE**
- **Images, low priority**: Local Stable Diffusion on the RS 8000 → **FREE** (slow, ~60s)
- **Images, high priority**: RunPod GPU (SDXL) → **$0.02/image** (fast, ~5s)
- **Video generation**: RunPod GPU (Wan2.1) → **$0.50/video** (30-90s)

**Expected cost savings:** $86-350/month compared to persistent GPU instances.

---

## 📦 What's Included

### AI Services:
1. ✅ **Text Generation (LLM)**
   - RunPod integration via `src/lib/runpodApi.ts`
   - Enhanced LLM utilities in `src/utils/llmUtils.ts`
   - AI Orchestrator client in `src/lib/aiOrchestrator.ts`
   - Prompt shapes, arrow LLM actions, command palette

2. ✅ **Image Generation**
   - ImageGenShapeUtil in `src/shapes/ImageGenShapeUtil.tsx`
   - ImageGenTool in `src/tools/ImageGenTool.ts`
   - Mock mode **DISABLED** (ready for production)
   - Smart routing: low priority → local CPU, high priority → RunPod GPU

3. ✅ **Video Generation (NEW!)**
   - VideoGenShapeUtil in `src/shapes/VideoGenShapeUtil.tsx`
   - VideoGenTool in `src/tools/VideoGenTool.ts`
   - Wan2.1 I2V 14B 720p model on RunPod
   - Always uses GPU (no local option)

4. ✅ **Voice Transcription**
   - WhisperX integration via `src/hooks/useWhisperTranscriptionSimple.ts`
   - Automatic fallback to local Whisper model

---

## 🚀 Deployment Steps

### Step 1: Deploy the AI Orchestrator on the Netcup RS 8000

**Prerequisites:**
- SSH access to the Netcup RS 8000: `ssh netcup`
- Docker and Docker Compose installed
- RunPod API key

**1.1 Create the AI Orchestrator Directory:**

```bash
ssh netcup << 'EOF'
mkdir -p /opt/ai-orchestrator/{services/{router,workers,monitor},configs,data/{redis,postgres,prometheus}}
cd /opt/ai-orchestrator
EOF
```

**1.2 Copy Configuration Files:**

From your local machine, copy the AI orchestrator files created in `NETCUP_MIGRATION_PLAN.md`:

```bash
# Copy docker-compose.yml
scp /path/to/docker-compose.yml netcup:/opt/ai-orchestrator/

# Copy service files
scp -r /path/to/services/* netcup:/opt/ai-orchestrator/services/
```

**1.3 Configure Environment Variables:**

```bash
# Note: EOF is intentionally unquoted so the $(openssl ...) command
# substitutions expand locally before the file is written on the server.
# (With << 'EOF' they would be written into .env as literal text.)
ssh netcup "cat > /opt/ai-orchestrator/.env" << EOF
# PostgreSQL
POSTGRES_PASSWORD=$(openssl rand -hex 16)

# RunPod API Keys
RUNPOD_API_KEY=your_runpod_api_key_here
RUNPOD_TEXT_ENDPOINT_ID=your_text_endpoint_id
RUNPOD_IMAGE_ENDPOINT_ID=your_image_endpoint_id
RUNPOD_VIDEO_ENDPOINT_ID=your_video_endpoint_id

# Grafana
GRAFANA_PASSWORD=$(openssl rand -hex 16)

# Monitoring
ALERT_EMAIL=your@email.com
COST_ALERT_THRESHOLD=100
EOF
```
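If you prefer to generate the secrets explicitly (and sidestep heredoc-quoting pitfalls: inside a quoted `<< 'EOF'` heredoc, `$(openssl ...)` is written literally instead of being expanded), a minimal sketch that builds the file locally first; the `scp` destination matches the directory from step 1.1:

```shell
# Generate secrets up front so they can be inspected before upload.
POSTGRES_PASSWORD=$(openssl rand -hex 16)
GRAFANA_PASSWORD=$(openssl rand -hex 16)

cat > .env <<EOF
POSTGRES_PASSWORD=${POSTGRES_PASSWORD}
GRAFANA_PASSWORD=${GRAFANA_PASSWORD}
EOF

# Then copy it up (uncomment once the host is reachable):
# scp .env netcup:/opt/ai-orchestrator/.env
```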

**1.4 Deploy the Stack:**

```bash
ssh netcup << 'EOF'
cd /opt/ai-orchestrator

# Start all services
docker-compose up -d

# Check status
docker-compose ps

# View logs
docker-compose logs -f router
EOF
```

**1.5 Verify the Deployment:**

```bash
# Check the health endpoint
ssh netcup "curl http://localhost:8000/health"

# Check the API documentation
ssh netcup "curl http://localhost:8000/docs"

# Check the queue status
ssh netcup "curl http://localhost:8000/queue/status"
```

### Step 2: Set Up Local AI Models on the RS 8000

**2.1 Download Ollama Models:**

```bash
ssh netcup << 'EOF'
# Download recommended models
docker exec ai-ollama ollama pull llama3:70b
docker exec ai-ollama ollama pull codellama:34b
docker exec ai-ollama ollama pull deepseek-coder:33b
docker exec ai-ollama ollama pull mistral:7b

# Verify
docker exec ai-ollama ollama list

# Test a model
docker exec ai-ollama ollama run llama3:70b "Hello, how are you?"
EOF
```

**2.2 Download Stable Diffusion Models:**

```bash
ssh netcup << 'EOF'
mkdir -p /data/models/stable-diffusion/sd-v2.1
cd /data/models/stable-diffusion/sd-v2.1

# Download SD 2.1 weights
wget https://huggingface.co/stabilityai/stable-diffusion-2-1/resolve/main/v2-1_768-ema-pruned.safetensors

# Verify
ls -lh v2-1_768-ema-pruned.safetensors
EOF
```

**2.3 Download the Wan2.1 Video Generation Model:**

```bash
ssh netcup << 'EOF'
# Install the Hugging Face CLI
pip install huggingface-hub

# Download Wan2.1 I2V 14B 720p
mkdir -p /data/models/video-generation
cd /data/models/video-generation

huggingface-cli download Wan-AI/Wan2.1-I2V-14B-720P \
  --include "*.safetensors" \
  --local-dir wan2.1_i2v_14b

# Check size (~28GB)
du -sh wan2.1_i2v_14b
EOF
```

**Note:** The Wan2.1 model is deployed to RunPod; it does not run locally on CPU.

### Step 3: Set Up RunPod Endpoints

**3.1 Create RunPod Serverless Endpoints:**

Go to [RunPod Serverless](https://www.runpod.io/console/serverless) and create endpoints for:

1. **Text Generation Endpoint** (optional fallback)
   - Model: any LLM (Llama, Mistral, etc.)
   - GPU: optional (local CPU is used primarily)

2. **Image Generation Endpoint**
   - Model: SDXL or SD3
   - GPU: A4000/A5000 (good price/performance)
   - Expected cost: ~$0.02/image

3. **Video Generation Endpoint**
   - Model: Wan2.1-I2V-14B-720P
   - GPU: A100 or H100 (required for video)
   - Expected cost: ~$0.50/video

**3.2 Get the Endpoint IDs:**

For each endpoint, copy the endpoint ID from the URL or the endpoint details.

Example: if the URL is `https://api.runpod.ai/v2/jqd16o7stu29vq/run`, then `jqd16o7stu29vq` is your endpoint ID.

**3.3 Update Environment Variables:**

Update `/opt/ai-orchestrator/.env` with your endpoint IDs:

```bash
ssh netcup "nano /opt/ai-orchestrator/.env"

# Add your endpoint IDs:
RUNPOD_TEXT_ENDPOINT_ID=your_text_endpoint_id
RUNPOD_IMAGE_ENDPOINT_ID=your_image_endpoint_id
RUNPOD_VIDEO_ENDPOINT_ID=your_video_endpoint_id

# Restart services
cd /opt/ai-orchestrator && docker-compose restart
```

### Step 4: Configure canvas-website

**4.1 Create .env.local:**

In your canvas-website directory:

```bash
cd /home/jeffe/Github/canvas-website-branch-worktrees/add-runpod-AI-API

cat > .env.local << 'EOF'
# AI Orchestrator (Primary - Netcup RS 8000)
VITE_AI_ORCHESTRATOR_URL=http://159.195.32.209:8000
# Or use the domain once DNS is configured:
# VITE_AI_ORCHESTRATOR_URL=https://ai-api.jeffemmett.com

# RunPod API (fallback/direct access)
VITE_RUNPOD_API_KEY=your_runpod_api_key_here
VITE_RUNPOD_TEXT_ENDPOINT_ID=your_text_endpoint_id
VITE_RUNPOD_IMAGE_ENDPOINT_ID=your_image_endpoint_id
VITE_RUNPOD_VIDEO_ENDPOINT_ID=your_video_endpoint_id

# Other existing vars...
VITE_GOOGLE_CLIENT_ID=your_google_client_id
VITE_GOOGLE_MAPS_API_KEY=your_google_maps_api_key
VITE_DAILY_DOMAIN=your_daily_domain
VITE_TLDRAW_WORKER_URL=your_worker_url
EOF
```

**4.2 Install Dependencies:**

```bash
npm install
```

**4.3 Build and Start:**

```bash
# Development
npm run dev

# Production build
npm run build
npm run start
```

### Step 5: Register the Video Generation Tool

You need to register the VideoGen shape and tool with tldraw. Find where shapes and tools are registered (likely in `src/routes/Board.tsx` or similar):

**Add to the shape utilities array:**

```typescript
import { VideoGenShapeUtil } from '@/shapes/VideoGenShapeUtil'

const shapeUtils = [
  // ... existing shapes
  VideoGenShapeUtil,
]
```

**Add to the tools array:**

```typescript
import { VideoGenTool } from '@/tools/VideoGenTool'

const tools = [
  // ... existing tools
  VideoGenTool,
]
```

---

## 🧪 Testing

### Test 1: Verify the AI Orchestrator

```bash
# Test the health endpoint
curl http://159.195.32.209:8000/health

# Expected response:
# {"status":"healthy","timestamp":"2025-11-25T12:00:00.000Z"}

# Test text generation
curl -X POST http://159.195.32.209:8000/generate/text \
  -H "Content-Type: application/json" \
  -d '{
    "prompt": "Write a hello world program in Python",
    "priority": "normal"
  }'

# Expected response:
# {"job_id":"abc123","status":"queued","message":"Job queued on local provider"}

# Check job status
curl http://159.195.32.209:8000/job/abc123

# Check queue status
curl http://159.195.32.209:8000/queue/status

# Check costs
curl http://159.195.32.209:8000/costs/summary
```
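For scripted checks, the submit-then-poll flow above can be wrapped in a small helper; a sketch, assuming the `/job/<id>` endpoint returns a JSON body with a `status` field as shown:

```shell
# Poll a job until it completes or fails (up to ~2 minutes).
poll_job() {
  local base_url=$1 job_id=$2 status
  for _ in $(seq 1 60); do
    status=$(curl -s "$base_url/job/$job_id" | sed -E 's/.*"status" *: *"([^"]+)".*/\1/')
    case "$status" in
      completed) echo done; return 0 ;;
      failed)    echo failed; return 1 ;;
    esac
    sleep 2
  done
  echo timeout; return 1
}
# Usage: poll_job http://159.195.32.209:8000 abc123
```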

### Test 2: Text Generation in the Canvas

1. Open canvas-website in a browser
2. Open the browser console (F12)
3. Look for the log message:
   - `✅ AI Orchestrator is available at http://159.195.32.209:8000`
4. Create a Prompt shape or use an arrow LLM action
5. Enter a prompt and submit
6. Verify a response appears
7. Check the console for routing info:
   - Should show `Using local Ollama (FREE)`

### Test 3: Image Generation

**Low Priority (Local CPU - FREE):**

1. Use the ImageGen tool from the toolbar
2. Click on the canvas to create an ImageGen shape
3. Enter a prompt: "A beautiful mountain landscape"
4. Select priority: "Low"
5. Click "Generate"
6. Wait 30-60 seconds
7. Verify the image appears
8. Check the console: should show `Using local Stable Diffusion CPU`

**High Priority (RunPod GPU - $0.02):**

1. Create a new ImageGen shape
2. Enter a prompt: "A futuristic city at sunset"
3. Select priority: "High"
4. Click "Generate"
5. Wait 5-10 seconds
6. Verify the image appears
7. Check the console: should show `Using RunPod SDXL`
8. Check the cost: should show `~$0.02`

### Test 4: Video Generation

1. Use the VideoGen tool from the toolbar
2. Click on the canvas to create a VideoGen shape
3. Enter a prompt: "A cat walking through a garden"
4. Set duration: 3 seconds
5. Click "Generate"
6. Wait 30-90 seconds
7. Verify the video appears and plays
8. Check the console: should show `Using RunPod Wan2.1`
9. Check the cost: should show `~$0.50`
10. Test the download button

### Test 5: Voice Transcription

1. Use the Transcription tool from the toolbar
2. Click to create a Transcription shape
3. Click "Start Recording"
4. Speak into the microphone
5. Click "Stop Recording"
6. Verify the transcription appears
7. Check whether it used RunPod or local Whisper

### Test 6: Monitor Costs and Performance

**Monitoring dashboards:**

```
# API documentation
http://159.195.32.209:8000/docs

# Queue status
http://159.195.32.209:8000/queue/status

# Cost tracking
http://159.195.32.209:3000/api/costs/summary

# Grafana dashboard
http://159.195.32.209:3001
# Default login: admin / admin (change this!)
```

**Check daily costs:**

```bash
curl http://159.195.32.209:3000/api/costs/summary
```

Expected response:

```json
{
  "today": {
    "local": 0.00,
    "runpod": 2.45,
    "total": 2.45
  },
  "this_month": {
    "local": 0.00,
    "runpod": 45.20,
    "total": 45.20
  },
  "breakdown": {
    "text": 0.00,
    "image": 12.50,
    "video": 32.70,
    "code": 0.00
  }
}
```
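To act on that summary programmatically (for example from a cron job), today's total can be extracted with `python3`; a sketch, assuming the response shape shown above:

```shell
# Fetch the cost summary and print today's total spend.
today_total() {
  curl -s "$1/api/costs/summary" \
    | python3 -c 'import json,sys; print(json.load(sys.stdin)["today"]["total"])'
}
# Usage: today_total http://159.195.32.209:3000
```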

---

## 🐛 Troubleshooting

### Issue: AI Orchestrator not available

**Symptoms:**
- Console shows: `⚠️ AI Orchestrator configured but not responding`
- Health check fails

**Solutions:**
```bash
# 1. Check whether the services are running
ssh netcup "cd /opt/ai-orchestrator && docker-compose ps"

# 2. Check logs
ssh netcup "cd /opt/ai-orchestrator && docker-compose logs -f router"

# 3. Restart services
ssh netcup "cd /opt/ai-orchestrator && docker-compose restart"

# 4. Check the firewall
ssh netcup "sudo ufw status"
ssh netcup "sudo ufw allow 8000/tcp"
```

### Issue: Image generation fails with "No output found"

**Symptoms:**
- The job completes but no image URL is returned
- Error: `Job completed but no output data found`

**Solutions:**
1. Check the RunPod endpoint configuration
2. Verify the endpoint handler returns the expected format:
   ```json
   {"output": {"image": "base64_or_url"}}
   ```
3. Check the endpoint logs in the RunPod console
4. Test the endpoint directly with curl
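Step 4 can look like the sketch below; `runsync` is RunPod's synchronous submit route, and the endpoint ID and API key are placeholders you must fill in:

```shell
# Submit a synchronous test job and report whether the handler
# returned an "output" field (endpoint ID and key are placeholders).
check_endpoint_output() {
  local endpoint_id=$1 resp
  resp=$(curl -s -X POST "https://api.runpod.ai/v2/$endpoint_id/runsync" \
    -H "Authorization: Bearer $RUNPOD_API_KEY" \
    -H "Content-Type: application/json" \
    -d '{"input":{"prompt":"test"}}')
  case "$resp" in
    *'"output"'*) echo "output present" ;;
    *)            echo "no output field: $resp" ;;
  esac
}
# Usage: check_endpoint_output your_image_endpoint_id
```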

### Issue: Video generation timeout

**Symptoms:**
- Job stuck in the "processing" state
- Timeout after 120 polling attempts

**Solutions:**
1. Video generation takes 30-90 seconds; allow enough time
2. Check RunPod GPU availability (it may be a cold start)
3. Increase the timeout in VideoGenShapeUtil if needed
4. Check the RunPod endpoint logs for errors

### Issue: High costs

**Symptoms:**
- Monthly costs exceed the budget
- Too many RunPod requests

**Solutions:**
```bash
# 1. Check the cost breakdown
curl http://159.195.32.209:3000/api/costs/summary

# 2. Review routing decisions
curl http://159.195.32.209:8000/queue/status

# 3. Adjust routing thresholds
# Edit the router configuration to prefer local more often
ssh netcup "nano /opt/ai-orchestrator/services/router/main.py"

# 4. Set cost alerts
ssh netcup "nano /opt/ai-orchestrator/.env"
# COST_ALERT_THRESHOLD=50  # Alert if daily cost > $50
```
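The `COST_ALERT_THRESHOLD` check can also be run from cron; a sketch (host, port, and threshold values match the configuration above, and the cost-summary response shape is assumed from the Testing section):

```shell
# Print an alert line when today's spend exceeds the threshold.
cost_alert() {
  local threshold=${COST_ALERT_THRESHOLD:-100} total
  total=$(curl -s "$1/api/costs/summary" \
    | python3 -c 'import json,sys; print(json.load(sys.stdin)["today"]["total"])')
  awk -v t="$total" -v th="$threshold" 'BEGIN { exit !(t > th) }' \
    && echo "ALERT: daily cost \$$total exceeds \$$threshold"
}
# Usage: cost_alert http://159.195.32.209:3000
```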

### Issue: Local models slow or failing

**Symptoms:**
- Text generation is slow (>30s)
- Image generation is very slow (>2min)
- Out-of-memory errors

**Solutions:**
```bash
# 1. Check system resources
ssh netcup "htop"
ssh netcup "free -h"

# 2. Use smaller models
ssh netcup << 'EOF'
docker exec ai-ollama ollama pull llama3:8b   # Instead of 70b
docker exec ai-ollama ollama pull mistral:7b  # Lighter model
EOF

# 3. Limit concurrent workers
ssh netcup "nano /opt/ai-orchestrator/docker-compose.yml"
# Reduce worker replicas if needed

# 4. Increase swap (if RAM is low)
ssh netcup "sudo fallocate -l 8G /swapfile"
ssh netcup "sudo chmod 600 /swapfile"
ssh netcup "sudo mkswap /swapfile"
ssh netcup "sudo swapon /swapfile"
```

---

## 📊 Performance Expectations

### Text Generation:
- **Local (Llama3-70b)**: 2-10 seconds
- **Local (Mistral-7b)**: 1-3 seconds
- **RunPod (fallback)**: 3-8 seconds
- **Cost**: $0.00 (local) or $0.001-0.01 (RunPod)

### Image Generation:
- **Local SD CPU (low priority)**: 30-60 seconds
- **RunPod GPU (high priority)**: 3-10 seconds
- **Cost**: $0.00 (local) or $0.02 (RunPod)

### Video Generation:
- **RunPod Wan2.1**: 30-90 seconds
- **Cost**: ~$0.50 per video

### Expected Monthly Costs:

**Light Usage (100 requests/day):**
- 70 text (local): $0
- 20 images (15 local + 5 RunPod): $0.10
- 10 videos: $5.00
- **Total: ~$5-10/month**

**Medium Usage (500 requests/day):**
- 350 text (local): $0
- 100 images (60 local + 40 RunPod): $0.80
- 50 videos: $25.00
- **Total: ~$25-35/month**

**Heavy Usage (2000 requests/day):**
- 1400 text (local): $0
- 400 images (200 local + 200 RunPod): $4.00
- 200 videos: $100.00
- **Total: ~$100-120/month**

Compare that to a persistent GPU pod at $200-300/month regardless of usage.

---

## 🎯 Next Steps

1. ✅ Deploy the AI Orchestrator on the Netcup RS 8000
2. ✅ Set up local AI models (Ollama, SD)
3. ✅ Configure RunPod endpoints
4. ✅ Test all AI services
5. 📋 Set up monitoring and alerts
6. 📋 Configure DNS for ai-api.jeffemmett.com
7. 📋 Set up SSL with Let's Encrypt
8. 📋 Migrate canvas-website to Netcup
9. 📋 Monitor costs and optimize routing
10. 📋 Decommission the DigitalOcean droplets

---

## 📚 Additional Resources

- **Migration Plan**: see `NETCUP_MIGRATION_PLAN.md`
- **RunPod Setup**: see `RUNPOD_SETUP.md`
- **Test Guide**: see `TEST_RUNPOD_AI.md`
- **API Documentation**: http://159.195.32.209:8000/docs
- **Monitoring**: http://159.195.32.209:3001 (Grafana)

---

## 💡 Tips for Cost Optimization

1. **Prefer low priority for batch jobs**: use `priority: "low"` for non-urgent tasks
2. **Use local models first**: 70-80% of the workload can run locally for $0
3. **Monitor queue depth**: the router auto-scales to RunPod when the local queue backs up
4. **Set cost alerts**: get notified when daily costs exceed the threshold
5. **Review the cost breakdown weekly**: identify optimization opportunities
6. **Batch similar requests**: process multiple items together
7. **Cache results**: store and reuse responses to common queries
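Tip 7 can be sketched as a thin caching wrapper around the text endpoint; the cache directory and endpoint URL below are illustrative:

```shell
# Cache text-generation responses by prompt hash; repeated prompts
# are served from disk instead of re-submitting a job.
cached_generate() {
  local prompt=$1 cache_dir=${2:-/tmp/ai-cache} key f
  mkdir -p "$cache_dir"
  key=$(printf '%s' "$prompt" | sha256sum | cut -d' ' -f1)
  f="$cache_dir/$key.json"
  if [ ! -f "$f" ]; then
    curl -s -X POST http://159.195.32.209:8000/generate/text \
      -H "Content-Type: application/json" \
      -d "{\"prompt\":\"$prompt\",\"priority\":\"low\"}" > "$f"
  fi
  cat "$f"
}
```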

**Ready to deploy?** Start with Step 1 and follow the guide! 🚀

@@ -0,0 +1,372 @@
# AI Services Setup - Complete Summary

## ✅ What We've Built

You now have a **complete, production-ready AI orchestration system** that intelligently routes between your Netcup RS 8000 (local CPU, FREE) and RunPod (serverless GPU, pay-per-use).

---

## 📦 Files Created/Modified

### New Files:
1. **`NETCUP_MIGRATION_PLAN.md`** - Complete migration plan from DigitalOcean to Netcup
2. **`AI_SERVICES_DEPLOYMENT_GUIDE.md`** - Step-by-step deployment and testing guide
3. **`src/lib/aiOrchestrator.ts`** - AI Orchestrator client library
4. **`src/shapes/VideoGenShapeUtil.tsx`** - Video generation shape (Wan2.1)
5. **`src/tools/VideoGenTool.ts`** - Video generation tool

### Modified Files:
1. **`src/shapes/ImageGenShapeUtil.tsx`** - Disabled mock mode (line 13: `USE_MOCK_API = false`)
2. **`.env.example`** - Added AI Orchestrator and RunPod configuration

### Existing Files (Already Working):
- `src/lib/runpodApi.ts` - RunPod API client for transcription
- `src/utils/llmUtils.ts` - Enhanced LLM utilities with RunPod support
- `src/hooks/useWhisperTranscriptionSimple.ts` - WhisperX transcription
- `RUNPOD_SETUP.md` - RunPod setup documentation
- `TEST_RUNPOD_AI.md` - Testing documentation

---

## 🎯 Features & Capabilities

### 1. Text Generation (LLM)
- ✅ Smart routing to local Ollama (FREE)
- ✅ Fallback to RunPod if needed
- ✅ Works with: Prompt shapes, arrow LLM actions, command palette
- ✅ Models: Llama3-70b, CodeLlama-34b, Mistral-7b, etc.
- 💰 **Cost: $0** (99% of requests use the local CPU)

### 2. Image Generation
- ✅ Priority-based routing:
  - Low priority → local SD CPU (slow but FREE)
  - High priority → RunPod GPU (fast, $0.02)
- ✅ Auto-scaling based on queue depth
- ✅ ImageGenShapeUtil and ImageGenTool
- ✅ Mock mode **DISABLED** - ready for production
- 💰 **Cost: $0-0.02** per image

### 3. Video Generation (NEW!)
- ✅ Wan2.1 I2V 14B 720p model on RunPod
- ✅ VideoGenShapeUtil with a video player
- ✅ VideoGenTool for the canvas
- ✅ Download generated videos
- ✅ Configurable duration (1-10 seconds)
- 💰 **Cost: ~$0.50** per video

### 4. Voice Transcription
- ✅ WhisperX on RunPod (primary)
- ✅ Automatic fallback to local Whisper
- ✅ TranscriptionShapeUtil
- 💰 **Cost: $0.01-0.05** per transcription

---

## 🏗️ Architecture

```
User Request
     │
     ▼
AI Orchestrator (RS 8000)
     │
     ├─── Text/Code ───────▶ Local Ollama (FREE)
     │
     ├─── Images (low) ────▶ Local SD CPU (FREE, slow)
     │
     ├─── Images (high) ───▶ RunPod GPU ($0.02, fast)
     │
     └─── Video ───────────▶ RunPod GPU ($0.50)
```

### Smart Routing Benefits:
- **70-80% of the workload runs for FREE** (local CPU)
- **No idle GPU costs** (serverless: pay only while generating)
- **Auto-scaling** (queue-based, handles spikes)
- **Cost tracking** (per job, per user, per day/month)
- **Graceful fallback** (local → RunPod → error)
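The routing rule in the diagram reduces to a tiny decision function; a purely illustrative sketch:

```shell
# Decide which provider a request goes to, per the routing table above.
route() {
  local type=$1 priority=$2
  case "$type" in
    text|code) echo "local-ollama" ;;
    image)
      if [ "$priority" = "high" ]; then echo "runpod-gpu"; else echo "local-sd-cpu"; fi ;;
    video) echo "runpod-gpu" ;;
    *) echo "unknown"; return 1 ;;
  esac
}
```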
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 💰 Cost Analysis
|
||||||
|
|
||||||
|
### Before (DigitalOcean + Persistent GPU):
|
||||||
|
- Main Droplet: $18-36/mo
|
||||||
|
- AI Droplet: $36/mo
|
||||||
|
- RunPod persistent pods: $100-200/mo
|
||||||
|
- **Total: $154-272/mo**
|
||||||
|
|
||||||
|
### After (Netcup RS 8000 + Serverless GPU):
|
||||||
|
- RS 8000 G12 Pro: €55.57/mo (~$60/mo)
|
||||||
|
- RunPod serverless: $30-60/mo (70% reduction)
|
||||||
|
- **Total: $90-120/mo**
|
||||||
|
|
||||||
|
### Savings:
|
||||||
|
- **Monthly: $64-152**
|
||||||
|
- **Annual: $768-1,824**
|
||||||
|
|
||||||
|
### Plus You Get:
|
||||||
|
- 10x CPU cores (20 vs 2)
|
||||||
|
- 32x RAM (64GB vs 2GB)
|
||||||
|
- 25x storage (3TB vs 120GB)
|
||||||
|
- Better EU latency (Germany)
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## 📋 Quick Start Checklist

### Phase 1: Deploy AI Orchestrator (1-2 hours)
- [ ] SSH into Netcup RS 8000: `ssh netcup`
- [ ] Create directory: `/opt/ai-orchestrator`
- [ ] Deploy docker-compose stack (see NETCUP_MIGRATION_PLAN.md Phase 2)
- [ ] Configure environment variables (.env)
- [ ] Start services: `docker-compose up -d`
- [ ] Verify: `curl http://localhost:8000/health`

### Phase 2: Setup Local AI Models (2-4 hours)
- [ ] Download Ollama models (Llama3-70b, CodeLlama-34b)
- [ ] Download Stable Diffusion 2.1 weights
- [ ] Download Wan2.1 model weights (optional, runs on RunPod)
- [ ] Test Ollama: `docker exec ai-ollama ollama run llama3:70b "Hello"`

### Phase 3: Configure RunPod Endpoints (30 min)
- [ ] Create text generation endpoint (optional)
- [ ] Create image generation endpoint (SDXL)
- [ ] Create video generation endpoint (Wan2.1)
- [ ] Copy endpoint IDs
- [ ] Update .env with endpoint IDs
- [ ] Restart services: `docker-compose restart`

### Phase 4: Configure canvas-website (15 min)
- [ ] Create `.env.local` with AI Orchestrator URL
- [ ] Add RunPod API keys (fallback)
- [ ] Install dependencies: `npm install`
- [ ] Register VideoGenShapeUtil and VideoGenTool (see deployment guide)
- [ ] Build: `npm run build`
- [ ] Start: `npm run dev`

### Phase 5: Test Everything (1 hour)
- [ ] Test AI Orchestrator health check
- [ ] Test text generation (local Ollama)
- [ ] Test image generation (low priority - local)
- [ ] Test image generation (high priority - RunPod)
- [ ] Test video generation (RunPod Wan2.1)
- [ ] Test voice transcription (WhisperX)
- [ ] Check cost tracking dashboard
- [ ] Monitor queue status

### Phase 6: Production Deployment (2-4 hours)
- [ ] Setup nginx reverse proxy
- [ ] Configure DNS: ai-api.jeffemmett.com → 159.195.32.209
- [ ] Setup SSL with Let's Encrypt
- [ ] Deploy canvas-website to RS 8000
- [ ] Setup monitoring dashboards (Grafana)
- [ ] Configure cost alerts
- [ ] Test from production domain

---

## 🧪 Testing Commands

### Test AI Orchestrator:
```bash
# Health check
curl http://159.195.32.209:8000/health

# Text generation
curl -X POST http://159.195.32.209:8000/generate/text \
  -H "Content-Type: application/json" \
  -d '{"prompt":"Hello world in Python","priority":"normal"}'

# Image generation (low priority)
curl -X POST http://159.195.32.209:8000/generate/image \
  -H "Content-Type: application/json" \
  -d '{"prompt":"A beautiful sunset","priority":"low"}'

# Video generation
curl -X POST http://159.195.32.209:8000/generate/video \
  -H "Content-Type: application/json" \
  -d '{"prompt":"A cat walking","duration":3}'

# Queue status
curl http://159.195.32.209:8000/queue/status

# Costs
curl http://159.195.32.209:3000/api/costs/summary
```

---

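The curl calls above all share one shape, so a small wrapper can cut the repetition. A sketch under stated assumptions: `ai_body` and `ai_generate` are hypothetical helper names, not part of the orchestrator itself.

```bash
# Hypothetical wrapper around the endpoints listed above.
ai_body() {  # build the JSON payload: ai_body <prompt> [priority]
  printf '{"prompt":"%s","priority":"%s"}' "$1" "${2:-normal}"
}

ai_generate() {  # ai_generate <text|image|video> <prompt> [priority]
  curl -sS -X POST "http://159.195.32.209:8000/generate/$1" \
    -H "Content-Type: application/json" \
    -d "$(ai_body "$2" "$3")"
}
```

Usage: `ai_generate image "A beautiful sunset" low`.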
## 📊 Monitoring Dashboards

Access your monitoring at:

- **API Docs**: http://159.195.32.209:8000/docs
- **Queue Status**: http://159.195.32.209:8000/queue/status
- **Cost Tracking**: http://159.195.32.209:3000/api/costs/summary
- **Grafana**: http://159.195.32.209:3001 (login: admin/admin)
- **Prometheus**: http://159.195.32.209:9090

---

## 🔧 Configuration Files

### Environment Variables (.env.local):
```bash
# AI Orchestrator (Primary)
VITE_AI_ORCHESTRATOR_URL=http://159.195.32.209:8000

# RunPod (Fallback)
VITE_RUNPOD_API_KEY=your_api_key
VITE_RUNPOD_TEXT_ENDPOINT_ID=xxx
VITE_RUNPOD_IMAGE_ENDPOINT_ID=xxx
VITE_RUNPOD_VIDEO_ENDPOINT_ID=xxx
```

### AI Orchestrator (.env on RS 8000):
```bash
# PostgreSQL
POSTGRES_PASSWORD=generated_password

# RunPod
RUNPOD_API_KEY=your_api_key
RUNPOD_TEXT_ENDPOINT_ID=xxx
RUNPOD_IMAGE_ENDPOINT_ID=xxx
RUNPOD_VIDEO_ENDPOINT_ID=xxx

# Monitoring
GRAFANA_PASSWORD=generated_password
COST_ALERT_THRESHOLD=100
```

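`COST_ALERT_THRESHOLD` is a plain dollar value. One way the alert comparison might look — a sketch, not the orchestrator's actual code; `daily_cost` is a stand-in for the value returned by the costs API:

```bash
COST_ALERT_THRESHOLD=100
daily_cost=112.50  # stand-in for the value from /api/costs/summary

# awk handles the float comparison that plain shell arithmetic cannot
if awk -v c="$daily_cost" -v t="$COST_ALERT_THRESHOLD" 'BEGIN { exit !(c > t) }'; then
  alert_msg="ALERT: daily cost \$$daily_cost exceeds \$$COST_ALERT_THRESHOLD"
  echo "$alert_msg"
fi
```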
## 🐛 Common Issues & Solutions

### 1. "AI Orchestrator not available"
```bash
# Check if running
ssh netcup "cd /opt/ai-orchestrator && docker-compose ps"

# Restart
ssh netcup "cd /opt/ai-orchestrator && docker-compose restart"

# Check logs
ssh netcup "cd /opt/ai-orchestrator && docker-compose logs -f router"
```

### 2. "Image generation fails"
- Check RunPod endpoint configuration
- Verify endpoint returns: `{"output": {"image": "url"}}`
- Test endpoint directly in RunPod console

### 3. "Video generation timeout"
- Normal processing time: 30-90 seconds
- Check RunPod GPU availability (cold start can add 30s)
- Verify Wan2.1 endpoint is deployed correctly

### 4. "High costs"
```bash
# Check cost breakdown
curl http://159.195.32.209:3000/api/costs/summary

# To route more work to free local compute, edit
# /opt/ai-orchestrator/services/router/main.py and
# increase the queue_depth threshold from 10 to 20+
```

---

## 📚 Documentation Index

1. **NETCUP_MIGRATION_PLAN.md** - Complete migration guide (8 phases)
2. **AI_SERVICES_DEPLOYMENT_GUIDE.md** - Deployment and testing guide
3. **AI_SERVICES_SUMMARY.md** - This file (quick reference)
4. **RUNPOD_SETUP.md** - RunPod WhisperX setup
5. **TEST_RUNPOD_AI.md** - Testing guide for RunPod integration

---

## 🎯 Next Actions

**Immediate (Today):**
1. Review the migration plan (NETCUP_MIGRATION_PLAN.md)
2. Verify SSH access to Netcup RS 8000
3. Get RunPod API keys and endpoint IDs

**This Week:**
1. Deploy AI Orchestrator on Netcup (Phase 2)
2. Download local AI models (Phase 3)
3. Configure RunPod endpoints
4. Test basic functionality

**Next Week:**
1. Full testing of all AI services
2. Deploy canvas-website to Netcup
3. Setup monitoring and alerts
4. Configure DNS and SSL

**Future:**
1. Migrate remaining services from DigitalOcean
2. Decommission DigitalOcean droplets
3. Optimize costs based on usage patterns
4. Scale workers based on demand

---

## 💡 Pro Tips

1. **Start small**: Deploy text generation first, then images, then video
2. **Monitor costs daily**: Use the cost dashboard to track spending
3. **Use low priority for batch jobs**: Non-urgent images run locally for free
4. **Cache common results**: Store and reuse frequent queries
5. **Set cost alerts**: Get an email when daily costs exceed the threshold
6. **Test locally first**: Use the mock API during development
7. **Review queue depths**: Optimize routing thresholds based on your usage

---

## 🚀 Expected Performance

### Text Generation:
- **Latency**: 2-10s (local), 3-8s (RunPod)
- **Throughput**: 10-20 requests/min (local)
- **Cost**: $0 (local), $0.001-0.01 (RunPod)

### Image Generation:
- **Latency**: 30-60s (local low), 3-10s (RunPod high)
- **Throughput**: 1-2 images/min (local), 6-10 images/min (RunPod)
- **Cost**: $0 (local), $0.02 (RunPod)

### Video Generation:
- **Latency**: 30-90s (RunPod only)
- **Throughput**: 1 video/min
- **Cost**: ~$0.50 per video

---

## 🎉 Summary

You now have:

✅ **Smart AI Orchestration** - Intelligently routes between local CPU and serverless GPU
✅ **Text Generation** - Local Ollama (FREE) with RunPod fallback
✅ **Image Generation** - Priority-based routing (local or RunPod)
✅ **Video Generation** - Wan2.1 on RunPod GPU
✅ **Voice Transcription** - WhisperX with local fallback
✅ **Cost Tracking** - Real-time monitoring and alerts
✅ **Queue Management** - Auto-scaling based on load
✅ **Monitoring Dashboards** - Grafana, Prometheus, cost analytics
✅ **Complete Documentation** - Migration plan, deployment guide, testing docs

**Expected Savings:** $768-1,824/year
**Infrastructure Upgrade:** 10x CPU, 32x RAM, 25x storage
**Cost Efficiency:** 70-80% of workload runs for FREE

---

**Ready to deploy?** 🚀

Start with the deployment guide: `AI_SERVICES_DEPLOYMENT_GUIDE.md`

Questions? Check the troubleshooting section or review the migration plan!

@ -0,0 +1,988 @@

## 🔧 AUTO-APPROVED OPERATIONS

The following operations are auto-approved and do not require user confirmation:
- **Read**: All file read operations (`Read(*)`)
- **Glob**: All file pattern matching (`Glob(*)`)
- **Grep**: All content searching (`Grep(*)`)

These permissions are configured in `~/.claude/settings.json`.

---

## ⚠️ SAFETY GUIDELINES

**ALWAYS WARN THE USER before performing any action that could:**
- Overwrite existing files (use `ls` or `cat` to check first)
- Overwrite credentials, API keys, or secrets
- Delete data or files
- Modify production configurations
- Run destructive git commands (force push, hard reset, etc.)
- Drop databases or truncate tables

**Best practices:**
- Before writing to a file, check if it exists and show its contents
- Use `>>` (append) instead of `>` (overwrite) for credential files
- Create backups before modifying critical configs (e.g., `cp file file.backup`)
- Ask for confirmation before irreversible actions

**Sudo commands:**
- **NEVER run sudo commands directly** - the Bash tool doesn't support interactive input
- Instead, **provide the user with the exact sudo command** they need to run in their terminal
- Format the command clearly in a code block for easy copy-paste
- After the user runs the sudo command, continue with the workflow
- Alternative: if the user has run sudo recently (within ~15 min), subsequent sudo commands may not require a password

---

## 🔑 ACCESS & CREDENTIALS

### Version Control & Code Hosting
- **Gitea**: Self-hosted at `gitea.jeffemmett.com` - PRIMARY repository
  - Push here FIRST, then mirror to GitHub
  - Private repos and source of truth
  - SSH Key: `~/.ssh/gitea_ed25519` (private), `~/.ssh/gitea_ed25519.pub` (public)
  - Public Key: `ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIE2+2UZElEYptgZ9GFs2CXW0PIA57BfQcU9vlyV6fz4 gitea@jeffemmett.com`
  - **Gitea CLI (tea)**: ✅ Installed at `~/bin/tea` (added to PATH)

- **GitHub**: Public mirror and collaboration
  - Receives pushes from Gitea via mirror sync
  - Token: `(REDACTED-GITHUB-TOKEN)`
  - SSH Key: `~/.ssh/github_deploy_key` (private), `~/.ssh/github_deploy_key.pub` (public)
  - **GitHub CLI (gh)**: ✅ Installed and available for PR/issue management

### Git Workflow
**Two-way sync between Gitea and GitHub:**

**Gitea-Primary Repos (Default):**
1. Develop locally in `/home/jeffe/Github/`
2. Commit and push to Gitea first
3. Gitea automatically mirrors TO GitHub (built-in push mirror)
4. GitHub used for public collaboration and visibility

**GitHub-Primary Repos (Mirror Repos):**
For repos where GitHub is the source of truth (v0.dev exports, client collabs):
1. Push to GitHub
2. Deploy webhook pulls from GitHub and deploys
3. Webhook triggers Gitea to sync FROM GitHub

### 🔀 DEV BRANCH WORKFLOW (MANDATORY)

**CRITICAL: All development work on canvas-website (and other active projects) MUST use a dev branch.**

#### Branch Strategy
```
main (production)
  └── dev (integration/staging)
        └── feature/* (optional feature branches)
```

#### Development Rules

1. **ALWAYS work on the `dev` branch** for new features and changes:
   ```bash
   cd /home/jeffe/Github/canvas-website
   git checkout dev
   git pull origin dev
   ```

2. **After completing a feature**, push to dev:
   ```bash
   git add .
   git commit -m "feat: description of changes"
   git push origin dev
   ```

3. **Update backlog task** immediately after pushing:
   ```bash
   backlog task edit <task-id> --status "Done" --append-notes "Pushed to dev branch"
   ```

4. **NEVER push directly to main** - main is for tested, verified features only

5. **Merge dev → main manually** when features are verified working:
   ```bash
   git checkout main
   git pull origin main
   git merge dev
   git push origin main
   git checkout dev  # Return to dev for continued work
   ```

#### Complete Feature Deployment Checklist

- [ ] Work on `dev` branch (not main)
- [ ] Test locally before committing
- [ ] Commit with descriptive message
- [ ] Push to `dev` branch on Gitea
- [ ] Update backlog task status to "Done"
- [ ] Add notes to backlog task about what was implemented
- [ ] (Later) When verified working: merge dev → main manually

#### Why This Matters
- **Protects production**: main branch always has known-working code
- **Enables testing**: dev branch can be deployed to staging for verification
- **Clean history**: main only gets complete, tested features
- **Easy rollback**: if dev breaks, main is still stable

### Server Infrastructure

- **Netcup RS 8000 G12 Pro**: Primary application & AI server
  - IP: `159.195.32.209`
  - 20 cores, 64GB RAM, 3TB storage
  - Hosts local AI models (Ollama, Stable Diffusion)
  - All websites and apps deployed here in Docker containers
  - Location: Germany (low latency EU)
  - SSH Key (local): `~/.ssh/netcup_ed25519` (private), `~/.ssh/netcup_ed25519.pub` (public)
  - Public Key: `ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKmp4A2klKv/YIB1C6JAsb2UzvlzzE+0EcJ0jtkyFuhO netcup-rs8000@jeffemmett.com`
  - SSH Access: `ssh netcup`
  - **SSH Keys ON the server** (for git operations):
    - Gitea: `~/.ssh/gitea_ed25519` → `ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIE2+2UZElEYptgZ9GFs2CXW0PIA57BfQcU9vlyV6fz4 gitea@jeffemmett.com`
    - GitHub: `~/.ssh/github_ed25519` → `ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIC6xXNICy0HXnqHO+U7+y7ui+pZBGe0bm0iRMS23pR1E github-deploy@netcup-rs8000`

- **RunPod**: GPU burst capacity for AI workloads
  - Host: `ssh.runpod.io`
  - Serverless GPU pods (pay-per-use)
  - Used for: SDXL/SD3, video generation, training
  - Smart routing from RS 8000 orchestrator
  - SSH Key: `~/.ssh/runpod_ed25519` (private), `~/.ssh/runpod_ed25519.pub` (public)
  - Public Key: `ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAC7NYjI0U/2ChGaZBBWP7gKt/V12Ts6FgatinJOQ8JG runpod@jeffemmett.com`
  - SSH Access: `ssh runpod`
  - **API Key**: `(REDACTED-RUNPOD-KEY)`
  - **CLI Config**: `~/.runpod/config.toml`
  - **Serverless Endpoints**:
    - Image (SD): `tzf1j3sc3zufsy` (Automatic1111)
    - Video (Wan2.2): `4jql4l7l0yw0f3`
    - Text (vLLM): `03g5hz3hlo8gr2`
    - Whisper: `lrtisuv8ixbtub`
    - ComfyUI: `5zurj845tbf8he`

### API Keys & Services

**IMPORTANT**: All API keys and tokens are stored securely on the Netcup server. Never store credentials locally.
- Access credentials via: `ssh netcup "cat ~/.cloudflare-credentials.env"` or `ssh netcup "cat ~/.porkbun_credentials"`
- All API operations should be performed FROM the Netcup server, not locally

#### Credential Files on Netcup (`/root/`)

| File | Contents |
|------|----------|
| `~/.cloudflare-credentials.env` | Cloudflare API tokens, account ID, tunnel token |
| `~/.cloudflare_credentials` | Legacy/DNS token |
| `~/.porkbun_credentials` | Porkbun API key and secret |
| `~/.v0_credentials` | V0.dev API key |

#### Cloudflare
- **Account ID**: `0e7b3338d5278ed1b148e6456b940913`
- **Tokens stored on Netcup** - source `~/.cloudflare-credentials.env`:
  - `CLOUDFLARE_API_TOKEN` - Zone read, Worker:read/edit, R2:read/edit
  - `CLOUDFLARE_TUNNEL_TOKEN` - Tunnel management
  - `CLOUDFLARE_ZONE_TOKEN` - Zone:Edit, DNS:Edit (for adding domains)

#### Porkbun (Domain Registrar)
- **Credentials stored on Netcup** - source `~/.porkbun_credentials`:
  - `PORKBUN_API_KEY` and `PORKBUN_SECRET_KEY`
- **API Endpoint**: `https://api-ipv4.porkbun.com/api/json/v3/`
- **API Docs**: https://porkbun.com/api/json/v3/documentation
- **Important**: JSON must have `secretapikey` before `apikey` in requests
- **Capabilities**: Update nameservers, get auth codes for transfers, manage DNS
- **Note**: Each domain must have "API Access" enabled individually in the Porkbun dashboard

#### Domain Onboarding Workflow (Porkbun → Cloudflare)
Run these commands FROM Netcup (`ssh netcup`):
1. Add the domain to Cloudflare (creates the zone, returns nameservers)
2. Update nameservers at Porkbun to point to Cloudflare
3. Add a CNAME record pointing to the Cloudflare tunnel
4. Add the hostname to the tunnel config and restart cloudflared
5. The domain is live through the tunnel!

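Because Porkbun rejects requests where `apikey` precedes `secretapikey`, it is worth building the payload explicitly. A sketch of step 2 — the nameserver values shown are placeholders; use the ones Cloudflare returned in step 1, and `mydomain.com` is a stand-in:

```bash
# Assemble the payload; secretapikey MUST come before apikey (Porkbun quirk noted above).
payload=$(printf '{"secretapikey":"%s","apikey":"%s","ns":["%s","%s"]}' \
  "$PORKBUN_SECRET_KEY" "$PORKBUN_API_KEY" \
  "placeholder1.ns.cloudflare.com" "placeholder2.ns.cloudflare.com")

# Uncomment to actually update nameservers for mydomain.com:
# curl -sS "https://api-ipv4.porkbun.com/api/json/v3/domain/updateNs/mydomain.com" \
#   -H "Content-Type: application/json" -d "$payload"
```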
#### V0.dev (AI UI Generation)
- **Credentials stored on Netcup** - source `~/.v0_credentials`:
  - `V0_API_KEY` - Platform API access
- **API Key**: `v1:5AwJbit4j9rhGcAKPU4XlVWs:05vyCcJLiWRVQW7Xu4u5E03G`
- **SDK**: `npm install v0-sdk` (use `v0` CLI for adding components)
- **Docs**: https://v0.app/docs/v0-platform-api
- **Capabilities**:
  - List/create/update/delete projects
  - Manage chats and versions
  - Download generated code
  - Create deployments
  - Manage environment variables
- **Limitations**: GitHub-only for git integration (no Gitea/GitLab support)
- **Usage**:
  ```javascript
  const { v0 } = require('v0-sdk');
  // Uses V0_API_KEY env var automatically
  const projects = await v0.projects.find();
  const chats = await v0.chats.find();
  ```

#### Other Services
- **HuggingFace**: CLI access available for model downloads
- **RunPod**: API access for serverless GPU orchestration (see Server Infrastructure above)

### Dev Ops Stack & Principles
- **Platform**: Linux WSL2 (Ubuntu on Windows) for development
- **Working Directory**: `/home/jeffe/Github`
- **Container Strategy**:
  - ALL repos should be Dockerized
  - Optimized containers for production deployment
  - Docker Compose for multi-service orchestration
- **Process Management**: PM2 available for Node.js services
- **Version Control**: Git configured with GitHub + Gitea mirrors
- **Package Managers**: npm/pnpm/yarn available

### 🚀 Traefik Reverse Proxy (Central Routing)
All HTTP services on Netcup RS 8000 route through Traefik for automatic service discovery.

**Architecture:**
```
Internet → Cloudflare Tunnel → Traefik (:80/:443) → Docker Services
                                    │
                                    ├── gitea.jeffemmett.com → gitea:3000
                                    ├── mycofi.earth → mycofi:3000
                                    ├── games.jeffemmett.com → games:80
                                    └── [auto-discovered via Docker labels]
```

**Location:** `/root/traefik/` on Netcup RS 8000

**Adding a New Service:**
```yaml
# In your docker-compose.yml, add these labels:
services:
  myapp:
    image: myapp:latest
    labels:
      - "traefik.enable=true"
      - "traefik.http.routers.myapp.rule=Host(`myapp.jeffemmett.com`)"
      - "traefik.http.services.myapp.loadbalancer.server.port=3000"
    networks:
      - traefik-public

networks:
  traefik-public:
    external: true
```

**Traefik Dashboard:** `http://159.195.32.209:8888` (internal only)

**SSH Git Access:**
- SSH goes direct (not through Traefik): `git.jeffemmett.com:223` → `159.195.32.209:223`
- Web UI goes through Traefik: `gitea.jeffemmett.com` → Traefik → gitea:3000

### ☁️ Cloudflare Tunnel Configuration
**Location:** `/root/cloudflared/` on Netcup RS 8000

The tunnel uses a token-based configuration managed via the Cloudflare Zero Trust Dashboard.
All public hostnames should point to `http://localhost:80` (Traefik), which routes based on the Host header.

**Managed hostnames:**
- `gitea.jeffemmett.com` → Traefik → Gitea
- `photos.jeffemmett.com` → Traefik → Immich
- `movies.jeffemmett.com` → Traefik → Jellyfin
- `search.jeffemmett.com` → Traefik → Semantic Search
- `mycofi.earth` → Traefik → MycoFi
- `games.jeffemmett.com` → Traefik → Games Platform
- `decolonizeti.me` → Traefik → Decolonize Time

**Tunnel ID:** `a838e9dc-0af5-4212-8af2-6864eb15e1b5`
**Tunnel CNAME Target:** `a838e9dc-0af5-4212-8af2-6864eb15e1b5.cfargotunnel.com`

**To deploy a new website/service:**

1. **Dockerize the project** with Traefik labels in `docker-compose.yml`:
   ```yaml
   services:
     myapp:
       build: .
       labels:
         - "traefik.enable=true"
         - "traefik.http.routers.myapp.rule=Host(`mydomain.com`) || Host(`www.mydomain.com`)"
         - "traefik.http.services.myapp.loadbalancer.server.port=3000"
       networks:
         - traefik-public

   networks:
     traefik-public:
       external: true
   ```

2. **Deploy to Netcup:**
   ```bash
   ssh netcup "cd /opt/websites && git clone <repo-url>"
   ssh netcup "cd /opt/websites/<project> && docker compose up -d --build"
   ```

3. **Add hostname to tunnel config** (`/root/cloudflared/config.yml`):
   ```yaml
   - hostname: mydomain.com
     service: http://localhost:80
   - hostname: www.mydomain.com
     service: http://localhost:80
   ```
   Then restart: `ssh netcup "docker restart cloudflared"`

4. **Configure DNS in Cloudflare dashboard** (CRITICAL - prevents 525 SSL errors):
   - Go to Cloudflare Dashboard → select domain → DNS → Records
   - Delete any existing A/AAAA records for `@` and `www`
   - Add CNAME records:

   | Type | Name | Target | Proxy |
   |------|------|--------|-------|
   | CNAME | `@` | `a838e9dc-0af5-4212-8af2-6864eb15e1b5.cfargotunnel.com` | Proxied ✓ |
   | CNAME | `www` | `a838e9dc-0af5-4212-8af2-6864eb15e1b5.cfargotunnel.com` | Proxied ✓ |

**API Credentials** (on Netcup at `~/.cloudflare*`):
- `CLOUDFLARE_API_TOKEN` - Zone read access only
- `CLOUDFLARE_TUNNEL_TOKEN` - Tunnel management only
- See **API Keys & Services** section above for the Domain Management Token (required for DNS automation)

### 🔄 Auto-Deploy Webhook System
**Location:** `/opt/deploy-webhook/` on Netcup RS 8000
**Endpoint:** `https://deploy.jeffemmett.com/deploy/<repo-name>`
**Secret:** `gitea-deploy-secret-2025`

Pushes to Gitea automatically trigger rebuilds. The webhook receiver:
1. Validates the HMAC signature from Gitea
2. Runs `git pull && docker compose up -d --build`
3. Returns the build status

**Adding a new repo to auto-deploy:**
1. Add an entry to the REPOS dict in `/opt/deploy-webhook/webhook.py`
2. Restart: `ssh netcup "cd /opt/deploy-webhook && docker compose up -d --build"`
3. Add the Gitea webhook:
   ```bash
   curl -X POST "https://gitea.jeffemmett.com/api/v1/repos/jeffemmett/<repo>/hooks" \
     -H "Authorization: token <gitea-token>" \
     -H "Content-Type: application/json" \
     -d '{"type":"gitea","active":true,"events":["push"],"config":{"url":"https://deploy.jeffemmett.com/deploy/<repo>","content_type":"json","secret":"gitea-deploy-secret-2025"}}'
   ```

**Currently auto-deploying:**
- `decolonize-time-website` → /opt/websites/decolonize-time-website
- `mycofi-earth-website` → /opt/websites/mycofi-earth-website
- `games-platform` → /opt/apps/games-platform

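The HMAC check in step 1 can be reproduced from the command line, e.g. to debug a hook that keeps failing validation. A sketch using openssl — the payload here is illustrative; Gitea sends the hex HMAC-SHA256 of the raw body in its `X-Gitea-Signature` header:

```bash
secret='gitea-deploy-secret-2025'
payload='{"ref":"refs/heads/main"}'  # illustrative push body

# Hex HMAC-SHA256 of the raw body, computed the way the receiver would
sig=$(printf '%s' "$payload" | openssl dgst -sha256 -hmac "$secret" | awk '{print $NF}')
echo "$sig"
```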
### 🔐 SSH Keys Quick Reference

**Local keys** (in `~/.ssh/` on your laptop):

| Service | Private Key | Public Key | Purpose |
|---------|-------------|------------|---------|
| **Gitea** | `gitea_ed25519` | `gitea_ed25519.pub` | Primary git repository |
| **GitHub** | `github_deploy_key` | `github_deploy_key.pub` | Public mirror sync |
| **Netcup RS 8000** | `netcup_ed25519` | `netcup_ed25519.pub` | Primary server SSH |
| **RunPod** | `runpod_ed25519` | `runpod_ed25519.pub` | GPU pods SSH |
| **Default** | `id_ed25519` | `id_ed25519.pub` | General purpose/legacy |

**Server-side keys** (in `/root/.ssh/` on Netcup RS 8000):

| Service | Key File | Purpose |
|---------|----------|---------|
| **Gitea** | `gitea_ed25519` | Server pulls from Gitea repos |
| **GitHub** | `github_ed25519` | Server pulls from GitHub (mirror repos) |

**SSH Config**: `~/.ssh/config` contains all host configurations
**Quick Access**:
- `ssh netcup` - Connect to Netcup RS 8000
- `ssh runpod` - Connect to RunPod
- `ssh gitea.jeffemmett.com` - Git operations

---

## 🤖 AI ORCHESTRATION ARCHITECTURE

### Smart Routing Strategy
All AI requests go through an intelligent orchestration layer on RS 8000:

**Routing Logic:**
- **Text/Code (70-80% of workload)**: Always local RS 8000 CPU (Ollama) → FREE
- **Images - Low Priority**: RS 8000 CPU (SD 1.5/2.1) → FREE but slow (~60s)
- **Images - High Priority**: RunPod GPU (SDXL/SD3) → $0.02/image, fast
- **Video Generation**: Always RunPod GPU → $0.50/video (only option)
- **Training/Fine-tuning**: RunPod GPU on-demand

**Queue System:**
- Redis-based queues: text, image, code, video
- Priority-based routing (low/normal/high)
- Worker pools scale based on load
- Cost tracking per job, per user

**Cost Optimization:**
- Target: $90-120/mo (vs $136-236/mo current)
- Savings: $552-1,392/year
- 70-80% of workload FREE (local CPU)
- GPU only when needed (serverless = no idle costs)

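The routing rules above amount to a small decision table. Sketched in shell for clarity — `route` is an illustrative function, not the actual FastAPI router:

```bash
route() {  # route <text|code|image|video|training> <low|normal|high>
  case "$1:$2" in
    text:*|code:*)      echo "local-ollama" ;;  # always FREE local CPU
    image:low)          echo "local-sd"     ;;  # FREE but slow (~60s)
    image:*)            echo "runpod-gpu"   ;;  # normal/high priority images
    video:*|training:*) echo "runpod-gpu"   ;;  # GPU-only workloads
    *)                  echo "unknown"      ;;
  esac
}
```

Usage: `route image low` prints `local-sd`; `route image high` prints `runpod-gpu`.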
### Deployment Architecture
```
RS 8000 G12 Pro (Netcup)
├── Cloudflare Tunnel (secure ingress)
├── Traefik Reverse Proxy (auto-discovery)
│   └── Routes to all services via Docker labels
├── Core Services
│   ├── Gitea (git hosting) - gitea.jeffemmett.com
│   └── Other internal tools
├── AI Services
│   ├── Ollama (text/code models)
│   ├── Stable Diffusion (CPU fallback)
│   └── Smart Router API (FastAPI)
├── Queue Infrastructure
│   ├── Redis (job queues)
│   └── PostgreSQL (job history/analytics)
├── Monitoring
│   ├── Prometheus (metrics)
│   ├── Grafana (dashboards)
│   └── Cost tracking API
└── Application Hosting
    ├── All websites (Dockerized + Traefik labels)
    ├── All apps (Dockerized + Traefik labels)
    └── Backend services (Dockerized)

RunPod Serverless (GPU Burst)
├── SDXL/SD3 endpoints
├── Video generation (Wan2.1)
└── Training/fine-tuning jobs
```

### Integration Pattern for Projects
All projects use a unified AI client SDK:
```python
from orchestrator_client import AIOrchestrator

ai = AIOrchestrator("http://rs8000-ip:8000")

# Automatically routes based on priority & model
result = await ai.generate_text(prompt, priority="low")    # → FREE CPU
result = await ai.generate_image(prompt, priority="high")  # → RunPod GPU
```

---

## 💰 GPU COST ANALYSIS & MIGRATION PLAN

### Current Infrastructure Costs (Monthly)

| Service | Type | Cost | Notes |
|---------|------|------|-------|
| Netcup RS 8000 G12 Pro | Fixed | ~€45 | 20 cores, 64GB RAM, 3TB (CPU-only) |
| RunPod Serverless | Variable | $50-100 | Pay-per-use GPU (images, video) |
| DigitalOcean Droplets | Fixed | ~$48 | ⚠️ DEPRECATED - migrate ASAP |
| **Current Total** | | **~$140-190/mo** | |

### GPU Provider Comparison

#### Netcup vGPU (NEW - Early Access, Ends July 7, 2025)

| Plan | GPU | VRAM | vCores | RAM | Storage | Price/mo | Price/hr equiv |
|------|-----|------|--------|-----|---------|----------|----------------|
| RS 2000 vGPU 7 | H200 | 7 GB dedicated | 8 | 16 GB DDR5 | 512 GB NVMe | €137.31 (~$150) | $0.21/hr |
| RS 4000 vGPU 14 | H200 | 14 GB dedicated | 12 | 32 GB DDR5 | 1 TB NVMe | €261.39 (~$285) | $0.40/hr |

**Pros:**
- NVIDIA H200 (latest gen, better than H100 for inference)
- Dedicated VRAM (no noisy neighbors)
- Germany location (EU data sovereignty, low latency to RS 8000)
- Fixed monthly cost = predictable budgeting
- 24/7 availability, no cold starts

**Cons:**
- Pay even when idle
- Limited to 7GB or 14GB VRAM options
- Early access = limited availability

#### RunPod Serverless (Current)

| GPU | VRAM | Price/hr | Typical Use |
|-----|------|----------|-------------|
| RTX 4090 | 24 GB | ~$0.44/hr | SDXL, medium models |
| A100 40GB | 40 GB | ~$1.14/hr | Large models, training |
| H100 80GB | 80 GB | ~$2.49/hr | Largest models |

**Current Endpoint Costs:**
- Image (SD/SDXL): ~$0.02/image (~2s compute)
- Video (Wan2.2): ~$0.50/video (~60s compute)
- Text (vLLM): ~$0.001/request
- Whisper: ~$0.01/minute audio

**Pros:**
- Zero idle costs
- Unlimited burst capacity
- Wide GPU selection (up to 80GB VRAM)
- Pay only for actual compute

**Cons:**
- Cold start delays (10-30s first request)
- Variable availability during peak times
- Per-request costs add up at scale

### Break-even Analysis
|
||||||
|
|
||||||
|
**When does Netcup vGPU become cheaper than RunPod?**
|
||||||
|
|
||||||
|
| Scenario | RunPod Cost | Netcup RS 2000 vGPU 7 | Netcup RS 4000 vGPU 14 |
|
||||||
|
|----------|-------------|----------------------|------------------------|
|
||||||
|
| 1,000 images/mo | $20 | $150 ❌ | $285 ❌ |
|
||||||
|
| 5,000 images/mo | $100 | $150 ❌ | $285 ❌ |
|
||||||
|
| **7,500 images/mo** | **$150** | **$150 ✅** | $285 ❌ |
|
||||||
|
| 10,000 images/mo | $200 | $150 ✅ | $285 ❌ |
|
||||||
|
| **14,250 images/mo** | **$285** | $150 ✅ | **$285 ✅** |
|
||||||
|
| 100 videos/mo | $50 | $150 ❌ | $285 ❌ |
|
||||||
|
| **300 videos/mo** | **$150** | **$150 ✅** | $285 ❌ |
|
||||||
|
| 500 videos/mo | $250 | $150 ✅ | $285 ❌ |
|
||||||
|
|
||||||
|
**Recommendation by Usage Pattern:**
|
||||||
|
|
||||||
|
| Monthly Usage | Best Option | Est. Cost |
|
||||||
|
|---------------|-------------|-----------|
|
||||||
|
| < 5,000 images OR < 250 videos | RunPod Serverless | $50-100 |
|
||||||
|
| 5,000-10,000 images OR 250-500 videos | **Netcup RS 2000 vGPU 7** | $150 fixed |
|
||||||
|
| > 10,000 images OR > 500 videos + training | **Netcup RS 4000 vGPU 14** | $285 fixed |
|
||||||
|
| Unpredictable/bursty workloads | RunPod Serverless | Variable |
|
||||||
|
|
||||||
|
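The break-even volumes above follow directly from the per-unit RunPod estimates in this section (~$0.02/image, ~$0.50/video). A quick sketch of the arithmetic:

```python
def breakeven_units(fixed_monthly_usd: float, cost_per_unit_usd: float) -> int:
    """Smallest monthly volume at which a fixed-cost GPU beats pay-per-use."""
    # Work in tenths of a cent so float drift can't shift the ceiling.
    fixed_m = round(fixed_monthly_usd * 1000)
    unit_m = round(cost_per_unit_usd * 1000)
    return -(-fixed_m // unit_m)  # ceiling division

# Netcup RS 2000 vGPU 7 (~$150/mo) vs RunPod per-unit pricing
print(breakeven_units(150, 0.02))  # 7500 images/mo
print(breakeven_units(150, 0.50))  # 300 videos/mo
print(breakeven_units(285, 0.02))  # 14250 images/mo (RS 4000 vGPU 14)
```

These reproduce the bolded rows in the table above.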
### Migration Strategy

#### Phase 1: Immediate (Before July 7, 2025)
**Decision Point: Secure Netcup vGPU Early Access?**

- [ ] Monitor actual GPU usage for 2-4 weeks
- [ ] Calculate average monthly image/video generation
- [ ] If consistently > 5,000 images/mo → Consider RS 2000 vGPU 7
- [ ] If consistently > 10,000 images/mo → Consider RS 4000 vGPU 14
- [ ] **ACTION**: Redeem early access code if usage justifies fixed GPU

#### Phase 2: Hybrid Architecture (If vGPU Acquired)

```
RS 8000 G12 Pro (CPU - Current)
├── Ollama (text/code) → FREE
├── SD 1.5/2.1 CPU fallback → FREE
└── Orchestrator API

Netcup vGPU Server (NEW - If purchased)
├── Primary GPU workloads
├── SDXL/SD3 generation
├── Video generation (Wan2.1 I2V)
├── Model inference (14B params with 14GB VRAM)
└── Connected via internal netcup network (low latency)

RunPod Serverless (Burst Only)
├── Overflow capacity
├── Models requiring > 14GB VRAM
├── Training/fine-tuning jobs
└── Geographic distribution needs
```

#### Phase 3: Cost Optimization Targets

| Scenario | Current | With vGPU Migration | Savings |
|----------|---------|---------------------|---------|
| Low usage | $140/mo | $95/mo (RS8000 + minimal RunPod) | $540/yr |
| Medium usage | $190/mo | $195/mo (RS8000 + vGPU 7) | Break-even |
| High usage | $250/mo | $195/mo (RS8000 + vGPU 7) | $660/yr |
| Very high usage | $350/mo | $330/mo (RS8000 + vGPU 14) | $240/yr |
### Model VRAM Requirements Reference

| Model | VRAM Needed | Fits vGPU 7? | Fits vGPU 14? |
|-------|-------------|--------------|---------------|
| SD 1.5 | ~4 GB | ✅ | ✅ |
| SD 2.1 | ~5 GB | ✅ | ✅ |
| SDXL | ~7 GB | ⚠️ Tight | ✅ |
| SD3 Medium | ~8 GB | ❌ | ✅ |
| Wan2.1 I2V 14B | ~12 GB | ❌ | ✅ |
| Wan2.1 T2V 14B | ~14 GB | ❌ | ⚠️ Tight |
| Flux.1 Dev | ~12 GB | ❌ | ✅ |
| LLaMA 3 8B (Q4) | ~6 GB | ✅ | ✅ |
| LLaMA 3 70B (Q4) | ~40 GB | ❌ | ❌ (RunPod) |
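The table rows reduce to a simple comparison; a minimal sketch, assuming "Tight" means less than ~1 GB of headroom on the plan (an assumption inferred from the rows above, not a documented rule):

```python
def fits(model_vram_gb: float, plan_vram_gb: float, headroom_gb: float = 1.0) -> str:
    """Classify a model/plan pairing the way the table above does (sketch)."""
    if model_vram_gb > plan_vram_gb:
        return "❌"  # does not fit at all
    if plan_vram_gb - model_vram_gb < headroom_gb:
        return "⚠️ Tight"  # fits, but with no working headroom
    return "✅"

print(fits(7, 7))    # SDXL on vGPU 7       -> ⚠️ Tight
print(fits(12, 14))  # Flux.1 Dev on vGPU 14 -> ✅
print(fits(40, 14))  # LLaMA 3 70B (Q4)      -> ❌ (RunPod territory)
```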
### Decision Framework

```
┌─────────────────────────────────────────────────────────┐
│              GPU WORKLOAD DECISION TREE                 │
├─────────────────────────────────────────────────────────┤
│                                                         │
│  Is usage predictable and consistent?                   │
│  ├── YES → Is monthly GPU spend > $150?                 │
│  │         ├── YES → Netcup vGPU (fixed cost wins)      │
│  │         └── NO → RunPod Serverless (no idle cost)    │
│  └── NO → RunPod Serverless (pay for what you use)      │
│                                                         │
│  Does model require > 14GB VRAM?                        │
│  ├── YES → RunPod (A100/H100 on-demand)                 │
│  └── NO → Netcup vGPU or RS 8000 CPU                    │
│                                                         │
│  Is low latency critical?                               │
│  ├── YES → Netcup vGPU (same datacenter as RS 8000)     │
│  └── NO → RunPod Serverless (acceptable for batch)      │
│                                                         │
└─────────────────────────────────────────────────────────┘
```
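The tree above can be collapsed into a small routing helper. A non-authoritative sketch (the VRAM check runs first because it is a hard constraint; the $150 and 14 GB thresholds come from this section):

```python
def choose_backend(vram_gb_needed: float,
                   predictable_usage: bool,
                   monthly_gpu_spend_usd: float,
                   latency_critical: bool) -> str:
    """Mirror the GPU workload decision tree above (sketch, not production logic)."""
    # Hard constraint: models over 14 GB VRAM only fit RunPod's big cards.
    if vram_gb_needed > 14:
        return "RunPod (A100/H100 on-demand)"
    # Latency-critical work favors the vGPU in the same datacenter as the RS 8000.
    if latency_critical:
        return "Netcup vGPU"
    # Fixed cost only wins with predictable spend above the vGPU price point.
    if predictable_usage and monthly_gpu_spend_usd > 150:
        return "Netcup vGPU"
    return "RunPod Serverless"

print(choose_backend(40, True, 400, False))  # RunPod (A100/H100 on-demand)
print(choose_backend(7, True, 200, False))   # Netcup vGPU
print(choose_backend(7, False, 50, False))   # RunPod Serverless
```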
### Monitoring & Review Schedule

- **Weekly**: Review RunPod spend dashboard
- **Monthly**: Calculate total GPU costs, compare to vGPU break-even
- **Quarterly**: Re-evaluate architecture, consider plan changes
- **Annually**: Full infrastructure cost audit

### Action Items

- [ ] **URGENT**: Decide on Netcup vGPU early access before July 7, 2025
- [ ] Set up GPU usage tracking in orchestrator
- [ ] Create Grafana dashboard for cost monitoring
- [ ] Test Wan2.1 I2V 14B model on vGPU 14 (if acquired)
- [ ] Document migration runbook for vGPU setup
- [ ] Complete DigitalOcean deprecation (separate from GPU decision)

---
## 📁 PROJECT PORTFOLIO STRUCTURE

### Repository Organization
- **Location**: `/home/jeffe/Github/`
- **Primary Flow**: Gitea (source of truth) → GitHub (public mirror)
- **Containerization**: ALL repos must be Dockerized with optimized production containers

### 🎯 MAIN PROJECT: canvas-website
**Location**: `/home/jeffe/Github/canvas-website`
**Description**: Collaborative canvas deployment - the integration hub where all tools come together
- Tldraw-based collaborative canvas platform
- Integrates Hyperindex, rSpace, MycoFi, and other tools
- Real-time collaboration features
- Deployed on RS 8000 in Docker
- Uses AI orchestrator for intelligent features

### Project Categories

**AI & Infrastructure:**
- AI Orchestrator (smart routing between RS 8000 & RunPod)
- Model hosting & fine-tuning pipelines
- Cost optimization & monitoring dashboards

**Web Applications & Sites:**
- **canvas-website**: Main collaborative canvas (integration hub)
- All deployed in Docker containers on RS 8000
- Cloudflare Workers for edge functions (Hyperindex)
- Static sites + dynamic backends containerized

**Supporting Projects:**
- **Hyperindex**: Tldraw canvas integration (Cloudflare stack) - integrates into canvas-website
- **rSpace**: Real-time collaboration platform - integrates into canvas-website
- **MycoFi**: DeFi/Web3 project - integrates into canvas-website
- **Canvas-related tools**: Knowledge graph & visualization components

### Deployment Strategy
1. **Development**: Local WSL2 environment (`/home/jeffe/Github/`)
2. **Version Control**: Push to Gitea FIRST → Auto-mirror to GitHub
3. **Containerization**: Build optimized Docker images with Traefik labels
4. **Deployment**: Deploy to RS 8000 via Docker Compose (join `traefik-public` network)
5. **Routing**: Traefik auto-discovers the service via labels; no config changes needed
6. **DNS**: Add the hostname to the Cloudflare tunnel (new domains) or it just works (existing domains)
7. **AI Integration**: Connect to the local orchestrator API
8. **Monitoring**: Grafana dashboards for all services

### Infrastructure Philosophy
- **Self-hosted first**: Own your infrastructure (RS 8000 + Gitea)
- **Cloud for edge cases**: Cloudflare (edge), RunPod (GPU burst)
- **Cost-optimized**: Local CPU for 70-80% of workload
- **Dockerized everything**: Reproducible, scalable, maintainable
- **Smart orchestration**: Right compute for the right job

---
> **Note:** Make sure the `huggingface-cli` download below targets a non-deprecated model revision. After verifying that, proceed with Image-to-Video 14B 720p (RECOMMENDED):

```bash
huggingface-cli download Wan-AI/Wan2.1-I2V-14B-720P \
  --include "*.safetensors" \
  --local-dir models/diffusion_models/wan2.1_i2v_14b
```
## 🕸️ HYPERINDEX PROJECT - TOP PRIORITY

**Location:** `/home/jeffe/Github/hyperindex-system/`

When the user is ready to work on the hyperindexing system:
1. Reference `HYPERINDEX_PROJECT.md` for complete architecture and implementation details
2. Follow `HYPERINDEX_TODO.md` for the step-by-step checklist
3. Start with Phase 1 (Database & Core Types), then proceed sequentially through Phase 5
4. This is a tldraw canvas integration project using Cloudflare Workers, D1, R2, and Durable Objects
5. It creates a "living, mycelial network" of web discoveries that spawn on the canvas in real time

---
## 📋 BACKLOG.MD - UNIFIED TASK MANAGEMENT

**All projects use Backlog.md for task tracking.** Tasks are managed as markdown files and can be viewed at `backlog.jeffemmett.com` for a unified cross-project view.

### MCP Integration
Backlog.md is integrated via an MCP server. Available tools:
- `backlog.task_create` - Create new tasks
- `backlog.task_list` - List tasks with filters
- `backlog.task_update` - Update task status/details
- `backlog.task_view` - View task details
- `backlog.search` - Search across tasks, docs, decisions

### Task Lifecycle Workflow

**CRITICAL: Claude agents MUST follow this workflow for ALL development tasks:**

#### 1. Task Discovery (Before Starting Work)
```bash
# Check if the task already exists
backlog search "<task description>" --plain

# List current tasks
backlog task list --plain
```

#### 2. Task Creation (If Not Exists)
```bash
# Create a task with full details
backlog task create "Task Title" \
  --desc "Detailed description" \
  --priority high \
  --status "To Do"
```

#### 3. Starting Work (Move to In Progress)
```bash
# Update status when starting
backlog task edit <task-id> --status "In Progress"
```

#### 4. During Development (Update Notes)
```bash
# Append progress notes
backlog task edit <task-id> --append-notes "Completed X, working on Y"

# Update acceptance criteria
backlog task edit <task-id> --check-ac 1
```

#### 5. Completion (Move to Done)
```bash
# Mark complete when finished
backlog task edit <task-id> --status "Done"
```
### Project Initialization

When starting work in a new repository that doesn't have backlog:
```bash
cd /path/to/repo
backlog init "Project Name" --integration-mode mcp --defaults
```

This creates the `backlog/` directory structure:
```
backlog/
├── config.yml    # Project configuration
├── tasks/        # Active tasks
├── completed/    # Finished tasks
├── drafts/       # Draft tasks
├── docs/         # Project documentation
├── decisions/    # Architecture decision records
└── archive/      # Archived tasks
```

### Task File Format
Tasks are markdown files with YAML frontmatter:
```markdown
---
id: task-001
title: Feature implementation
status: In Progress
assignee: [@claude]
created_date: '2025-12-03 14:30'
labels: [feature, backend]
priority: high
dependencies: [task-002]
---

## Description
What needs to be done...

## Plan
1. Step one
2. Step two

## Acceptance Criteria
- [ ] Criterion 1
- [x] Criterion 2 (completed)

## Notes
Progress updates go here...
```
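Because tasks are plain markdown with YAML frontmatter, tooling can read them with a few lines of stdlib code. A minimal sketch that handles flat `key: value` pairs only (no external YAML parser, nested values are out of scope):

```python
def parse_task(text: str) -> dict:
    """Split a backlog task file into frontmatter fields and markdown body (sketch)."""
    meta, body = {}, text
    if text.startswith("---"):
        # Frontmatter sits between the first two '---' delimiters.
        _, frontmatter, body = text.split("---", 2)
        for line in frontmatter.strip().splitlines():
            if ":" in line:
                key, _, value = line.partition(":")
                meta[key.strip()] = value.strip()
    return {"meta": meta, "body": body.strip()}

task = parse_task(
    "---\nid: task-001\nstatus: In Progress\npriority: high\n---\n\n"
    "## Description\nWhat needs to be done..."
)
print(task["meta"]["status"])  # In Progress
```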
### Cross-Project Aggregation (backlog.jeffemmett.com)

**Architecture:**
```
┌─────────────────────────────────────────────────────────────┐
│                   backlog.jeffemmett.com                    │
│                 (Unified Kanban Dashboard)                  │
├─────────────────────────────────────────────────────────────┤
│                                                             │
│  ┌─────────────┐  ┌─────────────┐  ┌─────────────┐          │
│  │ canvas-web  │  │ hyperindex  │  │   mycofi    │  ...     │
│  │  (purple)   │  │   (green)   │  │   (blue)    │          │
│  └──────┬──────┘  └──────┬──────┘  └──────┬──────┘          │
│         │                │                │                 │
│         └────────────────┴────────────────┘                 │
│                          │                                  │
│              ┌───────────┴───────────┐                      │
│              │    Aggregation API    │                      │
│              │ (polls all projects)  │                      │
│              └───────────────────────┘                      │
│                                                             │
└─────────────────────────────────────────────────────────────┘

Data Sources:
├── Local: /home/jeffe/Github/*/backlog/
└── Remote: ssh netcup "ls /opt/*/backlog/"
```

**Color Coding by Project:**
| Project | Color | Location |
|---------|-------|----------|
| canvas-website | Purple | Local + Netcup |
| hyperindex-system | Green | Local |
| mycofi-earth | Blue | Local + Netcup |
| decolonize-time | Orange | Local + Netcup |
| ai-orchestrator | Red | Netcup |

**Aggregation Service** (to be deployed on Netcup):
- Polls all project `backlog/tasks/` directories
- Serves a unified JSON API at `api.backlog.jeffemmett.com`
- Web UI at `backlog.jeffemmett.com` shows the combined Kanban board
- Real-time updates via WebSocket
- Filter by project, status, priority, assignee
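The aggregation service described above is not built yet; as a minimal sketch of its polling side, this scans each project root for `backlog/tasks/*.md` (the function name and output fields are assumptions, not an existing API):

```python
import json
from pathlib import Path

def collect_tasks(project_roots: list[str]) -> list[dict]:
    """Gather task files across projects for a unified board (sketch)."""
    tasks = []
    for root in project_roots:
        for md in sorted(Path(root).glob("backlog/tasks/*.md")):
            tasks.append({
                "project": Path(root).name,  # color-code by project on the board
                "file": md.name,
                # A real service would parse the YAML frontmatter here;
                # the raw text is enough for a sketch.
                "raw": md.read_text(encoding="utf-8"),
            })
    return tasks

# The API endpoint would serve this as JSON, e.g.:
# print(json.dumps(collect_tasks(["/home/jeffe/Github/canvas-website"]), indent=2))
```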
### Agent Behavior Requirements

**When Claude starts working on ANY task:**

1. **Check for existing backlog** in the repo:
   ```bash
   ls backlog/config.yml 2>/dev/null || echo "Backlog not initialized"
   ```

2. **If backlog exists**, search for related tasks:
   ```bash
   backlog search "<relevant keywords>" --plain
   ```

3. **Create or update the task** before writing code:
   ```bash
   # If a new task is needed:
   backlog task create "Task title" --status "In Progress"

   # If the task exists:
   backlog task edit <id> --status "In Progress"
   ```

4. **Update the task on completion**:
   ```bash
   backlog task edit <id> --status "Done" --append-notes "Implementation complete"
   ```

5. **Never leave tasks in "In Progress"** when stopping work - either complete them or add notes explaining blockers.
### Viewing Tasks

**Terminal Kanban Board:**
```bash
backlog board
```

**Web Interface (single project):**
```bash
backlog browser --port 6420
```

**Unified View (all projects):**
Visit `backlog.jeffemmett.com` (served from Netcup)
### Backlog CLI Quick Reference

#### Task Operations
| Action | Command |
|--------|---------|
| View task | `backlog task 42 --plain` |
| List tasks | `backlog task list --plain` |
| Search tasks | `backlog search "topic" --plain` |
| Filter by status | `backlog task list -s "In Progress" --plain` |
| Create task | `backlog task create "Title" -d "Description" --ac "Criterion 1"` |
| Edit task | `backlog task edit 42 -t "New Title" -s "In Progress"` |
| Assign task | `backlog task edit 42 -a @claude` |

#### Acceptance Criteria Management
| Action | Command |
|--------|---------|
| Add AC | `backlog task edit 42 --ac "New criterion"` |
| Check AC #1 | `backlog task edit 42 --check-ac 1` |
| Check multiple | `backlog task edit 42 --check-ac 1 --check-ac 2` |
| Uncheck AC | `backlog task edit 42 --uncheck-ac 1` |
| Remove AC | `backlog task edit 42 --remove-ac 2` |

#### Multi-line Input (Description/Plan/Notes)
The CLI preserves input literally. Use shell-specific syntax for real newlines:

```bash
# Bash/Zsh (ANSI-C quoting)
backlog task edit 42 --notes $'Line1\nLine2\nLine3'
backlog task edit 42 --plan $'1. Step one\n2. Step two'

# POSIX portable
backlog task edit 42 --notes "$(printf 'Line1\nLine2')"

# Append notes progressively
backlog task edit 42 --append-notes $'- Completed X\n- Working on Y'
```

#### Definition of Done (DoD)
A task is **Done** only when ALL of these are complete:

**Via CLI:**
1. All acceptance criteria checked: `--check-ac <index>` for each
2. Implementation notes added: `--notes "..."` or `--append-notes "..."`
3. Status set to Done: `-s Done`

**Via Code/Testing:**
4. Tests pass (run the test suite and linting)
5. Documentation updated if needed
6. Code self-reviewed
7. No regressions

**NEVER mark a task as Done without completing ALL items above.**

### Configuration Reference

Default `backlog/config.yml`:
```yaml
project_name: "Project Name"
default_status: "To Do"
statuses: ["To Do", "In Progress", "Done"]
labels: []
milestones: []
date_format: yyyy-mm-dd
max_column_width: 20
auto_open_browser: true
default_port: 6420
remote_operations: true
auto_commit: true
zero_padded_ids: 3
bypass_git_hooks: false
check_active_branches: true
active_branch_days: 60
```

---

## 🔧 TROUBLESHOOTING

### tmux "server exited unexpectedly"
This error occurs when a stale socket file is left behind by a crashed tmux server.

**Fix:**
```bash
rm -f /tmp/tmux-$(id -u)/default
```

Then start a new session normally with `tmux` or `tmux new -s <name>`.

---
@@ -73,7 +73,6 @@ Custom shape types are preserved:
 - ObsNote
 - Holon
 - FathomMeetingsBrowser
-- FathomTranscript
 - HolonBrowser
 - LocationShare
 - ObsidianBrowser
@@ -0,0 +1,54 @@
# Canvas Website Dockerfile
# Builds the Vite frontend and serves it with nginx
# Backend (sync) still uses Cloudflare Workers

# Build stage
FROM node:20-alpine AS build
WORKDIR /app

# Install dependencies
COPY package*.json ./
RUN npm ci --legacy-peer-deps

# Copy source
COPY . .

# Build args for environment
ARG VITE_TLDRAW_WORKER_URL=https://jeffemmett-canvas.jeffemmett.workers.dev
ARG VITE_DAILY_API_KEY
ARG VITE_RUNPOD_API_KEY
ARG VITE_RUNPOD_IMAGE_ENDPOINT_ID
ARG VITE_RUNPOD_VIDEO_ENDPOINT_ID
ARG VITE_RUNPOD_TEXT_ENDPOINT_ID
ARG VITE_RUNPOD_WHISPER_ENDPOINT_ID

# Set environment for build
ENV VITE_TLDRAW_WORKER_URL=$VITE_TLDRAW_WORKER_URL
ENV VITE_DAILY_API_KEY=$VITE_DAILY_API_KEY
ENV VITE_RUNPOD_API_KEY=$VITE_RUNPOD_API_KEY
ENV VITE_RUNPOD_IMAGE_ENDPOINT_ID=$VITE_RUNPOD_IMAGE_ENDPOINT_ID
ENV VITE_RUNPOD_VIDEO_ENDPOINT_ID=$VITE_RUNPOD_VIDEO_ENDPOINT_ID
ENV VITE_RUNPOD_TEXT_ENDPOINT_ID=$VITE_RUNPOD_TEXT_ENDPOINT_ID
ENV VITE_RUNPOD_WHISPER_ENDPOINT_ID=$VITE_RUNPOD_WHISPER_ENDPOINT_ID

# Build the app
RUN npm run build

# Production stage
FROM nginx:alpine AS production
WORKDIR /usr/share/nginx/html

# Remove default nginx static assets
RUN rm -rf ./*

# Copy built assets from the build stage
COPY --from=build /app/dist .

# Copy nginx config
COPY nginx.conf /etc/nginx/conf.d/default.conf

# Expose port
EXPOSE 80

# Start nginx
CMD ["nginx", "-g", "daemon off;"]
@@ -0,0 +1,232 @@
# mulTmux Integration

mulTmux is now integrated into the canvas-website project as a collaborative terminal tool. It allows multiple developers to work together in the same terminal session.

## Installation

From the root of the canvas-website project:

```bash
# Install all dependencies including mulTmux packages
npm run multmux:install

# Build mulTmux packages
npm run multmux:build
```

## Available Commands

All commands are run from the **root** of the canvas-website project:

| Command | Description |
|---------|-------------|
| `npm run multmux:install` | Install mulTmux dependencies |
| `npm run multmux:build` | Build server and CLI packages |
| `npm run multmux:dev:server` | Run server in development mode |
| `npm run multmux:dev:cli` | Run CLI in development mode |
| `npm run multmux:start` | Start the production server |

## Quick Start

### 1. Build mulTmux

```bash
npm run multmux:build
```

### 2. Start the Server Locally (for testing)

```bash
npm run multmux:start
```

The server will be available at:
- HTTP API: `http://localhost:3000`
- WebSocket: `ws://localhost:3001`

### 3. Install CLI Globally

```bash
cd multmux/packages/cli
npm link
```

Now you can use the `multmux` command anywhere!

### 4. Create a Session

```bash
# Local testing
multmux create my-session

# Or specify your AI server (when deployed)
multmux create my-session --server http://your-ai-server:3000
```

### 5. Join from Another Terminal

```bash
multmux join <token-from-above> --server ws://your-ai-server:3001
```

## Deploying to AI Server

### Option 1: Using the Deploy Script

```bash
cd multmux
./infrastructure/deploy.sh
```

This will:
- Install system dependencies (tmux, Node.js)
- Build the project
- Set up PM2 for process management
- Start the server

### Option 2: Manual Deployment

1. **SSH to your AI server**
   ```bash
   ssh your-ai-server
   ```

2. **Clone or copy the project**
   ```bash
   git clone <your-repo>
   cd canvas-website
   git checkout mulTmux-webtree
   ```

3. **Install and build**
   ```bash
   npm install
   npm run multmux:build
   ```

4. **Start with PM2**
   ```bash
   cd multmux
   npm install -g pm2
   pm2 start packages/server/dist/index.js --name multmux-server
   pm2 save
   pm2 startup
   ```

## Project Structure

```
canvas-website/
├── multmux/
│   ├── packages/
│   │   ├── server/          # Backend (Node.js + tmux)
│   │   └── cli/             # Command-line client
│   ├── infrastructure/
│   │   ├── deploy.sh        # Auto-deployment script
│   │   └── nginx.conf       # Reverse proxy config
│   └── README.md            # Full documentation
├── package.json             # Now includes workspace config
└── MULTMUX_INTEGRATION.md   # This file
```

## Usage Examples

### Collaborative Coding Session

```bash
# Developer 1: Create a session in the project directory
cd /path/to/project
multmux create coding-session --repo $(pwd)

# Developer 2: Join and start coding together
multmux join <token>

# Both can now type in the same terminal!
```

### Debugging Together

```bash
# Create a session for debugging
multmux create debug-auth-issue

# Share the token with a teammate
# Both can run commands, check logs, etc.
```

### List Active Sessions

```bash
multmux list
```

## Configuration

### Environment Variables

You can customize ports by setting environment variables:

```bash
export PORT=3000     # HTTP API port
export WS_PORT=3001  # WebSocket port
```

### Token Expiration

Default: 60 minutes. To change it, edit `/home/jeffe/Github/canvas-website/multmux/packages/server/src/managers/TokenManager.ts:11`

### Session Cleanup

Sessions clean up automatically when all users disconnect. To change this behavior, edit `/home/jeffe/Github/canvas-website/multmux/packages/server/src/managers/SessionManager.ts:64`

## Troubleshooting

### "Command not found: multmux"

Run `npm link` from the CLI package:
```bash
cd multmux/packages/cli
npm link
```

### "Connection refused"

1. Check that the server is running:
   ```bash
   pm2 status
   ```

2. Check that the ports are available:
   ```bash
   netstat -tlnp | grep -E '3000|3001'
   ```

3. Check the logs:
   ```bash
   pm2 logs multmux-server
   ```

### Token Expired

Generate a new token:
```bash
curl -X POST http://localhost:3000/api/sessions/<session-id>/tokens \
  -H "Content-Type: application/json" \
  -d '{"expiresInMinutes": 60}'
```

## Security Notes

- Tokens expire after 60 minutes
- Sessions are isolated per tmux instance
- All input is validated on the server
- Use nginx + SSL for production deployments

## Next Steps

1. **Test locally first**: Run `npm run multmux:start` and try creating/joining sessions
2. **Deploy to AI server**: Use `./infrastructure/deploy.sh`
3. **Set up nginx**: Copy the config from `infrastructure/nginx.conf` for SSL/reverse proxy
4. **Share with team**: Send teammates tokens to collaborate!

For full documentation, see `multmux/README.md`.
@ -0,0 +1,267 @@
# Quick Start Guide - AI Services Setup

**Get your AI orchestration running in under 30 minutes!**

---

## 🎯 Goal

Deploy a smart AI orchestration layer that saves you $768-1,824/year by routing 70-80% of your workload to your Netcup RS 8000 (FREE) and only using RunPod GPU when needed.

---

## ⚡ 30-Minute Quick Start

### Step 1: Verify Access (2 min)

```bash
# Test SSH to Netcup RS 8000
ssh netcup "hostname && docker --version"

# Expected output:
# vXXXXXX.netcup.net
# Docker version 24.0.x
```

✅ **Success?** Continue to Step 2

❌ **Failed?** Set up an SSH key or contact Netcup support

### Step 2: Deploy AI Orchestrator (10 min)

```bash
# Create directory structure
ssh netcup << 'EOF'
mkdir -p /opt/ai-orchestrator/{services/{router,workers,monitor},configs,data}
cd /opt/ai-orchestrator
EOF

# Deploy minimal stack (text generation only for quick start)
ssh netcup "cat > /opt/ai-orchestrator/docker-compose.yml" << 'EOF'
version: '3.8'

services:
  redis:
    image: redis:7-alpine
    ports: ["6379:6379"]
    volumes: ["./data/redis:/data"]
    command: redis-server --appendonly yes

  ollama:
    image: ollama/ollama:latest
    ports: ["11434:11434"]
    volumes: ["/data/models/ollama:/root/.ollama"]
EOF

# Start services
ssh netcup "cd /opt/ai-orchestrator && docker-compose up -d"

# Verify
ssh netcup "docker ps"
```

### Step 3: Download AI Model (5 min)

```bash
# Pull Llama 3 8B (smaller, faster for testing)
ssh netcup "docker exec ollama ollama pull llama3:8b"

# Test it
ssh netcup "docker exec ollama ollama run llama3:8b 'Hello, world!'"
```

Expected output: A friendly AI response!

### Step 4: Test from Your Machine (3 min)

```bash
# Get Netcup IP
NETCUP_IP="159.195.32.209"

# Test Ollama directly
curl -X POST http://$NETCUP_IP:11434/api/generate \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama3:8b",
    "prompt": "Write hello world in Python",
    "stream": false
  }'
```

Expected: Python code response!
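The same request can be issued from application code. A minimal TypeScript sketch of the payload the curl above sends — `buildGenerateRequest` is an illustrative helper, not part of the app:

```typescript
// Sketch: build the JSON body that Ollama's /api/generate endpoint expects.
// Mirrors the curl command above; `buildGenerateRequest` is hypothetical.
interface GenerateRequest {
  model: string
  prompt: string
  stream: boolean
}

function buildGenerateRequest(model: string, prompt: string): GenerateRequest {
  // stream: false returns one complete JSON response instead of chunks
  return { model, prompt, stream: false }
}

// Usage (assumes the Ollama container from Step 2 is reachable):
// const res = await fetch('http://159.195.32.209:11434/api/generate', {
//   method: 'POST',
//   headers: { 'Content-Type': 'application/json' },
//   body: JSON.stringify(buildGenerateRequest('llama3:8b', 'Write hello world in Python')),
// })
// const { response } = await res.json()
```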

### Step 5: Configure canvas-website (5 min)

```bash
cd /home/jeffe/Github/canvas-website-branch-worktrees/add-runpod-AI-API

# Create minimal .env.local
cat > .env.local << 'EOF'
# Ollama direct access (for quick testing)
VITE_OLLAMA_URL=http://159.195.32.209:11434

# Your existing vars...
VITE_GOOGLE_CLIENT_ID=your_google_client_id
VITE_TLDRAW_WORKER_URL=your_worker_url
EOF

# Install and start
npm install
npm run dev
```

### Step 6: Test in Browser (5 min)

1. Open http://localhost:5173 (or your dev port)
2. Create a Prompt shape or use the LLM command
3. Type: "Write a hello world program"
4. Submit
5. Verify: The response appears using your local Ollama!

**🎉 Success!** You're now running AI locally for FREE!

---

## 🚀 Next: Full Setup (Optional)

Once the quick start works, deploy the full stack:

### Option A: Full AI Orchestrator (1 hour)

Follow: `AI_SERVICES_DEPLOYMENT_GUIDE.md` Phases 2-3

Adds:
- Smart routing layer
- Image generation (local SD + RunPod)
- Video generation (RunPod Wan2.1)
- Cost tracking
- Monitoring dashboards

### Option B: Just Add Image Generation (30 min)

```bash
# Add Stable Diffusion CPU to docker-compose.yml
ssh netcup "cat >> /opt/ai-orchestrator/docker-compose.yml" << 'EOF'

  stable-diffusion:
    image: ghcr.io/stablecog/sc-worker:latest
    ports: ["7860:7860"]
    volumes: ["/data/models/stable-diffusion:/models"]
    environment:
      USE_CPU: "true"
EOF

ssh netcup "cd /opt/ai-orchestrator && docker-compose up -d"
```

### Option C: Full Migration (4-5 weeks)

Follow: `NETCUP_MIGRATION_PLAN.md` for the complete DigitalOcean → Netcup migration

---

## 🐛 Quick Troubleshooting

### "Connection refused to 159.195.32.209:11434"

```bash
# Check whether the firewall is blocking the port
ssh netcup "sudo ufw status"
ssh netcup "sudo ufw allow 11434/tcp"
ssh netcup "sudo ufw allow 8000/tcp"  # For AI orchestrator later
```

### "docker: command not found"

```bash
# Install Docker
ssh netcup << 'EOF'
curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh
sudo usermod -aG docker $USER
EOF

# Reconnect and retry
ssh netcup "docker --version"
```

### "Ollama model not found"

```bash
# List installed models
ssh netcup "docker exec ollama ollama list"

# If empty, pull the model
ssh netcup "docker exec ollama ollama pull llama3:8b"
```

### "AI response very slow (>30s)"

```bash
# Check whether the model is still downloading for the first time
ssh netcup "docker exec ollama ollama list"

# Use a smaller model for testing
ssh netcup "docker exec ollama ollama pull mistral:7b"
```

---

## 💡 Quick Tips

1. **Start with an 8B model**: Faster responses, good for testing
2. **Use localhost for dev**: Point directly at the Ollama URL
3. **Deploy the orchestrator later**: Once the basic setup works
4. **Monitor resources**: `ssh -t netcup htop` to check CPU/RAM
5. **Test locally first**: Verify before adding RunPod costs

---

## 📋 Checklist

- [ ] SSH access to Netcup works
- [ ] Docker installed and running
- [ ] Redis and Ollama containers running
- [ ] Llama3 model downloaded
- [ ] Test curl request works
- [ ] canvas-website .env.local configured
- [ ] Browser test successful

**All checked?** You're ready! 🎉

---

## 🎯 Next Steps

Choose your path:

**Path 1: Keep it Simple**
- Use Ollama directly for text generation
- Add user API keys in canvas settings for images
- Deploy the full orchestrator later

**Path 2: Deploy Full Stack**
- Follow `AI_SERVICES_DEPLOYMENT_GUIDE.md`
- Set up image + video generation
- Enable cost tracking and monitoring

**Path 3: Full Migration**
- Follow `NETCUP_MIGRATION_PLAN.md`
- Migrate all services from DigitalOcean
- Set up production infrastructure

---

## 📚 Reference Docs

- **This Guide**: Quick 30-min setup
- **AI_SERVICES_SUMMARY.md**: Complete feature overview
- **AI_SERVICES_DEPLOYMENT_GUIDE.md**: Full deployment (all services)
- **NETCUP_MIGRATION_PLAN.md**: Complete migration plan (8 phases)
- **RUNPOD_SETUP.md**: RunPod WhisperX setup
- **TEST_RUNPOD_AI.md**: Testing guide

---

**Questions?** Check `AI_SERVICES_SUMMARY.md` or the deployment guide!

**Ready for full setup?** Continue to `AI_SERVICES_DEPLOYMENT_GUIDE.md`! 🚀
@@ -0,0 +1,255 @@
# RunPod WhisperX Integration Setup

This guide explains how to set up and use the RunPod WhisperX endpoint for transcription in the canvas website.

## Overview

The transcription system can now use a hosted WhisperX endpoint on RunPod instead of running the Whisper model locally in the browser. This provides:
- Better accuracy with WhisperX's advanced features
- Faster processing (no model download needed)
- Reduced client-side resource usage
- Support for longer audio files

## Prerequisites

1. A RunPod account with an active WhisperX endpoint
2. Your RunPod API key
3. Your RunPod endpoint ID

## Configuration

### Environment Variables

Add the following environment variables to your `.env.local` file (or your deployment environment):

```bash
# RunPod Configuration
VITE_RUNPOD_API_KEY=your_runpod_api_key_here
VITE_RUNPOD_ENDPOINT_ID=your_endpoint_id_here
```

Or if using Next.js:

```bash
NEXT_PUBLIC_RUNPOD_API_KEY=your_runpod_api_key_here
NEXT_PUBLIC_RUNPOD_ENDPOINT_ID=your_endpoint_id_here
```

### Getting Your RunPod Credentials

1. **API Key**:
   - Go to [RunPod Settings](https://www.runpod.io/console/user/settings)
   - Navigate to the API Keys section
   - Create a new API key or copy an existing one

2. **Endpoint ID**:
   - Go to [RunPod Serverless Endpoints](https://www.runpod.io/console/serverless)
   - Find your WhisperX endpoint
   - Copy the endpoint ID from the URL or endpoint details
   - Example: If your endpoint URL is `https://api.runpod.ai/v2/lrtisuv8ixbtub/run`, then `lrtisuv8ixbtub` is your endpoint ID

## Usage

### Automatic Detection

The transcription hook automatically detects whether RunPod is configured and uses it instead of the local Whisper model. No code changes are needed!

### Manual Override

If you want to explicitly control which transcription method to use:

```typescript
import { useWhisperTranscription } from '@/hooks/useWhisperTranscriptionSimple'

const {
  isRecording,
  transcript,
  startRecording,
  stopRecording
} = useWhisperTranscription({
  useRunPod: true, // Force RunPod usage
  language: 'en',
  onTranscriptUpdate: (text) => {
    console.log('New transcript:', text)
  }
})
```

Or to force the local model:

```typescript
useWhisperTranscription({
  useRunPod: false, // Force local Whisper model
  // ... other options
})
```

## API Format

The integration sends audio data to your RunPod endpoint in the following format:

```json
{
  "input": {
    "audio": "base64_encoded_audio_data",
    "audio_format": "audio/wav",
    "language": "en",
    "task": "transcribe"
  }
}
```
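A sketch of how that payload can be assembled from raw audio bytes. It uses Node's `Buffer` for the base64 step so it is easy to exercise; in the browser the app would reach the same result with a `FileReader`. The helper name is illustrative:

```typescript
// Sketch: build the request body shown above from raw audio bytes.
// `buildTranscriptionPayload` is a hypothetical helper; field names
// match the JSON format documented above.
function buildTranscriptionPayload(
  audioBytes: Uint8Array,
  language = 'en'
): { input: { audio: string; audio_format: string; language: string; task: string } } {
  return {
    input: {
      // base64-encode the audio so it survives JSON transport
      audio: Buffer.from(audioBytes).toString('base64'),
      audio_format: 'audio/wav',
      language,
      task: 'transcribe',
    },
  }
}
```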

### Expected Response Format

The endpoint should return one of these formats:

**Direct Response:**
```json
{
  "output": {
    "text": "Transcribed text here"
  }
}
```

**Or with segments:**
```json
{
  "output": {
    "segments": [
      {
        "start": 0.0,
        "end": 2.5,
        "text": "Transcribed text here"
      }
    ]
  }
}
```

**Async Job Pattern:**
```json
{
  "id": "job-id-123",
  "status": "IN_QUEUE"
}
```

The integration automatically handles async jobs by polling the status endpoint until completion.
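A minimal sketch of that polling loop. The `fetchJson` parameter stands in for `fetch` so the loop can be exercised without a network; the `/status/{jobId}` path follows RunPod's serverless convention, and the helper itself is illustrative rather than the app's actual code:

```typescript
// Sketch: poll a RunPod job until it completes or fails.
// `fetchJson` is injected so the loop is testable without a network.
type JobStatus = { status: string; output?: unknown }

async function pollJob(
  fetchJson: (url: string) => Promise<JobStatus>,
  baseUrl: string,
  jobId: string,
  maxAttempts = 60
): Promise<unknown> {
  for (let i = 0; i < maxAttempts; i++) {
    const job = await fetchJson(`${baseUrl}/status/${jobId}`)
    if (job.status === 'COMPLETED') return job.output
    if (job.status === 'FAILED') throw new Error('RunPod job failed')
    // Real code would wait ~1s between attempts
    await new Promise((resolve) => setTimeout(resolve, 0))
  }
  throw new Error('RunPod job timed out')
}
```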

## Customizing the API Request

If your WhisperX endpoint expects a different request format, you can modify `src/lib/runpodApi.ts`:

```typescript
// In the transcribeWithRunPod function
const requestBody = {
  input: {
    // Adjust these fields based on your endpoint
    audio: audioBase64,
    // Add or modify fields as needed
  }
}
```

## Troubleshooting

### "RunPod API key or endpoint ID not configured"

- Ensure environment variables are set correctly
- Restart your development server after adding environment variables
- Check that variable names match exactly (case-sensitive)

### "RunPod API error: 401"

- Verify your API key is correct
- Check that your API key has not expired
- Ensure you're using the correct API key format

### "RunPod API error: 404"

- Verify your endpoint ID is correct
- Check that your endpoint is active in the RunPod console
- Ensure the endpoint URL format matches: `https://api.runpod.ai/v2/{ENDPOINT_ID}/run`

### "No transcription text found in RunPod response"

- Check that your endpoint's response format matches the expected format
- Verify your WhisperX endpoint is configured correctly
- Check the browser console for detailed error messages

### "Failed to return job results" (400 Bad Request)

This error occurs on the **server side** when your WhisperX endpoint tries to return results. This typically means:

1. **Response format mismatch**: Your endpoint's response doesn't match RunPod's expected format
   - Ensure your endpoint returns: `{"output": {"text": "..."}}` or `{"output": {"segments": [...]}}`
   - The response must be valid JSON
   - Check your endpoint handler code to ensure it's returning the correct structure

2. **Response size limits**: The response might be too large
   - Try with shorter audio files first
   - Check RunPod's response size limits

3. **Timeout issues**: The endpoint might be taking too long to process
   - Check your endpoint logs for processing time
   - Consider optimizing your WhisperX model configuration

4. **Check the endpoint handler**: Review your WhisperX endpoint's `handler.py` or equivalent:
   ```python
   # Example correct format
   def handler(event):
       # ... process audio ...
       return {
           "output": {
               "text": transcription_text
           }
       }
   ```

### Transcription not working

- Check the browser console for errors
- Verify your endpoint is active and responding
- Test your endpoint directly using curl or Postman
- Ensure the audio format is supported (WAV is recommended)
- Check the RunPod endpoint logs for server-side errors

## Testing Your Endpoint

You can test your RunPod endpoint directly:

```bash
curl -X POST https://api.runpod.ai/v2/YOUR_ENDPOINT_ID/run \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -d '{
    "input": {
      "audio": "base64_audio_data_here",
      "audio_format": "audio/wav",
      "language": "en"
    }
  }'
```

## Fallback Behavior

If RunPod is not configured or fails, the system will:
1. Try to use RunPod if configured
2. Fall back to the local Whisper model if RunPod fails or is not configured
3. Show error messages if both methods fail
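That fallback chain can be sketched as follows. The transcriber functions are injected placeholders standing in for the RunPod and local-model calls, not the app's real APIs:

```typescript
// Sketch: try RunPod first (when configured), then fall back to the
// local Whisper model. Both transcribers are injected placeholders.
type Transcriber = (audio: Uint8Array) => Promise<string>

async function transcribeWithFallback(
  audio: Uint8Array,
  runpod: Transcriber | null, // null when RunPod is not configured
  local: Transcriber
): Promise<string> {
  if (runpod) {
    try {
      return await runpod(audio)
    } catch {
      // RunPod failed; fall through to the local model
    }
  }
  // If this also throws, the caller surfaces the error to the user
  return local(audio)
}
```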

## Performance Considerations

- **RunPod**: Better for longer audio files and higher accuracy, but requires a network connection
- **Local Model**: Works offline, but requires a model download and uses more client resources

## Support

For issues specific to:
- **RunPod API**: Check the [RunPod Documentation](https://docs.runpod.io)
- **WhisperX**: Check your WhisperX endpoint configuration
- **Integration**: Check the browser console for detailed error messages
@@ -0,0 +1,139 @@
# Testing RunPod AI Integration

This guide explains how to test the RunPod AI API integration in development.

## Quick Setup

1. **Add RunPod environment variables to `.env.local`:**

   ```bash
   # Add these lines to your .env.local file
   VITE_RUNPOD_API_KEY=your_runpod_api_key_here
   VITE_RUNPOD_ENDPOINT_ID=your_endpoint_id_here
   ```

   **Important:** Replace `your_runpod_api_key_here` and `your_endpoint_id_here` with your actual RunPod credentials.

2. **Get your RunPod credentials:**
   - **API Key**: Go to [RunPod Settings](https://www.runpod.io/console/user/settings) → API Keys section
   - **Endpoint ID**: Go to [RunPod Serverless Endpoints](https://www.runpod.io/console/serverless) → Find your endpoint → Copy the ID from the URL
   - Example: If the URL is `https://api.runpod.ai/v2/jqd16o7stu29vq/run`, then `jqd16o7stu29vq` is your endpoint ID

3. **Restart the dev server:**
   ```bash
   npm run dev
   ```

## Testing the Integration

### Method 1: Using Prompt Shapes
1. Open the canvas website in your browser
2. Select the **Prompt** tool from the toolbar (or press the keyboard shortcut)
3. Click on the canvas to create a prompt shape
4. Type a prompt like "Write a hello world program in Python"
5. Press Enter or click the send button
6. The AI response should appear in the prompt shape

### Method 2: Using the Arrow LLM Action
1. Create an arrow shape pointing from one shape to another
2. Add text to the arrow (this becomes the prompt)
3. Select the arrow
4. Press **Alt+G** (or use the action menu)
5. The AI will process the prompt and fill the target shape with the response

### Method 3: Using the Command Palette
1. Press **Cmd+J** (Mac) or **Ctrl+J** (Windows/Linux) to open the LLM view
2. Type your prompt
3. Press Enter
4. The response should appear

## Verifying RunPod Is Being Used

1. **Open the browser console** (F12 or Cmd+Option+I)
2. Look for these log messages:
   - `🔑 Found RunPod configuration from environment variables - using as primary AI provider`
   - `🔍 Found X available AI providers: runpod (default)`
   - `🔄 Attempting to use runpod API (default)...`

3. **Check the Network tab:**
   - Look for requests to `https://api.runpod.ai/v2/{endpointId}/run`
   - The request should have an `Authorization: Bearer {your_api_key}` header

## Expected Behavior

- **With RunPod configured**: RunPod will be used FIRST (priority over user API keys)
- **Without RunPod**: The system will fall back to user-configured API keys (OpenAI, Anthropic, etc.)
- **If both fail**: You'll see an error message

## Troubleshooting

### "No valid API key found for any provider"
- Check that `.env.local` has the correct variable names (`VITE_RUNPOD_API_KEY` and `VITE_RUNPOD_ENDPOINT_ID`)
- Restart the dev server after adding environment variables
- Check the browser console for detailed error messages

### "RunPod API error: 401"
- Verify your API key is correct
- Check that your API key hasn't expired
- Ensure you're using the correct API key format

### "RunPod API error: 404"
- Verify your endpoint ID is correct
- Check that your endpoint is active in the RunPod console
- Ensure the endpoint URL format matches: `https://api.runpod.ai/v2/{ENDPOINT_ID}/run`

### RunPod not being used
- Check the browser console for the `🔑 Found RunPod configuration` message
- Verify environment variables are loaded (check `import.meta.env.VITE_RUNPOD_API_KEY` in the console)
- Make sure you restarted the dev server after adding environment variables

## Testing Different Scenarios

### Test 1: RunPod Only (No User Keys)
1. Remove or clear any user API keys from localStorage
2. Set the RunPod environment variables
3. Run an AI command
4. It should use RunPod automatically

### Test 2: RunPod Priority (With User Keys)
1. Set the RunPod environment variables
2. Also configure user API keys in settings
3. Run an AI command
4. It should use RunPod FIRST, then fall back to user keys if RunPod fails

### Test 3: Fallback Behavior
1. Set the RunPod environment variables with invalid credentials
2. Configure valid user API keys
3. Run an AI command
4. It should try RunPod first, fail, then use the user keys

## API Request Format

The integration sends requests in this format:

```json
{
  "input": {
    "prompt": "Your prompt text here"
  }
}
```

The system prompt and user prompt are combined into a single prompt string.
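A sketch of that combination step. The exact separator the app uses is an assumption, and `combinePrompts` is an illustrative helper:

```typescript
// Sketch: merge the system and user prompts into the single `prompt`
// string sent to RunPod. The "\n\n" separator is illustrative.
function combinePrompts(
  systemPrompt: string,
  userPrompt: string
): { input: { prompt: string } } {
  const prompt = systemPrompt ? `${systemPrompt}\n\n${userPrompt}` : userPrompt
  return { input: { prompt } }
}
```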

## Response Handling

The integration handles multiple response formats:
- Direct text response: `{ "output": "text" }`
- Object with text: `{ "output": { "text": "..." } }`
- Object with response: `{ "output": { "response": "..." } }`
- Async jobs: Polls until completion
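Those shapes can be normalized to plain text with a small helper; field names match the list above, but the function itself is a sketch rather than the integration's actual code:

```typescript
// Sketch: extract the text from the `output` field of a RunPod
// response, handling the shapes listed above.
function extractOutputText(output: unknown): string | null {
  if (typeof output === 'string') return output
  if (output && typeof output === 'object') {
    const o = output as { text?: string; response?: string }
    if (typeof o.text === 'string') return o.text
    if (typeof o.response === 'string') return o.response
  }
  // Unknown shape: let the caller report a format error
  return null
}
```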

## Next Steps

Once testing is successful:
1. Verify RunPod responses are working correctly
2. Test with different prompt types
3. Monitor RunPod usage and costs
4. Consider adding rate limiting if needed
@@ -0,0 +1,341 @@
# Git Worktree Automation Setup

This repository is configured to automatically create Git worktrees for new branches, allowing you to work on multiple branches simultaneously without switching contexts.

## What Are Worktrees?

Git worktrees allow you to have multiple working directories (copies of your repo) checked out to different branches at the same time. This means:

- No need to stash or commit work when switching branches
- Run dev servers on multiple branches simultaneously
- Compare code across branches easily
- Keep your main branch clean while working on features

## Automatic Worktree Creation

A Git hook (`.git/hooks/post-checkout`) is installed that automatically creates worktrees when you create a new branch from `main`:

```bash
# This will automatically create a worktree at ../canvas-website-feature-name
git checkout -b feature/new-feature
```

**Worktree Location Pattern:**
```
/home/jeffe/Github/
├── canvas-website/                   # Main repo (main branch)
├── canvas-website-feature-name/      # Worktree for feature branch
└── canvas-website-bugfix-something/  # Worktree for bugfix branch
```

## Manual Worktree Management

Use the `worktree-manager.sh` script for manual management:

### List All Worktrees
```bash
./scripts/worktree-manager.sh list
```

### Create a New Worktree
```bash
# Creates a worktree for an existing branch
./scripts/worktree-manager.sh create feature/my-feature

# Or create a new branch with a worktree
./scripts/worktree-manager.sh create feature/new-branch
```

### Remove a Worktree
```bash
./scripts/worktree-manager.sh remove feature/old-feature
```

### Clean Up All Worktrees (Keep Main)
```bash
./scripts/worktree-manager.sh clean
```

### Show Status of All Worktrees
```bash
./scripts/worktree-manager.sh status
```

### Navigate to a Worktree
```bash
# Get the worktree path
./scripts/worktree-manager.sh goto feature/my-feature

# Or use with cd
cd $(./scripts/worktree-manager.sh goto feature/my-feature)
```

### Help
```bash
./scripts/worktree-manager.sh help
```

## Workflow Examples

### Starting a New Feature

**With automatic worktree creation:**
```bash
# In the main repo
cd /home/jeffe/Github/canvas-website

# Create and switch to a new branch (worktree auto-created)
git checkout -b feature/terminal-tool

# A notification appears:
# 🌳 Creating worktree for branch: feature/terminal-tool
# 📁 Location: /home/jeffe/Github/canvas-website-feature-terminal-tool

# Continue working in the current directory or switch to the worktree
cd ../canvas-website-feature-terminal-tool
```

**Manual worktree creation:**
```bash
./scripts/worktree-manager.sh create feature/my-feature
cd $(./scripts/worktree-manager.sh goto feature/my-feature)
```

### Working on Multiple Features Simultaneously

```bash
# Terminal 1: Main repo (main branch)
cd /home/jeffe/Github/canvas-website
npm run dev  # Port 5173

# Terminal 2: Feature branch 1
cd /home/jeffe/Github/canvas-website-feature-auth
npm run dev  # Different port

# Terminal 3: Feature branch 2
cd /home/jeffe/Github/canvas-website-feature-ui
npm run dev  # Another port

# All running simultaneously, no conflicts!
```

### Comparing Code Across Branches

```bash
# Use diff or your IDE to compare files
diff /home/jeffe/Github/canvas-website/src/App.tsx \
     /home/jeffe/Github/canvas-website-feature-auth/src/App.tsx

# Or open both in VS Code
code /home/jeffe/Github/canvas-website \
     /home/jeffe/Github/canvas-website-feature-auth
```

### Cleaning Up After Merging

```bash
# After merging feature/my-feature to main
cd /home/jeffe/Github/canvas-website

# Remove the worktree
./scripts/worktree-manager.sh remove feature/my-feature

# Or clean all worktrees except main
./scripts/worktree-manager.sh clean
```

## How It Works

### Post-Checkout Hook

The `.git/hooks/post-checkout` script runs automatically after `git checkout` and:

1. Detects whether you're creating a new branch from `main`
2. Creates a worktree in `../canvas-website-{branch-name}`
3. Links the worktree to the new branch
4. Shows a notification with the worktree path
|
||||||
|
|
||||||
|
**Hook Behavior:**
|
||||||
|
- ✅ Creates worktree when: `git checkout -b new-branch` (from main)
|
||||||
|
- ❌ Skips creation when:
|
||||||
|
- Switching to existing branches
|
||||||
|
- Already in a worktree
|
||||||
|
- Worktree already exists for that branch
|
||||||
|
- Not branching from main/master
|
||||||
|
|
||||||
|
### Worktree Manager Script
|
||||||
|
|
||||||
|
The `scripts/worktree-manager.sh` script provides:
|
||||||
|
- User-friendly commands for worktree operations
|
||||||
|
- Colored output for better readability
|
||||||
|
- Error handling and validation
|
||||||
|
- Status reporting across all worktrees
|
||||||
|
|
||||||
|
## Git Commands with Worktrees
|
||||||
|
|
||||||
|
Most Git commands work the same way in worktrees:
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# In any worktree
|
||||||
|
git status # Shows status of current worktree
|
||||||
|
git add . # Stages files in current worktree
|
||||||
|
git commit -m "..." # Commits in current branch
|
||||||
|
git push # Pushes current branch
|
||||||
|
git pull # Pulls current branch
|
||||||
|
|
||||||
|
# List all worktrees (works from any worktree)
|
||||||
|
git worktree list
|
||||||
|
|
||||||
|
# Remove a worktree (from main repo)
|
||||||
|
git worktree remove feature/branch-name
|
||||||
|
|
||||||
|
# Prune deleted worktrees
|
||||||
|
git worktree prune
|
||||||
|
```
|
||||||
|
|
||||||
|
## Important Notes
|
||||||
|
|
||||||
|
### Shared Git Directory
|
||||||
|
|
||||||
|
All worktrees share the same `.git` directory (in the main repo), which means:
|
||||||
|
- ✅ Commits, branches, and remotes are shared across all worktrees
|
||||||
|
- ✅ One `git fetch` or `git pull` in main updates all worktrees
|
||||||
|
- ⚠️ Don't delete the main repo while worktrees exist
|
||||||
|
- ⚠️ Stashes are shared (stash in one worktree, pop in another)
|
||||||
|
|
||||||
|
### Node Modules
|
||||||
|
|
||||||
|
Each worktree has its own `node_modules`:
|
||||||
|
- First time entering a worktree: run `npm install`
|
||||||
|
- Dependencies may differ across branches
|
||||||
|
- More disk space usage (one `node_modules` per worktree)
|
||||||
|
|
||||||
|
### Port Conflicts
|
||||||
|
|
||||||
|
When running dev servers in multiple worktrees:
|
||||||
|
```bash
|
||||||
|
# Main repo
|
||||||
|
npm run dev # Uses default port 5173
|
||||||
|
|
||||||
|
# In worktree, specify different port
|
||||||
|
npm run dev -- --port 5174
|
||||||
|
```
|
||||||
|
|
||||||
|
### IDE Integration
|
||||||
|
|
||||||
|
**VS Code:**
|
||||||
|
```bash
|
||||||
|
# Open specific worktree
|
||||||
|
code /home/jeffe/Github/canvas-website-feature-name
|
||||||
|
|
||||||
|
# Or open multiple worktrees as workspace
|
||||||
|
code --add /home/jeffe/Github/canvas-website \
|
||||||
|
--add /home/jeffe/Github/canvas-website-feature-name
|
||||||
|
```
|
||||||
|
|
||||||
|
## Troubleshooting

### Worktree Path Already Exists

If you see:

```
fatal: '/path/to/worktree' already exists
```

Remove the directory manually:

```bash
rm -rf /home/jeffe/Github/canvas-website-feature-name
git worktree prune
```

### Can't Delete Main Repo

If you have active worktrees, you can't delete the main repo. Clean up first:

```bash
./scripts/worktree-manager.sh clean
```

### Worktree Out of Sync

If a worktree seems out of sync:

```bash
cd /path/to/worktree
git fetch origin
git reset --hard origin/branch-name
```

### Hook Not Running

If the post-checkout hook isn't running:

```bash
# Check if it's executable
ls -la .git/hooks/post-checkout

# Make it executable if needed
chmod +x .git/hooks/post-checkout

# Test the hook manually
.git/hooks/post-checkout HEAD HEAD 1
```
## Disabling Automatic Worktrees

To disable automatic worktree creation:

```bash
# Remove or rename the hook
mv .git/hooks/post-checkout .git/hooks/post-checkout.disabled
```

To re-enable:

```bash
mv .git/hooks/post-checkout.disabled .git/hooks/post-checkout
```
## Advanced Usage

### Custom Worktree Location

Modify the `post-checkout` hook to change the worktree location:

```bash
# Edit .git/hooks/post-checkout
# Change this line:
WORKTREE_BASE=$(dirname "$REPO_ROOT")

# To (example):
WORKTREE_BASE="$HOME/worktrees"
```

### Worktrees for Remote Branches

```bash
# Create a worktree for a remote branch
git worktree add ../canvas-website-remote-branch origin/feature-branch

# Or use the script
./scripts/worktree-manager.sh create origin/feature-branch
```

### Detached HEAD Worktree

```bash
# Create a worktree at a specific commit
git worktree add ../canvas-website-commit-abc123 abc123
```
## Best Practices

1. **Clean up regularly**: Remove worktrees for merged branches
2. **Name branches clearly**: Worktree names mirror branch names
3. **Run npm install**: Always run it in new worktrees
4. **Check your branch**: Always verify which branch you're on before committing
5. **Use the status command**: Check all worktrees before major operations

## Resources

- [Git Worktree Documentation](https://git-scm.com/docs/git-worktree)
- [Git Hooks Documentation](https://git-scm.com/docs/githooks)

---

**Setup complete!** New branches will automatically create worktrees. Use `./scripts/worktree-manager.sh help` for manual management.
29
_redirects

@ -1,14 +1,25 @@
# Cloudflare Pages redirects and rewrites
# This file handles SPA routing and URL rewrites (replaces vercel.json rewrites)

# Specific route rewrites (matching vercel.json)
# Handle both with and without trailing slashes
/board/* /index.html 200
/board /index.html 200
/board/ /index.html 200
/inbox /index.html 200
/inbox/ /index.html 200
/contact /index.html 200
/contact/ /index.html 200
/presentations /index.html 200
/presentations/ /index.html 200
/presentations/* /index.html 200
/dashboard /index.html 200
/dashboard/ /index.html 200
/login /index.html 200
/login/ /index.html 200
/debug /index.html 200
/debug/ /index.html 200

# SPA fallback - all routes should serve index.html (must be last)
/* /index.html 200
@ -0,0 +1,15 @@

project_name: "Canvas Feature List"
default_status: "To Do"
statuses: ["To Do", "In Progress", "Done"]
labels: []
milestones: []
date_format: yyyy-mm-dd
max_column_width: 20
auto_open_browser: true
default_port: 6420
remote_operations: true
auto_commit: true
zero_padded_ids: 3
bypass_git_hooks: false
check_active_branches: true
active_branch_days: 60
@ -0,0 +1,12 @@

---
id: task-001
title: offline local storage
status: To Do
assignee: []
created_date: '2025-12-03 23:42'
updated_date: '2025-12-04 12:13'
labels: []
dependencies: []
---
@ -0,0 +1,26 @@

---
id: task-002
title: RunPod AI API Integration
status: Done
assignee: []
created_date: '2025-12-03'
labels: [feature, ai, integration]
priority: high
branch: add-runpod-AI-API
worktree: /home/jeffe/Github/canvas-website-branch-worktrees/add-runpod-AI-API
updated_date: '2025-12-04 13:43'
---

## Description

Integrate the RunPod serverless AI API for image generation and other AI features on the canvas.

## Branch Info

- **Branch**: `add-runpod-AI-API`
- **Worktree**: `/home/jeffe/Github/canvas-website-branch-worktrees/add-runpod-AI-API`
- **Commit**: 083095c

## Acceptance Criteria

- [ ] Connect to RunPod serverless endpoints
- [ ] Implement image generation from canvas
- [ ] Handle AI responses and display on canvas
- [ ] Error handling and loading states
@ -0,0 +1,24 @@

---
id: task-003
title: MulTmux Web Integration
status: In Progress
assignee: []
created_date: '2025-12-03'
labels: [feature, terminal, integration]
priority: medium
branch: mulTmux-webtree
worktree: /home/jeffe/Github/canvas-website-branch-worktrees/mulTmux-webtree
---

## Description

Integrate MulTmux web terminal functionality into the canvas for terminal-based interactions.

## Branch Info

- **Branch**: `mulTmux-webtree`
- **Worktree**: `/home/jeffe/Github/canvas-website-branch-worktrees/mulTmux-webtree`
- **Commit**: 8ea3490

## Acceptance Criteria

- [ ] Embed terminal component in canvas
- [ ] Handle terminal I/O within canvas context
- [ ] Support multiple terminal sessions
@ -0,0 +1,24 @@

---
id: task-004
title: IO Chip Feature
status: In Progress
assignee: []
created_date: '2025-12-03'
labels: [feature, io, ui]
priority: medium
branch: feature/io-chip
worktree: /home/jeffe/Github/canvas-website-io-chip
---

## Description

Implement the IO chip feature for the canvas - enabling input/output connections between canvas elements.

## Branch Info

- **Branch**: `feature/io-chip`
- **Worktree**: `/home/jeffe/Github/canvas-website-io-chip`
- **Commit**: 527462a

## Acceptance Criteria

- [ ] Create IO chip component
- [ ] Enable connections between canvas elements
- [ ] Handle data flow between connected chips
@ -0,0 +1,22 @@

---
id: task-005
title: Automerge CRDT Sync
status: To Do
assignee: []
created_date: '2025-12-03'
labels: [feature, sync, collaboration]
priority: high
branch: Automerge
---

## Description

Implement Automerge CRDT-based synchronization for real-time collaborative canvas editing.

## Branch Info

- **Branch**: `Automerge`

## Acceptance Criteria

- [ ] Integrate Automerge library
- [ ] Enable real-time sync between clients
- [ ] Handle conflict resolution automatically
- [ ] Persist state across sessions
@ -0,0 +1,22 @@

---
id: task-006
title: Stripe Payment Integration
status: To Do
assignee: []
created_date: '2025-12-03'
labels: [feature, payments, integration]
priority: medium
branch: stripe-integration
---

## Description

Integrate Stripe for payment processing and subscription management.

## Branch Info

- **Branch**: `stripe-integration`

## Acceptance Criteria

- [ ] Set up Stripe API connection
- [ ] Implement payment flow
- [ ] Handle subscriptions
- [ ] Add billing management UI
@ -0,0 +1,21 @@

---
id: task-007
title: Web3 Integration
status: To Do
assignee: []
created_date: '2025-12-03'
labels: [feature, web3, blockchain]
priority: low
branch: web3-integration
---

## Description

Integrate Web3 capabilities for blockchain-based features (wallet connect, NFT canvas elements, etc.).

## Branch Info

- **Branch**: `web3-integration`

## Acceptance Criteria

- [ ] Add wallet connection
- [ ] Enable NFT minting of canvas elements
- [ ] Blockchain-based ownership verification
@ -0,0 +1,22 @@

---
id: task-008
title: Audio Recording Feature
status: To Do
assignee: []
created_date: '2025-12-03'
labels: [feature, audio, media]
priority: medium
branch: audio-recording-attempt
---

## Description

Implement audio recording capability for voice notes and audio annotations on the canvas.

## Branch Info

- **Branch**: `audio-recording-attempt`

## Acceptance Criteria

- [ ] Record audio from microphone
- [ ] Save audio clips to canvas
- [ ] Playback audio annotations
- [ ] Transcription integration
@ -0,0 +1,22 @@

---
id: task-009
title: Web Speech API Transcription
status: To Do
assignee: []
created_date: '2025-12-03'
labels: [feature, transcription, speech]
priority: medium
branch: transcribe-webspeechAPI
---

## Description

Implement speech-to-text transcription using the Web Speech API for voice input on the canvas.

## Branch Info

- **Branch**: `transcribe-webspeechAPI`

## Acceptance Criteria

- [ ] Capture speech via Web Speech API
- [ ] Convert to text in real-time
- [ ] Display transcription on canvas
- [ ] Support multiple languages
@ -0,0 +1,21 @@

---
id: task-010
title: Holon Integration
status: To Do
assignee: []
created_date: '2025-12-03'
labels: [feature, holon, integration]
priority: medium
branch: holon-integration
---

## Description

Integrate the Holon framework for hierarchical canvas organization and nested structures.

## Branch Info

- **Branch**: `holon-integration`

## Acceptance Criteria

- [ ] Implement holon data structure
- [ ] Enable nested canvas elements
- [ ] Support hierarchical navigation
@ -0,0 +1,21 @@

---
id: task-011
title: Terminal Tool
status: To Do
assignee: []
created_date: '2025-12-03'
labels: [feature, terminal, tool]
priority: medium
branch: feature/terminal-tool
---

## Description

Add a terminal tool to the canvas toolbar for embedding terminal sessions.

## Branch Info

- **Branch**: `feature/terminal-tool`

## Acceptance Criteria

- [ ] Add terminal tool to toolbar
- [ ] Spawn terminal instances on canvas
- [ ] Handle terminal sizing and positioning
@ -0,0 +1,67 @@

---
id: task-012
title: Dark Mode Theme
status: Done
assignee: []
created_date: '2025-12-03'
updated_date: '2025-12-04 06:29'
labels:
  - feature
  - ui
  - theme
dependencies: []
priority: medium
---

## Description

<!-- SECTION:DESCRIPTION:BEGIN -->
Implement dark mode theme support for the canvas interface.

## Branch Info

- **Branch**: `dark-mode`
<!-- SECTION:DESCRIPTION:END -->

## Acceptance Criteria

<!-- AC:BEGIN -->
- [x] #1 Create dark theme colors
- [x] #2 Add theme toggle
- [x] #3 Persist user preference
- [x] #4 System theme detection
<!-- AC:END -->

## Implementation Notes

<!-- SECTION:NOTES:BEGIN -->
## Implementation Complete (2025-12-03)

### Components Updated:

1. **Mycelial Intelligence (MI) Bar** (`src/ui/MycelialIntelligenceBar.tsx`)
   - Added dark mode color palette with automatic switching based on `isDark` state
   - Dark backgrounds, lighter text, adjusted shadows
   - Inline code blocks use a CSS class for proper dark mode styling

2. **Comprehensive CSS Dark Mode** (`src/css/style.css`)
   - Added CSS variables: `--card-bg`, `--input-bg`, `--muted-text`
   - Dark mode styles for: blockquotes, tables, navigation, command palette, MDXEditor, chat containers, form inputs, error/success messages

3. **UserSettingsModal** (`src/ui/UserSettingsModal.tsx`)
   - Added `colors` object with dark/light mode variants
   - Updated all inline styles to use theme-aware colors

4. **StandardizedToolWrapper** (`src/components/StandardizedToolWrapper.tsx`)
   - Added `useIsDarkMode` hook for dark mode detection
   - Updated wrapper backgrounds, shadows, borders, and tag styling

5. **Markdown Tool** (`src/shapes/MarkdownShapeUtil.tsx`)
   - Dark mode detection with automatic background switching
   - Fixed scrollbar: vertical only, hidden when not needed
   - Added toolbar minimize/expand button

### Technical Details:
- Automatic detection via a `document.documentElement.classList` observer
- CSS variables for base styles that auto-switch in dark mode
- Inline style support with conditional color objects
- Comprehensive coverage of all major UI components and tools
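The resolution order implied by the acceptance criteria (a persisted user preference wins, otherwise fall back to system theme detection) can be sketched as a pure helper. This is an illustrative assumption, not the actual implementation; the storage and media-query names in the comments are hypothetical.

```typescript
// Hypothetical sketch: an explicit stored preference wins; otherwise
// fall back to the system setting (e.g. matchMedia('(prefers-color-scheme: dark)')).
type Theme = 'dark' | 'light'

export function resolveTheme(
  storedPreference: string | null, // e.g. localStorage.getItem('theme')
  systemPrefersDark: boolean
): Theme {
  if (storedPreference === 'dark' || storedPreference === 'light') {
    return storedPreference
  }
  return systemPrefersDark ? 'dark' : 'light'
}
```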
<!-- SECTION:NOTES:END -->
@ -0,0 +1,44 @@

---
id: task-013
title: Markdown Tool UX Improvements
status: Done
assignee: []
created_date: '2025-12-04 06:29'
updated_date: '2025-12-04 06:29'
labels:
  - feature
  - ui
  - markdown
dependencies: []
priority: medium
---

## Description

<!-- SECTION:DESCRIPTION:BEGIN -->
Improve the Markdown tool user experience with better scrollbar behavior and a collapsible toolbar.

## Changes Implemented:
- Scrollbar is now vertical only (no horizontal scrollbar)
- Scrollbar auto-hides when not needed
- Added minimize/expand button for the formatting toolbar
- Full editing area uses available space
<!-- SECTION:DESCRIPTION:END -->

## Acceptance Criteria

<!-- AC:BEGIN -->
- [x] #1 Scrollbar is vertical only
- [x] #2 Scrollbar hides when not needed
- [x] #3 Toolbar has minimize/expand toggle
- [x] #4 Full window is editing area
<!-- AC:END -->

## Implementation Notes

<!-- SECTION:NOTES:BEGIN -->
Implementation completed in `src/shapes/MarkdownShapeUtil.tsx`:
- Added `overflow-x: hidden` to the content area
- Custom scrollbar styling with thin width and auto-hide
- Added a toggle button in the toolbar that collapses/expands formatting options
- `isToolbarMinimized` state controls toolbar visibility
<!-- SECTION:NOTES:END -->
@ -0,0 +1,351 @@

---
id: task-014
title: Implement WebGPU-based local image generation to reduce RunPod costs
status: To Do
assignee: []
created_date: '2025-12-04 11:46'
updated_date: '2025-12-04 11:47'
labels:
  - performance
  - cost-optimization
  - webgpu
  - ai
  - image-generation
dependencies: []
priority: high
---

## Description

<!-- SECTION:DESCRIPTION:BEGIN -->
Integrate WebGPU-powered browser-based image generation (SD-Turbo) to reduce RunPod API costs and eliminate cold start delays. This creates a hybrid pipeline where quick drafts/iterations run locally in the browser (FREE, ~1-3 seconds), while high-quality final renders still use RunPod SDXL.

**Problem:**
- Current image generation always hits RunPod (~$0.02/image + 10-30s cold starts)
- No instant feedback loop for creative iteration
- 100% of compute costs are cloud-based

**Solution:**
- Add WebGPU capability detection
- Integrate SD-Turbo for instant browser-based previews
- Smart routing: drafts → browser, final renders → RunPod
- Potential 70% reduction in RunPod image generation costs

**Cost Impact (projected):**
- 1,000 images/mo: $20 → $6 (save $14/mo)
- 5,000 images/mo: $100 → $30 (save $70/mo)
- 10,000 images/mo: $200 → $60 (save $140/mo)

**Browser Support:**
- Chrome/Edge: Full WebGPU (v113+)
- Firefox: Windows (July 2025)
- Safari: v26 beta
- Fallback: WASM backend for unsupported browsers
<!-- SECTION:DESCRIPTION:END -->

## Acceptance Criteria

<!-- AC:BEGIN -->
- [ ] #1 WebGPU capability detection added to clientConfig.ts
- [ ] #2 SD-Turbo model loads and runs in browser via WebGPU
- [ ] #3 ImageGenShapeUtil has Quick Preview vs High Quality toggle
- [ ] #4 Smart routing in aiOrchestrator routes drafts to browser
- [ ] #5 Fallback to WASM for browsers without WebGPU
- [ ] #6 User can generate preview images with zero cold start
- [ ] #7 RunPod only called for High Quality final renders
- [ ] #8 Model download progress indicator shown to user
- [ ] #9 Works offline after initial model download
<!-- AC:END -->
## Implementation Plan

<!-- SECTION:PLAN:BEGIN -->
## Phase 1: Foundation (Quick Wins)

### 1.1 WebGPU Capability Detection
**File:** `src/lib/clientConfig.ts`

```typescript
export async function detectWebGPUCapabilities(): Promise<{
  hasWebGPU: boolean
  hasF16: boolean
  adapterInfo?: GPUAdapterInfo
  estimatedVRAM?: number
}> {
  if (!navigator.gpu) {
    return { hasWebGPU: false, hasF16: false }
  }

  const adapter = await navigator.gpu.requestAdapter()
  if (!adapter) {
    return { hasWebGPU: false, hasF16: false }
  }

  const hasF16 = adapter.features.has('shader-f16')
  const adapterInfo = await adapter.requestAdapterInfo()

  return {
    hasWebGPU: true,
    hasF16,
    adapterInfo,
    estimatedVRAM: adapterInfo.memoryHeaps?.[0]?.size
  }
}
```

### 1.2 Install Dependencies
```bash
npm install @anthropic-ai/sdk onnxruntime-web
# Or for transformers.js v3:
npm install @huggingface/transformers
```

### 1.3 Vite Config Updates
**File:** `vite.config.ts`
- Ensure WASM/ONNX assets are properly bundled
- Add WebGPU shader compilation support
- Configure chunk splitting for ML models
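The chunk-splitting bullet above can be sketched with Rollup's function form of `manualChunks` (accepted under `build.rollupOptions.output` in `vite.config.ts`). The chunk name and the package-matching rule are assumptions, not the project's actual config:

```typescript
// Hypothetical manualChunks rule: route the heavy ML runtime packages
// into their own lazily-loaded chunk so the main bundle stays small.
export function mlManualChunks(id: string): string | undefined {
  if (id.includes('@huggingface/transformers') || id.includes('onnxruntime')) {
    return 'ml-runtime'
  }
  return undefined
}
```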
---
## Phase 2: Browser Diffusion Integration

### 2.1 Create WebGPU Diffusion Module
**New File:** `src/lib/webgpuDiffusion.ts`

```typescript
import { pipeline } from '@huggingface/transformers'

let generator: any = null
let loadingPromise: Promise<void> | null = null

export async function initSDTurbo(
  onProgress?: (progress: number, status: string) => void
): Promise<void> {
  if (generator) return
  if (loadingPromise) return loadingPromise

  loadingPromise = (async () => {
    onProgress?.(0, 'Loading SD-Turbo model...')

    generator = await pipeline(
      'text-to-image',
      'Xenova/sdxl-turbo', // or 'stabilityai/sd-turbo'
      {
        device: 'webgpu',
        dtype: 'fp16',
        progress_callback: (p) => onProgress?.(p.progress, p.status)
      }
    )

    onProgress?.(100, 'Ready')
  })()

  return loadingPromise
}

export async function generateLocalImage(
  prompt: string,
  options?: {
    width?: number
    height?: number
    steps?: number
    seed?: number
  }
): Promise<string> {
  if (!generator) {
    throw new Error('SD-Turbo not initialized. Call initSDTurbo() first.')
  }

  const result = await generator(prompt, {
    width: options?.width || 512,
    height: options?.height || 512,
    num_inference_steps: options?.steps || 1, // SD-Turbo = 1 step
    seed: options?.seed
  })

  // Returns base64 data URL
  return result[0].image
}

export function isSDTurboReady(): boolean {
  return generator !== null
}

export async function unloadSDTurbo(): Promise<void> {
  generator = null
  loadingPromise = null
  // Force garbage collection of GPU memory
}
```

### 2.2 Create Model Download Manager
**New File:** `src/lib/modelDownloadManager.ts`

Handle progressive model downloads with:
- IndexedDB caching for persistence
- Progress tracking UI
- Resume capability for interrupted downloads
- Storage quota management
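The quota-management bullet above can be sketched with storage abstracted behind a simple class, so the same policy could sit on top of IndexedDB in the browser. The class and method names are assumptions for illustration, not the planned API:

```typescript
// Hypothetical quota-enforcing model cache. In the browser, the Map
// would be replaced by IndexedDB object stores keyed by model shard.
export class ModelShardCache {
  private store = new Map<string, Uint8Array>()
  constructor(private quotaBytes: number) {}

  totalBytes(): number {
    let sum = 0
    for (const buf of this.store.values()) sum += buf.byteLength
    return sum
  }

  get(key: string): Uint8Array | undefined {
    return this.store.get(key)
  }

  set(key: string, data: Uint8Array): void {
    // Enforce the storage quota before persisting a shard; callers can
    // surface this as an eviction or a user-facing storage warning.
    if (this.totalBytes() + data.byteLength > this.quotaBytes) {
      throw new Error('model cache quota exceeded')
    }
    this.store.set(key, data)
  }
}
```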
---
## Phase 3: UI Integration

### 3.1 Update ImageGenShapeUtil
**File:** `src/shapes/ImageGenShapeUtil.tsx`

Add to shape props:
```typescript
type IImageGen = TLBaseShape<"ImageGen", {
  // ... existing props
  generationMode: 'auto' | 'local' | 'cloud' // NEW
  localModelStatus: 'not-loaded' | 'loading' | 'ready' | 'error' // NEW
  localModelProgress: number // NEW (0-100)
}>
```

Add UI toggle:
```tsx
<div className="generation-mode-toggle">
  <button
    onClick={() => setMode('local')}
    disabled={!hasWebGPU}
    title={!hasWebGPU ? 'WebGPU not supported' : 'Fast preview (~1-3s)'}
  >
    ⚡ Quick Preview
  </button>
  <button
    onClick={() => setMode('cloud')}
    title="High quality SDXL (~10-30s)"
  >
    ✨ High Quality
  </button>
</div>
```

### 3.2 Smart Generation Logic
```typescript
const generateImage = async (prompt: string) => {
  const mode = shape.props.generationMode
  const capabilities = await detectWebGPUCapabilities()

  // Auto mode: local for iterations, cloud for final
  if (mode === 'auto' || mode === 'local') {
    if (capabilities.hasWebGPU && isSDTurboReady()) {
      // Generate locally - instant!
      const imageUrl = await generateLocalImage(prompt)
      updateShape({ imageUrl, source: 'local' })
      return
    }
  }

  // Fall back to RunPod
  await generateWithRunPod(prompt)
}
```

---
## Phase 4: AI Orchestrator Integration

### 4.1 Update aiOrchestrator.ts
**File:** `src/lib/aiOrchestrator.ts`

Add browser as compute target:
```typescript
type ComputeTarget = 'browser' | 'netcup' | 'runpod'

interface ImageGenerationOptions {
  prompt: string
  priority: 'draft' | 'final'
  preferLocal?: boolean
}

async function generateImage(options: ImageGenerationOptions) {
  const { hasWebGPU } = await detectWebGPUCapabilities()

  // Routing logic
  if (options.priority === 'draft' && hasWebGPU && isSDTurboReady()) {
    return { target: 'browser', cost: 0 }
  }

  if (options.priority === 'final') {
    return { target: 'runpod', cost: 0.02 }
  }

  // Fallback chain
  return { target: 'runpod', cost: 0.02 }
}
```

---
## Phase 5: Advanced Features (Future)

### 5.1 Real-time img2img Refinement
- Start with browser SD-Turbo draft
- User adjusts/annotates
- Send to RunPod SDXL for final with img2img

### 5.2 Browser-based Upscaling
- Add Real-ESRGAN-lite via ONNX Runtime
- 2x/4x upscale locally before cloud render

### 5.3 Background Removal
- U2Net in browser via transformers.js
- Zero-cost background removal

### 5.4 Style Transfer
- Fast neural style transfer via WebGPU shaders
- Real-time preview on canvas

---
## Technical Considerations

### Model Sizes
| Model | Size | Load Time | Generation |
|-------|------|-----------|------------|
| SD-Turbo | ~2GB | 30-60s (first) | 1-3s |
| SD-Turbo (quantized) | ~1GB | 15-30s | 2-4s |

### Memory Management
- Unload model when tab backgrounded
- Clear GPU memory on low-memory warnings
- IndexedDB for model caching (survives refresh)

### Error Handling
- Graceful degradation to WASM if WebGPU fails
- Clear error messages for unsupported browsers
- Automatic fallback to RunPod on local failure

---

## Files to Create/Modify

**New Files:**
- `src/lib/webgpuDiffusion.ts` - SD-Turbo wrapper
- `src/lib/modelDownloadManager.ts` - Model caching
- `src/lib/webgpuCapabilities.ts` - Detection utilities
- `src/components/ModelDownloadProgress.tsx` - UI component

**Modified Files:**
- `src/lib/clientConfig.ts` - Add WebGPU detection
- `src/lib/aiOrchestrator.ts` - Add browser routing
- `src/shapes/ImageGenShapeUtil.tsx` - Add mode toggle
- `vite.config.ts` - ONNX/WASM config
- `package.json` - New dependencies

---

## Testing Checklist

- [ ] WebGPU detection works on Chrome, Edge, Firefox
- [ ] WASM fallback works on Safari/older browsers
- [ ] Model downloads and caches correctly
- [ ] Generation completes in <5s on modern GPU
- [ ] Memory cleaned up properly on unload
- [ ] Offline generation works after model cached
- [ ] RunPod fallback triggers correctly
- [ ] Cost tracking reflects local vs cloud usage
<!-- SECTION:PLAN:END -->
@ -0,0 +1,146 @@

---
id: task-015
title: Set up Cloudflare D1 email-collector database for cross-site subscriptions
status: To Do
assignee: []
created_date: '2025-12-04 12:00'
updated_date: '2025-12-04 12:03'
labels:
  - infrastructure
  - cloudflare
  - d1
  - email
  - cross-site
dependencies: []
priority: medium
---

## Description

<!-- SECTION:DESCRIPTION:BEGIN -->
Create a standalone Cloudflare D1 database for collecting email subscriptions across all websites (mycofi.earth, canvas.jeffemmett.com, decolonizeti.me, etc.) with easy export capabilities.

**Purpose:**
- Unified email collection from all sites
- Page-separated lists (e.g., /newsletter, /waitlist, /landing)
- Simple CSV/JSON export for email campaigns
- GDPR-compliant with unsubscribe tracking

**Sites to integrate:**
- mycofi.earth
- canvas.jeffemmett.com
- decolonizeti.me
- games.jeffemmett.com
- Future sites

**Key Features:**
- Double opt-in verification
- Source tracking (which site, which page)
- Export in multiple formats (CSV, JSON, Mailchimp)
- Basic admin dashboard or CLI for exports
- Rate limiting to prevent abuse
|
||||||
|
<!-- SECTION:DESCRIPTION:END -->
|
||||||
|
|
||||||
|
## Acceptance Criteria

<!-- AC:BEGIN -->
- [ ] #1 D1 database 'email-collector' created on Cloudflare
- [ ] #2 Schema deployed with subscribers, verification_tokens tables
- [ ] #3 POST /api/subscribe endpoint accepts email + source_site + source_page
- [ ] #4 Email verification flow with token-based double opt-in
- [ ] #5 GET /api/emails/export returns CSV with filters (site, date, verified)
- [ ] #6 Unsubscribe endpoint and tracking
- [ ] #7 Rate limiting prevents spam submissions
- [ ] #8 At least one site integrated and collecting emails
<!-- AC:END -->
## Implementation Plan

<!-- SECTION:PLAN:BEGIN -->
## Implementation Steps

### 1. Create D1 Database
```bash
wrangler d1 create email-collector
```

### 2. Create Schema File
Create `worker/email-collector-schema.sql`:

```sql
-- Email Collector Schema
-- Cross-site email subscription management

CREATE TABLE IF NOT EXISTS subscribers (
  id TEXT PRIMARY KEY,
  email TEXT NOT NULL,
  email_hash TEXT NOT NULL, -- For duplicate checking
  source_site TEXT NOT NULL,
  source_page TEXT,
  referrer TEXT,
  ip_country TEXT,
  subscribed_at TEXT DEFAULT (datetime('now')),
  verified INTEGER DEFAULT 0,
  verified_at TEXT,
  unsubscribed INTEGER DEFAULT 0,
  unsubscribed_at TEXT,
  metadata TEXT -- JSON for custom fields
);

CREATE TABLE IF NOT EXISTS verification_tokens (
  id TEXT PRIMARY KEY,
  email TEXT NOT NULL,
  token TEXT UNIQUE NOT NULL,
  expires_at TEXT NOT NULL,
  used INTEGER DEFAULT 0,
  created_at TEXT DEFAULT (datetime('now'))
);

-- Rate limiting table
CREATE TABLE IF NOT EXISTS rate_limits (
  ip_hash TEXT PRIMARY KEY,
  request_count INTEGER DEFAULT 1,
  window_start TEXT DEFAULT (datetime('now'))
);

-- Indexes
CREATE INDEX IF NOT EXISTS idx_subs_email_hash ON subscribers(email_hash);
CREATE INDEX IF NOT EXISTS idx_subs_site ON subscribers(source_site);
CREATE INDEX IF NOT EXISTS idx_subs_page ON subscribers(source_site, source_page);
CREATE INDEX IF NOT EXISTS idx_subs_verified ON subscribers(verified);
CREATE UNIQUE INDEX IF NOT EXISTS idx_subs_unique ON subscribers(email_hash, source_site);
CREATE INDEX IF NOT EXISTS idx_tokens_token ON verification_tokens(token);
```

### 3. Create Worker Endpoints
Create `worker/emailCollector.ts`:

```typescript
// POST /api/subscribe
// GET /api/verify/:token
// POST /api/unsubscribe
// GET /api/emails/export (auth required)
// GET /api/emails/stats
```

### 4. Export Formats
- CSV: `email,source_site,source_page,subscribed_at,verified`
- JSON: Full object array
- Mailchimp: CSV with required headers
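The CSV format above can be serialized with a small helper; this sketch uses the column order from the list (the `SubscriberRow` shape and `toCsv` name are hypothetical):

```typescript
// Sketch of CSV serialization for the export endpoint (names assumed).
interface SubscriberRow {
  email: string;
  source_site: string;
  source_page: string | null;
  subscribed_at: string;
  verified: number;
}

export function toCsv(rows: SubscriberRow[]): string {
  const escape = (v: string | number | null): string => {
    const s = v === null ? '' : String(v);
    // Quote fields containing commas, quotes, or newlines per RFC 4180.
    return /[",\n]/.test(s) ? `"${s.replace(/"/g, '""')}"` : s;
  };
  const header = 'email,source_site,source_page,subscribed_at,verified';
  const lines = rows.map(r =>
    [r.email, r.source_site, r.source_page, r.subscribed_at, r.verified].map(escape).join(',')
  );
  return [header, ...lines].join('\n');
}
```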

### 5. Admin Authentication
- Use simple API key for export endpoint
- Store in Worker secret: `EMAIL_ADMIN_KEY`

### 6. Integration
Add to each site's signup form:
```javascript
fetch('https://canvas.jeffemmett.com/api/subscribe', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    email: 'user@example.com',
    source_site: 'mycofi.earth',
    source_page: '/newsletter'
  })
})
```
<!-- SECTION:PLAN:END -->
@@ -0,0 +1,56 @@
---
id: task-016
title: Add encryption for CryptID emails at rest
status: To Do
assignee: []
created_date: '2025-12-04 12:01'
labels:
  - security
  - cryptid
  - encryption
  - privacy
  - d1
dependencies:
  - task-017
priority: medium
---
## Description

<!-- SECTION:DESCRIPTION:BEGIN -->
Enhance CryptID security by encrypting email addresses stored in the D1 database. This protects user privacy even if the database is compromised.

**Encryption Strategy:**
- Encrypt email addresses before storing in D1
- Use Cloudflare Workers KV or an environment secret for the encryption key
- Store encrypted email + hash for lookups
- Decrypt only when needed (sending emails, display)

**Implementation Options:**
1. **AES-GCM encryption** with key in Worker secret
2. **Deterministic encryption** for email lookups (hash-based)
3. **Hybrid approach**: Hash for lookup index, AES for actual email

**Schema Changes:**
```sql
ALTER TABLE users ADD COLUMN email_encrypted TEXT;
ALTER TABLE users ADD COLUMN email_hash TEXT; -- For lookups
-- Migrate existing emails, then drop plaintext column
```

**Considerations:**
- Key rotation strategy
- Performance impact on lookups
- Backup/recovery implications
<!-- SECTION:DESCRIPTION:END -->

## Acceptance Criteria

<!-- AC:BEGIN -->
- [ ] #1 Encryption key securely stored in Worker secrets
- [ ] #2 Emails encrypted before D1 insert
- [ ] #3 Email lookup works via hash index
- [ ] #4 Decryption works for email display and sending
- [ ] #5 Existing emails migrated to encrypted format
- [ ] #6 Key rotation procedure documented
- [ ] #7 No plaintext emails in database
<!-- AC:END -->
@@ -0,0 +1,63 @@
---
id: task-017
title: Deploy CryptID email recovery to dev branch and test
status: To Do
assignee: []
created_date: '2025-12-04 12:00'
updated_date: '2025-12-04 12:27'
labels:
  - feature
  - cryptid
  - auth
  - testing
  - dev-branch
dependencies:
  - task-018
  - task-019
priority: high
---

## Description

<!-- SECTION:DESCRIPTION:BEGIN -->
Push the existing CryptID email recovery code changes to the dev branch and test the full flow before merging to main.

**Code Changes Ready:**
- `src/App.tsx` - Routes for /verify-email, /link-device
- `src/components/auth/CryptID.tsx` - Email linking flow
- `src/components/auth/Profile.tsx` - Email management UI, device list
- `src/css/crypto-auth.css` - Styling for email/device modals
- `worker/types.ts` - Updated D1 types
- `worker/worker.ts` - Auth API routes
- `worker/cryptidAuth.ts` - Auth handlers (already committed)

**Test Scenarios:**
1. Link email to existing CryptID account
2. Verify email via link
3. Request device link from new device
4. Approve device link via email
5. View and revoke linked devices
6. Recover account on new device via email
<!-- SECTION:DESCRIPTION:END -->

## Acceptance Criteria

<!-- AC:BEGIN -->
- [ ] #1 All CryptID changes committed to dev branch
- [ ] #2 Worker deployed to dev environment
- [ ] #3 Link email flow works end-to-end
- [ ] #4 Email verification completes successfully
- [ ] #5 Device linking via email works
- [ ] #6 Device revocation works
- [ ] #7 Profile shows linked email and devices
- [ ] #8 No console errors in happy path
<!-- AC:END -->

## Implementation Notes

<!-- SECTION:NOTES:BEGIN -->
Branch created: `feature/cryptid-email-recovery`

Code committed and pushed to Gitea.

PR available at: https://gitea.jeffemmett.com/jeffemmett/canvas-website/compare/main...feature/cryptid-email-recovery
<!-- SECTION:NOTES:END -->
@@ -0,0 +1,111 @@
---
id: task-018
title: Create Cloudflare D1 cryptid-auth database
status: To Do
assignee: []
created_date: '2025-12-04 12:02'
updated_date: '2025-12-04 12:27'
labels:
  - infrastructure
  - cloudflare
  - d1
  - cryptid
  - auth
dependencies: []
priority: high
---

## Description

<!-- SECTION:DESCRIPTION:BEGIN -->
Create the D1 database on Cloudflare for the CryptID authentication system. This is the first step before deploying the email recovery feature.

**Database Purpose:**
- Store user accounts linked to CryptID usernames
- Store device public keys for multi-device auth
- Store verification tokens for email/device linking
- Enable account recovery via verified email

**Security Considerations:**
- Emails should be encrypted at rest (task-016)
- Public keys are safe to store (not secrets)
- Tokens are time-limited and single-use
- No passwords stored (WebCrypto key-based auth)
<!-- SECTION:DESCRIPTION:END -->

## Acceptance Criteria

<!-- AC:BEGIN -->
- [ ] #1 D1 database 'cryptid-auth' created via wrangler d1 create
- [ ] #2 D1 database 'cryptid-auth-dev' created for dev environment
- [ ] #3 Database IDs added to wrangler.toml (replacing placeholders)
- [ ] #4 Schema from worker/schema.sql deployed to both databases
- [ ] #5 Verified tables exist: users, device_keys, verification_tokens
<!-- AC:END -->

## Implementation Plan

<!-- SECTION:PLAN:BEGIN -->
## Implementation Steps

### 1. Create D1 Databases
Run from a local machine or Netcup (requires the wrangler CLI):

```bash
cd /home/jeffe/Github/canvas-website

# Create production database
wrangler d1 create cryptid-auth

# Create dev database
wrangler d1 create cryptid-auth-dev
```

### 2. Update wrangler.toml
Replace the placeholder IDs with the actual database IDs from step 1:

```toml
[[d1_databases]]
binding = "CRYPTID_DB"
database_name = "cryptid-auth"
database_id = "<PROD_ID_FROM_STEP_1>"

[[env.dev.d1_databases]]
binding = "CRYPTID_DB"
database_name = "cryptid-auth-dev"
database_id = "<DEV_ID_FROM_STEP_1>"
```

### 3. Deploy Schema
```bash
# Deploy to dev first
wrangler d1 execute cryptid-auth-dev --file=./worker/schema.sql

# Then production
wrangler d1 execute cryptid-auth --file=./worker/schema.sql
```

### 4. Verify Tables
```bash
# Check dev
wrangler d1 execute cryptid-auth-dev --command="SELECT name FROM sqlite_master WHERE type='table';"

# Expected output:
# - users
# - device_keys
# - verification_tokens
```

### 5. Commit wrangler.toml Changes
```bash
git add wrangler.toml
git commit -m "chore: add D1 database IDs for cryptid-auth"
```
<!-- SECTION:PLAN:END -->

## Implementation Notes

<!-- SECTION:NOTES:BEGIN -->
Feature branch: `feature/cryptid-email-recovery`

Code is ready - waiting for D1 database creation.
<!-- SECTION:NOTES:END -->
@@ -0,0 +1,41 @@
---
id: task-019
title: Configure CryptID secrets and SendGrid integration
status: To Do
assignee: []
created_date: '2025-12-04 12:02'
labels:
  - infrastructure
  - cloudflare
  - cryptid
  - secrets
  - sendgrid
dependencies:
  - task-018
priority: high
---

## Description

<!-- SECTION:DESCRIPTION:BEGIN -->
Set up the required secrets and environment variables for CryptID email functionality on Cloudflare Workers.

**Required Secrets:**
- `SENDGRID_API_KEY` - For sending verification emails
- `CRYPTID_EMAIL_FROM` - Sender email address (e.g., auth@jeffemmett.com)
- `APP_URL` - Base URL for verification links (e.g., https://canvas.jeffemmett.com)

**Configuration:**
- Secrets set for both production and dev environments
- SendGrid account configured with verified sender domain
- Email templates tested
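Setting the secrets for both environments might look like the following; this is a sketch using wrangler's standard `secret put` command, and the `--env dev` flag assumes the env name used in wrangler.toml:

```shell
# wrangler prompts for each secret value interactively
wrangler secret put SENDGRID_API_KEY
wrangler secret put SENDGRID_API_KEY --env dev
wrangler secret put CRYPTID_EMAIL_FROM
wrangler secret put CRYPTID_EMAIL_FROM --env dev
```

`APP_URL` is a plain (non-secret) variable, so it goes in wrangler.toml `[vars]` rather than through `secret put` (per acceptance criterion #3).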
<!-- SECTION:DESCRIPTION:END -->

## Acceptance Criteria

<!-- AC:BEGIN -->
- [ ] #1 SENDGRID_API_KEY secret set via wrangler secret put
- [ ] #2 CRYPTID_EMAIL_FROM secret configured
- [ ] #3 APP_URL environment variable set in wrangler.toml
- [ ] #4 SendGrid sender domain verified (jeffemmett.com or subdomain)
- [ ] #5 Test email sends successfully from Worker
<!-- AC:END -->
@@ -0,0 +1,63 @@
---
id: task-024
title: 'Open Mapping: Collaborative Route Planning Module'
status: To Do
assignee: []
created_date: '2025-12-04 14:30'
labels:
  - feature
  - mapping
dependencies: []
priority: high
---

## Description

<!-- SECTION:DESCRIPTION:BEGIN -->
Implement an open-source mapping and routing layer for the canvas that provides advanced route planning capabilities beyond Google Maps. Built on OpenStreetMap, OSRM/Valhalla, and MapLibre GL JS.
<!-- SECTION:DESCRIPTION:END -->

## Acceptance Criteria

<!-- AC:BEGIN -->
- [ ] #1 MapLibre GL JS integrated with tldraw canvas
- [ ] #2 OSRM routing backend deployed to Netcup
- [ ] #3 Waypoint placement and route calculation working
- [ ] #4 Multi-route comparison UI implemented
- [ ] #5 Y.js collaboration for shared route editing
- [ ] #6 Layer management panel with basemap switching
- [ ] #7 Offline tile caching via Service Worker
- [ ] #8 Budget tracking per waypoint/route
<!-- AC:END -->

## Implementation Plan

<!-- SECTION:PLAN:BEGIN -->
Phase 1 - Foundation:
- Integrate MapLibre GL JS with tldraw
- Deploy OSRM to /opt/apps/open-mapping/
- Basic waypoint and route UI

Phase 2 - Multi-Route:
- Alternative routes visualization
- Route comparison panel
- Elevation profiles

Phase 3 - Collaboration:
- Y.js integration
- Real-time cursor presence
- Share links

Phase 4 - Layers:
- Layer panel UI
- Multiple basemaps
- Custom overlays

Phase 5 - Calendar/Budget:
- Time windows on waypoints
- Cost estimation
- iCal export

Phase 6 - Optimization:
- VROOM TSP/VRP
- Offline PWA
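As a sketch of how the Phase 1 waypoint UI could talk to the self-hosted OSRM backend: the base URL and helper name below are assumptions, while the `/route/v1/{profile}` path and `alternatives`/`geometries`/`overview` query parameters follow OSRM's documented HTTP API (alternatives feed the Phase 2 comparison panel).

```typescript
// Build an OSRM route request URL from ordered waypoints (helper name assumed).
interface Waypoint {
  lon: number;
  lat: number;
}

export function osrmRouteUrl(base: string, waypoints: Waypoint[]): string {
  // OSRM expects semicolon-separated "lon,lat" coordinate pairs.
  const coords = waypoints.map(w => `${w.lon},${w.lat}`).join(';');
  return `${base}/route/v1/driving/${coords}?alternatives=true&geometries=geojson&overview=full`;
}

// Usage (hypothetical backend URL):
// fetch(osrmRouteUrl('https://osrm.example.com', [{ lon: 13.388, lat: 52.517 }, { lon: 13.397, lat: 52.529 }]))
```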
<!-- SECTION:PLAN:END -->
@@ -0,0 +1,23 @@
---
id: task-high.01
title: 'MI Bar UX: Modal Fade & Scrollable Try Next'
status: Done
assignee: []
created_date: '2025-12-04 06:34'
labels: []
dependencies: []
parent_task_id: task-high
---

## Description

<!-- SECTION:DESCRIPTION:BEGIN -->
Improved Mycelial Intelligence bar UX: the bar now fades when modals/popups are open, and the Tools and Follow-up suggestions are combined into a single scrollable 'Try Next' section.
<!-- SECTION:DESCRIPTION:END -->

## Acceptance Criteria

<!-- AC:BEGIN -->
- [ ] #1 MI bar fades when settings modal is open
- [ ] #2 MI bar fades when auth modal is open
- [ ] #3 Suggested tools and follow-ups in single scrollable row
<!-- AC:END -->
@@ -0,0 +1,24 @@
---
id: task-high.02
title: CryptID Email Recovery in Settings
status: Done
assignee: []
created_date: '2025-12-04 06:35'
labels: []
dependencies: []
parent_task_id: task-high
---

## Description

<!-- SECTION:DESCRIPTION:BEGIN -->
Added email linking to the User Settings modal General tab - allows users to attach their email to their CryptID account for device recovery and verification.
<!-- SECTION:DESCRIPTION:END -->

## Acceptance Criteria

<!-- AC:BEGIN -->
- [ ] #1 Email linking UI in General settings tab
- [ ] #2 Shows email verification status
- [ ] #3 Sends verification email on link
- [ ] #4 Dark mode aware styling
<!-- AC:END -->
@@ -0,0 +1,38 @@
# Canvas Website Docker Compose
# Production: jeffemmett.com, www.jeffemmett.com
# Staging: staging.jeffemmett.com

services:
  canvas-website:
    build:
      context: .
      dockerfile: Dockerfile
      args:
        - VITE_TLDRAW_WORKER_URL=https://jeffemmett-canvas.jeffemmett.workers.dev
        # Add other build args from .env if needed
    container_name: canvas-website
    restart: unless-stopped
    labels:
      - "traefik.enable=true"
      - "traefik.docker.network=traefik-public"
      # Single service definition (both routers use same backend)
      - "traefik.http.services.canvas.loadbalancer.server.port=80"
      # Production deployment (jeffemmett.com and www)
      - "traefik.http.routers.canvas-prod.rule=Host(`jeffemmett.com`) || Host(`www.jeffemmett.com`)"
      - "traefik.http.routers.canvas-prod.entrypoints=web"
      - "traefik.http.routers.canvas-prod.service=canvas"
      # Staging deployment (keep for testing)
      - "traefik.http.routers.canvas-staging.rule=Host(`staging.jeffemmett.com`)"
      - "traefik.http.routers.canvas-staging.entrypoints=web"
      - "traefik.http.routers.canvas-staging.service=canvas"
    networks:
      - traefik-public
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost/health"]
      interval: 30s
      timeout: 10s
      retries: 3

networks:
  traefik-public:
    external: true
@@ -4,7 +4,7 @@ This document describes the complete WebCryptoAPI authentication system implemen

## Overview

-The WebCryptoAPI authentication system provides cryptographic authentication using ECDSA P-256 key pairs, challenge-response authentication, and secure key storage. It integrates with the existing ODD (Open Data Directory) framework while providing a fallback authentication mechanism.
+The WebCryptoAPI authentication system provides cryptographic authentication using ECDSA P-256 key pairs, challenge-response authentication, and secure key storage. This is the primary authentication mechanism for the application.

## Architecture
@@ -23,13 +23,14 @@ The WebCryptoAPI authentication system provides cryptographic authentication usi
   - User registration and login
   - Credential verification

-3. **Enhanced AuthService** (`src/lib/auth/authService.ts`)
+3. **AuthService** (`src/lib/auth/authService.ts`)
-   - Integrates crypto authentication with ODD
+   - Simplified authentication service
-   - Fallback mechanisms
   - Session management
+   - Integration with CryptoAuthService

4. **UI Components**
-   - `CryptoLogin.tsx` - Cryptographic authentication UI
+   - `CryptID.tsx` - Cryptographic authentication UI
+   - `CryptoDebug.tsx` - Debug component for verification
   - `CryptoTest.tsx` - Test component for verification

## Features
@@ -41,7 +42,6 @@ The WebCryptoAPI authentication system provides cryptographic authentication usi
- **Public Key Infrastructure**: Store and verify public keys
- **Browser Support Detection**: Checks for WebCryptoAPI availability
- **Secure Context Validation**: Ensures HTTPS requirement
-- **Fallback Authentication**: Works with existing ODD system
- **Modern UI**: Responsive design with dark mode support
- **Comprehensive Testing**: Test component for verification
@@ -86,7 +86,7 @@ const isValid = await crypto.verifySignature(publicKey, signature, challenge);

### Feature Detection
```typescript
const hasWebCrypto = typeof window.crypto !== 'undefined' &&
  typeof window.crypto.subtle !== 'undefined';
const isSecure = window.isSecureContext;
```

@@ -98,26 +98,26 @@ const isSecure = window.isSecureContext;
1. **Secure Context Requirement**: Only works over HTTPS
2. **ECDSA P-256**: Industry-standard elliptic curve
3. **Challenge-Response**: Prevents replay attacks
-4. **Key Storage**: Public keys stored securely
+4. **Key Storage**: Public keys stored securely in localStorage
5. **Input Validation**: Username format validation
6. **Error Handling**: Comprehensive error management

### ⚠️ Security Notes

-1. **Private Key Storage**: Currently simplified for demo purposes
+1. **Private Key Storage**: Currently uses localStorage for demo purposes
-   - In production, use Web Crypto API's key storage
+   - In production, consider using Web Crypto API's non-extractable keys
   - Consider hardware security modules (HSM)
   - Implement proper key derivation

2. **Session Management**:
-   - Integrates with existing ODD session system
+   - Uses localStorage for session persistence
-   - Consider implementing JWT tokens
+   - Consider implementing JWT tokens for server-side verification
-   - Add session expiration
+   - Add session expiration and refresh logic

3. **Network Security**:
   - All crypto operations happen client-side
   - No private keys transmitted over network
-   - Consider adding server-side verification
+   - Consider adding server-side signature verification

## Usage
@@ -146,11 +146,22 @@ import { useAuth } from './context/AuthContext';

const { login, register } = useAuth();

-// The AuthService automatically tries crypto auth first,
-// then falls back to ODD authentication
+// AuthService automatically uses crypto auth
const success = await login('username');
```

+### Using the CryptID Component
+
+```typescript
+import CryptID from './components/auth/CryptID';
+
+// Render the authentication component
+<CryptID
+  onSuccess={() => console.log('Login successful')}
+  onCancel={() => console.log('Login cancelled')}
+/>
+```

### Testing the Implementation

```typescript
@@ -166,31 +177,42 @@ import CryptoTest from './components/auth/CryptoTest';
src/
├── lib/
│   ├── auth/
│   │   ├── crypto.ts              # WebCryptoAPI wrapper
│   │   ├── cryptoAuthService.ts   # High-level auth service
-│   │   ├── authService.ts         # Enhanced auth service
+│   │   ├── authService.ts         # Simplified auth service
-│   │   └── account.ts             # User account management
+│   │   ├── sessionPersistence.ts  # Session storage utilities
+│   │   └── types.ts               # TypeScript types
│   └── utils/
│       └── browser.ts             # Browser support detection
├── components/
│   └── auth/
-│       ├── CryptoLogin.tsx        # Crypto auth UI
+│       ├── CryptID.tsx            # Main crypto auth UI
-│       └── CryptoTest.tsx         # Test component
+│       ├── CryptoDebug.tsx        # Debug component
+│       └── CryptoTest.tsx         # Test component
+├── context/
+│   └── AuthContext.tsx            # React context for auth state
└── css/
    └── crypto-auth.css            # Styles for crypto components
```

## Dependencies

### Required Packages
- `one-webcrypto`: WebCryptoAPI polyfill (^1.0.3)
-- `@oddjs/odd`: Open Data Directory framework (^0.37.2)

### Browser APIs Used
- `window.crypto.subtle`: WebCryptoAPI
-- `window.localStorage`: Key storage
+- `window.localStorage`: Key and session storage
- `window.isSecureContext`: Security context check

+## Storage
+
+### localStorage Keys Used
+- `registeredUsers`: Array of registered usernames
+- `${username}_publicKey`: User's public key (Base64)
+- `${username}_authData`: Authentication data (challenge, signature, timestamp)
+- `session`: Current user session data

## Testing

### Manual Testing
@@ -208,6 +230,7 @@ src/
- [x] User registration
- [x] User login
- [x] Credential verification
+- [x] Session persistence

## Troubleshooting
@@ -228,13 +251,13 @@ src/
   - Try refreshing the page

4. **"Authentication failed"**
-   - Verify user exists
+   - Verify user exists in localStorage
   - Check stored credentials
   - Clear browser data and retry

### Debug Mode

-Enable debug logging by setting:
+Enable debug logging by opening the browser console:
```typescript
localStorage.setItem('debug_crypto', 'true');
```
@@ -242,7 +265,7 @@ localStorage.setItem('debug_crypto', 'true');
## Future Enhancements

### Planned Improvements
-1. **Enhanced Key Storage**: Use Web Crypto API's key storage
+1. **Enhanced Key Storage**: Use Web Crypto API's non-extractable keys
2. **Server-Side Verification**: Add server-side signature verification
3. **Multi-Factor Authentication**: Add additional authentication factors
4. **Key Rotation**: Implement automatic key rotation
@@ -254,6 +277,15 @@ localStorage.setItem('debug_crypto', 'true');

3. **Post-Quantum Cryptography**: Prepare for quantum threats
4. **Biometric Integration**: Add biometric authentication

## Integration with Automerge Sync

The authentication system works seamlessly with the Automerge-based real-time collaboration:

- **User Identification**: Each user is identified by their username in Automerge
- **Session Management**: Sessions persist across page reloads via localStorage
- **Collaboration**: Authenticated users can join shared canvas rooms
- **Privacy**: Only authenticated users can access canvas data

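A minimal sketch of the "sessions persist across page reloads via localStorage" behavior described above. The storage key and stored fields are illustrative assumptions, not the actual implementation; an in-memory shim stands in for `window.localStorage` so the sketch is self-contained.

```typescript
// In-memory stand-in for window.localStorage so this sketch runs anywhere.
const storage = new Map<string, string>();
const localStorageShim = {
  setItem: (k: string, v: string) => void storage.set(k, v),
  getItem: (k: string) => storage.get(k) ?? null,
  removeItem: (k: string) => void storage.delete(k),
};

// Hypothetical shape of a persisted session; field names are assumptions.
interface StoredSession {
  username: string;
  createdAt: number;
}

const SESSION_KEY = 'crypto_session'; // assumed key name

function saveSession(session: StoredSession): void {
  localStorageShim.setItem(SESSION_KEY, JSON.stringify(session));
}

function loadSession(): StoredSession | null {
  const raw = localStorageShim.getItem(SESSION_KEY);
  return raw ? (JSON.parse(raw) as StoredSession) : null;
}

saveSession({ username: 'alice', createdAt: Date.now() });
console.log(loadSession()?.username); // prints "alice"
```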
## Contributing

When contributing to the WebCryptoAPI authentication system:

@@ -269,4 +301,4 @@ When contributing to the WebCryptoAPI authentication system:

- [WebCryptoAPI Specification](https://www.w3.org/TR/WebCryptoAPI/)
- [ECDSA Algorithm](https://en.wikipedia.org/wiki/Elliptic_Curve_Digital_Signature_Algorithm)
- [P-256 Curve](https://en.wikipedia.org/wiki/NIST_Curve_P-256)
- [Challenge-Response Authentication](https://en.wikipedia.org/wiki/Challenge%E2%80%93response_authentication)

@@ -0,0 +1,8 @@
node_modules
packages/*/node_modules
packages/*/dist
*.log
.git
.gitignore
README.md
infrastructure/
@@ -0,0 +1,35 @@
# Dependencies
node_modules/
package-lock.json

# Build outputs
dist/
*.tsbuildinfo

# Logs
logs/
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*
pm2.log

# Environment variables
.env
.env.local
.env.*.local

# IDE
.vscode/
.idea/
*.swp
*.swo
*~

# OS
.DS_Store
Thumbs.db

# PM2
ecosystem.config.js
.pm2/
@@ -0,0 +1,32 @@
# mulTmux Server Dockerfile
FROM node:20-slim

# Install tmux and build dependencies for node-pty
RUN apt-get update && apt-get install -y \
    tmux \
    python3 \
    make \
    g++ \
    && rm -rf /var/lib/apt/lists/*

WORKDIR /app

# Copy workspace root files
COPY package.json ./
COPY tsconfig.json ./

# Copy packages
COPY packages/server ./packages/server
COPY packages/cli ./packages/cli

# Install dependencies (including node-pty native compilation)
RUN npm install --workspaces

# Build TypeScript
RUN npm run build

# Expose port
EXPOSE 3002

# Run the server
CMD ["node", "packages/server/dist/index.js"]
@@ -0,0 +1,240 @@
# mulTmux

A collaborative terminal tool that lets multiple users interact with the same tmux session in real time.

## Features

- **Real-time Collaboration**: Multiple users can connect to the same terminal session
- **tmux Backend**: Leverages tmux for robust terminal multiplexing
- **Token-based Auth**: Secure invite links with expiration
- **Presence Indicators**: See who's connected to your session
- **Low Resource Usage**: ~200-300MB RAM for typical usage
- **Easy Deployment**: Works alongside existing services on your server

## Architecture

```
┌─────────────┐                           ┌──────────────────┐
│   Client    │ ──── WebSocket ─────────> │      Server      │
│   (CLI)     │      (token auth)         │                  │
└─────────────┘                           │  ┌────────────┐  │
                                          │  │  Node.js   │  │
┌─────────────┐                           │  │  Backend   │  │
│  Client 2   │ ──── Invite Link ───────> │  └─────┬──────┘  │
│   (CLI)     │                           │        │         │
└─────────────┘                           │  ┌─────▼──────┐  │
                                          │  │    tmux    │  │
                                          │  │  Sessions  │  │
                                          │  └────────────┘  │
                                          └──────────────────┘
```

## Installation

### Server Setup

1. **Deploy to your AI server:**
   ```bash
   cd multmux
   chmod +x infrastructure/deploy.sh
   ./infrastructure/deploy.sh
   ```

   This will:
   - Install tmux if needed
   - Build the server
   - Set up PM2 for process management
   - Start the server

2. **(Optional) Set up nginx reverse proxy:**
   ```bash
   sudo cp infrastructure/nginx.conf /etc/nginx/sites-available/multmux
   sudo ln -s /etc/nginx/sites-available/multmux /etc/nginx/sites-enabled/
   # Edit the file to set your domain
   sudo nano /etc/nginx/sites-available/multmux
   sudo nginx -t
   sudo systemctl reload nginx
   ```

### CLI Installation

**On your local machine:**
```bash
cd multmux/packages/cli
npm install
npm run build
npm link  # Installs the 'multmux' command globally
```

## Usage

### Create a Session

```bash
multmux create my-project --repo /path/to/repo
```

This outputs a join command like:
```
multmux join a1b2c3d4e5f6g7h8i9j0k1l2m3n4o5p6
```

### Join a Session

```bash
multmux join a1b2c3d4e5f6g7h8i9j0k1l2m3n4o5p6
```

### List Active Sessions

```bash
multmux list
```

### Using a Remote Server

If your server is on a different machine:

```bash
# Create session
multmux create my-project --server http://your-server:3000

# Join session
multmux join <token> --server ws://your-server:3001
```

## CLI Commands

| Command | Description |
|---------|-------------|
| `multmux create <name>` | Create a new collaborative session |
| `multmux join <token>` | Join an existing session |
| `multmux list` | List all active sessions |

### Options

**create:**
- `-s, --server <url>` - Server URL (default: http://localhost:3000)
- `-r, --repo <path>` - Repository path to cd into

**join:**
- `-s, --server <url>` - WebSocket server URL (default: ws://localhost:3001)

**list:**
- `-s, --server <url>` - Server URL (default: http://localhost:3000)

## Server Management

### PM2 Commands

```bash
pm2 status                  # Check server status
pm2 logs multmux-server     # View server logs
pm2 restart multmux-server  # Restart server
pm2 stop multmux-server     # Stop server
```

### Resource Usage

- **Idle**: ~100-150MB RAM
- **Per session**: ~5-10MB RAM
- **Per user**: ~1-2MB RAM
- **Typical usage**: 200-300MB RAM total

## API Reference

### HTTP API (default: port 3000)

| Endpoint | Method | Description |
|----------|--------|-------------|
| `/api/sessions` | POST | Create a new session |
| `/api/sessions` | GET | List active sessions |
| `/api/sessions/:id` | GET | Get session info |
| `/api/sessions/:id/tokens` | POST | Generate new invite token |
| `/api/health` | GET | Health check |
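As a sketch of how a client might call the create-session endpoint above: the request shape mirrors the CLI's `create` command elsewhere in this diff (the `name` and `repoPath` fields come from that code), while the builder function itself is illustrative. It constructs the request without sending it, so the shape is easy to inspect:

```typescript
// Build the request for POST /api/sessions (fields per the CLI's create command).
interface CreateSessionRequest {
  url: string;
  method: 'POST';
  headers: Record<string, string>;
  body: string;
}

function buildCreateSessionRequest(
  serverUrl: string,
  name: string,
  repoPath?: string
): CreateSessionRequest {
  return {
    url: `${serverUrl}/api/sessions`,
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ name, repoPath }),
  };
}

const req = buildCreateSessionRequest('http://localhost:3000', 'my-project');
console.log(req.url); // prints "http://localhost:3000/api/sessions"
```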
### WebSocket (default: port 3001)

Connect with: `ws://localhost:3001?token=<your-token>`

**Message Types:**
- `output` - Terminal output from server
- `input` - User input to terminal
- `resize` - Terminal resize event
- `presence` - User join/leave notifications
- `joined` - Connection confirmation

## Security

- **Token Expiration**: Invite tokens expire after 60 minutes (configurable)
- **Session Isolation**: Each session runs in its own tmux instance
- **Input Validation**: All terminal input is validated
- **No Persistence**: Sessions are destroyed when all users leave

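A minimal sketch of the token-expiration rule above. The 60-minute TTL matches the stated default; the function and field names are illustrative assumptions, not the server's actual implementation:

```typescript
const TOKEN_TTL_MS = 60 * 60 * 1000; // 60 minutes, the documented default

// Hypothetical token record; field names are assumptions.
interface InviteToken {
  value: string;
  issuedAt: number; // epoch milliseconds
}

function isTokenExpired(token: InviteToken, now: number = Date.now()): boolean {
  return now - token.issuedAt > TOKEN_TTL_MS;
}

const token: InviteToken = { value: 'a1b2c3', issuedAt: Date.now() };
console.log(isTokenExpired(token)); // prints "false" for a fresh token
```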
## Troubleshooting

### Server won't start

Check whether the ports are available:
```bash
netstat -tlnp | grep -E '3000|3001'
```

### Can't connect to server

1. Check the server is running: `pm2 status`
2. Check logs: `pm2 logs multmux-server`
3. Verify the firewall allows ports 3000 and 3001

### Terminal not responding

1. Check the WebSocket connection in the browser console
2. Verify the token hasn't expired
3. Restart the server: `pm2 restart multmux-server`

## Development

### Project Structure

```
multmux/
├── packages/
│   ├── server/              # Backend server
│   │   ├── src/
│   │   │   ├── managers/    # Session & token management
│   │   │   ├── websocket/   # WebSocket handler
│   │   │   └── api/         # HTTP routes
│   └── cli/                 # CLI client
│       ├── src/
│       │   ├── commands/    # CLI commands
│       │   ├── connection/  # WebSocket client
│       │   └── ui/          # Terminal UI
└── infrastructure/          # Deployment scripts
```

### Running in Development

**Terminal 1 - Server:**
```bash
npm run dev:server
```

**Terminal 2 - CLI:**
```bash
cd packages/cli
npm run dev -- create test-session
```

### Building

```bash
npm run build   # Builds both packages
```

## License

MIT

## Contributing

Contributions welcome! Please open an issue or PR.
@@ -0,0 +1,33 @@
version: '3.8'

services:
  multmux:
    build: .
    container_name: multmux-server
    restart: unless-stopped
    environment:
      - NODE_ENV=production
      - PORT=3002
    labels:
      - "traefik.enable=true"
      # HTTP router
      - "traefik.http.routers.multmux.rule=Host(`terminal.jeffemmett.com`)"
      - "traefik.http.routers.multmux.entrypoints=web"
      - "traefik.http.services.multmux.loadbalancer.server.port=3002"
      # WebSocket support - Traefik handles this automatically for HTTP/1.1 upgrades
      # Enable sticky sessions for WebSocket connections
      - "traefik.http.services.multmux.loadbalancer.sticky.cookie=true"
      - "traefik.http.services.multmux.loadbalancer.sticky.cookie.name=multmux_session"
    networks:
      - traefik-public
    # Health check
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:3002/api/health"]
      interval: 30s
      timeout: 10s
      retries: 3
      start_period: 10s

networks:
  traefik-public:
    external: true
@@ -0,0 +1,91 @@
#!/bin/bash

# mulTmux Deployment Script for AI Server
# This script sets up mulTmux on your existing droplet

set -e

echo "🚀 mulTmux Deployment Script"
echo "============================"
echo ""

# Check if tmux is installed
if ! command -v tmux &> /dev/null; then
    echo "📦 Installing tmux..."
    sudo apt-get update
    sudo apt-get install -y tmux
else
    echo "✅ tmux is already installed"
fi

# Check if Node.js is installed
if ! command -v node &> /dev/null; then
    echo "📦 Installing Node.js..."
    curl -fsSL https://deb.nodesource.com/setup_20.x | sudo -E bash -
    sudo apt-get install -y nodejs
else
    echo "✅ Node.js is already installed ($(node --version))"
fi

# Check if npm is installed
if ! command -v npm &> /dev/null; then
    echo "❌ npm is not installed. Please install npm first."
    exit 1
else
    echo "✅ npm is already installed ($(npm --version))"
fi

# Build the server
echo ""
echo "🔨 Building mulTmux..."
cd "$(dirname "$0")/.."
npm install
npm run build

echo ""
echo "📝 Setting up PM2 for process management..."
if ! command -v pm2 &> /dev/null; then
    sudo npm install -g pm2
fi

# Create PM2 ecosystem file
cat > ecosystem.config.js << EOF
module.exports = {
  apps: [{
    name: 'multmux-server',
    script: './packages/server/dist/index.js',
    instances: 1,
    autorestart: true,
    watch: false,
    max_memory_restart: '500M',
    env: {
      NODE_ENV: 'production',
      PORT: 3000,
      WS_PORT: 3001
    }
  }]
};
EOF

echo ""
echo "🚀 Starting mulTmux server with PM2..."
pm2 start ecosystem.config.js
pm2 save
pm2 startup | tail -n 1 | bash || true

echo ""
echo "✅ mulTmux deployed successfully!"
echo ""
echo "Server is running on:"
echo "  HTTP API:  http://localhost:3000"
echo "  WebSocket: ws://localhost:3001"
echo ""
echo "Useful PM2 commands:"
echo "  pm2 status                  - Check server status"
echo "  pm2 logs multmux-server     - View logs"
echo "  pm2 restart multmux-server  - Restart server"
echo "  pm2 stop multmux-server     - Stop server"
echo ""
echo "To install the CLI globally:"
echo "  cd packages/cli && npm link"
echo ""
@@ -0,0 +1,53 @@
# nginx configuration for mulTmux
# Place this in /etc/nginx/sites-available/multmux
# Then: sudo ln -s /etc/nginx/sites-available/multmux /etc/nginx/sites-enabled/

upstream multmux_api {
    server localhost:3000;
}

upstream multmux_ws {
    server localhost:3001;
}

server {
    listen 80;
    server_name your-server-domain.com;  # Change this to your domain or IP

    # HTTP API
    location /api {
        proxy_pass http://multmux_api;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }

    # WebSocket
    location /ws {
        proxy_pass http://multmux_ws;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_read_timeout 86400;
    }
}

# Optional: SSL configuration (if using Let's Encrypt)
# server {
#     listen 443 ssl http2;
#     server_name your-server-domain.com;
#
#     ssl_certificate /etc/letsencrypt/live/your-server-domain.com/fullchain.pem;
#     ssl_certificate_key /etc/letsencrypt/live/your-server-domain.com/privkey.pem;
#
#     # Same location blocks as above...
# }
@@ -0,0 +1,19 @@
{
  "name": "multmux",
  "version": "0.1.0",
  "private": true,
  "description": "Collaborative terminal tool with tmux backend",
  "workspaces": [
    "packages/*"
  ],
  "scripts": {
    "build": "npm run build -ws",
    "dev:server": "npm run dev -w @multmux/server",
    "dev:cli": "npm run dev -w @multmux/cli",
    "start:server": "npm run start -w @multmux/server"
  },
  "devDependencies": {
    "@types/node": "^20.0.0",
    "typescript": "^5.0.0"
  }
}
@@ -0,0 +1,30 @@
{
  "name": "@multmux/cli",
  "version": "0.1.0",
  "description": "mulTmux CLI - collaborative terminal client",
  "main": "dist/index.js",
  "bin": {
    "multmux": "./dist/index.js"
  },
  "scripts": {
    "build": "tsc",
    "dev": "tsx src/index.ts",
    "start": "node dist/index.js"
  },
  "dependencies": {
    "commander": "^11.1.0",
    "ws": "^8.16.0",
    "blessed": "^0.1.81",
    "chalk": "^4.1.2",
    "ora": "^5.4.1",
    "node-fetch": "^2.7.0"
  },
  "devDependencies": {
    "@types/ws": "^8.5.10",
    "@types/node": "^20.0.0",
    "@types/blessed": "^0.1.25",
    "@types/node-fetch": "^2.6.9",
    "tsx": "^4.7.0",
    "typescript": "^5.0.0"
  }
}
@@ -0,0 +1,50 @@
import fetch from 'node-fetch';
import chalk from 'chalk';
import ora from 'ora';

export async function createSession(
  name: string,
  options: { server?: string; repo?: string }
): Promise<void> {
  const serverUrl = options.server || 'http://localhost:3000';
  const spinner = ora('Creating session...').start();

  try {
    const response = await fetch(`${serverUrl}/api/sessions`, {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({
        name,
        repoPath: options.repo,
      }),
    });

    if (!response.ok) {
      throw new Error(`Failed to create session: ${response.statusText}`);
    }

    const data: any = await response.json();

    spinner.succeed('Session created!');

    console.log('');
    console.log(chalk.bold('Session Details:'));
    console.log(`  Name:    ${chalk.cyan(data.session.name)}`);
    console.log(`  ID:      ${chalk.gray(data.session.id)}`);
    console.log(`  Created: ${new Date(data.session.createdAt).toLocaleString()}`);
    console.log('');
    console.log(chalk.bold('To join this session:'));
    console.log(chalk.green(`  ${data.inviteUrl}`));
    console.log('');
    console.log(chalk.bold('Or share this token:'));
    console.log(`  ${chalk.yellow(data.token)}`);
    console.log('');
    console.log(chalk.dim('Token expires in 60 minutes'));
  } catch (error) {
    spinner.fail('Failed to create session');
    console.error(chalk.red((error as Error).message));
    process.exit(1);
  }
}
@@ -0,0 +1,45 @@
import chalk from 'chalk';
import ora from 'ora';
import { WebSocketClient } from '../connection/WebSocketClient';
import { TerminalUI } from '../ui/Terminal';

export async function joinSession(
  token: string,
  options: { server?: string }
): Promise<void> {
  const serverUrl = options.server || 'ws://localhost:3001';
  const spinner = ora('Connecting to session...').start();

  try {
    const client = new WebSocketClient(serverUrl, token);

    // Wait for connection
    await client.connect();
    spinner.succeed('Connected!');

    // Wait a moment for the 'joined' event
    await new Promise((resolve) => {
      client.once('joined', resolve);
      setTimeout(resolve, 1000); // Fallback timeout
    });

    console.log(chalk.green('\nJoined session! Press ESC or Ctrl-C to exit.\n'));

    // Create terminal UI
    const ui = new TerminalUI(client);

    // Handle errors
    client.on('error', (error: Error) => {
      console.error(chalk.red('\nConnection error:'), error.message);
    });

    client.on('reconnect-failed', () => {
      console.error(chalk.red('\nFailed to reconnect. Exiting...'));
      process.exit(1);
    });
  } catch (error) {
    spinner.fail('Failed to connect');
    console.error(chalk.red((error as Error).message));
    process.exit(1);
  }
}
@@ -0,0 +1,38 @@
import fetch from 'node-fetch';
import chalk from 'chalk';
import ora from 'ora';

export async function listSessions(options: { server?: string }): Promise<void> {
  const serverUrl = options.server || 'http://localhost:3000';
  const spinner = ora('Fetching sessions...').start();

  try {
    const response = await fetch(`${serverUrl}/api/sessions`);

    if (!response.ok) {
      throw new Error(`Failed to fetch sessions: ${response.statusText}`);
    }

    const data: any = await response.json();
    spinner.stop();

    if (data.sessions.length === 0) {
      console.log(chalk.yellow('No active sessions found.'));
      return;
    }

    console.log(chalk.bold(`\nActive Sessions (${data.sessions.length}):\n`));

    data.sessions.forEach((session: any) => {
      console.log(chalk.cyan(`  ${session.name}`));
      console.log(`    ID:      ${chalk.gray(session.id)}`);
      console.log(`    Clients: ${session.activeClients}`);
      console.log(`    Created: ${new Date(session.createdAt).toLocaleString()}`);
      console.log('');
    });
  } catch (error) {
    spinner.fail('Failed to fetch sessions');
    console.error(chalk.red((error as Error).message));
    process.exit(1);
  }
}
@@ -0,0 +1,120 @@
import WebSocket from 'ws';
import { EventEmitter } from 'events';

export interface TerminalMessage {
  type: 'output' | 'input' | 'resize' | 'join' | 'leave' | 'presence' | 'joined' | 'error';
  data?: any;
  clientId?: string;
  timestamp?: number;
  sessionId?: string;
  sessionName?: string;
  message?: string;
}

export class WebSocketClient extends EventEmitter {
  private ws: WebSocket | null = null;
  private reconnectAttempts = 0;
  private maxReconnectAttempts = 5;

  constructor(private url: string, private token: string) {
    super();
  }

  connect(): Promise<void> {
    return new Promise((resolve, reject) => {
      const wsUrl = `${this.url}?token=${this.token}`;
      this.ws = new WebSocket(wsUrl);

      this.ws.on('open', () => {
        this.reconnectAttempts = 0;
        this.emit('connected');
        resolve();
      });

      this.ws.on('message', (data) => {
        try {
          const message: TerminalMessage = JSON.parse(data.toString());
          this.handleMessage(message);
        } catch (error) {
          console.error('Failed to parse message:', error);
        }
      });

      this.ws.on('close', () => {
        this.emit('disconnected');
        this.attemptReconnect();
      });

      this.ws.on('error', (error) => {
        this.emit('error', error);
        reject(error);
      });
    });
  }

  private handleMessage(message: TerminalMessage): void {
    switch (message.type) {
      case 'output':
        this.emit('output', message.data);
        break;
      case 'joined':
        this.emit('joined', {
          sessionId: message.sessionId,
          sessionName: message.sessionName,
          clientId: message.clientId,
        });
        break;
      case 'presence':
        this.emit('presence', message.data);
        break;
      case 'error':
        this.emit('error', new Error(message.message || 'Unknown error'));
        break;
    }
  }

  sendInput(data: string): void {
    if (this.ws && this.ws.readyState === WebSocket.OPEN) {
      this.ws.send(
        JSON.stringify({
          type: 'input',
          data,
          timestamp: Date.now(),
        })
      );
    }
  }

  resize(cols: number, rows: number): void {
    if (this.ws && this.ws.readyState === WebSocket.OPEN) {
      this.ws.send(
        JSON.stringify({
          type: 'resize',
          data: { cols, rows },
          timestamp: Date.now(),
        })
      );
    }
  }

  disconnect(): void {
    if (this.ws) {
      this.ws.close();
      this.ws = null;
    }
  }

  private attemptReconnect(): void {
    if (this.reconnectAttempts < this.maxReconnectAttempts) {
      this.reconnectAttempts++;
      setTimeout(() => {
        this.emit('reconnecting', this.reconnectAttempts);
        this.connect().catch(() => {
          // Reconnection failed; will retry
        });
      }, 1000 * this.reconnectAttempts);
    } else {
      this.emit('reconnect-failed');
    }
  }
}
@@ -0,0 +1,34 @@
#!/usr/bin/env node

import { Command } from 'commander';
import { createSession } from './commands/create';
import { joinSession } from './commands/join';
import { listSessions } from './commands/list';

const program = new Command();

program
  .name('multmux')
  .description('Collaborative terminal tool with tmux backend')
  .version('0.1.0');

program
  .command('create <name>')
  .description('Create a new collaborative session')
  .option('-s, --server <url>', 'Server URL', 'http://localhost:3000')
  .option('-r, --repo <path>', 'Repository path to use')
  .action(createSession);

program
  .command('join <token>')
  .description('Join an existing session with a token')
  .option('-s, --server <url>', 'WebSocket server URL', 'ws://localhost:3001')
  .action(joinSession);

program
  .command('list')
  .description('List active sessions')
  .option('-s, --server <url>', 'Server URL', 'http://localhost:3000')
  .action(listSessions);

program.parse();
@@ -0,0 +1,154 @@
import blessed from 'blessed';
import { WebSocketClient } from '../connection/WebSocketClient';

export class TerminalUI {
  private screen: blessed.Widgets.Screen;
  private terminal: blessed.Widgets.BoxElement;
  private statusBar: blessed.Widgets.BoxElement;
  private buffer: string = '';

  constructor(private client: WebSocketClient) {
    // Create screen
    this.screen = blessed.screen({
      smartCSR: true,
      title: 'mulTmux',
    });

    // Status bar
    this.statusBar = blessed.box({
      top: 0,
      left: 0,
      width: '100%',
      height: 1,
      style: {
        fg: 'white',
        bg: 'blue',
      },
      content: ' mulTmux - Connecting...',
    });

    // Terminal output
    this.terminal = blessed.box({
      top: 1,
      left: 0,
      width: '100%',
      height: '100%-1',
      scrollable: true,
      alwaysScroll: true,
      scrollbar: {
        style: {
          bg: 'blue',
        },
      },
      keys: true,
      vi: true,
      mouse: true,
      content: '',
    });

    this.screen.append(this.statusBar);
    this.screen.append(this.terminal);

    // Focus terminal
    this.terminal.focus();

    // Set up event handlers
    this.setupEventHandlers();

    // Render
    this.screen.render();
  }

  private setupEventHandlers(): void {
    // Handle terminal output from the server
    this.client.on('output', (data: string) => {
      this.buffer += data;
      this.terminal.setContent(this.buffer);
      this.terminal.setScrollPerc(100);
      this.screen.render();
    });

    // Handle connection events
    this.client.on('connected', () => {
      this.updateStatus('Connected', 'green');
    });

    this.client.on('joined', (info: any) => {
      this.updateStatus(`Session: ${info.sessionName} (${info.clientId.slice(0, 8)})`, 'green');
    });

    this.client.on('disconnected', () => {
      this.updateStatus('Disconnected', 'red');
    });

    this.client.on('reconnecting', (attempt: number) => {
      this.updateStatus(`Reconnecting (${attempt}/5)...`, 'yellow');
    });

    this.client.on('presence', (data: any) => {
      if (data.action === 'join') {
        this.showNotification(`User joined (${data.totalClients} online)`);
      } else if (data.action === 'leave') {
        this.showNotification(`User left (${data.totalClients} online)`);
      }
    });

    // Handle keyboard input
    this.screen.on('keypress', (ch: string, key: any) => {
      if (key.name === 'escape' || (key.ctrl && key.name === 'c')) {
        this.close();
        return;
      }

      // Send input to the server
      if (ch) {
        this.client.sendInput(ch);
      } else if (key.name) {
        // Handle special keys
        const specialKeys: { [key: string]: string } = {
          enter: '\r',
          backspace: '\x7f',
          tab: '\t',
          up: '\x1b[A',
          down: '\x1b[B',
          right: '\x1b[C',
          left: '\x1b[D',
        };

        if (specialKeys[key.name]) {
          this.client.sendInput(specialKeys[key.name]);
        }
      }
    });

    // Handle resize
    this.screen.on('resize', () => {
      const { width, height } = this.terminal;
      this.client.resize(width as number, (height as number) - 1);
    });

    // Quit on Ctrl-C
    this.screen.key(['C-c'], () => {
      this.close();
    });
  }

  private updateStatus(text: string, color: string = 'blue'): void {
    this.statusBar.style.bg = color;
    this.statusBar.setContent(` mulTmux - ${text}`);
    this.screen.render();
  }

  private showNotification(text: string): void {
    // Append notification to the buffer
    this.buffer += `\n[mulTmux] ${text}\n`;
    this.terminal.setContent(this.buffer);
    this.screen.render();
  }

  close(): void {
    this.client.disconnect();
    this.screen.destroy();
    process.exit(0);
  }
}
@@ -0,0 +1,8 @@
{
  "extends": "../../tsconfig.json",
  "compilerOptions": {
    "outDir": "./dist",
    "rootDir": "./src"
  },
  "include": ["src/**/*"]
}
@@ -0,0 +1,26 @@
{
  "name": "@multmux/server",
  "version": "0.1.0",
  "description": "mulTmux server - collaborative terminal backend",
  "main": "dist/index.js",
  "scripts": {
    "build": "tsc",
    "dev": "tsx watch src/index.ts",
    "start": "node dist/index.js"
  },
  "dependencies": {
    "express": "^4.18.0",
    "ws": "^8.16.0",
    "node-pty": "^1.0.0",
    "nanoid": "^3.3.7",
    "cors": "^2.8.5"
  },
  "devDependencies": {
    "@types/express": "^4.17.21",
    "@types/ws": "^8.5.10",
    "@types/node": "^20.0.0",
    "@types/cors": "^2.8.17",
    "tsx": "^4.7.0",
    "typescript": "^5.0.0"
  }
}
@@ -0,0 +1,116 @@
import { Router } from 'express';
import { SessionManager } from '../managers/SessionManager';
import { TokenManager } from '../managers/TokenManager';

export function createRouter(
  sessionManager: SessionManager,
  tokenManager: TokenManager
): Router {
  const router = Router();

  // Create a new session
  router.post('/sessions', async (req, res) => {
    try {
      const { name, repoPath } = req.body;

      if (!name || typeof name !== 'string') {
        return res.status(400).json({ error: 'Session name is required' });
      }

      const session = await sessionManager.createSession(name, repoPath);
      const token = tokenManager.generateToken(session.id, 60, 'write');

      res.json({
        session: {
          id: session.id,
          name: session.name,
          createdAt: session.createdAt,
        },
        token,
        inviteUrl: `multmux join ${token}`,
      });
    } catch (error) {
      console.error('Failed to create session:', error);
      res.status(500).json({ error: 'Failed to create session' });
    }
  });

  // List active sessions
  router.get('/sessions', (req, res) => {
    const sessions = sessionManager.listSessions();
    res.json({
      sessions: sessions.map((s) => ({
        id: s.id,
        name: s.name,
        createdAt: s.createdAt,
        activeClients: s.clients.size,
      })),
    });
  });

  // Get session info
  router.get('/sessions/:id', (req, res) => {
    const session = sessionManager.getSession(req.params.id);

    if (!session) {
      return res.status(404).json({ error: 'Session not found' });
    }

    res.json({
      id: session.id,
      name: session.name,
      createdAt: session.createdAt,
      activeClients: session.clients.size,
    });
  });

  // Join an existing session (generates a new token and returns session info)
  router.post('/sessions/:id/join', (req, res) => {
    const session = sessionManager.getSession(req.params.id);

    if (!session) {
      return res.status(404).json({ error: 'Session not found' });
    }

    // Generate a new token for this joining client
    const token = tokenManager.generateToken(session.id, 60, 'write');

    res.json({
      id: session.id,
      name: session.name,
      token,
      createdAt: session.createdAt,
      activeClients: session.clients.size,
    });
  });

  // Generate new invite token for existing session
  router.post('/sessions/:id/tokens', (req, res) => {
    const session = sessionManager.getSession(req.params.id);

    if (!session) {
      return res.status(404).json({ error: 'Session not found' });
    }

    const { expiresInMinutes = 60, permissions = 'write' } = req.body;
    const token = tokenManager.generateToken(session.id, expiresInMinutes, permissions);

    res.json({
      token,
      inviteUrl: `multmux join ${token}`,
      expiresInMinutes,
      permissions,
    });
  });

  // Health check
  router.get('/health', (req, res) => {
    res.json({
      status: 'ok',
      activeSessions: sessionManager.listSessions().length,
      activeTokens: tokenManager.getActiveTokens(),
    });
  });

  return router;
}
@@ -0,0 +1,55 @@
import express from 'express';
import { createServer } from 'http';
import { WebSocketServer } from 'ws';
import cors from 'cors';
import { SessionManager } from './managers/SessionManager';
import { TokenManager } from './managers/TokenManager';
import { TerminalHandler } from './websocket/TerminalHandler';
import { createRouter } from './api/routes';

const PORT = process.env.PORT || 3002;

async function main() {
  // Initialize managers
  const sessionManager = new SessionManager();
  const tokenManager = new TokenManager();
  const terminalHandler = new TerminalHandler(sessionManager, tokenManager);

  // HTTP API Server
  const app = express();
  app.use(cors());
  app.use(express.json());
  app.use('/api', createRouter(sessionManager, tokenManager));

  // Create HTTP server to share with WebSocket
  const server = createServer(app);

  // WebSocket Server on same port, handles upgrade requests
  const wss = new WebSocketServer({ server, path: '/ws' });

  wss.on('connection', (ws, req) => {
    // Extract token from query string
    const url = new URL(req.url || '', `http://localhost:${PORT}`);
    const token = url.searchParams.get('token');

    if (!token) {
      ws.send(JSON.stringify({ type: 'error', message: 'Token required' }));
      ws.close();
      return;
    }

    terminalHandler.handleConnection(ws, token);
  });

  server.listen(PORT, () => {
    console.log('');
    console.log('mulTmux server is ready!');
    console.log(`API: http://localhost:${PORT}/api`);
    console.log(`WebSocket: ws://localhost:${PORT}/ws`);
  });
}

main().catch((error) => {
  console.error('Failed to start server:', error);
  process.exit(1);
});
@@ -0,0 +1,114 @@
import { spawn, ChildProcess } from 'child_process';
import * as pty from 'node-pty';
import { Session } from '../types';
import { nanoid } from 'nanoid';

export class SessionManager {
  private sessions: Map<string, Session> = new Map();
  private terminals: Map<string, pty.IPty> = new Map();

  async createSession(name: string, repoPath?: string): Promise<Session> {
    const id = nanoid(16);
    const tmuxSessionName = `multmux-${id}`;

    const session: Session = {
      id,
      name,
      createdAt: new Date(),
      tmuxSessionName,
      clients: new Set(),
      repoPath,
    };

    this.sessions.set(id, session);

    // Create tmux session
    await this.createTmuxSession(tmuxSessionName, repoPath);

    // Attach to tmux session with pty
    const terminal = pty.spawn('tmux', ['attach-session', '-t', tmuxSessionName], {
      name: 'xterm-256color',
      cols: 80,
      rows: 24,
      cwd: repoPath || process.cwd(),
      env: process.env as { [key: string]: string },
    });

    this.terminals.set(id, terminal);

    return session;
  }

  private async createTmuxSession(name: string, cwd?: string): Promise<void> {
    return new Promise((resolve, reject) => {
      const args = ['new-session', '-d', '-s', name];
      if (cwd) {
        args.push('-c', cwd);
      }

      const proc = spawn('tmux', args);

      proc.on('exit', (code) => {
        if (code === 0) {
          resolve();
        } else {
          reject(new Error(`Failed to create tmux session: exit code ${code}`));
        }
      });
    });
  }

  getSession(id: string): Session | undefined {
    return this.sessions.get(id);
  }

  getTerminal(sessionId: string): pty.IPty | undefined {
    return this.terminals.get(sessionId);
  }

  addClient(sessionId: string, clientId: string): void {
    const session = this.sessions.get(sessionId);
    if (session) {
      session.clients.add(clientId);
    }
  }

  removeClient(sessionId: string, clientId: string): void {
    const session = this.sessions.get(sessionId);
    if (session) {
      session.clients.delete(clientId);

      // Clean up session if no clients left
      if (session.clients.size === 0) {
        this.destroySession(sessionId);
      }
    }
  }

  private async destroySession(sessionId: string): Promise<void> {
    const session = this.sessions.get(sessionId);
    if (!session) return;

    const terminal = this.terminals.get(sessionId);
    if (terminal) {
      terminal.kill();
      this.terminals.delete(sessionId);
    }

    // Kill tmux session
    spawn('tmux', ['kill-session', '-t', session.tmuxSessionName]);

    this.sessions.delete(sessionId);
  }

  listSessions(): Session[] {
    return Array.from(this.sessions.values());
  }

  resizeTerminal(sessionId: string, cols: number, rows: number): void {
    const terminal = this.terminals.get(sessionId);
    if (terminal) {
      terminal.resize(cols, rows);
    }
  }
}
@@ -0,0 +1,50 @@
import { nanoid } from 'nanoid';
import { SessionToken } from '../types';

export class TokenManager {
  private tokens: Map<string, SessionToken> = new Map();

  generateToken(
    sessionId: string,
    expiresInMinutes: number = 60,
    permissions: 'read' | 'write' = 'write'
  ): string {
    const token = nanoid(32);
    const expiresAt = new Date(Date.now() + expiresInMinutes * 60 * 1000);

    this.tokens.set(token, {
      token,
      sessionId,
      expiresAt,
      permissions,
    });

    // Clean up expired token after expiration
    setTimeout(() => this.tokens.delete(token), expiresInMinutes * 60 * 1000);

    return token;
  }

  validateToken(token: string): SessionToken | null {
    const sessionToken = this.tokens.get(token);

    if (!sessionToken) {
      return null;
    }

    if (sessionToken.expiresAt < new Date()) {
      this.tokens.delete(token);
      return null;
    }

    return sessionToken;
  }

  revokeToken(token: string): void {
    this.tokens.delete(token);
  }

  getActiveTokens(): number {
    return this.tokens.size;
  }
}
@@ -0,0 +1,29 @@
export interface Session {
  id: string;
  name: string;
  createdAt: Date;
  tmuxSessionName: string;
  clients: Set<string>;
  repoPath?: string;
}

export interface SessionToken {
  token: string;
  sessionId: string;
  expiresAt: Date;
  permissions: 'read' | 'write';
}

export interface ClientConnection {
  id: string;
  sessionId: string;
  username?: string;
  permissions: 'read' | 'write';
}

export interface TerminalMessage {
  type: 'output' | 'input' | 'resize' | 'join' | 'leave' | 'presence';
  data: any;
  clientId?: string;
  timestamp: number;
}
@@ -0,0 +1,175 @@
import { WebSocket } from 'ws';
import { nanoid } from 'nanoid';
import { SessionManager } from '../managers/SessionManager';
import { TokenManager } from '../managers/TokenManager';
import { TerminalMessage, ClientConnection } from '../types';

export class TerminalHandler {
  private clients: Map<string, { ws: WebSocket; connection: ClientConnection }> = new Map();

  constructor(
    private sessionManager: SessionManager,
    private tokenManager: TokenManager
  ) {}

  handleConnection(ws: WebSocket, token: string): void {
    // Validate token
    const sessionToken = this.tokenManager.validateToken(token);
    if (!sessionToken) {
      ws.send(JSON.stringify({ type: 'error', message: 'Invalid or expired token' }));
      ws.close();
      return;
    }

    // Verify session exists
    const session = this.sessionManager.getSession(sessionToken.sessionId);
    if (!session) {
      ws.send(JSON.stringify({ type: 'error', message: 'Session not found' }));
      ws.close();
      return;
    }

    const clientId = nanoid(16);
    const connection: ClientConnection = {
      id: clientId,
      sessionId: sessionToken.sessionId,
      permissions: sessionToken.permissions,
    };

    this.clients.set(clientId, { ws, connection });
    this.sessionManager.addClient(sessionToken.sessionId, clientId);

    // Attach terminal output to WebSocket
    const terminal = this.sessionManager.getTerminal(sessionToken.sessionId);
    if (terminal) {
      const onData = (data: string) => {
        const message: TerminalMessage = {
          type: 'output',
          data,
          timestamp: Date.now(),
        };
        ws.send(JSON.stringify(message));
      };

      const dataListener = terminal.onData(onData);

      // Clean up on disconnect
      ws.on('close', () => {
        dataListener.dispose();
        this.handleDisconnect(clientId);
      });
    }

    // Send join confirmation
    ws.send(
      JSON.stringify({
        type: 'joined',
        sessionId: session.id,
        sessionName: session.name,
        clientId,
      })
    );

    // Broadcast presence
    this.broadcastToSession(sessionToken.sessionId, {
      type: 'presence',
      data: {
        action: 'join',
        clientId,
        totalClients: session.clients.size,
      },
      timestamp: Date.now(),
    });

    // Handle incoming messages
    ws.on('message', (data) => {
      this.handleMessage(clientId, data.toString());
    });
  }

  private handleMessage(clientId: string, rawMessage: string): void {
    const client = this.clients.get(clientId);
    if (!client) return;

    try {
      const message: TerminalMessage = JSON.parse(rawMessage);

      switch (message.type) {
        case 'input':
          this.handleInput(client.connection, message.data);
          break;
        case 'resize':
          this.handleResize(client.connection, message.data);
          break;
      }
    } catch (error) {
      console.error('Failed to parse message:', error);
    }
  }

  private handleInput(connection: ClientConnection, data: string): void {
    if (connection.permissions !== 'write') {
      return; // Read-only clients can't send input
    }

    const terminal = this.sessionManager.getTerminal(connection.sessionId);
    if (terminal) {
      terminal.write(data);
    }

    // Broadcast input to other clients for cursor tracking
    this.broadcastToSession(
      connection.sessionId,
      {
        type: 'input',
        data,
        clientId: connection.id,
        timestamp: Date.now(),
      },
      connection.id // Exclude sender
    );
  }

  private handleResize(connection: ClientConnection, data: { cols: number; rows: number }): void {
    this.sessionManager.resizeTerminal(connection.sessionId, data.cols, data.rows);
  }

  private handleDisconnect(clientId: string): void {
    const client = this.clients.get(clientId);
    if (!client) return;

    this.sessionManager.removeClient(client.connection.sessionId, clientId);
    this.clients.delete(clientId);

    // Broadcast leave
    const session = this.sessionManager.getSession(client.connection.sessionId);
    if (session) {
      this.broadcastToSession(client.connection.sessionId, {
        type: 'presence',
        data: {
          action: 'leave',
          clientId,
          totalClients: session.clients.size,
        },
        timestamp: Date.now(),
      });
    }
  }

  private broadcastToSession(
    sessionId: string,
    message: TerminalMessage,
    excludeClientId?: string
  ): void {
    const session = this.sessionManager.getSession(sessionId);
    if (!session) return;

    const messageStr = JSON.stringify(message);

    for (const [clientId, client] of this.clients.entries()) {
      if (client.connection.sessionId === sessionId && clientId !== excludeClientId) {
        client.ws.send(messageStr);
      }
    }
  }
}
@@ -0,0 +1,8 @@
{
  "extends": "../../tsconfig.json",
  "compilerOptions": {
    "outDir": "./dist",
    "rootDir": "./src"
  },
  "include": ["src/**/*"]
}
@@ -0,0 +1,18 @@
{
  "compilerOptions": {
    "target": "ES2022",
    "module": "commonjs",
    "lib": ["ES2022"],
    "outDir": "./dist",
    "rootDir": "./src",
    "strict": true,
    "esModuleInterop": true,
    "skipLibCheck": true,
    "forceConsistentCasingInFileNames": true,
    "resolveJsonModule": true,
    "declaration": true,
    "declarationMap": true,
    "sourceMap": true
  },
  "exclude": ["node_modules", "dist"]
}
@@ -0,0 +1,37 @@
server {
    listen 80;
    server_name _;
    root /usr/share/nginx/html;
    index index.html;

    # Gzip compression
    gzip on;
    gzip_vary on;
    gzip_min_length 1024;
    gzip_proxied expired no-cache no-store private auth;
    gzip_types text/plain text/css text/xml text/javascript application/x-javascript application/xml application/javascript application/json;
    gzip_disable "MSIE [1-6]\.";

    # Security headers
    add_header X-Frame-Options "SAMEORIGIN" always;
    add_header X-Content-Type-Options "nosniff" always;
    add_header X-XSS-Protection "1; mode=block" always;
    add_header Referrer-Policy "strict-origin-when-cross-origin" always;

    # Cache static assets
    location ~* \.(js|css|png|jpg|jpeg|gif|ico|svg|woff|woff2|ttf|eot)$ {
        expires 1y;
        add_header Cache-Control "public, immutable";
    }

    # Handle SPA routing - all routes serve index.html
    location / {
        try_files $uri $uri/ /index.html;
    }

    # Health check endpoint
    location /health {
        return 200 'OK';
        add_header Content-Type text/plain;
    }
}
File diff suppressed because it is too large

26  package.json
@@ -3,8 +3,11 @@
   "version": "1.0.0",
   "description": "Jeff Emmett's personal website",
   "type": "module",
+  "workspaces": [
+    "multmux/packages/*"
+  ],
   "scripts": {
-    "dev": "concurrently --kill-others --names client,worker --prefix-colors blue,red \"npm run dev:client\" \"npm run dev:worker:local\"",
+    "dev": "concurrently --kill-others --names client,worker,multmux --prefix-colors blue,red,magenta \"npm run dev:client\" \"npm run dev:worker:local\" \"npm run multmux:dev:server\"",
     "dev:client": "vite --host 0.0.0.0 --port 5173",
     "dev:worker": "wrangler dev --config wrangler.dev.toml --remote --port 5172",
     "dev:worker:local": "wrangler dev --config wrangler.dev.toml --port 5172 --ip 0.0.0.0",
@@ -15,7 +18,12 @@
     "deploy:pages": "tsc && vite build",
     "deploy:worker": "wrangler deploy",
     "deploy:worker:dev": "wrangler deploy --config wrangler.dev.toml",
-    "types": "tsc --noEmit"
+    "types": "tsc --noEmit",
+    "multmux:install": "npm install --workspaces",
+    "multmux:build": "npm run build --workspace=@multmux/server --workspace=@multmux/cli",
+    "multmux:dev:server": "npm run dev --workspace=@multmux/server",
+    "multmux:dev:cli": "npm run dev --workspace=@multmux/cli",
+    "multmux:start": "npm run start --workspace=@multmux/server"
   },
   "keywords": [],
   "author": "Jeff Emmett",
@@ -29,15 +37,16 @@
     "@chengsokdara/use-whisper": "^0.2.0",
     "@daily-co/daily-js": "^0.60.0",
     "@daily-co/daily-react": "^0.20.0",
-    "@oddjs/odd": "^0.37.2",
+    "@mdxeditor/editor": "^3.51.0",
     "@tldraw/assets": "^3.15.4",
     "@tldraw/tldraw": "^3.15.4",
     "@tldraw/tlschema": "^3.15.4",
     "@types/markdown-it": "^14.1.1",
     "@types/marked": "^5.0.2",
     "@uiw/react-md-editor": "^4.0.5",
-    "@vercel/analytics": "^1.2.2",
     "@xenova/transformers": "^2.17.2",
+    "@xterm/addon-fit": "^0.10.0",
+    "@xterm/xterm": "^5.5.0",
     "ai": "^4.1.0",
     "ajv": "^8.17.1",
     "cherry-markdown": "^0.8.57",
@@ -62,9 +71,9 @@
     "react-markdown": "^10.1.0",
     "react-router-dom": "^7.0.2",
     "recoil": "^0.7.7",
+    "sharp": "^0.33.5",
     "tldraw": "^3.15.4",
     "use-whisper": "^0.0.1",
-    "vercel": "^39.1.1",
     "webcola": "^3.4.0",
     "webnative": "^0.36.3"
   },
@@ -84,6 +93,11 @@
     "wrangler": "^4.33.2"
   },
   "engines": {
-    "node": ">=18.0.0"
+    "node": ">=20.0.0"
+  },
+  "overrides": {
+    "@xenova/transformers": {
+      "sharp": "0.33.5"
+    }
   }
 }
@@ -0,0 +1,63 @@
# ComfyUI Model Paths Configuration
# Updated to include /runpod-volume/ paths for all model types
# This allows models to be loaded from the network volume for faster cold starts

comfyui:
  base_path: /ComfyUI/
  is_default: true

  # Checkpoints - check network volume first, then local
  checkpoints: |
    /runpod-volume/models/checkpoints/
    models/checkpoints/

  # CLIP models
  clip: |
    /runpod-volume/models/clip/
    models/clip/

  # CLIP Vision models (e.g., clip_vision_h.safetensors)
  clip_vision: |
    /runpod-volume/models/clip_vision/
    models/clip_vision/

  # Config files
  configs: models/configs/

  # ControlNet models
  controlnet: |
    /runpod-volume/models/controlnet/
    models/controlnet/

  # Diffusion models (Wan2.2 model files)
  diffusion_models: |
    /runpod-volume/models/diffusion_models/
    /runpod-volume/models/
    models/diffusion_models/
    models/unet/

  # Text embeddings
  embeddings: |
    /runpod-volume/models/embeddings/
    models/embeddings/

  # LoRA models
  loras: |
    /runpod-volume/loras/
    /runpod-volume/models/loras/
    models/loras/

  # Text encoders (e.g., umt5-xxl-enc-bf16.safetensors)
  text_encoders: |
    /runpod-volume/models/text_encoders/
    models/text_encoders/

  # Upscale models
  upscale_models: |
    /runpod-volume/models/upscale_models/
    models/upscale_models/

  # VAE models (e.g., Wan2_1_VAE_bf16.safetensors)
  vae: |
    /runpod-volume/models/vae/
    models/vae/
@ -0,0 +1,143 @@
|
||||||
|
#!/bin/bash
|
||||||
|
# Script to set up the RunPod network volume with Wan2.2 models
|
||||||
|
# Run this once on a GPU pod with the network volume attached
|
||||||
|
|
||||||
|
echo "=== Setting up RunPod Network Volume for Wan2.2 ==="

# Create directory structure
echo "Creating directory structure..."
mkdir -p /runpod-volume/models/diffusion_models
mkdir -p /runpod-volume/models/vae
mkdir -p /runpod-volume/models/text_encoders
mkdir -p /runpod-volume/models/clip_vision
mkdir -p /runpod-volume/loras

# Check current disk usage
echo "Current network volume usage:"
df -h /runpod-volume

# List what's already on the volume
echo ""
echo "Current contents of /runpod-volume:"
ls -la /runpod-volume/

echo ""
echo "Current contents of /runpod-volume/models/ (if exists):"
ls -la /runpod-volume/models/ 2>/dev/null || echo "(empty or doesn't exist)"

# Check if models exist in the Docker image
echo ""
echo "Models in Docker image /ComfyUI/models/diffusion_models/:"
ls -la /ComfyUI/models/diffusion_models/ 2>/dev/null || echo "(not found)"

echo ""
echo "Models in Docker image /ComfyUI/models/vae/:"
ls -la /ComfyUI/models/vae/ 2>/dev/null || echo "(not found)"

echo ""
echo "Models in Docker image /ComfyUI/models/text_encoders/:"
ls -la /ComfyUI/models/text_encoders/ 2>/dev/null || echo "(not found)"

echo ""
echo "Models in Docker image /ComfyUI/models/clip_vision/:"
ls -la /ComfyUI/models/clip_vision/ 2>/dev/null || echo "(not found)"

echo ""
echo "Models in Docker image /ComfyUI/models/loras/:"
ls -la /ComfyUI/models/loras/ 2>/dev/null || echo "(not found)"

# Copy models to network volume (if not already there)
echo ""
echo "=== Copying models to network volume ==="

# Diffusion models
if [ -d "/ComfyUI/models/diffusion_models" ]; then
    echo "Copying diffusion models..."
    cp -vn /ComfyUI/models/diffusion_models/*.safetensors /runpod-volume/models/diffusion_models/ 2>/dev/null || true
fi

# VAE models
if [ -d "/ComfyUI/models/vae" ]; then
    echo "Copying VAE models..."
    cp -vn /ComfyUI/models/vae/*.safetensors /runpod-volume/models/vae/ 2>/dev/null || true
fi

# Text encoders
if [ -d "/ComfyUI/models/text_encoders" ]; then
    echo "Copying text encoder models..."
    cp -vn /ComfyUI/models/text_encoders/*.safetensors /runpod-volume/models/text_encoders/ 2>/dev/null || true
fi

# CLIP vision
if [ -d "/ComfyUI/models/clip_vision" ]; then
    echo "Copying CLIP vision models..."
    cp -vn /ComfyUI/models/clip_vision/*.safetensors /runpod-volume/models/clip_vision/ 2>/dev/null || true
fi

# LoRAs
if [ -d "/ComfyUI/models/loras" ]; then
    echo "Copying LoRA models..."
    cp -vn /ComfyUI/models/loras/*.safetensors /runpod-volume/loras/ 2>/dev/null || true
fi
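The per-category blocks above all repeat the same copy-if-absent pattern via `cp -vn`. A generic helper in the same spirit (the `sync_models` name is ours, not part of the script) makes that idempotence explicit and testable:

```shell
# Hypothetical helper mirroring the cp -vn calls above: copy *.safetensors
# from src to dst, skipping any file that already exists at the destination.
sync_models() {
  local src="$1" dst="$2" f name
  mkdir -p "$dst"
  for f in "$src"/*.safetensors; do
    [ -e "$f" ] || continue            # glob matched nothing
    name=$(basename "$f")
    # Like cp -n: never overwrite a file already on the volume
    if [ ! -e "$dst/$name" ]; then
      cp "$f" "$dst/$name"
    fi
  done
}
```

Unlike a bare `cp -n`, the explicit existence check keeps the skip visible and the exit status clean under `set -e`.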
# Copy extra_model_paths.yaml to volume
echo ""
echo "Copying extra_model_paths.yaml to network volume..."
cat > /runpod-volume/extra_model_paths.yaml << 'EOF'
# ComfyUI Model Paths Configuration - Network Volume Priority
comfyui:
    base_path: /ComfyUI/
    is_default: true
    checkpoints: |
        /runpod-volume/models/checkpoints/
        models/checkpoints/
    clip: |
        /runpod-volume/models/clip/
        models/clip/
    clip_vision: |
        /runpod-volume/models/clip_vision/
        models/clip_vision/
    configs: models/configs/
    controlnet: |
        /runpod-volume/models/controlnet/
        models/controlnet/
    diffusion_models: |
        /runpod-volume/models/diffusion_models/
        /runpod-volume/models/
        models/diffusion_models/
        models/unet/
    embeddings: |
        /runpod-volume/models/embeddings/
        models/embeddings/
    loras: |
        /runpod-volume/loras/
        /runpod-volume/models/loras/
        models/loras/
    text_encoders: |
        /runpod-volume/models/text_encoders/
        models/text_encoders/
    upscale_models: |
        /runpod-volume/models/upscale_models/
        models/upscale_models/
    vae: |
        /runpod-volume/models/vae/
        models/vae/
EOF

echo ""
echo "=== Final network volume contents ==="
echo ""
echo "/runpod-volume/models/:"
du -sh /runpod-volume/models/*/ 2>/dev/null || echo "(empty)"
echo ""
echo "/runpod-volume/loras/:"
ls -la /runpod-volume/loras/ 2>/dev/null || echo "(empty)"

echo ""
echo "Total network volume usage:"
du -sh /runpod-volume/

echo ""
echo "=== Setup complete! ==="
echo "Models have been copied to the network volume."
echo "On subsequent cold starts, models will load from /runpod-volume/ (faster)."
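Each category in the heredoc lists the network-volume directory first, so ComfyUI searches `/runpod-volume` before falling back to the image-local `models/` paths. A small sanity check along these lines (the `check_model_paths` helper and its grep pattern are our own, not part of the script) can flag first-priority directories that do not exist yet:

```shell
# Hypothetical sanity check: extract every absolute path under a given
# prefix from the YAML and report any that are missing on disk.
check_model_paths() {
  local config="$1" prefix="$2" dir missing=0
  for dir in $(grep -oE "${prefix}[A-Za-z0-9_/.-]*" "$config" | sort -u); do
    if [ ! -d "$dir" ]; then
      echo "MISSING: $dir"
      missing=1
    fi
  done
  return "$missing"
}
```

Run as e.g. `check_model_paths /runpod-volume/extra_model_paths.yaml /runpod-volume` after the heredoc is written; a non-zero exit means at least one configured directory is absent.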
@ -0,0 +1,249 @@
#!/bin/bash
#
# Worktree Manager - Helper script for managing Git worktrees
#

set -e

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
REPO_ROOT="$(cd "$SCRIPT_DIR/.." && pwd)"
REPO_NAME=$(basename "$REPO_ROOT")
WORKTREE_BASE=$(dirname "$REPO_ROOT")

# Colors for output
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
BLUE='\033[0;34m'
CYAN='\033[0;36m'
NC='\033[0m' # No Color

show_help() {
    cat << EOF
${CYAN}━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━${NC}
${GREEN}Worktree Manager${NC} - Manage Git worktrees easily
${CYAN}━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━${NC}

${YELLOW}Usage:${NC}
  ./worktree-manager.sh <command> [arguments]

${YELLOW}Commands:${NC}
  ${GREEN}list${NC}              List all worktrees
  ${GREEN}create${NC} <branch>   Create a new worktree for a branch
  ${GREEN}remove${NC} <branch>   Remove a worktree
  ${GREEN}clean${NC}             Remove all worktrees except main
  ${GREEN}goto${NC} <branch>     Print command to cd to worktree
  ${GREEN}status${NC}            Show status of all worktrees
  ${GREEN}help${NC}              Show this help message

${YELLOW}Examples:${NC}
  ./worktree-manager.sh create feature/new-feature
  ./worktree-manager.sh list
  ./worktree-manager.sh remove feature/old-feature
  ./worktree-manager.sh clean
  cd \$(./worktree-manager.sh goto feature/new-feature)

${YELLOW}Automatic Worktrees:${NC}
  A Git hook is installed that automatically creates worktrees
  when you run: ${CYAN}git checkout -b new-branch${NC}

${CYAN}━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━${NC}
EOF
}

list_worktrees() {
    echo -e "${CYAN}━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━${NC}"
    echo -e "${GREEN}Git Worktrees:${NC}"
    echo -e "${CYAN}━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━${NC}"

    cd "$REPO_ROOT"
    git worktree list --porcelain | awk '
        /^worktree/ { path=$2 }
        /^HEAD/ { head=$2 }
        /^branch/ {
            branch=$2
            gsub(/^refs\/heads\//, "", branch)
            printf "%-40s %s\n", branch, path
        }
        /^detached/ {
            printf "%-40s %s (detached)\n", head, path
        }
    ' | while IFS= read -r line; do
        if [[ $line == *"(detached)"* ]]; then
            echo -e "${YELLOW}  $line${NC}"
        else
            echo -e "${GREEN}  $line${NC}"
        fi
    done

    echo -e "${CYAN}━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━${NC}"
}

create_worktree() {
    local branch=$1

    if [ -z "$branch" ]; then
        echo -e "${RED}Error: Branch name required${NC}"
        echo "Usage: $0 create <branch-name>"
        exit 1
    fi

    local worktree_path="${WORKTREE_BASE}/${REPO_NAME}-${branch}"

    echo -e "${CYAN}━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━${NC}"
    echo -e "${GREEN}Creating worktree for branch: ${YELLOW}$branch${NC}"
    echo -e "${BLUE}Location: ${YELLOW}$worktree_path${NC}"
    echo -e "${CYAN}━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━${NC}"

    cd "$REPO_ROOT"

    # Check if branch exists
    if git show-ref --verify --quiet "refs/heads/$branch"; then
        # Branch exists, just create worktree
        git worktree add "$worktree_path" "$branch"
    else
        # Branch doesn't exist, create it
        echo -e "${YELLOW}Branch doesn't exist, creating new branch...${NC}"
        git worktree add -b "$branch" "$worktree_path"
    fi

    echo -e "${GREEN}✅ Worktree created successfully!${NC}"
    echo -e ""
    echo -e "To switch to the worktree:"
    echo -e "  ${CYAN}cd $worktree_path${NC}"
    echo -e "${CYAN}━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━${NC}"
}

remove_worktree() {
    local branch=$1

    if [ -z "$branch" ]; then
        echo -e "${RED}Error: Branch name required${NC}"
        echo "Usage: $0 remove <branch-name>"
        exit 1
    fi

    cd "$REPO_ROOT"

    # git worktree remove expects the worktree's path, not the branch name,
    # so resolve the path using the same convention create_worktree uses
    local worktree_path="${WORKTREE_BASE}/${REPO_NAME}-${branch}"

    echo -e "${CYAN}━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━${NC}"
    echo -e "${YELLOW}Removing worktree for branch: $branch${NC}"
    echo -e "${CYAN}━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━${NC}"

    git worktree remove "$worktree_path" --force

    echo -e "${GREEN}✅ Worktree removed successfully!${NC}"
    echo -e "${CYAN}━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━${NC}"
}

clean_worktrees() {
    cd "$REPO_ROOT"

    echo -e "${CYAN}━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━${NC}"
    echo -e "${YELLOW}Cleaning up worktrees (keeping main/master)...${NC}"
    echo -e "${CYAN}━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━${NC}"

    # Get branch/path pairs for all worktrees, excluding main/master and the
    # primary checkout (git worktree remove takes a path, not a branch name)
    git worktree list --porcelain | awk '
        /^worktree/ { path=$2 }
        /^branch/ {
            branch=$2
            gsub(/^refs\/heads\//, "", branch)
            printf "%s\t%s\n", branch, path
        }
    ' | while IFS=$'\t' read -r branch path; do
        if [[ "$branch" != "main" ]] && [[ "$branch" != "master" ]] && [[ "$path" != "$REPO_ROOT" ]]; then
            echo -e "${YELLOW}Removing: $branch${NC}"
            git worktree remove "$path" --force 2>/dev/null || echo -e "${RED}  Failed to remove $branch${NC}"
        fi
    done

    # Prune deleted worktrees
    git worktree prune

    echo -e "${GREEN}✅ Cleanup complete!${NC}"
    echo -e "${CYAN}━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━${NC}"
}

goto_worktree() {
    local branch=$1

    if [ -z "$branch" ]; then
        echo -e "${RED}Error: Branch name required${NC}" >&2
        exit 1
    fi

    cd "$REPO_ROOT"

    # Find worktree path for branch
    local worktree_path=$(git worktree list --porcelain | awk -v branch="$branch" '
        /^worktree/ { path=$2 }
        /^branch/ {
            b=$2
            gsub(/^refs\/heads\//, "", b)
            if (b == branch) {
                print path
                exit
            }
        }
    ')

    if [ -n "$worktree_path" ]; then
        echo "$worktree_path"
    else
        echo -e "${RED}Error: No worktree found for branch '$branch'${NC}" >&2
        exit 1
    fi
}

show_status() {
    cd "$REPO_ROOT"

    echo -e "${CYAN}━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━${NC}"
    echo -e "${GREEN}Worktree Status:${NC}"
    echo -e "${CYAN}━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━${NC}"

    git worktree list --porcelain | awk '
        /^worktree/ { path=$2 }
        /^branch/ {
            branch=$2
            gsub(/^refs\/heads\//, "", branch)
            printf "\n%s%s%s\n", "Branch: ", branch, ""
            printf "%s%s%s\n", "Path:   ", path, ""
            system("cd " path " && git status --short --branch | head -5")
        }
    ' | while IFS= read -r line; do
        if [[ $line == Branch:* ]]; then
            echo -e "${GREEN}$line${NC}"
        elif [[ $line == Path:* ]]; then
            echo -e "${BLUE}$line${NC}"
        else
            echo -e "${YELLOW}$line${NC}"
        fi
    done

    echo -e "${CYAN}━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━${NC}"
}

# Main command dispatcher
case "${1:-help}" in
    list|ls)
        list_worktrees
        ;;
    create|add)
        create_worktree "$2"
        ;;
    remove|rm|delete)
        remove_worktree "$2"
        ;;
    clean|cleanup)
        clean_worktrees
        ;;
    goto|cd)
        goto_worktree "$2"
        ;;
    status|st)
        show_status
        ;;
    help|--help|-h)
        show_help
        ;;
    *)
        echo -e "${RED}Unknown command: $1${NC}"
        echo ""
        show_help
        exit 1
        ;;
esac
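Both `list_worktrees` and `goto_worktree` parse `git worktree list --porcelain`, which prints a `worktree <path>` line followed by `HEAD <sha>` and `branch refs/heads/<name>` lines per worktree. The branch-to-path pairing can be exercised on canned porcelain output (the repository paths below are made up for illustration):

```shell
# The same awk pairing used in the script, run against hypothetical
# porcelain output instead of a live repository.
branch_paths() {
  awk '
    /^worktree/ { path=$2 }
    /^branch/ {
      branch=$2
      gsub(/^refs\/heads\//, "", branch)
      printf "%s\t%s\n", branch, path
    }
  '
}

printf 'worktree /repos/app\nHEAD 1111111\nbranch refs/heads/main\n\nworktree /repos/app-feature-x\nHEAD 2222222\nbranch refs/heads/feature/x\n\n' | branch_paths
```

Detached-HEAD and bare worktrees emit no `branch` line, which is exactly why `goto_worktree` can fall through to its "No worktree found" error.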
85 src/App.tsx
@ -1,27 +1,20 @@
-import "tldraw/tldraw.css"
-import "@/css/style.css"
-import { Default } from "@/routes/Default"
-import { BrowserRouter, Route, Routes, Navigate } from "react-router-dom"
-import { Contact } from "@/routes/Contact"
-import { Board } from "./routes/Board"
-import { Inbox } from "./routes/Inbox"
-import { Presentations } from "./routes/Presentations"
-import { Resilience } from "./routes/Resilience"
-import { inject } from "@vercel/analytics"
-import { createRoot } from "react-dom/client"
-import { DailyProvider } from "@daily-co/daily-react"
-import Daily from "@daily-co/daily-js"
+import "tldraw/tldraw.css";
+import "@/css/style.css";
+import "@/css/auth.css"; // Import auth styles
+import "@/css/crypto-auth.css"; // Import crypto auth styles
+import "@/css/starred-boards.css"; // Import starred boards styles
+import "@/css/user-profile.css"; // Import user profile styles
+import "@/css/location.css"; // Import location sharing styles
+import { Default } from "@/routes/Default";
+import { BrowserRouter, Route, Routes, Navigate, useParams } from "react-router-dom";
+import { Contact } from "@/routes/Contact";
+import { Board } from "./routes/Board";
+import { Inbox } from "./routes/Inbox";
+import { Presentations } from "./routes/Presentations";
+import { Resilience } from "./routes/Resilience";
+import { Dashboard } from "./routes/Dashboard";
+import { LocationShareCreate } from "./routes/LocationShareCreate";
+import { createRoot } from "react-dom/client";
+import { LocationShareView } from "./routes/LocationShareView";
+import { DailyProvider } from "@daily-co/daily-react";
+import { LocationDashboardRoute } from "./routes/LocationDashboardRoute";
+import Daily from "@daily-co/daily-js";
+import { useState, useEffect } from 'react';
 
 // Import React Context providers
@ -32,14 +25,12 @@ import NotificationsDisplay from './components/NotificationsDisplay';
 import { ErrorBoundary } from './components/ErrorBoundary';
 
 // Import auth components
-import CryptoLogin from './components/auth/CryptoLogin';
+import CryptID from './components/auth/CryptID';
 import CryptoDebug from './components/auth/CryptoDebug';
 
 // Import Google Data test component
 import { GoogleDataTest } from './components/GoogleDataTest';
 
-inject();
 
 // Initialize Daily.co call object with error handling
 let callObject: any = null;
 try {
@ -77,6 +68,14 @@ const OptionalAuthRoute = ({ children }: { children: React.ReactNode }) => {
   return <>{children}</>;
 };
 
+/**
+ * Component to redirect board URLs without trailing slashes
+ */
+const RedirectBoardSlug = () => {
+  const { slug } = useParams<{ slug: string }>();
+  return <Navigate to={`/board/${slug}/`} replace />;
+};
+
 /**
  * Main App with context providers
  */
@ -95,7 +94,7 @@ const AppWithProviders = () => {
 
   return (
     <div className="auth-page">
-      <CryptoLogin onSuccess={() => window.location.href = '/'} />
+      <CryptID onSuccess={() => window.location.href = '/'} />
     </div>
   );
 };
@ -111,66 +110,60 @@ const AppWithProviders = () => {
       <NotificationsDisplay />
 
       <Routes>
+        {/* Redirect routes without trailing slashes to include them */}
+        <Route path="/login" element={<Navigate to="/login/" replace />} />
+        <Route path="/contact" element={<Navigate to="/contact/" replace />} />
+        <Route path="/board/:slug" element={<RedirectBoardSlug />} />
+        <Route path="/inbox" element={<Navigate to="/inbox/" replace />} />
+        <Route path="/debug" element={<Navigate to="/debug/" replace />} />
+        <Route path="/dashboard" element={<Navigate to="/dashboard/" replace />} />
+        <Route path="/presentations" element={<Navigate to="/presentations/" replace />} />
+        <Route path="/presentations/resilience" element={<Navigate to="/presentations/resilience/" replace />} />
+
         {/* Auth routes */}
-        <Route path="/login" element={<AuthPage />} />
+        <Route path="/login/" element={<AuthPage />} />
 
         {/* Optional auth routes */}
         <Route path="/" element={
           <OptionalAuthRoute>
             <Default />
           </OptionalAuthRoute>
         } />
-        <Route path="/contact" element={
+        <Route path="/contact/" element={
           <OptionalAuthRoute>
             <Contact />
           </OptionalAuthRoute>
         } />
-        <Route path="/board/:slug" element={
+        <Route path="/board/:slug/" element={
           <OptionalAuthRoute>
             <Board />
           </OptionalAuthRoute>
         } />
-        <Route path="/inbox" element={
+        <Route path="/inbox/" element={
           <OptionalAuthRoute>
             <Inbox />
           </OptionalAuthRoute>
         } />
-        <Route path="/debug" element={
+        <Route path="/debug/" element={
           <OptionalAuthRoute>
             <CryptoDebug />
           </OptionalAuthRoute>
         } />
-        <Route path="/dashboard" element={
+        <Route path="/dashboard/" element={
           <OptionalAuthRoute>
             <Dashboard />
           </OptionalAuthRoute>
         } />
-        <Route path="/presentations" element={
+        <Route path="/presentations/" element={
           <OptionalAuthRoute>
             <Presentations />
           </OptionalAuthRoute>
         } />
-        <Route path="/presentations/resilience" element={
+        <Route path="/presentations/resilience/" element={
           <OptionalAuthRoute>
             <Resilience />
           </OptionalAuthRoute>
         } />
+        {/* Location sharing routes */}
+        <Route path="/share-location" element={
+          <OptionalAuthRoute>
+            <LocationShareCreate />
+          </OptionalAuthRoute>
+        } />
+        <Route path="/location/:token" element={
+          <OptionalAuthRoute>
+            <LocationShareView />
+          </OptionalAuthRoute>
+        } />
+        <Route path="/location-dashboard" element={
+          <OptionalAuthRoute>
+            <LocationDashboardRoute />
+          </OptionalAuthRoute>
+        } />
         {/* Google Data routes */}
         <Route path="/google" element={<GoogleDataTest />} />
         <Route path="/oauth/google/callback" element={<GoogleDataTest />} />
18 src/CmdK.tsx
@ -66,11 +66,19 @@ export const CmdK = () => {
   )
 
   const selected = editor.getSelectedShapeIds()
-  const inView = editor
-    .getShapesAtPoint(editor.getViewportPageBounds().center, {
-      margin: 1200,
-    })
-    .map((o) => o.id)
+  let inView: TLShapeId[] = []
+  try {
+    inView = editor
+      .getShapesAtPoint(editor.getViewportPageBounds().center, {
+        margin: 1200,
+      })
+      .map((o) => o.id)
+  } catch (e) {
+    // Some shapes may have invalid geometry (e.g., zero-length arrows)
+    // Fall back to getting all shapes on the current page
+    console.warn('getShapesAtPoint failed, falling back to all page shapes:', e)
+    inView = editor.getCurrentPageShapeIds() as unknown as TLShapeId[]
+  }
 
   return new Map([
     ...nameToShapeIdMap,
@ -1,9 +1,57 @@
|
||||||
import { TLRecord, RecordId, TLStore } from "@tldraw/tldraw"
|
import { TLRecord, RecordId, TLStore, IndexKey } from "@tldraw/tldraw"
|
||||||
import * as Automerge from "@automerge/automerge"
|
import * as Automerge from "@automerge/automerge"
|
||||||
|
|
||||||
|
// Helper function to validate if a string is a valid tldraw IndexKey
|
||||||
|
// tldraw uses fractional indexing based on https://observablehq.com/@dgreensp/implementing-fractional-indexing
|
||||||
|
// The first letter encodes integer part length: a=1 digit, b=2 digits, c=3 digits, etc.
|
||||||
|
// Examples: "a0"-"a9", "b10"-"b99", "c100"-"c999", with optional fraction "a1V4rr"
|
||||||
|
// Invalid: "b1" (b expects 2 digits but has 1), simple sequential numbers
|
||||||
|
function isValidIndexKey(index: string): boolean {
|
||||||
|
if (!index || typeof index !== 'string' || index.length === 0) {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
// Must start with a letter
|
||||||
|
if (!/^[a-zA-Z]/.test(index)) {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
const prefix = index[0]
|
||||||
|
const rest = index.slice(1)
|
||||||
|
|
||||||
|
// For lowercase prefixes, validate digit count matches the prefix
|
||||||
|
if (prefix >= 'a' && prefix <= 'z') {
|
||||||
|
// Calculate expected minimum digit count: a=1, b=2, c=3, etc.
|
||||||
|
const expectedDigits = prefix.charCodeAt(0) - 'a'.charCodeAt(0) + 1
|
||||||
|
|
||||||
|
// Extract the integer part (leading digits)
|
||||||
|
const integerMatch = rest.match(/^(\d+)/)
|
||||||
|
if (!integerMatch) {
|
||||||
|
// No digits at all - invalid
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
const integerPart = integerMatch[1]
|
||||||
|
|
||||||
|
// Check if integer part has correct number of digits for the prefix
|
||||||
|
if (integerPart.length < expectedDigits) {
|
||||||
|
// Invalid: "b1" has b (expects 2 digits) but only has 1 digit
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Check overall format: letter followed by alphanumeric
|
||||||
|
if (/^[a-zA-Z][a-zA-Z0-9]+$/.test(index)) {
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
export function applyAutomergePatchesToTLStore(
|
export function applyAutomergePatchesToTLStore(
|
||||||
patches: Automerge.Patch[],
|
patches: Automerge.Patch[],
|
||||||
store: TLStore
|
store: TLStore,
|
||||||
|
automergeDoc?: any // Optional Automerge document to read full records from
|
||||||
) {
|
) {
|
||||||
const toRemove: TLRecord["id"][] = []
|
const toRemove: TLRecord["id"][] = []
|
||||||
const updatedObjects: { [id: string]: TLRecord } = {}
|
const updatedObjects: { [id: string]: TLRecord } = {}
|
||||||
|
|
@ -27,10 +75,11 @@ export function applyAutomergePatchesToTLStore(
 
     const existingRecord = getRecordFromStore(store, id)
 
-    // CRITICAL: For shapes, get coordinates from store's current state BEFORE any patch processing
-    // This ensures we preserve coordinates even if patches don't include them
+    // CRITICAL: For shapes, get coordinates and parentId from store's current state BEFORE any patch processing
+    // This ensures we preserve coordinates and parent relationships even if patches don't include them
     // This is especially important when patches come back after store.put operations
     let storeCoordinates: { x?: number; y?: number } = {}
+    let storeParentId: string | undefined = undefined
     if (existingRecord && existingRecord.typeName === 'shape') {
       const storeX = (existingRecord as any).x
       const storeY = (existingRecord as any).y
@ -40,6 +89,39 @@ export function applyAutomergePatchesToTLStore(
       if (typeof storeY === 'number' && !isNaN(storeY) && storeY !== null && storeY !== undefined) {
         storeCoordinates.y = storeY
       }
+      // CRITICAL: Preserve parentId from store (might be a frame or group!)
+      const existingParentId = (existingRecord as any).parentId
+      if (existingParentId && typeof existingParentId === 'string') {
+        storeParentId = existingParentId
+      }
     }
 
+    // CRITICAL: If record doesn't exist in store yet, try to get it from Automerge document
+    // This prevents coordinates from defaulting to 0,0 when patches create new records
+    let automergeRecord: any = null
+    let automergeParentId: string | undefined = undefined
+    if (!existingRecord && automergeDoc && automergeDoc.store && automergeDoc.store[id]) {
+      try {
+        automergeRecord = automergeDoc.store[id]
+        // Extract coordinates and parentId from Automerge record if it's a shape
+        if (automergeRecord && automergeRecord.typeName === 'shape') {
+          const docX = automergeRecord.x
+          const docY = automergeRecord.y
+          if (typeof docX === 'number' && !isNaN(docX) && docX !== null && docX !== undefined) {
+            storeCoordinates.x = docX
+          }
+          if (typeof docY === 'number' && !isNaN(docY) && docY !== null && docY !== undefined) {
+            storeCoordinates.y = docY
+          }
+          // CRITICAL: Preserve parentId from Automerge document (might be a frame!)
+          if (automergeRecord.parentId && typeof automergeRecord.parentId === 'string') {
+            automergeParentId = automergeRecord.parentId
+          }
+        }
+      } catch (e) {
+        // If we can't read from Automerge doc, continue without it
+        console.warn(`Could not read record ${id} from Automerge document:`, e)
+      }
+    }
 
     // Infer typeName from ID pattern if record doesn't exist
@ -112,7 +194,20 @@ export function applyAutomergePatchesToTLStore(
       }
     }
 
-    let record = updatedObjects[id] || (existingRecord ? JSON.parse(JSON.stringify(existingRecord)) : defaultRecord)
+    // CRITICAL: When creating a new record, prefer using the full record from Automerge document
+    // This ensures we get all properties including coordinates, not just defaults
+    let record: any
+    if (updatedObjects[id]) {
+      record = updatedObjects[id]
+    } else if (existingRecord) {
+      record = JSON.parse(JSON.stringify(existingRecord))
+    } else if (automergeRecord) {
+      // Use the full record from Automerge document - this has all properties including coordinates
+      record = JSON.parse(JSON.stringify(automergeRecord))
+    } else {
+      // Fallback to default record only if we can't get it from anywhere else
+      record = defaultRecord
+    }
 
     // CRITICAL: For shapes, ensure x and y are always present (even if record came from updatedObjects)
     // This prevents coordinates from being lost when records are created from patches
@ -157,6 +252,28 @@ export function applyAutomergePatchesToTLStore(
     const originalX = storeCoordinates.x !== undefined ? storeCoordinates.x : recordX
     const originalY = storeCoordinates.y !== undefined ? storeCoordinates.y : recordY
     const hadOriginalCoordinates = originalX !== undefined && originalY !== undefined
 
+    // CRITICAL: Store original richText and arrow text before patch application to preserve them
+    // This ensures richText and arrow text aren't lost when patches only update other properties
+    let originalRichText: any = undefined
+    let originalArrowText: any = undefined
+    if (record.typeName === 'shape') {
+      // Get richText from store's current state (most reliable)
+      if (existingRecord && (existingRecord as any).props && (existingRecord as any).props.richText) {
+        originalRichText = (existingRecord as any).props.richText
+      } else if ((record as any).props && (record as any).props.richText) {
+        originalRichText = (record as any).props.richText
+      }
+
+      // Get arrow text from store's current state (most reliable)
+      if ((record as any).type === 'arrow') {
+        if (existingRecord && (existingRecord as any).props && (existingRecord as any).props.text !== undefined) {
+          originalArrowText = (existingRecord as any).props.text
+        } else if ((record as any).props && (record as any).props.text !== undefined) {
+          originalArrowText = (record as any).props.text
+        }
+      }
+    }
 
     switch (patch.action) {
       case "insert": {
@@ -229,6 +346,58 @@ export function applyAutomergePatchesToTLStore(
           updatedObjects[id] = { ...updatedObjects[id], y: defaultRecord.y || 0 } as TLRecord
         }
       }
+
+      // CRITICAL: Preserve richText and arrow text after patch application
+      // This prevents richText and arrow text from being lost when patches only update other properties
+      const currentRecord = updatedObjects[id]
+
+      // Preserve richText for geo/note/text shapes
+      if (originalRichText !== undefined && (currentRecord as any).type !== 'arrow') {
+        const patchedProps = (currentRecord as any).props || {}
+        const patchedRichText = patchedProps.richText
+        // If patch didn't include richText, preserve the original
+        if (patchedRichText === undefined || patchedRichText === null) {
+          updatedObjects[id] = {
+            ...currentRecord,
+            props: {
+              ...patchedProps,
+              richText: originalRichText
+            }
+          } as TLRecord
+        }
+      }
+
+      // Preserve arrow text for arrow shapes
+      if (originalArrowText !== undefined && (currentRecord as any).type === 'arrow') {
+        const patchedProps = (currentRecord as any).props || {}
+        const patchedText = patchedProps.text
+        // If patch didn't include text, preserve the original
+        if (patchedText === undefined || patchedText === null) {
+          updatedObjects[id] = {
+            ...currentRecord,
+            props: {
+              ...patchedProps,
+              text: originalArrowText
+            }
+          } as TLRecord
+        }
+      }
+
+      // CRITICAL: Preserve parentId from store or Automerge document
+      // This prevents shapes from losing their frame/group parent relationships,
+      // which causes them to reset to (0,0) on the page instead of maintaining their position in the frame
+      // Priority: store parentId (most reliable), then Automerge parentId, then patch value
+      const preservedParentId = storeParentId || automergeParentId
+      if (preservedParentId !== undefined) {
+        const patchedParentId = (currentRecord as any).parentId
+        // If patch didn't include parentId, or it's missing/default, use the preserved parentId
+        if (!patchedParentId || (patchedParentId === 'page:page' && preservedParentId !== 'page:page')) {
+          updatedObjects[id] = {
+            ...currentRecord,
+            parentId: preservedParentId
+          } as TLRecord
+        }
+      }
     }
 
     // CRITICAL: Re-check typeName after patch application to ensure it's still correct
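The preserve-on-partial-patch rule in the hunk above can be exercised in isolation. A minimal sketch, assuming plain objects stand in for `TLRecord` (the `preserveRichText` helper name is illustrative and not part of the codebase):

```typescript
type ShapeLike = { type: string; props: Record<string, unknown> }

// Keep the original richText when a patch updated other props but left
// richText undefined/null (arrows carry plain text instead of richText).
function preserveRichText(patched: ShapeLike, originalRichText: unknown): ShapeLike {
  if (
    originalRichText !== undefined &&
    patched.type !== 'arrow' &&
    (patched.props.richText === undefined || patched.props.richText === null)
  ) {
    return { ...patched, props: { ...patched.props, richText: originalRichText } }
  }
  return patched
}

// A patch that only resized the shape must not drop its text content.
const merged = preserveRichText(
  { type: 'geo', props: { w: 100 } },
  { type: 'doc', content: [] }
)
```

The same merge pattern applies to arrow `text` and `parentId` in the hunk, each with its own guard.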
@@ -251,6 +420,12 @@ export function applyAutomergePatchesToTLStore(
       return // Skip - not a TLDraw record
     }
 
+    // Filter out SharedPiano shapes since they're no longer supported
+    if (record.typeName === 'shape' && (record as any).type === 'SharedPiano') {
+      console.log(`⚠️ Filtering out deprecated SharedPiano shape: ${record.id}`)
+      return // Skip - SharedPiano is deprecated
+    }
+
     try {
       const sanitized = sanitizeRecord(record)
       toPut.push(sanitized)
@@ -270,6 +445,18 @@ export function applyAutomergePatchesToTLStore(
   // put / remove the records in the store
   // Log patch application for debugging
   console.log(`🔧 AutomergeToTLStore: Applying ${patches.length} patches, ${toPut.length} records to put, ${toRemove.length} records to remove`)
+
+  // DEBUG: Log shape updates being applied to store
+  toPut.forEach(record => {
+    if (record.typeName === 'shape' && (record as any).props?.w) {
+      console.log(`🔧 AutomergeToTLStore: Putting shape ${(record as any).type} ${record.id}:`, {
+        w: (record as any).props.w,
+        h: (record as any).props.h,
+        x: (record as any).x,
+        y: (record as any).y
+      })
+    }
+  })
+
   if (failedRecords.length > 0) {
     console.log({ patches, toPut: toPut.length, failed: failedRecords.length })
@@ -402,14 +589,25 @@ export function sanitizeRecord(record: any): TLRecord {
 
   // For shapes, only ensure basic required fields exist
   if (sanitized.typeName === 'shape') {
+    // CRITICAL: Remove instance-only properties from shapes (these cause validation errors)
+    // These properties should only exist on instance records, not shape records
+    const instanceOnlyProperties = ['insets', 'brush', 'zoomBrush', 'scribbles', 'duplicateProps']
+    instanceOnlyProperties.forEach(prop => {
+      if (prop in sanitized) {
+        delete (sanitized as any)[prop]
+      }
+    })
+
     // Ensure required shape fields exist
     // CRITICAL: Only set defaults if coordinates are truly missing or invalid
     // DO NOT overwrite valid coordinates (including 0, which is a valid position)
     // Only set to 0 if the value is undefined, null, or NaN
     if (sanitized.x === undefined || sanitized.x === null || (typeof sanitized.x === 'number' && isNaN(sanitized.x))) {
+      console.warn(`⚠️ Shape ${sanitized.id} (${sanitized.type}) has invalid x coordinate, defaulting to 0. Original value:`, sanitized.x)
       sanitized.x = 0
     }
     if (sanitized.y === undefined || sanitized.y === null || (typeof sanitized.y === 'number' && isNaN(sanitized.y))) {
+      console.warn(`⚠️ Shape ${sanitized.id} (${sanitized.type}) has invalid y coordinate, defaulting to 0. Original value:`, sanitized.y)
       sanitized.y = 0
     }
     if (typeof sanitized.rotation !== 'number') sanitized.rotation = 0
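The coordinate rule above treats 0 as a valid position and only falls back on undefined, null, or NaN. Distilled into a tiny helper (hypothetical name, not in the diff):

```typescript
// 0 is a legitimate canvas position; only truly-missing values default.
function sanitizeCoord(value: unknown): number {
  if (value === undefined || value === null) return 0
  if (typeof value === 'number' && Number.isNaN(value)) return 0
  return typeof value === 'number' ? value : 0
}
```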
@@ -422,21 +620,151 @@ export function sanitizeRecord(record: any): TLRecord {
     // Ensure meta is a mutable copy to preserve all properties (including text for rectangles)
     sanitized.meta = { ...sanitized.meta }
   }
-  if (!sanitized.index) sanitized.index = 'a1'
+  // CRITICAL: IndexKey must follow tldraw's fractional indexing format
+  // Valid format: starts with 'a' followed by digits, optionally followed by alphanumeric jitter
+  // Examples: "a1", "a2", "a10", "a1V", "a24sT", "a1V4rr" (fractional between a1 and a2)
+  // Invalid: "c1", "b1", "z999" (old format - not valid fractional indices)
+  if (!isValidIndexKey(sanitized.index)) {
+    console.warn(`⚠️ Invalid index "${sanitized.index}" for shape ${sanitized.id}, resetting to 'a1'`)
+    sanitized.index = 'a1' as IndexKey
+  }
   if (!sanitized.parentId) sanitized.parentId = 'page:page'
   if (!sanitized.props || typeof sanitized.props !== 'object') sanitized.props = {}
 
   // CRITICAL: Ensure props is a deep mutable copy to preserve all nested properties
   // This is essential for custom shapes like ObsNote and for preserving richText in geo shapes
   // Use JSON parse/stringify to create a deep copy of nested objects (like richText.content)
-  sanitized.props = JSON.parse(JSON.stringify(sanitized.props))
+  try {
+    sanitized.props = JSON.parse(JSON.stringify(sanitized.props))
+  } catch (e) {
+    // If JSON serialization fails (e.g., due to functions or circular references),
+    // create a shallow copy and recursively clean it
+    console.warn(`⚠️ Could not deep copy props for shape ${sanitized.id}, using shallow copy:`, e)
+    const propsCopy: any = {}
+    for (const key in sanitized.props) {
+      try {
+        const value = sanitized.props[key]
+        // Skip functions
+        if (typeof value === 'function') {
+          continue
+        }
+        // Try to serialize individual values
+        try {
+          propsCopy[key] = JSON.parse(JSON.stringify(value))
+        } catch (valueError) {
+          // If individual value can't be serialized, use it as-is if it's a primitive
+          if (value === null || value === undefined || typeof value !== 'object') {
+            propsCopy[key] = value
+          }
+          // Otherwise skip it
+        }
+      } catch (keyError) {
+        // Skip properties that can't be accessed
+        continue
+      }
+    }
+    sanitized.props = propsCopy
+  }
 
   // CRITICAL: Map old shape type names to new ones (migration support)
   // This handles renamed shape types from old data
   if (sanitized.type === 'Transcribe') {
     sanitized.type = 'Transcription'
   }
+
+  // CRITICAL: Normalize case for custom shape types (lowercase → PascalCase)
+  // The schema expects PascalCase (e.g., "ChatBox" not "chatBox")
+  const customShapeTypeMap: Record<string, string> = {
+    'chatBox': 'ChatBox',
+    'videoChat': 'VideoChat',
+    'embed': 'Embed',
+    'markdown': 'Markdown',
+    'mycrozineTemplate': 'MycrozineTemplate',
+    'slide': 'Slide',
+    'prompt': 'Prompt',
+    'transcription': 'Transcription',
+    'obsNote': 'ObsNote',
+    'fathomNote': 'FathomNote',
+    'holon': 'Holon',
+    'obsidianBrowser': 'ObsidianBrowser',
+    'fathomMeetingsBrowser': 'FathomMeetingsBrowser',
+    'imageGen': 'ImageGen',
+    'videoGen': 'VideoGen',
+    'multmux': 'Multmux',
+  }
+
+  // Normalize the shape type if it's a custom type with incorrect case
+  if (sanitized.type && typeof sanitized.type === 'string' && customShapeTypeMap[sanitized.type]) {
+    console.log(`🔧 Normalizing shape type: "${sanitized.type}" → "${customShapeTypeMap[sanitized.type]}"`)
+    sanitized.type = customShapeTypeMap[sanitized.type]
+  }
+
+  // CRITICAL: Sanitize Multmux shapes AFTER case normalization - ensure all required props exist
+  // Old shapes may have wsUrl (removed) or undefined values
+  if (sanitized.type === 'Multmux') {
+    console.log(`🔧 Sanitizing Multmux shape ${sanitized.id}:`, JSON.stringify(sanitized.props))
+    // Remove deprecated wsUrl prop
+    if ('wsUrl' in sanitized.props) {
+      delete sanitized.props.wsUrl
+    }
+    // CRITICAL: Create a clean props object with all required values
+    // This ensures no undefined values slip through validation
+    // Every value MUST be explicitly defined - undefined values cause ValidationError
+    const w = (typeof sanitized.props.w === 'number' && !isNaN(sanitized.props.w)) ? sanitized.props.w : 800
+    const h = (typeof sanitized.props.h === 'number' && !isNaN(sanitized.props.h)) ? sanitized.props.h : 600
+    const sessionId = (typeof sanitized.props.sessionId === 'string') ? sanitized.props.sessionId : ''
+    const sessionName = (typeof sanitized.props.sessionName === 'string') ? sanitized.props.sessionName : ''
+    const token = (typeof sanitized.props.token === 'string') ? sanitized.props.token : ''
+    // Fix old port (3000 -> 3002) during sanitization
+    let serverUrl = (typeof sanitized.props.serverUrl === 'string') ? sanitized.props.serverUrl : 'http://localhost:3002'
+    if (serverUrl === 'http://localhost:3000') {
+      serverUrl = 'http://localhost:3002'
+    }
+    const pinnedToView = (sanitized.props.pinnedToView === true) ? true : false
+    // Filter out any undefined or non-string elements from tags array
+    let tags: string[] = ['terminal', 'multmux']
+    if (Array.isArray(sanitized.props.tags)) {
+      const filteredTags = sanitized.props.tags.filter((t: any) => typeof t === 'string' && t !== '')
+      if (filteredTags.length > 0) {
+        tags = filteredTags
+      }
+    }
+
+    // Build clean props object - all values are guaranteed to be defined
+    const cleanProps = {
+      w: w,
+      h: h,
+      sessionId: sessionId,
+      sessionName: sessionName,
+      token: token,
+      serverUrl: serverUrl,
+      pinnedToView: pinnedToView,
+      tags: tags,
+    }
+
+    // CRITICAL: Verify no undefined values before assigning
+    // This is a safety check - if any value is undefined, something went wrong above
+    for (const [key, value] of Object.entries(cleanProps)) {
+      if (value === undefined) {
+        console.error(`❌ CRITICAL: Multmux prop ${key} is undefined after sanitization! This should never happen.`)
+        // Fix it with a default value based on key
+        switch (key) {
+          case 'w': (cleanProps as any).w = 800; break
+          case 'h': (cleanProps as any).h = 600; break
+          case 'sessionId': (cleanProps as any).sessionId = ''; break
+          case 'sessionName': (cleanProps as any).sessionName = ''; break
+          case 'token': (cleanProps as any).token = ''; break
+          case 'serverUrl': (cleanProps as any).serverUrl = 'http://localhost:3002'; break
+          case 'pinnedToView': (cleanProps as any).pinnedToView = false; break
+          case 'tags': (cleanProps as any).tags = ['terminal', 'multmux']; break
+        }
+      }
+    }
+
+    sanitized.props = cleanProps
+    console.log(`🔧 Sanitized Multmux shape ${sanitized.id} props:`, JSON.stringify(sanitized.props))
+  }
+
   // CRITICAL: Infer type from properties BEFORE defaulting to 'geo'
   // This ensures arrows and other shapes are properly recognized
   if (!sanitized.type || typeof sanitized.type !== 'string') {
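`isValidIndexKey` is called in the hunk above but its body is outside this diff. A plausible check matching the comment's examples (this regex is an assumption about the real helper, which may instead delegate to tldraw's own index validation):

```typescript
// Assumed shape of the validator: 'a' + digits + optional alphanumeric jitter.
// Accepts "a1", "a24sT", "a1V4rr"; rejects legacy indices like "c1" or "z999".
function isValidIndexKeySketch(index: unknown): boolean {
  return typeof index === 'string' && /^a\d+[a-zA-Z0-9]*$/.test(index)
}
```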
@@ -571,15 +899,63 @@ export function sanitizeRecord(record: any): TLRecord {
     // Remove invalid w/h from props (they cause validation errors)
     if ('w' in sanitized.props) delete sanitized.props.w
     if ('h' in sanitized.props) delete sanitized.props.h
 
-    // Line shapes REQUIRE points property
+    // Line shapes REQUIRE points property with at least 2 points
     if (!sanitized.props.points || typeof sanitized.props.points !== 'object' || Array.isArray(sanitized.props.points)) {
       sanitized.props.points = {
         'a1': { id: 'a1', index: 'a1' as any, x: 0, y: 0 },
         'a2': { id: 'a2', index: 'a2' as any, x: 100, y: 0 }
       }
+    } else {
+      // Ensure the points object has at least 2 valid points
+      const pointKeys = Object.keys(sanitized.props.points)
+      if (pointKeys.length < 2) {
+        sanitized.props.points = {
+          'a1': { id: 'a1', index: 'a1' as any, x: 0, y: 0 },
+          'a2': { id: 'a2', index: 'a2' as any, x: 100, y: 0 }
+        }
+      }
     }
   }
 
+  // CRITICAL: Fix draw shapes - ensure valid segments structure (required by schema)
+  // Draw shapes with empty segments cause "No nearest point found" errors
+  if (sanitized.type === 'draw') {
+    // Remove invalid w/h from props (they cause validation errors)
+    if ('w' in sanitized.props) delete sanitized.props.w
+    if ('h' in sanitized.props) delete sanitized.props.h
+
+    // Draw shapes REQUIRE segments property with at least one segment containing points
+    if (!sanitized.props.segments || !Array.isArray(sanitized.props.segments) || sanitized.props.segments.length === 0) {
+      // Create a minimal valid segment with at least 2 points
+      sanitized.props.segments = [{
+        type: 'free',
+        points: [
+          { x: 0, y: 0, z: 0.5 },
+          { x: 10, y: 0, z: 0.5 }
+        ]
+      }]
+    } else {
+      // Ensure each segment has valid points
+      sanitized.props.segments = sanitized.props.segments.map((segment: any) => {
+        if (!segment.points || !Array.isArray(segment.points) || segment.points.length < 2) {
+          return {
+            type: segment.type || 'free',
+            points: [
+              { x: 0, y: 0, z: 0.5 },
+              { x: 10, y: 0, z: 0.5 }
+            ]
+          }
+        }
+        return segment
+      })
+    }
+
+    // Ensure required draw shape properties exist
+    if (typeof sanitized.props.isClosed !== 'boolean') sanitized.props.isClosed = false
+    if (typeof sanitized.props.isComplete !== 'boolean') sanitized.props.isComplete = true
+    if (typeof sanitized.props.isPen !== 'boolean') sanitized.props.isPen = false
+  }
+
   // CRITICAL: Fix group shapes - remove invalid w/h from props
   if (sanitized.type === 'group') {
@@ -603,6 +979,33 @@ export function sanitizeRecord(record: any): TLRecord {
     }
   }
+
+  // CRITICAL: Convert props.text to props.richText for geo shapes (tldraw schema change)
+  // tldraw no longer accepts props.text on geo shapes - must use richText
+  // This migration handles shapes that were saved before the schema change
+  if (sanitized.type === 'geo' && 'text' in sanitized.props && typeof sanitized.props.text === 'string') {
+    const textContent = sanitized.props.text
+
+    // Convert text string to richText format for tldraw
+    sanitized.props.richText = {
+      type: 'doc',
+      content: textContent ? [{
+        type: 'paragraph',
+        content: [{
+          type: 'text',
+          text: textContent
+        }]
+      }] : []
+    }
+
+    // CRITICAL: Preserve original text in meta.text for backward compatibility
+    // This is used by search (src/utils/searchUtils.ts) and other legacy code
+    if (!sanitized.meta) sanitized.meta = {}
+    sanitized.meta.text = textContent
+
+    // Remove invalid props.text
+    delete sanitized.props.text
+  }
+
   // CRITICAL: Fix richText structure for geo shapes (preserve content)
   if (sanitized.type === 'geo' && sanitized.props.richText) {
     if (Array.isArray(sanitized.props.richText)) {
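The geo-shape text migration above is mechanical enough to test standalone: a plain string becomes a minimal ProseMirror-style document, with an empty string mapping to an empty content array. A sketch of just the conversion (extracted for illustration):

```typescript
// Mirrors the string-to-richText conversion in the hunk above.
function textToRichText(text: string) {
  return {
    type: 'doc',
    content: text
      ? [{ type: 'paragraph', content: [{ type: 'text', text }] }]
      : [],
  }
}
```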
@@ -615,6 +1018,96 @@ export function sanitizeRecord(record: any): TLRecord {
     sanitized.props.richText = cleanRichTextNaN(sanitized.props.richText)
   }
+
+  // CRITICAL: Fix arrow shapes - ensure valid start/end structure (required by schema)
+  // Arrows with invalid start/end cause "No nearest point found" errors
+  if (sanitized.type === 'arrow') {
+    // Ensure start property exists and has valid structure
+    if (!sanitized.props.start || typeof sanitized.props.start !== 'object') {
+      sanitized.props.start = { x: 0, y: 0 }
+    } else {
+      // Ensure start has x and y properties (could be bound to a shape or free)
+      const start = sanitized.props.start as any
+      if (start.type === 'binding') {
+        // Binding type must have boundShapeId, normalizedAnchor, and other properties
+        if (!start.boundShapeId) {
+          // Invalid binding - convert to point
+          sanitized.props.start = { x: start.x ?? 0, y: start.y ?? 0 }
+        }
+      } else if (start.type === 'point' || start.type === undefined) {
+        // Point type must have x and y
+        if (typeof start.x !== 'number' || typeof start.y !== 'number') {
+          sanitized.props.start = { x: 0, y: 0 }
+        }
+      }
+    }
+
+    // Ensure end property exists and has valid structure
+    if (!sanitized.props.end || typeof sanitized.props.end !== 'object') {
+      sanitized.props.end = { x: 100, y: 0 }
+    } else {
+      // Ensure end has x and y properties (could be bound to a shape or free)
+      const end = sanitized.props.end as any
+      if (end.type === 'binding') {
+        // Binding type must have boundShapeId
+        if (!end.boundShapeId) {
+          // Invalid binding - convert to point
+          sanitized.props.end = { x: end.x ?? 100, y: end.y ?? 0 }
+        }
+      } else if (end.type === 'point' || end.type === undefined) {
+        // Point type must have x and y
+        if (typeof end.x !== 'number' || typeof end.y !== 'number') {
+          sanitized.props.end = { x: 100, y: 0 }
+        }
+      }
+    }
+
+    // Ensure bend is a valid number
+    if (typeof sanitized.props.bend !== 'number' || isNaN(sanitized.props.bend)) {
+      sanitized.props.bend = 0
+    }
+
+    // Ensure arrowhead properties exist
+    if (!sanitized.props.arrowheadStart) sanitized.props.arrowheadStart = 'none'
+    if (!sanitized.props.arrowheadEnd) sanitized.props.arrowheadEnd = 'arrow'
+
+    // Ensure text property exists and is a string
+    if (sanitized.props.text === undefined || sanitized.props.text === null) {
+      sanitized.props.text = ''
+    } else if (typeof sanitized.props.text !== 'string') {
+      // If text is not a string (e.g., RichText object), convert it to string
+      try {
+        if (typeof sanitized.props.text === 'object' && sanitized.props.text !== null) {
+          // Try to extract text from RichText object
+          const textObj = sanitized.props.text as any
+          if (Array.isArray(textObj.content)) {
+            // Extract text from RichText content
+            const extractText = (content: any[]): string => {
+              return content.map((item: any) => {
+                if (item.type === 'text' && item.text) {
+                  return item.text
+                } else if (item.content && Array.isArray(item.content)) {
+                  return extractText(item.content)
+                }
+                return ''
+              }).join('')
+            }
+            sanitized.props.text = extractText(textObj.content)
+          } else if (textObj.text && typeof textObj.text === 'string') {
+            sanitized.props.text = textObj.text
+          } else {
+            sanitized.props.text = String(sanitized.props.text)
+          }
+        } else {
+          sanitized.props.text = String(sanitized.props.text)
+        }
+      } catch (e) {
+        console.warn(`⚠️ AutomergeToTLStore: Error converting arrow text to string for ${sanitized.id}:`, e)
+        sanitized.props.text = String(sanitized.props.text)
+      }
+    }
+    // Note: We preserve text even if it's an empty string - that's a valid value
+  }
+
   // CRITICAL: Fix richText structure for text shapes - REQUIRED field
   if (sanitized.type === 'text') {
     // Text shapes MUST have props.richText as an object - initialize if missing
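The recursive `extractText` used for arrow labels in the hunk above walks nested content arrays depth-first and concatenates the leaf text nodes. Pulled out of the diff, it behaves like this:

```typescript
// Depth-first concatenation of text leaves in a ProseMirror-style tree.
function extractText(content: any[]): string {
  return content.map((item: any) => {
    if (item.type === 'text' && item.text) {
      return item.text
    } else if (item.content && Array.isArray(item.content)) {
      return extractText(item.content)
    }
    return ''
  }).join('')
}

// Sample richText document with two text leaves inside one paragraph.
const sampleDoc = {
  content: [{
    type: 'paragraph',
    content: [
      { type: 'text', text: 'hello ' },
      { type: 'text', text: 'world' },
    ],
  }],
}
```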
@@ -628,10 +1121,35 @@ export function sanitizeRecord(record: any): TLRecord {
     }
     // CRITICAL: Clean NaN values from richText content to prevent SVG export errors
     sanitized.props.richText = cleanRichTextNaN(sanitized.props.richText)
+
+    // CRITICAL: Ensure required text shape properties exist (TLDraw validation requires these)
+    // color is REQUIRED and must be one of the valid color values
+    const validColors = ['black', 'grey', 'light-violet', 'violet', 'blue', 'light-blue', 'yellow', 'orange', 'green', 'light-green', 'light-red', 'red', 'white']
+    if (!sanitized.props.color || typeof sanitized.props.color !== 'string' || !validColors.includes(sanitized.props.color)) {
+      sanitized.props.color = 'black'
+    }
+    // Ensure other required properties have defaults
+    if (typeof sanitized.props.w !== 'number') sanitized.props.w = 300
+    if (!sanitized.props.size || typeof sanitized.props.size !== 'string') sanitized.props.size = 'm'
+    if (!sanitized.props.font || typeof sanitized.props.font !== 'string') sanitized.props.font = 'draw'
+    if (!sanitized.props.textAlign || typeof sanitized.props.textAlign !== 'string') sanitized.props.textAlign = 'start'
+    if (typeof sanitized.props.autoSize !== 'boolean') sanitized.props.autoSize = false
+    if (typeof sanitized.props.scale !== 'number') sanitized.props.scale = 1
+
+    // Remove invalid properties for text shapes (these cause validation errors)
+    // Remove properties that are only valid for custom shapes, not standard TLDraw text shapes
+    // CRITICAL: 'text' property is NOT allowed - text shapes must use props.richText instead
+    const invalidTextProps = ['h', 'geo', 'text', 'isEditing', 'editingContent', 'isTranscribing', 'isPaused', 'fixedHeight', 'pinnedToView', 'isModified', 'originalContent', 'editingName', 'editingDescription', 'isConnected', 'holonId', 'noteId', 'title', 'content', 'tags', 'showPreview', 'backgroundColor', 'textColor']
+    invalidTextProps.forEach(prop => {
+      if (prop in sanitized.props) {
+        delete sanitized.props[prop]
+      }
+    })
   }
 
-  // CRITICAL: Remove invalid 'text' property from text shapes (TLDraw schema doesn't allow props.text)
+  // CRITICAL: Additional safety check - Remove invalid 'text' property from text shapes
   // Text shapes should only use props.richText, not props.text
+  // This is a redundant check to ensure text property is always removed
   if (sanitized.type === 'text' && 'text' in sanitized.props) {
     delete sanitized.props.text
   }
@@ -655,9 +1173,28 @@ export function sanitizeRecord(record: any): TLRecord {
     // CRITICAL: Clean NaN values from richText content to prevent SVG export errors
     sanitized.props.richText = cleanRichTextNaN(sanitized.props.richText)
 
-    // Only remove properties that cause validation errors (not all "invalid" ones)
-    if ('h' in sanitized.props) delete sanitized.props.h
-    if ('geo' in sanitized.props) delete sanitized.props.geo
+    // CRITICAL: Ensure required text shape properties exist (TLDraw validation requires these)
+    // color is REQUIRED and must be one of the valid color values
+    const validColors = ['black', 'grey', 'light-violet', 'violet', 'blue', 'light-blue', 'yellow', 'orange', 'green', 'light-green', 'light-red', 'red', 'white']
+    if (!sanitized.props.color || typeof sanitized.props.color !== 'string' || !validColors.includes(sanitized.props.color)) {
+      sanitized.props.color = 'black'
+    }
+    // Ensure other required properties have defaults
+    if (typeof sanitized.props.w !== 'number') sanitized.props.w = 300
+    if (!sanitized.props.size || typeof sanitized.props.size !== 'string') sanitized.props.size = 'm'
+    if (!sanitized.props.font || typeof sanitized.props.font !== 'string') sanitized.props.font = 'draw'
+    if (!sanitized.props.textAlign || typeof sanitized.props.textAlign !== 'string') sanitized.props.textAlign = 'start'
+    if (typeof sanitized.props.autoSize !== 'boolean') sanitized.props.autoSize = false
+    if (typeof sanitized.props.scale !== 'number') sanitized.props.scale = 1
+
+    // Remove invalid properties for text shapes (these cause validation errors)
+    // Remove properties that are only valid for custom shapes, not standard TLDraw text shapes
+    const invalidTextProps = ['h', 'geo', 'isEditing', 'editingContent', 'isTranscribing', 'isPaused', 'fixedHeight', 'pinnedToView', 'isModified', 'originalContent', 'editingName', 'editingDescription', 'isConnected', 'holonId', 'noteId', 'title', 'content', 'tags', 'showPreview', 'backgroundColor', 'textColor']
+    invalidTextProps.forEach(prop => {
+      if (prop in sanitized.props) {
+        delete sanitized.props[prop]
+      }
+    })
   } else if (sanitized.typeName === 'instance') {
     // CRITICAL: Handle instance records - ensure required fields exist
@@ -702,6 +1239,12 @@ export function sanitizeRecord(record: any): TLRecord {
    }
  }

+ // CRITICAL: Final safety check - ensure text shapes never have invalid 'text' property
+ // This is a last-resort check before returning to catch any edge cases
+ if (sanitized.typeName === 'shape' && sanitized.type === 'text' && sanitized.props && 'text' in sanitized.props) {
+   delete sanitized.props.text
+ }
+
  return sanitized
}
@@ -48,7 +48,15 @@ export class CloudflareAdapter {
    // Focus on the store data which is what actually changes
    const storeData = doc.store || {}
    const storeKeys = Object.keys(storeData).sort()
-   const storeString = JSON.stringify(storeData, storeKeys)
+   // CRITICAL FIX: JSON.stringify's second parameter, when it's an array, is a replacer
+   // that only includes those properties. We need to stringify the entire store object.
+   // To ensure stable ordering, create a new object with sorted keys
+   const sortedStore: any = {}
+   for (const key of storeKeys) {
+     sortedStore[key] = storeData[key]
+   }
+   const storeString = JSON.stringify(sortedStore)

    // Simple hash function (you could use a more sophisticated one if needed)
    let hash = 0
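The replacer gotcha this hunk fixes is easy to reproduce. A minimal sketch (the `store` shape is a made-up example, not real tldraw data): when `JSON.stringify`'s second argument is an array, it acts as a property allow-list at every nesting level, so the nested shape props silently vanish.

```typescript
// Demonstration of the JSON.stringify array-replacer pitfall fixed above.
const store = { 'shape:a': { x: 1, y: 2 }, 'shape:b': { x: 3, y: 4 } }
const keys = Object.keys(store).sort()

// Buggy: nested props (x, y) are filtered out because they are not in `keys`.
const buggy = JSON.stringify(store, keys)

// Fixed: rebuild the object with sorted keys, then stringify everything.
const sorted: any = {}
for (const k of keys) sorted[k] = (store as any)[k]
const fixed = JSON.stringify(sorted)

console.log(buggy) // {"shape:a":{},"shape:b":{}}
console.log(fixed) // {"shape:a":{"x":1,"y":2},"shape:b":{"x":3,"y":4}}
```

The fixed version relies on JavaScript's insertion-order guarantee for string keys, which is what makes the hash stable across clients.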
@@ -158,6 +166,7 @@ export class CloudflareNetworkAdapter extends NetworkAdapter {
  private websocket: WebSocket | null = null
  private roomId: string | null = null
  public peerId: PeerId | undefined = undefined
+ public sessionId: string | null = null // Track our session ID
  private readyPromise: Promise<void>
  private readyResolve: (() => void) | null = null
  private keepAliveInterval: NodeJS.Timeout | null = null

@@ -167,12 +176,19 @@ export class CloudflareNetworkAdapter extends NetworkAdapter {
  private reconnectDelay: number = 1000
  private isConnecting: boolean = false
  private onJsonSyncData?: (data: any) => void
+ private onPresenceUpdate?: (userId: string, data: any, senderId?: string, userName?: string, userColor?: string) => void

- constructor(workerUrl: string, roomId?: string, onJsonSyncData?: (data: any) => void) {
+ constructor(
+   workerUrl: string,
+   roomId?: string,
+   onJsonSyncData?: (data: any) => void,
+   onPresenceUpdate?: (userId: string, data: any, senderId?: string, userName?: string, userColor?: string) => void
+ ) {
    super()
    this.workerUrl = workerUrl
    this.roomId = roomId || 'default-room'
    this.onJsonSyncData = onJsonSyncData
+   this.onPresenceUpdate = onPresenceUpdate
    this.readyPromise = new Promise((resolve) => {
      this.readyResolve = resolve
    })
@@ -201,11 +217,13 @@ export class CloudflareNetworkAdapter extends NetworkAdapter {
    // Use the room ID from constructor or default
    // Add sessionId as a query parameter as required by AutomergeDurableObject
    const sessionId = peerId || `session-${Date.now()}-${Math.random().toString(36).substr(2, 9)}`
+   this.sessionId = sessionId // Store our session ID for filtering echoes

    // Convert https:// to wss:// or http:// to ws://
    const protocol = this.workerUrl.startsWith('https://') ? 'wss://' : 'ws://'
    const baseUrl = this.workerUrl.replace(/^https?:\/\//, '')
    const wsUrl = `${protocol}${baseUrl}/connect/${this.roomId}?sessionId=${sessionId}`

    this.isConnecting = true

    // Add a small delay to ensure the server is ready
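The URL derivation above can be isolated into a small pure function, which makes the scheme mapping easy to verify. This is a sketch mirroring the hunk, not code from the repo; the only addition is `encodeURIComponent` on the session ID, since the generated IDs are URL-safe but caller-supplied peer IDs might not be.

```typescript
// Build the WebSocket endpoint from a worker's HTTP(S) URL, a room ID, and a session ID.
function toWebSocketUrl(workerUrl: string, roomId: string, sessionId: string): string {
  // https -> wss, http -> ws
  const protocol = workerUrl.startsWith('https://') ? 'wss://' : 'ws://'
  // Strip the scheme, keeping host, port, and any base path
  const baseUrl = workerUrl.replace(/^https?:\/\//, '')
  return `${protocol}${baseUrl}/connect/${roomId}?sessionId=${encodeURIComponent(sessionId)}`
}

console.log(toWebSocketUrl('https://worker.example.com', 'room-1', 'session-abc'))
// wss://worker.example.com/connect/room-1?sessionId=session-abc
```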
@@ -252,19 +270,32 @@ export class CloudflareNetworkAdapter extends NetworkAdapter {
      } else {
        // Handle text messages (our custom protocol, for backward compatibility)
        const message = JSON.parse(event.data)
-       console.log('🔌 CloudflareAdapter: Received WebSocket message:', message.type)
+       // Only log non-presence messages to reduce console spam
+       if (message.type !== 'presence' && message.type !== 'pong') {
+         console.log('🔌 CloudflareAdapter: Received WebSocket message:', message.type)
+       }

        // Handle ping/pong messages for keep-alive
        if (message.type === 'ping') {
          this.sendPong()
          return
        }

        // Handle test messages
        if (message.type === 'test') {
          console.log('🔌 CloudflareAdapter: Received test message:', message.message)
          return
        }

+       // Handle presence updates from other clients
+       if (message.type === 'presence') {
+         // Pass senderId, userName, and userColor so we can create proper instance_presence records
+         if (this.onPresenceUpdate && message.userId && message.data) {
+           this.onPresenceUpdate(message.userId, message.data, message.senderId, message.userName, message.userColor)
+         }
+         return
+       }

        // Convert the message to the format expected by Automerge
        if (message.type === 'sync' && message.data) {
@@ -275,14 +306,20 @@ export class CloudflareNetworkAdapter extends NetworkAdapter {
            documentIdType: typeof message.documentId
          })

-         // JSON sync is deprecated - all data flows through Automerge sync protocol
-         // Old format content is converted server-side and saved to R2 in Automerge format
-         // Skip JSON sync messages - they should not be sent anymore
+         // JSON sync for real-time collaboration
+         // When we receive TLDraw changes from other clients, apply them locally
          const isJsonDocumentData = message.data && typeof message.data === 'object' && message.data.store

          if (isJsonDocumentData) {
-           console.warn('⚠️ CloudflareAdapter: Received JSON sync message (deprecated). Ignoring - all data should flow through Automerge sync protocol.')
-           return // Don't process JSON sync messages
+           console.log('📥 CloudflareAdapter: Received JSON sync message with store data')
+           // Call the JSON sync callback to apply changes
+           if (this.onJsonSyncData) {
+             this.onJsonSyncData(message.data)
+           } else {
+             console.warn('⚠️ No JSON sync callback registered')
+           }
+           return // JSON sync handled
          }

          // Validate documentId - Automerge requires a valid Automerge URL format
@@ -368,19 +405,42 @@ export class CloudflareNetworkAdapter extends NetworkAdapter {
  }

  send(message: Message): void {
+   // Only log non-presence messages to reduce console spam
+   if (message.type !== 'presence') {
+     console.log('📤 CloudflareAdapter.send() called:', {
+       messageType: message.type,
+       dataType: (message as any).data?.constructor?.name || typeof (message as any).data,
+       dataLength: (message as any).data?.byteLength || (message as any).data?.length,
+       documentId: (message as any).documentId,
+       hasTargetId: !!message.targetId,
+       hasSenderId: !!message.senderId
+     })
+   }

    if (this.websocket && this.websocket.readyState === WebSocket.OPEN) {
      // Check if this is a binary sync message from Automerge Repo
      if (message.type === 'sync' && (message as any).data instanceof ArrayBuffer) {
-       console.log('🔌 CloudflareAdapter: Sending binary sync message (Automerge protocol)')
+       console.log('📤 CloudflareAdapter: Sending binary sync message (Automerge protocol)', {
+         dataLength: (message as any).data.byteLength,
+         documentId: (message as any).documentId,
+         targetId: message.targetId
+       })
        // Send binary data directly for Automerge's native sync protocol
        this.websocket.send((message as any).data)
      } else if (message.type === 'sync' && (message as any).data instanceof Uint8Array) {
-       console.log('🔌 CloudflareAdapter: Sending Uint8Array sync message (Automerge protocol)')
+       console.log('📤 CloudflareAdapter: Sending Uint8Array sync message (Automerge protocol)', {
+         dataLength: (message as any).data.length,
+         documentId: (message as any).documentId,
+         targetId: message.targetId
+       })
        // Convert Uint8Array to ArrayBuffer and send
        this.websocket.send((message as any).data.buffer)
      } else {
        // Handle text-based messages (backward compatibility and control messages)
-       console.log('Sending WebSocket message:', message.type)
+       // Only log non-presence messages
+       if (message.type !== 'presence') {
+         console.log('📤 Sending WebSocket message:', message.type)
+       }
        // Debug: Log patch content if it's a patch message
        if (message.type === 'patch' && (message as any).patches) {
          console.log('🔍 Sending patches:', (message as any).patches.length, 'patches')

@@ -394,6 +454,13 @@ export class CloudflareNetworkAdapter extends NetworkAdapter {
      }
      this.websocket.send(JSON.stringify(message))
    }
+ } else {
+   if (message.type !== 'presence') {
+     console.warn('⚠️ CloudflareAdapter: Cannot send message - WebSocket not open', {
+       messageType: message.type,
+       readyState: this.websocket?.readyState
+     })
+   }
  }
}
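One subtlety in the `Uint8Array` branch above: sending `.buffer` is only byte-exact when the view spans its entire backing buffer; a view with a nonzero `byteOffset` would transmit the wrong byte range. A hedged sketch of a safer conversion (the `toArrayBuffer` helper is hypothetical, not part of this diff):

```typescript
// Copy out exactly the bytes a Uint8Array view covers before sending.
// A Uint8Array may be a window into a larger ArrayBuffer, in which case
// sending `.buffer` directly would include bytes outside the view.
function toArrayBuffer(view: Uint8Array): ArrayBuffer {
  if (view.byteOffset === 0 && view.byteLength === view.buffer.byteLength) {
    return view.buffer as ArrayBuffer // view already covers the whole buffer - no copy needed
  }
  return view.buffer.slice(view.byteOffset, view.byteOffset + view.byteLength) as ArrayBuffer
}

const backing = new ArrayBuffer(16)
const window8 = new Uint8Array(backing, 4, 8) // 8-byte view starting at offset 4
console.log(window8.buffer.byteLength)        // 16 - the whole backing buffer
console.log(toArrayBuffer(window8).byteLength) // 8 - just the view's bytes
```

In practice Automerge-produced `Uint8Array`s usually own their whole buffer, so the diff's direct `.buffer` send works; the guard only matters for sliced views.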
@@ -20,7 +20,38 @@ function minimalSanitizeRecord(record: any): any {
  if (typeof sanitized.isLocked !== 'boolean') sanitized.isLocked = false
  if (typeof sanitized.opacity !== 'number') sanitized.opacity = 1
  if (!sanitized.meta || typeof sanitized.meta !== 'object') sanitized.meta = {}
- if (!sanitized.index) sanitized.index = 'a1'
+ // NOTE: Index assignment is handled by assignSequentialIndices() during format conversion.
+ // Here we validate using tldraw's fractional indexing rules.
+ // The first letter encodes integer part length: a=1 digit, b=2 digits, c=3 digits, etc.
+ // Examples: "a0"-"a9", "b10"-"b99", "c100"-"c999", with optional fraction "a1V4rr".
+ // Invalid: "b1" (b expects 2 digits but has 1).
+ if (!sanitized.index || typeof sanitized.index !== 'string' || sanitized.index.length === 0) {
+   sanitized.index = 'a1'
+ } else {
+   // Validate fractional indexing format
+   let isValid = false
+   const prefix = sanitized.index[0]
+   const rest = sanitized.index.slice(1)
+
+   if (/^[a-zA-Z]/.test(sanitized.index) && /^[a-zA-Z][a-zA-Z0-9]+$/.test(sanitized.index)) {
+     if (prefix >= 'a' && prefix <= 'z') {
+       // Calculate expected minimum digit count: a=1, b=2, c=3, etc.
+       const expectedDigits = prefix.charCodeAt(0) - 'a'.charCodeAt(0) + 1
+       const integerMatch = rest.match(/^(\d+)/)
+       if (integerMatch && integerMatch[1].length >= expectedDigits) {
+         isValid = true
+       }
+     } else if (prefix >= 'A' && prefix <= 'Z') {
+       // Uppercase is used for negative/special indices - allow it
+       isValid = true
+     }
+   }
+
+   if (!isValid) {
+     console.warn(`⚠️ MinimalSanitization: Invalid index format "${sanitized.index}" for shape ${sanitized.id}`)
+     sanitized.index = 'a1'
+   }
+ }
  if (!sanitized.parentId) sanitized.parentId = 'page:page'

  // Ensure props object exists
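The validation rule in this hunk can be distilled into a standalone predicate. This is a sketch assumed from the hunk's own comments (it is not an official tldraw API): the leading lowercase letter states the minimum number of integer digits that must follow, and any remaining alphanumerics are the fractional suffix.

```typescript
// Validate a fractional index per the rules described in the hunk above:
// prefix 'a' expects >= 1 digit, 'b' >= 2, 'c' >= 3, ...; uppercase prefixes pass.
function isValidFractionalIndex(index: string): boolean {
  if (!/^[a-zA-Z][a-zA-Z0-9]+$/.test(index)) return false
  const prefix = index[0]
  if (prefix >= 'A' && prefix <= 'Z') return true // negative/special indices
  const expectedDigits = prefix.charCodeAt(0) - 'a'.charCodeAt(0) + 1 // a=1, b=2, ...
  const integerMatch = index.slice(1).match(/^(\d+)/)
  return !!integerMatch && integerMatch[1].length >= expectedDigits
}

console.log(isValidFractionalIndex('a1'))     // true
console.log(isValidFractionalIndex('b10'))    // true
console.log(isValidFractionalIndex('a1V4rr')) // true - fraction suffix allowed
console.log(isValidFractionalIndex('b1'))     // false - 'b' expects 2 digits
```

Note the real fractional-indexing scheme treats the digit count as exact rather than a minimum; the hunk deliberately accepts the looser form and falls back to 'a1' only on clear violations.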
@ -47,6 +47,6 @@ To switch from TLdraw sync to Automerge sync:
1. Update the Board component to use `useAutomergeSync`
2. Deploy the new worker with Automerge Durable Object
-3. Update the URI to use `/automerge/connect/` instead of `/connect/`
+3. The CloudflareAdapter will automatically connect to `/connect/{roomId}` via WebSocket

-The migration is backward compatible - existing TLdraw sync will continue to work while you test the new system.
+The migration is backward compatible - the system will handle both legacy and new document formats.
@@ -144,19 +144,97 @@ function sanitizeRecord(record: TLRecord): TLRecord {
      console.warn(`🔧 TLStoreToAutomerge: Error checking richText for shape ${sanitized.id}:`, e)
    }

+   // CRITICAL: Extract arrow text BEFORE deep copy to handle RichText instances properly
+   // Arrow text should be a string, but might be a RichText object in edge cases
+   let arrowTextValue: any = undefined
+   if (sanitized.type === 'arrow') {
+     try {
+       const props = sanitized.props || {}
+       if ('text' in props) {
+         try {
+           // Use Object.getOwnPropertyDescriptor to safely check if it's a getter
+           const descriptor = Object.getOwnPropertyDescriptor(props, 'text')
+           let textValue: any = undefined
+
+           if (descriptor && descriptor.get) {
+             // It's a getter - try to call it safely
+             try {
+               textValue = descriptor.get.call(props)
+             } catch (getterError) {
+               console.warn(`🔧 TLStoreToAutomerge: Error calling text getter for arrow ${sanitized.id}:`, getterError)
+               textValue = undefined
+             }
+           } else {
+             // It's a regular property - access it directly
+             textValue = (props as any).text
+           }
+
+           // Now process the value
+           if (textValue !== undefined && textValue !== null) {
+             // If it's a string, use it directly
+             if (typeof textValue === 'string') {
+               arrowTextValue = textValue
+             }
+             // If it's a RichText object, extract the text content
+             else if (typeof textValue === 'object' && textValue !== null) {
+               // Try to extract text from the RichText object
+               try {
+                 const serialized = JSON.parse(JSON.stringify(textValue))
+                 // If it has a content array, extract text from it
+                 if (Array.isArray(serialized.content)) {
+                   // Extract text from RichText content
+                   const extractText = (content: any[]): string => {
+                     return content.map((item: any) => {
+                       if (item.type === 'text' && item.text) {
+                         return item.text
+                       } else if (item.content && Array.isArray(item.content)) {
+                         return extractText(item.content)
+                       }
+                       return ''
+                     }).join('')
+                   }
+                   arrowTextValue = extractText(serialized.content)
+                 } else {
+                   // Fallback: try to get the text property
+                   arrowTextValue = serialized.text || ''
+                 }
+               } catch (serializeError) {
+                 // If serialization fails, try to extract manually
+                 if ((textValue as any).text && typeof (textValue as any).text === 'string') {
+                   arrowTextValue = (textValue as any).text
+                 } else {
+                   arrowTextValue = String(textValue)
+                 }
+               }
+             }
+             // For other types, convert to string
+             else {
+               arrowTextValue = String(textValue)
+             }
+           }
+         } catch (e) {
+           console.warn(`🔧 TLStoreToAutomerge: Error extracting text for arrow ${sanitized.id}:`, e)
+           arrowTextValue = undefined
+         }
+       }
+     } catch (e) {
+       console.warn(`🔧 TLStoreToAutomerge: Error checking text for arrow ${sanitized.id}:`, e)
+     }
+   }

    // CRITICAL: For all shapes, ensure props is a deep mutable copy to preserve all properties
    // This is essential for custom shapes like ObsNote and for preserving richText in geo shapes
    // Use JSON parse/stringify to create a deep copy of nested objects (like richText.content)
-   // Remove richText temporarily to avoid serialization issues
+   // Remove richText and arrow text temporarily to avoid serialization issues
    try {
-     const propsWithoutRichText: any = {}
+     const propsWithoutSpecial: any = {}
-     // Copy all props except richText
+     // Copy all props except richText and arrow text (if extracted)
      for (const key in sanitized.props) {
-       if (key !== 'richText') {
+       if (key !== 'richText' && !(sanitized.type === 'arrow' && key === 'text' && arrowTextValue !== undefined)) {
-         propsWithoutRichText[key] = (sanitized.props as any)[key]
+         propsWithoutSpecial[key] = (sanitized.props as any)[key]
        }
      }
-     sanitized.props = JSON.parse(JSON.stringify(propsWithoutRichText))
+     sanitized.props = JSON.parse(JSON.stringify(propsWithoutSpecial))
    } catch (e) {
      console.warn(`🔧 TLStoreToAutomerge: Error deep copying props for shape ${sanitized.id}:`, e)
      // Fallback: just copy props without deep copy
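The recursive `extractText` in this hunk walks a ProseMirror-style rich-text tree and concatenates the plain-text leaves. A minimal self-contained sketch (the sample document shape is an assumption based on the diff, not a tldraw API):

```typescript
// A simplified rich-text node: leaves carry `text`, containers carry `content`.
type RichNode = { type?: string; text?: string; content?: RichNode[] }

// Depth-first concatenation of all text leaves, matching the diff's logic.
function extractText(content: RichNode[]): string {
  return content.map((item) => {
    if (item.type === 'text' && item.text) {
      return item.text
    } else if (item.content && Array.isArray(item.content)) {
      return extractText(item.content)
    }
    return ''
  }).join('')
}

const doc: RichNode = {
  type: 'doc',
  content: [
    { type: 'paragraph', content: [{ type: 'text', text: 'hello ' }, { type: 'text', text: 'world' }] },
  ],
}
console.log(extractText(doc.content ?? [])) // "hello world"
```

One design consequence worth noting: joining with `''` drops paragraph boundaries, so multi-paragraph labels collapse into a single line - acceptable for arrow labels, which are expected to be short strings.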
@@ -164,6 +242,9 @@ function sanitizeRecord(record: TLRecord): TLRecord {
      if (richTextValue !== undefined) {
        delete (sanitized.props as any).richText
      }
+     if (arrowTextValue !== undefined) {
+       delete (sanitized.props as any).text
+     }
    }

    // CRITICAL: For geo shapes, move w/h/geo from top-level to props (required by TLDraw schema)

@@ -210,11 +291,17 @@ function sanitizeRecord(record: TLRecord): TLRecord {

    // CRITICAL: For arrow shapes, preserve text property
    if (sanitized.type === 'arrow') {
-     // CRITICAL: Preserve text property - only set default if truly missing (preserve empty strings and all other values)
-     if ((sanitized.props as any).text === undefined || (sanitized.props as any).text === null) {
-       (sanitized.props as any).text = ''
-     }
-     // Note: We preserve text even if it's an empty string - that's a valid value
+     // CRITICAL: Restore extracted text value if available, otherwise preserve existing text
+     if (arrowTextValue !== undefined) {
+       // Use the extracted text value (handles RichText objects by extracting text content)
+       (sanitized.props as any).text = arrowTextValue
+     } else {
+       // CRITICAL: Preserve text property - only set default if truly missing (preserve empty strings and all other values)
+       if ((sanitized.props as any).text === undefined || (sanitized.props as any).text === null) {
+         (sanitized.props as any).text = ''
+       }
+       // Note: We preserve text even if it's an empty string - that's a valid value
+     }
    }

    // CRITICAL: For note shapes, preserve richText property (required for note shapes)
@@ -115,7 +115,7 @@ export async function saveDocumentId(roomId: string, documentId: string): Promis
      }

      request.onsuccess = () => {
-       console.log(`📝 Saved document mapping: ${roomId} → ${documentId}`)
+       console.log(`Saved document mapping: ${roomId} -> ${documentId}`)
        resolve()
      }
    })

@@ -171,7 +171,7 @@ export async function deleteDocumentMapping(roomId: string): Promise<void> {
      }

      request.onsuccess = () => {
-       console.log(`🗑️ Deleted document mapping for: ${roomId}`)
+       console.log(`Deleted document mapping for: ${roomId}`)
        resolve()
      }
    })

@@ -238,7 +238,7 @@ export async function cleanupOldMappings(maxAgeDays: number = 30): Promise<numbe
        deletedCount++
        cursor.continue()
      } else {
-       console.log(`🧹 Cleaned up ${deletedCount} old document mappings`)
+       console.log(`Cleaned up ${deletedCount} old document mappings`)
        resolve(deletedCount)
      }
    }
  }
File diff suppressed because it is too large
@@ -1,45 +1,67 @@
- import React, { useState, useEffect } from 'react'
+ import React, { useState, useEffect, useContext, useRef } from 'react'
  import { useEditor } from 'tldraw'
  import { createShapeId } from 'tldraw'
  import { WORKER_URL, LOCAL_WORKER_URL } from '../constants/workerUrl'
+ import { getFathomApiKey, saveFathomApiKey, removeFathomApiKey } from '../lib/fathomApiKey'
+ import { AuthContext } from '../context/AuthContext'

  interface FathomMeeting {
-   id: string
+   recording_id: number
    title: string
+   meeting_title?: string
    url: string
+   share_url?: string
    created_at: string
-   duration: number
-   summary?: {
-     markdown_formatted: string
-   }
+   scheduled_start_time?: string
+   scheduled_end_time?: string
+   recording_start_time?: string
+   recording_end_time?: string
+   transcript?: any[]
+   transcript_language?: string
+   default_summary?: {
+     template_name?: string
+     markdown_formatted?: string
+   }
+   action_items?: any[]
+   calendar_invitees?: Array<{
+     name: string
+     email: string
+     is_external: boolean
+   }>
+   recorded_by?: {
+     name: string
+     email: string
+     team?: string
+   }
+   call_id?: string | number
+   id?: string | number
  }

  interface FathomMeetingsPanelProps {
-   onClose: () => void
+   onClose?: () => void
+   onMeetingSelect?: (meeting: FathomMeeting, options: { summary: boolean; transcript: boolean; actionItems: boolean; video: boolean }, format: 'fathom' | 'note') => void
    shapeMode?: boolean
  }

- export function FathomMeetingsPanel({ onClose, shapeMode = false }: FathomMeetingsPanelProps) {
+ export function FathomMeetingsPanel({ onClose, onMeetingSelect, shapeMode = false }: FathomMeetingsPanelProps) {
    const editor = useEditor()
+   // Safely get auth context - may not be available during SVG export
+   const authContext = useContext(AuthContext)
+   const fallbackSession = {
+     username: undefined as string | undefined,
+   }
+   const session = authContext?.session || fallbackSession

    const [apiKey, setApiKey] = useState('')
    const [showApiKeyInput, setShowApiKeyInput] = useState(false)
    const [meetings, setMeetings] = useState<FathomMeeting[]>([])
    const [loading, setLoading] = useState(false)
    const [error, setError] = useState<string | null>(null)
+   // Removed dropdown state - using buttons instead

-   useEffect(() => {
-     // Check if API key is already stored
-     const storedApiKey = localStorage.getItem('fathom_api_key')
-     if (storedApiKey) {
-       setApiKey(storedApiKey)
-       fetchMeetings()
-     } else {
-       setShowApiKeyInput(true)
-     }
-   }, [])
-
-   const fetchMeetings = async () => {
-     if (!apiKey) {
+   const fetchMeetings = async (keyToUse?: string) => {
+     const key = keyToUse || apiKey
+     if (!key) {
      setError('Please enter your Fathom API key')
      return
    }
@@ -53,7 +75,7 @@ export function FathomMeetingsPanel({ onClose, shapeMode = false }: FathomMeetin
    try {
      response = await fetch(`${WORKER_URL}/fathom/meetings`, {
        headers: {
-         'Authorization': `Bearer ${apiKey}`,
+         'X-Api-Key': key,
          'Content-Type': 'application/json'
        }
      })

@@ -61,7 +83,7 @@ export function FathomMeetingsPanel({ onClose, shapeMode = false }: FathomMeetin
      console.log('Production worker failed, trying local worker...')
      response = await fetch(`${LOCAL_WORKER_URL}/fathom/meetings`, {
        headers: {
-         'Authorization': `Bearer ${apiKey}`,
+         'X-Api-Key': key,
          'Content-Type': 'application/json'
        }
      })
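The production-then-local fallback above can be generalized into a small helper so the same headers are not repeated per endpoint. This is a hypothetical sketch (the helper and its signature are not in the diff; `fetchFn` is injected so it stays testable - in the app you would pass the global `fetch`):

```typescript
// Try each URL in order; return the first OK response, otherwise throw the last error.
async function fetchWithFallback<R extends { ok: boolean; status: number }>(
  urls: string[],
  init: any,
  fetchFn: (url: string, init?: any) => Promise<R>
): Promise<R> {
  let lastError: unknown = new Error('no URLs provided')
  for (const url of urls) {
    try {
      const res = await fetchFn(url, init)
      if (res.ok) return res
      lastError = new Error(`HTTP ${res.status} from ${url}`)
    } catch (err) {
      lastError = err // network failure - fall through to the next URL
    }
  }
  throw lastError
}

// Usage, mirroring the component (WORKER_URL / LOCAL_WORKER_URL as imported above):
// const res = await fetchWithFallback(
//   [`${WORKER_URL}/fathom/meetings`, `${LOCAL_WORKER_URL}/fathom/meetings`],
//   { headers: { 'X-Api-Key': key, 'Content-Type': 'application/json' } },
//   fetch
// )
```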
@@ -91,28 +113,169 @@ export function FathomMeetingsPanel({ onClose, shapeMode = false }: FathomMeetin
 
   const saveApiKey = () => {
     if (apiKey) {
-      localStorage.setItem('fathom_api_key', apiKey)
+      saveFathomApiKey(apiKey, session.username)
       setShowApiKeyInput(false)
-      fetchMeetings()
+      fetchMeetings(apiKey)
     }
   }
 
-  const addMeetingToCanvas = async (meeting: FathomMeeting) => {
+  // Track if we've already loaded meetings for the current user to prevent multiple API calls
+  const hasLoadedRef = useRef<string | undefined>(undefined)
+  const hasMountedRef = useRef(false)
+
+  useEffect(() => {
+    // Only run once on mount, don't re-fetch when session.username changes
+    if (hasMountedRef.current) {
+      return // Already loaded, don't refresh
+    }
+    hasMountedRef.current = true
+
+    // Always check user profile first for API key, then fallback to global storage
+    const username = session.username
+    const storedApiKey = getFathomApiKey(username)
+    if (storedApiKey) {
+      setApiKey(storedApiKey)
+      setShowApiKeyInput(false)
+      // Automatically fetch meetings when API key is available
+      // Only fetch once per user to prevent unnecessary API calls
+      if (hasLoadedRef.current !== username) {
+        hasLoadedRef.current = username
+        fetchMeetings(storedApiKey)
+      }
+    } else {
+      setShowApiKeyInput(true)
+      hasLoadedRef.current = undefined
+    }
+  }, []) // Empty dependency array - only run once on mount
+
+  // Handler for individual data type buttons - creates shapes directly
+  const handleDataButtonClick = async (meeting: FathomMeeting, dataType: 'summary' | 'transcript' | 'actionItems' | 'video') => {
+    // Log to verify the correct meeting is being used
+    console.log('🔵 handleDataButtonClick called with meeting:', {
+      recording_id: meeting.recording_id,
+      title: meeting.title,
+      dataType
+    })
+
+    if (!onMeetingSelect) {
+      // Fallback for non-browser mode
+      const options = {
+        summary: dataType === 'summary',
+        transcript: dataType === 'transcript',
+        actionItems: dataType === 'actionItems',
+        video: dataType === 'video',
+      }
+      await addMeetingToCanvas(meeting, options)
+      return
+    }
+
+    // Browser mode - use callback with specific data type
+    // IMPORTANT: Pass the meeting object directly to ensure each button uses its own meeting's data
+    const options = {
+      summary: dataType === 'summary',
+      transcript: dataType === 'transcript',
+      actionItems: dataType === 'actionItems',
+      video: dataType === 'video',
+    }
+    // Always use 'note' format for summary, transcript, and action items (same behavior)
+    // Video opens URL directly, so format doesn't matter for it
+    const format = 'note'
+    onMeetingSelect(meeting, options, format)
+  }
+
+  const formatMeetingDataAsMarkdown = (fullMeeting: any, meeting: FathomMeeting, options: { summary: boolean; transcript: boolean; actionItems: boolean; video: boolean }): string => {
+    const parts: string[] = []
+
+    // Title
+    parts.push(`# ${fullMeeting.title || meeting.meeting_title || meeting.title || 'Meeting'}\n`)
+
+    // Video link if selected
+    if (options.video && (fullMeeting.url || meeting.url)) {
+      parts.push(`**Video:** [Watch Recording](${fullMeeting.url || meeting.url})\n`)
+    }
+
+    // Summary if selected
+    if (options.summary && fullMeeting.default_summary?.markdown_formatted) {
+      parts.push(`## Summary\n\n${fullMeeting.default_summary.markdown_formatted}\n`)
+    }
+
+    // Action Items if selected
+    if (options.actionItems && fullMeeting.action_items && fullMeeting.action_items.length > 0) {
+      parts.push(`## Action Items\n\n`)
+      fullMeeting.action_items.forEach((item: any) => {
+        const description = item.description || item.text || ''
+        const assignee = item.assignee?.name || item.assignee || ''
+        const dueDate = item.due_date || ''
+        parts.push(`- [ ] ${description}`)
+        if (assignee) parts[parts.length - 1] += ` (@${assignee})`
+        if (dueDate) parts[parts.length - 1] += ` - Due: ${dueDate}`
+        parts[parts.length - 1] += '\n'
+      })
+      parts.push('\n')
+    }
+
+    // Transcript if selected
+    if (options.transcript && fullMeeting.transcript && fullMeeting.transcript.length > 0) {
+      parts.push(`## Transcript\n\n`)
+      fullMeeting.transcript.forEach((entry: any) => {
+        const speaker = entry.speaker?.display_name || 'Unknown'
+        const text = entry.text || ''
+        const timestamp = entry.timestamp || ''
+        if (timestamp) {
+          parts.push(`**${speaker}** (${timestamp}): ${text}\n\n`)
+        } else {
+          parts.push(`**${speaker}**: ${text}\n\n`)
+        }
+      })
+    }
+
+    return parts.join('')
+  }
+
+  const addMeetingToCanvas = async (meeting: FathomMeeting, options: { summary: boolean; transcript: boolean; actionItems: boolean; video: boolean }) => {
     try {
+      // If video is selected, just open the Fathom URL directly
+      if (options.video) {
+        // Try multiple sources for the correct video URL
+        // The Fathom API may provide url, share_url, or we may need to construct from call_id or id
+        const callId = meeting.call_id ||
+          meeting.id ||
+          meeting.recording_id
+
+        // Check if URL fields contain valid meeting URLs (contain /calls/)
+        const isValidMeetingUrl = (url: string) => url && url.includes('/calls/')
+
+        // Prioritize valid meeting URLs, then construct from call ID
+        const videoUrl = (meeting.url && isValidMeetingUrl(meeting.url)) ? meeting.url :
+          (meeting.share_url && isValidMeetingUrl(meeting.share_url)) ? meeting.share_url :
+          (callId ? `https://fathom.video/calls/${callId}` : null)
+
+        if (videoUrl) {
+          console.log('Opening Fathom video URL:', videoUrl, 'for meeting:', { callId, recording_id: meeting.recording_id })
+          window.open(videoUrl, '_blank', 'noopener,noreferrer')
+        } else {
+          console.error('Could not determine Fathom video URL for meeting:', meeting)
+        }
+        return
+      }
+
+      // Only fetch transcript if transcript is selected
+      const includeTranscript = options.transcript
+
       // Fetch full meeting details
       let response
       try {
-        response = await fetch(`${WORKER_URL}/fathom/meetings/${meeting.id}`, {
+        response = await fetch(`${WORKER_URL}/fathom/meetings/${meeting.recording_id}${includeTranscript ? '?include_transcript=true' : ''}`, {
           headers: {
-            'Authorization': `Bearer ${apiKey}`,
+            'X-Api-Key': apiKey,
             'Content-Type': 'application/json'
           }
         })
       } catch (error) {
         console.log('Production worker failed, trying local worker...')
-        response = await fetch(`${LOCAL_WORKER_URL}/fathom/meetings/${meeting.id}`, {
+        response = await fetch(`${LOCAL_WORKER_URL}/fathom/meetings/${meeting.recording_id}${includeTranscript ? '?include_transcript=true' : ''}`, {
           headers: {
-            'Authorization': `Bearer ${apiKey}`,
+            'X-Api-Key': apiKey,
             'Content-Type': 'application/json'
           }
         })
 
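The video-URL fallback chain in the hunk above can be read as a small pure function. This is a sketch only: `MeetingLike` is an assumed, reduced subset of the real `FathomMeeting` type, and `resolveVideoUrl` is a hypothetical name for logic that lives inline in the component.

```typescript
// Reduced, assumed shape of a Fathom meeting record (field names from the diff).
interface MeetingLike {
  url?: string
  share_url?: string
  call_id?: string
  id?: string
  recording_id?: string
}

// A URL only counts as a watchable meeting URL if it contains /calls/.
const isValidMeetingUrl = (url?: string): boolean =>
  !!url && url.includes('/calls/')

// Prefer valid meeting URLs, then fall back to constructing one from a call ID.
function resolveVideoUrl(meeting: MeetingLike): string | null {
  const callId = meeting.call_id || meeting.id || meeting.recording_id
  if (meeting.url && isValidMeetingUrl(meeting.url)) return meeting.url
  if (meeting.share_url && isValidMeetingUrl(meeting.share_url)) return meeting.share_url
  return callId ? `https://fathom.video/calls/${callId}` : null
}
```

Checking each candidate for the `/calls/` substring matters because the API can return share links that do not open the recording directly.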
@@ -125,41 +288,60 @@ export function FathomMeetingsPanel({ onClose, shapeMode = false }: FathomMeetin
 
       const fullMeeting = await response.json() as any
 
-      // Create Fathom transcript shape
+      // If onMeetingSelect callback is provided, use it (browser mode - creates separate shapes)
+      if (onMeetingSelect) {
+        // Default to 'note' format for text data
+        onMeetingSelect(meeting, options, 'note')
+        // Browser stays open, don't close
+        return
+      }
+
+      // Fallback: create shape directly (for non-browser mode, like modal)
+      // Default to note format
+      const markdownContent = formatMeetingDataAsMarkdown(fullMeeting, meeting, options)
+      const title = fullMeeting.title || meeting.meeting_title || meeting.title || 'Fathom Meeting'
+
       const shapeId = createShapeId()
       editor.createShape({
         id: shapeId,
-        type: 'FathomTranscript',
+        type: 'ObsNote',
         x: 100,
         y: 100,
         props: {
-          meetingId: fullMeeting.id || '',
-          meetingTitle: fullMeeting.title || '',
-          meetingUrl: fullMeeting.url || '',
-          summary: fullMeeting.default_summary?.markdown_formatted || '',
-          transcript: fullMeeting.transcript?.map((entry: any) => ({
-            speaker: entry.speaker?.display_name || 'Unknown',
-            text: entry.text,
-            timestamp: entry.timestamp
-          })) || [],
-          actionItems: fullMeeting.action_items?.map((item: any) => ({
-            text: item.text,
-            assignee: item.assignee,
-            dueDate: item.due_date
-          })) || [],
-          isExpanded: false,
-          showTranscript: true,
-          showActionItems: true,
+          w: 400,
+          h: 500,
+          color: 'black',
+          size: 'm',
+          font: 'sans',
+          textAlign: 'start',
+          scale: 1,
+          noteId: `fathom-${meeting.recording_id}`,
+          title: title,
+          content: markdownContent,
+          tags: ['fathom', 'meeting'],
+          showPreview: true,
+          backgroundColor: '#ffffff',
+          textColor: '#000000',
+          isEditing: false,
+          editingContent: '',
+          isModified: false,
+          originalContent: markdownContent,
+          pinnedToView: false,
         }
       })
 
-      onClose()
+      // Only close if not in shape mode (browser stays open)
+      if (!shapeMode && onClose) {
+        onClose()
+      }
     } catch (error) {
       console.error('Error adding meeting to canvas:', error)
       setError(`Failed to add meeting: ${(error as Error).message}`)
     }
   }
 
+  // Removed dropdown click-outside handler - no longer needed with button-based interface
+
   const formatDate = (dateString: string) => {
     return new Date(dateString).toLocaleDateString()
   }
 
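The action-item rendering inside `formatMeetingDataAsMarkdown` builds each checkbox line incrementally (`parts[parts.length - 1] += …`). The same output can be sketched as one standalone function; `actionItemLine` is a hypothetical name, and the item shape is the assumed API shape the diff reads from.

```typescript
// Assumed shape of one Fathom action item (field names from the diff).
interface ActionItemLike {
  description?: string
  text?: string
  assignee?: { name?: string } | string
  due_date?: string
}

// One markdown checkbox line, with assignee and due date appended when present.
function actionItemLine(item: ActionItemLike): string {
  const description = item.description || item.text || ''
  const assignee = typeof item.assignee === 'string' ? item.assignee : item.assignee?.name || ''
  let line = `- [ ] ${description}`
  if (assignee) line += ` (@${assignee})`
  if (item.due_date) line += ` - Due: ${item.due_date}`
  return line + '\n'
}
```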
@@ -196,38 +378,22 @@ export function FathomMeetingsPanel({ onClose, shapeMode = false }: FathomMeetin
   }
 
   const content = (
-    <div style={contentStyle} onClick={(e) => shapeMode ? undefined : e.stopPropagation()}>
-      <div style={{
-        display: 'flex',
-        justifyContent: 'space-between',
-        alignItems: 'center',
-        marginBottom: '20px',
-        paddingBottom: '10px',
-        borderBottom: '1px solid #eee'
-      }}>
-        <h2 style={{ margin: 0, fontSize: '18px', fontWeight: 'bold' }}>
-          🎥 Fathom Meetings
-        </h2>
-        <button
-          onClick={(e) => {
-            e.stopPropagation()
-            onClose()
-          }}
-          style={{
-            background: 'none',
-            border: 'none',
-            fontSize: '20px',
-            cursor: 'pointer',
-            padding: '5px',
-            position: 'relative',
-            zIndex: 10002,
-            pointerEvents: 'auto'
-          }}
-        >
-          ✕
-        </button>
-      </div>
+    <div
+      style={contentStyle}
+      onClick={(e) => {
+        // Prevent clicks from interfering with shape selection or resetting data
+        if (!shapeMode) {
+          e.stopPropagation()
+        }
+        // In shape mode, allow normal interaction but don't reset data
+      }}
+      onMouseDown={(e) => {
+        // Prevent shape deselection when clicking inside the browser content
+        if (shapeMode) {
+          e.stopPropagation()
+        }
+      }}
+    >
 
       {showApiKeyInput ? (
         <div>
           <p style={{
@@ -269,7 +435,8 @@ export function FathomMeetingsPanel({ onClose, shapeMode = false }: FathomMeetin
               cursor: apiKey ? 'pointer' : 'not-allowed',
               position: 'relative',
               zIndex: 10002,
-              pointerEvents: 'auto'
+              pointerEvents: 'auto',
+              touchAction: 'manipulation'
             }}
           >
             Save & Load Meetings
@@ -296,7 +463,7 @@ export function FathomMeetingsPanel({ onClose, shapeMode = false }: FathomMeetin
         <>
           <div style={{ display: 'flex', gap: '10px', marginBottom: '20px' }}>
             <button
-              onClick={fetchMeetings}
+              onClick={() => fetchMeetings(apiKey)}
               disabled={loading}
               style={{
                 padding: '8px 16px',
@@ -314,9 +481,12 @@ export function FathomMeetingsPanel({ onClose, shapeMode = false }: FathomMeetin
             </button>
             <button
               onClick={() => {
-                localStorage.removeItem('fathom_api_key')
+                // Remove API key from user-specific storage
+                removeFathomApiKey(session.username)
                 setApiKey('')
+                setMeetings([])
                 setShowApiKeyInput(true)
+                hasLoadedRef.current = undefined
               }}
               style={{
                 padding: '8px 16px',
@@ -363,7 +533,7 @@ export function FathomMeetingsPanel({ onClose, shapeMode = false }: FathomMeetin
           ) : (
             meetings.map((meeting) => (
               <div
-                key={meeting.id}
+                key={meeting.recording_id}
                 style={{
                   border: '1px solid #e0e0e0',
                   borderRadius: '6px',
@@ -393,9 +563,11 @@ export function FathomMeetingsPanel({ onClose, shapeMode = false }: FathomMeetin
                     cursor: 'text'
                   }}>
                     <div>📅 {formatDate(meeting.created_at)}</div>
-                    <div>⏱️ Duration: {formatDuration(meeting.duration)}</div>
+                    <div>⏱️ Duration: {meeting.recording_start_time && meeting.recording_end_time
+                      ? formatDuration(Math.floor((new Date(meeting.recording_end_time).getTime() - new Date(meeting.recording_start_time).getTime()) / 1000))
+                      : 'N/A'}</div>
                   </div>
-                  {meeting.summary && (
+                  {meeting.default_summary?.markdown_formatted && (
                     <div style={{
                       fontSize: '11px',
                       color: '#333',
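The inline duration expression in the hunk above can be factored out for clarity. `durationSeconds` is a hypothetical helper name; the field names mirror the diff's `recording_start_time` / `recording_end_time` ISO strings.

```typescript
// Whole seconds between two ISO timestamps; null maps to the panel's 'N/A'.
function durationSeconds(start?: string, end?: string): number | null {
  if (!start || !end) return null
  return Math.floor((new Date(end).getTime() - new Date(start).getTime()) / 1000)
}
```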
@@ -403,28 +575,91 @@ export function FathomMeetingsPanel({ onClose, shapeMode = false }: FathomMeetin
                       userSelect: 'text',
                       cursor: 'text'
                     }}>
-                      <strong>Summary:</strong> {meeting.summary.markdown_formatted.substring(0, 100)}...
+                      <strong>Summary:</strong> {meeting.default_summary.markdown_formatted.substring(0, 100)}...
                     </div>
                   )}
                 </div>
-                <button
-                  onClick={() => addMeetingToCanvas(meeting)}
-                  style={{
-                    padding: '6px 12px',
-                    backgroundColor: '#28a745',
-                    color: 'white',
-                    border: 'none',
-                    borderRadius: '4px',
-                    cursor: 'pointer',
-                    fontSize: '12px',
-                    marginLeft: '10px',
-                    position: 'relative',
-                    zIndex: 10002,
-                    pointerEvents: 'auto'
-                  }}
-                >
-                  Add to Canvas
-                </button>
+                <div style={{
+                  display: 'flex',
+                  flexDirection: 'row',
+                  gap: '6px',
+                  marginLeft: '10px',
+                  alignItems: 'center',
+                  flexWrap: 'wrap'
+                }}>
+                  <button
+                    onClick={() => handleDataButtonClick(meeting, 'summary')}
+                    disabled={loading}
+                    style={{
+                      padding: '6px 12px',
+                      backgroundColor: '#3b82f6',
+                      color: 'white',
+                      border: 'none',
+                      borderRadius: '4px',
+                      cursor: loading ? 'not-allowed' : 'pointer',
+                      fontSize: '11px',
+                      whiteSpace: 'nowrap',
+                      opacity: loading ? 0.6 : 1
+                    }}
+                    title="Add Summary as Note"
+                  >
+                    📄 Summary
+                  </button>
+                  <button
+                    onClick={() => handleDataButtonClick(meeting, 'transcript')}
+                    disabled={loading}
+                    style={{
+                      padding: '6px 12px',
+                      backgroundColor: '#2563eb',
+                      color: 'white',
+                      border: 'none',
+                      borderRadius: '4px',
+                      cursor: loading ? 'not-allowed' : 'pointer',
+                      fontSize: '11px',
+                      whiteSpace: 'nowrap',
+                      opacity: loading ? 0.6 : 1
+                    }}
+                    title="Add Transcript as Note"
+                  >
+                    📝 Transcript
+                  </button>
+                  <button
+                    onClick={() => handleDataButtonClick(meeting, 'actionItems')}
+                    disabled={loading}
+                    style={{
+                      padding: '6px 12px',
+                      backgroundColor: '#1d4ed8',
+                      color: 'white',
+                      border: 'none',
+                      borderRadius: '4px',
+                      cursor: loading ? 'not-allowed' : 'pointer',
+                      fontSize: '11px',
+                      whiteSpace: 'nowrap',
+                      opacity: loading ? 0.6 : 1
+                    }}
+                    title="Add Action Items as Note"
+                  >
+                    ✅ Actions
+                  </button>
+                  <button
+                    onClick={() => handleDataButtonClick(meeting, 'video')}
+                    disabled={loading}
+                    style={{
+                      padding: '6px 12px',
+                      backgroundColor: '#1e40af',
+                      color: 'white',
+                      border: 'none',
+                      borderRadius: '4px',
+                      cursor: loading ? 'not-allowed' : 'pointer',
+                      fontSize: '11px',
+                      whiteSpace: 'nowrap',
+                      opacity: loading ? 0.6 : 1
+                    }}
+                    title="Add Video as Embed"
+                  >
+                    🎥 Video
+                  </button>
+                </div>
               </div>
             </div>
           ))
@@ -477,3 +712,4 @@ export function FathomMeetingsPanel({ onClose, shapeMode = false }: FathomMeetin
 
 
 
+
@@ -49,14 +49,33 @@ export function HolonBrowser({ isOpen, onClose, onSelectHolon, shapeMode = false
     setHolonInfo(null)
 
     try {
-      // Validate that the holonId is a valid H3 index
-      if (!h3.isValidCell(holonId)) {
-        throw new Error('Invalid H3 cell ID')
+      // Check if it's a valid H3 cell ID
+      const isH3Cell = h3.isValidCell(holonId)
+
+      // Check if it's a numeric Holon ID (workspace/group identifier)
+      const isNumericId = /^\d{6,20}$/.test(holonId)
+
+      // Check if it's an alphanumeric identifier
+      const isAlphanumericId = /^[a-zA-Z0-9_-]{3,50}$/.test(holonId)
+
+      if (!isH3Cell && !isNumericId && !isAlphanumericId) {
+        throw new Error('Invalid Holon ID. Enter an H3 cell ID (e.g., 872a1070bffffff) or a numeric Holon ID (e.g., 1002848305066)')
       }
 
-      // Get holon information
-      const resolution = h3.getResolution(holonId)
-      const [lat, lng] = h3.cellToLatLng(holonId)
+      // Get holon information based on ID type
+      let resolution: number
+      let lat: number
+      let lng: number
+
+      if (isH3Cell) {
+        resolution = h3.getResolution(holonId)
+        ;[lat, lng] = h3.cellToLatLng(holonId)
+      } else {
+        // For non-H3 IDs, use default values
+        resolution = -1 // Indicates non-geospatial holon
+        lat = 0
+        lng = 0
+      }
 
       // Try to get metadata from the holon
       let metadata = null
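The three ID checks above can be read as a classifier: numeric IDs are tested before the broader alphanumeric pattern because any 6-to-20-digit string also matches the alphanumeric regex. This sketch uses a hypothetical `classifyHolonId` name and takes `h3.isValidCell` as an injected predicate so it stays self-contained.

```typescript
type HolonIdKind = 'h3' | 'numeric' | 'alphanumeric' | 'invalid'

// Regexes copied from the diff; isH3Cell stands in for h3.isValidCell.
function classifyHolonId(id: string, isH3Cell: (s: string) => boolean): HolonIdKind {
  if (isH3Cell(id)) return 'h3'
  if (/^\d{6,20}$/.test(id)) return 'numeric'       // workspace/group identifier
  if (/^[a-zA-Z0-9_-]{3,50}$/.test(id)) return 'alphanumeric'
  return 'invalid'
}
```

Note that the component computes all three flags and only rejects when none match; the ordering here just makes the precedence explicit.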
@@ -101,7 +120,9 @@ export function HolonBrowser({ isOpen, onClose, onSelectHolon, shapeMode = false
         latitude: lat,
         longitude: lng,
         resolution: resolution,
-        resolutionName: HoloSphereService.getResolutionName(resolution),
+        resolutionName: resolution >= 0
+          ? HoloSphereService.getResolutionName(resolution)
+          : 'Workspace / Group',
         data: {},
         lastUpdated: metadata?.lastUpdated || Date.now()
       }
@@ -192,7 +213,7 @@ export function HolonBrowser({ isOpen, onClose, onSelectHolon, shapeMode = false
             </button>
           </div>
           <p className="text-sm text-gray-600 mt-2">
-            Enter a Holon ID to browse its data and import it to your canvas
+            Enter a Holon ID (numeric like 1002848305066 or H3 cell like 872a1070bffffff) to browse its data
           </p>
         </div>
       )}
@@ -210,7 +231,7 @@ export function HolonBrowser({ isOpen, onClose, onSelectHolon, shapeMode = false
               value={holonId}
               onChange={(e) => setHolonId(e.target.value)}
               onKeyDown={handleKeyDown}
-              placeholder="e.g., 1002848305066"
+              placeholder="e.g., 1002848305066 or 872a1070bffffff"
               className="flex-1 px-3 py-2 border border-gray-300 rounded-md focus:outline-none focus:ring-2 focus:ring-blue-500 z-[10001] relative"
               disabled={isLoading}
               style={{ zIndex: 10001 }}
@@ -237,18 +258,29 @@ export function HolonBrowser({ isOpen, onClose, onSelectHolon, shapeMode = false
               </h3>
 
               <div className="grid grid-cols-1 md:grid-cols-2 gap-4 mb-4">
-                <div>
-                  <p className="text-sm text-gray-600">Coordinates</p>
-                  <p className="font-mono text-sm">
-                    {holonInfo.latitude.toFixed(6)}, {holonInfo.longitude.toFixed(6)}
-                  </p>
-                </div>
-                <div>
-                  <p className="text-sm text-gray-600">Resolution</p>
-                  <p className="text-sm">
-                    {holonInfo.resolutionName} (Level {holonInfo.resolution})
-                  </p>
-                </div>
+                {holonInfo.resolution >= 0 ? (
+                  <>
+                    <div>
+                      <p className="text-sm text-gray-600">Coordinates</p>
+                      <p className="font-mono text-sm">
+                        {holonInfo.latitude.toFixed(6)}, {holonInfo.longitude.toFixed(6)}
+                      </p>
+                    </div>
+                    <div>
+                      <p className="text-sm text-gray-600">Resolution</p>
+                      <p className="text-sm">
+                        {holonInfo.resolutionName} (Level {holonInfo.resolution})
+                      </p>
+                    </div>
+                  </>
+                ) : (
+                  <div>
+                    <p className="text-sm text-gray-600">Type</p>
+                    <p className="text-sm font-medium text-green-600">
+                      {holonInfo.resolutionName}
+                    </p>
+                  </div>
+                )}
                 <div>
                   <p className="text-sm text-gray-600">Holon ID</p>
                   <p className="font-mono text-xs break-all">{holonInfo.id}</p>
@@ -1,4 +1,29 @@
-import React, { useState, ReactNode } from 'react'
+import React, { useState, ReactNode, useEffect, useRef, useMemo } from 'react'
+
+// Hook to detect dark mode
+function useIsDarkMode() {
+  const [isDark, setIsDark] = useState(() => {
+    if (typeof document !== 'undefined') {
+      return document.documentElement.classList.contains('dark')
+    }
+    return false
+  })
+
+  useEffect(() => {
+    const observer = new MutationObserver((mutations) => {
+      mutations.forEach((mutation) => {
+        if (mutation.attributeName === 'class') {
+          setIsDark(document.documentElement.classList.contains('dark'))
+        }
+      })
+    })
+
+    observer.observe(document.documentElement, { attributes: true })
+    return () => observer.disconnect()
+  }, [])
+
+  return isDark
+}
 
 export interface StandardizedToolWrapperProps {
   /** The title to display in the header */
@@ -25,6 +50,16 @@ export interface StandardizedToolWrapperProps {
   editor?: any
   /** Shape ID for selection handling */
   shapeId?: string
+  /** Whether the shape is pinned to view */
+  isPinnedToView?: boolean
+  /** Callback when pin button is clicked */
+  onPinToggle?: () => void
+  /** Tags to display at the bottom of the shape */
+  tags?: string[]
+  /** Callback when tags are updated */
+  onTagsChange?: (tags: string[]) => void
+  /** Whether tags can be edited */
+  tagsEditable?: boolean
 }
 
 /**
@@ -44,9 +79,70 @@ export const StandardizedToolWrapper: React.FC<StandardizedToolWrapperProps> = (
   headerContent,
   editor,
   shapeId,
+  isPinnedToView = false,
+  onPinToggle,
+  tags = [],
+  onTagsChange,
+  tagsEditable = true,
 }) => {
   const [isHoveringHeader, setIsHoveringHeader] = useState(false)
+  const [isEditingTags, setIsEditingTags] = useState(false)
+  const [editingTagInput, setEditingTagInput] = useState('')
+  const tagInputRef = useRef<HTMLInputElement>(null)
+  const isDarkMode = useIsDarkMode()
+
+  // Dark mode aware colors
+  const colors = useMemo(() => isDarkMode ? {
+    contentBg: '#1a1a1a',
+    tagsBg: '#252525',
+    tagsBorder: '#404040',
+    tagBg: '#4a5568',
+    tagText: '#e4e4e4',
+    addTagBg: '#4a5568',
+    inputBg: '#333333',
+    inputBorder: '#555555',
+  } : {
+    contentBg: 'white',
+    tagsBg: '#f8f9fa',
+    tagsBorder: '#e0e0e0',
+    tagBg: '#6b7280',
+    tagText: 'white',
+    addTagBg: '#9ca3af',
+    inputBg: 'white',
+    inputBorder: '#9ca3af',
+  }, [isDarkMode])
+
+  // Bring selected shape to front when it becomes selected
+  useEffect(() => {
+    if (editor && shapeId && isSelected) {
+      try {
+        // Bring the shape to the front by updating its index
+        // Note: sendToFront doesn't exist in this version of tldraw
+        const allShapes = editor.getCurrentPageShapes()
+        let highestIndex = 'a0'
+        for (const s of allShapes) {
+          if (s.index && typeof s.index === 'string' && s.index > highestIndex) {
+            highestIndex = s.index
+          }
+        }
+        const shape = editor.getShape(shapeId)
+        if (shape) {
+          const match = highestIndex.match(/^([a-z])(\d+)$/)
+          if (match) {
+            const letter = match[1]
+            const num = parseInt(match[2], 10)
+            const newIndex = num < 100 ? `${letter}${num + 1}` : `${String.fromCharCode(letter.charCodeAt(0) + 1)}1`
+            if (/^[a-z]\d+$/.test(newIndex)) {
+              editor.updateShape({ id: shapeId, type: shape.type, index: newIndex as any })
+            }
+          }
+        }
+      } catch (error) {
+        // Silently fail if shape doesn't exist or operation fails
+        // This prevents console spam if shape is deleted during selection
+      }
+    }
+  }, [editor, shapeId, isSelected])
+
   // Calculate header background color (lighter shade of primary color)
   const headerBgColor = isSelected
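The index bump in the effect above can be isolated into a pure function, which makes its two cases easier to see: increment the numeric part below 100, otherwise advance the letter and reset to 1. `bumpIndex` is a hypothetical name, and this only covers the simple letter-plus-digits index form the effect handles; real tldraw fractional indexes can be more complex, which is why the original validates before writing.

```typescript
// Next index above `highestIndex`, or null if the input isn't letter+digits.
function bumpIndex(highestIndex: string): string | null {
  const match = highestIndex.match(/^([a-z])(\d+)$/)
  if (!match) return null
  const letter = match[1]
  const num = parseInt(match[2], 10)
  const next = num < 100
    ? `${letter}${num + 1}`                                  // a3 -> a4
    : `${String.fromCharCode(letter.charCodeAt(0) + 1)}1`    // a100 -> b1
  return /^[a-z]\d+$/.test(next) ? next : null
}
```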
@@ -58,13 +154,13 @@ export const StandardizedToolWrapper: React.FC<StandardizedToolWrapperProps> = (
   const wrapperStyle: React.CSSProperties = {
     width: typeof width === 'number' ? `${width}px` : width,
     height: isMinimized ? 40 : (typeof height === 'number' ? `${height}px` : height), // Minimized height is just the header
-    backgroundColor: "white",
+    backgroundColor: colors.contentBg,
     border: isSelected ? `2px solid ${primaryColor}` : `1px solid ${primaryColor}40`,
     borderRadius: "8px",
     overflow: "hidden",
     boxShadow: isSelected
-      ? `0 0 0 2px ${primaryColor}40, 0 4px 8px rgba(0,0,0,0.15)`
-      : '0 2px 4px rgba(0,0,0,0.1)',
+      ? `0 0 0 2px ${primaryColor}40, 0 4px 8px rgba(0,0,0,${isDarkMode ? '0.4' : '0.15'})`
+      : `0 2px 4px rgba(0,0,0,${isDarkMode ? '0.3' : '0.1'})`,
     display: 'flex',
     flexDirection: 'column',
     fontFamily: "Inter, sans-serif",
@@ -107,8 +203,8 @@ export const StandardizedToolWrapper: React.FC<StandardizedToolWrapperProps> = (
   }
 
   const buttonBaseStyle: React.CSSProperties = {
-    width: '20px',
-    height: '20px',
+    width: '24px',
+    height: '24px',
     borderRadius: '4px',
     border: 'none',
     cursor: 'pointer',
@@ -120,6 +216,9 @@ export const StandardizedToolWrapper: React.FC<StandardizedToolWrapperProps> = (
     transition: 'background-color 0.15s ease, color 0.15s ease',
     pointerEvents: 'auto',
     flexShrink: 0,
+    touchAction: 'manipulation', // Prevent double-tap zoom, improve touch responsiveness
+    padding: 0,
+    margin: 0,
   }
 
   const minimizeButtonStyle: React.CSSProperties = {
@@ -128,6 +227,16 @@ export const StandardizedToolWrapper: React.FC<StandardizedToolWrapperProps> = (
     color: isSelected ? 'white' : primaryColor,
   }
 
+  const pinButtonStyle: React.CSSProperties = {
+    ...buttonBaseStyle,
+    backgroundColor: isPinnedToView
+      ? (isSelected ? 'rgba(255,255,255,0.4)' : primaryColor)
+      : (isSelected ? 'rgba(255,255,255,0.2)' : `${primaryColor}20`),
+    color: isPinnedToView
+      ? (isSelected ? 'white' : 'white')
+      : (isSelected ? 'white' : primaryColor),
+  }
+
   const closeButtonStyle: React.CSSProperties = {
     ...buttonBaseStyle,
     backgroundColor: isSelected ? 'rgba(255,255,255,0.2)' : `${primaryColor}20`,
@@ -143,27 +252,148 @@ export const StandardizedToolWrapper: React.FC<StandardizedToolWrapperProps> = (
     transition: 'height 0.2s ease',
     display: 'flex',
     flexDirection: 'column',
+    flex: 1,
   }
 
+  const tagsContainerStyle: React.CSSProperties = {
+    padding: '8px 12px',
+    borderTop: `1px solid ${colors.tagsBorder}`,
+    display: 'flex',
+    flexWrap: 'wrap',
+    gap: '4px',
+    alignItems: 'center',
+    minHeight: '32px',
+    backgroundColor: colors.tagsBg,
+    flexShrink: 0,
+    touchAction: 'manipulation', // Improve touch responsiveness
+  }
+
+  const tagStyle: React.CSSProperties = {
+    backgroundColor: colors.tagBg,
+    color: colors.tagText,
+    padding: '4px 8px', // Increased padding for better touch target
+    borderRadius: '12px',
+    fontSize: '10px',
+    fontWeight: '500',
+    display: 'inline-flex',
+    alignItems: 'center',
+    gap: '4px',
+    cursor: tagsEditable ? 'pointer' : 'default',
+    touchAction: 'manipulation', // Improve touch responsiveness
+    minHeight: '24px', // Ensure adequate touch target height
+  }
+
+  const tagInputStyle: React.CSSProperties = {
+    border: `1px solid ${colors.inputBorder}`,
+    borderRadius: '12px',
+    padding: '2px 6px',
+    fontSize: '10px',
+    outline: 'none',
+    minWidth: '60px',
+    flex: 1,
+    backgroundColor: colors.inputBg,
+    color: isDarkMode ? '#e4e4e4' : '#333',
+  }
+
+  const addTagButtonStyle: React.CSSProperties = {
+    backgroundColor: colors.addTagBg,
+    color: colors.tagText,
+    border: 'none',
+    borderRadius: '12px',
+    padding: '4px 10px', // Increased padding for better touch target
+    fontSize: '10px',
+    fontWeight: '500',
+    cursor: 'pointer',
+    display: 'flex',
+    alignItems: 'center',
+    gap: '4px',
+    touchAction: 'manipulation', // Improve touch responsiveness
+    minHeight: '24px', // Ensure adequate touch target height
+  }
+
+  const handleTagClick = (tag: string) => {
+    if (tagsEditable && onTagsChange) {
|
||||||
|
// Remove tag on click
|
||||||
|
const newTags = tags.filter(t => t !== tag)
|
||||||
|
onTagsChange(newTags)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
const handleAddTag = () => {
|
||||||
|
if (editingTagInput.trim() && onTagsChange) {
|
||||||
|
const newTag = editingTagInput.trim().replace('#', '')
|
||||||
|
if (newTag && !tags.includes(newTag) && !tags.includes(`#${newTag}`)) {
|
||||||
|
const tagToAdd = newTag.startsWith('#') ? newTag : newTag
|
||||||
|
onTagsChange([...tags, tagToAdd])
|
||||||
|
}
|
||||||
|
setEditingTagInput('')
|
||||||
|
setIsEditingTags(false)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
const handleTagInputKeyDown = (e: React.KeyboardEvent<HTMLInputElement>) => {
|
||||||
|
if (e.key === 'Enter') {
|
||||||
|
e.preventDefault()
|
||||||
|
e.stopPropagation()
|
||||||
|
handleAddTag()
|
||||||
|
} else if (e.key === 'Escape') {
|
||||||
|
e.preventDefault()
|
||||||
|
e.stopPropagation()
|
||||||
|
setIsEditingTags(false)
|
||||||
|
setEditingTagInput('')
|
||||||
|
} else if (e.key === 'Backspace' && editingTagInput === '' && tags.length > 0) {
|
||||||
|
// Remove last tag if backspace on empty input
|
||||||
|
e.stopPropagation()
|
||||||
|
if (onTagsChange) {
|
||||||
|
onTagsChange(tags.slice(0, -1))
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
useEffect(() => {
|
||||||
|
if (isEditingTags && tagInputRef.current) {
|
||||||
|
tagInputRef.current.focus()
|
||||||
|
}
|
||||||
|
}, [isEditingTags])
|
||||||
|
|
||||||
const handleHeaderPointerDown = (e: React.PointerEvent) => {
|
const handleHeaderPointerDown = (e: React.PointerEvent) => {
|
||||||
// Check if this is an interactive element (button)
|
// Check if this is an interactive element (button)
|
||||||
const target = e.target as HTMLElement
|
const target = e.target as HTMLElement
|
||||||
const isInteractive =
|
const isInteractive =
|
||||||
target.tagName === 'BUTTON' ||
|
target.tagName === 'BUTTON' ||
|
||||||
target.closest('button') ||
|
target.closest('button') ||
|
||||||
target.closest('[role="button"]')
|
target.closest('[role="button"]')
|
||||||
|
|
||||||
if (isInteractive) {
|
if (isInteractive) {
|
||||||
// Buttons handle their own behavior and stop propagation
|
// Buttons handle their own behavior and stop propagation
|
||||||
return
|
return
|
||||||
}
|
}
|
||||||
|
|
||||||
// Don't stop the event - let tldraw handle it naturally
|
// CRITICAL: Switch to select tool and select this shape when dragging header
|
||||||
// The hand tool override will detect shapes and handle dragging
|
// This ensures dragging works regardless of which tool is currently active
|
||||||
|
if (editor && shapeId) {
|
||||||
|
const currentTool = editor.getCurrentToolId()
|
||||||
|
if (currentTool !== 'select') {
|
||||||
|
editor.setCurrentTool('select')
|
||||||
|
}
|
||||||
|
// Select this shape if not already selected
|
||||||
|
if (!isSelected) {
|
||||||
|
editor.setSelectedShapes([shapeId])
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Don't stop the event - let tldraw handle the drag naturally
|
||||||
}
|
}
|
||||||
|
|
||||||
const handleButtonClick = (e: React.MouseEvent, action: () => void) => {
|
const handleButtonClick = (e: React.MouseEvent, action: () => void) => {
|
||||||
e.stopPropagation()
|
e.stopPropagation()
|
||||||
|
e.preventDefault()
|
||||||
|
action()
|
||||||
|
}
|
||||||
|
|
||||||
|
const handleButtonTouch = (e: React.TouchEvent, action: () => void) => {
|
||||||
|
e.stopPropagation()
|
||||||
|
e.preventDefault()
|
||||||
action()
|
action()
|
||||||
}
|
}
|
||||||
|
|
||||||
|
|
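The tag-editing handlers added in this hunk (handleAddTag, handleTagClick, and the Backspace branch of handleTagInputKeyDown) reduce to a few pure list operations. A minimal sketch of that core logic outside React, with illustrative function names that are not taken from the diff:

```typescript
// Normalize the input and append it, skipping duplicates in either
// 'tag' or '#tag' form (mirrors handleAddTag in the diff).
function addTag(tags: string[], input: string): string[] {
  const newTag = input.trim().replace('#', '')
  if (!newTag || tags.includes(newTag) || tags.includes(`#${newTag}`)) {
    return tags
  }
  return [...tags, newTag]
}

// Clicking a tag removes it (mirrors handleTagClick).
function removeTag(tags: string[], tag: string): string[] {
  return tags.filter(t => t !== tag)
}

// Backspace on an empty input pops the most recent tag
// (mirrors the Backspace branch of handleTagInputKeyDown).
function backspaceOnEmpty(tags: string[], currentInput: string): string[] {
  return currentInput === '' && tags.length > 0 ? tags.slice(0, -1) : tags
}
```

Keeping the list operations pure like this makes the React handlers thin wrappers around `onTagsChange`.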
@@ -197,7 +427,18 @@ export const StandardizedToolWrapper: React.FC<StandardizedToolWrapperProps> = (
   onPointerDown={handleHeaderPointerDown}
   onMouseEnter={() => setIsHoveringHeader(true)}
   onMouseLeave={() => setIsHoveringHeader(false)}
-  onMouseDown={(_e) => {
+  onMouseDown={(e) => {
+    // Don't select if clicking on a button - let the button handle the click
+    const target = e.target as HTMLElement
+    const isButton =
+      target.tagName === 'BUTTON' ||
+      target.closest('button') ||
+      target.closest('[role="button"]')
+
+    if (isButton) {
+      return
+    }
+
     // Ensure selection happens on mouse down for immediate visual feedback
     if (editor && shapeId && !isSelected) {
       editor.setSelectedShapes([shapeId])

@@ -209,6 +450,20 @@ export const StandardizedToolWrapper: React.FC<StandardizedToolWrapperProps> = (
     {headerContent || title}
   </div>
   <div style={buttonContainerStyle}>
+    {onPinToggle && (
+      <button
+        style={pinButtonStyle}
+        onClick={(e) => handleButtonClick(e, onPinToggle)}
+        onPointerDown={(e) => e.stopPropagation()}
+        onMouseDown={(e) => e.stopPropagation()}
+        onTouchStart={(e) => handleButtonTouch(e, onPinToggle)}
+        onTouchEnd={(e) => e.stopPropagation()}
+        title={isPinnedToView ? "Unpin from view" : "Pin to view"}
+        aria-label={isPinnedToView ? "Unpin from view" : "Pin to view"}
+      >
+        📌
+      </button>
+    )}
     <button
       style={minimizeButtonStyle}
       onClick={(e) => {

@@ -220,6 +475,13 @@ export const StandardizedToolWrapper: React.FC<StandardizedToolWrapperProps> = (
         }
       }}
       onPointerDown={(e) => e.stopPropagation()}
+      onMouseDown={(e) => e.stopPropagation()}
+      onTouchStart={(e) => {
+        if (onMinimize) {
+          handleButtonTouch(e, onMinimize)
+        }
+      }}
+      onTouchEnd={(e) => e.stopPropagation()}
      title="Minimize"
      aria-label="Minimize"
      disabled={!onMinimize}

@@ -230,6 +492,9 @@ export const StandardizedToolWrapper: React.FC<StandardizedToolWrapperProps> = (
      style={closeButtonStyle}
      onClick={(e) => handleButtonClick(e, onClose)}
      onPointerDown={(e) => e.stopPropagation()}
+     onMouseDown={(e) => e.stopPropagation()}
+     onTouchStart={(e) => handleButtonTouch(e, onClose)}
+     onTouchEnd={(e) => e.stopPropagation()}
      title="Close"
      aria-label="Close"
    >

@@ -240,12 +505,87 @@ export const StandardizedToolWrapper: React.FC<StandardizedToolWrapperProps> = (
 
   {/* Content Area */}
   {!isMinimized && (
-    <div
-      style={contentStyle}
-      onPointerDown={handleContentPointerDown}
-    >
-      {children}
-    </div>
+    <>
+      <div
+        style={contentStyle}
+        onPointerDown={handleContentPointerDown}
+      >
+        {children}
+      </div>
+
+      {/* Tags at the bottom */}
+      {(tags.length > 0 || (tagsEditable && isSelected)) && (
+        <div
+          style={tagsContainerStyle}
+          onPointerDown={(e) => e.stopPropagation()}
+          onTouchStart={(e) => e.stopPropagation()}
+          onClick={(e) => {
+            if (tagsEditable && !isEditingTags && e.target === e.currentTarget) {
+              setIsEditingTags(true)
+            }
+          }}
+        >
+          {tags.slice(0, 5).map((tag, index) => (
+            <span
+              key={index}
+              style={tagStyle}
+              onClick={(e) => {
+                e.stopPropagation()
+                handleTagClick(tag)
+              }}
+              onTouchEnd={(e) => {
+                e.stopPropagation()
+                e.preventDefault()
+                handleTagClick(tag)
+              }}
+              title={tagsEditable ? "Click to remove tag" : undefined}
+            >
+              {tag.replace('#', '')}
+              {tagsEditable && <span style={{ fontSize: '8px' }}>×</span>}
+            </span>
+          ))}
+          {tags.length > 5 && (
+            <span style={tagStyle}>
+              +{tags.length - 5}
+            </span>
+          )}
+          {isEditingTags && (
+            <input
+              ref={tagInputRef}
+              type="text"
+              value={editingTagInput}
+              onChange={(e) => setEditingTagInput(e.target.value)}
+              onKeyDown={handleTagInputKeyDown}
+              onBlur={() => {
+                handleAddTag()
+              }}
+              style={tagInputStyle}
+              placeholder="Add tag..."
+              onPointerDown={(e) => e.stopPropagation()}
+            />
+          )}
+          {!isEditingTags && tagsEditable && isSelected && tags.length < 10 && (
+            <button
+              style={addTagButtonStyle}
+              onClick={(e) => {
+                e.stopPropagation()
+                setIsEditingTags(true)
+              }}
+              onPointerDown={(e) => e.stopPropagation()}
+              onTouchStart={(e) => {
+                e.stopPropagation()
+                e.preventDefault()
+                setIsEditingTags(true)
+              }}
+              onTouchEnd={(e) => e.stopPropagation()}
+              title="Add tag"
+            >
+              + Add
+            </button>
+          )}
+        </div>
+      )}
+    </>
   )}
 </div>
 )
 
@@ -85,15 +85,21 @@ const StarBoardButton: React.FC<StarBoardButtonProps> = ({ className = '' }) =>
   <button
     onClick={handleStarToggle}
     disabled={isLoading}
-    className={`star-board-button ${className} ${isStarred ? 'starred' : ''}`}
+    className={`toolbar-btn star-board-button ${className} ${isStarred ? 'starred' : ''}`}
     title={isStarred ? 'Remove from starred boards' : 'Add to starred boards'}
   >
     {isLoading ? (
-      <span className="loading-spinner">⏳</span>
-    ) : isStarred ? (
-      <span className="star-icon starred">⭐</span>
+      <svg width="14" height="14" viewBox="0 0 16 16" fill="currentColor" className="loading-spinner">
+        <path d="M8 3a5 5 0 1 0 4.546 2.914.5.5 0 0 1 .908-.417A6 6 0 1 1 8 2v1z"/>
+      </svg>
     ) : (
-      <span className="star-icon">☆</span>
+      <svg width="14" height="14" viewBox="0 0 16 16" fill="currentColor">
+        {isStarred ? (
+          <path d="M3.612 15.443c-.386.198-.824-.149-.746-.592l.83-4.73L.173 6.765c-.329-.314-.158-.888.283-.95l4.898-.696L7.538.792c.197-.39.73-.39.927 0l2.184 4.327 4.898.696c.441.062.612.636.282.95l-3.522 3.356.83 4.73c.078.443-.36.79-.746.592L8 13.187l-4.389 2.256z"/>
+        ) : (
+          <path d="M2.866 14.85c-.078.444.36.791.746.593l4.39-2.256 4.389 2.256c.386.198.824-.149.746-.592l-.83-4.73 3.522-3.356c.33-.314.16-.888-.282-.95l-4.898-.696L8.465.792a.513.513 0 0 0-.927 0L5.354 5.12l-4.898.696c-.441.062-.612.636-.283.95l3.523 3.356-.83 4.73zm4.905-2.767-3.686 1.894.694-3.957a.565.565 0 0 0-.163-.505L1.71 6.745l4.052-.576a.525.525 0 0 0 .393-.288L8 2.223l1.847 3.658a.525.525 0 0 0 .393.288l4.052.575-2.906 2.77a.565.565 0 0 0-.163.506l.694 3.957-3.686-1.894a.503.503 0 0 0-.461 0z"/>
+        )}
+      </svg>
     )}
   </button>
 
@@ -4,15 +4,15 @@ import { useAuth } from '../../context/AuthContext';
 import { useNotifications } from '../../context/NotificationContext';
 import { checkBrowserSupport, isSecureContext } from '../../lib/utils/browser';
 
-interface CryptoLoginProps {
+interface CryptIDProps {
   onSuccess?: () => void;
   onCancel?: () => void;
 }
 
 /**
- * WebCryptoAPI-based authentication component
+ * CryptID - WebCryptoAPI-based authentication component
  */
-const CryptoLogin: React.FC<CryptoLoginProps> = ({ onSuccess, onCancel }) => {
+const CryptID: React.FC<CryptIDProps> = ({ onSuccess, onCancel }) => {
   const [username, setUsername] = useState('');
   const [isRegistering, setIsRegistering] = useState(false);
   const [error, setError] = useState<string | null>(null);

@@ -178,7 +178,7 @@ const CryptoLogin: React.FC<CryptoLoginProps> = ({ onSuccess, onCancel }) => {
 
   return (
     <div className="crypto-login-container">
-      <h2>{isRegistering ? 'Create Cryptographic Account' : 'Cryptographic Sign In'}</h2>
+      <h2>{isRegistering ? 'Create CryptID Account' : 'CryptID Sign In'}</h2>
 
       {/* Show existing users if available */}
       {existingUsers.length > 0 && !isRegistering && (

@@ -206,11 +206,11 @@ const CryptoLogin: React.FC<CryptoLoginProps> = ({ onSuccess, onCancel }) => {
 
       <div className="crypto-info">
         <p>
           {isRegistering
-            ? 'Create a new account using WebCryptoAPI for secure authentication.'
+            ? 'Create a new CryptID account using WebCryptoAPI for secure authentication.'
             : existingUsers.length > 0
               ? 'Select an account above or enter a different username to sign in.'
-              : 'Sign in using your cryptographic credentials.'
+              : 'Sign in using your CryptID credentials.'
           }
         </p>
         <div className="crypto-features">

@@ -276,4 +276,4 @@ const CryptoLogin: React.FC<CryptoLoginProps> = ({ onSuccess, onCancel }) => {
   );
 };
 
-export default CryptoLogin;
+export default CryptID;

@@ -145,7 +145,7 @@ const CryptoDebug: React.FC = () => {
       const storedUsers = JSON.parse(localStorage.getItem('registeredUsers') || '[]');
       addResult(`All registered users: ${JSON.stringify(storedUsers)}`);
 
-      // Filter for users with valid keys (same logic as CryptoLogin)
+      // Filter for users with valid keys (same logic as CryptID)
       const validUsers = storedUsers.filter((user: string) => {
         const publicKey = localStorage.getItem(`${user}_publicKey`);
         if (!publicKey) return false;
 
@@ -1,102 +0,0 @@
-import React, { useState, useEffect } from 'react'
-import { useNavigate } from 'react-router-dom'
-import { createAccountLinkingConsumer } from '../../lib/auth/linking'
-import { useAuth } from '../../context/AuthContext'
-import { useNotifications } from '../../context/NotificationContext'
-
-const LinkDevice: React.FC = () => {
-  const [username, setUsername] = useState('')
-  const [displayPin, setDisplayPin] = useState('')
-  const [view, setView] = useState<'enter-username' | 'show-pin' | 'load-filesystem'>('enter-username')
-  const [accountLinkingConsumer, setAccountLinkingConsumer] = useState<any>(null)
-  const navigate = useNavigate()
-  const { login } = useAuth()
-  const { addNotification } = useNotifications()
-
-  const initAccountLinkingConsumer = async () => {
-    try {
-      const consumer = await createAccountLinkingConsumer(username)
-      setAccountLinkingConsumer(consumer)
-
-      consumer.on('challenge', ({ pin }: { pin: number[] }) => {
-        setDisplayPin(pin.join(''))
-        setView('show-pin')
-      })
-
-      consumer.on('link', async ({ approved, username }: { approved: boolean, username: string }) => {
-        if (approved) {
-          setView('load-filesystem')
-
-          const success = await login(username)
-
-          if (success) {
-            addNotification("You're now connected!", "success")
-            navigate('/')
-          } else {
-            addNotification("Connection successful but login failed", "error")
-            navigate('/login')
-          }
-        } else {
-          addNotification('The connection attempt was cancelled', "warning")
-          navigate('/')
-        }
-      })
-    } catch (error) {
-      console.error('Error initializing account linking consumer:', error)
-      addNotification('Failed to initialize device linking', "error")
-    }
-  }
-
-  const handleSubmitUsername = (e: React.FormEvent) => {
-    e.preventDefault()
-    initAccountLinkingConsumer()
-  }
-
-  // Clean up consumer on unmount
-  useEffect(() => {
-    return () => {
-      if (accountLinkingConsumer) {
-        accountLinkingConsumer.destroy()
-      }
-    }
-  }, [accountLinkingConsumer])
-
-  return (
-    <div className="link-device-container">
-      {view === 'enter-username' && (
-        <>
-          <h2>Link a New Device</h2>
-          <form onSubmit={handleSubmitUsername}>
-            <div className="form-group">
-              <label htmlFor="username">Username</label>
-              <input
-                type="text"
-                id="username"
-                value={username}
-                onChange={(e) => setUsername(e.target.value)}
-                required
-              />
-            </div>
-            <button type="submit" disabled={!username}>Continue</button>
-          </form>
-        </>
-      )}
-
-      {view === 'show-pin' && (
-        <div className="pin-display">
-          <h2>Enter this PIN on your other device</h2>
-          <div className="pin-code">{displayPin}</div>
-        </div>
-      )}
-
-      {view === 'load-filesystem' && (
-        <div className="loading">
-          <h2>Loading your filesystem...</h2>
-          <p>Please wait while we connect to your account.</p>
-        </div>
-      )}
-    </div>
-  )
-}
-
-export default LinkDevice
 
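The deleted LinkDevice component above is driven by two consumer events: 'challenge' delivers the PIN digits to display, and 'link' reports whether the other device approved. Those view transitions can be sketched as a tiny state machine (the class and types here are illustrative stand-ins, not the real linking library):

```typescript
type LinkView = 'enter-username' | 'show-pin' | 'load-filesystem' | 'cancelled'

// Minimal model of the LinkDevice view state driven by the two consumer events.
class LinkFlow {
  view: LinkView = 'enter-username'
  displayPin = ''

  // 'challenge' event: show the PIN for the user to enter on the other device.
  onChallenge(pin: number[]): void {
    this.displayPin = pin.join('')
    this.view = 'show-pin'
  }

  // 'link' event: approval moves on to loading the filesystem; otherwise cancel.
  onLink(approved: boolean): void {
    this.view = approved ? 'load-filesystem' : 'cancelled'
  }
}
```

Modeling the flow this way keeps the component a thin renderer over the consumer's event stream.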
@@ -1,7 +1,7 @@
 import React, { useState } from 'react';
 import { useAuth } from '../../context/AuthContext';
 import { useNotifications } from '../../context/NotificationContext';
-import CryptoLogin from './CryptoLogin';
+import CryptID from './CryptID';
 
 interface LoginButtonProps {
   className?: string;

@@ -33,16 +33,20 @@ const LoginButton: React.FC<LoginButtonProps> = ({ className = '' }) => {
     <>
       <button
         onClick={handleLoginClick}
-        className={`login-button ${className}`}
+        className={`toolbar-btn login-button ${className}`}
         title="Sign in to save your work and access additional features"
       >
-        Sign In
+        <svg width="14" height="14" viewBox="0 0 16 16" fill="currentColor">
+          <path fillRule="evenodd" d="M6 3.5a.5.5 0 0 1 .5-.5h8a.5.5 0 0 1 .5.5v9a.5.5 0 0 1-.5.5h-8a.5.5 0 0 1-.5-.5v-2a.5.5 0 0 0-1 0v2A1.5 1.5 0 0 0 6.5 14h8a1.5 1.5 0 0 0 1.5-1.5v-9A1.5 1.5 0 0 0 14.5 2h-8A1.5 1.5 0 0 0 5 3.5v2a.5.5 0 0 0 1 0v-2z"/>
+          <path fillRule="evenodd" d="M11.854 8.354a.5.5 0 0 0 0-.708l-3-3a.5.5 0 1 0-.708.708L10.293 7.5H1.5a.5.5 0 0 0 0 1h8.793l-2.147 2.146a.5.5 0 0 0 .708.708l3-3z"/>
+        </svg>
+        <span>Sign In</span>
       </button>
 
       {showLogin && (
         <div className="login-overlay">
           <div className="login-modal">
-            <CryptoLogin
+            <CryptID
               onSuccess={handleLoginSuccess}
               onCancel={handleLoginCancel}
             />
 
@@ -63,7 +63,7 @@ export const Profile: React.FC<ProfileProps> = ({ onLogout, onOpenVaultBrowser }
   return (
     <div className="profile-container">
       <div className="profile-header">
-        <h3>Welcome, {session.username}!</h3>
+        <h3>CryptID: {session.username}</h3>
       </div>
 
       <div className="profile-settings">
 
@@ -1,64 +0,0 @@
-import React, { useState } from 'react'
-import { register } from '../../lib/auth/account'
-
-const Register: React.FC = () => {
-  const [username, setUsername] = useState('')
-  const [checkingUsername, setCheckingUsername] = useState(false)
-  const [initializingFilesystem, setInitializingFilesystem] = useState(false)
-  const [error, setError] = useState<string | null>(null)
-
-  const handleRegister = async (e: React.FormEvent) => {
-    e.preventDefault()
-
-    if (checkingUsername) {
-      return
-    }
-
-    setInitializingFilesystem(true)
-    setError(null)
-
-    try {
-      const success = await register(username)
-
-      if (!success) {
-        setError('Registration failed. Username may be taken.')
-        setInitializingFilesystem(false)
-      }
-    } catch (err) {
-      setError('An error occurred during registration')
-      setInitializingFilesystem(false)
-      console.error(err)
-    }
-  }
-
-  return (
-    <div className="register-container">
-      <h2>Create an Account</h2>
-
-      <form onSubmit={handleRegister}>
-        <div className="form-group">
-          <label htmlFor="username">Username</label>
-          <input
-            type="text"
-            id="username"
-            value={username}
-            onChange={(e) => setUsername(e.target.value)}
-            disabled={initializingFilesystem}
-            required
-          />
-        </div>
-
-        {error && <div className="error-message">{error}</div>}
-
-        <button
-          type="submit"
-          disabled={initializingFilesystem || !username}
-        >
-          {initializingFilesystem ? 'Creating Account...' : 'Create Account'}
-        </button>
-      </form>
-    </div>
-  )
-}
-
-export default Register
 
@@ -1,187 +0,0 @@
-"use client"
-
-import type React from "react"
-import { useState, useEffect } from "react"
-import { useAuth } from "@/context/AuthContext"
-import { LocationStorageService, type LocationData } from "@/lib/location/locationStorage"
-import type { GeolocationPosition } from "@/lib/location/types"
-
-interface LocationCaptureProps {
-  onLocationCaptured?: (location: LocationData) => void
-  onError?: (error: string) => void
-}
-
-export const LocationCapture: React.FC<LocationCaptureProps> = ({ onLocationCaptured, onError }) => {
-  const { session, fileSystem } = useAuth()
-  const [isCapturing, setIsCapturing] = useState(false)
-  const [permissionState, setPermissionState] = useState<"prompt" | "granted" | "denied">("prompt")
-  const [currentLocation, setCurrentLocation] = useState<GeolocationPosition | null>(null)
-  const [error, setError] = useState<string | null>(null)
-
-  // Show loading state while auth is initializing
-  if (session.loading) {
-    return (
-      <div className="location-capture-loading flex items-center justify-center min-h-[200px]">
-        <div className="text-center">
-          <div className="text-2xl mb-2 animate-spin">⏳</div>
-          <p className="text-sm text-muted-foreground">Loading authentication...</p>
-        </div>
-      </div>
-    )
-  }
-
-  // Check permission status on mount
-  useEffect(() => {
-    if ("permissions" in navigator) {
-      navigator.permissions.query({ name: "geolocation" }).then((result) => {
-        setPermissionState(result.state as "prompt" | "granted" | "denied")
-
-        result.addEventListener("change", () => {
-          setPermissionState(result.state as "prompt" | "granted" | "denied")
-        })
-      })
-    }
-  }, [])
-
-  const captureLocation = async () => {
-    // Don't proceed if still loading
-    if (session.loading) {
-      return
-    }
-
-    if (!session.authed) {
-      const errorMsg = "You must be logged in to share your location. Please log in and try again."
-      setError(errorMsg)
-      onError?.(errorMsg)
-      return
-    }
-
-    if (!fileSystem) {
-      const errorMsg = "File system not available. Please refresh the page and try again."
-      setError(errorMsg)
-      onError?.(errorMsg)
-      return
-    }
-
-    setIsCapturing(true)
-    setError(null)
-
-    try {
-      // Request geolocation
-      const position = await new Promise<GeolocationPosition>((resolve, reject) => {
-        navigator.geolocation.getCurrentPosition(
-          (pos) => resolve(pos as GeolocationPosition),
-          (err) => reject(err),
-          {
-            enableHighAccuracy: true,
-            timeout: 10000,
-            maximumAge: 0,
-          },
-        )
-      })
-
-      setCurrentLocation(position)
-
-      // Create location data
-      const locationData: LocationData = {
-        id: crypto.randomUUID(),
-        userId: session.username,
-        latitude: position.coords.latitude,
-        longitude: position.coords.longitude,
-        accuracy: position.coords.accuracy,
-        timestamp: position.timestamp,
-        expiresAt: null, // Will be set when creating a share
-        precision: "exact",
-      }
-
-      // Save to filesystem
-      const storageService = new LocationStorageService(fileSystem)
-      await storageService.initialize()
-      await storageService.saveLocation(locationData)
-
-      onLocationCaptured?.(locationData)
-    } catch (err: any) {
-      let errorMsg = "Failed to capture location"
-
-      if (err.code === 1) {
-        errorMsg = "Location permission denied. Please enable location access in your browser settings."
-        setPermissionState("denied")
-      } else if (err.code === 2) {
-        errorMsg = "Location unavailable. Please check your device settings."
-      } else if (err.code === 3) {
-        errorMsg = "Location request timed out. Please try again."
-      }
-
-      setError(errorMsg)
-      onError?.(errorMsg)
-    } finally {
-      setIsCapturing(false)
-    }
-  }
-
-  return (
-    <div className="location-capture">
-      <div className="capture-header">
-        <h2 className="text-2xl font-semibold text-balance">Share Your Location</h2>
-        <p className="text-sm text-muted-foreground mt-2">Securely share your current location with others</p>
-      </div>
-
-      {/* Permission status */}
-      {permissionState === "denied" && (
-        <div className="permission-denied bg-destructive/10 border border-destructive/20 rounded-lg p-4 mt-4">
-          <p className="text-sm text-destructive">
-            Location access is blocked. Please enable it in your browser settings to continue.
-          </p>
-        </div>
-      )}
-
-      {/* Current location display */}
-      {currentLocation && (
-        <div className="current-location bg-muted/50 rounded-lg p-4 mt-4">
-          <h3 className="text-sm font-medium mb-2">Current Location</h3>
-          <div className="location-details text-xs space-y-1">
-            <p>
-              <span className="text-muted-foreground">Latitude:</span> {currentLocation.coords.latitude.toFixed(6)}
-            </p>
-            <p>
-              <span className="text-muted-foreground">Longitude:</span> {currentLocation.coords.longitude.toFixed(6)}
-            </p>
-            <p>
-              <span className="text-muted-foreground">Accuracy:</span> ±{Math.round(currentLocation.coords.accuracy)}m
-            </p>
-            <p className="text-muted-foreground">Captured {new Date(currentLocation.timestamp).toLocaleString()}</p>
-          </div>
-        </div>
-      )}
-
-      {/* Error display */}
-      {error && (
-        <div className="error-message bg-destructive/10 border border-destructive/20 rounded-lg p-4 mt-4">
-          <p className="text-sm text-destructive">{error}</p>
-        </div>
-      )}
-
-      {/* Capture button */}
-      <button
-        onClick={captureLocation}
-        disabled={isCapturing || permissionState === "denied" || !session.authed}
-        className="capture-button w-full mt-6 bg-primary text-primary-foreground hover:bg-primary/90 disabled:opacity-50 disabled:cursor-not-allowed rounded-lg px-6 py-3 font-medium transition-colors"
-      >
-        {isCapturing ? (
-          <span className="flex items-center justify-center gap-2">
-            <span className="spinner" />
-            Capturing Location...
-          </span>
-        ) : (
-          "Capture My Location"
-        )}
-      </button>
-
-      {!session.authed && (
-        <p className="text-xs text-muted-foreground text-center mt-3">Please log in to share your location</p>
-      )}
-    </div>
-  )
-}
 
@@ -1,270 +0,0 @@
"use client"

import type React from "react"
import { useState, useEffect } from "react"
import { useAuth } from "@/context/AuthContext"
import { LocationStorageService, type LocationData, type LocationShare } from "@/lib/location/locationStorage"
import { LocationMap } from "./LocationMap"

interface ShareWithLocation {
  share: LocationShare
  location: LocationData
}

export const LocationDashboard: React.FC = () => {
  const { session, fileSystem } = useAuth()
  const [shares, setShares] = useState<ShareWithLocation[]>([])
  const [loading, setLoading] = useState(true)
  const [selectedShare, setSelectedShare] = useState<ShareWithLocation | null>(null)
  const [error, setError] = useState<string | null>(null)

  const loadShares = async () => {
    if (!fileSystem) {
      setError("File system not available")
      setLoading(false)
      return
    }

    try {
      const storageService = new LocationStorageService(fileSystem)
      await storageService.initialize()

      // Get all shares
      const allShares = await storageService.getAllShares()

      // Get locations for each share
      const sharesWithLocations: ShareWithLocation[] = []

      for (const share of allShares) {
        const location = await storageService.getLocation(share.locationId)
        if (location) {
          sharesWithLocations.push({ share, location })
        }
      }

      // Sort by creation date (newest first)
      sharesWithLocations.sort((a, b) => b.share.createdAt - a.share.createdAt)

      setShares(sharesWithLocations)
      setLoading(false)
    } catch (err) {
      console.error("Error loading shares:", err)
      setError("Failed to load location shares")
      setLoading(false)
    }
  }

  useEffect(() => {
    if (session.authed && fileSystem) {
      loadShares()
    }
  }, [session.authed, fileSystem])

  const handleCopyLink = async (shareToken: string) => {
    const baseUrl = window.location.origin
    const link = `${baseUrl}/location/${shareToken}`

    try {
      await navigator.clipboard.writeText(link)
      alert("Link copied to clipboard!")
    } catch (err) {
      console.error("Failed to copy link:", err)
      alert("Failed to copy link")
    }
  }

  const isExpired = (share: LocationShare): boolean => {
    return share.expiresAt ? share.expiresAt < Date.now() : false
  }

  const isMaxViewsReached = (share: LocationShare): boolean => {
    return share.maxViews ? share.viewCount >= share.maxViews : false
  }

  const getShareStatus = (share: LocationShare): { label: string; color: string } => {
    if (isExpired(share)) {
      return { label: "Expired", color: "text-destructive" }
    }
    if (isMaxViewsReached(share)) {
      return { label: "Max Views Reached", color: "text-destructive" }
    }
    return { label: "Active", color: "text-green-600" }
  }

  if (!session.authed) {
    return (
      <div className="location-dashboard-auth flex items-center justify-center min-h-[400px]">
        <div className="text-center max-w-md">
          <div className="text-4xl mb-4">🔒</div>
          <h2 className="text-xl font-semibold mb-2">Authentication Required</h2>
          <p className="text-sm text-muted-foreground">Please log in to view your location shares</p>
        </div>
      </div>
    )
  }

  if (loading) {
    return (
      <div className="location-dashboard flex items-center justify-center min-h-[400px]">
        <div className="flex flex-col items-center gap-3">
          <div className="spinner" />
          <p className="text-sm text-muted-foreground">Loading your shares...</p>
        </div>
      </div>
    )
  }

  if (error) {
    return (
      <div className="location-dashboard flex items-center justify-center min-h-[400px]">
        <div className="text-center max-w-md">
          <div className="text-4xl mb-4">⚠️</div>
          <h2 className="text-xl font-semibold mb-2">Error Loading Dashboard</h2>
          <p className="text-sm text-muted-foreground">{error}</p>
          <button
            onClick={loadShares}
            className="mt-4 px-6 py-2 rounded-lg bg-primary text-primary-foreground hover:bg-primary/90 transition-colors"
          >
            Retry
          </button>
        </div>
      </div>
    )
  }

  return (
    <div className="location-dashboard max-w-6xl mx-auto p-6">
      <div className="dashboard-header mb-8">
        <h1 className="text-3xl font-bold text-balance">Location Shares</h1>
        <p className="text-sm text-muted-foreground mt-2">Manage your shared locations and privacy settings</p>
      </div>

      {shares.length === 0 ? (
        <div className="empty-state flex flex-col items-center justify-center min-h-[400px] text-center">
          <div className="text-6xl mb-4">📍</div>
          <h2 className="text-xl font-semibold mb-2">No Location Shares Yet</h2>
          <p className="text-sm text-muted-foreground mb-6 max-w-md">
            You haven't shared any locations yet. Create your first share to get started.
          </p>
          <a
            href="/share-location"
            className="px-6 py-3 rounded-lg bg-primary text-primary-foreground hover:bg-primary/90 transition-colors font-medium"
          >
            Share Your Location
          </a>
        </div>
      ) : (
        <div className="dashboard-content">
          {/* Stats Overview */}
          <div className="stats-grid grid grid-cols-1 md:grid-cols-3 gap-4 mb-8">
            <div className="stat-card bg-muted/50 rounded-lg p-4 border border-border">
              <div className="stat-label text-sm text-muted-foreground mb-1">Total Shares</div>
              <div className="stat-value text-3xl font-bold">{shares.length}</div>
            </div>
            <div className="stat-card bg-muted/50 rounded-lg p-4 border border-border">
              <div className="stat-label text-sm text-muted-foreground mb-1">Active Shares</div>
              <div className="stat-value text-3xl font-bold text-green-600">
                {shares.filter((s) => !isExpired(s.share) && !isMaxViewsReached(s.share)).length}
              </div>
            </div>
            <div className="stat-card bg-muted/50 rounded-lg p-4 border border-border">
              <div className="stat-label text-sm text-muted-foreground mb-1">Total Views</div>
              <div className="stat-value text-3xl font-bold">
                {shares.reduce((sum, s) => sum + s.share.viewCount, 0)}
              </div>
            </div>
          </div>

          {/* Shares List */}
          <div className="shares-list space-y-4">
            {shares.map(({ share, location }) => {
              const status = getShareStatus(share)
              const isSelected = selectedShare?.share.id === share.id

              return (
                <div
                  key={share.id}
                  className={`share-card bg-background rounded-lg border-2 transition-colors ${
                    isSelected ? "border-primary" : "border-border hover:border-primary/50"
                  }`}
                >
                  <div className="share-card-header p-4 flex items-start justify-between gap-4">
                    <div className="share-info flex-1">
                      <div className="flex items-center gap-3 mb-2">
                        <h3 className="font-semibold">Location Share</h3>
                        <span className={`text-xs font-medium ${status.color}`}>{status.label}</span>
                      </div>
                      <div className="share-meta text-xs text-muted-foreground space-y-1">
                        <p>Created: {new Date(share.createdAt).toLocaleString()}</p>
                        {share.expiresAt && <p>Expires: {new Date(share.expiresAt).toLocaleString()}</p>}
                        <p>
                          Views: {share.viewCount}
                          {share.maxViews && ` / ${share.maxViews}`}
                        </p>
                        <p>
                          Precision: <span className="capitalize">{share.precision}</span>
                        </p>
                      </div>
                    </div>
                    <div className="share-actions flex gap-2">
                      <button
                        onClick={() => handleCopyLink(share.shareToken)}
                        disabled={isExpired(share) || isMaxViewsReached(share)}
                        className="px-4 py-2 rounded-lg border border-border hover:bg-muted disabled:opacity-50 disabled:cursor-not-allowed transition-colors text-sm"
                      >
                        Copy Link
                      </button>
                      <button
                        onClick={() => setSelectedShare(isSelected ? null : { share, location })}
                        className="px-4 py-2 rounded-lg bg-primary text-primary-foreground hover:bg-primary/90 transition-colors text-sm"
                      >
                        {isSelected ? "Hide" : "View"} Map
                      </button>
                    </div>
                  </div>

                  {isSelected && (
                    <div className="share-card-body p-4 pt-0 border-t border-border mt-4">
                      <LocationMap location={location} precision={share.precision} showAccuracy={true} height="300px" />
                    </div>
                  )}
                </div>
              )
            })}
          </div>
        </div>
      )}
    </div>
  )
}
@@ -1,241 +0,0 @@
"use client"

import type React from "react"
import { useEffect, useRef, useState } from "react"
import type { LocationData } from "@/lib/location/locationStorage"
import { obfuscateLocation } from "@/lib/location/locationStorage"
import type { PrecisionLevel } from "@/lib/location/types"

// Leaflet types
interface LeafletMap {
  setView: (coords: [number, number], zoom: number) => void
  remove: () => void
}

interface LeafletMarker {
  addTo: (map: LeafletMap) => LeafletMarker
  bindPopup: (content: string) => LeafletMarker
}

interface LeafletCircle {
  addTo: (map: LeafletMap) => LeafletCircle
}

interface LeafletTileLayer {
  addTo: (map: LeafletMap) => LeafletTileLayer
}

interface Leaflet {
  map: (element: HTMLElement, options?: any) => LeafletMap
  marker: (coords: [number, number], options?: any) => LeafletMarker
  circle: (coords: [number, number], options?: any) => LeafletCircle
  tileLayer: (url: string, options?: any) => LeafletTileLayer
  icon: (options: any) => any
}

declare global {
  interface Window {
    L?: Leaflet
  }
}

interface LocationMapProps {
  location: LocationData
  precision?: PrecisionLevel
  showAccuracy?: boolean
  height?: string
}

export const LocationMap: React.FC<LocationMapProps> = ({
  location,
  precision = "exact",
  showAccuracy = true,
  height = "400px",
}) => {
  const mapContainer = useRef<HTMLDivElement>(null)
  const mapInstance = useRef<LeafletMap | null>(null)
  const [isLoading, setIsLoading] = useState(true)
  const [error, setError] = useState<string | null>(null)

  useEffect(() => {
    // Load Leaflet CSS and JS
    const loadLeaflet = async () => {
      try {
        // Load CSS
        if (!document.querySelector('link[href*="leaflet.css"]')) {
          const link = document.createElement("link")
          link.rel = "stylesheet"
          link.href = "https://unpkg.com/leaflet@1.9.4/dist/leaflet.css"
          link.integrity = "sha256-p4NxAoJBhIIN+hmNHrzRCf9tD/miZyoHS5obTRR9BMY="
          link.crossOrigin = ""
          document.head.appendChild(link)
        }

        // Load JS
        if (!window.L) {
          await new Promise<void>((resolve, reject) => {
            const script = document.createElement("script")
            script.src = "https://unpkg.com/leaflet@1.9.4/dist/leaflet.js"
            script.integrity = "sha256-20nQCchB9co0qIjJZRGuk2/Z9VM+kNiyxNV1lvTlZBo="
            script.crossOrigin = ""
            script.onload = () => resolve()
            script.onerror = () => reject(new Error("Failed to load Leaflet"))
            document.head.appendChild(script)
          })
        }

        setIsLoading(false)
      } catch (err) {
        setError("Failed to load map library")
        setIsLoading(false)
      }
    }

    loadLeaflet()
  }, [])

  useEffect(() => {
    if (!mapContainer.current || !window.L || isLoading) return

    // Clean up existing map
    if (mapInstance.current) {
      mapInstance.current.remove()
    }

    const L = window.L!

    // Get obfuscated location based on precision
    const { lat, lng, radius } = obfuscateLocation(location.latitude, location.longitude, precision)

    // Create map
    const map = L.map(mapContainer.current, {
      center: [lat, lng],
      zoom: precision === "exact" ? 15 : precision === "street" ? 14 : precision === "neighborhood" ? 12 : 10,
      zoomControl: true,
      attributionControl: true,
    })

    // Add OpenStreetMap tiles
    L.tileLayer("https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png", {
      attribution: '© <a href="https://www.openstreetmap.org/copyright">OpenStreetMap</a> contributors',
      maxZoom: 19,
    }).addTo(map)

    // Add marker
    const marker = L.marker([lat, lng], {
      icon: L.icon({
        iconUrl: "https://unpkg.com/leaflet@1.9.4/dist/images/marker-icon.png",
        iconRetinaUrl: "https://unpkg.com/leaflet@1.9.4/dist/images/marker-icon-2x.png",
        shadowUrl: "https://unpkg.com/leaflet@1.9.4/dist/images/marker-shadow.png",
        iconSize: [25, 41],
        iconAnchor: [12, 41],
        popupAnchor: [1, -34],
        shadowSize: [41, 41],
      }),
    }).addTo(map)

    // Add popup with location info
    const popupContent = `
      <div style="font-family: system-ui, sans-serif;">
        <strong>Shared Location</strong><br/>
        <small style="color: #666;">
          Precision: ${precision}<br/>
          ${new Date(location.timestamp).toLocaleString()}
        </small>
      </div>
    `
    marker.bindPopup(popupContent)

    // Add accuracy circle if showing accuracy
    if (showAccuracy && radius > 0) {
      L.circle([lat, lng], {
        radius: radius,
        color: "#3b82f6",
        fillColor: "#3b82f6",
        fillOpacity: 0.1,
        weight: 2,
      }).addTo(map)
    }

    mapInstance.current = map

    // Cleanup
    return () => {
      if (mapInstance.current) {
        mapInstance.current.remove()
        mapInstance.current = null
      }
    }
  }, [location, precision, showAccuracy, isLoading])

  if (error) {
    return (
      <div
        className="map-error flex items-center justify-center bg-muted/50 rounded-lg border border-border"
        style={{ height }}
      >
        <p className="text-sm text-destructive">{error}</p>
      </div>
    )
  }

  if (isLoading) {
    return (
      <div
        className="map-loading flex items-center justify-center bg-muted/50 rounded-lg border border-border"
        style={{ height }}
      >
        <div className="flex flex-col items-center gap-3">
          <div className="spinner" />
          <p className="text-sm text-muted-foreground">Loading map...</p>
        </div>
      </div>
    )
  }

  return (
    <div className="location-map-wrapper">
      <div
        ref={mapContainer}
        className="location-map rounded-lg border border-border overflow-hidden"
        style={{ height, width: "100%" }}
      />
      <div className="map-info mt-3 text-xs text-muted-foreground">
        <p>
          Showing {precision} location • Last updated {new Date(location.timestamp).toLocaleTimeString()}
        </p>
      </div>
    </div>
  )
}
@@ -1,45 +0,0 @@
import {
  TLUiDialogProps,
  TldrawUiDialogBody,
  TldrawUiDialogCloseButton,
  TldrawUiDialogHeader,
  TldrawUiDialogTitle,
} from "tldraw"
import React from "react"
import { ShareLocation } from "./ShareLocation"

export function LocationShareDialog({ onClose: _onClose }: TLUiDialogProps) {
  return (
    <>
      <TldrawUiDialogHeader>
        <TldrawUiDialogTitle>Share Location</TldrawUiDialogTitle>
        <TldrawUiDialogCloseButton />
      </TldrawUiDialogHeader>
      <TldrawUiDialogBody style={{ maxWidth: 800, maxHeight: "90vh", overflow: "auto" }}>
        <ShareLocation />
      </TldrawUiDialogBody>
    </>
  )
}
@@ -1,183 +0,0 @@
"use client"

import type React from "react"
import { useState, useEffect } from "react"
import { LocationMap } from "./LocationMap"
import type { LocationData, LocationShare } from "@/lib/location/locationStorage"
import { LocationStorageService } from "@/lib/location/locationStorage"
import { useAuth } from "@/context/AuthContext"

interface LocationViewerProps {
  shareToken: string
}

export const LocationViewer: React.FC<LocationViewerProps> = ({ shareToken }) => {
  const { fileSystem } = useAuth()
  const [location, setLocation] = useState<LocationData | null>(null)
  const [share, setShare] = useState<LocationShare | null>(null)
  const [loading, setLoading] = useState(true)
  const [error, setError] = useState<string | null>(null)

  useEffect(() => {
    const loadSharedLocation = async () => {
      if (!fileSystem) {
        setError("File system not available")
        setLoading(false)
        return
      }

      try {
        const storageService = new LocationStorageService(fileSystem)
        await storageService.initialize()

        // Get share by token
        const shareData = await storageService.getShareByToken(shareToken)
        if (!shareData) {
          setError("Share not found or expired")
          setLoading(false)
          return
        }

        // Check if share is expired
        if (shareData.expiresAt && shareData.expiresAt < Date.now()) {
          setError("This share has expired")
          setLoading(false)
          return
        }

        // Check if max views reached
        if (shareData.maxViews && shareData.viewCount >= shareData.maxViews) {
          setError("This share has reached its maximum view limit")
          setLoading(false)
          return
        }

        // Get location data
        const locationData = await storageService.getLocation(shareData.locationId)
        if (!locationData) {
          setError("Location data not found")
          setLoading(false)
          return
        }

        setShare(shareData)
        setLocation(locationData)

        // Increment view count
        await storageService.incrementShareViews(shareData.id)

        setLoading(false)
      } catch (err) {
        console.error("Error loading shared location:", err)
        setError("Failed to load shared location")
        setLoading(false)
      }
    }

    loadSharedLocation()
  }, [shareToken, fileSystem])

  if (loading) {
    return (
      <div className="location-viewer flex items-center justify-center min-h-[400px]">
        <div className="flex flex-col items-center gap-3">
          <div className="spinner" />
          <p className="text-sm text-muted-foreground">Loading shared location...</p>
        </div>
      </div>
    )
  }

  if (error) {
    return (
      <div className="location-viewer flex items-center justify-center min-h-[400px]">
        <div className="text-center max-w-md">
          <div className="text-4xl mb-4">📍</div>
          <h2 className="text-xl font-semibold mb-2">Unable to Load Location</h2>
          <p className="text-sm text-muted-foreground">{error}</p>
        </div>
      </div>
    )
  }

  if (!location || !share) {
    return null
  }

  return (
    <div className="location-viewer max-w-4xl mx-auto p-6">
      <div className="viewer-header mb-6">
        <h1 className="text-3xl font-bold text-balance">Shared Location</h1>
        <p className="text-sm text-muted-foreground mt-2">Someone has shared their location with you</p>
      </div>

      <div className="viewer-content space-y-6">
        {/* Map Display */}
        <LocationMap location={location} precision={share.precision} showAccuracy={true} height="500px" />

        {/* Share Info */}
        <div className="share-info bg-muted/50 rounded-lg p-4 space-y-2">
          <div className="info-row flex justify-between text-sm">
            <span className="text-muted-foreground">Precision Level:</span>
            <span className="font-medium capitalize">{share.precision}</span>
          </div>
          <div className="info-row flex justify-between text-sm">
            <span className="text-muted-foreground">Views:</span>
            <span className="font-medium">
              {share.viewCount} {share.maxViews ? `/ ${share.maxViews}` : ""}
            </span>
          </div>
          {share.expiresAt && (
            <div className="info-row flex justify-between text-sm">
              <span className="text-muted-foreground">Expires:</span>
              <span className="font-medium">{new Date(share.expiresAt).toLocaleString()}</span>
            </div>
          )}
          <div className="info-row flex justify-between text-sm">
            <span className="text-muted-foreground">Shared:</span>
            <span className="font-medium">{new Date(share.createdAt).toLocaleString()}</span>
          </div>
        </div>

        {/* Privacy Notice */}
        <div className="privacy-notice bg-primary/5 border border-primary/20 rounded-lg p-4">
          <p className="text-xs text-muted-foreground">
            This location is shared securely and will expire based on the sender's privacy settings. The location data
            is stored in a decentralized filesystem and is only accessible via this unique link.
          </p>
        </div>
      </div>
    </div>
  )
}
@@ -1,279 +0,0 @@
"use client"

import React, { useState } from "react"
import { LocationCapture } from "./LocationCapture"
import { ShareSettingsComponent } from "./ShareSettings"
import { LocationMap } from "./LocationMap"
import type { LocationData, LocationShare } from "@/lib/location/locationStorage"
import { LocationStorageService, generateShareToken } from "@/lib/location/locationStorage"
import type { ShareSettings } from "@/lib/location/types"
import { useAuth } from "@/context/AuthContext"

export const ShareLocation: React.FC = () => {
  const { session, fileSystem } = useAuth()
  const [step, setStep] = useState<"capture" | "settings" | "share">("capture")
  const [capturedLocation, setCapturedLocation] = useState<LocationData | null>(null)
  const [shareSettings, setShareSettings] = useState<ShareSettings>({
    duration: 24 * 3600000, // 24 hours
    maxViews: null,
    precision: "street",
  })
  const [shareLink, setShareLink] = useState<string | null>(null)
  const [isCreatingShare, setIsCreatingShare] = useState(false)
  const [error, setError] = useState<string | null>(null)

  // Show loading state while auth is initializing
  if (session.loading) {
    return (
      <div className="share-location-loading flex items-center justify-center min-h-[400px]">
        <div className="text-center max-w-md">
          <div className="text-4xl mb-4 animate-spin">⏳</div>
          <h2 className="text-xl font-semibold mb-2">Loading...</h2>
          <p className="text-sm text-muted-foreground">Initializing authentication</p>
        </div>
      </div>
    )
  }

  const handleLocationCaptured = (location: LocationData) => {
    setCapturedLocation(location)
    setStep("settings")
  }

  const handleCreateShare = async () => {
    if (!capturedLocation || !fileSystem) {
      setError("Location or filesystem not available")
      return
    }

    setIsCreatingShare(true)
    setError(null)

    try {
      const storageService = new LocationStorageService(fileSystem)
      await storageService.initialize()

      // Generate share token
      const shareToken = generateShareToken()

      // Calculate expiration
      const expiresAt = shareSettings.duration ? Date.now() + shareSettings.duration : null

      // Update location with expiration
      const updatedLocation: LocationData = {
        ...capturedLocation,
        expiresAt,
        precision: shareSettings.precision,
      }

      await storageService.saveLocation(updatedLocation)

      // Create share
      const share: LocationShare = {
        id: crypto.randomUUID(),
        locationId: capturedLocation.id,
        shareToken,
        createdAt: Date.now(),
        expiresAt,
        maxViews: shareSettings.maxViews,
        viewCount: 0,
        precision: shareSettings.precision,
      }

      await storageService.createShare(share)

      // Generate share link
      const baseUrl = window.location.origin
      const link = `${baseUrl}/location/${shareToken}`

      setShareLink(link)
      setStep("share")
    } catch (err) {
      console.error("Error creating share:", err)
      setError("Failed to create share link")
    } finally {
      setIsCreatingShare(false)
    }
  }

  const handleCopyLink = async () => {
    if (!shareLink) return

    try {
      await navigator.clipboard.writeText(shareLink)
      // Could add a toast notification here
      alert("Link copied to clipboard!")
    } catch (err) {
      console.error("Failed to copy link:", err)
      alert("Failed to copy link. Please copy manually.")
    }
  }

  const handleReset = () => {
    setStep("capture")
    setCapturedLocation(null)
    setShareLink(null)
    setError(null)
  }

  if (!session.authed) {
    return (
      <div className="share-location-auth flex items-center justify-center min-h-[400px]">
        <div className="text-center max-w-md">
          <div className="text-4xl mb-4">🔒</div>
          <h2 className="text-xl font-semibold mb-2">Authentication Required</h2>
          <p className="text-sm text-muted-foreground">Please log in to share your location securely</p>
        </div>
      </div>
    )
  }

  return (
    <div className="share-location max-w-4xl mx-auto p-6">
      {/* Progress Steps */}
      <div className="progress-steps flex items-center justify-center gap-4 mb-8">
        {["capture", "settings", "share"].map((s, index) => (
          <React.Fragment key={s}>
            <div className="step-item flex items-center gap-2">
              <div
                className={`step-number w-8 h-8 rounded-full flex items-center justify-center text-sm font-medium transition-colors ${
                  step === s
                    ? "bg-primary text-primary-foreground"
                    : index < ["capture", "settings", "share"].indexOf(step)
                      ? "bg-primary/20 text-primary"
                      : "bg-muted text-muted-foreground"
                }`}
              >
                {index + 1}
              </div>
              <span
                className={`step-label text-sm font-medium capitalize ${
                  step === s ? "text-foreground" : "text-muted-foreground"
                }`}
              >
                {s}
              </span>
            </div>
            {index < 2 && (
              <div
                className={`step-connector h-0.5 w-12 ${
                  index < ["capture", "settings", "share"].indexOf(step) ? "bg-primary" : "bg-muted"
                }`}
              />
            )}
          </React.Fragment>
        ))}
      </div>

      {/* Error Display */}
      {error && (
<div className="error-message bg-destructive/10 border border-destructive/20 rounded-lg p-4 mb-6">
|
|
||||||
<p className="text-sm text-destructive">{error}</p>
|
|
||||||
</div>
|
|
||||||
)}
|
|
||||||
|
|
||||||
{/* Step Content */}
|
|
||||||
<div className="step-content">
|
|
||||||
{step === "capture" && <LocationCapture onLocationCaptured={handleLocationCaptured} onError={setError} />}
|
|
||||||
|
|
||||||
{step === "settings" && capturedLocation && (
|
|
||||||
<div className="settings-step space-y-6">
|
|
||||||
<div className="location-preview">
|
|
||||||
<h3 className="text-lg font-semibold mb-4">Preview Your Location</h3>
|
|
||||||
<LocationMap
|
|
||||||
location={capturedLocation}
|
|
||||||
precision={shareSettings.precision}
|
|
||||||
showAccuracy={true}
|
|
||||||
height="300px"
|
|
||||||
/>
|
|
||||||
</div>
|
|
||||||
|
|
||||||
<ShareSettingsComponent onSettingsChange={setShareSettings} initialSettings={shareSettings} />
|
|
||||||
|
|
||||||
<div className="settings-actions flex gap-3">
|
|
||||||
<button
|
|
||||||
onClick={() => setStep("capture")}
|
|
||||||
className="flex-1 px-6 py-3 rounded-lg border border-border hover:bg-muted transition-colors"
|
|
||||||
>
|
|
||||||
Back
|
|
||||||
</button>
|
|
||||||
<button
|
|
||||||
onClick={handleCreateShare}
|
|
||||||
disabled={isCreatingShare}
|
|
||||||
className="flex-1 px-6 py-3 rounded-lg bg-primary text-primary-foreground hover:bg-primary/90 disabled:opacity-50 disabled:cursor-not-allowed transition-colors font-medium"
|
|
||||||
>
|
|
||||||
{isCreatingShare ? "Creating Share..." : "Create Share Link"}
|
|
||||||
</button>
|
|
||||||
</div>
|
|
||||||
</div>
|
|
||||||
)}
|
|
||||||
|
|
||||||
{step === "share" && shareLink && capturedLocation && (
|
|
||||||
<div className="share-step space-y-6">
|
|
||||||
<div className="share-success text-center mb-6">
|
|
||||||
<div className="text-5xl mb-4">✓</div>
|
|
||||||
<h2 className="text-2xl font-bold mb-2">Share Link Created!</h2>
|
|
||||||
<p className="text-sm text-muted-foreground">Your location is ready to share securely</p>
|
|
||||||
</div>
|
|
||||||
|
|
||||||
<div className="share-link-box bg-muted/50 rounded-lg p-4 border border-border">
|
|
||||||
<label className="block text-sm font-medium mb-2">Share Link</label>
|
|
||||||
<div className="flex gap-2">
|
|
||||||
<input
|
|
||||||
type="text"
|
|
||||||
value={shareLink}
|
|
||||||
readOnly
|
|
||||||
className="flex-1 px-3 py-2 rounded-lg border border-border bg-background text-sm"
|
|
||||||
onClick={(e) => e.currentTarget.select()}
|
|
||||||
/>
|
|
||||||
<button
|
|
||||||
onClick={handleCopyLink}
|
|
||||||
className="px-4 py-2 rounded-lg bg-primary text-primary-foreground hover:bg-primary/90 transition-colors text-sm font-medium whitespace-nowrap"
|
|
||||||
>
|
|
||||||
Copy Link
|
|
||||||
</button>
|
|
||||||
</div>
|
|
||||||
</div>
|
|
||||||
|
|
||||||
<div className="share-preview">
|
|
||||||
<h3 className="text-lg font-semibold mb-4">Location Preview</h3>
|
|
||||||
<LocationMap
|
|
||||||
location={capturedLocation}
|
|
||||||
precision={shareSettings.precision}
|
|
||||||
showAccuracy={true}
|
|
||||||
height="300px"
|
|
||||||
/>
|
|
||||||
</div>
|
|
||||||
|
|
||||||
<div className="share-details bg-muted/50 rounded-lg p-4 space-y-2">
|
|
||||||
<h4 className="font-medium mb-3">Share Settings</h4>
|
|
||||||
<div className="detail-row flex justify-between text-sm">
|
|
||||||
<span className="text-muted-foreground">Precision:</span>
|
|
||||||
<span className="font-medium capitalize">{shareSettings.precision}</span>
|
|
||||||
</div>
|
|
||||||
<div className="detail-row flex justify-between text-sm">
|
|
||||||
<span className="text-muted-foreground">Duration:</span>
|
|
||||||
<span className="font-medium">
|
|
||||||
{shareSettings.duration ? `${shareSettings.duration / 3600000} hours` : "No expiration"}
|
|
||||||
</span>
|
|
||||||
</div>
|
|
||||||
<div className="detail-row flex justify-between text-sm">
|
|
||||||
<span className="text-muted-foreground">Max Views:</span>
|
|
||||||
<span className="font-medium">{shareSettings.maxViews || "Unlimited"}</span>
|
|
||||||
</div>
|
|
||||||
</div>
|
|
||||||
|
|
||||||
<button
|
|
||||||
onClick={handleReset}
|
|
||||||
className="w-full px-6 py-3 rounded-lg border border-border hover:bg-muted transition-colors"
|
|
||||||
>
|
|
||||||
Share Another Location
|
|
||||||
</button>
|
|
||||||
</div>
|
|
||||||
)}
|
|
||||||
</div>
|
|
||||||
</div>
|
|
||||||
)
|
|
||||||
}