Merge branch 'dev'
Commit: 6910499969

`.dockerignore` (new file, 21 lines):

    node_modules
    .git
    .gitignore
    *.md
    .env*
    Dockerfile
    docker-compose*.yml
    .dockerignore
    backlog
    .next
    out
    .cache
    dist
    build
    coverage
    .github
    .vscode
    .idea
    __pycache__
    *.pyc
    .pytest_cache

`CLAUDE.md` (988 lines, deleted):

## 🔧 AUTO-APPROVED OPERATIONS

The following operations are auto-approved and do not require user confirmation:

- **Read**: All file read operations (`Read(*)`)
- **Glob**: All file pattern matching (`Glob(*)`)
- **Grep**: All content searching (`Grep(*)`)

These permissions are configured in `~/.claude/settings.json`.

---

## ⚠️ SAFETY GUIDELINES

**ALWAYS WARN THE USER before performing any action that could:**
- Overwrite existing files (use `ls` or `cat` to check first)
- Overwrite credentials, API keys, or secrets
- Delete data or files
- Modify production configurations
- Run destructive git commands (force push, hard reset, etc.)
- Drop databases or truncate tables

**Best practices:**
- Before writing to a file, check whether it exists and show its contents
- Use `>>` (append) instead of `>` (overwrite) for credential files
- Create backups before modifying critical configs (e.g., `cp file file.backup`)
- Ask for confirmation before irreversible actions

**Sudo commands:**
- **NEVER run sudo commands directly** - the Bash tool doesn't support interactive input
- Instead, **provide the user with the exact sudo command** to run in their terminal
- Format the command clearly in a code block for easy copy-paste
- After the user runs the sudo command, continue with the workflow
- Alternative: if the user has run sudo recently (within ~15 min), subsequent sudo commands may not require a password

---

## 🔑 ACCESS & CREDENTIALS

### Version Control & Code Hosting
- **Gitea**: Self-hosted at `gitea.jeffemmett.com` - PRIMARY repository
  - Push here FIRST, then mirror to GitHub
  - Private repos and source of truth
  - SSH Key: `~/.ssh/gitea_ed25519` (private), `~/.ssh/gitea_ed25519.pub` (public)
  - Public Key: `ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIE2+2UZElEYptgZ9GFs2CXW0PIA57BfQcU9vlyV6fz4 gitea@jeffemmett.com`
  - **Gitea CLI (tea)**: ✅ Installed at `~/bin/tea` (added to PATH)

- **GitHub**: Public mirror and collaboration
  - Receives pushes from Gitea via mirror sync
  - Token: `(REDACTED-GITHUB-TOKEN)`
  - SSH Key: `~/.ssh/github_deploy_key` (private), `~/.ssh/github_deploy_key.pub` (public)
  - **GitHub CLI (gh)**: ✅ Installed and available for PR/issue management

### Git Workflow
**Two-way sync between Gitea and GitHub:**

**Gitea-Primary Repos (Default):**
1. Develop locally in `/home/jeffe/Github/`
2. Commit and push to Gitea first
3. Gitea automatically mirrors TO GitHub (built-in push mirror)
4. GitHub is used for public collaboration and visibility

**GitHub-Primary Repos (Mirror Repos):**
For repos where GitHub is the source of truth (v0.dev exports, client collabs):
1. Push to GitHub
2. Deploy webhook pulls from GitHub and deploys
3. Webhook triggers Gitea to sync FROM GitHub

### 🔀 DEV BRANCH WORKFLOW (MANDATORY)

**CRITICAL: All development work on canvas-website (and other active projects) MUST use a dev branch.**

#### Branch Strategy
```
main (production)
  └── dev (integration/staging)
        └── feature/* (optional feature branches)
```

#### Development Rules

1. **ALWAYS work on the `dev` branch** for new features and changes:
   ```bash
   cd /home/jeffe/Github/canvas-website
   git checkout dev
   git pull origin dev
   ```

2. **After completing a feature**, push to dev:
   ```bash
   git add .
   git commit -m "feat: description of changes"
   git push origin dev
   ```

3. **Update backlog task** immediately after pushing:
   ```bash
   backlog task edit <task-id> --status "Done" --append-notes "Pushed to dev branch"
   ```

4. **NEVER push directly to main** - main is for tested, verified features only

5. **Merge dev → main manually** when features are verified working:
   ```bash
   git checkout main
   git pull origin main
   git merge dev
   git push origin main
   git checkout dev  # Return to dev for continued work
   ```

#### Complete Feature Deployment Checklist

- [ ] Work on `dev` branch (not main)
- [ ] Test locally before committing
- [ ] Commit with descriptive message
- [ ] Push to `dev` branch on Gitea
- [ ] Update backlog task status to "Done"
- [ ] Add notes to backlog task about what was implemented
- [ ] (Later) When verified working: merge dev → main manually

#### Why This Matters
- **Protects production**: main branch always has known-working code
- **Enables testing**: dev branch can be deployed to staging for verification
- **Clean history**: main only gets complete, tested features
- **Easy rollback**: if dev breaks, main is still stable

### Server Infrastructure
- **Netcup RS 8000 G12 Pro**: Primary application & AI server
  - IP: `159.195.32.209`
  - 20 cores, 64GB RAM, 3TB storage
  - Hosts local AI models (Ollama, Stable Diffusion)
  - All websites and apps deployed here in Docker containers
  - Location: Germany (low-latency EU)
  - SSH Key (local): `~/.ssh/netcup_ed25519` (private), `~/.ssh/netcup_ed25519.pub` (public)
  - Public Key: `ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKmp4A2klKv/YIB1C6JAsb2UzvlzzE+0EcJ0jtkyFuhO netcup-rs8000@jeffemmett.com`
  - SSH Access: `ssh netcup`
  - **SSH Keys ON the server** (for git operations):
    - Gitea: `~/.ssh/gitea_ed25519` → `ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIE2+2UZElEYptgZ9GFs2CXW0PIA57BfQcU9vlyV6fz4 gitea@jeffemmett.com`
    - GitHub: `~/.ssh/github_ed25519` → `ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIC6xXNICy0HXnqHO+U7+y7ui+pZBGe0bm0iRMS23pR1E github-deploy@netcup-rs8000`

- **RunPod**: GPU burst capacity for AI workloads
  - Host: `ssh.runpod.io`
  - Serverless GPU pods (pay-per-use)
  - Used for: SDXL/SD3, video generation, training
  - Smart routing from the RS 8000 orchestrator
  - SSH Key: `~/.ssh/runpod_ed25519` (private), `~/.ssh/runpod_ed25519.pub` (public)
  - Public Key: `ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAC7NYjI0U/2ChGaZBBWP7gKt/V12Ts6FgatinJOQ8JG runpod@jeffemmett.com`
  - SSH Access: `ssh runpod`
  - **API Key**: `(REDACTED-RUNPOD-KEY)`
  - **CLI Config**: `~/.runpod/config.toml`
  - **Serverless Endpoints**:
    - Image (SD): `tzf1j3sc3zufsy` (Automatic1111)
    - Video (Wan2.2): `4jql4l7l0yw0f3`
    - Text (vLLM): `03g5hz3hlo8gr2`
    - Whisper: `lrtisuv8ixbtub`
    - ComfyUI: `5zurj845tbf8he`

### API Keys & Services

**IMPORTANT**: All API keys and tokens are stored securely on the Netcup server. Never store credentials locally.
- Access credentials via: `ssh netcup "cat ~/.cloudflare-credentials.env"` or `ssh netcup "cat ~/.porkbun_credentials"`
- All API operations should be performed FROM the Netcup server, not locally

#### Credential Files on Netcup (`/root/`)
| File | Contents |
|------|----------|
| `~/.cloudflare-credentials.env` | Cloudflare API tokens, account ID, tunnel token |
| `~/.cloudflare_credentials` | Legacy/DNS token |
| `~/.porkbun_credentials` | Porkbun API key and secret |
| `~/.v0_credentials` | V0.dev API key |

#### Cloudflare
- **Account ID**: `0e7b3338d5278ed1b148e6456b940913`
- **Tokens stored on Netcup** - source `~/.cloudflare-credentials.env`:
  - `CLOUDFLARE_API_TOKEN` - Zone read, Worker:read/edit, R2:read/edit
  - `CLOUDFLARE_TUNNEL_TOKEN` - Tunnel management
  - `CLOUDFLARE_ZONE_TOKEN` - Zone:Edit, DNS:Edit (for adding domains)

#### Porkbun (Domain Registrar)
- **Credentials stored on Netcup** - source `~/.porkbun_credentials`:
  - `PORKBUN_API_KEY` and `PORKBUN_SECRET_KEY`
- **API Endpoint**: `https://api-ipv4.porkbun.com/api/json/v3/`
- **API Docs**: https://porkbun.com/api/json/v3/documentation
- **Important**: JSON must have `secretapikey` before `apikey` in requests
- **Capabilities**: Update nameservers, get auth codes for transfers, manage DNS
- **Note**: Each domain must have "API Access" enabled individually in the Porkbun dashboard

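The key-ordering requirement above can be satisfied by building the payload in insertion order (a sketch; the `ping` endpoint is from the public Porkbun v3 API, and the placeholder key values are hypothetical):

```python
import json

def porkbun_payload(secret_key: str, api_key: str, **extra) -> str:
    # Porkbun expects `secretapikey` before `apikey`; Python dicts preserve
    # insertion order, so json.dumps emits the keys in this order.
    payload = {"secretapikey": secret_key, "apikey": api_key, **extra}
    return json.dumps(payload)

# POST this body to e.g. https://api-ipv4.porkbun.com/api/json/v3/ping
body = porkbun_payload("pk1_example_secret", "pk1_example_key")
```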
#### Domain Onboarding Workflow (Porkbun → Cloudflare)
Run these commands FROM Netcup (`ssh netcup`):
1. Add domain to Cloudflare (creates zone, returns nameservers)
2. Update nameservers at Porkbun to point to Cloudflare
3. Add CNAME record pointing to Cloudflare tunnel
4. Add hostname to tunnel config and restart cloudflared
5. Domain is live through the tunnel!

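Step 1 above can be scripted against the Cloudflare v4 API. A minimal sketch that only builds the request (the zone-creation endpoint and payload shape follow Cloudflare's public API; verify that `CLOUDFLARE_ZONE_TOKEN` has the needed scopes before relying on this):

```python
import json
import urllib.request

CF_API = "https://api.cloudflare.com/client/v4"
ACCOUNT_ID = "0e7b3338d5278ed1b148e6456b940913"

def add_zone_request(domain: str, token: str) -> urllib.request.Request:
    """Build the POST that creates a Cloudflare zone for `domain`."""
    body = json.dumps({"name": domain, "account": {"id": ACCOUNT_ID}})
    return urllib.request.Request(
        f"{CF_API}/zones",
        data=body.encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# req = add_zone_request("mydomain.com", os.environ["CLOUDFLARE_ZONE_TOKEN"])
# resp = urllib.request.urlopen(req)  # response body includes the assigned nameservers
```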
#### V0.dev (AI UI Generation)
- **Credentials stored on Netcup** - source `~/.v0_credentials`:
  - `V0_API_KEY` - Platform API access
- **API Key**: `(REDACTED-V0-KEY)`
- **SDK**: `npm install v0-sdk` (use the `v0` CLI for adding components)
- **Docs**: https://v0.app/docs/v0-platform-api
- **Capabilities**:
  - List/create/update/delete projects
  - Manage chats and versions
  - Download generated code
  - Create deployments
  - Manage environment variables
- **Limitations**: GitHub-only for git integration (no Gitea/GitLab support)
- **Usage**:
  ```javascript
  const { v0 } = require('v0-sdk');
  // Uses V0_API_KEY env var automatically
  const projects = await v0.projects.find();
  const chats = await v0.chats.find();
  ```

#### Other Services
- **HuggingFace**: CLI access available for model downloads
- **RunPod**: API access for serverless GPU orchestration (see Server Infrastructure above)

### Dev Ops Stack & Principles
- **Platform**: Linux WSL2 (Ubuntu on Windows) for development
- **Working Directory**: `/home/jeffe/Github`
- **Container Strategy**:
  - ALL repos should be Dockerized
  - Optimized containers for production deployment
  - Docker Compose for multi-service orchestration
- **Process Management**: PM2 available for Node.js services
- **Version Control**: Git configured with GitHub + Gitea mirrors
- **Package Managers**: npm/pnpm/yarn available

### 🚀 Traefik Reverse Proxy (Central Routing)
All HTTP services on Netcup RS 8000 route through Traefik for automatic service discovery.

**Architecture:**
```
Internet → Cloudflare Tunnel → Traefik (:80/:443) → Docker Services
                                   │
                                   ├── gitea.jeffemmett.com → gitea:3000
                                   ├── mycofi.earth → mycofi:3000
                                   ├── games.jeffemmett.com → games:80
                                   └── [auto-discovered via Docker labels]
```

**Location:** `/root/traefik/` on Netcup RS 8000

**Adding a New Service:**
```yaml
# In your docker-compose.yml, add these labels:
services:
  myapp:
    image: myapp:latest
    labels:
      - "traefik.enable=true"
      - "traefik.http.routers.myapp.rule=Host(`myapp.jeffemmett.com`)"
      - "traefik.http.services.myapp.loadbalancer.server.port=3000"
    networks:
      - traefik-public

networks:
  traefik-public:
    external: true
```

**Traefik Dashboard:** `http://159.195.32.209:8888` (internal only)

**SSH Git Access:**
- SSH goes direct (not through Traefik): `git.jeffemmett.com:223` → `159.195.32.209:223`
- Web UI goes through Traefik: `gitea.jeffemmett.com` → Traefik → gitea:3000

### ☁️ Cloudflare Tunnel Configuration
**Location:** `/root/cloudflared/` on Netcup RS 8000

The tunnel uses a token-based configuration managed via the Cloudflare Zero Trust Dashboard.
All public hostnames should point to `http://localhost:80` (Traefik), which routes based on the Host header.

**Managed hostnames:**
- `gitea.jeffemmett.com` → Traefik → Gitea
- `photos.jeffemmett.com` → Traefik → Immich
- `movies.jeffemmett.com` → Traefik → Jellyfin
- `search.jeffemmett.com` → Traefik → Semantic Search
- `mycofi.earth` → Traefik → MycoFi
- `games.jeffemmett.com` → Traefik → Games Platform
- `decolonizeti.me` → Traefik → Decolonize Time

**Tunnel ID:** `a838e9dc-0af5-4212-8af2-6864eb15e1b5`
**Tunnel CNAME Target:** `a838e9dc-0af5-4212-8af2-6864eb15e1b5.cfargotunnel.com`

**To deploy a new website/service:**

1. **Dockerize the project** with Traefik labels in `docker-compose.yml`:
   ```yaml
   services:
     myapp:
       build: .
       labels:
         - "traefik.enable=true"
         - "traefik.http.routers.myapp.rule=Host(`mydomain.com`) || Host(`www.mydomain.com`)"
         - "traefik.http.services.myapp.loadbalancer.server.port=3000"
       networks:
         - traefik-public

   networks:
     traefik-public:
       external: true
   ```

2. **Deploy to Netcup:**
   ```bash
   ssh netcup "cd /opt/websites && git clone <repo-url>"
   ssh netcup "cd /opt/websites/<project> && docker compose up -d --build"
   ```

3. **Add hostname to tunnel config** (`/root/cloudflared/config.yml`):
   ```yaml
   - hostname: mydomain.com
     service: http://localhost:80
   - hostname: www.mydomain.com
     service: http://localhost:80
   ```
   Then restart: `ssh netcup "docker restart cloudflared"`

4. **Configure DNS in the Cloudflare dashboard** (CRITICAL - prevents 525 SSL errors):
   - Go to Cloudflare Dashboard → select domain → DNS → Records
   - Delete any existing A/AAAA records for `@` and `www`
   - Add CNAME records:

   | Type | Name | Target | Proxy |
   |------|------|--------|-------|
   | CNAME | `@` | `a838e9dc-0af5-4212-8af2-6864eb15e1b5.cfargotunnel.com` | Proxied ✓ |
   | CNAME | `www` | `a838e9dc-0af5-4212-8af2-6864eb15e1b5.cfargotunnel.com` | Proxied ✓ |

**API Credentials** (on Netcup at `~/.cloudflare*`):
- `CLOUDFLARE_API_TOKEN` - Zone read access only
- `CLOUDFLARE_TUNNEL_TOKEN` - Tunnel management only
- See the **API Keys & Services** section above for the Domain Management Token (required for DNS automation)

### 🔄 Auto-Deploy Webhook System
**Location:** `/opt/deploy-webhook/` on Netcup RS 8000
**Endpoint:** `https://deploy.jeffemmett.com/deploy/<repo-name>`
**Secret:** `gitea-deploy-secret-2025`

Pushes to Gitea automatically trigger rebuilds. The webhook receiver:
1. Validates the HMAC signature from Gitea
2. Runs `git pull && docker compose up -d --build`
3. Returns the build status

**Adding a new repo to auto-deploy:**
1. Add an entry to the REPOS dict in `/opt/deploy-webhook/webhook.py`
2. Restart: `ssh netcup "cd /opt/deploy-webhook && docker compose up -d --build"`
3. Add a Gitea webhook:
   ```bash
   curl -X POST "https://gitea.jeffemmett.com/api/v1/repos/jeffemmett/<repo>/hooks" \
     -H "Authorization: token <gitea-token>" \
     -H "Content-Type: application/json" \
     -d '{"type":"gitea","active":true,"events":["push"],"config":{"url":"https://deploy.jeffemmett.com/deploy/<repo>","content_type":"json","secret":"gitea-deploy-secret-2025"}}'
   ```

**Currently auto-deploying:**
- `decolonize-time-website` → /opt/websites/decolonize-time-website
- `mycofi-earth-website` → /opt/websites/mycofi-earth-website
- `games-platform` → /opt/apps/games-platform

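Step 1 of the receiver (HMAC validation) can be sketched as follows. This is an illustrative fragment, not the actual `webhook.py`; it assumes Gitea's documented behavior of signing the raw request body with HMAC-SHA256 of the shared secret and sending the hex digest in the `X-Gitea-Signature` header:

```python
import hashlib
import hmac

SECRET = b"gitea-deploy-secret-2025"

def signature_valid(body: bytes, signature_header: str) -> bool:
    """Check a Gitea webhook's X-Gitea-Signature against the shared secret."""
    expected = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
    # compare_digest avoids leaking timing information during comparison
    return hmac.compare_digest(expected, signature_header)
```

Only after this check passes should the receiver run the `git pull && docker compose up -d --build` step.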
### 🔐 SSH Keys Quick Reference

**Local keys** (in `~/.ssh/` on your laptop):

| Service | Private Key | Public Key | Purpose |
|---------|-------------|------------|---------|
| **Gitea** | `gitea_ed25519` | `gitea_ed25519.pub` | Primary git repository |
| **GitHub** | `github_deploy_key` | `github_deploy_key.pub` | Public mirror sync |
| **Netcup RS 8000** | `netcup_ed25519` | `netcup_ed25519.pub` | Primary server SSH |
| **RunPod** | `runpod_ed25519` | `runpod_ed25519.pub` | GPU pods SSH |
| **Default** | `id_ed25519` | `id_ed25519.pub` | General purpose/legacy |

**Server-side keys** (in `/root/.ssh/` on Netcup RS 8000):

| Service | Key File | Purpose |
|---------|----------|---------|
| **Gitea** | `gitea_ed25519` | Server pulls from Gitea repos |
| **GitHub** | `github_ed25519` | Server pulls from GitHub (mirror repos) |

**SSH Config**: `~/.ssh/config` contains all host configurations
**Quick Access**:
- `ssh netcup` - Connect to Netcup RS 8000
- `ssh runpod` - Connect to RunPod
- `ssh gitea.jeffemmett.com` - Git operations

---

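The `ssh netcup` alias above implies `~/.ssh/config` entries along these lines (a sketch; the `User` values and the Gitea host mapping are assumptions — the Gitea section elsewhere routes SSH via `git.jeffemmett.com:223`):

```
Host netcup
    HostName 159.195.32.209
    User root
    IdentityFile ~/.ssh/netcup_ed25519

Host gitea.jeffemmett.com
    HostName git.jeffemmett.com
    Port 223
    User git
    IdentityFile ~/.ssh/gitea_ed25519
```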
## 🤖 AI ORCHESTRATION ARCHITECTURE

### Smart Routing Strategy
All AI requests go through an intelligent orchestration layer on the RS 8000:

**Routing Logic:**
- **Text/Code (70-80% of workload)**: Always local RS 8000 CPU (Ollama) → FREE
- **Images - Low Priority**: RS 8000 CPU (SD 1.5/2.1) → FREE but slow (~60s)
- **Images - High Priority**: RunPod GPU (SDXL/SD3) → $0.02/image, fast
- **Video Generation**: Always RunPod GPU → $0.50/video (only option)
- **Training/Fine-tuning**: RunPod GPU on-demand

**Queue System:**
- Redis-based queues: text, image, code, video
- Priority-based routing (low/normal/high)
- Worker pools scale based on load
- Cost tracking per job, per user

**Cost Optimization:**
- Target: $90-120/mo (vs $136-236/mo current)
- Savings: $552-1,392/year
- 70-80% of workload FREE (local CPU)
- GPU only when needed (serverless = no idle costs)

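The routing logic above reduces to a small decision function. A minimal sketch (the function and backend names are illustrative, not the orchestrator's real API):

```python
def route(task: str, priority: str = "normal") -> str:
    """Pick a compute backend per the routing rules above."""
    if task in ("text", "code"):
        return "rs8000-cpu"   # Ollama, always local → free
    if task == "image":
        # Only high-priority images pay for GPU; the rest run on CPU SD
        return "runpod-gpu" if priority == "high" else "rs8000-cpu"
    if task in ("video", "training"):
        return "runpod-gpu"   # GPU-only workloads
    raise ValueError(f"unknown task: {task}")
```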
### Deployment Architecture
```
RS 8000 G12 Pro (Netcup)
├── Cloudflare Tunnel (secure ingress)
├── Traefik Reverse Proxy (auto-discovery)
│   └── Routes to all services via Docker labels
├── Core Services
│   ├── Gitea (git hosting) - gitea.jeffemmett.com
│   └── Other internal tools
├── AI Services
│   ├── Ollama (text/code models)
│   ├── Stable Diffusion (CPU fallback)
│   └── Smart Router API (FastAPI)
├── Queue Infrastructure
│   ├── Redis (job queues)
│   └── PostgreSQL (job history/analytics)
├── Monitoring
│   ├── Prometheus (metrics)
│   ├── Grafana (dashboards)
│   └── Cost tracking API
└── Application Hosting
    ├── All websites (Dockerized + Traefik labels)
    ├── All apps (Dockerized + Traefik labels)
    └── Backend services (Dockerized)

RunPod Serverless (GPU Burst)
├── SDXL/SD3 endpoints
├── Video generation (Wan2.1)
└── Training/fine-tuning jobs
```

### Integration Pattern for Projects
All projects use a unified AI client SDK:
```python
from orchestrator_client import AIOrchestrator

ai = AIOrchestrator("http://rs8000-ip:8000")

# Automatically routes based on priority & model
result = await ai.generate_text(prompt, priority="low")    # → FREE CPU
result = await ai.generate_image(prompt, priority="high")  # → RunPod GPU
```

---

## 💰 GPU COST ANALYSIS & MIGRATION PLAN

### Current Infrastructure Costs (Monthly)

| Service | Type | Cost | Notes |
|---------|------|------|-------|
| Netcup RS 8000 G12 Pro | Fixed | ~€45 | 20 cores, 64GB RAM, 3TB (CPU-only) |
| RunPod Serverless | Variable | $50-100 | Pay-per-use GPU (images, video) |
| DigitalOcean Droplets | Fixed | ~$48 | ⚠️ DEPRECATED - migrate ASAP |
| **Current Total** | | **~$140-190/mo** | |

### GPU Provider Comparison

#### Netcup vGPU (NEW - Early Access, Ends July 7, 2025)

| Plan | GPU | VRAM | vCores | RAM | Storage | Price/mo | Price/hr equiv |
|------|-----|------|--------|-----|---------|----------|----------------|
| RS 2000 vGPU 7 | H200 | 7 GB dedicated | 8 | 16 GB DDR5 | 512 GB NVMe | €137.31 (~$150) | $0.21/hr |
| RS 4000 vGPU 14 | H200 | 14 GB dedicated | 12 | 32 GB DDR5 | 1 TB NVMe | €261.39 (~$285) | $0.40/hr |

**Pros:**
- NVIDIA H200 (latest gen, better than H100 for inference)
- Dedicated VRAM (no noisy neighbors)
- Germany location (EU data sovereignty, low latency to RS 8000)
- Fixed monthly cost = predictable budgeting
- 24/7 availability, no cold starts

**Cons:**
- Pay even when idle
- Limited to 7GB or 14GB VRAM options
- Early access = limited availability

#### RunPod Serverless (Current)

| GPU | VRAM | Price/hr | Typical Use |
|-----|------|----------|-------------|
| RTX 4090 | 24 GB | ~$0.44/hr | SDXL, medium models |
| A100 40GB | 40 GB | ~$1.14/hr | Large models, training |
| H100 80GB | 80 GB | ~$2.49/hr | Largest models |

**Current Endpoint Costs:**
- Image (SD/SDXL): ~$0.02/image (~2s compute)
- Video (Wan2.2): ~$0.50/video (~60s compute)
- Text (vLLM): ~$0.001/request
- Whisper: ~$0.01/minute audio

**Pros:**
- Zero idle costs
- Unlimited burst capacity
- Wide GPU selection (up to 80GB VRAM)
- Pay only for actual compute

**Cons:**
- Cold start delays (10-30s first request)
- Variable availability during peak times
- Per-request costs add up at scale

### Break-even Analysis

**When does Netcup vGPU become cheaper than RunPod?**

| Scenario | RunPod Cost | Netcup RS 2000 vGPU 7 | Netcup RS 4000 vGPU 14 |
|----------|-------------|----------------------|------------------------|
| 1,000 images/mo | $20 | $150 ❌ | $285 ❌ |
| 5,000 images/mo | $100 | $150 ❌ | $285 ❌ |
| **7,500 images/mo** | **$150** | **$150 ✅** | $285 ❌ |
| 10,000 images/mo | $200 | $150 ✅ | $285 ❌ |
| **14,250 images/mo** | **$285** | $150 ✅ | **$285 ✅** |
| 100 videos/mo | $50 | $150 ❌ | $285 ❌ |
| **300 videos/mo** | **$150** | **$150 ✅** | $285 ❌ |
| 500 videos/mo | $250 | $150 ✅ | $285 ❌ |

**Recommendation by Usage Pattern:**

| Monthly Usage | Best Option | Est. Cost |
|---------------|-------------|-----------|
| < 5,000 images OR < 250 videos | RunPod Serverless | $50-100 |
| 5,000-10,000 images OR 250-500 videos | **Netcup RS 2000 vGPU 7** | $150 fixed |
| > 10,000 images OR > 500 videos + training | **Netcup RS 4000 vGPU 14** | $285 fixed |
| Unpredictable/bursty workloads | RunPod Serverless | Variable |

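The break-even rows follow directly from fixed monthly cost ÷ per-unit serverless cost, using the $0.02/image and $0.50/video figures above:

```python
def breakeven_units(fixed_monthly_cost: float, cost_per_unit: float) -> float:
    """Monthly volume at which a fixed-price GPU matches serverless spend."""
    return fixed_monthly_cost / cost_per_unit

assert breakeven_units(150, 0.02) == 7_500    # images/mo for RS 2000 vGPU 7
assert breakeven_units(285, 0.02) == 14_250   # images/mo for RS 4000 vGPU 14
assert breakeven_units(150, 0.50) == 300      # videos/mo for RS 2000 vGPU 7
```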
### Migration Strategy

#### Phase 1: Immediate (Before July 7, 2025)
**Decision Point: Secure Netcup vGPU Early Access?**

- [ ] Monitor actual GPU usage for 2-4 weeks
- [ ] Calculate average monthly image/video generation
- [ ] If consistently > 5,000 images/mo → Consider RS 2000 vGPU 7
- [ ] If consistently > 10,000 images/mo → Consider RS 4000 vGPU 14
- [ ] **ACTION**: Redeem the early access code if usage justifies a fixed GPU

#### Phase 2: Hybrid Architecture (If vGPU Acquired)

```
RS 8000 G12 Pro (CPU - Current)
├── Ollama (text/code) → FREE
├── SD 1.5/2.1 CPU fallback → FREE
└── Orchestrator API

Netcup vGPU Server (NEW - If purchased)
├── Primary GPU workloads
├── SDXL/SD3 generation
├── Video generation (Wan2.1 I2V)
├── Model inference (14B params with 14GB VRAM)
└── Connected via internal Netcup network (low latency)

RunPod Serverless (Burst Only)
├── Overflow capacity
├── Models requiring > 14GB VRAM
├── Training/fine-tuning jobs
└── Geographic distribution needs
```

#### Phase 3: Cost Optimization Targets

| Scenario | Current | With vGPU Migration | Savings |
|----------|---------|---------------------|---------|
| Low usage | $140/mo | $95/mo (RS8000 + minimal RunPod) | $540/yr |
| Medium usage | $190/mo | $195/mo (RS8000 + vGPU 7) | Break-even |
| High usage | $250/mo | $195/mo (RS8000 + vGPU 7) | $660/yr |
| Very high usage | $350/mo | $330/mo (RS8000 + vGPU 14) | $240/yr |

### Model VRAM Requirements Reference

| Model | VRAM Needed | Fits vGPU 7? | Fits vGPU 14? |
|-------|-------------|--------------|---------------|
| SD 1.5 | ~4 GB | ✅ | ✅ |
| SD 2.1 | ~5 GB | ✅ | ✅ |
| SDXL | ~7 GB | ⚠️ Tight | ✅ |
| SD3 Medium | ~8 GB | ❌ | ✅ |
| Wan2.1 I2V 14B | ~12 GB | ❌ | ✅ |
| Wan2.1 T2V 14B | ~14 GB | ❌ | ⚠️ Tight |
| Flux.1 Dev | ~12 GB | ❌ | ✅ |
| LLaMA 3 8B (Q4) | ~6 GB | ✅ | ✅ |
| LLaMA 3 70B (Q4) | ~40 GB | ❌ | ❌ (RunPod) |

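The fit columns reduce to a simple comparison with a headroom margin. A sketch that mirrors the table (the ~1 GB "tight" threshold is an assumption chosen to reproduce the ⚠️ rows):

```python
def fits(vram_needed_gb: float, vram_available_gb: float) -> str:
    """Classify whether a model fits a GPU: 'yes' (✅), 'tight' (⚠️), 'no' (❌)."""
    if vram_needed_gb > vram_available_gb:
        return "no"
    if vram_available_gb - vram_needed_gb < 1:
        return "tight"   # fits, but little headroom for batching/activations
    return "yes"
```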
### Decision Framework

```
┌─────────────────────────────────────────────────────────┐
│                GPU WORKLOAD DECISION TREE               │
├─────────────────────────────────────────────────────────┤
│                                                         │
│  Is usage predictable and consistent?                   │
│  ├── YES → Is monthly GPU spend > $150?                 │
│  │         ├── YES → Netcup vGPU (fixed cost wins)      │
│  │         └── NO → RunPod Serverless (no idle cost)    │
│  └── NO → RunPod Serverless (pay for what you use)      │
│                                                         │
│  Does model require > 14GB VRAM?                        │
│  ├── YES → RunPod (A100/H100 on-demand)                 │
│  └── NO → Netcup vGPU or RS 8000 CPU                    │
│                                                         │
│  Is low latency critical?                               │
│  ├── YES → Netcup vGPU (same datacenter as RS 8000)     │
│  └── NO → RunPod Serverless (acceptable for batch)      │
│                                                         │
└─────────────────────────────────────────────────────────┘
```

### Monitoring & Review Schedule

- **Weekly**: Review RunPod spend dashboard
- **Monthly**: Calculate total GPU costs, compare to vGPU break-even
- **Quarterly**: Re-evaluate architecture, consider plan changes
- **Annually**: Full infrastructure cost audit

### Action Items

- [ ] **URGENT**: Decide on Netcup vGPU early access before July 7, 2025
- [ ] Set up GPU usage tracking in the orchestrator
- [ ] Create a Grafana dashboard for cost monitoring
- [ ] Test the Wan2.1 I2V 14B model on vGPU 14 (if acquired)
- [ ] Document a migration runbook for vGPU setup
- [ ] Complete the DigitalOcean deprecation (separate from the GPU decision)

---

## 📁 PROJECT PORTFOLIO STRUCTURE

### Repository Organization
- **Location**: `/home/jeffe/Github/`
- **Primary Flow**: Gitea (source of truth) → GitHub (public mirror)
- **Containerization**: ALL repos must be Dockerized with optimized production containers

### 🎯 MAIN PROJECT: canvas-website
**Location**: `/home/jeffe/Github/canvas-website`
**Description**: Collaborative canvas deployment - the integration hub where all tools come together
- Tldraw-based collaborative canvas platform
- Integrates Hyperindex, rSpace, MycoFi, and other tools
- Real-time collaboration features
- Deployed on RS 8000 in Docker
- Uses the AI orchestrator for intelligent features

### Project Categories

**AI & Infrastructure:**
- AI Orchestrator (smart routing between RS 8000 & RunPod)
- Model hosting & fine-tuning pipelines
- Cost optimization & monitoring dashboards

**Web Applications & Sites:**
- **canvas-website**: Main collaborative canvas (integration hub)
- All deployed in Docker containers on RS 8000
- Cloudflare Workers for edge functions (Hyperindex)
- Static sites + dynamic backends, containerized

**Supporting Projects:**
- **Hyperindex**: Tldraw canvas integration (Cloudflare stack) - integrates into canvas-website
- **rSpace**: Real-time collaboration platform - integrates into canvas-website
- **MycoFi**: DeFi/Web3 project - integrates into canvas-website
- **Canvas-related tools**: Knowledge graph & visualization components

### Deployment Strategy
1. **Development**: Local WSL2 environment (`/home/jeffe/Github/`)
2. **Version Control**: Push to Gitea FIRST → Auto-mirror to GitHub
3. **Containerization**: Build optimized Docker images with Traefik labels
4. **Deployment**: Deploy to RS 8000 via Docker Compose (join `traefik-public` network)
5. **Routing**: Traefik auto-discovers service via labels, no config changes needed
6. **DNS**: Add hostname to Cloudflare tunnel (if new domain) or it just works (existing domains)
7. **AI Integration**: Connect to local orchestrator API
8. **Monitoring**: Grafana dashboards for all services
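
Steps 3-5 above can be sketched as a minimal Compose fragment. This is a hedged illustration only: the service name, image path, hostname, and port are placeholders, not the actual canvas-website configuration.

```yaml
# Hypothetical docker-compose.yml fragment - service name, image,
# hostname, and port are illustrative placeholders.
services:
  myapp:
    image: gitea.example.com/jeffe/myapp:latest
    restart: unless-stopped
    networks:
      - traefik-public
    labels:
      - "traefik.enable=true"
      - "traefik.http.routers.myapp.rule=Host(`myapp.example.com`)"
      - "traefik.http.services.myapp.loadbalancer.server.port=3000"

networks:
  traefik-public:
    external: true
```

With labels like these, Traefik picks up the service as soon as the container joins `traefik-public`, which is why step 5 needs no config changes.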

### Infrastructure Philosophy
- **Self-hosted first**: Own your infrastructure (RS 8000 + Gitea)
- **Cloud for edge cases**: Cloudflare (edge), RunPod (GPU burst)
- **Cost-optimized**: Local CPU for 70-80% of workload
- **Dockerized everything**: Reproducible, scalable, maintainable
- **Smart orchestration**: Right compute for the right job

---

- Make sure you are running the HF download for a non-deprecated model version. After that, you can proceed with Image-to-Video 14B 720p (RECOMMENDED):

```bash
huggingface-cli download Wan-AI/Wan2.1-I2V-14B-720P \
  --include "*.safetensors" \
  --local-dir models/diffusion_models/wan2.1_i2v_14b
```

## 🕸️ HYPERINDEX PROJECT - TOP PRIORITY

**Location:** `/home/jeffe/Github/hyperindex-system/`

When user is ready to work on the hyperindexing system:
1. Reference `HYPERINDEX_PROJECT.md` for complete architecture and implementation details
2. Follow `HYPERINDEX_TODO.md` for step-by-step checklist
3. Start with Phase 1 (Database & Core Types), then proceed sequentially through Phase 5
4. This is a tldraw canvas integration project using Cloudflare Workers, D1, R2, and Durable Objects
5. Creates a "living, mycelial network" of web discoveries that spawn on the canvas in real-time

---

## 📋 BACKLOG.MD - UNIFIED TASK MANAGEMENT

**All projects use Backlog.md for task tracking.** Tasks are managed as markdown files and can be viewed at `backlog.jeffemmett.com` for a unified cross-project view.

### MCP Integration
Backlog.md is integrated via an MCP server. Available tools:
- `backlog.task_create` - Create new tasks
- `backlog.task_list` - List tasks with filters
- `backlog.task_update` - Update task status/details
- `backlog.task_view` - View task details
- `backlog.search` - Search across tasks, docs, decisions

### Task Lifecycle Workflow

**CRITICAL: Claude agents MUST follow this workflow for ALL development tasks:**

#### 1. Task Discovery (Before Starting Work)
```bash
# Check if task already exists
backlog search "<task description>" --plain

# List current tasks
backlog task list --plain
```

#### 2. Task Creation (If Not Exists)
```bash
# Create task with full details
backlog task create "Task Title" \
  --desc "Detailed description" \
  --priority high \
  --status "To Do"
```

#### 3. Starting Work (Move to In Progress)
```bash
# Update status when starting
backlog task edit <task-id> --status "In Progress"
```

#### 4. During Development (Update Notes)
```bash
# Append progress notes
backlog task edit <task-id> --append-notes "Completed X, working on Y"

# Update acceptance criteria
backlog task edit <task-id> --check-ac 1
```

#### 5. Completion (Move to Done)
```bash
# Mark complete when finished
backlog task edit <task-id> --status "Done"
```

### Project Initialization

When starting work in a new repository that doesn't have a backlog:
```bash
cd /path/to/repo
backlog init "Project Name" --integration-mode mcp --defaults
```

This creates the `backlog/` directory structure:
```
backlog/
├── config.yml    # Project configuration
├── tasks/        # Active tasks
├── completed/    # Finished tasks
├── drafts/       # Draft tasks
├── docs/         # Project documentation
├── decisions/    # Architecture decision records
└── archive/      # Archived tasks
```

### Task File Format
Tasks are markdown files with YAML frontmatter:
```yaml
---
id: task-001
title: Feature implementation
status: In Progress
assignee: [@claude]
created_date: '2025-12-03 14:30'
labels: [feature, backend]
priority: high
dependencies: [task-002]
---

## Description
What needs to be done...

## Plan
1. Step one
2. Step two

## Acceptance Criteria
- [ ] Criterion 1
- [x] Criterion 2 (completed)

## Notes
Progress updates go here...
```

### Cross-Project Aggregation (backlog.jeffemmett.com)

**Architecture:**
```
┌─────────────────────────────────────────────────────────────┐
│                  backlog.jeffemmett.com                     │
│              (Unified Kanban Dashboard)                     │
├─────────────────────────────────────────────────────────────┤
│                                                             │
│  ┌─────────────┐  ┌─────────────┐  ┌─────────────┐          │
│  │ canvas-web  │  │ hyperindex  │  │   mycofi    │  ...     │
│  │  (purple)   │  │   (green)   │  │   (blue)    │          │
│  └──────┬──────┘  └──────┬──────┘  └──────┬──────┘          │
│         │                │                │                 │
│         └────────────────┴────────────────┘                 │
│                          │                                  │
│              ┌───────────┴───────────┐                      │
│              │    Aggregation API    │                      │
│              │ (polls all projects)  │                      │
│              └───────────────────────┘                      │
│                                                             │
└─────────────────────────────────────────────────────────────┘

Data Sources:
├── Local:  /home/jeffe/Github/*/backlog/
└── Remote: ssh netcup "ls /opt/*/backlog/"
```

**Color Coding by Project:**
| Project | Color | Location |
|---------|-------|----------|
| canvas-website | Purple | Local + Netcup |
| hyperindex-system | Green | Local |
| mycofi-earth | Blue | Local + Netcup |
| decolonize-time | Orange | Local + Netcup |
| ai-orchestrator | Red | Netcup |

**Aggregation Service** (to be deployed on Netcup):
- Polls all project `backlog/tasks/` directories
- Serves unified JSON API at `api.backlog.jeffemmett.com`
- Web UI at `backlog.jeffemmett.com` shows combined Kanban
- Real-time updates via WebSocket
- Filter by project, status, priority, assignee
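
The polling side of the service above can be sketched in a few lines. This is a hypothetical illustration, not the deployed aggregator: the function names `parse_task` and `aggregate` are assumptions, and it only assumes the task frontmatter format and `*/backlog/tasks/` layout described in this document.

```python
import re
from pathlib import Path


def parse_task(md_text: str) -> dict:
    """Extract YAML-frontmatter fields (id, status, priority, ...) from a task file."""
    m = re.match(r"^---\n(.*?)\n---", md_text, re.S)
    fields: dict = {}
    if m:
        for line in m.group(1).splitlines():
            key, sep, val = line.partition(":")
            if sep:
                fields[key.strip()] = val.strip().strip("'\"")
    return fields


def aggregate(root: Path) -> dict:
    """Count tasks per status across every */backlog/tasks/ directory under root."""
    counts: dict = {}
    for task_file in root.glob("*/backlog/tasks/*.md"):
        status = parse_task(task_file.read_text()).get("status", "Unknown")
        counts[status] = counts.get(status, 0) + 1
    return counts
```

A real deployment would wrap `aggregate()` in the JSON API and push changes over the WebSocket, but the frontmatter parsing is the core of the data model.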

### Agent Behavior Requirements

**When Claude starts working on ANY task:**

1. **Check for existing backlog** in the repo:
   ```bash
   ls backlog/config.yml 2>/dev/null || echo "Backlog not initialized"
   ```

2. **If backlog exists**, search for related tasks:
   ```bash
   backlog search "<relevant keywords>" --plain
   ```

3. **Create or update task** before writing code:
   ```bash
   # If new task needed:
   backlog task create "Task title" --status "In Progress"

   # If task exists:
   backlog task edit <id> --status "In Progress"
   ```

4. **Update task on completion**:
   ```bash
   backlog task edit <id> --status "Done" --append-notes "Implementation complete"
   ```

5. **Never leave tasks in "In Progress"** when stopping work - either complete them or add notes explaining blockers.

### Viewing Tasks

**Terminal Kanban Board:**
```bash
backlog board
```

**Web Interface (single project):**
```bash
backlog browser --port 6420
```

**Unified View (all projects):**
Visit `backlog.jeffemmett.com` (served from Netcup)

### Backlog CLI Quick Reference

#### Task Operations
| Action | Command |
|--------|---------|
| View task | `backlog task 42 --plain` |
| List tasks | `backlog task list --plain` |
| Search tasks | `backlog search "topic" --plain` |
| Filter by status | `backlog task list -s "In Progress" --plain` |
| Create task | `backlog task create "Title" -d "Description" --ac "Criterion 1"` |
| Edit task | `backlog task edit 42 -t "New Title" -s "In Progress"` |
| Assign task | `backlog task edit 42 -a @claude` |

#### Acceptance Criteria Management
| Action | Command |
|--------|---------|
| Add AC | `backlog task edit 42 --ac "New criterion"` |
| Check AC #1 | `backlog task edit 42 --check-ac 1` |
| Check multiple | `backlog task edit 42 --check-ac 1 --check-ac 2` |
| Uncheck AC | `backlog task edit 42 --uncheck-ac 1` |
| Remove AC | `backlog task edit 42 --remove-ac 2` |

#### Multi-line Input (Description/Plan/Notes)
The CLI preserves input literally. Use shell-specific syntax for real newlines:

```bash
# Bash/Zsh (ANSI-C quoting)
backlog task edit 42 --notes $'Line1\nLine2\nLine3'
backlog task edit 42 --plan $'1. Step one\n2. Step two'

# POSIX portable
backlog task edit 42 --notes "$(printf 'Line1\nLine2')"

# Append notes progressively
backlog task edit 42 --append-notes $'- Completed X\n- Working on Y'
```

#### Definition of Done (DoD)
A task is **Done** only when ALL of these are complete:

**Via CLI:**
1. All acceptance criteria checked: `--check-ac <index>` for each
2. Implementation notes added: `--notes "..."` or `--append-notes "..."`
3. Status set to Done: `-s Done`

**Via Code/Testing:**
4. Tests pass (run test suite and linting)
5. Documentation updated if needed
6. Code self-reviewed
7. No regressions

**NEVER mark a task as Done without completing ALL items above.**

### Configuration Reference

Default `backlog/config.yml`:
```yaml
project_name: "Project Name"
default_status: "To Do"
statuses: ["To Do", "In Progress", "Done"]
labels: []
milestones: []
date_format: yyyy-mm-dd
max_column_width: 20
auto_open_browser: true
default_port: 6420
remote_operations: true
auto_commit: true
zero_padded_ids: 3
bypass_git_hooks: false
check_active_branches: true
active_branch_days: 60
```

---

## 🔧 TROUBLESHOOTING

### tmux "server exited unexpectedly"
This error occurs when a stale socket file exists from a crashed tmux server.

**Fix:**
```bash
rm -f /tmp/tmux-$(id -u)/default
```

Then start a new session normally with `tmux` or `tmux new -s <name>`.

---

@@ -0,0 +1,39 @@
---
id: TASK-064
title: Decommission Resend — migrate to self-hosted Mailcow SMTP
status: Done
assignee: []
created_date: '2026-02-15 23:15'
labels: []
dependencies: []
priority: high
---

## Description

<!-- SECTION:DESCRIPTION:BEGIN -->
Replace all Resend API integrations with self-hosted Mailcow SMTP via email relay service
<!-- SECTION:DESCRIPTION:END -->

## Implementation Notes

<!-- SECTION:NOTES:BEGIN -->
## Completed 2026-02-15

### Changes
- cryptidAuth.ts: sendEmail() now calls email-relay.jeffemmett.com instead of api.resend.com
- boardPermissions.ts: admin request emails use email relay
- types.ts: RESEND_API_KEY → EMAIL_RELAY_URL + EMAIL_RELAY_API_KEY
- wrangler.toml: updated secrets docs
- Tests updated with new mock env vars

### Wrangler Secrets
- EMAIL_RELAY_URL set
- EMAIL_RELAY_API_KEY set
- RESEND_API_KEY deleted

### Email Relay
- Deployed at email-relay.jeffemmett.com
- Flask + Gunicorn, sends via Mailcow SMTP
- Bearer token auth
<!-- SECTION:NOTES:END -->

@@ -1,6 +1,23 @@
 import { TLRecord, RecordId, TLStore, IndexKey } from "@tldraw/tldraw"
 import * as Automerge from "@automerge/automerge"

+// Track invalid index warnings to avoid console spam - log summary instead of per-shape
+let _invalidIndexCount = 0
+let _invalidIndexLogTimer: ReturnType<typeof setTimeout> | null = null
+function logInvalidIndexThrottled(index: string, shapeId: string) {
+  _invalidIndexCount++
+  // Log a summary every 2 seconds instead of per-shape
+  if (!_invalidIndexLogTimer) {
+    _invalidIndexLogTimer = setTimeout(() => {
+      if (_invalidIndexCount > 0) {
+        console.warn(`Invalid index reset to 'a1' for ${_invalidIndexCount} shape(s) (e.g. "${index}" on ${shapeId})`)
+        _invalidIndexCount = 0
+      }
+      _invalidIndexLogTimer = null
+    }, 2000)
+  }
+}
+
 // Helper function to validate if a string is a valid tldraw IndexKey
 // tldraw uses fractional indexing based on https://observablehq.com/@dgreensp/implementing-fractional-indexing
 // The first letter encodes integer part length: a=1 digit, b=2 digits, c=3 digits, etc.

@@ -434,7 +451,6 @@ export function applyAutomergePatchesToTLStore(
         // Skip records with missing required fields
         return
       }
-      console.error("Failed to sanitize record:", error, record)
       failedRecords.push(record)
     }
   })

@@ -443,7 +459,7 @@ export function applyAutomergePatchesToTLStore(
   // Log patch application for debugging

   if (failedRecords.length > 0) {
-    console.error("Failed to sanitize records:", failedRecords)
+    console.warn(`Failed to sanitize ${failedRecords.length} record(s)`)
   }

   // CRITICAL: Final safety check - ensure no geo shapes have w/h/geo at top level

@@ -583,11 +599,9 @@ export function sanitizeRecord(record: any): TLRecord {
   // DO NOT overwrite valid coordinates (including 0, which is a valid position)
   // Only set to 0 if the value is undefined, null, or NaN
   if (sanitized.x === undefined || sanitized.x === null || (typeof sanitized.x === 'number' && isNaN(sanitized.x))) {
-    console.warn(`⚠️ Shape ${sanitized.id} (${sanitized.type}) has invalid x coordinate, defaulting to 0. Original value:`, sanitized.x)
     sanitized.x = 0
   }
   if (sanitized.y === undefined || sanitized.y === null || (typeof sanitized.y === 'number' && isNaN(sanitized.y))) {
-    console.warn(`⚠️ Shape ${sanitized.id} (${sanitized.type}) has invalid y coordinate, defaulting to 0. Original value:`, sanitized.y)
     sanitized.y = 0
   }
   if (typeof sanitized.rotation !== 'number') sanitized.rotation = 0

@@ -605,7 +619,7 @@ export function sanitizeRecord(record: any): TLRecord {
   // Examples: "a1", "a2", "a10", "a1V", "a24sT", "a1V4rr" (fractional between a1 and a2)
   // Invalid: "c1", "b1", "z999" (old format - not valid fractional indices)
   if (!isValidIndexKey(sanitized.index)) {
-    console.warn(`⚠️ Invalid index "${sanitized.index}" for shape ${sanitized.id}, resetting to 'a1'`)
+    logInvalidIndexThrottled(sanitized.index, sanitized.id)
     sanitized.index = 'a1' as IndexKey
   }
   if (!sanitized.parentId) sanitized.parentId = 'page:page'

@@ -619,7 +633,7 @@ export function sanitizeRecord(record: any): TLRecord {
   } catch (e) {
     // If JSON serialization fails (e.g., due to functions or circular references),
     // create a shallow copy and recursively clean it
-    console.warn(`⚠️ Could not deep copy props for shape ${sanitized.id}, using shallow copy:`, e)
+    // Deep copy failed, using shallow copy
     const propsCopy: any = {}
     for (const key in sanitized.props) {
       try {

@@ -1183,7 +1197,7 @@ export function sanitizeRecord(record: any): TLRecord {
         }]
       }]
     }
-    console.log(`🔧 AutomergeToTLStore: Converted props.text to richText for text shape ${sanitized.id}`)
+    // Converted props.text to richText for text shape
   }
   // Preserve original text in meta for backward compatibility
   if (!sanitized.meta) sanitized.meta = {}

@@ -48,7 +48,6 @@ function minimalSanitizeRecord(record: any): any {
     }

     if (!isValid) {
-      console.warn(`⚠️ MinimalSanitization: Invalid index format "${sanitized.index}" for shape ${sanitized.id}`)
       sanitized.index = 'a1'
     }
   }

@@ -52,6 +52,8 @@ const CUSTOM_SHAPE_TYPES = [
   'PrivateWorkspace',   // Private workspace for Google Export
   'GoogleItem',         // Individual Google items
   'WorkflowBlock',      // Workflow builder blocks
+  'BlenderGen',         // Blender 3D procedural generation
+  'TransactionBuilder', // Safe multisig transaction builder
 ]

 // Combined set of all known shape types for validation

@@ -182,6 +184,8 @@ import { HolonBrowserShape } from "@/shapes/HolonBrowserShapeUtil"
 import { PrivateWorkspaceShape } from "@/shapes/PrivateWorkspaceShapeUtil"
 import { GoogleItemShape } from "@/shapes/GoogleItemShapeUtil"
 import { WorkflowBlockShape } from "@/shapes/WorkflowBlockShapeUtil"
+import { BlenderGenShape } from "@/shapes/BlenderGenShapeUtil"
+import { TransactionBuilderShape } from "@/shapes/TransactionBuilderShapeUtil"

 export function useAutomergeStoreV2({
   handle,

@@ -226,6 +230,8 @@ export function useAutomergeStoreV2({
     PrivateWorkspaceShape,   // Private workspace for Google Export
     GoogleItemShape,         // Individual Google items
     WorkflowBlockShape,      // Workflow builder blocks
+    BlenderGenShape,         // Blender 3D procedural generation
+    TransactionBuilderShape, // Safe multisig transaction builder
   ]

   // Use the module-level CUSTOM_SHAPE_TYPES constant

@@ -358,64 +364,31 @@ export function useAutomergeStoreV2({
         const automergeDoc = handle.doc()
         applyAutomergePatchesToTLStore(payload.patches, store, automergeDoc)
       } catch (patchError) {
-        console.error("Error applying patches batch, attempting individual patch application:", patchError)
-        // Try applying patches one by one to identify problematic ones
-        // This is a fallback - ideally we should fix the data at the source
+        // Batch application failed - try individual patches silently, only log summary
         let successCount = 0
-        let failedPatches: any[] = []
-        // CRITICAL: Pass Automerge document to patch handler so it can read full records
+        let failCount = 0
+        const errorTypes: Record<string, number> = {}
         const automergeDoc = handle.doc()
         for (const patch of payload.patches) {
           try {
             applyAutomergePatchesToTLStore([patch], store, automergeDoc)
             successCount++
           } catch (individualPatchError) {
-            failedPatches.push({ patch, error: individualPatchError })
-            console.error(`Failed to apply individual patch:`, individualPatchError)
-            // Log the problematic patch for debugging
-            const recordId = patch.path[1] as string
-            console.error("Problematic patch details:", {
-              action: patch.action,
-              path: patch.path,
-              recordId: recordId,
-              value: 'value' in patch ? patch.value : undefined,
-              errorMessage: individualPatchError instanceof Error ? individualPatchError.message : String(individualPatchError)
-            })
-
-            // Try to get more context about the failing record
-            try {
-              const existingRecord = store.get(recordId as any)
-              console.error("Existing record that failed:", existingRecord)
-
-              // If it's a geo shape missing props.geo, try to fix it
-              if (existingRecord && (existingRecord as any).typeName === 'shape' && (existingRecord as any).type === 'geo') {
-                const geoRecord = existingRecord as any
-                if (!geoRecord.props || !geoRecord.props.geo) {
-                  // This won't help with the current patch, but might help future patches
-                  // The real fix should happen in AutomergeToTLStore sanitization
-                }
-              }
-            } catch (e) {
-              console.error("Could not retrieve existing record:", e)
-            }
+            failCount++
+            // Categorize errors for summary
+            const msg = individualPatchError instanceof Error ? individualPatchError.message : String(individualPatchError)
+            const category = msg.includes('props.geo') ? 'missing props.geo' :
+                             msg.includes('index') ? 'invalid index' :
+                             msg.includes('typeName') ? 'missing typeName' :
+                             'other'
+            errorTypes[category] = (errorTypes[category] || 0) + 1
           }
         }
-
-        // Log summary
-        if (failedPatches.length > 0) {
-          console.error(`❌ Failed to apply ${failedPatches.length} out of ${payload.patches.length} patches`)
-          // Most common issue: geo shapes missing props.geo - this should be fixed in sanitization
-          const geoShapeErrors = failedPatches.filter(p =>
-            p.error instanceof Error && p.error.message.includes('props.geo')
-          )
-          if (geoShapeErrors.length > 0) {
-            console.error(`⚠️ ${geoShapeErrors.length} failures due to missing props.geo - this should be fixed in AutomergeToTLStore sanitization`)
-          }
-        }
-
-        if (successCount < payload.patches.length || payload.patches.length > 5) {
-          // Partial patches applied
+        // Log a single summary line instead of per-patch errors
+        if (failCount > 0) {
+          const errorSummary = Object.entries(errorTypes).map(([k, v]) => `${k}: ${v}`).join(', ')
+          console.warn(`Patch sync: ${successCount}/${payload.patches.length} applied, ${failCount} failed (${errorSummary})`)
         }
       }
     }

@@ -492,11 +465,8 @@ export function useAutomergeStoreV2({
       return true
     })

-    // Log errors for any unknown shape types that were filtered out
     if (unknownShapeTypes.length > 0) {
-      console.error(`❌ Unknown shape types filtered out (shapes not loaded):`, unknownShapeTypes)
-      console.error(`   These shapes exist in the document but are not registered in KNOWN_SHAPE_TYPES.`)
-      console.error(`   To fix: Add these types to CUSTOM_SHAPE_TYPES in useAutomergeStoreV2.ts`)
+      console.warn(`Unknown shape types filtered out: ${unknownShapeTypes.join(', ')}`)
     }

    if (filteredRecords.length > 0) {

@@ -1242,9 +1212,7 @@ export function useAutomergeStoreV2({
       } else {
         // Patches didn't come through - this should be rare if handler is set up before data load
         // Log a warning but don't show disruptive confirmation dialog
-        console.warn(`⚠️ No patches received after ${maxAttempts} attempts for room initialization.`)
-        console.warn(`⚠️ This may happen if Automerge doc was initialized with server data before handler was ready.`)
-        console.warn(`⚠️ Store will remain empty - patches should handle data loading in normal operation.`)
+        console.warn(`No patches received after ${maxAttempts} attempts - store may be empty`)

         // Simplified fallback: Just log and continue with empty store
         // Patches should handle data loading, so if they don't come through,

@@ -152,7 +152,7 @@ export function useAutomergeSync(config: AutomergeSyncConfig): TLStoreWithStatus
   const applyJsonSyncData = useCallback((data: TLStoreSnapshot & { deleted?: string[] }) => {
     const currentHandle = handleRef.current
     if (!currentHandle || (!data?.store && !data?.deleted)) {
-      console.warn('⚠️ Cannot apply JSON sync - no handle or data')
+      // No handle or data available for JSON sync - expected during initialization
       return
     }

@@ -0,0 +1,201 @@
import { useState } from "react"
import {
  usePendingTransactions,
  useConfirmTransaction,
  useExecuteTransaction,
  type SafeTransaction,
} from "../../hooks/useSafeTransaction"
import { formatAddress } from "../../hooks/useWallet"

function TxCard({ tx, onRefresh }: { tx: SafeTransaction; onRefresh: () => void }) {
  const { confirm, isLoading: confirming } = useConfirmTransaction()
  const { execute, isLoading: executing } = useExecuteTransaction()
  const [signerKey, setSignerKey] = useState("")
  const [showActions, setShowActions] = useState(false)

  const thresholdMet = tx.confirmations.length >= tx.confirmationsRequired
  const sigProgress = `${tx.confirmations.length}/${tx.confirmationsRequired}`

  const valueEth = tx.value !== "0"
    ? `${(Number(tx.value) / 1e18).toFixed(6)} ETH`
    : "Contract call"

  const handleConfirm = async () => {
    if (!signerKey) return
    await confirm(tx.safeTxHash, signerKey)
    setSignerKey("")
    setShowActions(false)
    onRefresh()
  }

  const handleExecute = async () => {
    if (!signerKey) return
    await execute(tx.safeTxHash, signerKey)
    setSignerKey("")
    setShowActions(false)
    onRefresh()
  }

  return (
    <div
      style={{
        border: "1px solid #1e293b",
        borderRadius: 8,
        padding: 10,
        background: "#1e293b",
        marginBottom: 8,
      }}
    >
      {/* Header Row */}
      <div style={{ display: "flex", justifyContent: "space-between", alignItems: "center" }}>
        <div style={{ fontSize: 12, fontWeight: 600 }}>
          <span style={{ color: "#e2e8f0" }}>#{tx.nonce}</span>
          <span style={{ color: "#64748b", marginLeft: 8 }}>→ {formatAddress(tx.to)}</span>
        </div>
        <span
          style={{
            fontSize: 10,
            fontWeight: 700,
            padding: "2px 6px",
            borderRadius: 4,
            background: thresholdMet ? "#064e3b" : "#1e1b4b",
            color: thresholdMet ? "#34d399" : "#818cf8",
          }}
        >
          {sigProgress}
        </span>
      </div>

      {/* Value */}
      <div style={{ fontSize: 11, color: "#94a3b8", marginTop: 4 }}>{valueEth}</div>

      {/* Signers */}
      <div style={{ marginTop: 6, display: "flex", gap: 4, flexWrap: "wrap" }}>
        {tx.confirmations.map((c) => (
          <span
            key={c.owner}
            style={{
              fontSize: 10,
              padding: "1px 6px",
              borderRadius: 3,
              background: "#064e3b",
              color: "#34d399",
              fontFamily: "monospace",
            }}
          >
            {formatAddress(c.owner, 3)}
          </span>
        ))}
      </div>

      {/* Actions Toggle */}
      <button
        onClick={() => setShowActions(!showActions)}
        style={{
          marginTop: 6,
          border: "none",
          background: "transparent",
          color: "#818cf8",
          cursor: "pointer",
          fontSize: 11,
          padding: 0,
          fontWeight: 600,
        }}
      >
        {showActions ? "Hide" : thresholdMet ? "Execute" : "Confirm"}
      </button>

      {/* Action Panel */}
      {showActions && (
        <div style={{ marginTop: 8, display: "flex", flexDirection: "column", gap: 6 }}>
          <input
            type="password"
            placeholder="Signer private key (0x...)"
            value={signerKey}
            onChange={(e) => setSignerKey(e.target.value)}
            style={{
              padding: "6px 8px",
              border: "1px solid #334155",
              borderRadius: 4,
              background: "#0f172a",
              color: "#e2e8f0",
              fontSize: 11,
              fontFamily: "monospace",
            }}
          />
          <div style={{ display: "flex", gap: 6 }}>
            {!thresholdMet && (
              <button
                onClick={handleConfirm}
                disabled={confirming || !signerKey}
                style={{
                  flex: 1,
                  padding: "6px 12px",
                  border: "none",
                  borderRadius: 4,
                  background: confirming ? "#334155" : "#818cf8",
                  color: "#fff",
                  fontSize: 11,
                  fontWeight: 600,
                  cursor: confirming ? "not-allowed" : "pointer",
                }}
              >
                {confirming ? "..." : "Confirm"}
              </button>
            )}
            {thresholdMet && (
              <button
                onClick={handleExecute}
                disabled={executing || !signerKey}
                style={{
                  flex: 1,
                  padding: "6px 12px",
                  border: "none",
                  borderRadius: 4,
                  background: executing ? "#334155" : "#12ff80",
                  color: "#0f172a",
                  fontSize: 11,
                  fontWeight: 700,
                  cursor: executing ? "not-allowed" : "pointer",
                }}
              >
                {executing ? "..." : "Execute On-Chain"}
              </button>
            )}
          </div>
        </div>
      )}
    </div>
  )
}

export function PendingTransactions() {
  const { data: txs, isLoading, error, refetch } = usePendingTransactions()

  if (isLoading) {
    return <div style={{ color: "#64748b", fontSize: 12, textAlign: "center" }}>Loading...</div>
  }

  if (error) {
    return <div style={{ color: "#fca5a5", fontSize: 12 }}>Error: {error}</div>
  }

  if (txs.length === 0) {
    return (
      <div style={{ color: "#64748b", fontSize: 12, textAlign: "center", padding: 20 }}>
        No pending transactions
      </div>
    )
  }

  return (
    <div>
      <div style={{ fontSize: 11, color: "#64748b", marginBottom: 8 }}>
        {txs.length} pending transaction{txs.length !== 1 ? "s" : ""}
      </div>
      {txs.map((tx) => (
        <TxCard key={tx.safeTxHash} tx={tx} onRefresh={refetch} />
      ))}
    </div>
  )
}
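The TxCard component above formats a transaction's wei `value` for display and compares collected confirmations against the Safe threshold. A minimal sketch of that logic as standalone helpers (illustrative only, not part of the diff):

```typescript
type Confirmation = { owner: string }

// A Safe transaction's `value` is a wei string; "0" indicates a contract call.
function formatValueEth(value: string): string {
  return value !== "0" ? `${(Number(value) / 1e18).toFixed(6)} ETH` : "Contract call"
}

// Threshold is met once confirmations reach the Safe's required signer count.
function thresholdMet(confirmations: Confirmation[], required: number): boolean {
  return confirmations.length >= required
}
```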
@@ -0,0 +1,92 @@
import { useSafeInfo, useSafeBalances } from "../../hooks/useSafeTransaction"
import { formatAddress } from "../../hooks/useWallet"

export function SafeHeader() {
  const { data: info, isLoading: infoLoading } = useSafeInfo()
  const { data: balances, isLoading: balLoading } = useSafeBalances()

  if (infoLoading) {
    return (
      <div style={{ padding: 12, textAlign: "center", color: "#64748b", fontSize: 12 }}>
        Loading Safe info...
      </div>
    )
  }

  if (!info) {
    return (
      <div style={{ padding: 12, textAlign: "center", color: "#f87171", fontSize: 12 }}>
        Treasury not configured
      </div>
    )
  }

  // Pick the native (ETH) balance entry from the balances data
  const nativeBalance = balances?.balances?.find((b) => !b.tokenAddress)
  const nativeFormatted = nativeBalance
    ? (Number(nativeBalance.balance) / 1e18).toFixed(4)
    : "..."

  return (
    <div
      style={{
        padding: "10px 12px",
        borderBottom: "1px solid #1e293b",
        display: "flex",
        flexDirection: "column",
        gap: 6,
      }}
    >
      {/* Address + Chain */}
      <div style={{ display: "flex", justifyContent: "space-between", alignItems: "center" }}>
        <div style={{ display: "flex", alignItems: "center", gap: 6 }}>
          <span
            style={{
              fontSize: 14,
              fontWeight: 700,
              fontFamily: "monospace",
              color: "#12ff80",
            }}
          >
            {formatAddress(info.address, 6)}
          </span>
          <button
            onClick={() => navigator.clipboard.writeText(info.address)}
            style={{
              border: "none",
              background: "transparent",
              color: "#64748b",
              cursor: "pointer",
              fontSize: 11,
              padding: "2px 4px",
            }}
            title="Copy address"
          >
            copy
          </button>
        </div>
        <span
          style={{
            fontSize: 10,
            fontWeight: 600,
            padding: "2px 8px",
            borderRadius: 4,
            background: "#1e293b",
            color: "#94a3b8",
          }}
        >
          Chain {info.chainId}
        </span>
      </div>

      {/* Owners + Threshold + Balance */}
      <div style={{ display: "flex", justifyContent: "space-between", fontSize: 11, color: "#94a3b8" }}>
        <span>
          {info.threshold}/{info.owners.length} owners
        </span>
        <span>v{info.version}</span>
        <span>{nativeFormatted} ETH</span>
      </div>
    </div>
  )
}
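SafeHeader relies on `formatAddress` from `../../hooks/useWallet`, which this diff does not include. Judging from the call sites (`formatAddress(info.address, 6)`, `formatAddress(c.owner, 3)`), a plausible sketch of its behavior — an assumption, not the actual implementation:

```typescript
// Hypothetical reconstruction of formatAddress: keep the 0x prefix plus
// `chars` leading hex digits, an ellipsis, and `chars` trailing digits.
function formatAddress(address: string, chars = 4): string {
  // Too short to benefit from truncation: return unchanged.
  if (address.length <= chars * 2 + 2) return address
  return `${address.slice(0, chars + 2)}...${address.slice(-chars)}`
}
```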
@@ -0,0 +1,159 @@
import { useState } from "react"
import { useProposeTransaction, useSafeBalances } from "../../hooks/useSafeTransaction"
import { useWalletConnection } from "../../hooks/useWallet"

export function TransactionComposer() {
  const { address, isConnected } = useWalletConnection()
  const { data: balances } = useSafeBalances()
  const { propose, isLoading, error } = useProposeTransaction()

  const [recipient, setRecipient] = useState("")
  const [amount, setAmount] = useState("")
  const [tokenAddress, setTokenAddress] = useState("")
  const [title, setTitle] = useState("")
  const [signerKey, setSignerKey] = useState("")
  const [result, setResult] = useState<string | null>(null)

  const handleSubmit = async () => {
    if (!recipient || !amount || !signerKey) return

    const res = await propose({
      recipientAddress: recipient,
      amount,
      tokenAddress: tokenAddress || undefined,
      title: title || undefined,
      signerPrivateKey: signerKey,
    })

    if (res) {
      setResult(`Proposed: ${res.safeTxHash.slice(0, 16)}...`)
      setRecipient("")
      setAmount("")
      setTitle("")
    }
  }

  const inputStyle: React.CSSProperties = {
    width: "100%",
    padding: "8px 10px",
    border: "1px solid #334155",
    borderRadius: 6,
    background: "#1e293b",
    color: "#e2e8f0",
    fontSize: 12,
    fontFamily: "monospace",
    outline: "none",
    boxSizing: "border-box",
  }

  const labelStyle: React.CSSProperties = {
    fontSize: 11,
    fontWeight: 600,
    color: "#94a3b8",
    marginBottom: 4,
    display: "block",
  }

  return (
    <div style={{ display: "flex", flexDirection: "column", gap: 12 }}>
      {/* Title */}
      <div>
        <label style={labelStyle}>Description (optional)</label>
        <input
          style={inputStyle}
          placeholder="Payment for..."
          value={title}
          onChange={(e) => setTitle(e.target.value)}
        />
      </div>

      {/* Recipient */}
      <div>
        <label style={labelStyle}>Recipient Address *</label>
        <input
          style={inputStyle}
          placeholder="0x..."
          value={recipient}
          onChange={(e) => setRecipient(e.target.value)}
        />
      </div>

      {/* Token (optional) */}
      <div>
        <label style={labelStyle}>Token Address (empty = native ETH)</label>
        <select
          style={{ ...inputStyle, fontFamily: "Inter, system-ui, sans-serif" }}
          value={tokenAddress}
          onChange={(e) => setTokenAddress(e.target.value)}
        >
          <option value="">ETH (native)</option>
          {balances?.balances
            ?.filter((b) => b.tokenAddress && b.token)
            .map((b) => (
              <option key={b.tokenAddress} value={b.tokenAddress!}>
                {b.token!.symbol} — {(Number(b.balance) / 10 ** b.token!.decimals).toFixed(4)}
              </option>
            ))}
        </select>
      </div>

      {/* Amount */}
      <div>
        <label style={labelStyle}>Amount * {tokenAddress ? "(token units)" : "(wei)"}</label>
        <input
          style={inputStyle}
          placeholder={tokenAddress ? "100" : "1000000000000000"}
          value={amount}
          onChange={(e) => setAmount(e.target.value)}
          type="text"
        />
      </div>

      {/* Signer Key */}
      <div>
        <label style={labelStyle}>Signer Private Key *</label>
        <input
          style={inputStyle}
          type="password"
          placeholder="0x..."
          value={signerKey}
          onChange={(e) => setSignerKey(e.target.value)}
        />
        <span style={{ fontSize: 10, color: "#64748b", marginTop: 2, display: "block" }}>
          Must be a Safe owner. Key is sent to treasury-service for signing.
        </span>
      </div>

      {/* Submit */}
      <button
        onClick={handleSubmit}
        disabled={isLoading || !recipient || !amount || !signerKey}
        style={{
          padding: "10px 16px",
          border: "none",
          borderRadius: 6,
          background: isLoading ? "#334155" : "#12ff80",
          color: isLoading ? "#94a3b8" : "#0f172a",
          fontWeight: 700,
          fontSize: 13,
          cursor: isLoading ? "not-allowed" : "pointer",
          transition: "all 0.15s",
        }}
      >
        {isLoading ? "Proposing..." : "Propose Transaction"}
      </button>

      {/* Result / Error */}
      {result && (
        <div style={{ padding: 8, borderRadius: 6, background: "#064e3b", color: "#34d399", fontSize: 11 }}>
          {result}
        </div>
      )}
      {error && (
        <div style={{ padding: 8, borderRadius: 6, background: "#450a0a", color: "#fca5a5", fontSize: 11 }}>
          {error}
        </div>
      )}
    </div>
  )
}
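As the amount placeholder hints, the composer expects native-ETH amounts as a raw wei string. A hedged sketch of converting a human-readable decimal amount into base units with BigInt (illustrative only; the component submits the raw string as-is, and this helper is not part of the diff):

```typescript
// Convert a decimal amount string to an integer base-unit string without
// floating-point loss, using BigInt arithmetic. Assumes a plain decimal
// input (no sign, no scientific notation).
function ethToWei(amount: string, decimals = 18): string {
  const [whole, frac = ""] = amount.split(".")
  // Right-pad the fractional part to `decimals` digits, truncating extras.
  const fracPadded = frac.padEnd(decimals, "0").slice(0, decimals)
  return (BigInt(whole || "0") * 10n ** BigInt(decimals) + BigInt(fracPadded)).toString()
}
```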
@@ -0,0 +1,133 @@
import { useState } from "react"
import { useTransactionHistory } from "../../hooks/useSafeTransaction"
import { formatAddress } from "../../hooks/useWallet"

export function TransactionHistory() {
  const [page, setPage] = useState(1)
  const { data: txs, total, isLoading, error } = useTransactionHistory(page, 10)

  if (isLoading) {
    return <div style={{ color: "#64748b", fontSize: 12, textAlign: "center" }}>Loading...</div>
  }

  if (error) {
    return <div style={{ color: "#fca5a5", fontSize: 12 }}>Error: {error}</div>
  }

  if (txs.length === 0) {
    return (
      <div style={{ color: "#64748b", fontSize: 12, textAlign: "center", padding: 20 }}>
        No executed transactions
      </div>
    )
  }

  return (
    <div>
      <div style={{ fontSize: 11, color: "#64748b", marginBottom: 8 }}>
        {total} total transaction{total !== 1 ? "s" : ""}
      </div>

      {txs.map((tx) => {
        const valueEth =
          tx.value !== "0"
            ? `${(Number(tx.value) / 1e18).toFixed(6)} ETH`
            : "Contract call"

        const date = tx.executionDate
          ? new Date(tx.executionDate).toLocaleDateString("en-US", {
              month: "short",
              day: "numeric",
              hour: "2-digit",
              minute: "2-digit",
            })
          : "—"

        return (
          <div
            key={tx.safeTxHash}
            style={{
              border: "1px solid #1e293b",
              borderRadius: 6,
              padding: 8,
              background: "#1e293b",
              marginBottom: 6,
              display: "flex",
              justifyContent: "space-between",
              alignItems: "center",
            }}
          >
            <div>
              <div style={{ fontSize: 12, fontWeight: 600 }}>
                <span style={{ color: "#e2e8f0" }}>#{tx.nonce}</span>
                <span style={{ color: "#64748b", marginLeft: 8 }}>→ {formatAddress(tx.to)}</span>
              </div>
              <div style={{ fontSize: 11, color: "#94a3b8", marginTop: 2 }}>{valueEth}</div>
            </div>
            <div style={{ textAlign: "right" }}>
              <div
                style={{
                  fontSize: 10,
                  fontWeight: 600,
                  color: tx.isSuccessful ? "#34d399" : "#f87171",
                }}
              >
                {tx.isSuccessful ? "Success" : "Failed"}
              </div>
              <div style={{ fontSize: 10, color: "#64748b", marginTop: 2 }}>{date}</div>
              {tx.transactionHash && (
                <a
                  href={`https://basescan.org/tx/${tx.transactionHash}`}
                  target="_blank"
                  rel="noopener noreferrer"
                  style={{ fontSize: 10, color: "#818cf8", textDecoration: "none" }}
                >
                  {formatAddress(tx.transactionHash, 4)}
                </a>
              )}
            </div>
          </div>
        )
      })}

      {/* Pagination */}
      {total > 10 && (
        <div style={{ display: "flex", justifyContent: "center", gap: 8, marginTop: 8 }}>
          <button
            disabled={page === 1}
            onClick={() => setPage((p) => p - 1)}
            style={{
              padding: "4px 12px",
              border: "1px solid #334155",
              borderRadius: 4,
              background: "transparent",
              color: page === 1 ? "#334155" : "#94a3b8",
              cursor: page === 1 ? "not-allowed" : "pointer",
              fontSize: 11,
            }}
          >
            Prev
          </button>
          <span style={{ fontSize: 11, color: "#64748b", lineHeight: "28px" }}>
            {page}/{Math.ceil(total / 10)}
          </span>
          <button
            disabled={page * 10 >= total}
            onClick={() => setPage((p) => p + 1)}
            style={{
              padding: "4px 12px",
              border: "1px solid #334155",
              borderRadius: 4,
              background: "transparent",
              color: page * 10 >= total ? "#334155" : "#94a3b8",
              cursor: page * 10 >= total ? "not-allowed" : "pointer",
              fontSize: 11,
            }}
          >
            Next
          </button>
        </div>
      )}
    </div>
  )
}
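The Prev/Next controls above hard-code a page size of 10 and derive the disabled states and page label from `page` and `total`. That logic, extracted as a pure function for clarity (illustrative, not part of the component):

```typescript
// Mirror of TransactionHistory's pagination logic: label is
// "page/totalPages"; Prev disables on page 1, Next once the
// current page covers the remaining items.
function pageState(page: number, total: number, pageSize = 10) {
  return {
    label: `${page}/${Math.ceil(total / pageSize)}`,
    prevDisabled: page === 1,
    nextDisabled: page * pageSize >= total,
  }
}
```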
@@ -0,0 +1,294 @@
/**
 * useSafeTransaction - Hooks for interacting with Safe treasury API
 *
 * Provides data fetching and mutation hooks for Safe multisig operations.
 * Talks to the treasury-service REST API on payment-infra.
 */

import { useState, useEffect, useCallback } from 'react'

const TREASURY_API = import.meta.env.VITE_TREASURY_API_URL || 'http://localhost:3006'

// =============================================================================
// Types
// =============================================================================

export interface SafeInfo {
  address: string
  chainId: number
  threshold: number
  owners: string[]
  nonce: number
  modules: string[]
  guard: string
  fallbackHandler: string
  version: string
}

export interface SafeBalance {
  tokenAddress: string | null
  token: {
    name: string
    symbol: string
    decimals: number
    logoUri?: string
  } | null
  balance: string
  fiatBalance?: string
}

export interface SafeConfirmation {
  owner: string
  signature: string
  signatureType: string
  submissionDate: string
}

export interface SafeTransaction {
  safeTxHash: string
  to: string
  value: string
  data: string | null
  operation: number
  nonce: number
  confirmations: SafeConfirmation[]
  confirmationsRequired: number
  isExecuted: boolean
  isSuccessful: boolean | null
  executionDate: string | null
  transactionHash: string | null
  submissionDate: string
  proposer: string
}

// =============================================================================
// Helper
// =============================================================================

async function fetchJson<T>(url: string, options?: RequestInit): Promise<T> {
  const res = await fetch(url, {
    ...options,
    headers: {
      'Content-Type': 'application/json',
      ...options?.headers,
    },
  })
  if (!res.ok) {
    const body = await res.json().catch(() => ({}))
    throw new Error((body as Record<string, string>).error || `HTTP ${res.status}`)
  }
  return res.json() as Promise<T>
}

// =============================================================================
// useSafeInfo
// =============================================================================

export function useSafeInfo() {
  const [data, setData] = useState<SafeInfo | null>(null)
  const [isLoading, setIsLoading] = useState(true)
  const [error, setError] = useState<string | null>(null)

  const fetch_ = useCallback(async () => {
    setIsLoading(true)
    setError(null)
    try {
      const info = await fetchJson<SafeInfo>(`${TREASURY_API}/api/treasury/safe`)
      setData(info)
    } catch (e) {
      setError((e as Error).message)
    } finally {
      setIsLoading(false)
    }
  }, [])

  useEffect(() => { fetch_() }, [fetch_])

  return { data, isLoading, error, refetch: fetch_ }
}

// =============================================================================
// useSafeBalances
// =============================================================================

export function useSafeBalances() {
  const [data, setData] = useState<{ balances: SafeBalance[] } | null>(null)
  const [isLoading, setIsLoading] = useState(true)
  const [error, setError] = useState<string | null>(null)

  const fetch_ = useCallback(async () => {
    setIsLoading(true)
    setError(null)
    try {
      const result = await fetchJson<{ balances: SafeBalance[] }>(`${TREASURY_API}/api/treasury/balance`)
      setData(result)
    } catch (e) {
      setError((e as Error).message)
    } finally {
      setIsLoading(false)
    }
  }, [])

  useEffect(() => { fetch_() }, [fetch_])

  return { data, isLoading, error, refetch: fetch_ }
}

// =============================================================================
// usePendingTransactions
// =============================================================================

export function usePendingTransactions(pollInterval = 15000) {
  const [data, setData] = useState<SafeTransaction[]>([])
  const [isLoading, setIsLoading] = useState(true)
  const [error, setError] = useState<string | null>(null)

  const fetch_ = useCallback(async () => {
    try {
      const result = await fetchJson<{ transactions: SafeTransaction[] }>(`${TREASURY_API}/api/treasury/pending`)
      setData(result.transactions)
      setError(null)
    } catch (e) {
      setError((e as Error).message)
    } finally {
      setIsLoading(false)
    }
  }, [])

  useEffect(() => {
    fetch_()
    const interval = setInterval(fetch_, pollInterval)
    return () => clearInterval(interval)
  }, [fetch_, pollInterval])

  return { data, isLoading, error, refetch: fetch_ }
}

// =============================================================================
// useTransactionHistory
// =============================================================================

export function useTransactionHistory(page = 1, limit = 20) {
  const [data, setData] = useState<SafeTransaction[]>([])
  const [total, setTotal] = useState(0)
  const [isLoading, setIsLoading] = useState(true)
  const [error, setError] = useState<string | null>(null)

  const fetch_ = useCallback(async () => {
    setIsLoading(true)
    setError(null)
    try {
      const result = await fetchJson<{
        transactions: SafeTransaction[]
        pagination: { total: number }
      }>(`${TREASURY_API}/api/treasury/transactions?page=${page}&limit=${limit}`)
      setData(result.transactions)
      setTotal(result.pagination.total)
    } catch (e) {
      setError((e as Error).message)
    } finally {
      setIsLoading(false)
    }
  }, [page, limit])

  useEffect(() => { fetch_() }, [fetch_])

  return { data, total, isLoading, error, refetch: fetch_ }
}

// =============================================================================
// useProposeTransaction (mutation)
// =============================================================================

export function useProposeTransaction() {
  const [isLoading, setIsLoading] = useState(false)
  const [error, setError] = useState<string | null>(null)

  const propose = useCallback(async (params: {
    recipientAddress: string
    amount: string
    tokenAddress?: string
    title?: string
    description?: string
    signerPrivateKey: string
  }) => {
    setIsLoading(true)
    setError(null)
    try {
      const result = await fetchJson<{ safeTxHash: string }>(`${TREASURY_API}/api/treasury/propose`, {
        method: 'POST',
        body: JSON.stringify(params),
      })
      return result
    } catch (e) {
      setError((e as Error).message)
      return null
    } finally {
      setIsLoading(false)
    }
  }, [])

  return { propose, isLoading, error }
}

// =============================================================================
// useConfirmTransaction (mutation)
// =============================================================================

export function useConfirmTransaction() {
  const [isLoading, setIsLoading] = useState(false)
  const [error, setError] = useState<string | null>(null)

  const confirm = useCallback(async (safeTxHash: string, signerPrivateKey: string) => {
    setIsLoading(true)
    setError(null)
    try {
      const result = await fetchJson<SafeTransaction>(
        `${TREASURY_API}/api/treasury/confirm/${safeTxHash}`,
        {
          method: 'POST',
          body: JSON.stringify({ signerPrivateKey }),
        }
      )
      return result
    } catch (e) {
      setError((e as Error).message)
      return null
    } finally {
      setIsLoading(false)
    }
  }, [])

  return { confirm, isLoading, error }
}

// =============================================================================
// useExecuteTransaction (mutation)
// =============================================================================

export function useExecuteTransaction() {
  const [isLoading, setIsLoading] = useState(false)
  const [error, setError] = useState<string | null>(null)

  const execute = useCallback(async (safeTxHash: string, signerPrivateKey: string) => {
    setIsLoading(true)
    setError(null)
    try {
      const result = await fetchJson<{ transactionHash: string }>(
        `${TREASURY_API}/api/treasury/execute/${safeTxHash}`,
        {
          method: 'POST',
          body: JSON.stringify({ signerPrivateKey }),
        }
      )
      return result
    } catch (e) {
      setError((e as Error).message)
      return null
    } finally {
      setIsLoading(false)
    }
  }, [])

  return { execute, isLoading, error }
}
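The `fetchJson` helper in the hooks module surfaces server errors by preferring a JSON `error` field from the response body and falling back to the bare HTTP status. That fallback, isolated as a pure function (a sketch, not code from the diff):

```typescript
// Mirrors fetchJson's error handling: try to parse the body as JSON and
// use its `error` field; on a non-JSON body or missing field, fall back
// to a generic "HTTP <status>" message.
function errorFromResponse(status: number, rawBody: string): string {
  let body: Record<string, string> = {}
  try {
    body = JSON.parse(rawBody)
  } catch {
    // Non-JSON body (e.g. an HTML error page): keep the empty object.
  }
  return body.error || `HTTP ${status}`
}
```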
@@ -144,6 +144,7 @@ import { logActivity } from "../lib/activityLogger"
 import { ActivityPanel } from "../components/ActivityPanel"
 import { WORKER_URL } from "../constants/workerUrl"
+import { TransactionBuilderShape } from "@/shapes/TransactionBuilderShapeUtil"

 const customShapeUtils = [
   ChatBoxShape,
@@ -170,6 +171,7 @@ const customShapeUtils = [
   PrivateWorkspaceShape, // Private zone for Google Export data sovereignty
   GoogleItemShape, // Individual items from Google Export with privacy badges
   MapShape, // Open Mapping - OSM map shape
+  TransactionBuilderShape, // Safe multisig transaction builder
   // Conditionally included based on feature flags:
   ...(ENABLE_WORKFLOW ? [WorkflowBlockShape] : []), // Workflow Builder - dev only
   ...(ENABLE_CALENDAR ? [CalendarShape, CalendarEventShape] : []), // Calendar - dev only
@@ -25,7 +25,7 @@ export function Presentations() {
       <div style={{position: "relative", paddingTop: "max(60%, 324px)", width: "100%", height: 0}}>
         <iframe
           style={{position: "absolute", border: "none", width: "100%", height: "100%", left: 0, top: 0}}
-          src="https://online.fliphtml5.com/phqos/xfym/"
+          src="https://slides.jeffemmett.com/osmotic-governance"
           seamless={true}
           scrolling="no"
           frameBorder="0"
@@ -50,7 +50,7 @@ export function Presentations() {
       <div style={{position: "relative", paddingTop: "max(60%, 324px)", width: "100%", height: 0}}>
         <iframe
           style={{position: "absolute", border: "none", width: "100%", height: "100%", left: 0, top: 0}}
-          src="https://online.fliphtml5.com/phqos/bqra/"
+          src="https://slides.jeffemmett.com/exploring-mycofi"
           seamless={true}
           scrolling="no"
           frameBorder="0"
@@ -75,7 +75,7 @@ export function Presentations() {
       <div style={{position: "relative", paddingTop: "max(60%, 324px)", width: "100%", height: 0}}>
         <iframe
           style={{position: "absolute", border: "none", width: "100%", height: "100%", left: 0, top: 0}}
-          src="https://online.fliphtml5.com/phqos/vwmt/"
+          src="https://slides.jeffemmett.com/mycofi-cofi-gathering"
           seamless={true}
           scrolling="no"
           frameBorder="0"
@@ -102,7 +102,7 @@ export function Presentations() {
       <div style={{position: "relative", paddingTop: "max(60%, 324px)", width: "100%", height: 0}}>
         <iframe
           style={{position: "absolute", border: "none", width: "100%", height: "100%", left: 0, top: 0}}
-          src="https://online.fliphtml5.com/phqos/caal/"
+          src="https://slides.jeffemmett.com/myco-mutualism"
           seamless={true}
           scrolling="no"
           frameBorder="0"
@@ -125,7 +125,7 @@ export function Presentations() {
       <div style={{position: "relative", paddingTop: "max(60%, 324px)", width: "100%", height: 0}}>
         <iframe
           style={{position: "absolute", border: "none", width: "100%", height: "100%", left: 0, top: 0}}
-          src="https://online.fliphtml5.com/phqos/pnlz/"
+          src="https://slides.jeffemmett.com/psilocybernetics"
           seamless={true}
           scrolling="no"
           frameBorder="0"
@@ -148,7 +148,7 @@ export function Presentations() {
       <div style={{position: "relative", paddingTop: "max(60%, 324px)", width: "100%", height: 0}}>
         <iframe
           style={{position: "absolute", border: "none", width: "100%", height: "100%", left: 0, top: 0}}
-          src="https://online.fliphtml5.com/phqos/bnnf/"
+          src="https://slides.jeffemmett.com/move-slow-fix-things"
           seamless={true}
           scrolling="no"
           frameBorder="0"
@@ -173,7 +173,7 @@ export function Presentations() {
       <div style={{position: "relative", paddingTop: "max(60%, 324px)", width: "100%", height: 0}}>
         <iframe
           style={{position: "absolute", border: "none", width: "100%", height: "100%", left: 0, top: 0}}
-          src="https://online.fliphtml5.com/phqos/hxac/"
+          src="https://slides.jeffemmett.com/commons-stack-launch"
           seamless={true}
           scrolling="no"
           frameBorder="0"
@@ -198,7 +198,7 @@ export function Presentations() {
       <div style={{position: "relative", paddingTop: "max(60%, 324px)", width: "100%", height: 0}}>
         <iframe
           style={{position: "absolute", border: "none", width: "100%", height: "100%", left: 0, top: 0}}
-          src="https://online.fliphtml5.com/phqos/fhos/"
+          src="https://slides.jeffemmett.com/conviction-voting"
           seamless={true}
           scrolling="no"
           frameBorder="0"
@@ -221,7 +221,7 @@ export function Presentations() {
       <div style={{position: "relative", paddingTop: "max(60%, 324px)", width: "100%", height: 0}}>
         <iframe
           style={{position: "absolute", border: "none", width: "100%", height: "100%", left: 0, top: 0}}
-          src="https://online.fliphtml5.com/phqos/zzoy/"
+          src="https://slides.jeffemmett.com/polycentric-governance"
           seamless={true}
           scrolling="no"
           frameBorder="0"
@@ -246,7 +246,7 @@ export function Presentations() {
       <div style={{position: "relative", paddingTop: "max(60%, 324px)", width: "100%", height: 0}}>
         <iframe
           style={{position: "absolute", border: "none", width: "100%", height: "100%", left: 0, top: 0}}
-          src="https://online.fliphtml5.com/phqos/xoea/"
+          src="https://slides.jeffemmett.com/mycofi-myco-munnities"
           seamless={true}
           scrolling="no"
           frameBorder="0"
@@ -269,7 +269,7 @@ export function Presentations() {
       <div style={{position: "relative", paddingTop: "max(60%, 324px)", width: "100%", height: 0}}>
         <iframe
           style={{position: "absolute", border: "none", width: "100%", height: "100%", left: 0, top: 0}}
-          src="https://online.fliphtml5.com/phqos/afbp/"
+          src="https://slides.jeffemmett.com/community-resilience"
           seamless={true}
           scrolling="no"
           frameBorder="0"
@@ -23,7 +23,7 @@ export function Resilience() {
       <div style={{position: "relative", paddingTop: "max(60%, 324px)", width: "100%", height: 0}}>
         <iframe
           style={{position: "absolute", border: "none", width: "100%", height: "100%", left: 0, top: 0}}
-          src="https://online.fliphtml5.com/phqos/afbp/"
+          src="https://slides.jeffemmett.com/community-resilience"
           seamless={true}
           scrolling="no"
           frameBorder="0"
@@ -0,0 +1,157 @@
+import { useState } from "react"
+import { BaseBoxShapeUtil, TLBaseShape, HTMLContainer } from "tldraw"
+import { StandardizedToolWrapper } from "../components/StandardizedToolWrapper"
+import { usePinnedToView } from "../hooks/usePinnedToView"
+import { useMaximize } from "../hooks/useMaximize"
+import { SafeHeader } from "../components/safe/SafeHeader"
+import { TransactionComposer } from "../components/safe/TransactionComposer"
+import { PendingTransactions } from "../components/safe/PendingTransactions"
+import { TransactionHistory } from "../components/safe/TransactionHistory"
+
+export type ITransactionBuilderShape = TLBaseShape<
+  "TransactionBuilder",
+  {
+    w: number
+    h: number
+    mode: "compose" | "pending" | "history"
+    pinnedToView: boolean
+    tags: string[]
+  }
+>
+
+export class TransactionBuilderShape extends BaseBoxShapeUtil<ITransactionBuilderShape> {
+  static override type = "TransactionBuilder"
+
+  getDefaultProps(): ITransactionBuilderShape["props"] {
+    return {
+      w: 480,
+      h: 620,
+      mode: "pending",
+      pinnedToView: false,
+      tags: ["safe", "treasury"],
+    }
+  }
+
+  // Safe green theme
+  static readonly PRIMARY_COLOR = "#12ff80"
+
+  indicator(shape: ITransactionBuilderShape) {
+    return <rect x={0} y={0} width={shape.props.w} height={shape.props.h} />
+  }
+
+  component(shape: ITransactionBuilderShape) {
+    const [isMinimized, setIsMinimized] = useState(false)
+    const [activeTab, setActiveTab] = useState<"compose" | "pending" | "history">(shape.props.mode)
+    const isSelected = this.editor.getSelectedShapeIds().includes(shape.id)
+
+    usePinnedToView(this.editor, shape.id, shape.props.pinnedToView)
+
+    const { isMaximized, toggleMaximize } = useMaximize({
+      editor: this.editor,
+      shapeId: shape.id,
+      currentW: shape.props.w,
+      currentH: shape.props.h,
+      shapeType: "TransactionBuilder",
+    })
+
+    const handleClose = () => {
+      this.editor.deleteShape(shape.id)
+    }
+
+    const handleMinimize = () => {
+      setIsMinimized(!isMinimized)
+    }
+
+    const handlePinToggle = () => {
+      this.editor.updateShape<ITransactionBuilderShape>({
+        id: shape.id,
+        type: shape.type,
+        props: {
+          ...shape.props,
+          pinnedToView: !shape.props.pinnedToView,
+        },
+      })
+    }
+
+    const tabs = [
+      { key: "compose" as const, label: "Compose" },
+      { key: "pending" as const, label: "Pending" },
+      { key: "history" as const, label: "History" },
+    ]
+
+    return (
+      <HTMLContainer style={{ width: shape.props.w, height: shape.props.h }}>
+        <StandardizedToolWrapper
+          title="Safe Treasury"
+          primaryColor={TransactionBuilderShape.PRIMARY_COLOR}
+          isSelected={isSelected}
+          width={shape.props.w}
+          height={shape.props.h}
+          onClose={handleClose}
+          onMinimize={handleMinimize}
+          isMinimized={isMinimized}
+          onMaximize={toggleMaximize}
+          isMaximized={isMaximized}
+          isPinnedToView={shape.props.pinnedToView}
+          onPinToggle={handlePinToggle}
+          editor={this.editor}
+          shapeId={shape.id}
+        >
+          <div
+            style={{
+              display: "flex",
+              flexDirection: "column",
+              height: "100%",
+              fontFamily: "Inter, system-ui, -apple-system, sans-serif",
+              fontSize: 13,
+              color: "#e2e8f0",
+              background: "#0f172a",
+            }}
+            onPointerDown={(e) => e.stopPropagation()}
+          >
+            {/* Safe Info Header */}
+            <SafeHeader />
+
+            {/* Tab Navigation */}
+            <div
+              style={{
+                display: "flex",
+                borderBottom: "1px solid #1e293b",
+                padding: "0 8px",
+                gap: 2,
+              }}
+            >
+              {tabs.map((tab) => (
+                <button
+                  key={tab.key}
+                  onClick={() => setActiveTab(tab.key)}
+                  style={{
+                    padding: "8px 16px",
+                    border: "none",
+                    background: activeTab === tab.key ? "#1e293b" : "transparent",
+                    color: activeTab === tab.key ? "#12ff80" : "#94a3b8",
+                    borderBottom: activeTab === tab.key ? "2px solid #12ff80" : "2px solid transparent",
+                    cursor: "pointer",
+                    fontSize: 12,
+                    fontWeight: 600,
+                    borderRadius: "4px 4px 0 0",
+                    transition: "all 0.15s",
+                  }}
+                >
+                  {tab.label}
+                </button>
+              ))}
+            </div>
+
+            {/* Tab Content */}
+            <div style={{ flex: 1, overflow: "auto", padding: 12 }}>
+              {activeTab === "compose" && <TransactionComposer />}
+              {activeTab === "pending" && <PendingTransactions />}
+              {activeTab === "history" && <TransactionHistory />}
+            </div>
+          </div>
+        </StandardizedToolWrapper>
+      </HTMLContainer>
+    )
+  }
+}
@@ -75,7 +75,8 @@ function createMockEnv(overrides: Partial<Environment> = {}): Environment {
     DAILY_API_KEY: 'mock-daily-key',
     DAILY_DOMAIN: 'mock.daily.co',
     CRYPTID_DB: createMockD1() as unknown as D1Database,
-    RESEND_API_KEY: 'mock-resend-key',
+    EMAIL_RELAY_URL: 'https://email-relay.jeffemmett.com',
+    EMAIL_RELAY_API_KEY: 'mock-relay-key',
    APP_URL: 'https://test.example.com',
    ...overrides,
  }
 }

@@ -58,7 +58,8 @@ function createMockEnv(overrides: Partial<Environment> = {}): Environment {
     DAILY_API_KEY: 'mock-daily-key',
     DAILY_DOMAIN: 'mock.daily.co',
     CRYPTID_DB: createMockD1() as unknown as D1Database,
-    RESEND_API_KEY: 'mock-resend-key',
+    EMAIL_RELAY_URL: 'https://email-relay.jeffemmett.com',
+    EMAIL_RELAY_API_KEY: 'mock-relay-key',
    APP_URL: 'https://test.example.com',
    ...overrides,
  }
 }
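The mock environments above supply both relay settings because the new code path sends email only when the pair is present. A hypothetical standalone helper (not in the diff) capturing that guard:

```typescript
// Shape of the relevant Environment fields, per the diff.
interface EmailEnv {
  EMAIL_RELAY_URL?: string
  EMAIL_RELAY_API_KEY?: string
}

// True only when BOTH the relay URL and its API key are configured —
// the same precondition the worker checks before attempting a send.
function isEmailRelayConfigured(env: EmailEnv): boolean {
  return Boolean(env.EMAIL_RELAY_URL && env.EMAIL_RELAY_API_KEY)
}
```

Tests can then exercise both the configured and unconfigured branches without touching the network.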
@@ -1099,13 +1099,13 @@ export async function handleRequestAdminAccess(
   const body = await request.json().catch(() => ({})) as { reason?: string };

   // Send email to global admin (jeffemmett@gmail.com)
-  if (env.RESEND_API_KEY) {
+  if (env.EMAIL_RELAY_URL && env.EMAIL_RELAY_API_KEY) {
     const emailFrom = env.CRYPTID_EMAIL_FROM || 'Canvas <noreply@jeffemmett.com>';

-    const emailResponse = await fetch('https://api.resend.com/emails', {
+    const emailResponse = await fetch(`${env.EMAIL_RELAY_URL}/send`, {
       method: 'POST',
       headers: {
-        'Authorization': `Bearer ${env.RESEND_API_KEY}`,
+        'Authorization': `Bearer ${env.EMAIL_RELAY_API_KEY}`,
         'Content-Type': 'application/json',
       },
       body: JSON.stringify({
@ -12,7 +12,7 @@ function generateUUID(): string {
|
||||||
return crypto.randomUUID();
|
return crypto.randomUUID();
|
||||||
}
|
}
|
||||||
|
|
||||||
// Send email via Resend
|
// Send email via self-hosted email relay (Mailcow SMTP)
|
||||||
async function sendEmail(
|
async function sendEmail(
|
||||||
env: Environment,
|
env: Environment,
|
||||||
to: string,
|
to: string,
|
||||||
|
|
@ -20,15 +20,17 @@ async function sendEmail(
|
||||||
htmlContent: string
|
htmlContent: string
|
||||||
): Promise<boolean> {
|
): Promise<boolean> {
|
||||||
try {
|
try {
|
||||||
if (!env.RESEND_API_KEY) {
|
const relayUrl = env.EMAIL_RELAY_URL;
|
||||||
console.error('RESEND_API_KEY not configured');
|
const relayKey = env.EMAIL_RELAY_API_KEY;
|
||||||
|
if (!relayUrl || !relayKey) {
|
||||||
|
console.error('EMAIL_RELAY_URL or EMAIL_RELAY_API_KEY not configured');
|
||||||
return false;
|
return false;
|
||||||
}
|
}
|
||||||
|
|
||||||
const response = await fetch('https://api.resend.com/emails', {
|
const response = await fetch(`${relayUrl}/send`, {
|
||||||
method: 'POST',
|
method: 'POST',
|
||||||
headers: {
|
headers: {
|
||||||
'Authorization': `Bearer ${env.RESEND_API_KEY}`,
|
'Authorization': `Bearer ${relayKey}`,
|
||||||
'Content-Type': 'application/json',
|
'Content-Type': 'application/json',
|
||||||
},
|
},
|
||||||
body: JSON.stringify({
|
body: JSON.stringify({
|
||||||
|
|
@ -41,7 +43,7 @@ async function sendEmail(
|
||||||
|
|
||||||
if (!response.ok) {
|
if (!response.ok) {
|
||||||
const errorText = await response.text();
|
const errorText = await response.text();
|
||||||
console.error('Resend error:', errorText);
|
console.error('Email relay error:', errorText);
|
||||||
return false;
|
return false;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
|
|
||||||
|
|
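The relay request the diff builds can be sketched as a pure helper that assembles the `fetch` arguments; `buildRelayRequest` is a hypothetical name for illustration, but the URL shape (`${relayUrl}/send`), bearer-token header, and JSON body follow the diff:

```typescript
// Assemble the fetch arguments for the self-hosted email relay.
// Keeping this pure makes the request shape easy to unit-test.
function buildRelayRequest(relayUrl: string, relayKey: string, payload: object) {
  return {
    url: `${relayUrl}/send`,
    init: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${relayKey}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify(payload),
    },
  }
}
```

A caller would then do `fetch(req.url, req.init)` and check `response.ok`, as the worker does.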
@@ -10,7 +10,8 @@ export interface Environment {
   DAILY_DOMAIN: string;
   // CryptID auth bindings
   CRYPTID_DB?: D1Database;
-  RESEND_API_KEY?: string;
+  EMAIL_RELAY_URL?: string;
+  EMAIL_RELAY_API_KEY?: string;
   CRYPTID_EMAIL_FROM?: string;
   APP_URL?: string;
   // Admin secret for protected endpoints
@ -108,7 +108,7 @@ crons = ["0 0 * * *"] # Run at midnight UTC every day
|
||||||
# - CLOUDFLARE_API_TOKEN
|
# - CLOUDFLARE_API_TOKEN
|
||||||
# - FAL_API_KEY # For fal.ai image/video generation proxy
|
# - FAL_API_KEY # For fal.ai image/video generation proxy
|
||||||
# - RUNPOD_API_KEY # For RunPod AI endpoints proxy
|
# - RUNPOD_API_KEY # For RunPod AI endpoints proxy
|
||||||
# - RESEND_API_KEY # For email sending
|
# - EMAIL_RELAY_API_KEY # For email sending via self-hosted relay
|
||||||
# - ADMIN_SECRET # For admin-only endpoints
|
# - ADMIN_SECRET # For admin-only endpoints
|
||||||
#
|
#
|
||||||
# To set secrets:
|
# To set secrets:
|
||||||
|
|
|
||||||