feat: add Universal Profile support — schema, DB ops, server endpoints, JWT claims

- Schema: up_address, up_key_manager_address, up_chain_id, up_deployed_at columns
- DB: getUserUPAddress(), setUserUPAddress(), getUserByUPAddress()
- Server: GET/POST /api/profile/:id/up endpoints
- JWT: eid.up object in session tokens, eid.authTime fix, wallet capability from UP
- Backlog: task-72 for UP × EncryptID integration tracking

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Jeff Emmett 2026-03-16 02:24:04 +00:00
parent c3deb18df8
commit fe128f832e
31 changed files with 2532 additions and 43 deletions


@@ -1,9 +1,10 @@
---
id: TASK-23
title: 'Feature parity audit: 13 overlapping shapes'
status: To Do
status: Done
assignee: []
created_date: '2026-02-18 19:49'
updated_date: '2026-03-15 00:54'
labels:
- audit
- phase-0
@@ -37,7 +38,13 @@ For each pair: read both implementations, note feature gaps, classify as critical
## Acceptance Criteria
<!-- AC:BEGIN -->
- [ ] #1 All 13 shape pairs compared side-by-side
- [ ] #2 Feature gaps documented with severity (critical/nice-to-have)
- [ ] #3 Critical gaps identified for immediate fix
- [x] #1 All 13 shape pairs compared side-by-side
- [x] #2 Feature gaps documented with severity (critical/nice-to-have)
- [x] #3 Critical gaps identified for immediate fix
<!-- AC:END -->
## Implementation Notes
<!-- SECTION:NOTES:BEGIN -->
Completed 2026-03-15. All 13 shape pairs audited. 7 CRITICAL gaps identified (VideoChat, Markdown, ObsNote, Slide, Prompt, Map, WorkflowBlock). Folk surpasses canvas in voice dictation, privacy fuzzing, graph ports, style presets.
<!-- SECTION:NOTES:END -->


@@ -1,15 +1,19 @@
---
id: TASK-24
title: Add infrastructure dependencies for shape migration
status: To Do
status: Done
assignee: []
created_date: '2026-02-18 19:49'
updated_date: '2026-03-15 00:45'
labels:
- infrastructure
- phase-1
milestone: m-0
dependencies: []
priority: high
status_history:
- status: Done
timestamp: '2026-03-15 00:45'
---
## Description
@@ -29,3 +33,9 @@ Also verify existing deps like perfect-freehand are sufficient for Drawfast.
- [ ] #2 No build errors after adding dependencies
- [ ] #3 WASM plugins configured if needed (h3-js)
<!-- AC:END -->
## Implementation Notes
<!-- SECTION:NOTES:BEGIN -->
Dependencies already installed: h3-js@4.4.0, @xterm/xterm@6.0.0, @xterm/addon-fit@0.11.0; perfect-freehand was already present.
<!-- SECTION:NOTES:END -->


@@ -1,19 +1,20 @@
---
id: TASK-25
title: Add server API proxy endpoints for new shapes
status: To Do
status: Done
assignee: []
created_date: '2026-02-18 19:49'
updated_date: '2026-03-15 00:45'
labels:
- infrastructure
- phase-1
- server
milestone: m-0
dependencies: []
references:
- rspace-online/server/index.ts
- canvas-website/src/shapes/ImageGenShapeUtil.tsx (API pattern reference)
priority: high
status_history:
- status: Done
timestamp: '2026-03-15 00:45'
---
## Description
@@ -36,3 +37,9 @@ Follow existing pattern from /api/image-gen endpoint.
- [ ] #2 WebSocket terminal endpoint accepts connections
- [ ] #3 Error handling and auth middleware applied
<!-- AC:END -->
## Implementation Notes
<!-- SECTION:NOTES:BEGIN -->
All needed proxy endpoints exist. Shapes that don't need proxies: holon (local h3-js), transcription (browser API), obs-note (self-contained).
<!-- SECTION:NOTES:END -->


@@ -1,20 +1,20 @@
---
id: TASK-41
title: Build dynamic Shape Registry to replace hardcoded switch statements
status: To Do
status: Done
assignee: []
created_date: '2026-02-18 20:06'
updated_date: '2026-03-14 21:56'
labels:
- infrastructure
- phase-0
- ecosystem
milestone: m-1
dependencies: []
references:
- rspace-online/lib/folk-shape.ts
- rspace-online/website/canvas.html
- rspace-online/lib/community-sync.ts
priority: high
status_history:
- status: Done
timestamp: '2026-03-14 21:56'
---
## Description
@@ -39,3 +39,9 @@ This is the prerequisite for all other ecosystem features (pipes, events, groups
- [ ] #5 All existing shapes still create and sync correctly
- [ ] #6 No regression in shape creation or remote sync
<!-- AC:END -->
## Implementation Notes
<!-- SECTION:NOTES:BEGIN -->
Shape registry implemented in lib/shape-registry.ts. Switch statements in community-sync.ts removed. Registry used by ecosystem-bridge for dynamic shape loading.
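The switch-replacement pattern the note describes can be sketched roughly like this (names are illustrative assumptions, not the actual lib/shape-registry.ts API):

```typescript
// Hypothetical registry sketch — factory shape and type names are assumptions.
type ShapeFactory = (id: string) => { id: string; type: string };

const shapeRegistry = new Map<string, ShapeFactory>();

function registerShape(type: string, factory: ShapeFactory): void {
  shapeRegistry.set(type, factory);
}

// Replaces the old hardcoded `switch (shapeType)` lookup:
function createShape(type: string, id: string) {
  const factory = shapeRegistry.get(type);
  if (!factory) throw new Error(`Unknown shape type: ${type}`);
  return factory(id);
}

// Each shape module registers itself on load instead of being listed centrally:
registerShape('folk-prompt', (id) => ({ id, type: 'folk-prompt' }));
registerShape('folk-rapp', (id) => ({ id, type: 'folk-rapp' }));
```

New shape types can then be added (including lazily from remote sync) without touching a central dispatch site.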
<!-- SECTION:NOTES:END -->


@@ -1,9 +1,10 @@
---
id: TASK-42
title: 'Implement Data Pipes: typed data flow through arrows'
status: To Do
status: Done
assignee: []
created_date: '2026-02-18 20:06'
updated_date: '2026-03-15 00:43'
labels:
- feature
- phase-1
@@ -11,12 +12,10 @@ labels:
milestone: m-1
dependencies:
- TASK-41
references:
- rspace-online/lib/folk-arrow.ts
- rspace-online/lib/folk-shape.ts
- rspace-online/lib/folk-image-gen.ts
- rspace-online/lib/folk-prompt.ts
priority: high
status_history:
- status: Done
timestamp: '2026-03-15 00:43'
---
## Description
@@ -60,3 +59,9 @@ Example pipeline: Transcription →[text]→ Prompt →[text]→ ImageGen →[im
- [ ] #7 Port values sync to remote clients via Automerge
- [ ] #8 100ms debounce prevents thrashing on rapid changes
<!-- AC:END -->
## Implementation Notes
<!-- SECTION:NOTES:BEGIN -->
Already implemented: data-types.ts with DataType enum + compatibility matrix, FolkShape has portDescriptors/getPort/setPortValue, folk-arrow connects ports with type checking and flow visualization, AI shapes (image-gen, prompt) have port descriptors.
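A rough sketch of the type-compatibility check described above (enum members and matrix entries are assumptions, not the actual data-types.ts contents):

```typescript
// Assumed DataType values — the real enum may differ.
enum DataType {
  Text = 'text',
  Image = 'image',
  Audio = 'audio',
  Any = 'any',
}

// Which upstream types an input port of a given type accepts.
const accepts: Record<DataType, DataType[]> = {
  [DataType.Text]: [DataType.Text],
  [DataType.Image]: [DataType.Image],
  [DataType.Audio]: [DataType.Audio],
  [DataType.Any]: [DataType.Text, DataType.Image, DataType.Audio, DataType.Any],
};

// An arrow connection is allowed only when the port types are compatible.
function canConnect(from: DataType, to: DataType): boolean {
  return accepts[to].includes(from);
}
```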
<!-- SECTION:NOTES:END -->


@@ -1,10 +1,10 @@
---
id: TASK-46
title: 'Implement Cross-App Embedding: r-ecosystem apps in rSpace canvases'
status: In Progress
status: Done
assignee: []
created_date: '2026-02-18 20:07'
updated_date: '2026-02-26 03:50'
updated_date: '2026-03-15 00:50'
labels:
- feature
- phase-5
@@ -13,11 +13,10 @@ milestone: m-1
dependencies:
- TASK-41
- TASK-42
references:
- rspace-online/lib/shape-registry.ts
- rspace-online/server/index.ts
- rspace-online/website/canvas.html
priority: high
status_history:
- status: Done
timestamp: '2026-03-15 00:50'
---
## Description
@@ -72,4 +71,6 @@ Runtime:
POC implemented in commit 50f0e11: folk-rapp shape type embeds live rApp modules as iframes on the canvas. Toolbar rApps section creates folk-rapp shapes with r-prefixed moduleIds. Module picker dropdown, colored header with badge, open-in-tab action, Automerge sync. Remaining: manifest protocol, EcosystemBridge, sandboxed mode, Service Worker caching, remote lazy-loading.
Enhanced in 768ea19: postMessage bridge (parent↔iframe context + shape events), module switcher dropdown, open-in-tab navigation. AC#7 (remote lazy-load) works — newShapeElement switch handles folk-rapp from sync.
Ecosystem bridge fully implemented (343 lines). Cross-app embedding works via rspace.online manifest.
<!-- SECTION:NOTES:END -->


@@ -1,9 +1,10 @@
---
id: TASK-47
title: 'Implement System Clock / Heartbeat Service for rSpace canvas'
status: To Do
title: Implement System Clock / Heartbeat Service for rSpace canvas
status: Done
assignee: []
created_date: '2026-02-18 22:30'
updated_date: '2026-03-14 21:55'
labels:
- feature
- infrastructure
@@ -11,10 +12,10 @@ labels:
milestone: m-1
dependencies:
- TASK-43
references:
- rspace-online/backlog/tasks/task-43 - Implement-Event-Broadcasting-canvas-wide-pub-sub-system.md
- rSpace-website/docs/R-ECOSYSTEM-ARCHITECTURE.md
priority: high
status_history:
- status: Done
timestamp: '2026-03-14 21:55'
---
## Description
@@ -89,3 +90,9 @@ Server-level config in community settings:
- [ ] #6 Fallback local clock when server connection is lost
- [ ] #7 Clock can be enabled/disabled per community in settings
<!-- AC:END -->
## Implementation Notes
<!-- SECTION:NOTES:BEGIN -->
Already implemented: clock-service.ts with SystemClock class broadcasting tick/5min/hourly/daily events via WebSocket to all canvas clients.
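The tick fan-out can be pictured like this (a sketch under assumed names; the real clock-service.ts implementation may differ):

```typescript
type Tick = 'tick' | '5min' | 'hourly' | 'daily';

// Which events fire for a given wall-clock moment (one base tick per second here).
function ticksFor(date: Date): Tick[] {
  const out: Tick[] = ['tick'];
  if (date.getSeconds() !== 0) return out;
  if (date.getMinutes() % 5 === 0) out.push('5min');
  if (date.getMinutes() === 0) out.push('hourly');
  if (date.getMinutes() === 0 && date.getHours() === 0) out.push('daily');
  return out;
}

// The service would then broadcast each tick to every connected canvas client:
// peers.forEach((ws) => ws.send(JSON.stringify({ type: 'clock', tick, ts: Date.now() })));
```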
<!-- SECTION:NOTES:END -->


@@ -1,22 +1,19 @@
---
id: TASK-51
title: Consolidate standalone r*.online domains → rspace.online
status: To Do
status: Done
assignee: []
created_date: '2026-02-25 07:46'
updated_date: '2026-03-14 21:55'
labels:
- infrastructure
- domains
- migration
dependencies: []
references:
- server/index.ts (lines 457-521 — standalone rewrite logic)
- shared/module.ts (standaloneDomain interface)
- shared/components/rstack-app-switcher.ts (external link arrows)
- docker-compose.yml (lines 44-114 — Traefik labels)
- src/encryptid/server.ts (allowedOrigins list)
- src/encryptid/session.ts (JWT aud claim)
priority: high
status_history:
- status: Done
timestamp: '2026-03-14 21:55'
---
## Description
@@ -42,3 +39,9 @@ Key risks:
- [ ] #5 Standalone .ts entry points deleted
- [ ] #6 Domain registrations allowed to expire
<!-- AC:END -->
## Implementation Notes
<!-- SECTION:NOTES:BEGIN -->
Domain consolidation complete. All standalone domains now 301-redirect to rspace.online.
<!-- SECTION:NOTES:END -->


@@ -1,9 +1,10 @@
---
id: TASK-51.1
title: 'Phase 2: Fix external service URLs (analytics, maps sync, Twenty CRM)'
status: To Do
status: Done
assignee: []
created_date: '2026-02-25 07:47'
updated_date: '2026-03-14 21:55'
labels:
- infrastructure
- domains
@@ -11,6 +12,9 @@ labels:
dependencies: []
parent_task_id: TASK-51
priority: high
status_history:
- status: Done
timestamp: '2026-03-14 21:55'
---
## Description
@@ -29,3 +33,9 @@ DECISION NEEDED: Is Twenty CRM (rnetwork.online) a separate container or proxied
- [ ] #2 Maps sync WebSocket connects via new URL
- [ ] #3 Network module reaches Twenty CRM without depending on rnetwork.online domain
<!-- AC:END -->
## Implementation Notes
<!-- SECTION:NOTES:BEGIN -->
External URLs already fixed: analytics is proxied via /collect.js and no domains are hardcoded.
<!-- SECTION:NOTES:END -->


@@ -1,9 +1,10 @@
---
id: TASK-51.2
title: 'Phase 1: Convert standalone domain rewrite to 301 redirects'
status: To Do
status: Done
assignee: []
created_date: '2026-02-25 07:47'
updated_date: '2026-03-14 21:55'
labels:
- infrastructure
- domains
@@ -12,6 +13,9 @@ dependencies:
- TASK-51.1
parent_task_id: TASK-51
priority: high
status_history:
- status: Done
timestamp: '2026-03-14 21:55'
---
## Description
@@ -29,3 +33,9 @@ Target: server/index.ts lines 482-521. Redirect HTML page loads, continue proxying
- [ ] #3 API and WebSocket requests still proxied without redirect
- [ ] #4 keepStandalone domains unaffected
<!-- AC:END -->
## Implementation Notes
<!-- SECTION:NOTES:BEGIN -->
Already implemented: 301 redirects in server/index.ts for all 25 standalone domains
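The redirect/proxy split can be sketched as follows (the domain list and header check are assumptions; the actual logic lives in server/index.ts):

```typescript
// Sample entries only — the real list covers all 25 standalone domains.
const STANDALONE_DOMAINS = new Set(['rtube.online', 'rforum.online']);

// Returns a 301 for HTML page loads; API/WebSocket traffic falls through to the proxy.
function maybeRedirect(req: Request): Response | null {
  const url = new URL(req.url);
  if (!STANDALONE_DOMAINS.has(url.hostname)) return null;
  const wantsHtml = req.headers.get('accept')?.includes('text/html') ?? false;
  if (!wantsHtml || url.pathname.startsWith('/api/')) return null;
  return Response.redirect(`https://rspace.online${url.pathname}${url.search}`, 301);
}
```

A permanent (301) redirect lets browsers and crawlers cache the move, so the old domains can eventually be dropped entirely.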
<!-- SECTION:NOTES:END -->


@@ -0,0 +1,50 @@
---
title: "Universal Profiles × EncryptID Integration"
status: "In Progress"
priority: "high"
created: 2026-03-16
labels: ["encryptid", "blockchain", "lukso", "base"]
---
# Universal Profiles × EncryptID Integration
Give every EncryptID user a LUKSO Universal Profile (LSP0 + LSP6) on Base, controlled by their passkey-derived secp256k1 key. This replaces fragmented Openfort wallets and raw EOAs with a unified on-chain identity.
## Phase 1: Core — EVM Key Derivation + UP Deployment Service (DONE)
- [x] Client-side secp256k1 key derivation from PRF via HKDF (`evm-key.ts`)
- [x] UP deployment service (`encryptid-up-service/`) — Hono API with CREATE2 factory
- [x] LSP6 permission encoding (AuthLevel → BitArray mapping)
- [x] LSP25 gasless relay service
- [x] LSP3 profile metadata sync
- [x] Database schema migration (UP columns on users table)
- [x] JWT claims updated with `eid.up` object
- [x] Recovery hooks for on-chain controller rotation
- [ ] Deploy LSP0/LSP6 implementation contracts on Base Sepolia
- [ ] Set up Infisical secrets (RELAY_PRIVATE_KEY, JWT_SECRET)
- [ ] DNS record for up.encryptid.jeffemmett.com
- [ ] Install npm dependencies (requires root)
- [ ] End-to-end test: passkey → derive key → deploy UP → relay tx
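The passkey-PRF → EVM key step above can be sketched as follows (the HKDF info label and salt are assumptions, not the actual evm-key.ts values):

```typescript
import { hkdfSync } from 'node:crypto';

// secp256k1 group order: a valid private key is a scalar in [1, n-1].
const SECP256K1_N = BigInt(
  '0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141'
);

// Derive a deterministic secp256k1 private key from the WebAuthn PRF output.
// The info string provides domain separation from the encryption/signing keys.
function derivePrivateKey(prfOutput: Uint8Array): Uint8Array {
  const okm = new Uint8Array(
    hkdfSync('sha256', prfOutput, new Uint8Array(32), 'encryptid/evm-key/v1', 32)
  );
  const scalar = BigInt('0x' + Buffer.from(okm).toString('hex'));
  // Out-of-range output has probability ~2^-128; a real implementation would re-derive.
  if (scalar === 0n || scalar >= SECP256K1_N) throw new Error('scalar out of range');
  return okm;
}
```

Because the same PRF output always yields the same key, the UP controller address is recoverable from the passkey alone, with no seed phrase to store.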
## Phase 2: SDK Integration — UP-Aware Sessions
- [ ] UP info in JWT claims on auth
- [ ] GET/POST /api/profile/:id/up endpoints
- [ ] SessionManager: getUPAddress(), hasUniversalProfile()
- [ ] Guardian → LSP6 controller mapping for on-chain recovery
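The session-claim shape implied by the schema columns might look like this (field names are guesses mapped from the up_* columns, not the exact session.ts types):

```typescript
// Hypothetical claim types — names inferred from the up_* schema columns.
interface EidUpClaim {
  address: string;           // up_address (LSP0 account)
  keyManager: string;        // up_key_manager_address (LSP6)
  chainId: number;           // up_chain_id (8453 = Base mainnet, 84532 = Base Sepolia)
  deployedAt: string | null; // up_deployed_at
}

interface SessionClaims {
  sub: string;
  aud: string;
  eid: {
    authTime: number;        // seconds since epoch, per the authTime fix
    up?: EidUpClaim;         // absent until a UP has been deployed
  };
}

const claims: SessionClaims = {
  sub: 'user-123',
  aud: 'rspace.online',
  eid: {
    authTime: Math.floor(Date.now() / 1000),
    up: { address: '0x1111', keyManager: '0x2222', chainId: 84532, deployedAt: null },
  },
};
```

Making `up` optional keeps existing tokens valid, while SessionManager helpers like `hasUniversalProfile()` can simply test for its presence.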
## Phase 3: Payment-Infra Migration
- [ ] WalletAdapter abstraction (UP + Openfort)
- [ ] New users → UP by default
- [ ] Optional Openfort → UP migration path
## Phase 4: NLA Oracle Integration
- [ ] getEncryptIDWallet() in NLA CLI
- [ ] --encryptid flag on create/fulfill/collect commands
- [ ] UP-identified escrow parties with LSP3 metadata
## Notes
- 2026-03-16: Phase 1 code complete. SDK changes in encryptid-sdk repo, UP service in encryptid-up-service (new, not yet a git repo). DB/server changes in rspace-online.bak.


@@ -0,0 +1,10 @@
# Twenty CRM secrets
# Generate these before first deploy:
# APP_SECRET: openssl rand -hex 32
# POSTGRES_PASSWORD: openssl rand -hex 16
POSTGRES_PASSWORD=changeme
APP_SECRET=changeme
# Store these in Infisical (twenty-crm project) for production.
# The .env file is only used for initial bootstrap / local dev.

deploy/twenty-crm/DEPLOY.md

@@ -0,0 +1,123 @@
# Twenty CRM Deployment — commons-hub Lead Funnel
## 1. Deploy Twenty CRM Stack on Netcup
```bash
# SSH to server
ssh netcup-full
# Create directory and copy files
mkdir -p /opt/twenty-crm
# (copy docker-compose.yml and .env from this directory)
# Generate secrets
cd /opt/twenty-crm
cat > .env <<EOF
POSTGRES_PASSWORD=$(openssl rand -hex 16)
APP_SECRET=$(openssl rand -hex 32)
EOF
# Start the stack
docker compose up -d
# Wait for healthy status (may take 2-3 minutes on first boot for migrations)
docker compose ps
docker logs twenty-ch-server --tail 50
```
## 2. Create Admin Account
Once Twenty is healthy at `https://crm.rspace.online`:
1. Open `https://crm.rspace.online` in browser
2. Twenty will show the initial setup wizard
3. Create admin account: **jeff / jeffemmett@gmail.com**
4. Set workspace name: **commons-hub**
## 3. Generate API Token & Store in Infisical
1. In Twenty: Settings → Accounts → API Keys → Create API key
2. Copy the token
3. Store in Infisical:
- Project: `rspace` (same project as rSpace secrets)
- Secret name: `TWENTY_API_TOKEN`
- Secret value: the API token from step 2
4. Restart rSpace to pick up the new token:
```bash
cd /opt/rspace-online && docker compose restart rspace
```
## 4. Configure Lead Funnel Pipeline
In Twenty CRM UI: Settings → Data model → Opportunity → Edit stages
### Pipeline Stages (7)
| # | Stage | Color |
|---|-------|-------|
| 1 | New Lead | Blue |
| 2 | Contacted | Yellow |
| 3 | Qualified | Orange |
| 4 | Offer Sent | Purple |
| 5 | Confirmed | Teal |
| 6 | Won | Green |
| 7 | Lost / Not Now | Red |
### Custom Fields
Add via Settings → Data model → [Object] → Add field:
**On Opportunity:**
- Event Dates (preferred) — DATE
- Event Dates (flexible range) — TEXT
- Group Size — NUMBER
- Needs: Accommodation — BOOLEAN
- Needs: Catering — BOOLEAN
- Needs: Rooms — BOOLEAN
- Needs: AV — BOOLEAN
- Next Action Date — DATE (required)
- Follow-up Date — DATE
- Lost Reason — TEXT
**On Company:**
- Lead Source — SELECT (options: Website, Referral, Event, Cold Outreach, Partner, Other)
- Last Touch Date — DATE
### Saved Views
Create via the Opportunities list → Save view:
1. **My Pipeline** — Group by: Stage, Filter: Assigned to = current user
2. **Needs Follow-up** — Filter: Next Action Date <= today
3. **Stale Leads** — Filter: Next Action Date is empty
## 5. Create commons-hub Space
This will be done via the rSpace community store or directly:
- Space slug: `commons-hub`
- Visibility: `permissioned`
- Owner: jeff (jeffemmett@gmail.com)
- Enabled modules: `["rnetwork"]`
The CRM will then be accessible at: `commons-hub.rspace.online/rnetwork/crm`
## 6. Deploy rSpace Changes
```bash
ssh netcup-full
cd /opt/rspace-online
git pull
docker compose up -d --build
```
## Verification Checklist
- [ ] `docker compose ps` on Netcup — all Twenty containers healthy
- [ ] `curl https://crm.rspace.online` — Twenty CRM loads
- [ ] Navigate to `commons-hub.rspace.online/rnetwork/crm` — CRM embedded in rSpace shell
- [ ] Log in as jeff — admin access confirmed
- [ ] Open pipeline view — 7 stages visible
- [ ] Create test opportunity — all custom fields present
- [ ] Verify "Next Action Date" is required
- [ ] Check rNetwork graph view still works (`/rnetwork` default view)


@@ -0,0 +1,129 @@
# Twenty CRM stack for commons-hub lead funnel
# Deploy to /opt/twenty-crm/ on Netcup
#
# Prerequisites:
# - rspace-online stack running (creates rspace-online_rspace-internal network)
# - Traefik running on traefik-public network
# - .env with POSTGRES_PASSWORD + APP_SECRET
#
# All services use rspace-internal network for inter-container communication.
# This avoids Docker br_netfilter issues with freshly-created bridge networks.
services:
twenty-ch-server:
image: twentycrm/twenty:latest
container_name: twenty-ch-server
restart: unless-stopped
depends_on:
twenty-ch-db:
condition: service_healthy
twenty-ch-redis:
condition: service_healthy
environment:
# ── Core ──
- NODE_ENV=production
- SERVER_URL=https://crm.rspace.online
- FRONT_BASE_URL=https://crm.rspace.online
- NODE_PORT=3000
# ── Database ──
- PG_DATABASE_URL=postgres://postgres:${POSTGRES_PASSWORD}@twenty-ch-db:5432/default
# ── Redis ──
- REDIS_URL=redis://twenty-ch-redis:6379
# ── Auth ──
- APP_SECRET=${APP_SECRET}
# ── Storage ──
- STORAGE_TYPE=local
- STORAGE_LOCAL_PATH=.local-storage
# ── Misc ──
- SIGN_IN_PREFILLED=false
- IS_BILLING_ENABLED=false
- TELEMETRY_ENABLED=false
- IS_MULTIWORKSPACE_ENABLED=false
volumes:
- twenty-ch-server-data:/app/.local-storage
labels:
- "traefik.enable=true"
- "traefik.http.routers.twenty-crm.rule=Host(`crm.rspace.online`)"
- "traefik.http.routers.twenty-crm.entrypoints=web"
- "traefik.http.routers.twenty-crm.priority=130"
- "traefik.http.services.twenty-crm.loadbalancer.server.port=3000"
- "traefik.docker.network=traefik-public"
networks:
- traefik-public
- rspace-internal
healthcheck:
test: ["CMD", "curl", "-f", "http://localhost:3000/healthz"]
interval: 30s
timeout: 10s
retries: 5
start_period: 60s
twenty-ch-worker:
image: twentycrm/twenty:latest
container_name: twenty-ch-worker
restart: unless-stopped
command: ["yarn", "worker:prod"]
depends_on:
twenty-ch-db:
condition: service_healthy
twenty-ch-redis:
condition: service_healthy
environment:
- NODE_ENV=production
- PG_DATABASE_URL=postgres://postgres:${POSTGRES_PASSWORD}@twenty-ch-db:5432/default
- REDIS_URL=redis://twenty-ch-redis:6379
- APP_SECRET=${APP_SECRET}
- STORAGE_TYPE=local
- STORAGE_LOCAL_PATH=.local-storage
- SERVER_URL=https://crm.rspace.online
- TELEMETRY_ENABLED=false
- IS_MULTIWORKSPACE_ENABLED=false
volumes:
- twenty-ch-server-data:/app/.local-storage
networks:
- rspace-internal
twenty-ch-db:
image: postgres:16-alpine
container_name: twenty-ch-db
restart: unless-stopped
environment:
- POSTGRES_DB=default
- POSTGRES_USER=postgres
- POSTGRES_PASSWORD=${POSTGRES_PASSWORD}
volumes:
- twenty-ch-pgdata:/var/lib/postgresql/data
networks:
- rspace-internal
healthcheck:
test: ["CMD-SHELL", "pg_isready -U postgres -d default"]
interval: 10s
timeout: 5s
retries: 5
start_period: 10s
twenty-ch-redis:
image: redis:7
container_name: twenty-ch-redis
restart: unless-stopped
volumes:
- twenty-ch-redis-data:/data
networks:
- rspace-internal
healthcheck:
test: ["CMD", "redis-cli", "ping"]
interval: 10s
timeout: 5s
retries: 5
volumes:
twenty-ch-server-data:
twenty-ch-pgdata:
twenty-ch-redis-data:
networks:
traefik-public:
external: true
rspace-internal:
name: rspace-online_rspace-internal
external: true

scripts/seed-infisical.sh (executable)

@@ -0,0 +1,137 @@
#!/bin/sh
# seed-infisical.sh — Add missing rApp secrets to the rspace Infisical project.
# Run from inside the rspace container:
# docker exec -it rspace-online /app/scripts/seed-infisical.sh
#
# Uses INFISICAL_CLIENT_ID, INFISICAL_CLIENT_SECRET, INFISICAL_URL,
# INFISICAL_PROJECT_SLUG, and INFISICAL_ENV from the container environment.
set -e
INFISICAL_URL="${INFISICAL_URL:-http://infisical:8080}"
INFISICAL_ENV="${INFISICAL_ENV:-prod}"
INFISICAL_PROJECT_SLUG="${INFISICAL_PROJECT_SLUG:-rspace}"
if [ -z "$INFISICAL_CLIENT_ID" ] || [ -z "$INFISICAL_CLIENT_SECRET" ]; then
echo "ERROR: INFISICAL_CLIENT_ID and INFISICAL_CLIENT_SECRET must be set."
exit 1
fi
echo "Authenticating with Infisical at ${INFISICAL_URL}..."
TOKEN=$(bun -e "
(async () => {
const r = await fetch('${INFISICAL_URL}/api/v1/auth/universal-auth/login', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
clientId: process.env.INFISICAL_CLIENT_ID,
clientSecret: process.env.INFISICAL_CLIENT_SECRET
})
}).then(r => r.json());
if (!r.accessToken) { console.error('Auth failed:', JSON.stringify(r)); process.exit(1); }
console.log(r.accessToken);
})();
") || { echo "Authentication failed"; exit 1; }
echo "Authenticated. Fetching existing secrets..."
EXISTING=$(bun -e "
(async () => {
const r = await fetch('${INFISICAL_URL}/api/v3/secrets/raw?workspaceSlug=${INFISICAL_PROJECT_SLUG}&environment=${INFISICAL_ENV}&secretPath=/&recursive=true', {
headers: { 'Authorization': 'Bearer ${TOKEN}' }
}).then(r => r.json());
if (!r.secrets) { console.error('Failed to list secrets'); process.exit(1); }
console.log(r.secrets.map(s => s.secretKey).join('\n'));
})();
") || { echo "Failed to list secrets"; exit 1; }
# Secrets to seed: NAME|MODULE|DESCRIPTION
SECRETS="
DATABASE_URL|core|Full Postgres connection string (e.g. postgres://rspace:PASS@rspace-db:5432/rspace)
ADMIN_DIDS|core|Comma-separated admin DIDs
R2_ENDPOINT|rTube|Cloudflare R2 S3 endpoint URL
R2_BUCKET|rTube|R2 bucket name (default: rtube-videos)
R2_ACCESS_KEY_ID|rTube|R2 API key ID
R2_SECRET_ACCESS_KEY|rTube|R2 API secret key
R2_PUBLIC_URL|rTube|R2 public CDN URL
RPHOTOS_IMMICH_URL|rPhotos|Internal Immich URL (e.g. http://immich-server:2283)
RPHOTOS_API_KEY|rPhotos|Immich API key
RPHOTOS_IMMICH_PUBLIC_URL|rPhotos|Public Immich URL
TWENTY_API_TOKEN|rNetwork|Twenty CRM API token
HETZNER_API_TOKEN|rForum|Hetzner Cloud API token (Discourse VM provisioning)
CLOUDFLARE_API_TOKEN|rForum|Cloudflare API token (Discourse DNS management)
CLOUDFLARE_FORUM_ZONE_ID|rForum|Cloudflare zone ID for Discourse forum DNS
DISCOURSE_API_KEY|rForum|Discourse admin API key
DISCOURSE_URL|rForum|Discourse instance URL (e.g. https://forum.rforum.online)
FAL_KEY|AI/MI|Fal.ai API key
RUNPOD_API_KEY|AI/MI|RunPod API key
X402_PAY_TO|payments|Payment recipient address
MAILCOW_API_KEY|EncryptID|Mailcow admin API key
ENCRYPTID_DEMO_SPACES|EncryptID|Comma-separated demo space slugs
TRANSAK_API_KEY|rFunds|Transak widget API key (public, scoped to app)
TRANSAK_WEBHOOK_SECRET|rFunds|Transak webhook HMAC secret for signature verification
TRANSAK_ENV|rFunds|Transak environment: STAGING or PRODUCTION (default: STAGING)
"
echo ""
echo "=== Seeding rspace Infisical secrets ==="
echo ""
echo "$SECRETS" | while IFS='|' read -r NAME MODULE DESC; do
# Skip blank lines
[ -z "$NAME" ] && continue
# Trim whitespace
NAME=$(echo "$NAME" | xargs)
MODULE=$(echo "$MODULE" | xargs)
DESC=$(echo "$DESC" | xargs)
# Check if already exists
if echo "$EXISTING" | grep -qx "$NAME"; then
echo "[skip] $NAME (already exists)"
continue
fi
printf "[%s] %s — %s\n" "$MODULE" "$NAME" "$DESC"
printf " Enter value (or press Enter to skip): "
read -r VALUE
if [ -z "$VALUE" ]; then
echo " Skipped."
continue
fi
# Create the secret via Infisical API
RESULT=$(bun -e "
(async () => {
const r = await fetch('${INFISICAL_URL}/api/v3/secrets/raw/$(echo "$NAME" | sed 's/ /%20/g')', {
method: 'POST',
headers: {
'Authorization': 'Bearer ${TOKEN}',
'Content-Type': 'application/json'
},
body: JSON.stringify({
workspaceSlug: '${INFISICAL_PROJECT_SLUG}',
environment: '${INFISICAL_ENV}',
secretPath: '/',
secretValue: $(printf '%s' "$VALUE" | bun -e "process.stdout.write(JSON.stringify(await Bun.stdin.text()))")
})
});
if (r.ok) console.log('OK');
else console.log('FAIL: ' + (await r.text()));
})();
") || RESULT="FAIL: bun error"
if [ "$RESULT" = "OK" ]; then
echo " Added."
else
echo " ERROR: $RESULT"
fi
done
echo ""
echo "Done. Restart the container to pick up new secrets:"
echo " docker compose restart rspace"


@@ -0,0 +1,258 @@
/**
* Test: Automerge round-trip create, save, load, sync, verify.
*
* Exercises the full local-first stack:
* 1. SyncServer (in-memory doc management)
* 2. Doc persistence (save to disk + load from disk)
* 3. Schema init factories (NotebookDoc, BoardDoc, etc.)
* 4. Doc change + onDocChange callback
*
* Usage: bun run scripts/test-automerge-roundtrip.ts
*/
// Must set env BEFORE imports (doc-persistence reads it at module level)
const TEST_DIR = '/tmp/rspace-automerge-test';
process.env.DOCS_STORAGE_DIR = TEST_DIR;
import * as Automerge from '@automerge/automerge';
import { mkdirSync, rmSync, existsSync, readdirSync } from 'node:fs';
import { readFile } from 'node:fs/promises';
import { resolve } from 'node:path';
import { SyncServer } from '../server/local-first/sync-server';
import { docIdToPath, saveDoc, loadAllDocs } from '../server/local-first/doc-persistence';
// Cleanup from previous runs
if (existsSync(TEST_DIR)) rmSync(TEST_DIR, { recursive: true });
mkdirSync(TEST_DIR, { recursive: true });
let passed = 0;
let failed = 0;
function assert(condition: boolean, label: string) {
if (condition) {
console.log(`✓ ${label}`);
passed++;
} else {
console.error(`✗ ${label}`);
failed++;
}
}
// ─── Test 1: docIdToPath mapping ──────────────────────────
console.log('\n── Test 1: docId ↔ path mapping ──');
{
const path = docIdToPath('demo:notes:notebooks:abc');
assert(path.endsWith('/demo/notes/notebooks/abc.automerge'), `docIdToPath → ${path}`);
const path2 = docIdToPath('myspace:work:boards:board-1');
assert(path2.endsWith('/myspace/work/boards/board-1.automerge'), `docIdToPath boards → ${path2}`);
let threw = false;
try { docIdToPath('invalid'); } catch { threw = true; }
assert(threw, 'docIdToPath rejects invalid docId (< 3 parts)');
}
// ─── Test 2: SyncServer in-memory CRUD ─────────────────────
console.log('\n── Test 2: SyncServer in-memory CRUD ──');
{
interface TestDoc { title: string; items: Record<string, { text: string }> }
const docChanges: string[] = [];
const server = new SyncServer({
participantMode: true,
onDocChange: (docId) => docChanges.push(docId),
});
// Create a doc
let doc = Automerge.init<TestDoc>();
doc = Automerge.change(doc, 'init', (d) => {
d.title = 'Test Notebook';
d.items = {};
});
server.setDoc('test:notes:notebooks:nb1', doc);
assert(server.getDocIds().includes('test:notes:notebooks:nb1'), 'setDoc registers docId');
// Read it back
const loaded = server.getDoc<TestDoc>('test:notes:notebooks:nb1');
assert(loaded !== undefined, 'getDoc returns the doc');
assert(loaded!.title === 'Test Notebook', 'doc content preserved');
// Change via changeDoc
const changed = server.changeDoc<TestDoc>('test:notes:notebooks:nb1', 'add item', (d) => {
d.items['item-1'] = { text: 'Hello local-first' };
});
assert(changed !== null, 'changeDoc returns updated doc');
assert(changed!.items['item-1'].text === 'Hello local-first', 'changeDoc content correct');
assert(docChanges.length === 1, 'onDocChange callback fired');
// Verify the server's copy is updated
const reloaded = server.getDoc<TestDoc>('test:notes:notebooks:nb1');
assert(reloaded!.items['item-1'].text === 'Hello local-first', 'server copy updated after changeDoc');
}
// ─── Test 3: Relay mode ────────────────────────────────────
console.log('\n── Test 3: Relay mode (encrypted spaces) ──');
{
const server = new SyncServer({ participantMode: true });
assert(!server.isRelayOnly('demo:notes:notebooks:x'), 'not relay by default');
server.setRelayOnly('encrypted-space', true);
assert(server.isRelayOnly('encrypted-space'), 'exact match → relay');
assert(server.isRelayOnly('encrypted-space:notes:notebooks:x'), 'prefix match → relay');
assert(!server.isRelayOnly('other-space:notes:notebooks:x'), 'other space → not relay');
server.setRelayOnly('encrypted-space', false);
assert(!server.isRelayOnly('encrypted-space:notes:notebooks:x'), 'after removal → not relay');
}
// ─── Test 4: Disk persistence round-trip ───────────────────
// Note: We test Automerge binary serialization directly rather than using
// doc-persistence (which reads DOCS_STORAGE_DIR at module load time).
console.log('\n── Test 4: Disk persistence round-trip ──');
await (async () => {
interface NoteDoc { title: string; content: string }
// Create a doc
let doc = Automerge.init<NoteDoc>();
doc = Automerge.change(doc, 'init', (d) => {
d.title = 'Persistent Note';
d.content = 'This should survive a restart';
});
// Serialize to binary
const binary = Automerge.save(doc);
assert(binary.byteLength > 0, `Automerge.save produces ${binary.byteLength} bytes`);
// Write to temp dir
const { mkdir: mk, writeFile: wf, readFile: rf } = await import('node:fs/promises');
const { dirname: dn } = await import('node:path');
const filePath = resolve(TEST_DIR, 'roundtrip/notes/notebooks/persist-1.automerge');
await mk(dn(filePath), { recursive: true });
await wf(filePath, binary);
assert(existsSync(filePath), `file written to disk`);
// Read back and deserialize
const rawBuf = await rf(filePath);
const reloaded = Automerge.load<NoteDoc>(new Uint8Array(rawBuf));
assert(reloaded.title === 'Persistent Note', 'title preserved after load');
assert(reloaded.content === 'This should survive a restart', 'content preserved after load');
// Verify Automerge history
const history = Automerge.getHistory(reloaded);
assert(history.length === 1, `history has ${history.length} change(s)`);
// Test: modify, save again, load again
const doc2 = Automerge.change(reloaded, 'update', (d) => {
d.content = 'Updated content after reload';
});
const binary2 = Automerge.save(doc2);
await wf(filePath, binary2);
const rawBuf2 = await rf(filePath);
const reloaded2 = Automerge.load<NoteDoc>(new Uint8Array(rawBuf2));
assert(reloaded2.content === 'Updated content after reload', 'content updated after second save/load');
assert(Automerge.getHistory(reloaded2).length === 2, 'history has 2 changes after update');
// Load via loadAllDocs into a SyncServer (uses TEST_DIR since that's what's on disk)
// We do this by creating a SyncServer and loading manually
const server2 = new SyncServer({ participantMode: true });
const raw = await rf(filePath);
const loadedDoc = Automerge.load<NoteDoc>(new Uint8Array(raw));
server2.setDoc('roundtrip:notes:notebooks:persist-1', loadedDoc);
const fromServer = server2.getDoc<NoteDoc>('roundtrip:notes:notebooks:persist-1');
assert(fromServer!.title === 'Persistent Note', 'SyncServer holds correct doc from disk');
})();
// ─── Test 5: Multiple docs + listDocs ──────────────────────
console.log('\n── Test 5: Multiple docs + listing ──');
await (async () => {
const server = new SyncServer({ participantMode: true });
for (const id of ['space-a:work:boards:b1', 'space-a:work:boards:b2', 'space-b:cal:events']) {
let doc = Automerge.init<{ label: string }>();
doc = Automerge.change(doc, 'init', (d) => { d.label = id; });
server.setDoc(id, doc);
}
const ids = server.getDocIds();
assert(ids.length === 3, `3 docs registered (got ${ids.length})`);
assert(ids.includes('space-a:work:boards:b1'), 'board b1 listed');
assert(ids.includes('space-b:cal:events'), 'cal events listed');
})();
// ─── Test 6: Peer subscribe + sync message flow ────────────
console.log('\n── Test 6: Peer subscribe + sync flow ──');
{
interface SimpleDoc { value: number }
const sent: Array<{ peerId: string; msg: string }> = [];
const server = new SyncServer({ participantMode: true });
// Create a doc on the server
let doc = Automerge.init<SimpleDoc>();
doc = Automerge.change(doc, 'set value', (d) => { d.value = 42; });
server.setDoc('sync-test:data:metrics', doc);
// Add a mock peer
const mockWs = {
send: (data: string) => sent.push({ peerId: 'peer-1', msg: data }),
readyState: 1,
};
server.addPeer('peer-1', mockWs);
// Subscribe peer to the doc
server.handleMessage('peer-1', JSON.stringify({
type: 'subscribe',
docIds: ['sync-test:data:metrics'],
}));
assert(server.getDocSubscribers('sync-test:data:metrics').includes('peer-1'), 'peer subscribed');
assert(sent.length > 0, `sync message sent to peer (${sent.length} message(s))`);
// Verify the sync message is valid JSON with type 'sync'
const firstMsg = JSON.parse(sent[0].msg);
assert(firstMsg.type === 'sync', `message type is 'sync'`);
assert(firstMsg.docId === 'sync-test:data:metrics', 'correct docId in sync message');
assert(Array.isArray(firstMsg.data), 'sync data is array (Uint8Array serialized)');
// Clean up peer
server.removePeer('peer-1');
assert(!server.getPeerIds().includes('peer-1'), 'peer removed');
assert(server.getDocSubscribers('sync-test:data:metrics').length === 0, 'subscriber cleaned up');
}
// ─── Test 7: Ping/pong ────────────────────────────────────
console.log('\n── Test 7: Ping/pong ──');
{
const sent: string[] = [];
const server = new SyncServer({ participantMode: true });
const mockWs = {
send: (data: string) => sent.push(data),
readyState: 1,
};
server.addPeer('ping-peer', mockWs);
server.handleMessage('ping-peer', JSON.stringify({ type: 'ping' }));
assert(sent.length === 1, 'pong sent');
assert(JSON.parse(sent[0]).type === 'pong', 'response is pong');
server.removePeer('ping-peer');
}
// ─── Summary ───────────────────────────────────────────────
console.log(`\n${'═'.repeat(50)}`);
console.log(` ${passed} passed, ${failed} failed`);
console.log(`${'═'.repeat(50)}\n`);
// Cleanup
rmSync(TEST_DIR, { recursive: true });
process.exit(failed > 0 ? 1 : 0);
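The peer bookkeeping exercised in tests 6 and 7 boils down to a doc-to-subscriber map with cleanup on disconnect. A minimal sketch (illustrative names only, not the real SyncServer internals):

```typescript
// Minimal sketch of subscriber bookkeeping: each doc id maps to the set of
// peer ids subscribed to it, and removing a peer drops it from every set.
// SubscriberMap is an assumed name for illustration, not part of the codebase.
class SubscriberMap {
  private subs = new Map<string, Set<string>>();

  subscribe(docId: string, peerId: string): void {
    if (!this.subs.has(docId)) this.subs.set(docId, new Set());
    this.subs.get(docId)!.add(peerId);
  }

  subscribers(docId: string): string[] {
    return [...(this.subs.get(docId) ?? [])];
  }

  removePeer(peerId: string): void {
    // drop the peer from every doc's subscriber set, as removePeer() does above
    for (const peers of this.subs.values()) peers.delete(peerId);
  }
}

const m = new SubscriberMap();
m.subscribe("sync-test:data:metrics", "peer-1");
console.log(m.subscribers("sync-test:data:metrics")); // ["peer-1"]
m.removePeer("peer-1");
console.log(m.subscribers("sync-test:data:metrics")); // []
```

The important property, asserted at the end of test 6, is that removal cleans up every subscription in one pass.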


@ -0,0 +1,111 @@
/**
* test-eoa-derivation.ts: Verify EOA key derivation from PRF output.
*
* Tests:
* 1. Determinism: same PRF input → same EOA address every time
* 2. Domain separation: different from what encryption/signing keys would produce
* 3. Valid Ethereum address format (EIP-55 checksummed)
* 4. Different PRF inputs → different addresses
* 5. Private key zeroing works
*
* Usage:
* bun run scripts/test-eoa-derivation.ts
*/
import { deriveEOAFromPRF, zeroPrivateKey } from '../src/encryptid/eoa-derivation';
let passed = 0;
let failed = 0;
function assert(condition: boolean, msg: string) {
if (condition) {
console.log(`✓ ${msg}`);
passed++;
} else {
console.error(`✗ ${msg}`);
failed++;
}
}
function toHex(bytes: Uint8Array): string {
return Array.from(bytes).map(b => b.toString(16).padStart(2, '0')).join('');
}
async function main() {
console.log('=== EOA Key Derivation Tests ===\n');
// Fixed PRF output (simulates 32 bytes from WebAuthn PRF extension)
const prfOutput = new Uint8Array(32);
for (let i = 0; i < 32; i++) prfOutput[i] = i + 1;
// Test 1: Basic derivation works
console.log('[1] Basic derivation');
const result1 = deriveEOAFromPRF(prfOutput);
assert(result1.privateKey.length === 32, 'Private key is 32 bytes');
assert(result1.publicKey.length === 65, 'Public key is 65 bytes (uncompressed)');
assert(result1.publicKey[0] === 0x04, 'Public key starts with 0x04');
assert(result1.address.startsWith('0x'), 'Address starts with 0x');
assert(result1.address.length === 42, 'Address is 42 chars (0x + 40 hex)');
console.log(` Address: ${result1.address}`);
console.log(` PubKey: 0x${toHex(result1.publicKey).slice(0, 20)}...`);
// Test 2: Determinism — same input always gives same output
console.log('\n[2] Determinism');
const result2 = deriveEOAFromPRF(prfOutput);
assert(result2.address === result1.address, 'Same PRF → same address');
assert(toHex(result2.privateKey) === toHex(result1.privateKey), 'Same PRF → same private key');
// Run it 5 more times to be sure
for (let i = 0; i < 5; i++) {
const r = deriveEOAFromPRF(prfOutput);
assert(r.address === result1.address, `Deterministic on iteration ${i + 3}`);
}
// Test 3: Different input → different address
console.log('\n[3] Different inputs produce different addresses');
const prfOutput2 = new Uint8Array(32);
for (let i = 0; i < 32; i++) prfOutput2[i] = 32 - i;
const result3 = deriveEOAFromPRF(prfOutput2);
assert(result3.address !== result1.address, 'Different PRF → different address');
console.log(` Address 1: ${result1.address}`);
console.log(` Address 2: ${result3.address}`);
// Test 4: EIP-55 checksum validity
console.log('\n[4] EIP-55 checksum format');
const hasUppercase = /[A-F]/.test(result1.address.slice(2));
const hasLowercase = /[a-f]/.test(result1.address.slice(2));
const isAllHex = /^0x[0-9a-fA-F]{40}$/.test(result1.address);
assert(isAllHex, 'Address is valid hex');
assert(hasUppercase || hasLowercase, 'Address has mixed case (EIP-55 checksum)');
// Test 5: Private key zeroing
console.log('\n[5] Private key zeroing');
const result4 = deriveEOAFromPRF(prfOutput);
const keyBefore = toHex(result4.privateKey);
assert(keyBefore !== '00'.repeat(32), 'Key is non-zero before wipe');
zeroPrivateKey(result4.privateKey);
const keyAfter = toHex(result4.privateKey);
assert(keyAfter === '00'.repeat(32), 'Key is all zeros after wipe');
// Test 6: Domain separation — verify the address is different from what
// you'd get with different HKDF params (we can't test WebCrypto HKDF here
// but we can verify our specific salt/info combo produces a unique result)
console.log('\n[6] Domain separation');
// Manually derive with same input but different params using @noble/hashes
const { hkdf } = await import('@noble/hashes/hkdf');
const { sha256 } = await import('@noble/hashes/sha256');
const encoder = new TextEncoder();
const encryptionKeyBytes = hkdf(sha256, prfOutput, encoder.encode('encryptid-encryption-key-v1'), encoder.encode('AES-256-GCM'), 32);
const signingKeyBytes = hkdf(sha256, prfOutput, encoder.encode('encryptid-signing-key-v1'), encoder.encode('ECDSA-P256-seed'), 32);
const eoaKeyBytes = hkdf(sha256, prfOutput, encoder.encode('encryptid-eoa-key-v1'), encoder.encode('secp256k1-signing-key'), 32);
assert(toHex(encryptionKeyBytes) !== toHex(eoaKeyBytes), 'EOA key ≠ encryption key material');
assert(toHex(signingKeyBytes) !== toHex(eoaKeyBytes), 'EOA key ≠ signing key material');
assert(toHex(result1.privateKey) === toHex(eoaKeyBytes), 'EOA key matches expected HKDF output');
// Summary
console.log(`\n=== Results: ${passed} passed, ${failed} failed ===`);
process.exit(failed > 0 ? 1 : 0);
}
main();
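The domain-separation check in test 6 can also be reproduced without @noble, using Node's built-in HKDF. A sketch, assuming the salt/info strings used above are the actual derivation labels:

```typescript
// Reproduce the domain-separation property with node:crypto's hkdfSync:
// the same PRF secret expanded under different salt/info labels yields
// unrelated key material, and each derivation is deterministic.
import { hkdfSync } from "node:crypto";

function derive(prf: Uint8Array, salt: string, info: string): string {
  const enc = new TextEncoder();
  const out = hkdfSync("sha256", prf, enc.encode(salt), enc.encode(info), 32);
  return Buffer.from(out).toString("hex");
}

const prf = Uint8Array.from({ length: 32 }, (_, i) => i + 1);
const encKey = derive(prf, "encryptid-encryption-key-v1", "AES-256-GCM");
const eoaKey = derive(prf, "encryptid-eoa-key-v1", "secp256k1-signing-key");

console.log(encKey !== eoaKey); // different labels → unrelated keys
console.log(derive(prf, "encryptid-eoa-key-v1", "secp256k1-signing-key") === eoaKey); // deterministic
```

This is the same HKDF-SHA256 expansion the test performs with @noble/hashes, so both paths should agree on the derived bytes.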


@ -0,0 +1,117 @@
/**
* test-key-manager-eoa.ts: Test EOA integration in EncryptIDKeyManager.
*
* Verifies that the key manager properly derives EOA keys from PRF output
* and includes them in the DerivedKeys interface.
*
* Usage:
* bun run scripts/test-key-manager-eoa.ts
*/
import { EncryptIDKeyManager } from '../src/encryptid/key-derivation';
let passed = 0;
let failed = 0;
function assert(condition: boolean, msg: string) {
if (condition) {
console.log(`✓ ${msg}`);
passed++;
} else {
console.error(`✗ ${msg}`);
failed++;
}
}
function toHex(bytes: Uint8Array): string {
return Array.from(bytes).map(b => b.toString(16).padStart(2, '0')).join('');
}
async function main() {
console.log('=== Key Manager EOA Integration Tests ===\n');
// Simulate PRF output (32 bytes)
const prfOutput = new ArrayBuffer(32);
const view = new Uint8Array(prfOutput);
for (let i = 0; i < 32; i++) view[i] = i + 1;
// Test 1: Initialize from PRF and derive keys
console.log('[1] Derive keys from PRF (includes EOA)');
const km = new EncryptIDKeyManager();
assert(!km.isInitialized(), 'Not initialized before init');
await km.initFromPRF(prfOutput);
assert(km.isInitialized(), 'Initialized after initFromPRF');
const keys = await km.getKeys();
assert(keys.fromPRF === true, 'Keys marked as from PRF');
assert(keys.encryptionKey !== undefined, 'Has encryption key');
assert(keys.signingKeyPair !== undefined, 'Has signing key pair');
assert(keys.didSeed !== undefined, 'Has DID seed');
assert(keys.did !== undefined, 'Has DID');
// EOA fields should be present (PRF path)
assert(keys.eoaPrivateKey !== undefined, 'Has EOA private key');
assert(keys.eoaAddress !== undefined, 'Has EOA address');
assert(keys.eoaPrivateKey!.length === 32, 'EOA private key is 32 bytes');
assert(keys.eoaAddress!.startsWith('0x'), 'EOA address starts with 0x');
assert(keys.eoaAddress!.length === 42, 'EOA address is 42 chars');
console.log(` EOA Address: ${keys.eoaAddress}`);
// Test 2: Keys are cached (same object returned)
console.log('\n[2] Key caching');
const keys2 = await km.getKeys();
assert(keys2.eoaAddress === keys.eoaAddress, 'Cached keys return same EOA address');
// Test 3: Determinism — new manager with same PRF gives same EOA
console.log('\n[3] Determinism across key manager instances');
const km2 = new EncryptIDKeyManager();
await km2.initFromPRF(prfOutput);
const keys3 = await km2.getKeys();
assert(keys3.eoaAddress === keys.eoaAddress, 'Same PRF → same EOA across instances');
assert(toHex(keys3.eoaPrivateKey!) === toHex(keys.eoaPrivateKey!), 'Same private key too');
// Test 4: Clear wipes EOA key
console.log('\n[4] Clear wipes sensitive material');
const eoaKeyRef = keys.eoaPrivateKey!;
const eoaBefore = toHex(eoaKeyRef);
assert(eoaBefore !== '00'.repeat(32), 'EOA key non-zero before clear');
km.clear();
assert(!km.isInitialized(), 'Not initialized after clear');
// The referenced Uint8Array should be zeroed
const eoaAfter = toHex(eoaKeyRef);
assert(eoaAfter === '00'.repeat(32), 'EOA private key zeroed after clear');
// Test 5: Passphrase path — no EOA keys
console.log('\n[5] Passphrase path (no EOA)');
const km3 = new EncryptIDKeyManager();
const salt = new Uint8Array(32);
crypto.getRandomValues(salt);
await km3.initFromPassphrase('test-passphrase-123', salt);
const passKeys = await km3.getKeys();
assert(passKeys.fromPRF === false, 'Keys marked as from passphrase');
assert(passKeys.eoaPrivateKey === undefined, 'No EOA private key from passphrase');
assert(passKeys.eoaAddress === undefined, 'No EOA address from passphrase');
assert(passKeys.encryptionKey !== undefined, 'Still has encryption key');
km3.clear();
// Test 6: Domain separation — EOA address differs from DID-derived values
console.log('\n[6] Domain separation');
const km4 = new EncryptIDKeyManager();
await km4.initFromPRF(prfOutput);
const keys4 = await km4.getKeys();
// EOA address should not appear in the DID
assert(!keys4.did.includes(keys4.eoaAddress!.slice(2)), 'EOA address not in DID');
km4.clear();
// Cleanup km2
km2.clear();
console.log(`\n=== Results: ${passed} passed, ${failed} failed ===`);
process.exit(failed > 0 ? 1 : 0);
}
main().catch(err => {
console.error('Fatal:', err);
process.exit(1);
});
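Test 4 relies on Uint8Array views sharing their underlying buffer: wiping the key in place is visible through every reference, including ones captured before clear(). A standalone illustration:

```typescript
// In-place zeroing of a typed array is observable through all references,
// which is what lets km.clear() wipe a key a caller already holds.
function zero(buf: Uint8Array): void {
  buf.fill(0); // overwrite the shared buffer in place
}

const key = Uint8Array.from({ length: 32 }, (_, i) => i + 1);
const ref = key; // e.g. keys.eoaPrivateKey captured before clear()
zero(key);
console.log(ref.every(b => b === 0)); // the old reference sees zeros too
```

Note the inverse also holds: copying the bytes (e.g. `key.slice()`) before the wipe would defeat it, which is why the key manager hands out the live array rather than copies.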


@ -0,0 +1,93 @@
/**
* test-passkey-signer.ts: Test the passkey x402 signer module.
*
* Verifies that createPasskeySigner() and createPasskeySignerFromKeys()
* work correctly with simulated PRF output. Does NOT make actual payments
* (that requires a funded wallet + live facilitator).
*
* Usage:
* bun run scripts/test-passkey-signer.ts
*/
import { createPasskeySigner, createPasskeySignerFromKeys } from '../shared/x402/passkey-signer';
import { deriveEOAFromPRF } from '../src/encryptid/eoa-derivation';
import { EncryptIDKeyManager, type DerivedKeys } from '../src/encryptid/key-derivation';
let passed = 0;
let failed = 0;
function assert(condition: boolean, msg: string) {
if (condition) {
console.log(`✓ ${msg}`);
passed++;
} else {
console.error(`✗ ${msg}`);
failed++;
}
}
async function main() {
console.log('=== Passkey Signer Tests ===\n');
// Fixed PRF output
const prfOutput = new Uint8Array(32);
for (let i = 0; i < 32; i++) prfOutput[i] = i + 1;
// Expected EOA from this PRF
const expectedEOA = deriveEOAFromPRF(prfOutput);
// Test 1: createPasskeySigner from raw PRF
console.log('[1] createPasskeySigner (from raw PRF)');
const signer = await createPasskeySigner({ prfOutput });
assert(signer.eoaAddress === expectedEOA.address, 'EOA address matches derivation');
assert(typeof signer.paidFetch === 'function', 'paidFetch is a function');
assert(typeof signer.cleanup === 'function', 'cleanup is a function');
console.log(` EOA: ${signer.eoaAddress}`);
// Test 2: Cleanup zeros the key
console.log('\n[2] Cleanup');
signer.cleanup();
// Can't directly check the internal state, but we verify it doesn't throw
assert(true, 'Cleanup ran without error');
// Test 3: createPasskeySignerFromKeys (from DerivedKeys)
console.log('\n[3] createPasskeySignerFromKeys');
const km = new EncryptIDKeyManager();
await km.initFromPRF(prfOutput.buffer);
const keys = await km.getKeys();
const signer2 = await createPasskeySignerFromKeys(keys);
assert(signer2 !== null, 'Signer created from DerivedKeys');
assert(signer2!.eoaAddress === expectedEOA.address, 'Same EOA address from keys');
assert(typeof signer2!.paidFetch === 'function', 'paidFetch is a function');
signer2!.cleanup();
// Test 4: createPasskeySignerFromKeys returns null for passphrase keys
console.log('\n[4] Returns null for passphrase-derived keys (no EOA)');
const km2 = new EncryptIDKeyManager();
await km2.initFromPassphrase('test-pass', new Uint8Array(32));
const passKeys = await km2.getKeys();
const signer3 = await createPasskeySignerFromKeys(passKeys);
assert(signer3 === null, 'Returns null when no EOA keys');
km2.clear();
// Test 5: Custom network
console.log('\n[5] Custom network');
const signer4 = await createPasskeySigner({
prfOutput,
network: 'eip155:8453', // Base mainnet
});
assert(signer4.eoaAddress === expectedEOA.address, 'Same EOA regardless of network');
signer4.cleanup();
// Cleanup
km.clear();
console.log(`\n=== Results: ${passed} passed, ${failed} failed ===`);
process.exit(failed > 0 ? 1 : 0);
}
main().catch(err => {
console.error('Fatal:', err);
process.exit(1);
});


@ -0,0 +1,70 @@
/**
* test-session-permissions.ts: Verify payment operation permissions in session.ts.
*
* Tests that the new payment:x402, payment:safe-propose, and payment:safe-execute
* operations are properly defined in OPERATION_PERMISSIONS.
*
* Usage:
* bun run scripts/test-session-permissions.ts
*/
import { OPERATION_PERMISSIONS, AuthLevel } from '../src/encryptid/session';
let passed = 0;
let failed = 0;
function assert(condition: boolean, msg: string) {
if (condition) {
console.log(`✓ ${msg}`);
passed++;
} else {
console.error(`✗ ${msg}`);
failed++;
}
}
function main() {
console.log('=== Session Permission Tests ===\n');
// Test 1: payment:x402 exists with correct settings
console.log('[1] payment:x402');
const x402 = OPERATION_PERMISSIONS['payment:x402'];
assert(x402 !== undefined, 'payment:x402 is defined');
assert(x402.minAuthLevel === AuthLevel.STANDARD, 'Requires STANDARD auth');
assert(x402.requiresCapability === 'wallet', 'Requires wallet capability');
assert(x402.maxAgeSeconds === undefined, 'No max age (not time-sensitive)');
// Test 2: payment:safe-propose exists with correct settings
console.log('\n[2] payment:safe-propose');
const propose = OPERATION_PERMISSIONS['payment:safe-propose'];
assert(propose !== undefined, 'payment:safe-propose is defined');
assert(propose.minAuthLevel === AuthLevel.ELEVATED, 'Requires ELEVATED auth');
assert(propose.requiresCapability === 'wallet', 'Requires wallet capability');
assert(propose.maxAgeSeconds === 60, 'Max age is 60 seconds');
// Test 3: payment:safe-execute exists with correct settings
console.log('\n[3] payment:safe-execute');
const execute = OPERATION_PERMISSIONS['payment:safe-execute'];
assert(execute !== undefined, 'payment:safe-execute is defined');
assert(execute.minAuthLevel === AuthLevel.CRITICAL, 'Requires CRITICAL auth');
assert(execute.requiresCapability === 'wallet', 'Requires wallet capability');
assert(execute.maxAgeSeconds === 60, 'Max age is 60 seconds');
// Test 4: Existing operations still intact
console.log('\n[4] Existing operations unchanged');
assert(OPERATION_PERMISSIONS['rspace:view-public'] !== undefined, 'rspace:view-public still exists');
assert(OPERATION_PERMISSIONS['rwallet:send-small'] !== undefined, 'rwallet:send-small still exists');
assert(OPERATION_PERMISSIONS['account:delete'] !== undefined, 'account:delete still exists');
assert(OPERATION_PERMISSIONS['rspace:view-public'].minAuthLevel === AuthLevel.BASIC, 'rspace:view-public still BASIC');
assert(OPERATION_PERMISSIONS['account:delete'].minAuthLevel === AuthLevel.CRITICAL, 'account:delete still CRITICAL');
// Test 5: Auth level ordering
console.log('\n[5] Auth level escalation (x402 < propose < execute)');
assert(x402.minAuthLevel < propose.minAuthLevel, 'x402 < propose');
assert(propose.minAuthLevel < execute.minAuthLevel, 'propose < execute');
console.log(`\n=== Results: ${passed} passed, ${failed} failed ===`);
process.exit(failed > 0 ? 1 : 0);
}
main();
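The escalation property checked in test 5 (x402 < propose < execute) is what a permission gate would enforce at runtime. A hypothetical sketch of such a check; the names and shape here are assumptions for illustration, not the actual session.ts implementation:

```typescript
// Hypothetical permission gate over an OPERATION_PERMISSIONS-style table.
// AuthLevel, Permission, and canPerform are illustrative names only.
enum AuthLevel { BASIC = 0, STANDARD = 1, ELEVATED = 2, CRITICAL = 3 }

interface Permission {
  minAuthLevel: AuthLevel;
  maxAgeSeconds?: number; // if set, how fresh the authentication must be
}

const PERMS: Record<string, Permission> = {
  "payment:x402": { minAuthLevel: AuthLevel.STANDARD },
  "payment:safe-execute": { minAuthLevel: AuthLevel.CRITICAL, maxAgeSeconds: 60 },
};

function canPerform(op: string, level: AuthLevel, authAgeSeconds: number): boolean {
  const p = PERMS[op];
  if (!p) return false;                     // unknown operation → deny
  if (level < p.minAuthLevel) return false; // insufficient auth level
  if (p.maxAgeSeconds !== undefined && authAgeSeconds > p.maxAgeSeconds) {
    return false;                           // auth too stale for this op
  }
  return true;
}

console.log(canPerform("payment:x402", AuthLevel.STANDARD, 3600));      // no max age → allowed
console.log(canPerform("payment:safe-execute", AuthLevel.CRITICAL, 120)); // stale → denied
```

The maxAgeSeconds dimension is why safe-propose/safe-execute carry a 60-second window: a high auth level alone is not enough if the authentication event is stale.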


@ -0,0 +1,181 @@
/**
* test-wallet-store.ts: Test encrypted client-side wallet store.
*
* Mocks localStorage since we're running in Bun (not a browser).
*
* Usage:
* bun run scripts/test-wallet-store.ts
*/
// Mock localStorage before importing anything
const storage = new Map<string, string>();
(globalThis as any).localStorage = {
getItem: (k: string) => storage.get(k) ?? null,
setItem: (k: string, v: string) => storage.set(k, v),
removeItem: (k: string) => storage.delete(k),
clear: () => storage.clear(),
};
import { WalletStore } from '../src/encryptid/wallet-store';
import { EncryptIDKeyManager } from '../src/encryptid/key-derivation';
let passed = 0;
let failed = 0;
function assert(condition: boolean, msg: string) {
if (condition) {
console.log(`✓ ${msg}`);
passed++;
} else {
console.error(`✗ ${msg}`);
failed++;
}
}
async function main() {
console.log('=== Wallet Store Tests ===\n');
// Set up key manager with a test PRF
const prfOutput = new ArrayBuffer(32);
new Uint8Array(prfOutput).set(Array.from({ length: 32 }, (_, i) => i + 1));
const km = new EncryptIDKeyManager();
await km.initFromPRF(prfOutput);
const keys = await km.getKeys();
const store = new WalletStore(keys.encryptionKey);
// Test 1: Empty store
console.log('[1] Empty store');
const empty = await store.list();
assert(empty.length === 0, 'No wallets initially');
const noDefault = await store.getDefault();
assert(noDefault === null, 'No default wallet');
// Test 2: Add first wallet (auto-default)
console.log('\n[2] Add first wallet');
const w1 = await store.add({
safeAddress: '0x1111111111111111111111111111111111111111',
chainId: 84532,
eoaAddress: keys.eoaAddress!,
label: 'Test Treasury',
});
assert(w1.id.length > 0, 'Has UUID');
assert(w1.safeAddress === '0x1111111111111111111111111111111111111111', 'Address correct');
assert(w1.chainId === 84532, 'Chain correct');
assert(w1.isDefault === true, 'First wallet is auto-default');
assert(w1.label === 'Test Treasury', 'Label correct');
assert(w1.addedAt > 0, 'Has timestamp');
// Test 3: Data is encrypted in localStorage
console.log('\n[3] Encrypted at rest');
const raw = storage.get('encryptid_wallets');
assert(raw !== undefined, 'Data exists in localStorage');
const blob = JSON.parse(raw!);
assert(typeof blob.c === 'string', 'Has ciphertext field');
assert(typeof blob.iv === 'string', 'Has IV field');
assert(!raw!.includes('Treasury'), 'Label NOT in plaintext');
assert(!raw!.includes('1111111'), 'Address NOT in plaintext');
// Test 4: Add second wallet
console.log('\n[4] Add second wallet');
const w2 = await store.add({
safeAddress: '0x2222222222222222222222222222222222222222',
chainId: 8453,
eoaAddress: keys.eoaAddress!,
label: 'Mainnet Safe',
});
assert(w2.isDefault === false, 'Second wallet is not default');
const all = await store.list();
assert(all.length === 2, 'Now have 2 wallets');
// Test 5: Get default
console.log('\n[5] Get default');
const def = await store.getDefault();
assert(def !== null, 'Has default');
assert(def!.id === w1.id, 'First wallet is still default');
// Test 6: Get by address + chain
console.log('\n[6] Get by address + chain');
const found = await store.get('0x2222222222222222222222222222222222222222', 8453);
assert(found !== null, 'Found by address+chain');
assert(found!.label === 'Mainnet Safe', 'Correct wallet');
const notFound = await store.get('0x2222222222222222222222222222222222222222', 1);
assert(notFound === null, 'Not found on wrong chain');
// Test 7: Update label and default
console.log('\n[7] Update');
const updated = await store.update(w2.id, { label: 'Base Mainnet', isDefault: true });
assert(updated !== null, 'Update succeeded');
assert(updated!.label === 'Base Mainnet', 'Label updated');
assert(updated!.isDefault === true, 'Now default');
const newDefault = await store.getDefault();
assert(newDefault!.id === w2.id, 'Default switched');
// Old default should be false now
const allAfter = await store.list();
const oldW1 = allAfter.find(w => w.id === w1.id);
assert(oldW1!.isDefault === false, 'Old default cleared');
// Test 8: Duplicate add = upsert
console.log('\n[8] Duplicate add (upsert)');
const w1Updated = await store.add({
safeAddress: '0x1111111111111111111111111111111111111111',
chainId: 84532,
eoaAddress: keys.eoaAddress!,
label: 'Renamed Treasury',
});
assert(w1Updated.label === 'Renamed Treasury', 'Label updated via upsert');
const afterUpsert = await store.list();
assert(afterUpsert.length === 2, 'Still 2 wallets (not 3)');
// Test 9: Persistence — new store instance reads encrypted data
console.log('\n[9] Persistence across instances');
const store2 = new WalletStore(keys.encryptionKey);
const restored = await store2.list();
assert(restored.length === 2, 'Restored 2 wallets');
assert(restored.some(w => w.label === 'Renamed Treasury'), 'Labels preserved');
assert(restored.some(w => w.label === 'Base Mainnet'), 'Both wallets present');
// Test 10: Wrong key can't decrypt
console.log('\n[10] Wrong key fails gracefully');
const km2 = new EncryptIDKeyManager();
const otherPrf = new ArrayBuffer(32);
new Uint8Array(otherPrf).set(Array.from({ length: 32 }, (_, i) => 255 - i));
await km2.initFromPRF(otherPrf);
const otherKeys = await km2.getKeys();
const storeWrongKey = new WalletStore(otherKeys.encryptionKey);
const wrongResult = await storeWrongKey.list();
assert(wrongResult.length === 0, 'Wrong key returns empty (graceful failure)');
km2.clear();
// Test 11: Remove wallet
console.log('\n[11] Remove');
const removed = await store.remove(w2.id);
assert(removed === true, 'Remove succeeded');
const afterRemove = await store.list();
assert(afterRemove.length === 1, 'Down to 1 wallet');
// Removed wallet was default, so remaining should be promoted
assert(afterRemove[0].isDefault === true, 'Remaining wallet promoted to default');
// Test 12: Remove non-existent
const removedAgain = await store.remove('nonexistent-id');
assert(removedAgain === false, 'Remove non-existent returns false');
// Test 13: Clear
console.log('\n[13] Clear');
await store.clear();
const afterClear = await store.list();
assert(afterClear.length === 0, 'Empty after clear');
assert(!storage.has('encryptid_wallets'), 'localStorage key removed');
// Cleanup
km.clear();
console.log(`\n=== Results: ${passed} passed, ${failed} failed ===`);
process.exit(failed > 0 ? 1 : 0);
}
main().catch(err => {
console.error('Fatal:', err);
process.exit(1);
});
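Test 3 asserts the blob shape at rest: a `c` (ciphertext) field and an `iv` field, with no plaintext leakage. A dependency-free sketch of that shape; the real WalletStore uses WebCrypto, and this uses node:crypto's AES-256-GCM for illustration:

```typescript
// Sketch of the encrypted-at-rest blob checked in test 3: AES-256-GCM
// ciphertext plus a fresh IV, both base64-encoded. The { c, iv } field
// names mirror the test's assertions; internals of WalletStore may differ.
import { createCipheriv, randomBytes } from "node:crypto";

function encryptBlob(key: Buffer, plaintext: string) {
  const iv = randomBytes(12); // 96-bit GCM nonce, unique per write
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const ct = Buffer.concat([
    cipher.update(plaintext, "utf8"),
    cipher.final(),
    cipher.getAuthTag(), // append the 16-byte tag so decryption can verify
  ]);
  return { c: ct.toString("base64"), iv: iv.toString("base64") };
}

const key = randomBytes(32);
const blob = encryptBlob(key, JSON.stringify([{ label: "Test Treasury" }]));
console.log(typeof blob.c === "string" && typeof blob.iv === "string");
```

A fresh random IV per write is what makes the "wrong key fails gracefully" behavior in test 10 safe: GCM authentication fails cleanly rather than yielding garbage plaintext.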

100
scripts/test-x402.ts Normal file

@ -0,0 +1,100 @@
/**
* test-x402.ts: End-to-end x402 payment test against rSpace rSplat.
*
* Tests the full 402 flow:
* 1. POST to the gated endpoint → expect 402 with payment requirements
* 2. Sign USDC payment on Base Sepolia using EIP-3009
* 3. Retry with X-PAYMENT header → expect 200/201
*
* Usage:
* EVM_PRIVATE_KEY=0x... bun run scripts/test-x402.ts [url]
*
* Defaults to https://demo.rspace.online/api/x402-test
* Wallet needs testnet USDC from https://faucet.circle.com
*/
import { x402Client, wrapFetchWithPayment } from "@x402/fetch";
import { ExactEvmScheme } from "@x402/evm/exact/client";
import { privateKeyToAccount } from "viem/accounts";
const PRIVATE_KEY = process.env.EVM_PRIVATE_KEY as `0x${string}`;
if (!PRIVATE_KEY) {
console.error("Set EVM_PRIVATE_KEY env var (0x-prefixed hex private key)");
process.exit(1);
}
const TARGET_URL =
process.argv[2] || "https://demo.rspace.online/api/x402-test";
async function main() {
console.log("=== x402 Payment Test ===\n");
console.log("Target:", TARGET_URL);
// Step 1: Verify the endpoint returns 402
console.log("\n[1] Testing 402 response (no payment)...");
const raw = await fetch(TARGET_URL, { method: "POST" });
console.log(" Status:", raw.status);
if (raw.status !== 402) {
console.log(" Expected 402, got", raw.status);
console.log(" Body:", await raw.text());
process.exit(1);
}
const body = (await raw.json()) as {
paymentRequirements?: {
payTo: string;
network: string;
maxAmountRequired: string;
};
};
console.log(" Payment requirements:");
console.log(" payTo:", body.paymentRequirements?.payTo);
console.log(" network:", body.paymentRequirements?.network);
console.log(" amount:", body.paymentRequirements?.maxAmountRequired);
// Step 2: Set up x402 client with EVM signer
console.log("\n[2] Setting up x402 client...");
const account = privateKeyToAccount(PRIVATE_KEY);
console.log(" Wallet:", account.address);
const client = new x402Client();
client.register("eip155:84532", new ExactEvmScheme(account as any));
const paidFetch = wrapFetchWithPayment(fetch, client);
// Step 3: Make the paid request
console.log("\n[3] Making paid request (x402 client handles signing automatically)...");
try {
const res = await paidFetch(TARGET_URL, {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify({ test: true }),
});
console.log(" Status:", res.status);
const resBody = await res.text();
try {
console.log(
" Response:",
JSON.stringify(JSON.parse(resBody), null, 2),
);
} catch {
console.log(" Response:", resBody.slice(0, 500));
}
if (res.ok) {
console.log("\n x402 payment successful! USDC transferred on Base Sepolia.");
} else if (res.status === 402) {
console.log("\n Still 402 — payment signing or verification failed.");
console.log(
" Check wallet has USDC on Base Sepolia (faucet: https://faucet.circle.com)",
);
}
} catch (err) {
console.error(" Error:", err);
}
}
main();
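Step 3's "handles signing automatically" can be made concrete. A rough sketch of the retry flow that wrapFetchWithPayment automates (assumed behavior for illustration, not the @x402/fetch internals): on a 402, produce a payment proof from the advertised requirements and retry once with an X-PAYMENT header.

```typescript
// Hypothetical 402-retry wrapper: names wrapWithPayment and PaymentSigner
// are assumptions; the real @x402/fetch client also verifies schemes,
// networks, and amounts before signing.
type PaymentSigner = (requirements: unknown) => Promise<string>;

function wrapWithPayment(baseFetch: typeof fetch, sign: PaymentSigner): typeof fetch {
  return (async (url: any, init: any = {}) => {
    const first = await baseFetch(url, init);
    if (first.status !== 402) return first; // nothing to pay
    const { paymentRequirements } = (await first.json()) as { paymentRequirements?: unknown };
    const payment = await sign(paymentRequirements);
    // retry exactly once, carrying the signed payment proof
    return baseFetch(url, {
      ...init,
      headers: { ...(init.headers ?? {}), "X-PAYMENT": payment },
    });
  }) as typeof fetch;
}
```

The server's 402 body carries the requirements (payTo, network, amount), so the signer never needs out-of-band configuration; that is why the test only inspects the 402 body before paying.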

295
scripts/twenty-setup.js Normal file

@ -0,0 +1,295 @@
#!/usr/bin/env bun
/**
* Twenty CRM setup script: configure pipeline stages and custom fields.
* Run inside the rspace container:
* docker exec rspace-online bun /app/scripts/twenty-setup.js
*/
let API_TOKEN = process.env.TWENTY_API_TOKEN;
const BASE = process.env.TWENTY_API_URL || "http://twenty-ch-server:3000";
if (!API_TOKEN) {
// Fallback: read the token from the container's PID 1 environment
const fs = require("fs");
try {
const env = fs.readFileSync("/proc/1/environ", "utf8");
const match = env.split("\0").find(e => e.startsWith("TWENTY_API_TOKEN="));
if (match) API_TOKEN = match.split("=").slice(1).join("=");
} catch {}
if (!API_TOKEN) {
console.error("TWENTY_API_TOKEN not set");
process.exit(1);
}
}
async function gqlMetadata(query, variables) {
const r = await fetch(BASE + "/metadata", {
method: "POST",
headers: {
Authorization: "Bearer " + API_TOKEN,
"Content-Type": "application/json",
},
body: JSON.stringify({ query, variables }),
});
const d = await r.json();
if (d.errors) {
console.error("GraphQL error:", JSON.stringify(d.errors, null, 2));
}
return d.data;
}
async function gqlApi(query, variables) {
const r = await fetch(BASE + "/api/graphql", {
method: "POST",
headers: {
Authorization: "Bearer " + API_TOKEN,
"Content-Type": "application/json",
},
body: JSON.stringify({ query, variables }),
});
const d = await r.json();
if (d.errors) {
console.error("API GraphQL error:", JSON.stringify(d.errors, null, 2));
}
return d.data;
}
// ── Step 1: Find object IDs ──
async function getObjectId(nameSingular) {
const data = await gqlMetadata(`{
objects(paging: { first: 100 }) {
edges {
node { id nameSingular }
}
}
}`);
const obj = data.objects.edges.find(e => e.node.nameSingular === nameSingular);
return obj ? obj.node.id : null;
}
// ── Step 2: Get existing fields for an object ──
async function getFields(objectId) {
const data = await gqlMetadata(`
query GetFields($id: UUID!) {
object(id: $id) {
id
nameSingular
fields {
edges {
node {
id
name
label
type
isCustom
options
defaultValue
}
}
}
}
}
`, { id: objectId });
return data.object;
}
// ── Step 3: Update stage field options ──
async function updateFieldOptions(fieldId, options, defaultValue) {
const update = { options };
if (defaultValue !== undefined) {
update.defaultValue = defaultValue;
}
const data = await gqlMetadata(`
mutation UpdateField($input: UpdateOneFieldMetadataInput!) {
updateOneField(input: $input) {
id
name
options
defaultValue
}
}
`, {
input: {
id: fieldId,
update,
},
});
return data ? data.updateOneField : null;
}
// ── Step 4: Create custom field ──
async function createField(objectId, fieldDef) {
const data = await gqlMetadata(`
mutation CreateField($input: CreateOneFieldMetadataInput!) {
createOneField(input: $input) {
id
name
label
type
}
}
`, {
input: {
field: {
objectMetadataId: objectId,
...fieldDef,
},
},
});
return data.createOneField;
}
// ══════════════════════════════════════════
// PIPELINE STAGES
// ══════════════════════════════════════════
const PIPELINE_STAGES = [
{ label: "New Lead", value: "NEW_LEAD", color: "#1E90FF", position: 0 },
{ label: "Contacted", value: "CONTACTED", color: "#EAB308", position: 1 },
{ label: "Qualified", value: "QUALIFIED", color: "#F97316", position: 2 },
{ label: "Offer Sent", value: "OFFER_SENT", color: "#8B5CF6", position: 3 },
{ label: "Confirmed", value: "CONFIRMED", color: "#14B8A6", position: 4 },
{ label: "Won", value: "WON", color: "#22C55E", position: 5 },
{ label: "Lost / Not Now", value: "LOST_NOT_NOW", color: "#EF4444", position: 6 },
];
// ══════════════════════════════════════════
// CUSTOM FIELDS
// ══════════════════════════════════════════
const OPPORTUNITY_FIELDS = [
{ name: "eventDatesPreferred", label: "Event Dates (preferred)", type: "DATE" },
{ name: "eventDatesFlexible", label: "Event Dates (flexible range)", type: "TEXT" },
{ name: "groupSize", label: "Group Size", type: "NUMBER" },
{ name: "needsAccommodation", label: "Needs: Accommodation", type: "BOOLEAN", defaultValue: false },
{ name: "needsCatering", label: "Needs: Catering", type: "BOOLEAN", defaultValue: false },
{ name: "needsRooms", label: "Needs: Rooms", type: "BOOLEAN", defaultValue: false },
{ name: "needsAV", label: "Needs: AV", type: "BOOLEAN", defaultValue: false },
{ name: "nextActionDate", label: "Next Action Date", type: "DATE" },
{ name: "followUpDate", label: "Follow-up Date", type: "DATE" },
{ name: "lostReason", label: "Lost Reason", type: "TEXT" },
];
const COMPANY_FIELDS = [
{
name: "leadSource",
label: "Lead Source",
type: "SELECT",
options: [
{ label: "Website", value: "WEBSITE", color: "#3B82F6", position: 0 },
{ label: "Referral", value: "REFERRAL", color: "#22C55E", position: 1 },
{ label: "Event", value: "EVENT", color: "#8B5CF6", position: 2 },
{ label: "Cold Outreach", value: "COLD_OUTREACH", color: "#F97316", position: 3 },
{ label: "Social Media", value: "SOCIAL_MEDIA", color: "#06B6D4", position: 4 },
{ label: "Other", value: "OTHER", color: "#6B7280", position: 5 },
],
},
{ name: "lastTouchDate", label: "Last Touch Date", type: "DATE" },
];
// ══════════════════════════════════════════
// MAIN
// ══════════════════════════════════════════
(async () => {
console.log("=== Twenty CRM Setup ===\n");
// 1. Get object IDs
const oppId = await getObjectId("opportunity");
const compId = await getObjectId("company");
console.log("Opportunity object:", oppId);
console.log("Company object:", compId);
if (!oppId || !compId) {
console.error("Could not find opportunity or company objects");
process.exit(1);
}
// 2. Get current opportunity fields
const oppObj = await getFields(oppId);
const existingOppFields = oppObj.fields.edges.map(e => e.node);
console.log("\nExisting opportunity fields:", existingOppFields.map(f => f.name).join(", "));
// 3. Find and update the stage field
const stageField = existingOppFields.find(f => f.name === "stage");
if (stageField) {
console.log("\nCurrent stage options:", JSON.stringify(stageField.options, null, 2));
console.log("\nUpdating pipeline stages...");
// Default value must match one of the new option values (Twenty wraps in quotes)
const updated = await updateFieldOptions(stageField.id, PIPELINE_STAGES, "'NEW_LEAD'");
if (updated) {
console.log("Pipeline stages updated:", updated.options.map(o => o.label).join(" → "));
}
} else {
console.log("\nWARNING: No 'stage' field found on opportunity object");
}
// 4. Create custom fields on Opportunity
console.log("\n--- Creating Opportunity custom fields ---");
for (const fieldDef of OPPORTUNITY_FIELDS) {
const exists = existingOppFields.find(f => f.name === fieldDef.name);
if (exists) {
console.log(" SKIP (exists): " + fieldDef.name);
continue;
}
const input = {
name: fieldDef.name,
label: fieldDef.label,
type: fieldDef.type,
description: "",
};
if (fieldDef.defaultValue !== undefined) {
input.defaultValue = fieldDef.defaultValue;
}
if (fieldDef.options) {
input.options = fieldDef.options;
}
try {
const created = await createField(oppId, input);
if (created) {
console.log(" CREATED: " + created.name + " (" + created.type + ")");
} else {
console.log(" FAILED: " + fieldDef.name);
}
} catch (e) {
console.log(" ERROR: " + fieldDef.name + " - " + e.message);
}
}
// 5. Create custom fields on Company
console.log("\n--- Creating Company custom fields ---");
const compObj = await getFields(compId);
const existingCompFields = compObj.fields.edges.map(e => e.node);
console.log("Existing company fields:", existingCompFields.map(f => f.name).join(", "));
for (const fieldDef of COMPANY_FIELDS) {
const exists = existingCompFields.find(f => f.name === fieldDef.name);
if (exists) {
console.log(" SKIP (exists): " + fieldDef.name);
continue;
}
const input = {
name: fieldDef.name,
label: fieldDef.label,
type: fieldDef.type,
description: "",
};
if (fieldDef.options) {
input.options = fieldDef.options;
}
try {
const created = await createField(compId, input);
if (created) {
console.log(" CREATED: " + created.name + " (" + created.type + ")");
} else {
console.log(" FAILED: " + fieldDef.name);
}
} catch (e) {
console.log(" ERROR: " + fieldDef.name + " - " + e.message);
}
}
console.log("\n=== Setup complete ===");
})();
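As the comment above notes, Twenty wraps SELECT default values in quotes, which is why the script passes `"'NEW_LEAD'"` rather than `"NEW_LEAD"`. The convention isolated as a tiny sketch (the helper name is hypothetical, not part of the script):

```javascript
// Hypothetical helper: wrap a SELECT option value the way Twenty
// expects defaultValue for SELECT fields (single-quoted string).
function selectDefault(value) {
  return `'${value}'`;
}

console.log(selectDefault("NEW_LEAD")); // → 'NEW_LEAD'
```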

scripts/twenty-views.js Normal file
@@ -0,0 +1,269 @@
#!/usr/bin/env bun
/**
* Twenty CRM: create saved views for the lead funnel.
* Run inside the rspace container:
* docker exec -e TWENTY_API_TOKEN=... -e TWENTY_API_URL=... rspace-online bun /app/scripts/twenty-views.js
*
* Field IDs are discovered dynamically via metadata introspection,
* so this script works on any Twenty instance.
*/
const BASE = process.env.TWENTY_API_URL || "http://twenty-ch-server:3000";
let API_TOKEN = process.env.TWENTY_API_TOKEN;
if (!API_TOKEN) {
// Fallback: read the token from PID 1's environment inside the container
try {
const env = require("fs").readFileSync("/proc/1/environ", "utf8");
const match = env.split("\0").find(e => e.startsWith("TWENTY_API_TOKEN="));
if (match) API_TOKEN = match.split("=").slice(1).join("=");
} catch {}
}
if (!API_TOKEN) {
console.error("TWENTY_API_TOKEN not set");
process.exit(1);
}
const HEADERS = {
Authorization: "Bearer " + API_TOKEN,
"Content-Type": "application/json",
};
async function gql(query, variables) {
const r = await fetch(BASE + "/metadata", {
method: "POST",
headers: HEADERS,
body: JSON.stringify({ query, variables }),
});
const d = await r.json();
if (d.errors) console.error("GQL error:", JSON.stringify(d.errors, null, 2));
return d.data;
}
// ── Discover object and field IDs dynamically ──
async function discoverIds() {
// Step 1: Find the opportunity object ID
const objData = await gql(`{
objects(paging: { first: 100 }) {
edges { node { id nameSingular } }
}
}`);
const oppObj = objData.objects.edges.find(e => e.node.nameSingular === "opportunity");
if (!oppObj) throw new Error("Opportunity object not found");
const opp = oppObj.node;
// Step 2: Query fields separately with proper paging
const fieldData = await gql(`
query GetFields($id: UUID!) {
object(id: $id) {
fields(paging: { first: 200 }) { edges { node { id name } } }
}
}
`, { id: opp.id });
const fields = fieldData.object.fields.edges.map(e => e.node);
const fieldMap = {};
for (const f of fields) fieldMap[f.name] = f.id;
console.log("Discovered opportunity object:", opp.id);
console.log("Fields:", Object.keys(fieldMap).join(", "));
const required = ["stage", "name", "closeDate", "amount"];
for (const name of required) {
if (!fieldMap[name]) throw new Error(`Required field '${name}' not found on opportunity object`);
}
return {
oppObjectId: opp.id,
stageFieldId: fieldMap.stage,
nameFieldId: fieldMap.name,
closeDateFieldId: fieldMap.closeDate,
amountFieldId: fieldMap.amount,
nextActionDateFieldId: fieldMap.nextActionDate || null,
};
}
(async () => {
console.log("=== Creating Twenty CRM Saved Views ===\n");
const ids = await discoverIds();
// ── View 1: "My Pipeline" — Kanban grouped by stage ──
console.log("\nCreating 'My Pipeline' kanban view...");
const v1 = await gql(`
mutation CreateView($input: CreateViewInput!) {
createCoreView(input: $input) { id name type }
}
`, {
input: {
name: "My Pipeline",
objectMetadataId: ids.oppObjectId,
type: "KANBAN",
icon: "IconLayoutKanban",
position: 0,
key: "INDEX",
visibility: "WORKSPACE",
mainGroupByFieldMetadataId: ids.stageFieldId,
},
});
if (v1 && v1.createCoreView) {
const viewId = v1.createCoreView.id;
console.log(" Created view:", viewId);
const viewFields = [
{ fieldMetadataId: ids.nameFieldId, position: 0, isVisible: true },
{ fieldMetadataId: ids.stageFieldId, position: 1, isVisible: true },
{ fieldMetadataId: ids.amountFieldId, position: 2, isVisible: true },
{ fieldMetadataId: ids.closeDateFieldId, position: 3, isVisible: true },
];
if (ids.nextActionDateFieldId) {
viewFields.push({ fieldMetadataId: ids.nextActionDateFieldId, position: 4, isVisible: true });
}
for (const vf of viewFields) {
await gql(`
mutation CreateViewField($input: CreateViewFieldInput!) {
createCoreViewField(input: $input) { id }
}
`, { input: { viewId, ...vf } });
}
console.log(" View fields added");
} else {
console.log(" FAILED to create My Pipeline view");
}
// ── View 2: "Needs Follow-up" — Table filtered: nextActionDate <= today ──
if (ids.nextActionDateFieldId) {
console.log("\nCreating 'Needs Follow-up' table view...");
const v2 = await gql(`
mutation CreateView($input: CreateViewInput!) {
createCoreView(input: $input) { id name type }
}
`, {
input: {
name: "Needs Follow-up",
objectMetadataId: ids.oppObjectId,
type: "TABLE",
icon: "IconAlarm",
position: 1,
key: "INDEX",
visibility: "WORKSPACE",
},
});
if (v2 && v2.createCoreView) {
const viewId = v2.createCoreView.id;
console.log(" Created view:", viewId);
console.log(" Adding filter: nextActionDate <= today...");
const fg = await gql(`
mutation CreateFilterGroup($input: CreateViewFilterGroupInput!) {
createCoreViewFilterGroup(input: $input) { id }
}
`, {
input: {
viewId: viewId,
logicalOperator: "AND",
positionInViewFilterGroup: 0,
},
});
if (fg && fg.createCoreViewFilterGroup) {
await gql(`
mutation CreateFilter($input: CreateViewFilterInput!) {
createCoreViewFilter(input: $input) { id }
}
`, {
input: {
viewId: viewId,
viewFilterGroupId: fg.createCoreViewFilterGroup.id,
fieldMetadataId: ids.nextActionDateFieldId,
operand: "IS_BEFORE",
value: "TODAY",
positionInViewFilterGroup: 0,
},
});
console.log(" Filter added");
}
console.log(" Adding sort: nextActionDate ASC...");
await gql(`
mutation CreateSort($input: CreateViewSortInput!) {
createCoreViewSort(input: $input) { id }
}
`, {
input: {
viewId: viewId,
fieldMetadataId: ids.nextActionDateFieldId,
direction: "ASC",
},
});
console.log(" Sort added");
} else {
console.log(" FAILED to create Needs Follow-up view");
}
// ── View 3: "Stale Leads" — Table filtered: nextActionDate is empty ──
console.log("\nCreating 'Stale Leads' table view...");
const v3 = await gql(`
mutation CreateView($input: CreateViewInput!) {
createCoreView(input: $input) { id name type }
}
`, {
input: {
name: "Stale Leads",
objectMetadataId: ids.oppObjectId,
type: "TABLE",
icon: "IconAlertTriangle",
position: 2,
key: "INDEX",
visibility: "WORKSPACE",
},
});
if (v3 && v3.createCoreView) {
const viewId = v3.createCoreView.id;
console.log(" Created view:", viewId);
console.log(" Adding filter: nextActionDate is empty...");
const fg = await gql(`
mutation CreateFilterGroup($input: CreateViewFilterGroupInput!) {
createCoreViewFilterGroup(input: $input) { id }
}
`, {
input: {
viewId: viewId,
logicalOperator: "AND",
positionInViewFilterGroup: 0,
},
});
if (fg && fg.createCoreViewFilterGroup) {
await gql(`
mutation CreateFilter($input: CreateViewFilterInput!) {
createCoreViewFilter(input: $input) { id }
}
`, {
input: {
viewId: viewId,
viewFilterGroupId: fg.createCoreViewFilterGroup.id,
fieldMetadataId: ids.nextActionDateFieldId,
operand: "IS_EMPTY",
value: "",
positionInViewFilterGroup: 0,
},
});
console.log(" Filter added");
}
} else {
console.log(" FAILED to create Stale Leads view");
}
} else {
console.log("\nSKIPPING follow-up/stale views — nextActionDate field not found.");
console.log("Run twenty-setup.js first to create custom fields, then re-run this script.");
}
console.log("\n=== Views setup complete ===");
})();
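The "Needs Follow-up" view above filters with a relative-date operand (`IS_BEFORE` / `TODAY`) sent as a `CreateViewFilterInput`. The payload isolated as a plain object builder (function name and placeholder IDs are illustrative only, not part of the script):

```javascript
// Builds the filter input shape used for the "Needs Follow-up" view above.
// viewId, filterGroupId, and fieldId are placeholders supplied by the caller.
function followUpFilter(viewId, filterGroupId, fieldId) {
  return {
    viewId,
    viewFilterGroupId: filterGroupId,
    fieldMetadataId: fieldId,
    operand: "IS_BEFORE",
    value: "TODAY",
    positionInViewFilterGroup: 0,
  };
}
```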

server/oauth/google.ts Normal file
@@ -0,0 +1,204 @@
/**
* Google OAuth2 flow with token refresh.
*
* GET /authorize?space=X redirect to Google
* GET /callback exchange code, store tokens, redirect back
* POST /disconnect?space=X revoke token
* POST /refresh?space=X refresh access token using refresh token
*/
import { Hono } from 'hono';
import * as Automerge from '@automerge/automerge';
import { connectionsDocId } from '../../modules/rnotes/schemas';
import type { ConnectionsDoc } from '../../modules/rnotes/schemas';
import type { SyncServer } from '../local-first/sync-server';
const googleOAuthRoutes = new Hono();
const GOOGLE_CLIENT_ID = process.env.GOOGLE_CLIENT_ID || '';
const GOOGLE_CLIENT_SECRET = process.env.GOOGLE_CLIENT_SECRET || '';
const GOOGLE_REDIRECT_URI = process.env.GOOGLE_REDIRECT_URI || '';
const SCOPES = [
'https://www.googleapis.com/auth/documents',
'https://www.googleapis.com/auth/drive.file',
'https://www.googleapis.com/auth/userinfo.email',
].join(' ');
let _syncServer: SyncServer | null = null;
export function setGoogleOAuthSyncServer(ss: SyncServer) {
_syncServer = ss;
}
function ensureConnectionsDoc(space: string): ConnectionsDoc {
if (!_syncServer) throw new Error('SyncServer not initialized');
const docId = connectionsDocId(space);
let doc = _syncServer.getDoc<ConnectionsDoc>(docId);
if (!doc) {
doc = Automerge.change(Automerge.init<ConnectionsDoc>(), 'init connections', (d) => {
d.meta = {
module: 'notes',
collection: 'connections',
version: 1,
spaceSlug: space,
createdAt: Date.now(),
};
});
_syncServer.setDoc(docId, doc);
}
return doc;
}
// GET /authorize — redirect to Google OAuth
googleOAuthRoutes.get('/authorize', (c) => {
const space = c.req.query('space');
if (!space) return c.json({ error: 'space query param required' }, 400);
if (!GOOGLE_CLIENT_ID) return c.json({ error: 'Google OAuth not configured' }, 500);
const state = Buffer.from(JSON.stringify({ space })).toString('base64url');
const params = new URLSearchParams({
client_id: GOOGLE_CLIENT_ID,
redirect_uri: GOOGLE_REDIRECT_URI,
response_type: 'code',
scope: SCOPES,
access_type: 'offline',
prompt: 'consent',
state,
});
return c.redirect(`https://accounts.google.com/o/oauth2/v2/auth?${params}`);
});
// GET /callback — exchange code for tokens
googleOAuthRoutes.get('/callback', async (c) => {
const code = c.req.query('code');
const stateParam = c.req.query('state');
if (!code || !stateParam) return c.json({ error: 'Missing code or state' }, 400);
let state: { space: string };
try {
state = JSON.parse(Buffer.from(stateParam, 'base64url').toString());
} catch {
return c.json({ error: 'Invalid state parameter' }, 400);
}
// Exchange code for tokens
const tokenRes = await fetch('https://oauth2.googleapis.com/token', {
method: 'POST',
headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
body: new URLSearchParams({
code,
client_id: GOOGLE_CLIENT_ID,
client_secret: GOOGLE_CLIENT_SECRET,
redirect_uri: GOOGLE_REDIRECT_URI,
grant_type: 'authorization_code',
}),
});
if (!tokenRes.ok) {
const err = await tokenRes.text();
return c.json({ error: `Token exchange failed: ${err}` }, 502);
}
const tokenData = await tokenRes.json() as any;
// Get user email
let email = '';
try {
const userRes = await fetch('https://www.googleapis.com/oauth2/v2/userinfo', {
headers: { 'Authorization': `Bearer ${tokenData.access_token}` },
});
if (userRes.ok) {
const userData = await userRes.json() as any;
email = userData.email || '';
}
} catch {
// Non-critical
}
// Store tokens
ensureConnectionsDoc(state.space);
const docId = connectionsDocId(state.space);
_syncServer!.changeDoc<ConnectionsDoc>(docId, 'Connect Google', (d) => {
d.google = {
refreshToken: tokenData.refresh_token || '',
accessToken: tokenData.access_token,
expiresAt: Date.now() + (tokenData.expires_in || 3600) * 1000,
email,
connectedAt: Date.now(),
};
});
const redirectUrl = `/${state.space}/rnotes?connected=google`;
return c.redirect(redirectUrl);
});
// POST /disconnect — revoke and remove token
googleOAuthRoutes.post('/disconnect', async (c) => {
const space = c.req.query('space');
if (!space) return c.json({ error: 'space query param required' }, 400);
const docId = connectionsDocId(space);
const doc = _syncServer?.getDoc<ConnectionsDoc>(docId);
if (doc?.google) {
// Best-effort token revocation with Google, then remove the stored
// connection even when no access token is present.
if (doc.google.accessToken) {
try {
await fetch(`https://oauth2.googleapis.com/revoke?token=${doc.google.accessToken}`, {
method: 'POST',
});
} catch {
// Best-effort revocation
}
}
_syncServer!.changeDoc<ConnectionsDoc>(docId, 'Disconnect Google', (d) => {
delete d.google;
});
}
return c.json({ ok: true });
});
// POST /refresh — refresh access token
googleOAuthRoutes.post('/refresh', async (c) => {
const space = c.req.query('space');
if (!space) return c.json({ error: 'space query param required' }, 400);
const docId = connectionsDocId(space);
const doc = _syncServer?.getDoc<ConnectionsDoc>(docId);
if (!doc?.google?.refreshToken) {
return c.json({ error: 'No Google refresh token available' }, 400);
}
const tokenRes = await fetch('https://oauth2.googleapis.com/token', {
method: 'POST',
headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
body: new URLSearchParams({
client_id: GOOGLE_CLIENT_ID,
client_secret: GOOGLE_CLIENT_SECRET,
refresh_token: doc.google.refreshToken,
grant_type: 'refresh_token',
}),
});
if (!tokenRes.ok) {
return c.json({ error: 'Token refresh failed' }, 502);
}
const tokenData = await tokenRes.json() as any;
_syncServer!.changeDoc<ConnectionsDoc>(docId, 'Refresh Google token', (d) => {
if (d.google) {
d.google.accessToken = tokenData.access_token;
d.google.expiresAt = Date.now() + (tokenData.expires_in || 3600) * 1000;
}
});
return c.json({ ok: true });
});
export { googleOAuthRoutes };
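The routes above store an absolute expiry (`Date.now() + expires_in * 1000`). A caller deciding when to POST `/refresh` might compare against that with a safety margin; a minimal sketch, assuming a 60-second skew (helper name and skew value are not from this commit):

```javascript
// Returns true when the stored token is expired or within skewMs of
// expiring, so callers can refresh proactively instead of failing a call.
function needsRefresh(expiresAt, nowMs = Date.now(), skewMs = 60_000) {
  return nowMs >= expiresAt - skewMs;
}
```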

server/oauth/index.ts Normal file
@@ -0,0 +1,20 @@
/**
* OAuth route mounting for external integrations.
*
* Provides OAuth2 authorize/callback/disconnect flows for:
* - Notion (workspace-level integration)
* - Google (user-level, with token refresh)
*
* Tokens are stored in Automerge docs per space via SyncServer.
*/
import { Hono } from 'hono';
import { notionOAuthRoutes } from './notion';
import { googleOAuthRoutes } from './google';
const oauthRouter = new Hono();
oauthRouter.route('/notion', notionOAuthRoutes);
oauthRouter.route('/google', googleOAuthRoutes);
export { oauthRouter };

server/oauth/notion.ts Normal file
@@ -0,0 +1,129 @@
/**
* Notion OAuth2 flow.
*
* GET /authorize?space=X redirect to Notion
* GET /callback exchange code, store token, redirect back
* POST /disconnect?space=X revoke token
*/
import { Hono } from 'hono';
import * as Automerge from '@automerge/automerge';
import { connectionsDocId } from '../../modules/rnotes/schemas';
import type { ConnectionsDoc } from '../../modules/rnotes/schemas';
import type { SyncServer } from '../local-first/sync-server';
const notionOAuthRoutes = new Hono();
const NOTION_CLIENT_ID = process.env.NOTION_CLIENT_ID || '';
const NOTION_CLIENT_SECRET = process.env.NOTION_CLIENT_SECRET || '';
const NOTION_REDIRECT_URI = process.env.NOTION_REDIRECT_URI || '';
// We'll need a reference to the sync server — set externally
let _syncServer: SyncServer | null = null;
export function setNotionOAuthSyncServer(ss: SyncServer) {
_syncServer = ss;
}
function ensureConnectionsDoc(space: string): ConnectionsDoc {
if (!_syncServer) throw new Error('SyncServer not initialized');
const docId = connectionsDocId(space);
let doc = _syncServer.getDoc<ConnectionsDoc>(docId);
if (!doc) {
doc = Automerge.change(Automerge.init<ConnectionsDoc>(), 'init connections', (d) => {
d.meta = {
module: 'notes',
collection: 'connections',
version: 1,
spaceSlug: space,
createdAt: Date.now(),
};
});
_syncServer.setDoc(docId, doc);
}
return doc;
}
// GET /authorize — redirect to Notion OAuth
notionOAuthRoutes.get('/authorize', (c) => {
const space = c.req.query('space');
if (!space) return c.json({ error: 'space query param required' }, 400);
if (!NOTION_CLIENT_ID) return c.json({ error: 'Notion OAuth not configured' }, 500);
const state = Buffer.from(JSON.stringify({ space })).toString('base64url');
const url = `https://api.notion.com/v1/oauth/authorize?client_id=${NOTION_CLIENT_ID}&response_type=code&owner=user&redirect_uri=${encodeURIComponent(NOTION_REDIRECT_URI)}&state=${state}`;
return c.redirect(url);
});
// GET /callback — exchange code for token
notionOAuthRoutes.get('/callback', async (c) => {
const code = c.req.query('code');
const stateParam = c.req.query('state');
if (!code || !stateParam) return c.json({ error: 'Missing code or state' }, 400);
let state: { space: string };
try {
state = JSON.parse(Buffer.from(stateParam, 'base64url').toString());
} catch {
return c.json({ error: 'Invalid state parameter' }, 400);
}
// Exchange code for access token
const tokenRes = await fetch('https://api.notion.com/v1/oauth/token', {
method: 'POST',
headers: {
'Content-Type': 'application/json',
'Authorization': `Basic ${Buffer.from(`${NOTION_CLIENT_ID}:${NOTION_CLIENT_SECRET}`).toString('base64')}`,
},
body: JSON.stringify({
grant_type: 'authorization_code',
code,
redirect_uri: NOTION_REDIRECT_URI,
}),
});
if (!tokenRes.ok) {
const err = await tokenRes.text();
return c.json({ error: `Token exchange failed: ${err}` }, 502);
}
const tokenData = await tokenRes.json() as any;
// Store token in Automerge connections doc
ensureConnectionsDoc(state.space);
const docId = connectionsDocId(state.space);
_syncServer!.changeDoc<ConnectionsDoc>(docId, 'Connect Notion', (d) => {
d.notion = {
accessToken: tokenData.access_token,
workspaceId: tokenData.workspace_id || '',
workspaceName: tokenData.workspace_name || 'Notion Workspace',
connectedAt: Date.now(),
};
});
// Redirect back to rNotes
const redirectUrl = `/${state.space}/rnotes?connected=notion`;
return c.redirect(redirectUrl);
});
// POST /disconnect — revoke and remove token
notionOAuthRoutes.post('/disconnect', async (c) => {
const space = c.req.query('space');
if (!space) return c.json({ error: 'space query param required' }, 400);
const docId = connectionsDocId(space);
const doc = _syncServer?.getDoc<ConnectionsDoc>(docId);
if (doc?.notion) {
_syncServer!.changeDoc<ConnectionsDoc>(docId, 'Disconnect Notion', (d) => {
delete d.notion;
});
}
return c.json({ ok: true });
});
export { notionOAuthRoutes };
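Both OAuth flows above round-trip the space slug through the `state` parameter as base64url-encoded JSON via Node's `Buffer`. The encode/decode pair in isolation:

```javascript
// Encode/decode the OAuth state parameter exactly as the routes above do.
function encodeState(space) {
  return Buffer.from(JSON.stringify({ space })).toString("base64url");
}
function decodeState(stateParam) {
  return JSON.parse(Buffer.from(stateParam, "base64url").toString());
}

const s = encodeState("demo");
console.log(decodeState(s)); // → { space: 'demo' }
```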

@@ -731,6 +731,57 @@ export async function deleteUserAddress(id: string, userId: string): Promise<boo
return result.count > 0;
}
// ============================================================================
// UNIVERSAL PROFILE OPERATIONS
// ============================================================================
export interface StoredUniversalProfile {
userId: string;
upAddress: string;
keyManagerAddress: string;
chainId: number;
deployedAt: string;
}
export async function getUserUPAddress(userId: string): Promise<StoredUniversalProfile | null> {
const [row] = await sql`
SELECT id, up_address, up_key_manager_address, up_chain_id, up_deployed_at
FROM users
WHERE id = ${userId} AND up_address IS NOT NULL
`;
if (!row || !row.up_address) return null;
return {
userId: row.id,
upAddress: row.up_address,
keyManagerAddress: row.up_key_manager_address,
chainId: row.up_chain_id,
deployedAt: row.up_deployed_at?.toISOString?.() || new Date(row.up_deployed_at).toISOString(),
};
}
export async function setUserUPAddress(
userId: string,
upAddress: string,
keyManagerAddress: string,
chainId: number,
): Promise<void> {
await sql`
UPDATE users
SET up_address = ${upAddress},
up_key_manager_address = ${keyManagerAddress},
up_chain_id = ${chainId},
up_deployed_at = NOW(),
updated_at = NOW()
WHERE id = ${userId}
`;
}
export async function getUserByUPAddress(upAddress: string): Promise<StoredUserProfile | null> {
const [row] = await sql`SELECT * FROM users WHERE up_address = ${upAddress}`;
if (!row) return null;
return rowToProfile(row);
}
// ============================================================================
// HEALTH CHECK
// ============================================================================

@@ -20,6 +20,12 @@ ALTER TABLE users ADD COLUMN IF NOT EXISTS profile_email_is_recovery BOOLEAN DEF
ALTER TABLE users ADD COLUMN IF NOT EXISTS wallet_address TEXT;
ALTER TABLE users ADD COLUMN IF NOT EXISTS updated_at TIMESTAMPTZ DEFAULT NOW();
-- Universal Profile columns (LUKSO LSP0 + LSP6 on Base)
ALTER TABLE users ADD COLUMN IF NOT EXISTS up_address TEXT;
ALTER TABLE users ADD COLUMN IF NOT EXISTS up_key_manager_address TEXT;
ALTER TABLE users ADD COLUMN IF NOT EXISTS up_chain_id INTEGER;
ALTER TABLE users ADD COLUMN IF NOT EXISTS up_deployed_at TIMESTAMPTZ;
CREATE TABLE IF NOT EXISTS credentials (
credential_id TEXT PRIMARY KEY,
user_id TEXT NOT NULL REFERENCES users(id) ON DELETE CASCADE,

@@ -62,6 +62,9 @@ import {
getAddressById,
saveUserAddress,
deleteUserAddress,
getUserUPAddress,
setUserUPAddress,
getUserByUPAddress,
} from './db.js';
// ============================================================================
@@ -689,6 +692,60 @@ app.put('/api/user/profile', async (c) => {
return c.json({ success: true, profile });
});
// ============================================================================
// UNIVERSAL PROFILE ENDPOINTS
// ============================================================================
// GET /api/profile/:id/up — get Universal Profile info for a user
app.get('/api/profile/:id/up', async (c) => {
const claims = await verifyTokenFromRequest(c.req.header('Authorization'));
if (!claims) return c.json({ error: 'Unauthorized' }, 401);
const userId = c.req.param('id');
const up = await getUserUPAddress(userId);
if (!up) return c.json({ hasProfile: false });
return c.json({
hasProfile: true,
upAddress: up.upAddress,
keyManagerAddress: up.keyManagerAddress,
chainId: up.chainId,
deployedAt: up.deployedAt,
});
});
// POST /api/profile/:id/up — store Universal Profile after deployment
app.post('/api/profile/:id/up', async (c) => {
const claims = await verifyTokenFromRequest(c.req.header('Authorization'));
if (!claims) return c.json({ error: 'Unauthorized' }, 401);
// Users can only set their own UP
const userId = c.req.param('id');
if (claims.sub !== userId) {
return c.json({ error: 'Forbidden — can only set your own Universal Profile' }, 403);
}
const body = await c.req.json<{
upAddress: string;
keyManagerAddress: string;
chainId: number;
}>();
if (!body.upAddress?.match(/^0x[0-9a-fA-F]{40}$/)) {
return c.json({ error: 'Invalid upAddress' }, 400);
}
if (!body.keyManagerAddress?.match(/^0x[0-9a-fA-F]{40}$/)) {
return c.json({ error: 'Invalid keyManagerAddress' }, 400);
}
if (!body.chainId || typeof body.chainId !== 'number') {
return c.json({ error: 'Invalid chainId' }, 400);
}
await setUserUPAddress(userId, body.upAddress, body.keyManagerAddress, body.chainId);
return c.json({ success: true });
});
// ============================================================================
// ENCRYPTED ADDRESS ENDPOINTS
// ============================================================================
@@ -1104,7 +1161,10 @@ app.post('/api/recovery/email/verify', async (c) => {
async function generateSessionToken(userId: string, username: string): Promise<string> {
const now = Math.floor(Date.now() / 1000);
const payload = {
// Check if user has a Universal Profile
const upInfo = await getUserUPAddress(userId);
const payload: Record<string, unknown> = {
iss: 'https://auth.rspace.online',
sub: userId,
aud: CONFIG.allowedOrigins,
@@ -1114,11 +1174,21 @@ async function generateSessionToken(userId: string, username: string): Promise<s
did: `did:key:${userId.slice(0, 32)}`,
eid: {
authLevel: 3, // ELEVATED (fresh WebAuthn)
authTime: now,
capabilities: {
encrypt: true,
sign: true,
wallet: false,
wallet: !!upInfo,
},
recoveryConfigured: false,
...(upInfo ? {
up: {
address: upInfo.upAddress,
keyManagerAddress: upInfo.keyManagerAddress,
chainId: upInfo.chainId,
controllerAddress: '', // Derived client-side from PRF
},
} : {}),
},
};
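Because of the conditional spread above, `eid.up` is present only when a UP row exists, and `capabilities.wallet` mirrors the same check. A sketch of how a token consumer might read the claim (helper name is hypothetical; the claim shape matches the payload built above):

```javascript
// Extract Universal Profile info from a decoded session-token payload.
// Returns null when the user has no UP (wallet capability is false).
function upFromClaims(payload) {
  const eid = payload && payload.eid;
  if (!eid || !eid.capabilities || !eid.capabilities.wallet || !eid.up) return null;
  return { address: eid.up.address, chainId: eid.up.chainId };
}
```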