Initial commit: Open Notebook deployment config for Netcup RS 8000

- Docker Compose with Traefik labels for notebook.jeffemmett.com
- Configured to use local Ollama for LLM and embeddings (FREE)
- STT/TTS options documented (Groq free tier, ElevenLabs)
- Connected to ai-orchestrator_ai-internal network
- Deployment script and README included

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
commit 348fd0ab1f
@@ -0,0 +1,13 @@
# Data directories (don't commit)
notebook_data/
surreal_data/

# Environment file with secrets
docker.env

# Keep example
!docker.env.example

# OS files
.DS_Store
Thumbs.db
@@ -0,0 +1,175 @@
# Open Notebook - Netcup RS 8000 Deployment

Self-hosted NotebookLM alternative integrated with the AI orchestrator stack.

## Architecture

```
Open Notebook
├── Frontend (Next.js) → port 8502 → Traefik → notebook.jeffemmett.com
├── API (FastAPI) → port 5055
├── Database (SurrealDB) → embedded
└── AI Providers:
    ├── LLM → Ollama (local, FREE)
    ├── Embeddings → Ollama (local, FREE)
    ├── STT → Groq/OpenAI (cloud)
    └── TTS → ElevenLabs/OpenAI (cloud, for podcasts)
```
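
The container joins two external Docker networks declared in `docker-compose.yml`: `traefik-public` for ingress and `ai-orchestrator_ai-internal` to reach the shared Ollama service. A quick pre-flight check that both already exist on the host (illustrative):

```bash
# Both networks are marked "external" in docker-compose.yml, so Compose will not create them
docker network ls --format '{{.Name}}' | grep -x 'traefik-public'
docker network ls --format '{{.Name}}' | grep -x 'ai-orchestrator_ai-internal'
```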

## Quick Deploy

```bash
# 1. SSH to Netcup
ssh netcup

# 2. Clone/copy the deployment files
cd /opt/websites
git clone https://gitea.jeffemmett.com/jeff/open-notebook.git
# OR copy files manually
mkdir -p /opt/websites/open-notebook
cd /opt/websites/open-notebook

# 3. Pull required Ollama models
docker exec ollama ollama pull llama3.2:3b      # Fast LLM
docker exec ollama ollama pull llama3.1:8b      # Better LLM
docker exec ollama ollama pull nomic-embed-text # Embeddings

# 4. Edit docker.env with your API keys (optional)
nano docker.env

# 5. Deploy
docker compose up -d

# 6. Verify
docker logs -f open-notebook
```
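
As a quick smoke test before wiring up DNS (illustrative; ports as published in `docker-compose.yml`, and the exact API health path may vary by Open Notebook version):

```bash
# Frontend should answer on 8502
curl -sI http://localhost:8502 | head -n 1

# API should answer on 5055
curl -sI http://localhost:5055 | head -n 1
```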

## Configure DNS (Cloudflare Tunnel)

### Option A: Add to existing tunnel config

```bash
ssh netcup
nano /root/cloudflared/config.yml
```

Add the hostname under the `ingress:` section, before the final catch-all rule:
```yaml
  - hostname: notebook.jeffemmett.com
    service: http://localhost:80
```

Restart cloudflared:
```bash
docker restart cloudflared
```
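
Optionally, validate the edited rules before relying on them (a sketch; assumes the container is named `cloudflared` and reads the mounted `config.yml` — pass `--config` explicitly if it does not):

```bash
# Check that the ingress list parses and ends with a catch-all rule
docker exec cloudflared cloudflared tunnel ingress validate

# Show which rule a request to the new hostname would match
docker exec cloudflared cloudflared tunnel ingress rule https://notebook.jeffemmett.com
```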

### Option B: Cloudflare Dashboard

1. Go to Cloudflare Zero Trust → Access → Tunnels
2. Select your tunnel → Public Hostnames
3. Add `notebook.jeffemmett.com` → `http://localhost:80`

### DNS Records (if not using wildcard)

In Cloudflare DNS, add a CNAME record:
- Type: CNAME
- Name: notebook
- Target: a838e9dc-0af5-4212-8af2-6864eb15e1b5.cfargotunnel.com
- Proxy: Enabled
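
Once the record is live, a quick end-to-end check from any machine (illustrative):

```bash
# Should return Cloudflare proxy IPs for the CNAME
dig +short notebook.jeffemmett.com

# Should return an HTTP status once the tunnel route is active
curl -sI https://notebook.jeffemmett.com | head -n 1
```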

## AI Provider Configuration

### Local (FREE) - Already configured

| Feature | Provider | Model | Cost |
|---------|----------|-------|------|
| LLM | Ollama | llama3.2:3b, llama3.1:8b | FREE |
| Embeddings | Ollama | nomic-embed-text | FREE |
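
To confirm the local models are pulled and responding before pointing Open Notebook at them (a sketch; the embeddings call uses Ollama's `/api/embeddings` endpoint and is run from the open-notebook container, which can reach `ollama` over the shared network):

```bash
# List installed models
docker exec ollama ollama list

# One-shot generation test with the small LLM
docker exec ollama ollama run llama3.2:3b "Reply with OK"

# Embedding test
docker exec open-notebook curl -s http://ollama:11434/api/embeddings \
  -d '{"model": "nomic-embed-text", "prompt": "hello"}'
```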

### Cloud (for premium features)

| Feature | Recommended Provider | Notes |
|---------|---------------------|-------|
| STT | **Groq** (free tier) | Fast Whisper, 100 hrs/month free |
| TTS | ElevenLabs | Best voice quality for podcasts |
| TTS (alt) | OpenAI | Cheaper, good quality |

### Adding API Keys

Edit `docker.env`:
```bash
# For Speech-to-Text (transcription)
GROQ_API_KEY=gsk_your_key_here

# For Text-to-Speech (podcasts)
ELEVENLABS_API_KEY=your_key_here
# OR
OPENAI_API_KEY=sk-your_key_here
```

Then restart:
```bash
docker compose restart
```
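
Since `docker.env` is loaded via `env_file`, the keys should appear in the container environment after the restart (illustrative check, values masked):

```bash
docker exec open-notebook env \
  | grep -E 'GROQ_API_KEY|ELEVENLABS_API_KEY|OPENAI_API_KEY' \
  | sed 's/=.*/=***/'
```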

## Useful Commands

```bash
# View logs
docker logs -f open-notebook

# Restart
docker compose restart

# Update to latest version
docker compose pull
docker compose up -d

# Check Ollama models
docker exec ollama ollama list

# Pull new Ollama model
docker exec ollama ollama pull mistral:7b

# Backup data
tar -czvf notebook-backup.tar.gz notebook_data surreal_data
```
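
To restore a backup created with the command above (a sketch; stop the stack first so SurrealDB is not writing during the restore):

```bash
docker compose down
tar -xzvf notebook-backup.tar.gz   # restores notebook_data/ and surreal_data/
docker compose up -d
```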

## Accessing Open Notebook

- **Web UI**: https://notebook.jeffemmett.com
- **API Docs**: https://notebook.jeffemmett.com/api/docs (if exposed)
- **Local**: http://159.195.32.209:8502

## Integration with AI Orchestrator

The Open Notebook instance connects to the same Ollama service used by the AI orchestrator, sharing:
- Model cache (no duplicate downloads)
- Compute resources
- Network (ai-orchestrator_ai-internal)

For advanced routing (e.g., GPU-accelerated inference via RunPod), configure the AI orchestrator to expose OpenAI-compatible endpoints.
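
Note that Ollama itself also serves an OpenAI-compatible API under `/v1`, which any tool on the `ai-internal` network can call directly (illustrative request; model name from the Quick Deploy step):

```bash
docker exec open-notebook curl -s http://ollama:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "llama3.2:3b", "messages": [{"role": "user", "content": "Say hello"}]}'
```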

## Troubleshooting

**Container won't start:**
```bash
docker logs open-notebook
# Check if ports are in use
netstat -tlnp | grep -E '8502|5055'
```

**Can't connect to Ollama:**
```bash
# Verify network connectivity
docker exec open-notebook curl http://ollama:11434/api/tags
```

**Database issues:**
```bash
# Reset database (CAUTION: loses data)
docker compose down
rm -rf surreal_data
docker compose up -d
```
@@ -0,0 +1,61 @@
#!/bin/bash
# Open Notebook Deployment Script for Netcup RS 8000
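#
# Usage (assumption: this file is saved as deploy.sh and made executable with chmod +x):
#   ./deploy.sh local   # run from a workstation: syncs files to the server and deploys over SSH
#   ./deploy.sh         # run directly on the server from /opt/websites/open-notebook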

set -e

DEPLOY_DIR="/opt/websites/open-notebook"
REMOTE="netcup"

echo "=== Open Notebook Deployment ==="

# Check if we're running locally or on server
if [[ "$1" == "local" ]]; then
    echo "Running locally - will SSH to deploy..."

    # Sync files to server
    echo "[1/5] Syncing files to $REMOTE:$DEPLOY_DIR..."
    ssh $REMOTE "mkdir -p $DEPLOY_DIR"
    scp docker-compose.yml docker.env README.md $REMOTE:$DEPLOY_DIR/

    # Execute deployment on server
    echo "[2/5] Deploying on server..."
    ssh $REMOTE "cd $DEPLOY_DIR && docker compose pull"
    ssh $REMOTE "cd $DEPLOY_DIR && docker compose up -d"

    # Pull Ollama models if needed
    echo "[3/5] Checking Ollama models..."
    ssh $REMOTE "docker exec ollama ollama list | grep -q llama3.2:3b || docker exec ollama ollama pull llama3.2:3b"
    ssh $REMOTE "docker exec ollama ollama list | grep -q nomic-embed-text || docker exec ollama ollama pull nomic-embed-text"

    echo "[4/5] Waiting for container to be healthy..."
    sleep 10

    echo "[5/5] Checking status..."
    ssh $REMOTE "docker ps | grep open-notebook"

    echo ""
    echo "=== Deployment Complete ==="
    echo "Access Open Notebook at:"
    echo "  - Local: http://159.195.32.209:8502"
    echo "  - After DNS setup: https://notebook.jeffemmett.com"
    echo ""
    echo "Don't forget to:"
    echo "  1. Add API keys to docker.env for STT/TTS features"
    echo "  2. Configure Cloudflare tunnel hostname"
    echo "  3. Add DNS CNAME record"

else
    # Running on server
    echo "Running on server..."
    cd $DEPLOY_DIR

    docker compose pull
    docker compose up -d

    echo "Checking Ollama models..."
    docker exec ollama ollama list | grep -q llama3.2:3b || docker exec ollama ollama pull llama3.2:3b
    docker exec ollama ollama list | grep -q nomic-embed-text || docker exec ollama ollama pull nomic-embed-text

    echo "Deployment complete!"
    docker ps | grep open-notebook
fi

@@ -0,0 +1,35 @@
version: '3.8'

services:
  open-notebook:
    image: ghcr.io/lfnovo/open-notebook:v1-latest-single
    container_name: open-notebook
    restart: always
    env_file:
      - ./docker.env
    ports:
      - "8502:8502"  # Frontend
      - "5055:5055"  # API
    volumes:
      - ./notebook_data:/app/data
      - ./surreal_data:/mydata
    labels:
      - "traefik.enable=true"
      # Frontend routing
      - "traefik.http.routers.open-notebook.rule=Host(`notebook.jeffemmett.com`)"
      - "traefik.http.routers.open-notebook.entrypoints=web"
      - "traefik.http.services.open-notebook.loadbalancer.server.port=8502"
      # API routing (for external access if needed)
      - "traefik.http.routers.open-notebook-api.rule=Host(`notebook-api.jeffemmett.com`)"
      - "traefik.http.routers.open-notebook-api.entrypoints=web"
      - "traefik.http.services.open-notebook-api.loadbalancer.server.port=5055"
    networks:
      - traefik-public
      - ai-internal

networks:
  traefik-public:
    external: true
  ai-internal:
    external: true
    name: ai-orchestrator_ai-internal
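
# Note: both networks are declared external, so Compose will not create them;
# traefik-public comes from the Traefik stack and ai-orchestrator_ai-internal
# from the AI orchestrator stack. To sanity-check this file before first use
# (illustrative): docker compose config --quiet && echo "compose file OK"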

@@ -0,0 +1,48 @@
# ==============================================================================
# OPEN NOTEBOOK CONFIGURATION - NETCUP RS 8000
# ==============================================================================
# Copy to docker.env and configure your settings
# ==============================================================================

# ------------------------------------------------------------------------------
# API Configuration
# ------------------------------------------------------------------------------
INTERNAL_API_URL=http://localhost:5055
API_URL=http://notebook.jeffemmett.com:5055
API_CLIENT_TIMEOUT=300
ESPERANTO_LLM_TIMEOUT=120

# ------------------------------------------------------------------------------
# Security (Optional)
# ------------------------------------------------------------------------------
# OPEN_NOTEBOOK_PASSWORD=your_secure_password_here

# ------------------------------------------------------------------------------
# LOCAL AI - OLLAMA (Primary - FREE)
# ------------------------------------------------------------------------------
OLLAMA_API_BASE=http://ollama:11434

# ------------------------------------------------------------------------------
# CLOUD AI PROVIDERS (Optional)
# ------------------------------------------------------------------------------
# OPENAI_API_KEY=sk-...
# ANTHROPIC_API_KEY=sk-ant-...
# GROQ_API_KEY=gsk_...
# GOOGLE_API_KEY=...
# ELEVENLABS_API_KEY=...

# ------------------------------------------------------------------------------
# DATABASE - SurrealDB
# ------------------------------------------------------------------------------
SURREAL_URL=ws://localhost:8000/rpc
SURREAL_USER=root
SURREAL_PASSWORD=root
SURREAL_NAMESPACE=open_notebook
SURREAL_DATABASE=production

SURREAL_COMMANDS_RETRY_ENABLED=true
SURREAL_COMMANDS_RETRY_MAX_ATTEMPTS=3
SURREAL_COMMANDS_RETRY_WAIT_STRATEGY=exponential_jitter
SURREAL_COMMANDS_RETRY_WAIT_MIN=1
SURREAL_COMMANDS_RETRY_WAIT_MAX=30
SURREAL_COMMANDS_MAX_TASKS=5
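
# ------------------------------------------------------------------------------
# First-time setup (illustrative)
# ------------------------------------------------------------------------------
# cp docker.env.example docker.env
# Then set OPEN_NOTEBOOK_PASSWORD and replace the default SurrealDB credentials
# before deploying; docker.env stays out of git (it is listed in .gitignore).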