# P2P Wiki AI
AI-augmented system for the P2P Foundation Wiki with two main features:
- Conversational Agent — Ask questions about 23,000+ wiki articles using RAG (Retrieval Augmented Generation)
- Article Ingress Pipeline — Drop article URLs to automatically analyze content, find matching wiki articles, and generate drafts
Content (wiki articles, XML dumps, infrastructure configs) lives in a separate repo: `p2pfoundation-wiki`
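The conversational agent's RAG loop — embed the question, rank stored article vectors by similarity, hand the top matches to the LLM as context — can be sketched in miniature. This toy version uses bag-of-words vectors and cosine similarity in place of the real learned embeddings; function names here are illustrative, not the ones in `src/`:

```python
from collections import Counter
import math

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; the real system generates
    # dense vectors via an embedding model (see src/embeddings.py).
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, articles: list[str], k: int = 2) -> list[str]:
    # Rank articles by similarity to the query; the top-k would be
    # passed to the LLM as context for answering.
    q = embed(query)
    return sorted(articles, key=lambda t: cosine(q, embed(t)), reverse=True)[:k]
```

The production pipeline swaps `embed` for a real embedding model and stores vectors ahead of time, but the retrieval step is the same shape.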
## Quick Start

### Prerequisites
- Python 3.10+
- Ollama installed locally (or access to a remote Ollama server)
- Optional: Anthropic API key for Claude (higher quality article drafts)
### Install & Run

```bash
pip install -e .

# Parse wiki content (from XML dumps or articles directory)
python -m src.parser

# Generate embeddings
python -m src.embeddings

# Start server
python -m src.api
```

Visit http://localhost:8420/ui for the web interface.
## Content Directory

By default, the system looks for `xmldump/` and `articles/articles/` in the project root. To point at the separate content repo:

```bash
# In .env
CONTENT_DIR=/path/to/p2pfoundation-wiki
```

Or mount via Docker:

```bash
HOST_XMLDUMP_DIR=/opt/content/p2pfoundation-wiki/xmldump docker compose up -d
```
## Docker Deployment

```bash
docker compose up -d --build
docker compose logs -f
```
## Configuration

Copy `.env.example` to `.env` and configure:
| Variable | Default | Description |
|---|---|---|
| `CONTENT_DIR` | (unset) | Path to content repo (resolves xmldump + articles) |
| `HOST_XMLDUMP_DIR` | `./xmldump` | Host path for XML dump volume mount |
| `OLLAMA_BASE_URL` | `http://localhost:11434` | Ollama server URL |
| `OLLAMA_MODEL` | `llama3.2` | Model for chat Q&A |
| `CLAUDE_MODEL` | `claude-sonnet-4-20250514` | Model for article drafts |
| `USE_CLAUDE_FOR_DRAFTS` | `true` | Route drafts to Claude |
| `USE_OLLAMA_FOR_CHAT` | `true` | Route chat to Ollama |
| `ANTHROPIC_API_KEY` | (unset) | Claude API key (or via Infisical) |
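The two routing flags imply a simple model-selection rule: drafts go to Claude when `USE_CLAUDE_FOR_DRAFTS` is true, chat goes to Ollama when `USE_OLLAMA_FOR_CHAT` is true. A hedged sketch of that logic, using the variable names and defaults from the table above (the function itself is illustrative, not taken from `config.py`):

```python
import os

def pick_model(task: str) -> str:
    # Routing implied by the configuration flags; falls back to the
    # Ollama chat model when a flag is disabled (fallback is an assumption).
    use_claude = os.environ.get("USE_CLAUDE_FOR_DRAFTS", "true") == "true"
    use_ollama = os.environ.get("USE_OLLAMA_FOR_CHAT", "true") == "true"
    if task == "draft" and use_claude:
        return os.environ.get("CLAUDE_MODEL", "claude-sonnet-4-20250514")
    if task == "chat" and use_ollama:
        return os.environ.get("OLLAMA_MODEL", "llama3.2")
    return os.environ.get("OLLAMA_MODEL", "llama3.2")
```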
## API Endpoints

| Endpoint | Method | Description |
|---|---|---|
| `/chat` | POST | Ask questions about wiki content |
| `/ingress` | POST | Process external article URL |
| `/review` | GET | List items in review queue |
| `/review/action` | POST | Approve/reject draft |
| `/search` | GET | Direct vector search |
| `/articles` | GET | List article titles |
| `/health` | GET | Health check |
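Calling `/chat` from a script looks roughly like the following stdlib-only sketch. The payload field name `"message"` is an assumption — check the actual request schema in `src/api.py`:

```python
import json
import urllib.request

BASE = "http://localhost:8420"  # default port used in this README

def chat_request(question: str) -> urllib.request.Request:
    # Build a POST /chat request; the JSON field name "message"
    # is assumed, not confirmed against the server's schema.
    body = json.dumps({"message": question}).encode()
    return urllib.request.Request(
        f"{BASE}/chat",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# To actually send (requires the server to be running):
# with urllib.request.urlopen(chat_request("What is peer production?")) as resp:
#     print(json.load(resp))
```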
## License
The AI system code is provided as-is for educational purposes. Wiki content is from the P2P Foundation under CC BY-SA 3.0.