Self-Hosting Guide
Run Your Own Stack
Sats Stack is fully self-hostable. Here's how.
Architecture Overview
[Architecture diagram: the Sats Stack app on your device connects to two services on your own server — a Mempool instance and an Ollama AI instance.]
All traffic stays within your network. Zero external dependencies once configured.
1. Mempool Setup
Mempool provides Bitcoin blockchain data — balances, transactions, and fee estimates. You'll need a running Electrum server connected to a Bitcoin node.
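Mempool exposes an esplora-style REST API that apps like Sats Stack use to derive balances: `GET /api/address/:address` returns a `chain_stats` object, and the confirmed balance is funded outputs minus spent outputs. A minimal sketch of that calculation (the `MEMPOOL_URL` value is a placeholder for the instance you set up below, and the sample numbers are illustrative):

```python
import json
from urllib.request import urlopen

MEMPOOL_URL = "http://your-server-ip:8080"  # placeholder: your self-hosted instance

def balance_sats(chain_stats: dict) -> int:
    """Confirmed balance = total funded outputs minus total spent outputs (sats)."""
    return chain_stats["funded_txo_sum"] - chain_stats["spent_txo_sum"]

# Live lookup (uncomment once your instance is running):
# addr = "bc1q..."  # any address you want to check
# stats = json.load(urlopen(f"{MEMPOOL_URL}/api/address/{addr}"))["chain_stats"]
# print(balance_sats(stats))

# Offline demonstration with a sample chain_stats object:
sample = {"funded_txo_sum": 150_000, "spent_txo_sum": 40_000, "tx_count": 7}
print(balance_sats(sample))  # → 110000
```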
Step 1 — Start Mempool with Docker Compose
```yaml
version: '3'
services:
  mempool-web:
    image: mempool/frontend:latest
    environment:
      FRONTEND_HTTP_PORT: "8080"
      BACKEND_MAINNET_HTTP_HOST: "mempool-api"
    ports:
      - "8080:8080"
    depends_on:
      - mempool-api

  mempool-api:
    image: mempool/backend:latest
    environment:
      MEMPOOL_BACKEND: "none"
      ELECTRUM_HOST: "your-electrum-host"
      ELECTRUM_PORT: "50002"
      ELECTRUM_TLS_ENABLED: "true"
      DATABASE_HOST: "mempool-db"
      DATABASE_DATABASE: "mempool"
      DATABASE_USERNAME: "mempool"
      DATABASE_PASSWORD: "changeme"
    ports:
      - "8999:8999"
    depends_on:
      - mempool-db

  mempool-db:
    image: mariadb:10.5.21
    environment:
      MYSQL_DATABASE: "mempool"
      MYSQL_USER: "mempool"
      MYSQL_PASSWORD: "changeme"
      MYSQL_ROOT_PASSWORD: "changeme"
    volumes:
      - mempool_db:/var/lib/mysql

volumes:
  mempool_db:
```
Step 2 — Configure in Sats Stack
1. Open Sats Stack → Settings → Data Sources
2. Set Mempool URL to `http://your-server-ip:8080`
3. Test the connection and save
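You can also sanity-check the instance outside the app by hitting Mempool's standard fee endpoint, `GET /api/v1/fees/recommended`. A small sketch (Python; `MEMPOOL_URL` is a placeholder for the URL you entered above, and the sample fee values are illustrative):

```python
import json
from urllib.request import urlopen

MEMPOOL_URL = "http://your-server-ip:8080"  # placeholder: same URL you entered in Sats Stack

def parse_fees(body: bytes) -> dict:
    """Extract the main fee tiers (sat/vB) from a /api/v1/fees/recommended response."""
    data = json.loads(body)
    return {k: data[k] for k in ("fastestFee", "halfHourFee", "hourFee")}

# Live check (uncomment once your instance is up):
# print(parse_fees(urlopen(f"{MEMPOOL_URL}/api/v1/fees/recommended").read()))

# Offline demonstration with a sample response body:
sample = b'{"fastestFee":22,"halfHourFee":15,"hourFee":9,"economyFee":6,"minimumFee":1}'
print(parse_fees(sample))  # → {'fastestFee': 22, 'halfHourFee': 15, 'hourFee': 9}
```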
2. Ollama Setup
Ollama runs large language models locally. The AI assistant in Sats Stack connects to your Ollama instance — all queries stay on your network.
Step 1 — Start Ollama
```yaml
version: '3'
services:
  ollama:
    image: ollama/ollama:latest
    ports:
      - "11434:11434"
    volumes:
      - ollama_data:/root/.ollama
    # Uncomment for GPU support:
    # deploy:
    #   resources:
    #     reservations:
    #       devices:
    #         - capabilities: [gpu]

volumes:
  ollama_data:
```
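Once the container is up, Ollama's HTTP API answers on port 11434; `GET /api/tags` lists the models installed locally, which is a quick way to confirm the service is reachable. A minimal sketch (`OLLAMA_URL` is a placeholder for your server):

```python
import json
from urllib.request import urlopen

OLLAMA_URL = "http://your-server-ip:11434"  # placeholder: your self-hosted instance

def model_names(tags_response: dict) -> list[str]:
    """List installed model names from an /api/tags response."""
    return [m["name"] for m in tags_response.get("models", [])]

# Live check (uncomment once the container is running):
# print(model_names(json.load(urlopen(f"{OLLAMA_URL}/api/tags"))))

# Offline demonstration with a sample response:
sample = {"models": [{"name": "llama3:latest"}, {"name": "phi3:latest"}]}
print(model_names(sample))  # → ['llama3:latest', 'phi3:latest']
```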
Step 2 — Pull a model
```shell
# Pull a model (run after starting the container)
docker exec -it ollama ollama pull llama3
docker exec -it ollama ollama pull mistral
docker exec -it ollama ollama pull phi3
```
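After a model is pulled, you can exercise it directly through Ollama's `/api/generate` endpoint, which takes a JSON body with `model`, `prompt`, and `stream` fields and (with `"stream": false`) returns the full completion in a single JSON object. A sketch of building such a request (`OLLAMA_URL` is a placeholder for your server):

```python
import json
from urllib.request import Request, urlopen

OLLAMA_URL = "http://your-server-ip:11434"  # placeholder: your self-hosted instance

def generate_request(model: str, prompt: str) -> Request:
    """Build a non-streaming /api/generate request."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return Request(
        f"{OLLAMA_URL}/api/generate",
        data=body.encode(),
        headers={"Content-Type": "application/json"},
    )

req = generate_request("llama3", "In one sentence, what is a satoshi?")
print(json.loads(req.data)["model"])  # → llama3
# Live call (uncomment once the model is pulled):
# print(json.load(urlopen(req))["response"])
```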
Step 3 — Configure in Sats Stack
1. Open Sats Stack → Settings → AI Assistant
2. Set Ollama URL to `http://your-server-ip:11434`
3. Select your preferred model and save
Need more help?
Full documentation, troubleshooting guides, and community support are available at docs.satsstack.app.