feat: Sprint 10 - Architecture docs, CI/CD, operational scripts
Some checks failed: CI / lint-and-test (push) was cancelled.
Phase 1 - Architecture Documentation:
- Add Architecture section with Mermaid flowchart to README
- Create docs/DATABASE_SCHEMA.md with full ERD

Phase 2 - CI/CD:
- Add CI badge to README
- Create .gitea/workflows/ci.yml for linting and tests
- Create .gitea/workflows/deploy-staging.yml
- Create .gitea/workflows/deploy-production.yml

Phase 3 - Operational Scripts:
- Create scripts/logs.sh for docker compose log following
- Create scripts/run-detached.sh with health check loop
- Create scripts/etl/toronto.sh for Toronto data pipeline
- Add Makefile targets: logs, run-detached, etl-toronto

Phase 4 - Runbooks:
- Create docs/runbooks/adding-dashboard.md
- Create docs/runbooks/deployment.md

Phase 5 - Hygiene:
- Create MIT LICENSE file

Phase 6 - Production:
- Add live demo link to README (leodata.science)

Closes #78, #79, #80, #81, #82, #83, #84, #85, #86, #87, #88, #89, #91

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
docs/runbooks/adding-dashboard.md (new file, 200 lines)
# Runbook: Adding a New Dashboard

This runbook describes how to add a new data dashboard to the portfolio application.

## Prerequisites

- [ ] Data sources identified and accessible
- [ ] Database schema designed
- [ ] Basic Dash/Plotly familiarity
## Directory Structure

Create the following structure under `portfolio_app/`:

```
portfolio_app/
├── pages/
│   └── {dashboard_name}/
│       ├── dashboard.py       # Main layout with tabs
│       ├── methodology.py     # Data sources and methods page
│       ├── tabs/
│       │   ├── __init__.py
│       │   ├── overview.py    # Overview tab layout
│       │   └── ...            # Additional tab layouts
│       └── callbacks/
│           ├── __init__.py
│           └── ...            # Callback modules
├── {dashboard_name}/          # Data logic (outside pages/)
│   ├── __init__.py
│   ├── parsers/               # API/CSV extraction
│   │   └── __init__.py
│   ├── loaders/               # Database operations
│   │   └── __init__.py
│   ├── schemas/               # Pydantic models
│   │   └── __init__.py
│   └── models/                # SQLAlchemy ORM
│       └── __init__.py
```
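The tree above can be generated with a short helper script. A minimal sketch, assuming nothing beyond the standard library; `scaffold_dashboard.py` and the `scaffold` function are invented for illustration, not part of the repository:

```python
# scripts/scaffold_dashboard.py - hypothetical helper, not an existing repo script
from pathlib import Path

# Package directories that each need an empty __init__.py (per the tree above)
PACKAGES = [
    "pages/{name}/tabs",
    "pages/{name}/callbacks",
    "{name}",
    "{name}/parsers",
    "{name}/loaders",
    "{name}/schemas",
    "{name}/models",
]


def scaffold(root: Path, name: str) -> None:
    """Create the dashboard directory tree with empty __init__.py files."""
    for template in PACKAGES:
        pkg = root / template.format(name=name)
        pkg.mkdir(parents=True, exist_ok=True)
        (pkg / "__init__.py").touch()
    # Page modules sit next to tabs/ and callbacks/
    (root / "pages" / name / "dashboard.py").touch()
    (root / "pages" / name / "methodology.py").touch()


if __name__ == "__main__":
    scaffold(Path("portfolio_app"), "toronto")
```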
## Step-by-Step Checklist

### 1. Data Layer

- [ ] Create Pydantic schemas in `{dashboard_name}/schemas/`
- [ ] Create SQLAlchemy models in `{dashboard_name}/models/`
- [ ] Create parsers in `{dashboard_name}/parsers/`
- [ ] Create loaders in `{dashboard_name}/loaders/`
- [ ] Add database migrations if needed
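A Pydantic schema from this layer might look like the following. This is a sketch only: `NeighbourhoodSchema` and its fields are illustrative names, not the project's actual schema:

```python
# {dashboard_name}/schemas/neighbourhood.py - illustrative field names
from pydantic import BaseModel, Field


class NeighbourhoodSchema(BaseModel):
    """Validates one raw record before it touches the database."""

    neighbourhood_id: int
    name: str
    population: int = Field(ge=0)  # reject negative counts at parse time


# Usage: raw strings are coerced and checked on construction
record = NeighbourhoodSchema(neighbourhood_id=1, name="High Park", population="11000")
```

A record that fails validation raises `ValidationError` before any loader runs, which keeps bad rows out of the database entirely.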
### 2. dbt Models

Create dbt models in `dbt/models/`:

- [ ] `staging/stg_{source}__{entity}.sql` - Raw data cleaning
- [ ] `intermediate/int_{domain}__{transform}.sql` - Business logic
- [ ] `marts/mart_{domain}.sql` - Final analytical tables

Follow the naming conventions:

- Staging: `stg_{source}__{entity}`
- Intermediate: `int_{domain}__{transform}`
- Marts: `mart_{domain}`
### 3. Visualization Layer

- [ ] Create figure factories in `figures/` (or reuse existing)
- [ ] Follow the factory pattern: `create_{chart_type}_figure(data, **kwargs)`
### 4. Dashboard Pages

#### Main Dashboard (`pages/{dashboard_name}/dashboard.py`)

```python
import dash
from dash import html, dcc
import dash_mantine_components as dmc

from .tabs.overview import overview_tab  # layout function exported by tabs/overview.py

dash.register_page(
    __name__,
    path="/{dashboard_name}",
    title="{Dashboard Title}",
    description="{Description}",
)


def layout():
    return dmc.Container([
        # Header
        dmc.Title("{Dashboard Title}", order=1),

        # Tabs
        dmc.Tabs([
            dmc.TabsList([
                dmc.TabsTab("Overview", value="overview"),
                # Add more tabs
            ]),
            dmc.TabsPanel(overview_tab(), value="overview"),
            # Add more panels
        ], value="overview"),
    ])
```
#### Tab Layouts (`pages/{dashboard_name}/tabs/`)

- [ ] Create one file per tab
- [ ] Export a layout function from each

#### Callbacks (`pages/{dashboard_name}/callbacks/`)

- [ ] Create callback modules for interactivity
- [ ] Import and register them in `dashboard.py`
### 5. Navigation

Add to the sidebar in `components/sidebar.py`:

```python
from dash_iconify import DashIconify

dmc.NavLink(
    label="{Dashboard Name}",
    href="/{dashboard_name}",
    icon=DashIconify(icon="..."),
)
```
### 6. Documentation

- [ ] Create a methodology page (`pages/{dashboard_name}/methodology.py`)
- [ ] Document data sources
- [ ] Document transformation logic
- [ ] Add notebooks to `notebooks/{dashboard_name}/` if needed
### 7. Testing

- [ ] Add unit tests for parsers
- [ ] Add unit tests for loaders
- [ ] Add integration tests for callbacks
- [ ] Run `make test`
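A parser unit test can stay very small. A sketch, assuming a hypothetical `parse_population` helper that normalizes raw CSV rows (both names are invented for illustration):

```python
# tests/test_parsers.py - illustrative; parse_population is an invented example
def parse_population(row: dict) -> dict:
    """Coerce raw CSV strings into typed values, stripping thousands separators."""
    return {
        "neighbourhood": row["Neighbourhood"].strip(),
        "population": int(row["Population"].replace(",", "")),
    }


def test_parse_population_strips_separators():
    raw = {"Neighbourhood": " High Park ", "Population": "11,000"}
    assert parse_population(raw) == {"neighbourhood": "High Park", "population": 11000}
```

Tests like this pin down the messy-input cases (whitespace, separators) so regressions in a parser fail fast under `make test`.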
### 8. Final Verification

- [ ] All pages render without errors
- [ ] All callbacks respond correctly
- [ ] Data loads successfully
- [ ] dbt models run cleanly (`make dbt-run`)
- [ ] Linting passes (`make lint`)
- [ ] Tests pass (`make test`)
## Example: Toronto Dashboard

Reference implementation: `portfolio_app/pages/toronto/`

Key files:

- `dashboard.py` - Main layout with 5 tabs
- `tabs/overview.py` - Livability scores, scatter plots
- `callbacks/map_callbacks.py` - Choropleth interactions
- `toronto/models/dimensions.py` - Dimension tables
- `toronto/models/facts.py` - Fact tables
## Common Patterns

### Figure Factories

```python
# figures/choropleth.py
import geopandas as gpd
import plotly.graph_objects as go


def create_choropleth_figure(
    gdf: gpd.GeoDataFrame,
    value_column: str,
    title: str,
    **kwargs,
) -> go.Figure:
    ...
```
### Callbacks

```python
# callbacks/map_callbacks.py
from dash import Input, Output, callback


@callback(
    Output("neighbourhood-details", "children"),
    Input("choropleth-map", "clickData"),
)
def update_details(click_data):
    ...
```
### Data Loading

```python
# {dashboard_name}/loaders/load.py
from sqlalchemy.orm import Session

# Schema, Model, and parse_source_data come from the dashboard's own
# schemas/, models/, and parsers/ packages.


def load_data(session: Session) -> None:
    # Parse from source
    records = parse_source_data()

    # Validate with Pydantic
    validated = [Schema(**r) for r in records]

    # Load to database
    for record in validated:
        session.add(Model(**record.model_dump()))

    session.commit()
```
docs/runbooks/deployment.md (new file, 232 lines)
# Runbook: Deployment

This runbook covers deployment procedures for the Analytics Portfolio application.

## Environments

| Environment | Branch | Server | URL |
|-------------|--------|--------|-----|
| Development | `development` | Local | http://localhost:8050 |
| Staging | `staging` | Homelab (hotserv) | Internal |
| Production | `main` | Bandit Labs VPS | https://leodata.science |
## CI/CD Pipeline

### Automatic Deployment

Deployments are triggered automatically via Gitea Actions:

1. **Push to `staging`** → deploys to the staging server
2. **Push to `main`** → deploys to the production server

### Workflow Files

- `.gitea/workflows/ci.yml` - Runs linting and tests on all branches
- `.gitea/workflows/deploy-staging.yml` - Staging deployment
- `.gitea/workflows/deploy-production.yml` - Production deployment
### Required Secrets

Configure these in the Gitea repository settings:

| Secret | Description |
|--------|-------------|
| `STAGING_HOST` | Staging server hostname/IP |
| `STAGING_USER` | SSH username for staging |
| `STAGING_SSH_KEY` | Private key for staging SSH |
| `PROD_HOST` | Production server hostname/IP |
| `PROD_USER` | SSH username for production |
| `PROD_SSH_KEY` | Private key for production SSH |
## Manual Deployment

### Prerequisites

- SSH access to the target server
- Repository cloned at `~/apps/personal-portfolio`
- Virtual environment created at `.venv`
- Docker and Docker Compose installed
- PostgreSQL container running
### Steps

```bash
# 1. SSH to the server
ssh user@server

# 2. Navigate to the app directory
cd ~/apps/personal-portfolio

# 3. Pull the latest changes
git fetch origin {branch}
git reset --hard origin/{branch}

# 4. Activate the virtual environment
source .venv/bin/activate

# 5. Install dependencies
pip install -r requirements.txt

# 6. Run database migrations (if any)
# python -m alembic upgrade head

# 7. Run dbt models
cd dbt && dbt run --profiles-dir . && cd ..

# 8. Restart the application
docker compose down
docker compose up -d

# 9. Verify health
curl http://localhost:8050/health
```
## Rollback Procedure

### Quick Rollback

If a deployment fails, roll back to the previous working commit:

```bash
# 1. Find the previous working commit
git log --oneline -10

# 2. Reset to that commit
git reset --hard {commit_hash}

# 3. Restart services
docker compose down
docker compose up -d

# 4. Verify
curl http://localhost:8050/health
```
### Full Rollback (Database)

If database changes need to be reverted:

```bash
# 1. Stop the application (keep the postgres container running for the restore)
docker compose stop app

# 2. Restore the database backup
pg_restore -h localhost -U portfolio -d portfolio backup.dump

# 3. Revert the code
git reset --hard {commit_hash}

# 4. Run dbt at that version
cd dbt && dbt run --profiles-dir . && cd ..

# 5. Restart
docker compose up -d
```
## Health Checks

### Application Health

```bash
curl http://localhost:8050/health
```

Expected response:

```json
{"status": "healthy"}
```
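The same check can be automated with a small retry loop, useful right after a restart while the container is still warming up. A sketch, not one of the repo's scripts; the fetcher is injectable so the loop can be tested without a running server:

```python
# scripts/wait_healthy.py - illustrative, not an existing repo script
import json
import time
import urllib.request
from typing import Callable


def fetch_health(url: str = "http://localhost:8050/health") -> dict:
    """GET the health endpoint and decode the JSON body."""
    with urllib.request.urlopen(url, timeout=5) as resp:
        return json.load(resp)


def wait_healthy(fetch: Callable[[], dict], attempts: int = 10, delay: float = 3.0) -> bool:
    """Poll until the app reports healthy, or give up after `attempts` tries."""
    for _ in range(attempts):
        try:
            if fetch().get("status") == "healthy":
                return True
        except OSError:
            pass  # app still starting; retry after the delay
        time.sleep(delay)
    return False


if __name__ == "__main__":
    raise SystemExit(0 if wait_healthy(fetch_health) else 1)
```

A deploy script can call this instead of a bare `curl`, so a slow startup is retried rather than reported as a failure.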
### Database Health

```bash
docker compose exec postgres pg_isready -U portfolio
```

### Container Status

```bash
docker compose ps
```
## Monitoring

### View Logs

```bash
# All services
make logs

# A specific service
make logs SERVICE=postgres

# Or directly
docker compose logs -f
```

### Check Resource Usage

```bash
docker stats
```
## Troubleshooting

### Application Won't Start

1. Check container logs: `docker compose logs app`
2. Verify environment variables: `cat .env`
3. Check database connectivity: `docker compose exec postgres pg_isready`
4. Verify port availability: `lsof -i :8050`

### Database Connection Errors

1. Check the postgres container: `docker compose ps postgres`
2. Verify `DATABASE_URL` in `.env`
3. Check postgres logs: `docker compose logs postgres`
4. Test the connection: `docker compose exec postgres psql -U portfolio -c '\l'`

### dbt Failures

1. Check the dbt setup: `cd dbt && dbt debug`
2. Verify the profile: `cat dbt/profiles.yml`
3. Run with verbose output: `dbt run --debug`

### Out of Memory

1. Check memory usage: `free -h`
2. Review container limits in `docker-compose.yml`
3. Consider increasing swap or server resources
## Backup Procedures

### Database Backup

```bash
# Create a plain SQL backup
docker compose exec postgres pg_dump -U portfolio portfolio > backup_$(date +%Y%m%d).sql

# Compressed (custom-format) backup
docker compose exec postgres pg_dump -U portfolio -Fc portfolio > backup_$(date +%Y%m%d).dump
```

### Restore from Backup

```bash
# From a SQL file
docker compose exec -T postgres psql -U portfolio portfolio < backup.sql

# From a custom-format dump
docker compose exec -T postgres pg_restore -U portfolio -d portfolio < backup.dump
```
## Deployment Checklist

Before deploying to production:

- [ ] All tests pass (`make test`)
- [ ] Linting passes (`make lint`)
- [ ] Staging deployment successful
- [ ] Manual testing on staging complete
- [ ] Database backup taken
- [ ] Rollback plan confirmed
- [ ] Team notified of deployment window