refactor: multi-dashboard structural migration
Some checks failed
CI / lint-and-test (pull_request) Has been cancelled
- Rename dbt project from toronto_housing to portfolio
- Restructure dbt models into domain subdirectories:
  - shared/ for cross-domain dimensions (dim_time)
  - staging/toronto/, intermediate/toronto/, marts/toronto/
- Update SQLAlchemy models for raw_toronto schema
- Add explicit cross-schema FK relationships for FactRentals
- Namespace figure factories under figures/toronto/
- Namespace notebooks under notebooks/toronto/
- Update Makefile with domain-specific targets and env loading
- Update all documentation for multi-dashboard structure

This enables adding new dashboard projects (e.g., /football, /energy) without structural conflicts or naming collisions.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
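The SQLAlchemy changes listed above are not part of the diff shown below. As a minimal sketch of what an explicit cross-schema foreign key for FactRentals could look like (only FactRentals, dim_time, and the raw_toronto schema come from the commit message; the column names and the "shared" schema placement are assumptions for illustration):

```python
# Sketch only: column names, the "shared" schema, and the relationship shape
# are assumptions; the real models in this repository may differ.
from sqlalchemy import Column, ForeignKey, Integer, Numeric
from sqlalchemy.orm import declarative_base, relationship

Base = declarative_base()


class DimTime(Base):
    """Hypothetical cross-domain time dimension, assumed to live in a shared schema."""

    __tablename__ = "dim_time"
    __table_args__ = {"schema": "shared"}

    date_key = Column(Integer, primary_key=True)


class FactRentals(Base):
    """Rental fact table in the raw_toronto schema."""

    __tablename__ = "fact_rentals"
    __table_args__ = {"schema": "raw_toronto"}

    rental_id = Column(Integer, primary_key=True)
    # The schema-qualified target is what makes the cross-schema FK explicit.
    date_key = Column(Integer, ForeignKey("shared.dim_time.date_key"))
    avg_rent = Column(Numeric)

    time = relationship(DimTime)
```

Without the schema prefix in the ForeignKey string, SQLAlchemy would look for dim_time in the default schema rather than in shared, so the qualification is what carries the cross-schema relationship.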
notebooks/toronto/demographics/income_choropleth.ipynb | 182 (new file)
@@ -0,0 +1,182 @@
{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Median Income Choropleth Map\n",
    "\n",
    "Displays median household income across Toronto's 158 neighbourhoods."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## 1. Data Reference\n",
    "\n",
    "### Source Tables\n",
    "\n",
    "| Table | Grain | Key Columns |\n",
    "|-------|-------|-------------|\n",
    "| `mart_neighbourhood_demographics` | neighbourhood × year | median_household_income, income_index, income_quintile, geometry |\n",
    "\n",
    "### SQL Query"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import os\n",
    "\n",
    "import pandas as pd\n",
    "from dotenv import load_dotenv\n",
    "from sqlalchemy import create_engine\n",
    "\n",
    "# Load .env from the project root (three levels up from this notebook)\n",
    "load_dotenv(\"../../../.env\")\n",
    "\n",
    "engine = create_engine(os.environ[\"DATABASE_URL\"])\n",
    "\n",
    "query = \"\"\"\n",
    "SELECT\n",
    "    neighbourhood_id,\n",
    "    neighbourhood_name,\n",
    "    geometry,\n",
    "    year,\n",
    "    median_household_income,\n",
    "    income_index,\n",
    "    income_quintile,\n",
    "    population,\n",
    "    unemployment_rate\n",
    "FROM public_marts.mart_neighbourhood_demographics\n",
    "WHERE year = (SELECT MAX(year) FROM public_marts.mart_neighbourhood_demographics)\n",
    "ORDER BY median_household_income DESC\n",
    "\"\"\"\n",
    "\n",
    "df = pd.read_sql(query, engine)\n",
    "print(f\"Loaded {len(df)} neighbourhoods\")"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Transformation Steps\n",
    "\n",
    "1. Filter to most recent census year\n",
    "2. Convert geometry to GeoJSON\n",
    "3. Scale income to thousands for readability"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import json\n",
    "\n",
    "import geopandas as gpd\n",
    "\n",
    "df[\"income_thousands\"] = df[\"median_household_income\"] / 1000\n",
    "\n",
    "gdf = gpd.GeoDataFrame(\n",
    "    df, geometry=gpd.GeoSeries.from_wkb(df[\"geometry\"]), crs=\"EPSG:4326\"\n",
    ")\n",
    "\n",
    "geojson = json.loads(gdf.to_json())\n",
    "data = df.drop(columns=[\"geometry\"]).to_dict(\"records\")"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Sample Output"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "df[\n",
    "    [\"neighbourhood_name\", \"median_household_income\", \"income_index\", \"income_quintile\"]\n",
    "].head(10)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## 2. Data Visualization\n",
    "\n",
    "### Figure Factory\n",
    "\n",
    "Uses `create_choropleth_figure` from `portfolio_app.figures.toronto.choropleth`."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import sys\n",
    "\n",
"sys.path.insert(0, \"../..\")\n",
    "\n",
    "from portfolio_app.figures.toronto.choropleth import create_choropleth_figure\n",
    "\n",
    "fig = create_choropleth_figure(\n",
    "    geojson=geojson,\n",
    "    data=data,\n",
    "    location_key=\"neighbourhood_id\",\n",
    "    color_column=\"median_household_income\",\n",
    "    hover_data=[\"neighbourhood_name\", \"income_index\", \"income_quintile\"],\n",
    "    color_scale=\"Viridis\",\n",
    "    title=\"Toronto Median Household Income by Neighbourhood\",\n",
    "    zoom=10,\n",
    ")\n",
    "\n",
    "fig.show()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Income Quintile Distribution"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "df.groupby(\"income_quintile\")[\"median_household_income\"].agg(\n",
    "    [\"count\", \"mean\", \"min\", \"max\"]\n",
    ").round(0)"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "name": "python",
   "version": "3.11.0"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 4
}
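The figure factory the notebook imports, `create_choropleth_figure` in `portfolio_app.figures.toronto.choropleth`, is not included in this diff. For orientation, a rough sketch of an implementation compatible with the call in the notebook, assuming it wraps Plotly Express; the `featureidkey`, map style, and Toronto centre coordinates are guesses rather than the project's actual code:

```python
# Hypothetical sketch of the factory; the signature mirrors the notebook's call.
from typing import Any

import pandas as pd
import plotly.express as px
import plotly.graph_objects as go


def create_choropleth_figure(
    geojson: dict,
    data: list[dict[str, Any]],
    location_key: str,
    color_column: str,
    hover_data: list[str] | None = None,
    color_scale: str = "Viridis",
    title: str = "",
    zoom: int = 10,
) -> go.Figure:
    """Render a neighbourhood choropleth from a GeoJSON layer plus record data."""
    frame = pd.DataFrame(data)
    fig = px.choropleth_mapbox(
        frame,
        geojson=geojson,
        locations=location_key,
        # Assumes the GeoJSON features expose the join key in their properties.
        featureidkey=f"properties.{location_key}",
        color=color_column,
        hover_data=hover_data,
        color_continuous_scale=color_scale,
        mapbox_style="carto-positron",
        zoom=zoom,
        center={"lat": 43.7, "lon": -79.4},  # approximate centre of Toronto
        opacity=0.7,
        title=title,
    )
    fig.update_layout(margin={"l": 0, "r": 0, "t": 40, "b": 0})
    return fig
```

The notebook cell above already shows the intended call pattern; this sketch only mirrors that signature.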