2 Commits

Author SHA1 Message Date
d64f90b3d3 Merge branch 'feature/7-nav-theme-modernization' into development 2026-01-15 11:53:22 -05:00
b3fb94c7cb feat: Add floating sidebar navigation and dark theme support
- Add floating pill-shaped sidebar with navigation icons
- Implement dark/light theme toggle with localStorage persistence
- Update all figure factories for transparent backgrounds
- Use carto-darkmatter map style for choropleths
- Add methodology link button to Toronto dashboard header
- Add back to dashboard button on methodology page
- Remove social links from home page (now in sidebar)
- Update CLAUDE.md to Sprint 7

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 11:53:13 -05:00
14 changed files with 475 additions and 569 deletions

View File

```diff
@@ -6,7 +6,7 @@ Working context for Claude Code on the Analytics Portfolio project.
 ## Project Status
-**Current Sprint**: 1 (Project Bootstrap)
+**Current Sprint**: 7 (Navigation & Theme Modernization)
 **Phase**: 1 - Toronto Housing Dashboard
 **Branch**: `development` (feature branches merge here)
@@ -254,4 +254,4 @@ All scripts in `scripts/`:
 ---
-*Last Updated: Sprint 1*
+*Last Updated: Sprint 7*
```

View File

@@ -1,520 +0,0 @@
# Leo Miranda — Portfolio Website Blueprint
Structure, navigation, and complete page content
---
## Site Architecture
```
leodata.science
├── Home (Landing)
├── About
├── Projects (Overview + Status)
│ └── [Side Navbar]
│ ├── → Toronto Housing Market Dashboard (live)
│ ├── → US Retail Energy Price Predictor (coming soon)
│ └── → DataFlow Platform (Phase 3)
├── Lab (Bandit Labs / Experiments)
├── Blog
│ └── [Articles]
├── Resume (downloadable + inline)
└── Contact
```
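The tree above maps one-to-one onto routes. As a hedged sketch (page names come from the nav spec above; the route table and helper are illustrative, not part of the site's actual code), the primary nav could be derived from a plain route list:

```python
# Illustrative route table mirroring the site map above.
PAGES = [
    {"path": "/", "name": "Home"},
    {"path": "/about", "name": "About"},
    {"path": "/projects", "name": "Projects"},
    {"path": "/lab", "name": "Lab"},
    {"path": "/blog", "name": "Blog"},
    {"path": "/resume", "name": "Resume"},
    {"path": "/contact", "name": "Contact"},
]


def primary_nav(pages):
    """Build the primary nav (Home | Projects | Lab | Blog | About | Resume)."""
    order = ["Home", "Projects", "Lab", "Blog", "About", "Resume"]
    by_name = {p["name"]: p for p in pages}
    return [by_name[n] for n in order if n in by_name]
```

Contact is deliberately absent from the primary nav here; per the navigation spec it is reached via CTA buttons and the footer.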
---
## Navigation Structure
Primary Nav: Home | Projects | Lab | Blog | About | Resume
Footer: LinkedIn | GitHub | Email | “Built with Dash & too much coffee”
---
# PAGE CONTENT
---
## 1. HOME (Landing Page)
### Hero Section
Headline:
> I turn messy data into systems that actually work.
Subhead:
> Data Engineer & Analytics Specialist. 8 years building pipelines, dashboards, and the infrastructure nobody sees but everyone depends on. Based in Toronto.
CTA Buttons:
- View Projects → /projects
- Get In Touch → /contact
---
### Quick Impact Strip (Optional — 3-4 stats)
| 1B+ | 40% | 5 Years |
|-------------------------------------------------|------------------------------------|-----------------------------|
| Rows processed daily across enterprise platform | Efficiency gain through automation | Building DataFlow from zero |
---
### Featured Project Card
Toronto Housing Market Dashboard
> Real-time analytics on Toronto's housing trends. dbt-powered ETL, Python scraping, Plotly visualization.
> \[View Dashboard\] \[View Repository\]
---
### Brief Intro (2-3 sentences)
I'm a data engineer who's spent the last 8 years in the trenches—building the infrastructure that feeds dashboards, automates the boring stuff, and makes data actually usable. Most of my work has been in contact center operations and energy, where I've had to be scrappy: one-person data teams, legacy systems, stakeholders who need answers yesterday.
I like solving real problems, not theoretical ones.
---
## 2. ABOUT PAGE
### Opening
I didn't start in data. I started in project management—CAPM certified, ITIL trained, the whole corporate playbook. Then I realized I liked building systems more than managing timelines, and I was better at automating reports than attending meetings about them.
That pivot led me to where I am now: 8 years deep in data engineering, analytics, and the messy reality of turning raw information into something people can actually use.
---
### What I Actually Do
The short version: I build data infrastructure. Pipelines, warehouses, dashboards, automation—the invisible machinery that makes businesses run on data instead of gut feelings.
The longer version: At Summitt Energy, I've been the sole data professional supporting 150+ employees across 9 markets (Canada and US). I inherited nothing—no data warehouse, no reporting infrastructure, no documentation. Over 5 years, I built DataFlow: an enterprise platform processing 1B+ rows, integrating contact center data, CRM systems, and legacy tools that definitely weren't designed to talk to each other.
That meant learning to be a generalist. I've done ETL pipeline development (Python, SQLAlchemy), dimensional modeling, dashboard design (Power BI, Plotly-Dash), API integration, and more stakeholder management than I'd like to admit. When you're the only data person, you learn to wear every hat.
---
### How I Think About Data
I'm not interested in data for data's sake. The question I always start with: What decision does this help someone make?
Most of my work has been in operations-heavy environments—contact centers, energy retail, logistics. These aren't glamorous domains, but they're where data can have massive impact. A 30% improvement in abandon rate isn't just a metric; it's thousands of customers who didn't hang up frustrated. A 40% reduction in reporting time means managers can actually manage instead of wrestling with spreadsheets.
I care about outcomes, not technology stacks.
---
### The Technical Stuff (For Those Who Want It)
Languages: Python (Pandas, SQLAlchemy, FastAPI), SQL (MSSQL, PostgreSQL), R, VBA
Data Engineering: ETL/ELT pipelines, dimensional modeling (star schema), dbt patterns, batch processing, API integration, web scraping (Selenium)
Visualization: Plotly/Dash, Power BI, Tableau
Platforms: Genesys Cloud, Five9, Zoho, Azure DevOps
Currently Learning: Cloud certification (Azure DP-203), Airflow, Snowflake
---
### Outside Work
I'm a Brazilian-Canadian based in Toronto. I speak Portuguese (native), English (fluent), and enough Spanish to survive.
When I'm not staring at SQL, I'm usually:
- Building automation tools for small businesses through Bandit Labs (my side project)
- Contributing to open source (MCP servers, Claude Code plugins)
- Trying to explain to my kid why Daddy's job involves “making computers talk to each other”
---
### What Im Looking For
I'm currently exploring Senior Data Analyst and Data Engineer roles in the Toronto area (or remote). I'm most interested in:
- Companies that treat data as infrastructure, not an afterthought
- Teams where I can contribute to architecture decisions, not just execute tickets
- Operations-focused industries (energy, logistics, financial services, contact center tech)
If that sounds like your team, let's talk.
\[Download Resume\] \[Contact Me\]
---
## 3. PROJECTS PAGE
### Navigation Note
The Projects page serves as an overview and status hub for all projects. A side navbar provides direct links to live dashboards and repositories. Users land on the overview first, then navigate to specific projects via the sidebar.
### Intro Text
These are projects I've built—some professional (anonymized where needed), some personal. Each one taught me something. Use the sidebar to jump directly to live dashboards or explore the overviews below.
---
### Project Card: Toronto Housing Market Dashboard
Type: Personal Project | Status: Live
The Problem:
Toronto's housing market moves fast, and most publicly available data is either outdated, behind paywalls, or scattered across dozens of sources. I wanted a single dashboard that tracked trends in real time.
What I Built:
- Data Pipeline: Python scraper pulling listings data, automated on schedule
- Transformation Layer: dbt-based SQL architecture (staging → intermediate → marts)
- Visualization: Interactive Plotly-Dash dashboard with filters by neighborhood, price range, property type
- Infrastructure: PostgreSQL backend, version-controlled in Git
Tech Stack: Python, dbt, PostgreSQL, Plotly-Dash, GitHub Actions
What I Learned:
Real estate data is messy as hell. Listings get pulled, prices change, duplicates are everywhere. Building a reliable pipeline meant implementing serious data quality checks and learning to embrace “good enough” over “perfect.”
\[View Live Dashboard\] \[View Repository (ETL + dbt)\]
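The data quality checks mentioned above could look something like this minimal pandas sketch. Column names (`listing_id`, `price`, `scraped_at`) are assumptions for illustration, not the project's actual schema:

```python
import pandas as pd


def clean_listings(df: pd.DataFrame) -> pd.DataFrame:
    """Hypothetical quality gate: drop incomplete rows, bad prices, duplicates."""
    # Rows missing an ID or price are unusable downstream
    df = df.dropna(subset=["listing_id", "price"])
    # Scraped prices occasionally parse as zero or negative; reject them
    df = df[df["price"] > 0]
    # Listings get re-scraped as prices change; keep the most recent snapshot
    df = df.sort_values("scraped_at").drop_duplicates("listing_id", keep="last")
    return df.reset_index(drop=True)
```

In practice checks like these would live in the staging layer, before dbt-style transformations see the data.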
---
### Project Card: US Retail Energy Price Predictor
Type: Personal Project | Status: Coming Soon (Phase 2)
The Problem:
Retail energy pricing in deregulated US markets is volatile and opaque. Consumers and analysts lack accessible tools to understand pricing trends and forecast where rates are headed.
What Im Building:
- Data Pipeline: Automated ingestion of public pricing data across multiple US markets
- ML Model: Price prediction using time series forecasting (ARIMA, Prophet, or similar)
- Transformation Layer: dbt-based SQL architecture for feature engineering
- Visualization: Interactive dashboard showing historical trends + predictions by state/market
Tech Stack: Python, Scikit-learn, dbt, PostgreSQL, Plotly-Dash
Why This Project:
This showcases the ML side of my skillset—something the Toronto Housing dashboard doesn't cover. It also leverages my domain expertise from 5+ years in retail energy operations.
\[Coming Soon\]
---
### Project Card: DataFlow Platform (Enterprise Case Study)
Type: Professional | Status: Deferred (Phase 3 — requires sanitized codebase)
The Context:
When I joined Summitt Energy, there was no data infrastructure. Reports were manual. Insights were guesswork. I was hired to fix that.
What I Built (Over 5 Years):
- v1 (2020): Basic ETL scripts pulling Genesys Cloud data into MSSQL
- v2 (2021): Dimensional model (star schema) with fact/dimension tables
- v3 (2022): Python refactor with SQLAlchemy ORM, batch processing, error handling
- v4 (2023-24): dbt-pattern SQL views (staging → intermediate → marts), FastAPI layer, CLI tools
Current State:
- 21 tables, 1B+ rows
- 5,000+ daily transactions processed
- Integrates Genesys Cloud, Zoho CRM, legacy systems
- Feeds Power BI prototypes and production Dash dashboards
- Near-zero reporting errors
Impact:
- 40% improvement in reporting efficiency
- 30% reduction in call abandon rate (via KPI framework)
- 50% faster Average Speed to Answer
- 100% callback completion rate
What I Learned:
Building data infrastructure as a team of one forces brutal prioritization. I learned to ship imperfect solutions fast, iterate based on feedback, and never underestimate how long stakeholder buy-in takes.
Note: This is proprietary work. A sanitized case study with architecture patterns (no proprietary data) will be published in Phase 3.
---
### Project Card: AI-Assisted Automation (Bandit Labs)
Type: Consulting/Side Business | Status: Active
What It Is:
Bandit Labs is my consulting practice focused on automation for small businesses. Most clients don't need enterprise data platforms—they need someone to eliminate the 4 hours/week they spend manually entering receipts.
Sample Work:
- Receipt Processing Automation: OCR pipeline (Tesseract, Google Vision) extracting purchase data from photos, pushing directly to QuickBooks. Eliminated 3-4 hours/week of manual entry for a restaurant client.
- Product Margin Tracker: Plotly-Dash dashboard with real-time profitability insights
- Claude Code Plugins: MCP servers for Gitea, Wiki.js, NetBox integration
Why I Do This:
Small businesses are underserved by the data/automation industry. Everyone wants to sell them enterprise software they don't need. I like solving problems at a scale where the impact is immediately visible.
\[Learn More About Bandit Labs\]
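OCR engines like Tesseract or Google Vision return raw text; the useful work is the extraction step that follows. A hedged sketch of that step (the field names and regex are illustrative, not the client pipeline's actual logic):

```python
import re


def parse_receipt_text(text: str) -> dict:
    """Pull a vendor line and a total out of raw OCR output (illustrative only)."""
    lines = [ln.strip() for ln in text.splitlines() if ln.strip()]
    # Receipts usually lead with the vendor name
    vendor = lines[0] if lines else ""
    total = None
    for ln in lines:
        # Match "TOTAL 31.50" style lines; keep the last match so a final
        # grand total wins over intermediate amounts
        m = re.search(r"(?i)\btotal\b[^\d]*(\d+[.,]\d{2})", ln)
        if m:
            total = float(m.group(1).replace(",", "."))
    return {"vendor": vendor, "total": total}
```

Real receipts would need far more hardening (tax lines, multi-currency, OCR misreads), which is exactly where the consulting hours go.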
---
## 4. LAB PAGE (Bandit Labs / Experiments)
### Intro
This is where I experiment. Some of this becomes client work. Some of it teaches me something and gets abandoned. All of it is real code solving real (or at least real-adjacent) problems.
---
### Bandit Labs — Automation for Small Business
I started Bandit Labs because I kept meeting small business owners drowning in manual work that should have been automated years ago. Enterprise tools are overkill. Custom development is expensive. There's a gap in the middle.
What I Offer:
- Receipt/invoice processing automation
- Dashboard development (Plotly-Dash)
- Data pipeline setup for non-technical teams
- AI integration for repetitive tasks
Recent Client Work:
- Rio Açaí (Restaurant, Gatineau): Receipt OCR → QuickBooks integration. Saved 3-4 hours/week.
\[Contact for Consulting\]
---
### Open Source / Experiments
MCP Servers (Model Context Protocol)
I've built production-ready MCP servers for:
- Gitea: Issue management, label operations
- Wiki.js: Documentation access via GraphQL
- NetBox: CMDB integration (DCIM, IPAM, Virtualization)
These let AI assistants (like Claude) interact with infrastructure tools through natural language. Still experimental, but surprisingly useful for my own workflows.
Claude Code Plugins
- projman: AI-guided sprint planning with Gitea/Wiki.js integration
- cmdb-assistant: Conversational infrastructure queries against NetBox
- project-hygiene: Post-task cleanup automation
\[View on GitHub\]
---
## 5. BLOG PAGE
### Intro
I write occasionally about data engineering, automation, and the reality of being a one-person data team. No hot takes, no growth hacking—just things I've learned the hard way.
---
### Suggested Initial Articles
Article 1: “Building a Data Platform as a Team of One”
What I learned from 5 years as the sole data professional at a mid-size company
Outline:
- The reality of “full stack data” when there's no one else
- Prioritization frameworks (what to build first when everything is urgent)
- Technical debt vs. shipping something
- Building stakeholder trust without a team to back you up
- What I'd do differently
---
Article 2: “dbt Patterns Without dbt (And Why I Eventually Adopted Them)”
How I accidentally implemented analytics engineering best practices before knowing the terminology
Outline:
- The problem: SQL spaghetti in production dashboards
- My solution: staging → intermediate → marts view architecture
- Why separation of concerns matters for maintainability
- The day I discovered dbt and realized I'd been doing this manually
- Migration path for legacy SQL codebases
---
Article 3: “The Toronto Housing Market Dashboard: A Data Engineering Postmortem”
Building a real-time analytics pipeline for messy, uncooperative data
Outline:
- Why I built this (and why public housing data sucks)
- Data sourcing challenges and ethical scraping
- Pipeline architecture decisions
- dbt transformation layer design
- What broke and how I fixed it
- Dashboard design for non-technical users
---
Article 4: “Automating Small Business Operations with OCR and AI”
A case study in practical automation for non-enterprise clients
Outline:
- The client problem: 4 hours/week on receipt entry
- Why “just use \[enterprise tool\]” doesn't work for small business
- Building an OCR pipeline with Tesseract and Google Vision
- QuickBooks integration gotchas
- ROI calculation for automation projects
---
Article 5: “What I Wish I Knew Before Building My First ETL Pipeline”
Hard-won lessons for junior data engineers
Outline:
- Error handling isn't optional (it's the whole job)
- Logging is your best friend at 2am
- Why idempotency matters
- The staging table pattern
- Testing data pipelines
- Documentation nobody will read (write it anyway)
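The idempotency and staging-table points above can be condensed into one pattern: make re-running a load a no-op. A minimal sketch using stdlib sqlite3 and an UPSERT keyed on a natural key (table and column names are illustrative, not from any real pipeline):

```python
import sqlite3


def load_rows(conn: sqlite3.Connection, rows: list[tuple[str, float]]) -> None:
    """Idempotent load: running this twice with the same rows changes nothing."""
    conn.execute(
        """CREATE TABLE IF NOT EXISTS listings (
               listing_id TEXT PRIMARY KEY,
               price REAL
           )"""
    )
    # UPSERT on the natural key: re-delivered rows update in place
    # instead of duplicating
    conn.executemany(
        """INSERT INTO listings (listing_id, price) VALUES (?, ?)
           ON CONFLICT(listing_id) DO UPDATE SET price = excluded.price""",
        rows,
    )
    conn.commit()
```

The same idea scales up: land raw data in a staging table, then merge into the target keyed on business identifiers, so a crashed-and-retried job can't double-count.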
---
Article 6: “Predicting US Retail Energy Prices: An ML Project Walkthrough”
Building a forecasting model with domain knowledge from 5 years in energy retail
Outline:
- Why retail energy pricing is hard to predict (deregulation, seasonality, policy)
- Data sourcing and pipeline architecture
- Feature engineering with dbt
- Model selection (ARIMA vs Prophet vs ensemble)
- Evaluation metrics that matter for price forecasting
- Lessons from applying domain expertise to ML
---
## 6. RESUME PAGE
### Inline Display
Show a clean, readable version of the resume directly on the page. Use your tailored Senior Data Analyst version as the base.
### Download Options
- \[Download PDF\]
- \[Download DOCX\]
- \[View on LinkedIn\]
### Optional: Interactive Timeline
Visual timeline of career progression with expandable sections for each role. More engaging than a wall of text, but only if you have time to build it.
---
## 7. CONTACT PAGE
### Intro
I'm currently open to Senior Data Analyst and Data Engineer roles in Toronto (or remote). If you're working on something interesting and need someone who can build data infrastructure from scratch, I'd like to hear about it.
For consulting inquiries (automation, dashboards, small business data work), reach out about Bandit Labs.
---
### Contact Form Fields
- Name
- Email
- Subject (dropdown: Job Opportunity / Consulting Inquiry / Other)
- Message
---
### Direct Contact
- Email: leobrmi@hotmail.com
- Phone: (416) 859-7936
- LinkedIn: \[link\]
- GitHub: \[link\]
---
### Location
Toronto, ON, Canada
Canadian Citizen | Eligible to work in Canada and US
---
## TONE GUIDELINES
### Do:
- Be direct and specific
- Use first person naturally
- Include concrete metrics
- Acknowledge constraints and tradeoffs
- Show personality without being performative
- Write like you talk (minus the profanity)
### Dont:
- Use buzzwords without substance (“leveraging synergies”)
- Oversell or inflate
- Write in third person
- Use passive voice excessively
- Sound like a LinkedIn influencer
- Pretend youre a full team when youre one person
---
## SEO / DISCOVERABILITY
### Target Keywords (Organic)
- Toronto data analyst
- Data engineer portfolio
- Python ETL developer
- dbt analytics engineer
- Contact center analytics
### Blog Strategy
Aim for 1-2 posts per month initially. Focus on:
- Technical tutorials (how I built X)
- Lessons learned (what went wrong and how I fixed it)
- Industry observations (data work in operations-heavy companies)
---
## IMPLEMENTATION PRIORITY
### Phase 1 (MVP — Get it live)
1. Home page (hero + brief intro + featured project)
2. About page (full content)
3. Projects page (overview + status cards with navbar links to dashboards)
4. Resume page (inline + download)
5. Contact page (form + direct info)
6. Blog (start with 2-3 articles)
### Phase 2 (Expand)
1. Lab page (Bandit Labs + experiments)
2. US Retail Energy Price Predictor (ML project — coming soon)
3. Add more projects as completed
### Phase 3 (Polish)
1. DataFlow Platform case study (requires sanitized fork of proprietary codebase)
2. Testimonials (if available from Summitt stakeholders)
3. Interactive elements (timeline, project filters)
---
Last updated: January 2025

View File

```diff
@@ -2,7 +2,9 @@
 import dash
 import dash_mantine_components as dmc
+from dash import dcc, html
 
+from .components import create_sidebar
 from .config import get_settings
@@ -17,14 +19,31 @@ def create_app() -> dash.Dash:
     )
 
     app.layout = dmc.MantineProvider(
-        dash.page_container,
+        id="mantine-provider",
+        children=[
+            dcc.Location(id="url", refresh=False),
+            dcc.Store(id="theme-store", storage_type="local", data="dark"),
+            dcc.Store(id="theme-init-dummy"),  # Dummy store for theme init callback
+            html.Div(
+                [
+                    create_sidebar(),
+                    html.Div(
+                        dash.page_container,
+                        className="page-content-wrapper",
+                    ),
+                ],
+            ),
+        ],
         theme={
             "primaryColor": "blue",
             "fontFamily": "'Inter', sans-serif",
         },
-        forceColorScheme="light",
+        defaultColorScheme="dark",
     )
+
+    # Import callbacks to register them
+    from . import callbacks  # noqa: F401
+
     return app
```

View File

@@ -0,0 +1,139 @@
/* Floating sidebar navigation styles */
/* Sidebar container */
.floating-sidebar {
position: fixed;
left: 16px;
top: 50%;
transform: translateY(-50%);
width: 60px;
padding: 16px 8px;
border-radius: 32px;
z-index: 1000;
display: flex;
flex-direction: column;
align-items: center;
gap: 8px;
box-shadow: 0 4px 12px rgba(0, 0, 0, 0.15);
transition: background-color 0.2s ease;
}
/* Page content offset to prevent sidebar overlap */
.page-content-wrapper {
margin-left: 92px; /* sidebar width (60px) + left margin (16px) + gap (16px) */
min-height: 100vh;
}
/* Dark theme (default) */
[data-mantine-color-scheme="dark"] .floating-sidebar {
background-color: #141414;
}
[data-mantine-color-scheme="dark"] body {
background-color: #000000;
}
/* Light theme */
[data-mantine-color-scheme="light"] .floating-sidebar {
background-color: #f0f0f0;
}
[data-mantine-color-scheme="light"] body {
background-color: #ffffff;
}
/* Brand initials styling */
.sidebar-brand {
width: 40px;
height: 40px;
display: flex;
align-items: center;
justify-content: center;
border-radius: 50%;
background-color: var(--mantine-color-blue-filled);
margin-bottom: 4px;
transition: transform 0.2s ease;
}
.sidebar-brand:hover {
transform: scale(1.05);
}
.sidebar-brand-link {
font-weight: 700;
font-size: 16px;
color: white;
text-decoration: none;
line-height: 1;
}
/* Divider between sections */
.sidebar-divider {
width: 32px;
height: 1px;
background-color: var(--mantine-color-dimmed);
margin: 4px 0;
opacity: 0.3;
}
/* Active nav icon indicator */
.nav-icon-active {
background-color: var(--mantine-color-blue-filled) !important;
}
/* Navigation icon hover effects */
.floating-sidebar .mantine-ActionIcon-root {
transition: transform 0.15s ease, background-color 0.15s ease;
}
.floating-sidebar .mantine-ActionIcon-root:hover {
transform: scale(1.1);
}
/* Ensure links don't have underlines */
.floating-sidebar a {
text-decoration: none;
}
/* Theme toggle specific styling */
#theme-toggle {
transition: transform 0.3s ease;
}
#theme-toggle:hover {
transform: rotate(15deg) scale(1.1);
}
/* Responsive adjustments for smaller screens */
@media (max-width: 768px) {
.floating-sidebar {
left: 8px;
width: 50px;
padding: 12px 6px;
border-radius: 25px;
}
.page-content-wrapper {
margin-left: 70px;
}
.sidebar-brand {
width: 34px;
height: 34px;
}
.sidebar-brand-link {
font-size: 14px;
}
}
/* Very small screens - hide sidebar, show minimal navigation */
@media (max-width: 480px) {
.floating-sidebar {
display: none;
}
.page-content-wrapper {
margin-left: 0;
}
}

View File

@@ -0,0 +1,5 @@
"""Application-level callbacks for the portfolio app."""
from . import theme
__all__ = ["theme"]

View File

@@ -0,0 +1,38 @@
"""Theme toggle callbacks using clientside JavaScript."""
from dash import Input, Output, State, clientside_callback
# Toggle theme on button click
# Stores new theme value and updates the DOM attribute
clientside_callback(
"""
function(n_clicks, currentTheme) {
if (n_clicks === undefined || n_clicks === null) {
return window.dash_clientside.no_update;
}
const newTheme = currentTheme === 'dark' ? 'light' : 'dark';
document.documentElement.setAttribute('data-mantine-color-scheme', newTheme);
return newTheme;
}
""",
Output("theme-store", "data"),
Input("theme-toggle", "n_clicks"),
State("theme-store", "data"),
prevent_initial_call=True,
)
# Initialize theme from localStorage on page load
# Uses a dummy output since we only need the side effect of setting the DOM attribute
clientside_callback(
"""
function(theme) {
if (theme) {
document.documentElement.setAttribute('data-mantine-color-scheme', theme);
}
return theme;
}
""",
Output("theme-init-dummy", "data"),
Input("theme-store", "data"),
prevent_initial_call=False,
)

View File

```diff
@@ -2,11 +2,13 @@
 from .map_controls import create_map_controls, create_metric_selector
 from .metric_card import MetricCard, create_metric_cards_row
+from .sidebar import create_sidebar
 from .time_slider import create_time_slider, create_year_selector
 
 __all__ = [
     "create_map_controls",
     "create_metric_selector",
+    "create_sidebar",
     "create_time_slider",
     "create_year_selector",
     "MetricCard",
```

View File

@@ -0,0 +1,179 @@
"""Floating sidebar navigation component."""
import dash_mantine_components as dmc
from dash import dcc, html
from dash_iconify import DashIconify
# Navigation items configuration
NAV_ITEMS = [
{"path": "/", "icon": "tabler:home", "label": "Home"},
{"path": "/toronto", "icon": "tabler:map-2", "label": "Toronto Housing"},
]
# External links configuration
EXTERNAL_LINKS = [
{
"url": "https://github.com/leomiranda",
"icon": "tabler:brand-github",
"label": "GitHub",
},
{
"url": "https://linkedin.com/in/leobmiranda",
"icon": "tabler:brand-linkedin",
"label": "LinkedIn",
},
]
def create_brand_logo() -> html.Div:
"""Create the brand initials logo."""
return html.Div(
dcc.Link(
"LM",
href="/",
className="sidebar-brand-link",
),
className="sidebar-brand",
)
def create_nav_icon(
icon: str,
label: str,
path: str,
current_path: str,
) -> dmc.Tooltip:
"""Create a navigation icon with tooltip.
Args:
icon: Iconify icon string.
label: Tooltip label.
path: Navigation path.
current_path: Current page path for active state.
Returns:
Tooltip-wrapped navigation icon.
"""
is_active = current_path == path or (path != "/" and current_path.startswith(path))
return dmc.Tooltip(
dcc.Link(
dmc.ActionIcon(
DashIconify(icon=icon, width=20),
variant="subtle" if not is_active else "filled",
size="lg",
radius="xl",
color="blue" if is_active else "gray",
className="nav-icon-active" if is_active else "",
),
href=path,
),
label=label,
position="right",
withArrow=True,
)
def create_theme_toggle(current_theme: str = "dark") -> dmc.Tooltip:
"""Create the theme toggle button.
Args:
current_theme: Current theme ('dark' or 'light').
Returns:
Tooltip-wrapped theme toggle icon.
"""
icon = "tabler:sun" if current_theme == "dark" else "tabler:moon"
label = "Switch to light mode" if current_theme == "dark" else "Switch to dark mode"
return dmc.Tooltip(
dmc.ActionIcon(
DashIconify(icon=icon, width=20, id="theme-toggle-icon"),
id="theme-toggle",
variant="subtle",
size="lg",
radius="xl",
color="gray",
),
label=label,
position="right",
withArrow=True,
)
def create_external_link(url: str, icon: str, label: str) -> dmc.Tooltip:
"""Create an external link icon with tooltip.
Args:
url: External URL.
icon: Iconify icon string.
label: Tooltip label.
Returns:
Tooltip-wrapped external link icon.
"""
return dmc.Tooltip(
dmc.Anchor(
dmc.ActionIcon(
DashIconify(icon=icon, width=20),
variant="subtle",
size="lg",
radius="xl",
color="gray",
),
href=url,
target="_blank",
),
label=label,
position="right",
withArrow=True,
)
def create_sidebar_divider() -> html.Div:
"""Create a horizontal divider for the sidebar."""
return html.Div(className="sidebar-divider")
def create_sidebar(current_path: str = "/", current_theme: str = "dark") -> html.Div:
"""Create the floating sidebar navigation.
Args:
current_path: Current page path for active state highlighting.
current_theme: Current theme for toggle icon state.
Returns:
Complete sidebar component.
"""
return html.Div(
[
# Brand logo
create_brand_logo(),
create_sidebar_divider(),
# Navigation icons
*[
create_nav_icon(
icon=item["icon"],
label=item["label"],
path=item["path"],
current_path=current_path,
)
for item in NAV_ITEMS
],
create_sidebar_divider(),
# Theme toggle
create_theme_toggle(current_theme),
create_sidebar_divider(),
# External links
*[
create_external_link(
url=link["url"],
icon=link["icon"],
label=link["label"],
)
for link in EXTERNAL_LINKS
],
],
className="floating-sidebar",
id="floating-sidebar",
)

View File

```diff
@@ -39,6 +39,10 @@ def create_choropleth_figure(
     if center is None:
         center = {"lat": 43.7, "lon": -79.4}
 
+    # Use dark-mode friendly map style by default
+    if map_style == "carto-positron":
+        map_style = "carto-darkmatter"
+
     # If no geojson provided, create a placeholder map
     if geojson is None or not data:
         fig = go.Figure(go.Scattermapbox())
@@ -51,6 +55,9 @@
             margin={"l": 0, "r": 0, "t": 40, "b": 0},
             title=title or "Toronto Housing Map",
             height=500,
+            paper_bgcolor="rgba(0,0,0,0)",
+            plot_bgcolor="rgba(0,0,0,0)",
+            font_color="#c9c9c9",
         )
         fig.add_annotation(
             text="No geometry data available. Complete QGIS digitization to enable map.",
@@ -59,7 +66,7 @@
             x=0.5,
             y=0.5,
             showarrow=False,
-            font={"size": 14, "color": "gray"},
+            font={"size": 14, "color": "#888888"},
         )
         return fig
@@ -68,6 +75,11 @@
     df = pd.DataFrame(data)
 
+    # Use dark-mode friendly map style
+    effective_map_style = (
+        "carto-darkmatter" if map_style == "carto-positron" else map_style
+    )
+
     fig = px.choropleth_mapbox(
         df,
         geojson=geojson,
@@ -76,7 +88,7 @@
         color=color_column,
         color_continuous_scale=color_scale,
         hover_data=hover_data,
-        mapbox_style=map_style,
+        mapbox_style=effective_map_style,
         center=center,
         zoom=zoom,
         opacity=0.7,
@@ -86,10 +98,17 @@
         margin={"l": 0, "r": 0, "t": 40, "b": 0},
         title=title,
         height=500,
+        paper_bgcolor="rgba(0,0,0,0)",
+        plot_bgcolor="rgba(0,0,0,0)",
+        font_color="#c9c9c9",
         coloraxis_colorbar={
-            "title": color_column.replace("_", " ").title(),
+            "title": {
+                "text": color_column.replace("_", " ").title(),
+                "font": {"color": "#c9c9c9"},
+            },
             "thickness": 15,
             "len": 0.7,
+            "tickfont": {"color": "#c9c9c9"},
         },
     )
```
View File

```diff
@@ -69,7 +69,8 @@ def create_metric_card_figure(
         height=120,
         margin={"l": 20, "r": 20, "t": 40, "b": 20},
         paper_bgcolor="rgba(0,0,0,0)",
-        font={"family": "Inter, sans-serif"},
+        plot_bgcolor="rgba(0,0,0,0)",
+        font={"family": "Inter, sans-serif", "color": "#c9c9c9"},
     )
     return fig
```

View File

@@ -38,8 +38,15 @@ def create_price_time_series(
x=0.5, x=0.5,
y=0.5, y=0.5,
showarrow=False, showarrow=False,
font={"color": "#888888"},
)
fig.update_layout(
title=title,
height=350,
paper_bgcolor="rgba(0,0,0,0)",
plot_bgcolor="rgba(0,0,0,0)",
font_color="#c9c9c9",
) )
fig.update_layout(title=title, height=350)
return fig return fig
df = pd.DataFrame(data) df = pd.DataFrame(data)
@@ -69,6 +76,11 @@ def create_price_time_series(
yaxis_tickprefix="$", yaxis_tickprefix="$",
yaxis_tickformat=",", yaxis_tickformat=",",
hovermode="x unified", hovermode="x unified",
paper_bgcolor="rgba(0,0,0,0)",
plot_bgcolor="rgba(0,0,0,0)",
font_color="#c9c9c9",
xaxis={"gridcolor": "#333333", "linecolor": "#444444"},
yaxis={"gridcolor": "#333333", "linecolor": "#444444"},
) )
return fig return fig
@@ -106,8 +118,15 @@ def create_volume_time_series(
x=0.5, x=0.5,
y=0.5, y=0.5,
showarrow=False, showarrow=False,
font={"color": "#888888"},
)
fig.update_layout(
title=title,
height=350,
paper_bgcolor="rgba(0,0,0,0)",
plot_bgcolor="rgba(0,0,0,0)",
font_color="#c9c9c9",
) )
fig.update_layout(title=title, height=350)
return fig return fig
df = pd.DataFrame(data) df = pd.DataFrame(data)
@@ -153,6 +172,11 @@ def create_volume_time_series(
yaxis_title=volume_column.replace("_", " ").title(), yaxis_title=volume_column.replace("_", " ").title(),
yaxis_tickformat=",", yaxis_tickformat=",",
hovermode="x unified", hovermode="x unified",
paper_bgcolor="rgba(0,0,0,0)",
plot_bgcolor="rgba(0,0,0,0)",
font_color="#c9c9c9",
xaxis={"gridcolor": "#333333", "linecolor": "#444444"},
yaxis={"gridcolor": "#333333", "linecolor": "#444444"},
) )
return fig return fig
@@ -187,8 +211,15 @@ def create_market_comparison_chart(
         x=0.5,
         y=0.5,
         showarrow=False,
+        font={"color": "#888888"},
+    )
+    fig.update_layout(
+        title=title,
+        height=400,
+        paper_bgcolor="rgba(0,0,0,0)",
+        plot_bgcolor="rgba(0,0,0,0)",
+        font_color="#c9c9c9",
     )
-    fig.update_layout(title=title, height=400)
     return fig

     if metrics is None:
@@ -221,12 +252,18 @@ def create_market_comparison_chart(
         height=400,
         margin={"l": 40, "r": 40, "t": 50, "b": 40},
         hovermode="x unified",
+        paper_bgcolor="rgba(0,0,0,0)",
+        plot_bgcolor="rgba(0,0,0,0)",
+        font_color="#c9c9c9",
+        xaxis={"gridcolor": "#333333", "linecolor": "#444444"},
+        yaxis={"gridcolor": "#333333", "linecolor": "#444444"},
         legend={
             "orientation": "h",
             "yanchor": "bottom",
             "y": 1.02,
             "xanchor": "right",
             "x": 1,
+            "font": {"color": "#c9c9c9"},
         },
     )

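The five dark-theme layout keys above are pasted into every factory in this file. A possible follow-up refactor, sketched under the assumption of the same Plotly stack (`DARK_LAYOUT` and `dark_layout` are hypothetical names, not from the repo):

```python
# Shared dark-theme layout fragment (hypothetical refactor sketch).
# Transparent paper/plot backgrounds let the page's CSS theme show through.
DARK_LAYOUT = {
    "paper_bgcolor": "rgba(0,0,0,0)",
    "plot_bgcolor": "rgba(0,0,0,0)",
    "font_color": "#c9c9c9",
    "xaxis": {"gridcolor": "#333333", "linecolor": "#444444"},
    "yaxis": {"gridcolor": "#333333", "linecolor": "#444444"},
}


def dark_layout(**overrides: object) -> dict:
    """Merge per-figure options (title, height, ...) over the shared theme."""
    return {**DARK_LAYOUT, **overrides}
```

Each factory could then call `fig.update_layout(**dark_layout(title=title, height=350))` instead of repeating the keys.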
View File

@@ -2,7 +2,6 @@
 import dash
 import dash_mantine_components as dmc
-from dash_iconify import DashIconify

 dash.register_page(__name__, path="/", name="Home")
@@ -52,19 +51,6 @@ PROJECTS = [
     },
 ]

-SOCIAL_LINKS = [
-    {
-        "platform": "LinkedIn",
-        "url": "https://linkedin.com/in/leobmiranda",
-        "icon": "mdi:linkedin",
-    },
-    {
-        "platform": "GitHub",
-        "url": "https://github.com/leomiranda",
-        "icon": "mdi:github",
-    },
-]

 AVAILABILITY = "Open to Senior Data Analyst, Analytics Engineer, and BI Developer opportunities in Toronto or remote."
@@ -160,27 +146,6 @@ def create_projects_section() -> dmc.Paper:
     )

-def create_social_links() -> dmc.Group:
-    """Create social media links."""
-    return dmc.Group(
-        [
-            dmc.Anchor(
-                dmc.Button(
-                    link["platform"],
-                    leftSection=DashIconify(icon=link["icon"], width=20),
-                    variant="outline",
-                    size="md",
-                ),
-                href=link["url"],
-                target="_blank",
-            )
-            for link in SOCIAL_LINKS
-        ],
-        justify="center",
-        gap="md",
-    )

 def create_availability_section() -> dmc.Text:
     """Create the availability statement."""
     return dmc.Text(AVAILABILITY, size="sm", c="dimmed", ta="center", fs="italic")
@@ -193,7 +158,6 @@ layout = dmc.Container(
         create_summary_section(),
         create_tech_stack_section(),
         create_projects_section(),
-        create_social_links(),
         dmc.Divider(my="lg"),
         create_availability_section(),
         dmc.Space(h=40),

View File

@@ -3,6 +3,7 @@
 import dash
 import dash_mantine_components as dmc
 from dash import dcc, html
+from dash_iconify import DashIconify

 from portfolio_app.components import (
     create_map_controls,
@@ -76,6 +77,17 @@ def create_header() -> dmc.Group:
             ),
             dmc.Group(
                 [
+                    dcc.Link(
+                        dmc.Button(
+                            "Methodology",
+                            leftSection=DashIconify(
+                                icon="tabler:info-circle", width=18
+                            ),
+                            variant="subtle",
+                            color="gray",
+                        ),
+                        href="/toronto/methodology",
+                    ),
                     create_year_selector(
                         id_prefix="toronto",
                         min_year=2020,

View File

@@ -2,7 +2,8 @@
 import dash
 import dash_mantine_components as dmc
-from dash import html
+from dash import dcc, html
+from dash_iconify import DashIconify

 dash.register_page(
     __name__,
@@ -18,8 +19,18 @@ def layout() -> dmc.Container:
         size="md",
         py="xl",
         children=[
+            # Back to Dashboard button
+            dcc.Link(
+                dmc.Button(
+                    "Back to Dashboard",
+                    leftSection=DashIconify(icon="tabler:arrow-left", width=18),
+                    variant="subtle",
+                    color="gray",
+                ),
+                href="/toronto",
+            ),
             # Header
-            dmc.Title("Methodology", order=1, mb="lg"),
+            dmc.Title("Methodology", order=1, mb="lg", mt="md"),
             dmc.Text(
                 "This page documents the data sources, processing methodology, "
                 "and known limitations of the Toronto Housing Dashboard.",