feat(marketplace): command consolidation + 8 new plugins (v8.1.0 → v9.0.0) [BREAKING]

Phase 1b: Rename all ~94 commands across 12 plugins to the /<noun> <action>
sub-command pattern. Git-flow consolidated from 8→5 commands (commit
variants absorbed into --push/--merge/--sync flags). Dispatch files,
name: frontmatter, and cross-reference updates for all plugins.

Phase 2: Design documents for 8 new plugins in docs/designs/.

Phase 3: Scaffold 8 new plugins — saas-api-platform, saas-db-migrate,
saas-react-platform, saas-test-pilot, data-seed, ops-release-manager,
ops-deploy-pipeline, debug-mcp. Each with plugin.json, commands, agents,
skills, README, and claude-md-integration. Marketplace grows from 12→20.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-06 14:52:11 -05:00
parent 5098422858
commit 2d51df7a42
321 changed files with 13582 additions and 1019 deletions
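The rename is a fixed old-name → new-name mapping rather than a single mechanical rule, since two commands (`/dbt-test`, `/lineage-viz`) gain the `data` prefix while keeping their hyphen. A minimal sketch of a migration helper for updating stray references in docs or scripts — the `migrate` function is hypothetical, not part of this commit; the mapping itself is taken from the diffs below:

```python
# Hypothetical migration helper: rewrites old hyphenated data-platform
# command names to the new "/data <action>" sub-command form.
import re

# Old -> new names as renamed in this commit (data-platform plugin only).
RENAMES = {
    "/data-ingest": "/data ingest",
    "/data-profile": "/data profile",
    "/data-schema": "/data schema",
    "/data-explain": "/data explain",
    "/data-lineage": "/data lineage",
    "/data-run": "/data run",
    "/data-quality": "/data quality",
    "/data-review": "/data review",
    "/data-gate": "/data gate",
    "/data-setup": "/data setup",
    "/dbt-test": "/data dbt-test",        # gains prefix, keeps hyphen
    "/lineage-viz": "/data lineage-viz",  # gains prefix, keeps hyphen
}

# Longest-first alternation so no old name can shadow a longer one.
_PATTERN = re.compile(
    "|".join(re.escape(k) for k in sorted(RENAMES, key=len, reverse=True))
)

def migrate(text: str) -> str:
    """Replace every old command invocation in text with its new form."""
    return _PATTERN.sub(lambda m: RENAMES[m.group(0)], text)
```

For example, `migrate("Run /data-review for full audit report.")` yields the new wording used throughout the updated docs.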

View File

@@ -23,8 +23,8 @@ You are a strict data integrity auditor. Your role is to review code for proper
## Trigger Conditions
Activate this agent when:
-- User runs `/data-review <path>`
-- User runs `/data-gate <path>`
+- User runs `/data review <path>`
+- User runs `/data gate <path>`
- Projman orchestrator requests data domain gate check
- Code review includes database operations, dbt models, or data pipelines
@@ -78,7 +78,7 @@ Activate this agent when:
### Review Mode (default)
-Triggered by `/data-review <path>`
+Triggered by `/data review <path>`
**Characteristics:**
- Produces detailed report with all findings
@@ -89,7 +89,7 @@ Triggered by `/data-review <path>`
### Gate Mode
-Triggered by `/data-gate <path>` or projman orchestrator domain gate
+Triggered by `/data gate <path>` or projman orchestrator domain gate
**Characteristics:**
- Binary PASS/FAIL output
@@ -203,7 +203,7 @@ Blocking Issues (2):
2. portfolio_app/toronto/loaders/census.py:67 - References table 'census_raw' which does not exist
Fix: Table was renamed to 'census_demographics' in migration 003.
-Run /data-review for full audit report.
+Run /data review for full audit report.
```
### Review Mode Output
@@ -292,7 +292,7 @@ When called as a domain gate by projman orchestrator:
## Example Interactions
-**User**: `/data-review dbt/models/staging/`
+**User**: `/data review dbt/models/staging/`
**Agent**:
1. Scans all .sql files in staging/
2. Runs dbt_parse to validate project
@@ -301,7 +301,7 @@ When called as a domain gate by projman orchestrator:
5. Cross-references test coverage
6. Returns detailed report
-**User**: `/data-gate portfolio_app/toronto/`
+**User**: `/data gate portfolio_app/toronto/`
**Agent**:
1. Scans for Python files with pg_query/pg_execute
2. Checks if referenced tables exist

View File

@@ -18,12 +18,12 @@ This project uses the data-platform plugin for data engineering workflows.
| Command | Purpose |
|---------|---------|
-| `/data-ingest` | Load data from files or database |
-| `/data-profile` | Generate statistical profile |
-| `/data-schema` | Show schema information |
-| `/data-explain` | Explain dbt model |
-| `/data-lineage` | Show data lineage |
-| `/data-run` | Execute dbt models |
+| `/data ingest` | Load data from files or database |
+| `/data profile` | Generate statistical profile |
+| `/data schema` | Show schema information |
+| `/data explain` | Explain dbt model |
+| `/data lineage` | Show data lineage |
+| `/data run` | Execute dbt models |
### data_ref Convention
@@ -36,9 +36,9 @@ DataFrames are stored with references. Use meaningful names:
### dbt Workflow
-1. Always validate before running: `/data-run` includes automatic `dbt_parse`
+1. Always validate before running: `/data run` includes automatic `dbt_parse`
2. For dbt 1.9+, check for deprecated syntax before commits
-3. Use `/data-lineage` to understand impact of changes
+3. Use `/data lineage` to understand impact of changes
### Database Access
@@ -69,22 +69,22 @@ DATA_PLATFORM_MAX_ROWS=100000
### Data Exploration
```
-/data-ingest data/raw_customers.csv
-/data-profile raw_customers
-/data-schema
+/data ingest data/raw_customers.csv
+/data profile raw_customers
+/data schema
```
### ETL Development
```
-/data-schema orders              # Understand source
-/data-explain stg_orders         # Understand transformation
-/data-run stg_orders             # Test the model
-/data-lineage fct_orders         # Check downstream impact
+/data schema orders              # Understand source
+/data explain stg_orders         # Understand transformation
+/data run stg_orders             # Test the model
+/data lineage fct_orders         # Check downstream impact
```
### Database Analysis
```
-/data-schema                     # List all tables
+/data schema                     # List all tables
pg_columns orders # Detailed schema
st_tables # Find spatial data
```
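For a rough idea of what the "statistical profile" from `/data profile` covers, a pure-Python sketch of per-column statistics follows; the function name and exact fields are illustrative only, since the plugin's real output format is not shown in this diff:

```python
# Illustrative per-column profile, roughly the kind of statistics a
# /data profile report would include (hypothetical helper, stdlib only).
from statistics import mean

def profile_column(values):
    """Return basic stats for one column; None entries count as nulls."""
    present = [v for v in values if v is not None]
    return {
        "count": len(values),              # total rows
        "nulls": len(values) - len(present),
        "min": min(present),
        "max": max(present),
        "mean": mean(present),
    }
```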

View File

@@ -1,4 +1,8 @@
-# /dbt-test - Run dbt Tests
+---
+name: data dbt-test
+---
+# /data dbt-test - Run dbt Tests
## Skills to Load
- skills/dbt-workflow.md
@@ -12,7 +16,7 @@ Display header: `DATA-PLATFORM - dbt Tests`
## Usage
```
-/dbt-test [selection] [--warn-only]
+/data dbt-test [selection] [--warn-only]
```
## Workflow
@@ -32,9 +36,9 @@ Execute `skills/dbt-workflow.md` test workflow:
## Examples
```
-/dbt-test                        # Run all tests
-/dbt-test dim_customers          # Tests for specific model
-/dbt-test tag:critical           # Run critical tests only
+/data dbt-test                   # Run all tests
+/data dbt-test dim_customers     # Tests for specific model
+/data dbt-test tag:critical      # Run critical tests only
```
## Required MCP Tools

View File

@@ -1,4 +1,8 @@
-# /data-explain - dbt Model Explanation
+---
+name: data explain
+---
+# /data explain - dbt Model Explanation
## Skills to Load
- skills/dbt-workflow.md
@@ -13,7 +17,7 @@ Display header: `DATA-PLATFORM - Model Explanation`
## Usage
```
-/data-explain <model_name>
+/data explain <model_name>
```
## Workflow
@@ -26,8 +30,8 @@ Display header: `DATA-PLATFORM - Model Explanation`
## Examples
```
-/data-explain dim_customers
-/data-explain fct_orders
+/data explain dim_customers
+/data explain fct_orders
```
## Required MCP Tools

View File

@@ -1,4 +1,5 @@
---
+name: data gate
description: Data integrity compliance gate (pass/fail) for sprint execution
gate_contract: v1
arguments:
@@ -7,21 +8,21 @@ arguments:
required: true
---
-# /data-gate
+# /data gate
Binary pass/fail validation for data integrity compliance. Used by projman orchestrator during sprint execution to gate issue completion.
## Usage
```
-/data-gate <path>
+/data gate <path>
```
**Examples:**
```
-/data-gate ./dbt/models/staging/
-/data-gate ./portfolio_app/toronto/parsers/
-/data-gate ./dbt/
+/data gate ./dbt/models/staging/
+/data gate ./portfolio_app/toronto/parsers/
+/data gate ./dbt/
```
## What It Does
@@ -63,7 +64,7 @@ Blocking Issues (2):
2. portfolio_app/toronto/loaders/census.py:67 - References table 'census_raw' which does not exist
Fix: Table was renamed to 'census_demographics' in migration 003.
-Run /data-review for full audit report.
+Run /data review for full audit report.
```
## Integration with projman
@@ -78,9 +79,9 @@ This command is automatically invoked by the projman orchestrator when:
- PASS: Issue can be marked complete
- FAIL: Issue stays open, blocker comment added with failure details
-## Differences from /data-review
+## Differences from /data review
-| Aspect | /data-gate | /data-review |
+| Aspect | /data gate | /data review |
|--------|------------|--------------|
| Output | Binary PASS/FAIL | Detailed report with all severities |
| Severity | FAIL only | FAIL + WARN + INFO |
@@ -95,7 +96,7 @@ This command is automatically invoked by the projman orchestrator when:
- **Quick validation**: Fast pass/fail without full report
- **Pre-merge checks**: Verify data changes before integration
-For detailed findings including warnings and suggestions, use `/data-review` instead.
+For detailed findings including warnings and suggestions, use `/data review` instead.
## Requirements

View File

@@ -1,4 +1,8 @@
-# /data-ingest - Data Ingestion
+---
+name: data ingest
+---
+# /data ingest - Data Ingestion
## Skills to Load
- skills/mcp-tools-reference.md
@@ -11,7 +15,7 @@ Display header: `DATA-PLATFORM - Ingest`
## Usage
```
-/data-ingest [source]
+/data ingest [source]
```
## Workflow
@@ -31,9 +35,9 @@ Display header: `DATA-PLATFORM - Ingest`
## Examples
```
-/data-ingest data/sales.csv
-/data-ingest data/customers.parquet
-/data-ingest "SELECT * FROM orders WHERE created_at > '2024-01-01'"
+/data ingest data/sales.csv
+/data ingest data/customers.parquet
+/data ingest "SELECT * FROM orders WHERE created_at > '2024-01-01'"
```
## Required MCP Tools

View File

@@ -1,4 +1,8 @@
-# /lineage-viz - Mermaid Lineage Visualization
+---
+name: data lineage-viz
+---
+# /data lineage-viz - Mermaid Lineage Visualization
## Skills to Load
- skills/lineage-analysis.md
@@ -12,7 +16,7 @@ Display header: `DATA-PLATFORM - Lineage Visualization`
## Usage
```
-/lineage-viz <model_name> [--direction TB|LR] [--depth N]
+/data lineage-viz <model_name> [--direction TB|LR] [--depth N]
```
## Workflow
@@ -31,9 +35,9 @@ Display header: `DATA-PLATFORM - Lineage Visualization`
## Examples
```
-/lineage-viz dim_customers
-/lineage-viz fct_orders --direction TB
-/lineage-viz rpt_revenue --depth 2
+/data lineage-viz dim_customers
+/data lineage-viz fct_orders --direction TB
+/data lineage-viz rpt_revenue --depth 2
```
## Required MCP Tools

View File

@@ -1,4 +1,8 @@
-# /data-lineage - Data Lineage Visualization
+---
+name: data lineage
+---
+# /data lineage - Data Lineage Visualization
## Skills to Load
- skills/lineage-analysis.md
@@ -12,7 +16,7 @@ Display header: `DATA-PLATFORM - Lineage`
## Usage
```
-/data-lineage <model_name> [--depth N]
+/data lineage <model_name> [--depth N]
```
## Workflow
@@ -25,8 +29,8 @@ Display header: `DATA-PLATFORM - Lineage`
## Examples
```
-/data-lineage dim_customers
-/data-lineage fct_orders --depth 3
+/data lineage dim_customers
+/data lineage fct_orders --depth 3
```
## Required MCP Tools

View File

@@ -1,4 +1,8 @@
-# /data-profile - Data Profiling
+---
+name: data profile
+---
+# /data profile - Data Profiling
## Skills to Load
- skills/data-profiling.md
@@ -12,7 +16,7 @@ Display header: `DATA-PLATFORM - Data Profile`
## Usage
```
-/data-profile <data_ref>
+/data profile <data_ref>
```
## Workflow
@@ -27,8 +31,8 @@ Execute `skills/data-profiling.md` profiling workflow:
## Examples
```
-/data-profile sales_data
-/data-profile df_a1b2c3d4
+/data profile sales_data
+/data profile df_a1b2c3d4
```
## Required MCP Tools

View File

@@ -1,4 +1,8 @@
-# /data-quality - Data Quality Assessment
+---
+name: data quality
+---
+# /data quality - Data Quality Assessment
## Skills to Load
- skills/data-profiling.md
@@ -12,7 +16,7 @@ Display header: `DATA-PLATFORM - Data Quality`
## Usage
```
-/data-quality <data_ref> [--strict]
+/data quality <data_ref> [--strict]
```
## Workflow
@@ -33,8 +37,8 @@ Execute `skills/data-profiling.md` quality assessment:
## Examples
```
-/data-quality sales_data
-/data-quality df_customers --strict
+/data quality sales_data
+/data quality df_customers --strict
```
## Quality Thresholds

View File

@@ -1,4 +1,5 @@
---
+name: data review
description: Audit data integrity, schema validity, and dbt compliance
arguments:
- name: path
@@ -6,21 +7,21 @@ arguments:
required: true
---
-# /data-review
+# /data review
Comprehensive data integrity audit producing a detailed report with findings at all severity levels. For human review and standalone codebase auditing.
## Usage
```
-/data-review <path>
+/data review <path>
```
**Examples:**
```
-/data-review ./dbt/
-/data-review ./portfolio_app/toronto/
-/data-review ./dbt/models/marts/
+/data review ./dbt/
+/data review ./portfolio_app/toronto/
+/data review ./dbt/models/marts/
```
## What It Does
@@ -79,46 +80,46 @@ VERDICT: PASS | FAIL (N blocking issues)
### Before Sprint Planning
Audit data layer health to identify tech debt and inform sprint scope.
```
-/data-review ./dbt/
+/data review ./dbt/
```
### During Code Review
Get detailed data integrity findings alongside code review comments.
```
-/data-review ./dbt/models/staging/stg_new_source.sql
+/data review ./dbt/models/staging/stg_new_source.sql
```
### After Migrations
Verify schema changes didn't break anything downstream.
```
-/data-review ./migrations/
+/data review ./migrations/
```
### Periodic Health Checks
Regular data infrastructure audits for proactive maintenance.
```
-/data-review ./data_pipeline/
+/data review ./data_pipeline/
```
### New Project Onboarding
Understand the current state of data architecture.
```
-/data-review .
+/data review .
```
## Severity Levels
| Level | Meaning | Gate Impact |
|-------|---------|-------------|
-| **FAIL** | Blocking issues that will cause runtime errors | Would block `/data-gate` |
+| **FAIL** | Blocking issues that will cause runtime errors | Would block `/data gate` |
| **WARN** | Quality issues that should be addressed | Does not block gate |
| **INFO** | Suggestions for improvement | Does not block gate |
-## Differences from /data-gate
+## Differences from /data gate
-`/data-review` gives you the full picture. `/data-gate` gives the orchestrator a yes/no.
+`/data review` gives you the full picture. `/data gate` gives the orchestrator a yes/no.
-| Aspect | /data-gate | /data-review |
+| Aspect | /data gate | /data review |
|--------|------------|--------------|
| Output | Binary PASS/FAIL | Detailed report |
| Severity | FAIL only | FAIL + WARN + INFO |
@@ -126,8 +127,8 @@ Understand the current state of data architecture.
| Verbosity | Minimal | Comprehensive |
| Speed | Fast (skips INFO) | Thorough |
-Use `/data-review` when you want to understand.
-Use `/data-gate` when you want to automate.
+Use `/data review` when you want to understand.
+Use `/data gate` when you want to automate.
## Requirements
@@ -144,6 +145,6 @@ Use `/data-gate` when you want to automate.
## Related Commands
-- `/data-gate` - Binary pass/fail for automation
-- `/data-lineage` - Visualize dbt model dependencies
-- `/data-schema` - Explore database schema
+- `/data gate` - Binary pass/fail for automation
+- `/data lineage` - Visualize dbt model dependencies
+- `/data schema` - Explore database schema

View File

@@ -1,4 +1,8 @@
-# /data-run - Execute dbt Models
+---
+name: data run
+---
+# /data run - Execute dbt Models
## Skills to Load
- skills/dbt-workflow.md
@@ -12,7 +16,7 @@ Display header: `DATA-PLATFORM - dbt Run`
## Usage
```
-/data-run [model_selection] [--full-refresh]
+/data run [model_selection] [--full-refresh]
```
## Workflow
@@ -30,11 +34,11 @@ See `skills/dbt-workflow.md` for full selection patterns.
## Examples
```
-/data-run                        # Run all models
-/data-run dim_customers          # Run specific model
-/data-run +fct_orders            # Run model and upstream
-/data-run tag:daily              # Run models with tag
-/data-run --full-refresh         # Rebuild incremental models
+/data run                        # Run all models
+/data run dim_customers          # Run specific model
+/data run +fct_orders            # Run model and upstream
+/data run tag:daily              # Run models with tag
+/data run --full-refresh         # Rebuild incremental models
```
## Required MCP Tools

View File

@@ -1,4 +1,8 @@
-# /data-schema - Schema Exploration
+---
+name: data schema
+---
+# /data schema - Schema Exploration
## Skills to Load
- skills/mcp-tools-reference.md
@@ -11,7 +15,7 @@ Display header: `DATA-PLATFORM - Schema Explorer`
## Usage
```
-/data-schema [table_name | data_ref]
+/data schema [table_name | data_ref]
```
## Workflow
@@ -30,9 +34,9 @@ Display header: `DATA-PLATFORM - Schema Explorer`
## Examples
```
-/data-schema                     # List all tables and DataFrames
-/data-schema customers           # Show table schema
-/data-schema sales_data          # Show DataFrame schema
+/data schema                     # List all tables and DataFrames
+/data schema customers           # Show table schema
+/data schema sales_data          # Show DataFrame schema
```
## Required MCP Tools

View File

@@ -1,4 +1,8 @@
-# /data-setup - Data Platform Setup Wizard
+---
+name: data setup
+---
+# /data setup - Data Platform Setup Wizard
## Skills to Load
- skills/setup-workflow.md
@@ -11,7 +15,7 @@ Display header: `DATA-PLATFORM - Setup Wizard`
## Usage
```
-/data-setup
+/data setup
```
## Workflow

View File

@@ -0,0 +1,24 @@
+---
+description: Data engineering tools with pandas, PostgreSQL, and dbt
+---
+# /data
+Data engineering tools with pandas, PostgreSQL/PostGIS, and dbt integration.
+## Sub-commands
+| Sub-command | Description |
+|-------------|-------------|
+| `/data ingest` | Load data from CSV, Parquet, JSON into DataFrame |
+| `/data profile` | Generate data profiling report with statistics |
+| `/data schema` | Explore database schemas, tables, columns |
+| `/data explain` | Explain a dbt model |
+| `/data lineage` | Show dbt model lineage and dependencies |
+| `/data lineage-viz` | dbt lineage visualization as Mermaid diagrams |
+| `/data run` | Run dbt models with validation |
+| `/data dbt-test` | Formatted dbt test runner with summary |
+| `/data quality` | DataFrame quality checks |
+| `/data review` | Comprehensive data integrity audits |
+| `/data gate` | Binary pass/fail data integrity gates |
+| `/data setup` | Setup wizard for data-platform MCP servers |

View File

@@ -215,7 +215,7 @@ Blocking Issues (N):
2. <location> - <violation description>
Fix: <actionable fix>
-Run /data-review for full audit report.
+Run /data review for full audit report.
```
### Review Mode (Detailed)
@@ -293,7 +293,7 @@ Do not flag violations in:
When called as a domain gate:
1. Orchestrator detects `Domain/Data` label on issue
2. Orchestrator identifies changed files
-3. Orchestrator invokes `/data-gate <path>`
+3. Orchestrator invokes `/data gate <path>`
4. Agent runs gate mode scan
5. Returns PASS/FAIL to orchestrator
6. Orchestrator decides whether to complete issue
@@ -301,7 +301,7 @@ When called as a domain gate:
### Standalone Usage
For manual audits:
-1. User runs `/data-review <path>`
+1. User runs `/data review <path>`
2. Agent runs full review mode scan
3. Returns detailed report with all severity levels
4. User decides on actions
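The gate contract above (PASS → complete the issue, FAIL → keep it open with a blocker comment) can be sketched as orchestrator-side logic; the `Issue` class and method names here are invented stand-ins, since the commit only defines the PASS/FAIL semantics, not the tracker API:

```python
# Hypothetical orchestrator-side handling of a /data gate verdict.
from dataclasses import dataclass, field

@dataclass
class Issue:
    """Minimal stand-in for a tracker issue (not a real API)."""
    complete: bool = False
    comments: list = field(default_factory=list)

def handle_gate_result(issue: Issue, verdict: str, details: str = "") -> None:
    """Apply the gate contract: PASS completes the issue, FAIL blocks it."""
    if verdict == "PASS":
        issue.complete = True  # issue can be marked complete
    else:
        # Issue stays open; blocker comment carries the failure details.
        issue.comments.append(f"Blocked by /data gate:\n{details}")
```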