feat(marketplace): command consolidation + 8 new plugins (v8.1.0 → v9.0.0) [BREAKING]

Phase 1b: Rename all ~94 commands across 12 plugins to the /<noun> <action>
sub-command pattern. Git-flow is consolidated from 8→5 commands (commit
variants absorbed into --push/--merge/--sync flags). Adds dispatch files,
`name:` frontmatter, and cross-reference updates across all plugins.
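
The per-command change is mechanical; for example, the former `/dbt-test`
(taken from the diff below) gains a `name:` frontmatter block carrying its
new sub-command identity:

```markdown
---
name: data dbt-test
---
# /data dbt-test - Run dbt Tests
```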

Phase 2: Design documents for 8 new plugins in docs/designs/.

Phase 3: Scaffold 8 new plugins — saas-api-platform, saas-db-migrate,
saas-react-platform, saas-test-pilot, data-seed, ops-release-manager,
ops-deploy-pipeline, debug-mcp. Each with plugin.json, commands, agents,
skills, README, and claude-md-integration. Marketplace grows from 12→20.
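
Each scaffold follows the same shape. A sketch of the expected layout — the
directory arrangement is illustrative; only the component names come from
this commit:

```
saas-api-platform/
├── plugin.json            # plugin manifest
├── commands/              # /<noun> <action> command files
├── agents/
├── skills/
├── README
└── claude-md-integration
```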

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-06 14:52:11 -05:00
parent 5098422858
commit 2d51df7a42
321 changed files with 13582 additions and 1019 deletions

View File

@@ -1,4 +1,8 @@
-# /dbt-test - Run dbt Tests
+---
+name: data dbt-test
+---
+# /data dbt-test - Run dbt Tests
 ## Skills to Load
 - skills/dbt-workflow.md
@@ -12,7 +16,7 @@ Display header: `DATA-PLATFORM - dbt Tests`
 ## Usage
 ```
-/dbt-test [selection] [--warn-only]
+/data dbt-test [selection] [--warn-only]
 ```
 ## Workflow
@@ -32,9 +36,9 @@ Execute `skills/dbt-workflow.md` test workflow:
 ## Examples
 ```
-/dbt-test               # Run all tests
-/dbt-test dim_customers # Tests for specific model
-/dbt-test tag:critical  # Run critical tests only
+/data dbt-test               # Run all tests
+/data dbt-test dim_customers # Tests for specific model
+/data dbt-test tag:critical  # Run critical tests only
 ```
 ## Required MCP Tools

View File

@@ -1,4 +1,8 @@
-# /data-explain - dbt Model Explanation
+---
+name: data explain
+---
+# /data explain - dbt Model Explanation
 ## Skills to Load
 - skills/dbt-workflow.md
@@ -13,7 +17,7 @@ Display header: `DATA-PLATFORM - Model Explanation`
 ## Usage
 ```
-/data-explain <model_name>
+/data explain <model_name>
 ```
 ## Workflow
@@ -26,8 +30,8 @@ Display header: `DATA-PLATFORM - Model Explanation`
 ## Examples
 ```
-/data-explain dim_customers
-/data-explain fct_orders
+/data explain dim_customers
+/data explain fct_orders
 ```
 ## Required MCP Tools

View File

@@ -1,4 +1,5 @@
 ---
+name: data gate
 description: Data integrity compliance gate (pass/fail) for sprint execution
 gate_contract: v1
 arguments:
@@ -7,21 +8,21 @@ arguments:
 required: true
 ---
-# /data-gate
+# /data gate
 Binary pass/fail validation for data integrity compliance. Used by projman orchestrator during sprint execution to gate issue completion.
 ## Usage
 ```
-/data-gate <path>
+/data gate <path>
 ```
 **Examples:**
 ```
-/data-gate ./dbt/models/staging/
-/data-gate ./portfolio_app/toronto/parsers/
-/data-gate ./dbt/
+/data gate ./dbt/models/staging/
+/data gate ./portfolio_app/toronto/parsers/
+/data gate ./dbt/
 ```
 ## What It Does
@@ -63,7 +64,7 @@ Blocking Issues (2):
 2. portfolio_app/toronto/loaders/census.py:67 - References table 'census_raw' which does not exist
 Fix: Table was renamed to 'census_demographics' in migration 003.
-Run /data-review for full audit report.
+Run /data review for full audit report.
 ```
 ## Integration with projman
@@ -78,9 +79,9 @@ This command is automatically invoked by the projman orchestrator when:
 - PASS: Issue can be marked complete
 - FAIL: Issue stays open, blocker comment added with failure details
-## Differences from /data-review
+## Differences from /data review
-| Aspect | /data-gate | /data-review |
+| Aspect | /data gate | /data review |
 |--------|------------|--------------|
 | Output | Binary PASS/FAIL | Detailed report with all severities |
 | Severity | FAIL only | FAIL + WARN + INFO |
@@ -95,7 +96,7 @@ This command is automatically invoked by the projman orchestrator when:
 - **Quick validation**: Fast pass/fail without full report
 - **Pre-merge checks**: Verify data changes before integration
-For detailed findings including warnings and suggestions, use `/data-review` instead.
+For detailed findings including warnings and suggestions, use `/data review` instead.
 ## Requirements

View File

@@ -1,4 +1,8 @@
-# /data-ingest - Data Ingestion
+---
+name: data ingest
+---
+# /data ingest - Data Ingestion
 ## Skills to Load
 - skills/mcp-tools-reference.md
@@ -11,7 +15,7 @@ Display header: `DATA-PLATFORM - Ingest`
 ## Usage
 ```
-/data-ingest [source]
+/data ingest [source]
 ```
 ## Workflow
@@ -31,9 +35,9 @@ Display header: `DATA-PLATFORM - Ingest`
 ## Examples
 ```
-/data-ingest data/sales.csv
-/data-ingest data/customers.parquet
-/data-ingest "SELECT * FROM orders WHERE created_at > '2024-01-01'"
+/data ingest data/sales.csv
+/data ingest data/customers.parquet
+/data ingest "SELECT * FROM orders WHERE created_at > '2024-01-01'"
 ```
 ## Required MCP Tools

View File

@@ -1,4 +1,8 @@
-# /lineage-viz - Mermaid Lineage Visualization
+---
+name: data lineage-viz
+---
+# /data lineage-viz - Mermaid Lineage Visualization
 ## Skills to Load
 - skills/lineage-analysis.md
@@ -12,7 +16,7 @@ Display header: `DATA-PLATFORM - Lineage Visualization`
 ## Usage
 ```
-/lineage-viz <model_name> [--direction TB|LR] [--depth N]
+/data lineage-viz <model_name> [--direction TB|LR] [--depth N]
 ```
 ## Workflow
@@ -31,9 +35,9 @@ Display header: `DATA-PLATFORM - Lineage Visualization`
 ## Examples
 ```
-/lineage-viz dim_customers
-/lineage-viz fct_orders --direction TB
-/lineage-viz rpt_revenue --depth 2
+/data lineage-viz dim_customers
+/data lineage-viz fct_orders --direction TB
+/data lineage-viz rpt_revenue --depth 2
 ```
 ## Required MCP Tools

View File

@@ -1,4 +1,8 @@
-# /data-lineage - Data Lineage Visualization
+---
+name: data lineage
+---
+# /data lineage - Data Lineage Visualization
 ## Skills to Load
 - skills/lineage-analysis.md
@@ -12,7 +16,7 @@ Display header: `DATA-PLATFORM - Lineage`
 ## Usage
 ```
-/data-lineage <model_name> [--depth N]
+/data lineage <model_name> [--depth N]
 ```
 ## Workflow
@@ -25,8 +29,8 @@ Display header: `DATA-PLATFORM - Lineage`
 ## Examples
 ```
-/data-lineage dim_customers
-/data-lineage fct_orders --depth 3
+/data lineage dim_customers
+/data lineage fct_orders --depth 3
 ```
 ## Required MCP Tools

View File

@@ -1,4 +1,8 @@
-# /data-profile - Data Profiling
+---
+name: data profile
+---
+# /data profile - Data Profiling
 ## Skills to Load
 - skills/data-profiling.md
@@ -12,7 +16,7 @@ Display header: `DATA-PLATFORM - Data Profile`
 ## Usage
 ```
-/data-profile <data_ref>
+/data profile <data_ref>
 ```
 ## Workflow
@@ -27,8 +31,8 @@ Execute `skills/data-profiling.md` profiling workflow:
 ## Examples
 ```
-/data-profile sales_data
-/data-profile df_a1b2c3d4
+/data profile sales_data
+/data profile df_a1b2c3d4
 ```
 ## Required MCP Tools

View File

@@ -1,4 +1,8 @@
-# /data-quality - Data Quality Assessment
+---
+name: data quality
+---
+# /data quality - Data Quality Assessment
 ## Skills to Load
 - skills/data-profiling.md
@@ -12,7 +16,7 @@ Display header: `DATA-PLATFORM - Data Quality`
 ## Usage
 ```
-/data-quality <data_ref> [--strict]
+/data quality <data_ref> [--strict]
 ```
 ## Workflow
@@ -33,8 +37,8 @@ Execute `skills/data-profiling.md` quality assessment:
 ## Examples
 ```
-/data-quality sales_data
-/data-quality df_customers --strict
+/data quality sales_data
+/data quality df_customers --strict
 ```
 ## Quality Thresholds

View File

@@ -1,4 +1,5 @@
 ---
+name: data review
 description: Audit data integrity, schema validity, and dbt compliance
 arguments:
 - name: path
@@ -6,21 +7,21 @@ arguments:
 required: true
 ---
-# /data-review
+# /data review
 Comprehensive data integrity audit producing a detailed report with findings at all severity levels. For human review and standalone codebase auditing.
 ## Usage
 ```
-/data-review <path>
+/data review <path>
 ```
 **Examples:**
 ```
-/data-review ./dbt/
-/data-review ./portfolio_app/toronto/
-/data-review ./dbt/models/marts/
+/data review ./dbt/
+/data review ./portfolio_app/toronto/
+/data review ./dbt/models/marts/
 ```
 ## What It Does
@@ -79,46 +80,46 @@ VERDICT: PASS | FAIL (N blocking issues)
 ### Before Sprint Planning
 Audit data layer health to identify tech debt and inform sprint scope.
 ```
-/data-review ./dbt/
+/data review ./dbt/
 ```
 ### During Code Review
 Get detailed data integrity findings alongside code review comments.
 ```
-/data-review ./dbt/models/staging/stg_new_source.sql
+/data review ./dbt/models/staging/stg_new_source.sql
 ```
 ### After Migrations
 Verify schema changes didn't break anything downstream.
 ```
-/data-review ./migrations/
+/data review ./migrations/
 ```
 ### Periodic Health Checks
 Regular data infrastructure audits for proactive maintenance.
 ```
-/data-review ./data_pipeline/
+/data review ./data_pipeline/
 ```
 ### New Project Onboarding
 Understand the current state of data architecture.
 ```
-/data-review .
+/data review .
 ```
 ## Severity Levels
 | Level | Meaning | Gate Impact |
 |-------|---------|-------------|
-| **FAIL** | Blocking issues that will cause runtime errors | Would block `/data-gate` |
+| **FAIL** | Blocking issues that will cause runtime errors | Would block `/data gate` |
 | **WARN** | Quality issues that should be addressed | Does not block gate |
 | **INFO** | Suggestions for improvement | Does not block gate |
-## Differences from /data-gate
+## Differences from /data gate
-`/data-review` gives you the full picture. `/data-gate` gives the orchestrator a yes/no.
+`/data review` gives you the full picture. `/data gate` gives the orchestrator a yes/no.
-| Aspect | /data-gate | /data-review |
+| Aspect | /data gate | /data review |
 |--------|------------|--------------|
 | Output | Binary PASS/FAIL | Detailed report |
 | Severity | FAIL only | FAIL + WARN + INFO |
@@ -126,8 +127,8 @@ Understand the current state of data architecture.
 | Verbosity | Minimal | Comprehensive |
 | Speed | Fast (skips INFO) | Thorough |
-Use `/data-review` when you want to understand.
-Use `/data-gate` when you want to automate.
+Use `/data review` when you want to understand.
+Use `/data gate` when you want to automate.
 ## Requirements
@@ -144,6 +145,6 @@ Use `/data-gate` when you want to automate.
 ## Related Commands
-- `/data-gate` - Binary pass/fail for automation
-- `/data-lineage` - Visualize dbt model dependencies
-- `/data-schema` - Explore database schema
+- `/data gate` - Binary pass/fail for automation
+- `/data lineage` - Visualize dbt model dependencies
+- `/data schema` - Explore database schema

View File

@@ -1,4 +1,8 @@
-# /data-run - Execute dbt Models
+---
+name: data run
+---
+# /data run - Execute dbt Models
 ## Skills to Load
 - skills/dbt-workflow.md
@@ -12,7 +16,7 @@ Display header: `DATA-PLATFORM - dbt Run`
 ## Usage
 ```
-/data-run [model_selection] [--full-refresh]
+/data run [model_selection] [--full-refresh]
 ```
 ## Workflow
@@ -30,11 +34,11 @@ See `skills/dbt-workflow.md` for full selection patterns.
 ## Examples
 ```
-/data-run                 # Run all models
-/data-run dim_customers   # Run specific model
-/data-run +fct_orders     # Run model and upstream
-/data-run tag:daily       # Run models with tag
-/data-run --full-refresh  # Rebuild incremental models
+/data run                 # Run all models
+/data run dim_customers   # Run specific model
+/data run +fct_orders     # Run model and upstream
+/data run tag:daily       # Run models with tag
+/data run --full-refresh  # Rebuild incremental models
 ```
 ## Required MCP Tools

View File

@@ -1,4 +1,8 @@
-# /data-schema - Schema Exploration
+---
+name: data schema
+---
+# /data schema - Schema Exploration
 ## Skills to Load
 - skills/mcp-tools-reference.md
@@ -11,7 +15,7 @@ Display header: `DATA-PLATFORM - Schema Explorer`
 ## Usage
 ```
-/data-schema [table_name | data_ref]
+/data schema [table_name | data_ref]
 ```
 ## Workflow
@@ -30,9 +34,9 @@ Display header: `DATA-PLATFORM - Schema Explorer`
 ## Examples
 ```
-/data-schema            # List all tables and DataFrames
-/data-schema customers  # Show table schema
-/data-schema sales_data # Show DataFrame schema
+/data schema            # List all tables and DataFrames
+/data schema customers  # Show table schema
+/data schema sales_data # Show DataFrame schema
 ```
 ## Required MCP Tools

View File

@@ -1,4 +1,8 @@
-# /data-setup - Data Platform Setup Wizard
+---
+name: data setup
+---
+# /data setup - Data Platform Setup Wizard
 ## Skills to Load
 - skills/setup-workflow.md
@@ -11,7 +15,7 @@ Display header: `DATA-PLATFORM - Setup Wizard`
 ## Usage
 ```
-/data-setup
+/data setup
 ```
 ## Workflow

View File

@@ -0,0 +1,24 @@
+---
+description: Data engineering tools with pandas, PostgreSQL, and dbt
+---
+# /data
+Data engineering tools with pandas, PostgreSQL/PostGIS, and dbt integration.
+## Sub-commands
+| Sub-command | Description |
+|-------------|-------------|
+| `/data ingest` | Load data from CSV, Parquet, JSON into DataFrame |
+| `/data profile` | Generate data profiling report with statistics |
+| `/data schema` | Explore database schemas, tables, columns |
+| `/data explain` | Explain query execution plan |
+| `/data lineage` | Show dbt model lineage and dependencies |
+| `/data lineage-viz` | dbt lineage visualization as Mermaid diagrams |
+| `/data run` | Run dbt models with validation |
+| `/data dbt-test` | Formatted dbt test runner with summary |
+| `/data quality` | DataFrame quality checks |
+| `/data review` | Comprehensive data integrity audits |
+| `/data gate` | Binary pass/fail data integrity gates |
+| `/data setup` | Setup wizard for data-platform MCP servers |