| name | description | gate_contract | arguments |
|---|---|---|---|
| data gate | Data integrity compliance gate (pass/fail) for sprint execution | v1 | |
# /data gate

Binary pass/fail validation for data integrity compliance. Used by the projman orchestrator during sprint execution to gate issue completion.
## Usage

```
/data gate <path>
```

Examples:

```
/data gate ./dbt/models/staging/
/data gate ./portfolio_app/toronto/parsers/
/data gate ./dbt/
```
## What It Does

- Activates the `data-advisor` agent in gate mode
- Loads the `skills/data-integrity-audit.md` skill
- Determines scope from the target path:
  - dbt project directory: full dbt validation (parse, compile, test, lineage)
  - Python files with database operations: schema validation
  - SQL files: dbt model validation
  - Mixed: all applicable checks
- Checks only FAIL-level violations:
  - dbt parse failures (project broken)
  - dbt compilation errors (SQL invalid)
  - Missing tables/columns referenced in code
  - Data type mismatches that cause runtime errors
  - Broken lineage (orphaned model references)
  - PostGIS SRID mismatches
- Returns a binary result:
  - `PASS` - no blocking violations found
  - `FAIL` - one or more blocking violations
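The scope detection and binary verdict described above can be sketched roughly as follows. This is a minimal illustration under stated assumptions: the function names, file-extension heuristics, and return shapes are hypothetical, not the plugin's actual implementation.

```python
from pathlib import Path

def detect_scope(target: str) -> set[str]:
    """Guess which check families apply to a target path (illustrative heuristic)."""
    path = Path(target)
    files = [path] if path.is_file() else list(path.rglob("*"))
    scopes = set()
    if any(f.name == "dbt_project.yml" for f in files):
        scopes.add("dbt")      # dbt project directory: full dbt validation
    if any(f.suffix == ".sql" for f in files):
        scopes.add("sql")      # SQL files: dbt model validation
    if any(f.suffix == ".py" for f in files):
        scopes.add("python")   # Python files: schema validation
    return scopes              # multiple scopes => all applicable checks run

def gate(blocking_violations: list[str]) -> str:
    """Binary result: only FAIL-level violations block; WARN/INFO are ignored here."""
    return "FAIL" if blocking_violations else "PASS"
```

The key design point is that `gate` never inspects severity levels below FAIL, which is what keeps the command fast enough for automated use.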
## Output

### On PASS

```
DATA GATE: PASS
No blocking data integrity violations found.
```

### On FAIL

```
DATA GATE: FAIL

Blocking Issues (2):
1. dbt/models/staging/stg_census.sql - Compilation error: column 'census_yr' not found
   Fix: Column was renamed to 'census_year' in source table. Update model.
2. portfolio_app/toronto/loaders/census.py:67 - References table 'census_raw' which does not exist
   Fix: Table was renamed to 'census_demographics' in migration 003.

Run /data review for full audit report.
```
## Integration with projman

This command is automatically invoked by the projman orchestrator when:

- An issue has the `Domain/Data` label
- The orchestrator is about to mark the issue as complete
- The orchestrator passes the path of changed files

Gate behavior:

- **PASS**: Issue can be marked complete
- **FAIL**: Issue stays open; a blocker comment is added with failure details
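As a sketch, the orchestrator's gating step might look like the following. All names here (`complete_issue`, the issue dict shape, the `run_gate` callback) are hypothetical; the actual projman orchestrator is not documented on this page.

```python
def complete_issue(issue: dict, changed_paths: list[str], run_gate) -> bool:
    """Gate completion of Domain/Data issues on /data gate (illustrative only)."""
    if "Domain/Data" in issue["labels"]:
        # run_gate stands in for invoking `/data gate <path>` on changed files
        result = run_gate(changed_paths)
        if result["status"] == "FAIL":
            # Issue stays open; blocker comment records the failure details
            issue["comments"].append(
                "Blocked by data gate:\n" + "\n".join(result["issues"])
            )
            return False
    issue["state"] = "complete"
    return True
```

Non-data issues skip the gate entirely, which matches the label-based trigger described above.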
## Differences from /data review
| Aspect | /data gate | /data review |
|---|---|---|
| Output | Binary PASS/FAIL | Detailed report with all severities |
| Severity | FAIL only | FAIL + WARN + INFO |
| Purpose | Automation gate | Human review |
| Verbosity | Minimal | Comprehensive |
| Speed | Skips INFO checks | Full scan |
## When to Use
- Sprint execution: Automatic quality gates via projman
- CI/CD pipelines: Automated data integrity checks
- Quick validation: Fast pass/fail without full report
- Pre-merge checks: Verify data changes before integration
For detailed findings including warnings and suggestions, use /data review instead.
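For CI/CD use, one option is to grep the gate's output for the binary verdict. The sketch below simulates the gate's PASS output with `printf`; in a real pipeline you would replace that line with however you invoke the command headlessly (for example via Claude Code's print mode, which is an assumption about your setup):

```shell
#!/bin/sh
# Simulated gate output -- swap this for a real /data gate invocation,
# e.g. something like: claude -p "/data gate ./dbt/" > gate.log
printf 'DATA GATE: PASS\nNo blocking data integrity violations found.\n' > gate.log

# Fail the CI job unless the gate printed PASS.
if grep -q '^DATA GATE: PASS$' gate.log; then
  echo "gate passed"
else
  echo "gate failed; see gate.log" >&2
  exit 1
fi
```

Because the output contract is a fixed `DATA GATE: PASS` / `DATA GATE: FAIL` line, this kind of text match is stable enough for pre-merge checks.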
## Requirements

- The data-platform MCP server must be running
- For dbt checks: a dbt project must be configured (auto-detected via `dbt_project.yml`)
- For PostgreSQL checks: a connection must be configured in `~/.config/claude/postgres.env`
- If the database or dbt is unavailable, the applicable checks are skipped with a warning (non-blocking degradation)
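A minimal `~/.config/claude/postgres.env` might look like the fragment below. The variable names follow the standard libpq environment-variable conventions; the exact keys this plugin reads are an assumption, so check the plugin's README before relying on them.

```shell
# ~/.config/claude/postgres.env -- connection used for PostgreSQL checks
# (libpq-style variable names; the plugin's actual keys may differ)
PGHOST=localhost
PGPORT=5432
PGDATABASE=portfolio
PGUSER=analyst
PGPASSWORD=change-me
```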