Add new data-platform plugin for data engineering workflows

MCP Server (32 tools):
- pandas operations (14 tools): read_csv, read_parquet, read_json, to_csv, to_parquet, describe, head, tail, filter, select, groupby, join, list_data, drop_data
- PostgreSQL/PostGIS (10 tools): pg_connect, pg_query, pg_execute, pg_tables, pg_columns, pg_schemas, st_tables, st_geometry_type, st_srid, st_extent
- dbt integration (8 tools): dbt_parse, dbt_run, dbt_test, dbt_build, dbt_compile, dbt_ls, dbt_docs_generate, dbt_lineage

Plugin Features:
- Arrow IPC data_ref system for DataFrame persistence across tool calls (see the sketch below)
- Pre-execution validation for dbt with `dbt parse`
- SessionStart hook for PostgreSQL connectivity check (non-blocking)
- Hybrid configuration (system ~/.config/claude/postgres.env + project .env)
- Memory management with 100k row limit and chunking support

Commands: /initial-setup, /ingest, /profile, /schema, /explain, /lineage, /run
Agents: data-ingestion, data-analysis
Test suite: 71 tests covering config, data store, pandas, postgres, and dbt tools

Addresses data workflow issues from the personal-portfolio project:
- Lost data after multiple interactions (solved by Arrow IPC data_ref)
- dbt 1.9+ syntax deprecation (solved by pre-execution validation)
- Ungraceful PostgreSQL error handling (solved by SessionStart hook)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
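The Arrow IPC data_ref feature is easiest to picture with a small sketch. The snippet below is illustrative only, not the plugin's actual code: `put_dataframe`, `get_dataframe`, and the in-memory `_STORE` are hypothetical names. It shows the underlying idea, which is to serialize a DataFrame to Arrow IPC bytes once and pass a small `data_ref` string between tool calls instead of re-reading or re-sending the data.

```python
# Illustrative sketch of a data_ref store (hypothetical names, not the
# plugin's real implementation): DataFrames are serialized to Arrow IPC
# bytes, and later tool calls refer to them by an opaque key.
import uuid

import pandas as pd
import pyarrow as pa

_STORE = {}  # data_ref key -> Arrow IPC bytes


def put_dataframe(df: pd.DataFrame) -> str:
    """Serialize df to Arrow IPC and return an opaque data_ref key."""
    table = pa.Table.from_pandas(df)
    sink = pa.BufferOutputStream()
    with pa.ipc.new_stream(sink, table.schema) as writer:
        writer.write_table(table)
    ref = f"data_ref_{uuid.uuid4().hex[:8]}"
    _STORE[ref] = sink.getvalue().to_pybytes()
    return ref


def get_dataframe(ref: str) -> pd.DataFrame:
    """Rehydrate the DataFrame behind a data_ref key."""
    reader = pa.ipc.open_stream(pa.BufferReader(_STORE[ref]))
    return reader.read_all().to_pandas()


if __name__ == "__main__":
    ref = put_dataframe(pd.DataFrame({"x": [1, 2, 3]}))
    print(ref, get_dataframe(ref).shape)  # e.g. data_ref_1a2b3c4d (3, 1)
```

This kind of reference is what lets a later groupby or join operate on data loaded by an earlier read_csv call, within the 100k row limit noted above.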
55 lines
1.6 KiB
Bash
Executable File
#!/bin/bash
# data-platform startup check hook
# Checks for common issues at session start
# All output MUST have [data-platform] prefix

PREFIX="[data-platform]"

# Check if MCP venv exists
PLUGIN_ROOT="${CLAUDE_PLUGIN_ROOT:-$(dirname "$(dirname "$(realpath "$0")")")}"
VENV_PATH="$PLUGIN_ROOT/mcp-servers/data-platform/.venv/bin/python"

if [[ ! -f "$VENV_PATH" ]]; then
    echo "$PREFIX MCP venv missing - run /initial-setup or setup.sh"
    exit 0
fi

# Check PostgreSQL configuration (optional - just warn if configured but failing)
POSTGRES_CONFIG="$HOME/.config/claude/postgres.env"
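# The exact contents of postgres.env are an assumption here: this hook only
# requires that sourcing the file defines POSTGRES_URL, e.g.
#   POSTGRES_URL=postgresql://user:password@localhost:5432/mydb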
if [[ -f "$POSTGRES_CONFIG" ]]; then
    source "$POSTGRES_CONFIG"
    if [[ -n "${POSTGRES_URL:-}" ]]; then
        # Quick connection test (5 second timeout)
        RESULT=$("$VENV_PATH" -c "
import asyncio
import sys

async def test():
    try:
        import asyncpg
        conn = await asyncpg.connect('$POSTGRES_URL', timeout=5)
        await conn.close()
        return 'OK'
    except Exception as e:
        return f'FAIL: {e}'
print(asyncio.run(test()))
" 2>/dev/null || echo "FAIL: asyncpg not installed")

        if [[ "$RESULT" == "OK" ]]; then
            # PostgreSQL OK - say nothing
            :
        elif [[ "$RESULT" == *"FAIL"* ]]; then
            echo "$PREFIX PostgreSQL connection failed - check POSTGRES_URL"
        fi
    fi
fi

# Check dbt project (if in a project with dbt_project.yml)
if [[ -f "dbt_project.yml" ]] || [[ -f "transform/dbt_project.yml" ]]; then
    if ! command -v dbt &> /dev/null; then
        echo "$PREFIX dbt CLI not found - dbt tools unavailable"
    fi
fi

# All checks passed - say nothing
exit 0