Compare commits: v3.1.1...3227d2d618 (35 commits)

Commits: 3227d2d618, 1dfbffcf38, 98df35a33e, 70d6963d0d, 54c6694117, 2402f88daf, 1cf1dbefb8, dafa8db8bb, 65e79efb24, 5ffc13b635, 50bfd20fd4, c14f1f46cd, 52c8371f4a, f8d6d42150, 469487f6ed, 7a2966367d, 0466b299a7, b34304ed57, 96963531fc, 4c9a7c55ae, 8a75203251, da6e81260e, e1f1335655, b017db83a1, bc136fab7e, 6c24bcbb91, 11a05799d3, 403271dc0c, cc4abf67b9, 35cf20e02d, 5209f82efb, 1f55387e9e, 32bbca73ba, 0e6999ea21, 6cf3c1830c
@@ -6,7 +6,7 @@
   },
   "metadata": {
     "description": "Project management plugins with Gitea and NetBox integrations",
-    "version": "3.1.0"
+    "version": "3.2.0"
   },
   "plugins": [
     {
CHANGELOG.md (45 changed lines)

@@ -4,6 +4,51 @@ All notable changes to the Leo Claude Marketplace will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).

## [Unreleased]

*Changes staged for the next release*

---

## [3.2.0] - 2026-01-24

### Added
- **git-flow:** `/commit` now detects protected branches before committing
  - Warns when on protected branch (main, master, development, staging, production)
  - Offers to create feature branch automatically instead of committing directly
  - Configurable via `GIT_PROTECTED_BRANCHES` environment variable
- **netbox:** Platform and primary_ip parameters added to device update tools
- **claude-config-maintainer:** Auto-enforce mandatory behavior rules via SessionStart hook
- **scripts:** `release.sh` - Versioning workflow script for consistent releases
- **scripts:** `verify-hooks.sh` - Verify all hooks are command type
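The protected-branch detection added to git-flow's `/commit` is not itself shown in this compare view. A minimal bash sketch of the behavior described in the git-flow entry above might look like the following; the `GIT_PROTECTED_BRANCHES` variable name comes from the changelog, while the default branch list and the prompt wording are illustrative assumptions, not the shipped implementation:

```bash
#!/bin/bash
# Illustrative sketch only - not the shipped git-flow implementation.
# Default protected branches; override with GIT_PROTECTED_BRANCHES="main master ..."
PROTECTED="${GIT_PROTECTED_BRANCHES:-main master development staging production}"

current_branch=$(git rev-parse --abbrev-ref HEAD)

for branch in $PROTECTED; do
    if [ "$current_branch" = "$branch" ]; then
        echo "[git-flow] WARNING: '$current_branch' is a protected branch."
        read -r -p "Create a feature branch instead? (y/n) " answer
        if [ "$answer" = "y" ]; then
            read -r -p "Feature branch name: " feature
            git checkout -b "$feature"
        fi
        break
    fi
done
```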
### Changed
- **doc-guardian:** Hook switched from `prompt` type to `command` type
  - Prompt hooks unreliable - Claude ignores explicit instructions
  - New `notify.sh` bash script guarantees exact output behavior
  - Only notifies for config file changes (commands/, agents/, skills/, hooks/)
  - Silent exit for all other files - no blocking possible
- **All hooks:** Converted to command type with stricter plugin prefix enforcement
  - All hooks now mandate `[plugin-name]` prefix with "NO EXCEPTIONS" rule
  - Simplified output formats with word limits
  - Consistent structure across projman, pr-review, code-sentinel, doc-guardian
- **CLAUDE.md:** Replaced destructive "ALWAYS CLEAR CACHE" rule with "VERIFY AND RESTART"
  - Cache clearing mid-session breaks MCP tools
  - Added guidance for proper plugin development workflow
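The doc-guardian `notify.sh` script mentioned above is not included in this compare view. A minimal sketch of the behavior it describes (notify only for plugin config paths, exit silently for everything else) could look like this; the exact input handling and message wording of the real script may differ, and the `file_path` extraction is borrowed from the security-check.sh hook shown later in this diff:

```bash
#!/bin/bash
# Illustrative sketch of the doc-guardian notify hook behavior - not the shipped script.
INPUT=$(cat)

# Pull the edited file path out of the tool input (assumes a "file_path" JSON field).
FILE_PATH=$(echo "$INPUT" | grep -o '"file_path"[[:space:]]*:[[:space:]]*"[^"]*"' | head -1 | sed 's/.*"file_path"[[:space:]]*:[[:space:]]*"\([^"]*\)".*/\1/')

case "$FILE_PATH" in
    */commands/*|*/agents/*|*/skills/*|*/hooks/*)
        echo "[doc-guardian] Config file changed: $FILE_PATH - check whether docs need updating."
        ;;
    *)
        # Silent exit for everything else; nothing is blocked.
        exit 0
        ;;
esac
```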
### Fixed
- **cmdb-assistant:** Complete MCP tool schemas for update operations (#138)
- **netbox:** Shorten tool names to meet 64-char API limit (#134)
- **cmdb-assistant:** Correct NetBox API URL format in setup wizard (#132)
- **gitea/projman:** Type safety for `create_label_smart`, curl-based debug-report (#124)
- **netbox:** Add diagnostic logging for JSON parse errors (#121)
- **labels:** Add duplicate check before creating labels (#116)
- **hooks:** Convert ALL hooks to command type with proper prefixes (#114)
- Protected branch workflow: Claude no longer commits directly to protected branches (fixes #109)
- doc-guardian hook no longer blocks workflow (fixes #110)

---

## [3.1.1] - 2026-01-22

### Added

CLAUDE.md (101 changed lines)
@@ -1,18 +1,58 @@
# CLAUDE.md

This file provides guidance to Claude Code when working with code in this repository.

## ⛔ MANDATORY BEHAVIOR RULES - READ FIRST

**These rules are NON-NEGOTIABLE. Violating them wastes the user's time and money.**

### 1. WHEN USER ASKS YOU TO CHECK SOMETHING - CHECK EVERYTHING
- Search ALL locations, not just where you think it is
- Check cache directories: `~/.claude/plugins/cache/`
- Check installed: `~/.claude/plugins/marketplaces/`
- Check source: `~/claude-plugins-work/`
- **NEVER say "no" or "that's not the issue" without exhaustive verification**

### 2. WHEN USER SAYS SOMETHING IS WRONG - BELIEVE THEM
- The user knows their system better than you
- Investigate thoroughly before disagreeing
- If user suspects cache, CHECK THE CACHE
- If user suspects a file, READ THE FILE
- **Your confidence is often wrong. User's instincts are often right.**

### 3. NEVER SAY "DONE" WITHOUT VERIFICATION
- Run the actual command/script to verify
- Show the output to the user
- Check ALL affected locations
- **"Done" means VERIFIED WORKING, not "I made changes"**

### 4. SHOW EXACTLY WHAT USER ASKS FOR
- If user asks for messages, show the MESSAGES
- If user asks for code, show the CODE
- If user asks for output, show the OUTPUT
- **Don't interpret or summarize unless asked**

### 5. AFTER PLUGIN UPDATES - ALWAYS CLEAR CACHE
```bash
rm -rf ~/.claude/plugins/cache/leo-claude-mktplace/
./scripts/verify-hooks.sh
```

**FAILURE TO FOLLOW THESE RULES = WASTED USER TIME = UNACCEPTABLE**

---
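`scripts/verify-hooks.sh`, invoked in rule 5 above and listed in the changelog, is not part of this compare view. A minimal sketch of what such a check could do (flag any plugin hook definition that is not command type), assuming hook definitions live in `plugins/*/hooks/hooks.json`, might be:

```bash
#!/bin/bash
# Illustrative sketch only - the real verify-hooks.sh may differ.
# Fails if any plugin hook definition still uses a prompt-type hook.
status=0
for f in plugins/*/hooks/hooks.json; do
    [ -f "$f" ] || continue
    if grep -q '"type"[[:space:]]*:[[:space:]]*"prompt"' "$f"; then
        echo "FAIL: $f still contains a prompt-type hook"
        status=1
    else
        echo "OK:   $f"
    fi
done
exit $status
```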
## Project Overview

**Repository:** leo-claude-mktplace
-**Version:** 3.0.1
+**Version:** 3.1.2
**Status:** Production Ready

A plugin marketplace for Claude Code containing:

| Plugin | Description | Version |
|--------|-------------|---------|
-| `projman` | Sprint planning and project management with Gitea integration | 3.0.0 |
+| `projman` | Sprint planning and project management with Gitea integration | 3.1.0 |
| `git-flow` | Git workflow automation with smart commits and branch management | 1.0.0 |
| `pr-review` | Multi-agent PR review with confidence scoring | 1.0.0 |
| `clarity-assist` | Prompt optimization with ND-friendly accommodations | 1.0.0 |

@@ -59,7 +99,7 @@ leo-claude-mktplace/
│ │ ├── .claude-plugin/plugin.json
│ │ ├── .mcp.json
│ │ ├── mcp-servers/gitea -> ../../../mcp-servers/gitea   # SYMLINK
-│ │ ├── commands/              # 12 commands (incl. setup)
+│ │ ├── commands/              # 13 commands (incl. setup, debug)
│ │ ├── hooks/                 # SessionStart mismatch detection
│ │ ├── agents/                # 4 agents
│ │ └── skills/label-taxonomy/
@@ -246,13 +286,56 @@ See `docs/DEBUGGING-CHECKLIST.md` for systematic troubleshooting.
- `/debug-report` - Run full diagnostics, create issue if needed
- `/debug-review` - Investigate and propose fixes

-## Versioning Rules
+## Versioning Workflow

-- Version displayed ONLY in main `README.md` title: `# Leo Claude Marketplace - vX.Y.Z`
-- `CHANGELOG.md` is authoritative for version history
-- Follow [SemVer](https://semver.org/): MAJOR.MINOR.PATCH
-- On release: Update README title → CHANGELOG → marketplace.json → plugin.json files
+This project follows [SemVer](https://semver.org/) and [Keep a Changelog](https://keepachangelog.com).

### Version Locations (must stay in sync)

| Location | Format | Example |
|----------|--------|---------|
| Git tags | `vX.Y.Z` | `v3.2.0` |
| README.md title | `# Leo Claude Marketplace - vX.Y.Z` | `v3.2.0` |
| marketplace.json | `"version": "X.Y.Z"` | `3.2.0` |
| CHANGELOG.md | `## [X.Y.Z] - YYYY-MM-DD` | `[3.2.0] - 2026-01-24` |

### During Development

**All changes go under `[Unreleased]` in CHANGELOG.md.** Never create a versioned section until release time.

```markdown
## [Unreleased]

### Added
- New feature description

### Fixed
- Bug fix description
```

### Creating a Release

Use the release script to ensure consistency:

```bash
./scripts/release.sh 3.2.0
```

The script will:
1. Validate `[Unreleased]` section has content
2. Replace `[Unreleased]` with `[3.2.0] - YYYY-MM-DD`
3. Update README.md title
4. Update marketplace.json version
5. Commit and create git tag
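`scripts/release.sh` itself is not included in this diff. A minimal sketch of a script performing the five steps listed above might look like the following, assuming GNU sed and assuming the marketplace manifest lives at `.claude-plugin/marketplace.json` (the steps and the README title format come from this section; the paths and sed expressions are illustrative):

```bash
#!/bin/bash
# Illustrative sketch of a release script covering the steps above - not the shipped release.sh.
set -euo pipefail

VERSION="$1"                      # e.g. ./scripts/release.sh 3.2.0
TODAY=$(date +%Y-%m-%d)

# 1. Validate that [Unreleased] has at least one entry before renaming it.
if ! sed -n '/## \[Unreleased\]/,/^## \[/p' CHANGELOG.md | grep -q '^- '; then
    echo "No entries under [Unreleased]; aborting." >&2
    exit 1
fi

# 2. Turn [Unreleased] into a dated, versioned section.
sed -i "s/^## \[Unreleased\]/## [${VERSION}] - ${TODAY}/" CHANGELOG.md

# 3. Update the README title (format: "# Leo Claude Marketplace - vX.Y.Z").
sed -i "s/^# Leo Claude Marketplace - v.*/# Leo Claude Marketplace - v${VERSION}/" README.md

# 4. Update the marketplace manifest version (path assumed).
sed -i "s/\"version\": \"[0-9.]*\"/\"version\": \"${VERSION}\"/" .claude-plugin/marketplace.json

# 5. Commit and create the git tag.
git add CHANGELOG.md README.md .claude-plugin/marketplace.json
git commit -m "Release v${VERSION}"
git tag "v${VERSION}"
```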
### SemVer Guidelines

| Change Type | Version Bump | Example |
|-------------|--------------|---------|
| Bug fixes only | PATCH (x.y.**Z**) | 3.1.1 → 3.1.2 |
| New features (backwards compatible) | MINOR (x.**Y**.0) | 3.1.2 → 3.2.0 |
| Breaking changes | MAJOR (**X**.0.0) | 3.2.0 → 4.0.0 |

---

-**Last Updated:** 2026-01-22
+**Last Updated:** 2026-01-24

README.md (18 changed lines)
@@ -1,4 +1,4 @@
-# Leo Claude Marketplace - v3.1.1
+# Leo Claude Marketplace - v3.2.0

A collection of Claude Code plugins for project management, infrastructure automation, and development workflows.

@@ -19,7 +19,7 @@ AI-guided sprint planning with full Gitea integration. Transforms a proven 15-sp…
- Branch-aware security (development/staging/production)
- Pre-sprint-close code quality review and test verification

-**Commands:** `/sprint-plan`, `/sprint-start`, `/sprint-status`, `/sprint-close`, `/labels-sync`, `/initial-setup`, `/project-init`, `/project-sync`, `/review`, `/test-check`, `/test-gen`
+**Commands:** `/sprint-plan`, `/sprint-start`, `/sprint-status`, `/sprint-close`, `/labels-sync`, `/initial-setup`, `/project-init`, `/project-sync`, `/review`, `/test-check`, `/test-gen`, `/debug-report`, `/debug-review`

#### [git-flow](./plugins/git-flow/README.md) *NEW in v3.0.0*
**Git Workflow Automation**

@@ -106,11 +106,11 @@ Full Gitea API integration for project management.

| Category | Tools |
|----------|-------|
-| Issues | `list_issues`, `get_issue`, `create_issue`, `update_issue`, `add_comment` |
-| Labels | `get_labels`, `suggest_labels`, `create_label` |
-| Wiki | `list_wiki_pages`, `get_wiki_page`, `create_wiki_page`, `create_lesson`, `search_lessons` |
-| Milestones | `list_milestones`, `get_milestone`, `create_milestone`, `update_milestone` |
-| Dependencies | `list_issue_dependencies`, `create_issue_dependency`, `get_execution_order` |
+| Issues | `list_issues`, `get_issue`, `create_issue`, `update_issue`, `add_comment`, `aggregate_issues` |
+| Labels | `get_labels`, `suggest_labels`, `create_label`, `create_label_smart` |
+| Wiki | `list_wiki_pages`, `get_wiki_page`, `create_wiki_page`, `update_wiki_page`, `create_lesson`, `search_lessons` |
+| Milestones | `list_milestones`, `get_milestone`, `create_milestone`, `update_milestone`, `delete_milestone` |
+| Dependencies | `list_issue_dependencies`, `create_issue_dependency`, `remove_issue_dependency`, `get_execution_order` |
| **Pull Requests** | `list_pull_requests`, `get_pull_request`, `get_pr_diff`, `get_pr_comments`, `create_pr_review`, `add_pr_comment` *(NEW in v3.0.0)* |
| Validation | `validate_repo_org`, `get_branch_protection` |
@@ -245,7 +245,8 @@ leo-claude-mktplace/
├── docs/                       # Documentation
│ ├── CANONICAL-PATHS.md        # Path reference
│ └── CONFIGURATION.md          # Setup guide
-└── scripts/                    # Setup scripts
+├── scripts/                    # Setup scripts
+└── CHANGELOG.md                # Version history
```

## Documentation

@@ -257,6 +258,7 @@ leo-claude-mktplace/
| [COMMANDS-CHEATSHEET.md](./docs/COMMANDS-CHEATSHEET.md) | All commands quick reference |
| [UPDATING.md](./docs/UPDATING.md) | Update guide for the marketplace |
| [CANONICAL-PATHS.md](./docs/CANONICAL-PATHS.md) | Authoritative path reference |
+| [DEBUGGING-CHECKLIST.md](./docs/DEBUGGING-CHECKLIST.md) | Systematic troubleshooting guide |
| [CHANGELOG.md](./CHANGELOG.md) | Version history |

## License
@@ -2,7 +2,7 @@

**This file defines ALL valid paths in this repository. No exceptions. No inference. No assumptions.**

-Last Updated: 2026-01-20 (v3.0.0)
+Last Updated: 2026-01-23 (v3.1.2)

---

@@ -17,10 +17,10 @@ leo-claude-mktplace/
├── docs/                        # All documentation
│ ├── architecture/              # Draw.io diagrams and specs
│ ├── CANONICAL-PATHS.md         # This file - single source of truth
│ ├── COMMANDS-CHEATSHEET.md     # All commands quick reference
│ ├── CONFIGURATION.md           # Centralized configuration guide
│ ├── DEBUGGING-CHECKLIST.md     # Systematic troubleshooting guide
│ ├── UPDATING.md                # Update guide
│ └── workflows/                 # Workflow documentation
│ └── UPDATING.md                # Update guide
├── hooks/                       # Shared hooks (if any)
├── mcp-servers/                 # SHARED MCP servers (v3.0.0+)
│ ├── gitea/                     # Gitea MCP server

@@ -156,7 +156,6 @@ The symlink target is relative: `../../../mcp-servers/{server}`

| Type | Location |
|------|----------|
| Architecture diagrams | `docs/architecture/` |
| Workflow docs | `docs/workflows/` |
| This file | `docs/CANONICAL-PATHS.md` |
| Update guide | `docs/UPDATING.md` |
| Configuration guide | `docs/CONFIGURATION.md` |
@@ -621,6 +621,40 @@ class GiteaClient:
        response.raise_for_status()
        return response.json()

    def create_org_label(
        self,
        org: str,
        name: str,
        color: str,
        description: Optional[str] = None
    ) -> Dict:
        """
        Create a new label at the organization level.

        Organization labels are shared across all repositories in the org.
        Use this for workflow labels (Type, Priority, Complexity, Effort, etc.)

        Args:
            org: Organization name
            name: Label name (e.g., 'Type/Bug', 'Priority/High')
            color: Hex color code (with or without #)
            description: Optional label description

        Returns:
            Created label dictionary
        """
        url = f"{self.base_url}/orgs/{org}/labels"
        data = {
            'name': name,
            'color': color.lstrip('#')  # Remove # if present
        }
        if description:
            data['description'] = description
        logger.info(f"Creating organization label '{name}' in {org}")
        response = self.session.post(url, json=data)
        response.raise_for_status()
        return response.json()

    # ========================================
    # PULL REQUEST OPERATIONS
    # ========================================
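For reference, `create_org_label` wraps Gitea's organization labels endpoint (the `/orgs/{org}/labels` path comes from the method above). A roughly equivalent manual call with curl might look like this; the base URL layout, the token header form, and the example org and label values are assumptions, not taken from this diff:

```bash
# Illustrative curl equivalent of create_org_label (values are examples).
GITEA_URL="https://gitea.example.com/api/v1"   # assumed API base layout
GITEA_TOKEN="..."                              # a Gitea access token

curl -X POST "$GITEA_URL/orgs/my-org/labels" \
  -H "Authorization: token $GITEA_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"name": "Type/Bug", "color": "d73a4a", "description": "Bug reports"}'
```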
@@ -622,13 +622,65 @@ class GiteaMCPServer:
            ),
            Tool(
                name="create_label",
-               description="Create a new label in the repository",
+               description="Create a new label in the repository (for repo-specific labels like Component/*, Tech/*)",
                inputSchema={
                    "type": "object",
                    "properties": {
                        "name": {
                            "type": "string",
-                           "description": "Label name"
+                           "description": "Label name (e.g., 'Component/Backend', 'Tech/Python')"
                        },
                        "color": {"type": "string", "description": "Label color (hex code)"},
                        "description": {"type": "string", "description": "Label description"},
                        "repo": {"type": "string", "description": "Repository name (owner/repo format)"}
                    },
                    "required": ["name", "color"]
                }
            ),
+           Tool(
+               name="create_org_label",
+               description="Create a new label at organization level (for workflow labels like Type/*, Priority/*, Complexity/*, Effort/*)",
+               inputSchema={
+                   "type": "object",
+                   "properties": {
+                       "org": {"type": "string", "description": "Organization name"},
+                       "name": {"type": "string", "description": "Label name (e.g., 'Type/Bug', 'Priority/High')"},
+                       "color": {"type": "string", "description": "Label color (hex code)"},
+                       "description": {"type": "string", "description": "Label description"}
+                   },
+                   "required": ["org", "name", "color"]
+               }
+           ),
+           Tool(
+               name="create_label_smart",
+               description="Create a label at the appropriate level (org or repo) based on category. Org: Type/*, Priority/*, Complexity/*, Effort/*, Risk/*, Source/*, Agent/*. Repo: Component/*, Tech/*",
+               inputSchema={
+                   "type": "object",
+                   "properties": {
+                       "name": {"type": "string", "description": "Label name (e.g., 'Type/Bug', 'Component/Backend')"},
+                       "color": {"type": "string",

@@ -880,6 +932,20 @@ class GiteaMCPServer:
                    arguments.get('description'),
                    arguments.get('repo')
                )
+           elif name == "create_org_label":
+               result = self.client.create_org_label(
+                   arguments['org'],
+                   arguments['name'],
+                   arguments['color'],
+                   arguments.get('description')
+               )
+           elif name == "create_label_smart":
+               result = await self.label_tools.create_label_smart(
+                   arguments['name'],
+                   arguments['color'],
+                   arguments.get('description'),
+                   arguments.get('repo')
+               )
            # Pull Request tools
            elif name == "list_pull_requests":
                result = await self.pr_tools.list_pull_requests(**arguments)
@@ -259,3 +259,119 @@ class LabelTools:
            return lookup[category_lower][value_lower]

        return None

    # Organization-level label categories (workflow labels shared across repos)
    ORG_LABEL_CATEGORIES = {'agent', 'complexity', 'effort', 'efforts', 'priority', 'risk', 'source', 'type'}

    # Repository-level label categories (project-specific labels)
    REPO_LABEL_CATEGORIES = {'component', 'tech'}

    async def create_label_smart(
        self,
        name: str,
        color: str,
        description: Optional[str] = None,
        repo: Optional[str] = None
    ) -> Dict:
        """
        Create a label at the appropriate level (org or repo) based on category.
        Skips if label already exists (checks both org and repo levels).

        Organization labels: Agent, Complexity, Effort, Priority, Risk, Source, Type
        Repository labels: Component, Tech

        Args:
            name: Label name (e.g., 'Type/Bug', 'Component/Backend')
            color: Hex color code
            description: Optional label description
            repo: Repository in 'owner/repo' format

        Returns:
            Created label dictionary with 'level' key, or 'skipped' if already exists
        """
        loop = asyncio.get_event_loop()

        target_repo = repo or self.gitea.repo
        if not target_repo or '/' not in target_repo:
            raise ValueError("Use 'owner/repo' format (e.g. 'org/repo-name')")

        owner = target_repo.split('/')[0]
        is_org = await loop.run_in_executor(
            None,
            lambda: self.gitea.is_org_repo(target_repo)
        )

        # Fetch existing labels to check for duplicates
        existing_labels = await self.get_labels(target_repo)
        all_existing = existing_labels.get('organization', []) + existing_labels.get('repository', [])
        existing_names = [label['name'].lower() for label in all_existing]

        # Normalize the new label name for comparison
        name_normalized = name.lower()

        # Also check for format variations (Type/Bug vs Type: Bug)
        name_variations = [name_normalized]
        if '/' in name:
            name_variations.append(name.replace('/', ': ').lower())
            name_variations.append(name.replace('/', ':').lower())
        elif ': ' in name:
            name_variations.append(name.replace(': ', '/').lower())
        elif ':' in name:
            name_variations.append(name.replace(':', '/').lower())

        # Check if label already exists in any format
        for variation in name_variations:
            if variation in existing_names:
                logger.info(f"Label '{name}' already exists (found as '{variation}'), skipping")
                return {
                    'name': name,
                    'skipped': True,
                    'reason': "Label already exists",
                    'level': 'existing'
                }

        # Parse category from label name
        category = None
        if '/' in name:
            category = name.split('/')[0].lower().rstrip('s')
        elif ':' in name:
            category = name.split(':')[0].strip().lower().rstrip('s')

        # If it's an org repo and the category is an org-level category, create at org level
        if is_org and category in self.ORG_LABEL_CATEGORIES:
            result = await loop.run_in_executor(
                None,
                lambda: self.gitea.create_org_label(owner, name, color, description)
            )
            # Handle unexpected response types (API may return list or non-dict)
            if not isinstance(result, dict):
                logger.error(f"Unexpected API response type for org label: {type(result)} - {result}")
                return {
                    'name': name,
                    'error': True,
                    'reason': f"API returned {type(result).__name__} instead of dict: {result}",
                    'level': 'organization'
                }
            result['level'] = 'organization'
            result['skipped'] = False
            logger.info(f"Created organization label '{name}' in {owner}")
        else:
            # Create at repo level
            result = await loop.run_in_executor(
                None,
                lambda: self.gitea.create_label(name, color, description, target_repo)
            )
            # Handle unexpected response types (API may return list or non-dict)
            if not isinstance(result, dict):
                logger.error(f"Unexpected API response type for repo label: {type(result)} - {result}")
                return {
                    'name': name,
                    'error': True,
                    'reason': f"API returned {type(result).__name__} instead of dict: {result}",
                    'level': 'repository'
                }
            result['level'] = 'repository'
            result['skipped'] = False
            logger.info(f"Created repository label '{name}' in {target_repo}")

        return result
@@ -4,6 +4,7 @@ NetBox API client for interacting with NetBox REST API.

Provides a generic HTTP client with methods for all standard REST operations.
Individual tool modules use this client for their specific endpoints.
"""
+import json
import requests
import logging
from typing import List, Dict, Optional, Any, Union

@@ -83,7 +84,20 @@ class NetBoxClient:
        if response.status_code == 204 or not response.content:
            return None

-       return response.json()
+       # Parse JSON with diagnostic error handling
+       try:
+           return response.json()
+       except json.JSONDecodeError as e:
+           logger.error(
+               f"JSON decode failed. Status: {response.status_code}, "
+               f"Content-Length: {len(response.content)}, "
+               f"Content preview: {response.content[:200]!r}"
+           )
+           raise ValueError(
+               f"Invalid JSON response from NetBox: {e}. "
+               f"Status code: {response.status_code}, "
+               f"Content length: {len(response.content)} bytes"
+           ) from e

    def list(
        self,
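When the diagnostic error above fires, it can help to look at the raw response outside the MCP server. A hedged example using the `NETBOX_API_URL` and `NETBOX_API_TOKEN` variables from the setup wizard shown later in this diff (the `dcim/devices` endpoint is just an example; any endpoint works):

```bash
# Inspect a raw NetBox response when JSON parsing fails (illustrative).
source ~/.config/claude/netbox.env
curl -s -w '\nHTTP status: %{http_code}\n' \
  -H "Authorization: Token $NETBOX_API_TOKEN" \
  -H "Accept: application/json" \
  "$NETBOX_API_URL/dcim/devices/?limit=1"
```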
@@ -103,7 +103,19 @@ TOOL_DEFINITIONS = {
     'properties': {
         'id': {'type': 'integer', 'description': 'Site ID'},
         'name': {'type': 'string', 'description': 'New name'},
-        'status': {'type': 'string', 'description': 'New status'}
+        'slug': {'type': 'string', 'description': 'New slug'},
+        'status': {'type': 'string', 'description': 'Status'},
+        'region': {'type': 'integer', 'description': 'Region ID'},
+        'group': {'type': 'integer', 'description': 'Site group ID'},
+        'tenant': {'type': 'integer', 'description': 'Tenant ID'},
+        'facility': {'type': 'string', 'description': 'Facility name'},
+        'time_zone': {'type': 'string', 'description': 'Time zone'},
+        'description': {'type': 'string', 'description': 'Description'},
+        'physical_address': {'type': 'string', 'description': 'Physical address'},
+        'shipping_address': {'type': 'string', 'description': 'Shipping address'},
+        'latitude': {'type': 'number', 'description': 'Latitude'},
+        'longitude': {'type': 'number', 'description': 'Longitude'},
+        'comments': {'type': 'string', 'description': 'Comments'}
     },
     'required': ['id']
 },

@@ -136,7 +148,14 @@ TOOL_DEFINITIONS = {
 },
 'dcim_update_location': {
     'description': 'Update an existing location',
-    'properties': {'id': {'type': 'integer', 'description': 'Location ID'}},
+    'properties': {
+        'id': {'type': 'integer', 'description': 'Location ID'},
+        'name': {'type': 'string', 'description': 'New name'},
+        'slug': {'type': 'string', 'description': 'New slug'},
+        'site': {'type': 'integer', 'description': 'Site ID'},
+        'parent': {'type': 'integer', 'description': 'Parent location ID'},
+        'description': {'type': 'string', 'description': 'Description'}
+    },
     'required': ['id']
 },
 'dcim_delete_location': {

@@ -171,7 +190,18 @@ TOOL_DEFINITIONS = {
 },
 'dcim_update_rack': {
     'description': 'Update an existing rack',
-    'properties': {'id': {'type': 'integer', 'description': 'Rack ID'}},
+    'properties': {
+        'id': {'type': 'integer', 'description': 'Rack ID'},
+        'name': {'type': 'string', 'description': 'New name'},
+        'site': {'type': 'integer', 'description': 'Site ID'},
+        'location': {'type': 'integer', 'description': 'Location ID'},
+        'status': {'type': 'string', 'description': 'Status'},
+        'role': {'type': 'integer', 'description': 'Role ID'},
+        'tenant': {'type': 'integer', 'description': 'Tenant ID'},
+        'u_height': {'type': 'integer', 'description': 'Rack height in U'},
+        'description': {'type': 'string', 'description': 'Description'},
+        'comments': {'type': 'string', 'description': 'Comments'}
+    },
     'required': ['id']
 },
 'dcim_delete_rack': {

@@ -198,7 +228,12 @@ TOOL_DEFINITIONS = {
 },
 'dcim_update_manufacturer': {
     'description': 'Update an existing manufacturer',
-    'properties': {'id': {'type': 'integer', 'description': 'Manufacturer ID'}},
+    'properties': {
+        'id': {'type': 'integer', 'description': 'Manufacturer ID'},
+        'name': {'type': 'string', 'description': 'New name'},
+        'slug': {'type': 'string', 'description': 'New slug'},
+        'description': {'type': 'string', 'description': 'Description'}
+    },
     'required': ['id']
 },
 'dcim_delete_manufacturer': {

@@ -230,7 +265,16 @@ TOOL_DEFINITIONS = {
 },
 'dcim_update_device_type': {
     'description': 'Update an existing device type',
-    'properties': {'id': {'type': 'integer', 'description': 'Device type ID'}},
+    'properties': {
+        'id': {'type': 'integer', 'description': 'Device type ID'},
+        'manufacturer': {'type': 'integer', 'description': 'Manufacturer ID'},
+        'model': {'type': 'string', 'description': 'Model name'},
+        'slug': {'type': 'string', 'description': 'New slug'},
+        'u_height': {'type': 'number', 'description': 'Height in rack units'},
+        'is_full_depth': {'type': 'boolean', 'description': 'Is full depth'},
+        'description': {'type': 'string', 'description': 'Description'},
+        'comments': {'type': 'string', 'description': 'Comments'}
+    },
     'required': ['id']
 },
 'dcim_delete_device_type': {

@@ -259,7 +303,14 @@ TOOL_DEFINITIONS = {
 },
 'dcim_update_device_role': {
     'description': 'Update an existing device role',
-    'properties': {'id': {'type': 'integer', 'description': 'Device role ID'}},
+    'properties': {
+        'id': {'type': 'integer', 'description': 'Device role ID'},
+        'name': {'type': 'string', 'description': 'New name'},
+        'slug': {'type': 'string', 'description': 'New slug'},
+        'color': {'type': 'string', 'description': 'Hex color code'},
+        'vm_role': {'type': 'boolean', 'description': 'Can be assigned to VMs'},
+        'description': {'type': 'string', 'description': 'Description'}
+    },
     'required': ['id']
 },
 'dcim_delete_device_role': {

@@ -290,7 +341,13 @@ TOOL_DEFINITIONS = {
 },
 'dcim_update_platform': {
     'description': 'Update an existing platform',
-    'properties': {'id': {'type': 'integer', 'description': 'Platform ID'}},
+    'properties': {
+        'id': {'type': 'integer', 'description': 'Platform ID'},
+        'name': {'type': 'string', 'description': 'New name'},
+        'slug': {'type': 'string', 'description': 'New slug'},
+        'manufacturer': {'type': 'integer', 'description': 'Manufacturer ID'},
+        'description': {'type': 'string', 'description': 'Description'}
+    },
     'required': ['id']
 },
 'dcim_delete_platform': {

@@ -326,7 +383,13 @@ TOOL_DEFINITIONS = {
     'status': {'type': 'string', 'description': 'Device status'},
     'rack': {'type': 'integer', 'description': 'Rack ID'},
     'position': {'type': 'number', 'description': 'Position in rack'},
-    'serial': {'type': 'string', 'description': 'Serial number'}
+    'serial': {'type': 'string', 'description': 'Serial number'},
+    'platform': {'type': 'integer', 'description': 'Platform ID'},
+    'primary_ip4': {'type': 'integer', 'description': 'Primary IPv4 address ID'},
+    'primary_ip6': {'type': 'integer', 'description': 'Primary IPv6 address ID'},
+    'asset_tag': {'type': 'string', 'description': 'Asset tag'},
+    'description': {'type': 'string', 'description': 'Description'},
+    'comments': {'type': 'string', 'description': 'Comments'}
 },
 'required': ['name', 'device_type', 'role', 'site']
 },

@@ -335,7 +398,17 @@ TOOL_DEFINITIONS = {
     'properties': {
         'id': {'type': 'integer', 'description': 'Device ID'},
         'name': {'type': 'string', 'description': 'New name'},
-        'status': {'type': 'string', 'description': 'New status'}
+        'status': {'type': 'string', 'description': 'New status'},
+        'platform': {'type': 'integer', 'description': 'Platform ID'},
+        'primary_ip4': {'type': 'integer', 'description': 'Primary IPv4 address ID'},
+        'primary_ip6': {'type': 'integer', 'description': 'Primary IPv6 address ID'},
+        'serial': {'type': 'string', 'description': 'Serial number'},
+        'asset_tag': {'type': 'string', 'description': 'Asset tag'},
+        'site': {'type': 'integer', 'description': 'Site ID'},
+        'rack': {'type': 'integer', 'description': 'Rack ID'},
+        'position': {'type': 'number', 'description': 'Position in rack'},
+        'description': {'type': 'string', 'description': 'Description'},
+        'comments': {'type': 'string', 'description': 'Comments'}
     },
     'required': ['id']
 },

@@ -370,7 +443,18 @@ TOOL_DEFINITIONS = {
 },
 'dcim_update_interface': {
     'description': 'Update an existing interface',
-    'properties': {'id': {'type': 'integer', 'description': 'Interface ID'}},
+    'properties': {
+        'id': {'type': 'integer', 'description': 'Interface ID'},
+        'name': {'type': 'string', 'description': 'New name'},
+        'type': {'type': 'string', 'description': 'Interface type'},
+        'enabled': {'type': 'boolean', 'description': 'Interface enabled'},
+        'mtu': {'type': 'integer', 'description': 'MTU'},
+        'mac_address': {'type': 'string', 'description': 'MAC address'},
+        'description': {'type': 'string', 'description': 'Description'},
+        'mode': {'type': 'string', 'description': 'VLAN mode'},
+        'untagged_vlan': {'type': 'integer', 'description': 'Untagged VLAN ID'},
+        'tagged_vlans': {'type': 'array', 'description': 'Tagged VLAN IDs'}
+    },
     'required': ['id']
 },
 'dcim_delete_interface': {

@@ -404,7 +488,15 @@ TOOL_DEFINITIONS = {
 },
 'dcim_update_cable': {
     'description': 'Update an existing cable',
-    'properties': {'id': {'type': 'integer', 'description': 'Cable ID'}},
+    'properties': {
+        'id': {'type': 'integer', 'description': 'Cable ID'},
+        'type': {'type': 'string', 'description': 'Cable type'},
+        'status': {'type': 'string', 'description': 'Cable status'},
+        'label': {'type': 'string', 'description': 'Cable label'},
+        'color': {'type': 'string', 'description': 'Cable color'},
+        'length': {'type': 'number', 'description': 'Cable length'},
+        'length_unit': {'type': 'string', 'description': 'Length unit'}
+    },
     'required': ['id']
 },
 'dcim_delete_cable': {

@@ -492,7 +584,15 @@ TOOL_DEFINITIONS = {
 },
 'ipam_update_vrf': {
     'description': 'Update an existing VRF',
-    'properties': {'id': {'type': 'integer', 'description': 'VRF ID'}},
+    'properties': {
+        'id': {'type': 'integer', 'description': 'VRF ID'},
+        'name': {'type': 'string', 'description': 'New name'},
+        'rd': {'type': 'string', 'description': 'Route distinguisher'},
+        'tenant': {'type': 'integer', 'description': 'Tenant ID'},
+        'enforce_unique': {'type': 'boolean', 'description': 'Enforce unique IPs'},
+        'description': {'type': 'string', 'description': 'Description'},
+        'comments': {'type': 'string', 'description': 'Comments'}
+    },
     'required': ['id']
 },
 'ipam_delete_vrf': {

@@ -531,7 +631,19 @@ TOOL_DEFINITIONS = {
 },
 'ipam_update_prefix': {
     'description': 'Update an existing prefix',
-    'properties': {'id': {'type': 'integer', 'description': 'Prefix ID'}},
+    'properties': {
+        'id': {'type': 'integer', 'description': 'Prefix ID'},
+        'prefix': {'type': 'string', 'description': 'Prefix in CIDR notation'},
+        'status': {'type': 'string', 'description': 'Status'},
+        'site': {'type': 'integer', 'description': 'Site ID'},
+        'vrf': {'type': 'integer', 'description': 'VRF ID'},
+        'vlan': {'type': 'integer', 'description': 'VLAN ID'},
+        'role': {'type': 'integer', 'description': 'Role ID'},
+        'tenant': {'type': 'integer', 'description': 'Tenant ID'},
+        'is_pool': {'type': 'boolean', 'description': 'Is a pool'},
+        'description': {'type': 'string', 'description': 'Description'},
+        'comments': {'type': 'string', 'description': 'Comments'}
+    },
     'required': ['id']
 },
 'ipam_delete_prefix': {

@@ -582,7 +694,18 @@ TOOL_DEFINITIONS = {
 },
 'ipam_update_ip_address': {
     'description': 'Update an existing IP address',
-    'properties': {'id': {'type': 'integer', 'description': 'IP address ID'}},
+    'properties': {
+        'id': {'type': 'integer', 'description': 'IP address ID'},
+        'address': {'type': 'string', 'description': 'IP address with prefix length'},
+        'status': {'type': 'string', 'description': 'Status'},
+        'vrf': {'type': 'integer', 'description': 'VRF ID'},
+        'tenant': {'type': 'integer', 'description': 'Tenant ID'},
+        'dns_name': {'type': 'string', 'description': 'DNS name'},
+        'description': {'type': 'string', 'description': 'Description'},
+        'comments': {'type': 'string', 'description': 'Comments'},
+        'assigned_object_type': {'type': 'string', 'description': 'Object type to assign to'},
+        'assigned_object_id': {'type': 'integer', 'description': 'Object ID to assign to'}
+    },
     'required': ['id']
 },
 'ipam_delete_ip_address': {

@@ -647,7 +770,18 @@ TOOL_DEFINITIONS = {
 },
 'ipam_update_vlan': {
     'description': 'Update an existing VLAN',
-    'properties': {'id': {'type': 'integer', 'description': 'VLAN ID'}},
+    'properties': {
+        'id': {'type': 'integer', 'description': 'VLAN ID'},
+        'vid': {'type': 'integer', 'description': 'VLAN ID number'},
+        'name': {'type': 'string', 'description': 'VLAN name'},
+        'status': {'type': 'string', 'description': 'Status'},
+        'site': {'type': 'integer', 'description': 'Site ID'},
+        'group': {'type': 'integer', 'description': 'VLAN group ID'},
+        'tenant': {'type': 'integer', 'description': 'Tenant ID'},
+        'role': {'type': 'integer', 'description': 'Role ID'},
+        'description': {'type': 'string', 'description': 'Description'},
+        'comments': {'type': 'string', 'description': 'Comments'}
+    },
     'required': ['id']
 },
 'ipam_delete_vlan': {
@@ -757,16 +891,17 @@ TOOL_DEFINITIONS = {
     'properties': {'id': {'type': 'integer', 'description': 'Provider ID'}},
     'required': ['id']
 },
-'circuits_list_circuit_types': {
+# NOTE: circuit_types tools shortened to meet 28-char limit
+'circ_list_types': {
     'description': 'List all circuit types in NetBox',
     'properties': {'name': {'type': 'string', 'description': 'Filter by name'}}
 },
-'circuits_get_circuit_type': {
+'circ_get_type': {
     'description': 'Get a specific circuit type by ID',
     'properties': {'id': {'type': 'integer', 'description': 'Circuit type ID'}},
     'required': ['id']
 },
-'circuits_create_circuit_type': {
+'circ_create_type': {
     'description': 'Create a new circuit type',
     'properties': {
         'name': {'type': 'string', 'description': 'Type name'},

@@ -809,19 +944,20 @@ TOOL_DEFINITIONS = {
     'properties': {'id': {'type': 'integer', 'description': 'Circuit ID'}},
     'required': ['id']
 },
-'circuits_list_circuit_terminations': {
+# NOTE: circuit_terminations tools shortened to meet 28-char limit
+'circ_list_terminations': {
     'description': 'List all circuit terminations in NetBox',
     'properties': {
         'circuit_id': {'type': 'integer', 'description': 'Filter by circuit ID'},
         'site_id': {'type': 'integer', 'description': 'Filter by site ID'}
     }
 },
-'circuits_get_circuit_termination': {
+'circ_get_termination': {
     'description': 'Get a specific circuit termination by ID',
     'properties': {'id': {'type': 'integer', 'description': 'Termination ID'}},
     'required': ['id']
 },
-'circuits_create_circuit_termination': {
+'circ_create_termination': {
     'description': 'Create a new circuit termination',
     'properties': {
         'circuit': {'type': 'integer', 'description': 'Circuit ID'},

@@ -832,16 +968,18 @@ TOOL_DEFINITIONS = {
 },

 # ==================== Virtualization Tools ====================
-'virtualization_list_cluster_types': {
+# NOTE: Tool names shortened from 'virtualization_' to 'virt_' to meet
+# 28-char limit (Claude API 64-char limit minus 36-char prefix)
+'virt_list_cluster_types': {
     'description': 'List all cluster types in NetBox',
     'properties': {'name': {'type': 'string', 'description': 'Filter by name'}}
 },
-'virtualization_get_cluster_type': {
+'virt_get_cluster_type': {
     'description': 'Get a specific cluster type by ID',
     'properties': {'id': {'type': 'integer', 'description': 'Cluster type ID'}},
     'required': ['id']
 },
-'virtualization_create_cluster_type': {
+'virt_create_cluster_type': {
     'description': 'Create a new cluster type',
     'properties': {
         'name': {'type': 'string', 'description': 'Type name'},

@@ -849,16 +987,16 @@ TOOL_DEFINITIONS = {
     },
     'required': ['name', 'slug']
 },
-'virtualization_list_cluster_groups': {
+'virt_list_cluster_groups': {
     'description': 'List all cluster groups in NetBox',
     'properties': {'name': {'type': 'string', 'description': 'Filter by name'}}
 },
-'virtualization_get_cluster_group': {
+'virt_get_cluster_group': {
     'description': 'Get a specific cluster group by ID',
     'properties': {'id': {'type': 'integer', 'description': 'Cluster group ID'}},
     'required': ['id']
 },
-'virtualization_create_cluster_group': {
+'virt_create_cluster_group': {
     'description': 'Create a new cluster group',
     'properties': {
         'name': {'type': 'string', 'description': 'Group name'},

@@ -866,7 +1004,7 @@ TOOL_DEFINITIONS = {
     },
     'required': ['name', 'slug']
 },
-'virtualization_list_clusters': {
+'virt_list_clusters': {
     'description': 'List all clusters in NetBox',
     'properties': {
         'name': {'type': 'string', 'description': 'Filter by name'},

@@ -875,12 +1013,12 @@ TOOL_DEFINITIONS = {
         'site_id': {'type': 'integer', 'description': 'Filter by site ID'}
     }
 },
-'virtualization_get_cluster': {
+'virt_get_cluster': {
     'description': 'Get a specific cluster by ID',
     'properties': {'id': {'type': 'integer', 'description': 'Cluster ID'}},
     'required': ['id']
 },
-'virtualization_create_cluster': {
+'virt_create_cluster': {
     'description': 'Create a new cluster',
     'properties': {
         'name': {'type': 'string', 'description': 'Cluster name'},

@@ -891,17 +1029,27 @@ TOOL_DEFINITIONS = {
     },
     'required': ['name', 'type']
 },
-'virtualization_update_cluster': {
+'virt_update_cluster': {
     'description': 'Update an existing cluster',
-    'properties': {'id': {'type': 'integer', 'description': 'Cluster ID'}},
+    'properties': {
+        'id': {'type': 'integer', 'description': 'Cluster ID'},
+        'name': {'type': 'string', 'description': 'New name'},
+        'type': {'type': 'integer', 'description': 'Cluster type ID'},
+        'group': {'type': 'integer', 'description': 'Cluster group ID'},
+        'site': {'type': 'integer', 'description': 'Site ID'},
+        'status': {'type': 'string', 'description': 'Status'},
+        'tenant': {'type': 'integer', 'description': 'Tenant ID'},
+        'description': {'type': 'string', 'description': 'Description'},
+        'comments': {'type': 'string', 'description': 'Comments'}
+    },
     'required': ['id']
 },
-'virtualization_delete_cluster': {
+'virt_delete_cluster': {
     'description': 'Delete a cluster',
     'properties': {'id': {'type': 'integer', 'description': 'Cluster ID'}},
     'required': ['id']
 },
-'virtualization_list_virtual_machines': {
+'virt_list_vms': {
     'description': 'List all virtual machines in NetBox',
     'properties': {
         'name': {'type': 'string', 'description': 'Filter by name'},

@@ -910,12 +1058,12 @@ TOOL_DEFINITIONS = {
         'status': {'type': 'string', 'description': 'Filter by status'}
     }
 },
-'virtualization_get_virtual_machine': {
+'virt_get_vm': {
     'description': 'Get a specific virtual machine by ID',
     'properties': {'id': {'type': 'integer', 'description': 'VM ID'}},
     'required': ['id']
 },
-'virtualization_create_virtual_machine': {
+'virt_create_vm': {
     'description': 'Create a new virtual machine',
     'properties': {
         'name': {'type': 'string', 'description': 'VM name'},

@@ -928,29 +1076,45 @@ TOOL_DEFINITIONS = {
     },
     'required': ['name']
 },
-'virtualization_update_virtual_machine': {
+'virt_update_vm': {
     'description': 'Update an existing virtual machine',
-    'properties': {'id': {'type': 'integer', 'description': 'VM ID'}},
+    'properties': {
+        'id': {'type': 'integer', 'description': 'VM ID'},
+        'name': {'type': 'string', 'description': 'New name'},
+        'status': {'type': 'string', 'description': 'Status'},
+        'cluster': {'type': 'integer', 'description': 'Cluster ID'},
+        'site': {'type': 'integer', 'description': 'Site ID'},
+        'role': {'type': 'integer', 'description': 'Role ID'},
+        'tenant': {'type': 'integer', 'description': 'Tenant ID'},
+        'platform': {'type': 'integer', 'description': 'Platform ID'},
+        'vcpus': {'type': 'number', 'description': 'Number of vCPUs'},
+        'memory': {'type': 'integer', 'description': 'Memory in MB'},
+        'disk': {'type': 'integer', 'description': 'Disk in GB'},
+        'primary_ip4': {'type': 'integer', 'description': 'Primary IPv4 address ID'},
+        'primary_ip6': {'type': 'integer', 'description': 'Primary IPv6 address ID'},
+        'description': {'type': 'string', 'description': 'Description'},
+        'comments': {'type': 'string', 'description': 'Comments'}
+    },
     'required': ['id']
 },
-'virtualization_delete_virtual_machine': {
+'virt_delete_vm': {
     'description': 'Delete a virtual machine',
     'properties': {'id': {'type': 'integer', 'description': 'VM ID'}},
     'required': ['id']
 },
-'virtualization_list_vm_interfaces': {
+'virt_list_vm_ifaces': {
     'description': 'List all VM interfaces in NetBox',
     'properties': {
         'virtual_machine_id': {'type': 'integer', 'description': 'Filter by VM ID'},
         'name': {'type': 'string', 'description': 'Filter by name'}
     }
 },
-'virtualization_get_vm_interface': {
+'virt_get_vm_iface': {
     'description': 'Get a specific VM interface by ID',
     'properties': {'id': {'type': 'integer', 'description': 'Interface ID'}},
     'required': ['id']
 },
-'virtualization_create_vm_interface': {
+'virt_create_vm_iface': {
     'description': 'Create a new VM interface',
     'properties': {
         'virtual_machine': {'type': 'integer', 'description': 'VM ID'},

@@ -1088,16 +1252,18 @@ TOOL_DEFINITIONS = {
 },

 # ==================== Wireless Tools ====================
-'wireless_list_wireless_lan_groups': {
+# NOTE: Tool names shortened from 'wireless_' to 'wlan_' to meet
+# 28-char limit (Claude API 64-char limit minus 36-char prefix)
+'wlan_list_groups': {
     'description': 'List all wireless LAN groups in NetBox',
     'properties': {'name': {'type': 'string', 'description': 'Filter by name'}}
 },
-'wireless_get_wireless_lan_group': {
+'wlan_get_group': {
     'description': 'Get a specific wireless LAN group by ID',
     'properties': {'id': {'type': 'integer', 'description': 'WLAN group ID'}},
     'required': ['id']
 },
-'wireless_create_wireless_lan_group': {
+'wlan_create_group': {
     'description': 'Create a new wireless LAN group',
     'properties': {
         'name': {'type': 'string', 'description': 'Group name'},

@@ -1105,7 +1271,7 @@ TOOL_DEFINITIONS = {
     },
     'required': ['name', 'slug']
 },
-'wireless_list_wireless_lans': {
+'wlan_list_lans': {
     'description': 'List all wireless LANs in NetBox',
     'properties': {
         'ssid': {'type': 'string', 'description': 'Filter by SSID'},

@@ -1113,12 +1279,12 @@ TOOL_DEFINITIONS = {
         'status': {'type': 'string', 'description': 'Filter by status'}
     }
 },
-'wireless_get_wireless_lan': {
+'wlan_get_lan': {
     'description': 'Get a specific wireless LAN by ID',
     'properties': {'id': {'type': 'integer', 'description': 'WLAN ID'}},
     'required': ['id']
 },
-'wireless_create_wireless_lan': {
+'wlan_create_lan': {
     'description': 'Create a new wireless LAN',
     'properties': {
         'ssid': {'type': 'string', 'description': 'SSID'},

@@ -1128,14 +1294,14 @@ TOOL_DEFINITIONS = {
     },
     'required': ['ssid']
 },
-'wireless_list_wireless_links': {
+'wlan_list_links': {
     'description': 'List all wireless links in NetBox',
     'properties': {
         'ssid': {'type': 'string', 'description': 'Filter by SSID'},
         'status': {'type': 'string', 'description': 'Filter by status'}
     }
 },
-'wireless_get_wireless_link': {
+'wlan_get_link': {
     'description': 'Get a specific wireless link by ID',
     'properties': {'id': {'type': 'integer', 'description': 'Link ID'}},
     'required': ['id']
@@ -1241,6 +1407,52 @@ TOOL_DEFINITIONS = {
 }


+# Map shortened tool names to (category, method_name) for routing.
+# This is necessary because tool names were shortened to meet the 28-character
+# limit imposed by Claude API's 64-character tool name limit minus the
+# 36-character prefix used by Claude Code for MCP tools.
+TOOL_NAME_MAP = {
+    # Virtualization tools (virt_ -> virtualization category)
+    'virt_list_cluster_types': ('virtualization', 'list_cluster_types'),
+    'virt_get_cluster_type': ('virtualization', 'get_cluster_type'),
+    'virt_create_cluster_type': ('virtualization', 'create_cluster_type'),
+    'virt_list_cluster_groups': ('virtualization', 'list_cluster_groups'),
+    'virt_get_cluster_group': ('virtualization', 'get_cluster_group'),
+    'virt_create_cluster_group': ('virtualization', 'create_cluster_group'),
+    'virt_list_clusters': ('virtualization', 'list_clusters'),
+    'virt_get_cluster': ('virtualization', 'get_cluster'),
+    'virt_create_cluster': ('virtualization', 'create_cluster'),
+    'virt_update_cluster': ('virtualization', 'update_cluster'),
+    'virt_delete_cluster': ('virtualization', 'delete_cluster'),
+    'virt_list_vms': ('virtualization', 'list_virtual_machines'),
+    'virt_get_vm': ('virtualization', 'get_virtual_machine'),
+    'virt_create_vm': ('virtualization', 'create_virtual_machine'),
+    'virt_update_vm': ('virtualization', 'update_virtual_machine'),
+    'virt_delete_vm': ('virtualization', 'delete_virtual_machine'),
+    'virt_list_vm_ifaces': ('virtualization', 'list_vm_interfaces'),
+    'virt_get_vm_iface': ('virtualization', 'get_vm_interface'),
+    'virt_create_vm_iface': ('virtualization', 'create_vm_interface'),
+
+    # Circuits tools (circ_ -> circuits category, for shortened names only)
+    'circ_list_types': ('circuits', 'list_circuit_types'),
+    'circ_get_type': ('circuits', 'get_circuit_type'),
+    'circ_create_type': ('circuits', 'create_circuit_type'),
+    'circ_list_terminations': ('circuits', 'list_circuit_terminations'),
+    'circ_get_termination': ('circuits', 'get_circuit_termination'),
+    'circ_create_termination': ('circuits', 'create_circuit_termination'),
+
+    # Wireless tools (wlan_ -> wireless category)
+    'wlan_list_groups': ('wireless', 'list_wireless_lan_groups'),
+    'wlan_get_group': ('wireless', 'get_wireless_lan_group'),
+    'wlan_create_group': ('wireless', 'create_wireless_lan_group'),
+    'wlan_list_lans': ('wireless', 'list_wireless_lans'),
+    'wlan_get_lan': ('wireless', 'get_wireless_lan'),
+    'wlan_create_lan': ('wireless', 'create_wireless_lan'),
+    'wlan_list_links': ('wireless', 'list_wireless_links'),
+    'wlan_get_link': ('wireless', 'get_wireless_link'),
+}


 class NetBoxMCPServer:
     """MCP Server for NetBox integration"""

@@ -1314,12 +1526,21 @@ class NetBoxMCPServer:
         )]

     async def _route_tool(self, name: str, arguments: dict):
-        """Route tool call to appropriate handler."""
-        parts = name.split('_', 1)
-        if len(parts) != 2:
-            raise ValueError(f"Invalid tool name format: {name}")
-
-        category, method_name = parts[0], parts[1]
+        """Route tool call to appropriate handler.
+
+        Tool names may be shortened (e.g., 'virt_list_vms' instead of
+        'virtualization_list_virtual_machines') to meet the 28-character
+        limit. TOOL_NAME_MAP handles the translation to actual method names.
+        """
+        # Check if this is a mapped short name
+        if name in TOOL_NAME_MAP:
+            category, method_name = TOOL_NAME_MAP[name]
+        else:
+            # Fall back to original logic for unchanged tools
+            parts = name.split('_', 1)
+            if len(parts) != 2:
+                raise ValueError(f"Invalid tool name format: {name}")
+            category, method_name = parts[0], parts[1]

         # Map category to tool class
         tool_map = {
plugins/claude-config-maintainer/hooks/enforce-rules.sh (new executable file, 68 lines)

@@ -0,0 +1,68 @@
#!/bin/bash
# claude-config-maintainer: enforce mandatory behavior rules
# Checks if CLAUDE.md has the rules, adds them if missing

PREFIX="[claude-config-maintainer]"

# Find CLAUDE.md in current directory or parent
CLAUDE_MD=""
if [ -f "./CLAUDE.md" ]; then
    CLAUDE_MD="./CLAUDE.md"
elif [ -f "../CLAUDE.md" ]; then
    CLAUDE_MD="../CLAUDE.md"
fi

# If no CLAUDE.md found, exit silently
if [ -z "$CLAUDE_MD" ]; then
    exit 0
fi

# Check if mandatory rules exist
if grep -q "MANDATORY BEHAVIOR RULES" "$CLAUDE_MD" 2>/dev/null; then
    # Rules exist, all good
    exit 0
fi

# Rules missing - add them
RULES='## ⛔ MANDATORY BEHAVIOR RULES - READ FIRST

**These rules are NON-NEGOTIABLE. Violating them wastes the user'\''s time and money.**

### 1. WHEN USER ASKS YOU TO CHECK SOMETHING - CHECK EVERYTHING
- Search ALL locations, not just where you think it is
- Check cache directories: `~/.claude/plugins/cache/`
- Check installed: `~/.claude/plugins/marketplaces/`
- Check source directories
- **NEVER say "no" or "that'\''s not the issue" without exhaustive verification**

### 2. WHEN USER SAYS SOMETHING IS WRONG - BELIEVE THEM
- The user knows their system better than you
- Investigate thoroughly before disagreeing
- **Your confidence is often wrong. User'\''s instincts are often right.**

### 3. NEVER SAY "DONE" WITHOUT VERIFICATION
- Run the actual command/script to verify
- Show the output to the user
- **"Done" means VERIFIED WORKING, not "I made changes"**

### 4. SHOW EXACTLY WHAT USER ASKS FOR
- If user asks for messages, show the MESSAGES
- If user asks for code, show the CODE
- **Do not interpret or summarize unless asked**

**FAILURE TO FOLLOW THESE RULES = WASTED USER TIME = UNACCEPTABLE**

---

'

# Create temp file with rules + existing content
{
    head -1 "$CLAUDE_MD"
    echo ""
    echo "$RULES"
    tail -n +2 "$CLAUDE_MD"
} > "${CLAUDE_MD}.tmp"

mv "${CLAUDE_MD}.tmp" "$CLAUDE_MD"
echo "$PREFIX Added mandatory behavior rules to CLAUDE.md"
plugins/claude-config-maintainer/hooks/hooks.json (new file, 10 lines)

@@ -0,0 +1,10 @@
{
  "hooks": {
    "SessionStart": [
      {
        "type": "command",
        "command": "${CLAUDE_PLUGIN_ROOT}/hooks/enforce-rules.sh"
      }
    ]
  }
}
@@ -111,6 +111,7 @@ cmdb-assistant/
│ └── plugin.json            # Plugin manifest
├── .mcp.json                 # MCP server configuration
├── commands/
│ ├── initial-setup.md        # Setup wizard
│ ├── cmdb-search.md          # Search command
│ ├── cmdb-device.md          # Device management
│ ├── cmdb-ip.md              # IP management
@@ -70,13 +70,15 @@ cat ~/.config/claude/netbox.env 2>/dev/null || echo "FILE_NOT_FOUND"
### Step 3.3: Gather NetBox Information

Use AskUserQuestion:
- Question: "What is your NetBox server URL? (e.g., https://netbox.company.com)"
- Question: "What is your NetBox API URL? (e.g., https://netbox.company.com/api)"
- Header: "NetBox URL"
- Options:
  - "Other (I'll provide the URL)"

Ask user to provide the URL.

**Important:** The URL must include `/api` at the end. If the user provides a URL without `/api`, append it automatically.
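A minimal bash sketch of that normalization (illustrative only; `NETBOX_URL` is a hypothetical variable holding the user's answer, not something the wizard defines):

```bash
# Normalize the NetBox API URL: append /api if the user omitted it
NETBOX_URL="${NETBOX_URL%/}"              # drop a trailing slash, if any
if [[ "$NETBOX_URL" != */api ]]; then
    NETBOX_API_URL="${NETBOX_URL}/api"
else
    NETBOX_API_URL="$NETBOX_URL"
fi
echo "Using NetBox API URL: $NETBOX_API_URL"
```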
### Step 3.4: Create Configuration File

```bash
@@ -120,9 +122,11 @@ Use AskUserQuestion:
### Step 4.1: Test Configuration (if token was added)

```bash
source ~/.config/claude/netbox.env && curl -s -o /dev/null -w "%{http_code}" -H "Authorization: Token $NETBOX_API_TOKEN" "$NETBOX_API_URL/api/"
source ~/.config/claude/netbox.env && curl -s -o /dev/null -w "%{http_code}" -H "Authorization: Token $NETBOX_API_TOKEN" "$NETBOX_API_URL/"
```

**Note:** The URL already includes `/api`, so we just append `/` for the root API endpoint.

Report result:
- 200: Success
- 403: Invalid token
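Purely as an illustration of acting on that status code (the wizard only reports it; the case handling below is an assumption, not part of the command):

```bash
# Capture the HTTP status and translate it into a human-readable result
STATUS=$(curl -s -o /dev/null -w "%{http_code}" \
    -H "Authorization: Token $NETBOX_API_TOKEN" "$NETBOX_API_URL/")
case "$STATUS" in
    200) echo "NetBox token OK" ;;
    403) echo "Invalid token - check NETBOX_API_TOKEN" ;;
    *)   echo "Unexpected response: $STATUS" ;;
esac
```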
@@ -9,5 +9,6 @@
  "homepage": "https://gitea.hotserv.cloud/personal-projects/leo-claude-mktplace/src/branch/main/plugins/code-sentinel/README.md",
  "repository": "https://gitea.hotserv.cloud/personal-projects/leo-claude-mktplace.git",
  "license": "MIT",
  "keywords": ["security", "refactoring", "code-quality", "static-analysis", "hooks"]
  "keywords": ["security", "refactoring", "code-quality", "static-analysis", "hooks"],
  "commands": ["./commands/"]
}
@@ -5,8 +5,8 @@
      "matcher": "Write|Edit|MultiEdit",
      "hooks": [
        {
          "type": "prompt",
          "prompt": "[code-sentinel] SECURITY CHECK - Before writing this code, scan for these patterns:\n\n**Critical (BLOCK if found):**\n- eval(), exec() with user input\n- SQL string concatenation (SQL injection)\n- shell=True with user input (command injection)\n- Hardcoded secrets (API keys, passwords, tokens)\n- Pickle/marshal deserialization of untrusted data\n- innerHTML/dangerouslySetInnerHTML with user content (XSS)\n\n**Warning (WARN but allow):**\n- subprocess without input validation\n- File operations without path sanitization\n- HTTP requests without timeout\n- Broad exception catches (except:)\n- Debug/print statements with sensitive data\n\n**Response:**\n- If CRITICAL found: STOP with '[code-sentinel] BLOCKED:', explain the issue, suggest safe alternative\n- If WARNING found: Note briefly with '[code-sentinel] WARNING:', proceed with suggestion\n- If clean: Proceed silently (say nothing)\n\nDo NOT announce clean scans. Only speak if issues found."
          "type": "command",
          "command": "${CLAUDE_PLUGIN_ROOT}/hooks/security-check.sh"
        }
      ]
    }
plugins/code-sentinel/hooks/security-check.sh (new executable file, 62 lines)
@@ -0,0 +1,62 @@
#!/bin/bash
# code-sentinel security check hook
# Checks for obvious security issues in code files, skips config/docs
# Command hook - guaranteed predictable behavior

# Read tool input from stdin
INPUT=$(cat)

# Extract file_path from JSON input
FILE_PATH=$(echo "$INPUT" | grep -o '"file_path"[[:space:]]*:[[:space:]]*"[^"]*"' | head -1 | sed 's/.*"file_path"[[:space:]]*:[[:space:]]*"\([^"]*\)".*/\1/')

# If no file_path, exit silently
if [ -z "$FILE_PATH" ]; then
    exit 0
fi

# SKIP config/doc files entirely - exit silently
case "$FILE_PATH" in
    *.md|*.json|*.yml|*.yaml|*.txt|*.toml|*.ini|*.cfg|*.conf)
        exit 0
        ;;
    */docs/*|*/README*|*/CHANGELOG*|*/LICENSE*)
        exit 0
        ;;
    */.claude/*|*/.github/*|*/.vscode/*)
        exit 0
        ;;
esac

# For code files, extract content to check
# For Edit tool: check new_string
# For Write tool: check content
CONTENT=$(echo "$INPUT" | grep -o '"new_string"[[:space:]]*:[[:space:]]*"[^"]*"' | head -1 | sed 's/.*"new_string"[[:space:]]*:[[:space:]]*"\([^"]*\)".*/\1/')
if [ -z "$CONTENT" ]; then
    CONTENT=$(echo "$INPUT" | grep -o '"content"[[:space:]]*:[[:space:]]*"[^"]*"' | head -1 | sed 's/.*"content"[[:space:]]*:[[:space:]]*"\([^"]*\)".*/\1/')
fi

# If no content to check, exit silently
if [ -z "$CONTENT" ]; then
    exit 0
fi

# Check for hardcoded secrets patterns (obvious cases only)
if echo "$CONTENT" | grep -qiE '(api[_-]?key|api[_-]?secret|password|passwd|secret[_-]?key|auth[_-]?token)[[:space:]]*[=:][[:space:]]*["\x27][A-Za-z0-9+/=_-]{20,}["\x27]'; then
    echo "[code-sentinel] BLOCKED: Hardcoded secret detected"
    exit 1
fi

# Check for AWS keys pattern
if echo "$CONTENT" | grep -qE 'AKIA[A-Z0-9]{16}'; then
    echo "[code-sentinel] BLOCKED: AWS access key detected"
    exit 1
fi

# Check for private key headers
if echo "$CONTENT" | grep -qE '\-\-\-\-\-BEGIN (RSA |DSA |EC |OPENSSH )?PRIVATE KEY\-\-\-\-\-'; then
    echo "[code-sentinel] BLOCKED: Private key detected"
    exit 1
fi

# All other cases: exit silently (allow the edit)
exit 0
@@ -9,5 +9,6 @@
  "homepage": "https://gitea.hotserv.cloud/personal-projects/leo-claude-mktplace/src/branch/main/plugins/doc-guardian/README.md",
  "repository": "https://gitea.hotserv.cloud/personal-projects/leo-claude-mktplace.git",
  "license": "MIT",
  "keywords": ["documentation", "sync", "drift-detection", "automation", "hooks"]
  "keywords": ["documentation", "sync", "drift-detection", "automation", "hooks"],
  "commands": ["./commands/"]
}
@@ -5,8 +5,8 @@
      "matcher": "Write|Edit|MultiEdit",
      "hooks": [
        {
          "type": "prompt",
          "prompt": "[doc-guardian] QUICK drift check (DO NOT block workflow):\n\n1. ONLY check if the modified file is referenced in README.md, CLAUDE.md, or API docs in the SAME directory\n2. Do NOT read files or perform deep analysis - just note potential drift based on file name/path\n3. If potential drift: output a single line like '[doc-guardian] Note: {filename} changed - may affect {doc}. Run /doc-sync to verify.'\n4. If no obvious drift: say nothing\n\nIMPORTANT: This is notification-only. Do NOT read documentation files, do NOT make changes, do NOT use any tools. Just a quick mental check based on the file path."
          "type": "command",
          "command": "${CLAUDE_PLUGIN_ROOT}/hooks/notify.sh"
        }
      ]
    }
plugins/doc-guardian/hooks/notify.sh (new executable file, 23 lines)
@@ -0,0 +1,23 @@
#!/bin/bash
# doc-guardian notification hook
# Outputs a single notification for config file changes, nothing otherwise
# This is a command hook - guaranteed not to block workflow

# Read tool input from stdin (JSON with file_path)
INPUT=$(cat)

# Extract file_path from JSON input
FILE_PATH=$(echo "$INPUT" | grep -o '"file_path"[[:space:]]*:[[:space:]]*"[^"]*"' | head -1 | sed 's/.*"file_path"[[:space:]]*:[[:space:]]*"\([^"]*\)".*/\1/')

# If no file_path found, exit silently
if [ -z "$FILE_PATH" ]; then
    exit 0
fi

# Check if file is in a config directory (commands/, agents/, skills/, hooks/)
if echo "$FILE_PATH" | grep -qE '/(commands|agents|skills|hooks)/'; then
    echo "[doc-guardian] Config file modified. Run /doc-sync when ready."
fi

# Exit silently for all other files (no output = no blocking)
exit 0
@@ -10,7 +10,7 @@ git-flow streamlines common git operations with smart defaults, conventional com

| Command | Description |
|---------|-------------|
| `/commit` | Create commit with auto-generated conventional message |
| `/commit` | Create commit with auto-generated conventional message (with protected branch detection) |
| `/commit-push` | Commit and push in one operation |
| `/commit-merge` | Commit and merge into target branch |
| `/commit-sync` | Full sync: commit, push, and rebase on base branch |
@@ -79,7 +79,7 @@ chore/update-dependencies

### Safety Checks

- Warns before commits to protected branches
- **Protected branch detection**: Before committing, checks if you're on a protected branch (main, master, development, staging, production by default). Offers to create a feature branch automatically instead of committing directly to protected branches.
- Confirms force push operations
- Prevents accidental branch deletion

@@ -6,13 +6,44 @@ Create a git commit with an auto-generated conventional commit message based on

## Behavior

### Step 1: Analyze Changes
### Step 1: Check for Protected Branch

Before any commit operation, check if the current branch is protected:

1. Get current branch: `git branch --show-current`
2. Check against `GIT_PROTECTED_BRANCHES` (default: `main,master,development,staging,production`)

If on a protected branch, warn the user:

```
⚠️ You are on a protected branch: development

Protected branches typically have push restrictions that will prevent
direct commits from being pushed to the remote.

Options:
1. Create a feature branch and continue (Recommended)
2. Continue on this branch anyway (may fail on push)
3. Cancel
```

**If option 1 (create feature branch):**
- Prompt for branch type (feat/fix/chore/docs/refactor)
- Prompt for brief description
- Create branch using `/branch-start` naming conventions
- Continue with commit on the new branch

**If option 2 (continue anyway):**
- Proceed with commit (user accepts risk of push rejection)
- Display reminder: "Remember: push may be rejected by remote protection rules"
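A minimal bash sketch of the Step 1 check (illustrative only — the command performs this logic conversationally; the loop and variable names below are an assumption, not prescribed by the plugin):

```bash
# Protected-branch check: compare the current branch against GIT_PROTECTED_BRANCHES
PROTECTED="${GIT_PROTECTED_BRANCHES:-main,master,development,staging,production}"
CURRENT=$(git branch --show-current)

IFS=',' read -ra BRANCHES <<< "$PROTECTED"
for b in "${BRANCHES[@]}"; do
    if [[ "$CURRENT" == "$b" ]]; then
        echo "⚠️  You are on a protected branch: $CURRENT"
        break
    fi
done
```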
### Step 2: Analyze Changes

1. Run `git status` to see staged and unstaged changes
2. Run `git diff --staged` to examine staged changes
3. If nothing staged, prompt user to stage changes

### Step 2: Generate Commit Message
### Step 3: Generate Commit Message

Analyze the changes and generate a conventional commit message:

@@ -38,7 +69,7 @@ Analyze the changes and generate a conventional commit message:

**Scope:** Determined from changed files (e.g., `auth`, `api`, `ui`)

### Step 3: Confirm or Edit
### Step 4: Confirm or Edit

Present the generated message:

@@ -58,7 +89,7 @@ Options:
4. Cancel
```

### Step 4: Execute Commit
### Step 5: Execute Commit

If confirmed, run:

@@ -75,6 +106,7 @@ EOF

| Variable | Default | Description |
|----------|---------|-------------|
| `GIT_PROTECTED_BRANCHES` | `main,master,development,staging,production` | Branches that trigger protection warning |
| `GIT_COMMIT_STYLE` | `conventional` | Message style (conventional, simple, detailed) |
| `GIT_SIGN_COMMITS` | `false` | Use GPG signing |
| `GIT_CO_AUTHOR` | `true` | Include Claude co-author footer |
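For example, to narrow the protection list for a given environment (assuming the same comma-separated format as the default value shown above):

```bash
# Only treat main and production as protected in this shell session
export GIT_PROTECTED_BRANCHES="main,production"
```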
@@ -13,6 +13,9 @@ pr-review conducts comprehensive code reviews using specialized agents for secur
| `/pr-review <pr#>` | Full multi-agent review |
| `/pr-summary <pr#>` | Quick summary without full review |
| `/pr-findings <pr#>` | Filter findings by category/confidence |
| `/initial-setup` | Full interactive setup wizard |
| `/project-init` | Quick project setup (system already configured) |
| `/project-sync` | Sync configuration with current git remote |

## Review Agents

@@ -2,8 +2,8 @@
  "hooks": {
    "SessionStart": [
      {
        "type": "prompt",
        "prompt": "[pr-review] Perform silent startup checks:\n\n1. Check if MCP venvs exist at the installed marketplace location. If the marketplace is installed but venvs are missing, warn the user: '[pr-review] MCP venvs missing - run setup.sh from installed marketplace location'.\n\n2. Check if the project git remote matches .env configuration (GITEA_ORG/GITEA_REPO). If mismatch, warn: '[pr-review] Git remote mismatch - run /project-sync'.\n\nStay silent if all checks pass or not applicable. Be quick and non-blocking."
        "type": "command",
        "command": "${CLAUDE_PLUGIN_ROOT}/hooks/startup-check.sh"
      }
    ]
  }
plugins/pr-review/hooks/startup-check.sh (new executable file, 30 lines)
@@ -0,0 +1,30 @@
#!/bin/bash
# pr-review startup check hook
# Checks for common issues at session start
# All output MUST have [pr-review] prefix

PREFIX="[pr-review]"

# Check if MCP venv exists
PLUGIN_ROOT="${CLAUDE_PLUGIN_ROOT:-$(dirname "$(dirname "$(realpath "$0")")")}"
VENV_PATH="$PLUGIN_ROOT/mcp-servers/gitea/.venv/bin/python"

if [[ ! -f "$VENV_PATH" ]]; then
    echo "$PREFIX MCP venvs missing - run setup.sh from installed marketplace"
    exit 0
fi

# Check git remote vs .env config (only if .env exists)
if [[ -f ".env" ]]; then
    CONFIGURED_REPO=$(grep -E "^GITEA_REPO=" .env 2>/dev/null | cut -d'=' -f2 | tr -d '"' || true)
    if [[ -n "$CONFIGURED_REPO" ]]; then
        CURRENT_REMOTE=$(git remote get-url origin 2>/dev/null | sed 's/.*[:/]\([^/]*\/[^.]*\).*/\1/' || true)
        if [[ -n "$CURRENT_REMOTE" && "$CONFIGURED_REPO" != "$CURRENT_REMOTE" ]]; then
            echo "$PREFIX Git remote mismatch - run /pr-review:project-sync"
            exit 0
        fi
    fi
fi

# All checks passed - say nothing
exit 0
@@ -20,7 +20,7 @@ claude plugin install project-hygiene

## How It Works

The plugin registers a `task-completed` hook that runs after Claude completes any task. It:
The plugin registers a `PostToolUse` hook (on Write and Edit tools) that runs after Claude modifies files. It:

1. Scans for and deletes known temporary file patterns
2. Removes temporary directories (`__pycache__`, `.pytest_cache`, etc.)
@@ -1,365 +1,28 @@
#!/bin/bash
# project-hygiene cleanup hook
# Runs after task completion to clean up temp files and manage orphans
# Runs after file edits to clean up temp files
# All output MUST have [project-hygiene] prefix

set -euo pipefail

# Configuration
PREFIX="[project-hygiene]"

# Read tool input from stdin (discard - we don't need it for cleanup)
cat > /dev/null

PROJECT_ROOT="${PROJECT_ROOT:-.}"
PLUGIN_ROOT="${CLAUDE_PLUGIN_ROOT:-$(dirname "$(dirname "$(realpath "$0")")")}"
CONFIG_FILE="${PROJECT_ROOT}/.hygiene.json"
LOG_DIR="${PROJECT_ROOT}/.dev/logs"
SCRATCH_DIR="${PROJECT_ROOT}/.dev/scratch"
LOG_FILE="${LOG_DIR}/hygiene-$(date +%Y%m%d-%H%M%S).log"

# Default allowed root files (can be overridden by .hygiene.json)
DEFAULT_ALLOWED_ROOT=(
    ".git"
    ".gitignore"
    ".gitattributes"
    ".editorconfig"
    ".env"
    ".env.example"
    ".env.local"
    ".nvmrc"
    ".node-version"
    ".python-version"
    ".ruby-version"
    ".tool-versions"
    "README.md"
    "LICENSE"
    "CHANGELOG.md"
    "CONTRIBUTING.md"
    "CLAUDE.md"
    "package.json"
    "package-lock.json"
    "yarn.lock"
    "pnpm-lock.yaml"
    "Makefile"
    "Dockerfile"
    "docker-compose.yml"
    "docker-compose.yaml"
    "Cargo.toml"
    "Cargo.lock"
    "go.mod"
    "go.sum"
    "requirements.txt"
    "setup.py"
    "pyproject.toml"
    "poetry.lock"
    "Gemfile"
    "Gemfile.lock"
    "tsconfig.json"
    "jsconfig.json"
    ".eslintrc*"
    ".prettierrc*"
    "vite.config.*"
    "webpack.config.*"
    "rollup.config.*"
    ".hygiene.json"
)

# Temp file patterns to delete
TEMP_PATTERNS=(
    "*.tmp"
    "*.bak"
    "*.swp"
    "*.swo"
    "*~"
    ".DS_Store"
    "Thumbs.db"
    "*.log"
    "*.orig"
    "*.pyc"
    "*.pyo"
)

# Directory patterns to delete
TEMP_DIRS=(
    "__pycache__"
    ".pytest_cache"
    ".mypy_cache"
    ".ruff_cache"
    "node_modules/.cache"
    ".next/cache"
    ".nuxt/.cache"
    ".turbo"
    "*.egg-info"
    ".eggs"
    "dist"
    "build"
)

# Orphan patterns to identify
ORPHAN_PATTERNS=(
    "test_*.py"
    "debug_*"
    "*_backup.*"
    "*_old.*"
    "*_bak.*"
    "*.backup"
    "temp_*"
    "tmp_*"
)

# Initialize
DELETED_COUNT=0
WARNED_COUNT=0
ORPHAN_COUNT=0
MOVE_ORPHANS=false

# Logging function
log() {
    local msg="[$(date +%H:%M:%S)] $1"
    echo "$msg"
    if [[ -f "$LOG_FILE" ]]; then
        echo "$msg" >> "$LOG_FILE"
    fi
}

log_action() {
    local action="$1"
    local target="$2"
    log "  $action: $target"
}

# Load project-local config if exists
load_config() {
    if [[ -f "$CONFIG_FILE" ]]; then
        log "Loading config from $CONFIG_FILE"

        # Check if move_orphans is enabled
        if command -v jq &>/dev/null; then
            MOVE_ORPHANS=$(jq -r '.move_orphans // false' "$CONFIG_FILE" 2>/dev/null || echo "false")

            # Load additional allowed root files
            local extra_allowed
            extra_allowed=$(jq -r '.allowed_root_files // [] | .[]' "$CONFIG_FILE" 2>/dev/null || true)
            if [[ -n "$extra_allowed" ]]; then
                while IFS= read -r file; do
                    DEFAULT_ALLOWED_ROOT+=("$file")
                done <<< "$extra_allowed"
            fi

            # Load additional temp patterns
            local extra_temp
            extra_temp=$(jq -r '.temp_patterns // [] | .[]' "$CONFIG_FILE" 2>/dev/null || true)
            if [[ -n "$extra_temp" ]]; then
                while IFS= read -r pattern; do
                    TEMP_PATTERNS+=("$pattern")
                done <<< "$extra_temp"
            fi

            # Load ignore patterns (files to never touch)
            IGNORE_PATTERNS=()
            local ignore
            ignore=$(jq -r '.ignore_patterns // [] | .[]' "$CONFIG_FILE" 2>/dev/null || true)
            if [[ -n "$ignore" ]]; then
                while IFS= read -r pattern; do
                    IGNORE_PATTERNS+=("$pattern")
                done <<< "$ignore"
            fi
        else
            log "Warning: jq not installed, using default config"
        fi
    fi
}

# Check if file should be ignored
should_ignore() {
    local file="$1"
    local basename
    basename=$(basename "$file")

    for pattern in "${IGNORE_PATTERNS[@]:-}"; do
        if [[ "$basename" == $pattern ]] || [[ "$file" == $pattern ]]; then
            return 0
        fi
    done
    return 1
}
# Check if file is in allowed root list
is_allowed_root() {
    local file="$1"
    local basename
    basename=$(basename "$file")

    for allowed in "${DEFAULT_ALLOWED_ROOT[@]}"; do
        # Support wildcards in allowed patterns
        if [[ "$basename" == $allowed ]]; then
            return 0
        fi
    done
    return 1
}

# Check if file matches orphan pattern
is_orphan() {
    local file="$1"
    local basename
    basename=$(basename "$file")

    for pattern in "${ORPHAN_PATTERNS[@]}"; do
        if [[ "$basename" == $pattern ]]; then
            return 0
        fi
    done
    return 1
}

# Setup directories
setup_dirs() {
    mkdir -p "$LOG_DIR"
    if [[ "$MOVE_ORPHANS" == "true" ]]; then
        mkdir -p "$SCRATCH_DIR"
    fi

    # Start log file
    echo "=== Project Hygiene Cleanup ===" > "$LOG_FILE"
    echo "Started: $(date)" >> "$LOG_FILE"
    echo "Project: $PROJECT_ROOT" >> "$LOG_FILE"
    echo "" >> "$LOG_FILE"
}

# Delete temp files
cleanup_temp_files() {
    log "Cleaning temp files..."

    for pattern in "${TEMP_PATTERNS[@]}"; do
        while IFS= read -r -d '' file; do
            if should_ignore "$file"; then
                continue
            fi
            rm -f "$file"
            log_action "DELETED" "$file"
            ((DELETED_COUNT++))
        done < <(find "$PROJECT_ROOT" -name "$pattern" -type f -print0 2>/dev/null || true)
    done
}

# Delete temp directories
cleanup_temp_dirs() {
    log "Cleaning temp directories..."

    for pattern in "${TEMP_DIRS[@]}"; do
        while IFS= read -r -d '' dir; do
            if should_ignore "$dir"; then
                continue
            fi
            rm -rf "$dir"
            log_action "DELETED DIR" "$dir"
            ((DELETED_COUNT++))
        done < <(find "$PROJECT_ROOT" -name "$pattern" -type d -print0 2>/dev/null || true)
    done
}

# Warn about unexpected root files
check_root_files() {
    log "Checking root files..."

    local unexpected_files=()

    # Silently delete temp files
    for pattern in "*.tmp" "*.bak" "*.swp" "*~" ".DS_Store"; do
        while IFS= read -r -d '' file; do
            local basename
            basename=$(basename "$file")
            rm -f "$file" 2>/dev/null && ((DELETED_COUNT++)) || true
        done < <(find "$PROJECT_ROOT" -name "$pattern" -type f -print0 2>/dev/null || true)
    done

        # Skip directories
        [[ -d "$file" ]] && continue
    # Only output if we deleted something
    if [[ $DELETED_COUNT -gt 0 ]]; then
        echo "$PREFIX Cleaned $DELETED_COUNT temp files"
    fi

        # Skip if in allowed list
        is_allowed_root "$basename" && continue

        # Skip if should be ignored
        should_ignore "$basename" && continue

        unexpected_files+=("$basename")
        log_action "WARNING" "Unexpected root file: $basename"
        ((WARNED_COUNT++))
    done < <(find "$PROJECT_ROOT" -maxdepth 1 -print0 2>/dev/null || true)

    if [[ ${#unexpected_files[@]} -gt 0 ]]; then
        log ""
        log "⚠️  Unexpected files in project root:"
        for f in "${unexpected_files[@]}"; do
            log "  - $f"
        done
    fi
}

# Identify and handle orphaned files
handle_orphans() {
    log "Checking for orphaned files..."

    local orphan_files=()

    for pattern in "${ORPHAN_PATTERNS[@]}"; do
        while IFS= read -r -d '' file; do
            if should_ignore "$file"; then
                continue
            fi

            orphan_files+=("$file")

            if [[ "$MOVE_ORPHANS" == "true" ]]; then
                local dest="${SCRATCH_DIR}/$(basename "$file")"
                # Handle duplicates
                if [[ -f "$dest" ]]; then
                    dest="${SCRATCH_DIR}/$(date +%Y%m%d%H%M%S)_$(basename "$file")"
                fi
                mv "$file" "$dest"
                log_action "MOVED" "$file -> $dest"
            else
                log_action "ORPHAN" "$file"
            fi
            ((ORPHAN_COUNT++))
        done < <(find "$PROJECT_ROOT" -name "$pattern" -type f -print0 2>/dev/null || true)
    done

    if [[ ${#orphan_files[@]} -gt 0 && "$MOVE_ORPHANS" != "true" ]]; then
        log ""
        log "📦 Orphaned files found (enable move_orphans in .hygiene.json to auto-move):"
        for f in "${orphan_files[@]}"; do
            log "  - $f"
        done
    fi
}

# Summary
print_summary() {
    log ""
    log "=== Cleanup Summary ==="
    log "  Deleted: $DELETED_COUNT items"
    log "  Warnings: $WARNED_COUNT unexpected root files"
    log "  Orphans: $ORPHAN_COUNT files"
    if [[ "$MOVE_ORPHANS" == "true" ]]; then
        log "  Orphans moved to: $SCRATCH_DIR"
    fi
    log "  Log file: $LOG_FILE"
    log ""
}

# Main
main() {
    cd "$PROJECT_ROOT" || exit 1

    load_config
    setup_dirs

    log "Starting project hygiene cleanup..."
    log ""

    cleanup_temp_files
    cleanup_temp_dirs
    check_root_files
    handle_orphans

    print_summary

    # Exit with warning code if issues found
    if [[ $WARNED_COUNT -gt 0 || $ORPHAN_COUNT -gt 0 ]]; then
        exit 0  # Still success, but logged warnings
    fi
}

main "$@"
exit 0
@@ -13,7 +13,7 @@ Projman transforms a proven 15-sprint workflow into a distributable Claude Code
- **Milestones** - Sprint milestone management and tracking
- **Lessons Learned** - Systematic capture and search via Gitea Wiki
- **Branch-Aware Security** - Prevents accidental changes on production branches
- **Three-Agent Model** - Planner, Orchestrator, and Executor agents
- **Four-Agent Model** - Planner, Orchestrator, Executor, and Code Reviewer agents
- **CLI Tools Blocked** - All operations via MCP tools only (no `tea` or `gh`)

## Quick Start
@@ -461,20 +461,8 @@ projman/
├── .claude-plugin/
│   └── plugin.json          # Plugin manifest
├── .mcp.json                # MCP server configuration
├── mcp-servers/             # Bundled MCP server
│   └── gitea/
│   ├── .venv/
│   ├── requirements.txt
│   ├── mcp_server/
│   │   ├── server.py
│   │   ├── gitea_client.py
│   │   └── tools/
│   │   ├── issues.py
│   │   ├── labels.py
│   │   ├── wiki.py
│   │   ├── milestones.py
│   │   └── dependencies.py
│   └── tests/
├── mcp-servers/
│   └── gitea -> ../../../mcp-servers/gitea   # SYMLINK to shared MCP server
├── commands/                # Slash commands
│   ├── sprint-plan.md
│   ├── sprint-start.md
@@ -43,6 +43,46 @@ Store all values:
- `CURRENT_BRANCH`: Current branch name
- `WORKING_DIR`: Current working directory

### Step 1.5: Detect Sprint Context

Determine if this debug issue should be associated with an active sprint.

**1. Check for active sprint milestone:**

```
mcp__plugin_projman_gitea__list_milestones(repo=PROJECT_REPO, state="open")
```

Store the first open milestone as `ACTIVE_SPRINT` (if any).

**2. Analyze branch context:**

| Branch Pattern | Context |
|----------------|---------|
| `feat/*`, `fix/*`, `issue-*` | Sprint work - likely related to current sprint |
| `main`, `master`, `development` | Production/standalone - not sprint-related |
| Other | Unknown - ask user |

**3. Determine sprint association:**

```
IF ACTIVE_SPRINT exists AND CURRENT_BRANCH matches sprint pattern (feat/*, fix/*, issue-*):
  → SPRINT_CONTEXT = "detected"
  → Ask user: "Active sprint detected: [SPRINT_NAME]. Is this bug related to sprint work?"
    Options:
    - Yes, add to sprint (will associate with milestone)
    - No, standalone fix (no milestone)
  → Store choice as ASSOCIATE_WITH_SPRINT (true/false)

ELSE IF ACTIVE_SPRINT exists AND CURRENT_BRANCH is main/development:
  → SPRINT_CONTEXT = "production"
  → ASSOCIATE_WITH_SPRINT = false (standalone fix, no question needed)

ELSE:
  → SPRINT_CONTEXT = "none"
  → ASSOCIATE_WITH_SPRINT = false
```

### Step 2: Read Marketplace Configuration

```bash
@@ -105,7 +145,42 @@ Count failures and categorize errors:

For each failure, write a hypothesis about the likely cause.

### Step 5: Generate Issue Content
### Step 5: Generate Smart Labels

Generate appropriate labels based on the diagnostic results.

**1. Build context string for label suggestion:**

```
LABEL_CONTEXT = "Bug fix: " + [summary of main failure] + ". " +
                "Failed tools: " + [list of failed tool names] + ". " +
                "Error category: " + [detected error category from Step 4]
```

**2. Get suggested labels:**

```
mcp__plugin_projman_gitea__suggest_labels(
  repo=PROJECT_REPO,
  context=LABEL_CONTEXT
)
```

**3. Merge with base labels:**

```
BASE_LABELS = ["Type: Bug", "Source: Diagnostic", "Agent: Claude"]
SUGGESTED_LABELS = [result from suggest_labels]

# Combine, avoiding duplicates
FINAL_LABELS = BASE_LABELS + [label for label in SUGGESTED_LABELS if label not in BASE_LABELS]
```

The final label set should include:
- **Always**: `Type: Bug`, `Source: Diagnostic`, `Agent: Claude`
- **If detected**: `Component: *`, `Complexity: *`, `Risk: *`, `Priority: *`

### Step 6: Generate Issue Content

Use this exact template:
@@ -182,82 +257,91 @@ Use this exact template:
*Generated by /debug-report - Labels: Type: Bug, Source: Diagnostic, Agent: Claude*
```

### Step 6: Create Issue in Marketplace
### Step 7: Create Issue in Marketplace

**First, check if MCP tools are available.** Attempt to use an MCP tool. If you receive "tool not found", "not in function list", or similar error, the MCP server is not accessible in this session - use the curl fallback.
**IMPORTANT:** Always use curl to create issues in the marketplace repo. This avoids branch protection restrictions and MCP context issues that can block issue creation when working on protected branches.

#### Option A: MCP Available (preferred)

```
mcp__plugin_projman_gitea__create_issue(
  repo=MARKETPLACE_REPO,
  title="[Diagnostic] [summary of main failure]",
  body=[generated content from Step 5],
  labels=["Type: Bug", "Source: Diagnostic", "Agent: Claude"]
)
```

If labels don't exist, create issue without labels.

#### Option B: MCP Unavailable - Use curl Fallback

If MCP tools are not available (the very issue you may be diagnosing), use this fallback:

**1. Check for Gitea credentials:**
**1. Load Gitea credentials:**

```bash
if [[ -f ~/.config/claude/gitea.env ]]; then
    source ~/.config/claude/gitea.env
    echo "Credentials found. API URL: $GITEA_API_URL"
    echo "Credentials loaded. API URL: $GITEA_API_URL"
else
    echo "No credentials at ~/.config/claude/gitea.env"
    echo "ERROR: No credentials at ~/.config/claude/gitea.env"
fi
```

**2. If credentials exist, create issue via curl with proper JSON escaping:**
**2. Fetch label IDs from marketplace repo:**

Create secure temp files and save content:
The diagnostic labels to apply are:
- `Source/Diagnostic` (always)
- `Type/Bug` (always)

```bash
# Fetch all labels and extract IDs for our target labels
LABELS_JSON=$(curl -s "${GITEA_API_URL}/repos/${MARKETPLACE_REPO}/labels" \
  -H "Authorization: token ${GITEA_API_TOKEN}")

# Extract label IDs (handles both org and repo labels)
SOURCE_DIAG_ID=$(echo "$LABELS_JSON" | jq -r '.[] | select(.name == "Source/Diagnostic") | .id')
TYPE_BUG_ID=$(echo "$LABELS_JSON" | jq -r '.[] | select(.name == "Type/Bug") | .id')

# Build label array (only include IDs that were found)
LABEL_IDS="[]"
if [[ -n "$SOURCE_DIAG_ID" && -n "$TYPE_BUG_ID" ]]; then
    LABEL_IDS="[$SOURCE_DIAG_ID, $TYPE_BUG_ID]"
elif [[ -n "$SOURCE_DIAG_ID" ]]; then
    LABEL_IDS="[$SOURCE_DIAG_ID]"
elif [[ -n "$TYPE_BUG_ID" ]]; then
    LABEL_IDS="[$TYPE_BUG_ID]"
fi

echo "Label IDs to apply: $LABEL_IDS"
```

**3. Create issue with labels via curl:**

```bash
# Create temp files with restrictive permissions
DIAG_TITLE=$(mktemp -p /tmp -m 600 diag-title.XXXXXX)
DIAG_BODY=$(mktemp -p /tmp -m 600 diag-body.XXXXXX)
DIAG_PAYLOAD=$(mktemp -p /tmp -m 600 diag-payload.XXXXXX)
DIAG_TITLE=$(mktemp -t diag-title.XXXXXX)
DIAG_BODY=$(mktemp -t diag-body.XXXXXX)
DIAG_PAYLOAD=$(mktemp -t diag-payload.XXXXXX)

# Save title
echo "[Diagnostic] [summary of main failure]" > "$DIAG_TITLE"

# Save body (paste Step 5 content) - heredoc delimiter prevents shell expansion
# Save body (paste Step 6 content) - heredoc delimiter prevents shell expansion
cat > "$DIAG_BODY" << 'DIAGNOSTIC_EOF'
[Paste the full issue content from Step 5 here]
[Paste the full issue content from Step 6 here]
DIAGNOSTIC_EOF
```

Construct JSON safely using jq's --rawfile (avoids command substitution):

```bash
# Build JSON payload using jq with --rawfile for safe content handling
# Build JSON payload with labels using jq
jq -n \
  --rawfile title "$DIAG_TITLE" \
  --rawfile body "$DIAG_BODY" \
  '{title: ($title | rtrimstr("\n")), body: $body}' > "$DIAG_PAYLOAD"
  --argjson labels "$LABEL_IDS" \
  '{title: ($title | rtrimstr("\n")), body: $body, labels: $labels}' > "$DIAG_PAYLOAD"

# Create issue using the JSON file
curl -s -X POST "${GITEA_API_URL}/repos/${MARKETPLACE_REPO}/issues" \
RESULT=$(curl -s -X POST "${GITEA_API_URL}/repos/${MARKETPLACE_REPO}/issues" \
  -H "Authorization: token ${GITEA_API_TOKEN}" \
  -H "Content-Type: application/json" \
  -d @"$DIAG_PAYLOAD" | jq '.html_url // .'
  -d @"$DIAG_PAYLOAD")

# Extract and display the issue URL
echo "$RESULT" | jq -r '.html_url // "Error: " + (.message // "Unknown error")'

# Secure cleanup
rm -f "$DIAG_TITLE" "$DIAG_BODY" "$DIAG_PAYLOAD"
```

**3. If no credentials found, save report locally:**
**4. If no credentials found, save report locally:**

```bash
REPORT_FILE=$(mktemp -p /tmp -m 600 diagnostic-report-XXXXXX.md)
REPORT_FILE=$(mktemp -t diagnostic-report-XXXXXX.md)
cat > "$REPORT_FILE" << 'DIAGNOSTIC_EOF'
[Paste the full issue content from Step 5 here]
[Paste the full issue content from Step 6 here]
DIAGNOSTIC_EOF
echo "Report saved to: $REPORT_FILE"
```
@@ -265,7 +349,7 @@ echo "Report saved to: $REPORT_FILE"
|
||||
Then inform the user:
|
||||
|
||||
```
|
||||
MCP tools are unavailable and no Gitea credentials found at ~/.config/claude/gitea.env.
|
||||
No Gitea credentials found at ~/.config/claude/gitea.env.
|
||||
|
||||
Diagnostic report saved to: [REPORT_FILE]
|
||||
|
||||
@@ -274,7 +358,7 @@ To create the issue manually:
|
||||
2. Or create issue directly at: http://gitea.hotserv.cloud/[MARKETPLACE_REPO]/issues/new
|
||||
```
|
||||
|
||||
### Step 7: Report to User
|
||||
### Step 8: Report to User
|
||||
|
||||
Display summary:
|
||||
|
||||
@@ -306,6 +390,7 @@ Next Steps:
|
||||
- **DO NOT** skip any diagnostic test
|
||||
- **DO NOT** call MCP tools without the `repo` parameter
|
||||
- **DO NOT** ask user questions during execution - run autonomously
|
||||
- **DO NOT** use MCP tools to create issues in the marketplace - always use curl (avoids branch restrictions)
|
||||
|
||||
## If All Tests Pass
|
||||
|
||||
@@ -336,7 +421,12 @@ and I can create a manual bug report.
|
||||
- Check if in a git repository: `git rev-parse --git-dir`
|
||||
- If not a git repo, ask user for the repository path
|
||||
|
||||
**MCP tools not available**
|
||||
- Use the curl fallback in Step 6, Option B
|
||||
- Requires Gitea credentials at `~/.config/claude/gitea.env`
|
||||
- If no credentials, report will be saved locally for manual submission
|
||||
**Gitea credentials not found**
|
||||
- Credentials must be at `~/.config/claude/gitea.env`
|
||||
- If missing, the report will be saved locally for manual submission
|
||||
- See docs/CONFIGURATION.md for setup instructions
|
||||
|
||||
**Labels not applied to issue**
|
||||
- Verify labels exist in the marketplace repo: `Source/Diagnostic`, `Type/Bug`
|
||||
- Check the label fetch output in Step 7.2 for errors
|
||||
- If labels don't exist, create them first with `/labels-sync` in the marketplace repo
|
||||
|
||||
@@ -195,6 +195,74 @@ Does this analysis match your understanding of the problem?

Do NOT proceed until user approves.

### Step 9.5: Search Lessons Learned

Before proposing a fix, search for relevant lessons from past fixes.

**1. Extract search tags from the issue:**

```
SEARCH_TAGS = []
# Add tool names
for each failed_tool in issue:
    SEARCH_TAGS.append(tool_name)  # e.g., "get_labels", "validate_repo_org"

# Add error category
SEARCH_TAGS.append(error_category)  # e.g., "parameter-format", "authentication"

# Add component if identifiable
if error relates to MCP server:
    SEARCH_TAGS.append("mcp")
if error relates to command:
    SEARCH_TAGS.append("command")
```

**2. Search lessons learned:**

```
mcp__plugin_projman_gitea__search_lessons(
  repo=REPO_NAME,
  tags=SEARCH_TAGS,
  limit=5
)
```

**3. Also search by error keywords:**

```
mcp__plugin_projman_gitea__search_lessons(
  repo=REPO_NAME,
  query=[key error message words],
  limit=5
)
```

**4. Display relevant lessons (if any):**

```
Related Lessons Learned
=======================

Found [N] relevant lessons from past fixes:

📚 Lesson: "Sprint 14 - Parameter validation in MCP tools"
   Tags: mcp, get_labels, parameter-format
   Summary: Always validate repo parameter format before API calls
   Prevention: Add format check at function entry

📚 Lesson: "Sprint 12 - Graceful fallback for missing config"
   Tags: configuration, fallback
   Summary: Commands should work even without .env
   Prevention: Check for env vars, use sensible defaults

These lessons may inform your fix approach.
```

If no lessons found, display:
```
No related lessons found. This may be a new type of issue.
```

### Step 10: Propose Fix Approach

Based on the analysis, propose a specific fix:
@@ -342,7 +410,118 @@ Next Steps:
1. Review and merge PR #81
2. In test project, pull latest plugin version
3. Run /debug-report to verify fix
4. If passing, close issue #80
4. Come back and run Step 15 to close issue and capture lesson
```

### Step 15: Verify, Close, and Capture Lesson

**This step runs AFTER the user has verified the fix works.**

When user returns and confirms the fix is working:

**1. Close the issue:**

```
mcp__plugin_projman_gitea__update_issue(
  repo=REPO_NAME,
  issue_number=ISSUE_NUMBER,
  state="closed"
)
```

**2. Ask about lesson capture:**

Use AskUserQuestion:

```
This fix addressed [ERROR_TYPE] in [COMPONENT].

Would you like to capture this as a lesson learned?

Options:
- Yes, capture lesson (helps avoid similar issues in future)
- No, skip (trivial fix or already documented)
```

**3. If user chooses Yes, auto-generate lesson content:**

```
LESSON_TITLE = "Sprint [N] - [Brief description of fix]"
# Example: "Sprint 17 - MCP parameter validation"

LESSON_CONTENT = """
## Context

[What was happening when the issue occurred]
- Command/tool being used: [FAILED_TOOL]
- Error encountered: [ERROR_MESSAGE]

## Problem

[Root cause identified during investigation]

## Solution

[What was changed to fix it]
- Files modified: [LIST]
- PR: #[PR_NUMBER]

## Prevention

[How to avoid this in the future]

## Related

- Issue: #[ISSUE_NUMBER]
- PR: #[PR_NUMBER]
"""

LESSON_TAGS = [
  tool_name,       # e.g., "get_labels"
  error_category,  # e.g., "parameter-format"
  component,       # e.g., "mcp", "command"
  "bug-fix"
]
```

**4. Show lesson preview and ask for approval:**

```
Lesson Preview
==============

Title: [LESSON_TITLE]
Tags: [LESSON_TAGS]

Content:
[LESSON_CONTENT]

Save this lesson? (Y/N/Edit)
```

**5. If approved, create the lesson:**

```
mcp__plugin_projman_gitea__create_lesson(
  repo=REPO_NAME,
  title=LESSON_TITLE,
  content=LESSON_CONTENT,
  tags=LESSON_TAGS,
  category="sprints"
)
```

**6. Report completion:**

```
Issue Closed & Lesson Captured
==============================

Issue #[N]: CLOSED
Lesson: "[LESSON_TITLE]" saved to wiki

This lesson will be surfaced in future /debug-review
sessions when similar errors are encountered.
```

## DO NOT
@@ -350,8 +529,9 @@ Next Steps:
- **DO NOT** skip reading relevant files - this is MANDATORY
- **DO NOT** proceed past approval gates without user confirmation
- **DO NOT** guess at fixes without evidence from code
- **DO NOT** close issues - let user verify fix works first
- **DO NOT** close issues until user confirms fix works (Step 15)
- **DO NOT** commit directly to development or main branches
- **DO NOT** skip the lessons learned search - past fixes inform better solutions

## If Investigation Finds No Bug

@@ -62,12 +62,20 @@ Verify these required label categories exist:

### Step 6: Create Missing Labels (if any)

For each missing required label, call:
Use `create_label_smart` which automatically creates labels at the correct level:
- **Organization level**: Type/*, Priority/*, Complexity/*, Effort/*, Risk/*, Source/*, Agent/*
- **Repository level**: Component/*, Tech/*

```
mcp__plugin_projman_gitea__create_label(repo=REPO_NAME, name="Type: Bug", color="d73a4a")
mcp__plugin_projman_gitea__create_label_smart(repo=REPO_NAME, name="Type/Bug", color="d73a4a")
```

This automatically detects whether to create at org or repo level based on the category.

**Alternative (explicit control):**
- Org labels: `create_org_label(org="org-name", name="Type/Bug", color="d73a4a")`
- Repo labels: `create_label(repo=REPO_NAME, name="Component/Backend", color="5319e7")`

Use the label format that matches existing labels in the repo (slash `/` or colon-space `: `).

### Step 7: Report Results
@@ -2,8 +2,8 @@
  "hooks": {
    "SessionStart": [
      {
        "type": "prompt",
        "prompt": "[projman] Perform silent startup checks:\n\n1. Check if MCP venvs exist at the installed marketplace location. If the marketplace is installed but venvs are missing, warn the user: '[projman] MCP venvs missing - run setup.sh from installed marketplace location'.\n\n2. Check if the project git remote matches .env configuration (GITEA_ORG/GITEA_REPO). If mismatch, warn: '[projman] Git remote mismatch - run /project-sync'.\n\nStay silent if all checks pass or not applicable. Be quick and non-blocking."
        "type": "command",
        "command": "${CLAUDE_PLUGIN_ROOT}/hooks/startup-check.sh"
      }
    ]
  }
plugins/projman/hooks/startup-check.sh (new executable file, 30 lines)
@@ -0,0 +1,30 @@
#!/bin/bash
# projman startup check hook
# Checks for common issues at session start
# All output MUST have [projman] prefix

PREFIX="[projman]"

# Check if MCP venv exists
PLUGIN_ROOT="${CLAUDE_PLUGIN_ROOT:-$(dirname "$(dirname "$(realpath "$0")")")}"
VENV_PATH="$PLUGIN_ROOT/mcp-servers/gitea/.venv/bin/python"

if [[ ! -f "$VENV_PATH" ]]; then
    echo "$PREFIX MCP venvs missing - run setup.sh from installed marketplace"
    exit 0
fi

# Check git remote vs .env config (only if .env exists)
if [[ -f ".env" ]]; then
    CONFIGURED_REPO=$(grep -E "^GITEA_REPO=" .env 2>/dev/null | cut -d'=' -f2 | tr -d '"' || true)
    if [[ -n "$CONFIGURED_REPO" ]]; then
        CURRENT_REMOTE=$(git remote get-url origin 2>/dev/null | sed 's/.*[:/]\([^/]*\/[^.]*\).*/\1/' || true)
        if [[ -n "$CURRENT_REMOTE" && "$CONFIGURED_REPO" != "$CURRENT_REMOTE" ]]; then
            echo "$PREFIX Git remote mismatch - run /project-sync"
            exit 0
        fi
    fi
fi

# All checks passed - say nothing
exit 0
@@ -78,3 +78,8 @@ main() {
}

main "$@"

# Clear plugin cache to ensure fresh hooks are loaded
echo "Clearing plugin cache..."
rm -rf ~/.claude/plugins/cache/leo-claude-mktplace/
echo "Cache cleared"
scripts/release.sh (new executable file, 172 lines)
@@ -0,0 +1,172 @@
#!/bin/bash
# release.sh - Create a new release with version consistency
#
# Usage: ./scripts/release.sh X.Y.Z
#
# This script ensures all version references are updated consistently:
#   1. CHANGELOG.md - [Unreleased] becomes [X.Y.Z] - YYYY-MM-DD
#   2. README.md - Title updated to vX.Y.Z
#   3. marketplace.json - version field updated
#   4. Git commit and tag created
#
# Prerequisites:
#   - Clean working directory (no uncommitted changes)
#   - [Unreleased] section in CHANGELOG.md with content
#   - On development branch

set -e

VERSION=$1
DATE=$(date +%Y-%m-%d)

# Colors for output
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
NC='\033[0m' # No Color

error() { echo -e "${RED}ERROR: $1${NC}" >&2; exit 1; }
warn() { echo -e "${YELLOW}WARNING: $1${NC}"; }
success() { echo -e "${GREEN}$1${NC}"; }
info() { echo -e "$1"; }

# Validate arguments
if [ -z "$VERSION" ]; then
    echo "Usage: ./scripts/release.sh X.Y.Z"
    echo ""
    echo "Example: ./scripts/release.sh 3.2.0"
    echo ""
    echo "This will:"
    echo "  1. Update CHANGELOG.md [Unreleased] -> [X.Y.Z] - $(date +%Y-%m-%d)"
    echo "  2. Update README.md title to vX.Y.Z"
    echo "  3. Update marketplace.json version to X.Y.Z"
    echo "  4. Commit with message 'chore: release vX.Y.Z'"
    echo "  5. Create git tag vX.Y.Z"
    exit 1
fi

# Validate version format
if ! [[ "$VERSION" =~ ^[0-9]+\.[0-9]+\.[0-9]+$ ]]; then
    error "Invalid version format. Use X.Y.Z (e.g., 3.2.0)"
fi

# Check we're in the right directory
if [ ! -f "CHANGELOG.md" ] || [ ! -f "README.md" ] || [ ! -f ".claude-plugin/marketplace.json" ]; then
    error "Must run from repository root (CHANGELOG.md, README.md, .claude-plugin/marketplace.json must exist)"
fi

# Check for clean working directory
if [ -n "$(git status --porcelain)" ]; then
    warn "Working directory has uncommitted changes"
    echo ""
    git status --short
    echo ""
    read -p "Continue anyway? [y/N] " -n 1 -r
    echo
    if [[ ! $REPLY =~ ^[Yy]$ ]]; then
        exit 1
    fi
fi

# Check current branch
BRANCH=$(git branch --show-current)
if [ "$BRANCH" != "development" ] && [ "$BRANCH" != "main" ]; then
    warn "Not on development or main branch (current: $BRANCH)"
    read -p "Continue anyway? [y/N] " -n 1 -r
    echo
    if [[ ! $REPLY =~ ^[Yy]$ ]]; then
        exit 1
    fi
fi

# Check [Unreleased] section has content
if ! grep -q "## \[Unreleased\]" CHANGELOG.md; then
    error "CHANGELOG.md missing [Unreleased] section"
fi

# Check if tag already exists
if git tag -l | grep -q "^v$VERSION$"; then
    error "Tag v$VERSION already exists"
fi

info ""
|
||||
info "=== Release v$VERSION ==="
|
||||
info ""
|
||||
|
||||
# Show what will change
|
||||
info "Changes to be made:"
|
||||
info " CHANGELOG.md: [Unreleased] -> [$VERSION] - $DATE"
|
||||
info " README.md: title -> v$VERSION"
|
||||
info " marketplace.json: version -> $VERSION"
|
||||
info " Git: commit + tag v$VERSION"
|
||||
info ""
|
||||
|
||||
# Preview CHANGELOG [Unreleased] content
|
||||
info "Current [Unreleased] content:"
|
||||
info "---"
|
||||
sed -n '/^## \[Unreleased\]/,/^## \[/p' CHANGELOG.md | head -30
|
||||
info "---"
|
||||
info ""
|
||||
|
||||
read -p "Proceed with release? [y/N] " -n 1 -r
|
||||
echo
|
||||
if [[ ! $REPLY =~ ^[Yy]$ ]]; then
|
||||
info "Aborted"
|
||||
exit 0
|
||||
fi
|
||||
|
||||
info ""
|
||||
info "Updating files..."
|
||||
|
||||
# 1. Update CHANGELOG.md
|
||||
# Replace [Unreleased] with [X.Y.Z] - DATE and add new [Unreleased] section
|
||||
sed -i "s/^## \[Unreleased\]$/## [Unreleased]\n\n*Changes staged for the next release*\n\n---\n\n## [$VERSION] - $DATE/" CHANGELOG.md
|
||||
|
||||
# Remove the placeholder text if it exists after the new [Unreleased]
|
||||
sed -i '/^\*Changes staged for the next release\*$/d' CHANGELOG.md
|
||||
|
||||
# Clean up any double blank lines
|
||||
sed -i '/^$/N;/^\n$/d' CHANGELOG.md
|
||||
|
||||
success " CHANGELOG.md updated"
|
||||
|
||||
# 2. Update README.md title
|
||||
sed -i "s/^# Leo Claude Marketplace - v[0-9]\+\.[0-9]\+\.[0-9]\+$/# Leo Claude Marketplace - v$VERSION/" README.md
|
||||
success " README.md updated"
|
||||
|
||||
# 3. Update marketplace.json version
|
||||
sed -i "s/\"version\": \"[0-9]\+\.[0-9]\+\.[0-9]\+\"/\"version\": \"$VERSION\"/" .claude-plugin/marketplace.json
|
||||
success " marketplace.json updated"
|
||||
|
||||
info ""
|
||||
info "Files updated. Review changes:"
|
||||
info ""
|
||||
git diff --stat
|
||||
info ""
|
||||
git diff CHANGELOG.md | head -40
|
||||
info ""
|
||||
|
||||
read -p "Commit and tag? [y/N] " -n 1 -r
|
||||
echo
|
||||
if [[ ! $REPLY =~ ^[Yy]$ ]]; then
|
||||
warn "Changes made but not committed. Run 'git checkout -- .' to revert."
|
||||
exit 0
|
||||
fi
|
||||
|
||||
# Commit
|
||||
git add CHANGELOG.md README.md .claude-plugin/marketplace.json
|
||||
git commit -m "chore: release v$VERSION"
|
||||
success " Committed"
|
||||
|
||||
# Tag
|
||||
git tag "v$VERSION"
|
||||
success " Tagged v$VERSION"
|
||||
|
||||
info ""
|
||||
success "=== Release v$VERSION created ==="
|
||||
info ""
|
||||
info "Next steps:"
|
||||
info " 1. Review the commit: git show HEAD"
|
||||
info " 2. Push to remote: git push && git push --tags"
|
||||
info " 3. Merge to main if on development branch"
|
||||
info ""
|
||||
scripts/verify-hooks.sh (new executable file, 44 lines)
@@ -0,0 +1,44 @@
#!/bin/bash
# Verify all hooks are command type (not prompt)
# Run this after any plugin update

echo "=== HOOK VERIFICATION ==="
echo ""

FAILED=0

# Check ALL hooks.json files in .claude directory
for f in $(find ~/.claude -name "hooks.json" 2>/dev/null); do
    if grep -q '"type": "prompt"' "$f" || grep -q '"type":"prompt"' "$f"; then
        echo "❌ PROMPT HOOK FOUND: $f"
        FAILED=1
    fi
done

# Check cache specifically
if [ -d ~/.claude/plugins/cache/leo-claude-mktplace ]; then
    echo "❌ CACHE EXISTS: ~/.claude/plugins/cache/leo-claude-mktplace"
    echo "   Run: rm -rf ~/.claude/plugins/cache/leo-claude-mktplace/"
    FAILED=1
fi

# Verify installed hooks are command type
for plugin in doc-guardian code-sentinel projman pr-review project-hygiene; do
    HOOK_FILE=~/.claude/plugins/marketplaces/leo-claude-mktplace/plugins/$plugin/hooks/hooks.json
    if [ -f "$HOOK_FILE" ]; then
        if grep -q '"type": "command"' "$HOOK_FILE" || grep -q '"type":"command"' "$HOOK_FILE"; then
            echo "✓ $plugin: command type"
        else
            echo "❌ $plugin: NOT command type"
            FAILED=1
        fi
    fi
done

echo ""
if [ $FAILED -eq 0 ]; then
    echo "✓ All hooks verified OK"
else
    echo "❌ ISSUES FOUND - fix before using"
    exit 1
fi