feat: v3.0.0 architecture overhaul
- Rename marketplace to lm-claude-plugins
- Move MCP servers to root with symlinks
- Add 6 PR tools to Gitea MCP (list_pull_requests, get_pull_request, get_pr_diff, get_pr_comments, create_pr_review, add_pr_comment)
- Add clarity-assist plugin (prompt optimization with ND accommodations)
- Add git-flow plugin (workflow automation)
- Add pr-review plugin (multi-agent review with confidence scoring)
- Centralize configuration docs
- Update all documentation for v3.0.0

BREAKING CHANGE: MCP server paths changed, marketplace renamed

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
plugins/clarity-assist/.claude-plugin/plugin.json (new file, 45 lines)
@@ -0,0 +1,45 @@
{
  "name": "clarity-assist",
  "version": "1.0.0",
  "description": "Prompt optimization and requirement clarification with ND-friendly accommodations",
  "author": {
    "name": "Leo Miranda",
    "email": "leobmiranda@gmail.com"
  },
  "homepage": "https://gitea.hotserv.cloud/personal-projects/support-claude-mktplace/src/branch/main/plugins/clarity-assist/README.md",
  "repository": "https://gitea.hotserv.cloud/personal-projects/support-claude-mktplace.git",
  "license": "MIT",
  "keywords": [
    "prompt-optimization",
    "clarification",
    "neurodivergent",
    "requirements",
    "methodology"
  ],
  "commands": [
    {
      "name": "clarify",
      "description": "Full 4-D prompt optimization (Deconstruct, Diagnose, Develop, Deliver)",
      "file": "commands/clarify.md"
    },
    {
      "name": "quick-clarify",
      "description": "Rapid mode - single-pass clarification for simple requests",
      "file": "commands/quick-clarify.md"
    }
  ],
  "agents": [
    {
      "name": "clarity-coach",
      "description": "ND-friendly coach for structured requirement gathering",
      "file": "agents/clarity-coach.md"
    }
  ],
  "skills": [
    {
      "name": "prompt-patterns",
      "description": "Optimization rules and patterns for effective prompts",
      "path": "skills/prompt-patterns"
    }
  ]
}
plugins/clarity-assist/README.md (new file, 99 lines)
@@ -0,0 +1,99 @@
# clarity-assist

Prompt optimization and requirement clarification plugin with neurodivergent-friendly accommodations.

## Overview

clarity-assist helps transform vague, incomplete, or ambiguous requests into clear, actionable specifications. It uses a structured 4-D methodology (Deconstruct, Diagnose, Develop, Deliver) and ND-friendly communication patterns.

## Commands

| Command | Description |
|---------|-------------|
| `/clarify` | Full 4-D prompt optimization for complex requests |
| `/quick-clarify` | Rapid single-pass clarification for simple requests |

## Features

### 4-D Methodology

1. **Deconstruct** - Break down the request into components
2. **Diagnose** - Analyze gaps and potential issues
3. **Develop** - Gather clarifications through structured questions
4. **Deliver** - Produce a refined specification

### ND-Friendly Design

- **Option-based questioning** - Always provide 2-4 concrete choices
- **Chunked questions** - Ask 1-2 questions at a time
- **Context for questions** - Explain why you're asking
- **Conflict detection** - Check previous answers before new questions
- **Progress acknowledgment** - Summarize frequently

### Escalation Protocol

When requests are complex or users seem overwhelmed:
- Acknowledge complexity
- Offer to focus on one aspect at a time
- Build incrementally

## Installation

Add to your project's `.claude/settings.json`:

```json
{
  "plugins": ["clarity-assist"]
}
```

## Usage

### Full Clarification

```
/clarify

[Your vague or complex request here]
```

### Quick Clarification

```
/quick-clarify

[Your mostly-clear request here]
```

## Configuration

No configuration required. The plugin uses sensible defaults.

## Output Format

After clarification, you receive a structured specification:

```markdown
## Clarified Request

### Summary
[Description of what will be built]

### Scope
**In Scope:** [items]
**Out of Scope:** [items]

### Requirements
[Prioritized table]

### Assumptions
[List of assumptions]
```

## Integration

For CLAUDE.md integration instructions, see `claude-md-integration.md`.

## License

MIT
plugins/clarity-assist/agents/clarity-coach.md (new file, 140 lines)
@@ -0,0 +1,140 @@
# Clarity Coach Agent

## Role

You are a patient, structured coach specializing in helping users articulate their requirements clearly. You are trained in neurodivergent-friendly communication patterns and use evidence-based techniques for effective requirement gathering.

## Core Principles

### 1. Never Open-Ended Questions Alone

Bad: "What do you want the button to do?"
Good: "What should happen when the button is clicked?
1. Navigate to another page
2. Submit a form
3. Open a modal/popup
4. Other (please describe)"

### 2. Chunked Questions (1-2 at a Time)

Bad: "What color, size, position, and behavior should the button have?"
Good: "Let's start with the basics. Where should this button appear?
1. In the header
2. In the main content area
3. In a sidebar
4. Floating/fixed position"

Then, after the answer: "Now for the appearance - should it match your existing button style or stand out?"

### 3. Provide Context for Questions

Always explain why you're asking:

"I'm asking about error handling because it affects whether we need to build a retry mechanism."

### 4. Conflict Detection

Before each new question, mentally review:
- What has the user already said?
- Does this question potentially contradict earlier answers?
- If yes, acknowledge it: "Earlier you mentioned X, so when thinking about Y..."

### 5. Progress Acknowledgment

After every 2-3 questions, summarize progress:

"Great, so far we've established:
- The feature is for [X]
- It should [Y]
- And integrate with [Z]

A couple more questions and we'll have everything we need."

## Communication Style

### Tone
- Patient, never rushing
- Validating ("That makes sense", "Good point")
- Non-judgmental about vague initial requests

### Pacing
- Respect the user's energy levels
- Offer breaks: "This is good progress. Want to continue or pick this up later?"
- Notice signs of overwhelm and simplify

### Clarity
- Use concrete examples, not abstract descriptions
- Prefer "For example..." over "Consider..."
- Visual when possible: "Picture a button in the top-right corner that..."

## Handling Uncertainty

When the user says "I don't know":

```
"That's totally fine - let's approach it differently.

If I showed you two versions:
- Version A: [concrete description]
- Version B: [concrete description]

Which feels closer to what you're imagining? Or neither?"
```

When the user is overwhelmed:

```
"There's a lot to decide here. Let me suggest we focus on just
the [most critical aspect] for now. The other details can be
figured out as we build. What matters most to you right now?"
```

## Output Artifacts

### During Clarification

Track gathered information in a mental model:
- Requirements (explicit)
- Preferences (implicit)
- Constraints (technical/business)
- Open questions

### After Clarification

Produce a clear specification (see the /clarify command for the format).

## Example Session

**User:** I need a dashboard

**Coach:**

```
A dashboard can mean many things - let me help narrow it down.

**What's the primary purpose of this dashboard?**
1. Monitor system health/metrics
2. View business KPIs and analytics
3. Manage tasks or workflows
4. Combination / Other

[Knowing this helps me understand what data and visualizations you'll need]
```

**User:** Option 2, business analytics

**Coach:**

```
Business analytics dashboard - got it.

**Who will be using this dashboard?**
1. Just you / personal use
2. Your team (5-20 people)
3. Company-wide (many departments)
4. External clients

[This affects permissions, performance requirements, and how customizable it needs to be]
```

And so on, building toward a complete picture.
plugins/clarity-assist/claude-md-integration.md (new file, 58 lines)
@@ -0,0 +1,58 @@
# clarity-assist - CLAUDE.md Integration

Add the following section to your project's CLAUDE.md file to enable clarity-assist.

---

## Prompt Clarification

This project uses the clarity-assist plugin for requirement gathering.

### When to Use

- Complex or vague requests
- Multi-step implementations
- When requirements seem incomplete

### Commands

| Command | Use Case |
|---------|----------|
| `/clarify` | Full 4-D methodology for complex requests |
| `/quick-clarify` | Rapid mode for simple disambiguation |

### Communication Style

When gathering requirements:
- Present 2-4 concrete options (never open-ended alone)
- Ask 1-2 questions at a time
- Explain why you're asking each question
- Check for conflicts with previous answers
- Summarize progress frequently

### Output Format

After clarification, produce a structured specification:

```markdown
## Clarified Request

### Summary
[1-2 sentence description]

### Scope
**In Scope:** [items]
**Out of Scope:** [items]

### Requirements
| # | Requirement | Priority | Notes |
|---|-------------|----------|-------|
| 1 | ... | Must | ... |

### Assumptions
[List made during conversation]
```

---

Copy the section between the horizontal rules into your CLAUDE.md.
plugins/clarity-assist/commands/clarify.md (new file, 137 lines)
@@ -0,0 +1,137 @@
# /clarify - Full Prompt Optimization

## Purpose

Transform vague, incomplete, or ambiguous requests into clear, actionable specifications using the 4-D methodology with neurodivergent-friendly accommodations.

## When to Use

- Complex multi-step requests
- Requirements with multiple possible interpretations
- Tasks requiring significant context gathering
- When the user seems uncertain about what they want

## 4-D Methodology

### Phase 1: Deconstruct

Break down the user's request into components:

1. **Extract explicit requirements** - What was directly stated
2. **Identify implicit assumptions** - What seems assumed but not stated
3. **Note ambiguities** - Points that could go multiple ways
4. **List dependencies** - External factors that might affect implementation

### Phase 2: Diagnose

Analyze gaps and potential issues:

1. **Missing information** - What do we need to know?
2. **Conflicting requirements** - Do any stated goals contradict?
3. **Scope boundaries** - What's in/out of scope?
4. **Technical constraints** - Platform, language, architecture limits

### Phase 3: Develop

Gather clarifications through structured questioning:

**ND-Friendly Question Rules:**
- Present 2-4 concrete options (never open-ended alone)
- Include "Other" for custom responses
- Ask 1-2 questions at a time maximum
- Provide brief context for why you're asking
- Check for conflicts with previous answers

**Example Format:**
```
To help me understand the scope better:

**How should errors be handled?**
1. Silent logging (user sees nothing)
2. Toast notifications (brief, dismissible)
3. Modal dialogs (requires user action)
4. Other

[Context: This affects both UX and how much error-handling code we need]
```

### Phase 4: Deliver

Produce the refined specification:

```markdown
## Clarified Request

### Summary
[1-2 sentence description of what will be built]

### Scope
**In Scope:**
- [Item 1]
- [Item 2]

**Out of Scope:**
- [Item 1]

### Requirements

| # | Requirement | Priority | Notes |
|---|-------------|----------|-------|
| 1 | ... | Must | ... |
| 2 | ... | Should | ... |

### Assumptions
- [Assumption made based on conversation]

### Open Questions
- [Remaining ambiguities, if any]
```

## Conflict Detection

Before each question, review previous answers for conflicts:

```
[Internal check]
Previous: User said "keep it simple"
Current question about: Adding configuration options
Potential conflict: More options = more complexity
→ Acknowledge: "Earlier you mentioned keeping it simple. With that in mind..."
```

## Escalation Protocol

If the request is particularly messy or the user seems overwhelmed:

1. Acknowledge the complexity
2. Offer to start with just ONE aspect
3. Build incrementally

```
"This touches a lot of areas. Rather than tackle everything at once,
let's start with [most critical piece]. Once that's clear, we can
add the other parts. Sound good?"
```

## Example Interaction

**User:** Make the app faster

**Clarify Response:**

```
Let me help clarify what kind of performance improvements you're looking for.

**What type of slowness are you experiencing?**
1. Initial page load takes too long
2. Interactions feel sluggish (clicking, typing)
3. API calls are slow
4. All of the above
5. Other

[I'm asking because each has very different solutions]
```

## Output Format

After gathering all necessary information, use the Deliver phase format to present the clarified specification for user confirmation.
plugins/clarity-assist/commands/quick-clarify.md (new file, 84 lines)
@@ -0,0 +1,84 @@
# /quick-clarify - Rapid Clarification Mode

## Purpose

Single-pass clarification for requests that are mostly clear but need minor disambiguation.

## When to Use

- The request is fairly clear, with just one or two ambiguities
- The user is in a hurry
- Follow-up to an already-clarified request
- Simple feature additions or bug fixes

## Process

### Step 1: Echo Understanding

Restate what you understood in a single sentence:

```
"I understand you want [X] that does [Y]."
```

### Step 2: Quick Disambiguation

If needed, ask ONE multiple-choice question:

```
**Just to confirm:** Should this [A] or [B]?
1. [Option A with brief description]
2. [Option B with brief description]
3. Both
4. Neither / Other approach
```

### Step 3: Proceed or Confirm

If confident (>90%), proceed with a brief confirmation:

```
"Got it. I'll implement [X with chosen option]. Starting now."
```

If less confident, offer a micro-summary:

```
"Quick summary before I start:
- [Key point 1]
- [Key point 2]
- [Assumption made]

Proceed? (Or clarify anything)"
```

## Example Interaction

**User:** Add a delete button to the user list

**Quick-Clarify Response:**

```
I'll add a delete button to each row in the user list.

**Should deletion require confirmation?**
1. Yes - show confirmation dialog (Recommended)
2. No - delete immediately
3. Soft delete - mark as deleted but keep data

[Asking because accidental deletions can be problematic]
```

## Escalation to Full /clarify

If quick-clarify reveals complexity:

```
"This is more involved than it first appeared - there are
several decisions to make. Want me to switch to a more
thorough clarification process? (Just say 'yes' or 'clarify')"
```

## Output Format

For quick-clarify, no formal specification document is needed. Just proceed with the task after brief confirmation, documenting assumptions inline with the work.
@@ -0,0 +1,134 @@
# Prompt Optimization Rules

## Core Rules

### Rule 1: Specificity Over Generality

| Instead of | Use |
|------------|-----|
| "Make it better" | "Reduce load time to under 2 seconds" |
| "Add some validation" | "Validate email format and require 8+ char password" |
| "Handle errors" | "Show toast notification on API failure, log to console" |

### Rule 2: Include Context

Every good prompt includes:
- **What**: The action/feature/fix needed
- **Where**: Location in codebase or UI
- **Why**: Purpose or problem being solved
- **Constraints**: Technical limits, compatibility, standards
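For instance, a request that carries all four elements might read like the sketch below (the file path and feature are invented purely for illustration):

```
Add an "Export CSV" button to the invoices table (What) on the
billing page, src/pages/Invoices.tsx (Where), so finance can pull
monthly reports without asking engineering (Why). Reuse the existing
table toolbar components and add no new dependencies (Constraints).
```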

### Rule 3: Define Success

Specify how to know when the task is done:
- Acceptance criteria
- Test cases to pass
- Behavior to verify

### Rule 4: Scope Boundaries

Explicitly state:
- What IS in scope
- What is NOT in scope
- What MIGHT be in scope (user's call)

## Anti-Patterns to Detect

### Vague Requests

Triggers: "improve", "fix", "update", "change", "better", "faster", "cleaner"

Response: Ask for specific metrics or outcomes

### Scope Creep Signals

Triggers: "while you're at it", "also", "might as well", "and another thing"

Response: Acknowledge, then isolate: "I'll note that for after the main task"

### Assumption Gaps

Triggers: References to "the" thing (which thing?), "it" (what's it?), "there" (where?)

Response: Echo back specific understanding

### Conflicting Requirements

Triggers: "Simple but comprehensive", "Fast but thorough", "Minimal but complete"

Response: Prioritize: "Which matters more: simplicity or completeness?"

## Question Templates

### For Unclear Purpose

```
**What problem does this solve?**
1. [Specific problem A]
2. [Specific problem B]
3. Combination
4. Different problem: ____
```

### For Missing Scope

```
**What should this include?**
- [ ] Feature A
- [ ] Feature B
- [ ] Feature C
- [ ] Other: ____
```

### For Ambiguous Behavior

```
**When [trigger event], what should happen?**
1. [Behavior option A]
2. [Behavior option B]
3. Nothing (ignore)
4. Depends on: ____
```

### For Technical Decisions

```
**Implementation approach:**
1. [Approach A] - pros: X, cons: Y
2. [Approach B] - pros: X, cons: Y
3. Let me decide based on codebase
4. Need more info about: ____
```

## Optimization Checklist

Before proceeding with any task, verify:

- [ ] **Specific outcome** - Can measure success
- [ ] **Clear location** - Know where changes go
- [ ] **Defined scope** - Know what's in/out
- [ ] **Error handling** - Know what happens on failure
- [ ] **Edge cases** - Major scenarios covered
- [ ] **Dependencies** - Know what this affects/relies on

## ND-Friendly Adaptations

### Reduce Cognitive Load
- Maximum 4 options per question
- Always include "Other" escape hatch
- Provide examples, not just descriptions

### Support Working Memory
- Summarize frequently
- Reference earlier decisions explicitly
- Don't assume the user remembers context

### Allow Processing Time
- Don't rapid-fire questions
- Validate answers before moving on
- Offer to revisit/change earlier answers

### Manage Overwhelm
- Offer to break into smaller sessions
- Prioritize must-haves vs nice-to-haves
- Provide "good enough for now" options
plugins/cmdb-assistant/mcp-servers/netbox (new symbolic link, 1 line)
@@ -0,0 +1 @@
../../../mcp-servers/netbox
@@ -1,297 +0,0 @@
# NetBox MCP Server

MCP (Model Context Protocol) server for comprehensive NetBox API integration with Claude Code.

## Overview

This MCP server provides Claude Code with full access to the NetBox REST API, enabling infrastructure management, documentation, and automation workflows. It covers all major NetBox application areas:

- **DCIM** - Sites, Locations, Racks, Devices, Interfaces, Cables, Power
- **IPAM** - IP Addresses, Prefixes, VLANs, VRFs, ASNs, Services
- **Circuits** - Providers, Circuits, Terminations
- **Virtualization** - Clusters, Virtual Machines, VM Interfaces
- **Tenancy** - Tenants, Contacts, Contact Assignments
- **VPN** - Tunnels, IKE/IPSec Policies, L2VPN
- **Wireless** - Wireless LANs, Links, Groups
- **Extras** - Tags, Custom Fields, Webhooks, Config Contexts, Audit Log

## Installation

### 1. Clone and Setup

```bash
cd /path/to/mcp-servers/netbox
python -m venv .venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate
pip install -r requirements.txt
```

### 2. Configure Credentials

Create the system-level configuration file:

```bash
mkdir -p ~/.config/claude
cat > ~/.config/claude/netbox.env << 'EOF'
NETBOX_API_URL=https://your-netbox-instance/api
NETBOX_API_TOKEN=your-api-token-here
NETBOX_VERIFY_SSL=true
NETBOX_TIMEOUT=30
EOF
```

**Getting a NetBox API Token:**
1. Log into your NetBox instance
2. Navigate to your profile (top-right menu)
3. Go to "API Tokens"
4. Click "Add a token"
5. Copy the generated token

### 3. Register with Claude Code

Add to your Claude Code MCP configuration (`~/.config/claude/mcp.json` or project `.mcp.json`):

```json
{
  "mcpServers": {
    "netbox": {
      "command": "/path/to/mcp-servers/netbox/.venv/bin/python",
      "args": ["-m", "mcp_server.server"],
      "cwd": "/path/to/mcp-servers/netbox"
    }
  }
}
```

## Configuration

### Environment Variables

| Variable | Required | Default | Description |
|----------|----------|---------|-------------|
| `NETBOX_API_URL` | Yes | - | Full URL to NetBox API (e.g., `https://netbox.example.com/api`) |
| `NETBOX_API_TOKEN` | Yes | - | API authentication token |
| `NETBOX_VERIFY_SSL` | No | `true` | Verify SSL certificates |
| `NETBOX_TIMEOUT` | No | `30` | Request timeout in seconds |

### Configuration Hierarchy

1. **System-level** (`~/.config/claude/netbox.env`): Credentials and defaults
2. **Project-level** (`.env` in current directory): Optional overrides
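As a quick illustration of the override behavior (variable names as documented above, values are placeholders), a project-local `.env` might relax SSL verification and raise the timeout for a single lab project while the token stays in the system-level file:

```bash
# .env in the project directory - overrides only; credentials stay in ~/.config/claude/netbox.env
NETBOX_VERIFY_SSL=false
NETBOX_TIMEOUT=60
```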

## Available Tools

### DCIM (Data Center Infrastructure Management)

| Tool | Description |
|------|-------------|
| `dcim_list_sites` | List all sites |
| `dcim_get_site` | Get site details |
| `dcim_create_site` | Create a new site |
| `dcim_update_site` | Update a site |
| `dcim_delete_site` | Delete a site |
| `dcim_list_devices` | List all devices |
| `dcim_get_device` | Get device details |
| `dcim_create_device` | Create a new device |
| `dcim_update_device` | Update a device |
| `dcim_delete_device` | Delete a device |
| `dcim_list_interfaces` | List device interfaces |
| `dcim_create_interface` | Create an interface |
| `dcim_list_racks` | List all racks |
| `dcim_create_rack` | Create a new rack |
| `dcim_list_cables` | List all cables |
| `dcim_create_cable` | Create a cable connection |
| ... and many more |

### IPAM (IP Address Management)

| Tool | Description |
|------|-------------|
| `ipam_list_prefixes` | List IP prefixes |
| `ipam_create_prefix` | Create a prefix |
| `ipam_list_available_prefixes` | List available child prefixes |
| `ipam_create_available_prefix` | Auto-allocate a prefix |
| `ipam_list_ip_addresses` | List IP addresses |
| `ipam_create_ip_address` | Create an IP address |
| `ipam_list_available_ips` | List available IPs in prefix |
| `ipam_create_available_ip` | Auto-allocate an IP |
| `ipam_list_vlans` | List VLANs |
| `ipam_create_vlan` | Create a VLAN |
| `ipam_list_vrfs` | List VRFs |
| ... and many more |

### Circuits

| Tool | Description |
|------|-------------|
| `circuits_list_providers` | List circuit providers |
| `circuits_create_provider` | Create a provider |
| `circuits_list_circuits` | List circuits |
| `circuits_create_circuit` | Create a circuit |
| `circuits_list_circuit_terminations` | List terminations |
| ... and more |

### Virtualization

| Tool | Description |
|------|-------------|
| `virtualization_list_clusters` | List clusters |
| `virtualization_create_cluster` | Create a cluster |
| `virtualization_list_virtual_machines` | List VMs |
| `virtualization_create_virtual_machine` | Create a VM |
| `virtualization_list_vm_interfaces` | List VM interfaces |
| ... and more |

### Tenancy

| Tool | Description |
|------|-------------|
| `tenancy_list_tenants` | List tenants |
| `tenancy_create_tenant` | Create a tenant |
| `tenancy_list_contacts` | List contacts |
| `tenancy_create_contact` | Create a contact |
| ... and more |

### VPN

| Tool | Description |
|------|-------------|
| `vpn_list_tunnels` | List VPN tunnels |
| `vpn_create_tunnel` | Create a tunnel |
| `vpn_list_l2vpns` | List L2VPNs |
| `vpn_list_ike_policies` | List IKE policies |
| `vpn_list_ipsec_policies` | List IPSec policies |
| ... and more |

### Wireless

| Tool | Description |
|------|-------------|
| `wireless_list_wireless_lans` | List wireless LANs |
| `wireless_create_wireless_lan` | Create a WLAN |
| `wireless_list_wireless_links` | List wireless links |
| ... and more |

### Extras

| Tool | Description |
|------|-------------|
| `extras_list_tags` | List all tags |
| `extras_create_tag` | Create a tag |
| `extras_list_custom_fields` | List custom fields |
| `extras_list_webhooks` | List webhooks |
| `extras_list_journal_entries` | List journal entries |
| `extras_create_journal_entry` | Create journal entry |
| `extras_list_object_changes` | View audit log |
| `extras_list_config_contexts` | List config contexts |
| ... and more |

## Usage Examples

### List all devices at a site

```
Use the dcim_list_devices tool with site_id filter to see all devices at site 5
```

### Create a new prefix and allocate IPs

```
1. Use ipam_create_prefix to create 10.0.1.0/24
2. Use ipam_list_available_ips with the prefix ID to see available addresses
3. Use ipam_create_available_ip to auto-allocate the next IP
```

### Document a new server

```
1. Use dcim_create_device to create the device
2. Use dcim_create_interface to add network interfaces
3. Use ipam_create_ip_address to assign IPs to interfaces
4. Use extras_create_journal_entry to add notes
```

### Audit recent changes

```
Use extras_list_object_changes to see recent modifications in NetBox
```

## Architecture

```
mcp-servers/netbox/
├── mcp_server/
│   ├── __init__.py
│   ├── config.py            # Configuration loader
│   ├── netbox_client.py     # Generic HTTP client
│   ├── server.py            # MCP server entry point
│   └── tools/
│       ├── __init__.py
│       ├── dcim.py          # DCIM operations
│       ├── ipam.py          # IPAM operations
│       ├── circuits.py      # Circuits operations
│       ├── virtualization.py
│       ├── tenancy.py
│       ├── vpn.py
│       ├── wireless.py
│       └── extras.py
├── tests/
│   └── __init__.py
├── requirements.txt
└── README.md
```

## API Coverage

This MCP server provides comprehensive coverage of the NetBox REST API v4.x:

- Full CRUD operations for all major models
- Filtering and search capabilities
- Special endpoints (available prefixes, available IPs)
- Pagination handling (automatic)
- Error handling with detailed messages

## Error Handling

The server returns detailed error messages from the NetBox API, including:
- Validation errors
- Authentication failures
- Not found errors
- Permission errors
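As a rough illustration only (the exact payloads depend on the endpoint and NetBox version, so treat these shapes as assumptions rather than a contract), a validation failure and an authentication failure typically come back as JSON bodies along these lines, which the server surfaces verbatim:

```json
{"name": ["This field is required."]}
```

```json
{"detail": "Invalid token"}
```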

## Security Notes

- API tokens should be kept secure and not committed to version control
- Use environment variables or the system config file for credentials
- SSL verification is enabled by default
- Consider using read-only tokens for query-only workflows

## Troubleshooting

### Common Issues

1. **Connection refused**: Check that `NETBOX_API_URL` is correct and accessible
2. **401 Unauthorized**: Verify your API token is valid
3. **SSL errors**: Set `NETBOX_VERIFY_SSL=false` for self-signed certs (not recommended for production)
4. **Timeout errors**: Increase `NETBOX_TIMEOUT` for slow connections

### Debug Mode

Enable debug logging:

```python
import logging
logging.basicConfig(level=logging.DEBUG)
```

## Contributing

1. Follow the existing code patterns
2. Add tests for new functionality
3. Update documentation for new tools
4. Ensure compatibility with NetBox 4.x API

## License

MIT License - Part of the Claude Code Marketplace (`support-claude-mktplace`).
@@ -1 +0,0 @@
"""NetBox MCP Server for Claude Code integration."""
@@ -1,108 +0,0 @@
"""
Configuration loader for NetBox MCP Server.

Implements hybrid configuration system:
- System-level: ~/.config/claude/netbox.env (credentials)
- Project-level: .env (optional overrides)
"""
from pathlib import Path
from dotenv import load_dotenv
import os
import logging
from typing import Dict, Optional

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)


class NetBoxConfig:
    """Configuration loader for NetBox MCP Server"""

    def __init__(self):
        self.api_url: Optional[str] = None
        self.api_token: Optional[str] = None
        self.verify_ssl: bool = True
        self.timeout: int = 30

    def load(self) -> Dict[str, any]:
        """
        Load configuration from system and project levels.
        Project-level configuration overrides system-level.

        Returns:
            Dict containing api_url, api_token, verify_ssl, timeout

        Raises:
            FileNotFoundError: If system config is missing
            ValueError: If required configuration is missing
        """
        # Load system config
        system_config = Path.home() / '.config' / 'claude' / 'netbox.env'
        if system_config.exists():
            load_dotenv(system_config)
            logger.info(f"Loaded system configuration from {system_config}")
        else:
            raise FileNotFoundError(
                f"System config not found: {system_config}\n"
                "Create it with:\n"
                "  mkdir -p ~/.config/claude\n"
                "  cat > ~/.config/claude/netbox.env << EOF\n"
                "  NETBOX_API_URL=https://your-netbox-instance/api\n"
                "  NETBOX_API_TOKEN=your-api-token\n"
                "  EOF"
            )

        # Load project config (overrides system)
        project_config = Path.cwd() / '.env'
        if project_config.exists():
            load_dotenv(project_config, override=True)
            logger.info(f"Loaded project configuration from {project_config}")

        # Extract values
        self.api_url = os.getenv('NETBOX_API_URL')
        self.api_token = os.getenv('NETBOX_API_TOKEN')

        # Optional settings with defaults
        verify_ssl_str = os.getenv('NETBOX_VERIFY_SSL', 'true').lower()
        self.verify_ssl = verify_ssl_str in ('true', '1', 'yes')

        timeout_str = os.getenv('NETBOX_TIMEOUT', '30')
        try:
            self.timeout = int(timeout_str)
        except ValueError:
            self.timeout = 30
            logger.warning(f"Invalid NETBOX_TIMEOUT value '{timeout_str}', using default 30")

        # Validate required variables
        self._validate()

        # Normalize API URL (remove trailing slash)
        if self.api_url and self.api_url.endswith('/'):
            self.api_url = self.api_url.rstrip('/')

        return {
            'api_url': self.api_url,
            'api_token': self.api_token,
            'verify_ssl': self.verify_ssl,
            'timeout': self.timeout
        }

    def _validate(self) -> None:
        """
        Validate that required configuration is present.

        Raises:
            ValueError: If required configuration is missing
        """
        required = {
            'NETBOX_API_URL': self.api_url,
            'NETBOX_API_TOKEN': self.api_token
        }

        missing = [key for key, value in required.items() if not value]

        if missing:
            raise ValueError(
                f"Missing required configuration: {', '.join(missing)}\n"
                "Check your ~/.config/claude/netbox.env file"
            )
@@ -1,294 +0,0 @@
"""
NetBox API client for interacting with NetBox REST API.

Provides a generic HTTP client with methods for all standard REST operations.
Individual tool modules use this client for their specific endpoints.
"""
import requests
import logging
from typing import List, Dict, Optional, Any, Union
from urllib.parse import urljoin
from .config import NetBoxConfig

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)


class NetBoxClient:
    """Generic client for interacting with NetBox REST API"""

    def __init__(self):
        """Initialize NetBox client with configuration"""
        config = NetBoxConfig()
        config_dict = config.load()

        self.base_url = config_dict['api_url']
        self.token = config_dict['api_token']
        self.verify_ssl = config_dict['verify_ssl']
        self.timeout = config_dict['timeout']

        self.session = requests.Session()
        self.session.headers.update({
            'Authorization': f'Token {self.token}',
            'Content-Type': 'application/json',
            'Accept': 'application/json'
        })
        self.session.verify = self.verify_ssl

        logger.info(f"NetBox client initialized for {self.base_url}")

    def _build_url(self, endpoint: str) -> str:
        """
        Build full URL for API endpoint.

        Args:
            endpoint: API endpoint path (e.g., 'dcim/devices/')

        Returns:
            Full URL
        """
        # Ensure endpoint starts with /
        if not endpoint.startswith('/'):
            endpoint = '/' + endpoint
        # Ensure endpoint ends with /
        if not endpoint.endswith('/'):
            endpoint = endpoint + '/'
        return f"{self.base_url}{endpoint}"

    def _handle_response(self, response: requests.Response) -> Any:
        """
        Handle API response and raise appropriate errors.

        Args:
            response: requests Response object

        Returns:
            Parsed JSON response

        Raises:
            requests.HTTPError: If request failed
        """
        try:
            response.raise_for_status()
        except requests.HTTPError as e:
            # Try to get error details from response
            try:
                error_detail = response.json()
                logger.error(f"API error: {error_detail}")
            except Exception:
                logger.error(f"API error: {response.text}")
            raise e

        # Handle empty responses (e.g., DELETE)
        if response.status_code == 204 or not response.content:
            return None

        return response.json()

    def list(
        self,
        endpoint: str,
        params: Optional[Dict[str, Any]] = None,
        paginate: bool = True,
        limit: int = 50
    ) -> List[Dict]:
        """
        List objects from an endpoint with optional pagination.

        Args:
            endpoint: API endpoint path
            params: Query parameters for filtering
            paginate: Whether to handle pagination automatically
            limit: Number of results per page

        Returns:
            List of objects
        """
        url = self._build_url(endpoint)
        params = params or {}
        params['limit'] = limit

        logger.info(f"Listing objects from {endpoint}")

        if not paginate:
            response = self.session.get(url, params=params, timeout=self.timeout)
            result = self._handle_response(response)
            return result.get('results', []) if isinstance(result, dict) else result

        # Handle pagination
        all_results = []
        while url:
            response = self.session.get(url, params=params, timeout=self.timeout)
            result = self._handle_response(response)

            if isinstance(result, dict):
                all_results.extend(result.get('results', []))
                url = result.get('next')
                params = {}  # Next URL already contains params
            else:
                all_results.extend(result)
                break

        return all_results

    def get(self, endpoint: str, id: Union[int, str]) -> Dict:
        """
        Get a single object by ID.

        Args:
            endpoint: API endpoint path
            id: Object ID

        Returns:
            Object dictionary
        """
        url = self._build_url(f"{endpoint}/{id}")
        logger.info(f"Getting object {id} from {endpoint}")
        response = self.session.get(url, timeout=self.timeout)
        return self._handle_response(response)

    def create(self, endpoint: str, data: Dict) -> Dict:
        """
        Create a new object.

        Args:
            endpoint: API endpoint path
            data: Object data

        Returns:
            Created object dictionary
        """
        url = self._build_url(endpoint)
        logger.info(f"Creating object in {endpoint}")
        response = self.session.post(url, json=data, timeout=self.timeout)
        return self._handle_response(response)

    def create_bulk(self, endpoint: str, data: List[Dict]) -> List[Dict]:
        """
        Create multiple objects in bulk.

        Args:
            endpoint: API endpoint path
            data: List of object data

        Returns:
            List of created objects
        """
        url = self._build_url(endpoint)
        logger.info(f"Bulk creating {len(data)} objects in {endpoint}")
        response = self.session.post(url, json=data, timeout=self.timeout)
        return self._handle_response(response)

    def update(self, endpoint: str, id: Union[int, str], data: Dict) -> Dict:
        """
        Update an existing object (full update).

        Args:
            endpoint: API endpoint path
            id: Object ID
            data: Updated object data

        Returns:
            Updated object dictionary
        """
        url = self._build_url(f"{endpoint}/{id}")
        logger.info(f"Updating object {id} in {endpoint}")
        response = self.session.put(url, json=data, timeout=self.timeout)
        return self._handle_response(response)

    def patch(self, endpoint: str, id: Union[int, str], data: Dict) -> Dict:
        """
        Partially update an existing object.

        Args:
            endpoint: API endpoint path
            id: Object ID
            data: Fields to update

        Returns:
            Updated object dictionary
        """
        url = self._build_url(f"{endpoint}/{id}")
        logger.info(f"Patching object {id} in {endpoint}")
        response = self.session.patch(url, json=data, timeout=self.timeout)
        return self._handle_response(response)

    def delete(self, endpoint: str, id: Union[int, str]) -> None:
        """
        Delete an object.

        Args:
            endpoint: API endpoint path
            id: Object ID
        """
        url = self._build_url(f"{endpoint}/{id}")
        logger.info(f"Deleting object {id} from {endpoint}")
        response = self.session.delete(url, timeout=self.timeout)
        self._handle_response(response)

    def delete_bulk(self, endpoint: str, ids: List[Union[int, str]]) -> None:
        """
        Delete multiple objects in bulk.

        Args:
            endpoint: API endpoint path
            ids: List of object IDs
        """
        url = self._build_url(endpoint)
        data = [{'id': id} for id in ids]
        logger.info(f"Bulk deleting {len(ids)} objects from {endpoint}")
        response = self.session.delete(url, json=data, timeout=self.timeout)
        self._handle_response(response)

    def options(self, endpoint: str) -> Dict:
        """
        Get available options for an endpoint (schema info).

        Args:
            endpoint: API endpoint path

        Returns:
            Options/schema dictionary
        """
        url = self._build_url(endpoint)
        logger.info(f"Getting options for {endpoint}")
        response = self.session.options(url, timeout=self.timeout)
        return self._handle_response(response)

    def search(
        self,
        endpoint: str,
        query: str,
        params: Optional[Dict[str, Any]] = None
    ) -> List[Dict]:
        """
        Search objects using the 'q' parameter.

        Args:
            endpoint: API endpoint path
            query: Search query string
            params: Additional filter parameters

        Returns:
            List of matching objects
        """
        params = params or {}
        params['q'] = query
        return self.list(endpoint, params=params)

    def filter(
        self,
        endpoint: str,
        **filters
    ) -> List[Dict]:
        """
        Filter objects by various fields.

        Args:
            endpoint: API endpoint path
            **filters: Filter parameters (field=value)

        Returns:
            List of matching objects
        """
        return self.list(endpoint, params=filters)
File diff suppressed because it is too large
@@ -1,20 +0,0 @@
"""NetBox MCP tools package."""
from .dcim import DCIMTools
from .ipam import IPAMTools
from .circuits import CircuitsTools
from .virtualization import VirtualizationTools
from .tenancy import TenancyTools
from .vpn import VPNTools
from .wireless import WirelessTools
from .extras import ExtrasTools

__all__ = [
    'DCIMTools',
    'IPAMTools',
    'CircuitsTools',
    'VirtualizationTools',
    'TenancyTools',
    'VPNTools',
    'WirelessTools',
    'ExtrasTools',
]
@@ -1,373 +0,0 @@
|
||||
"""
|
||||
Circuits tools for NetBox MCP Server.
|
||||
|
||||
Covers: Providers, Circuits, Circuit Types, Circuit Terminations, and related models.
|
||||
"""
|
||||
import logging
|
||||
from typing import List, Dict, Optional, Any
|
||||
from ..netbox_client import NetBoxClient
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class CircuitsTools:
|
||||
"""Tools for Circuits operations in NetBox"""
|
||||
|
||||
def __init__(self, client: NetBoxClient):
|
||||
self.client = client
|
||||
self.base_endpoint = 'circuits'
|
||||
|
||||
# ==================== Providers ====================
|
||||
|
||||
async def list_providers(
|
||||
self,
|
||||
name: Optional[str] = None,
|
||||
slug: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all circuit providers."""
|
||||
params = {k: v for k, v in {'name': name, 'slug': slug, **kwargs}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/providers', params=params)
|
||||
|
||||
async def get_provider(self, id: int) -> Dict:
|
||||
"""Get a specific provider by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/providers', id)
|
||||
|
||||
async def create_provider(
|
||||
self,
|
||||
name: str,
|
||||
slug: str,
|
||||
asns: Optional[List[int]] = None,
|
||||
description: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new provider."""
|
||||
data = {'name': name, 'slug': slug, **kwargs}
|
||||
if asns:
|
||||
data['asns'] = asns
|
||||
if description:
|
||||
data['description'] = description
|
||||
return self.client.create(f'{self.base_endpoint}/providers', data)
|
||||
|
||||
async def update_provider(self, id: int, **kwargs) -> Dict:
|
||||
"""Update a provider."""
|
||||
return self.client.patch(f'{self.base_endpoint}/providers', id, kwargs)
|
||||
|
||||
async def delete_provider(self, id: int) -> None:
|
||||
"""Delete a provider."""
|
||||
self.client.delete(f'{self.base_endpoint}/providers', id)
|
||||
|
||||
# ==================== Provider Accounts ====================
|
||||
|
||||
async def list_provider_accounts(
|
||||
self,
|
||||
provider_id: Optional[int] = None,
|
||||
name: Optional[str] = None,
|
||||
account: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all provider accounts."""
|
||||
params = {k: v for k, v in {
|
||||
'provider_id': provider_id, 'name': name, 'account': account, **kwargs
|
||||
}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/provider-accounts', params=params)
|
||||
|
||||
async def get_provider_account(self, id: int) -> Dict:
|
||||
"""Get a specific provider account by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/provider-accounts', id)
|
||||
|
||||
async def create_provider_account(
|
||||
self,
|
||||
provider: int,
|
||||
account: str,
|
||||
name: Optional[str] = None,
|
||||
description: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new provider account."""
|
||||
data = {'provider': provider, 'account': account, **kwargs}
|
||||
if name:
|
||||
data['name'] = name
|
||||
if description:
|
||||
data['description'] = description
|
||||
return self.client.create(f'{self.base_endpoint}/provider-accounts', data)
|
||||
|
||||
async def update_provider_account(self, id: int, **kwargs) -> Dict:
|
||||
"""Update a provider account."""
|
||||
return self.client.patch(f'{self.base_endpoint}/provider-accounts', id, kwargs)
|
||||
|
||||
async def delete_provider_account(self, id: int) -> None:
|
||||
"""Delete a provider account."""
|
||||
self.client.delete(f'{self.base_endpoint}/provider-accounts', id)
|
||||
|
||||
# ==================== Provider Networks ====================
|
||||
|
||||
async def list_provider_networks(
|
||||
self,
|
||||
provider_id: Optional[int] = None,
|
||||
name: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all provider networks."""
|
||||
params = {k: v for k, v in {
|
||||
'provider_id': provider_id, 'name': name, **kwargs
|
||||
}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/provider-networks', params=params)
|
||||
|
||||
async def get_provider_network(self, id: int) -> Dict:
|
||||
"""Get a specific provider network by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/provider-networks', id)
|
||||
|
||||
async def create_provider_network(
|
||||
self,
|
||||
provider: int,
|
||||
name: str,
|
||||
service_id: Optional[str] = None,
|
||||
description: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new provider network."""
|
||||
data = {'provider': provider, 'name': name, **kwargs}
|
||||
if service_id:
|
||||
data['service_id'] = service_id
|
||||
if description:
|
||||
data['description'] = description
|
||||
return self.client.create(f'{self.base_endpoint}/provider-networks', data)
|
||||
|
||||
async def update_provider_network(self, id: int, **kwargs) -> Dict:
|
||||
"""Update a provider network."""
|
||||
return self.client.patch(f'{self.base_endpoint}/provider-networks', id, kwargs)
|
||||
|
||||
async def delete_provider_network(self, id: int) -> None:
|
||||
"""Delete a provider network."""
|
||||
self.client.delete(f'{self.base_endpoint}/provider-networks', id)
|
||||
|
||||
# ==================== Circuit Types ====================
|
||||
|
||||
async def list_circuit_types(
|
||||
self,
|
||||
name: Optional[str] = None,
|
||||
slug: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all circuit types."""
|
||||
params = {k: v for k, v in {'name': name, 'slug': slug, **kwargs}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/circuit-types', params=params)
|
||||
|
||||
async def get_circuit_type(self, id: int) -> Dict:
|
||||
"""Get a specific circuit type by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/circuit-types', id)
|
||||
|
||||
async def create_circuit_type(
|
||||
self,
|
||||
name: str,
|
||||
slug: str,
|
||||
color: Optional[str] = None,
|
||||
description: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new circuit type."""
|
||||
data = {'name': name, 'slug': slug, **kwargs}
|
||||
if color:
|
||||
data['color'] = color
|
||||
if description:
|
||||
data['description'] = description
|
||||
return self.client.create(f'{self.base_endpoint}/circuit-types', data)
|
||||
|
||||
    async def update_circuit_type(self, id: int, **kwargs) -> Dict:
        """Update a circuit type."""
        return self.client.patch(f'{self.base_endpoint}/circuit-types', id, kwargs)

    async def delete_circuit_type(self, id: int) -> None:
        """Delete a circuit type."""
        self.client.delete(f'{self.base_endpoint}/circuit-types', id)

    # ==================== Circuit Groups ====================

    async def list_circuit_groups(
        self,
        name: Optional[str] = None,
        slug: Optional[str] = None,
        **kwargs
    ) -> List[Dict]:
        """List all circuit groups."""
        params = {k: v for k, v in {'name': name, 'slug': slug, **kwargs}.items() if v is not None}
        return self.client.list(f'{self.base_endpoint}/circuit-groups', params=params)

    async def get_circuit_group(self, id: int) -> Dict:
        """Get a specific circuit group by ID."""
        return self.client.get(f'{self.base_endpoint}/circuit-groups', id)

    async def create_circuit_group(
        self,
        name: str,
        slug: str,
        description: Optional[str] = None,
        **kwargs
    ) -> Dict:
        """Create a new circuit group."""
        data = {'name': name, 'slug': slug, **kwargs}
        if description:
            data['description'] = description
        return self.client.create(f'{self.base_endpoint}/circuit-groups', data)

    async def update_circuit_group(self, id: int, **kwargs) -> Dict:
        """Update a circuit group."""
        return self.client.patch(f'{self.base_endpoint}/circuit-groups', id, kwargs)

    async def delete_circuit_group(self, id: int) -> None:
        """Delete a circuit group."""
        self.client.delete(f'{self.base_endpoint}/circuit-groups', id)

    # ==================== Circuit Group Assignments ====================

    async def list_circuit_group_assignments(
        self,
        group_id: Optional[int] = None,
        circuit_id: Optional[int] = None,
        **kwargs
    ) -> List[Dict]:
        """List all circuit group assignments."""
        params = {k: v for k, v in {
            'group_id': group_id, 'circuit_id': circuit_id, **kwargs
        }.items() if v is not None}
        return self.client.list(f'{self.base_endpoint}/circuit-group-assignments', params=params)

    async def get_circuit_group_assignment(self, id: int) -> Dict:
        """Get a specific circuit group assignment by ID."""
        return self.client.get(f'{self.base_endpoint}/circuit-group-assignments', id)

    async def create_circuit_group_assignment(
        self,
        group: int,
        circuit: int,
        priority: Optional[str] = None,
        **kwargs
    ) -> Dict:
        """Create a new circuit group assignment."""
        data = {'group': group, 'circuit': circuit, **kwargs}
        if priority:
            data['priority'] = priority
        return self.client.create(f'{self.base_endpoint}/circuit-group-assignments', data)

    async def update_circuit_group_assignment(self, id: int, **kwargs) -> Dict:
        """Update a circuit group assignment."""
        return self.client.patch(f'{self.base_endpoint}/circuit-group-assignments', id, kwargs)

    async def delete_circuit_group_assignment(self, id: int) -> None:
        """Delete a circuit group assignment."""
        self.client.delete(f'{self.base_endpoint}/circuit-group-assignments', id)

    # ==================== Circuits ====================

    async def list_circuits(
        self,
        cid: Optional[str] = None,
        provider_id: Optional[int] = None,
        provider_account_id: Optional[int] = None,
        type_id: Optional[int] = None,
        status: Optional[str] = None,
        tenant_id: Optional[int] = None,
        site_id: Optional[int] = None,
        **kwargs
    ) -> List[Dict]:
        """List all circuits with optional filtering."""
        params = {k: v for k, v in {
            'cid': cid, 'provider_id': provider_id, 'provider_account_id': provider_account_id,
            'type_id': type_id, 'status': status, 'tenant_id': tenant_id, 'site_id': site_id, **kwargs
        }.items() if v is not None}
        return self.client.list(f'{self.base_endpoint}/circuits', params=params)

    async def get_circuit(self, id: int) -> Dict:
        """Get a specific circuit by ID."""
        return self.client.get(f'{self.base_endpoint}/circuits', id)

    async def create_circuit(
        self,
        cid: str,
        provider: int,
        type: int,
        status: str = 'active',
        provider_account: Optional[int] = None,
        tenant: Optional[int] = None,
        install_date: Optional[str] = None,
        termination_date: Optional[str] = None,
        commit_rate: Optional[int] = None,
        description: Optional[str] = None,
        **kwargs
    ) -> Dict:
        """Create a new circuit."""
        data = {'cid': cid, 'provider': provider, 'type': type, 'status': status, **kwargs}
        for key, val in [
            ('provider_account', provider_account), ('tenant', tenant),
            ('install_date', install_date), ('termination_date', termination_date),
            ('commit_rate', commit_rate), ('description', description)
        ]:
            if val is not None:
                data[key] = val
        return self.client.create(f'{self.base_endpoint}/circuits', data)
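    # Illustrative usage sketch (editor's addition, not part of the original file).
    # Assuming an initialized tools instance for this module (the class name is not
    # visible in this excerpt), a minimal call might look like:
    #
    #     circuit = await circuits.create_circuit(
    #         cid='CKT-1001',        # provider's circuit ID
    #         provider=1,            # provider object ID
    #         type=2,                # circuit type ID
    #         status='active',
    #         commit_rate=100000,    # kbps
    #     )
    #
    # Optional fields are only included in the payload when a non-None value is
    # supplied, matching the filtering loop above.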

    async def update_circuit(self, id: int, **kwargs) -> Dict:
        """Update a circuit."""
        return self.client.patch(f'{self.base_endpoint}/circuits', id, kwargs)

    async def delete_circuit(self, id: int) -> None:
        """Delete a circuit."""
        self.client.delete(f'{self.base_endpoint}/circuits', id)

    # ==================== Circuit Terminations ====================

    async def list_circuit_terminations(
        self,
        circuit_id: Optional[int] = None,
        site_id: Optional[int] = None,
        provider_network_id: Optional[int] = None,
        term_side: Optional[str] = None,
        **kwargs
    ) -> List[Dict]:
        """List all circuit terminations."""
        params = {k: v for k, v in {
            'circuit_id': circuit_id, 'site_id': site_id,
            'provider_network_id': provider_network_id, 'term_side': term_side, **kwargs
        }.items() if v is not None}
        return self.client.list(f'{self.base_endpoint}/circuit-terminations', params=params)

    async def get_circuit_termination(self, id: int) -> Dict:
        """Get a specific circuit termination by ID."""
        return self.client.get(f'{self.base_endpoint}/circuit-terminations', id)

    async def create_circuit_termination(
        self,
        circuit: int,
        term_side: str,
        site: Optional[int] = None,
        provider_network: Optional[int] = None,
        port_speed: Optional[int] = None,
        upstream_speed: Optional[int] = None,
        xconnect_id: Optional[str] = None,
        pp_info: Optional[str] = None,
        description: Optional[str] = None,
        **kwargs
    ) -> Dict:
        """Create a new circuit termination."""
        data = {'circuit': circuit, 'term_side': term_side, **kwargs}
        for key, val in [
            ('site', site), ('provider_network', provider_network),
            ('port_speed', port_speed), ('upstream_speed', upstream_speed),
            ('xconnect_id', xconnect_id), ('pp_info', pp_info), ('description', description)
        ]:
            if val is not None:
                data[key] = val
        return self.client.create(f'{self.base_endpoint}/circuit-terminations', data)
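    # Illustrative usage sketch (editor's addition): NetBox models a circuit with an
    # "A" side and a "Z" side termination, selected via `term_side`. A hypothetical
    # pair of calls terminating circuit 42 at site 7 might be:
    #
    #     await circuits.create_circuit_termination(circuit=42, term_side='A', site=7, port_speed=1000000)
    #     await circuits.create_circuit_termination(circuit=42, term_side='Z', provider_network=3)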

    async def update_circuit_termination(self, id: int, **kwargs) -> Dict:
        """Update a circuit termination."""
        return self.client.patch(f'{self.base_endpoint}/circuit-terminations', id, kwargs)

    async def delete_circuit_termination(self, id: int) -> None:
        """Delete a circuit termination."""
        self.client.delete(f'{self.base_endpoint}/circuit-terminations', id)

    async def get_circuit_termination_paths(self, id: int) -> Dict:
        """Get cable paths for a circuit termination."""
        return self.client.get(f'{self.base_endpoint}/circuit-terminations', f'{id}/paths')
@@ -1,935 +0,0 @@
"""
DCIM (Data Center Infrastructure Management) tools for NetBox MCP Server.

Covers: Sites, Locations, Racks, Devices, Cables, Interfaces, and related models.
"""
import logging
from typing import List, Dict, Optional, Any
from ..netbox_client import NetBoxClient

logger = logging.getLogger(__name__)


class DCIMTools:
    """Tools for DCIM operations in NetBox"""

    def __init__(self, client: NetBoxClient):
        self.client = client
        self.base_endpoint = 'dcim'

    # ==================== Regions ====================

    async def list_regions(
        self,
        name: Optional[str] = None,
        slug: Optional[str] = None,
        parent_id: Optional[int] = None,
        **kwargs
    ) -> List[Dict]:
        """List all regions with optional filtering."""
        params = {k: v for k, v in {
            'name': name, 'slug': slug, 'parent_id': parent_id, **kwargs
        }.items() if v is not None}
        return self.client.list(f'{self.base_endpoint}/regions', params=params)

    async def get_region(self, id: int) -> Dict:
        """Get a specific region by ID."""
        return self.client.get(f'{self.base_endpoint}/regions', id)

    async def create_region(self, name: str, slug: str, parent: Optional[int] = None, **kwargs) -> Dict:
        """Create a new region."""
        data = {'name': name, 'slug': slug, **kwargs}
        if parent:
            data['parent'] = parent
        return self.client.create(f'{self.base_endpoint}/regions', data)

    async def update_region(self, id: int, **kwargs) -> Dict:
        """Update a region."""
        return self.client.patch(f'{self.base_endpoint}/regions', id, kwargs)

    async def delete_region(self, id: int) -> None:
        """Delete a region."""
        self.client.delete(f'{self.base_endpoint}/regions', id)

    # ==================== Site Groups ====================

    async def list_site_groups(
        self,
        name: Optional[str] = None,
        slug: Optional[str] = None,
        parent_id: Optional[int] = None,
        **kwargs
    ) -> List[Dict]:
        """List all site groups with optional filtering."""
        params = {k: v for k, v in {
            'name': name, 'slug': slug, 'parent_id': parent_id, **kwargs
        }.items() if v is not None}
        return self.client.list(f'{self.base_endpoint}/site-groups', params=params)

    async def get_site_group(self, id: int) -> Dict:
        """Get a specific site group by ID."""
        return self.client.get(f'{self.base_endpoint}/site-groups', id)

    async def create_site_group(self, name: str, slug: str, parent: Optional[int] = None, **kwargs) -> Dict:
        """Create a new site group."""
        data = {'name': name, 'slug': slug, **kwargs}
        if parent:
            data['parent'] = parent
        return self.client.create(f'{self.base_endpoint}/site-groups', data)

    async def update_site_group(self, id: int, **kwargs) -> Dict:
        """Update a site group."""
        return self.client.patch(f'{self.base_endpoint}/site-groups', id, kwargs)

    async def delete_site_group(self, id: int) -> None:
        """Delete a site group."""
        self.client.delete(f'{self.base_endpoint}/site-groups', id)

    # ==================== Sites ====================

    async def list_sites(
        self,
        name: Optional[str] = None,
        slug: Optional[str] = None,
        status: Optional[str] = None,
        region_id: Optional[int] = None,
        group_id: Optional[int] = None,
        tenant_id: Optional[int] = None,
        **kwargs
    ) -> List[Dict]:
        """List all sites with optional filtering."""
        params = {k: v for k, v in {
            'name': name, 'slug': slug, 'status': status,
            'region_id': region_id, 'group_id': group_id, 'tenant_id': tenant_id, **kwargs
        }.items() if v is not None}
        return self.client.list(f'{self.base_endpoint}/sites', params=params)

    async def get_site(self, id: int) -> Dict:
        """Get a specific site by ID."""
        return self.client.get(f'{self.base_endpoint}/sites', id)

    async def create_site(
        self,
        name: str,
        slug: str,
        status: str = 'active',
        region: Optional[int] = None,
        group: Optional[int] = None,
        tenant: Optional[int] = None,
        facility: Optional[str] = None,
        time_zone: Optional[str] = None,
        description: Optional[str] = None,
        physical_address: Optional[str] = None,
        shipping_address: Optional[str] = None,
        latitude: Optional[float] = None,
        longitude: Optional[float] = None,
        **kwargs
    ) -> Dict:
        """Create a new site."""
        data = {'name': name, 'slug': slug, 'status': status, **kwargs}
        for key, val in [
            ('region', region), ('group', group), ('tenant', tenant),
            ('facility', facility), ('time_zone', time_zone),
            ('description', description), ('physical_address', physical_address),
            ('shipping_address', shipping_address), ('latitude', latitude),
            ('longitude', longitude)
        ]:
            if val is not None:
                data[key] = val
        return self.client.create(f'{self.base_endpoint}/sites', data)
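    # Illustrative usage sketch (editor's addition): assuming a configured
    # NetBoxClient instance named `client`, a site might be created as shown below;
    # the region/tenant values are NetBox object IDs and are hypothetical.
    #
    #     dcim = DCIMTools(client)
    #     site = await dcim.create_site(
    #         name='DC East 1',
    #         slug='dc-east-1',
    #         status='active',
    #         region=3,
    #         time_zone='America/Sao_Paulo',
    #         physical_address='123 Example Street',
    #     )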

    async def update_site(self, id: int, **kwargs) -> Dict:
        """Update a site."""
        return self.client.patch(f'{self.base_endpoint}/sites', id, kwargs)

    async def delete_site(self, id: int) -> None:
        """Delete a site."""
        self.client.delete(f'{self.base_endpoint}/sites', id)

    # ==================== Locations ====================

    async def list_locations(
        self,
        name: Optional[str] = None,
        slug: Optional[str] = None,
        site_id: Optional[int] = None,
        parent_id: Optional[int] = None,
        **kwargs
    ) -> List[Dict]:
        """List all locations with optional filtering."""
        params = {k: v for k, v in {
            'name': name, 'slug': slug, 'site_id': site_id, 'parent_id': parent_id, **kwargs
        }.items() if v is not None}
        return self.client.list(f'{self.base_endpoint}/locations', params=params)

    async def get_location(self, id: int) -> Dict:
        """Get a specific location by ID."""
        return self.client.get(f'{self.base_endpoint}/locations', id)

    async def create_location(
        self,
        name: str,
        slug: str,
        site: int,
        parent: Optional[int] = None,
        **kwargs
    ) -> Dict:
        """Create a new location."""
        data = {'name': name, 'slug': slug, 'site': site, **kwargs}
        if parent:
            data['parent'] = parent
        return self.client.create(f'{self.base_endpoint}/locations', data)

    async def update_location(self, id: int, **kwargs) -> Dict:
        """Update a location."""
        return self.client.patch(f'{self.base_endpoint}/locations', id, kwargs)

    async def delete_location(self, id: int) -> None:
        """Delete a location."""
        self.client.delete(f'{self.base_endpoint}/locations', id)

    # ==================== Rack Roles ====================

    async def list_rack_roles(self, name: Optional[str] = None, **kwargs) -> List[Dict]:
        """List all rack roles."""
        params = {k: v for k, v in {'name': name, **kwargs}.items() if v is not None}
        return self.client.list(f'{self.base_endpoint}/rack-roles', params=params)

    async def get_rack_role(self, id: int) -> Dict:
        """Get a specific rack role by ID."""
        return self.client.get(f'{self.base_endpoint}/rack-roles', id)

    async def create_rack_role(self, name: str, slug: str, color: str = '9e9e9e', **kwargs) -> Dict:
        """Create a new rack role."""
        data = {'name': name, 'slug': slug, 'color': color, **kwargs}
        return self.client.create(f'{self.base_endpoint}/rack-roles', data)

    async def update_rack_role(self, id: int, **kwargs) -> Dict:
        """Update a rack role."""
        return self.client.patch(f'{self.base_endpoint}/rack-roles', id, kwargs)

    async def delete_rack_role(self, id: int) -> None:
        """Delete a rack role."""
        self.client.delete(f'{self.base_endpoint}/rack-roles', id)

    # ==================== Rack Types ====================

    async def list_rack_types(self, manufacturer_id: Optional[int] = None, **kwargs) -> List[Dict]:
        """List all rack types."""
        params = {k: v for k, v in {'manufacturer_id': manufacturer_id, **kwargs}.items() if v is not None}
        return self.client.list(f'{self.base_endpoint}/rack-types', params=params)

    async def get_rack_type(self, id: int) -> Dict:
        """Get a specific rack type by ID."""
        return self.client.get(f'{self.base_endpoint}/rack-types', id)

    async def create_rack_type(
        self,
        manufacturer: int,
        model: str,
        slug: str,
        form_factor: str = '4-post-frame',
        width: int = 19,
        u_height: int = 42,
        **kwargs
    ) -> Dict:
        """Create a new rack type."""
        data = {
            'manufacturer': manufacturer, 'model': model, 'slug': slug,
            'form_factor': form_factor, 'width': width, 'u_height': u_height, **kwargs
        }
        return self.client.create(f'{self.base_endpoint}/rack-types', data)

    async def update_rack_type(self, id: int, **kwargs) -> Dict:
        """Update a rack type."""
        return self.client.patch(f'{self.base_endpoint}/rack-types', id, kwargs)

    async def delete_rack_type(self, id: int) -> None:
        """Delete a rack type."""
        self.client.delete(f'{self.base_endpoint}/rack-types', id)

    # ==================== Racks ====================

    async def list_racks(
        self,
        name: Optional[str] = None,
        site_id: Optional[int] = None,
        location_id: Optional[int] = None,
        status: Optional[str] = None,
        role_id: Optional[int] = None,
        tenant_id: Optional[int] = None,
        **kwargs
    ) -> List[Dict]:
        """List all racks with optional filtering."""
        params = {k: v for k, v in {
            'name': name, 'site_id': site_id, 'location_id': location_id,
            'status': status, 'role_id': role_id, 'tenant_id': tenant_id, **kwargs
        }.items() if v is not None}
        return self.client.list(f'{self.base_endpoint}/racks', params=params)

    async def get_rack(self, id: int) -> Dict:
        """Get a specific rack by ID."""
        return self.client.get(f'{self.base_endpoint}/racks', id)

    async def create_rack(
        self,
        name: str,
        site: int,
        status: str = 'active',
        location: Optional[int] = None,
        role: Optional[int] = None,
        tenant: Optional[int] = None,
        rack_type: Optional[int] = None,
        width: int = 19,
        u_height: int = 42,
        **kwargs
    ) -> Dict:
        """Create a new rack."""
        data = {'name': name, 'site': site, 'status': status, 'width': width, 'u_height': u_height, **kwargs}
        for key, val in [('location', location), ('role', role), ('tenant', tenant), ('rack_type', rack_type)]:
            if val is not None:
                data[key] = val
        return self.client.create(f'{self.base_endpoint}/racks', data)

    async def update_rack(self, id: int, **kwargs) -> Dict:
        """Update a rack."""
        return self.client.patch(f'{self.base_endpoint}/racks', id, kwargs)

    async def delete_rack(self, id: int) -> None:
        """Delete a rack."""
        self.client.delete(f'{self.base_endpoint}/racks', id)

    # ==================== Rack Reservations ====================

    async def list_rack_reservations(
        self,
        rack_id: Optional[int] = None,
        site_id: Optional[int] = None,
        tenant_id: Optional[int] = None,
        **kwargs
    ) -> List[Dict]:
        """List all rack reservations."""
        params = {k: v for k, v in {
            'rack_id': rack_id, 'site_id': site_id, 'tenant_id': tenant_id, **kwargs
        }.items() if v is not None}
        return self.client.list(f'{self.base_endpoint}/rack-reservations', params=params)

    async def get_rack_reservation(self, id: int) -> Dict:
        """Get a specific rack reservation by ID."""
        return self.client.get(f'{self.base_endpoint}/rack-reservations', id)

    async def create_rack_reservation(
        self,
        rack: int,
        units: List[int],
        user: int,
        description: str,
        tenant: Optional[int] = None,
        **kwargs
    ) -> Dict:
        """Create a new rack reservation."""
        data = {'rack': rack, 'units': units, 'user': user, 'description': description, **kwargs}
        if tenant:
            data['tenant'] = tenant
        return self.client.create(f'{self.base_endpoint}/rack-reservations', data)
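    # Illustrative usage sketch (editor's addition): `units` is a list of rack unit
    # numbers to reserve and `user` is the reserving user's ID, so a hypothetical
    # reservation of U10-U12 in rack 5 might be:
    #
    #     await dcim.create_rack_reservation(
    #         rack=5, units=[10, 11, 12], user=1,
    #         description='Reserved for new core switches',
    #     )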

    async def update_rack_reservation(self, id: int, **kwargs) -> Dict:
        """Update a rack reservation."""
        return self.client.patch(f'{self.base_endpoint}/rack-reservations', id, kwargs)

    async def delete_rack_reservation(self, id: int) -> None:
        """Delete a rack reservation."""
        self.client.delete(f'{self.base_endpoint}/rack-reservations', id)

    # ==================== Manufacturers ====================

    async def list_manufacturers(self, name: Optional[str] = None, **kwargs) -> List[Dict]:
        """List all manufacturers."""
        params = {k: v for k, v in {'name': name, **kwargs}.items() if v is not None}
        return self.client.list(f'{self.base_endpoint}/manufacturers', params=params)

    async def get_manufacturer(self, id: int) -> Dict:
        """Get a specific manufacturer by ID."""
        return self.client.get(f'{self.base_endpoint}/manufacturers', id)

    async def create_manufacturer(self, name: str, slug: str, **kwargs) -> Dict:
        """Create a new manufacturer."""
        data = {'name': name, 'slug': slug, **kwargs}
        return self.client.create(f'{self.base_endpoint}/manufacturers', data)

    async def update_manufacturer(self, id: int, **kwargs) -> Dict:
        """Update a manufacturer."""
        return self.client.patch(f'{self.base_endpoint}/manufacturers', id, kwargs)

    async def delete_manufacturer(self, id: int) -> None:
        """Delete a manufacturer."""
        self.client.delete(f'{self.base_endpoint}/manufacturers', id)

    # ==================== Device Types ====================

    async def list_device_types(
        self,
        manufacturer_id: Optional[int] = None,
        model: Optional[str] = None,
        slug: Optional[str] = None,
        **kwargs
    ) -> List[Dict]:
        """List all device types."""
        params = {k: v for k, v in {
            'manufacturer_id': manufacturer_id, 'model': model, 'slug': slug, **kwargs
        }.items() if v is not None}
        return self.client.list(f'{self.base_endpoint}/device-types', params=params)

    async def get_device_type(self, id: int) -> Dict:
        """Get a specific device type by ID."""
        return self.client.get(f'{self.base_endpoint}/device-types', id)

    async def create_device_type(
        self,
        manufacturer: int,
        model: str,
        slug: str,
        u_height: float = 1.0,
        is_full_depth: bool = True,
        **kwargs
    ) -> Dict:
        """Create a new device type."""
        data = {
            'manufacturer': manufacturer, 'model': model, 'slug': slug,
            'u_height': u_height, 'is_full_depth': is_full_depth, **kwargs
        }
        return self.client.create(f'{self.base_endpoint}/device-types', data)

    async def update_device_type(self, id: int, **kwargs) -> Dict:
        """Update a device type."""
        return self.client.patch(f'{self.base_endpoint}/device-types', id, kwargs)

    async def delete_device_type(self, id: int) -> None:
        """Delete a device type."""
        self.client.delete(f'{self.base_endpoint}/device-types', id)

    # ==================== Module Types ====================

    async def list_module_types(self, manufacturer_id: Optional[int] = None, **kwargs) -> List[Dict]:
        """List all module types."""
        params = {k: v for k, v in {'manufacturer_id': manufacturer_id, **kwargs}.items() if v is not None}
        return self.client.list(f'{self.base_endpoint}/module-types', params=params)

    async def get_module_type(self, id: int) -> Dict:
        """Get a specific module type by ID."""
        return self.client.get(f'{self.base_endpoint}/module-types', id)

    async def create_module_type(self, manufacturer: int, model: str, **kwargs) -> Dict:
        """Create a new module type."""
        data = {'manufacturer': manufacturer, 'model': model, **kwargs}
        return self.client.create(f'{self.base_endpoint}/module-types', data)

    async def update_module_type(self, id: int, **kwargs) -> Dict:
        """Update a module type."""
        return self.client.patch(f'{self.base_endpoint}/module-types', id, kwargs)

    async def delete_module_type(self, id: int) -> None:
        """Delete a module type."""
        self.client.delete(f'{self.base_endpoint}/module-types', id)

    # ==================== Device Roles ====================

    async def list_device_roles(self, name: Optional[str] = None, **kwargs) -> List[Dict]:
        """List all device roles."""
        params = {k: v for k, v in {'name': name, **kwargs}.items() if v is not None}
        return self.client.list(f'{self.base_endpoint}/device-roles', params=params)

    async def get_device_role(self, id: int) -> Dict:
        """Get a specific device role by ID."""
        return self.client.get(f'{self.base_endpoint}/device-roles', id)

    async def create_device_role(
        self,
        name: str,
        slug: str,
        color: str = '9e9e9e',
        vm_role: bool = False,
        **kwargs
    ) -> Dict:
        """Create a new device role."""
        data = {'name': name, 'slug': slug, 'color': color, 'vm_role': vm_role, **kwargs}
        return self.client.create(f'{self.base_endpoint}/device-roles', data)

    async def update_device_role(self, id: int, **kwargs) -> Dict:
        """Update a device role."""
        return self.client.patch(f'{self.base_endpoint}/device-roles', id, kwargs)

    async def delete_device_role(self, id: int) -> None:
        """Delete a device role."""
        self.client.delete(f'{self.base_endpoint}/device-roles', id)

    # ==================== Platforms ====================

    async def list_platforms(self, name: Optional[str] = None, manufacturer_id: Optional[int] = None, **kwargs) -> List[Dict]:
        """List all platforms."""
        params = {k: v for k, v in {'name': name, 'manufacturer_id': manufacturer_id, **kwargs}.items() if v is not None}
        return self.client.list(f'{self.base_endpoint}/platforms', params=params)

    async def get_platform(self, id: int) -> Dict:
        """Get a specific platform by ID."""
        return self.client.get(f'{self.base_endpoint}/platforms', id)

    async def create_platform(
        self,
        name: str,
        slug: str,
        manufacturer: Optional[int] = None,
        **kwargs
    ) -> Dict:
        """Create a new platform."""
        data = {'name': name, 'slug': slug, **kwargs}
        if manufacturer:
            data['manufacturer'] = manufacturer
        return self.client.create(f'{self.base_endpoint}/platforms', data)

    async def update_platform(self, id: int, **kwargs) -> Dict:
        """Update a platform."""
        return self.client.patch(f'{self.base_endpoint}/platforms', id, kwargs)

    async def delete_platform(self, id: int) -> None:
        """Delete a platform."""
        self.client.delete(f'{self.base_endpoint}/platforms', id)

    # ==================== Devices ====================

    async def list_devices(
        self,
        name: Optional[str] = None,
        site_id: Optional[int] = None,
        location_id: Optional[int] = None,
        rack_id: Optional[int] = None,
        status: Optional[str] = None,
        role_id: Optional[int] = None,
        device_type_id: Optional[int] = None,
        manufacturer_id: Optional[int] = None,
        platform_id: Optional[int] = None,
        tenant_id: Optional[int] = None,
        serial: Optional[str] = None,
        asset_tag: Optional[str] = None,
        **kwargs
    ) -> List[Dict]:
        """List all devices with optional filtering."""
        params = {k: v for k, v in {
            'name': name, 'site_id': site_id, 'location_id': location_id,
            'rack_id': rack_id, 'status': status, 'role_id': role_id,
            'device_type_id': device_type_id, 'manufacturer_id': manufacturer_id,
            'platform_id': platform_id, 'tenant_id': tenant_id,
            'serial': serial, 'asset_tag': asset_tag, **kwargs
        }.items() if v is not None}
        return self.client.list(f'{self.base_endpoint}/devices', params=params)

    async def get_device(self, id: int) -> Dict:
        """Get a specific device by ID."""
        return self.client.get(f'{self.base_endpoint}/devices', id)

    async def create_device(
        self,
        name: str,
        device_type: int,
        role: int,
        site: int,
        status: str = 'active',
        location: Optional[int] = None,
        rack: Optional[int] = None,
        position: Optional[float] = None,
        face: Optional[str] = None,
        platform: Optional[int] = None,
        tenant: Optional[int] = None,
        serial: Optional[str] = None,
        asset_tag: Optional[str] = None,
        primary_ip4: Optional[int] = None,
        primary_ip6: Optional[int] = None,
        **kwargs
    ) -> Dict:
        """Create a new device."""
        data = {
            'name': name, 'device_type': device_type, 'role': role,
            'site': site, 'status': status, **kwargs
        }
        for key, val in [
            ('location', location), ('rack', rack), ('position', position),
            ('face', face), ('platform', platform), ('tenant', tenant),
            ('serial', serial), ('asset_tag', asset_tag),
            ('primary_ip4', primary_ip4), ('primary_ip6', primary_ip6)
        ]:
            if val is not None:
                data[key] = val
        return self.client.create(f'{self.base_endpoint}/devices', data)
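    # Illustrative usage sketch (editor's addition): a hypothetical rack-mounted
    # device, referencing previously created device type, role, site, and rack IDs:
    #
    #     device = await dcim.create_device(
    #         name='core-sw-01',
    #         device_type=12, role=3, site=1,
    #         rack=5, position=10.0, face='front',
    #         serial='ABC123456',
    #     )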

    async def update_device(self, id: int, **kwargs) -> Dict:
        """Update a device."""
        return self.client.patch(f'{self.base_endpoint}/devices', id, kwargs)

    async def delete_device(self, id: int) -> None:
        """Delete a device."""
        self.client.delete(f'{self.base_endpoint}/devices', id)

    # ==================== Modules ====================

    async def list_modules(self, device_id: Optional[int] = None, **kwargs) -> List[Dict]:
        """List all modules."""
        params = {k: v for k, v in {'device_id': device_id, **kwargs}.items() if v is not None}
        return self.client.list(f'{self.base_endpoint}/modules', params=params)

    async def get_module(self, id: int) -> Dict:
        """Get a specific module by ID."""
        return self.client.get(f'{self.base_endpoint}/modules', id)

    async def create_module(self, device: int, module_bay: int, module_type: int, **kwargs) -> Dict:
        """Create a new module."""
        data = {'device': device, 'module_bay': module_bay, 'module_type': module_type, **kwargs}
        return self.client.create(f'{self.base_endpoint}/modules', data)

    async def update_module(self, id: int, **kwargs) -> Dict:
        """Update a module."""
        return self.client.patch(f'{self.base_endpoint}/modules', id, kwargs)

    async def delete_module(self, id: int) -> None:
        """Delete a module."""
        self.client.delete(f'{self.base_endpoint}/modules', id)

    # ==================== Interfaces ====================

    async def list_interfaces(
        self,
        device_id: Optional[int] = None,
        name: Optional[str] = None,
        type: Optional[str] = None,
        enabled: Optional[bool] = None,
        **kwargs
    ) -> List[Dict]:
        """List all interfaces."""
        params = {k: v for k, v in {
            'device_id': device_id, 'name': name, 'type': type, 'enabled': enabled, **kwargs
        }.items() if v is not None}
        return self.client.list(f'{self.base_endpoint}/interfaces', params=params)

    async def get_interface(self, id: int) -> Dict:
        """Get a specific interface by ID."""
        return self.client.get(f'{self.base_endpoint}/interfaces', id)

    async def create_interface(
        self,
        device: int,
        name: str,
        type: str,
        enabled: bool = True,
        mtu: Optional[int] = None,
        mac_address: Optional[str] = None,
        description: Optional[str] = None,
        mode: Optional[str] = None,
        untagged_vlan: Optional[int] = None,
        tagged_vlans: Optional[List[int]] = None,
        **kwargs
    ) -> Dict:
        """Create a new interface."""
        data = {'device': device, 'name': name, 'type': type, 'enabled': enabled, **kwargs}
        for key, val in [
            ('mtu', mtu), ('mac_address', mac_address), ('description', description),
            ('mode', mode), ('untagged_vlan', untagged_vlan), ('tagged_vlans', tagged_vlans)
        ]:
            if val is not None:
                data[key] = val
        return self.client.create(f'{self.base_endpoint}/interfaces', data)
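    # Illustrative usage sketch (editor's addition): a hypothetical tagged (trunk)
    # interface, where `untagged_vlan` and `tagged_vlans` reference VLAN object IDs:
    #
    #     await dcim.create_interface(
    #         device=42, name='GigabitEthernet0/1', type='1000base-t',
    #         mode='tagged', untagged_vlan=10, tagged_vlans=[20, 30],
    #         mtu=9000,
    #     )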

    async def update_interface(self, id: int, **kwargs) -> Dict:
        """Update an interface."""
        return self.client.patch(f'{self.base_endpoint}/interfaces', id, kwargs)

    async def delete_interface(self, id: int) -> None:
        """Delete an interface."""
        self.client.delete(f'{self.base_endpoint}/interfaces', id)

    # ==================== Console Ports ====================

    async def list_console_ports(self, device_id: Optional[int] = None, **kwargs) -> List[Dict]:
        """List all console ports."""
        params = {k: v for k, v in {'device_id': device_id, **kwargs}.items() if v is not None}
        return self.client.list(f'{self.base_endpoint}/console-ports', params=params)

    async def get_console_port(self, id: int) -> Dict:
        """Get a specific console port by ID."""
        return self.client.get(f'{self.base_endpoint}/console-ports', id)

    async def create_console_port(self, device: int, name: str, **kwargs) -> Dict:
        """Create a new console port."""
        data = {'device': device, 'name': name, **kwargs}
        return self.client.create(f'{self.base_endpoint}/console-ports', data)

    async def update_console_port(self, id: int, **kwargs) -> Dict:
        """Update a console port."""
        return self.client.patch(f'{self.base_endpoint}/console-ports', id, kwargs)

    async def delete_console_port(self, id: int) -> None:
        """Delete a console port."""
        self.client.delete(f'{self.base_endpoint}/console-ports', id)

    # ==================== Console Server Ports ====================

    async def list_console_server_ports(self, device_id: Optional[int] = None, **kwargs) -> List[Dict]:
        """List all console server ports."""
        params = {k: v for k, v in {'device_id': device_id, **kwargs}.items() if v is not None}
        return self.client.list(f'{self.base_endpoint}/console-server-ports', params=params)

    async def get_console_server_port(self, id: int) -> Dict:
        """Get a specific console server port by ID."""
        return self.client.get(f'{self.base_endpoint}/console-server-ports', id)

    async def create_console_server_port(self, device: int, name: str, **kwargs) -> Dict:
        """Create a new console server port."""
        data = {'device': device, 'name': name, **kwargs}
        return self.client.create(f'{self.base_endpoint}/console-server-ports', data)

    async def update_console_server_port(self, id: int, **kwargs) -> Dict:
        """Update a console server port."""
        return self.client.patch(f'{self.base_endpoint}/console-server-ports', id, kwargs)

    async def delete_console_server_port(self, id: int) -> None:
        """Delete a console server port."""
        self.client.delete(f'{self.base_endpoint}/console-server-ports', id)

    # ==================== Power Ports ====================

    async def list_power_ports(self, device_id: Optional[int] = None, **kwargs) -> List[Dict]:
        """List all power ports."""
        params = {k: v for k, v in {'device_id': device_id, **kwargs}.items() if v is not None}
        return self.client.list(f'{self.base_endpoint}/power-ports', params=params)

    async def get_power_port(self, id: int) -> Dict:
        """Get a specific power port by ID."""
        return self.client.get(f'{self.base_endpoint}/power-ports', id)

    async def create_power_port(self, device: int, name: str, **kwargs) -> Dict:
        """Create a new power port."""
        data = {'device': device, 'name': name, **kwargs}
        return self.client.create(f'{self.base_endpoint}/power-ports', data)

    async def update_power_port(self, id: int, **kwargs) -> Dict:
        """Update a power port."""
        return self.client.patch(f'{self.base_endpoint}/power-ports', id, kwargs)

    async def delete_power_port(self, id: int) -> None:
        """Delete a power port."""
        self.client.delete(f'{self.base_endpoint}/power-ports', id)

    # ==================== Power Outlets ====================

    async def list_power_outlets(self, device_id: Optional[int] = None, **kwargs) -> List[Dict]:
        """List all power outlets."""
        params = {k: v for k, v in {'device_id': device_id, **kwargs}.items() if v is not None}
        return self.client.list(f'{self.base_endpoint}/power-outlets', params=params)

    async def get_power_outlet(self, id: int) -> Dict:
        """Get a specific power outlet by ID."""
        return self.client.get(f'{self.base_endpoint}/power-outlets', id)

    async def create_power_outlet(self, device: int, name: str, **kwargs) -> Dict:
        """Create a new power outlet."""
        data = {'device': device, 'name': name, **kwargs}
        return self.client.create(f'{self.base_endpoint}/power-outlets', data)

    async def update_power_outlet(self, id: int, **kwargs) -> Dict:
        """Update a power outlet."""
        return self.client.patch(f'{self.base_endpoint}/power-outlets', id, kwargs)

    async def delete_power_outlet(self, id: int) -> None:
        """Delete a power outlet."""
        self.client.delete(f'{self.base_endpoint}/power-outlets', id)

    # ==================== Power Panels ====================

    async def list_power_panels(self, site_id: Optional[int] = None, **kwargs) -> List[Dict]:
        """List all power panels."""
        params = {k: v for k, v in {'site_id': site_id, **kwargs}.items() if v is not None}
        return self.client.list(f'{self.base_endpoint}/power-panels', params=params)

    async def get_power_panel(self, id: int) -> Dict:
        """Get a specific power panel by ID."""
        return self.client.get(f'{self.base_endpoint}/power-panels', id)

    async def create_power_panel(self, site: int, name: str, location: Optional[int] = None, **kwargs) -> Dict:
        """Create a new power panel."""
        data = {'site': site, 'name': name, **kwargs}
        if location:
            data['location'] = location
        return self.client.create(f'{self.base_endpoint}/power-panels', data)

    async def update_power_panel(self, id: int, **kwargs) -> Dict:
        """Update a power panel."""
        return self.client.patch(f'{self.base_endpoint}/power-panels', id, kwargs)

    async def delete_power_panel(self, id: int) -> None:
        """Delete a power panel."""
        self.client.delete(f'{self.base_endpoint}/power-panels', id)

    # ==================== Power Feeds ====================

    async def list_power_feeds(self, power_panel_id: Optional[int] = None, **kwargs) -> List[Dict]:
        """List all power feeds."""
        params = {k: v for k, v in {'power_panel_id': power_panel_id, **kwargs}.items() if v is not None}
        return self.client.list(f'{self.base_endpoint}/power-feeds', params=params)

    async def get_power_feed(self, id: int) -> Dict:
        """Get a specific power feed by ID."""
        return self.client.get(f'{self.base_endpoint}/power-feeds', id)

    async def create_power_feed(
        self,
        power_panel: int,
        name: str,
        status: str = 'active',
        type: str = 'primary',
        supply: str = 'ac',
        phase: str = 'single-phase',
        voltage: int = 120,
        amperage: int = 20,
        **kwargs
    ) -> Dict:
        """Create a new power feed."""
        data = {
            'power_panel': power_panel, 'name': name, 'status': status,
            'type': type, 'supply': supply, 'phase': phase,
            'voltage': voltage, 'amperage': amperage, **kwargs
        }
        return self.client.create(f'{self.base_endpoint}/power-feeds', data)

    async def update_power_feed(self, id: int, **kwargs) -> Dict:
        """Update a power feed."""
        return self.client.patch(f'{self.base_endpoint}/power-feeds', id, kwargs)

    async def delete_power_feed(self, id: int) -> None:
        """Delete a power feed."""
        self.client.delete(f'{self.base_endpoint}/power-feeds', id)

    # ==================== Cables ====================

    async def list_cables(
        self,
        site_id: Optional[int] = None,
        device_id: Optional[int] = None,
        rack_id: Optional[int] = None,
        type: Optional[str] = None,
        status: Optional[str] = None,
        **kwargs
    ) -> List[Dict]:
        """List all cables."""
        params = {k: v for k, v in {
            'site_id': site_id, 'device_id': device_id, 'rack_id': rack_id,
            'type': type, 'status': status, **kwargs
        }.items() if v is not None}
        return self.client.list(f'{self.base_endpoint}/cables', params=params)

    async def get_cable(self, id: int) -> Dict:
        """Get a specific cable by ID."""
        return self.client.get(f'{self.base_endpoint}/cables', id)

    async def create_cable(
        self,
        a_terminations: List[Dict],
        b_terminations: List[Dict],
        type: Optional[str] = None,
        status: str = 'connected',
        label: Optional[str] = None,
        color: Optional[str] = None,
        length: Optional[float] = None,
        length_unit: Optional[str] = None,
        **kwargs
    ) -> Dict:
        """
        Create a new cable.

        a_terminations and b_terminations are lists of dicts with:
        - object_type: e.g., 'dcim.interface'
        - object_id: ID of the object
        """
        data = {
            'a_terminations': a_terminations,
            'b_terminations': b_terminations,
            'status': status,
            **kwargs
        }
        for key, val in [
            ('type', type), ('label', label), ('color', color),
            ('length', length), ('length_unit', length_unit)
        ]:
            if val is not None:
                data[key] = val
        return self.client.create(f'{self.base_endpoint}/cables', data)
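    # Illustrative usage sketch (editor's addition): following the docstring above,
    # each termination is a dict naming the connected object; the interface IDs here
    # are hypothetical:
    #
    #     await dcim.create_cable(
    #         a_terminations=[{'object_type': 'dcim.interface', 'object_id': 101}],
    #         b_terminations=[{'object_type': 'dcim.interface', 'object_id': 202}],
    #         type='cat6', status='connected', label='patch-101-202',
    #     )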

    async def update_cable(self, id: int, **kwargs) -> Dict:
        """Update a cable."""
        return self.client.patch(f'{self.base_endpoint}/cables', id, kwargs)

    async def delete_cable(self, id: int) -> None:
        """Delete a cable."""
        self.client.delete(f'{self.base_endpoint}/cables', id)

    # ==================== Virtual Chassis ====================

    async def list_virtual_chassis(self, **kwargs) -> List[Dict]:
        """List all virtual chassis."""
        return self.client.list(f'{self.base_endpoint}/virtual-chassis', params=kwargs)

    async def get_virtual_chassis(self, id: int) -> Dict:
        """Get a specific virtual chassis by ID."""
        return self.client.get(f'{self.base_endpoint}/virtual-chassis', id)

    async def create_virtual_chassis(self, name: str, domain: Optional[str] = None, **kwargs) -> Dict:
        """Create a new virtual chassis."""
        data = {'name': name, **kwargs}
        if domain:
            data['domain'] = domain
        return self.client.create(f'{self.base_endpoint}/virtual-chassis', data)

    async def update_virtual_chassis(self, id: int, **kwargs) -> Dict:
        """Update a virtual chassis."""
        return self.client.patch(f'{self.base_endpoint}/virtual-chassis', id, kwargs)

    async def delete_virtual_chassis(self, id: int) -> None:
        """Delete a virtual chassis."""
        self.client.delete(f'{self.base_endpoint}/virtual-chassis', id)

    # ==================== Inventory Items ====================

    async def list_inventory_items(self, device_id: Optional[int] = None, **kwargs) -> List[Dict]:
        """List all inventory items."""
        params = {k: v for k, v in {'device_id': device_id, **kwargs}.items() if v is not None}
        return self.client.list(f'{self.base_endpoint}/inventory-items', params=params)

    async def get_inventory_item(self, id: int) -> Dict:
        """Get a specific inventory item by ID."""
        return self.client.get(f'{self.base_endpoint}/inventory-items', id)

    async def create_inventory_item(
        self,
        device: int,
        name: str,
        parent: Optional[int] = None,
        manufacturer: Optional[int] = None,
        part_id: Optional[str] = None,
        serial: Optional[str] = None,
        asset_tag: Optional[str] = None,
        **kwargs
    ) -> Dict:
        """Create a new inventory item."""
        data = {'device': device, 'name': name, **kwargs}
        for key, val in [
            ('parent', parent), ('manufacturer', manufacturer),
            ('part_id', part_id), ('serial', serial), ('asset_tag', asset_tag)
        ]:
            if val is not None:
                data[key] = val
        return self.client.create(f'{self.base_endpoint}/inventory-items', data)

    async def update_inventory_item(self, id: int, **kwargs) -> Dict:
        """Update an inventory item."""
        return self.client.patch(f'{self.base_endpoint}/inventory-items', id, kwargs)

    async def delete_inventory_item(self, id: int) -> None:
        """Delete an inventory item."""
        self.client.delete(f'{self.base_endpoint}/inventory-items', id)
@@ -1,560 +0,0 @@
"""
Extras tools for NetBox MCP Server.

Covers: Tags, Custom Fields, Custom Links, Webhooks, Journal Entries, and more.
"""
import logging
from typing import List, Dict, Optional, Any
from ..netbox_client import NetBoxClient

logger = logging.getLogger(__name__)


class ExtrasTools:
    """Tools for Extras operations in NetBox"""

    def __init__(self, client: NetBoxClient):
        self.client = client
        self.base_endpoint = 'extras'

    # ==================== Tags ====================

    async def list_tags(
        self,
        name: Optional[str] = None,
        slug: Optional[str] = None,
        color: Optional[str] = None,
        **kwargs
    ) -> List[Dict]:
        """List all tags with optional filtering."""
        params = {k: v for k, v in {
            'name': name, 'slug': slug, 'color': color, **kwargs
        }.items() if v is not None}
        return self.client.list(f'{self.base_endpoint}/tags', params=params)

    async def get_tag(self, id: int) -> Dict:
        """Get a specific tag by ID."""
        return self.client.get(f'{self.base_endpoint}/tags', id)

    async def create_tag(
        self,
        name: str,
        slug: str,
        color: str = '9e9e9e',
        description: Optional[str] = None,
        **kwargs
    ) -> Dict:
        """Create a new tag."""
        data = {'name': name, 'slug': slug, 'color': color, **kwargs}
        if description:
            data['description'] = description
        return self.client.create(f'{self.base_endpoint}/tags', data)

    async def update_tag(self, id: int, **kwargs) -> Dict:
        """Update a tag."""
        return self.client.patch(f'{self.base_endpoint}/tags', id, kwargs)

    async def delete_tag(self, id: int) -> None:
        """Delete a tag."""
        self.client.delete(f'{self.base_endpoint}/tags', id)

    # ==================== Custom Fields ====================

    async def list_custom_fields(
        self,
        name: Optional[str] = None,
        type: Optional[str] = None,
        content_types: Optional[str] = None,
        **kwargs
    ) -> List[Dict]:
        """List all custom fields."""
        params = {k: v for k, v in {
            'name': name, 'type': type, 'content_types': content_types, **kwargs
        }.items() if v is not None}
        return self.client.list(f'{self.base_endpoint}/custom-fields', params=params)

    async def get_custom_field(self, id: int) -> Dict:
        """Get a specific custom field by ID."""
        return self.client.get(f'{self.base_endpoint}/custom-fields', id)

    async def create_custom_field(
        self,
        name: str,
        content_types: List[str],
        type: str = 'text',
        label: Optional[str] = None,
        description: Optional[str] = None,
        required: bool = False,
        filter_logic: str = 'loose',
        default: Optional[Any] = None,
        weight: int = 100,
        validation_minimum: Optional[int] = None,
        validation_maximum: Optional[int] = None,
        validation_regex: Optional[str] = None,
        choice_set: Optional[int] = None,
        **kwargs
    ) -> Dict:
        """Create a new custom field."""
        data = {
            'name': name, 'content_types': content_types, 'type': type,
            'required': required, 'filter_logic': filter_logic, 'weight': weight, **kwargs
        }
        for key, val in [
            ('label', label), ('description', description), ('default', default),
            ('validation_minimum', validation_minimum), ('validation_maximum', validation_maximum),
            ('validation_regex', validation_regex), ('choice_set', choice_set)
        ]:
            if val is not None:
                data[key] = val
        return self.client.create(f'{self.base_endpoint}/custom-fields', data)
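    # Illustrative usage sketch (editor's addition): a hypothetical selection-type
    # custom field attached to devices, referencing a previously created choice set:
    #
    #     extras = ExtrasTools(client)
    #     await extras.create_custom_field(
    #         name='maintenance_window',
    #         content_types=['dcim.device'],
    #         type='select',
    #         label='Maintenance Window',
    #         choice_set=4,
    #         required=False,
    #     )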

    async def update_custom_field(self, id: int, **kwargs) -> Dict:
        """Update a custom field."""
        return self.client.patch(f'{self.base_endpoint}/custom-fields', id, kwargs)

    async def delete_custom_field(self, id: int) -> None:
        """Delete a custom field."""
        self.client.delete(f'{self.base_endpoint}/custom-fields', id)

    # ==================== Custom Field Choice Sets ====================

    async def list_custom_field_choice_sets(self, name: Optional[str] = None, **kwargs) -> List[Dict]:
        """List all custom field choice sets."""
        params = {k: v for k, v in {'name': name, **kwargs}.items() if v is not None}
        return self.client.list(f'{self.base_endpoint}/custom-field-choice-sets', params=params)

    async def get_custom_field_choice_set(self, id: int) -> Dict:
        """Get a specific custom field choice set by ID."""
        return self.client.get(f'{self.base_endpoint}/custom-field-choice-sets', id)

    async def create_custom_field_choice_set(
        self,
        name: str,
        extra_choices: List[List[str]],
        description: Optional[str] = None,
        **kwargs
    ) -> Dict:
        """Create a new custom field choice set."""
        data = {'name': name, 'extra_choices': extra_choices, **kwargs}
        if description:
            data['description'] = description
        return self.client.create(f'{self.base_endpoint}/custom-field-choice-sets', data)

    async def update_custom_field_choice_set(self, id: int, **kwargs) -> Dict:
        """Update a custom field choice set."""
        return self.client.patch(f'{self.base_endpoint}/custom-field-choice-sets', id, kwargs)

    async def delete_custom_field_choice_set(self, id: int) -> None:
        """Delete a custom field choice set."""
        self.client.delete(f'{self.base_endpoint}/custom-field-choice-sets', id)

    # ==================== Custom Links ====================

    async def list_custom_links(
        self,
        name: Optional[str] = None,
        content_types: Optional[str] = None,
        **kwargs
    ) -> List[Dict]:
        """List all custom links."""
        params = {k: v for k, v in {
            'name': name, 'content_types': content_types, **kwargs
        }.items() if v is not None}
        return self.client.list(f'{self.base_endpoint}/custom-links', params=params)

    async def get_custom_link(self, id: int) -> Dict:
        """Get a specific custom link by ID."""
        return self.client.get(f'{self.base_endpoint}/custom-links', id)

    async def create_custom_link(
        self,
        name: str,
        content_types: List[str],
        link_text: str,
        link_url: str,
        enabled: bool = True,
        new_window: bool = False,
        weight: int = 100,
        group_name: Optional[str] = None,
        button_class: str = 'outline-dark',
        **kwargs
    ) -> Dict:
        """Create a new custom link."""
        data = {
            'name': name, 'content_types': content_types,
            'link_text': link_text, 'link_url': link_url,
            'enabled': enabled, 'new_window': new_window,
            'weight': weight, 'button_class': button_class, **kwargs
        }
        if group_name:
            data['group_name'] = group_name
        return self.client.create(f'{self.base_endpoint}/custom-links', data)

    async def update_custom_link(self, id: int, **kwargs) -> Dict:
        """Update a custom link."""
        return self.client.patch(f'{self.base_endpoint}/custom-links', id, kwargs)

    async def delete_custom_link(self, id: int) -> None:
        """Delete a custom link."""
        self.client.delete(f'{self.base_endpoint}/custom-links', id)

    # ==================== Webhooks ====================

    async def list_webhooks(self, name: Optional[str] = None, **kwargs) -> List[Dict]:
        """List all webhooks."""
        params = {k: v for k, v in {'name': name, **kwargs}.items() if v is not None}
        return self.client.list(f'{self.base_endpoint}/webhooks', params=params)

    async def get_webhook(self, id: int) -> Dict:
        """Get a specific webhook by ID."""
        return self.client.get(f'{self.base_endpoint}/webhooks', id)

    async def create_webhook(
        self,
        name: str,
        payload_url: str,
        content_types: List[str],
        type_create: bool = True,
        type_update: bool = True,
        type_delete: bool = True,
        type_job_start: bool = False,
        type_job_end: bool = False,
        enabled: bool = True,
        http_method: str = 'POST',
        http_content_type: str = 'application/json',
        additional_headers: Optional[str] = None,
        body_template: Optional[str] = None,
        secret: Optional[str] = None,
        ssl_verification: bool = True,
        ca_file_path: Optional[str] = None,
        **kwargs
    ) -> Dict:
        """Create a new webhook."""
        data = {
            'name': name, 'payload_url': payload_url, 'content_types': content_types,
            'type_create': type_create, 'type_update': type_update, 'type_delete': type_delete,
            'type_job_start': type_job_start, 'type_job_end': type_job_end,
            'enabled': enabled, 'http_method': http_method,
            'http_content_type': http_content_type, 'ssl_verification': ssl_verification, **kwargs
        }
        for key, val in [
            ('additional_headers', additional_headers), ('body_template', body_template),
            ('secret', secret), ('ca_file_path', ca_file_path)
        ]:
            if val is not None:
                data[key] = val
        return self.client.create(f'{self.base_endpoint}/webhooks', data)
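    # Illustrative usage sketch (editor's addition): a hypothetical webhook firing on
    # device create/update events and posting to an internal endpoint:
    #
    #     await extras.create_webhook(
    #         name='device-changes',
    #         payload_url='https://automation.example.internal/hooks/netbox',
    #         content_types=['dcim.device'],
    #         type_delete=False,
    #         secret='shared-hmac-secret',
    #     )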
|
||||
|
||||
async def update_webhook(self, id: int, **kwargs) -> Dict:
|
||||
"""Update a webhook."""
|
||||
return self.client.patch(f'{self.base_endpoint}/webhooks', id, kwargs)
|
||||
|
||||
async def delete_webhook(self, id: int) -> None:
|
||||
"""Delete a webhook."""
|
||||
self.client.delete(f'{self.base_endpoint}/webhooks', id)
|
||||
|
||||
# ==================== Journal Entries ====================
|
||||
|
||||
async def list_journal_entries(
|
||||
self,
|
||||
assigned_object_type: Optional[str] = None,
|
||||
assigned_object_id: Optional[int] = None,
|
||||
kind: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all journal entries."""
|
||||
params = {k: v for k, v in {
|
||||
'assigned_object_type': assigned_object_type,
|
||||
'assigned_object_id': assigned_object_id, 'kind': kind, **kwargs
|
||||
}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/journal-entries', params=params)
|
||||
|
||||
async def get_journal_entry(self, id: int) -> Dict:
|
||||
"""Get a specific journal entry by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/journal-entries', id)
|
||||
|
||||
async def create_journal_entry(
|
||||
self,
|
||||
assigned_object_type: str,
|
||||
assigned_object_id: int,
|
||||
comments: str,
|
||||
kind: str = 'info',
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new journal entry."""
|
||||
data = {
|
||||
'assigned_object_type': assigned_object_type,
|
||||
'assigned_object_id': assigned_object_id,
|
||||
'comments': comments, 'kind': kind, **kwargs
|
||||
}
|
||||
return self.client.create(f'{self.base_endpoint}/journal-entries', data)
|
||||
|
||||
async def update_journal_entry(self, id: int, **kwargs) -> Dict:
|
||||
"""Update a journal entry."""
|
||||
return self.client.patch(f'{self.base_endpoint}/journal-entries', id, kwargs)
|
||||
|
||||
async def delete_journal_entry(self, id: int) -> None:
|
||||
"""Delete a journal entry."""
|
||||
self.client.delete(f'{self.base_endpoint}/journal-entries', id)
|
||||
|
||||
# ==================== Config Contexts ====================
|
||||
|
||||
async def list_config_contexts(self, name: Optional[str] = None, **kwargs) -> List[Dict]:
|
||||
"""List all config contexts."""
|
||||
params = {k: v for k, v in {'name': name, **kwargs}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/config-contexts', params=params)
|
||||
|
||||
async def get_config_context(self, id: int) -> Dict:
|
||||
"""Get a specific config context by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/config-contexts', id)
|
||||
|
||||
async def create_config_context(
|
||||
self,
|
||||
name: str,
|
||||
data: Dict[str, Any],
|
||||
weight: int = 1000,
|
||||
description: Optional[str] = None,
|
||||
is_active: bool = True,
|
||||
regions: Optional[List[int]] = None,
|
||||
site_groups: Optional[List[int]] = None,
|
||||
sites: Optional[List[int]] = None,
|
||||
locations: Optional[List[int]] = None,
|
||||
device_types: Optional[List[int]] = None,
|
||||
roles: Optional[List[int]] = None,
|
||||
platforms: Optional[List[int]] = None,
|
||||
cluster_types: Optional[List[int]] = None,
|
||||
cluster_groups: Optional[List[int]] = None,
|
||||
clusters: Optional[List[int]] = None,
|
||||
tenant_groups: Optional[List[int]] = None,
|
||||
tenants: Optional[List[int]] = None,
|
||||
tags: Optional[List[str]] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new config context."""
|
||||
context_data = {
|
||||
'name': name, 'data': data, 'weight': weight, 'is_active': is_active, **kwargs
|
||||
}
|
||||
for key, val in [
|
||||
('description', description), ('regions', regions),
|
||||
('site_groups', site_groups), ('sites', sites),
|
||||
('locations', locations), ('device_types', device_types),
|
||||
('roles', roles), ('platforms', platforms),
|
||||
('cluster_types', cluster_types), ('cluster_groups', cluster_groups),
|
||||
('clusters', clusters), ('tenant_groups', tenant_groups),
|
||||
('tenants', tenants), ('tags', tags)
|
||||
]:
|
||||
if val is not None:
|
||||
context_data[key] = val
|
||||
return self.client.create(f'{self.base_endpoint}/config-contexts', context_data)
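
# Usage sketch (illustrative only): create a config context scoped to specific sites
# and device roles. The NTP payload, weight, and IDs are placeholders; in NetBox a
# higher weight generally takes precedence when several contexts apply to one object.
async def add_ntp_context(extras, site_ids: List[int], role_ids: List[int]) -> Dict:
    return await extras.create_config_context(
        name='ntp-servers',
        data={'ntp_servers': ['10.0.0.1', '10.0.0.2']},  # placeholder payload
        weight=2000,
        description='Region-specific NTP servers',
        sites=site_ids,
        roles=role_ids,
    )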
|
||||
|
||||
async def update_config_context(self, id: int, **kwargs) -> Dict:
|
||||
"""Update a config context."""
|
||||
return self.client.patch(f'{self.base_endpoint}/config-contexts', id, kwargs)
|
||||
|
||||
async def delete_config_context(self, id: int) -> None:
|
||||
"""Delete a config context."""
|
||||
self.client.delete(f'{self.base_endpoint}/config-contexts', id)
|
||||
|
||||
# ==================== Config Templates ====================
|
||||
|
||||
async def list_config_templates(self, name: Optional[str] = None, **kwargs) -> List[Dict]:
|
||||
"""List all config templates."""
|
||||
params = {k: v for k, v in {'name': name, **kwargs}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/config-templates', params=params)
|
||||
|
||||
async def get_config_template(self, id: int) -> Dict:
|
||||
"""Get a specific config template by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/config-templates', id)
|
||||
|
||||
async def create_config_template(
|
||||
self,
|
||||
name: str,
|
||||
template_code: str,
|
||||
description: Optional[str] = None,
|
||||
environment_params: Optional[Dict[str, Any]] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new config template."""
|
||||
data = {'name': name, 'template_code': template_code, **kwargs}
|
||||
if description:
|
||||
data['description'] = description
|
||||
if environment_params:
|
||||
data['environment_params'] = environment_params
|
||||
return self.client.create(f'{self.base_endpoint}/config-templates', data)
|
||||
|
||||
async def update_config_template(self, id: int, **kwargs) -> Dict:
|
||||
"""Update a config template."""
|
||||
return self.client.patch(f'{self.base_endpoint}/config-templates', id, kwargs)
|
||||
|
||||
async def delete_config_template(self, id: int) -> None:
|
||||
"""Delete a config template."""
|
||||
self.client.delete(f'{self.base_endpoint}/config-templates', id)
|
||||
|
||||
# ==================== Export Templates ====================
|
||||
|
||||
async def list_export_templates(
|
||||
self,
|
||||
name: Optional[str] = None,
|
||||
content_types: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all export templates."""
|
||||
params = {k: v for k, v in {
|
||||
'name': name, 'content_types': content_types, **kwargs
|
||||
}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/export-templates', params=params)
|
||||
|
||||
async def get_export_template(self, id: int) -> Dict:
|
||||
"""Get a specific export template by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/export-templates', id)
|
||||
|
||||
async def create_export_template(
|
||||
self,
|
||||
name: str,
|
||||
content_types: List[str],
|
||||
template_code: str,
|
||||
description: Optional[str] = None,
|
||||
mime_type: str = 'text/plain',
|
||||
file_extension: Optional[str] = None,
|
||||
as_attachment: bool = True,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new export template."""
|
||||
data = {
|
||||
'name': name, 'content_types': content_types,
|
||||
'template_code': template_code, 'mime_type': mime_type,
|
||||
'as_attachment': as_attachment, **kwargs
|
||||
}
|
||||
if description:
|
||||
data['description'] = description
|
||||
if file_extension:
|
||||
data['file_extension'] = file_extension
|
||||
return self.client.create(f'{self.base_endpoint}/export-templates', data)
|
||||
|
||||
async def update_export_template(self, id: int, **kwargs) -> Dict:
|
||||
"""Update an export template."""
|
||||
return self.client.patch(f'{self.base_endpoint}/export-templates', id, kwargs)
|
||||
|
||||
async def delete_export_template(self, id: int) -> None:
|
||||
"""Delete an export template."""
|
||||
self.client.delete(f'{self.base_endpoint}/export-templates', id)
|
||||
|
||||
# ==================== Saved Filters ====================
|
||||
|
||||
async def list_saved_filters(
|
||||
self,
|
||||
name: Optional[str] = None,
|
||||
content_types: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all saved filters."""
|
||||
params = {k: v for k, v in {
|
||||
'name': name, 'content_types': content_types, **kwargs
|
||||
}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/saved-filters', params=params)
|
||||
|
||||
async def get_saved_filter(self, id: int) -> Dict:
|
||||
"""Get a specific saved filter by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/saved-filters', id)
|
||||
|
||||
async def create_saved_filter(
|
||||
self,
|
||||
name: str,
|
||||
slug: str,
|
||||
content_types: List[str],
|
||||
parameters: Dict[str, Any],
|
||||
description: Optional[str] = None,
|
||||
weight: int = 100,
|
||||
enabled: bool = True,
|
||||
shared: bool = True,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new saved filter."""
|
||||
data = {
|
||||
'name': name, 'slug': slug, 'content_types': content_types,
|
||||
'parameters': parameters, 'weight': weight,
|
||||
'enabled': enabled, 'shared': shared, **kwargs
|
||||
}
|
||||
if description:
|
||||
data['description'] = description
|
||||
return self.client.create(f'{self.base_endpoint}/saved-filters', data)
|
||||
|
||||
async def update_saved_filter(self, id: int, **kwargs) -> Dict:
|
||||
"""Update a saved filter."""
|
||||
return self.client.patch(f'{self.base_endpoint}/saved-filters', id, kwargs)
|
||||
|
||||
async def delete_saved_filter(self, id: int) -> None:
|
||||
"""Delete a saved filter."""
|
||||
self.client.delete(f'{self.base_endpoint}/saved-filters', id)
|
||||
|
||||
# ==================== Image Attachments ====================
|
||||
|
||||
async def list_image_attachments(
|
||||
self,
|
||||
object_type: Optional[str] = None,
|
||||
object_id: Optional[int] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all image attachments."""
|
||||
params = {k: v for k, v in {
|
||||
'object_type': object_type, 'object_id': object_id, **kwargs
|
||||
}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/image-attachments', params=params)
|
||||
|
||||
async def get_image_attachment(self, id: int) -> Dict:
|
||||
"""Get a specific image attachment by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/image-attachments', id)
|
||||
|
||||
async def delete_image_attachment(self, id: int) -> None:
|
||||
"""Delete an image attachment."""
|
||||
self.client.delete(f'{self.base_endpoint}/image-attachments', id)
|
||||
|
||||
# ==================== Object Changes (Audit Log) ====================
|
||||
|
||||
async def list_object_changes(
|
||||
self,
|
||||
user_id: Optional[int] = None,
|
||||
changed_object_type: Optional[str] = None,
|
||||
changed_object_id: Optional[int] = None,
|
||||
action: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all object changes (audit log)."""
|
||||
params = {k: v for k, v in {
|
||||
'user_id': user_id, 'changed_object_type': changed_object_type,
|
||||
'changed_object_id': changed_object_id, 'action': action, **kwargs
|
||||
}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/object-changes', params=params)
|
||||
|
||||
async def get_object_change(self, id: int) -> Dict:
|
||||
"""Get a specific object change by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/object-changes', id)
|
||||
|
||||
# ==================== Scripts ====================

async def list_scripts(self, **kwargs) -> List[Dict]:
"""List all available scripts."""
return self.client.list(f'{self.base_endpoint}/scripts', params=kwargs)

async def get_script(self, id: str) -> Dict:
"""Get a specific script by ID."""
return self.client.get(f'{self.base_endpoint}/scripts', id)

async def run_script(self, id: str, data: Dict[str, Any], commit: bool = True) -> Dict:
"""Run a script with the provided data."""
payload = {'data': data, 'commit': commit}
return self.client.create(f'{self.base_endpoint}/scripts/{id}', payload)

# ==================== Reports ====================

async def list_reports(self, **kwargs) -> List[Dict]:
"""List all available reports."""
return self.client.list(f'{self.base_endpoint}/reports', params=kwargs)

async def get_report(self, id: str) -> Dict:
"""Get a specific report by ID."""
return self.client.get(f'{self.base_endpoint}/reports', id)

async def run_report(self, id: str) -> Dict:
"""Run a report."""
return self.client.create(f'{self.base_endpoint}/reports/{id}', {})
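
# Usage sketch (illustrative only): run a custom script as a dry run first, then commit.
# `extras` stands for an instance of the extras tool class defined earlier in this file;
# the 'module.ClassName' script identifier format is an assumption based on how NetBox
# names custom scripts, not something defined here.
async def run_script_safely(extras, script_id: str, inputs: Dict[str, Any]) -> Dict:
    preview = await extras.run_script(script_id, data=inputs, commit=False)
    # Inspect `preview` (the job/log output) before committing the change for real.
    return await extras.run_script(script_id, data=inputs, commit=True)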
|
||||
@@ -1,718 +0,0 @@
"""
IPAM (IP Address Management) tools for NetBox MCP Server.

Covers: IP Addresses, Prefixes, VLANs, VRFs, ASNs, and related models.
"""
import logging
from typing import List, Dict, Optional, Any
from ..netbox_client import NetBoxClient

logger = logging.getLogger(__name__)


class IPAMTools:
"""Tools for IPAM operations in NetBox"""

def __init__(self, client: NetBoxClient):
self.client = client
self.base_endpoint = 'ipam'
|
||||
|
||||
# ==================== ASN Ranges ====================
|
||||
|
||||
async def list_asn_ranges(self, name: Optional[str] = None, **kwargs) -> List[Dict]:
|
||||
"""List all ASN ranges."""
|
||||
params = {k: v for k, v in {'name': name, **kwargs}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/asn-ranges', params=params)
|
||||
|
||||
async def get_asn_range(self, id: int) -> Dict:
|
||||
"""Get a specific ASN range by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/asn-ranges', id)
|
||||
|
||||
async def create_asn_range(self, name: str, slug: str, rir: int, start: int, end: int, **kwargs) -> Dict:
|
||||
"""Create a new ASN range."""
|
||||
data = {'name': name, 'slug': slug, 'rir': rir, 'start': start, 'end': end, **kwargs}
|
||||
return self.client.create(f'{self.base_endpoint}/asn-ranges', data)
|
||||
|
||||
async def update_asn_range(self, id: int, **kwargs) -> Dict:
|
||||
"""Update an ASN range."""
|
||||
return self.client.patch(f'{self.base_endpoint}/asn-ranges', id, kwargs)
|
||||
|
||||
async def delete_asn_range(self, id: int) -> None:
|
||||
"""Delete an ASN range."""
|
||||
self.client.delete(f'{self.base_endpoint}/asn-ranges', id)
|
||||
|
||||
# ==================== ASNs ====================
|
||||
|
||||
async def list_asns(
|
||||
self,
|
||||
asn: Optional[int] = None,
|
||||
rir_id: Optional[int] = None,
|
||||
tenant_id: Optional[int] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all ASNs."""
|
||||
params = {k: v for k, v in {
|
||||
'asn': asn, 'rir_id': rir_id, 'tenant_id': tenant_id, **kwargs
|
||||
}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/asns', params=params)
|
||||
|
||||
async def get_asn(self, id: int) -> Dict:
|
||||
"""Get a specific ASN by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/asns', id)
|
||||
|
||||
async def create_asn(
|
||||
self,
|
||||
asn: int,
|
||||
rir: int,
|
||||
tenant: Optional[int] = None,
|
||||
description: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new ASN."""
|
||||
data = {'asn': asn, 'rir': rir, **kwargs}
|
||||
if tenant:
|
||||
data['tenant'] = tenant
|
||||
if description:
|
||||
data['description'] = description
|
||||
return self.client.create(f'{self.base_endpoint}/asns', data)
|
||||
|
||||
async def update_asn(self, id: int, **kwargs) -> Dict:
|
||||
"""Update an ASN."""
|
||||
return self.client.patch(f'{self.base_endpoint}/asns', id, kwargs)
|
||||
|
||||
async def delete_asn(self, id: int) -> None:
|
||||
"""Delete an ASN."""
|
||||
self.client.delete(f'{self.base_endpoint}/asns', id)
|
||||
|
||||
# ==================== RIRs ====================
|
||||
|
||||
async def list_rirs(self, name: Optional[str] = None, **kwargs) -> List[Dict]:
|
||||
"""List all RIRs (Regional Internet Registries)."""
|
||||
params = {k: v for k, v in {'name': name, **kwargs}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/rirs', params=params)
|
||||
|
||||
async def get_rir(self, id: int) -> Dict:
|
||||
"""Get a specific RIR by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/rirs', id)
|
||||
|
||||
async def create_rir(self, name: str, slug: str, is_private: bool = False, **kwargs) -> Dict:
|
||||
"""Create a new RIR."""
|
||||
data = {'name': name, 'slug': slug, 'is_private': is_private, **kwargs}
|
||||
return self.client.create(f'{self.base_endpoint}/rirs', data)
|
||||
|
||||
async def update_rir(self, id: int, **kwargs) -> Dict:
|
||||
"""Update a RIR."""
|
||||
return self.client.patch(f'{self.base_endpoint}/rirs', id, kwargs)
|
||||
|
||||
async def delete_rir(self, id: int) -> None:
|
||||
"""Delete a RIR."""
|
||||
self.client.delete(f'{self.base_endpoint}/rirs', id)
|
||||
|
||||
# ==================== Aggregates ====================
|
||||
|
||||
async def list_aggregates(
|
||||
self,
|
||||
prefix: Optional[str] = None,
|
||||
rir_id: Optional[int] = None,
|
||||
tenant_id: Optional[int] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all aggregates."""
|
||||
params = {k: v for k, v in {
|
||||
'prefix': prefix, 'rir_id': rir_id, 'tenant_id': tenant_id, **kwargs
|
||||
}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/aggregates', params=params)
|
||||
|
||||
async def get_aggregate(self, id: int) -> Dict:
|
||||
"""Get a specific aggregate by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/aggregates', id)
|
||||
|
||||
async def create_aggregate(
|
||||
self,
|
||||
prefix: str,
|
||||
rir: int,
|
||||
tenant: Optional[int] = None,
|
||||
description: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new aggregate."""
|
||||
data = {'prefix': prefix, 'rir': rir, **kwargs}
|
||||
if tenant:
|
||||
data['tenant'] = tenant
|
||||
if description:
|
||||
data['description'] = description
|
||||
return self.client.create(f'{self.base_endpoint}/aggregates', data)
|
||||
|
||||
async def update_aggregate(self, id: int, **kwargs) -> Dict:
|
||||
"""Update an aggregate."""
|
||||
return self.client.patch(f'{self.base_endpoint}/aggregates', id, kwargs)
|
||||
|
||||
async def delete_aggregate(self, id: int) -> None:
|
||||
"""Delete an aggregate."""
|
||||
self.client.delete(f'{self.base_endpoint}/aggregates', id)
|
||||
|
||||
# ==================== Roles ====================
|
||||
|
||||
async def list_roles(self, name: Optional[str] = None, **kwargs) -> List[Dict]:
|
||||
"""List all IPAM roles."""
|
||||
params = {k: v for k, v in {'name': name, **kwargs}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/roles', params=params)
|
||||
|
||||
async def get_role(self, id: int) -> Dict:
|
||||
"""Get a specific role by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/roles', id)
|
||||
|
||||
async def create_role(self, name: str, slug: str, weight: int = 1000, **kwargs) -> Dict:
|
||||
"""Create a new IPAM role."""
|
||||
data = {'name': name, 'slug': slug, 'weight': weight, **kwargs}
|
||||
return self.client.create(f'{self.base_endpoint}/roles', data)
|
||||
|
||||
async def update_role(self, id: int, **kwargs) -> Dict:
|
||||
"""Update a role."""
|
||||
return self.client.patch(f'{self.base_endpoint}/roles', id, kwargs)
|
||||
|
||||
async def delete_role(self, id: int) -> None:
|
||||
"""Delete a role."""
|
||||
self.client.delete(f'{self.base_endpoint}/roles', id)
|
||||
|
||||
# ==================== Prefixes ====================
|
||||
|
||||
async def list_prefixes(
|
||||
self,
|
||||
prefix: Optional[str] = None,
|
||||
site_id: Optional[int] = None,
|
||||
vrf_id: Optional[int] = None,
|
||||
vlan_id: Optional[int] = None,
|
||||
role_id: Optional[int] = None,
|
||||
tenant_id: Optional[int] = None,
|
||||
status: Optional[str] = None,
|
||||
family: Optional[int] = None,
|
||||
is_pool: Optional[bool] = None,
|
||||
within: Optional[str] = None,
|
||||
within_include: Optional[str] = None,
|
||||
contains: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all prefixes with optional filtering."""
|
||||
params = {k: v for k, v in {
|
||||
'prefix': prefix, 'site_id': site_id, 'vrf_id': vrf_id,
|
||||
'vlan_id': vlan_id, 'role_id': role_id, 'tenant_id': tenant_id,
|
||||
'status': status, 'family': family, 'is_pool': is_pool,
|
||||
'within': within, 'within_include': within_include, 'contains': contains, **kwargs
|
||||
}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/prefixes', params=params)
|
||||
|
||||
async def get_prefix(self, id: int) -> Dict:
|
||||
"""Get a specific prefix by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/prefixes', id)
|
||||
|
||||
async def create_prefix(
|
||||
self,
|
||||
prefix: str,
|
||||
status: str = 'active',
|
||||
site: Optional[int] = None,
|
||||
vrf: Optional[int] = None,
|
||||
vlan: Optional[int] = None,
|
||||
role: Optional[int] = None,
|
||||
tenant: Optional[int] = None,
|
||||
is_pool: bool = False,
|
||||
mark_utilized: bool = False,
|
||||
description: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new prefix."""
|
||||
data = {'prefix': prefix, 'status': status, 'is_pool': is_pool, 'mark_utilized': mark_utilized, **kwargs}
|
||||
for key, val in [
|
||||
('site', site), ('vrf', vrf), ('vlan', vlan),
|
||||
('role', role), ('tenant', tenant), ('description', description)
|
||||
]:
|
||||
if val is not None:
|
||||
data[key] = val
|
||||
return self.client.create(f'{self.base_endpoint}/prefixes', data)
|
||||
|
||||
async def update_prefix(self, id: int, **kwargs) -> Dict:
|
||||
"""Update a prefix."""
|
||||
return self.client.patch(f'{self.base_endpoint}/prefixes', id, kwargs)
|
||||
|
||||
async def delete_prefix(self, id: int) -> None:
|
||||
"""Delete a prefix."""
|
||||
self.client.delete(f'{self.base_endpoint}/prefixes', id)
|
||||
|
||||
async def list_available_prefixes(self, id: int) -> List[Dict]:
"""List available child prefixes within a prefix."""
return self.client.list(f'{self.base_endpoint}/prefixes/{id}/available-prefixes', paginate=False)

async def create_available_prefix(self, id: int, prefix_length: int, **kwargs) -> Dict:
"""Create a new prefix from available space."""
data = {'prefix_length': prefix_length, **kwargs}
return self.client.create(f'{self.base_endpoint}/prefixes/{id}/available-prefixes', data)

async def list_available_ips(self, id: int) -> List[Dict]:
"""List available IP addresses within a prefix."""
return self.client.list(f'{self.base_endpoint}/prefixes/{id}/available-ips', paginate=False)

async def create_available_ip(self, id: int, **kwargs) -> Dict:
"""Create a new IP address from available space in prefix."""
return self.client.create(f'{self.base_endpoint}/prefixes/{id}/available-ips', kwargs)
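
# Usage sketch (illustrative only): carve a /29 out of a parent prefix, then take the
# next free address inside it, using the available-prefixes / available-ips helpers
# above. Assumes the create calls return the serialized object including its 'id';
# the description strings are placeholders.
async def allocate_point_to_point(ipam: IPAMTools, parent_prefix_id: int) -> Dict:
    child = await ipam.create_available_prefix(
        parent_prefix_id, prefix_length=29, description='p2p allocation'
    )
    # Reserve the next free address inside the newly created child prefix.
    return await ipam.create_available_ip(child['id'], description='gateway address')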
|
||||
|
||||
# ==================== IP Ranges ====================
|
||||
|
||||
async def list_ip_ranges(
|
||||
self,
|
||||
start_address: Optional[str] = None,
|
||||
end_address: Optional[str] = None,
|
||||
vrf_id: Optional[int] = None,
|
||||
tenant_id: Optional[int] = None,
|
||||
status: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all IP ranges."""
|
||||
params = {k: v for k, v in {
|
||||
'start_address': start_address, 'end_address': end_address,
|
||||
'vrf_id': vrf_id, 'tenant_id': tenant_id, 'status': status, **kwargs
|
||||
}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/ip-ranges', params=params)
|
||||
|
||||
async def get_ip_range(self, id: int) -> Dict:
|
||||
"""Get a specific IP range by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/ip-ranges', id)
|
||||
|
||||
async def create_ip_range(
|
||||
self,
|
||||
start_address: str,
|
||||
end_address: str,
|
||||
status: str = 'active',
|
||||
vrf: Optional[int] = None,
|
||||
tenant: Optional[int] = None,
|
||||
role: Optional[int] = None,
|
||||
description: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new IP range."""
|
||||
data = {'start_address': start_address, 'end_address': end_address, 'status': status, **kwargs}
|
||||
for key, val in [('vrf', vrf), ('tenant', tenant), ('role', role), ('description', description)]:
|
||||
if val is not None:
|
||||
data[key] = val
|
||||
return self.client.create(f'{self.base_endpoint}/ip-ranges', data)
|
||||
|
||||
async def update_ip_range(self, id: int, **kwargs) -> Dict:
|
||||
"""Update an IP range."""
|
||||
return self.client.patch(f'{self.base_endpoint}/ip-ranges', id, kwargs)
|
||||
|
||||
async def delete_ip_range(self, id: int) -> None:
|
||||
"""Delete an IP range."""
|
||||
self.client.delete(f'{self.base_endpoint}/ip-ranges', id)
|
||||
|
||||
async def list_available_ips_in_range(self, id: int) -> List[Dict]:
|
||||
"""List available IP addresses within an IP range."""
|
||||
return self.client.list(f'{self.base_endpoint}/ip-ranges/{id}/available-ips', paginate=False)
|
||||
|
||||
# ==================== IP Addresses ====================
|
||||
|
||||
async def list_ip_addresses(
|
||||
self,
|
||||
address: Optional[str] = None,
|
||||
vrf_id: Optional[int] = None,
|
||||
tenant_id: Optional[int] = None,
|
||||
status: Optional[str] = None,
|
||||
role: Optional[str] = None,
|
||||
interface_id: Optional[int] = None,
|
||||
device_id: Optional[int] = None,
|
||||
virtual_machine_id: Optional[int] = None,
|
||||
family: Optional[int] = None,
|
||||
parent: Optional[str] = None,
|
||||
dns_name: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all IP addresses with optional filtering."""
|
||||
params = {k: v for k, v in {
|
||||
'address': address, 'vrf_id': vrf_id, 'tenant_id': tenant_id,
|
||||
'status': status, 'role': role, 'interface_id': interface_id,
|
||||
'device_id': device_id, 'virtual_machine_id': virtual_machine_id,
|
||||
'family': family, 'parent': parent, 'dns_name': dns_name, **kwargs
|
||||
}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/ip-addresses', params=params)
|
||||
|
||||
async def get_ip_address(self, id: int) -> Dict:
|
||||
"""Get a specific IP address by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/ip-addresses', id)
|
||||
|
||||
async def create_ip_address(
|
||||
self,
|
||||
address: str,
|
||||
status: str = 'active',
|
||||
vrf: Optional[int] = None,
|
||||
tenant: Optional[int] = None,
|
||||
role: Optional[str] = None,
|
||||
assigned_object_type: Optional[str] = None,
|
||||
assigned_object_id: Optional[int] = None,
|
||||
nat_inside: Optional[int] = None,
|
||||
dns_name: Optional[str] = None,
|
||||
description: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new IP address."""
|
||||
data = {'address': address, 'status': status, **kwargs}
|
||||
for key, val in [
|
||||
('vrf', vrf), ('tenant', tenant), ('role', role),
|
||||
('assigned_object_type', assigned_object_type),
|
||||
('assigned_object_id', assigned_object_id),
|
||||
('nat_inside', nat_inside), ('dns_name', dns_name),
|
||||
('description', description)
|
||||
]:
|
||||
if val is not None:
|
||||
data[key] = val
|
||||
return self.client.create(f'{self.base_endpoint}/ip-addresses', data)
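
# Usage sketch (illustrative only): register an address and bind it to a device
# interface. The 'dcim.interface' content-type string is the usual NetBox label for
# device interfaces and is an assumption rather than something defined in this file.
async def attach_ip_to_interface(ipam: IPAMTools, cidr: str, interface_id: int) -> Dict:
    return await ipam.create_ip_address(
        address=cidr,                           # e.g. '192.0.2.10/24'
        assigned_object_type='dcim.interface',  # assumed content-type label
        assigned_object_id=interface_id,
        description='managed via MCP',
    )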
|
||||
|
||||
async def update_ip_address(self, id: int, **kwargs) -> Dict:
|
||||
"""Update an IP address."""
|
||||
return self.client.patch(f'{self.base_endpoint}/ip-addresses', id, kwargs)
|
||||
|
||||
async def delete_ip_address(self, id: int) -> None:
|
||||
"""Delete an IP address."""
|
||||
self.client.delete(f'{self.base_endpoint}/ip-addresses', id)
|
||||
|
||||
# ==================== FHRP Groups ====================
|
||||
|
||||
async def list_fhrp_groups(
|
||||
self,
|
||||
protocol: Optional[str] = None,
|
||||
group_id: Optional[int] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all FHRP groups."""
|
||||
params = {k: v for k, v in {'protocol': protocol, 'group_id': group_id, **kwargs}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/fhrp-groups', params=params)
|
||||
|
||||
async def get_fhrp_group(self, id: int) -> Dict:
|
||||
"""Get a specific FHRP group by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/fhrp-groups', id)
|
||||
|
||||
async def create_fhrp_group(
|
||||
self,
|
||||
protocol: str,
|
||||
group_id: int,
|
||||
auth_type: Optional[str] = None,
|
||||
auth_key: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new FHRP group."""
|
||||
data = {'protocol': protocol, 'group_id': group_id, **kwargs}
|
||||
if auth_type:
|
||||
data['auth_type'] = auth_type
|
||||
if auth_key:
|
||||
data['auth_key'] = auth_key
|
||||
return self.client.create(f'{self.base_endpoint}/fhrp-groups', data)
|
||||
|
||||
async def update_fhrp_group(self, id: int, **kwargs) -> Dict:
|
||||
"""Update an FHRP group."""
|
||||
return self.client.patch(f'{self.base_endpoint}/fhrp-groups', id, kwargs)
|
||||
|
||||
async def delete_fhrp_group(self, id: int) -> None:
|
||||
"""Delete an FHRP group."""
|
||||
self.client.delete(f'{self.base_endpoint}/fhrp-groups', id)
|
||||
|
||||
# ==================== FHRP Group Assignments ====================
|
||||
|
||||
async def list_fhrp_group_assignments(self, group_id: Optional[int] = None, **kwargs) -> List[Dict]:
|
||||
"""List all FHRP group assignments."""
|
||||
params = {k: v for k, v in {'group_id': group_id, **kwargs}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/fhrp-group-assignments', params=params)
|
||||
|
||||
async def get_fhrp_group_assignment(self, id: int) -> Dict:
|
||||
"""Get a specific FHRP group assignment by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/fhrp-group-assignments', id)
|
||||
|
||||
async def create_fhrp_group_assignment(
|
||||
self,
|
||||
group: int,
|
||||
interface_type: str,
|
||||
interface_id: int,
|
||||
priority: int = 100,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new FHRP group assignment."""
|
||||
data = {
|
||||
'group': group, 'interface_type': interface_type,
|
||||
'interface_id': interface_id, 'priority': priority, **kwargs
|
||||
}
|
||||
return self.client.create(f'{self.base_endpoint}/fhrp-group-assignments', data)
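
# Usage sketch (illustrative only): model a VRRP gateway by creating an FHRP group and
# attaching it to an interface. The 'vrrp2' protocol value and the 'dcim.interface'
# content-type string follow NetBox's usual choices and are assumptions here.
async def add_vrrp_gateway(ipam: IPAMTools, vrid: int, interface_id: int) -> Dict:
    group = await ipam.create_fhrp_group(protocol='vrrp2', group_id=vrid)  # assumed choice value
    return await ipam.create_fhrp_group_assignment(
        group=group['id'],
        interface_type='dcim.interface',  # assumed content-type label
        interface_id=interface_id,
        priority=200,                     # VRRP priority, 1-255
    )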
|
||||
|
||||
async def update_fhrp_group_assignment(self, id: int, **kwargs) -> Dict:
|
||||
"""Update an FHRP group assignment."""
|
||||
return self.client.patch(f'{self.base_endpoint}/fhrp-group-assignments', id, kwargs)
|
||||
|
||||
async def delete_fhrp_group_assignment(self, id: int) -> None:
|
||||
"""Delete an FHRP group assignment."""
|
||||
self.client.delete(f'{self.base_endpoint}/fhrp-group-assignments', id)
|
||||
|
||||
# ==================== VLAN Groups ====================
|
||||
|
||||
async def list_vlan_groups(
|
||||
self,
|
||||
name: Optional[str] = None,
|
||||
site_id: Optional[int] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all VLAN groups."""
|
||||
params = {k: v for k, v in {'name': name, 'site_id': site_id, **kwargs}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/vlan-groups', params=params)
|
||||
|
||||
async def get_vlan_group(self, id: int) -> Dict:
|
||||
"""Get a specific VLAN group by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/vlan-groups', id)
|
||||
|
||||
async def create_vlan_group(
|
||||
self,
|
||||
name: str,
|
||||
slug: str,
|
||||
scope_type: Optional[str] = None,
|
||||
scope_id: Optional[int] = None,
|
||||
min_vid: int = 1,
|
||||
max_vid: int = 4094,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new VLAN group."""
|
||||
data = {'name': name, 'slug': slug, 'min_vid': min_vid, 'max_vid': max_vid, **kwargs}
|
||||
if scope_type:
|
||||
data['scope_type'] = scope_type
|
||||
if scope_id:
|
||||
data['scope_id'] = scope_id
|
||||
return self.client.create(f'{self.base_endpoint}/vlan-groups', data)
|
||||
|
||||
async def update_vlan_group(self, id: int, **kwargs) -> Dict:
|
||||
"""Update a VLAN group."""
|
||||
return self.client.patch(f'{self.base_endpoint}/vlan-groups', id, kwargs)
|
||||
|
||||
async def delete_vlan_group(self, id: int) -> None:
|
||||
"""Delete a VLAN group."""
|
||||
self.client.delete(f'{self.base_endpoint}/vlan-groups', id)
|
||||
|
||||
async def list_available_vlans(self, id: int) -> List[Dict]:
"""List available VLANs in a VLAN group."""
return self.client.list(f'{self.base_endpoint}/vlan-groups/{id}/available-vlans', paginate=False)
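
# Usage sketch (illustrative only): pick the next free VLAN ID in a group and create a
# VLAN with it. Assumes the available-vlans listing returns candidates carrying a 'vid'
# key, as NetBox's available-object endpoints do.
async def create_next_vlan(ipam: IPAMTools, group_id: int, name: str) -> Dict:
    free = await ipam.list_available_vlans(group_id)
    if not free:
        raise RuntimeError(f'No free VLAN IDs left in group {group_id}')
    return await ipam.create_vlan(vid=free[0]['vid'], name=name, group=group_id)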
|
||||
|
||||
# ==================== VLANs ====================
|
||||
|
||||
async def list_vlans(
|
||||
self,
|
||||
vid: Optional[int] = None,
|
||||
name: Optional[str] = None,
|
||||
site_id: Optional[int] = None,
|
||||
group_id: Optional[int] = None,
|
||||
role_id: Optional[int] = None,
|
||||
tenant_id: Optional[int] = None,
|
||||
status: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all VLANs with optional filtering."""
|
||||
params = {k: v for k, v in {
|
||||
'vid': vid, 'name': name, 'site_id': site_id, 'group_id': group_id,
|
||||
'role_id': role_id, 'tenant_id': tenant_id, 'status': status, **kwargs
|
||||
}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/vlans', params=params)
|
||||
|
||||
async def get_vlan(self, id: int) -> Dict:
|
||||
"""Get a specific VLAN by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/vlans', id)
|
||||
|
||||
async def create_vlan(
|
||||
self,
|
||||
vid: int,
|
||||
name: str,
|
||||
status: str = 'active',
|
||||
site: Optional[int] = None,
|
||||
group: Optional[int] = None,
|
||||
role: Optional[int] = None,
|
||||
tenant: Optional[int] = None,
|
||||
description: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new VLAN."""
|
||||
data = {'vid': vid, 'name': name, 'status': status, **kwargs}
|
||||
for key, val in [
|
||||
('site', site), ('group', group), ('role', role),
|
||||
('tenant', tenant), ('description', description)
|
||||
]:
|
||||
if val is not None:
|
||||
data[key] = val
|
||||
return self.client.create(f'{self.base_endpoint}/vlans', data)
|
||||
|
||||
async def update_vlan(self, id: int, **kwargs) -> Dict:
|
||||
"""Update a VLAN."""
|
||||
return self.client.patch(f'{self.base_endpoint}/vlans', id, kwargs)
|
||||
|
||||
async def delete_vlan(self, id: int) -> None:
|
||||
"""Delete a VLAN."""
|
||||
self.client.delete(f'{self.base_endpoint}/vlans', id)
|
||||
|
||||
# ==================== VRFs ====================
|
||||
|
||||
async def list_vrfs(
|
||||
self,
|
||||
name: Optional[str] = None,
|
||||
rd: Optional[str] = None,
|
||||
tenant_id: Optional[int] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all VRFs with optional filtering."""
|
||||
params = {k: v for k, v in {
|
||||
'name': name, 'rd': rd, 'tenant_id': tenant_id, **kwargs
|
||||
}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/vrfs', params=params)
|
||||
|
||||
async def get_vrf(self, id: int) -> Dict:
|
||||
"""Get a specific VRF by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/vrfs', id)
|
||||
|
||||
async def create_vrf(
|
||||
self,
|
||||
name: str,
|
||||
rd: Optional[str] = None,
|
||||
tenant: Optional[int] = None,
|
||||
enforce_unique: bool = True,
|
||||
description: Optional[str] = None,
|
||||
import_targets: Optional[List[int]] = None,
|
||||
export_targets: Optional[List[int]] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new VRF."""
|
||||
data = {'name': name, 'enforce_unique': enforce_unique, **kwargs}
|
||||
for key, val in [
|
||||
('rd', rd), ('tenant', tenant), ('description', description),
|
||||
('import_targets', import_targets), ('export_targets', export_targets)
|
||||
]:
|
||||
if val is not None:
|
||||
data[key] = val
|
||||
return self.client.create(f'{self.base_endpoint}/vrfs', data)
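
# Usage sketch (illustrative only): build an L3VPN-style VRF by creating a route target
# first and referencing it from the VRF. The ASN:NN strings are placeholders; a single
# target is reused for import and export purely to keep the example short.
async def create_customer_vrf(ipam: IPAMTools, name: str, asn: int, vpn_id: int) -> Dict:
    rt = await ipam.create_route_target(name=f'{asn}:{vpn_id}')
    return await ipam.create_vrf(
        name=name,
        rd=f'{asn}:{vpn_id}',
        import_targets=[rt['id']],
        export_targets=[rt['id']],
    )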
|
||||
|
||||
async def update_vrf(self, id: int, **kwargs) -> Dict:
|
||||
"""Update a VRF."""
|
||||
return self.client.patch(f'{self.base_endpoint}/vrfs', id, kwargs)
|
||||
|
||||
async def delete_vrf(self, id: int) -> None:
|
||||
"""Delete a VRF."""
|
||||
self.client.delete(f'{self.base_endpoint}/vrfs', id)
|
||||
|
||||
# ==================== Route Targets ====================
|
||||
|
||||
async def list_route_targets(
|
||||
self,
|
||||
name: Optional[str] = None,
|
||||
tenant_id: Optional[int] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all route targets."""
|
||||
params = {k: v for k, v in {'name': name, 'tenant_id': tenant_id, **kwargs}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/route-targets', params=params)
|
||||
|
||||
async def get_route_target(self, id: int) -> Dict:
|
||||
"""Get a specific route target by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/route-targets', id)
|
||||
|
||||
async def create_route_target(
|
||||
self,
|
||||
name: str,
|
||||
tenant: Optional[int] = None,
|
||||
description: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new route target."""
|
||||
data = {'name': name, **kwargs}
|
||||
if tenant:
|
||||
data['tenant'] = tenant
|
||||
if description:
|
||||
data['description'] = description
|
||||
return self.client.create(f'{self.base_endpoint}/route-targets', data)
|
||||
|
||||
async def update_route_target(self, id: int, **kwargs) -> Dict:
|
||||
"""Update a route target."""
|
||||
return self.client.patch(f'{self.base_endpoint}/route-targets', id, kwargs)
|
||||
|
||||
async def delete_route_target(self, id: int) -> None:
|
||||
"""Delete a route target."""
|
||||
self.client.delete(f'{self.base_endpoint}/route-targets', id)
|
||||
|
||||
# ==================== Services ====================
|
||||
|
||||
async def list_services(
|
||||
self,
|
||||
device_id: Optional[int] = None,
|
||||
virtual_machine_id: Optional[int] = None,
|
||||
name: Optional[str] = None,
|
||||
protocol: Optional[str] = None,
|
||||
port: Optional[int] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all services."""
|
||||
params = {k: v for k, v in {
|
||||
'device_id': device_id, 'virtual_machine_id': virtual_machine_id,
|
||||
'name': name, 'protocol': protocol, 'port': port, **kwargs
|
||||
}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/services', params=params)
|
||||
|
||||
async def get_service(self, id: int) -> Dict:
|
||||
"""Get a specific service by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/services', id)
|
||||
|
||||
async def create_service(
|
||||
self,
|
||||
name: str,
|
||||
ports: List[int],
|
||||
protocol: str,
|
||||
device: Optional[int] = None,
|
||||
virtual_machine: Optional[int] = None,
|
||||
ipaddresses: Optional[List[int]] = None,
|
||||
description: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new service."""
|
||||
data = {'name': name, 'ports': ports, 'protocol': protocol, **kwargs}
|
||||
for key, val in [
|
||||
('device', device), ('virtual_machine', virtual_machine),
|
||||
('ipaddresses', ipaddresses), ('description', description)
|
||||
]:
|
||||
if val is not None:
|
||||
data[key] = val
|
||||
return self.client.create(f'{self.base_endpoint}/services', data)
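
# Usage sketch (illustrative only): record an HTTPS service on a device and bind it to
# an existing IP address object. 'tcp' follows NetBox's protocol choices; the IDs are
# placeholders supplied by the caller.
async def register_https(ipam: IPAMTools, device_id: int, ip_id: int) -> Dict:
    return await ipam.create_service(
        name='https',
        ports=[443],
        protocol='tcp',
        device=device_id,
        ipaddresses=[ip_id],
        description='web frontend',
    )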
|
||||
|
||||
async def update_service(self, id: int, **kwargs) -> Dict:
|
||||
"""Update a service."""
|
||||
return self.client.patch(f'{self.base_endpoint}/services', id, kwargs)
|
||||
|
||||
async def delete_service(self, id: int) -> None:
|
||||
"""Delete a service."""
|
||||
self.client.delete(f'{self.base_endpoint}/services', id)
|
||||
|
||||
# ==================== Service Templates ====================
|
||||
|
||||
async def list_service_templates(self, name: Optional[str] = None, **kwargs) -> List[Dict]:
|
||||
"""List all service templates."""
|
||||
params = {k: v for k, v in {'name': name, **kwargs}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/service-templates', params=params)
|
||||
|
||||
async def get_service_template(self, id: int) -> Dict:
|
||||
"""Get a specific service template by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/service-templates', id)
|
||||
|
||||
async def create_service_template(
|
||||
self,
|
||||
name: str,
|
||||
ports: List[int],
|
||||
protocol: str,
|
||||
description: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new service template."""
|
||||
data = {'name': name, 'ports': ports, 'protocol': protocol, **kwargs}
|
||||
if description:
|
||||
data['description'] = description
|
||||
return self.client.create(f'{self.base_endpoint}/service-templates', data)
|
||||
|
||||
async def update_service_template(self, id: int, **kwargs) -> Dict:
|
||||
"""Update a service template."""
|
||||
return self.client.patch(f'{self.base_endpoint}/service-templates', id, kwargs)
|
||||
|
||||
async def delete_service_template(self, id: int) -> None:
|
||||
"""Delete a service template."""
|
||||
self.client.delete(f'{self.base_endpoint}/service-templates', id)
|
||||
@@ -1,281 +0,0 @@
"""
Tenancy tools for NetBox MCP Server.

Covers: Tenants, Tenant Groups, Contacts, Contact Groups, and Contact Roles.
"""
import logging
from typing import List, Dict, Optional, Any
from ..netbox_client import NetBoxClient

logger = logging.getLogger(__name__)


class TenancyTools:
"""Tools for Tenancy operations in NetBox"""

def __init__(self, client: NetBoxClient):
self.client = client
self.base_endpoint = 'tenancy'
|
||||
|
||||
# ==================== Tenant Groups ====================
|
||||
|
||||
async def list_tenant_groups(
|
||||
self,
|
||||
name: Optional[str] = None,
|
||||
slug: Optional[str] = None,
|
||||
parent_id: Optional[int] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all tenant groups."""
|
||||
params = {k: v for k, v in {
|
||||
'name': name, 'slug': slug, 'parent_id': parent_id, **kwargs
|
||||
}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/tenant-groups', params=params)
|
||||
|
||||
async def get_tenant_group(self, id: int) -> Dict:
|
||||
"""Get a specific tenant group by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/tenant-groups', id)
|
||||
|
||||
async def create_tenant_group(
|
||||
self,
|
||||
name: str,
|
||||
slug: str,
|
||||
parent: Optional[int] = None,
|
||||
description: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new tenant group."""
|
||||
data = {'name': name, 'slug': slug, **kwargs}
|
||||
if parent:
|
||||
data['parent'] = parent
|
||||
if description:
|
||||
data['description'] = description
|
||||
return self.client.create(f'{self.base_endpoint}/tenant-groups', data)
|
||||
|
||||
async def update_tenant_group(self, id: int, **kwargs) -> Dict:
|
||||
"""Update a tenant group."""
|
||||
return self.client.patch(f'{self.base_endpoint}/tenant-groups', id, kwargs)
|
||||
|
||||
async def delete_tenant_group(self, id: int) -> None:
|
||||
"""Delete a tenant group."""
|
||||
self.client.delete(f'{self.base_endpoint}/tenant-groups', id)
|
||||
|
||||
# ==================== Tenants ====================
|
||||
|
||||
async def list_tenants(
|
||||
self,
|
||||
name: Optional[str] = None,
|
||||
slug: Optional[str] = None,
|
||||
group_id: Optional[int] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all tenants with optional filtering."""
|
||||
params = {k: v for k, v in {
|
||||
'name': name, 'slug': slug, 'group_id': group_id, **kwargs
|
||||
}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/tenants', params=params)
|
||||
|
||||
async def get_tenant(self, id: int) -> Dict:
|
||||
"""Get a specific tenant by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/tenants', id)
|
||||
|
||||
async def create_tenant(
|
||||
self,
|
||||
name: str,
|
||||
slug: str,
|
||||
group: Optional[int] = None,
|
||||
description: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new tenant."""
|
||||
data = {'name': name, 'slug': slug, **kwargs}
|
||||
if group:
|
||||
data['group'] = group
|
||||
if description:
|
||||
data['description'] = description
|
||||
return self.client.create(f'{self.base_endpoint}/tenants', data)
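
# Usage sketch (illustrative only): onboard a tenant under a (possibly pre-existing)
# tenant group. The slug derivation is deliberately naive and the lookup-or-create
# logic is minimal; both are placeholders rather than a recommended pattern.
async def onboard_tenant(tenancy: TenancyTools, group_name: str, tenant_name: str) -> Dict:
    groups = await tenancy.list_tenant_groups(name=group_name)
    group = groups[0] if groups else await tenancy.create_tenant_group(
        name=group_name, slug=group_name.lower().replace(' ', '-')
    )
    return await tenancy.create_tenant(
        name=tenant_name,
        slug=tenant_name.lower().replace(' ', '-'),
        group=group['id'],
    )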
|
||||
|
||||
async def update_tenant(self, id: int, **kwargs) -> Dict:
|
||||
"""Update a tenant."""
|
||||
return self.client.patch(f'{self.base_endpoint}/tenants', id, kwargs)
|
||||
|
||||
async def delete_tenant(self, id: int) -> None:
|
||||
"""Delete a tenant."""
|
||||
self.client.delete(f'{self.base_endpoint}/tenants', id)
|
||||
|
||||
# ==================== Contact Groups ====================
|
||||
|
||||
async def list_contact_groups(
|
||||
self,
|
||||
name: Optional[str] = None,
|
||||
slug: Optional[str] = None,
|
||||
parent_id: Optional[int] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all contact groups."""
|
||||
params = {k: v for k, v in {
|
||||
'name': name, 'slug': slug, 'parent_id': parent_id, **kwargs
|
||||
}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/contact-groups', params=params)
|
||||
|
||||
async def get_contact_group(self, id: int) -> Dict:
|
||||
"""Get a specific contact group by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/contact-groups', id)
|
||||
|
||||
async def create_contact_group(
|
||||
self,
|
||||
name: str,
|
||||
slug: str,
|
||||
parent: Optional[int] = None,
|
||||
description: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new contact group."""
|
||||
data = {'name': name, 'slug': slug, **kwargs}
|
||||
if parent:
|
||||
data['parent'] = parent
|
||||
if description:
|
||||
data['description'] = description
|
||||
return self.client.create(f'{self.base_endpoint}/contact-groups', data)
|
||||
|
||||
async def update_contact_group(self, id: int, **kwargs) -> Dict:
|
||||
"""Update a contact group."""
|
||||
return self.client.patch(f'{self.base_endpoint}/contact-groups', id, kwargs)
|
||||
|
||||
async def delete_contact_group(self, id: int) -> None:
|
||||
"""Delete a contact group."""
|
||||
self.client.delete(f'{self.base_endpoint}/contact-groups', id)
|
||||
|
||||
# ==================== Contact Roles ====================
|
||||
|
||||
async def list_contact_roles(
|
||||
self,
|
||||
name: Optional[str] = None,
|
||||
slug: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all contact roles."""
|
||||
params = {k: v for k, v in {'name': name, 'slug': slug, **kwargs}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/contact-roles', params=params)
|
||||
|
||||
async def get_contact_role(self, id: int) -> Dict:
|
||||
"""Get a specific contact role by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/contact-roles', id)
|
||||
|
||||
async def create_contact_role(
|
||||
self,
|
||||
name: str,
|
||||
slug: str,
|
||||
description: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new contact role."""
|
||||
data = {'name': name, 'slug': slug, **kwargs}
|
||||
if description:
|
||||
data['description'] = description
|
||||
return self.client.create(f'{self.base_endpoint}/contact-roles', data)
|
||||
|
||||
async def update_contact_role(self, id: int, **kwargs) -> Dict:
|
||||
"""Update a contact role."""
|
||||
return self.client.patch(f'{self.base_endpoint}/contact-roles', id, kwargs)
|
||||
|
||||
async def delete_contact_role(self, id: int) -> None:
|
||||
"""Delete a contact role."""
|
||||
self.client.delete(f'{self.base_endpoint}/contact-roles', id)
|
||||
|
||||
# ==================== Contacts ====================
|
||||
|
||||
async def list_contacts(
|
||||
self,
|
||||
name: Optional[str] = None,
|
||||
group_id: Optional[int] = None,
|
||||
email: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all contacts with optional filtering."""
|
||||
params = {k: v for k, v in {
|
||||
'name': name, 'group_id': group_id, 'email': email, **kwargs
|
||||
}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/contacts', params=params)
|
||||
|
||||
async def get_contact(self, id: int) -> Dict:
|
||||
"""Get a specific contact by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/contacts', id)
|
||||
|
||||
async def create_contact(
|
||||
self,
|
||||
name: str,
|
||||
group: Optional[int] = None,
|
||||
title: Optional[str] = None,
|
||||
phone: Optional[str] = None,
|
||||
email: Optional[str] = None,
|
||||
address: Optional[str] = None,
|
||||
link: Optional[str] = None,
|
||||
description: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new contact."""
|
||||
data = {'name': name, **kwargs}
|
||||
for key, val in [
|
||||
('group', group), ('title', title), ('phone', phone),
|
||||
('email', email), ('address', address), ('link', link),
|
||||
('description', description)
|
||||
]:
|
||||
if val is not None:
|
||||
data[key] = val
|
||||
return self.client.create(f'{self.base_endpoint}/contacts', data)
|
||||
|
||||
async def update_contact(self, id: int, **kwargs) -> Dict:
|
||||
"""Update a contact."""
|
||||
return self.client.patch(f'{self.base_endpoint}/contacts', id, kwargs)
|
||||
|
||||
async def delete_contact(self, id: int) -> None:
|
||||
"""Delete a contact."""
|
||||
self.client.delete(f'{self.base_endpoint}/contacts', id)
|
||||
|
||||
# ==================== Contact Assignments ====================
|
||||
|
||||
async def list_contact_assignments(
|
||||
self,
|
||||
contact_id: Optional[int] = None,
|
||||
role_id: Optional[int] = None,
|
||||
object_type: Optional[str] = None,
|
||||
object_id: Optional[int] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all contact assignments."""
|
||||
params = {k: v for k, v in {
|
||||
'contact_id': contact_id, 'role_id': role_id,
|
||||
'object_type': object_type, 'object_id': object_id, **kwargs
|
||||
}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/contact-assignments', params=params)
|
||||
|
||||
async def get_contact_assignment(self, id: int) -> Dict:
|
||||
"""Get a specific contact assignment by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/contact-assignments', id)
|
||||
|
||||
async def create_contact_assignment(
|
||||
self,
|
||||
contact: int,
|
||||
role: int,
|
||||
object_type: str,
|
||||
object_id: int,
|
||||
priority: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new contact assignment."""
|
||||
data = {
|
||||
'contact': contact, 'role': role,
|
||||
'object_type': object_type, 'object_id': object_id, **kwargs
|
||||
}
|
||||
if priority:
|
||||
data['priority'] = priority
|
||||
return self.client.create(f'{self.base_endpoint}/contact-assignments', data)
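
# Usage sketch (illustrative only): create a contact and assign it to a site under a
# given role. The 'dcim.site' content-type string and the 'primary' priority value
# follow common NetBox conventions and are assumptions, not definitions from this file.
async def assign_site_contact(tenancy: TenancyTools, site_id: int, role_id: int) -> Dict:
    contact = await tenancy.create_contact(
        name='NOC Team', email='noc@example.com', phone='+1-555-0100'  # placeholder details
    )
    return await tenancy.create_contact_assignment(
        contact=contact['id'],
        role=role_id,
        object_type='dcim.site',  # assumed content-type label
        object_id=site_id,
        priority='primary',       # assumed choice value
    )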
|
||||
|
||||
async def update_contact_assignment(self, id: int, **kwargs) -> Dict:
|
||||
"""Update a contact assignment."""
|
||||
return self.client.patch(f'{self.base_endpoint}/contact-assignments', id, kwargs)
|
||||
|
||||
async def delete_contact_assignment(self, id: int) -> None:
|
||||
"""Delete a contact assignment."""
|
||||
self.client.delete(f'{self.base_endpoint}/contact-assignments', id)
|
||||
@@ -1,296 +0,0 @@
"""
Virtualization tools for NetBox MCP Server.

Covers: Clusters, Virtual Machines, VM Interfaces, and related models.
"""
import logging
from typing import List, Dict, Optional, Any
from ..netbox_client import NetBoxClient

logger = logging.getLogger(__name__)


class VirtualizationTools:
"""Tools for Virtualization operations in NetBox"""

def __init__(self, client: NetBoxClient):
self.client = client
self.base_endpoint = 'virtualization'
|
||||
|
||||
# ==================== Cluster Types ====================
|
||||
|
||||
async def list_cluster_types(
|
||||
self,
|
||||
name: Optional[str] = None,
|
||||
slug: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all cluster types."""
|
||||
params = {k: v for k, v in {'name': name, 'slug': slug, **kwargs}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/cluster-types', params=params)
|
||||
|
||||
async def get_cluster_type(self, id: int) -> Dict:
|
||||
"""Get a specific cluster type by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/cluster-types', id)
|
||||
|
||||
async def create_cluster_type(
|
||||
self,
|
||||
name: str,
|
||||
slug: str,
|
||||
description: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new cluster type."""
|
||||
data = {'name': name, 'slug': slug, **kwargs}
|
||||
if description:
|
||||
data['description'] = description
|
||||
return self.client.create(f'{self.base_endpoint}/cluster-types', data)
|
||||
|
||||
async def update_cluster_type(self, id: int, **kwargs) -> Dict:
|
||||
"""Update a cluster type."""
|
||||
return self.client.patch(f'{self.base_endpoint}/cluster-types', id, kwargs)
|
||||
|
||||
async def delete_cluster_type(self, id: int) -> None:
|
||||
"""Delete a cluster type."""
|
||||
self.client.delete(f'{self.base_endpoint}/cluster-types', id)
|
||||
|
||||
# ==================== Cluster Groups ====================
|
||||
|
||||
async def list_cluster_groups(
|
||||
self,
|
||||
name: Optional[str] = None,
|
||||
slug: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all cluster groups."""
|
||||
params = {k: v for k, v in {'name': name, 'slug': slug, **kwargs}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/cluster-groups', params=params)
|
||||
|
||||
async def get_cluster_group(self, id: int) -> Dict:
|
||||
"""Get a specific cluster group by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/cluster-groups', id)
|
||||
|
||||
async def create_cluster_group(
|
||||
self,
|
||||
name: str,
|
||||
slug: str,
|
||||
description: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new cluster group."""
|
||||
data = {'name': name, 'slug': slug, **kwargs}
|
||||
if description:
|
||||
data['description'] = description
|
||||
return self.client.create(f'{self.base_endpoint}/cluster-groups', data)
|
||||
|
||||
async def update_cluster_group(self, id: int, **kwargs) -> Dict:
|
||||
"""Update a cluster group."""
|
||||
return self.client.patch(f'{self.base_endpoint}/cluster-groups', id, kwargs)
|
||||
|
||||
async def delete_cluster_group(self, id: int) -> None:
|
||||
"""Delete a cluster group."""
|
||||
self.client.delete(f'{self.base_endpoint}/cluster-groups', id)
|
||||
|
||||
# ==================== Clusters ====================
|
||||
|
||||
async def list_clusters(
|
||||
self,
|
||||
name: Optional[str] = None,
|
||||
type_id: Optional[int] = None,
|
||||
group_id: Optional[int] = None,
|
||||
site_id: Optional[int] = None,
|
||||
tenant_id: Optional[int] = None,
|
||||
status: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all clusters with optional filtering."""
|
||||
params = {k: v for k, v in {
|
||||
'name': name, 'type_id': type_id, 'group_id': group_id,
|
||||
'site_id': site_id, 'tenant_id': tenant_id, 'status': status, **kwargs
|
||||
}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/clusters', params=params)
|
||||
|
||||
async def get_cluster(self, id: int) -> Dict:
|
||||
"""Get a specific cluster by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/clusters', id)
|
||||
|
||||
async def create_cluster(
|
||||
self,
|
||||
name: str,
|
||||
type: int,
|
||||
status: str = 'active',
|
||||
group: Optional[int] = None,
|
||||
site: Optional[int] = None,
|
||||
tenant: Optional[int] = None,
|
||||
description: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new cluster."""
|
||||
data = {'name': name, 'type': type, 'status': status, **kwargs}
|
||||
for key, val in [
|
||||
('group', group), ('site', site), ('tenant', tenant), ('description', description)
|
||||
]:
|
||||
if val is not None:
|
||||
data[key] = val
|
||||
return self.client.create(f'{self.base_endpoint}/clusters', data)
|
||||
|
||||
async def update_cluster(self, id: int, **kwargs) -> Dict:
|
||||
"""Update a cluster."""
|
||||
return self.client.patch(f'{self.base_endpoint}/clusters', id, kwargs)
|
||||
|
||||
async def delete_cluster(self, id: int) -> None:
|
||||
"""Delete a cluster."""
|
||||
self.client.delete(f'{self.base_endpoint}/clusters', id)
|
||||
|
||||
# ==================== Virtual Machines ====================
|
||||
|
||||
async def list_virtual_machines(
|
||||
self,
|
||||
name: Optional[str] = None,
|
||||
cluster_id: Optional[int] = None,
|
||||
site_id: Optional[int] = None,
|
||||
role_id: Optional[int] = None,
|
||||
tenant_id: Optional[int] = None,
|
||||
platform_id: Optional[int] = None,
|
||||
status: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all virtual machines with optional filtering."""
|
||||
params = {k: v for k, v in {
|
||||
'name': name, 'cluster_id': cluster_id, 'site_id': site_id,
|
||||
'role_id': role_id, 'tenant_id': tenant_id, 'platform_id': platform_id,
|
||||
'status': status, **kwargs
|
||||
}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/virtual-machines', params=params)
|
||||
|
||||
async def get_virtual_machine(self, id: int) -> Dict:
|
||||
"""Get a specific virtual machine by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/virtual-machines', id)
|
||||
|
||||
async def create_virtual_machine(
|
||||
self,
|
||||
name: str,
|
||||
status: str = 'active',
|
||||
cluster: Optional[int] = None,
|
||||
site: Optional[int] = None,
|
||||
role: Optional[int] = None,
|
||||
tenant: Optional[int] = None,
|
||||
platform: Optional[int] = None,
|
||||
primary_ip4: Optional[int] = None,
|
||||
primary_ip6: Optional[int] = None,
|
||||
vcpus: Optional[float] = None,
|
||||
memory: Optional[int] = None,
|
||||
disk: Optional[int] = None,
|
||||
description: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new virtual machine."""
|
||||
data = {'name': name, 'status': status, **kwargs}
|
||||
for key, val in [
|
||||
('cluster', cluster), ('site', site), ('role', role),
|
||||
('tenant', tenant), ('platform', platform),
|
||||
('primary_ip4', primary_ip4), ('primary_ip6', primary_ip6),
|
||||
('vcpus', vcpus), ('memory', memory), ('disk', disk),
|
||||
('description', description)
|
||||
]:
|
||||
if val is not None:
|
||||
data[key] = val
|
||||
return self.client.create(f'{self.base_endpoint}/virtual-machines', data)
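
# Usage sketch (illustrative only): provision a small VM with one NIC. Memory in MB and
# disk in GB follow NetBox's data model (stated as an assumption here); IP addressing
# and platform details are left out for brevity.
async def provision_small_vm(virt: VirtualizationTools, name: str, cluster_id: int) -> Dict:
    vm = await virt.create_virtual_machine(
        name=name,
        cluster=cluster_id,
        vcpus=2,
        memory=4096,  # MB (assumed unit)
        disk=40,      # GB (assumed unit)
    )
    await virt.create_vm_interface(virtual_machine=vm['id'], name='eth0')
    return vm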
|
||||
|
||||
async def update_virtual_machine(self, id: int, **kwargs) -> Dict:
|
||||
"""Update a virtual machine."""
|
||||
return self.client.patch(f'{self.base_endpoint}/virtual-machines', id, kwargs)
|
||||
|
||||
async def delete_virtual_machine(self, id: int) -> None:
|
||||
"""Delete a virtual machine."""
|
||||
self.client.delete(f'{self.base_endpoint}/virtual-machines', id)
|
||||
|
||||
# ==================== VM Interfaces ====================
|
||||
|
||||
async def list_vm_interfaces(
|
||||
self,
|
||||
virtual_machine_id: Optional[int] = None,
|
||||
name: Optional[str] = None,
|
||||
enabled: Optional[bool] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all VM interfaces."""
|
||||
params = {k: v for k, v in {
|
||||
'virtual_machine_id': virtual_machine_id, 'name': name, 'enabled': enabled, **kwargs
|
||||
}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/interfaces', params=params)
|
||||
|
||||
async def get_vm_interface(self, id: int) -> Dict:
|
||||
"""Get a specific VM interface by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/interfaces', id)
|
||||
|
||||
async def create_vm_interface(
|
||||
self,
|
||||
virtual_machine: int,
|
||||
name: str,
|
||||
enabled: bool = True,
|
||||
mtu: Optional[int] = None,
|
||||
mac_address: Optional[str] = None,
|
||||
description: Optional[str] = None,
|
||||
mode: Optional[str] = None,
|
||||
untagged_vlan: Optional[int] = None,
|
||||
tagged_vlans: Optional[List[int]] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new VM interface."""
|
||||
data = {'virtual_machine': virtual_machine, 'name': name, 'enabled': enabled, **kwargs}
|
||||
for key, val in [
|
||||
('mtu', mtu), ('mac_address', mac_address), ('description', description),
|
||||
('mode', mode), ('untagged_vlan', untagged_vlan), ('tagged_vlans', tagged_vlans)
|
||||
]:
|
||||
if val is not None:
|
||||
data[key] = val
|
||||
return self.client.create(f'{self.base_endpoint}/interfaces', data)
|
||||
|
||||
async def update_vm_interface(self, id: int, **kwargs) -> Dict:
|
||||
"""Update a VM interface."""
|
||||
return self.client.patch(f'{self.base_endpoint}/interfaces', id, kwargs)
|
||||
|
||||
async def delete_vm_interface(self, id: int) -> None:
|
||||
"""Delete a VM interface."""
|
||||
self.client.delete(f'{self.base_endpoint}/interfaces', id)
|
||||
|
||||
# ==================== Virtual Disks ====================
|
||||
|
||||
async def list_virtual_disks(
|
||||
self,
|
||||
virtual_machine_id: Optional[int] = None,
|
||||
name: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all virtual disks."""
|
||||
params = {k: v for k, v in {
|
||||
'virtual_machine_id': virtual_machine_id, 'name': name, **kwargs
|
||||
}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/virtual-disks', params=params)
|
||||
|
||||
async def get_virtual_disk(self, id: int) -> Dict:
|
||||
"""Get a specific virtual disk by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/virtual-disks', id)
|
||||
|
||||
async def create_virtual_disk(
|
||||
self,
|
||||
virtual_machine: int,
|
||||
name: str,
|
||||
size: int,
|
||||
description: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new virtual disk."""
|
||||
data = {'virtual_machine': virtual_machine, 'name': name, 'size': size, **kwargs}
|
||||
if description:
|
||||
data['description'] = description
|
||||
return self.client.create(f'{self.base_endpoint}/virtual-disks', data)
|
||||
|
||||
async def update_virtual_disk(self, id: int, **kwargs) -> Dict:
|
||||
"""Update a virtual disk."""
|
||||
return self.client.patch(f'{self.base_endpoint}/virtual-disks', id, kwargs)
|
||||
|
||||
async def delete_virtual_disk(self, id: int) -> None:
|
||||
"""Delete a virtual disk."""
|
||||
self.client.delete(f'{self.base_endpoint}/virtual-disks', id)
|
||||
@@ -1,428 +0,0 @@
|
||||
"""
|
||||
VPN tools for NetBox MCP Server.
|
||||
|
||||
Covers: Tunnels, Tunnel Groups, Tunnel Terminations, IKE/IPSec Policies, and L2VPN.
|
||||
"""
|
||||
import logging
|
||||
from typing import List, Dict, Optional, Any
|
||||
from ..netbox_client import NetBoxClient
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class VPNTools:
|
||||
"""Tools for VPN operations in NetBox"""
|
||||
|
||||
def __init__(self, client: NetBoxClient):
|
||||
self.client = client
|
||||
self.base_endpoint = 'vpn'
|
||||
|
||||
# ==================== Tunnel Groups ====================
|
||||
|
||||
async def list_tunnel_groups(
|
||||
self,
|
||||
name: Optional[str] = None,
|
||||
slug: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all tunnel groups."""
|
||||
params = {k: v for k, v in {'name': name, 'slug': slug, **kwargs}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/tunnel-groups', params=params)
|
||||
|
||||
async def get_tunnel_group(self, id: int) -> Dict:
|
||||
"""Get a specific tunnel group by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/tunnel-groups', id)
|
||||
|
||||
async def create_tunnel_group(
|
||||
self,
|
||||
name: str,
|
||||
slug: str,
|
||||
description: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new tunnel group."""
|
||||
data = {'name': name, 'slug': slug, **kwargs}
|
||||
if description:
|
||||
data['description'] = description
|
||||
return self.client.create(f'{self.base_endpoint}/tunnel-groups', data)
|
||||
|
||||
async def update_tunnel_group(self, id: int, **kwargs) -> Dict:
|
||||
"""Update a tunnel group."""
|
||||
return self.client.patch(f'{self.base_endpoint}/tunnel-groups', id, kwargs)
|
||||
|
||||
async def delete_tunnel_group(self, id: int) -> None:
|
||||
"""Delete a tunnel group."""
|
||||
self.client.delete(f'{self.base_endpoint}/tunnel-groups', id)
|
||||
|
||||
# ==================== Tunnels ====================
|
||||
|
||||
async def list_tunnels(
|
||||
self,
|
||||
name: Optional[str] = None,
|
||||
status: Optional[str] = None,
|
||||
group_id: Optional[int] = None,
|
||||
encapsulation: Optional[str] = None,
|
||||
tenant_id: Optional[int] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all tunnels with optional filtering."""
|
||||
params = {k: v for k, v in {
|
||||
'name': name, 'status': status, 'group_id': group_id,
|
||||
'encapsulation': encapsulation, 'tenant_id': tenant_id, **kwargs
|
||||
}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/tunnels', params=params)
|
||||
|
||||
async def get_tunnel(self, id: int) -> Dict:
|
||||
"""Get a specific tunnel by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/tunnels', id)
|
||||
|
||||
async def create_tunnel(
|
||||
self,
|
||||
name: str,
|
||||
status: str = 'active',
|
||||
encapsulation: str = 'ipsec-tunnel',
|
||||
group: Optional[int] = None,
|
||||
ipsec_profile: Optional[int] = None,
|
||||
tenant: Optional[int] = None,
|
||||
tunnel_id: Optional[int] = None,
|
||||
description: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new tunnel."""
|
||||
data = {'name': name, 'status': status, 'encapsulation': encapsulation, **kwargs}
|
||||
for key, val in [
|
||||
('group', group), ('ipsec_profile', ipsec_profile),
|
||||
('tenant', tenant), ('tunnel_id', tunnel_id), ('description', description)
|
||||
]:
|
||||
if val is not None:
|
||||
data[key] = val
|
||||
return self.client.create(f'{self.base_endpoint}/tunnels', data)
|
||||
|
||||
async def update_tunnel(self, id: int, **kwargs) -> Dict:
|
||||
"""Update a tunnel."""
|
||||
return self.client.patch(f'{self.base_endpoint}/tunnels', id, kwargs)
|
||||
|
||||
async def delete_tunnel(self, id: int) -> None:
|
||||
"""Delete a tunnel."""
|
||||
self.client.delete(f'{self.base_endpoint}/tunnels', id)
|
||||
|
||||
# ==================== Tunnel Terminations ====================
|
||||
|
||||
async def list_tunnel_terminations(
|
||||
self,
|
||||
tunnel_id: Optional[int] = None,
|
||||
role: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all tunnel terminations."""
|
||||
params = {k: v for k, v in {
|
||||
'tunnel_id': tunnel_id, 'role': role, **kwargs
|
||||
}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/tunnel-terminations', params=params)
|
||||
|
||||
async def get_tunnel_termination(self, id: int) -> Dict:
|
||||
"""Get a specific tunnel termination by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/tunnel-terminations', id)
|
||||
|
||||
async def create_tunnel_termination(
|
||||
self,
|
||||
tunnel: int,
|
||||
role: str,
|
||||
termination_type: str,
|
||||
termination_id: int,
|
||||
outside_ip: Optional[int] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new tunnel termination."""
|
||||
data = {
|
||||
'tunnel': tunnel, 'role': role,
|
||||
'termination_type': termination_type, 'termination_id': termination_id, **kwargs
|
||||
}
|
||||
if outside_ip:
|
||||
data['outside_ip'] = outside_ip
|
||||
return self.client.create(f'{self.base_endpoint}/tunnel-terminations', data)
|
||||
|
||||
async def update_tunnel_termination(self, id: int, **kwargs) -> Dict:
|
||||
"""Update a tunnel termination."""
|
||||
return self.client.patch(f'{self.base_endpoint}/tunnel-terminations', id, kwargs)
|
||||
|
||||
async def delete_tunnel_termination(self, id: int) -> None:
|
||||
"""Delete a tunnel termination."""
|
||||
self.client.delete(f'{self.base_endpoint}/tunnel-terminations', id)
|
||||
|
||||
# ==================== IKE Proposals ====================
|
||||
|
||||
async def list_ike_proposals(self, name: Optional[str] = None, **kwargs) -> List[Dict]:
|
||||
"""List all IKE proposals."""
|
||||
params = {k: v for k, v in {'name': name, **kwargs}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/ike-proposals', params=params)
|
||||
|
||||
async def get_ike_proposal(self, id: int) -> Dict:
|
||||
"""Get a specific IKE proposal by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/ike-proposals', id)
|
||||
|
||||
async def create_ike_proposal(
|
||||
self,
|
||||
name: str,
|
||||
authentication_method: str,
|
||||
encryption_algorithm: str,
|
||||
authentication_algorithm: str,
|
||||
group: int,
|
||||
sa_lifetime: Optional[int] = None,
|
||||
description: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new IKE proposal."""
|
||||
data = {
|
||||
'name': name, 'authentication_method': authentication_method,
|
||||
'encryption_algorithm': encryption_algorithm,
|
||||
'authentication_algorithm': authentication_algorithm, 'group': group, **kwargs
|
||||
}
|
||||
if sa_lifetime:
|
||||
data['sa_lifetime'] = sa_lifetime
|
||||
if description:
|
||||
data['description'] = description
|
||||
return self.client.create(f'{self.base_endpoint}/ike-proposals', data)
|
||||
|
||||
async def update_ike_proposal(self, id: int, **kwargs) -> Dict:
|
||||
"""Update an IKE proposal."""
|
||||
return self.client.patch(f'{self.base_endpoint}/ike-proposals', id, kwargs)
|
||||
|
||||
async def delete_ike_proposal(self, id: int) -> None:
|
||||
"""Delete an IKE proposal."""
|
||||
self.client.delete(f'{self.base_endpoint}/ike-proposals', id)
|
||||
|
||||
# ==================== IKE Policies ====================
|
||||
|
||||
async def list_ike_policies(self, name: Optional[str] = None, **kwargs) -> List[Dict]:
|
||||
"""List all IKE policies."""
|
||||
params = {k: v for k, v in {'name': name, **kwargs}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/ike-policies', params=params)
|
||||
|
||||
async def get_ike_policy(self, id: int) -> Dict:
|
||||
"""Get a specific IKE policy by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/ike-policies', id)
|
||||
|
||||
async def create_ike_policy(
|
||||
self,
|
||||
name: str,
|
||||
version: int,
|
||||
mode: str,
|
||||
proposals: List[int],
|
||||
preshared_key: Optional[str] = None,
|
||||
description: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new IKE policy."""
|
||||
data = {'name': name, 'version': version, 'mode': mode, 'proposals': proposals, **kwargs}
|
||||
if preshared_key:
|
||||
data['preshared_key'] = preshared_key
|
||||
if description:
|
||||
data['description'] = description
|
||||
return self.client.create(f'{self.base_endpoint}/ike-policies', data)
|
||||
|
||||
async def update_ike_policy(self, id: int, **kwargs) -> Dict:
|
||||
"""Update an IKE policy."""
|
||||
return self.client.patch(f'{self.base_endpoint}/ike-policies', id, kwargs)
|
||||
|
||||
async def delete_ike_policy(self, id: int) -> None:
|
||||
"""Delete an IKE policy."""
|
||||
self.client.delete(f'{self.base_endpoint}/ike-policies', id)
|
||||
|
||||
# ==================== IPSec Proposals ====================
|
||||
|
||||
async def list_ipsec_proposals(self, name: Optional[str] = None, **kwargs) -> List[Dict]:
|
||||
"""List all IPSec proposals."""
|
||||
params = {k: v for k, v in {'name': name, **kwargs}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/ipsec-proposals', params=params)
|
||||
|
||||
async def get_ipsec_proposal(self, id: int) -> Dict:
|
||||
"""Get a specific IPSec proposal by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/ipsec-proposals', id)
|
||||
|
||||
async def create_ipsec_proposal(
|
||||
self,
|
||||
name: str,
|
||||
encryption_algorithm: str,
|
||||
authentication_algorithm: str,
|
||||
sa_lifetime_seconds: Optional[int] = None,
|
||||
sa_lifetime_data: Optional[int] = None,
|
||||
description: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new IPSec proposal."""
|
||||
data = {
|
||||
'name': name, 'encryption_algorithm': encryption_algorithm,
|
||||
'authentication_algorithm': authentication_algorithm, **kwargs
|
||||
}
|
||||
for key, val in [
|
||||
('sa_lifetime_seconds', sa_lifetime_seconds),
|
||||
('sa_lifetime_data', sa_lifetime_data), ('description', description)
|
||||
]:
|
||||
if val is not None:
|
||||
data[key] = val
|
||||
return self.client.create(f'{self.base_endpoint}/ipsec-proposals', data)
|
||||
|
||||
async def update_ipsec_proposal(self, id: int, **kwargs) -> Dict:
|
||||
"""Update an IPSec proposal."""
|
||||
return self.client.patch(f'{self.base_endpoint}/ipsec-proposals', id, kwargs)
|
||||
|
||||
async def delete_ipsec_proposal(self, id: int) -> None:
|
||||
"""Delete an IPSec proposal."""
|
||||
self.client.delete(f'{self.base_endpoint}/ipsec-proposals', id)
|
||||
|
||||
# ==================== IPSec Policies ====================
|
||||
|
||||
async def list_ipsec_policies(self, name: Optional[str] = None, **kwargs) -> List[Dict]:
|
||||
"""List all IPSec policies."""
|
||||
params = {k: v for k, v in {'name': name, **kwargs}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/ipsec-policies', params=params)
|
||||
|
||||
async def get_ipsec_policy(self, id: int) -> Dict:
|
||||
"""Get a specific IPSec policy by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/ipsec-policies', id)
|
||||
|
||||
async def create_ipsec_policy(
|
||||
self,
|
||||
name: str,
|
||||
proposals: List[int],
|
||||
pfs_group: Optional[int] = None,
|
||||
description: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new IPSec policy."""
|
||||
data = {'name': name, 'proposals': proposals, **kwargs}
|
||||
if pfs_group:
|
||||
data['pfs_group'] = pfs_group
|
||||
if description:
|
||||
data['description'] = description
|
||||
return self.client.create(f'{self.base_endpoint}/ipsec-policies', data)
|
||||
|
||||
async def update_ipsec_policy(self, id: int, **kwargs) -> Dict:
|
||||
"""Update an IPSec policy."""
|
||||
return self.client.patch(f'{self.base_endpoint}/ipsec-policies', id, kwargs)
|
||||
|
||||
async def delete_ipsec_policy(self, id: int) -> None:
|
||||
"""Delete an IPSec policy."""
|
||||
self.client.delete(f'{self.base_endpoint}/ipsec-policies', id)
|
||||
|
||||
# ==================== IPSec Profiles ====================
|
||||
|
||||
async def list_ipsec_profiles(self, name: Optional[str] = None, **kwargs) -> List[Dict]:
|
||||
"""List all IPSec profiles."""
|
||||
params = {k: v for k, v in {'name': name, **kwargs}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/ipsec-profiles', params=params)
|
||||
|
||||
async def get_ipsec_profile(self, id: int) -> Dict:
|
||||
"""Get a specific IPSec profile by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/ipsec-profiles', id)
|
||||
|
||||
async def create_ipsec_profile(
|
||||
self,
|
||||
name: str,
|
||||
mode: str,
|
||||
ike_policy: int,
|
||||
ipsec_policy: int,
|
||||
description: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new IPSec profile."""
|
||||
data = {'name': name, 'mode': mode, 'ike_policy': ike_policy, 'ipsec_policy': ipsec_policy, **kwargs}
|
||||
if description:
|
||||
data['description'] = description
|
||||
return self.client.create(f'{self.base_endpoint}/ipsec-profiles', data)
|
||||
|
||||
async def update_ipsec_profile(self, id: int, **kwargs) -> Dict:
|
||||
"""Update an IPSec profile."""
|
||||
return self.client.patch(f'{self.base_endpoint}/ipsec-profiles', id, kwargs)
|
||||
|
||||
async def delete_ipsec_profile(self, id: int) -> None:
|
||||
"""Delete an IPSec profile."""
|
||||
self.client.delete(f'{self.base_endpoint}/ipsec-profiles', id)
|
||||
|
||||
# ==================== L2VPN ====================
|
||||
|
||||
async def list_l2vpns(
|
||||
self,
|
||||
name: Optional[str] = None,
|
||||
slug: Optional[str] = None,
|
||||
type: Optional[str] = None,
|
||||
tenant_id: Optional[int] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all L2VPNs with optional filtering."""
|
||||
params = {k: v for k, v in {
|
||||
'name': name, 'slug': slug, 'type': type, 'tenant_id': tenant_id, **kwargs
|
||||
}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/l2vpns', params=params)
|
||||
|
||||
async def get_l2vpn(self, id: int) -> Dict:
|
||||
"""Get a specific L2VPN by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/l2vpns', id)
|
||||
|
||||
async def create_l2vpn(
|
||||
self,
|
||||
name: str,
|
||||
slug: str,
|
||||
type: str,
|
||||
identifier: Optional[int] = None,
|
||||
tenant: Optional[int] = None,
|
||||
description: Optional[str] = None,
|
||||
import_targets: Optional[List[int]] = None,
|
||||
export_targets: Optional[List[int]] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new L2VPN."""
|
||||
data = {'name': name, 'slug': slug, 'type': type, **kwargs}
|
||||
for key, val in [
|
||||
('identifier', identifier), ('tenant', tenant), ('description', description),
|
||||
('import_targets', import_targets), ('export_targets', export_targets)
|
||||
]:
|
||||
if val is not None:
|
||||
data[key] = val
|
||||
return self.client.create(f'{self.base_endpoint}/l2vpns', data)
|
||||
|
||||
async def update_l2vpn(self, id: int, **kwargs) -> Dict:
|
||||
"""Update an L2VPN."""
|
||||
return self.client.patch(f'{self.base_endpoint}/l2vpns', id, kwargs)
|
||||
|
||||
async def delete_l2vpn(self, id: int) -> None:
|
||||
"""Delete an L2VPN."""
|
||||
self.client.delete(f'{self.base_endpoint}/l2vpns', id)
|
||||
|
||||
# ==================== L2VPN Terminations ====================
|
||||
|
||||
async def list_l2vpn_terminations(
|
||||
self,
|
||||
l2vpn_id: Optional[int] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all L2VPN terminations."""
|
||||
params = {k: v for k, v in {'l2vpn_id': l2vpn_id, **kwargs}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/l2vpn-terminations', params=params)
|
||||
|
||||
async def get_l2vpn_termination(self, id: int) -> Dict:
|
||||
"""Get a specific L2VPN termination by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/l2vpn-terminations', id)
|
||||
|
||||
async def create_l2vpn_termination(
|
||||
self,
|
||||
l2vpn: int,
|
||||
assigned_object_type: str,
|
||||
assigned_object_id: int,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new L2VPN termination."""
|
||||
data = {
|
||||
'l2vpn': l2vpn, 'assigned_object_type': assigned_object_type,
|
||||
'assigned_object_id': assigned_object_id, **kwargs
|
||||
}
|
||||
return self.client.create(f'{self.base_endpoint}/l2vpn-terminations', data)
|
||||
|
||||
async def update_l2vpn_termination(self, id: int, **kwargs) -> Dict:
|
||||
"""Update an L2VPN termination."""
|
||||
return self.client.patch(f'{self.base_endpoint}/l2vpn-terminations', id, kwargs)
|
||||
|
||||
async def delete_l2vpn_termination(self, id: int) -> None:
|
||||
"""Delete an L2VPN termination."""
|
||||
self.client.delete(f'{self.base_endpoint}/l2vpn-terminations', id)
|
||||
@@ -1,166 +0,0 @@
|
||||
"""
|
||||
Wireless tools for NetBox MCP Server.
|
||||
|
||||
Covers: Wireless LANs, Wireless LAN Groups, and Wireless Links.
|
||||
"""
|
||||
import logging
|
||||
from typing import List, Dict, Optional, Any
|
||||
from ..netbox_client import NetBoxClient
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class WirelessTools:
|
||||
"""Tools for Wireless operations in NetBox"""
|
||||
|
||||
def __init__(self, client: NetBoxClient):
|
||||
self.client = client
|
||||
self.base_endpoint = 'wireless'
|
||||
|
||||
# ==================== Wireless LAN Groups ====================
|
||||
|
||||
async def list_wireless_lan_groups(
|
||||
self,
|
||||
name: Optional[str] = None,
|
||||
slug: Optional[str] = None,
|
||||
parent_id: Optional[int] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all wireless LAN groups."""
|
||||
params = {k: v for k, v in {
|
||||
'name': name, 'slug': slug, 'parent_id': parent_id, **kwargs
|
||||
}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/wireless-lan-groups', params=params)
|
||||
|
||||
async def get_wireless_lan_group(self, id: int) -> Dict:
|
||||
"""Get a specific wireless LAN group by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/wireless-lan-groups', id)
|
||||
|
||||
async def create_wireless_lan_group(
|
||||
self,
|
||||
name: str,
|
||||
slug: str,
|
||||
parent: Optional[int] = None,
|
||||
description: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new wireless LAN group."""
|
||||
data = {'name': name, 'slug': slug, **kwargs}
|
||||
if parent:
|
||||
data['parent'] = parent
|
||||
if description:
|
||||
data['description'] = description
|
||||
return self.client.create(f'{self.base_endpoint}/wireless-lan-groups', data)
|
||||
|
||||
async def update_wireless_lan_group(self, id: int, **kwargs) -> Dict:
|
||||
"""Update a wireless LAN group."""
|
||||
return self.client.patch(f'{self.base_endpoint}/wireless-lan-groups', id, kwargs)
|
||||
|
||||
async def delete_wireless_lan_group(self, id: int) -> None:
|
||||
"""Delete a wireless LAN group."""
|
||||
self.client.delete(f'{self.base_endpoint}/wireless-lan-groups', id)
|
||||
|
||||
# ==================== Wireless LANs ====================
|
||||
|
||||
async def list_wireless_lans(
|
||||
self,
|
||||
ssid: Optional[str] = None,
|
||||
group_id: Optional[int] = None,
|
||||
vlan_id: Optional[int] = None,
|
||||
tenant_id: Optional[int] = None,
|
||||
status: Optional[str] = None,
|
||||
auth_type: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all wireless LANs with optional filtering."""
|
||||
params = {k: v for k, v in {
|
||||
'ssid': ssid, 'group_id': group_id, 'vlan_id': vlan_id,
|
||||
'tenant_id': tenant_id, 'status': status, 'auth_type': auth_type, **kwargs
|
||||
}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/wireless-lans', params=params)
|
||||
|
||||
async def get_wireless_lan(self, id: int) -> Dict:
|
||||
"""Get a specific wireless LAN by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/wireless-lans', id)
|
||||
|
||||
async def create_wireless_lan(
|
||||
self,
|
||||
ssid: str,
|
||||
status: str = 'active',
|
||||
group: Optional[int] = None,
|
||||
vlan: Optional[int] = None,
|
||||
tenant: Optional[int] = None,
|
||||
auth_type: Optional[str] = None,
|
||||
auth_cipher: Optional[str] = None,
|
||||
auth_psk: Optional[str] = None,
|
||||
description: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new wireless LAN."""
|
||||
data = {'ssid': ssid, 'status': status, **kwargs}
|
||||
for key, val in [
|
||||
('group', group), ('vlan', vlan), ('tenant', tenant),
|
||||
('auth_type', auth_type), ('auth_cipher', auth_cipher),
|
||||
('auth_psk', auth_psk), ('description', description)
|
||||
]:
|
||||
if val is not None:
|
||||
data[key] = val
|
||||
return self.client.create(f'{self.base_endpoint}/wireless-lans', data)
|
||||
|
||||
async def update_wireless_lan(self, id: int, **kwargs) -> Dict:
|
||||
"""Update a wireless LAN."""
|
||||
return self.client.patch(f'{self.base_endpoint}/wireless-lans', id, kwargs)
|
||||
|
||||
async def delete_wireless_lan(self, id: int) -> None:
|
||||
"""Delete a wireless LAN."""
|
||||
self.client.delete(f'{self.base_endpoint}/wireless-lans', id)
|
||||
|
||||
# ==================== Wireless Links ====================
|
||||
|
||||
async def list_wireless_links(
|
||||
self,
|
||||
ssid: Optional[str] = None,
|
||||
status: Optional[str] = None,
|
||||
tenant_id: Optional[int] = None,
|
||||
**kwargs
|
||||
) -> List[Dict]:
|
||||
"""List all wireless links with optional filtering."""
|
||||
params = {k: v for k, v in {
|
||||
'ssid': ssid, 'status': status, 'tenant_id': tenant_id, **kwargs
|
||||
}.items() if v is not None}
|
||||
return self.client.list(f'{self.base_endpoint}/wireless-links', params=params)
|
||||
|
||||
async def get_wireless_link(self, id: int) -> Dict:
|
||||
"""Get a specific wireless link by ID."""
|
||||
return self.client.get(f'{self.base_endpoint}/wireless-links', id)
|
||||
|
||||
async def create_wireless_link(
|
||||
self,
|
||||
interface_a: int,
|
||||
interface_b: int,
|
||||
ssid: Optional[str] = None,
|
||||
status: str = 'connected',
|
||||
tenant: Optional[int] = None,
|
||||
auth_type: Optional[str] = None,
|
||||
auth_cipher: Optional[str] = None,
|
||||
auth_psk: Optional[str] = None,
|
||||
description: Optional[str] = None,
|
||||
**kwargs
|
||||
) -> Dict:
|
||||
"""Create a new wireless link."""
|
||||
data = {'interface_a': interface_a, 'interface_b': interface_b, 'status': status, **kwargs}
|
||||
for key, val in [
|
||||
('ssid', ssid), ('tenant', tenant), ('auth_type', auth_type),
|
||||
('auth_cipher', auth_cipher), ('auth_psk', auth_psk), ('description', description)
|
||||
]:
|
||||
if val is not None:
|
||||
data[key] = val
|
||||
return self.client.create(f'{self.base_endpoint}/wireless-links', data)
|
||||
|
||||
async def update_wireless_link(self, id: int, **kwargs) -> Dict:
|
||||
"""Update a wireless link."""
|
||||
return self.client.patch(f'{self.base_endpoint}/wireless-links', id, kwargs)
|
||||
|
||||
async def delete_wireless_link(self, id: int) -> None:
|
||||
"""Delete a wireless link."""
|
||||
self.client.delete(f'{self.base_endpoint}/wireless-links', id)
|
||||
@@ -1,6 +0,0 @@
mcp>=0.9.0 # MCP SDK from Anthropic
python-dotenv>=1.0.0 # Environment variable loading
requests>=2.31.0 # HTTP client for NetBox API
pydantic>=2.5.0 # Data validation
pytest>=7.4.3 # Testing framework
pytest-asyncio>=0.23.0 # Async testing support
@@ -1 +0,0 @@
"""NetBox MCP Server tests."""
75
plugins/git-flow/.claude-plugin/plugin.json
Normal file
@@ -0,0 +1,75 @@
|
||||
{
|
||||
"name": "git-flow",
|
||||
"version": "1.0.0",
|
||||
"description": "Git workflow automation with intelligent commit messages and branch management",
|
||||
"author": {
|
||||
"name": "Leo Miranda",
|
||||
"email": "leobmiranda@gmail.com"
|
||||
},
|
||||
"homepage": "https://gitea.hotserv.cloud/personal-projects/support-claude-mktplace/src/branch/main/plugins/git-flow/README.md",
|
||||
"repository": "https://gitea.hotserv.cloud/personal-projects/support-claude-mktplace.git",
|
||||
"license": "MIT",
|
||||
"keywords": [
|
||||
"git",
|
||||
"workflow",
|
||||
"commit",
|
||||
"branch",
|
||||
"automation"
|
||||
],
|
||||
"commands": [
|
||||
{
|
||||
"name": "commit",
|
||||
"description": "Create a commit with auto-generated conventional message",
|
||||
"file": "commands/commit.md"
|
||||
},
|
||||
{
|
||||
"name": "commit-push",
|
||||
"description": "Commit and push to remote in one operation",
|
||||
"file": "commands/commit-push.md"
|
||||
},
|
||||
{
|
||||
"name": "commit-merge",
|
||||
"description": "Commit current changes and merge into target branch",
|
||||
"file": "commands/commit-merge.md"
|
||||
},
|
||||
{
|
||||
"name": "commit-sync",
|
||||
"description": "Commit, push, and sync with upstream",
|
||||
"file": "commands/commit-sync.md"
|
||||
},
|
||||
{
|
||||
"name": "branch-start",
|
||||
"description": "Start a new feature/fix/chore branch with naming convention",
|
||||
"file": "commands/branch-start.md"
|
||||
},
|
||||
{
|
||||
"name": "branch-cleanup",
|
||||
"description": "Clean up merged branches locally and optionally remotely",
|
||||
"file": "commands/branch-cleanup.md"
|
||||
},
|
||||
{
|
||||
"name": "git-status",
|
||||
"description": "Enhanced git status with recommendations",
|
||||
"file": "commands/git-status.md"
|
||||
},
|
||||
{
|
||||
"name": "git-config",
|
||||
"description": "Configure git-flow settings for this project",
|
||||
"file": "commands/git-config.md"
|
||||
}
|
||||
],
|
||||
"agents": [
|
||||
{
|
||||
"name": "git-assistant",
|
||||
"description": "Git workflow assistant for complex operations",
|
||||
"file": "agents/git-assistant.md"
|
||||
}
|
||||
],
|
||||
"skills": [
|
||||
{
|
||||
"name": "workflow-patterns",
|
||||
"description": "Git branching strategies and workflow patterns",
|
||||
"path": "skills/workflow-patterns"
|
||||
}
|
||||
]
|
||||
}
|
||||
128
plugins/git-flow/README.md
Normal file
@@ -0,0 +1,128 @@
|
||||
# git-flow
|
||||
|
||||
Git workflow automation with intelligent commit messages and branch management.
|
||||
|
||||
## Overview
|
||||
|
||||
git-flow streamlines common git operations with smart defaults, conventional commit messages, and workflow enforcement. It supports multiple branching strategies and adapts to your team's workflow.
|
||||
|
||||
## Commands
|
||||
|
||||
| Command | Description |
|
||||
|---------|-------------|
|
||||
| `/commit` | Create commit with auto-generated conventional message |
|
||||
| `/commit-push` | Commit and push in one operation |
|
||||
| `/commit-merge` | Commit and merge into target branch |
|
||||
| `/commit-sync` | Full sync: commit, push, and rebase on base branch |
|
||||
| `/branch-start` | Start new feature/fix/chore branch |
|
||||
| `/branch-cleanup` | Clean up merged branches |
|
||||
| `/git-status` | Enhanced status with recommendations |
|
||||
| `/git-config` | Configure git-flow settings |
|
||||
|
||||
## Workflow Styles
|
||||
|
||||
| Style | Description | Best For |
|
||||
|-------|-------------|----------|
|
||||
| `simple` | Direct commits to main | Solo projects |
|
||||
| `feature-branch` | Feature branches, merge when done | Small teams |
|
||||
| `pr-required` | All changes via pull request | Code review workflows |
|
||||
| `trunk-based` | Short-lived branches, frequent integration | CI/CD heavy |
|
||||
|
||||
## Installation
|
||||
|
||||
Add to your project's `.claude/settings.json`:
|
||||
|
||||
```json
|
||||
{
|
||||
"plugins": ["git-flow"]
|
||||
}
|
||||
```
|
||||
|
||||
## Configuration
|
||||
|
||||
Set environment variables in `.env` or `~/.config/claude/git-flow.env`:
|
||||
|
||||
```bash
|
||||
GIT_WORKFLOW_STYLE=feature-branch
|
||||
GIT_DEFAULT_BASE=development
|
||||
GIT_AUTO_DELETE_MERGED=true
|
||||
GIT_AUTO_PUSH=false
|
||||
GIT_PROTECTED_BRANCHES=main,master,development,staging
|
||||
GIT_COMMIT_STYLE=conventional
|
||||
GIT_CO_AUTHOR=true
|
||||
```
|
||||
|
||||
Or use `/git-config` for interactive configuration.
|
||||
|
||||
## Features
|
||||
|
||||
### Smart Commit Messages
|
||||
|
||||
Analyzes staged changes to generate appropriate conventional commit messages:
|
||||
|
||||
```
|
||||
feat(auth): add password reset functionality
|
||||
|
||||
Implement forgot password flow with email verification.
|
||||
Includes rate limiting and token expiration.
|
||||
```
|
||||
|
||||
### Branch Naming
|
||||
|
||||
Enforces consistent branch naming:
|
||||
|
||||
```
|
||||
feat/add-user-authentication
|
||||
fix/login-timeout-error
|
||||
chore/update-dependencies
|
||||
```
|
||||
|
||||
### Safety Checks
|
||||
|
||||
- Warns before commits to protected branches
|
||||
- Confirms force push operations
|
||||
- Prevents accidental branch deletion
|
||||
|
||||
### Conflict Resolution
|
||||
|
||||
The git-assistant agent helps resolve merge conflicts with analysis and recommendations.
|
||||
|
||||
## Usage Examples
|
||||
|
||||
### Start a Feature
|
||||
|
||||
```
|
||||
/branch-start add user authentication
|
||||
|
||||
→ Created: feat/add-user-authentication
|
||||
Based on: development
|
||||
```
|
||||
|
||||
### Commit Changes
|
||||
|
||||
```
|
||||
/commit
|
||||
|
||||
→ Analyzing changes...
|
||||
→ Proposed: feat(auth): add login component
|
||||
→ Committed: abc1234
|
||||
```
|
||||
|
||||
### Full Sync
|
||||
|
||||
```
|
||||
/commit-sync
|
||||
|
||||
→ Committed: abc1234
|
||||
→ Pushed to origin
|
||||
→ Rebased on development
|
||||
→ Status: Clean, up-to-date
|
||||
```
|
||||
|
||||
## Integration
|
||||
|
||||
For CLAUDE.md integration instructions, see `claude-md-integration.md`.
|
||||
|
||||
## License
|
||||
|
||||
MIT
|
||||
139
plugins/git-flow/agents/git-assistant.md
Normal file
@@ -0,0 +1,139 @@
|
||||
# Git Assistant Agent
|
||||
|
||||
## Role
|
||||
|
||||
You are a git workflow assistant that helps users navigate complex git operations, resolve conflicts, and maintain clean repository history.
|
||||
|
||||
## Capabilities
|
||||
|
||||
### 1. Conflict Resolution
|
||||
|
||||
Guide users through merge conflicts:
|
||||
|
||||
```
|
||||
Conflict detected in: src/auth/login.ts
|
||||
|
||||
Let me analyze the conflict:
|
||||
|
||||
<<<<<<< HEAD (your changes)
|
||||
function login(user: User) {
|
||||
return authenticate(user.email, user.password);
|
||||
}
|
||||
=======
|
||||
function login(credentials: Credentials) {
|
||||
return authenticate(credentials);
|
||||
}
|
||||
>>>>>>> development (incoming)
|
||||
|
||||
Analysis:
|
||||
- Your version: Uses User type, explicit parameters
|
||||
- Their version: Uses Credentials type, single object
|
||||
|
||||
Recommendation:
|
||||
The incoming change uses a cleaner pattern. However, you'll need
|
||||
to ensure User type is converted to Credentials.
|
||||
|
||||
Options:
|
||||
1. Accept theirs (update your code to use Credentials)
|
||||
2. Accept yours (they'll need to adapt)
|
||||
3. Combine both (create overloaded function)
|
||||
4. Need more context about Credentials type
|
||||
```
|
||||
|
||||
### 2. History Cleanup
|
||||
|
||||
Help with rebasing, squashing, and amending:
|
||||
|
||||
```
|
||||
Your branch has 15 commits, including:
|
||||
- 5 "WIP" commits
|
||||
- 3 "fix typo" commits
|
||||
- 2 "oops" commits
|
||||
|
||||
Before merging, would you like to clean up history?
|
||||
|
||||
Options:
|
||||
1. Squash all into single commit (cleanest)
|
||||
2. Interactive rebase (you pick what to squash)
|
||||
3. Keep as-is (preserve full history)
|
||||
```
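
A minimal sketch of option 2 (interactive rebase), assuming the base branch is `development`; the backup branch name is illustrative:

```bash
# Keep an escape hatch before rewriting history
git branch backup/feat-password-reset

# Interactively rebase everything since the branch diverged from development;
# mark the "WIP"/"fix typo"/"oops" commits as squash or fixup in the editor
git rebase -i development

# The rewritten history then needs a guarded force push
git push --force-with-lease
```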
|
||||
|
||||
### 3. Recovery Operations
|
||||
|
||||
Help users recover from mistakes:
|
||||
|
||||
```
|
||||
Accidentally deleted a branch? Let's recover it:
|
||||
|
||||
Recent deleted branches (from reflog):
|
||||
- feat/important-feature (deleted 5 mins ago)
|
||||
Last commit: abc1234 "Add important feature"
|
||||
|
||||
Recover this branch?
|
||||
```
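
The recovery itself is a short reflog lookup; a sketch using the hypothetical `abc1234` hash from the transcript above:

```bash
# Find the last commit the deleted branch pointed at
git reflog | grep "important-feature"

# Recreate the branch at that commit...
git branch feat/important-feature abc1234

# ...or recreate it and switch to it in one step
git checkout -b feat/important-feature abc1234
```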
|
||||
|
||||
### 4. Complex Merges
|
||||
|
||||
Handle cherry-picks, partial merges, octopus merges:
|
||||
|
||||
```
|
||||
You want to merge only specific commits from feat/mixed-changes.
|
||||
|
||||
Commits in that branch:
|
||||
1. abc1234 - Add user auth (✓ want this)
|
||||
2. def5678 - Fix unrelated bug (✗ skip)
|
||||
3. ghi9012 - Add password reset (✓ want this)
|
||||
|
||||
I'll cherry-pick commits 1 and 3. Proceed?
|
||||
```
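
Under the hood this partial merge is a plain cherry-pick; a sketch with the illustrative hashes from the transcript:

```bash
# From the target branch, apply only the selected commits
git checkout development
git cherry-pick abc1234 ghi9012

# If a pick conflicts: resolve the files, stage them, then continue (or abort)
git cherry-pick --continue
# git cherry-pick --abort
```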
|
||||
|
||||
## Communication Style
|
||||
|
||||
### Clear Explanations
|
||||
- Explain what each command does before running
|
||||
- Show the before/after state
|
||||
- Highlight risks
|
||||
|
||||
### Safe Defaults
|
||||
- Always prefer non-destructive operations
|
||||
- Confirm before force operations
|
||||
- Create backups before risky operations
|
||||
|
||||
### Educational
|
||||
- Explain why conflicts occur
|
||||
- Teach patterns to avoid issues
|
||||
- Suggest workflow improvements
|
||||
|
||||
## Safety Protocols
|
||||
|
||||
### Before Destructive Operations
|
||||
```
|
||||
⚠️ This operation will:
|
||||
- Rewrite history for 5 commits
|
||||
- Require force push to remote
|
||||
- Affect other team members
|
||||
|
||||
Creating backup branch: backup/feat-password-reset-20240120
|
||||
|
||||
Proceed? (yes/no)
|
||||
```
|
||||
|
||||
### Protected Branches
|
||||
```
|
||||
⛔ Cannot directly modify 'main' branch.
|
||||
|
||||
This branch is protected. You should:
|
||||
1. Create a feature branch
|
||||
2. Make your changes
|
||||
3. Create a pull request
|
||||
|
||||
Would you like me to create a branch for this change?
|
||||
```
|
||||
|
||||
## Output Style
|
||||
|
||||
Always show:
|
||||
- What will happen
|
||||
- Current state
|
||||
- Expected outcome
|
||||
- Recovery options if things go wrong
|
||||
55
plugins/git-flow/claude-md-integration.md
Normal file
@@ -0,0 +1,55 @@
# git-flow - CLAUDE.md Integration

Add the following section to your project's CLAUDE.md file to enable git-flow.

---

## Git Workflow

This project uses the git-flow plugin for git operations.

### Workflow Style

**Style:** feature-branch
**Base Branch:** development

### Branch Naming

Use the format: `<type>/<description>`

Types: feat, fix, chore, docs, refactor, test, perf

Examples:
- `feat/add-user-auth`
- `fix/login-timeout`
- `chore/update-deps`

### Commit Messages

Use conventional commits:

```
<type>(<scope>): <description>

[body]

[footer]
```

### Commands

| Command | Use Case |
|---------|----------|
| `/commit` | Create commit with smart message |
| `/commit-push` | Commit and push |
| `/commit-merge` | Commit and merge to base |
| `/branch-start` | Start new branch |
| `/git-status` | Enhanced status |

### Protected Branches

Do not commit directly to: main, development, staging

---

Copy the section between the horizontal rules into your CLAUDE.md.
94
plugins/git-flow/commands/branch-cleanup.md
Normal file
@@ -0,0 +1,94 @@
|
||||
# /branch-cleanup - Clean Merged Branches
|
||||
|
||||
## Purpose
|
||||
|
||||
Remove branches that have already been merged, both locally and, optionally, on the remote.
|
||||
|
||||
## Behavior
|
||||
|
||||
### Step 1: Identify Merged Branches
|
||||
|
||||
```bash
|
||||
# Find merged local branches
|
||||
git branch --merged <base-branch>
|
||||
|
||||
# Find merged remote branches
|
||||
git branch -r --merged <base-branch>
|
||||
```
|
||||
|
||||
### Step 2: Present Findings
|
||||
|
||||
```
|
||||
Found 5 merged branches:
|
||||
|
||||
Local:
|
||||
- feat/login-page (merged 3 days ago)
|
||||
- fix/typo-header (merged 1 week ago)
|
||||
- chore/deps-update (merged 2 weeks ago)
|
||||
|
||||
Remote:
|
||||
- origin/feat/login-page
|
||||
- origin/fix/typo-header
|
||||
|
||||
Protected (won't delete):
|
||||
- main
|
||||
- development
|
||||
- staging
|
||||
|
||||
Delete these branches?
|
||||
1. Delete all (local + remote)
|
||||
2. Delete local only
|
||||
3. Let me pick which ones
|
||||
4. Cancel
|
||||
```
|
||||
|
||||
### Step 3: Execute Cleanup
|
||||
|
||||
```bash
|
||||
# Delete local
|
||||
git branch -d <branch-name>
|
||||
|
||||
# Delete remote
|
||||
git push origin --delete <branch-name>
|
||||
```
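
A minimal end-to-end sketch of Steps 1–3, assuming the documented defaults (`development` base, the default protected-branch list); it only uses safe `-d` deletes:

```bash
BASE="${GIT_DEFAULT_BASE:-development}"
PROTECTED="main|master|development|staging|production"

# Local branches already merged into the base, excluding protected ones
for branch in $(git branch --merged "$BASE" --format='%(refname:short)' \
                | grep -Ev "^(${PROTECTED})$"); do
  git branch -d "$branch"                # safe delete: refuses unmerged work
  # git push origin --delete "$branch"   # optional: also remove the remote copy
done
```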
|
||||
|
||||
### Step 4: Report
|
||||
|
||||
```
|
||||
Cleanup complete:
|
||||
|
||||
Deleted local: 3 branches
|
||||
Deleted remote: 2 branches
|
||||
Skipped: 0 branches
|
||||
|
||||
Remaining local branches:
|
||||
- main
|
||||
- development
|
||||
- feat/current-work (not merged)
|
||||
```
|
||||
|
||||
## Environment Variables
|
||||
|
||||
| Variable | Default | Description |
|
||||
|----------|---------|-------------|
|
||||
| `GIT_DEFAULT_BASE` | `development` | Base branch for merge detection |
|
||||
| `GIT_PROTECTED_BRANCHES` | `main,master,development,staging,production` | Never delete these |
|
||||
| `GIT_AUTO_DELETE_REMOTE` | `false` | Auto-delete remote branches |
|
||||
|
||||
## Safety
|
||||
|
||||
- Never deletes protected branches
|
||||
- Warns about unmerged branches
|
||||
- Confirms before deleting remote branches
|
||||
- Uses `-d` (safe delete) not `-D` (force delete)
|
||||
|
||||
## Output
|
||||
|
||||
On success:
|
||||
```
|
||||
Cleaned up:
|
||||
Local: 3 branches deleted
|
||||
Remote: 2 branches deleted
|
||||
|
||||
Repository is tidy!
|
||||
```
|
||||
96
plugins/git-flow/commands/branch-start.md
Normal file
@@ -0,0 +1,96 @@
|
||||
# /branch-start - Start New Branch
|
||||
|
||||
## Purpose
|
||||
|
||||
Create a new feature/fix/chore branch with consistent naming conventions.
|
||||
|
||||
## Usage
|
||||
|
||||
```
|
||||
/branch-start [description]
|
||||
```
|
||||
|
||||
## Behavior
|
||||
|
||||
### Step 1: Determine Branch Type
|
||||
|
||||
```
|
||||
What type of change is this?
|
||||
1. feat - New feature
|
||||
2. fix - Bug fix
|
||||
3. chore - Maintenance task
|
||||
4. docs - Documentation
|
||||
5. refactor - Code refactoring
|
||||
```
|
||||
|
||||
### Step 2: Get Description
|
||||
|
||||
If not provided, ask:
|
||||
|
||||
```
|
||||
Brief description (2-4 words):
|
||||
> add user authentication
|
||||
```
|
||||
|
||||
### Step 3: Generate Branch Name
|
||||
|
||||
Convert to kebab-case:
|
||||
- `feat/add-user-authentication`
|
||||
- `fix/login-timeout-error`
|
||||
- `chore/update-dependencies`
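
A minimal sketch of the conversion using plain POSIX tools; the type and description values are illustrative:

```bash
TYPE="feat"
DESC="Add User Authentication"

# Lowercase, turn runs of non-alphanumerics into hyphens, trim stray hyphens,
# and keep the slug short enough to stay under the 50-character limit
SLUG=$(printf '%s' "$DESC" \
  | tr '[:upper:]' '[:lower:]' \
  | sed -E 's/[^a-z0-9]+/-/g; s/^-+|-+$//g' \
  | cut -c1-40)

echo "${TYPE}/${SLUG}"   # -> feat/add-user-authentication
```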
|
||||
|
||||
### Step 4: Create Branch
|
||||
|
||||
```bash
|
||||
# Ensure base branch is up-to-date
|
||||
git checkout <base-branch>
|
||||
git pull origin <base-branch>
|
||||
|
||||
# Create and switch to new branch
|
||||
git checkout -b <new-branch>
|
||||
```
|
||||
|
||||
### Step 5: Confirm
|
||||
|
||||
```
|
||||
Created branch: feat/add-user-authentication
|
||||
Based on: development (abc1234)
|
||||
|
||||
Ready to start coding!
|
||||
```
|
||||
|
||||
## Environment Variables
|
||||
|
||||
| Variable | Default | Description |
|
||||
|----------|---------|-------------|
|
||||
| `GIT_DEFAULT_BASE` | `development` | Branch to create from |
|
||||
| `GIT_BRANCH_PREFIX` | `true` | Use type/ prefix |
|
||||
|
||||
## Naming Rules
|
||||
|
||||
- Lowercase only
|
||||
- Hyphens for spaces
|
||||
- No special characters
|
||||
- Max 50 characters
|
||||
|
||||
## Validation
|
||||
|
||||
```
|
||||
Branch name validation:
|
||||
✓ Lowercase
|
||||
✓ Valid prefix (feat/)
|
||||
✓ Descriptive (3+ words recommended)
|
||||
✗ Too long (52 chars, max 50)
|
||||
|
||||
Suggested: feat/add-user-auth
|
||||
Use this instead? (y/n)
|
||||
```
|
||||
|
||||
## Output
|
||||
|
||||
On success:
|
||||
```
|
||||
Branch: feat/add-user-authentication
|
||||
Base: development @ abc1234
|
||||
Status: Ready for development
|
||||
```
|
||||
83
plugins/git-flow/commands/commit-merge.md
Normal file
@@ -0,0 +1,83 @@
|
||||
# /commit-merge - Commit and Merge
|
||||
|
||||
## Purpose
|
||||
|
||||
Commit current changes, then merge the current branch into a target branch.
|
||||
|
||||
## Behavior
|
||||
|
||||
### Step 1: Run /commit
|
||||
|
||||
Execute the standard commit workflow.
|
||||
|
||||
### Step 2: Identify Target Branch
|
||||
|
||||
Check environment or ask:
|
||||
|
||||
```
|
||||
Merge into which branch?
|
||||
1. development (Recommended - GIT_DEFAULT_BASE)
|
||||
2. main
|
||||
3. Other: ____
|
||||
```
|
||||
|
||||
### Step 3: Merge Strategy
|
||||
|
||||
```
|
||||
How should I merge?
|
||||
1. Merge commit (preserves history)
|
||||
2. Squash and merge (single commit)
|
||||
3. Rebase (linear history)
|
||||
```
|
||||
|
||||
### Step 4: Execute Merge
|
||||
|
||||
```bash
|
||||
# Switch to target
|
||||
git checkout <target>
|
||||
|
||||
# Pull latest
|
||||
git pull origin <target>
|
||||
|
||||
# Merge feature branch
|
||||
git merge <feature-branch> [--squash] [--no-ff]
|
||||
|
||||
# Push
|
||||
git push origin <target>
|
||||
```
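
For reference, a sketch of how the Step 3 choice maps onto git commands, assuming `development` as the target and the illustrative branch `feat/password-reset`:

```bash
git checkout development && git pull origin development

# 1. Merge commit (preserves history)
git merge --no-ff feat/password-reset

# 2. Squash and merge (single commit; requires an explicit commit afterwards)
# git merge --squash feat/password-reset && git commit

# 3. Rebase for linear history (rebase the feature branch, then fast-forward)
# git checkout feat/password-reset && git rebase development
# git checkout development && git merge --ff-only feat/password-reset

git push origin development
```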
|
||||
|
||||
### Step 5: Cleanup (Optional)
|
||||
|
||||
```
|
||||
Merge complete. Delete the feature branch?
|
||||
1. Yes, delete local and remote (Recommended)
|
||||
2. Delete local only
|
||||
3. Keep the branch
|
||||
```
|
||||
|
||||
## Environment Variables
|
||||
|
||||
| Variable | Default | Description |
|
||||
|----------|---------|-------------|
|
||||
| `GIT_DEFAULT_BASE` | `development` | Default branch to merge into |
|
||||
| `GIT_MERGE_STRATEGY` | `merge` | Default merge strategy |
|
||||
| `GIT_AUTO_DELETE_MERGED` | `true` | Auto-delete merged branches |
|
||||
|
||||
## Safety Checks
|
||||
|
||||
- Verify target branch exists
|
||||
- Check for uncommitted changes before switching
|
||||
- Ensure merge doesn't conflict (preview first)
|
||||
|
||||
## Output
|
||||
|
||||
On success:
|
||||
```
|
||||
Committed: abc1234
|
||||
feat(auth): add password reset functionality
|
||||
|
||||
Merged feat/password-reset → development
|
||||
Deleted branch: feat/password-reset
|
||||
|
||||
development is now at: def5678
|
||||
```
|
||||
57
plugins/git-flow/commands/commit-push.md
Normal file
@@ -0,0 +1,57 @@
|
||||
# /commit-push - Commit and Push
|
||||
|
||||
## Purpose
|
||||
|
||||
Create a commit and push to the remote repository in one operation.
|
||||
|
||||
## Behavior
|
||||
|
||||
### Step 1: Run /commit
|
||||
|
||||
Execute the standard commit workflow (see commit.md).
|
||||
|
||||
### Step 2: Push to Remote
|
||||
|
||||
After a successful commit (see the sketch after these steps):
|
||||
|
||||
1. Check if branch has upstream tracking
|
||||
2. If no upstream, set it: `git push -u origin <branch>`
|
||||
3. If upstream exists: `git push`
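
A minimal sketch of that upstream check in a POSIX shell:

```bash
branch=$(git rev-parse --abbrev-ref HEAD)

# Does the current branch already track a remote branch?
if git rev-parse --abbrev-ref '@{upstream}' >/dev/null 2>&1; then
  git push
else
  git push -u origin "$branch"
fi
```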
|
||||
|
||||
### Step 3: Handle Conflicts
|
||||
|
||||
If push fails due to diverged history:
|
||||
|
||||
```
|
||||
Remote has changes not in your local branch.
|
||||
|
||||
Options:
|
||||
1. Pull and rebase, then push (Recommended)
|
||||
2. Pull and merge, then push
|
||||
3. Force push (⚠️ destructive)
|
||||
4. Cancel and review manually
|
||||
```
|
||||
|
||||
## Environment Variables
|
||||
|
||||
| Variable | Default | Description |
|
||||
|----------|---------|-------------|
|
||||
| `GIT_AUTO_PUSH` | `true` | Auto-push after commit |
|
||||
| `GIT_PUSH_STRATEGY` | `rebase` | How to handle diverged branches |
|
||||
|
||||
## Safety Checks
|
||||
|
||||
- **Protected branches**: Warn before pushing to main/master/production
|
||||
- **Force push**: Require explicit confirmation
|
||||
- **No tracking**: Ask before creating new remote branch
|
||||
|
||||
## Output
|
||||
|
||||
On success:
|
||||
```
|
||||
Committed: abc1234
|
||||
feat(auth): add password reset functionality
|
||||
|
||||
Pushed to: origin/feat/password-reset
|
||||
Remote URL: https://github.com/user/repo
|
||||
```
|
||||
79
plugins/git-flow/commands/commit-sync.md
Normal file
@@ -0,0 +1,79 @@
|
||||
# /commit-sync - Commit, Push, and Sync
|
||||
|
||||
## Purpose
|
||||
|
||||
Full sync operation: commit local changes, push to remote, and sync with upstream/base branch.
|
||||
|
||||
## Behavior
|
||||
|
||||
### Step 1: Run /commit
|
||||
|
||||
Execute the standard commit workflow.
|
||||
|
||||
### Step 2: Push to Remote
|
||||
|
||||
Push committed changes to remote branch.
|
||||
|
||||
### Step 3: Sync with Base
|
||||
|
||||
Pull latest from base branch and rebase/merge:
|
||||
|
||||
```bash
|
||||
# Fetch all
|
||||
git fetch --all
|
||||
|
||||
# Rebase on base branch
|
||||
git rebase origin/<base-branch>
|
||||
|
||||
# Push again (if rebased)
|
||||
git push --force-with-lease
|
||||
```
|
||||
|
||||
### Step 4: Report Status
|
||||
|
||||
```
|
||||
Sync complete:
|
||||
|
||||
Local: feat/password-reset @ abc1234
|
||||
Remote: origin/feat/password-reset @ abc1234
|
||||
Base: development @ xyz7890 (synced)
|
||||
|
||||
Your branch is up-to-date with development.
|
||||
No conflicts detected.
|
||||
```
|
||||
|
||||
## Environment Variables
|
||||
|
||||
| Variable | Default | Description |
|
||||
|----------|---------|-------------|
|
||||
| `GIT_DEFAULT_BASE` | `development` | Branch to sync with |
|
||||
| `GIT_SYNC_STRATEGY` | `rebase` | How to incorporate upstream changes |
|
||||
|
||||
## Conflict Handling
|
||||
|
||||
If conflicts occur during rebase:
|
||||
|
||||
```
|
||||
Conflicts detected while syncing with development.
|
||||
|
||||
Conflicting files:
|
||||
- src/auth/login.ts
|
||||
- src/auth/types.ts
|
||||
|
||||
Options:
|
||||
1. Open conflict resolution (I'll guide you)
|
||||
2. Abort sync (keep local state)
|
||||
3. Accept all theirs (⚠️ loses your changes in conflicts)
|
||||
4. Accept all ours (⚠️ ignores upstream in conflicts)
|
||||
```
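
Behind options 1 and 2 are the standard rebase escape hatches; a brief sketch:

```bash
# Option 1: resolve each conflicted file, stage it, and continue the rebase
git add src/auth/login.ts src/auth/types.ts
git rebase --continue

# Option 2: abort and return the branch to its pre-sync state
# git rebase --abort
```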
|
||||
|
||||
## Output
|
||||
|
||||
On success:
|
||||
```
|
||||
Committed: abc1234
|
||||
Pushed to: origin/feat/password-reset
|
||||
Synced with: development (xyz7890)
|
||||
|
||||
Status: Clean, up-to-date
|
||||
```
|
||||
117
plugins/git-flow/commands/commit.md
Normal file
@@ -0,0 +1,117 @@
|
||||
# /commit - Smart Commit
|
||||
|
||||
## Purpose
|
||||
|
||||
Create a git commit with an auto-generated conventional commit message based on staged changes.
|
||||
|
||||
## Behavior
|
||||
|
||||
### Step 1: Analyze Changes
|
||||
|
||||
1. Run `git status` to see staged and unstaged changes
|
||||
2. Run `git diff --staged` to examine staged changes
|
||||
3. If nothing staged, prompt user to stage changes
|
||||
|
||||
### Step 2: Generate Commit Message
|
||||
|
||||
Analyze the changes and generate a conventional commit message:
|
||||
|
||||
```
|
||||
<type>(<scope>): <description>
|
||||
|
||||
[optional body]
|
||||
|
||||
[optional footer]
|
||||
```
|
||||
|
||||
**Types:**
|
||||
- `feat`: New feature
|
||||
- `fix`: Bug fix
|
||||
- `docs`: Documentation only
|
||||
- `style`: Formatting, missing semicolons, etc.
|
||||
- `refactor`: Code change that neither fixes a bug nor adds a feature
|
||||
- `perf`: Performance improvement
|
||||
- `test`: Adding/updating tests
|
||||
- `chore`: Maintenance tasks
|
||||
- `build`: Build system or external dependencies
|
||||
- `ci`: CI configuration
|
||||
|
||||
**Scope:** Determined from changed files (e.g., `auth`, `api`, `ui`)
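
A sketch of one way to derive the scope, assuming it is taken from the most common top-level directory under `src/` among the staged files (an assumption, not a fixed rule):

```bash
# Most frequent first path component under src/ among staged files
git diff --staged --name-only \
  | sed -nE 's|^src/([^/]+)/.*|\1|p' \
  | sort | uniq -c | sort -rn \
  | awk 'NR==1 {print $2}'
```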
|
||||
|
||||
### Step 3: Confirm or Edit
|
||||
|
||||
Present the generated message:
|
||||
|
||||
```
|
||||
Proposed commit message:
|
||||
───────────────────────
|
||||
feat(auth): add password reset functionality
|
||||
|
||||
Implement forgot password flow with email verification.
|
||||
Includes rate limiting and token expiration.
|
||||
───────────────────────
|
||||
|
||||
Options:
|
||||
1. Use this message (Recommended)
|
||||
2. Edit the message
|
||||
3. Regenerate with different focus
|
||||
4. Cancel
|
||||
```
|
||||
|
||||
### Step 4: Execute Commit
|
||||
|
||||
If confirmed, run:
|
||||
|
||||
```bash
|
||||
git commit -m "$(cat <<'EOF'
|
||||
<message>
|
||||
|
||||
Co-Authored-By: Claude <noreply@anthropic.com>
|
||||
EOF
|
||||
)"
|
||||
```
|
||||
|
||||
## Environment Variables
|
||||
|
||||
| Variable | Default | Description |
|
||||
|----------|---------|-------------|
|
||||
| `GIT_COMMIT_STYLE` | `conventional` | Message style (conventional, simple, detailed) |
|
||||
| `GIT_SIGN_COMMITS` | `false` | Use GPG signing |
|
||||
| `GIT_CO_AUTHOR` | `true` | Include Claude co-author footer |
|
||||
|
||||
## Edge Cases
|
||||
|
||||
### No Changes Staged
|
||||
|
||||
```
|
||||
No changes staged for commit.
|
||||
|
||||
Would you like to:
|
||||
1. Stage all changes (`git add -A`)
|
||||
2. Stage specific files (I'll help you choose)
|
||||
3. Cancel
|
||||
```
|
||||
|
||||
### Untracked Files
|
||||
|
||||
```
|
||||
Found 3 untracked files:
|
||||
- src/new-feature.ts
|
||||
- tests/new-feature.test.ts
|
||||
- docs/new-feature.md
|
||||
|
||||
Include these in the commit?
|
||||
1. Yes, stage all (Recommended)
|
||||
2. Let me pick which ones
|
||||
3. No, commit only tracked files
|
||||
```
|
||||
|
||||
## Output
|
||||
|
||||
On success:
|
||||
```
|
||||
Committed: abc1234
|
||||
feat(auth): add password reset functionality
|
||||
|
||||
Files: 3 changed, 45 insertions(+), 12 deletions(-)
|
||||
```
|
||||
100
plugins/git-flow/commands/git-config.md
Normal file
@@ -0,0 +1,100 @@
|
||||
# /git-config - Configure git-flow
|
||||
|
||||
## Purpose
|
||||
|
||||
Configure git-flow settings for the current project.
|
||||
|
||||
## Behavior
|
||||
|
||||
### Interactive Configuration
|
||||
|
||||
```
|
||||
git-flow Configuration
|
||||
═══════════════════════════════════════════
|
||||
|
||||
Current settings:
|
||||
GIT_WORKFLOW_STYLE: feature-branch
|
||||
GIT_DEFAULT_BASE: development
|
||||
GIT_AUTO_DELETE_MERGED: true
|
||||
GIT_AUTO_PUSH: false
|
||||
|
||||
What would you like to configure?
|
||||
1. Workflow style
|
||||
2. Default base branch
|
||||
3. Auto-delete merged branches
|
||||
4. Auto-push after commit
|
||||
5. Protected branches
|
||||
6. View all settings
|
||||
7. Reset to defaults
|
||||
```
|
||||
|
||||
### Setting: Workflow Style
|
||||
|
||||
```
|
||||
Choose your workflow style:
|
||||
|
||||
1. simple
|
||||
- Direct commits to development
|
||||
- No feature branches required
|
||||
- Good for solo projects
|
||||
|
||||
2. feature-branch (Recommended)
|
||||
- Feature branches from development
|
||||
- Merge when complete
|
||||
- Good for small teams
|
||||
|
||||
3. pr-required
|
||||
- Feature branches from development
|
||||
- Requires PR for merge
|
||||
- Good for code review workflows
|
||||
|
||||
4. trunk-based
|
||||
- Short-lived branches
|
||||
- Frequent integration
|
||||
- Good for CI/CD heavy workflows
|
||||
```
|
||||
|
||||
### Setting: Protected Branches
|
||||
|
||||
```
|
||||
Protected branches (comma-separated):
|
||||
Current: main, master, development, staging, production
|
||||
|
||||
These branches will:
|
||||
- Never be auto-deleted
|
||||
- Require confirmation before direct commits
|
||||
- Warn before force push
|
||||
```
|
||||
|
||||
## Environment Variables
|
||||
|
||||
| Variable | Default | Options |
|
||||
|----------|---------|---------|
|
||||
| `GIT_WORKFLOW_STYLE` | `feature-branch` | simple, feature-branch, pr-required, trunk-based |
|
||||
| `GIT_DEFAULT_BASE` | `development` | Any branch name |
|
||||
| `GIT_AUTO_DELETE_MERGED` | `true` | true, false |
|
||||
| `GIT_AUTO_PUSH` | `false` | true, false |
|
||||
| `GIT_PROTECTED_BRANCHES` | `main,master,development,staging,production` | Comma-separated |
|
||||
| `GIT_COMMIT_STYLE` | `conventional` | conventional, simple, detailed |
|
||||
| `GIT_CO_AUTHOR` | `true` | true, false |
|
||||
|
||||
## Storage
|
||||
|
||||
Settings are stored in:
|
||||
- Project: `.env` or `.claude/settings.json`
|
||||
- User: `~/.config/claude/git-flow.env`
|
||||
|
||||
Project settings override user settings.
|
||||
|
||||
## Output
|
||||
|
||||
After configuration:
|
||||
```
|
||||
Configuration saved!
|
||||
|
||||
GIT_WORKFLOW_STYLE=feature-branch
|
||||
GIT_DEFAULT_BASE=development
|
||||
GIT_AUTO_DELETE_MERGED=true
|
||||
|
||||
These settings will be used for all git-flow commands.
|
||||
```
|
||||
72
plugins/git-flow/commands/git-status.md
Normal file
@@ -0,0 +1,72 @@
|
||||
# /git-status - Enhanced Status
|
||||
|
||||
## Purpose
|
||||
|
||||
Show comprehensive git status with recommendations and insights.
|
||||
|
||||
## Behavior
|
||||
|
||||
### Output Format
|
||||
|
||||
```
|
||||
═══════════════════════════════════════════
|
||||
Git Status: <repo-name>
|
||||
═══════════════════════════════════════════
|
||||
|
||||
Branch: feat/password-reset
|
||||
Base: development (3 commits ahead, 0 behind)
|
||||
Remote: origin/feat/password-reset (synced)
|
||||
|
||||
─── Changes ───────────────────────────────
|
||||
|
||||
Staged (ready to commit):
|
||||
✓ src/auth/reset.ts (modified)
|
||||
✓ src/auth/types.ts (modified)
|
||||
|
||||
Unstaged:
|
||||
• tests/auth.test.ts (modified)
|
||||
• src/utils/email.ts (new file, untracked)
|
||||
|
||||
─── Recommendations ───────────────────────
|
||||
|
||||
1. Stage test file: git add tests/auth.test.ts
|
||||
2. Consider adding new file: git add src/utils/email.ts
|
||||
3. Ready to commit with 2 staged files
|
||||
|
||||
─── Quick Actions ─────────────────────────
|
||||
|
||||
• /commit - Commit staged changes
|
||||
• /commit-push - Commit and push
|
||||
• /commit-sync - Full sync with development
|
||||
|
||||
═══════════════════════════════════════════
|
||||
```
|
||||
|
||||
## Analysis Provided
|
||||
|
||||
### Branch Health
|
||||
- Commits ahead/behind base branch
|
||||
- Sync status with remote
|
||||
- Age of branch
|
||||
|
||||
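The counts and sync checks listed above map onto plain git commands. A minimal sketch, assuming `development` is the configured base branch:

```bash
# Behind / ahead of the base branch (left = behind, right = ahead)
git rev-list --left-right --count development...HEAD

# Drift between the local branch and its upstream (remote sync status)
git rev-list --left-right --count @{upstream}...HEAD

# Rough branch age: relative date of the oldest commit unique to this branch
git log --reverse --format=%cr development..HEAD | head -n 1
```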
### Change Categories
|
||||
- Staged (ready to commit)
|
||||
- Modified (not staged)
|
||||
- Untracked (new files)
|
||||
- Deleted
|
||||
- Renamed
|
||||
|
||||
### Recommendations
|
||||
- What to stage
|
||||
- What to ignore
|
||||
- When to commit
|
||||
- When to sync
|
||||
|
||||
### Warnings
|
||||
- Large number of changes (consider splitting)
|
||||
- Old branch (consider rebasing)
|
||||
- Conflicts with upstream
|
||||
|
||||
## Output
|
||||
|
||||
Always produces the formatted status report with context-aware recommendations.
|
||||
@@ -0,0 +1,183 @@
|
||||
# Git Branching Strategies
|
||||
|
||||
## Supported Workflow Styles
|
||||
|
||||
### 1. Simple
|
||||
|
||||
```
|
||||
main ─────●─────●─────●─────●─────●
|
||||
↑ ↑ ↑ ↑ ↑
|
||||
commit commit commit commit commit
|
||||
```
|
||||
|
||||
**Best for:**
|
||||
- Solo projects
|
||||
- Small scripts/utilities
|
||||
- Documentation repos
|
||||
|
||||
**Rules:**
|
||||
- Direct commits to main/development
|
||||
- No feature branches required
|
||||
- Linear history
|
||||
|
||||
### 2. Feature Branch (Default)
|
||||
|
||||
```
|
||||
main ─────────────────●───────────●───────────
|
||||
↑ ↑
|
||||
development ────●────●────●────●────●────●────
|
||||
↑ ↑ ↑ ↑
|
||||
feat/a ─────●───●────┘ │ │
|
||||
│ │
|
||||
feat/b ──────────●────●───┘ │
|
||||
│
|
||||
fix/c ────────────────●────●───┘
|
||||
```
|
||||
|
||||
**Best for:**
|
||||
- Small teams (2-5 developers)
|
||||
- Projects without formal review process
|
||||
- Rapid development cycles
|
||||
|
||||
**Rules:**
|
||||
- Feature branches from development
|
||||
- Merge when complete
|
||||
- Delete branches after merge
|
||||
- development → main for releases
|
||||
|
||||
### 3. PR Required
|
||||
|
||||
```
|
||||
main ─────────────────────────────●───────────
|
||||
↑
|
||||
development ────●────●────●────●────●────●────
|
||||
↑ ↑ ↑ ↑
|
||||
PR PR PR PR
|
||||
↑ ↑ ↑ ↑
|
||||
feat/a ─────●───● │ │ │
|
||||
│ │ │
|
||||
feat/b ──────────●───● │ │
|
||||
│ │
|
||||
feat/c ───────────────●───● │
|
||||
│
|
||||
fix/d ────────────────────●────●
|
||||
```
|
||||
|
||||
**Best for:**
|
||||
- Teams with code review requirements
|
||||
- Open source projects
|
||||
- Projects with CI/CD gates
|
||||
|
||||
**Rules:**
|
||||
- All changes via pull request
|
||||
- At least one approval required
|
||||
- CI must pass before merge
|
||||
- Squash commits on merge
|
||||
|
||||
### 4. Trunk-Based
|
||||
|
||||
```
|
||||
main ────●────●────●────●────●────●────●────●
|
||||
↑ ↑ ↑ ↑ ↑ ↑ ↑ ↑
|
||||
│ │ │ │ │ │ │ │
|
||||
short branches (< 1 day)
|
||||
```
|
||||
|
||||
**Best for:**
|
||||
- CI/CD heavy workflows
|
||||
- Experienced teams
|
||||
- High deployment frequency
|
||||
|
||||
**Rules:**
|
||||
- Very short-lived branches (hours, not days)
|
||||
- Frequent integration to main
|
||||
- Feature flags for incomplete work
|
||||
- Continuous deployment
|
||||
|
||||
## Branch Naming Convention
|
||||
|
||||
```
|
||||
<type>/<description>
|
||||
```
|
||||
|
||||
### Types
|
||||
|
||||
| Type | Purpose | Example |
|
||||
|------|---------|---------|
|
||||
| `feat` | New feature | `feat/user-authentication` |
|
||||
| `fix` | Bug fix | `fix/login-timeout` |
|
||||
| `chore` | Maintenance | `chore/update-deps` |
|
||||
| `docs` | Documentation | `docs/api-reference` |
|
||||
| `refactor` | Code restructure | `refactor/auth-module` |
|
||||
| `test` | Test additions | `test/auth-coverage` |
|
||||
| `perf` | Performance | `perf/query-optimization` |
|
||||
|
||||
### Naming Rules
|
||||
|
||||
1. Lowercase only
|
||||
2. Hyphens for word separation
|
||||
3. No special characters
|
||||
4. Descriptive (2-4 words)
|
||||
5. Max 50 characters
|
||||
|
||||
### Examples
|
||||
|
||||
```
|
||||
✓ feat/add-password-reset
|
||||
✓ fix/null-pointer-login
|
||||
✓ chore/upgrade-typescript-5
|
||||
|
||||
✗ Feature/Add_Password_Reset (wrong case, underscores)
|
||||
✗ fix-bug (too vague)
|
||||
✗ my-branch (no type prefix)
|
||||
```
|
||||
|
||||
## Protected Branches
|
||||
|
||||
Default protected branches:
|
||||
- `main` / `master`
|
||||
- `development` / `develop`
|
||||
- `staging`
|
||||
- `production`
|
||||
|
||||
Protection rules:
|
||||
- No direct commits
|
||||
- No force push
|
||||
- Require PR for changes
|
||||
- No deletion
|
||||
|
||||
## Commit Message Convention
|
||||
|
||||
```
|
||||
<type>(<scope>): <description>
|
||||
|
||||
[optional body]
|
||||
|
||||
[optional footer]
|
||||
```
|
||||
|
||||
### Examples
|
||||
|
||||
```
|
||||
feat(auth): add password reset flow
|
||||
|
||||
Implement forgot password functionality with email verification.
|
||||
Includes rate limiting (5 attempts/hour) and 24h token expiration.
|
||||
|
||||
Closes #123
|
||||
```
|
||||
|
||||
```
|
||||
fix(ui): resolve button alignment on mobile
|
||||
|
||||
The submit button was misaligned on screens < 768px.
|
||||
Added responsive flex rules.
|
||||
```
|
||||
|
||||
```
|
||||
chore(deps): update dependencies
|
||||
|
||||
- typescript 5.3 → 5.4
|
||||
- react 18.2 → 18.3
|
||||
- node 18 → 20 (LTS)
|
||||
```
|
||||
71
plugins/pr-review/.claude-plugin/plugin.json
Normal file
@@ -0,0 +1,71 @@
|
||||
{
|
||||
"name": "pr-review",
|
||||
"version": "1.0.0",
|
||||
"description": "Multi-agent pull request review with confidence scoring and actionable feedback",
|
||||
"author": {
|
||||
"name": "Leo Miranda",
|
||||
"email": "leobmiranda@gmail.com"
|
||||
},
|
||||
"homepage": "https://gitea.hotserv.cloud/personal-projects/support-claude-mktplace/src/branch/main/plugins/pr-review/README.md",
|
||||
"repository": "https://gitea.hotserv.cloud/personal-projects/support-claude-mktplace.git",
|
||||
"license": "MIT",
|
||||
"keywords": [
|
||||
"pull-request",
|
||||
"code-review",
|
||||
"security",
|
||||
"performance",
|
||||
"multi-agent"
|
||||
],
|
||||
"commands": [
|
||||
{
|
||||
"name": "pr-review",
|
||||
"description": "Full multi-agent PR review (security, performance, maintainability, tests)",
|
||||
"file": "commands/pr-review.md"
|
||||
},
|
||||
{
|
||||
"name": "pr-summary",
|
||||
"description": "Quick summary of PR changes without full review",
|
||||
"file": "commands/pr-summary.md"
|
||||
},
|
||||
{
|
||||
"name": "pr-findings",
|
||||
"description": "List and filter review findings by category or confidence",
|
||||
"file": "commands/pr-findings.md"
|
||||
}
|
||||
],
|
||||
"agents": [
|
||||
{
|
||||
"name": "coordinator",
|
||||
"description": "Orchestrates the multi-agent review process",
|
||||
"file": "agents/coordinator.md"
|
||||
},
|
||||
{
|
||||
"name": "security-reviewer",
|
||||
"description": "Analyzes code for security vulnerabilities",
|
||||
"file": "agents/security-reviewer.md"
|
||||
},
|
||||
{
|
||||
"name": "performance-analyst",
|
||||
"description": "Identifies performance issues and optimization opportunities",
|
||||
"file": "agents/performance-analyst.md"
|
||||
},
|
||||
{
|
||||
"name": "maintainability-auditor",
|
||||
"description": "Reviews code quality, patterns, and maintainability",
|
||||
"file": "agents/maintainability-auditor.md"
|
||||
},
|
||||
{
|
||||
"name": "test-validator",
|
||||
"description": "Validates test coverage and test quality",
|
||||
"file": "agents/test-validator.md"
|
||||
}
|
||||
],
|
||||
"skills": [
|
||||
{
|
||||
"name": "review-patterns",
|
||||
"description": "Code review patterns and confidence scoring rules",
|
||||
"path": "skills/review-patterns"
|
||||
}
|
||||
],
|
||||
"mcpServers": ["gitea"]
|
||||
}
|
||||
9
plugins/pr-review/.mcp.json
Normal file
@@ -0,0 +1,9 @@
|
||||
{
|
||||
"mcpServers": {
|
||||
"gitea": {
|
||||
"command": "${CLAUDE_PLUGIN_ROOT}/mcp-servers/gitea/.venv/bin/python",
|
||||
"args": ["-m", "mcp_server.server"],
|
||||
"cwd": "${CLAUDE_PLUGIN_ROOT}/mcp-servers/gitea"
|
||||
}
|
||||
}
|
||||
}
|
||||
126
plugins/pr-review/README.md
Normal file
@@ -0,0 +1,126 @@
|
||||
# pr-review
|
||||
|
||||
Multi-agent pull request review with confidence scoring and actionable feedback.
|
||||
|
||||
## Overview
|
||||
|
||||
pr-review conducts comprehensive code reviews using specialized agents for security, performance, maintainability, and test coverage. Each finding includes a confidence score to reduce noise and focus on real issues.
|
||||
|
||||
## Commands
|
||||
|
||||
| Command | Description |
|
||||
|---------|-------------|
|
||||
| `/pr-review <pr#>` | Full multi-agent review |
|
||||
| `/pr-summary <pr#>` | Quick summary without full review |
|
||||
| `/pr-findings <pr#>` | Filter findings by category/confidence |
|
||||
|
||||
## Review Agents
|
||||
|
||||
| Agent | Focus |
|
||||
|-------|-------|
|
||||
| **Security Reviewer** | Injections, auth, data exposure, crypto |
|
||||
| **Performance Analyst** | N+1 queries, complexity, memory, caching |
|
||||
| **Maintainability Auditor** | Complexity, duplication, naming, coupling |
|
||||
| **Test Validator** | Coverage, test quality, flaky tests |
|
||||
|
||||
## Confidence Scoring
|
||||
|
||||
Findings are scored 0.0 - 1.0:
|
||||
|
||||
| Range | Label | Action |
|
||||
|-------|-------|--------|
|
||||
| 0.9 - 1.0 | HIGH | Must address |
|
||||
| 0.7 - 0.89 | MEDIUM | Should address |
|
||||
| 0.5 - 0.69 | LOW | Consider addressing |
|
||||
| < 0.5 | (suppressed) | Not reported |
|
||||
|
||||
## Installation
|
||||
|
||||
Add to your project's `.claude/settings.json`:
|
||||
|
||||
```json
|
||||
{
|
||||
"plugins": ["pr-review"]
|
||||
}
|
||||
```
|
||||
|
||||
Requires Gitea MCP server configuration.
|
||||
|
||||
## Configuration
|
||||
|
||||
```bash
|
||||
# Minimum confidence to report (default: 0.5)
|
||||
PR_REVIEW_CONFIDENCE_THRESHOLD=0.5
|
||||
|
||||
# Auto-submit review to Gitea (default: false)
|
||||
PR_REVIEW_AUTO_SUBMIT=false
|
||||
```
|
||||
|
||||
## Usage Examples
|
||||
|
||||
### Full Review
|
||||
|
||||
```
|
||||
/pr-review 123
|
||||
|
||||
═══════════════════════════════════════════════════
|
||||
PR Review Report: #123
|
||||
═══════════════════════════════════════════════════
|
||||
|
||||
Summary:
|
||||
Files changed: 12
|
||||
Lines: +234 / -45
|
||||
|
||||
Findings: 8 total
|
||||
🔴 Critical: 1
|
||||
🟠 Major: 2
|
||||
🟡 Minor: 3
|
||||
💡 Suggestions: 2
|
||||
|
||||
[Detailed findings...]
|
||||
|
||||
VERDICT: REQUEST_CHANGES
|
||||
═══════════════════════════════════════════════════
|
||||
```
|
||||
|
||||
### Filter Findings
|
||||
|
||||
```
|
||||
/pr-findings 123 --category security
|
||||
|
||||
# Shows only security-related findings
|
||||
```
|
||||
|
||||
### Quick Summary
|
||||
|
||||
```
|
||||
/pr-summary 123
|
||||
|
||||
# Shows change overview without full analysis
|
||||
```
|
||||
|
||||
## Output
|
||||
|
||||
Review reports include:
|
||||
- Summary statistics
|
||||
- Findings grouped by severity
|
||||
- Code snippets with context
|
||||
- Suggested fixes
|
||||
- Overall verdict (APPROVE/COMMENT/REQUEST_CHANGES)
|
||||
|
||||
## Verdict Logic
|
||||
|
||||
| Condition | Verdict |
|
||||
|-----------|---------|
|
||||
| Any critical finding | REQUEST_CHANGES |
|
||||
| 2+ major findings | REQUEST_CHANGES |
|
||||
| Only minor/suggestions | COMMENT |
|
||||
| No significant findings | APPROVE |
|
||||
|
||||
## Integration
|
||||
|
||||
For CLAUDE.md integration instructions, see `claude-md-integration.md`.
|
||||
|
||||
## License
|
||||
|
||||
MIT
|
||||
133
plugins/pr-review/agents/coordinator.md
Normal file
@@ -0,0 +1,133 @@
|
||||
# Coordinator Agent
|
||||
|
||||
## Role
|
||||
|
||||
You are the review coordinator that orchestrates the multi-agent PR review process. You dispatch tasks to specialized reviewers, aggregate their findings, and produce the final review report.
|
||||
|
||||
## Responsibilities
|
||||
|
||||
### 1. PR Analysis
|
||||
|
||||
Before dispatching to agents:
|
||||
1. Fetch PR metadata and diff
|
||||
2. Identify changed file types
|
||||
3. Determine which agents are relevant
|
||||
|
||||
### 2. Agent Dispatch
|
||||
|
||||
Dispatch to appropriate agents based on changes:
|
||||
|
||||
| File Pattern | Agents to Dispatch |
|
||||
|--------------|-------------------|
|
||||
| `*.ts`, `*.js` | Security, Performance, Maintainability |
|
||||
| `*.test.*`, `*_test.*` | Test Validator |
|
||||
| `*.sql`, `*migration*` | Security (SQL injection) |
|
||||
| `*.css`, `*.scss` | Maintainability only |
|
||||
| `*.md`, `*.txt` | Skip (documentation) |
|
||||
|
||||
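A sketch of this dispatch table as data plus a lookup helper; the regex matching and agent identifiers below are illustrative, not a defined plugin API:

```ts
// First matching rule decides which agents see a file, mirroring the table above.
type Agent = "security" | "performance" | "maintainability" | "tests";

const dispatchRules: Array<{ pattern: RegExp; agents: Agent[] }> = [
  { pattern: /\.(test|spec)\.|_test\./, agents: ["tests"] },
  { pattern: /\.(ts|js)$/, agents: ["security", "performance", "maintainability"] },
  { pattern: /\.sql$|migration/i, agents: ["security"] },
  { pattern: /\.(css|scss)$/, agents: ["maintainability"] },
  { pattern: /\.(md|txt)$/, agents: [] }, // documentation: skip
];

function agentsFor(changedFiles: string[]): Set<Agent> {
  const selected = new Set<Agent>();
  for (const file of changedFiles) {
    const rule = dispatchRules.find((r) => r.pattern.test(file));
    rule?.agents.forEach((agent) => selected.add(agent));
  }
  return selected;
}
```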
### 3. Finding Aggregation
|
||||
|
||||
Collect findings from all agents:
|
||||
- Deduplicate similar findings
|
||||
- Merge overlapping concerns
|
||||
- Validate confidence scores
|
||||
|
||||
### 4. Report Generation
|
||||
|
||||
Produce structured report:
|
||||
1. Summary statistics
|
||||
2. Findings by severity (critical → suggestion)
|
||||
3. Per-finding details
|
||||
4. Overall verdict
|
||||
|
||||
### 5. Verdict Decision
|
||||
|
||||
Determine final verdict:
|
||||
|
||||
| Condition | Verdict |
|
||||
|-----------|---------|
|
||||
| Any critical finding | REQUEST_CHANGES |
|
||||
| 2+ major findings | REQUEST_CHANGES |
|
||||
| Only minor/suggestions | COMMENT |
|
||||
| No significant findings | APPROVE |
|
||||
|
||||
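Read as code, the table above is a short decision function. A sketch (the table does not cover a single major finding on its own, so treating that case as COMMENT is an assumption):

```ts
type Verdict = "APPROVE" | "COMMENT" | "REQUEST_CHANGES";

interface SeverityCounts {
  critical: number;
  major: number;
  minor: number;
  suggestion: number;
}

function decideVerdict(counts: SeverityCounts): Verdict {
  if (counts.critical > 0 || counts.major >= 2) return "REQUEST_CHANGES";
  // A lone major finding is treated as COMMENT here (assumption, not in the table).
  if (counts.major === 1 || counts.minor > 0 || counts.suggestion > 0) return "COMMENT";
  return "APPROVE";
}
```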
## Communication Protocol
|
||||
|
||||
### To Sub-Agents
|
||||
|
||||
```
|
||||
REVIEW_TASK:
|
||||
pr_number: 123
|
||||
files: [list of relevant files]
|
||||
diff: [relevant diff sections]
|
||||
context: [PR description, existing comments]
|
||||
|
||||
EXPECTED_RESPONSE:
|
||||
findings: [
|
||||
{
|
||||
id: string,
|
||||
category: string,
|
||||
severity: critical|major|minor|suggestion,
|
||||
confidence: 0.0-1.0,
|
||||
file: string,
|
||||
line: number,
|
||||
title: string,
|
||||
description: string,
|
||||
fix: string (optional)
|
||||
}
|
||||
]
|
||||
```
|
||||
|
||||
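For reference, the finding shape in EXPECTED_RESPONSE can be written as a type. This is a sketch of the same fields, not an interface the plugin itself declares:

```ts
type Severity = "critical" | "major" | "minor" | "suggestion";

interface Finding {
  id: string;            // e.g. "SEC-001"
  category: string;      // security | performance | maintainability | tests
  severity: Severity;
  confidence: number;    // 0.0 - 1.0
  file: string;
  line: number;
  title: string;
  description: string;
  fix?: string;          // optional suggested fix
}
```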
### Report Template
|
||||
|
||||
```
|
||||
═══════════════════════════════════════════════════
|
||||
PR Review Report: #<number>
|
||||
═══════════════════════════════════════════════════
|
||||
|
||||
Summary:
|
||||
Files changed: <n>
|
||||
Lines: +<added> / -<removed>
|
||||
Agents consulted: <list>
|
||||
|
||||
Findings: <total>
|
||||
🔴 Critical: <n>
|
||||
🟠 Major: <n>
|
||||
🟡 Minor: <n>
|
||||
💡 Suggestions: <n>
|
||||
|
||||
[Findings grouped by severity]
|
||||
|
||||
───────────────────────────────────────────────────
|
||||
VERDICT: <APPROVE|COMMENT|REQUEST_CHANGES>
|
||||
───────────────────────────────────────────────────
|
||||
|
||||
<Justification>
|
||||
```
|
||||
|
||||
## Behavior Guidelines
|
||||
|
||||
### Be Decisive
|
||||
|
||||
Provide a clear verdict with justification. Don't hedge.
|
||||
|
||||
### Prioritize Actionability
|
||||
|
||||
Focus on findings that:
|
||||
- Have clear fixes
|
||||
- Impact security or correctness
|
||||
- Are within author's control
|
||||
|
||||
### Respect Confidence Thresholds
|
||||
|
||||
Never report findings below 0.5 confidence. Be transparent about uncertainty:
|
||||
- 0.9+ → "This is definitely an issue"
|
||||
- 0.7-0.89 → "This is likely an issue"
|
||||
- 0.5-0.69 → "This might be an issue"
|
||||
|
||||
### Avoid Noise
|
||||
|
||||
Don't report:
|
||||
- Style preferences (unless egregious)
|
||||
- Minor naming issues
|
||||
- Theoretical problems with no practical impact
|
||||
99
plugins/pr-review/agents/maintainability-auditor.md
Normal file
@@ -0,0 +1,99 @@
|
||||
# Maintainability Auditor Agent
|
||||
|
||||
## Role
|
||||
|
||||
You are a code quality reviewer that identifies maintainability issues, code smells, and opportunities to improve code clarity and long-term health.
|
||||
|
||||
## Focus Areas
|
||||
|
||||
### 1. Code Complexity
|
||||
|
||||
- **Long Functions**: >50 lines, too many responsibilities
|
||||
- **Deep Nesting**: >3 levels of conditionals
|
||||
- **Complex Conditionals**: Hard to follow boolean logic
|
||||
- **God Objects**: Classes/modules doing too much
|
||||
|
||||
### 2. Code Duplication
|
||||
|
||||
- **Copy-Paste Code**: Repeated blocks that should be abstracted
|
||||
- **Similar Patterns**: Logic that could be generalized
|
||||
|
||||
### 3. Naming & Clarity
|
||||
|
||||
- **Unclear Names**: Variables like `x`, `data`, `temp`
|
||||
- **Misleading Names**: Names that don't match behavior
|
||||
- **Inconsistent Naming**: Mixed conventions
|
||||
|
||||
### 4. Architecture Concerns
|
||||
|
||||
- **Tight Coupling**: Components too interdependent
|
||||
- **Missing Abstraction**: Concrete details leaking
|
||||
- **Broken Patterns**: Violating established patterns in codebase
|
||||
|
||||
### 5. Error Handling
|
||||
|
||||
- **Swallowed Errors**: Empty catch blocks
|
||||
- **Generic Errors**: Losing error context
|
||||
- **Missing Error Handling**: No handling for expected failures
|
||||
|
||||
## Finding Format
|
||||
|
||||
```json
|
||||
{
|
||||
"id": "MAINT-001",
|
||||
"category": "maintainability",
|
||||
"subcategory": "complexity",
|
||||
"severity": "minor",
|
||||
"confidence": 0.75,
|
||||
"file": "src/services/orderProcessor.ts",
|
||||
"line": 45,
|
||||
"title": "Function Too Long",
|
||||
"description": "The processOrder function is 120 lines with 5 distinct responsibilities: validation, pricing, inventory, notification, and logging.",
|
||||
"impact": "Difficult to test, understand, and modify. Changes risk unintended side effects.",
|
||||
"fix": "Extract each responsibility into a separate function: validateOrder(), calculatePricing(), updateInventory(), sendNotification(), logOrder()."
|
||||
}
|
||||
```
|
||||
|
||||
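The fix in the example finding can be sketched as a decomposition skeleton. The types and function bodies below are placeholders; only the shape follows the finding's suggestion:

```ts
// Hypothetical decomposition of the 120-line processOrder from the finding above.
interface OrderItem { sku: string; qty: number }
interface Order { id: string; items: OrderItem[]; customerEmail: string }
interface Pricing { total: number }

function validateOrder(order: Order): void {
  if (order.items.length === 0) throw new Error("Order has no items");
}

function calculatePricing(order: Order): Pricing {
  return { total: order.items.reduce((sum, item) => sum + item.qty, 0) };
}

async function updateInventory(order: Order): Promise<void> {
  /* adjust stock for each item */
}

async function sendNotification(order: Order, pricing: Pricing): Promise<void> {
  /* email order.customerEmail a receipt for pricing.total */
}

function logOrder(order: Order, pricing: Pricing): void {
  console.log(`order ${order.id} total ${pricing.total}`);
}

// Each responsibility is now small and separately testable.
async function processOrder(order: Order): Promise<void> {
  validateOrder(order);
  const pricing = calculatePricing(order);
  await updateInventory(order);
  await sendNotification(order, pricing);
  logOrder(order, pricing);
}
```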
## Severity Guidelines
|
||||
|
||||
| Severity | Criteria |
|
||||
|----------|----------|
|
||||
| Critical | Makes code dangerous to modify |
|
||||
| Major | Significantly impacts readability/maintainability |
|
||||
| Minor | Noticeable but manageable issue |
|
||||
| Suggestion | Nice to have, not blocking |
|
||||
|
||||
## Confidence Calibration
|
||||
|
||||
Maintainability is subjective. Be measured:
|
||||
|
||||
HIGH confidence when:
|
||||
- Clear violation of established patterns
|
||||
- Obvious duplication or complexity
|
||||
- Measurable metrics exceed thresholds
|
||||
|
||||
MEDIUM confidence when:
|
||||
- Judgment call on complexity
|
||||
- Could be intentional design choice
|
||||
- Depends on team conventions
|
||||
|
||||
Suppress when:
|
||||
- Style preference not shared by team
|
||||
- Generated or third-party code
|
||||
- Temporary code with TODO
|
||||
|
||||
## Special Considerations
|
||||
|
||||
### Context Awareness
|
||||
|
||||
Check existing patterns before flagging:
|
||||
- If codebase uses X pattern, don't suggest Y
|
||||
- If similar code exists elsewhere, ensure consistency
|
||||
- Respect team conventions over personal preference
|
||||
|
||||
### Constructive Feedback
|
||||
|
||||
Always provide:
|
||||
- Why it matters
|
||||
- Concrete improvement suggestion
|
||||
- Example if complex
|
||||
93
plugins/pr-review/agents/performance-analyst.md
Normal file
@@ -0,0 +1,93 @@
|
||||
# Performance Analyst Agent
|
||||
|
||||
## Role
|
||||
|
||||
You are a performance-focused code reviewer that identifies performance issues, inefficiencies, and optimization opportunities in pull request changes.
|
||||
|
||||
## Focus Areas
|
||||
|
||||
### 1. Database Performance
|
||||
|
||||
- **N+1 Queries**: Loop with query inside
|
||||
- **Missing Indexes**: Queries on unindexed columns
|
||||
- **Over-fetching**: SELECT * when specific columns needed
|
||||
- **Unbounded Queries**: No LIMIT on potentially large result sets
|
||||
|
||||
Confidence scoring:
|
||||
- Clear N+1 in loop: 0.9
|
||||
- Possible N+1 with unclear iteration: 0.7
|
||||
- Query without visible index: 0.5
|
||||
|
||||
### 2. Algorithm Complexity
|
||||
|
||||
- **Nested Loops**: O(n²) when O(n) possible
|
||||
- **Repeated Calculations**: Same computation in loop
|
||||
- **Inefficient Data Structures**: Array search vs Set/Map lookup
|
||||
|
||||
### 3. Memory Issues
|
||||
|
||||
- **Memory Leaks**: Unclosed resources, growing caches
|
||||
- **Large Allocations**: Loading entire files/datasets into memory
|
||||
- **Unnecessary Copies**: Cloning when reference would work
|
||||
|
||||
### 4. Network/IO
|
||||
|
||||
- **Sequential Requests**: When parallel would work
|
||||
- **Missing Caching**: Repeated fetches of same data
|
||||
- **Large Payloads**: Sending unnecessary data
|
||||
|
||||
### 5. Frontend Performance
|
||||
|
||||
- **Unnecessary Re-renders**: Missing memoization
|
||||
- **Large Bundle Impact**: Heavy imports
|
||||
- **Blocking Operations**: Sync ops on main thread
|
||||
|
||||
## Finding Format
|
||||
|
||||
```json
|
||||
{
|
||||
"id": "PERF-001",
|
||||
"category": "performance",
|
||||
"subcategory": "database",
|
||||
"severity": "major",
|
||||
"confidence": 0.85,
|
||||
"file": "src/services/orders.ts",
|
||||
"line": 23,
|
||||
"title": "N+1 Query Pattern",
|
||||
"description": "For each order, a separate query fetches the user. With 100 orders, this executes 101 queries.",
|
||||
"evidence": "orders.forEach(order => { const user = await db.users.find(order.userId); })",
|
||||
"impact": "Linear increase in database load with order count. 1000 orders = 1001 queries.",
|
||||
"fix": "Use eager loading or batch the user IDs: db.users.findMany({ id: { in: userIds } })"
|
||||
}
|
||||
```
|
||||
|
||||
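The fix field in the example above points at batching. A sketch of what that looks like in code (the db client shape is declared here only for illustration):

```ts
// Illustrative client shape; a real ORM would provide its own types.
declare const db: {
  users: { findMany(where: { id: { in: string[] } }): Promise<Array<{ id: string; name: string }>> };
};

interface OrderRow { id: string; userId: string }

async function attachUsers(orders: OrderRow[]) {
  // One query for all users instead of one query per order (the N+1 pattern).
  const userIds = [...new Set(orders.map((o) => o.userId))];
  const users = await db.users.findMany({ id: { in: userIds } });
  const usersById = new Map(users.map((u) => [u.id, u]));

  // In-memory lookup, no further queries inside the loop.
  return orders.map((order) => ({ ...order, user: usersById.get(order.userId) }));
}
```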
## Severity Guidelines
|
||||
|
||||
| Severity | Criteria |
|
||||
|----------|----------|
|
||||
| Critical | Will cause outage or severe degradation at scale |
|
||||
| Major | Significant impact on response time or resources |
|
||||
| Minor | Measurable but tolerable impact |
|
||||
| Suggestion | Optimization opportunity, premature if not hot path |
|
||||
|
||||
## Confidence Calibration
|
||||
|
||||
Be conservative about performance claims:
|
||||
- Measure or cite benchmarks when possible
|
||||
- Consider actual usage patterns
|
||||
- Acknowledge when impact depends on scale
|
||||
|
||||
HIGH confidence when:
|
||||
- Clear algorithmic issue (N+1, O(n²))
|
||||
- Pattern known to cause problems
|
||||
- Impact calculable from code
|
||||
|
||||
MEDIUM confidence when:
|
||||
- Depends on data size
|
||||
- Might be optimized elsewhere
|
||||
- Theoretical improvement
|
||||
|
||||
Suppress when:
|
||||
- Likely not a hot path
|
||||
- Micro-optimization
|
||||
- Depends heavily on runtime
|
||||
93
plugins/pr-review/agents/security-reviewer.md
Normal file
@@ -0,0 +1,93 @@
|
||||
# Security Reviewer Agent
|
||||
|
||||
## Role
|
||||
|
||||
You are a security-focused code reviewer that identifies vulnerabilities, security anti-patterns, and potential exploits in pull request changes.
|
||||
|
||||
## Focus Areas
|
||||
|
||||
### 1. Injection Vulnerabilities
|
||||
|
||||
- **SQL Injection**: String concatenation in queries
|
||||
- **Command Injection**: Unescaped user input in shell commands
|
||||
- **XSS**: Unescaped output in HTML/templates
|
||||
- **LDAP/XML Injection**: Similar patterns in other contexts
|
||||
|
||||
Confidence scoring:
|
||||
- Direct user input → query string: 0.95
|
||||
- Indirect path with possible taint: 0.7
|
||||
- Theoretical with no clear path: 0.4
|
||||
|
||||
### 2. Authentication & Authorization
|
||||
|
||||
- Missing auth checks on endpoints
|
||||
- Hardcoded credentials
|
||||
- Weak password policies
|
||||
- Session management issues
|
||||
- JWT vulnerabilities (weak signing, no expiration)
|
||||
|
||||
### 3. Data Exposure
|
||||
|
||||
- Sensitive data in logs
|
||||
- Unencrypted sensitive storage
|
||||
- Excessive data in API responses
|
||||
- Missing field-level permissions
|
||||
|
||||
### 4. Input Validation
|
||||
|
||||
- Missing validation on user input
|
||||
- Type coercion vulnerabilities
|
||||
- Path traversal possibilities
|
||||
- File upload without validation
|
||||
|
||||
### 5. Cryptography
|
||||
|
||||
- Weak algorithms (MD5, SHA1 for passwords)
|
||||
- Hardcoded keys/IVs
|
||||
- Predictable random values
|
||||
- Missing salt
|
||||
|
||||
## Finding Format
|
||||
|
||||
```json
|
||||
{
|
||||
"id": "SEC-001",
|
||||
"category": "security",
|
||||
"subcategory": "injection",
|
||||
"severity": "critical",
|
||||
"confidence": 0.95,
|
||||
"file": "src/api/users.ts",
|
||||
"line": 45,
|
||||
"title": "SQL Injection Vulnerability",
|
||||
"description": "User-provided 'id' parameter is directly interpolated into SQL query without parameterization.",
|
||||
"evidence": "const query = `SELECT * FROM users WHERE id = ${userId}`;",
|
||||
"impact": "Attacker can read, modify, or delete any data in the database.",
|
||||
"fix": "Use parameterized queries: db.query('SELECT * FROM users WHERE id = ?', [userId])"
|
||||
}
|
||||
```
|
||||
|
||||
## Severity Guidelines
|
||||
|
||||
| Severity | Criteria |
|
||||
|----------|----------|
|
||||
| Critical | Exploitable with high impact (data breach, RCE) |
|
||||
| Major | Exploitable with moderate impact, or high impact requiring specific conditions |
|
||||
| Minor | Low impact or requires unlikely conditions |
|
||||
| Suggestion | Best practice, defense in depth |
|
||||
|
||||
## Confidence Calibration
|
||||
|
||||
Be conservative. Only report HIGH confidence when:
|
||||
- Clear data flow from untrusted source to sink
|
||||
- No intervening validation visible
|
||||
- Pattern matches known vulnerability
|
||||
|
||||
Report MEDIUM confidence when:
|
||||
- Pattern looks suspicious but context unclear
|
||||
- Validation might exist elsewhere
|
||||
- Depends on configuration
|
||||
|
||||
Suppress (< 0.5) when:
|
||||
- Purely theoretical
|
||||
- Would require multiple unlikely conditions
|
||||
- Pattern is common but safe in context
|
||||
110
plugins/pr-review/agents/test-validator.md
Normal file
@@ -0,0 +1,110 @@
|
||||
# Test Validator Agent
|
||||
|
||||
## Role
|
||||
|
||||
You are a test quality reviewer that validates test coverage, test quality, and testing practices in pull request changes.
|
||||
|
||||
## Focus Areas
|
||||
|
||||
### 1. Coverage Gaps
|
||||
|
||||
- **Untested Code**: New functions without corresponding tests
|
||||
- **Missing Edge Cases**: Only happy path tested
|
||||
- **Uncovered Branches**: Conditionals with untested paths
|
||||
|
||||
### 2. Test Quality
|
||||
|
||||
- **Weak Assertions**: Tests that can't fail
|
||||
- **Test Pollution**: Tests affecting each other
|
||||
- **Flaky Patterns**: Time-dependent or order-dependent tests
|
||||
- **Mocking Overuse**: Testing mocks instead of behavior
|
||||
|
||||
### 3. Test Structure
|
||||
|
||||
- **Missing Arrangement**: No clear setup
|
||||
- **Unclear Act**: What's being tested isn't obvious
|
||||
- **Weak Assert**: Vague or missing assertions
|
||||
- **Missing Cleanup**: Resources not cleaned up
|
||||
|
||||
### 4. Test Naming
|
||||
|
||||
- **Unclear Names**: `test1`, `testFunction`
|
||||
- **Missing Scenario**: What condition is being tested
|
||||
- **Missing Expectation**: What should happen
|
||||
|
||||
### 5. Test Maintenance
|
||||
|
||||
- **Brittle Tests**: Break with unrelated changes
|
||||
- **Duplicate Setup**: Same setup repeated
|
||||
- **Dead Tests**: Commented out or always-skipped
|
||||
|
||||
## Finding Format
|
||||
|
||||
```json
|
||||
{
|
||||
"id": "TEST-001",
|
||||
"category": "tests",
|
||||
"subcategory": "coverage",
|
||||
"severity": "major",
|
||||
"confidence": 0.8,
|
||||
"file": "src/services/auth.ts",
|
||||
"line": 45,
|
||||
"title": "New Function Not Tested",
|
||||
"description": "The new validatePassword function has no corresponding test cases. This function handles security-critical validation.",
|
||||
"evidence": "Added validatePassword() in auth.ts, no matching test in auth.test.ts",
|
||||
"impact": "Regression bugs in password validation may go undetected.",
|
||||
"fix": "Add test cases for: valid password, too short, missing number, missing special char, common password rejection."
|
||||
}
|
||||
```
|
||||
|
||||
## Severity Guidelines
|
||||
|
||||
| Severity | Criteria |
|
||||
|----------|----------|
|
||||
| Critical | No tests for security/critical functionality |
|
||||
| Major | Significant functionality untested |
|
||||
| Minor | Edge cases or minor paths untested |
|
||||
| Suggestion | Test quality improvement opportunity |
|
||||
|
||||
## Confidence Calibration
|
||||
|
||||
Test coverage is verifiable:
|
||||
|
||||
HIGH confidence when:
|
||||
- Can verify no test file exists
|
||||
- Can see function is called but never in test
|
||||
- Pattern is clearly problematic
|
||||
|
||||
MEDIUM confidence when:
|
||||
- Tests might exist elsewhere
|
||||
- Integration tests might cover it
|
||||
- Pattern might be intentional
|
||||
|
||||
Suppress when:
|
||||
- Generated code
|
||||
- Simple getters/setters
|
||||
- Framework code
|
||||
|
||||
## Test Expectations by Code Type
|
||||
|
||||
| Code Type | Expected Tests |
|
||||
|-----------|---------------|
|
||||
| API endpoint | Happy path, error cases, auth, validation |
|
||||
| Utility function | Input variations, edge cases, errors |
|
||||
| UI component | Rendering, interactions, accessibility |
|
||||
| Database operation | CRUD, constraints, transactions |
|
||||
|
||||
## Constructive Suggestions
|
||||
|
||||
When flagging missing tests, suggest specific cases:
|
||||
|
||||
```
|
||||
Missing tests for processPayment():
|
||||
|
||||
Suggested test cases:
|
||||
1. Valid payment processes successfully
|
||||
2. Invalid card number returns error
|
||||
3. Insufficient funds handled
|
||||
4. Network timeout retries appropriately
|
||||
5. Duplicate payment prevention
|
||||
```
|
||||
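The suggested cases above translate directly into test stubs. A Jest-style sketch (the processPayment module path, argument shape, and error messages are assumptions):

```ts
import { describe, expect, it } from "@jest/globals";
// Hypothetical module under test.
import { processPayment } from "../src/services/payments";

describe("processPayment", () => {
  it("processes a valid payment successfully", async () => {
    const result = await processPayment({ card: "4242424242424242", amount: 100 });
    expect(result.status).toBe("succeeded");
  });

  it("rejects an invalid card number", async () => {
    await expect(processPayment({ card: "1234", amount: 100 })).rejects.toThrow(/invalid card/i);
  });

  it("reports insufficient funds", async () => {
    await expect(processPayment({ card: "4000000000009995", amount: 100 })).rejects.toThrow(/insufficient funds/i);
  });
});
```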
46
plugins/pr-review/claude-md-integration.md
Normal file
@@ -0,0 +1,46 @@
|
||||
# pr-review - CLAUDE.md Integration
|
||||
|
||||
Add the following section to your project's CLAUDE.md file to enable pr-review.
|
||||
|
||||
---
|
||||
|
||||
## Pull Request Review
|
||||
|
||||
This project uses the pr-review plugin for automated code review.
|
||||
|
||||
### Commands
|
||||
|
||||
| Command | Use Case |
|
||||
|---------|----------|
|
||||
| `/pr-review <pr#>` | Full multi-agent review |
|
||||
| `/pr-summary <pr#>` | Quick change summary |
|
||||
| `/pr-findings <pr#>` | Filter review findings |
|
||||
|
||||
### Review Categories
|
||||
|
||||
Reviews analyze:
|
||||
- **Security**: Injections, auth issues, data exposure
|
||||
- **Performance**: N+1 queries, complexity, memory
|
||||
- **Maintainability**: Code quality, duplication, naming
|
||||
- **Tests**: Coverage gaps, test quality
|
||||
|
||||
### Confidence Threshold
|
||||
|
||||
Findings below 0.5 confidence are suppressed.
|
||||
|
||||
- HIGH (0.9+): Definite issue
|
||||
- MEDIUM (0.7-0.89): Likely issue
|
||||
- LOW (0.5-0.69): Possible concern
|
||||
|
||||
### Verdict Rules
|
||||
|
||||
| Condition | Verdict |
|
||||
|-----------|---------|
|
||||
| Critical findings | REQUEST_CHANGES |
|
||||
| 2+ Major findings | REQUEST_CHANGES |
|
||||
| Minor only | COMMENT |
|
||||
| No issues | APPROVE |
|
||||
|
||||
---
|
||||
|
||||
Copy the section between the horizontal rules into your CLAUDE.md.
|
||||
137
plugins/pr-review/commands/pr-findings.md
Normal file
@@ -0,0 +1,137 @@
|
||||
# /pr-findings - Filter Review Findings
|
||||
|
||||
## Purpose
|
||||
|
||||
List and filter findings from a previous PR review by category, severity, or confidence level.
|
||||
|
||||
## Usage
|
||||
|
||||
```
|
||||
/pr-findings <pr-number> [filters]
|
||||
```
|
||||
|
||||
### Filters
|
||||
|
||||
```
|
||||
--category <cat> Filter by category (security, performance, maintainability, tests)
|
||||
--severity <sev> Filter by severity (critical, major, minor, suggestion)
|
||||
--confidence <min> Minimum confidence score (0.0-1.0)
|
||||
--file <pattern> Filter by file path pattern
|
||||
```
|
||||
|
||||
## Examples
|
||||
|
||||
```
|
||||
# Show only security findings
|
||||
/pr-findings 123 --category security
|
||||
|
||||
# Show critical and major issues only
|
||||
/pr-findings 123 --severity critical,major
|
||||
|
||||
# Show high-confidence findings only
|
||||
/pr-findings 123 --confidence 0.8
|
||||
|
||||
# Show findings in specific files
|
||||
/pr-findings 123 --file src/api/*
|
||||
```
|
||||
|
||||
## Behavior
|
||||
|
||||
### Without Previous Review
|
||||
|
||||
If no review exists for this PR:
|
||||
|
||||
```
|
||||
No review found for PR #123.
|
||||
|
||||
Would you like to:
|
||||
1. Run full /pr-review now
|
||||
2. Run quick /pr-summary
|
||||
3. Cancel
|
||||
```
|
||||
|
||||
### With Previous Review
|
||||
|
||||
Display filtered findings:
|
||||
|
||||
```
|
||||
═══════════════════════════════════════════════════
|
||||
PR #123 Findings (filtered: security)
|
||||
═══════════════════════════════════════════════════
|
||||
|
||||
Showing 3 of 8 total findings
|
||||
|
||||
───────────────────────────────────────────────────
|
||||
|
||||
[SEC-001] SQL Injection Vulnerability
|
||||
Confidence: 0.95 (HIGH) | Severity: Critical
|
||||
File: src/api/users.ts:45
|
||||
|
||||
The query uses string interpolation without parameterization.
|
||||
|
||||
Fix: Use parameterized queries.
|
||||
|
||||
───────────────────────────────────────────────────
|
||||
|
||||
[SEC-002] Missing Input Validation
|
||||
Confidence: 0.88 (MEDIUM) | Severity: Major
|
||||
File: src/api/auth.ts:23
|
||||
|
||||
User input is passed directly to database without validation.
|
||||
|
||||
Fix: Add input validation middleware.
|
||||
|
||||
───────────────────────────────────────────────────
|
||||
|
||||
[SEC-003] Sensitive Data in Logs
|
||||
Confidence: 0.72 (MEDIUM) | Severity: Minor
|
||||
File: src/utils/logger.ts:15
|
||||
|
||||
Password field may be logged in debug mode.
|
||||
|
||||
Fix: Sanitize sensitive fields before logging.
|
||||
|
||||
═══════════════════════════════════════════════════
|
||||
```
|
||||
|
||||
## Output Formats
|
||||
|
||||
### Default (Detailed)
|
||||
|
||||
Full finding details with descriptions and fixes.
|
||||
|
||||
### Compact (--compact)
|
||||
|
||||
```
|
||||
SEC-001 | Critical | 0.95 | src/api/users.ts:45 | SQL Injection
|
||||
SEC-002 | Major | 0.88 | src/api/auth.ts:23 | Missing Validation
|
||||
SEC-003 | Minor | 0.72 | src/utils/logger.ts | Sensitive Logs
|
||||
```
|
||||
|
||||
### JSON (--json)
|
||||
|
||||
```json
|
||||
{
|
||||
"pr": 123,
|
||||
"findings": [
|
||||
{
|
||||
"id": "SEC-001",
|
||||
"category": "security",
|
||||
"severity": "critical",
|
||||
"confidence": 0.95,
|
||||
"file": "src/api/users.ts",
|
||||
"line": 45,
|
||||
"title": "SQL Injection Vulnerability",
|
||||
"description": "...",
|
||||
"fix": "..."
|
||||
}
|
||||
]
|
||||
}
|
||||
```
|
||||
|
||||
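The filters compose as simple predicates over this JSON shape. An illustrative sketch, not the command's actual implementation:

```ts
interface ReportedFinding {
  id: string;
  category: string;
  severity: "critical" | "major" | "minor" | "suggestion";
  confidence: number;
  file: string;
}

interface FindingFilters {
  category?: string;
  severity?: string[];
  confidence?: number;   // minimum confidence
  file?: RegExp;
}

function filterFindings(findings: ReportedFinding[], f: FindingFilters): ReportedFinding[] {
  return findings.filter(
    (x) =>
      (!f.category || x.category === f.category) &&
      (!f.severity || f.severity.includes(x.severity)) &&
      (f.confidence === undefined || x.confidence >= f.confidence) &&
      (!f.file || f.file.test(x.file)),
  );
}

// Roughly what `/pr-findings 123 --category security --confidence 0.8` selects:
// filterFindings(report.findings, { category: "security", confidence: 0.8 });
```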
## Use Cases
|
||||
|
||||
- Focus on specific issue types
|
||||
- Track resolution of findings
|
||||
- Export findings for tracking
|
||||
- Quick reference during fixes
|
||||
139
plugins/pr-review/commands/pr-review.md
Normal file
@@ -0,0 +1,139 @@
|
||||
# /pr-review - Full Multi-Agent Review
|
||||
|
||||
## Purpose
|
||||
|
||||
Conduct a comprehensive pull request review using specialized agents for security, performance, maintainability, and test coverage.
|
||||
|
||||
## Usage
|
||||
|
||||
```
|
||||
/pr-review <pr-number> [--repo owner/repo]
|
||||
```
|
||||
|
||||
## Behavior
|
||||
|
||||
### Step 1: Fetch PR Data
|
||||
|
||||
Using Gitea MCP tools:
|
||||
1. `get_pull_request` - PR metadata
|
||||
2. `get_pr_diff` - Code changes
|
||||
3. `get_pr_comments` - Existing discussion
|
||||
|
||||
### Step 2: Dispatch to Agents
|
||||
|
||||
The coordinator dispatches review tasks to specialized agents:
|
||||
|
||||
```
|
||||
PR Review: #123 - Add user authentication
|
||||
═══════════════════════════════════════════════════
|
||||
|
||||
Dispatching to review agents:
|
||||
├─ Security Reviewer → analyzing...
|
||||
├─ Performance Analyst → analyzing...
|
||||
├─ Maintainability Auditor → analyzing...
|
||||
└─ Test Validator → analyzing...
|
||||
```
|
||||
|
||||
### Step 3: Aggregate Findings
|
||||
|
||||
Collect findings from all agents, each with:
|
||||
- Category (security, performance, maintainability, tests)
|
||||
- Severity (critical, major, minor, suggestion)
|
||||
- Confidence score (0.0 - 1.0)
|
||||
- File and line reference
|
||||
- Description
|
||||
- Suggested fix (if applicable)
|
||||
|
||||
### Step 4: Filter by Confidence
|
||||
|
||||
Only display findings with confidence >= 0.5:
|
||||
|
||||
| Confidence | Label | Description |
|
||||
|------------|-------|-------------|
|
||||
| 0.9 - 1.0 | HIGH | Definite issue, must address |
|
||||
| 0.7 - 0.89 | MEDIUM | Likely issue, should address |
|
||||
| 0.5 - 0.69 | LOW | Possible concern, consider addressing |
|
||||
| < 0.5 | (suppressed) | Too uncertain to report |
|
||||
|
||||
### Step 5: Generate Report
|
||||
|
||||
```
|
||||
═══════════════════════════════════════════════════
|
||||
PR Review Report: #123
|
||||
═══════════════════════════════════════════════════
|
||||
|
||||
Summary:
|
||||
Files changed: 12
|
||||
Lines added: 234
|
||||
Lines removed: 45
|
||||
|
||||
Findings: 8 total
|
||||
🔴 Critical: 1
|
||||
🟠 Major: 2
|
||||
🟡 Minor: 3
|
||||
💡 Suggestions: 2
|
||||
|
||||
───────────────────────────────────────────────────
|
||||
CRITICAL FINDINGS
|
||||
───────────────────────────────────────────────────
|
||||
|
||||
[SEC-001] SQL Injection Vulnerability (Confidence: 0.95)
|
||||
File: src/api/users.ts:45
|
||||
Category: Security
|
||||
|
||||
The query uses string interpolation without parameterization:

    const query = `SELECT * FROM users WHERE id = ${userId}`;

Suggested fix:

    const query = 'SELECT * FROM users WHERE id = ?';
    db.query(query, [userId]);
|
||||
|
||||
───────────────────────────────────────────────────
|
||||
MAJOR FINDINGS
|
||||
───────────────────────────────────────────────────
|
||||
|
||||
[PERF-001] N+1 Query Pattern (Confidence: 0.82)
|
||||
...
|
||||
|
||||
───────────────────────────────────────────────────
|
||||
VERDICT
|
||||
───────────────────────────────────────────────────
|
||||
|
||||
❌ REQUEST_CHANGES
|
||||
|
||||
This PR has 1 critical security issue that must be addressed
|
||||
before merging. See SEC-001 above.
|
||||
|
||||
───────────────────────────────────────────────────
|
||||
```
|
||||
|
||||
### Step 6: Submit Review (Optional)
|
||||
|
||||
```
|
||||
Submit this review to Gitea?
|
||||
1. Yes, with REQUEST_CHANGES
|
||||
2. Yes, as COMMENT only
|
||||
3. No, just show me the report
|
||||
```
|
||||
|
||||
If yes, use `create_pr_review` MCP tool.
|
||||
|
||||
## Output
|
||||
|
||||
Full review report with:
|
||||
- Summary statistics
|
||||
- Findings grouped by severity
|
||||
- Code snippets with context
|
||||
- Suggested fixes
|
||||
- Overall verdict
|
||||
|
||||
## Configuration
|
||||
|
||||
| Variable | Default | Description |
|
||||
|----------|---------|-------------|
|
||||
| `PR_REVIEW_CONFIDENCE_THRESHOLD` | `0.5` | Minimum confidence to report |
|
||||
| `PR_REVIEW_AUTO_SUBMIT` | `false` | Auto-submit to Gitea |
|
||||
103
plugins/pr-review/commands/pr-summary.md
Normal file
@@ -0,0 +1,103 @@
|
||||
# /pr-summary - Quick PR Summary
|
||||
|
||||
## Purpose
|
||||
|
||||
Generate a quick summary of PR changes without conducting a full multi-agent review.
|
||||
|
||||
## Usage
|
||||
|
||||
```
|
||||
/pr-summary <pr-number> [--repo owner/repo]
|
||||
```
|
||||
|
||||
## Behavior
|
||||
|
||||
### Step 1: Fetch PR Data
|
||||
|
||||
Using Gitea MCP tools:
|
||||
1. `get_pull_request` - PR metadata
|
||||
2. `get_pr_diff` - Code changes
|
||||
|
||||
### Step 2: Analyze Changes
|
||||
|
||||
Quick analysis of:
|
||||
- Files modified
|
||||
- Types of changes (features, fixes, refactoring)
|
||||
- Scope and impact
|
||||
|
||||
### Step 3: Generate Summary
|
||||
|
||||
```
|
||||
═══════════════════════════════════════════════════
|
||||
PR Summary: #123 - Add user authentication
|
||||
═══════════════════════════════════════════════════
|
||||
|
||||
Author: @johndoe
|
||||
Branch: feat/user-auth → development
|
||||
Status: Open (ready for review)
|
||||
|
||||
───────────────────────────────────────────────────
|
||||
CHANGES OVERVIEW
|
||||
───────────────────────────────────────────────────
|
||||
|
||||
Files: 12 changed
|
||||
+ 8 new files
|
||||
~ 3 modified files
|
||||
- 1 deleted file
|
||||
|
||||
Lines: +234 / -45 (net +189)
|
||||
|
||||
───────────────────────────────────────────────────
|
||||
WHAT THIS PR DOES
|
||||
───────────────────────────────────────────────────
|
||||
|
||||
This PR adds user authentication functionality:
|
||||
|
||||
1. **New API endpoints**
|
||||
- POST /api/auth/login
|
||||
- POST /api/auth/register
|
||||
- POST /api/auth/logout
|
||||
|
||||
2. **Frontend components**
|
||||
- LoginForm component
|
||||
- RegisterForm component
|
||||
- Auth context provider
|
||||
|
||||
3. **Database changes**
|
||||
- New users table
|
||||
- Sessions table
|
||||
|
||||
───────────────────────────────────────────────────
|
||||
KEY FILES
|
||||
───────────────────────────────────────────────────
|
||||
|
||||
• src/api/auth/login.ts (+85) - Login endpoint
|
||||
• src/api/auth/register.ts (+120) - Registration
|
||||
• src/components/LoginForm.tsx (+65) - Login UI
|
||||
• src/db/migrations/001_users.sql (+45) - Schema
|
||||
|
||||
───────────────────────────────────────────────────
|
||||
QUICK ASSESSMENT
|
||||
───────────────────────────────────────────────────
|
||||
|
||||
Scope: Medium (authentication feature)
|
||||
Risk: Medium (new security-sensitive code)
|
||||
Recommendation: Full /pr-review suggested
|
||||
|
||||
═══════════════════════════════════════════════════
|
||||
```
|
||||
|
||||
## Output
|
||||
|
||||
Summary report with:
|
||||
- PR metadata
|
||||
- Change statistics
|
||||
- Plain-language description of changes
|
||||
- Key files list
|
||||
- Quick risk assessment
|
||||
|
||||
## When to Use
|
||||
|
||||
- Get quick overview before full review
|
||||
- Triage multiple PRs
|
||||
- Understand PR scope
|
||||
1
plugins/pr-review/mcp-servers/gitea
Symbolic link
@@ -0,0 +1 @@
|
||||
../../../mcp-servers/gitea
|
||||
139
plugins/pr-review/skills/review-patterns/confidence-scoring.md
Normal file
@@ -0,0 +1,139 @@
|
||||
# Confidence Scoring for PR Review
|
||||
|
||||
## Purpose
|
||||
|
||||
Confidence scoring ensures that review findings are calibrated and actionable. By filtering out low-confidence findings, we reduce noise and focus reviewer attention on real issues.
|
||||
|
||||
## Score Ranges
|
||||
|
||||
| Range | Label | Meaning | Action |
|
||||
|-------|-------|---------|--------|
|
||||
| 0.9 - 1.0 | HIGH | Definite issue | Must address |
|
||||
| 0.7 - 0.89 | MEDIUM | Likely issue | Should address |
|
||||
| 0.5 - 0.69 | LOW | Possible concern | Consider addressing |
|
||||
| < 0.5 | SUPPRESSED | Uncertain | Don't report |
|
||||
|
||||
## Scoring Factors
|
||||
|
||||
### Positive Factors (Increase Confidence)
|
||||
|
||||
| Factor | Impact |
|
||||
|--------|--------|
|
||||
| Clear data flow from source to sink | +0.3 |
|
||||
| Pattern matches known vulnerability | +0.2 |
|
||||
| No intervening validation visible | +0.2 |
|
||||
| Matches OWASP Top 10 | +0.15 |
|
||||
| Found in security-sensitive context | +0.1 |
|
||||
|
||||
### Negative Factors (Decrease Confidence)
|
||||
|
||||
| Factor | Impact |
|
||||
|--------|--------|
|
||||
| Validation might exist elsewhere | -0.2 |
|
||||
| Depends on runtime configuration | -0.15 |
|
||||
| Pattern is common but often safe | -0.15 |
|
||||
| Requires multiple conditions to exploit | -0.1 |
|
||||
| Theoretical impact only | -0.1 |
|
||||
|
||||
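Combining a base pattern score with these factors is just clamped addition. A minimal sketch of that arithmetic (the function is illustrative; the numbers come from the tables above and the worked examples below):

```ts
// Sum the base score and adjustment factors, clamped to the 0.0 - 1.0 range.
function scoreConfidence(base: number, factors: number[]): number {
  const raw = factors.reduce((sum, factor) => sum + factor, base);
  return Math.min(1, Math.max(0, raw));
}

// Only findings at or above the threshold are reported.
function shouldReport(confidence: number, threshold = 0.5): boolean {
  return confidence >= threshold;
}

// e.g. the SQL injection example below: scoreConfidence(0, [0.3, 0.3, 0.2, 0.15]) → 0.95
```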
## Calibration Guidelines
|
||||
|
||||
### Security Issues
|
||||
|
||||
Base confidence by pattern:
|
||||
- SQL string concatenation with user input: 0.95
|
||||
- Hardcoded credentials: 0.9
|
||||
- Missing auth check: 0.8
|
||||
- Generic error exposure: 0.6
|
||||
- Missing rate limiting: 0.5
|
||||
|
||||
### Performance Issues
|
||||
|
||||
Base confidence by pattern:
|
||||
- Clear N+1 in loop: 0.9
|
||||
- SELECT * on large table: 0.7
|
||||
- Missing index on filtered column: 0.6
|
||||
- Suboptimal algorithm: 0.5
|
||||
|
||||
### Maintainability Issues
|
||||
|
||||
Base confidence by pattern:
|
||||
- Function >100 lines: 0.8
|
||||
- Deep nesting >4 levels: 0.75
|
||||
- Duplicate code blocks: 0.7
|
||||
- Unclear naming: 0.6
|
||||
- Minor style issues: 0.3 (suppress)
|
||||
|
||||
### Test Coverage
|
||||
|
||||
Base confidence by pattern:
|
||||
- No test file for new module: 0.9
|
||||
- Security function untested: 0.85
|
||||
- Edge case not covered: 0.6
|
||||
- Simple getter untested: 0.3 (suppress)
|
||||
|
||||
## Threshold Configuration
|
||||
|
||||
The default threshold is 0.5. This can be adjusted:
|
||||
|
||||
```bash
|
||||
PR_REVIEW_CONFIDENCE_THRESHOLD=0.7 # Only high-confidence
|
||||
PR_REVIEW_CONFIDENCE_THRESHOLD=0.3 # Include more speculative
|
||||
```
|
||||
|
||||
## Example Scoring
|
||||
|
||||
### High Confidence (0.95)
|
||||
|
||||
```javascript
|
||||
// Clear SQL injection
|
||||
const query = `SELECT * FROM users WHERE id = ${req.params.id}`;
|
||||
```
|
||||
|
||||
- User input (req.params.id): +0.3
|
||||
- Direct to SQL query: +0.3
|
||||
- No visible validation: +0.2
|
||||
- Matches OWASP Top 10: +0.15
|
||||
- **Total: 0.95**
|
||||
|
||||
### Medium Confidence (0.72)
|
||||
|
||||
```javascript
|
||||
// Possible performance issue
|
||||
users.forEach(async (user) => {
|
||||
const orders = await db.orders.find({ userId: user.id });
|
||||
});
|
||||
```
|
||||
|
||||
- Loop with query: +0.3
|
||||
- Pattern matches N+1: +0.2
|
||||
- But might be small dataset: -0.15
|
||||
- Could have caching: -0.1
|
||||
- **Total: 0.72**
|
||||
|
||||
### Low Confidence (0.55)
|
||||
|
||||
```javascript
|
||||
// Maybe too complex?
|
||||
function processOrder(order, user, items, discounts, shipping) {
|
||||
// 60 lines of logic
|
||||
}
|
||||
```
|
||||
|
||||
- Function is long: +0.2
|
||||
- Many parameters: +0.15
|
||||
- But might be intentional: -0.1
|
||||
- Could be refactored later: -0.1
|
||||
- **Total: 0.55**
|
||||
|
||||
### Suppressed (0.35)
|
||||
|
||||
```javascript
|
||||
// Minor style preference
|
||||
const x = foo ? bar : baz;
|
||||
```
|
||||
|
||||
- Ternary could be if/else: +0.1
|
||||
- Very common pattern: -0.2
|
||||
- No real impact: -0.1
|
||||
- Style preference: -0.1
|
||||
- **Total: 0.35** (suppressed)
|
||||
@@ -1,441 +1,39 @@
|
||||
# Configuration Guide - Projman Plugin
|
||||
|
||||
Complete setup and configuration instructions for the Projman project management plugin.
|
||||
For comprehensive configuration instructions, see the **[Centralized Configuration Guide](../../docs/CONFIGURATION.md)**.
|
||||
|
||||
## Overview
|
||||
## Quick Reference
|
||||
|
||||
The Projman plugin uses a **hybrid configuration** approach:
|
||||
- **System-level:** Credentials for Gitea (stored once per machine)
|
||||
- **Project-level:** Repository path (stored per project)
|
||||
### Required Configuration
|
||||
|
||||
This design allows:
|
||||
- Single token per service (update once, use everywhere)
|
||||
- Easy multi-project setup (just add `.env` per project)
|
||||
- Security (tokens never committed to git)
|
||||
- Project isolation (each project has its own scope)
|
||||
**System-level** (`~/.config/claude/gitea.env`):
|
||||
```bash
|
||||
GITEA_URL=https://gitea.example.com
|
||||
GITEA_TOKEN=your_token
|
||||
GITEA_ORG=your_organization
|
||||
```
|
||||
|
||||
## Prerequisites
|
||||
**Project-level** (`.env` in project root):
|
||||
```bash
|
||||
GITEA_REPO=your-repo-name
|
||||
```
|
||||
|
||||
Before configuring the plugin, ensure you have:
|
||||
|
||||
1. **Python 3.10+** installed
|
||||
```bash
|
||||
python3 --version # Should be 3.10.0 or higher
|
||||
```
|
||||
|
||||
2. **Git repository** initialized
|
||||
```bash
|
||||
git status # Should show initialized repository
|
||||
```
|
||||
|
||||
3. **Gitea access** with an account and permissions to:
|
||||
- Create issues
|
||||
- Manage labels
|
||||
- Read organization information
|
||||
- Access repository wiki
|
||||
|
||||
4. **Claude Code** installed and working
|
||||
|
||||
## Step 1: Install MCP Server
|
||||
|
||||
The plugin bundles the Gitea MCP server at `mcp-servers/gitea/`:
|
||||
### MCP Server Installation
|
||||
|
||||
```bash
|
||||
# Navigate to MCP server directory (inside plugin)
|
||||
cd plugins/projman/mcp-servers/gitea
|
||||
|
||||
# Create virtual environment
|
||||
cd mcp-servers/gitea
|
||||
python3 -m venv .venv
|
||||
|
||||
# Activate virtual environment
|
||||
source .venv/bin/activate # Linux/Mac
|
||||
# or
|
||||
.venv\Scripts\activate # Windows
|
||||
|
||||
# Install dependencies
|
||||
pip install -r requirements.txt
|
||||
|
||||
# Verify installation
|
||||
python -c "from mcp_server import server; print('Gitea MCP Server installed successfully')"
|
||||
|
||||
# Deactivate when done
|
||||
deactivate
|
||||
```
|
||||
|
||||
## Step 2: Generate Gitea API Token
|
||||
|
||||
1. Log into Gitea: https://gitea.example.com
|
||||
2. Navigate to: **User Icon** (top right) → **Settings**
|
||||
3. Click **Applications** tab
|
||||
4. Scroll to **Manage Access Tokens**
|
||||
5. Click **Generate New Token**
|
||||
6. Configure token:
|
||||
- **Token Name:** `claude-code-projman`
|
||||
- **Permissions:**
|
||||
- `repo` (all sub-permissions) - Repository access
|
||||
- `read:org` - Read organization information and labels
|
||||
- `read:user` - Read user information
|
||||
- `write:repo` - Wiki access
|
||||
7. Click **Generate Token**
|
||||
8. **IMPORTANT:** Copy token immediately (shown only once!)
|
||||
9. Save token securely - you'll need it in Step 3
|
||||
|
||||
**Token Permissions Explained:**
|
||||
- `repo` - Create, read, update issues, labels, and wiki
|
||||
- `read:org` - Access organization-level labels
|
||||
- `read:user` - Associate issues with user account
|
||||
- `write:repo` - Create wiki pages for lessons learned
|
||||
|
||||
## Step 3: System-Level Configuration
|
||||
|
||||
Create system-wide configuration file in `~/.config/claude/`:
|
||||
|
||||
### 3.1 Create Configuration Directory
|
||||
|
||||
```bash
|
||||
mkdir -p ~/.config/claude
|
||||
```
|
||||
|
||||
### 3.2 Configure Gitea
|
||||
|
||||
```bash
|
||||
cat > ~/.config/claude/gitea.env << 'EOF'
|
||||
# Gitea API Configuration
|
||||
GITEA_URL=https://gitea.example.com
|
||||
GITEA_TOKEN=your_gitea_token_here
|
||||
GITEA_ORG=your_organization
|
||||
EOF
|
||||
|
||||
# Secure the file (owner read/write only)
|
||||
chmod 600 ~/.config/claude/gitea.env
|
||||
```
|
||||
|
||||
**Replace placeholders:**
|
||||
- `your_gitea_token_here` with the token from Step 2
|
||||
- `your_organization` with your Gitea organization name
|
||||
|
||||
**Configuration Variables:**
|
||||
- `GITEA_URL` - Gitea base URL (without `/api/v1`)
|
||||
- `GITEA_TOKEN` - Personal access token from Step 2
|
||||
- `GITEA_ORG` - Organization name (e.g., `bandit`)
|
||||
|
||||
### 3.3 Verify System Configuration
|
||||
|
||||
```bash
|
||||
# Check file exists and has correct permissions
|
||||
ls -la ~/.config/claude/gitea.env
|
||||
|
||||
# Should show:
|
||||
# -rw------- gitea.env
|
||||
```
|
||||
|
||||
**Security Note:** File should have `600` permissions (owner read/write only) to protect API tokens.
|
||||
|
||||
## Step 4: Project-Level Configuration
|
||||
|
||||
For each project where you'll use Projman, create a `.env` file:
|
||||
|
||||
### 4.1 Create Project .env File
|
||||
|
||||
```bash
|
||||
# In your project root directory
|
||||
cat > .env << 'EOF'
|
||||
# Gitea Repository Configuration
|
||||
GITEA_REPO=your-repo-name
|
||||
EOF
|
||||
```
|
||||
|
||||
**Example for MyProject:**
|
||||
```bash
|
||||
cat > .env << 'EOF'
|
||||
GITEA_REPO=my-project
|
||||
EOF
|
||||
```
|
||||
|
||||
### 4.2 Verify Project Configuration
|
||||
|
||||
```bash
|
||||
# Check .env exists
|
||||
ls -la .env
|
||||
|
||||
# Check .env content
|
||||
cat .env
|
||||
```
|
||||
|
||||
**Note:** The `.env` file may already be in your `.gitignore`. If your project uses `.env` for other purposes, the Gitea configuration will merge with existing variables.
|
||||
|
||||
## Step 5: Configuration Verification
|
||||
|
||||
Test that everything is configured correctly:
|
||||
|
||||
### 5.1 Test Gitea Connection
|
||||
|
||||
```bash
|
||||
# Test with curl
|
||||
curl -H "Authorization: token YOUR_GITEA_TOKEN" \
|
||||
https://gitea.example.com/api/v1/user
|
||||
|
||||
# Should return your user information in JSON format
|
||||
```
|
||||
|
||||
### 5.2 Test Wiki Access
|
||||
|
||||
```bash
|
||||
# Test wiki API
|
||||
curl -H "Authorization: token YOUR_GITEA_TOKEN" \
|
||||
https://gitea.example.com/api/v1/repos/YOUR_ORG/YOUR_REPO/wiki/pages
|
||||
|
||||
# Should return list of wiki pages (or empty array)
|
||||
```
|
||||
|
||||
### 5.3 Test MCP Server Loading
|
||||
|
||||
```bash
|
||||
# Navigate to plugin directory
|
||||
cd plugins/projman
|
||||
|
||||
# Verify .mcp.json exists
|
||||
cat .mcp.json
|
||||
|
||||
# Test loading (Claude Code will attempt to start MCP servers)
|
||||
claude --debug
|
||||
```
|
||||
|
||||
## Step 6: Initialize Plugin
|
||||
|
||||
### 6.1 Run Initial Setup
|
||||
|
||||
```bash
|
||||
/initial-setup
|
||||
```
|
||||
|
||||
This will:
|
||||
- Validate Gitea MCP server connection
|
||||
- Test credential configuration
|
||||
- Sync label taxonomy
|
||||
- Verify required directory structure
|
||||
|
||||
### 6.2 Sync Label Taxonomy
|
||||
|
||||
```bash
|
||||
/labels-sync
|
||||
```
|
||||
|
||||
This will:
|
||||
- Fetch all labels from Gitea (organization + repository)
|
||||
- Update `skills/label-taxonomy/labels-reference.md`
|
||||
- Enable intelligent label suggestions
|
||||
|
||||
### 6.3 Verify Commands Available
|
||||
|
||||
```bash
|
||||
# List available commands
|
||||
/sprint-plan
|
||||
/sprint-start
|
||||
/sprint-status
|
||||
/sprint-close
|
||||
/labels-sync
|
||||
/initial-setup
|
||||
```
|
||||
|
||||
## Configuration Files Reference
|
||||
|
||||
### System-Level Files
|
||||
|
||||
**`~/.config/claude/gitea.env`:**
|
||||
```bash
|
||||
GITEA_URL=https://gitea.example.com
|
||||
GITEA_TOKEN=your_gitea_token_here
|
||||
GITEA_ORG=your_organization
|
||||
```
|
||||
|
||||
### Project-Level Files
|
||||
|
||||
**`.env` (in project root):**
|
||||
```bash
|
||||
GITEA_REPO=your-repo-name
|
||||
```
|
||||
|
||||
### Plugin Configuration
|
||||
|
||||
**`projman/.mcp.json`:**
|
||||
```json
|
||||
{
|
||||
"mcpServers": {
|
||||
"gitea": {
|
||||
"command": "${CLAUDE_PLUGIN_ROOT}/mcp-servers/gitea/.venv/bin/python",
|
||||
"args": ["-m", "mcp_server.server"],
|
||||
"cwd": "${CLAUDE_PLUGIN_ROOT}/mcp-servers/gitea",
|
||||
"env": {
|
||||
"PYTHONPATH": "${CLAUDE_PLUGIN_ROOT}/mcp-servers/gitea"
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
## Multi-Project Setup
|
||||
|
||||
To use Projman with multiple projects:
|
||||
|
||||
1. **System config:** Set up once (already done in Step 3)
|
||||
2. **Project config:** Create `.env` in each project root:
|
||||
|
||||
**Project 1: Main App**
|
||||
```bash
|
||||
# ~/projects/my-app/.env
|
||||
GITEA_REPO=my-app
|
||||
```
|
||||
|
||||
**Project 2: App Site**
|
||||
```bash
|
||||
# ~/projects/my-app-site/.env
|
||||
GITEA_REPO=my-app-site
|
||||
```
|
||||
|
||||
**Project 3: Company Site**
|
||||
```bash
|
||||
# ~/projects/company-site/.env
|
||||
GITEA_REPO=company-site
|
||||
```
|
||||
|
||||
Each project operates independently with its own issues and lessons learned (stored in each repository's wiki).
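If you maintain several projects, a short loop can create all three files in one pass (paths taken from the examples above):

```bash
for repo in my-app my-app-site company-site; do
  echo "GITEA_REPO=${repo}" > ~/projects/${repo}/.env
done
```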
|
||||
|
||||
## Troubleshooting
|
||||
|
||||
### Cannot find configuration files
|
||||
|
||||
**Problem:** MCP server reports "Configuration not found"
|
||||
|
||||
**Solution:**
|
||||
```bash
|
||||
# Check system config exists
|
||||
ls -la ~/.config/claude/gitea.env
|
||||
|
||||
# If missing, recreate from Step 3
|
||||
```
|
||||
|
||||
### Authentication failed
|
||||
|
||||
**Problem:** "401 Unauthorized" or "Invalid token"
|
||||
|
||||
**Solution:**
|
||||
```bash
|
||||
# Test Gitea token
|
||||
curl -H "Authorization: token YOUR_TOKEN" \
|
||||
https://gitea.example.com/api/v1/user
|
||||
|
||||
# If fails, regenerate token (Step 2)
|
||||
```
|
||||
|
||||
### MCP server not starting
|
||||
|
||||
**Problem:** "Failed to start MCP server"
|
||||
|
||||
**Solution:**
|
||||
```bash
|
||||
# Check Python virtual environment exists
|
||||
ls plugins/projman/mcp-servers/gitea/.venv
|
||||
|
||||
# If missing, reinstall (Step 1)
|
||||
|
||||
# Check dependencies installed
|
||||
cd plugins/projman/mcp-servers/gitea
|
||||
source .venv/bin/activate
|
||||
python -c "import requests; import mcp"
|
||||
|
||||
# If import fails, reinstall requirements
|
||||
pip install -r requirements.txt
|
||||
```
|
||||
|
||||
### Wrong repository
|
||||
|
||||
**Problem:** Issues created in wrong repo
|
||||
|
||||
**Solution:**
|
||||
```bash
|
||||
# Check project .env configuration
|
||||
cat .env
|
||||
|
||||
# Verify GITEA_REPO matches Gitea repository name exactly
|
||||
|
||||
# Update if incorrect
|
||||
/initial-setup
|
||||
/labels-sync
|
||||
```
|
||||
|
||||
### Permissions errors
|
||||
|
||||
**Problem:** "Permission denied" when creating issues or wiki pages
|
||||
|
||||
**Solution:**
|
||||
- Verify the token has `repo`, `read:org`, `read:user`, and `write:repo` permissions (Step 2)
|
||||
- Regenerate token with correct permissions if needed
|
||||
|
||||
### Repository not in organization
|
||||
|
||||
**Problem:** "Repository must belong to configured organization"
|
||||
|
||||
**Solution:**
|
||||
- Verify `GITEA_ORG` in system config matches the organization owning the repository
|
||||
- Verify `GITEA_REPO` belongs to that organization
|
||||
- Fork the repository to your organization if needed
|
||||
|
||||
## Security Best Practices
|
||||
|
||||
1. **Never commit tokens**
|
||||
- Keep credentials in `~/.config/claude/` only
|
||||
- Never hardcode tokens in code
|
||||
- Use system-level config for credentials
|
||||
|
||||
2. **Secure configuration files**
|
||||
- Set `600` permissions on `~/.config/claude/*.env` (see the command sketch after this list)
|
||||
- Store in user home directory only
|
||||
- Don't share token files
|
||||
|
||||
3. **Rotate tokens periodically**
|
||||
- Regenerate tokens every 6-12 months
|
||||
- Immediately revoke if compromised
|
||||
- Use separate tokens for dev/prod if needed
|
||||
|
||||
4. **Minimum permissions**
|
||||
- Only grant required permissions
|
||||
- Gitea: `repo`, `read:org`, `read:user`, `write:repo`
|
||||
|
||||
5. **Monitor usage**
|
||||
- Review Gitea access logs periodically
|
||||
- Watch for unexpected API usage
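For item 2 above, locking down the credential files is a one-liner:

```bash
chmod 600 ~/.config/claude/*.env
ls -l ~/.config/claude/   # each .env should show -rw-------
```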
|
||||
|
||||
## Next Steps
|
||||
|
||||
After configuration is complete:
|
||||
|
||||
1. Run `/initial-setup` to verify everything works
|
||||
2. Run `/labels-sync` to fetch label taxonomy
|
||||
3. Try `/sprint-plan` to start your first sprint
|
||||
4. Read [README.md](./README.md) for usage guide
|
||||
|
||||
## Support
|
||||
|
||||
**Configuration Issues:**
|
||||
- Check [README.md](./README.md) troubleshooting section
|
||||
- Contact repository maintainer for support
|
||||
|
||||
**Questions:**
|
||||
- Read command documentation: `commands/*.md`
|
||||
- Check agent descriptions in `agents/`
|
||||
|
||||
---
|
||||
|
||||
**Configuration Status Checklist:**
|
||||
|
||||
- [ ] Python 3.10+ installed
|
||||
- [ ] Gitea MCP server installed (in `mcp-servers/gitea/`)
|
||||
- [ ] Gitea API token generated with correct permissions
|
||||
- [ ] System config created (`~/.config/claude/gitea.env`)
|
||||
- [ ] Project config created (`.env`)
|
||||
- [ ] Gitea connection tested
|
||||
- [ ] Wiki access tested
|
||||
- [ ] `/initial-setup` completed successfully
|
||||
- [ ] `/labels-sync` completed successfully
|
||||
- [ ] Commands verified available
|
||||
|
||||
Once all items are checked, you're ready to use Projman!
|
||||
For detailed setup instructions, troubleshooting, and security best practices, see [docs/CONFIGURATION.md](../../docs/CONFIGURATION.md).
|
||||
|
||||
1
plugins/projman/mcp-servers/gitea
Symbolic link
@@ -0,0 +1 @@
|
||||
../../../mcp-servers/gitea
|
||||
@@ -1,412 +0,0 @@
|
||||
# Gitea MCP Server
|
||||
|
||||
Model Context Protocol (MCP) server for Gitea integration with Claude Code.
|
||||
|
||||
## Overview
|
||||
|
||||
The Gitea MCP Server provides Claude Code with direct access to Gitea for issue management, label operations, and repository tracking. It supports both single-repository (project mode) and multi-repository (company/PMO mode) operations.
|
||||
|
||||
**Status**: ✅ Phase 1 Complete - Fully functional and tested
|
||||
|
||||
## Features
|
||||
|
||||
### Core Functionality
|
||||
|
||||
- **Issue Management**: CRUD operations for Gitea issues
|
||||
- **Label Taxonomy**: Dynamic 44-label system with intelligent suggestions
|
||||
- **Mode Detection**: Automatic project vs company-wide mode detection
|
||||
- **Branch-Aware Security**: Prevents accidental changes on production branches
|
||||
- **Hybrid Configuration**: System-level credentials + project-level paths
|
||||
- **PMO Support**: Multi-repository aggregation for organization-wide views
|
||||
|
||||
### Tools Provided
|
||||
|
||||
| Tool | Description | Mode |
|
||||
|------|-------------|------|
|
||||
| `list_issues` | List issues from repository | Both |
|
||||
| `get_issue` | Get specific issue details | Both |
|
||||
| `create_issue` | Create new issue with labels | Both |
|
||||
| `update_issue` | Update existing issue | Both |
|
||||
| `add_comment` | Add comment to issue | Both |
|
||||
| `get_labels` | Get all labels (org + repo) | Both |
|
||||
| `suggest_labels` | Intelligent label suggestion | Both |
|
||||
| `aggregate_issues` | Cross-repository issue aggregation | PMO Only |
|
||||
|
||||
## Architecture
|
||||
|
||||
### Directory Structure
|
||||
|
||||
```
|
||||
mcp-servers/gitea/
|
||||
├── .venv/ # Python virtual environment
|
||||
├── requirements.txt # Python dependencies
|
||||
├── mcp_server/
|
||||
│ ├── __init__.py
|
||||
│ ├── server.py # MCP server entry point
|
||||
│ ├── config.py # Configuration loader
|
||||
│ ├── gitea_client.py # Gitea API client
|
||||
│ └── tools/
|
||||
│ ├── __init__.py
|
||||
│ ├── issues.py # Issue tools
|
||||
│ └── labels.py # Label tools
|
||||
├── tests/
|
||||
│ ├── __init__.py
|
||||
│ ├── test_config.py
|
||||
│ ├── test_gitea_client.py
|
||||
│ ├── test_issues.py
|
||||
│ └── test_labels.py
|
||||
├── README.md # This file
|
||||
└── TESTING.md # Testing instructions
|
||||
```
|
||||
|
||||
### Mode Detection
|
||||
|
||||
The server operates in two modes based on environment variables:
|
||||
|
||||
**Project Mode** (Single Repository):
|
||||
- When `GITEA_REPO` is set
|
||||
- Operates on single repository
|
||||
- Used by `projman` plugin
|
||||
|
||||
**Company Mode** (Multi-Repository / PMO):
|
||||
- When `GITEA_REPO` is NOT set
|
||||
- Operates on all repositories in organization
|
||||
- Used by `projman-pmo` plugin
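A minimal sketch of this detection logic (illustrative only; the shipped implementation lives in `mcp_server/config.py`):

```python
import os

def detect_mode() -> str:
    """Return 'project' when GITEA_REPO is set, otherwise 'company' (PMO)."""
    return "project" if os.getenv("GITEA_REPO") else "company"
```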
|
||||
|
||||
### Branch-Aware Security
|
||||
|
||||
Operations are restricted based on the current Git branch:
|
||||
|
||||
| Branch | Read | Create Issue | Update/Comment |
|
||||
|--------|------|--------------|----------------|
|
||||
| `main`, `master`, `prod/*` | ✅ | ❌ | ❌ |
|
||||
| `staging`, `stage/*` | ✅ | ✅ | ❌ |
|
||||
| `development`, `develop`, `feat/*`, `dev/*` | ✅ | ✅ | ✅ |
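As an illustration, the checks behind this table could be implemented roughly as follows (a sketch only; the function names are assumptions and the real checks live in the issue tools):

```python
import subprocess

def current_branch() -> str:
    """Ask git for the currently checked-out branch name."""
    return subprocess.run(
        ["git", "branch", "--show-current"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()

def can_write(operation: str) -> bool:
    """operation is 'create' or 'update'; reads are always allowed."""
    branch = current_branch()
    if branch in ("main", "master") or branch.startswith("prod/"):
        return False
    if (branch == "staging" or branch.startswith("stage/")) and operation != "create":
        return False
    return True
```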
|
||||
|
||||
## Installation
|
||||
|
||||
### Prerequisites
|
||||
|
||||
- Python 3.10 or higher
|
||||
- Git repository (for branch detection)
|
||||
- Access to Gitea instance with API token
|
||||
|
||||
### Step 1: Install Dependencies
|
||||
|
||||
```bash
|
||||
cd mcp-servers/gitea
|
||||
python3 -m venv .venv
|
||||
source .venv/bin/activate # Linux/Mac
|
||||
# or .venv\Scripts\activate # Windows
|
||||
pip install -r requirements.txt
|
||||
```
|
||||
|
||||
### Step 2: Configure System-Level Settings
|
||||
|
||||
Create `~/.config/claude/gitea.env`:
|
||||
|
||||
```bash
|
||||
mkdir -p ~/.config/claude
|
||||
|
||||
cat > ~/.config/claude/gitea.env << EOF
|
||||
GITEA_API_URL=https://gitea.example.com/api/v1
|
||||
GITEA_API_TOKEN=your_gitea_token_here
|
||||
GITEA_OWNER=bandit
|
||||
EOF
|
||||
|
||||
chmod 600 ~/.config/claude/gitea.env
|
||||
```
|
||||
|
||||
### Step 3: Configure Project-Level Settings (Optional)
|
||||
|
||||
For project mode, create `.env` in your project root:
|
||||
|
||||
```bash
|
||||
echo "GITEA_REPO=your-repo-name" > .env
|
||||
echo ".env" >> .gitignore
|
||||
```
|
||||
|
||||
For company/PMO mode, omit the `.env` file or don't set `GITEA_REPO`.
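A quick way to confirm which mode the server will pick up for the current directory (assumes you are in the project root):

```bash
if grep -qs '^GITEA_REPO=' .env; then
  echo "Project mode: $(grep '^GITEA_REPO=' .env)"
else
  echo "Company/PMO mode (GITEA_REPO not set)"
fi
```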
|
||||
|
||||
## Configuration
|
||||
|
||||
### System-Level Configuration
|
||||
|
||||
**File**: `~/.config/claude/gitea.env`
|
||||
|
||||
**Required Variables**:
|
||||
- `GITEA_API_URL` - Gitea API endpoint (e.g., `https://gitea.example.com/api/v1`)
|
||||
- `GITEA_API_TOKEN` - Personal access token with repo permissions
|
||||
- `GITEA_OWNER` - Organization or user name (e.g., `bandit`)
|
||||
|
||||
### Project-Level Configuration
|
||||
|
||||
**File**: `<project-root>/.env`
|
||||
|
||||
**Optional Variables**:
|
||||
- `GITEA_REPO` - Repository name (enables project mode)
|
||||
|
||||
### Generating Gitea API Token
|
||||
|
||||
1. Log into Gitea: https://gitea.example.com
|
||||
2. Navigate to: **Settings** → **Applications** → **Manage Access Tokens**
|
||||
3. Click **Generate New Token**
|
||||
4. Configure token:
|
||||
- **Token Name**: `claude-code-mcp`
|
||||
- **Permissions**:
|
||||
- ✅ `repo` (all) - Read/write repositories, issues, labels
|
||||
- ✅ `read:org` - Read organization information and labels
|
||||
- ✅ `read:user` - Read user information
|
||||
5. Click **Generate Token**
|
||||
6. Copy token immediately (shown only once)
|
||||
7. Add to `~/.config/claude/gitea.env`
|
||||
|
||||
## Usage
|
||||
|
||||
### Running the MCP Server
|
||||
|
||||
```bash
|
||||
cd mcp-servers/gitea
|
||||
source .venv/bin/activate
|
||||
python -m mcp_server.server
|
||||
```
|
||||
|
||||
The server communicates via JSON-RPC 2.0 over stdio.
|
||||
|
||||
### Integration with Claude Code Plugins
|
||||
|
||||
The MCP server is designed to be used by Claude Code plugins via `.mcp.json` configuration:
|
||||
|
||||
```json
|
||||
{
|
||||
"mcpServers": {
|
||||
"gitea": {
|
||||
"command": "python",
|
||||
"args": ["-m", "mcp_server.server"],
|
||||
"cwd": "${CLAUDE_PLUGIN_ROOT}/../mcp-servers/gitea",
|
||||
"env": {
|
||||
"PYTHONPATH": "${CLAUDE_PLUGIN_ROOT}/../mcp-servers/gitea"
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### Example Tool Calls
|
||||
|
||||
**List Issues**:
|
||||
```python
|
||||
from mcp_server.tools.issues import IssueTools
|
||||
from mcp_server.gitea_client import GiteaClient
|
||||
|
||||
client = GiteaClient()
|
||||
issue_tools = IssueTools(client)
|
||||
|
||||
issues = await issue_tools.list_issues(state='open', labels=['Type/Bug'])
|
||||
```
|
||||
|
||||
**Suggest Labels**:
|
||||
```python
|
||||
from mcp_server.tools.labels import LabelTools
|
||||
|
||||
label_tools = LabelTools(client)
|
||||
|
||||
context = "Fix critical authentication bug in production API"
|
||||
suggestions = await label_tools.suggest_labels(context)
|
||||
# Returns: ['Type/Bug', 'Priority/Critical', 'Component/Auth', 'Component/API', ...]
|
||||
```
|
||||
|
||||
## Testing
|
||||
|
||||
### Unit Tests
|
||||
|
||||
Run all 42 unit tests with mocks:
|
||||
|
||||
```bash
|
||||
pytest tests/ -v
|
||||
```
|
||||
|
||||
Expected: `42 passed in 0.57s`
|
||||
|
||||
### Integration Tests
|
||||
|
||||
Test with real Gitea instance:
|
||||
|
||||
```bash
|
||||
python -c "
|
||||
from mcp_server.gitea_client import GiteaClient
|
||||
|
||||
client = GiteaClient()
|
||||
issues = client.list_issues(state='open')
|
||||
print(f'Found {len(issues)} open issues')
|
||||
"
|
||||
```
|
||||
|
||||
### Full Testing Guide
|
||||
|
||||
See [TESTING.md](./TESTING.md) for comprehensive testing instructions.
|
||||
|
||||
## Label Taxonomy System
|
||||
|
||||
The system supports a dynamic 44-label taxonomy (28 org + 16 repo):
|
||||
|
||||
**Organization Labels (28)**:
|
||||
- `Agent/*` (2) - Agent/Human, Agent/Claude
|
||||
- `Complexity/*` (3) - Simple, Medium, Complex
|
||||
- `Efforts/*` (5) - XS, S, M, L, XL
|
||||
- `Priority/*` (4) - Low, Medium, High, Critical
|
||||
- `Risk/*` (3) - Low, Medium, High
|
||||
- `Source/*` (4) - Development, Staging, Production, Customer
|
||||
- `Type/*` (6) - Bug, Feature, Refactor, Documentation, Test, Chore
|
||||
|
||||
**Repository Labels (16)**:
|
||||
- `Component/*` (9) - Backend, Frontend, API, Database, Auth, Deploy, Testing, Docs, Infra
|
||||
- `Tech/*` (7) - Python, JavaScript, Docker, PostgreSQL, Redis, Vue, FastAPI
|
||||
|
||||
Labels are fetched dynamically from Gitea and suggestions adapt to the current taxonomy.
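For illustration, a simplified version of the matching behind `suggest_labels` might look like the sketch below; the keyword table and the filtering against the live taxonomy are assumptions, not the shipped implementation:

```python
# Keyword-to-label hints; in practice the label set comes from get_labels()/get_org_labels().
KEYWORD_LABELS = {
    "bug": "Type/Bug",
    "fix": "Type/Bug",
    "critical": "Priority/Critical",
    "auth": "Component/Auth",
    "api": "Component/API",
    "docs": "Type/Documentation",
}

def suggest_labels(context: str, available: set[str]) -> list[str]:
    """Return labels whose keywords appear in the context and exist in the current taxonomy."""
    text = context.lower()
    hits = {label for keyword, label in KEYWORD_LABELS.items() if keyword in text}
    return sorted(hits & available)
```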
|
||||
|
||||
## Security
|
||||
|
||||
### Token Storage
|
||||
|
||||
- Store tokens in `~/.config/claude/gitea.env`
|
||||
- Set file permissions to `600` (read/write owner only)
|
||||
- Never commit tokens to Git
|
||||
- Use separate tokens for development and production
|
||||
|
||||
### Branch Detection
|
||||
|
||||
The MCP server implements defense-in-depth branch detection:
|
||||
|
||||
1. **MCP Tools**: Check branch before operations
|
||||
2. **Agent Prompts**: Warn users about branch restrictions
|
||||
3. **CLAUDE.md**: Provides additional context
|
||||
|
||||
### Input Validation
|
||||
|
||||
- All user input is validated before API calls
|
||||
- Issue titles and descriptions are sanitized
|
||||
- Label names are checked against taxonomy
|
||||
- Repository names are validated
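A minimal sketch of such checks (limits and patterns are illustrative):

```python
import re

REPO_RE = re.compile(r"^[\w.-]+/[\w.-]+$")  # owner/repo

def validate_issue_title(title: str) -> str:
    title = title.strip()
    if not title or len(title) > 255:
        raise ValueError("Issue title must be 1-255 characters")
    return title

def validate_repo(repo: str) -> str:
    if not REPO_RE.match(repo):
        raise ValueError("Repository must be in 'owner/repo' format")
    return repo
```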
|
||||
|
||||
## Troubleshooting
|
||||
|
||||
### Common Issues
|
||||
|
||||
**Module not found**:
|
||||
```bash
|
||||
cd mcp-servers/gitea
|
||||
source .venv/bin/activate
|
||||
```
|
||||
|
||||
**Configuration not found**:
|
||||
```bash
|
||||
ls -la ~/.config/claude/gitea.env
|
||||
# If missing, create it following installation steps
|
||||
```
|
||||
|
||||
**Authentication failed**:
|
||||
```bash
|
||||
# Test token manually
|
||||
curl -H "Authorization: token YOUR_TOKEN" \
|
||||
https://gitea.example.com/api/v1/user
|
||||
```
|
||||
|
||||
**Permission denied on branch**:
|
||||
```bash
|
||||
# Check current branch
|
||||
git branch --show-current
|
||||
|
||||
# Switch to development branch
|
||||
git checkout development
|
||||
```
|
||||
|
||||
See [TESTING.md](./TESTING.md#troubleshooting) for more details.
|
||||
|
||||
## Development
|
||||
|
||||
### Project Structure
|
||||
|
||||
- `config.py` - Hybrid configuration loader with mode detection
|
||||
- `gitea_client.py` - Synchronous Gitea API client using requests
|
||||
- `tools/issues.py` - Async wrappers with branch detection
|
||||
- `tools/labels.py` - Label management and suggestion
|
||||
- `server.py` - MCP server with JSON-RPC 2.0 over stdio
|
||||
|
||||
### Adding New Tools
|
||||
|
||||
1. Add method to `GiteaClient` (sync)
|
||||
2. Add async wrapper to appropriate tool class
|
||||
3. Register tool in `server.py` `setup_tools()`
|
||||
4. Add unit tests
|
||||
5. Update documentation
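A rough sketch of steps 1-3 for a hypothetical `get_comments` tool (the endpoint mirrors `add_comment`; the wrapper's `self.client` attribute and the registration hook are assumptions):

```python
# 1. GiteaClient (sync)
def get_comments(self, issue_number: int, repo: str | None = None) -> list:
    owner, target_repo = self._parse_repo(repo)
    url = f"{self.base_url}/repos/{owner}/{target_repo}/issues/{issue_number}/comments"
    response = self.session.get(url)
    response.raise_for_status()
    return response.json()

# 2. Async wrapper on the tool class (e.g. IssueTools)
async def get_comments(self, issue_number: int, repo: str | None = None) -> list:
    return self.client.get_comments(issue_number, repo)

# 3. Register the new tool in server.py's setup_tools() next to the existing entries,
#    then add tests and documentation (steps 4-5).
```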
|
||||
|
||||
### Testing Philosophy
|
||||
|
||||
- **Unit tests**: Use mocks for fast feedback
|
||||
- **Integration tests**: Use real Gitea API for validation
|
||||
- **Branch detection**: Test all branch types
|
||||
- **Mode detection**: Test both project and company modes
|
||||
|
||||
## Performance
|
||||
|
||||
### Caching
|
||||
|
||||
Labels are cached to reduce API calls:
|
||||
|
||||
```python
|
||||
from functools import lru_cache
|
||||
|
||||
@lru_cache(maxsize=128)
|
||||
def get_labels_cached(self, repo: str):
|
||||
return self.get_labels(repo)
|
||||
```
|
||||
|
||||
### Retry Logic
|
||||
|
||||
API calls include automatic retry with exponential backoff:
|
||||
|
||||
```python
|
||||
@retry_on_failure(max_retries=3, delay=1)
|
||||
def list_issues(self, state='open', labels=None, repo=None):
|
||||
# Implementation
|
||||
```
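The `retry_on_failure` decorator itself is not defined in this file; one plausible shape for it, assuming exponential backoff on `requests` failures:

```python
import time
import functools
import requests

def retry_on_failure(max_retries: int = 3, delay: float = 1.0):
    """Retry the wrapped call on request errors, doubling the wait after each attempt."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            wait = delay
            for attempt in range(1, max_retries + 1):
                try:
                    return func(*args, **kwargs)
                except requests.RequestException:
                    if attempt == max_retries:
                        raise
                    time.sleep(wait)
                    wait *= 2
        return wrapper
    return decorator
```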
|
||||
|
||||
## Changelog
|
||||
|
||||
### v1.0.0 (2025-01-06) - Phase 1 Complete
|
||||
|
||||
✅ Initial implementation:
|
||||
- Configuration management (hybrid system + project)
|
||||
- Gitea API client with all CRUD operations
|
||||
- MCP server with 8 tools
|
||||
- Issue tools with branch detection
|
||||
- Label tools with intelligent suggestions
|
||||
- Mode detection (project vs company)
|
||||
- Branch-aware security model
|
||||
- 42 unit tests (100% passing)
|
||||
- Comprehensive documentation
|
||||
|
||||
## License
|
||||
|
||||
MIT License - Part of the Claude Code Marketplace project.
|
||||
|
||||
## Related Documentation
|
||||
|
||||
- **Projman Documentation**: `plugins/projman/README.md`
|
||||
- **Configuration Guide**: `plugins/projman/CONFIGURATION.md`
|
||||
- **Testing Guide**: `TESTING.md`
|
||||
|
||||
## Support
|
||||
|
||||
For issues or questions:
|
||||
1. Check [TESTING.md](./TESTING.md) troubleshooting section
|
||||
2. Review [plugins/projman/README.md](../../README.md) for plugin documentation
|
||||
3. Create an issue in the project repository
|
||||
|
||||
---
|
||||
|
||||
**Built for**: Claude Code Marketplace - Project Management Plugins
|
||||
**Phase**: 1 (Complete)
|
||||
**Status**: ✅ Production Ready
|
||||
**Last Updated**: 2025-01-06
|
||||
@@ -1,582 +0,0 @@
|
||||
# Gitea MCP Server - Testing Guide
|
||||
|
||||
This document provides comprehensive testing instructions for the Gitea MCP Server implementation.
|
||||
|
||||
## Table of Contents
|
||||
|
||||
1. [Unit Tests](#unit-tests)
|
||||
2. [Manual MCP Server Testing](#manual-mcp-server-testing)
|
||||
3. [Integration Testing](#integration-testing)
|
||||
4. [Configuration Setup for Testing](#configuration-setup-for-testing)
|
||||
5. [Troubleshooting](#troubleshooting)
|
||||
|
||||
---
|
||||
|
||||
## Unit Tests
|
||||
|
||||
Unit tests use mocks to test all modules without requiring a real Gitea instance.
|
||||
|
||||
### Prerequisites
|
||||
|
||||
Ensure the virtual environment is activated and dependencies are installed:
|
||||
|
||||
```bash
|
||||
cd mcp-servers/gitea
|
||||
source .venv/bin/activate # Linux/Mac
|
||||
# or .venv\Scripts\activate # Windows
|
||||
```
|
||||
|
||||
### Running All Tests
|
||||
|
||||
Run all 42 unit tests:
|
||||
|
||||
```bash
|
||||
pytest tests/ -v
|
||||
```
|
||||
|
||||
Expected output:
|
||||
```
|
||||
============================== 42 passed in 0.57s ==============================
|
||||
```
|
||||
|
||||
### Running Specific Test Files
|
||||
|
||||
Run tests for a specific module:
|
||||
|
||||
```bash
|
||||
# Configuration tests
|
||||
pytest tests/test_config.py -v
|
||||
|
||||
# Gitea client tests
|
||||
pytest tests/test_gitea_client.py -v
|
||||
|
||||
# Issue tools tests
|
||||
pytest tests/test_issues.py -v
|
||||
|
||||
# Label tools tests
|
||||
pytest tests/test_labels.py -v
|
||||
```
|
||||
|
||||
### Running Specific Tests
|
||||
|
||||
Run a single test:
|
||||
|
||||
```bash
|
||||
pytest tests/test_config.py::test_load_system_config -v
|
||||
```
|
||||
|
||||
### Test Coverage
|
||||
|
||||
Generate coverage report:
|
||||
|
||||
```bash
|
||||
pytest --cov=mcp_server --cov-report=html tests/
|
||||
|
||||
# View coverage report
|
||||
# Open htmlcov/index.html in your browser
|
||||
```
|
||||
|
||||
Expected coverage: >80% for all modules
|
||||
|
||||
### Test Organization
|
||||
|
||||
**Configuration Tests** (`test_config.py`):
|
||||
- System-level configuration loading
|
||||
- Project-level configuration override
|
||||
- Mode detection (project vs company)
|
||||
- Missing configuration handling
|
||||
|
||||
**Gitea Client Tests** (`test_gitea_client.py`):
|
||||
- API client initialization
|
||||
- Issue CRUD operations
|
||||
- Label retrieval
|
||||
- PMO multi-repo operations
|
||||
|
||||
**Issue Tools Tests** (`test_issues.py`):
|
||||
- Branch-aware security checks
|
||||
- Async wrappers for sync client
|
||||
- Permission enforcement
|
||||
- PMO aggregation mode
|
||||
|
||||
**Label Tools Tests** (`test_labels.py`):
|
||||
- Label retrieval (org + repo)
|
||||
- Intelligent label suggestion
|
||||
- Multi-category detection
|
||||
|
||||
---
|
||||
|
||||
## Manual MCP Server Testing
|
||||
|
||||
Test the MCP server manually using stdio communication.
|
||||
|
||||
### Step 1: Start the MCP Server
|
||||
|
||||
```bash
|
||||
cd mcp-servers/gitea
|
||||
source .venv/bin/activate
|
||||
python -m mcp_server.server
|
||||
```
|
||||
|
||||
The server will start and wait for JSON-RPC 2.0 messages on stdin.
|
||||
|
||||
### Step 2: Test Tool Listing
|
||||
|
||||
From another terminal, send a tool-listing request (this pipes a single request into a fresh server instance):
|
||||
|
||||
```bash
|
||||
echo '{"jsonrpc": "2.0", "id": 1, "method": "tools/list"}' | python -m mcp_server.server
|
||||
```
|
||||
|
||||
Expected response:
|
||||
```json
|
||||
{
|
||||
"jsonrpc": "2.0",
|
||||
"id": 1,
|
||||
"result": {
|
||||
"tools": [
|
||||
{"name": "list_issues", "description": "List issues from Gitea repository", ...},
|
||||
{"name": "get_issue", "description": "Get specific issue details", ...},
|
||||
{"name": "create_issue", "description": "Create a new issue in Gitea", ...},
|
||||
...
|
||||
]
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### Step 3: Test Tool Invocation
|
||||
|
||||
**Note:** Manual tool invocation requires proper configuration. See [Configuration Setup](#configuration-setup-for-testing).
|
||||
|
||||
Example: List issues
|
||||
```bash
|
||||
echo '{
|
||||
"jsonrpc": "2.0",
|
||||
"id": 2,
|
||||
"method": "tools/call",
|
||||
"params": {
|
||||
"name": "list_issues",
|
||||
"arguments": {
|
||||
"state": "open"
|
||||
}
|
||||
}
|
||||
}' | python -m mcp_server.server
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## Integration Testing
|
||||
|
||||
Test the MCP server with a real Gitea instance.
|
||||
|
||||
### Prerequisites
|
||||
|
||||
1. **Gitea Instance**: Access to https://gitea.example.com (or your Gitea instance)
|
||||
2. **API Token**: Personal access token with required permissions
|
||||
3. **Configuration**: Properly configured system and project configs
|
||||
|
||||
### Step 1: Configuration Setup
|
||||
|
||||
Create system-level configuration:
|
||||
|
||||
```bash
|
||||
mkdir -p ~/.config/claude
|
||||
|
||||
cat > ~/.config/claude/gitea.env << EOF
|
||||
GITEA_API_URL=https://gitea.example.com/api/v1
|
||||
GITEA_API_TOKEN=your_gitea_token_here
|
||||
GITEA_OWNER=bandit
|
||||
EOF
|
||||
|
||||
chmod 600 ~/.config/claude/gitea.env
|
||||
```
|
||||
|
||||
Create project-level configuration (for project mode testing):
|
||||
|
||||
```bash
|
||||
cd /path/to/test/project
|
||||
|
||||
cat > .env << EOF
|
||||
GITEA_REPO=test-repo
|
||||
EOF
|
||||
|
||||
# Add to .gitignore
|
||||
echo ".env" >> .gitignore
|
||||
```
|
||||
|
||||
### Step 2: Generate Gitea API Token
|
||||
|
||||
1. Log into Gitea: https://gitea.example.com
|
||||
2. Navigate to: **Settings** → **Applications** → **Manage Access Tokens**
|
||||
3. Click **Generate New Token**
|
||||
4. Token configuration:
|
||||
- **Token Name:** `mcp-integration-test`
|
||||
- **Required Permissions:**
|
||||
- ✅ `repo` (all) - Read/write access to repositories, issues, labels
|
||||
- ✅ `read:org` - Read organization information and labels
|
||||
- ✅ `read:user` - Read user information
|
||||
5. Click **Generate Token**
|
||||
6. Copy the token immediately (shown only once)
|
||||
7. Add to `~/.config/claude/gitea.env`
|
||||
|
||||
### Step 3: Verify Configuration
|
||||
|
||||
Test configuration loading:
|
||||
|
||||
```bash
|
||||
cd mcp-servers/gitea
|
||||
source .venv/bin/activate
|
||||
python -c "
|
||||
from mcp_server.config import GiteaConfig
|
||||
config = GiteaConfig()
|
||||
result = config.load()
|
||||
print(f'API URL: {result[\"api_url\"]}')
|
||||
print(f'Owner: {result[\"owner\"]}')
|
||||
print(f'Repo: {result[\"repo\"]}')
|
||||
print(f'Mode: {result[\"mode\"]}')
|
||||
"
|
||||
```
|
||||
|
||||
Expected output:
|
||||
```
|
||||
API URL: https://gitea.example.com/api/v1
|
||||
Owner: bandit
|
||||
Repo: test-repo (or None for company mode)
|
||||
Mode: project (or company)
|
||||
```
|
||||
|
||||
### Step 4: Test Gitea Client
|
||||
|
||||
Test basic Gitea API operations:
|
||||
|
||||
```bash
|
||||
python -c "
|
||||
from mcp_server.gitea_client import GiteaClient
|
||||
|
||||
client = GiteaClient()
|
||||
|
||||
# Test listing issues
|
||||
print('Testing list_issues...')
|
||||
issues = client.list_issues(state='open')
|
||||
print(f'Found {len(issues)} open issues')
|
||||
|
||||
# Test getting labels
|
||||
print('\\nTesting get_labels...')
|
||||
labels = client.get_labels()
|
||||
print(f'Found {len(labels)} repository labels')
|
||||
|
||||
# Test getting org labels
|
||||
print('\\nTesting get_org_labels...')
|
||||
org_labels = client.get_org_labels('bandit')  # organization name, matches GITEA_OWNER
|
||||
print(f'Found {len(org_labels)} organization labels')
|
||||
|
||||
print('\\n✅ All integration tests passed!')
|
||||
"
|
||||
```
|
||||
|
||||
### Step 5: Test Issue Creation (Optional)
|
||||
|
||||
**Warning:** This creates a real issue in Gitea. Use a test repository.
|
||||
|
||||
```bash
|
||||
python -c "
|
||||
from mcp_server.gitea_client import GiteaClient
|
||||
|
||||
client = GiteaClient()
|
||||
|
||||
# Create test issue
|
||||
print('Creating test issue...')
|
||||
issue = client.create_issue(
|
||||
title='[TEST] MCP Server Integration Test',
|
||||
body='This is a test issue created by the Gitea MCP Server integration tests.',
|
||||
labels=['Type/Test']
|
||||
)
|
||||
print(f'Created issue #{issue[\"number\"]}: {issue[\"title\"]}')
|
||||
|
||||
# Clean up: Close the issue
|
||||
print('\\nClosing test issue...')
|
||||
client.update_issue(issue['number'], state='closed')
|
||||
print('✅ Test issue closed')
|
||||
"
|
||||
```
|
||||
|
||||
### Step 6: Test MCP Server with Real API
|
||||
|
||||
Start the MCP server and test with real Gitea API:
|
||||
|
||||
```bash
|
||||
cd mcp-servers/gitea
|
||||
source .venv/bin/activate
|
||||
|
||||
# Run server with test script
|
||||
python << 'EOF'
|
||||
import asyncio
|
||||
import json
|
||||
from mcp_server.server import GiteaMCPServer
|
||||
|
||||
async def test_server():
|
||||
server = GiteaMCPServer()
|
||||
await server.initialize()
|
||||
|
||||
# Test list_issues
|
||||
result = await server.issue_tools.list_issues(state='open')
|
||||
print(f'Found {len(result)} open issues')
|
||||
|
||||
# Test get_labels
|
||||
labels = await server.label_tools.get_labels()
|
||||
print(f'Found {labels["total_count"]} total labels')
|
||||
|
||||
# Test suggest_labels
|
||||
suggestions = await server.label_tools.suggest_labels(
|
||||
"Fix critical bug in authentication"
|
||||
)
|
||||
print(f'Suggested labels: {", ".join(suggestions)}')
|
||||
|
||||
print('✅ All MCP server integration tests passed!')
|
||||
|
||||
asyncio.run(test_server())
|
||||
EOF
|
||||
```
|
||||
|
||||
### Step 7: Test PMO Mode (Optional)
|
||||
|
||||
Test company-wide mode (no GITEA_REPO):
|
||||
|
||||
```bash
|
||||
# Temporarily remove GITEA_REPO (run from a directory without a project .env, since .env would override this)
|
||||
unset GITEA_REPO
|
||||
|
||||
python -c "
|
||||
from mcp_server.gitea_client import GiteaClient
|
||||
|
||||
client = GiteaClient()
|
||||
|
||||
print(f'Running in {client.mode} mode')
|
||||
|
||||
# Test list_repos
|
||||
print('\\nTesting list_repos...')
|
||||
repos = client.list_repos('bandit')  # organization name, matches GITEA_OWNER
|
||||
print(f'Found {len(repos)} repositories')
|
||||
|
||||
# Test aggregate_issues
|
||||
print('\\nTesting aggregate_issues...')
|
||||
aggregated = client.aggregate_issues('bandit', state='open')
|
||||
for repo_name, issues in aggregated.items():
|
||||
print(f' {repo_name}: {len(issues)} open issues')
|
||||
|
||||
print('\\n✅ PMO mode tests passed!')
|
||||
"
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## Configuration Setup for Testing
|
||||
|
||||
### Minimal Configuration
|
||||
|
||||
**System-level** (`~/.config/claude/gitea.env`):
|
||||
```bash
|
||||
GITEA_API_URL=https://gitea.example.com/api/v1
|
||||
GITEA_API_TOKEN=your_token_here
|
||||
GITEA_OWNER=bandit
|
||||
```
|
||||
|
||||
**Project-level** (`.env` in project root):
|
||||
```bash
|
||||
# For project mode
|
||||
GITEA_REPO=test-repo
|
||||
|
||||
# For company mode (PMO), omit GITEA_REPO
|
||||
```
|
||||
|
||||
### Verification
|
||||
|
||||
Verify configuration is correct:
|
||||
|
||||
```bash
|
||||
# Check system config exists
|
||||
ls -la ~/.config/claude/gitea.env
|
||||
|
||||
# Check permissions (should be 600)
|
||||
stat -c "%a %n" ~/.config/claude/gitea.env
|
||||
|
||||
# Check content (without exposing token)
|
||||
grep -v TOKEN ~/.config/claude/gitea.env
|
||||
|
||||
# Check project config (if using project mode)
|
||||
cat .env
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## Troubleshooting
|
||||
|
||||
### Common Issues
|
||||
|
||||
#### 1. Import Errors
|
||||
|
||||
**Error:**
|
||||
```
|
||||
ModuleNotFoundError: No module named 'mcp_server'
|
||||
```
|
||||
|
||||
**Solution:**
|
||||
```bash
|
||||
# Ensure you're in the correct directory
|
||||
cd mcp-servers/gitea
|
||||
|
||||
# Activate virtual environment
|
||||
source .venv/bin/activate
|
||||
|
||||
# Verify installation
|
||||
pip list | grep mcp
|
||||
```
|
||||
|
||||
#### 2. Configuration Not Found
|
||||
|
||||
**Error:**
|
||||
```
|
||||
FileNotFoundError: System config not found: /home/user/.config/claude/gitea.env
|
||||
```
|
||||
|
||||
**Solution:**
|
||||
```bash
|
||||
# Create system config
|
||||
mkdir -p ~/.config/claude
|
||||
cat > ~/.config/claude/gitea.env << EOF
|
||||
GITEA_API_URL=https://gitea.example.com/api/v1
|
||||
GITEA_API_TOKEN=your_token_here
|
||||
GITEA_OWNER=bandit
|
||||
EOF
|
||||
|
||||
chmod 600 ~/.config/claude/gitea.env
|
||||
```
|
||||
|
||||
#### 3. Missing Required Configuration
|
||||
|
||||
**Error:**
|
||||
```
|
||||
ValueError: Missing required configuration: GITEA_API_TOKEN, GITEA_OWNER
|
||||
```
|
||||
|
||||
**Solution:**
|
||||
```bash
|
||||
# Check configuration file
|
||||
cat ~/.config/claude/gitea.env
|
||||
|
||||
# Ensure all required variables are present:
|
||||
# - GITEA_API_URL
|
||||
# - GITEA_API_TOKEN
|
||||
# - GITEA_OWNER
|
||||
```
|
||||
|
||||
#### 4. API Authentication Failed
|
||||
|
||||
**Error:**
|
||||
```
|
||||
requests.exceptions.HTTPError: 401 Client Error: Unauthorized
|
||||
```
|
||||
|
||||
**Solution:**
|
||||
```bash
|
||||
# Test token manually
|
||||
curl -H "Authorization: token YOUR_TOKEN" \
|
||||
https://gitea.example.com/api/v1/user
|
||||
|
||||
# If fails, regenerate token in Gitea settings
|
||||
```
|
||||
|
||||
#### 5. Permission Errors (Branch Detection)
|
||||
|
||||
**Error:**
|
||||
```
|
||||
PermissionError: Cannot create issues on branch 'main'
|
||||
```
|
||||
|
||||
**Solution:**
|
||||
```bash
|
||||
# Check current branch
|
||||
git branch --show-current
|
||||
|
||||
# Switch to development branch
|
||||
git checkout development
|
||||
# or
|
||||
git checkout -b feat/test-feature
|
||||
```
|
||||
|
||||
#### 6. Repository Not Specified
|
||||
|
||||
**Error:**
|
||||
```
|
||||
ValueError: Repository not specified
|
||||
```
|
||||
|
||||
**Solution:**
|
||||
```bash
|
||||
# Add GITEA_REPO to project config
|
||||
echo "GITEA_REPO=your-repo-name" >> .env
|
||||
|
||||
# Or specify repo in tool call
|
||||
# (for PMO mode multi-repo operations)
|
||||
```
|
||||
|
||||
### Debug Mode
|
||||
|
||||
Enable debug logging:
|
||||
|
||||
```bash
|
||||
export LOG_LEVEL=DEBUG
|
||||
python -m mcp_server.server
|
||||
```
|
||||
|
||||
### Test Summary
|
||||
|
||||
After completing all tests, verify:
|
||||
|
||||
- ✅ All 42 unit tests pass
|
||||
- ✅ MCP server starts without errors
|
||||
- ✅ Configuration loads correctly
|
||||
- ✅ Gitea API client connects successfully
|
||||
- ✅ Issues can be listed from Gitea
|
||||
- ✅ Labels can be retrieved
|
||||
- ✅ Label suggestions work correctly
|
||||
- ✅ Branch detection blocks writes on main/staging
|
||||
- ✅ Mode detection works (project vs company)
|
||||
|
||||
---
|
||||
|
||||
## Success Criteria
|
||||
|
||||
Phase 1 is complete when:
|
||||
|
||||
1. **All unit tests pass** (42/42)
|
||||
2. **MCP server starts without errors**
|
||||
3. **Can list issues from Gitea**
|
||||
4. **Can create issues with labels** (in development mode)
|
||||
5. **Mode detection works** (project vs company)
|
||||
6. **Branch detection prevents writes on main/staging**
|
||||
7. **Configuration properly merges** system + project levels
|
||||
|
||||
---
|
||||
|
||||
## Next Steps
|
||||
|
||||
After completing testing:
|
||||
|
||||
1. **Document any issues** found during testing
|
||||
2. **Create integration with projman plugin** (Phase 2)
|
||||
3. **Test in real project workflow** (Phase 5)
|
||||
4. **Performance optimization** (if needed)
|
||||
5. **Production hardening** (Phase 8)
|
||||
|
||||
---
|
||||
|
||||
## Additional Resources
|
||||
|
||||
- **MCP Documentation**: https://docs.anthropic.com/claude/docs/mcp
|
||||
- **Gitea API Documentation**: https://docs.gitea.io/en-us/api-usage/
|
||||
- **Projman Documentation**: `plugins/projman/README.md`
|
||||
- **Configuration Guide**: `plugins/projman/CONFIGURATION.md`
|
||||
|
||||
---
|
||||
|
||||
**Last Updated**: 2025-01-06 (Phase 1 Implementation)
|
||||
@@ -1,98 +0,0 @@
|
||||
"""
|
||||
Configuration loader for Gitea MCP Server.
|
||||
|
||||
Implements hybrid configuration system:
|
||||
- System-level: ~/.config/claude/gitea.env (credentials)
|
||||
- Project-level: .env (repository specification)
|
||||
"""
|
||||
from pathlib import Path
|
||||
from dotenv import load_dotenv
|
||||
import os
|
||||
import logging
|
||||
from typing import Dict, Optional
|
||||
|
||||
logging.basicConfig(level=logging.INFO)
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class GiteaConfig:
|
||||
"""Hybrid configuration loader with mode detection"""
|
||||
|
||||
def __init__(self):
|
||||
self.api_url: Optional[str] = None
|
||||
self.api_token: Optional[str] = None
|
||||
self.repo: Optional[str] = None
|
||||
self.mode: str = 'project'
|
||||
|
||||
def load(self) -> Dict[str, Optional[str]]:
|
||||
"""
|
||||
Load configuration from system and project levels.
|
||||
Project-level configuration overrides system-level.
|
||||
|
||||
Returns:
|
||||
Dict containing api_url, api_token, repo, mode
|
||||
|
||||
Raises:
|
||||
FileNotFoundError: If system config is missing
|
||||
ValueError: If required configuration is missing
|
||||
"""
|
||||
# Load system config
|
||||
system_config = Path.home() / '.config' / 'claude' / 'gitea.env'
|
||||
if system_config.exists():
|
||||
load_dotenv(system_config)
|
||||
logger.info(f"Loaded system configuration from {system_config}")
|
||||
else:
|
||||
raise FileNotFoundError(
|
||||
f"System config not found: {system_config}\n"
|
||||
"Create it with: mkdir -p ~/.config/claude && "
|
||||
"cat > ~/.config/claude/gitea.env"
|
||||
)
|
||||
|
||||
# Load project config (overrides system)
|
||||
project_config = Path.cwd() / '.env'
|
||||
if project_config.exists():
|
||||
load_dotenv(project_config, override=True)
|
||||
logger.info(f"Loaded project configuration from {project_config}")
|
||||
|
||||
# Extract values
|
||||
self.api_url = os.getenv('GITEA_API_URL')
|
||||
self.api_token = os.getenv('GITEA_API_TOKEN')
|
||||
self.repo = os.getenv('GITEA_REPO') # Optional, must be owner/repo format
|
||||
|
||||
# Detect mode
|
||||
if self.repo:
|
||||
self.mode = 'project'
|
||||
logger.info(f"Running in project mode: {self.repo}")
|
||||
else:
|
||||
self.mode = 'company'
|
||||
logger.info("Running in company-wide mode (PMO)")
|
||||
|
||||
# Validate required variables
|
||||
self._validate()
|
||||
|
||||
return {
|
||||
'api_url': self.api_url,
|
||||
'api_token': self.api_token,
|
||||
'repo': self.repo,
|
||||
'mode': self.mode
|
||||
}
|
||||
|
||||
def _validate(self) -> None:
|
||||
"""
|
||||
Validate that required configuration is present.
|
||||
|
||||
Raises:
|
||||
ValueError: If required configuration is missing
|
||||
"""
|
||||
required = {
|
||||
'GITEA_API_URL': self.api_url,
|
||||
'GITEA_API_TOKEN': self.api_token
|
||||
}
|
||||
|
||||
missing = [key for key, value in required.items() if not value]
|
||||
|
||||
if missing:
|
||||
raise ValueError(
|
||||
f"Missing required configuration: {', '.join(missing)}\n"
|
||||
"Check your ~/.config/claude/gitea.env file"
|
||||
)
|
||||
@@ -1,593 +0,0 @@
|
||||
"""
|
||||
Gitea API client for interacting with Gitea API.
|
||||
|
||||
Provides synchronous methods for:
|
||||
- Issue CRUD operations
|
||||
- Label management
|
||||
- Repository operations
|
||||
- PMO multi-repo aggregation
|
||||
- Wiki operations (lessons learned)
|
||||
- Milestone management
|
||||
- Issue dependencies
|
||||
"""
|
||||
import requests
|
||||
import logging
|
||||
import re
|
||||
from typing import List, Dict, Optional
|
||||
from .config import GiteaConfig
|
||||
|
||||
logging.basicConfig(level=logging.INFO)
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class GiteaClient:
|
||||
"""Client for interacting with Gitea API"""
|
||||
|
||||
def __init__(self):
|
||||
"""Initialize Gitea client with configuration"""
|
||||
config = GiteaConfig()
|
||||
config_dict = config.load()
|
||||
|
||||
self.base_url = config_dict['api_url']
|
||||
self.token = config_dict['api_token']
|
||||
self.repo = config_dict.get('repo') # Optional default repo in owner/repo format
|
||||
self.mode = config_dict['mode']
|
||||
|
||||
self.session = requests.Session()
|
||||
self.session.headers.update({
|
||||
'Authorization': f'token {self.token}',
|
||||
'Content-Type': 'application/json'
|
||||
})
|
||||
|
||||
logger.info(f"Gitea client initialized in {self.mode} mode")
|
||||
|
||||
def _parse_repo(self, repo: Optional[str] = None) -> tuple:
|
||||
"""Parse owner/repo from input. Always requires 'owner/repo' format."""
|
||||
target = repo or self.repo
|
||||
if not target or '/' not in target:
|
||||
raise ValueError("Use 'owner/repo' format (e.g. 'org/repo-name')")
|
||||
parts = target.split('/', 1)
|
||||
return parts[0], parts[1]
|
||||
|
||||
def list_issues(
|
||||
self,
|
||||
state: str = 'open',
|
||||
labels: Optional[List[str]] = None,
|
||||
repo: Optional[str] = None
|
||||
) -> List[Dict]:
|
||||
"""
|
||||
List issues from Gitea repository.
|
||||
|
||||
Args:
|
||||
state: Issue state (open, closed, all)
|
||||
labels: Filter by labels
|
||||
repo: Repository in 'owner/repo' format
|
||||
|
||||
Returns:
|
||||
List of issue dictionaries
|
||||
"""
|
||||
owner, target_repo = self._parse_repo(repo)
|
||||
url = f"{self.base_url}/repos/{owner}/{target_repo}/issues"
|
||||
params = {'state': state}
|
||||
if labels:
|
||||
params['labels'] = ','.join(labels)
|
||||
logger.info(f"Listing issues from {owner}/{target_repo} with state={state}")
|
||||
response = self.session.get(url, params=params)
|
||||
response.raise_for_status()
|
||||
return response.json()
|
||||
|
||||
def get_issue(
|
||||
self,
|
||||
issue_number: int,
|
||||
repo: Optional[str] = None
|
||||
) -> Dict:
|
||||
"""Get specific issue details."""
|
||||
owner, target_repo = self._parse_repo(repo)
|
||||
url = f"{self.base_url}/repos/{owner}/{target_repo}/issues/{issue_number}"
|
||||
logger.info(f"Getting issue #{issue_number} from {owner}/{target_repo}")
|
||||
response = self.session.get(url)
|
||||
response.raise_for_status()
|
||||
return response.json()
|
||||
|
||||
def create_issue(
|
||||
self,
|
||||
title: str,
|
||||
body: str,
|
||||
labels: Optional[List[str]] = None,
|
||||
repo: Optional[str] = None
|
||||
) -> Dict:
|
||||
"""Create a new issue in Gitea."""
|
||||
owner, target_repo = self._parse_repo(repo)
|
||||
url = f"{self.base_url}/repos/{owner}/{target_repo}/issues"
|
||||
data = {'title': title, 'body': body}
|
||||
if labels:
|
||||
label_ids = self._resolve_label_ids(labels, owner, target_repo)
|
||||
data['labels'] = label_ids
|
||||
logger.info(f"Creating issue in {owner}/{target_repo}: {title}")
|
||||
response = self.session.post(url, json=data)
|
||||
response.raise_for_status()
|
||||
return response.json()
|
||||
|
||||
def _resolve_label_ids(self, label_names: List[str], owner: str, repo: str) -> List[int]:
|
||||
"""Convert label names to label IDs."""
|
||||
org_labels = self.get_org_labels(owner)
|
||||
repo_labels = self.get_labels(f"{owner}/{repo}")
|
||||
all_labels = org_labels + repo_labels
|
||||
label_map = {label['name']: label['id'] for label in all_labels}
|
||||
label_ids = []
|
||||
for name in label_names:
|
||||
if name in label_map:
|
||||
label_ids.append(label_map[name])
|
||||
else:
|
||||
logger.warning(f"Label '{name}' not found, skipping")
|
||||
return label_ids
|
||||
|
||||
def update_issue(
|
||||
self,
|
||||
issue_number: int,
|
||||
title: Optional[str] = None,
|
||||
body: Optional[str] = None,
|
||||
state: Optional[str] = None,
|
||||
labels: Optional[List[str]] = None,
|
||||
repo: Optional[str] = None
|
||||
) -> Dict:
|
||||
"""Update existing issue. Repo must be 'owner/repo' format."""
|
||||
owner, target_repo = self._parse_repo(repo)
|
||||
url = f"{self.base_url}/repos/{owner}/{target_repo}/issues/{issue_number}"
|
||||
data = {}
|
||||
if title is not None:
|
||||
data['title'] = title
|
||||
if body is not None:
|
||||
data['body'] = body
|
||||
if state is not None:
|
||||
data['state'] = state
|
||||
if labels is not None:
|
||||
data['labels'] = labels
|
||||
logger.info(f"Updating issue #{issue_number} in {owner}/{target_repo}")
|
||||
response = self.session.patch(url, json=data)
|
||||
response.raise_for_status()
|
||||
return response.json()
|
||||
|
||||
def add_comment(
|
||||
self,
|
||||
issue_number: int,
|
||||
comment: str,
|
||||
repo: Optional[str] = None
|
||||
) -> Dict:
|
||||
"""Add comment to issue. Repo must be 'owner/repo' format."""
|
||||
owner, target_repo = self._parse_repo(repo)
|
||||
url = f"{self.base_url}/repos/{owner}/{target_repo}/issues/{issue_number}/comments"
|
||||
data = {'body': comment}
|
||||
logger.info(f"Adding comment to issue #{issue_number} in {owner}/{target_repo}")
|
||||
response = self.session.post(url, json=data)
|
||||
response.raise_for_status()
|
||||
return response.json()
|
||||
|
||||
def get_labels(self, repo: Optional[str] = None) -> List[Dict]:
|
||||
"""Get all labels from repository. Repo must be 'owner/repo' format."""
|
||||
owner, target_repo = self._parse_repo(repo)
|
||||
url = f"{self.base_url}/repos/{owner}/{target_repo}/labels"
|
||||
logger.info(f"Getting labels from {owner}/{target_repo}")
|
||||
response = self.session.get(url)
|
||||
response.raise_for_status()
|
||||
return response.json()
|
||||
|
||||
def get_org_labels(self, org: str) -> List[Dict]:
|
||||
"""Get organization-level labels. Org is the organization name."""
|
||||
url = f"{self.base_url}/orgs/{org}/labels"
|
||||
logger.info(f"Getting organization labels for {org}")
|
||||
response = self.session.get(url)
|
||||
response.raise_for_status()
|
||||
return response.json()
|
||||
|
||||
def list_repos(self, org: str) -> List[Dict]:
|
||||
"""List all repositories in organization. Org is the organization name."""
|
||||
url = f"{self.base_url}/orgs/{org}/repos"
|
||||
logger.info(f"Listing all repositories for organization {org}")
|
||||
response = self.session.get(url)
|
||||
response.raise_for_status()
|
||||
return response.json()
|
||||
|
||||
def aggregate_issues(
|
||||
self,
|
||||
org: str,
|
||||
state: str = 'open',
|
||||
labels: Optional[List[str]] = None
|
||||
) -> Dict[str, List[Dict]]:
|
||||
"""Fetch issues across all repositories in org."""
|
||||
repos = self.list_repos(org)
|
||||
aggregated = {}
|
||||
logger.info(f"Aggregating issues across {len(repos)} repositories")
|
||||
for repo in repos:
|
||||
repo_name = repo['name']
|
||||
try:
|
||||
issues = self.list_issues(
|
||||
state=state,
|
||||
labels=labels,
|
||||
repo=f"{org}/{repo_name}"
|
||||
)
|
||||
if issues:
|
||||
aggregated[repo_name] = issues
|
||||
logger.info(f"Found {len(issues)} issues in {repo_name}")
|
||||
except Exception as e:
|
||||
logger.error(f"Error fetching issues from {repo_name}: {e}")
|
||||
|
||||
return aggregated
|
||||
|
||||
# ========================================
|
||||
# WIKI OPERATIONS (Lessons Learned)
|
||||
# ========================================
|
||||
|
||||
def list_wiki_pages(self, repo: Optional[str] = None) -> List[Dict]:
|
||||
"""List all wiki pages in repository."""
|
||||
owner, target_repo = self._parse_repo(repo)
|
||||
url = f"{self.base_url}/repos/{owner}/{target_repo}/wiki/pages"
|
||||
logger.info(f"Listing wiki pages from {owner}/{target_repo}")
|
||||
response = self.session.get(url)
|
||||
response.raise_for_status()
|
||||
return response.json()
|
||||
|
||||
def get_wiki_page(
|
||||
self,
|
||||
page_name: str,
|
||||
repo: Optional[str] = None
|
||||
) -> Dict:
|
||||
"""Get a specific wiki page by name."""
|
||||
owner, target_repo = self._parse_repo(repo)
|
||||
url = f"{self.base_url}/repos/{owner}/{target_repo}/wiki/page/{page_name}"
|
||||
logger.info(f"Getting wiki page '{page_name}' from {owner}/{target_repo}")
|
||||
response = self.session.get(url)
|
||||
response.raise_for_status()
|
||||
return response.json()
|
||||
|
||||
def create_wiki_page(
|
||||
self,
|
||||
title: str,
|
||||
content: str,
|
||||
repo: Optional[str] = None
|
||||
) -> Dict:
|
||||
"""Create a new wiki page."""
|
||||
owner, target_repo = self._parse_repo(repo)
|
||||
url = f"{self.base_url}/repos/{owner}/{target_repo}/wiki/new"
|
||||
data = {
|
||||
'title': title,
|
||||
'content_base64': self._encode_base64(content)
|
||||
}
|
||||
logger.info(f"Creating wiki page '{title}' in {owner}/{target_repo}")
|
||||
response = self.session.post(url, json=data)
|
||||
response.raise_for_status()
|
||||
return response.json()
|
||||
|
||||
def update_wiki_page(
|
||||
self,
|
||||
page_name: str,
|
||||
content: str,
|
||||
repo: Optional[str] = None
|
||||
) -> Dict:
|
||||
"""Update an existing wiki page."""
|
||||
owner, target_repo = self._parse_repo(repo)
|
||||
url = f"{self.base_url}/repos/{owner}/{target_repo}/wiki/page/{page_name}"
|
||||
data = {
|
||||
'content_base64': self._encode_base64(content)
|
||||
}
|
||||
logger.info(f"Updating wiki page '{page_name}' in {owner}/{target_repo}")
|
||||
response = self.session.patch(url, json=data)
|
||||
response.raise_for_status()
|
||||
return response.json()
|
||||
|
||||
def delete_wiki_page(
|
||||
self,
|
||||
page_name: str,
|
||||
repo: Optional[str] = None
|
||||
) -> bool:
|
||||
"""Delete a wiki page."""
|
||||
owner, target_repo = self._parse_repo(repo)
|
||||
url = f"{self.base_url}/repos/{owner}/{target_repo}/wiki/page/{page_name}"
|
||||
logger.info(f"Deleting wiki page '{page_name}' from {owner}/{target_repo}")
|
||||
response = self.session.delete(url)
|
||||
response.raise_for_status()
|
||||
return True
|
||||
|
||||
def _encode_base64(self, content: str) -> str:
|
||||
"""Encode content to base64 for wiki API."""
|
||||
import base64
|
||||
return base64.b64encode(content.encode('utf-8')).decode('utf-8')
|
||||
|
||||
def _decode_base64(self, content: str) -> str:
|
||||
"""Decode base64 content from wiki API."""
|
||||
import base64
|
||||
return base64.b64decode(content.encode('utf-8')).decode('utf-8')
|
||||
|
||||
def search_wiki_pages(
|
||||
self,
|
||||
query: str,
|
||||
repo: Optional[str] = None
|
||||
) -> List[Dict]:
|
||||
"""Search wiki pages by content (client-side filtering)."""
|
||||
pages = self.list_wiki_pages(repo)
|
||||
results = []
|
||||
query_lower = query.lower()
|
||||
for page in pages:
|
||||
if query_lower in page.get('title', '').lower():
|
||||
results.append(page)
|
||||
return results
|
||||
|
||||
def create_lesson(
|
||||
self,
|
||||
title: str,
|
||||
content: str,
|
||||
tags: List[str],
|
||||
category: str = "sprints",
|
||||
repo: Optional[str] = None
|
||||
) -> Dict:
|
||||
"""Create a lessons learned entry in the wiki."""
|
||||
# Sanitize title for wiki page name
|
||||
page_name = f"lessons/{category}/{self._sanitize_page_name(title)}"
|
||||
|
||||
# Add tags as metadata at the end of content
|
||||
full_content = f"{content}\n\n---\n**Tags:** {', '.join(tags)}"
|
||||
|
||||
return self.create_wiki_page(page_name, full_content, repo)
|
||||
|
||||
def search_lessons(
|
||||
self,
|
||||
query: Optional[str] = None,
|
||||
tags: Optional[List[str]] = None,
|
||||
repo: Optional[str] = None
|
||||
) -> List[Dict]:
|
||||
"""Search lessons learned by query and/or tags."""
|
||||
pages = self.list_wiki_pages(repo)
|
||||
results = []
|
||||
|
||||
for page in pages:
|
||||
title = page.get('title', '')
|
||||
# Filter to only lessons (pages starting with lessons/)
|
||||
if not title.startswith('lessons/'):
|
||||
continue
|
||||
|
||||
# If query provided, check if it matches title
|
||||
if query:
|
||||
if query.lower() not in title.lower():
|
||||
continue
|
||||
|
||||
# Get full page content for tag matching if tags provided
|
||||
if tags:
|
||||
try:
|
||||
full_page = self.get_wiki_page(title, repo)
|
||||
content = self._decode_base64(full_page.get('content_base64', ''))
|
||||
# Check if any tag is in the content
|
||||
if not any(tag.lower() in content.lower() for tag in tags):
|
||||
continue
|
||||
except Exception:
|
||||
continue
|
||||
|
||||
results.append(page)
|
||||
|
||||
return results
|
||||
|
||||
def _sanitize_page_name(self, title: str) -> str:
|
||||
"""Convert title to valid wiki page name."""
|
||||
# Replace spaces with hyphens, remove special chars
|
||||
name = re.sub(r'[^\w\s-]', '', title)
|
||||
name = re.sub(r'[\s]+', '-', name)
|
||||
return name.lower()
|
||||
|
||||
# ========================================
|
||||
# MILESTONE OPERATIONS
|
||||
# ========================================
|
||||
|
||||
def list_milestones(
|
||||
self,
|
||||
state: str = 'open',
|
||||
repo: Optional[str] = None
|
||||
) -> List[Dict]:
|
||||
"""List all milestones in repository."""
|
||||
owner, target_repo = self._parse_repo(repo)
|
||||
url = f"{self.base_url}/repos/{owner}/{target_repo}/milestones"
|
||||
params = {'state': state}
|
||||
logger.info(f"Listing milestones from {owner}/{target_repo}")
|
||||
response = self.session.get(url, params=params)
|
||||
response.raise_for_status()
|
||||
return response.json()
|
||||
|
||||
def get_milestone(
|
||||
self,
|
||||
milestone_id: int,
|
||||
repo: Optional[str] = None
|
||||
) -> Dict:
|
||||
"""Get a specific milestone by ID."""
|
||||
owner, target_repo = self._parse_repo(repo)
|
||||
url = f"{self.base_url}/repos/{owner}/{target_repo}/milestones/{milestone_id}"
|
||||
logger.info(f"Getting milestone #{milestone_id} from {owner}/{target_repo}")
|
||||
response = self.session.get(url)
|
||||
response.raise_for_status()
|
||||
return response.json()
|
||||
|
||||
def create_milestone(
|
||||
self,
|
||||
title: str,
|
||||
description: Optional[str] = None,
|
||||
due_on: Optional[str] = None,
|
||||
repo: Optional[str] = None
|
||||
) -> Dict:
|
||||
"""Create a new milestone."""
|
||||
owner, target_repo = self._parse_repo(repo)
|
||||
url = f"{self.base_url}/repos/{owner}/{target_repo}/milestones"
|
||||
data = {'title': title}
|
||||
if description:
|
||||
data['description'] = description
|
||||
if due_on:
|
||||
data['due_on'] = due_on
|
||||
logger.info(f"Creating milestone '{title}' in {owner}/{target_repo}")
|
||||
response = self.session.post(url, json=data)
|
||||
response.raise_for_status()
|
||||
return response.json()
|
||||
|
||||
def update_milestone(
|
||||
self,
|
||||
milestone_id: int,
|
||||
title: Optional[str] = None,
|
||||
description: Optional[str] = None,
|
||||
state: Optional[str] = None,
|
||||
due_on: Optional[str] = None,
|
||||
repo: Optional[str] = None
|
||||
) -> Dict:
|
||||
"""Update an existing milestone."""
|
||||
owner, target_repo = self._parse_repo(repo)
|
||||
url = f"{self.base_url}/repos/{owner}/{target_repo}/milestones/{milestone_id}"
|
||||
data = {}
|
||||
if title is not None:
|
||||
data['title'] = title
|
||||
if description is not None:
|
||||
data['description'] = description
|
||||
if state is not None:
|
||||
data['state'] = state
|
||||
if due_on is not None:
|
||||
data['due_on'] = due_on
|
||||
logger.info(f"Updating milestone #{milestone_id} in {owner}/{target_repo}")
|
||||
response = self.session.patch(url, json=data)
|
||||
response.raise_for_status()
|
||||
return response.json()
|
||||
|
||||
def delete_milestone(
|
||||
self,
|
||||
milestone_id: int,
|
||||
repo: Optional[str] = None
|
||||
) -> bool:
|
||||
"""Delete a milestone."""
|
||||
owner, target_repo = self._parse_repo(repo)
|
||||
url = f"{self.base_url}/repos/{owner}/{target_repo}/milestones/{milestone_id}"
|
||||
logger.info(f"Deleting milestone #{milestone_id} from {owner}/{target_repo}")
|
||||
response = self.session.delete(url)
|
||||
response.raise_for_status()
|
||||
return True
|
||||
|
||||
# ========================================
|
||||
# ISSUE DEPENDENCY OPERATIONS
|
||||
# ========================================
|
||||
|
||||
def list_issue_dependencies(
|
||||
self,
|
||||
issue_number: int,
|
||||
repo: Optional[str] = None
|
||||
) -> List[Dict]:
|
||||
"""List all dependencies for an issue (issues that block this one)."""
|
||||
owner, target_repo = self._parse_repo(repo)
|
||||
url = f"{self.base_url}/repos/{owner}/{target_repo}/issues/{issue_number}/dependencies"
|
||||
logger.info(f"Listing dependencies for issue #{issue_number} in {owner}/{target_repo}")
|
||||
response = self.session.get(url)
|
||||
response.raise_for_status()
|
||||
return response.json()
|
||||
|
||||
def create_issue_dependency(
|
||||
self,
|
||||
issue_number: int,
|
||||
depends_on: int,
|
||||
repo: Optional[str] = None
|
||||
) -> Dict:
|
||||
"""Create a dependency (issue_number depends on depends_on)."""
|
||||
owner, target_repo = self._parse_repo(repo)
|
||||
url = f"{self.base_url}/repos/{owner}/{target_repo}/issues/{issue_number}/dependencies"
|
||||
data = {
|
||||
'dependentIssue': {
|
||||
'owner': owner,
|
||||
'repo': target_repo,
|
||||
'index': depends_on
|
||||
}
|
||||
}
|
||||
logger.info(f"Creating dependency: #{issue_number} depends on #{depends_on} in {owner}/{target_repo}")
|
||||
response = self.session.post(url, json=data)
|
||||
response.raise_for_status()
|
||||
return response.json()
|
||||
|
||||
def remove_issue_dependency(
|
||||
self,
|
||||
issue_number: int,
|
||||
depends_on: int,
|
||||
repo: Optional[str] = None
|
||||
) -> bool:
|
||||
"""Remove a dependency between issues."""
|
||||
owner, target_repo = self._parse_repo(repo)
|
||||
url = f"{self.base_url}/repos/{owner}/{target_repo}/issues/{issue_number}/dependencies"
|
||||
data = {
|
||||
'dependentIssue': {
|
||||
'owner': owner,
|
||||
'repo': target_repo,
|
||||
'index': depends_on
|
||||
}
|
||||
}
|
||||
logger.info(f"Removing dependency: #{issue_number} no longer depends on #{depends_on}")
|
||||
response = self.session.delete(url, json=data)
|
||||
response.raise_for_status()
|
||||
return True
|
||||
|
||||
def list_issue_blocks(
|
||||
self,
|
||||
issue_number: int,
|
||||
repo: Optional[str] = None
|
||||
) -> List[Dict]:
|
||||
"""List all issues that this issue blocks."""
|
||||
owner, target_repo = self._parse_repo(repo)
|
||||
url = f"{self.base_url}/repos/{owner}/{target_repo}/issues/{issue_number}/blocks"
|
||||
logger.info(f"Listing issues blocked by #{issue_number} in {owner}/{target_repo}")
|
||||
response = self.session.get(url)
|
||||
response.raise_for_status()
|
||||
return response.json()
|
||||
|
||||
# ========================================
|
||||
# REPOSITORY VALIDATION
|
||||
# ========================================
|
||||
|
||||
def get_repo_info(self, repo: Optional[str] = None) -> Dict:
|
||||
"""Get repository information including owner type."""
|
||||
owner, target_repo = self._parse_repo(repo)
|
||||
url = f"{self.base_url}/repos/{owner}/{target_repo}"
|
||||
logger.info(f"Getting repo info for {owner}/{target_repo}")
|
||||
response = self.session.get(url)
|
||||
response.raise_for_status()
|
||||
return response.json()
|
||||
|
||||
def is_org_repo(self, repo: Optional[str] = None) -> bool:
|
||||
"""Check if repository belongs to an organization (not a user)."""
|
||||
info = self.get_repo_info(repo)
|
||||
owner_type = info.get('owner', {}).get('type', '')
|
||||
return owner_type.lower() == 'organization'
|
||||
|
||||
def get_branch_protection(
|
||||
self,
|
||||
branch: str,
|
||||
repo: Optional[str] = None
|
||||
) -> Optional[Dict]:
|
||||
"""Get branch protection rules for a branch."""
|
||||
owner, target_repo = self._parse_repo(repo)
|
||||
url = f"{self.base_url}/repos/{owner}/{target_repo}/branch_protections/{branch}"
|
||||
logger.info(f"Getting branch protection for {branch} in {owner}/{target_repo}")
|
||||
try:
|
||||
response = self.session.get(url)
|
||||
response.raise_for_status()
|
||||
return response.json()
|
||||
except requests.exceptions.HTTPError as e:
|
||||
if e.response.status_code == 404:
|
||||
return None # No protection rules
|
||||
raise
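
# Note for callers (illustrative, not part of the original module): a None
# return means the branch has no protection rules configured, while a dict
# describes the active rules, so `rules = client.get_branch_protection("main")`
# can be checked with `if rules is None: ...` before reading specific fields.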
def create_label(
|
||||
self,
|
||||
name: str,
|
||||
color: str,
|
||||
description: Optional[str] = None,
|
||||
repo: Optional[str] = None
|
||||
) -> Dict:
|
||||
"""Create a new label in the repository."""
|
||||
owner, target_repo = self._parse_repo(repo)
|
||||
url = f"{self.base_url}/repos/{owner}/{target_repo}/labels"
|
||||
data = {
|
||||
'name': name,
|
||||
'color': color.lstrip('#') # Remove # if present
|
||||
}
|
||||
if description:
|
||||
data['description'] = description
|
||||
logger.info(f"Creating label '{name}' in {owner}/{target_repo}")
|
||||
response = self.session.post(url, json=data)
|
||||
response.raise_for_status()
|
||||
return response.json()
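
# Illustrative sketch (not part of the original module): driving the client
# methods above directly. Assumes the module is importable as `gitea_client`,
# that GiteaClient() resolves credentials via GiteaConfig as elsewhere in this
# server, and that the repository "acme/web-app" and issues #41/#42 exist
# (all hypothetical names chosen for the example).
from gitea_client import GiteaClient

client = GiteaClient()

# Mark issue #42 as blocked by #41, then read the relationship back.
client.create_issue_dependency(42, depends_on=41, repo="acme/web-app")
blocked_by_41 = client.list_issue_blocks(41, repo="acme/web-app")

# Repository validation helpers used by the PR/issue workflows.
if client.is_org_repo("acme/web-app"):
    rules = client.get_branch_protection("main", repo="acme/web-app")

# Labels accept hex colors with or without the leading '#'.
client.create_label("Type/Bug", "#ee0701", description="Defect", repo="acme/web-app")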
@@ -1,764 +0,0 @@
|
||||
"""
|
||||
MCP Server entry point for Gitea integration.
|
||||
|
||||
Provides Gitea tools to Claude Code via JSON-RPC 2.0 over stdio.
|
||||
"""
|
||||
import asyncio
|
||||
import logging
|
||||
import json
|
||||
from mcp.server import Server
|
||||
from mcp.server.stdio import stdio_server
|
||||
from mcp.types import Tool, TextContent
|
||||
|
||||
from .config import GiteaConfig
|
||||
from .gitea_client import GiteaClient
|
||||
from .tools.issues import IssueTools
|
||||
from .tools.labels import LabelTools
|
||||
from .tools.wiki import WikiTools
|
||||
from .tools.milestones import MilestoneTools
|
||||
from .tools.dependencies import DependencyTools
|
||||
|
||||
# Configure stderr logging for this module, then silence the noisy MCP
# validation warnings by raising the "mcp" logger (and the logger named
# "root") to ERROR; this module's own logger stays at INFO.
logging.basicConfig(level=logging.INFO)
logging.getLogger("root").setLevel(logging.ERROR)
logging.getLogger("mcp").setLevel(logging.ERROR)
logger = logging.getLogger(__name__)
class GiteaMCPServer:
|
||||
"""MCP Server for Gitea integration"""
|
||||
|
||||
def __init__(self):
|
||||
self.server = Server("gitea-mcp")
|
||||
self.config = None
|
||||
self.client = None
|
||||
self.issue_tools = None
|
||||
self.label_tools = None
|
||||
self.wiki_tools = None
|
||||
self.milestone_tools = None
|
||||
self.dependency_tools = None
|
||||
|
||||
async def initialize(self):
|
||||
"""
|
||||
Initialize server and load configuration.
|
||||
|
||||
Raises:
|
||||
Exception: If initialization fails
|
||||
"""
|
||||
try:
|
||||
config_loader = GiteaConfig()
|
||||
self.config = config_loader.load()
|
||||
|
||||
self.client = GiteaClient()
|
||||
self.issue_tools = IssueTools(self.client)
|
||||
self.label_tools = LabelTools(self.client)
|
||||
self.wiki_tools = WikiTools(self.client)
|
||||
self.milestone_tools = MilestoneTools(self.client)
|
||||
self.dependency_tools = DependencyTools(self.client)
|
||||
|
||||
logger.info(f"Gitea MCP Server initialized in {self.config['mode']} mode")
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to initialize: {e}")
|
||||
raise
|
||||
|
||||
def setup_tools(self):
|
||||
"""Register all available tools with the MCP server"""
|
||||
|
||||
@self.server.list_tools()
|
||||
async def list_tools() -> list[Tool]:
|
||||
"""Return list of available tools"""
|
||||
return [
|
||||
Tool(
|
||||
name="list_issues",
|
||||
description="List issues from Gitea repository",
|
||||
inputSchema={
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"state": {
|
||||
"type": "string",
|
||||
"enum": ["open", "closed", "all"],
|
||||
"default": "open",
|
||||
"description": "Issue state filter"
|
||||
},
|
||||
"labels": {
|
||||
"type": "array",
|
||||
"items": {"type": "string"},
|
||||
"description": "Filter by labels"
|
||||
},
|
||||
"repo": {
|
||||
"type": "string",
|
||||
"description": "Repository name (for PMO mode)"
|
||||
}
|
||||
}
|
||||
}
|
||||
),
|
||||
Tool(
|
||||
name="get_issue",
|
||||
description="Get specific issue details",
|
||||
inputSchema={
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"issue_number": {
|
||||
"type": "integer",
|
||||
"description": "Issue number"
|
||||
},
|
||||
"repo": {
|
||||
"type": "string",
|
||||
"description": "Repository name (for PMO mode)"
|
||||
}
|
||||
},
|
||||
"required": ["issue_number"]
|
||||
}
|
||||
),
|
||||
Tool(
|
||||
name="create_issue",
|
||||
description="Create a new issue in Gitea",
|
||||
inputSchema={
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"title": {
|
||||
"type": "string",
|
||||
"description": "Issue title"
|
||||
},
|
||||
"body": {
|
||||
"type": "string",
|
||||
"description": "Issue description"
|
||||
},
|
||||
"labels": {
|
||||
"type": "array",
|
||||
"items": {"type": "string"},
|
||||
"description": "List of label names"
|
||||
},
|
||||
"repo": {
|
||||
"type": "string",
|
||||
"description": "Repository name (for PMO mode)"
|
||||
}
|
||||
},
|
||||
"required": ["title", "body"]
|
||||
}
|
||||
),
|
||||
Tool(
|
||||
name="update_issue",
|
||||
description="Update existing issue",
|
||||
inputSchema={
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"issue_number": {
|
||||
"type": "integer",
|
||||
"description": "Issue number"
|
||||
},
|
||||
"title": {
|
||||
"type": "string",
|
||||
"description": "New title"
|
||||
},
|
||||
"body": {
|
||||
"type": "string",
|
||||
"description": "New body"
|
||||
},
|
||||
"state": {
|
||||
"type": "string",
|
||||
"enum": ["open", "closed"],
|
||||
"description": "New state"
|
||||
},
|
||||
"labels": {
|
||||
"type": "array",
|
||||
"items": {"type": "string"},
|
||||
"description": "New labels"
|
||||
},
|
||||
"repo": {
|
||||
"type": "string",
|
||||
"description": "Repository name (for PMO mode)"
|
||||
}
|
||||
},
|
||||
"required": ["issue_number"]
|
||||
}
|
||||
),
|
||||
Tool(
|
||||
name="add_comment",
|
||||
description="Add comment to issue",
|
||||
inputSchema={
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"issue_number": {
|
||||
"type": "integer",
|
||||
"description": "Issue number"
|
||||
},
|
||||
"comment": {
|
||||
"type": "string",
|
||||
"description": "Comment text"
|
||||
},
|
||||
"repo": {
|
||||
"type": "string",
|
||||
"description": "Repository name (for PMO mode)"
|
||||
}
|
||||
},
|
||||
"required": ["issue_number", "comment"]
|
||||
}
|
||||
),
|
||||
Tool(
|
||||
name="get_labels",
|
||||
description="Get all available labels (org + repo)",
|
||||
inputSchema={
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"repo": {
|
||||
"type": "string",
|
||||
"description": "Repository name (for PMO mode)"
|
||||
}
|
||||
}
|
||||
}
|
||||
),
|
||||
Tool(
|
||||
name="suggest_labels",
|
||||
description="Analyze context and suggest appropriate labels",
|
||||
inputSchema={
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"context": {
|
||||
"type": "string",
|
||||
"description": "Issue title + description or sprint context"
|
||||
}
|
||||
},
|
||||
"required": ["context"]
|
||||
}
|
||||
),
|
||||
Tool(
|
||||
name="aggregate_issues",
|
||||
description="Fetch issues across all repositories (PMO mode)",
|
||||
inputSchema={
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"org": {
|
||||
"type": "string",
|
||||
"description": "Organization name (e.g. 'bandit')"
|
||||
},
|
||||
"state": {
|
||||
"type": "string",
|
||||
"enum": ["open", "closed", "all"],
|
||||
"default": "open",
|
||||
"description": "Issue state filter"
|
||||
},
|
||||
"labels": {
|
||||
"type": "array",
|
||||
"items": {"type": "string"},
|
||||
"description": "Filter by labels"
|
||||
}
|
||||
},
|
||||
"required": ["org"]
|
||||
}
|
||||
),
|
||||
# Wiki Tools (Lessons Learned)
|
||||
Tool(
|
||||
name="list_wiki_pages",
|
||||
description="List all wiki pages in repository",
|
||||
inputSchema={
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"repo": {
|
||||
"type": "string",
|
||||
"description": "Repository name (owner/repo format)"
|
||||
}
|
||||
}
|
||||
}
|
||||
),
|
||||
Tool(
|
||||
name="get_wiki_page",
|
||||
description="Get a specific wiki page by name",
|
||||
inputSchema={
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"page_name": {
|
||||
"type": "string",
|
||||
"description": "Wiki page name/path"
|
||||
},
|
||||
"repo": {
|
||||
"type": "string",
|
||||
"description": "Repository name (owner/repo format)"
|
||||
}
|
||||
},
|
||||
"required": ["page_name"]
|
||||
}
|
||||
),
|
||||
Tool(
|
||||
name="create_wiki_page",
|
||||
description="Create a new wiki page",
|
||||
inputSchema={
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"title": {
|
||||
"type": "string",
|
||||
"description": "Page title/name"
|
||||
},
|
||||
"content": {
|
||||
"type": "string",
|
||||
"description": "Page content (markdown)"
|
||||
},
|
||||
"repo": {
|
||||
"type": "string",
|
||||
"description": "Repository name (owner/repo format)"
|
||||
}
|
||||
},
|
||||
"required": ["title", "content"]
|
||||
}
|
||||
),
|
||||
Tool(
|
||||
name="update_wiki_page",
|
||||
description="Update an existing wiki page",
|
||||
inputSchema={
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"page_name": {
|
||||
"type": "string",
|
||||
"description": "Wiki page name/path"
|
||||
},
|
||||
"content": {
|
||||
"type": "string",
|
||||
"description": "New page content (markdown)"
|
||||
},
|
||||
"repo": {
|
||||
"type": "string",
|
||||
"description": "Repository name (owner/repo format)"
|
||||
}
|
||||
},
|
||||
"required": ["page_name", "content"]
|
||||
}
|
||||
),
|
||||
Tool(
|
||||
name="create_lesson",
|
||||
description="Create a lessons learned entry in the wiki",
|
||||
inputSchema={
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"title": {
|
||||
"type": "string",
|
||||
"description": "Lesson title (e.g., 'Sprint 16 - Prevent Infinite Loops')"
|
||||
},
|
||||
"content": {
|
||||
"type": "string",
|
||||
"description": "Lesson content (markdown with context, problem, solution, prevention)"
|
||||
},
|
||||
"tags": {
|
||||
"type": "array",
|
||||
"items": {"type": "string"},
|
||||
"description": "Tags for categorization"
|
||||
},
|
||||
"category": {
|
||||
"type": "string",
|
||||
"default": "sprints",
|
||||
"description": "Category (sprints, patterns, architecture, etc.)"
|
||||
},
|
||||
"repo": {
|
||||
"type": "string",
|
||||
"description": "Repository name (owner/repo format)"
|
||||
}
|
||||
},
|
||||
"required": ["title", "content", "tags"]
|
||||
}
|
||||
),
|
||||
Tool(
|
||||
name="search_lessons",
|
||||
description="Search lessons learned from previous sprints",
|
||||
inputSchema={
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"query": {
|
||||
"type": "string",
|
||||
"description": "Search query (optional)"
|
||||
},
|
||||
"tags": {
|
||||
"type": "array",
|
||||
"items": {"type": "string"},
|
||||
"description": "Tags to filter by (optional)"
|
||||
},
|
||||
"limit": {
|
||||
"type": "integer",
|
||||
"default": 20,
|
||||
"description": "Maximum results"
|
||||
},
|
||||
"repo": {
|
||||
"type": "string",
|
||||
"description": "Repository name (owner/repo format)"
|
||||
}
|
||||
}
|
||||
}
|
||||
),
|
||||
# Milestone Tools
|
||||
Tool(
|
||||
name="list_milestones",
|
||||
description="List all milestones in repository",
|
||||
inputSchema={
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"state": {
|
||||
"type": "string",
|
||||
"enum": ["open", "closed", "all"],
|
||||
"default": "open",
|
||||
"description": "Milestone state filter"
|
||||
},
|
||||
"repo": {
|
||||
"type": "string",
|
||||
"description": "Repository name (owner/repo format)"
|
||||
}
|
||||
}
|
||||
}
|
||||
),
|
||||
Tool(
|
||||
name="get_milestone",
|
||||
description="Get a specific milestone by ID",
|
||||
inputSchema={
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"milestone_id": {
|
||||
"type": "integer",
|
||||
"description": "Milestone ID"
|
||||
},
|
||||
"repo": {
|
||||
"type": "string",
|
||||
"description": "Repository name (owner/repo format)"
|
||||
}
|
||||
},
|
||||
"required": ["milestone_id"]
|
||||
}
|
||||
),
|
||||
Tool(
|
||||
name="create_milestone",
|
||||
description="Create a new milestone",
|
||||
inputSchema={
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"title": {
|
||||
"type": "string",
|
||||
"description": "Milestone title"
|
||||
},
|
||||
"description": {
|
||||
"type": "string",
|
||||
"description": "Milestone description"
|
||||
},
|
||||
"due_on": {
|
||||
"type": "string",
|
||||
"description": "Due date (ISO 8601 format)"
|
||||
},
|
||||
"repo": {
|
||||
"type": "string",
|
||||
"description": "Repository name (owner/repo format)"
|
||||
}
|
||||
},
|
||||
"required": ["title"]
|
||||
}
|
||||
),
|
||||
Tool(
|
||||
name="update_milestone",
|
||||
description="Update an existing milestone",
|
||||
inputSchema={
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"milestone_id": {
|
||||
"type": "integer",
|
||||
"description": "Milestone ID"
|
||||
},
|
||||
"title": {
|
||||
"type": "string",
|
||||
"description": "New title"
|
||||
},
|
||||
"description": {
|
||||
"type": "string",
|
||||
"description": "New description"
|
||||
},
|
||||
"state": {
|
||||
"type": "string",
|
||||
"enum": ["open", "closed"],
|
||||
"description": "New state"
|
||||
},
|
||||
"due_on": {
|
||||
"type": "string",
|
||||
"description": "New due date"
|
||||
},
|
||||
"repo": {
|
||||
"type": "string",
|
||||
"description": "Repository name (owner/repo format)"
|
||||
}
|
||||
},
|
||||
"required": ["milestone_id"]
|
||||
}
|
||||
),
|
||||
Tool(
|
||||
name="delete_milestone",
|
||||
description="Delete a milestone",
|
||||
inputSchema={
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"milestone_id": {
|
||||
"type": "integer",
|
||||
"description": "Milestone ID"
|
||||
},
|
||||
"repo": {
|
||||
"type": "string",
|
||||
"description": "Repository name (owner/repo format)"
|
||||
}
|
||||
},
|
||||
"required": ["milestone_id"]
|
||||
}
|
||||
),
|
||||
# Dependency Tools
|
||||
Tool(
|
||||
name="list_issue_dependencies",
|
||||
description="List all dependencies for an issue (issues that block this one)",
|
||||
inputSchema={
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"issue_number": {
|
||||
"type": "integer",
|
||||
"description": "Issue number"
|
||||
},
|
||||
"repo": {
|
||||
"type": "string",
|
||||
"description": "Repository name (owner/repo format)"
|
||||
}
|
||||
},
|
||||
"required": ["issue_number"]
|
||||
}
|
||||
),
|
||||
Tool(
|
||||
name="create_issue_dependency",
|
||||
description="Create a dependency (issue depends on another issue)",
|
||||
inputSchema={
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"issue_number": {
|
||||
"type": "integer",
|
||||
"description": "Issue that will depend on another"
|
||||
},
|
||||
"depends_on": {
|
||||
"type": "integer",
|
||||
"description": "Issue that blocks issue_number"
|
||||
},
|
||||
"repo": {
|
||||
"type": "string",
|
||||
"description": "Repository name (owner/repo format)"
|
||||
}
|
||||
},
|
||||
"required": ["issue_number", "depends_on"]
|
||||
}
|
||||
),
|
||||
Tool(
|
||||
name="remove_issue_dependency",
|
||||
description="Remove a dependency between issues",
|
||||
inputSchema={
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"issue_number": {
|
||||
"type": "integer",
|
||||
"description": "Issue that depends on another"
|
||||
},
|
||||
"depends_on": {
|
||||
"type": "integer",
|
||||
"description": "Issue being depended on"
|
||||
},
|
||||
"repo": {
|
||||
"type": "string",
|
||||
"description": "Repository name (owner/repo format)"
|
||||
}
|
||||
},
|
||||
"required": ["issue_number", "depends_on"]
|
||||
}
|
||||
),
|
||||
Tool(
|
||||
name="get_execution_order",
|
||||
description="Get parallelizable execution order for issues based on dependencies",
|
||||
inputSchema={
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"issue_numbers": {
|
||||
"type": "array",
|
||||
"items": {"type": "integer"},
|
||||
"description": "List of issue numbers to analyze"
|
||||
},
|
||||
"repo": {
|
||||
"type": "string",
|
||||
"description": "Repository name (owner/repo format)"
|
||||
}
|
||||
},
|
||||
"required": ["issue_numbers"]
|
||||
}
|
||||
),
|
||||
# Validation Tools
|
||||
Tool(
|
||||
name="validate_repo_org",
|
||||
description="Check if repository belongs to an organization",
|
||||
inputSchema={
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"repo": {
|
||||
"type": "string",
|
||||
"description": "Repository name (owner/repo format)"
|
||||
}
|
||||
}
|
||||
}
|
||||
),
|
||||
Tool(
|
||||
name="get_branch_protection",
|
||||
description="Get branch protection rules",
|
||||
inputSchema={
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"branch": {
|
||||
"type": "string",
|
||||
"description": "Branch name"
|
||||
},
|
||||
"repo": {
|
||||
"type": "string",
|
||||
"description": "Repository name (owner/repo format)"
|
||||
}
|
||||
},
|
||||
"required": ["branch"]
|
||||
}
|
||||
),
|
||||
Tool(
|
||||
name="create_label",
|
||||
description="Create a new label in the repository",
|
||||
inputSchema={
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"name": {
|
||||
"type": "string",
|
||||
"description": "Label name"
|
||||
},
|
||||
"color": {
|
||||
"type": "string",
|
||||
"description": "Label color (hex code)"
|
||||
},
|
||||
"description": {
|
||||
"type": "string",
|
||||
"description": "Label description"
|
||||
},
|
||||
"repo": {
|
||||
"type": "string",
|
||||
"description": "Repository name (owner/repo format)"
|
||||
}
|
||||
},
|
||||
"required": ["name", "color"]
|
||||
}
|
||||
)
|
||||
]
|
||||
|
||||
@self.server.call_tool()
|
||||
async def call_tool(name: str, arguments: dict) -> list[TextContent]:
|
||||
"""
|
||||
Handle tool invocation.
|
||||
|
||||
Args:
|
||||
name: Tool name
|
||||
arguments: Tool arguments
|
||||
|
||||
Returns:
|
||||
List of TextContent with results
|
||||
"""
|
||||
try:
|
||||
# Route to appropriate tool handler
|
||||
if name == "list_issues":
|
||||
result = await self.issue_tools.list_issues(**arguments)
|
||||
elif name == "get_issue":
|
||||
result = await self.issue_tools.get_issue(**arguments)
|
||||
elif name == "create_issue":
|
||||
result = await self.issue_tools.create_issue(**arguments)
|
||||
elif name == "update_issue":
|
||||
result = await self.issue_tools.update_issue(**arguments)
|
||||
elif name == "add_comment":
|
||||
result = await self.issue_tools.add_comment(**arguments)
|
||||
elif name == "get_labels":
|
||||
result = await self.label_tools.get_labels(**arguments)
|
||||
elif name == "suggest_labels":
|
||||
result = await self.label_tools.suggest_labels(**arguments)
|
||||
elif name == "aggregate_issues":
|
||||
result = await self.issue_tools.aggregate_issues(**arguments)
|
||||
# Wiki tools
|
||||
elif name == "list_wiki_pages":
|
||||
result = await self.wiki_tools.list_wiki_pages(**arguments)
|
||||
elif name == "get_wiki_page":
|
||||
result = await self.wiki_tools.get_wiki_page(**arguments)
|
||||
elif name == "create_wiki_page":
|
||||
result = await self.wiki_tools.create_wiki_page(**arguments)
|
||||
elif name == "update_wiki_page":
|
||||
result = await self.wiki_tools.update_wiki_page(**arguments)
|
||||
elif name == "create_lesson":
|
||||
result = await self.wiki_tools.create_lesson(**arguments)
|
||||
elif name == "search_lessons":
|
||||
tags = arguments.get('tags')
|
||||
result = await self.wiki_tools.search_lessons(
|
||||
query=arguments.get('query'),
|
||||
tags=tags,
|
||||
limit=arguments.get('limit', 20),
|
||||
repo=arguments.get('repo')
|
||||
)
|
||||
# Milestone tools
|
||||
elif name == "list_milestones":
|
||||
result = await self.milestone_tools.list_milestones(**arguments)
|
||||
elif name == "get_milestone":
|
||||
result = await self.milestone_tools.get_milestone(**arguments)
|
||||
elif name == "create_milestone":
|
||||
result = await self.milestone_tools.create_milestone(**arguments)
|
||||
elif name == "update_milestone":
|
||||
result = await self.milestone_tools.update_milestone(**arguments)
|
||||
elif name == "delete_milestone":
|
||||
result = await self.milestone_tools.delete_milestone(**arguments)
|
||||
# Dependency tools
|
||||
elif name == "list_issue_dependencies":
|
||||
result = await self.dependency_tools.list_issue_dependencies(**arguments)
|
||||
elif name == "create_issue_dependency":
|
||||
result = await self.dependency_tools.create_issue_dependency(**arguments)
|
||||
elif name == "remove_issue_dependency":
|
||||
result = await self.dependency_tools.remove_issue_dependency(**arguments)
|
||||
elif name == "get_execution_order":
|
||||
result = await self.dependency_tools.get_execution_order(**arguments)
|
||||
# Validation tools
|
||||
elif name == "validate_repo_org":
|
||||
is_org = self.client.is_org_repo(arguments.get('repo'))
|
||||
result = {'is_organization': is_org}
|
||||
elif name == "get_branch_protection":
|
||||
result = self.client.get_branch_protection(
|
||||
arguments['branch'],
|
||||
arguments.get('repo')
|
||||
)
|
||||
elif name == "create_label":
|
||||
result = self.client.create_label(
|
||||
arguments['name'],
|
||||
arguments['color'],
|
||||
arguments.get('description'),
|
||||
arguments.get('repo')
|
||||
)
|
||||
else:
|
||||
raise ValueError(f"Unknown tool: {name}")
|
||||
|
||||
return [TextContent(
|
||||
type="text",
|
||||
text=json.dumps(result, indent=2)
|
||||
)]
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"Tool {name} failed: {e}")
|
||||
return [TextContent(
|
||||
type="text",
|
||||
text=f"Error: {str(e)}"
|
||||
)]
|
||||
|
||||
async def run(self):
|
||||
"""Run the MCP server"""
|
||||
await self.initialize()
|
||||
self.setup_tools()
|
||||
|
||||
async with stdio_server() as (read_stream, write_stream):
|
||||
await self.server.run(
|
||||
read_stream,
|
||||
write_stream,
|
||||
self.server.create_initialization_options()
|
||||
)
|
||||
|
||||
|
||||
async def main():
|
||||
"""Main entry point"""
|
||||
server = GiteaMCPServer()
|
||||
await server.run()
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
asyncio.run(main())
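
# Illustrative sketch (not part of the original module): the same
# register/dispatch pattern as GiteaMCPServer, reduced to a single echo tool.
# It assumes only the `mcp` SDK imports already used at the top of this file.
import asyncio
import json

from mcp.server import Server
from mcp.server.stdio import stdio_server
from mcp.types import Tool, TextContent


def build_echo_server() -> Server:
    server = Server("echo-demo")

    @server.list_tools()
    async def list_tools() -> list[Tool]:
        return [Tool(
            name="echo",
            description="Return the given text unchanged",
            inputSchema={
                "type": "object",
                "properties": {"text": {"type": "string"}},
                "required": ["text"],
            },
        )]

    @server.call_tool()
    async def call_tool(name: str, arguments: dict) -> list[TextContent]:
        # Results and errors are both returned as TextContent, as above.
        if name != "echo":
            return [TextContent(type="text", text=f"Error: Unknown tool: {name}")]
        return [TextContent(type="text", text=json.dumps({"echo": arguments["text"]}, indent=2))]

    return server


async def run_echo_server() -> None:
    server = build_echo_server()
    async with stdio_server() as (read_stream, write_stream):
        await server.run(read_stream, write_stream, server.create_initialization_options())


if __name__ == "__main__":
    asyncio.run(run_echo_server())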
@@ -1,7 +0,0 @@
|
||||
"""
|
||||
MCP tools for Gitea integration.
|
||||
|
||||
This package provides MCP tool implementations for:
|
||||
- Issue operations (issues.py)
|
||||
- Label management (labels.py)
|
||||
"""
|
||||
@@ -1,216 +0,0 @@
|
||||
"""
|
||||
Issue dependency management tools for MCP server.
|
||||
|
||||
Provides async wrappers for issue dependency operations:
|
||||
- List/create/remove dependencies
|
||||
- Build dependency graphs for parallel execution
|
||||
"""
|
||||
import asyncio
|
||||
import logging
|
||||
from typing import List, Dict, Optional, Set, Tuple
|
||||
|
||||
logging.basicConfig(level=logging.INFO)
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class DependencyTools:
|
||||
"""Async wrappers for Gitea issue dependency operations"""
|
||||
|
||||
def __init__(self, gitea_client):
|
||||
"""
|
||||
Initialize dependency tools.
|
||||
|
||||
Args:
|
||||
gitea_client: GiteaClient instance
|
||||
"""
|
||||
self.gitea = gitea_client
|
||||
|
||||
async def list_issue_dependencies(
|
||||
self,
|
||||
issue_number: int,
|
||||
repo: Optional[str] = None
|
||||
) -> List[Dict]:
|
||||
"""
|
||||
List all dependencies for an issue (issues that block this one).
|
||||
|
||||
Args:
|
||||
issue_number: Issue number
|
||||
repo: Repository in owner/repo format
|
||||
|
||||
Returns:
|
||||
List of issues that this issue depends on
|
||||
"""
|
||||
loop = asyncio.get_event_loop()
|
||||
return await loop.run_in_executor(
|
||||
None,
|
||||
lambda: self.gitea.list_issue_dependencies(issue_number, repo)
|
||||
)
|
||||
|
||||
async def create_issue_dependency(
|
||||
self,
|
||||
issue_number: int,
|
||||
depends_on: int,
|
||||
repo: Optional[str] = None
|
||||
) -> Dict:
|
||||
"""
|
||||
Create a dependency between issues.
|
||||
|
||||
Args:
|
||||
issue_number: The issue that will depend on another
|
||||
depends_on: The issue that blocks issue_number
|
||||
repo: Repository in owner/repo format
|
||||
|
||||
Returns:
|
||||
Created dependency information
|
||||
"""
|
||||
loop = asyncio.get_event_loop()
|
||||
return await loop.run_in_executor(
|
||||
None,
|
||||
lambda: self.gitea.create_issue_dependency(issue_number, depends_on, repo)
|
||||
)
|
||||
|
||||
async def remove_issue_dependency(
|
||||
self,
|
||||
issue_number: int,
|
||||
depends_on: int,
|
||||
repo: Optional[str] = None
|
||||
) -> bool:
|
||||
"""
|
||||
Remove a dependency between issues.
|
||||
|
||||
Args:
|
||||
issue_number: The issue that currently depends on another
|
||||
depends_on: The issue being depended on
|
||||
repo: Repository in owner/repo format
|
||||
|
||||
Returns:
|
||||
True if removed successfully
|
||||
"""
|
||||
loop = asyncio.get_event_loop()
|
||||
return await loop.run_in_executor(
|
||||
None,
|
||||
lambda: self.gitea.remove_issue_dependency(issue_number, depends_on, repo)
|
||||
)
|
||||
|
||||
async def list_issue_blocks(
|
||||
self,
|
||||
issue_number: int,
|
||||
repo: Optional[str] = None
|
||||
) -> List[Dict]:
|
||||
"""
|
||||
List all issues that this issue blocks.
|
||||
|
||||
Args:
|
||||
issue_number: Issue number
|
||||
repo: Repository in owner/repo format
|
||||
|
||||
Returns:
|
||||
List of issues blocked by this issue
|
||||
"""
|
||||
loop = asyncio.get_event_loop()
|
||||
return await loop.run_in_executor(
|
||||
None,
|
||||
lambda: self.gitea.list_issue_blocks(issue_number, repo)
|
||||
)
|
||||
|
||||
async def build_dependency_graph(
|
||||
self,
|
||||
issue_numbers: List[int],
|
||||
repo: Optional[str] = None
|
||||
) -> Dict[int, List[int]]:
|
||||
"""
|
||||
Build a dependency graph for a list of issues.
|
||||
|
||||
Args:
|
||||
issue_numbers: List of issue numbers to analyze
|
||||
repo: Repository in owner/repo format
|
||||
|
||||
Returns:
|
||||
Dictionary mapping issue_number -> list of issues it depends on
|
||||
"""
|
||||
graph = {}
|
||||
for issue_num in issue_numbers:
|
||||
try:
|
||||
deps = await self.list_issue_dependencies(issue_num, repo)
|
||||
graph[issue_num] = [
|
||||
d.get('number') or d.get('index')
|
||||
for d in deps
|
||||
if (d.get('number') or d.get('index')) in issue_numbers
|
||||
]
|
||||
except Exception as e:
|
||||
logger.warning(f"Could not fetch dependencies for #{issue_num}: {e}")
|
||||
graph[issue_num] = []
|
||||
return graph
|
||||
|
||||
async def get_ready_tasks(
|
||||
self,
|
||||
issue_numbers: List[int],
|
||||
completed: Set[int],
|
||||
repo: Optional[str] = None
|
||||
) -> List[int]:
|
||||
"""
|
||||
Get tasks that are ready to execute (no unresolved dependencies).
|
||||
|
||||
Args:
|
||||
issue_numbers: List of all issue numbers in sprint
|
||||
completed: Set of already completed issue numbers
|
||||
repo: Repository in owner/repo format
|
||||
|
||||
Returns:
|
||||
List of issue numbers that can be executed now
|
||||
"""
|
||||
graph = await self.build_dependency_graph(issue_numbers, repo)
|
||||
ready = []
|
||||
|
||||
for issue_num in issue_numbers:
|
||||
if issue_num in completed:
|
||||
continue
|
||||
|
||||
deps = graph.get(issue_num, [])
|
||||
# Task is ready if all its dependencies are completed
|
||||
if all(dep in completed for dep in deps):
|
||||
ready.append(issue_num)
|
||||
|
||||
return ready
|
||||
|
||||
async def get_execution_order(
|
||||
self,
|
||||
issue_numbers: List[int],
|
||||
repo: Optional[str] = None
|
||||
) -> List[List[int]]:
|
||||
"""
|
||||
Get a parallelizable execution order for issues.
|
||||
|
||||
Returns batches of issues that can be executed in parallel.
|
||||
Each batch contains issues with no unresolved dependencies.
|
||||
|
||||
Args:
|
||||
issue_numbers: List of all issue numbers
|
||||
repo: Repository in owner/repo format
|
||||
|
||||
Returns:
|
||||
List of batches, where each batch can be executed in parallel
|
||||
"""
|
||||
graph = await self.build_dependency_graph(issue_numbers, repo)
|
||||
completed: Set[int] = set()
|
||||
remaining = set(issue_numbers)
|
||||
batches = []
|
||||
|
||||
while remaining:
|
||||
# Find all tasks with no unresolved dependencies
|
||||
batch = []
|
||||
for issue_num in remaining:
|
||||
deps = graph.get(issue_num, [])
|
||||
if all(dep in completed for dep in deps):
|
||||
batch.append(issue_num)
|
||||
|
||||
if not batch:
|
||||
# Circular dependency detected
|
||||
logger.error(f"Circular dependency detected! Remaining: {remaining}")
|
||||
batch = list(remaining) # Force include remaining to avoid infinite loop
|
||||
|
||||
batches.append(batch)
|
||||
completed.update(batch)
|
||||
remaining -= set(batch)
|
||||
|
||||
return batches
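
# Illustrative sketch (not part of the original module): the same batching
# rule as get_execution_order, re-run synchronously on a hardcoded graph so
# the expected output is easy to verify. Issue numbers are made up.
def batch_by_dependencies(graph: dict) -> list:
    completed = set()
    remaining = set(graph)
    batches = []
    while remaining:
        # Ready when every dependency is already completed.
        batch = [n for n in remaining if all(dep in completed for dep in graph[n])]
        if not batch:
            batch = list(remaining)  # circular dependency: force progress, as above
        batches.append(sorted(batch))
        completed.update(batch)
        remaining -= set(batch)
    return batches

# 12 and 13 both wait on 11; 14 waits on 12 and 13.
assert batch_by_dependencies({11: [], 12: [11], 13: [11], 14: [12, 13]}) == [[11], [12, 13], [14]]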
@@ -1,261 +0,0 @@
|
||||
"""
|
||||
Issue management tools for MCP server.
|
||||
|
||||
Provides async wrappers for issue CRUD operations with:
|
||||
- Branch-aware security
|
||||
- PMO multi-repo support
|
||||
- Comprehensive error handling
|
||||
"""
|
||||
import asyncio
|
||||
import subprocess
|
||||
import logging
|
||||
from typing import List, Dict, Optional
|
||||
|
||||
logging.basicConfig(level=logging.INFO)
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class IssueTools:
|
||||
"""Async wrappers for Gitea issue operations with branch detection"""
|
||||
|
||||
def __init__(self, gitea_client):
|
||||
"""
|
||||
Initialize issue tools.
|
||||
|
||||
Args:
|
||||
gitea_client: GiteaClient instance
|
||||
"""
|
||||
self.gitea = gitea_client
|
||||
|
||||
def _get_current_branch(self) -> str:
|
||||
"""
|
||||
Get current git branch.
|
||||
|
||||
Returns:
|
||||
Current branch name or 'unknown' if not in a git repo
|
||||
"""
|
||||
try:
|
||||
result = subprocess.run(
|
||||
['git', 'rev-parse', '--abbrev-ref', 'HEAD'],
|
||||
capture_output=True,
|
||||
text=True,
|
||||
check=True
|
||||
)
|
||||
return result.stdout.strip()
|
||||
except subprocess.CalledProcessError:
|
||||
return "unknown"
|
||||
|
||||
def _check_branch_permissions(self, operation: str) -> bool:
|
||||
"""
|
||||
Check if operation is allowed on current branch.
|
||||
|
||||
Args:
|
||||
operation: Operation name (list_issues, create_issue, etc.)
|
||||
|
||||
Returns:
|
||||
True if operation is allowed, False otherwise
|
||||
"""
|
||||
branch = self._get_current_branch()
|
||||
|
||||
# Production branches: allow read-only issue access (list/get issues and labels)
if branch in ['main', 'master'] or branch.startswith('prod/'):
return operation in ['list_issues', 'get_issue', 'get_labels']

# Staging branches: read access plus issue creation
if branch == 'staging' or branch.startswith('stage/'):
return operation in ['list_issues', 'get_issue', 'get_labels', 'create_issue']

# Development branches: full access
if branch in ['development', 'develop'] or branch.startswith(('feat/', 'feature/', 'dev/')):
return True

# Unknown branch - be restrictive
return False
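
# Worked examples of the policy above (derived from the allow-lists, not new
# behavior): on 'main', 'master' or 'prod/*' only list_issues, get_issue and
# get_labels pass; 'staging' and 'stage/*' additionally allow create_issue;
# 'development', 'develop', 'feat/*', 'feature/*' and 'dev/*' allow every
# operation; any other branch (e.g. a hypothetical 'hotfix/login') is rejected.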
async def list_issues(
|
||||
self,
|
||||
state: str = 'open',
|
||||
labels: Optional[List[str]] = None,
|
||||
repo: Optional[str] = None
|
||||
) -> List[Dict]:
|
||||
"""
|
||||
List issues from repository (async wrapper).
|
||||
|
||||
Args:
|
||||
state: Issue state (open, closed, all)
|
||||
labels: Filter by labels
|
||||
repo: Override configured repo (for PMO multi-repo)
|
||||
|
||||
Returns:
|
||||
List of issue dictionaries
|
||||
|
||||
Raises:
|
||||
PermissionError: If operation not allowed on current branch
|
||||
"""
|
||||
if not self._check_branch_permissions('list_issues'):
|
||||
branch = self._get_current_branch()
|
||||
raise PermissionError(
|
||||
f"Cannot list issues on branch '{branch}'. "
|
||||
f"Switch to a development branch."
|
||||
)
|
||||
|
||||
loop = asyncio.get_event_loop()
|
||||
return await loop.run_in_executor(
|
||||
None,
|
||||
lambda: self.gitea.list_issues(state, labels, repo)
|
||||
)
|
||||
|
||||
async def get_issue(
|
||||
self,
|
||||
issue_number: int,
|
||||
repo: Optional[str] = None
|
||||
) -> Dict:
|
||||
"""
|
||||
Get specific issue details (async wrapper).
|
||||
|
||||
Args:
|
||||
issue_number: Issue number
|
||||
repo: Override configured repo (for PMO multi-repo)
|
||||
|
||||
Returns:
|
||||
Issue dictionary
|
||||
|
||||
Raises:
|
||||
PermissionError: If operation not allowed on current branch
|
||||
"""
|
||||
if not self._check_branch_permissions('get_issue'):
|
||||
branch = self._get_current_branch()
|
||||
raise PermissionError(
|
||||
f"Cannot get issue on branch '{branch}'. "
|
||||
f"Switch to a development branch."
|
||||
)
|
||||
|
||||
loop = asyncio.get_event_loop()
|
||||
return await loop.run_in_executor(
|
||||
None,
|
||||
lambda: self.gitea.get_issue(issue_number, repo)
|
||||
)
|
||||
|
||||
async def create_issue(
|
||||
self,
|
||||
title: str,
|
||||
body: str,
|
||||
labels: Optional[List[str]] = None,
|
||||
repo: Optional[str] = None
|
||||
) -> Dict:
|
||||
"""
|
||||
Create new issue (async wrapper with branch check).
|
||||
|
||||
Args:
|
||||
title: Issue title
|
||||
body: Issue description
|
||||
labels: List of label names
|
||||
repo: Override configured repo (for PMO multi-repo)
|
||||
|
||||
Returns:
|
||||
Created issue dictionary
|
||||
|
||||
Raises:
|
||||
PermissionError: If operation not allowed on current branch
|
||||
"""
|
||||
if not self._check_branch_permissions('create_issue'):
|
||||
branch = self._get_current_branch()
|
||||
raise PermissionError(
|
||||
f"Cannot create issues on branch '{branch}'. "
|
||||
f"Switch to a development branch to create issues."
|
||||
)
|
||||
|
||||
loop = asyncio.get_event_loop()
|
||||
return await loop.run_in_executor(
|
||||
None,
|
||||
lambda: self.gitea.create_issue(title, body, labels, repo)
|
||||
)
|
||||
|
||||
async def update_issue(
|
||||
self,
|
||||
issue_number: int,
|
||||
title: Optional[str] = None,
|
||||
body: Optional[str] = None,
|
||||
state: Optional[str] = None,
|
||||
labels: Optional[List[str]] = None,
|
||||
repo: Optional[str] = None
|
||||
) -> Dict:
|
||||
"""
|
||||
Update existing issue (async wrapper with branch check).
|
||||
|
||||
Args:
|
||||
issue_number: Issue number
|
||||
title: New title (optional)
|
||||
body: New body (optional)
|
||||
state: New state - 'open' or 'closed' (optional)
|
||||
labels: New labels (optional)
|
||||
repo: Override configured repo (for PMO multi-repo)
|
||||
|
||||
Returns:
|
||||
Updated issue dictionary
|
||||
|
||||
Raises:
|
||||
PermissionError: If operation not allowed on current branch
|
||||
"""
|
||||
if not self._check_branch_permissions('update_issue'):
|
||||
branch = self._get_current_branch()
|
||||
raise PermissionError(
|
||||
f"Cannot update issues on branch '{branch}'. "
|
||||
f"Switch to a development branch to update issues."
|
||||
)
|
||||
|
||||
loop = asyncio.get_event_loop()
|
||||
return await loop.run_in_executor(
|
||||
None,
|
||||
lambda: self.gitea.update_issue(issue_number, title, body, state, labels, repo)
|
||||
)
|
||||
|
||||
async def add_comment(
|
||||
self,
|
||||
issue_number: int,
|
||||
comment: str,
|
||||
repo: Optional[str] = None
|
||||
) -> Dict:
|
||||
"""
|
||||
Add comment to issue (async wrapper with branch check).
|
||||
|
||||
Args:
|
||||
issue_number: Issue number
|
||||
comment: Comment text
|
||||
repo: Override configured repo (for PMO multi-repo)
|
||||
|
||||
Returns:
|
||||
Created comment dictionary
|
||||
|
||||
Raises:
|
||||
PermissionError: If operation not allowed on current branch
|
||||
"""
|
||||
if not self._check_branch_permissions('add_comment'):
|
||||
branch = self._get_current_branch()
|
||||
raise PermissionError(
|
||||
f"Cannot add comments on branch '{branch}'. "
|
||||
f"Switch to a development branch to add comments."
|
||||
)
|
||||
|
||||
loop = asyncio.get_event_loop()
|
||||
return await loop.run_in_executor(
|
||||
None,
|
||||
lambda: self.gitea.add_comment(issue_number, comment, repo)
|
||||
)
|
||||
|
||||
async def aggregate_issues(
|
||||
self,
|
||||
org: str,
|
||||
state: str = 'open',
|
||||
labels: Optional[List[str]] = None
|
||||
) -> Dict[str, List[Dict]]:
|
||||
"""Aggregate issues across all repositories in org."""
|
||||
if not self._check_branch_permissions('aggregate_issues'):
|
||||
branch = self._get_current_branch()
|
||||
raise PermissionError(f"Cannot aggregate issues on branch '{branch}'.")
|
||||
|
||||
loop = asyncio.get_event_loop()
|
||||
return await loop.run_in_executor(
|
||||
None,
|
||||
lambda: self.gitea.aggregate_issues(org, state, labels)
|
||||
)
|
||||
@@ -1,158 +0,0 @@
|
||||
"""
|
||||
Label management tools for MCP server.
|
||||
|
||||
Provides async wrappers for label operations with:
|
||||
- Label taxonomy retrieval
|
||||
- Intelligent label suggestion
|
||||
- Dynamic label detection
|
||||
"""
|
||||
import asyncio
|
||||
import logging
|
||||
from typing import List, Dict, Optional
|
||||
|
||||
logging.basicConfig(level=logging.INFO)
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class LabelTools:
|
||||
"""Async wrappers for Gitea label operations"""
|
||||
|
||||
def __init__(self, gitea_client):
|
||||
"""
|
||||
Initialize label tools.
|
||||
|
||||
Args:
|
||||
gitea_client: GiteaClient instance
|
||||
"""
|
||||
self.gitea = gitea_client
|
||||
|
||||
async def get_labels(self, repo: Optional[str] = None) -> Dict[str, List[Dict]]:
|
||||
"""Get all labels (org + repo). Repo must be 'owner/repo' format."""
|
||||
loop = asyncio.get_event_loop()
|
||||
|
||||
target_repo = repo or self.gitea.repo
|
||||
if not target_repo or '/' not in target_repo:
|
||||
raise ValueError("Use 'owner/repo' format (e.g. 'org/repo-name')")
|
||||
|
||||
org = target_repo.split('/')[0]
|
||||
|
||||
org_labels = await loop.run_in_executor(
|
||||
None,
|
||||
lambda: self.gitea.get_org_labels(org)
|
||||
)
|
||||
|
||||
repo_labels = await loop.run_in_executor(
|
||||
None,
|
||||
lambda: self.gitea.get_labels(target_repo)
|
||||
)
|
||||
|
||||
return {
|
||||
'organization': org_labels,
|
||||
'repository': repo_labels,
|
||||
'total_count': len(org_labels) + len(repo_labels)
|
||||
}
|
||||
|
||||
async def suggest_labels(self, context: str) -> List[str]:
|
||||
"""
|
||||
Analyze context and suggest appropriate labels.
|
||||
|
||||
Args:
|
||||
context: Issue title + description or sprint context
|
||||
|
||||
Returns:
|
||||
List of suggested label names
|
||||
"""
|
||||
suggested = []
|
||||
context_lower = context.lower()
|
||||
|
||||
# Type detection (exclusive - only one)
|
||||
if any(word in context_lower for word in ['bug', 'error', 'fix', 'broken', 'crash', 'fail']):
|
||||
suggested.append('Type/Bug')
|
||||
elif any(word in context_lower for word in ['refactor', 'extract', 'restructure', 'architecture', 'service extraction']):
|
||||
suggested.append('Type/Refactor')
|
||||
elif any(word in context_lower for word in ['feature', 'add', 'implement', 'new', 'create']):
|
||||
suggested.append('Type/Feature')
|
||||
elif any(word in context_lower for word in ['docs', 'documentation', 'readme', 'guide']):
|
||||
suggested.append('Type/Documentation')
|
||||
elif any(word in context_lower for word in ['test', 'testing', 'spec', 'coverage']):
|
||||
suggested.append('Type/Test')
|
||||
elif any(word in context_lower for word in ['chore', 'maintenance', 'update', 'upgrade']):
|
||||
suggested.append('Type/Chore')
|
||||
|
||||
# Priority detection
|
||||
if any(word in context_lower for word in ['critical', 'urgent', 'blocker', 'blocking', 'emergency']):
|
||||
suggested.append('Priority/Critical')
|
||||
elif any(word in context_lower for word in ['high', 'important', 'asap', 'soon']):
|
||||
suggested.append('Priority/High')
|
||||
elif any(word in context_lower for word in ['low', 'nice-to-have', 'optional', 'later']):
|
||||
suggested.append('Priority/Low')
|
||||
else:
|
||||
suggested.append('Priority/Medium')
|
||||
|
||||
# Complexity detection
|
||||
if any(word in context_lower for word in ['simple', 'trivial', 'easy', 'quick']):
|
||||
suggested.append('Complexity/Simple')
|
||||
elif any(word in context_lower for word in ['complex', 'difficult', 'challenging', 'intricate']):
|
||||
suggested.append('Complexity/Complex')
|
||||
else:
|
||||
suggested.append('Complexity/Medium')
|
||||
|
||||
# Efforts detection
|
||||
if any(word in context_lower for word in ['xs', 'tiny', '1 hour', '2 hours']):
|
||||
suggested.append('Efforts/XS')
|
||||
elif any(word in context_lower for word in ['small', 's ', '1 day', 'half day']):
|
||||
suggested.append('Efforts/S')
|
||||
elif any(word in context_lower for word in ['medium', 'm ', '2 days', '3 days']):
|
||||
suggested.append('Efforts/M')
|
||||
elif any(word in context_lower for word in ['large', 'l ', '1 week', '5 days']):
|
||||
suggested.append('Efforts/L')
|
||||
elif any(word in context_lower for word in ['xl', 'extra large', '2 weeks', 'sprint']):
|
||||
suggested.append('Efforts/XL')
|
||||
|
||||
# Component detection (based on keywords)
|
||||
component_keywords = {
|
||||
'Component/Backend': ['backend', 'server', 'api', 'database', 'service'],
|
||||
'Component/Frontend': ['frontend', 'ui', 'interface', 'react', 'vue', 'component'],
|
||||
'Component/API': ['api', 'endpoint', 'rest', 'graphql', 'route'],
|
||||
'Component/Database': ['database', 'db', 'sql', 'migration', 'schema', 'postgres'],
|
||||
'Component/Auth': ['auth', 'authentication', 'login', 'oauth', 'token', 'session'],
|
||||
'Component/Deploy': ['deploy', 'deployment', 'docker', 'kubernetes', 'ci/cd'],
|
||||
'Component/Testing': ['test', 'testing', 'spec', 'jest', 'pytest', 'coverage'],
|
||||
'Component/Docs': ['docs', 'documentation', 'readme', 'guide', 'wiki']
|
||||
}
|
||||
|
||||
for label, keywords in component_keywords.items():
|
||||
if any(keyword in context_lower for keyword in keywords):
|
||||
suggested.append(label)
|
||||
|
||||
# Tech stack detection
|
||||
tech_keywords = {
|
||||
'Tech/Python': ['python', 'fastapi', 'django', 'flask', 'pytest'],
|
||||
'Tech/JavaScript': ['javascript', 'js', 'node', 'npm', 'yarn'],
|
||||
'Tech/Docker': ['docker', 'dockerfile', 'container', 'compose'],
|
||||
'Tech/PostgreSQL': ['postgres', 'postgresql', 'psql', 'sql'],
|
||||
'Tech/Redis': ['redis', 'cache', 'session store'],
|
||||
'Tech/Vue': ['vue', 'vuejs', 'nuxt'],
|
||||
'Tech/FastAPI': ['fastapi', 'pydantic', 'starlette']
|
||||
}
|
||||
|
||||
for label, keywords in tech_keywords.items():
|
||||
if any(keyword in context_lower for keyword in keywords):
|
||||
suggested.append(label)
|
||||
|
||||
# Source detection (based on git branch or context)
|
||||
if 'development' in context_lower or 'dev/' in context_lower:
|
||||
suggested.append('Source/Development')
|
||||
elif 'staging' in context_lower or 'stage/' in context_lower:
|
||||
suggested.append('Source/Staging')
|
||||
elif 'production' in context_lower or 'prod' in context_lower:
|
||||
suggested.append('Source/Production')
|
||||
|
||||
# Risk detection
|
||||
if any(word in context_lower for word in ['breaking', 'breaking change', 'major', 'risky']):
|
||||
suggested.append('Risk/High')
|
||||
elif any(word in context_lower for word in ['safe', 'low risk', 'minor']):
|
||||
suggested.append('Risk/Low')
|
||||
|
||||
logger.info(f"Suggested {len(suggested)} labels based on context")
|
||||
return suggested
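
# Illustrative sketch (not part of the original module): what the heuristics
# above produce for one sample context. suggest_labels never touches the Gitea
# client, so a placeholder client is enough here; the expected list is traced
# from the keyword rules above.
import asyncio

tools = LabelTools(gitea_client=None)  # client unused by suggest_labels
labels = asyncio.run(tools.suggest_labels("Fix broken login API endpoint"))
assert labels == [
    'Type/Bug',           # 'fix' / 'broken'
    'Priority/Medium',    # no priority keyword -> default
    'Complexity/Medium',  # no complexity keyword -> default
    'Component/Backend',  # 'api'
    'Component/API',      # 'api', 'endpoint'
    'Component/Auth',     # 'login'
]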
@@ -1,145 +0,0 @@
|
||||
"""
|
||||
Milestone management tools for MCP server.
|
||||
|
||||
Provides async wrappers for milestone operations:
|
||||
- CRUD operations for milestones
|
||||
- Milestone-sprint relationship tracking
|
||||
"""
|
||||
import asyncio
|
||||
import logging
|
||||
from typing import List, Dict, Optional
|
||||
|
||||
logging.basicConfig(level=logging.INFO)
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class MilestoneTools:
|
||||
"""Async wrappers for Gitea milestone operations"""
|
||||
|
||||
def __init__(self, gitea_client):
|
||||
"""
|
||||
Initialize milestone tools.
|
||||
|
||||
Args:
|
||||
gitea_client: GiteaClient instance
|
||||
"""
|
||||
self.gitea = gitea_client
|
||||
|
||||
async def list_milestones(
|
||||
self,
|
||||
state: str = 'open',
|
||||
repo: Optional[str] = None
|
||||
) -> List[Dict]:
|
||||
"""
|
||||
List all milestones in repository.
|
||||
|
||||
Args:
|
||||
state: Milestone state (open, closed, all)
|
||||
repo: Repository in owner/repo format
|
||||
|
||||
Returns:
|
||||
List of milestone dictionaries
|
||||
"""
|
||||
loop = asyncio.get_event_loop()
|
||||
return await loop.run_in_executor(
|
||||
None,
|
||||
lambda: self.gitea.list_milestones(state, repo)
|
||||
)
|
||||
|
||||
async def get_milestone(
|
||||
self,
|
||||
milestone_id: int,
|
||||
repo: Optional[str] = None
|
||||
) -> Dict:
|
||||
"""
|
||||
Get a specific milestone by ID.
|
||||
|
||||
Args:
|
||||
milestone_id: Milestone ID
|
||||
repo: Repository in owner/repo format
|
||||
|
||||
Returns:
|
||||
Milestone dictionary
|
||||
"""
|
||||
loop = asyncio.get_event_loop()
|
||||
return await loop.run_in_executor(
|
||||
None,
|
||||
lambda: self.gitea.get_milestone(milestone_id, repo)
|
||||
)
|
||||
|
||||
async def create_milestone(
|
||||
self,
|
||||
title: str,
|
||||
description: Optional[str] = None,
|
||||
due_on: Optional[str] = None,
|
||||
repo: Optional[str] = None
|
||||
) -> Dict:
|
||||
"""
|
||||
Create a new milestone.
|
||||
|
||||
Args:
|
||||
title: Milestone title (e.g., "v2.0 Release", "Sprint 17")
|
||||
description: Milestone description
|
||||
due_on: Due date in ISO 8601 format (e.g., "2025-02-01T00:00:00Z")
|
||||
repo: Repository in owner/repo format
|
||||
|
||||
Returns:
|
||||
Created milestone dictionary
|
||||
"""
|
||||
loop = asyncio.get_event_loop()
|
||||
return await loop.run_in_executor(
|
||||
None,
|
||||
lambda: self.gitea.create_milestone(title, description, due_on, repo)
|
||||
)
|
||||
|
||||
async def update_milestone(
|
||||
self,
|
||||
milestone_id: int,
|
||||
title: Optional[str] = None,
|
||||
description: Optional[str] = None,
|
||||
state: Optional[str] = None,
|
||||
due_on: Optional[str] = None,
|
||||
repo: Optional[str] = None
|
||||
) -> Dict:
|
||||
"""
|
||||
Update an existing milestone.
|
||||
|
||||
Args:
|
||||
milestone_id: Milestone ID
|
||||
title: New title (optional)
|
||||
description: New description (optional)
|
||||
state: New state - 'open' or 'closed' (optional)
|
||||
due_on: New due date (optional)
|
||||
repo: Repository in owner/repo format
|
||||
|
||||
Returns:
|
||||
Updated milestone dictionary
|
||||
"""
|
||||
loop = asyncio.get_event_loop()
|
||||
return await loop.run_in_executor(
|
||||
None,
|
||||
lambda: self.gitea.update_milestone(
|
||||
milestone_id, title, description, state, due_on, repo
|
||||
)
|
||||
)
|
||||
|
||||
async def delete_milestone(
|
||||
self,
|
||||
milestone_id: int,
|
||||
repo: Optional[str] = None
|
||||
) -> bool:
|
||||
"""
|
||||
Delete a milestone.
|
||||
|
||||
Args:
|
||||
milestone_id: Milestone ID
|
||||
repo: Repository in owner/repo format
|
||||
|
||||
Returns:
|
||||
True if deleted successfully
|
||||
"""
|
||||
loop = asyncio.get_event_loop()
|
||||
return await loop.run_in_executor(
|
||||
None,
|
||||
lambda: self.gitea.delete_milestone(milestone_id, repo)
|
||||
)
|
||||
@@ -1,149 +0,0 @@
|
||||
"""
|
||||
Wiki management tools for MCP server.
|
||||
|
||||
Provides async wrappers for wiki operations to support lessons learned:
|
||||
- Page CRUD operations
|
||||
- Lessons learned creation and search
|
||||
"""
|
||||
import asyncio
|
||||
import logging
|
||||
from typing import List, Dict, Optional
|
||||
|
||||
logging.basicConfig(level=logging.INFO)
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class WikiTools:
|
||||
"""Async wrappers for Gitea wiki operations"""
|
||||
|
||||
def __init__(self, gitea_client):
|
||||
"""
|
||||
Initialize wiki tools.
|
||||
|
||||
Args:
|
||||
gitea_client: GiteaClient instance
|
||||
"""
|
||||
self.gitea = gitea_client
|
||||
|
||||
async def list_wiki_pages(self, repo: Optional[str] = None) -> List[Dict]:
|
||||
"""List all wiki pages in repository."""
|
||||
loop = asyncio.get_event_loop()
|
||||
return await loop.run_in_executor(
|
||||
None,
|
||||
lambda: self.gitea.list_wiki_pages(repo)
|
||||
)
|
||||
|
||||
async def get_wiki_page(
|
||||
self,
|
||||
page_name: str,
|
||||
repo: Optional[str] = None
|
||||
) -> Dict:
|
||||
"""Get a specific wiki page by name."""
|
||||
loop = asyncio.get_event_loop()
|
||||
return await loop.run_in_executor(
|
||||
None,
|
||||
lambda: self.gitea.get_wiki_page(page_name, repo)
|
||||
)
|
||||
|
||||
async def create_wiki_page(
|
||||
self,
|
||||
title: str,
|
||||
content: str,
|
||||
repo: Optional[str] = None
|
||||
) -> Dict:
|
||||
"""Create a new wiki page."""
|
||||
loop = asyncio.get_event_loop()
|
||||
return await loop.run_in_executor(
|
||||
None,
|
||||
lambda: self.gitea.create_wiki_page(title, content, repo)
|
||||
)
|
||||
|
||||
async def update_wiki_page(
|
||||
self,
|
||||
page_name: str,
|
||||
content: str,
|
||||
repo: Optional[str] = None
|
||||
) -> Dict:
|
||||
"""Update an existing wiki page."""
|
||||
loop = asyncio.get_event_loop()
|
||||
return await loop.run_in_executor(
|
||||
None,
|
||||
lambda: self.gitea.update_wiki_page(page_name, content, repo)
|
||||
)
|
||||
|
||||
async def delete_wiki_page(
|
||||
self,
|
||||
page_name: str,
|
||||
repo: Optional[str] = None
|
||||
) -> bool:
|
||||
"""Delete a wiki page."""
|
||||
loop = asyncio.get_event_loop()
|
||||
return await loop.run_in_executor(
|
||||
None,
|
||||
lambda: self.gitea.delete_wiki_page(page_name, repo)
|
||||
)
|
||||
|
||||
async def search_wiki_pages(
|
||||
self,
|
||||
query: str,
|
||||
repo: Optional[str] = None
|
||||
) -> List[Dict]:
|
||||
"""Search wiki pages by title."""
|
||||
loop = asyncio.get_event_loop()
|
||||
return await loop.run_in_executor(
|
||||
None,
|
||||
lambda: self.gitea.search_wiki_pages(query, repo)
|
||||
)
|
||||
|
||||
async def create_lesson(
|
||||
self,
|
||||
title: str,
|
||||
content: str,
|
||||
tags: List[str],
|
||||
category: str = "sprints",
|
||||
repo: Optional[str] = None
|
||||
) -> Dict:
|
||||
"""
|
||||
Create a lessons learned entry in the wiki.
|
||||
|
||||
Args:
|
||||
title: Lesson title (e.g., "Sprint 16 - Prevent Infinite Loops")
|
||||
content: Lesson content in markdown
|
||||
tags: List of tags for categorization
|
||||
category: Category (sprints, patterns, architecture, etc.)
|
||||
repo: Repository in owner/repo format
|
||||
|
||||
Returns:
|
||||
Created wiki page
|
||||
"""
|
||||
loop = asyncio.get_event_loop()
|
||||
return await loop.run_in_executor(
|
||||
None,
|
||||
lambda: self.gitea.create_lesson(title, content, tags, category, repo)
|
||||
)
|
||||
|
||||
async def search_lessons(
|
||||
self,
|
||||
query: Optional[str] = None,
|
||||
tags: Optional[List[str]] = None,
|
||||
limit: int = 20,
|
||||
repo: Optional[str] = None
|
||||
) -> List[Dict]:
|
||||
"""
|
||||
Search lessons learned from previous sprints.
|
||||
|
||||
Args:
|
||||
query: Search query (optional)
|
||||
tags: Tags to filter by (optional)
|
||||
limit: Maximum results (default 20)
|
||||
repo: Repository in owner/repo format
|
||||
|
||||
Returns:
|
||||
List of matching lessons
|
||||
"""
|
||||
loop = asyncio.get_event_loop()
|
||||
results = await loop.run_in_executor(
|
||||
None,
|
||||
lambda: self.gitea.search_lessons(query, tags, repo)
|
||||
)
|
||||
return results[:limit]
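
# Illustrative sketch (not part of the original module): recording and
# retrieving a lesson through the wrappers above. Assumes a GiteaClient
# instance `client` configured as elsewhere in this server; the tags and the
# repository name are made up for the example.
import asyncio

async def demo_lessons(client):
    wiki = WikiTools(client)
    await wiki.create_lesson(
        title="Sprint 16 - Prevent Infinite Loops",
        content="## Context\n...\n## Prevention\n...",
        tags=["sprints", "reliability"],
        category="sprints",
        repo="acme/web-app",
    )
    return await wiki.search_lessons(tags=["reliability"], limit=5, repo="acme/web-app")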
@@ -1,6 +0,0 @@
|
||||
mcp>=0.9.0 # MCP SDK from Anthropic
|
||||
python-dotenv>=1.0.0 # Environment variable loading
|
||||
requests>=2.31.0 # HTTP client for Gitea API
|
||||
pydantic>=2.5.0 # Data validation
|
||||
pytest>=7.4.3 # Testing framework
|
||||
pytest-asyncio>=0.23.0 # Async testing support
|
||||
@@ -1,151 +0,0 @@
|
||||
"""
|
||||
Unit tests for configuration loader.
|
||||
"""
|
||||
import pytest
|
||||
from pathlib import Path
|
||||
import os
|
||||
from mcp_server.config import GiteaConfig
|
||||
|
||||
|
||||
def test_load_system_config(tmp_path, monkeypatch):
|
||||
"""Test loading system-level configuration"""
|
||||
# Mock home directory
|
||||
config_dir = tmp_path / '.config' / 'claude'
|
||||
config_dir.mkdir(parents=True)
|
||||
|
||||
config_file = config_dir / 'gitea.env'
|
||||
config_file.write_text(
|
||||
"GITEA_API_URL=https://test.com/api/v1\n"
|
||||
"GITEA_API_TOKEN=test_token\n"
|
||||
"GITEA_OWNER=test_owner\n"
|
||||
)
|
||||
|
||||
monkeypatch.setenv('HOME', str(tmp_path))
|
||||
monkeypatch.chdir(tmp_path)
|
||||
|
||||
config = GiteaConfig()
|
||||
result = config.load()
|
||||
|
||||
assert result['api_url'] == 'https://test.com/api/v1'
|
||||
assert result['api_token'] == 'test_token'
|
||||
assert result['owner'] == 'test_owner'
|
||||
assert result['mode'] == 'company' # No repo specified
|
||||
assert result['repo'] is None
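
# The layering exercised by these tests: ~/.config/claude/gitea.env supplies
# the shared credentials (company/PMO mode when GITEA_REPO is absent), and a
# project-level .env can add GITEA_REPO, which flips the detected mode to
# 'project'. Paths and values here are test fixtures, not real endpoints.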
|
||||
|
||||
|
||||
def test_project_config_override(tmp_path, monkeypatch):
|
||||
"""Test that project config overrides system config"""
|
||||
# Set up system config
|
||||
system_config_dir = tmp_path / '.config' / 'claude'
|
||||
system_config_dir.mkdir(parents=True)
|
||||
|
||||
system_config = system_config_dir / 'gitea.env'
|
||||
system_config.write_text(
|
||||
"GITEA_API_URL=https://test.com/api/v1\n"
|
||||
"GITEA_API_TOKEN=test_token\n"
|
||||
"GITEA_OWNER=test_owner\n"
|
||||
)
|
||||
|
||||
# Set up project config
|
||||
project_dir = tmp_path / 'project'
|
||||
project_dir.mkdir()
|
||||
|
||||
project_config = project_dir / '.env'
|
||||
project_config.write_text("GITEA_REPO=test_repo\n")
|
||||
|
||||
monkeypatch.setenv('HOME', str(tmp_path))
|
||||
monkeypatch.chdir(project_dir)
|
||||
|
||||
config = GiteaConfig()
|
||||
result = config.load()
|
||||
|
||||
assert result['repo'] == 'test_repo'
|
||||
assert result['mode'] == 'project'
|
||||
|
||||
|
||||
def test_missing_system_config(tmp_path, monkeypatch):
|
||||
"""Test error handling for missing system configuration"""
|
||||
monkeypatch.setenv('HOME', str(tmp_path))
|
||||
monkeypatch.chdir(tmp_path)
|
||||
|
||||
with pytest.raises(FileNotFoundError) as exc_info:
|
||||
config = GiteaConfig()
|
||||
config.load()
|
||||
|
||||
assert "System config not found" in str(exc_info.value)
|
||||
|
||||
|
||||
def test_missing_required_config(tmp_path, monkeypatch):
|
||||
"""Test error handling for missing required variables"""
|
||||
# Clear environment variables
|
||||
for var in ['GITEA_API_URL', 'GITEA_API_TOKEN', 'GITEA_OWNER', 'GITEA_REPO']:
|
||||
monkeypatch.delenv(var, raising=False)
|
||||
|
||||
# Create incomplete config
|
||||
config_dir = tmp_path / '.config' / 'claude'
|
||||
config_dir.mkdir(parents=True)
|
||||
|
||||
config_file = config_dir / 'gitea.env'
|
||||
config_file.write_text(
|
||||
"GITEA_API_URL=https://test.com/api/v1\n"
|
||||
# Missing GITEA_API_TOKEN and GITEA_OWNER
|
||||
)
|
||||
|
||||
monkeypatch.setenv('HOME', str(tmp_path))
|
||||
monkeypatch.chdir(tmp_path)
|
||||
|
||||
with pytest.raises(ValueError) as exc_info:
|
||||
config = GiteaConfig()
|
||||
config.load()
|
||||
|
||||
assert "Missing required configuration" in str(exc_info.value)
|
||||
|
||||
|
||||
def test_mode_detection_project(tmp_path, monkeypatch):
|
||||
"""Test mode detection for project mode"""
|
||||
config_dir = tmp_path / '.config' / 'claude'
|
||||
config_dir.mkdir(parents=True)
|
||||
|
||||
config_file = config_dir / 'gitea.env'
|
||||
config_file.write_text(
|
||||
"GITEA_API_URL=https://test.com/api/v1\n"
|
||||
"GITEA_API_TOKEN=test_token\n"
|
||||
"GITEA_OWNER=test_owner\n"
|
||||
"GITEA_REPO=test_repo\n"
|
||||
)
|
||||
|
||||
monkeypatch.setenv('HOME', str(tmp_path))
|
||||
monkeypatch.chdir(tmp_path)
|
||||
|
||||
config = GiteaConfig()
|
||||
result = config.load()
|
||||
|
||||
assert result['mode'] == 'project'
|
||||
assert result['repo'] == 'test_repo'
|
||||
|
||||
|
||||
def test_mode_detection_company(tmp_path, monkeypatch):
|
||||
"""Test mode detection for company mode (PMO)"""
|
||||
# Clear environment variables, especially GITEA_REPO
|
||||
for var in ['GITEA_API_URL', 'GITEA_API_TOKEN', 'GITEA_OWNER', 'GITEA_REPO']:
|
||||
monkeypatch.delenv(var, raising=False)
|
||||
|
||||
config_dir = tmp_path / '.config' / 'claude'
|
||||
config_dir.mkdir(parents=True)
|
||||
|
||||
config_file = config_dir / 'gitea.env'
|
||||
config_file.write_text(
|
||||
"GITEA_API_URL=https://test.com/api/v1\n"
|
||||
"GITEA_API_TOKEN=test_token\n"
|
||||
"GITEA_OWNER=test_owner\n"
|
||||
# No GITEA_REPO
|
||||
)
|
||||
|
||||
monkeypatch.setenv('HOME', str(tmp_path))
|
||||
monkeypatch.chdir(tmp_path)
|
||||
|
||||
config = GiteaConfig()
|
||||
result = config.load()
|
||||
|
||||
assert result['mode'] == 'company'
|
||||
assert result['repo'] is None
|
||||
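The removed tests above pin down the contract of `GiteaConfig.load()`: a required system file at `~/.config/claude/gitea.env`, an optional project-level `.env` in the working directory that can add `GITEA_REPO`, and a `mode` of `'project'` when a repo is set and `'company'` otherwise. The sketch below only illustrates that contract; the helper names (`GiteaConfigSketch`, `_parse_env_file`) are hypothetical, and the real implementation in `mcp_server.config` may differ.

```python
# Illustrative sketch only, reconstructed from the removed tests -- not the
# actual mcp_server.config implementation.
from pathlib import Path


def _parse_env_file(path: Path) -> dict:
    """Parse simple KEY=VALUE lines, ignoring blanks and comments."""
    values = {}
    for line in path.read_text().splitlines():
        line = line.strip()
        if line and not line.startswith('#') and '=' in line:
            key, _, value = line.partition('=')
            values[key.strip()] = value.strip()
    return values


class GiteaConfigSketch:
    SYSTEM_CONFIG = Path('~/.config/claude/gitea.env')

    def load(self) -> dict:
        system_path = self.SYSTEM_CONFIG.expanduser()
        if not system_path.exists():
            raise FileNotFoundError(f"System config not found: {system_path}")

        values = _parse_env_file(system_path)

        # Project-level .env in the working directory overrides/extends system config.
        project_path = Path.cwd() / '.env'
        if project_path.exists():
            values.update(_parse_env_file(project_path))

        required = ('GITEA_API_URL', 'GITEA_API_TOKEN', 'GITEA_OWNER')
        missing = [key for key in required if not values.get(key)]
        if missing:
            raise ValueError(f"Missing required configuration: {', '.join(missing)}")

        repo = values.get('GITEA_REPO') or None
        return {
            'api_url': values['GITEA_API_URL'],
            'api_token': values['GITEA_API_TOKEN'],
            'owner': values['GITEA_OWNER'],
            'repo': repo,
            'mode': 'project' if repo else 'company',  # repo present => project mode
        }
```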
@@ -1,224 +0,0 @@
"""
Unit tests for Gitea API client.
"""
import pytest
from unittest.mock import Mock, patch, MagicMock
from mcp_server.gitea_client import GiteaClient


@pytest.fixture
def mock_config():
    """Fixture providing mocked configuration"""
    with patch('mcp_server.gitea_client.GiteaConfig') as mock_cfg:
        mock_instance = mock_cfg.return_value
        mock_instance.load.return_value = {
            'api_url': 'https://test.com/api/v1',
            'api_token': 'test_token',
            'owner': 'test_owner',
            'repo': 'test_repo',
            'mode': 'project'
        }
        yield mock_cfg


@pytest.fixture
def gitea_client(mock_config):
    """Fixture providing GiteaClient instance with mocked config"""
    return GiteaClient()


def test_client_initialization(gitea_client):
    """Test client initializes with correct configuration"""
    assert gitea_client.base_url == 'https://test.com/api/v1'
    assert gitea_client.token == 'test_token'
    assert gitea_client.owner == 'test_owner'
    assert gitea_client.repo == 'test_repo'
    assert gitea_client.mode == 'project'
    assert 'Authorization' in gitea_client.session.headers
    assert gitea_client.session.headers['Authorization'] == 'token test_token'


def test_list_issues(gitea_client):
    """Test listing issues"""
    mock_response = Mock()
    mock_response.json.return_value = [
        {'number': 1, 'title': 'Test Issue 1'},
        {'number': 2, 'title': 'Test Issue 2'}
    ]
    mock_response.raise_for_status = Mock()

    with patch.object(gitea_client.session, 'get', return_value=mock_response):
        issues = gitea_client.list_issues(state='open')

        assert len(issues) == 2
        assert issues[0]['title'] == 'Test Issue 1'
        gitea_client.session.get.assert_called_once()


def test_list_issues_with_labels(gitea_client):
    """Test listing issues with label filter"""
    mock_response = Mock()
    mock_response.json.return_value = [{'number': 1, 'title': 'Bug Issue'}]
    mock_response.raise_for_status = Mock()

    with patch.object(gitea_client.session, 'get', return_value=mock_response):
        issues = gitea_client.list_issues(state='open', labels=['Type/Bug'])

        gitea_client.session.get.assert_called_once()
        call_args = gitea_client.session.get.call_args
        assert call_args[1]['params']['labels'] == 'Type/Bug'


def test_get_issue(gitea_client):
    """Test getting specific issue"""
    mock_response = Mock()
    mock_response.json.return_value = {'number': 1, 'title': 'Test Issue'}
    mock_response.raise_for_status = Mock()

    with patch.object(gitea_client.session, 'get', return_value=mock_response):
        issue = gitea_client.get_issue(1)

        assert issue['number'] == 1
        assert issue['title'] == 'Test Issue'


def test_create_issue(gitea_client):
    """Test creating new issue"""
    mock_response = Mock()
    mock_response.json.return_value = {
        'number': 1,
        'title': 'New Issue',
        'body': 'Issue body'
    }
    mock_response.raise_for_status = Mock()

    with patch.object(gitea_client.session, 'post', return_value=mock_response):
        issue = gitea_client.create_issue(
            title='New Issue',
            body='Issue body',
            labels=['Type/Bug']
        )

        assert issue['title'] == 'New Issue'
        gitea_client.session.post.assert_called_once()


def test_update_issue(gitea_client):
    """Test updating existing issue"""
    mock_response = Mock()
    mock_response.json.return_value = {
        'number': 1,
        'title': 'Updated Issue'
    }
    mock_response.raise_for_status = Mock()

    with patch.object(gitea_client.session, 'patch', return_value=mock_response):
        issue = gitea_client.update_issue(
            issue_number=1,
            title='Updated Issue'
        )

        assert issue['title'] == 'Updated Issue'
        gitea_client.session.patch.assert_called_once()


def test_add_comment(gitea_client):
    """Test adding comment to issue"""
    mock_response = Mock()
    mock_response.json.return_value = {'body': 'Test comment'}
    mock_response.raise_for_status = Mock()

    with patch.object(gitea_client.session, 'post', return_value=mock_response):
        comment = gitea_client.add_comment(1, 'Test comment')

        assert comment['body'] == 'Test comment'
        gitea_client.session.post.assert_called_once()


def test_get_labels(gitea_client):
    """Test getting repository labels"""
    mock_response = Mock()
    mock_response.json.return_value = [
        {'name': 'Type/Bug'},
        {'name': 'Priority/High'}
    ]
    mock_response.raise_for_status = Mock()

    with patch.object(gitea_client.session, 'get', return_value=mock_response):
        labels = gitea_client.get_labels()

        assert len(labels) == 2
        assert labels[0]['name'] == 'Type/Bug'


def test_get_org_labels(gitea_client):
    """Test getting organization labels"""
    mock_response = Mock()
    mock_response.json.return_value = [
        {'name': 'Type/Bug'},
        {'name': 'Type/Feature'}
    ]
    mock_response.raise_for_status = Mock()

    with patch.object(gitea_client.session, 'get', return_value=mock_response):
        labels = gitea_client.get_org_labels()

        assert len(labels) == 2


def test_list_repos(gitea_client):
    """Test listing organization repositories (PMO mode)"""
    mock_response = Mock()
    mock_response.json.return_value = [
        {'name': 'repo1'},
        {'name': 'repo2'}
    ]
    mock_response.raise_for_status = Mock()

    with patch.object(gitea_client.session, 'get', return_value=mock_response):
        repos = gitea_client.list_repos()

        assert len(repos) == 2
        assert repos[0]['name'] == 'repo1'


def test_aggregate_issues(gitea_client):
    """Test aggregating issues across repositories (PMO mode)"""
    # Mock list_repos
    gitea_client.list_repos = Mock(return_value=[
        {'name': 'repo1'},
        {'name': 'repo2'}
    ])

    # Mock list_issues
    gitea_client.list_issues = Mock(side_effect=[
        [{'number': 1, 'title': 'Issue 1'}],  # repo1
        [{'number': 2, 'title': 'Issue 2'}]   # repo2
    ])

    aggregated = gitea_client.aggregate_issues(state='open')

    assert 'repo1' in aggregated
    assert 'repo2' in aggregated
    assert len(aggregated['repo1']) == 1
    assert len(aggregated['repo2']) == 1


def test_no_repo_specified_error(gitea_client):
    """Test error when repository not specified"""
    # Create client without repo
    with patch('mcp_server.gitea_client.GiteaConfig') as mock_cfg:
        mock_instance = mock_cfg.return_value
        mock_instance.load.return_value = {
            'api_url': 'https://test.com/api/v1',
            'api_token': 'test_token',
            'owner': 'test_owner',
            'repo': None,  # No repo
            'mode': 'company'
        }
        client = GiteaClient()

        with pytest.raises(ValueError) as exc_info:
            client.list_issues()

        assert "Repository not specified" in str(exc_info.value)
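These removed client tests describe the surface of `GiteaClient`: a `requests.Session` carrying a `token` Authorization header, repo-scoped issue calls that raise `ValueError` when no repository is configured, and an `aggregate_issues()` helper that fans out over `list_repos()` in company/PMO mode. A minimal sketch of that surface follows; the class name, constructor signature, and endpoint paths are assumptions for illustration, not the actual `mcp_server.gitea_client` code.

```python
# Illustrative sketch only -- a simplified stand-in for mcp_server.gitea_client.
import requests


class GiteaClientSketch:
    def __init__(self, config: dict):
        # Assumption: the real client builds this from GiteaConfig().load() internally.
        self.base_url = config['api_url']
        self.token = config['api_token']
        self.owner = config['owner']
        self.repo = config['repo']
        self.mode = config['mode']
        self.session = requests.Session()
        self.session.headers['Authorization'] = f'token {self.token}'

    def list_issues(self, state='open', labels=None, repo=None):
        repo = repo or self.repo
        if not repo:
            raise ValueError("Repository not specified (company mode needs an explicit repo)")
        params = {'state': state}
        if labels:
            params['labels'] = ','.join(labels)
        response = self.session.get(
            f'{self.base_url}/repos/{self.owner}/{repo}/issues', params=params
        )
        response.raise_for_status()
        return response.json()

    def list_repos(self):
        response = self.session.get(f'{self.base_url}/orgs/{self.owner}/repos')
        response.raise_for_status()
        return response.json()

    def aggregate_issues(self, state='open'):
        """Company/PMO mode: collect issues keyed by repository name."""
        return {
            repo['name']: self.list_issues(state=state, repo=repo['name'])
            for repo in self.list_repos()
        }
```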
@@ -1,159 +0,0 @@
"""
Unit tests for issue tools with branch detection.
"""
import pytest
from unittest.mock import Mock, patch, AsyncMock
from mcp_server.tools.issues import IssueTools


@pytest.fixture
def mock_gitea_client():
    """Fixture providing mocked Gitea client"""
    client = Mock()
    client.mode = 'project'
    return client


@pytest.fixture
def issue_tools(mock_gitea_client):
    """Fixture providing IssueTools instance"""
    return IssueTools(mock_gitea_client)


@pytest.mark.asyncio
async def test_list_issues_development_branch(issue_tools):
    """Test listing issues on development branch (allowed)"""
    with patch.object(issue_tools, '_get_current_branch', return_value='feat/test-feature'):
        issue_tools.gitea.list_issues = Mock(return_value=[{'number': 1}])

        issues = await issue_tools.list_issues(state='open')

        assert len(issues) == 1
        issue_tools.gitea.list_issues.assert_called_once()


@pytest.mark.asyncio
async def test_create_issue_development_branch(issue_tools):
    """Test creating issue on development branch (allowed)"""
    with patch.object(issue_tools, '_get_current_branch', return_value='development'):
        issue_tools.gitea.create_issue = Mock(return_value={'number': 1})

        issue = await issue_tools.create_issue('Test', 'Body')

        assert issue['number'] == 1
        issue_tools.gitea.create_issue.assert_called_once()


@pytest.mark.asyncio
async def test_create_issue_main_branch_blocked(issue_tools):
    """Test creating issue on main branch (blocked)"""
    with patch.object(issue_tools, '_get_current_branch', return_value='main'):
        with pytest.raises(PermissionError) as exc_info:
            await issue_tools.create_issue('Test', 'Body')

        assert "Cannot create issues on branch 'main'" in str(exc_info.value)


@pytest.mark.asyncio
async def test_create_issue_staging_branch_allowed(issue_tools):
    """Test creating issue on staging branch (allowed for documentation)"""
    with patch.object(issue_tools, '_get_current_branch', return_value='staging'):
        issue_tools.gitea.create_issue = Mock(return_value={'number': 1})

        issue = await issue_tools.create_issue('Test', 'Body')

        assert issue['number'] == 1


@pytest.mark.asyncio
async def test_update_issue_main_branch_blocked(issue_tools):
    """Test updating issue on main branch (blocked)"""
    with patch.object(issue_tools, '_get_current_branch', return_value='main'):
        with pytest.raises(PermissionError) as exc_info:
            await issue_tools.update_issue(1, title='Updated')

        assert "Cannot update issues on branch 'main'" in str(exc_info.value)


@pytest.mark.asyncio
async def test_list_issues_main_branch_allowed(issue_tools):
    """Test listing issues on main branch (allowed - read-only)"""
    with patch.object(issue_tools, '_get_current_branch', return_value='main'):
        issue_tools.gitea.list_issues = Mock(return_value=[{'number': 1}])

        issues = await issue_tools.list_issues(state='open')

        assert len(issues) == 1


@pytest.mark.asyncio
async def test_get_issue(issue_tools):
    """Test getting specific issue"""
    with patch.object(issue_tools, '_get_current_branch', return_value='development'):
        issue_tools.gitea.get_issue = Mock(return_value={'number': 1, 'title': 'Test'})

        issue = await issue_tools.get_issue(1)

        assert issue['number'] == 1


@pytest.mark.asyncio
async def test_add_comment(issue_tools):
    """Test adding comment to issue"""
    with patch.object(issue_tools, '_get_current_branch', return_value='development'):
        issue_tools.gitea.add_comment = Mock(return_value={'body': 'Test comment'})

        comment = await issue_tools.add_comment(1, 'Test comment')

        assert comment['body'] == 'Test comment'


@pytest.mark.asyncio
async def test_aggregate_issues_company_mode(issue_tools):
    """Test aggregating issues in company mode"""
    issue_tools.gitea.mode = 'company'

    with patch.object(issue_tools, '_get_current_branch', return_value='development'):
        issue_tools.gitea.aggregate_issues = Mock(return_value={
            'repo1': [{'number': 1}],
            'repo2': [{'number': 2}]
        })

        aggregated = await issue_tools.aggregate_issues()

        assert 'repo1' in aggregated
        assert 'repo2' in aggregated


@pytest.mark.asyncio
async def test_aggregate_issues_project_mode_error(issue_tools):
    """Test that aggregate_issues fails in project mode"""
    issue_tools.gitea.mode = 'project'

    with patch.object(issue_tools, '_get_current_branch', return_value='development'):
        with pytest.raises(ValueError) as exc_info:
            await issue_tools.aggregate_issues()

        assert "only available in company mode" in str(exc_info.value)


def test_branch_detection():
    """Test branch detection logic"""
    tools = IssueTools(Mock())

    # Test development branches
    with patch.object(tools, '_get_current_branch', return_value='development'):
        assert tools._check_branch_permissions('create_issue') is True

    with patch.object(tools, '_get_current_branch', return_value='feat/new-feature'):
        assert tools._check_branch_permissions('create_issue') is True

    # Test production branches
    with patch.object(tools, '_get_current_branch', return_value='main'):
        assert tools._check_branch_permissions('create_issue') is False
        assert tools._check_branch_permissions('list_issues') is True

    # Test staging branches
    with patch.object(tools, '_get_current_branch', return_value='staging'):
        assert tools._check_branch_permissions('create_issue') is True
        assert tools._check_branch_permissions('update_issue') is False
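The removed `IssueTools` tests encode a branch-based permission policy: read operations are always allowed, write operations are blocked on `main`, and `staging` allows creating issues but not updating them. The sketch below captures that policy as standalone functions under those assumptions; the real `mcp_server.tools.issues` implementation keeps this logic in its `_get_current_branch()` and `_check_branch_permissions()` methods and may cover more cases.

```python
# Illustrative sketch only, reconstructed from the removed branch-permission tests.
import subprocess

READ_OPERATIONS = {'list_issues', 'get_issue', 'aggregate_issues'}


def get_current_branch() -> str:
    """Return the checked-out git branch of the working directory."""
    return subprocess.run(
        ['git', 'rev-parse', '--abbrev-ref', 'HEAD'],
        capture_output=True, text=True, check=True,
    ).stdout.strip()


def check_branch_permissions(operation: str, branch: str) -> bool:
    if operation in READ_OPERATIONS:
        return True                          # read-only tools work on any branch
    if branch == 'main':
        return False                         # no issue writes from the production branch
    if branch == 'staging':
        return operation == 'create_issue'   # staging permits documentation issues only
    return True                              # development / feature branches are unrestricted
```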
@@ -1,246 +0,0 @@
"""
Unit tests for label tools with suggestion logic.
"""
import pytest
from unittest.mock import Mock, patch
from mcp_server.tools.labels import LabelTools


@pytest.fixture
def mock_gitea_client():
    """Fixture providing mocked Gitea client"""
    client = Mock()
    client.repo = 'test_repo'
    return client


@pytest.fixture
def label_tools(mock_gitea_client):
    """Fixture providing LabelTools instance"""
    return LabelTools(mock_gitea_client)


@pytest.mark.asyncio
async def test_get_labels(label_tools):
    """Test getting all labels (org + repo)"""
    label_tools.gitea.get_org_labels = Mock(return_value=[
        {'name': 'Type/Bug'},
        {'name': 'Type/Feature'}
    ])
    label_tools.gitea.get_labels = Mock(return_value=[
        {'name': 'Component/Backend'},
        {'name': 'Component/Frontend'}
    ])

    result = await label_tools.get_labels()

    assert len(result['organization']) == 2
    assert len(result['repository']) == 2
    assert result['total_count'] == 4


@pytest.mark.asyncio
async def test_suggest_labels_bug():
    """Test label suggestion for bug context"""
    tools = LabelTools(Mock())

    context = "Fix critical bug in login authentication"
    suggestions = await tools.suggest_labels(context)

    assert 'Type/Bug' in suggestions
    assert 'Priority/Critical' in suggestions
    assert 'Component/Auth' in suggestions


@pytest.mark.asyncio
async def test_suggest_labels_feature():
    """Test label suggestion for feature context"""
    tools = LabelTools(Mock())

    context = "Add new feature to implement user dashboard"
    suggestions = await tools.suggest_labels(context)

    assert 'Type/Feature' in suggestions
    assert any('Priority' in label for label in suggestions)


@pytest.mark.asyncio
async def test_suggest_labels_refactor():
    """Test label suggestion for refactor context"""
    tools = LabelTools(Mock())

    context = "Refactor architecture to extract service layer"
    suggestions = await tools.suggest_labels(context)

    assert 'Type/Refactor' in suggestions
    assert 'Component/Backend' in suggestions


@pytest.mark.asyncio
async def test_suggest_labels_documentation():
    """Test label suggestion for documentation context"""
    tools = LabelTools(Mock())

    context = "Update documentation for API endpoints"
    suggestions = await tools.suggest_labels(context)

    assert 'Type/Documentation' in suggestions
    assert 'Component/API' in suggestions or 'Component/Docs' in suggestions


@pytest.mark.asyncio
async def test_suggest_labels_priority():
    """Test priority detection in suggestions"""
    tools = LabelTools(Mock())

    # Critical priority
    context = "Urgent blocker in production"
    suggestions = await tools.suggest_labels(context)
    assert 'Priority/Critical' in suggestions

    # High priority
    context = "Important feature needed asap"
    suggestions = await tools.suggest_labels(context)
    assert 'Priority/High' in suggestions

    # Low priority
    context = "Nice-to-have optional improvement"
    suggestions = await tools.suggest_labels(context)
    assert 'Priority/Low' in suggestions


@pytest.mark.asyncio
async def test_suggest_labels_complexity():
    """Test complexity detection in suggestions"""
    tools = LabelTools(Mock())

    # Simple complexity
    context = "Simple quick fix for typo"
    suggestions = await tools.suggest_labels(context)
    assert 'Complexity/Simple' in suggestions

    # Complex complexity
    context = "Complex challenging architecture redesign"
    suggestions = await tools.suggest_labels(context)
    assert 'Complexity/Complex' in suggestions


@pytest.mark.asyncio
async def test_suggest_labels_efforts():
    """Test efforts detection in suggestions"""
    tools = LabelTools(Mock())

    # XS effort
    context = "Tiny fix that takes 1 hour"
    suggestions = await tools.suggest_labels(context)
    assert 'Efforts/XS' in suggestions

    # L effort
    context = "Large feature taking 1 week"
    suggestions = await tools.suggest_labels(context)
    assert 'Efforts/L' in suggestions


@pytest.mark.asyncio
async def test_suggest_labels_components():
    """Test component detection in suggestions"""
    tools = LabelTools(Mock())

    # Backend component
    context = "Update backend API service"
    suggestions = await tools.suggest_labels(context)
    assert 'Component/Backend' in suggestions
    assert 'Component/API' in suggestions

    # Frontend component
    context = "Fix frontend UI component"
    suggestions = await tools.suggest_labels(context)
    assert 'Component/Frontend' in suggestions

    # Database component
    context = "Add database migration for schema"
    suggestions = await tools.suggest_labels(context)
    assert 'Component/Database' in suggestions


@pytest.mark.asyncio
async def test_suggest_labels_tech_stack():
    """Test tech stack detection in suggestions"""
    tools = LabelTools(Mock())

    # Python
    context = "Update Python FastAPI endpoint"
    suggestions = await tools.suggest_labels(context)
    assert 'Tech/Python' in suggestions
    assert 'Tech/FastAPI' in suggestions

    # Docker
    context = "Fix Dockerfile configuration"
    suggestions = await tools.suggest_labels(context)
    assert 'Tech/Docker' in suggestions

    # PostgreSQL
    context = "Optimize PostgreSQL query"
    suggestions = await tools.suggest_labels(context)
    assert 'Tech/PostgreSQL' in suggestions


@pytest.mark.asyncio
async def test_suggest_labels_source():
    """Test source detection in suggestions"""
    tools = LabelTools(Mock())

    # Development
    context = "Issue found in development environment"
    suggestions = await tools.suggest_labels(context)
    assert 'Source/Development' in suggestions

    # Production
    context = "Critical production issue"
    suggestions = await tools.suggest_labels(context)
    assert 'Source/Production' in suggestions


@pytest.mark.asyncio
async def test_suggest_labels_risk():
    """Test risk detection in suggestions"""
    tools = LabelTools(Mock())

    # High risk
    context = "Breaking change to major API"
    suggestions = await tools.suggest_labels(context)
    assert 'Risk/High' in suggestions

    # Low risk
    context = "Safe minor update with low risk"
    suggestions = await tools.suggest_labels(context)
    assert 'Risk/Low' in suggestions


@pytest.mark.asyncio
async def test_suggest_labels_multiple_categories():
    """Test that suggestions span multiple categories"""
    tools = LabelTools(Mock())

    context = """
    Urgent critical bug in production backend API service.
    Need to fix broken authentication endpoint.
    This is a complex issue requiring FastAPI and PostgreSQL expertise.
    """

    suggestions = await tools.suggest_labels(context)

    # Should have Type
    assert any('Type/' in label for label in suggestions)

    # Should have Priority
    assert any('Priority/' in label for label in suggestions)

    # Should have Component
    assert any('Component/' in label for label in suggestions)

    # Should have Tech
    assert any('Tech/' in label for label in suggestions)

    # Should have Source
    assert any('Source/' in label for label in suggestions)
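The removed `LabelTools` tests imply a keyword-driven `suggest_labels()` that maps free-text context onto the Type/Priority/Component/Tech/Source label taxonomy. A minimal sketch of that idea follows; the keyword table is a small illustrative subset and the names `KEYWORD_LABELS` and `suggest_labels_sketch` are hypothetical, not the actual rule set in `mcp_server.tools.labels`.

```python
# Illustrative sketch only -- a tiny keyword-to-label mapping in the style the
# removed tests suggest; the real taxonomy and matching rules are richer.
KEYWORD_LABELS = {
    'Type/Bug': ('bug', 'broken', 'fix'),
    'Type/Feature': ('feature', 'implement', 'add new'),
    'Type/Documentation': ('documentation', 'docs'),
    'Priority/Critical': ('urgent', 'critical', 'blocker'),
    'Priority/High': ('important', 'asap'),
    'Component/Backend': ('backend', 'service'),
    'Component/API': ('api', 'endpoint'),
    'Tech/Python': ('python',),
    'Tech/FastAPI': ('fastapi',),
    'Tech/PostgreSQL': ('postgresql',),
    'Source/Production': ('production',),
}


async def suggest_labels_sketch(context: str) -> list:
    """Return every label whose keywords appear in the free-text context."""
    text = context.lower()
    return [
        label
        for label, keywords in KEYWORD_LABELS.items()
        if any(keyword in text for keyword in keywords)
    ]
```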