Merge pull request 'development' (#22) from development into main
Reviewed-on: bandit/support-claude-mktplace#22
@@ -98,19 +98,19 @@ Complete JSON schema reference for `.claude-plugin/plugin.json` files.
   "version": "2.1.0",
   "description": "Automated deployment tools for cloud platforms",
   "author": {
-    "name": "Bandit Labs",
-    "email": "plugins@hyperhivelabs.com",
-    "url": "https://hyperhivelabs.com"
+    "name": "Your Name",
+    "email": "plugins@example.com",
+    "url": "https://example.com"
   },
   "license": "MIT",
   "keywords": ["deployment", "automation", "cloud", "devops"],
-  "homepage": "https://github.com/hyperhivelabs/deploy-automation",
+  "homepage": "https://github.com/yourorg/deploy-automation",
   "repository": {
     "type": "git",
-    "url": "https://github.com/hyperhivelabs/deploy-automation.git"
+    "url": "https://github.com/yourorg/deploy-automation.git"
   },
   "bugs": {
-    "url": "https://github.com/hyperhivelabs/deploy-automation/issues"
+    "url": "https://github.com/yourorg/deploy-automation/issues"
   },
   "dependencies": {
     "node": ">=18.0.0",
.mcp.json (new file, 28 lines)
@@ -0,0 +1,28 @@
+{
+  "mcpServers": {
+    "gitea": {
+      "command": "/home/lmiranda/repos/bandit/support-claude-mktplace/mcp-servers/gitea/.venv/bin/python",
+      "args": ["-m", "mcp_server.server"],
+      "cwd": "/home/lmiranda/repos/bandit/support-claude-mktplace/mcp-servers/gitea",
+      "env": {
+        "PYTHONPATH": "/home/lmiranda/repos/bandit/support-claude-mktplace/mcp-servers/gitea"
+      }
+    },
+    "wikijs": {
+      "command": "/home/lmiranda/repos/bandit/support-claude-mktplace/mcp-servers/wikijs/.venv/bin/python",
+      "args": ["-m", "mcp_server.server"],
+      "cwd": "/home/lmiranda/repos/bandit/support-claude-mktplace/mcp-servers/wikijs",
+      "env": {
+        "PYTHONPATH": "/home/lmiranda/repos/bandit/support-claude-mktplace/mcp-servers/wikijs"
+      }
+    },
+    "netbox": {
+      "command": "/home/lmiranda/repos/bandit/support-claude-mktplace/mcp-servers/netbox/.venv/bin/python",
+      "args": ["-m", "mcp_server.server"],
+      "cwd": "/home/lmiranda/repos/bandit/support-claude-mktplace/mcp-servers/netbox",
+      "env": {
+        "PYTHONPATH": "/home/lmiranda/repos/bandit/support-claude-mktplace/mcp-servers/netbox"
+      }
+    }
+  }
+}
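Each `mcpServers` entry above describes how to launch one server: an interpreter (`command`), its arguments, a working directory, and extra environment variables. A short sketch of how a client might interpret that layout — the loader and the sample paths are illustrative only; Claude Code performs this step internally:

```python
import json

# Illustrative parse of the .mcp.json layout above; "/path/to/..." values
# are placeholders, not the repository's real paths.
raw = '''{
  "mcpServers": {
    "gitea": {
      "command": "/path/to/.venv/bin/python",
      "args": ["-m", "mcp_server.server"],
      "cwd": "/path/to/mcp-servers/gitea",
      "env": {"PYTHONPATH": "/path/to/mcp-servers/gitea"}
    }
  }
}'''

config = json.loads(raw)
for name, server in config["mcpServers"].items():
    # Each entry launches an interpreter plus args inside cwd
    cmd = [server["command"], *server["args"]]
    print(name, cmd, server["cwd"])
# → gitea ['/path/to/.venv/bin/python', '-m', 'mcp_server.server'] /path/to/mcp-servers/gitea
```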
CLAUDE.md (31 changed lines)
@@ -212,20 +212,19 @@ The label system includes:

 **Critical Feature:** After 15 sprints without lesson capture, repeated mistakes occurred (e.g., Claude Code infinite loops on similar issues 2-3 times).

-**Wiki.js Structure:**
+**Wiki.js Structure (Example):**
 ```
-Wiki.js: https://wiki.hyperhivelabs.com
-└── /hyper-hive-labs/
+Wiki.js: https://wiki.your-company.com
+└── /your-org/
     ├── projects/                    # Project-specific documentation
-    │   ├── cuisineflow/
+    │   ├── project-a/
     │   │   ├── lessons-learned/
     │   │   │   ├── sprints/
     │   │   │   ├── patterns/
     │   │   │   └── INDEX.md
     │   │   └── documentation/
-    │   ├── cuisineflow-site/
-    │   ├── intuit-engine/
-    │   └── hhl-site/
+    │   ├── project-b/
+    │   └── project-c/
     ├── company/                     # Company-wide documentation
     │   ├── processes/
     │   ├── standards/
@@ -376,11 +375,11 @@ bandit/support-claude-mktplace/

 ## Multi-Project Context (PMO Plugin)

-The `projman-pmo` plugin will coordinate interdependent projects:
-- **CuisineFlow** - Main product
-- **CuisineFlow-Site** - Demo sync + customer gateway
-- **Intuit Engine Service** - API aggregator extraction (imminent)
-- **HHL-Site** - Company presence
+The `projman-pmo` plugin coordinates interdependent projects across an organization. Example use cases:
+- Main product repository
+- Marketing/documentation sites
+- Extracted services
+- Supporting tools

 PMO plugin adds:
 - Cross-project issue aggregation (all repos in organization)
@@ -392,7 +391,7 @@ PMO plugin adds:

 **Configuration Difference:**
 - PMO operates at company level (no `GITEA_REPO` or `WIKIJS_PROJECT`)
-- Accesses entire `/hyper-hive-labs` Wiki.js namespace
+- Accesses entire organization Wiki.js namespace
 - Queries all repositories in organization

 Build PMO plugin AFTER projman is working and validated.
@@ -409,7 +408,7 @@ Create local marketplace for plugin development:
 ```

 **Integration Testing:**
-Test in real CuisineFlow repository with actual Gitea instance before distribution.
+Test in a real repository with an actual Gitea instance before distribution.

 **Success Metrics:**
 - Sprint planning time reduced 40%
@@ -426,7 +425,7 @@ Test in real CuisineFlow repository with actual Gitea instance before distribution.
 - **Type/Refactor label** - Newly implemented at org level for architectural work
 - **Branch detection must be 100% reliable** - Prevents production accidents
 - **Python for MCP servers** - Use Python 3.8+ with virtual environments
-- **Wiki.js structure** - All HHL content under `/hyper-hive-labs` namespace
+- **Wiki.js structure** - Organization content under configured base namespace

 ## CRITICAL: Rules You MUST Follow

@@ -501,7 +500,7 @@ This repository contains comprehensive planning documentation:
 - Two-MCP-server approach confirmed (Gitea + Wiki.js)
 - Python selected for MCP server implementation
 - Hybrid configuration strategy defined (system + project level)
-- Wiki.js structure planned at `/hyper-hive-labs`
+- Wiki.js structure planned with configurable base path
 - Repository structure designed with shared MCP servers
 - `claude-plugin-developer` skill added to project

@@ -25,12 +25,12 @@ The MCP server operates in two modes based on environment variables:

 **Project Mode (projman):**
 - When `WIKIJS_PROJECT` is present
-- Operates within project path: `/hyper-hive-labs/projects/cuisineflow`
+- Operates within project path: `/your-org/projects/my-project`
 - Used by projman plugin

 **Company Mode (pmo):**
 - When `WIKIJS_PROJECT` is absent
-- Operates on entire namespace: `/hyper-hive-labs`
+- Operates on entire namespace: `/your-org`
 - Used by projman-pmo plugin

 ```python
@@ -38,8 +38,8 @@ The MCP server operates in two modes based on environment variables:
 def load(self):
     # ... load configs ...

-    self.base_path = os.getenv('WIKIJS_BASE_PATH')  # /hyper-hive-labs
-    self.project_path = os.getenv('WIKIJS_PROJECT')  # projects/cuisineflow (optional)
+    self.base_path = os.getenv('WIKIJS_BASE_PATH')  # /your-org
+    self.project_path = os.getenv('WIKIJS_PROJECT')  # projects/my-project (optional)

     # Compose full path
     if self.project_path:
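The two-mode selection shown in this hunk reduces to a small pure function. A hedged sketch — `compose_full_path` and its dict return shape are illustrative, not the MCP server's real API:

```python
from typing import Optional

# Sketch of the mode selection described above; names and return shape
# are assumptions for illustration only.
def compose_full_path(base_path: str, project_path: Optional[str]) -> dict:
    """Pick project vs company mode and compose the full Wiki.js path."""
    if project_path:
        # Project Mode (projman): scope to one project's subtree
        return {"mode": "project", "full_path": f"{base_path}/{project_path}"}
    # Company Mode (pmo): operate on the entire base namespace
    return {"mode": "company", "full_path": base_path}

print(compose_full_path("/your-org", "projects/my-project"))
# → {'mode': 'project', 'full_path': '/your-org/projects/my-project'}
print(compose_full_path("/your-org", None))
# → {'mode': 'company', 'full_path': '/your-org'}
```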
@@ -66,10 +66,10 @@ def load(self):
 ### Company-Wide Organization

 ```
-Wiki.js: https://wiki.hyperhivelabs.com
-└── /hyper-hive-labs/                    # Base path
+Wiki.js: https://wiki.your-company.com
+└── /your-org/                           # Base path
     ├── projects/                        # Project-specific documentation
-    │   ├── cuisineflow/
+    │   ├── my-project/
     │   │   ├── lessons-learned/
     │   │   │   ├── sprints/
     │   │   │   │   ├── sprint-01-auth.md
@@ -83,13 +83,13 @@ Wiki.js: https://wiki.hyperhivelabs.com
     │   │   ├── architecture/
     │   │   ├── api/
     │   │   └── deployment/
-    │   ├── cuisineflow-site/
+    │   ├── my-project-site/
     │   │   ├── lessons-learned/
     │   │   └── documentation/
     │   ├── intuit-engine/
     │   │   ├── lessons-learned/
     │   │   └── documentation/
-    │   └── hhl-site/
+    │   └── company-site/
     │       ├── lessons-learned/
     │       └── documentation/
     ├── company/                         # Company-wide documentation
@@ -124,11 +124,11 @@ Wiki.js: https://wiki.hyperhivelabs.com

 **Project Mode (projman):**
 - Full path = `{WIKIJS_BASE_PATH}/{WIKIJS_PROJECT}`
-- Example: `/hyper-hive-labs/projects/cuisineflow`
+- Example: `/your-org/projects/my-project`

 **Company Mode (pmo):**
 - Full path = `{WIKIJS_BASE_PATH}`
-- Example: `/hyper-hive-labs`
+- Example: `/your-org`

 ---

@@ -139,14 +139,14 @@ Wiki.js: https://wiki.hyperhivelabs.com
 **File:** `~/.config/claude/wikijs.env`

 ```bash
-WIKIJS_API_URL=https://wiki.hyperhivelabs.com/graphql
+WIKIJS_API_URL=https://wiki.your-company.com/graphql
 WIKIJS_API_TOKEN=your_wikijs_token
-WIKIJS_BASE_PATH=/hyper-hive-labs
+WIKIJS_BASE_PATH=/your-org
 ```

 **Generating Wiki.js API Token:**

-1. Log into Wiki.js: https://wiki.hyperhivelabs.com
+1. Log into Wiki.js: https://wiki.your-company.com
 2. Navigate to: **User Menu (top-right)** → **Administration**
 3. In the sidebar, go to: **API Access**
 4. Click **Create New Token**
@@ -170,9 +170,9 @@ mkdir -p ~/.config/claude

 # Create wikijs.env
 cat > ~/.config/claude/wikijs.env << EOF
-WIKIJS_API_URL=https://wiki.hyperhivelabs.com/graphql
+WIKIJS_API_URL=https://wiki.your-company.com/graphql
 WIKIJS_API_TOKEN=your_token_here
-WIKIJS_BASE_PATH=/hyper-hive-labs
+WIKIJS_BASE_PATH=/your-org
 EOF

 # Secure the file (important!)
@@ -188,13 +188,13 @@ cat ~/.config/claude/wikijs.env

 ```bash
 # Wiki.js project path (relative to base path)
-WIKIJS_PROJECT=projects/cuisineflow
+WIKIJS_PROJECT=projects/my-project
 ```

 **Setup:**
 ```bash
 # In each project root
-echo "WIKIJS_PROJECT=projects/cuisineflow" >> .env
+echo "WIKIJS_PROJECT=projects/my-project" >> .env

 # Add to .gitignore (if not already)
 echo ".env" >> .gitignore
@@ -243,8 +243,8 @@ class WikiJSConfig:
         # Extract values
         self.api_url = os.getenv('WIKIJS_API_URL')
         self.api_token = os.getenv('WIKIJS_API_TOKEN')
-        self.base_path = os.getenv('WIKIJS_BASE_PATH')  # /hyper-hive-labs
-        self.project_path = os.getenv('WIKIJS_PROJECT')  # projects/cuisineflow (optional)
+        self.base_path = os.getenv('WIKIJS_BASE_PATH')  # /your-org
+        self.project_path = os.getenv('WIKIJS_PROJECT')  # projects/my-project (optional)

         # Compose full path
         if self.project_path:
@@ -691,7 +691,7 @@ class WikiJSClient:
         by_project = {}
         for result in results:
             # Extract project name from path
-            # e.g., "/hyper-hive-labs/projects/cuisineflow/..." -> "cuisineflow"
+            # e.g., "/your-org/projects/my-project/..." -> "my-project"
             path_parts = result['path'].split('/')
             if len(path_parts) >= 4:
                 project = path_parts[3]
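The index-3 lookup in this hunk works because a leading slash puts an empty string at position 0 of the split. A hedged, self-contained illustration — the helper name and sample paths are placeholders, not part of the real client:

```python
# Illustrative version of the project-name extraction shown above.
def project_from_path(path: str):
    """Return the project segment of /<base>/projects/<project>/... paths, else None."""
    path_parts = path.split('/')
    # '/your-org/projects/my-project/...' splits into
    # ['', 'your-org', 'projects', 'my-project', ...] -> index 3 is the project
    if len(path_parts) >= 4:
        return path_parts[3]
    return None

print(project_from_path("/your-org/projects/my-project/lessons-learned/INDEX"))
# → my-project
print(project_from_path("/your-org/company"))
# → None
```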
@@ -957,7 +957,7 @@ Agent: Any architectural insights for similar future work?
 User: [Response]

 Agent: I'll create a lesson in Wiki.js:
-       Path: /hyper-hive-labs/projects/cuisineflow/lessons-learned/sprints/sprint-16-intuit-engine
+       Path: /your-org/projects/my-project/lessons-learned/sprints/sprint-16-intuit-engine

 Tags detected:
 #service-extraction #api #refactoring #claude-code-loops
@@ -965,7 +965,7 @@ Agent: I'll create a lesson in Wiki.js:
 Creating page in Wiki.js... ✅
 Updating INDEX.md... ✅

-View at: https://wiki.hyperhivelabs.com/hyper-hive-labs/projects/cuisineflow/lessons-learned/sprints/sprint-16-intuit-engine
+View at: https://wiki.your-company.com/your-org/projects/my-project/lessons-learned/sprints/sprint-16-intuit-engine
 ```

 ---
@@ -1011,14 +1011,14 @@ def test_project_config_path_composition(tmp_path, monkeypatch):
     system_config.write_text(
         "WIKIJS_API_URL=https://wiki.test.com/graphql\n"
         "WIKIJS_API_TOKEN=test_token\n"
-        "WIKIJS_BASE_PATH=/hyper-hive-labs\n"
+        "WIKIJS_BASE_PATH=/your-org\n"
     )

     project_dir = tmp_path / 'project'
     project_dir.mkdir()

     project_config = project_dir / '.env'
-    project_config.write_text("WIKIJS_PROJECT=projects/cuisineflow\n")
+    project_config.write_text("WIKIJS_PROJECT=projects/my-project\n")

     monkeypatch.setenv('HOME', str(tmp_path))
     monkeypatch.chdir(project_dir)
@@ -1026,8 +1026,8 @@ def test_project_config_path_composition(tmp_path, monkeypatch):
     config = WikiJSConfig()
     result = config.load()

-    assert result['project_path'] == 'projects/cuisineflow'
-    assert result['full_path'] == '/hyper-hive-labs/projects/cuisineflow'
+    assert result['project_path'] == 'projects/my-project'
+    assert result['full_path'] == '/your-org/projects/my-project'
     assert result['mode'] == 'project'
 ```

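The same fixture pattern also covers company mode (no `WIKIJS_PROJECT`). This self-contained sketch stands in for `WikiJSConfig.load()` — an assumed interface, not the real class — so the expected branch is explicit:

```python
# load_config is a stand-in for WikiJSConfig.load(); the dict keys mirror
# the assertions in the project-mode test above.
def load_config(env):
    base_path = env.get('WIKIJS_BASE_PATH')
    project_path = env.get('WIKIJS_PROJECT')
    if project_path:
        return {'mode': 'project', 'project_path': project_path,
                'full_path': f"{base_path}/{project_path}"}
    # Company mode: WIKIJS_PROJECT absent, operate on the whole base path
    return {'mode': 'company', 'project_path': None, 'full_path': base_path}

result = load_config({'WIKIJS_BASE_PATH': '/your-org'})
assert result['mode'] == 'company'
assert result['full_path'] == '/your-org'
```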
@@ -1116,7 +1116,7 @@ pytest tests/test_config.py::test_project_config_path_composition

 ### Initial Structure Creation

-**Status:** Base structure `/hyper-hive-labs` **does not exist** and needs to be created during Phase 1.1b.
+**Status:** Base structure `/your-org` **does not exist** and needs to be created during Phase 1.1b.

 **Setup Script:** Run this script during Phase 1.1b to create the base structure:

@@ -1134,7 +1134,7 @@ from mcp_server.wikijs_client import WikiJSClient


 async def initialize_wiki_structure():
-    """Create base Wiki.js structure for Bandit Labs"""
+    """Create base Wiki.js structure for Your Organization"""

     print("Initializing Wiki.js base structure...")
     print("=" * 60)
@@ -1153,17 +1153,17 @@ async def initialize_wiki_structure():
     # Base structure to create
     base_pages = [
        {
-            'path': 'hyper-hive-labs',
-            'title': 'Bandit Labs',
-            'content': '''# Bandit Labs Documentation
+            'path': 'your-org',
+            'title': 'Your Organization',
+            'content': '''# Your Organization Documentation

-Welcome to the Bandit Labs knowledge base.
+Welcome to the Your Organization knowledge base.

 ## Organization

-- **[Projects](hyper-hive-labs/projects)** - Project-specific documentation and lessons learned
-- **[Company](hyper-hive-labs/company)** - Company-wide processes, standards, and tools
-- **[Shared](hyper-hive-labs/shared)** - Cross-project architecture patterns and best practices
+- **[Projects](your-org/projects)** - Project-specific documentation and lessons learned
+- **[Company](your-org/company)** - Company-wide processes, standards, and tools
+- **[Shared](your-org/shared)** - Cross-project architecture patterns and best practices

 ## Purpose

@@ -1176,10 +1176,10 @@ This knowledge base captures:
 All content is searchable and tagged for easy discovery across projects.
 ''',
            'tags': ['company', 'index'],
-            'description': 'Bandit Labs company knowledge base'
+            'description': 'Your Organization company knowledge base'
        },
        {
-            'path': 'hyper-hive-labs/projects',
+            'path': 'your-org/projects',
            'title': 'Projects',
            'content': '''# Project Documentation

@@ -1187,10 +1187,10 @@ Project-specific documentation and lessons learned.

 ## Active Projects

-- **[CuisineFlow](hyper-hive-labs/projects/cuisineflow)** - Main product
-- **[CuisineFlow-Site](hyper-hive-labs/projects/cuisineflow-site)** - Demo and customer gateway
-- **[Intuit-Engine](hyper-hive-labs/projects/intuit-engine)** - API aggregator service
-- **[HHL-Site](hyper-hive-labs/projects/hhl-site)** - Company website
+- **[My-Project](your-org/projects/my-project)** - Main product
+- **[My-Project-Site](your-org/projects/my-project-site)** - Demo and customer gateway
+- **[Intuit-Engine](your-org/projects/intuit-engine)** - API aggregator service
+- **[Company-Site](your-org/projects/company-site)** - Company website

 Each project maintains:
 - Lessons learned from sprints
@@ -1201,7 +1201,7 @@ Each project maintains:
            'description': 'Index of all project documentation'
        },
        {
-            'path': 'hyper-hive-labs/company',
+            'path': 'your-org/company',
            'title': 'Company',
            'content': '''# Company Documentation

@@ -1209,9 +1209,9 @@ Company-wide processes, standards, and tools.

 ## Sections

-- **[Processes](hyper-hive-labs/company/processes)** - Development workflows, onboarding, deployment
-- **[Standards](hyper-hive-labs/company/standards)** - Code style, API design, security practices
-- **[Tools](hyper-hive-labs/company/tools)** - Gitea, Wiki.js, Claude Code plugin guides
+- **[Processes](your-org/company/processes)** - Development workflows, onboarding, deployment
+- **[Standards](your-org/company/standards)** - Code style, API design, security practices
+- **[Tools](your-org/company/tools)** - Gitea, Wiki.js, Claude Code plugin guides

 These standards apply to all projects and team members.
 ''',
@@ -1219,7 +1219,7 @@ These standards apply to all projects and team members.
            'description': 'Company processes and standards'
        },
        {
-            'path': 'hyper-hive-labs/shared',
+            'path': 'your-org/shared',
            'title': 'Shared Resources',
            'content': '''# Shared Resources

@@ -1227,9 +1227,9 @@ Cross-project architecture patterns, best practices, and technical knowledge.

 ## Sections

-- **[Architecture Patterns](hyper-hive-labs/shared/architecture-patterns)** - Microservices, service extraction, API design
-- **[Best Practices](hyper-hive-labs/shared/best-practices)** - Error handling, logging, testing strategies
-- **[Tech Stack](hyper-hive-labs/shared/tech-stack)** - Python ecosystem, Docker, CI/CD pipelines
+- **[Architecture Patterns](your-org/shared/architecture-patterns)** - Microservices, service extraction, API design
+- **[Best Practices](your-org/shared/best-practices)** - Error handling, logging, testing strategies
+- **[Tech Stack](your-org/shared/tech-stack)** - Python ecosystem, Docker, CI/CD pipelines

 These patterns are distilled from lessons learned across all projects.
 ''',
@@ -1238,40 +1238,40 @@ These patterns are distilled from lessons learned across all projects.
        },
        # Project placeholders
        {
-            'path': 'hyper-hive-labs/projects/cuisineflow',
-            'title': 'CuisineFlow',
-            'content': '''# CuisineFlow
+            'path': 'your-org/projects/my-project',
+            'title': 'My-Project',
+            'content': '''# My-Project

 Main product - recipe management and meal planning platform.

 ## Documentation

-- **[Lessons Learned](hyper-hive-labs/projects/cuisineflow/lessons-learned)** - Sprint retrospectives and insights
-- **[Architecture](hyper-hive-labs/projects/cuisineflow/documentation/architecture)** - System architecture
-- **[API](hyper-hive-labs/projects/cuisineflow/documentation/api)** - API documentation
+- **[Lessons Learned](your-org/projects/my-project/lessons-learned)** - Sprint retrospectives and insights
+- **[Architecture](your-org/projects/my-project/documentation/architecture)** - System architecture
+- **[API](your-org/projects/my-project/documentation/api)** - API documentation

 Sprint lessons will be automatically captured here by the projman plugin.
 ''',
-            'tags': ['project', 'cuisineflow'],
-            'description': 'CuisineFlow project documentation'
+            'tags': ['project', 'my-project'],
+            'description': 'My-Project project documentation'
        },
        {
-            'path': 'hyper-hive-labs/projects/cuisineflow/lessons-learned',
-            'title': 'CuisineFlow - Lessons Learned',
-            'content': '''# CuisineFlow - Lessons Learned
+            'path': 'your-org/projects/my-project/lessons-learned',
+            'title': 'My-Project - Lessons Learned',
+            'content': '''# My-Project - Lessons Learned

-Sprint retrospectives and insights from CuisineFlow development.
+Sprint retrospectives and insights from My-Project development.

 ## Organization

-- **[Sprints](hyper-hive-labs/projects/cuisineflow/lessons-learned/sprints)** - Sprint-specific lessons
-- **[Patterns](hyper-hive-labs/projects/cuisineflow/lessons-learned/patterns)** - Recurring patterns and solutions
-- **[INDEX](hyper-hive-labs/projects/cuisineflow/lessons-learned/INDEX)** - Complete index with tags
+- **[Sprints](your-org/projects/my-project/lessons-learned/sprints)** - Sprint-specific lessons
+- **[Patterns](your-org/projects/my-project/lessons-learned/patterns)** - Recurring patterns and solutions
+- **[INDEX](your-org/projects/my-project/lessons-learned/INDEX)** - Complete index with tags

 Lessons are automatically captured during sprint close via `/sprint-close` command.
 ''',
-            'tags': ['lessons-learned', 'cuisineflow'],
-            'description': 'CuisineFlow lessons learned index'
+            'tags': ['lessons-learned', 'my-project'],
+            'description': 'My-Project lessons learned index'
        }
    ]

@@ -1308,7 +1308,7 @@ Lessons are automatically captured during sprint close via `/sprint-close` command.
        sys.exit(1)
    else:
        print(f"\n✅ All pages created successfully!")
-        print(f"\nView at: https://wiki.hyperhivelabs.com/hyper-hive-labs")
+        print(f"\nView at: https://wiki.your-company.com/your-org")


 if __name__ == '__main__':
@@ -1326,13 +1326,13 @@ python setup_wiki_structure.py
 ```
 Initializing Wiki.js base structure...
 ============================================================
-✅ Connected to Wiki.js at https://wiki.hyperhivelabs.com/graphql
-Base path: /hyper-hive-labs
+✅ Connected to Wiki.js at https://wiki.your-company.com/graphql
+Base path: /your-org

-Creating: /hyper-hive-labs
+Creating: /your-org
 ✅ Created successfully

-Creating: /hyper-hive-labs/projects
+Creating: /your-org/projects
 ✅ Created successfully

 ...
@@ -1343,12 +1343,12 @@ Setup complete!

 ✅ All pages created successfully!

-View at: https://wiki.hyperhivelabs.com/hyper-hive-labs
+View at: https://wiki.your-company.com/your-org
 ```

 **Post-Setup:**
 After running the script:
-1. Visit https://wiki.hyperhivelabs.com/hyper-hive-labs to verify structure
+1. Visit https://wiki.your-company.com/your-org to verify structure
 2. Add additional project directories as needed
 3. The structure is now ready for projman plugin to use

@@ -1451,13 +1451,13 @@ if __name__ == '__main__':
 curl -H "Authorization: Bearer YOUR_TOKEN" \
      -H "Content-Type: application/json" \
      -d '{"query":"{ pages { list { id title } } }"}' \
-     https://wiki.hyperhivelabs.com/graphql
+     https://wiki.your-company.com/graphql
 ```

 **Issue:** Path not found
 ```bash
 # Solution: Verify base structure exists
-# Check in Wiki.js web interface that /hyper-hive-labs path exists
+# Check in Wiki.js web interface that /your-org path exists
 ```

 **Issue:** Tags not working

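The same connectivity check can be run from Python with only the standard library. A sketch with placeholder URL and token; the GraphQL query mirrors the curl example above:

```python
import json
import urllib.request

# Hedged Python equivalent of the curl check; URL and token are placeholders.
def check_wikijs(url, token):
    """POST a minimal GraphQL query and return the decoded JSON response."""
    body = json.dumps({"query": "{ pages { list { id title } } }"}).encode()
    req = urllib.request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# Example (requires a reachable Wiki.js instance):
# check_wikijs("https://wiki.your-company.com/graphql", "YOUR_TOKEN")
```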
@@ -197,7 +197,7 @@ projman-pmo/

 **Critical Differences from projman:**
 - **NO** `GITEA_REPO` → operates on all repositories
-- **NO** `WIKIJS_PROJECT` → operates on entire `/hyper-hive-labs` namespace
+- **NO** `WIKIJS_PROJECT` → operates on entire `/your-org` namespace
 - Same shared MCP servers at `../mcp-servers/`

 ### Environment Variables
@@ -282,10 +282,10 @@ User: /pmo-status

 PMO: Projects Overview:

-CuisineFlow: Sprint in progress, 60% complete
-└─ Blocking: CuisineFlow-Site deployment
+Main-App: Sprint in progress, 60% complete
+└─ Blocking: Main-App-Site deployment

-I notice CuisineFlow-Site is waiting on API changes.
+I notice Main-App-Site is waiting on API changes.
 Use projman in the cuisineflow-site repo to plan the sync work.
 ```

@@ -309,7 +309,7 @@ User: /pmo-status
 PMO: Projects Overview
 ================

-CuisineFlow (main product)
+Main-App (main product)
 ├── Sprint: Intuit Engine Extraction
 ├── Status: In Progress (60%)
 ├── Issue: #47
@@ -317,13 +317,13 @@ PMO: Projects Overview
 ├── Next: API testing
 └── Timeline: On track (1 week remaining)

-CuisineFlow-Site (demo)
+Main-App-Site (demo)
 ├── Sprint: Dashboard Updates
 ├── Status: Waiting on dependencies
 ├── Issue: #12
-├── Blockers: Depends on CuisineFlow #47 (API changes)
+├── Blockers: Depends on Main-App #47 (API changes)
 ├── Next: Deploy when API ready
-└── Timeline: Blocked (waiting on CuisineFlow)
+└── Timeline: Blocked (waiting on Main-App)

 Intuit-Engine (service)
 ├── Sprint: Initial Setup
@@ -333,7 +333,7 @@ PMO: Projects Overview
 ├── Next: Service boundary definition
 └── Timeline: Not started

-HHL-Site (marketing)
+Company-Site (marketing)
 ├── Sprint: Content Updates
 ├── Status: Complete
 ├── Issue: #8 (closed)
@@ -342,14 +342,14 @@ PMO: Projects Overview
 └── Timeline: Ready for deployment

 Cross-Project Insights:
-- CuisineFlow-Site blocked by CuisineFlow #47
+- Main-App-Site blocked by Main-App #47
 - Intuit-Engine needs architectural guidance
-- HHL-Site ready for deployment
+- Company-Site ready for deployment

 Suggested Actions:
-1. Prioritize CuisineFlow #47 completion (unblocks Site)
+1. Prioritize Main-App #47 completion (unblocks Site)
 2. Schedule architecture review for Intuit-Engine
-3. Deploy HHL-Site content updates
+3. Deploy Company-Site content updates
 ```

 **Success Criteria:**
@@ -375,15 +375,15 @@ PMO: Priority Analysis Across Projects
 =================================

 Critical (Blocks launches):
-1. CuisineFlow #47 - Intuit Engine Extraction
-   → Blocks: CuisineFlow v2.0 launch (Nov 15)
-   → Blocks: CuisineFlow-Site deployment
+1. Main-App #47 - Intuit Engine Extraction
+   → Blocks: Main-App v2.0 launch (Nov 15)
+   → Blocks: Main-App-Site deployment
    → Status: 60% complete, 1 week remaining
    → Recommendation: Maintain focus

 High (Important features):
-2. CuisineFlow-Site #12 - Dashboard Updates
-   → Depends on: CuisineFlow #47
+2. Main-App-Site #12 - Dashboard Updates
+   → Depends on: Main-App #47
    → Status: Waiting
    → Recommendation: Begin planning, wait for #47

@@ -393,14 +393,14 @@ PMO: Priority Analysis Across Projects
    → Recommendation: Schedule architecture review

 Medium (Enhancements):
-4. HHL-Site #8 - Content Updates
+4. Company-Site #8 - Content Updates
    → Status: Complete, ready to deploy
    → Recommendation: Deploy this week

 Resource Analysis:
 Leo (primary developer):
-- Working on: CuisineFlow #47 (Critical)
-- Next: CuisineFlow-Site #12 (High)
+- Working on: Main-App #47 (Critical)
+- Next: Main-App-Site #12 (High)
 - Later: Intuit-Engine #3 (High)

 ⚠️ Potential Conflict:
@@ -438,27 +438,27 @@ PMO: Project Dependencies

 Intuit-Engine (service)
     ↓ (must complete before)
-CuisineFlow (main)
+Main-App (main)
     ↓ (API changes required by)
-CuisineFlow-Site (demo)
+Main-App-Site (demo)
     ↓ (deploy together)
 Customer VPS (deployments)

 Detailed Dependencies:

-1. CuisineFlow → Intuit-Engine
+1. Main-App → Intuit-Engine
    Type: Service extraction
    Status: In progress
    Blocker: Must complete #47 before v2.0 launch
    Impact: High (blocks launch)

-2. CuisineFlow → CuisineFlow-Site
+2. Main-App → Main-App-Site
    Type: Feature sync
    Status: Waiting
    Blocker: Dashboard updates need API changes from #47
    Impact: Medium (demo consistency)

-3. CuisineFlow → Customer VPS
+3. Main-App → Customer VPS
    Type: Deployment
    Status: Pending
    Blocker: v2.0 launch must complete before customer updates
@@ -466,16 +466,16 @@ PMO: Project Dependencies

 Deployment Order:
 1. Complete Intuit-Engine service extraction
-2. Deploy CuisineFlow v2.0 to production
-3. Sync CuisineFlow-Site demo
+2. Deploy Main-App v2.0 to production
+3. Sync Main-App-Site demo
 4. Update customer VPS instances

 Critical Path:
-Intuit-Engine (#3) → CuisineFlow (#47) → Site (#12) → Customer VPS
+Intuit-Engine (#3) → Main-App (#47) → Site (#12) → Customer VPS
 Estimated timeline: 3 weeks total

 Risks:
-- CuisineFlow #47 delay would cascade to all downstream
+- Main-App #47 delay would cascade to all downstream
 - Customer VPS updates have Nov 20 deadline
 ```

@@ -505,9 +505,9 @@ PMO: Resource Conflict Analysis

 Leo (Developer):
 Assigned to:
-- CuisineFlow #47: Intuit Engine extraction (Priority: Critical)
-- CuisineFlow-Site #12: Dashboard redesign (Priority: High)
-- HHL-Site #8: Content update (Priority: Medium)
+- Main-App #47: Intuit Engine extraction (Priority: Critical)
+- Main-App-Site #12: Dashboard redesign (Priority: High)
+- Company-Site #8: Content update (Priority: Medium)
 - Intuit-Engine #3: Service setup (Priority: High)

 Conflict Analysis:
@@ -517,15 +517,15 @@ PMO: Resource Conflict Analysis
 - Timeline: 2 weeks available

 Recommendations:
-1. FOCUS: CuisineFlow #47 (Critical, blocks launch)
+1. FOCUS: Main-App #47 (Critical, blocks launch)
    → Keep as primary focus
    → Estimated completion: 1 week

-2. QUEUE: CuisineFlow-Site #12 (High, depends on #47)
+2. QUEUE: Main-App-Site #12 (High, depends on #47)
    → Start after #47 completes
    → Natural sequencing

-3. DELEGATE: HHL-Site #8 (Medium, content work)
+3. DELEGATE: Company-Site #8 (Medium, content work)
    → Can be handled by marketing team
    → Low technical complexity

@@ -536,14 +536,14 @@ PMO: Resource Conflict Analysis
 Timeline Conflict:

 Without changes:
-Week 1-2: CuisineFlow #47
+Week 1-2: Main-App #47
 Week 3-4: Site #12 + Engine #3 (CONFLICT)

 With recommendations:
-Week 1-2: CuisineFlow #47 (FOCUS)
-Week 3: CuisineFlow-Site #12 (QUEUE)
+Week 1-2: Main-App #47 (FOCUS)
+Week 3: Main-App-Site #12 (QUEUE)
 Week 4: Intuit-Engine #3 (SCHEDULE)
-HHL-Site #8: Marketing (DELEGATE)
+Company-Site #8: Marketing (DELEGATE)

 Apply recommendations?
 ```
@@ -571,18 +571,18 @@ PMO: Deployment Schedule
===================

Week of Nov 6:
- CuisineFlow: Continue Sprint #47 (Intuit Engine)
- HHL-Site: Deploy content updates (#8) ✅
- Main-App: Continue Sprint #47 (Intuit Engine)
- Company-Site: Deploy content updates (#8) ✅

Week of Nov 13:
- CuisineFlow: Complete Sprint #47 ⚠️
- Main-App: Complete Sprint #47 ⚠️
  → Critical: Must complete by Nov 15 for launch
- CuisineFlow-Site: Begin Dashboard updates (#12)
- Main-App-Site: Begin Dashboard updates (#12)

Week of Nov 20:
- CuisineFlow: Deploy v2.0 to production 🚀
- Main-App: Deploy v2.0 to production 🚀
  → Deployment window: Nov 20-22
- CuisineFlow-Site: Sync demo with v2.0
- Main-App-Site: Sync demo with v2.0
- Customer VPS: Begin rollout (Deadline: Nov 20)

Week of Nov 27:
@@ -590,19 +590,19 @@ PMO: Deployment Schedule
- Intuit-Engine: Begin service implementation (#3)

Critical Dates:
- Nov 15: CuisineFlow v2.0 feature freeze
- Nov 15: Main-App v2.0 feature freeze
- Nov 20: Production deployment
- Nov 20: Customer VPS deadline
- Nov 22: Demo site sync

Dependencies:
✅ Intuit-Engine service boundaries defined
🔄 CuisineFlow #47 in progress (60%)
⏳ CuisineFlow-Site #12 waiting
🔄 Main-App #47 in progress (60%)
⏳ Main-App-Site #12 waiting
⏳ Customer VPS rollout pending

Risks:
⚠️ CuisineFlow #47 delay would impact:
⚠️ Main-App #47 delay would impact:
- v2.0 launch date (Nov 20)
- Customer VPS deadline (Nov 20)
- Demo site sync (Nov 22)
@@ -632,7 +632,7 @@ PMO: Searching lessons across all projects...

Found 5 relevant lessons:

CuisineFlow:
Main-App:
- Sprint 08: Service boundary issues with shared DB
  Tags: #service-extraction #database #refactoring
  Key: Define boundaries at data model level first
@@ -646,13 +646,13 @@ PMO: Searching lessons across all projects...
  Tags: #service-extraction #dependencies
  Key: Watch for import cycles

CuisineFlow-Site:
Main-App-Site:
- Sprint 05: Sync issues during backend refactor
  Tags: #sync #refactoring
  Key: Coordinate deployments during major changes

Shared Patterns:
- /hyper-hive-labs/shared/architecture-patterns/service-extraction
- /your-org/shared/architecture-patterns/service-extraction
  Key: Service Extraction Playbook

Common Themes:
@@ -672,15 +672,15 @@ User: /pmo-status
PMO: [Shows status]

Resource Analysis:
- Leo: 3 active projects (CuisineFlow, Site, Engine)
- Marketing: 1 project (HHL-Site)
- Leo: 3 active projects (Main-App, Site, Engine)
- Marketing: 1 project (Company-Site)

Bottleneck detected: Leo is on critical path for all technical work

Recommendations:
1. Prioritize work that unblocks others
2. Consider hiring/contracting for parallel work
3. Delegate non-critical work (HHL-Site content)
3. Delegate non-critical work (Company-Site content)

Would you like a detailed resource allocation report?
```
@@ -690,18 +690,18 @@ PMO: [Shows status]
```
User: Plan the v2.0 release across all affected projects

PMO: Release Coordination: CuisineFlow v2.0
PMO: Release Coordination: Main-App v2.0
======================================

Affected Projects:
1. CuisineFlow (main product)
1. Main-App (main product)
2. Intuit-Engine (new service)
3. CuisineFlow-Site (demo)
3. Main-App-Site (demo)
4. Customer VPS instances

Pre-Release Checklist:
✅ Intuit-Engine service deployed
🔄 CuisineFlow #47 complete (60%)
🔄 Main-App #47 complete (60%)
⏳ Integration tests pending
⏳ Demo site updates pending
⏳ Customer migration scripts ready
@@ -712,12 +712,12 @@ PMO: Release Coordination: CuisineFlow v2.0
   → Duration: 2 hours
   → Rollback plan: Ready

2. Deploy CuisineFlow v2.0 (main app)
2. Deploy Main-App v2.0 (main app)
   → Date: Nov 20
   → Duration: 4 hours
   → Rollback plan: Ready

3. Sync CuisineFlow-Site (demo)
3. Sync Main-App-Site (demo)
   → Date: Nov 22
   → Duration: 1 hour
   → Dependencies: Step 2 complete
@@ -767,11 +767,11 @@ shared_patterns = await wikijs.get_shared_docs("architecture-patterns")
User: Create a shared pattern document for service extraction

PMO: I'll create a pattern document based on lessons from:
- CuisineFlow Sprint 08
- CuisineFlow Sprint 12
- Main-App Sprint 08
- Main-App Sprint 12
- Intuit-Engine Sprint 02

Creating: /hyper-hive-labs/shared/architecture-patterns/service-extraction
Creating: /your-org/shared/architecture-patterns/service-extraction

Service Extraction Pattern
=========================
@@ -855,7 +855,7 @@ PMO: I'll create a pattern document based on lessons from:
**Deliverable:** Validated PMO plugin integrated with projman

**Tasks:**
1. Test with multiple projects (CuisineFlow, Site, Engine, HHL-Site)
1. Test with multiple projects (Main-App, Site, Engine, Company-Site)
2. Simulate deployment coordination scenarios
3. Test priority conflict resolution
4. Validate dependency tracking
@@ -911,10 +911,10 @@ PMO: I'll create a pattern document based on lessons from:
```
User: /pmo-status

PMO: CuisineFlow-Site: Waiting on dependencies
Blocked by: CuisineFlow #47 (API changes)
PMO: Main-App-Site: Waiting on dependencies
Blocked by: Main-App #47 (API changes)

Recommendation: Focus resources on CuisineFlow #47 to unblock Site work
Recommendation: Focus resources on Main-App #47 to unblock Site work
```

### Example 2: Resource Conflict Resolution
@@ -926,8 +926,8 @@ PMO: Leo assigned to 4 tasks across 3 projects
Estimated: 4 weeks work in 2 weeks timeline

Recommendations:
- Maintain focus on Critical priority (CuisineFlow #47)
- Delegate content work (HHL-Site #8) to marketing
- Maintain focus on Critical priority (Main-App #47)
- Delegate content work (Company-Site #8) to marketing
- Schedule architectural review (Intuit-Engine #3) as team session
```

@@ -936,13 +936,13 @@ PMO: Leo assigned to 4 tasks across 3 projects
```
User: /pmo-schedule

PMO: Critical Path: Engine → CuisineFlow → Site → Customer VPS
PMO: Critical Path: Engine → Main-App → Site → Customer VPS
Deadline: Nov 20 (Customer VPS commitment)

Current: Week 1 of 3
Status: On track

Risk: If CuisineFlow #47 delayed, entire timeline at risk
Risk: If Main-App #47 delayed, entire timeline at risk
Mitigation: Daily progress monitoring
```

@@ -925,13 +925,13 @@ Orchestrator: I'll create a lesson in Wiki.js:
#service-extraction #api #database #refactoring #claude-code-loops

Creating page in Wiki.js... ✅
Path: /hyper-hive-labs/projects/cuisineflow/lessons-learned/sprints/sprint-16-intuit-engine
Path: /your-org/projects/my-project/lessons-learned/sprints/sprint-16-intuit-engine

Updating INDEX.md... ✅

Closing sprint issue #47... ✅

View lesson at: https://wiki.hyperhivelabs.com/hyper-hive-labs/projects/cuisineflow/lessons-learned/sprints/sprint-16-intuit-engine
View lesson at: https://wiki.your-company.com/your-org/projects/my-project/lessons-learned/sprints/sprint-16-intuit-engine
```

**Success Criteria:**

@@ -155,9 +155,9 @@ GITEA_API_TOKEN=your_token
GITEA_OWNER=bandit

# ~/.config/claude/wikijs.env
WIKIJS_API_URL=https://wiki.hyperhivelabs.com/graphql
WIKIJS_API_URL=https://wiki.your-company.com/graphql
WIKIJS_API_TOKEN=your_token
WIKIJS_BASE_PATH=/hyper-hive-labs
WIKIJS_BASE_PATH=/your-org
```

**Project-Level:**
@@ -271,18 +271,18 @@ WIKIJS_PROJECT=projects/cuisineflow
## Wiki.js Structure

```
Wiki.js: https://wiki.hyperhivelabs.com
└── /hyper-hive-labs/
Wiki.js: https://wiki.your-company.com
└── /your-org/
    ├── projects/ # Project-specific
    │ ├── cuisineflow/
    │ ├── project-a/
    │ │ ├── lessons-learned/
    │ │ │ ├── sprints/
    │ │ │ ├── patterns/
    │ │ │ └── INDEX.md
    │ │ └── documentation/
    │ ├── cuisineflow-site/
    │ ├── intuit-engine/
    │ └── hhl-site/
    │ ├── project-b/
    │ ├── project-c/
    │ └── company-site/
    ├── company/ # Company-wide
    │ ├── processes/
    │ ├── standards/
@@ -373,9 +373,9 @@ EOF

# Wiki.js config
cat > ~/.config/claude/wikijs.env << EOF
WIKIJS_API_URL=https://wiki.hyperhivelabs.com/graphql
WIKIJS_API_URL=https://wiki.your-company.com/graphql
WIKIJS_API_TOKEN=your_wikijs_token
WIKIJS_BASE_PATH=/hyper-hive-labs
WIKIJS_BASE_PATH=/your-org
EOF

# Secure files
@@ -565,7 +565,7 @@ Previous attempts failed due to:
### Immediate Actions

1. **Set up system configuration** (Gitea + Wiki.js tokens)
2. **Create Wiki.js base structure** at `/hyper-hive-labs`
2. **Create Wiki.js base structure** at `/your-org`
3. **Begin Phase 1.1a** - Gitea MCP Server implementation
4. **Begin Phase 1.1b** - Wiki.js MCP Server implementation

@@ -597,10 +597,10 @@ These decisions were finalized before development:
- **Minimum:** Python 3.10.0

### 2. Wiki.js Base Structure: Needs Creation
- **Status:** `/hyper-hive-labs` structure does NOT exist yet
- **Status:** `/your-org` structure does NOT exist yet
- **Action:** Run `setup_wiki_structure.py` during Phase 1.1b
- **Script:** See MCP-WIKIJS.md for complete setup script
- **Post-setup:** Verify at https://wiki.hyperhivelabs.com/hyper-hive-labs
- **Post-setup:** Verify at https://wiki.your-company.com/your-org

### 3. Testing Strategy: Both Mocks and Real APIs
- **Unit tests:** Use mocks for fast feedback during development

@@ -21,7 +21,6 @@ class GiteaConfig:
    def __init__(self):
        self.api_url: Optional[str] = None
        self.api_token: Optional[str] = None
        self.owner: Optional[str] = None
        self.repo: Optional[str] = None
        self.mode: str = 'project'

@@ -31,7 +30,7 @@ class GiteaConfig:
        Project-level configuration overrides system-level.

        Returns:
            Dict containing api_url, api_token, owner, repo, mode
            Dict containing api_url, api_token, repo, mode

        Raises:
            FileNotFoundError: If system config is missing
@@ -58,8 +57,7 @@ class GiteaConfig:
        # Extract values
        self.api_url = os.getenv('GITEA_API_URL')
        self.api_token = os.getenv('GITEA_API_TOKEN')
        self.owner = os.getenv('GITEA_OWNER')
        self.repo = os.getenv('GITEA_REPO')  # Optional for PMO
        self.repo = os.getenv('GITEA_REPO')  # Optional, must be owner/repo format

        # Detect mode
        if self.repo:
@@ -75,7 +73,6 @@ class GiteaConfig:
        return {
            'api_url': self.api_url,
            'api_token': self.api_token,
            'owner': self.owner,
            'repo': self.repo,
            'mode': self.mode
        }
@@ -89,8 +86,7 @@ class GiteaConfig:
        """
        required = {
            'GITEA_API_URL': self.api_url,
            'GITEA_API_TOKEN': self.api_token,
            'GITEA_OWNER': self.owner
            'GITEA_API_TOKEN': self.api_token
        }

        missing = [key for key, value in required.items() if not value]

@@ -26,8 +26,7 @@ class GiteaClient:

        self.base_url = config_dict['api_url']
        self.token = config_dict['api_token']
        self.owner = config_dict['owner']
        self.repo = config_dict.get('repo')  # Optional for PMO
        self.repo = config_dict.get('repo')  # Optional default repo in owner/repo format
        self.mode = config_dict['mode']

        self.session = requests.Session()
@@ -36,7 +35,15 @@ class GiteaClient:
            'Content-Type': 'application/json'
        })

        logger.info(f"Gitea client initialized for {self.owner} in {self.mode} mode")
        logger.info(f"Gitea client initialized in {self.mode} mode")

    def _parse_repo(self, repo: Optional[str] = None) -> tuple:
        """Parse owner/repo from input. Always requires 'owner/repo' format."""
        target = repo or self.repo
        if not target or '/' not in target:
            raise ValueError("Use 'owner/repo' format (e.g. 'bandit/support-claude-mktplace')")
        parts = target.split('/', 1)
        return parts[0], parts[1]

    def list_issues(
        self,
@@ -50,26 +57,17 @@ class GiteaClient:
        Args:
            state: Issue state (open, closed, all)
            labels: Filter by labels
            repo: Override configured repo (for PMO multi-repo)
            repo: Repository in 'owner/repo' format

        Returns:
            List of issue dictionaries

        Raises:
            ValueError: If repository not specified
            requests.HTTPError: If API request fails
        """
        target_repo = repo or self.repo
        if not target_repo:
            raise ValueError("Repository not specified")

        url = f"{self.base_url}/repos/{self.owner}/{target_repo}/issues"
        owner, target_repo = self._parse_repo(repo)
        url = f"{self.base_url}/repos/{owner}/{target_repo}/issues"
        params = {'state': state}

        if labels:
            params['labels'] = ','.join(labels)

        logger.info(f"Listing issues from {self.owner}/{target_repo} with state={state}")
        logger.info(f"Listing issues from {owner}/{target_repo} with state={state}")
        response = self.session.get(url, params=params)
        response.raise_for_status()
        return response.json()
@@ -79,26 +77,10 @@ class GiteaClient:
        issue_number: int,
        repo: Optional[str] = None
    ) -> Dict:
        """
        Get specific issue details.

        Args:
            issue_number: Issue number
            repo: Override configured repo (for PMO multi-repo)

        Returns:
            Issue dictionary

        Raises:
            ValueError: If repository not specified
            requests.HTTPError: If API request fails
        """
        target_repo = repo or self.repo
        if not target_repo:
            raise ValueError("Repository not specified")

        url = f"{self.base_url}/repos/{self.owner}/{target_repo}/issues/{issue_number}"
        logger.info(f"Getting issue #{issue_number} from {self.owner}/{target_repo}")
        """Get specific issue details."""
        owner, target_repo = self._parse_repo(repo)
        url = f"{self.base_url}/repos/{owner}/{target_repo}/issues/{issue_number}"
        logger.info(f"Getting issue #{issue_number} from {owner}/{target_repo}")
        response = self.session.get(url)
        response.raise_for_status()
        return response.json()
@@ -110,69 +92,30 @@ class GiteaClient:
        labels: Optional[List[str]] = None,
        repo: Optional[str] = None
    ) -> Dict:
        """
        Create a new issue in Gitea.

        Args:
            title: Issue title
            body: Issue description
            labels: List of label names (will be converted to IDs)
            repo: Override configured repo (for PMO multi-repo)

        Returns:
            Created issue dictionary

        Raises:
            ValueError: If repository not specified
            requests.HTTPError: If API request fails
        """
        target_repo = repo or self.repo
        if not target_repo:
            raise ValueError("Repository not specified")

        url = f"{self.base_url}/repos/{self.owner}/{target_repo}/issues"
        data = {
            'title': title,
            'body': body
        }

        """Create a new issue in Gitea."""
        owner, target_repo = self._parse_repo(repo)
        url = f"{self.base_url}/repos/{owner}/{target_repo}/issues"
        data = {'title': title, 'body': body}
        if labels:
            # Convert label names to IDs (Gitea expects integer IDs, not strings)
            label_ids = self._resolve_label_ids(labels, target_repo)
            label_ids = self._resolve_label_ids(labels, owner, target_repo)
            data['labels'] = label_ids

        logger.info(f"Creating issue in {self.owner}/{target_repo}: {title}")
        logger.info(f"Creating issue in {owner}/{target_repo}: {title}")
        response = self.session.post(url, json=data)
        response.raise_for_status()
        return response.json()

    def _resolve_label_ids(self, label_names: List[str], repo: str) -> List[int]:
        """
        Convert label names to label IDs.

        Args:
            label_names: List of label names (e.g., ['Type/Feature', 'Priority/High'])
            repo: Repository name

        Returns:
            List of label IDs
        """
        # Fetch all available labels (org + repo)
        org_labels = self.get_org_labels()
        repo_labels = self.get_labels(repo)
    def _resolve_label_ids(self, label_names: List[str], owner: str, repo: str) -> List[int]:
        """Convert label names to label IDs."""
        org_labels = self.get_org_labels(owner)
        repo_labels = self.get_labels(f"{owner}/{repo}")
        all_labels = org_labels + repo_labels

        # Build name -> ID mapping
        label_map = {label['name']: label['id'] for label in all_labels}

        # Resolve IDs
        label_ids = []
        for name in label_names:
            if name in label_map:
                label_ids.append(label_map[name])
            else:
                logger.warning(f"Label '{name}' not found in Gitea, skipping")

                logger.warning(f"Label '{name}' not found, skipping")
        return label_ids

    def update_issue(
@@ -184,31 +127,10 @@ class GiteaClient:
        labels: Optional[List[str]] = None,
        repo: Optional[str] = None
    ) -> Dict:
        """
        Update existing issue.

        Args:
            issue_number: Issue number
            title: New title (optional)
            body: New body (optional)
            state: New state - 'open' or 'closed' (optional)
            labels: New labels (optional)
            repo: Override configured repo (for PMO multi-repo)

        Returns:
            Updated issue dictionary

        Raises:
            ValueError: If repository not specified
            requests.HTTPError: If API request fails
        """
        target_repo = repo or self.repo
        if not target_repo:
            raise ValueError("Repository not specified")

        url = f"{self.base_url}/repos/{self.owner}/{target_repo}/issues/{issue_number}"
        """Update existing issue. Repo must be 'owner/repo' format."""
        owner, target_repo = self._parse_repo(repo)
        url = f"{self.base_url}/repos/{owner}/{target_repo}/issues/{issue_number}"
        data = {}

        if title is not None:
            data['title'] = title
        if body is not None:
@@ -217,8 +139,7 @@ class GiteaClient:
            data['state'] = state
        if labels is not None:
            data['labels'] = labels

        logger.info(f"Updating issue #{issue_number} in {self.owner}/{target_repo}")
        logger.info(f"Updating issue #{issue_number} in {owner}/{target_repo}")
        response = self.session.patch(url, json=data)
        response.raise_for_status()
        return response.json()
@@ -229,131 +150,62 @@ class GiteaClient:
        comment: str,
        repo: Optional[str] = None
    ) -> Dict:
        """
        Add comment to issue.

        Args:
            issue_number: Issue number
            comment: Comment text
            repo: Override configured repo (for PMO multi-repo)

        Returns:
            Created comment dictionary

        Raises:
            ValueError: If repository not specified
            requests.HTTPError: If API request fails
        """
        target_repo = repo or self.repo
        if not target_repo:
            raise ValueError("Repository not specified")

        url = f"{self.base_url}/repos/{self.owner}/{target_repo}/issues/{issue_number}/comments"
        """Add comment to issue. Repo must be 'owner/repo' format."""
        owner, target_repo = self._parse_repo(repo)
        url = f"{self.base_url}/repos/{owner}/{target_repo}/issues/{issue_number}/comments"
        data = {'body': comment}

        logger.info(f"Adding comment to issue #{issue_number} in {self.owner}/{target_repo}")
        logger.info(f"Adding comment to issue #{issue_number} in {owner}/{target_repo}")
        response = self.session.post(url, json=data)
        response.raise_for_status()
        return response.json()

    def get_labels(
        self,
        repo: Optional[str] = None
    ) -> List[Dict]:
        """
        Get all labels from repository.

        Args:
            repo: Override configured repo (for PMO multi-repo)

        Returns:
            List of label dictionaries

        Raises:
            ValueError: If repository not specified
            requests.HTTPError: If API request fails
        """
        target_repo = repo or self.repo
        if not target_repo:
            raise ValueError("Repository not specified")

        url = f"{self.base_url}/repos/{self.owner}/{target_repo}/labels"
        logger.info(f"Getting labels from {self.owner}/{target_repo}")
    def get_labels(self, repo: Optional[str] = None) -> List[Dict]:
        """Get all labels from repository. Repo must be 'owner/repo' format."""
        owner, target_repo = self._parse_repo(repo)
        url = f"{self.base_url}/repos/{owner}/{target_repo}/labels"
        logger.info(f"Getting labels from {owner}/{target_repo}")
        response = self.session.get(url)
        response.raise_for_status()
        return response.json()

    def get_org_labels(self) -> List[Dict]:
        """
        Get organization-level labels.

        Returns:
            List of organization label dictionaries

        Raises:
            requests.HTTPError: If API request fails
        """
        url = f"{self.base_url}/orgs/{self.owner}/labels"
        logger.info(f"Getting organization labels for {self.owner}")
    def get_org_labels(self, org: str) -> List[Dict]:
        """Get organization-level labels. Org is the organization name."""
        url = f"{self.base_url}/orgs/{org}/labels"
        logger.info(f"Getting organization labels for {org}")
        response = self.session.get(url)
        response.raise_for_status()
        return response.json()

    # PMO-specific methods

    def list_repos(self) -> List[Dict]:
        """
        List all repositories in organization (PMO mode).

        Returns:
            List of repository dictionaries

        Raises:
            requests.HTTPError: If API request fails
        """
        url = f"{self.base_url}/orgs/{self.owner}/repos"
        logger.info(f"Listing all repositories for organization {self.owner}")
    def list_repos(self, org: str) -> List[Dict]:
        """List all repositories in organization. Org is the organization name."""
        url = f"{self.base_url}/orgs/{org}/repos"
        logger.info(f"Listing all repositories for organization {org}")
        response = self.session.get(url)
        response.raise_for_status()
        return response.json()

    def aggregate_issues(
        self,
        org: str,
        state: str = 'open',
        labels: Optional[List[str]] = None
    ) -> Dict[str, List[Dict]]:
        """
        Fetch issues across all repositories (PMO mode).
        Returns dict keyed by repository name.

        Args:
            state: Issue state (open, closed, all)
            labels: Filter by labels

        Returns:
            Dictionary mapping repository names to issue lists

        Raises:
            requests.HTTPError: If API request fails
        """
        repos = self.list_repos()
        """Fetch issues across all repositories in org."""
        repos = self.list_repos(org)
        aggregated = {}

        logger.info(f"Aggregating issues across {len(repos)} repositories")

        for repo in repos:
            repo_name = repo['name']
            try:
                issues = self.list_issues(
                    state=state,
                    labels=labels,
                    repo=repo_name
                    repo=f"{org}/{repo_name}"
                )
                if issues:
                    aggregated[repo_name] = issues
                    logger.info(f"Found {len(issues)} issues in {repo_name}")
            except Exception as e:
                # Log error but continue with other repos
                logger.error(f"Error fetching issues from {repo_name}: {e}")

        return aggregated

@@ -15,7 +15,10 @@ from .gitea_client import GiteaClient
from .tools.issues import IssueTools
from .tools.labels import LabelTools

# Suppress noisy MCP validation warnings on stderr
logging.basicConfig(level=logging.INFO)
logging.getLogger("root").setLevel(logging.ERROR)
logging.getLogger("mcp").setLevel(logging.ERROR)
logger = logging.getLogger(__name__)


@@ -216,6 +219,10 @@ class GiteaMCPServer:
                inputSchema={
                    "type": "object",
                    "properties": {
                        "org": {
                            "type": "string",
                            "description": "Organization name (e.g. 'bandit')"
                        },
                        "state": {
                            "type": "string",
                            "enum": ["open", "closed", "all"],
@@ -227,7 +234,8 @@ class GiteaMCPServer:
                            "items": {"type": "string"},
                            "description": "Filter by labels"
                        }
                    }
                    },
                    "required": ["org"]
                }
            )
        ]

@@ -245,35 +245,17 @@ class IssueTools:

    async def aggregate_issues(
        self,
        org: str,
        state: str = 'open',
        labels: Optional[List[str]] = None
    ) -> Dict[str, List[Dict]]:
        """
        Aggregate issues across all repositories (PMO mode, async wrapper).

        Args:
            state: Issue state (open, closed, all)
            labels: Filter by labels

        Returns:
            Dictionary mapping repository names to issue lists

        Raises:
            ValueError: If not in company mode
            PermissionError: If operation not allowed on current branch
        """
        if self.gitea.mode != 'company':
            raise ValueError("aggregate_issues only available in company mode")

        """Aggregate issues across all repositories in org."""
        if not self._check_branch_permissions('aggregate_issues'):
            branch = self._get_current_branch()
            raise PermissionError(
                f"Cannot aggregate issues on branch '{branch}'. "
                f"Switch to a development branch."
            )
            raise PermissionError(f"Cannot aggregate issues on branch '{branch}'.")

        loop = asyncio.get_event_loop()
        return await loop.run_in_executor(
            None,
            lambda: self.gitea.aggregate_issues(state, labels)
            lambda: self.gitea.aggregate_issues(org, state, labels)
        )

@@ -27,27 +27,20 @@ class LabelTools:
        self.gitea = gitea_client

    async def get_labels(self, repo: Optional[str] = None) -> Dict[str, List[Dict]]:
        """
        Get all labels (org + repo) (async wrapper).

        Args:
            repo: Override configured repo (for PMO multi-repo)

        Returns:
            Dictionary with 'org' and 'repo' label lists
        """
        """Get all labels (org + repo). Repo must be 'owner/repo' format."""
        loop = asyncio.get_event_loop()

        # Get org labels
        target_repo = repo or self.gitea.repo
        if not target_repo or '/' not in target_repo:
            raise ValueError("Use 'owner/repo' format (e.g. 'bandit/support-claude-mktplace')")

        org = target_repo.split('/')[0]

        org_labels = await loop.run_in_executor(
            None,
            self.gitea.get_org_labels
            lambda: self.gitea.get_org_labels(org)
        )

        # Get repo labels if repo is specified
        repo_labels = []
        if repo or self.gitea.repo:
            target_repo = repo or self.gitea.repo
            repo_labels = await loop.run_in_executor(
                None,
                lambda: self.gitea.get_labels(target_repo)

@@ -23,7 +23,10 @@ from .tools.vpn import VPNTools
from .tools.wireless import WirelessTools
from .tools.extras import ExtrasTools

# Suppress noisy MCP validation warnings on stderr
logging.basicConfig(level=logging.INFO)
logging.getLogger("root").setLevel(logging.ERROR)
logging.getLogger("mcp").setLevel(logging.ERROR)
logger = logging.getLogger(__name__)



@@ -110,7 +110,7 @@ cat > ~/.config/claude/wikijs.env << 'EOF'
# Wiki.js API Configuration
WIKIJS_API_URL=http://wikijs.hotport/graphql
WIKIJS_API_TOKEN=your_api_token_here
WIKIJS_BASE_PATH=/hyper-hive-labs
WIKIJS_BASE_PATH=/your-org
EOF

chmod 600 ~/.config/claude/wikijs.env
@@ -216,7 +216,7 @@ The MCP server is referenced in plugin `.mcp.json`:
|----------|-------------|---------|
| `WIKIJS_API_URL` | Wiki.js GraphQL endpoint | `http://wiki.example.com/graphql` |
| `WIKIJS_API_TOKEN` | API authentication token (JWT) | `eyJhbGciOiJSUzI1...` |
| `WIKIJS_BASE_PATH` | Base path in Wiki.js | `/hyper-hive-labs` |
| `WIKIJS_BASE_PATH` | Base path in Wiki.js | `/your-org` |

### Optional Variables

@@ -234,7 +234,7 @@ The MCP server is referenced in plugin `.mcp.json`:
### Recommended Organization

```
/hyper-hive-labs/ # Base path
/your-org/ # Base path
├── projects/ # Project-specific
│ ├── your-project/
│ │ ├── lessons-learned/

@@ -52,7 +52,7 @@ mkdir -p ~/.config/claude
cat > ~/.config/claude/wikijs.env << 'EOF'
WIKIJS_API_URL=http://wikijs.hotport/graphql
WIKIJS_API_TOKEN=your_real_token_here
WIKIJS_BASE_PATH=/hyper-hive-labs
WIKIJS_BASE_PATH=/your-org
EOF

# Run integration tests
@@ -198,7 +198,7 @@ echo '{
  "success": true,
  "page": {
    "id": 123,
    "path": "/hyper-hive-labs/projects/test-project/documentation/test-api",
    "path": "/your-org/projects/test-project/documentation/test-api",
    "title": "Test API Documentation"
  }
}

@@ -13,7 +13,10 @@ from mcp.types import Tool, TextContent
from .config import WikiJSConfig
from .wikijs_client import WikiJSClient

# Suppress noisy MCP validation warnings on stderr
logging.basicConfig(level=logging.INFO)
logging.getLogger("root").setLevel(logging.ERROR)
logging.getLogger("mcp").setLevel(logging.ERROR)
logger = logging.getLogger(__name__)



@@ -21,8 +21,8 @@ class WikiJSClient:
        Args:
            api_url: Wiki.js GraphQL API URL (e.g., http://wiki.example.com/graphql)
            api_token: Wiki.js API token
            base_path: Base path in Wiki.js (e.g., /hyper-hive-labs)
            project: Project path (e.g., projects/cuisineflow) for project mode
            base_path: Base path in Wiki.js (e.g., /your-org)
            project: Project path (e.g., projects/my-project) for project mode
        """
        self.api_url = api_url
        self.api_token = api_token

@@ -2,7 +2,10 @@
  "name": "cmdb-assistant",
  "version": "1.0.0",
  "description": "NetBox CMDB integration for infrastructure management - query, create, update, and manage network devices, IP addresses, sites, and more",
  "author": "Bandit Labs",
  "author": {
    "name": "Bandit Labs",
    "email": "dev@banditlabs.io"
  },
  "homepage": "https://github.com/bandit-labs/cmdb-assistant",
  "license": "MIT",
  "keywords": [

@@ -16,50 +19,25 @@
  "commands": {
    "cmdb-search": {
      "description": "Search NetBox for devices, IPs, sites, or any CMDB object",
      "file": "commands/cmdb-search.md"
      "source": "commands/cmdb-search.md"
    },
    "cmdb-device": {
      "description": "Manage network devices (create, view, update, delete)",
      "file": "commands/cmdb-device.md"
      "source": "commands/cmdb-device.md"
    },
    "cmdb-ip": {
      "description": "Manage IP addresses and prefixes",
      "file": "commands/cmdb-ip.md"
      "source": "commands/cmdb-ip.md"
    },
    "cmdb-site": {
      "description": "Manage sites and locations",
      "file": "commands/cmdb-site.md"
      "source": "commands/cmdb-site.md"
    }
  },
  "agents": {
    "cmdb-assistant": {
      "description": "Infrastructure management assistant for NetBox CMDB operations",
      "file": "agents/cmdb-assistant.md"
      "source": "agents/cmdb-assistant.md"
    }
  },
  "mcpServers": {
    "netbox": {
      "description": "NetBox API integration via MCP",
      "configFile": ".mcp.json"
    }
  },
  "configuration": {
    "required": [
      {
        "name": "NETBOX_URL",
        "description": "NetBox instance URL (e.g., https://netbox.example.com)"
      },
      {
        "name": "NETBOX_TOKEN",
        "description": "NetBox API token for authentication"
      }
    ],
    "optional": [
      {
        "name": "NETBOX_VERIFY_SSL",
        "description": "Verify SSL certificates (default: true)",
        "default": "true"
      }
    ]
  }
}

@@ -1,9 +1,12 @@
{
  "mcpServers": {
    "netbox": {
      "command": "${CLAUDE_PLUGIN_ROOT}/../../mcp-servers/netbox/.venv/bin/python",
      "command": "/home/lmiranda/repos/bandit/support-claude-mktplace/mcp-servers/netbox/.venv/bin/python",
      "args": ["-m", "mcp_server.server"],
      "cwd": "${CLAUDE_PLUGIN_ROOT}/../../mcp-servers/netbox"
      "cwd": "/home/lmiranda/repos/bandit/support-claude-mktplace/mcp-servers/netbox",
      "env": {
        "PYTHONPATH": "/home/lmiranda/repos/bandit/support-claude-mktplace/mcp-servers/netbox"
      }
    }
  }
}

@@ -1,14 +1,20 @@
{
  "mcpServers": {
    "gitea": {
      "command": "${CLAUDE_PLUGIN_ROOT}/../../mcp-servers/gitea/.venv/bin/python",
      "command": "/home/lmiranda/repos/bandit/support-claude-mktplace/mcp-servers/gitea/.venv/bin/python",
      "args": ["-m", "mcp_server.server"],
      "cwd": "${CLAUDE_PLUGIN_ROOT}/../../mcp-servers/gitea"
      "cwd": "/home/lmiranda/repos/bandit/support-claude-mktplace/mcp-servers/gitea",
      "env": {
        "PYTHONPATH": "/home/lmiranda/repos/bandit/support-claude-mktplace/mcp-servers/gitea"
      }
    },
    "wikijs": {
      "command": "${CLAUDE_PLUGIN_ROOT}/../../mcp-servers/wikijs/.venv/bin/python",
      "command": "/home/lmiranda/repos/bandit/support-claude-mktplace/mcp-servers/wikijs/.venv/bin/python",
      "args": ["-m", "mcp_server.server"],
      "cwd": "${CLAUDE_PLUGIN_ROOT}/../../mcp-servers/wikijs"
      "cwd": "/home/lmiranda/repos/bandit/support-claude-mktplace/mcp-servers/wikijs",
      "env": {
        "PYTHONPATH": "/home/lmiranda/repos/bandit/support-claude-mktplace/mcp-servers/wikijs"
      }
    }
  }
}

@@ -112,7 +112,7 @@ python -c "from mcp_server import server; print('Wiki.js MCP Server installed su

### 2.2 Generate Wiki.js API Token

1. Log into Wiki.js: https://wiki.hyperhivelabs.com
1. Log into Wiki.js: https://wiki.your-company.com
2. Navigate to: **Administration** (top right)
3. Click **API Access** in the left sidebar
4. Click **New API Key**

@@ -166,9 +166,9 @@ chmod 600 ~/.config/claude/gitea.env
```bash
cat > ~/.config/claude/wikijs.env << 'EOF'
# Wiki.js API Configuration
WIKIJS_API_URL=https://wiki.hyperhivelabs.com/graphql
WIKIJS_API_URL=https://wiki.your-company.com/graphql
WIKIJS_API_TOKEN=your_wikijs_token_here
WIKIJS_BASE_PATH=/hyper-hive-labs
WIKIJS_BASE_PATH=/your-org
EOF

# Secure the file (owner read/write only)

@@ -180,7 +180,7 @@ chmod 600 ~/.config/claude/wikijs.env
**Configuration Variables:**
- `WIKIJS_API_URL` - Wiki.js GraphQL endpoint (includes `/graphql`)
- `WIKIJS_API_TOKEN` - API key from Step 2.2 (JWT format)
- `WIKIJS_BASE_PATH` - Base path in Wiki.js (e.g., `/hyper-hive-labs`)
- `WIKIJS_BASE_PATH` - Base path in Wiki.js (e.g., `/your-org`)

### 3.4 Verify System Configuration

@@ -263,7 +263,7 @@ curl -H "Authorization: token YOUR_GITEA_TOKEN" \
curl -H "Authorization: Bearer YOUR_WIKIJS_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"query": "{ pages { list { id title } } }"}' \
  https://wiki.hyperhivelabs.com/graphql
  https://wiki.your-company.com/graphql

# Should return pages data in JSON format
```

@@ -320,17 +320,17 @@ GITEA_OWNER=bandit

**`~/.config/claude/wikijs.env`:**
```bash
WIKIJS_API_URL=https://wiki.hyperhivelabs.com/graphql
WIKIJS_API_URL=https://wiki.your-company.com/graphql
WIKIJS_API_TOKEN=eyJhbGciOiJSUzI1NiIsInR5cCI6IkpXVCJ9...
WIKIJS_BASE_PATH=/hyper-hive-labs
WIKIJS_BASE_PATH=/your-org
```

### Project-Level Files

**`.env` (in project root):**
```bash
GITEA_REPO=cuisineflow
WIKIJS_PROJECT=projects/cuisineflow
GITEA_REPO=your-repo-name
WIKIJS_PROJECT=projects/your-project-name
```

**`.gitignore` (must include):**

@@ -371,25 +371,25 @@ To use Projman with multiple projects:
1. **System config:** Set up once (already done in Step 3)
2. **Project config:** Create `.env` in each project root:

**Project 1: CuisineFlow**
**Project 1: Main App**
```bash
# ~/projects/cuisineflow/.env
GITEA_REPO=cuisineflow
WIKIJS_PROJECT=projects/cuisineflow
# ~/projects/my-app/.env
GITEA_REPO=my-app
WIKIJS_PROJECT=projects/my-app
```

**Project 2: CuisineFlow-Site**
**Project 2: App Site**
```bash
# ~/projects/cuisineflow-site/.env
GITEA_REPO=cuisineflow-site
WIKIJS_PROJECT=projects/cuisineflow-site
# ~/projects/my-app-site/.env
GITEA_REPO=my-app-site
WIKIJS_PROJECT=projects/my-app-site
```

**Project 3: HHL-Site**
**Project 3: Company Site**
```bash
# ~/projects/hhl-site/.env
GITEA_REPO=hhl-site
WIKIJS_PROJECT=projects/hhl-site
# ~/projects/company-site/.env
GITEA_REPO=company-site
WIKIJS_PROJECT=projects/company-site
```

Each project operates independently with its own issues and lessons learned.

@@ -421,7 +421,7 @@ curl -H "Authorization: token YOUR_TOKEN" \

# Test Wiki.js token
curl -H "Authorization: Bearer YOUR_TOKEN" \
  https://wiki.hyperhivelabs.com/graphql
  https://wiki.your-company.com/graphql

# If fails, regenerate token (Step 2)
```

@@ -59,9 +59,9 @@ EOF

# Wiki.js configuration
cat > ~/.config/claude/wikijs.env << EOF
WIKIJS_API_URL=https://wiki.hyperhivelabs.com/graphql
WIKIJS_API_URL=https://wiki.your-company.com/graphql
WIKIJS_API_TOKEN=your_wikijs_token_here
WIKIJS_BASE_PATH=/hyper-hive-labs
WIKIJS_BASE_PATH=/your-org
EOF

# Secure the files

@@ -327,7 +327,7 @@ See [CONFIGURATION.md](./CONFIGURATION.md) for detailed configuration instructio

### Cannot connect to Wiki.js
- Verify `~/.config/claude/wikijs.env` exists and has correct URL and token
- Check Wiki.js GraphQL endpoint: `https://wiki.hyperhivelabs.com/graphql`
- Check Wiki.js GraphQL endpoint: `https://wiki.your-company.com/graphql`
- Verify API token has pages read/write permissions

### Labels not syncing

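The Wiki.js token checks that recur throughout this diff (curl with a `Bearer` header posting a GraphQL page-list query) can be sketched in Python as well. This is a minimal sketch, not part of the commit: the hostname and token below are placeholders, and only the request construction is shown; performing the actual check requires network access.

```python
import json
import urllib.request


def build_wikijs_probe(base_url: str, token: str) -> urllib.request.Request:
    """Build the same GraphQL page-list probe the curl checks use."""
    # Wiki.js expects the GraphQL query as a JSON body at <base_url>/graphql
    payload = json.dumps({"query": "{ pages { list { id title } } }"}).encode()
    return urllib.request.Request(
        f"{base_url}/graphql",
        data=payload,
        headers={
            # Wiki.js uses the Bearer scheme (Gitea, by contrast, uses "token")
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )


# Placeholder host and token, matching the docs' examples
req = build_wikijs_probe("https://wiki.your-company.com", "YOUR_WIKIJS_TOKEN")
print(req.full_url)
# urllib.request.urlopen(req) would run the actual check (network required)
```

On success, the response mirrors the curl check: a JSON document listing page ids and titles.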