development #22

Merged
lmiranda merged 2 commits from development into main 2025-12-12 07:13:10 +00:00
22 changed files with 357 additions and 506 deletions


@@ -98,19 +98,19 @@ Complete JSON schema reference for `.claude-plugin/plugin.json` files.
"version": "2.1.0", "version": "2.1.0",
"description": "Automated deployment tools for cloud platforms", "description": "Automated deployment tools for cloud platforms",
"author": { "author": {
"name": "Bandit Labs", "name": "Your Name",
"email": "plugins@hyperhivelabs.com", "email": "plugins@example.com",
"url": "https://hyperhivelabs.com" "url": "https://example.com"
}, },
"license": "MIT", "license": "MIT",
"keywords": ["deployment", "automation", "cloud", "devops"], "keywords": ["deployment", "automation", "cloud", "devops"],
"homepage": "https://github.com/hyperhivelabs/deploy-automation", "homepage": "https://github.com/yourorg/deploy-automation",
"repository": { "repository": {
"type": "git", "type": "git",
"url": "https://github.com/hyperhivelabs/deploy-automation.git" "url": "https://github.com/yourorg/deploy-automation.git"
}, },
"bugs": { "bugs": {
"url": "https://github.com/hyperhivelabs/deploy-automation/issues" "url": "https://github.com/yourorg/deploy-automation/issues"
}, },
"dependencies": { "dependencies": {
"node": ">=18.0.0", "node": ">=18.0.0",

.mcp.json Normal file

@@ -0,0 +1,28 @@
+{
+  "mcpServers": {
+    "gitea": {
+      "command": "/home/lmiranda/repos/bandit/support-claude-mktplace/mcp-servers/gitea/.venv/bin/python",
+      "args": ["-m", "mcp_server.server"],
+      "cwd": "/home/lmiranda/repos/bandit/support-claude-mktplace/mcp-servers/gitea",
+      "env": {
+        "PYTHONPATH": "/home/lmiranda/repos/bandit/support-claude-mktplace/mcp-servers/gitea"
+      }
+    },
+    "wikijs": {
+      "command": "/home/lmiranda/repos/bandit/support-claude-mktplace/mcp-servers/wikijs/.venv/bin/python",
+      "args": ["-m", "mcp_server.server"],
+      "cwd": "/home/lmiranda/repos/bandit/support-claude-mktplace/mcp-servers/wikijs",
+      "env": {
+        "PYTHONPATH": "/home/lmiranda/repos/bandit/support-claude-mktplace/mcp-servers/wikijs"
+      }
+    },
+    "netbox": {
+      "command": "/home/lmiranda/repos/bandit/support-claude-mktplace/mcp-servers/netbox/.venv/bin/python",
+      "args": ["-m", "mcp_server.server"],
+      "cwd": "/home/lmiranda/repos/bandit/support-claude-mktplace/mcp-servers/netbox",
+      "env": {
+        "PYTHONPATH": "/home/lmiranda/repos/bandit/support-claude-mktplace/mcp-servers/netbox"
+      }
+    }
+  }
+}
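The three server entries above are identical apart from the directory: each uses its own venv interpreter, the same `-m mcp_server.server` entry point, and a matching `cwd` and `PYTHONPATH`. A minimal sketch of generating that pattern instead of hand-copying it — `make_mcp_config` and the `/opt/mcp-servers` base path are illustrative, not part of this repository:

```python
import json
from pathlib import Path


def make_mcp_config(base: str, servers: list[str]) -> dict:
    """Build an .mcp.json-style mapping for per-directory MCP servers."""
    cfg: dict = {"mcpServers": {}}
    for name in servers:
        root = str(Path(base) / name)
        cfg["mcpServers"][name] = {
            "command": f"{root}/.venv/bin/python",   # each server's own venv
            "args": ["-m", "mcp_server.server"],
            "cwd": root,
            "env": {"PYTHONPATH": root},             # make mcp_server importable
        }
    return cfg


if __name__ == "__main__":
    print(json.dumps(make_mcp_config("/opt/mcp-servers", ["gitea", "wikijs"]), indent=2))
```

Adding a fourth server then means adding one name to the list rather than another copy-pasted block.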


@@ -212,20 +212,19 @@ The label system includes:
**Critical Feature:** After 15 sprints without lesson capture, repeated mistakes occurred (e.g., Claude Code infinite loops on similar issues 2-3 times).
-**Wiki.js Structure:**
+**Wiki.js Structure (Example):**
```
-Wiki.js: https://wiki.hyperhivelabs.com
-└── /hyper-hive-labs/
+Wiki.js: https://wiki.your-company.com
+└── /your-org/
    ├── projects/              # Project-specific documentation
-    │   ├── cuisineflow/
+    │   ├── project-a/
    │   │   ├── lessons-learned/
    │   │   │   ├── sprints/
    │   │   │   ├── patterns/
    │   │   │   └── INDEX.md
    │   │   └── documentation/
-    │   ├── cuisineflow-site/
-    │   ├── intuit-engine/
-    │   └── hhl-site/
+    │   ├── project-b/
+    │   └── project-c/
    ├── company/               # Company-wide documentation
    │   ├── processes/
    │   ├── standards/
@@ -376,11 +375,11 @@ bandit/support-claude-mktplace/
## Multi-Project Context (PMO Plugin)
-The `projman-pmo` plugin will coordinate interdependent projects:
+The `projman-pmo` plugin coordinates interdependent projects across an organization. Example use cases:
-- **CuisineFlow** - Main product
-- **CuisineFlow-Site** - Demo sync + customer gateway
-- **Intuit Engine Service** - API aggregator extraction (imminent)
-- **HHL-Site** - Company presence
+- Main product repository
+- Marketing/documentation sites
+- Extracted services
+- Supporting tools
PMO plugin adds:
- Cross-project issue aggregation (all repos in organization)
@@ -392,7 +391,7 @@ PMO plugin adds:
**Configuration Difference:**
- PMO operates at company level (no `GITEA_REPO` or `WIKIJS_PROJECT`)
-- Accesses entire `/hyper-hive-labs` Wiki.js namespace
+- Accesses entire organization Wiki.js namespace
- Queries all repositories in organization
Build PMO plugin AFTER projman is working and validated.
@@ -409,7 +408,7 @@ Create local marketplace for plugin development:
```
**Integration Testing:**
-Test in real CuisineFlow repository with actual Gitea instance before distribution.
+Test in a real repository with actual Gitea instance before distribution.
**Success Metrics:**
- Sprint planning time reduced 40%
@@ -426,7 +425,7 @@ Test in real CuisineFlow repository with actual Gitea instance before distributi
- **Type/Refactor label** - Newly implemented at org level for architectural work
- **Branch detection must be 100% reliable** - Prevents production accidents
- **Python for MCP servers** - Use Python 3.8+ with virtual environments
-- **Wiki.js structure** - All HHL content under `/hyper-hive-labs` namespace
+- **Wiki.js structure** - Organization content under configured base namespace
## CRITICAL: Rules You MUST Follow
@@ -501,7 +500,7 @@ This repository contains comprehensive planning documentation:
- Two-MCP-server approach confirmed (Gitea + Wiki.js)
- Python selected for MCP server implementation
- Hybrid configuration strategy defined (system + project level)
-- Wiki.js structure planned at `/hyper-hive-labs`
+- Wiki.js structure planned with configurable base path
- Repository structure designed with shared MCP servers
- `claude-plugin-developer` skill added to project


@@ -25,12 +25,12 @@ The MCP server operates in two modes based on environment variables:
**Project Mode (projman):**
- When `WIKIJS_PROJECT` is present
-- Operates within project path: `/hyper-hive-labs/projects/cuisineflow`
+- Operates within project path: `/your-org/projects/my-project`
- Used by projman plugin
**Company Mode (pmo):**
- When `WIKIJS_PROJECT` is absent
-- Operates on entire namespace: `/hyper-hive-labs`
+- Operates on entire namespace: `/your-org`
- Used by projman-pmo plugin
```python
@@ -38,8 +38,8 @@ The MCP server operates in two modes based on environment variables:
def load(self):
    # ... load configs ...
-    self.base_path = os.getenv('WIKIJS_BASE_PATH')  # /hyper-hive-labs
-    self.project_path = os.getenv('WIKIJS_PROJECT')  # projects/cuisineflow (optional)
+    self.base_path = os.getenv('WIKIJS_BASE_PATH')  # /your-org
+    self.project_path = os.getenv('WIKIJS_PROJECT')  # projects/my-project (optional)
    # Compose full path
    if self.project_path:
@@ -66,10 +66,10 @@ def load(self):
### Company-Wide Organization
```
-Wiki.js: https://wiki.hyperhivelabs.com
-└── /hyper-hive-labs/              # Base path
+Wiki.js: https://wiki.your-company.com
+└── /your-org/                     # Base path
    ├── projects/                  # Project-specific documentation
-    │   ├── cuisineflow/
+    │   ├── my-project/
    │   │   ├── lessons-learned/
    │   │   │   ├── sprints/
    │   │   │   │   ├── sprint-01-auth.md
@@ -83,13 +83,13 @@ Wiki.js: https://wiki.hyperhivelabs.com
    │   │   ├── architecture/
    │   │   ├── api/
    │   │   └── deployment/
-    │   ├── cuisineflow-site/
+    │   ├── my-project-site/
    │   │   ├── lessons-learned/
    │   │   └── documentation/
    │   ├── intuit-engine/
    │   │   ├── lessons-learned/
    │   │   └── documentation/
-    │   └── hhl-site/
+    │   └── company-site/
    │       ├── lessons-learned/
    │       └── documentation/
    ├── company/                   # Company-wide documentation
@@ -124,11 +124,11 @@ Wiki.js: https://wiki.hyperhivelabs.com
**Project Mode (projman):**
- Full path = `{WIKIJS_BASE_PATH}/{WIKIJS_PROJECT}`
-- Example: `/hyper-hive-labs/projects/cuisineflow`
+- Example: `/your-org/projects/my-project`
**Company Mode (pmo):**
- Full path = `{WIKIJS_BASE_PATH}`
-- Example: `/hyper-hive-labs`
+- Example: `/your-org`
---
@@ -139,14 +139,14 @@ Wiki.js: https://wiki.hyperhivelabs.com
**File:** `~/.config/claude/wikijs.env`
```bash
-WIKIJS_API_URL=https://wiki.hyperhivelabs.com/graphql
+WIKIJS_API_URL=https://wiki.your-company.com/graphql
WIKIJS_API_TOKEN=your_wikijs_token
-WIKIJS_BASE_PATH=/hyper-hive-labs
+WIKIJS_BASE_PATH=/your-org
```
**Generating Wiki.js API Token:**
-1. Log into Wiki.js: https://wiki.hyperhivelabs.com
+1. Log into Wiki.js: https://wiki.your-company.com
2. Navigate to: **User Menu (top-right)** → **Administration**
3. In the sidebar, go to: **API Access**
4. Click **Create New Token**
@@ -170,9 +170,9 @@ mkdir -p ~/.config/claude
# Create wikijs.env
cat > ~/.config/claude/wikijs.env << EOF
-WIKIJS_API_URL=https://wiki.hyperhivelabs.com/graphql
+WIKIJS_API_URL=https://wiki.your-company.com/graphql
WIKIJS_API_TOKEN=your_token_here
-WIKIJS_BASE_PATH=/hyper-hive-labs
+WIKIJS_BASE_PATH=/your-org
EOF
# Secure the file (important!)
@@ -188,13 +188,13 @@ cat ~/.config/claude/wikijs.env
```bash
# Wiki.js project path (relative to base path)
-WIKIJS_PROJECT=projects/cuisineflow
+WIKIJS_PROJECT=projects/my-project
```
**Setup:**
```bash
# In each project root
-echo "WIKIJS_PROJECT=projects/cuisineflow" >> .env
+echo "WIKIJS_PROJECT=projects/my-project" >> .env
# Add to .gitignore (if not already)
echo ".env" >> .gitignore
@@ -243,8 +243,8 @@ class WikiJSConfig:
# Extract values
self.api_url = os.getenv('WIKIJS_API_URL')
self.api_token = os.getenv('WIKIJS_API_TOKEN')
-self.base_path = os.getenv('WIKIJS_BASE_PATH')  # /hyper-hive-labs
-self.project_path = os.getenv('WIKIJS_PROJECT')  # projects/cuisineflow (optional)
+self.base_path = os.getenv('WIKIJS_BASE_PATH')  # /your-org
+self.project_path = os.getenv('WIKIJS_PROJECT')  # projects/my-project (optional)
# Compose full path
if self.project_path:
@@ -691,7 +691,7 @@ class WikiJSClient:
by_project = {}
for result in results:
    # Extract project name from path
-    # e.g., "/hyper-hive-labs/projects/cuisineflow/..." -> "cuisineflow"
+    # e.g., "/your-org/projects/my-project/..." -> "my-project"
    path_parts = result['path'].split('/')
    if len(path_parts) >= 4:
        project = path_parts[3]
@@ -957,7 +957,7 @@ Agent: Any architectural insights for similar future work?
User: [Response]
Agent: I'll create a lesson in Wiki.js:
-Path: /hyper-hive-labs/projects/cuisineflow/lessons-learned/sprints/sprint-16-intuit-engine
+Path: /your-org/projects/my-project/lessons-learned/sprints/sprint-16-intuit-engine
Tags detected:
#service-extraction #api #refactoring #claude-code-loops
@@ -965,7 +965,7 @@ Agent: I'll create a lesson in Wiki.js:
Creating page in Wiki.js... ✅
Updating INDEX.md... ✅
-View at: https://wiki.hyperhivelabs.com/hyper-hive-labs/projects/cuisineflow/lessons-learned/sprints/sprint-16-intuit-engine
+View at: https://wiki.your-company.com/your-org/projects/my-project/lessons-learned/sprints/sprint-16-intuit-engine
```
---
@@ -1011,14 +1011,14 @@ def test_project_config_path_composition(tmp_path, monkeypatch):
system_config.write_text(
    "WIKIJS_API_URL=https://wiki.test.com/graphql\n"
    "WIKIJS_API_TOKEN=test_token\n"
-    "WIKIJS_BASE_PATH=/hyper-hive-labs\n"
+    "WIKIJS_BASE_PATH=/your-org\n"
)
project_dir = tmp_path / 'project'
project_dir.mkdir()
project_config = project_dir / '.env'
-project_config.write_text("WIKIJS_PROJECT=projects/cuisineflow\n")
+project_config.write_text("WIKIJS_PROJECT=projects/my-project\n")
monkeypatch.setenv('HOME', str(tmp_path))
monkeypatch.chdir(project_dir)
@@ -1026,8 +1026,8 @@ def test_project_config_path_composition(tmp_path, monkeypatch):
config = WikiJSConfig()
result = config.load()
-assert result['project_path'] == 'projects/cuisineflow'
-assert result['full_path'] == '/hyper-hive-labs/projects/cuisineflow'
+assert result['project_path'] == 'projects/my-project'
+assert result['full_path'] == '/your-org/projects/my-project'
assert result['mode'] == 'project'
```
@@ -1116,7 +1116,7 @@ pytest tests/test_config.py::test_project_config_path_composition
### Initial Structure Creation
-**Status:** Base structure `/hyper-hive-labs` **does not exist** and needs to be created during Phase 1.1b.
+**Status:** Base structure `/your-org` **does not exist** and needs to be created during Phase 1.1b.
**Setup Script:** Run this script during Phase 1.1b to create the base structure:
@@ -1134,7 +1134,7 @@ from mcp_server.wikijs_client import WikiJSClient
async def initialize_wiki_structure():
-    """Create base Wiki.js structure for Bandit Labs"""
+    """Create base Wiki.js structure for Your Organization"""
    print("Initializing Wiki.js base structure...")
    print("=" * 60)
@@ -1153,17 +1153,17 @@ async def initialize_wiki_structure():
# Base structure to create
base_pages = [
    {
-        'path': 'hyper-hive-labs',
-        'title': 'Bandit Labs',
-        'content': '''# Bandit Labs Documentation
+        'path': 'your-org',
+        'title': 'Your Organization',
+        'content': '''# Your Organization Documentation
-Welcome to the Bandit Labs knowledge base.
+Welcome to the Your Organization knowledge base.
## Organization
-- **[Projects](hyper-hive-labs/projects)** - Project-specific documentation and lessons learned
-- **[Company](hyper-hive-labs/company)** - Company-wide processes, standards, and tools
-- **[Shared](hyper-hive-labs/shared)** - Cross-project architecture patterns and best practices
+- **[Projects](your-org/projects)** - Project-specific documentation and lessons learned
+- **[Company](your-org/company)** - Company-wide processes, standards, and tools
+- **[Shared](your-org/shared)** - Cross-project architecture patterns and best practices
@@ -1176,10 +1176,10 @@ This knowledge base captures:
All content is searchable and tagged for easy discovery across projects.
''',
        'tags': ['company', 'index'],
-        'description': 'Bandit Labs company knowledge base'
+        'description': 'Your Organization company knowledge base'
    },
    {
-        'path': 'hyper-hive-labs/projects',
+        'path': 'your-org/projects',
        'title': 'Projects',
        'content': '''# Project Documentation
@@ -1187,10 +1187,10 @@ Project-specific documentation and lessons learned.
## Active Projects
-- **[CuisineFlow](hyper-hive-labs/projects/cuisineflow)** - Main product
-- **[CuisineFlow-Site](hyper-hive-labs/projects/cuisineflow-site)** - Demo and customer gateway
-- **[Intuit-Engine](hyper-hive-labs/projects/intuit-engine)** - API aggregator service
-- **[HHL-Site](hyper-hive-labs/projects/hhl-site)** - Company website
+- **[My-Project](your-org/projects/my-project)** - Main product
+- **[My-Project-Site](your-org/projects/my-project-site)** - Demo and customer gateway
+- **[Intuit-Engine](your-org/projects/intuit-engine)** - API aggregator service
+- **[Company-Site](your-org/projects/company-site)** - Company website
Each project maintains:
- Lessons learned from sprints
@@ -1201,7 +1201,7 @@ Each project maintains:
        'description': 'Index of all project documentation'
    },
    {
-        'path': 'hyper-hive-labs/company',
+        'path': 'your-org/company',
        'title': 'Company',
        'content': '''# Company Documentation
@@ -1209,9 +1209,9 @@ Company-wide processes, standards, and tools.
## Sections
-- **[Processes](hyper-hive-labs/company/processes)** - Development workflows, onboarding, deployment
-- **[Standards](hyper-hive-labs/company/standards)** - Code style, API design, security practices
-- **[Tools](hyper-hive-labs/company/tools)** - Gitea, Wiki.js, Claude Code plugin guides
+- **[Processes](your-org/company/processes)** - Development workflows, onboarding, deployment
+- **[Standards](your-org/company/standards)** - Code style, API design, security practices
+- **[Tools](your-org/company/tools)** - Gitea, Wiki.js, Claude Code plugin guides
These standards apply to all projects and team members.
''',
@@ -1219,7 +1219,7 @@ These standards apply to all projects and team members.
        'description': 'Company processes and standards'
    },
    {
-        'path': 'hyper-hive-labs/shared',
+        'path': 'your-org/shared',
        'title': 'Shared Resources',
        'content': '''# Shared Resources
@@ -1227,9 +1227,9 @@ Cross-project architecture patterns, best practices, and technical knowledge.
## Sections
-- **[Architecture Patterns](hyper-hive-labs/shared/architecture-patterns)** - Microservices, service extraction, API design
-- **[Best Practices](hyper-hive-labs/shared/best-practices)** - Error handling, logging, testing strategies
-- **[Tech Stack](hyper-hive-labs/shared/tech-stack)** - Python ecosystem, Docker, CI/CD pipelines
+- **[Architecture Patterns](your-org/shared/architecture-patterns)** - Microservices, service extraction, API design
+- **[Best Practices](your-org/shared/best-practices)** - Error handling, logging, testing strategies
+- **[Tech Stack](your-org/shared/tech-stack)** - Python ecosystem, Docker, CI/CD pipelines
These patterns are distilled from lessons learned across all projects.
''',
@@ -1238,40 +1238,40 @@ These patterns are distilled from lessons learned across all projects.
    },
    # Project placeholders
    {
-        'path': 'hyper-hive-labs/projects/cuisineflow',
-        'title': 'CuisineFlow',
-        'content': '''# CuisineFlow
+        'path': 'your-org/projects/my-project',
+        'title': 'My-Project',
+        'content': '''# My-Project
Main product - recipe management and meal planning platform.
## Documentation
-- **[Lessons Learned](hyper-hive-labs/projects/cuisineflow/lessons-learned)** - Sprint retrospectives and insights
-- **[Architecture](hyper-hive-labs/projects/cuisineflow/documentation/architecture)** - System architecture
-- **[API](hyper-hive-labs/projects/cuisineflow/documentation/api)** - API documentation
+- **[Lessons Learned](your-org/projects/my-project/lessons-learned)** - Sprint retrospectives and insights
+- **[Architecture](your-org/projects/my-project/documentation/architecture)** - System architecture
+- **[API](your-org/projects/my-project/documentation/api)** - API documentation
Sprint lessons will be automatically captured here by the projman plugin.
''',
-        'tags': ['project', 'cuisineflow'],
-        'description': 'CuisineFlow project documentation'
+        'tags': ['project', 'my-project'],
+        'description': 'My-Project project documentation'
    },
    {
-        'path': 'hyper-hive-labs/projects/cuisineflow/lessons-learned',
-        'title': 'CuisineFlow - Lessons Learned',
-        'content': '''# CuisineFlow - Lessons Learned
+        'path': 'your-org/projects/my-project/lessons-learned',
+        'title': 'My-Project - Lessons Learned',
+        'content': '''# My-Project - Lessons Learned
-Sprint retrospectives and insights from CuisineFlow development.
+Sprint retrospectives and insights from My-Project development.
## Organization
-- **[Sprints](hyper-hive-labs/projects/cuisineflow/lessons-learned/sprints)** - Sprint-specific lessons
-- **[Patterns](hyper-hive-labs/projects/cuisineflow/lessons-learned/patterns)** - Recurring patterns and solutions
-- **[INDEX](hyper-hive-labs/projects/cuisineflow/lessons-learned/INDEX)** - Complete index with tags
+- **[Sprints](your-org/projects/my-project/lessons-learned/sprints)** - Sprint-specific lessons
+- **[Patterns](your-org/projects/my-project/lessons-learned/patterns)** - Recurring patterns and solutions
+- **[INDEX](your-org/projects/my-project/lessons-learned/INDEX)** - Complete index with tags
Lessons are automatically captured during sprint close via `/sprint-close` command.
''',
-        'tags': ['lessons-learned', 'cuisineflow'],
-        'description': 'CuisineFlow lessons learned index'
+        'tags': ['lessons-learned', 'my-project'],
+        'description': 'My-Project lessons learned index'
    }
]
@@ -1308,7 +1308,7 @@ Lessons are automatically captured during sprint close via `/sprint-close` comma
        sys.exit(1)
    else:
        print(f"\n✅ All pages created successfully!")
-        print(f"\nView at: https://wiki.hyperhivelabs.com/hyper-hive-labs")
+        print(f"\nView at: https://wiki.your-company.com/your-org")
if __name__ == '__main__':
@@ -1326,13 +1326,13 @@ python setup_wiki_structure.py
```
Initializing Wiki.js base structure...
============================================================
-✅ Connected to Wiki.js at https://wiki.hyperhivelabs.com/graphql
-Base path: /hyper-hive-labs
+✅ Connected to Wiki.js at https://wiki.your-company.com/graphql
+Base path: /your-org
-Creating: /hyper-hive-labs
+Creating: /your-org
✅ Created successfully
-Creating: /hyper-hive-labs/projects
+Creating: /your-org/projects
✅ Created successfully
...
@@ -1343,12 +1343,12 @@ Setup complete!
✅ All pages created successfully!
-View at: https://wiki.hyperhivelabs.com/hyper-hive-labs
+View at: https://wiki.your-company.com/your-org
```
**Post-Setup:**
After running the script:
-1. Visit https://wiki.hyperhivelabs.com/hyper-hive-labs to verify structure
+1. Visit https://wiki.your-company.com/your-org to verify structure
2. Add additional project directories as needed
3. The structure is now ready for projman plugin to use
@@ -1451,13 +1451,13 @@ if __name__ == '__main__':
curl -H "Authorization: Bearer YOUR_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"query":"{ pages { list { id title } } }"}' \
-  https://wiki.hyperhivelabs.com/graphql
+  https://wiki.your-company.com/graphql
```
**Issue:** Path not found
```bash
# Solution: Verify base structure exists
-# Check in Wiki.js web interface that /hyper-hive-labs path exists
+# Check in Wiki.js web interface that /your-org path exists
```
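The same connectivity probe can be scripted from Python instead of curl. A hedged sketch reusing the `pages { list }` query from the curl example — the endpoint constant is a placeholder, and the `pages_under` helper for filtering the response by base path is an assumption of this sketch, not an existing utility in the repo:

```python
import json
import urllib.request

WIKIJS_URL = "https://wiki.your-company.com/graphql"          # placeholder endpoint
QUERY = b'{"query":"{ pages { list { id path title } } }"}'   # same query as the curl probe


def pages_under(page_list: list[dict], base: str) -> list[dict]:
    """Filter a Wiki.js `pages.list` response to entries under `base`."""
    base = base.strip("/")
    return [p for p in page_list if p.get("path", "").strip("/").startswith(base)]


def fetch_pages(token: str) -> list[dict]:
    """POST the query with a bearer token; return the raw page list."""
    req = urllib.request.Request(
        WIKIJS_URL, data=QUERY,
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["data"]["pages"]["list"]
```

An empty result from `pages_under(fetch_pages(token), "/your-org")` would indicate the base structure has not been created yet.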
**Issue:** Tags not working


@@ -197,7 +197,7 @@ projman-pmo/
**Critical Differences from projman:**
- **NO** `GITEA_REPO` → operates on all repositories
-- **NO** `WIKIJS_PROJECT` → operates on entire `/hyper-hive-labs` namespace
+- **NO** `WIKIJS_PROJECT` → operates on entire `/your-org` namespace
- Same shared MCP servers at `../mcp-servers/`
### Environment Variables
@@ -282,10 +282,10 @@ User: /pmo-status
PMO: Projects Overview:
-CuisineFlow: Sprint in progress, 60% complete
-└─ Blocking: CuisineFlow-Site deployment
+Main-App: Sprint in progress, 60% complete
+└─ Blocking: Main-App-Site deployment
-I notice CuisineFlow-Site is waiting on API changes.
+I notice Main-App-Site is waiting on API changes.
Use projman in the cuisineflow-site repo to plan the sync work.
```
@@ -309,7 +309,7 @@ User: /pmo-status
PMO: Projects Overview
================
-CuisineFlow (main product)
+Main-App (main product)
├── Sprint: Intuit Engine Extraction
├── Status: In Progress (60%)
├── Issue: #47
@@ -317,13 +317,13 @@ PMO: Projects Overview
├── Next: API testing
└── Timeline: On track (1 week remaining)
-CuisineFlow-Site (demo)
+Main-App-Site (demo)
├── Sprint: Dashboard Updates
├── Status: Waiting on dependencies
├── Issue: #12
-├── Blockers: Depends on CuisineFlow #47 (API changes)
+├── Blockers: Depends on Main-App #47 (API changes)
├── Next: Deploy when API ready
-└── Timeline: Blocked (waiting on CuisineFlow)
+└── Timeline: Blocked (waiting on Main-App)
Intuit-Engine (service)
├── Sprint: Initial Setup
@@ -333,7 +333,7 @@ PMO: Projects Overview
├── Next: Service boundary definition
└── Timeline: Not started
-HHL-Site (marketing)
+Company-Site (marketing)
├── Sprint: Content Updates
├── Status: Complete
├── Issue: #8 (closed)
@@ -342,14 +342,14 @@ PMO: Projects Overview
└── Timeline: Ready for deployment
Cross-Project Insights:
-- CuisineFlow-Site blocked by CuisineFlow #47
+- Main-App-Site blocked by Main-App #47
- Intuit-Engine needs architectural guidance
-- HHL-Site ready for deployment
+- Company-Site ready for deployment
Suggested Actions:
-1. Prioritize CuisineFlow #47 completion (unblocks Site)
+1. Prioritize Main-App #47 completion (unblocks Site)
2. Schedule architecture review for Intuit-Engine
-3. Deploy HHL-Site content updates
+3. Deploy Company-Site content updates
```
**Success Criteria:**
@@ -375,15 +375,15 @@ PMO: Priority Analysis Across Projects
=================================
Critical (Blocks launches):
-1. CuisineFlow #47 - Intuit Engine Extraction
-   → Blocks: CuisineFlow v2.0 launch (Nov 15)
-   → Blocks: CuisineFlow-Site deployment
+1. Main-App #47 - Intuit Engine Extraction
+   → Blocks: Main-App v2.0 launch (Nov 15)
+   → Blocks: Main-App-Site deployment
   → Status: 60% complete, 1 week remaining
   → Recommendation: Maintain focus
High (Important features):
-2. CuisineFlow-Site #12 - Dashboard Updates
-   → Depends on: CuisineFlow #47
+2. Main-App-Site #12 - Dashboard Updates
+   → Depends on: Main-App #47
   → Status: Waiting
   → Recommendation: Begin planning, wait for #47
@@ -393,14 +393,14 @@ PMO: Priority Analysis Across Projects
   → Recommendation: Schedule architecture review
Medium (Enhancements):
-4. HHL-Site #8 - Content Updates
+4. Company-Site #8 - Content Updates
   → Status: Complete, ready to deploy
   → Recommendation: Deploy this week
Resource Analysis:
Leo (primary developer):
-- Working on: CuisineFlow #47 (Critical)
-- Next: CuisineFlow-Site #12 (High)
+- Working on: Main-App #47 (Critical)
+- Next: Main-App-Site #12 (High)
- Later: Intuit-Engine #3 (High)
⚠️ Potential Conflict:
@@ -438,27 +438,27 @@ PMO: Project Dependencies
 Intuit-Engine (service)
 ↓ (must complete before)
-CuisineFlow (main)
+Main-App (main)
 ↓ (API changes required by)
-CuisineFlow-Site (demo)
+Main-App-Site (demo)
 ↓ (deploy together)
 Customer VPS (deployments)
 Detailed Dependencies:
-1. CuisineFlow → Intuit-Engine
+1. Main-App → Intuit-Engine
 Type: Service extraction
 Status: In progress
 Blocker: Must complete #47 before v2.0 launch
 Impact: High (blocks launch)
-2. CuisineFlow → CuisineFlow-Site
+2. Main-App → Main-App-Site
 Type: Feature sync
 Status: Waiting
 Blocker: Dashboard updates need API changes from #47
 Impact: Medium (demo consistency)
-3. CuisineFlow → Customer VPS
+3. Main-App → Customer VPS
 Type: Deployment
 Status: Pending
 Blocker: v2.0 launch must complete before customer updates
@@ -466,16 +466,16 @@ PMO: Project Dependencies
 Deployment Order:
 1. Complete Intuit-Engine service extraction
-2. Deploy CuisineFlow v2.0 to production
-3. Sync CuisineFlow-Site demo
+2. Deploy Main-App v2.0 to production
+3. Sync Main-App-Site demo
 4. Update customer VPS instances
 Critical Path:
-Intuit-Engine (#3) → CuisineFlow (#47) → Site (#12) → Customer VPS
+Intuit-Engine (#3) → Main-App (#47) → Site (#12) → Customer VPS
 Estimated timeline: 3 weeks total
 Risks:
-- CuisineFlow #47 delay would cascade to all downstream
+- Main-App #47 delay would cascade to all downstream
 - Customer VPS updates have Nov 20 deadline
 ```
@@ -505,9 +505,9 @@ PMO: Resource Conflict Analysis
 Leo (Developer):
 Assigned to:
-- CuisineFlow #47: Intuit Engine extraction (Priority: Critical)
-- CuisineFlow-Site #12: Dashboard redesign (Priority: High)
-- HHL-Site #8: Content update (Priority: Medium)
+- Main-App #47: Intuit Engine extraction (Priority: Critical)
+- Main-App-Site #12: Dashboard redesign (Priority: High)
+- Company-Site #8: Content update (Priority: Medium)
 - Intuit-Engine #3: Service setup (Priority: High)
 Conflict Analysis:
@@ -517,15 +517,15 @@ PMO: Resource Conflict Analysis
 - Timeline: 2 weeks available
 Recommendations:
-1. FOCUS: CuisineFlow #47 (Critical, blocks launch)
+1. FOCUS: Main-App #47 (Critical, blocks launch)
 → Keep as primary focus
 → Estimated completion: 1 week
-2. QUEUE: CuisineFlow-Site #12 (High, depends on #47)
+2. QUEUE: Main-App-Site #12 (High, depends on #47)
 → Start after #47 completes
 → Natural sequencing
-3. DELEGATE: HHL-Site #8 (Medium, content work)
+3. DELEGATE: Company-Site #8 (Medium, content work)
 → Can be handled by marketing team
 → Low technical complexity
@@ -536,14 +536,14 @@ PMO: Resource Conflict Analysis
 Timeline Conflict:
 Without changes:
-Week 1-2: CuisineFlow #47
+Week 1-2: Main-App #47
 Week 3-4: Site #12 + Engine #3 (CONFLICT)
 With recommendations:
-Week 1-2: CuisineFlow #47 (FOCUS)
-Week 3: CuisineFlow-Site #12 (QUEUE)
+Week 1-2: Main-App #47 (FOCUS)
+Week 3: Main-App-Site #12 (QUEUE)
 Week 4: Intuit-Engine #3 (SCHEDULE)
-HHL-Site #8: Marketing (DELEGATE)
+Company-Site #8: Marketing (DELEGATE)
 Apply recommendations?
 ```
@@ -571,18 +571,18 @@ PMO: Deployment Schedule
 ===================
 Week of Nov 6:
-- CuisineFlow: Continue Sprint #47 (Intuit Engine)
-- HHL-Site: Deploy content updates (#8) ✅
+- Main-App: Continue Sprint #47 (Intuit Engine)
+- Company-Site: Deploy content updates (#8) ✅
 Week of Nov 13:
-- CuisineFlow: Complete Sprint #47 ⚠️
+- Main-App: Complete Sprint #47 ⚠️
 → Critical: Must complete by Nov 15 for launch
-- CuisineFlow-Site: Begin Dashboard updates (#12)
+- Main-App-Site: Begin Dashboard updates (#12)
 Week of Nov 20:
-- CuisineFlow: Deploy v2.0 to production 🚀
+- Main-App: Deploy v2.0 to production 🚀
 → Deployment window: Nov 20-22
-- CuisineFlow-Site: Sync demo with v2.0
+- Main-App-Site: Sync demo with v2.0
 - Customer VPS: Begin rollout (Deadline: Nov 20)
 Week of Nov 27:
@@ -590,19 +590,19 @@ PMO: Deployment Schedule
 - Intuit-Engine: Begin service implementation (#3)
 Critical Dates:
-- Nov 15: CuisineFlow v2.0 feature freeze
+- Nov 15: Main-App v2.0 feature freeze
 - Nov 20: Production deployment
 - Nov 20: Customer VPS deadline
 - Nov 22: Demo site sync
 Dependencies:
 ✅ Intuit-Engine service boundaries defined
-🔄 CuisineFlow #47 in progress (60%)
-CuisineFlow-Site #12 waiting
+🔄 Main-App #47 in progress (60%)
+Main-App-Site #12 waiting
 ⏳ Customer VPS rollout pending
 Risks:
-⚠️ CuisineFlow #47 delay would impact:
+⚠️ Main-App #47 delay would impact:
 - v2.0 launch date (Nov 20)
 - Customer VPS deadline (Nov 20)
 - Demo site sync (Nov 22)
@@ -632,7 +632,7 @@ PMO: Searching lessons across all projects...
 Found 5 relevant lessons:
-CuisineFlow:
+Main-App:
 - Sprint 08: Service boundary issues with shared DB
 Tags: #service-extraction #database #refactoring
 Key: Define boundaries at data model level first
@@ -646,13 +646,13 @@ PMO: Searching lessons across all projects...
 Tags: #service-extraction #dependencies
 Key: Watch for import cycles
-CuisineFlow-Site:
+Main-App-Site:
 - Sprint 05: Sync issues during backend refactor
 Tags: #sync #refactoring
 Key: Coordinate deployments during major changes
 Shared Patterns:
-- /hyper-hive-labs/shared/architecture-patterns/service-extraction
+- /your-org/shared/architecture-patterns/service-extraction
 Key: Service Extraction Playbook
 Common Themes:
@@ -672,15 +672,15 @@ User: /pmo-status
 PMO: [Shows status]
 Resource Analysis:
-- Leo: 3 active projects (CuisineFlow, Site, Engine)
-- Marketing: 1 project (HHL-Site)
+- Leo: 3 active projects (Main-App, Site, Engine)
+- Marketing: 1 project (Company-Site)
 Bottleneck detected: Leo is on critical path for all technical work
 Recommendations:
 1. Prioritize work that unblocks others
 2. Consider hiring/contracting for parallel work
-3. Delegate non-critical work (HHL-Site content)
+3. Delegate non-critical work (Company-Site content)
 Would you like a detailed resource allocation report?
 ```
@@ -690,18 +690,18 @@ PMO: [Shows status]
 ```
 User: Plan the v2.0 release across all affected projects
-PMO: Release Coordination: CuisineFlow v2.0
+PMO: Release Coordination: Main-App v2.0
 ======================================
 Affected Projects:
-1. CuisineFlow (main product)
+1. Main-App (main product)
 2. Intuit-Engine (new service)
-3. CuisineFlow-Site (demo)
+3. Main-App-Site (demo)
 4. Customer VPS instances
 Pre-Release Checklist:
 ✅ Intuit-Engine service deployed
-🔄 CuisineFlow #47 complete (60%)
+🔄 Main-App #47 complete (60%)
 ⏳ Integration tests pending
 ⏳ Demo site updates pending
 ⏳ Customer migration scripts ready
@@ -712,12 +712,12 @@ PMO: Release Coordination: CuisineFlow v2.0
 → Duration: 2 hours
 → Rollback plan: Ready
-2. Deploy CuisineFlow v2.0 (main app)
+2. Deploy Main-App v2.0 (main app)
 → Date: Nov 20
 → Duration: 4 hours
 → Rollback plan: Ready
-3. Sync CuisineFlow-Site (demo)
+3. Sync Main-App-Site (demo)
 → Date: Nov 22
 → Duration: 1 hour
 → Dependencies: Step 2 complete
@@ -767,11 +767,11 @@ shared_patterns = await wikijs.get_shared_docs("architecture-patterns")
 User: Create a shared pattern document for service extraction
 PMO: I'll create a pattern document based on lessons from:
-- CuisineFlow Sprint 08
-- CuisineFlow Sprint 12
+- Main-App Sprint 08
+- Main-App Sprint 12
 - Intuit-Engine Sprint 02
-Creating: /hyper-hive-labs/shared/architecture-patterns/service-extraction
+Creating: /your-org/shared/architecture-patterns/service-extraction
 Service Extraction Pattern
 =========================
@@ -855,7 +855,7 @@ PMO: I'll create a pattern document based on lessons from:
 **Deliverable:** Validated PMO plugin integrated with projman
 **Tasks:**
-1. Test with multiple projects (CuisineFlow, Site, Engine, HHL-Site)
+1. Test with multiple projects (Main-App, Site, Engine, Company-Site)
 2. Simulate deployment coordination scenarios
 3. Test priority conflict resolution
 4. Validate dependency tracking
@@ -911,10 +911,10 @@ PMO: I'll create a pattern document based on lessons from:
 ```
 User: /pmo-status
-PMO: CuisineFlow-Site: Waiting on dependencies
-Blocked by: CuisineFlow #47 (API changes)
-Recommendation: Focus resources on CuisineFlow #47 to unblock Site work
+PMO: Main-App-Site: Waiting on dependencies
+Blocked by: Main-App #47 (API changes)
+Recommendation: Focus resources on Main-App #47 to unblock Site work
 ```
 ### Example 2: Resource Conflict Resolution
@@ -926,8 +926,8 @@ PMO: Leo assigned to 4 tasks across 3 projects
 Estimated: 4 weeks work in 2 weeks timeline
 Recommendations:
-- Maintain focus on Critical priority (CuisineFlow #47)
-- Delegate content work (HHL-Site #8) to marketing
+- Maintain focus on Critical priority (Main-App #47)
+- Delegate content work (Company-Site #8) to marketing
 - Schedule architectural review (Intuit-Engine #3) as team session
 ```
@@ -936,13 +936,13 @@ PMO: Leo assigned to 4 tasks across 3 projects
 ```
 User: /pmo-schedule
-PMO: Critical Path: Engine → CuisineFlow → Site → Customer VPS
+PMO: Critical Path: Engine → Main-App → Site → Customer VPS
 Deadline: Nov 20 (Customer VPS commitment)
 Current: Week 1 of 3
 Status: On track
-Risk: If CuisineFlow #47 delayed, entire timeline at risk
+Risk: If Main-App #47 delayed, entire timeline at risk
 Mitigation: Daily progress monitoring
 ```
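The dependency chains and deployment ordering shown in these transcripts amount to a topological sort over project prerequisites. A minimal sketch of how a PMO tool might derive a valid deployment order from declared dependencies — the project names and edges here are illustrative, taken from the examples above, not from any real API:

```python
from graphlib import TopologicalSorter

# Illustrative dependency map: each project lists what must complete first.
dependencies = {
    "Main-App": {"Intuit-Engine"},
    "Main-App-Site": {"Main-App"},
    "Customer-VPS": {"Main-App", "Main-App-Site"},
}

def deployment_order(deps: dict) -> list:
    """Return a deployment order in which prerequisites always come first."""
    return list(TopologicalSorter(deps).static_order())

print(deployment_order(dependencies))
```

`graphlib.TopologicalSorter` (Python 3.9+) also raises `CycleError` on circular dependencies, which is exactly the failure mode a dependency tracker needs to surface.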

View File

@@ -925,13 +925,13 @@ Orchestrator: I'll create a lesson in Wiki.js:
 #service-extraction #api #database #refactoring #claude-code-loops
 Creating page in Wiki.js... ✅
-Path: /hyper-hive-labs/projects/cuisineflow/lessons-learned/sprints/sprint-16-intuit-engine
+Path: /your-org/projects/my-project/lessons-learned/sprints/sprint-16-intuit-engine
 Updating INDEX.md... ✅
 Closing sprint issue #47... ✅
-View lesson at: https://wiki.hyperhivelabs.com/hyper-hive-labs/projects/cuisineflow/lessons-learned/sprints/sprint-16-intuit-engine
+View lesson at: https://wiki.your-company.com/your-org/projects/my-project/lessons-learned/sprints/sprint-16-intuit-engine
 ```
 **Success Criteria:**

View File

@@ -155,9 +155,9 @@ GITEA_API_TOKEN=your_token
 GITEA_OWNER=bandit
 # ~/.config/claude/wikijs.env
-WIKIJS_API_URL=https://wiki.hyperhivelabs.com/graphql
+WIKIJS_API_URL=https://wiki.your-company.com/graphql
 WIKIJS_API_TOKEN=your_token
-WIKIJS_BASE_PATH=/hyper-hive-labs
+WIKIJS_BASE_PATH=/your-org
 ```
 **Project-Level:**
@@ -271,18 +271,18 @@ WIKIJS_PROJECT=projects/cuisineflow
 ## Wiki.js Structure
 ```
-Wiki.js: https://wiki.hyperhivelabs.com
+Wiki.js: https://wiki.your-company.com
-└── /hyper-hive-labs/
+└── /your-org/
     ├── projects/                # Project-specific
-    │   ├── cuisineflow/
+    │   ├── project-a/
     │   │   ├── lessons-learned/
     │   │   │   ├── sprints/
     │   │   │   ├── patterns/
     │   │   │   └── INDEX.md
     │   │   └── documentation/
-    │   ├── cuisineflow-site/
-    │   ├── intuit-engine/
-    │   └── hhl-site/
+    │   ├── project-b/
+    │   ├── project-c/
+    │   └── company-site/
     ├── company/                 # Company-wide
     │   ├── processes/
     │   ├── standards/
@@ -373,9 +373,9 @@ EOF
 # Wiki.js config
 cat > ~/.config/claude/wikijs.env << EOF
-WIKIJS_API_URL=https://wiki.hyperhivelabs.com/graphql
+WIKIJS_API_URL=https://wiki.your-company.com/graphql
 WIKIJS_API_TOKEN=your_wikijs_token
-WIKIJS_BASE_PATH=/hyper-hive-labs
+WIKIJS_BASE_PATH=/your-org
 EOF
 # Secure files
@@ -565,7 +565,7 @@ Previous attempts failed due to:
 ### Immediate Actions
 1. **Set up system configuration** (Gitea + Wiki.js tokens)
-2. **Create Wiki.js base structure** at `/hyper-hive-labs`
+2. **Create Wiki.js base structure** at `/your-org`
 3. **Begin Phase 1.1a** - Gitea MCP Server implementation
 4. **Begin Phase 1.1b** - Wiki.js MCP Server implementation
@@ -597,10 +597,10 @@ These decisions were finalized before development:
 - **Minimum:** Python 3.10.0
 ### 2. Wiki.js Base Structure: Needs Creation
-- **Status:** `/hyper-hive-labs` structure does NOT exist yet
+- **Status:** `/your-org` structure does NOT exist yet
 - **Action:** Run `setup_wiki_structure.py` during Phase 1.1b
 - **Script:** See MCP-WIKIJS.md for complete setup script
-- **Post-setup:** Verify at https://wiki.hyperhivelabs.com/hyper-hive-labs
+- **Post-setup:** Verify at https://wiki.your-company.com/your-org
 ### 3. Testing Strategy: Both Mocks and Real APIs
 - **Unit tests:** Use mocks for fast feedback during development
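The two-level configuration described in this file — system-level env files under `~/.config/claude/` overridden by project-level values — reduces to a simple dict merge. A sketch under that assumption; the file names and keys mirror this document's examples and are not a fixed API:

```python
import os

def load_env_file(path: str) -> dict:
    """Parse KEY=value lines from an env file, skipping blanks and comments."""
    values = {}
    if not os.path.exists(path):
        return values
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if line and not line.startswith("#") and "=" in line:
                key, _, value = line.partition("=")
                values[key.strip()] = value.strip()
    return values

def load_config(system_path: str, project_path: str) -> dict:
    """Project-level values override system-level ones."""
    config = load_env_file(system_path)
    config.update(load_env_file(project_path))
    return config
```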

View File

@@ -21,7 +21,6 @@ class GiteaConfig:
     def __init__(self):
         self.api_url: Optional[str] = None
         self.api_token: Optional[str] = None
-        self.owner: Optional[str] = None
         self.repo: Optional[str] = None
         self.mode: str = 'project'
@@ -31,7 +30,7 @@ class GiteaConfig:
         Project-level configuration overrides system-level.
         Returns:
-            Dict containing api_url, api_token, owner, repo, mode
+            Dict containing api_url, api_token, repo, mode
         Raises:
             FileNotFoundError: If system config is missing
@@ -58,8 +57,7 @@ class GiteaConfig:
         # Extract values
         self.api_url = os.getenv('GITEA_API_URL')
         self.api_token = os.getenv('GITEA_API_TOKEN')
-        self.owner = os.getenv('GITEA_OWNER')
-        self.repo = os.getenv('GITEA_REPO')  # Optional for PMO
+        self.repo = os.getenv('GITEA_REPO')  # Optional, must be owner/repo format
         # Detect mode
         if self.repo:
@@ -75,7 +73,6 @@ class GiteaConfig:
         return {
             'api_url': self.api_url,
             'api_token': self.api_token,
-            'owner': self.owner,
             'repo': self.repo,
             'mode': self.mode
         }
@@ -89,8 +86,7 @@ class GiteaConfig:
""" """
required = { required = {
'GITEA_API_URL': self.api_url, 'GITEA_API_URL': self.api_url,
'GITEA_API_TOKEN': self.api_token, 'GITEA_API_TOKEN': self.api_token
'GITEA_OWNER': self.owner
} }
missing = [key for key, value in required.items() if not value] missing = [key for key, value in required.items() if not value]
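The validation change in this hunk boils down to collecting unset required keys and failing with their names. A standalone sketch of that check; the key names come from the diff, while the function name and exception message are illustrative:

```python
def validate_required(api_url, api_token):
    """Raise ValueError naming any required settings that are unset."""
    required = {
        'GITEA_API_URL': api_url,
        'GITEA_API_TOKEN': api_token,
    }
    # Collect every key whose value is falsy (None or empty string).
    missing = [key for key, value in required.items() if not value]
    if missing:
        raise ValueError(f"Missing required configuration: {', '.join(missing)}")
```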

View File

@@ -26,8 +26,7 @@ class GiteaClient:
         self.base_url = config_dict['api_url']
         self.token = config_dict['api_token']
-        self.owner = config_dict['owner']
-        self.repo = config_dict.get('repo')  # Optional for PMO
+        self.repo = config_dict.get('repo')  # Optional default repo in owner/repo format
         self.mode = config_dict['mode']
         self.session = requests.Session()
@@ -36,7 +35,15 @@ class GiteaClient:
             'Content-Type': 'application/json'
         })
-        logger.info(f"Gitea client initialized for {self.owner} in {self.mode} mode")
+        logger.info(f"Gitea client initialized in {self.mode} mode")
+
+    def _parse_repo(self, repo: Optional[str] = None) -> tuple:
+        """Parse owner/repo from input. Always requires 'owner/repo' format."""
+        target = repo or self.repo
+        if not target or '/' not in target:
+            raise ValueError("Use 'owner/repo' format (e.g. 'bandit/support-claude-mktplace')")
+        parts = target.split('/', 1)
+        return parts[0], parts[1]
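The new `_parse_repo` helper is easy to exercise in isolation. A module-level sketch of the same logic (a free function instead of a method, with `default` standing in for the configured `self.repo`); the error message matches the diff:

```python
from typing import Optional, Tuple

def parse_repo(repo: Optional[str], default: Optional[str] = None) -> Tuple[str, str]:
    """Split an 'owner/repo' string, falling back to a configured default."""
    target = repo or default
    if not target or '/' not in target:
        raise ValueError("Use 'owner/repo' format (e.g. 'bandit/support-claude-mktplace')")
    # Split only on the first slash, so nested paths stay in the repo part.
    owner, _, name = target.partition('/')
    return owner, name
```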
     def list_issues(
         self,
@@ -50,26 +57,17 @@ class GiteaClient:
         Args:
             state: Issue state (open, closed, all)
             labels: Filter by labels
-            repo: Override configured repo (for PMO multi-repo)
+            repo: Repository in 'owner/repo' format
         Returns:
             List of issue dictionaries
-        Raises:
-            ValueError: If repository not specified
-            requests.HTTPError: If API request fails
         """
-        target_repo = repo or self.repo
-        if not target_repo:
-            raise ValueError("Repository not specified")
-        url = f"{self.base_url}/repos/{self.owner}/{target_repo}/issues"
+        owner, target_repo = self._parse_repo(repo)
+        url = f"{self.base_url}/repos/{owner}/{target_repo}/issues"
         params = {'state': state}
         if labels:
             params['labels'] = ','.join(labels)
-        logger.info(f"Listing issues from {self.owner}/{target_repo} with state={state}")
+        logger.info(f"Listing issues from {owner}/{target_repo} with state={state}")
         response = self.session.get(url, params=params)
         response.raise_for_status()
         return response.json()
@@ -79,26 +77,10 @@ class GiteaClient:
         issue_number: int,
         repo: Optional[str] = None
     ) -> Dict:
-        """
-        Get specific issue details.
-        Args:
-            issue_number: Issue number
-            repo: Override configured repo (for PMO multi-repo)
-        Returns:
-            Issue dictionary
-        Raises:
-            ValueError: If repository not specified
-            requests.HTTPError: If API request fails
-        """
-        target_repo = repo or self.repo
-        if not target_repo:
-            raise ValueError("Repository not specified")
-        url = f"{self.base_url}/repos/{self.owner}/{target_repo}/issues/{issue_number}"
-        logger.info(f"Getting issue #{issue_number} from {self.owner}/{target_repo}")
+        """Get specific issue details."""
+        owner, target_repo = self._parse_repo(repo)
+        url = f"{self.base_url}/repos/{owner}/{target_repo}/issues/{issue_number}"
+        logger.info(f"Getting issue #{issue_number} from {owner}/{target_repo}")
         response = self.session.get(url)
         response.raise_for_status()
         return response.json()
@@ -110,69 +92,30 @@ class GiteaClient:
         labels: Optional[List[str]] = None,
         repo: Optional[str] = None
     ) -> Dict:
-        """
-        Create a new issue in Gitea.
-        Args:
-            title: Issue title
-            body: Issue description
-            labels: List of label names (will be converted to IDs)
-            repo: Override configured repo (for PMO multi-repo)
-        Returns:
-            Created issue dictionary
-        Raises:
-            ValueError: If repository not specified
-            requests.HTTPError: If API request fails
-        """
-        target_repo = repo or self.repo
-        if not target_repo:
-            raise ValueError("Repository not specified")
-        url = f"{self.base_url}/repos/{self.owner}/{target_repo}/issues"
-        data = {
-            'title': title,
-            'body': body
-        }
+        """Create a new issue in Gitea."""
+        owner, target_repo = self._parse_repo(repo)
+        url = f"{self.base_url}/repos/{owner}/{target_repo}/issues"
+        data = {'title': title, 'body': body}
         if labels:
-            # Convert label names to IDs (Gitea expects integer IDs, not strings)
-            label_ids = self._resolve_label_ids(labels, target_repo)
+            label_ids = self._resolve_label_ids(labels, owner, target_repo)
             data['labels'] = label_ids
-        logger.info(f"Creating issue in {self.owner}/{target_repo}: {title}")
+        logger.info(f"Creating issue in {owner}/{target_repo}: {title}")
         response = self.session.post(url, json=data)
         response.raise_for_status()
         return response.json()

-    def _resolve_label_ids(self, label_names: List[str], repo: str) -> List[int]:
-        """
-        Convert label names to label IDs.
-        Args:
-            label_names: List of label names (e.g., ['Type/Feature', 'Priority/High'])
-            repo: Repository name
-        Returns:
-            List of label IDs
-        """
-        # Fetch all available labels (org + repo)
-        org_labels = self.get_org_labels()
-        repo_labels = self.get_labels(repo)
+    def _resolve_label_ids(self, label_names: List[str], owner: str, repo: str) -> List[int]:
+        """Convert label names to label IDs."""
+        org_labels = self.get_org_labels(owner)
+        repo_labels = self.get_labels(f"{owner}/{repo}")
         all_labels = org_labels + repo_labels
-        # Build name -> ID mapping
         label_map = {label['name']: label['id'] for label in all_labels}
-        # Resolve IDs
         label_ids = []
         for name in label_names:
             if name in label_map:
                 label_ids.append(label_map[name])
             else:
-                logger.warning(f"Label '{name}' not found in Gitea, skipping")
+                logger.warning(f"Label '{name}' not found, skipping")
         return label_ids
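The label-resolution step above is a name-to-ID lookup over the union of org and repo labels, with unknown names skipped. A self-contained sketch with stub label data in place of the real API responses:

```python
import logging

logger = logging.getLogger(__name__)

def resolve_label_ids(label_names, org_labels, repo_labels):
    """Map label names to IDs; unknown names are skipped with a warning."""
    # Repo labels come second, so they win on a name collision.
    label_map = {label['name']: label['id'] for label in org_labels + repo_labels}
    label_ids = []
    for name in label_names:
        if name in label_map:
            label_ids.append(label_map[name])
        else:
            logger.warning("Label '%s' not found, skipping", name)
    return label_ids
```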
     def update_issue(
@@ -184,31 +127,10 @@ class GiteaClient:
         labels: Optional[List[str]] = None,
         repo: Optional[str] = None
     ) -> Dict:
-        """
-        Update existing issue.
-        Args:
-            issue_number: Issue number
-            title: New title (optional)
-            body: New body (optional)
-            state: New state - 'open' or 'closed' (optional)
-            labels: New labels (optional)
-            repo: Override configured repo (for PMO multi-repo)
-        Returns:
-            Updated issue dictionary
-        Raises:
-            ValueError: If repository not specified
-            requests.HTTPError: If API request fails
-        """
-        target_repo = repo or self.repo
-        if not target_repo:
-            raise ValueError("Repository not specified")
-        url = f"{self.base_url}/repos/{self.owner}/{target_repo}/issues/{issue_number}"
+        """Update existing issue. Repo must be 'owner/repo' format."""
+        owner, target_repo = self._parse_repo(repo)
+        url = f"{self.base_url}/repos/{owner}/{target_repo}/issues/{issue_number}"
         data = {}
         if title is not None:
             data['title'] = title
         if body is not None:
@@ -217,8 +139,7 @@ class GiteaClient:
             data['state'] = state
         if labels is not None:
             data['labels'] = labels
-        logger.info(f"Updating issue #{issue_number} in {self.owner}/{target_repo}")
+        logger.info(f"Updating issue #{issue_number} in {owner}/{target_repo}")
         response = self.session.patch(url, json=data)
         response.raise_for_status()
         return response.json()
@@ -229,131 +150,62 @@ class GiteaClient:
         comment: str,
         repo: Optional[str] = None
     ) -> Dict:
-        """
-        Add comment to issue.
-        Args:
-            issue_number: Issue number
-            comment: Comment text
-            repo: Override configured repo (for PMO multi-repo)
-        Returns:
-            Created comment dictionary
-        Raises:
-            ValueError: If repository not specified
-            requests.HTTPError: If API request fails
-        """
-        target_repo = repo or self.repo
-        if not target_repo:
-            raise ValueError("Repository not specified")
-        url = f"{self.base_url}/repos/{self.owner}/{target_repo}/issues/{issue_number}/comments"
+        """Add comment to issue. Repo must be 'owner/repo' format."""
+        owner, target_repo = self._parse_repo(repo)
+        url = f"{self.base_url}/repos/{owner}/{target_repo}/issues/{issue_number}/comments"
         data = {'body': comment}
-        logger.info(f"Adding comment to issue #{issue_number} in {self.owner}/{target_repo}")
+        logger.info(f"Adding comment to issue #{issue_number} in {owner}/{target_repo}")
         response = self.session.post(url, json=data)
         response.raise_for_status()
         return response.json()

-    def get_labels(
-        self,
-        repo: Optional[str] = None
-    ) -> List[Dict]:
-        """
-        Get all labels from repository.
-        Args:
-            repo: Override configured repo (for PMO multi-repo)
-        Returns:
-            List of label dictionaries
-        Raises:
-            ValueError: If repository not specified
-            requests.HTTPError: If API request fails
-        """
-        target_repo = repo or self.repo
-        if not target_repo:
-            raise ValueError("Repository not specified")
-        url = f"{self.base_url}/repos/{self.owner}/{target_repo}/labels"
-        logger.info(f"Getting labels from {self.owner}/{target_repo}")
+    def get_labels(self, repo: Optional[str] = None) -> List[Dict]:
+        """Get all labels from repository. Repo must be 'owner/repo' format."""
+        owner, target_repo = self._parse_repo(repo)
+        url = f"{self.base_url}/repos/{owner}/{target_repo}/labels"
+        logger.info(f"Getting labels from {owner}/{target_repo}")
         response = self.session.get(url)
         response.raise_for_status()
         return response.json()

-    def get_org_labels(self) -> List[Dict]:
-        """
-        Get organization-level labels.
-        Returns:
-            List of organization label dictionaries
-        Raises:
-            requests.HTTPError: If API request fails
-        """
-        url = f"{self.base_url}/orgs/{self.owner}/labels"
-        logger.info(f"Getting organization labels for {self.owner}")
+    def get_org_labels(self, org: str) -> List[Dict]:
+        """Get organization-level labels. Org is the organization name."""
+        url = f"{self.base_url}/orgs/{org}/labels"
+        logger.info(f"Getting organization labels for {org}")
         response = self.session.get(url)
         response.raise_for_status()
         return response.json()

-    # PMO-specific methods
-    def list_repos(self) -> List[Dict]:
-        """
-        List all repositories in organization (PMO mode).
-        Returns:
-            List of repository dictionaries
-        Raises:
-            requests.HTTPError: If API request fails
-        """
-        url = f"{self.base_url}/orgs/{self.owner}/repos"
-        logger.info(f"Listing all repositories for organization {self.owner}")
+    def list_repos(self, org: str) -> List[Dict]:
+        """List all repositories in organization. Org is the organization name."""
+        url = f"{self.base_url}/orgs/{org}/repos"
+        logger.info(f"Listing all repositories for organization {org}")
         response = self.session.get(url)
         response.raise_for_status()
         return response.json()

     def aggregate_issues(
         self,
+        org: str,
         state: str = 'open',
         labels: Optional[List[str]] = None
     ) -> Dict[str, List[Dict]]:
-        """
-        Fetch issues across all repositories (PMO mode).
-        Returns dict keyed by repository name.
-        Args:
-            state: Issue state (open, closed, all)
-            labels: Filter by labels
-        Returns:
-            Dictionary mapping repository names to issue lists
-        Raises:
-            requests.HTTPError: If API request fails
-        """
-        repos = self.list_repos()
+        """Fetch issues across all repositories in org."""
+        repos = self.list_repos(org)
         aggregated = {}
         logger.info(f"Aggregating issues across {len(repos)} repositories")
         for repo in repos:
             repo_name = repo['name']
             try:
                 issues = self.list_issues(
                     state=state,
                     labels=labels,
-                    repo=repo_name
+                    repo=f"{org}/{repo_name}"
                 )
                 if issues:
                     aggregated[repo_name] = issues
                     logger.info(f"Found {len(issues)} issues in {repo_name}")
             except Exception as e:
-                # Log error but continue with other repos
                 logger.error(f"Error fetching issues from {repo_name}: {e}")
         return aggregated
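Aggregation is a fan-out over `list_repos` with per-repo error isolation: one failing repository must not abort the whole scan. A sketch using stub callables in place of the real Gitea client, so the shape of the result dict is visible:

```python
def aggregate_issues(list_repos, list_issues, org):
    """Collect non-empty issue lists keyed by repo name; errors skip that repo."""
    aggregated = {}
    for repo in list_repos(org):
        name = repo['name']
        try:
            issues = list_issues(repo=f"{org}/{name}")
            if issues:
                aggregated[name] = issues
        except Exception:
            # One bad repo must not abort the whole aggregation.
            continue
    return aggregated
```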

View File

@@ -15,7 +15,10 @@ from .gitea_client import GiteaClient
 from .tools.issues import IssueTools
 from .tools.labels import LabelTools
+# Suppress noisy MCP validation warnings on stderr
 logging.basicConfig(level=logging.INFO)
+logging.getLogger("root").setLevel(logging.ERROR)
+logging.getLogger("mcp").setLevel(logging.ERROR)
 logger = logging.getLogger(__name__)
@@ -216,6 +219,10 @@ class GiteaMCPServer:
                 inputSchema={
                     "type": "object",
                     "properties": {
+                        "org": {
+                            "type": "string",
+                            "description": "Organization name (e.g. 'bandit')"
+                        },
                         "state": {
                             "type": "string",
                             "enum": ["open", "closed", "all"],
@@ -227,7 +234,8 @@ class GiteaMCPServer:
                             "items": {"type": "string"},
                             "description": "Filter by labels"
                         }
-                    }
+                    },
+                    "required": ["org"]
                 }
             )
         ]
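Making `org` required in the tool's input schema means a call without it should be rejected before it ever reaches the Gitea client. A dependency-free sketch of that pre-flight check; the schema literal mirrors the hunk above, while the helper name is illustrative:

```python
SCHEMA = {
    "type": "object",
    "properties": {
        "org": {"type": "string"},
        "state": {"type": "string", "enum": ["open", "closed", "all"]},
        "labels": {"type": "array", "items": {"type": "string"}},
    },
    "required": ["org"],
}

def missing_required(arguments: dict, schema: dict = SCHEMA) -> list:
    """Return the names of required properties absent from a tool call."""
    return [key for key in schema.get("required", []) if key not in arguments]
```

In a real server this check would be backed by a full JSON Schema validator, but the required-field case is the one this hunk changes.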

View File

@@ -245,35 +245,17 @@ class IssueTools:
     async def aggregate_issues(
         self,
+        org: str,
         state: str = 'open',
         labels: Optional[List[str]] = None
     ) -> Dict[str, List[Dict]]:
-        """
-        Aggregate issues across all repositories (PMO mode, async wrapper).
-        Args:
-            state: Issue state (open, closed, all)
-            labels: Filter by labels
-        Returns:
-            Dictionary mapping repository names to issue lists
-        Raises:
-            ValueError: If not in company mode
-            PermissionError: If operation not allowed on current branch
-        """
-        if self.gitea.mode != 'company':
-            raise ValueError("aggregate_issues only available in company mode")
+        """Aggregate issues across all repositories in org."""
         if not self._check_branch_permissions('aggregate_issues'):
             branch = self._get_current_branch()
-            raise PermissionError(
-                f"Cannot aggregate issues on branch '{branch}'. "
-                f"Switch to a development branch."
-            )
+            raise PermissionError(f"Cannot aggregate issues on branch '{branch}'.")
         loop = asyncio.get_event_loop()
         return await loop.run_in_executor(
             None,
-            lambda: self.gitea.aggregate_issues(state, labels)
+            lambda: self.gitea.aggregate_issues(org, state, labels)
         )
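The new required `org` argument threads through to the synchronous client via `run_in_executor`. A runnable sketch of the same pattern with a stand-in client (`FakeGitea` and its canned return data are hypothetical):

```python
import asyncio
from typing import Dict, List, Optional

class FakeGitea:
    """Hypothetical stand-in for the blocking GiteaClient."""
    def aggregate_issues(self, org: str, state: str,
                         labels: Optional[List[str]]) -> Dict[str, list]:
        # A real client would page through the org's repos over HTTP.
        return {f"{org}/repo-a": [{"title": "example", "state": state}]}

async def aggregate_issues(gitea: FakeGitea, org: str,
                           state: str = "open",
                           labels: Optional[List[str]] = None) -> Dict[str, list]:
    # Same shape as the tool above: run the sync call on a worker thread
    # so the event loop is not blocked.
    loop = asyncio.get_event_loop()
    return await loop.run_in_executor(
        None,
        lambda: gitea.aggregate_issues(org, state, labels)
    )

result = asyncio.run(aggregate_issues(FakeGitea(), "bandit"))
print(result)  # {'bandit/repo-a': [{'title': 'example', 'state': 'open'}]}
```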

View File

@@ -27,31 +27,24 @@ class LabelTools:
         self.gitea = gitea_client
     async def get_labels(self, repo: Optional[str] = None) -> Dict[str, List[Dict]]:
-        """
-        Get all labels (org + repo) (async wrapper).
-        Args:
-            repo: Override configured repo (for PMO multi-repo)
-        Returns:
-            Dictionary with 'org' and 'repo' label lists
-        """
+        """Get all labels (org + repo). Repo must be 'owner/repo' format."""
         loop = asyncio.get_event_loop()
-        # Get org labels
+        target_repo = repo or self.gitea.repo
+        if not target_repo or '/' not in target_repo:
+            raise ValueError("Use 'owner/repo' format (e.g. 'bandit/support-claude-mktplace')")
+        org = target_repo.split('/')[0]
         org_labels = await loop.run_in_executor(
             None,
-            self.gitea.get_org_labels
+            lambda: self.gitea.get_org_labels(org)
         )
-        # Get repo labels if repo is specified
-        repo_labels = []
-        if repo or self.gitea.repo:
-            target_repo = repo or self.gitea.repo
-            repo_labels = await loop.run_in_executor(
-                None,
-                lambda: self.gitea.get_labels(target_repo)
-            )
+        repo_labels = await loop.run_in_executor(
+            None,
+            lambda: self.gitea.get_labels(target_repo)
+        )
         return {
             'organization': org_labels,
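The rewritten `get_labels` derives the organization from the `owner/repo` string and rejects anything else. That validation is self-contained enough to sketch (the function name is illustrative):

```python
def split_owner(target_repo: str) -> str:
    # Same guard as get_labels: require the 'owner/repo' form.
    if not target_repo or "/" not in target_repo:
        raise ValueError(
            "Use 'owner/repo' format (e.g. 'bandit/support-claude-mktplace')"
        )
    return target_repo.split("/")[0]

print(split_owner("bandit/support-claude-mktplace"))  # bandit

try:
    split_owner("just-a-repo")
except ValueError as exc:
    print(exc)
```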

View File

@@ -23,7 +23,10 @@ from .tools.vpn import VPNTools
 from .tools.wireless import WirelessTools
 from .tools.extras import ExtrasTools
+# Suppress noisy MCP validation warnings on stderr
 logging.basicConfig(level=logging.INFO)
+logging.getLogger("root").setLevel(logging.ERROR)
+logging.getLogger("mcp").setLevel(logging.ERROR)
 logger = logging.getLogger(__name__)

View File

@@ -110,7 +110,7 @@ cat > ~/.config/claude/wikijs.env << 'EOF'
# Wiki.js API Configuration # Wiki.js API Configuration
WIKIJS_API_URL=http://wikijs.hotport/graphql WIKIJS_API_URL=http://wikijs.hotport/graphql
WIKIJS_API_TOKEN=your_api_token_here WIKIJS_API_TOKEN=your_api_token_here
WIKIJS_BASE_PATH=/hyper-hive-labs WIKIJS_BASE_PATH=/your-org
EOF EOF
chmod 600 ~/.config/claude/wikijs.env chmod 600 ~/.config/claude/wikijs.env
@@ -216,7 +216,7 @@ The MCP server is referenced in plugin `.mcp.json`:
|----------|-------------|---------| |----------|-------------|---------|
| `WIKIJS_API_URL` | Wiki.js GraphQL endpoint | `http://wiki.example.com/graphql` | | `WIKIJS_API_URL` | Wiki.js GraphQL endpoint | `http://wiki.example.com/graphql` |
| `WIKIJS_API_TOKEN` | API authentication token (JWT) | `eyJhbGciOiJSUzI1...` | | `WIKIJS_API_TOKEN` | API authentication token (JWT) | `eyJhbGciOiJSUzI1...` |
| `WIKIJS_BASE_PATH` | Base path in Wiki.js | `/hyper-hive-labs` | | `WIKIJS_BASE_PATH` | Base path in Wiki.js | `/your-org` |
### Optional Variables ### Optional Variables
@@ -234,7 +234,7 @@ The MCP server is referenced in plugin `.mcp.json`:
### Recommended Organization ### Recommended Organization
``` ```
/hyper-hive-labs/ # Base path /your-org/ # Base path
├── projects/ # Project-specific ├── projects/ # Project-specific
│ ├── your-project/ │ ├── your-project/
│ │ ├── lessons-learned/ │ │ ├── lessons-learned/

View File

@@ -52,7 +52,7 @@ mkdir -p ~/.config/claude
 cat > ~/.config/claude/wikijs.env << 'EOF'
 WIKIJS_API_URL=http://wikijs.hotport/graphql
 WIKIJS_API_TOKEN=your_real_token_here
-WIKIJS_BASE_PATH=/hyper-hive-labs
+WIKIJS_BASE_PATH=/your-org
 EOF
 # Run integration tests
@@ -198,7 +198,7 @@ echo '{
   "success": true,
   "page": {
     "id": 123,
-    "path": "/hyper-hive-labs/projects/test-project/documentation/test-api",
+    "path": "/your-org/projects/test-project/documentation/test-api",
     "title": "Test API Documentation"
   }
 }

View File

@@ -13,7 +13,10 @@ from mcp.types import Tool, TextContent
 from .config import WikiJSConfig
 from .wikijs_client import WikiJSClient
+# Suppress noisy MCP validation warnings on stderr
 logging.basicConfig(level=logging.INFO)
+logging.getLogger("root").setLevel(logging.ERROR)
+logging.getLogger("mcp").setLevel(logging.ERROR)
 logger = logging.getLogger(__name__)

View File

@@ -21,8 +21,8 @@ class WikiJSClient:
     Args:
         api_url: Wiki.js GraphQL API URL (e.g., http://wiki.example.com/graphql)
         api_token: Wiki.js API token
-        base_path: Base path in Wiki.js (e.g., /hyper-hive-labs)
-        project: Project path (e.g., projects/cuisineflow) for project mode
+        base_path: Base path in Wiki.js (e.g., /your-org)
+        project: Project path (e.g., projects/my-project) for project mode
     """
     self.api_url = api_url
     self.api_token = api_token

View File

@@ -2,7 +2,10 @@
     "name": "cmdb-assistant",
     "version": "1.0.0",
     "description": "NetBox CMDB integration for infrastructure management - query, create, update, and manage network devices, IP addresses, sites, and more",
-    "author": "Bandit Labs",
+    "author": {
+        "name": "Bandit Labs",
+        "email": "dev@banditlabs.io"
+    },
     "homepage": "https://github.com/bandit-labs/cmdb-assistant",
     "license": "MIT",
     "keywords": [
@@ -16,50 +19,25 @@
     "commands": {
         "cmdb-search": {
             "description": "Search NetBox for devices, IPs, sites, or any CMDB object",
-            "file": "commands/cmdb-search.md"
+            "source": "commands/cmdb-search.md"
         },
         "cmdb-device": {
             "description": "Manage network devices (create, view, update, delete)",
-            "file": "commands/cmdb-device.md"
+            "source": "commands/cmdb-device.md"
         },
         "cmdb-ip": {
            "description": "Manage IP addresses and prefixes",
-            "file": "commands/cmdb-ip.md"
+            "source": "commands/cmdb-ip.md"
         },
         "cmdb-site": {
             "description": "Manage sites and locations",
-            "file": "commands/cmdb-site.md"
+            "source": "commands/cmdb-site.md"
         }
     },
     "agents": {
         "cmdb-assistant": {
             "description": "Infrastructure management assistant for NetBox CMDB operations",
-            "file": "agents/cmdb-assistant.md"
+            "source": "agents/cmdb-assistant.md"
         }
-    },
-    "mcpServers": {
-        "netbox": {
-            "description": "NetBox API integration via MCP",
-            "configFile": ".mcp.json"
-        }
-    },
-    "configuration": {
-        "required": [
-            {
-                "name": "NETBOX_URL",
-                "description": "NetBox instance URL (e.g., https://netbox.example.com)"
-            },
-            {
-                "name": "NETBOX_TOKEN",
-                "description": "NetBox API token for authentication"
-            }
-        ],
-        "optional": [
-            {
-                "name": "NETBOX_VERIFY_SSL",
-                "description": "Verify SSL certificates (default: true)",
-                "default": "true"
-            }
-        ]
-    }
+    }
 }

View File

@@ -1,9 +1,12 @@
 {
     "mcpServers": {
         "netbox": {
-            "command": "${CLAUDE_PLUGIN_ROOT}/../../mcp-servers/netbox/.venv/bin/python",
+            "command": "/home/lmiranda/repos/bandit/support-claude-mktplace/mcp-servers/netbox/.venv/bin/python",
             "args": ["-m", "mcp_server.server"],
-            "cwd": "${CLAUDE_PLUGIN_ROOT}/../../mcp-servers/netbox"
+            "cwd": "/home/lmiranda/repos/bandit/support-claude-mktplace/mcp-servers/netbox",
+            "env": {
+                "PYTHONPATH": "/home/lmiranda/repos/bandit/support-claude-mktplace/mcp-servers/netbox"
+            }
         }
     }
 }
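Swapping `${CLAUDE_PLUGIN_ROOT}` for absolute `/home/lmiranda/...` paths makes this file machine-specific, so a quick structural check before committing can catch typos. A stdlib-only sketch (the paths below are shortened placeholders, not the real ones):

```python
import json

# A trimmed copy of the .mcp.json shape from this diff.
raw = """
{
  "mcpServers": {
    "netbox": {
      "command": "/path/to/mcp-servers/netbox/.venv/bin/python",
      "args": ["-m", "mcp_server.server"],
      "cwd": "/path/to/mcp-servers/netbox",
      "env": {"PYTHONPATH": "/path/to/mcp-servers/netbox"}
    }
  }
}
"""

config = json.loads(raw)
for name, server in config["mcpServers"].items():
    # Every entry needs at least a command; args/cwd/env are per-server extras.
    assert "command" in server, f"{name}: missing 'command'"
    assert isinstance(server.get("args", []), list)
print(sorted(config["mcpServers"]))  # ['netbox']
```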

View File

@@ -1,14 +1,20 @@
 {
     "mcpServers": {
         "gitea": {
-            "command": "${CLAUDE_PLUGIN_ROOT}/../../mcp-servers/gitea/.venv/bin/python",
+            "command": "/home/lmiranda/repos/bandit/support-claude-mktplace/mcp-servers/gitea/.venv/bin/python",
             "args": ["-m", "mcp_server.server"],
-            "cwd": "${CLAUDE_PLUGIN_ROOT}/../../mcp-servers/gitea"
+            "cwd": "/home/lmiranda/repos/bandit/support-claude-mktplace/mcp-servers/gitea",
+            "env": {
+                "PYTHONPATH": "/home/lmiranda/repos/bandit/support-claude-mktplace/mcp-servers/gitea"
+            }
         },
         "wikijs": {
-            "command": "${CLAUDE_PLUGIN_ROOT}/../../mcp-servers/wikijs/.venv/bin/python",
+            "command": "/home/lmiranda/repos/bandit/support-claude-mktplace/mcp-servers/wikijs/.venv/bin/python",
             "args": ["-m", "mcp_server.server"],
-            "cwd": "${CLAUDE_PLUGIN_ROOT}/../../mcp-servers/wikijs"
+            "cwd": "/home/lmiranda/repos/bandit/support-claude-mktplace/mcp-servers/wikijs",
+            "env": {
+                "PYTHONPATH": "/home/lmiranda/repos/bandit/support-claude-mktplace/mcp-servers/wikijs"
+            }
         }
     }
 }

View File

@@ -112,7 +112,7 @@ python -c "from mcp_server import server; print('Wiki.js MCP Server installed su
 ### 2.2 Generate Wiki.js API Token
-1. Log into Wiki.js: https://wiki.hyperhivelabs.com
+1. Log into Wiki.js: https://wiki.your-company.com
 2. Navigate to: **Administration** (top right)
 3. Click **API Access** in the left sidebar
 4. Click **New API Key**
@@ -166,9 +166,9 @@ chmod 600 ~/.config/claude/gitea.env
 ```bash
 cat > ~/.config/claude/wikijs.env << 'EOF'
 # Wiki.js API Configuration
-WIKIJS_API_URL=https://wiki.hyperhivelabs.com/graphql
+WIKIJS_API_URL=https://wiki.your-company.com/graphql
 WIKIJS_API_TOKEN=your_wikijs_token_here
-WIKIJS_BASE_PATH=/hyper-hive-labs
+WIKIJS_BASE_PATH=/your-org
 EOF
 # Secure the file (owner read/write only)
@@ -180,7 +180,7 @@ chmod 600 ~/.config/claude/wikijs.env
 **Configuration Variables:**
 - `WIKIJS_API_URL` - Wiki.js GraphQL endpoint (includes `/graphql`)
 - `WIKIJS_API_TOKEN` - API key from Step 2.2 (JWT format)
-- `WIKIJS_BASE_PATH` - Base path in Wiki.js (e.g., `/hyper-hive-labs`)
+- `WIKIJS_BASE_PATH` - Base path in Wiki.js (e.g., `/your-org`)
 ### 3.4 Verify System Configuration
@@ -263,7 +263,7 @@ curl -H "Authorization: token YOUR_GITEA_TOKEN" \
 curl -H "Authorization: Bearer YOUR_WIKIJS_TOKEN" \
   -H "Content-Type: application/json" \
   -d '{"query": "{ pages { list { id title } } }"}' \
-  https://wiki.hyperhivelabs.com/graphql
+  https://wiki.your-company.com/graphql
 # Should return pages data in JSON format
 ```
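The curl check above translates directly to the Python stdlib. A hedged sketch (the endpoint and token are the same placeholders as in the docs, so the live call is left commented out):

```python
import json
import urllib.request

def wikijs_list_pages(endpoint: str, token: str) -> dict:
    """POST the same GraphQL query the curl example sends."""
    payload = json.dumps({"query": "{ pages { list { id title } } }"}).encode()
    request = urllib.request.Request(
        endpoint,
        data=payload,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)

# wikijs_list_pages("https://wiki.your-company.com/graphql", "YOUR_WIKIJS_TOKEN")
```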
@@ -320,17 +320,17 @@ GITEA_OWNER=bandit
 **`~/.config/claude/wikijs.env`:**
 ```bash
-WIKIJS_API_URL=https://wiki.hyperhivelabs.com/graphql
+WIKIJS_API_URL=https://wiki.your-company.com/graphql
 WIKIJS_API_TOKEN=eyJhbGciOiJSUzI1NiIsInR5cCI6IkpXVCJ9...
-WIKIJS_BASE_PATH=/hyper-hive-labs
+WIKIJS_BASE_PATH=/your-org
 ```
 ### Project-Level Files
 **`.env` (in project root):**
 ```bash
-GITEA_REPO=cuisineflow
-WIKIJS_PROJECT=projects/cuisineflow
+GITEA_REPO=your-repo-name
+WIKIJS_PROJECT=projects/your-project-name
 ```
 **`.gitignore` (must include):**
@@ -371,25 +371,25 @@ To use Projman with multiple projects:
 1. **System config:** Set up once (already done in Step 3)
 2. **Project config:** Create `.env` in each project root:
-**Project 1: CuisineFlow**
+**Project 1: Main App**
 ```bash
-# ~/projects/cuisineflow/.env
-GITEA_REPO=cuisineflow
-WIKIJS_PROJECT=projects/cuisineflow
+# ~/projects/my-app/.env
+GITEA_REPO=my-app
+WIKIJS_PROJECT=projects/my-app
 ```
-**Project 2: CuisineFlow-Site**
+**Project 2: App Site**
 ```bash
-# ~/projects/cuisineflow-site/.env
-GITEA_REPO=cuisineflow-site
-WIKIJS_PROJECT=projects/cuisineflow-site
+# ~/projects/my-app-site/.env
+GITEA_REPO=my-app-site
+WIKIJS_PROJECT=projects/my-app-site
 ```
-**Project 3: HHL-Site**
+**Project 3: Company Site**
 ```bash
-# ~/projects/hhl-site/.env
-GITEA_REPO=hhl-site
-WIKIJS_PROJECT=projects/hhl-site
+# ~/projects/company-site/.env
+GITEA_REPO=company-site
+WIKIJS_PROJECT=projects/company-site
 ```
 Each project operates independently with its own issues and lessons learned.
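The layering this section relies on (system-level `~/.config/claude/*.env` plus a per-project `.env` whose values win on conflict) can be sketched with a tiny stdlib parser. The keys match the docs; the parser itself is illustrative, not the plugin's actual loader:

```python
def parse_env(text: str) -> dict:
    """Parse simple KEY=VALUE lines; '#' comments and blanks are skipped."""
    values = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        values[key.strip()] = value.strip()
    return values

# System config applies everywhere; the project .env overrides per repo.
system_env = parse_env("GITEA_OWNER=bandit\nGITEA_REPO=default-repo\n")
project_env = parse_env("# ~/projects/my-app/.env\n"
                        "GITEA_REPO=my-app\n"
                        "WIKIJS_PROJECT=projects/my-app\n")

merged = {**system_env, **project_env}  # project values override system ones
print(merged["GITEA_REPO"])   # my-app
print(merged["GITEA_OWNER"])  # bandit
```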
@@ -421,7 +421,7 @@ curl -H "Authorization: token YOUR_TOKEN" \
 # Test Wiki.js token
 curl -H "Authorization: Bearer YOUR_TOKEN" \
-  https://wiki.hyperhivelabs.com/graphql
+  https://wiki.your-company.com/graphql
 # If fails, regenerate token (Step 2)
 ```

View File

@@ -59,9 +59,9 @@ EOF
 # Wiki.js configuration
 cat > ~/.config/claude/wikijs.env << EOF
-WIKIJS_API_URL=https://wiki.hyperhivelabs.com/graphql
+WIKIJS_API_URL=https://wiki.your-company.com/graphql
 WIKIJS_API_TOKEN=your_wikijs_token_here
-WIKIJS_BASE_PATH=/hyper-hive-labs
+WIKIJS_BASE_PATH=/your-org
 EOF
 # Secure the files
@@ -327,7 +327,7 @@ See [CONFIGURATION.md](./CONFIGURATION.md) for detailed configuration instructio
 ### Cannot connect to Wiki.js
 - Verify `~/.config/claude/wikijs.env` exists and has correct URL and token
-- Check Wiki.js GraphQL endpoint: `https://wiki.hyperhivelabs.com/graphql`
+- Check Wiki.js GraphQL endpoint: `https://wiki.your-company.com/graphql`
 - Verify API token has pages read/write permissions
 ### Labels not syncing