initial commit

2025-08-01 13:29:38 -04:00
parent 2d1aa8280e
commit d9a8b13c16
15 changed files with 2855 additions and 315 deletions

CLAUDE.md (new file, 180 lines)

@@ -0,0 +1,180 @@
# CLAUDE.md
This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
## Project Overview
JobForge is an AI-powered job application management system designed for individual job seekers. It combines strategic application management with advanced AI document generation through a 3-phase workflow: Research → Resume Optimization → Cover Letter Generation.
## Technology Stack
- **Frontend**: Dash (Python-based web framework) with Mantine UI components
- **Backend**: FastAPI with AsyncIO for high-performance REST API
- **Database**: PostgreSQL 16 + pgvector extension for vector search
- **AI Services**: Claude Sonnet 4 for document generation, OpenAI for embeddings
- **Development**: Docker Compose for containerized environment
- **Authentication**: JWT tokens with bcrypt password hashing
## Development Commands
### Docker Environment
```bash
# Start all services (PostgreSQL, Backend, Frontend)
docker-compose up -d
# View logs for all services
docker-compose logs -f
# View logs for specific service
docker-compose logs -f backend
docker-compose logs -f frontend
docker-compose logs -f postgres
# Stop all services
docker-compose down
# Rebuild services after code changes
docker-compose up --build
# Reset database (WARNING: Deletes all data)
docker-compose down -v && docker-compose up -d
```
### Testing
```bash
# Run all backend tests
docker-compose exec backend pytest
# Run tests with coverage report
docker-compose exec backend pytest --cov=src --cov-report=html
# Run specific test file
docker-compose exec backend pytest tests/unit/services/test_auth_service.py
```
### Database Operations
```bash
# Connect to PostgreSQL database
docker-compose exec postgres psql -U jobforge_user -d jobforge_mvp
# Check backend and database health via the API
curl http://localhost:8000/health
```
## Architecture Overview
### Core Components
**Frontend Structure (`src/frontend/`)**:
- `main.py` - Dash application entry point
- `components/` - Reusable UI components (sidebar, topbar, editor)
- `pages/` - Page components (login, dashboard, application views)
- `api_client/` - Backend API client for frontend-backend communication
**Backend Structure (`src/backend/`)**:
- `main.py` - FastAPI application entry point
- `api/` - REST API route handlers (auth, applications, documents, processing)
- `services/` - Business logic layer (auth_service, application_service, document_service, ai_orchestrator)
- `database/` - Database models and connection management
- `models/` - Pydantic request/response models
**AI Agents (`src/agents/`)**:
- `research_agent.py` - Phase 1: Job analysis and company research
- `resume_optimizer.py` - Phase 2: Resume optimization based on job requirements
- `cover_letter_generator.py` - Phase 3: Personalized cover letter generation
- `claude_client.py` - Claude AI API integration
### 3-Phase AI Workflow
1. **Research Phase**: Analyzes the job description and researches company information
2. **Resume Optimization**: Creates a job-specific, optimized resume from the user's resume library
3. **Cover Letter Generation**: Generates a personalized cover letter using the user's context (the phase sequencing is sketched below)
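As a rough sketch (the agent interfaces here are placeholders, not the actual `src/agents/` APIs), the sequencing looks like:
```python
"""Illustrative sketch of the 3-phase pipeline; the real agents in
src/agents/ expose their own interfaces, which are assumed here."""
import asyncio


async def research_phase(job_description: str) -> str:
    # Placeholder for research_agent.py: job analysis + company research
    return f"research report for: {job_description[:40]}"


async def optimize_resume(report: str, resume_library: list[str]) -> str:
    # Placeholder for resume_optimizer.py: tailor a resume to the report
    return f"optimized resume based on {len(resume_library)} resumes"


async def generate_cover_letter(report: str, resume: str) -> str:
    # Placeholder for cover_letter_generator.py: personalized letter
    return "cover letter draft"


async def run_workflow(job_description: str, resume_library: list[str]) -> dict:
    report = await research_phase(job_description)          # Phase 1
    resume = await optimize_resume(report, resume_library)  # Phase 2
    letter = await generate_cover_letter(report, resume)    # Phase 3
    return {"research": report, "resume": resume, "cover_letter": letter}


if __name__ == "__main__":
    print(asyncio.run(run_workflow("Senior Python Engineer at Acme", ["base.md"])))
```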
### Database Schema
**Core Tables**:
- `users` - User authentication and profile data
- `applications` - Job applications with phase tracking
- `documents` - Generated documents (research reports, resumes, cover letters)
- `user_resumes` - User's resume library
- `document_embeddings` - Vector embeddings for AI processing
**Security**: PostgreSQL Row-Level Security (RLS) ensures complete user data isolation (a sketch of the per-request pattern follows).
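A common way to implement this is to key policies off a per-request session variable. The sketch below assumes a `current_setting('app.current_user_id')`-style policy, which may differ from the project's actual migrations:
```python
# Sketch of the per-request RLS pattern; table/policy/setting names are
# assumptions. The policy itself would be created in SQL, e.g.:
#   CREATE POLICY user_isolation ON applications
#       USING (user_id = current_setting('app.current_user_id')::uuid);
import asyncpg


async def fetch_user_applications(pool: asyncpg.Pool, user_id: str) -> list:
    async with pool.acquire() as conn:
        async with conn.transaction():
            # Scope every statement in this transaction to the current user;
            # the RLS policy then filters rows automatically.
            await conn.execute(
                "SELECT set_config('app.current_user_id', $1, true)", user_id
            )
            return await conn.fetch("SELECT * FROM applications")
```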
## Key Development Patterns
### Authentication
- JWT tokens with 24-hour expiry
- All API endpoints except auth require an `Authorization: Bearer <token>` header (see the sketch after this list)
- User context automatically injected via RLS policies
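A minimal sketch of this pattern (the real `auth_service` may differ; secret handling and function names here are assumptions):
```python
# Illustrative FastAPI dependency for JWT validation; names are assumed.
from datetime import datetime, timedelta, timezone

import jwt  # PyJWT
from fastapi import Depends, HTTPException
from fastapi.security import HTTPAuthorizationCredentials, HTTPBearer

JWT_SECRET_KEY = "change-me"  # loaded from .env in practice
bearer_scheme = HTTPBearer()


def create_access_token(user_id: str) -> str:
    # 24-hour expiry, matching the guideline above
    payload = {
        "sub": user_id,
        "exp": datetime.now(timezone.utc) + timedelta(hours=24),
    }
    return jwt.encode(payload, JWT_SECRET_KEY, algorithm="HS256")


def get_current_user_id(
    credentials: HTTPAuthorizationCredentials = Depends(bearer_scheme),
) -> str:
    try:
        payload = jwt.decode(
            credentials.credentials, JWT_SECRET_KEY, algorithms=["HS256"]
        )
    except jwt.PyJWTError:
        raise HTTPException(status_code=401, detail="Invalid or expired token")
    return payload["sub"]
```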
### API Structure
- RESTful endpoints following the `/api/v1/` pattern (example after this list)
- Async/await pattern throughout backend
- Pydantic models for request/response validation
- Standard HTTP status codes and error responses
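For example, an endpoint following these conventions might look like this (route and model names are illustrative, and Pydantic v2 is assumed):
```python
# Illustrative /api/v1/ route with Pydantic validation; names are assumed.
from fastapi import APIRouter, status
from pydantic import BaseModel

router = APIRouter(prefix="/api/v1/applications", tags=["applications"])


class ApplicationCreate(BaseModel):
    company: str
    job_title: str
    job_description: str


class ApplicationOut(ApplicationCreate):
    id: int
    phase: str = "research"


@router.post("", response_model=ApplicationOut,
             status_code=status.HTTP_201_CREATED)
async def create_application(payload: ApplicationCreate) -> ApplicationOut:
    # In the real service this would await application_service + the database.
    return ApplicationOut(id=1, **payload.model_dump())
```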
### AI Processing
- Asynchronous processing with status tracking
- Progress updates via `/processing/applications/{id}/status` endpoint
- The frontend should poll every 2-3 seconds during AI processing (see the polling sketch after this list)
- Error handling for external AI API failures
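A client-side polling loop might look like the following sketch (the response fields and the `/api/v1` prefix on the status path are assumptions):
```python
# Illustrative status-polling loop; response fields are assumptions.
import asyncio

import httpx


async def wait_for_processing(app_id: int, token: str) -> dict:
    url = f"http://localhost:8000/api/v1/processing/applications/{app_id}/status"
    headers = {"Authorization": f"Bearer {token}"}
    async with httpx.AsyncClient() as client:
        while True:
            resp = await client.get(url, headers=headers)
            resp.raise_for_status()
            payload = resp.json()
            if payload.get("state") in ("completed", "failed"):
                return payload
            await asyncio.sleep(2.5)  # poll every 2-3 seconds, per above
```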
### Frontend Components
- Dash callbacks for interactivity (toy example after this list)
- Mantine components for modern UI
- Real-time status updates during AI processing
- Document editor with markdown support and live preview
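As a toy illustration of the Dash + Mantine callback pattern (component IDs and layout are made up, not taken from `src/frontend/`):
```python
# Minimal Dash + Mantine callback sketch; IDs and layout are illustrative.
# Recent dash-mantine-components versions require a MantineProvider wrapper.
import dash_mantine_components as dmc
from dash import Dash, Input, Output, callback

app = Dash(__name__)
app.layout = dmc.MantineProvider(
    dmc.Stack([
        dmc.TextInput(id="company-input", label="Company"),
        dmc.Text(id="status-text"),
    ])
)


@callback(Output("status-text", "children"), Input("company-input", "value"))
def show_company(value):
    # Dash re-runs this whenever the input changes and updates the Text.
    return f"Tracking application at: {value or '...'}"


if __name__ == "__main__":
    app.run(debug=True)
```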
## Environment Configuration
Required environment variables in `.env`:
```bash
# API Keys (REQUIRED)
CLAUDE_API_KEY=your_claude_api_key_here
OPENAI_API_KEY=your_openai_api_key_here
# Database
DATABASE_URL=postgresql+asyncpg://jobforge_user:jobforge_password@postgres:5432/jobforge_mvp
# JWT Authentication
JWT_SECRET_KEY=your-super-secret-jwt-key-change-this-in-production
# Development Settings
DEBUG=true
LOG_LEVEL=INFO
```
## Service URLs
- **Frontend Application**: http://localhost:8501
- **Backend API**: http://localhost:8000
- **API Documentation**: http://localhost:8000/docs (Swagger UI)
- **Database**: localhost:5432
## Development Guidelines
### Code Style
- Follow FastAPI patterns for backend development
- Use async/await for all database and external API calls
- Implement proper error handling and logging
- Follow PostgreSQL RLS patterns for data security
### Testing Strategy
- Unit tests for business logic and services
- Integration tests for API endpoints and database interactions
- Mock AI clients so tests run reliably without external API dependencies (see the sketch after this list)
- Maintain 80%+ test coverage
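One way to mock the AI layer (a sketch; the agent and client interfaces are assumed, and `pytest-asyncio` is assumed installed):
```python
# Sketch of mocking the Claude client in a unit test; the agent/client
# interfaces are assumptions.
from unittest.mock import AsyncMock

import pytest


async def research(job_description: str, client) -> str:
    """Stand-in for research_agent logic that calls the AI client."""
    return await client.complete(prompt=f"Analyze: {job_description}")


@pytest.mark.asyncio
async def test_research_never_hits_the_real_api():
    fake_client = AsyncMock()
    fake_client.complete.return_value = "stub research report"

    report = await research("Senior Python Engineer", client=fake_client)

    assert report == "stub research report"
    fake_client.complete.assert_awaited_once()
```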
### Security Best Practices
- Never commit API keys or sensitive data to repository
- Use environment variables for all configuration
- Implement proper input validation and sanitization
- Follow JWT token best practices
## Current Development Status
**Phase**: MVP Development (8-week timeline)
**Status**: Foundation setup and documentation complete; code implementation in progress
Comprehensive documentation and architecture planning are complete; the code now being implemented follows the patterns and structure outlined in this document.