A FastAPI-based microservice providing AI and recommendation services for the HireOn recruitment platform.
This repository contains the machine learning services for the HireOn platform, including:
- Gen AI Services: CV analysis, job matching, and cover letter generation using Gemini AI.
- Recommendation Engine: Job recommendation system using semantic search and embeddings.
CV Job Analysis: Analyzes a candidate's CV against job details to:
- Calculate a relevance score between the CV and job
- Identify skill matches and gaps
- Provide personalized improvement suggestions
- Highlight strengths and areas for development
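As a rough, self-contained sketch of the skill match/gap part of this analysis (the actual scoring is done by Gemini; `skill_report` and its simple set-overlap score are purely illustrative):

```python
def skill_report(cv_skills, job_skills):
    """Toy skill match/gap report: case-insensitive set overlap."""
    cv = {s.lower() for s in cv_skills}
    job = {s.lower() for s in job_skills}
    matched = sorted(cv & job)          # skills present in both CV and job
    gaps = sorted(job - cv)             # required skills missing from the CV
    score = len(matched) / len(job) if job else 0.0
    return {"score": round(score, 2), "matched": matched, "gaps": gaps}
```

The real service derives richer, context-aware suggestions from the full CV text rather than a flat skill list.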
Cover Letter Generator: Automatically generates personalized cover letters based on:
- Candidate's CV content
- Job description and requirements
- Outputs as a professionally formatted PDF
Job Recommendations: Provides personalized job recommendations based on:
- Semantic understanding of CV content
- Job title and description matching
- Similarity scoring and ranking
CV Embeddings: Stores and retrieves vector embeddings of user CVs for efficient similarity searching
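The similarity scoring and ranking behind recommendations can be illustrated with a minimal, dependency-free sketch (the actual service queries ChromaDB; `cosine` and `rank_jobs` are illustrative names, not the service's API):

```python
from math import sqrt

def cosine(a, b):
    # Cosine similarity between two equal-length embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def rank_jobs(cv_embedding, jobs):
    # jobs: list of (job_id, embedding) pairs.
    # Returns (job_id, score) pairs sorted best-match first.
    scored = [(job_id, cosine(cv_embedding, emb)) for job_id, emb in jobs]
    return sorted(scored, key=lambda t: t[1], reverse=True)
```

A vector database performs the same nearest-neighbor ranking, but over an index instead of a linear scan.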
- FastAPI: Modern, high-performance web framework for building APIs
- Gemini AI: Google's generative AI model for natural language processing tasks
- ChromaDB: Vector database for storing and querying embeddings
- Google Cloud Storage: For storing documents and generated PDFs
- Docker: Containerization for deployment
- Python 3.12: Core programming language
```
.
├── app/                  # Main application package
│   ├── api/              # API routes and models
│   │   ├── core/         # Core service components
│   │   ├── models/       # Data models and schemas
│   │   └── routes/       # API endpoints
│   └── utils/            # Utility functions
├── credentials/          # Service account credentials
├── data/                 # Data files
├── experiments/          # Experimental notebooks and scripts
├── notebook/             # Jupyter notebooks for development
├── .env                  # Environment variables
├── .env.example          # Example environment configuration
├── Dockerfile            # Docker configuration
├── docker-compose.yml    # Docker Compose configuration
├── main.py               # Application entry point
└── requirements.txt      # Python dependencies
```
- Python 3.12+
- Docker and Docker Compose (optional)
- Google Cloud account with Gemini API and Storage access
Clone the repository:

```bash
git clone https://github.com/yourusername/hireon-ml-services.git
cd hireon-ml-services
```
Set up a virtual environment:

```bash
python -m venv .venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate
```
Install dependencies:

```bash
pip install -r requirements.txt
```
Configure environment variables:

```bash
cp .env.example .env
# Edit the .env file with your API keys and configuration
```
Add your Google Cloud credentials to the `credentials/` directory.

Run the development server:

```bash
uvicorn main:app --reload
```

The API will be available at http://localhost:8000.
Build and run with Docker:

```bash
docker build -t hireon-ml-services .
docker run -p 8000:8000 hireon-ml-services
```

Or with Docker Compose:

```bash
docker-compose up
```

When the service is running, API documentation is available at:
- Swagger UI: http://localhost:8000/docs
- ReDoc: http://localhost:8000/redoc
| Variable | Description |
|---|---|
| `GEMINI_API_KEY` | Google Gemini API key |
| `GOOGLE_APPLICATION_CREDENTIALS` | Path to Google Cloud service account credentials |
| `GOOGLE_CLOUD_PROJECT` | Google Cloud project ID |
| `GOOGLE_CLOUD_LOCATION` | Google Cloud region |
| `GOOGLE_GENAI_USE_VERTEXAI` | Whether to use Vertex AI (`true`/`false`) |
| `MONGO_URI` | MongoDB connection string |
| `CHROMA_CLIENT_HOST` | ChromaDB server host |
| `CHROMA_CLIENT_PORT` | ChromaDB server port |
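A sketch of how these variables might be read at startup. The settings class, the parsing, and especially the fallback defaults below are illustrative assumptions, not the service's actual configuration code:

```python
import os
from dataclasses import dataclass

@dataclass
class Settings:
    gemini_api_key: str
    mongo_uri: str
    chroma_host: str
    chroma_port: int
    use_vertexai: bool

def load_settings(env=os.environ) -> Settings:
    # GEMINI_API_KEY is required; the other defaults are placeholders.
    return Settings(
        gemini_api_key=env["GEMINI_API_KEY"],
        mongo_uri=env.get("MONGO_URI", "mongodb://localhost:27017"),
        chroma_host=env.get("CHROMA_CLIENT_HOST", "localhost"),
        chroma_port=int(env.get("CHROMA_CLIENT_PORT", "8000")),
        use_vertexai=env.get("GOOGLE_GENAI_USE_VERTEXAI", "false").lower() == "true",
    )
```

In practice the `.env` file is loaded into the process environment first (e.g. via `python-dotenv`), after which a loader like this sees the values.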
To add a new endpoint:

- Create a new route file in `app/api/routes/` or extend an existing one
- Define request/response models in `app/api/models/models.py`
- Implement utility functions in `app/utils/`
- Register the router in `app/__init__.py`
[License Information]
[Contributor Information]