A comprehensive FastAPI-based research platform that leverages OpenAI's advanced models (O3, O4-Mini) to conduct in-depth research, idea validation, market analysis, and financial assessments.
- Features
- Technology Stack
- Prerequisites
- Installation
- Usage
- Research Types
- Project Structure
- API Endpoints
- Deployment
- Configuration
- Development
- Performance
- Roadmap
- Contributing
- License
- Multi-Model Research: Support for O3 Deep Research and O4 Mini Deep Research models
- Research Types:
- Custom Research: General-purpose research queries
- Idea Validation: Comprehensive startup/business idea analysis
- Market Research: Market analysis and competitive landscape
- Financial Analysis: Financial feasibility and projections
- Comprehensive Analysis: All three research types combined
- Progressive Results: Real-time updates for comprehensive research
- Web Interface: Modern, responsive web UI with live progress tracking
- Data Persistence: SQLite database for storing research history
- Export Options: Download reports as Markdown files
- RESTful API: Complete API for programmatic access
| Layer | Technology | Purpose |
|---|---|---|
| Backend | FastAPI, Python 3.8+ | Web framework and application logic |
| AI Models | OpenAI O3, O4-Mini | Deep research and analysis |
| Database | SQLite | Research history and data persistence |
| Frontend | HTML/CSS/JS, Alpine.js | Interactive web interface |
| Styling | Tailwind CSS | Responsive UI design |
| Containerization | Docker, docker-compose | Production deployment |
| Reverse Proxy | Nginx | Load balancing and SSL termination |
| Caching | Redis | Performance optimization |
| Hosting | Vercel | Serverless deployment option |
- Python 3.8 or higher
- OpenAI API key with access to research models
- Git (for cloning)
- Docker (optional, for containerized deployment)
1. Clone the repository:

   ```bash
   git clone https://github.com/MajorAbdullah/ai-research-platform.git
   cd ai-research-platform
   ```

2. Create and activate a virtual environment:

   ```bash
   python -m venv .venv
   source .venv/bin/activate  # On Windows: .venv\Scripts\activate
   ```
3. Install dependencies:

   ```bash
   pip install -r config/requirements.txt
   ```
4. Set up environment variables: create a `.env` file in the root directory:

   ```
   OPENAI_API_KEY=your_openai_api_key_here
   ```
5. Initialize the database:

   ```bash
   python -c "from models.database import init_db; init_db()"
   ```
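The real schema lives in `models/database.py`; purely as an illustration of what this step does, an `init_db` helper for an app like this might look as follows (the table and column names here are hypothetical, not the project's actual schema):

```python
import sqlite3

def init_db(path: str = "research_platform.db") -> None:
    """Create the results table if it does not already exist (hypothetical schema)."""
    with sqlite3.connect(path) as conn:
        conn.execute("""
            CREATE TABLE IF NOT EXISTS research_results (
                task_id         TEXT PRIMARY KEY,
                query           TEXT NOT NULL,
                research_type   TEXT NOT NULL,
                status          TEXT NOT NULL DEFAULT 'pending',
                report_markdown TEXT,
                created_at      TEXT DEFAULT CURRENT_TIMESTAMP
            )
        """)
```

Running the step is idempotent: `CREATE TABLE IF NOT EXISTS` makes repeated initialization safe.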
```bash
# Setup (first time only)
./setup.sh

# Run the application
./dev.sh run
```

Or run manually:

```bash
source .venv/bin/activate
python app.py
```

The application will be available at:
- Web Interface: http://localhost:8000
- API Documentation: http://localhost:8000/docs
- Health Check: http://localhost:8000/health
- Open http://localhost:8000 in your browser
- Select your preferred research model
- Choose research type
- Enter your research query
- Monitor real-time progress
- Download or copy results when complete
Start a new research task:

```bash
curl -X POST "http://localhost:8000/api/research" \
  -H "Content-Type: application/json" \
  -d '{
    "query": "AI app for construction workers",
    "model": "o3-deep-research",
    "research_type": "comprehensive",
    "enrich_prompt": true
  }'
```

Check the task status:

```bash
curl "http://localhost:8000/api/research/{task_id}/status"
```

Retrieve the completed result:

```bash
curl "http://localhost:8000/api/research/{task_id}/result"
```

General-purpose research for any query with intelligent prompt enrichment.
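The curl workflow can also be scripted. The sketch below is a minimal client using only the standard library; it assumes the endpoints documented in this README and JSON responses containing `task_id` and `status` fields — those field names are an assumption, not confirmed by the API docs.

```python
"""Minimal client for the research API (response field names are assumptions)."""
import json
import time
import urllib.request

BASE_URL = "http://localhost:8000"

def research_url(path: str = "") -> str:
    """Build an absolute URL under the /api/research prefix."""
    return f"{BASE_URL}/api/research{path}"

def _post_json(url: str, payload: dict) -> dict:
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def _get_json(url: str) -> dict:
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

def run_research(query: str, model: str = "o3-deep-research",
                 research_type: str = "custom", poll_seconds: int = 15) -> dict:
    """Start a task, poll its status until it finishes, then fetch the result."""
    task = _post_json(research_url(), {
        "query": query,
        "model": model,
        "research_type": research_type,
        "enrich_prompt": True,
    })
    task_id = task["task_id"]  # assumed response field
    while _get_json(research_url(f"/{task_id}/status")).get("status") not in ("completed", "failed"):
        time.sleep(poll_seconds)
    return _get_json(research_url(f"/{task_id}/result"))
```

Since research tasks take minutes, the polling interval is deliberately coarse; a production client would also want a timeout and error handling around the HTTP calls.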
Comprehensive startup/business idea analysis including:
- Market opportunity assessment
- Target audience analysis
- Competition landscape
- Technical feasibility
- Risk assessment
Detailed market analysis covering:
- Market size and growth
- Customer segments
- Competitive analysis
- Market trends
- Entry barriers
Financial feasibility assessment including:
- Revenue projections
- Cost analysis
- Break-even analysis
- Funding requirements
- ROI calculations
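For context, break-even and ROI are simple arithmetic once the underlying estimates are known; a toy example of the kind of figures these reports discuss (all numbers below are made up):

```python
import math

def break_even_units(fixed_costs: float, price: float, variable_cost: float) -> int:
    """Units that must be sold before contribution margin covers fixed costs."""
    return math.ceil(fixed_costs / (price - variable_cost))

def roi(gain: float, cost: float) -> float:
    """Return on investment as a fraction of the initial cost."""
    return (gain - cost) / cost

# Made-up inputs: $50k fixed costs, $30 unit price, $10 unit cost
units = break_even_units(50_000, 30.0, 10.0)  # 2500 units
r = roi(gain=75_000, cost=50_000)             # 0.5, i.e. 50% ROI
```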
Combines all three research types for complete business intelligence.
```
ai-research-platform/
├── app.py                      # Main FastAPI application (57K+)
├── config/
│   ├── __init__.py
│   └── requirements.txt        # Python dependencies
├── models/
│   ├── __init__.py
│   └── database.py             # Database models and operations
├── services/
│   ├── __init__.py
│   ├── research_client.py      # OpenAI research client
│   ├── storage_service.py      # Data persistence layer
│   └── document_manager.py     # Document handling
├── api/
│   └── main.py                 # Vercel serverless entry point
├── tests/                      # Test suite
├── research_documents/         # Generated research reports
│   ├── archives/
│   ├── comprehensive_research/
│   ├── custom_research/
│   ├── financial_analysis/
│   ├── idea_validation/
│   ├── market_research/
│   └── metadata/
├── Dockerfile                  # Multi-stage Docker build
├── docker-compose.yml          # Full stack with Redis + Nginx
├── vercel.json                 # Vercel deployment config
├── setup.sh                    # Initial setup script
├── dev.sh                      # Development helper script
├── CONTRIBUTING.md             # Contribution guidelines
├── VERCEL_DEPLOYMENT.md        # Vercel deployment guide
├── .env                        # Environment variables (create this)
├── .gitignore
├── LICENSE
└── README.md
```
| Method | Endpoint | Description |
|---|---|---|
| POST | `/api/research` | Start new research task |
| GET | `/api/research/{task_id}/status` | Get task status |
| GET | `/api/research/{task_id}/result` | Get completed results |
| GET | `/api/research/{task_id}/progressive` | Get progressive results |
| GET | `/api/research/results` | Get all results |
| DELETE | `/api/research/{task_id}` | Delete research result |
| Method | Endpoint | Description |
|---|---|---|
| GET | `/api/models` | Get available research models |
| GET | `/api/dashboard/overview` | Dashboard metrics |
| GET | `/api/dashboard/ideas` | All ideas data |
| GET | `/health` | Health check |
The project includes a multi-stage Dockerfile and a docker-compose configuration with Redis caching and Nginx reverse proxy.
```bash
# Build and run with docker-compose
docker-compose up --build

# Or using the dev script
./dev.sh docker
```

The docker-compose stack includes:
- `ai-research-platform` - Main FastAPI application on port 8000
- `redis` - Redis 7 Alpine for caching on port 6379
- `nginx` - Nginx Alpine reverse proxy on ports 80/443
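A minimal compose file matching that stack might look like the sketch below; service names and options are illustrative, and the repository's `docker-compose.yml` is authoritative:

```yaml
services:
  ai-research-platform:
    build: .
    ports:
      - "8000:8000"
    env_file: .env
    depends_on:
      - redis

  redis:
    image: redis:7-alpine
    ports:
      - "6379:6379"

  nginx:
    image: nginx:alpine
    ports:
      - "80:80"
      - "443:443"
    depends_on:
      - ai-research-platform
```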
The platform can be deployed as a serverless function on Vercel:
```bash
# Install Vercel CLI
npm i -g vercel

# Deploy
vercel
```

See `VERCEL_DEPLOYMENT.md` for detailed Vercel deployment instructions.
For comprehensive research, the system provides real-time updates as each research phase completes:
- Idea validation phase
- Market research phase
- Financial analysis phase
Results are cached in the SQLite database with metadata for quick retrieval and historical analysis.
- Markdown format downloads
- Clipboard copy functionality
- Structured JSON API responses
Choose between different OpenAI models based on your needs:
- O3 Deep Research: Maximum depth and accuracy
- O4 Mini Deep Research: Faster, cost-effective option
```
OPENAI_API_KEY=your_api_key_here
DATABASE_URL=sqlite:///research_platform.db  # Optional, defaults to local SQLite
DEBUG=false                                  # Optional, for development
```

Models can be configured in `services/research_client.py` with custom parameters for:
- Maximum tool calls
- Timeout settings
- Response formatting
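As an illustration of the kind of knobs involved, per-model settings could be grouped like this; all names and values below are hypothetical, not the actual `research_client.py` API:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ModelConfig:
    """Per-model tuning parameters (hypothetical names and values)."""
    model: str
    max_tool_calls: int = 50
    timeout_seconds: int = 600
    response_format: str = "markdown"

MODEL_CONFIGS = {
    "o3-deep-research": ModelConfig(
        "o3-deep-research", max_tool_calls=100, timeout_seconds=1800),
    "o4-mini-deep-research": ModelConfig(
        "o4-mini-deep-research", max_tool_calls=30, timeout_seconds=300),
}
```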
```bash
./dev.sh setup    # Initial setup
./dev.sh run      # Start development server
./dev.sh test     # Run tests
./dev.sh lint     # Check code quality
./dev.sh format   # Format code
./dev.sh clean    # Clean up files
./dev.sh docker   # Run with Docker
```

Run the test suite:

```bash
./dev.sh test

# or manually:
source .venv/bin/activate
pytest tests/ -v --cov=.
```

The application follows a modular architecture:
- `app.py`: Main FastAPI application and routes
- `services/`: Business logic and external integrations
- `models/`: Data models and database operations
- `config/`: Configuration and dependencies
- Typical research completion: 2-5 minutes
- Concurrent request handling: Up to 10 parallel research tasks
- Database: Optimized for quick retrieval with indexing
- Memory usage: ~100MB base + active research tasks
- Integration with additional AI models
- Advanced data visualization dashboard
- Export to multiple formats (PDF, DOCX)
- Collaborative research features
- API rate limiting and usage analytics
- Cloud deployment guides
1. Fork the repository
2. Create your feature branch (`git checkout -b feature/AmazingFeature`)
3. Commit your changes (`git commit -m 'Add some AmazingFeature'`)
4. Push to the branch (`git push origin feature/AmazingFeature`)
5. Open a Pull Request
See CONTRIBUTING.md for detailed contribution guidelines.
This project is licensed under the MIT License - see the LICENSE file for details.
If you encounter any issues or have questions:
- Check the Issues page
- Create a new issue with detailed information
- Contact the maintainers
Built by Major Abdullah using OpenAI's advanced research capabilities