
Repo Chat - Chrome Extension

Generate personalized coffee chat questions and repository summaries from GitHub profiles using local AI.

RepoChat is available as a Chrome extension and as a web version.

Features

  • Coffee Chat Questions: Generate icebreakers, project deep-dives, technical insights, and skills-practice questions
  • Repository Summaries: Get clear explanations of what each repository does, its key features, and technical details
  • Multiple AI Models: Choose local Ollama, OpenAI ChatGPT, or Google Gemini
  • Customizable: Select a casual or formal tone and short, medium, or long output length
  • API Key Management: Secure storage of API keys, with format validation

Quick Start

1. Choose Your AI Model

You have three options for AI models:

Option A: Local Ollama (Recommended - Free)

  • Install Ollama: Download from ollama.ai
  • Pull the Model: Run ollama pull deepseek-r1:7b
  • Start Ollama: Run ollama serve (keep this running)

Option B: OpenAI ChatGPT (Paid)

  • Get an API Key: Create one on the OpenAI Platform, then enter it in the extension

Option C: Google Gemini (Paid)

  • Get an API Key: Create one in Google AI Studio, then enter it in the extension

2. Install the Extension

  1. Download the RepoChat-Extension.zip file
  2. Unzip it to a folder on your computer
  3. Open Chrome and go to chrome://extensions/
  4. Enable "Developer mode" (toggle in top right)
  5. Click "Load unpacked" and select the unzipped folder
  6. The extension icon should appear in your toolbar

3. Use the Extension

  1. Navigate to any GitHub profile page
  2. Click the Repo Chat extension icon
  3. Select your AI model and enter API keys if needed
  4. Select your preferred style and length
  5. Click "Generate Questions" or "Generate Summary"
  6. Copy and use the generated content for your coffee chat!

Setup Instructions

For Users (Zip Installation)

Using Local Ollama (Free)

  1. Download Ollama: Visit ollama.ai/download and install for your OS
  2. Pull the Model: Open terminal/command prompt and run ollama pull deepseek-r1:7b
  3. Start Ollama: Run ollama serve (keep this running)
  4. Install Extension: Follow the Quick Start steps above
  5. Verify Status: The extension will show "Ollama is running" when everything is set up correctly
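For the curious, the status check in step 5 can be sketched in plain JavaScript. Ollama's local API (default http://localhost:11434) exposes a /api/tags endpoint listing installed models; the helper name hasModel is illustrative, not the extension's actual code.

```javascript
// Hypothetical sketch: verifying that Ollama is running and the model exists.
// Ollama's default local endpoint; /api/tags returns { models: [{ name: ... }] }.
const OLLAMA_URL = "http://localhost:11434";
const REQUIRED_MODEL = "deepseek-r1:7b";

// Pure helper: does a /api/tags response include the required model?
function hasModel(tagsResponse, modelName) {
  return (tagsResponse.models || []).some((m) => m.name === modelName);
}

// In the extension this would be wired to fetch (requires Ollama running):
// const res = await fetch(`${OLLAMA_URL}/api/tags`);
// if (!res.ok) { /* show "Ollama is not running" */ }
// else if (!hasModel(await res.json(), REQUIRED_MODEL)) { /* model missing */ }
```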

Using OpenAI or Gemini (Paid)

  1. Get API Key: Create a key on the OpenAI Platform (for ChatGPT) or in Google AI Studio (for Gemini)
  2. Install Extension: Follow the Quick Start steps above
  3. Enter API Key: In the extension, select your model and enter your API key
  4. Verify Status: The extension will validate your API key format

For Developers

# Clone the repository
git clone <your-repo>
cd RepoChat

# Install dependencies
npm install

# Build the extension
npm run build

# Package for distribution
npm run package

Troubleshooting

"Ollama is not running" Error

  1. Check if Ollama is installed: Run ollama --version in terminal
  2. Start Ollama: Run ollama serve in terminal
  3. Check the model: Run ollama list to see if deepseek-r1:7b is installed
  4. Pull the model: If not listed, run ollama pull deepseek-r1:7b

"Invalid API Key" Error

  1. Check API key format:
    • OpenAI keys should start with "sk-"
    • Gemini keys should start with "AIza"
  2. Verify API key: Make sure you copied the key correctly
  3. Check account status: Ensure you have sufficient credits/quota
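The format checks above can be sketched as a small helper. The function name validateKeyFormat and its provider labels are assumptions for this sketch; note that a passing format check only means the key looks right, not that the provider will accept it.

```javascript
// Sketch of the prefix checks described above. A correct prefix only means
// the key *looks* valid; real validity is confirmed by the provider's API.
function validateKeyFormat(key, provider) {
  if (typeof key !== "string" || key.trim().length === 0) return false;
  if (provider === "openai") return key.startsWith("sk-");  // OpenAI keys: "sk-..."
  if (provider === "gemini") return key.startsWith("AIza"); // Gemini keys: "AIza..."
  return false; // unknown provider
}
```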

Rate Limit Errors

  1. OpenAI: Check your usage at OpenAI Platform
  2. Gemini: Check your quota at Google Cloud Console
  3. Wait and retry: Rate limits usually reset after a short period

Extension Not Working

  1. Check Ollama status: The extension shows a status indicator
  2. Refresh the extension: Go to chrome://extensions/ and click the refresh icon
  3. Check GitHub page: Make sure you're on a GitHub profile page
  4. Check console: Open DevTools (F12) and look for error messages

Model Not Found

If you get an error about the model not being found:

# Pull the correct model
ollama pull deepseek-r1:7b

# Or try an alternative model
ollama pull llama3.2:3b

How It Works

  1. Profile Scraping: The extension reads the GitHub profile page to extract repository information
  2. Local AI Processing: Sends the profile data to your local Ollama instance
  3. Question Generation: Uses AI to create personalized questions based on the repositories
  4. Summary Creation: Generates technical summaries of each repository
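As a rough illustration of steps 1–3, here is how scraped repository data might be turned into a prompt. The field names (name, language, description) and buildQuestionPrompt are assumptions for the sketch, not the extension's actual implementation.

```javascript
// Hypothetical sketch: building a question-generation prompt from scraped repos.
function buildQuestionPrompt(repos, style, length) {
  // One line per repository, in a format the model can skim.
  const repoLines = repos
    .map((r) => `- ${r.name} (${r.language}): ${r.description}`)
    .join("\n");
  return (
    `Write coffee chat questions in a ${style} tone, ${length} in length, ` +
    `for a developer with these GitHub repositories:\n${repoLines}`
  );
}

// Example:
const prompt = buildQuestionPrompt(
  [{ name: "RepoChat", language: "JavaScript", description: "AI coffee chat helper" }],
  "casual",
  "short"
);
```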

Privacy & Security

  • Local Ollama: All data processing happens locally on your machine
  • External APIs: When using OpenAI or Gemini, your data is sent to their servers
  • API Key Storage: API keys are stored securely in Chrome's sync storage
  • GitHub Data: Your GitHub profile data is used only for question and summary generation

Requirements

  • Chrome browser
  • Internet connection (for GitHub profile access)
  • For Local Ollama: Ollama installed and running with deepseek-r1:7b model
  • For OpenAI: Valid OpenAI API key with sufficient credits
  • For Gemini: Valid Google AI Studio API key with sufficient quota

Support

If you encounter issues:

  1. Check the troubleshooting section above
  2. Ensure Ollama is running (ollama serve)
  3. Verify the model is installed (ollama list)
  4. Check the extension status indicator

License



Copyright (c) 2025 Noah Kostesku. All Rights Reserved.

This software is proprietary and may not be copied, modified, distributed, or used in any derivative works without the author's express written permission.

THE SOFTWARE IS PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND.

About

A Chrome Extension that generates personalized coffee chat questions from GitHub profiles for recruiters using AI.
