Terminal Coding Assistant (gpp)

A powerful terminal-based AI coding assistant that uses LM Studio to help you create, read, and edit code files efficiently. Now with support for multiple file operations and global terminal access.

✨ Features

  • Create - Generate new code files from natural language prompts
  • Read - Analyze and explain existing code files
  • Edit - Modify existing files based on instructions
  • Multiple Files - Create, read, and edit multiple files in one command
  • Model Management - List, load, and unload models from LM Studio
  • Multi-language - Supports C, C++, Java, Python, HTML, CSS, PHP, JavaScript, TypeScript
  • Global Access - Use gpp from any terminal directory after installation

🔧 Prerequisites

  1. LM Studio running locally with its server listening on http://localhost:1234
  2. Python 3.6+ with requests library
    pip3 install requests
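Before installing, you can confirm the LM Studio server is reachable by probing its OpenAI-compatible /v1/models endpoint. A minimal sketch using only the Python standard library (so it works even before requests is installed):

```python
# Quick connectivity probe for the LM Studio local server.
# Uses only the standard library, so it works before `requests` is installed.
import json
import urllib.request
import urllib.error

def lmstudio_is_up(base_url="http://localhost:1234", timeout=2.0):
    """Return True if the server answers on /v1/models, else False."""
    try:
        with urllib.request.urlopen(f"{base_url}/v1/models", timeout=timeout) as resp:
            data = json.load(resp)
            # The OpenAI-compatible endpoint returns {"data": [ ...models... ]}
            return isinstance(data.get("data"), list)
    except (urllib.error.URLError, OSError, ValueError):
        return False

if __name__ == "__main__":
    print("LM Studio reachable:", lmstudio_is_up())
```

If this prints False, start LM Studio's local server first; the installation steps below do not depend on it, but gpp itself does.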

📦 Installation

Quick Install (Recommended)

Linux/macOS:

chmod +x install.sh
./install.sh

Windows (PowerShell):

.\install.ps1

Windows (Command Prompt):

install.bat

Manual Installation

Linux/macOS:

# Copy files to ~/.local/bin
mkdir -p ~/.local/bin
cp gpp ~/.local/bin/gpp
cp model_manager.py ~/.local/bin/model_manager.py
chmod +x ~/.local/bin/gpp

# Add to PATH (add to ~/.bashrc, ~/.zshrc, or ~/.profile)
export PATH="$HOME/.local/bin:$PATH"

Windows:

  1. Create %USERPROFILE%\.local\bin directory
  2. Copy gpp and model_manager.py to that directory
  3. Add the directory to your system PATH
  4. Restart terminal

✅ Verify Installation

./verify_setup.sh

This will check:

  • ✅ gpp is accessible globally
  • ✅ Python 3 is installed
  • ✅ requests library is available
  • ✅ LM Studio is running and accessible
  • ✅ Models are available

🚀 Usage

Single File Operations

# Create a new file
gpp "binary search function" --create search.py

# Edit existing file
gpp "add error handling" --edit script.py

# Read and analyze
gpp "what does this do?" --read program.c

# Analyze without prompt
gpp --read code.java

Multiple File Operations

# Create multiple files
gpp "create a web app" --create index.html style.css app.js

# Edit multiple files with same instruction
gpp "add comments to all functions" --edit util.py helper.py main.py

# Analyze multiple files
gpp "how do these files interact?" --read app.py config.py database.py

Model Management

# List all available models
gpp --list-models

# Show currently loaded model
gpp --current-model

# Load a specific model
gpp --load-model "qwen2.5-coder-3b-instruct"

# Unload current model
gpp --unload-model

Advanced Options

# Specify language
gpp "quick sort algorithm" --lang=python --create sort.py

# Adjust generation parameters
gpp "add validation" --edit config.py --temperature=0.5 --max-tokens=800

📋 Supported Languages

Language     Extensions
C            .c, .h
C++          .cpp, .cc, .cxx, .hpp
Java         .java
Python       .py
HTML         .html, .htm
CSS          .css
PHP          .php
JavaScript   .js
TypeScript   .ts
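Internally, language detection from a filename presumably reduces to an extension lookup along these lines (a sketch mirroring the table above; the actual mapping inside gpp may differ in detail):

```python
# Hypothetical sketch of extension -> language detection, mirroring the table above.
# The real mapping lives inside the gpp script and may differ.
from pathlib import Path

EXTENSION_MAP = {
    ".c": "c", ".h": "c",
    ".cpp": "cpp", ".cc": "cpp", ".cxx": "cpp", ".hpp": "cpp",
    ".java": "java",
    ".py": "python",
    ".html": "html", ".htm": "html",
    ".css": "css",
    ".php": "php",
    ".js": "javascript",
    ".ts": "typescript",
}

def detect_language(filename):
    """Return the language for a filename, or None if unsupported."""
    return EXTENSION_MAP.get(Path(filename).suffix.lower())
```

This is why `gpp "binary search function" --create search.py` can infer Python without an explicit --lang flag.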

📚 Examples

Example 1: Create a Web App

gpp "professional portfolio website with responsive design" \
    --create index.html styles.css script.js

Output:

1. index.html : [HTML code with semantic structure]
2. styles.css : [CSS code with responsive design]
3. script.js : [JavaScript for interactivity]

Example 2: Edit Multiple Files

gpp "add comprehensive docstrings and type hints" \
    --edit app.py utils.py models.py

Example 3: Analyze Project Structure

gpp "explain how these components work together" \
    --read main.py database.py api.py config.py

Example 4: Quick Code Generation

gpp "convert this algorithm to TypeScript" --lang=typescript
gpp "what are the security implications?" --read api.py

⚙️ Options Reference

Options:
  --create FILE(s)          Create one or multiple files
  --edit FILE(s)            Edit one or multiple files
  --read FILE(s)            Read and analyze files
  --lang, -l LANG           Specify programming language
  --temperature, -t FLOAT   Generation temperature (0.0-1.0, default: 0.7)
  --max-tokens, -m INT      Max tokens to generate (default: 500)

Model Management:
  --list-models             List all available models
  --current-model           Show currently loaded model
  --load-model MODEL        Load specific model
  --unload-model            Unload current model

  -h, --help                Show help message

🔧 Configuration

The tool uses these defaults:

  • API: http://localhost:1234
  • Model: qwen2.5-coder-3b-instruct
  • Temperature: 0.7
  • Max tokens: 500

To change defaults, edit the constants at the top of the gpp file:

API_BASE_URL = "http://localhost:1234"
MODEL_NAME = "qwen2.5-coder-3b-instruct"
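Under the hood, requests presumably go to LM Studio's OpenAI-compatible /v1/chat/completions endpoint. A hedged sketch of how the defaults above might be combined into a request body (build_payload and the system prompt are illustrative assumptions, not gpp's actual internals):

```python
# Illustrative payload builder for LM Studio's OpenAI-compatible chat endpoint.
# Function name, parameters, and system prompt are assumptions; gpp may differ.
API_BASE_URL = "http://localhost:1234"
MODEL_NAME = "qwen2.5-coder-3b-instruct"

def build_payload(prompt, temperature=0.7, max_tokens=500):
    """Assemble the JSON body for POST {API_BASE_URL}/v1/chat/completions."""
    return {
        "model": MODEL_NAME,
        "messages": [
            {"role": "system", "content": "You are a coding assistant."},
            {"role": "user", "content": prompt},
        ],
        "temperature": temperature,
        "max_tokens": max_tokens,
    }
```

The --temperature and --max-tokens flags map directly onto the last two fields, which is why lowering --temperature makes edits more deterministic.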

🐛 Troubleshooting

"Cannot connect to LM Studio"

  • Make sure LM Studio is running
  • Verify it's accessible at http://localhost:1234
  • Check your firewall settings

"Command not found: gpp"

  • Run ./gpp --help from the installation directory to confirm the script itself works
  • Verify PATH is set correctly: echo $PATH
  • Restart your terminal after installation

"Module not found: model_manager"

  • Ensure model_manager.py is in the same directory as gpp
  • Check that model_manager.py is readable by your user (execute permission is not required for an imported module)

"requests library not found"

pip3 install requests

Model won't load

  • Check if another model is already loaded
  • Use gpp --unload-model first
  • Verify sufficient disk space and RAM

📖 More Information

  • Model Management: See MODEL_MANAGEMENT.md
  • Implementation Details: See IMPLEMENTATION_SUMMARY.md
  • Quick Reference: See MODEL_COMMANDS_QUICK_REFERENCE.txt

🤝 Contributing

To improve this tool:

  1. Test with different models and edge cases
  2. Report bugs and issues
  3. Suggest new features

📝 License

This project is provided as-is for use with LM Studio.

🎯 Summary

GPP brings AI-powered code generation to your terminal with:

  • ✅ Single and multiple file support
  • ✅ Global terminal access from any directory
  • ✅ Easy installation and setup
  • ✅ Full model management capabilities
  • ✅ Multiple language support
  • ✅ Simple, intuitive command syntax

Get started now:

./install.sh
gpp --help
