tkykszk/aici
aici πŸš€


A command line interface tool for AI models like OpenAI's ChatGPT and DeepSeek AI. πŸ€–πŸ’¬

Use case: you want to use AI models from editors like Emacs, or from automated tools.

(Screenshots: command-line and Emacs usage.)

πŸ“¦ Installation:

pip install aici

πŸ“– Overview:

AICI (AI Chat Interface) is a Python🐍 command-line tool for interacting with AI models from OpenAI or DeepSeek. It takes a user's prompt as input and outputs the response from the selected AI model. The output can be directed to either standard output or the clipboardπŸ“‹.

Key Features:

  • Support for OpenAI and DeepSeek models
  • Default model: gpt-4o-mini (updated from deprecated gpt-3.5-turbo)
  • Streaming responses (or complete responses with -c)
  • Custom system messages via direct input or file
  • Clipboard output support
  • Environment variable configuration
  • JSON conversation format support
  • Automatic warnings for deprecated models

πŸ’» Command-Line Description:

| Argument | Env var | Default | Type | Description |
|---|---|---|---|---|
| `-v, --version` | - | - | - | Show version and exit |
| `prompt` | - | - | str | Prompt to send to the AI. Specify `-` to read from stdin |
| `-m, --model` | `AICI_MODEL` | `gpt-4o-mini` | str | Model name to use (`gpt-4o`, `gpt-4o-mini`, `gpt-4-turbo`, `deepseek-chat`, etc.). ⚠️ `gpt-3.5-turbo` deprecated in Feb 2026 |
| `-c, --complete` | - | `False` (streaming) | bool | Get the complete response at once instead of streaming |
| `-s, --system` | `AICI_SYSTEM` | "You are a helpful assistant." | str | Specify the system message |
| `-S, --system-file` | `AICI_SYSTEM_FILE` | - | str | Specify a file containing the system message |
| `-V, --verbose` | - | `False` | bool | Show detailed debug information |
| `-o, --output` | - | stdout | str | Specify the output destination. Use `clip` to copy to the clipboard |
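The option table above maps naturally onto Python's argparse. The sketch below mirrors the table's flags and defaults for illustration only; it is not aici's actual source.

```python
import argparse

def build_parser():
    """Sketch of a parser mirroring the option table above (not aici's code)."""
    p = argparse.ArgumentParser(prog="aici")
    p.add_argument("-v", "--version", action="version", version="aici (sketch)")
    p.add_argument("prompt", help='Prompt to send to the AI; "-" reads from stdin')
    p.add_argument("-m", "--model", default="gpt-4o-mini")
    p.add_argument("-c", "--complete", action="store_true",
                   help="Return the full response at once instead of streaming")
    p.add_argument("-s", "--system", default="You are a helpful assistant.")
    p.add_argument("-S", "--system-file", dest="system_file")
    p.add_argument("-V", "--verbose", action="store_true")
    p.add_argument("-o", "--output", default="stdout",
                   help='Use "clip" to copy the response to the clipboard')
    return p

args = build_parser().parse_args(["-m", "gpt-4o", "Hello"])
print(args.model, args.prompt, args.complete)  # gpt-4o Hello False
```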

  • OpenAI models documentation
  • DeepSeek models documentation

πŸ“ Logging

Log File Location:

Aici automatically creates a log file to record API calls, errors, and debug information:

  • If config file exists: Log file is created in the same directory as your config file

    • Linux/macOS: ~/.config/aici/aici.log or ~/.aici/aici.log
    • Windows: %USERPROFILE%\AppData\Local\aici\aici.log
  • If no config file: Default location is ~/.config/aici/aici.log

Log Levels:

  • INFO: Records all API calls and responses (always logged to file)
  • WARNING: Shows warnings in console (e.g., deprecated models)
  • DEBUG: Detailed information for troubleshooting (enabled with -V flag)
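The location rule above ("next to the config file if one exists, otherwise the default") can be sketched as a small resolver. This is an illustration only, not aici's implementation, and the helper name `resolve_log_path` is hypothetical.

```python
from pathlib import Path

def resolve_log_path(config_path=None):
    """Hypothetical helper: the log file lives in the same directory as the
    config file if one exists; otherwise fall back to ~/.config/aici/aici.log."""
    if config_path is not None:
        return Path(config_path).parent / "aici.log"
    return Path.home() / ".config" / "aici" / "aici.log"

print(resolve_log_path("/home/user/.config/aici/config"))
# -> /home/user/.config/aici/aici.log
```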

Viewing Logs:

# View recent log entries
tail -f ~/.config/aici/aici.log

# Enable verbose mode to see debug info in console
aici "Hello" -V

Note: If aici cannot write to the log file (e.g., permission denied), it will show a warning and continue without file logging.

πŸ“₯ input

πŸ’» std input πŸ’¬ command parameter

πŸ“€output

πŸ’» std output (streaming, buffering) πŸ“‹ clipboard

πŸ”§ Config Environment Variables or File:

πŸ”‘ API keys can be set using environment variables or config files

API Key Setup

Getting API Keys

OpenAI API Key:

  1. Visit OpenAI Platform
  2. Sign up or log in to your account
  3. Navigate to API Keys
  4. Click "Create new secret key"
  5. Copy the key (starts with sk-)
  6. Store it securely - you won't be able to see it again

DeepSeek API Key:

  1. Visit DeepSeek Platform
  2. Sign up or log in to your account
  3. Navigate to API Keys section
  4. Generate a new API key
  5. Copy the key (starts with sk-)
  6. Store it securely

Setting Up API Keys

You can configure API keys in two ways:

Method 1: Environment Variables (Temporary - current session only)

# Linux/macOS
export AICI_OPENAI_KEY=sk-your-openai-key-here

# Windows
set AICI_OPENAI_KEY=sk-your-openai-key-here

Method 2: Config File (Permanent - persists across sessions)

Create a config file at one of these locations:

  • Linux/macOS: ~/.config/aici/config or ~/.aici
  • Windows: %USERPROFILE%\AppData\Local\aici\config

Add your API key to the file:

AICI_OPENAI_KEY=sk-your-openai-key-here

Verification:

# Check if aici can find your API key
aici --version  # Should not show API key errors
aici "Hello" -V  # Verbose mode shows configuration loaded

Environment Variables

Priority Order

API Keys (in order of priority):

  1. AICI_OPENAI_KEY (highest priority for OpenAI models)
  2. OPENAI_API_KEY (fallback for OpenAI models)
  3. AICI_DEEPSEEK_KEY (highest priority for DeepSeek models)
  4. DEEPSEEK_API_KEY (fallback for DeepSeek models)
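The fallback order above amounts to a first-match lookup over environment variables. A minimal sketch (illustrative only, not aici's code):

```python
import os

def resolve_api_key(provider, env=None):
    """Return the first key found, honouring the priority order listed above.
    Illustrative sketch; `env` defaults to the process environment."""
    if env is None:
        env = os.environ
    names = {
        "openai":   ["AICI_OPENAI_KEY", "OPENAI_API_KEY"],
        "deepseek": ["AICI_DEEPSEEK_KEY", "DEEPSEEK_API_KEY"],
    }
    for name in names[provider]:
        if env.get(name):
            return env[name]
    return None

print(resolve_api_key("openai", {"OPENAI_API_KEY": "sk-fallback"}))  # sk-fallback
```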

Model Selection (in order of priority):

  1. -m command line option (highest priority)
  2. AICI_MODEL environment variable
  3. AICI_OPENAI_MODEL or AICI_DEEPSEEK_MODEL (provider-specific)
  4. Default: gpt-4o-mini

System Message (in order of priority):

  1. -s command line option (highest priority)
  2. -S file specified via command line
  3. AICI_SYSTEM_FILE environment variable
  4. AICI_SYSTEM environment variable
  5. Default: "You are a helpful assistant."
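Taken together, the system-message sources resolve in the order listed above. The following sketch shows that resolution; the function and argument names are hypothetical, not aici's internals.

```python
import os

DEFAULT_SYSTEM = "You are a helpful assistant."

def resolve_system_message(cli_system=None, cli_system_file=None, env=None):
    """Illustrative resolution of the priority list above:
    -s option > -S file > AICI_SYSTEM_FILE > AICI_SYSTEM > default."""
    if env is None:
        env = os.environ
    if cli_system:
        return cli_system
    for path in (cli_system_file, env.get("AICI_SYSTEM_FILE")):
        if path:
            with open(path, encoding="utf-8") as f:
                return f.read().strip()
    return env.get("AICI_SYSTEM") or DEFAULT_SYSTEM
```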

Note on File Paths (Windows):

  • File paths are automatically normalized for cross-platform compatibility
  • You can use forward slashes (/) even on Windows - they will be converted to backslashes (\)
  • Examples that work on Windows:
    set AICI_SYSTEM_FILE=C:/Users/YourName/system.txt
    set AICI_SYSTEM_FILE=~/Documents/system.txt
    Both will be correctly processed as Windows paths
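The slash conversion described above matches general Windows path handling. For instance, Python's pathlib renders forward-slash input with backslashes when treated as a Windows path; this demonstrates the behaviour, it is not aici's code.

```python
from pathlib import PureWindowsPath

# Forward slashes in the input are rendered with backslash separators.
p = PureWindowsPath("C:/Users/YourName/system.txt")
print(p)             # C:\Users\YourName\system.txt
print(p.as_posix())  # C:/Users/YourName/system.txt
```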

Configuration Examples

Linux/macOS:

# OpenAI API Key (AICI_OPENAI_KEY takes priority over OPENAI_API_KEY)
export AICI_OPENAI_KEY=sk-xxxxxxxxxxxxxxxxx
export OPENAI_API_KEY=sk-xxxxxxxxxxxxxxxxx

# DeepSeek API Key (AICI_DEEPSEEK_KEY takes priority over DEEPSEEK_API_KEY)
export AICI_DEEPSEEK_KEY=sk-xxxxxxxxxxxxxxxxx
export DEEPSEEK_API_KEY=sk-xxxxxxxxxxxxxxxxx

# Model Selection
export AICI_MODEL=gpt-4o-mini           # General model selection
export AICI_OPENAI_MODEL=gpt-4o-mini    # OpenAI specific model
export AICI_DEEPSEEK_MODEL=deepseek-chat # DeepSeek specific model

# System Message
export AICI_SYSTEM="You are a helpful assistant."
export AICI_SYSTEM_FILE=~/path/to/system_message.txt

Windows (Command Prompt):

# OpenAI API Key (AICI_OPENAI_KEY takes priority over OPENAI_API_KEY)
set AICI_OPENAI_KEY=sk-xxxxxxxxxxxxxxxxx
set OPENAI_API_KEY=sk-xxxxxxxxxxxxxxxxx

# DeepSeek API Key (AICI_DEEPSEEK_KEY takes priority over DEEPSEEK_API_KEY)
set AICI_DEEPSEEK_KEY=sk-xxxxxxxxxxxxxxxxx
set DEEPSEEK_API_KEY=sk-xxxxxxxxxxxxxxxxx

# Model Selection
set AICI_MODEL=gpt-4o-mini
set AICI_OPENAI_MODEL=gpt-4o-mini
set AICI_DEEPSEEK_MODEL=deepseek-chat

# System Message
set AICI_SYSTEM="You are a helpful assistant."
set AICI_SYSTEM_FILE=C:\path\to\system_message.txt

Config Files

Aici checks for config files in the following locations, in order:

  1. ~/.config/aici/config
  2. ~/.aici

# API Keys
AICI_OPENAI_KEY=sk-xxxxxxxxxxxxxxxxx
OPENAI_API_KEY=sk-xxxxxxxxxxxxxxxxx
AICI_DEEPSEEK_KEY=sk-xxxxxxxxxxxxxxxxx
DEEPSEEK_API_KEY=sk-xxxxxxxxxxxxxxxxx

# Model Selection
AICI_MODEL=gpt-4o-mini
AICI_OPENAI_MODEL=gpt-4o-mini
AICI_DEEPSEEK_MODEL=deepseek-chat

# System Message
AICI_SYSTEM=You are a helpful assistant.
AICI_SYSTEM_FILE=~/path/to/system_message.txt  # Load system message from file
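The config file is plain `KEY=VALUE` lines, so reading one can be sketched in a few lines. This is an illustration, not aici's parser (for instance, it does not strip inline comments after a value).

```python
def parse_config(text):
    """Parse KEY=VALUE lines; blank lines and '#' comment lines are skipped.
    Illustrative sketch only."""
    config = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        config[key.strip()] = value.strip()
    return config

sample = """# Model Selection
AICI_MODEL=gpt-4o-mini
AICI_SYSTEM=You are a helpful assistant.
"""
print(parse_config(sample)["AICI_MODEL"])  # gpt-4o-mini
```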

πŸ–₯️ On Windows file path, it is expanded like

File path Windows Specific
C:\Users\{USERNAME}\AppData\Local\aici\config βœ”
C:\Users\{USERNAME}\AppData\Roaming\aici\config βœ”
C:\Users\{USERNAME}\.config\aici\config
C:\Users\{USERNAME}\.aici
(The priority of the applied config files is in the order listed from top to bottom.)

πŸ‘‹ Examples:

πŸ’¨ Basic input from CLI

$ aici Hello

πŸ’¨ Read from stdin

$ echo Hello | aici -

πŸ’¨ Specify a model

$ aici -m gpt-4o "What's the weather like today?"
$ aici -m gpt-4o-mini "What's the weather like today?"
$ aici -m deepseek-chat "Tell me about quantum computing"

πŸ’¨ Use a system message from a file

$ echo "You are a helpful coding assistant." > system.txt
$ aici -S system.txt "How do I write a Python function?"

πŸ’¨ Enable debug mode

$ aici -V "Hello there"

πŸ’¨ output to clipboard πŸ“‹

$ echo Hello | aici - --output clip

πŸ‘‹ emacs

Emacs Lisp Code Example

Below is the content of emacs/aici.el

(defun aici-call ()
  "Send the selected region, or prompted input if no region is active, to the `aici' command and insert the output in real time."
  (interactive)
  (let* ((text (if (use-region-p)
                   (buffer-substring-no-properties (region-beginning) (region-end))
                 (read-string "Enter text: ")))
         ;; Attempt to create or get the output buffer
         (output-buffer (get-buffer-create "*AICI Output*")))

    ;; Check if the buffer creation was successful
    (if (not output-buffer)
        (error "Failed to create or access the output buffer")
      ;; Clear the output buffer
      (with-current-buffer output-buffer
        (erase-buffer)
        ;; Set the buffer to markdown-mode
        (markdown-mode))

      ;; Display a message indicating that processing is ongoing
      (message "Processing...")

      ;; Start the process and stream the output to the buffer
      (let ((process (start-process "aici-process" output-buffer "sh" "-c"
                                    (format "echo %s | aici -" (shell-quote-argument text)))))
        ;; Set a process filter to handle output streaming
        (set-process-filter process
                            (lambda (proc output)
                              ;; Explicitly reference the output-buffer by capturing it in the lambda
                              (let ((buffer (process-buffer proc)))
                                (when (buffer-live-p buffer)
                                  (with-current-buffer buffer
                                    (goto-char (point-max))
                                    (insert output)
                                    ;; Optionally display the buffer in real-time
                                    (display-buffer buffer))))))

        ;; Set a sentinel to handle process completion
        (set-process-sentinel process
                              (lambda (proc event)
                                ;; Again, ensure that output-buffer is properly referenced
                                (let ((buffer (process-buffer proc)))
                                  (when (buffer-live-p buffer)
                                    (if (string= event "finished\n")
                                        (message "Processing complete.")
                                      (message "Processing interrupted: %s" event))))))))

      ;; Ensure the output buffer is displayed after starting the process
      (display-buffer output-buffer)))

License

This project is licensed under the MIT License - see the LICENSE file for details.
