A command-line interface tool for AI models such as OpenAI's ChatGPT and DeepSeek. 🤖💬
Use case: you want to use AI models from editors such as Emacs, or from automated tools.
```shell
pip install aici
```
AICI (AI Chat Interface) is a Python 🐍 command-line tool for interacting with AI models from OpenAI or DeepSeek. It takes a user's prompt as input and outputs the response from the selected AI model. The output can be directed to standard output or to the clipboard 📋.
Key Features:
- Support for OpenAI and DeepSeek models
- Default model: `gpt-4o-mini` (updated from the deprecated `gpt-3.5-turbo`)
- Streaming responses (or complete responses with `-c`)
- Custom system messages via direct input or file
- Clipboard output support
- Environment variable configuration
- JSON conversation format support
- Automatic warnings for deprecated models
| Argument | Env var | Default | Type | Description |
|---|---|---|---|---|
| `-v, --version` | - | - | - | Show version and exit |
| `prompt` | - | - | str | Prompt to send to the AI. Specify `-` to read from stdin |
| `-m, --model` | `AICI_MODEL` | `gpt-4o-mini` | str | Model name to use (`gpt-4o`, `gpt-4o-mini`, `gpt-4-turbo`, `deepseek-chat`, etc.) |
| `-c, --complete` | - | False (streaming by default) | bool | Get the complete response at once without streaming |
| `-s, --system` | `AICI_SYSTEM` | "You are a helpful assistant." | str | Specify the system message |
| `-S, --system-file` | `AICI_SYSTEM_FILE` | - | str | Specify a file containing the system message |
| `-V, --verbose` | - | False | bool | Show detailed debug information |
| `-o, --output` | - | stdout | str | Specify the output destination. Use `clip` to copy to the clipboard |
- OpenAI models documentation
- DeepSeek models documentation
Log File Location:
Aici automatically creates a log file to record API calls, errors, and debug information:
- If a config file exists: the log file is created in the same directory as your config file
  - Linux/macOS: `~/.config/aici/aici.log` or `~/.aici/aici.log`
  - Windows: `%USERPROFILE%\AppData\Local\aici\aici.log`
- If no config file exists: the default location is `~/.config/aici/aici.log`
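The location rule above can be sketched in Python. `resolve_log_path` is a hypothetical helper (not aici's actual code) that mirrors the documented behavior:

```python
from pathlib import Path
from typing import Optional

def resolve_log_path(config_path: Optional[Path], home: Path) -> Path:
    """Hypothetical sketch: if a config file was found, the log file
    lives in the same directory; otherwise use the default location."""
    if config_path is not None and config_path.is_file():
        # Config file found: aici.log goes next to it
        return config_path.parent / "aici.log"
    # No config file: fall back to the default location
    return home / ".config" / "aici" / "aici.log"
```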
Log Levels:
- INFO: records all API calls and responses (always logged to file)
- WARNING: shows warnings in the console (e.g., deprecated models)
- DEBUG: detailed information for troubleshooting (enabled with the `-V` flag)
Viewing Logs:
```shell
# View recent log entries
tail -f ~/.config/aici/aici.log

# Enable verbose mode to see debug info in console
aici "Hello" -V
```

Note: If aici cannot write to the log file (e.g., permission denied), it will show a warning and continue without file logging.
💻 std input 💬 command parameter

💻 std output (streaming, buffering) 📋 clipboard

🔑 API keys can be set using environment variables or config files
OpenAI API Key:
- Visit OpenAI Platform
- Sign up or log in to your account
- Navigate to API Keys
- Click "Create new secret key"
- Copy the key (starts with `sk-`)
- Store it securely - you won't be able to see it again
DeepSeek API Key:
- Visit DeepSeek Platform
- Sign up or log in to your account
- Navigate to API Keys section
- Generate a new API key
- Copy the key (starts with `sk-`)
- Store it securely
You can configure API keys in two ways:
Method 1: Environment Variables (temporary - current session only)

```shell
# Linux/macOS
export AICI_OPENAI_KEY=sk-your-openai-key-here

# Windows
set AICI_OPENAI_KEY=sk-your-openai-key-here
```

Method 2: Config File (permanent - persists across sessions)
Create a config file at one of these locations:
- Linux/macOS: `~/.config/aici/config` or `~/.aici`
- Windows: `%USERPROFILE%\AppData\Local\aici\config`
Add your API key to the file:
```
AICI_OPENAI_KEY=sk-your-openai-key-here
```
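On Linux/macOS, for example, you could create the config file like this (a sketch; substitute your real key):

```shell
# Create the config directory and file (Linux/macOS)
mkdir -p ~/.config/aici
printf 'AICI_OPENAI_KEY=%s\n' "sk-your-openai-key-here" > ~/.config/aici/config
# Restrict permissions, since the file contains a secret
chmod 600 ~/.config/aici/config
```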
Verification:
```shell
# Check if aici can find your API key
aici --version   # Should not show API key errors
aici "Hello" -V  # Verbose mode shows the configuration loaded
```

API Keys (in order of priority):
1. `AICI_OPENAI_KEY` (highest priority for OpenAI models)
2. `OPENAI_API_KEY` (fallback for OpenAI models)
3. `AICI_DEEPSEEK_KEY` (highest priority for DeepSeek models)
4. `DEEPSEEK_API_KEY` (fallback for DeepSeek models)
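The key-resolution order can be illustrated with a small Python sketch (`pick_api_key` and `KEY_CHAINS` are hypothetical names, not aici's actual implementation; only the priority order is taken from the list above):

```python
import os
from typing import Optional

# Priority chains as documented: AICI_* first, then the standard variable
KEY_CHAINS = {
    "openai": ("AICI_OPENAI_KEY", "OPENAI_API_KEY"),
    "deepseek": ("AICI_DEEPSEEK_KEY", "DEEPSEEK_API_KEY"),
}

def pick_api_key(provider: str, env=os.environ) -> Optional[str]:
    """Return the first key set in the environment, in priority order."""
    for name in KEY_CHAINS[provider]:
        if env.get(name):
            return env[name]
    return None
```

With both `AICI_OPENAI_KEY` and `OPENAI_API_KEY` set, the `AICI_`-prefixed variable wins.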
Model Selection (in order of priority):
1. `-m` command line option (highest priority)
2. `AICI_MODEL` environment variable
3. `AICI_OPENAI_MODEL` or `AICI_DEEPSEEK_MODEL` (provider-specific)
4. Default: `gpt-4o-mini`
System Message (in order of priority):
1. `-s` command line option (highest priority)
2. `-S` file specified via command line
3. `AICI_SYSTEM_FILE` environment variable
4. `AICI_SYSTEM` environment variable
5. Default: "You are a helpful assistant."
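A Python sketch of this resolution order (hypothetical helper; only the priority logic comes from the list above):

```python
import os
from typing import Optional

DEFAULT_SYSTEM = "You are a helpful assistant."

def resolve_system_message(cli_system: Optional[str] = None,
                           cli_system_file: Optional[str] = None,
                           env=os.environ) -> str:
    """Apply the documented priority: -s, then -S, then
    AICI_SYSTEM_FILE, then AICI_SYSTEM, then the built-in default."""
    if cli_system:                        # -s wins outright
        return cli_system
    path = cli_system_file or env.get("AICI_SYSTEM_FILE")
    if path:                              # -S or AICI_SYSTEM_FILE
        with open(os.path.expanduser(path)) as f:
            return f.read().strip()
    return env.get("AICI_SYSTEM", DEFAULT_SYSTEM)
```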
Note on File Paths (Windows):
- File paths are automatically normalized for cross-platform compatibility
- You can use forward slashes (`/`) even on Windows; they will be converted to backslashes (`\`)
- Examples that work on Windows:

```shell
set AICI_SYSTEM_FILE=C:/Users/YourName/system.txt
set AICI_SYSTEM_FILE=~/Documents/system.txt
```

Both will be correctly processed as Windows paths.
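The same forward-slash-to-backslash normalization can be demonstrated with Python's `pathlib` (illustration only; not necessarily how aici implements it):

```python
from pathlib import PureWindowsPath

# Forward slashes are accepted and rendered with backslashes
p = PureWindowsPath("C:/Users/YourName/system.txt")
print(p)  # C:\Users\YourName\system.txt
```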
Linux/macOS:
```shell
# OpenAI API Key (AICI_OPENAI_KEY takes priority over OPENAI_API_KEY)
export AICI_OPENAI_KEY=sk-xxxxxxxxxxxxxxxxx
export OPENAI_API_KEY=sk-xxxxxxxxxxxxxxxxx

# DeepSeek API Key (AICI_DEEPSEEK_KEY takes priority over DEEPSEEK_API_KEY)
export AICI_DEEPSEEK_KEY=sk-xxxxxxxxxxxxxxxxx
export DEEPSEEK_API_KEY=sk-xxxxxxxxxxxxxxxxx

# Model Selection
export AICI_MODEL=gpt-4o-mini            # General model selection
export AICI_OPENAI_MODEL=gpt-4o-mini     # OpenAI-specific model
export AICI_DEEPSEEK_MODEL=deepseek-chat # DeepSeek-specific model

# System Message
export AICI_SYSTEM="You are a helpful assistant."
export AICI_SYSTEM_FILE=~/path/to/system_message.txt
```

Windows (Command Prompt):
```shell
# OpenAI API Key (AICI_OPENAI_KEY takes priority over OPENAI_API_KEY)
set AICI_OPENAI_KEY=sk-xxxxxxxxxxxxxxxxx
set OPENAI_API_KEY=sk-xxxxxxxxxxxxxxxxx

# DeepSeek API Key (AICI_DEEPSEEK_KEY takes priority over DEEPSEEK_API_KEY)
set AICI_DEEPSEEK_KEY=sk-xxxxxxxxxxxxxxxxx
set DEEPSEEK_API_KEY=sk-xxxxxxxxxxxxxxxxx

# Model Selection
set AICI_MODEL=gpt-4o-mini
set AICI_OPENAI_MODEL=gpt-4o-mini
set AICI_DEEPSEEK_MODEL=deepseek-chat

# System Message
set AICI_SYSTEM="You are a helpful assistant."
set AICI_SYSTEM_FILE=C:\path\to\system_message.txt
```

Aici checks for a config file in the following locations, in the order listed below:

1. `~/.config/aici/config`
2. `~/.aici`
```
# API Keys
AICI_OPENAI_KEY=sk-xxxxxxxxxxxxxxxxx
OPENAI_API_KEY=sk-xxxxxxxxxxxxxxxxx
AICI_DEEPSEEK_KEY=sk-xxxxxxxxxxxxxxxxx
DEEPSEEK_API_KEY=sk-xxxxxxxxxxxxxxxxx

# Model Selection
AICI_MODEL=gpt-4o-mini
AICI_OPENAI_MODEL=gpt-4o-mini
AICI_DEEPSEEK_MODEL=deepseek-chat

# System Message
AICI_SYSTEM=You are a helpful assistant.
AICI_SYSTEM_FILE=~/path/to/system_message.txt  # Load system message from file
```
🖥️ On Windows, file paths are expanded as follows:
| File path | Windows specific |
|---|---|
| `C:\Users\{USERNAME}\AppData\Local\aici\config` | ✅ |
| `C:\Users\{USERNAME}\AppData\Roaming\aici\config` | ✅ |
| `C:\Users\{USERNAME}\.config\aici\config` | |
| `C:\Users\{USERNAME}\.aici` | |

(The priority of the applied config files is in the order listed from top to bottom.)
🎨 Basic input from CLI

```shell
$ aici Hello
```

🎨 Read from stdin

```shell
$ echo Hello | aici -
```

🎨 Specify a model

```shell
$ aici -m gpt-4o "What's the weather like today?"
$ aici -m gpt-4o-mini "What's the weather like today?"
$ aici -m deepseek-chat "Tell me about quantum computing"
```

🎨 Use a system message from a file

```shell
$ echo "You are a helpful coding assistant." > system.txt
$ aici -S system.txt "How do I write a Python function?"
```

🎨 Enable debug mode

```shell
$ aici -V "Hello there"
```

🎨 Output to clipboard 📋

```shell
$ echo Hello | aici - --output clip
```
Below is the content of emacs/aici.el:

```elisp
(defun aici-call ()
  "Send selected region or prompt for input if no region is selected to the 'aici' command and insert the output in real-time."
  (interactive)
  (let* ((text (if (use-region-p)
                   (buffer-substring-no-properties (region-beginning) (region-end))
                 (read-string "Enter text: ")))
         ;; Attempt to create or get the output buffer
         (output-buffer (get-buffer-create "*AICI Output*")))
    ;; Check if the buffer creation was successful
    (if (not output-buffer)
        (error "Failed to create or access the output buffer")
      ;; Clear the output buffer
      (with-current-buffer output-buffer
        (erase-buffer)
        ;; Set the buffer to markdown-mode
        (markdown-mode))
      ;; Display a message indicating that processing is ongoing
      (message "Processing...")
      ;; Start the process and stream the output to the buffer
      (let ((process (start-process "aici-process" output-buffer "sh" "-c"
                                    (format "echo %s | aici -" (shell-quote-argument text)))))
        ;; Set a process filter to handle output streaming
        (set-process-filter process
                            (lambda (proc output)
                              ;; Explicitly reference the output-buffer by capturing it in the lambda
                              (let ((buffer (process-buffer proc)))
                                (when (buffer-live-p buffer)
                                  (with-current-buffer buffer
                                    (goto-char (point-max))
                                    (insert output)
                                    ;; Optionally display the buffer in real-time
                                    (display-buffer buffer))))))
        ;; Set a sentinel to handle process completion
        (set-process-sentinel process
                              (lambda (proc event)
                                ;; Again, ensure that output-buffer is properly referenced
                                (let ((buffer (process-buffer proc)))
                                  (when (buffer-live-p buffer)
                                    (if (string= event "finished\n")
                                        (message "Processing complete.")
                                      (message "Processing interrupted: %s" event))))))))
    ;; Ensure the output buffer is displayed after starting the process
    (display-buffer output-buffer)))
```
This project is licensed under the MIT License - see the LICENSE file for details.

