# ollama_boolean - Pre-built Binaries

This directory contains pre-built binaries for the ollama_boolean application.
## Available Binaries

- `ollama_boolean` - Linux x64
- `ollama_boolean-arm64` - macOS Apple Silicon (ARM64)
- `ollama_boolean.exe` - Windows x64
## Quick Start

### Prerequisites

- Ollama must be running on your system on port 11434
- Have at least one model installed (e.g., `qwen3`, `llama3.2`)

```bash
# Install and start Ollama if you haven't already
ollama pull qwen3
ollama serve
```

### Usage
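If you are unsure whether Ollama is already up, you can probe its HTTP API before running the binary. This is a sketch: it assumes the default port 11434 and uses the `/api/tags` endpoint, which lists installed models and answers with HTTP 200 when the server is running.

```shell
# Check that Ollama is reachable on port 11434.
# /api/tags lists installed models and returns HTTP 200 when the server is up.
if curl -sf http://localhost:11434/api/tags > /dev/null 2>&1; then
  echo "Ollama is running"
else
  echo "Ollama is not reachable on port 11434" >&2
fi
```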
Choose the appropriate binary for your platform and run it:
**Linux/macOS:**

```bash
# Make executable (first time only)
chmod +x ollama_boolean

# Basic usage
./ollama_boolean "Is the sky blue?"

# Quiet mode (returns only the number)
./ollama_boolean --quiet "Can pigs fly?"

# With custom model
./ollama_boolean "Is this art beautiful?" llama3.2
```

**Windows:**

```powershell
# Basic usage
.\ollama_boolean.exe "Is the sky blue?"

# Quiet mode
.\ollama_boolean.exe --quiet "Can pigs fly?"
```

## Response Codes
The application returns:

- `1` - Yes/Positive/True
- `0` - No/Negative/False
- `2` - Unknown/Subjective/Uncertain
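In scripts, the printed code can be mapped to a human-readable label with a small `case` statement. This is a sketch: `label` is a hypothetical helper name, and it assumes quiet mode prints just the bare number, as described above.

```shell
# Hypothetical helper: map the numeric answer to a label.
label() {
  case "$1" in
    1) echo "yes" ;;       # Yes/Positive/True
    0) echo "no" ;;        # No/Negative/False
    2) echo "unknown" ;;   # Unknown/Subjective/Uncertain
    *) echo "unexpected output: $1" >&2; return 1 ;;
  esac
}

# Usage (requires Ollama running):
# answer=$(./ollama_boolean --quiet "Is the sky blue?")
# label "$answer"
```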
## Examples

```bash
# Factual question (returns 1)
./ollama_boolean "Is water wet?"

# False statement (returns 0)
./ollama_boolean "Can humans breathe underwater without equipment?"

# Subjective question (returns 2)
./ollama_boolean "Is pizza the best food?"

# Script-friendly quiet mode
result=$(./ollama_boolean --quiet "Is 2+2=4?")
echo "Result: $result"
```

## Installation for Global Use
**Linux/macOS:**

```bash
# Copy to system PATH
sudo cp ollama_boolean /usr/local/bin/

# Now you can use 'ollama_boolean' from anywhere
```

**Windows:**

```powershell
# Add to a directory in your PATH or copy to System32
copy ollama_boolean.exe C:\Windows\System32\
```

## Help

```bash
./ollama_boolean --help
```

## Troubleshooting
- **"Connection refused"**: Make sure Ollama is running (`ollama serve`)
- **"Model not found"**: Install the model (`ollama pull qwen3`)
- **"Permission denied"**: Make the binary executable (`chmod +x ollama_boolean`)
For more detailed documentation, see the main repository README.