AIStack CLI is an experimental Swiss-army-knife command-line application designed to streamline the installation and management of AI development tools, including gemini-cli, opencode, and various MCP servers. Its main goal is to provide a convenient way to install and configure AI tools with minimal impact on your system, so you can test or use them easily.
- AI Tool Management: Streamlines the installation and configuration of AI agents like `gemini-cli` and `opencode`, with sensible minimal default settings.
- MCP Server Integration: Easily configure and manage connections to various MCP (Model Context Protocol) servers.
- Isolated Environments: All tools are installed into a local `workspace/` directory, preventing system-wide conflicts. Installing an agent or MCP server will not pollute your system or your development environment's PATH in any way with its own dependencies (nodejs, python, ...). Everything is contained in an easily deletable internal folder.
- Portability: Pure Bash application; works on Linux and macOS.
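The isolation guarantee above can be sketched as follows: since every dependency lives under a single `workspace/` directory, a full cleanup is one `rm -rf`. The paths below are illustrative stand-ins, not the real aistack layout.

```shell
# Sketch of the isolation guarantee: everything installed by the tool lives
# under one workspace/ directory, so uninstalling is a single rm -rf.
# (demo/ is an illustrative stand-in, not a real aistack path)
mkdir -p demo/workspace/isolated_dependencies/nodejs/bin
rm -rf demo/workspace          # removes every installed tool at once
[ -d demo/workspace ] || echo "workspace removed"
rmdir demo
```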
Requirements: `bash`, `git`
aistack provides a simple command-line interface to manage your tools and environments.
| Command | Description |
|---|---|
| `init` | Install/Reinstall dependencies |
| `help` | Display help message |
| `shell` | Enter a sub-shell with the aistack environment and paths configured |
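As a rough conceptual sketch of the `shell` subcommand (this is not the real implementation, and the workspace layout shown is an assumption): it spawns a sub-shell whose PATH puts the workspace-local binaries first, so tools resolve to the isolated copies instead of system ones.

```shell
# Conceptual sketch of `aistack shell` (illustrative, not the real code):
# put the workspace-local bin directory at the front of PATH in a sub-shell.
WORKSPACE_BIN="$PWD/workspace/isolated_dependencies/nodejs/bin"   # assumed layout
PATH="$WORKSPACE_BIN:$PATH" sh -c 'echo "first PATH entry: ${PATH%%:*}"'
```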
Install and configure gemini-cli from scratch:

```shell
git clone https://github.com/StudioEtrange/aistack
cd aistack
./aistack init
./aistack gc install
./aistack gc register bash
```
Register the local MCP server calculator:

```shell
cd aistack
./aistack mcp calculator install
```
Configure the underlying nodejs to add a local npm registry:

```shell
cd aistack
./aistack npm-config set registry https://registry.local.org/
```
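For context, npm ultimately stores such settings as `key=value` lines in its user config file (`.npmrc`). A throwaway sketch of what a registry override looks like on disk:

```shell
# npm persists config as key=value lines in .npmrc; sketch with a temp file.
demo_npmrc=$(mktemp)
printf 'registry=https://registry.local.org/\n' > "$demo_npmrc"
cat "$demo_npmrc"            # → registry=https://registry.local.org/
rm -f "$demo_npmrc"
```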
- `aistack/`: Contains the main application logic for the `aistack` wrapper script.
- `lib/`: Internal tool libraries and code.
- `pool/`: Contains configuration file templates and the framework.
- `workspace/`: The directory where all isolated environments and installed software ("features") are stored.
- See Gemini CLI
- See Opencode
- See VS Code
- See CLIProxyAPI
AIStack simplifies connecting to MCP (Model Context Protocol) servers, allowing your AI agents to interact with external tools and services.
- Catalogs: MCPMarket, PulseMCP, MCPServers.org
Supported MCP Servers:
- Desktop Commander: Grants terminal control and file system access. (Source)
- Calculator: A simple server for performing calculations. (Source)
- Context7: Fetches up-to-date documentation and code examples. (Source). https://context7.com/
- GitHub: Official server for interacting with GitHub issues, PRs, and repositories. (Source)
- Data Commons: Tools and agents for interacting with the Data Commons Knowledge Graph using the Model Context Protocol (MCP). (Source)
AIStack leverages the Stella framework for its core functionality. Stella provides the infrastructure for application structure, environment isolation, and package management.

Package Management: Stella uses a concept of "Features" (software packages), which are defined by "Recipes" (Bash scripts). aistack uses this system to provide all the tools it manages. The recipes are located in `pool/stella/nix/pool/feature-recipe/`.
- The `npx` command needs at least the `node` binary and the `sh` binary in PATH.
- Any MCP server based on node can be registered in two ways:
A standard way, using JSON in settings.json and injecting the needed PATH env var value to reach node and other binaries (using STELLA_ORIGINAL_SYSTEM_PATH, which contains the original system PATH value):
- registered MCP server desktop-commander:

```json
{
  "mcpServers": {
    "desktop-commander": {
      "command": "npx",
      "args": ["-y", "@wonderwhy-er/desktop-commander"],
      "env": {
        "PATH": "${AISTACK_NODEJS_BIN_PATH}:${STELLA_ORIGINAL_SYSTEM_PATH}"
      }
    }
  }
}
```

Or an indirect way, using a script as launcher:
- registered MCP server context7:

```json
{
  "mcpServers": {
    "context7": {
      "command": "${AISTACK_MCP_LAUNCHER_HOME}/context7"
    }
  }
}
```

- script launcher for context7:

```shell
#!/bin/sh
export PATH="/home/nomorgan/workspace/aistack/workspace/isolated_dependencies/nodejs/bin/:${PATH}"
exec "npx" -y @upstash/context7-mcp --api-key "${CONTEXT7_API_KEY}"
```
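If you want a launcher for another node-based MCP server, the pattern above generalizes to the hedged template below. `NODE_BIN_DIR` and the package name are illustrative placeholders, not real aistack variables:

```shell
# Hedged template for a launcher script for some other node-based MCP server.
# NODE_BIN_DIR and @example/some-mcp-server are illustrative placeholders.
cat > my-mcp-launcher <<'EOF'
#!/bin/sh
# Prepend the isolated nodejs bin dir so npx and node are found,
# then replace this shell with the server process.
export PATH="${NODE_BIN_DIR}:${PATH}"
exec npx -y @example/some-mcp-server
EOF
chmod +x my-mcp-launcher
```

The `exec` at the end matters: the launcher process is replaced by the server, so signals from the MCP client reach the server directly instead of an intermediate shell.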
- kilocode VS Code extension config home: `$HOME/.vscode-server/data/User/globalStorage/kilocode.kilo-code/settings/mcp_settings.json`
- process manager : goreman https://github.com/mattn/goreman
- MCP-cli
- https://github.com/IBM/mcp-cli
- CLI to connect and interact with MCP (Model Context Protocol) servers.
- Manages conversation, tool invocation, session handling.
- Supports chat, interactive shell, and automation via MCP.
- Integrates with LLMs for reasoning and tool-based workflows.
- Serving LLM - Ollama vs vLLM : https://developers.redhat.com/articles/2025/08/08/ollama-vs-vllm-deep-dive-performance-benchmarking#comparison_2__tuned_ollama_versus_vllm
- Ollama excels in its intended role: a simple, accessible tool for local development, prototyping, and single-user applications. Its strength lies in its ease of use, not its ability to handle high-concurrency production traffic, where it struggles even when tuned.
- vLLM is unequivocally the superior choice for production deployment. It is built for performance, delivering significantly higher throughput and lower latency under heavy load. Its dynamic batching and efficient resource management make it the ideal engine for scalable, enterprise-grade AI applications.
- orla cli https://github.com/dorcha-inc/orla https://korben.info/orla-agent-ia-local-cli.html
- security tool : https://github.com/TheAuditorTool/Auditor
- Chrome DevTools MCP https://korben.info/chrome-devtools-mcp.html
See the CONTRIBUTORS file for the full list of contributors
Licensed under the Apache License, Version 2.0.
Copyright © 2025-2026 Sylvain Boucault.
See the LICENSE file for details.