HyperCode now supports multiple LLM providers through Vercel AI SDK integration. This allows you to use different AI models from various providers while maintaining a consistent interface.
- OpenAI - GPT-4o, GPT-4, GPT-3.5 Turbo
- Anthropic - Claude 3.5 Sonnet, Claude 3 Opus, Claude 3 Haiku
- Google AI - Gemini 2.5 Pro, Gemini 2.5 Flash
- Cohere - Command R+, Command R
- Mistral AI - Mistral Large, Mistral Medium
- Amazon Bedrock - Various models through AWS
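The point of the shared interface is that calling code stays identical no matter which provider is active. The sketch below illustrates the idea with a hypothetical `TextProvider` interface and stub providers; these types are illustrative only, not HyperCode's actual API (which is `UniversalAIClient`, shown later):

```typescript
// Hypothetical sketch: a minimal common interface over two providers.
// The real HyperCode client is UniversalAIClient; these stubs only
// demonstrate that the caller does not change when the provider does.
interface TextProvider {
  name: string;
  generateText(prompt: string): Promise<{ text: string }>;
}

const openaiStub: TextProvider = {
  name: 'openai',
  generateText: async (prompt) => ({ text: `[openai] ${prompt}` }),
};

const anthropicStub: TextProvider = {
  name: 'anthropic',
  generateText: async (prompt) => ({ text: `[anthropic] ${prompt}` }),
};

// Written once, works with any provider.
async function run(provider: TextProvider): Promise<string> {
  const result = await provider.generateText('Hello');
  return result.text;
}
```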
Set these environment variables to configure your preferred provider:
```bash
# Primary configuration
export HYPERCODE_PROVIDER=openai             # Provider name
export HYPERCODE_API_KEY=sk-...              # API key
export HYPERCODE_MODEL=gpt-4o                # Model name
export HYPERCODE_API_URL=https://...         # Optional custom API URL

# Proxy configuration
export HYPERCODE_PROXY=http://proxy:8080     # HyperCode-specific proxy
export HTTP_PROXY=http://proxy:8080          # General HTTP proxy
export HTTPS_PROXY=http://proxy:8080         # General HTTPS proxy

# SSL configuration (for development/debugging)
export HYPERCODE_SKIP_SSL_VERIFICATION=true  # Skip SSL verification

# Provider-specific API keys (alternative)
export OPENAI_API_KEY=sk-...
export ANTHROPIC_API_KEY=sk-ant-...
export GEMINI_API_KEY=AIza...
export COHERE_API_KEY=...
export MISTRAL_API_KEY=...
```

Example configurations for each provider:

**OpenAI:**

```bash
export HYPERCODE_PROVIDER=openai
export HYPERCODE_API_KEY=sk-proj-...
export HYPERCODE_MODEL=gpt-4o
```

**Anthropic:**

```bash
export HYPERCODE_PROVIDER=anthropic
export HYPERCODE_API_KEY=sk-ant-...
export HYPERCODE_MODEL=claude-3-5-sonnet-20241022
```

**Google:**

```bash
export HYPERCODE_PROVIDER=google
export HYPERCODE_API_KEY=AIza...
export HYPERCODE_MODEL=gemini-2.5-pro
```

**Cohere:**

```bash
export HYPERCODE_PROVIDER=cohere
export HYPERCODE_API_KEY=...
export HYPERCODE_MODEL=command-r-plus
```

**Mistral:**

```bash
export HYPERCODE_PROVIDER=mistral
export HYPERCODE_API_KEY=...
export HYPERCODE_MODEL=mistral-large-latest
```

**Amazon Bedrock:**

```bash
export HYPERCODE_PROVIDER=amazon-bedrock
export AWS_REGION=us-east-1
export AWS_ACCESS_KEY_ID=...
export AWS_SECRET_ACCESS_KEY=...
export HYPERCODE_MODEL=anthropic.claude-3-5-sonnet-20241022-v2:0
```

The CLI will automatically use the configured provider:
```bash
# Uses the provider configured in environment variables
gemini "Write a hello world program in Python"

# Provider info is shown in debug mode
gemini --debug "Explain machine learning"
```

For programmatic use:

```typescript
import { UniversalAIClient } from '@google/gemini-cli-core';

// From environment variables
const client = UniversalAIClient.fromEnvironment();
const result = await client.generateText('Hello, world!');
console.log(result.text);

// Direct configuration
const openaiClient = new UniversalAIClient({
  provider: AIProvider.OPENAI,
  apiKey: 'sk-...',
  model: 'gpt-4o',
});
```

Structured object generation:

```typescript
const schema = {
  type: 'object',
  properties: {
    name: { type: 'string' },
    age: { type: 'number' },
  },
};

const result = await client.generateObject(
  'Generate a person profile',
  schema,
);
console.log(result.object); // { name: "...", age: ... }
```

Streaming text:

```typescript
const stream = await client.streamText('Write a story');
for await (const chunk of stream.textStream) {
  process.stdout.write(chunk);
}
```

For gradual migration from Gemini-specific code:
```typescript
import { Config } from '@google/gemini-cli-core';

const config = new Config(params);
const adapter = config.getAIClientAdapter();

// Works with all providers
const result = await adapter.generateSimpleText('Hello');

// Legacy Gemini operations (when using Gemini provider)
if (adapter.isUsingUniversalClient()) {
  console.log('Using new multi-provider system');
} else {
  console.log('Using legacy Gemini client');
}
```

Each provider has a recommended default model:
| Provider | Default Model |
|---|---|
| OpenAI | gpt-4o |
| Anthropic | claude-3-5-sonnet-20241022 |
| Google | gemini-2.5-pro |
| Cohere | command-r-plus |
| Mistral | mistral-large-latest |
| Amazon Bedrock | anthropic.claude-3-5-sonnet-20241022-v2:0 |
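The table above amounts to a lookup with an override: an explicit `HYPERCODE_MODEL` wins, otherwise the provider's recommended default applies. The helper below is a hypothetical sketch of that resolution order, not HyperCode's actual implementation:

```typescript
// Hypothetical sketch of model resolution: an explicit HYPERCODE_MODEL
// value takes precedence, otherwise the provider's default is used.
const DEFAULT_MODELS: Record<string, string> = {
  openai: 'gpt-4o',
  anthropic: 'claude-3-5-sonnet-20241022',
  google: 'gemini-2.5-pro',
  cohere: 'command-r-plus',
  mistral: 'mistral-large-latest',
  'amazon-bedrock': 'anthropic.claude-3-5-sonnet-20241022-v2:0',
};

function resolveModel(provider: string, explicitModel?: string): string {
  if (explicitModel) return explicitModel;
  const fallback = DEFAULT_MODELS[provider];
  if (!fallback) throw new Error(`Unknown provider: ${provider}`);
  return fallback;
}
```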
Replace your existing Gemini configuration:
```bash
# Old
export GEMINI_API_KEY=AIza...

# New (maintains compatibility)
export HYPERCODE_PROVIDER=gemini
export HYPERCODE_API_KEY=AIza...
export HYPERCODE_MODEL=gemini-2.5-pro
```

Verify the configuration:

```bash
gemini --debug "test message"
```

To switch providers:

```bash
export HYPERCODE_PROVIDER=openai
export HYPERCODE_API_KEY=sk-...
gemini "same functionality, different provider"
```

To use a custom API endpoint:

```bash
export HYPERCODE_API_URL=https://your-proxy.com/v1
```

Proxy examples:

**Corporate HTTP proxy (OpenAI):**

```bash
export HYPERCODE_PROVIDER=openai
export HYPERCODE_API_KEY=sk-...
export HYPERCODE_PROXY=http://corporate-proxy:8080
```

**SOCKS5 proxy (Anthropic):**

```bash
export HYPERCODE_PROVIDER=anthropic
export HYPERCODE_API_KEY=sk-ant-...
export HYPERCODE_PROXY=socks5://proxy:1080
```

**Development proxy with SSL verification disabled (Gemini):**

```bash
export HYPERCODE_PROVIDER=gemini
export HYPERCODE_API_KEY=AIza...
export HYPERCODE_PROXY=https://dev-proxy:3128
export HYPERCODE_SKIP_SSL_VERIFICATION=true
```

Amazon Bedrock can also be configured programmatically:

```typescript
const client = new UniversalAIClient({
  provider: AIProvider.AMAZON_BEDROCK,
  model: 'anthropic.claude-3-5-sonnet-20241022-v2:0',
  options: {
    region: 'us-west-2',
    accessKeyId: '...',
    secretAccessKey: '...',
  },
});
```

To test the connection:

```typescript
const client = UniversalAIClient.fromEnvironment();
const test = await client.testConnection();
if (!test.success) {
  console.error('Connection failed:', test.error);
}
```

To inspect the active provider:

```typescript
const info = client.getProviderInfo();
console.log(`Using ${info.provider} with model ${info.model}`);
```

For verbose output, run the CLI in debug mode:

```bash
gemini --debug "your prompt"
```

See /examples/multi-provider-usage.ts for comprehensive examples of:
- Switching between providers
- Structured object generation
- Error handling
- Configuration patterns
The system maintains full backward compatibility with existing Gemini CLI usage:
- Existing `GEMINI_API_KEY` environment variables work
- All existing CLI commands and options work unchanged
- Gradual migration path available
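The backward-compatibility behavior above amounts to a key-resolution order: the HyperCode-specific variable wins, then the provider-specific legacy variable. The lookup below is a hypothetical sketch of that order (HyperCode's real logic may differ):

```typescript
// Hypothetical sketch: resolve an API key from an environment-like map,
// preferring HYPERCODE_API_KEY over the provider's legacy variable.
const LEGACY_KEY_NAMES: Record<string, string> = {
  openai: 'OPENAI_API_KEY',
  anthropic: 'ANTHROPIC_API_KEY',
  gemini: 'GEMINI_API_KEY',
  cohere: 'COHERE_API_KEY',
  mistral: 'MISTRAL_API_KEY',
};

function resolveApiKey(
  provider: string,
  env: Record<string, string | undefined>,
): string | undefined {
  return env['HYPERCODE_API_KEY'] ?? env[LEGACY_KEY_NAMES[provider] ?? ''];
}
```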