Use AWS Bedrock models directly in GitHub Copilot Chat, including Claude, Llama, Mistral, Qwen, and more.
- Keep code and prompts in your AWS account for stronger governance
- Choose your AWS region to align with residency and compliance requirements
- Streaming + tool calling for responsive coding workflows
- Multi-region support across 12 AWS regions
- Compliance-first architecture: prompts, code context, and responses stay within your AWS account boundary.
- Data residency control: select the AWS region your team is allowed to use and keep traffic there.
- Enterprise-ready access model: works with existing AWS credentials, profiles, and IAM controls.
- No model lock-in: use multiple Bedrock model families from one Copilot Chat workflow.
- Built for developer UX: streaming responses, tool calling, and model switching in the standard chat UI.
- `gpt-oss-20b`, `gpt-oss-120b`
- Safeguard variants: `gpt-oss-safeguard-20b/120b`
- Gemma 3: `4b`, `12b`, `27b` variants
- `magistral-small-2509`, `mistral-large-3-675b-instruct`
- Ministral: `3b`, `8b`, `14b` variants
- Voxtral: `mini-3b`, `small-24b` variants
- General: `qwen3-32b`, `qwen3-235b`, `qwen3-next-80b`
- Vision: `qwen3-vl-235b` (multimodal)
- Coding: `qwen3-coder-30b/480b`
- DeepSeek: `v3.1`
- `nemotron-nano-9b-v2`, `nemotron-nano-12b-v2`
- MoonshotAI: `kimi-k2-thinking`
- Minimax: `minimax-m2`
- ZAI: `glm-4.6`
Authentication options:
- API key mode (optional): use an AWS Bedrock API key from the AWS Bedrock Console
- AWS credentials mode (optional): use AWS credentials/profile available to VS Code (env vars, `~/.aws/credentials`, SSO, etc.); you can also set `aws-bedrock.awsProfile`
- VS Code: Version 1.104.0 or later
- Open VS Code
- Go to Extensions (`Cmd+Shift+X` / `Ctrl+Shift+X`)
- Search for "Bedrock LLMs for GitHub Copilot Chat"
- Click Install
- Clone this repository:

  ```
  git clone https://github.com/easytocloud/bedrock-vscode-chat.git
  cd bedrock-vscode-chat
  ```

- Install dependencies:

  ```
  npm install
  ```

- Compile the extension:

  ```
  npm run compile
  ```

- Press `F5` to open a new VS Code window with the extension loaded
The extension supports two authentication methods:
Via Command Palette:
- Open Command Palette (`Cmd+Shift+P` / `Ctrl+Shift+P`)
- Run `Manage AWS Bedrock`
- Select "Enter API Key (Mantle)"
- Paste your API key from AWS Bedrock Console
On First Use:
- The extension will prompt for your API key when required
- Your key is stored securely in VS Code's SecretStorage
- Open Command Palette
- Run `Manage AWS Bedrock`
- Select "Configure Mantle Authentication"
- Choose "AWS Credentials"
- Optionally set a specific profile via "Set AWS Profile (Mantle)"
This method uses AWS Signature V4 authentication with your existing AWS credentials.
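For reference, Signature V4 derives a per-request signing key from your secret access key through a chained HMAC-SHA256. The extension relies on the AWS SDK's credential chain for this; the sketch below only illustrates the derivation step, with example inputs that are not real credentials:

```typescript
import { createHmac } from "node:crypto";

// HMAC-SHA256 helper used at each step of the SigV4 key derivation chain.
function hmac(key: Buffer | string, data: string): Buffer {
  return createHmac("sha256", key).update(data).digest();
}

// kSigning = HMAC(HMAC(HMAC(HMAC("AWS4" + secret, date), region), service), "aws4_request")
function deriveSigningKey(secret: string, date: string, region: string, service: string): Buffer {
  const kDate = hmac("AWS4" + secret, date); // date in YYYYMMDD form
  const kRegion = hmac(kDate, region);       // e.g. "us-east-1"
  const kService = hmac(kRegion, service);   // e.g. "bedrock"
  return hmac(kService, "aws4_request");
}
```

The resulting key is used to sign the canonical request; scoping the key to date, region, and service is what ties each signature to a single region and API.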
If you want a specific named profile:
- Run `Manage AWS Bedrock`
- Select "Set AWS Profile (Native)"
- Enter a profile name (or leave blank to use the default credential chain)
Default region is `us-east-1`. To change:
- Open Command Palette
- Run `Manage AWS Bedrock`
- Select "Change Region"
- Choose your preferred AWS region
Or set in Settings:

```json
{
  "aws-bedrock.region": "us-west-2",
  "aws-bedrock.mantleAuthMethod": "awsCredentials", // or "apiKey"
  "aws-bedrock.mantleAwsProfile": "my-profile", // optional
  "aws-bedrock.awsProfile": "my-profile" // for native Bedrock
}
```

Show/hide specialized models (like safeguard variants):

```json
{
  "aws-bedrock.showAllModels": true // default: true
}
```

- Open GitHub Copilot Chat (`Cmd+Shift+I` / `Ctrl+Shift+I`)
- Click the model picker (top of chat panel)
- Select an AWS Bedrock model (e.g., "OpenAI GPT OSS 120B")
- Start chatting!
- In any editor, use `@workspace` or other chat participants
- The model picker will include Bedrock models
- Select a Bedrock model for your conversation
You: What are the key features of Rust's ownership system?
Assistant (via Bedrock): [Streams response in real-time...]
| Setting | Type | Default | Description |
|---|---|---|---|
| `aws-bedrock.region` | string | `us-east-1` | AWS region for Bedrock requests |
| `aws-bedrock.enableMantle` | boolean | `true` | Enable models available through API key mode |
| `aws-bedrock.enableNative` | boolean | `true` | Enable models available through Converse API |
| `aws-bedrock.mantleAuthMethod` | string | `apiKey` | Auth mode for API key path: `apiKey` or `awsCredentials` |
| `aws-bedrock.mantleAwsProfile` | string | empty | Optional AWS profile for API key path when using credentials |
| `aws-bedrock.awsProfile` | string | empty | Optional AWS profile for Converse API path |
| `aws-bedrock.showAllModels` | boolean | `true` | Show all models including specialized variants |
| `aws-bedrock.debugLogging` | boolean | `false` | Enable verbose debug logging |
| `aws-bedrock.sendTools` | boolean | `true` | Send tool definitions to the model |
| `aws-bedrock.emitPlaceholders` | boolean | `true` | Emit placeholder text while waiting |
| `aws-bedrock.modelMetadataSource` | string | `litellm` | Metadata source for token/capability info |
| `aws-bedrock.modelMetadataUrl` | string | default URL | External metadata registry URL |
| `aws-bedrock.modelMetadataCacheHours` | number | `24` | Cache duration for external metadata |
Note: setting keys and some command labels include mantle naming for backward compatibility.
- `us-east-1` (N. Virginia) - Default
- `us-east-2` (Ohio)
- `us-west-2` (Oregon)
- `eu-west-1` (Ireland)
- `eu-west-2` (London)
- `eu-central-1` (Frankfurt)
- `eu-north-1` (Stockholm)
- `eu-south-1` (Milan)
- `ap-south-1` (Mumbai)
- `ap-northeast-1` (Tokyo)
- `ap-southeast-3` (Jakarta)
- `sa-east-1` (São Paulo)
| Command | Description |
|---|---|
| `Manage AWS Bedrock` | Configure authentication, AWS profile, region, and settings |
| `Clear AWS Bedrock API Key (Mantle)` | Remove stored AWS Bedrock API key |
| `Show AWS Bedrock Logs` | Open the extension output channel |
This extension implements VS Code's `LanguageModelChatProvider` interface using AWS Bedrock APIs.
- `BedrockMantleProvider`: Main provider implementing the GitHub Copilot Chat provider interface
- Dynamic Model Discovery: Fetches available model catalogs from AWS Bedrock APIs
- Streaming Support: Processes SSE (Server-Sent Events) for real-time responses
- Tool Calling: Buffers and parses streaming tool calls for function calling support
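The streaming and tool-calling pieces can be illustrated with a small sketch: SSE frames arrive as `data: <json>` lines, and tool-call arguments stream in as partial JSON fragments that must be buffered until the stream completes. This is an illustrative parser over OpenAI-style chunks, not the extension's actual implementation:

```typescript
// Consume OpenAI-style SSE lines: accumulate text deltas and buffer
// streamed tool-call argument fragments until the stream ends.
interface StreamResult {
  text: string;
  toolArgs: string; // concatenated JSON fragments for one tool call
}

function consumeSse(lines: string[]): StreamResult {
  let text = "";
  let toolArgs = "";
  for (const line of lines) {
    // Skip non-data frames and the terminal [DONE] sentinel.
    if (!line.startsWith("data: ") || line === "data: [DONE]") continue;
    const delta = JSON.parse(line.slice("data: ".length)).choices[0].delta;
    if (delta.content) text += delta.content;
    // Tool-call arguments arrive as partial JSON; only parse once buffered.
    for (const tc of delta.tool_calls ?? []) {
      toolArgs += tc.function?.arguments ?? "";
    }
  }
  return { text, toolArgs };
}
```

Buffering is the key point: parsing `toolArgs` as JSON is only valid after the final fragment has arrived.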
`https://bedrock-mantle.<region>.api.aws/v1`
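The endpoint pattern above can be sketched as a small helper; the region-format check is an assumption for illustration, not something the extension necessarily enforces:

```typescript
// Build the API-key-mode endpoint for a given AWS region,
// following the pattern https://bedrock-mantle.<region>.api.aws/v1.
function mantleEndpoint(region: string): string {
  // Illustrative sanity check: AWS regions look like "us-east-1".
  if (!/^[a-z]{2}-[a-z]+-\d$/.test(region)) {
    throw new Error(`Unexpected AWS region format: ${region}`);
  }
  return `https://bedrock-mantle.${region}.api.aws/v1`;
}
```

Because the region is part of the hostname, changing `aws-bedrock.region` changes which regional endpoint all traffic goes to.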
Models with function calling capabilities:
- `gpt-oss-120b`
- `mistral-large-3-675b-instruct`
- `magistral-small-2509`
- `deepseek.v3.1`
- `qwen3-235b` and larger models
- `qwen3-vl-235b` (vision + tools)
Models with multimodal (image) input:
- Models from API-key mode: based on model naming and API behavior
- Models from Converse API mode: based on Bedrock's reported input modalities
- Token limits + initial capabilities: The extension can optionally use an external model metadata registry (default: LiteLLM's public JSON) to populate `maxInputTokens`, `maxOutputTokens`, and initial tool/vision flags. Configure via `aws-bedrock.modelMetadataSource`, `aws-bedrock.modelMetadataUrl`, and `aws-bedrock.modelMetadataCacheHours`.
- Converse API models: vision is derived from `ListFoundationModels` input modalities (reliable). Tool support is verified on demand by attempting a tool-enabled request and caching whether the model accepts tool config (this overrides external metadata if runtime behavior differs).
- API-key catalog models: `/v1/models` does not include full tool/vision/token metadata, so the extension uses external metadata when enabled, plus runtime probing (tools) as a safety net.
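The cache-expiry behavior controlled by `aws-bedrock.modelMetadataCacheHours` can be sketched as follows; the `MetadataCache` shape and field names here are hypothetical, not the extension's actual types:

```typescript
// Hypothetical cache entry for the external model-metadata registry.
interface MetadataCache {
  fetchedAt: number; // epoch ms when the registry JSON was downloaded
  data: Record<string, { maxInputTokens?: number; maxOutputTokens?: number }>;
}

// Returns true when the cached registry is older than the configured
// aws-bedrock.modelMetadataCacheHours window (default 24h) and should
// be re-fetched from aws-bedrock.modelMetadataUrl.
function isCacheStale(cache: MetadataCache, cacheHours: number, now = Date.now()): boolean {
  const maxAgeMs = cacheHours * 60 * 60 * 1000;
  return now - cache.fetchedAt > maxAgeMs;
}
```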
Models optimized for coding:
- `qwen3-coder-30b-a3b-instruct`
- `qwen3-coder-480b-a35b-instruct`
Models with enhanced reasoning:
- `kimi-k2-thinking`
Problem: "Invalid API key" error
Solution:
- Verify your API key in AWS Bedrock Console
- Run `Manage AWS Bedrock` → "Clear API Key (Mantle)"
- Re-enter your API key
Problem: "Model not available in region" error
Solution:
- Not all models are available in all regions
- Try changing to `us-east-1` (widest availability)
- Check AWS Bedrock model availability
Problem: "Rate limit exceeded" error
Solution:
- Wait a few moments and try again
- Consider using smaller models for testing
- Check your AWS Bedrock quotas in AWS Console
Problem: Network or timeout errors
Solution:
- Check your internet connection
- Verify firewall/proxy settings allow access to `*.api.aws`
- Ensure the selected region is accessible from your location
```
# Install dependencies
npm install

# Compile TypeScript
npm run compile

# Watch mode for development
npm run watch

# Run linting
npm run lint
```

Or use the Makefile shortcuts:

```
make install
make compile
make watch
make lint
```

- Open the project in VS Code
- Press `F5` to launch Extension Development Host
- Set breakpoints in source files
- Test the extension in the new window
```
bedrock-vscode-chat/
├── src/
│   ├── extension.ts              # Extension entry point
│   ├── provider.ts               # Main provider implementation
│   ├── bedrockNative.ts          # Native Bedrock Converse API
│   ├── externalModelMetadata.ts  # External model metadata loader
│   ├── types.ts                  # TypeScript type definitions
│   └── utils.ts                  # Utility functions
├── package.json                  # Extension manifest
├── tsconfig.json                 # TypeScript configuration
├── icon.svg                      # Source icon (editable)
├── icon.png                      # Extension icon (128x128)
├── README.md                     # This file
├── CONTRIBUTING.md               # Development guide
└── PLAN.md                       # Architecture details
```
Contributions are welcome! See CONTRIBUTING.md for detailed development guidelines.
Quick start for contributors:
- Fork and clone the repository
- Install dependencies: `npm install`
- Compile: `npm run compile`
- Press `F5` to launch Extension Development Host
- See CONTRIBUTING.md for testing, logging, and publishing guidelines
Key development notes:
- Publisher name: `easytocloud` (lowercase)
- Use Output Channel for logging, not `console.log`
- Include `node_modules` in VSIX (required for AWS SDK)
- Test in both F5 mode and installed VSIX
- Use `rsvg-convert` for icon generation
- AWS Bedrock Documentation
- VS Code Language Model API
- OpenAI API Reference
- Contributing Guide - Detailed development documentation
MIT License - See LICENSE file for details
- Project Lead: easytocloud
- Development Assistant: GitHub Copilot
Inspired by the HuggingFace extension for GitHub Copilot Chat.
- Issues: GitHub Issues
- Discussions: GitHub Discussions
- AWS Bedrock: AWS Support
Version: 0.3.1
Status: Production
Last Updated: February 5, 2026