This tutorial is a community contribution and is not supported by the Open WebUI team.

---

# Integrating Open WebUI with Amazon Bedrock

In this tutorial, we'll explore the most common and popular approaches to integrate Open WebUI with Amazon Bedrock.

## What is Amazon Bedrock?

Directly from the AWS website:

"Amazon Bedrock is a fully managed service that offers a choice of high-performing foundation models (FMs) from leading AI companies like AI21 Labs, Anthropic, Cohere, Luma, Meta, Mistral AI, poolside (coming soon), Stability AI, and Amazon through a single API, along with a broad set of capabilities you need to build generative AI applications with security, privacy, and responsible AI. Using Amazon Bedrock, you can easily experiment with and evaluate top FMs for your use case, privately customize them with your data using techniques such as fine-tuning and Retrieval Augmented Generation (RAG), and build agents that execute tasks using your enterprise systems and data sources. Since Amazon Bedrock is serverless, you don't have to manage any infrastructure, and you can securely integrate and deploy generative AI capabilities into your applications using the AWS services you are already familiar with."

To learn more about Bedrock, visit the [Amazon Bedrock official page](https://aws.amazon.com/bedrock/).

# Integration Options

There are multiple OpenAI-compatible ways to connect Open WebUI to AWS Bedrock:

- **Bedrock Access Gateway** (BAG)
- **stdapi.ai**
- **LiteLLM** with its Bedrock provider (LiteLLM is a general-purpose proxy, not dedicated to AWS)

## Feature Comparison

| Capability | Bedrock Access Gateway (BAG) | stdapi.ai | LiteLLM (Bedrock provider) |
|------------------------------| --- | --- | --- |
| Automatic model discovery | ✅ | ✅ | — |
| Chat completion | ✅ | ✅ | ✅ |
| Embeddings | ✅ | ✅ | ✅ |
| Text to speech | — | ✅ | — |
| Speech to text | — | ✅ | — |
| Image generation | — | ✅ | ✅ |
| Image editing | — | ✅ | — |
| Models from multiple regions | — | ✅ | ✅ |
| License | MIT | AGPL or Commercial | MIT or Commercial |

# Integration Steps

## Solution 1: Bedrock Access Gateway (BAG)

### Prerequisites

In order to follow this tutorial, you must have the following:

- An active AWS account
- An active AWS Access Key and Secret Key
- IAM permissions in AWS to enable Bedrock models or already enabled models
- Docker installed on your system

### Step 1: Configure the Bedrock Access Gateway

We need to configure the Bedrock Access Gateway, or BAG. You can think of the BAG as a kind of proxy or middleware developed by AWS that wraps the AWS-native Bedrock endpoints/SDK and, in turn, exposes endpoints compatible with OpenAI's schema, which is what Open WebUI requires.

For reference, here is a simple mapping between the endpoints:

You should now be able to access the BAG's swagger page at: http://localhost:8000

![Bedrock Access Gateway Swagger](/images/tutorials/amazon-bedrock/amazon-bedrock-proxy-api.png)
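
If you want to confirm the gateway is serving the OpenAI-compatible schema before wiring it into Open WebUI, a quick request against its models endpoint is enough. This is only a sketch: it assumes the BAG's default `/api/v1` base path, the port used above, and a placeholder `$BAG_API_KEY` for whatever bearer key you configured when launching the gateway.

```bash
# List the Bedrock models exposed through the gateway's OpenAI-compatible API.
# $BAG_API_KEY is a placeholder for the bearer key you configured for the BAG.
curl -s http://localhost:8000/api/v1/models \
  -H "Authorization: Bearer $BAG_API_KEY"
```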

### Step 2: Add Connection in Open WebUI

Now that you have the BAG up and running, it's time to add it as a new connection in Open WebUI.

- Under the Admin Panel, go to Settings -> Connections.
- Use the "+" (plus) button to add a new connection under the OpenAI

![Add New Connection](/images/tutorials/amazon-bedrock/amazon-bedrock-proxy-connection.png)
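
If you prefer to set up this connection through environment variables rather than the admin panel, the same values can be passed when starting the Open WebUI container. This is a hedged sketch, not the tutorial's official method: it assumes the BAG is reachable from inside the container at `host.docker.internal:8000`, uses the BAG's default `/api/v1` base path, and uses `$BAG_API_KEY` as a placeholder for your bearer key.

```bash
# Hypothetical equivalent of the admin-panel connection, set at container start.
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -e OPENAI_API_BASE_URL="http://host.docker.internal:8000/api/v1" \
  -e OPENAI_API_KEY="$BAG_API_KEY" \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```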

### Other Helpful Tutorials

Here are a few other helpful tutorials on integrating Open WebUI with Amazon Bedrock using the Bedrock Access Gateway:

- https://gauravve.medium.com/connecting-open-webui-to-aws-bedrock-a1f0082c8cb2
- https://jrpospos.blog/posts/2024/08/using-amazon-bedrock-with-openwebui-when-working-with-sensitive-data/

## Solution 2: stdapi.ai

[stdapi.ai](https://stdapi.ai/) is an OpenAI-compatible API gateway you deploy in your AWS account, or run locally using Docker.

Open WebUI connects to it as if it were OpenAI, and stdapi.ai routes requests to Bedrock and other AWS AI services such as Amazon Polly and Transcribe. It also supports multi-region access to Bedrock, making it easier to reach more models that may only be available in specific AWS regions.

### stdapi.ai Deployment

#### Deploying on AWS

stdapi.ai provides a full Terraform sample that provisions Open WebUI on ECS Fargate, connects it to stdapi.ai, and includes supporting services such as ElastiCache Valkey, Aurora PostgreSQL with the vector extension, SearXNG, and Playwright.
This method handles both the stdapi.ai and Open WebUI configuration:

- [stdapi.ai Documentation - Open WebUI integration](https://stdapi.ai/use_cases_openwebui/)
- [stdapi-ai GitHub - Open WebUI Terraform sample](https://github.com/stdapi-ai/samples/tree/main/getting_started_openwebui)

stdapi.ai also provides documentation and Terraform samples to deploy it independently if you prefer to connect it to an existing Open WebUI instance.

- [stdapi.ai Documentation - Getting started](https://stdapi.ai/operations_getting_started/)

#### Deploying Locally

stdapi.ai also provides a Docker image for local usage.

Here is a minimal command to run it using your AWS access key:
```bash
docker run \
  -e AWS_ACCESS_KEY_ID=$AWS_ACCESS_KEY_ID \
  -e AWS_SECRET_ACCESS_KEY=$AWS_SECRET_ACCESS_KEY \
  -e AWS_SESSION_TOKEN=$AWS_SESSION_TOKEN \
  -e AWS_BEDROCK_REGIONS=us-east-1,us-west-2 \
  -e ENABLE_DOCS=true \
  --rm \
  -p 8000:8000 \
  ghcr.io/stdapi-ai/stdapi.ai-community:latest
```
The application is now available at http://localhost:8000 (use it as `YOUR_STDAPI_URL` in the Open WebUI configuration below).

The `AWS_BEDROCK_REGIONS` variable lets you select regions where you want to load models, in this case `us-east-1` and `us-west-2`.

If you pass the `ENABLE_DOCS=true` variable, an interactive Swagger documentation page is available at http://localhost:8000/docs.

You can also set `API_KEY=my_secret_password` to require a custom API key (by default, no API key is required). This is highly recommended if the server is reachable from other machines. Use this API key as `YOUR_STDAPI_KEY` in the Open WebUI configuration below.

Many other configuration options are available; see [the documentation](https://stdapi.ai/operations_configuration/) for more information.
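
Before pointing Open WebUI at the gateway, you can confirm it is up and has discovered your Bedrock models. A minimal sketch, assuming the local Docker run above and the placeholders used in this tutorial (drop the `Authorization` header if you did not set `API_KEY`):

```bash
# List the models stdapi.ai discovered in the configured Bedrock regions.
curl -s http://localhost:8000/v1/models \
  -H "Authorization: Bearer YOUR_STDAPI_KEY"
```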

### Open WebUI Configuration

Open WebUI is configured via environment variables, and you can also set the same values from the Open WebUI admin panel.

Use the same stdapi.ai key for all `*_OPENAI_API_KEY` entries.

#### Core connection (chat + background tasks)

```bash
OPENAI_API_BASE_URL=YOUR_STDAPI_URL/v1
OPENAI_API_KEY=YOUR_STDAPI_KEY
TASK_MODEL_EXTERNAL=amazon.nova-micro-v1:0
```

Use a fast, low-cost chat model for `TASK_MODEL_EXTERNAL`.
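
One convenient way to apply these variables (and the blocks below) is to collect them in an env file and pass it to the Open WebUI container. A minimal sketch, assuming the standard Open WebUI Docker image and a file named `stdapi.env` that you fill with the settings from this page:

```bash
# stdapi.env holds the variables listed on this page, one per line, e.g.:
#   OPENAI_API_BASE_URL=YOUR_STDAPI_URL/v1
#   OPENAI_API_KEY=YOUR_STDAPI_KEY
#   TASK_MODEL_EXTERNAL=amazon.nova-micro-v1:0
docker run -d -p 3000:8080 \
  --env-file stdapi.env \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```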

#### RAG embeddings

```bash
RAG_EMBEDDING_ENGINE=openai
RAG_OPENAI_API_BASE_URL=YOUR_STDAPI_URL/v1
RAG_OPENAI_API_KEY=YOUR_STDAPI_KEY
RAG_EMBEDDING_MODEL=cohere.embed-v4:0
```

Pick any embedding model you prefer.
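
To sanity-check the embedding model outside Open WebUI, you can call the gateway's OpenAI-style embeddings endpoint directly. A sketch using the same placeholders as above:

```bash
# Request an embedding vector for a short test string.
curl -s "YOUR_STDAPI_URL/v1/embeddings" \
  -H "Authorization: Bearer YOUR_STDAPI_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "cohere.embed-v4:0", "input": "hello world"}'
```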

#### Image generation

```bash
ENABLE_IMAGE_GENERATION=true
IMAGE_GENERATION_ENGINE=openai
IMAGES_OPENAI_API_BASE_URL=YOUR_STDAPI_URL/v1
IMAGES_OPENAI_API_KEY=YOUR_STDAPI_KEY
IMAGE_GENERATION_MODEL=stability.stable-image-core-v1:1
```

Choose any image generation model you prefer.
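
If you want to test image generation before enabling it in Open WebUI, an OpenAI-style request to the images endpoint works as a quick check. This is a sketch that assumes stdapi.ai mirrors the OpenAI images API, as the configuration above implies:

```bash
# Generate a single test image through the gateway.
curl -s "YOUR_STDAPI_URL/v1/images/generations" \
  -H "Authorization: Bearer YOUR_STDAPI_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "stability.stable-image-core-v1:1", "prompt": "A watercolor lighthouse at dawn", "size": "1024x1024"}'
```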

#### Image editing

```bash
ENABLE_IMAGE_EDIT=true
IMAGE_EDIT_ENGINE=openai
IMAGES_EDIT_OPENAI_API_BASE_URL=YOUR_STDAPI_URL/v1
IMAGES_EDIT_OPENAI_API_KEY=YOUR_STDAPI_KEY
IMAGE_EDIT_MODEL=stability.stable-image-control-structure-v1:0
```

Pick any image-editing model that supports edits without a mask.
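
A mask-free edit can be exercised the same way. This is only a sketch and assumes the gateway follows the OpenAI images/edits API shape; `input.png` is any local test image.

```bash
# Edit a local image from a text prompt, without providing a mask.
curl -s "YOUR_STDAPI_URL/v1/images/edits" \
  -H "Authorization: Bearer YOUR_STDAPI_KEY" \
  -F model="stability.stable-image-control-structure-v1:0" \
  -F image="@input.png" \
  -F prompt="Turn the sky into a vivid sunset"
```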

#### Speech to text (STT)

```bash
AUDIO_STT_ENGINE=openai
AUDIO_STT_OPENAI_API_BASE_URL=YOUR_STDAPI_URL/v1
AUDIO_STT_OPENAI_API_KEY=YOUR_STDAPI_KEY
AUDIO_STT_MODEL=amazon.transcribe
```
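
To verify transcription outside Open WebUI, you can post a short recording to the gateway's OpenAI-style transcriptions endpoint. A sketch, assuming that endpoint shape; `sample.wav` is any short local audio file.

```bash
# Transcribe a local audio file through the gateway.
curl -s "YOUR_STDAPI_URL/v1/audio/transcriptions" \
  -H "Authorization: Bearer YOUR_STDAPI_KEY" \
  -F model="amazon.transcribe" \
  -F file="@sample.wav"
```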

#### Text to speech (TTS)

```bash
AUDIO_TTS_ENGINE=openai
AUDIO_TTS_OPENAI_API_BASE_URL=YOUR_STDAPI_URL/v1
AUDIO_TTS_OPENAI_API_KEY=YOUR_STDAPI_KEY
AUDIO_TTS_MODEL=amazon.polly-neural
```

If you see inconsistent auto-detection for TTS languages, set a fixed language in stdapi.ai (for example, `DEFAULT_TTS_LANGUAGE=en-US`).
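
Speech synthesis can be checked the same way with an OpenAI-style speech request. A sketch, assuming the gateway accepts an Amazon Polly voice name in the `voice` field (the voice below is a placeholder):

```bash
# Synthesize a short phrase and save the result as an MP3 file.
curl -s "YOUR_STDAPI_URL/v1/audio/speech" \
  -H "Authorization: Bearer YOUR_STDAPI_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "amazon.polly-neural", "input": "Hello from Amazon Bedrock", "voice": "Joanna"}' \
  --output speech.mp3
```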

# Start using Bedrock Base Models

You should now see all your Bedrock models available!

![Use Bedrock Models](/images/tutorials/amazon-bedrock/amazon-bedrock-models-in-oui.png)