darkmavis1980/docker-ollama

Ollama and Ollama WebUI

This project provides a docker-compose.yml file to start Ollama and Ollama WebUI on your machine.

Requirements

  • Docker

Setup

Copy the .env.example file to .env and set the MODELS_PATH variable to the path where your Ollama models are stored.

cp .env.example .env
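For reference, the resulting .env only needs the one variable; the sketch below writes it with a placeholder path (substitute your own models directory):

```shell
# Hypothetical example: MODELS_PATH must point at your Ollama models directory.
# /home/youruser/.ollama/models is a placeholder, not a path from this repo.
echo 'MODELS_PATH=/home/youruser/.ollama/models' > .env
cat .env
```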

Run it

docker-compose up -d

Stop it

docker-compose down

Using NVIDIA GPU

If you have an NVIDIA GPU and want to use it with Ollama, you can use the provided docker-compose.nvidia.yml file. Make sure you have the NVIDIA Container Toolkit installed on your machine. To start Ollama with NVIDIA GPU support, run:

docker-compose -f docker-compose.nvidia.yml up -d
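For context, a Compose service typically gets GPU access through the nvidia runtime. The snippet below is a minimal sketch of what such a service section can look like; the service name, image, and volume mapping are assumptions for illustration, not copied from this repository's docker-compose.nvidia.yml:

```yaml
services:
  ollama:
    image: ollama/ollama
    runtime: nvidia                  # requires the NVIDIA Container Toolkit
    environment:
      - NVIDIA_VISIBLE_DEVICES=all   # expose all GPUs to the container
    volumes:
      - ${MODELS_PATH}:/root/.ollama # MODELS_PATH comes from .env
```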

Make sure to set the MODELS_PATH environment variable to the path where your Ollama models are stored before running the command.

Troubleshooting

If you encounter this issue:

 ✘ Container ollama Error response from daemon: unknown or invalid runtime name: nvidia                             0.0s
Error response from daemon: unknown or invalid runtime name: nvidia

Make sure the NVIDIA Container Toolkit is installed. Installation instructions are in the NVIDIA Container Toolkit documentation.

You may then need to reconfigure Docker or containerd to register the nvidia runtime. The NVIDIA documentation covers this in detail, but in short, execute the following:

# For Docker
sudo nvidia-ctk runtime configure --runtime=docker
sudo systemctl restart docker

# For Containerd
sudo nvidia-ctk runtime configure --runtime=containerd
sudo systemctl restart containerd
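After running the Docker variant, /etc/docker/daemon.json should contain an nvidia runtime entry along these lines (exact formatting may differ on your system):

```json
{
  "runtimes": {
    "nvidia": {
      "args": [],
      "path": "nvidia-container-runtime"
    }
  }
}
```

If that entry is missing after a restart, the "unknown or invalid runtime name: nvidia" error above will persist.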
