I'm trying to set up Ollama to run inside Docker while using my NVIDIA GPU. I have Docker Desktop installed on an Ubuntu Proxmox VM with GPU passthrough. Ollama uses the GPU fine outside of Docker, but I haven't been able to get it working inside the Docker environment.
2 Answers
Why are you using Docker Desktop on Ubuntu? You might want to just install docker-ce and run it natively. Docker Desktop runs the engine inside its own VM, so the GPU you passed through to the Ubuntu guest isn't visible to your containers; native docker-ce plus the NVIDIA Container Toolkit is the usual path for GPU access.
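To make that concrete, here's a rough sketch of the native setup, assuming docker-ce is already installed and the NVIDIA driver works on the Ubuntu guest (repo setup for the toolkit is in NVIDIA's install docs; the `docker run` line is the one from the official `ollama/ollama` image page):

```shell
# Install the NVIDIA Container Toolkit (after adding NVIDIA's apt repo).
sudo apt-get install -y nvidia-container-toolkit

# Register the NVIDIA runtime with Docker and restart the daemon.
sudo nvidia-ctk runtime configure --runtime=docker
sudo systemctl restart docker

# Sanity check: the GPU should be visible from inside a container.
docker run --rm --gpus=all nvidia/cuda:12.4.1-base-ubuntu22.04 nvidia-smi

# Run Ollama with GPU access.
docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 \
  --name ollama ollama/ollama
```

If `nvidia-smi` prints your GPU in the sanity-check step, Ollama should pick it up too.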
I recommend running Open WebUI outside of Docker and keeping Ollama there. That's been working well for me!
I get that, but I'm specifically trying to integrate Ollama with n8n. I've heard it's easier if they're both running in Docker together, which is why I'm going this route.
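For that use case you don't need Docker Desktop either; a compose file on native docker-ce puts both services on one network. A minimal sketch (service names and ports are illustrative; the `deploy` stanza assumes the NVIDIA Container Toolkit is installed on the host):

```yaml
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama:/root/.ollama
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
  n8n:
    image: n8nio/n8n
    ports:
      - "5678:5678"
    # On the compose network, n8n reaches Ollama at http://ollama:11434

volumes:
  ollama:
```

Point your n8n Ollama credentials at `http://ollama:11434` (the service name resolves inside the compose network), not `localhost`.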

I'm still a newbie with Docker, just trying to learn the ropes. Once I wrap my head around the basics, I'm sure I'll find it easier to explore other options.