What are the best ways to run local LLMs in containers?

Asked By CuriousGamer42 On

I'm searching for some good platforms to help me host local LLMs in containers. I'm hoping to find cheap or free options that simplify testing. I've been hitting a lot of bumps along the way, so any advice would be appreciated!

1 Answer

Answered By TechNerd123 On

I’ve been running Ollama inside a Docker container. To keep your downloaded models from disappearing when the container is removed or recreated, store them in a volume rather than in the container’s writable layer. A Docker Compose YAML file makes this setup easy; you’ll need Docker and the Compose plugin installed on your machine, and Podman works as a drop-in alternative if you prefer. Just remember not to bake models directly into the container image, especially if you plan to deploy through Kubernetes, where pods are ephemeral.
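Here’s a minimal Compose sketch of that setup. It assumes the official `ollama/ollama` image and its default model directory (`/root/.ollama`) and API port (11434); adjust the volume name and port mapping to taste.

```yaml
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"          # Ollama's HTTP API
    volumes:
      - ollama_data:/root/.ollama   # models live in the volume, not the container layer

volumes:
  ollama_data:
```

Bring it up with `docker compose up -d`, then pull a model with something like `docker compose exec ollama ollama pull <model-name>`. Because the models sit in the named volume, you can stop, remove, and recreate the container without re-downloading anything.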

HelpfulHanna99 -

Thanks a ton! This info really helps me understand why my pod wasn't starting.
