I'm looking for good platforms for hosting local LLMs in containers, ideally cheap or free ones that make testing simple. I've hit a lot of bumps along the way, so any advice would be appreciated!
1 Answer
I’ve been using Ollama inside a Docker container. To keep your downloaded models from being lost when the container stops, store them in a named volume rather than in the container’s writable layer. For this setup you’ll need Docker and Compose installed on your machine, plus a Docker Compose YAML file; Podman works as a drop-in replacement if you prefer it. Keeping models out of the container itself matters even more if you plan to deploy through Kubernetes, where pods are ephemeral and anything not on a volume disappears on restart.
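For reference, here's a minimal sketch of such a `docker-compose.yml`. The service and volume names are my own choices; what I'm relying on is that the official `ollama/ollama` image keeps models under `/root/.ollama` and listens on port 11434 by default:

```yaml
# Minimal sketch — service name "ollama" and host port mapping are assumptions.
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"        # Ollama's default API port
    volumes:
      - ollama:/root/.ollama # models persist here across container restarts

volumes:
  ollama:                    # named volume managed by Docker
```

Bring it up with `docker compose up -d`, pull a model inside the container (e.g. `docker compose exec ollama ollama pull <model>`), and the model files will survive stopping or recreating the container because they live on the named volume.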
Thanks a ton! This really helps me understand why my pod wasn't starting.