What’s the Best Linux Distro for Running LLMs Without Telemetry?

Asked By CuriousCoder42

I'm getting back into Linux after quite a while, and I'm eager to learn how to run my own large language models (LLMs) and dive into some homelab projects. I recently earned my Security+ certification, so I'd like to experiment in that area as well. One big concern for me is avoiding telemetry; I know Debian is solid in that regard, but I've heard it can be tricky to set up. I've also considered Ubuntu, but I'd prefer to dodge its telemetry if possible. My hardware should be well supported: a Ryzen 9 9950X, an Asus ProArt X870E motherboard, and a Gigabyte RTX 5060 Ti with 16GB of VRAM. What's the best choice here? Any advice on Wi-Fi setup would also be appreciated, since I'm currently on Ethernet.

4 Answers

Answered By LinuxLover88

I’ve been using Ultramarine, and it works well with my hardware, so it’s definitely worth a try! That said, most mainstream distros should handle your GPU fine, and sticking with the big names generally gives you the best compatibility.

Answered By TechieTinker

Honestly, the telemetry in Ubuntu isn't a huge deal: it's opt-out and pretty straightforward to disable. I used to be really against it, but since it's a free OS and I'm not contributing code, I can see why they collect that data. I totally understand if you want to avoid it, though! Debian is a great choice too, and you won't find it too hard to set up these days. For fun, you could also try out different distros by putting Ventoy on a USB stick or using netboot.xyz to experiment.
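If you go the Ubuntu route, here's a rough sketch of how I'd opt out. Treat the exact command and package names as assumptions that may vary by release; check what's actually installed on your version first:

```shell
# Tell the post-install system report tool not to send anything
# (ubuntu-report is the component behind Ubuntu's "send system info" prompt)
ubuntu-report send no

# Remove the usual reporting/crash-telemetry components entirely:
#   ubuntu-report       - hardware/install survey
#   popularity-contest  - package usage stats (popcon)
#   apport + whoopsie   - crash collection and upload
sudo apt purge -y ubuntu-report popularity-contest apport whoopsie
```

After that you can double-check with something like `dpkg -l | grep -E 'apport|whoopsie|popularity'` to confirm nothing is left.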

Answered By HelpfulHelper

You might want to check out the distro selection page in the Linux4Noobs wiki for names and recommendations! Just remember to back up your important files and take your time experimenting, especially in a VM. It’s a great way to learn and get comfortable with commands before executing them!

Answered By ModelMaster

When it comes to telemetry, you can control it manually. If you turn it off, no data is sent; what's collected mainly helps improve the distro and isn't there to track you. If you're planning to run LLMs, Ollama is the way to go. It works on nearly any distro, and with your GPU it can run reasonably large quantized models. For an easy experience, Mint and Ubuntu are solid choices, plus maybe Fedora for variety!
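To make the Ollama suggestion concrete, here's a minimal getting-started sketch. The model tag is just an example of something that fits in 16GB of VRAM; pick whatever model you like from the Ollama library:

```shell
# Install Ollama via the official installer script
# (review the script before piping it to sh if you're cautious)
curl -fsSL https://ollama.com/install.sh | sh

# Pull and chat with a model small enough for a 16GB card
ollama run llama3.1:8b

# Ollama also serves a local HTTP API on port 11434;
# inference stays entirely on your machine
curl http://localhost:11434/api/generate \
  -d '{"model": "llama3.1:8b", "prompt": "Why is the sky blue?"}'
```

Everything runs locally, which fits nicely with the no-telemetry goal.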

DataDude91 -

I have 16GB of VRAM too, and you usually can’t run 32B or 70B models without hitting VRAM limits. In practice, quantized models up to around 20B are the realistic ceiling, and even then I often find myself tight on memory!

VRAMExpert -

Yeah, the larger models like 70B need insane VRAM! Dedicated GPUs are typically your best bet for that kind of heavy lifting. Just a heads up if you’re aiming to work with those big LLMs!
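A quick back-of-the-envelope check on the numbers above. The formula is just parameter count times bits per weight, and the 1.2x factor for KV cache and runtime overhead is an assumed ballpark, not an exact figure:

```shell
# Rough VRAM estimate in GB for a quantized model:
#   params (billions) * bits-per-weight / 8 bits-per-byte, plus ~20% overhead
estimate_vram() {
  awk -v p="$1" -v b="$2" 'BEGIN { printf "%.1f\n", p * b / 8 * 1.2 }'
}

estimate_vram 20 4   # 20B model at 4-bit -> 12.0 GB, squeezes into 16GB
estimate_vram 70 4   # 70B model at 4-bit -> 42.0 GB, far beyond a 16GB card
```

This lines up with the comments: a 4-bit 20B model just fits on a 16GB card, while 70B needs several times that.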
