Hey everyone! I'm new here, so I hope this is the right place to ask. I'm working on a project where I need to generate a large number of datasets from a single file, which is super demanding on memory, CPU, and I/O. Unfortunately, both my Linode and Hetzner accounts restrict which instance types I can provision. I reached out to Linode support, but they haven't been able to help. I'm looking for alternatives with more flexible server leasing options. Anyone have suggestions?
5 Answers
You mentioned you're also restricted on Hetzner. If you were invited by another user to join their project, the limits are tied to your own account, so you may need to complete or upgrade your own account details before Hetzner will lift them.
Have you considered using Apache Spark with multiple nodes? It's built for large-scale data processing and can spread the memory, CPU, and I/O load across several machines instead of relying on one big box.
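If it helps, here's a minimal sketch of what that could look like in PySpark; the file paths, the "category" column, and the partition count are placeholders, and it assumes you have a cluster (or even just one rented machine in local mode) to point it at.

```python
# Minimal PySpark sketch (assumes `pip install pyspark`).
# Paths, the "category" column, and the partition count are placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("dataset-generation")
    .getOrCreate()
)

# Read the big source file once; Spark splits it across executors,
# so no single node has to hold the whole thing in memory.
source = spark.read.csv("/data/big-input.csv", header=True, inferSchema=True)

# Fan the single file out into many datasets: partitionBy writes a
# separate directory of Parquet files for each value of "category".
(
    source.repartition(64)
    .write.mode("overwrite")
    .partitionBy("category")
    .parquet("/data/datasets/")
)

spark.stop()
```

The same script runs unchanged whether you point it at one machine or a managed cluster, so you're not tied to any single provider's instance limits.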
If you’re limited on Hetzner, have you thought about applying for a limit increase? If you own the account, you can request it directly through their support.
If you really need 128 GB of RAM or more, it's puzzling that Linode isn't working for you; they offer high-memory plans with up to around 300 GB of RAM. Maybe the issue lies elsewhere?
I can't get those larger instances because of my account's restrictions.
You might want to check out Azure Batch. It lets you scale resources up as needed for data-processing jobs. Just keep in mind that scaling without limits can get pricey! How much memory are you aiming for, though?
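If you do go that route, here's a very rough sketch of creating a high-memory pool with the azure-batch Python SDK; the account name, key, URL, pool id, VM size, and image are all assumptions you'd replace with your own, and you'd still submit jobs and tasks to the pool separately.

```python
# Rough sketch using the azure-batch Python SDK (pip install azure-batch).
# Account name/key/URL, pool id, VM size, and image are placeholders.
from azure.batch import BatchServiceClient
from azure.batch.batch_auth import SharedKeyCredentials
import azure.batch.models as batchmodels

credentials = SharedKeyCredentials("mybatchaccount", "<account-key>")
client = BatchServiceClient(
    credentials,
    batch_url="https://mybatchaccount.eastus.batch.azure.com",
)

# Ask for a small pool of memory-optimized VMs
# (Standard_E16s_v3 = 16 vCPUs / 128 GB RAM per node).
pool = batchmodels.PoolAddParameter(
    id="dataset-pool",
    vm_size="Standard_E16s_v3",
    virtual_machine_configuration=batchmodels.VirtualMachineConfiguration(
        image_reference=batchmodels.ImageReference(
            publisher="canonical",
            offer="0001-com-ubuntu-server-focal",
            sku="20_04-lts",
            version="latest",
        ),
        node_agent_sku_id="batch.node.ubuntu 20.04",
    ),
    target_dedicated_nodes=2,
)
client.pool.add(pool)

# Delete the pool when the run is done so you stop paying for the nodes:
# client.pool.delete("dataset-pool")
```

One caveat: new Azure subscriptions also come with per-family vCPU quotas, so you may still need to file a quota increase before a pool like this will actually allocate.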
I might have misunderstood, but doesn't Spark require you to own the nodes? I can't afford that.