How to Trigger Kubernetes Jobs Across Multiple EKS Clusters?

Asked By TechWhiz37 On

I'm working on a central dashboard that's running in its own EKS cluster. This dashboard needs to trigger long-running Kubernetes Jobs in various target EKS clusters – one for each environment (dev, qa, uat, prod). The workflow is straightforward: the dashboard sends a request with parameters, then the target cluster runs a job (like db-migrate, data-sync, report-gen, etc.), and once the job completes, the dashboard receives the status and logs.

Here's my current setup:
- The target clusters have public API endpoints that are secured with strict IP allowlists.
- The dashboard requires permissions only to create jobs and read their statuses within a specific namespace (no cluster-admin access necessary).
- Every trigger needs to be auditable (it should track who ran what and when, along with the parameters used).

While I'm okay with using public endpoints and IP restrictions for now, I'm curious if this method is truly scalable and secure as the number of clusters increases. I'm looking for ways to improve the design for scalability, including aspects like networking, secure parameter passing, RBAC and auditability, and managing operational overhead across 4 to 10 clusters. If anyone has dealt with something similar, I'd appreciate your insights, links to resources, or any useful diagrams!
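
For reference, the trigger path today is roughly the sketch below, using the kubernetes Python client. The kubeconfig context name, the "ops-jobs" namespace, the image, and the parameter names are placeholders, not our real values:

```python
# Simplified sketch of the current push model (kubernetes Python client).
# Context name, namespace, image, and labels are placeholders.
from kubernetes import client, config

def trigger_job(context: str, name: str, params: dict) -> None:
    # Load the kubeconfig context for the target cluster; the credentials behind it
    # are scoped to creating/reading Jobs in a single namespace, not cluster-admin.
    api = client.BatchV1Api(config.new_client_from_config(context=context))
    env = [client.V1EnvVar(name=k, value=v) for k, v in params.items()]
    job = client.V1Job(
        metadata=client.V1ObjectMeta(name=name, labels={"triggered-by": "dashboard"}),
        spec=client.V1JobSpec(
            backoff_limit=1,
            template=client.V1PodTemplateSpec(
                spec=client.V1PodSpec(
                    restart_policy="Never",
                    containers=[client.V1Container(
                        name="task",
                        image="registry.example.com/db-migrate:latest",
                        env=env,
                    )],
                )
            ),
        ),
    )
    api.create_namespaced_job(namespace="ops-jobs", body=job)

def job_status(context: str, name: str) -> str:
    # Polled by the dashboard until the Job finishes.
    api = client.BatchV1Api(config.new_client_from_config(context=context))
    status = api.read_namespaced_job_status(name=name, namespace="ops-jobs").status
    if status.succeeded:
        return "succeeded"
    if status.failed:
        return "failed"
    return "running"
```

Each environment gets its own kubeconfig context, and the service account behind it only has create/get on Jobs in that one namespace.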

6 Answers

Answered By MessageBusBoss On

I recommend using a message bus like Kafka to decouple the components. Each target cluster consumes its own topic and creates the Job locally from the message it receives, so you don't need public API access at all. You'd still need to layer access control and audit logging on top of the bus, but the networking problem goes away.
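
Rough sketch of the dashboard side, assuming kafka-python and one topic per environment; the broker address, topic name, and message fields are made up for illustration. A small consumer in each target cluster would read its topic and create the Job with its own namespaced credentials:

```python
# Dashboard side: publish a job request instead of calling the cluster API directly.
# Broker address, topic name, and message fields are illustrative only.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="kafka.internal:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("job-requests.qa", {
    "job": "db-migrate",
    "params": {"TARGET_DB": "qa"},
    "requested_by": "alice",   # captured so every trigger is auditable
})
producer.flush()
```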

Answered By K8sDevMaster On

As others have mentioned, using Argo or Flux would definitely work. But really, keep your API off the public internet!

Answered By EventRunner32 On

Why not go for SQS combined with KEDA? The dashboard enqueues one message per run, and KEDA in each target cluster scales a Job from the queue, so the clusters pull their own work and nothing from outside ever has to reach their API servers. See the sketch below.
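
Something like this on the dashboard side, using boto3 (the queue URL and message fields are placeholders), with a KEDA ScaledJob using the aws-sqs-queue scaler on the cluster side (that config not shown here):

```python
# Dashboard side: enqueue a job request; KEDA's aws-sqs-queue scaler in the
# target cluster spins up a Job per pending message.
# Queue URL, parameter names, and values are placeholders.
import json
import boto3

sqs = boto3.client("sqs", region_name="us-east-1")
sqs.send_message(
    QueueUrl="https://sqs.us-east-1.amazonaws.com/123456789012/job-requests-qa",
    MessageBody=json.dumps({
        "job": "report-gen",
        "params": {"REPORT_DATE": "2024-01-31"},
        "requested_by": "dashboard",
    }),
)
```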

Answered By KubePilot45 On

You should aim for a pull model where each cluster fetches its instructions instead of the dashboard pushing them. Something like ArgoCD can do this out of the box, or you can run a small in-cluster agent of your own.
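
If you go the custom route, the agent can be tiny. A hedged sketch, assuming a hypothetical /api/pending-jobs endpoint on your dashboard and placeholder names throughout:

```python
# Hypothetical in-cluster agent: poll the dashboard for work, create Jobs locally.
# The dashboard endpoint, payload fields, namespace, and auth handling are all made up.
import time
import requests
from kubernetes import client, config

config.load_incluster_config()   # the agent runs inside the target cluster
batch = client.BatchV1Api()

while True:
    resp = requests.get("https://dashboard.internal/api/pending-jobs?env=qa", timeout=10)
    for req in resp.json():
        job = client.V1Job(
            metadata=client.V1ObjectMeta(name=req["name"]),
            spec=client.V1JobSpec(
                template=client.V1PodTemplateSpec(
                    spec=client.V1PodSpec(
                        restart_policy="Never",
                        containers=[client.V1Container(
                            name="task",
                            image=req["image"],
                            env=[client.V1EnvVar(name=k, value=v)
                                 for k, v in req.get("params", {}).items()],
                        )],
                    )
                )
            ),
        )
        batch.create_namespaced_job(namespace="ops-jobs", body=job)
    time.sleep(30)
```

The big win is that the clusters only make outbound connections, so their API endpoints never need to be public at all.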

Answered By CloudGuru22 On

You might want to check out ArgoCD for this. Personally, I would avoid having public API endpoints, even with IP restrictions, but that's just my cautious side talking.

Answered By DataNinja88 On

Definitely consider adding Argo Workflows along with ArgoCD if you want to run parameterized jobs. It has an API you can connect to for triggering jobs with parameters directly from your dashboard. Useful stuff!
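
For example, the Argo Workflows server exposes a submit endpoint you can call from the dashboard. Rough sketch with requests; the server URL, namespace, template name, and token are placeholders, and you should double-check the payload shape against the Argo API docs:

```python
# Submit an existing WorkflowTemplate with parameters via the Argo Workflows
# server REST API. Server URL, namespace, template name, and token are placeholders.
import requests

resp = requests.post(
    "https://argo.qa.internal/api/v1/workflows/ops-jobs/submit",
    headers={"Authorization": "Bearer <token>"},
    json={
        "resourceKind": "WorkflowTemplate",
        "resourceName": "db-migrate",
        "submitOptions": {
            "parameters": ["target_db=qa", "dry_run=false"],
            "labels": "triggered-by=dashboard,requested-by=alice",  # handy for auditing
        },
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["metadata"]["name"])   # name of the workflow that was created
```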

ScriptMaster99 -

That's a great idea! Integrating Argo Workflows could really streamline the whole process.
