I'm working on a cloud hygiene tool that evaluates and cleans up resources, and I'm using a CI process that looks like this:
1. Set up Python with a specified version.
2. Install the CleanCloud package along with its dependencies.
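For concreteness, the two steps above might look like this inside a CI job's shell. The package name `cleancloud` and the version pin are assumptions on my part, not confirmed names; on GitHub-hosted runners, `actions/setup-python` would replace the `pyenv` lines:

```shell
# Hypothetical CI steps; 'cleancloud' is an assumed package name.
# 1. Pin a Python version (here via pyenv; actions/setup-python is the
#    equivalent step on GitHub-hosted runners).
pyenv install -s 3.12
pyenv local 3.12

# 2. Install the tool along with its dependencies into the job's environment.
python -m pip install --upgrade pip
python -m pip install cleancloud
```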
This setup works well, but I'm wondering whether requiring Python in CI/CD is a poor experience for users. Ideally, I'd like to simplify things so that users can:
- Download a single executable file
- Run it directly in their CI
- Avoid dealing with Python versions and dependencies
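The single-executable experience above is usually achieved by bundling the interpreter with the tool. As a sketch only, assuming a PyInstaller-style bundler and a hypothetical `cleancloud` entry-point script (neither confirmed by the project):

```shell
# Build a self-contained binary from a Python CLI (sketch; the module
# path and entry-point name are assumptions).
python -m pip install pyinstaller
pyinstaller --onefile --name cleancloud src/cleancloud/__main__.py

# Users would then download the artifact and run it with no Python
# installed on the runner:
./dist/cleancloud --help
```

The trade-off is that you now build and publish one artifact per OS/architecture instead of one `pip install`-able package.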
Here are some specific questions:
- Is relying on Python acceptable in CI/CD workflows?
- Should I aim for creating a standalone binary instead?
- What are the best practices for distributing Python-based tools without burdening users with Python management?
I'm looking forward to hearing from those of you who have experience with similar tools in real-world CI/CD pipelines!
2 Answers
It's definitely worth using something like a requirements.txt to pin your dependencies. If Python is part of the pipeline, there's no major issue: many CI environments already have it installed, much as Node commonly is. You could also cache your package installations to speed things up!
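As a sketch of the caching idea (paths are assumptions; on GitHub Actions the `actions/cache` step plays this role, typically keyed on a hash of the lockfile):

```shell
# Pip keeps its own wheel/download cache; persisting that directory
# between CI runs avoids re-downloading dependencies each time.
pip cache dir   # prints the cache location, e.g. ~/.cache/pip on Linux

# With the cache directory restored by the CI system, installs from a
# pinned requirements file mostly hit the local cache:
pip install -r requirements.txt
```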
Using a Python dependency in CI/CD can work just fine, especially with ephemeral runners. It might feel a bit cumbersome, but it's manageable. A standalone binary might be a cleaner approach in theory, but as long as you're using cloud-hosted runners that are wiped after each run, I think the Python setup is acceptable. You might also consider packaging your tool as a GitHub Action, similar to what we're doing for cleaning up old workflows.

That makes sense! Caching could help a lot and speed up the process after the first run.