Hey everyone! 👋 I'm currently working on an exciting (and slightly daunting) project. Our customer base is global, with teams located in Australia, the US, and Europe. I need to establish an infrastructure that allows them to quickly and securely retrieve container images from a geographically relevant registry. However, it's not just about speed; I need to ensure that what they download is legitimate and untouched, maintaining the authenticity and versioning we promised. Essentially, when a customer uses our software, I want them to be 100% confident that:
1. It originates from us
2. It hasn't been modified in any way
3. It's the exact version they expected
I'm brainstorming the best strategies to achieve this. Could edge replication or verified signatures be the answer, or is there a more effective approach? I'd really appreciate hearing how others have dealt with similar trusted software delivery challenges at scale!
5 Answers
You should definitely check out the Continuous Delivery book; it covers trusted, repeatable release pipelines in depth. Also take a hard look at how mature your deployment processes are: automated testing and staged rollout strategies are crucial for minimizing risk.
Don't forget to generate a Software Bill of Materials (SBOM) alongside your artifacts so consumers can verify exactly what went into each build. For signing, consider cosign from the Sigstore project, which lets you sign images and verify those signatures through your registry.
A straightforward approach is pulling images by their SHA-256 digest rather than by tag. You can publish the digest ahead of time and share it with your users. That makes tampering detectable, because the local runtime re-verifies the digest on every pull!
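To make that concrete, here's a tiny runnable sketch of what a digest pin actually checks. In real use you'd run something like `docker pull registry.example.com/app@sha256:<digest>` (hostname and digest are placeholders) and the runtime does this comparison for you:

```shell
# Simulate digest pinning on a local file standing in for an image layer.
printf 'image layer contents' > layer.bin
DIGEST=$(sha256sum layer.bin | cut -d' ' -f1)
echo "pin this: app@sha256:$DIGEST"

# On the consumer side, re-hash and compare before trusting the bits:
echo "$DIGEST  layer.bin" | sha256sum -c -
```

Any modification to `layer.bin` after the digest was shared makes the final check fail, which is exactly the tamper-evidence you get from digest pulls.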
I like that! It sounds like a solid way to enhance security.
One important step is to sign your artifacts. Signing is crucial, but you should also have a solid method for verifying those signatures when containers are pulled, especially in different locations. I'm really curious about how to manage keys and make sure everything ties back to a trusted source, though!
Absolutely, verifying signatures at the pull stage is key. I'm also looking into efficient key management, as it really ties everything together.
I recommend one Harbor instance per region, set up as a pull-through cache of a central registry. You push images to the central registry, and the first pull in a region caches them locally. It's best to avoid mutable tags: only allow pulls by digest, and include a signed Software Bill of Materials (SBOM) with each image. That way everyone knows exactly what they're getting!
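A toy model of the pull-through behavior, in case it helps: Harbor's proxy-cache projects behave like this per region (directory and artifact names below are invented for the demo):

```shell
# "central" stands in for the central registry, "regional-au" for one regional cache.
mkdir -p central regional-au
printf 'app image v1' > "central/app-sha256-abc"   # push once to the central registry

pull() {  # pull <artifact>: serve from the regional cache, filling it on a miss
  if [ -f "regional-au/$1" ]; then
    echo "cache hit: $1 served from the regional registry"
  else
    echo "cache miss: fetching $1 from central"
    cp "central/$1" "regional-au/$1"
  fi
}

pull "app-sha256-abc"   # first regional pull: miss, populates the cache
pull "app-sha256-abc"   # every later pull: hit, served locally
```

The upshot is that only the first pull in a region pays the cross-ocean latency; everyone after that pulls from nearby, while the central registry stays the single source of truth.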
This setup sounds clean! I love the SHA-only pulls and the SBOM idea for transparency. How do you handle checking signatures, though? Is it done at the regional or central level?

I haven't dug into the Continuous Delivery book yet, but it's now at the top of my reading list. Thanks for the recommendation!