I'm dealing with a unique challenge in our air-gapped environment where we can't pull artifacts directly from the internet. Keeping our minimal container images up to date with the latest security patches is tough. What strategies do you employ to safely automate vulnerability updates in such environments?
5 Answers
Consider setting up an internal registry for your patched images. You might also want to establish automated vulnerability scanning pipelines outside of your isolated network. Controlled synchronization and immutable tagging can be effective in keeping a large number of minimal images current without compromising security.
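The immutable-tagging idea can be sketched as a small promotion helper: instead of re-pointing a mutable tag like `:latest`, derive a tag that encodes the sync date and a prefix of the manifest digest (which you could obtain from `skopeo inspect`). The function name and tag scheme below are illustrative, not a standard:

```shell
#!/bin/sh
# Derive an immutable tag for an image being promoted into the
# internal registry. Arguments: name, base tag, manifest digest.
immutable_tag() {
  name="$1"      # e.g. "debian"
  base_tag="$2"  # e.g. "bookworm"
  digest="$3"    # e.g. "sha256:0123..."
  # Sync date plus a short digest prefix: the tag's meaning never changes.
  short="$(echo "$digest" | cut -d: -f2 | cut -c1-12)"
  printf '%s:%s-%s-%s\n' "$name" "$base_tag" "$(date -u +%Y%m%d)" "$short"
}

# e.g. prints something like debian:bookworm-20240101-0123456789ab
immutable_tag debian bookworm "sha256:0123456789abcdef0123456789abcdef"
```

Because the tag embeds the digest prefix, two pulls of the same tag are guaranteed to be the same bytes, which makes audits and rollbacks in the internal registry much simpler.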
Just a reminder: isolated and air-gapped systems are not the same thing. An isolated network may still permit tightly controlled, proxied egress, while a true air gap means artifacts can only cross the boundary on physical media. Either way, maintaining a mirror of your upstream registries is a solid strategy for managing images.
You might want to utilize a repository manager such as JFrog Artifactory or Sonatype Nexus. Deployed in your DMZ, these act as a gateway between the outside world and the internal air-gapped servers, fetching and caching the necessary repositories while keeping the security boundary intact.
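On the internal hosts, the Docker daemon can then be pointed at that gateway rather than the public registry. A minimal sketch of `/etc/docker/daemon.json`, assuming a hypothetical mirror hostname:

```json
{
  "registry-mirrors": ["https://mirror.dmz.example.internal"]
}
```

Note that `registry-mirrors` only applies to Docker Hub pulls; images hosted on other registries need to be referenced by your internal registry's name explicitly.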
For a more hands-on approach, mirror your base images weekly to a DMZ bastion with a command like `skopeo copy docker://docker.io/library/debian:bookworm docker://bastion.dmz/library/debian:bookworm` (hostnames here are placeholders), then promote them into your internal Harbor or Quay registry. Employ reproducible multi-stage Dockerfile builds, scan with a tool like Trivy, and automate vulnerability patching through a GitOps pipeline so every change is tracked in your CI/CD within the air-gapped environment.
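The reproducible-build part might look like the following multi-stage Dockerfile sketch; the `registry.internal` host and the digest placeholder are assumptions you would replace with your own mirror and a vetted base digest:

```dockerfile
# Build stage: toolchain comes from the internal mirror, not the internet
FROM registry.internal/library/golang:1.22 AS build
WORKDIR /src
COPY . .
RUN CGO_ENABLED=0 go build -o /app ./cmd/app

# Runtime stage: pin the vetted base image by digest so rebuilds
# are tied to an approved, already-scanned base
FROM registry.internal/library/debian@sha256:REPLACE_WITH_VETTED_DIGEST
COPY --from=build /app /usr/local/bin/app
ENTRYPOINT ["/usr/local/bin/app"]
```

When Trivy flags the base image, you bump the digest in one place, rebuild, and the GitOps history records exactly which digest each deployment used.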
Make sure to only transfer vetted images into your air-gapped environment, using secure methods such as signed tarballs or registry-to-registry mirroring. Automating the tagging and promotion of these images keeps the process consistent, and reproducible builds make it easier to roll a vulnerability fix out across every image that shares the same base.
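For the tarball route, a checksum manifest generated on the connected side and verified on the air-gapped side gives you a basic integrity gate. The export would typically come from something like `skopeo copy docker://debian:bookworm oci-archive:transfer/debian-bookworm.tar`; the stand-in file below just demonstrates the verify step:

```shell
#!/bin/sh
set -eu
# Connected side: export the image and record its checksum.
# A stand-in file replaces the real OCI archive here.
mkdir -p transfer
printf 'stand-in for the real OCI archive' > transfer/debian-bookworm.tar
( cd transfer && sha256sum debian-bookworm.tar > SHA256SUMS )

# Air-gapped side: verify before importing; a mismatch aborts the import.
( cd transfer && sha256sum -c SHA256SUMS ) && echo "verified: safe to import"
```

For stronger guarantees than a checksum, signing the archives (for example with cosign or GPG) lets the air-gapped side verify who produced them, not just that they arrived intact.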

Another approach is to build, patch, and scan the images in an internet-connected environment before securely transferring them to your internal registry. Automate your builds and vulnerability scans, and use immutable tagging along with periodic audits for security.
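The periodic-audit step can be as simple as checking how long ago each image was last synced. A sketch, assuming a hypothetical `promotions.log` that your transfer tooling appends to (`image:tag` plus sync date; uses GNU `date`):

```shell
#!/bin/sh
set -eu
# Hypothetical promotion log written by the transfer tooling.
cat > promotions.log <<'EOF'
debian:bookworm-20240101 2024-01-01
alpine:3.19-20990101 2099-01-01
EOF

# Flag anything not re-synced within the last 90 days.
cutoff="$(date -u -d '90 days ago' +%s)"
while read -r image synced; do
  if [ "$(date -u -d "$synced" +%s)" -lt "$cutoff" ]; then
    echo "STALE: $image (last synced $synced)"
  fi
done < promotions.log
```

Wiring a report like this into a scheduled job inside the air gap gives you an early warning when the external sync process has silently stalled.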