How to Automatically Rebuild and Deploy a Static Site with Docker Compose?

Asked By CuriousCoder42 On

I've been diving into automating the deployment of a static site using Docker Compose on my Synology NAS. The idea is to streamline the process so that new content can be generated and deployed without needing any manual steps. Here's how I set it up:

1. A scheduled task runs daily.
2. A Python script creates new markdown content and checks it for errors.
3. Docker Compose executes a build using the Astro tool inside a container.
4. The nginx container restarts, serving the updated site live.

The rebuild and restart take about a minute in total, but downtime is minimal: because it's a static site, the old version keeps serving while the new build runs, and only the brief nginx restart at the end interrupts it.
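For context, here is a minimal sketch of how that kind of pipeline could be wired up in Compose. It is not my actual config: the service names, images, ports, and paths are placeholders, and it assumes the Astro source lives in `./site` with the build output in `./site/dist`.

```yaml
# Hypothetical docker-compose.yml for this setup; names and paths are placeholders.
services:
  builder:
    image: node:20-alpine                 # Node base image for the Astro build
    working_dir: /site
    volumes:
      - ./site:/site                      # source tree (the generated markdown lands here)
    command: ["sh", "-c", "npm ci && npx astro build"]
    profiles: ["build"]                   # only runs when invoked explicitly

  web:
    image: nginx:alpine
    volumes:
      - ./site/dist:/usr/share/nginx/html:ro   # serve the build output
    ports:
      - "8080:80"
    restart: unless-stopped
```

With something like this, the daily task boils down to running the content script, then `docker compose run --rm builder`, then `docker compose restart web`.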

I'm curious how others handle automated static-site deployments in self-hosted environments. Do you use Docker Compose like I do, rely on Git hooks, or have a more advanced method?

5 Answers

Answered By BuildItBetter On

I'm considering separating my build and web containers more cleanly. Right now they both sit in the same Docker setup, but I'm thinking it might be more efficient to mount the `dist` folder via a shared volume instead of restarting NGINX every time. Is anyone trying a similar approach?
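If it helps, here's a rough sketch of that shared-volume wiring, assuming an Astro build like the OP's. Service names, images, and paths are placeholders rather than a tested setup.

```yaml
# Sketch: a named volume written by the builder and served read-only by NGINX,
# so no restart is needed after a rebuild. Names and images are placeholders.
services:
  builder:
    image: node:20-alpine
    working_dir: /site
    volumes:
      - ./site:/site
      - dist:/site/dist                   # build output lands in the named volume
    command: ["sh", "-c", "npm ci && npx astro build"]
    profiles: ["build"]

  web:
    image: nginx:alpine
    volumes:
      - dist:/usr/share/nginx/html:ro     # NGINX serves the same volume
    ports:
      - "8080:80"

volumes:
  dist:
```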

Answered By TechieTribe99 On

When you're serving static files, you only need to restart NGINX if you're changing its configuration. You can use a bind mount so NGINX serves the static files straight from the build output, with no downtime at all. That could simplify your setup a lot: the build container just writes into the directory NGINX is already serving.
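For concreteness, with that bind mount in place the daily task could shrink to something like the following (the script name is a placeholder for the OP's content generator):

```sh
python3 generate_content.py        # write and validate the new markdown (hypothetical script)
docker compose run --rm builder    # rebuild the site into the bind-mounted dist/
# No restart step: NGINX picks up the refreshed files immediately.
```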

CuriousCoder42 -

That’s a great point! You're right that I'm not changing the NGINX config, just generating new static files. Using a bind mount does seem like a better approach. I’ll look into refactoring my setup to eliminate the restart step. Thanks for the tip!

Answered By DevDynamo On

You might find GitHub Actions helpful for automating the build and deployment process. If you're using Portainer, it has a "GitOps" mode that updates stacks automatically when you push changes. The flow could look like this: GitHub Actions builds the image and pushes it to GitHub Packages, then Portainer deploys the new version.
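A minimal workflow for the build-and-push half of that flow might look roughly like this; the branch, image name, and file path are assumptions, and Portainer's GitOps polling (or a webhook) would handle the deploy side.

```yaml
# Hypothetical .github/workflows/deploy.yml; tags and branch are placeholders.
name: build-and-push
on:
  push:
    branches: [main]

jobs:
  build:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      packages: write
    steps:
      - uses: actions/checkout@v4
      - uses: docker/login-action@v3
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
      - uses: docker/build-push-action@v5
        with:
          context: .
          push: true
          tags: ghcr.io/${{ github.repository }}:latest
```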

If you're avoiding external services, consider a Python script that checks your GitHub repo for updates. When it detects a change, it pulls the latest code and runs a build, then uses `rsync` to copy the output into a directory your web server container bind-mounts. This method can help avoid downtime during updates!
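As a rough sketch of that polling idea (paths, branch, and build commands are placeholders, and error handling is omitted):

```python
#!/usr/bin/env python3
"""Poll a git repo, rebuild on change, and rsync the output into the served directory.
All paths and the branch name below are hypothetical placeholders."""
import subprocess

REPO_DIR = "/volume1/sites/my-site"        # local clone (placeholder path)
DEPLOY_DIR = "/volume1/docker/nginx/html"  # directory bind-mounted into NGINX (placeholder)
BRANCH = "main"

def run(*cmd, cwd=REPO_DIR):
    # Run a command in the repo, fail loudly on error, return its stdout.
    return subprocess.run(cmd, cwd=cwd, check=True,
                          capture_output=True, text=True).stdout.strip()

def main():
    run("git", "fetch", "origin", BRANCH)
    local = run("git", "rev-parse", "HEAD")
    remote = run("git", "rev-parse", f"origin/{BRANCH}")
    if local == remote:
        return  # nothing new to deploy

    run("git", "merge", "--ff-only", f"origin/{BRANCH}")
    run("npm", "ci")
    run("npx", "astro", "build")           # writes the site into dist/
    # Trailing slashes matter: copy the contents of dist/ into the served directory.
    run("rsync", "-a", "--delete", "dist/", f"{DEPLOY_DIR}/")

if __name__ == "__main__":
    main()
```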

CuriousCoder42 -

That GitHub Actions to Portainer workflow sounds solid, especially for bigger projects. However, I'm keeping this project self-contained on my NAS for learning purposes. I do like your bind-mount and rsync idea, though; it aligns with what I'm exploring!

Answered By CodeCrafter On

I've explored this too! Docker Compose inspired a tool I built for the same purpose, and I've open-sourced it; you can check it out [here](https://github.com/haloydev/haloy). It's been a fun project!

Answered By OldSchoolCoder On

Honestly, I still do things the manual way: I build the site, containerize it, push the image to GitHub Container Registry, then pull it on my web server, bring the old container down, and bring the new one up. There's definitely room for improvement on the automation front!
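That flow is roughly the following; the image name and registry path are placeholders.

```sh
# On the build machine:
docker build -t ghcr.io/USER/my-site:latest .
docker push ghcr.io/USER/my-site:latest

# On the web server:
docker pull ghcr.io/USER/my-site:latest
docker compose down && docker compose up -d
```

One small improvement within the same approach: after the pull, `docker compose up -d` on its own recreates only the container whose image changed, which shortens the gap compared with a full down/up.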
