I've been struggling for about ten hours to get a bash script working on my Synology NAS. I've been using ChatGPT for assistance, but every solution seems to create more problems.

I want the script to run every 6 hours and terminate cleanly so the scheduler doesn't report errors. SSH is usually disabled, so I can't use crontab, and the DSM GUI only allows limited scheduling. My goal is an incremental backup of the /virtual folder on an FTP server, which has many subfolders filled with tiny files. The general flow I'm aiming for:

1. Create an initial full backup of the /virtual folder.
2. On subsequent runs, copy the previous backup into a new folder named with the current timestamp.
3. Only download new or changed files from the FTP server.
4. Terminate cleanly afterwards.

Unfortunately, my latest iterations of the script download everything again on every run, and the whole thing has become increasingly buggy. I'd love some advice on how to fix it, keeping in mind I can only use FTP, not SFTP or rsync.
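For reference, here's roughly the structure I have in mind (not the broken version I actually have). It's untested, it assumes `lftp` is installed on the NAS (which it may not be out of the box), and the host, credentials, and paths are just placeholders:

```bash
#!/bin/bash
# Rough sketch only -- FTP host, credentials and local paths are placeholders.
BACKUP_ROOT="/volume1/backups"             # where backups live on the NAS
FTP_HOST="ftp.example.com"
FTP_USER="user"
FTP_PASS="password"
STAMP="$(date +%Y-%m-%d_%H%M)"             # folder name for this run
DEST="$BACKUP_ROOT/$STAMP"

shopt -s nullglob
previous=( "$BACKUP_ROOT"/*/ )             # existing backups, sorted oldest to newest

if (( ${#previous[@]} > 0 )); then
    # Copy the latest backup forward so unchanged files are already in place.
    cp -a "${previous[-1]%/}" "$DEST"
else
    mkdir -p "$DEST"                       # first run: start a full backup
fi

# Fetch only files that are new or newer than the local copy.
lftp -u "$FTP_USER,$FTP_PASS" \
     -e "mirror --only-newer --verbose /virtual $DEST; quit" \
     "ftp://$FTP_HOST"

exit 0                                     # terminate cleanly for the Task Scheduler
```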
3 Answers
I see you're trying to work around Synology's limitations. Have you thought about running Docker on your NAS? A container ships its own scheduling and command-line tools, so the backup job can live entirely inside it and you aren't limited to SSH access or what the DSM scheduler exposes. Just something to consider.
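To make that concrete, here's the sort of thing I mean, as a rough sketch only: the image, the /volume1/backups mount, and the FTP details are all placeholders, and `lftp` gets installed inside the container at start-up.

```bash
# Sketch: run the FTP mirror in a throwaway Alpine container on a 6-hour loop.
docker run -d --name ftp-backup \
  -v /volume1/backups:/backups \
  alpine:3.19 sh -c '
    apk add --no-cache lftp
    while true; do
      lftp -u user,password \
           -e "mirror --only-newer /virtual /backups/current; quit" \
           ftp.example.com
      sleep 21600    # 6 hours
    done'
```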
If you're getting unexpected results with your file listing, a more straightforward command could help. Parsing the output of `ls` is fragile; `backups=( * )` builds an array of directory entries directly with the shell's own globbing. Keeping it simple gives clearer results and fewer surprises.
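For example, something along these lines. It's just a sketch: the backup location and the timestamp-named folders are assumptions, and the negative array index needs bash 4.3 or newer.

```bash
cd /volume1/backups || exit 1   # backup location -- adjust to yours
shopt -s nullglob               # empty directory gives an empty array, not a literal '*'
backups=( */ )                  # every backup folder, sorted by name by the shell
if (( ${#backups[@]} > 0 )); then
    latest="${backups[-1]%/}"   # with timestamped names, the last entry is the newest
    echo "Most recent backup: $latest"
fi
```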
Before making major changes, try debugging your script with `set -xv` at the top: `-v` prints each line as the shell reads it and `-x` prints each command as it is executed, so you can see step by step where things go wrong. And remember to save a copy of the script each time you add a new part, so you can revert if needed!
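Since the DSM Task Scheduler doesn't give you a terminal to watch, it also helps to send the trace to a file you can read afterwards. A minimal sketch (the log path is just an example):

```bash
#!/bin/bash
set -xv                                    # -v echoes lines as read, -x echoes commands as run
exec >>/volume1/backups/debug.log 2>&1     # capture all output, including the trace, in a log
# ... rest of the backup script ...
```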
Totally agree! Also, while you're at it, consider using the `script` command to log everything during execution. That gives you a comprehensive log to analyze later.
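Something like this, assuming `script` is actually present on the NAS; both paths are placeholders:

```bash
# Record the whole run, including the bash trace, into one session log.
script -c "/volume1/scripts/backup.sh" /volume1/backups/backup-session.log
```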
Good idea! It might also help to break the task down into smaller pieces and test them individually; that way you can pinpoint where things start to go awry.
Exactly! Simplifying where possible could save you a lot of headaches.