Automation

Backups are only useful if they run on a regular schedule, and relying on yourself to run them periodically is not a good strategy. It’s easy to set up automated backups, so let’s get to it!

On Mac OSX systems, Full Disk Access must be enabled first to grant HashBackup access to your data. Without it there will be permission errors and much of your data cannot be backed up.
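
Full Disk Access is granted per application, typically under System Preferences → Security & Privacy → Privacy → Full Disk Access. Which applications need it depends on how backups run: the terminal application used for interactive backups and cron (for scheduled backups) are the usual candidates. A quick, hedged check from a terminal (the Mail folder is just one convenient protected location):

    # without Full Disk Access, reading a protected folder fails with
    # "Operation not permitted", even as root
    sudo ls ~/Library/Mail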

Setup Automated Backups

  1. run sudo sh to get a root shell:

    $ sudo sh
    Password:
    #
  2. as root, create a new backup directory with hb init, usually at the root level for a system-wide backup. This example uses /hbbackup:

    $ hb init -c /hbbackup
    HashBackup #2677 Copyright 2009-2022 HashBackup, LLC
    Backup directory: /hbbackup
    Permissions set for owner access only
    Created key file /hbbackup/key.conf
    Key file set to read-only
    Setting include/exclude defaults: /hbbackup/inex.conf
    
    VERY IMPORTANT: your backup is encrypted and can only be accessed with
    the encryption key, stored in the file:
    
        /hbbackup/key.conf
    
    You MUST make copies of this file and store them in secure locations,
    separate from your computer and backup data.  If your hard drive fails,
    you will need this key to restore your files.  If you have setup remote
    destinations in dest.conf, that file should be copied too.
    
    Backup directory initialized
  3. set the dedup-mem config option

    $ hb config -c /hbbackup dedup-mem 1g
    HashBackup #2677 Copyright 2009-2022 HashBackup, LLC
    Backup directory: /hbbackup
    Current config version: 0
    
    Set dedup-mem to 1g (was 0) for future backups
  4. edit /hbbackup/inex.conf to add files and directories to exclude
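
    hb init already created this file with a set of default exclude rules; edit it from the root shell (vi /hbbackup/inex.conf) and add your own. The lines below are a hedged illustration only; the ex keyword and the paths are assumptions, so follow the syntax of the default entries already in the file and the inex.conf documentation:

    # example exclude rules added to /hbbackup/inex.conf (hypothetical paths)
    ex /Users/*/Library/Caches/
    ex /private/tmp/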

  5. create /hbbackup/dest.conf and set up remote destinations for the backup. This example uses:

    • a directory destination for a USB drive named hbusb1 mounted as /Volumes/hbusb1

    • a destination for the Backblaze B2 storage service. This can be set up later if you don’t want to do it now.

    • a second dir destination, hbusb2, could be set up to rotate between two USB drives, keeping one at another location

    • for SSD drives, leave out workers 1; it is only needed for spinning drives.

    # NOTE: hbusb1 is mounted at /Volumes/hbusb1
    
    mkdir /Volumes/hbusb1/hbbackup
    
    cat - >/hbbackup/dest.conf   # use control d to exit
    destname hbusb1
    type dir
    dir /Volumes/hbusb1/hbbackup
    workers 1
    
    destname b2
    type b2
    accountid 0123456789ab
    appkey 0123456789abcdef0123456789abcdef0123456789
    bucket hbbackup
    dir myhost1
  6. it’s a good idea to keep a local copy of the backup data in /hbbackup, another copy on a USB drive with a dir destination, and a third copy on a remote storage service. If you’ve decided not to keep a full copy in /hbbackup, set cache-size-limit to the amount of backup data you want to keep there (this can also be done later):

    $ hb config -c /hbbackup cache-size-limit 50GB
    HashBackup #2677 Copyright 2009-2022 HashBackup, LLC
    Backup directory: /hbbackup
    Current config version: 1
    
    Set cache-size-limit to 50GB (was -1) for future backups
  7. test destinations with a small backup to make sure everything is working:

    $ hb backup -c /hbbackup /sbin
    HashBackup #2677 Copyright 2009-2022 HashBackup, LLC
    Backup directory: /hbbackup
    Backup start: 2022-01-19 18:26:43
    Using destinations in dest.conf
    Copied HB program to /hbbackup/hb#2677
    This is backup version: 0
    Dedup enabled, 0% of current size, 0% of max size
    /
    /hbbackup
    /hbbackup/inex.conf
    /sbin
    /sbin/apfs_hfs_convert
    /sbin/autodiskmount
    ...
    /sbin/rtsol
    /sbin/shutdown
    /sbin/umount
    Copied arc.0.0 to hbusb1 (821 KB 0s 505 MB/s)
    Writing hb.db.0
    Copied hb.db.0 to hbusb1 (11 KB 0s 11 MB/s)
    Copied dest.db to hbusb1 (36 KB 0s 38 MB/s)
    
    Time: 0.2s
    CPU:  0.2s, 83%
    Mem:  75 MB
    Checked: 66 paths, 2435539 bytes, 2.4 MB
    Saved: 66 paths, 2434170 bytes, 2.4 MB
    Excluded: 0
    Dupbytes: 44400, 44 KB, 1%
    Compression: 66%, 3.0:1
    Efficiency: 6.00 MB reduced/cpusec
    Space: +821 KB, 858 KB total
    No errors
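
    It can also be worth confirming that a restore works. A minimal sketch using hb get, HashBackup’s restore command; it restores /hbbackup/inex.conf, a file included in the test backup above. Run it from a scratch directory and check the get documentation for where restored files are written:

    mkdir /tmp/restoretest && cd /tmp/restoretest
    hb get -c /hbbackup /hbbackup/inex.conf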
  8. to automate nightly backups, add this root crontab entry to /etc/crontab, the system crontab file:

    cat - >>/etc/crontab        # use control d to exit
    # min     hour     mday     month     wday    user     command
    MAILTO=me@email.com
    PATH=/bin:/usr/bin:/usr/local/bin
    0 0 * * * root nice sh -c "cd /hbbackup && hb backup -c . / --maxtime 7h; hb retain -c . -s30d12m; hb selftest -c . -v4 --inc 1d/30d,1GB; hb upgrade; hb log -c . -e -x1" >/dev/null
    OSX systems may not have an /etc/crontab file, but if you create one, it will work. Prefix your edit command with sudo, since /etc/crontab is a system file.
  9. Finally, and most importantly: make copies of your key.conf and dest.conf files on USB thumb drives, on paper, in phone pictures, and in any other way necessary to make sure you don’t lose your backup key and remote credentials. Keep a thumb drive on your keychain, give copies to trusted relatives or friends, and make sure copies of these two important files exist in multiple locations.
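
    For example, with a thumb drive mounted at /Volumes/usbkey (a hypothetical mount point), both files can be copied from the root shell:

    cp -p /hbbackup/key.conf /hbbackup/dest.conf /Volumes/usbkey/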

    Without the key your backup cannot be accessed!

What It Does

  1. the nice prefix runs backups at a lower priority; remove it if backups are too slow. On OSX, replace it with taskpolicy -d standard for lower priority, or taskpolicy -d throttle for even lower priority
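
    For example, the crontab command from step 8 with nice replaced by taskpolicy (the rest of the line is unchanged):

    0 0 * * * root taskpolicy -d standard sh -c "cd /hbbackup && hb backup -c . / --maxtime 7h; hb retain -c . -s30d12m; hb selftest -c . -v4 --inc 1d/30d,1GB; hb upgrade; hb log -c . -e -x1" >/dev/null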

  2. backup root (/) at midnight every day. Add file systems as needed

  3. stop backup if not finished within 7 hours (--maxtime 7h)

  4. log all output to /hbbackup/logs with timestamps for every line

  5. retain 30 daily backups + one monthly backup for the last 12 months (-s30d12m)

  6. run selftest to check the backup

  7. selftest downloads and verifies all backup data (-v4) over 30 days (1d/30d). Downloads are limited to 1GB per run, so it may take longer than 30 days to complete a cycle if your backup is larger than 30GB.

    selftest -v4 can be expensive for large backups because of download fees (egress data transfer). Adjust --inc accordingly, add --sample to sample your backup, or use the dest verify command instead of selftest for faster and cheaper, though less thorough, remote verification.
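
    A hedged example of the dest verify alternative, assuming the -c option goes right after the command name as with the other hb commands in this guide:

    hb dest -c /hbbackup verify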
  8. check and install HashBackup upgrades. Automated upgrades ensure that you have the latest updates to the HashBackup program and that your backup database stays up to date. The upgrade happens after the backup instead of before so that requests to the HashBackup upgrade server are spread out: everyone’s backup time varies, so upgrade checks follow a somewhat random schedule. Cron sends an email with the release notes when a new version is installed.

  9. summarize and archive the log files, sending an email only if errors occur (-e) with 1 line of context around the error (-x1)

Adjustments

  1. email address to receive release notes and error summary

  2. start time in the first two fields, minute first, then hour. Avoid starting between 1-2AM because those times can occur twice in one day when Daylight Saving Time ends.

  3. the backup directory name in cd /hbbackup

  4. paths backed up. This example backs up everything (/) but does not cross mount points: other filesystems have to be specifically listed

  5. retention policy of older file versions (-s30d12m)

  6. how long to take to verify the entire backup (1d/30d means verify 1/30th of the backup each day)

  7. limit on how much data to download from each destination (1GB); this has priority over the previous time limit

  8. see performance tuning tips if backing up makes your computer too slow

More Frequent Backups

For more frequent backups, the backup command and the maintenance commands can be separated into two jobs in /etc/crontab, as sketched below. In a crontab, each line is a job (or an environment setting such as MAILTO). In the first five time fields, a * means "every interval": a * in the hour field means "every hour", and a * in the day-of-month field means "every day".

This example crontab adds an hourly backup of /Users/jim at the top of every hour (the first field, minute, is zero) between the hours of 6AM and 11PM (6-23).

cat - >>/etc/crontab        # use control d to exit
# min     hour     mday     month     wday    user     command
MAILTO=me@email.com
0 6-23 * * * root nice /usr/local/bin/hb backup -c /hbbackup /Users/jim >/dev/null
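
A sketch of the full separation described above, reusing the maintenance commands from step 8 (paths and times are examples; the nightly backup of / can be kept as its own job as well):

# min     hour     mday     month     wday    user     command
MAILTO=me@email.com
PATH=/bin:/usr/bin:/usr/local/bin
# hourly backup of /Users/jim from 6AM to 11PM
0 6-23 * * * root nice hb backup -c /hbbackup /Users/jim >/dev/null
# nightly maintenance at 3AM (outside the 1-2AM DST window): retain, selftest, upgrade, log
0 3 * * * root nice sh -c "cd /hbbackup && hb retain -c . -s30d12m; hb selftest -c . -v4 --inc 1d/30d,1GB; hb upgrade; hb log -c . -e -x1" >/dev/null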

For an online crontab time editor, check https://crontab.guru

Backups Don’t Run?

If your automated backup job does not run:

  1. There may be a mistake in the cron command. To test this, run sudo sh to become root, then copy and paste the command portion of the cron entry (everything after the user field) to see whether it runs or produces an error.

  2. Your computer may be sleeping — very common with laptops. On OSX, go to System Preferences → Energy Saver → Power Adapter and check the box for "Prevent computer from sleeping automatically when the display is off". Also, don’t close the lid! Closing the lid puts your laptop to sleep, and cron jobs do not run while it is asleep.

  3. On OSX, within Energy Saver, click Schedule and set a wake-up one minute before your cron job is scheduled to start.
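
    The same wake schedule can also be set from a terminal with pmset. A hedged sketch, assuming a midnight cron job so the machine wakes at 11:59PM every day; adjust the time to one minute before your own job:

    # wake (or power on) every day at 11:59PM
    sudo pmset repeat wakeorpoweron MTWRFSU 23:59:00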