Happy customers are the best reward!
At my company we work on cancer screening and prevention; our IT infrastructure is mainly based on Xen virtual machines. Because we handle very sensitive data, we immediately needed to produce encrypted and geographically distributed backups.
After a long search through technical forums for Linux-compatible tools, we tried and tested several solutions and finally settled on HashBackup for our production backup process.
We began in summer 2013 with hb #1070 on a small amount of data, about 50 GB. Over the months and years our infrastructure grew and changed: our VM servers increased more than 40-fold and shifted from "Image-Based" to "LVM-Based", and we adjusted our backup script along the way.
After four years of using HashBackup I can say that it has always satisfied our needs and has also made possible successful point-in-time data recovery.
Currently we back up 1.4 TB of data every day: with compression and deduplication, each incremental backup session produces only 19 GB of output, a reduction ratio of 78:1! Since August 2016 we have backed up 27 TB of data while using less than 900 GB of storage, replicated across separate locations: a local backup server, remote storage, and cloud (Amazon S3 IA).
We are very satisfied. I am happy to share my findings and perhaps save some time for other sysadmins who are still searching for a good backup tool.
Thank you Jim for the great work! Well done!
im3D S.p.A. - Medical Imaging Lab
I work in the Rieke Lab at the University of Washington. We’ve been using HashBackup for over 6 years and it’s been fantastic! We do daily incremental backups of a 6TB RAID array and keep our backup files both locally and on Amazon S3-IA. After 6 years of incrementals, our backup is still only 1.6TB! That includes over 30 user directories and every daily modification made on the files within them.
Just the other day I had a researcher ask me about changes he made to a file sometime in October or November of 2015. No problem. A quick `hb ls` and we had a list of every daily revision of that file since 2010. `hb get` and we had all the revisions between October and November 2015 as separate files in a self-contained directory. That’s awesome and amazingly powerful!
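For readers unfamiliar with the commands, the point-in-time retrieval described above looks roughly like the sketch below. The backup directory, file path, and version number are illustrative (not the lab's real setup), and the version-selection flags shown here are assumptions that should be verified against the HashBackup documentation or `hb help`:

```shell
# Illustrative sketch of a point-in-time retrieval with HashBackup.
# /var/hashbackup and the file path are example names; -a and -r are
# assumed flags -- check `hb help ls` / `hb help get` for the exact syntax.
hb ls -c /var/hashbackup -a /data/users/alice/results.txt    # list saved versions
hb get -c /var/hashbackup -r 1203 /data/users/alice/results.txt  # fetch one version
```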
Thanks for the great work you do on HashBackup. It’s been a pleasure using it over the last 6 years and it’s great to see it get even better with every update.
University of Washington
[In early 2016 Rieke Lab migrated their six-year backup from Amazon Glacier to S3-IA using HashBackup's built-in sync feature. Recently Mark wrote in to say that he restored a researcher's 44GB home directory in 27 minutes, showing that HashBackup has good restore performance even from a backup with 6 years of daily history.]
In the past I've used NetBackup, Bacula, Amanda, and other quite heavyweight solutions, and I've found they all break really easily and are time-consuming to set up. We're trying to ensure we have a complete off-site backup with a disaster recovery plan, so keeping things straightforward and reliable has been a key requirement; ideally, anybody with a little instruction should be able to get the system restored and running again.
I've also steered away from solutions that try to back up the contents of a system's filesystem: we have a mix of Linux, BSD, and Windows, and Windows in particular, with its system files, can be a pain to restore that way. We've been using virtual machines for a while now, and backing up a VM snapshot seemed platform-independent and almost guaranteed to restore. This may make for a bigger backup, but with deduplication and compression it's not such a hit.
For all our VMs we use logical volumes so we can take live snapshots for backup, resize easily, and get better performance, but this rules out a lot of backup software that doesn't treat block devices specially, and we didn't want to copy the LVM content to a file just for backup.
With all that in mind we started looking for solutions that:
1) Could back up block devices directly for the logical volumes
2) Could still back up regular files on our NAS
3) Local backup to NAS for testing and quick recovery
4) Off site backup for disaster recovery
5) Deduplication and compression to minimise uploads off site
6) Easy to drive from script
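Requirements 1 and 6 together can be sketched as a short script under a few assumptions: the volume group, snapshot size, and backup directory below are example names, and the approach relies on HashBackup reading a block device directly when given its path (as requirement 1 describes):

```shell
#!/bin/sh
# Sketch: back up a live VM's logical volume via an LVM snapshot.
# vg0/vm1, the 2G snapshot size, and /var/hashbackup are placeholders.
lvcreate --snapshot --size 2G --name vm1-snap /dev/vg0/vm1  # point-in-time view
hb backup -c /var/hashbackup /dev/vg0/vm1-snap              # back up the block device
lvremove -f /dev/vg0/vm1-snap                               # discard the snapshot
```

Running this from cron gives a consistent nightly image of each VM without pausing it.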
There aren't many backup solutions that fit that bill.
If you start off with HashBackup and then look at the alternatives, you start discounting them pretty quickly; they just lack the same features.
We ended up going with S3 in the end, not because of price but because it's already in use elsewhere in the company.
Good to know about the IA support, though; we'll see how costs go and maybe move to that.
That's one of the great advantages of HB, it's so easy to switch storage providers.
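Destinations in HashBackup are configured in a dest.conf file in the backup directory, which is what makes switching providers a matter of editing a few lines. A hypothetical S3 entry might look something like the fragment below; the keys and bucket name are placeholders, and the exact keywords should be checked against the HashBackup destination documentation:

```
destname offsite-s3
type s3
accesskey AKIA-EXAMPLE
secretkey EXAMPLE-SECRET
bucket example-backup-bucket
```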
HashBackup has been a fantastic tool for our needs! It's a small single binary that requires virtually no dependencies (statically linked) and has an incredible array of well-documented options. As a result, it's a good fit for cloud servers and highly-available distributed systems where ephemeral storage and dynamic instance generation precludes bulky installations and configuration processes. In addition to being script-friendly, security is well thought out, automatic versioning is great for restoring deleted files, and we especially like the mount option which eliminates vendor lock-in concerns. Development is active and Jim has been very responsive to our queries and suggestions. Overall, a great backup solution!
Web Operations Manager, University of Arizona
Just wanted to say well done on the superb HashBackup!
I was pulling my hair out (I'm bald!) attempting to get duplicity/duply playing nicely with b2.
I plugged in HashBackup... "job done".
While building a backup processing service where users can simply drop off their backups in their respective user folders, I was looking for a flexible solution that was easy to use, stable, efficient, and most of all, secure. After looking at all the options, HashBackup came out in the top 3.
Jim has always been very responsive to my (sometimes not so bright) mails, quickly handling any issues or feature requests I may have. I am now proud to say HashBackup is part of an automated backup processing system with stats aggregation!
Thank you, Jim.
I love HB. It's working great for backing up my personal GitLab server. So well in fact that I wanted to backup some files on my Windows 10 laptop!
You may have heard of Bash on Windows - a beta dev feature of Windows 10 that translates all of the Linux syscalls to Windows syscalls natively. I use Backblaze Unlimited on my main PC, but didn't want to pay $5/mo for the 50-100GB of files on my laptop. So using HB via Bash on Windows I'm now backing up my Windows 10 PC to Backblaze B2!
Thank you for your work on HashBackup. It's working great on my server and Windows 10 PC. :)