Customers

Happy customers are the best reward!



Hi Jim,

On Feb 11, 2016 [about 15 months ago] we exchanged a couple of emails about HB, and at that time you warned me not to rely on Time Machine, as you had your own experience with missing files.

I was cautious and used TM and HB in parallel, and when I upgraded from Mavericks to El Capitan, I had to undo the upgrade using TM's recovery feature. Worked nicely. I then installed El Cap and eventually solved my issues. (The recovery was not needed, it turned out.) The reason to use TM is the ability to recover from the backup disk.

For about a month I had been wondering why I had deleted some directories. [Yes, I know that you know what's coming next.] When I wanted to go to a file that I *surely* knew existed but could not find on my Mac any more, I used HB to recover it. That was the point when I looked up when the file went away: exactly during the time I did the upgrade. Finally, I made a full diff, using the mounted HB backup volume, between the 2,000,000 files in my home directory prior to and after the recovery. Found 80,000 (sic!) files missing. No pattern to be found. Sometimes whole directories, sometimes only files within directories. Disaster. Man, was I pissed.

I wrote a shell script based on the diffs in the directory trees to copy the missing files from the HB backup directory to my home. I also filed a nasty complaint with Apple.
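
[For readers who want to attempt the same kind of recovery, here is a minimal sketch of the diff-and-copy approach Chris describes. It is not his actual script: the paths are placeholders, and it assumes the HB backup has been mounted read-only (for example with hb mount) at /mnt/hbbackup.]

    # Hypothetical locations: backup mounted at /mnt/hbbackup, live home at /Users/chris
    BACKUP=/mnt/hbbackup/Users/chris
    LIVE=/Users/chris

    # List the files present in the backup and in the live home directory
    ( cd "$BACKUP" && find . -type f ) | sort > /tmp/backup.list
    ( cd "$LIVE"   && find . -type f ) | sort > /tmp/live.list

    # Files that exist in the backup but not on disk -- the missing ones
    comm -23 /tmp/backup.list /tmp/live.list > /tmp/missing.list

    # Copy the missing files back, preserving the directory structure
    ( cd "$BACKUP" && tar -cf - -T /tmp/missing.list ) | ( cd "$LIVE" && tar -xpf - )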

A success story for HB, though. A lot of work: one day to get the diffs, write the script, and recover the files. Doing this made me wish for an HB feature you would probably also be interested in: comparing file states between two backup versions. From what I understand, at the moment this can only be done between the file system being backed up and the backup.

Just some experiences I wanted to share... Thanks for making HB, it saved my life!

Chris

[My own experience with missing Time Machine files that Chris mentions was when I upgraded my MacBook Pro hard drive to an SSD.  TM ran hourly, but right before the upgrade, I ran it again "just to be sure".  Swapped in the hard drive, booted from the external TM drive, and did a recovery to restore the hard drive to the empty SSD.  Booted the SSD and everything seemed to be fine.  But later that day, my version control system started giving weird errors I had never seen before about missing files.  I still had the old hard drive, checked it, and sure enough they were there.  Hmm... why didn't TM restore them?  Looking at the TM backup disk, these files weren't there.  I used rsync to compare my old drive with the TM backup, and found there were *many* files that were either missing, or older versions were restored, scattered here and there, with no pattern that I could see.  These weren't files excluded from the TM backup: file abc.i was backed up, yet def.i was out of date (an old version).  Luckily I also had an HB backup.  I used hb compare to see which files were missing or wrong, confirming the rsync compare.  Then I used hb get to restore the whole drive on top of the TM restore to recover the missing files and correct the others.  My version control problem was fixed, and hb compare confirmed that the SSD now matched the backup.  I was lucky that my version control system complained, or I may not have noticed for months that the TM restore was incorrect.  I haven't posted this previously because it happened on a Snow Leopard system and I wasn't sure it was still relevant.  I guess it is.  -Jim]
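
[If you want to run the same kind of check yourself, this is a rough outline of the comparison and restore, not the exact commands I used. The paths are placeholders and the hb options are from memory, so check the documentation before relying on it.]

    # Dry-run rsync with checksums: lists files that differ between the old
    # drive and the Time Machine backup without copying anything
    rsync -avnc /Volumes/OldDrive/ /Volumes/TMBackup/Latest/ > /tmp/tm-diffs.txt

    # Compare the live filesystem against the HashBackup backup directory,
    # then restore missing or changed files on top of what is already there
    hb compare -c /path/to/backupdir /Users
    hb get -c /path/to/backupdir /Users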
   


In my company we work on cancer screening and prevention, and our IT infrastructure is mainly based on Xen virtual machines.  Handling very sensitive data, we immediately needed to produce encrypted and geographically distributed backups.

After a long search through many technical forums for Linux-compatible tools, we tried and tested different solutions and finally started running our production backup process with HashBackup.

We began in summer 2013 with hb #1070 on a small amount of data, about 50 GB.  Over the months and years our infrastructure grew and changed, our VM servers increased over 40 times and changed in nature from "Image-Based" to "LVM-Based", and we adjusted our backup script over time.

After four years of using HashBackup I can say that it has always satisfied our needs and has also made successful point-in-time data recovery possible.

Currently we back up 1.4 TB of data every day: with compression and deduplication we get only 19 GB of output data for each incremental backup session, a reduction ratio of 78:1!  Since August 2016 we have backed up 27 TB of data, occupying less than 900 GB of storage and replicated to separate locations: a local backup server, remote storage, and cloud (Amazon S3 IA).

We are very satisfied. I am happy to share my findings and maybe help save some time for other sysadmins who are still searching for a good backup tool.

Thank you Jim for the great work! Well done!

Emanuele Viale
im3D S.p.A. - Medical Imaging Lab



I work in the Rieke Lab at the University of Washington. We’ve been using HashBackup for over 6 years and it’s been fantastic! We do daily incremental backups of a 6TB RAID array and keep our backup files both locally and on Amazon S3-IA. After 6 years of incrementals, our backup is still only 1.6TB! That includes over 30 user directories and every daily modification made on the files within them.

Just the other day I had a researcher ask me about changes he made to a file sometime in October or November of 2015. No problem. A quick `hb ls` and we had a list of every daily revision of that file since 2010. `hb get` and we had all the revisions between October and November 2015 as separate files in a self-contained directory. That’s awesome and amazingly powerful!

Thanks for the great work you do on HashBackup. It’s been a pleasure using it over the last 6 years and it’s great to see it get even better with every update.

Mark Cafaro
Rieke Lab
University of Washington

[In early 2016 Rieke Lab migrated their 6 year backup from Amazon Glacier to S3-IA using HashBackup's built-in sync feature.  Recently Mark wrote in to say that he restored a researcher's 44GB home directory in 27 minutes, showing that HashBackup has good restore performance even from a backup with 6 years of daily history.]



Hi Jim,

In the past I've used NetBackup, Bacula, Amanda, and other quite heavyweight solutions, and I've found they all break really easily and are time-consuming to set up. We're trying to ensure we've got a complete off-site backup with a disaster recovery plan, so keeping things straightforward and reliable has been one requirement; ideally anybody with a little instruction should be able to get the system restored and running again.

I've also steered away from solutions that try to back up the contents of a system's filesystem: we have a mix of Linux, BSD, and Windows, and Windows in particular, with its system files, can be a pain to restore that way. We've been using virtual machines for a while now, and it seemed that taking a backup of a VM snapshot would be platform independent and almost guaranteed to restore. This might make for a bigger backup, but with deduplication and compression it's not such a hit.

For all the VMs we use logical volumes so we can take live snapshots for backup, resize easily, and get better performance, but this rules out a lot of backup software that doesn't treat block devices specially, and we didn't want to copy the LVM contents to a file just to back them up.
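
[A minimal sketch of that kind of snapshot-then-backup step. The volume names and backup directory are placeholders rather than Aria's actual configuration, and it assumes hb is pointed at the snapshot's block device path directly.]

    # Take a live LVM snapshot of the VM's logical volume (placeholder names)
    lvcreate --snapshot --size 5G --name vm1-snap /dev/vg0/vm1

    # Back up the snapshot block device with HashBackup, then drop the snapshot
    hb backup -c /backups/vm1 /dev/vg0/vm1-snap
    lvremove -f /dev/vg0/vm1-snap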

With all that in mind we started looking for solutions that:
1) Could back up block devices directly for the logical volumes
2) Could still back up regular files on our NAS
3) Local backup to NAS for testing and quick recovery
4) Off site backup for disaster recovery
5) Deduplication and compression to minimise uploads off site
6) Easy to drive from script

There aren't many backup solutions that fit that bill.

If you start off with HashBackup and then look at the alternatives, you start discounting them pretty soon; they just lack the same features.

We ended up going for S3 in the end, not because of price but because it's already in use elsewhere in the company.
Good to know about the IA support, though; we'll see how costs go and maybe move to that.
That's one of the great advantages of HB: it's so easy to switch storage providers.

Cheers, Max
Aria Networks



HashBackup has been a fantastic tool for our needs!  It's a small single binary that requires virtually no dependencies (statically linked) and has an incredible array of well-documented options.  As a result, it's a good fit for cloud servers and highly-available distributed systems where ephemeral storage and dynamic instance generation preclude bulky installations and configuration processes.  In addition to being script-friendly, it has well-thought-out security and automatic versioning that is great for restoring deleted files, and we especially like the mount option, which eliminates vendor lock-in concerns.  Development is active and Jim has been very responsive to our queries and suggestions.  Overall, a great backup solution!

Ben Emmons
Web Operations Manager, University of Arizona



Hi -

Just wanted to say well done on the superb HashBackup! 
I was pulling my hair out (I'm bald!) attempting to get duplicity/duply to play nicely with B2.
I plugged in HashBackup... "job done".

Thanks again.

cheers
Pete




While building a backup processing service where users can just drop off their backups in their respective user folders, I was looking for a lenient solution that was easy to use, stable, efficient, and most of all... secure. After I looked at all the options, HashBackup came out in the top 3.

Jim has always been very responsive to my (sometimes not so bright) emails, quickly fixing any issues and handling any feature requests I may have.  I am now proud to say HashBackup is part of an automated backup processing system with stats aggregation!

Thank you, Jim.

Niels Hofmans



I love HB. It's working great for backing up my personal GitLab server. So well, in fact, that I wanted to back up some files on my Windows 10 laptop!

You may have heard of Bash on Windows - a beta dev feature of Windows 10 that translates all of the Linux syscalls to Windows syscalls natively.  I use Backblaze Unlimited on my main PC, but didn't want to pay $5/mo for the 50-100GB of files on my laptop.  So using HB via Bash on Windows I'm now backing up my Windows 10 PC to Backblaze B2!

Thank you for your work on HashBackup. It's working great on my server and Windows 10 PC. :)

Jacob Hands