Customers

Happy customers are the best reward!


I would like to thank you for a great product.

It is rare indeed for me to be moved to pen such an accolade, but after trying so many other options - Arq, Restic, Duplicati, to name but a few - I was so delighted to see the flexibility, speed, and simplicity of hb.

The 'staged' backup is so cool. Do all the heavy lifting of compression, deduplication and archiving locally, then send the archives to any remote destination for security. Other products took two passes to do this with separate target settings for each run.

All this with a tiny memory and CPU footprint. Compared with others, especially Restic, it is amazing.

Thanks again.
Paul


On my server, a silent hardware failure was causing corruption when writing archives to disk or over the network. HashBackup’s incremental selftest feature was a tripwire that helped detect the problem.

When the faulty hardware was replaced, HashBackup was able to repair all my backups across a variety of sites. Only the revisions created during the time of the fault had to be updated; years of prior backup history remained intact. Plus, the 'hb compare' functionality gave me confidence that all was well.

Perhaps most impressive, however, was the help you offered when problems arose. This was a weird problem, not at all the fault of your software, and I’m amazed both that it was recoverable and that you were so responsive.

Thank you again for both the extremely reliable software as well as all your help!

Alex


Thanks for your recent work on making 'get' smarter.

I had previously been backing up only a smaller set of more important files with HashBackup - Nextcloud for the family, emails, and other self-hosted stuff.  This was because if a disk in my pool were to fail, I would need to do a full restore of everything, not only the missing files.  I have another couple of TB of less important media backed up using Duplicacy, which supports recovering only missing/changed files.

I’m now able to add the less important stuff to the HashBackup backup as well, knowing that should a single disk fail, I would only need to download the files that were on the failed disk.  I find HB has much better deduplication and is generally much more efficient and faster to use. Whereas Duplicacy backups tend to balloon over time, HB’s selective retention and archive repacking keep things a lot cleaner.  Being able to stagger thorough selftests over a long period of time to ensure everything is safely backed up is also great.

Thanks again for all the hard work you’ve put into HB over the years.  It does what it does really well.

All the best,
Ben


Your software is truly amazing, thank you for your awesome work! A disaster recovery story from today: every device in my company - servers, office computers, flash drives, external hard drives - is backed up using HashBackup to a central NAS server, which in turn is backed up daily to offsite backup storage. Every device except one, that is: an external drive with the most important (and secret) documents. Guess what?  Someone accidentally overwrote the first 8GB of that drive with an image meant for a Raspberry Pi SD card. The drive wasn’t normally backed up, but it had been left plugged in overnight several times, so it got caught in a full-system backup. More luck than anything else, but again - thanks to HB it took me about 10 minutes plus download time to find that it had in fact been plugged in 3 weeks ago and to recover it!

Thank you (again) for creating such awesome software, and for your time. HashBackup really is the best of all available solutions, even the very expensive ones.

Bartosz G


HashBackup is my go-to solution for backups. I have seldom come across products that are as stable and as easy to use as HB. Having backed up my data daily for over a year now, I have had no problems backing up or recovering data, allowing me to rest easy knowing my data is safe.

HB is extremely well engineered: I have frequently had "wow, that is amazing" moments when using it. For example, looking at the data usage from my sync of a 190GB repository, I was surprised to see that only the data actually in use (~40GB) was downloaded, because ranged downloads are supported out of the box - no need to set up or configure anything. The feature set is huge and covers most everything a backup tool needs.

While that alone would make an excellent product, the customer support from Jim tops it off, with very short response times and detailed, high-quality explanations and tips. If you are looking for a backup solution, I can say with confidence you have found it. Thanks Jim for creating this great tool!

Leo Tietz


HashBackup is fantastic software and the support you offer is unbeatable.  Today it saved me from a major problem by letting me restore a copy of a critical program.  Thank you very much for your work and effort.

Jose Boluda


Thank you for HashBackup! I use it both professionally and now also personally, because it’s simply great - fast and efficient - and I’ve never had any issues.

Gabriel


I just discovered your great program the other day. I’m really enjoying using it to back up my servers. For me, this was the perfect convergence of all the items I needed - incremental backups, offsite integration, and easy access to the backups. (I love the fact that it can be mounted! This was the kicker for me; I provide the mount to my users via an FTP server, which is easy to use and a well-known interface. Yours is one of only a few tools that allow easy FUSE mounting of the archive.) Thanks very much for the great product!

Daniel Beckmann
Nagata Acoustics


Having tried many apps to back up Linux servers, including both open source and commercial solutions, I always had issues or hit a stumbling block that meant they wouldn’t be viable.  I finally found HashBackup and it all changed. I felt in control of my data and the backups. The implementation was flawless and I felt empowered by the way it backed up the data. I felt comforted that the integrity of the data was being checked every day. The flexibility of restoring files as well as versions is a delight, and so easy. I now have well over 2TB of data stored in Backblaze B2 thanks to HashBackup, and every day it seamlessly does its job. The documentation and examples make the whole process so simple.  I recently contacted Jim for some advice. Not only were his responses really quick, but he made a real effort to answer my questions thoroughly.  Thanks Jim for a great product!

Sy Doveton, UK


We’re a small ISP based in Germany. We’ve been using BackupPC with rsync-over-ssh for many years to back up our "standard" clients. But rsync has its limits - especially in the case of one particular customer with an ever-growing collection of millions of small text files on their filesystem. We tried many solutions, but HashBackup was the only software that could handle this heavy scenario. We’ve been using it for this customer since 2014 and have never had an issue with it, nor with restoring. For this task, it’s the best-performing software out there. Whenever we had a question (and we have had lots of them), HashBackup’s customer support was stellar and even changed code for us. Just awesome! The best and most reliable backup software that I know of.

Markus, Germany


Thanks for the fast reply and the insight into a fantastic piece of software. We’re really happy we chose HashBackup as our backup software. You’re doing a top job.

I increased the workers from the default to 8 and the sync completed in less than a day. The funny thing is, I tested this scenario on a testing server with a much smaller data set, and even when I increased the workers I didn’t see any significant improvement. I guess I didn’t wait long enough, or there might have been other outside factors hiding the effect.

Anyway, thanks for your help.

Best Regards,
Johan Abbors
Senior Software Engineer, Walkbase


On Feb 11, 2016 we exchanged a couple of emails about HB, and at that time you warned me not to rely on Time Machine, as you’d had your own experience with missing files.

I was cautious and used TM and HB in parallel, and when I upgraded from Mavericks to El Capitan, I had to undo the upgrade using TM’s recovery feature. It worked nicely. I then installed El Capitan again and eventually solved my issues. (It turned out the recovery was not needed.) The reason to use TM is the ability to recover from the backup disk.

About a month later I was wondering why I had deleted some directories. [Yes, I know that you know what’s coming next.] When I went looking for a file I knew for certain existed but could not find it on my Mac any more, I used HB to recover it. That was when I looked up when the file went away: exactly during the time I did the upgrade. Finally, I made a full diff, using the mounted HB backup volume, between the 2,000,000 files in my home directory before and after recovery. I found 80,000 (sic!) files missing. No pattern to be found - sometimes whole directories, sometimes only files within directories. Disaster. Man, was I pissed.

I wrote a shell script based on the diffs in the directory trees to copy the missing files from the HB backup directory to my home directory. I also filed a nasty complaint with Apple.

A success story for HB, though. It was a lot of work - one day to get the diffs, write the script, and recover the files. Doing this made me wish for an HB feature you would probably also be interested in: comparing file states between two backup versions. From what I understand, at the moment this can only be done between the file system being backed up and the backup.

Just some experiences I wanted to share. Thanks for making HB, it saved my life!

Chris

[ My own experience with missing Time Machine files that Chris mentions was when I upgraded my MacBook Pro hard drive to an SSD.  TM ran hourly, but right before the upgrade, I ran it again "just to be sure".  Swapped in the SSD, booted from the external TM drive, and did a recovery to restore the old hard drive data to the empty SSD.  Booted the SSD and everything seemed to be fine.  But later that day, my version control system started giving weird errors I had never seen before about missing files.  I still had the old hard drive, checked it, and sure enough they were there.  Hmm... why didn’t TM restore them?  Looking at the TM backup disk, these files weren’t there.  I used rsync to compare my old drive with the TM backup, and found there were many files that were either missing, or older versions were restored, scattered here and there, with no pattern that I could see.  These weren’t files excluded from the TM backup: file abc.i was backed up, yet def.i was out of date (an old version).  Luckily I also had an HB backup.  I used hb compare to see which files were missing or wrong, confirming the rsync compare.  Then I used hb get to restore the whole drive on top of the TM restore to recover the missing files and correct the others.  My version control problem was fixed, and hb compare confirmed that the SSD now matched the backup.  I was lucky that my version control system complained, or I may not have noticed for months that the TM restore was incorrect.  I haven’t posted this previously because it happened on a Snow Leopard system and I wasn’t sure it was still relevant.  I guess it is.  -Jim ]


In my company we work on cancer screening and prevention; our IT infrastructure is mainly based on Xen virtual machines.  Handling very sensitive data, we immediately needed to produce encrypted and geographically distributed backups.

After a long search through many technical forums for Linux-compatible tools, we tried and tested different solutions and finally started to run our production backup process with HashBackup.

We began in summer 2013 with hb #1070 on a small amount of data, about 50GB.  Through the months and years our infrastructure grew and changed: our VM servers increased more than 40-fold, changing their nature from "image-based" to "LVM-based", and we adjusted our backup script over time.

After four years of using HashBackup, I can say that it has always satisfied our needs and has made successful point-in-time data recovery possible.

Currently we back up 1.4 TB of data every day: with compression and deduplication we get only 19 GB of output data for each incremental backup session - a reduction ratio of 78:1!  Since August 2016 we have backed up 27 TB of data, occupying less than 900GB of storage, replicated across separate locations: a local backup server, remote storage, and the cloud (Amazon S3-IA).

We are very satisfied. I am happy to share my findings and maybe save some time for other sysadmins who are still searching for a good backup tool.

Thank you Jim for the great work! Well done!

Emanuele Viale
im3D S.p.A. - Medical Imaging Lab


I work in the Rieke Lab at the University of Washington. We’ve been using HashBackup for over 6 years and it’s been fantastic! We do daily incremental backups of a 6TB RAID array and keep our backup files both locally and on Amazon S3-IA. After 6 years of incrementals, our backup is still only 1.6TB! That includes over 30 user directories and every daily modification made on the files within them.

Just the other day I had a researcher ask me about changes he made to a file sometime in October or November of 2015. No problem. A quick hb ls and we had a list of every daily revision of that file since 2010. hb get and we had all the revisions between October and November 2015 as separate files in a self-contained directory. That’s awesome and amazingly powerful!

Thanks for the great work you do on HashBackup. It’s been a pleasure using it over the last 6 years and it’s great to see it get even better with every update.

Mark Cafaro
Rieke Lab
University of Washington

[ In early 2016 Rieke Lab migrated their 6 year backup from Amazon Glacier to S3-IA using HashBackup’s built-in sync feature.  Recently Mark wrote in to say that he restored a researcher’s 44GB home directory in 27 minutes, showing that HashBackup has good restore performance even from a backup with 6 years of daily history. ]


In the past I’ve used NetBackup, Bacula, Amanda, and other quite heavyweight solutions, and I’ve found they all break really easily and are time-consuming to set up. We’re trying to ensure we’ve got a complete off-site backup with a disaster recovery plan, so keeping things straightforward and reliable has been one requirement; ideally, anybody with a little instruction should be able to get the system restored and running again.

I’ve also steered away from solutions that try to back up the contents of a system’s filesystem; we have a mix of Linux, BSD, and Windows, and Windows in particular, with its system files, can be a pain to restore that way. We’ve been using virtual machines for a while now, and it seemed that taking a backup of a VM snapshot would be platform-independent and almost guaranteed to restore. This might make for a bigger backup, but with deduplication and compression it’s not such a hit.

With all the VMs we use logical volumes so we can get live snapshots for backup, resize easily, and get better performance, but this rules out a lot of backup software that doesn’t treat block devices specially, and we didn’t want to copy the LVM content to a file for backup.

With all that in mind we started looking for solutions that:
1) Could back up block devices directly for the logical volumes
2) Could still back up regular files on our NAS
3) Local backup to NAS for testing and quick recovery
4) Off site backup for disaster recovery
5) Deduplication and compression to minimise uploads off site
6) Easy to drive from script

There aren’t many backup solutions that fit that bill.

If you start off with HashBackup and then look at the alternatives, you start discounting them pretty soon; they just lack the same features.

We ended up going for S3 in the end, not because of price but because it’s already in use elsewhere in the company. Good to know about the IA support, though; we’ll see how costs go and maybe move to that. That’s one of the great advantages of HB: it’s so easy to switch storage providers.

Cheers, Max
Aria Networks


HashBackup has been a fantastic tool for our needs!  It’s a small single binary that requires virtually no dependencies (statically linked) and has an incredible array of well-documented options.  As a result, it’s a good fit for cloud servers and highly-available distributed systems, where ephemeral storage and dynamic instance generation preclude bulky installations and configuration processes.  In addition to being script-friendly, security is well thought out, automatic versioning is great for restoring deleted files, and we especially like the mount option, which eliminates vendor lock-in concerns.  Development is active and Jim has been very responsive to our queries and suggestions.  Overall, a great backup solution!

Ben Emmons
Web Operations Manager, University of Arizona


I run a web hosting business, 123host.com.au.  Backups have always been such a hassle and so expensive - add to that not having real control.  While necessary, they are tedious to set up and maintain.  Then I discovered HashBackup and Backblaze B2 - the level of control over backups is just fantastic.  Fortunately I don’t have to restore often, but when I do, it’s no problem at all.  Thanks.

Steve


Just wanted to say well done on the superb HashBackup!

I was pulling my hair out (I’m bald!) attempting to get duplicity/duply to play nicely with B2. I plugged in HashBackup - "job done".

Thanks again.
Cheers
Pete


While building a backup processing service where users can just drop off their backups in their respective user folders, I was looking for a lenient solution that was easy to use, stable, efficient, and most of all... secure. After looking at all the options, HashBackup came out in the top 3.

Jim has always been very responsive to my (sometimes not so bright) emails, quickly fixing any issues or feature requests I may have.  I am now proud to say HashBackup is part of an automated backup processing system with stats aggregation!

Thank you, Jim.

Niels Hofmans


I love HB. It’s working great for backing up my personal GitLab server. So well, in fact, that I wanted to back up some files on my Windows 10 laptop!

You may have heard of Bash on Windows - a beta dev feature of Windows 10 that translates all of the Linux syscalls to Windows syscalls natively.  I use Backblaze Unlimited on my main PC, but didn’t want to pay $5/mo for the 50-100GB of files on my laptop.  So, using HB via Bash on Windows, I’m now backing up my Windows 10 laptop to Backblaze B2!

Thank you for your work on HashBackup. It’s working great on my server and Windows 10 PC. :)

Jacob Hands