I use Time Machine on a large external drive, nightly backups via SuperDuper to another external drive. I also use SuperDuper to do weekly offsite (detached garage) backups.
My friends’ old iMac failed. When an attempted First Aid repair failed, we reinstalled a clean macOS and went to restore from one of the two backup drives I had helped them set up years ago. But Time Machine wanted a password for the encrypted drive. Uh-oh. My friend tried everything he could think of, and we were starting to think decades of photos were gone… Then he remembered that he had stored the password at the rear of the top shelf of his closet. That is to say, “back up” in the closet. Gotta say that was a huge relief, both for him and for me, because although I had him doing off-site backup rotation, I hadn’t pushed for a broader backup strategy.
P.S. the iMac failed again shortly after so I helped them buy an iMac M4 and restore it from the backup.
My most memorable need for a backup was with a PowerBook G3 Bronze Keyboard. It had drive bays that allowed a backup drive to be inserted. I had ordered a replacement because it was acting a bit flaky. Imagine my surprise when I came into the room and saw smoke coming out of the computer. The replacement G4 did not have the bays, but a friend had a G3, so I loaded the backup drive into his computer, copied it to an external drive, and set up the new computer from the external. Needless to say, I was even more of a confirmed fan of backups.
My backup strategy: I have a two-disk RAID that I use as a Time Machine destination, backing up daily. I have a second Time Machine drive that I plug in weekly and let run. I have a Studio, a 4 TB side disk (the Studio’s internal storage was expensive, and this was the best way to supplement it), and a large four-disk SoftRAID drive, all of which are backed up to Backblaze. I have some SSDs that I plug in weekly and then run CCC to back up the main Studio drive, and another to back up the side drive. I have external drives that store backups of my iTunes folder, and another for movies (some Blu-ray and some DVD).
Then I have an “Archive” drive that holds things I want to save but don’t need accessible. I file things there in different categories. A clone of it lives in my safe deposit box and is refreshed from the main one (which I keep in a firebox at home) every few months. The main one is also backed up to Backblaze (which reminds me every two weeks to plug it in so it stays up to date).
Most of my drives are cataloged with NeoFinder. That folder is backed up to Dropbox. My biggest problem is too much redundancy, but with hard drive prices being what they are, that’s not much of a concern.
The other concern is that I am the only computer-literate person in my house, so I am writing all the information down in a physical book so that my wife or child can find it when/if they need to.
I think that is all, but probably not.
Time Machine to Synology NAS every day
BackBlaze constantly
SuperDuper to Crucial SSD every day
Google, Amazon and Apple Photos when snapped
Google Workspace for business
I use Time Machine for the internal system disk and Backblaze for the external RAID arrays.
I use many of those methods, but have given up on Time Machine. The backups are often difficult to browse, and Time Machine eventually prunes the data I want anyway. I rely on Migration Assistant when transitioning to a new Mac, and recover individual files or folders if needed.
I use Backblaze and iCloud as my two off-site backup solutions.
I used to feel that way about Time Machine on HFS+, but APFS Time Machine is now organized by snapshots. So you can open the Time Machine disk in the Finder, find the snapshot corresponding to the date and time you are curious about, and search through it as you would any Finder directory.
Huh? Time Machine on HFS was also by snapshots, and you could open one and search through it as you would any Finder directory. What is different?
I agree.
I’d say the actual difference is that on HFS+ TM you could mess with the backups which you can no longer do in snapshot-based APFS TM.
This becomes relevant when you realize you have backed up a very large file that you don’t actually want to back up. Think of a large VM that you use for testing something: it changes a little bit every time you use it, so it gets backed up every time TM runs, but it doesn’t need to be backed up.

In the HFS+ TM world, you could just get rid of that file from the TM backup (there was even a right-click option to remove it from all TM backups), followed by adding that one file (or its parent folder) to the exclusion list. That no longer works in the APFS TM world. Since it’s based on snapshots, there’s no way to remove just that one file from the backup; it’s there as long as the snapshot exists and gets backed up.

Exclusion has also changed. Since TM snapshots are made on disk ahead of time, you cannot have TM strip just one file or its parent folder from the snapshots, and hence from the backup. If you want to make sure a file never becomes part of a TM snapshot, it actually needs to live on another APFS volume for which no TM backups are made. (And note that the snapshot lives on your internal disk, using up space there, until the next TM backup is performed.)
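For what it’s worth, exclusions and local snapshots can also be managed from the Terminal with Apple’s `tmutil`. A minimal macOS-only sketch (the `$HOME/VMs` path and the snapshot date are made-up examples):

```shell
# macOS-only: exclude a large VM folder from future Time Machine backups.
# As noted above, this only affects future snapshots; existing APFS
# snapshots still contain the file until they age out.
sudo tmutil addexclusion -p "$HOME/VMs"

# Confirm the exclusion took effect.
tmutil isexcluded "$HOME/VMs"

# List the local APFS snapshots currently stored on the internal disk.
tmutil listlocalsnapshots /

# To reclaim space sooner, you can delete local snapshots by date, e.g.
# for one listed as com.apple.TimeMachine.2025-01-15-101010.local:
# sudo tmutil deletelocalsnapshots 2025-01-15-101010
```

The `-p` flag makes the exclusion a fixed-path one (stored in Time Machine’s preferences) rather than a sticky attribute that travels with the item.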
Not quite. HFS+ doesn’t support snapshots. Instead, Time Machine creates a separate folder for each backup.
In order to avoid massive duplication of data, Time Machine creates large numbers of hard links, so files and directories that are identical in two or more backups all reference the same physical file.
The problem is that this system of hard links has shown itself to be fragile over time, causing many people to lose data. And the backed-up files are not immutable: they can be deleted and edited. Editing a file with multiple links to it will cause it to change in all the backups that share that link. Snapshots, in contrast, are read-only and can’t be changed under any circumstances.
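The hard-link behavior described here is easy to demonstrate in any Unix shell (this sketch uses a scratch directory, not a real Time Machine backup):

```shell
# Sketch of hard-link behavior (works on any Unix, not just HFS+).
# HFS+ Time Machine avoided duplicating unchanged files by giving each
# backup folder a hard link to the same underlying file.
demo=$(mktemp -d)
echo "original contents" > "$demo/backup1_file"
ln "$demo/backup1_file" "$demo/backup2_file"   # hard link, not a copy

# Both names share one inode; note the link count of 2 in the listing.
ls -li "$demo"

# Editing in place through one name changes what the other name sees --
# the fragility described above.
echo "edited in place" > "$demo/backup1_file"
cat "$demo/backup2_file"   # prints "edited in place"
rm -rf "$demo"
```

(Note that many editors save by writing a new file and renaming it, which breaks the link; shell redirection, as here, truncates the original inode in place.)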
But I agree that there should be no problem opening a particular backup’s folder and walking through its contents looking for a file.
My point is that the Finder presents the backups similarly: a set of time-stamped folders. In fact, it was easier to scan across backups with HFS+ Time Machine, because the same structure was visible from the Terminal command prompt and to other applications. The APFS version has to mount each snapshot separately.
I’m guessing not. If the Mac wouldn’t even boot into recovery mode, it wouldn’t have allowed booting from an external drive.
The main reason I let Time Machine do its thing hourly is that then I don’t have to think about it or potentially forget to make a backup—there’s no real difference in the amount of data stored because of how it prunes. Some people say they’ve noticed it working, but particularly with a modern Mac and an SSD, I’ve never noticed any performance issues related to Time Machine running in the background. Same with Backblaze, which backs up constantly.
FWIW, I know when TM is running, but that’s because I’m backing up to a hard drive, and head motion is noisy (especially on the Toshiba NAS drives I’m using). But system performance has never suffered, so the impact is just a little noise.
In fact, that was one of the big reasons I switched to SSD for Time Machine and SuperDuper—I don’t like the hard drive noise.
This is a very good reason to just let Time Machine do its thing every hour. For me, not having to keep my TM disk connected all day is an added benefit of making manual backups once a day. Fortunately, I very rarely forget to make a TM backup before putting my computer to sleep for the day.
My basic strategy is to make four kinds of backups:
- My primary backup is Time Machine, running hourly. It is the most current local backup.
Currently I’m running Time Machine to a partition on a 4 TB SSD. Unfortunately it isn’t large enough to include my VMware Fusion VMs, which are huge.
- My backup for restoring or recovering from a catastrophic machine failure is a SuperDuper! bootable clone: a weekly scheduled clone of all files on the drive, to another partition on the same 4 TB drive. Yeah, I know: I’m putting two backups on the same drive. More about this later.
- But what if my house burned down? I used to use CrashPlan, but when it ended for personal use I switched to Arq, backing up to space in Wasabi. Arq runs hourly, with Time Machine-style retention: hourly for a day, daily for a month, weekly for a year, monthly from then on. I currently have it set to keep the data for 5 years.
The advantage of Arq over other cloud backups is that it is bring-your-own-cloud. Since I’m paying for my own cloud space, Arq doesn’t care what I back up. I can include anything I want. Applications? No problem. Five years of virtual machine VMDKs? It doesn’t care. Arq has no incentive to keep you from using up space.
- But seriously, what if my house burned down? It would be impractical to pull terabytes of data back from the cloud. So I use SuperDuper! to create a pair of data-only clones, one of which is kept offsite. I update the one at home monthly, and then swap it with the offsite one. (I’m backing up to multi-terabyte bare drives, using a drive dock.)
OK, now to the issue of having multiple backups on the same drive, which is a no-no. The solution is that I actually have more than one bootable clone and more than one Time Machine backup. I update the other bootable clone monthly. Normally I’d only connect the second Time Machine drive once a month, but for various reasons one of my Time Machine destinations is on the same drive I boot from, so I have both Time Machine destinations connected all the time. Time Machine alternates between them on each backup.
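If you run multiple Time Machine destinations like this, `tmutil` can show how they are configured. A macOS-only sketch:

```shell
# macOS-only: list the configured Time Machine destinations, e.g. to
# confirm that two are set up for alternation as described above.
tmutil destinationinfo

# Start a backup manually; when more than one destination is mounted,
# Time Machine picks the least recently used one.
# sudo tmutil startbackup --auto
```

The `--auto` flag makes the manual run behave like a scheduled backup rather than a one-off.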
I use more than two different full backups of my machines.
As for methods on physical disks: Carbon Copy Cloner, Time Machine
Secure online storage (zero-knowledge): Tresorit
There are at least two physical disks, which get physically rotated off-site.
Would like to move to ZFS-based storage for its protection against bit rot. Have been hit in the past by hardware failure that resulted in erroneous data being written to disk.
See https://openzfsonosx.org/ also https://openzfs.org/
I just moved from a 27″ Intel iMac to an M4 Mac mini. I had a 2 TB OWC hard drive for Time Machine, but wanting to get up and running fast on that spiffy new machine, I updated my sorta-monthly CCC data backup on a recent 1 TB LaCie SSD and ran Migration Assistant from the LaCie. Goodness that was fast! What fun!
But then . . . I realized I had a large image archive in a separate volume on the OWC drive that I really should copy directly onto the Mac mini (large for me: 100 GB or so). So I happily connected the OWC drive, fired it up, and . . . dead as a doornail. As in really dead. As in behold my new doorstop. Yikes! But then I remembered I had backed up that archive to an old FireWire G-Drive once a month or so. But when I plugged the G-Drive into the mini (via multiple adapters that worked fine on the iMac), nada. No mount; Disk Utility didn’t even see it. So: lift the old iMac from the floor, connect the G-Drive, plug in the SSD, copy everything over, then dump it onto the mini. So simple, right? Be still my heart.
The moral(s) of the story: One, multiple backups are good even if they are not absolutely up to date. Two, that exceptionally sturdy drive is older than you think, and maybe you should think about tossing all those old drives that surely will work when you start them up again after 5 years. Three, multiple backups are good.
My current backup regime is simple: Time Machine to a Samsung T9, CCC to a LaCie SSD every few weeks, Backblaze continuously, and iCloud for iThings.
I’ll probably get another SSD for data backup on occasion, just on general principles.
I would like to point out that restoration from Backblaze backups is not trivial (get a USB drive from them for small $, then spend hours picking out mail, messages, prefs, documents, and god knows what else to rebuild a client’s laptop). Migration Assistant it is not. Thank heavens we used it, but. . . .
Dave
Fantastic discussions! I need to update my earlier post about my strategy. I actually ran into a “quirk” with my SuperDuper! bootable backups on both of my Macs (an M1 Mac mini and an M3 MacBook Air). I am currently running version 15.2 of Sequoia (that was part of the issue in Adam’s initial article about all this).
As I mentioned previously, I use SuperDuper! (SD) to back up each of my Macs to two separate Samsung 1 TB SSDs. The other day, I tried to boot my Mac mini from one of its SD backups but kept getting errors. It just would not boot up! The other one was fine! (The Samsung SSDs are the exact same model: T7 Shield 1 TB, purchased a couple of years ago at close to the same time.) Same issue with my MacBook Air, although I only tried one of them. It kept driving me crazy! I contacted David Nanian of Shirt Pocket Software, and he was as perplexed as I was. He offered some ideas, and one actually worked: re-install macOS 15.2 on the backup. That worked, but it seemed like a half-assed way of doing things.
This morning I tried again. This time, I used specific USB-C ports. On my mini, there are two of them on the back: one right next to the Ethernet port, and the other next to that (i.e., one port away from the Ethernet port). The MacBook Air also has two USB-C ports: one next to the power port, and the other next to that one (again, one port away from the power port). I connected each of my Samsung SSDs to the USB-C port on the mini one port away from the Ethernet port, did a complete SD backup, and it worked! That is, I was able to subsequently boot the mini from each such SD bootable backup. Similarly, on the MacBook Air I connected each SSD to the USB-C port one port away from the power port. Again, success: I did an SD backup to each SSD and was able to subsequently boot the Air from that just-completed backup.
So, it seems there is a port “dependency” going on here. Of course, I do not know whether it is Mac model dependent, macOS dependent, or Samsung SSD dependent (or possibly a combination of two, or all three). In discussions with David, he indicated that neither Apple nor he was aware of anyone else having success like mine, specifically with macOS 15.2 (again, that was part of the “gist” of the original article Adam posted last month on the TidBITS site).
Obviously I am pleased, but it sure was not easy detective work. I asked for this before, so I’ll try again: I’m sure David and Adam would like to know too. I would like to hear from other users of SuperDuper!, especially ones running macOS 15.2.