Originally published at: https://tidbits.com/2019/04/05/what-does-the-t2-chip-mean-for-mac-usage/
Apple’s custom T2 chip brings better security to recent Macs—and we all like security! But the T2 also makes Macs harder to repair or use with non-Apple operating systems, and it can create nightmares for DJs and musicians. So is a T2 Mac right for you?
I’ve received a few questions about the audio testing methodology I used in this article, so here are the “off into the weeds” details.
I do a fair bit of pro audio work (mostly recording music), but I don’t have a T2 Mac myself because the T2 Macs have been fairly infamous in audio circles since the introduction of the iMac Pro. So I tested using other people’s systems as I had access to them: these included iMac Pros, as well as more-recent MacBook Pros and Mac minis, in environments ranging from professional recording and production studios (Seattle and New Orleans) to amateur musicians and podcasters just doing stuff at home.
When Adam assigned the article (I think in January) I thought I would make a couple of template projects in Pro Tools, Logic, Reason, etc., and carry those around on a USB stick so I could quickly fire them up on T2 Macs and test. That turned out not to work so well: audio setups vary too widely, and I almost always ended up making test projects from scratch (sometimes in GarageBand, because most people had it). The T2 problems are also unpredictable enough that it took me a long time to gather what I felt was solid data: I did 42 tests of varying duration (ranging from 30 minutes to over four hours) on 15 machines over a span of about two and a half months. I also spoke to several audio interface makers on background.
For the testing, I borrowed an NTi Digirator DR2. It’s basically a signal generator: totally overkill for what I was doing, but it’s reliable. The idea behind all the tests was the same: record as many channels of sine wave input as I could for as long as possible, then go back through the audio and look for anomalies. Testing used Avid (Pro Tools), UA, MOTU, PreSonus, Focusrite, and Apogee interfaces connected via USB or Thunderbolt, depending on the rig, along with “consumer-level” USB interfaces like a Blue Yeti mic and “guitar” inputs from iRig and Line 6. No plugins or onboard preprocessing: just straight in.
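If you’re curious what “look for anomalies” means in practice, here’s a minimal Swift sketch of one way the check could be automated (the function and its parameters are made up purely for illustration, not a description of the tooling I actually used): a clean sine wave of known frequency can only change so much between adjacent samples, so any larger jump is a dropout or glitch.

```swift
import Foundation

// Hypothetical helper: flag sample indices where the jump between adjacent
// samples is bigger than the steepest step a clean sine wave could produce.
func findGlitches(samples: [Float],
                  sampleRate: Double = 48_000,
                  sineFrequency: Double = 1_000,
                  amplitude: Double = 1.0,
                  margin: Double = 1.5) -> [Int] {
    guard samples.count > 1 else { return [] }
    // Largest step a continuous sine can take between adjacent samples,
    // padded by a safety margin so normal jitter isn't flagged.
    let maxStep = Float(amplitude * 2.0 * sin(Double.pi * sineFrequency / sampleRate) * margin)
    var glitches: [Int] = []
    for i in 1..<samples.count where abs(samples[i] - samples[i - 1]) > maxStep {
        glitches.append(i)
    }
    return glitches
}

// Quick synthetic check: one second of a clean 1 kHz sine at 48 kHz with a
// single sample zeroed out to simulate a dropout.
var test = (0..<48_000).map { Float(sin(2.0 * Double.pi * 1_000.0 * Double($0) / 48_000.0)) }
test[10_000] = 0
print(findGlitches(samples: test))  // flags the samples around index 10,000
```

The threshold math is the whole trick: for a sine of amplitude A and frequency f sampled at rate sr, adjacent samples can never differ by more than 2·A·sin(π·f/sr).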
The “glitch” screenshot shows Logic Pro X 10.4.4 on a 10-core iMac Pro running macOS 10.14.3 on a rig with Thunderbolt UA Apollo interfaces (two x16s and an x8p, I think) recording at 24-bit/48 kHz (nothing stressful). It’s recording eight tracks of sine wave input from the Digirator; I did the signal bussing for the Digirator on the studio’s physical console (old school!), so the UA/iMac was just blindly recording eight tracks of audio, as un-fancy as I could make it. I got to run that test for about 90 minutes; there’s another glitch later in the same session. I picked that particular glitch because it happened just four minutes into the test (so I knew I could find it again fast) and because it was on eight tracks simultaneously.
Now, if Adam would finally approve article pitches on unity gain, avoiding single-coil hum from LCD displays and other noise sources, how to change acoustic guitar strings using just hose-clamp pliers…I could do that all in-house.
Man, that hose-clamp pliers tip sounds hot.
Just to further confuse the issue, there’s an article on AppleInsider today that concludes Macs with T2 chips can encode video much faster.
I like their testing methodology. I’m not a video person, but I can see that helping out consumers and hobbyist video people who (I’m just making this up) use iMovie or Adobe Premiere Elements(?) or similar things. (Do people actually do that? I know no one who uses these programs, but I’m obviously more audio-focused.) I imagine most pros/semi-pros would prefer to render with “real” graphics processors. I have no idea how widespread third-party support for Video Toolbox might be, or the degree to which Video Toolbox supports third-party graphics hardware.
For what HandBrake is doing, converting video files from one format to another, the GPU is not important. VideoToolbox provides software access to dedicated hardware that exists to encode or decode specific video formats, like H.265. These chips are common; everybody’s smartphone has them, because without them watching or creating video would kill the battery, if it were possible at all. iPhones, GoPros, etc. can shoot 4K video because of chips that do nothing but encode video to the format they support. A discrete video card could have such chips on its board in addition to its main GPU, and I would expect them to be available through VideoToolbox; since VideoToolbox is part of macOS’s core graphics libraries, it would be weird to make a video card and drivers for Macs that didn’t use it.
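To make that concrete, here’s a rough, untested Swift sketch (my own illustration, not anything from HandBrake or the AppleInsider tests) of how an app asks VideoToolbox for an HEVC compression session and insists on a hardware encoder; if the machine has no dedicated H.265 encode block, the call fails and the app has to fall back to a much slower CPU encode.

```swift
import VideoToolbox

// Ask VideoToolbox for an HEVC (H.265) encoder and require that it be
// hardware-backed. There's also a softer "Enable" variant of this key that
// means "use hardware if available, otherwise fall back to software."
let spec: [CFString: Any] = [
    kVTVideoEncoderSpecification_RequireHardwareAcceleratedVideoEncoder: true
]

var session: VTCompressionSession?
let status = VTCompressionSessionCreate(
    allocator: kCFAllocatorDefault,
    width: 3840,                 // 4K UHD frame size
    height: 2160,
    codecType: kCMVideoCodecType_HEVC,
    encoderSpecification: spec as CFDictionary,
    imageBufferAttributes: nil,
    compressedDataAllocator: nil,
    outputCallback: nil,         // frames would be submitted later with an output handler
    refcon: nil,
    compressionSessionOut: &session
)

if status == noErr, session != nil {
    print("Hardware HEVC encoder available")
} else {
    print("No hardware HEVC encoder (status \(status)); CPU fallback needed")
}
```

That “fails, so fall back to the CPU” path is why the presence or absence of a dedicated encode block matters so much more than raw GPU horsepower for this particular job.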
In the AppleInsider testing, the 4K iMac they used has a “real” discrete video card, a Radeon Pro 555X, and it didn’t help. They have another graph showing that encode times didn’t change when an eGPU was attached to a 2018 Mac mini or MacBook Pro.
Good GPUs matter earlier in the production process: rendering 3D graphics, applying filters, doing color correction, and so on. When it comes to encoding the final product, a good GPU only helps if those effects need to be re-applied before the actual encode (e.g., when editing using proxy files). In the absence of dedicated hardware for encoding the final video format, encoding is handled by the computer’s CPU.
I think that’s quite in line with what we see in the AppleInsider benchmarks linked above. The 4K iMac had a dedicated GPU and hardly did any better than the Core i3 mini with only Intel onboard graphics and the T2 deactivated.
The real game changer, according to those benchmarks, appears to have been enabling the mini’s T2. With the T2 coming from the iOS world, where Apple has had the greatest interest in integrated encoding/decoding circuitry, this is perhaps not a big surprise.
What I do wonder about is where the iMac’s modest gains over the mini with the T2 disabled came from. The AppleInsider testers explicitly checked to make sure it was unrelated to the disk. Both Macs use the same i3 and memory bus. The difference in transcoding time was on the order of 20%, which is more than I’d expect to chalk up to measurement accuracy/repeatability. Is the GPU contributing in some other fashion?
Thanks for posting this, Curtis: makes sense to me and a video editor I’m in a Messages chat with (although he doesn’t have a T2 Mac).
There is another downside to the T2 chip, especially if you use Carbon Copy Cloner for bootable backups. I have 2 TB HDDs that I use for making clones, and only clones. I have to format them as APFS, not HFS+, so that I can later run FileVault and encrypt the HDDs (of course, after allowing booting from external media in the Startup Security Utility). Performance is slow for magnetic media drives running APFS, but there appears to be no other choice.
That’s a great point, although so long as Recovery Mode is still accessible, startup security can be disabled via the Startup Security Utility. Not all drive failures leave Recovery Mode accessible, though.
As much as I love the idea of bootable backups, I’m going to run slightly counter to the usual TidBITS mindset and claim they’re something of a red herring. In my experience, the most important part of “bootable backup” is “backup”; making bootable backups on a Mac has been an exclusively nerdy activity for many years. A great many external storage devices (particularly USB hard drives) can’t be used to start up Macs anyway, so I’ve essentially stopped trying to convince the backup-averse that they need to make bootable backups, in the simple hope that they’ll get some form of storage and perform some sort of regular backup, bootable or not. Mostly, they don’t.
Bootable backups, in my case made with SuperDuper, have saved my bacon on more than one occasion.
I have to agree with both you and Geoff. Bootable backups are kind of “nerdy” things, and getting someone to have any kind of backup is the first priority (and often a seemingly insurmountable task). That being said, though, they’ve saved my bacon a number of times as well, and I would hate to be without (at least) one.
Totally. It is no longer 2010. Making bootable backups is going against Apple’s design and development directions. More so with each OS release.
For when it can work and I need it, I have a flash drive that can boot multiple versions of the OS. APFS has made it so I now use two of them.
Yep. I still personally do bootable backups as part of my backup strategy, but I can count on the toes of one hand the number of other people I’ve convinced to make that leap. I have, however, gotten a few to be pretty rigorous about using Time Machine and whatnot.
I agree bootable backups are great, and I really wish they were easier to make and use. But they’re not; they’ve been difficult for years, and they’re getting more difficult all the time.
Just curious, is something going on that is making this harder now or will make it harder in the future? I have been doing the same thing for at least 15 years, which is to open SuperDuper, click the backup, and let it do its thing. Every so often I boot from one of them to make sure it’s working. I only just updated to Mojave and haven’t made a bootable backup of the new OS yet. Is that going to cause issues for SuperDuper?
What’s changed is that Macs with the T2 chip will not boot from external media by default. As Geoff mentioned, it can be enabled by booting from the Recovery partition and changing a setting in the Startup Security Utility. If you don’t make that change in advance of needing it and the Mac has experienced a total drive failure, there’s still Internet Recovery, but I’ve never checked to see whether it includes all the utilities, including Startup Security Utility.
Without starting a deep discussion about the why of it, Apple SEEMS to feel that booting one machine from an image made on another machine is not something that is desirable going forward.
In my pure speculation, I can see a time when you might have a perfect clone of a boot disk, but it will only be able to boot the system with the serial number it was cloned from (if even that). For those of us who deal with management of systems, image copying is a fail, and we have been told that for years. So most of us in such jobs have just moved on from bootable imaging. Those that haven’t are having a hard time keeping their old imaging/cloning processes working. A very hard time.
For those who have a system that works now, great. Will it work with the next release of macOS or the next computer they buy? Maybe? But at some point, I expect not.
Like jaclay, I have been using SuperDuper for some time, and it is very simple to use. Recently I accidentally erased the contents of my iBooks library. I tried to do a recovery using Time Machine, but it refused to do it for me. I was able to go into my SuperDuper backup and recover the missing data quite easily. It will be disappointing if this is not possible in the future.
There’s no concern that bootable duplicates will go away; what people are talking about is how they’re a bit more of a stretch for less-experienced users to make, and how Apple discourages booting one Mac from another Mac’s duplicate (it’s related to the fact that firmware updates won’t get installed unless the installer is used).
I suspect you are underestimating Apple’s plans.
But this is speculation on my part.