Macs Make the Move to ARM with Apple Silicon

I think it is bizarre that Apple should promise new Intel-based models for release later this year when it has already effectively declared that they are built on an architecture it is leaving behind. I feel for people who spend a large amount of money on a machine that is obsolete before it even goes on sale.

It will be interesting to see how long Intel support lasts. On the switch away from PowerPC, developers fairly quickly started to deliver “Intel only” applications, which meant that they didn’t need to support two versions. So it isn’t just a question of how long Apple keeps OS support for the Intel architecture.
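For anyone who wants to keep an eye on this themselves, one way to tell whether an app is still being built for your architecture is to look at which slices its binary contains. Below is a rough Python sketch of reading the Mach-O/universal (“fat”) header to list the architectures; it is not Apple tooling (`lipo -archs` or `file` will do the same job), and the app path in the usage comment is just a placeholder example.

```python
# Rough sketch: list the CPU architectures contained in a Mac binary by
# reading its Mach-O headers directly. Equivalent to `lipo -archs`.
import struct
import sys

CPU_TYPES = {
    0x00000007: "i386",
    0x01000007: "x86_64",
    0x0000000C: "arm",
    0x0100000C: "arm64",
    0x00000012: "ppc",
    0x01000012: "ppc64",
}

def architectures(path: str) -> list[str]:
    with open(path, "rb") as f:
        data = f.read(4096)
    magic = struct.unpack(">I", data[:4])[0]
    if magic in (0xCAFEBABE, 0xCAFEBABF):           # universal (fat) binary
        nfat = struct.unpack(">I", data[4:8])[0]
        entry = 20 if magic == 0xCAFEBABE else 32   # fat_arch vs. fat_arch_64
        archs = []
        for i in range(nfat):
            cputype = struct.unpack(">I", data[8 + i * entry: 12 + i * entry])[0]
            archs.append(CPU_TYPES.get(cputype, hex(cputype)))
        return archs
    # Thin (single-architecture) Mach-O: stored little-endian on current Macs.
    magic_le = struct.unpack("<I", data[:4])[0]
    if magic_le in (0xFEEDFACE, 0xFEEDFACF):
        cputype = struct.unpack("<I", data[4:8])[0]
        return [CPU_TYPES.get(cputype, hex(cputype))]
    raise ValueError("not a Mach-O binary")

if __name__ == "__main__":
    # Example (placeholder path):
    #   python3 archs.py /Applications/Example.app/Contents/MacOS/Example
    print(architectures(sys.argv[1]))
```

An app that reports only `x86_64` is the modern equivalent of those “Intel only” PowerPC-era apps; a universal build will list both `x86_64` and `arm64`.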

My other worry is that, while Apple may well feel it can produce faster silicon than Intel right now, AMD and Intel push each other forward. In five years’ time, will Apple still be faster? And on the graphics front, can Apple beat both Nvidia and AMD?

All that said, it will be interesting to see Apple’s pricing on the new machines relative to performance. Will they just swallow the Intel mark-up and keep prices at the same level (I suspect they will)? Will they be able to keep pushing the high-end processors forward on the Apple Silicon architecture when that is a very small market?

I feel that macOS laptops may increasingly struggle to differentiate themselves from the iPad Pro, given the new Magic Keyboard cases. And to what extent will developers make the extra effort to produce Catalyst versions of their apps, which are still only semi-native on macOS, when they can just tell ARM Mac owners to run the iPad app on their desktop?

I get the excitement over the move, but there are still a lot of unanswered questions.


I share that sentiment even though I have no worries about the Intel MBP I just purchased — it will easily last me long enough to await the 2nd gen AS MBP.

Apple could have simply spec-bumped the Mac mini and iMacs around the same time it did so for the 13" MBP. The new Mac Pro is recent enough that a refresh could have waited until well into 2021 (assuming AS at the Mac Pro level would be ready by then). That way, the only new designs shipping after the transition announcement would have been based on AS.

It will be interesting to see how many Intel iMacs and Mac minis they actually sell now that the cat is out of the bag. Apple of course hides individual sales figures, but there are always industry estimates. If Intel sales do indeed tank, we’ll hear about it. The silver lining is that such a slump would certainly light a fire under Apple. Now that they’ve bragged to the entire world about how great their silicon is and how slow Intel’s progress has been, they have plenty to deliver on.


One wrinkle that surprised me a bit is the hardware spec of the box that developers can “borrow” for $500 for a few months. It fits in a Mini case, but there are differences besides the processor. The elephant in THAT room is the expansion ports. The analogous currently shipping Mini has 4 TB3 ports and 2 USB-A (USB 3) ports. The “preview” box has 2 USB-C ports and NO Thunderbolt 3 ports. I don’t get that.

TB support is delivered through Intel’s chipset. Most likely AS support is just not quite ready yet. Keep in mind Axx CPUs previously interfaced to USB-C (iPad Pro) but never before to TB. I have little doubt it will be there by commercial release. Also, USB4 is around the corner and that is effectively a rebranded TB3 so maybe AS Macs will simply ship with that. The connector is USB-C and it is backwards compatible.


A very good point on USB4. Apple will need to replace more than just processors!

I’m also fully in agreement on avoiding the first versions of new Apple kit. I remember all too well the fate of the Mac Pro 1,1, which (with its 32-bit EFI) turned out not to be fully 64-bit and was therefore dropped like a stone by Apple well before its successors. The upside of having invested in Intel kit is that it might tide us through until any initial bumps are ironed out. And we do still have the ability to run Boot Camp and virtualised versions of Windows, which it seems may not be possible on the ARM machines (or, if possible at all, may be more akin to the old Virtual PC in performance terms than to what Parallels offers on an Intel machine).

That’s because it’s not a preview of anything. It’s a development kit designed to get app developers up and running as soon as possible. Just like the developer transition kit Apple loaned out in 2005 to support the transition from PowerPC to Intel.

The 2005 DTK was nothing like any Mac that ever shipped. It was, for all intents and purposes, a PC motherboard bolted into a Power Mac case, running a hacked-up version of the operating system that couldn’t run on anything else.

The kit Apple is loaning out today is almost certainly the same deal. Something that can host the pre-release software development/test environment, but otherwise completely unrelated to anything that will ship commercially.

Intel supports TB in the CPU only for the latest (10th generation) processors. For everybody else, the interface chips are PCIe devices.

While Apple might integrate TB3 into the Axxx SoC package, it is equally likely that they will ship a SoC with some number of PCIe lanes to drive on-board devices (like Thunderbolt ports, NVMe storage, etc.) On the Mac Pro, they will also connect to expansion slots.

In addition to simplifying the SoC’s design somewhat, it also allows them to leverage existing motherboard designs which use PCIe lanes for all kinds of on-board devices.


I absolutely agree with that. The point I was trying to make is that while Apple had already interfaced Axx CPUs to USB-C, they have not so far done that with TB. And as I already said, I don’t doubt they’ll get it ready by release of the first AS Macs. For now, there’s really no rush because the DTK didn’t need it for its mission.

Ugh. If Apple wants to do both developers and users a favor, why not FREEZE macOS for Intel processors where it is? Get all the bugs out and let developers catch up, then stop changing the OS every six months! Developers won’t have to maintain two versions, and a lot of users will be happy to keep using the apps that are working well on their Intel machines (without worrying about what happens next). When it’s really time to upgrade, they can get a new Mac with Apple silicon, or jump ship.


Ok, now let’s see if Apple will release a similar Mac mini with 16 GB RAM and a 512 GB SSD for $500, like they are doing for the developers. 🙂

That’s a rental machine—you have to give it back at the end of the program. So don’t assume that $500 has any connection with the price of the hardware.


Apparently Microsoft will have to change its licensing for Windows 10 for ARM for people to be able to use it for virtualization. We’ll see what happens.


The initial Mac mini was $499 in 2005 (about $674 in today’s dollars), so it’s not that absurd. That said, you’re entirely correct, Adam. The DTK is a rental.

For comparison, a Core i3 mini with 16 GB/512 GB is $1,200, or $1,500 if you choose the six-core i7. That makes this rental sound rather expensive, actually, but of course as a dev box that’s beside the point.


This got me thinking about 2FA. If we’ll be able to use App Store apps on our Macs, won’t that at least to some degree defeat the purpose of apps like Google’s Authenticator?

Will app developers be able to choose to not allow their app to run on Macs at all? I realize they can already select not to compile a binary for macOS, but that’s not quite the same thing.

The Pentium 4-based PowerPC-to-Intel dev kit was a $999 rental. That makes this one look inexpensive.

Oh, that’s another iPhone/iPad app I’d love on my Mac. Why does it matter where the time-based one-time code app is run? I log in to apps and sites on my iPhone (and iPad) from the same device all the time (though sometimes I use my watch). The resulting code is the same no matter the platform.

Only if you forget that it came as the Intel equivalent of a $3,000 PPC Power Mac. The ratios are similar, though: each rental box costs about a third of the retail price of its shipping Mac counterpart. Rather consistent of Apple, and I’m sure not a coincidence.
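A quick sanity check on that “about a third” claim, using only the prices quoted in this thread ($999 DTK rental vs. a roughly $3,000 PPC Power Mac in 2005; $500 DTK rental vs. the $1,500 six-core i7 mini configuration mentioned above):

```python
# Rental price as a fraction of the corresponding retail Mac,
# using figures quoted earlier in this thread.
print(999 / 3000)   # ≈ 0.33 (2005 PowerPC-to-Intel DTK)
print(500 / 1500)   # ≈ 0.33 (2020 Apple Silicon DTK)
```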

My work requires that two separate devices be used for 2FA. It’s always either a cell phone app authenticating for the computer, or a web browser on the computer authenticating for the cell phone (VPN). The idea is that you need two separate pieces of hardware to gain access. If an authenticator app were to run on my Mac, I could no longer use it to generate codes for 2FA on that same Mac. It’s not about the codes being unique; it’s about the codes being generated and verified on different devices. The idea, of course, is that while one device could be stolen or misappropriated, there’s much less chance of that happening to two separate devices simultaneously.

What do you mean by “defeat the purpose of apps like Google’s Authenticator”? Google Authenticator just generates TOTPs (time-based one-time passwords). If you use something that relies on TOTP for 2FA, you’ll need something that can generate the codes, and that won’t change on an ARM-based Mac. And as there are a number of iOS and macOS apps that already do so (1Password, for example), I don’t see ARM-based Macs affecting this use much.

It doesn’t, at all. The code generated is based on the key you enter when you set it up (either by typing the key or scanning a QR code) and the number of 30-second intervals since midnight on January 1, 1970 (or whatever other values the server and client agree on; those are just the defaults). The machine or device the code is generated on doesn’t come into play.
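For anyone curious what “the key plus the number of 30-second intervals” looks like in practice, here is a minimal sketch of the standard TOTP algorithm (RFC 6238) in Python, using only the standard library. The Base32 secret in the example is made up, not a real key.

```python
# Minimal TOTP (RFC 6238) sketch: HMAC-SHA1 over the 30-second counter,
# then dynamic truncation to a 6-digit code. Standard library only.
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_base32: str, digits: int = 6, period: int = 30) -> str:
    """Generate a time-based one-time password from a Base32 secret."""
    key = base64.b32decode(secret_base32, casefold=True)
    # Number of 30-second intervals since the Unix epoch (1970-01-01).
    counter = int(time.time()) // period
    # HMAC-SHA1 over the counter, packed as an 8-byte big-endian integer.
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): pick 4 bytes at an offset given by the
    # low nibble of the last byte, mask the sign bit, keep the low digits.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

if __name__ == "__main__":
    # Same secret + same clock => same code, on any device or platform.
    print(totp("JBSWY3DPEHPK3PXP"))  # example secret, not a real one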

An important point that was revealed in the SOTU developer video hours after the keynote announcements: Apple is designing an entire line of ARM-based SoCs specifically targeted at each Mac’s unique needs and constraints, namely physical space, power and energy consumption, and cooling solutions.

The A12Z in the Developer Kit (a custom Mac mini) is NOT the production Apple Silicon SoC that will be going into future Macs. In fact, these Developer Kit Mac minis with the A12Z SoC are in limited supply: you have to apply and qualify for them, and you eventually need to return the device to Apple when the program finishes. The A12Z is still a mobile-device processor, and while it performs remarkably well, there will likely be a large difference in the Mac SoC designs. Yes, they will inherit all the same technology, but they will be designed for significantly more energy use and performance, while still delivering massive gains in performance per watt for a laptop/desktop processor.

ALL of the demos seen during the keynote and the SOTU were running on the A12Z, which is the same SoC as in the 2020 iPad Pro, just fitted into a custom Mac mini case. That alone is impressive, but just wait till you see the real deal in the first few Macs. This is a very exciting time.

If you see benchmarks published from this A12Z, take them with a grain of salt, because it’s not what will be shipping in Macs. It’s merely meant to give developers something to test with so they can become familiar with the new way of things.


Actually, benchmarking may be difficult at first. Most benchmarks are designed for x86-64. Apple is working directly with Blender.org, and Blender is used in benchmark tests of CPU cores and GPUs.

Yes, since Apple said that developers can opt out of having their iPhone and iPad apps included in the Mac App Store. I doubt most will, but it’s possible.

As others have said, it doesn’t in general, though some systems may require it. I use Authy, because Google Authenticator loses all its codes in system upgrades, and Authy Desktop lets me access my codes on the Mac so I don’t have to pull out my iPhone and type them in. It’s still a little fussy, but easier. And I’m not too worried about the security of my Mac, given its physical location.

And then you got a “free” 17-inch Intel iMac at the end of the DTK programme. Mine is still in use elsewhere in the family; I think it’s the longest-lasting Mac I ever got.