Macs Make the Move to ARM with Apple Silicon

That’s a rental machine—you have to give it back at the end of the program. So don’t assume that $500 has any connection with the price of the hardware.


Apparently Microsoft will have to change its licensing for Windows 10 on ARM before people can use it for virtualization. We’ll see what happens.


The initial Mac mini was $499 in 2005 (about $674 today), so it’s not that absurd. That said, you’re entirely correct, Adam: the DTK is a rental.

For comparison, a Core i3 mini with 16 GB of RAM and 512 GB of storage is $1,200, or $1,500 if you choose a six-core i7. That makes this rental sound rather expensive, actually, but of course as a dev box that’s beside the point.


This got me thinking about 2FA. If we’ll be able to use App Store apps on our Macs, won’t that at least to some degree defeat the purpose of apps like Google Authenticator?

Will app developers be able to choose not to allow their app to run on Macs at all? I realize they can already choose not to compile a binary for macOS, but that’s not quite the same thing.

The Pentium 4-based PPC-to-Intel dev kit was a $999 rental. That makes this one look inexpensive.

Oh, that’s another iPhone/iPad app I’d love on my Mac. Why does it matter where the time-based one-time code app runs? I log in to apps and sites on my iPhone (and iPad) from the same device all the time (though sometimes I use my watch). The resulting code is the same no matter the platform.

Only if you forget that it was the Intel equivalent of a $3,000 PPC Power Mac. The ratios are similar, though: each rental box costs about a third of the retail price of its shipping Mac counterpart ($999 against $3,000 then, $500 against the $1,500 six-core mini now). Rather consistent of Apple, and I’m sure not a coincidence.

My work requires that two separate devices be used for 2FA: it’s always either an app on my cell phone to authenticate the computer, or a web browser on the computer to authenticate the cell phone (for VPN). The idea is that you require two separate hardware verifications to gain access. If an authenticator app were to run on my Mac, I could no longer use it to generate the codes I use for 2FA on that same Mac. It’s not about the codes being unique; it’s about codes being generated and verified on different devices. The idea, of course, is that while one device could be stolen or misappropriated, there’s much less chance that two separate devices will be compromised simultaneously.

What do you mean by “defeat the purpose of apps like Google Authenticator”? Google Authenticator just generates TOTPs (time-based one-time passwords). If you use a service that relies on TOTP for 2FA, you’ll need something that can generate the codes; that won’t change on an ARM-based Mac. And since a number of iOS and macOS apps already do so (1Password, for example), I don’t see ARM-based Macs affecting this use much.

It doesn’t, at all. The code generated is based on the key you enter when you set it up (either by typing the key or scanning a QR code) and the number of 30-second intervals since midnight on January 1, 1970 (or other values the server and client agree on; those are the defaults). The machine or device the code is generated on doesn’t come into play.
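For the curious, here’s a minimal sketch of that algorithm (TOTP per RFC 6238, using the defaults described above) in Python, standard library only. The Base32 secret shown is just a placeholder example, not a real key:

```python
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """RFC 6238 TOTP: HMAC-SHA1 of the count of 30-second intervals
    since the Unix epoch, keyed with the shared secret."""
    # Decode the shared secret (the value behind the QR code you scan).
    pad = "=" * (-len(secret_b32) % 8)
    key = base64.b32decode(secret_b32.upper() + pad)
    # How many 30-second intervals have elapsed since January 1, 1970.
    counter = struct.pack(">Q", int(time.time()) // interval)
    digest = hmac.new(key, counter, hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): pick 4 bytes, mask the sign bit.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)

# Placeholder secret: any device with this key and an accurate clock
# produces the identical six-digit code.
print(totp("JBSWY3DPEHPK3PXP"))
```

Nothing in there depends on the hardware: only the secret and the clock matter, which is why an iPhone, an Intel Mac, or an ARM-based Mac all produce the same code.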

An important point revealed in the SOTU developer video hours after the keynote announcements: Apple is designing an entire line of ARM-based SoCs specifically targeting each Mac’s unique needs and constraints, namely physical space, power and energy consumption, and cooling.

The A12Z in the Developer Kit (a custom Mac mini) is NOT the production Apple Silicon SoC that will be going into future Macs. In fact, these A12Z-based Developer Kit Mac minis are in limited supply: you have to apply for them and qualify, and you eventually need to return the device to Apple when the program finishes. The A12Z is still a mobile-device processor, and while it performs remarkably well, there will likely be a large difference in the Mac SoC designs. Yes, they will inherit all the same technology, but they will be designed for significantly more energy use and performance, while still delivering massive gains in performance per watt for a laptop/desktop processor.

ALL of the demos seen during the keynote and the SOTU were running on the A12Z, the same SoC as in the 2020 iPad Pro, fitted into a custom Mac mini case. That alone is impressive, but just wait till you see the real deal in the first few Macs. This is a very exciting time.

If you see benchmarks published from this A12Z, take them with a grain of salt, because it’s not what will be shipping in Macs. It’s merely there to give developers something to test with and to familiarize them with the new way of things.


Actually, benchmarking may be difficult at first, since most benchmarks are designed for x86-64. Apple is working with Blender.org directly, and Blender is commonly used to benchmark CPU cores and GPUs.

Yes: Apple said that developers can opt out of having their iPhone and iPad apps included in the Mac App Store. I doubt most will, but it’s possible.

As others have said, it doesn’t in general, though some systems may require it. I use Authy, because Google Authenticator loses all its codes in system upgrades, and Authy Desktop lets me access my codes on the Mac so I don’t have to pull out my iPhone and type them in. It’s still a little fussy, but easier. And I’m not too worried about the security of my Mac, given its physical location.

And then you got a “free” 17-inch Intel iMac at the end of the DTK programme. Mine is still in use elsewhere in the family; I think it’s the longest-lasting Mac I ever got.


Looking at prices and specs, surely it’ll be only a few years before users ask why they can’t plug their USB-C iPhones and iPads into a screen and use them as a low-end Mac mini.

Already exists. Samsung DeX. It probably sucks, and Apple’s implementation will hopefully be better. 🙂

Knowing Apple, they’ll remove the iPhone lightning port to avoid such considerations.

That will definitely be a concern a few years after the last Intel-based Mac ships. These days, Apple’s Xcode is much more dominant for developing Mac software, and through it Apple can at least encourage companies to keep their binaries “fat.” Another point of influence: Apple could require that all Mac App Store applications be universal ARM/Intel binaries; the store could also deliver to each Mac just the code compiled for that hardware.
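For what it’s worth, a universal (“fat”) binary is just a container holding one code slice per architecture, which is what lets the store strip or deliver individual slices. Here’s a minimal Python sketch that reads the fat header of a Mach-O file and lists its slices; the Safari path is only an illustrative example, and the sketch handles only the common 32-bit fat header:

```python
import struct

FAT_MAGIC = 0xCAFEBABE  # big-endian magic of a universal (fat) binary
CPU_NAMES = {0x01000007: "x86_64", 0x0100000C: "arm64"}

def fat_archs(path):
    """Return the architecture slices found in a universal binary."""
    with open(path, "rb") as f:
        magic, nfat = struct.unpack(">II", f.read(8))
        if magic != FAT_MAGIC:
            return []  # thin (single-architecture) binary
        archs = []
        for _ in range(nfat):
            # Each fat_arch entry: cputype, cpusubtype, offset, size, align.
            cputype, _, _, _, _ = struct.unpack(">5I", f.read(20))
            archs.append(CPU_NAMES.get(cputype, hex(cputype)))
        return archs

# Example path; a universal app would report e.g. ['x86_64', 'arm64'].
print(fat_archs("/Applications/Safari.app/Contents/MacOS/Safari"))
```

In practice you’d just run Apple’s `lipo -archs` tool, which reports the same information; the point is that a fat binary carries each architecture side by side, so shipping both slices costs disk space but nothing at run time.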

No, and I doubt they’d try. Their competition will be Intel graphics, which they can probably already beat. For whatever reason, Apple and Nvidia have parted ways, so they’ll need to work with AMD to make Radeon cards work on non-Intel Macs to satisfy the businesses that buy Mac Pros for GPU work.

The problems with Nvidia were very serious, and Apple was dealing with repairs for MacBooks for years. And for some time Nvidia refused to admit their cards were at fault:

https://support.apple.com/en-us/HT203254

In addition to the faulty chips, there were supply chain issues that had a big impact on Apple’s product delivery:

I wonder whether, sometime in the future, Apple will start developing its own graphics cards.

I agree. Nothing so far indicates they’re trying to replace the kind of dedicated desktop-class GPUs that Nvidia or AMD produce.

Considering the competitive advantage Nvidia has displayed for the last couple of GPU generations, it’s high time for Apple to get real here and put petty vendettas behind it. Nvidia offers something nobody else can match, and it’s time Apple worked to make that accessible to professional customers who don’t care about bygones. Restricting yourself to a single GPU supplier is silly; it essentially puts Apple in the same boat with AMD that it just got out of with Intel.

A petty vendetta? Apple gets hit with a very, very major class-action lawsuit because Nvidia refused to own up to the fact that its chips were bricking MacBook Pros? This literally cost Apple many years of ongoing bad publicity:

Apple had to take Nvidia to court to force them to admit the flaws and help pay for repairs. Nvidia had to fork over $148.6 million for repairs that stretched over about 6-7 years:

And Apple missed the announced delivery date for its brand-new 30-inch Cinema Displays by many months because Nvidia couldn’t deliver the chips on schedule. I linked to this in a previous post.

Then Nvidia yanked Apple into a patent lawsuit with Intel, which Apple had no intention of participating in:

Clearly, this is not “a petty vendetta.” It’s been a smart business move.


LOL. 🤣 Good one. The only people who can afford that opinion are those who haven’t had to rely professionally on high-performance frameworks such as CUDA, or those who can afford to wait twice as long for their GPU to return results.