Macs Make the Move to ARM with Apple Silicon

What do you mean by “defeat the purpose of apps like Google’s Authenticator”? Google Authenticator just generates TOTPs (time-based one-time passwords). If a service uses TOTP for 2FA, you’ll need something that can generate the codes, and that won’t change on an ARM-based Mac. And since a number of iOS and macOS apps already do so (1Password, for example), I don’t see ARM-based Macs affecting this use much.

It doesn’t, at all. The code generated is based on the key you enter when you set it up (either by typing the key or scanning a QR code) and the number of 30-second intervals since midnight UTC on January 1, 1970 (or other values the server and client agree on; those are the defaults). The machine or device the code is generated on doesn’t come into play.
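For anyone curious, here’s roughly what an authenticator app computes, per RFC 6238 (TOTP). This is just a minimal Python sketch; the secret below is a made-up example, and a real app takes its key from the setup code or QR code the service gives you.

```python
# Minimal sketch of what an authenticator app does, per RFC 6238 (TOTP).
# The shared secret is a made-up example; real apps get theirs from the
# base32-encoded setup key / QR code the service provides.
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_base32: str, interval: int = 30, digits: int = 6) -> str:
    key = base64.b32decode(secret_base32, casefold=True)
    # Number of 30-second intervals since midnight UTC, January 1, 1970.
    counter = int(time.time()) // interval
    msg = struct.pack(">Q", counter)
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): pick 4 bytes based on the last nibble.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))  # example secret; prints a 6-digit code
```

Note that nothing in there depends on the CPU architecture, which is why the move to ARM doesn’t matter for this.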

An important point that was revealed in the SOTU developer video hours after the keynote announcements: Apple is designing an entire line of ARM-based SoCs, each specifically targeting a given Mac’s unique needs and constraints: physical space, power/energy consumption, and cooling solutions.

The A12Z in the Developer Transition Kit (a custom Mac mini) is NOT the production Apple Silicon SoC that will be going into future Macs. In fact, these Developer Kit Mac minis with the A12Z are in limited supply: you have to apply and qualify for one, and you eventually have to return the device to Apple when the program finishes. The A12Z is still a mobile processor, and while it performs remarkably well, there will likely be a large difference in the Mac SoC designs. Yes, they will inherit all the same technology, but they will be designed for significantly more energy use and performance, while still delivering massive gains in performance per watt for a laptop/desktop processor.

ALL of the demos seen during the keynote and the SOTU were running on the A12Z, which is the same SoC as in the 2020 iPad Pro but fitted into a custom Mac mini case. That alone is impressive, but just wait until you see the real deal in the first few Macs. This is a very exciting time.

If you see benchmarks published from this A12Z, take them with a grain of salt, because it’s not what will be shipping in Macs. It’s merely there to give developers something to test with while they become familiar with the new way of things.


Actually, benchmarking may be difficult at first. Most of the benchmarks are designed for x86-64. Apple is working directly with Blender.org, and Blender is used in benchmarking tests of CPU cores and GPUs.
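Related to that: once Apple Silicon Macs ship, it will also matter whether a benchmark binary is running natively or being translated by Rosetta 2. Here’s a minimal sketch of how you could check, assuming macOS 11 or later; the sysctl key may not exist on Intel Macs, so the helper just falls back to “not translated” there.

```python
# Minimal sketch: is this process running natively on Apple Silicon, or is it
# an x86_64 binary being translated by Rosetta 2? That distinction matters
# when comparing published benchmark numbers. Assumes macOS.
import platform
import subprocess

def rosetta_translated() -> bool:
    """True if this process is an x86_64 binary translated by Rosetta 2."""
    try:
        out = subprocess.run(
            ["sysctl", "-in", "sysctl.proc_translated"],  # -i: ignore unknown keys
            capture_output=True, text=True, check=True
        ).stdout.strip()
        return out == "1"
    except (subprocess.CalledProcessError, FileNotFoundError):
        return False

print("machine:", platform.machine())           # 'arm64' native, 'x86_64' under Rosetta
print("translated by Rosetta 2:", rosetta_translated())
```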

Yes, since Apple said that developers can opt out of having their iPhone and iPad apps included in the Mac App Store. I doubt most will, but it’s possible.

As others have said, it doesn’t in general, though some systems may require it. I use Authy, because Google Authenticator loses all its codes in system upgrades, and Authy Desktop lets me access my codes on the Mac so I don’t have to pull out my iPhone and type them in. It’s still a little fussy, but easier. And I’m not too worried about the security of my Mac, given its physical location.

And then you got a “free” 17-inch Intel iMac at the end of the DTK programme. Mine is still in use elsewhere in the family; I think it’s the longest-lasting Mac I ever got.

2 posts were split to a new topic: Should 2FA always be on two different devices?

Looking at prices and specs, surely it’ll only be a few years before users ask why they can’t plug their USB-C iPhone or iPad into a screen and use it as a low-end Mac mini.

Already exists: Samsung DeX. It probably sucks, and Apple’s implementation will hopefully be better. 🙂

Knowing Apple, they’ll remove the iPhone lightning port to avoid such considerations.

That will definitely be a concern a few years after the last Intel-based Mac ships. I think these days Apple’s Xcode is much more dominant for developing Mac software, and through it Apple can at least encourage companies to keep their binaries “fat” (universal). Another point of influence: Apple could require that all Mac App Store applications be ARM/Intel universal binaries; the store could also deliver to each Mac just the code compiled for that hardware.
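For anyone who wants to check what they already have, here’s a minimal sketch that asks Apple’s lipo tool which architectures a binary contains. It assumes macOS with the Xcode command-line tools installed, and the Safari path is just an example.

```python
# Minimal sketch: check whether a Mac binary is a universal ("fat") binary by
# asking the lipo tool which architectures it contains.
import subprocess

def binary_architectures(path: str) -> list[str]:
    """Return the CPU architectures in a Mach-O binary, e.g. ['x86_64', 'arm64']."""
    out = subprocess.run(
        ["lipo", "-archs", path],
        capture_output=True, text=True, check=True
    )
    return out.stdout.split()

if __name__ == "__main__":
    # Example path only; point it at any app's executable.
    archs = binary_architectures("/Applications/Safari.app/Contents/MacOS/Safari")
    kind = "Universal binary" if {"x86_64", "arm64"} <= set(archs) else "Single-architecture"
    print(kind, archs)
```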

No, and I doubt they’d try. Their competition will be Intel graphics, which they can probably already beat. For whatever reason, Apple and Nvidia have parted ways, so they’ll need to work with AMD to make Radeon cards work on non-Intel Macs to satisfy the businesses that buy Mac Pros for GPU work.

The problems with Nvidia were very serious, and Apple was dealing with repairs for MacBooks for years. And for some time Nvidia refused to admit their cards were at fault:

https://support.apple.com/en-us/HT203254

In addition to the faulty chips, there were supply chain issues that had a big impact on Apple’s product delivery:

I wonder if sometime in the future Apple will start developing their own graphics cards?

I agree. Nothing so far indicates they’re trying to replace dedicated desktop-class GPUs like those Nvidia or AMD produce.

Considering the competitive advantage Nvidia has displayed for the last couple of GPU generations, it’s high time for Apple to get real here and put petty vendettas behind them. Nvidia offers something nobody else can match, and it’s time Apple worked to make that accessible to their professional customers who don’t care about bygones. Restricting yourself to a single GPU supplier is silly; it essentially puts them in the same boat with AMD that they just got out of with Intel.

A petty vendetta? Apple got hit with a very major class action lawsuit because Nvidia refused to own up to the fact that its chips were bricking MacBook Pros. That cost Apple years of ongoing bad publicity:

Apple had to take Nvidia to court to force them to admit to the flaws and help pay for repairs; Nvidia had to fork over $148.6 million for repairs that stretched over about 6–7 years:

And Apple missed the announced delivery date for its brand-new 30-inch Cinema Displays by many months because Nvidia couldn’t deliver the chips on schedule. I linked to this in a previous post.

Then Nvidia yanked Apple into a patent lawsuit with Intel, which Apple had no intention of participating in:

Clearly, this is not “a petty vendetta.” It’s been a smart business move.


LOL. 🤣 Good one. The only people who can afford that opinion are those who professionally haven’t had to rely on high-performance frameworks such as CUDA. Or those who can afford to wait twice as long for their GPU to return results.

I think anyone who needs CUDA has already gone PC.

The vast majority of Mac graphics/video/animation users are happy with the alternatives; the overall ecosystem is more important to them.


I’m afraid that’s exactly the point, Tommy. Some Apple fans tend to have this myopic view that GPUs are about FCP. Scientific computing has zero interest in FCP, but it has been heavily involved with CUDA. Apple has likely lost around $200k from my department alone just because of their AMD single-sourcing folly. Apple used to care about scientific computing; now they seem to believe they can do well without it. Maybe, maybe not. Regardless, tying yourself to a sole GPU supplier just makes no sense, even if Apple believes the only pros that matter are video studios.

It’s not my field, but I thought the performance advantages of CUDA over OpenCL had been overstated. The remaining problem is that CUDA has a robust ecosystem of software libraries, so you don’t have to write nearly so much yourself.
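To illustrate the library-ecosystem point, here’s a minimal sketch of why moving existing NumPy-style code onto an Nvidia GPU takes so little effort compared with writing OpenCL kernels by hand. It assumes a CUDA-capable GPU and the third-party CuPy package (`pip install cupy`); the matrix size is arbitrary.

```python
# Minimal sketch: CuPy mirrors NumPy's API, so the GPU version of an existing
# computation is often just a different array module, not hand-written kernels.
import numpy as np
import cupy as cp

n = 2048
a = np.random.rand(n, n).astype(np.float32)
b = np.random.rand(n, n).astype(np.float32)

# Same matrix multiply, same call shape; only the array module differs.
c_cpu = np.matmul(a, b)                                # runs on the CPU
c_gpu = cp.matmul(cp.asarray(a), cp.asarray(b))        # runs on the GPU via cuBLAS

# Bring the GPU result back and confirm the two paths agree (float32 tolerance).
print(np.allclose(c_cpu, cp.asnumpy(c_gpu), rtol=1e-3, atol=1e-3))
```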

Or the cloud.


That’s been my experience, Curtis. You could attempt to write OpenCL code yourself, and if you were really interested you might be able to make it perform almost as well.

People in the field I work in are for the most part not coders. We have scientific problems we are attempting to solve, and our interest lies in the results, not the tools to get there. So we tend to use what’s already available, especially if that stuff is really good. If there are highly efficient CUDA libraries around, or if we have ML tools nicely tailored to a small farm of Teslas, that’s what we’ll want to use.

Could it be done in OpenCL or on AMD? Probably. Will people go the extra mile? No. Even before, we did lots of computation on clusters, but there was always a fair amount of ‘local’ work, primarily testing and benchmarking. I’d say about half of my colleagues have now migrated to doing all of that on Linux boxes simply because Apple has been being dicky about GPU support, especially those doing lots of ML.

My corner of UC Berkeley is becoming quite a Google shop these days, so common tools like Word or Keynote have effectively been replaced by Google Docs anyway, and those run in the same Chrome browser on any platform. These people have less and less reason to stay with Macs. Personally, I really like Linux and spend a lot of time using Linux tools, but I would definitely prefer to do it all on my Mac rather than having to run several boxes. That’s probably niche. But of course so are those mixing rap music videos on $40,000 workstations that require $1,000 wheel kits. ;)


I mentioned this once before: my cousin is an astrophysicist who has been working with NASA for decades and is also a professor at a major university. He loves Macs and Apple products as much as I do, and he’s always said how much they love Macs at NASA, and how the Curiosity Mars Rover was basically a G3-era Mac:

A radiologist friend of the family runs most of his practice on Macs and uses this software, which he says is popular around the world:

An optometrist neighbor uses this, and the company makes Mac software for other medical specialties:

A speech pathologist friend who is involved in research always talks about how important iPads and Macs are in the field.

Apple’s ResearchKit and CareKit for iOS and Macs are making headway in medical and healthcare studies, as well as in communication between providers and patients:

There are many scientific fields that depend on visualization, rapid rendering and communication, as well as other fields that do not use CUDA.

Honestly thought CUDA was an old switch/button on the Mac motherboard.
