Apple Turns to Google’s Gemini to Power Siri and Apple Intelligence

Originally published at: Apple Turns to Google’s Gemini to Power Siri and Apple Intelligence - TidBITS

In a remarkably understated joint announcement, Apple and Google said:

Apple and Google have entered into a multi-year collaboration under which the next generation of Apple Foundation Models will be based on Google’s Gemini models and cloud technology. These models will help power future Apple Intelligence features, including a more personalized Siri coming this year.

After careful evaluation, Apple determined that Google’s AI technology provides the most capable foundation for Apple Foundation Models and is excited about the innovative new experiences it will unlock for Apple users. Apple Intelligence will continue to run on Apple devices and Private Cloud Compute, while maintaining Apple’s industry-leading privacy standards.

Foundation models are the core AI systems on which other features are built. It’s important to realize that Gemini is a family of large language, image, and video models, plus companion models, each optimized for different tasks and scenarios. Even within Google products, the models users interact with vary widely: those in the Gemini Web interface and app are significantly more capable than those that power the AI Overviews at the top of Google Search. The AI Overviews feature likely uses a fast, lightweight Gemini variant optimized for speed, cost, and summarization, which would help explain why its answers are often much worse than those from the Gemini app.

Apple is likely downplaying this announcement because it’s embarrassing that it has so far failed to deliver AI models that match the quality and capabilities of those from Google, OpenAI, and Anthropic in real-world use. At least now Apple is moving forward with a technology stack that may enable the company to deliver a less stupid, more personalized Siri and other Apple Intelligence features in updates to iOS 26, iPadOS 26, and macOS 26 Tahoe. It remains to be seen how these improvements will extend to devices with less processing power, such as the Apple Watch, HomePod, and Apple TV, since routing queries through Private Cloud Compute would introduce noticeable lag.

Why Google?

Although Apple Intelligence offers bolted-on connectivity to OpenAI’s ChatGPT, Apple’s Craig Federighi told TechCrunch at the 2024 launch, “We’re looking forward to doing integrations with other models, including Google Gemini, for instance, in the future.”

It’s not surprising that Apple ultimately chose Google for a deeper partnership. While OpenAI and Anthropic models had long been considered the best, Google’s late 2025 release of the well-received Gemini 3 demonstrated the search giant could also be fully competitive in AI. The Verge even recently published an article titled “Gemini is winning.”

Compared with OpenAI and Anthropic, Google is a more stable corporate partner for Apple, in part because of the long-standing deal that makes Google Search the default in Safari. Google is also a profitable company with multiple businesses, while OpenAI and Anthropic continue to burn billions in venture capital. Mark Gurman of Bloomberg suggested Apple would pay roughly $1 billion for access to Gemini—accounting aside, it’s a pittance compared to the $20 billion Google pays Apple for preferential search positioning.

However, the deal isn’t exclusive. M.G. Siegler wrote that Apple isn’t replacing the current ChatGPT partnership. That’s sensible; even if Siri’s ChatGPT integration is nowhere near as satisfying as using ChatGPT directly, users react poorly to chatbots changing their personalities and capabilities. And Apple undoubtedly wants to keep its options open in case things change again.

I certainly hope Apple never even considered Meta’s Llama, trained in part on the cesspool of social media, or xAI’s Grok deepfake porn generator. Not that Apple or Google get a bye here, since they allow Grok to remain in the App Store and Google Play even though it has been widely documented as being used to generate guideline-violating content.

Despite this deal, I’m sure Apple will continue developing its own foundation models. When the iPhone launched in 2007, its built-in Maps app relied on mapping data and technology from Google, which lasted until iOS 6 in 2012, when Apple launched Apple Maps with its own data. That may have been too early—the launch was rocky, to say the least (see “Examining Maps in the Wake of Tim Cook’s Apology,” 28 September 2012), but now Apple Maps is comparable or even preferable to Google Maps. Apple hates being beholden to other companies and will undoubtedly be looking to switch back to its own technology in the future.

Looking at the Ollama site (where you can download and try lots of different open source LLMs), I see the following, with respect to Google models:

  • Gemma. Comes in two sizes. The Gemma family of models is developed by Google and is “inspired by Gemini models.”
  • Gemma 2. Comes in three sizes.
  • Gemma 3. Comes in five sizes, including a small one suitable for running on mainstream computers.
  • Gemma 3n. Comes in two sizes. Optimized for efficient execution on everyday devices.
  • FunctionGemma. A lightweight model designed for creating custom function-calling models (e.g., to use from within Python scripts).
  • CodeGemma. Optimized for coding tasks (code completion, code generation, etc.).
  • EmbeddingGemma. Based on Gemma 3. Generates text embeddings and is designed to run on small devices.
  • ShieldGemma. Safety and content moderation models based on Gemma 2. Doesn’t answer queries, but returns whether input and output text should be permitted or blocked.

I think it is safe to assume there are at least as many Gemini model variants that Google has not released as open source.
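For anyone curious about how these open Gemma variants actually behave, here is a minimal sketch of trying one locally, assuming Ollama is installed and its server is running. The tag `gemma3:4b` is one of the published sizes; substitute a smaller or larger one to suit your hardware.

```shell
# Download a small Gemma 3 variant from the Ollama model library
ollama pull gemma3:4b

# Ask it a one-off question from the command line
ollama run gemma3:4b "In one sentence, how does Gemma relate to Gemini?"

# Confirm which models are now installed locally
ollama list
```

Invoking `ollama run gemma3:4b` without a prompt instead drops you into an interactive chat session, which is a quick way to get a feel for a model’s capabilities.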

One has to wonder how these models will end up being implemented across Apple’s operating systems.

  1. Will different LLMs be used on different tasks or apps?
  2. Will Apple pick the LLM(s) for the user, or will the user be able to choose? If the latter, how easy will deciding between LLMs be for average users?
  3. Will there be paid add-ons or upgrades?
  4. Will different non-Google LLMs still be available in the OS?

…oh, and the final one:
Will Siri now work properly every time to do basic tasks like turning things on/off without giving you a different answer or simply ignoring you?! 🙂


I can also see why Apple didn’t partner further with OpenAI, given that Sam Altman is collaborating with ex-head of Apple Design Jony Ive on their future ‘thing’ that may end up competing with an Apple product. So I suspect Apple is avoiding a potential conflict of interest down the line.

Also, Musk is too unpredictable to be trusted, while Twitter/X remains about as toxic as possible, so who would want anything to do with Grok?

Apple was able to launch Apple Maps at least in part by using OpenStreetMap content without attribution; I remember the grumbling!

My main concern would be privacy. Google only pays lip service to privacy. Siri, as dumb as it is, respects user privacy. I thought Apple was looking for on-chip intelligence. I’d really rather not open my Apple devices to a Google invasion. Maybe I’m missing something here.

This was my first thought. I have made a conscious effort to transition as much of my life as possible away from Google. I absolutely do not want them hooked into my phone, all of my content for intelligent summarizing, and my Siri interactions.

I am not as sanguine as some people regarding Apple’s use of our personally identifiable data currently and in the future. Its adoption of Google’s Gemini only confirms my suspicions. I’ve also never used Siri on any of my Apple devices.

OK, now I’ll have to disable Siri. Google’s only reason for existing is to be a data harvester for monetization. Now their AI will have an open door into iPhones, iPads, HomePods, Apple TVs, Apple Watches, etc.

I was unaware that Anthropic’s finances were shaky. I’m disappointed to read this, as I’ve been working very happily with Claude on code writing and many other diverse subject areas. I discounted ChatGPT for coding at an early stage when it made some disastrous errors while refactoring.

Another issue is whether we little people will be able to turn this off or delete it from Apple devices!

Apple is planning to run the Gemini models on user devices, and on Private Cloud Compute when the cloud is needed, just as Apple Intelligence now runs using Apple’s current models. Google will not be running or seeing anything from the model use. As a reminder, when a user’s Private Cloud Compute session ends, the servers involved retain no private data and are reset to a default configuration.

@ddmiller, if you’re replying to my comment just above, I’m both too cynical to believe that ‘what goes on on your device stays on your device’ and too old-fashioned to want or need anything AI. I would rather have more storage room available on the device. After over 40 years of computing, I know how to do things myself on a computer or device.

Apple can do what they want, of course; it would just be more courteous of them to acknowledge that quite a few users don’t want this stuff, and, if installing and using it becomes required, to either describe some workaround to disable it for a couple of years or name a date/version after which we will be forced to let the computer do what Apple wants rather than what we want, so we can go our own ways. They express all this touchy-feely ‘concern for users’ in PR, but where the rubber meets the road, it’s about what’s best for Apple.

If you’re replying to some other comment, sorry about inserting my stupid opinion, I’ll delete it if so.

Apple is not sending your requests to Google servers.

Apple has licensed some of Google’s Gemini models to run on Apple’s servers (and maybe some of the smaller ones on your phone as well).

Google will have no access to the data passed to Apple’s server, just like they have no access to the data you might pass to a Gemma model you run on your own computer.

So if this data is used for some unethical purpose, it will be Apple doing it, not Google.

If you believe Apple is lying about this, then you shouldn’t trust anything else they say.

If you use any Apple (or Google or Microsoft or any other closed-source) product, then you are running millions of lines of code that nobody outside of the corporation has ever seen. Any of it could be spyware.

But are you willing to restrict yourself to running only open source software that has been audited by trusted organizations? I know of some such people, but I think most of us wouldn’t be happy doing that.

No, I wasn’t. I was replying to all of the people who said that they don’t trust Google not to data harvest.

But to reply to you, from the start of Siri, it’s an option you can turn off. From the start of Apple Intelligence, it’s a non-default feature that you can turn off. It sounds like you want to keep them both off.

I rarely use Siri, almost never on my phone. The only things I use it for are:

  1. To set timers on my watch when my hands are gloved or cannot manipulate the watch. Siri works well for this! I also sometimes end workouts on my watch when I am wearing gloves, or it is raining and I have turned on water-lock on the watch.
  2. To turn on or off lights, again, only when my hands cannot use my phone to do so otherwise. For example, I have a basement light connected to HomeKit and I sometimes exit the basement and then re-enter from the bulkhead outside, and find that my wife has turned off the light. The light switch is at the top of the stairs, so I can use Siri (and wait several seconds most of the time) to turn it back on again.
  3. Again, this is rare, as I don’t often do it, but occasionally I use Siri to dictate messages or replies while I am driving. Siri seems to do this well with CarPlay.

Otherwise I am one of those who hates speaking commands to my phone (or any other device). If I need to find something out (who is this actor in this movie? Where have I seen him before?), I just use an app or a search on my device the “old-fashioned” way.

And I do have Apple Intelligence enabled on my phone and iPad, but I hardly ever use them.

As an aside, I asked web-based Gemini to find a postage stamp based on my description of the design. It returned a photograph on Alamy that was related to my description but didn’t match it completely. I turned on “thinking mode” and told it that its reply was not a postage stamp. It returned a painting on Etsy. I’m not sure what Apple has bought. (ChatGPT also had problems and never completely got it right, but at least it knew what a postage stamp is.)

Different models, undoubtedly. Whether that means different models within the Gemini family or Apple going elsewhere, there’s no way to know.

Apart from very high-profile things where users might want to hand off a chat to ChatGPT or Gemini online for search capabilities, just as Visual Intelligence and Siri do now, I doubt that users will be able to pick specific models. It’s a choice that most people would have no idea how to make usefully.

Probably, but I doubt Apple will make any of this clear.

Remember, the announcement was for Gemini to power the next generation of Apple Foundation Models, so in some respects, it’s all still Apple.

Oh, I hope! Of course, there might be entirely new failure modes too.

And just to reiterate what @ddmiller and @Shamino said, this is Google technology that Apple will be incorporating into Apple Foundation Models, not a live connection to Google servers. There are no new privacy-related concerns here based on what has been said.

I wonder if this development means we’ll get an option to use Apple Intelligence with only Apple’s in-house licensed Gemini, without ever invoking ChatGPT no matter what we ask Siri.

Thanks for the responses, Adam.

Ah, I think I understand better now…
Apple is going to pick the ‘best of LLMs’ per feature and/or app (most from Gemini under this new deal, maybe others from elsewhere, depending on how things pan out), then run them on its own servers (or servers it controls) under its own underlying Apple Foundation Models system, meaning it will have control over the privacy aspects.

Sounds interesting. We’ll all have to see what we eventually get.

Thanks, @Shamino. Apple is too savvy with language to be caught lying. They know how to shade language with expertise so it sounds like they are doing something else. Still, among Big Tech I would rate them the most trustworthy.

I don’t knowingly use Google products and have nothing from Microsoft installed on my primary Mac. I’m no tech wiz myself, so I can’t know for sure; surely ‘they’ are tracking and analyzing my device usage, but I try to find a balance between usability and protection.

If I knew of a way to operate as you describe in your last paragraph, I’d consider it. Maybe you could connect me with such people so I could see what that realm is like; maybe it’s a good fit, maybe not, but I wouldn’t know where to start even if I wanted to.