Apple AI's “wrong path”: Why has it fallen behind in the era of large models?

If you're an iPhone user, you've probably seen plenty of news about AI's explosive progress over the past two years: ChatGPT is all over the headlines, Google's Gemini updates arrive more often than Android updates, Microsoft's Copilot is built right into Office, and Meta has even opened up Llama for the community to "train models en masse."


But when you open your iPhone, Mac, or iPad, where is Apple?


This isn't an illusion: Apple has indeed fallen behind in the era of large AI models. Siri only began its "AI upgrade" in 2024, and the so-called Apple Intelligence is essentially a set of AI-enhanced system features, with no breakthrough products or model architectures in sight.


The issue isn't that Apple lacks funds, talent, or chips; it's that it has trapped itself in a cage that's incompatible with the AI ecosystem.


1. Apple’s “late awakening” to AI is not accidental—it’s inevitable.


Over the past two years, Google has been showing off with Gemini, Microsoft has been making money with Copilot, Meta has been expanding its territory with Llama, and Amazon has been selling AI as a lubricant for AWS. Even Tesla is using its Dojo supercomputer to train models for humanoid robots. And Apple? It's still saying, "We won't look at your data."


It sounds like an elderly user telling the world, "Don't be so aggressive; slow and steady wins the race."


But AI isn't a camera, a chip, or a well-crafted aluminum alloy casing. AI is about data throughput, model capabilities, and the iteration speed of cloud computing power. Apple certainly knows this, but it refuses to abandon its most fundamental principles—user privacy and a closed ecosystem.


It's like entering a Formula One race on a bicycle: you'll never win the straight-line sprint.


2. Apple's "Three Pillars of Faith" in AI: Privacy, On-Device, and a Local Closed Loop


Apple's AI philosophy, put simply, boils down to three principles, illustrated in the sketch after this list:


1. Data stays on the device;


2. Computing power doesn't rely on public clouds;


3. Everything must be closed-loop within the Apple ecosystem.
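
To make the first two principles concrete, here is a minimal Swift sketch of what "data stays on the device" looks like in practice: inference pinned to local silicon via Core ML. The model file ("Classifier.mlmodelc") and the feature names ("text", "label") are hypothetical placeholders, not anything Apple ships; the Core ML calls themselves are the standard on-device API.

```swift
import CoreML
import Foundation

// A minimal sketch of on-device inference: the query is processed by
// local silicon and never leaves the phone. The model file and feature
// names below are hypothetical placeholders.
func classifyLocally(_ text: String) throws -> String? {
    let config = MLModelConfiguration()
    // Restrict inference to the CPU and Neural Engine: no cloud
    // round trip, no server logs, nothing to subpoena.
    config.computeUnits = .cpuAndNeuralEngine

    // Load a compiled Core ML model bundled with the app.
    guard let url = Bundle.main.url(forResource: "Classifier",
                                    withExtension: "mlmodelc") else {
        return nil
    }
    let model = try MLModel(contentsOf: url, configuration: config)

    // Run the prediction entirely on-device.
    let input = try MLDictionaryFeatureProvider(dictionary: ["text": text])
    let output = try model.prediction(from: input)
    return output.featureValue(for: "label")?.stringValue
}
```

The privacy guarantee here is structural rather than contractual, but so is the limitation: the model can only be as capable as whatever fits within a phone's memory and thermal budget.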


Thus, we see that Apple Intelligence ships with a PCC (Private Cloud Compute) system, which Apple claims is safer than conventional cloud AI because user data is never stored, never retained after the request is served, and never accessible, not even to Apple. In short: we do AI without relying on your "dirty data."


The question is, how powerful can AI be without “dirty data”?


Don't forget that GPT-4o processes countless tokens every day; Gemini 2.5 Pro can reason across images, text, tables, and audio; and Meta's Llama 3.3, even though it's only "semi-open source," has already sparked intense competition in the community. Running large models on an iPhone 15 Pro's A17 Pro chip isn't a challenge to Moore's Law; it's a challenge to the market's patience.


3. Why is Apple insisting on taking this difficult path?


At its core, this is Apple's long-standing corporate "aesthetic obsession": everything must be done in-house, everything must be kept confidential, and everything must be seamlessly integrated.


In the past, this approach has proven highly successful for Apple in both software and hardware: custom-designed chips, a closed system, and unified design. Each generation of iPhones sold steadily. But AI is a battlefield that doesn't prioritize “aesthetics”; it's more like a product of hacker culture and cloud-native thinking: iterate first, refine later; launch first, reflect later.


Ben Thompson has aptly pointed out that Apple "lags" in AI because it is still applying a "perfect product" mindset to a technical paradigm of "continuously evolving services."


It's like insisting on writing "elegant code" while the developer next to you has already had Copilot finish the whole thing with a single Ctrl+Enter.


4. Apple isn't without options; its "AI DNA" is just too thin.


Apple actually knows where its weaknesses lie.


So while Apple Intelligence looks self-sufficient on the surface, it actually integrates ChatGPT into the system. When Siri can't answer a question, it politely "calls in ChatGPT" to save the day, without even requiring a login.
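
Architecturally, this is a simple local-first fallback: try the on-device model, and hand the query to an external service only when it can't answer. The sketch below is hypothetical; Apple has not published Siri's actual routing logic, and none of these types are Apple APIs.

```swift
// A hypothetical sketch of the local-first fallback pattern described
// above. Siri's real routing logic is not public; these types are
// illustrative only.
struct AssistantRouter {
    // Local model: returns nil when it can't answer.
    let onDevice: (String) -> String?
    // External model, e.g. a ChatGPT-style cloud call.
    let cloudFallback: (String) async throws -> String

    func answer(_ query: String) async throws -> String {
        // Prefer the private, on-device answer when one exists.
        if let local = onDevice(query) {
            return local
        }
        // Otherwise hand off to the external model: this is exactly
        // the point where control leaves Apple's ecosystem.
        return try await cloudFallback(query)
    }
}
```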


Is this pragmatic? Of course. But strategically, Apple has effectively admitted: “Our model isn't strong enough; we need external support.”


This approach may relieve the pressure in the short term, but in the long run, isn't it ceding its AI sovereignty? After all, once user habits are formed, it becomes difficult to pull people back to your own model.


This is akin to Siri's decline after losing to Alexa and Google Assistant—the once-dominant smart assistant became little more than a voice-activated calendar and weather app.


5. Apple's Dilemma: The Conflict Between Closed Systems and Open AI


Ultimately, most leaders in the AI era have chosen “openness.”


Google opened up the Gemini SDK; Microsoft wove Copilot into GitHub, M365, and even the lower layers of Windows itself; Meta went with "open source + deployment + social platform integration." These are all classic "going out" strategies.


Apple, however, has consistently followed an “inviting in” strategy—if you want to run AI on my platform, you must first pass my review, use my hardware, adhere to my APIs, and comply with my privacy policies.

This “firewall-style” design approach feels out of place in the AI era. In the world of open-source models, no company can define the intelligent experience by “controlling the entry point.”


Ironically, when you use ChatGPT on an iPhone, the real horsepower is in the cloud: OpenAI's models and Azure's compute, not Apple.



6. Conclusion: Apple's AI isn't bad; it just chose the wrong path


Many people say that Apple's AI is lagging behind because it's conservative, slow to react, and because its chip resources are tied up with Vision Pro. However, I tend to agree with another viewpoint: Apple isn't neglecting AI; it just bet on the wrong direction.


What it aims to build is a "personal intelligent assistant that respects user privacy," not an all-knowing, omnipotent "super AI."


The problem, however, is that most users today want an AI that can "finish my PowerPoint deck, cut my vlog, plan my trip, or even start a company from a single command." What they want is not an "adequate" AI but a "the stronger, the better" AI.


While the entire industry is racing toward general intelligence, Apple is slowly refining a “Siri that understands you,” which is inherently an unequal competition.


In the future, Apple must either admit that this path is unworkable and pivot quickly, or build on-device AI capable enough to deliver a genuinely "transformative" user experience. Otherwise, in the AI war, it risks being trapped inside its own "closed loop."
