After years of stagnation, Apple is finally giving Siri the brain transplant it desperately needs. At WWDC 2026, the company is expected to unveil the most significant transformation in the voice assistant’s 15-year history.
For years, Siri has been the punchline of tech industry jokes—a once-revolutionary assistant that fell behind while competitors like ChatGPT, Google Gemini, and Claude raced ahead. But according to extensive reporting from Bloomberg’s Mark Gurman and
corroborated by multiple industry sources, Apple is preparing to flip the script at WWDC 2026 with a complete Siri overhaul that fundamentally reimagines how users interact with their devices.
The new Siri—internally codenamed “Campo”—represents Apple’s answer to the generative AI revolution. It’s not merely an incremental update but a strategic repositioning of Siri as the centerpiece of Apple’s AI strategy across iPhone, iPad, and Mac.
Table of Contents
From Voice Assistant to System-Wide AI Agent: The Strategic Pivot
The Standalone Siri App: A New Home for Conversations
Redesigned Interface: Dynamic Island Takes Center Stage
Technical Foundation: Apple’s Dual-Model Strategy with Google Gemini
New Capabilities: What the Campo Update Actually Does
The Privacy Advantage: Apple’s Differentiator in the AI Wars
Rollout Timeline: When to Expect the New Siri
Why This Matters: Apple’s AI Moment of Truth
From Voice Assistant to System-Wide AI Agent: The Strategic Pivot
The decision to rebuild Siri stems from a sobering realization inside Apple: Apple Intelligence, the company’s ambitious AI initiative, failed to gain meaningful traction with users. Meanwhile, competitors who embraced conversational AI have seen explosive growth, with ChatGPT alone capturing hundreds of millions of users.
Apple’s response is nothing short of a complete architectural overhaul. The new Siri will transition from a traditional voice assistant—one that handled simple commands like setting timers and checking weather—to a true conversational AI agent capable of understanding context, maintaining threaded dialogues, and executing complex multi-step tasks across applications.
This shift required Apple to abandon its long-standing position. Craig Federighi, Apple’s software engineering chief who now oversees AI efforts, told reporters last year that Apple did not want users “to be pushed into some chat experience just to get things done.” But the rapid adoption of generative AI tools has forced a change in philosophy. The new Siri will embrace exactly what Federighi once resisted: chat-based interaction that feels natural and conversational.
The Standalone Siri App: A New Home for Conversations
Perhaps the most visible change coming with iOS 27 is the introduction of a standalone Siri application for iPhone, iPad, and Mac. This represents a fundamental shift in how Apple thinks about AI interaction—moving Siri from a background utility to a foreground application.
What the Independent Siri App Will Offer
The standalone Siri app functions as a centralized hub for all AI interactions, drawing clear inspiration from popular chatbot interfaces while maintaining Apple’s distinctive design language:
| Feature | Description |
|---|---|
| Conversation History | All Siri interactions are saved in a threaded interface, allowing users to revisit, search, and continue previous conversations |
| Chat Bubble Interface | Conversations appear in a familiar chat format similar to Apple’s Messages app, with clear back-and-forth threading |
| Pin & Save | Users can pin important conversations for quick access and save useful exchanges |
| File Upload Support | Upload documents and photos directly within Siri for analysis, summarization, or editing |
| Cross-Device Sync | Conversations sync across iPhone, iPad, and Mac through iCloud |
| Suggested Prompts | Context-aware suggestions based on previous usage patterns and current activity |
The app interface supports both light and dark modes, with a clean design that prioritizes readability and ease of navigation. Users can start new conversations with a prominent “+” button and search through past interactions using natural language queries.
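To make the feature table above concrete, here is a minimal sketch of the kind of data model such a threaded-conversation app could be built on: saved history, pinning, and keyword search. Every class and method name here is an illustrative assumption—Apple has published nothing about the app’s internals.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Message:
    role: str          # "user" or "assistant"
    text: str
    sent_at: datetime = field(default_factory=datetime.now)

@dataclass
class Conversation:
    title: str
    messages: list[Message] = field(default_factory=list)
    pinned: bool = False

    def add(self, role: str, text: str) -> None:
        self.messages.append(Message(role, text))

class ConversationStore:
    """Hypothetical store backing history, pin, and search features."""

    def __init__(self) -> None:
        self.conversations: list[Conversation] = []

    def start(self, title: str) -> Conversation:
        convo = Conversation(title)
        self.conversations.append(convo)
        return convo

    def pin(self, convo: Conversation) -> None:
        convo.pinned = True

    def search(self, query: str) -> list[Conversation]:
        # Naive keyword match; a real assistant would use a semantic index
        # to support the "natural language queries" described above.
        q = query.lower()
        return [c for c in self.conversations
                if q in c.title.lower()
                or any(q in m.text.lower() for m in c.messages)]
```

A session might start a conversation, pin it, and later retrieve it by searching for a word that appeared anywhere in the thread—the behavior the rumored app's history and search features imply.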
Integration Without Disruption
Importantly, the standalone app does not replace traditional Siri activation methods. Users can still invoke Siri by pressing the power button or using the “Hey Siri” voice command. The difference lies in what happens next: instead of a brief audio response, users are presented with a persistent conversation interface that can be expanded, minimized, or continued across sessions.
Redesigned Interface: Dynamic Island Takes Center Stage
The visual redesign accompanying Siri’s transformation represents Apple’s most significant interface change since the introduction of the Dynamic Island in 2022. According to sources testing internal builds, multiple design approaches are being evaluated, with one clear frontrunner.
The Dynamic Island Integration
One design in active testing places Siri directly within the Dynamic Island—the pill-shaped cutout at the top of iPhone displays introduced with the iPhone 14 Pro. When activated, Siri prompts users with a simple “Search or Ask” message. A glowing Siri icon appears alongside a pill-shaped indicator labeled “Searching” while the system processes requests.
Once results are ready, the interface expands into a larger translucent panel featuring Apple’s signature “Liquid Glass” design language—an aesthetic introduced in iOS 26 that emphasizes depth, translucency, and fluid animations. Users can pull down on this panel to continue conversations, maintaining context across multiple exchanges.
Replacing Spotlight Search
Beyond the visual refresh, Apple is positioning Siri to eventually replace Spotlight search—the system-wide search feature that has remained largely unchanged for years. The new unified search interface will allow users to find local content and conduct web searches from a single entry point, all powered by Siri’s enhanced intelligence.
This integration includes “Siri Suggestions”—AI-recommended apps, upcoming calendar events, and relevant settings—presented within the same interface. The goal is to eliminate the need for users to remember whether a query requires Spotlight or Siri; the AI handles both seamlessly.
“Ask Siri” and “Write with Siri” System-Wide
Apple is also testing two new system-level entry points for Siri functionality:
Ask Siri Button: A system-wide button appearing in native app menus that allows users to send selected content directly to Siri. For example, highlighting a paragraph in Safari and tapping “Ask Siri” could trigger a summary or related search.
Write with Siri: Located above the keyboard, this feature provides quick access to Apple’s writing tools for text generation, rewriting, and editing. This moves functionality that was buried in iOS 26 to a more accessible location.
Technical Foundation: Apple’s Dual-Model Strategy with Google Gemini
The intelligence powering Siri’s transformation comes from an unconventional partnership. While Apple has been developing its own AI models—collectively known as Apple Foundation Models—the company has also partnered with Google to integrate Gemini technology.
A $10 Billion Partnership
Apple and Google finalized a partnership reportedly valued at approximately $10 billion, formalized in January 2026, to bring Gemini’s advanced language capabilities to Apple’s ecosystem. This arrangement allows Apple to leapfrog years of development while maintaining its focus on user experience and privacy.
Under the agreement, Google provides custom-tuned Gemini models optimized for Apple’s hardware and privacy requirements. These models are not standard Gemini implementations but specialized versions designed to work within Apple’s constraints.
The Two-Tier Model Architecture
According to internal documentation cited by multiple sources, the new Siri operates on a two-tier system:
On-Device Processing (Apple Foundation Models): For simple queries and privacy-sensitive tasks, Siri relies on Apple’s own foundation models running entirely on the device. This ensures rapid response times and maintains Apple’s privacy commitments.
Cloud Processing (Gemini Integration): For complex queries requiring advanced reasoning, multi-step planning, or external knowledge, Siri can leverage Google’s Gemini models through Apple’s Private Cloud Compute infrastructure—which maintains privacy by not storing user data or exposing it to Google’s systems.
This hybrid approach allows Apple to offer capabilities comparable to ChatGPT and Gemini while preserving the privacy protections that differentiate Apple from competitors.
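The two-tier routing described above can be sketched in a few lines. This is purely illustrative: the routing heuristics, thresholds, and function names are assumptions, not Apple’s actual design, and the model backends are stand-in stubs.

```python
# Keywords suggesting a query touches personal, on-device data.
# (Assumed heuristic; the real classifier is not public.)
PERSONAL_KEYWORDS = {"my", "email", "photo", "message", "calendar"}

def needs_cloud(query: str) -> bool:
    """Route long, open-ended queries to the cloud tier."""
    words = set(query.lower().split())
    # Personal-context queries stay on device regardless of length,
    # matching the privacy-first routing described in the article.
    if PERSONAL_KEYWORDS & words:
        return False
    # Short commands ("set a timer") are handled locally.
    return len(query.split()) > 8

def answer(query: str) -> str:
    if needs_cloud(query):
        return cloud_model(query)       # e.g. a Gemini-backed endpoint
    return on_device_model(query)       # local foundation model

# Stub backends so the sketch runs standalone.
def cloud_model(q: str) -> str:
    return f"[cloud] {q}"

def on_device_model(q: str) -> str:
    return f"[device] {q}"
```

The design point the sketch captures is that routing happens before any data leaves the device: the decision function sees only the query, and personal-context requests never reach the cloud path.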
CoreAI Framework
iOS 27 will also introduce CoreAI, a new framework replacing the existing CoreML machine learning stack. CoreAI is designed specifically for generative AI workloads and will serve as the foundation for Siri’s enhanced capabilities across Apple’s operating systems.
New Capabilities: What the Campo Update Actually Does
The technical improvements translate into concrete capabilities that will change how users interact with their devices. Based on reporting from developers and internal testers, here are the key features arriving with the Campo update:
Personal Context Understanding
The new Siri can access and synthesize information across Apple’s native apps to complete complex, context-aware requests. Examples include:
“Find the presentation I was working on last night” (searches through recently accessed files in Pages or Keynote)
“What was the name of that restaurant in the email from Sarah last week?” (searches Mail and extracts relevant information)
“Play that playlist I used during my run on Tuesday” (accesses Music app history and workout data)
Screen Awareness
Siri can now see what’s on your screen and take action based on that context. This enables interactions like:
Summarizing a long article currently open in Safari
Extracting contact information from a photo or document
Finding related content based on what you’re viewing
In-App Actions
Through expanded App Intents, Siri can perform tasks within both Apple and third-party applications:
Send emails or messages within their respective apps
Adjust settings within Photos (cropping, color adjustments, filters)
Control media playback with natural language (“Play more songs like this one”)
Navigate within apps (“Scroll down,” “Go back,” “Find the settings”)
Enhanced Information Synthesis
Rather than simply providing links or brief answers, the new Siri can synthesize information from multiple sources:
Generate summaries of recent emails or messages
Create lists and bullet-point summaries from long documents
Provide sourced answers with citations from Apple News and the web
Compare information across sources
Cross-App Workflows
Perhaps most powerfully, Siri can now execute workflows spanning multiple applications:
“Remind me to reply to this email when I get home” (combines Mail and Reminders)
“Find photos of our trip to Paris and create a shared album” (combines Photos and iCloud Shared Albums)
“Add everyone in this message to a new calendar event” (combines Messages and Calendar)
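A workflow like the first example above can be sketched as a chain of two app interfaces: the agent reads context from one app and writes into another. All class and method names below are invented stand-ins for illustration—none of them are real Apple APIs.

```python
class MailApp:
    """Stand-in for the Mail context the agent reads from."""

    def current_email(self) -> dict:
        return {"sender": "Sarah", "subject": "Dinner plans"}

class RemindersApp:
    """Stand-in for the Reminders app the agent writes into."""

    def __init__(self) -> None:
        self.items: list[dict] = []

    def create(self, title: str, trigger: str) -> dict:
        item = {"title": title, "trigger": trigger}
        self.items.append(item)
        return item

def remind_to_reply(mail: MailApp, reminders: RemindersApp,
                    trigger: str) -> dict:
    """Cross-app workflow: resolve the open email, then create a
    location-triggered reminder referencing it."""
    email = mail.current_email()
    title = f"Reply to {email['sender']} about '{email['subject']}'"
    return reminders.create(title, trigger)
```

The interesting property is the chaining itself: the second step depends on context resolved in the first, which is exactly what distinguishes these multi-step workflows from the single-command Siri of today.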
The Privacy Advantage: Apple’s Differentiator in the AI Wars
As competitors like Google and OpenAI have faced scrutiny over how they handle user data, Apple is positioning privacy as its competitive advantage in the AI race. The new Siri incorporates multiple privacy protections that differentiate it from alternatives.
On-Device Processing First
Whenever possible, Siri processes requests entirely on the device. This includes:
Personal context queries that access local data
Simple information requests that don’t require external knowledge
App control and settings adjustments
This approach ensures that sensitive user data never leaves the device for routine interactions.
Private Cloud Compute
For queries requiring cloud resources, Apple uses Private Cloud Compute—an infrastructure designed to provide AI processing without compromising privacy. Key features include:
No persistent storage of user queries or data
Cryptographic verification that only Apple-approved code runs on servers
No data exposure to third parties, including Google (despite Gemini integration)
Federated Learning
Apple continues to develop on-device learning techniques that improve Siri’s capabilities without uploading user data. Models can be updated with aggregated, anonymized usage patterns rather than individual user data.
This privacy-first approach creates a meaningful differentiation from competitors, particularly for users concerned about their data being used to train AI models or shared with third parties.
Rollout Timeline: When to Expect the New Siri
Apple has established a clear rollout schedule for the new Siri, though some features will arrive later than others.
WWDC 2026 Announcement (June 8-12, 2026)
Apple will formally unveil the new Siri during its WWDC 2026 keynote. The presentation will include:
Demonstration of the standalone Siri app
Overview of new capabilities and interface
Announcement of the Google Gemini partnership
Developer tools for integrating with App Intents
Developer Beta (June 2026)
The first developer beta of iOS 27, iPadOS 27, and macOS 27 will be released immediately following the WWDC keynote. This beta will include the new Siri for testing, though some features may be incomplete.
Public Beta (July-August 2026)
A public beta program will allow non-developers to test the new Siri ahead of the official release. Apple typically uses this period to gather feedback and fix bugs.
iOS 27 Official Release (September 2026)
The stable version of iOS 27 will ship with new iPhones—expected to be the iPhone 18 series—and become available for compatible older devices. The new Siri will be a core feature of this release.
Fall 2026 Feature Completion
Some advanced features—particularly personal context capabilities and full App Intents integration—may not be fully available until later in the fall, following additional development and testing.
macOS 27 Release (October-November 2026)
The Mac version of the new Siri will arrive with macOS 27, typically released in the fall following the iPhone launch.
Why This Matters: Apple’s AI Moment of Truth
The stakes for this Siri overhaul could not be higher. Apple has watched competitors define the AI landscape for the past three years, with ChatGPT becoming a cultural phenomenon and Google integrating Gemini across its massive product ecosystem. Meanwhile, Siri remained largely unchanged—reliable for basic tasks but incapable of the sophisticated interactions users now expect.
The Developer Ecosystem Angle
Apple’s approach to AI also carries implications for its developer ecosystem. The expanded App Intents system, which allows Siri to control third-party applications, creates new opportunities for developers to integrate their apps with system-level AI. However, Apple will need to convince developers to enable this level of access—and some major players may resist allowing AI assistants to control their apps.
Hardware Synergy
The new Siri also represents an opportunity for Apple to leverage its hardware ecosystem. Future products reportedly in development include:
AI-powered AirPods with deeper Siri integration
An AI pendant device similar to what OpenAI is developing with Jony Ive
Continued investment in on-device AI capabilities through the M-series and A-series chips
Competitive Positioning
Perhaps most importantly, the new Siri allows Apple to re-enter the AI conversation. After years of being dismissed as lagging behind, Apple can now argue that its approach—combining conversational AI with deep system integration and uncompromising privacy—represents a superior vision for how AI should work on personal devices.
The company’s marketing message is clear: the best AI isn’t a separate service you visit in a browser; it’s an invisible layer woven throughout your operating system, understanding your context, respecting your privacy, and working across the apps you already use.
Conclusion: A New Chapter for Apple’s AI Ambitions
The Siri overhaul coming with iOS 27 represents Apple’s most significant AI investment to date—and its best chance to reclaim relevance in a space it once dominated. By combining a completely redesigned interface, a standalone app, deep system integration, and an unprecedented partnership with Google, Apple is betting that it can deliver AI capabilities that rival the best in the industry while maintaining the privacy focus that distinguishes its brand.
For users, the changes will be immediately apparent. Siri will finally feel like a true conversation partner rather than a voice-activated command line. The ability to upload documents, maintain threaded conversations, and execute complex cross-app workflows will bring Apple’s assistant into parity with—and in some ways ahead of—competing offerings.
For Apple, this moment represents a test of the company’s ability to execute on AI at scale. The pieces are in place: the hardware, the privacy infrastructure, the partnership with Google, and a redesigned interface. Now the company must deliver a polished, reliable experience that justifies the years of development and positions Siri as the center of Apple’s AI future.
The curtain rises on June 8 at WWDC 2026. If Apple delivers on the promise of Campo, the conversation around Siri will finally shift from what it has been to what it could become.