All the signs so far point towards Apple putting AI – specifically generative AI – front and center at WWDC 2024. Its developer-centric conference will surely highlight the potential of AI across its myriad products and platforms, and show how developers use, and could use, AI-based APIs to integrate an extra layer of smarts into their apps and services.
And that leads me to ponder what the Worldwide Developers Conference, set for June 10, could actually tell us about Apple’s future hardware, given no new devices are tipped to be revealed at the keynote.
First, let’s look at what we can expect from Apple over the rest of the year and heading into 2025.
New iPhones and Apple Watches for 2024
As night follows day, we’ll undoubtedly get the iPhone 16 around September time, likely joined by a new Apple Watch – specifically the Apple Watch 10 – and maybe the rumored AirPods Pro 3 and AirPods Max 2; though we’re not so convinced by the latter.
Given the iPad Air and iPad Pro ranges just got a rework, we don’t expect any new iPads any time soon. And as we got new MacBook Pro M3 and MacBook Air M3 models this year, we don’t expect to see M4 variants until 2025. There’s even a chance the Macs could skip the M4 chip, as it’s similar to the M3, only designed with efficiency and AI workloads in mind. The same could be true of the iMac 24-inch M3.
Personally, I’d love to see the 27-inch iMac make a comeback, but I suspect that desktop might have gone the way of the dodo, potentially joined by the iPad mini.
So that’s a good stab at the state of play for Apple’s hardware over the next 12 months or so. And with that in mind, what could WWDC tell us to expect from said hardware…
AI integrated at Apple’s core
My prediction – and I stress that this is speculation based on my experience covering Apple – is that the overall design and hardware capabilities of Cupertino’s products won’t change a great deal. Apple has arguably nailed the tablet formula with the iPads, and its MacBooks are all rather slick; though I’m sure some would argue for a few more ports.
There’s scope for the iPhone and Apple Watch designs to have some tweaks. But I think Apple will continue to iterate on these by trimming display bezels and reducing the thickness of watch cases, rather than go down the road of drastic redesigns.
But AI tools and capabilities will surely be integrated across iOS, iPadOS, macOS, and watchOS, bringing in things like generative AI-powered editing, smarter searching in Safari, the ability to quickly summarize web pages, documents, and meetings, or simply the option to ask a smarter Siri to pull together an agenda for your week based on access to Maps, your email, and your calendar – plus third-party tools like air travel apps.
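For that third-party piece to work, apps would need to expose their data and actions to the system, and Apple’s existing App Intents framework already does something along these lines for Siri and Shortcuts. The sketch below is purely illustrative: the framework is real, but the travel-app intent and its flight summary are hypothetical stand-ins for whatever a smarter Siri might actually call.

```swift
import AppIntents

// Hypothetical intent for an imaginary travel app; the AppIntents framework itself is real,
// and is how third-party apps already expose actions that Siri and Shortcuts can invoke.
struct UpcomingFlightsIntent: AppIntent {
    static var title: LocalizedStringResource = "Get Upcoming Flights"

    func perform() async throws -> some IntentResult & ReturnsValue<String> {
        // A real app would query its own data store here.
        let summary = "BA 283 to Los Angeles, Thursday 11:35"
        return .result(value: summary)
    }
}
```

A generative-AI-backed Siri could, in theory, chain results like this with Maps and Calendar data to build that weekly agenda.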
And I suspect that’ll drive the need for chips in the iPhone 16 family to prioritize Apple’s Neural Engine, putting more focus on driving AI workloads and machine-learning algorithms than on raw clock speeds and graphics capabilities.
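Developers can already ask Core ML to favor the Neural Engine; a chip built around it would simply make that request pay off more often. Here’s a minimal sketch using real Core ML APIs, with “SummaryModel” as a placeholder model name rather than anything Apple has confirmed.

```swift
import CoreML

// MLModelConfiguration and MLComputeUnits are real Core ML APIs;
// "SummaryModel.mlmodelc" is a placeholder for any compiled model bundled with an app.
let config = MLModelConfiguration()
config.computeUnits = .cpuAndNeuralEngine  // prefer the Neural Engine over the GPU

do {
    let model = try MLModel(
        contentsOf: URL(fileURLWithPath: "SummaryModel.mlmodelc"),
        configuration: config
    )
    // Core ML schedules supported layers on the Neural Engine, falling back to the CPU as needed.
    print("Loaded model: \(model.modelDescription)")
} catch {
    print("Failed to load model: \(error)")
}
```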
From that, I’d not be surprised if the standard iPhone 16 got a new AI-centric A-series chip, alongside the iPhone 16 Pro and iPhone 16 Pro Max. Last year’s iPhone 15 range saw a two-tier approach to chips, with the standard iPhone 15 and iPhone 15 Plus getting the A16 Bionic, and the iPhone 15 Pro and iPhone 15 Pro Max getting the more powerful A17 Pro chip. That was the second generation in a row in which Apple separated the chipsets of the vanilla and Pro iPhones.
However, I’d posit that the iPhone 16 family will all use the same chipset if Apple does indeed lean hard into AI, as it’ll need parity of silicon to ensure smart tools can run on all its new phones and reach the widest audience.
Of course, running AI models on a device takes power, especially if everything is done on board – something Apple could lean into given the privacy-first approach it champions for its users and their data. Even if a connection to the cloud is needed, that’ll likely mean tapping into a phone’s modem chip, which also consumes power, especially over 5G.
From there, I can theorize that the iPhone 16 range will have a larger battery, or perhaps even be a bit thicker to fit a bigger cell or more efficient cooling via vapor chambers and heat pipes. Am I talking about major differences? Well, no, but a few millimeters added to a phone’s thickness can make all the difference in everyday use.
If such predictions come to pass, I suspect what I’ve homed in on with the iPhone 16 could then spread to the iPads and Macs. The former could get a little thicker to hold a bigger battery and gain a more powerful AI-targeted chip, but I suspect that won’t happen until later in 2025, or even 2026.
Macs will likely get more powerful chips next year and could even come with dedicated AI keys on their keyboards. And I’d not be surprised if bigger batteries were added to all MacBooks, with the MacBook Air seeing a return to active cooling if it needs to run demanding AI workloads and algorithms on device. I’m not predicting a big design change, but that display notch could be reduced with AI processing used to fill in any gaps left by shrunken sensors.
Wearables with virtual brains
Again, I don’t foresee big changes to the Apple Watch design, given it’s arguably somewhat iconic now.
But I can see Apple adding a new chip to the next-generation Apple Watch built around on-device AI processing, and that could see the thickness boosted a tad to hold a larger battery. Then again, Apple may not do that, choosing instead to preserve the aesthetic set by the Apple Watch 9 and bring faster charging to its next-generation smartwatch as compensation for any increased power drain.
On the AirPods side, I’d not expect any huge design changes. But once again I can see the likes of the AirPods Pro 3 and AirPods Max 2 getting chips that run lightweight, low-impact AI features – say, smart equalizer adjustments, or working with an iPhone to cleverly switch to more dynamic tracks in Apple Music if a person suddenly picks up their walking pace.
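That last idea isn’t far-fetched, as the iPhone already exposes pace data through Core Motion. The snippet below is a rough sketch of how such a trigger could work, assuming a hypothetical “switch playlist” action rather than any real Apple Music feature.

```swift
import CoreMotion

// CMPedometer is a real Core Motion API; currentPace reports seconds per meter,
// so a falling value means the wearer has sped up.
let pedometer = CMPedometer()

if CMPedometer.isPaceAvailable() {
    pedometer.startUpdates(from: Date()) { data, error in
        guard error == nil, let pace = data?.currentPace?.doubleValue else { return }
        if pace < 0.5 {  // faster than roughly 2 m/s, i.e. a brisk jog
            // Hypothetical hook: queue a higher-tempo Apple Music track here.
            print("Pace picked up, time for a more dynamic playlist")
        }
    }
}
```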
A dedicated Action button or touch-sensitive surface could also pop up in future AirPods to trigger an upgraded Siri with new generative AI abilities.
The use of multimodal AI models, which can pull and process data from multiple sources, could see the Apple Watch and AirPods tapping into more data and feedback from accelerometers and other sensors, feeding an AI with context that helps it deliver more accurate results. This may not herald a design revolution, but sensor placement could change and evolve within Apple’s wearables.
Evolution on the outside, revolution on the inside
Ultimately, I don’t think a dive into AI will see Apple drastically change product designs and hardware features. But under the hood, more powerful chips are likely to pop up in Apple’s gadgets. Such chips could transform the iPhone from an app-centric device to a more AI-led one, and give Macs an extra layer of productivity potential.
Apple’s devices have a vast reach, especially iPhones, and such broad availability of AI-powered handsets could lead to the frictionless use and normalization of AI tools in everyday life – rather than people regarding them warily as something tech bros lose their minds over, ChatGPT-style – and maybe dispel that general sense of AI ennui.
Unless I and the wider tech journalism world have got it wrong, Apple’s tipped showcase of AI at WWDC 2024, and the subsequent rollout of smart tools and features, could act as a gateway drug into generative AI for the average phone user and consumer. That could be very big for the future of AI.
If you want to see if I’m right or wrong, then check back with TechRadar, as we’ll be bringing you all the latest rumors, news, opinions, and analysis ahead of and during WWDC 2024. And if you want to watch the keynote for yourself, check out our guide on how to watch WWDC 2024.