WTF is Apple Intelligence? Apple's take on AI explained

How convenient that artificial and Apple both start with 'A'

WWDC 2024 has kicked off, and as expected, Apple has previewed its long-awaited approach to AI for its devices. While a fair bit was said about all of Apple's software - iOS, iPadOS, watchOS, macOS, and so on - the big announcements were about artificial intelligence and how Apple intends to take it on in the coming months. Cheekily named Apple Intelligence, it promises a much more personalised approach to AI, with stronger privacy and responses better tailored to each user's specific needs.

It may sound complicated, but it also sounds pretty impressive. Let's try to get to the bottom of it all.

On-device processing

The big talking point for Apple Intelligence is that the majority of processing will take place on device, which means better privacy and an assurance that your personal data isn't being harvested and used by anyone else. All of this will be built into iOS 18, iPadOS 18, and macOS Sequoia.

Apple Intelligence will be able to draw on your usage to generate language and text, improve your writing, create images, and generally take action using personal context and on-screen data.

That said, Apple Intelligence can also hand trickier requests over to ChatGPT (powered by GPT-4o), and it'll always ask for permission before sharing any of your personal data to get better results. This can help with generating images, getting suggestions based on existing images and documents, and even creating long-form text.

Compatibility will, unsurprisingly, be limited given the heavy reliance on on-device processing. Apple has confirmed that Apple Intelligence features will be available on the iPhone 15 Pro series (and later), as well as iPads and Macs with M-series chips. They'll also need to be running iOS 18, iPadOS 18, or macOS Sequoia, all of which will be widely released later this year.

Siri gets supercharged by Apple Intelligence

There are some subtle changes to the way Siri looks and reacts - notably the colour, the glow that now wraps around the edge of your screen, and the icon. The real changes, however, come from Siri getting a healthy dose of Apple Intelligence to make it more capable. On an immediate level, Siri should understand users better, including figuring out contextual references from previous queries.

You can also now type to Siri instead of speaking, and the voice assistant will learn from your usage to improve over time. Complicated requests should be easier for it to figure out, and correcting yourself mid-sentence - previously a sure-fire way to end up repeating the whole thing - should now be understood properly.

Prioritising what needs to be prioritised


Apple Intelligence will be able to prioritise notifications based on how important they're likely to be, promising to put the right emails and messages up top. You can also get summaries of long messages and those seemingly endless group chats, and expect to see the right things at the right time - your boarding pass, for example, should show up at the top right before your flight.

AI tends to be rather good at writing and rewriting basic text, and Apple Intelligence can work with just about any app with editable text. The system promises to write and rewrite as needed, including proofreading, grammatical fixes, and even changing the tone of emails and messages.
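For the developer-curious, here's a minimal, hedged sketch of why the "just about any app" claim holds: apps that stick to the system's standard text controls shouldn't need any special work to pick up these writing tools. The view name and the writingToolsBehavior modifier below are illustrative assumptions based on what Apple showed at WWDC 2024, not a definitive implementation.

```swift
import SwiftUI

// Minimal sketch: a plain SwiftUI text editor. On Apple Intelligence-capable
// devices running iOS 18 / macOS Sequoia, the system is expected to surface
// its proofread/rewrite options in the standard edit menu of controls like
// this one, with no extra code from the app.
struct NoteEditor: View {
    @State private var draft: String = ""

    var body: some View {
        TextEditor(text: $draft)
            .padding()
            // Assumption: the writingToolsBehavior modifier announced at
            // WWDC 2024 lets an app dial the feature up or down;
            // .complete opts in to the full experience where available.
            .writingToolsBehavior(.complete)
    }
}
```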

You can even create new emoji with Genmoji, which uses Apple Intelligence's generative capabilities to whip up just about any emoji you can describe, ready to use in supported apps.