
Apple Intelligence: All you need to Know about Apple’s AI Model and Services


Apple Intelligence can be found in your favorite apps, like Messages, Mail, and Notes, if you have upgraded to a recent iPhone model. Apple Intelligence, or AI, arrived in Apple’s ecosystem on October 20, 2024, and it’s here to stay as Apple competes against Google, OpenAI, and Anthropic to build the best AI tools.

What is Apple Intelligence?

Image Credits: Apple

Cupertino executives have branded Apple Intelligence as “AI for the rest of us.” The platform is designed to leverage what generative AI already does well, such as text and image generation, to improve existing features. Like other platforms such as ChatGPT and Google Gemini, Apple Intelligence is built on models trained on vast amounts of data. These systems use deep learning to form connections, whether in text, images, or video.

The LLM-powered text offering is presented as Writing Tools. The feature is available in a variety of Apple apps, including Mail, Messages, and Pages. It can summarize long texts, proofread, and even write messages for you using content and tone prompts.

Image generation has been integrated in a similar, albeit less seamless, fashion. Apple Intelligence can be asked to create custom emoji in the Apple house style, while Image Playground is a standalone app that uses prompts to generate visual content. That content can be used in Messages and Keynote or shared on social media.

Apple Intelligence also brings a long-awaited update to Siri. The smart assistant, which was an early mover in the space, has been largely neglected in recent years. Siri is now integrated more deeply into Apple’s operating systems: instead of the familiar icon on the iPhone screen, users see an illuminated edge around the display.

The new Siri is more capable in part because it works across apps. That means you can, for example, ask Siri to edit a picture and then insert it directly into a text message, a frictionless experience the assistant lacked before. Siri also has onscreen awareness, using the context of the content you are currently interacting with to provide a more appropriate answer.

Many expected Apple to introduce this more powerful version of Siri at WWDC 2025, but we will have to wait a little longer.

At WWDC 2025, Apple SVP of Software Engineering Craig Federighi said: “As we have shared, we are continuing our work to bring features that make Siri more personal. This work needed more time to reach our high-quality bar, and we look forward to sharing more about it in the coming year.” Bloomberg reports that the in-development version was simply not ready; it was too error-ridden to ship, which is why it was delayed.

Apple also announced a new AI feature at WWDC 2025 called Visual Intelligence, which lets you search for items you come across while browsing. The company also announced Live Translation, which can translate conversations in real time in the Messages, FaceTime, and Phone apps.

Visual Intelligence, Live Translation and other features are expected to become available in 2025 when iOS 26 is released to the public.

When was Apple Intelligence revealed?

Apple Intelligence was the star of WWDC 2024 after months of speculation. The platform was announced amid a torrent of generative AI news from companies such as Google and OpenAI, which had raised concerns that the famously secretive tech giant had missed out on the latest tech craze. Contrary to that speculation, Apple had a team working on a very Apple approach to artificial intelligence.

Apple is known for its love of putting on a show, but Apple Intelligence takes a more pragmatic approach to the category. It doesn’t exist as a standalone feature; rather, it’s about integrating AI into existing offerings. The large language models (LLMs) behind it operate in the background, and the technology is mainly visible to consumers as new features in existing apps.

In September 2024, Apple held its iPhone 16 event, where it announced a number of AI-powered features coming to its devices, including visual search on iPhones and translation on the Apple Watch Series 10. The first wave of Apple Intelligence arrived at the end of October as part of iOS 18.1, iPadOS 18.1, and macOS Sequoia 15.1.

These features launched first in U.S. English. Apple later added Australian, Canadian, and New Zealand English localizations. Support for Chinese, English (India), English (Singapore), French, German, Italian, Japanese, Korean, Portuguese, Spanish, and Vietnamese is slated to arrive in 2025.

Who is Apple Intelligence for?

Image Credits: Darrell Etherington

The first wave of Apple Intelligence arrived in October 2024 via the iOS 18.1, iPadOS 18.1, and macOS Sequoia 15.1 updates. These updates included integrated Writing Tools, image cleanup, article summaries, and typed input for the redesigned Siri experience. A second wave of features became available as part of iOS 18.2, iPadOS 18.2, and macOS Sequoia 15.2. That list includes Genmoji, Image Playground, Visual Intelligence, Image Wand, and ChatGPT integration.

These offerings are free to use, so long as you have one of the following pieces of hardware:

  • All iPhone 16 models
  • iPhone 15 Pro Max (A17 Pro)
  • iPhone 15 Pro (A17 Pro)
  • iPad Pro (M1 and later)
  • iPad Air (M1 and later)
  • iPad mini (A17 or later)
  • MacBook Air (M1 and later)
  • MacBook Pro (M1 and later)
  • iMac (M1 and later)
  • Mac mini (M1 and later)
  • Mac Studio (M1 Max and later)
  • Mac Pro (M2 Ultra)

Notably, only the Pro versions of the iPhone 15 have access, owing to shortcomings of the standard model’s chipset. The entire iPhone 16 line, by contrast, is able to run Apple Intelligence.

How does Apple’s AI function without an Internet connection?

Image Credits: Apple

When you ask ChatGPT or Gemini a question, your query is sent to external servers to generate a response, which requires an internet connection. Apple, by contrast, has taken a small-model, bespoke approach to training.

The biggest benefit of this approach is that many of these tasks become far less resource intensive and can be performed on-device. This is because, rather than relying on the kind of kitchen sink approach that fuels platforms like GPT and Gemini, the company has compiled datasets in-house for specific tasks like, say, composing an email.

That doesn’t apply to everything, however. More complex queries will utilize the new Private Cloud Compute offering. The company now operates remote servers running on Apple Silicon, which it claims allows it to offer the same level of privacy as its consumer devices. Whether an action is being performed locally or via the cloud will be invisible to the user, unless their device is offline, at which point remote queries will toss up an error.

Apple Intelligence with Third-Party Apps

Image Credits: Didem Mente / Anadolu Agency / Getty Images

A lot of noise was made about Apple’s pending partnership with OpenAI ahead of the launch of Apple Intelligence. Ultimately, however, it turned out that the deal was less about powering Apple Intelligence and more about offering an alternative platform for those things it’s not really built for. It’s a tacit acknowledgement that building a small-model system has its limitations.

Apple Intelligence is free. So, too, is access to ChatGPT. However, those with paid accounts to the latter will have access to premium features free users don’t, including unlimited queries.

ChatGPT integration, which debuted in iOS 18.2, iPadOS 18.2, and macOS Sequoia 15.2, has two primary roles: supplementing Siri’s knowledge base and adding to the existing Writing Tools options.

With the service enabled, certain questions will prompt the new Siri to ask the user to approve its accessing ChatGPT. Recipes and travel planning are examples of questions that may surface the option. Users can also directly prompt Siri to “ask ChatGPT.”

Compose is the other primary ChatGPT feature available through Apple Intelligence. Users can access it in any app that supports the new Writing Tools feature. Compose adds the ability to write content based on a prompt. That joins existing writing tools like Style and Summary.

We know for sure that Apple plans to partner with additional generative AI services. The company all but said that Google Gemini is next on that list.

Can developers build on Apple’s AI models?

Apple announced the Foundation Models framework at WWDC 2025. The framework lets developers tap into Apple’s on-device AI models, even when the device is offline, so they can build AI features into third-party apps on top of Apple’s existing systems.

“For instance, if you are getting ready for an exam, an app like Kahoot can create a personalized quiz from your notes to make studying more engaging,” Federighi said at WWDC. “And because this happens using on-device models, it happens without cloud API costs […]”
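Based on what Apple has shown publicly, calling an on-device model through the Foundation Models framework looks roughly like the sketch below. The `QuizQuestion` type and its prompt are hypothetical; the `LanguageModelSession` API shape follows Apple’s WWDC 2025 examples, so treat the exact signatures as assumptions rather than a definitive implementation.

```swift
import FoundationModels

// Hypothetical structured output: the model fills in this type
// rather than returning free-form text.
@Generable
struct QuizQuestion {
    @Guide(description: "A short question based on the study notes")
    var question: String
    @Guide(description: "The correct answer to the question")
    var answer: String
}

func makeQuizQuestion(from notes: String) async throws -> QuizQuestion {
    // A session backed by Apple's on-device model; no network
    // connection or cloud API key is required.
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Create one quiz question from these notes: \(notes)",
        generating: QuizQuestion.self
    )
    return response.content
}
```

The `@Generable` and `@Guide` annotations are how the framework constrains the model to produce typed, app-ready data, which is what makes scenarios like the Kahoot example practical without any parsing of raw model output.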

