by @lauriesullivan
Source: www.mediapost.com, June 2024


Apple at its developer conference Monday announced Apple Intelligence, an AI-based system that personalizes content across its devices.

The company’s next step will be driven by generative AI (GAI) and large language models, CEO Tim Cook said during the Worldwide Developers Conference (WWDC) in Cupertino, California, where Apple announced a partnership with OpenAI to integrate ChatGPT into its devices and services.

Cook described “Apple Intelligence” as a “personal intelligence system” based on Apple’s own generative AI models and its custom semiconductor chips.

Privacy is among the most important features of this transition, he said. Processing information on Apple devices rather than sending it to the cloud lets Apple use personal data without collecting it, said Craig Federighi, Apple senior vice president of software engineering.

Apple has developed some of the most advanced silicon in the past few years. It sits inside iPhones, iPads and Macs, and helps the company adhere to strict privacy standards. Many of Apple’s generative AI models can run entirely on a device powered by Apple’s A17+ or M-series chips, eliminating the risk of sending a user’s personal data to a remote server.

When larger cloud-based models are needed to process GAI requests, Apple will run them on servers it built using Apple silicon. This allows the use of security tools built into the Swift programming language. Apple Intelligence will send the server only the data relevant to completing a task.

Apple Intelligence determines whether a request can be processed on the device or must go to the server for more computing power. Private Cloud Compute will refuse to communicate with a server unless its software has been publicly logged for inspection, which Apple says sets a new standard for AI and privacy.

Consumer data is never stored or made accessible to Apple; it is used only to fulfill a request.

Siri, now powered by Apple Intelligence, also received a privacy-focused upgrade. Apple Intelligence lets Siri handle stumbles in speech and better understand context. A semantic index will let users change course in a conversation without starting over from the beginning. Apple said users make 1.5 billion voice requests daily.

The voice assistant also will soon gain what Apple calls “onscreen awareness.” For example, if a friend sends their address, the user can ask Siri to add the address to a specific contact. Siri also will be able to take actions in and across apps through an App Intents API for developers.
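The App Intents API lets developers declare in-app actions that Siri and Apple Intelligence can invoke. A minimal sketch of what such an intent might look like, assuming an address-to-contact action like the one described above (the type name, parameters, and dialog text here are illustrative, not taken from Apple's sample code):

```swift
import AppIntents

// Hypothetical intent exposing an "add address to contact" action to Siri.
struct AddAddressToContactIntent: AppIntent {
    // The title Siri and Shortcuts display for this action.
    static var title: LocalizedStringResource = "Add Address to Contact"

    // Parameters Siri can fill from conversation or onscreen context.
    @Parameter(title: "Contact Name")
    var contactName: String

    @Parameter(title: "Address")
    var address: String

    // Runs when Siri invokes the intent; app-specific saving logic
    // (e.g., a Contacts framework call) would go here.
    func perform() async throws -> some IntentResult & ProvidesDialog {
        return .result(dialog: "Added the address to \(contactName)'s contact card.")
    }
}
```

Declaring intents this way is what would allow Siri's planned onscreen awareness to hand data such as an address directly to an app's action without the user opening the app.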

By double-tapping the bottom of the screen, Siri will switch between voice and text queries to provide answers.