Apple opens core AI model to third-party developers

Apple says it will allow developers to access the foundational AI model that powers some of its features

Apple introduced new artificial intelligence features on Monday at its Worldwide Developers Conference, focusing on smaller updates that improve daily use instead of announcing major changes.

The company added tools such as live phone call translation and expanded visual recognition features, including the ability to find and buy items similar to those shown on screen.

Apple said it will allow developers to access the foundational AI model that powers some of its features. Software chief Craig Federighi said apps will be able to use the on-device large language model at the core of Apple Intelligence.
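The article does not spell out the interface, but a minimal sketch of what such developer access could look like is shown below in Swift. The framework name (FoundationModels), the LanguageModelSession type, and the respond(to:) call are assumptions based on Apple's announcement, not details confirmed in this article.

```swift
import FoundationModels

// Illustrative sketch only: the framework, type, and method names here are
// assumptions based on Apple's announcement, not details from this article.
// The idea: an app opens a session with the on-device Apple Intelligence model
// and asks it to handle a small, bounded task such as summarizing text.
func summarize(_ text: String) async throws -> String {
    let session = LanguageModelSession(
        instructions: "Summarize the user's text in one sentence."
    )
    let response = try await session.respond(to: text)
    return response.content
}
```

Because the model offered to developers is the on-device version, a call like this would run locally on the user's hardware and is suited to small tasks rather than complex reasoning.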

Apple also said it will offer code completion tools from both its own models and OpenAI's inside its developer software.

As part of a demonstration, Apple showed that ChatGPT's image generation would be integrated into the Image Playground app. The company said no user data would be shared with OpenAI without the user's permission.

The updates come after Apple failed to deliver major AI upgrades last year. This time, the company focused on tools that can help users complete small tasks more easily. Federighi also said the company is redesigning all its operating systems using a style called "Liquid Glass," with transparent icons and menus.

Operating systems will now be named by year instead of version number.

Apple said developers will only have access to the on-device model of Apple Intelligence, which has around 3 billion parameters. This version does not connect to Apple's cloud-based AI servers and is not designed to handle complex tasks.

During the event, Apple introduced a feature called “Call Screening,” which lets iPhones answer calls from unknown numbers and transcribe the reason for the call before notifying the user. Live phone call translation will also be available, and developers will be able to include this feature in their apps. The person receiving the call does not need an iPhone.

The Visual Intelligence feature will now analyze items shown on the screen, not just through the camera. Apple showed how users can point their phones at a jacket and then find a similar one for sale through another app on the device.

While Apple focused on software updates, OpenAI announced on the same day that it had reached a $10 billion annual revenue run rate. Meanwhile, Apple shares closed down 1.2% following the event.

Monitoring Desk