Apple has announced several AI-powered features coming to its Siri virtual assistant, including an updated design, improvements to how it speaks and understands spoken commands, and an integration with OpenAI's ChatGPT.
Some of these features, announced during Apple's WWDC keynote on June 10th, come courtesy of Apple Intelligence, the company's new privacy-focused personal intelligence system, which Apple says will be able to take actions in apps on behalf of users. For Siri, those actions include things like retrieving information from a user's emails when asked when their mom's flight is landing, or locating an image of a driver's license on the device and extracting its information to fill out a form.
Apple says Siri's updated design makes it more prominent across the device, displaying a glowing multicolored border around the screen when in use. Siri should also be better at understanding users when they stumble over their words, and it can maintain conversational context between requests, creating a calendar event for a location it has just given weather information about, for example.
Users can also make requests by simply describing an app or feature they wish to use, with Siri understanding the description and fetching the relevant information. Additionally, users will soon be able to type their questions and requests to Siri by double-tapping the bottom of the screen, and they can continue those interactions via either typing or voice.
Apple says that Apple Intelligence gives Siri "on-screen awareness" to understand and take action on what's displayed, allowing it to do things like update the address on a contact card when that person messages you with their new information. And Siri's ability to take "hundreds" of in-app actions will allow it to perform complex tasks entirely via voice, like fetching images of friends in specific locations or outfits and enhancing those images when asked to "make them pop."
For the incoming ChatGPT integration, Apple says Siri will be able to "tap into ChatGPT's expertise when it may be helpful for you," such as when asking for recipe advice. Siri will ask permission before sending a request to ChatGPT and, if the user approves, will present ChatGPT's answer directly on screen.
Today’s announcement follows several months of reports and rumors regarding Apple’s plans for AI, including efforts to develop its own framework for large language models. The ChatGPT integration is expected to be available sometime later this year, while the new Siri features powered by Apple Intelligence will be available in beta for iOS 18 “this fall” in the US.
While Siri's ability to respond to verbal commands and questions was impressive when it debuted on the iPhone 4S in 2011, its lack of subsequent improvement has made it feel like a disappointing afterthought. Meanwhile, products like OpenAI's ChatGPT and Google's Gemini have since shown that AI tools and systems are capable of a whole lot more. With Siri's upgraded capabilities and ChatGPT integrated directly into the service, Apple's virtual assistant may finally have the second wind it desperately needs.