Apple has announced a new feature called Visual Intelligence that will arrive later this year as part of Apple Intelligence, the company’s suite of AI capabilities. The feature works much like comparable tools in multimodal AI systems from Google and OpenAI.
Apple’s Craig Federighi said at the company’s September event today that Visual Intelligence will allow the phone to “instantly learn about everything it sees.” Federighi said the feature is “enabled by Camera Control,” the company’s name for the new capacitive camera button on the side of the iPhone 16 and 16 Pro. To activate it, users click and hold the button while pointing their phone’s camera at something of interest.
With this feature, you can point your phone at something to search Google or send an image to ChatGPT, for example to identify a dog’s breed or find a restaurant’s hours. Apple hasn’t said exactly when the feature will debut, other than that it will be released sometime this year.