Owners of the iPhone 15 Pro and iPhone 15 Pro Max will soon have one more reason not to upgrade to the iPhone 16. According to Daring Fireball, Visual Intelligence — Apple's equivalent of Google Lens — is coming to the 2023 Pro-series flagships.
On the iPhone 16 and iPhone 16 Pro, Visual Intelligence is launched by long-pressing the dedicated Camera Control button. The iPhone 15 Pro and Pro Max, however, lack a physical camera button — as does the recently introduced iPhone 16e, which also supports the feature. All three models will therefore gain the ability to activate Visual Intelligence via the Action Button or a Control Center shortcut, arriving in a future software update.
Apple has not yet announced which version of iOS will bring Visual Intelligence to the iPhone 15 Pro. However, Daring Fireball's John Gruber suggests it will be iOS 18.4, which is set to begin beta testing soon and ship in April.
As a reminder, Visual Intelligence is part of the Apple Intelligence suite. It lets you point your camera at an object and analyze it in real time. Some capabilities work without any additional services, but the feature becomes even more useful with its built-in shortcuts for searching via ChatGPT or Google Image Search.
Visual Intelligence can also interact with text — translating, annotating, and summarizing it — and surface information about a business you point the camera at: opening hours, menus, available services, or products for purchase.