Google has begun rolling out new artificial intelligence features to Gemini Live that will allow the chatbot to "see" through a smartphone camera and answer questions, The Verge reports.
Google first introduced these features nearly a year ago at its I/O conference, where they were dubbed Project Astra, and the company is expected to incorporate the technology into its upcoming smart glasses.
A Reddit user reported that the "Share screen with Live" feature had appeared on their Xiaomi phone and posted a video showing Gemini reading content from the screen. For now, the feature is available only to Google One AI Premium subscribers.
Another notable Project Astra feature is real-time object recognition through the smartphone camera: users can point the camera at an object, ask the chatbot questions about it, and receive answers on the spot.