At the Google I/O conference, the company unveiled Project Astra, a multimodal model that can analyze everything the user shows it.

The tech giant showed a very early version of what it hopes will become a universal assistant. Project Astra is a multimodal AI assistant that sees the world in real time, recognizes objects, and remembers where it last saw them.

In a demonstration video, which Google DeepMind head Demis Hassabis assures is neither faked nor tampered with, an Astra user at Google’s London office asked the assistant to describe the objects captured on camera, explain a piece of code, and remind him where his glasses were.

The assistant answered all the questions in a live conversation format without any delays.

Astra is much closer to how a real-life AI assistant should work than previous products, Hassabis says.

The new assistant should be available “later this year.”