# Google’s AI enters its ‘agentic era’

Project Astra is supposed to be more useful than ever — and it knows where you are. | Image: Google

I stepped into a room lined with bookshelves stacked with ordinary programming and architecture texts. One shelf stood slightly askew, and behind it was a hidden room with three TVs displaying famous artworks: Edvard Munch’s The Scream, Georges Seurat’s Sunday Afternoon, and Hokusai’s The Great Wave off Kanagawa. “There’s some interesting pieces of art here,” said Bibo Xu, Google DeepMind’s lead product manager for Project Astra. “Is there one in particular that you would want to talk about?”

Project Astra, Google’s prototype AI “universal agent,” responded smoothly. “The Sunday Afternoon artwork was discussed previously,” it replied. “Was there a particular detail about it you wish to discuss, or were you interested in discussing The Scream?”

I was at Google’s sprawling Mountain View campus to see the latest projects from its AI lab, DeepMind. One was Project Astra, a virtual assistant first demoed at Google I/O earlier this year. Currently contained in an app, it can process text, images, video, and audio in real time and respond to questions about them. It’s like a Siri or Alexa that’s slightly more natural to talk to, can see the world around you, and can “remember”…

Read the full story at The Verge.