Google's multimodal virtual assistant Astra is widely seen as the future of Google AI. Analysts suggest Google plans to launch hardware to support Astra, and smart glasses are particularly well suited to what Google aims to accomplish with it.
After Apple and Meta, is Google also going to make smart glasses?
At this spring's Google I/O conference, Google showcased its multimodal virtual assistant, Astra. Although the project is still in its early stages, it is regarded as the future of Google AI. Throughout Astra's demonstration, one thing kept appearing: glasses.
For years, Google has been developing a variety of smart head-worn devices, from Glass to Cardboard to the Project Iris translation glasses shown two years ago. Analysts suggest that Astra needs smart glasses, and that Google may therefore build them.
At a press briefing ahead of the Gemini 2.0 release, Bibo Xu, a product manager on Google's DeepMind team, said:
"A small group of testers will try Astra through prototype glasses, which we believe is one of the most powerful and intuitive ways to experience this kind of AI."
Xu also revealed:
"We will soon release more news regarding the glasses products themselves."
This suggests Google plans to launch hardware that supports Astra, and smart glasses fit what Google hopes to achieve with it particularly well:
Google wants users to think of Astra as an always-on assistant in their daily lives, and nothing serves that goal better than putting audio, video, and a display right on the user's face, especially for those who want a continuously connected experience.
In a new video showcasing Astra's Gemini 2.0 capabilities, a tester uses Astra to remember an apartment building's door code, check the weather, and more. In one scene, the tester spots a bus rushing by and asks Astra whether it passes near Chinatown. All of these tasks can be done on a smartphone, but doing them through a wearable device feels far more natural.