All the usual big-tech suspects are engaged in a not-so-secret arms race to be first to market with augmented reality glasses. We already know Google has partnered with Samsung on Android-focused eyewear, but the company recently showed off another device to a small audience at the TED2025 conference. This kind of XR, aka "extended reality," may finally help us escape the tyranny of bulky VR headsets.
Shahram Izadi, Google's vice president and general manager of XR, walked out on the TED stage wearing what at first looked like an ordinary, if slightly oversized, pair of glasses (think of Meta's big, unreleased Orion smart glasses). But they were Google's XR glasses, and Izadi claimed they were displaying his speech notes as he talked.
The glasses pack a microphone, camera, and speakers to take in plenty of information. Google's first XR device since Google Glass contains an in-lens display, which Izadi held up to the camera for a few seconds, noting that it is "very small." That hints Google may be experimenting with the kind of waveguide displays found in devices like the latest RayNeo glasses. The glasses run Android XR, Google's homegrown operating system for extended reality devices.

Google product manager Nishtha Bhatia showed the crowd what the glasses' camera could see through the compact display. She tapped one arm of the glasses, which summoned the familiar blue-star Gemini logo at the bottom of the display, and after half a second the AI-voiced chatbot was ready to recite an eyeroll-worthy haiku about the assembled audience. Gemini can translate text it sees into different languages, although Izadi suggested the feature may produce mixed results. The camera can also analyze text and diagrams and turn them into more digestible audio descriptions.
Izadi and Bhatia also showed off a "memory" feature that lets the AI recall things it has seen through the camera in the recent past. It's close to what Google DeepMind has been trialing with Project Astra since last year. Google has slowly been adding Astra features, including photo and video recognition, to the Gemini Live chatbot interface, and it appears the company wants to bring similar capabilities to a pair of upcoming AR glasses.
The glasses should gain more capabilities when connected to your smartphone, with "access to all your phone apps." The real killer feature, though, is the tie-in with other Google apps. Bhatia asked the glasses to look at a record by rapper Teddy Swims and play a track; YouTube Music opened and played the requested song. We already enjoy the Ray-Ban Meta glasses well enough for their solid speakers, no earbuds required, so this isn't far-fetched coming from Google. What was wilder was that the display inside the glasses can work with Google Maps, offering a Street View-like image for navigation.
Google has been working behind the scenes on AR glasses for years, even after Google Glass faded away. Oddly, the pair shown at TED may never be sold under the Google brand. The search giant is working alongside Samsung on the Project Moohan headset, powering the device with Android XR, and Google also showed off Samsung's headset on the TED2025 stage. From what we've seen of Moohan, which may carry a premium price, the headset should behave much like today's Apple Vision Pro, but with more Gemini-powered features.
Samsung has also strongly hinted that it is working on a separate pair of smart glasses, but we doubt what Google showed this week is the device rumored to arrive later this year. Korean outlet ETNews (read via machine translation) reported last month that the upcoming device may have no display or buttons, relying instead on a microphone for speech and a camera for gesture controls.
Meanwhile, Meta may also be working on an expensive pair of glasses with a small, app-focused display at the bottom of the right lens. A display is what will truly set "augmented reality" glasses apart from the audio-centric devices we've seen so far. Google will need to figure out how to drive a larger display while balancing battery life and the weight of the glasses. But if it does, we may finally get our AR kicks without having to strap a heavy headset over our eyes.