Looking forward: The race for the future of wearable technology is heating up, with smart glasses emerging as the next major frontier. While Meta's Ray-Ban partnership has already made waves in the tech world, giants such as Apple, Samsung, and Google are developing their own projects. Google recently gave the public its most tangible look yet at Android XR-powered smart glasses in a live demo at the TED2025 conference.
Until recently, Google's Android XR smart glasses had only been seen in carefully curated teaser videos and limited hands-on demos shared with select publications. These early glimpses hinted at the potential of integrating artificial intelligence into everyday eyewear, but they left open questions about real-world performance. That changed when Shahram Izadi, who leads Google's Android XR effort, and his colleague Nishtha Bhatia demonstrated the prototype glasses live on the TED stage.
The live demo showcased a number of features that set these glasses apart from previous smart eyewear attempts. At first glance, the device looks like a pair of ordinary glasses, yet it is packed with advanced technology, including a miniature camera, speakers, microphones, and a high-resolution color display embedded in the lens.
The glasses are lightweight, discreet, and can be fitted with prescription lenses. They can also connect to a smartphone to access a wider range of apps and take advantage of its processing power.
Izadi began the demo by using the glasses to display his speaker notes on stage, illustrating a practical, everyday use case. The real highlight, however, was the integration of Google's Gemini AI assistant. In a series of live interactions, Bhatia demonstrated how Gemini could generate a haiku on demand, recall the title of a book glimpsed just moments earlier, and locate a misplaced hotel key card, all through simple voice commands and real-time visual processing.
But the glasses' capabilities extend well beyond these parlor tricks. The demo also featured on-the-fly translation: a sign was translated from English to Farsi, then seamlessly switched to Hindi when Bhatia addressed Gemini in that language, without any manual settings change.
Other demonstrations included visual explanations of charts, contextual object recognition (such as identifying an album and playing a song from it), and heads-up navigation using a 3D map overlay projected into the wearer's field of vision.
The Android XR platform, developed by Google in collaboration with Samsung and Qualcomm, was unveiled last December as an open operating system for extended-reality devices. It lets users experience familiar Google apps in immersive environments: YouTube and Google TV appear on virtual big screens, Google Photos can be viewed in 3D, Google Maps offers an immersive view, and Chrome supports multiple floating windows. Users interact through visual cues, voice commands, and hand gestures. The platform is also compatible with existing Android applications, ensuring a robust app ecosystem from the start.
Samsung, meanwhile, is preparing to launch its own smart glasses, codenamed Haean, later this year. The Haean glasses, which resemble regular sunglasses and incorporate gesture-based controls via cameras and sensors, are designed for comfort and subtlety.
While the final specifications are still being finalized, the glasses are expected to feature integrated cameras, a lightweight frame, and Qualcomm's Snapdragon XR2+ Gen 2 chip. Video recording, music playback, and voice calling are also reportedly under consideration.