Meta AI’s new smart glasses experiment can see what you do and tell you how you feel

(Image credit: Meta)
Researchers are using the glasses to train robots and build better AI systems, some of which could eventually be incorporated into consumer-friendly smart glasses. Unlike Meta's Ray-Bans, these glasses are currently research-only. However, they pack enough cameras, sensors, and processing power that some of what Meta learns will likely make its way into future wearables.

Project Aria research tools like the new glasses are aimed at people working on computer vision, robotics, or whatever blend of contextual AI and neuroscience catches Meta's eye. Developers can use the glasses to build more effective ways of teaching machines to navigate, contextualize, and interact with their environment.

The first Aria glasses were released in 2020. The Aria Gen 2 has far more advanced hardware and software: it's lighter, more accurate, and more powerful. It also looks more like the glasses people wear every day, though you still wouldn't mistake it for a normal pair.

Four computer-vision cameras cover an 80-degree arc around you and measure distance and depth, letting the glasses tell how far your coffee mug is from your keyboard, or where a drone's landing gear might be. The glasses also carry an ambient light sensor with an ultraviolet mode, a contact microphone that can pick up your voice in noisy environments, and a heart rate sensor embedded in the nosepad.

Future facewear

There's also plenty of eye-tracking technology that can tell where you're looking, when you blink, and how your pupils change. The glasses can track your hands and measure joint movement to help train robots or learn gestures. Put together, they can determine what you're looking at, what you're holding, and whether you have an emotional reaction to what you see. If you're holding an egg when you spot a sworn enemy, the AI could theoretically help you aim it.

As stated earlier, these are research instruments. Meta hasn't said when, or if, these tools will be available for consumers to buy. Researchers must apply for access, and the company will begin accepting applications later this year.

But the implications are much larger. Meta's smart glasses plans go beyond checking messages: the company wants to teach machines to interact with the real world the way humans do. Theoretically, robots trained this way could look at, listen to, and interpret the environment around them just as we do.


The Aria Gen 2 may not be on store shelves tomorrow, but it's closer than you think. It's only a matter of time before the general public can wear that powerful AI brain on their face, remembering where your keys are and dispatching a robot to retrieve them.


Eric Hal Schwartz is a freelance writer for TechRadar with more than 15 years of experience covering the intersection of technology and the world. For five years, he was head writer at Voicebot.ai, where he was at the forefront of reporting on large language models and generative AI. Since then, he has become an expert on generative AI products, including OpenAI's ChatGPT, Anthropic's Claude, Google Gemini, and other synthetic media tools. His experience spans print, digital, and broadcast media as well as live events. He continues to tell the stories people want and need to hear about the rapidly changing AI space and its impact on their lives. Eric is based in New York City.
