In the corner of a bustling showroom at the Consumer Electronics Show 2025, I felt like the conductor of an orchestra. As I waved my arm from side to side, notes rang out from the cello displayed on the giant screen in front of me. The faster I moved my arm, the faster the bow slid across the strings. I even got a round of applause from my fellow booth attendees after a particularly fast performance.
This is what it felt like to use the Mudra Link wristband, which lets you control devices using gestures. Motion controls aren't new; I remember using touch-free controls as far back as 2014 with devices like the Myo armband. What's different now is that such gadgets have a greater reason to exist thanks to the arrival of smart glasses, which were seemingly everywhere at CES 2025.
Startups and big tech companies alike have been trying to make smart glasses for more than a decade. But the arrival of artificial intelligence models that can process speech and visual input at the same time has made them feel more relevant than ever. After all, digital assistants could be far more useful if they could see what you see and answer questions in real time; that's the idea behind Google's Project Astra prototype glasses. Smart glasses shipments were expected to grow 73.1% in 2024, according to a September IDC report, which also indicates that tech-equipped glasses are starting to catch on.
Last fall, Meta showed off a prototype pair of its own AR glasses, called Orion, controlled by gestures and a neural input wristband. At last year's Augmented World Expo conference, other startups showed off similar experiments.
At CES, it became clear that companies are thinking a lot about how people will control these devices in the future. In addition to the Mudra Link wristband, I found a couple of other wearables designed to work with glasses.
Take the Afference Ring, for example, which applies neural haptics to your finger to provide tactile feedback when using gesture controls. It's intended for devices like smart glasses and headsets, but I tried a prototype paired with a tablet just to get a feel for how the technology works.
In one demo, I played a simple mini-golf game that required me to pull my arm back to wind up a shot, then release it to send the ball rolling. The further I pulled back, the stronger the pulses I felt on my finger. The experience of adjusting the brightness and volume sliders was similar: as I increased the brightness, the sensation in my finger became more pronounced.
The Afference Ring provides haptic feedback on your finger.
It was a simple demo, but it helped me understand how companies might approach adding haptic feedback to menus and apps in mixed reality. Afference didn't mention any specific partners it's working with, but it's worth noting that Samsung Next participated in Afference's seed funding round. Samsung launched its first smart ring for health tracking, the Galaxy Ring, last year, and announced in December that it's building the first headset to run on the newly announced Android XR platform for upcoming mixed reality devices.
The Mudra Link wristband works with the newly announced TCL RayNeo X3 Pro glasses, which will launch later this year. I briefly tried the wristband to scroll through the list of apps on the RayNeo glasses, but the software isn't finished yet.
I spent most of my time using the wristband to manipulate graphics on a giant demo screen at the conference. The cello example was the most convincing demo, but I was also able to grab, stretch, and move a cartoon character's face around the screen just by waving my hand and pinching my fingers.
Halliday's smart glasses, which were also unveiled at CES, work with an accompanying ring for navigation. Although I wasn't able to try the ring, I did briefly use the glasses for real-time language translation, with translated text instantly appearing in my field of vision even on a noisy showroom floor.
The Halliday smart glasses place a small screen in your field of vision, and you can navigate the device using the accompanying ring.
Without gestures, there are generally two primary ways to interact with smart glasses: touch controls on the device, and voice commands. The former is ideal for quick interactions, such as scrolling through a menu, launching an app, or rejecting a call, while the latter is useful for summoning and issuing commands to virtual assistants.
Gesture controls can make it easier to navigate these interfaces without raising your hand to your face, speaking out loud, or holding an external controller. Still, there's a degree of awkwardness to using gestures to control a screen that's invisible to everyone except the person wearing the glasses. I can only imagine how odd it would look to wave my hand around in public with no visible context.
Meta is already moving in that direction: its CTO, Andrew Bosworth, recently told CNET that gestures will likely be necessary for operating any future pair of display-equipped glasses.
If CES is any indication, 2025 is shaping up to be a big year for smart glasses — and gesture control will no doubt play a role in how we navigate these new spatial interfaces in the future.