Meta is rolling out live AI and Shazam integration on its smart glasses



The Ray-Ban Meta smart glasses already worked well as a head-mounted camera and a pair of open-back headphones, but now Meta is adding live AI that works without a wake word, live translation between several languages, and Shazam integration for music recognition.

Most of these features were first announced in September. Live AI lets you start a “live session” with Meta AI that gives the assistant access to everything you see and lets you ask questions without having to say “Hey Meta.” If you need your hands free to cook or fix something, Live AI should keep the smart glasses useful even while you focus on the task in front of you.

Live translation lets the smart glasses translate between English and French, Italian, or Spanish. When live translation is enabled and someone speaks to you in one of those languages, you’ll hear what they say in English through the glasses’ speakers or see it as transcribed text in the Meta View app. You’ll have to download a language pack for each language pair ahead of time, and live translation must be enabled before the glasses can act as an interpreter, but it feels more natural than grabbing your phone to translate something.

With Shazam integration, the Meta smart glasses will also be able to identify songs playing around you. A simple “Hey Meta, what is this song?” will prompt the glasses’ microphones to recognize what you’re listening to, just like using Shazam on your smartphone.

All three updates push the wearable toward Meta’s ultimate goal of a true pair of augmented reality glasses that could replace your smartphone, an idea the Ray-Ban Metas offer a realistic preview of. Pairing AI with VR and AR seems to be an idea many tech giants are kicking around as well: Google’s latest XR platform is built on the notion that generative AI like Gemini could be the glue that makes virtual or augmented reality compelling. We’re still years away from any company convincingly altering your field of vision with 3D images, but in the meantime smart glasses seem like a fairly useful stopgap.

All owners of Ray-Ban Meta Smart Glasses will be able to enjoy Shazam integration as part of the Meta v11 update. To get live translation and live AI, you’ll need to be part of Meta’s early access program, which you can join now.



