What you need to know
- Early Access program members are getting three key new features for the Ray-Ban Meta smart glasses.
- “Live AI” will record video and offer AI insights based on what you see, a feature originally shown at Meta Connect 2024.
- “Live translate” will let you hear or see people’s Spanish, French, or Italian words in English.
- The v11 update also brings Shazam integration to the Ray-Ban Meta glasses.
Your Ray-Ban Meta smart glasses are about to get a major intelligence update, and you won’t have to wait if you’re an Early Access program member.
On Monday, Meta announced the v11 update, with Live AI as the standout feature. Until now, Meta AI could only respond to “Hey Meta” queries by taking a single photo and analyzing it. Now, Meta says, “Meta AI can see what you see continuously and converse with you more naturally than ever before.”
During a Live AI session, the glasses will record video and analyze what they see in real time. You can ask questions without the Hey Meta wake word, and the AI will remember things you asked earlier. You can also “interrupt anytime to ask follow-up questions or change topics,” and Meta promises that the AI will proactively “give useful suggestions even before you ask.”
If that sounds familiar, it’s because Google’s Gemini Live offers similar conversational AI, with interruptions and remembered queries. But that’s tied to your Android phone, whereas Meta’s Live AI will let you ask questions without needing to hold anything.
During Meta Connect 2024 in September, Meta CEO Mark Zuckerberg gave an example of using Live AI to look at your closet and ask which clothes would best fit a Roaring ’20s-themed costume party. We’ll have to test how well it works in practice, but the concept is intriguing and a big step up for the smart glasses.
The second v11 feature is Live translate, which Zuckerberg also demoed at Meta Connect 2024. As the name suggests, you can speak English with someone speaking Spanish, French, or Italian, and either read their translated responses on your phone or hear them in English through the Ray-Ban Meta glasses’ speakers.
Google has offered live translation for years, and in more languages. Meta’s version stands out for its hands-free option, but judging by Zuckerberg’s Connect demo, the translation delay will still make conversations a bit awkward.
The final v11 addition is Shazam integration. You’ll simply say, “Hey Meta, Shazam this song,” and the glasses will identify whatever music is playing around you.
If you want to test these features today, you’ll need to sign up for the Early Access program; all you really need to do is opt in from the Settings menu of the Meta View app. If you’d rather wait, the stable version should arrive in early 2025.
Meta has pushed out several useful AI updates in recent months, such as the ability to ask Meta AI to “remember” something, like where you parked, and have it photograph the relevant thing and create a reminder for later.
With its Orion AR glasses on the horizon, Meta is wisely pushing out iterative AI updates on its display-free smart glasses for now, getting Meta AI ready for the next generation and challenging Google Gemini and its nascent Android XR platform.
The Ray-Ban Meta smart glasses are our top pick for the best smart glasses available. Even without AI, they’re comfortable and stylish, let you stream playlists without earbuds, and capture surprisingly good photos while your phone stays in your pocket. And Meta AI is only going to get smarter in 2025.