At the Meta Connect event earlier today, Mark Zuckerberg showed off a host of new features on the company’s flagship Meta Ray-Ban smart glasses. Calling the glasses “the perfect form factor for AI,” Zuckerberg said the new quality-of-life improvements center on the glasses’ multimodal AI for more natural interaction (similar to what we saw with Google’s Gemini and OpenAI’s GPT-4o).
Also: Everything announced at Meta Connect 2024: Affordable Quest 3, AR glasses, and more
But beyond improvements to communication, the glasses’ multimodal AI allows for some interesting new interactions, giving them the ability to “see” what you see and “hear” what you hear with less context needed on the user’s end.
One of the most useful features is the glasses’ ability to “remember” things for you — taking note of specific numbers or visual indicators to file away for later. Here’s a breakdown of everything that will be rolling out soon.
1. Translations on the fly
Similar to other live translation technologies that have emerged this year, the Meta Ray-Bans will be getting a live translation feature designed to work in real time (or at least close to it) with Spanish, French, and Italian. During the event, Zuckerberg demonstrated a conversation with a Spanish speaker, and the glasses translated each speaker’s words between Spanish and English within seconds of each line.
Of course, not every conversation will involve two people wearing smart glasses, so the company is letting users sync the glasses’ output with the Meta companion app, using the smartphone’s screen to display translations.
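For a sense of the mechanics, here’s a minimal Python sketch of the listen-translate-speak loop a feature like this implies. Meta hasn’t published an SDK for the glasses, so every helper below is a hypothetical stub with canned data; a real build would swap in actual speech-to-text, translation, and text-to-speech services.

```python
# A rough sketch of one pass of a live-translation loop. All helpers are
# hypothetical stubs -- Meta has not published an API for the glasses.

def transcribe(audio: bytes, lang: str) -> str:
    # Stub: a real implementation would call a speech-to-text model.
    return "¿Dónde está la estación de tren?"

def translate(text: str, source: str, target: str) -> str:
    # Stub: a real implementation would call a translation model.
    return "Where is the train station?"

def speak(text: str) -> None:
    # Stub: a real implementation would synthesize audio in the earpiece.
    print(f"[glasses] {text}")

def handle_utterance(audio: bytes, source_lang: str = "es", target_lang: str = "en") -> None:
    """Hear a line, translate it, read it back."""
    text = transcribe(audio, source_lang)
    speak(translate(text, source_lang, target_lang))

handle_utterance(b"raw-mic-audio")  # prints the English translation
```

The companion-app option Meta described would amount to swapping speak() for rendering the translated text on the phone’s screen.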
Also: Meta Ray-Ban Smart Glasses review: The best AI-powered AR glasses to buy right now
In addition to the glasses’ new features, Meta teased a new AI translation tool for Instagram Reels that automatically translates audio into English and then uses AI to sync the speaker’s mouth movements to match the English translation. The result, in the demo at least, was a natural-looking video in English using a sample of the speaker’s own voice.
This feature is still in its early stages and is available only in Spanish on Instagram and Facebook while Meta continues to test the technology.
2. The glasses can now ‘remember’ things
The demo also showed off the glasses’ “photographic memory” by solving a problem we’ve all had: remembering where we parked. The user looked at the number on the parking spot and simply said, “Remember where I parked.”
Later, asking the glasses, “Hey Meta, where did I park?” prompted the AI to respond with the parking space number. This kind of on-the-fly “filing away” of knowledge is an example of using what the AI is best at: recalling specific data in a pre-defined context. We’ll have to test for ourselves how reliable the feature is when the information carries fewer obvious visual cues.
Other uses for this feature are easy to imagine, from grocery lists to event dates and phone numbers.
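Under the hood, a feature like this boils down to a store-and-recall pattern. The Python sketch below is purely illustrative, since Meta hasn’t described how memories are actually stored; simple keyword matching stands in for whatever retrieval the assistant really uses.

```python
# Illustrative store-and-recall pattern; not Meta's actual implementation.
import string
from datetime import datetime

memories: list[dict] = []

def _words(text: str) -> set[str]:
    # Lowercase and strip punctuation so "park?" can match "parked" below.
    cleaned = text.lower().translate(str.maketrans("", "", string.punctuation))
    return set(cleaned.split())

def remember(description: str) -> None:
    """File away a snippet of what the user saw or said."""
    memories.append({"text": description, "when": datetime.now()})

def recall(query: str) -> str:
    """Return the most recent memory sharing a keyword with the query."""
    terms = [w for w in _words(query) if len(w) > 3]  # skip filler words
    for memory in reversed(memories):  # newest first
        if any(m.startswith(t) for t in terms for m in _words(memory["text"])):
            return memory["text"]
    return "I don't have a memory matching that."

remember("parked in spot B23 on level 2")
print(recall("Hey Meta, where did I park?"))  # -> parked in spot B23 on level 2
```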
3. Next-level multimodality
Previously, you’d have to say “Hey Meta” to invoke the glasses’ AI, then wait for the prompt to begin your inquiry. Now, you can simply ask the glasses questions in real time, even while in motion, and the multimodal AI will analyze what you’re seeing or hearing.
Also: Meta’s new 512GB Quest 3 deal may be the best VR headset offer right now
One demo showed a user peeling an avocado and asking, “What can I make with these?” without specifying what “these” referred to. Another showed a user digging through a closet, pulling out multiple items of clothing at once and asking the AI to help style an outfit in real time. And as with other popular voice assistants, you can interrupt Meta AI at any point while conversing with it.
Along the same lines, the glasses’ multimodal capabilities extend beyond static analysis of what’s in view. They can recognize things like URLs, phone numbers you can call, and QR codes you can scan instantly.
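The QR piece, at least, is easy to approximate with off-the-shelf tools. Here’s a short Python sketch using OpenCV’s built-in QR detector as a stand-in for whatever recognition pipeline the glasses actually run, which Meta hasn’t detailed (requires `pip install opencv-python`).

```python
# Scan a single camera frame for a QR code with OpenCV's built-in detector,
# a stand-in for the glasses' unpublished recognition pipeline.
import cv2

def scan_frame(image_path: str) -> None:
    frame = cv2.imread(image_path)
    if frame is None:
        raise FileNotFoundError(f"Could not read {image_path}")

    # detectAndDecode returns the decoded string (empty if nothing was found),
    # the corner points, and a rectified image of the code itself.
    data, points, _ = cv2.QRCodeDetector().detectAndDecode(frame)
    if data:
        print(f"QR code found: {data}")  # on the glasses, presumably an open-URL prompt
    else:
        print("No QR code in view.")

scan_frame("frame.jpg")  # "frame.jpg" is a hypothetical captured frame
```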
4. ‘Be My Eyes’ partnership
Lastly, Zuckerberg demoed a clever new accessibility feature built on Be My Eyes, an existing program that connects blind and vision-impaired people with sighted volunteers through live video. Using the glasses, a vision-impaired user can broadcast what they see to a volunteer on the other end, who can talk them through the details of what they’re looking at.
The demo showed a woman looking at a party invitation with dates and times, but real-world uses could include anything from reading signs to shopping for groceries to navigating a tech gadget.
Also: Google co-founder on the future of AI wearables (and his Google Glass regrets)
Finally, Zuckerberg showed off some new designs, including a limited edition of the Ray-Bans with clear, transparent frames, as well as new transition lenses, effectively doubling their usability as both sunglasses and prescription glasses.
The Meta Ray-Bans start at $300 and come in nine different frame designs and a new limited-edition transparent style.