
New Meta Ray-Ban AI features roll out, making the smart glasses even more tempting


At the Meta Connect event last week, Mark Zuckerberg showed off many new features coming to the company's flagship Meta Ray-Ban smart glasses. Calling the glasses "the perfect form factor for AI," Zuckerberg centered the quality-of-life improvements on the glasses' multimodal AI for more natural interaction (similar to what we've seen from Google's Gemini and OpenAI's GPT-4o).

Some of the newly announced features start rolling out today, including the glasses' much-hyped ability to "remember" things for you. In an Instagram post yesterday, Zuckerberg demonstrated the feature while looking hopelessly lost in a parking garage. He asks the glasses to remind him where he parked, and they cheerfully reply, "You parked in spot 304."

Also: Meta Ray-Ban Smart Glasses review: The best AI-powered AR glasses to buy right now

If you already have the glasses, just make sure you have the latest version of the Meta View app (version 186) to start receiving the features as they roll out. As of today, only the glasses' ability to "remember" for you, the improvements to invoking the AI with "Hey Meta," the ability to scan QR codes, and the ability to send voice messages via WhatsApp and Messenger are live.

Other features, like the new live translation ability, multimodal video, and the Be My Eyes partnership, aren't part of today's rollout but are coming soon. Here's a closer look at all the features and improvements announced last week.

1. Translations on the fly 


Similar to other live translation technologies we've seen emerge this year, the Meta Ray-Bans will soon get a live translation feature designed to work in real time (or at least close to it) with Spanish, French, and Italian. During Meta Connect, Zuckerberg demonstrated a conversation with a Spanish speaker, with the glasses translating each line between Spanish and English within seconds of it being spoken.

Of course, not every conversation will involve two users wearing smart glasses, so the company is letting users sync the glasses' output with the Meta companion app, using the smartphone screen to display translations.

Also: Google’s AI podcast tool transforms your text into stunningly lifelike audio – for free

In addition to the glasses’ new features, Meta also teased its new translation AI tool for Instagram Reels that automatically translates audio into English and then uses AI to sync the speaker’s mouth movements to match the English translation. The result — in the demo at least — was a natural-looking video in English using the speaker’s own voice sample. 

So far, this feature is in its early stages and is available only for Spanish on Instagram and Facebook while Meta continues to test the technology.

2. The glasses can now ‘remember’ things


The demo in Zuckerberg's recent Instagram post also showed off the glasses' "photographic memory" by solving a problem we've all had: remembering where we parked. The user looked at the number on the parking spot and simply said, "Remember where I parked."

Later, asking the glasses, "Hey Meta, where did I park?" prompted the AI to respond with the parking space number. This kind of on-the-fly "filing away" of knowledge plays to what the AI does best: recalling specific data in a pre-defined context. We'll have to test for ourselves how reliable the feature is when the information carries fewer visual cues.

Additional uses for this feature are easy to imagine: glancing at anything from grocery lists to event dates or phone numbers. The feature goes live today as part of the update rollout and should be available to users on both Android and iOS.

3. Next-level multimodality

Previously, you had to say "Hey Meta" to invoke the glasses' AI, then wait for the prompt before beginning your inquiry. Now, you only need to say it once to kick off the conversation, and you won't have to keep repeating it for follow-up questions. And since the multimodal AI is "always on," you won't need to explicitly tell the glasses to "look at" a specific object.

Also: Meta’s new 512GB Quest 3 deal may be the best VR headset offer right now

One demo showed a user peeling an avocado and asking, "What can I make with these?" without specifying what "these" referred to. Another showed a user rummaging through a closet and pulling out multiple items of clothing at once, asking the AI to help style an outfit in real time. And as with other popular voice assistants, you can always interrupt Meta AI while conversing with it.

Along the same lines, the glasses' multimodal capabilities extend beyond static analysis of what's in view. The glasses can recognize things like URLs, phone numbers you can call, and QR codes you can scan instantly.

4. ‘Be My Eyes’ partnership

Lastly, Zuckerberg demoed a clever new accessibility feature of the glasses. Be My Eyes is an existing program that connects blind and low-vision people with sighted volunteers through live video. With this partnership, blind and vision-impaired users can use the glasses to broadcast what they see to a volunteer on the other end, who can talk them through the details of what they're looking at.

The demo showed a woman looking at a party invitation with dates and times, but real-world uses could be essentially anything from reading signs to shopping for groceries to navigating a tech gadget.

Also: Google co-founder on the future of AI wearables (and his Google Glass regrets)

Finally, Zuckerberg showed off some new designs, including a limited edition of the Ray-Bans with clear, transparent frames, as well as new transition lenses, effectively doubling their usability as both sunglasses and prescription glasses.

The Meta Ray-Bans start at $300 and come in nine frame designs, plus the new limited-edition transparent style. The updates are rolling out today for both Android and iOS users with the latest version of the Meta View app.
