
Meta adds live translation, AI video to Ray-Ban smart glasses


(Reuters) – Meta Platforms said on Monday it has updated the Ray-Ban Meta smart glasses with AI video capability and real-time language translation functionality.

The Facebook parent, which first announced the features during its annual Connect conference in September, said the update is available to members of its “Early Access Program”.

The features are included in the v11 software update, which will begin rolling out on Monday.


The latest update adds video to Meta’s AI chatbot assistant, which allows the Ray-Ban smart glasses to process what the user is seeing and respond to questions in real time.

The smart glasses will now be able to translate speech in real time between English and Spanish, French or Italian.

“When you’re talking to someone speaking one of those three languages, you’ll hear what they say in English through the glasses’ open-ear speakers or viewed as transcripts on your phone, and vice versa,” Meta said in a blog post.

Meta also added Shazam, an app that lets users identify songs, to the smart glasses; the feature will be available in the U.S. and Canada.

In September, Meta said it was updating the Ray-Ban smart glasses with several new AI features, including tools for setting reminders and the ability to scan QR codes and phone numbers using voice commands.

(Reporting by Harshita Mary Varghese in Bengaluru; Editing by Vijay Kishore)
