Meta adds live translation and AI video to Ray-Ban smart glasses – 12/16/2024 at 7:05 p.m.


Meta Platforms META.O said on Monday it has updated Ray-Ban Meta smart glasses with AI video capability and real-time language translation functionality.

Facebook’s parent company, which first announced the features at its annual Connect conference in September, said the update is available to members of its Early Access Program.

These features are included in software update v11, which will begin rolling out on Monday.

The latest update adds video to Meta’s AI chatbot assistant, allowing Ray-Ban smart glasses to process what the user sees and answer questions in real time.

The smart glasses will now be able to translate speech between English and Spanish, French or Italian in real time.

“When you talk to someone who speaks one of these three languages, you hear what they say in English through the speakers of the glasses or as a transcription on your phone, and vice versa,” Meta explained in a blog post.

Meta also added Shazam, an app that lets users identify songs, to the smart glasses; the feature will be available in the United States and Canada.

In September, Meta said it was updating the Ray-Ban smart glasses with several new artificial intelligence features, including tools for setting reminders and the ability to scan QR codes and call phone numbers using voice commands.
