Meta’s Ray-Ban smart glasses gain 4 interesting new features

At last week’s Meta Connect event, Mark Zuckerberg showed off new features on the company’s flagship Meta Ray-Ban smart glasses. Calling the glasses the “perfect form factor for AI,” he explained that the improvements center on multimodal AI. The idea is to provide more natural interaction, similar to what we’ve already seen with Google’s Gemini and OpenAI’s GPT-4o.

But beyond the communication improvements, the glasses’ multimodal AI enables interesting new interactions, giving them the ability to “see” what you see and “hear” what you hear, with less context required from the user.

One of the most useful features is the glasses’ ability to “remember” things for you. How? By taking note of visual cues and filing them away for later recall.

Here’s a look at everything that’s coming soon.

1. On-the-fly translations

The Meta Ray-Ban glasses will include a translation feature designed to work in real time (or at least close to it) with Spanish, French, and Italian. At the event, Zuckerberg demonstrated a conversation with a Spanish speaker, and the glasses translated what each person said between Spanish and English in just a few seconds.

Of course, not every conversation will be between two people wearing the smart glasses. That’s why the company lets users sync the glasses’ output with the Meta companion app, relying on the smartphone to display translations.

A new AI translation tool for Instagram Reels

In addition to the new glasses features, Meta also introduced a new AI translation tool for Instagram Reels, which automatically translates audio into English and then uses AI to synchronize the speaker’s mouth movements with the English translation. The result, at least in the demo, was a natural-looking English video in the speaker’s own voice.

For now, this feature is in its early stages and is available only for Spanish-language content on Instagram and Facebook.

2. Glasses can now “remember” things

The demo also showed off the glasses’ “photographic memory” by solving a problem we’ve all encountered: remembering where you parked. The user looked at the parking spot number and simply said, “Remember where I parked.”

Later, when the user asked “Hey Meta, where did I park?”, the AI responded with the parking spot number.

This kind of on-the-fly filing of information is an example of what AI does best: recalling specific data in a predefined context.

Other examples of how this feature can be used are easy to imagine, from shopping lists to event dates to phone numbers.

3. Higher-level multimodality

Previously, you had to say “Hey Meta” to summon the glasses’ AI, then wait for the prompt before beginning your query. Now you can simply ask the glasses questions in real time, even while on the move, using the glasses’ multimodal AI to analyze what you see or hear.

  • One demo showed a user peeling an avocado and asking “What can I do with this?”, without specifying what “this” referred to.
  • Another demo showed a user rummaging through a closet and pulling out several items of clothing at once, asking the AI to help them style an outfit in real time.

As with other popular voice assistants, you can always interrupt Meta AI mid-conversation.

Along the same lines, the glasses’ multimodal capabilities go beyond simple static analysis of what’s in view. The glasses recognize things like URLs, phone numbers you can call, and QR codes you can scan instantly.

4. Be My Eyes partnership

Finally, Zuckerberg demonstrated a nifty new accessibility feature of the glasses. Blind and visually impaired people can use the glasses to transmit what they see to a person on the other end, who can explain in detail what they are looking at.

Be My Eyes is a program that is already up and running. The demonstration showed a woman looking at a party invitation with dates and times. The technology could be used for everything from reading signs to shopping for groceries.

Zuckerberg also presented some new models, including a limited-edition Ray-Ban with clear, transparent frames, along with new transition lenses that let the glasses double as both sunglasses and prescription eyewear.

The Meta Ray-Bans start at 300 euros and come in nine different frame styles, plus the new transparent limited edition.
