Meta updates Ray-Ban smart glasses with AI video and live translation

The new features include video support for Meta’s AI chatbot assistant, enabling the glasses to process the user’s surroundings and respond to questions in real time

Meta Platforms announced on Monday that it has updated its Ray-Ban Meta smart glasses with AI video capabilities and real-time language translation.

The updates are part of the v11 software rollout, which started on Monday and is available to members of Meta’s “Early Access Program.”

The new features include video support for Meta’s AI chatbot assistant, enabling the glasses to process the user’s surroundings and respond to questions in real time. The smart glasses can now also translate speech in real time between English and Spanish, French, or Italian.

Meta explained that users will hear translations through the glasses’ open-ear speakers or view transcripts on their phones, and that translation works in both directions. Additionally, Meta has integrated Shazam into the smart glasses, allowing users in the U.S. and Canada to identify songs.

Meta first announced these updates at its annual Connect conference in September, where it introduced several AI features, including tools for setting reminders and the ability to scan QR codes and phone numbers using voice commands.

Monitoring Desk
