Meta Platforms announced on Monday that it has updated its Ray-Ban Meta smart glasses with AI video capabilities and real-time language translation.
The updates are part of the v11 software rollout, which started on Monday and is available to members of Meta’s “Early Access Program.”
The new features include video support for Meta's AI chatbot assistant, enabling the glasses to process the wearer's surroundings and answer questions about them in real time. The smart glasses can now also translate live speech between English and Spanish, French, or Italian.
Meta explained that users will hear translations through the glasses' open-ear speakers or view transcripts on their phones, and the translation works in both directions. Additionally, Meta has integrated Shazam into the smart glasses, allowing users in the U.S. and Canada to identify songs.
Meta first announced these updates at its annual Connect conference in September, where it introduced several AI features, including tools for setting reminders and the ability to scan QR codes and phone numbers using voice commands.