Meta announced on Monday that its Ray-Ban Meta smart glasses now feature AI video capabilities and real-time language translation.
These upgrades, first unveiled at Meta’s annual Connect conference in September, are included in the v11 software update, which began rolling out this week for users in its “Early Access Program.”
The latest update enhances Meta’s AI chatbot assistant, enabling the smart glasses to process visual data and answer user questions in real time. The glasses also now support speech translation between English and Spanish, French, or Italian. According to Meta, users can hear translations via the glasses’ open-ear speakers or view transcripts on their phones, allowing two-way conversations across languages.
Another new feature is the integration of Shazam, allowing users in the U.S. and Canada to identify songs directly through the glasses.
Previously, Meta announced other AI-powered upgrades for the glasses, such as tools to set reminders and the ability to scan QR codes and phone numbers using voice commands.
The new features underscore Meta’s push to combine wearable technology with artificial intelligence, positioning the Ray-Ban Meta glasses as a versatile tool for everyday tasks and communication.