These smart glasses will be able to capture audio and translate it during a real conversation. (REUTERS/Manuel Orbegozo)
Meta and Ray-Ban have announced a series of new features for the smart glasses the two companies are developing together. Many of the changes focus on artificial intelligence functions and real-time translation.
The updates were revealed during Meta Connect 2024, the company’s annual event, along with a new design that adds to the colors already available on the market.
One of the most notable additions is a new limited edition with a transparent frame. This retro design, reminiscent of iconic devices such as the Game Boy Color, has caught the attention of consumers looking for something more than just sunglasses. With only 7,500 units available, this edition sells for $429, $100 more than the standard black version. The transparent frame leaves the internal electronic components visible, a detail that underlines the device’s technological sophistication.
Now these devices will have a transparent design in which their components will be visible. (Meta)
In addition, Meta has announced that it will soon launch a line of Transitions lenses, in collaboration with EssilorLuxottica, which will increase the versatility of the glasses, allowing for more comfortable use in different lighting situations.
One of the most anticipated updates is the integration of real-time translation. Until now, the Ray-Ban Meta could only translate text from still images, but Meta has promised that, in the coming months, the glasses will be able to translate live conversations across several languages, including English, Spanish, French and Italian. This feature will be especially useful for frequent travelers and people who interact in multilingual environments.
When this feature is available, users will be able to speak to someone in another language and hear the translation directly through the speakers built into the glasses. Although the list of languages is limited at launch, Mark Zuckerberg has noted that the Meta team is already working to add more languages in the future.
Another key development in the evolution of these glasses is the addition of the “Reminders” function. This tool lets the glasses capture an image of what the user is seeing and then send a notification to the phone to recall that moment, a kind of visual memory that promises to change how users interact with their environment. You will no longer need to take notes or rely on memory to remember something important: the glasses will do it for you.
What’s more, this AI feature goes a step further by allowing the glasses to remember essential details, such as where you parked your car, or to set voice reminders for future actions, all without taking your phone out of your pocket.
Meta has also optimized its virtual assistant, Meta AI, which now allows for more fluid and natural interaction. By simply saying “Hey, Meta”, users can start a conversation and keep it going without repeating the activation command.
Similarly, the AI has been refined to interpret the user’s visual environment more accurately, without the need for additional verbal commands such as “look and…”. This means the glasses can identify objects and provide information instantly, significantly improving the user experience compared to previous versions.
Finally, one of the most anticipated features is the ability to process video in real time. With this update, a user exploring a new city will be able to ask Meta AI to suggest places of interest or guide them on a personalized tour based on what they are seeing through the glasses. The feature will also be useful in everyday activities, such as grocery shopping, where the AI can help decide what to prepare for lunch based on the ingredients in sight.
Keynote USA