Meta Puts AI In Ray-Ban Smart Glasses

Mark Zuckerberg, CEO of Meta, shared a new post on his Instagram account that gives us a first look at how his new multi-modal Meta AI model can “see and hear” using Ray-Ban Meta smart glasses.

Zuckerberg demonstrated the AI capabilities of the smart glasses himself: he asked the system to match trousers to a shirt he was holding in his hands. The AI described the shirt and suggested a couple of trouser options that could complement it. Zuckerberg also showed the glasses’ camera a humorous picture captioned in Spanish and asked the system to translate the caption into English. The AI was likewise able to identify an unfamiliar fruit and came up with a funny caption for a photo of a dog in a suit.

Meta CTO Andrew Bosworth also described new functions of the device: in particular, it can suggest captions for photographs the user has taken. Testing of the AI features will be limited to the United States and to a small group of users who must apply to join the program.

Natalia Ganeva

Natalia Ganeva is a young and enthusiastic technology journalist who brings a fresh perspective to the tech reporting landscape. Natalia's articles and features showcase her dedication to staying abreast of the latest tech trends and her ability to convey complex topics in an accessible manner.