Meta is integrating its upcoming next-generation AI model, Llama 3, into MetaAI, the virtual assistant in its Ray Ban smart glasses, to enhance performance.
This will enable the smart glasses to handle more complex tasks and respond more naturally, a major update that will transform the AI-powered glasses.
The multimodal features have been in early access since last December and can perform translations in addition to identifying objects, animals, and monuments.
A quick glimpse at the glasses
Users can activate the virtual assistant on the Ray Ban glasses by saying “Hey Meta,” then ask a question or give a prompt, and the assistant will respond via speakers built into the frames.
According to Meta, users can livestream from the glasses to the social media platforms Facebook and Instagram, using “Hey Meta” to engage with the firm’s “advanced conversational assistant,” MetaAI.
The Ray Ban glasses feature enhanced audio and cameras, in addition to more than 150 different custom frame and lens combinations, “and they are lighter and comfortable.”
An earlier report by NYT indicated that while the glasses correctly identified objects like artwork and pets, they were not accurate 100% of the time. They struggled to identify some zoo animals, as well as exotic fruits like cherimoya, even after several attempts.
The latest upgrades are therefore expected to bring an enhanced user experience to the smart glasses.
Compatibility with smaller gadgets
Now, the upcoming open-source Llama 3 is also expected to come in smaller sizes compatible with local devices such as mobile phones and smart glasses, hence the scope to integrate it into the Ray Ban glasses.
It will compete with other small models like Anthropic’s Claude Haiku and Google’s Gemini Nano. It will also compete with larger, “full response and reasoning-capable” models like Claude Opus and GPT-4, which it is reportedly expected to outperform.
According to Tom’s Guide, Meta teased participants at its London headquarters event with hints at what to expect from the company’s AI lineup, including Llama 3, the MetaAI assistant, and the Ray Ban glasses.
In 2023, the MetaAI assistant, powered by Llama 2, was upgraded to enable it to “see” through the camera of the glasses. This, according to Tom’s Guide, let it offer “sartorial and technical advice and that will only get better with Llama 3.”
Now, Llama 3 is expected to be capable of understanding a range of inputs, including speech, video, and spatial understanding of the real world.
Building on already existing AI features
Such enhanced immersive understanding is expected to give the third version of Meta’s model an improved perspective on the real world, enabling the company to improve the MetaAI assistant found in the Ray Ban smart glasses by building on the already available AI features.
Speaking at the London event, Meta’s chief AI scientist Yann LeCun declared that the smart glasses are now “his main glasses.”
“In the near future, every single one of our interactions with the digital world will be through AI assistants. Our entire digital diet will be mediated by AI systems,” he said.
MetaAI is also coming to Quest 3
The Ray Ban glasses are not the only device getting MetaAI: Meta is also expanding the virtual assistant to its Quest 3 VR headset. At the London event, the panelists also revealed that future versions of MetaAI will be integrated into Instagram and WhatsApp, “plus a mechanism in Facebook to make it easier to manage large groups.”
“MetaAI will become a general assistant that can take in information and provide support users want. It is coming to Quest 3 soon,” explained Joelle Pineau, VP of AI research at Meta.
She added that Meta’s goal is to “make Llama-powered Meta AI the most useful assistant in the world.”
While the virtual assistant is currently built on Llama 2, an upgrade to Llama 3 will give it improved reasoning and deeper understanding.