Meta details plans to build the metaverse (and put Siri and Alexa to shame)

Posted on: Feb 24, 2022

In its deep-dive, two-hour-plus video explanation of how it sees the metaverse operating in the future, Meta offered 2,000-plus online listeners both high-level descriptions and details on several specific areas of this proposed new world. They included how the Facebook-led company is using AI and machine learning in the metaverse for research, product development, running a universal language translator, giving personal assistants human-level intelligence, and establishing responsible use of AI and all the personal data that goes with it.

CEO Mark Zuckerberg led off with a 16-minute high-level overview of the day’s session, noting several times that his company is placing a high priority on building the metaverse with a “responsible” approach to data stewardship, an area where past lapses cost Facebook credibility. Eight presentations followed in the 140-minute session.

How Meta plans to beat Siri, Alexa and Google

Personalized assistants that understand people and let them control the flow of conversation can make people’s lives easier and pave the way to smarter devices — at home or on the go. But today, in 2022, they still leave a lot to be desired in how well they understand requests and in the speed and accuracy of their answers.

“Assistants today — whether via voice or chat — are generally underwhelming,” Meta conversational AI tech lead Alborz Geramifard said. “There are several reasons why, starting with how they are engineered. We’re sharing the challenges developers and engineers face when attempting to build useful assistants and how we can navigate these challenges as we build for the metaverse.”

Zuckerberg’s hope for his company is to build a personal assistant that puts Siri, Alexa, and Google to shame. While Meta hasn’t picked out a name for it yet, Zuckerberg said Meta wants its voice assistant to be more intuitive: picking up contextual clues in conversations, along with other data points that it can collect about our bodies, such as where our gaze is going, facial expressions, and hand gestures.

“To support true world creation and exploration, we need to advance beyond the current state of the art for smart assistants,” Zuckerberg said. “When we have glasses on our faces, that will be the first time an AI system will be able to really see the world from our perspective — see what we see, hear what we hear, and more. So, the ability and expectation we have for AI systems will be much higher.”

Meta’s team appears to be up for those challenging tasks. During the presentation, Meta also introduced Project CAIRaoke, which Geramifard described as “breakthrough research that aims to make assistants more helpful and interactions with them more enjoyable. Project CAIRaoke is an AI model created for conversational agents. It works end-to-end, combining the four existing models typically used by today’s assistants into a single, more efficient and flexible model.”

Rather than relying on scripted conversations delivered by applications, Project CAIRaoke leverages years of advances in natural language processing to deliver interactions that are deeply contextual and personalized, with the user in charge of the conversation flow, Geramifard said.
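The contrast Geramifard describes can be sketched in code. The following is a toy illustration, not Meta's actual system: the function and component names are hypothetical stand-ins for the four models a conventional assistant typically chains together (understanding, state tracking, policy, generation), set against a single end-to-end model that maps the conversation directly to a response.

```python
# Hypothetical sketch of the two assistant architectures described above.
# All names and behaviors are illustrative, not Meta's actual API.

def nlu(utterance):
    # 1. Natural-language understanding: extract an intent from the request.
    return {"intent": "set_reminder", "text": utterance}

def track_state(history, parsed):
    # 2. Dialog state tracking: fold the new turn into the running state.
    return {"turns": len(history) + 1, "intent": parsed["intent"]}

def dialog_policy(state):
    # 3. Dialog policy: choose the assistant's next action.
    return "confirm_" + state["intent"]

def nlg(action):
    # 4. Natural-language generation: render the action as text.
    return f"Okay, I will {action.replace('_', ' ')}."

def pipeline_assistant(utterance, history):
    """Conventional design: four separately trained models chained together,
    so an error in any one stage propagates through the rest."""
    return nlg(dialog_policy(track_state(history, nlu(utterance))))

class EndToEndModel:
    """Stand-in for one neural model that maps the whole conversation
    directly to a response, as the end-to-end approach does."""
    def generate(self, turns):
        return f"(single model replies after {len(turns)} turns)"

def end_to_end_assistant(utterance, history, model=EndToEndModel()):
    """End-to-end design: no hand-built intermediate stages to maintain."""
    return model.generate(history + [utterance])
```

The appeal of the end-to-end design, per the presentation, is that collapsing the four stages into one model removes the hand-off points where errors compound, yielding a more efficient and flexible assistant.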