One of Meta’s biggest investments in AI is the development of a system designed to power Facebook’s entire video recommendation engine across all of its platforms, a company executive said Wednesday.
Tom Alison, the head of Facebook, said that building an AI recommendation model capable of powering the company’s TikTok-like Reels short-video service as well as more traditional, longer videos is part of Meta’s “technology roadmap that goes to 2026.”
Speaking on stage at Morgan Stanley’s tech conference in San Francisco, Alison said that until now Meta has used separate recommendation models for each of its products, such as Reels, Groups, and the core Facebook Feed.
As part of its big push into artificial intelligence, Meta has been investing billions of dollars in Nvidia graphics processing units, or GPUs. These chips are now the workhorses that AI researchers use to train the kinds of large language models that underpin generative AI products such as OpenAI’s well-known ChatGPT chatbot.
According to Alison, “phase 1” of Meta’s technology roadmap involved migrating the company’s existing recommendation systems from more conventional computer chips to GPUs in order to improve overall product performance.
As interest in LLMs surged last year, Meta executives were struck by how these large AI models could “handle lots of data and all kinds of very general-purpose types of activities like chatting,” Alison said. Seeing an opportunity to build a giant recommendation model that could be applied across many products, Meta developed “this kind of new model architecture” last year and tested it on Reels, he said.
This new model architecture helped Facebook achieve “an 8% to 10% gain in Reels watch time” on the core Facebook app, Alison said, which demonstrated that the model was “learning from the data” much better than the previous generation.
“We’ve really focused on kind of investing more in making sure that we can scale these models up with the right kind of hardware,” he said.
Meta is now in the “phase 3” of its system re-architecture, in which it is working to validate the technology and deploy it across multiple products.
“Instead of just powering Reels, we’re working on a project to power our entire video ecosystem with this single model, and then can we add our Feed recommendation product to also be served by this model,” Alison said. “If we get this right, not only will the recommendations be kind of more engaging and more relevant, but we think the responsiveness of them can improve as well.”
Describing how the system would work if successful, Alison said: “If you see something that you’re into in Reels, and then you go back to the Feed, we can kind of show you more similar content.”
Meta has amassed an enormous stockpile of GPUs, Alison said, which will also support the company’s broader generative AI initiatives, such as the development of digital assistants.
One generative AI project Meta is considering would add more sophisticated chat capabilities to its main Feed. After seeing a “recommended post about Taylor Swift,” for example, a user could “easily just click a button and say, ‘Hey Meta AI, tell me more about what I’m seeing with Taylor Swift right now.’”
Meta is also experimenting with integrating its AI chat feature into Facebook Groups, so that a member of a Facebook group about baking, for instance, could ask a question about desserts and receive a response from a digital assistant.
“I believe there is a chance to integrate generative AI into a multiplayer consumer setting,” Alison said.