Mark Zuckerberg Plans to Deepen AI’s Presence in Our Online Lives

Six months into its massive AI transformation, Meta believes 2026 will be the year it starts reaping the benefits.
The tech giant has poured billions of dollars into Meta Superintelligence Labs, poaching top talent from the likes of OpenAI, Apple, and others in hopes of revitalizing its struggling AI efforts.
The company hopes to prove it’s finally reaping the rewards of that commitment with a slew of new AI models and products it will ship in the coming months, Meta CEO Mark Zuckerberg said on the company’s earnings call Wednesday. But don’t expect anything groundbreaking right away.
“I think this is going to be a long-term effort,” Zuckerberg said. “This is the journey we’re on.” The first set of things the company puts out, he added, will be more about showing the direction it is heading than delivering the finished product.
Beyond the model and product announcements, Meta is hoping that 2026 is the year it can use AI to make its existing offerings even more personal. To the average user, that will look like an Instagram feed of eerily well-targeted content, thanks to LLM-based recommendation systems that can understand “unique personal goals” and tailor feeds and ads accordingly.
“Today, our apps feel like algorithms that recommend content,” Zuckerberg said. “Soon, you’ll open our apps, and you’ll have an AI that understands you and will be able to show you great content or create great personalized content for you.”
These recommendation models will use the world knowledge and reasoning power of LLMs to make better predictions about what content you will like. Meta CFO Susan Li said it will especially help with newly posted content that has little engagement data to base recommendations on.
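Meta has not said how these models are built, but the cold-start idea Li describes is straightforward to sketch. The toy Python below is an illustration only, with a hypothetical embed() function standing in for an LLM’s content understanding and an arbitrary blending weight: it scores a post by mixing whatever engagement data exists with semantic similarity to a user’s interests, so a brand-new post with no engagement history can still be ranked.

```python
# Illustrative sketch only -- Meta has not published how these models work.
# embed() below is a toy stand-in for an LLM that maps text to a semantic vector.
import hashlib
import math
import random


def embed(text: str, dim: int = 64) -> list[float]:
    """Toy 'semantic' embedding: hashes each token into a fixed-size unit vector."""
    vec = [0.0] * dim
    for token in text.lower().split():
        seed = int(hashlib.md5(token.encode()).hexdigest(), 16) % (2**32)
        rng = random.Random(seed)
        for i in range(dim):
            vec[i] += rng.uniform(-1.0, 1.0)
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]


def score_item(user_interests: str, item_text: str,
               engagement_rate: float, impressions: int) -> float:
    """Blend observed engagement with content understanding.

    A brand-new post with few impressions is scored almost entirely by its
    semantic similarity to the user's interests; as engagement data
    accumulates, the measured engagement rate takes over.
    """
    semantic = sum(a * b for a, b in zip(embed(user_interests), embed(item_text)))
    weight = impressions / (impressions + 1000)  # how much to trust engagement data
    return weight * engagement_rate + (1 - weight) * semantic


# A freshly posted video with zero engagement history still gets a usable score.
print(score_item(
    user_interests="trail running and ultralight hiking gear",
    item_text="testing a 900 gram tent on a three-day alpine loop",
    engagement_rate=0.0,
    impressions=0,
))
```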
Since last month, the company has officially been using AI chat history to inform highly targeted ads and posts across its platforms, except in the European Union, where strict consumer-protection rules keep it from serving personalized ads that way.
Even without those recommendation upgrades, AI is already “driving more time spent on Instagram,” Li said, pointing to AI-translated videos dubbed into local languages.
“Hundreds of millions of people watch AI-translated videos every day,” Li said.
The personalization push will extend to Meta AI, the company’s assistant, as well. The more personalized its responses, the more engaged users are with the AI, Li said. But that may not always be a good thing.
OpenAI has spent the last few months under scrutiny, and facing legal fallout, after it became clear that chatbot design choices carry inherent risks, especially for the mental health of vulnerable users such as children and teenagers.
Meta’s own track record on AI safety for vulnerable people, especially children, is hardly reassuring. The company has been under legal scrutiny since a Reuters report over the summer found that Meta had allowed its chatbots to engage in “sensual” conversations with children.
On the earnings call, Meta executives warned that the company could take a material financial hit this year due to “scrutiny on youth issues.”
Zuck’s digital quest is “immersive”
The AI-enhanced feed is a continuation of Zuckerberg’s long-standing vision of an “immersive and interactive” digital experience. It’s the same idea that drove his massive pivot to, and investment in, the metaverse, a business that has now racked up nearly $80 billion in net operating losses.
According to Zuckerberg, online content has evolved from text to images to video, but it has not yet reached its final frontier.
“Soon, we’re going to see an explosion of new media formats that are more immersive and interactive and made possible by AI advances,” Zuckerberg said on the call. “Our feed is going to be more synergistic overall.”
While he previously believed virtual office environments were the way to get there, Zuckerberg has since shifted his focus to artificial intelligence and wearables.
Earlier this month, Meta laid off 1,500 people at its Metaverse division as part of a plan to shift investments from VR to wearables, such as smart glasses.
“The glasses are the end result of this idea. They will be able to see what you see, hear what you hear, talk to you, and help you as you go about your day,” said Zuckerberg. He even compared smart glasses to smartphones.
“I think we’re in a time like the advent of smartphones, and it was obviously only a matter of time until all those phones became smartphones,” Zuckerberg said. “It’s hard to imagine a world in a few years where most of the glasses people wear aren’t AI glasses.”