
Meta earnings live updates — stock down ahead of Q4 results, AI spending biggest worry

Mark Zuckerberg and company are revealing the latest to investors

Mark Zuckerberg wearing Orion glasses
(Image: ©  David Paul Morris/Bloomberg via Getty Images)

Meta is getting ready to announce its Q4 2025 earnings. The company is expected to announce massive spending on AI while also taking in billions of dollars. But it's not just about the money; the company could tease or announce new goodies.

Generally, earnings calls are designed for investors to learn about a company's latest initiatives and how it plans to generate more money, but for us regular tech enthusiasts, there's plenty of information to be gleaned. That's where Tom's Guide comes in — we'll dig through the investor information overload and explain why it's relevant and what it means for the company and for you.

Meta earnings call expectations

  • Earnings per share (EPS) of $8.16, up from $8.02 per share in Q4 2024
  • Revenue of $58.4 billion, up from $48.4 billion in Q4 2024
  • $21.9 billion in expenses, up from $14.4 billion in Q4 2024
  • Meta’s Reality Labs is expected to report $959 million in revenue but an operating loss of $5.9 billion
  • Meta stock down 12% since October due to concerns over spending

When is the Meta Q4 2025 earnings call?

Meta will stream its earnings call live at 4:30 p.m. ET (1:30 p.m. PT).

How to listen live

While we're here to live blog the earnings call with the latest information as it emerges, you might want to listen for yourself (if you're a shareholder in Meta, you almost certainly want to hear what's going on). Here's the link to listen as Mark Zuckerberg and the rest of the executives report what happened in the most recent quarter:

Meta Q4 2025 earnings live webcast

Meta Q4 earnings call — live updates


Meta’s AI team just hit a milestone — here’s why it matters

Meta AI

(Image credit: Meta AI)

Meta Platforms has crossed an important AI milestone just ahead of its earnings call.

Speaking at the World Economic Forum, Meta CTO Andrew Bosworth said the company’s newly formed internal AI team has delivered its first key AI models this month. The work is still early, but Bosworth described the results as promising — especially given how new the team is.

According to Bosworth, the group is only about six months into the effort, yet the models are already showing capabilities Meta hadn’t previously achieved internally. That’s notable for a company that’s investing heavily in AI infrastructure, chips, and talent — and facing growing pressure to show results.

Meta hasn’t shared exactly what these models do yet or when they’ll appear in products. However, recent reporting suggests the company has been developing both text-based and multimodal AI systems — models that can understand and generate combinations of text, images, and video. For now, Meta is keeping specifics under wraps.

For everyday users, the milestone signals a new direction. Meta is increasingly focused on building its own AI systems, rather than relying entirely on external partners. Over time, these internal models could power features across Instagram, Facebook, WhatsApp and Meta’s expanding lineup of smart glasses — from smarter recommendations to more capable AI assistants.

This update also helps explain why AI spending keeps coming up in earnings conversations. Training and running advanced AI models is expensive, but Meta appears to see this as a long-term investment in control, performance, and differentiation.

The Meta Ray-Ban display glasses explained

Meta Ray-Ban Display

(Image credit: Future)

Released in September 2025, the Meta Ray-Ban Display glasses are the clearest example yet of how Meta Platforms envisions the next phase of smart eyewear. Rather than jumping straight to full augmented reality, these glasses introduce what Meta calls Display AI — a practical, scaled-back approach to putting visual information on your face.

At the core of the design is a full-color monocular display embedded in the right lens, with a resolution of 600×600 pixels. Crucially, this isn’t meant to be immersive. The display is small and glanceable, designed to surface information only when you need it, then fade back into the background.

That means you won’t see holograms, floating windows or digital worlds layered over reality. Instead, the glasses focus on everyday utilities: turn-by-turn navigation, incoming messages, quick prompts and live translation. It’s closer to a heads-up display than an AR headset — and that distinction is intentional.

Meta’s goal here is usability. By limiting the visual footprint, the company avoids many of the problems that have plagued early AR hardware, including bulkiness, distraction and social awkwardness. You can look at the display briefly, then return your attention to the real world without feeling like you’re wearing a computer on your face.

The Ray-Ban partnership also matters. By packaging this technology inside familiar, fashion-forward frames from Ray-Ban, Meta is trying to make Display AI feel more like a normal accessory — something you’d actually wear outside your house.

In short, the Meta Ray-Ban Display glasses aren’t trying to replace your phone or introduce full AR, at least not yet. They’re meant to introduce visual AI gently, setting expectations for what smart glasses can do today — and paving the way for more ambitious AR hardware down the line.

Why Meta isn’t jumping straight to full AR

Meta Quest 3

(Image credit: Future / Tom's Guide)

For all of its ambition in AR, Meta Platforms has been unusually cautious about pushing full AR glasses to consumers — and that’s by design.

True AR glasses still face hard constraints: they’re bulky, power-hungry, expensive to manufacture, and difficult to wear comfortably for long stretches. Add in battery life, heat, and social acceptance, and the technology simply isn’t ready for everyday use at scale.

Rather than force an early version onto users, Meta is taking a step-by-step approach. Display AI glasses let the company introduce visual computing in a limited, practical way, while continuing to refine the hardware and software needed for full AR behind the scenes.

This also gives Meta time to answer a bigger question: what do people actually want on their face all day? By watching how users interact with small displays — when they glance, what they ignore, and what they find helpful — Meta can shape future AR experiences around real behavior, not demos.

Full AR is still the goal. But for now, Meta is betting that earning trust and habit first matters more than shipping the most advanced tech as fast as possible.

Meta’s smart glasses plan isn’t about a single device

Map view in Meta Ray-Ban Display

(Image credit: Future)

Meta Platforms isn’t betting on a single “killer” pair of smart glasses — and that’s intentional. Instead of rushing straight to full augmented reality, Meta is rolling out a multi-stage hardware strategy built around three distinct categories: audio AI glasses, display AI glasses, and eventually full AR.

The goal is gradual adoption. Audio-first glasses introduce AI assistance without changing how people use eyewear. Display AI glasses add lightweight visual information without overwhelming users. Full AR only comes later, once the technology is slim, socially acceptable, and genuinely useful for everyday life.

In other words, Meta is trying to normalize smart glasses before making them futuristic. Rather than asking users to strap a full AR computer to their face all day, the company is easing people in — feature by feature, year by year — until wearing AI-powered glasses feels as natural as wearing earbuds or a smartwatch.

This staged approach also gives Meta room to refine the technology, build habits, and test what people actually want from AI on their face — before committing to the most ambitious version of augmented reality.

Meta reiterates where its official earnings news actually appears

Meta logo on the screen of a mobile phone, following the company's rebrand from Facebook to Meta.

(Image credit: Viacheslav Lopatin | Shutterstock)

In its earnings announcement, Meta Platforms reiterated the official channels it uses for material disclosures under Regulation FD, which governs how public companies share market-moving information.

According to Meta, investors and the public should look to the following sources for official earnings results, press releases, and regulatory updates:

  • investor.atmeta.com — Meta’s Investor Relations site, where earnings releases, prepared remarks, financial tables, and webcasts are posted
  • meta.com/news — the company’s corporate newsroom for major announcements
  • Mark Zuckerberg’s public social profiles on Facebook, Instagram and Threads, which Meta treats as recognized disclosure channels

This matters because information released through these platforms is considered official company communication, not speculation or leaks.

If earnings details, guidance or strategic updates appear elsewhere first, they shouldn’t be treated as confirmed unless they’re echoed through one of Meta’s designated channels.

CapEx watch: Meta’s 2026 spending plans loom large

Mark Zuckerberg

(Image credit: Chip Somodevilla/Getty Images)

One of the biggest questions heading into Meta’s earnings call is how much the company plans to spend next year. Analysts expect updated guidance on 2026 capital expenditures, particularly for AI data centers, chips and computing infrastructure.

Current projections range from roughly $109B to $117B, and even small changes to that outlook could influence how Wall Street views Meta’s margins, cash flow and long-term discipline.

Advertising is still Meta’s engine — but AI is changing the math

Threads

(Image credit: Future)

Advertising remains the backbone of Meta’s business, powering Facebook, Instagram, WhatsApp and Threads. AI-driven targeting and recommendations have helped boost performance, keeping ad revenue resilient even as the digital ad market tightens.

The challenge is cost. Running large-scale AI systems isn’t cheap, and those expenses are climbing fast. On today’s earnings call, analysts will be listening for how Meta balances ad growth with the rising cost of AI infrastructure behind the scenes.

What analysts — and users — are watching on Meta’s earnings call

Instagram app on iPhone

(Image credit: Shutterstock)

As Meta reports earnings, a few key themes are expected to dominate the conversation. Here’s what matters most — and why it’s worth paying attention even if you’re not an investor.

  • Advertising strength vs. rising AI costs. Ads are still Meta’s bread and butter. Facebook, Instagram, WhatsApp and Threads all rely on ad revenue, with AI-powered targeting and recommendations increasingly driving performance. But that growth is now competing with the reality that AI is expensive to run, and those costs are rising fast.
  • CapEx and 2026 spending plans. One of the biggest questions on the call will be how much Meta plans to spend next year — especially on AI data centers, chips and infrastructure. Analysts expect updated guidance for 2026, with projected spending in the $109B–$117B range. That number will shape how Wall Street views Meta’s margins and long-term cash flow.
  • Are Meta’s AI investments paying off yet? Meta has poured billions into AI, but investors want evidence that those investments are starting to translate into real returns — whether that’s smarter ads, better engagement, new paid features or efficiency gains. The balance between progress and pressure will be closely scrutinized.
  • Reality Labs: still a drag, still a bet. While AI is the headline story, Meta’s Reality Labs division — home to VR, AR and metaverse projects — continues to lose money. Any updates on losses, timelines or strategic shifts here could influence confidence in Meta’s longer-term bets.
  • Engagement and new monetization signals. Finally, analysts will be listening for signals around user engagement and future revenue streams. That includes updates on Threads, experiments with subscriptions, and how AI-enabled features might eventually turn into new ways for Meta to make money.

Meta’s in-house AI models arrive — and what that signals

Mark Zuckerberg

(Image credit: Chip Somodevilla/Getty Images)

Meta’s Superintelligence Labs — a new internal AI team — has delivered its first high-profile AI models internally, according to CTO Andrew Bosworth in an exclusive interview with Reuters. He described the work as promising even at this early stage, with ongoing efforts to refine and adapt those models into usable technologies inside Meta’s ecosystem.

Building foundational AI capabilities in-house gives Meta more control over future innovation — potentially powering smarter recommendations, generative tools, and other advanced features down the line — but the company has stopped short of outlining specific product launches or timelines tied to these models.

Analysts see this as part of a broader strategy to regain footing in the competitive AI landscape while investing in long-term tech leadership.

At the same time, Meta has recently paused teen access to its AI characters globally amid safety and oversight concerns — a reminder that as new AI tools roll out, Meta still faces scrutiny over content moderation and user protection.

AI strategy and earnings: what Meta’s spending means for users

Zuckerberg on phone

When Meta talks about AI on its earnings call later today, we'll learn more about how Instagram, Facebook and WhatsApp will evolve. The company is spending heavily on AI infrastructure to power smarter feeds, generative tools, assistants and automation across its apps.

That investment is already shaping product decisions. Meta has been building more of its AI in-house, developing proprietary models meant to run recommendations, creative tools and new AI features users interact with every day.

But AI doesn’t come cheap. As costs rise, Meta is under pressure to find ways to pay for those features — which helps explain why the company is exploring optional subscriptions and paid upgrades, alongside ads.

At the same time, Meta has pulled back and adjusted some AI features, like pausing teen access to AI characters, as concerns around safety and oversight grow.

For users, the takeaway is that Meta’s AI push is accelerating, but how it gets funded — and who gets access to what — is still very much in flux.

AI costs are quietly shaping Meta’s subscription strategy

Meta Threads logo on phone

(Image credit: Shutterstock)

Meta is positioning potential subscriptions around added features and control, but the real pressure point may be AI economics. Training, deploying and running large-scale AI systems — from generative tools to recommendation engines — is capital-intensive, requiring massive investments in data centers, chips and ongoing compute.

Advertising still funds the bulk of Meta’s business, but AI changes the math. As more AI features move from experiments into everyday products, the cost of serving each user rises. Optional subscriptions would give Meta a way to recover some of those costs directly, without locking core social experiences behind a paywall.

In that sense, subscriptions wouldn’t replace ads — they’d act as a pressure valve, helping Meta fund AI expansion while keeping Instagram, Facebook and WhatsApp largely free for most users.

Meta explores premium subscriptions across its apps

meta ai

(Image credit: Shutterstock)

Meta Platforms says it’s exploring optional premium subscription tiers for Instagram, Facebook, and WhatsApp, while keeping the core experience free. The paid options would focus on exclusive features, added controls and expanded AI tools, rather than replacing ad-supported access.

The move signals Meta’s continued interest in diversifying revenue beyond ads, particularly as it ramps up spending on AI and infrastructure.

Where does Meta make money?

Zuckerberg wearing Meta RayBan Display smart glasses

(Image credit: Meta)

At this point, most of Meta's earnings come from digital advertising. The company also invested $14.3 billion in Scale AI in June and brought over the startup's CEO, Alexandr Wang, along with other top talent.

But Meta is spending billions (between $70 billion and $72 billion) on AI and other areas that don't yet generate revenue.

“We’re seeing the returns in the core business that’s giving us a lot of confidence that we should be investing a lot more, and we want to make sure that we’re not underinvesting,” Mark Zuckerberg said.

The company sees AI, AR and VR as the future, and it might just be sacrificing money now to make that future a reality.

Meta's most exciting department is bleeding cash

Meta Ray-Ban Display

(Image credit: Future)

If you're just a fan of cool technology (which most of us are), you're probably excited about Meta’s Reality Labs division. It's where the cool AR tech is being developed. It's also where Meta is losing an absurd amount of money in the short term. In fact, it is expected to bring in only $959 million in revenue for the quarter while losing a staggering $5.9 billion.

Does that mean we should temper our excitement over this new tech? Perhaps not, as Meta is seemingly playing the long game and banking on this technology becoming massive in the future; otherwise, the company wouldn't spend so much on its researchers, developers, and engineers working to build the future of virtual and augmented reality.

Meta's stock is down

meta ai

(Image credit: Shutterstock)

Meta's last earnings call was in October 2025, when the company reported its Q3 income and expenses. Apparently, Wall Street didn't like what it heard, as the company's stock is down 12% overall since then.

According to the latest reports, investors have raised concerns about the social media giant’s massive amount of spending. Essentially, Meta is spending a lot of money, and the company said it expects 2026 capital expenditure growth to be “notably larger,” which means it plans to spend even more.
