SAN FRANCISCO, Feb 19, 2026, 01:01 PST
- Meta is set to roll out millions of Nvidia’s Blackwell and Rubin GPUs as part of a broadened, multi-year agreement between the two companies.
- The agreement brings a major deployment of Nvidia’s Arm-powered Grace CPUs, plus Spectrum-X Ethernet networking, into the mix.
- Nvidia said Meta plans to deploy “confidential computing” on WhatsApp as it ramps up its AI data centers.
Meta Platforms is set to roll out millions of Nvidia’s latest and upcoming AI chips, expanding a multi-year agreement that’s now at the core of the social media giant’s data center expansion plans.
The timing is no coincidence. As big tech scrambles to secure AI computing capacity, the crunch has moved beyond snapping up individual chips; the race now is to assemble entire systems that can run massive AI models without sending power bills through the roof.
The deal also takes Nvidia deeper into CPUs, beyond its traditional graphics processors. Intel and Advanced Micro Devices have dominated the market for data center central processing units for years.
Nvidia announced a deal covering both on-premises and cloud deployments. Meta plans to roll out “hyperscale” data centers tailored for AI training and inference (training models on data, then using them to generate results), powered by Nvidia’s Blackwell and Rubin GPUs, along with Grace CPUs and Spectrum-X Ethernet. http://nvidianews.nvidia.com/news/meta-builds-ai-infrastructure-with-nvidia
“No one deploys AI at Meta’s scale,” Nvidia CEO Jensen Huang said, highlighting what he called “deep codesign” spanning chips, networking, and software. Meta CEO Mark Zuckerberg said the company plans to build clusters based on Nvidia’s “Vera Rubin platform” to power its upcoming AI systems.
The two companies are also developing servers that run solely on CPUs. Nvidia called Meta’s deployment the first major standalone rollout of Grace CPUs, and said the two are now collaborating on a next-generation “Vera” CPU targeted for potentially wider use in 2027.
Nvidia said Meta is bringing Nvidia’s “confidential computing” to WhatsApp, aiming to shield data as it’s being processed on servers—not only when it’s stored or transmitted.
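In practice, confidential computing relies on hardware-isolated execution environments and cryptographic attestation: a client checks that a server is running the expected, protected code before handing over data or keys. The sketch below is purely conceptual and does not describe Meta’s or Nvidia’s actual implementation; every name in it (VENDOR_ROOT_KEY, attest_enclave, verify_attestation) is hypothetical, and a symmetric HMAC stands in for the asymmetric signatures real attestation schemes use.

```python
# Conceptual illustration only: real confidential computing depends on
# hardware trusted execution environments and vendor attestation services,
# none of which are modeled here. All names below are hypothetical.
import hashlib
import hmac
import secrets

# Stand-in for a root of trust held by the hardware vendor. In a real
# system the client would verify a public-key signature instead of
# sharing a secret key with the vendor.
VENDOR_ROOT_KEY = secrets.token_bytes(32)

# The code the client expects the protected environment to be running.
ENCLAVE_CODE = b"def process(message): return message.upper()"


def attest_enclave(code: bytes) -> tuple[bytes, bytes]:
    """Return a measurement (hash) of the enclave code plus a vendor MAC over it."""
    measurement = hashlib.sha256(code).digest()
    signature = hmac.new(VENDOR_ROOT_KEY, measurement, hashlib.sha256).digest()
    return measurement, signature


def verify_attestation(expected_code: bytes, measurement: bytes, signature: bytes) -> bool:
    """Client-side check: trust the enclave only if it runs exactly the expected code."""
    expected = hashlib.sha256(expected_code).digest()
    valid_sig = hmac.compare_digest(
        hmac.new(VENDOR_ROOT_KEY, measurement, hashlib.sha256).digest(), signature
    )
    return valid_sig and hmac.compare_digest(expected, measurement)


if __name__ == "__main__":
    measurement, signature = attest_enclave(ENCLAVE_CODE)
    if verify_attestation(ENCLAVE_CODE, measurement, signature):
        # Only after attestation succeeds would the client release its data
        # (or a decryption key) to the enclave for processing.
        print("attestation ok, sending data to enclave")
    else:
        print("attestation failed, withholding data")
```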
Ben Bajarin, CEO and principal analyst at Creative Strategies, says the move into CPUs shows how recent “agentic AI” software is creating new requirements for general-purpose chips—even as GPUs remain responsible for most of the processing. https://www.wired.com/story/nvidias-deal-with-meta-signals-a-new-era-in-computing-power/
The news sent AI stocks climbing. Nvidia finished 1.6% higher on Wednesday, while Meta tacked on 0.6%. U.S. equities closed up. https://www.reuters.com/business/us-stock-futures-trend-higher-tech-sentiment-improves-2026-02-18/
The companies kept financial details under wraps and offered no timeline on when most of the new equipment would be delivered. https://finance.yahoo.com/news/nvidia-and-meta-expand-gpu-team-up-with-millions-of-additional-ai-chips-211544907.html
The competitive field keeps expanding. Nvidia is pushing to supply the entire data center package: not just GPUs, but CPUs and networking gear as well. At the same time, the biggest AI players are looking beyond Nvidia, building out their own custom chips and eyeing alternative suppliers.