SAN FRANCISCO, April 13, 2026, 04:03 (PDT)
Nvidia shares climbed $4.60 to $188.63 early Monday, with analysts pointing to strong AI infrastructure demand that is pushing key supplier TSMC toward a fourth consecutive record profit.
The move carries weight: Nvidia sits at the center of the surge in AI outlays. CoreWeave's latest agreements with Anthropic and Meta, plus Meta's early access to Nvidia's Vera Rubin chips, point to ongoing demand for cloud capacity. Investors, though, remain skeptical about whether these hefty, multibillion-dollar data-center bets will pay off.
Nvidia closed out fiscal 2026 with a record $68.1 billion in quarterly revenue, $62.3 billion of it from the data center business, and a gross margin of 75.0%. The company's market cap now stands at roughly $4.53 trillion, keeping it a heavyweight in the AI trade.
Taiwan isn’t letting up. TSMC—the chipmaker powering Nvidia’s AI push—posted a 35% jump in first-quarter revenue last week. March exports hit a new high: $80.18 billion, thanks to resilient demand from AI and cloud sectors. Macquarie’s Arthur Lai pointed to “sustained AI demand and advanced-node leadership” as he looked for TSMC to issue stronger second-quarter guidance.
Rubin is up next in Nvidia’s lineup. Back in March, the company announced its Vera Rubin platform had entered full production, targeting both AI model training and inference—the phase when models actually deliver answers. That same month, Reuters put the combined sales potential for Rubin and Blackwell at over $1 trillion by the close of 2027, not counting China.
The crowd’s getting thicker. Last week, Amazon CEO Andy Jassy pointed out, “Virtually all AI thus far has been done on NVIDIA chips, but a new shift has started,” noting that Trainium3 capacity is close to fully booked and Amazon’s in-house chip business now runs at a $20 billion annual revenue pace. And Broadcom? On April 6, it revealed a long-term contract to build Google’s custom AI chips all the way through 2031.
Memory remains in focus. High-bandwidth memory—HBM—powers AI accelerators with fast, stacked storage. Samsung last week flagged tight supply and rising prices, citing robust AI demand and its first shipments of HBM4 chips to Nvidia customers. SK Hynix and Micron are already in mass production.
Geopolitics is the immediate concern. Oil topped $100 after U.S.-Iran negotiations broke down, while Reuters flagged that chip producers are monitoring supply risks for helium and neon. “Diversified sourcing and safety stock should be sufficient to manage short-term disruptions,” IDC’s Galen Zeng said of TSMC.
Supplier and customer figures alike point to solid AI spending. But with clients getting tougher on pricing, ramping up in-house chip production, and insisting on proof that their data-center investments pay off, Nvidia’s margin for error looks slimmer than its breakneck growth might imply.