SANTA CLARA, California, April 30, 2026, 07:02 PDT
Marvell Technology — typically associated with data-center infrastructure rather than splashy AI chips — caught a wave of renewed interest on Thursday. Surging roughly 3% at the start of U.S. trading, Marvell shares benefited as Big Tech’s results fueled more bets on AI spending. Nvidia edged 1.6% lower; its dominance isn’t in doubt, but investors are looking further afield.
Timing is key. Alphabet, Microsoft, Amazon and Meta all flagged this week that their AI budgets aren’t cooling off—combined spending is now headed for more than $700 billion this year, up from roughly $600 billion, according to Reuters. And it’s not just Nvidia’s GPUs grabbing those dollars: networking hardware, optical connections, even custom chips are seeing a surge, too.
That’s where Marvell fits in. ASICs, or application-specific integrated circuits, are designed for targeted tasks rather than general-purpose computing. Reuters said Thursday that this part of the chip market has played to the strengths of both Broadcom and Marvell. Qualcomm, for its part, aims to deliver data-center processors, inference accelerators, and custom ASICs by year-end.
Marvell isn’t arguing that Nvidia’s dominance is over. The point is that cloud players are hunting for more than a single route to assembling AI infrastructure, particularly for inference, where trained models respond to users. Alphabet’s Sundar Pichai has said Google now sells its own AI chips to select customers. Earlier this month, Reuters reported that Google has been in discussions with Marvell to co-develop a pair of AI chips, aiming for more efficient model performance.
Nvidia’s size is tough to match. The chipmaker just posted fiscal fourth-quarter revenue of $68.1 billion, with its data-center business alone hitting a record $62.3 billion and full-year revenue coming in at $215.9 billion. “Enterprise adoption of agents is skyrocketing,” founder and CEO Jensen Huang said in the company’s newsroom release, adding that customers are “racing to invest in AI compute.”
Marvell is a much smaller player. For fiscal 2026, the company posted record revenue of $8.195 billion, a 42% jump from the previous year. Data-center sales made up 74% of total revenue in the January quarter. Chairman and CEO Matt Murphy pointed to record design wins and bookings that, according to him, are growing faster than ever.
Everything shifted for investors after Nvidia put $2 billion directly into Marvell. The companies unveiled their strategic tie-up on March 31, with plans to collaborate on NVLink Fusion—a rack-scale platform aimed at semi-custom AI infrastructure. Under the arrangement, Marvell takes on the custom XPU and NVLink-ready networking pieces, while Nvidia brings its CPUs, networking silicon, interconnects and broader infrastructure to the table.
Back then, Huang declared that “the inference inflection has arrived.” Murphy, for his part, said the tie-up underscored how crucial high-speed connectivity, optical interconnects, and accelerated infrastructure have become as AI ramps up, according to Marvell’s announcement. What’s notable here: this language doesn’t cast Marvell as an Nvidia rival. Instead, it shows Nvidia wants Marvell’s tech embedded in its own systems.
Analysts are already factoring this in. Oppenheimer’s Rick Schafer called the Nvidia-Marvell deal “a vote of confidence” for Marvell as a key AI partner in both ASIC and connectivity. Stifel analyst Tore Svanberg lifted his Marvell price target to $140 from $120 and reiterated his Buy rating earlier this month, per a roundup of calls by Insider Monkey.
After Marvell posted results in March, Bank of America’s Vivek Arya upgraded the stock to Buy, citing traction in AI optical connectivity and clearer prospects for custom-chip projects with Microsoft and Amazon, according to TheStreet. That’s Marvell’s sweet spot: less about massive GPUs, more about tailored compute and the networking pieces that bind thousands of chips together into a single platform.
Competition among chipmakers keeps heating up. Broadcom stands out as a heavyweight in custom silicon, while Qualcomm is angling for a slice of the data-center ASIC market. AMD, of course, is still Nvidia’s most visible challenger in AI accelerators. So Marvell’s connection to Nvidia is a bit of an outlier: the company teams up with the AI chip leader, yet also reaps rewards as cloud providers seek out more custom silicon.
Nvidia hardly looks like old news just yet. Citing industry sources, Reuters said Thursday that the company’s B300 server units are going for around 7 million yuan, or $1 million, in China, close to twice the going rate in the U.S., as AI demand stays hot and U.S. export restrictions bite. Buyers rarely pay that kind of premium for a product that has lost relevance.
Still, the risks tied to Marvell stand out. The company’s annual report flags a heavy reliance on just a handful of customers for much of its revenue; a big account can be won or lost. There’s also the chance those customers could end up designing their own chips or opt to integrate instead. On top of that, Marvell noted that some ASIC deals might pull down gross margins, so even if revenue climbs, profits could lag if costs rise or pricing slips.
The real question isn’t if Nvidia’s moment has passed. It’s whether the coming wave of AI investment opens up more space for firms like Marvell—those working nearer the hardware, switches, and custom chips powering AI data centers. Wall Street, at least for now, seems a bit more receptive to that possibility than it was just a few months back.