SAN JOSE, California, April 27, 2026, 14:04 PDT
- Rambus dropped roughly 10.8% Monday, with shares sliding after the company posted its first-quarter numbers.
- Product revenue climbed 15% over last year, boosted by stronger demand for data-center and AI memory systems.
- The company told investors to expect stronger product revenue in the second quarter, though it cautioned that the outcome hinges on closing customer agreements.
Rambus Inc. shares tumbled Monday, sliding 10.79% to finish at $141.31. The drop came even as the chipmaker posted $180.2 million in first-quarter revenue and highlighted AI-fueled data-center demand. Shares had earlier peaked at $161.70. Growth in the product business was not enough to offset the pullback.
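The stated figures hang together: a quick arithmetic check backs out the implied prior close from Monday's close and the percentage drop. The ~$158 result below is derived here, not a number from the report.

```python
# Back out the prior closing price implied by the stated figures.
# The ~$158 result is derived arithmetic, not a figure from the article.
close = 141.31   # Monday's close (stated)
drop = 0.1079    # the 10.79% decline (stated)

prior_close = close / (1 - drop)
print(round(prior_close, 2))  # ≈ 158.40, just under the $161.70 intraday peak
```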
This shift is notable—Rambus increasingly serves as a direct proxy for the memory bottleneck dogging AI systems. The company isn’t in the business of producing core AI processors. Instead, Rambus sells memory interface chips and silicon IP—licensable chip designs—key for shuttling data between processors and memory.
Rambus reported that product revenue climbed 15% year over year, hitting $88.0 million. Cash from operations came in at $83.2 million. Total GAAP revenue, reported under standard U.S. accounting, rose to $180.2 million from $166.7 million in the same period last year. Net income landed at $59.9 million, or 55 cents per diluted share.
Luc Seraphin, the company’s chief executive, described the period as a “solid first quarter.” He pointed to AI inference and agentic workloads as key factors boosting appetite for higher memory bandwidth and better data movement. In essence, Rambus is capitalizing on the fact that large AI systems hunger for speedier memory access—processors alone aren’t enough.
Rambus put out second-quarter guidance calling for product revenue between $95 million and $101 million, licensing billings ranging from $76 million to $82 million, and royalty revenue in the $72 million to $78 million band. Licensing billings—essentially invoices for Rambus tech—offer a read on demand, though not all billed amounts hit revenue immediately.
A few days before the latest report, Rambus rolled out its SOCAMM2 server module chipset—a low-power memory module built for AI server demands. The company says these chipsets back LPDDR5X-based SOCAMM2 modules with speeds reaching 9.6 gigabits per second. Instead of soldered-down memory, the design features modules that can be swapped or serviced.
Rambus now sits closer to the big memory suppliers, though not head-to-head with them. “The industry needs a robust ecosystem around LPDDR-class server memory,” said Praveen Vaidyanathan, Micron’s vice president and general manager of Cloud Memory Products. IDC’s Soo Kyoum Kim described SOCAMM2 as “an important evolution” for striking the right balance between performance and efficiency in AI data centers.
Rambus pointed to its HBM4E memory controller IP—a design tailored for high-bandwidth memory in AI accelerators and high-performance computing. According to the company, the controller handles speeds of up to 16 gigabits per second per pin and delivers as much as 4.1 terabytes per second per memory device. HBM stacks memory near processors, aiming for faster data delivery.
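The two headline controller figures are consistent with each other if one assumes a 2048-bit-wide device interface, in line with the broader HBM4 direction. That interface width is an assumption made here for the check, not a number from the article.

```python
# Sanity-check the stated HBM4E figures: 16 Gb/s per pin and
# ~4.1 TB/s per memory device. Assumes a 2048-bit device interface
# (assumption for this check, not stated in the article).
pins = 2048
gbps_per_pin = 16                     # gigabits per second per pin (stated)

total_gbps = pins * gbps_per_pin      # 32,768 Gb/s across the interface
tbytes_per_s = total_gbps / 8 / 1000  # bits -> bytes -> terabytes
print(round(tbytes_per_s, 1))         # ≈ 4.1 TB/s, matching the claim
```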
Competition is relentless. Memory giants like Micron and Samsung, plus a slew of AI-chip vendors, are all racing to crack the same bandwidth and power hurdles. “HBM bandwidth,” MatX co-founder and CEO Reiner Pope said, is still a top obstacle for large language models. Over at Samsung, executive Ben Rhew pointed to the need for HBM4E IP before the technology can see wider uptake.
Ahead of earnings, Rambus was sitting close to its 52-week peak and trading above where most analysts had set their targets. Eight out of nine analysts called it a buy, data from Investing.com show. In other words, much of the optimism was already baked in, so even solid product revenue growth wasn’t enough to give shares fresh momentum.
There’s a hitch: Rambus cautioned that its second-quarter revenue forecast comes with strings attached, hinging on new customer deals for both product sales and licensing. The outlook faces downside risk if AI server demand weakens, memory supply gets squeezed in unhelpful segments, or customer ramp-ups lag expectations.
Rambus is clocking growth where it matters for AI infrastructure, but the market didn’t bite—shares cooled off following a big rally, with investors looking for something beyond just a solid quarter.