Micron’s new 256GB SOCAMM2 memory targets AI server power limits — Nvidia weighs in
5 March 2026

BOISE, Idaho, March 5, 2026, 08:03 (MST)

  • Micron has begun sending out customer samples of its 256GB SOCAMM2 low-power server memory module, targeting AI data center workloads.
  • The company claims its module scales up to 2TB of CPU-attached memory while drawing less power than typical DDR5 server modules.
  • Micron shares barely moved in early U.S. trading.

Micron Technology (MU.O) on Thursday announced it’s shipping customer samples of its 256-gigabyte SOCAMM2 low-power server memory module, specifically targeting AI data center workloads. “Micron’s 256GB SOCAMM2 offering enables the most power-efficient CPU-attached memory solution for both AI and HPC,” said Raj Narasimhan, senior vice president at Micron. At Nvidia (NVDA.O), data center CPU product chief Ian Finder said the module is “enabling the next generation of AI CPUs.”

Cloud operators are bumping up against power and cooling constraints in their race to scale systems for generative AI. Meanwhile, memory costs and performance lag have grown more pressing, with large models demanding a flood of data kept near the processor.

“CPU-attached LPDRAM is emerging as a critical memory tier,” Futurum Research analyst Brendan Burke wrote, pointing to the surge in workloads shifting toward inference—essentially, running models post-training. The catch: inference can cause demand to spike in a hurry, and it eats into power budgets, fast.

SOCAMM2, or small outline compression attached memory module, is a modular form factor designed to place low-power memory right next to server CPUs. Unlike the standard DDR5 registered modules (RDIMMs) found in most servers, SOCAMM2 swaps in LPDDR5X—a type of DRAM usually seen in smartphones. High-performance computing (HPC) refers to large-scale workloads such as scientific modeling.

Micron’s 256GB module uses monolithic 32-gigabit LPDDR5X dies and, according to the company, supports up to 2 terabytes of low-power memory for each eight-channel server CPU—roughly 33% more than the previous 192GB SOCAMM2. Power consumption comes in at about a third that of equivalent RDIMMs. The module, by Micron’s math, also occupies just one-third the space. Running internal benchmarks with a Llama 3 70B large language model, Micron reported a more than 2.3x improvement in time to first token—the wait before the model starts returning text.
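The capacity and headroom figures are easy to sanity-check. A minimal back-of-envelope sketch in Python—the one-module-per-channel layout below is an inference from Micron’s stated totals, not something the company has confirmed:

```python
# Back-of-envelope check of Micron's stated SOCAMM2 capacity figures.
# Assumption (hypothetical): one 256GB module populates each of the
# eight memory channels on the server CPU.

MODULE_GB = 256   # new SOCAMM2 module capacity
CHANNELS = 8      # eight-channel server CPU, per the article
PREV_GB = 192     # previous-generation SOCAMM2 module

total_gb = MODULE_GB * CHANNELS              # CPU-attached capacity
gain = (MODULE_GB - PREV_GB) / PREV_GB       # gain over 192GB module

print(f"total: {total_gb} GB ({total_gb / 1024:.0f} TB)")  # 2048 GB (2 TB)
print(f"capacity gain: {gain:.0%}")                        # 33%
```

The numbers line up with the article: 8 × 256GB is the claimed 2TB ceiling, and 256GB over 192GB is the roughly 33% uplift.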

Micron said it’s collaborating with Nvidia on memory designs aimed at advanced AI infrastructure, while also participating in the JEDEC standards group as SOCAMM2 specs develop. The company noted the modular approach is designed to boost serviceability and work with liquid-cooled server setups.

Micron has begun sampling as Samsung Electronics and SK hynix ramp up their own AI server memory plans, while the industry looks to expand SOCAMM2 beyond niche use. According to TrendForce, all three companies are already working on SOCAMM2, with initial prototypes or samples now public.

Micron stands as one of just three major players in high-bandwidth memory—HBM, the stacked DRAM crucial for AI GPUs—competing with Samsung and SK hynix. Back in December, the company bumped up its capital expenditure target for 2026 to $20 billion, aiming to ramp up production. Both Morningstar and J.P. Morgan analysts expect supply constraints could persist through 2027.

Micron dipped roughly 0.1% to $400.42 in morning U.S. trading, with Nvidia off by about 0.6%.

Micron’s fiscal second-quarter results land March 18, a date investors have marked as they hunt for signals on pricing trends and on when new data-center products begin shipping in volume. The company plans to webcast the call.

SOCAMM2 is still just sampling for now. Whether it gains traction depends on customer qualification processes, system revamps, and how fast JEDEC moves on standards. A slowdown in AI capex, or competitors capturing major platforms, could delay the product’s impact on the top line.

Back in October 2025, Micron rolled out samples of a 192GB SOCAMM2 module, touting its design as a way to reduce power consumption and bring additional memory right up alongside the CPU. Now, the company is pushing ahead with its 256GB version—the next logical move—though actual volume shipments will hinge on when customers are ready to launch.
