NEW YORK, February 6, 2026, 16:33 (EST)
- Micron shares dropped before the bell after Semianalysis cut its forecast for Micron's share of Nvidia's HBM4 supply to zero.
- Research firm estimates put Nvidia’s HBM4 supply breakdown at roughly 70% for SK Hynix and 30% for Samsung.
- Nvidia is pushing Samsung for early shipments of HBM4, DigiTimes reported, with memory shortages on the horizon.
Micron Technology slipped 0.5% before the bell Friday after Semianalysis cut its outlook, predicting Micron will have no share in supplying HBM4 memory for Nvidia’s upcoming lineup. 1
This call carries weight: high-bandwidth memory, or HBM, is the stacked DRAM that sits right next to AI processors and feeds them data at very high speed. A supplier that lands on Nvidia's list can count on years of strong volume and pricing power. Miss out, and it's a scramble to catch the next cycle.
Investors are tracking the broader AI infrastructure push, with memory shortages resurfacing as a key constraint. This year, major U.S. tech companies are on pace to spend upwards of $630 billion on datacenters and AI chips. “The magnitude of the spend is materially greater than consensus expected,” analysts at MoffettNathanson noted. 2
Micron had jumped roughly 4% in premarket trading after Amazon announced its $200 billion capital spending target for 2026, a clear sign that cloud expansion isn't slowing. But that rally fizzled once the Semianalysis note started making the rounds, and the stock pulled back.
Semianalysis sees no "indications of Nvidia ordering Micron HBM4" and now expects Nvidia to rely on SK Hynix and Samsung for its HBM4 needs. The firm estimates SK Hynix will take about 70% of that supply and Samsung 30%, citing Micron's "poor speed performance from their use of an internal base die" as a factor in its earlier downgrade.
DigiTimes reports that Nvidia has asked Samsung Electronics for early shipments of sixth-generation high-bandwidth memory, or HBM4, as chipmakers shift more production toward HBM and tight supply starts to bite. 3
According to TechPowerUp, Nvidia plans to stick with SK Hynix and Samsung HBM4 for its “Vera Rubin” lineup, skipping Micron entirely. 4
If those projections pan out, the balance among the three main HBM providers shifts further: SK Hynix stays out front, Samsung picks up ground, and Micron is left chasing new HBM deals or hoping for future Nvidia allocations.
HBM isn't standard commodity memory. It's DRAM dies stacked vertically, with each stack sitting on a logic "base die" that manages the stack and links it to the processor package. The performance ceiling keeps shifting higher as AI chips demand ever more bandwidth.
Supplier allocations are always in flux as qualification runs, yields, and timelines get juggled; Nvidia stays mum on sourcing for anything not in wide release yet. “The market just dislikes the substantial amount of money that keeps getting put into capex for these growth rates,” said Dave Wagner, portfolio manager at Aptus Capital Advisors, highlighting how quickly the mood can turn on AI outlays and the companies supplying the gear. 5
Micron holders are left wondering whether the company can secure a meaningful position in HBM4 as Nvidia gears up for its next phase of AI hardware. Friday's session made one thing clear: a single research note on supplier allocation was enough to drown out an otherwise strong demand signal.