
Semiconductor Market News: 28 August – 8 October 2025

OpenAI’s Mega-Deals with Nvidia & AMD Spark a $1 Trillion AI Frenzy

OpenAI Fuels a $1 Trillion AI Hardware Boom

OpenAI – the creator of ChatGPT – has positioned itself at the center of an AI hardware buying frenzy that is reshaping the tech industry. In the past few weeks (early October 2025), OpenAI announced two blockbuster chip-supply deals that together span 16 gigawatts of computing capacity – an almost inconceivable scale. (For perspective, 16 GW of data center power is roughly equivalent to the output of 16 large power plants.) These deals, one with long-time AI chip leader Nvidia and the other with rival AMD, are helping propel what analysts dub a “trillion-dollar…
HBM Memory Gold Rush: Why High-Bandwidth Memory Prices Are Soaring in the AI Era

Key Facts

HBM Node Transition: From HBM3 to HBM4

High-Bandwidth Memory has evolved rapidly to keep pace with AI needs. HBM3 – introduced around 2022 – brought huge leaps in bandwidth and capacity per stack, and became the workhorse memory for top-tier AI accelerators such as NVIDIA’s H100 and Google TPUs. By 2023, attention shifted to HBM3E, an enhanced generation offering even higher performance. In early 2024, SK hynix became the first to mass-produce HBM3E (sometimes dubbed the 5th-generation HBM) (trendforce.com), supplying 12-Hi 36GB stacks to NVIDIA and others. This HBM3E delivers ~1.2 TB/s per stack (versus ~0.8–1.0 TB/s for HBM3),…
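As a rough sanity check on the per-stack bandwidth figures quoted above, the sketch below recomputes peak throughput from bus width and per-pin data rate. The 1024-bit interface width and the ~6.4 Gb/s (HBM3) and ~9.6 Gb/s (HBM3E) per-pin rates are assumptions drawn from publicly quoted spec ranges, not figures stated in the article itself.

```python
# Minimal sketch: reproduce the per-stack bandwidth figures quoted above.
# Assumptions (not stated in the article): a 1024-bit HBM interface, with
# per-pin data rates of roughly 6.4 Gb/s for HBM3 and 9.6 Gb/s for HBM3E.

def hbm_stack_bandwidth_tbs(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak per-stack bandwidth in TB/s: (bus width * per-pin rate) / 8 bits per byte."""
    gb_per_s = bus_width_bits * pin_rate_gbps / 8   # Gb/s -> GB/s
    return gb_per_s / 1000                          # GB/s -> TB/s

print(f"HBM3  (~6.4 Gb/s per pin): {hbm_stack_bandwidth_tbs(1024, 6.4):.2f} TB/s")  # ~0.82
print(f"HBM3E (~9.6 Gb/s per pin): {hbm_stack_bandwidth_tbs(1024, 9.6):.2f} TB/s")  # ~1.23
```

With those inputs the calculation lands at roughly 0.82 TB/s and 1.23 TB/s per stack, consistent with the ~0.8–1.0 TB/s and ~1.2 TB/s figures cited for HBM3 and HBM3E.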