Micron Begins Mass Production of 12-Stack HBM to Supply NVIDIA
Micron Technology has officially commenced mass production of its 12-stack High Bandwidth Memory (HBM), marking a significant milestone in the AI semiconductor industry. The company is set to supply this advanced memory solution to NVIDIA, a leading player in the AI and GPU market.
Micron’s 12-Stack HBM: A Game-Changer in AI Memory
Micron completed the development of its 12-stack HBM in September 2024 and has since provided samples to key clients, including NVIDIA. The company’s CFO, Mark Murphy, recently highlighted the advantages of this new memory technology during a presentation at Wolfe Research. According to Murphy, Micron’s 12-stack HBM consumes 20% less power while offering a 50% increase in capacity compared to competing 8-stack HBM solutions.
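The 50% capacity gain follows directly from the stack height. As a minimal sketch of the arithmetic, assuming each DRAM die holds 3 GB (24 Gb, in line with publicly reported HBM3E configurations — an assumption, not a figure from Micron's announcement):

```python
# Illustrative arithmetic only; the 3 GB per-die capacity is an assumption
# based on publicly reported 24 Gb HBM3E dies.
DIE_CAPACITY_GB = 3

def stack_capacity_gb(num_dies: int, die_gb: int = DIE_CAPACITY_GB) -> int:
    """Total capacity of an HBM stack built from `num_dies` DRAM dies."""
    return num_dies * die_gb

cap_8 = stack_capacity_gb(8)     # 8-high stack: 24 GB
cap_12 = stack_capacity_gb(12)   # 12-high stack: 36 GB
increase = (cap_12 - cap_8) / cap_8
print(f"8-high: {cap_8} GB, 12-high: {cap_12} GB, +{increase:.0%}")
```

With the same die density, moving from 8 to 12 dies per stack yields exactly the 50% capacity increase Murphy cited.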
As AI workloads demand ever-increasing memory bandwidth and efficiency, Micron’s latest HBM is expected to gain traction among AI chip manufacturers. The company anticipates that the majority of its HBM production in the latter half of 2025 will be dedicated to 12-stack configurations.
The Growing Demand for High Bandwidth Memory in AI
HBM technology plays a crucial role in AI and high-performance computing by vertically stacking DRAM chips to enhance data processing speeds and bandwidth. This innovation is particularly essential for AI applications, where GPUs require fast and efficient memory solutions to handle large-scale model training and inference tasks.
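The bandwidth advantage comes from HBM's very wide interface. As a rough sketch, assuming the standard 1024-bit bus per stack and a pin speed of about 9.2 Gb/s (a figure from published HBM3E specifications, not from this announcement):

```python
# Rough per-stack bandwidth estimate; the 9.2 Gb/s pin speed is an
# assumption based on published HBM3E figures.
BUS_WIDTH_BITS = 1024
PIN_SPEED_GBPS = 9.2

def stack_bandwidth_gbs(bus_bits: int = BUS_WIDTH_BITS,
                        pin_gbps: float = PIN_SPEED_GBPS) -> float:
    """Peak bandwidth of one HBM stack in GB/s (bits -> bytes: divide by 8)."""
    return bus_bits * pin_gbps / 8

print(f"~{stack_bandwidth_gbs():.1f} GB/s per stack")
```

Under these assumptions a single stack delivers roughly 1.2 TB/s, which is why GPUs pair several stacks per package to feed large-scale training and inference workloads.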
With demand for AI-driven hardware surging, securing high-quality HBM supplies has become a top priority for semiconductor giants like NVIDIA. Micron’s ability to mass-produce 12-stack HBM positions it as a key supplier in this competitive market.
As the AI industry continues to evolve, advancements in HBM technology will remain critical for improving computing performance, reducing energy consumption, and meeting the growing needs of AI-powered applications.