SK Hynix has completed internal certification for its next-generation HBM4 chips and established a production system for customers, marking a key milestone in its AI memory roadmap.
The South Korean chipmaker, currently Nvidia's main HBM supplier, shipped 12-layer HBM4 samples to customers in March and plans to complete preparations for mass production in the second half of 2025. HBM (high-bandwidth memory) stacks DRAM dies vertically to save space, reduce power consumption, and meet the massive data requirements of AI training workloads. HBM4 is a major generational leap, delivering more than 2 TB/s of bandwidth, which makes it critical for next-generation AI infrastructure.
Key Highlights
Analysts reacted positively, with SK Hynix shares rising 7.3 percent in morning trade, outpacing the broader KOSPI’s 1.2 percent gain. Meritz Securities projects SK Hynix’s HBM market share to remain in the low 60 percent range through 2026, supported by early HBM4 supply and a tighter customer integration strategy.
The HBM4 generation introduces customer-specific logic dies, making it harder to replace a supplier’s chip with an identical competitor product. This marks a departure from the more standardized nature of previous HBM generations and could strengthen SK Hynix’s competitive edge.
Already holding 62 percent of the global HBM market, SK Hynix recently overtook Samsung as the world's top memory chipmaker. An early start on HBM4 mass production could further consolidate that position as AI-driven demand for high-performance memory accelerates globally.