The burgeoning field of artificial intelligence (AI) has driven a surge in demand for specialized hardware. Recent reports indicate that industry giants SK hynix and Samsung have sold out their entire HBM3 memory supply, underscoring the increasing reliance on high-performance memory solutions to meet the computational demands of advanced AI applications.
The announcement that Nvidia’s Hopper H100 AI graphics cards will be unavailable on the market for the next six to nine months is further driving up demand within the AI industry.
The surge in demand has notably benefited HBM3 memory manufacturers SK hynix and Samsung, positioning them as the biggest winners in this uptrend.
According to reports from Korean news outlets, SK hynix and Samsung are optimistic about the future owing to the soaring demand for AI GPUs, chiefly because the two companies are currently the sole producers of HBM3 memory.
HBM3 memory, which is crucial for the latest AI GPUs, boasts features like high transfer rates, expansive bandwidth, enhanced memory capacity, and reduced power consumption.
Even though SK hynix leads in this sector, Samsung is intensifying the market competition with its newly unveiled HBM3e “Shinebolt” offering.
SK hynix asserts that with potential improvements of 60% to 80% in HBM interface technology over the next five years, it can preserve its market share against competitors such as Samsung.
A TrendForce report from last year put SK hynix at 50% of the global HBM memory market, Samsung at 40%, and Micron in third place with 10%.
SK hynix has reported that its HBM3 and HBM3e production capacity for next year is already sold out, with demand still on the rise.