
As Demand for HBM Explodes, SK Hynix Is Expected to Benefit
Demand for high-bandwidth memory (HBM) is set to explode in the coming quarters and years due to the broader adoption of artificial intelligence in general and generative AI in particular. SK Hynix is expected to be the primary beneficiary of the HBM rally, as it leads shipments of this type of memory, holding a 50% share in 2022, according to TrendForce.
Analysts from TrendForce believe that shipments of AI servers equipped with compute GPUs such as Nvidia's A100 or H100 increased by roughly 9% year-over-year in 2022, though they do not specify whether they mean unit or dollar shipments. They now estimate that the rise of generative AI will catalyze demand for AI servers, with this market growing by 15.4% in 2023 and continuing to expand at a compound annual growth rate (CAGR) of 12.2% through 2027.
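For perspective, here is a minimal back-of-the-envelope sketch in Python of what those growth rates imply. The 2022 base is normalized to 1.0 because TrendForce gives no absolute figure, and it assumes the 12.2% CAGR is measured from 2022 through 2027; both are assumptions for illustration only.

```python
# Illustrative only: relative AI server market size implied by TrendForce's figures.
growth_2023 = 0.154      # 15.4% growth in 2023
cagr_2022_2027 = 0.122   # 12.2% compound annual growth rate (base year assumed to be 2022)

base_2022 = 1.0  # normalized; no absolute unit or dollar figure is given
size_2023 = base_2022 * (1 + growth_2023)
size_2027 = base_2022 * (1 + cagr_2022_2027) ** 5  # five compounding years: 2023-2027

print(f"2023 vs. 2022: {size_2023:.2f}x")
print(f"2027 vs. 2022: {size_2027:.2f}x")  # roughly 1.78x under these assumptions
```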
The upsurge in AI server usage will also boost demand for all types of memory, including commodity DDR5 SDRAM, HBM2e and HBM3 for compute GPUs, and 3D NAND for high-performance, high-capacity storage devices.
TrendForce estimates that while a general-purpose server packs 500 GB – 600 GB of commodity memory, an AI server uses 1.2 TB – 1.7 TB. In addition, such machines use compute GPUs equipped with 80 GB or more of HBM2e/HBM3 memory. Since each AI machine carries several compute GPUs, the total HBM content per box is now 320 GB – 640 GB, and it is only set to grow further as accelerators like AMD's Instinct MI300 and Nvidia's H100 NVL carry even more HBM3 memory.
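Those per-box totals follow directly from GPU count times per-GPU HBM capacity. A minimal sketch of the arithmetic; the 4- and 8-GPU configurations are assumptions used for illustration, not TrendForce figures:

```python
# Per-server HBM arithmetic implied above; GPU counts per server are assumed.
HBM_PER_GPU_GB = 80  # HBM2e/HBM3 capacity of an A100/H100-class compute GPU

for gpus_per_server in (4, 8):
    total_hbm_gb = gpus_per_server * HBM_PER_GPU_GB
    print(f"{gpus_per_server} GPUs x {HBM_PER_GPU_GB} GB = {total_hbm_gb} GB of HBM per box")
# Prints 320 GB and 640 GB, matching the 320 GB - 640 GB range cited above.
```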
Speaking of HBM3 adoption, it is worth noting that SK Hynix is currently the only maker mass-producing this type of memory, according to TrendForce. As a result, as demand for it grows, SK Hynix stands to benefit the most. Last year the company commanded 50% of HBM shipments, followed by Samsung with 40% and Micron with 10%. This year it will solidify its position and control 53% of HBM shipments, while the shares of Samsung and Micron will decline to 38% and 9%, respectively, TrendForce claims.
Today, AI servers are used primarily by the leading U.S. cloud service providers, including AWS, Google, Meta, and Microsoft. As more companies launch generative AI products, they will inevitably need AI servers, whether on-premises or hosted at providers such as AWS or Microsoft. For example, Baidu and ByteDance plan to introduce generative AI products and services in the coming quarters.
Source: TrendForce