Updated May 2nd, 2024 at 11:08 IST

Nvidia supplier SK Hynix's AI chips sold out for current fiscal year

As a key supplier to Nvidia, SK Hynix disclosed plans to commence sample shipments of its latest 12-layer HBM3E chip

Reported by: Business Desk

SK Hynix HBM chips: South Korea's SK Hynix announced on Thursday that its high-bandwidth memory (HBM) chips, integral to AI chipsets, have been fully allocated for the current year, with nearly all of the inventory for 2025 already reserved, reflecting the robust expansion of artificial intelligence services across industries.

As a key supplier to Nvidia and the world's second-largest memory chipmaker, SK Hynix disclosed plans to commence sample shipments of its latest 12-layer HBM3E chip in May, with mass production slated for the third quarter.

SK Hynix's Chief Executive Officer, Kwak Noh-Jung, underlined the sustained growth trajectory of the HBM market, projecting annual demand growth of about 60 per cent over the mid-to-long term, driven by escalating data volumes and increasingly complex AI models.

In a strategic move, SK Hynix last month unveiled a $3.87 billion initiative to establish an advanced chip packaging facility in Indiana, USA, which will include an HBM chip line. Concurrently, the company disclosed a 5.3 trillion won ($3.9 billion) investment in a new DRAM chip factory at home, prioritising HBM production.

Kwak underscored that SK Hynix's investment approach to HBM deviates from conventional industry patterns: capacity is expanded only after extensive consultations with customers, making the company more responsive to actual demand.

During last week's post-earnings conference call, SK Hynix also flagged the risk of shortages in conventional memory chips for smartphones, personal computers and network servers by year-end, citing the possibility of demand surpassing expectations.

Looking ahead, Justin Kim, SK Hynix's head of AI infrastructure, forecast a significant shift in chip composition by 2028, with AI-focused components such as HBM and high-capacity DRAM modules projected to account for 61 per cent of total memory volume by value, up from just 5 per cent in 2023.

(With Reuters inputs) 

Published May 2nd, 2024 at 11:08 IST