Nvidia’s biggest AI memory producer nearing max capacity for 2025

SK Hynix Inc. revealed that its capacity to make high-bandwidth memory chips is almost fully booked through next year, underscoring the intense demand for semiconductors essential to artificial intelligence development.

The South Korean company now aims to begin mass production of its next-generation HBM chip in the third quarter, it said in a statement Thursday.

That’s a move intended to keep SK Hynix ahead of Samsung Electronics Co. in providing the advanced components, which work alongside Nvidia Corp. accelerators to create and host AI platforms.

SK Hynix is a key supplier of components critical to the boom in ChatGPT-like services being developed from the US to China.

The company, which last week reported its fastest pace of revenue growth since 2010, has seen its shares gain more than 20% this year on expectations it will continue to lead the market in HBM chips.

The stock was little changed in Seoul trading on Thursday.

SK Hynix said last month it plans to spend about $14.6 billion (R271.15 billion) building a new memory chip complex in South Korea to meet that demand.

It’s also erecting a $4 billion (R74.29 billion) packaging facility in Indiana — its first in the US.

In response to a question about how the company would pay for the investments, chief financial officer Kim Woo-hyun said he expects the company to generate sufficient cash from operations to fund the necessary projects.

The company plans to secure capital over the medium to long term, balancing cash generation against its financial health.
