SK hynix is preparing a US initial public offering valued at up to $14 billion to fund expanded production of high-bandwidth memory (HBM) chips, according to TechCrunch AI, as the South Korean manufacturer seeks to capitalise on surging demand from artificial intelligence infrastructure providers.
The proposed listing, which could raise between $10 billion and $14 billion, would mark one of the largest technology IPOs in recent years and comes as the industry grapples with acute shortages of the HBM chips essential for AI training and inference workloads. SK hynix currently commands approximately 50% of the global HBM market, supplying critical components to Nvidia, AMD, and other AI accelerator manufacturers.
The timing reflects intensifying pressure on memory suppliers to expand capacity. Industry analysts estimate that HBM demand will grow at a compound annual rate exceeding 100% through 2026, driven primarily by generative AI applications and large language model deployment. Current production constraints have created delivery lead times stretching beyond six months for some specifications, with prices rising accordingly.
SK hynix has indicated the IPO proceeds would fund construction of additional fabrication facilities and advanced packaging lines specifically designed for HBM3 and next-generation HBM4 production. The company’s existing South Korean facilities are operating at maximum capacity, whilst competitors Samsung and Micron face similar constraints despite their own expansion programmes.
The shortage has created a cascading effect across the AI infrastructure stack. Cloud providers including Amazon Web Services, Microsoft Azure, and Google Cloud have reported extended wait times for GPU instances, whilst enterprise AI deployments face procurement delays that can stretch project timelines by quarters rather than weeks. Some hyperscalers have resorted to securing multi-year supply agreements at premium pricing to guarantee allocation.
Market implications
The capital injection positions SK hynix to potentially double its HBM production capacity within 18 to 24 months, according to industry observers. This timeline would align with the expected ramp of next-generation AI accelerators from Nvidia, AMD, and emerging competitors, all of which require substantially more HBM per chip than current designs.
Nvidia stands to benefit most directly from expanded SK hynix capacity, given its dominant position in AI accelerators and existing supply relationships. However, the capacity addition also creates opportunities for AMD and Intel to secure larger allocations for their competing products, potentially intensifying competition in the AI chip market.
For cloud providers and enterprise AI buyers, increased HBM supply could ease procurement bottlenecks and moderate pricing pressure by late 2025 or early 2026. This would particularly benefit mid-sized AI companies and research institutions that have struggled to secure GPU access on economically viable terms.
The IPO also carries implications for SK hynix’s competitors. Samsung Electronics has announced plans to invest $230 billion in semiconductor manufacturing through 2042, with significant allocation to memory production. Micron Technology, the primary American HBM supplier, may face pressure to accelerate its own capacity expansion to maintain market position.
Geopolitical considerations add complexity to the expansion plans. US restrictions on advanced semiconductor equipment exports to China, combined with growing emphasis on domestic chip production, may influence where SK hynix locates new facilities. The company has existing operations in both South Korea and China, though its most advanced HBM production occurs domestically.
What to watch
The IPO’s success will depend partly on broader market conditions and investor appetite for semiconductor exposure following recent volatility in chip stocks. Regulatory approval timelines in both South Korea and the United States will also affect the offering schedule, with completion likely in the second half of 2025 at the earliest.
Industry observers should monitor whether the capital raise prompts similar moves from Samsung or Micron, potentially triggering a broader capacity expansion cycle that could fundamentally alter AI infrastructure economics. The scale and speed of SK hynix’s production ramp will directly influence AI deployment costs and feasibility across the industry through at least 2027.