Samsung Electronics is preparing to begin production of its next-generation high-bandwidth memory chips, known as HBM4, as early as next month, with the chips expected to be supplied to Nvidia, Reuters reported on Monday, citing a source familiar with the matter.
The move marks a critical step for the South Korean technology giant as it works to close the gap with domestic rival SK Hynix, which has emerged as Nvidia’s primary supplier of advanced memory used in artificial intelligence accelerators. Samsung has struggled in recent quarters due to delays in qualifying its latest HBM products, which previously weighed on both earnings and share performance.
Following the report, Samsung shares rose 2.2% in morning trading, while SK Hynix stock fell 2.9%, reflecting shifting investor sentiment around the competitive landscape for AI memory chips.
Catching Up in the AI Memory Race
HBM chips are a crucial component in Nvidia’s AI processors, providing the ultra-fast memory bandwidth required to power large-scale AI training and inference workloads. SK Hynix currently dominates the market for advanced HBM products, including earlier HBM3 and HBM3E generations, giving it a strong foothold in the booming AI supply chain.
Samsung’s planned start of HBM4 production signals an effort to regain ground after falling behind last year due to supply delays and qualification hurdles. The source speaking to Reuters did not disclose how many HBM4 units Samsung plans to ship to Nvidia or the financial value of the potential supply agreement.
Samsung declined to comment on the report, while Nvidia did not immediately respond to a request for comment.
Reports Point to Successful Qualification Tests
The Korea Economic Daily, a South Korean newspaper, separately reported that Samsung has passed HBM4 qualification tests conducted by both Nvidia and AMD. Citing semiconductor industry sources, the newspaper said Samsung is expected to begin shipping HBM4 chips to Nvidia as early as next month.
If confirmed, this would represent a significant breakthrough for Samsung in the high-margin AI memory market, where demand continues to surge as cloud providers and chipmakers race to scale AI infrastructure.
SK Hynix Expands Production Capacity
Meanwhile, SK Hynix has also been ramping up capacity to meet rising demand. The company said in October that it had completed supply negotiations with major customers for next year’s HBM shipments.
Earlier this month, an SK Hynix executive told Reuters that the company plans to begin feeding silicon wafers next month into its new M15X fabrication facility in Cheongju, South Korea. The fab is designed to produce advanced HBM chips, though the executive did not confirm whether HBM4 would be part of the initial production lineup.
Earnings in Focus as HBM4 Details Emerge
Both Samsung Electronics and SK Hynix are scheduled to report fourth-quarter earnings on Thursday, and investors are closely watching the announcements for additional details on HBM4 orders, production volumes, and customer commitments.
The broader memory market has already shown signs of tightening supply. Contract prices for 32GB DDR5 memory modules surged to $239 in November, up sharply from $149 in September, according to industry data cited by Reuters. The jump underscores strong demand from AI servers and data center expansion.
Nvidia’s Next-Generation Platform Depends on HBM4
Nvidia CEO Jensen Huang said earlier this month that the company’s next-generation AI platform, Vera Rubin, is now in full production. Nvidia plans to launch the new chips later this year, pairing them with HBM4 memory to deliver higher performance and energy efficiency for advanced AI workloads.
As Nvidia prepares for that rollout, reliable access to HBM4 supply has become increasingly strategic — raising the stakes for memory makers competing to secure long-term supply agreements.
Why It Matters
Samsung’s entry into HBM4 production could reshape the competitive balance in the AI memory market, which has so far been dominated by SK Hynix. With Nvidia, AMD, and other AI chipmakers demanding ever-faster memory solutions, the success or failure of HBM4 qualification may determine which suppliers capture the next wave of AI-driven revenue growth.