Thursday, June 26, 2025

A sneak peek at the HBM cold war between Samsung and SK hynix


As high-bandwidth memory (HBM) moves from HBM3 to its extended version HBM3e, fierce competition is kicking off between Samsung and SK hynix. Micron, the third-largest memory maker, has also tagged along to claim a stake in this memory nirvana, which is strategically vital in artificial intelligence (AI) designs.

HBM is a high-value, high-performance memory that vertically interconnects multiple DRAM chips to dramatically boost data processing speed compared to conventional DRAM products. HBM3e is the fifth generation of HBM, following HBM, HBM2, HBM2E, and HBM3 memory devices.

HBM helps package numerous AI processors and memories in a multi-connected fashion to build a successful AI system that can process a huge amount of data quickly. “HBM memory is very complicated, and the value added is very high,” Jensen Huang, Nvidia co-founder and CEO, said at a media briefing during the GPU Technology Conference (GTC) held in March 2024 in San Jose, California. “We are spending a lot of money on HBM.”

Take Nvidia’s A100 and H100 processors, which commanded 80% of the entire AI processor market in 2023; SK hynix is the sole supplier of HBM3 chips for these GPUs. SK hynix currently dominates the market with a first-mover advantage. It launched the first HBM chip in partnership with AMD in 2014 and the first HBM2 chip in 2015.


Figure 1 SK hynix currently dominates the HBM market with nearly 90% of the market share.

Last month, SK hynix made waves by announcing the start of mass production of the industry’s first HBM3e chip. So, is the HBM market and its intrinsic pairing with AI processors a case of winner-takes-all? Not really. Enter Samsung with a 12-layer HBM3e chip.

Samsung’s HBM surprise

Samsung’s crosstown memory rival SK hynix has been considered the unrivalled HBM champion since it unveiled the first HBM memory chip in 2014. It is known as the sole HBM supplier of AI kingpin Nvidia, while Samsung has been widely reported to be lagging in HBM3e sample submission and validation.

Then came Nvidia’s four-day annual conference, GTC 2024, where the GPU supplier unveiled its H200 and B100 processors for AI applications. Samsung, known for its quiet determination, once more outpaced its rivals by showing 12-layer HBM3e chips with 36 GB capacity and 1.28 TB/s bandwidth.

Figure 2 Samsung startled the market by announcing 12-layer HBM3e devices, compared to the 8-layer HBM3e chips from Micron and SK hynix.

Samsung’s HBM3e chips are currently going through a verification process at Nvidia, and CEO Jensen Huang’s note “Jensen Approved” next to Samsung’s 12-layer HBM3e device on display at GTC 2024 hints that the validation process is a done deal. South Korean media outlet Alpha Biz has reported that Samsung will begin supplying Nvidia with its 12-layer HBM3e chips as early as September 2024.

These HBM3e chips stack 12 DRAM dies, each with 24-Gb density, yielding 36 GB of capacity and a peak memory bandwidth of 1.28 TB/s, 50% higher than 8-layer HBM3e devices. Samsung also claims its 12-layer HBM3e device maintains the same height as the 8-layer HBM3e while offering 50% more capacity.
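The capacity figures follow from simple arithmetic: a stack’s total capacity is the layer count times the per-die density, divided by 8 to convert gigabits to gigabytes. A minimal sketch (the 24-Gb die density comes from the article; the helper name is illustrative):

```python
def stack_capacity_gb(layers: int, die_density_gbit: int) -> float:
    """Stack capacity in GB = number of DRAM layers x per-die density (Gbit) / 8."""
    return layers * die_density_gbit / 8

# Samsung 12-layer HBM3e: 12 dies x 24 Gbit each
print(stack_capacity_gb(12, 24))  # 36.0 GB
# An 8-layer HBM3e stack built from the same dies
print(stack_capacity_gb(8, 24))   # 24.0 GB, so 12 layers give 50% more capacity
```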

It’s important to note that SK hynix began supplying 8-layer HBM3e devices to Nvidia in March 2024, while its 12-layer devices, though displayed at GTC 2024, are reportedly encountering process issues. Likewise, Micron, the world’s third-largest producer of memory chips after Samsung and SK hynix, announced production of 8-layer HBM3e chips in February 2024.

Micron’s window of opportunity

Micron, seeing the popularity of HBM devices in AI applications, is also catching up with its Korean rivals. Market research firm TrendForce, which valued the HBM market at roughly 8.4% of the overall DRAM industry in 2023, projects that this share could expand to 20.1% by the end of 2024.

Micron’s first HBM3e product stacks 8 DRAM layers, offering 24 GB capacity and 1.2 TB/s bandwidth. The Boise, Idaho-based memory supplier calls its HBM3e chip “HBM3 Gen2” and claims it consumes 30% less power than rival offerings.

Figure 3 Micron’s HBM3e chip has reportedly been qualified for pairing with Nvidia’s H200 Tensor Core GPU.

Besides technical merits like lower power consumption, market dynamics are helping the U.S. memory chip supplier catch up with its Korean rivals Samsung and SK hynix. As noted by Anshel Sag, an analyst at Moor Insights & Strategy, SK hynix having already sold out its 2024 inventory could position rivals like Micron as a reliable second source.

It’s worth mentioning that Micron has already qualified as an HBM3e supplier for Nvidia’s H200 processors. Shipments of Micron’s 8-layer HBM3e chips are set to begin in the second quarter of 2024. And like SK hynix, Micron claims to have sold all its HBM3e inventory for 2024.

HBM: a market to watch

The HBM market will remain competitive in 2024 and beyond. While HBM3e is positioning itself as the new mainstream memory device, both Samsung and SK hynix aim to mass-produce HBM4 devices in 2026.

SK hynix is employing hybrid bonding technology to stack 16 DRAM layers and achieve 48 GB capacity; compared to HBM3e chips, it is expected to boost bandwidth by 40% and lower power consumption by 70%.

At the International Solid-State Circuits Conference (ISSCC 2024), held in San Francisco on February 18-21, where SK hynix showcased its 16-layer HBM devices, Samsung also demonstrated its HBM4 device boasting a bandwidth of 2 TB/s, a whopping 66% increase over HBM3e. The device also doubled the number of I/Os.
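The jump from 1.28 TB/s to 2 TB/s is consistent with doubling the I/O count, since peak bandwidth is roughly the interface width times the per-pin data rate. A rough sketch of that arithmetic (the 1024-bit HBM3e width and the per-pin rates are assumptions drawn from typical published figures for these generations, not stated in the article):

```python
def peak_bandwidth_gbs(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s = interface width (bits) x per-pin data rate (Gb/s) / 8."""
    return bus_width_bits * pin_rate_gbps / 8

# HBM3e: assumed 1024-bit interface at 10 Gb/s per pin
print(peak_bandwidth_gbs(1024, 10))  # 1280.0 GB/s, i.e. ~1.28 TB/s
# HBM4 demo: I/O count doubled to an assumed 2048 bits at 8 Gb/s per pin
print(peak_bandwidth_gbs(2048, 8))   # 2048.0 GB/s, i.e. ~2 TB/s
```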

HBM is no longer the unsung hero of the AI revolution, and all eyes are on the uptake of this remarkable memory technology.
