A sneak peek at the HBM cold war between Samsung and SK hynix


As high-bandwidth memory (HBM) moves from HBM3 to its extended version HBM3e, fierce competition is kicking off between Samsung and SK hynix. Micron, the third-largest memory maker, has also tagged along to stake a claim in this memory nirvana, which is strategically vital in artificial intelligence (AI) designs.

HBM is a high-value, high-performance memory that vertically interconnects multiple DRAM dies to dramatically boost data-processing speed compared with conventional DRAM products. HBM3e is the fifth generation of HBM, following HBM, HBM2, HBM2E and HBM3 memory devices.

HBM helps package numerous AI processors and memories in a multi-connected fashion to build a successful AI system that can process a massive amount of data quickly. “HBM memory is very complicated, and the value added is very high,” Jensen Huang, Nvidia co-founder and CEO, said at a media briefing during the GPU Technology Conference (GTC) held in March 2024 in San Jose, California. “We are spending a lot of money on HBM.”

Take Nvidia’s A100 and H100 processors, which commanded 80% of the entire AI processor market in 2023; SK hynix is the sole supplier of HBM3 chips for these GPUs. SK hynix currently dominates the market with a first-mover advantage. It launched the first HBM chip in partnership with AMD in 2014 and the first HBM2 chip in 2015.


Figure 1 SK hynix currently dominates the HBM market with nearly 90% market share.

Last month, SK hynix made waves by announcing the start of mass production of the industry’s first HBM3e chip. So, is the HBM market and its intrinsic pairing with AI processors a case of winner-takes-all? Not really. Enter Samsung with a 12-layer HBM3e chip.

Samsung’s HBM surprise

Samsung’s crosstown memory rival SK hynix has been considered the unrivalled HBM champion since it unveiled the first HBM memory chip in 2014. It is also known as the sole HBM supplier of AI kingpin Nvidia, while Samsung has been widely reported to be lagging in HBM3e sample submission and validation.

Then came Nvidia’s four-day annual conference, GTC 2024, where the GPU supplier unveiled its H200 and B100 processors for AI applications. Samsung, known for its quiet determination, once more outpaced its rivals by displaying 12-layer HBM3e chips with 36 GB capacity and 1.28 TB/s bandwidth.

Figure 2 Samsung startled the market by announcing 12-layer HBM3e devices, compared with the 8-layer HBM3e chips from Micron and SK hynix.

Samsung’s HBM3e chips are currently going through a verification process at Nvidia, and CEO Jensen Huang’s note “Jensen Approved” next to Samsung’s 12-layer HBM3e device on display at GTC 2024 hints that the validation process is a done deal. South Korean media outlet Alpha Biz has reported that Samsung will begin supplying Nvidia with its 12-layer HBM3e chips as early as September 2024.

These HBM3e chips stack 12 DRAM dies of 24 Gb each, reaching a peak memory bandwidth of 1.28 TB/s, more than 50% higher than 8-layer HBM3 devices. Samsung also claims its 12-layer HBM3e device maintains the same height as the 8-layer HBM3e while offering 50% more capacity.
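The capacity claims are easy to sanity-check. The sketch below assumes a per-die density of 24 Gb (gigabits), which is what the announced 36 GB total implies for a 12-die stack:

```python
# Assumption: each DRAM die in the stack holds 24 Gb (gigabits).
GBIT_PER_DIE = 24

def stack_capacity_gb(layers: int) -> float:
    """Total capacity of an HBM stack in gigabytes (8 bits per byte)."""
    return layers * GBIT_PER_DIE / 8

cap_12 = stack_capacity_gb(12)   # 12-layer HBM3e
cap_8 = stack_capacity_gb(8)     # 8-layer HBM3e
gain = (cap_12 - cap_8) / cap_8  # relative capacity gain

print(cap_12)  # 36.0 GB, matching Samsung's announced figure
print(cap_8)   # 24.0 GB
print(gain)    # 0.5 → the claimed 50% more capacity at the same stack height
```

Keeping the 12-layer stack at the same height as the 8-layer part requires thinner dies and a thinner die-to-die interconnect, which is why the layer count, rather than the arithmetic, is the hard part.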

It’s important to note that SK hynix began supplying 8-layer HBM3e devices to Nvidia in March 2024, while its 12-layer devices, though displayed at GTC 2024, are reportedly encountering process issues. Likewise, Micron, the world’s third-largest maker of memory chips after Samsung and SK hynix, announced production of 8-layer HBM3e chips in February 2024.

Micron’s window of opportunity

Micron, seeing the popularity of HBM devices in AI applications, is also catching up with its Korean rivals. Market research firm TrendForce, which valued the HBM market at roughly 8.4% of the overall DRAM industry in 2023, projects that this share could expand to 20.1% by the end of 2024.

Micron’s first HBM3e product stacks 8 DRAM layers, offering 24 GB capacity and 1.2 TB/s bandwidth. The Boise, Idaho-based memory supplier calls its HBM3e chip “HBM3 Gen2” and claims it consumes 30% less power than rival offerings.

Figure 3 Micron’s HBM3e chip has reportedly been qualified for pairing with Nvidia’s H200 Tensor Core GPU.

Besides technical merits like lower power consumption, market dynamics are helping the U.S. memory chip supplier catch up with its Korean rivals Samsung and SK hynix. As noted by Anshel Sag, an analyst at Moor Insights & Strategy, SK hynix having already sold out its 2024 inventory could position rivals like Micron as a reliable second source.

It’s worth mentioning that Micron has already qualified as a prime HBM3e supplier for Nvidia’s H200 processors. Shipments of Micron’s 8-layer HBM3e chips are set to begin in the second quarter of 2024. And like SK hynix, Micron claims to have sold all its HBM3e inventory for 2024.

HBM: a market to watch

The HBM market will remain competitive in 2024 and beyond. While HBM3e is positioning itself as the new mainstream memory device, both Samsung and SK hynix aim to mass-produce HBM4 devices in 2026.

SK hynix is employing hybrid bonding technology to stack 16 DRAM layers and achieve 48 GB of capacity; compared with HBM3e chips, the device is expected to boost bandwidth by 40% and lower power consumption by 70%.

At the International Solid-State Circuits Conference (ISSCC 2024), held in San Francisco on February 18-21, where SK hynix showcased its 16-layer HBM devices, Samsung also demonstrated its HBM4 device boasting a bandwidth of 2 TB/s, a whopping 66% increase over HBM3e. The device also doubled the number of I/Os.
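Peak stack bandwidth is simply the interface width multiplied by the per-pin data rate, so doubling the I/O count roughly doubles bandwidth at a similar pin speed. The per-pin rates below are illustrative assumptions, not figures from the announcements; HBM3e's 1024-bit interface is per the JEDEC HBM3 standard:

```python
def hbm_bandwidth_gb_s(io_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak bandwidth of one HBM stack in GB/s:
    interface width (bits) x per-pin rate (Gb/s), converted to bytes."""
    return io_width_bits * pin_rate_gbps / 8

# HBM3e: 1024-bit interface; a 10 Gb/s pin rate reproduces the 1.28 TB/s figure.
print(hbm_bandwidth_gb_s(1024, 10.0))  # 1280.0 GB/s
# Doubling the I/Os to 2048 at a similar (assumed) pin rate lands near 2 TB/s.
print(hbm_bandwidth_gb_s(2048, 8.0))   # 2048.0 GB/s
```

This is why HBM4 can raise bandwidth substantially without pushing pin speeds harder: the gain comes from a wider interface rather than faster signaling.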

HBM is no longer the unsung hero of the AI revolution, and all eyes are on the uptake of this remarkable memory technology.
