Samsung Intros 9.8 Gbps HBM3E “Shinebolt”, 32 Gbps GDDR7, 7.5 Gbps LPDDR5x CAMM2 Memory

by Contributor

Samsung has officially introduced its next-gen memory technologies including HBM3E, GDDR7, LPDDR5x CAMM2, and more during its Memory Tech Day 2023.

Samsung Goes All Out With Next-Gen Memory Technologies Including HBM3E, GDDR7, LPDDR5x CAMM2 & More

We have already reported on the development of Samsung's HBM3E memory, codenamed "Shinebolt", and GDDR7 for next-generation AI, gaming, and data center applications. These can be seen as the two biggest highlights of Memory Tech Day 2023, but Samsung has plenty more going on.

Samsung HBM3E “Shinebolt” Memory For AI & Data Centers

Building on Samsung’s expertise in commercializing the industry’s first HBM2 and opening the HBM market for high-performance computing (HPC) in 2016, the company today revealed its next-generation HBM3E DRAM, named Shinebolt. Samsung’s Shinebolt will power next-generation AI applications, improving total cost of ownership (TCO) and speeding up AI-model training and inference in the data center.

Image Source: Samsung

The HBM3E boasts an impressive pin speed of 9.8 gigabits per second (Gbps), meaning it can achieve transfer rates exceeding 1.2 terabytes per second (TB/s) per stack. In order to enable higher-layer stacks and improve thermal characteristics, Samsung has optimized its non-conductive film (NCF) technology to eliminate gaps between chip layers and maximize thermal conductivity. Samsung's 8H and 12H HBM3 products are currently in mass production, and samples of Shinebolt are shipping to customers.
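As a quick sanity check on that figure, here is a minimal back-of-the-envelope sketch in Python, assuming the standard 1,024-bit interface of an HBM stack as listed in the specifications table below:

```python
# Back-of-the-envelope check of the per-stack HBM3E bandwidth quoted above.
# Assumption: a standard 1024-bit (1024-pin) HBM stack interface.
pin_speed_gbps = 9.8          # data rate per pin, in gigabits per second
interface_width_bits = 1024   # data pins per HBM stack

bandwidth_gb_s = pin_speed_gbps * interface_width_bits / 8   # gigabytes per second
print(f"{bandwidth_gb_s:.1f} GB/s = {bandwidth_gb_s / 1000:.2f} TB/s per stack")
# -> 1254.4 GB/s, i.e. roughly 1.25 TB/s, consistent with "exceeding 1.2 TB/s"
```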

Leaning into its strength as a total semiconductor solutions provider, the company also plans to offer a custom turnkey service that combines next-generation HBM, advanced packaging technologies, and foundry offerings.

HBM Memory Specifications Comparison

| DRAM | HBM1 | HBM2 | HBM2e | HBM3 | HBM3 Gen2 | HBMNext (HBM4) |
|---|---|---|---|---|---|---|
| I/O (Bus Interface) | 1024 | 1024 | 1024 | 1024 | 1024-2048 | 1024-2048 |
| Prefetch (I/O) | 2 | 2 | 2 | 2 | 2 | 2 |
| Maximum Bandwidth | 128 GB/s | 256 GB/s | 460.8 GB/s | 819.2 GB/s | 1.2 TB/s | 1.5-2.0 TB/s |
| DRAM ICs Per Stack | 4 | 8 | 8 | 12 | 8-12 | 8-12 |
| Maximum Capacity | 4 GB | 8 GB | 16 GB | 24 GB | 24-36 GB | 36-64 GB |
| tRC | 48ns | 45ns | 45ns | TBA | TBA | TBA |
| tCCD | 2ns (=1tCK) | 2ns (=1tCK) | 2ns (=1tCK) | TBA | TBA | TBA |
| VPP | External VPP | External VPP | External VPP | External VPP | External VPP | TBA |
| VDD | 1.2V | 1.2V | 1.2V | TBA | TBA | TBA |
| Command Input | Dual Command | Dual Command | Dual Command | Dual Command | Dual Command | Dual Command |

Samsung GDDR7 – 32 Gbps & 32 Gb DRAM For Next-Gen Gaming Graphics

Other products highlighted at the event include the 32Gb DDR5 DRAM with the industry’s highest capacity, the industry’s first 32Gbps GDDR7, and the petabyte-scale PBSSD, which offers a significant boost to storage capabilities for server applications.

According to Samsung, GDDR7 memory will offer a 40% performance boost and a 20% improvement in power efficiency over the current-fastest 24 Gbps GDDR6 DRAM, which tops out at 16 Gb die capacities. The first products will be rated at transfer speeds of up to 32 Gbps, a 33% improvement over GDDR6, delivering up to 1.5 TB/s of bandwidth on a 384-bit bus interface.

Image Source: Samsung

Here is the bandwidth a 32 Gbps pin speed would offer across several bus configurations (a short calculation sketch follows the list):

  • 512-bit – 2048 GB/s (2.0 TB/s)
  • 384-bit – 1536 GB/s (1.5 TB/s)
  • 320-bit – 1280 GB/s (1.3 TB/s)
  • 256-bit – 1024 GB/s (1.0 TB/s)
  • 192-bit – 768 GB/s
  • 128-bit – 512 GB/s
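The figures above are simply the per-pin data rate multiplied by the bus width; a minimal sketch of that calculation in Python:

```python
# bandwidth (GB/s) = pin speed (Gbit/s) * bus width (bits) / 8 bits per byte
PIN_SPEED_GBPS = 32  # GDDR7 per-pin data rate

for bus_width_bits in (512, 384, 320, 256, 192, 128):
    bandwidth_gb_s = PIN_SPEED_GBPS * bus_width_bits / 8
    print(f"{bus_width_bits}-bit -> {bandwidth_gb_s:.0f} GB/s ({bandwidth_gb_s / 1000:.1f} TB/s)")
```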

The company has also tested early samples running at speeds of up to 36 Gbps, though we doubt those will be mass-produced in quantities large enough to supply the next-gen gaming and AI GPU lineups.

GDDR7 memory will also offer 20% higher power efficiency, which is welcome considering how much power memory consumes on high-end GPUs. Samsung's GDDR7 DRAM is said to include technology specifically optimized for high-speed workloads, and there will also be a low-operating-voltage option designed for power-conscious applications such as laptops. For thermals, the new memory standard will utilize an epoxy molding compound (EMC) with high thermal conductivity, reducing thermal resistance by up to 70%. It was reported back in August that Samsung was sampling its GDDR7 DRAM to NVIDIA for early evaluation of its next-gen gaming graphics cards.

GDDR Graphics Memory Evolution:

| GRAPHICS MEMORY | GDDR5X | GDDR6 | GDDR6X | GDDR7 |
|---|---|---|---|---|
| Workload | Gaming | Gaming / AI | Gaming / AI | Gaming / AI |
| Platform (Example) | GeForce GTX 1080 Ti | GeForce RTX 2080 Ti | GeForce RTX 4090 | GeForce RTX 5090? |
| Number of Placements | 12 | 12 | 12 | 12? |
| Gb/s/pin | 11.4 | 14-16 | 19-24 | 32-36 |
| GB/s/placement | 45 | 56-64 | 76-96 | 128-144 |
| GB/s/system | 547 | 672-768 | 912-1152 | 1536-1728 |
| Configuration (Example) | 384 IO (12pcs x 32 IO package) | 384 IO (12pcs x 32 IO package) | 384 IO (12pcs x 32 IO package) | 384 IO (12pcs x 32 IO package)? |
| Frame Buffer of Typical System | 12 GB | 12 GB | 24 GB | 24 GB? |
| Average Device Power (pJ/bit) | 8.0 | 7.5 | 7.25 | TBD |
| Typical IO Channel | PCB (P2P SM) | PCB (P2P SM) | PCB (P2P SM) | PCB (P2P SM) |

Samsung LPDDR5x For Next-Gen CAMM2 Modules, Slimming Down Mobile Designs

In order to process data-intensive tasks, today's AI technologies are moving toward a hybrid model that allocates and distributes workloads between cloud and edge devices. Accordingly, Samsung introduced a range of memory solutions that deliver high performance, high capacity, low power, and small form factors at the edge.

In addition to the industry's first 7.5 Gbps LPDDR5X CAMM2, which is expected to be a true game changer in the next-generation PC and laptop DRAM market, the company also showcased its 9.6 Gbps LPDDR5X DRAM, LLW DRAM specialized for on-device AI, next-generation Universal Flash Storage (UFS), and the high-capacity Quad-Level Cell (QLC) SSD BM9C1 for PCs.

GPU Memory Technology Updates

| Graphics Card Name | Memory Technology | Memory Speed | Memory Bus | Memory Bandwidth | Release |
|---|---|---|---|---|---|
| AMD Radeon R9 Fury X | HBM1 | 1.0 Gbps | 4096-bit | 512 GB/s | 2015 |
| NVIDIA GTX 1080 | GDDR5X | 10.0 Gbps | 256-bit | 320 GB/s | 2016 |
| NVIDIA Tesla P100 | HBM2 | 1.4 Gbps | 4096-bit | 720 GB/s | 2016 |
| NVIDIA Titan Xp | GDDR5X | 11.4 Gbps | 384-bit | 547 GB/s | 2017 |
| AMD RX Vega 64 | HBM2 | 1.9 Gbps | 2048-bit | 483 GB/s | 2017 |
| NVIDIA Titan V | HBM2 | 1.7 Gbps | 3072-bit | 652 GB/s | 2017 |
| NVIDIA Tesla V100 | HBM2 | 1.7 Gbps | 4096-bit | 901 GB/s | 2017 |
| NVIDIA RTX 2080 Ti | GDDR6 | 14.0 Gbps | 384-bit | 672 GB/s | 2018 |
| AMD Instinct MI100 | HBM2 | 2.4 Gbps | 4096-bit | 1229 GB/s | 2020 |
| NVIDIA A100 80 GB | HBM2e | 3.2 Gbps | 5120-bit | 2039 GB/s | 2020 |
| NVIDIA RTX 3090 | GDDR6X | 19.5 Gbps | 384-bit | 936.2 GB/s | 2020 |
| AMD Instinct MI200 | HBM2e | 3.2 Gbps | 8192-bit | 3200 GB/s | 2021 |
| NVIDIA RTX 3090 Ti | GDDR6X | 21.0 Gbps | 384-bit | 1008 GB/s | 2022 |
| NVIDIA H100 80 GB | HBM3/E | 2.6 Gbps | 5120-bit | 1681 GB/s | 2022 |
