According to an interview South Korean outlet SEDaily conducted at ISSCC 2025 (the IEEE International Solid-State Circuits Conference), Samsung Electronics unveiled its HBM memory roadmap at the event.
In Samsung's view, HBM4E brings two major changes over HBM4: the move to 32Gb DRAM dies and an increase in the per-pin data rate to 10Gbps. The former expands single-stack capacity to 64GB in a 16-Hi configuration, while the latter gives HBM4E 1.25 times the overall bandwidth of HBM4 (whose JEDEC baseline pin rate is 8Gbps).
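The two headline figures are straightforward to check. A minimal sketch, assuming the 8Gbps HBM4 baseline pin rate mentioned above:

```python
# Sanity-check the HBM4E figures quoted in the article.

die_capacity_gbit = 32   # per-die density of the new HBM4E DRAM
stack_height = 16        # 16-Hi stacking

# 32 Gbit x 16 dies = 512 Gbit = 64 GB per stack
stack_capacity_gbyte = die_capacity_gbit * stack_height / 8
print(f"Per-stack capacity: {stack_capacity_gbyte:.0f} GB")

hbm4_pin_rate_gbps = 8    # JEDEC HBM4 baseline per-pin rate
hbm4e_pin_rate_gbps = 10  # HBM4E target per the article

# 10 / 8 = 1.25x bandwidth uplift at the same interface width
uplift = hbm4e_pin_rate_gbps / hbm4_pin_rate_gbps
print(f"Bandwidth uplift: {uplift:.2f}x")
```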
As for downstream applications, HBM4E is expected to be adopted by NVIDIA's Rubin Ultra AI GPU, slated for 2027. Rubin Ultra supports 12 HBM4(E) stacks, which puts the memory capacity of a single accelerator at an impressive 768GB.
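The 768GB figure follows directly from the per-stack capacity. A rough sketch, with the per-stack bandwidth line assuming HBM4's 2048-bit interface width (an assumption about the final HBM4E spec, not something the article states):

```python
# Per-accelerator totals for a 12-stack Rubin Ultra package.

stacks_per_gpu = 12       # HBM4(E) sites per accelerator, per the article
stack_capacity_gbyte = 64 # 16-Hi stack of 32Gb dies

total_capacity = stacks_per_gpu * stack_capacity_gbyte
print(f"Total capacity: {total_capacity} GB")  # 12 x 64 = 768 GB

# Assumed: HBM4E keeps HBM4's 2048-bit (256-byte) interface per stack.
interface_bits = 2048
pin_rate_gbps = 10
stack_bw_tbyte_s = interface_bits * pin_rate_gbps / 8 / 1000
print(f"Per-stack bandwidth: {stack_bw_tbyte_s:.2f} TB/s")  # 2.56 TB/s
```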