Nvidia to release A100 GPU with 80GB HBM2e memory and PCIe 4.0 interface


Nvidia is working on a new variant of the A100 GPU, with 80GB HBM2e memory and a PCIe 4.0 interface. Currently, the PCIe variant of this accelerator only comes with 40GB of video memory.

The A100 PCIe with 80GB of HBM2e memory recently appeared on Nvidia’s data center website, without an official announcement. Nvidia previously introduced an 80GB variant of the A100 GPU, but it was only supplied as an SXM4 module, which is mounted directly on the motherboard. The company has not previously offered an A100 with 80GB of memory as a PCIe plug-in card.

Nvidia has not yet announced a release date for the new PCIe variant, but anonymous sources tell VideoCardz that the GPU should become available next week. The new 80GB variant would have a memory bandwidth of 2TB/s, just like the SXM4 module with 80GB of memory.

The Nvidia A100 is a data center GPU based on the Ampere architecture, which is also used in the company’s GeForce RTX 30 graphics cards. The chip has a die size of 826mm² and consists of 54 billion transistors. The A100 features 6912 CUDA cores. However, those cores are built differently from the CUDA cores in Nvidia’s recent GeForce consumer video cards, so the core counts can’t be compared with those of the company’s RTX 30 cards.

Nvidia A100 Specifications
Model               A100 PCIe             A100 SXM4
GPU                 GA100                 GA100
CUDA cores          6912                  6912
Memory              40GB / 80GB           40GB / 80GB
Memory bandwidth    40GB: 1555GB/s        40GB: 1555GB/s
                    80GB: 2039GB/s?       80GB: 2039GB/s
Max. TDP            40GB: 250W            40GB: 400W
                    80GB: not yet known   80GB: 400W