Samsung ramps up production of 8GB HBM2 modules
Samsung is increasing production of its 8GB DRAM with HBM2 interface, in its own words to meet rapidly growing demand. The memory is intended for video cards, high-performance computing, and servers. Current video cards use 4GB modules.
Samsung is currently the only manufacturer producing HBM2 modules with a capacity of 8GB. The announcement gives no exact production figures, and the South Korean manufacturer also does not say which products will use the memory or when they will reach the market.
HBM2 is used by Nvidia and AMD in high-end video cards, among others. Nvidia first used HBM2 on its Tesla P100 cards last year and also uses it in the Tesla V100 introduced in May, but in both cases these are 4GB modules. Samsung began mass production of that memory in early 2016.
AMD used the first generation of HBM memory on its Fury cards and uses HBM2 in its Radeon Vega Frontier Edition. That 16GB card combines two 8GB HBM2 stacks. The upcoming Radeon RX Vega cards will presumably also feature HBM2, but in what quantity and with which packages is not yet known; these are likely to be 4GB modules as well.
Samsung’s 8GB HBM2 modules consist of eight stacked DRAM dies, each with a capacity of 8Gb, sitting on a buffer die at the bottom of the stack. Each layer features over 5,000 through-silicon vias, bringing the total to over 40,000 per 8GB package.
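The per-package arithmetic can be checked with a quick sketch; the die count, per-die density, and TSV count below are the figures from Samsung's announcement:

```python
# Capacity and TSV count of one 8GB HBM2 package, using Samsung's figures
dies_per_stack = 8      # eight 8Gb DRAM dies on top of the buffer die
die_capacity_gbit = 8   # gigabits per die
tsvs_per_die = 5000     # "over 5,000" through-silicon vias per layer

package_capacity_gbyte = dies_per_stack * die_capacity_gbit // 8  # bits -> bytes
total_tsvs = dies_per_stack * tsvs_per_die

print(package_capacity_gbyte)  # 8 (GB per package)
print(total_tsvs)              # 40000, matching the "over 40,000" figure
```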
Nvidia’s current high-end video cards and deep learning accelerators combine four 4GB HBM2 packages for a total of 16GB. With the new 8GB modules, 32GB would be possible in the same configuration.
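As a sanity check on those totals, a minimal sketch assuming the four-package-per-card layout the article describes:

```python
# Total card memory for a four-package HBM2 configuration
packages_per_card = 4

for package_gb in (4, 8):  # current 4GB packages vs. the new 8GB packages
    total_gb = packages_per_card * package_gb
    print(total_gb)  # 16, then 32
```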