Samsung on Tuesday announced that it is increasing production volumes of its 8 GB, 8-Hi HBM2 DRAM stacks due to growing demand. In the coming months the company’s 8 GB HBM2 chips will be used in several applications, including consumer, professional, AI, and parallel computing products. Meanwhile, AMD’s Radeon Vega graphics cards for professionals and gamers will likely be the largest consumers of HBM2 in terms of volume. And while AMD has traditionally been an SK Hynix customer, the timing of this announcement relative to AMD’s launches certainly suggests that AMD is likely a Samsung customer this round as well.

Samsung’s 8 GB HBM2 memory KGSDs (known good stacked dies) are based on eight 8 Gb DRAM devices in an 8-Hi stack configuration. The memory components are interconnected using TSVs and feature over 5,000 TSV interconnects each. Every KGSD has a 1024-bit bus and offers up to a 2 Gbps data rate per pin, thus providing up to 256 GB/s of memory bandwidth per 8-Hi stack. The company did not disclose the power consumption and heat dissipation of its HBM memory components.

Update 7/20: Samsung confirmed that the DRAM devices are made using 20 nm process technology, but could not disclose power consumption and TDP of KGSDs.
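The headline figures above follow directly from the stack's configuration; a quick sketch of the arithmetic (a sanity check, not Samsung-published code):

```python
# Sanity-check the quoted HBM2 per-stack numbers.

# Capacity: eight 8 Gb DRAM dies in an 8-Hi stack.
dies_per_stack = 8
capacity_per_die_gbit = 8
capacity_gbyte = dies_per_stack * capacity_per_die_gbit / 8  # bits -> bytes

# Bandwidth: 1024-bit interface at 2 Gbps per pin.
bus_width_bits = 1024
data_rate_gbps_per_pin = 2
bandwidth_gbyte_s = bus_width_bits * data_rate_gbps_per_pin / 8  # Gb/s -> GB/s

print(capacity_gbyte)     # 8.0  GB per stack
print(bandwidth_gbyte_s)  # 256.0 GB/s per stack
```

Both results match the 8 GB capacity and 256 GB/s bandwidth Samsung quotes per 8-Hi stack.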

Samsung began mass production of 4-Hi HBM2 KGSDs with 4 GB capacity and 2 Gbps data rate per pin in early 2016. These chips have been used to build various solutions based on NVIDIA’s GP100 and later GV100 GPUs aimed at HPC and similar applications. The company also started to manufacture HBM2 KGSDs with 8 GB capacity in 2016 and so far, Samsung is the only company to publicly announce that they can mass-produce 8 GB HBM2 KGSDs.

Recently AMD launched its Radeon Vega Frontier Edition cards, the first commercial products featuring Vega and 8-Hi HBM2 stacks. To date we haven't been able to get confirmation of whose HBM2 AMD is using – frequent collaborator SK Hynix's or Samsung’s – however, as Samsung is for now the only vendor to announce 8-Hi volume production, it looks increasingly likely that AMD is using Samsung's HBM2. Meanwhile, in the coming months AMD will expand its lineup of Vega-based graphics cards with the RX Vega line for gamers, and considering that such devices sell in mass quantities, Samsung has a very good reason to increase HBM2 production.

Samsung expects 8 GB HBM2 KGSDs to account for over 50% of its HBM2 production by the first half of 2018.


Source: Samsung


  • milkod2001 - Wednesday, July 19, 2017 - link

    What would be the price difference between the 8GB HBM2 AMD is using and the 8GB GDDR5X NV is using?
  • extide - Wednesday, July 19, 2017 - link

    I would expect the memory itself is actually pretty similar in price -- the thing about HBM, though, is the interposer and the requirement that you have a good GPU die, good HBM stacks, and then a good final product with the interposer, GPU, and memory -- so many more places for things to go wrong, plus the cost of the interposer itself.
  • ddriver - Wednesday, July 19, 2017 - link

    The difference is nvidia is not using it on "mainstream" grade products, but instead on products with immensely higher margins.

    I suspect the reason for the heavy vega delay is someone **cough nvidia** poaching the initial supply, raising the price, forcing amd to wait for better availability and lower prices, since they want to put it in a product that will not enjoy such high margins, and from the looks of it, won't shatter performance records either.

    I suspect this production ramp up is the reason why amd are finally planning a hard release, the FE doesn't really count.
  • Santoval - Wednesday, July 19, 2017 - link

    So you suspect Nvidia bought many more HBM2 chips than they required? How many Tesla GP100 and GV100 boards is Nvidia expecting to sell compared to AMD's consumer and semi-pro Vega cards? Due to additional demand from self-driving car developers, perhaps the ratio is not 1 Tesla (GP/GV combined) for every 10 AMD cards, but there is no way it is going to be more than 1 per 7. That would leave them with a significant stock of unused HBM2 chips.

    Unless you implied that Nvidia deliberately bought more HBM2 chips in order to hinder AMD's Vega release. Who knows, that might just be the case. But it would be *very* expensive.
  • ddriver - Thursday, July 20, 2017 - link

    No, it simply offered a better price, since they can afford it, with the added benefit of screwing with amd by depriving them of what they need to launch their long delayed high end offering.

    I expect that nvidia has sold far more hbm2 products than amd has. And whoever told you that nvidia hbm2 products only go into self driving cars really took you for a ride.

    Nvidia didn't buy more than they needed, they simply poached the initial supply, leaving amd with some scraps for testing and the limited FE release.
  • Strunf - Friday, July 21, 2017 - link

    Yeah right, as if NVIDIA needed to do such a thing... If NVIDIA could even do such a thing, then it just means HBM2 wasn't ready for the mass market, and hence it's AMD's own fault for making a product based on a technology not readily available.
    Besides, the fact that the VEGA FE release is a big mess means AMD has bigger problems than HBM2 availability to sort out...
  • Strunf - Thursday, July 20, 2017 - link

    What initial supply? Producers of HBM2 have been delaying it and reducing its capabilities... the HBM2 problems have been known since before Vega was even announced.
  • descendency - Wednesday, July 19, 2017 - link

    Size of the board.
  • nathanddrews - Wednesday, July 19, 2017 - link

    HBM hasn't been all that impressive to date. It's technically very impressive, but in practice it seems disappointing due to the hardware it's paired with, especially when contrasted with faster GPUs using GDDR5/X.

    Are we ever going to see multi-die GPUs unified by HBM? Navi? Volta 2.0?
  • Manch - Wednesday, July 19, 2017 - link

    Dude just say you hate AMD but love NVidia instead of the thinly veiled "analysis" LOL. Nothing wrong with being a fan, but come on.
