HBM2

Demand for high-bandwidth memory is set to explode in the coming quarters and years due to the broader adoption of artificial intelligence in general and generative AI in particular. SK Hynix will likely be the primary beneficiary of the HBM rally, as it leads shipments of this type of memory with a 50% share in 2022, according to TrendForce. Analysts from TrendForce believe that shipments of AI servers equipped with compute GPUs like Nvidia's A100 or H100 increased by roughly 9% year-over-year in 2022, though they do not specify whether they mean unit shipments or dollar shipments. They now expect the rise of generative AI to catalyze demand for AI servers, with this market growing by 15.4% in 2023...

SK Hynix Adds HBM2 to Catalog: 4 GB Stacks Set to Be Available in Q3

SK Hynix quietly added its HBM Gen 2 memory stacks to its public product catalog earlier this month, which means that the start of mass production should be...

by Anton Shilov on 8/1/2016

NVIDIA Unveils the DGX-1 HPC Server: 8 Teslas, 3U, Q2 2016

For a few years now, NVIDIA has been flirting with the server business as a means of driving the growth of datacenter sales of their products. A combination of...

by Ryan Smith & Ian Cutress on 4/6/2016

AMD Unveils GPU Architecture Roadmap: After Polaris Comes Vega

Although AMD’s GDC 2016 “Capsaicin” event was primarily focused on game development – it is the Game Developers Conference, after all – AMD did spend a brief moment discussing...

by Ryan Smith on 3/15/2016

JEDEC Publishes HBM2 Specification as Samsung Begins Mass Production of Chips

High-bandwidth memory (HBM) technology solves two key problems with modern DRAM: it substantially increases the bandwidth available to computing devices (e.g., GPUs) and reduces power consumption. The first-generation...

by Anton Shilov on 1/20/2016
