Today Qualcomm is revealing more information on the "Cloud AI 100" inference chip and platform it announced last year. The company says the new inference platform has already entered production, with first silicon successfully back and first customer sampling under way. The Cloud AI 100 is Qualcomm's first foray into the datacentre AI inference accelerator business, representing the company's investment in machine learning: leveraging its expertise from the consumer mobile SoC world and bringing it to the enterprise market. Qualcomm first revealed the Cloud AI 100 early last year, although admittedly that was more of a paper launch than a disclosure of what the hardware actually brings to the table. Today, with actual silicon in the...
One of the more interesting AI silicon projects over the last couple of years has been the Cerebras Wafer Scale Engine, most notably for the fact that a single... — by Dr. Ian Cutress on 8/21/2020
One of Intel's future 10nm products is the Spring Hill NNP-I 1000 Inference Engine. Today the company is lifting the lid on some of the architecture behind the chip. — by Dr. Ian Cutress on 8/20/2019
One of the characteristics of Intel is its investment into new IP. This usually takes several forms, such as internal R&D, or investing in other companies through Intel Capital, or... — by Ian Cutress on 4/16/2019
A large number of inference demonstrations published by the big chip manufacturers revolve around processing large batch sizes of images on trained networks. In reality, when video is being... — by Ian Cutress on 4/10/2019
When visiting the Supercomputing conference this year, there were plenty of big GPU systems on display for machine learning. A large number were geared towards the heavy-duty cards... — by Ian Cutress on 11/19/2018