Intel Releases Cooper Lake CPU Family, Bakes in Bfloat16

Intel will increase focus on AI workloads…

Intel has launched its third-generation “Cooper Lake” family of Xeon processors — which the chip heavyweight promises will make AI inference and training “more widely deployable on general-purpose CPUs”.

Although the new CPUs may not break records (the top-of-the-range Platinum 8380H* has 28 cores, for a total of 224 cores in an 8-socket system) they come with some welcome new capabilities for users, and are being welcomed by OEMs eager to refresh their hardware offerings this year.

The company promises the chips will be able to underpin more powerful deep learning, virtual machine (VM) density, in-memory database, mission-critical application and analytics-intensive workloads.

Intel says the 8380H will deliver 1.9X better performance on “popular” workloads versus five-year-old systems. (Benchmarks here, #11).

It has a maximum memory speed of 3200 MHz, a processor base frequency of 2.90 GHz, and supports up to 48 PCI Express lanes.

Cooper Lake range: The specs.

The Cooper Lake chips feature something called “bfloat16”: a numeric format that uses half the bits of the FP32 format but “achieves comparable model accuracy with minimal software changes required.”

Bfloat16 was born at Google and is useful for AI, but hardware support for it has not been the norm to date. (AI workloads involve a great deal of floating point-intensive arithmetic; the equivalent of your machine doing a lot of fractions, something that is expensive to do in binary systems.)
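What makes bfloat16 cheap to support is its bit layout: it keeps float32’s sign bit and all 8 exponent bits (so the same dynamic range) but only 7 of the 23 mantissa bits. A minimal illustrative sketch in Python, using simple truncation of the low 16 bits (production hardware typically rounds rather than truncates):

```python
import struct

def float32_to_bfloat16_bits(x: float) -> int:
    """Reinterpret x as IEEE-754 float32 bits and keep the top 16 bits.

    This preserves the sign and the full 8-bit exponent, but throws away
    the bottom 16 mantissa bits, leaving 7 bits of mantissa precision.
    """
    (bits,) = struct.unpack("<I", struct.pack("<f", x))
    return bits >> 16

def bfloat16_bits_to_float32(b: int) -> float:
    """Widen a 16-bit bfloat16 pattern back to float32 by zero-padding."""
    (value,) = struct.unpack("<f", struct.pack("<I", b << 16))
    return value
```

Round-tripping a value like 3.14159 through this conversion gives roughly 3.1406: the exponent survives intact, but precision drops to about 2–3 decimal digits, which turns out to be enough for many neural-network workloads.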

(For readers looking to get into the weeds on exponent and mantissa bit differences et al, EE Journal’s Jim Turley has a nice write-up here; Google Cloud’s Shibo Wang talks through how it is used in cloud TPUs here).

Intel says the chips have been adopted as the foundation for Facebook’s latest Open Compute Platform (OCP) servers, with Alibaba, Baidu and Tencent also adopting the chips, which are shipping now. General OEM systems availability is expected in the second half of 2020.

Also new: the Optane persistent memory 200 series, with up to 4.5TB of memory per socket to handle data-intensive workloads; two new NAND SSDs (the SSD D7-P5500 and P5600) featuring a new low-latency PCIe controller; and, teased: the forthcoming, AI-optimised Stratix 10 NX FPGA.

See also: Micro Focus on its relisting, supply chain security, edge versus cloud, and THAT “utterly bogus” spy chip story