Start-up creates AI processor as big as an iPad

Paul van Gerven
Reading time: 1 minute

Throwing all conventional wisdom out of the window, Silicon Valley start-up Cerebras has developed an AI processor measuring a whopping 21.5 by 21.5 cm. Packing 1.2 trillion transistors and 400,000 cores manufactured in a 16nm process, the company claims its Wafer-Scale Engine (WSE) can train neural networks up to a thousand times faster than conventional hardware.

There are good reasons why most chips fit in the palm of your hand. Defectivity is an important one. Chip-ruining defects inevitably end up scattered across the wafer during manufacturing, and each one typically kills the die it lands on. The smaller the dies, the less wafer area a single defect takes out, and the more working chips can be sawed from the wafer. Another reason is cooling: the bigger the chip, the harder it is to cool properly.
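To get a feel for the yield argument, here is a minimal sketch using the textbook Poisson yield model; the defect density and die areas are illustrative assumptions, not figures from Cerebras or its foundry.

```python
import math

# Illustrative sketch: simple Poisson yield model, Y = exp(-D0 * A).
# The defect density and die areas below are assumed for illustration only.
DEFECT_DENSITY = 0.1  # defects per cm^2 (hypothetical)

def poisson_yield(die_area_cm2: float, d0: float = DEFECT_DENSITY) -> float:
    """Expected fraction of dies that come out defect-free."""
    return math.exp(-d0 * die_area_cm2)

small_die = 1.0             # a 1 cm^2 "palm-sized" die (assumed)
wafer_scale = 21.5 * 21.5   # ~462 cm^2, the WSE footprint

print(f"Small die yield:   {poisson_yield(small_die):.1%}")    # roughly 90%
print(f"Wafer-scale yield: {poisson_yield(wafer_scale):.2e}")  # effectively zero
```

Under these assumptions a defect-free wafer-scale die is essentially impossible, which is why a design like the WSE has to tolerate defects (for instance through redundant cores) rather than rely on a perfect wafer.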

Limiting chip size obviously caps the processing power of an individual chip. Supercomputers and AI systems therefore rely on many chips working together. However, shuttling data between chips is much slower than moving it around within a single chip, and this communication ultimately becomes a performance bottleneck.
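A toy calculation makes the bottleneck concrete; the step time, data volume and link bandwidths below are illustrative assumptions, not measured values.

```python
# Illustrative sketch: per-step time when compute and data transfer
# do not overlap. All numbers are hypothetical.
def step_time(compute_s: float, data_bytes: float, bw_bytes_per_s: float) -> float:
    """Total time per training step: compute plus transfer of exchanged data."""
    return compute_s + data_bytes / bw_bytes_per_s

compute = 1e-3        # 1 ms of pure compute per step (assumed)
data = 100e6          # 100 MB of activations/gradients exchanged (assumed)
on_chip_bw = 1e12     # ~1 TB/s on-die fabric (illustrative)
off_chip_bw = 25e9    # ~25 GB/s chip-to-chip link (illustrative)

print(f"On-chip:  {step_time(compute, data, on_chip_bw) * 1e3:.2f} ms")   # ~1.1 ms
print(f"Off-chip: {step_time(compute, data, off_chip_bw) * 1e3:.2f} ms")  # ~5.0 ms
```

With these made-up numbers, the same workload spends most of its time waiting on the inter-chip link, which is exactly the overhead a single wafer-scale die avoids.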
