Google claims TPU v4 pods offer an exaflops of computing power


Google will start using the fourth generation of its Tensor Processing Units this year. The artificial intelligence chips are said to be twice as fast as the v3 generation, and a pod with 4096 TPUs would deliver an exaflops of computing power.

Google CEO Sundar Pichai announced the TPU v4 generation during Google I/O 2021. He said Google will soon be deploying "dozens" of pods with TPU v4 chips in its data centers. 'Pods' is the name Google uses for racks of TPU boards. Each pod with fourth-generation chips offers an exaflops of computing power, or a thousand petaflops, according to Pichai.

By comparison, Google's current pods with TPU v3 chips offer more than a hundred petaflops of computing power. The company says it uses 2048 TPU v3 cores in those pods, and each TPU v3 chip has two of those cores, which means a v3 pod contains 1024 chips. According to Pichai, a v4 pod has 4096 chips. It is not known how many cores the v4 chips contain, but Google will again place four of those chips on a single board. According to Google, they are "twice as fast" as the third generation, but the company has not yet published further details about the chips.
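The quoted pod figures can be roughly cross-checked against the chip counts. The sketch below is a back-of-the-envelope calculation based only on the rounded numbers in this article (one exaflops per v4 pod, about a hundred petaflops per v3 pod); the per-chip values it derives are estimates, not official specifications.

```python
# Back-of-the-envelope check of the figures quoted in the article.
# Pod-level numbers are the rounded values Google cites, so the
# per-chip results derived here are rough estimates only.

V3_POD_PFLOPS = 100       # "more than a hundred petaflops" per v3 pod
V3_CORES_PER_POD = 2048
V3_CORES_PER_CHIP = 2
V4_POD_PFLOPS = 1000      # one exaflops = 1,000 petaflops per v4 pod
V4_CHIPS_PER_POD = 4096

v3_chips_per_pod = V3_CORES_PER_POD // V3_CORES_PER_CHIP   # 1024 chips
v3_pflops_per_chip = V3_POD_PFLOPS / v3_chips_per_pod      # ~0.1 petaflops
v4_pflops_per_chip = V4_POD_PFLOPS / V4_CHIPS_PER_POD      # ~0.24 petaflops

print(f"v3: {v3_chips_per_pod} chips/pod, "
      f"~{v3_pflops_per_chip * 1000:.0f} teraflops per chip")
print(f"v4: {V4_CHIPS_PER_POD} chips/pod, "
      f"~{v4_pflops_per_chip * 1000:.0f} teraflops per chip")
print(f"per-chip ratio v4/v3: ~{v4_pflops_per_chip / v3_pflops_per_chip:.1f}x")
```

On these rounded figures the per-chip ratio works out to roughly 2.5x, which is broadly consistent with Google's "twice as fast" claim given that the pod totals are approximate.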

Google introduced its first self-designed Tensor Processing Units in 2015, optimized for artificial intelligence workloads. The company itself needs ever more computing power for the increasingly complex machine learning models that underpin its services, such as image, text and audio recognition. In addition, the company offers the computing power of the pods commercially; the TPU v4 pods are expected to become available to customers later this year as well.
