Oleg Zabluda's blog
Thursday, April 06, 2017
ISCA paper preprint about Google's Tensor Processing Unit
Originally shared by Jeff Dean

Paper: https://drive.google.com/file/d/0Bx4hafXDDq2EMzRNcy1vSUxtcEk/view
Blog post by Norm Jouppi: https://cloudplatform.googleblog.com/2017/04/quantifying-the-performance-of-the-TPU-our-first-machine-learning-chip.html

Last June at Google I/O, Sundar Pichai showed an example of a new type of custom ASIC that Google had developed to accelerate machine learning workloads, called a Tensor Processing Unit (TPU), but didn't give many details. The TPU runs large neural networks very efficiently and with low latency across many Google products, including Search, Photos, and Translate; it also powered the AlphaGo system used in the match against Lee Sedol in Korea last March. It delivers 92 trillion operations per second (TOPS) per chip within a modest power budget. I'm happy to announce that we now have a detailed paper, "In-Datacenter Performance Analysis of a Tensor Processing Unit", that will appear at this year's International Symposium on Computer Architecture (ISCA) in Toronto in June. Today we've published a preprint of the paper and a companion blog post, and David Patterson will give a talk about the TPU at the Computer History Museum in Mountain View this afternoon (https://sites.google.com/corp/view/naeregionalsymposium; sadly, no more space is available).
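The 92 TOPS figure follows directly from the hardware parameters reported in the paper: a 256x256 systolic array of 8-bit MAC units clocked at 700 MHz. A quick back-of-the-envelope check:

```python
# Back-of-the-envelope check of the TPU's peak throughput, using the
# parameters reported in the paper.
MAC_UNITS = 256 * 256    # 65,536 multiply-accumulate units in the matrix unit
OPS_PER_MAC = 2          # each MAC counts as two operations: multiply + add
CLOCK_HZ = 700e6         # 700 MHz clock

tops = MAC_UNITS * OPS_PER_MAC * CLOCK_HZ / 1e12
print(f"{tops:.2f} TOPS")  # ~91.75 TOPS, quoted as 92 TOPS
```

This peak rate applies to 8-bit integer operations; real workloads achieve a fraction of it depending on how well they keep the matrix unit fed.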

Various news articles:
Hacker News discussion: https://news.ycombinator.com/item?id=14043059

