
Google Building Its Own Chips for Machine Learning

May 23, 2016

by Angela Guess

Frederic Lardinois reports in TechCrunch, “As Google announced at its I/O developer conference today, the company recently started building its own specialized chips to expedite the machine learning algorithms. These so-called Tensor Processing Units (TPU) are custom-built chips that Google has now been using in its own data centers for almost a year, as Google’s senior VP for its technical infrastructure Urs Holzle noted in a press conference at I/O. Google says it’s getting ‘an order of magnitude better-optimized performance per watt for machine learning’ and argues that this is ‘roughly equivalent to fast-forwarding technology about seven years into the future’.”

Lardinois goes on, “Google also manages to speed up the machine learning algorithms with the TPUs because it doesn’t need the high-precision of standard CPUs and GPUs. Instead of 32-bit precision, the algorithms happily run with a reduced precision of 8 bits, so every transaction needs fewer transistors. If you are using Google’s voice recognition services, your queries are already running on these TPUs today — and if you’re a developer, Google’s Cloud Machine Learning services also run on these chips. AlphaGo, which recently beat the Go world champion, also ran on TPUs. Holzle said that Google decided to build these application-specific chips instead of using more flexible FPGAs because it was looking for efficiency.”
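The reduced-precision idea quoted above can be illustrated with a small sketch: map 32-bit floating-point values onto the signed 8-bit integer range with a scale factor, then recover approximate values afterward. This is a minimal, hypothetical example of the general technique, not Google's TPU implementation; all function and variable names here are invented for illustration.

```python
import numpy as np

def quantize_int8(x: np.ndarray):
    """Map float32 values onto the signed 8-bit range [-127, 127]."""
    scale = np.max(np.abs(x)) / 127.0
    q = np.round(x / scale).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float32 values from the 8-bit codes."""
    return q.astype(np.float32) * scale

weights = np.array([0.12, -0.98, 0.45, 0.003], dtype=np.float32)
q, scale = quantize_int8(weights)
approx = dequantize(q, scale)

# The 8-bit approximation stays close to the 32-bit original,
# while each stored value needs only a quarter of the bits.
print(np.max(np.abs(weights - approx)))
```

Because each value is stored and processed in 8 bits instead of 32, the hardware operating on it can be simpler, which is the efficiency the article describes.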

Read more at TechCrunch.

Photo credit: Google
