Google Takes the Next Step in Developing AI

The Tensor Processing Unit promises leaps in processing speed


Published: April 21, 2017

Rebekah Carter - Writer


Google teased the communications industry last year with an initial announcement about its AI developments, although the details were thin. We knew that the Tensor Processing Unit (TPU), Google’s latest development, was coming – we just didn’t know much about it. Now, however, the latest news about the processing chip has arrived with a white paper, set to be presented at a National Academy of Engineering meeting. The paper claims that the TPU is anywhere between 15 and 30 times faster than contemporary CPUs and GPUs, and around 30 to 80 times more energy efficient in some circumstances.

According to Norm Jouppi, a senior hardware engineer at Google, demand for TPUs started to emerge around six years ago, when the company began using expensive deep-learning models in more places across its product portfolio. However, the expense of running those models was a worry: had people used Google voice search for just three minutes a day, the company would have had to double the number of data centres it has in place.

The Developing Role of Artificial Intelligence

Artificial intelligence, and learning technologies in particular, has begun to play a crucial role in Google’s business. More than 100 different teams now use machine-learning components to power Google products, from Inbox Smart Reply to Street View. To put the challenge into perspective, Google already operates around 15 data centres across the globe, and doubling that number would be a financial and practical nightmare.

Today, Google’s business depends on the successful use of adverts, which have to be relevant to user mood, preference, activity, and general state of play. Artificial intelligence is key to managing all of this data and creating more personalised experiences. The status quo needed to evolve so that Google could meet the demands of users and advertisers alike, while maintaining the profitability that keeps investors interested.

The TPU operates in a space Google describes as “neural network inference”. In other words, it handles the prediction phase, where an already-trained model is applied to new data – work that is very demanding and data-heavy from a performance perspective. In last year’s announcement, Google noted that the TPU is designed for machine-learning applications. Because such workloads are more tolerant of reduced computational accuracy, each operation can be done with fewer transistors.

With the TPU, Google claims it can squeeze more operations into the silicon per second; use more powerful and sophisticated models for machine learning; and apply those models faster, so users get results rapidly.
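The reduced-precision trade-off described above can be sketched in a few lines. The snippet below is a minimal illustration of the general idea – linearly quantising floating-point weights and inputs to 8-bit-range integers, multiplying in integer arithmetic, then rescaling – and is not Google’s implementation; all names are illustrative.

```python
import numpy as np

def quantize(x, num_bits=8):
    """Linearly map a float array onto signed integers of the given width."""
    qmax = 2 ** (num_bits - 1) - 1            # e.g. 127 for 8 bits
    scale = float(np.max(np.abs(x))) / qmax
    scale = scale if scale > 0 else 1.0       # avoid division by zero
    q = np.round(x / scale).astype(np.int32)
    return q, scale

rng = np.random.default_rng(0)
weights = rng.standard_normal((4, 4)).astype(np.float32)
inputs = rng.standard_normal(4).astype(np.float32)

qw, sw = quantize(weights)   # integer weights plus their scale factor
qi, si = quantize(inputs)    # integer inputs plus their scale factor

# Matrix-vector product done entirely in integer arithmetic,
# then rescaled back to floating point.
approx = (qw @ qi) * (sw * si)
exact = weights @ inputs

# The quantised result closely tracks the full-precision one.
print(np.max(np.abs(approx - exact)))
```

Because each integer multiply-accumulate needs far less silicon than its floating-point equivalent, a chip built around this kind of arithmetic can pack many more operations into the same area – which is the trade-off the TPU exploits.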

Diversifying Google

It seems as though Google is taking steps to actively diversify the technology it invests in. However, every example we’ve seen so far can be traced back to data collection, which feeds the artificial-intelligence machinery that improves the search capability we use every day.

Perhaps the most important takeaway from this development is a sense of the position Google now occupies. Should the current claims prove correct, the chip will put the search-engine giant years ahead of its competition, in a field that could mean everything to tomorrow’s technology leaders.

Read the full study here.


