Google recently unveiled the latest version of an advanced hardware and software system designed specifically for machine learning—and it could have huge implications for the adoption of AI, according to machine learning experts we interviewed.
Google’s system comprises second-generation Tensor Processing Units (TPUs), the chips the company designed specifically for its internal machine learning applications. These chips, reports TechCrunch, are 15-30X faster “in executing Google’s regular machine learning workloads than a standard GPU/CPU [hardware] combination.”
Previously, TPUs were jealously guarded by Google and used internally to power its major products, which—from search to translate—all use types of machine learning.
But now, Google is deploying TPUs across the Google Compute Engine, reports The Verge, “a platform other companies and researchers can tap for computing resources similar to Amazon’s AWS.” (AWS is Amazon’s wildly successful cloud computing wing.)
What this means is that Google is potentially opening up massive machine learning computational power to companies with ideas for AI-powered products and services.
If that comes true, expect big doses of AI-powered products and features to make their way into the marketing and sales industries as it becomes easier and cheaper to solve business challenges with AI.
We spoke to several AI experts to get their take on both the potential and limitations of Google’s chip and platform.
The chip and platform have people like Cortex CEO Brennan White cautiously optimistic.
“We are fans of [Google’s open source machine intelligence library] TensorFlow already and excited to see what this can do,” says White.
Cortex’s solution uses AI to improve content marketing decision-making.
“This could be a big move for Google to get machine learning companies to leave Amazon or to forgo building their own hardware.”
Part of the appeal here is just how well that hardware performs, says Pawan Deshpande, CEO of Curata, a company that uses AI to curate third-party content.
“Back when I was a researcher at MIT and Google, I'd literally have to wait for days for the training to complete across a parallelized computing infrastructure,” he says. (Parallel computing is what Google’s chip and platform use to perform machine learning calculations faster.)
“Google's TPU 2.0 is exciting because it can purportedly reduce this to a matter of hours, allowing researchers to more quickly iterate on new machine learning models, and also enable more online learning algorithms (meaning they train in real-time as feedback is provided).”
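The online learning Deshpande mentions can be sketched in a few lines. This is a minimal, illustrative example (not code from Google or Curata): a linear model that updates its weights one example at a time as feedback arrives, rather than retraining on the full dataset.

```python
# Minimal sketch of online learning: the model adjusts its weights
# after every single example, so it keeps learning as feedback streams
# in. The data and learning rate here are purely illustrative.

def predict(weights, bias, x):
    return sum(w * xi for w, xi in zip(weights, x)) + bias

def update(weights, bias, x, y_true, lr=0.01):
    """One stochastic-gradient step on squared error for one example."""
    error = predict(weights, bias, x) - y_true
    weights = [w - lr * error * xi for w, xi in zip(weights, x)]
    bias = bias - lr * error
    return weights, bias

# A stream of (features, label) pairs standing in for real-time feedback.
stream = [([1.0, 0.0], 1.0), ([0.0, 1.0], 0.0), ([1.0, 1.0], 1.0)] * 200

weights, bias = [0.0, 0.0], 0.0
for x, y in stream:
    # Learn from each example immediately; no batch retraining step.
    weights, bias = update(weights, bias, x, y)
```

After a few hundred streamed examples the model’s predictions settle close to the labels; the same loop keeps running as new feedback arrives.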
This parallelism, where calculations are carried out at the same time to increase speed, is attractive, says Aki Balogh, CEO of AI solution MarketMuse.
“The difference between GPUs and CPUs is that GPUs are natively designed for massive parallelism,” he says. “But the use of GPUs for this task is a hack. A chip specifically designed for parallelism makes a lot of sense.”
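The parallelism Balogh describes is easy to see in miniature. In this illustrative sketch (a CPU thread pool standing in for the hardware; the workload function is made up), the same operation is applied to many independent inputs, which is exactly the shape of work GPUs and TPUs are built to run concurrently.

```python
# Toy illustration of data parallelism: one operation, many independent
# inputs, so the pieces can be computed concurrently. A thread pool here
# just demonstrates the pattern; real speedups come from hardware built
# for it, like GPUs and TPUs.
from concurrent.futures import ThreadPoolExecutor

def op(x):
    """One independent slice of a larger computation."""
    return x * x

inputs = list(range(8))

# Serial: process one input after another.
serial = [op(x) for x in inputs]

# Parallel: spread the same inputs across workers; results are identical.
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel = list(pool.map(op, inputs))

assert serial == parallel
```

Because every `op(x)` call is independent of the others, adding more workers (or more chip cores) shortens the wall-clock time without changing the result.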
Eric Ho, founder of PaveAI, sees additional benefits in how the chips use power. His company uses AI to turn Google Analytics data into actionable recommendations.
“From the numbers quoted, these chips offer a huge performance-to-wattage improvement that will allow other companies to offer widely available AI products that were not previously scalable or affordable,” he says.
“It will be a year or two before end users and marketers get to see the benefits of this chip, but it is one of the key steps in making AI available to the masses.”
Google’s efforts have our AI experts bullish, but they readily admit there may be limitations.
“One of the biggest limitations of TPU 2.0 is that the chip is optimized for Google's TensorFlow machine learning library only,” says Deshpande at Curata.
This means users of other libraries won’t benefit from Google’s efforts. However, machine learning expert Samim Winiger told The Verge that “there’s hardly a way around TensorFlow these days” so it remains to be seen if this will discourage adoption.
Other game-changing technologies could also foil Google’s AI world domination plans.
Balogh at MarketMuse cites quantum computing as an example. While still in the developmental phase, this technology could make parallelism like that displayed in Google’s chips and platform obsolete.
“Instead of just 1s and 0s [being computed], quantum computing allows for a third state, which totally changes the game in terms of parallelism,” he says.
Finally, there’s the problem that plagues all AI news: hype.
Plenty of companies, big and small, make big claims about their AI capabilities, but can’t deliver the goods.
Cautions Brennan White at Cortex: "We will certainly be the first in line to try the chip. But, as with all things in the AI space, I'll believe it when I try it."