There's a war brewing in the cloud, and machine learning may well determine the outcome. With infrastructure services trending toward commoditisation, each of the big-three cloud purveyors is racing to augment vanilla IaaS and PaaS with not-so-vanilla machine learning (ML)/artificial intelligence (AI) smarts. Sure, they're way out in front of the market – most enterprises simply aren't doing much with ML – but the hope is that, by paving the way to a dystopian future in which the machines take over, they will make boatloads of money along the way.
And yet each of the clouds is taking a very different approach to our enslavement to the machines.
Mastering ML is all about scale, and scale comes from the cloud. True, most data today still sits in musty enterprise data centres, but that won't remain true for long. As Gartner analyst Thomas Bittman has pointed out: "New stuff tends to go to the public cloud, while doing old stuff in new ways tends to go to private clouds. And new stuff is simply growing faster." How much faster? According to Gartner's analysis, private cloud VMs have been growing at a rate of 3X, while public cloud VMs have exploded 20X.
Even the leading on-prem Hadoop vendors have caught this vision, with Cloudera noting in its S-1 filing: "[T]he same system built for managing big data in the cloud also unlocks the power of machine learning for enterprises." It's a nice thought, but at best it translates into 25 per cent of their customers running big data apps in a public or hybrid cloud environment – which is where Hortonworks customers sit today.
That's a good start. Meanwhile, AWS, Microsoft Azure, and Google Cloud are already "all in" on public cloud, with the data and burgeoning ML services to master it. This advantage will likely prove insurmountable to anyone trying to break into the ML game, as Stratechery analyst Ben Thompson has argued.
With AWS, Microsoft and Google collectively spending $31.5bn on data centres in 2016 alone, compared to would-be challenger Oracle's paltry $1.7bn, this advantage of scale is a gift that will keep on giving to the big three. Today, the advantage goes to AWS because, as Algorithmia chief executive Diego Oppenheimer posits: "Most corporate data in the cloud is in AWS." Google and Microsoft, however, are now using ML offerings to entice more data into their clouds.
Ironically (or not), Google is in last place among the big three in IaaS market share but – arguably – first place in ML/AI offerings. The company has been betting big that the easier it can make the arcane science of machine learning, the more likely companies are to entrust it to shepherd their data. As Thompson has proposed: "[S]uperior machine learning offerings can not only be a differentiator but a sustainable one: being better will attract more customers and thus more data, and data is the fuel by which machine learning improvement comes about. And it is because of data that Google is AWS's biggest threat in the cloud."
Google has also made open source its friend in buffeting the AWS beast. TensorFlow, SyntaxNet, Kubernetes, and more: all designed to get developers standardising on Google's smarts and then, hopefully, running their Googlified apps on Google's cloud. Or, as Server Density CEO David Mytton argues, Google hopes to "standardise machine learning on a single framework and API" then augment it "with a service that can [manage] it all for you more efficiently and with less operational overhead."
How much less? Mytton goes on: "The [TensorFlow/Tensor Processing Units] announcement demonstrates that Google has optimised dedicated hardware designed specifically to 'deliver an order of magnitude better-optimised performance per watt for machine learning'."
In this manner, Google is positioning its cloud as the best place to operate ML workloads at scale, particularly those that may intersect with its open-source ML/AI tools along the way.
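For a sense of what "standardising on a single framework and API" looks like in practice, here's a minimal sketch in Python against TensorFlow's 1.x-era graph API – a toy linear regression whose data and hyperparameters are purely illustrative, not anything from Google's own examples.

```python
import numpy as np
import tensorflow as tf  # assumes the TensorFlow 1.x-era graph API

# Toy data: y = 3x + 2 plus a little noise (illustrative only)
x_data = np.random.rand(100).astype(np.float32)
y_data = 3.0 * x_data + 2.0 + np.random.normal(0, 0.05, 100).astype(np.float32)

# Model parameters the framework will learn
w = tf.Variable(0.0)
b = tf.Variable(0.0)
y_pred = w * x_data + b

# Mean squared error, minimised with plain gradient descent
loss = tf.reduce_mean(tf.square(y_pred - y_data))
train_op = tf.train.GradientDescentOptimizer(0.5).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(200):
        sess.run(train_op)
    print(sess.run([w, b]))  # should land near [3.0, 2.0]
```

The pitch is that the same few lines of model definition can, in principle, run unchanged on a laptop or on Google's managed infrastructure – which is exactly the lock-in-by-convenience Mytton describes.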
Not that AWS is standing still. Indeed, as Google has opened up access to its ML-based speech recognition technology, AWS has rolled out its own ML-driven speech recognition service, Lex (the technology behind Alexa), thereby enabling companies to more easily build voice-activated chatbots, among other things. As Andreessen Horowitz's Benedict Evans stated: "The race to create new AI tech is paralleled by a race to commodify it." Both companies, along with Microsoft, want to open up as many ML-powered gateways to their clouds as possible.
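To give a flavour of how small those gateways can be, here is a minimal sketch of handing text to a Lex bot from Python via boto3's Lex runtime client. It assumes AWS credentials are configured and that a bot has already been built and published; the bot name, alias and user ID below are hypothetical placeholders.

```python
import boto3

# Hypothetical bot name/alias/user ID; assumes a Lex bot already exists
# and AWS credentials are configured in the environment.
lex = boto3.client('lex-runtime', region_name='us-east-1')

response = lex.post_text(
    botName='OrderFlowers',
    botAlias='prod',
    userId='demo-user-1',
    inputText='I would like to order some roses',
)

# Lex returns the conversation state plus the bot's reply, if any
print(response['dialogState'], '-', response.get('message'))
```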
Though AWS generally doesn't get the same fanfare for its ML/AI expertise, that is more a failure of marketing than a reflection of reality. As AWS general manager of product strategy Matt Wood told me: "Amazon as a whole has thousands of engineers working on machine learning and deep learning. We've been doing this for decades."
The trick, he went on, is to take that complex AI expertise and make it approachable to developers. AWS AI services are "all designed to put sophisticated, high-quality deep learning, which is easy to use and priced aggressively, in the hands of as many people as possible."
At the most recent AWS re:Invent conference, AWS general manager Swaminathan Sivasubramanian reiterated this goal "to bring machine learning to every AWS developer" with the launches of Rekognition, Polly, and Lex. Each of these is somewhat limited in itself, but each is designed to lower the bar to ML for mainstream developers (in a nod to the ML elite, AWS has already created ways to leverage GPUs and FPGAs).
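How low is that bar? A minimal sketch – assuming an image has already been uploaded to S3 and AWS credentials are configured, with the bucket and object names below being hypothetical – shows image labelling with Rekognition via boto3 in a handful of lines:

```python
import boto3

# Hypothetical bucket/object names for an image already uploaded to S3;
# assumes AWS credentials are configured in the environment.
rekognition = boto3.client('rekognition', region_name='us-east-1')

response = rekognition.detect_labels(
    Image={'S3Object': {'Bucket': 'my-photo-bucket', 'Name': 'holiday.jpg'}},
    MaxLabels=5,
)

# Each label comes back with a confidence score out of 100
for label in response['Labels']:
    print(f"{label['Name']}: {label['Confidence']:.1f}%")
```

No model to train, no GPU to provision: the developer's job shrinks to an API call and a bill.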
This is critical because, as Gartner analyst Nick Heudecker has called out, though up to half of all enterprises like to pretend they're experimenting with ML, most won't get there anytime soon. Everyone talked up their big data plans, but just 15 per cent actually made it to production. Heudecker laments that success rates "will likely be much lower with ML".
The trick, then, isn't delivering the world's most cutting-edge ML capabilities. No, the winning formula is making relatively complicated ML smarts easy for average developers to adopt. In this area Google's strategy of training developers through its open-source projects seems wise, but so do Amazon's simple ML services such as Lex.
Which brings us to the dark horse candidate: Microsoft.
As Oppenheimer suggests: "Microsoft is the [cloud vendor] that will actually be able to convince the enterprises to do [machine learning]." It's an old-school vendor that had its decades of dominance, but that's precisely the foundation for what it hopes will be another decade or two of dominance. Microsoft, after all, regularly scores as one of CIOs' most trusted vendors.
Those same CIOs love Microsoft databases, applications, and more. Redmond's strategy is increasingly to imbue these services with "intelligence", by which it means AI/ML. Intelligence, Microsoft's corporate vice president for its data group Joseph Sirosh said at a press event in Seattle in early April, should "reside right next to the data. It should reside in databases. It should reside in applications that generate data... bringing intelligence to our data platforms is an incredibly important part of our strategy."
Google and AWS are essentially opening up the AI underpinnings of their home-grown services. Microsoft, for its part, is building AI into the tried-and-true products CIOs already depend upon.
Sure, Microsoft Azure also offers a bevy of ML services, similar to Google and AWS. But it's this strategy of making old-school enterprise tools just a bit smarter that strikes me as having the most immediate resonance for mainstream enterprises.
Will it be enough? That remains to be seen. Google and AWS have the deepest benches in ML, given years of operating such services for internal use at scale. Google, for its part, has strengthened its position with a host of open-sourced ML services that will lead developers back to its cloud. AWS, on the other hand, has the most data in the cloud, and a suite of services that developers already use.
Lastly, Microsoft's customers have the most data tied up on-prem – data Microsoft is well positioned to help them move to its cloud – coupled with a dependence on Microsoft applications, databases, and more.
Each vendor has a strong claim to a winning strategy. Customers, for their part, are spoiled for ML choice. ®