I spent my previous couple of blog posts exploring some interesting trends in Big Data. Today, I want to go deeper into another major trend: the rise of hybrid cloud and what it means for the growth of Big Data.
Driven by the success of AWS, the public cloud has been generating heavy buzz for a decade. But it wasn’t until the past couple of years that we hit critical mass and saw many Fortune 500 enterprises actually start executing a cloud strategy. For most of that decade, the debate around cloud centered on an either/or proposition: customers could either stay on premises in their own datacenter, or they could go to the public cloud. Staying on premises gave you more control of your data and its security, while deploying in the cloud offered lower costs and greater flexibility.
But what we’ve seen these past couple years is that customers can leverage the cloud without having to commit to moving all of their workloads there. Instead, they can take a hybrid approach in which some workloads are left on premises and some are put in the public cloud. There is no either/or dilemma at all. And this is precisely the path that more and more organizations are choosing: According to research by MarketsAndMarkets, the hybrid cloud industry is expected to grow at a CAGR of 22.5% and reach $91.74 billion in 2021, up from $33.28 billion in 2016.
Cloud adoption is growing, and it’s being driven partially by a hybrid model that allows customers the best of both worlds. What does this mean for the growth of Big Data? To put it simply, it’s great.
Big Data works better in the cloud than on traditional on-premises infrastructure because Big Data workloads have elastic compute requirements. Big Data platforms often see sudden spikes in usage. For example, an e-commerce company may see consistent traffic throughout the year, then experience a massive surge in visits on Cyber Monday. The company would employ Big Data to analyze these visitors’ behavior and provide recommendations in real time. If its Big Data platform runs on premises, it would have to purchase extra servers to accommodate all the extra traffic, and those servers wouldn’t be useful for the rest of the year. In the cloud, though, the company would only need to push a button to scale up instantly. It’s cheaper and easier.
Consider a similar example over a shorter time span. Most apps and websites see traffic spike during certain hours of the day and experience far fewer visitors during off hours. Tinder, for example, is busiest between 8 and 9pm, according to Nielsen. The app heavily leverages Big Data to provide the best experience for its users – showing people potential matches they’re more apt to like, throttling bad users, etc. In the cloud, Tinder could instantly scale up its Big Data deployments during these peak hours, then scale back down once usage drops. On premises, it would have to own enough infrastructure to accommodate the one hour of the day it really needs it.
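The back-of-the-envelope math behind this argument is simple. Here is a small sketch that compares the two models for a daily one-hour peak; all node counts and the per-node-hour price are illustrative assumptions, not real figures from any provider:

```python
# Hypothetical cost comparison: sizing an on-premises cluster for the
# daily peak vs. scaling elastically in the cloud. All numbers below
# are illustrative assumptions, not real quotes or benchmarks.

HOURS_PER_DAY = 24
PEAK_HOURS = 1                     # e.g., the 8-9pm rush
BASELINE_NODES = 10                # nodes needed during off-peak hours
PEAK_NODES = 40                    # nodes needed during the peak hour
PRICE_PER_NODE_HOUR = 0.50         # assumed on-demand rate, USD

# Cloud: pay only for the nodes that actually run each hour.
cloud_node_hours = (BASELINE_NODES * (HOURS_PER_DAY - PEAK_HOURS)
                    + PEAK_NODES * PEAK_HOURS)
cloud_daily_cost = cloud_node_hours * PRICE_PER_NODE_HOUR

# On premises: the fleet must be sized for the peak, all day long.
onprem_node_hours = PEAK_NODES * HOURS_PER_DAY
onprem_daily_cost = onprem_node_hours * PRICE_PER_NODE_HOUR

print(f"cloud:   {cloud_node_hours} node-hours/day, ${cloud_daily_cost:.2f}")
print(f"on-prem: {onprem_node_hours} node-hours/day, ${onprem_daily_cost:.2f}")
print(f"cloud uses {cloud_node_hours / onprem_node_hours:.0%} "
      f"of the peak-sized fleet's capacity")
```

Under these assumed numbers, the elastic deployment consumes 270 node-hours a day against 960 for a peak-sized fleet – the same workload at a fraction of the provisioned capacity, which is the whole appeal of hybrid for bursty Big Data jobs.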
The cloud has moved beyond a curiosity, and major enterprises are rapidly adopting it. This growth will lead to increasing Big Data adoption, as Big Data is cheaper and easier in the cloud. On-premises infrastructure isn’t going away, as there will always be a demand for control and security with certain workloads. Still, you can expect cloud to drive most future growth of Big Data.
We at Unravel are committed to maximizing the reliability and performance of Big Data applications no matter where the enterprise chooses to run its workloads. That’s why Unravel optimizes the speed and resource usage of a variety of Big Data environments: AWS and Azure in the cloud, along with Cloudera, Hortonworks, and MapR on premises.