The future of cloud computing is, to be sure, hotly contested. Indeed, the term “cloud” is an appropriate one because like the clouds in the sky, the cloud of technology is ever-changing and rearranging its shape. The winds and currents that guide cloud tech are business needs as companies come to terms with what the cloud means for being competitive.
A study issued earlier this year by IDG Research Services found nearly 40% of organizations with some kind of public cloud initiative had moved at least some of those workloads back to on premises, mostly due to security and cost concerns.
The trend that has emerged this year is a more cautious move on the part of enterprises. They are realizing, sometimes the hard way, that the cloud is heavily metered and they are being more judicious about what they move to the cloud.
But they are still moving. A cloud-first enterprise strategy is gaining in prominence. Hybrid cloud IT operations management provider OpsRamp conducted a poll of IT leaders in large companies (more than 500 employees) and found that public cloud services are grabbing a bigger share of IT budgets.
More than half of respondents said they have been using public cloud for more than three years, but only 29% of those polled described their level of cloud adoption as “mature,” compared with 50% for “developing” and 21% for “emerging.” So there is still room to grow and it will grow.
Cloud Future Influenced by Shape and Form
The question is: what future form will cloud computing take? The survey found three quarters of respondents said they expect to work with different cloud providers for their business needs. For that reason, a new term has entered the lexicon along with public cloud, private cloud and hybrid cloud: multi-cloud.
Multi-cloud is used in two ways, so the term requires some care. One usage describes combining two or more cloud types, such as public and private clouds used in tandem — which, of course, is what hybrid cloud already means. The other usage describes using multiple cloud providers: a company might go with Amazon, Microsoft and a smaller provider, perhaps Box just for storage.
Tim Crawford, principal consultant with Avoa, a strategic advisory firm to CIOs on data center and cloud computing issues, said companies will need a number of different services for a number of different purposes, so that by default will dictate a hybrid or multi-cloud approach.
“No one type is going to suffice. The reality is people will have to use a combination of them. That’s not a corner case, either. I expect to see it across the midsection majority of enterprises,” he said.
He is already starting to see it now and the momentum is definitely picking up. “There’s a lot of noise that says all roads lead to public cloud but I don’t believe that to be the case, because there are legitimate reasons public cloud is not a feasible solution for enterprise for certain applications,” said Crawford.
Steve Tack, senior vice president of products for cloud app performance monitoring vendor Dynatrace, says the consensus across the board with large enterprises and ISV customers is hybrid cloud will be the approach moving forward, “whether you call it hybrid cloud or multi-cloud, whatever that may be.”
There are two drivers of cloud migration, he said: density, as people look at containers and microservices architectures, and speed to market. The microservices approach has many benefits, like speed of deployment and decoupling components to make them easier to deploy and update.
But he also cited speed in terms of moving from a waterfall development approach to a more Agile-oriented continuous integration/delivery approach. He has seen companies go from major releases once every six months to releases every two weeks.
They achieved that rapid rate of update through a combination of things. One is automation. Companies that automate testing and performance analysis are able to free up developers to focus on actual development. Automation also provides visibility and insight into usage and the customer experience.
“By getting that feedback loop and having a continual deployment approach, it allows you to accelerate the lifecycle of an app and speed deployment,” he said. “When you combine cloud native apps, containers, automation, Agile and DevOps, they all give you a compound value when you do them together.”
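The automated feedback loop Tack describes can be sketched in miniature. The following is a hypothetical example, not any vendor's actual tooling: a performance gate that a CI/CD pipeline could run after each build, failing the stage automatically instead of waiting on manual analysis. The budget value and function names are illustrative assumptions.

```python
# Hypothetical sketch of an automated performance gate in a CI/CD
# pipeline. Running checks like this on every build frees developers
# from manual performance analysis and gives a fast feedback loop.

RESPONSE_TIME_BUDGET_MS = 200  # assumed service-level target, illustrative


def passes_performance_gate(samples_ms):
    """Return True if the 95th-percentile response time is within budget."""
    ordered = sorted(samples_ms)
    p95_index = max(0, int(round(0.95 * len(ordered))) - 1)
    return ordered[p95_index] <= RESPONSE_TIME_BUDGET_MS


# A build whose tail latency blows the budget would fail the pipeline stage:
fast_build = [120, 130, 140, 150, 160]   # latency samples in ms
slow_build = [120, 130, 140, 150, 450]
print(passes_performance_gate(fast_build))  # True
print(passes_performance_gate(slow_build))  # False
```

In practice the samples would come from a load-test or monitoring tool rather than hard-coded lists, but the shape is the same: measure, compare against a budget, and let the pipeline act on the result.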
The Limits of Public Cloud
Tack said the majority of new cloud investments are driven by new initiatives, but he has also seen a rise in what people refer to as “lift and shift” of existing data center apps into the cloud.
“The former is built around competitive advantages while the latter is around cost. Both get traction within the enterprise today,” he said.
Lift and shift may not be a good idea, though. Several studies have found companies moved apps to the cloud, only to move back on premises very quickly. Crawford said it is not that the public cloud itself is more expensive, but that the way these companies use it makes it more expensive.
“It can be as much as four times more expensive than traditional data center. That’s because traditional legacy apps, assuming not rewritten or rearchitected, are used to running 24/7 at peak on redundant architecture. Public cloud is not designed to run that way. If they used it more appropriately for what it was designed for, it would be less expensive,” he said.
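Crawford's point about usage patterns can be illustrated with a back-of-the-envelope calculation. The numbers below are invented for illustration, not real provider pricing: an always-on lifted app pays for peak, redundant capacity every hour, while a re-architected app that scales down off-peak pays only for what it uses.

```python
# Illustrative cost sketch (made-up rates, not real cloud pricing):
# a lifted legacy app runs at peak capacity 24/7 on redundant instances,
# while a re-architected app scales down outside peak hours.

HOURS_PER_MONTH = 730
RATE_PER_INSTANCE_HOUR = 0.50  # assumed hourly rate, illustrative


def monthly_cost(instance_hours):
    return instance_hours * RATE_PER_INSTANCE_HOUR


# Lift-and-shift: 4 redundant instances running every hour of the month.
lift_and_shift = monthly_cost(4 * HOURS_PER_MONTH)

# Right-sized: 4 instances for ~8 peak hours a day, 1 instance otherwise.
peak_hours = 8 * 30
off_peak_hours = HOURS_PER_MONTH - peak_hours
right_sized = monthly_cost(4 * peak_hours + 1 * off_peak_hours)

print(lift_and_shift / right_sized)  # roughly 2x in this toy scenario
```

The gap widens further with spikier traffic or more aggressive scaling, which is how a workload used inappropriately can end up several times the cost of one designed for elasticity.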
So why are companies making this mistake? They aren’t being told the whole story. “There’s so much noise. How do you ascertain what is real and what is not? A lot of analyst firms are buying into the latest press releases,” he said.
So moving data center apps to the cloud is not going to be a major trend in the future due to how they run. Crawford also thinks latency might hurt the public cloud and favor on-premises/private cloud for things like IoT and edge services “because the speed of light is still a limiting factor,” he said.
New Future Cloud Format?
Neither Tack nor Crawford sees any new formats popping up in cloud computing, because there is no need for one. What is out there now is adequate.
“The reality is today most enterprises are using a form of hybrid cloud,” said Crawford. “They probably have a corporate data center and are using some form of SaaS which is running on top of a public cloud. You can split hairs on whether that truly is hybrid or not,” he said.
Tack said the area with a lot of hype around it — where containers were 18 months ago — is serverless computing through the public cloud. “Serverless” computing is a bit of a misnomer: servers are still needed to run the code. It’s just that rather than allocating virtual machines or containers and deploying code to them, the development team simply uploads its application, which is called a function because it performs a single function, and the vendor — AWS or otherwise — executes the function and shuts it down when the process is done.
It solves one of the biggest problems in cloud computing: people forgetting to shut down a virtual machine when they are done. Developers and other users leave instances running, and as long as the VM is running, the meter is running. That’s how enterprises get a bad case of sticker shock when the cloud computing bill comes in.
With serverless computing, you never spin up a VM, so you never have to worry about shutting it down. It creates an instance to execute the code, a simple, single-function app, and shuts down when done.
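A minimal sketch shows how small such a function can be. This follows the general shape of an AWS Lambda Python handler (a function taking an event and a context), but the handler name, event fields and task here are illustrative assumptions, not code from any real deployment.

```python
# Minimal sketch of a serverless function in the AWS Lambda style.
# The provider spins up an execution environment, invokes the handler
# with the request event, and tears everything down when it returns —
# there is no VM for the developer to start or forget to stop.

def lambda_handler(event, context):
    """Single-purpose function: convert a temperature from the request.
    The event shape ({"celsius": ...}) is a hypothetical example."""
    celsius = event.get("celsius", 0)
    return {"fahrenheit": celsius * 9 / 5 + 32}


# Locally, the handler is just a function and can be invoked directly:
print(lambda_handler({"celsius": 100}, None))  # {'fahrenheit': 212.0}
```

Billing in this model is tied to the handler's actual execution time, which is what makes the pay-per-use granularity Tack describes possible.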
“It allows people to design and build and run apps and services and abstract from dealing with the hardware and managing servers as well as they only have to pay for the time they consumed on a very granular basis,” he said.
“I still think there’s a ton of headroom in the market with mass migrations and containerization but also a lot of interest in serverless computing and the impact that will have on the overall structure will be considerable,” he added.