Moving to the cloud isn’t as easy as many cloud vendors would have you believe, especially for large enterprises. Pundits cite concerns like security and compliance as reasons that large enterprises will be late adopters, but there’s an equally compelling and often overlooked factor: legacy investments.
Most large enterprises have made major investments in their own datacenters. They aren’t just going to throw those investments overboard for the latest trend.
It’s a heck of a lot easier for an SMB to ditch a server or two than for a distributed enterprise to turn its back on its application, networking, security, storage, and other infrastructure investments.
“You may not see the mass adoption of the cloud [in large enterprises] for five, ten or even fifteen years,” said Dan Lamorena, a director in Symantec’s storage and availability management group. “Many organizations won’t change until it’s time to refresh servers and other critical infrastructure. It just won’t make sense before then.”
Of Mainframes and Private Clouds
Don’t believe him? Consider this: the mainframe business is still a multi-billion dollar one. Heck, IBM even released a mainframe for the mid-market last year, the zEnterprise. Remember the predictions about the last mainframe being unplugged in 1996?
How’d that work out?
In fact, according to BMC Software’s most recent Mainframe Survey, 62 percent of businesses planned to increase their usage of mainframes this year.
The point of all of this mainframe talk? Technology moves slowly in large enterprises. They are stuck with legacy costs and infrastructures, and it would be foolish to expect them to abandon those overnight to leap into public clouds.
That doesn’t mean large enterprises will be on the outside looking in, wishing they could take advantage of all these cloud benefits everyone else is talking about. Rather, it just means many enterprises will have to take a different path.
And for most, that different path means favoring private clouds over public ones, at least initially.
From Virtualization to Private Cloud – It’s Harder than You Think
Even if most large organizations would struggle to move to public clouds right away, many have already taken the first steps toward cloud infrastructures.
Most enterprises have invested in virtualization technologies. They have consolidated servers and storage, and they believe that once the virtualization challenge is tackled, the transition to a full-blown private cloud is trivial.
Not so, says Kevin Brown, CEO of storage vendor Coraid. “Many organizations have traveled far along the ‘virtualization journey’ to transform their old, brittle server infrastructures into dynamic, self-healing environments. This has brought tremendous cost savings, both in terms of capital and operating costs.”
But that’s not really what a cloud environment is. Even a virtualized environment can feature application silos. A true cloud delivers on-demand access, resource metering, and the ability to scale up and down rapidly.
“These are not automatic in a virtualized environment,” Brown said. “In fact, they are still very hard to do. Virtualization can facilitate these features, but realizing them is primarily about changes in business processes. To truly deploy cloud, organizations need to think hard about how to standardize their service offerings, make them available through simple portals, and track usage and cost information to report or charge back to their business partners.”
Here’s where existing infrastructure can complicate things. Often, those application silos will feature different operating systems. Even for something that seems monolithic, like storage, a large enterprise could have storage resources from multiple vendors.
“Everyone wants to embrace chargebacks, but if you have storage boxes from five different vendors, you also have five different sets of reporting tools. Figuring out chargebacks, then, is time-consuming, costly and not terribly accurate,” Lamorena said.
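The problem Lamorena describes can be made concrete with a small sketch. Assuming each vendor's reporting tool exports usage in its own format (the record layouts and field names below are invented for illustration), a chargeback pipeline first normalizes every report into one common schema, and only then totals usage per department:

```python
# Hypothetical sketch: normalizing storage usage reports from different
# vendors into one schema so chargebacks can be computed consistently.
# All record formats, field names, and rates are invented for illustration.

def normalize_vendor_a(record):
    # Vendor A reports usage in gigabytes under "gb_used".
    return {"department": record["dept"], "gb": record["gb_used"]}

def normalize_vendor_b(record):
    # Vendor B reports usage in megabytes under "usage_mb".
    return {"department": record["owner"], "gb": record["usage_mb"] / 1024}

def chargeback_report(reports, rate_per_gb):
    """Total normalized usage per department and apply a flat per-GB rate."""
    totals = {}
    for rec in reports:
        totals[rec["department"]] = totals.get(rec["department"], 0) + rec["gb"]
    return {dept: round(gb * rate_per_gb, 2) for dept, gb in totals.items()}

vendor_a = [{"dept": "finance", "gb_used": 500}]
vendor_b = [{"owner": "finance", "usage_mb": 256000},
            {"owner": "hr", "usage_mb": 51200}]

normalized = ([normalize_vendor_a(r) for r in vendor_a] +
              [normalize_vendor_b(r) for r in vendor_b])
print(chargeback_report(normalized, rate_per_gb=0.10))
# → {'finance': 75.0, 'hr': 5.0}
```

With two vendors the translation step is a minor nuisance; with five, each with its own export format and units, it becomes the time-consuming, error-prone work Lamorena warns about.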
Private Cloud Software Roadblocks
There’s no clear model yet for how the cloud will be managed, and what we have now is a patchwork of point products. Virtualization vendors are adding cloud management features to their software suites, pushing their capabilities up the stack, while traditional management vendors, such as IBM, are pushing their management features further and further down the stack.
There is also competition from the open source community, centered on OpenStack. And plenty of startups will tell you that all of the above are doing it wrong and will then offer their own alternatives.
Cloud management is in its infancy, so don’t expect any set-it-and-forget-it magic.
“One of the benefits of developing a private cloud is that it provides the opportunity, however challenging, to define exactly what you are trying to do with IT in your organization as you go forward,” said Steve Pelletier, solution architect for Logicalis, an IT solutions and managed service provider.
“To realize the full potential of your private cloud environment, you will also need to address several key concepts that are not so easily defined. These involve squishy terms like ‘management,’ ‘automation’ and ‘orchestration’ that need to be defined in terms of where you are today and where you want to be tomorrow.”
Will Data Center Management Be the Foundation of the Enterprise Cloud?
To get a handle on things like management and automation, you have to know exactly what it is you are trying to manage and automate, and this is where many organizations hit a wall.
Many organizations have little visibility into their IT infrastructures. Legacy IT management tools just weren’t built for today’s virtualized infrastructures, and if you start moving stuff into public clouds, what little visibility you have will evaporate.
“The fact that cloud environments are expansive, decentralized, and fluid makes visibility both more imperative and more difficult without the proper cloud management tool,” said Antonio Piraino, CTO of ScienceLogic, a provider of datacenter and cloud management platforms.
Datacenters, internal or in the cloud, are moving toward virtualized, multi-tenant infrastructures. Your datacenter may not fit this model today, but it will eventually. Of course, multi-tenancy doesn’t necessarily mean you’ll share your internal servers with the business next door (or two time zones over). Multi-tenancy could be as simple as having virtual machines from different departments sharing the same resources.
However, once you embrace multi-tenancy, you become a de facto service provider to your own organization. And you then have the headaches of a service provider. Who do you charge for what and how do you justify those charges if there is pushback? Traditional IT operations management point products and management suites won’t be much help for multi-tenant environments. Those tools simply weren’t built for them.
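The "who do you charge for what" headache can be sketched as per-tenant metering. In a multi-tenant setup, VMs from several departments share the same hosts, so each department's bill has to be assembled from the resources its VMs actually consumed, with a line-item breakdown available to justify the charge if there is pushback. The resource names and rates below are hypothetical:

```python
# Hypothetical sketch of multi-tenant metering: VMs from several
# departments share hosts, and each department is billed for what
# its own VMs consumed. Resource names and rates are invented.

RATES = {"vcpu_hours": 0.05, "ram_gb_hours": 0.01}

vms = [
    {"tenant": "marketing",   "vcpu_hours": 100, "ram_gb_hours": 400},
    {"tenant": "marketing",   "vcpu_hours": 50,  "ram_gb_hours": 200},
    {"tenant": "engineering", "vcpu_hours": 300, "ram_gb_hours": 1200},
]

def itemized_bill(vms, rates):
    """Return a per-tenant, per-resource breakdown plus a total,
    so charges can be justified line by line."""
    bills = {}
    for vm in vms:
        bill = bills.setdefault(vm["tenant"], {res: 0.0 for res in rates})
        for res, rate in rates.items():
            bill[res] += vm[res] * rate
    for bill in bills.values():
        bill["total"] = round(sum(bill.values()), 2)
    return bills

print(itemized_bill(vms, RATES))
# marketing: 150 vCPU-hours and 600 GB-hours → $7.50 + $6.00 = $13.50
# engineering: 300 vCPU-hours and 1200 GB-hours → $15.00 + $12.00 = $27.00
```

Traditional single-tenant management suites keep none of this tenant attribution, which is why the article notes they won't be much help once you become a de facto service provider.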
Too many organizations’ cloud management efforts simply focus on enabling self-service provisioning of virtual machines for very specific tasks, such as application development. Since these deployments are small and limited to a few users or a couple of departments, the lack of visibility into these cloud deployments doesn’t set off too many warning bells.
However, the lack of visibility can and should slow organizations down as they move to the private cloud. Piraino argues that cloud service providers need to centralize management as much as possible. Large enterprises won’t feel comfortable with private, public or hybrid clouds until critical IT operations and cloud management functions – such as performance, availability, ticketing, and event management – are unified.
Legacy IT management and monitoring tools simply do not extend into the cloud, meaning that if you don’t update your datacenter management strategy to align with your cloud strategy, you will be flying blind into the cloud. So, don’t be shocked when you drift through a fog bank and into a mountain.