Going into the virtualised reality
The rate of change and more cost-effective solutions have transformed virtualisation from a hyped technology into a necessity for businesses, writes Quinton O’Reilly
A few years ago, the jury was still out on whether virtualisation — the creation of a virtual version of something physical, such as servers, desktops, operating systems, storage or networks — was a worthy investment or not.
As the technology became more robust, easier to use, and more affordable, that scepticism was brushed aside as virtualisation became a major part of businesses’ day-to-day operations.
While the most obvious application of it is through cloud computing, the fact that virtualisation can be applied to many different areas of a business makes it worth looking at.
There are good reasons why it’s now an attractive prospect. Seen as a way to reduce operating costs, reduce the overall storage space needed and enhance productivity, it’s no surprise that companies have made the move or are considering it.
Add to that major players like VMware, Microsoft, Oracle and Amazon offering services and the market is more accessible than ever.
As Noel Brilly, GreenLake consumption services manager and business developer at HPE Ireland, says, it allows for more efficient use of IT resources than was possible before.
“Before virtualisation, it was common to have both under- and over-utilised hardware in the same data centre,” he said. “With virtualisation, our customers can move workloads between virtual machines according to load. The same physical server can also run multiple server operating systems and configurations, further increasing efficiency.”
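Brilly’s point about moving workloads according to load can be sketched in miniature. The snippet below is a toy first-fit-decreasing packing pass — all host capacities and VM loads are invented figures, and this is not any vendor’s actual scheduler:

```python
# Toy illustration of the consolidation Brilly describes: packing
# virtual machine workloads onto as few physical hosts as possible.
# All capacities and loads here are hypothetical figures.

def place_workloads(vm_loads, host_capacity):
    """Greedy first-fit-decreasing placement of VM loads onto hosts.

    Returns a list of hosts, each a list of the VM loads assigned to it.
    """
    hosts = []
    for load in sorted(vm_loads, reverse=True):
        for host in hosts:
            if sum(host) + load <= host_capacity:
                host.append(load)
                break
        else:
            hosts.append([load])  # no existing host has room: start a new one
    return hosts

# Ten workloads that might once have sat on ten under-utilised servers
hosts = place_workloads([30, 20, 10, 45, 25, 15, 35, 5, 40, 10], host_capacity=100)
print(len(hosts))  # → 3: consolidated onto far fewer physical machines
```

The point of the sketch is the efficiency gain Brilly names: instead of one under-utilised box per workload, the same demand fits on three well-utilised hosts.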
Probably the most popular form right now is server virtualisation, which divides one physical server into multiple isolated virtual environments depending on users’ needs. Now the basis for both cloud computing and hybrid IT setups, it’s become the norm for businesses as they pick and choose which areas are worth converting.
“Where people are at now, they’ve pretty much done server virtualisation,” said Catherine Doyle, enterprise director of Dell EMC Ireland. “They’ve pretty much done it or they’re at least on the journey to doing it. Then the next piece of the puzzle is network virtualisation, and that’s what’s hot right now.
“That is huge for us at the moment… because it’s got two parts to it. The first piece is it delivers the final part of the ‘as-a-service’ model to enable that full service to be offered, but there’s also a massive security knock-on from virtualising your network.”
Doyle says the industry is “halfway there” and that network virtualisation will likely become truly mainstream in the next six to nine months.
That acceleration in usage is something Michele Neylon, the CEO of Blacknight, has seen first-hand. As a hosting and infrastructure provider of many years’ standing, his company is well placed to spot the trends in this space, and uptake has definitely picked up thanks to more user-friendly offerings.
“The adoption has definitely accelerated over the years,” he said. “It’s become more and more commonplace and the tools that are available to help manage and deploy everything have evolved.
“Years back it was not easy, there was a lot of poking around and command lines and doing lots of weird and funky stuff whereas nowadays, there are software solutions on the market that are much more evolved. While you need to have some kind of technical knowledge, it’s much more user-friendly.”
This user-friendly nature means that, for the average business, virtualisation is a realistic prospect and firmly in the public consciousness. Now it’s a case of what will be converted rather than whether it will happen in the first place.
“If you look at the whole cloud and virtualisation space, it’s no longer if organisations are going to be migrating workloads into those environments or platforms, but more now about what is right to virtualise,” said Chris Ducker, senior director of global proposition strategy at Sungard AS.
“It’s now mainstream. People may have had their fingers burnt sometimes by putting stuff into cloud environments with inflated promises of lower costs or more agility but without actually understanding the complexities associated with it or resourcing up for those complexities.
“We’re now in a world where people are beginning to understand the reality and benefits of virtualisation and that it’s a little bit more complex than was first envisioned.
“When you’re looking at migrating legacy environments or workloads or platforms, [you ask] what makes good business sense to virtualise, what makes good business sense to leave it where it is and what do we refactor, redesign or rebuild to leverage the benefits of a virtualised or cloud world.”
Much of the development around virtualisation has centred on speed, efficiency and cost, yet another area in which Ducker sees progress is the incorporation of AI and machine learning into both standard and management applications.
Other developments are more focused on the backend and protection against unwelcome surprises like data loss, a benefit that is more crucial now considering the increased rate of attacks and cyber incidents that occur.
“If an organisation can virtualise more workloads, they can capitalise on all sorts of things,” said Ducker. “[If] you can quickly move a virtual machine from one server to another, that means the whole concept of backup and recovery is so much more within reach of organisations or workloads.
“I would predict that there’s massive growth in utilisation of backup and recovery as a result of the speed and ease with which you can do it, and the lower cost of doing so.
“It now becomes cost-effective for organisations to back up their businesses at a time when we see more threats to data protection and data management. You need to have backup, and it’s almost a perfect storm that the increased threat and increased capability have arisen at the same time, so I see that as being a huge area for growth in the near future.”
Benefiting the mainstream
As mentioned before, there are many benefits to virtualising parts of your business and network.
One major benefit to having a virtualised environment, besides the cost and flexibility, is the added security it can bring. If it’s properly designed, it can be updated on the fly and virtual networks can be isolated to prevent malware from spreading.
“If you look at the TCP/IP protocol, it has something like 50,000 connectors inside of it,” explains Doyle. “In a standard network, once a breach happens and people get in, you basically have access to 50,000 connectors. If your network is virtualised, you can reduce that to two, so when you get in, you only have access to two.
“That’s not to say that you can’t start getting access to everything else, [but] it’s going to take longer to get to the next one so by the time you get there, you’ve been caught and the company knows [it’s being attacked].
“It significantly slows down how breaches happen. And if you’re in a company where you’ve got software that was born in the cloud, proper next-generation software that’s obviously fully virtualised and containerised, etc, and you put a network on top of it, what a lot of companies are doing is re-spinning their software every few days to wash out any bugs or breaches that have been planted in there.”
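Doyle’s attack-surface argument boils down to an allow-list: a flat network exposes the full port range, while a segmented virtual network exposes only what each segment explicitly needs. A rough illustration, assuming purely invented port numbers and no particular product:

```python
# Toy sketch of the segmentation Doyle describes: a virtualised network
# exposes only an explicit allow-list of ports per segment, instead of
# the full port range of a flat network. Port numbers are illustrative.

FULL_PORT_RANGE = range(1, 65536)  # every TCP port a flat network could expose

def exposed_ports(allow_list):
    """Return the ports actually reachable in a segmented network."""
    allowed = set(allow_list)
    return [p for p in FULL_PORT_RANGE if p in allowed]

flat = list(FULL_PORT_RANGE)
segmented = exposed_ports({443, 5432})  # e.g. HTTPS plus one database port

print(len(flat), len(segmented))  # 65535 reachable entry points vs 2
```

An attacker who lands inside the segmented network has two doors to try rather than tens of thousands, which is exactly the slowing-down effect Doyle describes.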
The expectations businesses have of IT departments and managed service providers have changed alongside this. For one, they expect them to be agile and able to make changes quickly.
“Businesses now expect their IT departments to be as agile as, say, a marketing department,” Brilly said. “They expect to be able to launch new products and services swiftly to gain market share or fend off competition. No longer can IT respond with long deployment or start-up delays.
“Through virtualisation and HPE GreenLake, we have now enabled our customers to be as flexible and as agile as any area in the business and to become an enabler as opposed to a blocker as previously happened.”
While it’s easy to focus on the benefits that virtualisation brings, that doesn’t mean it’s suitable for everything. The aims of the business should be taken into consideration and just because it’s becoming more realistic to virtualise parts of a business, that doesn’t mean it’s a straight swap.
“There are different use cases for it but it just makes sense in many cases,” said Neylon. “Obviously, outsourcing it to a technical infrastructure company like ourselves means we’re able to work with the client, design the solution that meets their specific needs.
“For other use cases, you can talk it through with the client and say: in this instance, the virtualisation path is the way we want to go, and in this instance over here, maybe it’s not.
“A big problem with any of these things is where people treat it as a buzzword instead of using it to find a solution to a business need. Working with a technical partner that understands the technologies and can help develop a solution that meets that requirement — that’s what we want to be aiming for.”
That’s something Ducker mentions as well. The strategic input a provider brings can help businesses identify the approach that works best for them.
Workloads that can be virtualised, workloads that are already virtualised but could be leveraging more of the cloud’s benefits, and workloads that would see no benefit from virtualisation should all be considered.
For the most part, a hybrid IT model is probably the way to go for most companies, picking and choosing which parts are virtualised to help reduce costs and increase efficiency.
“[Sungard AS has] talked about hybrid IT being the new normal for a number of years and hybrid IT is the reality,” he said. “It’s not all going to be cloud, it’s not all cloud already, it doesn’t make good business sense to put everything into the cloud just for the sake of it, you have to look at it workload by workload.
“The service provider should help you do that by helping you with a strategy, the implementation of that strategy, and then you get into managing and optimising.
“Once you’re in the cloud, you can rack up loads of bills if you’re not constantly managing. We’ve been implementing automation tools [and] machine learning tools that monitor your utilisation of the cloud and report back on anomalies and optimum utilisation.
“[It tells you if] you’re using what you’re actually paying for, or are you wasting money on it. From our perspective as a company that is renowned for resilience, there’s [the element of] underpinning it with security and recovery capabilities so that you can address adversity when it hits.”
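The utilisation monitoring Ducker describes can be caricatured in a few lines: scan usage samples per resource and flag the ones that are likely wasting money. This is a toy sketch with invented thresholds, instance names and figures — not Sungard AS’s actual tooling:

```python
# Toy version of cloud utilisation monitoring: flag instances whose
# average CPU utilisation suggests you are paying for idle capacity.
# All thresholds and readings below are hypothetical.

def flag_waste(samples, threshold=10.0):
    """Return instance names whose average utilisation (%) is below threshold."""
    return sorted(
        name for name, readings in samples.items()
        if sum(readings) / len(readings) < threshold
    )

utilisation = {
    "web-1": [55, 60, 70, 65],   # busy: keep
    "batch-2": [2, 1, 3, 2],     # idle almost all day: candidate to shut down
    "db-1": [40, 45, 50, 42],
    "test-9": [0, 0, 5, 1],      # forgotten test box: candidate to shut down
}
print(flag_waste(utilisation))  # → ['batch-2', 'test-9']
```

Real tooling would pull these readings from a cloud provider’s metrics API and apply smarter anomaly detection, but the principle — paying only for what you actually use — is the same.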
Racking up bills isn’t the only pitfall that businesses can fall into. A big one is not properly considering how to convert from legacy systems to virtualised ones. Considering that there may be multiple third-party services behind your legacy systems, the shift to virtualisation may not be as straightforward as businesses might think.
“If you virtualise [your business], it becomes much more complicated,” said Doyle. “Because the network has grown up over a long period of time, it’s generally understood in its current form so if you change it into something that’s virtualised, it’s quite a big shift for companies.
“What we’re finding is a lot of companies are doing… it for a specific project or for part of it. [If your] router is up for a refresh, rather than just refreshing it, it’s time to look at virtualisation.
“However, generally the people who manage the network have been certified, they have all of their accreditations, they’re comfortable with that environment, so it’s quite a big shift to move into that virtualisation space.”
Look before you leap
The other part you need to think about is the complexity behind systems, says Doyle. Chances are most companies rely on numerous third-party services to bolster their operations and so knowing how you deliver your network and making the necessary changes, both with your technology and processes, can be a challenge.
“It’s quite complex to figure out what you’re going to do and something else like virtualising your network will change how you deliver your network,” she said. “It’s a bit like the way that virtualising your servers changes how you deliver your servers.
“It’s not even the technologies because companies can learn the technologies, it’s more about the changes it makes to their internal processes and how they actually operate their company and operate how their infrastructure works to make sure everything stays very stable.
“It is quite a big change. When server virtualisation came in first, there was a lot of resistance to it. Questions [were asked] like would things be supported, and how would they recover. It’s the same thing, except there are more parts to it when you try to virtualise your network, but ultimately networks will be virtualised. That’s what’s going to happen.
“It’s at a maturity stage where, as technology comes up for refresh in terms of the financial cycle, it becomes an option to have a look at it for… three main reasons: security, ease of use in terms of the network, and also it’s much cheaper.”
As for Brilly, he sees the growth of Hyper-Converged Infrastructure (HCI) — a type of infrastructure service, largely software-defined, that virtualises all of the elements of conventional hardware systems — continuing for another few years, turning into a hybrid computing platform that integrates with public cloud resources.
Other changes include the sales patterns of servers shifting, with new ones going towards cloud providers instead of customer data centres, and an increase in demand for virtual desktop infrastructure (VDI).
“Companies also now want VDI environments that are centrally managed and flexible. Companies want options for hosting desktops in the platform of their choice — whether physical or virtual infrastructure, a privately or publicly hosted data centre, and so forth.
“Automation is going to play an instrumental role in the speeding up of enterprise deployments. It will also affect their capacity to orchestrate and then scale out IT infrastructure. The number of companies and industries seeking to use advanced automation for improving their efficiency is on the rise.”
Neylon notes that much of the virtualisation landscape has changed, mostly for the better, and that the services we rely on are going to lean ever more towards using it.
“You’re going to end up with a mish-mash of different technologies and different solutions and trying to get them all to work together so that they make sense for the business,” he said. “So that you know it’s a technically sound solution but it’s done in such a fashion that it meets the company’s needs.
“The other thing that’s fascinating is we’re getting to the point where the operating system wars are pretty much over. You don’t see Microsoft attacking open source as much and now oddly enough you’re able to use Microsoft tools to manage open source operating systems which is amazing.
“Ten to 15 years ago that would have been unthinkable, there was no way those guys would have wanted to work with open source technologies whereas now it’s completely commonplace.”
There are a number of players in the virtualisation space, one of which is HPE with GreenLake. With a focus on a consumption-based model, it offers a degree of flexibility for businesses to avail of when using virtualisation.
It’s a direction it and other services are moving towards as technology improves and managed services continue to grow in popularity.
“This flexibility is one of the key factors in the growth of the public cloud and now HPE has managed to replicate this benefit with on-premise solutions,” said Noel Brilly, GreenLake consumption services manager and business developer at HPE Ireland.
“In an ideal world you would want to have sufficient capacity to meet the IT demand of businesses with the ability to switch on additional capacity as demand dictates.
“Traditionally, doing this meant an over-investment in infrastructure and, more than likely, an under-utilisation of IT budgets spent up front. This over-provisioning of compute and storage is an inefficient use of budget that could be better allocated elsewhere in the business.”
Brilly says that IT departments need the ability to onboard resources quickly when required, which is where GreenLake comes into play. By providing additional computing power and storage when necessary, it meets the needs of businesses and manages to give them the best of both worlds.
“GreenLake provides the capability of provisioning additional storage or compute power within a matter of hours, as opposed to long, drawn-out procurement processes possibly lasting months before the infrastructure is available.
“It is not always possible or feasible to have a full view or plan on future capacity needs. As such, having the ability to draw on pre-provisioned capacity which is only paid for when used can generate real business value.”
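The consumption model Brilly outlines is ultimately an arithmetic trade-off: owning capacity means paying for the peak all year round, while paying per use costs more per unit but only for what is consumed. A back-of-the-envelope sketch, with every price and demand figure invented for illustration:

```python
# Back-of-the-envelope comparison of the two models Brilly contrasts:
# buying peak capacity up front versus paying only for capacity used.
# All prices and monthly demand figures are invented for illustration.

monthly_demand_tb = [20, 22, 25, 24, 30, 55, 28, 26, 60, 27, 25, 24]

COST_PER_TB_OWNED = 10      # hypothetical monthly cost per TB of owned capacity
COST_PER_TB_CONSUMED = 14   # hypothetical pay-per-use rate (higher per unit)

# Owned: must provision for the peak month, and pay for it every month
owned_cost = max(monthly_demand_tb) * COST_PER_TB_OWNED * len(monthly_demand_tb)

# Consumption-based: pay only for what each month actually used
consumed_cost = sum(monthly_demand_tb) * COST_PER_TB_CONSUMED

print(owned_cost, consumed_cost)  # → 7200 5124
```

With spiky demand like this, the pay-per-use model wins despite the higher unit rate, because the owned estate sits mostly idle between the two peak months — the inefficiency Brilly calls out.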
Building up resilience
As businesses grapple with the different elements of virtualisation, their expectation of what it brings and what to expect changes, too.
One change that Chris Ducker, senior director of global proposition strategy at Sungard AS, has seen is a greater focus on resilience, something that will only continue to grow in importance as time goes on.
“Over the last 12 months, we’ve seen the whole concept of resilience rising up the agenda massively on a number of levels,” he said. “One level we’re talking about is the issues of security… organisations face adversity from an environmental perspective, from a security perspective so they need to withstand that.
“They need systems that are resilient to those issues, but also organisations need to be resilient to a changing marketplace and have the [necessary] agility. Moving workloads to virtualised environments gives them that higher degree of flexibility to be able to address those challenges.”
The other area Ducker mentions is recovery, which is crucial when a business is hit by an attack or suffers an outage; virtualisation can remove some of the pain points associated with it.
“If you’re in a situation where you can’t access your resources, one of the key benefits of a virtualised cloud environment is you can quickly move the virtual machine from one server to another,” he added. “Which is fantastic for situations like backup, disaster recovery, business continuity, whereas if your workload is based on physical infrastructure and you can’t access the physical infrastructure, you’ve got a bit of a problem.”
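Ducker’s failover point can be illustrated with a toy placement model: because a virtual machine is not tied to physical hardware, the VMs on a failed host can simply be reassigned to survivors. Host and VM names below are invented, and this is not any hypervisor’s actual logic:

```python
# Toy sketch of the failover benefit Ducker describes: when a host
# becomes unreachable, its virtual machines can be brought up on other
# hosts because they are not bound to the physical box that died.

def fail_over(placement, failed_host):
    """Reassign every VM on failed_host to the least-loaded surviving host."""
    survivors = {h: list(vms) for h, vms in placement.items() if h != failed_host}
    for vm in placement.get(failed_host, []):
        target = min(survivors, key=lambda h: len(survivors[h]))
        survivors[target].append(vm)
    return survivors

placement = {
    "host-a": ["web-vm", "mail-vm"],
    "host-b": ["db-vm"],
    "host-c": [],
}
print(fail_over(placement, "host-a"))  # web-vm and mail-vm land on survivors
```

Physical infrastructure offers no such shortcut: if the hardware a workload lives on is unreachable, so is the workload — which is exactly the “bit of a problem” Ducker describes.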