
Moving to virtualisation 3.0

Virtualisation has been around since the 1960s, but levels of adoption still vary widely across the market.

Virtualisation has been part of the computing landscape for decades. In fact, the technology has existed in one guise or another since the 1960s, when IBM introduced its Control Program/Cambridge Monitor System (CP/CMS), allowing users to run isolated systems within a single computing environment.

True to form, the technology industry has put a modern twist on an old friend, with the current lifecycle now represented by the term ‘virtualisation 3.0’.

Virtualisation 1.0 stretches back to 2003 and the introduction of x86 server virtualisation, used mainly by companies for software test and development. Adoption grew quickly and moved into a 2.0 phase focused on consolidating production applications, driven primarily by cost savings.

But virtualisation 3.0 goes beyond servers and hypervisors to encompass storage, security and networking, transforming the data centre into a single agile, dynamic computing environment and enabling the delivery of IT as a service. In short, this is virtualisation for the cloud.

“Network virtualisation and firewall virtualisation are growth areas at the moment and are probably the last ones to catch up,” according to George Dowling, cloud and managed services practice lead with Ergo.

“The adoption of cloud technologies means that businesses using services such as public cloud Azure or Office 365 are now deploying virtual firewall appliances, where they secure or encrypt the data before it goes across any platform, eliminating threat.

“It’s done at the perimeter of the organisation before the data goes into the virtual layer but it enables the data to go into that virtual layer without the risk,” he said.
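To make Dowling’s point concrete: the pattern he describes is encrypt-at-the-perimeter, where data is encrypted on-premises before any cloud platform ever sees it. A minimal Python sketch of that idea, using the widely available cryptography library, might look like the following (upload_to_cloud is a hypothetical placeholder for whatever storage client the platform provides, not a real API).

from cryptography.fernet import Fernet

def upload_to_cloud(blob_name: str, payload: bytes) -> None:
    # Hypothetical placeholder for a public cloud upload call
    # (for example, a blob storage client); not a real API.
    pass

# The symmetric key is generated and kept inside the organisation's
# perimeter; it never travels with the data.
key = Fernet.generate_key()
cipher = Fernet(key)

# Encrypt before the data crosses into the virtual layer or any
# public cloud platform.
document = b"contents of a sensitive file"
encrypted = cipher.encrypt(document)
upload_to_cloud("sensitive-file.enc", encrypted)

# Anything pulled back from the cloud can only be read where the
# key is held, i.e. inside the perimeter.
assert cipher.decrypt(encrypted) == document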

Richard O’Brien, IT director with Triangle, sees customers turning their attention to the non-virtualised elements of their infrastructure. Network virtualisation, the creation of virtual networks that are decoupled from the underlying network hardware and managed through a software-based console, is being touted as the next wave of virtualisation deployment by major vendors including VMware, HP and Cisco.

“Up to four or five years ago, the maturity of networking within the virtualisation stack was pretty basic in terms of the functionality it could deliver,” he said. “But the Nicira purchase by VMware really extrapolates that to a different level, with full control from a software level down to individual ports across the whole network environment.”
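The decoupling O’Brien refers to can be pictured with a deliberately simple toy model: logical networks and ports live entirely in software, so they can be created, attached and torn down without touching the physical switches underneath. The sketch below is purely illustrative Python, not any vendor’s API.

from dataclasses import dataclass, field

@dataclass
class VirtualNetwork:
    # Toy model of a software-defined overlay network: the logical
    # topology is held in software, independent of physical hardware.
    name: str
    ports: dict = field(default_factory=dict)  # VM name -> logical port id

    def attach(self, vm_name: str) -> int:
        port_id = len(self.ports) + 1
        self.ports[vm_name] = port_id
        return port_id

# The "software-based console": define a segment and attach workloads
# with no re-cabling or switch reconfiguration.
finance_net = VirtualNetwork("finance-segment")
finance_net.attach("payroll-vm")
finance_net.attach("reporting-vm")
print(finance_net.ports)  # {'payroll-vm': 1, 'reporting-vm': 2}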

Rob Padden, solutions director with Trilogy Technologies, sees a number of benefits from software-defined networking, including flexibility and potential cost reductions, but he sounds a note of caution.

“There are some practicalities that need to be considered, which will hold back many companies from implementing such a major change. The design and operation of a complex network will not disappear just because dedicated network appliances are replaced by servers and software. Organisations need to be clear that they will still need appropriate network expertise, either from their own in-house team or through experienced third parties.

“If you have a WAN/LAN that was designed and implemented several years ago and has basically remained unchanged, making the case to change could be hard.  We believe that this is an approach that will be adopted by large, network-heavy enterprises initially,” he said.

But O’Brien admits that network virtualisation is at a very early stage in terms of adoption and future deployment will depend on realistic business cases, ROI and TCO models for the technology.

In the mid-tier market segment, David Kinsella, CTO with Datapac, sees strong, sustained adoption of virtualised storage solutions.

“The recent release of VMware’s virtual storage area network (SAN) essentially brings storage to the host server, providing traditional high availability, live migration and all the features that people are used to from virtualisation. But now the storage is sitting within the hosts themselves.

“Storage virtualisation eliminates the training element for the SMB and mid-sized market in Ireland as you don’t need as much technical skill to manage the disk within a physical server,” he said.

But while virtualisation technologies may be relatively mature at the mid-tier and large enterprise end of the market, the picture for small business is quite different, according to John Bergin, managing director with IT Force.

“I would say that 50 per cent of SMBs haven’t virtualised their servers yet. For small business, it’s a simple calculation really; the more users you have, the more you can invest in central storage and central server technology and spread the cost.

“Small businesses upgrade their server technology every three to seven years, and cost drives the decision, as you can get more out of virtualised servers, use resources more efficiently and cut down on power. I think most small businesses will go down the virtualisation route,” he said.
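The “simple calculation” Bergin mentions is essentially a per-user cost comparison. The figures below are purely illustrative assumptions, but they show how consolidating onto one virtualised host spreads the investment across more users.

# Illustrative figures only: one consolidated virtualised host versus
# several standalone physical servers, costed per user.
users = 40
virtualised_host_cost = 8000.0       # assumed cost of one shared host
standalone_server_cost = 3500.0      # assumed cost per physical box
standalone_servers = 3

per_user_virtualised = virtualised_host_cost / users
per_user_standalone = (standalone_servers * standalone_server_cost) / users

print(f"Virtualised: {per_user_virtualised:.2f} per user")   # 200.00
print(f"Standalone:  {per_user_standalone:.2f} per user")    # 262.50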
