By Allen Mitchell, Senior Technical Account Manager, MENA, at CommVault Systems
Beyond the obvious cost savings from consolidation and reduced operational overheads, businesses are now realising that they can improve the availability, reliability and even performance of their mission-critical applications by running them in a virtual context.
Gartner recently forecast that x86 server virtualisation will double, from 40% in 2011 to 80% by 2016. While the uptake of virtualisation is encouraging, data protection and management in the virtual environment continue to be a challenge for most organisations. Most existing tools for backing up and protecting virtual machines are limited in terms of scale, recovery capabilities, scope of support, or application integration.
Simply force-fitting legacy solutions that are not optimised for the new virtual environments only adds cost and increases the potential risk to the overall business.
What CIOs should know about virtual server data protection
One of the biggest misconceptions in IT is the conflation of "Best of Breed" with "Point Solution". Many vendors now offer data protection solutions which they claim are "purpose built" or "optimised" for the virtual world. Typically, point solutions protect only a subset of the overall data centre, in this case the virtual environment, which means they lack support for what are potentially key elements of a comprehensive data protection strategy.
Consequently, multiple solutions are required to ensure adequate data protection for the entire data centre. The drive to virtual infrastructure is about reducing cost by consolidating resources, maximising utilisation and centralising management. It is thus counter-productive to contradict this strategy by adding multiple, complex solutions for data protection and management.
CIOs should be aware that many purpose-built solutions run into real trouble when users attempt to scale beyond fifty or a hundred virtual machines, significantly impacting the performance of production hosts and applications. Choosing a solution that does not comprehend the scale and performance requirements of converged virtual infrastructure can result in unnecessarily expensive, over-designed deployments, or leave critical application and virtual machine data unprotected.
Finally, given that by the end of 2012 an estimated 58% of all x86 server workloads were virtualised, it can be deduced that many organisations have already virtualised the basic areas of their infrastructures. Therefore, for most organisations, achieving the goal of a fully (100%) virtualised environment means targeting Tier 1 applications. This puts data management squarely in the critical path of the virtualisation initiative.
IT leaders should be asking tough questions to ensure they have a data management strategy that accounts for the drive to a fully scaled-out virtual infrastructure, in order to avoid a costly, time-consuming data management redesign. Here are the top three:
Does the solution align to the long-term goals of the business?
The virtual platform and initiative is strategic. The supporting cast of infrastructure components and tools, including data protection, needs to align with this strategic direction. If the goal is to become 100% virtualised, the data protection solution must be ready to handle the scale, application integration, recovery, and access requirements to make this happen.
Does the solution help reduce cost?
Let's face it: the primary objective of virtual platforms is to do more with less and ultimately reduce costs across the board.