With the advent of hyper-converged infrastructure and enterprise cloud solutions, companies around the world want cloud-like simplicity and the ability to scale on demand when managing their on-prem data centers. In this emerging IT reality, data protection becomes an even more essential service for every cloud, be it an enterprise or public cloud.
It's easy to assume that a solution responsible for protecting data hosted on these cutting-edge platforms would match their ingenuity. However, this isn't always the case: most data protection vendors simply repackage their legacy solutions, rename them, and claim they are tailor-made for a specific platform.
There's an easy way to cut through the clutter, though. Start by asking a few simple questions of any enterprise backup software vendor you're considering:
Question: How long will it take to plan and architect my backup infrastructure?
The vast majority of plans an organization makes are subject to change, and the plans you put together for your data protection infrastructure should roll with your organization's overarching goals. So make sure the solution you choose is adaptable, i.e., it can expand gradually and modularly, and shrink if necessary. A solution that demands too much upfront planning probably has an architecture that is too rigid, and rigid architectures make life painful when organizational plans change.
Question: How much time will it take to deploy and configure before I can protect my applications and VMs?
This can be a complicated and time-consuming process depending on the architecture of the solution. Manually installing binaries for servers, data movers, agents, and plugins is a perfect recipe for an installation quagmire. An ideal data protection solution should take minimal time to become production-ready, otherwise known as the Protection-Readiness-Objective (PRO). This can be achieved in two ways: deliver the solution either as a pre-built software virtual appliance or as a service. The former greatly simplifies deployment and upgrades, while the latter eliminates the very concept of deployment altogether.
Question: How much effort does it take to scale the solution as my data center expands over time?
This leads us back to the first question. What if you see unexpected growth, or what if you're consolidating your existing footprint? In most cases, data creation and usage continue to grow despite continuous optimization of a data center's real estate and logical constructs. Whatever the case may be, the solution should seamlessly scale up, scale out, and scale across multiple on-prem and public clouds, while still maintaining simple, intuitive, single-pane-of-glass management.
Question: What’s the learning curve for my team to adopt the solution into their daily work life?
The success of any IT organization depends on its team. It's counterproductive for IT teams to spend months of training to master a data protection solution just to protect their data centers. What if the solution could provide a UI your end users are already familiar with? This is only possible if the solution's UI is purpose-built to match its supporting platform's UI and terminology. That way, your end users won't feel like strangers when using the solution for the first time, which makes for painless adoption.
Question: How much time is my team going to spend keeping the lights on: maintenance, troubleshooting, upgrades, and so on?
Do the solutions in place help or hinder your IT team's key goals, and do they improve your team's overall efficiency? IT teams can end up wasting countless hours troubleshooting a problem within a solution itself. The goal here is to eliminate maintenance shenanigans as much as possible so that IT teams can focus on high-value tasks. This is only achievable if a solution is lightweight and nimble, with fewer moving parts, and provides detailed logging that facilitates fast upgrades, minimal maintenance, and outstanding customer support.