When virtual tape libraries (VTLs) emerged several years ago, they improved how data backup applications wrote data to disk. Tape-centric backup applications already understood how to communicate with and manage tape devices, and how to write data in tape format, so a disk system that emulated tape slotted into existing backup processes with minimal disruption.
Virtual tape library vendors that launched those initial solutions were focused mostly on developing the emulations for popular Fibre Channel tape libraries; proving compatibility with leading backup apps; ensuring that the virtual tape library wasn't a single point of failure in the backup process; and figuring out how to get virtual tape copies easily offsite for disaster recovery (DR) purposes. Virtual tape library vendors masked any complexity by packaging their solution as a purpose-built appliance and partnering with disk vendors to deliver turnkey solutions.
As VTLs have moved from early adopters to mainstream users, the stakes have changed. Ease of deployment and use, improved backup performance and reduced physical tape media management issues are basic requirements today. Virtual tape library vendors now need to accommodate the more advanced needs of mainstream users and adapt to the changing climate.
Changes in the virtual tape library market
There are a number of factors contributing to the changing virtual tape library climate. First, the base technology is more mature. VTL vendors have been able to exercise their products in a variety of environments and collect valuable feedback from customers. Second, today's virtual tape library buyer is better educated not only about VTL technology but also about their own data protection requirements. This next generation of virtual tape library buyers (that's you) is more careful about vetting vendor claims. For example, what type of scalability is required to achieve advertised throughput estimates? And are data deduplication ratios realistic?
Another angle is the advancement of backup solutions. Most, if not all, of the traditional backup vendors have embraced disk-to-disk (D2D) backup and a new crop of backup solutions built specifically for D2D backup has emerged. Today, backup vendors are keenly aware of user needs for addressing backup window problems with better backup performance, improving recovery time objectives with rapid recovery techniques, and reducing the capacity of data transferred and stored with capacity optimisation technologies.
Because many of the newer backup products address these primary concerns, how have virtual tape library vendors responded? It turns out they're introducing a crop of new features to win over the pragmatic buyer (you again), which is great news for IT organisations.
Second-generation VTL requirements
The Enterprise Strategy Group surveyed virtual tape library users about the types of features they would like to see in their current virtual tape library solution. Data deduplication and improvements in virtual tape library management topped the list, followed closely by improved scalability, and better recoverability and performance. Some users, struggling with methods for getting data offsite for DR purposes, prioritised disaster recovery features and support for physical tape on the VTL back end. Lastly, support for protocols -- such as FICON and ESCON for mainframe support, and Ethernet for iSCSI support -- rounded out the list.
These users are concerned with overall functionality, and VTL vendors are beginning to factor this into their development roadmaps. For example, implementing replication between VTLs has become a standard feature for most solutions. Virtual tape library buyers are interested in how that capability maps to current processes and existing environments. Can the replication occur to more than one location? Is it possible to encrypt the data while in flight? Is scheduling (at off-peak times) or bandwidth throttling available? What techniques are offered to reduce the capacity of data transferred?
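The buyer's checklist above can be sketched as a simple policy object. This is a hypothetical illustration, not any vendor's actual interface; the `ReplicationPolicy` fields and the `may_replicate_now` helper are invented names for the knobs a buyer would look for.

```python
from dataclasses import dataclass

@dataclass
class ReplicationPolicy:
    """Hypothetical per-VTL replication policy covering the
    options raised above. All names are illustrative."""
    targets: list            # one or more destination sites
    encrypt_in_flight: bool  # protect data on the wire
    window: tuple            # (start_hour, end_hour) off-peak schedule
    max_mbps: int            # bandwidth throttle
    dedupe_before_send: bool # reduce the capacity transferred

def may_replicate_now(policy: ReplicationPolicy, hour: int) -> bool:
    """Replicate only inside the configured off-peak window."""
    start, end = policy.window
    if start <= end:
        return start <= hour < end
    return hour >= start or hour < end  # window wraps past midnight

policy = ReplicationPolicy(
    targets=["dr-site-a", "dr-site-b"],  # more than one location
    encrypt_in_flight=True,
    window=(22, 6),                      # 10 p.m. to 6 a.m.
    max_mbps=200,
    dedupe_before_send=True,
)

print(may_replicate_now(policy, 23))  # inside the window -> True
print(may_replicate_now(policy, 9))   # business hours -> False
```

The point of the sketch is that each survey question maps to a distinct, inspectable setting; a solution that can't answer one of the questions has a gap in its policy model.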
Here are some of the second-generation features virtual tape library users are interested in.
Capacity and performance scalability. Scaling storage capacity and scaling performance are high-priority requirements that go hand in hand. Some early VTL solutions had design bottlenecks where the VTL architectures didn't fully take into account how backup applications work or what performance limits they would hit. Often, the existing backup processes or jobs would have to change just to take the pressure off virtual tape library solutions. Second-generation VTLs must have the ability to seamlessly increase throughput performance as backup window pressure and recovery objectives dictate. The same holds true for capacity limits, and how easy it is to deploy additional disk capacity in the environment.
Capacity optimisation. As organisations back up more workloads to disk and retain data on disk for longer periods of time, capacity optimisation capabilities such as compression and data deduplication become more critical. Second-generation virtual tape libraries will need to have these features packaged in a way that addresses functional requirements. That includes deduplicating data across multiple virtual tape library heads, turning deduplication off depending on the workload, and deduplicating data in real-time (inline processing) or batch mode (post-processing) to address performance concerns.
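To see why deduplication reduces stored capacity, here is a minimal, hypothetical sketch of hash-based chunk deduplication in the inline style: each chunk is fingerprinted as it arrives and written only if new. A post-processing design would land the data first and run the same logic later as a batch job. The function names and the tiny 8-byte chunk size are illustrative only; real products use far larger, often variable-sized chunks.

```python
import hashlib

def chunks(data: bytes, size: int = 8):
    """Split a stream into fixed-size chunks (toy size for illustration)."""
    for i in range(0, len(data), size):
        yield data[i:i + size]

def dedupe(data: bytes, store: dict) -> list:
    """Inline deduplication: hash each chunk as it arrives and store
    only chunks not seen before. Returns the recipe of hashes needed
    to reassemble the original stream."""
    recipe = []
    for chunk in chunks(data):
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # physically write each chunk once
        recipe.append(digest)
    return recipe

def restore(recipe: list, store: dict) -> bytes:
    """Reassemble a backup from its recipe of chunk hashes."""
    return b"".join(store[d] for d in recipe)

store = {}
backup1 = dedupe(b"ABCDEFGH" * 4, store)                # 4 identical chunks
backup2 = dedupe(b"ABCDEFGH" * 3 + b"IJKLMNOP", store)  # mostly repeats
print(len(store))  # unique chunks stored: 2, not 7
```

Seven logical chunks across two backups collapse to two stored chunks, which is the mechanism behind the capacity-reduction ratios vendors advertise; the inline/post-process choice only changes when this work happens relative to the backup window.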
Central management. Organisations that have implemented multiple VTLs (due to scale limitations or by design) created an unexpected management issue: virtual tape library sprawl. Central management of policies and a consolidated view of multiple VTLs can alleviate the management burden, and are important features in the next phase of VTL products.
Disaster recovery. Another burden of adopting a D2D strategy is safeguarding data in the event of a local outage or disaster. Best practices call for a copy of backup media to be stored offsite. Those first VTL vendors either implemented a "tape export" command to generate a physical duplicate of a virtual tape or directly managed the physical tape library to create duplicate media outside of the backup window. Neither approach satisfied the requirement that the backup app maintain full knowledge and control of duplicates. The alternative was to initiate a virtual-to-physical tape copy through the backup app, which meant the backup application was burdened with additional processing and the data was dragged over the network unnecessarily. Second-generation VTLs are leveraging more efficient methods for creating physical media for offsite storage. Taking advantage of backup vendors' APIs, such as Symantec Veritas NetBackup's OpenStorage programming interface, allows VTLs to initiate duplication to tape or a secondary location, while ensuring that the media catalog is synchronized. DR preparedness is also being addressed via local-to-remote virtual tape library-to-virtual tape library replication. Second-generation VTLs take this basic feature a step further by allowing users the flexibility to decide which workloads require site-to-site replication; supporting multiple replication topologies, such as 1:1, and many-to-one ratios; as well as supporting bi-directional replication.
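The catalog-synchronisation point can be illustrated with a small, entirely hypothetical sketch: the VTL performs the physical copy itself (so the backup server isn't burdened and data isn't dragged over the network), then records the duplicate so the backup application's view of its media stays accurate. None of these names come from the OpenStorage API or any product; they are placeholders for the general flow.

```python
def duplicate_to_tape(virtual_tape: dict, catalog: dict, write_tape) -> str:
    """Hypothetical VTL-initiated duplication: copy a virtual tape to
    physical media, then register the duplicate in the backup
    application's media catalog so the two stay synchronised."""
    barcode = catalog.get(virtual_tape["id"], {}).get("duplicate")
    if barcode:
        return barcode  # already duplicated; catalog is authoritative
    barcode = write_tape(virtual_tape["data"])  # copy happens on the VTL
    catalog[virtual_tape["id"]] = {"duplicate": barcode, "offsite": True}
    return barcode

# Simulated physical tape drive, standing in for real hardware
def fake_write_tape(data: bytes) -> str:
    return "PT0001"

catalog = {}
barcode = duplicate_to_tape({"id": "VT042", "data": b"backup"}, catalog, fake_write_tape)
print(catalog["VT042"]["duplicate"])  # PT0001
```

The contrast with first-generation approaches is that the catalog update is part of the duplication step itself, rather than something the backup application discovers (or misses) after the fact.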
iSCSI support. Improvements in iSCSI performance, wider deployment of 10Gb Ethernet networks and increased support for iSCSI from storage vendors may drive more demand for Ethernet interface support. Users who want to leverage an IP SAN instead of, or in addition to, FC will look for VTLs that support iSCSI connectivity.
We know that organisations of all sizes are under pressure to have secondary data available and quickly accessible in downtime situations. Not doing so can be costly -- if not devastating -- to organisations. Virtual tape libraries have emerged as a means to this end for a segment of the market. VTL vendors are responding to these challenges as evidenced by the excitement around their incorporation of data deduplication features. For users, it means scouting out vendors that are improving on these capabilities so you can be prepared to take advantage of the benefits being offered by the next generation of virtual tape library products.
Lauren Whitehouse is an analyst with Enterprise Strategy Group and covers data protection technologies. Lauren is a 20-plus-year veteran in the software industry, formerly serving in marketing and software development roles.