Narrowing the Enterprise Backup Window Choices

As the amount of data in an organization increases, the scale of the backup solution must keep pace. And as the world moves to a Web-enabled e-business model, downtime for backup and restore must be kept as close to zero as practicable. Backup speed can become a critical issue for business competitiveness.

Fueled by Internet technologies, electronic messaging, multimedia, online transaction processing and other data-intensive applications, the enterprise backup and storage market is expected to double during the next two years, according to David Hill, Storage Analyst with the Aberdeen Group.

The backup process copies information, especially vitally important information, onto magnetic tape or disk. This enables the restoration of anything from one file to the entire system, should the need arise. Backups have helped companies recover from data losses caused by power surges and outages, static electricity, lightning strikes, terrorist bombings, user errors, viruses and even spilled coffee.

Data recovery tools and services do exist, but they are limited and expensive. With such products, users may be able to retrieve some of the lost data; retrieving all of it is unlikely.

Complex applications, network configurations, customized set-ups and passwords may be impossible to retrieve. The sudden loss of a mission-critical server that stores and maintains corporate records and data (one of a company’s most valuable assets) can be financially disastrous. A well-designed backup system safeguards crucial information, providing the most efficient insurance against disaster.

Developing a successful backup strategy begins with a carefully planned backup-needs analysis. The administrator first identifies the company’s total backup needs and then matches those needs to the appropriate backup hardware and software.

In many companies as much as 40 percent of the data changes every month. Ultimately, all company data and programs should be backed up so that the entire system can be restored in the event of a catastrophic disaster. The total amount of data to be backed up indicates the capacity required of the drive and the media. If planned backups will be unattended, then the selected backup device must have enough capacity to hold the full amount of information to be backed up.

Another crucial factor to consider is the performance of the backup system, both hardware and software. System administrators typically perform backups when user demands on the server are at their lowest.

Ideally, this time period, often called the backup window, happens when user access can be restricted or the server shut down. For many companies with worldwide operations accessing their servers, no clear backup window exists. In such an instance, the administrator will have to perform the backup while the system is still in use. This procedure often leads to some degradation in overall performance. Where a backup window does exist, the selected device’s backup rate, together with the appropriate backup software, must be able to accomplish the task within the time available. The amount of data that needs to be backed up, along with the time available for doing it, often determines the type of media and software that can be used.

Determine performance by dividing the amount of information (in gigabytes) that must be backed up by the size of the backup window (in hours). This simple calculation yields the required performance as an overall transfer rate expressed in gigabytes per hour.
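As a quick sketch of this calculation (the function name and figures here are illustrative, not from the article):

```python
def required_rate_gb_per_hour(data_gb: float, window_hours: float) -> float:
    """Overall transfer rate needed to back up data_gb within the window."""
    if window_hours <= 0:
        raise ValueError("backup window must be positive")
    return data_gb / window_hours

# Illustrative example: 120GB of data and an 8-hour overnight window
# require a sustained rate of 15GB per hour.
print(required_rate_gb_per_hour(120, 8))
```

The resulting figure can then be compared directly against the sustained transfer rates quoted for candidate drives.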

Fundamental Backup Technologies

Quarter Inch Cartridge. Currently, three types of backup technologies are frequently used. The oldest and most common of these is Quarter Inch Cartridge (QIC). These drives have the lowest capacities and slowest transfer speeds. QIC drives fit the half-height form factor of desktop computers, and QIC tapes are virtually the industry standard for standalone machines. However, with capacities limited to 1.2GB, they are not generally suitable for backing up servers with 2GB or more of storage – a capacity exceeded by ordinary desktop PCs nowadays.

Low capacities and slow speeds are not the only disadvantages of QIC drives. They also require regular maintenance. Like the normal audio tape heads in consumer tape recorders, QIC drive heads must be cleaned regularly. This process is particularly important if the tapes themselves are being reused often. QIC drives are slow, not just because of their overall data transfer rate. Most of them only allow sequential access to data. Therefore, selectively replacing backed-up data becomes difficult.

Despite these disadvantages, QIC drives, which are still very popular, offer excellent value for money. For many small businesses concerned about cost, QIC drives offer the best solution. Even for larger enterprises, QIC can be useful for short-term archiving.

Digital Audio Tape. The second most popular backup hardware format is Digital Audio Tape (DAT). Servers with 2GB to 8GB capacities can benefit from DAT drives. A high-density DAT tape can store up to 16GB of data with efficient compression. Like QIC drives, DAT drives offer good value for money. In addition to larger capacities, they offer better reliability and, more importantly, random access to stored data.

Who should consider using DAT for their backups? Medium-level enterprises that have modest backup windows and budgets are ideally suited for DAT backups. Even fast-growing small businesses with increasing backup requirements might want to consider migrating to DAT from QIC or other, less capable solutions.

Digital Linear Tape. One of the Goliaths among backup technologies is Digital Linear Tape (DLT) (see sidebar). These drives can store up to 30GB of data. The basic DLT technology has been around for a decade; Digital Equipment Corporation (DEC) first introduced it in the early 1990s. When coupled with RAID technology, arrays of identical DLT drives offer a high degree of fault tolerance and data accuracy. In addition to much higher backup capacities, the basic DLT design is inherently more reliable than its counterparts from the older technologies.

In DAT and other helical scan drives, the tape gets wound tightly around a drum and moves through a complex tape path, resulting in stress and abrasion to the tape and the heads. DLT tape operates at a lower constant tension. Therefore, it has a longer life span.

The technologies mentioned above are not the only ones available for backup purposes. Many companies offer proprietary backup solutions at a lower cost. However, those who intend to adopt such seemingly less-expensive, proprietary technologies must also inquire about hidden costs and the level of technical support available in each region.

Deciding the type of backup device is only one step toward solving the problem. After this, the system administrator has to adopt a schedule and methodology for creating the backups. For instance, if the backup windows are really small or non-existent, the system administrator may opt for a dedicated network just for backup purposes.

This dedicated network, which is completely independent of the normal LAN, has two major advantages. First, it almost completely eliminates the need for a backup window, as the backup process can be completed while users are still logged on to the network.

Second, because there is a dedicated channel for transferring backup files, the bandwidth on the main LAN remains untouched and does not affect performance. This dedicated network can be constructed using regular networking cables or fibre channel cables. Of course, high bandwidth technologies, like fibre channel, tend to be much more expensive.

Choosing the right hardware is only half of the backup solution. If you do not have fully tested software to run on the appropriate platforms, even the most robust hardware setup may not be able to prevent loss of data.

Like hardware, backup software is available from a number of different vendors to suit all needs and budgets. The major hardware vendors offer quite a few advanced backup software packages, available for most popular server operating systems such as Windows NT and UNIX.

Basic Backup Software

In most companies, the backup process occurs remotely without human intervention. Therefore, more than anything else, a backup software package must be able to schedule all the tasks involved. At times specified by the system administrator, the software must initiate the backup process. The software must be capable of continuously monitoring the entire process and reporting any problems it encounters.
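For illustration, on a UNIX server this kind of unattended scheduling might be driven by cron; the backup command and paths shown here are hypothetical, standing in for whatever the chosen backup package provides:

```shell
# Hypothetical crontab entries: a full backup at 2:00 a.m. every Sunday,
# and an incremental backup at 2:00 a.m. on the other six days.
# Output is appended to a log so failures can be reviewed the next morning.
0 2 * * 0   /usr/local/bin/run_backup --full >> /var/log/backup.log 2>&1
0 2 * * 1-6 /usr/local/bin/run_backup --incr >> /var/log/backup.log 2>&1
```

Commercial backup packages typically wrap this kind of scheduling in their own interface, adding the monitoring and error reporting described above.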

If the problem relates to software (such as a conflict with another application), the package must be intelligent enough to recognize it as such and take the appropriate measures to correct it. Good-quality backup software can even work around some hardware problems.

Most backup software vendors offer regular updates for their products. However, not all updates may be appropriate for every organization. Therefore, always check the quality of technical support available before purchasing the product. Most of the major vendors of backup software support the popular UNIX platforms, NT and NetWare. If you are backing up data resources on other platforms, you may find your choice is restricted.

Data backup is an integral part of any organization’s IT infrastructure. Although not as visible as many other operations, it is just as important to the success and well-being of any organization. As the amount of data in an organization increases, the scale of the backup solution must keep pace.

As the world moves to a 24x7 Web-enabled e-commerce model, downtime for backup and restore must be kept as close to zero as practical. Backup speed can become a critical issue for business competitiveness.

About the Author: Elizabeth Ferrarini is an independent marketing consultant.

Can DLT Be Dethroned?

By Zophar Sante

DLT is considered by many to be the de facto standard for providing data backup and protection to midrange computing environments. There are roughly one million DLT tape drives and over 20 million data cartridges in service today. Eighty-five percent of all applications for tape technology within this midrange segment are for data backup and disaster recovery. DLT tape represents an 86 percent market share of this lucrative and demanding midrange computing segment where Fortune 500 companies and medium-sized businesses alike rely on their operational data to stay competitive.

The earlier success of DLT is rooted in reliability. DLT, as a technology, belongs to the family of multi-channel linear recording tape drives. Multiple tracks of data are simultaneously written down the length of the tape. Once the end of the tape is reached, the head steps to the next set of tracks and starts writing or reading back in the opposite direction. Linear technology has been reliably used to record computer data since the early 1950s and offers important design advantages over 4mm and 8mm helical scan technologies.

It’s physically impossible to drive your minivan at 300 miles per hour down a mountain road, and you can’t fit 50 people into your two-door European roadster. These two low-tech statements describe the barriers 4mm and 8mm helical scan drives have in providing reliable, big capacity products that scale far into the future.

A two-door roadster can only hold so many people, and small 4mm and 8mm data cartridges can only hold so much data. The physical recording surface of a 4mm data cartridge is less than 800 sq. inches, and that of an 8mm cartridge is less than 2,200 sq. inches. DLT has a recording surface of 10,000 sq. inches – roughly 4.5 times more room for data than 8mm and more than 12 times more than 4mm. Why is this important?

Keeping data bits separated is the biggest reliability challenge in moving or storing data, whether on tape or on hard disk. As you pack bits more tightly together to gain capacity or transfer speed, you must still keep them separated, and that requires space. If data bits are not kept separated, they become unreadable – much like a sentence whose letters sit so close together that they overlap. Future 4mm and 8mm tape products become more difficult to design and manufacture when the space available for data is already several times tighter than on DLT. DLT has a lot of space on the cartridge, providing great potential for reliable future products; Super DLTtape can scale to 500GB.
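The surface-area comparison works out as follows, a quick check using the figures quoted above:

```python
# Recording surface areas quoted in the article, in square inches
surface_sq_in = {"4mm": 800, "8mm": 2200, "DLT": 10000}

# How many times more recording surface DLT offers than each smaller format
for fmt in ("4mm", "8mm"):
    ratio = surface_sq_in["DLT"] / surface_sq_in[fmt]
    print(f"DLT vs {fmt}: {ratio:.1f}x")
```

So at the same areal recording density, a DLT cartridge has roughly 4.5 to 12.5 times the raw room for data.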

It’s impossible to drive your minivan at 300 miles per hour down a mountain road unless it is designed like a linear tape drive. Super DLT, SLR32 and SLR50 are linear tape drives that use a "closed loop" head tracking system. Like most hard drives, linear tape drives can use servo feedback tracks embedded in the media and an extremely fast voice-coil-based head positioning system to keep the heads perfectly aligned over the critical data tracks.

To give you an idea of how accurately and quickly a closed loop tracking system works: let’s say you’re driving your car on a seven-foot-wide mountain road that swerves nine feet in either direction, leaving six inches of clearance on either side of your car. With a voice-coil-mounted head and a "closed loop" data recovery system, you could travel that road at 420,000 miles per hour. Helical scan drives use a spinning drum that acts as a gyroscope; with the head spinning at over 4,000 RPM, it is almost impossible to correct anything "on the fly." Linear drive heads, on the other hand, can be continuously adjusted to stay aligned with the data tracks. This makes it much easier to design and manufacture faster and bigger tape drives.

The DLT7000 boasts a 70GB capacity with a transfer rate of 36GB per hour, assuming 2:1 data compression. It fits into a small five-inch form factor, a far cry from earlier reel-to-reel systems, and provides backward read and write compatibility with earlier DLT formats. The DLT7000 has a 30,000-hour head life and a 200,000-hour MTBF at a 100 percent duty cycle. The media is rated for one million passes and has a 30-year shelf life. There’s no doubt that DLT has great specifications and an impressive future roadmap with the forthcoming Super DLT. But are specifications enough to maintain DLT as the standard?
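Applying the backup-window arithmetic from the main article to these figures (the specification numbers are from above; the function itself is illustrative):

```python
def backup_hours(data_gb: float, rate_gb_per_hour: float) -> float:
    """Hours needed to back up data_gb at a given sustained transfer rate."""
    return data_gb / rate_gb_per_hour

# Filling a 70GB DLT7000 cartridge at the quoted 36GB/hour
# (assuming 2:1 compression) takes just under two hours.
print(round(backup_hours(70, 36), 2))
```

In other words, a single DLT7000 drive comfortably fits a full-cartridge backup into a short overnight window.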

Continued success goes well beyond specifications on a brochure or an impressive roadmap of future products. Newcomer technologies hoping to dethrone DLT have to look beyond their specifications and price and also focus on the other elements that are part of the buying equation. DLT has three strong advantages that are not technological but rather strategic. Strategic advantages may be very important when an IT manager is deciding on which technology to select.

Will a newcomer be supported for many years to come? Even a big brand name does not ensure the future of any product; a huge multinational company will "kill" a product if it is not making money. DLT is at the center of a revenue universe. It is being sold by some of the largest IT companies in the world – HP, IBM, Compaq, Sun, Dell, DG and SGI – and the list goes on. High-end data protection and disaster recovery companies like StorageTek, ATL, Emass, Breece Hill, Qualstar, Exabyte, Computer Associates, Veritas/Seagate, Legato Systems and NovaStor protect your data by writing it to DLT.

The "bottom line" says that DLT is generating solid revenue for the biggest organizations in the IT and data protection industry. Healthy revenue is key to longevity.

In fact, demand for DLT is so high that Quantum has licensed Tandberg Data to build a manufacturing facility to ensure that the DLT revenue universe is supplied with enough product.

DLT is "dyed in the wool." It is now part of the infrastructure that supports businesses all over the world, and almost all IT managers must consider infrastructure issues when selecting or changing a tape technology. First, there is compatibility with current or legacy media; tape is removable data protection and storage, so you have to be able to read it back. There are also existing hardware and software investments and compatibility issues to review. A large 10TB DLT library can easily cost $250,000, and it is not something you can simply toss out.

DLT works; you don’t need to fix it. Probably the last element is time and human resources. IT managers are very busy people, and keeping data flowing is an around-the-clock job. A new technology will not receive resources from an IT manager unless it offers a significant advantage that solves a big problem. DLT has big capacity, is fast and very reliable. Super DLT will be even bigger, faster and more reliable.

About the Author: Zophar Sante is Senior Product Manager at Tandberg Data.
