Extreme Storage: SAN Pushes the Limits of the Enterprise
In the crazy ups and downs of the IT world, there was always one steady and predictable element you could count on: storage. From year to year, databases and systems would grow at a steady pace. IT managers simply added disk space on the fly as applications grew or were brought online, and based their storage budgets on system development plans for the coming year. No longer.
Now, storage requirements are growing just as wildly as everything else. E-commerce, business intelligence, data warehousing, e-mail, and Windows 2000 have blown the whole storage scenario to pieces. In a survey by FIND/SVP on behalf of EMC Corp. (www.emc.com), 74 percent of managers say they are being asked to roll out new applications at record speed, with 85 percent saying their biggest challenge is responding quickly to unpredictable customer demand generated by the Internet.
"A lot of the increase is being driven by these messaging systems, such as Exchange and Lotus Notes," says Brian Maher, senior program manager at EMC. "These applications tend to gobble up quite a bit of capacity as we get more ornate with our PowerPoint and Excel spreadsheets."
Companies are collecting and absorbing an enormous amount of information, says Robert Abraham, president of Freeman Reports (www.freemanreports.com), a storage industry analyst firm. "Every time a customer or employee makes a transaction or inquiry, the company needs to record that. Plus, there’s all their databases and other internal requirements," he says.
This is all wreaking havoc with storage planning.
The spontaneous combustion of storage needs is also expensive: Dataquest Inc. (www.dataquest.com) estimates that storage represents 35 percent of hardware purchases. IT departments continue to increase their annual budgets for storage hardware and services by an average of 16 percent to 20 percent a year. On average, storage requirements are doubling every year; among Internet companies, demand for storage capacity is doubling every 90 days. Companies can't seem to get enough storage capacity, which serves a range of needs from performance to fault tolerance and backup. Even for the storage industry, whose members are used to high growth numbers, this is unprecedented.
"In 1993, the total worldwide requirement for mainframes was 75 TB," EMC’s Maher says. "Today, we're seeing many datacenters going well over 100 TB."
"How do you back up 100 TB?" asks Wayne Giroux, manager of storage strategy for Amdahl Corp. (www.amdahl.com). Rapid growth in storage has made both disk and tape storage a daunting challenge. "How do you back up when you have to have your applications online and available 24 hours a day, seven days a week, 365 days a year?"
That's why storage capacity planning has grown so important in recent years. Organizations need to develop a roadmap of how much disk space they will require as they expand applications and e-commerce efforts, and the financial justifications for such expansion.
"CIOs who carefully monitor transaction performance and associated storage utilization requirements should be better prepared to meet ever-changing needs," says Edward Broderick, senior research analyst at Robert Frances Group Inc. (www.rfgonline.com). "CIOs should judge whether growth over the next two to five years could be economically accommodated with more devices of the current technologies or whether conversion to a new technology is warranted."
Chiron Corp. (www.chiron.com), a biotechnology company, recently reined in its disk acquisition process with a deliberate, planned methodology and software tools. "Managing disk space was pretty much hit or miss," says Anthony Flux, director of IS at Chiron.
Chiron’s IT department didn't know precisely who was using how much disk space; they couldn't get down to that level. And administrators managed reactively: if a partition filled up, or got too close to its limit, they sent out e-mail messages and dealt with the problem through manual processes.
"It was inefficient, it took a lot of time, and it was not very effective," he says.
More Than Dot-com
In the wild world of e-commerce, it's hard to plan for next week, let alone six months down the road. "No one has any idea how fast it's going to grow," Amdahl's Giroux says. "How do you plan for capacity when you don't know how successful you're going to be?" An initial investment in a few terabytes worth of storage "could very quickly grow to the hundreds of terabytes, or even petabytes," he explains.
This isn't just a dot-com problem, either. Traditional in-house business applications in both Internet and brick-and-mortar companies gobble up plenty of storage. "Even traditional applications are getting an e-commerce slant to them now," says Duane Ternes, manager of Unix and database systems at Electric Lightwave Inc. (www.eli.net), a major network service provider. For example, Electric Lightwave is now exploring providing billing details to customers, which means leveraging data from back-end systems. "Applications such as this create direct impacts to our storage environments," Ternes says.
The FIND/SVP survey finds that while the Internet is playing an ever-increasing role in large businesses, 66 percent of the respondents say they have not been able to integrate their Internet data with their traditional business data. Simple space requirements also dog new storage acquisition efforts.
"With these growth rates, the sheer capacity of the physical IT facility to hold all of these devices comes into question," RFG's Broderick says.
To manage this challenge, many companies are moving to or exploring SANs, which give rapidly growing sites flexibility and scalability without the constraints of particular platforms or physical facilities. More than half of respondents to the FIND/SVP survey have either implemented or are planning SANs to help manage their information from multiple systems. Many of these systems now support more than a terabyte.
While SANs are popular, most companies are still only in the early stages of the technology. Electric Lightwave, for example, has just begun to implement a SAN infrastructure that ties together its various Hewlett-Packard storage systems.
Johns Manville International Inc. (www.johnsmanville.com), a manufacturer of building and roofing materials, reinforcements, and filtration media, created innovative solutions for its growing roster of demanding applications. Currently, the company, which maintains almost 5 TB of storage on HP storage systems, has begun to implement a SAN, according to Scott Blancett, IT project manager at Johns Manville.
"The normal purchasing pattern for storage was to buy it as you needed it," Freeman Reports’ Abraham says. "Since the growth of storage has accelerated, it’s really important that storage solutions be highly scalable. Your storage architecture is the key. Scalability is one of the primary challenges of SAN -- to stretch it much further, to get a much greater degree of scalability than there's ever been in the storage world."
SANs are also sparking a consolidation trend, which ranks among the top reasons for implementing a SAN, along with backup and disaster recovery and file sharing. A well-deployed SAN should address at least one of these needs, says Tom Rose, vice president of advanced marketing at HighGround Systems (www.highground.com). "The two killer drivers we see in planning for SANs are faster LAN-free backup and server consolidation," he observes.
Interestingly, this may accelerate as companies seek to reduce the number of servers as they roll out Windows 2000, he adds. The FIND/SVP survey confirms that a large majority of companies are turning to enterprise storage strategies such as SAN to consolidate servers.
"Folks faced with migrating to Windows 2000 are looking to short-circuit that process by reducing the number of Windows 2000 copies they need to buy from Microsoft," Rose continues. "This can be addressed by buying bigger servers that connect to a SAN with lots of storage." In addition, more centralized storage enhances scalability, he adds.
The SAN is playing a role in Johns Manville's efforts to consolidate its Windows NT server base. "We experienced a fair amount of growth through acquisition over the last few years," Johns Manville’s Blancett relates. "We just completed a consolidation of our Windows NT environment, where we were growing servers like they were going out of style. We at one point had 25 file-and-print servers in our headquarters alone. We have since whittled that down to two, partly due to utilizing a SAN."
All new systems brought into the company will be automatically attached to the SAN as well, Blancett continues. "If it's going to be bought in the future, it's going to be attached to a SAN," he points out. "Most of our storage previously was direct-attach SCSI. We noticed a vast improvement when going to the SAN environment with Fibre Channel. We can now format an NTFS partition with 60 GB in under two minutes. Before, that was a long, drawn-out affair."
In migrating to a SAN, companies need to consider the data and applications that will be put on the SAN. Typically, the process begins incrementally, as opposed to a wholesale migration, says HighGround's Rose. "It's rare that a customer decides they're going to migrate their entire storage infrastructure to SAN. Usually they start with one or two applications."
At this early stage, users need to conduct an assessment of their storage infrastructures, including identification of large files that have high I/O requirements and servers with the highest rates of file change. "That typically impacts how big backups will be," Rose says. "If a lot of files change during the day, that means all those files need to get backed up that night, since backup/recovery and faster backup is one of the drivers of SAN. Knowing where your busiest servers are is key."
Electric Lightwave is selective about systems going on its SAN. Currently, only the company's largest databases -- those extending into the multi-hundred gigabyte range -- are being migrated to the SAN, Ternes says. "We're not making an effort to put smaller servers with internal storage on the SAN right now. A majority of our storage today is direct-attached storage, not SAN based," Ternes points out. "Our SAN environment is the cornerstone of our future, and that's where we're planning on growing all our future storage."
Rose warns that companies also need to identify "stale storage," or files that have not been touched for over a year. "The last thing you want to do when you implement a SAN, and you're buying expensive switches and expensive RAID storage, is migrate unused or unaccessed or stale files or stale data to the SAN," Rose says. "This is your chance to get back some capacity and prune your file systems."
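Rose's "stale storage" audit is straightforward to automate. The sketch below is a hypothetical Python script, not a tool any of the vendors here describe shipping; it walks a directory tree and flags files whose last-access time is more than a year old, so they can be pruned or archived rather than migrated to expensive SAN storage:

```python
import os
import time

STALE_AGE_SECONDS = 365 * 24 * 60 * 60  # roughly one year

def find_stale_files(root, now=None):
    """Return paths under `root` not accessed for over a year.

    Uses st_atime as the staleness signal; on volumes mounted with
    access-time updates disabled, st_mtime would be the fallback.
    """
    now = now if now is not None else time.time()
    stale = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                atime = os.stat(path).st_atime
            except OSError:
                continue  # file vanished or is unreadable; skip it
            if now - atime > STALE_AGE_SECONDS:
                stale.append(path)
    return stale
```

A report built from this list gives administrators the "chance to get back some capacity" Rose describes before the migration, rather than after.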
Even the best of plans, however, are subject to being shaken by the fast-moving Internet train. At Electric Lightwave, future storage needs past a six-month horizon are "pretty tough to calculate with any accuracy," Ternes points out. For instance, Electric Lightwave recently signed on a major client that sent the billing database from 50 GB to almost 80 GB within a month.
Plus, for every database, the company maintains a number of test environments -- which gobble storage space in multiple quantities. "We spend a lot of our storage in test environments," Ternes says. "We have at least two or three -- and in some cases, eight or ten test environments -- for each production database environment, depending on the size and number of conversions going on for that system."
"Business changes, combined with the need to support multiple test environments, have made storage planning difficult," Ternes continues. "The models that we've been using for forecasting are not very useful anymore. The key to making growth plans is building an infrastructure that you can grow beyond your expectations." As a general rule, if Electric Lightwave's free storage space falls below 20 percent of its entire disk environment, the company acquires more disk, he points out.
At Johns Manville, storage planning is also based on fairly rough estimates, because the company has been growing through acquisition, and Blancett says precise forecasts are difficult to calculate. The closest the company can come to estimating future storage needs is tracking monthly disk usage. With that data, Blancett forecasts the amount of disk space needed for the budget year, and doubles it.
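Blancett's back-of-the-envelope method -- track monthly usage, extrapolate across the budget year, then double the result -- can be sketched in a few lines. The function name, the linear-growth assumption, and the default safety factor are illustrative; this is not software Johns Manville describes using:

```python
def budget_forecast(monthly_usage_gb, months_ahead=12, safety_factor=2.0):
    """Project next year's storage need from monthly usage samples.

    Extrapolates the average month-over-month growth across the
    budget horizon, then applies a safety factor (Blancett doubles it).
    """
    if len(monthly_usage_gb) < 2:
        raise ValueError("need at least two monthly samples")
    deltas = [b - a for a, b in zip(monthly_usage_gb, monthly_usage_gb[1:])]
    avg_growth = sum(deltas) / len(deltas)
    projected = monthly_usage_gb[-1] + avg_growth * months_ahead
    return projected * safety_factor
```

For example, four months of usage at 100, 110, 120, and 130 GB extrapolates to 250 GB a year out, and the doubling brings the budget figure to 500 GB.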
All this demand for storage has been quite an eye-opener for the whole industry, Amdahl's Giroux says. "We used to have nice planned steady growth."
Understanding the new e-commerce-driven growth rates is the key to capacity planning, HighGround's Rose says. "If you don't know how fast storage is being used today in your direct-attached environment, then you're not going to know how to plan ahead, how big the partitions you make in the SAN should be, or how much expansion capability you should build into your SAN solution."
Administrators should know how fast and frequently files are being added to the network, and how fast the network space is currently being used. With that information, IT can make sound, accurate judgments as to how big a SAN partition should be, and what kind of extra capacity should be built in or planned for as a SAN is expanded.
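Those two measurements -- current usage and its growth rate -- combine into a rough partition-sizing rule. The sketch below assumes compound monthly growth and bakes in the kind of 20 percent free-space headroom Ternes describes; the function and its parameters are hypothetical, not a formula from any of the vendors quoted here:

```python
def partition_size_gb(current_gb, monthly_growth_rate, months=24, headroom=0.2):
    """Size a SAN partition for `months` of compound growth plus headroom.

    `headroom` is the fraction of the partition kept free at the end of
    the planning horizon -- 0.2 mirrors the buy-more-disk-below-20%-free
    rule of thumb.
    """
    projected = current_gb * (1 + monthly_growth_rate) ** months
    return projected / (1 - headroom)
```

With no growth, 100 GB of data still calls for a 125 GB partition to preserve the 20 percent cushion; at 10 percent monthly growth, the same data needs over 150 GB after just two months.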
Ultimately, SANs make capacity planning easier, but only if proper monitoring tools and procedures are in place. "The SAN makes it a bit easier because it centrally locates storage in one area, connected through Fibre Channel switches, hubs and bridges," Rose says. "On the other hand, it gets more complicated, because now there are all kinds of storage resources that need to be tracked. There are more potential points of failure and bottlenecks. Now you're adding switches and ports and hubs."