In-Depth

Homeland Defense: Are We Walking the Talk?

What's really happened to the security of information technology since Sept. 11?

The press has devoted considerable ink to "heightened security awareness" in public and private sector computing. New attitudes have loosened a few purse strings here, and accelerated product development there ... and certainly galvanized security marketeers.

But what's really happened to the security of information technology? Three themes have emerged in the wake of 9/11:

1. IT organizations are revisiting disaster recovery methods and techniques. Sept. 11 forced many organizations to invoke their disaster recovery plans to work around attack-related interruptions to power, telecommunications, air travel and other infrastructure and services. Many found unexpected gaps in those plans, in part because so many organizations were executing so many disaster recovery schemes at the same time.

2. Technology successes—and failures—inadvertently enabled the attacks. Government and IT post-mortems have identified technologies that aided in planning and executing the attacks.

Investigators have discovered that terrorists obtained information via the Internet, and used Web-based chat rooms to coordinate efforts—their work funded through bank networks. Some perpetrators' expired visas or illegal entry should've been picked up by monitoring systems, but poor data interchange between the INS, FBI and other intelligence agencies made that impossible.

3. IT has been identified as a potential target for future terrorist attacks, and a highly vulnerable one. The IT security industry was quick to observe that a smart enemy would strike the United States' Achilles heel: its dependence on information technology. By this reasoning, the next terrorist attack will come as concerted hacker attacks … unless IT organizations act quickly to improve security.

Truth be told, none of these themes covered any new ground. The need for effective disaster recovery planning is already well understood by most organizations. Until 9/11, DR was often a back-burner issue, or worse, a capability to be "bolted on" to applications after they had been fielded.

Unfortunately, the interest level almost always wanes as the disaster fades from public consciousness. Within a few months of 9/11, organizations outside Washington and New York are already relaxing vigilance.

But 9/11 did show that organizations with the best disaster recovery plans are more likely to survive than those that had left disaster recovery on the back burner. It also showed that the extraordinary rate of data growth in most companies has seriously compromised time-honored strategies for storage recovery.

Recovering today's complex storage infrastructures requires pre-planning a data backup strategy that replicates all critical data in the production center to a manageable, pre-defined minimum equipment platform at a pre-designated recovery center. DR planners must be well acquainted with the latest storage recovery technology capabilities and some very real limitations:

  • A single high-end tape library with 16 state-of-the-art tape drives transfers data to a storage platform in a recovery center at a maximum rate of about 1TB/hour. Planners should consider a range of alternatives including disk-to-disk mirroring or pre-staging of "static data" to recovery platforms on an ongoing basis.

  • Disk and array mirroring technologies often work only between arrays of the same type from the same vendor. Mirroring strategies must therefore replicate production facility storage components one-for-one at the recovery center. Planners may eventually be able to relax this requirement with the cross-platform, volume-based replication technologies now in the works, especially from storage virtualization vendors, but these technologies need to be pre-tested to ensure their reliability.

  • Data replication schemes can themselves result in significant data loss. Depending on the application involved, even a minor difference between the original and the copy can have serious consequences. Planners should investigate technologies that can catch such discrepancies, such as the Message Digest 5 (MD5) hashing algorithm, which "brands" data at the time of its creation with a unique fingerprint that can be used to verify contents as the data is migrated (see the sketch following this list).
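
A minimal sketch of how that branding-and-verification step might work is shown below in Python. The file paths and chunk size are assumptions for the example, not part of any particular product: a digest recorded when the data is created in the production center is compared against a digest of the replica at the recovery center, so silent corruption or loss during replication can be detected.

    import hashlib

    def md5_digest(path, chunk_size=1024 * 1024):
        """Compute the MD5 digest of a file, reading it in chunks."""
        digest = hashlib.md5()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                digest.update(chunk)
        return digest.hexdigest()

    # "Brand" the data at creation time in the production center ...
    original_fingerprint = md5_digest("production/orders.db")   # hypothetical file

    # ... then verify the replica at the recovery center after migration.
    if md5_digest("recovery/orders.db") != original_fingerprint:
        print("Replica does not match the original; investigate before relying on it")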

None of these technologies originated in response to 9/11. Each addressed day-to-day storage provisioning and management concerns in the production IT environment.

Secure Networks?
IT security vendors were quick to seize upon 9/11 and the war on terrorism to promote new offerings in firewalls, encryption and virus elimination. Very few of these products are new.

Victor Sheymov of Invicta Networks in Herndon, Va., a former KGB cryptographer turned entrepreneur, claims that conventional thinking about network security is something out of the Middle Ages, when monarchs "firewalled" their castles with moats and drawbridges, and used encryption to pass messages back and forth between their peers. He says the best way to protect networks from eavesdropping and hacking is to make them "invisible" to hackers by using a "three-dimensional," twenty-first century approach.

Sheymov's company offers a product called InvisiLAN, which uses its proprietary Variable Cyber Coordinates (VCC) technology to provide network invisibility by rapidly changing the logical network addresses of communicating end stations.
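
Invicta's actual VCC implementation is proprietary, but the basic address-hopping idea can be sketched. The Python fragment below is purely illustrative and assumes two endpoints that share a secret key and roughly synchronized clocks; each derives the same short-lived logical port for the current time slot, while an outsider without the key cannot predict where the conversation will move next.

    import hashlib, hmac, time

    SLOT_SECONDS = 5                      # how often the logical address changes (assumed)
    SHARED_KEY = b"pre-shared secret"     # hypothetical key exchanged out of band

    def current_port(key=SHARED_KEY, slot_seconds=SLOT_SECONDS):
        """Derive the port to use for the current time slot from the shared key."""
        slot = int(time.time() // slot_seconds)
        mac = hmac.new(key, str(slot).encode(), hashlib.sha256).digest()
        # Map the keyed hash into the dynamic port range 49152-65535.
        return 49152 + int.from_bytes(mac[:2], "big") % (65536 - 49152)

    print("talk to the peer on port", current_port())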

Security, of course, is a two-edged sword. Securing networks against hacking and eavesdropping by terrorists may sound like a good thing, but it may also enable increased security for the network-based communications of terrorists and their cohorts.

Wireless LANs also played a role in many 9/11 recoveries. Despite concerns about their security, wireless networks are increasingly pressed into service by organizations to support practical requirements and enhance the mobility of workers.

Current analyst projections show the wireless LAN market booming. Sales of 802.11b/Wi-Fi compliant wireless LANs climbed 21 percent in the fourth quarter of 2001, according to analyst firm META Group. At the same time, most companies are not enabling Wired Equivalent Privacy (WEP), the security mechanism built into the wireless LAN standard.

That leaves a high potential for "drive by" hacking by anyone with an 802.11b transceiver-enabled system. And not only can messages and application traffic be readily intercepted from an unsecured wireless connection, but often the connection itself offers an unprotected inroad into the wired LAN.

Wireless LAN equipment is cheap and easy to deploy, often without the involvement of corporate IT, so security managers may not even know the exposure exists.

Sharing with XML and Web Services
Authorities knew 9/11 perpetrators were in the country long before the incidents. Activities such as enrolling in flight schools might have triggered law enforcement to act had this information been shared between agencies. Technical (and bureaucratic) obstacles prevented such sharing.

Among the technical obstacles: incompatible database formats. According to the marketeers, if government agencies had XML-enabled their applications, the necessary information exchange could, in theory, have been far easier.

XML-based exchanges are easier to create, deploy and maintain than are EDI relationships.
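
As a rough illustration of what "XML-enabling" an application means, the sketch below shows one application serializing a record into XML that another application can parse without knowing anything about the first one's internal database format. The element names and values are invented for the example.

    import xml.etree.ElementTree as ET

    # Agency A: export a record from its own database schema as XML.
    record = ET.Element("visa_record")              # element names are illustrative
    ET.SubElement(record, "subject_name").text = "J. Doe"
    ET.SubElement(record, "visa_status").text = "expired"
    ET.SubElement(record, "last_entry").text = "2001-06-15"
    xml_text = ET.tostring(record, encoding="unicode")

    # Agency B: parse the XML without knowing Agency A's internal formats.
    parsed = ET.fromstring(xml_text)
    print(parsed.find("subject_name").text, "-", parsed.find("visa_status").text)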

Applications enabled as XML Web services would provide tremendous resiliency to interruptions for their owners (they could be hosted and replicated anywhere) and could virtually eliminate downtime, according to advocates.

But a significant amount of work is still required to refine the technology, establish a standard approach and address numerous security issues. Despite the rhetoric, this technology is still bleeding edge.

Could XML-enabled Web services have prevented 9/11? Maybe. Interchange wouldn't have mattered, though, without analytical savvy.

Shortly after Sept. 11, news services reported that a checkout clerk at a department store suspected a burgeoning terrorist attack. A patron had purchased several stuffed toy teddy bears, several packages of "BB" ammunition, and household chemicals that might be explosive. The clerk "did the math," saw a potential threat and reported it to the FBI.

An XML-based system at the department store might well have captured the data pertaining to the sale—facilitating data interchange between an inventory system, a point-of-sale system, and a customer service system, for example. However, without the "higher order analysis" provided by the suspicious sales clerk, this data would simply have been data.

It goes almost without saying that XML exchanges between intelligence community databases might yield better raw material for analytical estimates than no data sharing at all. But to suggest that XML-based or Web Services-enabled databases would have stopped 9/11 or any other terrorist event is quite far-fetched.

Lessons Learned
The key lesson of 9/11 was simple. Bad things happen. Those who prepare and plan, who build resiliency and security into their IT infrastructure, are more likely to recover their critical operations than those who don't. All the rest is marketing.

Most companies, unless compelled by a legal mandate, do not make security or resiliency a high priority. Both require provisioning, at potentially great cost in dollars and user convenience, capabilities that in the best of circumstances will never need to be used.
