IT Tackling Data Center Complexity with Information Governance

A survey commissioned by Symantec and released today examined responses from 2,453 organizations in 32 countries to learn how IT organizations are coping with managing their data centers. From security to disaster recovery, server maintenance to managing mobile devices, there’s no shortage of work for a data center manager.

Rating five areas of data center complexity (security, infrastructure, disaster recovery, storage, and compliance) on a scale from 0 (simple) to 10 (highly complex), respondents gave every area a score of 6.56 or higher; security was rated the most complex, at 7.06. I find it interesting that organizations in Latin America gave overall complexity a score of 7.81, while respondents in the Asia-Pacific region and Japan rated complexity the lowest at 6.15. Symantec didn’t speculate on the cause of that difference.

What’s behind the relatively high complexity ratings? Chalk it up to an increasing number of business-critical apps -- so said 65 percent of respondents, with growth of data coming in second at 51 percent, and mobile computing noted by 44 percent of respondents (multiple answers were solicited).

Complexity also leads to higher costs, according to 47 percent of respondents -- something enterprise managers would be well advised to heed. According to the survey, other side effects include “longer lead times for storage migration,” “reduced agility,” and “longer lead times for provisioning storage.”

The most troubling result, however, may be that 35 percent of respondents said that downtime is one of the effects of data center complexity. Uh oh -- but wait, it gets worse. “The typical organization in the report experienced an average of 16 data center outages over the past 12 months, at a cost of $5.1 million. The most common cause of downtime was system failures, followed by human error, and natural disasters.” [emphasis mine]

The challenge is to figure out what IT can do to mitigate such complexity. In its report, Symantec points out that “common activities include staff training; standardizing applications, hardware, and security; increasing budget; and centralizing data centers.” That’s all well and good, but the survey found that the single biggest mitigation step enterprises are taking is information governance: 9 percent said they have already implemented it, 23 percent are in the process, another 23 percent are conducting trials, and 35 percent are discussing it.

Survey respondents say they are being driven to launch a governance effort in large part by the need for better security, by the availability of new technologies that make information governance easier, and by the growth of data. They also expect such governance efforts to lead to enhanced security (so said 75 percent of respondents), “ease of finding the right information in a timely manner” (70 percent), “reduced information management costs” (69 percent), and lower storage costs (68 percent), among other benefits.

I asked Symantec why governance emerged at the top of enterprises’ list of approaches to mitigate data center complexity. After all, governance has been a big issue for IT for several years. It wasn’t so long ago that Sarbanes-Oxley and HIPAA demanded IT’s attention.

Trevor Daughney, director of product marketing at Symantec, said it’s because information governance equips organizations to reduce the risks related to eDiscovery and compliance and lets them consolidate previously discrete portions of IT operations.

“Symantec provides federated search and a common classification engine across critical data sources to bring context and relevance to information so organizations can find what they need, when they need it, and appropriately enforce policies and controls. The ability to centrally manage security, information retention, and eDiscovery functions also reduces operational expenses and training costs,” Daughney told me.

Is governance enough? Symantec said it begins with information governance and establishing C-level ownership. Organizations should start with high-ROI projects such as data loss prevention, archiving, and eDiscovery to preserve critical information, but they shouldn’t stop there. Companies also need to get visibility beyond platforms and understand the business services that IT is providing. These efforts will help IT to mitigate the effects of data center complexity.

-- James E. Powell
Editorial Director, ESJ



Three Security-Fortifying Steps to Take Now

Maintaining the best security is tough, with more applications running in the cloud, hackers getting more creative, and regulatory penalties increasing. What’s a security manager to do?

A white paper from PwC, Fortifying your defenses: The role of internal audit in assuring data security and privacy, looks at the role internal auditors can play in keeping an enterprise safe. Before you groan (yes, I know, we all hate those nit-picky, can’t-see-things-from-IT’s-perspective auditors), PwC’s recommendations make sense in light of today’s complicated IT environment.

PwC acknowledges that many companies already have comprehensive security controls and privacy policies in place. Notes Dean Simone, head of PwC’s risk assurance practice in the United States, “To battle the ever-changing hacker profiles and accelerating rate of technological change, companies need to constantly re-evaluate their privacy and security plans.”

Based on figures cited in the white paper, IT’s not keeping up. For example, “In 2011, only 39 percent of nearly 10,000 executives in 138 countries said they reviewed their privacy policies annually, compared to 52 percent in 2009. Only 41 percent had an identity management strategy in 2011, a decrease from 48 percent in 2009.” Those are not good signs.

PwC offers three lines of defense IT can establish to fortify an enterprise’s defenses, quoted here from the report:

Management: Companies that are good at managing information security risks typically assign responsibility for their security regimes at the highest levels of the organization. Management has ownership, responsibility and accountability for assessing, controlling and mitigating risks.

Risk management and compliance functions: These functions facilitate and monitor the implementation of effective risk management practices by management and help risk owners in reporting adequate risk-related information up and down the firm.

Internal audit: Provides objective assurance to the board and executive management on how effectively the organization assesses and manages its risks. It’s imperative that this line of defense be at least as strong as the first two for critical risk areas.

It’s not enough to have policies in place -- an enterprise has to make sure the policies are enforced and those policies are sufficiently up-to-date to handle the latest security threats.

The 12-page report is available at no cost here. No registration is required.

-- James E. Powell
Editorial Director, ESJ



Moving to Newer Operating Systems May Be a Safety Issue

A study by Fortinet Research shows that it’s “smarter” to move to a new operating system than to maintain an older version. That could be especially relevant now given that support for Windows XP will end in April 2014 -- which isn’t that far away.

Fortinet’s examination of its malware database revealed that the number of vulnerabilities is significantly greater for older operating systems because “exploit kits and existing malicious code have had ample time to mature and circulate.” The firm said that it’s also harder to find a working rootkit for Windows 7 than for Windows XP, crediting technologies such as PatchGuard, which protects the OS kernel from modifications. The same may also be true of Windows 8.

You can read more here.

-- James E. Powell
Editorial Director, ESJ



Number of Breaches Steady, but Fewer Identities Stolen Per Breach

There’s some good news for security administrators. In the latest Symantec Intelligence Report, the company compared breaches in the last 8 months of 2011 with those occurring in the first 8 months of this year. Overall, the number of breaches stayed about the same, but the average number of identities stolen per breach is down by almost half.

Other highlights from the report:

  • Spam now makes up 72.3 percent of e-mail traffic, an increase of 4.7 percentage points from July 2012 to August 2012

  • One in every 313 e-mail messages is on a phishing expedition, a slight increase in the last month; one in every 233 messages contains malware, a slight decrease over the same period

  • The number of malicious Web sites dropped by almost half (one Web site was blocked per day, on average)

There’s lots to mull over in the report, which is available here at no cost with no registration required.

-- James E. Powell
Editorial Director, ESJ



Symform Introduces Intriguing Backup Pricing Plan

If your budget is tight, but you have extra hard disk space that’s not being used, Symform has a backup plan with an attractive price.

The company announced today that it will now accept bucks or bytes as payment for its service. Not only can this cut costs, but it’s also far simpler than many competitors’ complex licensing plans, which boggle the mind and make it difficult to figure out what your monthly or annual charges will be.

In a nutshell, Symform uses a decentralized cloud model to protect and store your data. You tell the system what files and/or folders you want to back up, and it splits those files into small chunks, adds information to facilitate redundancy, encrypts the data, and distributes the fragments to others using the Symform service -- literally around the world.

Because no other Symform user has a copy of any complete file (and even if they did, it’s protected with AES-256 encryption), your data is safe. If you need to recover a file or folder, Symform retrieves all the fragments it needs, no matter where they’re stored. In a meeting last week at the company’s headquarters, Tim Helming, director of product management, told me that it would take 32 systems (all with at least one of your file fragments) to be unavailable simultaneously for your data to be unrecoverable, a highly unlikely event.
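To make the split-and-distribute idea concrete, here is a toy sketch of my own -- not Symform’s actual algorithm. It breaks a file into fixed-size fragments, adds a single XOR parity fragment as a stand-in for the redundancy information, and assigns each fragment to a different peer. The peer names, fragment size, and parity scheme are all invented for illustration, and the AES-256 encryption step is omitted.

```cpp
// Toy illustration of split-and-distribute backup (not Symform's real scheme).
// Fragments are left unencrypted here; a real system would encrypt each one
// (Symform uses AES-256) and use a far more robust redundancy code.
#include <cstddef>
#include <fstream>
#include <iostream>
#include <string>
#include <vector>

struct Fragment {
    std::size_t index;        // position of the fragment within the file
    std::string peer;         // hypothetical peer chosen to hold this fragment
    std::vector<char> bytes;  // fragment payload
};

std::vector<Fragment> splitForPeers(const std::string &path,
                                    const std::vector<std::string> &peers,
                                    std::size_t fragmentSize = 64 * 1024) {
    std::ifstream in(path, std::ios::binary);
    std::vector<Fragment> fragments;
    std::vector<char> parity(fragmentSize, 0);  // simple XOR parity across fragments
    std::vector<char> buffer(fragmentSize);
    std::size_t index = 0;
    while (in.read(buffer.data(), buffer.size()) || in.gcount() > 0) {
        std::size_t got = static_cast<std::size_t>(in.gcount());
        std::vector<char> chunk(buffer.begin(), buffer.begin() + got);
        for (std::size_t i = 0; i < got; ++i)
            parity[i] ^= chunk[i];              // fold this chunk into the parity
        fragments.push_back({index, peers[index % peers.size()], chunk});
        ++index;
    }
    // The extra parity fragment lets any single lost fragment be rebuilt.
    fragments.push_back({index, peers[index % peers.size()], parity});
    return fragments;
}

int main() {
    std::vector<std::string> peers = {"peer-a", "peer-b", "peer-c"};
    for (const auto &f : splitForPeers("example.dat", peers))
        std::cout << "fragment " << f.index << " (" << f.bytes.size()
                  << " bytes) -> " << f.peer << "\n";
}
```

Spread across enough peers, no single participant ever holds a complete copy of a file, which is what makes the decentralized model workable.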

All users get the first 10 GB for free. After that, Symform used to charge extra. No more. Now, as long as you keep a 2:1 ratio of “contributed” space (storage others can use on your hardware for their files) to space used for your own files, the backup/recovery service is free. For example, if you tell the system it can store 50 GB of other enterprises’ files (in fragments) on your system, you can store an additional 25 GB (above the initial 10 GB) of your own files (also in fragments) at no additional charge.
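The arithmetic of that ratio is easy to sanity-check. The snippet below is my own back-of-the-envelope helper, not anything from Symform: 10 GB free to start, plus 1 GB of free backup for every 2 GB of space you contribute.

```cpp
// Rough helper for the 2:1 pricing model described above; figures are
// illustrative only -- check Symform's current terms for the real rules.
#include <iostream>

double freeBackupGb(double contributedGb) {
    const double baseFreeGb = 10.0;          // everyone gets the first 10 GB free
    return baseFreeGb + contributedGb / 2.0; // 2:1 contributed-to-stored ratio
}

int main() {
    // Contributing 50 GB yields 35 GB of free backup: the 10 GB base plus 25 GB earned.
    std::cout << freeBackupGb(50.0) << " GB\n";
}
```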

For users who can’t maintain that 2:1 ratio, Symform also lets you pay for storage the old-fashioned way -- with money. If you start to exceed your ratio, you’ll get a warning message so you can contribute more space or more money. You can also pay for support with additional contributed space or money.

Setup is incredibly easy. During initial installation, you pick the files you want to back up and specify which devices or folders you want to make available to the system for the file fragments of others. You specify your current bandwidth and how much of it Symform can use (including when your peak usage hours are so Symform doesn’t interfere with bandwidth needed for everyday production). In the background, Symform uploads your files and places data fragments into your “contributed” space without interrupting your other applications.

The split-and-store approach to data protection is ingenious in and of itself. With the cost of storage remaining relatively stable (and much of your enterprise’s hard drive space unused), the new pricing policy is a smart choice for cash-strapped budgets.

-- James E. Powell
Editorial Director, ESJ



Imation Helps Enterprises Make Sense of Mishmash of State Data-Breach Regulations

If you think the patchwork of state data-breach notification laws is confusing, you’re not alone. Fortunately, Imation Corp. -- a scalable storage and data security company -- has collected and examined information from a variety of publicly available sites (such as the National Conference of State Legislatures) as well as analysis from two law firms, and consolidated its analysis into a compliance heat map, available at www.imation.com/compliancemap. (You’ll also find a link to the combined compliance map and state scores and rankings there.)

Imation looked at each of the states and applied a series of questions to evaluate details such as what data is covered and what notification and data destruction are required. For example, as part of its analysis, the company determined whether the law or regulation specifies how data is to be destroyed, the amount of the penalty, who must be notified, and what encryption is required (and whether encryption is sufficient). Imation factored in whether the law applies to “owners and licensees,” and examined whether the regulations apply to state agencies or whether government entities are exempt.

The map shows how strict each state’s data breach laws and penalties are, from light yellow (the least strict) to dark red (the danger zone). According to Imation’s analysis, Virginia has the strictest laws; the state has specific requirements about “what is to be included in the breach notification, requires government and credit reporting agency notification, and includes a large financial penalty relative to other states.” Virginia, along with a few other states, also requires notification even if the breached data was encrypted, provided the encryption key was stolen as well.

According to the company, “data breach notification laws are strikingly similar, but vary in compliance requirements for businesses, with all laws highlighting the need for companies to deploy methods for closely storing, protecting, and controlling sensitive information.” Imation looked at state compliance regulations of the 46 U.S. states with such laws (Alabama, Kentucky, New Mexico, and South Dakota have none), as well as the U.S. Virgin Islands, the District of Columbia, and Puerto Rico.

I asked Dave Duncan, software and security solutions marketing director at Imation, if there were any surprises in the results.

“What is surprising is that there are not yet uniform rules for data breach notification. A number of legislative attempts have been initiated in the U.S. Congress and Senate, but to date these have not yet become law. The lack of a standard application of data breach notification laws makes it extremely difficult for businesses to assess their risks and understand their obligations in the event of a potential or actual data loss.”

The report highlights the extra burden this patchwork system of regulations imposes. “Businesses with operations and customers in multiple states need to ensure they understand the potential implications of data loss in each state in which they have an operational facility and customers. This is because their requirements for response to any such loss will be mandated by the state in which customers live that have had their information lost. Some states also require notification disclosure based on the location of the company’s operations.

“Another issue is that some states require notification to customers if a potential for data loss occurs. For example, an organization may have misplaced a device that has consumer data on it. The device is not yet known to be lost or compromised nor has a theft of the data occurred. Merely the fact that a potential loss may occur can trigger notification laws in some states.”

Duncan added, “For businesses, the risks of potentially restrictive federal data breach notification legislative rules, may, in fact, be offset by the reduced costs of having a uniform set of guidelines by which they can better understand their risks and costs for notification in the event that a data loss occurs.”

-- James E. Powell
Editorial Director, ESJ



How to Think Like a Programmer

Why do programmers sometimes struggle to write programs? What’s getting in the way isn’t likely the programmer’s grasp of language semantics. V. Anton Spraul has been teaching programming for over 15 years. In his new book, Think Like a Programmer, Spraul argues that the missing skill is more likely problem-solving -- that is, “the ability to take a given problem description and write an original program to solve it.”

Of course, the author admits that not every program requires extensive problem-solving, but when it does, it could stymie a programmer. After all, problem-solving “is a different activity from learning programming syntax and therefore uses a different set of ‘mental muscles.’”

To help you firm up those muscles, the author investigates several approaches to solving problems -- from a divide-and-conquer technique to reducing the scope of the problem (by adding or removing constraints). He explains that programmers can look for analogies or use experimentation and observation to figure out a solution.

Spraul begins his exploration by looking at common puzzles and, using examples in C++, explores problems including message decoding, checksum validation, and breaking string input into an integer value. (Some familiarity with C++ is useful, but it’s not until you get to the section on pointers that you really need a good grasp of the language.)
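To give a flavor of that last exercise, here’s a quick sketch of my own (not code from the book) that converts a string of digits into an integer one character at a time -- the kind of from-scratch problem Spraul uses to build problem-solving habits.

```cpp
// Illustrative only: parse a (possibly negative) decimal string by hand,
// without library parsers. Overflow handling is ignored to keep it short.
#include <cstddef>
#include <iostream>
#include <stdexcept>
#include <string>

int stringToInt(const std::string &text) {
    bool negative = !text.empty() && text[0] == '-';
    int value = 0;
    for (std::size_t i = negative ? 1 : 0; i < text.size(); ++i) {
        char c = text[i];
        if (c < '0' || c > '9')
            throw std::invalid_argument("not a digit");
        value = value * 10 + (c - '0');  // shift earlier digits left, append this one
    }
    return negative ? -value : value;
}

int main() {
    std::cout << stringToInt("-3142") << "\n";  // prints -3142
}
```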

He covers a variety of common programming constructs: besides pointers, Spraul examines arrays, loops, string manipulation, classes, dynamic allocation and deallocation, list management, dynamic memory, recursion, and binary trees. The author spends chapter 7 talking about solving problems with code reuse, a never-ending goal.

In chapter 8, Spraul pulls all his concepts together to help you create a master plan for attacking your own programming problems. His advice is simple, straightforward, and practical. It’s an easy -- and valuable -- read.

A full, free preview of Chapter 6 -- one of the meatier chapters (it discusses recursion) -- is available here.

Think Like a Programmer by V. Anton Spraul; No Starch Press, 2012; 233 pages; $34.95

-- James E. Powell
Editorial Director, ESJ



How Enterprises Are Managing Big Data Backups and Recovery

The more data you have, the larger your data backup and recovery concerns. How are large enterprises meeting these challenges?

In its fourth annual survey, Sepaton asked 93 IT professionals in North America and Europe about their data protection issues and concerns as well as their expectations about future big data backups.

According to the company, nearly half (47 percent) are replicating over 50 percent of their data to remote recovery sites. On the opposite end of the spectrum, 23 percent of data centers “are still replicating less than 10 percent of their data to a disaster recovery site.”

We’re talking big data here -- 55 percent of respondents had backup volumes with more than 50 TB of data (that includes 14 percent with backups of over 1 PB of data).

Sepaton says that “21 percent have an active-active remote replication strategy in place” (meaning that they “back up and replicate full data sets using fully configured systems at both local and remote sites”), and 41 percent have an active-passive replication strategy (they “back up to a fully configured system in a main data center and replicate to a ‘passive’ system(s) in a remote site”). “This result is noteworthy,” the company pointed out in a prepared statement, “given that large enterprises have historically moved large data volumes by shipping physical tapes off-site.”

Those physical tapes haven’t disappeared: 18 percent of large enterprise backup environments still copy their data to physical tapes that are stored off site. Of course, that’s better than the 14 percent of respondents who say their enterprise is still working on creating its disaster recovery strategy, or the 3 percent with no data recovery strategy whatsoever.

Even those with DR plans may find their plans aren’t adequate. According to the survey, “11 percent of data in main data centers is currently not backed up or protected.” In fairness, that may be OK -- if the data not backed up includes multimedia files downloaded by employees against company policy.

The picture is brighter for remote offices. In last year’s study, more than a third of remote-office data wasn’t protected. This year, that figure is down to 15 percent.

Half of respondents feel “the remote office data is adequately protected in the event of site-wide disaster” (which is up from 30 percent in last year’s survey), and 60 percent of respondents rated “improving remote office data protection” as critical or of moderate priority for data protection in the next year.

The survey included respondents from several vertical industries, including 23 percent from government, 14 percent from financial services, 13 percent from health care, and 12 percent from manufacturing.

-- James E. Powell
Editorial Director, ESJ
