If you think eDiscovery means sifting through e-mail archives, you're only partly right. A new survey sponsored by Symantec sheds light on data retention and eDiscovery practices, revealing how growing volumes of information are making it tougher to comply with legal requests for information.
Symantec's 2011 Information Retention and eDiscovery Survey, conducted among 2,000 companies in May and June, looked at the challenges enterprises are facing and how they're coping with the growth and complexity of legal requests. One fact jumped out clearly, as Symantec’s Annie Goranson told me: no matter what industry you're in, there are identifiable best practices that, if followed, will make handling eDiscovery easier.
The world of eDiscovery is changing. E-mail has long held the top spot as the most-requested documentation in eDiscovery motions. In this year’s survey, e-mail dropped to third place (cited by 58 percent of respondents as an important document type). When asked what information was most “frequently” requested, just over two-thirds (67 percent) cited “files and documents,” followed in second place by database or application data (at 61 percent).
Instant messages and mobile phone text messages are still requested often. Bringing up the rear (at 41 percent) were requests for social media records, including corporate posts to Twitter, Facebook, and LinkedIn, among others.
Best practices for managing information retention and retrieval include implementing a formal information retention plan, automation of legal holds, and the use of a formal archiving tool. Goranson said Symantec divided respondents into three groups based on their information retention programs and, according to the report, the “quality” of the practices they followed.
“The top-tier companies that closely followed best practices were 81 percent more likely to have a retention plan in place, and were 63 percent more likely to implement the automation of legal holds. These organizations were also 50 percent more likely to use a formal archiving tool. They were also much less likely to follow poor information management practices, such as performing legal holds in their backup systems,” the report said.
Best practices notwithstanding, IT has little to brag about in its response to legal challenges. Only 35 percent of respondents said they “successfully fulfill the request in a timeframe that is acceptable to the requestor.” On the plus side, enterprises that followed best practices reported a 64 percent faster response time; the survey results, analyzed by Applied Research, found that these companies also enjoyed a 2.3 times higher success rate in responding to eDiscovery requests.
Another 25 percent say they fill the request more slowly than the requestor would like, though they didn’t specify the lag time. Only 10 percent say they “partially fail” to fulfill the request, and another 10 percent don’t fulfill the request at all.
Of those that did not fulfill the request successfully, penalties are clear: 42 percent said it damaged their enterprise’s reputation or caused embarrassment, and 41 percent said fines were levied. Thirty-eight percent found themselves in a compromised legal position, and 28 percent faced court sanctions. One quarter of respondents said it “raised our profile as a potential litigation target.” No doubt.
Again, here’s where following best practices clearly pays off. According to the report, those enterprises following best practices were more likely “to receive a favorable outcome in legal proceedings.” They “suffered fewer negative consequences than companies that lack a formal information retention policy” -- for example, they were 78 percent “less likely to be sanctioned by the courts and 47 percent less likely to find themselves in a compromised legal position. They were also 20 percent less likely to have fines levied against them. In addition, they were 45 percent less likely to disclose too much information, which could compromise their litigation position.”
When it comes to a formal retention plan, IT clearly isn’t doing a great job. Only 32 percent have a plan in place; 24 percent say they’re working on creating a plan, and 30 percent are discussing it or in the planning stage. Fourteen percent don’t have such a plan and have no intention of creating one. Remind me not to do business with any of them.
Given all the expense and angst organizations go through to deal with eDiscovery requests, most seem to have a pretty good (but undeserved) opinion of their ability to respond to “legal, compliance, or regulatory” requests. Seventeen percent said they were extremely confident in their preparedness, and 35 percent said they were “somewhat” confident. A quarter said they were “neutral.”
From the results I read, I’m not very confident in their confidence level. Nine percent admitted to being “extremely unsure” of their preparedness. Even that figure sounds too low to me.
-- James E. Powell
Editorial Director, ESJ
Posted by Jim Powell
Symantec Corp.'s latest State of Security report for 2011 examines the cybersecurity efforts of over 3,300 enterprises, and as in last year's survey, IT puts security squarely at the top of its list of business risks (according to 49 percent of respondents), followed by "IT incidents caused by well-meaning insiders" at 46 percent. Symantec emphasizes that those risks rank "ahead of traditional crime, natural disasters, and terrorism."
There's good news in the survey: enterprises say they're doing a better job at fighting cybersecurity threats, even though 71 percent were attacked in the last 12 months (compared with 75 percent in 2010). Twenty-nine percent of enterprises have it bad: they said they experience attacks "on a regular basis," though that frequency wasn't defined.
Fortunately, those reporting an increase in the frequency of attacks dropped (from 29 percent last year to 21 percent this year). Unfortunately, 92 percent of enterprises that were attacked reported losses; the top three were downtime, theft of employee identities, and intellectual property theft. On the positive side, that figure is down from last year's survey, in which 100 percent of respondents reported a loss. Leading attacks came from "malicious code, social engineering, and external malicious attacks."
Losses included "productivity; revenue; lost organization, customer, or employee data; and brand reputation," Symantec explained. "The survey found that 20 percent of small businesses lost at least $100,000 last year due to cyberattacks. That figure was even higher for large enterprises, with 20 percent incurring $271,000 or more in damages."
In a company release, Sean Doherty, vice president and chief technology officer of Enterprise Security at Symantec, points out that “There’s no question that attackers are using more insidious, sophisticated, and silent methods to steal data and wreak havoc. Organizations today have more to lose than ever before and need to keep adopting the security innovations and best practices that the industry is delivering to stay protected.”
Security is getting the recognition it deserves. More enterprises believe it's vital to maintain the security of operations and their information. "Forty-one percent said cybersecurity is somewhat or significantly more important than 12 months ago." Fortunately, only 15 percent think the opposite -- that it's "somewhat" or "significantly less" important.
Several trends are keeping IT on its security toes. For example, 47 percent of respondents said mobile computing was making it harder to maintain cybersecurity; social media wasn't far behind (at 46 percent), and the consumerization of IT was cited by 45 percent of respondents.
Slightly more than half (52 percent) say they are doing "somewhat or extremely well in addressing routine security measures," and almost as many (51 percent) say they're doing "somewhat or extremely well" in their response to attacks and breaches. They’re less successful in handling compliance or pursuing "innovative security measures."
Applied Research conducted the telephone survey of 3,300 respondents in 36 countries in April and May 2011. The survey targeted "C-level professionals, strategic and tactical IT, and individuals in charge of IT resources from companies with a range of 5 to more than 5,000 employees."
You can read a full copy of the report here. No registration is required.
Let’s face it -- at some point the tools you’re using to prevent data loss won’t be enough and your data will be breached. It’s not if but when, and it can occur because of a concerted hacking attack or because an unsuspecting user responds to a Facebook message from a “friend.”
Co3 Systems has released a new service that helps your enterprise prepare for such events and better manage incidents when they occur (and they will).
“To date, most enterprise security approaches look at guarding against breaches in the first place, what we call a ‘pre-incident’ approach,” Ted Julian, chief marketing officer at Co3 Systems told me. “Now there’s an emphasis on regulatory compliance -- doing what you have to do when the breach occurs. It’s a ‘post-incident’ solution.”
Compliance regulations are growing more complex. You must use different letter templates to disclose credit card exposure depending on what state the customer lives in. Julian points out that there are 46 states, three commonwealths, and 14 Federal agencies that you may have to deal with. Even for enterprises with a bevy of lawyers or compliance officers to steer them through the crisis, compliance can be daunting.
Co3 Systems has introduced an automated way to handle data loss. Using Co3's software-as-a-service application, an enterprise answers a few simple questions about a potential data loss event (for example, a lost laptop) using an online wizard. The system maps the event's characteristics to its knowledge base and generates a report on the potential exposure and impact so you’ll know what you have to do. It can also estimate the liability of such a situation, which may be the jolt your executives need to increase their security investment.
In the event of an actual breach, the program can generate a detailed incident response plan that lists the tasks your enterprise must complete. These aren’t generic “contact your customer” directives. Instead, the system spells out exactly what you must do, from “notify the CEO” to “send consumer notification letters to Massachusetts customers.” It provides a link to the regulatory language that triggered the task, supplies the relevant contact information, and offers a template of the notification letter -- whatever you need to stay in compliance. The Co3 system includes a lightweight project manager for assigning tasks and tracking progress.
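To make the idea concrete, here's a minimal sketch of how a rules-to-tasks engine of this kind could work. To be clear, this is not Co3's actual engine or API -- the incident fields, rules, and regulatory citations below are invented for illustration.

```python
# Hypothetical sketch of mapping incident characteristics to response tasks.
# Not Co3's engine; all rules, thresholds, and citations are illustrative.
from dataclasses import dataclass

@dataclass
class Incident:
    data_types: set        # e.g., {"ssn", "credit_card"}
    states: set            # states where affected customers live
    records_exposed: int

# Toy knowledge base: each rule maps a trigger to a task and its source.
RULES = [
    {
        "trigger": lambda i: "MA" in i.states,
        "task": "Send consumer notification letters to Massachusetts customers",
        "citation": "Mass. 201 CMR 17.00 (illustrative reference)",
    },
    {
        "trigger": lambda i: "credit_card" in i.data_types,
        "task": "Notify card brands and acquiring bank of cardholder exposure",
        "citation": "PCI DSS incident-response requirements (illustrative)",
    },
    {
        "trigger": lambda i: i.records_exposed >= 500,
        "task": "Notify the CEO and convene the incident response team",
        "citation": "Internal escalation policy (illustrative)",
    },
]

def response_plan(incident: Incident) -> list:
    """Return the tasks triggered by this incident's characteristics."""
    return [(r["task"], r["citation"]) for r in RULES if r["trigger"](incident)]

if __name__ == "__main__":
    laptop_loss = Incident(data_types={"ssn"}, states={"MA", "NY"},
                           records_exposed=1200)
    for task, citation in response_plan(laptop_loss):
        print(f"- {task}  [{citation}]")
```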
A press release from the company claims that “early engagements suggest the time savings and resource focus alone can save as much as $50,000 per incident.” Whether that’s a reasonable estimate or not, it certainly can eliminate the “What do we do now?” panic many organizations experience when they’re attacked.
The company is offering a free 90-day trial version, with initial pricing for the service at $450 per month; this includes unlimited events and one full incident. Other plans allow for more incidents per year.
-- James E. Powell
Editorial Director, ESJ
Posted by Jim Powell
From all my years in testing mainframe applications, one thing I always knew: the best test decks came from the best data -- and the best data was in your current production files. Those records had valid Social Security numbers and account numbers and department codes. I could try to generate such test data files, but that always proved onerous.
The problem, of course, is that using such personally identifiable information (PII) can get you in trouble. If a test file ended up in the wrong hands, you were toast, and this was long before we IT professionals had heard about data governance and long before government and industry regulations imposed nasty fines.
Compuware has had tools for disguising production data -- masking Social Security numbers, for example, or increasing numeric values by a given percentage. Its products -- one for mainframe data and one for distributed data -- used different interfaces, and pulling data from a variety of data sources has become an ever-bigger chore. Today, when you build a test suite, some test data comes from DB2 and some from SQL Server, and if you mess with key values, records lose their relationship to each other.
Thankfully, the company has recently upgraded its test data management solutions. Test Data Privacy version 3.1 simplifies creating and disguising test data in test environments.
"This is the first time we've offered a single tool for managing both mainframe and distributed test data management, so users can work with Oracle, SQLServer, DB2, and IMS all from a single solution," Dennis O'Flynn, product management director at Compuware told Enterprise Strategies.
What's more, O'Flynn points out, the new release keeps your data in sync. If you mask a customer number that serves as the key for matching records in DB2 and SQL Server databases, the new version keeps everything in sync so the information relationships are maintained.
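One common way to get that kind of consistency -- a sketch of the general technique, not necessarily Compuware's algorithm -- is deterministic masking: the same input key always produces the same pseudonym, so records from different databases still match after masking.

```python
# A minimal sketch of deterministic key masking via keyed hashing.
# Illustrates why cross-database relationships survive masking; this is
# not Compuware's implementation, and the secret below is a placeholder.
import hmac, hashlib

MASKING_SECRET = b"keep-this-out-of-source-control"   # illustrative only

def mask_key(value: str, width: int = 9) -> str:
    """Map a key to a stable pseudonym: same input, same output, every source."""
    digest = hmac.new(MASKING_SECRET, value.encode(), hashlib.sha256).hexdigest()
    return str(int(digest, 16) % (10 ** width)).zfill(width)

# The same customer number masks identically whether it came from the
# DB2 table or the SQL Server table, so joins still line up.
db2_row = {"cust_no": "123456789", "balance": 250.00}
sqlserver_row = {"cust_no": "123456789", "region": "NE"}

assert mask_key(db2_row["cust_no"]) == mask_key(sqlserver_row["cust_no"])
print(mask_key("123456789"))   # stable 9-digit pseudonym
```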
The product is both rules- and roles-based. A compliance officer can set the rules (data field masks, for example), which developers can then execute -- a great way to ensure that programmers don't waste time developing non-compliant rules and that compliance officers can enforce strict policies. Rules can now be applied once to both mainframe and distributed data sources, eliminating the need to duplicate them for each environment.
The program works with a new Compuware Workbench, a rich client application that stores rules in a central repository.
More information is available at www.compuware.com.
Posted by Jim Powell
When enterprise IT managers were asked about their top priorities and concerns regarding moving enterprise apps to cloud environments, one thing stood out: they prefer private clouds over public clouds.
The survey found -- to no one’s surprise, I’m sure -- that “large enterprises are migrating front-office and back-office applications to the cloud.” So far this year, over a third (39 percent) of enterprises moved their e-mail and collaboration systems to virtual infrastructure; a third moved their IT management apps, and one-fifth moved sales and marketing. One-third of respondents predict that they’ll move finance, ERP, or HR applications to the cloud; 23 percent will move their e-mail and collaboration software, and 21 percent expect to move their IT management applications there.
In terms of the “big picture,” 37 percent of companies say that they’ll eventually migrate 61 percent or more of their applications to a private cloud; 6 percent say they’ll move those apps to a public cloud. Long term, the future isn't bright for public clouds: 51 percent said that no more than 5 percent of their applications will ever move to a public cloud.
One reason for the popularity of private clouds could be that resolving application problems in cloud environments is getting harder. Slow application performance was reported by 41 percent as their biggest problem (and was also the most costly), followed by “slow time to identify the root cause of issues” (20 percent), multi-tenant storage contention (18 percent), and inter-application shared resource contention (also 18 percent).
The cloud may be a mixed blessing. Moving apps to the cloud is supposed to relieve IT from a variety of tasks and ease responsibility for their maintenance; IT is thus (in theory) freed up to work on higher-priority tasks. Furthermore, the cloud is supposed to give IT “the ability to quickly move a high-priority application to a more optimal resource when performance begins to suffer.” According to a release from Precise, a transaction performance management vendor and the survey’s sponsor, “A majority of the survey respondents (26 percent) report that they expect application performance will improve in the cloud, yet most predicted that it will take longer to pinpoint the causes of problems after applications move to the cloud (37 percent).”
When comparing their virtualizing experience to their expectations, 46 percent cited "About the same cost as expected," and 29 percent said it was "easier to maintain than expected." Fifteen percent said it was "a lot less expensive than expected," with the same percentage citing "completed faster than expected." (Multiple responses were allowed for this question.)
The release quotes Zohar Gilad, executive vice president at Precise: “When a problem occurs, virtualization is the enemy of visibility. Compounded with dynamic provisioning in the cloud and server cluster architecture, it's difficult to determine which server, VM, or application instance is to blame when troubleshooting.”
-- James E. Powell
Editorial Director, ESJ
Posted by Jim Powell
We’re always trying to get our applications running at peak performance. What are you doing to maximize your own performance of everyday tasks?
For me, it boils down to two key products.
Speech-to-Text that Actually Works
I spend the bulk of my day writing and/or editing the work of others. That’s a lot of time spent in Word. If I’m not working in Word, I’m composing text in another form: e-mail or instant messages.
I’m a good typist -- I can crank out 80 or 90 words a minute without breaking a sweat. I’ve used Word macros, Excel macros, and I count on DataPrompter when I have repetitive documents to create (where I only need to change a few words before creating the next version), and Macro Express when I have more complex tasks (it allows me to write sophisticated macros with conditional branching, for example, across multiple applications).
When I need to create original text, these days I depend on Nuance’s Dragon Naturally Speaking (DNS). I tried a similar product many, many years ago -- back in the days when Microsoft was touting its speech recognition features. (It was mediocre, inaccurate, and clumsy.) Yes, DNS requires a short training session (less than 20 minutes). Consider it an investment that will pay you back handsomely.
As you speak your words into a microphone, the program displays them in an open window (Word, Notepad, Outlook, fields in a form, cells in Excel, you name it), usually after you pause. In Word, it shows a full sentence at a time, because that’s when I usually stop for a breath. You can “speak” punctuation (“comma”) or navigation (“new line”) easily. If you make a mistake, just back up and speak what you mean. If Dragon misinterprets what you’ve said, a simple “Correct that” spoken command presents a list of alternative text.
(When you exit DNS, the program asks if you want it to update its logic based on the corrections you’ve made.) The longer you work with it, the fewer corrections are necessary, but out of the box, its accuracy is amazing.
Just as there are times when I'm tongue-tied, there are times when, despite my best intentions, I stumble when I type. No matter what I think I'm typing, it comes out with letters transposed or with the wrong word (I think “their” and type “they’re”). I’m more often tripping over fumbling fingers than over my tongue, however.
The beauty of Dragon Naturally Speaking is that it doesn’t have such problems; it never has a bad day, either. Furthermore, I don’t have to think about those frequently pondered-over words (is it conscience or conscious?) -- the program figures out the right word from its context. Finally, DNS cuts way down on misspellings -- no more “hte” typed in haste in an application lacking AutoCorrect.
In addition to converting speech to text quickly, it helps me think more clearly. I can speak a sentence faster than I can type it, and DNS gets the words into Word faster than I can type them. I can work at the speed of thought -- well, the speed of speech, at least. It always seems to be able to keep up.
There are few speech-to-text applications that work well. Google Phone offers translations of voice mail into text messages; it’s a disaster, in my experience -- I get nothing but garbage. Dragon Naturally Speaking lives up to its name: it lets me speak naturally, gives me higher text throughput, and makes dictating easier than typing. I can’t see myself going back to keyboard-only entry.
RoboForm Overcomes My Bad Memory
If I’m doing research, I have to visit many Web sites. Unfortunately, to get much of the information I truly need, I have to register at each of the sites. Following advice I’ve read repeatedly (and which makes sense), I vary my UserID and password so no two sites have the same login information. Some sites require letters and numbers, some demand that passwords must be at least 8 characters, and so on. For me, there’s no such thing as a “standard” password, anyway, and if I were to use the same letter/number combo on multiple sites, once that password (or pattern) is discovered, hackers could use that knowledge to, say, drain my bank account online.
Of course, I have better things to do in my life than memorize all these different passwords, and no information is secure if it’s written on a Post-It note affixed to your monitor. Thankfully, I don’t have to.
RoboForm remembers all the messy details for me. Yes, later versions of Firefox can memorize these IDs, and Norton Internet Security Suite has a similar feature. Trouble is, they're simply not as capable as RoboForm. In addition, if you rely on a browser to remember your passwords, you can only use those passwords in that same browser, whereas you can use your RoboForm passwords in any browser. (Once I set up information for a site, it’s there whether I’m using Firefox or IE.)
RoboForm does more than remember user ID, password, and URL. If there’s information that’s required (I never reveal the actual name of my first pet), RoboForm can keep that, too. If you are an active online shopper, forget entering your mailing address and different delivery address -- RoboForm can fill in the details quickly.
When you visit a new site, RoboForm is smart enough to know when you’ve entered information for the first time and offers to save it for you.
If you login with more than one set of credentials (I have two distinct LinkedIn accounts, for example), that’s no problem. You can keep multiple sets for the same site, choose a default, and RoboForm will prompt you to choose when you visit the site. The program also keeps track of the sites you’ve defined; rather than clutter your desktop with shortcuts, you can pick the site (with automatic userID/password fill-in) from a list of URLs using RoboForm’s toolbar.
You can enhance password security by creating a “master” password that protects all your information. Think of it as the password to access your other passwords.
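Conceptually, it works something like the minimal sketch below -- an illustration of the general idea, not RoboForm's implementation. A key is derived from the one password you memorize, and that key encrypts everything else; the example relies on Python's third-party cryptography package.

```python
# Sketch of the "master password" concept: derive an encryption key from
# one memorized password and use it to protect stored credentials.
# Not RoboForm's implementation; the sample credential is made up.
import base64, os
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives.hashes import SHA256
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

def key_from_master(master: str, salt: bytes) -> bytes:
    """Stretch the master password into a 32-byte Fernet key."""
    kdf = PBKDF2HMAC(algorithm=SHA256(), length=32, salt=salt,
                     iterations=200_000)
    return base64.urlsafe_b64encode(kdf.derive(master.encode()))

salt = os.urandom(16)                       # stored alongside the vault
vault = Fernet(key_from_master("one password to rule them all", salt))

token = vault.encrypt(b"linkedin.com: jimp / s3cr3t-Pa55")  # saved to disk
print(vault.decrypt(token).decode())        # recoverable only with the master
```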
There are many other nice-to-have features. You can print a list of all the sites and the credentials for each; I find that helpful when I want to review just what, exactly, it has stored (especially if I want that list for accessing a site from a new location or system) or just as a printed backup to put in a locked drawer. (RoboForm Everywhere is an online service that lets you synchronize your information across multiple computers and mobile devices and have one-click access remotely, but I haven’t tried that feature.)
The product runs on many platforms and comes in several versions, including an enterprise version. You owe it to yourself to clear your head of all your passwords. Forget lists of userIDs and passwords. Let RoboForm do the mental gymnastics.
Posted by Jim Powell
Storage managers are often pressured to buy storage from a single vendor. That, in turn, leads to being pinned down to a particular vendor’s virtualization management solution.
I hate having my options limited. I want to pick best-of-breed hardware and software. Mix-and-match is my motto. When it comes to storage virtualization software, DataCore seems to share my values. Its new SANsymphony-V dynamically relocates workloads across pools of any type of storage equipment -- including SSDs -- and from any vendor. Because it sits high enough up on the interface ladder, DataCore’s director of product marketing Augie Gonzalez told me, any new storage device you add can work instantly with SANsymphony -- no updates needed.
“We apply the same device-independent approach to auto-tiering as we do with all our high-value services, including thin provisioning, caching, synchronous mirroring, asynchronous replication, snapshots and CDP. Let’s just say it arms you with a lot more bargaining muscle when the next disk hardware purchase comes around.”
DataCore’s President and CEO, George Teixeira, told me recently that his company’s product neutrality is what gives IT the greatest flexibility to move seldom-used sections of files to a slower tier of storage (read: cheaper disk) and the most-in-demand or most-important sections to more expensive (read: faster) storage devices. Although the product’s chief benefit is that it handles all the messy details in the background automatically, there are rules (“policies”) you can define to override its decisions (for example, you can exclude some workloads from auto-tiering), as the sketch below illustrates. Furthermore, if you no longer have enough capacity in a tier to meet your requirements, it will tell you so.
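As a rough illustration of how such policies could interact with automatic placement, here's a hypothetical sketch. This is not DataCore's implementation; the workload names, tiers, and thresholds are all invented.

```python
# Hypothetical auto-tiering policy engine -- not DataCore's implementation.
# Tiers run "ssd" (fastest) > "sas" > "sata" (cheapest); values are invented.
from dataclasses import dataclass

@dataclass
class Block:
    workload: str
    accesses_last_24h: int

# Policy overrides: pin some workloads to a tier, exclude others entirely.
PINNED = {"payroll-db": "ssd"}
EXCLUDED = {"archive-scans"}

def choose_tier(block: Block, current_tier: str) -> str:
    """Pick a tier for a block: honor policies first, then access frequency."""
    if block.workload in PINNED:
        return PINNED[block.workload]
    if block.workload in EXCLUDED:
        return current_tier              # excluded workloads stay put
    if block.accesses_last_24h > 1000:
        return "ssd"                     # hottest data earns the fast tier
    if block.accesses_last_24h > 50:
        return "sas"
    return "sata"                        # cold data drifts to cheap disk

print(choose_tier(Block("payroll-db", 3), "sata"))   # ssd (pinned by policy)
print(choose_tier(Block("web-logs", 12), "sas"))     # sata (gone cold)
```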
The approach has two advantages -- both economic -- and both explained succinctly in their press release: “This expanded choice gives customers the opportunity to shop for the best value at each tier from competing sources without having to discard what they purchased last year.” Keeping legacy equipment alive is critical -- so is shopping for the best deal on new equipment.
Hardware-based approaches restrict tiers “to premium-priced trays within a single storage enclosure or frame,” DataCore says. Its infrastructure-wide software “spans multiple storage systems from potentially different suppliers.”
Flexibility is paramount. Tiers can be made up of high-capacity but lower-priced SATA drives from Company A (or Company A and B and C) or consist of multiple SAS midrange disk systems from the same or different vendors. You can mix and match capacity, technology, and manufacturers in the same tier -- it all appears as a pool from your control panel. SSDs, which cost up to 10 times more than conventional drives, are likewise supported in a tier.
The company says that more than 50 percent of stored data turns to “inactive” status within just 60 days of its creation. If you’re using high-priced storage for it, you’re wasting precious IT budget dollars.
In its press release, the company says its customers “have reported up to 60 percent cost savings with SANsymphony-V alone, and now with the ability to automate and dynamically optimize tiered storage capacity, incremental savings of 20 percent or more are now possible. The final result is an extremely cost-effective, self-tuning system.” Not to mention one that doesn’t pin you down.
-- James E. Powell
Editorial Director, ESJ
Posted by Jim Powell
Sometimes when you put cost over other factors, you end up getting what you paid for -- trouble.
That’s just one of the conclusions I drew from an online survey of over 500 IT managers at small businesses in the U.S. sponsored by HP and conducted by Wakefield Research. When asked, “How often, if ever, does your company place cost concerns above the best solution when budgeting for IT?” 22 percent answered “All the time.” Over half (56 percent) responded “some of the time.”
It’s all part of a do-more-with-less, tighten-your-belt approach to IT budgets. Overall, 93 percent of all small companies report having at some time put cost concerns ahead of buying the “best” solution, and of these, 89 percent have experienced problems, including slow performance (46 percent), out-of-date equipment (37 percent), hardware reliability (23 percent), hardware with a short life cycle (22 percent), or hardware that wasn’t energy efficient (22 percent).
I was amused by one question in particular: “Which of the following takes up the most time for you and your IT staff?” Your staff? You mean the guy or gal running around trying to keep everything running in addition to their regular duties? Many small businesses I know don’t have a formal IT staff -- keeping PCs running and workstations connected is just one of the tasks this go-to person must juggle.
Whoever’s in charge, given the cost-as-a-key-factor attitude, it’s no wonder such support people are facing software issues (41 percent) and network/connectivity issues (33 percent) most often.
What’s most in need of improvement? Processing speed, according to more than a third (35 percent) of respondents, followed by reliability (19 percent) and data security (15 percent).
What’s also needed (whether the majority of small businesses realize it or not) is an overhaul to their security policies for retiring equipment. When asked if their company’s “end-of-life” policy for disposal of “outdated computers” protected confidential information still residing on the equipment, only 43 percent could “strongly agree” that they were protected; a third (33 percent) “somewhat agreed,” and 12 percent either somewhat or strongly disagreed. That’s a red flag as far as I’m concerned. Fortunately, 88 percent of companies have an end-of-life policy. Now if they’d only spend the $50 or so to get a disk-wipe program.
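Why a wipe program instead of plain deletion? Deleting a file typically removes only the directory entry; the bits remain on disk and are recoverable. Here's a toy file-level sketch of the overwrite-then-delete idea -- real wipe tools work on whole devices and account for SSD wear-leveling, so treat this as illustration only.

```python
# Toy illustration of overwrite-then-delete. Real disk-wipe tools operate
# on entire devices; this file-level version just shows the principle.
import os

def wipe_file(path: str, passes: int = 3) -> None:
    """Overwrite a file's contents with random bytes, then delete it."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))    # replace the old bits in place
            f.flush()
            os.fsync(f.fileno())         # push each pass out to the disk
    os.remove(path)

with open("customer-list.txt", "w") as f:
    f.write("SSNs and account numbers live here")
wipe_file("customer-list.txt")
```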
This year, small businesses have spent or plan to spend money on laptops (60 percent), desktops (57 percent), data storage (47 percent), and smartphones (44 percent). Currently, 37 percent say their biggest problem with their existing computers is that they’re too old; a quarter (25 percent) say they don’t have sufficient processing power.
In the summer, over a quarter (27 percent) said more employees work remotely; 27 percent also said their employees spend more time working remotely.
-- James E. Powell
Editorial Director, ESJ
Posted by Jim Powell