Social Networks Present Big Security Headache

Much of what’s in the latest Symantec Internet Security Threat Report released today isn’t startling. You know -- over 6,000 new vulnerabilities were discovered last year, the number of zero-day vulnerabilities was unprecedented, and attack kits are getting smarter and nastier. No surprises there.

There are a couple of gems, however, that should make security professionals and end users alike sit up and pay attention.

Security administrators have long known that external threats are just one of the vectors they have to monitor. As the Symantec report points out, it’s user behavior you have to worry about more than ever before. Two targeted attacks (Stuxnet and Hydraq) “teach future attackers that the easiest vulnerability to exploit is our trust of friends and colleagues.”

Part of the problem -- clearly highlighted in the report -- is that people (read: your employees) aren’t careful about what they post on social network sites. “Whether the attacker is targeting a CEO or a member of the QA staff, the Internet and social networks provide rich research for tailoring an attack. By sneaking in among our friends, hackers can learn our interests, gain our trust, and convincingly masquerade as friends,” the report points out.

“Attackers are getting cleverer. They can read where you work, what your hobbies are, where you like to eat, and who your friends are. They use that information to devise special attacks that target you. You might not expect an invitation to an unknown bistro, but how about a message from your favorite restaurant?” warns Gerry Egan, director of Symantec Security and Response, in a conversation with Enterprise Strategies last week.

Egan warns that “people just aren’t careful about the information they put online, and we have to get the word out that they need to be smarter about what they post.”

Mobile and social networks are increasingly points of exposure. “Attackers can set up accounts that look just like they’re your current friends. When users receive a request to join their network, they think, ‘I thought they were already a friend, but I guess not,’ and just as quickly add them to their inner circle. Now the attacker has access to all your sensitive data.”

On the mobile side, users are lax about downloading apps. Trojan apps look “just like legitimate apps, and users don’t think twice when they download and install them,” Egan explains, and we are starting to see a steady trickle of new Android Trojan apps in circulation.

The report also notes that financial assets are no longer the only target of hackers. Intellectual property is becoming a popular target. It’s getting easier, given user behavior. “Hydraq would not have been successful without convincing users that the links and attachments they received in an email were from a trusted source.”

Another eye opener: shortened URLs are popular for distributing attacks -- such as links to malicious Web sites -- because they hide the actual destination. OK, I know this is the “Duh!” part. What surprised me was that during the three-month period studied, “two-thirds of malicious links in news feeds observed by Symantec used shortened URLs.” [emphasis added]
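For the curious, here’s a minimal Python sketch of the kind of check a feed or mail filter might run to flag links from known shortening services before anyone clicks. The domain list is a small illustrative sample of my own choosing, not Symantec’s detection logic.

```python
from urllib.parse import urlparse

# A few well-known shortener domains (illustrative, not exhaustive).
SHORTENER_DOMAINS = {"bit.ly", "tinyurl.com", "goo.gl", "t.co", "ow.ly", "is.gd"}

def is_shortened(url):
    """Flag links whose host matches a known URL-shortening service."""
    host = urlparse(url).netloc.lower()
    host = host.split(":")[0]          # drop any port
    if host.startswith("www."):
        host = host[4:]
    return host in SHORTENER_DOMAINS

print(is_shortened("http://bit.ly/abc123"))      # True
print(is_shortened("http://example.com/page"))   # False
```

A real filter would go further and expand the link (following redirects) to inspect the actual destination, but even a simple domain check like this lets you warn users before they click.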

Remember the adage your elementary school teacher drilled into your head about railroad crossings and street corners: Stop, Look, and Listen. Perhaps today’s teachers should train their young, tech-savvy charges to add IBM’s famous one-word directive to those words of wisdom: Stop, Look, Listen, and Think!

-- James E. Powell
Editorial Director, ESJ

Posted by Jim Powell on 04/04/2011 at 11:53 AM

Educators Tackling Mobile Device Challenges

In a new report from Forrester Research, Building An Effective Mobile Device Management Strategy For Education, one thing is clear: when it comes to managing the myriad devices attached to its networks, the education sector is facing tough challenges. The report notes: “It’s even more crucial for the education industry over the next 12 months because the number of post-PC devices, such as slates, tablets, netbooks, and smartphones, has already eclipsed traditional PC devices, such as desktops and notebooks.”

In fact, Forrester reports that “with this surge in post-PC devices that do less but do it in more places, IT managers no longer have the authority to veto the use of mobile devices or limit use to a specific brand or operating system.” The researchers say that more than 80 percent of the 55 survey participants responsible for IT decisions in the education sector “have already implemented or are planning on implementing mobile device management solutions that can scale across all devices, regardless of who actually owns the hardware.”

Educational institutions are providing smartphones and slates to incoming students, teachers, and administrators. They’re putting slates in libraries and labs, and phasing out printed textbooks. Lecture notes are now posted online, as are audio files. Of course, traditional applications are still immensely popular, including e-mail and calendar applications.

The problem is that these mobile devices are filled with sensitive information, including financial data (think scholarship information, tuition accounting, credit card payments for room and board, and the like). Forrester estimates that “educational institutions are at risk of a data security breach that could ultimately cost more than $1 million in data loss notifications and remediation services.” That could explain why 95 percent of security decision-makers in the education sector say data security is among the top priorities at their institution over the next 12 months (followed by managing vulnerabilities and threats and business continuity/disaster recovery).

The survey, sponsored by Fiberlink (a mobile device management vendor), found that the surge in mobile device use is -- probably to no one's surprise -- also having an impact on IT support. “Three out of every four IT decision-makers reported that they have to invest 20 or more hours every week on supporting mobile devices” (the figure includes inventorying devices and ensuring compliance). The researchers say the education market is looking at cloud-hosted mobile-as-a-service solutions that may be faster to deploy and support strong password policies and full disk encryption. Also key among mobile management features: remote lock/wipe and activity visibility and management -- for everything from laptops and smartphones to netbooks and tablets.

-- James E. Powell
Editorial Director, ESJ

Posted by Jim Powell on 03/28/2011 at 11:53 AM

Cloud Viable Disaster Recovery Option, But Quantifying Downtime Costs Elusive

When it comes to the cloud, 44 percent of respondents to a new survey believe the cloud is a viable option for disaster recovery; 26 percent aren’t sure, and 30 percent don’t. Of that last group, about one-third (34 percent) say it’s because they aren’t confident about cloud security.

Those are just some of the findings released by the survey's sponsor, Neverfail, which asked 1,473 U.S. IT professionals in enterprises of all sizes about their disaster recovery "plans and practices."

Backup and recovery is clearly an important focus of IT, given that nearly a quarter (23 percent) reported having an IT outage that lasted more than one business day, and only five percent said they never experienced an outage (talk about lucky!). Most outages were caused by hardware or software problems (according to 43 percent of respondents), followed by power or data center outages (35 percent), natural disasters (8 percent), and human error (6 percent).

Despite these outages, most respondents can’t put a cost to the event. Over half (54 percent) said they don’t know the cost per hour of downtime. Of those who can quantify the cost, 16 percent said it was more than $10,000, 7 percent put the figure at between $5,001 and $10,000 per hour; the remainder said it was under $5,000. (These estimates strike me as unrealistically low.)  There are other costs – and consequences – of downtime, including reduced employee productivity (30 percent), revenue loss (26 percent), damage to corporate reputation (23 percent), and failure to meet a service-level agreement (19 percent).

Just over three-fourths (76 percent) of respondents could identify their “most critical” applications. In the top four positions: Microsoft Exchange (30 percent), Microsoft SQL Server (26 percent), Microsoft SharePoint (13 percent), and BlackBerry Enterprise Server (11 percent). For these critical apps, 44 percent said over 1,000 people depended on their mission-critical application each day; in fact, 69 percent say they need to provide access to the top application without interruption, 24 hours every day.

-- James E. Powell
Editorial Director, ESJ

Posted by Jim Powell on 03/22/2011 at 11:53 AM

Two-Thirds of U.S. Small Enterprises Risk E-mail Loss, Compliance Penalties

An independent survey of 202 U.S.-based IT decision makers at small and mid-size enterprises (SMEs) reveals a frightening lack of e-mail archives -- and of data backups in general.

According to the survey, most (62.4 percent) SMEs don’t use a mail-archiving product. Besides exposing users to limited e-mail backup and recovery (if not complete, outright data loss), these SMEs could be unable to locate messages when they’re audited or receive an eDiscovery request, resulting in “costly compliance violations or legal suits.”

The survey, conducted by Opinion Matters, was sponsored by GFI Software, which sells Web and mail security as well as archiving solutions for the SME market.

If the lack of archiving isn’t enough to make you shiver, consider this: GFI says its survey found that more than 38 percent of respondents don’t have any archiving or backup solution at all, “further exacerbating the chances that a network failure could result in a complete loss of critical data stored in e-mail.”

This lack of a backup solution surprised me; backup is part of IT 101, isn’t it? It’s even more troublesome given that over one-third of companies (37 percent) “are required to search for old or deleted e-mails on a monthly basis, if not more frequently, because of requests from end users, the need to meet compliance requirements, the need to provide copies of correspondence for a lawsuit or audit, or any other requirements.” [emphasis added]

The smaller the IT staff, the greater the enterprise’s risk. “Two-thirds (66.8 percent) of respondents were unfamiliar with U.S. regulatory compliance standards regarding e-mail archiving. This number ballooned to over 90 percent in businesses that rely on only one IT professional,” GFI says.

Walter Scott, GFI Software’s CEO, noted that “Taking the risk of not backing up or archiving key data stored in e-mail can be a very costly gamble depending on the type of data your business is dealing with.”

Now there’s an understatement.

--James E. Powell
Editorial Director, ESJ

Posted on 03/22/2011 at 11:53 AM

Cost of Data Breaches Continues to Rise

If you're looking for a way to justify an increase in your security budget, look no further than the sixth annual study of data breach incidents by The Ponemon Institute.  Its study shows that -- once again -- costs are on the rise. The in-depth examination of breaches at 51 U.S. enterprises in 15 different industries was conducted between March and December 2010.

Since 2006, total breach costs have grown every year, according to Ponemon’s 2010 Annual Study: U.S. Cost of a Data Breach. In 2010, the average cost was $7.2 million, up from $6.8 million in 2009. Companies spent $214 per compromised record on average, a rise of $10 per record (or 5 percent) from the previous year. Industry sectors suffering the highest per-record expenses were communications ($380), financials ($353), and pharmaceuticals ($345). Those suffering the smallest per-record costs were media ($131), education ($112), and the public sector ($81).
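As a back-of-the-envelope check (mine, not the report’s -- dividing two separately computed averages gives only a rough estimate), the two headline figures suggest the scale of a typical breach:

```python
avg_total_cost = 7_200_000  # average total cost of a 2010 breach, per the report
cost_per_record = 214       # average cost per compromised record in 2010

# Dividing the two implies how many records a typical breach exposed.
implied_records = avg_total_cost / cost_per_record
print(f"{implied_records:,.0f} records")  # roughly 33,645 records per breach
```

Tens of thousands of records per incident puts the per-record numbers in perspective: small per-record increases translate into six-figure swings in total breach cost.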

The study found that although companies prefer to act quickly (43 percent notified customers within a month of discovering a breach, up from 36 percent in 2009), it’s costing them more per record to do so than it costs companies that take longer to act. “In 2010, quick responders had a per-record cost of $268, up $49 (22 percent) from $219 the year before. Companies that took longer paid $174 per record, down $22 (11 percent) from 2009.”

Quick action wasn’t the only characteristic that Ponemon noticed. Also rising this year: the number of enterprises with CISOs leading their data breach response, and the number maintaining an “above-average IT security posture.” The study concludes that “Taken together, these figures may indicate more organizations are taking more active steps to thwart hostile attacks.” The report says breaches from “systems failures, lost or stolen devices, and third-party mistakes” declined. “All these point to companies becoming more conscientious about preventing data breaches in the worsening threat environment.”

Ponemon reports that the cost of malicious or criminal attacks -- which accounted for nearly one-third (31 percent) of all breaches examined -- skyrocketed. “The 2010 cost per compromised record of a data breach involving a malicious or criminal act averaged $318, up $103 (48 percent) from 2009 and the highest of any data breach cause this year. The huge increases reinforce the extreme danger hostile breaches pose.” Such breaches rose 7 percentage points over 2009, which itself saw double the number of breaches of 2008.

Negligence is still the most common threat, and it’s costing more. “The number of breaches attributed to negligence edged up a point to 41 percent. Breaches from negligence in 2010 averaged $196 per record, up $42 (27 percent) from 2009.” Ponemon says the relatively unchanged rate “may indicate that ensuring employee and partner compliance remains an ongoing challenge.”

Among the preventive actions the report recommends are automated enterprise data protection solutions which employ encryption (including protecting mobile devices), data loss prevention solutions, identity and access management products, and endpoint security tools.

Despite the downsides organizations may see in regulatory compliance, such efforts do lower churn rates, Ponemon says, by “boosting customer confidence in organizations’ IT security practices.”

The report examines a wide variety of business costs, such as those to detect, escalate, notify, and follow-up on breaches, as well as the “impact of lost or diminished customer trust and confidence as measured by customer turnover, or churn, rates.”

It doesn’t stop there: the study examines such direct costs as employing forensic experts, outsourcing hotline support, credit monitoring subscriptions for affected customers, and future product or service discounts -- as well as indirect costs such as in-house investigations and internal communication.

Churn, however, accounted for the greatest cost component. “For the second straight year, abnormal churn or turnover of customers after data breaches appears to be the dominant factor in data breach cost.” Most post-breach churn rates remained at 4 percent, but those in the pharmaceutical and health-care industries lost 7 percent of their customers. Public-sector organizations suffered the least, with churn rates below 1 percent.

The study was conducted by The Ponemon Institute and sponsored by Symantec.

-- James E. Powell
Editorial Director, ESJ

Posted by Jim Powell on 03/08/2011 at 11:53 AM

Survey Quantifies Costs of Poor Cloud-App Performance

Why doesn't IT adopt more cloud-based applications? According to a new survey of almost 700 businesses in North America and Europe, it could be the cost of poor performance and the complexity cloud apps impose on an enterprise.

To better understand the impact of IT’s ability to manage cloud apps and the impact of poor cloud-app performance, Compuware wanted more than anecdotal evidence about expected revenue loss, hence the survey.

Poor performance hits organizations in a big way, according to the Performance in the Cloud report. When asked to approximate how much money an organization would lose in a year from the performance problems of their cloud-based applications, the weighted average of North American firms was $985,000; for European firms the loss averaged $777,000. That’s just the average. Almost one in three North American firms (29 percent) estimated their yearly loss would be at least $1.5 million; for nearly one in eight firms (14 percent), the minimum loss was estimated at $3 million.

Poor performance also has IT wary of cloud apps: 58 percent of North American respondents and 57 percent of those in Europe indicated that because they couldn’t manage application performance, they were forced to delay or even stop cloud-based application adoption.

Furthermore, 65 percent of North American firms and 72 percent of European enterprises agree with the statement, “As cloud applications are delivered to end users over the internet, our IT department’s ability to guarantee service levels is severely restricted.” That’s no surprise, given that cloud apps are more complex because of a variety of elements along the delivery chain, such as Internet and network speeds.

This complicates matters for application performance management (APM) tools, which often can’t monitor cloud-based applications because those applications aren’t visible to them. Furthermore, unlike traditional applications that run in-house, cloud apps share an environment that includes resources over which IT may have no control, so application performance can drop without warning.

SLAs must grow more complex to handle cloud environments. A whopping 94 percent of North American firms agree that “If or when we start to use cloud-based applications, I would expect very rigorous SLAs that go beyond simple availability metrics. For cloud-based applications, I would expect SLAs based on end-user experience.” (The figure is 84 percent for European firms.) About two-thirds of all respondents say they’re up to the task, agreeing that their “IT team has the skills and knowledge to negotiate the more complex SLAs for issues such as Internet connection and performance, end-user experience, and other factors.”

Compuware says its previous research “clearly demonstrated that there is a direct link between application performance and revenue.” For example, it found that problems start when a Web page’s response time nears 4 seconds; at the 6-second mark, one-third of users abandon the page. “Not only does this represent lost revenue, but it also creates a poor perception of the company in general, making it far less likely that the customer will return anytime soon.”

The bottom line, says Compuware, is that “Without an APM strategy capable of managing this extended delivery chain and the unique properties of the cloud, there is a very high risk that companies will be completely unaware that their internal users and customers are experiencing performance problems [with cloud applications] until it is too late.” In fact, rather than enjoying the economic benefits promised by cloud apps, enterprises actually suffer from a decrease in revenue.

-- James E. Powell
Editorial Director, ESJ

Posted by Jim Powell on 03/03/2011 at 11:53 AM

IT Skills Gap Revealed in New Survey

If the economy has truly moved into expansion mode, we can expect organizations will be hiring IT staff -- including college graduates. The question is, do college graduates have the skills IT needs, or will they need additional training once on the job? The results of a recent survey will likely worry hiring managers.

Closing the IT Skills Gap: 2011 SHARE Survey for Guiding University and College IT Agendas, a new survey of 376 employers conducted in conjunction with SHARE, the independent IBM users group, reports that organizations are counting on higher education to produce “graduates with specific IT skills in enterprise programming languages and mainframe administration skills.”

The survey found that 82 percent of organizations look for colleges and universities to train students in database skills; 76 percent want analysis and architecture skills. Topping the “pure” business skills needed: problem solving (according to 77 percent of respondents), critical thinking (70 percent), writing/communication (61 percent), interpersonal communications (59 percent), and project management (57 percent). Unfortunately, nearly a third (32 percent) of respondents say the business skill level of candidates from colleges and universities today is “unsatisfactory.”

Few enterprises are “entirely satisfied” with the current crop of graduates; only 11 percent “would rate their IT hires’ technical proficiency as ‘well-trained, ready to go.’” Fully one-fourth (26 percent) of respondents say that the programming/development skills of candidates from colleges and universities are “unsatisfactory.” That’s an important benchmark, given that about half of companies hire new IT employees “straight out of school, with relatively little actual working experience.” Two-thirds of firms are looking for students who have been interns, and “most would like to see at least a year of on-the-job experience” on a candidate’s résumé. Education is important: 65 percent of firms require at least a bachelor’s degree.

Programmers and developers are being sought by 60 percent of respondents. Organizations are looking for skills “in application server environments, database languages, and Java;” COBOL is still a popular requirement in 4 of every 10 hiring firms.

IT administration skills topping the in-demand list include “backup and recovery, storage administration, security, and disaster recovery.” More than half of companies surveyed are looking for project management, analytics/business intelligence, and enterprise architecture talent.

The survey looked beyond technical skills. Research analyst Joseph McKendrick, the study’s author, points out that “employers also want well-rounded, business-savvy employees as well. As one respondent, an IT executive with a western retailer, put it, ‘People need to understand the “big picture” of how computers work, from the deep level programming to how that affects -- and interconnects with -- applications, servers, and other things in the data center.’”

Business skills are important -- but lacking. One-third of respondents say they’re looking for “professionals and managers [who] can bridge the divides between IT departments and business leaders.” However, only 8 percent rate the business proficiency of IT new hires as “well-trained, ready to go;” 44 percent said candidates are sufficiently trained but have skill gaps. Almost a third (31 percent) rate business proficiency of new IT hires as “not sufficiently trained, remedial, or hands-on training usually required.”

The survey found that 37 percent are “distressed by the lack of business proficiency they see IT hires bring into the organization (e.g., analytics, problem-solving, understanding processes).” An IT executive at a financial services company sums it up well, noting, “Most [university and college] programs are programming-specific and completely ignore how the future candidate will integrate in a complex organization.”

Companies will likely have to pick up the training today’s higher-education institutions don’t provide. Topping the list of strategies enterprises employ today to develop IT talent: corporate training and development (46 percent), use of outside consultants or outsourcers (38 percent), partnerships with colleges and universities (37 percent), and partnerships with vendors (26 percent).

Responding organizations have diverse IT infrastructures, mixing mainframes, Windows, Linux, and Unix systems.

Conducted in January 2011, the survey gathered responses from both technical staff and managers. About 29 percent of respondents were IT executives and managers, and 31 percent were analysts, programmers, or administrators. About 28 percent were from organizations with fewer than 1,000 employees; over a third (36 percent) of organizations had over 10,000 employees.

-- James E. Powell
Editorial Director, ESJ

Posted by Jim Powell on 02/28/2011 at 11:53 AM

A New Approach to Database Security

It’s rare to find a new product category these days, but I think a new product from Oracle fills the bill. In the crowded enterprise security field, that’s saying something.

I spoke with Vipin Samar, vice president of database security at Oracle, last week about the company’s new Oracle Database Firewall. It’s a product that Samar says you shouldn’t leave home without. He could have a point.

The company calls it a “first line of defense for databases,” and the firewall takes a new approach to preventing data breaches. Rather than looking at the actual changes being made to your database, the software monitors traffic to look for unusual patterns. For example, is a user attempting to delete transactions at 3 in the morning? That’s probably an indication that the user account has been hacked.

“Oracle Database Firewall sets up a perimeter around databases,” Samar told me. In real time its focus is on preventing attacks (such as SQL injection) and unauthorized user data access attempts. Because it offers a monitor mode, it’s also useful for organizations to see if they have data breaches in the first place. (Understanding the true nature of your environment is critical, and many organizations simply don’t have a grasp of what’s going on.)

The key is the product’s “SQL grammar analysis technology,” which examines the SQL statements themselves and then (depending on how you’ve set policies in the software) lets the transaction pass, logs it, or both. You can set up alerts (for information only, for example) or block the transaction completely.

Oracle Database Firewall takes two traditional (and familiar) approaches -- whitelists and blacklists. With a blacklist, for example, you can specify which SQL statements are forbidden. (Administrators can set up exceptions, such as for patching operations.)
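To make the whitelist idea concrete, here’s a toy Python sketch of matching statements by normalized “shape” rather than exact text. The normalization rules and the allowed list are my own simplified assumptions for illustration, not Oracle’s grammar-analysis engine.

```python
import re

# Illustrative only: a toy whitelist of normalized SQL "shapes."
ALLOWED_SHAPES = {
    "SELECT NAME FROM CUSTOMERS WHERE ID = ?",
    "UPDATE ORDERS SET STATUS = ? WHERE ID = ?",
}

def normalize(sql):
    """Reduce a statement to its shape: replace literals with '?'
    placeholders, collapse whitespace, and uppercase everything."""
    sql = re.sub(r"'[^']*'", "?", sql)   # string literals
    sql = re.sub(r"\b\d+\b", "?", sql)   # numeric literals
    return re.sub(r"\s+", " ", sql).strip().upper()

def allowed(sql):
    return normalize(sql) in ALLOWED_SHAPES

# A classic injection attempt normalizes to a shape that isn't whitelisted:
print(allowed("select name from customers where id = 42"))         # True
print(allowed("select name from customers where id = 42 OR 1=1"))  # False
```

The point of shape-matching is that legitimate application queries vary only in their literal values, so a tampered statement (extra predicates, appended clauses) produces a different shape and gets flagged or blocked.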

Besides time of day (for those possible midnight raids on data), Oracle Database Firewall looks at such attributes as IP address, application, and user ID.

The good news is that Oracle Database Firewall inserts itself into the mix without requiring changes to your database infrastructure. It runs on Intel-based platforms and supports Oracle Database through 11g; IBM DB2 for Linux, UNIX, and Windows (versions 9.x); Microsoft SQL Server 2000, 2005, and 2008; Sybase Adaptive Server Enterprise (ASE) (versions 12.5.4 through 15); and Sybase SQL Anywhere V10.

Samar says overhead should be minimal, even in large shops.

Given that it’s a security app, it’s no surprise that Oracle Database Firewall ships with preconfigured reports that address familiar regulations (including PCI DSS, SOX, and HIPAA). It also allows you to create custom reports.

In the security realm, I’m constantly bombarded with me-too products. It’s nice to hear about one that goes in a different direction.

-- James E. Powell
Editorial Director, ESJ

Posted on 02/17/2011 at 11:53 AM