Enterprise Snippets: BYOD, Cloud Storage, TPC Benchmark

 

IT, Employees Differ on BYOD

A new Mimecast survey of 500 IT professionals and administrators, conducted at the 2012 Microsoft DevConnections conference, underscores the tension between IT and employees when it comes to “bring your own device” (BYOD) policies.

Nearly half (47 percent) agree that the consumerization of IT is important to an enterprise. In fact, 50.7 percent of IT professionals in the survey said employee access to their own devices is a productivity necessity. However, 26 percent of all respondents said their enterprise doesn’t allow employees to use personal devices for corporate work, and 7.9 percent said personal devices hurt productivity.

IT’s biggest challenge, cited by nearly three-quarters (74 percent) of respondents, is managing information security; 34.4 percent said it was managing the volume of devices.

What’s scary about the cloud, according to 70.3 percent of those surveyed, is “not having the skill set to keep up with new services.”

Over a third (34 percent) of respondents say the impact of BYOD on their organization is “tough but manageable.”

 

Storage Infrastructure Hampering Cloud Performance

DataCore Software’s second annual survey of virtualization and private clouds found that “storage-related challenges to virtualization and private cloud deployments are transitioning from being cost-centric to performance- and availability-centric.” More than a third (34.3 percent) admit they “underestimated the impact server/desktop virtualization would have on their storage costs.” The figure was 28.1 percent for respondents using a private cloud.

In the survey of 289 companies in North America and Europe, 63.3 percent put downtime and slow application performance at the top of their list of storage-related concerns, up from 35.7 percent of respondents in 2011. The company notes that cost considerations are typical early in a project, and they’re still of concern to 50.9 percent of respondents (though that’s down from 66.1 percent last year).

Increasing the storage budget isn’t improving performance: complaints about application performance caused by storage infrastructure rose to 31.5 percent of respondents this year, up from 25.3 percent last year. Worse, 31.8 percent have suffered downtime due to storage-related problems.

Concern may be down, but storage still gobbles up more than a quarter of the IT budget, according to 44 percent of respondents. In fact, 37 percent say their storage budgets have increased in the last year.

 

TPC Launches New Industry Standard Benchmark

The Transaction Processing Performance Council (TPC) announced a new decision support benchmark named TPC-DS; it measures “query throughput and data integration performance for a given hardware configuration, operating system, and DBMS configuration under a controlled, complex, multi-user decision support workload.”

According to Meikel Poess, the TPC-DS committee chairman, “It is the first benchmark specification to integrate key workloads of modern decision support systems including ad-hoc queries, reporting queries, OLAP queries, data mining queries and data integration from OLTP systems.”

More information can be found at www.tpc.org/tpcds/.

-- James E. Powell
Editorial Director, ESJ



Attacks Skyrocket as Hackers Exploit Old Techniques

A new year-in-review report from Symantec holds some bad news for security administrators. Although the number of new vulnerabilities decreased last year, the number of attacks rose by 81 percent. The Internet Security Threat Report: 2011 Trends cites the ready availability of Web attack kits that make it easy to “tweak” an existing vulnerability rather than discover a new one. (The report is available at no cost; registration is not required.)

In fact, says Symantec, the number of unique malware variations increased by 41 percent last year. Server-side polymorphism attacks were particularly popular. “This technique enables attackers to generate an almost unique version of their malware for each potential victim,” the report explains.  The lack of truly new vulnerabilities was echoed by Microsoft last week in its Security Intelligence Report.
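
To see why per-victim variants defeat simple signature matching, consider a minimal illustration -- my own sketch, not taken from the report: if even a few bytes of a payload change per download, every copy produces a different cryptographic hash, so a blocklist of known-bad hashes never fires.

    # Illustrative sketch only: why hash-based signatures miss per-victim
    # variants. The "payload" here is harmless placeholder text.
    import hashlib
    import os

    BASE_PAYLOAD = b"...same malicious logic in every copy..."

    def serve_variant():
        # Server-side polymorphism rewrites or appends a few random bytes
        # per request, so each downloaded copy is unique.
        return BASE_PAYLOAD + os.urandom(8)

    known_bad_hashes = {hashlib.sha256(BASE_PAYLOAD).hexdigest()}

    for _ in range(3):
        digest = hashlib.sha256(serve_variant()).hexdigest()
        # Every variant hashes differently, so the signature never matches.
        print(digest, digest in known_bad_hashes)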

Mobile applications are growing as a delivery medium. As Liam O Murchu, manager of operations at Symantec Security Response, explained to me, hackers are taking existing applications, inserting their code, then reposting them online. Unsuspecting users, mistaking the infected application for the legitimate one, download it. “Android devices are particularly vulnerable to this because it’s more common for users of that platform to download applications from unregulated, third-party Web sites.”

Automation may play a part in this trend. A new report from Imperva, Hacker Intelligence Initiative, Monthly Trend Report #9, notes that automated tools enable an attacker to target more applications and exploit more vulnerabilities than manual methods allow. “The automatic tools that are available online save the attacker the trouble of studying attack methods and coming up with exploits to applications’ vulnerabilities. An attacker can just pick a set of automatic attack tools from the ones that are freely available online, install them, point them at lucrative targets, and reap the results.” Imperva’s 12-page report is available for free; no registration is required.

The Imperva report notes that “Automatic tools open new avenues for evading security defenses. For example, such a tool can periodically change the HTTP User Agent header that is usually sent in each request to an application and that may be used to identify and block malicious clients. As another example, sophisticated automatic tools can split the attack between several controlled hosts, thus evading being blacklisted.”
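
The defensive takeaway is that blocking on the User-Agent string alone is brittle. A minimal sketch of a sturdier heuristic -- my illustration, not Imperva’s -- flags client IPs that present many distinct User-Agent values within a short window, a pattern legitimate clients rarely exhibit:

    # Illustrative sketch: flag client IPs that rotate User-Agent strings,
    # a pattern typical of automated attack tools evading header-based blocks.
    import time
    from collections import defaultdict

    WINDOW_SECONDS = 300
    MAX_DISTINCT_AGENTS = 5

    seen = defaultdict(list)  # ip -> list of (timestamp, user_agent)

    def is_suspicious(ip, user_agent, now=None):
        now = time.time() if now is None else now
        # Keep only requests inside the sliding window, then add this one.
        entries = [(t, ua) for (t, ua) in seen[ip] if now - t <= WINDOW_SECONDS]
        entries.append((now, user_agent))
        seen[ip] = entries
        # Many distinct User-Agent strings from one IP is the red flag.
        return len({ua for (_, ua) in entries}) > MAX_DISTINCT_AGENTS

    # Example: one IP cycling through many User-Agent strings.
    suspicious = False
    for i in range(10):
        suspicious = is_suspicious("203.0.113.7", "Scanner/%d.0" % i)
    print(suspicious)  # True once the distinct-agent threshold is exceeded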

Yes, hackers are still after information, but the techniques vary by month. “Hackers can try something this month, then switch to something else next month,” O Murchu said. Mobile users are also facing increased problems from hackers in the form of premium-rate text messages or phone calls. (Whereas a stolen credit card is worth between 40 and 80 cents to hackers, a single premium SMS text can cost a mobile user $9.99.)

Indeed, just last week Symantec said it observed a new mobile threat that takes advantage of users of Android devices by exploiting the popular Biophilia app. “Once users download the Trojanized Biophilia app, they are able to stream music just as the app promises,” a spokesman says, but it also launches a malicious background service that’s part of the Android.Golddream malware family, which “indicates the authors of this threat likely intend to use infected devices to generate revenue via premium SMS scams.”

Spam levels have dropped, which any mail administrator will be happy to know, so that delivery mechanism is playing a smaller role in spreading malware. The report credits “law enforcement action which shut down Rustock, a massive, worldwide botnet that was responsible for sending out large amounts of spam.”

Of course, whenever one medium fades, another takes its place -- in this case, social networks. “The very nature of social networks make users feel that they are amongst friends and perhaps not at risk.” Social networks also make it easier for threats to spread virally. Clearly, security administrators have some user educating to do.

Symantec also found that the targets themselves are changing: in 2011, large enterprises were no longer the key target. Half (50 percent) of attacks were aimed at companies with fewer than 2,500 employees, and another 18 percent went after companies with no more than 250 employees.

Upper management used to be a favorite target; last year, 58 percent of attacks were directed at “other job functions such as Sales, HR, Executive Assistants, and Media/Public Relations.” The report points out that people in these positions are likely to receive messages with attachments (and presumably aren’t averse to clicking on attachment icons).

A final disturbing note: Over 232 million identities were stolen globally in 2011. Symantec says that “the most frequent cause of data breaches (across all sectors) was theft or loss of a computer or other medium on which data is stored or transmitted, such as a USB key or a back-up medium. Theft or loss accounted for 34.3 percent of breaches that could lead to identities exposed.”

-- James E. Powell
Editorial Director, ESJ



Dirty Disks May Pose Cloud Security Risk

If you were concerned about the security of your data in the cloud, you may now have an entirely new reason to worry.

Context Information Security has found “potentially significant flaws in the implementation of Cloud infrastructure services offered by some providers” that could make their clients’ data vulnerable.

The problem involves “data separation.” Context says its consultants gained access to data that was left behind by other users of the same cloud provider on “dirty disks.” The data included “fragments of customer databases and elements of system information that could, in combination with other data, allow an attacker to take control of other hosted servers.” Context discovered the data accessibility but did not “disclose, use, record, transmit, or store” any of that data.

Among the four providers tested by the consultancy, two (VPS.NET and Rackspace) did not consistently and securely separate virtual servers or nodes “through shared hard disk and network resources.” After informing the providers of the potential security hole, Context confirmed that Rackspace had fixed the vulnerability, which was found “among some users of its now-legacy platform for Linux Cloud Servers.” According to Context, Rackspace says “it knows of no instance in which any customer’s data was seen or exploited in any way by any unauthorized party.”

VPS.NET says it installed a patch that resolved the security issue, but no details were provided. More troublesome is that its service is based on OnApp technology, which Context says “is also used by over 250 other cloud providers. OnApp told Context that it now allows customers to opt in to having their data removed securely, leaving thousands of virtual machines at potential risk. OnApp added that it has not taken measures to clean up remnant data left by providers or customers, on the grounds that not many customers are affected.” Unfortunately, “not many” doesn’t mean every customer is safe.

Context says that its research revealed that “if virtual machines are not sufficiently isolated or a mistake is made somewhere in the provisioning or de-provisioning process, then leakage of data might occur between servers.” A more complete explanation of the exposure, how tests were conducted, and provider responses are described in its blog.
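
One practical tenant-side precaution -- my own sketch, not part of Context’s methodology -- is to sample a newly provisioned volume before first use and verify that it reads back as zeroed rather than containing someone else’s leftover bytes:

    # Illustrative sketch: check the beginning of a freshly attached volume
    # for non-zero (residual) data. Checking only the first blocks is a
    # simplification; a real check would sample across the whole device.
    def has_residual_data(device_path, block_size=4096, max_blocks=1024):
        with open(device_path, "rb") as dev:
            for _ in range(max_blocks):
                block = dev.read(block_size)
                if not block:
                    break
                if block.strip(b"\x00"):  # any non-zero byte => leftover data
                    return True
        return False

    # Hypothetical device name; requires read permission on the block device.
    # print(has_residual_data("/dev/vdb"))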

-- James E. Powell
Editorial Director, ESJ



Enterprise Snippets: Mismanaging Admin Rights; Programmer’s Professional Network; VDI Projects Stalled

Survey Shows Impact of Mismanaging Windows Admin Rights

A new survey of 1000 British IT personnel this month reveals several hidden dangers of (and massive costs due to) mismanagement of user administration rights. According to a release prepared by Avecto, a Windows privilege management specialist and the survey’s sponsor:

  • 19 percent of those surveyed missed a critical deadline as a result of being denied full access to an application
  • Nearly 30 percent believe they don’t have access to all the applications they need to get their work done
  • 16 percent of respondents would be tempted to use their admin rights to access sensitive data if they still had them after they left their company
  • Nearly one in four call IT support three or more times a year because they can’t get an app to work due to issues with admin rights (the average of all respondents is 1.77 calls per year)
  • More than 1 in 5 people taking the survey say they know someone in their organization who has breached IT security policies

Paul Kenyon, Avecto’s COO, noted that “Being denied access to work applications has a greater impact on men than women -- with half of men saying it has caused them problems compared to just a third of women. It also worried more men, with a quarter fearing that they won’t be able to access the technology they need to do their jobs. Perhaps this ‘loaphobia’ (Lack-Of-Application-Phobia) will be the next big thing worrying the UK’s workforce?”

 

Masterbranch Unveils Professional Network for Developers by Developers

Masterbranch, a site used by 500,000 developers to showcase their 1.5 million projects, has added a new "Project Leaderboard" that ranks its contributors’ Open Source projects using DevScore, its own system for assigning a numerical rating that factors in developer contributions, project reputation (downloads, followers, likes, etc.), and the reputation of co-workers.
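
Masterbranch hasn’t published the DevScore formula, but a purely hypothetical sketch of a weighted rating over the factors the company lists (invented weights, not the real algorithm) conveys the idea:

    # Purely hypothetical illustration of a DevScore-like weighted rating.
    # The weights and the 0-100 input scale are invented for the example.
    def rating(contributions, project_reputation, coworker_reputation,
               weights=(0.5, 0.3, 0.2)):
        w_contrib, w_project, w_coworker = weights
        return (w_contrib * contributions
                + w_project * project_reputation
                + w_coworker * coworker_reputation)

    # Example: heavy contributions to a moderately popular project,
    # alongside well-regarded collaborators.
    print(round(rating(contributions=80, project_reputation=55,
                       coworker_reputation=70), 1))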

The site, which automatically updates and scores developers’ projects, has added a "Company Wish List" feature that lets developers contact companies directly, sharing project and skill information along with their relocation preferences and salary requirements.

It strikes me that Masterbranch is in some ways a cross between a sophisticated jobs/talent board and a social site. It serves as a venue for displaying a programmer’s talent instead of an (often boring) résumé, and could be a good source for enterprises looking for coding talent. On the social front, the new “Connections” feature lets developers connect with other developers (and only developers), including people who work with their colleagues or others they might know.


VDI Projects Are Hot, But Half Are Stalled

A new study of 500 IT professionals funded by Virsto Software confirms what most IT professionals already know: virtual desktop infrastructure (VDI) projects are hot, especially among midsize and large enterprise IT organizations. Unfortunately, they’re being hampered by three factors: cost, performance, and end-user complaints.

Over half (54 percent) of respondents are involved with VDI -- they’ve either started a pilot project or have implemented a VDI project. Of those projects, however, 46 percent are stalled “due to unacceptable end user performance and projected cost overruns.”

According to a company release, the survey also found that

  • Two-thirds (67 percent) of respondents plan to work on a VDI project in the next year. Among this group, three-quarters (76 percent) say cost reduction and ease of management are driving their initiative.
  • Over half (54 percent) have a budget target per desktop (including storage, licensing, and end points) of less than $500
  • 65 percent of respondents selected VMware as their deployment hypervisor; 12 percent chose Citrix XenServer, and 8 percent selected Microsoft Hyper-V. Half (50 percent) plan to deploy VMware’s View desktop solution.

-- James E. Powell
Editorial Director, ESJ



Security Snippets: Flashback Update, Disorganized Encryption

Flashback: Lessons Learned, Free Removal Tool

Symantec says the number of Flashback-infected computers is on the decline: roughly 270,000 systems remain infected (down from the original 600,000), mostly in North America, Australia, and the UK. The company has also launched a free Flashback detection and removal tool available at www.norton.com/flashback.

The company lists three “lessons learned” from the malware incident:

  • No operating system is immune to malware attack, and any Internet-connected device should have security precautions in place.

  • Mac users are not out of the woods. There are still hundreds of thousands of users who have not taken the steps necessary to remove the malware. Additional infections are also still possible if the appropriate security updates are not installed.

  • Cybercriminals often build on the exploits of others; additional attempts at widespread Mac malware infections are likely to follow.

More details can be found in this Symantec blog post.


IT’s Encryption Efforts Disorganized, Risky

The independent research firm Enterprise Strategy Group (ESG) says chief security officers should “aggressively address risk and costs of ad hoc encryption technologies and fragmented key management.” In its paper, Enterprise Encryption and Key Management Strategy: The Time is Now, ESG senior principal analyst Jon Oltsik presents new research into how data encryption is being used in enterprise networks. [Editor’s note: Access to the report requires registration.]

It isn’t pretty. Oltsik warns that “encryption technologies are being implemented in a disorganized, ad hoc manner that leads to increased security risks and costs.” The analyst recommends a framework to address these shortcomings.

Oltsik explains that “when it comes to information strategy, large organizations tend to focus on firefighting rather than long-term strategy. Unfortunately, this short-sighted approach has its limits. Ad hoc encryption leads to redundant processes, complex operations, and high costs while placing sensitive data at risk of accidental compromise or malicious insider attack.”

The report points to four common shortcomings in enterprise encryption and key management (a short sketch after the list illustrates centralized key control):

  • A lack of standards and management by disparate functional IT groups without data security expertise

  • No central command and control – each tool has its own policies, provisioning and management of keys

  • Disorganized key management systems that place data at risk for a security breach and unrecoverable critical files

  • Organizational misalignment that doesn’t address insider threats by providing adequate access management and separation of duties
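
To make the “central command and control” point concrete, here is a minimal envelope-encryption sketch -- my illustration, not something from the ESG report -- using the third-party Python cryptography package: each application gets its own data key, but every data key is wrapped by one centrally managed key-encryption key, so rotation and revocation happen in a single place.

    # Minimal envelope-encryption sketch (illustrative only).
    # Requires the third-party package: pip install cryptography
    from cryptography.fernet import Fernet

    # One centrally managed key-encryption key (KEK). In practice this would
    # live in an HSM or key-management service, never in application code.
    kek = Fernet(Fernet.generate_key())

    def new_wrapped_data_key():
        """Create a per-application data key; return (wrapped_key, raw_key)."""
        raw_key = Fernet.generate_key()
        return kek.encrypt(raw_key), raw_key

    def unwrap(wrapped_key):
        """The central service unwraps data keys; apps never hold the KEK."""
        return kek.decrypt(wrapped_key)

    # An application encrypts with its own data key...
    wrapped, raw = new_wrapped_data_key()
    ciphertext = Fernet(raw).encrypt(b"customer record")

    # ...and later recovers the plaintext by asking the central service
    # to unwrap its stored, wrapped data key.
    print(Fernet(unwrap(wrapped)).decrypt(ciphertext))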

Tina Stewart, vice president of marketing for Vormetric, an enterprise encryption specialist, noted that “Encryption is being implemented on a broad scale, driven by increased threats from the outside as well as within the organization. This ESG research report explains the risk and costs associated with fragmented approaches to encryption, and advantages of developing a top/down plan to centralize its management and control.”

-- James E. Powell
Editorial Director, ESJ



Data Centers Not Prepared for Disaster

When disaster strikes, IT will be woefully ill-prepared -- and they know it.

That’s the upshot of the 2012 Service Availability Benchmark Survey from Continuity Software, a provider of disaster recovery and business continuity monitoring and management solutions. The survey found that “many enterprise IT organizations remain woefully ill prepared to face and endure an interruption in services and/or disaster of any duration, size, or scale.”

Over one-fourth of surveyed firms admitted “that they did not meet their service availability goals for their mission-critical systems in 2011,” and 84 percent admitted that “they were aware that their organization lacked sufficient disaster recovery capacity.” Upper management should be worried: 64 percent of those surveyed said they “lacked confidence in their DR testing.”

Talk about courting disaster with a disaster recovery plan -- or a lack thereof. Even so, company founder and CEO Gil Hecht said, “We at Continuity Software were not terribly surprised by the results of this survey. Our day-to-day conversations with IT executives and channel reseller partners over the years consistently confirm that organizations are simply not dedicating the time and resources necessary to protect their business. However, while for some it is a matter of neglect, for most IT executives it is an unfortunate result of being provided with limited operational and capital investment budgets.”

The report is available here (registration required).

-- James E. Powell
Editorial Director, ESJ



Financial Services Target of More (But Shorter) DDoS Attacks

A new report from Prolexic Technologies, which offers distributed denial of service (DDoS) protection services, says its Security Engineering and Response Team (PLXsert) found a nearly three-fold rise in attacks against its financial services clients in Q1 compared to the last quarter of 2011. That’s not all -- it reported a 3,000 percent increase (that’s roughly a 30-fold rise) in malicious packet traffic.

Prolexic also said it mitigated more attack traffic this quarter than it did in all of 2011.

According to Neal Quinn, Prolexic’s vice president of operations, “We expect other verticals beyond financial services, gaming and gambling to be on the receiving end of these massive attack volumes as the year progresses.”

The report also found that, compared to the same quarter in 2011, the total number of DDoS attacks rose by a quarter, as did Layer 7 (application layer) attacks. Attacks were also shorter, lasting an average of 28.5 hours vs. 65 hours.

Compared to the last quarter of 2011, the total number of attacks remained steady, but Layer 7 attacks rose by 6 percent. Prolexic said that China is still the top source country for attacks, though the U.S. and Russia are rising on that list. Average attack duration dropped (from 34 hours to 28.5 hours), but average attack bandwidth increased from 5.2 Gbps to 6.1 Gbps.

Attackers still favor infrastructure-layer attacks that target Layer 3 and Layer 4, but over the long term, PLXsert sees a gradual move to Layer 7 attacks.

The company also noted that in the last year, “UDP Floods have declined in popularity with SYN Floods emerging as the ‘go-to’ attack type.”
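
For context on what a SYN Flood looks like from the target’s side, here is a small Linux-only sketch -- my illustration, not part of Prolexic’s report -- that counts half-open (SYN_RECV) TCP connections by reading /proc/net/tcp; a sudden spike in that count is a crude symptom of a Layer 4 attack in progress.

    # Linux-only illustrative sketch: count half-open TCP connections by
    # parsing /proc/net/tcp. State code "03" is SYN_RECV.
    SYN_RECV = "03"

    def count_half_open(proc_path="/proc/net/tcp"):
        count = 0
        with open(proc_path) as f:
            next(f)  # skip the header line
            for line in f:
                fields = line.split()
                # Field layout: sl, local_address, rem_address, st, ...
                if len(fields) > 3 and fields[3] == SYN_RECV:
                    count += 1
        return count

    print("half-open connections:", count_half_open())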

The full report is available for free here (registration required).

-- James E. Powell
Editorial Director, ESJ



Unit Testing Kills Bugs Best; C “Overtakes” Java

Unit Testing Best Bug Killer

According to a new developer survey from Typemock, a vendor of unit-testing solutions, over 90 percent of developers say that unit testing is effective in reducing software bugs -- more so than integration testing, pair programming, and quality assurance. The survey also found that 80 percent of respondents acknowledge that developers are responsible for bugs; only 8 percent said responsibility lies with QA.
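
For readers who don’t write code, a unit test exercises one small piece of code in isolation and fails the moment its behavior regresses. A minimal sketch using Python’s built-in unittest module (the function and test names are hypothetical):

    # Minimal unit-testing sketch; the function and tests are hypothetical.
    import unittest

    def apply_discount(price, percent):
        """Return price reduced by the given percentage."""
        if not 0 <= percent <= 100:
            raise ValueError("percent must be between 0 and 100")
        return round(price * (1 - percent / 100), 2)

    class ApplyDiscountTests(unittest.TestCase):
        def test_typical_discount(self):
            self.assertEqual(apply_discount(200.0, 25), 150.0)

        def test_zero_discount_leaves_price_unchanged(self):
            self.assertEqual(apply_discount(99.99, 0), 99.99)

        def test_invalid_percent_is_rejected(self):
            with self.assertRaises(ValueError):
                apply_discount(100.0, 150)

    if __name__ == "__main__":
        unittest.main()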

Finding and fixing bugs is time-consuming: 48 percent of developers said they spend up to 5 hours each week on the task, 38 percent said they spend up to 10 hours a week, and 12 percent devote more than 10 hours a week to the chore. Typemock says that customers of its Typemock Team Mate, which tracks developers’ actual usage, spend between 50 and 55 percent of their workweek using the debugger.

C “Overtakes” Java as Most Popular Language

TIOBE, a software quality assessment and tracking company, has released an update to its “Programming Community Index,” noting that the long-term popularity of the C programming language has surpassed the popularity of long-in-decline Java. “Although it is expected that Java will not decline much further due to the popularity of the Android platform, C is able to remain number one for at least another couple of months.”

The full report, including trend lines and a list of the top 50 languages, can be found here.

-- James E. Powell
Editorial Director, ESJ
