IT Ignoring Remote and Local File-inclusion Attacks

A new Hacker Intelligence Initiative report released by Imperva, a data security solutions provider, takes aim at remote and local file inclusion (RFI/LFI) attacks, which allow hackers to run malicious code and steal data not by slipping malicious files into e-mail attachments but by manipulating a company's Web server into loading them.

It may be a threat you haven’t paid attention to, but RFI/LFI attacks made up more than one in five (21 percent) of the application attacks the company found when it reviewed attacks on 40 applications from June through November of last year.

“RFI and LFI attacks take advantage of vulnerable PHP Web application parameters by including a URL reference to remotely host[ed] arbitrary code, enabling remote execution. PHP is a programming language designed for Web development and whose use is prevalent in applications on the Internet,” the report said.
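To make the pattern concrete, here is a minimal, hypothetical sketch of the kind of unvalidated "include this file" parameter these attacks abuse. It is written in Python (with Flask) rather than PHP, and it is not drawn from the Imperva report; the route names and file paths are invented for illustration.

from flask import Flask, request

app = Flask(__name__)

@app.route("/view")
def view():
    # DANGEROUS: the file to load comes straight from the query string.
    # A request such as /view?page=../../etc/passwd pulls in a local file (LFI);
    # in PHP, with allow_url_include enabled, the same pattern can include a
    # remotely hosted script and execute it (RFI).
    page = request.args.get("page", "home.html")
    with open(page) as f:  # no whitelist, no path normalization
        return f.read()

# A safer variant maps user input onto a fixed whitelist of known templates.
ALLOWED = {"home": "templates/home.html", "about": "templates/about.html"}

@app.route("/safe")
def safe():
    page = request.args.get("page", "home")
    with open(ALLOWED.get(page, ALLOWED["home"])) as f:
        return f.read()

if __name__ == "__main__":
    app.run()

The remedy is the same in any language: never build a file path or URL directly from user input; resolve the input against a whitelist of known files instead.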

The company is raising a red flag for IT security professionals. "LFI and RFI are popular attack vectors for hackers because it is less known and extremely powerful when successful," said Tal Be'ery, the company’s senior Web researcher. "We observed that hacktivists and for-profit hackers utilized these techniques extensively in 2011, and we believe it is time for the security community to devote more attention to the issue."

The report discusses real-world attacks, including how 1.2 million WordPress Web sites were compromised using the TimThumb vulnerability. It includes a technical analysis of an RFI-infected file, examining how shell code hides the attack vector and how that makes it possible to avoid traditional detection and mitigation techniques.

The report also discusses an approach “to mitigate against RFI attacks by utilizing a shell hosting feed.”

The full report can be downloaded for free without the need for registration here.

-- James E. Powell
Editorial Director, ESJ



CDW Poll Reveals Popular Options for Improving Data Center Energy Efficiency

CDW’s fourth annual Energy Efficient IT Report, a survey of IT professionals in both public and private sectors across the U.S., is out with “solution ratings maps” that identify data center solutions respondents believe offer the “greatest potential for cost reductions” and that are the easiest to get approved and implemented.

Online versions of the Solution Ease Ratings Map and Solution Savings Ratings Map help users understand the results visually. The Solution Savings map displays the percentage of respondents reporting actual savings and projected additional savings for a variety of tasks, such as “consolidating servers” and “deploying more power-efficient networking equipment.” The Solution Ease map plots those tasks against ease of approval and ease of implementation.

Among newer technologies, 62 percent of respondents said they believe cloud computing is an energy-efficient approach to data center consolidation (only 47 percent agreed with that statement in 2010).

What are enterprises actually doing now to reduce their energy use? The top two options: virtualization of servers and storage (65 percent) and server consolidation (60 percent). Implementing hardware that uses newer, low-power/low-wattage processors, installing ENERGY STAR devices, and deploying more power-efficient networking equipment rounded out the top 5 in popularity.

How much are they saving? Respondents said virtualized servers and storage helped them reduce energy use by 28 percent; new cooling approaches could cut costs by 22 percent, and energy-efficient/load-shedding UPSes could cut use by 21 percent. (The survey warns that average savings estimates are for individual solutions only; total savings when solutions are combined would be different.)
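That caveat is worth a quick illustration. The individual percentages are the survey's; the compounding assumption below is mine, offered only to show why the figures can't simply be added together.

# Illustration only (not from the CDW report): if each solution reduces the
# *remaining* energy use, the combined effect compounds rather than sums.
individual = [0.28, 0.22, 0.21]   # virtualization, new cooling, efficient UPSes

naive_sum = sum(individual)       # 0.71 -- overstates the combined savings
remaining = 1.0
for s in individual:
    remaining *= (1.0 - s)
compounded = 1.0 - remaining      # roughly 0.56, i.e., about 56 percent

print(f"naive sum: {naive_sum:.0%}, compounded: {compounded:.0%}")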

Favorable attitudes about “going green” are growing: 43 percent of study participants said green initiatives are a leading driver of data center consolidation (up from 34 percent in the previous year’s report). More than half of respondents (54 percent) “have or are developing programs to manage power demand in the data center.” Of enterprises with such a program, three-quarters (75 percent) claim they’ve reduced their IT energy costs.

IT is also putting its money where its “clean” mouth is. Survey respondents said that on average, about a third (32 percent) of their data center purchasing in the last three months was “green” -- that is, “energy efficient, water efficient, bio-based, environmentally preferable, or non-ozone depleting.”

Of course, not everything is rosy; many barriers remain. IT professionals still want “information and measurement tools” so they can assess energy use, potential savings, and ROI. They want “an objective breakdown of power and energy use within IT, a clearer set of industry standards for what constitutes energy efficient IT, and easier identification of energy-efficient equipment.” Topping the list of barriers: 58 percent said "We have too little budget left for new, more efficient systems," and "Our senior management gives higher priority to investments in other areas of the organization" was cited by half of those surveyed. A mere eight percent of respondents say it’s easy to estimate energy use or anticipated savings based on the equipment specifications manufacturers provide.

A free copy of the report (registration required), which includes breakdowns by industry, can be downloaded here.

-- James E. Powell
Editorial Director, ESJ



Enterprise Snippets for March 30, 2012

Testing Orgs Using Cloud, Outsourcing to Deliver Value

Capgemini and Sogeti released their second annual World Quality Report. The HP-sponsored report “shows the challenges facing financial services organizations are directly translating into IT departments focusing on compliance, business growth, and cost optimization. As a result, quality assurance organizations are seeking to deliver value to the business by leveraging cloud and SaaS services, expanding outsourcing, creating testing centers of excellence (TCoE) and preparing for mobile services.” Details here.


Obama Administration Announces $200 Million “Big Data” R&D Initiative

The Obama Administration’s “Big Data Research and Development Initiative” is attempting to improve “our ability to extract knowledge and insights from large and complex collections of digital data.” Six Federal departments and agencies are committing over $200 million “to greatly improve the tools and techniques needed to access, organize, and glean discoveries from huge volumes of digital data.” Details for each agency are spelled out in the initiative’s announcement.


Tips for Creating a DDoS Playbook

Prolexic Technologies, a distributed denial of service (DDoS) protection services provider, recommends that online businesses create a “mitigation playbook” to “minimize the disruption and confusion that typically occurs at the outset of a DDoS attack.” The company says it’s a best practice it implements with all its clients. The company’s white paper explains the purposes of a playbook as well as the three elements it should contain. It’s available for free here; a short registration is required.



Offshoring Will Hit IT Hard, Hackett Group Predicts

Although there are signs of economic growth in business services (including IT, finance, procurement, and human resources) in the U.S. and Europe, “globalization” and competitive pressures will continue to drive jobs to offshore providers. By 2016, 2.3 million jobs in those four sectors will have moved offshore -- nearly 1.1 million of them IT jobs.

The Hackett Group says that although the economy, measured from 2002 and projected through 2016, will be 20 percent above its 2001 level, IT jobs will decline 54 percent over that same 15-year period -- the worst showing of any of the four business functions. By comparison, finance will lose 42 percent of its jobs, procurement functions will decline by 36 percent, and HR will decline by 33 percent.

The study, Job Losses from Offshoring and Productivity Improvements Far Outpace Gains from Economic Growth, looks at employment at North American and European companies with over $1 billion in revenue in 2010 -- a total of 4,700 enterprises. It notes that business-services jobs have been shifting from developed economies to offshore providers since 2002. Its projections of job cuts between 2013 and 2016 show a gradual decrease (in absolute numbers), and for IT staff it’ll still be tough going.

The report also points out that “as companies embed technology into an expanding range of products, new IT jobs in their product development organizations are being created. Finally, the IT industry (hardware, software and telecommunication) itself continues to grow, creating additional demand for IT workers.” Unfortunately, the demand may be fulfilled in “low-cost geographies.”

Today, 26 percent of “IT (Knowledge-Centric)” jobs (including “infrastructure development, application development and implementation, planning and strategy, and function management”) are located in these “geographies.” In 2-3 years, Hackett believes that will grow to 37 percent. For IT operations, the figures are 36 percent today and 50 percent within 3 years.

There may be hope for IT employment longer term. The Hackett Group's chief research officer, Michel Janssen, says that although the jobs outlook for the next decade isn’t bright, “... after the offshoring spike driven by the Great Recession in 2009, the well is clearly beginning to dry up. A decade from now the landscape will have fundamentally changed, and the flow of business services jobs to India and other low-cost countries will have ceased.”

Enterprises looking to cut costs will still have “opportunities for improving efficiency,” Janssen says; they can examine “automation and end-to-end process improvements to streamline how business services are provided.”

The full report is available for free download here (short registration required).

-- James E. Powell



Avoiding Cloud Lock-in Relies on Migration Options, Speed

When it comes to IT concerns, the cloud is no different from any other application. After all, just as with selecting a payroll or network monitoring application, IT must worry about “lock-in.” What if the service doesn’t keep up with technology or the law? What if it gets too expensive? What are our options? How do we avoid vendor lock-in?

With cloud computing, selecting the right provider can be tricky. It’s a nascent industry, after all; some providers may be underfunded or their tech support or performance may be substandard. This may not come to light until you actually sign up for the service, load your data, and then run into glitches. It’s no wonder IT wants to keep its options open.

In a report from Nasuni, Bulk Data Migration in the Cloud, the enterprise storage company used cloud computing resources to test cloud-to-cloud migration among three of the most popular storage providers identified in its December State of Cloud Storage Providers report -- Amazon S3, Microsoft Windows Azure, and Rackspace. The results are eye-opening.

The company used 5 percent of a 12 TB test bed containing 22 million files in all, with a variety of file sizes (but averaging 550 KB). Files were encrypted and compressed so that moving them would pose no security threat. From this test, Nasuni estimated the minimum migration time for a 12 TB storage volume. Overall results varied “significantly depending on the time of day and the number of compute machines used to transfer the data,” as you’d expect, but neither of these was the critical factor.
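The white paper spells out the actual methodology; as a rough illustration of how a sampled run scales up, here is a back-of-the-envelope sketch. The 12-minute sample time is invented for the example, and the linear-scaling assumption is mine.

# Illustration only -- the real methodology and numbers are in Nasuni's white paper.
# Assumes transfer time scales roughly linearly from the 5 percent sample to the
# full 12 TB volume.
SAMPLE_FRACTION = 0.05    # 5 percent of the 12 TB test bed (about 600 GB)

def estimate_full_migration_hours(sample_hours: float) -> float:
    """Scale a measured sample transfer time up to the full volume."""
    return sample_hours / SAMPLE_FRACTION

# If the sample took 12 minutes (0.2 hours), the full 12 TB would take about
# 4 hours -- consistent with the S3-to-S3 figure reported below.
print(estimate_full_migration_hours(0.2))   # -> 4.0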

When Amazon S3 was the cloud destination, times were shortest. An S3-to-S3 transfer took four hours, as did the Azure-to-S3 move. When Azure was the recipient of data from S3, the task took 10 times as long: that transfer was estimated at 40 hours. However, that was fast compared to moving from S3 to Rackspace -- which took “just under one week.” Going in the opposite direction -- moving from Rackspace to S3 -- was much speedier; the task was complete in five hours.

Based on these results, the report concludes “the biggest limiting factor appears to be the cloud’s write capability.”  The Amazon S3 results might have been faster had Nasuni worked with more resources. The company was limited to 40 machines for its tests, so “engineers couldn’t push Amazon S3 to its limit.”

You can download the white paper (which contains the test methodology and results) here; a short registration is required.

-- James E. Powell
Editorial Director, ESJ



When It Comes to the Cloud, Users Won’t Wait for IT

The saying “Give the customers what they want” has an equally important corollary: “If you don’t, customers will find another way to get it.” When it comes to public clouds, heed the corollary.

A new study, Delivering on High Cloud Expectations, conducted by Forrester Consulting for business service management specialist BMC Software, won’t be released until April 26, but the company released a preview of some key findings this week that should be of interest to IT and business users (and managers) alike.

The study looks at the growing demand for public cloud services. That’s what the customer wants. If IT doesn’t give its users what they want, CIOs “are rightly concerned that business teams are willing to circumvent IT in order to acquire cloud services on their own.”

Business users see the cloud as a speedy and low-cost way of getting the solutions they need, which is putting pressure on IT everywhere. Although “IT teams work to meet the needs of the business, the demand for more speed and agility is creating an environment in which business teams are looking outside the organization to provision services in public clouds.” The bottom line: “IT departments must expand plans to incorporate public cloud services into their overall cloud strategies.”

Some key findings from the report show a conundrum for IT. For example, IT is battling complexity, which isn’t going to change any time soon. The survey found that “39 percent of respondents reported having five or more virtual server pools, and 43 percent report three or more hypervisor technologies.” The study found that IT’s top priority over the next year is cost reduction (the “doing more with less” principle), and “complexity reduction” is the leading strategy for getting there.

Business users, on the other hand, see cloud computing as a way to be independent of IT, according to 72 percent of the CIOs in the survey. The problem: when users go around IT, the complexity (and headaches) for IT simply increase. Unfortunately, users are already well on their way to this behavior. “Approximately 58 percent of respondents are running mission-critical workloads in the unmanaged public cloud regardless of policy, while only 36 percent have policies allowing this.” The survey found that 79 percent of respondents “plan to support running mission-critical workloads on unmanaged public cloud services in the next two years.”

They’d better hurry if they don’t want to lose all control. Don’t get me wrong -- IT wants to help, but it’ll be tough when you read that “71 percent of respondents thought that IT operations should be responsible for ensuring public cloud services meet their firm’s requirements for performance, security, and availability,” but “61 percent of the survey respondents agreed that it will be difficult to provide the same level of management across public and private cloud services.”

The study was based on a sample of “327 enterprise infrastructure executives and architects across the United States, Europe and Asia-Pacific.”

-- James E. Powell
Editorial Director, ESJ



Employees Share Confidential Data Despite Possible Job Loss

A survey by FileTrek reveals that 90 percent of the more than 2,600 adult Americans surveyed in January believe people share company confidential information with the outside world, and 72 percent of respondents dread being accused of doing so (it's their top fear).

Despite this, nearly half (48 percent) say a boss’s approval is a valid defense for sharing data. Also on the list of acceptable circumstances: needing to finish a late-night project at home, working on weekends or during vacation, sharing information that is about themselves, taking data that can be returned without the boss’s knowledge, or showing files to others who promise confidentiality. The file transfer medium of choice: 55 percent said they’d use a USB drive.

Opinions vary by age. For example, 68 percent of respondents aged 18-34 say it is permissible to remove files, but only half of those aged 55 and over feel that way. In fact, 86 percent of those older respondents say that taking confidential information is grounds for dismissal; the figure is 74 percent for the younger group. Only sexual harassment of a coworker and incompetence rated higher as a reason for termination.

I asked FileTrek CEO Dale Quayle what he believed was the root cause of the problem with employees' attitudes. After all, employees know it could get them fired, but they'd do it anyway if the boss says it's OK.

"I actually believe that many workers don't even realize that they are putting their employer’s confidential information at risk since many companies are not clear about their policies regarding corporate IP," Quayle said.. "Another root cause is simply the fact that modern technology has allowed employees to share electronic data quite easily. These innovations have increased productivity and collaboration, but it has also increased the risk of confidential information being leaked or traveling out in the wild.

"Employees are more concerned with the ability to be mobile and the freedom to access work documents on several devices than being motivated to protect sensitive data. I believe the solution is to provide businesses with a safe and secure system that enables employees with modern tools while protecting an organization's confidential data at the same time."

I asked Quayle what he thought the biggest mistakes enterprises make in trying to change the attitudes and company culture about file sharing and wondered what steps enterprises should take instead.

"The most common mistake is when an enterprise allows their employees to utilize numerous disparate file sharing services between departments, work groups, and within the company," Quayle pointed out. This creates tremendous data sprawl and it is difficult to put the genie back in the bottle. Many of our customers claim the key to successfully changing the culture regarding file sharing and ensuring security of corporate IP is to: first, define concise policies regarding document distribution and approved methods. Next, it is extremely important to educate their workforce as to the importance and reasoning behind the policies."

-- James E. Powell
Editorial Director, ESJ



Latest Ponemon Security Report Examines Enterprise Data Breaches

Security administrators know that they have more to fear from negligent employees than they do from external hacks. Ponemon’s study, 2011 Cost of Data Breach Study: United States, is the seventh in the firm’s history and it comes to the same conclusion. It reports that “negligent insiders and malicious attacks are the main causes of data breach,” and 39 percent of organizations “say negligence was the root cause of the data breaches.”

What’s new (at least to me): “malicious attacks are 25 percent more costly than other types” of attacks.

If you want to cut the cost of a data breach, hire a chief information security officer (CISO) with responsibility for data protection. Doing so, Ponemon says, can cut the cost of a data breach “by 35 percent per compromised record.” The cost of such a breach in 2011 was $5.5 million, or $194 per record (based on information from 49 breaches at companies in the United States across 14 industries), which is down from $7.2 million ($214 per record) in 2010. Given those figures, the cost of that CISO could easily be justified by the savings alone.
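As a back-of-the-envelope check (my arithmetic, not the report's model), those figures imply an average breach of roughly 28,000 records, so a 35 percent per-record reduction is worth on the order of $1.9 million per breach:

# Rough arithmetic based on the figures quoted above; the savings model itself is
# Ponemon's and is not reproduced here.
cost_per_record = 194          # 2011 average cost per compromised record (USD)
avg_breach_cost = 5_500_000    # 2011 average total cost of a breach (USD)
ciso_reduction = 0.35          # reported per-record reduction with a CISO in place

records_per_breach = avg_breach_cost / cost_per_record    # roughly 28,000 records
savings_per_breach = avg_breach_cost * ciso_reduction     # about $1.9 million

print(f"{records_per_breach:,.0f} records, about ${savings_per_breach:,.0f} saved")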

Not ready to add to your payroll? “Outside consultants assisting with the breach response also can save as much as $41 per record.”

Experience with data breaches is also valuable. “Organizations that had their first ever data breach spent on average $37 more per record.” It also pays to take your time and get the facts straight after a breach. Enterprises “that responded and notified customers too quickly without a thorough assessment of the data breach also paid an average of $33 more per record.”

Despite the serious impact on an enterprise’s reputation after a breach is made public, customers stay loyal. “For the first time, fewer customers are abandoning companies that have a data breach. However, certain industries are more susceptible to customer churn, which causes their data breach costs to be higher than the average.”

The study factored in both direct business costs (hiring forensic experts, free credit monitoring services for customers) and indirect costs (such as in-house investigations), as well as more complex factors such as loss of customers and customer turnover. It excludes “big” breaches (those over 100,000 records) because they are less common.

-- James E. Powell
Editorial Director, ESJ
