Global Network Assessment Shows Network Weaknesses

How’s your network? If you’re typical of the organizations surveyed for Dimension Data’s annual Network Barometer Report, there’s plenty of room for improvement.

The company aggregated data from 270 Technology Lifecycle Management (TLM) assessments conducted globally, each reviewing an organization’s network readiness. The assessments examined security vulnerabilities, the end-of-life status of network components, and any variances between how the network is actually configured and best practices.

Take security vulnerabilities, for example. Overall, a stunning 73 percent of networking devices at companies of all sizes had security vulnerabilities. Digging into the numbers more closely, Dimension Data found that half of the vulnerabilities could be attributed to a single problem: “PSIRT 109444 was found to be responsible for the jump, and was found in 66 percent of all devices analyzed during 2010.” [emphasis added]

The vulnerability was first identified by Cisco in September 2009, and it’s a serious one: a wide range of devices from Cisco “could have their TCP state manipulated such that no new TCP sessions can be established, thereby resulting in a Denial of Service where the network device can no longer transmit data,” according to the report. Given the wide use of Cisco devices, eradicating the problem should be a high priority for network admins.
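
Remediating a flaw like this starts with discovery: knowing which devices in your inventory run an affected software release. The advisory itself lists the affected versions; the sketch below (in Python, using hypothetical version strings rather than the real affected-version list for this advisory) shows the basic inventory-matching step.

```python
# Minimal sketch of vulnerability discovery against a device inventory:
# flag devices whose software version appears on an advisory's affected list.
# The version strings below are hypothetical placeholders, not the actual
# affected-version list for PSIRT 109444.

affected_versions = {"12.4(15)T", "12.4(20)T", "15.0(1)M"}

inventory = [
    {"hostname": "core-sw-1",  "version": "12.4(15)T"},
    {"hostname": "edge-rtr-2", "version": "15.1(2)T"},
]

vulnerable = [dev for dev in inventory if dev["version"] in affected_versions]
for dev in vulnerable:
    print(f"{dev['hostname']} runs affected release {dev['version']} -- schedule an upgrade")
```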

The report points out that “the prevalence of this ... security vulnerability suggests that for the majority of organizations, existing discovery and remediation processes are falling short of the mark.” It notes that “the 2010 assessment results showed that, apart from this particular threat, organizations had been patching fairly well, as the next four vulnerabilities were found in less than 20 percent of all devices.” With two-thirds of devices harboring a serious problem, I’m reminded of the question, “Other than that, Mrs. Lincoln, how was the play?”

The good news is that when it comes to hardware, IT is paying attention. “The percentage of devices in late stage end-of-life dropped from 58 percent last year to 47 percent this year, and those beyond [last day of support, after which vendor support ceases] dropped from 31 percent last year to 9 percent.”

The report also looked at configuration issues, finding an average of 29 per organization, down from last year’s average of 42. The firm used automated tool sets to compare each device’s configuration against a “single, generic configuration policy set” derived from a variety of sources, including the Cisco SAFE Blueprint, ISO 17799, and PCI DSS (Payment Card Industry Data Security Standard). “One possible explanation for this improvement is that the global financial crisis -- which delayed many capital expenditure projects -- may have provided the opportunity for organizations to focus on enforcing configuration policies,” the report explains.
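
The report doesn’t detail the tool sets, but conceptually a configuration-compliance check reduces to comparing each device’s configuration against required and prohibited policy statements. Here’s a minimal Python sketch; the policy rules are hypothetical examples, not the actual policy set used in the assessments.

```python
# Minimal sketch of a configuration-compliance check: compare one device's
# configuration against required and prohibited policy statements.
# The rules below are hypothetical examples, not the generic policy set
# (Cisco SAFE, ISO 17799, PCI DSS) referenced in the report.

REQUIRED = [
    "service password-encryption",
    "no ip http server",
]
PROHIBITED = [
    "snmp-server community public",   # default community string
]

def audit_config(config_text: str) -> list[str]:
    """Return human-readable policy violations for one device configuration."""
    lines = [line.strip() for line in config_text.splitlines()]
    issues = []
    for rule in REQUIRED:
        if rule not in lines:
            issues.append(f"missing required statement: {rule}")
    for rule in PROHIBITED:
        if any(line.startswith(rule) for line in lines):
            issues.append(f"prohibited statement present: {rule}")
    return issues

if __name__ == "__main__":
    sample = "hostname edge-router-1\nsnmp-server community public RO\n"
    for issue in audit_config(sample):
        print(issue)
```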

Even 29 may be too high. Dimension Data cautions that “Not only do certain configuration errors open the door for serious security threats and access violations, they are also a leading culprit in terms of network availability. ... Other research has found that as much as 80 percent of application performance problems and network downtime can be attributed to configuration change or error, and surmises that over half of all major network failures may be caused by configuration errors.”

Dimension Data says its TLM assessment “provides organizations the compass they need to chart their IT asset landscape, enabling fundamental security, configuration, and end-of-life network device issues to be proactively addressed.” It must be working: those enterprises performing the assessment more frequently have fewer problems. “While the overall sample size was fairly consistent with previous years, repeat TLM assessment clients (25 percent of the sample) had a lower obsolescence rate (32 percent EoS [end of sale -- the date after which the product can no longer be purchased]) than [first]-time assessment clients (40 percent) and the overall average (38 percent).”

The complete report is available for download (short registration required) at www.dimensiondata.com/networkbarometer.

-- James E. Powell
Editorial Director, ESJ



IT Pros Reveal Practices, Plans for Using Managed Service Providers

In a new survey of midsize and large enterprises (those with 1,000 to 20,000 employees) about their use of managed service providers (MSPs), it’s clear that IT wants to get rid of some of its least-favorite chores. The most common functions outsourced by respondents were storage backup and recovery (56 percent) and service/help desk support (44 percent). The latter topped the list of functions respondents are planning to move to MSPs, and smaller organizations are twice as likely as large companies to make that change. Executives were 2.3 times more likely than their non-executive peers to be thinking about outsourcing storage/recovery management chores.

For those respondents currently using an MSP, virtual servers (68 percent), data center infrastructure (61 percent), and physical servers (56 percent) were in the hands of their MSPs.

What’s driving interest in such outsourcing? Cost reduction was the reason given most often (41 percent), followed by “access to greater technical expertise/depth” (37 percent) and “the ability to better focus on the core business” (36 percent). Executives favored accelerating cloud adoption and saving money by avoiding a capital expense. Non-executives were more likely than executives to be interested in cost reduction and off-hours coverage.

What’s holding enterprises back from fully embracing MSPs? Security risk (at 68 percent of all respondents) was the top issue mentioned, and nearly half worry about loss of control (a concern cited more often by larger enterprises). Smaller enterprises tend to be most concerned about the uncertainty of cost and a lack of MSP responsiveness.

The survey of over 100 executives, mid-level managers, and “individual IT contributors” was conducted in February and March by Enterprise Management Associates and sponsored by Nimsoft; it looked at how enterprises are using and planning to use managed service providers.

Interest remains strong in MSPs. Storage and virtual servers tied (named by 51 percent of respondents) as the top technology their enterprise is planning to move to an MSP in the future. Executives were more likely than other respondents to consider every technology and infrastructure domain the survey raised, with LAN/WAN, physical desktops, and virtual servers among their favorite candidates.

When it comes to applications, MSPs are handling hosted applications (53 percent) and databases (50 percent). What applications will likely be operated by MSPs soon? Private clouds received 60 percent of the vote, tied with hosted applications. Executives are particularly keen on moving databases, SaaS applications, VoIP, and video.

Responses also varied significantly by the size of the respondent’s organization. For example, achieving improved focus on core business was twice as important for larger organizations as it was for smaller ones, while smaller organizations were 2.6 times more likely than larger ones to be looking to MSPs that could provide better coverage during off-hours.

As for selecting an MSP, respondents rank proven expertise and depth of expertise ahead of all other criteria (66 percent), followed by reputation/references and service-level agreements (both at 55 percent). Enterprises aren’t limiting themselves to a single MSP, either. By using more than one managed service provider, they’re looking to balance risk with business continuity (73 percent) and to gain access to different domains and levels of technical expertise (66 percent).

-- James E. Powell
Editorial Director, ESJ



Enterprises Adopting Tablets without Clear Strategy

According to a new survey conducted by Dimensional Research, enterprise adoption of iPads and other tablets is going strong. Nearly a quarter (22 percent) say they have formally deployed such devices, and an additional 22 percent will deploy tablets this year. If plans become reality, more than three-quarters (78 percent) of respondents’ organizations will have tablets in use by the end of 2013.

The problem: over half of respondents say they don’t have a strategy for adopting these devices. Even worse, 72 percent say there are tablets in use in their organizations, even though they haven't been “formally deployed.” Over four in 10 devices (41 percent) are personal hardware purchases used by individuals for enterprise applications. Only 12 percent of participants say their corporation’s IT policy forbids tablets.

Apple is the enterprise-tablet frontrunner. Of those planning to deploy tablets, 83 percent plan to deploy iPads; 34 percent plan to deploy tablets based on the Android operating system (no single brand dominated), and 19 percent plan to add BlackBerry Playbooks. Apple’s popularity is due to the availability of productivity tools, according to 51 percent of those surveyed.

iPads are typically introduced to an organization by C-level executives (49 percent) or a senior IT executive (19 percent). “However, once iPads are introduced into the organization, it is primarily IT [that] is driving future adoption. According to participants, corporate IT is typically spearheading tablet adoption (48 percent) rather than C-level executives or business stakeholders,” the report notes. Executives use iPads the most, followed by corporate IT, sales, and field service personnel. The most popular application: sales force automation.

When it comes to a formal strategy for tablet adoption, over half of participants (51 percent) report their organization does not have a “clearly articulated strategy.”

There’s more worrisome news: “Among those with a strategy, the majority of them cited an application-driven approach (28 percent), where iPads would be deployed to support specific applications and business functions that were a good fit for the unique capabilities of tablets. Very few participants report that their companies are adopting tablets to save money with only 4 percent describing their strategy as bottom-line driven with the goal of reducing costs.”

You’ve probably read how tablets are going to supersede laptops and desktops. According to the survey, however, this “conventional wisdom” isn't so wise. Only 18 percent said tablets will replace laptops; the remaining 82 percent see tablets complementing laptops.

The survey results are based on 448 responses from "business executives, business managers, IT executives, IT managers, and hands-on IT professionals" across a wide range of company sizes and industries. The survey was conducted in April 2011 and sponsored by Model Metrics.

-- James E. Powell
Editorial Director, ESJ



Improved Service Availability, Scalability Drive Cloud Adoption

According to a Network Instruments survey of 94 network engineers, IT managers, and executives attending Interop, migration to cloud computing continues to be popular, growing 20 percent since last year’s survey.

In its fifth annual survey, respondents said their biggest benefit came from “increased application availability and scalability.”

Among the survey’s results:

  • Cloud computing services are running on their organizations’ networks, according to 61 percent of respondents. Half of these respondents have implemented “some form of software-as-a-service (SaaS)” such as Salesforce.com or Google Apps, a gain of 10 percent over last year’s SaaS adoption figure.

  • Half (50 percent) of respondents have private clouds, a 21 percent increase that was the largest gain among all survey answers.

  • Platform-as-a-service is a small but still significant service; 21 percent of respondents said they “rely on some form of platform as a service (PaaS) such as Microsoft Azure and Salesforce Force,” according to a Network Instruments release.

  • Respondents expect to run 38 percent of their applications in the cloud by the middle of this year, up from 21 percent for the same period last year.

  • Cloud services provided 61 percent of respondents with improved application availability, and 4 percent reported availability declined. Just over half (52 percent) reported an improved end-user experience; 4 percent said it was worse. Over half (Network Instruments doesn’t give the percent) said they were better able to scale applications thanks to cloud computing.

  • On the down side, 60 percent said their ability to troubleshoot problems remained the same or was worse after their cloud migration. More than half (52 percent) noted that their ability to monitor cloud performance declined or remained steady.

In its announcement, Network Instruments offers this caveat from Brad Reinboldt, senior product manager at the company: “Although cloud adopters have reported improvements in application availability and cost savings, these improvements aren’t sustainable in the long run without appropriate monitoring tools. When trouble does hit, it falls in the lap of the organization’s network team to prove that the problem is occurring on the cloud provider’s side. Without proof, organizations will waste time finger pointing, jeopardizing any cost savings or efficiency improvements.”

-- James E. Powell
Editorial Director, ESJ



Data Logs: For IT, More is Better

When it comes to keeping and managing system logs, a new SANS Institute report makes it clear the relevant word is “more.” IT is collecting more information from more sources and wants to use the data for more far-reaching purposes.

“When this survey started seven years ago, log collection was only being done by 43 percent of respondents, compared with 89 percent who indicated they collected logs this year,” notes the report’s author, Jerry Shenk, who is a senior analyst for the SANS Institute and a senior security analyst at Windstream Communications. However, rather than just using logs to detect suspicious behavior or troubleshoot problems, enterprises now are collecting log data for forensic analysis and correlation “to meet/prove regulatory compliance” (PCI DSS leads the pack of regulations in this regard). Respondents said they wanted to make better use of the data in cost management, for example.

More data is coming from physical plant and operations sources such as HVAC systems, according to 14 percent of respondents, and 59 percent say they are collecting log data from their line-of-business applications. These sources didn’t even register as major sources in last year’s survey. Other new sources this year: mobile devices (15 percent of respondents) and cloud services (14 percent). In addition to new sources, more devices are being tapped: most organizations collect information from more than 50 devices.

As more enterprises are using log data, the challenges have changed. “The mechanics of collecting, storing, and archiving the log data are no longer the challenge in today’s world of almost unlimited data storage,” the report explains. The biggest problem: the industry still hasn’t devised standard formats. Enterprises must still deal with inconsistent date formats between log sources, for example.
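
Timestamp handling is a good illustration of the problem: syslog, Windows exports, and application logs all record dates differently, so correlation requires normalizing them first. The short Python sketch below shows one way to do that; the format strings and sample values are illustrative, not drawn from the survey.

```python
# Sketch of timestamp normalization across log sources with inconsistent
# date formats. The format strings and sample lines are illustrative only.
from datetime import datetime, timezone

CANDIDATE_FORMATS = [
    "%b %d %H:%M:%S",          # classic syslog: "Jun 07 13:45:09" (no year)
    "%Y-%m-%d %H:%M:%S",       # many application logs
    "%m/%d/%Y %I:%M:%S %p",    # some Windows exports: "06/07/2011 01:45:09 PM"
]

def normalize(timestamp: str, assume_year: int = 2011) -> str:
    """Parse a raw timestamp in any known format and return ISO 8601 UTC."""
    for fmt in CANDIDATE_FORMATS:
        try:
            parsed = datetime.strptime(timestamp, fmt)
        except ValueError:
            continue
        if parsed.year == 1900:              # syslog omits the year
            parsed = parsed.replace(year=assume_year)
        return parsed.replace(tzinfo=timezone.utc).isoformat()
    raise ValueError(f"unrecognized timestamp format: {timestamp!r}")

print(normalize("Jun 07 13:45:09"))
print(normalize("06/07/2011 01:45:09 PM"))
```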

I asked Bill Roth, executive vice president of marketing at LogLogic (one of the report’s sponsors), what’s lacking. “The state of standardization is poor,” he explained. “There are fake standards like CEF and then there are underdeveloped CEE. My rant against CEF is that it is not open. Arcsight lists it on their Web site but does not provide the specification. There is no way any other vendor would implement it.”

When asked about log management tools, respondents said real-time alerts remain the most important feature, and their biggest gripe is how poorly log management systems interface with third-party tools. Windows systems were cited as particularly unfriendly for log analysis; it is difficult to draw out and normalize log data from these systems.

The survey points to respondents’ dissatisfaction with searching and reporting. According to Roth, it’s a two-fold problem. “First, everybody hates search. You hate Google Search and Yahoo Search. Search is hard. The issue is not to make people like search, but to make them hate it less, so search will always be a problem. Second, challenges from reporting happen because everyone expects their reports in exactly the way they want them, so LogLogic realizes this and that's why we're spending a huge amount of investment on reporting capabilities.”

That could be a big challenge, given that respondents weren’t happy about the analytic capability of log solutions. Roth gets it. “People want the analytics for their business. Every business is unique. The typical problems with analytics tend to be that they are too slow and that they don't have access to enough data. That will be a perennial problem and again, we are investing to make it better.”

I asked Roth if there were any surprises in the survey results. “Yes, that so many people are doing some form of logging -- 89% is really high and indicates that the market has moved to a new phase where it means that nearly everyone is doing log management so that vendors have to get more innovative in order to survive.”

I asked Roth about what sources IT will add to its log collection. “Rail systems, satellites, and geo-spatial data,” he said. We’ll see a year from now.

The report is available at http://www.loglogic.com/sans. Access is free but registration is required.

-- James E. Powell
Editorial Director, ESJ



Key to Low-Cost Backup/Recovery: Putting Idle Resources to Work

Why is online storage so expensive when the cost of physical hard drives is falling as capacity rises? Why are some cloud storage vendors shutting down their services?

According to the executives I talked to at Symform, it’s because the pricing model and technology are all wrong.

Think about it: at least half of your hard drive’s capacity is likely empty -- on my desktop it’s closer to 75 percent. The drive just sits there spinning and costing you money. It’s a waste. That’s just the waste in your own organization. Now imagine a storage provider, which must maintain a data center with terabytes to spare, all sitting around unused. The biggest cost isn’t the hardware -- it’s the cost of electricity to run (and cool) drives that aren’t being used to their fullest.

The big benefit of server virtualization is that multiple applications can run on the same server, thus pushing server utilization up and overall costs down. That same principle -- making use of idle resources -- is what’s behind Symform’s storage-in-the-cloud service, which today is announcing a new pricing program that is a radical departure from traditional pay-by-the-terabyte plans.

Symform doesn’t maintain central data centers, which are money sinks. Instead, it puts to use the idle space on its customers’ hard drives. A small agent running on individual systems in your environment compresses and encrypts your data (using the AES-256 algorithm) and stores it on other systems within Symform’s customer base -- much like Hadoop makes use of unused CPU capacity among all the systems in a resource pool.

Using Symform’s model, you pay a flat fee based on a combination of number of employees and servers; then it’s an all-you-can-store model, with capacity determined by how much storage space you make available to other Symform customers. In other words, if you can provide 1 TB of data for storage for others, you can store 1 TB of your own data on other systems within the resource pool.

There’s a lot of sophistication behind that agent and Symform’s technology. The agent splits files apart into smaller chunks, which it then geographically disperses around the world (the company calls this geo-spacing), giving you the benefit of geographic decentralization to a much greater degree than most offsite storage providers offer. What if some storage locations are offline? Symform adds redundancy. It claims that only 64 of 96 pieces of a file need to be available for the company to be able to reassemble the pieces and recover a file. As machines fail, it automatically repopulates the fragments. This means 33 unrelated machines would have to fail at about the same time for Symform to lose customer data. Compare that to a data center. Security is assured; there is no way for another party to recover your file because (a) it’s encrypted and (b) no single site stores all of the pieces of your file.
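
That 64-of-96 claim describes an erasure-coding-style scheme: the system tolerates the loss of up to 32 fragments, and only a 33rd simultaneous failure (before repair catches up) makes a file unrecoverable. The quick Python sketch below works through that arithmetic; the per-machine failure probability is purely illustrative, not a Symform figure.

```python
# Sketch of the availability math behind a 64-of-96 fragment scheme.
# k fragments are needed out of n stored; up to n - k losses are survivable.
from math import comb

k, n = 64, 96                  # fragments needed / fragments stored (Symform's claim)
tolerated = n - k              # 32 simultaneous losses survivable; the 33rd loses data
print(f"Survives up to {tolerated} lost fragments; data loss requires {tolerated + 1}.")

# Probability that enough fragments survive, assuming each fragment-holding
# machine is independently unavailable with probability p (illustrative value).
p = 0.05
p_recoverable = sum(
    comb(n, failures) * p**failures * (1 - p)**(n - failures)
    for failures in range(tolerated + 1)
)
print(f"P(file recoverable) with p={p}: {p_recoverable:.10f}")
```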

Symform executives, including Praerit Garg (the company’s president and co-founder) and Bassam Tabbara (chief technology officer and the other co-founder) are disarmingly honest. Although security and reliability are critical, choosing a storage service, they admit, comes down to price. In that regard, the company has a strong case to make, claiming that its service can save an enterprise up to 50 percent when compared to other cloud storage providers, and in many cases, much more than that. For example, for an enterprise with 100 employees and five servers, Symform says savings of $3,900 to $136,400 are possible, depending on your provider.

According to a company statement, “all corporate laptops and desktops are included in this low price so CIOs don’t have budgets driving the choice of devices and data to protect.” That’s smart.

-- James E. Powell
Editorial Director, ESJ



User Access Frequently Certified Despite Infrequent Reviews

A global survey released today by Courion Corporation reveals that one-third of the 1,250 IT decision makers in large enterprises don’t think their enterprises have accurately assessed their level of IT risk from both internal and external threats.

The survey points out that almost a quarter of the companies (23 percent) say they don’t have a formal IT risk management program. Even those that do don’t regularly review user data access rights.

That’s no surprise. As with many security surveys I’ve read, contradictions abound. For example, over two-thirds (67 percent) of respondents say their company has a formal IT risk management program. Of these, nearly 91 percent say identification of user access is a core component of that program, and 90 percent say identity and access management are part of the program.

How often are individual user access or entitlements reviewed? Not very. Thirteen percent say at least monthly, and a quarter (26 percent) say quarterly. After that, regular reviews aren’t a priority. Nearly one in five respondents (19 percent) say they review access or entitlements yearly, and nearly 24 percent make such reviews “occasionally but not at regularly scheduled intervals.”

Despite such lax reviews, 59 percent of organizations require their business managers “to determine and/or certify the proper access rights of employees working under their supervision,” and 53 percent require resource/asset owners “to determine and/or certify the proper access rights of employees working under their supervision.” Even more frightening: 79 percent say they are “identifying/certifying access to sensitive data.” One has to wonder: how can you make such certifications with a clear conscience?

User access reviews make good business (and security) sense. Given that security admins tell me that their biggest threats are internal, not external, it would be wise to focus again on who has access to what.

Such reviews can turn up a host of problems. For example, 56 percent of respondents in Courion’s survey identified users who still had access from a prior job role, and 36 percent found “zombie” accounts (access for terminated employees). Nearly half (48 percent) of respondents discovered users with “excessive rights” and 39 percent found “inappropriate privileged/super user access” rights improperly granted.
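
The survey doesn’t say how respondents uncovered these accounts, but the most basic version of the check is a comparison of the directory’s active accounts against an authoritative HR roster and a per-role entitlement baseline. A toy Python sketch, with entirely hypothetical accounts and entitlements:

```python
# Toy sketch of a user-access review: flag "zombie" accounts (still enabled
# for terminated employees) and entitlements left over from prior roles.
# All names, roles, and entitlements below are hypothetical.

active_accounts = {
    "asmith": {"entitlements": {"gl-post", "ap-approve"}},
    "bjones": {"entitlements": {"ticket-admin", "gl-post"}},
    "cdoe":   {"entitlements": {"vpn", "crm-read"}},
}
hr_roster = {"asmith": "accounting", "bjones": "helpdesk"}   # cdoe no longer employed
role_baseline = {"accounting": {"gl-post", "ap-approve"}, "helpdesk": {"ticket-admin"}}

for user, info in active_accounts.items():
    if user not in hr_roster:
        print(f"{user}: zombie account (not on HR roster)")
        continue
    excess = info["entitlements"] - role_baseline.get(hr_roster[user], set())
    if excess:
        print(f"{user}: entitlements beyond role baseline: {sorted(excess)}")
```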

Access reviews aren’t the only deficiency. Organizations also have room for improvement in monitoring user activity and data movement. Only 64 percent say their organization actively monitors user activity for systems and applications; 87 percent monitor data on the network or in the data center. Only 36 percent monitor “the movement of sensitive data with a data loss prevention” application.

-- James E. Powell
Editorial Director, ESJ



Energy Sector Survey Shows Sorry State of Security

A new survey sponsored by Q1 Labs is full of surprises, and not the kind of surprises you’d hope to find.

For example, more than 75 percent of the 291 IT and IT security practitioners in global energy organizations surveyed say they have experienced at least one data breach in the last year, and more than two-thirds of organizations think a data breach is likely or very likely to occur in the next year.

State of IT Security: Study of Utilities & Energy Companies was designed “to better understand how global energy and utility organizations determine their state of readiness in the face of a plethora of information security, data protection, and privacy risks,” Q1 said in a release. From what I’ve read, the bottom line seems to be: they aren’t ready.

Over half of global energy enterprises don’t view IT security as a strategic initiative. Furthermore, “71 percent of IT Security executives at global energy producers state that their executive management team does not understand or appreciate the value of IT Security.” Q1 Labs’ Tom Turner, senior vice president of marketing and channels, didn’t mince words about that finding when we spoke last week.

“There’s clearly a disconnect between the IT practitioners who understand the risks and the C-suite of executives who end up funding the projects that address them,” he told me. “This survey becomes a validation point as well as a persuasive sales tool for IT security practitioners to make their case more effectively with their management about why they want to put better security intelligence systems in place and make their networks more resilient.”

Amen to that. IT security experts are going to need help. Turner notes that the survey found that spending on physical security was 10 times that of IT security spending.

Dr. Larry Ponemon, founder and chairman of the Ponemon Institute (which conducted the survey), noted: “One of the scariest points that jumped out at me is that it takes, on average, 22 days to detect insiders making unauthorized changes, showing just how vulnerable organizations are today. These results show that energy and utilities organizations are struggling to identify the relevant issues that are plaguing their company from a security perspective.” The question is -- do they know what’s going on? The survey found that nearly three in four respondents (72 percent) claim that initiatives “are not effective at getting actionable intelligence (such as real-time alerts, threat analysis, and prioritization) about actual and potential exploits.”

Confirming what I’ve heard from a number of security experts, the survey found that 43 percent of respondents say the top security threat faced by their organization is negligent or malicious insiders; that’s also the top cause of data breaches.

Fewer than one-fourth of organizations (a meager 21 percent) believe their existing controls can protect them against exploits and attacks “through smart grid and smart meter-connected systems,” and two-thirds aren’t using state-of-the-art technologies to minimize risks to their SCADA networks. In fact, 77 percent of organizations say that compliance with industry-related regulations isn’t a priority, even though it’s the second-ranked security objective. Over half of respondents say that the regulatory environment has no impact on the effectiveness of their IT security.

“We were really taken aback by some of the results -- especially that 71 percent of respondents believe that C-level executives don’t understand or appreciate security initiatives. This is further demonstrated by the statistic that the physical security budget is about 10X the information security budget,” said Turner in a statement. “IT Security in these organizations has the challenging task of protecting Critical Infrastructure against breach. Against a backdrop of Wikileaks, the Nasdaq Hack, the RSA breach, and the energy-specific Stuxnet virus, we have found that customers are crying out for Security Intelligence.”

I asked Turner if it’s going to take a major threat to get executives to pay attention. “Nothing focuses the mind quite like a breach that happens to you. The second best mind-focusing event is a breach that occurs to one of your peer group.”

If he conducts the survey again in a year, what will he expect to find based on the trends he’s seen? “I expect we’ll see a reduction in the number of IT security professionals who don’t think that executive management understands the importance of IT security -- very likely because we will have seen a significant exploit against a utility or energy company. We’ll also see that networks are getting better at providing security intelligence.”

-- James E. Powell
Editorial Director, ESJ
