IBM Takes Top Spot in Leadership Study

If you're looking to develop good leaders in your IT organization, consider using IBM as your model.

In Fortune magazine’s latest study of organizational leadership around the world, IBM placed first among 470 global companies “committed to building leadership capabilities within their organizations.” According to Big Blue, “An expert panel of independent judges selected and ranked winners based on criteria including strength of leadership practices and culture, examples of leader development on a global scale, impact of leadership in communities in which companies operate, business performance, and company reputation.”

An announcement from IBM quotes the Fortune survey as pointing out that “leadership, more than ever, is the single largest determinant of competitiveness in business, and that the demand for leaders will only intensify in the current business environment.” That could easily be said about IT leadership as well.

Constant evaluation and planning are critical. Of those ranked as “top companies” in the study, 92 percent “measure their leadership development processes, compared to 61 percent of the other companies.” In addition, 84 percent of top companies evaluate “their high potential programs, while only 49 percent of the rest do the same.” The study reports that “92 percent of Top Companies measure the strength of their leadership pipeline, as well as their ability to retain leaders, while 60 percent of all other companies measure these two areas.” The ability to retain high-potential/critical talent and the “ability to fill key positions with internal candidates” are also characteristics of the top firms.

"This recognition reflects IBM's ongoing commitment to developing leaders from deep within our global organization. This is a discipline that is both world class and uniquely IBM," said Randy MacDonald, IBM's Senior Vice President of Human Resources. "As we enter our second century, leadership development will remain at the top of our agenda as we groom the next generation of leaders skilled at collaborating across teams, cultures, countries and businesses."

Leadership development takes work, a commitment I haven’t seen in most IT shops where I’ve worked. The top companies “are passionate and committed to leadership development. Their leadership programs are practical, relevant and aligned with business goals. Top Companies have an intense focus on talent, and they are deliberate about hiring, coaching, developing, and rewarding success. Finally, leadership development at these organizations is an embedded practice and mindset.”

The study, first released in 2002, was conducted by Aon Hewitt, a human resource consulting and outsourcing firm, and its partner, The RBL Group, an advisory firm specializing in strategic HR and leadership. More about the study can be found here.

-- James E. Powell
Editorial Director, ESJ

Posted by Jim Powell


Traditional Security Safeguards Insufficient, New Study Finds

Results from a new application delivery networking survey took stock of the effects of complex network attacks on enterprises and the security measures they’re taking to guard against them. It’s always been a race between IT security admins and hackers, but the survey results are startling. More than a third (36 percent) of respondents reported that their firewalls had failed under the load of denial-of-service (DoS) attacks at the application layer, 42 percent had a DoS-related firewall failure at the network layer, and all 1,000 large corporations (spread among 10 countries) reported losses from cyber attacks within the last year at an average cost of $682,000.

According to the report, “the front line has shifted from layer 4 to layer 7 attacks. While most traditional safeguards can handle layer 4 threats like SYN Flood DoS attacks, layer 7 threats, such as SlowLoris, are trickier. They get by layer 4 defenses because they look like legitimate traffic.”

“In effect,” the report notes, “hackers have raised the ante. It is now IT’s turn to respond.” Application delivery controllers (ADCs) are one possible solution, since they understand the context of the network traffic and manage all layers, and 92 percent of respondents said they “see specific roles for ADCs.” In fact, one-third of respondents already use ADCs for security, and half of all respondents “say ADCs can replace many or most traditional safeguards.”
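The report’s point about SlowLoris slipping past layer 4 defenses can be made concrete with a small sketch. This is my illustration, not anything from the survey: both a legitimate request and a SlowLoris request are ordinary TCP connections carrying well-formed HTTP header bytes, so a layer-4 device sees nothing amiss. The difference only shows up at layer 7, in whether the headers ever actually finish.

```python
# Illustrative sketch (not from the report): a layer-4 device inspecting
# packets sees valid TCP carrying valid-looking HTTP bytes in both cases.
# SlowLoris trickles header fragments out slowly and never sends the
# terminating blank line, so the server holds a worker slot open forever.

NORMAL_REQUEST = (
    b"GET / HTTP/1.1\r\n"
    b"Host: shop.example.com\r\n"
    b"User-Agent: Mozilla/5.0\r\n"
    b"\r\n"  # blank line: headers complete, server can respond and move on
)

# Each fragment is a perfectly legitimate header line; the attack is in
# the timing (one fragment every few seconds) and the missing terminator.
SLOWLORIS_FRAGMENTS = [
    b"GET / HTTP/1.1\r\n",
    b"Host: shop.example.com\r\n",
    b"X-a: 1\r\n",
    b"X-b: 2\r\n",  # ...and the final "\r\n" never arrives
]

def headers_complete(data: bytes) -> bool:
    """A layer-7 check: has the blank line ending the headers been seen?"""
    return b"\r\n\r\n" in data

print(headers_complete(NORMAL_REQUEST))                 # True
print(headers_complete(b"".join(SLOWLORIS_FRAGMENTS)))  # False
```

Only something operating at the application layer, such as the ADCs the report describes, can apply a check like this and time out connections whose headers never complete.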

Of the top types of attacks, the five toughest to combat are DNS attacks, network-layer DoS, access of encrypted data, misconfigurations, and application-layer DoS attacks. More than half of respondents noted that the impact of security safeguards on network performance “is somewhat or extremely challenging.”

“The effects of cyber attacks can be crippling,” the report points out, and that should surprise no one. The most frequently mentioned cost was lost productivity (cited by 50 percent of respondents), followed by loss of data (43 percent) and loss of revenue (31 percent).

The survey was sponsored by F5 Networks and conducted by Applied Research in September. Respondents reported having a role in which they spend at least one-fourth of their time on security matters.

-- James E. Powell
Editorial Director, ESJ



Will Dell be the Beneficiary of HP’s PC Blunders?

Hewlett-Packard is staying in the PC game after all.

On Thursday, HP issued a press release saying the company “completed its evaluation of strategic alternatives for its Personal Systems Group (PSG) and has decided the unit will remain part of the company.” Though not technically a flip-flop or Netflix-style fiasco, HP’s announcement makes me wonder about the management skills of the company’s board of directors.

The release quotes new president and CEO Meg Whitman’s explanation for the change of heart: “HP objectively evaluated the strategic, financial, and operational impact of spinning off PSG. It’s clear after our analysis that keeping PSG within HP is right for customers and partners, right for shareholders, and right for employees. HP is committed to PSG, and together we are stronger.”

Well, that’s certainly one way to look at the situation. Here’s another: It was stupid for the board to consider spinning off its PC division in the first place, and it was even stupider to go public with its ruminations (that is, to think out loud). Customer reaction wasn’t positive, to say the least.

A representative of Dell reached out to me with news of a new IDG Research survey, which found that among the 302 IT decision makers polled, 79 percent report that Dell would be among the vendors they would consider if HP stopped manufacturing PCs, followed by Lenovo (49 percent), Apple (12 percent), Toshiba (7 percent), and Acer (6 percent).

HP could lose more than just desktop PC sales; the study found that “41 percent are considering new or additional server vendors as a result of HP’s recent activity,” which included leadership changes and its announcement about acquiring Autonomy Corporation. Even worse: “of those, 64 percent would consider Dell.” (IBM was second at 52 percent, followed by Cisco at 18 percent and Oracle at 12 percent.)

That couldn’t (and shouldn’t) have pleased HP, especially given Dell’s continuing (unwavering) commitment to end-to-end solutions. Adding fuel to the fire: 34 percent of those interviewed are considering new or additional services vendors; of those naming preferred vendors, IBM ranked first, Dell was second.

HP’s moves were particularly puzzling because they showed the board of directors didn’t understand the company’s customers. Most IT managers I know want to simplify acquisitions, buy the bulk of their equipment from a limited number of suppliers, and be able to call just one of a few phone numbers for help. IDG’s survey backs up that sentiment, finding that “45 percent of organizations say it is important for their technology vendors to offer complete end-to-end solutions.”

Dell should also be pleased by a September 2011 survey from Technology Business Research (TBR) that said a majority of the 130 HP customers in the U.S. with 500 or more employees “have become concerned with the direction the company is taking.” TBR says Dell would “likely be their provider of choice if they decide to make a change.”

HP’s news wasn’t just grist for the water cooler; IT managers were giving it careful consideration. The TBR survey found that following HP’s initial announcements, 46 percent of respondents were less likely “to purchase HP products and services; for companies 1,000 to 4,999, this sentiment rose to 53 percent.” Also troubling for HP, “47 percent of respondents using HP PCs or mobile devices and 23 percent of those using HP servers indicated they were investigating alternatives.”

TBR “asked HP customers if, based on the announcements, they felt HP was well-managed with a clear vision of the future or if they felt HP was struggling and unsure of what to do next,” according to a press release from Dell. At the time, 63 percent chose the “struggling” answer.

When HP initially floated the idea of spinning off PSG, executives were quoted in popular media saying that margins were slim, keeping up to date with technology was tough, and PC sales were declining. I won’t dispute any of those statements, and yes, the popularity of tablets and smartphones as information access devices will cut into desktop PC sales. However, sometimes you have to put other factors ahead of profit, and understand that some divisions will run at a loss in order to serve your overall goal, such as being a full service provider.

HP’s Thursday release points out that “the data-driven evaluation revealed the depth of the integration that has occurred across key operations such as supply chain, IT, and procurement. It also detailed the significant extent to which PSG contributes to HP’s solutions portfolio and overall brand value.” At least someone woke up to realize the value of the brand. “Finally, [the analysis] also showed that the cost to recreate these in a standalone company outweighed any benefits of separation.” Well, duh.

During an investor presentation on Thursday (full transcript here), Cathie Lesjak, Hewlett-Packard’s CFO, laid out a variety of factors the company considered, and noted that “the annual synergies between HP and PSG exceed $1 billion in operating profit per year, and enable HP to be more competitive as a whole.” Are you telling me they didn’t know this before announcing a possible spin-off? If so, how can they justify their positions on the board of a major corporation?

Can HP regain customer loyalty? Time will tell. The last thing IT needs right now is confusion. Fear, uncertainty, and doubt is one marketing technique, but it’s rare to see a company instill FUD about its own future.

With so much to manage, IT managers simply don’t need HP muddying the waters.

HP must heed that familiar naval warning: Loose lips sink ships. When you don’t know what you’re going to do, you don’t float ideas in public. You keep quiet. If you need to test the waters, you talk to selected clients.

Perhaps HP’s board of directors should take an MBA refresher course to relearn that it costs much more to acquire a new customer than it does to retain one.

-- James E. Powell
Editorial Director, ESJ



2012 Budget, Salary Projections Revealed by Society for Information Management Survey

With organizations beginning to focus on their 2012 budgets, findings from a survey by the Society for Information Management (SIM) are timely indeed.

In last year’s survey of CIOs, “business productivity and cost reduction” was at the top of the list of concerns. This year, that dropped to fourth place. In the lead: “IT and business alignment,” followed by “business agility and speed to market” and “business process management and reengineering.”

There’s good news for IT professionals in the survey results: 51 percent of the 275 CIOs answering the survey said they will increase their budgets in 2012, and 34 percent plan to keep them the same. The survey also found that 83 percent of respondents said their IT budget was at least equal to their 2010 actuals (money actually spent last year).

Also good news: 94 percent said staff salaries will remain the same or increase next year. Staff turnover has remained about the same over the past three years; year to date it stands at 7.06 percent, down from 7.10 percent last year and below the seven-year average of 7.46 percent. Overall, IT staff salaries increased over 2010’s salaries at 66 percent of organizations, remained the same at a quarter (26 percent), and declined at 8 percent. The pattern is projected to be similar next year: 67 percent of participants say salaries will rise in 2012, 27 percent say they’ll remain the same, and 6 percent expect them to decline.

Not-so-great news: the percentage of corporate revenue allocated to IT’s budget has dropped to 3.55 percent this year from 3.87 percent last year, though it’s still near the seven-year average of 3.68 percent.

What has IT been spending its money on so far this year? Business intelligence tops the list, followed by cloud computing, enterprise resource planning (ERP) systems, mobile and wireless applications, and customer relationship management (CRM) systems. The latter two are new to the top five this year.

Jerry Luftman, a distinguished professor at the Stevens Institute of Technology and the lead researcher for the survey, will present the full survey results at SIM’s annual conference, SIMposium, in November. Details are here.

-- James E. Powell
Editorial Director, ESJ



How to Ensure Your E-commerce Site is Ready for the Holiday Rush

Recently, Apica, a load testing and performance-monitoring provider for both cloud and mobile applications, released its top 10 tips for ensuring your e-commerce site is ready for the holiday shopping season. I found several of the tips interesting and asked the author and Apica’s CEO, Sven Hammar, for more details.

Hammar’s first tip is:

Put vanity aside and reduce the amount of high-resolution images and video on your site in order to minimize response times. If you’re too in love with the bulky images, then be sure to invest in systems that can handle short response times despite high-resolution content.

So, what, exactly, is Hammar’s definition of “too many” images?

“There are two aspects that should be considered here, image size and the number of images,” he says. “The size of the images shouldn’t exceed 50 KB and they should be structured in separate DNS names (i.e., static.site.com/image1). This is important so that they are easy to front-end cache and store in CDNs. The number of images and URLs should be as few as possible. A good number to aim for is not more than 200.”

Fair enough.
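Hammar’s thresholds lend themselves to a simple automated check. The sketch below is my own illustration, not an Apica tool; the asset list and hostnames are invented, and in practice you’d extract the real list from your page’s HTML.

```python
# A minimal sketch of auditing a page against Hammar's numbers: images
# no larger than 50 KB, no more than roughly 200 images/URLs total, and
# images served from a separate static hostname so they front-end cache
# and CDN well. All data below is hypothetical.

MAX_IMAGE_KB = 50
MAX_URLS = 200
STATIC_HOST = "static.site.com"  # the cache-friendly hostname Hammar suggests

assets = [
    {"url": "http://static.site.com/image1.jpg", "kb": 42},
    {"url": "http://www.site.com/hero-banner.png", "kb": 310},  # two violations
    {"url": "http://static.site.com/logo.png", "kb": 8},
]

def audit(assets):
    """Return a list of human-readable problems found on the page."""
    problems = []
    if len(assets) > MAX_URLS:
        problems.append(f"too many URLs: {len(assets)} > {MAX_URLS}")
    for a in assets:
        host = a["url"].split("/")[2]  # hostname portion of the URL
        if a["kb"] > MAX_IMAGE_KB:
            problems.append(f"{a['url']}: {a['kb']} KB exceeds {MAX_IMAGE_KB} KB")
        if host != STATIC_HOST:
            problems.append(f"{a['url']}: not served from {STATIC_HOST}")
    return problems

for p in audit(assets):
    print(p)
```

Run against the sample data, the oversized banner is flagged twice: once for weight and once for being served from the main domain rather than the static host.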

His second tip is:

Consider using a CDN/Accelerator service to accelerate the delivery of rich content such as images and videos to customers. These services aren’t terribly expensive and the upside is huge.

Hammar recommends creating a sub-domain such as static.domain.com. “Once created, you can then create a Cname that points to your CDN provider. This is the only technical part that has to be done from the customer point-of-view.

“If you have done this correctly, every picture, CSS, etc. that is placed in the sub-domain automatically gets cached in the CDN because it points directly to the CDN provider. This will accelerate the page when traffic to that page is heavy and is only stored in the CDN for a preset amount of time. You set this up with your CDN provider and work it as a kind of ‘off-button’ when there’s little traffic to the page.”
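The pattern Hammar describes has two halves: one DNS record, and URL rewriting in the application. A rough sketch, with all names invented for illustration:

```python
# Hypothetical illustration of Hammar's CDN sub-domain pattern.
# The DNS half is a single record at your provider, roughly:
#
#     static.domain.com.  IN  CNAME  yourname.cdnprovider.net.
#
# The application half is just pointing asset URLs at that sub-domain
# so every request for them resolves through the CDN. The hostname and
# helper below are made up for this sketch.

STATIC_HOST = "static.domain.com"

def to_cdn_url(path: str) -> str:
    """Rewrite a site-relative asset path (/img/x.png, /css/main.css)
    to the CDN-backed sub-domain so the provider caches it automatically."""
    return f"http://{STATIC_HOST}{path}"

print(to_cdn_url("/img/product-42.jpg"))
# http://static.domain.com/img/product-42.jpg
```

Because the CNAME is the only DNS change, switching the CDN off (or swapping providers) is a matter of repointing one record rather than touching the application.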

Hammar’s fifth tip is something all IT departments should be doing: monitoring performance.

Periodically test, monitor and optimize your site to ensure a great consumer experience. Web testing companies can test and optimize your site, simulating peak loads by using ‘synthetic traffic,’ and then suggesting improvements. These companies often offer complimentary surveillance services.

This made me wonder -- what happens if you’re not diligent about proactive testing, monitoring, and optimizing? What then?

“Most sites are constantly being updated with new features, new demands, and new infrastructure,” Hammar replied. “The changes you make to your site will typically introduce an escalating ‘slow’ feeling. If you don’t react proactively, you will soon reach critical load-time levels and the site will become unstable. If this happens you will lose visitors by the second.

“Outside monitoring from the browser view will give you a baseline so you can pick up on minor deteriorations or have the time to react after a deployment when the response times increase 20 percent due to a missed configuration. Load testing validates the stability of the system and reduces the risk of system crashes.

“The most important aspect in the configuration is capacity. Performing a load test is the only way to make sure that you have the optimal configuration and have picked the correct number of servers. For example, should you have a medium instance in Amazon, determine the best configuration of your cache: 1 minute, 2 minutes, 10 minutes, etc.”
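To show what “synthetic traffic” amounts to in miniature, here is a toy harness of my own devising, not a real testing product: it fires concurrent requests at a handler and reports latency percentiles. The handler here is a local stand-in; a real test would drive traffic at your actual site.

```python
# Toy illustration of load testing with synthetic traffic. The p50/p95
# percentiles matter because averages hide the slow tail that users
# actually feel at peak load.

import time
import random
from concurrent.futures import ThreadPoolExecutor

def handler():
    """Stand-in for a page request; simulates variable response time."""
    time.sleep(random.uniform(0.001, 0.01))

def load_test(requests=200, concurrency=20):
    def timed_call(_):
        start = time.perf_counter()
        handler()
        return time.perf_counter() - start
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        latencies = sorted(pool.map(timed_call, range(requests)))
    return {
        "p50": latencies[len(latencies) // 2],
        "p95": latencies[int(len(latencies) * 0.95)],
    }

results = load_test()
print(f"p50: {results['p50']*1000:.1f} ms, p95: {results['p95']*1000:.1f} ms")
```

Rerunning a harness like this after each configuration change (cache TTL, server count) is how you validate Hammar’s point that capacity decisions need measurement, not guesswork.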

Finally, his ninth tip caught my eye:

Use your analytics tools to identify the top three-to-five business processes customers are conducting on your site, and maximize them for peak performance.

I wanted to know what typical processes an e-tailer should look at first, and how much extra bang for the buck can you get quickly, given that the holidays are almost upon us.

“With tools like Google Analytics, you can see which paths are used and where the users start. Response-time analytics, together with page access, reveals when conversion is impacted by slow response times. If the checkout process takes 10 seconds vs. 40 seconds, how much is the conversion rate affected?

“’Search’ is a good example where you can easily find data from your peers in the industry. Leading performance vendors will provide you with a custom-built index that can serve as an indicator for management on how you are doing. Fast is always good -- fast builds brands, fast improves conversion, and fast makes e-retailers more money.

“Benchmarking your site will provide an action list where you can identify the most pressing problems. A typical action is to continuously remove the five worst-performing pages or functions, thus always working proactively to increase performance on your website.”

You can read all ten tips here. (No registration is required.)

-- James E. Powell
Editorial Director, ESJ



McAfee Survey Shows Disconnect between Security Perceptions, Reality

Results of a new survey show a “serious disconnect between security perceptions and reality” among enterprise IT security managers, according to survey sponsor McAfee.

The 2011 Data Center Security Survey, conducted by Gabriel Consulting Group (GCG), polled 147 enterprise data center managers about security issues at their organizations. Most respondents (60 percent) say that their organization’s management believes “security is stronger than it actually is,” and just under a quarter (22 percent) believe management knows about their company’s actual security preparedness.

“It’s astounding that almost two-thirds of our respondents say that their management is in the dark about their true security status,” said Dan Olds, principal analyst at Gabriel Consulting Group, in a prepared release from McAfee. “This is something that should cause a lot of thought both in the executive suite and in the data center. Management needs to seek out the truth when it comes to IT security, and data center management needs to be frank and honest when discussing the strengths and weaknesses of their security mechanisms. Obviously, it’s far better to discuss potential security issues before they’re exposed by a breach.”

Almost half of the survey participants said “virtualization and private clouds pose a unique security challenge.” Despite that, most use the same tools “to secure both physical and virtualized systems.” About 20 percent claimed their organization had been breached in the past 18 months; over 60 percent of those breaches were from outsiders. Eighty percent said they lost worker productivity as a result.

When it comes to remediation, the survey revealed a dramatic split among respondents. Remediation includes breach discovery, damage assessment, notification, and problem correction. Over 40 percent claimed that “their breach remediation was an ‘all hands on deck’ effort” that required at least half of their IT staff and other resources. For most others (48 percent), remediation required no more than 20 percent of their IT resources.

Likewise, there was a split in the time it took to get the job done. “Almost half reported that their efforts took one week [or less, but] just under 40 percent said that remediation took at least a month -- or longer.”

Among the other results from the GCG report:

  • Over four in ten respondents believe their organization’s “security pace isn’t keeping up with threats”
  • About 70 percent are skeptical about security in public clouds
  • Four in ten (40 percent) say everyday security doesn’t conform to their official policies and standards

-- James E. Powell
Editorial Director, ESJ



IBM Declares 2011 "Year of the Security Breach"

If you're wondering when malware will become a real problem for mobile devices, your wait is over. So says Tom Cross, manager of Threat Intelligence and Strategy for IBM X-Force.

According to IBM's new X-Force 2011 Mid-Year Trend and Risk Report, Big Blue predicts that by the end of 2011, the number of exploits will be double what it was in 2010. A key target: mobile devices.

The X-Force team says many mobile phone vendors don't issue security updates quickly enough. Mobile devices are an increasingly popular target simply because of the incredible size of the market, and the team notes that mobile computing threats are enabled, in part, through malware distributed on third-party app sites. Some of this malware collects users’ personal information, which can be used for identity theft or phishing attacks. Other malware can spy on users' personal communications or track physical movements using the GPS features built into their devices.

Speaking of phishing, the report uses a term new to me -- whaling -- to describe "a type of spear phishing which targets 'big fish' or those positioned in high levels of an organization with access to critical data." Forget blasting messages to everyone and hoping the law of averages pays off. Whaling attacks are "often launched after careful study of a person's online profiles" that give attackers the information they need to be successful. Through a combination of "stealth, sophisticated technical capabilities, and careful planning," teams of professional hackers are collecting the information they need to access critical network resources.

There are some bright spots in the X-Force report. For instance, in the first half of this year, Web application vulnerabilities dropped from 49 percent of all vulnerability disclosures to just 37 percent -- a first in the five years the team has been tracking such data. Also encouraging: "High and critical vulnerabilities in Web browsers were also at their lowest point since 2007" and spam volume has declined "significantly" through the first half of this year. To no one's surprise, when botnet operators are stopped, the number of spam messages drops and phishing attacks decline.

The biggest source of spam has moved to the Asia Pacific region; India accounts for 10 percent of all spam, with South Korea and Indonesia making it into the top five as well. That explains why IBM has opened a new IBM Institute for Advanced Security in the region (joining existing Institutes in Brussels, Belgium and Washington, D.C.).

Financial gain is a key driver, but increasingly attacks are done for political reasons. The X-Force team says "hacktivist" groups are using well-known techniques such as SQL injection against Web sites. Also highlighted in the report: a tripling in the number of anonymous proxies in the last three years.

What isn't new: some of the techniques hackers use. For example, attacks on weak passwords are still a popular approach, as are SQL injection attacks. Exploitation of JavaScript is still successful: of the nearly 700 Fortune 500 and other company Web sites IBM tested, 40 percent contain client-side JavaScript vulnerabilities.
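The persistence of SQL injection is striking because the standard defense is so well established. The report doesn't prescribe a fix, but the usual one is parameterized queries, where user input is bound as data rather than spliced into the SQL text. A minimal sketch using Python's built-in sqlite3:

```python
# Why SQL injection still works, and the standard countermeasure.
# This is a generic illustration, not anything from the X-Force report.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 1)")

attacker_input = "x' OR '1'='1"  # classic injection payload

# VULNERABLE: string concatenation lets the payload rewrite the query,
# turning it into ... WHERE name = 'x' OR '1'='1' (matches every row)
vulnerable = conn.execute(
    "SELECT * FROM users WHERE name = '" + attacker_input + "'"
).fetchall()
print(len(vulnerable))  # 1 -- the OR clause matched the whole table

# SAFE: the ? placeholder binds the payload as a literal string value
safe = conn.execute(
    "SELECT * FROM users WHERE name = ?", (attacker_input,)
).fetchall()
print(len(safe))  # 0 -- no user is literally named "x' OR '1'='1"
```

This is the kind of technical fix the report alludes to being understood; the gap it identifies is organizational, in getting such practices applied consistently across a company.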

The report warns that "Although we understand how to defend against many of these attacks on a technical level, organizations don't always have the cross-company operational practices in place to protect themselves."

As the "eyes and ears of thousands of IBM clients", the X-Force team gathers security intelligence using public disclosures and its own monitoring of 12 billion daily security events. The full report is available at no cost here (though a short registration is required).

-- James E. Powell
Editorial Director, ESJ



Social Engineering Strikes Almost Half of Enterprises, Survey Shows

A new global survey of more than 850 IT and security professionals from Check Point Software Technologies Ltd. (a company specializing in Internet security) found that almost half (48 percent) of the enterprises it questioned have suffered from attacks that exploit social engineering -- and suffered more than once. Those surveyed said they had experienced 25 or more attacks over the last two years, at a cost of between $25,000 and $100,000 per security incident.

The report, The Risk of Social Engineering on Information Security, puts phishing and social networking tools at the top of the list of social engineering threat sources. Phishing e-mails were the most common vector (cited by 47 percent of respondents), with social networking sites that reveal personal or professional information close behind at 39 percent. The report notes that “social engineering attacks are more challenging to manage since they depend on human behavior and involve taking advantage of vulnerable employees.” Hackers use several techniques and social networking applications to determine an organization’s “weakest link.”

Enterprises are aware of the problem -- at least in the abstract. According to the survey, 97 percent of security professionals and 86 percent of all IT professionals “recognize social engineering as a growing concern.” Most (51 percent) say financial gain is the top motivator, followed by proprietary information (46 percent), competitive advantage (40 percent), and revenge (14 percent).

When it comes down to the individual organization, however, the numbers aren’t so strong. Although 43 percent know they’ve been targeted by social engineering attacks, 41 percent were not aware whether their organization had been attacked. Sixteen percent said they hadn’t been the target of social engineering. Worse, only a quarter (26 percent) conduct ongoing training to inhibit or prevent the success of such attacks, and a third (34 percent) don’t make any attempt to educate their employees.

There are cost savings to be realized from such training and education. Survey participants estimated that each security incident costs anywhere from $25,000 to over $100,000, “including costs associated with business disruptions, customer outlays, revenue loss, and brand damage,” the company notes.

The company points out that the “prevalence of Web 2.0 and mobile computing has also made it easier to obtain information about individuals and has created new entry points to execute socially-engineered attacks.” Still, it boils down to the victims themselves -- the richest source of information comes from new employees (cited as susceptible by 60 percent of respondents) and contractors (44 percent); both categories may be ignorant of or unfamiliar with corporate security policies. Another rich source of information for social engineering exploits: executive assistants (38 percent). IT staff -- who presumably should know better -- were noted as being at high risk by 23 percent of respondents. That’s not a comforting number.

Conducted in July and August, the survey focused on workers in the U.S., Canada, UK, Germany, Australia, and New Zealand at organizations of all sizes and in several industries (such as finance, defense, retail, health care, and education).

-- James E. Powell
Editorial Director, ESJ
