According to Chris Dowse, CEO of consulting firm Neochange, most organizations receive less than half of the benefits they want from their IT investments. What accounts for this “value gap”? Complex IT adoption, for one.
The company is currently collecting input from enterprises for its fourth annual survey, and it would like your insight into the challenges and trends you’re facing. This year’s survey focuses on the business impact of a variety of end-user strategies as well as on a deeper understanding of adoption barriers.
The survey asks how well groups within your organization (from end-user support staff to front-line managers) are executing their responsibilities, and at what point you get involved with user adoption in your projects. Neochange also wants to understand how long you’ve been using end-user technologies such as end-user experience monitoring, education simulations, and self-service education. How do your users learn about their applications, and how actively engaged are they?
Neochange will share their results with ESJ in a future issue of Enterprise Strategies. If you’d like to participate, visit their survey site.
-- James E. Powell
Editorial Director, ESJ
A monthly index of IT jobs maintained by TechServe Alliance found that there are now 4,068,400 jobs in the IT sector, an increase of 7,100 jobs since October. The new total jobs figure represents a 2.1 percent increase -- 84,000 jobs -- since November 2010.
Though these figures don’t seem impressive, they compare favorably with overall labor market growth of 0.1 percent in the last month and 1.2 percent in the last year.
In a prepared statement, Mark Roberts, TechServe Alliance’s CEO, said he remains “very bullish on the prospects for continued growth in 2012.” He anticipates “IT employment will surpass its all-time high set back in June 2008.”
-- James E. Powell
Editorial Director, ESJ
A new survey of IT executives attending Gartner’s Application Architecture, Development and Integration (AADI) Summit shows that application development priorities have changed little in a year. “Deliver applications faster” was at the top of the list -- exactly where it was last year. Only the percentage has changed; this year, 68 percent of respondents said it was a top priority; the figure was 61 percent in last year’s Summit survey.
The survey was conducted by Serena Software, a firm that specializes in orchestrated application delivery, IT, and business processes. What the survey results tell me is that IT is still battling the problems it faced last year, though concern about meeting these challenges has grown. For example, the second and third spots on the list simply swapped positions from last year. This year, “Expand the use of Agile” was second on the list (third last year); “Reduce app dev costs” was third this year (down from second in 2010, despite tougher economic times and tighter IT budgets).
The survey also asked respondents about their top application development initiatives for the coming year. “Managing applications as a business process” came out on top. Second (up from fifth last year -- the only dramatic change I saw in the survey) was having “end-to-end traceability across different tools and the application development lifecycle” (65 percent of respondents want to tie production code back to the underlying business requirements).
The survey also found that “85 percent of respondents cited managing application development as a business process as ‘very important to extremely important’ to their organization.” Nearly half (47 percent) put “standardization on methodology” on the list (placing it fourth overall), and 47 percent said “increased innovation” would be among their development priorities next year (putting it fifth in the list).
Serena says the survey revealed that “most software delivery teams today have the right tools, roles, and functions in place,” but IT’s biggest challenge is “in finding a way for these elements to effectively work together from initial request to release into production, so the entire IT organization can be as efficient and cost effective as possible. By putting effective processes in place, even the largest global enterprises can better orchestrate, measure, predict, and also improve the overall software delivery process.”
In fact, says David Hurwitz, senior vice president of worldwide marketing at Serena, “Most IT organizations today leverage multiple development tools and we don’t believe this is going to change.”
-- James E. Powell
Editorial Director, ESJ
A new CAST Report on Application Software Health (CRASH) was released today by CAST, a software analysis and measurement firm. It’s the first study I’ve seen that quantifies the exposure enterprises face from the need to “fix hidden problems that remain in software and result in damaging risks in applications after they are operational” -- what CAST calls technical debt -- and puts it in monetary terms any boardroom executive can understand. CAST acknowledges that its estimate -- $3.61 of technical debt per line of code -- is conservative; the figure doesn’t include the cost of fixing software so it performs its intended functionality (that is, correcting logic problems).
If the survey is representative of IT applications in general, and about 15 percent of applications contain more than a million lines of code (as CAST found in its sample), then at $3.61 per line, a single large application may expose the enterprise to more than $3.6 million in technical debt.
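To make that arithmetic concrete, here’s a minimal sketch in Python of applying CAST’s per-line figure to an application portfolio. Only the $3.61-per-line rate comes from the report; the portfolio sizes below are hypothetical.

```python
# Minimal sketch of CAST's technical-debt arithmetic. The $3.61/LOC
# rate is the CRASH report's conservative estimate; the portfolio
# below is hypothetical, for illustration only.
COST_PER_LOC = 3.61  # dollars per line of code

portfolio = {
    "order-entry": 1_200_000,  # lines of code (hypothetical)
    "billing": 450_000,
    "reporting": 80_000,
}

for app, loc in portfolio.items():
    debt = loc * COST_PER_LOC
    print(f"{app}: {loc:,} LOC -> ${debt:,.0f} in technical debt")

# A single million-line application alone carries
# 1_000_000 * 3.61 = $3.61 million in estimated debt.
```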
That debt is problematic for another reason: it’s often not part of IT’s budget. When an application fix is needed, IT must use funds from another project, disrupting its priorities.
According to Dr. Bill Curtis, CAST’s chief scientist, senior vice president of CAST Research Labs and director of the Consortium for IT Software Quality, “The number of software glitches, outages and security breaches reported in the press this year, and the damage they have done to the reputations of organizations like Toyota, Sony and RIM, not to mention the U.S. Government and a multitude of banks and stock exchanges around the world, have made problems with structural quality in application software a boardroom issue.”
Curtis says ignoring these flaws can be dangerous. “What we found were numerous problems that should have been addressed prior to deployment. It’s little different from ignoring termites that are destroying the structure of your home.”
The study “used automated analysis to measure the structural quality of 365 million lines of code within 745 IT applications used by 160 companies throughout 10 industries.” That’s triple last year’s study, CAST’s first, which examined 288 applications with 108 million lines of code in all.
The study examined five “health factors” in judging an application’s structural soundness: security (how well it prevents unauthorized intrusions), performance (application responsiveness), robustness (an application’s stability and the likelihood defects will be introduced during modifications), ease of software transferability (how easy it is for a new team to understand the application and productively work with it), and ease of changeability (how easily and quickly an application can be modified).
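CAST doesn’t spell out its scoring model in the announcement, so purely as an illustration of the idea, here is a sketch that rolls the five health factors into one composite number. The unweighted average is my stand-in, not CAST’s formula, and the scores are hypothetical.

```python
# Illustrative only: CAST's actual scoring model isn't described here,
# so this rolls the five health factors into a simple unweighted
# average as a stand-in.
from dataclasses import dataclass, astuple

@dataclass
class HealthFactors:
    security: float         # resistance to unauthorized intrusions
    performance: float      # application responsiveness
    robustness: float       # stability when modified
    transferability: float  # ease of handoff to a new team
    changeability: float    # ease and speed of modification

def structural_quality(f: HealthFactors) -> float:
    scores = astuple(f)
    return sum(scores) / len(scores)

app = HealthFactors(3.2, 2.8, 3.5, 3.0, 2.9)  # hypothetical scores
print(f"Composite structural quality: {structural_quality(app):.2f}")
```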
There were some surprises -- the survey dispelled some development myths. For example, there was no difference in structural quality between applications that were outsourced and those developed in-house, nor between onshore and offshore development. Structural quality also was lower for applications with a smaller set of users.
Language matters: Java EE applications were the most prevalent applications in the study. Their performance scores were significantly lower and they were more costly to fix than applications written in other languages.
Using an agile methodology -- with its reputation for delivering applications (or changes) quickly -- may also bring better structural quality than custom methods. Applications built using a waterfall methodology had the highest scores for transferability and changeability; scores for those two measures continue to be the lowest in government IT departments.
COBOL applications continue to enjoy their reputation for security, scoring highest in that dimension; .NET applications had the worst security scores.
What best practices can IT follow to reduce this debt? I asked Lev Lesokhin, vice president of marketing at CAST, for recommendations.
“The first step is to recognize the problem and garner management attention,” he explained. “The very next step is to quantify and categorize the technical debt to get a handle on the problem. Most analysts will then tell you that you need to start remediation projects to pay down the debt while modernizing your systems. In our experience, while this is a valuable activity, it’s often more important to put quality gates in place to limit the production of new debt with every release to mission critical systems.”
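Lesokhin’s point about quality gates is concrete enough to sketch: compare each release of a mission-critical system against a violation baseline and block it if it adds new debt. The sketch below is a generic illustration of that idea, not CAST’s tooling; the violation counts would come from whatever static-analysis tool you use, and the threshold is hypothetical.

```python
# Sketch of a release "quality gate": fail the build if a release adds
# structural-quality violations beyond a small allowance. Violation
# counts would come from a static-analysis tool; numbers here are
# hypothetical.
import sys

def quality_gate(baseline_violations: int, current_violations: int,
                 allowance: int = 0) -> bool:
    """Return True if the release may proceed."""
    new_debt = current_violations - baseline_violations
    return new_debt <= allowance

if __name__ == "__main__":
    baseline, current = 412, 437  # hypothetical analyzer output
    if not quality_gate(baseline, current):
        print(f"Gate failed: {current - baseline} new violations")
        sys.exit(1)
    print("Gate passed")
```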
I also asked Lesokhin if there were any surprises in the results.
“Probably the most counterintuitive finding we had in the data was that applications managed offshore, or that were outsourced, had the same quality levels as internally managed applications. More of the quality variation is explained by QA practices than sourcing method. This is a fact and we need to analyze the data further to understand the sources of variation.”
More information about the CAST survey can be found here.
-- James E. Powell
Editorial Director, ESJ
Network security provider Fortinet is out with its monthly threat research, and as part of the report is the company’s list of Top 5 Android Malware Families. In addition, the company has commented on new root-level attacks on Android phones.
Fortinet says Gartner claims the Android operating system has a global market share of 52.5 percent, Symbian is in second place with 18 percent, and iOS is third (though no percentage is given). Android’s popularity is certainly attractive to hackers: the company found “approximately five times the amount of malicious families on the Android OS versus what we’ve found on iOS.”
Axelle Apvrille, a Fortinet senior mobile anti-virus researcher, explains that “this disparity can be attributed to the way Apple handles iOS application development and distribution. Unlike Android, which makes it fairly easy to place applications for people to download, iOS requires developers to undergo some strict screening from Apple before the application can make it to the Apple Store. That’s not to say that Apple is totally immune from being infiltrated by malware -- the Eeki banking worm proves that -- but it is a testament to why we’re seeing so little activity on the iOS platform.”
Android’s larger market share (not to mention its open development environment) may be why the company has seen a “90 percent increase in Android malware families in 2011 compared to 2010, while malicious iOS families only increased by 25 percent” during that period, according to Apvrille.
FortiGuard Labs’ antivirus engine detected the most threat samples from these five malware families:
- Geinimi, Android’s first botnet, sends a user’s geographic location and controls infected phones remotely; Geinimi can cause a phone to call a particular phone number
- A Trojan in the form of live wallpaper called Hongtouto “steals private information such as the victim's subscriber number (IMSI) and automatically visits [Web sites] that the malware directs it to”
- DroidKungFu, a botnet that can remotely install other malware and start other apps
- A phony instant-messenger app, JiFake, “sends SMS messages to premium phone numbers”
- The BaseBridge Trojan also sends SMS messages to premium telephone numbers; the malware also made it into (and was removed from) the official Android Market
The malware comes dressed to look like legitimate apps, according to Karine de Ponteves, a malware analyst at Fortinet. “DroidKungFu was an example of malware that was found repackaged in a legitimate VPN utility, whereas Geinimi was found within the legitimate application ‘Sex Positions.’”
Unfortunately, it isn’t tough to exploit root access to Android devices. “The mobile security trend is a familiar one: as operating systems mature and gain popularity, malware and vulnerabilities follow since there is focus and motivation from cyber criminals,” explained Derek Manky, Fortinet’s senior security strategist. “With root access, hackers can gain access to system files and change system settings that are typically authored to be read only. For example, a malware creator with root access to a vulnerable device could silently download and install additional malicious software, such as ransomware, spambots, and keyloggers.”
-- James E. Powell
Editorial Director, ESJ
A new survey released today by Symantec Corp. -- the 2011 Enterprise Encryption Trends Survey -- found an increase in enterprise adoption of encryption. That is, of course, no surprise, given that IT is trying to control storage costs amid massive data growth. The problem uncovered by the survey is that fragmented encryption solutions are “creating risk for organizations from the lack of centralized control of access to sensitive information,” as well as disrupting e-discovery and compliance monitoring.
“While many organizations understand the importance of encrypting their data, issues with key management and multiple point products can give them inconsistent visibility into what has been protected,” said Joe Gow, director of product management at Symantec, in a release announcing the results.
The state of these encryption solutions is having a financial impact: the survey estimates that “fragmented encryption solutions and poor key management is costing each organization an average of $124,965 per year.”
Nearly half (48 percent) of respondents say they increased encryption use over the past two years; for 38 percent, the encryption level remained the same. The respondents claim that 43 percent of their data is encrypted “at some point in its lifecycle.”
The solution(s) used varies widely. According to the report:
While adoption is high, that doesn’t mean everybody is on the same page. We didn’t see a consensus on a single, agreed-upon encryption product that was meeting everyone’s needs. Some enterprises reported as many as five different encryption solutions deployed in their data center. The typical organization reports they have five different encryption solutions deployed.
To my surprise, a third of those surveyed admit that they deployed encryption without approval of the security group “on a somewhat to extremely frequent basis.” Perhaps that’s why, as the report points out, “the projects are not necessarily following the company’s best practices, [and] 52 percent of organizations have experienced serious issues with encryption keys including lost keys (34 percent) and key failure (32 percent).”
Employee turnover is always a security problem -- enterprises must be diligent in terminating access to applications and data. Here’s another task to add to that list: “former employees who have refused to return keys,” a situation at more than a quarter (26 percent) of enterprises surveyed. Unfortunately, the survey found respondents expressing concern about managing encryption keys. “Forty percent are less than somewhat confident they can retrieve keys. Thirty-nine percent are less than somewhat confident they can protect access to business information from disgruntled employees.” That should make the security team nervous. Very nervous.
Key management causes further problems for enterprises; the most common is an inability to meet compliance requests (48 percent), followed by an inability to respond to eDiscovery requests (42 percent) or impeding access to important business information (41 percent).
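Symantec’s report doesn’t prescribe a tool, but the discipline it points toward is easy to illustrate: generate and store every key through one escrowed registry, addressed by a key ID, so no key leaves with an employee or gets stranded in a point product. Here is a minimal sketch; the registry design is mine, not Symantec’s, and a real deployment would back it with an HSM or key-management server.

```python
# Minimal sketch of centralized key escrow: every encryption key is
# created through one registry and retrievable by key ID, so a
# departing employee can't take the only copy with them.
# Requires the 'cryptography' package (pip install cryptography).
from cryptography.fernet import Fernet

class KeyRegistry:
    def __init__(self):
        self._keys = {}  # key_id -> key bytes (escrowed copy)

    def create_key(self, key_id: str) -> bytes:
        key = Fernet.generate_key()
        self._keys[key_id] = key
        return key

    def get_key(self, key_id: str) -> bytes:
        return self._keys[key_id]  # KeyError if never registered

registry = KeyRegistry()
key = registry.create_key("payroll-db-2011")
token = Fernet(key).encrypt(b"sensitive record")

# Later, even if the key's original owner is gone, it is recoverable:
recovered = Fernet(registry.get_key("payroll-db-2011"))
print(recovered.decrypt(token))
```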
Research for the trends survey was conducted in September by Applied Research, which examined the answers of “C-level, tactical management, and strategic management” respondents in 1,575 organizations from 37 countries.
-- James E. Powell
Editorial Director, ESJ
If you work in a small business and want the cost savings, increased agility, and greater efficiencies (and the resulting return on investment) that server virtualization can bring to your organization but don’t know where to begin, CDW (a national technology solutions provider) offers a free, short self-assessment tool to help you gauge your readiness to deploy and manage server virtualization.
An August survey by the company solicited the practices, plans, and opinions of nearly 300 small business IT pros and found that the savings proved substantial -- an average of 18.4 percent of the respondent company’s IT budget, or $19,400 a year. CDW combined the survey results with recommendations and tips from its own staff and customers and posted the result as the Small Business Server Virtualization Roadmap, a free download.
The survey uncovered some interesting practices and opinions. For example, one-quarter of small businesses “have virtualized at least some of their servers to run multiple, independent, virtual operating systems on a single server.” However, three-fourths of survey respondents that aren’t using server virtualization are looking into or planning to use the technology -- but getting started poses a big challenge.
The free self-assessment tool is a three-page, self-scored questionnaire (supplied as a PDF) that can help an SMB’s IT staff figure out how to get started. The company says it’s a five-minute questionnaire, but it likely won’t take you more than two minutes to complete (three minutes tops). Add up the number of checked boxes in each of three columns, do some simple multiplication and addition, and your score will rate your enterprise as a “prime candidate,” ready to perform due diligence, or in need of a partner for guidance.
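CDW’s actual weights and score bands aren’t reproduced here, but the mechanics the questionnaire describes are simple to sketch. The column weights and band thresholds below are hypothetical placeholders.

```python
# Sketch of the self-assessment arithmetic: count checks per column,
# apply a per-column weight, sum, and map the total to a readiness
# band. The weights and thresholds are hypothetical; CDW's actual
# questionnaire defines its own.
def readiness_score(col_a: int, col_b: int, col_c: int) -> str:
    total = col_a * 3 + col_b * 2 + col_c * 1  # hypothetical weights
    if total >= 30:
        return "prime candidate"
    if total >= 18:
        return "ready to perform due diligence"
    return "needs a partner for guidance"

print(readiness_score(col_a=8, col_b=3, col_c=1))
```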
The developers of the self-assessment tool understand that no two small businesses are alike, so it takes into account such factors as the number of IT network users, server count, whether the organization has identified costs and established a budget, the IT staff’s virtualization knowledge, and management’s understanding of the benefits of the technology.
In a CDW press release, the company’s vice president of small business sales points out that in the survey, “two-thirds of small businesses that virtualized their server environments said it has significantly increased the ROI of their IT -- but many small businesses wonder if server virtualization makes sense for their organization.”
-- James E. Powell
Editorial Director, ESJ
A new Forrester Consulting study found that most (57 percent) IT managers at small and midsize businesses (SMBs) don’t accurately estimate the full cost of purchasing and maintaining a traditional file server.
Forrester reported that 51 percent of SMBs have a terabyte or more of data in file storage (including file servers); three percent have 100 TB or more. Forrester also reports that 43 percent say their data is growing at 11 percent or more each year (49 percent rate growth at 1 to 10 percent).
When it comes to storage, most SMBs (76 percent) use an on-premises file server. However, 28 percent of respondents say they frequently or occasionally use a software-as-a-service solution for storage, and 22 percent either plan to use one within a year or are interested in doing so. To support collaboration, 87 percent use an on-premises file server, 40 percent use a hosted file server, and 17 percent use a cloud file server.
When Forrester asked participants for their overall estimate of their annual file server costs (including annualized purchase costs, labor, and maintenance), 51 percent said the costs were under $5,000 per year, and 12 percent said the costs exceeded $20,000 annually.
Forrester then asked respondents to consider
...annualized costs of hardware, software, and personnel time. For hardware, we prompted them to sum the annualized costs of server and storage hardware, redundant power, and networking gear. For software costs, we prompted them to consider the annualized cost of file server OS and software licenses, VPN software, security and firewall software, backup software, and maintenance contracts. For personnel costs, we prompted them to consider the annualized costs for an IT administrator to maintain and expand the file server and manage and support users, plus any use of consultants or systems integrators.
What a difference that made! The “under $5,000” estimate was reported by only 38 percent of participants, and 32 percent said their costs were $20,000 or more. In fact, 47 percent of respondents changed their estimates upward. Clearly, SMB IT managers don’t have accurate figures when they’re comparing costs of file servers to hosted or cloud solutions.
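The gap is easy to understand once you walk through the tally Forrester prompted for. Here is a minimal sketch of that annualized arithmetic; every dollar figure below is hypothetical.

```python
# Sketch of the annualized file-server TCO arithmetic Forrester
# prompted respondents through. All dollar figures are hypothetical.
hardware = {  # purchase costs annualized over a 4-year life
    "server and storage": 8_000 / 4,
    "redundant power": 1_200 / 4,
    "networking gear": 2_000 / 4,
}
software = {  # annual license and maintenance costs
    "file server OS and licenses": 1_500,
    "VPN software": 400,
    "security and firewall": 900,
    "backup software": 700,
    "maintenance contracts": 1_200,
}
personnel = {  # annual labor
    "admin time (maintain, support users)": 12_000,
    "consultants / integrators": 3_000,
}

total = (sum(hardware.values()) + sum(software.values())
         + sum(personnel.values()))
print(f"Annualized file server cost: ${total:,.0f}")  # $22,500 here
```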
Forrester suggests that “cloud file services offer some or many of the features of an on-premises file server but charge per user per month rather than requiring lots of up front expense -- and often cost less per employee on an annualized basis.”
That’s music to the ears of Vineet Jain, the CEO and co-founder of survey sponsor Egnyte, a hybrid-cloud-storage solution provider. “The writing is on the wall, and we believe within two years, at least two-thirds, if not more, of the file servers that reach the end of their lifespans will go the way of the dinosaur.”
Of course, Jain is hardly impartial, given the solutions his company sells. Still, I believe he’s right about one thing: he expects that tightening budgets will force IT to more closely look at all costs, and that once that’s done, “IT will realize the true value and cost savings of a pure cloud or hybrid cloud approach, and will simply retire the traditional file server.”
In 2008, when Jain’s company launched its first product, traditional servers worked fine when all workers were in the same location, but they weren’t well suited for distributed file sharing. Egnyte’s focus was on hosted solutions, replacing the traditional physical file server while addressing multiple and remote device access.
Jain admits that at the time he didn’t see that “the cloud, by itself, would become such a big category,” but keep in mind that bandwidth was a significant issue.
Then, as now, Jain says he ran into resistance when trying to sell his solution: SMBs perceived that they would lose control of their data, and compliance rules required local copies of some files. That led the company to develop a second product that worked with a cloud server and network-connected devices; it gave IT administrative control and handled automatic synching along with file-management tasks (such as preventing simultaneous file writes by two users) and security permissions. Because it used virtual file access, users could go home at night, click a drive letter on their PC, and gain transparent access to corporate files.
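Egnyte hasn’t published how its sync layer arbitrates writes, but the underlying idea of blocking simultaneous writes is easy to sketch with an advisory lock file. This illustrates the concept only, not Egnyte’s implementation.

```python
# Sketch of preventing two users from writing the same file at once,
# using an advisory lock file. Concept illustration only.
import contextlib
import os

@contextlib.contextmanager
def exclusive_write(path: str):
    lock_path = path + ".lock"
    try:
        # O_CREAT | O_EXCL fails atomically if the lock file already
        # exists, meaning someone else is mid-write.
        fd = os.open(lock_path, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
    except FileExistsError:
        raise RuntimeError(f"{path} is being edited by another user")
    try:
        yield
    finally:
        os.close(fd)
        os.remove(lock_path)

with exclusive_write("budget.xlsx"):
    with open("budget.xlsx", "w") as f:
        f.write("new contents")
```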
Today, most of the SMBs Egnyte sells to buy the product as an alternative to, or a replacement for, physical file servers. In one installation Jain described to me, a company installed the Egnyte solution and a NAS device in a remote office without any IT support and was up and running in a few hours; that’s just not possible with a physical file server. Because this approach makes file sharing easy, it certainly reduces the need to use consumer-oriented (and unapproved) file storage services (such as Dropbox) for corporate files -- something a whopping 41 percent of Forrester survey respondents said occurred at their enterprise.
The cost advantage is dramatic, if what Jain claims is true. He says the company’s own study found that “traditional file servers typically cost five times more than Egnyte HybridCloud,” which is billed as a monthly fee.
That could certainly hasten the last days of on-premises file servers.
-- James E. Powell
Editorial Director, ESJ