Free Online Resources Aid Anti-Fraud Initiatives

Organizations around the world lose an estimated 5 percent of their annual revenues to fraud, according to research by the Association of Certified Fraud Examiners (ACFE). What better time to focus on fraud within your own organization than this week -- which happens to be International Fraud Awareness Week?

To aid in the fight against fraud, the ACFE has posted downloadable tools and information, including a Fraud Prevention Check-up, Fraud IQ Quiz, and a variety of presentations focused on fraud prevention and detection. The resources are available at no cost at www.FraudWeek.com (click on the Anti-Fraud Resources tab); no registration is required for access.

In its 2010 Report to the Nations on Occupational Fraud & Abuse, the ACFE reveals facts and figures that should sober any IT security professional:

  • Fraud schemes are costly: the median loss in the occupational fraud cases the ACFE studied was $160,000. That's just the median; almost one-fourth of the cases caused losses of $1 million or more.

  • Like some malware and viruses, fraud can take several months -- sometimes years -- to detect. In the ACFE study, the median time to discovery was 18 months. One great way to find fraud: tips from your workforce.

  • Occupational fraud is a global problem. Though there are differences by region, most of the characteristics of perpetrators (and the anti-fraud controls meant to stop them) are similar no matter where the fraud occurs.

  • Small businesses are particularly vulnerable to occupational fraud because they generally don't have the anti-fraud controls in place that their larger competitors do.

-- James E. Powell
Editorial Director, ESJ



Symantec Study Looks at Real-World Windows 7 Migrations

On the heels of Microsoft’s recent announcement of higher-than-expected adoption comes a Symantec study, conducted in August, of 1,360 businesses that had completed a Windows 7 migration. The report examines migration practices and results and includes recommendations to help those planning such a migration.

Although waiting for the first service pack before migrating to a new operating system is often considered conventional wisdom, the survey found that few respondents waited for a specific service pack before migrating. In fact, 71 percent of what the report categorizes as “top-tier companies” upgraded within one year; only 52 percent of “bottom-tier” companies did so.

When asked about how enterprises made the “buy a new PC versus upgrade an existing system” decision, 75 percent said RAM capacity was a somewhat to extremely important consideration, followed by processor speed (74 percent), a PC’s age (73 percent), and budget (71 percent). Slightly more than half (52 percent) of respondents consulted the Windows Experience Index to determine what processor capabilities they’d need.

The migration is being driven by expectations of increased performance (69 percent), improved reliability (59 percent), and an improved user experience (51 percent). Almost two-thirds of respondents (62 percent) set ROI goals for their migration project, and 90 percent of these respondents achieved them.

When it comes to resources, the Windows 7 migration projects required about half of IT’s staff; just over half of respondents (54 percent) reported they automated the process. The top three time-consuming tasks: planning, the upgrade execution itself, and reinstalling applications.

Among the best practices recommended to enterprises yet to migrate to Windows 7 were user training and pilot tests.

Participants came from firms in 16 countries that employed from five to more than 10,000 workers; the median fell between 1,000 and 2,499 employees.

-- James E. Powell
Editorial Director, ESJ



Unisys State-of-Security-Concern Measure Drops to Lowest Point Since Survey Began

Unisys has released the results of its semi-annual Security Index, which is down slightly, at 136, from the same survey taken in the first half of the year.  The company says the numerical rating indicates "a moderate level of concern."  It's the lowest rating since the survey began in 2007.

The telephone survey of 1,004 U.S. respondents (all at least 18 years old) showed they are most concerned about national security (in relation to war or terrorism), followed by concerns over identity theft and bankcard fraud. When it comes to protecting themselves, 80 percent of social media users say they routinely limit the personal information posted on such sites and set privacy settings so information is restricted. Nearly three in four (73 percent) regularly update their anti-virus software, and two-thirds (67 percent) shred financial and medical documents.

There's room for improvement where passwords are concerned: only 46 percent "regularly" use or update passwords that are difficult to guess (30 percent say they do so "every once in a while"), and only 37 percent of mobile device owners use passwords on their hardware.

When it comes to protection, 61 percent say the President of the United States should have the authority to take control of the Internet "in the event of a malicious cyber-security attack" by "a foreign government against our military, civilian government, electrical grid, financial systems, or other critical infrastructure." The youngest consumers (those 18 to 34) are the least worried about national security -- it concerns only 48 percent of this group -- while those with at least a high-school education worry about it the most.

The survey -- a snapshot of the "nation's sense of security," according to Unisys -- provides a regular, "statistically robust" measure of concern in eight areas of safety, grouped into four categories: national security (the security of the U.S. in relation to war or terrorism, and serious health epidemics in the U.S.), financial security (the ability of others to obtain and use credit/debit card information, and your ability to meet essential financial obligations such as mortgages and bill payments), Internet security (how secure a computer is against viruses and unsolicited e-mail, and the security of online shopping and banking), and personal security (personal information breaches, and how secure you believe you'll feel in the next six months).
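
Unisys doesn't spell out in the material cited here how individual responses roll up into the single index value, so the Python sketch below is only a guess at the general shape of such a composite: an equal-weighted average of per-area concern ratings scaled to an index. The 0-to-3 response scale, equal weighting, and scaling factor are all assumptions for illustration, not Unisys's published method.

```python
# Hypothetical sketch of a composite "security index": an equal-weighted
# average of mean concern ratings across the eight surveyed areas, scaled
# up to an index value. The 0-3 concern scale, equal weighting, and x100
# scaling are illustrative assumptions, not Unisys's formula.

AREAS = [
    "national security (war/terrorism)", "serious health epidemics",
    "credit/debit card fraud", "meeting financial obligations",
    "viruses and unsolicited e-mail", "online shopping and banking",
    "personal information breaches", "sense of security, next six months",
]

def security_index(mean_concern: dict[str, float]) -> float:
    """mean_concern: average respondent concern per area, 0 (none) to 3 (extreme)."""
    overall = sum(mean_concern[area] for area in AREAS) / len(AREAS)
    return overall * 100  # scale the 0-3 average up to an index

# Illustrative input: an average concern of about 1.36 per area would
# produce the 136 reading the article calls "a moderate level of concern."
example = {area: 1.36 for area in AREAS}
print(f"Index: {security_index(example):.0f}")  # -> Index: 136
```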

-- James E. Powell
Editorial Director, ESJ



Heterogeneous Platforms Growing Challenge for DBAs, Survey Finds

When we think of heterogeneous environments, we often think of mixing hardware types -- support for Android and Blackberry devices, for example, or managing servers from different manufacturers.

Increasingly, however, we should also be thinking about mixed database platforms. That's become the norm in most data centers, according to a survey of database administrators just released by Embarcadero Technologies.

The Database Trends Survey Report indicates that just one in five DBAs (20 percent) manage a single database platform; a third (33 percent) manage two; one-fourth deal with three; 8 percent manage four platforms; and more than one in eight (14 percent) are responsible for five or more platforms.

The most common database platforms are Microsoft SQL Server -- 62 percent of respondents work with it -- and Oracle (at 60 percent), followed by Sybase Adaptive Server Enterprise (35 percent) and Microsoft Access (19 percent). Oracle is the database platform most respondents identify as their primary platform.

It isn't just multiple platforms that pose a challenge; the survey found that most DBAs are managing multiple versions of the same database. Of the more than two-thirds (69 percent) who support multiple versions of a database, 51 percent said they manage three or more versions.

“Each database platform and version has its own features and functionality, and keeping them all straight can be a monumental task for DBAs,” said Scott Walz, senior director of product management for Embarcadero, in a company statement. “Multi-platform database management is becoming more commonplace, but that doesn’t mean it’s getting less complicated.”

Cross-platform database management was cited as the biggest database-related challenge respondents expect to face next year. Tied in second place: multi-instance databases and database tuning; database management came in third.

The M&A Effect

What's driving this mishmash of databases? The survey asked respondents if their company had been part of a merger or acquisition in the last five years; 43 percent said they had, of which nearly one-fifth (18 percent) said they began working with more database platforms as a result of the M&A activity.

DBAs expect the heterogeneity to grow; "nearly one-third of respondents expect more database platforms to be introduced into their organizations in the next year," Embarcadero said.

The study, conducted over the summer, polled over 1,200 DBAs, developers, architects, and analysts. The full report is available without charge or registration at http://www.embarcadero.com/reports/database-trends-survey.

-- James E. Powell
Editorial Director, ESJ



Green IT May Provide a Path to IT Profitability by 2020

The IT industry must adopt new energy-efficient technologies in the next ten years or face serious economic problems, according to an 18-month study just published by the Institute for Sustainable and Applied Infodynamics (ISAID) and Rice University's Baker Institute for Public Policy. Otherwise, IT runs a "serious risk of being unable to contribute to growing the global economy if limits are placed on carbon emissions."

"In the face of growing global concerns over greenhouse carbon emission, the key for the industry is finding new technologies that deliver more performance for each kilogram of CO2 emitted," said Rice computer scientist Krishna Palem, ISAID's director, in a prepared statement. ISAID is a joint effort of Rice University in Houston and Nanyang Technological University in Singapore. "Fortunately, there are viable technological options on the table, and the information and communication industries have a strong track record of embracing new technologies."

The problem, according to the report, is that the U.S. information and communication technology (ICT) industry is predicted to increase its carbon emissions twice as fast as it contributes to gross domestic product over the next ten years.

"In the U.S. in 2009, the economic output of the ICT industry per kilogram of CO2 emitted was about $2.83, and in a business-as-usual scenario, that output will fall to about $1.06 per kilogram of CO2 by 2020," according to one of the authors, Chris Bronk, a fellow in technology, society, and public policy at the Baker Institute and lecturer of computer science at Rice.

"Based on those numbers, the industry is headed for a brick wall if limits are placed on CO2 emissions. In a carbon-constrained economy, green innovation will be absolutely essential for ICT profitability."

The report examined two trends in depth -- carbon emissions and the gross domestic product (GDP) of the ICT industry. In response to the lack of an existing, reliable metric, the authors developed a new one they call the sustainability innovation quotient (SIQ), which "expresses the number of dollars returned in GDP by the ICT industry for each kilogram of carbon dioxide it emits."
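
The SIQ itself is simple arithmetic -- dollars of ICT-generated GDP divided by kilograms of CO2 emitted -- and the two figures Bronk cites imply a steady erosion of the metric. The short Python sketch below is a back-of-the-envelope illustration (not the authors' model) that uses only those two quoted values to derive the implied annual rate of decline under business as usual.

```python
# Back-of-the-envelope sketch of the sustainability innovation quotient
# (SIQ): dollars of ICT GDP returned per kilogram of CO2 emitted. Not the
# authors' model -- it only manipulates the two figures quoted in the
# article to show the implied rate of decline.

def siq(gdp_dollars: float, co2_kg: float) -> float:
    """Dollars of ICT economic output per kilogram of CO2 emitted."""
    return gdp_dollars / co2_kg

# Example application of the formula (illustrative magnitudes only):
# $2.83 billion of output against 1 billion kg of CO2 is an SIQ of $2.83/kg.
print(f"SIQ example: ${siq(2.83e9, 1.0e9):.2f} per kg CO2")

siq_2009 = 2.83   # reported 2009 U.S. value, $ per kg of CO2
siq_2020 = 1.06   # reported business-as-usual projection for 2020

years = 2020 - 2009
annual_ratio = (siq_2020 / siq_2009) ** (1 / years)
print(f"Implied change in SIQ: {annual_ratio - 1:+.1%} per year")
# -> roughly -8.5% a year: emissions growth outpacing the GDP contribution
```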

The research team looked at the energy consumed by today's devices (including PCs and laptops as well as smartphones and game consoles) and how increased demand will affect it. Such devices don't directly emit carbon dioxide, of course, but the electricity they use is produced mostly by burning coal and natural gas, which do emit the gas. The authors "factored in the effect of cleaner, more efficient electric production technologies that will be rolled out in the coming decade."

Although data centers were included in the study, the network equipment used by telecommunications and wireless providers was not factored in because, according to the authors, previous studies "had looked at their energy consumption in significant detail."

The growth estimates should serve as a wakeup call for IT data center managers. In a prepared release, Rice and ISAID noted that:

The authors calculated the global carbon emissions that will likely result if the ICT industry continues with business as usual. The calculations showed that global carbon emissions related to PCs and laptops, which accounted for 48.5 percent of all global ICT emissions in 2009, will nearly quadruple by 2020. Data center-related emissions will more than triple by 2020, and calculations showed that emissions related to both game consoles and mobile phones will more than triple by 2020. Mobile phones, which are constrained by battery life, and game consoles will together account for just 5.01 percent of total ICT emissions by 2020. [emphasis added]

The full report, which explains the authors' methodology, can be downloaded at no cost; no registration is required.

-- James E. Powell
Editorial Director, ESJ



Only One-Third of Critical Infrastructure Firms Are Extremely Prepared, Study Finds

Cyberattacks are a part of daily IT life, but attacks with specific political goals are increasingly frequent and costly, according to a new report, the Symantec 2010 Critical Infrastructure Protection Study.

Symantec's study examined trends in six infrastructure segments: energy, banking and finance, communications, IT, health care, and emergency services. Over half (53 percent) of all firms said they "suspected or were pretty sure they had experienced an attack waged with a specific political goal in mind." Of those experiencing an attack, firms typically reported 10 hits in the last five years, with banking and financial firms hit the most. Eighty percent of respondents think the level of such attacks is constant or increasing -- which should be worrisome, given that the average cost of an attack is $850,000, according to the report.

Symantec's insights are backed up by recent events. For example, Trusteer, a secure browsing service provider, says 11 Eastern European hackers were formally charged in the UK on September 29. The next day, 70 Eastern European hackers were charged in the U.S. with stealing $3 million from U.S. online bank accounts using the Zeus Trojan.

"The recent arrests in the U.S. and the UK indicate that financial fraud is not the business of individuals," according to Mickey Boodaei, Trusteer's CEO. "Behind these operations you can find groups of people which in many cases operate for larger organized crime groups. They have the money and the means to run large-scale, sustainable criminal online operations. As time goes by, we're seeing more groups which are larger, more efficient, and knowledgeable than before, and as a result much more successful. Zeus is being used around the world to attack individual customers, and big businesses are also being targeted, particularly in the U.S."

Boodaei said other cybercrime gangs are "almost certainly operating in other countries," possibly in continental Europe, Canada, and in the Asian-Pacific region, "running parallel criminal operations to the Zeus gangs in the UK and the U.S."

After reading the Symantec study, I was disappointed by how unprepared the surveyed firms are in the face of such attacks. When asked about several kinds of attacks -- attempts to steal electronic information, attempts to alter or destroy data, interference with networks (slowing or shutting them down), and tampering with physical equipment -- only a third reported being "extremely prepared"; another third (from 36 to 41 percent, depending on the type of attack) felt "somewhat prepared." Nearly a third (31 percent) of firms said they felt "less than somewhat prepared." In other words, unprepared.

Weak points in their preparedness include security training, "awareness and appreciation of threat by executive management," and a deficiency in endpoint security measures. Security response and security audits rounded out the list.

The good news is that industries are not just willing to cooperate with government critical infrastructure protection (CIP) programs but are actually doing so. Ninety percent "have engaged with their country's CIP programs to at least some degree, with 56 percent being significantly or completely engaged," according to the report. The energy sector has the highest engagement (83 percent); IT is the least engaged (49 percent).

Mark Bregman, Symantec's CTO, told me that although more large companies are engaged, even small companies are participating. "When it comes to emergency services, for example, we often forget small companies such as ambulance services. Over half of small, critical-infrastructure companies are engaged with government programs." In fact, Bregman says, the firms are enthusiastic about the programs as well. Unfortunately, small companies are also the worst prepared for threats.

The telephone survey of 1,580 private businesses that are part of the critical infrastructure was conducted in August. Responding firms in 15 countries had from 10 to more than 10,000 employees (the median was between 1,000 and 2,499 employees).

-- James E. Powell
Editorial Director, ESJ



Lack of Time for Performance Tuning Tops Database Professionals' Complaints

So much to do, so little time. That familiar refrain is very much on the minds of database professionals, who yearn for more time to tune and fix poorly performing SQL code to accelerate their database and application performance, according to a just-released Database Trends Survey conducted by Embarcadero Technologies.

The Web-based survey, taken in July and August, received 1,230 responses from DBAs, developers, architects, and analysts about the database-related tasks they wanted to spend more time performing. The largest group of respondents (41.8 percent) works in enterprises valued at more than $1 billion; 60 percent of represented organizations had at least 1,000 employees. Nearly a third (30.4 percent) had been DBAs or had worked with databases for 10 to 15 years.

Forty-one percent of respondents cited tuning as the job they'd like to find the time for, followed by fixing sub-optimal SQL code (39 percent), monitoring their databases (28 percent), and learning new skills or technologies (27 percent). Rounding out the list: "keeping skills up to date, testing new database features, and eliminating bottlenecks," Embarcadero said. Loading and unloading data finished in last place on the list.

The lack of time could be due to the additional responsibilities placed on respondents. Forty-one percent of DBAs said they are writing more SQL code than they did five years ago (about 31 percent said they weren't doing more; the question didn't apply to the remainder of respondents).

“The lines between DBAs and developers are continuing to blur, with more production DBAs getting involved in non-production environments and traditional developers using SQL on a regular basis,” said Kyle Hailey, program manager for database performance and optimization products at Embarcadero, in a prepared statement. “At the end of the day, both groups have equitable concerns and goals -- namely, how to tune and improve the performance of SQL code without making it an enormous time suck.”

Many respondents thought automation could free up time for the tasks they want to do and make their work lives easier. Topping the list of tasks to automate: over a third (36 percent) chose diagnosing production issues, followed by "fixing poor performing SQL code" (29 percent), database monitoring (26 percent), and database tuning (24 percent).

The complete survey results are available at no cost. No registration is required.

-- James E. Powell
Editorial Director, ESJ



Study Pegs Maintenance Costs at $1 Million per Application

In the first of a planned annual series of reports about global software quality trends, Cast, Inc. presents an eye-opening figure for all IT managers. The software analysis and measurement company's study estimates that the cost of fixing problems in production applications exceeds $1 million per application.

The analysis focused on "structural quality" -- the engineering soundness of an application's architecture, not whether the application actually met its functional requirements. Cast examined four characteristics -- security, performance, robustness, and changeability -- in a variety of industries and in both the public and private sectors. The firm analyzed source code for violations against the company's collection of industry standard best practices, then calculated a score "using an algorithm that weights the severity of each violation and its relevance to each individual quality characteristic."
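
Cast's actual rule set, weights, and scoring algorithm aren't reproduced in the report summary, so the sketch below is only a hypothetical illustration of the general approach described -- summing violations weighted by severity and by relevance to a given quality characteristic. The rule names, severities, relevance weights, and 100-point scale are assumptions, not Cast's.

```python
# Hypothetical sketch of a severity- and relevance-weighted violation score
# of the general kind the article describes. Rule names, weights, and the
# 100-point scale are illustrative assumptions, not Cast's algorithm.

from dataclasses import dataclass

@dataclass
class Violation:
    rule: str
    severity: int      # e.g., 1 (minor) through 5 (critical)
    relevance: float   # 0..1: how strongly the rule bears on this characteristic

def characteristic_score(violations: list[Violation], best: float = 100.0) -> float:
    """Start from a perfect score and subtract weighted penalties for each violation."""
    penalty = sum(v.severity * v.relevance for v in violations)
    return max(0.0, best - penalty)

# Hypothetical findings for one application's "security" characteristic
security_findings = [
    Violation("possible-sql-injection", severity=5, relevance=1.0),
    Violation("hard-coded-credential", severity=4, relevance=0.9),
    Violation("unvalidated-input", severity=3, relevance=0.7),
]

print(f"Security score: {characteristic_score(security_findings):.1f} / 100")
# -> 89.3: lower scores mean more, or more severe, relevant violations
```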

The report notes that "structural quality characteristics are important because they are difficult to detect through standard testing. However, structural quality problems are ... most likely to cause unforeseen operational problems such as outages, performance degradation, breaches by unauthorized users, or data corruption."

The study examined attributes that affect an application's ability to avert unauthorized intrusions, its responsiveness, its stability, and the ease and speed with which it can be changed (without introducing more problems).

COBOL programs achieved the best scores for security, "scoring 59 percent higher than .NET and 37 percent higher than Java EE technologies." Jay Sappidi, who heads Cast Research Labs, told me that's because COBOL is used heavily in the financial and insurance industries, sectors that are more focused on security. "Those industries have complex requirements, and the cost of failure is high. Furthermore, the applications have been in existence for so long that problems could have already been fixed over time. Finally, COBOL applications are typically not directly exposed externally, so they're obviously safer."

Government IT is spending considerably more on maintenance than private-sector firms, as reflected in the "changeability" score, which Cast says "is a good predictor of application maintenance costs." Sappidi says Cast examined code complexity, database complexity, and how the code is written, including the quality of documentation and how well structures such as loops are documented internally.

"With the government, outsourcing is high, and that equals high risk. Usually in the government sector, multiple vendors are used, and those vendors rotate frequently, which means that they're spending more money to keep the lights on rather than add functionality. In fact, 75 percent of government IT budgets are spent on maintenance; the figure in the private sector is closer to 50 to 60 percent. With only one-quarter of government budgets going into investments in new applications and new functionality, there's a big drain on their budgets."

Contrary to conventional wisdom, application size doesn't correlate to application quality -- as long as the application is (and stays) modular. "When an application is modular, its quality can be high even as it grows to a very large size."

COBOL applications are the exception: when size increases, quality falls. Sappidi says that .NET and Java EE applications have few highly complex objects (such as loops nested within loops within loops; the deeper the nesting, the more complex the code). "In COBOL applications, 60 percent of components are complex, and the language is less modular."

The trade-off, however, is performance, which "tends to be far better in COBOL than in Java because most test platforms have built-in performance metrics that are run during QA. The down side is that the more modular the code, the greater the overhead."

The study examined 288 in-house-written applications, both onsite and outsourced -- 108 million lines of code in all -- from 75 organizations located mostly in North America, Europe, and India, which may be the largest application sample to be statistically analyzed. "The applications range from 10,000 to 5 million lines of code, with 26% containing less than 50,000 lines of code and 32% falling between 50,000 and 150,000 lines of code," according to Cast. The report says that the average business application contains about 374,000 lines of code.

-- James E. Powell
Editorial Director, ESJ
