In-Depth

Anti-Spyware Shootout

VeriTest, an independent testing lab, pitted three popular anti-spyware products against each other for four months, but such performance results can be problematic.

Which enterprise anti-spyware software is best? To find out, researchers at VeriTest, an independent testing lab, pitted three popular anti-spyware products against 200 pieces of spyware over a four-month period.

Researchers examined three programs: McAfee AntiVirus Enterprise with AntiSpyware Module V8.0, Sunbelt CounterSpy Enterprise 1.5.268, and Webroot Spy Sweeper Enterprise 2.5.1. (The study was underwritten by Webroot.)

Which product came out on top? According to VeriTest, “Webroot cleaned 94 percent of all spyware tested, versus 53 percent for McAfee, and 26 percent for Sunbelt.” For system monitors, Webroot removed 97 percent, versus McAfee’s 17 percent and Sunbelt’s 10 percent. Likewise, Webroot eliminated 96 percent of adware encountered, versus 53 percent for McAfee and 26 percent for Sunbelt.

Testing was conducted on Windows Server 2003 Standard Edition. “Each enterprise anti-spyware application was installed to its own server, each of which had three client PCs dedicated as agents,” notes the VeriTest report. All anti-spyware programs were given Internet access via a proxy server and allowed to update at will. Testers then loaded 200 individual pieces of spyware, including system monitors, adware, and Trojans, from a CD-ROM onto the PCs and watched what happened for four months. That duration was calibrated to assess vendors’ anti-spyware signature-update practices. As VeriTest notes, “administrators must take into account the rate at which their anti-spyware vendor identifies new threats.”
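VeriTest doesn’t publish its scoring tooling, but the arithmetic behind figures such as “cleaned 94 percent of all spyware tested” is straightforward. The sketch below is a minimal, hypothetical illustration (the sample IDs, categories, and result rows are invented, not VeriTest’s data) of tallying per-product, per-category cleanup rates from scan logs:

```python
from collections import defaultdict

# Hypothetical scan results: one row per (sample, product) pair. In a real
# harness these rows would be parsed from each product's scan and cleanup logs.
results = [
    ("sample-001", "adware", "Webroot", True),
    ("sample-001", "adware", "McAfee", False),
    ("sample-001", "adware", "Sunbelt", False),
    ("sample-002", "system monitor", "Webroot", True),
    ("sample-002", "system monitor", "McAfee", True),
    ("sample-002", "system monitor", "Sunbelt", False),
]

def cleanup_rates(rows):
    """Return {(product, category): percent of samples cleaned}."""
    cleaned = defaultdict(int)
    total = defaultdict(int)
    for _sample, category, product, ok in rows:
        total[(product, category)] += 1
        cleaned[(product, category)] += int(ok)
    return {key: 100.0 * cleaned[key] / total[key] for key in total}

for (product, category), pct in sorted(cleanup_rates(results).items()):
    print(f"{product:8} {category:15} {pct:5.1f}% cleaned")
```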

To assess how well vendors update their signatures, VeriTest later retested all three anti-spyware products by selecting 10 programs at random that all three had failed to clean the first time. Given a second try, both Webroot and Sunbelt cleaned 8 of the 10, while McAfee caught 7 of the 10.

Useful Results?

What can security managers make of the results? They may be useful for organizations considering an enterprise anti-spyware deployment, or for those that want to pressure their current vendor to do better. Yet the testing covers only three anti-spyware products; several others are commonly seen in the enterprise, including software from Symantec and Trend Micro, plus Microsoft’s still-in-beta Windows Defender.

The lack of a head-to-head test of all the leading anti-spyware software makes it hard for organizations using or evaluating products excluded from the study to draw meaningful conclusions from its results, since those organizations would have difficulty subjecting the excluded software to comparable testing. “Testing anti-spyware applications for effectiveness is extremely complex,” notes VeriTest. “Most businesses conduct rudimentary tests with common spies that produce inconsistent results.”

Note also that while some organizations select best-of-breed anti-spyware software, others opt for security suites that include anti-spyware capabilities, knowing full well the two categories may not overlap.

The Perils of Counting Spyware

While the usefulness of VeriTest’s results is limited, the lack of a standardized set of metrics for assessing anti-spyware products is nothing new. For example, all vendors publish the number of pieces of spyware their scanning engines can find and obliterate. But how useful are those numbers?

David Perry, global director of education for Trend Micro, warns against trusting spyware statistics, especially those promulgated by vendors. The reason is simple, and it parallels the numbers game vendors played in the early days of antivirus software. “In the 1990s, Symantec and McAfee went to war over the number of viruses they caught, and we ended up in the world of saying [there are] 140,000 virus strains we detect, of which less than 4,000 have ever infected anyone, ever,” he says. “So the rest are fake, or proofs of concepts that have never hurt anyone.”

Now, despite the fact that the majority of spyware attacks have never been seen in the wild, “they’re doing the same things with adware and spyware,” he says, “removing things that are really innocuous,” yet nevertheless adding them to the count. Some vendors may also inflate the tally by removing one piece of spyware containing multiple components, then “defining it as 50 pieces.”
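Perry’s point about inflated counts is simple to illustrate: the same removal event can be reported as one threat or as dozens, depending on whether the vendor counts spyware families or individual components. A minimal sketch, with an invented family name and a hypothetical removal log:

```python
# Hypothetical removal log: one detected spyware program split into 50
# components (DLLs, registry keys, etc.), all belonging to the same family.
removals = [
    {"family": "ExampleAdware", "component": f"payload_{i}.dll"}
    for i in range(50)
]

by_component = len(removals)                      # inflated count: 50 "pieces"
by_family = len({r["family"] for r in removals})  # conservative count: 1

print(f"Counted by component: {by_component}; counted by family: {by_family}")
```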

Thus the current confusion over spyware effectiveness is no surprise. “We have a public who are half-educated about this,” says Perry. “They’re going to walk down the road and say, look, they detected 95,000 cookies, and you only detected 22,000.”

Given the futility of the numbers game, Perry recommends taking a more holistic approach. In particular, he says, IT managers need to remember that placing anti-spyware tools on gateways and proxies can prevent spyware from ever reaching desktops. In that scenario, the pertinent statistic isn’t how many pieces of spyware a desktop scanning engine claims to block, but how many never reach desktops in the first place.
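The metric Perry favors is easy to state in code. A minimal sketch, with invented event counts standing in for real gateway and desktop scanner logs:

```python
# Hypothetical event counts over a reporting period.
blocked_at_gateway = 1840  # spyware dropped by gateway/proxy filtering
caught_on_desktop = 160    # spyware that got through and was cleaned on PCs

total_encountered = blocked_at_gateway + caught_on_desktop
pct_never_reached = 100.0 * blocked_at_gateway / total_encountered
print(f"Spyware that never reached desktops: {pct_never_reached:.1f}%")
```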

About the Author

Mathew Schwartz is a Contributing Editor for Enterprise Systems and writes its Security Strategies column; he is also a long-time contributor to the company's print publications and a freelance writer covering security and technology.
