
Vista Vulnerability Study Puts Microsoft on Defensive

Microsoft and some independent security researchers had the blogosphere buzzing Wednesday with a series of rebuttals after one company claimed that Vista was more vulnerable to malware and other exploits than previous operating systems.

Late last week, a study by Sydney, Australia-based anti-virus concern PC Tools suggested that although Vista was an improvement over Windows XP with respect to system fortitude, it was more easily compromised by malware and other exploits than Windows 2000.

PC Tools found that for every 1,000 machines running Vista, 639 carried malware infections of varying severity. Among machines running Windows 2000, 586 were found compromised; for Windows Server 2003, 478.

At the root of Microsoft staffer Austin Wilson's rebuttal of those findings is the assertion that the figures PC Tools used to reach its conclusion don't stem from data with a proper control; essentially, the net wasn't cast wide enough to capture the true effect of malware on Vista.

"We study the malware space very carefully and publish our results twice a year in the Security Intelligence Report," Wilson wrote in the Vista security blog. "This report is compiled from statistics on malware infections based on over 450 million executions of the Malicious Software Removal Tool (MSRT). Microsoft is a member of AMTSO (Anti Malware Testing Standards Organization) and its charter includes defining test methodology so that there is a minimum quality bar to all testing of this type."

It wasn't just Microsoft; criticisms of PC Tools' report came from as far away as Eastern Europe in the form of IT pros such as Dennis Kudin, CTO of Ukraine-based Information Security Center Ltd. In a blog post of his own, Kudin wrote, "I think [the study] is a very dangerous delusion. First of all, the difference between 639 and 586 is not big and can be easily explained."

Reached by Redmondmag.com for comment on the issue, Michael Greene, PC Tools' vice president of product strategy, said that it's not enough to just identify the presence of malware on systems, and that his company takes a "behavioral approach" to identifying what the real dangers are or could be. He added that this type of thinking is what prompted the research in the first place.

"Our thing is, don't take our word for law that it's malware," Greene said. "Run your scanners to look at the severity of what's happening. With our findings, what you see is how unique pieces of malware got on these machines in question and the question you ask is, 'How did they get through?' Look, everybody knows that Vista is more secure than XP, but the problem isn't solved."

Regardless of the arguments being volleyed back and forth, the "problem" is, indeed, not solved. Security experts agree that administrators need a layered approach, especially in light of other research findings regarding malware that spotlight different strains -- worms, Trojan horses, rootkits, spyware, malicious adware, grayware and certain bots, for starters.

In fact, one of the preliminary results from Symantec Corp.'s Internet Security Threat Report released last month suggests that "the release rate of malicious code and other unwanted programs may be exceeding that of legitimate software applications." Moreover, Finland-based anti-virus company F-Secure announced recently that as much malware was launched in 2007 as there was over the previous 20 years.

"I think what we know from the various reports that are out there is that there are threats," said Andrew Storms, director of IT security operations at San Francisco-based nCircle Network Security. "The question is not the degree of what's out there but what the actual risks are and how to mitigate them."

About the Author

Jabulani Leffall is an award-winning journalist whose work has appeared in the Financial Times of London, Investor's Business Daily, The Economist and CFO Magazine, among others.
