Google Reports on Web-based Dangers
Google is taking giant steps toward keeping its users away from dangerous sites. I’ve long advocated that Google implement some kind of website filter similar to what Site Advisor provides, and it turns out Google’s bots are already doing a lot more than you might think.
This week Google published a report detailing the dangers found on websites. Many blogs are reporting that Google says 10% of websites are dangerous, but that headline only scratches the surface. I recommend reading the entire report, which can be downloaded at http://www.usenix.org/events/hotbots07/tech/full_papers/provos/provos.pdf
(Thanks to Vic Laurie for the tip.)
One focus of the report is how frequently malicious binary files change in order to defeat signature detection by traditional virus scanners. Signature-based scanners are great for cleaning up your system, but it should be no surprise that I also recommend installing an event-based detection program like WinPatrol.
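To see why a constantly changing binary defeats a signature scanner, here's a minimal sketch of hash-based signature matching. The signature database, file path, and detection name are all hypothetical, and I'm using a SHA-256 digest as the "signature" purely for illustration.

import hashlib

# Hypothetical signature database: digest -> detection name.
# (This digest is actually the SHA-256 of an empty file, used as a stand-in.)
KNOWN_BAD = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855": "Trojan.Example",
}

def sha256_of(path):
    """Hash a file's exact bytes."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def scan(path):
    """Return a detection name, or None if no signature matches."""
    return KNOWN_BAD.get(sha256_of(path))

Changing even one byte of the binary yields a different digest, so every repacked variant slips past until a new signature ships. That gap is exactly what event-based monitoring is meant to cover.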
Many of the exploits described in the report depend on JavaScript or VBScript, which can easily be inserted into HTML fragments. These code segments often end up in web bulletin boards, blogs, and advertisements. I'm a big fan of JavaScript, so I'm pleased the report stopped short of recommending that users disable scripting.
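As a rough illustration of how a script fragment hides inside an otherwise harmless post, here is a sketch of a hypothetical bulletin-board comment and a naive server-side check. The comment, the URL, and the blocked-tag list are all made up; real filtering is considerably more involved than this.

import re

# A comment a visitor might submit: the text looks innocent, but the
# <script> tag would run in the browser of everyone who views the page.
user_comment = (
    "<p>Great post!</p>"
    '<script src="http://example.com/exploit.js"></script>'
)

# Naive check for tags commonly abused to inject code or hidden content.
SUSPICIOUS = re.compile(r"<\s*(script|iframe|object|embed)\b", re.IGNORECASE)

def looks_suspicious(fragment):
    return bool(SUSPICIOUS.search(fragment))

print(looks_suspicious(user_comment))  # True: strip or reject before publishing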
This is the second technical white paper I’ve read this month from Google, and I’m pleased they’re sharing their research with all of us.