Many malicious URLs are now invisible to URL filters and antivirus software, says M86 Security.
The web security company pitted three leading but unnamed antivirus products against 15,000 malicious URLs and found that only 39 per cent were successfully blocked.
The news was even worse when a second set of 15,000 malicious URLs was run against a leading URL filter list: only 444, or around three per cent, were correctly identified as malicious.
Astoundingly, of the 15,000 URLs filtered, 5,273 were not only missed as malicious but actually tagged as legitimate.
A further 9,283 of the links were not categorised either way, which means that a filtering system might or might not have blocked them, depending on whether it was configured to allow or block unknown sites by default.
M86 Security stressed that the tests were legitimate.
The URL lists were compiled from links they had discovered to be malicious using their own technology, mixed with others culled from Google search analysis as well as links identified by third parties.
Their own technology identified all the links without error, the company claimed.
"Even though URL Filters now check for more than 22 million malware signatures, seven times the number in 2004, websites are still no safer as malware and web 2.0 threats increase at least as quickly," said M86 Security's Bradley Anstis.
Why were so few malicious links correctly identified? According to M86 Security, this is probably because they were legitimate websites that had been hijacked by malware in a way that would not be obvious to a simple reputation-based filter list.
This is plausible. A limitation of crude list-based filtering is that it assumes legitimate sites are safe to visit; whether it catches sites that have since been compromised depends on how often the list data is updated.
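The two weaknesses described above can be sketched in a few lines. This is a deliberately naive illustration, not M86's or any named vendor's actual implementation; the host names and the `block_uncategorised` switch are hypothetical, chosen to show why a hijacked "legitimate" site sails through until the list is refreshed, and why uncategorised links depend entirely on the deployment's default policy.

```python
# Illustrative sketch only (hypothetical hosts, not a real product's logic):
# a naive reputation-list URL filter with a configurable default action
# for uncategorised URLs.
from urllib.parse import urlparse

BLOCKLIST = {"known-bad.example"}       # hosts categorised as malicious
ALLOWLIST = {"trusted-news.example"}    # hosts categorised as legitimate

def filter_url(url: str, block_uncategorised: bool = False) -> str:
    """Return 'block' or 'allow' based purely on host reputation."""
    host = urlparse(url).hostname or ""
    if host in BLOCKLIST:
        return "block"
    if host in ALLOWLIST:
        # A legitimate site hijacked after the last list update still
        # passes here -- the weakness the article describes.
        return "allow"
    # Uncategorised host: the deployment's default policy decides.
    return "block" if block_uncategorised else "allow"

print(filter_url("http://known-bad.example/page"))                # block
print(filter_url("http://brand-new-malware.example/page"))        # allow
print(filter_url("http://brand-new-malware.example/page", True))  # block
```

The last two calls make the article's point about the 9,283 uncategorised links concrete: the same unknown URL is allowed or blocked depending solely on how the filter was set up.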
Contentiously, M86 Security also reports the opposite phenomenon: false positives that blacklist sites no longer infected with malware, using a named example to demonstrate the ineffectiveness of rival products from McAfee, Websense and Blue Coat.
As might be expected, M86 Security trumpets its own technology, which performs a layered analysis of web pages to ensure that they are correctly identified.
It would be more accurate to say that the relative effectiveness of URL filtering systems has yet to be comprehensively tested by an independent authority; the testing industry remains fixated on pitting static malware samples against scanners, using a methodology that has barely changed in two decades.