
An idea that never materialized

Warning: This happens to be an off-topic post, sort of a rant.

It's too bad that we didn't see the potential to develop something similar here at Offensive Computing, even with just a bare idea and some small support from one guy who said it would be nice to have a standard for testing AVs.

Today it's here:
http://www.securityfocus.com/news/11502

Just felt like, aww, that's something I thought of, and it never materialized ;)

Cheers :)
Kish

Yeah, it's a cool idea.

I spent a bit of time working on it and basically ran out of steam. It's a hard problem which needs a collaborative effort to succeed, I think.

It's something you want to be sure to get right.

V.

The only thing that strikes

The only thing that strikes me as creepy about that article is that the people who are making the money are funding the testing standards. Really it should be some sort of outside organization. Hispasec (the VirusTotal guys) would be a good start, but since I'm not really sure what their business model is, I don't know if it should actually be followed through with.

AV wants to make sure they don't get screwed, the rest of us want to expose the great lie of the antivirus industry. We have no time / money, the AV industry has lots of both. So they win. It sort of sucks.

Nice one DannyQuist

You nailed my point danny...

"AV wants to make sure they don't get screwed, the rest of us want to expose the great lie of the antivirus industry"

"We have no time / money, the AV industry has lots of both. So they win. It sort of sucks"

Makes a lot of sense, you said it right ! :)

But how long can we let these people stay on top with their lies? We should probably come up with an idea, or promote n.runs parsesafe ;) to protect the antivirus...

And as always, Thanks for your support V

Cheers :)
Kish

--
Remember, there is always someone out there who knows more than us.

I don't agree. Money is not

I don't agree. Money is not required to make antivirus comparatives. Just look at VirusP; he is a good example of this. He has been doing antivirus comparatives for years, and his results have been published in several computer magazines.

I think the real problem is getting enough samples to make reliable antivirus comparatives. Nowadays any antivirus comparative using fewer than 1 million malware files is not worthwhile. Of course, a detection-ratio comparison is not the only thing to test. Things that no antivirus comparative should miss:

* Resident protection test

* Disinfecting an infected system: viruses and rootkits especially, and also cleaning malware registry keys.

So money is not the problem at all. The problems lie elsewhere: getting a malware collection, and the time and patience to do the tests. That's all.

Here is the reason

There are a lot of things to test. For example, how useful is it really to simply test detection rates? The reason most tests aren't embraced by the AV industry, and aren't that useful, is that the tests are incomplete. What are you testing for? Here are some criteria that I think should be tested and analyzed:

- detection rates
- miss rates
- false positive rates
- system intrusiveness
- product's own security (will I get hacked via AV?)
- ease of mass deployment
- speed
- update frequency
- use of signatures vs other methods
- ability to clean
- ability to handle classes of malware
* rootkits, trojans, worms, backdoors, spyware
- ability to detect in realtime vs scan only
- price/price per seat

There's probably more stuff too. It will be a complex amalgamation of all these factors, depending on your individual needs, that will decide for you which AV is best. Simply saying "Kaspersky detects the most" really isn't that helpful, except maybe to the most naive consumer.
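To make that "amalgamation" concrete, here's a minimal sketch of one way to combine such criteria: normalize each measurement, then rank products by a weighted sum where the weights reflect your own priorities. All product names, scores, and weights below are made-up illustrations, not real test data.

```python
# A minimal sketch: score each AV product as a weighted sum of normalized
# criteria, with weights chosen to match your own needs.
# Every number and product name here is hypothetical.

def score(product, weights):
    """Weighted sum of a product's normalized (0.0 to 1.0) criterion scores."""
    return sum(weights[c] * product[c] for c in weights)

# Hypothetical normalized measurements (1.0 = best).
products = {
    "AV-A": {"detection": 0.95, "false_positives": 0.70, "speed": 0.60, "price": 0.40},
    "AV-B": {"detection": 0.85, "false_positives": 0.90, "speed": 0.90, "price": 0.80},
}

# A home user might weight price and speed heavily; an enterprise, detection.
home_weights = {"detection": 0.3, "false_positives": 0.2, "speed": 0.2, "price": 0.3}

ranked = sorted(products, key=lambda p: score(products[p], home_weights), reverse=True)
print(ranked)
```

With these particular weights the product with the lower raw detection rate comes out ahead, which is exactly the point: "detects the most" alone doesn't determine "best."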

V.

The decision that's always been made

I think there's also a widespread assumption that all AV products are equal on the technical front. Most often what I've seen is a large organization making a decision based on your last point, which is price.