Friday, February 12, 2010

Comparing Apples, Oranges, and IEDs

[Before you start reading I would like to make sure you know that this post reflects my own personal thoughts, no one else's: not my employer's, my mother's, or any influential friend's... capisce?]


I read a Securosis post by Mike Rothman today that nailed it.  Absolutely brilliant... but I wanted to expand on it some because I think there is a little more that needs to be said.

The long and short of it is this: product reviews are dead.  The days of real, useful product reviews have gone the way of the honest politician.  Not only that, but publishing one is a great way to lose any credibility you have and potentially make enemies.

The sad fact, as Mike points out, is that there aren't any good independent review companies out there... (wait, are there?) I submit for consideration that NSS Labs, run by Rick Moy and Vik Phatak, has both the capability and the credibility to perform an "unbiased real-world comparison" of security software/hardware.  The problem is defining "real world" and making the result useful for anything beyond a baseline.

As I mentioned, "real world tests" are very hard to come by.  Here's why... (if you get it, skip ahead)...

First let me give you a completely non-IT example.  If you watch TV at all you've seen one of those really cool car commercials where the SUV is racing through the desert, or doing something you look at and think "I'm never going to do that".  Then the voice comes on to tell you that in a "real world comparison" their car got better gas mileage than the competition, or that "people preferred their soda 3 to 1"... you laugh at those, right?

Now consider the IT environment you work in: the hardware, software, techniques and nuances of your business.  Furthermore, consider how vastly different it is from your previous job.  Now... imagine trying to build something that would be a perfect fit for both environments.  If we're talking web application testing tools [my particular specialty] this becomes even more complicated, and you can see where things get out of control.  Some shops develop in .Net, some in Java, some in PHP and others in ColdFusion... then there's AJAX, Web Services, Flash, Flex, and that's just the easily defined stuff.  If your head isn't spinning, tell me how you do it.  There are too many variables to keep straight, much less to write a single piece of software that will "address them all".

Moving on.

Assuming that you trust the source of the report as independent and competent (a big leap), you can, on a good day, at best use it as a baseline.  Your deviation from that baseline directly impacts your mileage from said report.  The big question, once you've decided the source is both trustworthy and intelligent (again, a big leap these days...), is how much you can actually gain from reading a product comparison against some far-off battery of tests.  Consider this: if a WAF product is tested in a "real-world test" against a .Net application, how does that translate to my business, which is made up of RESTful Web Services built on a custom web platform?
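To make that last point a little more concrete, here's a minimal sketch (my own illustration, not something from Mike's post or from any lab's methodology) of how the exact same attack input shows up completely differently on the wire depending on the application stack.  The field names and application shapes below are hypothetical; the point is simply that a test suite or WAF signature exercised against one shape may never touch the parsing path the other one uses.

# Illustrative only: the same SQL injection payload, delivered the way a
# classic .Net form app would see it versus the way a JSON/REST service would.
# All field and endpoint names are made up for the example.
import json
from urllib.parse import urlencode

payload = "' OR '1'='1"

# Shape 1: form-encoded POST body, typical of an ASP.NET WebForms login page.
form_body = urlencode({"txtUsername": payload, "txtPassword": "x"})

# Shape 2: the same attack buried in a nested JSON document sent to a REST API.
json_body = json.dumps({"user": {"name": payload, "password": "x"}})

print("form-encoded:", form_body)   # txtUsername=%27+OR+%271%27%3D%271&txtPassword=x
print("json:        ", json_body)   # {"user": {"name": "' OR '1'='1", ...}}

# A signature or lab test tuned for the first shape can score perfectly and
# still say nothing about how the product handles the second one.

A lab that tested against the first shape produced a real result, but it only tells you how the product behaves against that shape.  That's what I mean by "baseline."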

Let me make it real simple - 9 out of 10 dentists agree - the answer is "who the hell knows?!"

Here's another angle to this that makes no sense to me.  Working for a web app security tools vendor [for full disclosure, vendor name is irrelevant] I have seen people read reports from independent 3rd parties and go into a mental meltdown.  Inevitably they see results that someone in a lab somewhere produced that don't match the evaluation they did for themselves... in their own environment.  Whiskey Tango Foxtrot.  Talking someone off a ledge after they've done this to themselves is no fun either, because they assume you're just another vendor trying to spin it your way.  And it's true: no matter who does the testing or what the results are, everyone will spin them their own way.  Please, folks, I urge you not to lose your heads... nothing trumps the evaluations you do yourself.  If the SUV you drive performs better for you than a competitor's commercial says it should, don't go out and buy the competing product just because they say so.

Finally, just a quick word on independence.  Look, boys and girls, no one is truly independent.  Everyone has an agenda, and everyone has personal biases acquired through personal or professional experience, friendships, whatever.  I would love to say I'm different, but I'm not... and you're exactly the same.

Philosophically, I would love to see each niche group in InfoSec go through a yearly "product validation" by a 3rd party that is truly independent and transparent in its methodologies and practices.  I think that, in the end, would result in higher quality products, less smoke and mirrors from sneaky sales folk, and a general increase in the security posture of everything... and why wouldn't we all want that?

1 comment:

Anonymous said...

Good points. The customer does require some objective third party opinion somehow.

Perhaps bake-offs by product category should be in a public setting with ground rules and test protocols the same for all vendors, who do the setting up themselves. The criteria/conditions for performance would be transparent. All of the tests would be performed for all tools in the same space concurrently. Vendors would have a set amount of time to submit a response regarding the final test results, which would be available to consumers the next day. Since the vendor was responsible for setup, if they screwed up, no one else would be to blame. These events could precede major security shows and be overseen by independent third party TEAMS of experts in each category.

How much would a vendor sweat at a show like RSA if their product tanked at the pre-show bake-off? Let the products do the talking, but under conditions that no one could make excuses about.
