Friday, August 24, 2007

Define... "securely transferred"

Here's something for you folks out there to ponder, and I'll give my take on it as well, but first I want to pose the scenario -- and offer a chance to think it over and maybe reply publicly if you're daring...

Scenario:
You're a financial company -- or rather, you work for one. You have a vested interest in protecting your clients' data, whether it be cardholder information, investor information, or banker information... it's all critical and sensitive. Now say you work with a 'partner' (or vendor) who will do something with some portion of your customer data records. To make it more concrete, let's assume this vendor provides outsourced "rewards redemption" for a line of credit cards you offer... "Pet Points," as an example. So if I own a "Pet Points" card you issue, and I want to redeem my points for a spa treatment for Fifi, I dial a 1-800 number and get your vendor-partner, whose CSRs use the data you have about me to let me redeem all those hard-earned "points". Of course, the data the vendor-partner has to have about me includes my card number, expiration, home address, name, and maybe some other morsels of information too.

Now, this vendor-partner -- let's call them Partner X for brevity -- has to have this information sent to them in a flat-file so they can load it into their system as a nightly batch job (standard for financial systems these days). This flat-file, as you would imagine, is brutal if it falls into the wrong hands, yet your partner tells you they only support "in-transit" encryption, and that nothing like PGP is supported because "it is too complex and difficult to support". What do you do?

Allow me to break this down for you:
  • Sensitive cardholder information in a flat-file
  • Flat-file sent over to a 3rd party
  • Link is encrypted "edge to edge" (meaning, router to router, or firewall to firewall)
  • Flat-file encryption is not supported by your vendor
So ask yourself... "Self... what do I do?"

This is an egregious act of negligence. I'll tell you why. Feel free to disagree.
  • First, the argument that the data is "encrypted on the way over" is crap. PVCs, VPNs, even private copper (very rare) are still only part of the puzzle. That data is exposed the second it drops out of the encrypted tunnel
  • Next, how much do you trust your internal employees? If you're intelligent, the answer is "very little" -- cardholder data should never be stored unencrypted, even on "internal" systems
  • Additionally, as the client, I have the right to tell my vendor how/where/when I want data secured -- if you're a vendor telling me you won't support something I feel is fundamental, I'll find someone who will
I think I'd like to take that first point a step further. Data in motion is typically encrypted, which is great. The problem is where that data is de-tunneled, or decrypted. Are the systems that handle the data in a DMZ? Are those systems on the internal network, where they're accessible to all your employees? Can I "plug" wirelessly into your network and possibly see this data? If the answer to any of those questions is yes, then you have a problem. That data is not secure. So you see, it's great that the tunnel you've established encrypts data as it passes from my firewall to yours, but the system that receives the data... the one that was hacked a month ago by someone from Elbonia... that's still insecure. You're still facing at least the loss of my business, and at worst a lawsuit that'll put you out of business.
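And before someone tells me again that file-level encryption is "too complex and difficult to support" -- here's roughly how little it takes. This is just a minimal sketch, assuming the partner publishes a PGP/GPG public key and the box that builds the batch file has the gpg command-line tool; the file path and recipient key ID below are made up for illustration:

```python
# Minimal sketch: encrypt the nightly flat-file with the partner's public key
# before it is handed to the transfer job. Assumes the gpg CLI is installed
# and the partner's key has been imported; path and key ID are hypothetical.
import subprocess
import sys

FLAT_FILE = "/batch/outbound/rewards_20070824.txt"   # hypothetical path
RECIPIENT = "partner-x@example.com"                   # hypothetical key ID

def encrypt_for_partner(path: str, recipient: str) -> str:
    """Produce path + '.gpg', readable only by the holder of the private key."""
    result = subprocess.run(
        ["gpg", "--batch", "--yes", "--encrypt",
         "--recipient", recipient, path],
        capture_output=True, text=True,
    )
    if result.returncode != 0:
        sys.exit(f"gpg failed: {result.stderr.strip()}")
    return path + ".gpg"

if __name__ == "__main__":
    encrypted = encrypt_for_partner(FLAT_FILE, RECIPIENT)
    print(f"Send {encrypted}; the plaintext never leaves your side.")
```

The point being: the flat-file stays protected at rest on both ends and at every hop in between, no matter what happens to the tunnel or to the server it lands on.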

Thursday, August 2, 2007

How do you help someone who won't help themselves?

So as I think about consumer protection, I recall an old parable my Sunday school teacher told us. I'll give you the abridged version. Basically, a guy was stuck on a rooftop in a flood and asked God for assistance. A few hours later a guy in a boat came by to rescue him, but he sent him away, saying God was going to rescue him. It happens two more times -- another boat and then a helicopter -- then the guy drowns and asks God why He didn't save him. God's reply is simple: I sent you two guys in boats and even a helicopter to rescue you... all you had to do was let me help you. The moral of the story: you need to let yourself be helped when you're in over your head, literally and figuratively. This is the reality we security professionals live in: consumers who won't let us protect them. So often the worst enemy of the consumer is... you guessed it... the consumer.


Many of you know exactly where I'm going with this. Consumers expect, nay, demand to be protected online when making purchases, reserving their vacation tickets, or buying grandma's birthday present, but it seems rare to find one who is willing to do anything about it. I get marketing people in my ear every day telling me I can't make people use 'stronger' passwords because then they won't use the application or site. I can't make a partner site (which potentially has financial data in it) require more than an email address as a UserID and your ZIP code as your password... require anything more and... get this... the consumer will go to a competitor who allows easier access. If you're reading this blog, and this sounds like a recent project you've heard me wail about... yes, I work with you :-) Some days I'm tempted to put my foot down and say, "Fine, let them go to the competition; but when their accounts are empty because someone guessed their idiotically simple password, we can say we told you so!"
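And when I say 'stronger' passwords, I'm not asking for the moon. Even a bare-minimum policy check -- sketched below with rules and thresholds I'm inventing purely for illustration, not as a complete policy -- would throw out the ZIP-code-as-password case:

```python
# Bare-minimum password policy sketch; the rules and thresholds here are
# illustrative only, not a recommendation of a complete policy.
import re

def acceptable_password(password: str, user_id: str) -> bool:
    if len(password) < 8:                      # too short
        return False
    if password.isdigit():                     # pure digits (e.g. a ZIP code)
        return False
    if password.lower() in user_id.lower():    # contained in the user ID
        return False
    # Require some character variety: at least 3 of the 4 classes below.
    classes = sum(bool(re.search(p, password))
                  for p in (r"[a-z]", r"[A-Z]", r"[0-9]", r"[^A-Za-z0-9]"))
    return classes >= 3

# e.g. acceptable_password("90210", "jane@example.com") -> False
```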

Before I get too far off on a rant about my marketing folks (sorry, you're such easy targets), I need to make my point. Consumers won't let us, as security professionals, protect them in the obvious ways. So we have to do things the sneaky way. We have to write filters, scripts, and other behind-the-scenes measures that keep them safer without letting them know we're doing it. This drives me bonkers... what about you? Sure, one-time passwords via RSA token aren't the end-all, and can still be tricked via man-in-the-middle or skimming attacks (session riding), but at that point we would significantly up the ante -- we would force the 'bad guys' to work that much harder for their stolen money.
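For the curious, there's no magic inside those tokens, either. Here's a rough sketch of an HMAC-based one-time password (the HOTP scheme from RFC 4226), just to show how simple the idea is -- the shared secret below is obviously made up, and a real deployment provisions one per token:

```python
# Rough sketch of an HMAC-based one-time password (HOTP, RFC 4226).
# The shared secret is illustrative; a real deployment provisions it per token.
import hmac
import hashlib
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """Derive a short one-time code from a shared secret and a counter."""
    msg = struct.pack(">Q", counter)                        # 8-byte counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                              # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# Both sides hold the secret and a moving counter (or the clock, in the
# time-based variants), so a code that has been used is worthless afterward.
print(hotp(b"hypothetical-shared-secret", counter=1))
```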

So - I have to ask the consumer: what is wrong with you people? I feel like I can answer myself... complexity is bad, but there has to be a happy medium... somewhere. If any one of you readers (however many I have) has ideas, let's discuss... maybe we can get an open forum going? I'd love to hear people from across the industry present ideas, and maybe we can creatively solve this problem together. Maybe education is part of the answer, along with an industry-wide 'mandate' or (dare I say it) another compliance policy requiring something more 'complex' than a simple userID and password?

I think I can safely say, and not get too many blank stares, that the userID/password is dead for high-risk use. There has to be a better way, but unless consumers realize that this is a "takes two to tango" scenario... we're screwed.