Monday, October 6, 2014

To Reform and Institutionalize Research for Public Safety (and Security)

On October 3rd, 2014 a petition appeared on the Petitions.WhiteHouse.gov website titled "Unlocking public access to research on software safety through DMCA and CFAA reform". I encourage you to go read the text of the petition yourself.

While I believe that on the whole the CFAA and, more urgently, the DMCA need dramatic reform if not outright repeal, I'm just not sure I'm completely on board with where this idea is going. I've discussed my displeasure with the CFAA on a few of our recent podcasts, if you follow our Down the Security Rabbithole Podcast series, and I would likely throw a party if the DMCA were repealed tomorrow - but unlocking "research" broadly is dangerous.

There is no doubt in my mind that security research is critical to exposing safety and security issues in matters that affect the greater public good. However, let's not confuse legitimate research with thinly veiled extortion or a license to hack at will. We can all remember the incident Apple had where a hacker purportedly exposed a flaw in their online forums, then, to prove his point, exploited the vulnerability and extracted data on real users. All in the name of "research", right? I don't think so.

You see, what a recent conversation with Shawn Tuma taught me is that under the CFAA we have one of these "I'll know it when I see it" conditions, where a prosecuting attorney can choose either to go after someone or to look the other way if they believe that person was acting in good faith and for the public good... or some such. This type of power makes me uncomfortable, as it gives that prosecuting attorney way too much room. Room for what, you ask? How about room to be swayed by a big corporation... I'm looking at you, AT&T.

Let me lay out a scenario for you. Say you are a security professional interested in home automation and alarm systems. You purchase one and begin to conduct research into the types of vulnerabilities one of these things is open to - since you'll be installing it in your home and all. You uncover some major design flaws, and maybe even a way to remotely disable the home alarm feature on thousands of units across the country. You want to notify the company, get them to fix the issue, and maybe get a little by-line credit for it. Only the company slaps you with a DMCA lawsuit for reverse engineering their product, and you're in hot water. Clearly they have more money and attorneys than you do. Your choices are few - drop the research or face criminal prosecution. Odds are you're not even getting a choice.

In that scenario - it's clear that reforms are needed. Crystal clear, in fact.

The issue is that we need to protect legitimate research from prosecutorial malfeasance while still allowing for laws that protect intellectual property and a company's security. So you see, the issue isn't as simple as opening up research; it's much more subtle and deliberate.

How do we limit the law and protect legitimate research, while allowing for the protections companies still deserve? I think the answer lies in how we define a researcher. I propose that we require researchers to declare their research and its intent, and that we draft ethical guidelines which can be agreed upon (and enforced on both ends) between the researcher and the organization being researched. There must be rules of engagement, and rules for "responsible and coordinated disclosure". The laws must be tweaked so that a researcher who has declared intent and is following the rules of engagement is shielded from prosecution by a company that is simply trying to skirt responsibility for safety, privacy and security. Furthermore, there must be provisions for matters that affect the greater good - which companies simply cannot opt out of.

Now, if you ask me whether I believe this will happen any time soon, that's another matter entirely. Big companies will use their lobbying power to make sure this type of reform never happens, because it simply doesn't serve their self-interest. Having seen first-hand the inner workings of a large enterprise technology company, I know exactly how much profit is valued over security (or anything else, really). Profit now, and maybe no one will notice the big gaping holes later. That's just how it is in real life. And when public safety comes into play, I think we will see a few major incidents, with loss of life directly attributed to security flaws, before we see any sort of reform. Of course, when we do have serious incidents, they'll simply go after the hackers and shed any responsibility. That's just how these things work.

So in closing - I think there is a lot of work to be done here. First we need to more closely define security research and create a formal understanding of it. Once we're comfortable with that, we need to refine the CFAA, and maybe get rid of the DMCA, to legitimize security research into the areas that affect public safety, privacy and security.
