Monday, August 11, 2008

The Security Boondoggle

I've been having to think about security at work recently (never good for my mood), and although I rarely want to blog about Computer Science, I ran into something too funny, and too emblematic of typical security problems, not to break the rule this time.

First, I read the MIT presentation about subway hacking, which is in itself hilariously funny and very much worth a read. Go ahead and read it and then come back --- the rest of this post assumes that you have.

Well, I happen to know people who used to work on that kind of transportation security system, so I sent them e-mail to tease them about the security work. Here's a response (names and details redacted):
[they hired] a security guy who guards all the encryption code zealously. I mean... he was quite the nazi and because of his position, he lets people know it. Everyone who wanted to work on the encryption code for XXX subways had to go through him.

One day, the worst programmer I've ever known (although he claimed he invented the keyboard) was assigned to debug an issue on the fare cards while the encryption god was out of town. Well, he basically reverse engineered the encryption code by manually trying everything until it worked. Took him a week but he did it. That scared the shit out of XXX because he was quite possibly a sanitation engineer who pretended to type on the keyboard.

Anyway, these MIT kids need to take a lesson from some of those tricksters in XXX. The most creative ones know exactly where to crease a magnetic stripe so that the fare card will give unlimited rides. This is without the benefit of any technology. Another one would manually tape several cards over each other to create a super ride card. Of course, there are the ones that just bring a gun and a bat and shoot the machine until they can get in. Those ones are much less creative.


That description of the security Nazi unfortunately matches my experience with computer systems in general --- when systems designers think about security, they immediately think of complex crypto systems, encrypting everything everywhere, and in general making life difficult for the legitimate user. In reality, most security attacks go after the weakest link --- the social engineering approach, or the physical system. So your most valuable security people aren't the guys with PhDs in cryptography, but your UI designers and engineers. If security makes a system painful enough to use, users will actively find a way to defeat it. (For instance, if I buy a computer game, I usually end up finding and installing a pirated version anyway, because the user experience is better!)

A few years ago, Eric Rescorla gave a talk at Google entitled The Internet is Already Too Secure. It was a great talk, and it makes the very important point that it's too easy to get academic respectability for designing and implementing complex crypto systems. What's really hard is designing easy-to-use systems that achieve widespread adoption and success (like ssh), with good-enough security that the rest of the system is the weakest link. But whenever I talk to security experts, that's never what I hear. It's always about making life hard for the legitimate user!

Consider this story about two payment systems: one was much cheaper than the other, but required additional input from the user to verify security. The other was much more expensive, but required no work from the user, and hence was much less secure. Both were widely available at all points of sale. The higher-security system had next to no fraud. But the lower-security, more expensive system was much more popular. Its maker made so much more money than the other system that it more than paid for reimbursing merchants for fraudulent use. In case you haven't figured it out, the lower-security system is the Visa/Mastercard payment system, and the higher-security system is the PIN-required debit card. Convenience, and making things easy for the legitimate user, should trump all security concerns. If that's not among your design goals, you've already screwed up big time, and it doesn't matter how much security you put in: commercial success will be out of your reach, so you'll never have any security problems to worry about.
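The trade-off in that story can be sketched with a few lines of arithmetic. All the numbers below are invented purely for illustration (the post gives no real figures): the point is only that if convenience drives enough extra transaction volume, the fees can dwarf the cost of reimbursing fraud.

```python
# Hypothetical sketch of the credit-card vs PIN-debit trade-off.
# Every number here is made up; only the shape of the argument matters.

def net_revenue(transactions, fee_per_txn, fraud_rate, avg_fraud_cost):
    """Network revenue after reimbursing merchants for fraudulent use."""
    fees = transactions * fee_per_txn
    fraud_losses = transactions * fraud_rate * avg_fraud_cost
    return fees - fraud_losses

# Convenient, low-security system: lots of volume, some fraud to eat.
convenient = net_revenue(transactions=10_000_000, fee_per_txn=0.50,
                         fraud_rate=0.001, avg_fraud_cost=100.0)

# Secure, high-friction system: almost no fraud, but far fewer users.
secure = net_revenue(transactions=2_000_000, fee_per_txn=0.50,
                     fraud_rate=0.00001, avg_fraud_cost=100.0)

print(convenient)  # 4000000.0 -- fees dwarf the fraud reimbursements
print(secure)      # 998000.0 -- near-zero fraud, but lost volume costs more
```

With these (made-up) inputs, the low-security network absorbs a thousand times more fraud and still comes out four times ahead, which is the shape of the Visa/Mastercard outcome described above.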

And for those who are wondering, the Munich MVV system uses the least secure method of all --- the honor system. Unless you get caught fairly often, it's actually cheaper to skip the difficult-to-use ticket machines entirely. In the time I've been in Munich, I've only been checked once (yes, I had a ticket when I was checked) --- but the system still works: when the ticket inspectors came through, not one person on my incredibly crowded train was a cheater. My guess is that a more complex security model would have cost the MVV money rather than saved any. In that sense, more security is just a tax on legitimate users, rather than helping anyone at all.
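The honor-system arithmetic can be made explicit. Using hypothetical numbers (neither the ticket price, the fine, nor the inspection rate here are real MVV figures), fare-dodging stays cheaper than paying until inspections become frequent enough that the expected fine exceeds the fare:

```python
# Hypothetical honor-system arithmetic; the ticket price, fine, and
# inspection rate below are invented, not actual MVV figures.

def expected_cost_per_ride(ticket_price, fine, inspection_rate, cheat):
    """Expected cost of one ride, honest vs fare-dodging."""
    if cheat:
        return inspection_rate * fine  # you only pay when you're caught
    return ticket_price

honest = expected_cost_per_ride(2.20, 40.0, 0.01, cheat=False)  # 2.20 per ride
dodger = expected_cost_per_ride(2.20, 40.0, 0.01, cheat=True)   # ~0.40 per ride

# Cheating stops paying off once inspection_rate * fine >= ticket_price,
# i.e. at an inspection rate of ticket_price / fine.
break_even = 2.20 / 40.0  # ~5.5% of rides inspected
```

The transit operator's lever is thus the inspection rate and the fine, not gates or crypto; it only needs to push the expected cost of cheating above the fare, which is presumably why the honor system is cheap enough to work.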