Cory Doctorow: Security in Numbers

Edward Snowden wasn’t the first person to leak information about US mass surveillance. The mass surveillance story has been unfolding since an AT&T technician called Mark Klein blew the whistle on the NSA in 2006, but the Snowden story is the first one that’s caught and held the public’s interest for more than a brief moment. I wish I knew why that was. I suspect that if you knew what made the Snowden leaks news for a year and more, you could use that knowledge to run the most successful political campaign of the century or found a global religion.

I know that, for me, the story has an incredibly compelling, lurching one-two rhythm. First, we learn about some new way in which the NSA and its allies have been invading our privacy on a breathtaking scale, say, by putting whole countries under surveillance. Then we learn about a new way in which the spies have sabotaged the security of some vital class of computers or networks. Ka-pow! Not only are you being spied upon in ways that make Orwell look like an optimist, but whatever tool you thought you could trust with your digital life has been compromised and has been abetting the surveillance. One-two.

But there’s good news in the Snowden story and its longevity. The wider public seems to finally give a damn about security and privacy, topics that had been hopelessly esoteric and nerdy until this moment. That makes a huge difference in all kinds of policy questions. Back when AT&T and T-Mobile were considering their merger, the digital policy people I knew talked about how the new megacompany would be an irresistible target for spies, with a bird’s-eye view of who you were, where you were, who you knew, and what you did with them, but this argument got almost zero play on the wider stage. Back then, talking about how cops and spies might view a telecoms merger as a surveillance opportunity made you sound like a swivel-eyed paranoid loon. Today – post-Snowden – it makes you sound like someone who’s been paying attention.

At last, people who aren’t computer experts are starting to worry about the security of computers. It’s a glorious day, seriously. Finally, there’s a group of people who aren’t computer experts who want to use security tools like GPG (for scrambling e-mail) and TOR (for adding privacy to your network use) and OTR (for having private chats) and even TAILS (boot up a computer with this operating system and be sure that it’s not running any spyware, that your communications are private, and that whatever you do will be scrubbed when you turn the computer off again).

That’s outstanding news. If normal people are using this stuff, it’ll start to get user interfaces that are comprehensible to normal people – interfaces that don’t assume a high degree of technical knowledge. There’s a certain view that the reason these tools tend to be complex is that security is Just Hard, which may be so, but it didn’t help that everyone who knew enough to care about technological privacy measures was also someone who understood technology well enough to get past a clunky interface.

If you’re just getting to this stuff, welcome. Seriously. We need everyone to be worried about this stuff, and not just because it will help us get governments to put a leash on the spies. More important is the fact that security isn’t an individual matter.

A really good way to understand this is to think about e-mail. Like many long-time Internet users, I was suspicious of Google’s Gmail and decided that I’d much rather host my own e-mail server and download all my incoming mail to my laptop, which is with me most of the time (I also keep a backup or two, in case I lose my laptop). But over time, lots of other people started using Gmail, including a large slice of the people I correspond with. And they don’t host their own e-mail. They don’t pull their mail off the server and move it to a computer that’s with them at all times. They use Gmail, like a normal person, and that means that a huge slice of that ‘‘private’’ e-mail I send and receive is sitting on Google’s servers, which are pretty well maintained, but are also available for mass surveillance through NSA programs like Prism.

Effectively, that means that I’m a Gmail user too, even though I pay to host and maintain my own mail server. This is a point that was well made by Benjamin Mako Hill in an essay in May, at mako.cc/copyrighteous/google-has-most-of-my-e-mail-because-it-has-all-of-yours, which introduced some research he’d done, mining the e-mail in his In and Out boxes to see how much of it had transited Google’s servers – it turns out that about two-thirds of the mail he sends ends up in Google’s (and therefore, potentially, the NSA’s) hands.
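
If you’re curious about your own exposure, the same kind of audit is easy to rough out. Here’s a hypothetical sketch in Python – not Mako’s actual code, and the mailbox path is made up – that walks a local mbox archive and counts the messages whose headers show they were addressed to, or passed through, Google:

```python
# Rough sketch (not Mako Hill's actual method): estimate how much of a local
# mail archive touched Google's servers, judging by Received and address headers.
import mailbox
import os

GOOGLE_HINTS = ("google.com", "googlemail.com", "gmail.com")

path = os.path.expanduser("~/Mail/archive.mbox")  # hypothetical mbox archive
box = mailbox.mbox(path)

total = via_google = 0
for msg in box:
    total += 1
    # Received headers record the servers a message transited.
    received = " ".join(msg.get_all("Received") or []).lower()
    # To/Cc/From addresses catch mail sent to or from Gmail users directly.
    addresses = " ".join(
        str(v) for v in (msg.get("To"), msg.get("Cc"), msg.get("From")) if v
    ).lower()
    if any(hint in received or hint in addresses for hint in GOOGLE_HINTS):
        via_google += 1

if total:
    print(f"{via_google}/{total} messages ({via_google / total:.0%}) touched Google")
```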

For me to be secure against a raid on Google’s servers, I have to convince you to take action. Ideally, we’d all host our own mail – and hell, we’d even reform the weird, old Electronic Communications Privacy Act of 1986 that lets the cops make a warrantless request for any file that’s more than six months old. But even though that day is a long way off, there are still things we can do today to protect our privacy, if we do them together.

Take GPG, the e-mail privacy tool I mentioned a few paragraphs back. If we both use GPG to encrypt our e-mail, the NSA can’t read it anymore, even if it’s sitting on Gmail’s servers. They can still see that we’re talking to each other, who else is CC’ed, and where we are when we send the e-mail (by tracing our IP addresses), but the actual payload is secure. For modern messaging, the same logic applies: throw away technologies that are proprietary (and should thus be presumed to have something wrong with them – if no one is allowed to see how they work, there’s a pretty good chance the company that made them is kidding itself about how secure they are), technologies that are known to be insecure, and technologies that are known to be compromised (like Skype, which is the electronic equivalent of wearing a CCTV camera that feeds directly to the NSA), and we’re left with stuff like OTR, which actually works. With OTR, there aren’t even subject lines, CCs, or IP addresses to data-mine.
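
To make that trade-off concrete, here’s a minimal sketch using the third-party python-gnupg wrapper around GPG. The addresses are made up, and it assumes GPG is installed and the recipient’s public key is already in your keyring; the point is that only the body gets scrambled, while the headers that carry the metadata still travel in the clear:

```python
# Minimal illustration (hypothetical addresses): GPG protects the payload,
# but the From/To/Cc/Subject headers remain readable by whoever handles the mail.
import gnupg
from email.message import EmailMessage

gpg = gnupg.GPG()  # assumes the gpg binary and a keyring are already set up

body = "Meet me at the usual place at noon."
encrypted = gpg.encrypt(body, "alice@example.com")  # needs Alice's public key
assert encrypted.ok, encrypted.status

msg = EmailMessage()
msg["From"] = "bob@example.com"    # visible to Gmail, and to anyone watching Gmail
msg["To"] = "alice@example.com"    # visible
msg["Cc"] = "carol@example.com"    # visible
msg["Subject"] = "lunch?"          # visible -- GPG doesn't protect the subject line
msg.set_content(str(encrypted))    # the payload itself is ASCII-armored ciphertext

print(msg)
```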

The fact that security can’t be an individual matter isn’t surprising when you think about it. Road safety is collective, too: it doesn’t matter how defensively you drive if everyone else on the road is a lunatic. So is health security: as the anti-vaccination movement has shown us, without herd immunity, we’re all at risk. Even society itself can be thought of as a collective security exercise: through legitimate laws made by legitimate governments, we set out the rules, the administrative systems, and the punishments by which we’ll all be secure.

For example, the Framers of the US Constitution tacked on a whole Bill of Rights full of security measures that would keep people safe from their governments. There are ideas like the Fourth Amendment:

The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.

These set out an unambiguous way in which we – the people – collectively opt to keep ourselves secure from abuses of authority. Now that the word about electronic security and privacy has started to get around, maybe we can get the NSA to start obeying the law.


Cory Doctorow is the author of Walkaway, Little Brother, and Information Doesn’t Want to Be Free (among many others); he is the co-owner of Boing Boing, a special consultant to the Electronic Frontier Foundation, a visiting professor of Computer Science at the Open University and an MIT Media Lab Research Affiliate.


From the July 2014 issue of Locus Magazine
