Cory Doctorow: Zuck’s Empire of Oily Rags

Cory Doctorow (photo by Paula Mariel Salischiker)

For 20 years, privacy advocates have been sounding the alarm about commercial online surveillance, the way that companies gather deep dossiers on us to help marketers target us with ads. This pitch fell flat: by and large, people were skeptical of the efficacy of targeted advertising. The ads we got were rarely very persuasive, and when they did work, it was usually because the advertisers had figured out what we wanted and offered to sell it to us: people who’d previously shopped for a sofa saw ads for sofas, and if they bought a sofa, the ads persisted for a while because the ad-targeting systems weren’t smart enough to know that their services were no longer needed. But really, where was the harm in that? The worst-case scenario was that advertisers would waste their money on ads that had no effect, and the best-case scenario was that shopping would get somewhat more convenient as predictive algorithms made it easier for us to find the thing we were just about to look for.

Privacy advocates tried to explain that persuasion was just the tip of the iceberg. Commercial databases were juicy targets for spies and identity thieves, to say nothing of blackmail for people whose data-trails revealed socially risky sexual practices, religious beliefs, or political views.

Now we’re living through the techlash, and finally people are coming back to the privacy advocates, saying we were right all along; given enough surveillance, companies can sell us anything: Brexit, Trump, ethnic cleansing in Myanmar, and successful election bids for absolute bastards like Turkey’s Erdogan and Hungary’s Orban.

It’s great that the privacy-matters message is finally reaching a wider audience, and it’s exciting to think that we’re approaching a tipping point for indifference to privacy and surveillance.

But while the acknowledgment of the problem of Big Tech is most welcome, I am worried that the diagnosis is wrong.

The problem is that we’re confusing automated persuasion with automated targeting. Laughable lies about Brexit, Mexican rapists, and creeping Sharia law didn’t convince otherwise sensible people that up was down and the sky was green.

Rather, the sophisticated targeting systems available through Facebook, Google, Twitter, and other Big Tech ad platforms made it easy to find the racist, xenophobic, fearful, angry people who wanted to believe that foreigners were destroying their country while being bankrolled by George Soros.

Remember that elections are generally knife-edge affairs, even for politicians who’ve held their seats for decades with slim margins: 60% of the vote is an excellent win. Remember, too, that the winner in most races is “none of the above,” with huge numbers of voters sitting out the election. If even a small number of these non-voters can be motivated to show up at the polls, safe seats can be made contestable. In a tight race, having a cheap way to reach all the latent Klansmen in a district and quietly inform them that Donald J. Trump is their man is a game-changer.

Cambridge Analytica are like stage mentalists: they’re doing something labor-intensive and pretending that it’s something supernatural. A stage mentalist will train for years to learn to quickly memorize a deck of cards and then claim that they can name your card thanks to their psychic powers. You never see the unglamorous, unimpressive memorization practice. Cambridge Analytica use Facebook to find racist jerks, tell them to vote for Trump, and then claim they’ve discovered a mystical way to get otherwise sensible people to vote for maniacs.

This isn’t to say that persuasion is impossible. Automated disinformation campaigns can flood the channel with contradictory, seemingly plausible accounts for the current state of affairs, making it hard for a casual observer to make sense of events. Long-term repetition of a consistent narrative, even a manifestly unhinged one, can create doubt and find adherents – think of climate change denial, or George Soros conspiracies, or the anti-vaccine movement.

These are long, slow processes, though, that make tiny changes in public opinion over the course of years, and they work best when there are other conditions that support them – for example, fascist, xenophobic, and nativist movements that are the handmaidens of austerity and privation. When you don’t have enough for a long time, you’re ripe for messages blaming your neighbors for having deprived you of your fair share.

But we don’t need commercial surveillance to create angry mobs: Goebbels and Mao did it very well with analog techniques.

Facebook isn’t a mind-control ray. It’s a tool for finding people who possess uncommon, hard-to-locate traits, whether that’s “person thinking of buying a new refrigerator,” “person with the same rare disease as you,” or “person who might participate in a genocidal pogrom,” and then pitching them on a nice side-by-side or some tiki torches, while showing them social proof of the desirability of their course of action, in the form of other people (or bots) that are doing the same thing, so they feel like they’re part of a crowd.

Even if mind-control rays remain science fiction, Facebook and other commercial surveillance platforms are still worrisome, and not just because they allow people with extreme views to find each other. Gathering huge dossiers on everyone in the world is scary in and of itself: in Cambodia, the autocratic government uses Facebook to identify dissidents and subject them to arrest and torture; the US Customs and Border Protection service is using social media to find visitors to the US guilty by association, blocking them from entering the country based on their friends, affiliations and interests. Then there are the identity thieves, blackmailers, and con artists who use credit bureau data, leaked user data, and social media to ruin people’s lives. Finally, there are the hackers who supercharge their “social engineering” attacks by harvesting leaked personal information in order to effect convincing impersonations that trick their targets into revealing information that lets them break into sensitive networks.

It’s fashionable to treat the dysfunctions of social media as the result of the naivete of early technologists, who failed to foresee these outcomes. The truth is that the ability to build Facebook-like services is relatively common. What was rare was the moral recklessness necessary to go through with it.

The thing is, it’s always been obvious that by spying on internet users, you could improve the efficacy of advertising. That’s not so much because spying gives you fantastic insights into new ways to convince people to buy products as it is a tribute to just how ineffective marketing is. When an ad’s expected rate of success is well below one percent, doubling or tripling its efficacy still leaves you with a sub-one-percent conversion rate.

But it was also obvious from the start that amassing huge dossiers on everyone who used the internet could create real problems for all of society that would dwarf the minute gains these dossiers would realize for advertisers.

It’s as though Mark Zuckerberg woke up one morning and realized that the oily rags he’d been accumulating in his garage could be refined for an extremely low-grade, low-value crude oil. No one would pay very much for this oil, but there were a lot of oily rags, and provided no one asked him to pay for the inevitable horrific fires that would result from filling the world’s garages with oily rags, he could turn a tidy profit.

A decade later, everything is on fire, and we’re trying to tell Zuck and his friends that they’re going to need to pay for the damage and install the kinds of fire-suppression gear that anyone storing oily rags should have invested in from the beginning. The commercial surveillance industry is absolutely unwilling to contemplate anything of the sort.

That’s because dossiers on billions of people hold the power to wreak almost unimaginable harm, and yet, each dossier brings in just a few dollars a year. For commercial surveillance to be cost effective, it has to socialize all the risks associated with mass surveillance and privatize all the gains.

There’s an old-fashioned word for this: corruption. In corrupt systems, a few bad actors cost everyone else billions in order to bring in millions – the savings a factory can realize from dumping pollution in the water supply are much smaller than the costs we all bear from being poisoned by effluent. But the costs are widely diffused while the gains are tightly concentrated, so the beneficiaries of corruption can always outspend their victims to stay in the clear.

Facebook doesn’t have a mind-control problem, it has a corruption problem. Cambridge Analytica didn’t convince decent people to become racists; they convinced racists to become voters.


Cory Doctorow is the author of Walkaway, Little Brother, and Information Doesn’t Want to Be Free (among many others); he is the co-owner of Boing Boing, a special consultant to the Electronic Frontier Foundation, a visiting professor of Computer Science at the Open University and an MIT Media Lab Research Affiliate.


This article and more like it in the July 2018 issue of Locus.


13 thoughts on “Cory Doctorow: Zuck’s Empire of Oily Rags”

  • July 4, 2018 at 7:18 pm

    Cory,

    It seems to me that you are making a political point here that, to my mind, discredits your underlying and valid point about privacy and corrupt business models.

    I strongly suggest that you de-conflate different things. You do not need to state partisan political positions (e.g. anti-Brexit, anti-Trump, implicit accusations that people who disagree with you on political points are racists, etc.), which will alienate very many of those who might otherwise have been able to agree with you on the more substantive and apposite issues about corrupt, privacy-destroying business models.

    In effect, as things stand, you are complaining about anti-privacy/corrupt business models because you think they resulted in political decisions of which you disapprove. That’s not a good basis for criticism, especially when there are far more objective points on which to criticise business models that depend on abuse of people’s supposedly private data.

    Take the partisan politics out of your criticisms and be objective. As things stand, you are, in very many people’s eyes, damaging the point that you are apparently setting out to make.

    • August 5, 2018 at 12:40 pm

      Agree. Unfortunately Mr Doctorow has fallen thoughtlessly into the shallow, name-calling, populist gestalt of the Obama era. Very sad and also surprising. His own examples are loaded with divisive, inflammatory terms. I wonder how he cannot see that social media has been widely used to encourage racial hatred and social division, and to promote this horrible lack of respect in our discourse.

    • November 29, 2020 at 6:56 am

      Yes, by all means, let’s not call out racist, science-denying imbeciles for fear of alienating the racist, science-denying imbeciles.

  • July 5, 2018 at 3:45 am

    Cory,

    You are my favorite writer but I have to disagree with you on this one. I fully agree that CA and the services they provide go beyond the digital marketing schemes of Google and Facebook. I agree that the mining of our personal data shared on their platforms is intrusive and dangerous for just the reasons you specified. I am in complete agreement with you there. However, I think your political leanings and zeal for activism have warped your opinion.

    There were not significantly more racists voting in the 2016 election. The 2016 election shaped up the way it did because those living outside the big cities didn’t agree with the direction the country was going the way those within them did. Trump, who for all intents and purposes is a Democrat, spouts the same rhetoric the party has used as far back as Reagan. Because his opponent targeted the cities, he had to reach the rest of the country. However, the results are the same, short of a few bones that had to be thrown to the Republicans for supporting him. CA may have helped reach out to those voters, but to say it was because of racism is completely wrong. Most people, regardless of political leaning, don’t trust the Clintons. She may have been the most qualified, but not enough to overcome America’s lack of confidence in her integrity. No amount of identity politics and fanning the fires of our differences can change that.

    Keep up the good work. Regardless of politics, I am with you in the fight to secure our digital privacy.

    Kurt.

  • July 6, 2018 at 7:10 am

    LOL, this whole article is your safe, milquetoast political opinion. Combine that with your photo and be surprised if anyone takes this article, or you, seriously.

  • July 6, 2018 at 7:18 am

    “Socially risky sexual practices” ? Did you mean sexual orientation, or sleeping with a married person? If you meant the former, please just say so, there’s no need for strangely euphemistic language. If you meant the latter, how many people do you expect are putting their social lives at risk with their sexual practices?

    • July 10, 2018 at 3:57 am

      Could mean either; it will depend on the country and the social groupings concerned. The social risk varies with the society, and Facebook is in (almost) all of them.

    • November 17, 2018 at 5:02 pm

      Dick pics, he means dick pics. Not to mention sexual assault, harassment, public humiliation/objectification: your garden-variety sexual predation/fuckery. Plenty of high-profile examples, including elected officials, as of late; take your pick.

  • July 6, 2018 at 8:08 am

    Would have been a great article if you hadn’t put your bias front and center in the first paragraph. Just another “I told you so” sycophant write-up.

  • July 6, 2018 at 1:06 pm

    Zuckerberg put a crack in society in 2012 when he flipped the hate switch on while trying to monetize Facebook. One day we woke up and saw hundreds of comments from people we didn’t know! Every shared article was full of random strangers arguing politics and religion and race. Because threaded chat rooms are unnatural forums for communication between two people, much less hundreds, our decorum went out the window. At that point, 1.5 billion people were getting their news and impression of the world from Facebook, which was a pretty cool tool invented by some well-meaning college kids. But suddenly the rug was pulled out from beneath us and the wave of hate spread. You can see where that led. All he has to do is flip that switch off; a few people would grumble, and then we’d get back to not seeing everyone’s dirty laundry in our faces when we’re just trying to connect with friends. Now, 6 years later, that crack he put in society is a full-on chasm. One man and one man alone has the power to decide to flip the hate switch off on this most ubiquitous platform. A few people will grumble for a few days, then it will be back to business as usual, but without the ugly, vile, vitriolic, venomous, destructive filth that much of Facebook has been for the last 6 years.

  • August 17, 2018 at 1:03 pm

    To date I don’t have a Facebook page. I mean, why would anyone want to visit it? Even I wouldn’t want to visit it if I had one. Can Zuck sell bad web pages? Sure. But first he has to have one. That’s one less oily rag. Not that anyone notices. But one less is better than none less.

  • November 17, 2018 at 5:58 pm

    This article speaks to moral considerations, not political ones. Regardless of your political leanings or which side of a specific issue you find yourself on, the outcome of any particular campaign is less important than the greater ethical implications of the way these social media ‘hackers’ dog-whistled to the amygdalas and reptilian brains of the alienated, to gin up fear, xenophobia, white supremacy, and the nazism/fascism that we now see threatening to envelop swaths of entire continents. It’s not so much WHAT they did (i.e., won or lost specific campaigns), it’s HOW they did it, with utter and complete disregard for truth (i.e., targeted ads composed of utter falsehoods to promote a specific voting outcome, that would come to have reverberating effects well after any election), and without any consideration for the broader moral and ethical implications of collecting, storing, and manipulating personal data in a way that has created a dangerous climate of violence and encroaching fascism globally.

  • November 16, 2021 at 4:49 am

    I wonder how, 4 years later, all of you saying this was some kind of political statement are getting on. Your comments seem stale, whereas the warning from Cory that you ignored is being screamed louder than ever.

