Internet Privacy: Stepping Up Our Self-Defense Game

By Evan Selinger
November 10, 2015

Obfuscation: A User’s Guide for Privacy and Protest by Finn Brunton and Helen Nissenbaum

LET’S FACE IT, despite all the coverage “privacy” gets in the post-Snowden world, many of us don’t see what all the fuss is about. Or at least we act as if we don’t. We avidly connect with friends, family, and colleagues over social media, and for the most part we can’t seem to get enough of products and services that use personal information to create customized recommendations: new things to purchase, good songs to listen to, fun places to visit, expedient routes to drive and walk, helpful weather warnings, and pertinent responses to search engine queries. We even like tailored advertising when it’s relevant, and we get a tad annoyed when companies aspiring to know us inside and out make poor inferences about our tastes.


Then again, we’re also likely to be anxious about the commercial internet and our telltale footprints. Maybe we’re dismayed that our past peccadillos or fetishes, our outdated infractions and guilty pleasures won’t disappear into the ether. And that’s just the tip of the iceberg now that our digital dossiers are overflowing with detail. Few of us happily embrace the idea that every time we go online, companies watch and record our every click — companies that are getting ever better at linking on and offline activity; companies that have their own self-serving agendas and pass off self-interested endeavors as altruistic attempts to make the world a better, more connected place; companies that might be complicit in government surveillance, but only seem embarrassed by their complicity when the public’s awareness of that surveillance threatens their revenue. 


The fact is that privacy-conscious citizens don’t want to be dogged, much less defined, by browsing histories and digitally archived transactions. And they object to third parties ripping their information out of its intended context, as if their intentions and desires are meaningless. Above all, these folks want to be free to reveal different parts of themselves to different audiences without anyone keeping tabs on it all.


 


Resisting Data Tyranny


When consumers believe corporate tracking leaves them vulnerable, even powerless, they face two basic choices: accept the status quo and remain perpetually anxious about companies hell-bent on reducing their complex, intimate, and ever-changing life stories to reified classifications, or take active steps to limit the encroachment.


In Obfuscation: A User’s Guide for Privacy and Protest, Finn Brunton and Helen Nissenbaum recommend we step up our self-defense game. They advocate for strong measures, including corporate sabotage.


Brunton and Nissenbaum’s outrage centers on two points. First, they insist that major tech companies are actively trying to keep us in the dark, capitalizing on our limited knowledge of how data mining and analysis work. Through corporate secrecy, by designing mediated environments that reveal little about what’s going on below the surface (while being optimized to nudge maximum user disclosure), and by hiring people whose technical knowledge vastly exceeds that of most users, tech companies limit our ability to know what they’re currently doing with our information, what they can and might do with it in the near future, and what long-term agendas will kick in as big data repositories grow and information science and technology improve.


Second, Brunton and Nissenbaum maintain that tech companies engage in all manner of machinations to exploit our vulnerabilities and keep us powerless. They rig the game so we’re inclined to accept, without reading, terms-of-service contracts filled with impenetrable legal jargon, contracts we couldn’t bargain over even if comprehension weren’t an issue. They pay lip service to “choice,” but benefit hugely from the extremely high social and economic costs of opting out of their data-mining services. They also benefit from the fact that private sector attempts to create best practices and codes of conduct are structurally hampered by the “competitive disadvantage associated with general restraints on access to information.” And they engage in lobbying efforts that capitalize on the gap between rapid technological advancement and slow legal regulation.


Brunton and Nissenbaum conclude we’re living in a state of “data tyranny” that justifies our taking up guerrilla tactics.


You might be surprised that Nissenbaum is on the front lines advancing a radical anti-corporate agenda. She’s not a stereotypical in-your-face activist but, like Brunton, a highly respected professor at an expensive private university, New York University. Indeed, Nissenbaum is a prolific author and considered the doyenne of “contextual integrity” — one of the most widely admired and referenced contemporary privacy theories. It spells out what’s wrong when a community’s sense of appropriate communication and interaction is undermined. Crucially, contextual integrity has institutional teeth, and it directly informs how the Federal Trade Commission approaches privacy protections.


And yet, despite having the attention of influential regulators and advocates, Nissenbaum and Brunton believe the time has come to push for a more radical form of intervention by way of a grassroots movement. It’s an admittedly morally complex form of intervention: they recommend that we turn to obfuscatory tactics in order to become deceptive dissenters.   


 


Self-Defense Through Obfuscation


Obfuscation “is the deliberate addition of ambiguous, confusing, or misleading information to interfere with surveillance and data collection.” There are lots of ways to do it.


Unhappy Facebook users have tried obfuscation by entering vast amounts of false personal information (“Bayesian flooding”) to make it harder for Zuckerberg’s crew to create robust profiles that advertisers can target. Shoppers who are worried about stores identifying patterns in their spending behavior obfuscate by sharing loyalty cards with one another. 
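To make the flooding tactic concrete, here is a minimal sketch, in Python, of the logic behind “Bayesian flooding.” It is purely illustrative (the actual practice happens by hand, in the platform’s own profile fields), and the attribute pools and function names are hypothetical inventions of mine:

```python
import random

# Hypothetical pools of plausible-but-false profile facts. The point of
# Bayesian flooding is plausibility: entries must look like real data,
# or the profiler can simply filter them out.
HOMETOWNS = ["Duluth", "Lisbon", "Osaka", "Tulsa"]
EMPLOYERS = ["a bakery", "a shipping firm", "a planetarium", "a law office"]
INTERESTS = ["birdwatching", "powerlifting", "opera", "drag racing"]

def flood_profile(true_facts: list[str], decoys_per_pool: int = 2) -> list[str]:
    """Mix genuine profile entries with fabricated ones so that any
    single attribute an advertiser infers is as likely false as true."""
    decoys = []
    for pool in (HOMETOWNS, EMPLOYERS, INTERESTS):
        decoys += random.sample(pool, decoys_per_pool)
    flooded = true_facts + decoys
    random.shuffle(flooded)  # no ordering clue separates truth from noise
    return flooded
```

The design point is plausibility: random gibberish is easy to discard, whereas entries indistinguishable from real data force the profiler to doubt everything.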


Of course, nothing prevents powerful actors from turning to obfuscation when it suits their purposes. Consider the 2011 Russian deployment of Twitter bots. These “programs purporting to be people” appropriated protesters’ hashtags, adding obfuscatory noise to the communicative system. As a result, dissatisfied citizens had trouble discussing parliamentary elections.


Two technologies exemplify the type of obfuscation Brunton and Nissenbaum are encouraging: TrackMeNot and AdNauseam.


TrackMeNot is an internet browser extension that Nissenbaum collaborated on with Daniel Howe and Vincent Toubiana. It takes your search terms and automatically supplements them with misleading but plausible ones. This makes it more difficult for companies to isolate signal from noise, and allegedly makes it harder for search engine providers, like Google, to discern patterns in queries. 
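TrackMeNot itself is a browser extension written in JavaScript; the Python sketch below only illustrates the underlying idea. The search endpoint and the static decoy list are placeholder assumptions (the real extension draws and evolves its decoy vocabulary from live sources such as popular query lists):

```python
import random
import time
import urllib.parse
import urllib.request

# Placeholder decoy vocabulary; an assumption made for illustration only.
DECOY_TERMS = ["weather forecast", "pasta recipes", "used cars",
               "movie showtimes", "hiking trails", "laptop reviews"]

SEARCH_URL = "https://search.example.com/?q="  # hypothetical endpoint

def send_query(term: str) -> None:
    """Issue a query over the same channel a real search would use, so
    decoys and genuine queries look alike in the provider's logs."""
    urllib.request.urlopen(SEARCH_URL + urllib.parse.quote(term), timeout=10)

def search_with_noise(real_query: str, n_decoys: int = 3) -> None:
    """Bury one genuine query among several plausible decoys, with
    jittered delays so the traffic has no telltale rhythm."""
    batch = [real_query] + random.sample(DECOY_TERMS, n_decoys)
    random.shuffle(batch)
    for term in batch:
        send_query(term)
        time.sleep(random.uniform(1, 30))
```

Note that the jittered timing matters as much as the decoys themselves: a fixed machine rhythm would give the game away.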


AdNauseam is a Firefox extension that works with AdBlock Plus, which Nissenbaum created with Howe and Mushon Zer-Aviv. On its own, AdBlock Plus allows users to “block annoying ads, disable tracking and block domains known to spread malware.” AdNauseam augments these features by “quietly clicking on all blocked ads while recording, for the user’s interest, details about ads that have been served and blocked.” Incessant ad clicking makes it harder for trackers to know which clicks are genuine, and it also subverts a contested financial system by making advertisers pay ad networks and ad-hosting websites for clicks on content that consumers never viewed.
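Stated as code, the mechanism is simple: whatever the blocker intercepts, click anyway, and keep a record. AdNauseam is, again, a browser extension, so the Python fragment below is only a hedged sketch; the list of blocked ad URLs is assumed to come from an ad blocker’s filter hook, and the function name is my own:

```python
import datetime
import urllib.request

def click_and_record(blocked_ad_urls: list[str]) -> list[dict]:
    """For every ad the blocker intercepted, quietly fetch its
    click-through URL (which the ad network counts as a click) and
    keep a record for the user's own inspection."""
    log = []
    for url in blocked_ad_urls:
        try:
            urllib.request.urlopen(url, timeout=5)  # simulate the click
            status = "clicked"
        except OSError:
            status = "failed"  # the ad stays blocked either way
        log.append({
            "url": url,
            "status": status,
            "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        })
    return log
```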


 


Does Obfuscation Work and Is It Justified?


I have concerns about obfuscation. But I’d also like to take issue with some of the criticisms that have been directed at it.


The obvious practical question is whether TrackMeNot and AdNauseam get the job done. Since their inception, skeptics have rightly raised questions about their effectiveness.


Efficacy is important, but in my opinion it’s not decisive here and certainly shouldn’t be used as a trump card to dismiss Brunton and Nissenbaum’s agenda. It’s early days for their obfuscation efforts, and so we shouldn’t be too invested yet in how well the software actually works. Movements take time to build, which is why we’re better off viewing TrackMeNot and AdNauseum as akin to thought experiments. They may not leave tech giants shaking, but they do establish “proof of concept,” thereby giving coders and their collaborators empirical material to work with: a clear vision that’s embodied in a serviceable starting point; a sense of actual functions that can be programmed; and an operational baseline that future projects can improve upon. 


Futurist David Brin raises another practical objection, which I’m not buying. He characterizes obfuscation proponents as suffering from “technological myopia” that prevents them from appreciating an ugly truth: their proposals will benefit elites, not average Joes and Janes. Here’s what he says:


Science Fiction author Vernor Vinge referred to this approach [obfuscation] in a novel, portraying a group that filled the Internet with machine-generated garbage information in order to mask personal information beneath a deluge of misleading fabrications. The “Friends of Privacy” thus enabled their own members to conceal their online activities — or so they thought — while making the Net and Web effectively useless at providing reliable information to anyone, anywhere. A classic case of spoiling the commons, in order to achieve a narrow, self-serving goal…


Over the long run, Vernor reveals the obvious — that the “Friends of Privacy” are no more than a front for powerful interests, from oligarchs to criminal gangs, seeking to conceal their nefarious activities from any chance of discovery. Indeed, pick any method of concealment or obfuscation — won’t elites use it more effectively (by far) than commonfolk? In fact, won’t the very programs that you and I purchase, to muddy our footprints, actually report every datum to the original creators of the software? And if you doubt that, oh, have I a bridge to sell you.


Brin is right about one thing. “Commonfolk” probably can’t design obfuscation tools on their own and will need help from people who possess sufficient technical expertise. But he’s wrong to presume that this relationship of dependence must amount to corporate dependency, and he should be more imaginative than to assume our only option is “purchasing” software.


If open-source advocates are persuaded by Brunton and Nissenbaum’s obfuscation arguments, they can in fact build obfuscation tools without surveillance strings attached. Indeed, that’s a big reason Brunton and Nissenbaum wrote Obfuscation. They want to inspire ethically and politically concerned software developers to create new tools that functionally improve upon current ones, and that are easy enough to obtain and use that the charge of “free riding” becomes irrelevant. In other words, they don’t want an obfuscating minority-elite exploiting weaknesses in data-tracking systems while the rest of us — the non-techie masses — remain woefully vulnerable. Of course, whether elites with deep pockets will benefit disproportionately from having access to expensive obfuscation tools depends on how motivated people and institutions are to provide free, high-quality, user-friendly versions. If such democratized possibilities are ruled out in advance, it’s only because the dubious logics of technological and economic determinism are being embraced.


Then, there’s the ethical question of whether it’s really okay for us to defend ourselves by polluting corporate data pools. Whereas Nissenbaum and Brunton emphasize our limited options for resistance, I think it’s important to broaden the frame of analysis. Only then can we discern if more possibilities exist; and only then can we figure out if some of these enable us to avoid ethical murkiness. 


Obfuscating isn’t the only way to enhance privacy. While Brunton and Nissenbaum express concern about what information corporations know about us, our collective concern about proprietary platforms extends beyond their limited focus. For example, we’re apprehensive about the social information we share on such platforms, which becomes available for all kinds of questionable uses. So, while Brunton and Nissenbaum acknowledge they chose the word “obfuscation” “because it connotes obscurity,” it’s helpful to keep in mind the work Woodrow Hartzog and I have done on the subject, which demonstrates that, if your goal is to challenge others who want access to your information, you can select from diverse pathways of resistance. In other words, we should be wary of fixating too intently on any particular approach.


There are several ways to make online communication more obscure: sharing ideas on platforms that are invisible to search engines; using privacy settings and other access controls; withholding your real name and speaking anonymously or identifying yourself with a pseudonym; disclosing information in coded ways that only a limited audience will grasp; or transmitting content that is encrypted or temporarily accessible through an ephemeral conduit…


Given the prevalence of algorithmic surveillance today, obscurity practices go beyond making it difficult for other people to know what we’re saying or that we’re the ones saying it. They also include using strategies for conversing online “without tipping [...] intentions to the algorithm,” effectively “communicating without being computed”...Some of the options to produce obscurity include: referring to folks without tagging them; referring to people and deliberately misspelling their names or alluding to them through contextual clues; sending screenshots of a story instead of directly linking to it...


Thinking about privacy protections through the broader lens of obscurity does more than help us appreciate the diverse ways in which individuals can mitigate both corporate and social surveillance. It also can improve our outlook on the possibilities for genuine policy reform. Federal Trade Commissioner Julie Brill is committed to advancing obscurity protections. She criticizes data brokers and “people search firms” for providing us with inadequate control over our personal information. Approaches like hers serve as positive reminders that obscurity proposals — which can include modes of obfuscation — aren’t limited to data guerrilla warfare and the technological arms race it can create. They can be legal tools for helping ensure that companies promote trust, not fear. Legalizing “a right not to be computed” can, in other words, be part of a multi-pronged approach that has a greater chance of succeeding than Brunton and Nissenbaum suggest.


 


Books Today


I’d be remiss if I ended this review without addressing the fact that Obfuscation is a small and familiar book. The version I read — uncorrected page proofs — runs only 120 pages, and the endnotes begin before readers cross the triple-digit page threshold. Moreover, interested readers can get a solid obfuscation education without ever reading the text. They can consult online material that’s free and easy to find: Nissenbaum’s podcast appearance, “Resisting Data’s Tyranny with Obfuscation” (2014); Brunton and Nissenbaum’s seminal article, “Vernacular resistance to data collection and analysis: A political theory of obfuscation” (2011); and other quality resources.


In pointing out the size and repetition, I’m not slighting Brunton and Nissenbaum. The fact is, too many academics are long-winded and would benefit from judicious editing of their prose. And as to the repurposing of ideas, well, scholarship is changing, and this book is a sign of the changing times. By the time a researcher is ready to write a game-changing monograph, there’s a good chance others have publicly commented on the project and the author has already worked out the guiding logic and rehearsed key examples in public: in interviews; in op-eds; in blog posts; in podcasts; in archived talks and videotaped conference appearances; in book reviews; and in papers that exist, in some version, outside the confines of paywalls.


Overall, this is a positive development. When you know in advance that the public might scrutinize your claims, and when you think hard about the comments your public presentations elicit, you can become crystal clear about your subject matter and develop a reliable sense of how to select examples that arouse your readers’ curiosity and sympathy. At the same time, this means turning readable and relevant academic books into unoriginal, consolidated records of disparate public archives.


In the case of Obfuscation, we shouldn’t want it any other way. Movements require public attention.


¤


Evan Selinger is a professor of philosophy at Rochester Institute of Technology, where he is also affiliated with the Center for Media, Arts, Games, Interaction, and Creativity (MAGIC).
