Obfuscation

A User's Guide for Privacy and Protest

Finn Brunton, Helen Nissenbaum

In a sentence: Obfuscation is the deliberate addition of ambiguous, confusing, or misleading information to interfere with surveillance and data collection.
124
The Mexican-election Twitter bots were deliberately engaging in bad behavior in order to trigger an automatic delisting, thereby keeping the impact of #marchaAntiEPN “off the radar” of the larger media. They were making the hashtag unusable and removing its potential media significance. This was obfuscation as a destructive act.
285
In the CacheCloak model, your phone predicts your possible paths and then fetches the results for several likely routes. As you move, you receive the benefits of locative awareness—access to what you are looking for, in the form of data cached in advance of potential requests—and an adversary is left with many possible paths, unable to distinguish the beginning from the end of a route and unable to determine where you came from, where you mean to go, or even where you are. From an observer’s perspective, the salient data—the data we wish to keep to ourselves—are buried inside a space of other, equally likely data.
305
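To make the mechanics concrete, here is a minimal sketch of the idea, not the actual CacheCloak implementation: the client prefetches location-based results along several predicted routes, so whoever observes the queries sees many equally plausible paths. The road_graph and query_service names are placeholders assumed for illustration.

```python
import itertools
import random

def predict_paths(current_position, road_graph, depth=5, n_paths=4):
    """Enumerate several plausible routes outward from the current position.
    road_graph maps an intersection ID to its neighboring intersection IDs."""
    paths = []
    for _ in range(n_paths):
        node, path = current_position, [current_position]
        for _ in range(depth):
            neighbors = road_graph.get(node, [])
            if not neighbors:
                break
            node = random.choice(neighbors)
            path.append(node)
        paths.append(path)
    return paths

def prefetch(paths, query_service):
    """Query the location-based service for every point on every predicted path
    and cache the answers, so the true route is buried among the decoys."""
    cache = {}
    for point in set(itertools.chain.from_iterable(paths)):
        cache[point] = query_service(point)  # an observer sees all of these
    return cache

# The app then answers the user's real requests from the local cache,
# never revealing which of the prefetched points was actually visited.
```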
the proliferation of documents and details presented opportunities for resistance, as well as for compliance.” 34 In situations where one can’t say No, there are opportunities for a chorus of unhelpful Yeses—for example, don’t send a folder in response to a request; send a pallet of boxes of folders containing potentially relevant papers.
413
babble tapes have been used less by mobsters than by attorneys concerned that eavesdropping may violate attorney-client privilege. A babble tape is a digital file meant to be played in the background during conversations. The file is complex. Forty voice tracks run simultaneously (thirty-two in English, eight in other languages), and each track is compressed in frequency and time to produce additional “voices” that fill the entire frequency spectrum. There are also various non-human mechanical noises, and a periodic supersonic burst (inaudible to adult listeners) engineered specifically to interfere with the automatic gain-control system by which an eavesdropping device configures itself to best pick up an audio signal. Most pertinent for present purposes, the voices on a babble tape used by an attorney include those of the client and the attorney themselves. The dense mélange of voices increases the difficulty of discerning any single voice.
473
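A rough sketch of how such a masking track might be mixed, assuming NumPy and a set of already-loaded mono voice recordings; the sped-up copies stand in for the frequency and time compression described above, and the mechanical noises and periodic bursts are left out.

```python
import numpy as np

SAMPLE_RATE = 44_100  # assumed sample rate for all tracks

def speed_up(track, factor=1.25):
    """Crudely resample a track so it plays back faster, shifting its energy
    toward higher frequencies and helping fill the spectrum."""
    idx = np.arange(0, len(track), factor).astype(int)
    return track[idx]

def mix_babble(voice_tracks, length_seconds=60):
    """Layer many voices (and sped-up copies of them) into one masking signal."""
    n = length_seconds * SAMPLE_RATE
    mix = np.zeros(n)
    layers = list(voice_tracks) + [speed_up(t) for t in voice_tracks]
    for layer in layers:
        reps = int(np.ceil(n / len(layer)))
        mix += np.tile(layer, reps)[:n]  # loop or truncate to the target length
    return mix / np.max(np.abs(mix))     # normalize to avoid clipping

# voice_tracks would be a list of 1-D NumPy arrays, one per recorded voice;
# writing the result out as a WAV file is omitted here.
```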
The extraordinary tale of Operation Vula has been told by one of its chief architects, Tim Jenkin, in the pages of the ANC’s journal Mayibuye. 38 It represents a superb example of operations security, tradecraft, and the management of a secure network.
493
Given a small amount of text, stylometry can identify an author. And we mean small—according to Josyula Rao and Pankaj Rohatgi, a sample consisting of about 6,500 words is sufficient (when used with a corpus of identified text, such as email messages, posts to a social network, or blog posts) to make possible an 80 percent rate of successful identification. 16 In the course of their everyday use of computers, many people produce 6,500 words in a few days.
679
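For a sense of how little machinery such an attack needs, here is a toy attribution sketch using scikit-learn; the corpus strings are placeholders, and this is not the specific method behind the figure cited above.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Known, labeled writing samples (blog posts, emails, etc.); placeholders here.
known_texts = ["... several thousand words known to be by author A ...",
               "... several thousand words known to be by author B ..."]
known_authors = ["A", "B"]

model = make_pipeline(
    # Character n-grams pick up habits of punctuation, spelling, and spacing.
    TfidfVectorizer(analyzer="char", ngram_range=(2, 4)),
    LogisticRegression(max_iter=1000),
)
model.fit(known_texts, known_authors)

# Attribute an "anonymous" sample to its most likely author.
print(model.predict(["... unattributed text of a few thousand words ..."]))
```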
Anonymouth—a tool that is under development as of this writing—is a step toward implementing this approach by producing statistically bland prose that can be obfuscated within the corpus of similar writing.
713
Things we once thought were private—if we thought of that at all—become open, visible, and meaningful to new technologies. This is one aspect of the information asymmetry that shapes our practices of privacy and autonomy: we don’t know what near-future algorithms, techniques, hardware, and databases will be able to do with our data. There is a constantly advancing front of transition from meaningless to meaningful—from minor life events to things that can change our taxes, our insurance rates, our access to capital, our freedom to move, or whether we are placed on a list.
1010
Anthony Giddens’s “manufactured risks”: dangers produced by the process of modernization, rather than mitigated by it, and, in turn, requiring new systems of mitigation.
1052
The response to orders is not some cinematic refusal, but foot dragging, slowdowns, feigned ignorance, deliberate stupidity, and the pretense of compliance. Finally, and most important for our purposes, rather than overt backtalk or heroic here-we-stand speeches there is the evasive muttering, gossip, and slander of what Scott terms the hidden transcript.
1130
In the world after Snowden, it has become clear that, for many national-security, espionage, and law-enforcement organizations, having a population already predisposed to disclose to companies huge volumes of information about themselves that can either be subpoenaed or covertly exploited is all to the good. 23 Poorly designed and managed social platforms create an efficiently self-spying population, doing their own wiretapping gratis with photos uploaded with their EXIF metadata intact and with detailed social chit-chat waiting to be subjected to data-mining algorithms.
1223
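As a small illustration of the EXIF point, a Pillow sketch showing how much metadata a phone photo can carry and one way to strip it before sharing; the filenames are placeholders.

```python
from PIL import Image

img = Image.open("photo.jpg")   # a photo straight off a phone
exif = img.getexif()
print(dict(exif))               # timestamps, device details, GPS tags if present

# One blunt way to strip the metadata: re-save only the pixel data.
clean = Image.new(img.mode, img.size)
clean.putdata(list(img.getdata()))
clean.save("photo_clean.jpg")
```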
Obfuscation obscures by making noise and muddying the waters; it can be used for data disobedience under difficult circumstances and as a digital weapon for the informationally weak.
1249
John Rawls, in A Theory of Justice, 12 demands as a basic requirement that practices not violate or erode basic rights and liberties. Applied to obfuscation, this requirement calls into question obfuscating systems relying on deception, system subversion, and exploitation that have the potential to violate rights of property, security, and autonomy.
1503
this means that, when weighing policy options, a just society should not necessarily look to equalize the standing of different individuals or groups; where equalizing is not possible or makes no sense, it should focus on the plight of those on the lower end of the socioeconomic spectrum, ensuring that whatever policy is chosen maximizes outcomes for these stakeholders. A just society’s policies, in other words, should maximize the minimum.
1512
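The maximin rule fits in a line of code; the policies and payoffs below are invented purely for illustration.

```python
# "Maximize the minimum": among candidate policies, pick the one whose
# worst-off group fares best. Numbers are made up for the example.
policies = {
    "policy_1": {"well_off": 10, "worst_off": 2},
    "policy_2": {"well_off": 6, "worst_off": 4},
}

best = max(policies, key=lambda name: min(policies[name].values()))
print(best)  # policy_2: its minimum outcome (4) beats policy_1's (2)
```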
Those on the wrong side of the power and knowledge asymmetries of an information society are, as we have argued, effectively members of its less-well-off class—subjects of surveillance, uncertain how it affects their fates, and lacking the power to set the terms of engagement. Consequently, in developing policies for a society deemed just according to Rawls’s two principles, 22 those on the wrong side of the asymmetries should be allowed the freedom to assert their values, interests, and preferences through obfuscation (in keeping with ethical requirements), even if this means impinging on the interests and preferences of those on the right side of knowledge and power asymmetries.
1590
Threat models lower the costs of security and privacy by helping us understand what our adversaries are looking for and what they are capable of finding, so that we can defend against those dangers specifically.
1684
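A threat model can be as simple as a short list naming each adversary, what it can see or do, and the specific defense chosen in response; the entries below are invented for illustration.

```python
# Toy threat model as a data structure; defenses are matched to capabilities
# rather than to vague, general fears. Entries are illustrative only.
threat_model = [
    {"adversary": "ad network",
     "capability": "tracks browsing across sites",
     "defense": "tracker blocking plus search-query obfuscation"},
    {"adversary": "curious ISP",
     "capability": "sees unencrypted traffic and DNS lookups",
     "defense": "HTTPS everywhere and encrypted DNS"},
]

for threat in threat_model:
    print(f"{threat['adversary']}: defend with {threat['defense']}")
```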
Teenagers can’t necessarily create secret social spaces for their community—parents can and do demand passwords to their children’s social-network accounts and access to their phones. Instead, teenagers use a variety of practices that assume everyone can see what they do, and then behave so that only a few people can understand the meaning of their actions. “Limiting access to meaning,” boyd writes, “can be a much more powerful tool for achieving privacy than trying to limit access to the content itself.” 5 Their methods don’t necessarily use obfuscation (they lean heavily on subtle social cues, references, and nuance to create material that reads differently to different audiences, a practice of “social steganography”), but they emphasize the importance of understanding goals. The goal is not to disappear or to maintain total informational control (which may be impossible); it is to limit and shape the community that can accurately interpret actions that everyone can see.
1702
The strength of an obfuscation approach isn’t measured by a single objective standard (as safes are) but in relation to a goal and a context: to be strong enough. It may be used on its own or in concert with other privacy techniques. The success of obfuscation is always relative to its purposes, and to consideration of constraints, obstacles, and the un-level playing field of epistemic and power asymmetries.
1719