The Moral Economy of Tech

Maciej Cegłowski

There is powerful social pressure to avoid incremental change, particularly any change that would require working with people outside tech and treating them as intellectual equals.
Treating the world as software promotes fantasies of control. And the best kind of control is control without responsibility. Our unique position as authors of software used by millions gives us power, but we don't accept that this should make us accountable.
Those who benefit from the death of privacy attempt to frame our subjugation in terms of freedom, just like early factory owners talked about the sanctity of contract law. They insisted that a worker should have the right to agree to anything, from sixteen-hour days to unsafe working conditions, as if factory owners and workers were on an equal footing. Companies that perform surveillance are attempting the same mental trick. They assert that we freely share our data in return for valuable services. But opting out of surveillance capitalism is like opting out of electricity, or cooked foods—you are free to do it in theory. In practice, it will upend your life.
Imagine what the British surveillance state, already the worst in Europe, is going to look like in two years, when it's no longer bound by the protections of European law, and economic crisis has driven the country further into xenophobia.
I am very suspicious of attempts to change the world that can't first work on a local scale. If after decades we can't improve quality of life in places where the tech élite actually lives, why would we possibly make life better anywhere else? We should not listen to people who promise to make Mars safe for human habitation, until we have seen them make Oakland safe for human habitation. We should be skeptical of promises to revolutionize transportation from people who can't fix BART, or have never taken BART. And if Google offers to make us immortal, we should check first to make sure we'll have someplace to live.