Technology and Society

Code and Ethics

November 2, 2017

Congress is asking some hard questions about the platforms its members use to raise funds and get elected. Whoops! I mean, asking hard questions because they are representatives of the people. Like most of us, they’re trying to do more than one thing at once. We’re all caught in that dilemma. Social platforms are built for marketing even as we also use them to connect or learn or share ideas. Even when we know these platforms can do harm, it’s hard for people to voluntarily remove themselves from the dominant communications channels of our day. And with that great power comes . . . you know the rest.

It’s not just that Google, Facebook, and Twitter can be used by foreign powers to influence elections or to organize a genocide. It’s the way these platforms are designed to capture our attention and persuade us, and how those mechanisms play out when the persuasion is political. The scale of this persuasion is unprecedented. The companies involved have become incredibly wealthy and powerful but haven’t caught up with their platforms’ unintended consequences and don’t have a clue how to proceed.

As Zeynep Tufekci said in a talk she gave last month, “we’re building a dystopia just to make people click on ads.” Platforms that gather data on us have developed a formidable “persuasion architecture.” Where that once meant putting tempting candy bars near the grocery checkout, it now means those candy bars are insinuated into our lives in ways tailored to each of us individually.

The platform knows you like Mounds but not Heath Bars, your blood sugar is low because you missed lunch, you read an article an hour ago that made you anxious and depressed, and a friend just posted a message mentioning chocolate. You start to respond to your friend (“I could sure use a Mounds Bar . . . ”) but you delete the message before you send it. No matter. Facebook captured your half-finished message and added it to your simulated self, along with credit card purchase records, the results of personality tests, what you’ve been reading, and what you care about. They can sell that information to food companies, to diet programs that can play on your shameful Mounds addiction, or to political operatives who know what makes you anxious and how you might respond to that anxiety. They will also use it as training material to make their marketing schemes more effective and personalized. The combination of massive amounts of data and artificial intelligence is engineered to sell stuff. It’s not so great when it’s used to engineer society. As Tufekci puts it:

. . . if the people in power are using these algorithms to quietly watch us, to judge us and to nudge us, to predict and identify the troublemakers and the rebels, to deploy persuasion architectures at scale and to manipulate individuals one by one using their personal, individual weaknesses and vulnerabilities, and if they’re doing it at scale through our private screens so that we don’t even know what our fellow citizens and neighbors are seeing, that authoritarianism will envelop us like a spider’s web and we may not even know we’re in it.

This isn’t our grandparents’ authoritarianism. It’s not jackboots and mass demonstrations in which people make the same gestures and shout the same slogans. It’s personalized authoritarianism that can invisibly work to the same ends. It’s also something else – a powerful mischief machine, a way to tear us apart and make our society dysfunctional. That seems to be what Russia attempted to do: tap into our emotions and amplify them at a time when those emotions can overrule our judgment, regardless of political beliefs. The troll armies weren’t partisan in their efforts to heighten tension and make us distrust each other.

Journalists developed a set of ethical principles for their profession. It took a few decades for that profession to figure this out and adopt practices and an identity based on values. Of course, reporters don’t always live up to them. None of us live up to our ideals. But those ideals helped develop a professional culture that at least knows what it should be doing. Google, Facebook, and Twitter have mottos and taglines, but they don’t have codes of ethics that guide what they do and steer them away from unintended consequences. Social media hasn’t been around for long, and these companies have been too busy writing code to write codes.

We need regulations to hold these companies accountable for the mischief they profit from. It’s unlikely this Congress and administration have the know-how and will to do it, but still, we need to think seriously about regulation. Europe already has rules that protect privacy and inhibit the kind of micro-targeting that messed with our election.

These companies are realizing they need to develop the technical means of moderating content that flows at unprecedented rates and with extraordinary reach, in real time. As technical problems go, this one’s a massive challenge, but without better management their platforms can become toxic waste sites. They didn’t give moderation much thought as they built their incredibly powerful persuasion machines, so it will be hard to bolt on now. But they realize the mess is damaging their brands.

More than that, though, what we need is for these people who influence our lives so profoundly to shift their sense of purpose and develop a professional code of ethics to guide that purpose. Building neat stuff that attracts lots of users and piles of money is not good enough. Writing code and developing artificial intelligence that works really well isn’t good enough. Slogans like “organize the world’s information” and “bring the world closer together” aren’t real purposes that determine these companies’ actions, and they won’t be until doing those things well becomes more important than efficiently gathering, manipulating, and selling personal data.

I don’t know who will finally solve this issue, or how, but without ethics guiding what these companies do, society is facing a huge risk.

License

Babel Fish Bouillabaisse II Copyright © 2019 by Barbara Fister is licensed under a Creative Commons Attribution 4.0 International License, except where otherwise noted.