Technology and Society
The Ghost in the Machines of Loving Grace
January 16, 2017
Facebook and Google are highly influential in the way they shape our perceptions of the world, yet they have tried hard to deny any responsibility for deciding what is correct or true. That’s understandable. Who wants to wade into that territory when all you really want to do is sell highly targeted advertising? But that targeting, that enormously effective engineering of our attention and our identities, has had a huge effect on what news we see, and it has worked so well that Americans, more than ever before, live in different worlds, rich in information but without common ground. It’s not just a social media problem, of course; danah boyd points out how our decisions about privatization and personalization have a profound influence on our sense of who we are as a nation. But now that we get our news online, and our online lives are so thoroughly shaped by the engineers at Facebook and Google, it’s getting harder for these companies to sustain their claim that we’re watched over by machines of loving grace that have not a biased circuit in their bodies.
It struck me yesterday that the problem of algorithmic neutrality is similar to the problem of librarian neutrality. Someone asked me why libraries don’t remove books that have had their license revoked, so to speak. The book in question was Arming America, which was stripped of its Bancroft Prize when historians called the author’s methods and sources into question, but there are plenty of other examples of books that have been debated, denounced, or discredited yet stay on library shelves. I told him academic libraries rarely remove a book because it has fallen into disfavor. Often a discredited source has had a big impact and has become part of the history of the issue. Newer works on the topic, shelved nearby, will problematize the questionable work and put it in context. We never claim to have the answer; we provide access to multiple answers, but the researcher is the one who has to decide what to think. We expect students to look at a variety of sources and never to take any one source as gospel.
There’s a level of trust there: that our students can and will approach a debate with genuine curiosity and integrity. There’s also a level of healthy distrust. We don’t believe it’s wise to leave decisions about truth and falsehood up to librarians.
But at the same time, we kind of do. We don’t want completely false conspiracy theories or propaganda to share equal shelf space (or budget lines) with carefully researched scholarship. We don’t want to imply that any old “facts” will do in an argument. We don’t want to say “nope, we have absolutely no responsibility for what you find here. You want to put some propaganda on our shelves? Go ahead. The more, the merrier.” Libraries have collection development policies that spell out how they approach their choices, and there are always things we choose not to add as well as things we remove: the same curation process run in reverse. There’s only so much space and money, and not everything furthers the mission of the institution. Besides, if you believe books and the ideas they contain can be beneficial, you have to think through how they can also be harmful, and own that problem.
The thing is, Facebook literally can’t afford to be an arbiter. It profits from falsehoods and hype. Social media feeds on clicks, and scandalous, controversial, emotionally charged, and polarizing information is good for clicks. Things that are short are more valuable than things that are long. Things that reinforce a person’s world view are worth more than those that don’t fit so neatly and might be passed over. Too much cruft will hurt the brand, but too little isn’t good, either. The more we segment ourselves into distinct groups through our clicks, the easier it is to sell advertising. And that’s what it’s about.
A public sphere can’t thrive within a space that’s a private enterprise, one that treats information and attention as commodities to be traded as fast and as often as possible. Facebook and Google may have global reach, and they may increasingly be where we get our news, but they don’t operate in the public interest.
If libraries were as personalized, you would wave your library card at the door and enter a different library from the one the next person sees. We’d quickly tidy away the books you hadn’t shown interest in before; we’d do everything we could to provide material that confirms what you already believe. That doesn’t seem like a good way to learn or grow. It seems dishonest. Whatever libraries may have in common with these tech companies, however much our stated missions sound similar, there is a difference of purpose, and it’s a difference that matters in ways that can be hard to see.