Libraries and Learning
Lessons from the Facebook Fiasco
April 15, 2018
“Privacy is essential to the exercise of free speech, free thought, and free association. . . . The library profession has a long-standing commitment to an ethic of facilitating, not monitoring, access to information.” Privacy: An Interpretation of the Library Bill of Rights, 2001-2002.
“We protect each library user’s right to privacy and confidentiality with respect to information sought or received and resources consulted, borrowed, acquired or transmitted.” American Library Association Code of Ethics, adopted 1939, last amended 2008.
It was interesting to learn about a new product designed to personalize the library experience for public library patrons shortly after watching Mark Zuckerberg being grilled by members of Congress. Thanks to a Twitter thread posted by Becky Yoose, a systems librarian who works for a large public library, I now know that OCLC, a global library cooperative based in the US best known for its shared catalog WorldCat, has acquired Wise, a subsystem for what librarians call an “ILS” – an integrated library system that combines the catalog with library functions such as keeping track of who has checked out what. This new system will do much more – in a sense, cataloging library users and tying their interests to library materials and programs through marketing.
Librarians like to think privacy is an essential feature of freedom of inquiry, and that keeping track of what you’ve read once you’ve returned a book is none of our business, even if it makes us seem less convenient or user-friendly than all of the other systems that remember things for us. It’s too bad that we can’t tell you whether you’ve read John Sandford’s Extreme Prey already. (It’s understandably hard to keep track when all 26 books in the series have the word “prey” in the title.) That inconvenience is outweighed by the possibility that your reading history could be used to hurt you or another of our library users. Reading should never be used as evidence against you – and the freedom to read widely shouldn’t be chilled by concerns that your reading records may be stored and used against your will.
Sure, we were scorned by officials after 9/11 who said “the government doesn’t care about what James Patterson novel you read” and “if you haven’t done anything wrong, you don’t have anything to hide.” Well, I’ve written articles about sexual violence and serial murder in crime fiction. Some of my reading records could look mighty suspicious if someone suspected I was up to no good. What if what you’ve been reading was obtained and published by someone who had a grudge and wanted to publicly provoke a reaction? Could get messy. Luckily, that can’t happen if those records aren’t kept.
That said, a lot of librarians think library privacy is an outdated, slightly ridiculous obsession, given that literally billions of people have signed up for social media platforms and are used to getting algorithmic recommendations. Aren’t libraries letting our patrons down if we don’t personalize our information offerings algorithmically? Won’t we seem useless and outdated?
(An aside: in academic libraries, where demonstrating value is considered a survival skill, this is more frequently framed as tracking individual use of the library to promote “student success,” enabling us to intervene if a student isn’t connecting to the library. After all, there’s evidence that students with high GPAs use the library, thank goodness – it shows we have value and therefore should be allowed to exist. But what about students who don’t use the library? I’ve been told I’m forcing my personal values on innocent students who may be harmed if I don’t keep track of whether and how they use library resources. To which I say: there are non-invasive ways of reaching out to struggling students, and ethical ways to study library effectiveness, that don’t involve gathering piles of sensitive data and expecting the data to do that work for us.)
Anyway, this new product, being offered to public libraries in the US after a successful debut in the Netherlands, will allow libraries to enhance patron profiles, tying what patrons like to read and the programs they are interested in to demographic data such as their age, gender, and residence. This is both to personalize the library experience and to make the library more valuable to the public. (A Dutch site describing the product opens with “Falling loan figures. Falling visitor numbers. The customer is increasingly central” and goes on to say “By following the behavior of your customer, you have a wealth of information. With the Wise marketing module you can edit and use this data for different marketing purposes.” An anxiety stick, followed by a marketing carrot.) Public libraries have programs for job seekers, for the homeless, for people with various medical issues, for immigrants learning English or applying for green cards. What we haven’t done is connect that information to their patron records. But we could, to better personalize the customer experience!
According to a press release, “OCLC takes data privacy very seriously. OCLC does not sell users’ personal data. Wise and other OCLC products make use of personal data only within the context of providing the library services that our members and their users have agreed to.”
I’ve heard this somewhere before . . . where was it? . . . oh, yeah, that’s what Zuckerberg said, over and over, to members of Congress. We don’t sell data. We care about privacy. Users agreed to it. And there was his weird insistence that it was all in order to serve the “Facebook community” – as if there is a singular community of over two billion people who want their lives improved by seeing highly personalized advertising, as if this is all about bringing us closer together by amassing and using personal information for “customer relationship management” through segmentation.
OCLC, I have discovered, has registered the service mark, “because what is known must be shared.” No. Really. It doesn’t.
Perhaps even more distressing to me is that OCLC describes this product as an analytics tool that “removes subjectivity” because its decisions about people involve data and computers. That seems shockingly uninformed for an information organization. Let me share some reading materials voluntarily, just as I have tagged myself as “interested in privacy” right here of my own free will. I’m sure there’s lots I’ve missed, but these come immediately to mind.
- Algorithms of Oppression by Safiya Noble
- Automating Inequality by Virginia Eubanks
- Cyber Racism by Jessie Daniels
- Weapons of Math Destruction by Cathy O’Neil
- Data and Goliath by Bruce Schneier
- And for academic librarians particularly, this C&RL article by Kyle M. L. Jones and Dorothea Salo, “Learning Analytics and the Academic Library.”
Algorithms don’t remove subjectivity. This is basic professional knowledge. The trouble Facebook is in should be making us more cautious and less easily persuaded that our salvation lies in gathering more user data. I’m not sure whom to blame for blithely embracing market-segmentation “personalization” systems – OCLC, or members of my profession who think surveillance capitalism is the model we should follow in spite of what we know and what we purport to value.