The Dark Days of Information Retrieval

On November 20th, 2016, I revealed my professional purpose to my family in a way I never had before. I posted the following on Facebook:

As a librarian, I am constantly asking students to read and ask questions about the purpose, point of view, and credibility of resources before they use them to inform their own opinions. This article helps to reveal the information structures that exist within social media. Algorithms already manipulate what news you see, whose posts you read, and who sees the things you post. Encouraging awareness of information structures and power dynamics is also part of my job.

When I first accepted my job as a Public Services Librarian, I was repeatedly asked what my job responsibilities were (do you think I’ll keep receiving those questions?). Before this post, I had never formed such a statement of self-purpose. But I’ve been told that I am increasingly relevant and that my friends and family take great pride in my job, so I should feel confident asserting my professional endeavors everywhere, not just on campus.

The Framework for Information Literacy for Higher Education states that “Information creation is a process.” Generally, this frame is the foundation for discussions about peer-reviewed journals and social media posts or, more recently, fake news. However, as I hinted above, I’ve been thinking much more about the algorithms that alter the way we connect with information.

Algorithms are the methods that search engines, feed readers, and social media platforms use to deliver content to you. Companies do not publicize the logic, or priorities, that structure the way content is delivered, though web developers and marketing professionals work hard to game those priorities so that their content is delivered first. In a world of increasing reliance upon digital delivery of information, I think we should call for transparency in algorithm development!
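To make the idea concrete, here is a minimal sketch, in Python, of what a content-delivery algorithm can look like. Every signal and weight below is hypothetical; real platforms keep theirs secret, which is exactly the point:

```python
# A toy content-ranking "algorithm": every signal name and weight here
# is invented for illustration. Real platforms do not disclose theirs.

def score_post(post, viewer):
    """Assign a relevance score to a post; higher scores surface first."""
    score = 0.0
    score += 2.0 * post["likes"]                          # popularity signal
    score += 5.0 if post["author"] in viewer["friends"] else 0.0  # friend boost
    score -= 0.1 * post["age_hours"]                      # newer content wins
    return score

def build_feed(posts, viewer):
    """Deliver content in an order the viewer never sees explained."""
    return sorted(posts, key=lambda p: score_post(p, viewer), reverse=True)
```

Every line of that sketch encodes a human decision about whose content wins; those decisions are precisely what companies decline to publish.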

For now, we are on defense: we learn how these algorithms affect our lives after the fact. For example, Dylann Roof’s radicalization story is closely tied to the algorithm that automatically completes Google search terms. An NPR article reports on Roof’s information search:

[Roof] said that after hearing about [Trayvon] Martin’s death he had “decided to look his name up. Type him into Google, you know what I’m saying?” Roof told investigators that he had read the Wikipedia article for Martin, and then “for some reason after I read that, I,” he paused before continuing, “I typed in – for some reason it made me type in the words black on white crime.”

This same article reveals that typing “black on” prompts the autocomplete function to suggest “black on white crime,” while typing “white on” prompts it to suggest “white on white crime.” We don’t know why, but the important thing to know is that these algorithms are written by people, so they can contain bias and they can be changed (which Google has done with some inflammatory search terms). Algorithm development is a process too!
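A minimal sketch of how an autocomplete system might work makes the human role visible. The query counts and blocklist below are invented for illustration; Google has never disclosed how its suggestions are actually generated:

```python
# A toy autocomplete: suggestions come from logged query counts that
# people collect and filter. All data here is hypothetical.

QUERY_COUNTS = {               # invented past-query frequencies
    "black on white crime": 900,
    "black on black crime": 400,
    "white on white crime": 700,
}

BLOCKLIST = set()              # terms people decide to suppress go here

def autocomplete(prefix, k=3):
    """Return the k most frequent logged queries starting with prefix."""
    matches = [(count, q) for q, count in QUERY_COUNTS.items()
               if q.startswith(prefix) and q not in BLOCKLIST]
    return [q for count, q in sorted(matches, reverse=True)[:k]]

print(autocomplete("black on"))   # reflects whatever the logs contain
```

Change the logged counts, or add a term to the blocklist, and the suggestions change with them; the process is human decisions all the way down.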

The algorithm that Facebook uses to get posts into users’ feeds has also been under scrutiny for creating filter bubbles, the name given to the echo chamber produced when an algorithm prioritizes posts from friends, people who tend to share the user’s ideological views. Filter bubbles help in the spread and acceptance of fake news, which was prolific on Facebook during the presidential campaign. Since then, Facebook has voiced a desire to decrease the spread of fake news in the future. How? By becoming a media company or, at least, starting a “journalism project.” According to another NPR report, Facebook will be hiring engineers to make Facebook a better platform for news distribution. One of the initiatives of the Facebook Journalism Project is “investing in research and projects that promote ‘news literacy.’” How will the company increase its users’ critical evaluation of the news? A great place to start would be to reveal the structures that get the news to their feeds. Will they do that? Probably not.
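Here is a deliberately simplified sketch of how a filter bubble can emerge from a ranking rule. The similarity measure and data model are hypothetical; Facebook’s actual ranking signals are not public:

```python
# A toy illustration of how a feed ranker can create a filter bubble.
# The similarity measure and the user/post structure are hypothetical.

def ideological_similarity(user_a, user_b):
    """Fraction of shared topics on which two users agree (0.0 to 1.0)."""
    shared = user_a["views"].keys() & user_b["views"].keys()
    if not shared:
        return 0.0
    agree = sum(user_a["views"][t] == user_b["views"][t] for t in shared)
    return agree / len(shared)

def rank_feed(posts, viewer):
    """Posts from like-minded friends float to the top of the feed."""
    return sorted(
        posts,
        key=lambda p: ideological_similarity(viewer, p["author"]),
        reverse=True,
    )
```

The more a post’s author agrees with you, the higher their posts rank and the less disagreement you ever see; the bubble is a side effect of the ranking rule, not anyone’s stated goal.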

It’s hard to teach others about the bias, power dynamics, and social structures written into the algorithms that fuel our information retrieval online when those algorithms are the property of corporations that profit from keeping them private. However, the Framework outlines a good place to start: “accept[ing] the ambiguity surrounding the potential value of information creation expressed in emerging formats or modes.” We need to be vigilant about identifying the value, authority, and purpose of information that circulates in our social media and populates our search rankings. And let’s call for more transparency surrounding the structures that get information into our digital hands and reasoned minds today!

Author: Jess Denke

Assessment and Outreach Librarian at Muhlenberg College

2 thoughts on “The Dark Days of Information Retrieval”

  1. Jess, I will have to backtrack to read posts I’ve missed, but just a few days ago I was ruminating on how the tools we use are not neutral. I was reminded of this after reading Farkas’ AL article “Never Neutral.” Here is the link: https://americanlibrariesmagazine.org/2017/01/03/never-neutral-critlib-technology/ I really enjoyed this post, Jess. Thanks!

    Oh, and this also reminded me of this Fusion article: http://fusion.net/story/270135/the-english-speaking-web-creates-digital-ghettos/

    Do you mind if I share your post…in a post? 🙂
