A View from Karrie Karahalios
Algorithm Awareness
How the news feed on Facebook decides what you get to see.
Increasingly, it is algorithms that choose which products to recommend to us and algorithms that decide whether we should receive a new credit card. But these algorithms operate outside our perception. How does one begin to make sense of such hidden forces?
The question gained resonance recently when Facebook revealed a scientific study on “emotion contagion” that had been conducted by means of its news feed. The study showed that displaying fewer positive updates in people’s feeds causes them to post fewer positive and more negative messages of their own. This result is interesting but disturbing, revealing the full power of Facebook’s algorithmic influence as well as its willingness to use it.
To explore the issue of algorithmic awareness, in 2013 three colleagues and I built a tool that helps people understand how their Facebook news feed works.
Using Facebook’s own programming interface, our tool displayed the stories that appeared in a user’s news feed on the left half of the screen. On the right, it displayed every story posted by the user’s friends—the unadulterated feed, with no algorithmic curation or manipulation.
A third panel showed which friends’ posts were predominantly hidden and which friends’ posts appeared most often. Finally, the tool let users manually choose which posts they wanted to see and which to discard.
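For readers curious about the mechanics, the sketch below shows, in rough strokes, how such a side-by-side comparison could be assembled. It is not the tool we built: the endpoints (me/home, me/friends, {id}/posts) come from the 2013-era Graph API, which Facebook has since retired, and the access token is a placeholder.

    import requests

    GRAPH = "https://graph.facebook.com"
    TOKEN = "YOUR_ACCESS_TOKEN"  # hypothetical user access token

    def fetch(path, **params):
        # Helper: call a Graph API list endpoint and return its "data" array.
        params["access_token"] = TOKEN
        return requests.get(f"{GRAPH}/{path}", params=params).json().get("data", [])

    # Left panel: the stories the news feed algorithm chose to show.
    curated = fetch("me/home", limit=100)

    # Right panel: everything friends actually posted, with no curation.
    unfiltered = []
    for friend in fetch("me/friends"):
        unfiltered.extend(fetch(f"{friend['id']}/posts", limit=25))

    # Third panel: whose posts the algorithm predominantly hides.
    shown = {post["id"] for post in curated}
    hidden_counts = {}
    for post in unfiltered:
        if post["id"] not in shown:
            name = post.get("from", {}).get("name", "unknown")
            hidden_counts[name] = hidden_counts.get(name, 0) + 1

    for name, count in sorted(hidden_counts.items(), key=lambda kv: -kv[1]):
        print(f"{name}: {count} posts hidden from the feed")

Comparing the two lists against each other, rather than trusting either one alone, is what let participants see the algorithm's choices directly.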
We recruited 40 people—a small sample, but one closely representative of U.S. demographics—for a study of how they made sense of their news feed. Some were shocked to learn that their feed was manipulated at all. But by the end of the study, as participants chose which posts they wanted to see, they came to find value in the feeds they had curated.
When we followed up months later, many said they felt empowered. Some had changed their Facebook settings so they could shape the feed themselves. One of the 40 participants quit Facebook altogether, because the hidden curation violated their expectation of how a feed should work.
The public outcry over Facebook’s emotion study showed that few people truly grasp the way algorithms shape the world we experience. And our research shows the importance of empowering people to take control of that experience.
We deserve to understand the power that algorithms hold over us, for better or worse.
Karrie Karahalios is an associate professor of computer science at the University of Illinois.