Connectivity
Smartphones Are Weapons of Mass Manipulation, and This Guy Is Declaring War on Them
Tristan Harris thinks big tech is taking advantage of us all. Can its power be used for good?
If, like an ever-growing majority of people in the U.S., you own a smartphone, you might have the sense that apps in the age of the pocket-sized computer are designed to keep your attention as long as possible. You might not have the sense that they’re manipulating you one tap, swipe, or notification at a time.
But Tristan Harris thinks that’s just what’s happening to the billions of us who use social networks like Facebook, Instagram, Snapchat, and Twitter, and he’s on a mission to steer us toward potential solutions—or at least to get us to acknowledge that this manipulation is, in fact, going on.
Harris, a former Google product manager turned design ethicist, runs a nonprofit called Time Well Spent, which focuses on the addictive nature of technology and how apps could be designed better; it pursues public advocacy and supports design standards that take into account what's good for people's lives rather than simply maximizing screen time. He says he's moving away from Time Well Spent these days (his new effort is as yet unnamed) and toward holding the tech industry accountable for the way it persuades us to spend as much time as possible online, with tactics ranging from Snapchat's snapstreaks to auto-playing videos on sites like YouTube and Facebook.
“It’s so invisible what we’re doing to ourselves,” he says. “It’s like a public health crisis. It’s like cigarettes, except because we’re given so many benefits, people can’t actually see and admit the erosion of human thought that’s occurring at the same time.”
Harris argues that because tech companies' business models largely depend upon advertising revenue, it's not really in their best interest to push us toward, say, getting off the social network du jour and going outside to hang out with friends. He's not saying Facebook (or any of its peers, for that matter) is bad, or that we should stop using our smartphones. But after spending years inside the tech industry (he joined Google in 2011 when it bought the startup he cofounded, a search-within-the-Web-page company called Apture), he is saying that these platforms are the most powerful social persuasion machines ever built, and he's concerned about how we're using them. Or, more to the point, how they're using us.
It’s an increasingly valid concern. For all the great things mobile technology makes possible, a growing body of research suggests that the use of social networks including Facebook, Instagram, Snapchat, and Twitter may have negative consequences, like increasing your chances of depression or social isolation. Indeed, simply having your phone around could lower your cognitive capacity.
To get his message out, Harris is working with colleagues including Roger McNamee, a venture capitalist and early investor in Facebook and Google who has recently written of his regrets about those lucrative investments.
He’s also becoming adept at public speaking: a TED talk he gave in April has been viewed about 1.5 million times, and he was featured on 60 Minutes that same month. So he seemed in his element when I first saw him, standing in front of a packed lecture hall at Stanford University on a fall evening, addressing a class on artificial intelligence and society with a presentation titled “Building an AI for Human Attention.”
It wasn’t the most glamorous setting. While Stanford is a leafy, expansive, expensive bastion of Silicon Valley learning, this particular lecture hall was windowless, and the chair desks were old and uncomfortable. Harris, dressed comfortably in a chambray shirt and black pants, looked cramped standing behind an old lectern in a corner of the room.
But if history is an indicator, it is one of the best places to reach the very people Harris hopes to connect with: bright students who may very well be the tech leaders of tomorrow. He would know, since he is a Stanford alum himself and counts among his friends tech-famous Stanford graduates such as Instagram cofounders Kevin Systrom and Mike Krieger.
And even in that setting, Harris was charismatic and his message disturbing but measured. For over an hour, he kept the students’ attention as he talked about the tech industry’s race for attention and its techniques for tugging at consumers, reciting stats like the fact that there are more people on Facebook now than there are Muslims in the world.
“The question is, once you start to monopolize what people are thinking about, is that actually good for society? What is that vulnerable to? Where could that go wrong?” he asked.
I had some questions of my own after hearing him speak. At first, I found Harris’s rhetoric interesting but needlessly alarmist. I’ve been using Facebook, Twitter, Instagram, and of course Google for years. I depend on them for so much as I gather and disseminate information throughout every single day—finding story tips, keeping in touch with friends and family, posting cute photos and videos of my baby, reading news, and so on. I wondered, is that really so bad? Am I really being controlled or influenced in some way?
But after Harris’s talk at Stanford, I started thinking a lot more about how I get sucked into watching auto-playing ads for bras and shoes that I actually do kind of want to buy. And how I feel when I get a notification on my smartphone that someone liked, or loved, or retweeted, one of my posts on Facebook, Instagram, or Twitter. There’s definitely a little charge in my stomach and a ping in my brain, and I really, really like it. I crave it, even, after putting up a particularly adorable baby photo or cleverly worded status update, and getting one of these notifications inevitably induces me to open whichever social app it came from to see what’s going on. Am I just going to keep liking photos on Facebook, retweeting funny tidbits on Twitter, and feeding the AI that runs these networks until I keel over and die?
The next day, I caught up with Harris and talked to him about all this over a sushi lunch in San Francisco. He didn’t have any easy solutions to assuage my fears, but he did lay out his vision for what sites like Facebook could be if they were not beholden to capturing your attention but dedicated to serving society (which, if you think about it, sounds in line with Facebook founder Mark Zuckerberg’s original vision).
The Facebook that exists now has helped many people connect and communicate in positive ways, but it has also been exploited for things like Russian interference in the most recent U.S. presidential election.
“The problem is that these are the unintended consequences of well-intended strategies,” says McNamee, who describes himself as Harris’s “wingman.”
So what if, Harris wonders, the content you saw on Facebook included ways of making the world, or at least your community, better, or improving your life? In his vision of a social network with a sort of ethical persuasion built in, Facebook might do things like suggest specific ways you can help with climate change, such as turning your thermostat down a few degrees or installing solar panels on your roof. Or maybe it would encourage you to meet up with people off Facebook to discuss politics in person.
“It’s so hard to imagine that, because everything in the feed is basically things that you consume—other articles you can read or videos you can watch—instead of what you can do today that would move you closer to the life you want to be living,” he says.
And while you could say that people are already doing what they really want on Facebook, Twitter, Instagram, and other social networks, representing preferences with clicks and choices about which people and news sources they follow, Harris doesn’t think we are truly in control of social media as it exists now.
“Everything [Facebook] knows about me can be used to persuade me toward a future goal,” he says. “And it’s very powerful; it knows exactly what would persuade me, because it has persuaded me in the past.”
The persuasion tools may be getting even more powerful for advertisers in particular: Facebook is reportedly letting some brands try sifting through public posts and comments (sans usernames) to help them target users.
I contacted Facebook about whether it was working on any efforts involving ethical persuasion; it did not respond.
Harris isn’t waiting for Facebook, though. He and McNamee are working on political advocacy to make people, both in politics and the general public, more aware of the control major tech companies have over users. McNamee says their initial mission was to “stimulate a conversation” about the appropriate role of Internet platform monopolies in society, and that they’ve been speaking to people, but he won’t name names. Harris is hoping to get employees at tech companies more interested in his work, too. That’s happened in a few cases already, especially after people leave their posts.
“Companies are not going to change themselves,” he says.