The Woman Battling Hate Speech, Censorship, and Extremism Online (and Off)

Yasmin Green, head of R&D for Alphabet’s Jigsaw, is using technology in hopes of making the world a better place. It’s not easy.

When Yasmin Green was a young child in 1984, her family fled post-revolution Iran for London. As an adult, she’s promoting online and offline freedom using the resources of one of the world’s most powerful technology companies.

Green is the director of research and development for Jigsaw, a tech incubator within Alphabet—the parent company of Google. Jigsaw is working on projects like letting people in Internet-censored countries access the open Internet via other people’s connections, and using machine learning to ferret out online abuse.

After speaking about her work at MIT Technology Review’s annual EmTech MIT conference on Wednesday, Green sat down with us to discuss how her past influences her today and what she wants to understand when it comes to nasty behavior on the Internet.

Your family left Iran when you were just three years old. How do you feel your cultural background helps you do your work?

I think the biggest travesty would be if large technology companies invest in developing meaningful technology, technology to try to make the Internet and societies healthier, but they do it from, like, windowless offices in Silicon Valley. There are so many failed attempts at developing meaningful technology, so the more that we have people who naturally understand the differences across the globe, and the more we can get out to different parts of the world, I think the more likely we are to develop [meaningful] technology. In my case, being from a country where the Internet is so heavily censored but also so popular, I think it helps—I can provide at least that perspective to the team. What are the security concerns? What are the physical … intimidation tactics the government’s used that affect how people use the Internet? Do I have networks of people in the country who could test products for us?

Jigsaw is working on a number of projects that use technology to fight issues like censorship, hate, and vitriol online. What are you focused on right now?

In terms of what technology is most interesting to me at the moment, it’s probably looking at whether we can understand online toxicity. Can we break down toxicity into its components: what makes people want to leave an [online] conversation? Is it obscene language? Is it insults? Is it threats? Is it identity-based attacks? They’re all actually different types of bad speech, and different communities online may care about one and not the other. And the more precisely we can define the models, the more useful they will ultimately be in creating inclusive conversations.
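Green’s description suggests a multi-label setup: a separate score for each kind of bad speech rather than a single “toxic” flag. The sketch below is a hypothetical illustration of that idea in Python with scikit-learn; the example comments, labels, attribute names, and model choice are invented for illustration and are not Jigsaw’s actual system, which is trained on far larger annotated datasets.

```python
# Minimal sketch of attribute-level toxicity scoring (illustrative only).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier
from sklearn.pipeline import make_pipeline
import numpy as np

ATTRIBUTES = ["obscene_language", "insult", "threat", "identity_attack"]

# Hypothetical training comments with per-attribute 0/1 labels
# (a real system would train on a large human-annotated corpus).
comments = [
    "what the hell is this garbage",           # obscene language
    "you are an idiot",                        # insult
    "I will find you and make you regret it",  # threat
    "people like you don't belong here",       # identity-based attack
    "thanks for the thoughtful reply",         # none of the above
]
labels = np.array([
    [1, 0, 0, 0],
    [0, 1, 0, 0],
    [0, 0, 1, 0],
    [0, 0, 0, 1],
    [0, 0, 0, 0],
])

# One binary classifier per attribute over shared TF-IDF features.
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    OneVsRestClassifier(LogisticRegression(max_iter=1000)),
)
model.fit(comments, labels)

# A new comment gets a separate score for each attribute.
scores = model.predict_proba(["get lost, you moron"])[0]
for attribute, score in zip(ATTRIBUTES, scores):
    print(f"{attribute}: {score:.2f}")
```

Scoring each attribute separately is what lets different communities set their own thresholds, matching Green’s point that one forum may care about insults while another cares mainly about identity-based attacks.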

Your work and that of your Jigsaw colleagues includes spending time talking with people who have, for example, defected from ISIS. Do you consider yourselves to be, in some ways, a group of anthropologists?

I would think of us that way. I think what’s kind of special about the technology that comes out of Jigsaw is it’s conceived of in the field, on the front lines of this research. There’s ethnographic research, there’s investigative research, there’s lots of partnerships with investigative journalists. These problems span the Internet and the real world—the offline world. It takes a combination of actors and experts in different fields to address them.