Campaign Hopes to Put a Stop to Sexist Trolls on Twitter

A pilot program aims to keep women safe from trolls on the social media giant. Will it work?
Nov 10, 2014 · 2 MIN READ
Nicole Pasulka is a writer and reporter who lives in New York City. She has written for Mother Jones, BuzzFeed, The Believer, and the New York Observer.

Anita Sarkeesian criticized misogyny in video games and received death threats and incessant online harassment. British feminist Caroline Criado-Perez successfully campaigned to have a woman put on a bank note and received 50 death threats an hour over Twitter; she says she was "unable to function" because of the harassment. Tracy Van Slyke wrote about sexist and racist undertones in Thomas the Tank Engine cartoons, and some people were so enraged they posted photos of her child online alongside racist comments.

As high-profile instances of Internet harassment continue to make news, the need to address—and solve—the issue grows stronger every day. That's why representatives from Twitter, the site of a significant amount of online abuse, and members of Women, Action, and Media have joined forces to tackle the problem. It is a big one.

One-quarter of women between the ages of 18 and 24 have been stalked or harassed online, according to a Pew Research Center poll published last month. Harassment is also more common among minorities: more than half of black and Hispanic Internet users have experienced harassment, compared with 34 percent of white Internet users.

When women are harassed online, it means they don't have “equal access to public spheres,” says Jaclyn Friedman, the executive director of WAM. “Having equal access means participating in public conversations without fear.”

But if part of the allure and the promise of online spaces is anonymity, how can tech companies that run platforms with hundreds of millions of users keep this behavior at bay?

Twitter has an “abusive behavior policy” under which any user found to be making threats or posting abusive content can be suspended from the site. That might sound like an ideal solution, but following up on reports against individual users is time-consuming, and Friedman says Twitter hasn't allocated enough resources to quickly vet every complaint. Abusers who are suspended often just open new accounts, and the policy doesn't account for harassment carried out by groups of people rather than by individuals.

The partnership between Twitter and WAM is a pilot program in which Twitter users who feel they're being harassed can report incidents directly to WAM through an online form. WAM, a two-person organization that does far more than moderate harassment, will investigate the reports and can escalate valid complaints, passing them directly to Twitter. After collecting complaints for several weeks, WAM hopes to present the social media giant with a clearer picture of how online harassment plays out on its platform.

Rooting out and ending harassment is "tricky," Friedman says, and rather than offering a solution to the problem, she sees the partnership as a first step, or a “fact-finding mission.”

For example, when several users gang up to hound a woman, it can be hard to report the activity or to see any change in behavior. “Twitter’s responses are pretty focused on the individual. Is one person doing behavior that violates terms of service?” Friedman says. “Together [harassers] create an environment on Twitter in which a person can’t function. Historically, we have not seen Twitter take action on those cases.”

So far WAM has gotten around 300 reports and has escalated a fraction of those to Twitter. Wading through the trolls is emotionally taxing work. “Having now spent a few days receiving reports and analyzing whether I should escalate them, I have a lot of sympathy for Twitter,” Friedman says.

She already thinks that more moderators could eventually help solve the problem, along with tools that automatically block users whose accounts are less than 30 days old or who have very few followers, since trolls often open new accounts after they’ve been suspended.
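To make that suggestion concrete, here is a minimal sketch, in Python, of the kind of automated filter Friedman describes. It is purely illustrative: the function name, the follower cutoff, and the data it receives are assumptions for the sake of example, not anything Twitter or WAM has built.

```python
from datetime import datetime, timezone

# Hypothetical thresholds. Friedman mentions accounts under 30 days old;
# she doesn't name an exact follower count, so the cutoff below is an
# assumption for illustration only.
MIN_ACCOUNT_AGE_DAYS = 30
MIN_FOLLOWERS = 10

def looks_like_throwaway(created_at: datetime, follower_count: int) -> bool:
    """Flag accounts that fit the pattern Friedman describes:
    newly created or followed by very few people."""
    age_days = (datetime.now(timezone.utc) - created_at).days
    return age_days < MIN_ACCOUNT_AGE_DAYS or follower_count < MIN_FOLLOWERS

# Example: an account created today with 3 followers would be flagged.
brand_new = datetime.now(timezone.utc)
print(looks_like_throwaway(brand_new, 3))  # True
```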

While the company has been receptive to suggestions from advocates and organizations, Friedman hopes that Twitter will take the data WAM is collecting and come up with effective ways to resolve the issue on its own. “Just because I can’t present you with a solution doesn’t mean one doesn’t exist,” she says.