Mark Zuckerberg, the co-founder and chief executive of Facebook, likes to say that his Web site brings people together, helping to make the world a better place. But Facebook is not a utopia, and when it comes up short, Dave Willner tries to clean it up.
Dressed in Facebook’s quasi-official uniform of jeans, a T-shirt and flip-flops, the 26-year-old Willner hardly looks like a cop on the beat. Yet he and his colleagues on Facebook’s hate and harassment team are part of a virtual police squad charged with taking down content that is illegal or violates Facebook’s terms of service. That puts them on the front line of the debate over free speech on the Internet.
That role came into sharp focus last week as the controversy about WikiLeaks boiled over on the Web, with coordinated attacks on major corporate and government sites perceived to be hostile to WikiLeaks.
Facebook took down a page used by WikiLeaks supporters to organize hacking attacks on the sites of such companies, including PayPal and MasterCard. It said the page violated its terms of service, which prohibit material that is hateful, threatening or pornographic, or that incites violence or illegal acts. It did not, however, remove WikiLeaks’ own Facebook pages.
Facebook’s decision in the WikiLeaks matter illustrates the complexities that the company grapples with, on issues as diverse as that controversy, verbal bullying among teenagers, gay-baiting and religious intolerance.
With Facebook’s prominence on the Web — its more than 500 million members upload more than 1 billion pieces of content a day — the site’s role as an arbiter of free speech is likely to become even more pronounced.
“Facebook has more power in determining who can speak and who can be heard around the globe than any Supreme Court justice, any king or any president,” said Jeffrey Rosen, a law professor at The George Washington University who has written about free speech on the Internet. “It is important that Facebook is exercising its power carefully and protecting more speech rather than less.”
However, Facebook rarely pleases everyone. Any piece of content — a photograph, video, page or even a message between two individuals — could offend somebody. Decisions by the company not to remove material related to Holocaust denial or pages critical of Islam and other religions, for example, have annoyed advocacy groups and prompted some foreign governments to temporarily block the site.
Some critics say Facebook does not do enough to prevent certain abuses, like bullying, and may put users at risk with lax privacy policies. They also say the company is often too slow to respond to problems.
For example, a page lampooning and, in some instances, threatening violence against an 11-year-old girl from Orlando, Florida, who had appeared in a music video, was still up last week, months after users reported the page to Facebook. The girl’s mother, Christa Etheridge, said she had been in touch with law enforcement authorities and was hoping the offenders would be prosecuted.
“I’m highly upset that Facebook has allowed this to go on repeatedly and to let it get this far,” she said.
A Facebook spokesman said the company had left the page up because it did not violate its terms of service, which allow criticism of a public figure. The spokesman said that by appearing in a band’s video, the girl had become a public figure, and that the threatening comments had not been posted until a few days ago. Those comments, and the account of the user who had posted them, were removed after The New York Times inquired about them.
Facebook says it is constantly working to improve its tools to report abuse and trying to educate users about bullying. It says it responds as fast as it can to the roughly 2 million reports of potentially abusive content that its users flag every week.
“Our intent is to triage to make sure we get to the high-priority, high-risk and high-visibility items most quickly,” said Joe Sullivan, Facebook’s chief security officer.
In early October, Willner and his colleagues spent more than a week dealing with one high-risk, highly visible case — rogue citizens of Facebook’s world had posted anti-gay messages and threats of violence on a page inviting people to remember Tyler Clementi and other gay teenagers who have committed suicide, on so-called Spirit Day, Oct. 20.
Working with colleagues in California and in Dublin, they tracked down the accounts of the offenders and shut them down. Then, using an automated technology to tap Facebook’s graph of connections between members, they tracked down more profiles for people who, as it turned out, had also been posting violent messages.
“Most of the hateful content was coming from fake profiles,” said James Mitchell, who is Willner’s supervisor and leads the team.
He said that because most of these profiles, created by people he called “trolls,” were connected to those of other trolls, Facebook could track down and block an entire network relatively quickly.
Using the system, Willner and his colleagues silenced dozens of troll accounts, and the page became usable again. But trolls are repeat offenders, and it took the team nearly 10 days of monitoring the page around the clock to take down more than 7,000 profiles that kept surfacing to attack the Spirit Day event page.
Most abuse incidents are not nearly as prominent or public as the defacing of the Spirit Day page, which had nearly 1.5 million members. As with schoolyard taunts, they often happen among a small group of people, hidden from casual view.
On a morning last month, Nick Sullivan, a member of the hate and harassment team, watched as reports of bullying incidents scrolled across his screen, full of mind-numbing vulgarity.
“Emily looks like a brother.” (Deleted.) “Grady is with Dave.” (Deleted.) “Ronald is the biggest loser.” (Deleted.) As attacks on specific people who are not public figures, these all violated the terms of service.
“There’s definitely some crazy stuff out there, but you can do thousands of these in a day,” Nick Sullivan said.
Nancy Willard, director of the Center for Safe and Responsible Internet Use, which advises parents and teachers on Internet safety, said her organization frequently received complaints that Facebook does not quickly remove threats against individuals. Jim Steyer, executive director of Common Sense Media, a nonprofit group based in San Francisco, also said that many instances of abuse seemed to fall through the cracks.
“Self-policing can take some time and by then a lot of the damage may already be done,” he said.
Facebook maintains it is doing its best.
“In the same way that efforts to combat bullying offline are not 100 percent successful, the efforts to stop people from saying something offensive about another person online are not complete either,” Nick Sullivan said.
Facebook faces even thornier challenges when policing activity that is considered political by some and illegal by others, such as the publication of secret diplomatic cables obtained by WikiLeaks.
This year, for example, the company declined to take down pages related to “Everybody Draw Mohammed Day,” an Internet-wide protest to defend free speech that surfaced in repudiation of death threats received by two cartoonists who had drawn pictures of the prophet Mohammed. A lot of the discussion on Facebook involved people in Islamic countries debating with people in the West about why the images offended.
Facebook’s team worked to separate the political discussion from the attacks on specific people or Muslims.
“There were people on the page that were crossing the line, but the page itself was not crossing the line,” Mitchell said.
Facebook’s refusal to shut down the debate caused its entire site to be blocked in Pakistan and Bangladesh for several days.
Facebook has also sought to walk a delicate line on Holocaust denial. The company has generally refused to block Holocaust denial material, but has worked with human rights groups to take down some content linked to organizations or groups, like the government of Iran, for which Holocaust denial is part of a larger campaign against Jews.
“Obviously we disagree with them on Holocaust denial,” said Rabbi Abraham Cooper, associate dean of the Simon Wiesenthal Center.
However, Cooper said Facebook had done a better job than many other major Web sites in developing a thoughtful policy on hate and harassment.
The soft-spoken Willner, who on his own Facebook page describes his political views as “turning swords into plowshares and spears into pruning hooks,” makes for an unlikely enforcer.
An archeology and anthropology major in college, he said that while he loved his job, he did not love watching so much of the underbelly of Facebook.
“I handle it by focusing on the fact that what we do matters,” he said.