“If I were to run, I’d run as a Republican. They are the dumbest group of voters in the country. They believe anything on Fox News. I could lie and they’d still eat it up. I bet my numbers would be terrific.”
Many Guardian readers will have seen this quote, attributed to a 1998 interview with US president-elect Donald Trump in People magazine, in their Facebook news feed.
It is a great quote, but he never said it.
It typifies the kind of fake news and misinformation that has plagued this year’s US election on an unprecedented scale. In the wake of the surprise election of Trump, pressure is growing on Facebook to not only tackle the problem, but also to find ways to encourage healthier discourse between people with different political views.
Rather than connecting people — as Facebook’s euphoric mission statement claims — the bitter polarization of the social network over the past 18 months suggests Facebook is actually doing more to divide the world.
“People have ‘unfriended’ friends and family members because the style of discourse is so harsh,” said Claire Wardle, research director at the Tow Center for Digital Journalism. “Facebook stumbled into the news business without systems, editorial frameworks and editorial guidelines, and now it’s trying to course-correct.”
Facebook needs to change its business model if it truly wants to address these editorial challenges. Currently, the truth of a piece of content is less important than whether it is shared, liked and monetized. These “engagement” metrics distort the media landscape, allowing clickbait, hyperbole and misinformation to proliferate. On Facebook’s voracious news feed, the emphasis is on the quantity of posts, not on powerful, authoritative, well-researched journalism.
The more users click, like and share content that resonates with their own world views, the more Facebook feeds them similar posts. This has progressively divided the political narrative into two distinct filter bubbles — one for conservatives and one for liberals (a blue feed and a red feed) — which pulled further and further apart in the run-up to election day.
These information bubbles did not burst on Tuesday last week, but the election result has highlighted how mainstream media and polling systems underestimated the power of alt-right news sources and smaller conservative sites that largely rely on Facebook to reach an audience.
The Pew Research Center found that 44 percent of Americans get their news from Facebook.
However, fake news is not a uniquely Republican problem. An analysis by BuzzFeed found that 38 percent of posts shared from three large right-wing politics pages on Facebook included “false or misleading information” and that three large left-wing pages did the same 19 percent of the time.
What is a uniquely Republican problem is the validation given to fake news by Trump, who has routinely repeated false news stories and whipped up conspiracy theories — whether that is questioning US President Barack Obama’s heritage, calling climate change a hoax or questioning “crooked” Democratic US presidential candidate Hillary Rodham Clinton’s health — during high-profile rallies, while urging his followers not to trust “corrupt” traditional media.
The conspiracy theories are amplified by a network of highly partisan media outlets with questionable editorial policies, including a Web site called the Denver Guardian peddling stories about Clinton murdering people and a cluster of pro-Trump sites founded by teenagers in Veles, Macedonia, motivated only by the advertising dollars they can accrue if enough people click on their links.
The situation is so dire that this week Obama spoke about the “crazy conspiracy theorizing” that spreads on Facebook, creating a “dust cloud of nonsense.”
“There is a cottage industry of Web sites that just fabricate fake news designed to make one group or another group particularly riled up,” said Fil Menczer, a professor at Indiana University who studies the spread of misinformation. “If you like Donald Trump and hate Hillary Clinton it’s easy for you to believe a fake piece of news about some terrible thing Hillary has done. These fake news Web sites often generate the same news just changing the name to get people on either side to be outraged.”
Menczer and his Indiana University colleagues hope to better understand how fake news, and how pieces debunking fake news, spread through social media by launching a range of analytical, nonprofit tools later this year.
The misinformation being spread does not always involve outlandish conspiracy theories. There is a long tail of insidious half-truths and misleading interpretations that fall squarely in the grey area, particularly when dealing with complex issues such as immigration, climate change or the economy.
“Not everything is true or false, and in the gaps between what we can check and what is missing from our control we can create a narrative,” said Italian computer scientist Walter Quattrociocchi, who has studied the spread of false information. “Trump won at this. He was able to gather all the distrust in institutional power by providing an option for people looking for a change.”
“These things are very hard to detect automatically if they are true or not,” Menczer said. “Even professional fact-checkers can’t keep up.”
According to Menczer’s research there is a lag of about 13 hours between the publication of a false report and the subsequent debunking. That is enough time for a story to be read by hundreds of thousands, if not millions of people. Within Facebook’s digital echo chamber, misinformation that aligns with our beliefs spreads like wildfire, thanks to confirmation bias.
“People are more prone to accept false information and ignore dissenting information,” Quattrociocchi said. “We are just looking for what we want to hear.”
That is a quirk of human psychology that the UK Independence Party (UKIP) exploited during the campaign for Britain to leave the EU.
Arron Banks, UKIP’s largest donor, told the Guardian that facts were not necessary for winning.
“It was taking an American-style media approach. What they said early on was ‘facts don’t work’ and that’s it. You have got to connect with people emotionally. It’s the Trump success,” he said.
While it is human nature to believe what we want to hear, Facebook’s algorithms reinforce political polarization.
“You are being manipulated by the system [for falling for the fake news] and you become the perpetrator because you share it to your friends who trust you and so the outbreak continues,” Menczer said.
It is a perfect feedback loop. So how do you break it?
Menczer said the solution is to create a filter.
Before social media, the filter was provided by media companies, who acted as gatekeepers to the news and had staff trained in fact-checking and verifying information. In an age of budget cuts in traditional media, and the rise of clickbait and race-to-the-bottom journalism, standards have slipped across the board.
“Now the filter is us. But that’s not our job, so we’re not good at it. Then the Facebook algorithm leverages that and amplifies the effect,” Menczer said.
So we come back to the algorithm.
Despite continually insisting that it is a neutral technology platform and not a media company, Facebook is all too aware of the influence it has to drive footfall to the polling stations. About 340,000 extra people turned out to vote in the 2010 US congressional elections because of a single election-day Facebook message, according to a study published in the journal Nature.
In a separate study, the social networking site worked out how to make people feel happier or sadder by manipulating the information posted on 689,000 users’ news feeds. It found it could make people feel more positive or negative through a process of “emotional contagion.”
So what should Facebook do? It is certainly not going to be easy. The company has tried — and failed — to get a grip on the problem before, launching a tool in January last year to let users report false information. The tool ultimately failed because it relied on users, who turned out not to be very good at spotting fake news, and who also tended to report a story as “fake” simply because they disagreed with it.
In September, the company joined a coalition, along with Twitter, to improve the quality of reporting on social media and cut down on fake news. The fruits of this alliance have yet to emerge.
In the interim, Facebook found itself in trouble over the team of humans who were curating its trending news section.
According to a former journalist who worked on the project, the team was routinely told to suppress news stories of interest to conservative readers. The company was widely criticized for playing the role of censor and being biased against Republicans.
That led Facebook to fire the editors and let the algorithm decide what is trending. Since then, fake news has repeatedly found its way into the highly influential trending list.
“Instead of hiring more editors to check the facts, they got rid of the editors and now they are even more likely to spread misinformation,” Menczer said. “They don’t see themselves as a media company and they run the risk of being told they are picking sides. They are in a tough spot, but they are also making a lot of money.”
Facebook’s continued rejection of the idea that it is a media company does not sit well with some critics.
“It sounds like bullshit,” said high-profile investor Dave McClure, speaking from the Web Summit in Lisbon a few hours after an expletive-filled on-stage rant about Trump. “It’s clearly a source of news and information for billions of people. If that’s not a media organization then I don’t know what is.”
He added that technology entrepreneurs have a responsibility to enable a “more well-rounded experience” for their audiences.
“A lot of them are only thinking about how to make money. Maybe we need to mix in having ethics and principles and caring about the fact that people have a reasonable and rational experience of the information they process. Although that sounds a little too utopian,” he said.
One solution could be to try to reduce the effect of filter bubbles by showing users a wider variety of opinions than their own. Even if people have a tendency to reject those opinions, at least they would be exposed to a diversity of views.
Wardle suggests that to tackle fake news, Facebook could introduce a mechanism to allow fact-checking organizations to report false stories to Facebook so they do not continually circulate.
“Of course, people will shout censorship, so maybe Facebook could choose to change the way it displays certain stories instead,” she said.
This is problematic, because Facebook would have to manipulate the algorithm to make it less likely you would see something from a site categorized as disreputable. This would potentially involve discounting content your friends were interested in.
“Then we would not like the platform as much, because we like seeing stuff our friends are liking and sharing,” Menczer said.
All of these issues point towards the inevitability of Facebook acknowledging that it is no longer just a technology company, but a media company — the media company.
In Mark Zuckerberg’s first Facebook update post-election, he talked about the need for everyone to work together.
“We are all blessed to have the ability to make the world better, and we have the responsibility to do it. Let’s go work even harder,” he said.
Wardle is skeptical.
“That’s all well and good — but start by changing your platform,” she said.