In February, the YourNewsWire page on Facebook was at its peak popularity, boosted by its salacious post claiming that Canadian Prime Minister Justin Trudeau was late Cuban president Fidel Castro’s love child.
That story, shared 1,800 times, represented exactly the kind of content Facebook Inc had promised to clean up on its Web site — and which YourNewsWire prolifically produced. It was created to drive attention and therefore advertising revenue. It was also provably false.
YourNewsWire is still publishing, but its stories are not going viral on Facebook anymore and the Web site is finding it more difficult to make money. In the case of YourNewsWire, at least, Facebook delivered on its promises in time for Tuesday’s US midterm elections.
“In many ways, they’re an ideal test example in at least the limited scope of what Facebook said they wanted to do — to see blatantly false news debunked, and reduce its reach,” said Alexios Mantzarlis, who leads the International Fact-Checking Network at the Poynter Institute in St Petersburg, Florida.
It has been more than two years since the 2016 US presidential election conversation was muddied with viral false information, such as the report that then-US presidential candidate Donald Trump had been endorsed by the pope. While Facebook has admitted some responsibility for the spread of “fake news,” its road to fixing the problem has been slow.
The company decided it would limit its efforts to stories that were provably false and it would not do so directly. Instead, Facebook works with third-party fact-checkers, including PolitiFact and Snopes.
In interviews, fact-checkers said that they often only have enough staff members to address a few stories a week, sometimes long after they have gone viral. When stories are debunked, Facebook reduces their reach.
This week’s US election was unlikely to see a story go viral at the level of the fake pope endorsement article. Absent more data, it is hard to know how much to attribute that progress to Facebook.
Those who study the Internet’s worst offenders said that such sites were not resonating as much as they had in the past.
It might be that readers are wiser, TruthOrFiction.com managing editor Brooke Binkowski said.
“Readers have become more savvy,” said Binkowski, who used to work at Snopes, the Facebook partner. “They understand that fake news is a problem, and they’ve become more vigilant.”
Facebook built a system that specifically addresses hoax news Web sites and pages, but that shifted some of the fake news activity to posts and images that go viral in Facebook groups, in which old photos are often doctored or retitled to apply to current news events.
Facebook groups have helped spread misinformation about a caravan of immigrants walking to the US border — falsely labeling the group as violent or diseased in a fear campaign that has bubbled up to Trump.
“Groups have become the preferred base for coordinated information influence activities on the platform,” not the Facebook pages that were active ahead of the 2016 election, Jonathan Albright, director of the Digital Forensics Initiative at Columbia University’s Tow Center for Digital Journalism, said in a report.
“It is Facebook’s Groups — right here, right now — that I feel represents the greatest short-term threat to election news and information integrity,” Albright said.
If hoax publishers are not as much of a problem in the US, polarization still is. Publishers on the far right or far left — who do not publish fake news so much as news in a skewed context, meant to alarm readers — continue to thrive on Facebook. To address the issue, the social network has been asking users to rate publishers by trustworthiness and baking the scores into its algorithm, but hyperpartisan news persists.
A Facebook spokeswoman said that the firm is aware that photographic and video misinformation has become more common, and this year started enabling fact-checkers to also address it.
Facebook also touted three recent studies from Stanford University, the University of Michigan and French newspaper Le Monde that concluded the magnitude of misinformation has declined on the social network.
“It’s challenging to prove we’re making progress on this because of lack of consensus on what ‘false news’ means,” a Facebook spokeswoman said in a statement. “But we know that this is a highly adversarial space and we have more work to do.”
YourNewsWire, based in southern California, built its reputation on conspiracy theories, claiming that public figures are pedophiles or that vaccines kill, and became one of the top broadcasters of blatant misinformation.
Content with no basis in fact is harder to fully disprove, Mantzarlis said.
“You can’t go to every middle-aged woman in America and ask them if they’re a body double for Hillary Clinton,” he said.
However, YourNewsWire looks like a regular news Web site. At the top of the page, its tagline is “News. Truth. Unfiltered.”
In some ways, that has made its false content more dangerous than that of Alex Jones, the propaganda artist behind InfoWars, PolitiFact executive director Aaron Sharockman said.
Jones was earlier this year banned from Facebook after public outrage over his content, especially over his perpetual claim that the Sandy Hook Elementary School shooting did not occur.
Sharockman said that every time a PolitiFact fact-checker finds a YourNewsWire story to be erroneous, the reporter calls to inform the Web site.
Since Facebook’s rules about “downranking” content took effect, YourNewsWire has started deleting debunked posts in an effort to avoid the consequences, he said.
A Facebook spokeswoman said that the company is aware some publishers think the tactic is a workaround, and it is soliciting feedback from fact-checking partners to make its policies clearer.
Deleting a post is not enough to eliminate a “strike” against the page, Facebook said.
YourNewsWire editor-in-chief Sean Adl-Tabatabai denied that the Web site uses that strategy to protect itself against fact-checking, saying he has been in touch with Facebook about fact-checkers that he thinks are overeager to nitpick his stories.
He said Facebook listens, but tells him to take up his complaints with the fact-checkers directly.
“I would say overall the idea that there are third-party fact-checkers deciding what people can and cannot see on Facebook is problematic,” Adl-Tabatabai said. “The fact-checkers have been given more and more leeway and what can I do? If they remove me from the public square and put me in the digital gulag, what can I do?”
Facebook said that while Web sites are able to appeal the conclusions of fact-checkers, the partners are all following Poynter’s fact-checking principles.
YourNewsWire has not been removed from the platform, but some of its partners have distanced themselves. Revcontent, which used to serve advertisements on YourNewsWire’s Web site, in August started to remove promotions and withhold revenue from any stories that outside fact-checkers had debunked.
On Oct. 19, after repeated violations, Revcontent removed YourNewsWire’s advertising support entirely, making it the first Web site to face that consequence for misinformation reasons, Revcontent brand manager Charlie Terenzio said.
Google, which used to serve advertisements on the Web site, has not done so for a few years, a person familiar with the matter said.
On Saturday last week, the domain YourNewsWire.com began redirecting to newspunch.com.
News Punch has a different tagline: “Where mainstream fears to tread.”
Sinclair Treadway, who runs the Web site with Adl-Tabatabai, said it had to rebrand after its reputation and earning abilities were affected by negative publicity and Facebook’s fact-checking program.
At Facebook’s Menlo Park, California, headquarters this week, US election activity was monitored from a “war room” with constant dashboards to keep tabs on what was going viral.
Facebook said it was in contact with secretaries of state and state election bureaus to combat reports of fake news on the platform that could suppress voting.
Meanwhile, the company has expanded its fact-checking network to 14 other countries, where it works with local-language news organizations and fact-checkers.
However, the program does not extend to WhatsApp, the company’s popular messaging app, which is encrypted and where there is no visibility into which stories are going viral. The app was a major vehicle for disinformation in the Brazilian presidential election.
Viral content has also helped fuel lynchings in India and ethnic warfare in Myanmar, which Facebook has only started to address this year.
The episodes demonstrate that although Facebook might have tamped down on hoax news in the US, bigger problems remain, driven by the viral nature of shocking content.
“I am not convinced that Facebook’s fact-checking program has ever worked,” Binkowski said.