Sat, Apr 06, 2019 - Page 9

YouTube executives ignored warnings over toxic videos

People who once worked for YouTube said that concerns over profitability, and over being treated as editors, were put above keeping objectionable material off the site

By Mark Bergen  /  Bloomberg

Illustration: Kevin Sheu

A year ago, Susan Wojcicki was on stage to defend YouTube. Her company, hammered for months for fueling falsehoods online, was reeling from another flare-up involving a conspiracy theory video about the Parkland, Florida, high-school shooting that suggested the victims were “crisis actors.”

Wojcicki, YouTube’s chief executive officer, is a reluctant public ambassador, but she was in Austin, Texas, at the South by Southwest conference to unveil a solution that she hoped would help quell conspiracy theories: a tiny text box from Web sites like Wikipedia that would sit below videos that questioned well-established facts like the moon landing and link viewers to the truth.

Wojcicki’s media behemoth, bent on overtaking television, is estimated to rake in sales of more than US$16 billion a year.

However, on that day, Wojcicki compared her video site to a different kind of institution.

“We’re really more like a library,” she said, staking out a familiar position as a defender of free speech. “There have always been controversies, if you look back at libraries.”

Since Wojcicki took the stage, prominent conspiracy theories on the platform — including one on child vaccinations and another tying former US secretary of state Hillary Rodham Clinton to a Satanic cult — have drawn the ire of lawmakers eager to regulate technology companies. And YouTube is, a year later, even more associated with the darker parts of the Web.

The conundrum is not just that videos questioning the moon landing or the efficacy of vaccines are on YouTube. The massive “library,” generated by users with little editorial oversight, is bound to have untrue nonsense. Instead, YouTube’s problem is that it allows the nonsense to flourish. And, in some cases, through its powerful artificial intelligence system, it even provides the fuel that lets it spread.

Wojcicki and her deputies know this. In recent years, scores of people inside YouTube and Google, its owner, raised concerns about the mass of false, incendiary and toxic content that the world’s largest video site surfaced and spread. One employee wanted to flag troubling videos, which fell just short of the hate speech rules, and stop recommending them to viewers. Another wanted to track these videos in a spreadsheet to chart their popularity. A third, fretful of the spread of “alt-right” video bloggers, created an internal vertical that showed just how popular they were. Each time they got the same basic response: Do not rock the boat.

The company spent years chasing one business goal above others: “engagement,” a measure of the views, time spent and interactions with online videos. Conversations with more than 20 people who work at or recently left YouTube reveal a corporate leadership unable or unwilling to act on these internal alarms for fear of throttling engagement.

Wojcicki would “never put her fingers on the scale,” said one person who worked for her.

“Her view was: ‘My job is to run the company, not deal with this,’” they said.

This person, like others who spoke to Bloomberg News, asked not to be identified for fear of retaliation.

YouTube turned down Bloomberg News’ requests to speak to Wojcicki, other executives, management at Google and the board of Alphabet Inc, its parent company.

Last week, Neal Mohan, YouTube’s chief product officer, told the New York Times that the company has “made great strides” in addressing its issues with recommendation and radical content.


