When a friend messaged voice actor Bev Standing to ask whether she was the voice of TikTok’s text-to-speech feature, the Canadian performer’s surprise soon turned to irritation.
She had never done work for the popular social media platform, yet the voice was unmistakably hers.
Standing concluded that speech recordings she had made for another client years earlier had somehow been obtained by TikTok and fed into the algorithm that lets users turn written text into a voiceover for video clips in the app.
She sued TikTok in a case that was settled in September — one that performing artists say was symptomatic of the growing challenges artificial intelligence (AI) poses to creatives.
“I am a business. I need to protect my product, and my product is my voice,” Standing said. “They [TikTok] weren’t my client. It’s like me buying a car and you driving it away. You didn’t buy it, you don’t get to drive it.”
From digitally resurrecting dead celebrities to improving lip sync for movies dubbed in foreign languages, AI has been increasingly deployed in the movie and audio industry in recent years, sparking debates around ethics and copyright issues.
A documentary on the late chef Anthony Bourdain faced a backlash after using AI to recreate his voice.
Working actors and other performing artists say they are also concerned about the impacts of AI on their livelihoods, with some calling for creators to be given more rights over their work and how it is used.
British actor and comedian Rick Kiesewetter signed away all rights to recordings of his voice and face movements in a job for a tech firm several years ago.
Now he feels somewhat uneasy about it.
“I just don’t know where this stuff is going to end up ... It could even end up in porn as far as I know,” he said.
LEGAL GAPS
Actors’ groups said artists risk losing work to AI, and that performers who lend their voice or acting skills to computers are often not fairly compensated.
While AI has created new employment opportunities, it has also become increasingly common for performers to have their image or voice used without permission, according to Equity, a British union representing workers from the performing arts.
It has also raised concerns that performers who have done work for AI projects do not fully understand their rights.
“Like any technology, there’s a lag to it,” said Kiesewetter, an Equity member, who said some agents were not yet up to speed on the intricacies of tech companies’ contracts.
Artists worldwide often have no real protection against AI-generated imitations such as deepfake videos, said Mathilde Pavis, a law lecturer at the University of Exeter in Britain, as laws have not kept up with tech developments.
Numerous countries, including Britain and France, have performers’ rights legislation, which typically allows artists to withhold access to, or get paid for, recordings of their performances.
But the laws generally do not protect the creative content of performances, meaning imitations — such as those generated by AI — are allowed, said Pavis.
“We assumed that a performance could only be reproduced or bootlegged or stolen on scale if it’s recorded on something,” she said.
Both she and Equity have argued that new laws strengthening rights for creatives are needed.
Pavis said it would be better to give performers copyright over the content of their performances, handing them full control: artists would then have to consent to their performances being used, and could earn a passive income stream from them.
AI AUTHORS?
Ryan Abbott, a professor of law and health science at the University of Surrey in Britain who has written a book on law and AI, argues that works created by algorithms deserve protection, too.
Computers are increasingly skilled at creating anything from music to stories, but their work lacks full protection in the US as only humans can be listed as authors, he said.
In a bid to change that, Abbott is bringing a legal case against the US Copyright Office to register a digital artwork made by a computer with AI as its author, and the AI owner as the copyright owner who ultimately benefits from any profits.
While machines themselves do not need rights, a lack of copyright protection leaves their owners and developers exposed, and risks stifling investment and innovation, he said.
Works made by AI should also be clearly identified to avoid devaluing the role of human artists, he added.
“You can have someone record my voice for an hour and train the AI to make music. And suddenly you can have an AI make award-winning music that sounded like it was coming from me,” he said.
“If we did that, I would say it would be unfair to have me listed as the author, because it would suggest I’m a great musician, when really, I’m absolutely terrible.”
As AI typically generates work after having “learnt” its craft from large amounts of human-made pieces, some say human artists should be compensated if their performance is used in the process.
On this and other issues, it will come down to what lawmakers decide is fair, said Abbott.
“Do we want to make it easier for businesses to do business, or ... for creatives to get compensation?” he said.
With recordings of work calls and video conversations having become commonplace during the pandemic, these decisions will have repercussions outside the entertainment industry in spheres such as workplace rights, said Pavis.
“Performers are a little bit the lab rats for what we think is acceptable and unacceptable, legal and illegal,” she said.
“They are at the frontier of the debate around what use of our data and our faces for work will be seen as appropriate.”