“Ordinary people here in China aren’t happy about this technology, but they have no choice. If the police say there have to be cameras in a community, people will just have to live with it. There’s always that demand and we’re here to fulfill it,” said Chen Wei (陳偉), founder and chairman of Taigusys, a company specializing in emotion recognition technology, the latest evolution in the broader world of surveillance systems that play a part in nearly every aspect of Chinese society.
Emotion-recognition technologies — which track facial expressions of anger, sadness, happiness and boredom, along with other biometric data — are supposedly able to infer a person’s feelings from traits such as facial muscle movements, vocal tone and body movements.
The technology goes beyond facial recognition, which simply compares faces to determine a match, but like facial recognition it involves the mass collection of sensitive personal data to track, monitor and profile people, using machine learning to analyze expressions and other clues.
Illustration: Constance Chou
The industry is booming in China, where since at least 2012, figures including President Xi Jinping (習近平) have emphasized the creation of “positive energy” as part of an ideological campaign to encourage certain kinds of expression and limit others.
Critics say that the technology is based on a pseudo-science of stereotypes, and an increasing number of researchers, lawyers and civil rights advocates believe that it has serious implications for human rights, privacy and freedom of expression.
With the global industry forecast to be worth nearly US$36 billion by 2023, growing at almost 30 percent annually, rights groups say that action needs to be taken now.
The main office of Taigusys is tucked behind a few low-rise office buildings in the southern Chinese boomtown of Shenzhen. Visitors are greeted at the door by a series of cameras capturing their images on a big screen that displays body temperature, along with age estimates and other statistics.
Chen said that the system in the doorway is the company’s best seller because of high demand during the COVID-19 pandemic.
He hailed emotion recognition as a way to predict dangerous behavior by prisoners, and to detect potential criminals at police checkpoints, problem pupils in schools and older people with dementia in care homes.
Taigusys systems are installed in about 300 prisons, detention centers and remand facilities across China, connecting 60,000 cameras.
“Violence and suicide are very common in detention centers,” Chen said. “Even if police nowadays don’t beat prisoners, they often try to wear them down by not allowing them to fall asleep. As a result, some prisoners will have a mental breakdown and seek to kill themselves, and our system will help prevent that from happening.”
Since prisoners know that they are monitored by the system — 24 hours a day, in real time — they are made more docile, which for authorities is a positive on many fronts, Chen said.
“Because they know what the system does, they won’t consciously try to violate certain rules,” he said.
Besides prisons and police checkpoints, Taigusys has deployed its systems in schools to monitor teachers, pupils and staff, in care homes for older people to detect falls and changes in the emotional state of residents, and in shopping centers and car parks.
While the use of emotion-recognition technology in schools has sparked some criticism, there has been very little discussion of its use by authorities on citizens.
Chen, while aware of the concerns, played up the system’s potential to stop violent incidents. He cited an attack in June last year in which a security guard stabbed 41 people in Guangxi Province in southern China, arguing that it could have been prevented with such technology.
Vidushi Marda is a digital program manager at the British human rights organization Article 19 and a lawyer focused on the socio-legal implications of emerging technologies. She disputed Chen’s account of the Guangxi stabbing.
“This is a familiar and slightly frustrating narrative that we see used frequently when newer, shiny technologies are introduced under the umbrella of safety or security, but in reality, video surveillance has little nexus to safety, and I’m not sure how they thought that feedback in real time would fix violence,” Marda said.
“A lot of biometric surveillance, I think, is closely tied to intimidation and censorship, and I suppose [emotion recognition] is one example of just that,” she said.
An Article 19 report on the development of these surveillance technologies — which one Chinese firm describes as “biometrics 3.0” — by 27 companies in China found that their growth without safeguards and public deliberation was especially problematic, particularly in the public security and education sectors.
Ultimately, groups such as Article 19 say that the technology should be banned before widespread adoption globally makes the ramifications too difficult to contain.
The Guardian contacted a range of companies covered in the report. Only Taigusys responded to an interview request.
Another problem is that emotion-recognition systems are usually trained on images of actors posing in what they think are happy, sad, angry and other emotional states, rather than on genuine expressions of those emotions. Facial expressions can also vary widely across cultures, leading to inaccuracies and ethnic bias.
One Taigusys system that is used by police in China, as well as security services in Thailand and some African countries, includes identifiers such as “yellow, white, black” and even “Uighur.”
“The populations in these countries are more racially diverse than in China, and in China, it’s also used to tell Uighurs from Han Chinese,” Chen said, referring to the country’s dominant ethnicity. “If an Uighur appears, they will be tagged, but it won’t tag Han Chinese.”
Asked if he was concerned about these features being misused by authorities, Chen said that he is not worried because the software is used by police, implying that such institutions should automatically be trusted.
“I’m not concerned because it’s not our technology that’s the problem,” Chen said. “There are demands for this technology in certain scenarios and places, and we will try our best to meet those demands.”
For Shazeda Ahmed, an artificial intelligence researcher at New York University who contributed to the Article 19 report, these are all “terrible reasons.”
“That Chinese conceptions of race are going to be built into technology and exported to other parts of the world is really troubling, particularly since there isn’t the kind of critical discourse [about racism and ethnicity in China] that we’re having in the United States,” she said.
“If anything, research and investigative reporting over the last few years have shown that sensitive personal information is particularly dangerous when in the hands of state entities, especially given the wide ambit of their possible use by state actors,” Ahmed said.
One driver of the emotion-recognition technology sector in China is the country’s lack of strict privacy laws. There are essentially no laws restricting the authorities’ access to biometric data on grounds of national security or public safety, which gives companies such as Taigusys complete freedom to develop and roll out these products when similar businesses in the US, Japan or Europe cannot, Chen said.
“So we have the chance to gather as much information as possible and find the best scenarios to make use of that data,” he said.