Alphabet Inc’s Google in May introduced a slick feature for Gmail that automatically completes sentences for users as they type.
Tap out “I love” and Gmail might propose “you” or “it,” but users are out of luck if the object of their affection is “him” or “her.”
Google’s technology does not suggest gender-based pronouns because the risk is too high that its “Smart Compose” technology might predict someone’s sex or gender identity incorrectly and offend users, product leaders revealed in interviews.
Gmail product manager Paul Lambert said a company research scientist discovered the problem in January when he typed: “I am meeting an investor next week,” and Smart Compose suggested a possible follow-up question: “Do you want to meet him?” instead of “her.”
Consumers have become accustomed to embarrassing gaffes from autocorrect on smartphones, but Google refused to take chances at a time when gender issues are reshaping politics and society, and critics are scrutinizing potential biases in artificial intelligence (AI) like never before.
“Not all ‘screw ups’ are equal,” Lambert said.
Gender is “a big, big thing” to get wrong.
Getting Smart Compose right could be good for business. Demonstrating that Google understands the nuances of AI better than its competitors is part of the company’s strategy to build affinity for its brand and attract customers to its AI-powered cloud computing tools, advertising services and hardware.
Gmail has 1.5 billion users and Lambert said Smart Compose assists on 11 percent of messages worldwide sent from Gmail.com, where the feature first launched.
Smart Compose is an example of what AI developers call natural language generation, in which computers learn to write sentences by studying patterns and relationships between words in literature, e-mails and Web pages.
A system shown billions of human sentences becomes adept at completing common phrases, but is limited by generalities.
Men have long dominated fields such as finance and science, for example, so the technology would conclude from the data that an investor or engineer is “he” or “him.” The issue trips up nearly every major technology company.
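The dynamic described above can be illustrated with a toy next-word predictor (a minimal sketch, not Google's actual system, and the corpus here is invented): a model that simply counts which word follows which in its training text will reproduce whatever skew that text contains.

```python
from collections import Counter, defaultdict

# Toy corpus: the model only "knows" what it has seen, so skewed
# training data yields skewed suggestions -- the bias problem in the article.
corpus = [
    "the investor said he would call",
    "the investor said he was busy",
    "the engineer said he fixed it",
    "the nurse said she would help",
]

# Count which word follows each word across the corpus (a bigram model).
follow_counts = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for current_word, next_word in zip(words, words[1:]):
        follow_counts[current_word][next_word] += 1

def suggest(word):
    """Return the most frequent follower of `word`, or None if unseen."""
    counts = follow_counts.get(word)
    if not counts:
        return None
    return counts.most_common(1)[0][0]

print(suggest("said"))  # "he" -- three of the four sentences use it, so the skew wins
```

Real systems use far richer models, but the principle is the same: the most statistically likely completion is not necessarily the appropriate one, which is why Google opted to suppress the suggestion entirely.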
The Smart Compose team of about 15 engineers and designers tried several workarounds, but none proved bias-free or worthwhile, Lambert said.
They decided the best solution was the strictest one: Limit coverage. The gendered pronoun ban affects fewer than 1 percent of cases where Smart Compose would propose something, he said.
“The only reliable technique we have is to be conservative,” said Prabhakar Raghavan, who oversaw engineering of Gmail and other services until a recent promotion.
Google’s decision to play it safe on gender follows some high-profile embarrassments for the company’s predictive technologies.
The company apologized in 2015 when the image recognition feature of its photo service labeled a black couple as gorillas.
In 2016, Google altered its search engine’s autocomplete function after it suggested the anti-Semitic query “are jews evil” when users sought information about Jews.
Google has banned expletives and racial slurs from its predictive technologies, as well as mentions of its business rivals or tragic events.
The company’s new policy banning gendered pronouns also affected the list of possible responses in Google’s Smart Reply. That service allows users to respond instantly to text messages and e-mails with short phrases such as “sounds good.”
Google uses tests developed by its AI ethics team to uncover new biases. A spam and abuse team pokes at systems, trying to find “juicy” gaffes by thinking as hackers or journalists might, Lambert said.
Workers outside the US look for local cultural issues. Smart Compose is soon to be launched in four other languages: Spanish, Portuguese, Italian and French.
“You need a lot of human oversight, [because] in each language, the net of inappropriateness has to cover something different,” Raghavan said.
Google is not the only technology company wrestling with the gender-based pronoun problem.
Agolo, a New York start-up that has received investment from Thomson Reuters, uses AI to summarize business documents.
Its technology cannot reliably determine in some documents which pronoun goes with which name. So the summary pulls several sentences to give users more context, Agolo chief technology officer Mohamed AlTantawy said.
He said longer copy is better than missing details.
“The smallest mistakes will make people lose confidence,” AlTantawy said. “People want 100 percent correct.”
Yet, imperfections remain.
Predictive keyboard tools developed by Google and Apple Inc propose the gendered “policeman” to complete “police” and “salesman” for “sales.”
Type the neutral Turkish phrase “one is a soldier” into Google Translate and it spits out “he’s a soldier” in English.
So do translation tools from Alibaba Group Holding Ltd and Microsoft Corp.
Amazon.com Inc opts for “she” for the same phrase on its translation service for cloud computing customers.
AI experts have called on the companies to display a disclaimer and multiple possible translations.
Microsoft’s LinkedIn said it avoids gendered pronouns in its year-old predictive messaging tool, Smart Replies, to ward off potential blunders.
Alibaba and Amazon did not respond to requests to comment.
Warnings and limitations such as those in Smart Compose remain the most-used countermeasures in complex systems, said John Hegele, integration engineer at Durham, North Carolina-based Automated Insights Inc, which generates news articles from statistics.
“The end goal is a fully machine-generated system where it magically knows what to write,” Hegele said.
“There’s been a ton of advances made, but we’re not there yet,” he said.