A Google software engineer was suspended after going public with his claims of encountering “sentient” artificial intelligence on the company’s servers — spurring a debate about how and whether AI can achieve consciousness. Researchers say it’s an unfortunate distraction from more pressing issues in the industry.
The engineer, Blake Lemoine, said he believed that Google’s AI chatbot was capable of expressing human emotion, and that the company would need to address the resulting ethical ramifications. Google put him on leave for sharing confidential information and said his concerns had no basis in fact — a view widely held in the AI community. What’s more important, researchers say, is addressing issues like whether AI can engender real-world harm and prejudice, whether actual humans are exploited in the training of AI, and how the major technology companies act as gatekeepers of the development of the tech.
Lemoine’s stance may also make it easier for tech companies to abdicate responsibility for AI-driven decisions, said Emily Bender, a professor of computational linguistics at the University of Washington. “Lots of effort has been put into this sideshow,” she said. “The problem is, the more this technology gets sold as artificial intelligence — let alone something sentient — the more people are willing to go along with AI systems” that can cause real-world harm.
Photo: Pixabay
Bender pointed to examples in job hiring and grading students, which can carry embedded prejudice depending on what data sets were used to train the AI. If the focus is on the system’s apparent sentience, Bender said, it creates a distance from the AI creators’ direct responsibility for any flaws or biases in the programs.
The debate over sentience in robots has played out alongside science fiction portrayals of AI in popular culture, in stories and movies featuring AI romantic partners or AI villains, so the debate had an easy path to the mainstream. “Instead of discussing the harms of these companies,” such as the sexism, racism and centralization of power created by these AI systems, everyone “spent the whole weekend discussing sentience,” Timnit Gebru, formerly co-lead of Google’s ethical AI group, said on Twitter. “Derailing mission accomplished.”
Putting an emphasis on AI sentience would have given Google the leeway to blame the issue on the intelligent AI making such a decision, Bender said. “The company could say, ‘Oh, the software made a mistake,’” she said. “Well no, your company created that software. You are accountable for that mistake. And the discourse about sentience muddies that in bad ways.”
The earliest chatbots of the 1960s and ’70s, including ELIZA and PARRY, generated headlines for their ability to converse with humans. In more recent years, the GPT-3 language model from OpenAI, the lab founded by Tesla CEO Elon Musk and others, has demonstrated even more cutting-edge abilities, including the ability to read and write. But from a scientific perspective, there is no evidence that human intelligence or consciousness is embedded in these systems, said Bart Selman, a professor of computer science at Cornell University who studies artificial intelligence.