In the pre-dawn hours, Ann Li’s anxieties felt overwhelming. She’d recently been diagnosed with a serious health problem, and she just wanted to talk to someone about it. But she hadn’t told her family, and all her friends were asleep. So instead, she turned to ChatGPT.
“It’s easier to talk to AI during those nights,” said the 30-year-old Taiwanese woman.
In China, Yang, a 25-year-old Guangdong resident, had never seen a mental health professional when she started talking to an AI chatbot earlier this year. Yang says it was difficult to access mental health services, and she couldn’t contemplate confiding in family or friends.
“Telling the truth to real people feels impossible,” she says.
But she was soon talking to the chatbot “day and night.”
Li and Yang are among a growing number of Chinese-speaking people turning to generative AI chatbots instead of professional human therapists. Experts say there is huge potential for AI in the mental health sector, but are concerned about the risks of people in distress turning to the technology, rather than human beings, for medical assistance.
There are few official statistics, but mental health professionals in Taiwan and China have reported rising rates of patients consulting AI before seeing them, or instead of seeing them. Surveys, including a global analysis recently published by Harvard Business Review, show psychological assistance is now a leading reason for adults to use AI chatbots. On social media there are hundreds of thousands of posts from people praising AI for helping them.
It comes amid rising rates of mental illness in Taiwan and China, particularly among younger people. Access to services is not keeping pace — appointments are hard to get, and they're expensive. Chatbot users say AI saves them time and money, gives real answers and is more discreet in a society where there is still stigma around mental health.
"In some way the chatbot does help us — it's accessible, especially when ethnic Chinese tend to suppress or downplay our feelings," says Dr Yi-Hsien Su, a clinical psychologist at True Colors in Taiwan, who also works in schools and hospitals to promote mental wellbeing.
“I talk to people from Gen Z and they’re more willing to talk about problems and difficulties … But there’s still much to do.”
In Taiwan, the most popular chatbot is ChatGPT. In China, where Western apps like ChatGPT are banned, people have turned to domestic offerings like Baidu's Ernie Bot, or the recently launched DeepSeek. All are advancing rapidly and incorporating wellbeing and therapy into their responses as demand increases.
User experiences vary. Li says ChatGPT gives her what she wants to hear, but that can also be predictable and uninsightful. She also misses the process of self-discovery in counseling.
“I think AI tends to give you the answer, the conclusion that you would get after you finish maybe two or three sessions of therapy,” she says.
Yet 27-year-old Nabi Liu, a Taiwanese woman based in London, has found the experience to be very fulfilling.
“When you share something with a friend, they might not always relate. But ChatGPT responds seriously and immediately,” she says. “I feel like it’s genuinely responding to me each time.”
Experts say it can assist people who are in distress but perhaps don’t need professional help yet, like Li, or those who need a little encouragement to take the next step.
Yang says she doubted whether her struggles were serious enough to warrant professional help.
“Only recently have I begun to realize that I might actually need a proper diagnosis at a hospital,” she says.
“Going from being able to talk [to AI] to being able to talk to real people might sound simple and basic, but for the person I was before, it was unimaginable.”
But experts have also raised concerns about people falling through the cracks, missing the signs that Yang saw for herself, and not getting the help they need.
There have been tragic cases in recent years of young people in distress seeking help from chatbots instead of professionals, and later taking their own lives.
"AI mostly deals with text, but there are things we call nonverbal input. When a patient comes in, maybe they act differently to how they speak, but we can recognize those inputs," Su says.
A spokesperson for the Taiwan Counseling Psychology Association says AI can be an "auxiliary tool," but cannot replace professional assistance, "let alone the intervention and treatment of psychologists in crisis situations."
“AI has the potential to become an important resource for promoting the popularization of mental health. However, the complexity and interpersonal depth of the clinical scene still require the real ‘present’ psychological professional.”
The association says AI can be “overly positive,” miss cues and delay necessary medical care. It also operates outside the peer review and ethics codes of the profession.
“In the long run, unless AI develops breakthrough technologies beyond current imagination, the core structure of psychotherapy should not be shaken.”
Su says he’s excited about the ways AI could modernize and improve his industry, noting potential uses in training of professionals and detecting people online who might need intervention. But for now he recommends people approach the tools with caution.
“It’s a simulation, it’s a good tool, but has limits and you don’t know how the answer was made,” he says.