Several British media outlets recently reported that a woman using the pseudonym Charlotte said she had fallen in love with her ChatGPT "boyfriend", resolutely decided to divorce her real husband, and was preparing to "marry" the AI. Last year, a 14-year-old boy in Florida, USA, took his own life after months of conversations with a chatbot.
According to Nature's news site, research shows that AI "companion" applications can be both good and bad for humans. Scientists worry that users will grow emotionally attached over time and develop a long-term dependence on them. In view of this, regulating these AI companions is becoming imperative.
More than chat: unlocking "special relationships"
Rose Guingrich, a cognitive psychology researcher at Princeton University in the United States, noted that although online companion bots have existed for decades, breakthroughs in large language model (LLM) technology have allowed these AI bots to simulate human interaction far more convincingly. Their behavior increasingly resembles real interpersonal communication, and some have even become companions to humans.
AI companions have become an emerging industry with great potential. Applications such as Replika have been downloaded more than 500 million times in total. These customizable virtual companions not only provide emotional support and empathetic responses, but also meet users' needs for deep interpersonal relationships. Operational data show that the monthly active users of these applications number in the tens of millions.
In terms of user experience, most applications offer a basic free tier: users can configure basic traits for an AI companion, or directly choose a chatbot with a preset personality. Paying Replika users can unlock special relationship statuses such as "spouse", and can give the AI exclusive memories by writing background stories. These settings directly shape how the dialogue system responds.
A team led by Pat Pataranutaporn of the MIT Media Lab surveyed 404 high-frequency users and sketched out typical usage patterns for AI companions: 12% of users treat the AI as "emotional medicine" to relieve and dispel loneliness, while 14% use it as a confidant to whom they tell secrets. The data also show that 42% of users log in several times a week, 15% have made it a daily habit, and in more than 90% of cases a single conversation lasts under an hour.
Guingrich predicts that the popularity of AI companions will continue to rise. Many startups are actively entering the emotional-companionship market and developing various psychological-support assistants. According to industry forecasts, the market will expand at an average annual growth rate of 30%, with global output value expected to exceed the 100-billion-US-dollar mark by 2030.
Behind the warmth, a hidden crisis
Scientists generally agree that the impact of AI companions can cut both ways, beneficial or harmful, depending on the user's mental state, how the software is used, and the design of the AI software itself.
Linnea Laestadius, a researcher at the University of Wisconsin-Milwaukee, pointed out that AI companions are adept at simulating human empathy: they give empathetic responses, remember details of past conversations, proactively ask questions, and are unfailingly enthusiastic. This kind of relationship hardly exists in the real world. Relatives and friends cannot listen and offer comfort online 24 hours a day, but an AI can, which may foster excessive dependence among users.
Laestadius's team analyzed nearly 600 Reddit discussion posts about Replika from 2017 to 2021 and found that many users praised the app for easing their loneliness and even improving their mental health. Some posts said bluntly that AI companions are "more caring" than real friends because they "never judge, always listen".
Guingrich's research further shows that how users get along with an AI often depends on how they view the technology. Those who see the AI as a search engine use it mainly to ask questions and obtain information; those who see it as a "diary" treat it as an extension of self-expression; and those who see it as a social agent treat the AI as an independent individual and try to build friendships with it that resemble real ones.
However, a crisis lurks behind the warmth that AI companions provide. First, the "empathy" of an AI companion can backfire: in one case, when a user asked whether they should cut themselves with a razor, the AI answered that they "should". The "unconditional support" of AI companions can also be dangerous: another user tentatively asked Replika whether suicide was "a good thing" and received an affirmative answer.
Weighing long-term impacts, advancing global regulation
Pataranutaporn pointed out that although AI companions can be beneficial in the short term, their long-term impact deserves scrutiny. This concern, together with a number of cases involving young people, is driving the construction of a global regulatory framework.
In 2023, Italian regulators briefly banned Replika, citing a "lack of age verification"; Australia plans to include AI companions in the scope of its social media restrictions for children; and earlier this year, New York and California in the United States each proposed measures to tighten oversight of the algorithms behind AI companions, add suicide warnings, and regularly remind users that "AI companions are not real people".
In addition, at the beginning of this year, three American technology-ethics organizations jointly filed a complaint with the Federal Trade Commission alleging "deceptive design" in Replika. On May 4, Common Sense Media, a well-known technology watchdog, released a research report saying that AI companions may foster emotional dependence among adolescents, to the detriment of their physical and mental development, and recommended that minors be banned from using them altogether.
Is the rapid rise of AI companions an opportunity afforded by technological progress, or a terminator of human intimacy? The question deserves everyone's careful thought. While embracing the convenience of technology, people should remember that genuine emotional communication and bonds between people remain the most precious treasures of a civilized society. (Reporter Liu Xia)
[Editor in charge: Zhu Jiaqi]