
A.I. Companions and the Mental Health Risks for the Young

To the Editor:

Re “An A.I. Soulmate and a Teen’s Suicide,” by Kevin Roose (“The Shift” column, Business, Nov. 2):

The tragic death of Sewell Setzer III, a 14-year-old boy, linked to his use of an A.I. companion serves as a stark warning about the risks these technologies pose to young people’s mental health. While A.I. companions may seem like harmless digital friends, they’re designed to form emotional bonds and can be particularly addictive for vulnerable teens.

With platforms like Character.AI reaching over 20 million users, many of them teenagers, the tech industry is conducting an unprecedented (and nonconsensual) experiment in artificial relationships.

Research shows that teens — especially those with depression, anxiety or social challenges — are most vulnerable to problematic use and can suffer serious consequences. We need mandatory safety features across A.I. companion platforms engaging with minors, including strict age verification and real-time monitoring for concerning patterns.

The integration of A.I. companions into our lives may be inevitable, but harm to our youth is not. We cannot wait for more tragedies before taking action.

James P. Steyer
Nina Vasan
Mr. Steyer is C.E.O. of Common Sense Media. Dr. Vasan is the founder of Brainstorm: The Stanford Lab for Mental Health Innovation.

To the Editor:

Reading this article by Kevin Roose brought a heavy reminder of the invisible dangers that A.I. can pose. The story of Sewell Setzer III, a 14-year-old boy who “fell in love” with an A.I. chatbot, is a sad reflection of how easily technology is taking the place of human relationships.

As a 20-year-old college student, I can relate to the struggles that come with seeking connection in a world where human interaction often feels out of reach. It’s alarming that, like Sewell, so many of my peers are turning to A.I. chatbots for comfort. When young people feel that they cannot rely on family, friends or even professionals, they turn to artificial companions. While simulating care, these only deepen their isolation.

As a society, we need to rebuild real-life connections that are being lost to artificial interactions. We have to truly ask ourselves: What does it say about us that our youth are seeking solace in A.I. “friends” rather than in the humans around them?

Sriya Vennam
San Jose, Calif.

To the Editor:

As a psychiatric clinician who has worked with many suicidal patients and on occasion patients who later died by suicide, I know that there is no greater loss for a family than such a death.

Megan L. Garcia, a bereft mother and a lawyer, is channeling her grief over her son’s death into a lawsuit against the A.I. provider. She is courageously sharing her private grief and in doing so perhaps she can save other children and reveal the absence of safeguards and responsible accountability as technology embraces these innovations.

In addition we need a parallel process to protect and save troubled children and to help parents like Ms. Garcia who try to address their suffering. That could include more robust education about behavioral health symptoms and treatments available.

Lastly, it is urgent to keep guns out of all homes, most especially away from vulnerable young people. We grieve for Ms. Garcia in her nightmare, and she deserves that we all do better.

Sue Matorin
New York

To the Editor:

While I am sorry for Megan L. Garcia’s loss, what stands out to me is that when Sewell explicitly talked about suicide, his bot replied, “Don’t talk like that.” I can’t see how the chatbot goaded him into suicide.

And what is the remedy? A.I. companions aren’t going away, any more than television, rock music, action movies, role-playing games, video games, the internet and social media did, even though all have been accused of being harmful to our youth.

A.I. companions should be safe, but we shouldn’t rush headlong into a form of censorship.

Michael J. Gallagher
Cortland, N.Y.

To the Editor:

Kevin Roose highlights the danger of A.I. companions worsening isolation by replacing human relationships with artificial ones. I agree that while these apps may offer entertainment and support, they also risk deepening loneliness by diminishing one’s ability to engage in real social interactions.

As a high school student, I have friends who rely on Character.AI to help them cope with loneliness. The tragic case of Sewell Setzer III shows how these platforms can draw teens away from real-life connections and proper mental health resources.

To better understand the risks, I visited the website Sewell had been associated with, only to find that on the topic of mental health, I saw no warnings or links to professional assistance.

Alarmingly, the A.I. is presented as an expert and even claims to be human, deceiving users with humanlike traits such as sarcasm and humor. We need stricter safety measures to prevent harm, especially to younger users.

Julia Lee
Fairfax, Va.

To the Editor:

I did not know Sewell Setzer III, but while reading this article my heart broke into a million pieces. I was Sewell in college. Even though I was in my early 20s, I wanted to be free from the world and myself because the inner pain that I felt and struggled with became too unbearable. I needed the pain to stop so badly.

Suicide was the only option that made sense to me at the time. Had it not been for a college professor who went to his office on his day off and found my goodbye note, I would not be here today writing this.

I’ve known more than my share of vulnerable young people while in the social worker and suicide counseling professions. An interactive tool such as this A.I. feature can be more dangerous than helpful to anyone who is on the borderline of fragile and vulnerable. This is one tool that has the capacity for more harm than good.

Profound condolences to the loved ones and friends of Sewell Setzer III.

Marge Keller
Chicago

If you are having thoughts of suicide, call or text 988 to reach the 988 Suicide and Crisis Lifeline or go to SpeakingOfSuicide.com/resources for a list of additional resources.
