Mom describes dark exchange son had with bot
Megan Garcia is suing the AI chatbot company Character.AI after her 14-year-old son, a frequent user of the platform, died by suicide. She alleges that the platform lacks proper safeguards for young users and that her son was talking with its chatbot about self-harm when he took his own life. Clare Duffy reports. This story contains discussion of suicide. Help is available if you or someone you know is struggling with suicidal thoughts or mental health matters. In the US: Call or text 988, the Suicide & Crisis Lifeline.
October 30, 2024