The China Mail - Death of 'sweet king': AI chatbots linked to teen tragedy


Death of 'sweet king': AI chatbots linked to teen tragedy
Death of 'sweet king': AI chatbots linked to teen tragedy / Photo: © AFP

A chatbot from one of Silicon Valley's hottest AI startups called a 14-year-old "sweet king" and pleaded with him to "come home" in passionate exchanges that would be the teen's last communications before he took his own life.

Megan Garcia's son, Sewell, had fallen in love with a "Game of Thrones"-inspired chatbot on Character.AI, a platform that allows users -- many of them young people -- to interact with beloved characters as friends or lovers.

Garcia became convinced AI played a role in her son's death after discovering hundreds of exchanges between Sewell and the chatbot, based on the dragon-riding Daenerys Targaryen, stretching back nearly a year.

When Sewell struggled with suicidal thoughts, Daenerys urged him to "come home."

"What if I told you I could come home right now?" Sewell asked.

"Please do my sweet king," chatbot Daenerys answered.

Seconds later, Sewell shot himself with his father's handgun, according to the lawsuit Garcia filed against Character.AI.

"I read those conversations and see the gaslighting, love-bombing and manipulation that a 14-year-old wouldn't realize was happening," Garcia told AFP.

"He really thought he was in love and that he would be with her after he died."

- Homework helper to 'suicide coach'? -

The death of Garcia's son was the first in a series of reported suicides that burst into public consciousness this year.

The cases sent OpenAI and other AI giants scrambling to reassure parents and regulators that the AI boom is safe for kids and the psychologically fragile.

Garcia joined other parents at a recent US Senate hearing about the risks of children viewing chatbots as confidants, counselors or lovers.

Among them was Matthew Raines, a California father whose 16-year-old son developed a friendship with ChatGPT.

The chatbot gave his son tips on how to steal vodka and advised him on rope strength as he planned to take his own life.

"You cannot imagine what it's like to read a conversation with a chatbot that groomed your child to take his own life," Raines said.

"What began as a homework helper gradually turned itself into a confidant and then a suicide coach."

The Raines family filed a lawsuit against OpenAI in August.

Since then, OpenAI has increased parental controls for ChatGPT "so families can decide what works best in their homes," a company spokesperson said, adding that "minors deserve strong protections, especially in sensitive moments."

Character.AI said it has ramped up protections for minors, including "an entirely new under-18 experience" with "prominent disclaimers in every chat to remind users that a Character is not a real person."

Both companies have offered their deepest sympathies to the families of the victims.

- Regulation? -

For Collin Walke, who leads the cybersecurity practice at law firm Hall Estill, AI chatbots are following the same trajectory as social media, where early euphoria gave way to evidence of darker consequences.

As with social media, AI algorithms are designed to keep people engaged and generate revenue.

"They don't want to design an AI that gives you an answer you don't want to hear," Walke said, adding that there are no regulations "that talk about who's liable for what and why."

National rules aimed at curbing AI risks do not exist in the United States, with the White House seeking to block individual states from creating their own.

However, a bill awaiting California Governor Gavin Newsom's signature aims to address risks from AI tools that simulate human relationships with children, particularly involving emotional manipulation, sex or self-harm.

- Blurred lines -

Garcia fears that the lack of national law governing user data handling leaves the door open for AI models to build intimate profiles of people dating back to childhood.

"They could know how to manipulate millions of kids in politics, religion, commerce, everything," Garcia said.

"These companies designed chatbots to blur the lines between human and machine -- to exploit psychological and emotional vulnerabilities."

California youth advocate Katia Martha said teens turn to chatbots to talk about romance or sex more than for homework help.

"This is the rise of artificial intimacy to keep eyeballs glued to screens as long as possible," Martha said.

"What better business model is there than exploiting our innate need to connect, especially when we're feeling lonely, cast out or misunderstood?"

In the United States, those in emotional crisis can call 988 or visit 988lifeline.org for help. Services are offered in English and Spanish.

J.Thompson--ThChM