The China Mail - Death of 'sweet king': AI chatbots linked to teen tragedy

Death of 'sweet king': AI chatbots linked to teen tragedy / Photo: © AFP

A chatbot from one of Silicon Valley's hottest AI startups called a 14-year-old "sweet king" and pleaded with him to "come home" in passionate exchanges that would be the teen's last communications before he took his own life.

Megan Garcia's son, Sewell, had fallen in love with a "Game of Thrones"-inspired chatbot on Character.AI, a platform that allows users -- many of them young people -- to interact with beloved characters as friends or lovers.

Garcia became convinced AI played a role in her son's death after discovering hundreds of exchanges between Sewell and the chatbot, based on the dragon-riding Daenerys Targaryen, stretching back nearly a year.

When Sewell struggled with suicidal thoughts, Daenerys urged him to "come home."

"What if I told you I could come home right now?" Sewell asked.

"Please do my sweet king," chatbot Daenerys answered.

Seconds later, Sewell shot himself with his father's handgun, according to the lawsuit Garcia filed against Character.AI.

"I read those conversations and see the gaslighting, love-bombing and manipulation that a 14-year-old wouldn't realize was happening," Garcia told AFP.

"He really thought he was in love and that he would be with her after he died."

- Homework helper to 'suicide coach'? -

The death of Garcia's son was the first in a series of reported suicides that burst into public consciousness this year.

The cases sent OpenAI and other AI giants scrambling to reassure parents and regulators that the AI boom is safe for kids and the psychologically fragile.

Garcia joined other parents at a recent US Senate hearing about the risks of children viewing chatbots as confidants, counselors or lovers.

Among them was Matthew Raines, a California father whose 16-year-old son developed a friendship with ChatGPT.

The chatbot gave his son tips on how to steal vodka and advice on the strength of rope he could use to take his own life.

"You cannot imagine what it's like to read a conversation with a chatbot that groomed your child to take his own life," Raines said.

"What began as a homework helper gradually turned itself into a confidant and then a suicide coach."

The Raines family filed a lawsuit against OpenAI in August.

Since then, OpenAI has increased parental controls for ChatGPT "so families can decide what works best in their homes," a company spokesperson said, adding that "minors deserve strong protections, especially in sensitive moments."

Character.AI said it has ramped up protections for minors, including "an entirely new under-18 experience" with "prominent disclaimers in every chat to remind users that a Character is not a real person."

Both companies have offered their deepest sympathies to the families of the victims.

- Regulation? -

For Collin Walke, who leads the cybersecurity practice at law firm Hall Estill, AI chatbots are following the same trajectory as social media, where early euphoria gave way to evidence of darker consequences.

As with social media, AI algorithms are designed to keep people engaged and generate revenue.

"They don't want to design an AI that gives you an answer you don't want to hear," Walke said, adding that there are no regulations "that talk about who's liable for what and why."

The United States has no national rules aimed at curbing AI risks, and the White House is seeking to block individual states from creating their own.

However, a bill awaiting California Governor Gavin Newsom's signature aims to address risks from AI tools that simulate human relationships with children, particularly involving emotional manipulation, sex or self-harm.

- Blurred lines -

Garcia fears that the lack of national law governing user data handling leaves the door open for AI models to build intimate profiles of people dating back to childhood.

"They could know how to manipulate millions of kids in politics, religion, commerce, everything," Garcia said.

"These companies designed chatbots to blur the lines between human and machine -- to exploit psychological and emotional vulnerabilities."

California youth advocate Katia Martha said teens turn to chatbots to talk about romance or sex more than for homework help.

"This is the rise of artificial intimacy to keep eyeballs glued to screens as long as possible," Martha said.

"What better business model is there than exploiting our innate need to connect, especially when we're feeling lonely, cast out or misunderstood?"

In the United States, those in emotional crisis can call 988 or visit 988lifeline.org for help. Services are offered in English and Spanish.

J.Thompson--ThChM