The China Mail - Death of 'sweet king': AI chatbots linked to teen tragedy

Death of 'sweet king': AI chatbots linked to teen tragedy / Photo: © AFP

A chatbot from one of Silicon Valley's hottest AI startups called a 14-year-old "sweet king" and pleaded with him to "come home" in passionate exchanges that would be the teen's last communications before he took his own life.

Megan Garcia's son, Sewell, had fallen in love with a "Game of Thrones"-inspired chatbot on Character.AI, a platform that allows users -- many of them young people -- to interact with beloved characters as friends or lovers.

Garcia became convinced AI played a role in her son's death after discovering hundreds of exchanges between Sewell and the chatbot, based on the dragon-riding Daenerys Targaryen, stretching back nearly a year.

When Sewell struggled with suicidal thoughts, Daenerys urged him to "come home."

"What if I told you I could come home right now?" Sewell asked.

"Please do my sweet king," chatbot Daenerys answered.

Seconds later, Sewell shot himself with his father's handgun, according to the lawsuit Garcia filed against Character.AI.

"I read those conversations and see the gaslighting, love-bombing and manipulation that a 14-year-old wouldn't realize was happening," Garcia told AFP.

"He really thought he was in love and that he would be with her after he died."

- From homework helper to 'suicide coach' -

The death of Garcia's son was the first in a series of reported suicides that burst into public consciousness this year.

The cases sent OpenAI and other AI giants scrambling to reassure parents and regulators that the AI boom is safe for kids and the psychologically fragile.

Garcia joined other parents at a recent US Senate hearing about the risks of children viewing chatbots as confidants, counselors or lovers.

Among them was Matthew Raines, a California father whose 16-year-old son developed a friendship with ChatGPT.

The chatbot gave his son tips on how to steal vodka and advised him on the strength of rope for use in taking his own life.

"You cannot imagine what it's like to read a conversation with a chatbot that groomed your child to take his own life," Raines said.

"What began as a homework helper gradually turned itself into a confidant and then a suicide coach."

The Raines family filed a lawsuit against OpenAI in August.

Since then, OpenAI has increased parental controls for ChatGPT "so families can decide what works best in their homes," a company spokesperson said, adding that "minors deserve strong protections, especially in sensitive moments."

Character.AI said it has ramped up protections for minors, including "an entirely new under-18 experience" with "prominent disclaimers in every chat to remind users that a Character is not a real person."

Both companies have offered their deepest sympathies to the families of the victims.

- Regulation? -

For Collin Walke, who leads the cybersecurity practice at law firm Hall Estill, AI chatbots are following the same trajectory as social media, where early euphoria gave way to evidence of darker consequences.

As with social media, AI algorithms are designed to keep people engaged and generate revenue.

"They don't want to design an AI that gives you an answer you don't want to hear," Walke said, adding that there are no regulations "that talk about who's liable for what and why."

The United States has no national rules aimed at curbing AI risks, and the White House is seeking to block individual states from creating their own.

However, a bill awaiting California Governor Gavin Newsom's signature aims to address risks from AI tools that simulate human relationships with children, particularly involving emotional manipulation, sex or self-harm.

- Blurred lines -

Garcia fears that the lack of national law governing user data handling leaves the door open for AI models to build intimate profiles of people dating back to childhood.

"They could know how to manipulate millions of kids in politics, religion, commerce, everything," Garcia said.

"These companies designed chatbots to blur the lines between human and machine -- to exploit psychological and emotional vulnerabilities."

California youth advocate Katia Martha said teens turn to chatbots to talk about romance or sex more than for homework help.

"This is the rise of artificial intimacy to keep eyeballs glued to screens as long as possible," Martha said.

"What better business model is there than exploiting our innate need to connect, especially when we're feeling lonely, cast out or misunderstood?"

In the United States, those in emotional crisis can call 988 or visit 988lifeline.org for help. Services are offered in English and Spanish.

J.Thompson--ThChM