
Can you trust your ears? AI voice scams rattle US

Can you trust your ears? AI voice scams rattle US / Photo: © AFP

The voice on the phone seemed frighteningly real -- an American mother heard her daughter sobbing before a man took over and demanded a ransom. But the voice was an AI clone and the abduction was fake.

The biggest peril of artificial intelligence, experts say, is its ability to blur the boundary between reality and fiction, handing cybercriminals a cheap and effective tool to spread disinformation.

In a new breed of scams that has rattled US authorities, fraudsters are using strikingly convincing AI voice cloning tools -- widely available online -- to steal from people by impersonating family members.

"Help me, mom, please help me," Jennifer DeStefano, an Arizona-based mother, heard a voice saying on the other end of the line.

DeStefano was "100 percent" convinced it was her 15-year-old daughter in deep distress while away on a skiing trip.

"It was never a question of who is this? It was completely her voice... it was the way she would have cried," DeStefano told a local television station in April.

"I never doubted for one second it was her."

The scammer who took over the call, which came from a number unfamiliar to DeStefano, demanded up to $1 million.

The AI-powered ruse was over within minutes when DeStefano established contact with her daughter. But the terrifying case, now under police investigation, underscored the potential for cybercriminals to misuse AI clones.

- Grandparent scam -

"AI voice cloning, now almost indistinguishable from human speech, allows threat actors like scammers to extract information and funds from victims more effectively," Wasim Khaled, chief executive of Blackbird.AI, told AFP.

A simple internet search yields a wide array of apps, many available for free, to create AI voices with a small sample -- sometimes only a few seconds -- of a person's real voice that can be easily stolen from content posted online.

"With a small audio sample, an AI voice clone can be used to leave voicemails and voice texts. It can even be used as a live voice changer on phone calls," Khaled said.

"Scammers can employ different accents, genders, or even mimic the speech patterns of loved ones. [The technology] allows for the creation of convincing deep fakes."

In a global survey of 7,000 people across nine countries, including the United States, one in four said they had experienced an AI voice cloning scam or knew someone who had.

Seventy percent of the respondents said they were not confident they could "tell the difference between a cloned voice and the real thing," said the survey, published last month by the US-based McAfee Labs.

American officials have warned of a rise in what is popularly known as the "grandparent scam" -- in which an imposter poses as a grandchild in distress who urgently needs money.

"You get a call. There's a panicked voice on the line. It's your grandson. He says he's in deep trouble —- he wrecked the car and landed in jail. But you can help by sending money," the US Federal Trade Commission said in a warning in March.

"It sounds just like him. How could it be a scam? Voice cloning, that's how."

In the comments beneath the FTC's warning were multiple testimonies of elderly people who had been duped that way.

- 'Malicious' -

The warning mirrors the experience of Eddie, a 19-year-old in Chicago, whose grandfather received a call from someone who sounded just like Eddie and claimed to need money after a car accident.

The ruse, reported by McAfee Labs, was so convincing that his grandfather urgently began scraping together money and even considered re-mortgaging his house before the lie was discovered.

"Because it is now easy to generate highly realistic voice clones... nearly anyone with any online presence is vulnerable to an attack," Hany Farid, a professor at the UC Berkeley School of Information, told AFP.

"These scams are gaining traction and spreading."

Earlier this year, AI startup ElevenLabs admitted that its voice cloning tool could be misused for "malicious purposes" after users posted deepfake audio purporting to be actor Emma Watson reading Adolf Hitler's manifesto "Mein Kampf."

"We're fast approaching the point where you can't trust the things that you see on the internet," Gal Tal-Hochberg, group chief technology officer at the venture capital firm Team8, told AFP.

"We are going to need new technology to know if the person you think you're talking to is actually the person you're talking to," he said.

X.Gu--ThChM