Can you trust your ears? AI voice scams rattle US
Photo: © AFP

The voice on the phone seemed frighteningly real -- an American mother heard her daughter sobbing before a man took over and demanded a ransom. But the voice was an AI clone and the abduction was fake.

The biggest peril of artificial intelligence, experts say, is its ability to demolish the boundary between reality and fiction, handing cybercriminals a cheap and effective technology to propagate disinformation.

In a new breed of scams that has rattled US authorities, fraudsters are using strikingly convincing AI voice cloning tools -- widely available online -- to steal from people by impersonating family members.

"Help me, mom, please help me," Jennifer DeStefano, an Arizona-based mother, heard a voice saying on the other end of the line.

DeStefano was "100 percent" convinced it was her 15-year-old daughter in deep distress while away on a skiing trip.

"It was never a question of who is this? It was completely her voice... it was the way she would have cried," DeStefano told a local television station in April.

"I never doubted for one second it was her."

The scammer who took over the call, which came from a number unfamiliar to DeStefano, demanded up to $1 million.

The AI-powered ruse was over within minutes when DeStefano established contact with her daughter. But the terrifying case, now under police investigation, underscored the potential for cybercriminals to misuse AI clones.

- Grandparent scam -

"AI voice cloning, now almost indistinguishable from human speech, allows threat actors like scammers to extract information and funds from victims more effectively," Wasim Khaled, chief executive of Blackbird.AI, told AFP.

A simple internet search yields a wide array of apps, many available for free, that create AI voices from a small sample -- sometimes only a few seconds -- of a person's real voice, which can easily be taken from content posted online.

"With a small audio sample, an AI voice clone can be used to leave voicemails and voice texts. It can even be used as a live voice changer on phone calls," Khaled said.

"Scammers can employ different accents, genders, or even mimic the speech patterns of loved ones. [The technology] allows for the creation of convincing deep fakes."

In a global survey of 7,000 people from nine countries, including the United States, one in four people said they had experienced an AI voice cloning scam or knew someone who had.

Seventy percent of the respondents said they were not confident they could "tell the difference between a cloned voice and the real thing," said the survey, published last month by the US-based McAfee Labs.

American officials have warned of a rise in what is popularly known as the "grandparent scam" -- where an imposter poses as a grandchild in urgent need of money in a distressful situation.

"You get a call. There's a panicked voice on the line. It's your grandson. He says he's in deep trouble —- he wrecked the car and landed in jail. But you can help by sending money," the US Federal Trade Commission said in a warning in March.

"It sounds just like him. How could it be a scam? Voice cloning, that's how."

In the comments beneath the FTC's warning were multiple testimonies of elderly people who had been duped that way.

- 'Malicious' -

That also mirrors the experience of Eddie, a 19-year-old in Chicago whose grandfather received a call from someone who sounded just like him, claiming he needed money after a car accident.

The ruse, reported by McAfee Labs, was so convincing that his grandfather urgently started scrounging together money and even considered re-mortgaging his house, before the lie was discovered.

"Because it is now easy to generate highly realistic voice clones... nearly anyone with any online presence is vulnerable to an attack," Hany Farid, a professor at the UC Berkeley School of Information, told AFP.

"These scams are gaining traction and spreading."

Earlier this year, AI startup ElevenLabs admitted that its voice cloning tool could be misused for "malicious purposes" after users posted deepfake audio purporting to be actor Emma Watson reading Adolf Hitler's autobiography "Mein Kampf."

"We're fast approaching the point where you can't trust the things that you see on the internet," Gal Tal-Hochberg, group chief technology officer at the venture capital firm Team8, told AFP.

"We are going to need new technology to know if the person you think you're talking to is actually the person you're talking to," he said.

X.Gu--ThChM