The China Mail - Can you trust your ears? AI voice scams rattle US


Can you trust your ears? AI voice scams rattle US / Photo: © AFP

The voice on the phone seemed frighteningly real -- an American mother heard her daughter sobbing before a man took over and demanded a ransom. But the girl was an AI clone and the abduction was fake.

The biggest peril of Artificial Intelligence, experts say, is its ability to demolish the boundaries between reality and fiction, handing cybercriminals a cheap and effective technology to propagate disinformation.

In a new breed of scams that has rattled US authorities, fraudsters are using strikingly convincing AI voice cloning tools -- widely available online -- to steal from people by impersonating family members.

"Help me, mom, please help me," Jennifer DeStefano, an Arizona-based mother, heard a voice saying on the other end of the line.

DeStefano was "100 percent" convinced it was her 15-year-old daughter in deep distress while away on a skiing trip.

"It was never a question of who is this? It was completely her voice... it was the way she would have cried," DeStefano told a local television station in April.

"I never doubted for one second it was her."

The scammer who took over the call, which came from a number unfamiliar to DeStefano, demanded up to $1 million.

The AI-powered ruse was over within minutes when DeStefano established contact with her daughter. But the terrifying case, now under police investigation, underscored the potential for cybercriminals to misuse AI clones.

- Grandparent scam -

"AI voice cloning, now almost indistinguishable from human speech, allows threat actors like scammers to extract information and funds from victims more effectively," Wasim Khaled, chief executive of Blackbird.AI, told AFP.

A simple internet search yields a wide array of apps, many available for free, to create AI voices with a small sample -- sometimes only a few seconds -- of a person's real voice that can be easily stolen from content posted online.

"With a small audio sample, an AI voice clone can be used to leave voicemails and voice texts. It can even be used as a live voice changer on phone calls," Khaled said.

"Scammers can employ different accents, genders, or even mimic the speech patterns of loved ones. [The technology] allows for the creation of convincing deep fakes."

In a global survey of 7,000 people from nine countries, including the United States, one in four people said they had experienced an AI voice cloning scam or knew someone who had.

Seventy percent of the respondents said they were not confident they could "tell the difference between a cloned voice and the real thing," said the survey, published last month by the US-based McAfee Labs.

American officials have warned of a rise in what is popularly known as the "grandparent scam" -- where an imposter poses as a grandchild in urgent need of money in a distressing situation.

"You get a call. There's a panicked voice on the line. It's your grandson. He says he's in deep trouble -- he wrecked the car and landed in jail. But you can help by sending money," the US Federal Trade Commission said in a warning in March.

"It sounds just like him. How could it be a scam? Voice cloning, that's how."

In the comments beneath the FTC's warning were multiple testimonies of elderly people who had been duped that way.

- 'Malicious' -

That also mirrors the experience of Eddie, a 19-year-old in Chicago whose grandfather received a call from someone who sounded just like him, claiming he needed money after a car accident.

The ruse, reported by McAfee Labs, was so convincing that his grandfather urgently started scrounging together money and even considered re-mortgaging his house, before the lie was discovered.

"Because it is now easy to generate highly realistic voice clones... nearly anyone with any online presence is vulnerable to an attack," Hany Farid, a professor at the UC Berkeley School of Information, told AFP.

"These scams are gaining traction and spreading."

Earlier this year, AI startup ElevenLabs admitted that its voice cloning tool could be misused for "malicious purposes" after users posted a deepfake audio purporting to be actor Emma Watson reading Adolf Hitler's manifesto "Mein Kampf."

"We're fast approaching the point where you can't trust the things that you see on the internet," Gal Tal-Hochberg, group chief technology officer at the venture capital firm Team8, told AFP.

"We are going to need new technology to know if the person you think you're talking to is actually the person you're talking to," he said.

X.Gu--ThChM