
Biden robocall: Audio deepfake fuels election disinformation fears
Biden robocall: Audio deepfake fuels election disinformation fears / Photo: © AFP

The 2024 White House race faces the prospect of a firehose of AI-enabled disinformation, with a robocall impersonating US President Joe Biden already stoking particular alarm about audio deepfakes.

"What a bunch of malarkey," said the phone message, digitally spoofing Biden's voice and echoing one of his signature phrases.

The robocall urged New Hampshire residents not to cast ballots in the Democratic primary last month, prompting state authorities to launch a probe into possible voter suppression.

It also triggered demands from campaigners for stricter guardrails around generative artificial intelligence tools or an outright ban on robocalls.

Disinformation researchers fear rampant misuse of AI-powered applications in a pivotal election year amid proliferating voice cloning tools, which are cheap, easy to use, and hard to trace.

"This is certainly the tip of the iceberg," Vijay Balasubramaniyan, chief executive and co-founder of cybersecurity firm Pindrop, told AFP.

"We can expect to see many more deepfakes throughout this election cycle."

A detailed analysis published by Pindrop said a text-to-speech system developed by the AI voice cloning startup ElevenLabs was used to create the Biden robocall.

The scandal comes as campaigners on both sides of the US political aisle harness advanced AI tools for effective campaign messaging and as tech investors pump millions of dollars into voice cloning startups.

Balasubramaniyan refused to say whether Pindrop had shared its findings with ElevenLabs, which last month announced a financing round from investors that, according to Bloomberg News, gave the firm a valuation of $1.1 billion.

ElevenLabs did not respond to repeated AFP requests for comment. Its website leads users to a free text-to-speech generator to "create natural AI voices instantly in any language."

Under its safety guidelines, the firm said users were allowed to generate voice clones of political figures such as Donald Trump without their permission if they "express humor or mockery" in a way that makes it "clear to the listener that what they are hearing is a parody, and not authentic content."

- 'Electoral chaos' -

US regulators have been considering making AI-generated robocalls illegal, with the fake Biden call giving the effort new impetus.

"The political deepfake moment is here," said Robert Weissman, president of the advocacy group Public Citizen.

"Policymakers must rush to put in place protections or we're facing electoral chaos. The New Hampshire deepfake is a reminder of the many ways that deepfakes can sow confusion."

Researchers fret over the impact of AI tools that create video and text so seemingly real that voters could struggle to separate truth from fiction, undermining trust in the electoral process.

But audio deepfakes used to impersonate or smear celebrities and politicians around the world have sparked the most concern.

"Of all the surfaces -- video, image, audio -- that AI can be used for voter suppression, audio is the biggest vulnerability," Tim Harper, a senior policy analyst at the Center for Democracy & Technology, told AFP.

"It is easy to clone a voice using AI, and it is difficult to identify."

- 'Election integrity' -

The ease of creating and disseminating fake audio content complicates an already hyperpolarized political landscape, undermining confidence in the media and enabling anyone to claim that fact-based "evidence has been fabricated," Wasim Khaled, chief executive of Blackbird.AI, told AFP.

Such concerns are rife as the proliferation of AI audio tools outpaces detection software.

China's ByteDance, owner of the wildly popular platform TikTok, recently unveiled StreamVoice, an AI tool for real-time conversion of a user's voice to any desired alternative.

"Even though the attackers used ElevenLabs this time, it is likely to be a different generative AI system in future attacks," Balasubramaniyan said.

"It is imperative that there are enough safeguards available in these tools."

Balasubramaniyan and other researchers recommended building audio watermarks or digital signatures into tools as possible protections as well as regulation that makes them available only for verified users.
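
As a minimal sketch of the digital-signature idea, a provider could sign every generated clip with a private key and publish the matching public key, letting platforms check whether a clip genuinely came from that provider and has not been altered since. The helper names and the Ed25519 scheme below are illustrative assumptions, not any vendor's actual mechanism.

```python
# Illustrative sketch: signing generated audio so its provenance can be
# verified later. Key management, distribution, and metadata formats are
# assumptions; real provenance schemes (e.g. C2PA-style manifests) are
# more involved. Requires the third-party "cryptography" package.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)


def sign_audio(key: Ed25519PrivateKey, audio: bytes) -> bytes:
    """Provider side: produce a signature over the raw audio bytes."""
    return key.sign(audio)


def verify_audio(pub: Ed25519PublicKey, audio: bytes, sig: bytes) -> bool:
    """Verifier side: check the signature against the provider's public key."""
    try:
        pub.verify(sig, audio)
        return True
    except InvalidSignature:
        return False


# Usage with placeholder bytes standing in for a generated waveform.
provider_key = Ed25519PrivateKey.generate()
clip = b"\x00\x01..."  # hypothetical TTS output
sig = sign_audio(provider_key, clip)
print(verify_audio(provider_key.public_key(), clip, sig))          # True
print(verify_audio(provider_key.public_key(), clip + b"x", sig))   # False: tampered
```

One caveat: a signature binds to the exact bytes, and robocall audio is routinely transcoded on phone networks, which is one reason researchers also point to inaudible watermarks designed to survive re-encoding.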

"Even with those actions, detecting when these tools are used to generate harmful content that violates your terms of service is really hard and really expensive," Harper said.

"(It) requires investment in trust and safety and a commitment to building with election integrity centred as a risk."

S.Davis--ThChM