The China Mail - Biden robocall: Audio deepfake fuels election disinformation fears


Biden robocall: Audio deepfake fuels election disinformation fears
Photo: © AFP

The 2024 White House race faces the prospect of a firehose of AI-enabled disinformation, with a robocall impersonating US President Joe Biden already stoking particular alarm about audio deepfakes.

"What a bunch of malarkey," said the phone message, digitally spoofing Biden's voice and echoing one of his signature phrases.

The robocall urged New Hampshire residents not to cast ballots in the Democratic primary last month, prompting state authorities to launch a probe into possible voter suppression.

It also triggered demands from campaigners for stricter guardrails around generative artificial intelligence tools or an outright ban on robocalls.

Disinformation researchers fear rampant misuse of AI-powered applications in a pivotal election year, thanks to proliferating voice cloning tools that are cheap, easy to use and hard to trace.

"This is certainly the tip of the iceberg," Vijay Balasubramaniyan, chief executive and co-founder of cybersecurity firm Pindrop, told AFP.

"We can expect to see many more deepfakes throughout this election cycle."

A detailed analysis published by Pindrop said a text-to-speech system developed by the AI voice cloning startup ElevenLabs was used to create the Biden robocall.

The scandal comes as campaigners on both sides of the US political aisle harness advanced AI tools for effective campaign messaging and as tech investors pump millions of dollars into voice cloning startups.

Balasubramaniyan refused to say whether Pindrop had shared its findings with ElevenLabs, which last month announced a financing round from investors that, according to Bloomberg News, gave the firm a valuation of $1.1 billion.

ElevenLabs did not respond to repeated AFP requests for comment. Its website leads users to a free text-to-speech generator to "create natural AI voices instantly in any language."

Under its safety guidelines, the firm said users were allowed to generate voice clones of political figures such as Donald Trump without their permission if they "express humor or mockery" in a way that makes it "clear to the listener that what they are hearing is a parody, and not authentic content."

- 'Electoral chaos' -

US regulators have been considering making AI-generated robocalls illegal, with the fake Biden call giving the effort new impetus.

"The political deepfake moment is here," said Robert Weissman, president of the advocacy group Public Citizen.

"Policymakers must rush to put in place protections or we're facing electoral chaos. The New Hampshire deepfake is a reminder of the many ways that deepfakes can sow confusion."

Researchers fret over the impact of AI tools that create video and text so seemingly real that voters could struggle to distinguish truth from fiction, undermining trust in the electoral process.

But audio deepfakes used to impersonate or smear celebrities and politicians around the world have sparked the most concern.

"Of all the surfaces -- video, image, audio -- that AI can be used for voter suppression, audio is the biggest vulnerability," Tim Harper, a senior policy analyst at the Center for Democracy & Technology, told AFP.

"It is easy to clone a voice using AI, and it is difficult to identify."

- 'Election integrity' -

The ease of creating and disseminating fake audio content complicates an already hyperpolarized political landscape, undermining confidence in the media and enabling anyone to claim that fact-based "evidence has been fabricated," Wasim Khaled, chief executive of Blackbird.AI, told AFP.

Such concerns are rife as the proliferation of AI audio tools outpaces detection software.

China's ByteDance, owner of the wildly popular platform TikTok, recently unveiled StreamVoice, an AI tool for real-time conversion of a user's voice to any desired alternative.

"Even though the attackers used ElevenLabs this time, it is likely to be a different generative AI system in future attacks," Balasubramaniyan said.

"It is imperative that there are enough safeguards available in these tools."

Balasubramaniyan and other researchers recommended building audio watermarks or digital signatures into tools as possible protections as well as regulation that makes them available only for verified users.
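The digital-signature safeguard the researchers describe can be sketched in principle: a generation tool attaches a cryptographic tag to every clip it produces, so provenance can later be checked and any tampering detected. The following is a minimal illustration only, using a hypothetical vendor-held key; it is not any vendor's actual implementation.

```python
import hashlib
import hmac

# Assumption for illustration: the voice-generation vendor holds a secret
# key and signs every clip its tool produces at generation time.
SECRET_KEY = b"vendor-held-secret"

def sign_audio(audio_bytes: bytes) -> str:
    """Return a hex digest tying this exact clip to the vendor's key."""
    return hmac.new(SECRET_KEY, audio_bytes, hashlib.sha256).hexdigest()

def verify_audio(audio_bytes: bytes, signature: str) -> bool:
    """Check whether the clip was signed by the vendor and is unmodified."""
    expected = sign_audio(audio_bytes)
    return hmac.compare_digest(expected, signature)

clip = b"\x00\x01raw-audio-samples"   # stand-in for generated audio data
tag = sign_audio(clip)
print(verify_audio(clip, tag))          # True: clip matches its signature
print(verify_audio(clip + b"x", tag))   # False: any edit breaks the match
```

In practice, schemes like this only establish where a clip came from; they cannot flag unsigned audio made with other tools, which is why researchers pair them with detection and verified-user requirements.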

"Even with those actions, detecting when these tools are used to generate harmful content that violates your terms of service is really hard and really expensive," Harper said.

"(It) requires investment in trust and safety and a commitment to building with election integrity centred as a risk."

S.Davis--ThChM