The China Mail - 'Tool for grifters': AI deepfakes push bogus sexual cures

'Tool for grifters': AI deepfakes push bogus sexual cures / Photo: © AFP

Holding an oversized carrot, a brawny, shirtless man promotes a supplement he claims can enlarge male genitalia -- one of countless AI-generated videos on TikTok peddling unproven sexual treatments.

The rise of generative AI has made it easy -- and financially lucrative -- to mass-produce such videos with minimal human oversight, often featuring fake celebrity endorsements of bogus and potentially harmful products.

In some TikTok videos, carrots serve as a euphemism for male genitalia, apparently to evade content moderation rules that police sexually explicit language.

"You would notice that your carrot has grown up," the muscled man says in a robotic voice in one video, directing users to an online purchase link.

"This product will change your life," the man adds, claiming without evidence that the herbs used as ingredients boost testosterone and send energy levels "through the roof."

The video appears to be AI-generated, according to a deepfake detection service recently launched by the Bay Area-headquartered firm Resemble AI, which shared its results with AFP.

"As seen in this example, misleading AI-generated content is being used to market supplements with exaggerated or unverified claims, potentially putting consumers' health at risk," Zohaib Ahmed, Resemble AI's chief executive and co-founder, told AFP.

"We're seeing AI-generated content weaponized to spread false information."

- 'Cheap way' -

The trend underscores how rapid advances in artificial intelligence have fueled what researchers call an AI dystopia, a deception-filled online universe designed to manipulate unsuspecting users into buying dubious products.

They include everything from unverified -- and in some cases, potentially harmful -- dietary supplements to weight loss products and sexual remedies.

"AI is a useful tool for grifters looking to create large volumes of content slop for a low cost," misinformation researcher Abbie Richards told AFP.

"It's a cheap way to produce advertisements," she added.

Alexios Mantzarlis, director of the Security, Trust, and Safety Initiative at Cornell Tech, has observed a surge of "AI doctor" avatars and audio tracks on TikTok that promote questionable sexual remedies.

Some of these videos, many with millions of views, peddle testosterone-boosting concoctions made from ingredients such as lemon, ginger and garlic.

More troublingly, rapidly evolving AI tools have enabled the creation of deepfakes impersonating celebrities such as actress Amanda Seyfried and actor Robert De Niro.

"Your husband can't get it up?" Anthony Fauci, former director of the National Institute of Allergy and Infectious Diseases, appears to ask in a TikTok video promoting a prostate supplement.

But the clip is a deepfake, using Fauci's likeness.

- 'Pernicious' -

Many manipulated videos are created from existing ones, modified with AI-generated voices and lip-synced to match what the altered voice says.

"The impersonation videos are particularly pernicious as they further degrade our ability to discern authentic accounts online," Mantzarlis said.

Last year, Mantzarlis discovered hundreds of ads on YouTube featuring deepfakes of celebrities -- including Arnold Schwarzenegger, Sylvester Stallone, and Mike Tyson -- promoting supplements branded as erectile dysfunction cures.

The rapid pace of generating short-form AI videos means that even when tech platforms remove questionable content, near-identical versions quickly reappear -- turning moderation into a game of whack-a-mole.

Researchers say this creates unique challenges for policing AI-generated content, requiring novel solutions and more sophisticated detection tools.

AFP's fact checkers have repeatedly debunked scam ads on Facebook promoting treatments -- including erectile dysfunction cures -- that use fake endorsements by Ben Carson, a neurosurgeon and former US cabinet member.

Yet many users still consider the endorsements legitimate, illustrating the appeal of deepfakes.

"Scammy affiliate marketing schemes and questionable sex supplements have existed for as long as the internet and before," Mantzarlis said.

"As with every other bad thing online, generative AI has made this abuse vector cheaper and quicker to deploy at scale."

V.Liu--ThChM