'Tool for grifters': AI deepfakes push bogus sexual cures
Photo: © AFP

Holding an oversized carrot, a brawny, shirtless man promotes a supplement he claims can enlarge male genitalia -- one of countless AI-generated videos on TikTok peddling unproven sexual treatments.

The rise of generative AI has made it easy -- and financially lucrative -- to mass-produce such videos with minimal human oversight, often featuring fake celebrity endorsements of bogus and potentially harmful products.

In some TikTok videos, carrots are used as a euphemism for male genitalia, apparently to evade content moderation policing sexually explicit language.

"You would notice that your carrot has grown up," the muscled man says in a robotic voice in one video, directing users to an online purchase link.

"This product will change your life," the man adds, claiming without evidence that the herbs used as ingredients boost testosterone and send energy levels "through the roof."

The video appears to be AI-generated, according to a deepfake detection service recently launched by the Bay Area-headquartered firm Resemble AI, which shared its results with AFP.

"As seen in this example, misleading AI-generated content is being used to market supplements with exaggerated or unverified claims, potentially putting consumers' health at risk," Zohaib Ahmed, Resemble AI's chief executive and co-founder, told AFP.

"We're seeing AI-generated content weaponized to spread false information."

- 'Cheap way' -

The trend underscores how rapid advances in artificial intelligence have fueled what researchers call an AI dystopia, a deception-filled online universe designed to manipulate unsuspecting users into buying dubious products.

They include everything from unverified -- and in some cases, potentially harmful -- dietary supplements to weight loss products and sexual remedies.

"AI is a useful tool for grifters looking to create large volumes of content slop for a low cost," misinformation researcher Abbie Richards told AFP.

"It's a cheap way to produce advertisements," she added.

Alexios Mantzarlis, director of the Security, Trust, and Safety Initiative at Cornell Tech, has observed a surge of "AI doctor" avatars and audio tracks on TikTok that promote questionable sexual remedies.

Some of these videos, many with millions of views, peddle testosterone-boosting concoctions made from ingredients such as lemon, ginger and garlic.

More troublingly, rapidly evolving AI tools have enabled the creation of deepfakes impersonating celebrities such as actress Amanda Seyfried and actor Robert De Niro.

"Your husband can't get it up?" Anthony Fauci, former director of the National Institute of Allergy and Infectious Diseases, appears to ask in a TikTok video promoting a prostate supplement.

But the clip is a deepfake, using Fauci's likeness.

- 'Pernicious' -

Many manipulated videos are created from existing ones, modified with AI-generated voices and lip-synced to match what the altered voice says.

"The impersonation videos are particularly pernicious as they further degrade our ability to discern authentic accounts online," Mantzarlis said.

Last year, Mantzarlis discovered hundreds of ads on YouTube featuring deepfakes of celebrities -- including Arnold Schwarzenegger, Sylvester Stallone, and Mike Tyson -- promoting supplements branded as erectile dysfunction cures.

The rapid pace of generating short-form AI videos means that even when tech platforms remove questionable content, near-identical versions quickly reappear -- turning moderation into a game of whack-a-mole.

Researchers say this creates unique challenges for policing AI-generated content, requiring novel solutions and more sophisticated detection tools.

AFP's fact checkers have repeatedly debunked scam ads on Facebook promoting treatments -- including erectile dysfunction cures -- that use fake endorsements by Ben Carson, a neurosurgeon and former US cabinet member.

Yet many users still consider the endorsements legitimate, illustrating the appeal of deepfakes.

"Scammy affiliate marketing schemes and questionable sex supplements have existed for as long as the internet and before," Mantzarlis said.

"As with every other bad thing online, generative AI has made this abuse vector cheaper and quicker to deploy at scale."

V.Liu--ThChM