The China Mail - 'Tool for grifters': AI deepfakes push bogus sexual cures

'Tool for grifters': AI deepfakes push bogus sexual cures

Photo: © AFP

Holding an oversized carrot, a brawny, shirtless man promotes a supplement he claims can enlarge male genitalia -- one of countless AI-generated videos on TikTok peddling unproven sexual treatments.

The rise of generative AI has made it easy -- and financially lucrative -- to mass-produce such videos with minimal human oversight, often featuring fake celebrity endorsements of bogus and potentially harmful products.

In some TikTok videos, carrots are used as a euphemism for male genitalia, apparently to evade content moderation policing sexually explicit language.

"You would notice that your carrot has grown up," the muscled man says in a robotic voice in one video, directing users to an online purchase link.

"This product will change your life," the man adds, claiming without evidence that the herbs used as ingredients boost testosterone and send energy levels "through the roof."

The video appears to be AI-generated, according to a deepfake detection service recently launched by the Bay Area-headquartered firm Resemble AI, which shared its results with AFP.

"As seen in this example, misleading AI-generated content is being used to market supplements with exaggerated or unverified claims, potentially putting consumers' health at risk," Zohaib Ahmed, Resemble AI's chief executive and co-founder, told AFP.

"We're seeing AI-generated content weaponized to spread false information."

- 'Cheap way' -

The trend underscores how rapid advances in artificial intelligence have fueled what researchers call an AI dystopia, a deception-filled online universe designed to manipulate unsuspecting users into buying dubious products.

They include everything from unverified -- and in some cases, potentially harmful -- dietary supplements to weight loss products and sexual remedies.

"AI is a useful tool for grifters looking to create large volumes of content slop for a low cost," misinformation researcher Abbie Richards told AFP.

"It's a cheap way to produce advertisements," she added.

Alexios Mantzarlis, director of the Security, Trust, and Safety Initiative at Cornell Tech, has observed a surge of "AI doctor" avatars and audio tracks on TikTok that promote questionable sexual remedies.

Some of these videos, many with millions of views, peddle testosterone-boosting concoctions made from ingredients such as lemon, ginger and garlic.

More troublingly, rapidly evolving AI tools have enabled the creation of deepfakes impersonating celebrities such as actress Amanda Seyfried and actor Robert De Niro.

"Your husband can't get it up?" Anthony Fauci, former director of the National Institute of Allergy and Infectious Diseases, appears to ask in a TikTok video promoting a prostate supplement.

But the clip is a deepfake, using Fauci's likeness.

- 'Pernicious' -

Many manipulated videos are created from existing ones, modified with AI-generated voices and lip-synced to match what the altered voice says.

"The impersonation videos are particularly pernicious as they further degrade our ability to discern authentic accounts online," Mantzarlis said.

Last year, Mantzarlis discovered hundreds of ads on YouTube featuring deepfakes of celebrities -- including Arnold Schwarzenegger, Sylvester Stallone, and Mike Tyson -- promoting supplements branded as erectile dysfunction cures.

The rapid pace of generating short-form AI videos means that even when tech platforms remove questionable content, near-identical versions quickly reappear -- turning moderation into a game of whack-a-mole.

Researchers say this creates unique challenges for policing AI-generated content, requiring novel solutions and more sophisticated detection tools.

AFP's fact checkers have repeatedly debunked scam ads on Facebook promoting treatments -- including erectile dysfunction cures -- that use fake endorsements by Ben Carson, a neurosurgeon and former US cabinet member.

Yet many users still appear to take the endorsements at face value, underscoring how convincing such deepfakes can be.

"Scammy affiliate marketing schemes and questionable sex supplements have existed for as long as the internet and before," Mantzarlis said.

"As with every other bad thing online, generative AI has made this abuse vector cheaper and quicker to deploy at scale."

V.Liu--ThChM