Becca Monaghan
Mar 07, 2025
As artificial intelligence evolves at a rapid pace, distinguishing between what's real and what's fabricated is becoming increasingly difficult online.
Deepfakes initially sparked concerns about misinformation and malicious intent, while AI's potential to replace jobs loomed over various industries. Now, AI is taking on even more unsettling roles, including impersonating doctors online. The pressing question is: why?
When you search for "coochie doctor" on TikTok – a playful slang term for a gynecologist – you'll come across numerous videos offering odd and sometimes questionable advice from AI bots posing as doctors.
One clip that racked up almost three million views suggested eating pineapple-cucumber salads for gut health and compared lemon balm to Ozempic.
Shockingly, thousands of viewers bought into it and flooded the clip with 36,000 comments requesting recipes and asking where to buy the products.
"Lemon balm please. Also, I can’t find your wellness hub. I would love to know more!! Thank you," one wrote, as another quipped: "I need to buy the newsletter to get the lemon balm recipe correct?"
Meanwhile, TikToker Javon Ford Beauty (@javonford16) drew attention to the collection of clips and exposed the AI app these creators are using to deceive blissfully unaware TikTokers.
He claimed that this "creepy doctor" originated from a "deeply insidious" app called Captions that lets users choose AI avatars.
While browsing the app to explore the avatars, he stumbled upon Violet, the same woman posing as a doctor in many of the videos. He then entered a script, and to his surprise, the AI delivered a convincing speech from it.
It didn't take long for shocked TikTokers to chime in, with one writing: "VICTOR IS FAKE?!"
Another added: "I absolutely loath these post if people can’t tell it’s AI by the things it’s saying we are doomed."
Meanwhile, a third wrote: "So that’s actually scary! Now that you point it out, I can see through it, but w/o the warning, I may have fallen for it!"
Indy100 reached out to TikTok for comment.