
Since OpenAI first introduced ChatGPT to the public back in 2022, people have done all sorts of ill-advised things with the AI tool — from attorneys filing court documents that cite hallucinated caselaw to everyday users spiraling into severe mental health crises as the chatbot affirms delusional thoughts.

Now add to that list: asking ChatGPT for advice on how to inject facial filler — a trendy cosmetic procedure intended to puff up features like lips and cheeks — at home, without the assistance of a medical professional.

"I'll be injecting myself tonight," one Redditor wrote in a recent post. "I have all things needed on hand and I'm trying to research the best way of keeping things as sterile/clean as possible. I asked ChatGPT and it said I should absolutely not use normal gloves, I googled and can't find any specific info on it."

Needless to say, this is a resoundingly terrible idea. Please don't do this procedure at home, and instead go to a qualified medical facility so you don't hurt yourself. (While pros can screw up this process too, at least they can be held liable.)

Unfortunately, nobody chastised the Redditor for asking ChatGPT for advice. In fact, a quick perusal of the same subreddit, where thrifty beauty aficionados swap tips on administering cosmetic procedures on their own, turns up a huge number of similarly alarming posts.

"I used ChatGPT to help me map my tox and PN placements, how to dilute my tox facial and depth of injections, etc," one commenter enthused. "If you send it annotated photos it can view your mapping and correct it."

Another user turned to AI after problems with a DIY cosmetic procedure.

"Asked [ChatGPT], and it said that since a small amount likely migrated to cheek area through tear trough [sic]," they wrote. "But since it migrated, likely was dissolved into bloodstream. Fibrosis possible but may resolve. If fat was dissolved it should be very negligible."

AI models may be set to revolutionize medicine in certain respects. The Icahn School of Medicine at Mount Sinai, for instance, is incorporating AI into how it trains doctors, and researchers are excited about AI's potential to diagnose diseases such as prostate cancer and heart disease earlier than before.

But the jury is still out on how effective AI chatbots are at dispensing useful medical advice. For example, an npj Digital Medicine paper published in March found that while large language models such as ChatGPT are more accurate than search engines, they can still spew out incorrect advice more than 30 percent of the time under certain circumstances.

In addition, the quality of the output depends on the quality of the prompt.

"We found that some input prompts, which guide the models towards reputed sources, are much more effective than basic prompts (or prompts that give no context at all)," the researchers wrote. "However, lay users would hardly resort to sophisticated prompts or complex interactions with the LLMs."

In a nutshell: sure, you can ask ChatGPT questions, but please confer with a real doctor before undertaking any treatment, especially one you're doing at home.

More on ChatGPT: Man Annoyed When ChatGPT Tells Users He Murdered His Children in Cold Blood
