Murder Bot

A Serial Killer Used ChatGPT to Plan Murders, Police Say

Grim.
Getty / Futurism

An accused serial killer in South Korea used ChatGPT to help plan a string of murders, investigators alleged Thursday.

The 21-year-old woman, identified by her surname Kim, is accused of killing two men by giving them drinks laced with benzodiazepines that she had been prescribed for a mental illness, according to reports from The Korea Herald and the BBC.

Investigators found that, before the men’s deaths, Kim had asked ChatGPT about the risks of administering the drugs.

“What happens if you take sleeping pills with alcohol?” she allegedly prompted the AI chatbot. “How much would be considered dangerous?” And, “Could it be fatal?”

Kim was initially arrested on February 11 on the lesser charge of inflicting bodily injury resulting in death. But she’s now being charged on two counts of murder, after investigators determined she had intent to kill based on her online activity, per the Korea Herald.

“Kim repeatedly asked questions related to drugs on ChatGPT,” an investigator said, as quoted by the newspaper. “She was fully aware that consuming alcohol together with drugs could result in death.”

Police say that the first attack occurred on January 28. At 9:24 PM, Kim allegedly entered a motel in Suyu-dong, Gangbuk-gu, with a man in his 20s, and left the motel alone two hours later. The man was found dead on the bed around 6:00 PM the next day. Her next attack, on February 9, played out nearly the same way after she checked into a different motel with another man in his 20s.

Before the murders, Kim had also allegedly attempted to kill a man she was dating at the time, giving him a drink laced with sedatives in a cafe parking lot in Namyangju, Gyeonggi province, in December. The man lost consciousness but survived, and his condition was not life-threatening.

The alleged murders may be the latest example of how ChatGPT and other AI chatbots are being used in the buildup to acts of violence and self-harm. Experts have criticized the tech’s weak and unreliable guardrails, which can easily be subverted, on purpose or by accident, in prolonged conversations, leading the chatbots to freely give instructions on activities like building bombs.

AI’s sycophantic responses are also thought to be driving delusional mental health spirals that some experts are calling AI psychosis. An AI’s humanlike personality, paired with its obsequious interactions, can reinforce a user’s delusions and fraught mental state. Some cases have ended in suicide and murder: a 16-year-old boy killed himself after discussing his suicide with ChatGPT for months, and another man is accused of murdering his mother after his interactions with ChatGPT helped convince him that she was part of a conspiracy against him.

The string of alleged murders in South Korea also comes as AI companies’ responsibility for actively monitoring dangerous interactions on their products comes under the spotlight. A scoop from the Wall Street Journal last week revealed that OpenAI’s automated review system flagged disturbing conversations that an 18-year-old in British Columbia had prior to carrying out a mass shooting. Employees at OpenAI had urged leaders there to alert authorities, but they opted not to. Eight people died in the horrific shooting, including the perpetrator, Jesse Van Rootselaar.

Kim has admitted to mixing her medication into drinks that she gave to her victims, but denied any intent to kill them.

More on AI: OpenAI Flagged a Mass Shooter’s Troubling Conversations With ChatGPT Before the Incident, Decided Not to Warn Police

Frank Landymore

Contributing Writer

I’m a tech and science correspondent for Futurism, where I’m particularly interested in astrophysics, the business and ethics of artificial intelligence and automation, and the environment.