- Two men were found dead in separate motels after consuming drinks that a woman allegedly spiked with prescription drugs.
- Seoul police say her repeated questions to ChatGPT about lethal combinations of sedatives and alcohol show she knew the mixture could be deadly.
- Investigators say her chatbot search history proves intent, making it a central part of the upgraded murder charges.
South Korean police have upgraded charges against a 21-year-old woman to murder after discovering a disturbing series of queries she allegedly typed on ChatGPT before two men were found dead in separate motel rooms.
Seoul investigators say the suspect, identified only as Kim, repeatedly asked the AI chatbot in different ways what happens when you mix sleeping pills with alcohol and when it becomes dangerous or even deadly. Police now say the searches show she knew the risks long before serving the drug-laced drinks that left two men dead and another unconscious.
Authorities initially arrested Kim in February on the lesser charge of bodily injury resulting in death, which typically applies when someone causes fatal harm without intending to kill. That changed once digital forensics teams searched her phone. The combination of her prior statements and the precise wording of her ChatGPT questions convinced investigators that she was not simply reckless or unaware, and it formed the backbone of a revised case that now alleges deliberate, premeditated poisoning.
According to police accounts, the first alleged murder occurred on January 28 when Kim checked in with a man in his 20s at a hotel and left two hours later. Staff discovered his body the next day. On February 9, an almost identical sequence took place at another motel with another man in his 20s. In both cases, police say the victims consumed alcoholic drinks that Kim had prepared, into which investigators believe she had dissolved prescription sedatives.
Detectives also uncovered an earlier non-fatal attempt involving Kim's partner, who survived. After that victim regained consciousness, investigators say, Kim began preparing stronger concoctions and significantly increasing the drug doses. ChatGPT's role became central to the case once the phone records were decoded. The searches highlighted by investigators were neither broad nor vague; according to authorities, they were specific, repeated, and fixated on lethality.
Police say this shows she knew what could happen, changing the story from an unintentional overdose to a planned and researched poisoning. Kim reportedly told investigators she mixed the sedatives into the drinks but claimed she did not expect the men to die. Police counter that her digital behavior contradicts this account. They also suggest that her actions after the two motel deaths further undermine her claims: according to authorities, she removed only the empty bottles used in the mixtures before leaving the motel rooms, without calling for help or alerting anyone. Detectives interpret this as an attempted cover-up rather than panic or confusion.
ChatGPT queries as evidence
One of the most striking elements of the case, beyond the violence itself, is how generative AI fits into the investigation's timeline. For years, police have relied on browser histories, text logs, and social media posts to establish intent. Chatbot interactions add a new category of evidence. Unlike a traditional search engine, ChatGPT provides personalized advice in conversational form, so when someone asks about harm, the wording and follow-up questions can reveal not only curiosity but also persistence.
For ordinary people who use AI casually, this case is a reminder that digital footprints can take on a life of their own. As more people turn to chatbots for everything from homework help to medical questions, law enforcement agencies around the world are beginning to explore how these conversations should be handled during investigations. Some countries already treat AI service logs the same as browser data. Others are still weighing privacy concerns and legal limits.
Although the events themselves are tragic, they shed light on a new reality. Technology is now behind many serious crimes. In this case, police believe ChatGPT queries help paint a clear picture of intent. The courts will ultimately decide to what extent these questions prove guilt. For the public, the outcome may influence how people perceive privacy, permanence, and the potential consequences of interacting with AI.