Why you shouldn’t seek relationship advice from ChatGPT: It will simply tell you that you are right, which “could escalate rather than resolve the conflict.”


  • A new study finds that AI chatbots are far more likely than humans to validate users during personal conflicts.
  • This trend can become dangerous when people use chatbots for advice on fighting.
  • AI can easily make people feel overly justified in making bad decisions.

Bringing interpersonal drama to an AI chatbot isn’t exactly what developers created the software for, but that doesn’t stop people in the middle of fights with friends and family from seeking (and getting) validation from digital advocates.

AI chatbots are always available, infinitely patient and very good at imitating empathy. Too good, really, because they rarely disagree with users, which can cause much bigger problems, according to a new study published in Science.
