“We haven’t figured that out yet”: Sam Altman explains why using ChatGPT as a therapist is still a privacy nightmare


  • OpenAI CEO Sam Altman says that using ChatGPT for therapy poses serious privacy risks
  • Your private chats could be exposed if OpenAI were to face a lawsuit
  • Feeding your private thoughts into an opaque AI is also a risky decision

One of the consequences of having an artificial intelligence (AI) assistant like ChatGPT wherever you go is that people start turning to it for things it was never intended for. According to OpenAI CEO Sam Altman, this includes therapy and personal life advice – and it could lead to all kinds of privacy problems in the future.

During a recent episode of the podcast This Past Weekend with Theo Von, Altman explained a major difference between talking to a human therapist and using an AI for mental health support: “At the moment, if you’re talking to a therapist or a lawyer or a doctor about these problems, there’s legal privilege for it. We haven’t figured that out yet for when you talk to ChatGPT.”
