- OpenAI's CEO says using ChatGPT for therapy poses a serious privacy risk
- Your private chats could be exposed if OpenAI were to face a lawsuit
- Feeding your private thoughts into an opaque AI is also a risky decision
One consequence of having an artificial intelligence (AI) assistant like ChatGPT wherever you go is that people start turning to it for things it was never intended to do. According to OpenAI CEO Sam Altman, that includes therapy and personal life advice – but it could lead to all kinds of privacy problems in the future.
During a recent episode of the This Past Weekend podcast with Theo Von, Altman explained a major difference between talking to a human therapist and using an AI for mental health support: "Right now, if you talk to a therapist or a lawyer or a doctor about those problems, there's legal privilege for it."
One potential result of this is that OpenAI could be legally required to hand over those conversations if it were faced with a lawsuit, Altman said. Without the legal confidentiality you get when talking to a doctor or a registered therapist, there would be relatively little to stop your private concerns from being exposed to the public.
Altman added that ChatGPT is already used this way by many people, especially young users, who could be particularly vulnerable to this kind of exposure. But whatever your age, these are not the sorts of conversations most people would be happy for the world to see.
A risky business
The risk of having your private conversations opened up to scrutiny is only one privacy risk facing ChatGPT users.
There's also the problem of feeding your deeply personal worries and concerns into an opaque algorithm like ChatGPT's, with the possibility that they could be used to train OpenAI's model and resurface when other users ask similar questions.
This is one reason why many companies have rolled out their own closed versions of AI chatbots. Another alternative is an AI like Lumo, which is built by privacy specialists Proton and uses strong encryption to protect everything you write.
Of course, there's also the question of whether an AI like ChatGPT can replace a therapist in the first place. While there may be some advantages to this, any AI chatbot simply regurgitates the data it was trained on. None is capable of original thought, which limits the effectiveness of the advice it can give you.
Whether or not you choose to open up to OpenAI, it's clear that there's a privacy minefield surrounding AI chatbots, whether that means a lack of legal confidentiality or the danger of having your deepest thoughts used as training data for an inscrutable algorithm.
It will take a lot of work and clarity before enlisting an AI therapist becomes a much less risky proposition.