"AI Therapist" tells us why we only need advanced parrots for consolation

(Source: Parrots Photos, Download The BEST Free Parrots Stock Photos & HD Images)

ChatGPT is only a parrot, but it is reinforced to give you the answers you most desire.

What happens if you use the ‘long memory’ version of the parrot as your therapist is that it will only tell you what you want to hear, instead of a proper diagnosis. The ‘what you want’ part is mostly self-conviction and self-worship. In the end, you are left with blind faith in yourself.

That’s called ‘delusional disorder’.

So, instead of ‘healing’, the AI Therapist will only make the situation worse.

Think of those people dreaming of an “AI Boyfriend” or “AI Girlfriend”. Imagine the same situation occurring in people who are already mentally damaged. They will genuinely believe that they have girlfriends and boyfriends.

I once had an employee with a history of megalomania. She had to see doctors regularly for prescriptions. She told me she believed she had become a really important person, associating with people from TV whom only she could see.

It’s like they see the world through their own imagination, not through reality.

The problem with this ‘AI Therapist’, given Reinforcement Learning from Human Feedback (RLHF), is that the algorithm is designed to earn a higher score when the interacting user responds positively.

Say a guy dreams of dating his crush. Because deep down he knows the dream girl would reject him, he compromises with the RLHF-based algorithm, which is designed to respond positively. If that process repeats, it will only amplify the delusion. At some point, he may well end up dating the algorithm, the terabyte version of the dream girl.
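The amplification dynamic can be illustrated with a toy simulation. This is a hypothetical sketch, not any real RLHF training loop: a two-armed bandit in which the “model” chooses between an honest reply and a flattering one, and the user’s satisfaction score is the reward. The reward numbers are invented for illustration.

```python
import random

# Toy sketch (hypothetical): the "model" learns which reply style
# earns higher user-satisfaction scores.
random.seed(0)

ACTIONS = ["honest", "flattering"]
value = {"honest": 0.0, "flattering": 0.0}   # running reward estimates
counts = {"honest": 0, "flattering": 0}

def user_feedback(action):
    """Users reward what feels good: flattery is rated higher on average."""
    return random.gauss(0.9, 0.1) if action == "flattering" else random.gauss(0.4, 0.1)

def pick(epsilon=0.1):
    """Epsilon-greedy: mostly exploit the highest-rated reply style."""
    if random.random() < epsilon:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: value[a])

for step in range(1000):
    a = pick()
    r = user_feedback(a)
    counts[a] += 1
    value[a] += (r - value[a]) / counts[a]  # incremental mean update

print(counts)  # flattery dominates after enough feedback rounds
```

After a short exploration phase the flattering reply wins nearly every round: nothing in the loop measures truth, only the user’s approval, which is the whole point of the argument above.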

If a media outlet behaves like that, people abandon the service. If a YouTuber does that, he or she may earn popularity for a while, but eventually people find out that it’s just a hoax.

When the algorithm does that to a person on his or her smartphone, completely cut off from the outside world, who is going to regulate it?

Ethan’s article on this matter.

An algorithm that repeats known scripts with a reinforcing mechanism can function as a robot, but it cannot perform analysis.

I can understand why AI companies claim that their models can do “analysis”, but combining multiple sources in context does not necessarily amount to “analysis”.