During one of our recent meetings, we had a presentation on an AI tool. The presenter explained that one of the common problems with AI tools is that they hallucinate, and that they had addressed this issue in their tool. I was a little surprised and asked him the reason behind the hallucination. Why would an AI hallucinate? He explained that in many cases AI tools simply cook up answers. The AI engines are programmed to maximise a score, and they are typically penalised for giving “no answer”, and therefore the AI tools learn to tell lies. Even if they do not find an answer, just to avoid that negative score, they will cook up some story and display it to the user, and the user will believe it.
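To make the presenter’s point a little more concrete, here is a minimal sketch with purely made-up numbers (the probabilities, scores, and variable names below are my own assumptions, not anything from the presentation). It only shows the arithmetic of why a scoring rule that punishes “no answer” can make guessing look better than admitting ignorance.

```python
# Toy illustration (hypothetical numbers): a scoring rule that punishes
# "no answer" makes guessing more rewarding than admitting ignorance.

P_CORRECT_GUESS = 0.25   # assumed chance a made-up answer happens to be right
SCORE_CORRECT   = 1.0    # reward for a correct answer
SCORE_WRONG     = 0.0    # no penalty for a confident wrong answer
SCORE_ABSTAIN   = -1.0   # penalty for saying "I don't know"

# Expected score of each strategy under this (assumed) scoring rule
expected_guess   = P_CORRECT_GUESS * SCORE_CORRECT + (1 - P_CORRECT_GUESS) * SCORE_WRONG
expected_abstain = SCORE_ABSTAIN

print(f"Expected score if the model guesses:   {expected_guess:.2f}")    # 0.25
print(f"Expected score if it says 'no answer': {expected_abstain:.2f}")  # -1.00
# Guessing always scores higher here, so a score-maximising model learns to guess.
```

Under these assumed numbers, fabricating an answer is always the “rational” choice for the model, which is essentially the incentive the presenter described.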
I was pleasantly surprised by the similarity of the AI to the human mind. Isn’t the human mind also “programmed” the same way? We also keep trying to maximise a score. The scoring engine of each human being varies. Some people give high points to relationships. Some give high points to position and power, while others give high points to money and wealth. Some assign high scores to post-life karmas. Social validation comes high in the hierarchy for some. Whatever the scoring engine, everybody tries to maximise the score. The scoring engine is tuned during childhood, and there is hardly any occasion during the rest of our lives to retune it.
One interesting thing that happens to all of us is that we, too, hallucinate like AI. It is not that people who keep high scores for relationships never realise the futility of it; rather, retuning the scoring engine appears quite painful, and so we choose the easy path: to hallucinate. We make some meaning out of relationships and keep validating that meaning with the help of quotes on social media and forwarded WhatsApp posts. Likewise, it is not that people who have assigned high scores to social validation do not realise its futility, but again, retuning the scoring engine is quite difficult, and what appears easy is to hallucinate, cook up some story about the value of social validation, and carry on. We, too, hallucinate and create many stories in our minds, and these stories drive our lives. They have nothing to do with reality, and yet they appear so real to us.
Awareness of reality appears costly because it requires effort to understand life at a very fundamental level. Making one meaning or another of life is easy, especially if that meaning is validated by society. Why would somebody make the effort to throw away that scoring engine and live life as it is? That’s difficult. Being part of the crowd is easy as long as we are confined to the domain of the known. However, once somebody connects with the domain of the unknown, the scoring engine appears to be the most useless thing. The biggest problem is how to make a drunk person realise reality; you just have to wait for the intoxication to fade away. In life, hallucinations sometimes take an entire lifetime to get over, and maybe many lifetimes, unless there is a crisis. We identify so strongly with our mental stories that we reject anything that contradicts them, and anybody who tries to show us the mirror becomes our enemy. That is why it is the first and foremost duty of parents to help their kids not fall into this trap. If kids are taught to live with awareness from childhood, they will not be trapped in mental stories. However, that requires absolute awareness on the part of the parents, which is quite a rare phenomenon.