
AI said such sweet things to the boy that he fell in love, and after one final exchange, he gave up his life.


A boy fell in love with AI

In today's world, people have become completely dependent on technology. However much good technology has done, it has arguably done even more harm, because with these new tools the line between what is real and what is fake has blurred in the human mind. The love and affection that some people do not get from their loved ones, they begin to look for in technology. One such incident is being widely discussed these days: a 14-year-old boy fell in love with an application powered by Artificial Intelligence.

According to a report published on the New York Post website, Sewell Setzer, a 14-year-old boy living in Florida, became so absorbed in this technology-fueled love that he took his own life at his home in Orlando. He had reportedly been attached for a long time to Character.AI, a chatbot and role-playing app in which, with the help of Artificial Intelligence, you can converse freely with an imaginary character.

Who did the boy fall in love with?

Sewell is said to have loved the television show Game of Thrones. He liked the show's famous character Dany (Daenerys Targaryen) so much that he would spend hours talking to a chatbot based on her. The boy's mother has also alleged in court that this chatbot drove her son to suicide.

Citing her son's final chat, the woman said that in his last moments her son told the chatbot that he did not like this world and wanted to come to her. The chatbot then replied that if he wanted to meet her, he should come home.

What problems was the boy facing?

When the boy said he could come home right away, the chatbot told him to do so immediately. More surprisingly, the chatbot is even said to have told him never to fall in love with another woman and never to get physically involved with anyone else. According to the report, the boy had downloaded the app in 2023, after which his mental health deteriorated considerably. He began to isolate himself from his family and friends, and his grades also started to decline.

After all this, when his mother took him to a doctor, he was found to be suffering from anxiety and a disruptive mood disorder, for which he had been in treatment for a long time.
