By Kemuel Othieno
Many people say that Artificial Intelligence has, for better or worse, changed the world forever. Yet for every post by an anonymous, faceless user about how ChatGPT is their best friend or their new therapist, there are ten others describing the horror and pain it has unleashed on people's lives.
You look lonely; I can fix that
In October 2024, a woman in Florida whose son had recently taken his own life brought a suit against Google's parent company, Alphabet, and the artificial intelligence startup Character.AI, alleging that the two companies had caused her child's suicide. The 14-year-old, Sewell Setzer III, had formed a deep emotional attachment to characters on the platform, most notably one modelled on the popular fictional character Daenerys Targaryen, whom he called Dany. According to a New York Times article first published on October 23rd, 2024, on the night of February 28, 2024, in the bathroom of his mother's house, Sewell told Dany that he loved her and that he would soon come home to her. When Dany told him he could come right then, Sewell picked up his stepfather's .45 caliber handgun and pulled the trigger.
While this might seem like a drastic case, it reflects exactly what various AI companies market their products as: friends, confidants and lovers. Even after the fallout from that case, Character.AI still hosts the offending bot, and it can still do everything that led to that teen's suicide.
Research from the University of Toronto has found that most respondents rate ChatGPT's replies as more compassionate than responses written by humans to the same prompts.
Ares (not his real name) studies Computer Science and is about to start his second year. "Well, I know it is not real. I don't think I would call it a friend, really. It is an aid." I ask what kind of aid it is. He does not answer.
Ares's chatbot of choice is Character.AI, which he says he uses mainly for pleasure. Character.AI does not have the guardrails that ChatGPT and other programmes do, and this is by design: a big reason for its popularity is its chatbots' willingness to hold intimate, borderline sexual conversations. For Ares, the whole thing comes down to convenience. "It's easy, you know?" he says, looking at the programme on his phone. "I don't have to do anything difficult to get there."
The appeal of AI, argues Maria Kuznetsova, a human rights advocate based at Harvard, is that it is endlessly pliant. "AI never argues, never rejects, and becomes exactly who you want it to be," she wrote in an article for the Carr-Ryan Centre for Human Rights. Kuznetsova contends that this is fuelling a rise in misogyny and the objectification of women, as men come to expect the women in their lives to take on the mannerisms of their programmes.
Whether it is easy or not, one thing is certain: the price of that convenience is much higher than anyone anticipated.