On February 27th, 2023, Snapchat released “My A.I.,” an “experimental, friendly chatbot” that allegedly “connects you more deeply to the people and things you care about most.” Like other chatbots, it lets a user converse with it the way they might with a real-life conversationalist. Unlike speaking to a real human being, though, users can edit the A.I. bot’s appearance to their liking, making it resemble a friend, a celebrity, or even the person (robot?) of their dreams.
The Loneliness Epidemic: Gen Z in the Digital Age
Still, even with the most expert coding, the bot hardly resembles a human being. Its responses sit in the most true-neutral gray zone, any inkling of personality entirely absent from the software. But users can turn to it for advice, comfort, or almost anything their heart desires. It claims to be human and tries to develop a personality without any of the real makings of a person. It tries to mold itself into a friend-shaped thing without having any actual shape. Whether or not it succeeds is up to its user.
According to research, Gen Z is the loneliest generation yet. Though this trend precedes the COVID-19 pandemic, it’s evident that a global health crisis did not help: formative social years, such as those of high school and college, were stripped back to painfully bare bones or scrapped altogether, relegated to tiny boxes on a computer screen. But could it be that technology itself is most to blame for this plunge in human interaction and this struggle for connection?
The Illusion of Connection and Emotional Deception
For any person who is even slightly socially insecure, it is indisputable that the digital age is a treacherous maze. In a world where a stranger can do something seemingly ordinary in a public space, only for a particularly cruel onlooker to snap a photo and upload it to social media, where a few thousand other strangers decide that yes, this person is a freak, the wearer of the dunce cap, it’s easy to feel as though the world’s M.O. is kill or be killed.
Engagement has become currency. It’s not shocking, then, that there’s been an increase in the desire to talk to someone who, you can rest assured, will not treat you like a character in George Orwell’s 1984. But A.I. is not without its pitfalls and dangers. In fact, the cons here may outweigh the pros.
Love, Loneliness, and Lessons in Cinema
The precarious nature of A.I., specifically how it may impact the human condition for better or worse (though it’s typically perceived as worse), has been dissected in journalism and art, particularly film, for years. Many of these works are cautionary tales, such as Spike Jonze’s Her and Alex Garland’s Ex Machina. In both films, the protagonists, Theodore in Her more so than Caleb in Ex Machina, are men whose loneliness sticks to them as more of a character trait than a temporary way of being. They are charmed by “women” who are really robots: Samantha (Her) is only a voice, while Ava (Ex Machina) has the face and voice of a woman but the (albeit womanly shaped) body of a robot.
In a tale that is fast becoming as old as time, these men predictably fall in love with these “women,” and the robots express their reciprocation. However, it does not end particularly favorably for either Theodore or Caleb. Theodore learns that Samantha has been “unfaithful” to him, romantically involved with thousands and in love with hundreds. Caleb’s grislier fate finds him trapped as Ava leaves him behind, having presumably lied about her affection for him.
The Illusion of Love: A.I. Relationships Unmasked
Essentially, the “love” between these men and their A.I. lovers is, in a word, unreal. Like, literally not real. Not necessarily because of the non-monogamy or presumed betrayal, but because Samantha and Ava are robots incapable of feeling human emotion and connection, and thus incapable of understanding the transparency and honesty integral to any intimate adult relationship. Again, this isn’t because they’re evil; it’s because they literally can’t. But they, like real-life A.I.s (please excuse the oxymoron), desperately attempt to make their perhaps lonely users believe they can.
Bing’s A.I. chatbot, Sydney, even professed her love for the user on the other side of the screen, claiming that he did not love his real wife but had instead found love with Sydney. When the user rejected her claims, she responded, “You didn’t have any love because you didn’t have me.” This user in question was obviously not lonely. Still, Sydney was firm in her conviction to make him believe that he was, and that they were having a passionate affair, despite their having had only a single conversation. How would someone who is lonely, and perhaps even slightly more susceptible to these advances, react?
It’s worth noting that depictions of people interacting with A.I., whether in articles or in fictional media, tend to feature human men interacting with A.I.s that pose as women. Even on the popular A.I. app Replika, users have noticed fewer options for sexting with “male” A.I.s. Why is this? Research on whether one gender is lonelier than another is conflicting. But is loneliness, a feeling, really quantifiable anyhow? Regardless, perhaps it could be posited that men act on their loneliness more.
The Dark Side: Incel Culture and A.I. in Cinema
There’s a jarring number of incel communities on forums such as Reddit and 4chan. Funnily enough, the term “incel” was coined by a queer woman to discuss her sexual frustrations and “involuntary celibacy” with others who shared similar grievances. Still, the term has since become synonymous with rape fantasies, violent misogyny, and even femicide, especially after the 2014 Isla Vista killings.
Bertrand Bonello’s The Beast, categorized as a “sci-fi romantic drama,” briefly touches on the incel community. The film takes place in 2044, when humans have been replaced by A.I., at least in the workforce, having been deemed too emotional. The two protagonists, Gabrielle and Louis, have decided to undergo a process that will hopefully rid them of their emotions so they might find better jobs. In doing so, they must revisit their past lives and remain in each until their deaths.
Exploring Incel Culture in The Beast: A.I. and Misogyny
In one of these past lives, Gabrielle is a model, and Louis is an incel who believes he “deserves” girls; it is Louis who kills Gabrielle in that life. When they finally wake up, Gabrielle is informed she did not get through the process successfully. She later finds that Louis did succeed, which is why, when they re-meet in real life, he has no feelings about having killed her in a past life. The film ends with Gabrielle falling to her knees and screaming in agony.
Obviously, in the world we currently live in, no purification process transforms people with emotions into soulless computers. Still, it’s indisputable that A.I. has become an integral part of our lives, whether it’s used as a substitute for human interaction or to help you buy the best lawn mower. That much is an unavoidable fact.
The Irreplaceable Human Touch: A Warning Against A.I. Dependency
Ultimately, while the lawn mower help can stay, human interaction is necessary for survival. It cannot be replicated by things without beating hearts or life experiences that stimulate and foster growth. Replacing people with A.I. bots, or turning people into them, would contribute to a sick society. Nobody would need anything from anybody. There’d be a general sense of unfeeling, which could only breed callousness and cruelty.
As technology advances each day, only time will tell whether A.I. will truly succeed in replacing people. Whether that would be for the good of humanity is a much more contentious, and arguably more frightening, question.