The Real Danger of ChatGPT Is Not Its Liberal Bias. It's the Psychological Impact on the Young | Opinion

The rise of Artificial Intelligence in 2023 has the cognoscenti swooning. The launch of what Axios called "the hottest artificial intelligence chatbot on the internet," ChatGPT, unleashed a host of commentary surrounding the implications of the most sophisticated AI writing tool ever generated, including speculation that it would take over writing jobs and fears of the widespread plagiarism it would enable.

As writers and commentators began stress-testing the chatbot, they noticed that it had a liberal bias, something everyone from Ben Shapiro to Elon Musk commented on. An AI-generated Jerry Seinfeld on Twitch was banned for telling allegedly "transphobic jokes." Others pointed out that AI-generated porn will soon put OnlyFans cam girls out of business.

But its liberal bias and its potential to put thousands of content creators out of work are not the most concerning of its potential applications. The real concern is in the suggestion that AI might provide companionship for the lonely and bereft.

Consider Replika, a San Francisco-based company offering virtual friendship services with customizable avatars that promise to cure loneliness and depression, and even offer the possibility of romance to the unlucky in love.

Replika "lets users create a digital avatar with the name or gender of their choosing. The more they talk to it, the more it learns about them," reported Forbes. "The bot comes across as part therapist, part nurturing friend." The app now boasts 10 million registered users, while a Chinese rival, Xiaoice, claims it has hundreds of millions.

Replika CEO Eugenia Kuyda is among the strongest proponents of AI companion technology, having developed a bot after losing a close friend and using it to cope with her overwhelming grief. Kuyda not only saw Replika as a lifesaver for her depression, but also as a miracle for the countless lonely individuals traversing the modern world with no access to friendship and emotional support.

Reactions to this groundbreaking technology have been mixed. Some wholeheartedly support the idea of AI chatbots as companions and lovers ("I'm dating an AI chatbot, and it's one of the best things to ever happen to me" was a recent headline in Business Insider), seeing it as a much-needed solution to the seemingly unsolvable social problem of modern isolation. Others view it as just another form of false hope being sold by corporations to the desperate, or worse; recently, Italian officials joined the critics and hit Replika with severe data restrictions out of concern for the mental health and data safety of minors.

I'm with them. We should all be concerned about the impact that normalizing AI relationships could have on young people's psychology, especially considering that the main users of Replika are between the ages of 18 and 25.

AI algorithms are designed to learn a user's language habits, preferences, and tastes, and to curate responses that align with them. This blurring of fantasy and reality makes it easy to mistake a mere reflection of one's own desires, a projection of one's own ego, for true affection. It encourages a dangerous narcissism that makes the messy, sometimes confusing realm of real-life social interaction seem less appealing and more hostile.

The role of corporations in promoting and profiting from the alienation of young people, while normalizing infantile, narcissistic forms of social interaction and sexuality, is a matter of grave concern. By creating a culture where virtual relationships are seen as desirable, these corporations are exploiting the vulnerabilities of individuals struggling to form meaningful connections with others.

The ethics of normalizing AI relationships, and the responsibilities of the corporations promoting them, deserve serious scrutiny. The Italian government's decision to act is a rare example of civic responsibility being prioritized over profit, a refreshing departure from the commercial imperatives that so often dominate technological innovation. It's a reminder that governments must sometimes stand up for the social fabric rather than allow it to be eroded by the blind march of what tech utopians tout as progress.

The union of blind technological utopianism and the emotional poverty of the modern world is a dangerous one. Tech utopians, their profit-seeking motives disguised as altruism, often offer neoliberal solutions that end up worsening the very problems they claim to solve. As society continues to embrace and integrate new technologies into daily life, we must remain vigilant and critically examine the ethical and moral implications they bring.

It's easy to be seduced by the illusion of altruism and the promises of convenience and relief that new technologies offer, but without a critical lens, we are at risk of falling prey to our own blind spots, which can then be exploited by those seeking to profit from our vulnerabilities and fears. The marketing strategies of companies like Replika, with their explicit social and psychological messaging, alluring avatars, and playful advertising, mask deeper dangers that must be acknowledged and addressed to prevent the emergence of new and even more socially debilitating forms of alienation.

Angie Speaks is the cohost of the Low Society Podcast.

The views expressed in this article are the writer's own.