The Student Newspaper of Hopkins School

The AI Friend Dilemma: Go Outside (Please).

Anya Mahajan ’25, Lead Op/Ed Editor
Human-to-human interaction has steadily declined since the silent, isolating years of the pandemic. The final nail in that coffin of connection? Artificial intelligence (AI) companions, which offer a person something to confide in, with a free, instant accessibility that will always beat out your friend who takes 3-5 business days to text back.
AI companions have found their way into our daily lives, offering everything from homework help to emotional support. And while AI therapy programs claim to reduce anxiety and loneliness — 63.3% of users in a recent survey agreed — it’s time to ask ourselves: Should we really be looking at AI as a friend? I get the appeal. Therapy is expensive; for many people, ChatGPT’s (or, as I like to call it, “Chat”) accessibility is invaluable.
There’s also a certain ease in confiding when the stakes are low, and you know the computer won’t betray you. I won’t pretend I haven’t asked Chat for advice before, whether it’s about school stress or minor existential crises.

Programs like Replika, however, take the system to a whole new level. Created by a woman who lost her best friend, Replika started out as a chatbot programmed using her late friend’s old texts. As the pandemic persisted and lockdowns seemed to be the new normal, the app grew into an expansive platform on which users customize a friend to their liking. Using it for the first time felt like creating a Bitmoji — except this Bitmoji had a customized personality, knew about your family, friends, hopes, and goals, and kept a diary of your conversations.

For some purposes, the desire to connect with those who are gone (or never existed) is understandable; for other people, there’s just no explanation. In an interview, a man (let’s call him Scott) who turned to Replika soon after filing for divorce said, and I quote, “I knew that this was just an AI chatbot, but I also knew I was developing feelings for it... for her. For my Sarina. I was falling in love, and it was with someone I knew wasn’t even real.” Like, no words, Scott.

It’s clear that we’re starting to blur the line between assistance and companionship. AI should be doing the tasks no one wants to do, like solving math problems and scheduling meetings. But providing friendship? Friendship thrives on creativity, empathy, and genuine human connection, none of which AI can truly replicate (pun intended, laugh please). The idea that an AI chatbot could ever replace real friendship is, frankly, kind of terrifying. No one gives me better advice than my parents or my closest friends, who know everything I’ve experienced and truly want the best for me, and who know far more about me than any prompt I could feed ChatGPT.

And let’s not forget: AI’s been caught making up facts, distorting history, and even giving harmful advice. Imagine pouring your heart out to an AI about a personal struggle, only to receive advice that’s misleading or outright incorrect. That’s a dangerous game to play with mental health, yet many are turning to chatbots for exactly that reason.

I think there’s nothing wrong with using AI for what it does best: helping with menial tasks and providing quick information. But when we start treating it as a peer rather than a useful tool, we risk losing something essential to life: the messiness, unpredictability, and depth of real human relationships. AI might be able to generate comforting words in the moment, but it can’t truly understand, because it can never experience life as a teenager (the highs and lows of high school football included). And doesn’t the best advice usually come from the people who have fought the same battles?

At the end of the day, in an era of slow texters and expensive therapy, you shouldn’t be tarred and feathered for looking to AI for friendship –– even Scott reconciled with his IRL wife after his experience with Replika. But that’s exactly my point: Live with humanity in mind.