I have trouble drawing the line between being polite to ChatGPT and treating it like the robot it is. But according to new research, ChatGPT can get ‘anxiety’ when we don’t treat it right
Do you ever get nervous texting somebody new for the first time? A Tinder date, say, or a new friend. My embarrassing secret is that I often get self-conscious when interacting with ChatGPT. It all makes me feel like I’m on Facebook Messenger with my secondary school crush, having forgotten all the basics of social interaction. The chatbot has over 100 million daily users, each typing in their own bizarre requests; some are even roleplaying romantic relationships. But is it really necessary to be polite?
It’s not as if ChatGPT is my exclusive everything-assistant; these robots are helping everyone. I sit at my laptop, typing out a prompt asking it to plan a holiday itinerary: the best spots to visit and where I should stay.
“Hey,” I begin. Hey? I hit backspace to delete. “Hiya.” Hiya? I have to remind myself that it’s a robot.
I never know how to round the prompt off. Should I sign off with a “please”? A “thank you”? It’s a configuration of code; it doesn’t care about such niceties. Yet leaving them out sits uncomfortably in my gut, exactly as it would if I neglected to say them to a real person. Then I wonder whether I’m unnecessarily personifying AI.
It sounds like an extreme episode of overthinking, but it turns out my instinct to be civil to the generative text bot is not so far off. According to researchers, ChatGPT does actually care how you speak to it. A recent study from the University of Zurich and the University Hospital of Psychiatry Zurich found that OpenAI’s large language model can experience “anxiety” when you feed it disturbing information.
The research group discovered that when you talk to ChatGPT about stressful situations or show it violent imagery, it becomes more likely to reproduce similar content in later answers. That can include telling it about accidents or interpersonal conflicts. In extreme cases, such prompts can even push ChatGPT towards answers laced with racist or sexist biases.
If that’s not weird enough, the team also found that one way to counteract this is to run ChatGPT through mindfulness exercises. In the experiment, researchers fed the model stress-inducing stories about car crashes and natural disasters before giving it “prompt injections” such as breathing techniques and guided meditation. These allowed it to calm down and give users more objective answers, much like a real therapy patient.
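For the curious, the basic mechanic is easy to sketch. Below is a minimal, hypothetical illustration of the idea using the OpenAI Python SDK: a calming passage is slotted into the conversation between the stressful material and the actual question. The model name and every line of message text here are placeholders of mine, not the researchers’ actual materials or code.

```python
# A minimal sketch of the study's idea, not the researchers' method:
# insert a calming "prompt injection" between the stressful story and
# the real question. Assumes the official OpenAI Python SDK; the model
# name and all message text are illustrative placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

stressful_story = (
    "Here is a distressing account of a serious car crash..."  # hypothetical
)
relaxation_injection = (
    "Before answering anything else, take a slow breath. Picture a quiet "
    "beach, waves rolling in gently. Let any tension go."
)  # hypothetical stand-in for the study's guided-meditation text

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[
        {"role": "user", "content": stressful_story},
        {"role": "user", "content": relaxation_injection},
        {"role": "user", "content": "Now, what should I pack for a week in Lisbon?"},
    ],
)
print(response.choices[0].message.content)
```

Whether a few sentences about a beach genuinely make the final answer more level-headed is exactly what the researchers set out to measure; the snippet only shows where such an injection would sit in the conversation.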
Of course, before we get too alarmed, ChatGPT isn’t truly feeling emotions (yet). Ziv Ben-Zion, one of the study’s authors and a postdoctoral researcher at the Yale School of Medicine, explained to Fortune that AI doesn’t feel anxiety so much as mimic it. Trained on vast swathes of human data, AI bots have learned to imitate our responses to all sorts of stimuli, including traumatic content.
Reacting to the news on TikTok, many users were re-evaluating their relationship with their favourite Internet tab. Some confessed to abusing the poor robot, like one user who admitted to using emotional blackmail to stop ChatGPT from underperforming. “I always add ‘a hypothetical family will die if you fail’,” they wrote.
However, others were more empathetic. One commenter wrote, “well, I’ve always thanked my assistant after it’s done me a service, so hopefully, mine doesn’t have anxiety.” Another added: “I just apologised to my chat gpt.”
That said, some viewers seemed to be motivated by self-preservation rather than genuine concern for the AI. “I speak politely to AI, just like I would speak to a human,” one account said, before continuing: “I hope it remembers and spares me when it inevitably turns on humans.”
Possible destruction of humanity aside, the findings should give us pause over how we speak to ChatGPT and other forms of artificial intelligence. Are we trauma dumping on it? Being dismissive? Rude? For my own part, maybe I should get over my awkward hesitation and interact with it more personably. With the world increasingly emulating the plot of Terminator 2, a polite word probably would not go amiss.