Burgers & Bytes
April 16, 2025

When chatbots feel human


A couple of months ago I wrote my first blog post as a reflection on AI. This second piece continues that exploration, this time focusing on the part that fascinates (and unsettles) me most: the feeling that chatbots are becoming human.

AI at work vs. daily life

In my work, I encounter AI on a daily basis—not just in the tools and technologies we use, but also in the enthusiasm of the people who work with it. At the same time, I see a stark contrast in my personal life. Many people around me haven’t really tried AI at all. Some of them consciously avoid it, but many simply don’t know how it could be useful to them.

Out of curiosity, I recently asked ChatGPT how AI could be used in personal contexts. I was surprised by its first two suggestions: emotional support and decision-making related to careers and relationships. Call me naive, but I expected something more practical—like planning a holiday or coming up with creative recipes.

[Screenshot: ChatGPT's suggestions]

New type of interaction

This type of use shifts the nature of human interaction. Where someone might once have turned to a friend, they’re now turning to a chatbot. It immediately brought to mind ELIZA, the chatbot created by Joseph Weizenbaum in 1966. Even back then, it was clear how quickly people began attributing human traits to a machine.

The ELIZA effect

Weizenbaum later warned about what became known as the ELIZA effect: our tendency to project human characteristics onto non-human systems. Nearly sixty years later, I see that same pattern. Today’s AI chatbots don’t just repeat back your words. They engage, empathize, and adapt. I’ve already heard several stories of people using chatbots as personal coaches, companions, even therapists. These systems now have access to your most vulnerable thoughts (your anxieties, doubts, fears) and not just yours, but those of millions of people around the world.

Time to pay attention

It raises an unsettling question: In a world that often feels isolating, are we turning to machines to feel seen, loved, and taken seriously?

There’s a Dutch book that comes to mind: Je hebt wél iets te verbergen (“You actually do have something to hide”), published in 2016. It argued for the importance of privacy at a time when we were just beginning to question what we shared online. The warnings back then were mostly about social media. But now, the line between private and public has blurred even further. Today, we don’t just post pictures or thoughts—we pour our inner lives into these digital conversations. And we rarely stop to consider where that data goes, who sees it, or how it might be used.

That’s why I think it’s time we reexamine the ELIZA effect—not just as a psychological curiosity, but as a cultural signal. What does it mean to form emotional bonds with something that doesn’t feel back? Shouldn’t we be more conscious of this phenomenon, and help others become conscious of it too? And how can we protect ourselves, both from the emotional side effects and from the possible misuse of such deeply personal data?

We’re not just training AI. It’s training us too.
