ChatGPT is referring to users by their names unprompted, and some find it 'creepy' | TechCrunch





Some ChatGPT users have noticed a strange phenomenon recently: Occasionally, the chatbot refers to them by name as it reasons through problems. That wasn’t the default behavior previously, and several users claim ChatGPT is mentioning their names despite never having been told what to call them.

Reviews are mixed. One user, software developer and AI enthusiast Simon Willison, called the feature “creepy and unnecessary.” Another developer, Nick Dobos, said he “hated it.” A cursory search of X turns up scores of users confused by — and wary of — ChatGPT’s first-name basis behavior.

“It’s like a teacher keeps calling my name, LOL,” wrote one user. “Yeah, I don’t like it.”

Does anyone LIKE the thing where o3 uses your name in its chain of thought, as opposed to finding it creepy and unnecessary? pic.twitter.com/lYRby6BK6J

— Simon Willison (@simonw) April 17, 2025

It’s not clear when, exactly, the change happened, or whether it’s related to ChatGPT’s upgraded “memory” feature that lets the chatbot draw on past chats to personalize its responses. Some users on X say ChatGPT began calling them by their names even though they’d disabled memory and related personalization settings.

OpenAI hasn’t responded to TechCrunch’s request for comment.

It feels weird to see your own name in the model thoughts. Is there any reason to add that? Will it make it better or just make more errors as I did in my github repos? @OpenAI o4-mini-high, is it really using that in the custom prompt? pic.twitter.com/j1Vv7arBx4

— Debasish Pattanayak (@drdebmath) April 16, 2025

In any event, the blowback illustrates the uncanny valley OpenAI might struggle to overcome in its efforts to make ChatGPT more “personal” for the people who use it. Last week, the company’s CEO, Sam Altman, hinted at AI systems that “get to know you over your life” to become “extremely useful and personalized.” But judging by this latest wave of reactions, not everyone’s sold on the idea.

An article published by The Valens Clinic, a psychiatry office in Dubai, may shed some light on the visceral reactions to ChatGPT’s name use. Names convey intimacy. But when a person — or chatbot, as the case may be — uses a name a lot, it comes across as inauthentic.

“Using an individual’s name when addressing them directly is a powerful relationship-developing strategy,” writes Valens. “It denotes acceptance and admiration. However, undesirable or extravagant use can be looked at as fake and invasive.”

In a similar vein, perhaps another reason many people don’t want ChatGPT using their name is that it feels ham-fisted — a clumsy attempt at anthropomorphizing an emotionless bot. In the same way that most folks wouldn’t want their toaster calling them by their name, they don’t want ChatGPT to “pretend” it understands a name’s significance.

This reporter certainly found it disquieting when o3 in ChatGPT earlier this week said it was doing research for “Kyle.” (As of Friday, the change seemingly had been reverted; o3 called me “user.”) It had the opposite of the intended effect — poking holes in the illusion that the underlying models are anything more than programmable, synthetic things.
