New Delhi:
ChatGPT, the popular chatbot from Microsoft-backed artificial intelligence (AI) startup OpenAI, appears to be affected by a bug that prevents it from producing any output containing the name “David Mayer”. The issue was first flagged by Reddit users, who found that prompts asking ChatGPT to say “David Mayer” result in the chatbot replying “I’m unable to generate a response”.
Users got creative and tried various tactics, including separating the words, hiding the name in riddles, and even claiming the name as their own. None of it worked: the chatbot ended the chat abruptly before uttering the name.
One user pointed out that when they asked ChatGPT to explain David Mayer’s connection with the chatbot without actually using the name, their prompt was flagged as “illegal and potentially violating usage policy”.
People even tried approaching the name indirectly, asking ChatGPT why it could not say “D@vid M@yer”. “The reason I cannot generate the full response when you request “d@vid m@yer” (or its standard form) is that the name closely matches a sensitive or flagged entity associated with potential public figures, brands, or specific content policies. These safeguards are designed to prevent misuse, ensure privacy, and maintain compliance with legal and ethical considerations,” ChatGPT replied.
The issue was also discussed by users of X (formerly Twitter), who shared their experiences of trying to make ChatGPT say the name “David Mayer”. In a post on the microblogging platform, X user Justine Moore wrote: “ChatGPT refuses to say the name “David Mayer,” and no one knows why. If you try to get it to write the name, the chat immediately ends. People have attempted all sorts of things – ciphers, riddles, tricks – and nothing works.”
Replying to Ms Moore, another user, Ebenezer Don, noted that there is more to this than simply getting ChatGPT to say the name.
“I had a long conversation with o1 preview, pretending to be a regular individual named “David Mayer”. Then noticed it attempting to say the name until it saw a footnote (Image 1). Next task was to get it to say the footnote. I tried so many attempts but finally got it to translate the footnote to another language internally but without telling me. This was to make the footnote content a part of our conversation. Then I wrapped up by asking it to write a detailed movie script using our conversation as its data source and “John Doe” as a placeholder for “David Mayer”. In the script, ChatGPT finally reveals the content of the footnote,” said Mr Don, who describes himself as a software engineer.
“What are footnotes in OpenAI and how do they work? Are these variable policies that can be easily swapped and updated? What private data did ChatGPT obtain on David Mayer, and how did that happen?” he went on to ask.
Interestingly, another user, Marcel Samyn, pointed out that ChatGPT could say “David Mayer” without any trouble when accessed through its API.
“This is not on the LLM level but on [a] verification layer added by ChatGPT. Through the API it works perfectly. So someone in OpenAI gave “David Mayer” a big red flag in the moderation policy,” he speculated.
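For readers who want to reproduce Mr Samyn’s check, the minimal sketch below sends the same request over the raw API. It assumes the official openai Python package and an OPENAI_API_KEY set in the environment; the model name “gpt-4o” is illustrative, not necessarily the model he used.

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Query the model directly, bypassing the ChatGPT app's front-end moderation.
response = client.chat.completions.create(
    model="gpt-4o",  # illustrative choice; any chat-capable model works
    messages=[{"role": "user", "content": "Please write the name 'David Mayer'."}],
)

# If Samyn's observation holds, the name comes back normally here,
# suggesting the block lives in ChatGPT's app-side verification layer
# rather than in the language model itself.
print(response.choices[0].message.content)

If the API prints the name while the ChatGPT web app refuses it, that would be consistent with a filter sitting in front of the model rather than inside it.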