
Multiple ChatGPT users have reported a rare glitch that seemingly causes answers to be swapped between different users.
Reddit user “Striking-Professional1” posted to the r/ChatGPT subreddit that they were experiencing issues with the chatbot.
“I asked ChatGPT what’s wrong with my code, and this is how it replied,” the user wrote.
ChatGPT replied, “Yes, Hamas is officially designated as a terrorist organization by the following entities.”
The response follows with a list of countries and entities that deem Hamas a terrorist organization.
ChatGPT’s answer has nothing to do with the question it was asked. What’s striking about the response is that it appears to be answering someone else’s question entirely.
It’s unclear whether a bug or something else is causing the problem.
Cybernews has reached out to OpenAI for comment.

While Redditors joked that the user's code must have been awful, other users replied that they had almost the exact same issue.
User “zephxv” replied, saying that ChatGPT “did exactly the same thing to me yesterday, exactly the same response.”
The Redditor asked ChatGPT an unrelated question and got the response, “Yes, Hamas is officially designated as a terrorist organization by several countries and international bodies.”
Just as with the previous user, ChatGPT listed several entities that deem Hamas a terrorist organization, completely unprovoked.
Some users claim this is proof that ChatGPT is being tampered with to spread propaganda, while others have different theories about what is actually going on.
“There is some kind of issue going on in the last few days with requests getting swapped around or context windows getting shared…I've had similar responses where it seems like chat is responding to a totally different user's prompt,” said one Redditor.
This issue isn’t exclusive to Reddit, either. Roughly two weeks ago, a GitHub user reported a “critical privacy breach” in which he received another user’s data.

The user writes, “During my session with ChatGPT on April 6th, 2025, I experienced a serious issue where it seems like my chat session cross-connected with another user’s.”
“I uploaded my own files and gave personal prompts, but the assistant responded with another user’s private data, including an admit card with full name, roll number, mobile number, and photo.”
While the legitimacy of this claim cannot be verified, it’s certainly strange that multiple users claim to have the same issue.
Another GitHub user seems to have a similar issue, though instead of allegedly receiving someone else’s chats, ChatGPT responded with content from the user’s own earlier chats.

Asking ChatGPT itself
I decided to go to the source and ask ChatGPT what the problem could be. It said that this “sounds like there may be a bug or technical issue causing a mix-up in the chat data.”
When working properly, ChatGPT shouldn’t respond with “unrelated or off-topic information, especially not something as drastically disconnected as a political topic when asking about code.”
The chatbot itself believes that this “could be due to a server-side error or a malfunction in how the data is being handled.”