Has anyone else noticed that ChatGPT tends to forget earlier messages as a conversation goes on? It's really frustrating because you'd expect it to remember everything when you're still in the same thread. This isn't new, either; I've run into it before. It just seems to let go of details after a certain point, and I'm not sure why.
4 Answers
Yes. LLMs have a fixed context window, and in ChatGPT the size you get depends on your subscription plan. For example, Plus users get up to 32k tokens, while free users only get 8k. Once a conversation grows past that window, the model effectively loses the earliest messages as new ones come in. So if you're on the free plan, that could be why you're noticing this.
The underlying model can handle a context window of up to 128k tokens, but that full size isn't what you get in ChatGPT conversations. Depending on your subscription, you get an 8k or 32k context. So in a conversation that goes on long enough, the earliest messages start dropping off the front as new ones are added.
Exactly! Managing how long you keep a single conversation can help. I try to start fresh every day to maintain context.
It's all about the "context window": the fixed amount of text, measured in tokens, that the model can process at once. Large Language Models (LLMs) like ChatGPT work within that limit, so once a conversation exceeds it, the oldest parts get dropped. To keep certain points in focus, it helps to restate key details occasionally throughout the chat.
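To make the "oldest parts get dropped" behavior concrete, here's a minimal sketch of sliding-window history trimming. It's not how ChatGPT actually manages context internally; the `estimate_tokens` heuristic (roughly 4 characters per token) and the message format are assumptions for illustration only:

```python
# Sketch of sliding-window context trimming. The 4-chars-per-token
# estimate below is a rough heuristic, not a real tokenizer.

def estimate_tokens(text: str) -> int:
    """Rough token estimate: about 4 characters per token."""
    return max(1, len(text) // 4)

def trim_history(messages: list[str], max_tokens: int) -> list[str]:
    """Drop the oldest messages until the remainder fits the budget."""
    kept: list[str] = []
    total = 0
    # Walk from the newest message backwards, keeping whatever fits.
    for msg in reversed(messages):
        cost = estimate_tokens(msg)
        if total + cost > max_tokens:
            break
        kept.append(msg)
        total += cost
    return list(reversed(kept))

# A long conversation: 50 messages of ~28 tokens each.
history = [f"message {i}: " + "x" * 100 for i in range(50)]
window = trim_history(history, max_tokens=200)
# Only the most recent messages survive; the earliest are gone.
```

This is exactly why repeating key details later in the chat works: a restated detail lands in a recent message, so it stays inside the window even after the original mention has been trimmed away.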
I usually try to keep my chats concise and start new ones often. It seems to work better for retaining context without running into these issues, even though it's a bit of a hassle.
That makes sense! So it’s not a bug, just a matter of how much the model can remember at once?