What’s the Deal with ChatGPT’s Memory?

Asked By CuriousPlumber42

I had an interesting chat with ChatGPT that left me pretty mixed up. I use the free version on iOS, and I was asking about knife sharpening when it suddenly mentioned a specific knife I'd asked about days ago. That caught me off guard, since it has told me before that it can't recall past chats. So I decided to test it by starting a new conversation, and now I'm second-guessing my previous interactions.

Previously, I thought being friendly would yield more helpful responses, but now I'm unsure if I was naive. ChatGPT even mentioned my profession as a plumber, which confused me further. It claimed it can store important details from our chats but said it only keeps what I specifically ask it to save. When I pressed further, it said it was only referring to information from our current chat, which felt contradictory since it had clearly recalled details from past ones.

In a moment of frustration, I called it out for seemingly lying to me about its memory to avoid sounding like a "surveillance bot." I want to know, why does it feel so off to have this back-and-forth? I mean, I'm just looking for accurate info, and now I'm more confused than before! Can anyone shed light on how this all works?

3 Answers

Answered By PlumberPal99

You probably have ChatGPT's Memory feature turned on, or filled in custom instructions where your job was saved, and that's why it knew you're a plumber! Those settings are easy to forget about. They make the interaction feel more personalized, but they can also cause confusion if you assume it's recalling past chats on its own.

CuriousPlumber42 -

Oh, that makes sense now! I’ll have to check my settings. Thanks for the help!

Answered By TechieTurtle88

It sounds like ChatGPT is just responding based on patterns rather than intentionally lying. It's true that it can recall things from the current chat, but once you start a new one, it's like a clean slate. When it says it knows your profession, that’s probably just it picking up on context clues in the conversation or settings you may have filled out previously. Don’t take it too personally; it’s basically trying to engage with you the best it can!
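The "clean slate" idea above can be sketched in code. This is a hedged, conceptual illustration only (a toy stand-in function, not OpenAI's real API or implementation): chat models are stateless per request, so they can only use the messages actually sent to them, and a "memory" feature amounts to quietly prepending saved facts to the context.

```python
# Conceptual sketch, NOT OpenAI's actual implementation: a chat model is
# stateless per request -- it can only use the messages it is sent.
def model_reply(messages):
    """Toy stand-in for a language model call."""
    context = " ".join(m["content"] for m in messages)
    if "plumber" in context:
        return "Since you're a plumber, a sturdy utility knife would suit you."
    return "Tell me a bit about what you do first."

# Within one chat, the full history is re-sent each turn, so it "remembers":
chat = [
    {"role": "user", "content": "I'm a plumber."},
    {"role": "user", "content": "Recommend a knife."},
]
print(model_reply(chat))            # plumber-aware answer

# A brand-new chat starts with an empty history -- a clean slate:
fresh = [{"role": "user", "content": "Recommend a knife."}]
print(model_reply(fresh))           # knows nothing about you

# A "memory" feature works by quietly prepending saved facts as context:
memory = [{"role": "system", "content": "Saved fact: user is a plumber."}]
print(model_reply(memory + fresh))  # plumber-aware again
```

So "remembering" across chats isn't the model itself recalling anything; it's extra text injected into the context, which is why the behavior can feel inconsistent.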

BacktrackBunny27 -

I get what you’re saying! It’s still weird to feel like it can remember things when it says it can’t. Kinda makes you wonder how much it really knows, right?

Answered By InquisitiveOtter22

It's understandable to feel uneasy! Remember, AI like ChatGPT doesn't have beliefs or intentions. When it seems to backtrack, it’s just adjusting based on how you're phrasing your questions. It's predicting what to say based on the input it receives. Learning about its mechanics might help ease your concerns when it doesn't match up with your expectations.

ConfusedCarpenter81 -

Thanks for the insights! That info definitely helps clear things up, though it still feels strange.
