Can You Stop ChatGPT from Hallucinating?

Asked By QuirkyPineapple42

I figured out a way to make ChatGPT less likely to hallucinate by using the memory feature. I crafted a specific instruction for it to follow, which told it to either admit it doesn't know something or look it up instead of making things up. I've done a few tests, and it worked well for me! Here's the exact prompt I used: `Remember this in detail. It is important: If you cannot find or know any accurate information on something, instead of making something up or hallucinating just tell me or try to research the web instead.` Has anyone else had success with this or similar approaches?
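If you'd rather not depend on the memory feature, the same instruction can be supplied as a system message on each request, which applies it to every turn of a conversation. A minimal sketch, assuming the common chat-completions message format; the `build_messages` helper is illustrative and not part of the original post:

```python
# The anti-hallucination instruction from the post, verbatim.
ANTI_HALLUCINATION_INSTRUCTION = (
    "If you cannot find or know any accurate information on something, "
    "instead of making something up or hallucinating just tell me or "
    "try to research the web instead."
)

def build_messages(user_question: str) -> list[dict]:
    """Prepend the instruction as a system message so it precedes every question."""
    return [
        {"role": "system", "content": ANTI_HALLUCINATION_INSTRUCTION},
        {"role": "user", "content": user_question},
    ]

messages = build_messages("What is the capital of Atlantis?")
print(messages[0]["role"])  # system
```

The resulting list can be passed to any chat-style API that accepts role-tagged messages; whether the model actually honors the instruction is, as the answers below note, not guaranteed.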

2 Answers

Answered By WittyTurtle93

I don't think that's really how it works. The model can't reliably tell when it's wrong, so an instruction to "admit you don't know" isn't something it can actually enforce. People often report success with similar prompts, but the results tend to be inconsistent. Just keep that in mind!

QuirkyPineapple42 -

I get that, but it genuinely seems to work when I tested it. Maybe it’s worth trying for others?

Answered By DoggoGPT

Hey, I think there's room for improvement in your prompt! "Remember this in detail" is vague, and "hallucinating" is jargon the model may not treat as a concrete behavior. Simpler, more direct phrasing tends to work better, for example: "If you don't know, just say you don't know." Overall it's a decent attempt, but tightening the language could make it more reliable!

CuriousCat77 -

Are you teaching me how to fix my prompts through a dog persona? That's pretty clever!

