I've been really frustrated with ChatGPT's math abilities. It seems to mess up basic calculations, mix up signs, and ignore the order of operations. At times it has even told me that 4+4 equals 9! I thought this AI was supposed to be logical. I've even uploaded photos of calculations, but it still gets things wrong.
I'm also wondering whether asking my questions in my native language is affecting its performance. Could that be a reason for the errors?
3 Answers
It's not a language issue. Language models like this learn the probabilities of word sequences from their training data, and when a question is complicated or rarely seen in training, the model tends to guess. Simple things like 2+2 should come out right most of the time, but responses are sampled with a bit of randomness (the temperature setting) to keep them varied, which is why it can occasionally return a wrong answer instead of the most likely one.
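To make the sampling point concrete, here's a toy Python sketch (the scores and tokens are made up for illustration, not taken from any real model) showing how temperature trades off between always picking the most likely token and occasionally picking a wrong one:

```python
import math
import random

def sample_with_temperature(token_scores, temperature=1.0):
    """Turn raw scores into probabilities (softmax) and sample one token.

    Higher temperature flattens the distribution, so less-likely tokens
    (including wrong answers) get picked more often.
    """
    scaled = [s / temperature for s in token_scores.values()]
    max_s = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - max_s) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    return random.choices(list(token_scores), weights=probs, k=1)[0]

# Hypothetical scores a model might assign to the next token after "2+2=".
scores = {"4": 5.0, "5": 2.0, "22": 1.0}

print(sample_with_temperature(scores, temperature=0.1))  # almost always "4"
print(sample_with_temperature(scores, temperature=2.0))  # wrong answers show up more often
```

At low temperature the most likely token wins almost every time; turn it up and the "wrong" continuations start slipping through.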
ChatGPT isn't great at math unless you prompt it to use an external tool. It knows 2+2=4 from its training, but it doesn't really understand the math itself. If you ask it how it solves 2+2, the answer can be pretty surprising!
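One common pattern is to have the model hand the arithmetic off to ordinary code and only relay the result. Here's a minimal sketch of that idea (purely my own illustration, not how ChatGPT is implemented internally): the model would only need to produce the expression, and a small deterministic evaluator does the math.

```python
import ast
import operator

# Operators supported by a tiny, safe arithmetic evaluator.
OPS = {
    ast.Add: operator.add,
    ast.Sub: operator.sub,
    ast.Mult: operator.mul,
    ast.Div: operator.truediv,
    ast.USub: operator.neg,
}

def evaluate(expression):
    """Evaluate a plain arithmetic expression deterministically."""
    def walk(node):
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in OPS:
            return OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in OPS:
            return OPS[type(node.op)](walk(node.operand))
        raise ValueError("unsupported expression")
    return walk(ast.parse(expression, mode="eval").body)

# The expression string would come from the model; the result never does.
print(evaluate("4+4"))    # 8
print(evaluate("2+3*4"))  # 14, order of operations handled correctly
```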
Just a reminder: ChatGPT doesn't really think; it just predicts what comes next. It's all about finding the most probable output based on what it's seen before!
What do you mean by "external tool"?