Is ChatGPT smarter than a fifth grader?

As a mathematician and math teacher, I think it is very important to help my students learn to think mathematically. When we solve problems, I don't put much emphasis on the calculations themselves, although I do expect students to be able to carry them out. Sometimes I even use computers to do the calculations for me, writing algorithms and programs to check the results of problems before I assign them.

The other day I was trying to solve a problem and used computer programs to help me calculate the result. One program gave me the exact answer, but the other did not. Apparently I had made a mistake somewhere that I could not find. I repeated my calculations again and again, but I still didn't get the right answer. Finally, I decided to try ChatGPT.

I asked ChatGPT the original question, but the answer I received was disappointing. When differentiating a function, it made such a glaring mistake that I had to question it, whereupon it corrected itself and gave the correct answer. I then realized that my own error had been a single wrong character. That was my first impression of using ChatGPT.

The next time, just for fun, I asked ChatGPT to solve a mathematical competition problem that required some thought and further calculation. The answer was so incoherent that I thought the problem was impossible to solve with ChatGPT. After some thought, I found that reformulating the problem might help, and ChatGPT then solved it in two lines. I was amazed and disappointed at the same time.

The third time, I was genuinely curious about what ChatGPT can and cannot do. Again I posed a competition problem, one that required much more thinking and some calculation. ChatGPT started the solution in the usual way, doing the required mathematical reasoning, but then drew an incorrect conclusion. It even made mistakes in the calculations. I asked for a correction, and again it drew an incorrect conclusion. On the third attempt, ChatGPT gave the correct answer but a mathematically wrong solution: the answer was right (by accident?), yet the reasoning was not.

I don't think ChatGPT can match human logical reasoning at the moment, at least not in full depth, and even when it looks as if it does, ChatGPT never understands what it is saying.

Understanding is in the nature of man.

Author: Kati
