As a mathematician and maths teacher, I find it very important to help my students learn to think mathematically. When solving problems, I don't put a lot of emphasis on doing calculations, though I do expect my students to be able to do them. Sometimes I even let computers do the calculations for me: I write algorithms and small programmes and use them to check the results of problems before I set them.
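By way of illustration, a check of this kind might look like the following minimal Python sketch using the sympy library; the function and the hand-derived answer here are invented for the example, not one of my actual problems.

```python
# A minimal sketch of checking a hand calculation by computer,
# for a made-up example: the derivative of f(x) = x^3 * e^(2x).
import sympy as sp

x = sp.symbols('x')
f = x**3 * sp.exp(2*x)

# The answer worked out by hand (a single wrong character here is
# exactly the kind of slip such a check catches).
hand_answer = 3*x**2 * sp.exp(2*x) + 2*x**3 * sp.exp(2*x)

# Let the computer differentiate, then compare the two expressions.
machine_answer = sp.diff(f, x)
print(sp.simplify(hand_answer - machine_answer) == 0)  # True if they agree
```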
The other day I was trying to solve a problem and used computer programmes to help me calculate the result. One of the programmes gave me the exact answer, but the other did not: apparently I had made a mistake somewhere that I couldn't find. I kept repeating my calculations, but I still didn't get the right answer. Finally, I decided to try ChatGPT.
I asked ChatGPT the original question, but the answer I received was disappointing. When differentiating a function, the programme made such a glaring mistake that I had to challenge it, whereupon it corrected itself and gave the right answer. It turned out that my own error had been a single wrong character. That was my first impression of using ChatGPT.
The next time, just for fun, I asked ChatGPT to solve a maths competition problem that required some thought as well as further calculation. The answer was so ridiculously incoherent that I thought solving this problem with ChatGPT was impossible. Then it occurred to me that reformulating the task might help, and indeed ChatGPT solved the reworded problem in two lines. I was amazed and disappointed at the same time.
The third time, I was really curious about what ChatGPT can and cannot do, so again I posed a competition problem, one that required much more thinking along with some calculation. ChatGPT started the solution in the usual way, doing the necessary maths, but then came to the wrong conclusion. It even made mistakes in the calculations. I asked for a correction, and again the programme drew the wrong conclusion. On the third attempt, ChatGPT gave the correct answer but a mathematically incorrect solution, i.e. the answer was right (by accident?) while the reasoning was not.

I don't think ChatGPT can take over human logical thinking at this moment, at least not in any depth; and even when it looks as though it can, ChatGPT never understands what it is saying.
Understanding is part of human nature.
Author: Kati