> you can Google the answers to most of the questions
1) Students can Google the answers also, but neither students nor GPT-4 are allowed to Google the answers during the test, so it remains a fair comparison.
2) Many of the questions require calculations, which are far less Googleable.
1) It's not possible to fairly compare human intelligence with something that can memorize gigabytes of text and hold it in non-volatile memory.
2) Months ago, in my earliest interactions with ChatGPT, I asked it to solve math problems. It gave me back stuff with LaTeX formatting. Obviously it had, if not these exact problems, similar templates in its training set.
Recently it was shown that GPT is completely incapable of solving Codeforces problems that appeared after it was trained.
Whatever is going on here is interesting but less impressive than it looks.
1) Do you really think ChatGPT works by memorizing quantum mechanics textbooks? There are only about 175 billion parameters in GPT-3.5, several orders of magnitude fewer than the roughly 600 trillion synapses in the human brain.
2) Your conclusion is unfounded. ChatGPT speaks many languages, including LaTeX.
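For scale, the parameter-to-synapse gap can be checked with back-of-envelope arithmetic. This sketch uses GPT-3's published 175 billion parameters as a stand-in (OpenAI has not published GPT-3.5's exact count) and a 600-trillion-synapse estimate for the brain (such estimates vary widely):

```python
import math

# Rough figures: 175e9 parameters (GPT-3, as published by OpenAI)
# and an estimated 600e12 synapses in the human brain.
gpt3_params = 175e9
brain_synapses = 600e12

ratio = brain_synapses / gpt3_params
orders_of_magnitude = math.log10(ratio)

print(f"synapse/parameter ratio: {ratio:,.0f}x")
print(f"orders of magnitude: {orders_of_magnitude:.1f}")
```

The gap works out to a few thousand times, i.e. about three and a half orders of magnitude, consistent with the claim above.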
It put LaTeX in there unbidden because it pattern-matched to LaTeX source code in its training set: code containing solved math problems similar to those it was being asked.