I bet you have heard that before. I would even bet that you have said it yourself at some point. It speaks to human nature, to our drive and curiosity to figure things out, even though we are not perfect memory machines that can recall any piece of information at will.
But what if I told you that AI seems to behave this way more and more? Here is an anecdote:
A few days back, I was trying to use ChatGPT as a classification tool. I gave it a list of a few hundred sentences gathered in the field, each tagged with another dimension (a class), along with perhaps a dozen categories to choose from. I explained the task in detail. At first, the bot's answers were not satisfactory; but I also had access to the work an actual human had done on the same data. I had the counts for each category the human chose: simply a list of the categories and how many times each showed up, with no reference to class. So I decided to give it a try and show this to ChatGPT, telling it that this was how a human had done it, and that this was an example of what was expected.
Immediately, the bot started classifying and labeling which entries fell into which categories, one by one, until the counts matched. I was then able to ask it to break the results down by the other dimension, the class, and this time it returned a table with the count of occurrences in each category and each class, which in turn added up to the exact same numbers the human had provided.
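The consistency check at the heart of this step, a per-category, per-class table that collapses back to the human's per-category totals, can be sketched in a few lines. This is a minimal illustration with hypothetical sample data and labels, not the actual field sentences or categories:

```python
from collections import Counter

# Hypothetical field data: each entry is a (sentence, class) pair.
entries = [
    ("the app crashed on login", "mobile"),
    ("checkout button does nothing", "web"),
    ("the app crashed on save", "mobile"),
]

# Hypothetical model output: one category label per entry, in order.
model_labels = ["bug", "bug", "bug"]

# Human-provided reference: totals per category, with no class breakdown.
human_counts = Counter({"bug": 3})

# Cross-tabulate the model's labels by (category, class).
crosstab = Counter(
    (label, cls) for label, (_, cls) in zip(model_labels, entries)
)

# Collapsing the class dimension should reproduce the human's totals.
model_totals = Counter()
for (label, _), n in crosstab.items():
    model_totals[label] += n

assert model_totals == human_counts
```

The check passing means the breakdown by class is at least consistent with the human's category counts, which is exactly the agreement described above.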
Building the table was a simple task, tedious, but simple. Yet the result somehow felt as if the bot were telling me: "I am a fast learner."