In my experience, humans are at least as bad at it as GPT-4, if not far worse, specifically in terms of being "factually accurate" and grounded in absolute reality. Humans operate entirely in the probabilistic realm of what seems right to us based on how we were educated, the values we were raised with, our religious beliefs, etc. Human beings are all over the map with this.
> In my experience, humans are at least as bad at it as GPT-4, if not far worse.
I had an argument with a former friend recently, because he read some comments on YouTube and was convinced a raccoon raped a cat and produced some kind of hybrid offspring that was terrorizing a neighborhood. Trying to explain that different species can't procreate like that resulted in him pointing to the fact that other people believed it in the comments as proof.
Say what you will about LLMs, but they seem to have a better basic education than an awful lot of adults, and certainly significantly better basic reasoning capabilities.
> Trying to explain that different species can't procreate like that resulted in him pointing to the fact that other people believed it in the comments as proof.
Those two species apparently can't interbreed, but considering the number of species that can produce hybrid offspring [1], some even from different families, it's reasonable to forgive people for entertaining the possibility.
I don't think it's remotely reasonable. The list you refer to, which I don't need to click on as I'm already familiar with it, is animals within the same family, e.g. big cats.
Raccoons are not any type of feline, and this should be basic knowledge for any adult in any western country who grew up there and went to school.
There are at least a couple of examples in the article that you refuse to read that describe hybrids from different families. Sorry, but your purported basic knowledge is wrong.
I'm not 'refusing to read' it, I said I'm familiar with it because I've read it numerous times in the past.
Which examples are you referring to? The only real example seems to be fish.
In any case I was using 'family' in a loose sense, not in the stricter scientific biological hierarchy sense.
My basic knowledge is not wrong at all, because my point was that animals that far apart could not reproduce. That's it. The wiki page you linked doesn't really justify your idea that, because some hybrids exist, people might think any hybrid could exist.
The point is, it's frankly idiotic or at least extremely ignorant for anyone 40 years of age who grew up in the US or any developed country to think that.
I also very much doubt the people who believe a raccoon could rape a cat and produce offspring are even aware of that wiki page or any of the examples on it. Hell, I doubt they even know a mule is a hybrid. Your hypothesis doesn't hold water.
Additionally, most of the examples on that page are the result of human intervention and artificial insemination, not wild encounters. Context matters.
This is demonstrably not true. People also bullshit, a lot, but nowhere near the level of an LLM. You won't get fake citations, complete with publication year and ISBN, in a conversation with a human. StackOverflow is not full of downvoted answers from people suggesting non-existent libraries, complete with code examples.
It's definitely part of what cognition is; hallucinogens/meditation/etc. allow anyone to verify that much.
Intuitively, cognition is several systems running in tandem, supervising and cross-checking answers, likely iterating until some threshold is reached.
Wouldn't surprise me if expert/rule systems are up for some kind of comeback; I feel like we need both, tightly integrated.
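To make that intuition concrete, here's a purely hypothetical toy sketch (Python, made-up names and thresholds, not any real system): a "generator" proposes answers and a set of "critics" cross-check them, iterating until the scores clear a threshold.

```python
from typing import Callable, List, Tuple

def iterative_cognition(
    prompt: str,
    generate: Callable[[str], str],              # proposes an answer (e.g. an LLM call)
    critics: List[Callable[[str, str], float]],  # each scores (prompt, answer) in [0, 1]
    threshold: float = 0.8,
    max_rounds: int = 5,
) -> Tuple[str, float]:
    """Return the best answer found and its average critic score."""
    best_answer, best_score = "", 0.0
    feedback_prompt = prompt
    for _ in range(max_rounds):
        answer = generate(feedback_prompt)
        score = sum(c(prompt, answer) for c in critics) / len(critics)
        if score > best_score:
            best_answer, best_score = answer, score
        if score >= threshold:  # the cross-checks agree "well enough", stop iterating
            break
        # otherwise fold the critique back in and try again
        feedback_prompt = f"{prompt}\n(previous attempt scored {score:.2f}, try again)"
    return best_answer, best_score
```

Obviously a caricature, but it's roughly the shape I mean: generation and supervision as separate systems, with iteration instead of a single forward pass.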
There are also dreams and the role they play in awareness; some kind of self-reflective work is probably crucial.
That being said, I'm 100% sure there is something in self-awareness that is not part of the system and can't be replicated.
I can observe myself from the outside, actions and reactions, thoughts and feelings; which raises the question: who is acting and reacting, thinking and feeling, and what am I if not that?
Both of those terms have precise meanings. They're not the same thing. Summarized:
Cognition: acquiring knowledge and understanding through thought and the senses.
Hallucination: An experience involving the perception of something not present.
With those definitions in mind, hallucination can be defined as false cognition that is not based in reality. It's not cognition, because cognition grants knowledge based on truth, while hallucination leads the subject to believe falsehoods.
In other words, "humans are just really good at hallucination" rejects the notion that we're able to perceive actual reality with our senses.
Humans can hallucinate but later determine that what they thought was occurring was not actually real. LLMs can't do that.
What you're saying sounds to me rather like what some people are tempted to do on encountering metaphysics: posing questions like "maybe everything is a dream and nothing we experience is real". Which is a logically valid sentence, I guess, but it really is meaningless. The reason we have words like "dreaming" and "awake" is that we have experienced both and know the difference. Ditto "hallucinations". It doesn't seem that there is any difference to LLMs between hallucinations and any other kind of experience. So, I feel like your line of reasoning is somewhat off-base.
I agree. I shouldn't have used the word "hallucinations" since the point of the conversation above my comment was that they are not really hallucinations by any meaningful definition of the word.
My question was more about whether "babbling" with statistically likely tokens can eventually give rise to real cognition. If we add enough neurons to a neural network, will it achieve AGI? Or is there some special sauce that is still missing?