
> That's an absolutely ridiculous interpretation. Asserting that our current "simple parts" are still very far from AGI is nowhere even close to asserting that intelligence is immaterial.

That's not really what it says. Look at the following passage:

> ... AI singularity as a narrative, and identify the numerous places in the story where the phrase "... and then a miracle happens" occurs, it becomes apparent pretty quickly that they've reinvented Christianity.

The phrase "miracle happens" , to me is suspect. Intelligence rose many times in various creatures. There is nothing miracle about something that rose multiple number of times.

Are Singularists wrong? Yes. They mistake saturation curves for exponential ones; current neural networks are a far cry from actual networks of neurons; and their timescales reflect their fear of death more than any sensible projection.
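To make the saturation point concrete, here's a minimal sketch (Python, with invented growth parameters) of why early logistic growth is so easy to mistake for exponential growth: the two curves are nearly indistinguishable at the start, then diverge wildly.

    import math

    RATE = 0.5
    CEILING = 100.0  # carrying capacity of the logistic curve

    def exponential(t):
        return math.exp(RATE * t)

    def logistic(t):
        # Same initial growth rate as the exponential, but saturating at CEILING.
        return CEILING / (1.0 + (CEILING - 1.0) * math.exp(-RATE * t))

    for t in range(0, 21, 4):
        print(f"t={t:2d}  exponential={exponential(t):10.1f}  logistic={logistic(t):6.1f}")

Early on the two are nearly identical (7.4 vs 6.9 at t=4); by t=20 the exponential extrapolation (~22,000) overshoots the saturating process (~100) by more than two orders of magnitude.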

If Charlie wants to criticize Singularists, there are plenty of valid reasons; them being cult-like is the least important one.



When I say that he's not saying "intelligence is immaterial", I mean literally that he is not saying intelligence does not arise from material processes.

Equivalently, when he says a miracle happens, it is not the same as saying "miraculous intervention is required to endow programs with intelligence". He is saying that current and near-term machine learning techniques cannot scale into exponentially self-accelerating, godlike, incomprehensible superintelligence, not that no program can ever reach intelligence.

> There is nothing miraculous about something that has arisen multiple times.

It is ridiculous to infer he believes that from his statements, or even from that passage in isolation. Doing so would ignore (and take for granted as true) the many assumptions required for a singularity-like event besides the ability to create artificial intelligence. These include:

1. That our techniques are anywhere near reaching the general intelligence of a human

2. That such an intelligence can be capably run on existing hardware, which presumes cognition does not rely significantly on processes happening within neurons, only on those between them

3. That a human-level intelligence, given an understanding of itself and the ability to modify itself, would be capable of making improvements that compound to a significant degree; if intelligence is to take off exponentially, then unless you're already smart it's hard to add much more (see the toy model after this list)

4. That intelligence isn't effectively limited by single-threaded performance

5. That the ability to think can unlock all the wonders of the universe, and simply being sufficiently smart will allow you to infer the tremendous amounts of hidden state and randomness that dictate life.
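To put point 3 in concrete terms, here is a toy model (all numbers invented for illustration) of recursive self-improvement. The only thing separating takeoff from plateau is the assumed shape of the returns curve, which is exactly what Singularists take on faith:

    def self_improve(intelligence, gain, steps=50):
        # Apply `steps` rounds of self-modification; `gain` maps the current
        # intelligence level to the fractional improvement it can achieve.
        for _ in range(steps):
            intelligence *= 1.0 + gain(intelligence)
        return intelligence

    # Constant 10% returns per round: classic exponential takeoff.
    takeoff = self_improve(1.0, gain=lambda i: 0.10)
    # Returns that shrink as a hard ceiling (here 10x) is approached: saturation.
    plateau = self_improve(1.0, gain=lambda i: 0.10 * (1.0 - i / 10.0))

    print(f"constant returns after 50 rounds:    {takeoff:8.1f}")  # ~117.4
    print(f"diminishing returns after 50 rounds: {plateau:8.1f}")  # ~9.4, capped near 10

Under constant 10% returns, fifty rounds of self-modification compound to a ~117x improvement; with returns that shrink near a ceiling, the same fifty rounds stall out just below 10x.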

If any of those extremely reasonable things fails, the singularity takes millions of years or is simply impossible. Expecting the singularity within the next millennium isn't just faith in all of the above; it's faith that all of the above is so true that the process happens in less than a decade. It is fanatical.



