Hacker News

The network has specific circuits that correspond to concepts and you can see that the network uses and combines those concepts to work through problems. That is reasoning.


Under this definition a 74LS21 AND gate is reasoning - it has specific circuits that correspond to concepts, and it uses that network to determine an output based on its inputs. That seems overly broad - we run back into the problem of saying that a nightlight or a thermostat is reasoning.
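To make the point concrete: the 74LS21 packages two 4-input AND gates, and each gate's output is a fixed function of its inputs. A minimal sketch of that mapping in Python (the function name is just illustrative):

```python
from itertools import product

def and4(a: int, b: int, c: int, d: int) -> int:
    """Behavior of one 4-input AND gate: output is high
    only when all four inputs are high."""
    return a & b & c & d

# Exhaustive truth table: of the 16 input combinations,
# exactly one produces a high output.
table = {bits: and4(*bits) for bits in product((0, 1), repeat=4)}
assert sum(table.values()) == 1
assert table[(1, 1, 1, 1)] == 1
```

Whether enumerating a fixed truth table like this counts as "reasoning" is exactly the question at issue.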


Reasoning is probably better thought of as a spectrum, where what you describe is a tiny amount of reasoning and LLMs do a great deal more of it.


For true reasoning you really need the circuit to be able to intentionally decide to do something different, not merely make a random selection or hallucinate - otherwise we are just saying that state machines "reason" for the sake of using an anthropomorphic word.


This restriction makes it impossible to determine whether anything is reasoning. An LLM may well make intentional decisions; I have as much evidence for that as I have for anybody else doing so, i.e. zilch. I'm not even sure that I make intentional decisions; I can only say that it feels like I do. But free will isn't really compatible with my model of physical reality.


No, I don’t think “reasoning” should require intent.

I think a Prolog program is something that can be described as reasoning.
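For anyone unfamiliar with what a Prolog program actually does: it derives new conclusions by chaining rules over known facts. A minimal forward-chaining sketch of that style of inference in Python (the facts and rule here are hypothetical illustrations, not from the thread):

```python
# Known facts, as (relation, arg1, arg2) tuples - the Prolog
# equivalent of "parent(tom, bob)." etc.
facts = {("parent", "tom", "bob"), ("parent", "bob", "ann")}

def grandparent_rule(known: set) -> set:
    """parent(X, Y) and parent(Y, Z) => grandparent(X, Z)"""
    parents = [f for f in known if f[0] == "parent"]
    derived = set()
    for (_, x, y1) in parents:
        for (_, y2, z) in parents:
            if y1 == y2:
                derived.add(("grandparent", x, z))
    return derived

# Apply the rule until no new facts appear (a fixpoint),
# which is the core loop of forward-chaining inference.
while True:
    new = grandparent_rule(facts) - facts
    if not new:
        break
    facts |= new

assert ("grandparent", "tom", "ann") in facts
```

Whether this mechanical closure over rules deserves the word "reasoning" is, of course, the point under debate.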


"intentionally decide" is at least as problematic a term as "reason", no?


Of course logic gates apply logical reasoning to solve problems, they are not much use for anything else (except as a space heater if there are a lot of them).


"Reasoning" implies the extrapolation of information - not the mechanical generation of a fixed output based on known inputs. No one would claim that a set of gears is "reasoning" but the logic gate is as fixed in it's output as a transmission.



