
As a weak epiphenomenalist, I'd argue that all conscious thought is a backpropagation. Is it controversial that the brain is able to perceive its own output, or am I misunderstanding how backpropagation in neural networks is implemented?


That's not what backpropagation is. Backpropagation is best thought of as a 'cheat' (an algorithmic simplification) that lets you efficiently calculate the derivative of a feed-forward neural net. You need that derivative to optimize the net against some cost function, using, for example, gradient descent; computing it naively, without backpropagation, is computationally costly.
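
To make that concrete, here is a minimal sketch of backpropagation for a tiny feed-forward net: one hidden layer, tanh activation, squared-error loss, optimized with a single gradient-descent step. All names and shapes are illustrative assumptions, not from any particular library; the point is just that the backward pass reuses the forward-pass intermediates via the chain rule rather than recomputing anything.

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(size=(3,))          # input
    y = rng.normal(size=(2,))          # target
    W1 = rng.normal(size=(4, 3))       # input -> hidden weights
    W2 = rng.normal(size=(2, 4))       # hidden -> output weights

    # Forward pass
    h_pre = W1 @ x                     # hidden pre-activation
    h = np.tanh(h_pre)                 # hidden activation
    y_hat = W2 @ h                     # network output
    loss = 0.5 * np.sum((y_hat - y) ** 2)

    # Backward pass: apply the chain rule layer by layer,
    # reusing the stored forward-pass values (this reuse is the "cheat")
    d_y_hat = y_hat - y                          # dLoss/dy_hat
    dW2 = np.outer(d_y_hat, h)                   # dLoss/dW2
    d_h = W2.T @ d_y_hat                         # dLoss/dh
    d_h_pre = d_h * (1.0 - np.tanh(h_pre) ** 2)  # dLoss/dh_pre
    dW1 = np.outer(d_h_pre, x)                   # dLoss/dW1

    # One gradient-descent step
    lr = 0.1
    W1 -= lr * dW1
    W2 -= lr * dW2

Note that this only works cleanly because the net is a directed acyclic graph: there is a well-defined "forward" order to run and then reverse.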

For some neural nets, you still have a gradient, but the concept of backward or forward propagation is not definable. Based on the topology and structure of biological neural nets, what would you think is the case?



