Hacker News

I think it's deeply depressing that AI has been sold as something even capable of modelling anything humans do; and quite depressing that this comment exists.

"AI" is just taking `mean()` over our choice of encodings of our choice of measurements of our selection of things we've created.

There is as much "likeness to humans" in the patterns of tree bark.

AI is an embarrassingly dumb procedure, incapable of the most basic homology with anything any animal has ever done; us especially.

We are embedded in our environments, on which we act, and which act on us. In doing so we physically grow, mould our structure and that of our environment, and develop sensory-motor conceptualisations of the world. Everything we do, every act of the imagination or of movement of our limbs, is preconditioned-on and symptomatic-of our profound understanding of the world and how we are in it.

The idea that `mean(424,34324,223123,3424,....)` even has any relevance to us at all is quite absurd. The idea that such a thing might sound pleasant through a speaker, irrelevant.

This is a product of I don't know what. On the optimist side, a cultish desire to see Science produce a new utopia. On the pessimist side, a likewise delusional desire to see Humans as dumb machines.

What a sad state!



I lack your confidence, and find it a bit religious.

> The idea that `mean(424,34324,223123,3424,....)` even has any revelance to us at all is quite absurd.

Most of what I say to anyone is exactly this.

When I'm about to give anyone any information, I look back at all of the relevant past information that I can recall (through word and sensory association, not by logic, unless I have a recollection of an associated internal or external dialog that also used logical rules.) I multiply those by strength of recollection and similarity of situation (e.g. can I create a metaphor for the current situation from the recalled one?). I take the mean, then I share it, along with caveats about the aforementioned strength of recollection and similarity of situation.

This is what it feels like I actually do. Any of these steps can be either taken consciously or by reflex. It's not hidden.
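The process described above — recall relevant items, weight them by strength of recollection and similarity of situation, then average — can be sketched as a weighted mean. This is a minimal illustration of that description, not a model of cognition; all names and numbers are invented for the example.

```python
# Hypothetical sketch of the "weighted recall" described above:
# each recalled item carries a strength-of-recollection weight and a
# situation-similarity weight; the answer shared is their weighted mean.

def weighted_recall(memories):
    """memories: list of (value, recollection_strength, situation_similarity)."""
    weights = [strength * similarity for _, strength, similarity in memories]
    total = sum(weights)
    if total == 0:
        return None  # nothing relevant comes to mind
    return sum(value * w for (value, _, _), w in zip(memories, weights)) / total

# Three recalled data points with differing confidence: the strongly
# recalled, highly similar memory dominates the result.
memories = [(10.0, 0.9, 0.8), (20.0, 0.5, 0.5), (30.0, 0.1, 0.2)]
print(weighted_recall(memories))  # close to 10.0, the best-recalled value
```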

> I think it's deeply depressing that AI has been sold as something even capable of modelling anything humans do

This is a bizarre position. All computers ever do is model things that humans do. A computer is a receptacle for human will, one that continues to apply that will after the human is removed. They are a way of crystallizing will so that you can sustain it with things (like electricity) other than the particular combination of air, water, food, space, pressure, temperature, etc. that is a person. An overflow drain is a computer that models human will. An automatic switch/regulator is the basic electrical model of human will, and a computer is just a bunch of those stitched together in a complementary way.


You're an animal. You've no idea what you do, and you're using machines as a model. Likewise, in the 16th C. it was brass cogs; and in ancient Greece, air/fire/etc.

You're no more made of clay & god's breath than you are of sand and electricity.

You're an oozing, growing, malleable organic organism being physiologically and dynamically shaped by your sensory-motor oozing. You're a mystery to yourself, and these self-reports, heavily coloured by the in-vogue tech, are not science; they're pseudoscience.

If you want to study how animals work, you'd need to study that. Not these impoverished metaphors that mystify both machines and men. No machine has ever acquired a concept through sensory-motor action, nor used one to imagine, nor thereby planned its actions. No machine is ever at play, nor has grown its muscles to be better at-play. No machine has, therefore, learned to play the piano. No machine has thought about food, because no machine has been hungry; no machine has cared, nor been motivated to care by a harsh environment.

An inorganic mechanism is nothing at all like an animal, and an algorithm over a discrete sequence of numbers with electronic semantics, is nothing like tissue development.

What you are doing is not something you can introspect. And you aren't really doing that. Rather, you've learned a "way of speaking" about machine action and are back-projecting that onto yourself. In this way, you're obliterating 95% of the things you are.


This isn't really responsive. Not only am I not using machines as any sort of model for human behavior, I'm trying to think about weird things you could do to a machine to make it ape a human.

> these self-reports, heavily coloured by the in-vogue tech are not science, they're pseudoscience.

I simply don't know what you're referring to. If you're referring to retrieving memories through associations, there are mountains of empirical evidence for that. If you're referring to wondering whether I remember things, and being unsure of the information I'm recalling when my recall is weaker, or wondering whether past situations compare well to current ones, well, you got me. It's my personal belief that conscious thought is an epiphenomenon that rationalizes decisions already made.

But the rest of this is nonsense. Vivid imagery is not an argument for exceptionalism, no matter how much I say things drip or ooze. This is just association in action. You're trying to create a distinction for life (or rather what you recognize as life): life oozes and has viscera, so using a bunch of words that feel wet and organy can substitute for reason contra the robots.



