
> How would you convince an intelligent being that could not feel pain that it exists at all?

Assuming the being is "reasonable" (in this context meaning it's willing to accept that there exist concepts it may not understand or directly experience, and is willing to trust us), we could just point out the chemical and electrical phenomena correlated with pain and say that they cause a certain kind of feeling of discomfort. We would be in trouble if this being also cannot feel general discomfort, but you're probably bound to hit a wall in understanding at some point anyway when communicating with an entity whose capacity for experience is wildly different from ours.

> The existence of pain falls outside the boundaries of scientific inquiry, I agree. But are you saying that it therefore doesn't exist?

Actually, my point about pain was that it does exist, precisely because it can be examined and explained within a scientific framework. If we couldn't do that, then we could say that for all practical purposes, as far as science is concerned, "pain doesn't exist".

The same is true for consciousness. What's difficult about it is that it's not a thing: there's no hormone for consciousness, no brain centre where it's localized. Rather, it's a process and a product, both phylogenetic and ontogenetic, so it's a lot harder to capture and identify, to put it "under the microscope". It's not some secret sauce behind intelligence; it's a consequence of intelligence. And the most important part of the process is the dynamics of language acquisition (at least when we're speaking of conscious experience in Homo sapiens).

I could go into the details, but I'm afraid my posts would explode in length. At the moment I don't have time to dig up good online material on this, and I'm under the impression that the theory eventually got derailed into developing practical aspects of child cognitive development, verbal learning, etc., and away from the hard, meaty implications we're discussing here, so I'm reluctant to even attempt that rabbit hole. But those implications are explicitly there (the books I mentioned discuss the issue at length).

Interestingly, about 10 years ago I was doing some work on word-meaning and symbol-grounding development, and I was both glad and frustrated to see the computer-modelling literature in this area full of operationally defined concepts from the theory, while people seemed unaware that this work had already been treated in depth at the theoretical level: there were no references to it then. I'm not sure if anything has changed; I've since moved on to other things.

For example, take the Talking Heads model[1][2]. It's not about consciousness per se, and the authors never reference the socio-cultural theory of cognitive development (a horrible name in this day and age, since it tends to evoke associations with post-modern drivel, but nothing could be further from the truth). Still, it can give you a good idea of some of the dynamics explored in the theory, because what happens in the TH model is (in broad strokes) exactly what the S-C theory describes happening externally during language acquisition.
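To give a flavour of those dynamics, here's a toy sketch (my own, not the Talking Heads code, and a drastic simplification of it) of a Steels-style naming game: agents take turns as speaker and hearer, the speaker names a random object with its highest-scoring word (inventing one if it has none), and on a failed game the hearer adopts the speaker's word. A shared vocabulary emerges from nothing but these pairwise interactions:

```python
import random

def naming_game(n_agents=20, n_objects=5, n_rounds=20000, seed=0):
    """Toy Steels-style naming game: a shared lexicon emerges bottom-up."""
    rng = random.Random(seed)
    # Each agent maps object -> {word: score}
    agents = [{obj: {} for obj in range(n_objects)} for _ in range(n_agents)]

    def best_word(vocab):
        return max(vocab, key=vocab.get) if vocab else None

    for _ in range(n_rounds):
        speaker, hearer = rng.sample(range(n_agents), 2)
        obj = rng.randrange(n_objects)
        word = best_word(agents[speaker][obj])
        if word is None:
            # Speaker has no word for this object yet: invent one.
            word = f"w{rng.randrange(10**6)}"
            agents[speaker][obj][word] = 0.0
        if word in agents[hearer][obj]:
            # Success: both reinforce the word and drop competing words.
            agents[speaker][obj] = {word: agents[speaker][obj][word] + 1}
            agents[hearer][obj] = {word: agents[hearer][obj][word] + 1}
        else:
            # Failure: the hearer adopts the speaker's word.
            agents[hearer][obj][word] = 0.0
    return agents

agents = naming_game()
for obj in range(5):
    # After enough rounds, agents typically agree on one word per object.
    words = {max(a[obj], key=a[obj].get) for a in agents}
    print(obj, words)
```

No agent is told which word is "right"; consensus is a product of the population's history of interactions, which is the kind of external-to-internal dynamic the S-C theory is concerned with.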

As for the philosophical zombie problem, I'd like to retract what I said about it being nonsense. Actually, it's very useful in showing why worrying about the subjective sensation of consciousness is completely useless in AI, much like asking how many angels can dance on the tip of a needle. On a very related note, I'd add: people severely underestimate the significance of the Turing test.

[1] http://staff.science.uva.nl/~gideon/Steels_Tutorial_PART2.pd...

[2] http://scholar.google.com/scholar?hl=en&q=talking+heads+expe...



> Actually, my point about pain was that it does exist, precisely because it can be examined and explained within a scientific framework.

The physical processes of pain (i.e., the electricity) can be observed scientifically, but the "sensation" of pain (to use your word from before) cannot. Yet it is the "sensation" of pain that gives it its moral significance; otherwise inflicting pain would be no different, morally, from flipping the switch on an electrical circuit.

> The same is true for consciousness. What's difficult about it is that it's not a thing, there's no hormone for consciousness, there's no brain centre where it's localized, rather it's a process and a product both phylogenic and ontogenic

I can only conclude that you mean something different than I do when you say "consciousness." To me the sensation of pain is a subset of consciousness. It's the difference between electricity "falling in the middle of the forest" so to speak and electricity that causes some sentient being to feel discomfort.

> Actually, it's very useful in showing why worrying about subjective sensation of consciousness is completely useless in AI

Sure it's useless to AI. To AI the zombie problem doesn't matter, because the goal is to produce intelligence, not sentience. But it's useful in a conversation about what sentience and consciousness mean.

If we created intelligence that could pass the Turing Test against anybody, it would be basically impossible to know if it experiences sentience in the way that all of us individually know that we do. But that is the essence of the zombie problem. Where does sentience come from? We have no idea.

Actually I take it back; the zombie problem will be extremely useful to AI the moment a computer can pass the Turing Test, because that's when it will matter whether we can "kill" it or not.


> The physical processes of pain (ie. the electricity) can be observed scientifically, but the "sensation" of pain (to use your word from before) cannot.

You state this as though it's a given, but it's not. You're assuming Dualism. So, of course you end up with Dualism.

> But it is the "sensation" of pain that gives it its moral significance, otherwise inflicting pain would be no different morally than flipping on the switch to an electrical circuit.

This is a silly over-simplification. Complexity matters. The patterns of electro-chemical reactions that occur when I inflict pain on another human cause that human to emote in a way I can relate to, because of the electro-chemical reactions that have been happening in me and those around me since before my birth. So what?

It's in no way comparable to flipping a light switch, except in the largely irrelevant detail that electricity was part of each system.

The fact that an incredibly complex system consisting of individuals, language, and society should yield different results from three pieces of metal and some current shouldn't be the least bit surprising, and is not a reasonable argument for dualism, or p-zombies.

Here's my take on the p-zombie "problem". We can say all kinds of shit, but it doesn't have to make sense. For example I can say "This table is also an electron". That's a sentence. It evokes some kind of imagery, but it's utter nonsense. It doesn't point out some deep mystery about tables or electrons. It's just nonsense.


> You state this as though it's a given, but it's not. You're assuming Dualism.

No. Dualism is the idea that our minds are non-physical. I say minds are fully physical, and all thinking happens in the physical realm. But somehow the results of this thinking are perceived and sensed by a self-aware being as "self" in a way that other physical processes are not.

> The patterns of electro-chemical reactions that occur when I inflict pain on another human cause that human to emote in a way that I can relate to because of the electro-chemical reactions that have been happening in me and those around me since before my birth.

Exactly. You are extrapolating by analogy that other people experience pain the same way you do, because you cannot experience their pain directly in the way that they do. But this reasoning by analogy is just an assumption. And it certainly offers no insight into why you are self-aware while a computer (a very different but still complex electrical system) is not (we assume).



