• Patterner
    577
    Coincidentally, I just started listening to Anil Seth's Being You: A New Science of Consciousness on my commute. He says exactly what I think.
    Here’s why the zombie idea is supposed to provide an argument against physicalist explanations of consciousness. If you can imagine a zombie, this means you can conceive of a world that is indistinguishable from our world, but in which no consciousness is happening. And if you can conceive of such a world, then consciousness cannot be a physical phenomenon.

    And here’s why it doesn’t work. The zombie argument, like many thought experiments that take aim at physicalism, is a conceivability argument, and conceivability arguments are intrinsically weak. Like many such arguments, it has a plausibility that is inversely related to the amount of knowledge one has.

    Can you imagine an A380 flying backward? Of course you can. Just imagine a large plane in the air, moving backward. Is such a scenario really conceivable? Well, the more you know about aerodynamics and aeronautical engineering, the less conceivable it becomes. In this case, even a minimal knowledge of these topics makes it clear that planes cannot fly backward. It just cannot be done.

    It’s the same with zombies. In one sense it’s trivial to imagine a philosophical zombie. I just picture a version of myself wandering around without having any conscious experiences. But can I really conceive this? What I’m being asked to do, really, is to consider the capabilities and limitations of a vast network of many billions of neurons and gazillions of synapses (the connections between neurons), not to mention glial cells and neurotransmitter gradients and other neurobiological goodies, all wrapped into a body interacting with a world which includes other brains in other bodies. Can I do this? Can anyone do this? I doubt it. Just as with the A380, the more one knows about the brain and its relation to conscious experiences and behavior, the less conceivable a zombie becomes.

    Whether something is conceivable or not is often a psychological observation about the person doing the conceiving, not an insight into the nature of reality. This is the weakness of zombies. We are asked to imagine the unimaginable, and through this act of illusory comprehension, conclusions are drawn about the limits of physicalist explanation.
    — Seth
    Not to worry. I already disagree with him in other ways.
  • AmadeusD
    1.9k
    This doesn't appear to me as an argument against anything but aesthetic implication (it would be weird, no?).

    It doesn't seem to address the fact that the Hard Problem and P-zombies are exactly meant to invoke the gap science is trying to fill.
    Unless you can fully understand consciousness in physical terms (I do not believe this is the case, but even if not, we don't have that understanding yet), then p-zombies are coherent until we do (and that would exclude the possibility). 180 Proof made a similar error earlier in the thread (though it was years ago). "Identical" to a 'conscious being' would be a conscious being. Being "physically identical" is the actual case in the TE.

    But I agree with Seth - it's a very weak argument against Physicalism, for sure. It's just that he assumes he's right:

    is to consider the capabilities and limitations of a vast network of many billions of neurons and gazillions of synapses (the connections between neurons), not to mention glial cells and neurotransmitter gradients and other neurobiological goodies, all wrapped into a body interacting with a world which includes other brains in other bodies. Can I do this? Can anyone do this? I doubt it. — Seth

    This precludes anything but a physicalist account for it to be a decent objection, I think. I also think Seth (among others) overstates the correlation we find between certain parts of the brain and fairly imprecise conscious experience. If the brain is a receiver, nothing here has any real weight on the question(s). But it would certainly rule out an emergent (from neural activity) account of consciousness.
  • RogueAI
    2.5k
    "Can you imagine an A380 flying backward? Of course you can. Just imagine a large plane in the air, moving backward. Is such a scenario really conceivable? Well, the more you know about aerodynamics and aeronautical engineering, the less conceivable it becomes. In this case, even a minimal knowledge of these topics makes it clear that planes cannot fly backward. It just cannot be done."

    I like this. I keep trying to imagine a p-zombie kicking up its feet at the end of a hard day and drinking a couple beers to take the edge off, and I keep not being able to do it. I can, superficially, but when I try to pair my imagining with a being that has no mental states, it's impossible.
  • RogueAI
    2.5k
    Unless you can fully understand consciousness in physical terms (I do not believe this is the case, but even if not, we don't have that understanding yet), then p-zombies are coherent until we do (and that would exclude the possibility).AmadeusD

    What would the history of p-zombie world be? Is there a coherent story that could be told where p-zombies evolved like we did and developed language, like we did? How could p-zombie language have any referents to mental states?
  • AmadeusD
    1.9k
    Yeah, that's a bit of a problem. I wasn't under the impression that would need accounting for, though. I can see the evolution side occurring in roughly the same way it has, but I imagine we are still about 250,000 years ago culture-wise (i.e. <1, near zero) and obviously more like several million years ago in terms of actual behavioural capacities. It's a very different world, no doubt, and would take some serious storytelling to get going.
  • Patterner
    577

    It seems to me that needs accounting for. Why would something that has no subjective experience - something for which there is nothing it is like to be, to itself - ever develop language about these things? If asked "Are you conscious?", why would it say "Yes"?
  • AmadeusD
    1.9k


    Why are we assuming language? That seems a conscious ability, whereas we're talking about physically identical, yet non-conscious entities.
  • Patterner
    577
    Why are we assuming language? That seems a conscious ability, whereas we're talking about physically identical, yet non-conscious entities.AmadeusD
    That's the scenario we're given. P-zombies are supposed to act exactly like us. We would have no way of knowing that they have no consciousness. So they talk. And they answer questions the same ways we do.
  • AmadeusD
    1.9k
    That's the scenario we're given. P-zombies are supposed to act exactly like us. We would have no way of knowing that they have no consciousness. So they talk. And they answer questions the same ways we do.Patterner

    That is not how I've ever understood any version of the TE.

    p-zombies are physically the same, yet unconscious. No idea why we are assuming they're behaving exactly the same? If I've got that wrong, then I've got that wrong.
  • RogueAI
    2.5k
    That is not how I've ever understood any version of the TE.

    p-zombies are physically the same, yet unconscious. No idea why we are assuming they're behaving exactly the same? If I've got that wrong, then I've got that wrong.
    AmadeusD

    They're supposed to act the same as us: talking, fighting, warring, yelling out "Ouch!" when they smash their toe, crying watching Schindler's List, etc. They wouldn't, of course, which is why they're incoherent.
  • AmadeusD
    1.9k
    They wouldn't, of courseRogueAI

    No, they wouldn't, but I don't understand how it could be contended that they're 'supposed' to. So I have no idea where to go with this now :lol:
  • Lionino
    1.5k
    p-zombies are physically the same, yet unconscious. No idea why we are assuming they're behaving exactly the same? If I've got that wrong, then I've got that wrong.AmadeusD

    On the outside they act exactly the same. They show happiness and sadness, but they don't undergo the experience.

    Replace p-zombie with a computer that perfectly simulates human personality. Does the computer feel sadness when it cries? That is basically the question.

    Of course you will say "No! The computer is not biological". Here is where p-zombie comes into play; they have a brain and it works just like yours, they are made of flesh and bones, but they don't feel or think, they just act as though they do.
    For me, the question hinges entirely on mind-body dualism. I think 180 Proof said something similar as well.
    The question is surely related to solipsism as well.
  • AmadeusD
    1.9k
    Fair enough; I guess I've misunderstood the TE. Whoops lol.

    I don't really see those elements as relevant (at least certainly not necessary) to the Hard Problem. For my part, when I consider this TE in the HP context, I imagine a being physically exactly the same as a typical human but without conscious experience (i.e., that's the only difference), meaning there is no sadness or happiness. They do not have the experience required to inform that. It can't be 'shown' without the experience. My job is to figure out the difference between the p-zombie I've described and a human with conscious experience.

    I am under the impression that this requires biting the "consciousness is not emergent from neural activity" bullet hard, but nothing else; it only serves to preclude a fully physicalist account of consciousness, and all the interesting questions about consciousness (what, where from, how, why, etc.) are still in the air.
  • Patterner
    577
    Replace p-zombie with a computer that perfectly simulates human personality. Does the computer feel sadness when it cries? That is basically the question.Lionino
    The difference is that we can program computers to act like us. But there's no reason to think p-zombies would act like us.
  • Lionino
    1.5k
    The difference is that we can program computers to act like us. But there's no reason to think p-zombies would act like us.Patterner

    By your own argument, there is. The p-zombie would be biologically wired to act like us.

    I don't really see those elements as relevant (at least certainly not necessary) to the Hard problemAmadeusD

    Because one of the versions of the hard problem is "When will a collection of physical states C be conscious (chimpanzee) or unconscious (rock)?". If p-zombies are physically possible, there will be no distinguishing criterion for what C will do.

    I am under the impression that this requires biting the "consciousness is not emergent from neural activity" bullet hard, but nothing else - only serves to preclude a fully physicalist account of consciousness, and all the interesting questions are still in the air (what, where from, how, why etc..) about consciousness.AmadeusD

    That seems to be what it implies.
  • Patterner
    577
    The difference is that we can program computers to act like us. But there's no reason to think p-zombies would act like us.
    — Patterner

    By your own argument, there is. The p-zombie would be biologically wired to act like us.
    Lionino
    That's not my argument. That's the premise, which I dispute.
  • Lionino
    1.5k
    That's not my argument. That's the premise, which I dispute.Patterner

    I don't understand what you mean.
    You say the difference is that we can program computers to act like us; a p-zombie could be neurologically programmed to act like us.
    You say "But there's no reason to think p-zombies would act like us.". I presented a reason.
  • Patterner
    577

    My turn to not understand. :grin:
    How would the p-zombies, which do not possess consciousness, come to be programmed to speak and act as though they did?
  • Lionino
    1.5k
    How would the p-zombies, which do not possess consciousness, come to be programmed to speak and act as though they did?Patterner

    If mind-body dualism is true, they would simply have no soul.
    If physicalism is true, I don't see any way; anything with our neurological set-up would be conscious. Unless you come up with a physicalist version of p-zombie where the zombie acts the way we do because the neurological set-up is the same as ours EXCEPT for a certain property X that gives us consciousness. That would be an essentialist view of consciousness within physicalism, but that is pushing the meaning of p-zombie.
  • Patterner
    577

    Right. If physicalism is absolute, then p-zombies - exact physical duplicates of us, down to the smallest detail - without consciousness are not a possibility. Any physical duplicate would be conscious.

    If something like dualism, panpsychism, or whatever other ideas there are is true, and we remove that from an exact duplicate, so there is only the physical and there is no consciousness, then there is no reason they would say "Yes" if asked whether they are conscious, or have words for such concepts in their language.
  • Lionino
    1.5k
    then there is no reason they would say Yes if asked if they are consciousPatterner

    If it is their brain that prompts them to say yes, there would be a reason.
  • Patterner
    577

    If I asked a p-zombie if it was conscious, I would think its brain would prompt it to say something like, "What is 'conscious'?" Why would a computer that had no programming or memory related to consciousness think it was conscious, or come up with the idea on its own? If a p-zombie with no consciousness, nothing but stimulus and response, existed, why would it answer other than the way the computer would?
  • Lionino
    1.5k
    I would think its brain would prompt it to say something like, "What is 'conscious'?"Patterner

    The premise of p-zombies is that they would not ask that. They act exactly the same as us.

    Why would a computer that had no programming or memory related to consciousness think it was conscious, or come up with the idea on its own?Patterner

    If you train an AI on comments talking about things such as feelings and so on, the AI would talk as if it is conscious.
  • Patterner
    577
    The premise of p-zombies is that they would not ask that. They act exactly the same as us.Lionino
    Yes. My position is that the premise is not conceivable. Yes, we can write the words "I conceive of a p-zombie with such-and-such characteristics." But that's just writing words. I can write any outlandish thing I want, but that doesn't make it conceivable.
    A square circle that was shaped like a pyramid and made entirely of chocolate-flavored whipped cream flew into a black hole, lived there for a year, changed its mind, and flew back out.


    If you train an AI on comments talking about things such as feelings and so on, the AI would talk as if it is conscious.Lionino
    Yes. But if you didn't train it that way, why would it? If you didn't train p-zombies that way, why would they?
  • RogueAI
    2.5k
    I would think its brain would prompt it to say something like, "What is 'conscious'?"
    — Patterner

    The premise of p-zombies is that they would not ask that. They act exactly the same as us.
    Lionino

    But if we are asked if we have attribute x, and we don't have it or don't know what x is (e.g., telepathy), we would either say, "no" or "what are you talking about?". We don't (usually) lie and pretend we have x. The P-zombie isn't conscious. In so far as it knows things, it would know it's not conscious. So when asked if it's conscious, you're saying it would lie? If so, the zombies aren't acting like us. If not, then by their own admission they're not conscious.
  • Lionino
    1.5k
    Yes. My position is that the premise is not conceivable. Yes, we can write the words "I conceive of a p-zombie with such-and-such characteristics." But that's just writing words. I can write any outlandish thing I want, but that doesn't make it conceivable.
    A square circle that was shaped like a pyramid and made entirely of chocolate-flavored whipped cream flew into a black hole, lived there for a year, changed its mind, and flew back out.
    Patterner

    Then the burden of proof is on you to prove that p-zombies are as incoherent as square circles or the like.

    Yes. But if you didn't train it that way, why would it? If you didn't train p-zombies that way, why would they?Patterner

    Because the premise is that they behave like us. We humans say "Yes." to "Are you conscious?". So they would as well.

    it would know it's not consciousRogueAI

    From this fragment you can deduce the lack of understanding of the concept of a p-zombie. The zombie does not know anything, does not feel anything, it does not think. It would not go "Huh, I guess I don't know what that thing you are talking about refers to". If asked if it is conscious, it will say "Yes" because that is what we would do.
  • AmadeusD
    1.9k
    Yeah, I'm understanding I've gotten the TE wrong - but I also can't work out why it's the way it is.

    It makes little sense because it's importing all of the requirements of success into the experiment. I don't see how that matters - the issue, surely, is whether or not a physically identical being would be conscious, or not. So, a p-zombie, to me, should be conceived as physically absolutely identical but not conscious. To me, that's the bullet to bite. I don't really grok how one could, or could not, confirm or deny the potential for a being acting fully conscious, yet not being so. Begs the question, surely.
  • Lionino
    1.5k
    Let me add this from Descartes' Discourse on the Method, where he talks about something resembling p-zombies:

    And here I specially stayed to show that, were there such machines exactly resembling in organs and outward form an ape or any other irrational animal, we could have no means of knowing that they were in any respect of a different nature from these animals; but if there were machines bearing the image of our bodies, and capable of imitating our actions as far as it is morally possible, there would still remain two most certain tests whereby to know that they were not therefore really men. Of these the first is that they could never use words or other signs arranged in such a manner as is competent to us in order to declare our thoughts to others: for we may easily conceive a machine to be so constructed that it emits vocables, and even that it emits some correspondent to the action upon it of external objects which cause a change in its organs; for example, if touched in a particular place it may demand what we wish to say to it; if in another it may cry out that it is hurt, and such like; but not that it should arrange them variously so as appositely to reply to what is said in its presence, as men of the lowest grade of intellect can do. The second test is, that although such machines might execute many things with equal or perhaps greater perfection than any of us, they would, without doubt, fail in certain others from which it could be discovered that they did not act from knowledge, but solely from the disposition of their organs: for while reason is an universal instrument that is alike available on every occasion, these organs, on the contrary, need a particular arrangement for each particular action; whence it must be morally impossible that there should exist in any machine a diversity of organs sufficient to enable it to act in all the occurrences of life, in the way in which our reason enables us to act.
  • RogueAI
    2.5k
    The zombie does not know anything, does not feel anything, it does not think.Lionino

    So I'm supposed to think my p-zombie doppelganger will be able to do my job effectively and navigate the world without knowing anything and/or thinking? How would that work, exactly?
  • RogueAI
    2.5k
    If asked if it is conscious, it will say "Yes" because that is what we would do.Lionino

    Can a p-zombie lie?

Welcome to The Philosophy Forum!
