• Marchesk
    4.6k
    Harder is emphasized to make it clear this is an additional problem of consciousness presented by Ned Block's paper in 2003, which you can read here:

    http://ruccs.rutgers.edu/images/personal-zenon-pylyshyn/class-info/Consciousness_2014/Block_HarderQuestions.pdf

    The Partially Examined Life's most recent podcast discusses Block's argument and also David Papineau’s paper on the possibility of a science of consciousness. It's a good discussion of physicalism, functionalism and phenomenalism in the context of the harder problem.

    https://partiallyexaminedlife.com/2019/07/01/ep219-block-papineau/

    To summarize, the harder problem is that human phenomenal concepts do not reveal whether our material makeup or the functional role our neurobiology plays is responsible for consciousness. As such, we have no philosophical justification for saying whether a functional isomorph made up of different material such as the android Data from Star Trek is conscious. Even more confusing, we have no way of telling whether a "mere" functional isomorph is conscious, where "mere" means functional in terms of human folk psychology only, and not in the actual neural functions.

    So if Data's positronic brain functions differently from our own brain tissue, but still produces reports and behaviors based on things like beliefs, desires and phenomenal experience, we have neither a physical nor a functional basis for deciding whether he is actually conscious or just simulating it. This of course brings up the Turing Test and Searle's biological position, which, one should note, is the same position identity theorists endorse when they say consciousness is identical to certain brain states.

    This applies to other possible physical systems, such as Block's Chinese brain, where a billion Chinese citizens with radios and flags implement the functional role of conscious brain states. And if one bites the functionalist bullet on this and says that such a system would be conscious, then in all likelihood, countries like India are already performing enough of those roles to be conscious.

    At which point I would jump off the functionalist bandwagon. Jaron Lanier has made this point with meteor showers: if you interpret the data from a meteor shower the right way, it implements a brain simulation and is therefore conscious according to functionalists. This would mean the universe is full of all manner of conscious systems. So this would be similar to full-blown mereological universalism, where any conceivable combination of matter is an object, even ones that make no sense to us. Except in this case, it would be consciousness run amok, even when we have no reason to suspect India or a meteor shower has conscious experiences.

    While listening to the podcast (I'm paraphrasing and adding a few of my thoughts above), I couldn't help but feel dissatisfaction with the implications of any position one takes regarding the harder problem. But it should be noted that the harder problem exists only if:

    A. One endorses physicalism, which Block thinks is the starting point for Naturalism.
    B. One is a realist about phenomenal experience.

    Papineau argues that phenomenal concepts are too vague in certain ways for science to pin down an explanation. Thus, science cannot tell us whether Data or other animals are conscious, or if so, how similar or different their phenomenal experiences are to our own.

    https://sas-space.sas.ac.uk/885/1/D_Papineau_Science..pdf

    That's the harder problem. I should also note that Data would likely not pass the Turing Test, as he has certain idiosyncrasies that would probably tip humans off that he's a machine if the test were sufficiently thorough. Data also has certain abilities that humans lack, or has them at superhuman levels. He also lacks emotion (until he gets an emotion chip, which he can normally disable at will) and lacks sensations such as pain, cold or pleasure (except the time the Borg Queen grafted skin onto his arm).

    So Data is certainly not a perfect isomorph of a human. That's why it was a big deal when Q made him laugh at the end of one episode, and Data couldn't explain what was so funny after he stopped laughing.
  • T Clark
    13k
    To summarize, the harder problem is that human phenomenal concepts do not reveal whether our material makeup or the functional role our neurobiology plays is responsible for consciousness. As such, we have no philosophical justification for saying whether a functional isomorph made up of different material such as the android Data from Star Trek is conscious. Even more confusing, we have no way of telling whether a "mere" functional isomorph is conscious, where "mere" means functional in terms of human folk psychology only, and not in the actual neural functions.Marchesk

    Simulating consciousness is consciousness. Consciousness is a behavioral feature, not a physiological or neurological one. The mind is not the brain. I say that even though I believe that what we call mental states result from physical, electrical, and chemical reactions in the brain and the rest of the nervous system. When you watch a basketball game on television, you don't typically say that the game or the images and sounds we perceive on the screen are the same as the television. That would be silly. It's just as silly to say the mind is the brain. Our minds are the shows our brains are playing.

    This applies to other possible physical systems, such as Block's Chinese brain, where a billion Chinese citizens with radios and flags implement the functional role of conscious brain states. And if one bites the functionalist bullet on this and says that such a system would be conscious, then in all likelihood, countries like India are already performing enough of those roles to be conscious.Marchesk

    Hmmm... Well... I guess I could go along with the Chinese brain being conscious, although I doubt all the people and all the radios and all the flags in the world could make up a system as complex as the brain. Also, the Chinese brain computer would be so slow that all the people would die before the simplest mental process could be created. I'm guessing it couldn't simulate a mind even in a minimal way in the time between the big bang and the heat death of the universe.

    As for India being conscious, it's something, but I wouldn't say it's conscious unless you'd also say that Adam Smith's invisible hand of the market is conscious. Which I wouldn't.

    I should also note that Data would likely not pass the Turing Test as he has certain idiosyncrasies that would probably tip humans off that he's a machine, if the test were sufficiently thorough.Marchesk

    Data could certainly pass the Turing test if he wanted to.
  • Terrapin Station
    13.8k
    From the Block paper:

    "T. H. Huxley famously said ‘How it is that anything so remarkable as a state of consciousness comes about as a result of irritating nervous tissue' . . ."

    Hey, nervous tissue isn't that annoying. It's just trying to make friends.
  • Terrapin Station
    13.8k
    The hard problem of the hard problem of consciousness is that there's no good analysis of what explanations are, including (i) what makes something count as an explanation versus not, and (ii) just what the relationship is between an explanation and what it's explaining. The ridiculous problem of the hard problem of the hard problem is that no one seems to care about this at all.
  • Terrapin Station
    13.8k
    I shouldn't just comment on this a bit at a time, I suppose, but that's what I'm doing as I go through the Block paper first:

    "The Hard Problem is one of explaining why the neural basis of a phenomenal quality is the neural basis of that phenomenal quality rather than another phenomenal quality or no phenomenal quality at all. In other terms, there is an explanatory gap between the neural basis of a phenomenal quality and the phenomenal quality itself. "

    Re what I said about explanations above, we could just as well say:

    "A hard problem is explaining, for any explanation of any property, why the (observationally-)claimed basis of a property is the basis of that property rather than another property or no property (at least of x-type) at all. In other terms, there is an explanatory gap between the observational basis of a property and the property itself."

    ==========================================================================

    "The claim that Q is identical to corticothalamic oscillation is just as puzzling—maybe more puzzling—than the claim that the physical basis of Q is corticothalamic oscillation."

    The distinction he's making there isn't clear to me.

    "How could one property be both subjective and objective?"

    It's not, actually, since mental phenomena are subjective, period. If corticothalamic oscillation is a mental phenomenon--which it is if it's identical to Q--then it's subjective.

    But what he's asking is at the heart of the "explanatory gap": he's asking about corticothalamic oscillation not "seeming like" Q when one is observing another's corticothalamic oscillation, whereas it "seems like" Q when it's one's own corticothalamic oscillation. That's because our mentality is simply the properties of things like corticothalamic oscillation from the perspective of being the corticothalamic oscillation in question.

    There's a similar problem in all explanations, since they're always from some perspective, some "point of reference," and there are no perspectiveless perspectives or point-of-reference-free points of reference. Any phenomenon or property/set of properties p is different from different perspectives/reference points; in particular, they'll be different from the perspective/reference point of being the substances/dynamic relations in question than they are from various removed-from-identity observational perspectives/points of reference (which are all different from each other).
  • Devans99
    2.7k
    I think it could be that we and Data are both machines of similar complexity, so we would both share the same level of consciousness. Considering a simpler machine, a computer (specifically the operating system), it has parallels to a human:

    - It is linked to and responds to peripherals (it 'senses', so to speak)
    - It manages multiple tasks simultaneously
    - It is constantly aware (well, aware at least once per time slice, at least 60 times a second)
    - It has an active train of thought: the current process (on a uniprocessor)

    A human has a better ability to deal with unexpected sensory input. If an operating system reads data off disk in an unexpected format, it responds with a predictable error. If a human senses something he/she has not sensed before, there is (usually) a creative response that shows adaptation and improvisation. This probably just indicates that humans have much more complex logic than any current computer.
  • Terrapin Station
    13.8k
    Anyway, re the "harder problem," it's not something we need to account for--it's just something that we're unsure of--whether and when something that's different materially but similar functionally (and/or structurally) would be conscious.

    My stance is one of cautious skepticism, basically. I think we need to show good reasons for why functionalism/substratum independence might be true before accepting it. Although I think functionalism is worth pursuing practically, with lessened skepticism, simply because it might produce useful technology.
  • Terrapin Station
    13.8k
    Re this, by the way:

    "Putnam, Fodor, and Block and Fodor argued that if functionalism about the mind is true, physicalism is false. The line of argument assumes that functional organizations are multiply realizable. The state of adding 2 cannot be identical to an electronic state if a nonelectronic device (e.g., a brain) can add 2."

    What Putnam et al are arguing is false. The state of adding 2 would be identical to the electronic state for the electronic device as it adds 2, and it would be identical to the brain state as the brain (as someone mentally) adds 2.

    "Adding 2" is not identical in both instances, obviously. And it's not identical in two instances of a calculator (or two calculators) adding two, either.
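    The multiple-realizability point can be made concrete with a toy sketch (assuming nothing beyond plain Python; both function names are invented): two mechanisms that agree on every input while sharing no internal states.

```python
# Two realizations of "adding two numbers": the same function,
# implemented by very different mechanisms.

def add_native(a, b):
    return a + b  # one step, via the hardware adder

def add_by_counting(a, b):
    # Peano-style: repeatedly take the successor of a, b times
    result = a
    for _ in range(b):
        result = result + 1
    return result

# Functionally indistinguishable on every input tested...
for a in range(10):
    for b in range(10):
        assert add_native(a, b) == add_by_counting(a, b)

# ...yet the two "states of adding 2" are not one identical state:
# they differ in mechanism, steps taken, and time used.
print(add_native(2, 2), add_by_counting(2, 2))  # 4 4
```

    On the nominalist reading above, we call both "adding" because their input-output profiles match, not because there is one identical state both devices are in.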
  • schopenhauer1
    9.9k
    Searle's biological positionMarchesk

    Can you reiterate this for me? What makes neurons special as a carrier of chemical messengers, sodium/potassium gates, and so on? Further, what is it about billions of these chemical messenger carriers packed together in a skull with peripheral sensory organs like the eye with its optic nerve, and somatic nerves in the skin, and so on?

    What I thought was a funny conclusion from many of these philosophies is that neurons themselves seem to have a sort of magical quality. If one does not bite the bullet on PANpsychism, one bites the bullet on NEUROpsychism. In other words, the "Cartesian theater", the "hidden dualism", and the "ghost in the machine" (or whatever nifty term you want to use) gets put into the equation at SOME point. It just depends on exactly where in the equation you want to put it.
  • Marchesk
    4.6k
    Can you reiterate this for me? What makes neurons special as a carrier of chemical messengers, sodium/potassium gates, and so on?schopenhauer1

    Searle just says that in the case of humans, we know we're conscious, so it must be tied to our biology, since we don't have any other explanation.
  • Marchesk
    4.6k
    Simulating consciousness is consciousness. Consciousness is a behavioral feature, not a physiological or neurological one.T Clark

    It's not behavior. I can pretend to be in pain or feel sad. I can also hide my pain (within reason) or sadness. When you dream at night, your body is usually paralyzed so you don't move around in response to your dreams. You can sit perfectly still and meditate.

    And there are many times we really don't know what someone else is thinking or feeling from their behavior.

    Also, we can fake behavior up to a point mechanically and with computers. Siri sometimes tells me, "Brrrr, it's 20 degrees, cold outside." I have no reason to suppose my phone feels cold. It's just programmed to say that for certain temperature ranges.

    Data could certainly pass the Turing test if he wanted to.T Clark

    On the show, Data is always puzzled by some feature of common human behavior. Maybe he could convince someone he's autistic, except that he can perform calculations and recite facts at a superhuman level if asked, and he usually does so unless told not to.

    Now his brother Lore could pass. He's a good liar. It helps being evil.
  • Marchesk
    4.6k
    What Putnam et al are arguing is false. The state of adding 2 would be identical to the electronic state for the electronic device as it adds 2, and it would be identical to the brain state as the brain (as someone mentally) adds 2.Terrapin Station

    I don't think we're doing exactly the same thing as a calculator/computer when we add two, because two has an important conceptual component for us. It's an abstract concept that stands in for any set of two things. That's why we have debates over platonism.

    Also, we learn the rules of arithmetic and memorize basic results, which I doubt very much is performing the same function as a CPU making the calculation. Maybe computing a complicated sum with pen and paper is functionally the same?

    However, I think the argument is that functionalism is a kind of dualism, because it's something additional to the physical substrate.
  • Relativist
    2.1k
    To summarize, the harder problem is that human phenomenal concepts do not reveal whether our material makeup or the functional role our neurobiology plays is responsible for consciousness. As such, we have no philosophical justification for saying whether a functional isomorph made up of different material such as the android Data from Star Trek is conscious. Even more confusing, we have no way of telling whether a "mere" functional isomorph is conscious, where "mere" means functional in terms of human folk psychology only, and not in the actual neural functions.Marchesk
    IMO, the difficulty is due to the vagueness of the term "consciousness". Broadly speaking, we can consider Data, my cat, and myself to each possess a sort of functional capability analogous to the others' - and we could label this functional capability "consciousness." Or we could adopt a narrow view of consciousness that could only possibly apply to humans (and possibly not even all HUMANS!).

    Conscious thoughts do not arise in isolation, they arise in a complex context. The context includes sensory perceptions (consider the processing of the visual cortex, which automagically produces a conscious "visual image" by processing reflected light), subconscious "knowledge" (consider reflex reactions that are triggered by past experience), preconscious hard-wiring (consider the physical aspects of sexual stimulation), and bodily functions (feelings of pain, hunger, etc). Also consider pattern recognition - perceiving sameness in a variety of objects, this depends on hard-wiring in the brain, unconscious learning, and conscious learning (think through the mental processes involved with reading and interpreting the words on your computer screen right now).

    My cat and Data each embody very different contexts, and therefore their respective "consciousnesses" (broadly defined) will necessarily be very different from my own. My cat's consciousness will be closer to mine in some ways (those relating to bodily functions, perhaps), while Data's consciousness will be closer to mine with regard to intellectual processing (rational thought). I see or smell desirable food, and my body reacts (my mouth waters, I suddenly become conscious of hunger) - and it appears to me that my cat has some pretty similar reactions. Data does not, because he's not wired that way. On the other hand, I see a reference to Pi (3.14159...) and this triggers my learned concept of circle - and I'll assume Data has a similar thought.

    It's clear that neither my cat nor Data can be said to have a human consciousness - the contexts are too different. And yet they each have some functional similarities to human consciousness. On the other hand, I can't accept Block's Chinese brain as having anything analogous - the context is too different.
  • Marchesk
    4.6k
    That's a good point: Data does not eat, so he lacks anything functional related to hunger or digestion. Note that Odo from Deep Space 9 also does not eat, yet he's biological. We would have even more reason for thinking Odo is conscious, but we wouldn't know what it's like to shape-shift or to link together with other Changelings.

    Odo clearly has feelings and his people have their own biases and wage war in response to past mistreatment. Now the Borg would be a very interesting compromise between Data and biology.
  • T Clark
    13k
    It's not behavior. I can pretend to be in pain or feel sad. I can also hide my pain (within reason) or sadness. When you dream at night, your body is usually paralyzed so you don't move around in response to your dreams. You can sit perfectly still and meditate.Marchesk

    Let's talk about consciousness in others rather than in ourselves just for the moment. We'll come back to our experience of our own consciousness later. How do we know someone is conscious? As far as we know, we can't experience their internal experience directly, so we have to use an outward sign, i.e., their behavior. That's true for any mental state.

    Language - speech, writing, signing, by whatever method - is behavior. Do we agree on that? And language is the primary way, not the only way, we can evaluate another being's consciousness. There are other methods; for example, some scientists have tried to determine whether a non-human animal is conscious by seeing if the animal can recognize itself in a mirror. I don't know whether or not I buy that, but it's an interesting way of thinking about it.

    Are dreams and meditative states consciousness? I don't think I think they are. Or I think I don't think they are. In my experience, becoming consciously aware of dreams is something that happens in memory after I wake up. As for meditation, maybe it makes sense to think of it as awareness without consciousness. I'm not sure about that.

    Now, back to our internal experience of consciousness. For me, and, as I understand it, others, the essence of the experience is internal speech. Talking to ourselves. Another essential aspect is that it allows us to stand back and observe ourselves objectively, as if from the outside, just the way we observe others. We judge ourselves conscious just as we judge others - based on our behavior.
  • Terrapin Station
    13.8k
    I don't think we're doing exactly the same thing as a calculator/computer when we add two,Marchesk

    Hence why I said, "'Adding 2' is not identical in both instances, obviously."

    However, I think the argument is that functionalism is a kind of dualism, because it's something additional to the physical substrate.Marchesk

    That would work maybe if the functionalist is positing multiple instances of something identical, so that they'd have to be realists on universals/types. But we could have a nominalist sort of functionalism, where we're calling x and y "F," even though it's not literally two identical instances of F.
  • Marchesk
    4.6k
    Now, back to our internal experience of consciousness. For me, and, as I understand it, others, the essence of the experience is internal speech. Talking to ourselves. Another essential aspect is that it allows us to stand back and observe ourselves objectively, as if from the outside, just the way we observe others. We judge ourselves conscious just as we judge others - based on our behavior.T Clark

    Here is where we fundamentally disagree. Inner dialog is just one more form of conscious experience. And it's not necessary to experience color, sound, pain in perception, memory, imagination, etc.

    If inner dialog were all there was to conscious experience, it would still present a hard problem. Also, not everyone has inner dialog. See Temple Grandin and visual thinking.

    I judge myself to be conscious because I am conscious, not because I behave as if I am. I judge other people on behavior AND biology, because I don't experience what they do, but I have no reason for supposing they would be lacking, since they're human beings like me.
  • Marchesk
    4.6k
    Are dreams and meditative states consciousness? I don't think I think they are. Or I think I don't think they are. In my experience, becoming consciously aware of dreams is something that happens in memory after I wake upT Clark

    You've never had a lucid dream?
  • Marchesk
    4.6k
    That would work maybe if the functionalist is positing multiple instances of something identical, so that they'd have to be realists on universals/types. But we could have a nominalist sort of functionalism, where we're calling x and y "F," even though it's not literally two identical instances of F.Terrapin Station

    That might work, but would you extend that to different computers performing addition?
  • Relativist
    2.1k
    Inner dialog is just one more form of conscious experience. And it's not necessary to experience color, sound, pain in perception, memory, imagination, etc.Marchesk
    I agree with this, but would like to clarify that inner dialog is one aspect of HUMAN consciousness. Non-human animals (and non-verbal humans) probably don't have an inner dialog, but they arguably experience qualia.

    I argue that we should use a comprehensive definition of consciousness that admits a wide set of mental behavior. If we get too specific, we become overly human-chauvinistic.
  • Terrapin Station
    13.8k
    That might work, but would you extend that to different computers performing addition?Marchesk

    Yes. Again, I said this in the earlier post. The full quote was: "'Adding 2' is not identical in both instances, obviously. And it's not identical in two instances of a calculator (or two calculators) adding two, either. "
  • Marchesk
    4.6k
    I agree with this, but would like to clarify that inner dialog is one aspect of HUMAN consciousness. Non-human animals (and non-verbal humans) probably don't have an inner dialog, but they arguably experience qualia.Relativist

    Probably not, but they might have an inner visual sequence or smell or sonar, which aids their thinking like inner dialog does ours.

    I argue that we should use a comprehensive definition of consciousness that admits a wide set of mental behavior. If we get too specific, we become overly human-chauvinistic.Relativist

    Yes, something philosophers are sometimes prone to. Over-relying on vision when making epistemological and ontological arguments, for example.
  • Marchesk
    4.6k
    Yes. Again, I said this in the earlier post. The full quote was: "'Adding 2' is not identical in both instances, obviously. And it's not identical in two instances of a calculator (or two calculators) adding two, either. "Terrapin Station

    I don't know that I can agree with that. How would they be functionally different in such a simple case? You're saying that there can never be an exact duplicate of a function across different physical substrates.
  • T Clark
    13k
    On the show, Data is always puzzled by some feature of common human behavior. Maybe he could convince someone he's autistic, except that he can perform calculations and recite facts at a superhuman level if asked, and he usually does so unless told not to.Marchesk

    There are lots of people who are "puzzled by some feature of common human behavior." Someone blind from birth might have trouble speaking coherently about visual experience, which they've never had. There are people who are unable to empathetically understand the emotional experience of others. Would we say these people are not conscious?

    Here is where we fundamentally disagree. Inner dialog is just one more form of conscious experience. And it's not necessary to experience color, sound, pain in perception, memory, imagination, etc.Marchesk

    It is not necessary to consciously experience color, sound, pain in perception, memory, imagination, etc. It's obviously possible to experience these without being consciously aware. Animals and babies do it all the time. Actually, so do we all. It's just that we have something else added on top of that.

    I judge myself to be conscious because I am conscious, not because I behave as if I am. I judge other people on behavior AND biology, because I don't experience what they do, but I have no reason for supposing they would be lacking.Marchesk

    Again, in my experience and those of others, the essence of consciousness is internal dialog. As for Temple Grandin - I know who she is but I haven't read extensively. Of course there is non-verbal, including visual, awareness. I have visual awareness without being conscious of it. Most of my internal life is non-conscious. I contend that that's true for most, if not all, people. What does Grandin say about awareness vs. consciousness?
  • Marchesk
    4.6k
    What does Grandin say about awareness vs. consciousness?T Clark

    I don't know, I just recall reading that she claims to think in pictures and translate those to language when communicating, and she suspects animals also think in pictures. She compared her visualization capabilities to a Holodeck on Star Trek.

    Also, what was interesting is that when she thinks of a roof, she thinks of the set of all roofs she's ever seen, and not some abstract roof concept. Therefore, particulars and not universals, with the ability to translate to universals for the purpose of communication.
  • Marchesk
    4.6k
    It is not necessary to consciously experience color, sound, pain in perception, memory, imagination, etc. It's obviously possible to experience these without being consciously aware.T Clark

    I would say we aren't experiencing anything when we're not conscious. We're p-zombies in that regard. Experience is consciousness.
  • Marchesk
    4.6k
    It is not necessary to consciously experience color, sound, pain in perception, memory, imagination, etc.T Clark

    Color, sound and pain only exist as consciousness. Otherwise, they become labels for something biological or physical. The world is not colored in. It doesn't look like anything, except to conscious viewers. It also doesn't feel like anything.

    A p-zombie universe has no color. It's only a label for the ability to discriminate wavelengths of visible light, since there is no experience of color in that universe.

    But I don't fully endorse the p-zombie argument. I think there couldn't be any phenomenal concepts in a p-zombie universe. Color wouldn't exist as a word. Nor would pain.
  • T Clark
    13k
    When you dream at night, usually your body is paralyzed so you don't move around in response to your dreams. You can sit perfectly still and meditate.Marchesk

    You've never had a lucid dream?Marchesk

    From Wikipedia - A lucid dream is a dream during which the dreamer is aware that they are dreaming. During a lucid dream, the dreamer may gain some amount of control over the dream characters, narrative, and environment; however, this is not actually necessary for a dream to be described as lucid.

    I don't remember ever having this kind of experience. I don't know how it fits in with our discussion.
  • Marchesk
    4.6k
    I don't remember ever having this kind of experience. I don't know how it fits in with our discussionT Clark

    You didn't think that dreams were experienced, only remembered. Well, I've had lucid dreams a few times. They are conscious experiences as much as perception is.
  • T Clark
    13k
    I would say we aren't experiencing anything when we're not conscious. We're p-zombies in that regard. Experience is consciousness.Marchesk

    Really? When a baby cries for food, it's not because it is experiencing hunger? When a dog is injured, it doesn't experience pain and fear? Dogs and babies don't experience anything? That seems like a pretty radical claim to me.
  • Marchesk
    4.6k
    Really? When a baby cries for food, it's not because it is experiencing hunger? When a dog is injured, it doesn't experience pain and fear? Dogs and babies don't experience anything? That seems like a pretty radical claim to me.T Clark

    Which is not one I would make. Why wouldn't they be conscious?

    Part of the problem here is that experience can mean behavior as well as consciousness, and I would rather restrict experience to consciousness, otherwise it's easy to slip between the two, resulting in arguing past one another in these debates.

    If you want to get down to it, a rock "experiences" the sun from a physical or informational point of view, but that's not at all what we mean when we say a baby experiences hunger.
