Search

  • Solution to the hard problem of consciousness

    For me, the hard problem of consciousness is about feelings. Feelings are physical pains and pleasures, and emotions, though when I say emotions, I only mean the experience of feeling a certain way, not anything wider, such as 'a preparation for action'.

    My preferred definition of consciousness is subjective experience. The unemotional content of subjective experience includes awareness of the environment, self-awareness, and all sorts of thoughts, but no emotional content. I am quite happy to follow Dennett as far as the unemotional content of subjective experience is concerned: that is just what being a certain kind of information processing system is like, and there is nothing more to explain. But I do not believe that feelings can emerge from pure information processing. I think that information processing can explain an 'emotional zombie' which behaves identically to a human, is conscious, but has no feelings. There is something it is like to be an emotional zombie, but (as I've heard David Chalmers say) it might be boring.

    Here are a couple of funny-peculiar things about how humans think and feel about feelings and consciousness.

    1. In science fiction, there are many aliens and robots who are very like us but who have little or no feelings (or are they really so flat inside? read or watch more to find out!). Whether an emotional zombie can really exist or not, we seem to be very keen on imagining that they can. It is much rarer to find an alien or robot which has stronger or richer or more varied feelings than we do. (Maybe Marvin in HHGG counts.) We're quite happy imagining aliens and robots that are smarter than us or morally superior to us, but bigger hearts? stronger passions? Nah, we don't want to go there.

    2. A thought experiment that Chalmers (among others) likes is the one where little bits of your brain are replaced by computer chips or whatever, which perform the same information processing as the parts they replace. As this process continues, will the 'light of consciousness' remain unchanged? slowly dim? continue for a while and then suddenly blink out when some critical threshold is crossed? It is the unasked question that interests me: will the light of consciousness get brighter?

    For me, the fundamental question is: How does anything ever feel anything at all?
  • Solution to the hard problem of consciousness

    For me, the hard problem of consciousness is about feelings. Feelings are physical pains and pleasures, and emotions, though when I say emotions, I only mean the experience of feeling a certain way, not anything wider, such as 'a preparation for action'.

    My preferred definition of consciousness is subjective experience. The unemotional content of subjective experience includes awareness of the environment, self-awareness, and all sorts of thoughts, but no emotional content. I am quite happy to follow Dennett as far as the unemotional content of subjective experience is concerned: that is just what being a certain kind of information processing system is like, and there is nothing more to explain.
    GrahamJ

    Could you say more about why you distinguish emotions from the other aspects of experience?

    Could you give some examples of thoughts with no emotional content?
  • Solution to the hard problem of consciousness

    Could you say more about why you distinguish emotions from the other aspects of experience?

    Could you give some examples of thoughts with no emotional content?
    Daemon

    This is basically an answer to your first question, which maybe makes an answer to the second uninteresting.

    I am a mathematician and programmer. I've worked in AI and with biologists. I think that science (mainly computer science, maths, AI) already has the ingredients with which to explain non-emotional subjective experience. We don't yet know how to put the ingredients together, but I don't think that it is mysterious, just a huge amount of work. It seems like we will one day be able to make very intelligent self-aware machines with thoughts and behaviour quite like ours. It seems that self-awareness, thoughts and behaviour are made of complex information processing, and we have a lot of ideas about how we might implement these.

    However, we really have no clue about emotions. There is no theory about how to go from information processing to feelings. There seems to be no need for feelings to exist in order to produce thoughts and behaviour. Perhaps emotions will just emerge somehow, but there is no current explanation for how this could happen.

    As far as the hard problem is concerned, the area of AI known as reinforcement learning is, in my opinion, the most relevant.

    Reinforcement learning (RL) is an area of machine learning concerned with how intelligent agents ought to take actions in an environment in order to maximize the notion of cumulative reward.
    Wikipedia

    The purpose of reinforcement learning is for the agent to learn an optimal, or nearly-optimal, policy that maximizes the "reward function" or other user-provided reinforcement signal that accumulates from the immediate rewards. This is similar to processes that appear to occur in animal psychology. For example, biological brains are hardwired to interpret signals such as pain and hunger as negative reinforcements, and interpret pleasure and food intake as positive reinforcements.
    Wikipedia

    I am quoting these to show that something (the reward function) is used to perform the function that pain and pleasure appear to perform in brains. It is absolutely fundamental to RL that there is something that acts like feelings, but it is just a series of numbers coming from the environment; it is just information like everything else in the system (see the sketch at the end of this post).

    I am not trying to separate thoughts from feelings in brains (or programs). I am saying that we can, in principle, explain thoughts using science-as-is, but not feelings.
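
    To make the 'reward is just a number' point concrete, here is a minimal sketch of tabular Q-learning in Python. It is an illustration only, not anything from the quoted posts: the toy environment, the reward values and the hyperparameters are all invented for the example. What matters is that the reward the agent learns to maximize arrives as a plain float from step(), with no feeling attached.

    ```python
    import random

    # Hypothetical toy environment (invented for this sketch): states 0..4 on a
    # line; moves off either end are clamped; reaching state 4 pays reward 1.0.
    N_STATES = 5
    ACTIONS = [-1, +1]          # step left or step right

    def step(state, action):
        """Return (next_state, reward). The 'reward' is nothing but a float."""
        next_state = min(max(state + action, 0), N_STATES - 1)
        reward = 1.0 if next_state == N_STATES - 1 else 0.0
        return next_state, reward

    # Q-table: estimated cumulative reward for each (state, action) pair.
    Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
    alpha, gamma, epsilon = 0.1, 0.9, 0.1   # learning rate, discount, exploration

    for episode in range(500):
        state = 0
        while state != N_STATES - 1:
            if random.random() < epsilon:
                # Occasionally explore at random.
                action = random.choice(ACTIONS)
            else:
                # Otherwise act greedily, breaking ties at random.
                best = max(Q[(state, a)] for a in ACTIONS)
                action = random.choice([a for a in ACTIONS if Q[(state, a)] == best])
            next_state, reward = step(state, action)
            # The Q-learning update treats the reward as a number to be maximized.
            best_next = max(Q[(next_state, a)] for a in ACTIONS)
            Q[(state, action)] += alpha * (reward + gamma * best_next - Q[(state, action)])
            state = next_state

    # Print the learned values; the 'policy' is just whichever number is bigger.
    for s in range(N_STATES):
        print(s, {a: round(Q[(s, a)], 3) for a in ACTIONS})
    ```

    Running it prints a small table of learned values per state; the agent's 'preference' for moving right is readable off those numbers, and nothing in the program is anything other than arithmetic on them.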
  • The hard problem of consciousness and physicalism

    There are a few lengthy threads all about qualia and the hard problem already. Most of the issues have been answered (but not resolved) in these. Some of these threads were very active in recent weeks.
  • The 'hard problem of consciousness'.


    'First person point of view' is potentially just as innocent as 'conscious experience': think of a novel written from the first-person point of view.

    Perhaps the subject is dull. It's been a long time since I was myself a mysterion, trying to build a world out of qualia.

    Still, the hard problem is hyped as a profundity, and it seems to serve mostly as propaganda for irrationalism.
  • The 'hard problem of consciousness'.


    I imagine you imagining this world somehow transformed so that ceteris paribus (somehow) there is no more electric charge. So then brains as we know them don't work, etc. Fair enough.

    But those who defend a radically immaterial 'private' I-know-not-what could suggest that charge-less mass could indeed be Conscious. The more the mysterions require an organic brain for 'conscious experience' and exclude calculators from it, the more they demonstrate the parasitism of the sacred concept on our mental-and-physical-entangled ordinary life. In other words, saying that an organic brain is necessary for consciousness already 'defeats' or transgresses the hard problem and starts to explain-constrain-articulate consciousness, in terms of stuff we can all see. The true or consistent hardproblemer is or should be worried about stepping on cobblestones.
  • The 'hard problem of consciousness'.

    But those who defend a radically immaterial 'private' I-know-not-what could suggest that charge-less mass could indeed be Conscious. The more the mysterions require an organic brain for 'conscious experience' and exclude calculators from it, the more they demonstrate the parasitism of the sacred concept on our mental-and-physical-entangled ordinary life. In other words, saying that an organic brain is necessary for consciousness already 'defeats' or transgresses the hard problem and starts to explain-constrain-articulate consciousness, in terms of stuff we can all see. The true or consistent hardproblemer is or should be worried about stepping on cobblestones.
    ajar

    If they think that in a massless and chargeless matter world the conscious can exist... It would be an empty conscious(ness). Do they see the conscious as separate from matter? I mean, is it tied to it, or can it escape, like the soul leaving the body at death?
  • The 'hard problem of consciousness'.

    The Paradox of The Hard Problem of Consciousness

    1. If consciousness is purely subjective, the word "consciousness" becomes meaningless as a referent (re Wittgenstein's private language argument)

    Yet

    2. We talk of consciousness as if there's a referent for the word "consciousness" and we know what that is, objectively speaking.

    Perhaps what we mean by "consciousness" is actually physical correlates (walking, talking, etc.) - a set of such outward signs defining what consciousness is. In that case, p-zombies are impossible; anything that behaves the way I do (assuming I'm conscious) is conscious, and nonphysicalists are in trouble. Language is social (vide Wittgenstein). Turing?
  • The 'hard problem of consciousness'.

    And how is a fictional first-person point of view an innocent ignorant assemblage of words? That we can all understand what a novel, fiction, written in the first-person point of view entails directly contradicts your affirmation.
    javra

    I think you misunderstand me. I said that the first-person-POV makes solid sense in a literary context. 'Innocent' is synonymous with acceptable in what I wrote above.

    The main idea is that the questionable metaphysical extension (rarefied to absurdity) is parasitic upon the typical worldly use and context in which the hard problem vanishes. In fact, we make judgements about 'conscious experience' using the 'physical' all the time. Any meaning that 'conscious experience' can have for us depends precisely on the stuff that science can handle.
  • The 'hard problem of consciousness'.

    From my POV, respectfully, you have not demonstrated an understanding of my point. That may be my fault, for not finding the right words. My point is not about consciousness denial at all, but only about the phoniness of the hard problem. This point does involve a denial of the utility or intelligibility of a certain metaphysical use of 'consciousness' or 'qualia.' In short, I think folks often don't know that they don't know what they're talking about.
  • The 'hard problem of consciousness'.

    From my POV, respectfully, you have not demonstrated an understanding of my point. That may be my fault, for not finding the right words. My point is not about consciousness denial at all, but only about the phoniness of the hard problem (which can be understood as a denial of the utility or intelligibility of a certain metaphysical use of 'consciousness' or 'qualia.')
    ajar

    Then why oh why reply to me this way? I.e., what was it in my initial post to you that you disagree with?

    But I guess like I previously said, never mind.
  • How to answer the "because evolution" response to hard problem?

    The "hard problem" in philosophy simply doesn't exist for cognitive neuroscience. From an old thread: https://thephilosophyforum.com/discussion/comment/511358
  • How to answer the "because evolution" response to hard problem?

    There is an easy answer to your excellent question. The evolution approach to the hard problem in consciousness states the obvious. Consciousness evolved. Right. We know that. But did it evolve in order to propagate genes or memes, as Dawkinsians put it? Is it a sign of fitness?

    The view on evolution is based on dogma. All books Dawkins wrote, all approaches to features of life, be it a dream, ethics, sex, even death, are based on

    The CENTRAL DOGMA of molecular biology

    Dawkins is the modern preacherman of the New Dogma.
  • How to answer the "because evolution" response to hard problem?

    But when we realize that the equations describe composition relations between stuffs, then it becomes clear that the existence of stuffs is not only natural but also necessary for the existence of any relations.
    litewave

    This sounds like it’s leading toward a kind of panpsychism in the vein of Chalmers: all matter is composed of stuffs just as the psyche is composed of felt stuffs. But this elevating of stuffs to the position of fundamental basis of matter reifies rather than dissolves the hard problem. There are no such things as stuffs, either in the form of subjective qualia or objective matter. Stuff is a derivative abstraction that has convenient uses in the sciences.
  • How to answer the "because evolution" response to hard problem?

    How does one actually get the point across why this is not an acceptable answer as far as the hard problem is concerned?
    schopenhauer1

    As something different from the answers already provided, that biological evolution has taken place in no way specifies what does, and does not, have consciousness. First off, we know we have consciousness because we experientially know we are conscious (and not because biological evolution tells us so). Secondly, we infer that we acquired the specific forms of our consciousness via evolution. To which I say of course. But how can evolution explain if nematodes (which have a nervous system) have, or don’t have, consciousness? The same question can be asked of any other non-human lifeform, amoeba included. Note: all I mean by “consciousness” here is “firsthand experience”.

    The occurrence of evolution no more explains the occurrence of consciousness than does the occurrence of change: as in, consciousness occurs because change occurs. Which is to say, it holds no satisfactory explanations regarding the matter. Because it does not explain what does, and does not, have consciousness, it does not explain why consciousness is nor how consciousness comes to be wherever it does.
  • How to answer the "because evolution" response to hard problem?

    Nothing I've written claims or implies that "animals (are) non-sentient machines".
    180 Proof

    It is part of the "hard problem" whether animals have sentience or not. So much would be easier if they didn't have any. And if human political opponents didn't have sentience either, then you could dispose of them without a guilty conscience.
  • How to answer the "because evolution" response to hard problem?

    I suppose this is a textbook case of ignoratio elenchi (missing the point).

    The hard problem of consciousness is, in question form: can science (ever) explain consciousness?

    "Because evolution created it."

    :roll:
  • How to answer the "because evolution" response to hard problem?

    Why create a sign post at the point of the hard problem by labeling it as too-unexpected, and demanding of special explanation?
    Bird-Up

    Can you explain this sentence a bit more?
  • How to answer the "because evolution" response to hard problem?

    The hard problem only exists as long as you declare the experience of consciousness to be unnecessary.
    Bird-Up

    That is confusing. I would think it is the other way around.
  • How to answer the "because evolution" response to hard problem?

    But why that special designation of "conscious"? Couldn't I just say: "My body has nerves."

    What is our motive behind creating the superfluous "conscious" label?
    Bird-Up

    I agree with that. The idea of the "hard problem" just makes a fetish out of consciousness.
  • How to answer the "because evolution" response to hard problem?

    Consciousness is just... awareness.
    Jackson

    It would work just as well to call it the hard problem of awareness.
  • How to answer the "because evolution" response to hard problem?

    Wrong. The hard problem exposes the fetish of physicalists with their naive realism and dualists with their inability to explain how two opposing substances can interact.
    Harry Hindu

    I am correct. Good day!
  • How to answer the "because evolution" response to hard problem?

    The hard problem is resolved by a monistic view that information or process is fundamental - not matter and/or mind.
    Harry Hindu

    This does not explain why information-processing organic matter has feelings and information-processing inorganic matter does not.
  • How to answer the "because evolution" response to hard problem?

    Do you believe that the hard problem does exist, and that it isn't being addressed properly?
    Bird-Up

    Consciousness came about through evolution. That doesn't explain why consciousness is the same thing as neural/biological activities.
  • The “hard problem” of suffering

    There are two ways to dismiss Chalmers’s hard problem.
    Joshs

    That's a gross oversimplification; as if the only choice were between Dennett and Heidegger.
  • The “hard problem” of suffering

    If Chalmers’s hard problem of consciousness does not exist, then there is no difference between a living human body suffering and a computer built to imitate all happenings and behaviours of suffering.
    Angelo Cannata

    Why?

    This is too fast a step. It is not obvious that the notion of "what it is like" consciousness is coherent, nor that it is impossible for a sufficiently complex artificial organism of some sort to suffer; and to claim that we have "no evidence that somebody is suffering inside a body showing alarm signs of suffering" is reprehensible - of course you can see when someone is suffering.
  • The “hard problem” of suffering

    Well, that's the wrong question, right? And scientific (explanatory), not philosophical (descriptive, interpretive)?
    180 Proof

    That's the hard problem.
  • The Hard Problem of Consciousness & the Fundamental Abstraction

    On page 6 you ask:

    Does the Hard Problem reflect a failure of the reductive paradigm?

    and answer in the affirmative:

    Reductionism assumes that to know the parts is, implicitly, to know the whole, but Aristotle showed in Topics IV, 13 that the whole is not the sum of its parts, for building materials are not a house.

    How well do we know the parts? Although a heap of building materials is not self-organizing, matter might be. If so, then to have sufficient knowledge of the parts is to know the ways in which they can form higher orders of organization, including organisms that are conscious.

    While I agree with the importance of understanding things as natural wholes, this leaves open the question of how these wholes come to be. It is one thing to argue that there has always been something; it is quite another to argue that there have always been wholes such as human beings.

    Declaring the failure of reductionism seems premature. Explanations of why "you can't get there from here" are common and occur before it becomes clear how to get there from here.
  • The Hard Problem of Consciousness & the Fundamental Abstraction

    :up:

    However,
    The hard problem really boils down to "What is it like to be another conscious being?"
    Philosophim

    this doesn't seem quite correct. One could argue that someone with MPD experiences just that. But, you might say, they shift from one to the other, only ever being in one state at a time. I would even question that assertion. I have actually had a meditative experience in which I was able to be myself and another simultaneously for a few brief moments. No, I'm not crazy. As a matter of fact, whenever we talk to ourselves we indulge very slightly in this experience. But I'm setting up a strawman here, so I'll leave it.
  • The Hard Problem of Consciousness & the Fundamental Abstraction

    The hard problem really boils down to "What is it like to be another conscious being?"
    — Philosophim

    this doesn't seem quite correct.
    jgill

    +1. That is indeed not the point of the argument. The point of the argument about 'what it is like to be...' is to convey the fact of being a subject of experience. 'Being a subject of experience' is not something that can be captured in any objective description. So depicting it in terms of 'what it is like to be someone else' plainly misses the point of the argument.

    @Dfpolis - I've read most of the article. As I too am generally critical of physicalism and reductionism, I'm onside with your general approach ('the enemy of the enemy is my friend' ;-) ) - although there are a few specific points with which I will take issue, when I've spent a bit more time digesting it.
