• SteveKlinko
    395
    The hard problem introduces an additional problem that, in my view, does not exist. When Red-Neurons are firing in X, the conscious experience of Red happens in X (X experiences a quale).
    Belter
    You make this statement while saying there is no Problem. Here's the Problem. Given:

    1) Red-Neurons are firing in X
    2) Conscious experience of Red happens in X

    How does 2 happen when 1 happens? I think your main argument is that the Hard Problem does not exist because of an improper use of language when asking this question. You might be correct, but you have not explained what exactly the problem with the question is.

    For now I see the question as a huge Problem. It is the Hard Problem of Consciousness. It is the Explanatory Gap on full display. Nobody knows how this works.
  • Belter
    89
    How does 2 happen when 1 happens?
    SteveKlinko

    In my view, the question is badly formulated. It is a scientific question, like the "how" of the knife cutting the onion: it simply "cuts" it, separating it into different parts. If you keep asking after you assume that 2 happens because of 1, you will only obtain biological details: "How do people think with the brain?" is answered with "By circuits, cores, modules, for the different competences, faculties, etc.". But even though we do not yet have an advanced theory of mind (neuroscience is very young), that does not mean it is a problem other than a psychological one.
  • SteveKlinko
    395
    In my view, the question is badly formulated. It is a scientific question, like the "how" of the knife cutting the onion: it simply "cuts" it, separating it into different parts. If you keep asking after you assume that 2 happens because of 1, you will only obtain biological details: "How do people think with the brain?" is answered with "By circuits, cores, modules, for the different competences, faculties, etc.". But even though we do not yet have an advanced theory of mind (neuroscience is very young), that does not mean it is a problem other than a psychological one.
    Belter
    I guess we will have to disagree on the Hard Problem. I have given it my best shot and have obviously failed to convince you.
  • Belter
    89


    I agree in general. It is possible that the qualia of other persons, but also our own, are differentiated by actions, like when you say that you are wearing fur to know if you are dreaming. A priori, we do not know whether we are dreaming or not. Even from the subjective point of view, qualia must be "inferred" from actions (including verbal actions), so they are not directly accessed. This leads us to a skeptical conclusion about the "immediacy" of our knowledge of consciousness.
  • tom
    1.5k
    If you program the robot to view colors, why do you think that it does not have "qualia" like those of humans, whom evolution programmed for that? For me it is an unwarranted assumption.
    Belter

    So, if you attach a camera to your PC, the PC has qualia? Do you really think so?
  • Belter
    89
    So, if you attach a camera to your PC, the PC has qualia? Do you really think so?
    tom

    You are the one who is seeing the PC, not the other way around. It seems to me very trivial. That is, the camera in the example is like a human's glasses: you extend, increase, etc., your vision through it, but the subject in the example is the one who is using the camera.
  • Belter
    89
    I have given it my best shot and have obviously failed to convince you.
    SteveKlinko

    I think that a constructive analysis of the Hard Problem could be the following. It is not possible to differentiate a priori, for a given subject, whether he is experiencing a dream or hallucination, or instead reality. Then, knowing whether something is a quale (a mental state known as a mental state, as opposed to unconscious mental states, which are not qualia but just mind) is always a posteriori, by empirical evidence; take the trivial example of rubbing your eyes or pinching yourself. A conscious experience is one stated as being either about reality or about fiction. Consciousness is the ability to differentiate mental and physical experiences. We cannot know a priori whether X is a physical red object or a mental red one. The problem is that 1) is about the mind, and 2) is about consciousness. But consciousness is by definition the ability to differentiate fiction and reality, so its scientific study may not be possible, because it is at the same time the instrument and the object of knowledge.
  • SteveKlinko
    395
    We cannot know a priori whether X is a physical red object or a mental red one.
    Belter
    But you never see the actual Physical Red object even when you are looking right at it. You are always only Seeing the Mental, or Conscious, Surrogate of the Physical Red object. The Redness of the Red exists only in the Conscious World, or, as I like to say, it exists only in Conscious Space.
  • TheMadFool
    13.8k
    In a video game, different levels, different bosses.
  • Anthony
    197
    Fish, lizards and robots all use some kind of "brain", in the sense of a material system for thinking. That mind happens without some form of brain is for me not conceivable.
    Belter

    Robots have a brain? I realize you're thinking in terms of functionalism. Still, robots just don't have brains. If they have circuits, wires and sensors, then honesty would have us call them circuits, wires and sensors. A rose is a rose is a rose.

    Wouldn't there be some sort of mind that forms between two different species when they communicate?
  • MetaphysicsNow
    311
    "How people think with the brain?"
    This is a strange question - does it really make sense to suppose that we think with our brains?
    I wash with soap.
    I wave with my hand.
    I laugh with my friend.
    I accept a gift with gratitude.
    I finish with a flourish.

    In all these different cases of doing one thing with another, it makes sense to think of the doing in the absence of the thing that it is being done with. So, I guess I can think without my brain? Or perhaps I could think with someone else's brain, and not my own? If you do not think either scenario is possible, then the preposition "with" is not capturing what you take to be the connection between thinking and the brain.
  • Belter
    89
    Or perhaps I could think with someone else's brain, and not my own?
    MetaphysicsNow

    I am an instrumentalist, functionalist, pragmatist, etc.: the brain is the evolutionary solution to the advantage that thinking confers. The individual is the one who thinks, and the brain is the instrument, as with the knife and cutting onions.
  • MetaphysicsNow
    311
    The terms "instrumentalism", "functionalism", "pragmatism" mean different things in different contexts and in some contexts are not even compatible with each other. I presume you mean you subscribe to something like the following argument:
    1) Mental states and occurrences are defined by their functional roles.
    2) The functional roles so defined are filled by states of and occurrences in the brain (well, let's be honest, you'll need more than just a brain to fill some of these functional roles, the rest of the body will probably have to get a look-in).
    3) Therefore, mental states and occurrences are brain (bodily) states and occurrences.

    This is pretty much classic functional state identity theory - is that what you believe? If so, the knife/cutting onions analogy does not work in this context, since the activity of cutting onions - even if it can be defined in terms of some functional role - requires more than a knife to fill.
  • Belter
    89
    The terms "instrumentalism", "functionalism", "pragmatism" mean different things in different contexts and in some contexts are not even compatible with each other.MetaphysicsNow

    And they can also mean the same thing, as in the context of my previous answer.
    I explained my view in a previous post, so I will not repeat it. Basically, I can be considered some kind of zombie-argument hunter.
  • MetaphysicsNow
    311
    You can hunt the philosophical-zombie argument without being a functionalist, pragmatist or instrumentalist. The philosophical-zombie argument is just one of a range of different problems thrown at functional state identity theory. Some of what you say in this thread indicates to me that you do subscribe to that theory; some of what you say seems unclear. Hence I asked you the direct question to get clear on exactly what it is you believe. So, I'll try again: do you believe that the argument I just gave is sound (i.e. is logically valid and has true premises)?
  • Belter
    89
    1) Mental states and occurrences are defined by their functional roles.
    2) The functional roles so defined are filled by states of and occurrences in the brain (well, let's be honest, you'll need more than just a brain to fill some of these functional roles, the rest of the body will probably have to get a look-in).
    3) Therefore, mental states and occurrences are brain (bodily) states and occurrences.
    MetaphysicsNow

    I think that it is not mental states that are defined by their functional roles, but brain states. That is, in 1) you must change "mental" to "brain". Then 1) and 2) state the same thing, so 3) is actually "mental states are brain functions".
  • Belter
    89


    I think, as I posted at the beginning, that there are 1) individuals (sociological level), who 2) use their brains (biological level) 3) to think (psychological level).
    Basically, we explain 1) by 3), and 3) by 2).
  • MetaphysicsNow
    311
    I think that it is not mental states that are defined by their functional roles, but brain states.

    If I change "Mental" to "Brain" in premise 1, then for the argument to remain valid it has to have as its conclusion:
    3) Therefore, brain states and occurrences are brain states and occurrences.

    That would certainly be a logically valid argument, but then with (3) as a conclusion, I could substitute anything for (1) and (2) and still have a valid argument, since (3) is a vacuous tautology.
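    A minimal LaTeX sketch of the substitution point in this exchange; reading "defined by their functional roles" as a biconditional with a role-filling predicate Φ is an interpretive assumption, not something stated by either poster:

    \documentclass{article}
    \usepackage{amsmath,amssymb}
    \begin{document}
    % Original argument as given in the thread (premises 1-2, conclusion 3),
    % under the assumed biconditional reading of "defined by".
    \begin{align*}
    1)\;& \forall x\,\bigl(\mathrm{Mental}(x) \leftrightarrow \mathrm{Fills}(x,\Phi)\bigr)\\
    2)\;& \forall x\,\bigl(\mathrm{Fills}(x,\Phi) \rightarrow \mathrm{Brain}(x)\bigr)\\
    3)\;& \therefore\ \forall x\,\bigl(\mathrm{Mental}(x) \rightarrow \mathrm{Brain}(x)\bigr)
    \end{align*}
    % With "Brain" substituted for "Mental" in premise 1, the conclusion that
    % keeps the argument valid collapses into a tautology.
    \begin{align*}
    1')\;& \forall x\,\bigl(\mathrm{Brain}(x) \leftrightarrow \mathrm{Fills}(x,\Phi)\bigr)\\
    2)\;& \forall x\,\bigl(\mathrm{Fills}(x,\Phi) \rightarrow \mathrm{Brain}(x)\bigr)\\
    3')\;& \therefore\ \forall x\,\bigl(\mathrm{Brain}(x) \rightarrow \mathrm{Brain}(x)\bigr) \quad \text{(vacuously true)}
    \end{align*}
    \end{document}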
  • Belter
    89

    I am trying to show that your premise 1) is not true, so your argument is lacking (or it is a vacuous tautology).
  • Belter
    89


    Since I do not endorse that mental states have a function of brain states, but rather that they are a function of brain states, this argument about identity does not work for me. In my view, the relation between mind and individual is ontological: that is, subject X performs action Y, and then we say, for example, that X does Y because of thought T. Then, T is something that X also does. However, between brain and individual the same does not happen. The brain is the instrument by which X thinks T, so between T and brain state B there is an action/instrument relation, like cutting and the knife (but not an actor/action relation, as for X and T).
  • MetaphysicsNow
    311
    Since I do not endorse that mental states have a function of brain states
    I'm sorry, but I'm having difficulties with the way you are expressing yourself.
    Do you mean that, although you believe that there are mental states, you do not believe that they have a function?
  • Belter
    89


    I want to say that mental states fit brain functions, and mental states also have functions (to predict movement, differentiate objects, etc.), but these are at the individual level. The individual level is often neglected in accounts of the mind-brain problem. We can account for this problem in a basic way by positing three levels and pragmatism: there are actors, actions, instruments, mechanisms, etc. The Hard Problem dissolves if you consider that it is not possible for an instrument working within a mechanism (the brain producing consciousness) not to realize the corresponding function or action (it is not possible to think of Chalmers's zombies, C-fibers firing without pain being felt, etc., just as it is not possible to think of a knife cutting without something being cut).
  • MetaphysicsNow
    311
    So you do think that mental states have functions. So what exactly is your problem with the first premise of the argument for functional state identity theory?
    I want to say that mental states fit brain functions
    I'm not sure what you mean by "fit" here - do you mean "are", do you mean "are caused by", or something else entirely?