• Caldwell
    1.3k
    You can get walking from legs but not legs from walking. Wtf. "Why is it a one-way street?"
    — 180 Proof
    :starstruck: lol
  • RogueAI
    2.4k
    Suspicion confirmed. I'm not claiming there's a possible world where consciousness can arise from rocks.

    If there's no possible world where consciousness arises from rocks, then it is impossible for consciousness to arise from rocks. That is to say, no matter what you do with rocks, no matter if the rock-based system is functionally identical to a working conscious brain, if you believe there's no possible world where consciousness can arise from rocks, you CANNOT get consciousness from rocks, no matter what.

    I happen to agree. Is that your claim, though? Because now I'm going to ask you: why isn't a system of rocks that's functionally equivalent to a working brain conscious? What's stopping it from becoming conscious?

    This is tough for the materialist, because on the one hand, if you say, "consciousness from rocks is possible", you open yourself up to a reductio ad absurdum and a bunch of questions about how on Earth you can get consciousness from rocks. But if you say that consciousness from rocks is impossible (as you now seem to do), you're making a claim that some substrates won't produce consciousness. So, which substrates besides rocks are off limits, and how do you know this? But you have to make a claim one way or the other: either consciousness from rocks is possible or impossible. Which is it?
  • RogueAI
    2.4k
    That's correct. Are octopuses conscious? Does that question involve whether computers are conscious or not? No. So the question is not about computers (although a perfectly good example).
    — Kenosha Kid

    Is it possible for computers to be conscious? If yes, how would you verify whether a specific computer is conscious or not? If computer consciousness is impossible, why is it impossible?
  • RogueAI
    2.4k
    And my point was that you don't need a scientific description of consciousness to tell whether something is conscious or not. In order for a scientist to discover scientifically what water is, yes, she needs a definition of water. If she doesn't know what water is, she can't tell you what's in the glass. Even if she knows what water looks like, she needs to be able to differentiate it from alcohol, or any other transparent liquid. As it happens, you don't need to know _much_ about water to be able to distinguish it perfectly well from not-water (its appearance, fluidity, taste, lack of smell). This is the extent to which the definition of consciousness also needs to be precise: to distinguish it from unconscious things.

    You're arguing my point: you don't need to know _much_ about consciousness to be able to distinguish it perfectly well from non-consciousness. We don't need a rigorous definition of consciousness to determine whether that computer that just passed the Turing Test is conscious or not. We don't need to "know much" about consciousness to pose that question. Our basic understanding of consciousness is sufficient to make sense of the question: is that computer conscious or not? Just like we don't need to know much about water to measure how much is in the glass.

    Agreed?
  • Kenosha Kid
    3.2k
    You're arguing my point: you don't need to know _much_ about consciousness to be able to distinguish it perfectly well from non-consciousness. We don't need a rigorous definition of consciousness to determine whether that computer that just passed the Turing Test is conscious or not. We don't need to "know much" about consciousness to pose that question. Our basic understanding of consciousness is sufficient to make sense of the question: is that computer conscious or not? Just like we don't need to know much about water to measure how much is in the glass.

    Agreed?
    RogueAI

    Absolutely not. We have no common "basic understanding" of consciousness. On this site alone you'll find a new one for every thread on the subject.
  • RogueAI
    2.4k
    Absolutely not. We have no common "basic understanding" of consciousness. On this site alone you'll find a new one for every thread on the subject.
    — Kenosha Kid

    Are you conscious? Is your significant other(s) conscious? To not draw this out, I'll answer for you: yes, and yes.

    Now, did we need a precise definition of consciousness to answer those questions? No. Did those questions and answers make sense to you and me? Yes. I know what you mean when you say you're conscious and vice-versa.

    We all have a basic understanding of consciousness. Claiming otherwise is absurd. The materialist "game" is often to retreat to language difficulties when the going gets tough (you'll notice I never once talked about qualia). You're doing that here.

    There are also some outstanding questions you haven't answered:
    - Is it possible to get consciousness from rocks, yes/no?
    - Is it possible to simulate consciousness, yes/no?
    - Is consciousness substrate independent, yes/no?
  • Kenosha Kid
    3.2k
    Are you conscious? Is your significant other(s) conscious? To not draw this out, I'll answer for you: yes, and yes.
    — RogueAI

    This is precisely what I was talking about before. That sort of wishy-washy 'well, I know what I mean' way of communicating is no good for answering questions about consciousness in a scientific way. It's not useful for me to note that I am conscious when trying to determine if a dolphin, an octopus or a guppy is conscious. What is required is not an obvious, possibly extreme example of a conscious being like a human but a minimal set of requirements for something to be said to be conscious.

    We all have a basic understanding of consciousness.
    — RogueAI

    Generally I think we're all pretty ignorant about it, neuroscientists like Isaac aside.

    There are also some outstanding questions you haven't answered:
    - Is it possible to get consciousness from rocks, yes/no?
    RogueAI

    I have answered this here:

    We should NOT assume that consciousness can arise from rocks.
    — Kenosha Kid

    - Is it possible to simulate consciousness, yes/no?
    — RogueAI

    In principle, or with present technology? Probably and definitely not, respectively.

    - Is consciousness substrate independent, yes/no?
    — RogueAI

    Already answered this too. Yes. Even if the brain turns out to be the only natural or technological means of having consciousness, the answer would still be yes.
  • bert1
    1.8k
    That sort of wishy-washy 'well, I know what I mean' way of communicating is no good for answering questions about consciousness in a scientific way.
    — Kenosha Kid

    Do you have a definition in mind when discussing consciousness? When you discuss consciousness, what is it you are discussing?
  • Kenosha Kid
    3.2k
    Do you have a definition in mind when discussing consciousness? When you discuss consciousness, what is it you are discussing?
    — bert1

    The part you quoted wasn't about lay or philosophical discussion, but scientific testing. As I am not a neurologist, how I conceive of consciousness isn't pertinent. Also, establishing the need for a scientific definition of consciousness is not the same as defining it. One can recognise that a scientific definition of consciousness must discern between conscious and unconscious things without having such a definition.

    But the consciousness discussed by neurologists afaik is along the lines of: cognitive awareness of one's environment and of one's cognitive awareness of that environment. In more detail, (human, at least) consciousness is a process comprised of multiple components such as awareness, alertness, motivation, perception and memory that together give an integrated picture of one's environment and how one relates to it.
  • bert1
    1.8k
    But the consciousness discussed by neurologists afaik is along the lines of: cognitive awareness of one's environment and of one's cognitive awareness of that environment.
    — Kenosha Kid

    And these are presumably measurable in some way? If so, they would need to be functionally defined. You input something into the person, look at the output (how the person behaves, a reading from some kind of direct brain scan), and then the degree of awareness of the environment is observed. Is that the idea?

    In more detail, (human, at least) consciousness is a process comprised of multiple components such as awareness, alertness, motivation, perception and memory that together give an integrated picture of one's environment and how one relates to it.

    Is this sense of 'consciousness' a collective term for a number of related cognitive faculties? Each of which could be given functional definitions and associated with functional tests to measure their degree of presence? A bit like (Banno's favourite) the Glasgow Coma Scale? Is that the idea of consciousness as studied by scientists?

    What do you think of the following rough definition:

    "Consciousness is subjective experience — ‘what it is like’, for example, to perceive a scene, to endure pain, to entertain a thought or to reflect on the experience itself"

    Would that do as a starting point for a scientific investigation?
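    The Glasgow Coma Scale mentioned above is a concrete instance of such a functional battery: three behaviourally defined components, each scored on a fixed range, summed into a single measure. A minimal scorer in Python (the function name is mine; the component ranges are those of the standard scale):

```python
def glasgow_coma_score(eye, verbal, motor):
    """Sum the three functionally defined GCS components.

    Standard ranges: eye opening 1-4, verbal response 1-5, motor response 1-6,
    giving a total from 3 (deeply unresponsive) to 15 (fully responsive).
    """
    if not (1 <= eye <= 4 and 1 <= verbal <= 5 and 1 <= motor <= 6):
        raise ValueError("GCS component out of range")
    return eye + verbal + motor

print(glasgow_coma_score(4, 5, 6))  # → 15, fully responsive
```

    Each component is defined purely by observable response to a stimulus, which is exactly the functional style of definition under discussion here.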
  • Kenosha Kid
    3.2k
    And these are presumably measurable in some way? If so, they would need to be functionally defined. You input something into the person, look at the output (how the person behaves, a reading from some kind of direct brain scan), and then the degree of awareness of the environment is observed. Is that the idea?
    — bert1

    That's an example. Another would be brain imaging. Or both in conjunction. A good example might be experiments that detect pre-cognitive decision-making. The lapse between initiating a response and becoming cognitively aware of it, if accurate (it's disputed), would fit with the idea that cognition validates other mental outputs.

    Is this sense of 'consciousness' a collective term for a number of related cognitive faculties?
    — bert1

    Or rather an emergent function of interacting parts. For instance, you might want to put your finger in the pretty flame, but last time you did that it really hurt. This is memory working in conjunction with motivation, not just a bundle of memory, motivation and other processes operating in parallel.

    "Consciousness is subjective experience — ‘what it is like’, for example, to perceive a scene, to endure pain, to entertain a thought or to reflect on the experience itself"

    Would that do as a starting point for a scientific investigation?
    bert1

    No, because it doesn't say anything. It relies on the reader having their own understanding of what it is like to perceive a scene, endure pain, etc., and that does the heavy lifting without examination.
  • bert1
    1.8k
    Thanks for that. Do you think philosophers and scientists have much to say to each other? When scientists investigate well-defined observable functions, and philosophers talk about hard problems and 'what it's likeness,' are they talking past each other? They both use the word 'consciousness'. Has one or other misused the word? Or are there genuinely different meanings?
  • Kenosha Kid
    3.2k
    When scientists investigate well-defined observable functions, and philosophers talk about hard problems and 'what it's likeness,' are they talking past each other? They both use the word 'consciousness'. Has one or other misused the word? Or are there genuinely different meanings?
    — bert1

    Scientists are reductionists, and reductionists will look for the fundamental elements of a thing and how they produce that thing when so arranged. Philosophers can be reductionists or non-reductionists. I expect reductionist philosophers and scientists will probably speak the same language, but non-reductionists tend to take such concepts as irreducible, so speak a different language.

    Things like love and consciousness are multifaceted collections of things. Two people could be in agreement with a given statement about love while one is thinking about attachment and the other romance, or one romance and the other sex. Consciousness is similar. Treating it as a simple thing is apt to produce ambiguity and confusion.

    The Nagel/Chalmers type of approach does this. It treats 'what it's like' as a simple thing, separable from having a bat's body, including its brain, a bat's needs, a bat's habitat, a bat's social structure, a bat's senses, a bat's memories: all the tiny things that individually and in conjunction produce what it's like to be a bat. And, worse, it tells you that because you don't have a bat's body, a bat's brain, a bat's habitat, a bat's social structure, a bat's senses, a bat's memories, you cannot imagine what it's like to be a bat (true), and that this is somehow proof of an irreducible quintessence of batness that will be left over if and when you have as complete a scientific account of the third-person view of a bat as is possible. It doesn't deal with the precise elements of what it is talking about at all.
  • Wayfarer
    20.6k
    The Nagel/Chalmers type of approach does this. It treats 'what it's like' as a simple thing, separable from having a bat's body, including its brain, a bat's needs, a bat's habitat, a bat's social structure, a bat's senses, a bat's memories: all the tiny things that individually and in conjunction produce what it's like to be a bat. And, worse, it tells you that because you don't have a bat's body, a bat's brain, a bat's habitat, a bat's social structure, a bat's senses, a bat's memories, you cannot imagine what it's like to be a bat (true), and that this is somehow proof of an irreducible quintessence of batness that will be left over if and when you have as complete a scientific account of the third-person view of a bat as is possible. It doesn't deal with the precise elements of what it is talking about at all.
    — Kenosha Kid

    Invoking what it is like to be a bat is really a rhetorical device or thought-experiment to drive home the understanding of the fundamental nature of the way in which a creature is embodied and how it interacts with the environment.

    I find the phrase 'what it is like to be' an awkward expression. I think what it means to articulate is, simply, 'being'. Humans are after all designated 'beings', as are other sentient creatures, including bats. And beings are not objects, in that they're conscious agents. This is precisely what is denied by reductionism, as reductionism has no category which corresponds with the notion of 'being'. That is why reductionists (such as Dennett) are obliged to deny that beings are in any basic sense different to objects.

    It's possible to have a nearly complete scientific account of an object, although since the discovery of quantum mechanics, even that is now questionable. But you can't provide a purely objective account of a subjective state of being. That's really all there is to it.
  • Pantagruel
    3.2k
    I think there's tension between the claim that matter can produce consciousness, but not vice-versa
    — RogueAI

    I have always summed it up this way to myself: is it more unlikely that matter gives rise to consciousness, or that consciousness gives rise to matter?
  • RogueAI
    2.4k
    I like that better.
  • RogueAI
    2.4k
    Are you conscious? Is your significant other(s) conscious? To not draw this out, I'll answer for you: yes, and yes.
    — RogueAI

    This is precisely what I was talking about before. That sort of wishy-washy 'well, I know what I mean' way of communicating is no good for answering questions about consciousness in a scientific way.
    Kenosha Kid

    I'm not asking questions about consciousness in a scientific way. Are you, Kenosha Kid, conscious??? That's not talking about consciousness in a "scientific way". We're at a pretty basic level when I'm asking you that.

    Now, your claim is that the sophistication of the language has to increase when it comes to determining whether something other than ourselves is conscious. Why? If I can ask "Are you, Kenosha Kid, conscious???" in a meaningful way and get a meaningful answer (which I can), without defining consciousness in a scientific way, why can't I ask a scientist, "Hey, is that machine over there conscious? You say it is. Can it feel pain? Can it be happy? Sad? What is it like to be that machine?" The scientist has to answer those questions. Those aren't questions that are being asked "in a scientific way". Those are ground level questions that a small child can understand.

    So, my point is that the regular folk definitions of consciousness and pain and happiness and "what is it like to be that?" that we all use are perfectly appropriate to inquire meaningfully about consciousness. If that's the case, and some scientist is claiming some machine is conscious (which will eventually happen), someone is going to say, "prove it". What's the scientist going to do in that case? Retreat behind a wall of jargon? Claim he can't prove it because there's a language problem? No, the scientist can't prove a computer is conscious because it's impossible to verify the existence of other consciousnesses.

    Do you dispute that last claim? If so, explain how you can verify that other minds and/or consciousnesses exist. If not, then concede the point that any physicalist theory of consciousness will be unverifiable in principle.

    Before we go on to the possibility of consciousness coming from rocks, I want to close off this point: it's impossible to verify the existence of other consciousnesses. Agreed or not?
  • Kenosha Kid
    3.2k
    I find the phrase 'what it is like to be' an awkward expression.
    — Wayfarer

    Me too.

    And beings are not objects, in that they're conscious agents. This is precisely what is denied by reductionism, as reductionism has no category which corresponds with the notion of 'being'.
    — Wayfarer

    Thanks for a third and fourth example... Love, consciousness, being, agency. Reductionism does not deny that we're conscious agents, but yes it does say we're made of objects, and therefore are objects. There's no contradiction between being an object and being a conscious agent: we're just objects with higher order properties of consciousness and agency.

    But you can't provide a purely objective account of a subjective state of being. That's really all there is to it.
    — Wayfarer

    This nicely encapsulates the hard problem. It isn't a problem, rather an insistence.
  • Wayfarer
    20.6k
    There's no contradiction between being an object and being a conscious agent: we're just objects with higher order properties of consciousness and agency.
    — Kenosha Kid

    Nope. Not true. There's a rhetorical description, 'nothing but-ism' or 'nothing but-ery', which is precisely that.

    There's nothing in the scientific description of objects - which is physics - in terms of which affective states and so on can even be described.
  • Wayfarer
    20.6k
    No, the scientist can't prove a computer is conscious because it's impossible to verify the existence of other consciousnesses.
    — RogueAI

    That’s because consciousness is only ever known in the first person.

    The scientific world-picture vouchsafes a very complete understanding of all that happens — it makes it just a little too understandable. It allows you to imagine the total display as that of a mechanical clockwork which, for all that science knows, could go on just the same as it does, without there being consciousness, will, endeavor, pain and delight and responsibility connected with it — though they actually are. And the reason for this disconcerting situation is just this: that for the purpose of constructing the picture of the external world, we have used the greatly simplifying device of cutting our own personality out, removing it; hence it is gone, it has evaporated, it is ostensibly not needed. — Erwin Schrödinger, Nature and the Greeks
  • Kenosha Kid
    3.2k
    If I can ask "Are you, Kenosha Kid, conscious???" in a meaningful way and get a meaningful answer (which I can), without defining consciousness in a scientific way, why can't I ask a scientist, "Hey, is that machine over there conscious? You say it is. Can it feel pain? Can it be happy? Sad? What is it like to be that machine?" The scientist has to answer those questions. Those aren't questions that are being asked "in a scientific way". Those are ground level questions that a small child can understand.
    — RogueAI

    A child, or adult, can _think_ they understand. Asking a conscious person if they are conscious is not comparable to asking a scientist if a machine is conscious. All the person has to do to achieve the former is know how to use the word 'conscious' in a sentence. To achieve the latter, a scientist has to know what consciousness really is, what parts and processes constitute the minimal agreed criteria for consciousness.

    No, the scientist can't prove a computer is conscious because it's impossible to verify the existence of other consciousnesses.
    — RogueAI

    I don't think there's any insurmountable barrier to determining whether another human is conscious or not, because we know a lot about what to look for. The difficulty is more likely in knowing the same about other animals whose brains we understand less well. Non-animal systems might be easier, since we're free to design the hardware, or bypass it in software, with the knowledge that the kind of conscious thing you're building might not be naturally or technologically realisable.

    Consider an ecological microsystem, small enough to be simulated. Place within this microsystem a sprite and assign that sprite needs: food, water, procreation, survival, with observable meters. Place within the microsystem elements that will satisfy or thwart those needs: fruits and smaller food sprites, a lake, predators, other sprites of the same class as our subject. Define a set of input channels to the sprite and couple those to the state of the environment in a fixed and incomplete way. This defines the surface of the sprite. Give the sprite a set of subroutines that a) build a model of the environment from those inputs, b) perform things like edge detection to discern other objects in that environment, and c) run some retrainable pattern-recognition models for enriching that view with higher-order data. (As an example, you could have three small red things to eat that are nutritious and one that makes the sprite ill. This pattern-recognition package would allow the sprite to learn first that small red things are nutritious, and second that one particular small red thing is harmful.)

    An important part of that package would be reflexes. For instance, if a predator sprite is detected, run home or, if not possible, run directly away from the predator. Thus we need some interface between this processor and the surface of the sprite, a sort of will. However we'll also add a second package of routines that also take input from the first. These will be algorithmic, for enriching the environmental model with metadata that can't be yielded by pattern-recognition, and for verifying the metadata from pattern-recognition. This will also feed back to the first processor so that the sprite can act on algorithmic outputs when firing the will.

    Implicit in the above is somewhere to store outputs of models, pattern-recognition and algorithms so that the sprite doesn't have to work everything out from scratch every time. These will also be inputs to both processors.

    Obviously there are a lot of gaps to fill in there, but you get the gist. Would the sprite be conscious? Done right, if we had the tech to do it, with those gaps sensibly filled in, I'd argue yes.

    How could we tell that it's conscious? Compare its behaviour to other sprites that are missing one or more of the key features outlined above. Additionally, make some of those meters inferable from the outside. Perhaps, if we gave the sprite capacity for language such that it could teach its daughter which small red thing not to eat, then removed all of its predators and placed food outside of its den every morning, it might even invent philosophy, science and art! :)
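    The sprite architecture described above can be roughed out in code. The following is a minimal toy sketch under my own assumptions (class and method names are mine, and the "pattern recognition" is reduced to a lookup table), not an implementation of consciousness:

```python
from dataclasses import dataclass, field

@dataclass
class Sprite:
    """Toy agent following the architecture sketched in the post above."""
    # Observable need meters
    food: float = 1.0
    water: float = 1.0
    # Retrainable association store: object tag -> learned value
    memory: dict = field(default_factory=dict)

    def sense(self, environment):
        """The sprite's 'surface': a fixed, incomplete coupling to the world.
        Only nearby objects register, so the internal model is always partial."""
        return [dict(obj) for obj in environment if obj.get("near")]

    def enrich(self, percepts):
        """Pattern-recognition stand-in: label percepts with learned values."""
        for p in percepts:
            p["value"] = self.memory.get(p["tag"], 0.0)  # unknown things score 0
        return percepts

    def act(self, percepts):
        """The 'will': reflexes fire first, then algorithmic choice."""
        if any(p["tag"] == "predator" for p in percepts):
            return "flee"  # reflex bypasses deliberation
        edible = [p for p in percepts if p["value"] > 0]
        if edible:
            best = max(edible, key=lambda p: p["value"])
            return "eat:" + best["tag"]
        return "explore"

    def learn(self, tag, outcome):
        """Feedback loop: outcomes retrain the association store."""
        self.memory[tag] = outcome

# The sprite learns which small red thing not to eat, as in the example above.
sprite = Sprite()
sprite.learn("red_berry", 1.0)
sprite.learn("red_toadstool", -1.0)
world = [{"tag": "red_berry", "near": True}, {"tag": "red_toadstool", "near": True}]
print(sprite.act(sprite.enrich(sprite.sense(world))))  # → eat:red_berry
```

    The separation into sense, enrich, act and learn mirrors the surface, pattern-recognition package, will and memory store of the post; everything interesting lies in the gaps it leaves open.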
  • Kenosha Kid
    3.2k
    Nope. Not true. 'Oranges are really carpentry tools, they just lack the handles.'
    — Wayfarer

    On a purely linguistic level you ought to be able to debunk your own argument here.
  • Wayfarer
    20.6k
    I deleted that expression before you quoted it. Perhaps you might adjust your response accordingly.
  • Wayfarer
    20.6k
    OK, I will explain what I meant to say with that ham-fisted analogy. I was responding to the claim that conscious agents are simply objects with higher-order attributes such as consciousness. That is pretty well the essence of reductionism. So I said, on a whim, and before I went back and deleted it, that it’s like comparing two wildly different kinds of things - oranges and carpentry tools - and denying that there’s any real difference between them. Again, a ham-fisted metaphor, but not wide of the mark. We can describe how objects behave using physics - up to a point - but those descriptions can’t be extended to apply to conscious beings (unless, of course, you’re referring to the rate at which they accelerate when dropped, for instance). And that’s the point of Chalmers’s ‘facing up to the hard problem’. But you have to recognise a problem to face up to it, and I don’t think you do.
  • bert1
    1.8k
    I think we've hit a conceptual wall. The trouble with the concept of consciousness is that consciousness is only knowable by a kind of introspective reflexive act. You have to notice you have it. For some people this is a simple and obvious thing. Others do not find it simple and obvious; they see problems with it and say such internal observations are illusory or at least misleading. There is a privacy issue here; this problem is not resolvable by consulting a shared world. Normally disagreements about simple matters of fact are resolvable by both parties going to the same place at the same time and saying "Look there, we both see it don't we?"
  • Wayfarer
    20.6k
    The trouble with the concept of consciousness is that consciousness is only knowable by a kind of introspective reflexive act.
    — bert1

    That’s because consciousness is only ever known in the first person.
    — Wayfarer
  • bert1
    1.8k
    Asking a conscious person if they are conscious is not comparable to asking a scientist if a machine is conscious.
    — Kenosha Kid

    It is if the scientist has the same definition/concept as the non-scientist. This definition:

    "Consciousness is subjective experience — ‘what it is like’, for example, to perceive a scene, to endure pain, to entertain a thought or to reflect on the experience itself"

    ...is given at the very start of the neuroscientist Giulio Tononi's paper on the IIT. Some scientists do start with this concept. And it is those thinkers who I think do come up with a theory of consciousness (even if it is false), and these theories are interesting to me as genuine candidates for a true theory of consciousness. However some thinkers take 'consciousness' to mean a set of observable functions or behaviours etc. That's fine if it's useful, say for a paramedic. But I don't take these as theories of consciousness as I understand it. They are definitions by fiat, and philosophically uninteresting.

    EDIT: An example of the latter is H Pattee in his Cell Phenomenology: The First Phenomenon, in which he says this:

    Most branches of philosophy have an explicit or tacit focus on the human level of thought, language, and behavior. Phenomenology has historically focused explicitly on the subjective conscious experience of the human individual. For many years I have found it instructive to explore phenomena from a broader and more elementary evolutionary and physical law-based point of view, defining it as those subjective events that appear to the simplest individual self as functional. At the cell level function cannot be precisely defined because what is functional ultimately depends on the course of evolution. Functional phenomena occur at all levels in evolution and are not limited to conscious awareness. — Pattee

    (my bold)

    To be clear the article he writes is extremely interesting in many other ways. I just don't think it touches the hard problem.
  • Kenosha Kid
    3.2k
    I deleted that expression before you quoted it. Perhaps you might adjust your response accordingly.
    — Wayfarer

    If you like. "On a purely linguistic level, you were able to debunk your own argument." :P

    So I said, on a whim, and before I went back and deleted it, that it’s like comparing two wildly different kinds of things - oranges and carpentry tools - and denying that there’s any real difference between them.
    — Wayfarer

    I don't think so. Object is a class. Conscious being is a class. Even if you object to the specific claim that the latter is a subclass of the former, there'll always be some superclass you can define that includes conscious beings and oranges.
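    Read literally, the class claim is easy to state in code: subclassing adds properties without the instance ceasing to belong to the superclass. A toy illustration (the type names are mine, purely illustrative):

```python
class PhysicalObject:
    """Anything made of matter: it has, say, a mass."""
    def __init__(self, mass):
        self.mass = mass

class ConsciousAgent(PhysicalObject):
    """A subclass: an object with higher-order properties added, nothing removed."""
    def __init__(self, mass, awareness):
        super().__init__(mass)
        self.awareness = awareness

bat = ConsciousAgent(mass=0.03, awareness="echolocative")
# The bat belongs to both classes at once: no contradiction.
print(isinstance(bat, ConsciousAgent), isinstance(bat, PhysicalObject))  # → True True
```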
  • Wayfarer
    20.6k
    Object is a class. Conscious being is a class. Even if you object to the specific claim that the latter is a subclass of the former, there'll always be some superclass you can define that includes conscious beings and oranges.
    — Kenosha Kid


    I suppose it's pointless to try and explain what I think is the matter with this, so I'll pass.

    Although I think Bert1 has done a good job of it:

    some thinkers take 'consciousness' to mean a set of observable functions or behaviours etc. That's fine if it's useful, say for a paramedic. But I don't take these as theories of consciousness as I understand it. They are definitions by fiat, and philosophically uninteresting.
    — bert1
  • bert1
    1.8k
    I'm not sure how my comment relates to kk's.