• RogueAI
    2.4k
    Why would integration have to be all or nothing? How about degrees of it and a threshold for consciousness?
    — frank

    Then that would be consciousness = (some amount of) integrated information, and vice versa. That sounds a little ad hoc, but maybe. Suppose Tononi takes the measured approach and sets a minimum amount of information processing that has to go on before consciousness arises (call it X). An opponent can then claim, "No, no, that's all wrong! It's X-1 [or X+1]. Then you get consciousness." Since there's no way to "get under the hood" and actually see whether something is conscious or not, Tononi and his opponent are just going to go around and around with no way to prove their respective cases. It's easier to simply claim consciousness = information processing, but that has problems of its own.
  • frank
    14.5k

    Around 300 years ago Newton described gravity. We're still trying to understand how it works.

    A theory of consciousness doesn't have to be served up completed. It's ok if this takes a while.
  • RogueAI
    2.4k
    Yeah, but Newton didn't have a lot of red flags pop up right at the start. The theory he came up with mapped onto reality almost perfectly (except for Mercury's eccentric orbit, which I'm not even sure was discovered in his lifetime) and made excellent predictions. I can already see what look like insoluble problems with IIT.
  • frank
    14.5k
    The axioms are the description. Do you disagree with any of them? What would you add?
  • RogueAI
    2.4k
    "Intrinsic existence
    Consciousness exists: each experience is actual—indeed, that my experience here and now exists (it is real) is the only fact I can be sure of immediately and absolutely. Moreover, my experience exists from its own intrinsic perspective, independent of external observers (it is intrinsically real or actual)."

    I like this a lot.

    "Consciousness is structured: each experience is composed of multiple phenomenological distinctions, elementary or higher-order. For example, within one experience I may distinguish a book, a blue color, a blue book, the left side, a blue book on the left, and so on."

    I have problems with this. Consciousness is often structured, but it seems possible to clear our minds for short periods during meditation and still retain consciousness. In that case we are experiencing only our own conscious awareness, which would not be an experience composed of multiple phenomenological distinctions. I can also imagine a single thing that is not composed of anything else: a giant red blob. But mostly I agree with this.

    "Consciousness is specific: each experience is the particular way it is—being composed of a specific set of specific phenomenal distinctions—thereby differing from other possible experiences (differentiation). For example, an experience may include phenomenal distinctions specifying a large number of spatial locations, several positive concepts, such as a bedroom (as opposed to no bedroom), a bed (as opposed to no bed), a book (as opposed to no book), a blue color (as opposed to no blue), higher-order “bindings” of first-order distinctions, such as a blue book (as opposed to no blue book), as well as many negative concepts, such as no bird (as opposed to a bird), no bicycle (as opposed to a bicycle), no bush (as opposed to a bush), and so on. Similarly, an experience of pure darkness and silence is the particular way it is—it has the specific quality it has (no bedroom, no bed, no book, no blue, nor any other object, color, sound, thought, and so on). And being that way, it necessarily differs from a large number of alternative experiences I could have had but I am not actually having."

    Is this saying that all experiences are unique and that when an experience is happening there's something it's like to be having that experience, even if it's an experience of pure darkness and silence?

    "Consciousness is unified: each experience is irreducible to non-interdependent, disjoint subsets of phenomenal distinctions. Thus, I experience a whole visual scene, not the left side of the visual field independent of the right side (and vice versa). For example, the experience of seeing the word “BECAUSE” written in the middle of a blank page is irreducible to an experience of seeing “BE” on the left plus an experience of seeing “CAUSE” on the right. Similarly, seeing a blue book is irreducible to seeing a book without the color blue, plus the color blue without the book."

    I'm not sure that this is true...

    "Consciousness is definite, in content and spatio-temporal grain: each experience has the set of phenomenal distinctions it has, neither less (a subset) nor more (a superset), and it flows at the speed it flows, neither faster nor slower. For example, the experience I am having is of seeing a body on a bed in a bedroom, a bookcase with books, one of which is a blue book, but I am not having an experience with less content—say, one lacking the phenomenal distinction blue/not blue, or colored/not colored; or with more content—say, one endowed with the additional phenomenal distinction high/low blood pressure.[2] Moreover, my experience flows at a particular speed—each experience encompassing say a hundred milliseconds or so—but I am not having an experience that encompasses just a few milliseconds or instead minutes or hours.[3]"

    This one is fascinating, and I'm glad I clicked on your link. I want to talk about the bolded part. Let's suppose we have three people. Bob is stationary, Frank is accelerating to 99% of the speed of light, and Susie is also motionless, but through a magical telescope she's able to observe Bob's and Frank's brains in real time. Bob's brain should look like a normally functioning brain, but as Frank accelerates, shouldn't Susie see Frank's brain functions go slower and slower as time dilation kicks in? And let's also say that Susie's magic telescope can look inside Frank's mind. As Frank accelerates, would his thoughts look slower and slower to Susie? Would the "speed of his mind" (just go with it) look slower to Susie? And yet it must, because at the end of Frank's trip he's going to report that he was conscious for X amount of time, while Bob reports that he was conscious for X plus years more than Frank. If Susie is watching their minds in real time, she's going to observe a divergence, and is it going to look like Frank's consciousness "slowing down"? What would that be like? Like slowing a film down?
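    The arithmetic behind this thought experiment can be sketched numerically. This is a minimal illustration assuming Frank cruises at a constant 0.99c (ignoring the acceleration phases); the function name and numbers are just for illustration:

```python
import math

def lorentz_gamma(v_frac_c: float) -> float:
    """Time-dilation factor gamma = 1 / sqrt(1 - v^2 / c^2)."""
    return 1.0 / math.sqrt(1.0 - v_frac_c ** 2)

# Frank cruises at 99% of the speed of light for 1 year of his own (proper) time.
gamma = lorentz_gamma(0.99)
frank_proper_years = 1.0
bob_elapsed_years = frank_proper_years * gamma  # what Bob (and Susie) measure

print(f"gamma at 0.99c: {gamma:.3f}")  # ~7.089
print(f"Bob ages {bob_elapsed_years:.2f} years while Frank ages 1 year")
```

    So whatever Susie's magic telescope shows, the clocks really do diverge by a factor of about seven at 0.99c.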
  • Pop
    1.5k
    Evolutionary biology might be the answer. Why would we need to answer that definitively at this point?
    — frank

    This is the pertinent point. Evolutionary biology (the brain) facilitates the gathering and translating of information, but only the integrated information can create this moment of consciousness. Nothing other than information in an integrated state can do so, and nothing other than information knows how the information can be integrated. It is self-organizing: information integrating information into a synthesis of consciousness that is a state of integrated information.
  • frank
    14.5k
    Is this saying that all experiences are unique and that when an experience is happening there's something it's like to be having that experience, even if it's an experience of pure darkness and silence?
    — RogueAI

    You're asking about the information axiom. Tononi is using "information" the way physicists do. Out of all the ways a thing can be, it's this way.

    It takes a little getting used to. It's kind of subtle.
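    That physicist's sense of "information" — how surprising it is that, out of all the ways a thing could be, it's this way — is Shannon self-information. A minimal sketch, assuming equally likely states (the function name is mine, not Tononi's):

```python
import math

def self_information_bits(p: float) -> float:
    """Shannon self-information: the surprise, in bits, of an outcome
    that had probability p of occurring."""
    return -math.log2(p)

# A photodiode with 2 equally likely states carries 1 bit when it settles on one.
print(self_information_bits(1 / 2))  # 1.0
# A system that could have been in any of 8 equally likely states carries 3 bits.
print(self_information_bits(1 / 8))  # 3.0
```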

    I'm not sure that this is true...
    — RogueAI

    ok

    Let's suppose we have three people. Bob is stationary, Frank is accelerating to 99% of the speed of light, and Susie is also motionless, but through a magical telescope she's able to observe Bob's and Frank's brains in real time. Bob's brain should look like a normally functioning brain, but as Frank accelerates, shouldn't Susie see Frank's brain functions go slower and slower as time dilation kicks in? And let's also say that Susie's magic telescope can look inside Frank's mind. As Frank accelerates, would his thoughts look slower and slower to Susie? Would his consciousness change at all? And yet it must, because at the end of Frank's trip he's going to report that he was conscious for X amount of time, while Bob reports that he was conscious for X plus years more than Frank.
    — RogueAI

    Could be. :grin:
  • frank
    14.5k
    but only the integrated information can create this moment of consciousness.
    — Pop

    I'm not sure what that means.
  • Pop
    1.5k
    but only the integrated information can create this moment of consciousness.
    — Pop

    I'm not sure what that means.
    — frank

    As far as I can see, there is a continuum of integrated information, integrating more and more information onto itself. The brain provides the substrate and facilitates the translation of sense data into information, but it cannot anticipate any instance of integrated information (consciousness). The information has to create this by itself, by integrating on its own. The senses and brain orient the person in place, through vision, sound, etc., but the significance of that orientation to the person is not something biology can anticipate. This suggests the information self-organizes, like pieces of a jigsaw puzzle integrating on their own.

    We normally say information integrates subconsciously, but another explanation might be that it is self-organizing, as per systems theory.
  • MAYAEL
    239
    I blame Richard Maurice Bucke for the fact that the majority of New Age movement people hold onto this concept of consciousness as some kind of highest-value, quasi-religious, God-level thing, keeping it alive and continually circulating as if it's something that needs to be contemplated further. It's the perfect definition of mental masturbation.
  • RogueAI
    2.4k
    We are conscious of very little of what our brain is actually doing, and it's doing a lot of information processing moment by moment. Why does information integration vis-à-vis digestion not result in conscious experience?
  • Daemon
    591
    We are conscious of very little of what our brain is actually doing, and it's doing a lot of information processing moment by moment. Why does information integration vis-à-vis digestion not result in conscious experience?
    — RogueAI

    The brain doesn't do information processing any more than digestion does. The brain does things like ion exchanges at synapses. We can describe this as information processing, but all the actual work is done by things like ion exchanges.
  • frank
    14.5k
    Well, I don't totally understand IIT at this point. That's why I started this thread, in hopes of figuring out how it comes together.

    So far I know that causality is big for Tononi.

    I may have to buy his book. :grimace:
  • RogueAI
    2.4k
    I think there's a problem for IIT, though. If consciousness "flows at the speed it flows, neither faster nor slower", and Frank travels faster than Bob, then when Frank returns from his space travelling, he and Bob are going to disagree about the "speed" at which their consciousness "flowed", and neither will be mistaken. Therefore someone's consciousness went faster or slower than someone else's.
  • frank
    14.5k

    Bob is just going to be a lot older than Frank. They'll be able to consult with a physicist to understand why.
  • Pop
    1.5k
    Well, I don't totally understand IIT at this point. That's why I started this thread, in hopes of figuring out how it comes together.
    — frank

    I wasn't specifically referring to IIT, though it is also a relevant question for it. Imo, the single most pertinent question regarding all this is what causes the information to integrate, as that will be consciousness, and as far as I can unravel it, the information integrates on its own, as only it can know how it fits together. That the information is self-organizing would have far-reaching consequences for philosophy and for understanding in general.

    However I sense you would rather focus on IIT, so I will leave you to it.
  • Pop
    1.5k
    We are conscious of very little of what our brain is actually doing, and it's doing a lot of information processing moment by moment. Why does information integration vis-à-vis digestion not result in conscious experience?
    — RogueAI

    Why do you think it doesn't? I know somebody who has a very uncomfortable, often painful, time digesting, and it often ruins their day. The totality of bodily feeling always exists in the background, contributing to experience, but we normally are only aware of it when it is painful.
  • RogueAI
    2.4k
    IIT, originated by Giulio Tononi,
    — frank

    Scott Aaronson debunkificated this a while back. David Chalmers shows up in the comment section.
    — fishfry

    I thought this was relevant:

    "To his credit, Tononi cheerfully accepts the panpsychist implication: yes, he says, it really does mean that thermostats and photodiodes have small but nonzero levels of consciousness."
  • RogueAI
    2.4k
    Doesn't IIT entail that our consciousness should fluctuate with the amount of information integration going on? For example, sitting in a dark, silent room that's neither hot nor cold should result in a severely diminished conscious state compared to doing a StairMaster workout at the gym, but of course that's not the case.
  • RogueAI
    2.4k
    Bob is just going to be a lot older than Frank. They'll be able to consult with a physicist to understand why.
    — frank

    I think there's more to it than that. At time t, Bob and Frank report the same "speed" of consciousness. But if Frank accelerates enough, then at t + whatever, Bob and Frank will differ on how much conscious experience they report has happened to them, and they will both be correct. But that entails that for one (or both) of them, their consciousness did not "flow at the speed it flows, neither faster nor slower".
  • frank
    14.5k
    Imo, the single most pertinent question regarding all this is what causes the information to integrate, as that will be consciousness, and as far as I can unravel it, the information integrates on its own,
    — Pop

    IIT says a conscious system has a certain amount of internal causation.
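    Tononi's actual Φ is a much more involved cause-effect measure, but the flavor — how much the parts of a system constrain each other beyond what each does alone — can be sketched with plain mutual information. This is a crude stand-in of my own, not IIT's real calculation:

```python
import math
from collections import Counter

def mutual_information_bits(pairs):
    """I(X;Y) in bits, estimated from a list of (x, y) state samples:
    a rough proxy for how much two parts of a system 'know about' each other."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum((c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# Two perfectly correlated binary nodes: 1 bit of shared structure.
print(mutual_information_bits([(0, 0), (1, 1)] * 50))  # 1.0
# Two independent nodes: 0 bits -- no integration at all.
print(mutual_information_bits([(0, 0), (0, 1), (1, 0), (1, 1)] * 25))  # 0.0
```

    Real Φ additionally asks about a system's cause-effect power over its own past and future states, minimized over partitions; this only shows why internal statistical dependence is the raw material.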
  • Pop
    1.5k
    IIT says a conscious system has a certain amount of internal causation.
    — frank

    This is well expressed. But I wonder, can you see how perception (extra information) has to integrate with already established information to form understanding?
  • frank
    14.5k
    This is well expressed. But I wonder, can you see how perception (extra information) has to integrate with already established information to form understanding?
    — Pop

    At the early stages, IIT isn't trying to explain the experience of understanding.

    It's trying to set out a set of correlates of consciousness.

    On the one hand, we have behavioral correlates of consciousness: BCC. On the other, we have neurological correlates: NCC.

    Traditionally, science has worked on matching those two. The hard problem pointed out that this approach leaves out the most amazing thing about consciousness (though IIT supporters are quick to say that Leibniz identified the hard problem first; I don't know why they think that's important. Chalmers is obviously the instigator here, not Leibniz).

    So now we're describing experience itself (the graphic is the view out of one of your eyeballs) and attempting to hypothesize about correlates of that in the neuronal realm.

    Make sense?
  • Pop
    1.5k
    So now we're describing experience itself (the graphic is the view out of one of your eyeballs) and attempting to hypothesize about correlates of that in the neuronal realm.
    — frank

    IIT is a computational theory of consciousness that blocks out the hard problem. It's quite a strange approach: it starts confidently in phenomenology, deriving all its axioms there, but then that's the last we hear of phenomenology. Instead the theory focuses on a calculation of consciousness through a cause-effect repertoire (which unfortunately is beyond my ability to properly scrutinize).

    I see many problems with it, the main one being that the felt quality of experience is left out of the axioms. The felt quality of consciousness is treated as a secondary consideration, simply explained by qualia being equal to consciousness, as if it's an irrelevant consideration. But many phenomenologists see consciousness as being composed of two poles: cognition and experiential reaction. Tononi blocks out experiential reaction, so one wonders what exactly he is calculating as consciousness, since half the information is being ignored.

    It's not really a theory of consciousness, in my view, since the hard problem is being ignored, and in being ignored only half of consciousness is being calculated. It seems more a proposal for a way to calculate cognition. So on this basis I'm not going to analyze it further.

    What I like about IIT is that it crystallizes the idea that consciousness is integrated information, and that it acknowledges the validity of phenomenology (normally dismissed offhand as unscientific by physicalists / computationalists).

    Of course the hard problem would be blocked out, as a felt quality cannot be conceptualized (being felt slightly differently by every end user), so it can never be calculated in any universal sense. So I don't have much hope for this approach as a path toward a theory of consciousness.
  • bert1
    1.8k
    It's not really a theory of consciousness, in my view, since the hard problem is being ignored, and in being ignored only half of consciousness is being calculated. It seems more a proposal for a way to calculate cognition. So on this basis I'm not going to analyze it further.
    — Pop

    Yes, that's the conclusion I came to as well. There's no answer to the question "OK, but why can't integrating information happen in the dark?" As is often the case with functionalist views, they come to an interesting point, but when faced with the problem of "Yes, but why does that result in an experience, exactly?" they tend to abandon theory and opt for definition by fiat instead. They say, "Oh, but that's just what 'experience' means. There is nothing more to it than that." Which is nonsense. I certainly do not mean 'integrated information' when I talk about consciousness.

    I do think IIT is an interesting theory of something. Maybe it is a way to define conscious individuals, and that would solve a problem that besets a lot of panpsychist views, namely Searle's question "What are the units supposed to be?" Maybe the conscious subject is the system that integrates information. And maybe the more information the system integrates in interesting ways, the more varied and rich the associated experiences of that subject are. But as you say, it just doesn't touch the basic question of why we should think that integrated information is consciousness, why it creates a first-person perspective at all.
  • frank
    14.5k
    IIT is a computational theory of consciousness that blocks out the hard problem.
    — Pop

    What do you mean, "blocks out the hard problem"? It attempts to answer it.

    The felt quality of consciousness is treated as a secondary consideration, simply explained by qualia being equal to consciousness,
    — Pop

    Isn't it?

    Of course the hard problem would be blocked out, as a felt quality cannot be conceptualized (being felt slightly differently by every end user), so it can never be calculated in any universal sense.
    — Pop

    So we need a unique theory of consciousness for every incidence of it?
  • frank
    14.5k
    They say, "Oh, but that's just what 'experience' means. There is nothing more to it than that." Which is nonsense. I certainly do not mean 'integrated information' when I talk about consciousness.
    — bert1

    The theory starts with a description of consciousness. I don't think you could say that issue was glossed over.

    But as you say, it just doesn't touch the basic question of why we should think that integrated information is consciousness, why it creates a first-person perspective at all.
    — bert1

    So after considering the theory, do you end up where you started? Or did you end up with a finer understanding of your own expectations for a theory of consciousness? Or what?
  • bert1
    1.8k
    Yes, you're right regarding Tononi, I was unfair. I was generalising but should not have included Tononi in that.

    Yes I did end up with a clearer expectation. I'm a fan of Tononi, I just think he's wrong. It's great that he started with phenomenology and his theory is interesting.
  • frank
    14.5k
    Cool. I'm still working on understanding his theory, so I'm not much use defending him. :grin:
Welcome to The Philosophy Forum!