• T Clark
    14k
    Agreed. I wasn't suggesting it can't be done. — FrancisRay

  • Apustimelogist
    621
    I don't think the so-called hard problem can ever be solved. All our explanations are functional, that is their nature, so how on earth can there be any kind of explanation of the sort we want for qualitative conscious experiences?

    A consciousness-centric framework will not even be able to explain consciousness, because it is possibly the most trivial concept we have, since it is the primitive base of all knowledge. There is absolutely no constraint on what can be considered to be an experience. It seems plausible to me that there are an infinite multitude of experiences that we could never even imagine for different possible kinds of sentient agents. Once you think like that, can you even point out what it means not to be conscious, or not to be an experience? I am not even sure anymore, especially if someone like a panpsychist thinks that even the simplest possible micro-thing can have some form of experience that is just extremely, unfathomably basic.

    There's no possible characterization of consciousness. It is utterly primitive to us as information-processing creatures. All we can do is fit it in as best we can with the rest of science. Since consciousness has no actual characterization, the only thing we can do is juxtapose it with useful physical concepts, as people already do in neuroscience. Physical concepts are doing all the hard work, and science hasn't found any evidence of dualism. Sure, panpsychism could be true, but again, since consciousness lacks any decent characterization, which of our concepts do the heavy lifting in relating experience to the rest of reality? The physical concepts.

    Again, explanations are inherently functional. Experiences are not. There will never be a good explanation of consciousness, and anything useful to our knowledge will be functional and so inherently at odds with describing or explaining experiences. Maybe physical concepts don't explain consciousness the way we want them to, but physical concepts are central to any kind of useful explanation of fundamental reality. I think fundamental ontology is likely impossible to comprehend, and the next step is a computational or informational explanation of why that is so and of how the hard problem arises in intelligent machines like us in the first place.
  • PeterJones
    415
    I don't believe it's possible to define the hard problem in such a way that it cannot be solved or has not already been solved. Chalmers originally defines it as the problem of explaining how consciousness arises from matter, and in this form it isn't even difficult. The problem seems to be ideological rather than philosophical.

    There's no possible characterization of consciousness. It is utterly primitive to us as information-processing creatures. — Apustimelogist

    Very much agree. So did Kant. He placed the origin of both the world and the intellect prior to the categories of thought. If you assume it is primitive then you have solved the hard problem. There are still plenty of difficulties to overcome, but none that are intractable.

    I found your post above perceptive and a good summary of the situation, but unnecessarily pessimistic.

    I think fundamental ontology is likely impossible to comprehend, and the next step is a computational or informational explanation of why that is so and of how the hard problem arises in intelligent machines like us in the first place.

    This would be a hopeless approach for the reasons you give. A fundamental theory must look beyond computation and intellection.

    But if you think human beings are intelligent machines or one of Chalmers' zombies then I'm afraid you're stuck with the hard problem for all eternity. This assumption renders the problem impossible.

  • flannel jesus
    1.9k
    If you assume it is primitive then you have solved the hard problem. — FrancisRay

    If you assume anything is primitive, you can answer the same "how" question. How does consciousness arise? It's primitive. How does life work? It's primitive (see Vital Force, an idea which lost favour when scientists were able to build up a picture of life working via electrochemical processes).

    Some things are primitive, of course, and it may be that consciousness is, but it feels more like a non-answer to me than an answer. It feels like giving up. "I can't think of how it could come about via any physical or non-physical processes, so it must be fundamental." That's exactly how Vital Force explained the processes of life, up until we had the means to explain them electrochemically.

    Maybe it's fundamental, but probably, I think, we just don't have the answer yet, and the idea that it's primitive will start disappearing when we have a picture of the mechanisms involved, like life itself.
  • PeterJones
    415
    If you assume anything is primitive, you can answer the same "how" question. How does consciousness arise? — flannel jesus
    If it is primitive, consciousness doesn't arise. What would arise is intentional 'subject/object' consciousness.

    It's primitive. How does life work? It's primitive (see Vital Force, an idea which lost favour when scientists were able to build up a picture of life working via electrochemical processes).
    Life is a different issue.

    Some things are primitive, of course, and it may be that consciousness is, but it feels more like a non-answer to me than an answer. It feels like giving up.
    Why do you think this? It allows us to construct a fundamental theory. This is the answer given by the Perennial philosophy, for which no hard problems arise. Rather than giving up, this is the only way forward.

    Maybe it's fundamental, but probably, I think, we just don't have the answer yet, and the idea that it's primitive will start disappearing when we have a picture of the mechanisms involved, like life itself.

    This is the dream of the materialists, but you've just argued it's a pipe-dream.

    The idea that consciousness is primitive will never disappear. The 'Perennial' philosophy will never go away, since it is not conjectural and it works. The problem is only that few people take any notice of it. Then they cannot make sense of metaphysics or consciousness and conclude that nobody ever will. It's an odd and rather surreal situation.
  • Apustimelogist
    621
    unnecessarily pessimistic — FrancisRay

    In what way? I don't see it as pessimistic at all or that anything is lost. What does a solution to the hard problem look like? I don't think there is a good one I can think of which doesn't imply some sort of dualism which I fundamentally disagree with.

    This would be a hopeless approach for the reasons you give. A fundamental theory must look beyond computation and intellection. — FrancisRay

    I am not suggesting looking for a fundamental ontology based on computation, but an explanation of why knowledge of fundamental ontologies is out of reach.

    I think the explanation is actually already there, it just has to be articulated and demonstrated. Like you said, experiences are primitive. We know experiences are related to the functional architecture of our brains. We can transfer or demonstrate the concept of this kind of primitiveness into the architectures and functional repertoires of A.I. We use A.I. to demonstrate the limits of what kinds of information are transferable from the environment, what kinds of concepts are created and what information they don't or can't include, and then see what kind of metacognitive consequences this has. Does an A.I. come up with primitive phenomenal concepts on a purely functional basis that it cannot explain, similarly to our hard problem? This is a totally plausible research program even if it may not be possible right at this moment.

    But if you think human beings are intelligent machines or one of Chalmers' zombies then I'm afraid you're stuck with the hard problem for all eternity. This assumption renders the problem impossible. — FrancisRay

    Not sure what you mean here but functionally, yes we are just intelligent machines. We are just brains.
  • Generic Snowflake
    3
    Given our current and best information about the physical world, unless I am missing something, I don't see how consciousness, as well as the subjective experience that forms from it, can't be safely explained as a purely computational phenomenon.

    Take the simplest of computational networks - two states going through a logic gate, producing a new state. According to the research that I am aware of, examples of which I write in the next paragraphs, this simple network, by itself, can be regarded as a fundamental level of consciousness, or a single block of logic if you will. If, for example, you want it to contain memory, in order to process that memory and produce a new state, two NOR gates will suffice. Connect them with another gate and a binary sensor and you essentially have stored information processing which also depends on the environment.
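
    As a minimal sketch of that two-NOR memory element (purely illustrative, in Python; the real circuit settles via continuous feedback rather than a single update pass), an SR latch stores one bit and updates it from fresh input:

    ```python
    # Illustrative sketch: two cross-coupled NOR gates form an SR latch,
    # the simplest "memory" built from logic gates.

    def nor(a: int, b: int) -> int:
        """NOR gate: outputs 1 only when both inputs are 0."""
        return int(not (a or b))

    def sr_latch(set_bit: int, reset_bit: int, q: int) -> int:
        """One update step of a NOR-based SR latch; q is the stored bit.
        (set_bit = reset_bit = 1 is the forbidden input and is not handled.)"""
        q_bar = nor(set_bit, q)        # first NOR gate
        return nor(reset_bit, q_bar)   # second NOR gate, fed back as new q

    state = 0
    state = sr_latch(1, 0, state)  # set   -> stores 1
    state = sr_latch(0, 0, state)  # hold  -> still 1 (memory)
    state = sr_latch(0, 1, state)  # reset -> stores 0
    print(state)                   # 0
    ```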

    There is already a body of research on programming microorganism behavior with a combination of logic gates - which are the fundamental computational mechanism in electronics. Some nice example reads:
    https://www.sciencedirect.com/science/article/pii/S030326472200003X
    https://arstechnica.com/science/2010/12/building-logic-gates-with-bacterial-colonies/

    Beyond that, we are just describing different complexity levels of "logic". From what I understand, molecular neurotransmitter function (neurotransmitters mostly work as emotional regulators in humans) can be boiled down to logic gates as well. For one, they seem to work similarly to AI neural-network learning techniques, encouraging or discouraging decisions by altering neuron firing frequency, and even if one could argue that the neurotransmitter effect on neurons is not binary, unlike logic gates, their analog behavior can be replicated with binary behavior. Again, by looking at something we can actually map: neurotransmitters in earthworms, for example, work in their nervous system as a decision regulator.
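
    To illustrate that last claim about replicating analog behavior with binary behavior, here is a small sketch (an assumed toy model, not a neuroscience result): an analog modulation level is encoded as the rate of purely on/off events and recovered from their average.

    ```python
    import random

    # Toy rate coding: an analog "modulation level" in [0, 1] is represented
    # by how often a binary unit fires, not by any analog signal.
    def binary_spike_train(analog_level: float, steps: int = 10_000) -> list[int]:
        return [1 if random.random() < analog_level else 0 for _ in range(steps)]

    modulation = 0.73                          # analog level to encode
    spikes = binary_spike_train(modulation)    # purely 0/1 events
    recovered = sum(spikes) / len(spikes)      # average rate decodes the level
    print(round(recovered, 2))                 # ~0.73
    ```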

    By taking a look at the animal kingdom to comprehend our "seemingly inexplicable phenomenon" of consciousness, we can see that the more complex this network of logic is, the more behaviors emerge from it. In vastly more complex social organisms like bees, research has shown that they share more "traditionally human" behaviors than was previously thought. Some name that level of complexity "sentience" - but what does this sentience describe, if not a greater level of similarity to our own "special" experience, rather than something unique or a separate phenomenon?

    In essence, a decently complex lifeform is self-powered, has sensors that constantly gather information from the environment, can store an amount of memory, and contains a mindbogglingly complex neural network regulated by neurotransmitters that makes decisions.

    Moving on to more complex lifeforms, their similarities to our species increase. There are important differences, for example the capacity to store long-term memory, or the evolution of a dedicated neurotransmitter network (the amygdala), and many more, but at the end of the day, it boils down to the aggregation of complex computational systems.

    As "the hard problem of the consciousness" in the sense of how "gives rise to subjective experience", I don't see how it's not just simply a subsequent symptom of the complexity of our systems and the randomization of information. Randomization of information exists in every aspect of our conscious being. From our imperfect sensory inputs, to the wiring of our neural networks and the unique set of experiences and DNA that helps it form.

    Beyond information randomization, in theory, the quantum mind hypothesis could further explain and bridge the probabilistic nature of cognition that gives rise to subjectivity, but again, this is well within the realm of soon-to-be conventional computation. Anyhow, I think that speculating or even philosophizing around this kind of a black box is counter-productive to the discussion, so I won't touch it further.

    If there is information that dispels this, please, go forth and explain.
  • Benj96
    2.3k
    Because a property of consciousness is the ability to disagree on what consciousness is and what its nature is.
  • flannel jesus
    1.9k
    I think your approach is promising, but I also think it's at least currently incomplete. "Consciousness is just computation", while I agree it is a compelling possibility, still leaves us with the question, "so why do I experience seeing blue and green and yellow and red in the ways I see them?"

    Chinese Room, right?

    I wouldn't be surprised if the answer really was just computation of some sort in the end, but I don't think you're giving the Hard Problem enough credit in your post.
  • wonderer1
    2.2k
    There is already an amount of research around replicating microorganism behavior with a combination of logic gates - which is the fundamental computational mechanism in electronics. Example nice read:
    https://arstechnica.com/science/2010/12/building-logic-gates-with-bacterial-colonies/
    Generic Snowflake

    The article is about replicating logic gates with microorganisms, not vice versa as you suggest.
  • Generic Snowflake
    3
    It is most definitely incomplete - for starters, I couldn't hope to articulate it properly within a post, with a limited amount of time, and without a lot of the knowledge needed to pinpoint many of the logical leaps it makes.

    Even if I could do all that, the only way a solution would be complete would be for science to map and understand every single node and function of our brains. We already have done so for less complex organisms, though. So in general, it just seems like the most fitting solution given the information we have available at this time.

    I think the Chinese Room is fallacious and there are pretty convincing counterarguments against it. My own take is:
    - For starters, if a person who doesn't understand Chinese manually runs the program that can answer in Chinese, they will inevitably have used the knowledge contained in that program.
    - What is "understanding what characters mean" if not just data? If the only knowledge contained in the program was how to form an answer without knowledge of what each character means, then, be it a computer or a human, it would simply lack the knowledge that is absent from the program.
    Even if someone would need that knowledge to write the program, that doesn't mean it would be included in the program.
    - Can the actual meaning of characters be embedded in the program? Sure. The information we store and recall - ideas, pictures, sounds, smells, feelings, touch impressions - can all be represented by data. It's just that, should they come in the form of data, the latter three require molecules or direct brain signals to be communicated to us.

    By extension, the so-called Qualia (these subjective experiences) can be represented by data. Perhaps even then, there would be no way to communicate Qualia accurately, unless one shared the same brain, since each person has a differently formed network that processes them in different ways.


    Yes, my bad, I should have written "programming" instead of "replicating". I'll edit it and include an extra link.
  • Patterner
    1.1k

    As, uh, Flannel Jesus (lol) said, we still have the question of subjective experience. Everything you wrote is regarding physical processes. And those physical processes explain behavior nicely. Never having seen a red-hot piece of metal, my brain sends signals to my muscles, and I pick it up. My nerves send signals of extreme damage to my brain. My brain sends signals to my muscles to drop it.

    Various signals also get stored in my brain. The next time my retinas are hit with patterns of photons that are a close enough match to those stored signals that are linked with the damage my hand received, my brain does not send signals to my muscles to pick it up.

    Obviously, it's extraordinarily more complex than that. But it's all just physical things and processes, bringing about physical behavior. How do those physical things and processes bring about another phenomenon at the same time? A phenomenon that, as I said in a previous post, neither a leading expert in neurology and the study of consciousness nor a leading expert in the properties of particles, forces, and the laws of physics knows how to account for with neurons, particles, forces, and the laws of physics. They don't know how those things account for subjective experience.

    There's also the question of why. Behavior is explained by the physical. Why have the subjective experience that only observes after the fact? Why would evolution have selected for something that has no function?
  • bert1
    2k
    Behavior is explained by the physical. — Patterner

    Is it though? I do things because of the way I feel, it seems to me. So we have the problem of overdetermination. A topic for another thread I think. I still have to catch up on a paper @fdrake wants me to read though, so I'll do that first.
  • simplyG
    111
    The problem of consciousness is so hard not only because consciousness is an abstraction layer on top of a physical brain, but also because we are creatures that experience emotion and behave unexpectedly rather than mechanically.

    The reason it can’t be explained is that consciousness could be the divine spark manifesting its creator in human form, or it could even be a soul, but that’s unscientific.

    The mind is not the brain.
  • Patterner
    1.1k
    Behavior is explained by the physical.
    — Patterner

    Is it though? I do things because of the way I feel, it seems to me.
    bert1
    To me, too. I'm just stating the case for the other side, and asking how it works.

    (I sent you a message the other day. Don’t know if you were aware. Pardon the pun.)
  • Generic Snowflake
    3
    Ok I'll try to hone in.

    How can a collection of mindless, thoughtless, emotionless particles come together and yield inner sensations of color or sound, of elation or wonder, of confusion or surprise? — Brian Greene
    This is the gist of subjective experience, correct?

    As far as I know, when it comes to e.g. images (same with sound), the process can be explained to a sufficient degree computation-wise. The visual cortex is what allows us to "see", by translating signals from the eye's photoreceptors, since incoming photons form patterns (in the same way, sound is specific vibrations of particles in a medium). These patterns are specific pieces of data. For mental images, various parts of the brain cooperate in order to recall the data - or even build it from scratch - and create a mental image. This process hasn't been fully deciphered yet, but it seems plausible to me from what I've read that a subjective experience of a mental image can be regarded as "seeing via memory/imagination", in the same way we decode sensory information when seeing via the eyes, since it's the brain that does the actual seeing after all.

    When it comes to emotions, we know that they are contingent on neural functions and the process from which they stem, e.g. elation:
    - We receive or recall data.
    - Depending on its nature, certain neurotransmitters, which are hormones and other biochemical substances, are released. In the case of elation, these should be dopamine, endorphins, oxytocin etc.
    - We know these neurotransmitters bind to neuron receptors, opening ion channels and affecting their firing rate. In the case of elation, the neurons fire more frequently in the reward centers of the brain.

    You are right though, this one still seems far from deciphered. Could it be a bespoke form of information meant to shape behaviors so that lifeforms survive and evolve? Much like the cells that translate photon patterns into vision, or vibrations into sound? There doesn't seem to be any more information that connects it with the subjective experience of emotion.

    The reason the experience is private and unique is, I think, explained by our neurodivergence, as I wrote in the other post. After all, each person's sensors differ, so they don't get exactly the same information, and most importantly they process it in vastly different neural networks. Subsequently, since the processed data differs, so will the Qualia that come from it.

    I do disagree on the why - I think all forms of subjective experience have important evolutionary value. For example, recalling or imagining information has practical value, and emotions work as a reward/punishment mechanism that promotes certain behaviors, much like a reward function in AI reinforcement learning.
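
    To make that reward-function analogy concrete, here is a minimal sketch (hypothetical actions and payoffs, chosen only for illustration) of an epsilon-greedy learner whose behaviour is shaped purely by a scalar reward signal, in the spirit of reinforcement learning:

    ```python
    import random

    rewards = {"touch_hot_metal": -1.0, "eat_food": +1.0}  # assumed "pain"/"pleasure" payoffs
    value = {action: 0.0 for action in rewards}            # learned value estimates
    epsilon, alpha = 0.1, 0.1                              # exploration rate, learning rate

    for _ in range(1000):
        if random.random() < epsilon:                      # occasionally explore
            action = random.choice(list(rewards))
        else:                                              # otherwise pick what "feels best"
            action = max(value, key=value.get)
        reward = rewards[action]                           # emotion-like feedback signal
        value[action] += alpha * (reward - value[action])  # update the estimate

    print(max(value, key=value.get))                       # -> "eat_food"
    ```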
  • bert1
    2k
    To me, too. I'm just stating the case for the other side, and asking how it works. — Patterner

    Oh I see, that makes sense. Sorry I haven't been following closely.
  • Patterner
    1.1k

    Not your fault. I didn’t explain.
  • PeterJones
    415
    I don't see it as pessimistic at all or that anything is lost. What does a solution to the hard problem look like? I don't think there is a good one I can think of which doesn't imply some sort of dualism which I fundamentally disagree with. — Apustimelogist

    You're right. Dualism is hopeless. The solution would be nondualism.

    I am not suggesting looking for a fundamental ontology based on computation, but an explanation of why knowledge of fundamental ontologies is out of reach.

    I'm suggesting such knowledge is not out of reach. To show that it is out of reach would require ignoring all the people who claim to have such knowledge, or proving they do not.

    I think the explanation is actually already there, it just has to be articulated and demonstrated. Like you said, experiences are primitive.

    Ah. I didn't say this and would argue against it. You're conflating consciousness and experience, but I'm suggesting that the former is prior to the latter. Bear in mind that experience-experiencer is a duality that must be reduced in order to overcome dualism.

    We know experiences are related to the functional architecture of our brains. We can transfer or demonstrate the concept of this kind of primitiveness into the architectures and functional repertoires of A.I. We use A.I. to demonstrate the limits of what kinds of information are transferable from the environment, what kinds of concepts are created and what information they don't or can't include, and then see what kind of metacognitive consequences this has. Does an A.I. come up with primitive phenomenal concepts on a purely functional basis that it cannot explain, similarly to our hard problem? This is a totally plausible research program even if it may not be possible right at this moment.

    There are no primitive concepts or experiences. This was shown by Kant. For a solution one would have to assume a state or level of consciousness free of all concepts and prior to information. Don't forget that an information theory requires an information space, and the space comes before the information.

    Not sure what you mean here but functionally, yes we are just intelligent machines. We are just brains.

    If you believe this you will never have a fundamental theory and will have to live with the 'hard' problem forever. I wonder what leads you to believe this when it is just a speculation. If you believe this then much of what I'm saying will make no sense to you. I would advise against making such assumptions, or indeed any assumptions at all.
  • Apustimelogist
    621
    I'm suggesting such knowledge is not out of reach. To show that it is out of reach would require ignoring all the people who claim to have such knowledge, or proving they do not. — FrancisRay

    Who actually has a suggestion though? I don't think a fundamental ontology can be characterized because all explanations are functional and rely on our stream of experience. What does anything actually mean independently of the dynamics of experiences? What is the utility of any factual statement except in how that statement predicts further experiences? We might then want to characterize experience as the fundamental ontology but I resist that because I don't think there is a coherent account of what experience is or means and it seems impossible to characterize publicly or in scientific paradigms due to the nature of the hard problem.

    Ah. I didn't say this and would argue against it. You're conflating consciousness and experience, but I'm suggesting that the former is prior to the latter. — FrancisRay

    Well, this is confusing; you seemed to say it, since you replied to the quote saying you agreed. I have only been referring to experience the whole time. I am not sure what you mean by consciousness here. When I say experience is primitive, I just mean it in a kind of epistemic sense - experiences are immediately apparent and intuitive to us and they don't have an explicit characterization... I just see blue, I cannot tell you what it is.

    My whole experience (tentatively I would say consciousness) is just a stream of these things. They cannot be reduced further... they are the bottom and foundation for everything I know and perceive. That is to say nothing about reality but just that experiences are the primitive, irreducible foundation of what I know and perceive.

    Bear in mind that experience-experiencer is a duality that must be reduced in order to overcome dualism. — FrancisRay

    Not sure what you mean by experience-experiencer duality beyond conventional dualism. I am not sure what "experiencer" means.

    There are no primitive concepts or experiences. This was shown by Kant. — FrancisRay

    Again, my notion of primitiveness just relates to the immediate, irreducible apprehension of experiences after which there is nothing more basic epistemically.

    For a solution one would have to assume a state or level of consciousness free of all concepts and prior to information. — FrancisRay

    I don't think you can have consciousness free of information, nor do I understand why you think this is required for a solution.

    An information theory requires an information space, and the space comes before the information. — FrancisRay

    I don't think there is priority here. If there is information, it exists on an information space; an information space is defined by the information in it. One doesn't come before the other.


    If you believe this you will never have a fundamental theory and will have to live with the 'hard' problem forever. I wonder what leads you to believe this when it is just a speculation. If you believe this then much of what I'm saying will make no sense to you. I would advise against making such assumptions, or indeed any assumptions at all. — FrancisRay

    I don't see what your alternative suggestion could possibly be if you don't believe dualism is true. Regardless of what you think the fundamental reality is, the evidence is overwhelming about how consciousness relates to or can be characterized in terms of brains in a functional sense (I hope you understand what I mean when I say functionally). What is your alternative characterization?

    I am starting to think you haven't understood anything I have said at all. It's hard to believe now that you could have said my previous post was perceptive and a good summary if you really understood it. Neither have I been trying to think about some fundamental theory that resolves the hard problem. My initial post said that I didn't think the so-called hard problem could be solved at all.
  • PeterJones
    415
    Who actually has a suggestion though? — Apustimelogist
    Have you examined the suggestions of the Buddha, Lao Tzu and the Upanishads? Afaik there is no other explanation for consciousness that works.

    When I say experience is primitive, I just mean it in a kind of epistemic sense - experiences are immediately apparent and intuitive to us and they don't have an explicit characterization... I just see blue, I cannot tell you what it is.

    Okay. But I'm speaking ontologically. I'm suggesting that consciousness in its original state is prior to experience and is known simply as what it is.

    My whole experience (tentatively I would say consciousness) is just a stream of these things. They cannot be reduced further... they are the bottom and foundation for everything I know and perceive. That is to say nothing about reality but just that experiences are the primitive, irreducible foundation of what I know and perceive.

    If you explore your consciousness I predict that you'll eventually discover that consciousness is not a stream of things. These 'things' are the contents of consciousness, not the phenomenon itself. Meditation is the practice of seeing beyond these things to their underlying basis. This basis is beyond time and space, and knowing this is what 'enlightenment' means in Buddhism.

    Not sure what you mean by experience-experiencer duality beyond conventional dualism. I am not sure what "experiencer" means.

    An experience requires an experiencer. I'm suggesting that if you explore your consciousness you are capable of transcending this duality for the final truth about consciousness. The task would be to 'Know thyself', as advised by the Delphic oracle. When Lao Tzu is asked how he knows the origin of the universe he answers, 'I look inside myself and see'.

    Again, my notion of primitiveness just relates to the immediate, irreducible apprehension of experiences after which there is nothing more basic epistemically.

    This is a very bold assumption. I wonder whether you realise that what you're proposing is that the nondual doctrine of the Perennial philosophy is false.

    I don't think you can have consciousness free of information nor do I understand why you think this is required for a solution.

    An information theory without an information space is not fundamental or even coherent. You may believe that consciousness cannot be free of information, but it is telling that, having made this assumption, you cannot explain metaphysics, consciousness, or the hard problem. Have you considered that your problems may be caused by your own assumptions?

    I don't think there is priority here. If there is information, it exists on an information space; an information space is defined by the information in it. One doesn't come before the other.

    In order to draw a Venn diagram one must first have a blank sheet of paper.

    I don't see what your alternative suggestion could possibly be if you don't believe dualism is true. Regardless of what you think the fundamental reality is, the evidence is overwhelming about how consciousness relates to or can be characterized in terms of brains in a functional sense (I hope you understand what I mean when I say functionally). What is your alternative characterization?

    My suggestion is that consciousness is prior to number and form and that its function is simply knowing. All the rest is cogitation, intellection and conceptualisation. If you cannot imagine my alternative suggestion then this can only be because you've not studied philosophy beyond the walls of the Academy. You'll find the same suggestion in every book you ever read on mysticism. Those who investigate consciousness rather than speculate about it come back to report that at its root consciousness is prior to number and form and free of concepts and ideas.

    I am starting to think you haven't understood anything I have said at all. It's hard to believe now that you could have said my previous post was perceptive and a good summary if you really understood it. Neither have I been trying to think about some fundamental theory that resolves the hard problem. My initial post said that I didn't think the so-called hard problem could be solved at all.

    Yes. So I chipped in to say it was solved long ago and is easy to solve. The solution would be to abandon dualism and pay attention to what those who study consciousness have to say about it. It is astonishing how few people bother to do this, and so not at all surprising that so many people struggle with the hard problem.
  • Apustimelogist
    621


    I am sorry, I don't feel there is much to be gained in continuing this specific conversation. I find it very difficult to engage with your way of writing; it all seems very vague.
  • RogueAI
    2.9k
    What does a solution to the hard problem look like? — Apustimelogist

    Idealism.
  • RogueAI
    2.9k
    Take the simplest of computational networks - two states going through a logic gate, producing a new state. — Generic Snowflake

    Is an abacus falling through the air, beads moving back and forth from the wind, doing any computations?
  • RogueAI
    2.9k
    My whole experience (tentatively I would say consciousness) is just a stream of these things. They cannot be reduced further... they are the bottom and foundation for everything I know and perceive. — Apustimelogist

    What about when you clear your mind? When I meditate, I can clear my mind for at least a short time so there is no stream of anything, yet I'm still conscious.
  • Corvus
    3.4k


    A large part of consciousness is memory. Suppose you lost your memory totally, so that you were unable to recall even one second of perception. Then everything you perceive would be just a series of photographic images, like the many photos a cell phone camera has taken, seen with no meaning, no thoughts, no logic and no memories.

    Memory in perception is a significant process and has always been a large topic in psychology, but strangely it is not discussed much in cognitive science, and especially not in epistemology.
    Memory chips are among the most important functional parts of computer systems, along with the processors; without them, computing would be limited to abacus level in terms of practicality.

    How computer memories work and are manufactured can easily be found in electronics and computer engineering studies.

    Anyhow, consciousness is largely made of the memory function of the brain, and the way the brain's memory works could be analogised to computer memory.

    I am not an AI expert, but I am guessing this would be how AI brains work too. AI brains would have large RAM (random access memory) and also hard disks of huge capacity. These would store vast amounts of information about how they must respond to the input signals.

    A high-powered central microprocessor would monitor the input signals and analyse what response or action to take in the output forms. It would then process the relevant preprogrammed data from memory (either RAM or the hard disk), and output instructions to the installed hardware in the form of facial expressions and movements of hands, legs, even fingers, ordering these devices to carry out the instructed jobs.
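
    As a toy sketch of the lookup-style architecture described above (purely illustrative of the analogy, with hypothetical signal and response names; not how real AI systems actually work), a stored table maps input signals to preprogrammed output instructions:

    ```python
    # Hypothetical stimulus -> response table standing in for "preprogrammed
    # data in memory"; the loop plays the role of the central processor.
    response_table = {
        "greeting_detected": "smile",
        "obstacle_ahead":    "stop_legs",
        "object_offered":    "extend_hand",
    }

    def process(input_signal: str) -> str:
        # Look up the preprogrammed response; fall back to doing nothing.
        return response_table.get(input_signal, "idle")

    for signal in ["greeting_detected", "unknown_noise", "obstacle_ahead"]:
        print(signal, "->", process(signal))
    ```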

    Of course, human consciousness cannot be reduced to the memory function of the brain alone, but it cannot be denied that memory is a large part of it, if not the base of consciousness. This is what I believe.
  • Apustimelogist
    621

    Well I think I have somewhat of a reply to that in my initial post.



    Yes, it's computing solutions to equations of motion in physics.



    Yes, I still think even in these cases it's still just a stream of experience, and this kind of thing can be accounted for in terms of attention and access consciousness.
  • RogueAI
    2.9k
    "The idea that plants might be conscious is not popular, but it is definitely not untestable, unscientific, or “magicalist” (not a word)."
    https://iai.tv/articles/no-theory-of-consciousness-is-scientific-auid-2610?_auid=2020

    Conscious crystals can't be too far behind. I hope Shirley MacLaine lives long enough.
  • RogueAI
    2.9k
    Yes, it's computing solutions to equations of motion in physics. — Apustimelogist

    What do you think of this, by Searle? "The wall behind my back is right now implementing the WordStar program, because there is some pattern of molecule movements that is isomorphic with the formal structure of WordStar. But if the wall is implementing WordStar, if it is a big enough wall it is implementing any program, including any program implemented in the brain."
    https://philosophynow.org/issues/124/Is_Everything_A_Computer

    If no one is looking at a computer simulation of a tornado, is it still a computer simulation of a tornado? Or is it just a bunch of pixels turning off and on?
  • Apustimelogist
    621


    I don't know, I would have to think about it; I am neither here nor there.