On the one hand, fields are real and modeled mathematically: — Mww
Yes. The intrinsic Either/Or aspect of our apparently dual "Reality" is what Einstein was talking about in his Theory of Relativity. What's real depends on who's looking. That's also why my personal worldview is based on a complementary Both/And perspective. For all practical purposes (science), what we perceive as concrete objects and physical effects is what is Real. But for theoretical purposes (philosophy), our perceptions of those objects are mental constructs. So discussions about Consciousness must make that distinction clear, or else, by reifying Consciousness, we run into the paradoxical "hard problem".

And on the other, fields are completely abstract and quantitatively incommensurable directly: — Mww
And this very scientist says that, and again I quote [ ... ] There genuinely, really is 'a hard problem of consciousness', ... — Wayfarer
... but it's almost beyond doubt that you don't actually comprehend what it is. — Wayfarer
No. "The hard problem ... ", like e.g. æther, is an empty concept — 180 Proof
The purported 'hard problem' is dissolved - as are many other so-called 'problems' - when we quit using utterly inadequate frameworks to talk about stuff.
If you are ascribing some kind of independence to subconscious phenomena that's a pretty large leap. — Pantagruel
I'm familiar with Laszlo, but not with that abstruse theory. However, the term sounds like Cartesian Dualism to me. His solution was "neat", in that it got the church off his back, by arbitrarily defining Non-Overlapping Magisteria. And materialistic Science has flourished for centuries since cutting itself off from Philosophy and Metaphysics. But since the Quantum revolution in Science, the overlap between Mind & Matter has become ever harder to ignore. Anyway, I'll check it out, because the notion of Complementarity is essential to my own abstruse thesis. :smile:

It absolutely does address the hard problem of consciousness. The solution is called "biperspectivism". It was quite neat. — Pantagruel
The statements you refer to are empty (meaningless) to you, because you don't understand the unconventional worldview that the assertions are derived from. That's why I provide links for those who are interested enough to investigate a novel way of looking at the world.

You keep making empty statements. How does that have anything to do with this thread and what I said in the opening post? — Zelebg
So you intend a falsification of A = A, insofar as some occasions permit A = not-A? I submit that if you’re daydreaming you’re not driving — Mww
Same with metaphysical truths, per se: the principles of them may be found in reason a priori, and the possible objects given from those principles may be exemplified by experience, but that is not sufficient in itself to allow truths of any kind to reside in consciousness. Truth is where cognition conforms to its object, and no cognition is possible that is not first a judgement. Therefore, it is the case that truth resides in judgement, and if there is such judgement we are then conscious of that which is cognized as true. — Mww
Why would it have one? — Mww
All the stuff about ethics and spirituality is beside any of this. This is just descriptive; any prescriptions could be paired with this. Accepting this description of the world doesn't say anything about what is or isn't valuable or good, etc. — Pfhorrest
"P-zombie" is an incoherent construct because it violates Leibniz's Identity of Indiscernibles without grounds to do so. To wit: an embodied cognition that's physically indiscernible from an ordinary human being cannot not have "phenomenal consciousness", since that is a property of human embodiment (or output of human embodied cognition). A "p-zombie", in other words, is just a five-sided triangle ... — 180 Proof
Why would an entity that has the appearance of a regular human necessarily have phenomenal consciousness?
That's a strong claim. It would require strong evidence. — frank
For all practical purposes (science), what we perceive as concrete objects and physical effects is what is Real. But for theoretical purposes (philosophy), our perceptions of those objects are mental constructs. So discussions about Consciousness must make that distinction clear, or else, by reifying Consciousness, we run into the paradoxical "hard problem". — Gnomon
what Einstein was talking about in his Theory of Relativity. What's real depends on who's looking. — Gnomon
Or are you saying you can't imagine the p-zombie at all? — frank
I want to understand how the eye works and how that corresponds to the sound of a breeze through the leaves of a tree. — ovdtogt
Unfortunately, "consciousness" is an analogous term, and using this definition, when I define consciousness differently (as "awareness of intelligibility"), is equivocation. If you want to criticize my work, then you must use technical terms as I use them. In saying this, I am not objecting to your definition in se, only to its equivocal use. — Dfpolis
I will agree with you that it is a Working Hypothesis, since we don't already have a Theory, mainly because we have too many competing frameworks at this point.

Then you will have no problem in explaining how this hypothesis, which I am calling the Standard Model (SM), conforms to the facts I raised against it. — Dfpolis
-Yes, a healthy functioning brain is a necessary and sufficient explanation for any property of mind known to us. We may miss many details on how specific properties correlate to specific brain functions, but that's not a reason to overlook the huge body of knowledge that we've gained over the last 35 years.

Please note that I fully agree that rational thought requires proper brain function. So, that is not the issue. The issue is whether brain function alone is adequate. — Dfpolis
-An important question that comes to mind is: "Is your problem relevant to our efforts to understand?"

That may well be true. I do not know what neuroscientists consider hard, nor is that what I am addressing in my article. As I made clear from the beginning, I am addressing the problem Chalmers defined. That does not prevent you from discussing something else, as long as you recognize that in doing so you are not discussing my article or the problem it addresses. In saying that, I am not denigrating the importance of the problems neuroscientists consider hard -- they're just not my problem. — Dfpolis
In defining the Hard Problem, you quote a reputable secondary source (Scholarpedia), but I quoted a primary source. So, I will stick with my characterization. — Dfpolis
-Agreed. But if Chalmers wanted answers to his "why" questions with a different sense, he should have been studying Cognitive Science. I.e., to his first why question, "Why are physical processes ever accompanied by experience?", the answer is simple: evolutionary principles. Making meaning of your world adds an advantage for survival and flourishing (avoiding suffering, managing pleasure, etc.).

There are many senses of "why." Aristotle enumerates four. I suppose you mean "why" in the sense of some divine purpose. But, I did not ask or attempt to answer that question. The question I am asking is how we come to be aware of neurally encoded contents. So, I fail to see the point you are making. — Dfpolis
In my opinion you fail because as you said yourself, you ignore the latest work and the hard questions tackled by Neuroscience.

The question I am asking is how we come to be aware of neurally encoded contents. So, I fail to see the point you are making. — Dfpolis
-I was referring to Chalmers's pseudo-philosophical "why" questions. Questions like "Why is there something instead of nothing?" are designed to remain unanswered.

However, if you wish to call something "pseudo philosophical" or claim that it "create unsolvable questions," some justification for your claims would be courteous. Also, since I solved the problems I raised, they are hardly "unsolvable." — Dfpolis
-Sure, there are many problems we haven't solved (yet). Why do you think that the SM won't manage to finally provide a solution, and how are you sure that some of them aren't solved already? After all, as you stated, you are not familiar with the current Science on the topic.

I have never denied that the SM is able to solve a wide range of problems. It definitely is. The case is very like that of Newtonian physics, which can also solve many problems. However, I enumerated a number of problems it could not solve. Will you not address those? — Dfpolis
Well, I don't know if it was a critique of your work. I only address the paragraph (Article) on Reduction and Emergence: "Does the Hard Problem reflect a failure of the reductive paradigm?"

Again, this does not criticize my work, because you are not saying that my analysis is wrong, or even that reduction is not involved. — Dfpolis

-As I explained, if you are pointing to a different problem then you are committing a logical error. Science, and every single one of us, is limited within a single realm. The burden is not on Science to prove the phenomenon to be physical; it is on the side making the claim for an additional substrate. The two justified answers are "we currently don't know" or "this mechanism is necessary and sufficient to explain the phenomenon".

Rather, you want me to look at a different problem. Further, with respect to that different problem, you do not even claim that the named methods have made progress in explaining how awareness of contents comes to be. So, I fail to see the cogency of your objection. — Dfpolis
-My objection was with the word "prove", since in science we don't prove anything.

It is a definition, specifying how I choose to use words, and not a claim that could be true or false. — bert1
-OK, I think we are on the same page on that.

Sorry, I just don't think you've grasped the distinction between definition and theory. — bert1
-Because the hard problem ... is a made-up problem (Chalmers's teleological questions).

I agree. I did not say that science proved frameworks, but that we use their principles to deduce predictions. That is the essence of the hypothetico-deductive method. — Dfpolis
-Yes you did, but you also accept a portion of it ... right? In retrospect, you did state that your questions seek the "how", and I pointed out that Science has addressed many "how" questions on Brain functions and meaning/Symbolic thinking.

If you read carefully, you would see that I criticized Chalmers' philosophy, rather than basing my argument on it. — Dfpolis
-Sure, you clarified that, and I pointed out the problem with your "how" questions. Many "how" questions have already been addressed, and if they haven't been, that is not a justification to reject the whole model (the Quasi Dogmatic Principles protect the framework at all times). After all, it's a dynamic model in progress that yields results, and the only one that can be applied, tested, and that produces causal descriptions and Technical Applications!

This is baloney. I am asking "how" questions. The SM offers no hint as to how these observed effects occur. In fact, it precludes them. — Dfpolis
-Strawman; I never said he did. I only pointed out the main historical errors in our Philosophy: teleology in nature (Chalmers's hard problem) and agency with properties pretty similar to the properties displayed by the phenomenon we are trying to explain (your claim on the non-physical nature of Consciousness).

Obviously, you have never read Aristotle, as he proposes none of these. That you would think he does shows deep prejudice. Instead of taking the time to learn, or at least remaining quiet when you do not know, you choose to slander. It is very disappointing. A scientific mind should be open to, and thirsty for, the facts. — Dfpolis
-Abstract concepts do not help complex topics like this one. On the contrary, they introduce more ambiguity in the discussion. Plus, you strawmanned me again with that supernatural first-person data.

Nor am I suggesting that we do. I am suggesting that methodological naturalism does not restrict us to the third-person perspective of the Fundamental Abstraction. That you would think that considering first-person data is "supernatural" is alarming. — Dfpolis
I recently published an article with the above title (https://jcer.com/index.php/jcj/article/view/1042/1035). Here is the abstract: — Dfpolis
-That is only true for the advances in Philosophy. Almost all the breakthroughs made by relevant Scientific disciplines never make it into Neurophilosophy, mainly because Philosophical frameworks that are based on the latest epistemology are part of Cognitive Science.

Yet, in the years since David Chalmers distinguished the Hard Problem of Consciousness from the easy problems of neuroscience, no progress has been made toward a physical reduction of consciousness. — D. F. Polis
This, together with collateral shortcomings Chalmers missed, shows that the SM is inadequate to experience. — D. F. Polis
"The explanatory gap" is misinterpreted by many philosophers as an "unsolvable problem" (by philosophical means alone, of course) for which they therefore fiat various speculative woo-of-the-gaps that only further obfuscate the issue.
— 180 Proof
Not at all.
In philosophy of mind and consciousness, the explanatory gap is the difficulty that physicalist theories have in explaining how physical properties give rise to the way things feel when they are experienced. It is a term introduced by philosopher Joseph Levine.[1] In the 1983 paper in which he first used the term, he used as an example the sentence, "Pain is the firing of C fibers", pointing out that while it might be valid in a physiological sense, it does not help us to understand how pain feels.
The explanatory gap has vexed and intrigued philosophers and AI researchers alike for decades and caused considerable debate. Bridging this gap (that is, finding a satisfying mechanistic explanation for experience and qualia) is known as "the hard problem".
— Wikipedia
As I've shown already in this thread, the hard explanatory problem has scientific validation, namely, that of the subjective unity of consciousness, and how to account for it in neurological terms. This is one aspect of the well-known neural binding problem, which is how to account for the way all of the disparate activities of the brain and body culminate in the obvious fact of the subjective unity of experience.
As is well known, current science has nothing to say about subjective (phenomenal) experience and this discrepancy between science and experience is also called the “explanatory gap” and “the hard problem” (Chalmers 1996). There is continuing effort to elucidate the neural correlates of conscious experience; these often invoke some version of temporal synchrony as discussed above.
There is a plausible functional story for the stable world illusion. First of all, we do have a (top-down) sense of the space around us that we cannot currently see, based on memory and other sense data—primarily hearing, touch, and smell. Also, since we are heavily visual, it is adaptive to use vision as broadly as possible. Our illusion of a full field, high resolution image depends on peripheral vision—to see this, just block part of your peripheral field with one hand. Immediately, you lose the illusion that you are seeing the blocked sector. When we also consider change blindness, a simple and plausible story emerges. Our visual system (somehow) relies on the fact that the periphery is very sensitive to change. As long as no change is detected it is safe to assume that nothing is significantly altered in the parts of the visual field not currently attended.
But this functional story tells nothing about the neural mechanisms that support this magic. What we do know is that there is no place in the brain where there could be a direct neural encoding of the illusory detailed scene (Kaas and Collins 2003). That is, enough is known about the structure and function of the visual system to rule out any detailed neural representation that embodies the subjective experience. So, this version of the Neural Binding Problem really is a scientific mystery at this time.
— Jerome S. Feldman, The Neural Binding Problem(s)
Your continual invocation of 'woo of the gaps' only illustrates that you're not grasping the problem at hand. It's a hard problem for physicalism and naturalism because of the axioms they start from, not because there is no solution whatever. Seen from other perspectives, there is no hard problem; it simply dissolves. It's all a matter of perspective. But seen from the perspective of modern scientific naturalism, there is an insuperable problem, because its framework doesn't accommodate the reality of first-person experience, a.k.a. 'being', which is why 'eliminative materialism' must insist that it has no fundamental reality. You're the one obfuscating the problem, because it clashes with naturalism - there's an issue you're refusing to see which is as plain as the nose on your face.
'Speculative woo-of-the-gaps' is at bottom simply the observation that there are things about the mind that science can't know, because of its starting assumptions. It's a very simple thing, but some guy by the name of Chalmers was able to create an international career as an esteemed philosopher by pointing it out. — Wayfarer
Unfortunately, "consciousness" is an analogous term, and using this definition, when I define consciousness differently (as "awareness of intelligibility"), is equivocation. If you want to criticize my work, then you must use technical terms as I use them. In saying this, I am not objecting to your definition in se, only to its equivocal use.

"Consciousness is an arousal and awareness of environment and self, which is achieved through action of the ascending reticular activating system (ARAS) on the brain stem and cerebral cortex." — Nickolasgaspar
Then you will have no problem in explaining how this hypothesis, which I am calling the Standard Model (SM), conforms to the facts I raised against it. Please note that I fully agree that rational thought requires proper brain function. So, that is not the issue. The issue is whether brain function alone is adequate.

The conclusion that brain function is responsible for human behavior and thought processes is way more than an assumption. — Nickolasgaspar
That may well be true. I do not know what neuroscientists consider hard, nor is that what I am addressing in my article. As I made clear from the beginning, I am addressing the problem Chalmers defined. That does not prevent you from discussing something else, as long as you recognize that in doing so you are not discussing my article or the problem it addresses. In saying that, I am not denigrating the importance of the problems neuroscientists consider hard -- they're just not my problem.

Now, Chalmers's attempt to identify the Hard problem of Consciousness had nothing to do with the actual Hard problems faced by the field. — Nickolasgaspar
There are many senses of "why." Aristotle enumerates four. I suppose you mean "why" in the sense of some divine purpose. But, I did not ask or attempt to answer that question. The question I am asking is how we come to be aware of neurally encoded contents. So, I fail to see the point you are making.

Searching for meaning in natural processes is a pseudo-philosophical attempt to project Intention and purpose into nature (Agency) and create unsolvable questions. Proper questions for understanding consciousness should begin with "how" and "what", not "why" (how something emerges, what is responsible for it, etc.). — Nickolasgaspar
I have never denied that the SM is able to solve a wide range of problems. It definitely is. The case is very like that of Newtonian physics, which can also solve many problems. However, I enumerated a number of problems it could not solve. Will you not address those?

The current Working Hypothesis (SM) is more than adequate to explain the phenomenon. It even allows us to make predictions and produce Technical Applications that can directly affect, alter or terminate the phenomenon. It establishes Strong Correlations between lower-level systems (brain function) and higher-level systems (mental states and properties). — Nickolasgaspar
Again, this does not criticize my work, because you are not saying that my analysis is wrong, or even that reduction is not involved. Rather, you want me to look at a different problem. Further, with respect to that different problem, you do not even claim that the named methods have made progress in explaining how awareness of contents comes to be. So, I fail to see the cogency of your objection.

The Hard Problem doesn't reflect a failure of the reductive paradigm, because this paradigm (a tool of science) is not that RELEVANT to the methods we use to study Mental properties. Complexity Science and Scientific Emergence are the proper tools for the job. — Nickolasgaspar
It is a definition, specifying how I choose to use words, and not a claim that could be true or false.

"Epistemological emergence occurs when the consequences of known principles cannot be deduced. We often assume, but cannot prove, that system behavior is the result of isolated component behavior."
-That's not quite true. — Nickolasgaspar
I agree. I did not say that science proved frameworks, but that we use their principles to deduce predictions. That is the essence of the hypothetico-deductive method.

First of all, in science we don't "prove" frameworks; we falsify them, and we accept them for their Descriptive and Predictive power. — Nickolasgaspar
It is also a term that I did not employ.

Strong Emergence is an observer-relative term. — Nickolasgaspar
I am not sure how a problem, of any sort, can be a fallacy. It is just an issue that bothers someone, and seeks resolution. It may be based on a fallacy, and if it is, then exposing the fallacy solves it.

In my opinion the whole "Hard Problem" objection is nothing more than an Argument from Ignorance and, in many cases, from Personal Incredulity Fallacies. — Nickolasgaspar
If you read carefully, you would see that I criticized Chalmers' philosophy, rather than basing my argument on it.

I could go in depth challenging the rest of the claims in the paper, but it seems like it tries to draw its validity from Chalmers' bad philosophy. — Nickolasgaspar
Then you will have no difficulty in showing how my specific objections about reports of consciousness, one-to-many mappings from the physical to the intentional, and propositional attitudes, inter alia, are resolved by this theory -- or how neurally encoded intelligible contents become actually known. Despite the length of your response, you have made no attempt to resolve these critical issues.

The Ascending Reticular Activating System, the Central Lateral Thalamus, and the latest Theories of Consciousness on Emotions as the driving force (Mark Solms, founder of Neuropsychoanalysis) leave no room for a competing non-naturalistic theory in Methodological Naturalism and in Philosophy in general. — Nickolasgaspar
This is baloney. I am asking "how" questions. The SM offers no hint as to how these observed effects occur. In fact, it precludes them.

Because we cannot answer a "why" question. — Nickolasgaspar
Obviously, you have never read Aristotle, as he proposes none of these. That you would think he does shows deep prejudice. Instead of taking the time to learn, or at least remaining quiet when you do not know, you choose to slander. It is very disappointing. A scientific mind should be open to, and thirsty for, the facts.

It takes us back in bed with Aristotle. Are we going to resurrect Gods, Phlogiston, Miasma, Panacea, Orgone Energy all over again??? — Nickolasgaspar
Nor am I suggesting that we do. I am suggesting that methodological naturalism does not restrict us to the third-person perspective of the Fundamental Abstraction. That you would think that considering first-person data is "supernatural" is alarming.

We don't have the evidence (yet) to use Supernatural Philosophy (reject the current Scientific paradigm of Methodological Naturalism) in our explanations just because we miss pieces from our puzzle. — Nickolasgaspar
Behavior is not consciousness. That's stimulus and response. How do you behave when something sharp pokes into your back? How do you behave when your energy levels are depleted? These are not questions of consciousness.

Subjective consciousness is not empirically observable. Behavioral consciousness is. — Philosophim
It's a mystery because nobody can explain it. Christof Koch can't, try though he does. You are not even offering speculations. You only say it happens in the brain. That's obviously where my consciousness is. But what is the mechanism?

The only reason it's a mystery is you think that it's impossible for consciousness to come out of physical matter and energy. Why? It clearly does. — Philosophim
Not for me. I don't care what the answer is. I just want to know what it is.

Is it some necessary desire that we want ourselves to be above physical reality? — Philosophim
If it was not a mystery, we would have the answer. We don't. The resistance, in my case, is that the answer of "It just does" to the question of "How does the physical brain produce consciousness?" is no answer at all. Just as we wouldn't accept that answer to "How does eating food give us energy?", we shouldn't accept it here.

Because if you eliminate that desire, it's clear as day that consciousness is physical by even a cursory glance into medicine and brain research. I just don't get the mystery or the resistance. — Philosophim
Yes it is. That's what is meant when people refer to the Hard Problem of Consciousness.

That is the Hard Problem. "Through our physical brain" is a where, not a how. "In the sky" does not tell us how flight is accomplished. "In our legs" does not tell us how walking is accomplished. "In our brain" does not tell us how consciousness is accomplished. The details are not insignificant. They are remarkably important. And they are unknown.
— Patterner
Sure, but it's not the hard problem. — Philosophim
Many books and articles on consciousness have appeared in the past few years, and one might think that we are making progress. But on a closer look, most of this work leaves the hardest problems about consciousness untouched. Often, such work addresses what might be called the “easy” problems of consciousness: How does the brain process environmental stimulation? How does it integrate information? How do we produce reports on internal states? These are important questions, but to answer them is not to solve the hard problem: Why is all this processing accompanied by an experienced inner life?
In philosophy of mind, the hard problem of consciousness is to explain why and how humans and other organisms have qualia, phenomenal consciousness, or subjective experiences. It is contrasted with the "easy problems" of explaining why and how physical systems give a (healthy) human being the ability to discriminate, to integrate information, and to perform behavioral functions such as watching, listening, speaking (including generating an utterance that appears to refer to personal behaviour or belief), and so forth. The easy problems are amenable to functional explanation: that is, explanations that are mechanistic or behavioral, as each physical system can be explained (at least in principle) purely by reference to the "structure and dynamics" that underpin the phenomenon.
The hard problem of consciousness is the problem of explaining why any physical state is conscious rather than nonconscious. It is the problem of explaining why there is “something it is like” for a subject in conscious experience, why conscious mental states “light up” and directly appear to the subject.
The hard problem of consciousness (Chalmers 1995) is the problem of explaining the relationship between physical phenomena, such as brain processes, and experience (i.e., phenomenal consciousness, or mental states/events with phenomenal qualities or qualia). Why are physical processes ever accompanied by experience? And why does a given physical process generate the specific experience it does—why an experience of red rather than green, for example?
As far as the complex processes of the body that spark a consciousness go, I suspect that activated matrices of neurons and electromagnetic (EM) fields play a part in activating dispersed areas of the brain to form coherent qualitative conscious responses.
This would somewhat explain our preoccupation with consciousness being an ethereal non-physical thing, as EM fields are essentially invisible to human perception. — Brock Harding
The really hard problem of consciousness is the problem of experience. When we think and perceive, there is a whir of information-processing, but there is also a subjective aspect. As Nagel (What Is It Like to Be a Bat?, 1974) has put it, there is something it is like to be a conscious organism. This subjective aspect is experience. When we see, for example, we experience visual sensations: the felt quality of redness, the experience of dark and light, the quality of depth in a visual field. Other experiences go along with perception in different modalities: the sound of a clarinet, the smell of mothballs. Then there are bodily sensations, from pains to orgasms; mental images that are conjured up internally; the felt quality of emotion, and the experience of a stream of conscious thought. What unites all of these states is that there is something it is like to be in them. All of them are states of experience.
It is undeniable that some organisms are subjects of experience. But the question of how it is that these systems are subjects of experience is perplexing. — David Chalmers
So, something like Aristotelian realism about universals? — aporiap
I'm not familiar with terms like 'notes of comprehension' or 'essential notes'. — aporiap
You say that logical distinction is predicated on the fact that intentional objects like concepts are different from materiality not ontologically but by virtue of not sharing these notes of comprehension. — aporiap
I mentioned in the post that it poses a problem for programs which require continual looping or continual sampling. In this instance the program would cease being an atmospheric sampler if it lost the capability of iteratively looping, because it would then lose the capability to sample [i.e. it would cease being a sampler]. — aporiap
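A minimal sketch of the kind of looping program being described may make the point concrete. The read_sensor function and the sampling interval below are hypothetical stand-ins rather than anything from the original post; the relevant feature is only that if the loop stops, nothing is sampled, so the program no longer functions as a sampler.

```python
import random
import time

def read_sensor():
    # Hypothetical stand-in for real sampling hardware:
    # returns a simulated CO2 reading in parts per million.
    return 400 + random.gauss(0, 5)

def run_sampler(num_iterations=5, interval_seconds=1.0):
    """Iteratively sample the 'atmosphere'. The sampling just is the loop:
    if the loop is lost, nothing is sampled and the program ceases to be a sampler."""
    readings = []
    for _ in range(num_iterations):
        readings.append(read_sensor())
        time.sleep(interval_seconds)
    return readings

if __name__ == "__main__":
    print(run_sampler(num_iterations=3, interval_seconds=0.1))
```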
What do you mean they solve mathematical problems only? There are reinforcement learning algorithms out now which can learn your buying and internet surfing habits and suggest adverts based on those preferences. There are learning algorithms which, from scratch and without hard-coded instruction, can defeat players at high-level strategy games, without using mathematical algorithms. — aporiap
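For what it is worth, here is a minimal, self-contained sketch of the sort of learning being claimed: tabular Q-learning on a toy "walk right to the goal" task. The corridor environment, reward values, and hyperparameters are invented purely for illustration and are not drawn from any system mentioned above; the point is only that the winning policy is acquired by trial and error rather than being hard-coded.

```python
import random

# Toy corridor: states 0..4, goal at state 4; actions: 0 = left, 1 = right.
N_STATES, GOAL = 5, 4
ACTIONS = [0, 1]

def step(state, action):
    """Move one cell left or right; reward 1 only on reaching the goal."""
    next_state = max(0, min(GOAL, state + (1 if action == 1 else -1)))
    reward = 1.0 if next_state == GOAL else 0.0
    return next_state, reward, next_state == GOAL

# The Q-table starts with no task knowledge: nothing about "go right" is hard-coded.
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
alpha, gamma, epsilon = 0.5, 0.9, 0.2  # learning rate, discount factor, exploration rate

for episode in range(200):
    state, done = 0, False
    while not done:
        # Explore occasionally; otherwise exploit what has been learned so far.
        if random.random() < epsilon:
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: Q[(state, a)])
        next_state, reward, done = step(state, action)
        best_next = max(Q[(next_state, a)] for a in ACTIONS)
        Q[(state, action)] += alpha * (reward + gamma * best_next - Q[(state, action)])
        state = next_state

# The learned greedy policy prefers "right" (action 1) in every non-goal state.
print([max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(GOAL)])
```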
Also I don't get the point about why operating on reality representations somehow makes data-processing unable to be itself conscious. The kind of data-processing going on in the brain is identical to the consciousness in my account. It's either that or the thing doing the data processing [i.e. the brain] which is [has the property of] consciousness by virtue of the data processing. — aporiap
Take an algorithm which plays movies for instance. Any one iteration of the loop outputs one frame of the movie... The movie, here, is made by viewing the frames in a sequential order. — aporiap
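A minimal sketch of such a frame loop, with a placeholder frames list standing in for real video data, shows the structure being described: each iteration outputs exactly one frame, and the "movie" exists only as the whole sequence viewed in order.

```python
import time

def play_movie(frames, frame_delay=1.0 / 24):
    """Each pass through the loop outputs exactly one frame; the 'movie'
    is nothing over and above the sequence of frames taken in order."""
    for index, frame in enumerate(frames):
        print(f"frame {index}: {frame}")  # stand-in for rendering to a screen
        time.sleep(frame_delay)

if __name__ == "__main__":
    play_movie(["sunrise", "sunrise", "clouds roll in", "sunset"], frame_delay=0.05)
```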
But, if it can't be physical, and it's not data processing, what is the supposed cause?
I don't think the multiple realization argument holds here... it could just be something like a case of convergent evolution, where you have different configurations independently giving rise to the same phenomenon - in this case consciousness. E.g., a cathode-ray-tube TV vs. a digital TV vs. some other TV operate under different mechanisms and yet result in the same output phenomenon - an image on a screen. — aporiap
I am not in the field of computer science but from just this site I can see there are at least three different kinds of abstract computational models. Is it true that physical properties of the machine are necessary for all the other models described? — aporiap
Even if consciousness required certain physical features of hardware, why would that matter for the argument, since your ultimate goal is not to argue for the necessity of certain physical properties for consciousness but instead (1) for consciousness as being fundamentally intentional and (2) that intentionality is fundamentally distinct from [albeit co-present with] materiality. — aporiap
I actually think my personal thought is not that different to yours but I don't think of intentionality as so distinct as to not be realized by [or, a fundamental property of] the activity of the physical substrate. My view is essentially that of Searle but I don't think consciousness is only limited to biological systems. — aporiap
I don't understand why a neuron not being conscious but a collection of neurons being conscious automatically leads to the hard problem. — aporiap
Searle provides a clear intuitive solution here, in which it's an emergent property of a physical system in the same way viscosity or surface tension are emergent from lower-level interactions: it's the interactions [electrostatic attraction/repulsion] which, summatively, result in an emergent phenomenon [surface tension]. — aporiap
Well the retinal state is encoded by a different set of cells than the intentional state of 'seeing the cat' - the latter would be encoded by neurons within a higher-level layer of cells [i.e. cells which receive iteratively processed input from lower-level cells] whereas the raw visual information is encoded in the retinal cells and immediate downstream area of early visual cortex. You could have two different 'intentional states' encoded by different layers of the brain or different sets of interacting cells. The brain processes in parallel and sequentially — aporiap
Okay but you seem to imply in some statements that the intentional is not determined by or realized by activity of the brain. — aporiap
I would say intentional state can be understood as some phenomenon that is caused by / emerges from a certain kind of activity pattern of the brain. — aporiap
Of course the measurables are real, and so are their relations, which are characterized in equations; but the actual entities may just be theoretical. — aporiap
I was trying to say that introspection is not the only way to get knowledge of conscious experience. I'm saying it will be possible [one day] to scan someone's brain, decode some of their mental contents and figure out what they are feeling or thinking. — aporiap
The more accurate thing to say is that there are neurons in higher-level brain regions which fire selectively to seemingly abstract stimuli. — aporiap
That seems to account for the intentional component no? — aporiap
My question is if dualism isn't correct, would there be a need for two problems of consciousness? — Wheatley
So after a bit more reflection on questions like why does consciousness, this universe, or even existence "exists", I began to think that maybe it's our understanding of consciousness that makes the problem seem "hard". — Flaw
Perhaps your opinion is that we only need to solve the 'easy' problem of consciousness, and that we don't need to take the 'hard' problem seriously. I don't mind that. It sounds pragmatic. — pfirefry
"Why should physical processing give rise to a rich inner life at all? It seems objectively unreasonable that it should, and yet it does."
The critical common trait among these easy problems is that they all concern how a cognitive or behavioral function is performed. All are ultimately questions about how the brain carries out some task: how it discriminates stimuli, integrates information, produces reports and so on. Once neurobiology specifies appropriate neural mechanisms, showing how the functions are performed, the easy problems are solved. The hard problem of consciousness, in contrast, goes beyond problems about how functions are performed. Even if every behavioral and cognitive function related to consciousness were explained, there would still remain a further mystery: Why is the performance of these functions accompanied by conscious experience? It is this additional conundrum that makes the hard problem hard.
Another way to express the Hard Problem is: "How does physical activity (neural & endocrinological) result in the meta-physical (mental) functions that we label 'Ideas' and 'Awareness'?" — Gnomon
But, like Gravity, we only know what it does physically, not what it is essentially. — Gnomon
Recent scientific investigations have found that Information is much more than the empty entropic vessels of Shannon's definition. Information also is found in material & energetic forms. — Gnomon
The "physical capability" of Energy to exist is taken for granted, because we can detect its effects by sensory observation, even though we can't see or touch Energy with our physical senses*2. Mechanical causation works by direct contact between material objects. But Mental Causation works more like "spooky action at a distance". So, Consciousness doesn't act like a physical machine, but like a metaphysical person. — Gnomon
Again, in my thesis, Consciousness is defined as a process or function of physical entities. We have no knowledge of consciousness apart from material substrates. But since its activities are so different from material Physics, philosophers place it in a separate category of Meta-Physics. And religious thinkers persist in thinking of Consciousness in terms of a Cartesian Soul (res cogitans), existing in a parallel realm. — Gnomon
But my thesis postulates that both Physical Energy and Malleable Matter are emergent from a more fundamental element of Nature: Causal EnFormAction*4 (EFA). The Big Bang origin state was completely different from the current state, in that there was no solid matter as we know it. Instead, physicists imagine that the primordial state was a sort of quark-gluon Plasma, neither matter nor energy, but with the potential (EFA) for both to emerge later. And ultimately for the emergence of Integrated Information as Consciousness. :smile: — Gnomon
The evidential Gap, beyond the evidence, can be filled with speculation of Creation, or a Tower-of-Turtles hypothesis. — Gnomon
However, Philosophical questions about Mind & Consciousness depend on personal reasoning (Inference) from that physical evidence. If you can't make that deduction from available evidence, then you live in a matterful but mindless & meaningless world. And the mystery of Consciousness is dispelled, as a ghost, with a wave of dismissal. — Gnomon