We learn by abstraction from experience. — Dfpolis
My question would be, if we had a priori "knowledge," what reason would we have to believe that it applied to the world of experience? — Dfpolis
So, something like Aristotelian realism about universals? — aporiap
I'm not familiar with terms like 'notes of comprehension' or 'essential notes'. — aporiap
You say that logical distinction is predicated on the fact that intentional objects like concepts are different from materiality not ontologically but by virtue of not sharing these notes of comprehension. — aporiap
I mentioned in the post that it poses a problem for programs which require continual looping or continual sampling. In this instance the program would cease being an atmospheric sampler if it lost the capability of iteratively looping, because it would then lose the capability to sample [i.e. it would cease being a sampler]. — aporiap
What do you mean they solve mathematical problems only? There are reinforcement learning algorithms out now which can learn your buying and internet surfing habits and suggest adverts based on those preferences. There are learning algorithms which - from scratch, without hard-coded instruction - can defeat players at high-level strategy games, without using mathematical algorithms. — aporiap
Also I don't get the point about why operating on reality representations somehow makes data-processing unable to be itself conscious. The kind of data-processing going on in the brain is identical to the consciousness in my account. It's either that or the thing doing the data processing [i.e. the brain] which is [has the property of] consciousness by virtue of the data processing. — aporiap
Take an algorithm which plays movies for instance. Any one iteration of the loop outputs one frame of the movie... The movie, here, is made by viewing the frames in a sequential order. — aporiap
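aporiap's movie-player point can be sketched as a minimal loop (the frames and "player" here are purely illustrative, not any real video API): each iteration of the loop outputs exactly one frame, and the movie exists only in the ordered sequence the loop produces.

```python
# Toy sketch of aporiap's example: one loop iteration -> one frame.
# The "movie" is nothing over and above the ordered sequence of frames.
def play_movie(frames):
    shown = []
    for frame in frames:     # each iteration outputs a single frame
        shown.append(frame)  # stands in for rendering the frame on screen
    return shown             # the movie: the frames in sequential order

movie = play_movie(["frame0", "frame1", "frame2"])
```

If the loop is broken, no sequence is produced, which is the sense in which the program "ceases to be" a movie player.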
But, if it can't be physical, and it's not data processing, what is the supposed cause?
I don't think the multiple realization argument holds here; it could just be something like a case of convergent evolution, where you have different configurations independently giving rise to the same phenomenon - in this case consciousness. E.g. a cathode ray tube TV, a digital TV, or some other TV operate under different mechanisms and yet result in the same output phenomenon - an image on a screen. — aporiap
I am not in the field of computer science but from just this site I can see there are at least three different kinds of abstract computational models. Is it true that physical properties of the machine are necessary for all the other models described? — aporiap
Even if consciousness required certain physical features of hardware, why would that matter for the argument, since your ultimate goal is not to argue for the necessity of certain physical properties for consciousness but instead for (1) consciousness as being fundamentally intentional and (2) intentionality as being fundamentally distinct from [albeit co-present with] materiality. — aporiap
I actually think my personal thought is not that different to yours but I don't think of intentionality as so distinct as to not be realized by [or, a fundamental property of] the activity of the physical substrate. My view is essentially that of Searle but I don't think consciousness is only limited to biological systems. — aporiap
I don't understand why a neuron not being conscious but a collection of neurons being conscious automatically leads to the hard problem. — aporiap
Searle provides a clear intuitive solution here in which it's an emergent property of a physical system in the same way viscosity or surface tension are emergent from lower-level interactions - it's the interactions [electrostatic attraction/repulsion] which summatively result in an emergent phenomenon [surface tension]. — aporiap
Well the retinal state is encoded by a different set of cells than the intentional state of 'seeing the cat' - the latter would be encoded by neurons within a higher-level layer of cells [i.e. cells which receive iteratively processed input from lower-level cells] whereas the raw visual information is encoded in the retinal cells and immediate downstream area of early visual cortex. You could have two different 'intentional states' encoded by different layers of the brain or different sets of interacting cells. The brain processes in parallel and sequentially — aporiap
Okay but you seem to imply in some statements that the intentional is not determined by or realized by activity of the brain. — aporiap
I would say intentional state can be understood as some phenomenon that is caused by / emerges from a certain kind of activity pattern of the brain. — aporiap
Of course the measurables are real and so are their relations- which are characterized in equations; but the actual entities may just be theoretical. — aporiap
I was trying to say that introspection is not the only way to get knowledge of conscious experience. I'm saying it will be possible [one day] to scan someone's brain, decode some of their mental contents and figure out what they are feeling or thinking. — aporiap
The more accurate thing to say is that there are neurons in higher-level brain regions which fire selectively to seemingly abstract stimuli. — aporiap
That seems to account for the intentional component no? — aporiap
We learn by abstraction from experience. — Dfpolis
Hmmmm, yes. I see. I see you’re talking about learning, I’m talking about understanding. — Mww
how do you suppose culturally differentiated systems find a commonality in their respective analysis? What is the same for a child here and now arriving at “5”, and a medieval Roman child arriving at “V”? — Mww
One reason to believe would be, the world of experience satisfies some prerogatives that belong to a priori truths, re: one doesn’t need the experience of a severe car crash to know a severe car crash can kill him. — Mww
But general a priori truths have nothing whatsoever to do with experience (hence the standing definition), but are sustained by the principles of universality and necessity, for which experience can never suffice, re: two parallel lines can never enclose a space. I think it’s more significant, not that we do know some truths a priori, but that we can. — Mww
Which is developmentally conditioned and bounded. Within these limits, agreed. At and beyond, another topic. I mean this as a parenthetic comment. The coming to be of understanding is learning. — Dfpolis
In opposition to Sartre's "existence precedes essence"? But I'm not asking for argument here, either. — tim wood
and that requires something already operative in the intentional/logical order.
Thus, intentional being is ontologically prior to material being. — Dfpolis
Since, once we have such transcendental principles we know they apply to all reality, they may be thought of as a priori, but as they are grounded in our experiential understanding of being, they are, in the first instance, and ultimately, a posteriori. — Dfpolis
and that requires something already operative in the intentional/logical order.
Thus, intentional being is ontologically prior to material being. — Dfpolis
In opposition to Sartre's "existence precedes essence"? But I'm not asking for argument here, either. — tim wood
It is "ontologically" I request you briefly define. In particular and more simply, that ontological priority is not to be confused with temporal priority, yes? — tim wood
The challenge is to recapitulate for the rest of us in perhaps five sentences or less, the main point(s) of this thread. (Or to treat this challenge contemptuously, which in fact it may deserve!) — tim wood
"Thus everything that is discovered is first and finally, empirical, i.e., revealed." That is how I read it. — tim wood
For your argument to stand, you have to define empiricism idiosyncratically and in a way that itself "proves" your case, in short, begs the question. And at the same time destroys its common meaning. — tim wood
I read it - you - as having the a priori being just a case of the a posteriori, a subset, a species. I argue that they're different animals. It is as if you wished to characterize people as apes. In evolutionary terms, yes, but not now. Not without violence to all the terms in use. — tim wood
The cultural invariant is the concept <five>, not what is counted — Dfpolis
If we had no experience of cars, it would be difficult to understand the concept of a car crash. — Dfpolis
once we have such transcendental principles we know they apply to all reality, they may be thought of as a priori — Dfpolis
The cultural invariant is the concept <five>, not what is counted — Dfpolis
Agreed. Which merely begs the question: from where did such a cultural invariant arise? It must be a condition of all similarly constituted rationalities, n'est-ce pas? All that is counted, and the labels assigned to each unit of substance in the series of counting, are immediately dismissed. What is left, both necessarily and sufficiently enabling a thoroughly mental exercise? It is nothing but the pure, a priori concepts, thought by the understanding alone, rising from the constitution of the mind**, the categories of quantity (plurality), quality (reality), relation (causality) and modality (existence). — Mww
the concept of car alone is insufficient to justify the truth of the consequent (a guy will die). The synthetic requirement for an outstanding force is also necessary. — Mww
That’s what I’m talking about!!!!!! Odd though, you acknowledge that which we know applies to all reality, yet balk at the realization they are the ground of all empirical exercise. Like counting. — Mww
There are two questions here. — Dfpolis
Abstraction from experience is adequate for a priori knowledge, but doesn’t address whether any other methodology is possible — Mww
Whether that matters or not depends on what we intend to do about how far astray we find ourselves in thinking about the world of things. — Mww
The problem is that consciousness is not at all emergent in the sense in which viscosity and surface tension are. — Dfpolis
.......so-called "a priori" truths. — Dfpolis
I think that there is a great deal more information packed into our experience of being than you seem to. — Dfpolis
The problem is that consciousness is not at all emergent in the sense in which viscosity and surface tension are. — Dfpolis
No, but if viscosity and surface tension prove emergence itself is possible, and with the admitted lack of complete understanding of neurophysiology, neuroplasticity, must the possibility of consciousness emerging from mere neural complexity, in principle, be granted? — Mww
Interesting. Why would you qualify some truths as so-called "a priori"? Are you thinking the term is mis-used? Its value mis-applied? The whole schema doubtful? — Mww
What do you mean by transcendental principle, and what is an example of one? — Mww
What is meant by “our experience of being”, and what additional/supplemental information could be packed into my own personal experience of being, that isn’t already there? — Mww
For your argument to stand, you have to define empiricism idiosyncratically and in a way that itself "proves" your case, in short, begs the question. And at the same time destroys its common meaning. — tim wood
I was not trying to define "empiricism" at all. I am happy to admit it has many flavors. I was talking about "experience" -- about the world as it interacts with us and so reveals itself to us. So, I am unsure where you see question begging. Could you please explain? — Dfpolis
Since, once we have such transcendental principles we know they apply to all reality, they may be thought of as a priori, but as they are grounded in our experiential understanding of being, they are, in the first instance, and ultimately, a posteriori. — Dfpolis
I read it - you - as having the a priori being just a case of the a posteriori, a subset, a species. I argue that they're different animals. It is as if you wished to characterize people as apes. In evolutionary terms, yes, but not now. Not without violence to all the terms in use. — tim wood
I really do not follow this. Could you expand?
Let me say what I mean. Whenever we experience anything, we experience being -- something that can act to effect the experience we are having. We usually don't strip out all of the specifics to arrive at existence as the unspecified power to act; nonetheless, it is there, at the corner of awareness, ready to be examined and reflected upon if we choose to do so. So, there is a concept of <being> hovering in the background, and when we reflect on principles such as Identity or Excluded Middle, it is there to help us judge them.
This in effect re-grounds the knowledge of the thing into the process of the discovery of the thing, while demolishing the status of the knowledge as knowledge. — tim wood
Once done, and the thing known/defined/named, never again need it be discovered: we know it. — tim wood
Does that imply that all knowledge is a priori? I answer yes, with respect to the criteria that establish the knowledge as knowledge. — tim wood
So far your argument is a claim. But I do not find that you have argued it in substantive terms. — tim wood
Is referencing <being> a flight to being, or an explication of experience/phenomena? — tim wood
You have <being> as "something that can act." (I note too you have <being> that we experience, and "a concept of <being>... there to help us.") How does it act? Would it both simplify and demystify to rebrand this <being> as just a capacity of the human mind? — tim wood
So I'm expecting that you are going to show how the Hard Problem goes away. I'll read on.
It is quite common to believe that intentional realities, as found in conscious thought, are fundamentally material -- able to be explained in terms of neurophysiological data processing. This belief has presented metaphysical naturalists with what David Chalmers has called "the Hard Problem." It seems to me that the Hard Problem is a chimera induced by a provably irrational belief. — Dfpolis
I think you are missing an important aspect of Consciousness by dismissing the experience of Qualia as you do. What is that Redness that you experience when you look at a Red object or when you Dream about a Red Object?
By way of background, I take consciousness to be awareness of present, typically neurophysiologically encoded, intelligibility. I see qualia as of minor interest, being merely the contingent forms of awareness. — Dfpolis
Sounds like you are saying that there are two separate subsystems of the Material Mind (the Neurons). One is the Computational Machine sub system that is not Conscious and the other sub system is the Conscious aspect where Intentional Reality exists. Another way of saying this is that it is all in the Neurons. But this is still perpetuating the Belief that you criticized above. But then you say:
I am not a dualist. I hold that human beings are fully natural unities, but that we can, via abstraction, separate various notes of intelligibility found in unified substances. Such separation is mental, not based on ontological separation. As a result, we can maintain a two-subsystem theory of mind without resort to ontological dualism. — Dfpolis
This sounds like you are saying that you are going to show that Intentional Reality cannot be found in the Neurons. So then where is it? What is it? Sound like Ontological Dualism to me.
Here are the reasons I see intentional reality as irreducible to material reality. — Dfpolis
You are just assuming that Neural Activity must imply Conscious Activity in all cases. This does not have to be true even if the Conscious Activity really is all in the Neurons. We don't know enough about Conscious Activity to make sweeping conclusions like this about anything.
1. Neurophysiological data processing cannot be the explanatory invariant of our awareness of contents. If A => B, then every case of A entails a case of B. So, if there is any case of neurophysiological data processing which does not result in awareness of the processed data (consciousness) then neurophysiological data processing alone cannot explain awareness. Clearly, we are not aware of all the data we process. — Dfpolis
Yes but this seems to imply that the Conscious Activity of Intention can not be found in the Neurons by Science yet. This implies that Conscious Activity must be some other kind of thing that is not in the Neurons. Sounds like Ontological Dualism to me.
2. All knowledge is a subject-object relation. There is always a knowing subject and a known object. At the beginning of natural science, we abstract the object from the subject -- we choose to attend to physical objects to the exclusion of the mental acts by which the subject knows those objects. In natural science we care what Ptolemy, Brahe, Galileo, and Hubble saw, not the act by which the intelligibility of what they saw became actually known. Thus, natural science is, by design, bereft of data and concepts relating to the knowing subject and her acts of awareness. Lacking these data and concepts, it has no way of connecting what it does know of the physical world, including neurophysiology, to the act of awareness. Thus it is logically impossible for natural science, as limited by its Fundamental Abstraction, to explain the act of awareness. Forgetting this is a prime example of Whitehead's Fallacy of Misplaced Concreteness (thinking what exists only in abstraction is the concrete reality in its fullness). — Dfpolis
I like the Orthogonal Mathematics metaphor. In mathematics when Vectors are Orthogonal you cannot project one onto the other. You cannot project the Intentional Vector onto the Material Vector.
3. The material and intentional aspects of reality are logically orthogonal. That is to say, that, though they co-occur and interact, they do not share essential, defining notes. Matter is essentially extended and changeable. It is what it is because of intrinsic characteristics. As extended, matter has parts outside of parts, and so is measurable. As changeable, the same matter can take on different forms. As defined by intrinsic characteristics, we need not look beyond a sample to understand its nature. — Dfpolis
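The orthogonality metaphor can be checked with a small numeric example (plain Python, no particular library assumed, and the "material"/"intentional" labels are of course only illustrative): the projection of one vector onto an orthogonal vector is the zero vector, because the projection is scaled by their dot product.

```python
def dot(u, v):
    # Standard dot product of two same-length vectors
    return sum(a * b for a, b in zip(u, v))

def project(u, onto):
    # Projection of u onto `onto`: (u . onto / onto . onto) * onto
    scale = dot(u, onto) / dot(onto, onto)
    return [scale * b for b in onto]

u = [3, 0]            # stands in for the "material" direction
v = [0, 2]            # an orthogonal "intentional" direction
proj = project(u, v)  # dot(u, v) == 0, so the projection is the zero vector
```

This is the precise mathematical sense in which "you cannot project one onto the other": the projection exists, but it is identically zero.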
I'll continue to think about this one.
Intentions do not have these characteristics. They are unextended, having no parts outside of parts. Instead they are indivisible unities. Further, there is no objective means of measuring them. They are not changeable. If you change your intent, you no longer have the same intention, but a different intention. As Franz Brentano noted, an essential characteristic of intentionality is its aboutness, which is to say that they involve some target that they are about. We do not just know, will or hope, we know, will and hope something. Thus, to fully understand/specify an intention we have to go beyond its intrinsic nature, and say what it is about. (To specify a desire, we have to say what is desired.) This is clearly different from what is needed to specify a sample of matter. — Dfpolis
This is all well and good if Intention actually is Information. Maybe. I'll continue to think about this too.
4. Intentional realities are information based. What we know, will, desire, etc. is specified by actual, not potential, information. By definition, information is the reduction of (logical) possibility. If a message is transmitted, but not yet fully received, then it is not physical possibility that is reduced in the course of its reception, but logical possibility. As each bit is received, the logical possibility that it could be other than it is, is reduced.
The explanatory invariant of information is not physical. The same information can be encoded in a panoply of physical forms that have only increased in number with the advance of technology. Thus, information is not physically invariant. So, we have to look beyond physicality to understand information, and so the intentional realities that are essentially dependent on information — Dfpolis
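Dfpolis's claim that each received bit reduces logical possibility can be illustrated numerically (a toy sketch, not his formalism): for an n-bit message, 2**n messages are logically possible before reception, and each bit received halves the set of messages the transmission could still turn out to be.

```python
from itertools import product

# Toy illustration: receiving a 3-bit message bit by bit.
# Before any bit arrives, 2**3 = 8 messages are logically possible;
# each received bit halves the remaining possibilities.
message = (1, 0, 1)                       # the message actually sent
candidates = list(product([0, 1], repeat=len(message)))

remaining = [len(candidates)]             # possibilities left after each bit
for i, bit in enumerate(message):
    candidates = [c for c in candidates if c[i] == bit]
    remaining.append(len(candidates))

# remaining is [8, 4, 2, 1]: logical possibility shrinks with each bit,
# until only the message actually sent is left.
```

Note that nothing physical is eliminated here; only which message the signal *could logically be* is narrowed, which is the distinction the quoted passage draws.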
So I'm expecting that you are going to show how the Hard Problem goes away. — SteveKlinko
I think you are missing an important aspect of Consciousness by dismissing the experience of Qualia as you do. What is that Redness that you experience when you look at a Red object or when you Dream about a Red Object? — SteveKlinko
Sounds like you are saying that there are two separate subsystems of the Material Mind (the Neurons). — SteveKlinko
This sounds like you are saying that you are going to show that Intentional Reality cannot be found in the Neurons. — SteveKlinko
Sound like Ontological Dualism to me. — SteveKlinko
You are just assuming that Neural Activity must imply Conscious Activity in all cases. — SteveKlinko
Yes but this seems to imply that the Conscious Activity of Intention can not be found in the Neurons by Science yet. — SteveKlinko
I did not see any solution to the Hard Problem in all this. If Intentional Realities are not reducible to the Material Neurons then what are Intentional Realities? Where are Intentional Realities? How can this be Explained? There is a big Explanatory Gap here. This Explanatory Gap is the Chalmers Hard Problem. — SteveKlinko
I think you are saying that there is only an Explanatory Gap if the Intentional Reality is found to be in the Neurons — SteveKlinko
But if it is found to be in the Neurons then that means that Science has an Explanation for How and Why it is in the Neurons — SteveKlinko
If Intentional Reality is not found in the Neurons then there would exist a Huge Explanatory Gap as to what it could be. — SteveKlinko
How does this non-Material Intention ultimately interact with the Neurons, as it must, to produce Intentional or Volitional effects? — SteveKlinko
Am I correct in saying that Volition is the same as Intention in your analysis? — SteveKlinko
I don't think there is any experimental test for this.
I think you are saying that there is only an Explanatory Gap if the Intentional Reality is found to be in the Neurons — SteveKlinko
I am not sure what, operationally, it would mean to find intentional reality "in the neurons." If intentions are to be effective, if I am actually able to go to the store because I intend to go to the store, then clearly my intentions need to modify the behavior of neurons and are in them in the sense of being operative in them. Yet, for the hard problem to make sense requires more than this, for it assumes that the operation of our neurophysiology is the cause of intentionality. What kind of observation could possibly confirm this? — Dfpolis
I'm missing your point here because I said that Science will need to have the Explanation for the How and Why, and not merely the fact that it is.
But if it is found to be in the Neurons then that means that Science has an Explanation for How and Why it is in the Neurons — SteveKlinko
Knowing what is, is not the same as knowing how or why it is. We know that electrons have a charge of -1 in natural units. We have no idea of how and why this is so. — Dfpolis
I disagree that we know anything about what Intentionality is. We know we have it, but what really is it? This is similar to how we Experience the Redness of Red. We certainly know that we have the Experience but we have no idea what it is.
If Intentional Reality is not found in the Neurons then there would exist a Huge Explanatory Gap as to what it could be. — SteveKlinko
Not at all. We already know what intentionality is. We can define it, describe it, and give uncounted examples of it. What we do not know is what we cannot know, i.e. how something that cannot be its cause is its cause. That is no more a gap than not knowing how to trisect an arbitrary angle with a compass and straightedge is a gap in our knowledge of Euclidean geometry. There is no gap if there is no reality to understand. — Dfpolis
If you have an intention to do something then that intention must ultimately be turned into a Volitional command to the Brain that will lead to the firing of Neurons that will activate the muscles of the Physical Body to do something. I believe you called that a Committed Intention.
How does this non-Material Intention ultimately interact with the Neurons, as it must, to produce Intentional or Volitional effects? — SteveKlinko
I think the problem here is how you are conceiving the issue. You seem to be thinking of intentional reality as a quasi-material reality that "interacts" with material reality. It is not a different thing, it is a way of thinking about one thing -- about humans and how humans act. It makes no sense to ask how one kind of human activity "interacts" with being human, for it is simply part of being human. — Dfpolis
When you say the Laws of Nature are Intentional, it sounds like you are talking about some kind of Intelligent Design. I'm not sure how this is even relevant to the discussion.
I have argued elsewhere on this forum and in my paper (https://www.academia.edu/27797943/Mind_or_Randomness_in_Evolution), that the laws of nature are intentional. The laws of nature are not a thing separate from the material states they act to transform. Rather, both are aspects of nature that we are able to distinguish mentally and so discuss in abstraction from each other. That we discuss them independently does not mean that they exist, or can exist, separately.
Would it make any sense to ask how the laws of nature (which are intentional) "interact" with material states? No, that would be a category error, for the laws of nature are simply how material states act, and it makes no sense to ask how the way a state acts "interacts" with the state acting. In the same way it makes no sense to ask how an effective intention, how my commitment to go to the store, interacts with my going to the store -- it is simply a mentally distinguishable aspect of my going to the store. — Dfpolis
I had been thinking that you actually were using the word Intentions to mean Committed Intentions or in my way of thinking: Volition. I'm not sure what to do with an abstract concept like Intentions. When you hang your argument for eliminating the Hard Problem on an abstract Intentions concept being Material you are setting up a straw man.
I know this does not sound very satisfactory. So, think of it this way. If I have not decided to go to the store, my neurophysiology obeys certain laws of nature. Once I commit to going, it can no longer be obeying laws that will not get me to the store, so it must be obeying slightly different laws -- laws that are modified by my intentions. So, my committed intentions must modify the laws controlling my neurophysiology. That is how they act to get me to the store.
Am I correct in saying that Volition is the same as Intention in your analysis? — SteveKlinko
Volition produces what I am calling "committed intentions." There are many other kinds of intentions like knowing, hoping, believing, etc. — Dfpolis
I'm missing your point here because I said that Science will need to have the Explanation for the How and Why, and not merely the fact that it is. — SteveKlinko
I disagree that we know anything about what Intentionality is. We know we have it, but what really is it? This is similar to how we Experience the Redness of Red. We certainly know that we have the Experience but we have no idea what it is. — SteveKlinko
If you have an intention to do something then that intention must ultimately be turned into a Volitional command to the Brain that will lead to the firing of Neurons that will activate the muscles of the Physical Body to do something. I believe you called that a Committed Intention. — SteveKlinko
When you say the Laws of Nature are Intentional, it sounds like you are talking about some kind of Intelligent Design. I'm not sure how this is even relevant to the discussion. — SteveKlinko
When you hang your argument for eliminating the Hard Problem on an abstract Intentions concept being Material you are setting up a straw man. — SteveKlinko
Even if your Intention argument is true, this Redness Experience Explanatory Gap must be solved. This is what the Hard Problem is really all about. — SteveKlinko
I think you are saying that there is only an Explanatory Gap if the Intentional Reality is found to be in the Neurons. — SteveKlinko
That's how bad our understanding of Consciousness is. We can't even conceive that there could be a Scientific explanation for it. But I think there probably is a Scientific explanation. We just need some smart Mind to figure it out someday in the future.
I'm missing your point here because I said that Science will need to have the Explanation for the How and Why, and not merely the fact that it is. — SteveKlinko
OK. I misunderstood what you were saying. To me there is data, and the data might show that there is intentionality in the neurons, and there is theory, which would explain the data in terms of how and why. But, you agree that there is no experimental test for finding intentionality in neurons, so, there can be no data to explain. That leaves us with the question: What kind of evidentiary support can there be for a theory that supposedly explains something that cannot be observed? If this theory predicts that some set of physical circumstances will produce intentionality in neurons, and we cannot observe intentionality in neurons, doesn't that make the theory unfalsifiable, and so unscientific? In short, I have difficulty in seeing how such a theory can be part of science. — Dfpolis
We know what they are from our subjective Conscious experience of them. But since we don't know what Consciousness is, in the first place, being Conscious of them is not an explanation.
I disagree that we know anything about what Intentionality is. We know we have it, but what really is it? This is similar to how we Experience the Redness of Red. We certainly know that we have the Experience but we have no idea what it is. — SteveKlinko
If you mean that we cannot reduce these things to a physical basis, that is the very point I am making. But that is not the same as not knowing what a thing is. If we can define intentionality well enough for other people to recognize it when they encounter it, we know what it is.
I think you need to ask yourself what you mean by knowing "what a thing is?" What things are is fully defined by what they can do. If we know what things can do -- how they scatter light, interact with other objects, and so on -- we know all there is to know about what they are. We pretty much know what various kinds of intentions do. So, in what way do we not know what they are? — Dfpolis
If you have an intention to do something then that intention must ultimately be turned into a Volitional command to the Brain that will lead to the firing of Neurons that will activate the muscles of the Physical Body to do something. I believe you called that a Committed Intention. — SteveKlinko
Agreed. And that means that committed intentions must modify the laws that control how our neurophysiology works. How else could they do what they do? — Dfpolis
I guess you are making a distinction now between Laws of Nature that apply to Intentional Phenomena and Laws of Nature that apply to Material Phenomena. So you should not say the Laws of Nature are Intentional, but only that the subset of the Laws of Nature that apply to Intentionality are Intentional.
When you say the Laws of Nature are Intentional, it sounds like you are talking about some kind of Intelligent Design. I'm not sure how this is even relevant to the discussion. — SteveKlinko
I am not an advocate of Intelligent Design. I think it gravely misunderstands the laws of nature. ID assumes that God is not intelligent enough to create a cosmos that effects His ends without recurrent diddling. That is insulting to God.
The arguments I give in my paper for the laws of nature being intentional are based solely on our empirical knowledge, and do not assume the existence of an intending God. The relevance here of the laws being intentional is that they are in the same theater of operations as human commitments. Since they are in the same theater of operation, our commitments can affect the general laws, perturbing them to effect our ends. Material operations, on the other hand, are not in the same theater of operation and so cannot affect the laws of nature. — Dfpolis
I don't think the Brain is the Consciousness aspect. Rather, I think the Brain connects to a Consciousness aspect. When you hang your argument for eliminating the Hard Problem on an abstract Intentions concept being Material, you are setting up a straw man. — SteveKlinko
This seems confused. First, I am not saying intentions are material. Second, the Hard Problem is about the production of consciousness (of intellect) and not, in the first instance, about volition (will).
We have no intentions without consciousness, which is awareness of present intelligibility. It makes what was merely intelligible actually known. The brain can process data in amazing ways, but processing data does not raise data from being merely intelligible to being actually known. To make what is intelligible actually known requires a power that is not merely potential, but operational. So, nothing that is merely intelligible, that is only potentially an intention, can produce an intention. Thus, data encoded in the brain cannot make itself actually known -- it cannot produce consciousness.
What is already operational in the intentional theater is awareness -- what Aristotle called the agent intellect. It is when we turn our awareness to present intelligibility that the neurally encoded contents become known. So, while the brain can produce the contents of awareness, it cannot produce awareness of those contents. — Dfpolis
I think every instance of Consciousness actually does involve some sort of Quale. Things that are sub-Conscious of course do not involve any Qualia. Even the sense of Awareness itself has a certain feel to it. The experience of Understanding itself has a feel to it. There are all kinds of Qualia besides sensory Qualia. Even if your Intention argument is true, this Redness Experience Explanatory Gap must be solved. This is what the Hard Problem is really all about. — SteveKlinko
If that were so, then every instance of consciousness, even the most abstract, would involve some quale. It does not. So, qualia are not an essential aspect of consciousness. On the other hand, there is no instance of consciousness without awareness and some intelligible object. So, the essential features of consciousness are awareness/subjectivity and the contents of awareness/objectivity.
Of course there are qualia, but we do know what they are. All qualia are the contingent forms of sensory awareness. We know, for example, that redness is the form of our awareness of certain spectral distributions of light. There is nothing else to know about redness. If you think there is, what would it be? — Dfpolis
Since we cannot even begin to understand how to approach the study of Consciousness, there is no way we can make a list of all the possible Explanations. There is no clear set of demarcation criteria for Explanations of Consciousness. Everything and anything is possible at this point. In fact, it is a Red Herring to demand such a list of possible Explanations. A First Clue is what we need at this point. I think you are saying that there is only an Explanatory Gap if the Intentional Reality is found to be in the Neurons. — SteveKlinko
"Explanatory gap" talk is a red herring as long as we continue to not analyze just what is to count as an explanation and why, with a clear set of demarcation criteria for explanations, and where we make sure that we pay attention to the qualitative differences--in general, for all explanations--between what explanations are and the phenomena that they're explaining — Terrapin Station
If this theory predicts that some set of physical circumstances will produce intentionality in neurons, and we cannot observe intentionality in neurons, doesn't that make the theory unfalsifiable, and so unscientific? In short, I have difficulty in seeing how such a theory can be part of science. — Dfpolis
That's how bad our understanding of Consciousness is. We can't even conceive that there could be a Scientific explanation for it. But I think there probably is a Scientific explanation. We just need some smart Mind to figure it out someday in the future. — SteveKlinko
We pretty much know what various kinds of intentions do. So, in what way do we not know what they are? — Dfpolis
We know what they are from our subjective Conscious experience of them. But since we don't know what Consciousness is, in the first place, being Conscious of them is not an explanation. — SteveKlinko