• Michael
    15.7k
    Wouldn't it be preferable to say intentional attitude? That's the usual term used by philosophers, with a quite substantial backing in the literature. It avoids the problematic notion of the subjective.Banno

    Sure, I just grabbed that definition from Wikipedia.
  • Michael
    15.7k
    But wouldn't "belief", for a p-zombie, be precisely this "belief-analog"?hypericin

    "Belief" is a word in the English language that has a well-established meaning. If p-zombies are speaking English then the word "belief" means what it means in English.
  • Banno
    25.2k
    Fine. You know me, "subjective" is a trigger. :wink:
  • AmadeusD
    2.6k
    There is someone who made a thread yesterday or the day before explaining how he has no inner monologue and also cannot form images mentally.Lionino

    This is true for the majority of people, it seems. https://www.iflscience.com/people-with-no-internal-monologue-explain-what-its-like-in-their-head-57739

    https://irisreading.com/is-it-normal-to-not-have-an-internal-monologue/

    I find it fascinating - and fascinating that it took until 2022 for a real grappling to occur.
  • hypericin
    1.6k
    There is someone who made a thread yesterday or the day before explaining how he has no inner monologue and also cannot form images mentally.Lionino

    Which? I've heard of this before. It's super interesting that we have so much variation in our inner lives, yet we almost never talk about it.
  • hypericin
    1.6k
    "Belief" is a word in the English language that has a well-established meaning. If p-zombies are speaking English then the word "belief" means what it means in English.Michael

    The relevant definition in Webster's is "something that is accepted, considered to be true, or held as an opinion". This to me doesn't entail subjective state.
  • Lionino
    2.7k
    What is crazier about this is that, even for the people "graced" with having an inner monologue, I cannot even see how it aids them in thinking besides when they need to prepare a speech (as something that will be spoken verbally, be it to a friend or to a room of executives). Words popping up in my mind do not help me connect the dots of different ideas; the glyph and its content are different things. For me, it is only when I summon the content (image) in my mind that I can finally think.

    Thinking words for me is like reading a book but not imagining what is happening; you just repeat the words you read. It is only when I focus on what I am reading and follow along by imagining it that I can remember anything of what I read. But reading more than 30 pages in one sitting is quite exhausting for me when it comes to complex books like the Iliad or La Commedia. Maybe that explains how some people are able to read 200+ pages of classics in a day: they are not quite reading them.



    This guy right here https://thephilosophyforum.com/discussion/14847/are-words-more-than-their-symbols/

    The relevant definition in Webster's is "something that is accepted, considered to be true, or held as an opinion". This to me doesn't entail subjective state.hypericin

    You sure?
  • AmadeusD
    2.6k
    Absolutely; bizarre, isn't it?

    Conversely, as noted in that thread, I have an extremely active internal dialogue to, at times, a debilitating degree. I cannot understand how you could possibly deal with systematic knowledge, or logically work through propositions, without definitive reference to prior thought, which occurs to my mind in sentences/phrases. I can't 'image' an emotional concept, for instance, but I can put it into words and hold that while I tinker with the next element of the larger thought process. If I attempt to think in concepts and images only, not only is it utterly, dismally, emotionally triggeringly boring, I can't make heads or tails of fucking anything. I can only make sense of images in reference to the language in which I first understood the image (perhaps this was a process of acquiring 'concepts' when I was an infant), or subsequently reappraised it under.
  • hypericin
    1.6k
    Words popping up in my mind do not help me connect the dots of different ideas; the glyph and its content are different things. For me, it is only when I summon the content (image) in my mind that I can finally think.Lionino

    This parallels what I experience. I think you might underestimate the inner monologue. After all, I am guessing that animals can think visually as well. Our ability to manipulate glyphs which represent arbitrary concepts, both aloud and internally, is part of what sets our cognitive ability apart.

    But then again, there are these guys who just don't have it! Crazy...
  • Lionino
    2.7k
    This parallels what I experience. I think you might underestimate the inner monologue.hypericin

    I do use internal monologue quite a bit. When I am having my "ADHD moments" (I am not diagnosed, but everybody has those every now and then), the voice keeps going, and occasionally even forces my mouth to speak the words. But as I hinted, I only use internal monologue intentionally when there is a need or a desire to turn thoughts into words, which might help with clarity and memory, especially when I move from one thought to another while needing to remember the previous one. But when I just "need to think", I don't use words.
  • RogueAI
    2.9k
    The relevant definition in Webster's is "something that is accepted, considered to be true, or held as an opinion". This to me doesn't entail subjective state.hypericin

    You have a point. Let's stipulate that p-zombies have beliefs. Would a p-zombie believe it's in pain (the way a human understands pain, which is accompanied by a feeling) if it's not actually in pain? Suppose a p-zombie burns its finger and has damaged itself, but there is no feeling of pain. Would it believe it's in pain? In other words, would it believe something it knows to be false?
  • hypericin
    1.6k
    Would it believe it's in pain? In other words, would it believe something it knows to be false?RogueAI

    I think it would believe that "pain" denotes the cluster of behaviors that it and we engage in when injured. So long as it is engaging in these behaviors, it is "in pain".
  • javra
    2.6k
    I think you might underestimate the inner monologue. After all, I am guessing that animals can think visually as well. Our ability to manipulate glyphs which represent arbitrary concepts, both aloud and internally, is part of what sets our cognitive ability apart.hypericin

    I mentioned something about how animals think without words in that thread. So I felt like commenting on this aspect here.

    Although animals will have wordless thoughts in one way or another, the argument can well be made that thinking via words sets limitations on what humans can think, in ways that wordless thinking does not. This limiting of thought via words, which gives thought a relatively stringent structure, would then be a hindrance in activities ranging from novel artistic expressions to novel ideas in both the sciences and in philosophies.

    For humans, wordless thought can think outside of the box in which word-driven thought resides, so to speak. Sometimes in very abstract (and logically consistent) manners.
  • Michael
    15.7k
    The relevant definition in Webster's is "something that is accepted, considered to be true, or held as an opinion". This to me doesn't entail subjective state.hypericin

    What does it mean to "accept", "consider", or "hold as an opinion"? Again, these aren't terms that it makes sense to attribute to a p-zombie. A p-zombie is just a machine that responds to stimulation. It's an organic clockwork-like body that moves and makes sound.

    It's quite ironic that you're anthropomorphising p-zombies.
  • hypericin
    1.6k
    What does it mean to "accept", "consider", or "hold as an opinion"? Again, these aren't terms that it makes sense to attribute to a p-zombie. A p-zombie is just a machine that responds to stimulation. It's a complicated clockwork body. They're just objects that move and make sound. That's it.Michael

    Until recently, only humans could "accept", "consider", or "hold as an opinion", and it has been generally presumed that these are always accompanied by subjective states. This is a fact, but it does not imply that these notions are incoherent without subjective states.

    But now, it is not true anymore. Take ChatGPT. You can ask it to "consider X", that is, to examine a proposition without being committed to believing it, and it obliges very well, without a trace of subjective state.

    It's quite ironic that you're anthropomorphising p-zombies.Michael
    Why ironic? They are already maximally anthropomorphized.
  • wonderer1
    2.2k
    I’m not sure that counts as belief. Belief seems to me to be a conscious activity. Machines can record and analyze information but they don’t believe anything.Michael

    This gets me thinking about 'muscle memory' events, like knocking a cup off a counter and reflexively catching it without time to think. We must have subconscious expectations which direct our hand to where it needs to be to catch the cup. Are these subconscious expectations not in some sense beliefs? And if we posit conscious beliefs and subconscious beliefs, is there a clear dividing line?
  • Apustimelogist
    603


    Well given that a p-zombie:

    1) Behaves identically to a regular person;

    2) It behaves as it does for the same reasons as a regular person (i.e. it has a human brain that performs functions just like ours); and

    3) The circumstances in the world under which I myself am "accepting" or "considering" things are the same as those which apply to the p-zombie (in regard to the p-zombie performing acts of p-"considering" and p-"accepting", etc.);

    Then maybe it makes sense to attribute to it these things you mentioned after all. For all intents and purposes they cannot be distinguished in either case.

    Edit: attempted to clarify the same points, partly by formatting.
  • noAxioms
    1.5k
    You could just say “I am a p-zombie”.Michael
    Most of us don't know it. It isn't easy to tell until you get a glimpse of the bit being missed, sort of like having your vision restored after cataracts have reduced you to near grayscale levels.

    You’ll need to explain it in these terms.Michael
    Hence my attempt with the car, which very much is aware of its surroundings, but 'aware' is perhaps one of those forbidden words. It all smacks of racism. They basically degraded black slaves by refusing to use human terms for anything related to them, using cattle terms instead. It made it easy to justify how they were treated. "Cows don't feel pain. Neither do p-zombies. It's not immoral to set 'em on fire."
    I'm sorry, just because we're not conscious in the Chalmers way doesn't mean we don't hold beliefs. I refuse to withhold perfectly understandable terms when no alternatives are offered.

    I don't agree with this. "so he says he feels pain, not knowing that it isn't real pain". That's an epistemic issue, not a truth issue. For any x, if x is not feeling pain/hurting, and x says it is feeling pain/hurting, x is wrong. X is saying something false.RogueAI
    I didn't say the statement "so he says he feels pain, not knowing that it isn't real pain" was true.
    I said the statement, "pain hurts" is true regardless of what utters it.

    But this isn't true for you.RogueAI
    That's just using a language bias to attempt a demonstration of a difference where there isn't one. A Roomba cannot be conscious because you define 'conscious' to only apply to humans. That doesn't demonstrate that a pimped-out Roomba isn't doing the exact same thing, it only means that the Roomba needs to pick a different word for the exact same thing, and then tell the human that he isn't that sort of conscious because he's an inferior human.

    That's the sort of argument I see coming from everybody. The word is not legally applied to you, therefore you're not doing what the chosen race is doing.

    But belief is a conscious mental activity. P-belief/p-consider is incoherent. It's missing a necessary condition for anything that remotely resembles believing and considering.RogueAI
    What is the computer doing, then, when it processes data from a camera pointed at a table? The computer 'concludes' (probably a forbidden word) that there is a table in front of the camera in question, and outputs the statement "there seems to be a table in front of the camera". You say it's not a mental activity. I agree with that. That usage of "mental activity" only applies to an immaterial mind such as Chalmers envisions. So OK, you can't express that the computer believes there's a table there, or that it concludes that. How do you phrase what the computer does when it does the exact same thing as the human, which is to deduce (presumably another forbidden word) the nature of the object in the field of view?
    If you can't provide acceptable alternative terms, then I'm sorry, the computer believes there's a table there. Deal with it.
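
    Here is a minimal sketch (entirely hypothetical, with made-up classifier scores) of the sort of process I mean: the computer turns the numbers it gets from the camera pipeline into a statement about the table, and nowhere in it is there anything you'd be allowed to call a mental state.

    <?php

    // Hypothetical sketch: turn made-up classifier scores for a camera frame
    // into the statement the computer 'utters'.
    function describeScene(array $detections)
    {
      foreach ($detections as $label => $confidence) {
        if ($confidence > 0.9) {
          return "There seems to be a {$label} in front of the camera.";
        }
      }
      return "Nothing recognisable in front of the camera.";
    }

    // Pretend these scores came from an image classifier.
    echo describeScene(['table' => 0.97, 'chair' => 0.12]);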

    The answer is that it would behave as its physical circumstances dictate.Banno
    That's what I claim I'm doing. But how would you express what the p-zombie does when it correctly identifies the table in front of it that it cannot 'see'?

    We don't behave this wayRogueAI
    I do, and you don't. So why are we indistinguishable (except for me deciding to stop imitating the language you use for that which I cannot ever know)?

    I don't have a computer algorithm in my head. My head does not implement a von Neumann architecture, even though I'm capable of simulating one, and a von Neumann machine is capable of simulating me. None of that is true of you. The lights are out for me, but I don't know it since I've never experienced the light and had that 'ooooohh' moment.

    By definition they behave as we do. This includes belief.hypericin
    Well, per Michael, this includes false claims of belief. I'm doing something that I think is belief, but Michael says it's by definition false.
  • hypericin
    1.6k


    I would say, if your claim were true, it would be a revolutionary finding, and upend our notions about what it means to be human.

    Hence, forgive my skepticism.

    I still think the problem is conceptual. Frankly, it is hard to think about how this mistake might arise.

    Interesting stuff, regardless. New OP? "I am a p-zombie, prove me wrong"?

    What is the computer doing, then, when it processes data from a camera pointed at a table? The computer 'concludes' (probably a forbidden word) that there is a table in front of the camera in question, and outputs the statement "there seems to be a table in front of the camera". You say it's not a mental activity. I agree with that. That usage of "mental activity" only applies to an immaterial mind such as Chalmers envisions. So OK, you can't express that the computer believes there's a table there, or that it concludes that. How do you phrase what the computer does when it does the exact same thing as the human, which is to deduce (presumably another forbidden word) the nature of the object in the field of view?
    If you can't provide acceptable alternative terms, then I'm sorry, the computer believes there's a table there. Deal with it.
    noAxioms



    :up:
  • noAxioms
    1.5k
    I would say, if your claim were true, it would be a revolutionary findinghypericin
    It wouldn't be a finding at all. If it were true, nobody (not even I) would know for sure. Of course, I'm sure that 'knowing' things (and being 'sure') are all forbidden. But I do have whatever it takes to pass an interview for a technical job, even if it isn't knowledge. I have claims of it on my resume, all false, apparently.

    Apologies to RogueAI for somewhat hijacking this topic. It's about p-zombies still, but not much about having children. I have no way of knowing if my kids are conscious or not. Not sure if a new topic would cover any ground not already covered.
  • RogueAI
    2.9k
    Apologies to RogueAI for somewhat hijacking this topic. It's about p-zombies still, but not much about having children. I have no way of knowing if my kids are conscious or not. Not sure if a new topic would cover any ground not already covered.noAxioms

    I'm enjoying the thread.
  • Michael
    15.7k
    Here's a really rubbish AI:

    <?php
    
    // Ignores its input entirely and returns one of three canned answers at random.
    function responseTo($text)
    {
      return ['Yes', 'No', 'Maybe'][random_int(0, 2)];
    }
    
    echo responseTo('Consider p-zombies. Can they believe?');
    

    It doesn't seem at all appropriate to say that it believes or accepts or considers anything. That would be a very obvious misuse of language.

    ChatGPT and p-zombies are just very complicated versions of the above, with p-zombies having a meat suit.
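
    For illustration only, here is a slightly less rubbish (and equally hypothetical) version that picks a canned reply by keyword lookup instead of at random. Adding machinery of this kind changes the outputs, but at no point does anything resembling belief appear:

    <?php

    // Hypothetical variant: pick a canned reply by keyword, falling back to random.
    function responseTo($text)
    {
      $rules = [
        'p-zombie' => 'By definition they behave as we do.',
        'believe'  => 'That depends on what you mean by "believe".',
        'pain'     => 'Pain hurts, or so I am told.',
      ];

      foreach ($rules as $keyword => $reply) {
        if (stripos($text, $keyword) !== false) {
          return $reply;
        }
      }

      return ['Yes', 'No', 'Maybe'][random_int(0, 2)];
    }

    echo responseTo('Consider p-zombies. Can they believe?');

    Feed it the same prompt and it now "responds" on topic, but saying that it considers the question would be the same misuse of language.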
  • flannel jesus
    1.8k
    I don't think you've fully grasped what p zombies are
  • hypericin
    1.6k
    It doesn't seem at all appropriate to say that it believes or accepts or considers anything. That would be a very obvious misuse of language.Michael

    Because RubbishAI lacks every feature that would otherwise make this language appropriate.

    ChatGPT and p-zombies are just very complicated versions of the above, with p-zombies having a meat suit.Michael

    In most contexts we make extremely fine taxonomic distinctions between objects, even though they definitely have no subjectivity. But suddenly when subjectivity is involved, you want to flatten everything not subjective into one big pile. Even p-zombies, which are physically and behaviorally identical, only lacking a presumed quality you cannot see or verify. We might spend the rest of our life's allotment of time on this forum going back and forth with @noAxioms and still not definitively figure out whether he is a p-zombie or not.
  • Michael
    15.7k
    We might spend the rest of our life's allotment of time on this forum going back and forth with noAxioms and still not definitively figure out whether he is a p-zombie or not.hypericin

    I agree. If he were just to say “I am a p-zombie” then I would accept that it’s possibly true. I am simply explaining that “I believe that I am a p-zombie” is false if he is a p-zombie and irrational if he’s not.
  • Dawnstorm
    247
    P-zombies have no consciousness. They just have an outward appearance (including observable behaviour). You’ll need to explain it in these terms.

    (By outward appearances I don’t mean to exclude muscles and bones and internal organs)
    Michael

    Well, the point of the p-zombie thought experiment is to figure out what phenomenal experience does, if anything. If I understand epiphenomenalism right, that's the idea that phenomenal experience does absolutely nothing. Under an epiphenomenal view, a p-zombie should be able to believe things (as not being able to experience its own belief adds nothing of value to the concept of believing).

    Internal organs include the brain, right? So I have aphantasia. I look at things, and my visual cortex is active. I imagine things, and my visual cortex is barely active, if at all. The same would be true for my p-zombie twin. A p-zombie without aphantasia would have an active visual cortex when seeing things, and thus he wouldn't be lying when he said he sees things in his head.

    It's just that seeing things in your head isn't accompanied by any phenomenal experience; it's just the visual cortex (among other things) doing its thing.

    How we interpret this state of affairs probably differs from philosophy to philosophy, from person to person. Ordinary language generally doesn't take into account the question of what (if anything) phenomenal consciousness does. We cannot observe anyone's phenomenal consciousness outside of our own, anyway, so we just assume that other people have it, too. That's such a total assumption under usual circumstances that we don't raise the topic at all.

    But with the p-zombie thought experiment we must. A p-zombie can have aphantasia (to the extent that its brain behaves like an aphantasiac brain), be insensitive to pain, detect phantom limbs after an operation... all that groovy stuff that can come with a human brain, which he has. A p-zombie, by definition, has subjectivity to the extent that the brain is involved. But a p-zombie can't experience subjectivity as a phenomenon.

    So a p-zombie can believe things as far as brain-activity is involved, but a p-zombie can not experience believing things. So believing things would be brain behaviour accompanied by corresponding experience, and p-believing things would be brain behaviour not accompanied by experience.

    I'm not sure what I think of this myself. But it makes sense to me that, if p-zombies are biologically indistinguishable from non-p-zombies, you could have p-zombies that are sensitive to pain and p-zombies that are insensitive to pain, as this has behavioural consequences. Sentences like "P-zombies don't feel pain" are therefore too imprecise in the context of this thought experiment. The problem is, though, once we push through to the experience part of the thought experiment we're pretty much in uncharted terrain, and it's all fuzzy and imprecise. I mean, what's the difference between holding and experiencing a belief and holding but not experiencing a belief?

    A p-zombie is just a machine that responds to stimulation. It's an organic clockwork-like body that moves and makes sound.Michael

    If a p-zombie's body is "an organic clockwork-like body that moves and makes sound" then so is yours or mine. The bodies are indistinguishable. So what is this consciousness? How important is it? I'd say that makes them significantly human; I've not yet figured out what difference consciousness makes, but then that's part of the point of the thought experiment to begin with.
  • Michael
    15.7k
    So a p-zombie can believe things as far as brain-activity is involvedDawnstorm

    I don’t think the meaning of the word “belief” can be reduced to an explanation of brain states, just as I don’t think the meaning of the phrase “phenomenal subjective experience” can be reduced to an explanation of brain states.

    If we are p-zombies then we don’t have phenomenal subjective experiences and we don’t have beliefs. We just react to stimuli.
  • RogueAI
    2.9k
    I don't think you've fully grasped what p zombies areflannel jesus

    Maybe. I know they're supposed to be behaviorally identical to us, but address this point: When I burn my finger, I might cry out "that hurts" and also have the belief that I'm in pain (because I actually am). When my p-zombie counterpart burns its finger, it also cries out "that hurts", but does it also believe it's in pain? If it does, then it's believing something it knows to be false (engaging in doublethink), since it has all my knowledge and would know that p-zombies, by definition, cannot feel pain. If it doesn't believe it's in pain, then I and the p-zombie are no longer acting the same way, since we now have different beliefs. Either way, it seems, the p-zombie is a lot different than me: either it's constantly engaging in doublethink, or its beliefs and my beliefs are a lot different.
  • Michael
    15.7k
    If it doesn't believe it's in pain, then I and the p-zombie are no longer acting the same way, since we now have different beliefs.RogueAI

    The p-zombie doesn’t believe anything. It just burns its finger and cries out “that hurts”.
  • RogueAI
    2.9k
    The p-zombie doesn’t believe anything. It just burns its finger and cries out “that hurts”.Michael

    I'm not so sure it doesn't have beliefs. Hypericin made a good case. If they do have beliefs, what do you think of the point I made a post ago?
