There is someone who made a thread yesterday or the day before explaining how he has no inner monologue and also cannot form images mentally. — Lionino
"Belief" is a word in the English language that has a well-established meaning. If p-zombies are speaking English then the word "belief" means what it means in English. — Michael
The relevant definition in Webster's is "something that is accepted, considered to be true, or held as an opinion". This to me doesn't entail subjective state. — hypericin
Words popping up in my mind do not help me connect the dots of different ideas, the glyph and its content are different things. For me, it is only when I summon the content (image) in my mind that I can finally think. — Lionino
This parallels what I experience. I think you might underestimate the inner monologue. After all, I am guessing that animals can think visually as well. Our ability to manipulate glyphs which represent arbitrary concepts, both aloud and internally, is part of what sets our cognitive ability apart. — hypericin
The relevant definition in Webster's is "something that is accepted, considered to be true, or held as an opinion". This to me doesn't entail subjective state. — hypericin
What does it mean to "accept", "consider", or "hold as an opinion"? Again, these aren't terms that it makes sense to attribute to a p-zombie. A p-zombie is just a machine that responds to stimulation. It's a complicated clockwork body. They're just objects that move and make sound. That's it. — Michael
It's quite ironic that you're anthropomorphising p-zombies. — Michael
Why ironic? They are already maximally anthropomorphized.
I’m not sure that counts as belief. Belief seems to me to be a conscious activity. Machines can record and analyze information but they don’t believe anything. — Michael
You could just say “I am a p-zombie”. — Michael
Most of us don't know it. It isn't easy to tell until you get a glimpse of the bit being missed, sort of like having your vision restored after cataracts have reduced you to near grayscale levels.
You’ll need to explain it in these terms. — Michael
Hence my attempt with the car, which very much is aware of its surroundings, but 'aware' is perhaps one of those forbidden words. It all smacks of racism. They basically degraded black slaves by refusing to use human terms for anything related to them, using cattle terms instead. It made it easy to justify how they were treated. "Cows don't feel pain. Neither do p-zombies. It's not immoral to set 'em on fire."
I don't agree with this. "so he says he feels pain, not knowing that it isn't real pain". That's an epistemic issue, not a truth issue. For any x, if x is not feeling pain/hurting, and x says it is feeling pain/hurting, x is wrong. X is saying something false. — RogueAI
I didn't say the statement "so he says he feels pain, not knowing that it isn't real pain" was true.
But this isn't true for you. — RogueAI
That's just using a language bias to attempt a demonstration of a difference where there isn't one. A Roomba cannot be conscious because you define 'conscious' to only apply to humans. That doesn't demonstrate that a pimped-out Roomba isn't doing the exact same thing; it only means that the Roomba needs to pick a different word for the exact same thing, and then tell the human that he isn't that sort of conscious because he's an inferior human.
But belief is a conscious mental activity. P-belief/p-consider is incoherent. It's missing a necessary condition for anything that remotely resembles believing and considering. — RogueAI
What is the computer doing, then, when it processes data from a camera pointed at a table? The computer 'concludes' (probably a forbidden word) that there is a table in front of the camera in question, and outputs the statement "there seems to be a table in front of the camera". You say it's not a mental activity. I agree with that; that usage of "mental activity" only applies to an immaterial mind such as Chalmers envisions. So OK, you can't express that the computer believes there's a table there, or that it concludes that. How, then, do you phrase what the computer does when it does the exact same thing as the human, which is to deduce (presumably another forbidden word) the nature of the object in the field of view?
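A minimal sketch of the camera case described above, in the same spirit as the PHP snippet quoted later in the thread. The function name classifyFrame, the file camera_frame.jpg, and the 0.9 confidence threshold are all hypothetical stand-ins for a real vision pipeline:

<?php
// Hypothetical stand-in for a trained vision model: it maps an image
// to a label and a confidence score. There is no inner state here,
// only an input-to-output mapping.
function classifyFrame(string $imagePath): array {
    // A real pipeline would run a model on the image; this stub just
    // returns a fixed answer to keep the sketch self-contained.
    return ['label' => 'table', 'confidence' => 0.97];
}

$result = classifyFrame('camera_frame.jpg'); // path is hypothetical

// The machine outputs a claim once confidence clears a threshold.
if ($result['confidence'] > 0.9) {
    echo "There seems to be a {$result['label']} in front of the camera.\n";
}

Whatever verb one allows for that final step is exactly the disputed word.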
The answer is that it would behave as its physical circumstances dictate. — Banno
So that is what I claim I'm doing. But how would you express what the p-zombie does when it correctly identifies the table in front of it that it cannot 'see'?
We don't behave this way. — RogueAI
I do, and you don't. So why are we indistinguishable (except for me deciding to stop imitating the language you use for that which I cannot ever know)?
By definition they behave as we do. This includes belief. — hypericin
Well, per Michael, this includes false claims of belief. I'm doing something that I think is belief, but Michael says it's by definition false.
If you can't provide acceptable alternative terms, then I'm sorry, the computer believes there's a table there. Deal with it. — noAxioms
I would say, if your claim were true, it would be a revolutionary finding. — hypericin
It wouldn't be a finding at all. If it were true, nobody (not even I) would know for sure. Of course, I'm sure that 'knowing' things (and being 'sure') are all forbidden. But I do have whatever it takes to pass an interview for a technical job, even if it isn't knowledge. I have claims of it on my resume, all false apparently.
<?php
// Returns one of three canned answers at random.
// Note that $text is never inspected: the "response" owes nothing
// to the question asked.
function responseTo($text) {
    return ['Yes', 'No', 'Maybe'][random_int(0, 2)];
}

echo responseTo('Consider p-zombies. Can they believe?');
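Saved to a file and run with the PHP CLI (for example, php responder.php, filename hypothetical), this prints Yes, No, or Maybe at random; the argument string is never even read.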
It doesn't seem at all appropriate to say that it believes or accepts or considers anything. That would be a very obvious misuse of language. — Michael
ChatGPT and p-zombies are just very complicated versions of the above, with p-zombies having a meat suit. — Michael
We might spend the rest of our life's allotment of time on this forum going back and forth with noAxioms and still not definitively figure out whether he is a p-zombie or not. — hypericin
P-zombies have no consciousness. They just have an outward appearance (including observable behaviour). You’ll need to explain it in these terms. (By outward appearances I don’t mean to exclude muscles and bones and internal organs.) — Michael
A p-zombie is just a machine that responds to stimulation. It's an organic clockwork-like body that moves and makes sound. — Michael
So a p-zombie can believe things as far as brain activity is involved. — Dawnstorm
I don't think you've fully grasped what p-zombies are. — flannel jesus