No, you do understand. — Michael
That what it means to be X depends on how we use the word "X". — Michael
It's a counterfactual argument. I'm certainly not trying to conclude that rabbits are horses. — Michael
I'm saying that if we use the word "equine" to refer to horses then to be an equine is to be a horse and if we use the word "equine" to refer to rabbits then to be an equine is to be a rabbit — Michael
I think it needs to be stated the other way round: "horsexual" means "gay". So what's the problem? — Michael
That depends on whether we're using the word "gay" to refer to rabbits or predicating homosexuality of horses. It's not clear what you're trying to say. — Michael
But I do mean gay, because "gay" now means "homosexual", and I mean "homosexual". — Michael
And if we coin a new word "horse" that means "rabbit" it follows that rabbits were already horses. — Michael
Just as now I call a homosexual "gay", and I still mean gay -- because "gay" means "homosexual". — Michael
Only according to the current meaning of "horse". But I've changed it. You might as well say that if so-and-so was gay then he'd be carefree and happy. — Michael
U. G. Krishnamurti - Mind Is a Myth — The Great Whatever
I think C. S. Lewis wrote a story in which hell was a rather dreary, dark, dusty city absent of any feeling. — Bitter Crank
Hell is easier to imagine than a heaven of blessedness, for some reason. — Bitter Crank
This feels like a faulty analogy as well. The manipulation of a chess board to produce legal moves would include rules about situational moves implicit with an understanding of the object of the game (e.g., what to do when in check). In this case, we are making an intuitive judgement that knowing all the legal moves is insufficient to produce an understanding of the game, but we are doing so from a state of ignorance. The scope of knowing ALL the legal moves might in fact entail an understanding of the object of the game. — Soylent
The problem I see with the Chinese Room and your above example is that if you buy into the computational theory of mind you can see how each respectively fits into the theory. Alternatively, if you think there's something missing, you see how each respectively demonstrates that position as well. The analogies seem only to illustrate confirmation biases in intuition rather than insight into what is really going on. — Soylent
So I'm still waiting for the evidence that shows that people are doing more than manipulating symbols and that computers aren't. — Michael
This cuts both ways though, do humans/animals do something more than produce a programmed/hard-wired output in the proper situations? — Soylent
Computers and robots have shown creativity and novelty within a specific domain. — Soylent
Marchesk, stop avoiding. You said that your claim that humans can understand but that computers can't isn't dogma. You said that you have evidence. Tell me what that evidence is. — Michael
It seems clear to me that you don't have any evidence that humans can feel but that computers can't. It seems clear to me that this is just a dogmatic assertion. I'm not sure why you're so unwilling to admit this. — Michael
And being sexual means what? Feeling sexual arousal? You're just begging the question. And needing to reproduce means what? Having the desire to reproduce? You're just begging the question. — Michael
They could. — Michael
And what evidence shows that only animals experience sexual arousal? — Michael
How do you know what they can't? — Michael
Then as I keep asking, what evidence shows that humans can genuinely feel emotions but that computers/robots can't? Clearly it can't be empirical evidence because you're saying that outward behaviour can be "fake". So you have non-empirical evidence? — Michael
Perhaps the input to which "grief" is the output? And if we go with something like the James-Lange theory then the input is physiological arousal. — Michael
So the presence of emotions is determined by public behaviour. Then if a robot behaves the same way a person does, e.g. saying "I'm sorry for your loss" when you tell it that your father has died, accompanied by the appropriate facial expressions and body language, then the robot has demonstrated its capacity for emotions. — Michael
Second, simply because we do not relate to a computer as well as we do to other humans doesn't mean a computer doesn't feel. The recent movie Ex Machina explores this. — darthbarracuda
Which means what? — Michael
And what evidence shows that humans can provide meanings to the symbols but computers can't? — Michael
The correct question is "what's the difference between a computer taking in, manipulating, and outputting symbols and a human taking in, manipulating, and outputting symbols?" It's the one I've asked you, and it's the one I'm still waiting for an answer to. — Michael
Correct, abstract people (and fictional people in stories) don't actually grieve. — Michael
You're just reasserting the claim that humans can understand and computers can't. I want to know what evidence supports this claim. — Michael
