Comments

  • Martha the Symbol Transformer
    No, you do understand.Michael

    So when I output "horse" instead of "rabbit" at T2, not knowing a word of English, what do I mean? Do I somehow manage to mean rabbit? How?
  • Martha the Symbol Transformer
    So you agree that if I am a Chinese speaker trained to output "horse" when I see "rabbit" in the appropriate situation (T2), then neither I nor the system understands the meaning of "horse" or "rabbit".

    The meaning is horse, not the words "horse" or "rabbit".
  • Martha the Symbol Transformer
    So you agree that horses are what give meaning to how we use the word "horses".
  • Martha the Symbol Transformer
    That what it means to be X depends on how we use the word "X".Michael

    But we use the word "horse" to refer to animals with certain properties, and that's where the meaning comes from. The meaning of "horses" is horses. And if we decide to use "rabbit" in the future, it will also mean horses.
  • Martha the Symbol Transformer
    It's a counterfactual argument. I'm certainly not trying to conclude that rabbits are horses.Michael

    Right, but what exactly are you trying to claim? That the meaning of rabbits or horses is contained in the word we use, such that if we use another word instead, the meaning changes?
  • Martha the Symbol Transformer
    So you agree that "rabbit" means those furry creatures with big ears, and not equines, even if we decide to use the word "horse" instead.
  • Martha the Symbol Transformer
    We can change the word "horse" such that it's a synonym for "rabbit", which means those furry creatures with big ears, regardless of what we want to call them.
  • Martha the Symbol Transformer
    We didn't change the meaning of gay, we changed the way we used the word "gay", such that it means homosexual now. The meaning is not in the word.
  • Martha the Symbol Transformer
    I'm saying that if we use the word "equine" to refer to horses then to be an equine is to be a horse, and if we use the word "equine" to refer to rabbits then to be an equine is to be a rabbit.Michael

    The word used changes, but a horse is a horse, of course. The meaning of horse remains the same. We don't mean that rabbits are now horses. We mean that horses are horses.
  • Martha the Symbol Transformer
    I think it needs to be stated the other way round; "horsexual" means "gay". So what's the problem?Michael

    Gay already means something, so I picked a meaningless word to transition to. Then you can see that meaning doesn't change when the word changes.
  • Martha the Symbol Transformer
    That depends on if we're using the word "gay" to refer to rabbits or if we're predicating homosexuality of horses. It's not clear what you're trying to say.Michael

    I was aiming for humor there, because the conversation was starting to make me laugh.
  • Martha the Symbol Transformer
    But I do mean gay, because "gay" now means "homosexual", and I mean "homosexual".Michael

    So let's invent a new word called horsexual, and let's say that gay now means "horsexual". Now what?
  • Martha the Symbol Transformer
    And if we coin a new word "horse" that means "rabbit" it follows that rabbits were already horses.Michael

    And if we say that rabbits were gay, we mean that horses are homosexual, right?
  • Martha the Symbol Transformer
    Just as now I call a homosexual "gay", and I still mean gay -- because "gay" means "homosexual".Michael

    You don't still mean happy, you mean homosexual now. So you don't still mean "gay".
  • Martha the Symbol Transformer
    Only according to the current meaning of "horse". But I've changed it. You might as well say that if so-and-so was gay then he'd be carefree and happy.Michael

    You haven't changed the meaning of "horse". You've exchanged the word for another. Now you call a "horse" a "rabbit", but you still mean horse.

    A horse by any other name.
  • Monthly Readings: Suggestions
    U. G. Krishnamurti - Mind Is a MythThe Great Whatever

    Whoa, I went and read some of that. Pretty extreme stuff. Would make for an interesting conversation.
  • Re: that other place ...
    The best and the worst.
  • Re: that other place ...
    I think C. S. Lewis wrote a story in which hell was a rather dreary, dark, dusty city absent of any feeling.Bitter Crank

    He also wrote a story of purgatory, or a temporary hell, where everyone was completely self-absorbed or caught up in some issue. They still had a chance to leave and take the bus to heaven, but they wouldn't, because they were unwilling to get over themselves, or whatever jealousy or hate they were clinging to.
  • Re: that other place ...
    Hell is easier to imagine than a heaven of blessedness, for some reason.Bitter Crank

    Because sometimes it feels like we're already there? And heavenly bliss is a more fleeting state.
  • Martha the Symbol Transformer
    This feels like a faulty analogy as well. The manipulation of a chess board to produce legal moves would include rules about situational moves implicit with an understanding of the object of the game (e.g., what to do when in check). In this case, we are making an intuitive judgement that knowing all the legal moves is insufficient to produce an understanding of the game, but we are doing so from a state of ignorance. The scope of knowing ALL the legal moves might in fact entail an understanding of the object of the game.Soylent

    Google had its DeepMind machine-learning software learn various Atari 2600 games. It excelled at some of them but struggled with others. It scored a zero on Montezuma's Revenge, because the score doesn't change unless you're able to navigate across a room with obstacles and get the key. DeepMind has no understanding of objects in any of the games. It only knows pixels and the score, which it's trying to maximize. To do well in this game, you need to know that the key is something to aim for. Any human would quickly figure that out.
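
    To make that concrete, here's a rough Python sketch of the kind of reward-driven loop such a system runs. The toy environment is made up (a stand-in for an Atari game), and this is generic tabular value learning rather than DeepMind's actual deep network, but the principle is the same: the agent sees only an observation and a score, with no notion of keys, doors, or objects. And if the score never changed, as in Montezuma's Revenge, every value would stay at zero and the agent would never do better than acting at random.

    import random

    class ToyEnv:
        """Made-up stand-in for a game: 10 cells, goal at cell 9.
        The agent sees only its cell number and the score."""
        def reset(self):
            self.pos = 0
            return self.pos

        def step(self, action):  # 0 = left, 1 = right
            self.pos = max(0, min(9, self.pos + (1 if action == 1 else -1)))
            reward = 1.0 if self.pos == 9 else 0.0  # score changes only at the goal
            return self.pos, reward, self.pos == 9

    q = {}  # tabular action values: (observation, action) -> estimated value

    def act(obs, eps=0.2):
        vals = [q.get((obs, a), 0.0) for a in (0, 1)]
        if random.random() < eps or vals[0] == vals[1]:
            return random.choice((0, 1))  # explore, or break ties randomly
        return vals.index(max(vals))

    env = ToyEnv()
    for episode in range(200):
        obs, done = env.reset(), False
        for _ in range(500):  # cap episode length
            a = act(obs)
            nxt, r, done = env.step(a)
            # update toward reward plus discounted best estimate of what's next
            target = r + 0.9 * max(q.get((nxt, b), 0.0) for b in (0, 1))
            q[(obs, a)] = q.get((obs, a), 0.0) + 0.5 * (target - q.get((obs, a), 0.0))
            obs = nxt
            if done:
                break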

    But admittedly, that is different from a chess-playing program. Humans labored to program chess software to play the game. It didn't learn how.
  • Martha the Symbol Transformer
    The problem I see with the Chinese Room and your above example is that if you buy into the computational theory of mind you can see how each respectively fits into the theory. Alternatively, if you think there's something missing, you see how each respectively demonstrates that position as well. The analogies seem only to illustrate confirmation biases in intuition rather than insight into what is really going on.Soylent

    Fair enough. I think I've made the mistake of accepting Searle's setup. If I don't buy into the computational theory of mind, why would I expect the Chinese Room to work? Why would I expect a symbol manipulating system to pass the Turing Test (in a strong way)?
  • Martha the Symbol Transformer
    So I'm still waiting for the evidence that shows that people are doing more than manipulating symbols and that computers aren't.Michael

    Before language, there were animals who experienced and felt. That's what's fundamental. Language is late in the game. Symbols are parasitic on experience.

    You ask how I know that a computer can't feel. That's missing the point. Symbols can't feel. To the extent that a computer only manipulates symbols, it isn't feeling or knowing anything, because there is no knowledge or feeling in symbols themselves, only what they stand in for.
  • Martha the Symbol Transformer
    This cuts both ways though, do humans/animals do something more than produce a programmed/hard-wired output in the proper situations?Soylent

    Yes, since they don't always produce the same output. Animals, and particularly humans, display a great deal of flexibility and variability. There is also the question of what determines the proper situation. What is proper in a given situation? Often, human culture defines that.

    An example from the wild is an osprey nest where a video camera was set up and streamed online. The mother, for unknown reasons, started attacking the chicks and failed to feed them properly. That doesn't make much sense from an evolutionary point of view, but life is messy.

    Computers and robots have shown creativity and novelty within a specific domain.Soylent

    True.
  • Martha the Symbol Transformer
    So I'm still waiting for the evidence that shows that people are doing more than manipulating symbols and that computers aren't.Michael

    The reason is that symbol manipulation alone undermines itself. In order for there to be symbols to compute, the symbols have to be defined. Chinese symbols without Chinese speakers aren't actually symbols. They're random markings.

    The word or emoticon for grief isn't a word or emoticon if there is no grief. It either means something else, or nothing at all. You have to have the grief first before there can be a symbol invented to represent it.

    The argument here is that symbols can't be primary or fundamental. They are derived, invented, created to aid in communication or thinking.
  • Martha the Symbol Transformer
    Marchesk, stop avoiding. You said that your claim that humans can understand but that computers can't isn't dogma. You said that you have evidence. Tell me what that evidence is.Michael

    Actually, my contention was that symbol manipulation alone doesn't result in understanding. If a computer can be arranged to do more than symbol manipulation, then I'm not claiming it can't understand, because I don't know at that point.

    Searle's contention was that computers only manipulate symbols, however sophisticated.
  • Martha the Symbol Transformer
    It seems clear to me that you don't have any evidence that humans can feel but that computers can't. It seems clear to me that this is just a dogmatic assertion. I'm not sure why you're so unwilling to admit this.Michael

    First off, you agree that there is something more to feeling than producing a symbolic representation of feeling in the proper context, correct?
  • Martha the Symbol Transformer
    And being sexual means what? Feeling sexual arousal? You're just begging the question. And needing to reproduce means what? Having the desire to reproduce? You're just begging the question.Michael

    You can't be serious.
  • Martha the Symbol Transformer
    They could.Michael

    Are they, though? Have computers formed a linguistic community? Have they told us what the symbols of that community mean (or how they are used, to use your definition of meaning)?
  • Martha the Symbol Transformer
    And what evidence shows that only animals experience sexual arousal?Michael

    I don't know. I guess hurricanes might be aroused when they hit the shore of a major city.

    It probably has to do with animals being sexual, and needing to reproduce.
  • Martha the Symbol Transformer
    The linguistic community.Michael

    And computers form a linguistic community?

    Maybe. What evidence allows us to justify an answer either way?Michael

    Something about machines not being animals, probably.
  • Martha the Symbol Transformer
    How do you know what they can't?Michael

    I don't know. How do you know a rock can't be sexually stimulated?
  • Martha the Symbol Transformer
    Then as I keep asking, what evidence shows that humans can genuinely feel emotions but that computers/robots can't? Clearly it can't be empirical evidence because you're saying that outward behaviour can be "fake". So you have non-empirical evidence?Michael

    This is like asking how I know that computers/robots can't be sexually stimulated just because arousal can be faked.
  • Martha the Symbol Transformer
    Perhaps the input to which "grief" is the output? And if we go with something like the James-Lange theory then the input is physiological arousal.Michael

    Two questions here:

    1. Who or what determines what the proper output is?

    2. Do computers have physiological arousal?
  • Martha the Symbol Transformer
    So the presence of emotions is determined by public behaviour. Then if a robot behaves the same way a person does, e.g. saying "I'm sorry for your loss" when you tell them that your father has died, accompanied with the appropriate facial expressions and body language, then the robot has demonstrated his capacity for emotions.Michael

    No, it's not. A person can fake emotions, after all. I might be convinced that you're sorry (or the robot is), but maybe it's just mimicry. Maybe you don't actually feel sorry. Maybe you didn't like the person who died, or me, or just aren't close to the situation. Maybe you just aren't feeling empathetic but want to maintain a polite appearance.
  • Martha the Symbol Transformer
    Second, simply because we do not relate to a computer as well as we do to other humans doesn't mean a computer doesn't feel. The recent movie Ex Machina explores this.darthbarracuda

    Interesting that you mentioned that movie, since the machine in the movie manipulated the feelings of the protagonist in order to accomplish some other goal. The protagonist felt empathy for the machine and wanted to help it, not realizing that he was being fooled.

    There is another movie along these lines, in which a journalist with a background in robotics is invited to do a piece on a successful roboticist. It turns out this person has managed to create a very human-like android, and he wants the journalist to examine it. She ends up falling in love with the roboticist, and is a bit disturbed by the android, because it's awkward in conversation and begins to exhibit signs of jealousy and sexual interest.

    It turns out the roboticist is actually the android, and the android is the roboticist. She was fooled to see if the mimicry could be carried out convincingly, which it was, since she fell in love with a machine programmed to mimic a self-confident genius. The real person is a less convincing, awkward, but brilliant human.
  • Martha the Symbol Transformer
    Grief isn't a symbol, it's an experience. It can be communicated with symbols, but the symbols aren't grieving. As such, outputting grieving symbols in the right situation is not at all the same as experiencing grief.
  • Martha the Symbol Transformer
    Which means what?Michael

    We use symbols to communicate meaning.

    And what evidence shows that humans can provide meanings to the symbols but computers can't?Michael

    Searle's argument, as I understand it, is that computers (or any system) are unable to do this if all they're doing is manipulating symbols. Humans are doing something in addition when we produce symbols. The fundamental reason is that symbols aren't meaningful in themselves; rather, they connote meaning. They're symbols for a reason.
  • Martha the Symbol Transformer
    The correct question is "what's the difference between a computer taking in, manipulating, and outputting symbols and a human taking in, manipulating, and outputting symbols?" It's the one I've asked you, and it's the one I'm still waiting an answer for.Michael

    Humans provide meanings to the symbols in the first place, which is what you're ignoring.
  • Martha the Symbol Transformer
    Correct, abstract people (and fictional people in stories) don't actually grieve.Michael

    But here's the thing. The computer is taking in symbols, manipulating those symbols, and outputting symbols, correct? So what's the difference between that and a human writing out the algorithm for computing grief?

    A human could take the symbols for a funeral, write down the computations a Turing machine would make, and output the symbols for grief, or whatever. In theory. Maybe a billion Chinese people could do it. Would that system grieve?
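
    Here's roughly what that pencil-and-paper exercise looks like, as a minimal Turing machine simulator in Python. The two-rule table is made up: it just rewrites every "F" on the tape as "G" and halts at the first blank, a toy stand-in for "read funeral symbols, write grief symbols". Every step is a bare lookup-and-rewrite that a person (or a billion people passing notes) could carry out by hand.

    # Hypothetical rule table: (state, symbol) -> (new state, symbol to write, head move)
    rules = {
        ("scan", "F"): ("scan", "G", 1),  # rewrite F as G, move right
        ("scan", "_"): ("halt", "_", 0),  # blank cell: halt
    }

    def run(tape):
        state, head = "scan", 0
        while state != "halt":
            symbol = tape[head] if head < len(tape) else "_"
            state, write, move = rules[(state, symbol)]
            if head < len(tape):
                tape[head] = write
            head += move
        return tape

    print(run(list("FFF")))  # ['G', 'G', 'G'] -- symbols shuffled; nothing felt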
  • Martha the Symbol Transformer
    You're just reasserting the claim that humans can understand and computers can't. I want to know what evidence supports this claim.Michael

    Computers are instantiations of Turing machines (limited by physics), correct? You agreed that an abstract Turing machine can't compute grief. What makes an instantiated Turing machine different?

    You might retort that abstract machines don't compute, but that's not quite right, because we can write out the algorithm for any given computation, if we wanted to take the time and effort (within the limits of our resources).

    So if there exists an algorithm for grief, why wouldn't the algorithm itself feel grief, or a written-out version of a Turing machine computing that algorithm? Is there something that an instantiated computer does with symbols that an abstraction doesn't?

    Is it the electricity flow through the gates? Does electricity give meaning to symbols?