• Marchesk
    4.6k
    Perhaps that we dogmatically believe that people understand but computers don't? — Michael

    We understand that people are doing something more than manipulating symbols. When I say that I understand your loss, you take it to mean I can relate to having lost someone, not that I can produce those symbols in the right situation, in which case I'm just formally being polite. If a machine says it, it's understood that someone programmed a machine to say it in those circumstances, which might come off as incredibly cold and insensitive, or downright creepy (if it hits the uncanny valley). What we don't do is think that the machine feels our pain or empathizes.

    It's the same with Siri telling me it's cold outside. It's cute and all, but nobody takes it seriously.
  • Michael
    15.8k
    And then there's the question of intentionality. How do symbols refer? How is it that 1 stands for any single individual item? — Marchesk

    When the input is "•" the output is "1". When the input is "••" the output is "2", etc. We're taught what to say in response to something else.
  • Marchesk
    4.6k
    No, not dogma. It's really absurd to maintain otherwise, unless you're invoking some altered version of the other minds problem in which I'm the only one not doing symbol manipulation.

    But anyway, I'll try another approach, one which isn't about consciousness. Once we humans have an understanding of 1 + 1 (to use a trivial example), we can universalize it to any domain. A computer can't do that. It has to be programmed, in each different scenario, with how to apply 1 + 1 to achieve whatever result.

    Sure, the computer always knows how to compute 2, but it doesn't know how to apply addition in various real world situations without being programmed to do so.
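
    A hedged sketch of that claim in Python (all names hypothetical): the computation of 1 + 1 is the same everywhere, but each real-world application of it is a separate piece of programming that the computation itself does not supply:

        def add(a: int, b: int) -> int:
            # The computation itself never changes.
            return a + b

        # Each "domain" is wired up separately; nothing in add() tells the
        # machine that these are all cases of the same idea.
        def total_apples(basket_a: int, basket_b: int) -> int:
            return add(basket_a, basket_b)

        def combined_minutes(trip_one: int, trip_two: int) -> int:
            return add(trip_one, trip_two)

        print(total_apples(1, 1))      # 2 apples
        print(combined_minutes(1, 1))  # 2 minutes
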
  • Michael
    15.8k
    No, not dogma. It's really absurd to maintain otherwise, unless you're invoking some altered version of the other minds problem in which I'm the only one not doing symbol manipulation. — Marchesk

    If it's not dogma then there's evidence. What evidence shows that the computer who says "I'm sorry" doesn't understand and that the human who says "I'm sorry" does?
  • Marchesk
    4.6k
    When the input is "•" the output is "1". When the input is "••" the output is "2", etc.Michael

    That's not what reference means at all.
  • Michael
    15.8k
    Then what does reference mean?
  • Marchesk
    4.6k
    If it's not dogma then there's evidence. What evidence shows that the computer who says "I'm sorry" doesn't understand and that the human who says "I'm sorry" does? — Michael

    Humans form emotional bonds and machines don't. Do you need some scientific literature to back this up? Humans also grieve when those bonds are broken and machines don't.
  • Michael
    15.8k
    Humans form emotional bonds and machines don't. Do you need some scientific literature to back this up? Humans also grieve and machines don't. — Marchesk

    What evidence shows that humans can form emotional bonds and grieve but that computers can't? You can't use science because science can only ever use observable behaviour as evidence, and the premise of the thought experiment is that the computer has the same observable behaviour as a human.
  • Marchesk
    4.6k
    Then what does reference mean? — Michael

    The mathematical symbol "1" means any item or unit ever, in the context of counting or sets. You can use it to denote any one thing.

    If I made up some word, say "bluxargy", and then defined it with some other made-up words, what does it reference? It references nothing, so reference can't be symbol manipulation.
  • Marchesk
    4.6k
    What evidence shows that humans can form emotional bonds and grieve but that computers can't? You can't use science because science can only ever use observable behaviour as evidence, and the premise of the thought experiment is that the computer has the same observable behaviour as a human. — Michael

    Okay, let's set aside empirical matters and just accept that humans do experience emotion. What about Turing machines? Can a Turing machine, in just its abstract form, experience grief? Does that make any sense?

    What I mean is, say some brilliant mathematician/programmer defined the algorithm that some theoretical computer could use to compute being in grief, and wrote it down. Would that algorithm then experience grief? Let's say they pay someone to illustrate a Turing machine manipulating the symbols needed to compute the algorithm. Whole forests are cut down to print this thing out, but there it is. Are the symbols sad?
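
    For illustration only (this is not the imagined grief algorithm, just a hypothetical example of what a written-down Turing machine amounts to): a minimal Python sketch of a machine that appends one mark to a unary string. It is a table and a loop over symbols, and the question above is whether anything in such a table could be sad.

        BLANK = "_"

        # (state, symbol) -> (symbol to write, head move, next state)
        TABLE = {
            ("scan", "1"):   ("1", +1, "scan"),
            ("scan", BLANK): ("1", +1, "halt"),
        }

        def run(tape):
            state, head = "scan", 0
            while state != "halt":
                if head >= len(tape):
                    tape.append(BLANK)
                write, move, state = TABLE[(state, tape[head])]
                tape[head] = write
                head += move
            return tape

        print("".join(run(list("111"))))  # -> "1111"
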
  • Michael
    15.8k
    The mathematical symbol "1" means any item or unit ever, in the context of counting or sets. You can use it to denote any one thing. — Marchesk

    Sure. And you asked how it's come to mean this thing. I pointed out that we're provided with some input (of which there may be many that resemble one another in some empirical way), e.g. "•" or "••", and are told what to output, e.g. "1" or "2".
  • Marchesk
    4.6k
    Sure. And you asked how it's come to mean this thing. I pointed out that we're provided with some input (of which there may be many that resemble one another in some empirical way), e.g. "•" or "••", and are told what to output, e.g. "1" or "2". — Michael

    Again, that's not what "1" or "+" or "2" means, at all.
  • Michael
    15.8k
    Okay, let's set aside empirical matters and just accept that humans do experience emotion. What about Turing machines? Can a Turing machine, in just its abstract form, experience grief? Does that make any sense? — Marchesk

    If we're just going to accept that the humans experience emotions then why not just accept that the Turing machine does? And if it doesn't make sense to say that the Turing machine can grieve, then why does it make sense to say that the human can? What's the difference between humans and computers that makes emotions reasonable in the former but not in the latter?
  • Marchesk
    4.6k
    If we're just going to accept that the humans experience emotions then why not just accept that the Turing machine does? — Michael

    Because symbols are abstractions from experience. They stand in for something else. An emoticon isn't happy or sad or mad. It just means that to us, because we can be happy, or mad or sad.
  • Michael
    15.8k
    I don't see how that answers my question. It seems a non sequitur.
  • Michael
    15.8k
    Then what does 1 mean?
  • Marchesk
    4.6k
    How so? You just asked why a bunch of symbols can't have emotions if humans have emotions. I just told you that symbols are stand-ins for something else, in this case emotion. A happy face isn't happy. It means happy.
  • Marchesk
    4.6k
    I already told you that 1 means any individual thing in context of counting or sets. I'm sure someone else can provide a better mathematical definition. This can run kind of deep, because we could start debating the exact meaning behind the symbol, which might lead to a universals vs. nominalism debate.
  • Michael
    15.8k
    I already told you that 1 means any individual thing in context of counting or sets. — Marchesk

    So when I say "there's 1 apple" I'm saying "there's any individual thing in context of counting or sets apple"?
  • Soylent
    188
    The claim Searle is making is that no amount of symbol manipulation gets you to understanding, because understanding isn't in the symbols. — Marchesk

    You're shifting the terms of understanding. If understanding is granted to the system for the accurate manipulation of the symbols, then human understanding is likewise granted for accurate manipulation of the symbols. It's not enough to have the symbols; one has to have the rules to manipulate the symbols. Searle, and perhaps you, seems to want to isolate the understanding of the Chinese Room participant from the entire system, which includes the set(s) of rules. Martha doesn't need to know the meaning of the output, because the meaning is supplied by the entire system and not a single part of it. Your tongue doesn't need to know the meaning of the words in order to get into the right position to make a sound. The aggregate system is demonstrative of understanding: input to output and all the various computational places in between.

    You might object that the computational theory of mind begs the question as well. Humans do have an understanding not present in the Chinese Room, but I don't think appealing to the intuition of the scenario is going to lead us to any insight about what is going on there. The Chinese Room and !Kung-speaking Martha are inadequate to settle the matter because one cannot see past the preconception brought to the example.
  • Michael
    15.8k
    You confused me. I thought you meant a machine that passes the Turing test. Bringing up abstract things is a red herring. Nobody is saying that abstract things can have real emotions.
  • Marchesk
    4.6k
    Nobody is saying that abstract things can have real emotions. — Michael

    Right, so what makes a computer different from an abstraction, like a Turing machine (of which a computer is a finite realization)? Is it that the computer is made of matter instead of symbols?
  • Michael
    15.8k
    Yes. Just as a human is made different from an abstraction by being made of matter.

    You seem to be avoiding the question. Why is it reasonable to assume that humans can experience but that computers can't? What's the evidence? Is there any? Or is it just dogma?
  • Marchesk
    4.6k
    You're shifting the terms of understanding. If understanding is granted to the system for the accurate manipulation of the symbols, then human understanding is likewise granted for accurate manipulation of the symbols. It's not enough to have the symbols; one has to have the rules to manipulate the symbols. — Soylent

    Right, and I'll accept that this is one notion of understanding, being that words can have multiple meanings. Siri knows how to tell me what the temperature outside is. "She" understands how to compute that result.

    Searle, and perhaps you, seems to want to isolate the understanding of the Chinese Room participant from the entire system, which includes the set(s) of rules. Martha doesn't need to know the meaning of the output, because the meaning is supplied by the entire system and not a single part of it. — Soylent

    But Searle's point is that it doesn't matter, because it's still just a form of symbol manipulation. He thinks we do something fundamentally different from following rules to manipulate symbols when we speak English or Chinese, although of course we are capable of computing symbols, albeit not usually as well as a computer.
  • Marchesk
    4.6k
    Yes. Just as a human is made different from an abstraction by being made of matter. — Michael

    So it is matter that gives meaning to symbols?
  • Michael
    15.8k
    Use gives meaning to symbols.
  • The Great Whatever
    2.2k
    You won't get far on the harder tests with that level of 'understanding.'
  • Michael
    15.8k
    What would get me further? Presumably more symbol manipulation. That's what mathematical proofs are, right? Take some input sentences (the axioms) and apply the rules of inference to output some other sentences.
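
    A hedged sketch in Python of that picture of proof (the sentences and their encoding are hypothetical): axiom strings go in, modus ponens is applied mechanically, and derived strings come out, with no meaning consulted anywhere:

        def derive(axioms):
            # Close a set of sentences under modus ponens:
            # from "P" and "P -> Q", add "Q".
            sentences = set(axioms)
            changed = True
            while changed:
                changed = False
                for s in list(sentences):
                    if " -> " in s:
                        antecedent, consequent = s.split(" -> ", 1)
                        if antecedent in sentences and consequent not in sentences:
                            sentences.add(consequent)
                            changed = True
            return sentences

        axioms = {"it_rains", "it_rains -> ground_wet", "ground_wet -> slippery"}
        print(sorted(derive(axioms)))  # includes "ground_wet" and "slippery"
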
  • The Great Whatever
    2.2k
    Knowing what drives the proposals for the rules of symbolic manipulation.