• Rich
    3.2k
    The mind is a creative, self-organizing life force that transcends the brain and permeates the body.
  • Efram
    46
    I have a lot of objections to the linked essay, but I assume it's given here more for illustration and not as a focal point for us to critique specifically.

    This topic inevitably devolves into getting deadlocked by the hard problem of consciousness, questions about dualism, etc. To state that the brain could be a computer, you need to establish that a computer is capable of everything that a brain is - and while we know a computer would be capable of what we would accept as intelligent behaviour, we can't say so certainly whether a computer would be capable of experiencing qualia, for example.
  • CasKev
    410
    I wouldn't say the brain/mind is digital, because that suggests every operation is a yes or a no, which disregards degrees of intensity. Intensity brings with it a sense of the infinite - as in, between two points, there is a space that can be subdivided again and again. This level of complexity couldn't be calculated based on simple ones and zeroes. Take pain for example, which can range from mildly discomforting to excruciating. You can rate pain as 7 on a scale of 1 to 10, but the true value is likely 7.128329...
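    As an aside on the subdivision being described, here is a small illustrative sketch (an editor's addition, not part of any post): each binary choice halves the interval between two ratings, so a digital system can subdivide "again and again", yet any finite number of bits still leaves a nonzero gap around a value like 7.128329... Whether some finite precision is ever enough is exactly the point in dispute.

    ```python
    def bisect_toward(target, lo, hi, bits):
        """Refine the interval [lo, hi] toward target using `bits` binary choices."""
        for _ in range(bits):
            mid = (lo + hi) / 2
            if target < mid:
                hi = mid  # target lies in the lower half
            else:
                lo = mid  # target lies in the upper half
        return lo, hi

    # 20 yes/no subdivisions of the interval between pain ratings 7 and 8
    lo, hi = bisect_toward(7.128329, 7.0, 8.0, bits=20)
    print(hi - lo)  # interval width is 1/2**20 -- small, but never zero
    ```

    The sketch shows only that binary subdivision is possible to arbitrary finite depth; it does not settle whether experience itself is captured by any such approximation.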
  • Forgottenticket
    212
    I have a lot of objections to the linked essay, but I assume it's given here more for illustration and not as a focal point for us to critique specifically. -- Efram

    Sort of; the thread title comes from it, and it was linked in the original forums.philosophyforums.com thread. Feel free to make objections to any part of it.

    we can't say so certainly whether a computer would be capable of experiencing qualia, for example. -- Efram

    Well, machine functionalists like Dennett deny the existence of qualia. They say it is the functioning of the machine that gives rise to an abstract "I", and Dennett uses experiments like "change blindness" to show that the concept of qualia isn't clear. Abstractions are real because physics is "nifty" like that. (The "nifty" quote is from a recent Google talk.)
  • Pacem
    40
    Without knowledge of current theories and findings in neuroscience, this topic is left unsupported. The mind-body problem, or the problem of consciousness (whatever you call it), is also a problem of neuroscience.
  • Brian
    88
    This was one of the more popular threads on the old forum so I'm remaking it.

    Here is Searle's original essay:
    http://users.ecs.soton.ac.uk/harnad/Papers/Py104/searle.comp.html

    One objection concerns what defines computation: if everything (the entire universe) is a computation, then the statement "the mind is a computation" is only trivially true.
    Another is that a computer could in theory be made out of anything, so long as the system represents binary symbols (so minds could exist in anything). There is a longer Lanier essay on that problem: http://www.jaronlanier.com/zombie.html
    Searle also notes the homunculus problem, i.e. that someone has to be around to interpret and operate the system.
    -- JupiterJess

    I generally tend to think that while computation is some of what the brain does, it is far from everything. So, there is overlap but no, the brain is not just a digital computer. I've always enjoyed Searle's work on this topic and I find that the Chinese Room argument reveals deep intuitions about the importance of semantics for determining the nature of the human mind. I tend to think understanding is more basic to the human mind than computing is. And I don't think computers understand, though I obviously can't prove this.
  • Wayfarer
    20.6k
    The question that always occurs to me is that if the universe is indeed 'a computation', then who made the computer? It sounds obviously like an argument for a Primordial Programmer. After all, every computer we know of is an artefact, so I don't see how the possibility that the Universe is 'a computation' can be anything other than a kind of theistic argument.

    In any case, on the other topic - I am generally persuaded by Hubert Dreyfus's arguments, advanced decades ago in 'What Computers Can't Do'. How do you write a specification for a system that emulates the unconscious, irony, humour, intuition and the like? I suppose the argument now is that these qualities begin to emerge out of neural networks, and I suppose they might. But I will always be sceptical, because I don't believe that computers are beings.

    “It is much easier,” observed AI pioneer Terry Winograd, “to write a program to carry out abstruse formal operations than to capture the common sense of a dog.”

    A dog knows, through its own sort of common sense, that it cannot leap over a house in order to reach its master. It presumably knows this as the directly given meaning of houses and leaps — a meaning it experiences all the way down into its muscles and bones. As for you and me, we know, perhaps without ever having thought about it, that a person cannot be in two places at once. We know (to extract a few examples from the literature of cognitive science) that there is no football stadium on the train to Seattle, that giraffes do not wear hats and underwear, and that a book can aid us in propping up a slide projector but a sirloin steak probably isn’t appropriate.

    We could, of course, record any of these facts in a computer. The impossibility arises when we consider how to record and make accessible the entire, unsurveyable, and ill-defined body of common sense. We know all these things, not because our “random access memory” contains separate, atomic propositions bearing witness to every commonsensical fact (their number would be infinite), and not because we have ever stopped to deduce the truth from a few more general propositions (an adequate collection of such propositions isn’t possible even in principle). Our knowledge does not present itself in discrete, logically well-behaved chunks, nor is it contained within a neat deductive system.

    It is no surprise, then, that the contextual coherence of things — how things hold together in fluid, immediately accessible, interpenetrating patterns of significance rather than in precisely framed logical relationships — remains to this day the defining problem for AI. It is the problem of meaning.

    Logic, DNA, and Poetry - Steve Talbott