• telex
    103
    Suppose there were these 4 unquestionable and completely certain facts:

    1) You live in a simulation
    2) No one in the simulation is real besides you. (There is AI but it does not feel or think, it only mimics)
    3) You discovered this last week.
    4) The world continues to function as it is. (You still experience pain/fear/pleasure, man made laws are enforced by "police", and the laws of nature continue to apply)

    Now you are stuck. Why?
    Your family, spouse, and/or children are not real. You've loved them for so long. You have unshakable memories of them. Your photo album holds all the precious moments of the love you shared together.

    Furthermore, your neighbors, friends, teachers, co-workers, therapists, and politicians were just computer programs who perfectly mimicked human beings.

    Would you go insane?
    Would you continue to live your life as it was, knowing you are living with Artificial Intelligence?

    What would you do about your family, after you've invested so much time into them? Would you leave them to go search for the truth, maybe to look for a way out of the simulation and away from AI, or maybe to live alone?

    What would you do about your other beliefs? For example, what about the time you got into a fight with a schoolyard bully? Or your first crush? Or your first heartbreak? Or all the fears you hold about the world? Would you still be scared of them? Everything that causes you depression is no longer real; even death may be some kind of passage to another simulation, or maybe to the real world.

    Would you walk around fearless? Would you stop caring about everything? Would you become homeless? (Why work if it's not real, right?)

    Would you question your maker, for example, what is the purpose of all this?

    Would you wonder if you ever had real parents? Or would you come to realize that life is better without parents, because the pain of losing them is very hard to bear?

    Would you think about life differently now? Perhaps you would conclude that love is not good, because the pain of losing someone is so great. Like your children, whom you loved, but who turned out to be computer programs.

    Maybe you will come to realize that you can be deceived about everything, even the people whom you love and trust.

    Do you think this is different from Plato's Cave or Descartes' Meditations? Here the situation seems to concern something very real to the human experience: love and emotional attachment. For example, since you were a baby your parents provided love and nurture for you. They put a lot of "artificial love" into you, and similarly, you've put a lot of love into them. Furthermore, you share deep romantic love with your spouse and unquestionable love for your children. You'd do anything for them, even sacrifice your life.

    I think Plato and Descartes describe objects that are easily forgettable to the human mind, like a rose or a vase. Perhaps they do so, because it is easy to contemplate their existence and non-existence. If family is involved, then it seems the situation becomes dire.

    What do we do?

    1) Do we stay behind and love our AI family, maybe even try to forget that they are AI?
    2) Do we let go of our loved ones, go out into the world, and search for the truth, no matter how frightening or painful it is?
    3) What other options or courses of action can you think of, if we found ourselves in this type of situation?

    Do you think this would be (pardon my French) "a shitty situation?" How bad would this type of situation be for you?
  • telex
    103
    So when Neo found out he was 'The One' in the Matrix, it didn't mean he was the chosen one. It meant he was literally 'The Only' person in the Matrix.
  • Pfhorrest
    4.6k
    AI people are still people, so if life in the virtual world is nice then enjoy it, and if it’s not... well, that depends on whether life outside would be better or not, and what it might take to get out there.
  • telex
    103


    You're making an assumption in this scenario that AI are people.
    My discussion says AI are computer programs that can perfectly mimic humans.
    Therefore, AI are not people in this scenario.
  • dimension72
    43
    Well, what would you do? You know this world better than anyone else.
  • telex
    103


    I think the problem you would constantly face is letting go.
    Whenever you go out into the world or wake up next to your spouse, you would have a huge headache.
    On the one hand, you've known your family for almost your entire life.
    On the other hand, based on this scenario, it is a fact that your family and everyone else is only a computer simulation.

    I think that would be the biggest issue. Whenever you run into people, your learned instinct would tell you that they are real. When in reality, they are only a computer program.

    So I think the biggest problem would be overriding your instincts about people.
    The other problem would be the huge heartbreak over your family: that they are only computer programs.
  • Pfhorrest
    4.6k
    I’m not disputing the scenario, I’m making a judgement about it: programs that perfectly replicate human functionality COUNT AS people as much as humans do.
  • telex
    103


    The problem here is that the human AI are not "replicating" but mimicking. In other words, they mimic emotions, but they don't feel anything. They mimic thought, but they don't think anything. They don't replicate emotions in any way comparable to a human. Behind the curtain there is nothing that feels anything.

    Hope that makes sense

    A human AI can say to you, "I think, therefore I am," but in reality it is only saying it, not thinking it the way a human would.
  • Pfhorrest
    4.6k
    It’s not possible to “see behind the curtain”, so all we can do is assume that AI who are functionally identical to humans either do or don’t have the same experience as humans. I’m saying that I operate on the assumption that they do, and that affects my decision; if you assume otherwise, of course that affects your decision too.
  • telex
    103


    This is a new discussion. My discussion involves only human AI that mimics but does not replicate humans.
  • telex
    103


    But in a way, what you bring up would be a huge problem. You would be constantly torn between feeling that human AI can feel, when in this scenario it is a fact that they do not feel or think. It would be a huge mental war.
  • TheMadFool
    13.8k
    In a world of only two people, X and Y, X can't be sure Y is real and Y can't be sure X is real. There are two relationships to consider:

    1. Reflexive: X assesses X and can come to the conclusion that X is real and Y assesses Y and concludes Y is real.

    2. Symmetric: X can't be certain Y is real and Y can't be sure that X is real.

    Reflexively, everyone is certain that they are real but symmetrically, everyone is uncertain about whether an other is real.

    Is there some way to resolve this issue?

    It's not possible, on pain of contradiction, that X is both real and not real, AND that Y is both real and not real. Yet there is no obvious contradiction here, because the "real" and "possibly not real" statuses of X and Y come from different perspectives, like objects that look different depending on the viewing angle.

    The way I see it, whether reflexive or symmetric, the claims (that X and Y are real, or that X and Y could be not real) inhabit the same universe, the one in which X and Y exist. In other words, it's safe to say that "X is real for X and possibly not real for Y" and "Y is real for Y and possibly not real for X" are declarations made within the shared universe in which both X and Y exist. Despite the differing perspectives, the claims of X and Y aren't restricted to the perspectives themselves; their subjective (perspective-dependent) nature is ignored and they're treated as objective truths about our reality: X is real AND X could be not real, Y is real AND Y could be not real, and there we have our contradiction. Solipsism suffers from a contradiction. :brow: :chin:
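
    A rough formalization of that step (my own notation, not in the original post): while the claims stay tagged with the perspective they are made from, there is no clash; the tension only appears once the perspective tags are dropped and all four claims are treated as objective truths.

    ```latex
    % Perspective-tagged claims (no clash):
    \vdash_X \mathrm{Real}(X), \quad \vdash_Y \mathrm{Real}(Y), \quad
    \vdash_X \Diamond\neg\mathrm{Real}(Y), \quad \vdash_Y \Diamond\neg\mathrm{Real}(X)

    % Perspective tags dropped, claims read as objective truths:
    \mathrm{Real}(X) \land \Diamond\neg\mathrm{Real}(X), \qquad
    \mathrm{Real}(Y) \land \Diamond\neg\mathrm{Real}(Y)
    ```

    Here $\vdash_X$ marks "asserted from X's perspective" and $\Diamond\neg\mathrm{Real}(\cdot)$ marks "could be not real"; the second pair of conjunctions is what the argument reads as contradictory. (Whether "real and possibly not real" is a genuine contradiction depends on how the "possibly" is interpreted.)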
  • telex
    103


    Yes, that's why I tried to avoid this contradiction, TheMadFool, by setting an unquestionable rule in this scenario: only 1 person is real, while everyone else only mimics emotions but does not feel or think anything. (I think this is a huge problem on a philosophy forum, to create an unquestionable rule, because everyone here questions everything :smile: )
  • telex
    103
    The Matrix, I think, is silly in some sense. While it's a great presentation of a dream world based on our reality, I think it would take you years to truly unplug from your beliefs. Maybe even a decade or two. Yet Neo didn't even think about it; he just went with it. I think I would have to sit there and think a lot.