• wax
    301
    The dilemma where you have the power to divert a runaway rail truck so that it would kill one person, rather than stay on its course of killing multiple people.

    I think part of the problem really is that many people like to promote the idea that inaction leaves people without responsibility.

    Maybe this was the intent of the dilemma, to expose this.

    I think if you are in a position to do something, and then you don't, you still have responsibility for the outcome of events.

    But this can be inconvenient for society in things like the way the world is going environmentally.

    I like Asimov's first law of robotics:
    "A robot may not injure a human being or, through inaction, allow a human being to come to harm."

    edit: ok, it is more complex than that... maybe deeply rooted in humans is the idea that not taking action actually does leave someone without responsibility...
  • I like sushi
    4.3k
    There are other variations that show choices are affected by physical distance - e.g. far fewer people feel comfortable about pushing someone onto the track to halt the train in order to save others.
  • Terrapin Station
    13.8k
    Although I don't agree with this:
    I think if you are in a position to do something, and then you don't, you still have responsibility for the outcome of events.wax

    And I don't see the "runaway railtruck" dilemma as much of a dilemma. Ceteris paribus, I'd not have to think for a moment that I'd divert the railtruck to kill just one person.

    However, I can see some merit to it being a dilemma to some folks because of what you point out. The way I look at it, though, is that people are going to be killed in the scenario no matter what I do, so I might as well cause fewer of them to be killed, barring good reasons other than simple numbers why one side should be chosen over the other.
  • wax
    301


    ok, maybe my first post was too simplistic... I don't particularly agree with the philosophy that the greater the number in a group the more value it has, compared to a smaller group.

    I would also take the position that I hadn't asked to be in the position of being able to choose the following events... most likely I would run away from the points lever, or swap it back and forth.. :)
  • TheMadFool
    13.8k
    Let's consider the trolley problem.

    Group A: we should kill one to save the many

    Group B: we shouldn't

    Group A are consequentialists, and their response is logically entailed by their theory.

    Group B could be classified as deontologists, who would/should never use people as means.

    Who is right? That's the question isn't it?

    Consequentialism has a serious flaw, doesn't it? Who on this Earth can accurately trace the causal chain into the future? Beyond the immediate effects, one is totally in the dark about future states. What if the people saved turn into Hitlers or Stalins? What if the person killed would've saved the world from the apocalypse?

    A consequentialist could say that we should remain focused on the present and concern ourselves only with immediate consequences. That seems absurd, because it's a myopic understanding of consequences and causality. We pride ourselves as good planners and claim long-term, goal-oriented behavior, but it's impossible to fully comprehend causality because it's too complex.

    I think consequentialism requires us to be omniscient which is impossible.
  • wax
    301
    I think consequentialism requires us to be omniscient which is impossible.TheMadFool

    I wonder if it is even possible to be omniscient.

    So there has to be a decision about what to do with the railway point handle.

    If you decide course A, then what led you to decide that course?
    If you decide course B then the same.

    If you assume the choice you made is rooted in something, then the whole set-up could be rooted in the same... I think this fits in with my idea that you can't stand outside reality and view it independently, so a potential candidate for omniscience (maybe God) is still part of the system, and can't differentiate himself from it, and so can't know everything, including the outcome of a decision.
    If that makes sense..
  • Filipe
    25
    I agree with @Terrapin Station. I do not see it as a dilemma at all, and I would go as far as saying that even when you introduce the idea of family among the victims, I would still act and I would sleep OK at night... (there is a variation of the problem where the "lonely person" is actually part of your close family). In the normal situation I would choose the 1 person, but in the new scenario I would choose to let the train hit the 5 people.
  • Kaz
    15
    Too context-dependent. Try making a difficult moral choice with a headache. More seriously, I think that context will shape the decision more than this empty theorising. I don't believe that humans, at large, make their "moral" decisions based on an intellectualised system of ethics. Such decisions are much more intuitive.
  • T Clark
    13k


    I really hate the trolley problem. It's so !@#$% stupid. I'll have more to say about that in a later post. First off, I want to play a routine by one of my favorite comedians, John Mulaney, that I think is relevant. Different thought experiment, similar issues.

  • T Clark
    13k


    Ok, so let's talk about the trolley problem (TP). Here are some expansions/extensions of TP.

    TP-A - First, let's address the kinds of issues raised in John Mulaney's routine, above:
    • How am I so sure throwing the fat guy off the bridge will stop the trolley? Seems unlikely to me.
    • There's a greater than 0, probably much greater, chance the trolley driver will see the guys on the track in time and will stop.
    • There's a good chance that not all the guys on the track will be killed.
    • Are there other actions that might get the trolley to stop in time?
      The fat guy and I could yell really loudly. We could drop our backpacks on top of the train. I could jump down and wave my arms to try to stop the train. Maybe I'd break my leg, but nobody would have to die. Any of those seem as likely to stop the trolley as dropping the fat guy.
    • What are the odds the fat guy is just going to let me throw him off the bridge?

    All of these uncertainties will change the moral calculus of the dilemma. Now, some more TPs to address the moral rather than practical aspects of the problem.

    TP-B - If you drop the fat guy off the bridge, you will probably be, rightly, prosecuted for and convicted of murder. Are you willing to spend 20 years in jail for your moral convictions?

    TP-C - What if you are the fat guy? Are you willing to jump yourself? I probably wouldn't be. Even if I were willing, there's a good chance I wouldn't have the courage. What gives me the right to make that decision for the fat guy?

    TP-C1 - Even if you aren't the fat guy, do you really think that there's a significant difference between his body and yours? The train weighs tons. Do you think a couple of hundred extra pounds will make a difference? Seems unlikely to me.

    Let's ask the fat guy:
    FG - "Hell no, you're not going to throw me off the fucking bridge. Wait, here, I'll throw you."
    You - "Aaaaaaa!!!" Splat.
    FG - "Well, shit. It didn't stop the train after all, but the workers heard the splat and moved off in time. I call that a win-win situation."

    Did I mention I hate the trolley problem?
  • Andrew4Handel
    2.5k


    There are different types of inaction it seems.

    Inaction might be classed as immoral in some circumstances, such as not preventing a child from drowning when you easily could.

    In others, such as not supporting a war or dictatorship, it could be seen as moral.

    I do think inaction makes you less culpable than action. I think problems are usually caused by action first. For example environmental problems are caused by our actions.

    It can feel like positive action is futile when it is trying to combat a flood of negative actions.
  • I like sushi
    4.3k


    Nope. I think you got it right the first time. As is, it makes more sense to have a hand in preserving two human lives than to preserve one human life (the problem is set out initially as a hypothetical, and therefore they're hypothetical 'generic humans').

    If you scale it up to save one human and allow 100,000 to die, then saying "doing nothing is the better moral choice" is abhorrent in my mind.

    An underlying issue is that the refusal to decide is for selfish reasons and the social preservation of your public self. It's a convenient lie we fall into telling ourselves when answering hypothetical moral questions and presenting them publicly. We wish to show both our supposed sense of acute morality as well as our rational thinking.

    I generally view the hypothetical as a commonly misused exercise. We've seen above comments about "hating" this problem, or even the claim that there is no problem at all. I have a number of strong arguments for what I am saying, so throw what you've got at me. I don't want to take over the thread, though, but I've sure got a lot to say about this subject ;)

    I also came up with my own little version of this problem that catches many people out.
  • BC
    13.2k
    The dilemma where you have the power to divert a runaway rail truck so that it would kill one person, rather than stay on its course of killing multiple people.wax

    But Wax, if we agree that people are stupid feckless fools, then why would we ever want to deflect the killer caboose from its appointed rounds? The trouble is that the ruthlessly soft-tissue squishing railroad is running over too few stupid fools, rather than too many.

    The obese unit that is standing next to you on the bridge should of course be thrown off the overpass to improve public health stats, but in no way should the fat unit deflect the bone-crunching skull squashing vehicle.
  • noAxioms
    1.3k
    Some random thoughts.

    The dilemma where you have the power to divert a runaway rail truck so that it would kill one person, rather than stay on its course of killing multiple people.wax
    This is the trolley problem, except I presume we are killing the truck operator, who arguably has some fault in letting this situation come about. The trolley problem usually involves the death of one or multiple total innocents, and thus is more of a pure moral dilemma.
    I like Asimov's first law of robotics:"A robot may not injure a human being or, through inaction, allow a human being to come to harm."
    This would cause war with the robots and possibly end humanity. A robot would have to attempt to imprison everybody in the equivalent of a padded zoo cell to keep them safe. Pregnancy would be prevented, since it carries a significant chance of harm.
    Computers/robots are quite literal with their instructions, and one would need to craft such directives much more carefully than those three simple laws. The third law is self-preservation, but that might rise to the top law, since if the robots were to come to harm, they would not be able to implement the first law. Hence the war.

    What if the person killed would've saved the world from the apocalypse?TheMadFool
    Positing unknowns is useless. The guy saving the world from the apocalypse is more likely to be in the larger group. Maybe the apocalypse is exactly what the world needs.
    Group A: we should kill one to save the many
    By this argument, it is moral to disassemble a healthy person to distribute organs to multiple people in mortal need of them. From a more practical standpoint, this argument holds water. Take 20 people in need of 20 different organs and too far down the waiting list to survive. They draw lots, and one of them gives his healthy parts to the 19 others. This makes so much sense (even if I were one of them) that I don't see why it isn't done.

    TP-B - If you drop the fat guy off the bridge, you will probably be, rightly, prosecuted for and convicted of murder. Are you willing to spend 20 years in jail for your moral convictions?T Clark
    The law is indeed on the side of doing nothing. Saving 20 lives at the cost of one different life is a punishable offense.

    Talk of the fat guy is an argument from emotion. Sacrifice the repellent guy in favor of multiple children and puppies? Somehow I don't think the solution to this lies along such biases.
  • TheMadFool
    13.8k
    The guy saving the world from the apocalypse is more likely to be in the larger group.noAxioms

    Don't forget to include Stalin and Hitler too in the 'larger' group.
  • wax
    301
    This would cause war with the robots and possibly end humanity. A robot would have to attempt to imprison everybody in the equivalent of a padded zoo cell to keep them safe. Pregnancy would be prevented, since it carries a significant chance of harm.noAxioms

    yea, the thing I like about the 'first law of robotics' isn't how useful or not it would actually be in a robot-run world; it is just how I think a lot of people try to use the idea of not taking action as leaving them without responsibility.
    If, say, I was made aware that a zombie apocalypse had begun, or at least that there were a few zombies wandering about, but decided 'nah, can't be bothered to report that to the police', then that wouldn't leave me without responsibility.