• alcontali
    1.3k
    I do not see how you are challenging premise 2. Does the machine value anything? No, it is a machine.Bartricks

We would first have to agree that a morality is a set of rules. In that case, derived moral rules are not "valued" but evaluated. A machine is perfectly capable of verifying such an evaluation.
  • Bartricks
    6k
    We don't have to agree to that at all - and I don't agree to it. You mentioned rules, I simply pointed out that rules or no you're not generating a counterexample to premise 2.
  • Bartricks
    6k
    I think there are or could be moral rules, but there don't have to be. But rules require a ruler just as values require a valuer. All roads point to a subject.
  • TheMadFool
    13.8k
    Validity alone doesn't make an argument good.

    If I take one of the premises: If I value something it is not necessarily morally valuable, it leads to a conclusion that contradicts your conclusion. There's an inconsistency in your position.
  • Bartricks
    6k
    I know, that's why I pointed out that the arguments are valid and that they have premises that are true beyond a reasonable doubt. That is, they appear to be sound arguments. It doesn't get better than a sound argument.

There's no contradiction. That premise, combined with the premise "If moral values are my valuings then if I value something it is necessarily valuable", entails the conclusion that moral values are not my values. That is consistent with moral values being someone's values (for I am not everyone).
  • alcontali
    1.3k
    I think there are or could be moral rules, but there don't have to be. But rules require a ruler just as values require a valuer. All roads point to a subject.Bartricks

    So, let's forget about any ineffable moralities that cannot be expressed in rules, and limit ourselves to moralities that can. Humans can make inferences based on rules, but are not required for that purpose. Machines are perfectly capable of executing inference engines. Therefore, machines can reach conclusions from a set of rules.

    Therefore, I have to reject that "rules require a (human) ruler" or "evaluations require a (human) evaluator":

    Rule-based systems. In computer science, a rule-based system is used to store and manipulate knowledge to interpret information in a useful way. It is often used in artificial intelligence applications and research. Normally, the term rule-based system is applied to systems involving human-crafted or curated rule sets. Rule-based systems constructed using automatic rule inference, such as rule-based machine learning, are normally excluded from this system type.

    A rule-based system is NOT a human at all, even though humans can do it too (usually with lots of errors, though).
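The claim that machines can reach conclusions from a set of rules can be illustrated with a toy forward-chaining engine. This is a minimal sketch only; the rule names and facts below are invented for illustration and are not from any real system.

```python
# Minimal forward-chaining inference sketch (all symbols are hypothetical).
# Each rule maps a set of premise facts to a conclusion fact; the engine
# fires rules repeatedly until no new facts can be derived.

def infer(facts, rules):
    """Apply (premises, conclusion) rules to a fact set until a fixed point."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

# Toy "morality expressed as rules":
rules = [
    ({"promised(x)"}, "obliged_to_keep_promise(x)"),
    ({"obliged_to_keep_promise(x)", "broke_promise(x)"}, "acted_impermissibly(x)"),
]

conclusions = infer({"promised(x)", "broke_promise(x)"}, rules)
print("acted_impermissibly(x)" in conclusions)  # True
```

The engine reaches the conclusion purely mechanically: no step depends on anyone valuing anything, only on set inclusion.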
  • Bartricks
    6k
    You're conflating descriptive rules with normative rules.

Morality, if it involves any rules, is going to involve normative rules. And it is those that require a ruler.
  • alcontali
    1.3k
    Morality, if it involves any rules, is going to involve normative rules. And is those that require a ruler.Bartricks

    Normative rules lead to rulings, which are simply language expressions. A machine can traverse rules and produce a ruling. There is no need for a human to do that.
  • Bartricks
    6k
    No, a normative rule is a prescription. It tells you to do something. Only a subject can tell you to do something.
  • Bartricks
    6k
    And note, I did not say that moral values and prescriptions are human values and prescriptions. I concluded that they are not.
  • TheMadFool
    13.8k
    I know, that's why I pointed out that the arguments are valid and that they have premises that are true beyond a reasonable doubt. That is, they appear to be sound arguments. It doesn't get better than a sound argument.

There's no contradiction. That premise, combined with the premise "If moral values are my valuings then if I value something it is necessarily valuable", entails the conclusion that moral values are not my values. That is consistent with moral values being someone's values (for I am not everyone).
    Bartricks


    Are you changing your argument?

    "If moral values are my valuings then if I value something it is necessarily valuable"

The above statement is incoherent. How do you go from "moral values are my valuings" to "it is necessarily valuable"? In fact, that means that there's no need for "a god" at all. After all, anyone's values are necessarily valuable without the need for that ultimate/final god.
  • alcontali
    1.3k
    No, a normative rule is a prescription. It tells you to do something. Only a subject can tell you to do something.Bartricks

    Not true at all.

    The answer to a jurisprudential question (such as a fatwa) will declare a particular behaviour to be morally permissible or impermissible. These rulings do not tell you what to do, because you do what you want. Still, if you claim to accept the basic rules of the underlying morality then you may want to remain consistent by accepting all its consequences.

    Example ruling: "He worked as a programmer for a company but they did not give him his dues; can he sell some of their programs to get his money?"

    Whatever the answer may be, there is nobody who will force the person asking the question to act in accordance with the answer. There is, however, a real influence that goes out of the answer, because the person asking the question is likely to want to remain consistent with his basic beliefs. Otherwise, he would obviously not even ask this question.

Morality is much, much more about consistency than about enforcement. In fact, in certain ways, morality has a real and noticeable propensity to enforce itself ...
  • Bartricks
    6k
    No, that was my original argument for thinking that moral values, though the valuings of a subject, are not my valuings.

There's nothing incoherent about it. Look, because pain is a feeling, if I feel in pain then necessarily I am in pain, yes? Likewise then, if moral values are my values - that is, if my valuing something thereby makes it morally valuable - then if I value something necessarily it will be morally valuable. Which it isn't, of course. Hence, moral values are thereby demonstrated not to be constituted by my valuings.
  • Bartricks
    6k
    Again, you're not comparing like with like.
    Moral norms are prescriptions. That's just what a norm is. Well, it's more of a rag bag than that. Moral philosophers often characterise them as 'favourings'. Doesn't matter. Favourings require a favourer.
  • Bartricks
    6k
    I am unclear what your point is. The word 'objective' is ambiguous - it can mean 'goal', it can mean 'impartial' and it can mean 'exists outside of subjects of experience'. So, to avoid confusion I stipulated what it was going to mean in this thread - it was going to mean 'exists outside of subjects'.

I then argued that it is not possible for moral values to exist in this way. Moral values must be subjective because they are valuings and only subjects value things.
  • alcontali
    1.3k
    Moral norms are prescriptions. That's just what a norm is. Well, it's more of a rag bag than that. Moral philosophers often characterise them as 'favourings'. Doesn't matter. Favourings require a favourer.Bartricks

    No, in principle, rulings are produced, or at least verified, entirely mechanically by a rule-based system, i.e. a machine, from a set of basic rules.
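The "produced, or at least verified, entirely mechanically" claim can be sketched as a checker that accepts a ruling's derivation only if every step is licensed by some basic rule. This is a minimal sketch; the rules and fact symbols are hypothetical placeholders.

```python
# Sketch of mechanically verifying a ruling: a claimed derivation is accepted
# only if each step follows, via some rule, from what is already established.
# Rules are (premises, conclusion) pairs; all symbols here are hypothetical.

def verify(base_facts, rules, derivation):
    """Check that each derivation step is licensed by a rule whose premises hold."""
    known = set(base_facts)
    for step in derivation:
        if not any(premises <= known and conclusion == step
                   for premises, conclusion in rules):
            return False  # step not justified by any rule
        known.add(step)
    return True

rules = [({"a"}, "b"), ({"a", "b"}, "c")]
print(verify({"a"}, rules, ["b", "c"]))  # True: each step is licensed
print(verify({"a"}, rules, ["c"]))       # False: "c" needs "b" first
```

Verification here is purely a matter of checking set inclusion step by step; no human judgement enters into it.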
  • Wayfarer
    20.8k
    I am unclear what your point is.Bartricks

    The point is, 'subjectivism' is generally considered a very weak position in moral philosophy. Why? Because it reduces moral propositions to 'what I think is right (or not)'.

    The problem is, that objective measurement doesn't tell us anything directly about moral goods. This is the root of the famous 'is-ought problem' of David Hume.

So I am trying to extend the analysis to draw out the deeper meanings of 'objective' and 'subjective' and why moral issues are said to be subjective but not objective. There's a philosophical issue involved which I think is deep, difficult and important. So I perfectly understand why you are having trouble seeing it, but nevertheless I think it's an important point.
  • TheMadFool
    13.8k
    No, that was my original argument for thinking that moral values, though the valuings of a subject, are not my valuings.

There's nothing incoherent about it. Look, because pain is a feeling, if I feel in pain then necessarily I am in pain, yes? Likewise then, if moral values are my values - that is, if my valuing something thereby makes it morally valuable - then if I value something necessarily it will be morally valuable. Which it isn't, of course. Hence, moral values are thereby demonstrated not to be constituted by my valuings.
    Bartricks

    Well, I'm definitely not enjoying this ride. The territory is not familiar and the guide (you) is either too knowledgeable or himself/herself perplexed by the domain of discourse.

You DO NOT value your own valuations, as is evidenced by your trying to shift the burden onto someone/something else, a poor helpless god who, it appears, is just there to be the bearer of your values. This is inconsistent, because if your valuations of morals are insufficient to convince or satisfy you, how will they satisfy you when all you do is invent an ultimate/final valuator? Considering "a god" is better than us humans, how will morals that are pointless because we value them suddenly become worthwhile by making "a god" value them? It only makes sense if you value morals because "a god" values them. Euthyphro had an issue with Socrates on that.
  • Bartricks
    6k
Well, top marks for not bothering to address anything I actually argued.
  • Bartricks
    6k
I do not follow you. I have not invented anything, I have just followed an argument to its logical conclusion. Moral values are valuings (what else could they be?) and valuings are something subjects have a monopoly on. Yet they are not my valuings or anyone else's, apart from the person whose valuings they are.
    Note, I am not saying I value or do not value moral values. I am talking about what they are - that is, what they are made of - not my own attitudes towards them.
    As for the Euthyphro - what's the problem?
  • Terrapin Station
    13.8k
    Because the subject whose values constitute moral values would be a god. Moral values are not my values or your values, but they are someone's (as the argument demonstrates). And that someone would be a god precisely because their values constitute moral values.Bartricks

    Say what? This is a complete non-sequitur with respect to your earlier comments.

    1. If moral values are my valuings then if I value something it is necessarily morally valuable
    2. If I value something it is not necessarily morally valuable.
    Bartricks

    (1) needs a "to you" at the end. It's just like If you feel pain, then necessarily you feel pain. It's about what's the case in your mind.

    Where is (2) coming from?

Re (1), otherwise, the move you're trying to make is akin to this (excusing using "pain/paining" as both a verb and a noun): "If my leg paining is my pain, then if I pain something it is necessarily a pain"--where you're trying to suggest a universal scope at the end (as if everyone's leg should be paining then), rather than keeping the scope as something about you.
  • Bartricks
    6k
    I don't think you know what a non sequitur is.
    And no, nothing needs to be changed. The argument was deductively valid and both premises - as I wrote them, not as you might re-write them - are true. Deal.
  • Terrapin Station
    13.8k


    Re your first premise, "then if I value something it is necessarily morally valuable"--to whom?
  • Bartricks
    6k
No one. Just 'morally valuable'. If I value something, then it is valuable to me, yes? But if something is morally valuable then it is valuable irrespective of whether I value it, yes?

The point - which you don't seem to be grasping - is that my valuing of something is not of a piece with it being morally valuable. They're different. Not the same. Different.
  • Bartricks
    6k
    So, rather than rewriting my premises or assuming I've incorrectly written them, just address them. That is, try to take issue with one. Again, with one of my premises as I have written them.

    Here, for your convenience, is the argument thus far:

    1. For something to be morally valuable is for it to be being valued.
    2. Only a subject can value something
    3. Therefore, for something to be morally valuable is for it to be being valued by a subject.
    4. If moral values are my valuings, then if I value something necessarily it is morally valuable
    5. If I value something it is not necessarily morally valuable.
    6. Therefore, for something to be morally valuable is for it to be being valued by a subject who is not me.
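The deductive core of steps 4-6 is an instance of modus tollens, and its validity can be checked formally. A minimal Lean sketch, with propositional abbreviations invented here (it covers only the 4-5 inference to "moral values are not my valuings", not the full conclusion 6):

```lean
-- Abbreviations (invented for this sketch):
-- Mine : moral values are my valuings
-- Nec  : if I value something, it is necessarily morally valuable
variable (Mine Nec : Prop)

-- Premise 4: Mine → Nec.  Premise 5: ¬Nec.  Hence ¬Mine (modus tollens).
example (p4 : Mine → Nec) (p5 : ¬Nec) : ¬Mine :=
  fun h => p5 (p4 h)
```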
  • Bartricks
    6k
Like I say, you're confusing descriptions with prescriptions. Moral rules, if there are any, are prescriptions. Now, can a machine issue a prescription? No, not literally. Someone can programme a machine to issue prescriptions, but then those prescriptions qualify as prescriptions only because we can trace them to a subject whose attitudes they express.

    For example, imagine that meteorological conditions bring it about that the clouds above your head temporarily form into shapes that look to you like the words "buy some milk!" Are you being instructed to buy some milk? No, obviously not. Why? Because the clouds were not expressing the attitude of any subject - it was just a freak meteorological occurrence. We can describe why it happened by appealing to laws of nature - but those laws, note, are descriptive not prescriptive. Which is why explaining why it happened will not amount to showing that you were, in fact, being told to buy some milk. You were not being told to buy some milk and what appeared to you to be a prescription was no such thing at all, just some clouds.

    Now, for it really to be the case that there is a prescription against being cruel, say - and there obviously is such a prescription, for virtually all of those possessed of reason recognise that there is - there would need to be a subject whose attitudes that prescription expresses.
  • Terrapin Station
    13.8k
    No one.Bartricks

    So that's a problem as I pointed out. Valuations are always to someone. You can't have a valuation to no one.
  • Bartricks
    6k
    Well as the argument demonstrates, moral values are not the valuings of you or I, but of another subject (and the 'of' in that sentence denotes not the object of the valuings, but the valuer - the one doing the valuing). So, the 'someone' you're referring to just is the valuer whose valuings constitute moral valuings. So, er, yes - the whole point of the argument was to show that moral values are always the values of a person. The point, though, is that the person is also demonstrably not you or I, but a god. Which is why it makes no sense to say "Xing is morally valuable to me".
  • Bartricks
    6k
    So, once again, and for the last time, which premise are you disputing?
  • Terrapin Station
    13.8k
    Well as the argument demonstrates, moral valuations are not the values of you or I, but of another subject.Bartricks

    Which would mean that your first premise is

"If moral values are my valuings then if I value something it is necessarily morally valuable to another subject"

    or

    "If moral values are my valuings then if I value something it is necessarily morally valuable to God"

Neither one of those seems noncontroversial as a premise, does it?