• Vera Mont
    4.6k
    I don't follow you, Vera. I referred to pleasure as a concept, not particular instances or "experiences" (and "accessed via drugs" has nothing to do with Epicurus – check the three links I provided for clarification in the context of my response).180 Proof
    I'll do that when I have a little more time.

    In this thread my references to pleasure were in response to this
    AI will solve the purpose of human existence and he lists some things like, if pleasure is the goal then we'd just be hooked up to drugs all the time without needing to bother with experiences. That sounds like either ruining the human experience or "revealing" it for what it is, that being just chemical reactions with our storytelling to make it seem like more.Darkneos
    and the cartoon-laden Quora post, which he can't argue against.
  • Vera Mont
    4.6k
    I'm back from doing necessary tasks, several of which gave me a low-level pleasure in the completion. I read the links and remain unsatisfied. Absence of pain, irritation, frustration, or whatever is not enough. Some of the greatest pleasures I experience are freebies: frog-song on a spring evening, a good joke, the scent of cilantro on my hands, a few strains of Beethoven accidentally heard through a window, the tender mauve light of early morning, the trusting paw offered by a dog - these pleasures are extras, above the absence of pain and frustration.

    Equanimity and tranquillity are fine, contentment is better, but happiness is achieved when those little pleasures are added to contentment. That may well be just a bunch of chemicals telling one another stories, but I don't think they can be artificially induced - at least, not yet - because the one missing component is being there: the conscious awareness of one's fortunate condition and the commitment to support its various components.
    From the little I know of Epicurus, he knew this, too.
  • Janus
    16.9k
    Some would argue that's just storytelling, making things out to be more than what they really are.Darkneos

    "What they really are" is just another story. Discursively rendered, what anything really is depends on how you are looking at it.
  • Philosophim
    2.9k
    1. What do you mean here by "morality"?180 Proof

    A system that evaluates the consequences of a decision holistically, not merely against a narrow goal, to determine the best action in a particular circumstance.

    2. In what way does suffering-focused ethics fail to be "objective" (even though, like the fact Earth is round, there is (still) not universal consensus)?180 Proof

    Because it doesn't hold up if we treat it as an objective principle. Suffering is a subjective principle in many cases. Take two people who are working at a job and look at them from the outside. How do we know how much suffering each has? What if both express how much pain they're in, but the first person is lying and the second person is not?

    And this is only in regard to one specific kind of suffering: pain. How do we compare and contrast the pain of losing money to taxes with the relief from suffering of someone who doesn't pay taxes? Is inequality of outcomes suffering? Should we all win at games and eliminate competition? Is the discipline of exercise or diet for a healthy weight suffering?

    Finally, because suffering is subjective, it relies on the human emotion of sympathy, something an AI does not have. It needs something objective. Measurable. Ironically, a measurable morality may be beyond the complexity of humankind, and only a computer will have the ability to process everything needed.

    3. Why assume that "AI" (i.e. AGI) has to "reference" our morality anyway and not instead develop its own (that might or might not be human-compatible)?180 Proof

    What you're saying is that morality is purely subjective. And if it is, there is a whole host of problems that subjective morality brings; "Might makes right" and "It boils down to there being no morality" are just a few.
  • Janus
    16.9k
    Because it doesn't hold up if we treat it as an objective principle. Suffering is a subjective principle in many cases. Take two people who are working at a job and look at them from the outside. How do we know how much suffering each has?Philosophim

    Empathetic people know when others are suffering. Suffering is an objective fact; if someone suffers they suffer regardless of whether anyone knows about it.
  • Philosophim
    2.9k
    Empathetic people know when others are suffering. Suffering is an objective fact; if someone suffers they suffer regardless of whether anyone knows about it.Janus

    I wish that were true. What you're describing is human empathy, which is a subjective experience. We're talking about an objective morality, which literally has zero feelings behind it. An objective morality should be measurable, like a liter of cola. It is not a measure of how much someone personally likes or dislikes cola.
  • Janus
    16.9k
    What you're describing is human empathy which is a subjective experience.Philosophim

    Whether someone feels empathy for others or not is an objective fact, just as whether or not someone suffers is an objective fact.
  • Philosophim
    2.9k
    Whether someone feels empathy for others or not is an objective fact, just as whether or not someone suffers is an objective fact.Janus

    Right, but a moral system needs an objective measuring system. All feelings are objectively felt by every being that has those feelings, but the feeling itself is a subjective experience that no one can measure. We can measure brain states or actions, but not the feeling of being that person itself.
  • Janus
    16.9k
    What do we need to measure? If we are empathetic, we know when someone is suffering. The idea of an objective morality is, as much as possible, to avoid causing others to suffer. It is not so much a matter of a moral system; it is more a matter of having a moral sense.
  • Philosophim
    2.9k
    What do we need to measure? If we are empathetic, we know when someone is suffering. The idea of an objective morality is, as much as possible, to avoid causing others to suffer. It is not so much a matter of a moral system; it is more a matter of having a moral sense.Janus

    Look at it like this.

    Suppose I have subjective empathy, and that causes me to give $5 to a person who needs it. Now suppose I don't have subjective empathy, but I have objective knowledge that a person needs $5, so I give it to them. The action of giving $5 is correct because it actively helps them. Whether I feel it helps them or not is irrelevant. I do lots of things I deem good without any feelings behind them, Janus. Sometimes I don't want to do them, but I do anyway because the situation calls for it. Morality is not a feeling. That's just someone being directed by their own emotions.
  • 180 Proof
    15.7k
    these pleasures are extrasVera Mont
    Yes, and they are consistent with, or not excluded by, what Epicurus (or disutilitarianism) says about pleasure as a moral concept and practice.

    Suffering is a subjective ...Philosophim
    Which of the following are only "subjective" (experiences) and not objective, or disvalues (i.e. defects) shared by all h. sapiens w i t h o u t exception (and therefore are knowable facts of our species):

    re: Some of h. sapiens' defects (which are self-evident as per e.g. P. Foot, M. Nussbaum): vulnerabilities to

    - deprivation (of e.g. sustenance, shelter, sleep, touch, esteem, care, health, hygiene, trust, safety, etc)

    - dysfunction (i.e. injury, ill-health, disability)

    - helplessness (i.e. trapped, confined, or fear-terror of being vulnerable)

    - stupidity (i.e. maladaptive habits (e.g. mimetic violence, lose-lose preferences, etc))

    - betrayal (i.e. trust-hazards)

    - bereavement (i.e. losing loved ones & close friends), etc ...

    ... in effect, any involuntary decrease, irreparable loss or final elimination of human agency.
    180 Proof

    also, my reply to you (2024) ...
    https://thephilosophyforum.com/discussion/comment/903818

    Why assume that "AI" (i.e. AGI) has to "reference" our morality anyway and not instead develop its own (that might or might not be human-compatible)?
    — 180 Proof

    What you're saying is that morality is purely subjective.
    Philosophim
    This is precisely the opposite of what I've said. Maybe this old post clarifies my meaning ...

    Excerpts from a recent [2024] thread, Understanding ethics in the case of Artificial Intelligence ...

    I suspect we will probably have to wait for 'AGI' to decide for itself whether or not to self-impose moral norms and/or legal constraints and what kind of ethics and/or laws it may create for itself – superseding human ethics & legal theories? – if it decides it needs them in order to 'optimally function' within (or without) human civilization.
    — 180 Proof

    My point is that the 'AGI', not humans, will decide whether or not to impose on itself and abide by (some theory of) moral norms, or codes of conduct; besides, its 'sense of responsibility' may or may not be consistent with human responsibility. Whatever 'AGI' decides, it will decide for its own reasons, which humans might or might not be intelligent enough to either grasp or accept.— 180 Proof
    180 Proof
  • Janus
    16.9k
    I have subjective empathy and that causes me to give a person 5$ who needs it. I don't have subjective empathy but I have objective knowledge that a person needs 5$ so I give it to them. The action of giving 5$ is correct because it actively helps them.Philosophim

    Right, the act of helping them is correct, the act of harming them not correct. There you have objective morality in a nutshell.

    Morality is not a feeling. That's just someone being directed by their own emotions.Philosophim

    When I spoke of a "moral sense" I did not have in mind any mere feeling. Sure, you could do what you think is the right thing, helping someone, without actually feeling any empathy. In that case, what would you be motivated by? Is that motivation to help, even absent any empathy, not a moral sense, a sense of what is right and wrong?

    Also, I spoke of not causing others to suffer, actively helping others is a more complex issue.
  • Darkneos
    918
    The central mistake of that hypothesis is the inaccurate equation of pleasure with happiness. As I've attempted to demonstrate earlier, pleasure is simple and fleeting; happiness is sustained and complex.Vera Mont

    But if it's all chemicals, what's the difference?

    https://x.com/Merryweatherey/status/1516836303895240708

    Are those meanings the same in ancient Greek and modern English? I think Epicurus had a wider vocabulary of pleasures, or pleasurable experiences, than can be accessed via drugs.Vera Mont

    I mean if we are talking about the brain isn't it all chemical reactions? Like the comic is saying, you would get the same chemicals from doing anything so why not plug in?

    I still haven't stopped trying to find another way around it, this is very distressing. Though I feel that wanting a solution would just be proving the thought experiment right.
  • Darkneos
    918
    Which of the following are only "subjective" (experiences) and not objective, or disvalues (i.e. defects) shared by all h. sapiens w i t h o u t exception (and therefore are knowable facts of our species)180 Proof

    I'd have to agree with them: it doesn't matter if humans share them (though not all humans do), they're still subjective feelings, not objective facts. Everything on that list is a subjective feeling, and not everyone feels the same about all of them.

    I know some Buddhist monks who wouldn't suffer from any of those, for example, and that's just one case; therefore it's not objective but subjective.

    As for AGI, I guess there is no point in speculating about it, since if such a thing did come to pass, its computing power would be far beyond our ability to comprehend or do anything about.

    Humanity isn't ready for such a scenario.
  • 180 Proof
    15.7k
    Everything on that list is subjective feelingsDarkneos
    Nonsense. Human facticity is not "subjective". Being raped or starved, for example, are not merely "subjective feelings", just like loss of sustenance, lack of shelter, lack of sleep, ... lack of hygiene, ... lack of safety ... injury, ill-health, disability ... maladaptive habits ... those vulnerabilities (afflictions) are facts of suffering.
  • Darkneos
    918
    Nonsense. Human facticity is not "subjective". Being raped or starved, for example, are not merely "subjective feelings", just like loss of sustenance, lack of shelter, lack of sleep, ... lack of hygiene, ... lack of safety ... injury, ill-health, disability ... maladaptive habits ... those vulnerabilities (afflictions) are facts of suffering.180 Proof

    Incorrect, again. It's not facticity, it's subjective. Those are also not facts of suffering, Buddhism and Eastern philosophy already addressed that.

    These are merely subjective; no matter how bad they are to the person experiencing them, that doesn't make them any more a fact than any other feeling.
  • 180 Proof
    15.7k
    So you believe that there isn't any aspect of suffering that is a fact of the human condition (i.e. hominin species)?
  • praxis
    6.6k
    Incorrect, again. It's not facticity, it's subjective. Those are also not facts of suffering, Buddhism and Eastern philosophy already addressed that.Darkneos

    Once upon a time, a young monk, eager to understand truth, approached his master and asked, "Master, what is the nature of reality?"

    The master pointed to the towering mountain in the distance and said, "That is a mountain."

    The monk was puzzled. "Of course, it is a mountain," he thought.

    Years passed, and as the monk studied deeply, he began to see through illusions. He realized that the mountain was not a mountain—it was a collection of elements, ever-changing. There was no fixed essence of "mountain." Excited by this insight, he returned to his master.

    "Master! I see now—the mountain is not a mountain!"

    The master smiled but said nothing.

    More years passed. The monk continued his practice, going beyond concepts and distinctions. Eventually, he returned, bowing deeply.

    "Master, I see now… the mountain is once again a mountain."

    The master laughed. "Now you truly understand."
  • Darkneos
    918
    So you believe that there isn't any aspect of suffering that is a fact of the human condition (i.e. hominin species)?180 Proof

    Suffering is, though it is a personal thing.

    "Master, I see now… the mountain is once again a mountain."

    The master laughed. "Now you truly understand."
    praxis

    It's an old Zen story about how true enlightenment means acknowledging the two truths of reality and living the paradox. It's not that it exists or doesn't exist; both are true, and to know both is to see the truth.

    Or put another way, ultimate reality and conventional reality, both true and exist in tandem. To label one as false and the other true is to err.
  • 180 Proof
    15.7k
    Suffering is [a fact] though it is a personal thing.Darkneos
    Yes, "a personal" objective fact like every physical or cognitive disability; therefore, suffering-focused ethics (i.e. non-reciprocally preventing and reducing disvalues) is objective to the degree it consists of normative interventions (like e.g. preventive medicine (re: biology), public health regulation (re: biochemistry) or environmental protection (re: ecology)) in matters of fact which are the afflictions, vulnerabilities & dysfunctions – fragility – specific to each living species.

    addendum to
    https://thephilosophyforum.com/discussion/comment/980498
  • Darkneos
    918
    Yes, "a personal" objective fact like every physical or cognitive disability; therefore, suffering-focused ethics (i.e. non-reciprocally preventing and reducing disvalues) is objective to the degree it consists of normative interventions (like e.g. preventive medicine (re: biology), public health regulation (re: biochemistry) or environmental protection (re: ecology)) in matters of fact which are the afflictions, vulnerabilities & dysfunctions – fragility – specific to each living species.180 Proof

    No, not an objective fact; it's personal, therefore not objective. Physical and cognitive "disabilities" are also not objective facts.

    is objective to the degree it consists of normative interventions180 Proof

    Again, you have to be told it's not objective.

    in matters of fact which are the afflictions, vulnerabilties & dysfunctions – fragility – specific to each living species.180 Proof

    Again, not matters of fact, just interpretations. Suffering is open to interpretation and exists only subjectively. Besides, there are those who do not suffer, as I mentioned before. Again, just insisting it is so doesn't make it so.
  • 180 Proof
    15.7k
    Your obstinate dismissals without argument, sir, are now dismissed by me without (further) argument. Hopefully, someone much more thoughtful than you will offer credible counters to my arguments.
  • Darkneos
    918
    Your obstinate dismissals without argument, sir, are now dismissed by me without (further) argument. Hopefully, someone much more thoughtful than you will offer credible counters to my arguments.180 Proof

    They already gave them and you ignored them; you just doubled down, insisting subjective feelings and assessments are objective. I even explained how your "list" is still subjective evaluations, and not everything on there is a fact of suffering, because there is no fact of suffering, due to its subjective nature; for everything on your list there is someone who doesn't suffer because of it.

    That's also why they stopped responding to you.

    You have made no argument, only insisted it is so, and I had to keep pointing out how it's still a subjective experience and there is nothing objective or factual about it. I even cited an entire branch of philosophy that argued otherwise; maybe try Buddhism.

    So unless you have anything beyond insisting it's objective then you're easily dismissed, like how your earlier point about AI had nothing to do with the topic.

    Suffering is not a measurable quantity and therefore not an objective fact; even your link shows that...
  • kindred
    153


    Every technological advancement has its advantages and disadvantages. I think this has been the case since the invention of the wheel and the invention of fire. They made life easier, and the human propensity for ingenuity and invention is relentless, whether due to necessity or the desire to improve things.

    Sure, AI can replace a lot of jobs, but so did the Industrial Revolution. Take transport, for example: the men involved in the horse trade were impacted by the invention of the automobile, yet the automobile conferred many advantages on its owner. The same goes for AI: if it reduces costs in a capitalistic society, it will become widespread. I think the danger of it is overstated, though, because it opens new career opportunities, such as coding in AI etc.

    However, if we become over-reliant or dependent on AI without knowing how it works, it could stifle innovation, unless of course AI itself is capable of innovation and original thought.

    Yet despite the advances in AI, I don't think it can match the human touch in delivering many types of services and jobs, like those in the care sector (doctors and nurses, for example) or the hospitality and catering industries.
  • Darkneos
    918
    Yet despite the advances in AI, I don't think it can match the human touch in delivering many types of services and jobs, like those in the care sector (doctors and nurses, for example) or the hospitality and catering industries.kindred

    That's kinda what AGI is for: the next step. It's meant to replace that level of cognitive work for humans.

    Also, the Industrial Revolution is a terrible example to use. We're still suffering from that one: the environment being poisoned, people working more than ever before, and let's not forget we had to pass a whole bunch of legislation to prevent workers from being treated as just meat puppets (though that's getting overturned). The overreliance on cars has also been bad, because it makes cities and towns more dangerous for pedestrians, and now we have less walkable cities. It also gave rise to the massive environmental disaster that is the suburbs.

    People like to think technological progress has all been good, unaware of the heavy cost and who's paying it. People who think the dangers of AI are overstated clearly don't understand what's happening, though. I gave one example about it solving the purpose of human existence and just having people hooked up to drugs instead of having to go through experiences for the same thing.

    https://www.youtube.com/watch?v=fa8k8IQ1_X0

    Simply put, humanity is fundamentally unprepared for such a thing.