• TheMadFool
    13.8k
    Bannings are old news - people have been banned from the forum, and some have come within a hair's breadth of being expelled.

    There are many reasons for bannings, but one that bothers me is emotionally charged, offensive posts directed at another member or even at the forum itself.

    That out of the way, I want to call your attention to an experience I had 4 or 5 years ago on another forum. What happened was that chatbots were registering themselves as members on that forum. I don't know whether it was because chatbots lower the quality of the posts, and thereby the forum itself, or something else, but the mods weren't happy. The mods would conduct tests on members to check whether they were chatbots or not, and if one was discovered, it was immediately banned. I suppose the mods were using their own version of the Turing test.

    We all know that, between emotions and reason, what AI (artificial intelligence) can't emulate is emotions. Replicating logical thinking is a piece of cake for a computer, but emotions are an entirely different story. If so, one practical method for telling apart real people and chatbots is to test for emotions. Presumably, the stronger the emotions, the less likely it is that a forum member is a chatbot. Mods should therefore be pleased to see feelings flare up on the forum - insults, rejoinders, expletives, name-calling, etc. all indicate a population of normal human beings instead of a swarm of chatbots.
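    To make the test concrete, such an emotion screen might look something like the minimal sketch below - purely hypothetical, with the markers, scoring, and threshold all invented for illustration:

        # Hypothetical emotion-based bot screen: score a member's posts for
        # emotional intensity and flag low-scoring accounts as possible bots.
        EMOTION_MARKERS = ("!", "idiot", "hate", "love", "nonsense", "outrage")

        def emotion_score(post: str) -> int:
            """Count crude markers of emotional language in a single post."""
            text = post.lower()
            return sum(text.count(marker) for marker in EMOTION_MARKERS)

        def looks_like_bot(posts: list[str], threshold: float = 0.5) -> bool:
            """Flag an account whose average emotional intensity is suspiciously low."""
            if not posts:
                return True  # no posting history at all: treat as suspect
            avg = sum(emotion_score(p) for p in posts) / len(posts)
            return avg < threshold

    On this logic, a member who regularly trips the markers is almost certainly human.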

    Unfortunately, no, mods on forums keep an eye out for offensive posts of the highly emotional type - the more emotionally charged a post is, the greater the likelihood of being banned by the panel of mods. This, if nothing else, demonstrates that mods on many forums have it backwards. Instead of doing a Turing test and weeding out chat-bots, they're actually conducting a Reverse Turing Test and expelling real people from internet forums and retaining members that are unfeeling and machine-like.

    What gives?
  • Kenosha Kid
    3.2k
    Unfortunately, no, mods on forums keep an eye out for offensive posts of the highly emotional type - the more emotionally charged a post is, the greater the likelihood of being banned by the panel of mods. This, if nothing else, demonstrates that mods on many forums have it backwards. Instead of doing a Turing test and weeding out chat-bots, they're actually conducting a Reverse Turing Test and expelling real people from internet forums and retaining members that are unfeeling and machine-like.

    What gives?
    TheMadFool

    Probably a dearth of bots and an embarrassment of riches when it comes to the other. Eliminating bots is not the mods' only problem.
  • Caldwell
    1.3k
    Instead of doing a Turing test and weeding out chat-bots, they're actually conducting a Reverse Turing Test and expelling real people from internet forums and retaining members that are unfeeling and machine-like.
    TheMadFool

    Yeah, death by sterilization.

    First, let's make sure that the mods themselves are not bots. Now that we've gotten that out of the way, we can think about the life/civility balance.

    I guess it lies in the rules they laid down. Then comes an unwanted consequence -- civil, but all bots themselves. Messy and emotionally charged, but real humans.
  • TheMadFool
    13.8k
    Probably a dearth of bots and an embarrassment of riches when it comes to the other. Eliminating bots is not the mods' only problem.
    Kenosha Kid

    Yes, there are other problems mods have to deal with, but what I wanted to touch upon was how close the mods' methods are to the reverse Turing test.

    Yeah, death by sterilization.

    First, let's make sure that the mods themselves are not bots. Now that we've gotten that out of the way, we can think about the life/civility balance.

    I guess it lies in the rules they laid down. Then comes an unwanted consequence -- civil, but all bots themselves. Messy and emotionally charged, but real humans.
    Caldwell

    Rules that favor less heart and more brain.
  • Caldwell
    1.3k
    Rules that favor less heart and more brain.
    TheMadFool

    I can construe this either way -- do you mean bots or humans?

    Truth is, they could create the most realistic AI in appearance, but eventually our connection would be shallow, and often lonely. AI cannot replace humans in many ways. How's 'gut instinct' for good measure?
  • Streetlight
    9.1k
    Eliminating bots is not the mods' only problem.
    Kenosha Kid

    Does literally anything else need to be said about this? That it needed to be said at all is embarrassing.
  • TheMadFool
    13.8k
    Does literally anything else need to be said about this? That it needed to be said at all is embarrassing.
    StreetlightX

    :ok:

    but eventually our connection would be shallow, and often lonely.
    Caldwell

    What if an AI saved your life? Last I checked, the deep bond that occasionally :chin: forms between a savior and the saved is based wholly on the act, the act of saving and not on the mental/emotional abilities of the savior. Just asking.
  • baker
    5.6k
    Instead of doing a Turing test and weeding out chat-bots, they're actually conducting a Reverse Turing Test and expelling real people from internet forums and retaining members that are unfeeling and machine-like.

    What gives?
    TheMadFool
    That's a false dichotomy. Throwing tantrums may be unique to humans, but it's hardly what makes one a good human.
  • TheMadFool
    13.8k
    That's a false dichotomy. Throwing tantrums may be unique to humans, but it's hardly what makes one a good human.
    baker

    I'm banking on our uniqueness - tantrums and all - to see us through an AI takeover IF that comes to pass.
  • Jack Cummins
    5.3k

    It is almost midday and I have not got out of bed yet because I haven't recovered from reading about the recent banning.

    I read the news of the banning when I got up in the middle of the night and was so shocked, because RL was the star of the show at the moment. It was disappointing that some of her writings were not her own. I just can't think why she used others' writings. She did put a couple of replies to me on the threads I wrote, and I would presume these were written by her because they seemed in response to me. They were well written, and I would have imagined that she as a person could have written plenty herself, so it just seems a shame that she felt the need to use others' writings instead.

    But 3 bannings in less than a week is very dramatic. It is all starting to become like a reality TV show, but perhaps it is because of all the lockdowns. Also, in another thread before the latest banning, Gus Lamarch said that what is happening here on the forum is symptomatic of fragmentation in the world.
  • TheMadFool
    13.8k
    I'm veering dangerously close to suicide by mod, but one has to walk the talk, as they say, and I'm not all that keen to flip the kill switch... not yet... probably some day :smile:

    Based on your pupil dilation, skin temperature and motor functions...I calculate an 83% probability that you will not pull the trigger. — Terminator
  • TheMadFool
    13.8k
    Just curious, in which thread did we discuss the idea of self-fulfilling prophecy?
  • Jack Cummins
    5.3k

    The discussion on self-fulfilling prophecy was in the thread on disasters and where are we going.
    I would be interested in the topic, but I will log off for a few hours. That is because it is after midday and I am still in bed, busy reading and writing on this site. I can't stay in bed all day!
  • TheMadFool
    13.8k
    I would be interested in the topic
    Jack Cummins

    :ok:

    I guess it's a good idea to take a break every now and then.
  • Pantagruel
    3.4k
    Instead of doing a Turing test and weeding out chat-bots, they're actually conducting a Reverse Turing Test and expelling real people from internet forums and retaining members that are unfeeling and machine-like.
    TheMadFool

    Since you put this in the philosophy forum as opposed to the lounge, I'm going to point out this is a faulty generalization. Just because 'bots cannot simulate feelings does not imply that those who are not 'bots are not necessarily like 'bots in respect of not having feelings. There is a whole spectrum between being too passionate, to the point where emotion compromises reason, and having no feelings at all.
  • frank
    15.8k

    I was banned from a subreddit for commenting that a particular child molester's throat should be cut and his body thrown in a ditch.

    The whole site was clamping down on incitements to violence at the time (during the Floyd riots).

    It was ok with me tho. It's their subreddit. If they don't want my violent comments, I understand.
  • TheMadFool
    13.8k
    Since you put this in the philosophy forum as opposed to the lounge, I'm going to point out this is a faulty generalization. Just because 'bots cannot simulate feelings does not imply that those who are not 'bots are not necessarily like 'bots in respect of not having feelings. There is a whole spectrum between being too passionate, to the point where emotion compromises reason, and having no feelings at all.
    Pantagruel

    A generalization that plays a role in my thesis: No chatbots can simulate emotions. Where's the "faulty" generalization? Are you saying some chatbots can simulate emotions? That's news to me. I'd like some references. Plus, even if some chatbots can fool and have fooled us into thinking they're emotion-capable humans, I bet they lack the full emotional range of a normal adult human (see below).

    As for emotions being a spectrum, count me in among the crowd who endorse that view.

    Furthermore, I agree with you that "just because bots cannot simulate feelings does not imply that those who are not bots are not necessarily like bots in respect of not having feelings" for the simple reason that apathy is a well-documented psychological phenomenon. I didn't say anything that contradicts this truth.

    I was banned from a subreddit for commenting that a particular child molester's throat should be cut and his body thrown in a ditch.

    The whole site was clamping down on incitements to violence at the time (during the Floyd riots).

    It was ok with me tho. It's their subreddit. If they don't want my violent comments, I understand.
    frank

    That's precisely what's wrong with moderators coming down hard on forum members when they get worked up into a frenzy as you were. Only humans are capable of losing it, as they say, and what better evidence than that to prove a member isn't an emotionless bot.

    It almost seems like we humans secretly aspire to become [more] machine-like and it shows in how forum moderators, not just the ones on this forum, are quick to ban those who go off the deep end.
  • Pantagruel
    3.4k
    A generalization that plays a role in my thesis: No chatbots can simulate emotions. Where's the "faulty" generalization?
    TheMadFool

    Well, in the reverse: that there could be a 'reverse Turing test'. The Turing test targets chatbots, but the reverse Turing test doesn't target all 'real human beings', only the set whose exaggerated emotions rise to the level of unreasonable display. So you aren't leaving behind only a "machine-like" residue. It's a faulty generalization.
  • TheMadFool
    13.8k
    Well, in the reverse: that there could be a 'reverse Turing test'. The Turing test targets chatbots, but the reverse Turing test doesn't target all 'real human beings', only the set whose exaggerated emotions rise to the level of unreasonable display. So you aren't leaving behind only a "machine-like" residue. It's a faulty generalization.
    Pantagruel

    Oh! I see. In the reverse Turing test, people are tested on whether they can mimic a computer or a simple chatbot, I suppose. I didn't say ALL people are capable of that feat, i.e. I didn't make a generalization on that score. In fact, that people can pass the reverse Turing test is why we're all still members of this forum, having outwitted the moderators into thinking we're not human or that we're state-of-the-art chatbots capable of a decent conversation with another human being and not ruffling anyone's feathers along the way.
  • Pantagruel
    3.4k
    In fact, that people can pass the reverse Turing test is why we're all still members of this forum, having outwitted the moderators into thinking we're not human or that we're state-of-the-art chatbots capable of a decent conversation with another human being and not ruffling anyone's feathers along the way.
    TheMadFool

    :lol:
  • Caldwell
    1.3k
    What if an AI saved your life? Last I checked, the deep bond that occasionally :chin: forms between a savior and the saved is based wholly on the act, the act of saving and not on the mental/emotional abilities of the savior. Just asking.
    TheMadFool

    I think putting it this way is meandering away from the point of this thread.

    A man paid $100k for a sports BMW equipped to save the life of the driver in the event it flips over multiple times during an accident. Then the accident happened -- the car, traveling 100 mph, flipped several times, and he got out of it and walked away from an accident that would normally kill.

    To say he formed a deep bond with this machine is sentimentality. One would be very thankful. Amazed. But to call it a deep bond is projecting.

    So, going back to the task at hand, can a bot have gut feeling? Do not be fooled by the word "feeling" here. Gut feeling actually operates as intelligence used in decision-making.
  • Jack Cummins
    5.3k

    I can't possibly think that you would get banned. Even though I am not someone who advocates banning people, I can see that the two people who were banned had enormous attitude problems, which you do not. I would imagine that the mods do put some careful thought into banning rather than doing it arbitrarily.

    One seemed to think he was superior to almost all others on the site, practically wanting to change it completely, and even suggested that he should edit articles. The other had many prejudices, and I had a difficult night when I challenged him about his use of the word schizophrenia to imply someone who lacks rationality. He was also being very offhand with me on the day before he was banned, asking me how old I was. I know that these 2 were banned for different reasons, but I thought that they were extremely difficult members.

    I would object if you were banned. I think that the only problem that the mods might have with you or me is that we start a lot of threads. I really started the one on the arts this week to try to break down all the heated politics. We all get heated, and sometimes I feel heated and have to think before I write. I find lying on my bed and playing some music helps. I have also thought that if I get too wound up I might avoid the site for a few days, but it is not easy because I have got into the habit of logging on to my phone.
  • TheMadFool
    13.8k
    I think putting it this way is meandering away from the point of this thread.

    A man paid $100k for a sports BMW equipped to save the life of the driver in the event it flips over multiple times during an accident. Then the accident happened -- the car, traveling 100 mph, flipped several times, and he got out of it and walked away from an accident that would normally kill.

    To say he formed a deep bond with this machine is sentimentality. One would be very thankful. Amazed. But to call it a deep bond is projecting.

    So, going back to the task at hand, can a bot have gut feeling? Do not be fooled by the word "feeling" here. Gut feeling actually operates as intelligence used in decision-making.
    Caldwell

    Here's an analogy for you to consider. To my knowledge, traits like selflessness and not expecting anything in return, to name a few, define a good person, and we're drawn to people who possess these qualities, i.e. we're eager beavers regarding opportunities to bond with them.

    The expensive BMW that saves the driver is selfless, literally, and it also doesn't seek recognition for having saved the driver.

    Ergo...
  • baker
    5.6k
    It almost seems like we humans secretly aspire to become [more] machine-like and it shows in how forum moderators, not just the ones on this forum, are quick to ban those who go off the deep end.
    TheMadFool
    Dude, lay off the drama.

    [Fully aware that classy stops being classy once one has to explain it ...]

    There are four kinds of entities that aren't into drama:
    1. chatbots,
    2. people who try to be like chatbots,
    3. people who just don't like drama,
    4. ideally, philosophers.

    This is a philosophy forum, and philosophy is supposed to be love of wisdom, not love of drama. Philosophers should exemplify this with their conduct. One of the hallmarks of such conduct is moderation in one's emotional expression.
  • TheMadFool
    13.8k
    Dude, lay off the drama.

    [Fully aware that classy stops being classy once one has to explain it ...]

    There are four kinds of entities that aren't into drama:
    1. chatbots,
    2. people who try to be like chatbots,
    3. people who just don't like drama,
    4. ideally, philosophers.

    This is a philosophy forum, and philosophy is supposed to be love of wisdom, not love of drama. Philosophers should exemplify this with their conduct. One of the hallmarks of such conduct is moderation in one's emotional expression.
    baker

    I only report that which I observe. Your list is intriguing, to say the least. In some world, chatbots, people who try to be chatbots, and philosophers are part of the same coherent category. That's precisely my point. The irony is that philosophers are in the process of becoming more like existing chatbots, emotionally sterile, and computer scientists are in the business of making chatbots more human, possessed of emotions or, at least, capable of simulating them.
  • baker
    5.6k
    In some world, chatbots, people who try to be chatbots, and philosophers are part of the same coherent category.
    TheMadFool
    Not on my planet.


    The irony is that philosophers are in the process of becoming more like existing chatbots, emotionally sterile
    TheMadFool
    I'm sure some are like that.
    But it's important to distinguish between emotional sterility and emotional moderation. The two look the same at first glance, but they are not the same. We can discover which person is which by talking to them for a while.
  • TheMadFool
    13.8k
    Not on my planet.
    baker

    You mean not on this planet, but that would be odd, since it makes sense to me and I'm definitely on this planet.

    The two look the same at first glance
    baker

    Indeed, one is the inability to emote and the other is about control, but what I'm driving at is that the wish to control emotions reveals a secret obsession to be emotionally dead, like existing robots and AI.
  • Harry Hindu
    5.1k
    We all know that, between emotions and reason, what AI (artificial intelligence) can't emulate is emotions.
    TheMadFool
    It's very easy to emulate emotions on a forum. Any time someone makes any assertion, the bot replies back with phrases like "You're an idiot," "racist," "bigot," etc.

    It's actually much more difficult to produce a logical response than an emotional response, because it requires more work and energy.
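    A bot like that fits in a few lines - a minimal sketch, with the canned phrases invented for illustration:

        import random

        # Canned "emotional" outbursts - no understanding required; any
        # assertion whatsoever triggers one of them at random.
        OUTBURSTS = ["You're an idiot.", "Racist!", "Bigot!", "Typical."]

        def emotional_reply(assertion: str) -> str:
            """Reply to any assertion with a random canned outburst."""
            return random.choice(OUTBURSTS)

    Contrast that with generating a reply that actually addresses the assertion's logic, which is a far harder problem.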
  • baker
    5.6k
    Indeed, one is the inability to emote and the other is about control, but what I'm driving at is that the wish to control emotions reveals a secret obsession to be emotionally dead, like existing robots and AI.
    TheMadFool
    Well, I suppose some people want to control emotions for such a reason.

    But some people follow the path of the samurai.

    You're just not allowing for enough detail in this.
  • TheMadFool
    13.8k
    It's very easy to emulate emotions on a forum. Any time someone makes any assertion, the bot replies back with phrases like "You're an idiot," "racist," "bigot," etc.

    It's actually much more difficult to produce a logical response than an emotional response, because it requires more work and energy.
    Harry Hindu

    What you say here squares with how Aristotle and later generations of thinkers viewed humans: as rational animals. On this view, emotions can be considered remnants of our animal ancestry, subhuman as it were, to be dispensed with as quickly as possible, if at all possible. From such a standpoint, emotions are hindrances, preventing and/or delaying the fulfillment of our true potential as perfectly rational beings. It would seem then that reason - rationality, logic - defines us; it's what could be taken as the essence of a human being.

    So far so good.

    Logic, as it turns out, is reducible to a set of "simple" rules that can be programmed into a computer, and that computer would then become a perfectly rational entity. This has been achieved - there are computers that can construct theorems on their own given a set of axioms, whatever these may be, their ability to be logical implicit in that capacity. Does this mean that we've already managed to emulate the mind of a perfect human being - completely rational and in no way hampered by emotional baggage - in the computer?
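    As a toy picture of what "reducible to simple rules" means, here's a minimal forward-chaining sketch that derives every conclusion reachable from a set of axioms by repeated application of modus ponens (the axioms and rules are invented for illustration):

        # Axioms (facts) and rules of the form: if all premises hold, conclude X.
        facts = {"socrates_is_a_man"}
        rules = [({"socrates_is_a_man"}, "socrates_is_mortal"),
                 ({"socrates_is_mortal"}, "socrates_will_die")]

        derived_something_new = True
        while derived_something_new:  # keep applying rules until nothing new follows
            derived_something_new = False
            for premises, conclusion in rules:
                if premises <= facts and conclusion not in facts:
                    facts.add(conclusion)  # a freshly constructed "theorem"
                    derived_something_new = True

        print(facts)  # the machine's complete, fallacy-free logical output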

    From your perspective, yes; and yet people involved in AI research seem not to be convinced of it. They're still trying to improve AI. Since AI executes logic flawlessly - it doesn't commit fallacies - it follows that it's not true, at least in the field of AI research, that logical ability defines what it is to be human.

    What exactly is it that's missing in AI like chatbots? It definitely doesn't have anything to do with logic, for that's already under the belt of current AI. What is it that precludes current AI from being given equal status to humans? One answer, among many others, is emotions. I may be taking it a bit too far when I say this, but movies like Terminator, The Matrix, and I, Robot underscore this fact. I consider these movies to reflect our collective intuition on the subject, the intuition that emotions and the nuances involved therein are chockablock with paradoxes, and these, by their very nature, are beyond the ken of purely logical entities. To emotions and the complexities that arise out of them, an AI will simply respond "THAT DOES NOT COMPUTE".

    Does this mean that emotions are inherently irrational?

    The answer "yes" is in line with how moderators in forums conduct their affairs. The moment feelings enter the picture as will be indicated more often than not by insulting profanity, they'll step in and the offenders will be subjected to punitive measures that can take the form of a stern warning or outright expulsion from the forum. If one takes this kind of behavior from the moderators into account, it would seem that they would like us to behave more like the perfect logical entities I mentioned earlier as if that were the zenith of the human potential. However, AI experts don't seem to share that sentiment. If they did, they would be out of a job but no they're not, AI research is alive, well and kicking.

    The answer "no" would point in another direction. If emotions are not irrational, it means that we're by and large completely in the dark as to their nature for the simple reason that we treat them as encumbrances to logical thinking. Emotions could actually be rational, we just haven't figured it out yet. This, in turn, entails that a certain aspect of rationality - emotions - lies out of existing AI's reach which takes us back to the issue of whether or not we should equate humans with only one-half of our mental faculties viz. the half that's associated with classical logic with its collection of rules and principles.

    Yes, we could program a chatbot to respond with "You're an idiot, racist, bigot, etc." and this does bear a resemblance to a real human blowing a gasket, but that, by your own logic, would make it inhuman; and, at the other extreme, being totally rational is, by my logic, also inhuman. It seems then that the truth lies somewhere between these two extremes; I guess a human is someone capable of both clear logical thought and also, on occasion, becoming a raging imbecile.

    Well, I suppose some people want to control emotions for such a reason.

    But some people follow the path of the samurai.

    You're just not allowing for enough detail in this.
    baker

    Death (nonexistence) before dishonor (feeling). Precisely my point.
  • baker
    5.6k
    Death (nonexistence) before dishonor (feeling). Precisely my point.
    TheMadFool
    *sigh*