• Tom Storm
    8.5k
    Can anyone tell me how you can detect something that's unconscious? Doesn't this cognitive bias theory have the same problem as psychoanalysis, that it's not falsifiable? — Skalidris

    Your question is a broad one. How can we identify with certainty whether someone's beliefs are influenced by an unconscious perspective that comes from (let's say) personal trauma or a family of origin value system? Don't think we can.

    I'm not sure I understand how you are connecting cognitive bias theory with critical thinking. In what sense are you proposing they are connected?
  • Pantagruel
    3.3k
    Just because we use numbers for interpretations doesn't mean the phenomenon is quantitatively measurable... — Skalidris

    Actually that is exactly what it means. It seems you are coming from some kind of radically anti-scientific bias. All in good fun I guess, but not a good use of my time.
  • Joshs
    5.3k
    It sounds like you are arguing that there is a cognitive bias in the research that has concluded there is cognitive bias. If humans who are trying to be objective and have systematic protocols still manage to have a cognitive bias, don't you think this supports the idea in general that many people have a cognitive bias? Further, do you really doubt that people adjust their memories to avoid certain feelings and conclusions (about themselves and others)? Sure, objective research can have hidden biases and specific conclusions about cognitive bias may be faulty, but I encounter cognitive bias in myself and others all the time. What gets noticed and what doesn't depends on group identity or ego self-protection. If cognitive bias were a crime, there would be clear motive and access. — Bylaw

    We don’t observe the world directly but through a personal framework of constructs that form a functional unity. Each of us is thus ‘biased’ with respect to the perspectives of others. We each live in slightly different worlds. When we reach consensus on facts of situations or the working of the mind, this consensus doesn’t eliminate the perspectival nature of our outlooks. Consensus and normative agreement on scientific fact is an averaging of all of our personal biases, not their elimination. The ‘objective’ fact is a view that no one in particular actually holds; we all hold our own variation on that template.

    When we accuse someone of cognitive bias, we are pointing out that their view deviates from the consensus of the larger group. This doesn’t tell us the view of the majority is more ‘correct’ than that of the deviant. They cannot be said to be in closer touch with ‘true’ reality. The fundamental arbiter of validity of a viewpoint is to what extent it is consistent with one’s own understanding, not whether it measures up to some third-person external criterion of truth.

    Our negative emotions tell us when an aspect of the world no longer makes sense to us, when our personal anticipations of events fail to match up with what actually ensues (from our own personal perspective). We can block painful emotions, but this is generally a matter of not being able to articulate those feelings of chaos. We repress and avoid what we can’t make sense of, but this doesn’t eliminate the crisis; it only constricts our engagement with the world to what we can handle.
  • Bylaw
    549
    When we accuse someone of cognitive bias, we are pointing out that their view deviates from the consensus of the larger group. This doesn’t tell us the view of the majority is more ‘correct’ than that of the deviant.
    — Joshs
    I don't think the research has to go like this. You are assuming there is a chosen right answer and that people are judged for deviating from that. But you don't need to have a right answer (an answer or interpretation) from which we judge a person's answer to show bias. All we need to do is see if people with certain political beliefs actually do not notice counterevidence. This doesn't mean they are wrong to think the Iraq war was wrong or abortions are OK. Both sides of any issue can be shown to literally not notice things that go against their beliefs.

    I can see things like this in myself in relation to 'things that happened' and how I viewed them then, and notice that I did not look at things/hypocrisies/evidence that I would have found hard to face. I protected myself from guilt or shame.

    I don't think this needs to be an accusation against someone; I think that's the wrong verb, though I am sure there are situations where someone is accused of cognitive bias. I see this as a fairly inevitable tendency, though one that can be struggled against in oneself and, I suppose, with others one is close to. In what I would call a healthy relationship, the people involved are aware this is a possibility, that they are filtering information so as not to admit or face something. So there is some slack to have this pointed out. (And generally no one uses the phrase cognitive bias in these dialogues, but that is often what is being talked about.)

    You seem to be viewing cognitive bias as creating a subset of those who deviate. I am sure that kind of thing happens (and often without the need of the concept of cognitive bias; psychiatry has done this by pathologizing certain people or states or attitudes). Cognitive bias is considered something we all have. It's not like the category of, say, psychosis.
    Our negative emotions tell us when an aspect of the world no longer makes sense to us, when our personal anticipations of events fail to match up with what actually ensues (from our own personal perspective). — Joshs
    I am not quite sure what in my post this is responding to. I think those scenarios could and often would lead to negative emotions. I do think that the so-called negative emotions (I don't think of them this way) also arise without confusion: say, when someone violates a boundary, or we think they have, with violence for example. I think they can also arise when things we expect but do not like happen. But you may not have been trying to present a complete picture of when these emotions arise. As I said, I am not quite sure how this section connects.
  • Joshs
    5.3k
    All we need to do is see if people with certain political beliefs actually do not notice counterevidence. This doesn't mean they are wrong to think the Iraq war was wrong or abortions are OK. Both sides of any issue can be shown to literally not notice things that go against their beliefs.

    I can see things like this in myself in relation to 'things that happened' and how I viewed them then, and notice that I did not look at things/hypocrisies/evidence that I would have found hard to face. I protected myself from guilt or shame.
    — Bylaw

    My point about the relation between negative emotions like guilt and shame, and the breakdown of predictive sense-making, is that guilt, shame and anger all have to do with situations that surprise, violate and thus invalidate schemes of understanding the world that we counted on to effectively predict events. Since these emotions are expressions or byproducts of a partial breakdown in the effectiveness of our schemes of understanding events and people, it is not the guilt, shame or anger that we need to protect ourselves from; it is ideas and behaviors of others that we cannot make sense of. We withdraw from people who alarm, disturb or confuse us with ideas that don’t make sense to us, and that as a consequence we may feel are harmful or immoral.
    It is not that we simply ignore evidence that contradicts our beliefs, as if a part of ourselves recognizes and fully understands the opposing belief, forms a negative emotion, and then decides to protect ourselves from this emotion by ignoring the belief. We never get to this stage of recognition and comprehension. A belief is part of a larger system of mutually consistent ideas. It is impossible to incorporate, or even to fully recognize as meaningful, ideas of someone else that are incompatible with that system. Such ideas simply don’t make sense to us, seem incoherent or illogical, or may be mostly invisible. It’s not that we are pretending they don’t make sense; they really don’t make sense. This isn’t a matter of fooling ourselves or hiding something from ourselves that has already been absorbed. We have no peg, no proper structure to hang it on, and so it simply isn’t assimilated. This selectiveness of perception is a necessary feature of sense-making.

    In today’s polarized political climate, we spend a lot of time psychoanalyzing our opponents. We say they refuse to accept reality, create fake news, are brainwashed, succumb to shady motives, ignore what they don’t want to hear. What we have a great deal of difficulty doing is recognizing that a fact only makes the sense it does within a particular account, and people from different backgrounds and histories use different accounts to interpret facts.
  • jgill
    3.6k
    What I would like to know is how and why people think it can help with critical thinking. — Skalidris

    When I taught complex variables, a senior level mathematics course, I would resort to heuristics in order to encourage understanding of principles and theory. A theorem might require a complicated proof, but by drawing pictures on the board and describing the underlying concept students could see through the complications and comprehend a rational argument that implied the result. It was a shortcut, but one I have used for myself numerous times. If an idea is abstract and convoluted, find an example that illustrates the idea. Then study the formal approach.

    I have no idea if this is the sort of thing you have in mind. Heuristics fall in the broad category of cognitive bias. I've found that extracting an idea and making it personal in some way helps critical thinking. But I think this thread is more about political biases.
  • Bylaw
    549
    We withdraw from people who alarm, disturb or confuse us with ideas that don’t make sense to us, and that as a consequence we may feel are harmful or immoral. — Joshs
    And, in the same way, we fail to read, take seriously, or notice counterevidence, anomalies, portions of texts, portions of what is said. This is cognitive bias. For the purposes of the discussion it doesn't matter much why; it's just that we do it.
    In today’s polarized political climate, we spend a lot of time psychoanalyzing our opponents. We say they refuse to accept reality, create fake news, are brainwashed, succumb to shady motives, ignore what they don’t want to hear. What we have a great deal of difficulty doing is recognizing that a fact only makes the sense it does within a particular account, and people from different backgrounds and histories use different accounts to interpret facts.
    — Joshs
    Which is another way of saying they have biases. Some people can have more than others. But we all have this.
    Yes, there are a lot of bullshit 'analyses' of other people out there now. And rarely do the accusers (because that is part of it, accusing and labelling people, and I now understand better where you are coming from) actually demonstrate their psychoanalysis, nor do they realize that cognitive bias cuts both ways. I hold positions on current and past events that do not fit with mainstream media's version of reality, so I think I have great sympathy for what you dislike. I don't think the misuse of the idea of cognitive bias means that there is no cognitive bias. I see cognitive bias in all political groups, and yes, political power generally determines what counts as objective, and sweeping, philosophically weak judgments of people do get thrown around. And, in fact, cognitive bias contributes to people's wholehearted certainty when they go along with BS getting shoved our way. And it has gotten worse. There is a centralization of media power and this has allowed
  • Bylaw
    549
    OK, I think I have a better idea where you are coming from. Yes, these ideas get used in terrible, politicized and/or financially motivated ways. I have beliefs that do not match mainstream media biases. And a lot of sloppy, unaware uses of concepts like cognitive bias exist. Sure.
    And they manage to use these in part because of cognitive bias and people's unwillingness to notice their team's cognitive bias, media bias, contradictions, counterevidence, control of media and more.

    But cognitive bias exists and this can be demonstrated in research that is not being used for these kinds of purposes. Many true ideas can be misused. Many neutral things can do evil in the wrong hands. The people who hurl the term around, aimed at certain groups, are not citing research; they are aiming a psychological concept at people they disagree with. This used to happen with psychological ideas like 'projection', 'delusion', 'paranoia' and so on. Those are real phenomena, but once laypeople, or professionals acting as laypeople, start hurling them around, it has little to do with the research and is just being applied intuitively. (And of course research can be, and increasingly is, biased, given the concentration of money controlling research in fewer and fewer companies, but that's another story.)

    Much of what you say, for example around negative emotions, seems like an explanation for why we have cognitive biases.
  • Banno
    23.4k
    And yet, despite the misgivings of @Skalidris' OP, and the sceptical musings of @Joshs, we know that we have cognitive biases...

    Again,
    I believe as a rule that understanding a cognitive bias facilitates mitigation. — Pantagruel
  • jgill
    3.6k
    The less-is-more effect is a positive outcome of heuristics, which itself is a small part of the very large cognitive bias spectrum. Math people use heuristics all the time to uncover general truths before looking for proof structures. However, the approach frequently fails in this regard. I just finished using it to define a specific outcome, only to discover a fallacy upon closer inspection.
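
    As a toy illustration of the kind of failure I mean (a made-up Python example, not the problem I was actually working on): a quick numerical spot-check can suggest a "general truth" that a wider search immediately refutes.

    # Toy example: a heuristic spot-check suggests n^2 + n + 41 is always prime,
    # but looking a little further turns up a counterexample.
    def is_prime(k: int) -> bool:
        """Trial-division primality test, adequate for small k."""
        if k < 2:
            return False
        d = 2
        while d * d <= k:
            if k % d == 0:
                return False
            d += 1
        return True

    # Heuristic phase: the first twenty cases all check out.
    print(all(is_prime(n * n + n + 41) for n in range(20)))  # True -- looks like a theorem

    # Closer inspection: search further for a counterexample.
    for n in range(200):
        if not is_prime(n * n + n + 41):
            print(f"fails at n = {n}: {n * n + n + 41} = 41 * {(n * n + n + 41) // 41}")
            break
    # fails at n = 40: 1681 = 41 * 41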
  • Skalidris
    118
    I'm not sure I understand how you are connecting cognitive bias theory with critical thinking. In what sense are you proposing they are connected? — Tom Storm

    Well, having cognitive biases would lead to a more "subjective" vision of reality, so cognitive bias mitigation would naturally lead to a more "objective" one, and that would require being more critical of yourself or others.

    Actually that is exactly what it means. It seems you are coming from some kind of radically anti-scientific bias. All in good fun I guess, but not a good use of my time. — Pantagruel

    With experiments, we can conclude a lot of people have cognitive bias (or whatever you want to call it), but that doesn't mean that we have tools to measure it quantitatively in someone at a given moment. You have no way of measuring how much someone's opinion is biased. What did you have in mind? That we have some kind of cognitive bias detector that tells you how biased you are?

    but by drawing pictures on the board and describing the underlying concept students could see through the complications and comprehend a rational argument that implied the result — jgill

    Oh yeah, I think that method could actually help a lot; it would be harder to ignore one element due to strong emotions if it's in front of your eyes and logically connected to all the others.

    But I was talking about trying to figure out if you are experiencing a cognitive bias, like simply asking yourself "am I making this decision because of the survivorship bias?". I think that approach is not efficient at all.

    But I think this thread is more about political biases. — jgill

    Political? No, not necessarily; it could be all kinds of bias really.
  • Pantagruel
    3.3k
    With experiments, we can conclude a lot of people have cognitive bias (or whatever you want to call it), but that doesn't mean that we have tools to measure it quantitatively in someone at a given moment. You have no way of measuring how much someone's opinion is biased. What did you have in mind? That we have some kind of cognitive bias detector that tells you how biased you are? — Skalidris

    I never proposed that we should construct a scale. Essentially, a bias is a distortion, so whatever the degree of the distortion, remediating it (by whatever amount) is better than not, don't you think?
  • Skalidris
    118
    remediating it (by whatever amount) is better than not, don't you think? — Pantagruel

    But how do you plan to do that if you can't even know for sure if it's there or not? At a given moment for a given opinion, we have no tools to detect it...
  • Pantagruel
    3.3k
    But how do you plan to do that if you can't even know for sure if it's there or not? At a given moment for a given opinion, we have no tools to detect it... — Skalidris

    We have the catalog of known cognitive biases. That's a pretty good tool IMO.
  • Tom Storm
    8.5k
    What we have a great deal of difficulty doing is recognizing that a fact only makes the sense it does within a particular account, and people from different backgrounds and histories use different accounts to interpret facts.
    — Joshs

    Totally agree. But are there not also some dishonest people involved, who do know something different from what they profess?
  • Joshs
    5.3k
    What we have a great deal of difficulty doing is recognizing that a fact only makes the sense it does within a particular account, and people from different backgrounds and histories use different accounts to interpret facts.
    — Joshs

    Totally agree. But are there not also some dishonest people involved, who do know something different from what they profess?
    — Tom Storm

    We generally lie when we think our real motives and justifications will not be understood the way we mean them, in their full context, or when we believe the ideas we are operating from will not be properly understood. In these cases our dishonesty is not the root of the problem. It is only a symptom of, and our attempt to ameliorate the effects of, a prior breakdown in mutual understanding.
  • Deleted User
    0
    This user has been deleted and all their posts removed.
  • Joshs
    5.3k
    ↪Skalidris

    How can we ever be sure that the decision we’re making isn’t biased? Biases are unconscious…
    — Skalidris

    Work to make the unconscious conscious. The few who attempt to do so find it is a long, painful process.
    — ArielAssante

    And after all that work they eventually find that what they end up with is a conscious bias.
  • Skalidris
    118
    Work to make the unconscious conscious. The few who attempt to do so find it is a long, painful process. — ArielAssante

    What if you’re biased with another bias when you conclude you’re biased? What if you like the cognitive bias theory so much that it’s the confirmation bias to think you’re biased? How would you know which one is true? How can you be sure you consciously realised what was in your unconscious mind?
  • Deleted User
    0
    This user has been deleted and all their posts removed.
  • Skalidris
    118


    How does anything you sent me answer any of my questions?
    If you're implying it's excessive to push the bias theory as far as having bias in the process of mitigating other biases, why do you think it is excessive?

    You showed the theory, and that's great, but I'm saying it's impossible to apply it and be aware of cognitive biases. I didn't make this thread to learn more about the theory; I made it to get an actual debate about its application.
  • Bylaw
    549
    How can we ever be sure that the decision we’re making isn’t biased? Biases are unconscious…
    I see a lot of people using cognitive bias as some kind of superiority: “I know about cognitive bias and I try to avoid it, and you don’t, so I’m closer to the truth than you are”… And this is exactly the kind of behaviour that kills critical thinking… Or people who use it to take down someone’s defense: “you’re saying that because you’re biased, therefore it doesn’t have any value”…
    — Skalidris
    Right, this is a fairly useless use of the concept. Hurling or implying superiority is unlikely to improve the conversation. On the other hand, I do think many people do develop ways to check their cognitive biases, with varying degrees of success. A lot of psychological concepts, I think, do describe real phenomena. Projection, passive-aggressiveness, narcissism, or even something like the fancy-ass-sounding Herzberg’s Motivation-Hygiene Theory can easily be abused.
    They tend to be much better at understanding patterns than at resolving disputes via labeling of others. You can use ideas like these to improve self-knowledge and understanding of dynamics, perhaps even to make positive changes in yourself or extricate yourself from toxic patterns with others. None of that is diminished by the misuses people put these ideas to.
    The thread is titled
    Cognitive bias: tool for critical thinking or ego trap?
    But it's not an either/or situation, as with many concepts from many fields. It is a useful concept, I think, AND people can use it terribly.
    Like axes, guns or referendums.
  • Skalidris
    118
    It is a useful concept, I think, AND people can use it terribly. — Bylaw

    Yeah sure, but how do you prevent yourself from using it terribly? For guns you can have a license and a check on whether you're mentally stable, but what about cognitive bias? My point is you can never know for sure if you're biased, so I don't see the point of trying to figure out if you are; it's nonsense, not falsifiable. Maybe in some cases it would help someone get some distance from themselves, because they would question their opinions, but I really don't think questions about detecting cognitive bias are the best questions to ask; I think they lead to a lot of confusion. And I also think it can quickly escalate to some kind of superiority.
  • Deleted User
    0
    This user has been deleted and all their posts removed.
  • Skalidris
    118
    “certainty brings insanity”. Certainty is not possible. — ArielAssante

    So you shouldn't try to work with things that are more certain than others? You shouldn't try to maximise certainty? When I said "how can you be sure", I obviously didn't see certainty as something binary; it has shades. I was referring to Popper's principle of falsifiability.

    Excessive thinking not good. Tends toward narcissism. — ArielAssante

    Um... This is a very extreme opinion. Some people who overthink actually have very low self-esteem and aren't narcissistic at all...

    But don't get me wrong: I actually agree that knowing yourself is the best thing you can do to be more critical. I just don't think naming biases and trying to detect them in yourself or others is going to help.
  • Bylaw
    549
    Yeah sure, but how do you prevent yourself from using it terribly? — Skalidris
    Some people can, others can't. How does one learn how to weed out pathological boyfriends/girlfriends? Some people can't, so no one should bother? You can reflect on what attracted you to the person. You can listen to other people's nightmares; you could ask certain questions earlier in the relationship. Here I am using an analogy where intuition and wisdom are involved. Can one rule out ever again getting close to another crazy person? Well, probably not completely. But can one improve? Sure. There are all sorts of things we learn to do better that some people cannot or will not try to learn to do better.

    And remember: you are talking about an individual on their own analyzing their own personal cognitive biases. The concept of cognitive bias exists and is well supported by research even if people can't use it as a tool on themselves alone.

    And actually you can test to see whether people's biases change and diminish, and what can lead to this.

    Then there's using other people to check on your biases: people you respect, perhaps best with a variety of political/social/philosophical opinions. They can point out when you read an article and manage not to notice the part that goes against your ideas. They can point out when you bring up only those life experiences you've had which support or seem to support your opinion of women, Republicans, the safety of vaccines, your sense of cosmology, your sense of what a responsible worker does, and so on. They can point out when you contradict yourself. And this kind of dialogue (which hopefully is mutual) can and does reduce people's biases.

    If you want foolproof, then you need to not deal with a complicated lifeform like humans. It's easy to adjust the straightness of a bicycle tire.

    Another example is learning how to learn. Over my lifetime I have learned how to improve my learning, even when I am suddenly in the position of having to learn a new kind of subject. One small technique is to reflect on what I have learned and how I went about it during a single lesson/study session/whatever.

    Does this make me an immaculate learner? No. Does my reflection catch all my assumptions? No. Can my biases regarding my own work affect my self-reflection? Sure.

    HOWEVER.....

    My learning has gotten better and after I introduced reflective practices the difference was significant and noticed by others.

    One can get into one's head and imagine that, yes, it is possible that some process of change will have no effect, because it is hard to track, is complicated, and I am involved so I may be confused about myself. And one can then say it is impossible to learn, or to learn with tool X. Welcome to the human condition. But actually these things have been measured in labs, and while the individual is in their everyday fog, tools like working against one's own cognitive biases and using reflection in learning have been documented to make changes.

    If you want perfect and 100% certainty then apply to be something simpler like a toaster.

    But people are learning around you and using tools you are rejecting because there is some possibility they aren't working and you might just be fooling yourself if you think they are. And we might be brains in vats. And perhaps your job situation will never improve. And perhaps you will never meet someone you really get along with, but only think you have for a while, so there is no need to look at who you are drawn to and how your dreamy romantic nature might be distorting the way you relate to romantic partners, and so on. Since it is hard to know for sure, there is no point in trying to improve.

    Parents can't get better via reflecting on their parenting of their first child and talking to other parents because they may just think they got better.

    Oh, and of course it is falsifiable. You can easily test to see if someone's political position affects what they notice in articles. If it doesn't, then there is no cognitive bias in those situations. And hundreds of different hypotheses around what certain beliefs, roles, statuses and more will lead to in relation to cognitive bias have all been well documented and could have been falsified.
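
    Just to sketch the shape of such a test (with entirely made-up numbers, not real data): suppose you counted how many readers in each of two political groups noticed the counter-attitudinal points in the same article. A simple chi-squared test tells you whether noticing depends on group; if studies like this kept coming back non-significant, the claim would be disconfirmed.

    # Hypothetical sketch: does political group predict noticing counter-attitudinal
    # points in the same article? (The counts below are invented for illustration.)
    from scipy.stats import chi2_contingency

    observed = [
        [12, 38],   # group A: 12 of 50 readers noticed the inconvenient points
        [31, 19],   # group B: 31 of 50 readers noticed them
    ]

    chi2, p, dof, expected = chi2_contingency(observed)
    print(f"chi2 = {chi2:.2f}, p = {p:.4f}")

    # Consistently large p-values across such studies would count against the claim
    # that political position affects what people notice -- i.e., it is falsifiable.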

    You are not a psychological research center. And yeah, it's hard for us to research the way they can. That's our situation. I will bet you try to learn and change things in other areas of your life where the results may be misinterpreted by you, but you still go ahead and try to change through being mentored, through sharing with friends, through reading and reflecting, through engaging in dialogues with people you disagree with, through confessing stuff, through trying new things.
  • Deleted User
    0
    This user has been deleted and all their posts removed.
  • Skalidris
    118
    The concept of cognitive bias exists and is well supported by research even if people can't use it as a tool on themselves alone. — Bylaw

    I never disagreed with that.

    If you want perfect and 100% certainty then apply to be something simpler like a toaster. — Bylaw

    It's really funny, because I created another thread to try and name this kind of behaviour. Wanting more certainty doesn't mean wanting 100% certainty. It's foolish to even think 100% certainty is possible to reach... Of course I never meant to reach 100% certainty in that topic...

    Oh, and of course it is falsifiable. You can easily test to see if someone's political position affects what they notice in articles. — Bylaw

    It's falsifiable as a general concept; I don't have anything against the experiments. But the thing is, detecting it personally in someone at a specific time is much trickier. An experiment with one person being both the control and the test subject is kind of crappy... Say you read two articles, one contradicting your point of view and the other supporting it. What if you remember several pieces of information from the supporting one because it reminded you of something that shocked you in the past? Say you're scared of dolphins, and the supporting one has an analogy with dolphins... You wouldn't be remembering it because of the confirmation bias, although the result is the same. My point is, you can remember something from one specific article rather than another for many reasons, even unconscious ones; maybe you just liked the layout of one article more than the other, or something stupid like that...

    They can point out when you contradict yourself. And this kind of dialogue (which hopefully is mutual) can and does reduce people's biases. — Bylaw

    You don't need the cognitive bias theory for that. If a person ignores some data because it contradicts their opinion, you could ask them how they take these data into account in their way of thinking, and if they can't answer or don't want to, you know they have some emotional blockage with that. I find it much more efficient to lay all the data on the table, link it logically, and ask the person how they reached another conclusion, discussing the logical link. And if some data are ignored (because of the survivorship bias, because of the confirmation bias, or because unicorns are white), it's going to be visible and you can point it out to the person. Like the technique mentioned by @jgill. But who cares about the cause in the end? Who cares if you act like this because of some trauma from your past or whatever? The point is, if you force yourself to lay everything on the table, you're going to be more objective anyway; you're going to mitigate these "biases" even if you have no idea they exist.

    I honestly don't think psychoanalysis is a good tool for improving yourself, and trying to detect cognitive biases in someone seems to have a lot of common ground with it.
  • Agent Smith
    9.5k
    It is better to be alive & wrong than dead & right!
  • Caerulea-Lawrence
    16
    Cognitive bias: tool for critical thinking or ego trap?

    What I would like to know is how and why people think it can help with critical thinking.

    I'll explain why I think it's an ego trap with an example of the survivorship bias:
    If we ask a lottery winner to talk to a group of people about how amazing his life has become, that group will be more likely to buy lottery tickets than a second group of people who listened to the story of a homeless man who lost all his money on lottery tickets. My guess is, if we tell people who bought tickets from the first group that they were biased, they might just say “oh, but even if the chance is low, it’s still there, maybe it’s my lucky day”. So in the end, even given that info, I still think the first group would have more buyers than the second one. It could even be worse: they could fool themselves into thinking they’re critical: “I’m aware there is the survivorship bias, I’m aware the chance of winning is low, but I’m rationally deciding to buy a ticket because I’m willing to risk losing small amounts of money to win big”. Is it really rational though? They’re mostly driven by the emotions that were triggered by the story of the winner…

    How can we ever be sure that the decision we’re making isn’t biased? Biases are unconscious…
    I see a lot of people using cognitive bias as some kind of superiority: “I know about cognitive bias and I try to avoid it, and you don’t, so I’m closer to the truth than you are”… And this is exactly the kind of behaviour that kills critical thinking… Or people who use it to take down someone’s defense: “you’re saying that because you’re biased, therefore it doesn’t have any value”…
    — Skalidris


    Hello Skalidris,

    The way I understand your OP is as follows.

    These are your claims:
    Cognitive biases affect both the uncritical and the critical thinker.
    Knowledge of biases doesn’t hinder the formation of ego traps in either the critical thinker or the uncritical one.
    And finally, biases are unconscious, and can influence you regardless of your conscious knowledge of them.

    Did I get that right?

    Title question:
    The answer you give in your OP is that you know a lot of people who do not use cognitive biases as a tool for critical thinking, but turn them into an ego trap, which I see as concluding that critical thinking doesn't immunize against biases.

    Did I get that right?

    Now, I would have to assume a lot more if I were to work out how I could prove the usefulness of critical thinking given your claims and even your conclusion. So instead, I would prefer it if you could tell me what, specifically, would count as valid examples, arguments or experiences to you. Moreover, it would also be great if you had a kind of baseline with regard to what it would take for you to change your mind.

    I hope that is useful to you, as I would like to answer if I can understand better what it is you want.

    Caerulea-Lawrence
Get involved in philosophical discussions about knowledge, truth, language, consciousness, science, politics, religion, logic and mathematics, art, history, and lots more. No ads, no clutter, and very little agreement — just fascinating conversations.