Comments

  • Friendship - For Many And For None -
    You're welcome to chime in. That's a topic of interest to me; I'm a childless man. For myself, I do care about any old children above fit adults: so, on a train or a boat or on the street, say, I'll help out children unknown to me. Through charity and taxes we all support 'other people's children'. In some cultures 'one's own' young children are not so easily defined, in the caring relation: women working together, for example, may care for each other's children as if they were 'their own', as we would see it.

    So I'm inclined to see this as a question of norms of a basically mutually cooperative species who school, work and play together. But the norms nevertheless involve greater caring for some than for others, radiating out from those we know best, then those we are acquainted with, and so on. Norms of behaviour enable us to be economical like that; otherwise we would all have to follow Peter Singer and care for everyone on the planet. 'Care' can't work like that, although as above, I think the notion that we only really 'care' for ourselves is a modern narcissistic myth.
  • Friendship - For Many And For None -
    everyone cares only about themselves — Gus Lamarch

    I find it hard to imagine that a mother of an infant, for example, cares only about herself. How would you explain her care for the infant?
  • Some Remarks on Bedrock Beliefs
    some of the reasons why they are referred to as bedrock beliefs or foundational beliefs. — Sam26

    I'm interested in what you're saying and hope you don't mind continuing comments.

    As I've said I don't understand the primacy you give to 'belief'. What for example is added to the above phrase by the word 'beliefs'? It seems to me 'bedrock' and 'foundation' would be perfectly clear without them, as metaphors.

    It feels like you are shifting away from your Wittgensteinian core, as in my eyes language games are all about knowing-how: skilful use. 'Belief' is relatively unimportant then. As Harry says, to proclaim a belief is to exhibit the skill in making such proclamations; it's the tying of the shoelaces that counts. It counts for me, as something the shoelace-tier knows. They may believe they are solving the final problem that makes the universe whole, but to me they are tying shoes, and their beliefs are their own affair.
  • Explaining multiple realizability and its challenges
    Could you hurt it? Cause it to feel physical or emotional pain? — Wayfarer

    I am only proposing that you can give a social robot enough of the appearance of a carer for humans to feel comfortable interacting with it. It seems to me that AI is now sophisticated enough to give a machine, for example, parameters that would represent our two broad theories of other minds, i.e. simulation theory or theory theory. And the social robot would have a head start with its human because it would indeed appear to be reading the human's mind, as that would be its primary purpose: to provide for the care needed, including anticipating future needs. For example, if a doddery person falls over when they stand on more than one occasion, a machine could perfectly well begin to anticipate and help out with that. Clever dogs are already trained to do that to a limited degree.
  • Why aliens will never learn to speak our language
    One interesting thing if you watch people talking on the telephone is how they cannot help gesturing and communicating with their face and eyes to the interlocutor who isn't there. And the gesturing in a stifled form gets through, just as 'acting' in a radio play gets through to the listener; the hearer can recognise, for instance, the stilted delivery of someone being inexpressive because they're on a crowded train at the other end of the phone-line.
  • Ergodic and Butterfly Theories of History
    No need to withdraw the idea in my view, John. It's a very productive idea, even if fuzzy. And as soon as a fuzzy-in-itself idea becomes a metaphor, it finds itself proposing to solve world problems while the humble practitioner - you in this case :) - is still saying, 'But I'm not sure it actually works.'
  • Why aliens will never learn to speak our language
    I think the main reason we can't talk fluently with computers is that we talk, and interpret talk, with our bodies, not our 'minds'. That's how mind-reading and 'mirroring' happen, through mutual body-reading.

    As for the future: never say never. I saw 'Arrival', so I know Amy Adams will know how to communicate with the aliens, if no-one else can.
  • Explaining multiple realizability and its challenges
    Great quote! To debate it thoroughly would take us off-topic. My feeling is that social robotics - not Siri and Alexa, but the robots that provide care and comfort - have progressed to the point where they defy Descartes' first point. If it feels like a carer, if it acts like a carer, then it's a carer. But (Descartes' second point) it won't, indeed, go off-piste as humans would and tell you how moved it was by its grandfather's wartime experiences.
  • Explaining multiple realizability and its challenges
    (Hi Wayfarer, hope all is well) I don't see why a machine couldn't be developed that would know how to simulate the expression of pain, and would also know that 'pain' is usually, but not always, a sign that something is wrong, so would also either simulate the wrongness, or only express pain when something is wrong.

    I don't mean that I agree with the machine-metaphor behind reductionism, but I think it needs a subtler critique than this, now that we can envisage quasi-organisms.
  • Some Remarks on Bedrock Beliefs
    I think we must see the Moore-Wittgenstein debate about 'hands' differently. To me Wittgenstein's point is that these just are my hands: 'knowledge' or 'belief' don't come into it. If I chop down a tree with an axe, they are a tree and an axe. To 'believe' in what they are in some intermediary way is redundant. Bedrock just is bedrock. The test-case which I have heard people argue over is hallucination. If Kripke hacks down a tree that turns out to be a person, what is happening, from Kripke's point of view? The visiting psychologist may well say that Kripke believes this and that, but for Kripke, there was just a tree that unexpectedly bled, and an axe. On my version, at any rate.
  • Friendship - For Many And For None -
    I think Aristotle was referring to [homo]'philia'. — ovdtogt

    What justification do you have for saying that? In European languages such a term only originated in the 1920's. Hughes's commentary on Aristotelian philia cites examples from the Nicomachean Ethics of:

    young lovers (1156b2), lifelong friends (1156b12), cities with one another (1157a26), political or business contacts (1158a28), parents and children (1158b20), fellow-voyagers and fellow-soldiers (1159b28), members of the same religious society (1160a19), or of the same tribe (1161b14), a cobbler and the person who buys from him (1163b35). — Hughes on Aristotle
  • Some Remarks on Bedrock Beliefs
    Hi Sam, long time no see. I hope you are well. I have been infirm and studying philosophy. I think Wittgenstein addresses this question in para 208 of the P.I. (A philosopher called Meredith Williams has constructed a Wittgensteinian view of how the infant acquires language from this insight, which I recommend if you can get at it.)

    ...if a person has not yet got the concepts, I'll teach him to use the words by means of examples and by exercises - And when I do this, I do not communicate less to him than I know myself. In the course of this teaching, I'll show him the same colours, the same lengths, the same shapes... For example I'll teach him to continue an ornamental pattern 'uniformly' when told to do so. - And also to continue progressions. — Witt P.I. 208

    We learn conceptual norms by trust, initially, through language, and we subsequently develop our own, which we in turn pass on.

    these most basic of beliefs (states-of-mind) are not revealed linguistically, but are revealed in our actions (remember I'm talking prelinguistic beliefs) — Sam26

    For myself I think 'prelinguistic' is a red herring, though I know many are wedded to it. Instead I feel that it's a mistake to distinguish the linguistic from action. To use or interpret language is to act, it's not an alternative to action. It may be that this will eventually amount to a similar argument to the one from the 'prelinguistic', but for me it has different foundations: all our engagement in language games is a way of acting, as the form of life we are.
  • Ergodic and Butterfly Theories of History
    The most concrete efforts to apply ergodicity to 'history' have happened in Economics. Google 'ergodicity' and 'economics' and you'll find a tangle of approaches.

    The difficulties faced in Economics apply to History in general:
    - you can't do experiments, you only have post facto data
    - most theories, whether ergodic or not, have a terribly poor ex ante record, i.e. prediction success is low
    - the discipline itself is rooted in linear, equilibrium-based and 'rational' assumptions, and many of these assumptions insinuate themselves into ideas that purport to be ergodic, which need instead to be non-linear, freewheeling and arational/stochastic.

    All the same there has been fruitful mutual work between economists and physicists/mathematicians, though it does not rest on a secure set of foundations.

    The foundations would need a set of abstractions that could be agreed to be workable. Terms like 'Capitalism' and 'Communism' for instance come so laden with political baggage that they can be hard to use, e.g. was the Soviet Union a form of state capitalism rather than communism? It might be that one could begin with a narrow account of a relatively isolated State and make some sort of historical sense.
  • Friendship - For Many And For None -
    friendships are made of interests, of a "Union of Egoists" — Gus Lamarch

    If you begin with an individualistic premise like this, you come to an individualistic conclusion. This isn't a surprise.

    I might begin with, say, 'Human beings are naturally cooperative social animals.' Then I might arrive at the sort of ethical system Aristotle, for example, arrived at. In his Nicomachean Ethics the chapters that modern people usually skip are where he founds the virtuous Athenian life in 'philia', or the deep friendship of a small number of people.
  • The Problem with Escapism
    1. None of God’s actions toward persons is unjust or unloving — Marissa

    If you take ordinary human suffering that has 'natural' causes - a painful childhood death from cancer, say, or the deaths from a tsunami or earthquake - many people will regard such events as somehow 'unjust or unloving', if some sort of super-person was their cause.

    You also have what for me is a perennial problem in making analytic quasi-logical arguments about 'God': there seem to be many premises behind the visible ones, which Christians like yourself take for granted. Here's one: in what sense is 'God' a kind of super-person who can allow or inflict verdicts and their consequences? Can the idea be given meaning for a person who doesn't use 'God' in their vocabulary of philosophical accounts?
  • The Blind Spot of Science and the Neglect of Lived Experience
    How I feel about this at the moment is that ideas that separate 'consciousness' or 'lived experience' from our other endeavours like this are not going to address such a question in the right way. Our science is saturated with lived experience: without experience there is no science. Our own preoccupations are reflected back at us by the structure and content of our science. If you talk to practising scientists about their practices, in my experience, they are fantastically preoccupied with their equipment and the danger that their observations are wrongly influenced by themselves or their machinery.

    Social science is quite accustomed to accommodating the presence of the observer; it's only some areas of 'natural' science that are in question. Any science that claims to look at other living creatures or phenomena is also informed by our prejudices; consider how our knowledge of the ways of the animal kingdom has shifted as our views of how to live within that kingdom have shifted.

    I have wondered lately if part of what's at issue is our sense of responsibility for scientific work. I'm not a panpsychist but I feel that every time we talk about atoms, say, we remind, or ought to remind, ourselves of atomic bombs. Our curiosity is inseparable from the uses we make of our curiosity's conclusions, but we try artificially to separate them so that the 'discovery' of sub-atomic particles and how they interact, say, is somehow heroic, and somehow separable from the villainous mass destruction that only became possible with that discovery.
  • The moralistic and the naturalistic fallacy
    Your post bears more than a passing resemblance to an 11-year-old blog post, https://www.psychologytoday.com/gb/blog/the-scientific-fundamentalist/200810/two-logical-fallacies-we-must-avoid

    Kanazawa's conclusion is that we can avoid these fallacies by never speaking about 'ought'. I don't think that will do. 'Ought' is easily smuggled in to the most analytic-looking remark. Evolutionary psychology is in my view especially prone to that: it tends to omit the historical situation of the scientific 'fact' it uses in an argument because its exponents have prejudices of their own, as we all do. Essentialism about women and men, for example, easily follows, for who could be more typical of all women and men who ever lived than 23 Columbia University students having a scan for a neuroscience experiment?
  • Is “Water is H2O” a posteriori necessary truth?
    It's an odd thing that a glass of water is not entirely H2O. Most water, in glasses, lakes, taps/faucets, also 'contains' what we call, if pushed, 'impurities'. For some liquid to be constituted only of H2O we need to add 'pure' or 'distilled' in front of it. So I've always thought 'Water is H2O' to be mistaken as an example of anything necessary.

    When popular news features a visit to a comet by a probe, 'water' is usually a feature of what has been found: but this will often turn out to be D2O (see Wayfarer's comments above), or at least to contain more deuterium than water on earth generally does when 'naturally' occurring.

    As for Kripke, this is part of an arcane post-Kantian game. But it might matter (only if Terrapin Station were wrong) to how we think about things.
  • Aristotle: “Poetics”
    I used to use the Poetics to teach scriptwriting and screenwriting. It's enjoyably succinct, and has that slightly baffling tendency of Aristotle's (I've only realised in later life, through also reading the Nicomachean Ethics and other works) to move between description and prescription, which is a good blurring for students to argue against, in my experience :)

    I long ago decided it was best to call mimesis 'mimesis'. It's had a lot of meanings in modern times as well as its different meanings 2300 years ago. There is a whole Continental strand of 20th century thought that debates 'mimesis', e.g. Adorno after Walter Benjamin. But they all hold on to the origin of the creative process in 'imitation'. Music, for instance, imitates certain feelings by certain tricks; dance expresses/represents ideas and feelings by the dancer and her audience inhabiting, as it were, a certain way of performing. Acting might be quintessentially mimetic on this reading, though it would depend on the skill of the writer as well as of the actor.

    If I may say so, I think you are worrying unnecessarily about 'objects', which slip around in meaning in translations of Aristotle. For instance if you fast forward to xxv of the Poetics you'll find this:

    The poet being an imitator, like a painter or any other artist, must of necessity imitate one of three objects - things as they were or are, things as they are said or thought to be, or things as they ought to be. — Poetics

    That covers a pretty wide area!

    Plenty is known about the history of the chorus, which was large until Aeschylus reduced its size, enabling the greats to utilize the chorus for more immediate, intimate and dramatic purposes.
  • Evolutionary Psychology and the Computer Mind
    Dualists would be the most ardent opposition to such a theory for obvious reasons. — Harry Hindu

    The Stanford entry you quote does go on to mention, as it had already stated at the outset, that 'there is a broad consensus among philosophers of science that evolutionary psychology is a deeply flawed enterprise.' These allegations of flaws are not necessarily expressed ardently, but they are powerful. Many of them come from people who believe that evolutionary biology provides a more secure basis for scientific progress and that evolutionary psychology bears the heavy weight of biases that its practitioners hold.
  • a world of mass hallucination
    https://www.disclose.tv/physicists-are-starting-to-suspect-physical-reality-is-an-illusion-364016

    In this article it speculates the only thing real is information and how we perceive reality is a product of our brain or at the very least what we perceive as our brain.

    Questions and comments? — christian2017

    The blog in Scientific American referenced by this article is actually opposed to the view you summarise here. Kastrup is against 'information realism' and proposes instead that the mental universe is 'a transpersonal field of mentation that presents itself to us as physicality—with its concreteness, solidity and definiteness—once our personal mental processes interact with it through observation.'
  • Philosopher Roger Scruton Has Been Sacked for Islamophobia and Antisemitism
    ...no excuse for inventing ‘Islamophobia’ as an explanation of the negative views that many people hold about Islam. The invention of this term by activists of the Muslim Brotherhood is a rhetorical trick, though it seems that my habit of pointing this out is a further proof that I am guilty. — Scruton

    The idea that this was invented by Muslim Brotherhood activists had I thought been long discredited. Here a person calling themselves a counter-jihadist is eager to explain the error in case annoying lefties seize on it. The term was propagated into general circulation by the UK Runnymede Trust's report of 1997. The claim by activists was just their vanity.

    To me meaning is mostly use: the term was promoted by liberal multi-culturalists and has since been adopted by lots of factions. To claim its origin among Muslim activists is a rhetorical trick by Scruton, cleverly disguised by his counter-claim that it was a rhetorical trick on others' part. He's a clever bloke and knows what he's doing. He knows perfectly well that 'the Soros empire' is an anti-Semitic trope. Orban in Hungary specialises in such language: language loaded with implication without ever quite speaking its name. Of course Orban is a politician trying to hold his ground given that 20% of the Hungarian population support an openly anti-Semitic far-right organisation (with which, sadly, their left is also flirting).

    To add: the journalism in the New Statesman was crass, 'phobia' is usually a rubbish term because it condescends to real fear, and Scruton writes brilliantly on music, for example. But he is deeply reactionary, in a sometimes charming and seductive way, and his appointment as commissioner of building late last year was a symbolic expression of the terrible state of British Conservatism: he knows nothing about building but has the right sort of reactionary views on aesthetics. There are plenty of right-wing public administrators, engineers and architects who would have been infinitely preferable, as at least they would have got something done and known how it was done: but the Tories have reached a terrible level of ineptitude. That's where the UK is. That's why 'we' are bungling Brexit so spectacularly.

    Thank goodness for philosophy, I say. I'm going back to my Wittgenstein immediately. Somewhere comforting to bury my head.
  • Art highlights the elitism of opinion
    I think you still know what I mean with regards to the 'elite' who are the subject of the OP. In some respects, its [LOTR] inclusion on the curriculum rather proves the point. There was obviously some suggestion to include it, a complete lack of compelling evidence to the contrary, so on what grounds the previous snobbish dismissal? — Isaac

    I'm an art-maker, not an elite member of a critical group; I've spent much of my life writing prose, dialogue, music, poetry, songs, all for a living. I don't understand the either/or-ness of this debate. I've written episodes of tv soap operas watched by millions, and I've written obscure poems read by thirteen people. The arts (as I'd prefer to call the body of work) are a broad church. It's quite a common thing historically for popular art to be denigrated by one generation of arty-farts, then revered by the next. Take Rodgers and Hammerstein's musicals, Simenon novels, Shakespeare indeed.

    When you've worked hard to create art of some kind you feel the work and the knowledge in it. Art is something humans have done since language began: the earliest musical pipes are 50,000 years old, and so on. We make art to help make sense of the world we find ourselves in. Sometimes our passing entertainment is of a high quality and people call it art, sometimes it isn't. I think Michael Bay, for example, is a highly-skilled entrepreneur, comparable to great showmen, but he isn't by my lights an artist. I know lots of people who work in film, and they have artistic standards they work towards. There is a body of practical opinion in any art-form which values some work more than others, and these valuations derive from experience and reflection. If you don't value such opinion, then to my mind you're missing out on part of the pleasure and understanding you can derive from any given art.

    Lastly this is all very 'consumerist' to me. Art is something humans make not just what we gawp at. If you try to make the simplest video lasting more than ten seconds you start to feel the art in it: both skill, and shaping of understanding. Art is work, even to enjoy it as a consumer. To enjoy Shakespeare you have to do some background work: I think that's rewarding, because even now I wept the other month at a brilliant performance of King Lear in Manchester, and my tears and thoughts afterwards felt richer to me for the effort I've put in to understand the language and the shape of that play. Of course I've had to wade through some pretentious crap too to get there, but I've also read brilliant educators: take Anne Righter's (Barton's) 'Shakespeare and the idea of the play', a brilliant book I first read 50 years ago that I still remember with pleasure.

    With this experience of my own, by the way, I think there is a perfectly good case for claiming Lord of the Rings is second-rate: verbose, derivative, with prose that isn't carefully styled, and with sometimes childish plotting and characterisation and not in a good way. That's my considered view. I don't think it's 'snobbish' or a 'dismissal' nor do I think it 'compelling evidence'. It's just something to weigh against other views. We only have our opinions, but they can be considered and well-informed, and I will respect them more if they are.
  • Art highlights the elitism of opinion
    The Lord of the Rings vs Pride and Prejudice would not be so clear cut, and yet I don't imagine The Lord of the Rings making it on to the English Literature curriculum any time soon. — Isaac

    You're mistaken. Lord of the Rings is on the curriculum in many school districts in many countries. Houghton Mifflin publish a comprehensive pack for secondary school teachers. http://www.houghtonmifflinbooks.com/features/lordoftheringstrilogy/lessons/
  • Are proper names countable?

    Just to add...any given list of proper names of actual people will have duplicates...triplicates...Like Socrates the footballer and Socrates the philosopher...the many Kims of Korea...the andrewks and mcdoodles...

    So where will all this counting of the different names get us?
  • A Substantive Philosophical Issue
    I don't think the objective/subjective split can be resolved by just looking at the words being used. That's because while we experience being part of a world with other people, life, objects and events, we also experience a world of inner dialog, imagination, dreams, memories and being in our own skin that nobody else can experience. Others can often infer some of our experience, and we can relate part of it to them in language, which is public, but it is still our own alone to experience. — Marchesk

    Looking at the words used here, it is, however, interesting that you use the first person plural. You don't refer to 'my' own skin but 'our'. You refer to 'we' not 'I'.

    Of course this is a rhetorical device. But to me the device in this context does appeal to a commonality of experience even as it insists that one's own is unique. There's something paradoxical going on. Your words propose that I will understand what you're proposing because I will experience things that way too. And I do!

    To me then this whole paragraph adopts not a subjective nor an objective approach, but a sort of mutual approach, and that's often how we are. Look inside my mind, here: see, this is how it stands in here. I wandered lonely as a cloud. Know what I mean?
  • On Disidentification.
    Now, I don't know how to (dis)-identify with depression anymore, it's been with me for so long, that I've become accustomed to it — Posty McPostface

    I've had meetings with therapists where I've pretended to accept I have depression, but I'm always unconvincing, I feel. In the long run I've just concluded: I have a melancholy disposition, and the way things have turned out seems to demonstrate how right I was to be melancholy.

    But depression is a thing in a systematic world of diagnoses and therapies, pills and cures, that I don't subscribe to. I have subscribed to it, but not now. Is this dis-identification?

    A more fundamental example for my personal situation over a period of time is 'alcoholism'. I drink too much and some days my primary thoughts have been about where the next drink is coming from. Still, I got most of my best ideas when boozed up, even though I needed to sober up to get them straight and in order. People who want to help me use this word 'alcoholic' and it bugs me. The founders of AA died, one of drink after he relapsed, the other of lung cancer as an addicted smoker. This stuff about helplessness, a higher power and disease that is built into the system that uses the term 'alcoholic' just leaves me alienated.

    I would add: therein lies a danger of disidentification. Alienation. If you refuse the label you're given when you go into the therapeutic room you'll find yourself isolated, and that itself may not be wise. I remember Lawrence Block novels with affection: he had a melancholy detective called Matthew Scudder who went to AA meetings whenever he could, not out of belief, but in a habitual practice that to me resembled religious practice: you may not believe the theology ('alcoholism', 'depression') - but to enact the rituals, to join in the fellowship, to experience the mutuality, all these activities are tremendously helpful and enhance your feeling of your own humanity. So maybe it's best to keep going to the Depression/Booze clinic, and take some of the medication if it doesn't ruin your creativity too much, and keep doing your best.
  • The Philosophy of Language and It's Importance
    I am fascinated by the philosophy of language and at an advanced age am in the middle of a Master's where I am covering lots of topics but focused on language. I don't think Wittgenstein claims as much as some here think: 'some' problems can indeed be clarified by attention to the use of language, but not 'all' or 'most'.

    I find more depth in Wittgenstein the more I read him. (We shouldn't sanctify him, of course; he was in some ways a pretty unpleasant person.) I've become interested in 'language games' and in particular how much talk about language assumes that monologue-style statements, as if spoken to oneself or written in papers, are the meaningful stuff of our language where we really really mean things.

    There's a movement in sociolinguistics to consider dialogism as more important than we have previously thought - which potentially takes us back to Plato, but also feels present to me in much of Witt's later chunterings to himself, his uses of remarks in quotes by speakers whose identity we don't know.

    An odd starting point for me is the occasional rudeness or abruptness of Siri and other a.i. assistants. And indeed of humans in call centres trained on systematic scripts. They don't understand 'dialogue' or 'conversation'. To me this means they don't understand 'language games'. They are underpinned by a philosophy that talk is 'speech acts': monologues delivered to audiences. - Whereas actual talk is exchange, conversation, a relation between humans that also involves gesture, mood, scent.

    I think there are ways back here into some analytic approaches, e.g. Davidsonian interpretation, but I am trying to get my head round what the routes are.

    I've wandered off into vagueness, just explaining where I've got to.
  • Crime and Extreme Punishment: The Death Penalty in America
    The death penalty being unpopular or popular hasn't changed that much in the last decade. — ArguingWAristotleTiff

    Two points here. 1. You mean, in the United States. It's something Americans on the forum are often careless about, and it says something about whether the rest of the world counts.

    2. As to the evidence, the last I saw was that USA support for the death penalty was in quite steep historical decline: https://deathpenaltyinfo.org/national-polls-and-studies But maybe in the last year it has changed.
  • A Brief History of Metaphysics
    Yes, I think what you say is kind of true, though not only of philosophy, but of science, economics, anthropology; in short of all domains of inquiry. As absolute presuppositions are also operative in the kinds of everyday commonsense beliefs that we could never foresee being overturned, they may be said to resemble hinge propositions. The difference is that in the domains of inquiry the absolute presuppositions are things we can be said to necessarily suppose, rather than believe, in order to carry out any investigation at all. — Janus

    Thanks, the quotes about Peirce are very illuminating. There are many strains of analytic philosophy that go on a lot about 'belief'. I've been reading a bloke called Duncan Pritchard who holds that for Wittgenstein hinges are not 'beliefs' at all, indeed they must stand outside what we think of as 'beliefs' for rational thought to function. Thus hinge presuppositions hold the same place in the intellectual firmament as Collingwood's absolute presuppositions. We recognise that these are contingent - as Tim Wood said above - on time, place, person: Wittgenstein himself, for instance, banged on about no man having visited the moon as a hinge, a basis for rational reflection, when within 20 years it would cease to be so.

    Looked at in this way I don't think hinge propositions could be said to be true, false or anywhere in between. Instead they're the foundations upon which claims of truth are built. So while it's a skeptical position it also claims to be an answer to a certain kind of skepticism, since it asserts that if we don't start with some presuppositions or other, there is nothing to talk about, indeed, there's probably no talking :)
  • What do you call this?
    Well, yes, formal languages are devoid of this feature. So, then why are non-formal languages so rife with the possibility for inconsistency? — Posty McPostface

    Non-formal languages are languages of communication. There was a notion from Frege onwards that somehow a more 'scientific' language might be developed, but it never comes to pass.

    There is, for utmost clarity, the language of mathematics. There is, for how we get along, natural language. The latter is just untame-able. Why would it not be? What would be the virtue of consistency, certainly in matters other than factual ones?
  • What do you call this?
    Then, shouldn't every text be made so that there's the least amount of possibility of contradiction in it, therefore someone may feel as though they understand it appropriately? But, how do you ensure this important feature of any text??? — Posty McPostface

    I confess I feel rather the opposite. Why should texts obey some principle of non-contradiction? This would be the dream of an authoritarian, surely? Non-contradiction happens in logic, perhaps, but as soon as we use natural language contradiction creeps in. And creeps, and creeps.

    Certain texts may be regarded as some sort of guidance to behaviour, but how are humans to be governed in this way? As soon as I read 'Thou shalt not...' written, say, by some stuffy patriarch, I want to go looking for a fellow-transgressor.

    Hello again, btw, Posty. Hope you're well.
  • What do you call this?
    when someone behaves contrary to what (to serve as an example) the Bible preaches — Posty McPostface

    I'm not a Christian but, or and, I don't understand the Bible as a consistent body of work that preaches one thing rather than another. Actually I find some of the most enjoyable sections, like the Song of Solomon or the parables of Jesus, to be where the narrative seems nothing like a guide to how one is to behave.

    So I feel that your target is more likely a person who avows one thing one day, and a contradictory thing another day, and won't see that they're contradictory. Faced with such people I confess that lately I am a bad Samaritan and pass by on the other side.
  • A Brief History of Metaphysics
    As I understand it, according to Collingwood absolute presuppositions are the fundamental principles upon which the fields of human inquiry depend. They are understood to be different than propositions in that it is inappropriate to speak about them in terms of truth and falsity. — Janus

    My reading of Collingwood is that it's about questioning and absolute presuppositions. When you study a philosopher you come up with questions met with answers which lead to further questions which lead...eventually, to the point where the views of that philosopher offer no answer. Here are the absolute presuppositions which, I agree with Banno, bear a remarkable resemblance to hinge propositions.

    As someone on the old forum said to me, one very odd thing about Collingwood is that he held these seriously anti-ontological views (I don't think they are anti-epistemological) but he also went to church every Sunday and engaged in acts of worship.
  • How do we justify logic?
    I'm interested in inference, which is conventionally described as something inside logic, a step of reasoning.

    To my mind logic is a human construct. It has its own particular subset of language. I don't buy the idea of a closer relationship to 'language' in general than that.

    I see other animals, however, making inferences, about food, trails, whether you're an enemy or friend. I don't believe they have logic, or language. But they have inferential systems.
  • Justification in Practical Reason
    What art or skill are we using when we unpick assumptions or presuppositions? When we question, say, why a rule is a rule? I think it’s fair to say the outcome is the result of practical reasoning, but it’s hard to be clear what that involves. Sometimes an imaginative leap is required. An imaginative leap isn’t reasoning, exactly. It sometimes enables clearer reasoning. Oddly for me an imaginative leap in an argument feels like the same sort of thing as when I’ve been struggling over a song or story I’m trying to write then seem to envisage a solution. Suddenly the problem has a different shape, to which different parameters apply. Now I can reason / sing / fictionalise anew.
  • To See Everything Just As It Is
    There's a paper online by Eric Steinhart in which he discusses one of the implications of this view. For Nietzsche 'sameness' or 'identity' is only an appearance. It's a highly sophisticated appearance. Our human ability to postulate sameness / endurance enables us to conceptualize and count. But it is only an appearance, in a dialectic with 'difference'.
  • What now?
    So, what are your thoughts given the above? — Posty McPostface

    My own experience is that a certain amount of dis-content motivates me, and oddly enough, contentedness is quite bad for me in the medium-term. A challenge eggs me on.
  • On persuasion in theory
    The scientist can not only be a good observer of the world - they must also be good at presenting what they see, keeping the audience interested long enough to get the point across. — darthbarracuda

    I don't know if this is a counter-example or a variant on your view. The analytic philosophy of language is replete with papers that sound mathematical and/or logical. They feed successfully off each other, creating a matrix of terminology and argument you have to try to understand before you can even criticise what they're saying. Often it's good to throw in a few symbols and things that look like equations.

    I suspect a lot of them are guff. But it's hard to penetrate their world sufficiently to be sure; I mean, I've gone back to school to try to figure it out and I can't.