## Gettier's Case II Is Bewitchment


So what about the distinctive syllogism?

If I assign to A a probability of r, and to B a probability of s, what probability should I assign to ~A & B? (That is, to ~A & (A v B).) Assuming A and B are independent, that would be (1 - r)s. Since the probability of A v B is r + s - rs, it's also pr(A) + pr(~A & B). If pr(A v B) = 1, then as pr(A) goes to 0, pr(B) goes to 1. So there's no weirdness in treating the usual disjunctive syllogism as a special case of standard probability.
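The arithmetic above can be sketched in a few lines. This assumes A and B independent, with illustrative values for r and s (nothing in the text fixes them here):

```python
# Sketch of the probability identities above, assuming A and B independent.
r, s = 0.6, 0.3  # illustrative credences for A and B

pr_notA_and_B = (1 - r) * s        # pr(~A & B), i.e. pr(~A & (A v B))
pr_A_or_B = r + s - r * s          # pr(A v B)

# pr(A v B) = pr(A) + pr(~A & B)
assert abs(pr_A_or_B - (r + pr_notA_and_B)) < 1e-12

# If pr(A v B) = 1, then pr(B) >= pr(A v B) - pr(A) = 1 - pr(A),
# so as pr(A) goes to 0, pr(B) is forced to 1.
```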

I don't have Smith assigning a probability of 1 to Jones owning a Ford, and I don't have him assigning a probability of 0 to Brown being in, say, Barcelona. Those are assumptions of mine that I think are defensible from the text-- and from life-- but there's certainly room to argue otherwise.

So what should Smith's view be of the possibility that Jones does not own a Ford but Brown is indeed in Barcelona? Given probabilities of 0.90 for the Ford and 0.01 for Barcelona, he should assign a probability of 0.001 to Barcelona but no Ford. As it should be, since pr(Ford v Barcelona) = 0.901 = pr(Ford) + pr(no Ford & Barcelona). So that's at least consistent.
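Running Smith's numbers through the same identities, treating the Ford and Barcelona hypotheses as independent:

```python
# Smith's credences, as assumed in the text.
pr_ford = 0.90
pr_barcelona = 0.01  # treated as independent of the Ford hypothesis

# pr(no Ford & Barcelona) = (1 - 0.90) * 0.01
pr_no_ford_but_barcelona = (1 - pr_ford) * pr_barcelona

# pr(Ford v Barcelona) = 0.90 + 0.01 - 0.90 * 0.01
pr_ford_or_barcelona = pr_ford + pr_barcelona - pr_ford * pr_barcelona

assert abs(pr_no_ford_but_barcelona - 0.001) < 1e-12
assert abs(pr_ford_or_barcelona - 0.901) < 1e-12
# Consistency: pr(Ford v Barcelona) = pr(Ford) + pr(no Ford & Barcelona)
assert abs(pr_ford_or_barcelona - (pr_ford + pr_no_ford_but_barcelona)) < 1e-12
```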

But it has to be admitted that what I'm doing here is not-- what should we call it?-- "simply" inferring one belief from another. I allow Smith to form the prediction that Ford or Barcelona based on his hypothesis that Jones owns a Ford, but then in order to assign probabilities to it and to the disjunctive syllogism (to its premises actually, since he already has a prior for Barcelona), he does the math.

Thus I never see Smith being in the position of saying, "Probably A, but if not then definitely B."

Where do you derive this principle from? It isn't a law of logic. If I might use an analogy, the higher you want to build, the more secure you need to make your foundations.

Yeah, I have no justification for that (my thing about the credence you give a conclusion). I think it's a reasonable rule of thumb, something like Hume's saying that "the wise man proportions his belief to the evidence." For instance, in the case at hand of addition (disjunction introduction), the likelihood of A v B is at least as high as the likelihood of A, but that's because I'm smuggling in a prior for B.

As a matter of fact-- and this gets to your second point-- if A entails B, then the likelihood of B is at least as high as that of A. (If all F are G, there are at least as many G as F.)

So while there's intuitive support for the general idea of firm foundations and less and less certainty the farther your chain of inference carries you from those foundations, you have to be careful. If your theory as a whole is thought of as just a big conjunction of all of your current beliefs, and if some of those are less than certain, then all of them being true is less likely than some of them taken alone, because when you multiply the independent ones, their product is necessarily smaller. Sure. But our theories are more complicated than big conjunctions. There's a lot of dependence, entailments, conditional probabilities and disjunctions in there.
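The conjunction point can be made concrete. With hypothetical independent credences, each less than 1, the joint probability of the whole "theory" is smaller than any single conjunct:

```python
import math

# Hypothetical independent credences in four separate beliefs.
beliefs = [0.95, 0.90, 0.99, 0.80]

# Probability that all of them are true (independence assumed).
joint = math.prod(beliefs)

# The big conjunction is less likely than any of its conjuncts taken alone.
assert joint < min(beliefs)
```

As the post says, though, this only models a theory as a bare conjunction of independent claims; entailments and conditional dependence can make the joint probability much higher than this product.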

So I don't see Smith as overstepping the bounds of reason and landing in a puddle of nonsense. I see him as a victim of chance. Something extraordinarily unlikely happens, and it will challenge his otherwise orderly process of belief formation.
We can extend the treatment to "certainty": So long as we mean mere practical certainty or a feeling of sureness, but not absolute theoretical certainty, certainty is compatible with doubt.

it just seems to be the case that minds like ours never or almost never attain absolute certainty.

Here I think we are getting closer to what is going on in the Theaetetus.

One way we can be certain is when we take things as the bedrock of our discussion. In this sense, doubt is dismissed as not having a place in the discussion. So, for example, this is not a discussion about the comparative benefits of diesel and petrol engines, and thinking it so is to misunderstand what is going on. Or, to use the all-pervasive example, one does not doubt that a bishop moves diagonally while playing chess.

The problem here is the philosopher's game of putting "absolute" in front of "certainty" and thinking that this means something. Outside of philosophy, minds like ours are always or almost always certain. Few folk check that they have an arm before they reach for the fork. It's not the sort of thing that one doubts, outside the philosopher's parlour.

And here is where the logos differs from justification. @Hanover brought this to mind elsewhere. When you learn that the cup is red (again), are you learning something about the cup, or something about the use of the word "red"? Well, one hand washes the other. When you learn that r justifies p, you learn more than just that r materially implies p; you learn a new way of using "r" and "p". It does not automatically follow that, if r justifies p, it justifies p v q.
It's a good first step; but so much more is involved. Consider the ancient distinction that splits knowing into knowing how... and knowing that..., and then pretends that knowhow has no place in philosophy.

Until it was pointed out that philosophers ought to know how to use words.

To paraphrase, there are ways of knowing that are not exhibited in statements, but shown in what we do.

These are missing from Gettier.