• AJJ
    909
    What's the heart of the matter? The heart is that which is actually going on.Zweistein

    My understanding is that chance entails lots of brute contingencies. Why does A happen and not B? It just does and it isn’t possible for there to be an explanation, since this would remove the chance.
  • sime
    1k
    Are we saying the same thing only you maybe more technically correctly? Or something different?
    All I'm saying is that if I throw that fair die, I shall get any of 1 through 6, and that I shall expect each to occur about 1/6th of the time.
    tim wood

    I was responding to the common belief that chance represents ignorance. We know from experience that the odds for a "fair die" landing on any side, if tossed in a typical fashion, are roughly 1/6 in the sense of relative frequencies. This is essentially the definition of what a "fair die" is.

    But a priori, we don't even know that. For in the case of an unknown die that isn't necessarily fair, all we know a priori is that the odds for any outcome are between 0 and 1. Nevertheless, there persists a convention which assigns a uniform distribution in the case of an unknown die. But this is misleading, for it conflates knowledge of a fair die with ignorance of an unknown die, and it also leads to inferential biases in the case of an unknown die that aren't warranted.

    If all you know is that an event has n possible outcomes, there is nothing more that can be said, and chance cannot be quantified.
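    The frequency notion of fairness discussed above can be sketched in a few lines of Python. This is a minimal illustration, not from the thread: the face weights are illustrative, and the point is only that "fair" is a claim about long-run relative frequencies, which a loaded die visibly fails.

    ```python
    import random
    from collections import Counter

    def relative_frequencies(weights, trials=100_000, seed=0):
        """Roll a six-sided die with the given face weights and
        return the observed relative frequency of each face."""
        rng = random.Random(seed)
        faces = [1, 2, 3, 4, 5, 6]
        rolls = rng.choices(faces, weights=weights, k=trials)
        counts = Counter(rolls)
        return {face: counts[face] / trials for face in faces}

    # A fair die: every face settles near 1/6 in the long run.
    fair = relative_frequencies([1, 1, 1, 1, 1, 1])

    # A loaded die (illustrative weights): the frequencies betray
    # the bias, which is why "fair" is a claim about frequencies.
    loaded = relative_frequencies([3, 1, 1, 1, 1, 1])

    print(fair)
    print(loaded)
    ```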
  • T Clark
    13k
    My understanding is that chance entails lots of brute contingencies. Why does A happen and not B? It just does and it isn’t possible for there to be an explanation, since this would remove the chance.AJJ

    I think the determinist's response is that each of the "lots of brute contingencies" is determined, even if we don't know what they are or how they are caused. In this view, chance is just another word for our ignorance of what determines what. I don't buy that.
  • AJJ
    909


    So it looks like the choice is between a view that forgoes further explanation and one that claims but can’t demonstrate its explanation.
  • TheMadFool
    13.8k
    Of course induction works.T Clark

    So, the probability that a law of nature will break down is nil?

    Prove it!
  • NOS4A2
    8.3k


    If each state is determined by its anterior state the first state wouldn’t exist because there was no anterior state to determine it.

    In any case, we couldn’t know the initial state of the dice and the exact interactions with its environment because by the time we did the initial state and exact interactions would be different. This is not because we are too slow or inadequate at examining states, but because there are no states.
  • T Clark
    13k
    So, the probability that a law of nature will break down is nil?TheMadFool

    It doesn't have to be perfect. It only has to work well enough to be useful and understandable enough so we can figure the uncertainties. You use induction all the time.
  • T Clark
    13k
    So it looks like the choice is between a view that forgoes further explanation and one that claims but can’t demonstrate its explanation.AJJ

    I don't consider the idea of determinism very useful in any but the simplest situations. The scientific generalizations, including laws, that we develop describe how the world happens to work, not how it has to work. The law of conservation of matter and energy does not cause matter and energy to be conserved.
  • AJJ
    909


    This seems fair. I’m inclined to accept that chance outcomes exist and have no explanation; explanation being something that appears to run out regardless of the world view held.
  • sime
    1k
    The author J.R.R. Tolkien determined what happened in the world of Middle-Earth, but he didn't specify what would have happened in the story if alternative courses of action were taken. So is the world of Middle-Earth deterministic, non-deterministic, or neither?

    Is even the world of a choose-your-own adventure story, where alternative courses of action can be chosen by the reader, describable with either of these adjectives?
  • T Clark
    13k
    explanation being something that appears to run out regardless of the world view held.AJJ

    And that is metaphysics.
  • AJJ
    909


    If there is an explanation for why A happens rather than B, in what meaningful sense is A a chance outcome as opposed to a determined one?
  • AJJ
    909


    You’re referring to it as chance, but it isn’t really chance; it’s just ignorance of a determined outcome. If true chance outcomes exist then they necessarily lack an explanation.
  • AJJ
    909
    We don't know on which side a coin falls. So each side has the same chance. More or less.CasaNostra

    Sure, but ignorance of an outcome is not what I’ve been referring to as chance. If someone believes only that sort of chance exists then I don’t see what distinguishes their view from a deterministic one.

    The difference with QM is that QM indeed offers a pure chance. Without explanation there. Not in principle. But how can this be?CasaNostra

    I’ve contended that true chance outcomes are brute contingencies—they don’t have an explanation because they necessarily can’t have one.
  • AJJ
    909
    There is no difference. The outcome is determined but luckily we don't know what it's gonna be.CasaNostra

    Can you demonstrate that this is always (or ever?) the case for any event?

    Why they can't have an explanation?CasaNostra

    Because then they wouldn’t be chance outcomes, but determined ones we only call chance because of our ignorance.

    There are no brute contingencies.CasaNostra

    Unless you can demonstrate this I’m fine believing there are.
  • AJJ
    909


    I accept that there’s physical stuff governing the roll of the dice; I don’t accept that you could predict the outcome even if you knew everything.
  • AJJ
    909


    What I should have said is that I don’t accept that everything can be known such that the outcome of the roll could be predicted.
  • AJJ
    909


    Fair, guess we’re not substantially in disagreement then.
  • AJJ
    909


    Well, we still disagree here. I don’t mind believing that pure chance causes those outcomes; that they’re random and have no explanation.
  • sime
    1k
    In Automata Theory, non-determinism is the existence of two applicable state-transitions from a given state. This isn't an epistemic notion.
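    That non-epistemic sense of non-determinism can be made concrete with a toy nondeterministic finite automaton. This is a minimal sketch (the state and symbol names are illustrative, not from the thread): from state "q0" the symbol "a" permits two distinct transitions, so the branching is a structural fact about the machine, not ignorance of it.

    ```python
    from itertools import chain

    # delta maps (state, symbol) -> set of successor states
    delta = {
        ("q0", "a"): {"q1", "q2"},   # two applicable transitions: nondeterminism
        ("q1", "b"): {"q_accept"},
        ("q2", "b"): {"q_reject"},
    }

    def step(states, symbol):
        """Advance every currently-possible state by one symbol."""
        return set(chain.from_iterable(
            delta.get((s, symbol), set()) for s in states))

    def run(word, start="q0"):
        states = {start}
        for symbol in word:
            states = step(states, symbol)
        return states

    # Reading "ab" the machine ends in *both* q_accept and q_reject:
    # the two histories coexist in the transition relation itself.
    print(run("ab"))  # {'q_accept', 'q_reject'} (set order may vary)
    ```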
  • apokrisis
    6.8k
    How is it that the normal distribution occurs all the time? It seems at the macro-level, at least, the more likely events occur more of the time.

    At the scale of the very small, that rule seems violated. Which may be no more than a case of different rules - very different rules. Or no rules at all. Or a third case: rules, but not that we can determine because of fundamental limits to our ability to determine rules - at least so far.
    tim wood

    The Galton board is a good example. But doesn’t it illustrate the way that micro chance and macro determinism are yoked together?

    The board engineers things so that every peg gives a 50-50 probability of deflecting a falling ball to its left or right. The randomness is deliberately maximised at this level - or else it is a loaded board. We can argue that no board could ever be so perfectly engineered. Each peg might be infinitesimally biased. But the point of the exercise is to approach the limit of pure randomness at this level.

    Then given a perfect board, it will produce a perfectly determined probability distribution. At the macro level, you can be absolutely certain of a nice and tidy Gaussian distribution emerging from enough trials.

    Each ball hits 7 pegs on the way down. Each deflection is a 50-50 split. There is only one way to hit the outside bin - 7 left or right deflections in a row. And then 70 ways to land in the central two bins as an even mix of left and right deflections.

    So the individual pegs provide the pure chance. But the board as a whole imposes a sequential history on what actually happens - a certainty about the number of 50-50 events and the number of different histories, or paths through the maze, that describe the one final outcome.
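    The path-counting for the 7-peg board can be checked directly: bin k is reached by C(7, k) distinct histories. A minimal Python sketch (the trial count and seed are illustrative) computes the exact counts and then simulates the board, showing micro-level coin flips yielding the macro-level binomial order:

    ```python
    from math import comb
    from random import Random

    ROWS = 7  # each ball hits 7 pegs, each a 50-50 deflection

    # Exact path counts into each of the 8 bins.
    paths = [comb(ROWS, k) for k in range(ROWS + 1)]
    print(paths)  # [1, 7, 21, 35, 35, 21, 7, 1]
    # One path into each outside bin; the two central bins
    # together collect 35 + 35 = 70 paths, as stated above.

    # Simulated board: each ball's bin is the number of rightward
    # deflections out of 7 independent 50-50 events.
    rng = Random(1)
    trials = 20_000
    bins = [0] * (ROWS + 1)
    for _ in range(trials):
        bins[sum(rng.random() < 0.5 for _ in range(ROWS))] += 1

    print([round(b / trials, 3) for b in bins])
    ```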

    So in a Platonically perfect world, the micro and the macro scale are engineered to represent the opposing ideals of the accidental and the determined. The system isn’t either the one or the other in some deeper metaphysical sense. It is designed to represent the dialectic of accidental versus determined as being the proper model of a reality that is probabilistic.

    Chance and determinism are yoked in a reciprocal relation as the opposing limits of nature. Micro-level chance and macro-level determinism are how we get a system that has a stochastic character.

    Then of course, the problem is that the real world may not be amenable to such perfect engineering. This is where chaos and quantum effects impact on things.

    Chaos is about non-linearity. It is written into our assumptions about the pegs and the board that we can keep any imprecision in our engineering within linear bounds. Any bias or error in the construction will itself be averaged away in Gaussian fashion. But if there is non-linearity of some kind - maybe the pegs are springy in a way that reverberations are set up - then errors of prediction will compound at an exponential rate. The attempt to engineer a perfect distinction between local randomness and global determinism will go off course because of the emergence of non-linear divergences that lead to new kinds of internal correlations, or synchronised behaviour.

    Then quantum uncertainty also affects our perfect engineering. If the Galton board is very small or very hot, then it is going to start to misbehave. Everything from the balls, to the pegs, to the board as a whole, will be fluctuating in ways that introduce an indeterminism about both the randomness of each deflection event and the determinism about the countable ensemble of paths as a whole.

    Again the classical picture of a world cleanly split between absolute chance and absolute constraint will lose its linearity and become subject to an excess of divergence and/or an excess of correlation.

    We will arrive at the quantum weirdness of a physical system that either diverges at every event to create a many world ensemble of separate histories, or we have to accept the other available interpretation - that there are spooky non-local correlations limiting the chaos.

    So what I am arguing is that the classical picture demands some kind of monistic commitment - either reality is fundamentally based on determinism or chance. But our best models of randomness or probability are intrinsically dichotomistic. It is essential to construct a system - whether it is a die, a coin, a Galton board, a random number generating algorithm - that exemplifies indifferent chance at the micro scale and constraining history on the macro scale.

    Then we learn in fact that physical reality can’t be so perfectly engineered. We can approach linearity, but only by suppressing non-linearity. To achieve our Platonic image of the ideal gaming device, we have to do work to eliminate both its potential for divergence - too much local independence in terms of accumulating history - as well as the opposite peril of a system with too much internal correlation, or too many emergent intermediate-scale interactions.
  • tim wood
    8.7k
    Imho an instructive mini-essay.
    The Galton board is a good example. But doesn’t it illustrate the way that micro chance and macro determinism are yoked together?apokrisis
    So what I am arguing is that the classical picture demands some kind of monistic commitmentapokrisis
    I suppose macro reality is built atop, arises out of QM. But it would appear the rules for each are so different that neither is needed to make sense of the other. Thus monistic commitment seems right and unremarkable in its proper sphere. Nor am I aware of anyplace Bell's inequality is violated except in tests of Bell's inequality. And quantum tunneling - which I do not understand - if useful in computers must at least satisfy the needs of a macro-application.

    The question of the OP is what is at the heart of it all. Four possible answers seem exhaustive of the possibilities: 1) something, 1a) something, and we're not there yet, 1b) something, but we can never get there, 2) nothing. And if it's not 1a) then everyone may as well go home. I suspect 1b) will prove to be correct, but the proof is a long way in the future.

    Chance and determinism are yoked in a reciprocal relation as the opposing limits of nature. Micro-level chance and macro-level determinism are how we get a system that has a stochastic character.apokrisis
    Micro-level, sure, but imo not QM level.
  • apokrisis
    6.8k
    I suppose macro reality is built atop, arises out of QM. But it would appear the rules for each are so different that neither is needed to make sense of the other.tim wood

    Yep. Your post got me thinking that this is another way into the interpretation issues.

    The first step is to drop the monistic demand that something is something "all the way down". Even a classical view of a system exhibiting perfect randomness achieves its goal by imposing a strong dichotomy on nature. Perfect randomness at the local scale of the independent events has to be matched by perfect determinacy in terms of the macroscopic boundary conditions. A die has to be precision machined so that, as a six-sided shape, it rolls fair.

    And then having understood classical indeterminacy in that fashion, that opens the way to understanding probabilistic systems where this essential dichotomy itself has a larger story.

    From a mathematical perspective, you get chaos and its non-linearity - a dichotomy where the opposing limits are about divergence vs coherence. You get every trajectory able to bend away from the straight line in unpredictable fashion, but also the opposite thing of all trajectories being bent towards a common goal - the correlations that produce attractors.

    Quantum theory seems to say much the same thing about material reality. You have both more convergence and more divergence than linear classicality would make you suspect. The uncertainties are more uncertain, and the certainties also more certain.

    So the monistic ground becomes a fundamental dialectic. And this dialectic in turn is revealed to have a more generic form. Classicality emerges as the perfectly engineered limit of a more basic dichotomy where the non-linearities have yet to be tamed. You get both more divergence in the parts and more coherence in the whole.

    The Galton board pegs can dance and so be even more chaotic, but they also can dance in synchrony, and so deliver a more ordered result.

    Thus monistic commitment seems right and unremarkable in its proper sphere. Nor am I aware of anyplace Bell's inequality is violated except in tests of Bell's inequality. And quantum tunneling - which I do not understand - if useful in computers must at least satisfy the needs of a macro-application.tim wood

    Sure, linearity does get achieved to a useful degree. Otherwise we wouldn't be here to discuss quantum weirdness or chaotic non-linearity. The anthropic principle applies there.

    But the OP did raise the question of how classical randomness can square with quantum indeterminacy.

    My answer is that these are not two incompatible models. They may be the one model, but with constraints added. Quantum reality is the non-linear version (as a broad brush statement) and classical reality is the linearised version of that - the thermally decohered limit.

    Micro-level, sure, but imo not QM level.tim wood

    But I am arguing that QM is the larger dichotomy in which the classical dichotomy is embedded. So QM is its own less constrained, less linear, version of the micro-macro dichotomy under discussion here.

    QM uncertainty is both about the smallest spacetime scales and the greatest energy densities. It is about non-linear fluctuations – endless quantum corrections to any classical particle value - yet also the constraining holism of non-locality. If we take the Feynman path integral literally, a particle explores every possible path to discover the path that delivers on the constraint of the least action principle.

    It breaks the rules in both their directions. It is more extreme in terms of its individuated chance, and more extreme in terms of its contextual determinism.

    So same old divided world, but less linearised in both regards.
  • khaled
    3.5k
    This sounds something like “Why is pi equal to 3.14”?

    It just is. I don’t see how it’s amenable to explanation. I’m in the camp that quantum randomness is real, ontological, not just an epistemological problem, though there are interpretations that have it be like the dice example. Pilot wave theory for one if I understand correctly.

    Then again, I don’t know shit about QM.
  • T Clark
    13k
    Classicality emerges as the perfectly engineered limit of a more basic dichotomy where the non-linearities have yet to be tamed.apokrisis

    Are you talking about statistical mechanics, e.g. pressure arising out of the random behavior of molecules, or something else?
  • Wayfarer
    20.8k
    How can there be a pure chance, without a deterministic substrate giving rise to our ignorance?Zweistein

    Why shouldn't there be? What prima facie case is there that there ought not to be chance?

    So what I am arguing is that the classical picture demands some kind of monistic commitment - either reality is fundamentally based on determinism or chance.apokrisis

    What about Peirce's 'tychism'? Didn't he see chance as basic? And does it have to be one or the other - all chance, or totally detemined? What about the strange attractors in chaos theory - they produce patterns arising from apparently minute fluctuations - which seems a way of conceptualising something which is both a product of chance but also subject to laws?
  • apokrisis
    6.8k
    Are you talking about statistical mechanics, e.g. pressure arising out of the random behavior of molecules, or something else?T Clark

    Systems with pressures and temperatures are examples of this general way of thinking.

    But one of the other things I would point out here is the “weirdness” of the situation where the random kinetics of the particles of an ideal gas is seen as the deterministic part of the story, and macro properties like pressure and temperature become the emergent accidents.

    Again, that is the consequence of a backwards metaphysics that wants to make Newtonian dynamics the generic case and statistical systems, and quantum systems, the special cases.

    This is like thinking everything is Euclidean geometry - flat and infinite - and that non-Euclidean geometry is some weird extra. We had to flip that around once we realised - as with relativity - that it is in fact linear Euclideanism which is the special case here.
  • apokrisis
    6.8k
    What about Peirce's 'tychism'? Didn't he see chance as basic?Wayfarer

    Yep. But unfortunately a further dichotomy is built into that - one that Peirce was still working on.

    Just as there is an Aristotelian distinction between potentials and possibilities, there is a distinction between vagueness and fluctuation.

    So there is “chance” that is basic in terms of being a logical vagueness - anything might be the case. And then there is “chance” in the sense of some definite spontaneous event - a tychic “sporting”.

    One is about the generality of potential being. The other is about the particularity of some accident of being - a definite possibility that is logically crisp in the sense of being a counterfactual.

    Again, this speaks to a holistic systems view of nature as concrete chance only exists by virtue of the counterfactuality of some matchingly definite context. It is a local-global deal. The radioactive particle is still there or it just spontaneously decayed.

    But the quantum vacuum is a much vaguer beast - a generic indeterminacy. It is both full of fluctuations, and yet they are “virtual”. The vacuum needs a constraining context to make its zero-point uncertainty manifest. You need an apparatus like two Casimir plates to turn a vague potential into definite possibilities.

    What about the strange attractors in chaos theory - they produce patterns arising from apparently minute fluctuations - which seems a way of conceptualising something which is both a product of chance but also subject to laws?Wayfarer

    Attractors are produced by correlated interactions. So rather than trajectories exploring the world with complete freedom, they become entrained to emergent patterns.

    Draining water forms a spiral. A vortex is a simple point attractor. Rather than every molecule having to find its own random path to the plug hole exit - which could take for bloody ever - they get sucked into the most efficient possible path that solves the collective problem.

    Order out of chaos, as they say. All the minute individual fluctuations are overwhelmed by mob forces.

    The butterfly effect is then the widely misunderstood converse of the story. If we try to figure out which minute fluctuation began the general plug hole spiral, we might pick one wee fellow that seemed to mark the right angle of attack first. It was the spontaneous fluctuation that broke the symmetry and so set up the giant “tropical storm in a distant land” that became the gurgling vortex.

    But really, the cause of the vortex was the general shape of the system - the boundary conditions that set up a bath full of water where the plug had suddenly been pulled. After that, any old fluctuation could have been the first panicked lurch that set the whole crowd stampede off.
  • TheMadFool
    13.8k
    It doesn't have to be perfect. It only has to work well enough to be useful and understandable enough so we can figure the uncertainties. You use induction all the timeT Clark

    Then, by extension, determinism isn't perfect! In other words, chance and free will are a possibility.