• Andrew M
    1.6k
    The Copenhagen Interpretation of quantum mechanics postulates the Born rule which gives the probability that an observer will measure a quantum system in a particular state. For example, if a photon is sent through a beam splitter, the Born rule says that the photon will be detected on the reflection path half the time and on the transmission path the other half of the time.

    The Born rule gives the correct probabilistic predictions. But what explains the Born rule? And why are there probabilities at all? According to the Copenhagen interpretation (which rejects causality) there is no explanation - the probabilities are just a brute fact of the universe. Hence the Born rule must be postulated under that interpretation.

    However for causal interpretations (such as Everett and de Broglie-Bohm), the probabilities must have an underlying causal explanation. So, in the case of Everett, the Born rule should be derivable from unitary quantum mechanics, not merely postulated. I'm going to outline a derivation below - note that it draws on Carroll and Sebens' derivation.

    The first issue is why we should expect probabilities at all. With the beam splitter example above, the Everett interpretation says that a photon is detected on both paths (as described by the wave function). But an observer only reports detecting a photon on one path. Therefore, on the Everett interpretation, the probabilities describe the self-locating uncertainty of the observer (or the observing system), not the probabilities that the photon will be detected exclusively on one path or the other.

    The second issue is why the probability is that given by the Born rule and not by some other rule.

    The Born rule states that the probability is given by squaring the magnitude of the amplitude for a particular state. In the beam splitter example above, the amplitude for each relative state is √(1/2). So the probability for each state is 1/2.
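
    To make that concrete, here is a minimal Python sketch of the squaring rule (my illustration, not part of the original argument; the function name is arbitrary):

```python
import math

# Born rule: the probability of each relative state is the squared
# magnitude of its amplitude.
def born_probabilities(amplitudes):
    return [abs(a) ** 2 for a in amplitudes]

# 50/50 beam splitter: both paths have amplitude sqrt(1/2).
print(born_probabilities([math.sqrt(1/2), math.sqrt(1/2)]))  # ~[0.5, 0.5]
```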

    It appears that in these scenarios we should be indifferent about which relative state we will end up observing. So the simple rule here is to assign an equal probability to each state. If there are two states, then the probability of each is 1/2. If there are three states, then the probability of each is 1/3. And so on. This is often called branch-counting and it makes the correct predictions in cases where the amplitudes for each state are equal.

    So far so good. But this approach doesn't work if the amplitudes are not equal. For example, suppose we have a beam splitter where the probability of reflection is 1/3 and the probability of transmission is 2/3.

    In this scenario there are two states with amplitudes √(1/3) and √(2/3). If we are indifferent about which state we will end up observing, then we will wrongly assign a probability of 1/2 to each state. Branch counting doesn't work in this case.
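
    To spell out the mismatch, here is the same comparison in Python (again my own sketch, assuming the 1/3 - 2/3 splitter just described):

```python
import math

# The unequal splitter: amplitudes sqrt(1/3) and sqrt(2/3).
amplitudes = [math.sqrt(1/3), math.sqrt(2/3)]

# Naive branch counting: indifference over the two branches.
branch_counting = [1 / len(amplitudes)] * len(amplitudes)  # [0.5, 0.5] - wrong

# Born rule: square the amplitudes.
born = [abs(a) ** 2 for a in amplitudes]  # ~[0.333, 0.667] - correct

print(branch_counting, born)
```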

    So we have a simple indifference rule that only works in specific circumstances and seems inapplicable here. What can be done? One thing we can do is to transform the setup so that the rule does become applicable.

    To do this, we can add a second beam splitter (with 1/2 probability of reflection and transmission) to the transmission path of the first beam splitter. When we send a photon through this setup, there are now three final states and they all have equal amplitudes of √(1/3) each (i.e., √(2/3) * √(1/2) = √(1/3)).

    Now the indifference rule can be applied to get a probability of 1/3 for each final state. Since there are two final states on the first beam splitter's transmission path, their probabilities add to give a total probability for the first beam splitter's transmission path state of 2/3. This is the correct prediction as per the Born rule and it only assumes the indifference rule as required.
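
    Here is the bookkeeping for that setup in Python (my sketch of the two-splitter arrangement just described):

```python
import math

# Final branches after adding a 50/50 splitter to the transmission
# path of the first (1/3 reflection, 2/3 transmission) splitter.
final_amplitudes = [
    math.sqrt(1/3),                    # reflected at the first splitter
    math.sqrt(2/3) * math.sqrt(1/2),   # transmitted, then reflected
    math.sqrt(2/3) * math.sqrt(1/2),   # transmitted, then transmitted
]

# All three amplitudes equal sqrt(1/3), so indifference gives 1/3 each.
p_each = 1 / len(final_amplitudes)

# The two branches downstream of the first splitter's transmission path
# sum to 2/3 - the Born-rule value - without assuming the Born rule.
p_transmission = 2 * p_each
print(p_each, p_transmission)  # 0.333..., 0.666...
```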

    This works for any scenario where states of unequal amplitudes are factorable into states of equal amplitudes.
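
    As a rough illustration of that factoring step (my own sketch, and only for rational Born weights), each state with weight n_i/N can be split into n_i equal-amplitude sub-branches out of N in total, after which the indifference rule alone recovers the weights:

```python
import math
from fractions import Fraction

def equal_branch_counts(probabilities):
    """Split rational Born weights p_i = n_i / N into n_i equal sub-branches each.

    Returns (N, [n_1, n_2, ...]); indifference over the N equal sub-branches
    then reproduces the original weights n_i / N.
    """
    fracs = [Fraction(p) for p in probabilities]
    N = 1
    for f in fracs:
        N = N * f.denominator // math.gcd(N, f.denominator)  # least common denominator
    return N, [int(f * N) for f in fracs]

# The 1/3 - 2/3 splitter: three equal sub-branches, split 1 : 2.
print(equal_branch_counts([Fraction(1, 3), Fraction(2, 3)]))  # (3, [1, 2])
```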

    Does anyone see any problems with this?
  • Wayfarer
    22.5k
    Only that it perhaps ought to have been posted on Physics Forum.
  • Andrew M
    1.6k
    It's about the connection between causality and probability which is really a philosophical issue not a physics issue.
  • Wayfarer
    22.5k
    Fair enough. Could you just unpack a bit more what the problem is that you're trying to solve? Why is the role of 'probability' such that it requires such an elaborate response? Is that related to Einstein's 'god playing dice' remark?
  • Andrew M
    1.6k
    Yes. Before quantum mechanics came along, it was assumed that probability reflected a lack of knowledge about the world (i.e., it was an epistemic issue). But quantum mechanics suggested that probability was baked into the universe at the most fundamental level, violating causality - which irked Einstein and prompted him to say that "God does not play dice with the universe".

    So the problem for a causal interpretation (as Everett and de Broglie-Bohm are) is to explain how probability arises in a causal universe. In particular, if the wave function describes a state evolving into a superposition of two states where one state has an amplitude of 1 and the other has an amplitude of 2, then why should the probability of observing them be in the ratio of 1:4? That squaring of amplitudes to get probabilities is the Born rule.
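
    A quick worked check of that ratio (my arithmetic, not from the thread): with relative amplitudes 1 and 2, the Born weights are 1² and 2², i.e. 1:4, or 0.2 and 0.8 once normalised.

```python
# Relative amplitudes 1 and 2 give Born weights in the ratio 1:4.
amplitudes = [1, 2]
weights = [a ** 2 for a in amplitudes]       # [1, 4]
total = sum(weights)
print([w / total for w in weights])          # [0.2, 0.8]
```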

    If that can be explained in a causal framework then it restores the idea that probability reflects a lack of knowledge about the world - that it's not fundamental.
  • Wayfarer
    22.5k
    Well, if you accept that the universe is not causally closed, the whole problem goes away. But apparently that is too high a price.
  • Rich
    3.2k
    If that can be explained in a causal framework then it restores the idea that probability reflects a lack of knowledge about the world - that it's not fundamental. - Andrew M

    Probability can still be baked into the universe even with a causal interpretation. The cause may be inherently probabilistic, which is one of the possible interpretations of the Bohm quantum potential initial conditions. Hence the reason Bohm suggested that his interpretation is causal yet non-deterministic.
  • Andrew M
    1.6k
    Well, if you accept that the universe is not causally closed, the whole problem goes away. But apparently that is too high a price. - Wayfarer

    That's a possible response. But if you make a distinction between the universe and reality, then it just pushes the issue back a level. That is, is reality causally closed (recast as the Principle of Sufficient Reason rather than in physical terms)?
  • Andrew M
    1.6k
    Probability can still be baked into the universe even with a causal interpretation. The cause may be inherently probabilistic, which is one of the possible interpretations of the Bohm quantum potential initial conditions. Hence the reason Bohm suggested that his interpretation is causal yet non-deterministic. - Rich

    That raises the question of the status of the Born rule under such interpretations. It would seem that the Born rule could only be postulated, not explained or derived.
  • Rich
    3.2k
    I agree. I certainly have no love for the Copenhagen Interpretation nor the way the Copenhagen group successfully rammed it down everyone's throat.
  • apokrisis
    7.3k
    If that can be explained in a causal framework then it restores the idea that probability reflects a lack of knowledge about the world - that it's not fundamental. - Andrew M

    I like the quantum information approach where the view is that uncertainty is irreducible because you can't ask two orthogonal questions of reality at once. Location and momentum are opposite kinds of questions and so an observer can only determine one value in any particular act of measurement.

    So this view accepts that reality is fundamentally indeterministic. But also that acts of measurement are physically real constraints on that indeterminism. Collapse of the wavefunction happens. It is only that you can't collapse both poles of a complementary/orthogonal pair of variables at once. Maximising certainty about one maximises uncertainty about the other, in good Heisenberg fashion.

    What this interpretation brings out sharply is the contextual or holistic nature of quantum reality. And the role played by complementarity. Eventually you are forced to a logical fork when trying to eliminate measurement uncertainty. You can't physically pin down two completely opposite quantities in a single act of constraint.

    Grab the snake by the head, or by the tail. The choice is yours. But pin one down and the other now squirms with maximum freedom and unpredictability.

    Of course, when talking of observers and measurements, we then have to grant that it is the Universe itself which is doing this - exerting the constraints that fix a history of events. So it becomes a thermal decoherence interpretation with the second law of thermodynamics being the party interested in reducing the information entropy of the Universe as a physical system.
  • Wayfarer
    22.5k
    That is, is reality causally closed (recast as the Principle of Sufficient Reason rather than in physical terms)? - Andrew M

    Things happen for reasons, but there is also serendipity, chance or hazard. There is an element of spontaneity involved.

    when talking of observers and measurements, we then have to grant that it is the Universe itself which is doing this - apokrisis

    'A scientist is just an atom's way of looking at itself' ~ Niels Bohr.

    Note that 'the Copenhagen interpretation' is NOT a scientific hypothesis. It is simply the kinds of things that Heisenberg, Bohr and Pauli would say about the philosophical implications of quantum mechanics. The term wasn't even coined until the 1950s. But it is an epistemologically modest attitude, in my view, and one generally in keeping with the tradition of Western natural philosophy. Heisenberg's essays on Physics and Philosophy are very interesting in that regard.
  • apokrisis
    7.3k
    Modest or radical? The Copenhagen Interpretation is metaphysically radical in paving the ground to acknowledge that there must be an epistemic cut in nature.

    The "modest" understanding of that has been the good old dualistic story that it is all in the individual mind of a human observer. All we can say is what we personally experience. Which then leads to folk thinking that consciousness is what must cause wavefunction collapse. So epistemic modesty quickly becomes transcendental confusion. We have the divorce in nature which is two worlds - mental and physical - in completely mysterious interaction.

    I, of course, am taking the other holistic and semiotic tack. The epistemic cut is now made a fundamental feature of nature itself. We have the two worlds of the it and the bit. Matter and information. Or local degrees of freedom and global states of constraint.

    So CI, in recognising information complementarity, can go three ways.

    The actually modest version is simple scientific instrumentalism. We just don't attempt to go further with the metaphysics. (But then that is also giving up hope on improving on the science.)

    Then CI became popular as a confirmation of hard dualism. The mind created reality by its observation.

    But the third route is the scientific one which various information theoretic and thermodynamically inspired interpretations are working towards. The Universe is a system that comes to definitely exist by dissipating its own uncertainty. It is a self constraining system with emergent global order. A sum over histories that takes time and space to develop into its most concrete condition.
  • Wayfarer
    22.5k
    What do you make of the Everett many-worlds hypothesis?

    Oh, and I'm sure there is an 'epistemic cut'.
  • Wayfarer
    22.5k
    The "modest" understanding of that has been the good old dualistic story that it is all in the individual mind of a human observer.apokrisis

    My take would certainly not be that it's ALL in the mind of the observer, but that (1) the mind makes a contribution without which we would perceive nothing and (2) this contribution is not itself amongst the objects of perception as it is in the domain of the subjective. Scientific realism would like to say that we can see a reality as if we're not even there at all, 'as it is in itself', in other words that what we're seeing is truly 'observer independent'. But it is precisely the undermining of that which has caused all of the angst over the 'observer problem', because that shows that the observer has an inextricable role in what is being observed. The observer problem makes that an unavoidable conclusion, which Everett's 'many worlds' seeks to avoid.
  • apokrisis
    7.3k
    Many worlds is used by many to avoid the physical reality of wavefunction collapse or an actual epistemic cut. Or rather, to argue that, instead of local variable collapse, there is branching that creates complementary global worlds.

    So as maths, many worlds is fine. It has to be as it is just ordinary quantum formalism with the addition of thermodynamical constraint - exactly the decoherent informational view I advocate.

    But it gets squirmy when the interpretation tries to speak about the metaphysics. If people start thinking of literal new worlds arising, that's crazy.

    If they say they only mean branching world lines, that usually turns out to mean they want to have their metaphysical cake and eat it. There is intellectual dishonesty because now we do have the observer being split across the world lines in ways that beg the question of how this can be metaphysically real. The observer is turned back into a mystic being that gets freely multiplied.

    So I prefer decoherence thinking that keeps observers and observables together in the one universe. The epistemic cut itself is a real thing happening and not something that gets pushed out of sight via the free creation of other parallel worlds or other parallel observers.
  • Wayfarer
    22.5k
    as maths, many worlds is fine. It has to be as it is just ordinary quantum formalism with the addition of thermodynamical constraint - exactly the decoherent informational view I advocate.

    But it gets squirmy when the interpretation tries to speak about the metaphysics. If people start thinking of literal new worlds arising, that's crazy. - apokrisis

    What would you say if someone were to assert that quantum computers rely on the actual reality of many worlds in order to operate?
  • apokrisis
    7.3k
    They are being metaphysically extravagant in a way the mathematics of decoherence doesn't require.

    To build an actual quantum computer will require a lot of technical ingenuity to sort practical problems like keeping the circuits in a cold enough, and isolated enough, condition for states of entanglement to be controllable.

    Do you think those basic engineering problems - that may be insurmountable if we want to scale up a circuit design in any reasonable fashion - are going to be helped by a metaphysical claim about the existence of many worlds?

    Unless MWI is also new science, new formalism, it is irrelevant to what is another engineering application of good old quantum mechanics.
  • Wayfarer
    22.5k
    Do you think those basic engineering problems - that may be insurmountable if we want to scale up a circuit design in any reasonable fashion - are going to be helped by a metaphysical claim about the existence of many worlds? - apokrisis

    I'm only going on the jacket blurb of the first book by the guy who invented quantum computing:

    The multiplicity of universes, according to Deutsch, turns out to be the key to achieving a new worldview, one which synthesizes the theories of evolution, computation, and knowledge with quantum physics. Considered jointly, these four strands of explanation reveal a unified fabric of reality that is both objective and comprehensible, the subject of this daring, challenging book. The Fabric of Reality explains and connects many topics at the leading edge of current research and thinking, such as quantum computers (which work by effectively collaborating with their counterparts in other universes).

    Reading through the reader reviews of that title, it seems Deutsch gives pretty short shrift to anyone who doubts the actual reality of parallel universes, which he seems to think is necessary for the concept to actually work.
  • Metaphysician Undercover
    13.1k
    Reading through the reader reviews of that title, it seems Deutsch gives pretty short shrift to anyone who doubts the actual reality of parallel universes, which he seems to think is necessary for the concept to actually work. - Wayfarer

    Isn't apo saying that the concept doesn't actually work?
  • Wayfarer
    22.5k
    He, like myself, can't accept the idea of 'parallel universes', but the point I'm trying to make is that it is an inevitable consequence of Everett's 'relative state formulation', like it or not. So, let's move on.
  • apokrisis
    7.3k
    To be fair to Deutsch, he wrote that book back in the 1990s. Many people got carried away and were taking the most literal metaphysical view of the newly derived thermal decoherence modification to quantum formalism.

    Now most have moved to a more nuanced take of talking about many world-line branches. But my criticism is that this simply mumbles the same extravagant metaphysics rather than blurting it out loud. Many minds is as bad as many worlds.

    On the other hand, listen closely enough to MWI proponents, and they now also start to put "branches" in quotes as well as "worlds". It all starts to become a fuzzy ensemble of possibilities that exist outside of time, space and even energy (as preserved conservation symmetries). The MWIers like Wallace start to emphasise the decision making inherent in the very notion of making a measurement. In other words - in accepting metaphysical vagueness and the role that "questioning" plays in dissipating that foundational uncertainty - MWI is back into just the kind of interpretative approach I have advocated.

    There is now only the one universe emerging from the one action - the dissipation of uncertainty that comes from this universe being able to ask ever more precise questions of itself.

    In ordinary language - classical physics - we would say the Universe is cooling/expanding. It began as a fireball of hot maximal uncertainty. As vague and formless as heck. Then it started to develop the highly structured character we know and love. It sorted itself into various forces and particles. Eventually it will be completely definite and "existent" as it becomes maximally certain - a state of eternal heat death.

    Only about three degrees of cooling/expanding to get to that absolute limit of concrete definiteness now. You will then be able to count as many worlds as you like as every one of them will look exactly the same. (Or rather you won't, as individual acts of measurement will no longer be distinguishable as belonging to any particular time or location.)
  • Wayfarer
    22.5k
    Eventually it will be completely definite and "existent" as it becomes maximally certain - a state of eternal heat death. - apokrisis

    Unless it collapses back into another singularity, and then expands again. Guess we'll have to wait and see ;-)
  • apokrisis
    7.3k
    He, like myself, can't accept the idea of 'parallel universes', but the point I'm trying to make is that it is an inevitable consequence of Everett's 'relative state formulation', like it or not. So, let's move on. - Wayfarer

    I reject parallel worlds and parallel minds because immanence has to be more reasonable than transcendence when it comes to metaphysics.

    An immanent explanation could at least be wrong. A transcendent explanation is always "not even wrong" because it posits no actual causal mechanism. It just sticks a warning sign at the edge of the map saying "here be dragons".

    And the Everett formulation is just an interpretation - a metaphysical heuristic. It itself becomes subject to various metaphysical interpretations, as I just described. You can get literal and concrete. Or you can take a vaguer approach where the worlds and branches are possibilities, not really actualities. Or you can go the whole hog and just accept that the foundation of being is ontically vague and so any counterfactual definiteness is an emergent property.

    The real advance of "MWI" is the uniting of the maths of quantum mechanics with the maths of thermodynamical constraints - the decoherence formalism.

    This is a genuine step in the development of quantum theory. And it has sparked its own wave of interpretative understanding - even if ardent MWIers claim to own decoherence as their own thing.
  • apokrisis
    7.3k
    Unless it collapses back into another singularity, and then expands again. Guess we'll have to wait and see ;-) - Wayfarer

    No we bloody don't. Dark energy is a fact. The Heat Death is gonna happen.

    Of course we now have to account for dark energy. And again - in my view - decoherence is the best hope of that. Because quantum-level uncertainty can only be constrained, not eliminated, the fabric of spacetime is going to have a built-in negative pressure. It is going to have a zero-point energy that causes quantum-scale "creep".

    Unfortunately we don't know enough particle physics to do an exact calculation of this "creep". We can't sum all the contributions in an accurate way to see if they match the dark energy observations. And the naive calculation - where things either all sum or all cancel - produces the ridiculous answer that the dark energy value should be either zero or Planck-scale "infinite". An error of some 120 orders of magnitude, and so another of your often-cited "crises of modern physics".

    Other calculations going beyond the most naive have got closer to the observed value, though admittedly not nearly close enough yet.

    But at least, as a mechanism, it could be bloody wrong. ;)
  • dclements
    498
    To the best of my knowledge, it has already been determined that the issue comes from the state of the particle (or whatever else is being tested) being changed by being MEASURED, and therefore we have already ruled out any problem arising from it merely being OBSERVED.

    It is a bit too fruit loopy to think that our mere observation of something completely alters it in a way that isn't conceivable through more or less ordinary physics. I mean, there could be something like "magic" where our mind alters reality, but it is best to rule out everything else before we allow ourselves to think something like "magic" is going on.
  • Rich
    3.2k
    Of course we now have to account for dark energy. And again - in my view - decoherence is the best hope of that. Because quantum-level uncertainty can only be constrained, not eliminated, the fabric of spacetime is going to have a built-in negative pressure. It is going to have a zero-point energy that causes quantum-scale "creep". - apokrisis

    I don't know whether I should laugh or cry. I am sure many members are waiting breathlessly for the final verdict on what will happen billions and billions of years from now as science refines its precise calculations. No doubt such calculations will require increased funding. Come to think of it, how about forecasting tomorrow?
  • apokrisis
    7.3k
    I am sure many members are waiting breathlessly for the final verdict on what will happen billions and billions of years from now as science refines its calculations. - Rich

    Just measure the cosmic background radiation. It's 2.7 degrees above absolute zero. The average energy density is down to a handful of protons per cubic metre.

    Again you reveal the vastness of your ignorance of routine scientific facts. The Heat Death is a done deal even if you might also say the clock has another tick or two to actually reach midnight.
  • Rich
    3.2k
    Heck, anyone who can predict as fact what will be happening billions upon billions (maybe trillions) of years from now has to be .... well, just a remarkable fortuneteller. Thank you for putting my mind at rest. Not in my lifetime at least. Any other long-term forecasts?
  • apokrisis
    7.3k
    You are like someone plunging off a skyscraper, now inches from the ground, shouting out: I'm not dead yet, you don't know what you're talking about, my future has not been foretold, there are no grounds to predict my imminent demise.
  • Rich
    3.2k
    Nah, just laughing and wondering how many people buy into your gobbledygook? Your demise? Are you working on the precise calculations? Need funding?