Continuing a bit, I take the view that existence, and thus causality, is fundamentally probabilistic. Atomism is emergent. And we have two formal statistical models - the classical and the quantum - that capture that fact.
An irony is that Boltzmann settled the argument in favour of atomism by establishing a statistical mechanics view of reality. His famous dictum was “If you can heat it, it has microstructure.”
The equipartition law says there is a direct link between macroscopic and microscopic physics: if you know the total thermal energy of a body and its temperature, you can calculate the number of microscopic degrees of freedom it must contain. That is where Avogadro's constant comes from.
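As a minimal numerical sketch of that bookkeeping (my own illustration, assuming a monatomic ideal gas with three quadratic degrees of freedom per particle), equipartition gives E = (f/2) N k_B T, so macroscopic heat and temperature are enough to count the microscopic parts:

```python
# Equipartition sketch: E = (f/2) * N * k_B * T, so measuring the total
# thermal energy E and the temperature T fixes the total count N*f of
# microscopic degrees of freedom.
k_B = 1.380649e-23  # Boltzmann constant, J/K

def total_degrees_of_freedom(thermal_energy_J, temperature_K):
    """N*f implied by equipartition for energy E at temperature T."""
    return 2.0 * thermal_energy_J / (k_B * temperature_K)

# One mole of a monatomic ideal gas at 300 K: E = (3/2) * n * R * T.
E = 1.5 * 8.314462618 * 300.0                  # ~3742 J of thermal energy
N_f = total_degrees_of_freedom(E, 300.0)       # ~1.8e24 degrees of freedom
print(N_f / 3.0)                               # ~6.02e23 particles: Avogadro's number
```

Purely macroscopic inputs, and the particle count drops out.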
So atomism was "proved" by spacetime having well-behaved statistics. A given volume could contain a given number of degrees of freedom. And then came the ontological leap of faith - that by observable degrees of freedom we would be talking about actual definite particles ... as that is what our causal interpretation most naturally wants to assume.
But who in particle physics believes in "actual particles" anymore? What we actually know to exist is the statistical formalism that describes the prototypically classical situation. We have equations that cough out results in terms of countable microstates or degrees of freedom.
So the classical picture and the quantum picture are pretty much aligned on that score. They boil down to the kind of statistics to expect given a physical system with certain global or macro constraints on local possibilities. Going beyond the statistics to talk about "actual particles" - conventional atomism - is a reach.
So in this way, quantum weirdness should cause us to go back and revisit the classical tale. Classical thermodynamics had already created an approach where atoms were modelled as the limit of states of constraint. The basic degrees of freedom of a system - the very "stuff" it was supposed to be constructed from - were emergent.
And getting back to the quantum level of the story, Thanu Padmanabhan is pursuing this line of thinking as a way to understand dark energy and spacetime geometry -
http://nautil.us/issue/53/monsters/the-universe-began-with-a-big-melt-not-a-big-bang
So Boltzmann's argument - if it can be heated, it has "atoms" - can be used to impute a quantum graininess to spacetime itself.
But it is not that spacetime is actually composed of fundamental causal particles. Instead, it is the reverse story: regular spatiotemporal causal structure has a smallest limit. There is not enough contextuality left to keep imprinting that regularity on events once you arrive at the Planck scale. You are foiled by all directions turning symmetric at that point - principally in the sense that there is no thermal, temporal direction left in which events can move by dissipating their localised heat.
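For a sense of where that limit sits, the Planck scale is just what falls out of combining the standard constants (this numerical aside is mine, not the article's):

```python
import math

hbar = 1.054571817e-34   # reduced Planck constant, J s
G    = 6.67430e-11       # Newton's constant, m^3 kg^-1 s^-2
c    = 2.99792458e8      # speed of light, m/s
k_B  = 1.380649e-23      # Boltzmann constant, J/K

l_P = math.sqrt(hbar * G / c**3)         # Planck length      ~1.6e-35 m
t_P = l_P / c                            # Planck time        ~5.4e-44 s
T_P = math.sqrt(hbar * c**5 / G) / k_B   # Planck temperature ~1.4e32 K

print(l_P, t_P, T_P)
```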
So again, what we tend to read off our successful statistical descriptions is the literal existence of hard little atomistic parts. Our conventional notions of causality encourage that. Possibility itself is understood atomistically - which is what makes an added degree of quantum uncertainty rather a mystery when it starts to manifest ... and eventually completely erases any definite atoms by turning everything in sight vanilla symmetric - a quark-gluon fluid or whatever else describes a primal state of material being.
But we can turn it around so that atoms are always emergent. Classical atoms then represent a further step towards maximal counterfactual constraint - a step beyond the looser quantum level of constraint, though even the quantum level is already pretty constrained.
It is exactly the story with algebras. Normal classical number systems operate as points on a 1D line. Quantum number systems operate in the one-step more complex, less constrained realm of the 2D complex numbers. Yet there are further algebras beyond - the 4D quaternions and 8D octonions, and then eventually right off into the barely constrained structures of the even higher-dimensional exceptionals.
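A sketch of that tower (my own illustration): the Cayley-Dickson construction builds each number system by doubling the previous one, and each doubling gives up a constraint - the complex numbers give up ordering, the quaternions give up commutativity, the octonions give up associativity.

```python
# Cayley-Dickson doubling: reals -> complex -> quaternions -> octonions.
# An element is either a float (the reals) or a pair (a, b) from the previous level.

def conj(x):
    return (conj(x[0]), neg(x[1])) if isinstance(x, tuple) else x

def neg(x):
    return (neg(x[0]), neg(x[1])) if isinstance(x, tuple) else -x

def add(x, y):
    return (add(x[0], y[0]), add(x[1], y[1])) if isinstance(x, tuple) else x + y

def mul(x, y):
    if not isinstance(x, tuple):
        return x * y
    a, b = x
    c, d = y
    # One standard convention: (a, b)(c, d) = (ac - conj(d)b, da + b conj(c))
    return (add(mul(a, c), neg(mul(conj(d), b))),
            add(mul(d, a), mul(b, conj(c))))

def zero(n):
    return 0.0 if n == 0 else (zero(n - 1), zero(n - 1))

def basis(n, k):
    """Basis unit e_k of the 2**n-dimensional algebra (n=2 quaternions, n=3 octonions)."""
    if n == 0:
        return 1.0
    half = 2 ** (n - 1)
    return (basis(n - 1, k), zero(n - 1)) if k < half else (zero(n - 1), basis(n - 1, k - half))

# Quaternions: multiplication no longer commutes.
i, j = basis(2, 1), basis(2, 2)
print(mul(i, j))   # i*j = +k
print(mul(j, i))   # j*i = -k

# Octonions: multiplication no longer even associates.
e1, e2, e4 = basis(3, 1), basis(3, 2), basis(3, 4)
print(mul(mul(e1, e2), e4))   # (e1*e2)*e4 = +e7
print(mul(e1, mul(e2, e4)))   # e1*(e2*e4) = -e7
```

Going up the tower, fewer arithmetic constraints survive (beyond the octonions, the sedenions even pick up zero divisors); the classical 1D number line is the maximally constrained end of the chain.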
So classical counting uses fundamental particles - 0D points on a 1D line. That is the emergent limiting case you would reach if you kept constraining the freedom of the act of counting. But then quantum counting leaves you chasing your number around a 2D plane, which winds up behaving like an added rotation. When it comes to actual particles - like an electron - you have to in some sense count its spin twice to arrive at its spin number. To fix its state with classical counterfactual definiteness, you have to add back an extra constraint that eliminates the extra quantum degree of freedom it has from "inhabiting" a larger background space of probability.
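A small sketch of that "count the spin twice" point, using the standard SU(2) rotation of a spin-1/2 state (ordinary textbook quantum mechanics, nothing specific to the argument here):

```python
import numpy as np

# Pauli matrices
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def rotate(theta, n):
    """Rotate a spin-1/2 state by angle theta about the unit axis n = (nx, ny, nz)."""
    n_dot_sigma = n[0] * sx + n[1] * sy + n[2] * sz
    return np.cos(theta / 2) * I2 - 1j * np.sin(theta / 2) * n_dot_sigma

up = np.array([1, 0], dtype=complex)            # spin-up along z

print(rotate(2 * np.pi, (0, 0, 1)) @ up)        # ~[-1, 0]: a full 360-degree turn flips the sign
print(rotate(4 * np.pi, (0, 0, 1)) @ up)        # ~[+1, 0]: it takes 720 degrees to come back
```

The rotation angle enters everywhere as theta/2, which is the "counting the spin twice" in concrete form.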
Everywhere you look in modern fundamental physics, this is what you find. Classicality is emergent - it is where you arrive at the end of a trail of increasing constraint on free possibility. So causality now needs to be understood in these same terms.
And when it comes to quantum mechanics, it isn't even really that "weird", as it is already far more constrained in its dimensionality than the less constrained systems that could lie beyond it in "algebra-space". Quantum mechanics just has ordinary classical time baked into it at a background axiomatic level. That is why it is possible to calculate deterministic wavefunction statistics for any given initial conditions. A definite basis has been assumed to get the modelling started.
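A minimal sketch of that point for a toy two-level system (my own example): the outcomes are irreducibly statistical, but with classical time t handed in as a background parameter, the wavefunction itself evolves perfectly deterministically from its initial conditions.

```python
import numpy as np

hbar = 1.0                                    # natural units
H = np.array([[0.0, 1.0], [1.0, 0.0]])        # toy two-level Hamiltonian

def evolve(psi0, t):
    """Deterministic unitary evolution psi(t) = exp(-i H t / hbar) psi(0)."""
    energies, states = np.linalg.eigh(H)      # diagonalise the (Hermitian) H
    U = states @ np.diag(np.exp(-1j * energies * t / hbar)) @ states.conj().T
    return U @ psi0

psi0 = np.array([1.0, 0.0], dtype=complex)    # a definite initial basis state
for t in (0.0, 0.5, 1.0):
    print(t, np.abs(evolve(psi0, t)) ** 2)    # probabilities [cos^2(t), sin^2(t)]
```

The t in exp(-i H t / hbar) is exactly the classical background clock being assumed; nothing inside the formalism generates it.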
But to move beyond QM, to get to quantum gravity, it seems clear that time itself must become an output of the model, not an input. And if you give up time as being fundamental, if you presume it to be merely the emergent limit, then of course conventional notions of causality are dead - except as useful macroscopic statistical descriptions of nature.