One of the take-aways from this is that the very idea of the (continuous) number-line is a kind of fiction, an attempt to glue together geometry and arithmetic in a way that isn't actually possible — StreetlightX
Zeno's paradoxes are paradoxes of intuition. This is because it's quite easy to circumvent Zeno's paradoxes with sufficiently precise definitions of what limits and continuity are; the celebrated epsilon-delta and epsilon-N constructions of Weierstrass. You can go on as if the paradoxes are resolved because pure mathematical inquiry is largely a conditional enterprise; given these assumptions (which characterise a structure), what can be shown and what does it do? You can posit yourself past the paradoxes if you so wish, and as is usually done. — fdrake
Nice discussion. The core problem is that this is a tension that always exists because it speaks to an underlying metaphysical-strength dichotomy, and thus it raises the issue of what it would mean to resolve the tension without also dissolving the useful division.
So the mathematical debate seems to hinge on whether "the real" is discrete or continuous. The intuition being applied gets hung up on that. And clearly - Rosen's point - maths depends on the trick of atomistic constructability. Arithmetic and algebra are seen as fundamental because a continuity of form can be built up step by step from a succession of parts or acts.
But then continuity - the grounding wholeness that geometry seems to speak just as directly to - also seems to exist just as much, according to intuition. The geometer can see how the operation of division is a cuckoo in arithmetic's nest. Zeno's paradox shows that. There is more to the story than just the algebraic acts of construction - addition, subtraction and multiplication.
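The epsilon-N move that fdrake mentions can be made concrete for Zeno's dichotomy. A minimal sketch (my own illustration, in Python - the function names are just for exposition): for any tolerance epsilon, however small, there is a finite N past which the partial sums of 1/2 + 1/4 + 1/8 + ... stay within epsilon of 1. The paradox is "resolved" by definitional fiat, exactly as described.

```python
# Zeno's dichotomy as an epsilon-N statement: the partial sums of
# 1/2 + 1/4 + 1/8 + ... approach 1, and for ANY tolerance epsilon
# we can exhibit a finite N beyond which every partial sum stays
# within epsilon of the limit.

def zeno_partial_sum(n):
    """Sum of the first n terms of the halving series: 1 - 2**-n."""
    return sum(0.5 ** k for k in range(1, n + 1))

def smallest_N(epsilon):
    """Smallest N with |partial_sum(n) - 1| < epsilon for all n >= N.
    (The gap 2**-n shrinks monotonically, so the first n that works
    works for every later n too.)"""
    n = 1
    while abs(zeno_partial_sum(n) - 1.0) >= epsilon:
        n += 1
    return n

print(smallest_N(0.01))   # 7, since the gap 1/128 first dips below 0.01
```

Note that the sketch never completes the infinite sum; it only certifies finite approximations - which is the whole point of the "conditional enterprise" being described.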
Then continuity shows its face in other ways. Non-linear systems contain the possibility of divergences at every point in their space. As Rosen argues, systems that are safely enough linear are in fact rare in nature. Linearity is non-generic. Perfect constructability must fail.
So the problem is that the tension is real. Construction seems to work. Used with care, maths can formally model the world in ways that are powerfully useful. The world can come to seem exactly like a machine. And yet also, as any biologist or quantum physicist will know, the perfectly mechanistic is so non-generic that ultimately a machine model describes nothing in the real world at all.
It is the pragmatics of modelling that really bring this tension into the limelight. Maths can simply ignore the issue. It can keep deferring the problems of constructability by pushing them ever further away as the limit, just as
@fdrake describes. It is a respectable working practice. Maths has benefited by taking this metaphysical licence. But scientists modelling the world with maths have to deal with the ill-fit of a purely mechanistic description. Continuity always lurks and waits to bite. It needs to be included in the modelling game somehow - even if it is just, like Rosen's essays, the planting of a bunch of "here be dragons" signs at the edge of the intellectual map.
But the way out for me is the usual one of Peircean semiotics. Where you have a dichotomy, you actually have a pair of complementary limits. The discrete and the continuous would both be a matter of "taking the limit". And this is in turn founded on a logic of vagueness. You can have the discrete and the continuous as both merely the emergent limits on the real if they have some common ground of indistinction that they are together - measurably - serving to divide.
So now you don't have to worry if reality is fundamentally discrete or fundamentally continuous. It is neither - always being vaguer - but also it is forever moving towards those crisp limits in terms of its actions. If it is developing, it is the discrete vs the continuous dichotomy that is becoming ever more strongly manifest. It is approaching both limits at once.
At this point, we might need to move on from the overly spatial dichotomy of the discrete~continuous - the idea of a 0D location and its 1D line. That is the simplest possible space, one formed via a translational symmetry and the definite possibility of its being broken. The real world needs to incorporate space, time and energy as its triad of irreducibly fundamental components. A maths suited to actually modelling nature would need to align itself with that somehow.
Or indeed, being biologists, concerned with the study of organisms, we might leap all the way to a focus on agency and autonomy - the modelling relation, or semiosis pure.
Now we can reply to the issue of atomistic constructability in terms of the dichotomy it forms with the notion of holistic constraints. The real world - sans modelling - just is a product of constraints on freedoms. But modelling itself has a pragmatic goal regulating it. The goal of being a modeller - the reason organismic agency and autonomy would evolve within an agent-less cosmos - would be to gain machine-like control over nature. A model is a way to construct constraints so as to achieve purposes. And hence mathematics reflects that modelling imperative.
Maths gets it "wrong" by pushing constructability to an unreasonable seeming metaphysical limit. It makes the "mistake" of treating reality as if it were a pure machine. And yet that is also what is right and correct. Maths is useful to the degree it can construct a world constrained enough by our machinery that it achieves our goals reliably enough.
Biology itself is already a mechanisation of physics. It is the imposition of a system of molecular motors on nanoscale material chaos. So scientific modelling is simply an intellectual continuation of that organismic trick.
Rosen is a bit conflicted in that he complains about the flaws in the tools we use, and yet those flaws are only apparent in the grandest totalising metaphysical perspective. The largest model.
So what he gets right is that the mathematical approach, based on mechanical constructability, can only constrain uncertainty, never arrive at certainty. That is all maths ever does - move towards the limits of being, imagined particularly in the form of the dichotomy of the continuous and the discrete, the geometric and the algebraic, the structures and their morphic acts.
Maths can point to the limits where uncertainty of either kind - either pole of the dichotomy - would finally be eliminated. But the uncertainty must always remain. Which is why maths also keeps advancing, as every step towards those limits must spring a leak that is then worth our while trying to fix, so setting up the need for the further step to repair the still smaller leak that will now be exposed.
So it is an interesting game. Nature is the product of the symmetry-breaking tango between global constraints and local spontaneity. Uncertainty is basic. Yet also it becomes highly regulated or lawful.
Then organisms arise by being able to seize control of the regulatory possibilities of a modelling relation. If you can construct constraints - impose mechanistic structure on that natural world - then you can become a world within the world. You can use your ideas to make the world behave locally in ways that suit your interests.
Eventually humans gained such mathematical mastery over their realities that they could afford to get upset about the way even the best-constructed models were still full of leaks.
But apophatically, flip that around, and you now have your best metaphysical model of reality as a system of constraints. Uncertainty - as vagueness - becomes the new anti-foundationalist foundation. Atomistic construction comes into focus as the emergently individuated - the machinery being imposed on nature so as to limit that vagueness, to corral the spontaneity or fluctuation that spoils the way it is "meant to look".