## Time and Boundaries

• 11.6k
:smirk:
• 20.4k
the universal law just says two masses will accelerate towards each other. So your explanation would amount to that two masses accelerate towards each other because they accelerate towards each other.
• 11.1k

Newton's first law explicitly says that the motion of a body will remain constant unless acted on by a force. I think "acted on by a force" implies causation, doesn't it? In Newtonian physics gravity is a force, and the acceleration is caused by it.
• 621
Gravity is just a name for the acceleration of any two masses towards each other.

Are you talking about gravitational attraction?

$g = 9.8\ \mathrm{m/s^2}$ You hold this equation in contempt?

What’s causing precise acceleration?

Respective masses curving spacetime.

Is this the language you respect?
• 20.4k
Are you talking about gravitational attraction?
You hold this equation in contempt?

Why would you suppose that? An odd response.

What the Universal Law of Gravitation says is that the force between two masses is inversely proportional to the square of the distance between them. That number you posited is the proportion.

The force is the product of the mass and the acceleration. F=ma.

"Cause" does not appear anywhere in those equations.
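
For reference, the two equations at issue, written out in standard notation (note that no causal term appears in either):

```latex
F = G\,\frac{m_1 m_2}{r^2}, \qquad F = m a
```

Eliminating $F$ gives $a = G m_2 / r^2$; near the Earth's surface this works out to the familiar $g \approx 9.8\ \mathrm{m/s^2}$.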
• 621
Saying gravity causes acceleration is just saying the acceleration between two masses causes the acceleration between two masses.

Is it a stretch, a distortion, a mis-read to say your above quote is affirmation of my main point?

Gravity causes acceleration (in free fall) ⇒ acceleration of mass ≡ gravity as $g = 9.8\ \mathrm{m/s^2}$, or $F = ma$.

My Main Point

Gravity and acceleration-due-to-gravity are, in a certain sense, as one. They are conjoined as a unified concept: gravity-and-acceleration. Thus cause and effect are, in the same sense, as one, save one stipulation: temporal sequencing.
• 621
..."time" is neither "temporal" nor a "phenomenon". (I think you're confusing (your) maps with the territory.)

What’s the critical operation between cause and effect when considered as conjunction: time?

No. IMO, wrong, or incoherent, question (i.e. misuse of terms).

Are there any observable boundaries time cannot merge?

More incoherence. "Time" is a metric (i.e. parameter), ucarr, not a force or agent.

Do you think the forward-flowing of history comprises the physical phenomena populating our empirical experiences?

In the below quote, are you referring to the commingling of the forward-flowing of history with the metric that tracks it mathematically?

(I think you're confusing (your) maps with the territory.)
• 20.4k
I dunno. I guess I give up, having not been able to follow what it is you might be claiming.

In Newtonian physics gravity just is an acceleration of a mass due to another mass. Saying gravity causes that acceleration is circular. If that is all you have to say, then fine. But you then add something odd about temporal sequences.

Newtonian physics is pretty clean, making use of mathematical equations rather than causal statements. While we can to some extent treat the equations as causal links, that's perhaps a bit muddled. So we can say that gravity causes stuff to fall, but that's a shorthand for a failure to explain the acceleration between masses rather than an explanation.

pretty much ended the mistaken notion that cause requires time. It's a topic that has been discussed here before, leading quickly to partisan stances.
• 2.8k
I'm with Victor Toth on this one. Gravity is a force. The force is counteracted by the upward force of the plane before the parachutist leaps away. Then it's force due to gravity counterbalanced by force due to air resistance as he falls, each force changing a bit with distance, which is determined by time's passage. The effect of him falling is determined by several "causes", including jumping out of the plane and the force of gravity. I don't think "cause and effect" is relegated to the junkpile of philosophy (or physics, for that matter) because at some infinitesimal scale it's hard to discern which is which. I agree with in that regard.

(I model mathematical causal chains - in time - as compositions of functions. A result (effect) at a time t is, say, z. The next temporal step, and the scale of time can vary, is to compute s, where s=f(z), then after that, r, where r= g(s), and so on. There's a whole theory herein. But I think it more realistic to assume several functions act on z, not just one. Like differing forces. So each step - and these are associated with intervals of time - has as outcome the influence of a number of "forces", rather than a single function.)
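
In code, the composition-of-functions picture above might be sketched as follows (`f` and `g` here are hypothetical stand-in functions, and summing their outputs in `step` is just one possible combination rule):

```python
# A sketch of causal chains as compositions of functions.
# f and g are hypothetical stand-ins for particular "forces".

def f(z):
    return z + 1.0   # force no. 1 (illustrative)

def g(s):
    return 2.0 * s   # force no. 2 (illustrative)

def step(state, forces):
    # One temporal step: several forces act on the current state.
    # Summing their outputs is an arbitrary choice of combination rule.
    return sum(force(state) for force in forces)

z = 1.0      # result (effect) at time t
s = f(z)     # next temporal step: s = f(z)
r = g(s)     # then r = g(s), and so on
```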

Sorry, got carried away with a current research topic of mine. Maybe it's relevant here.
• 621

(I model mathematical causal chains as compositions of functions. A result (effect) at a time t is, say, z. The next temporal step is to compute s, where s=f(z), then after that, r, where r= g(s), and so on. There's a whole theory herein. But I think it more realistic to assume several functions act on z, not just one. Like differing forces. So each step - and these are associated with intervals of time - has as outcome the influence of a number of "forces", rather than a single function.)

In the above quote, jgill elaborates with detail and clarity what I've been trying to claim more vaguely and superficially. The above quote gives us a description of phenomenal reality, known empirically to all of us. It is a complex mix of the physical and the conceptual. Cause and effect and time are deeply partial to each other as an interweave, and this interweave has for its signature the forward-flowing of history.

I model mathematical causal chains as compositions of functions.

The gist of my claim herein is that the above quote describes our fluidly transforming world as an ongoing continuity of boundary crossings, boundary mergers, Venn Diagram overlapping and transcendence of boundaries.

Time and its signature, the forward-flowing of history, will bleed through anything, whether physical or conceptual: the drop of water, in time, bores through the great stone; the black hole, in time, evaporates, releasing phenomena only seemingly lost forever.
• 11.6k
forward-flowing of history
"Forward-flowing" is a cognitive illusion and intuitive way of talking about asymmetric change. "History" represents time-as-past-tense-narrative (i.e. a ghost story). Particle physicists refer to worldlines (or many-worlds branchings) and statistical mechanics refer to entropy gradients. I still don't see what your musings, ucarr, have to do with philosophy. What's the philosophical itch you're trying to get us to scratch? State it plainly.
• 621
I still don't see what your musings, ucarr, have to do with philosophy. What's the philosophical itch you're trying to get us to scratch? State it plainly.

Have you seen my quote directly above yours?

Do you think the forward-flowing of history comprises the physical phenomena populating our empirical experiences?

"Forward-flowing" is a cognitive illusion and intuitive way of talking about asymmetric change. "History" represents time-as-past-tense-narrative (i.e. a ghost story). Particle physicists refer to worldlines (or many-worlds branchings) and statistical mechanics refer to entropy gradients.

I take your above quote for an answer to my question above it.

No doubt my appointment with the dentist tomorrow, when seen as asymmetric change representing time-as-past-tense-narrative (i.e. a ghost story) with reference to world lines (or many-worlds branchings) and statistical mechanics referring to entropy gradients, holds formally very little in common with my vision of getting a filling in my back molar. No. I haven't entered such descriptions into my daily planner.

Having said that, I think I understand your cutting-edge scientific vision of forward movement is pertinent to the concepts and details of my narrative. If I'm right, then you exaggerate when claiming "It's clear as mud to me."
• 11.6k
After I posted. It's clear as mud to me.
• 2.8k
I'm not sure my little exposition should be a reference point. That's how I perceive change over time. I can also go backwards in time, showing there need not be a conflict between infinite regression and first causes.

Particle physicists refer to worldlines (or many-worlds branchings) and statistical mechanics refer to entropy gradients

Where a lot of that begins is the Schrödinger equation, which is fundamentally a partial differential equation with the independent variable t = time. When solutions are computed, all of a sudden mystical superpositions and wave collapses occur in experiments. Why is time so vital here?
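
For reference, the time-dependent Schrödinger equation, with $t$ as the independent evolution parameter:

```latex
i\hbar\,\frac{\partial}{\partial t}\,\Psi(x,t) = \hat{H}\,\Psi(x,t)
```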
• 11.6k
Don't hold me to this but I vaguely recall that Heisenberg et al's matrix mechanics (re: possible-states of observables) provides a non-mystical, though experimentally equivalent, alternative to Schrödinger's wave mechanics (re: particles as classical waves). Something about Feynman's path-integrals plays a decisive role in extending the scope of matrices, doesn't it? Yeah, I don't know wtf I'm talking about, jgill, but somebody with real QM chops is bound to come along who can talk mathematical physics to a mathematician. :sweat:
• 7.4k
"what is causing galaxies to deviate from the predictions of our models?" Such causes get posited as new elements of a model, and in many subfields uncovering the nature of these causes becomes a major, or the major, topic of research, e.g. dark matter and dark energy.

They do, because they speak the same language we do. I'm not saying causation is denied, but what is the focus? You said it yourself - things that "deviate from the predictions of our model". Anomalies.

But in the mathematical models, there is no variable or constant 'cause' or 'effect'. Nor do the models cause the universe to obey them. The world is orderly and disorderly and mathematics describes the order and the disorder. When there is an anomaly there is work to be done revising the model, or refining the instruments. Causation drops out of the conversation because it has no function. It is not a particle, or a field, or a force, or a dimension or a measurement... It's not anything, but an old fashioned way of thinking that we still use. To look for the cause of an anomaly not understood is to look for some new thing; it is not to look for causation. Causation is a fancy word for 'the way things go' and that is why there is the temporal aspect.
• 768
I am genuinely curious about this widespread world of physics where cause is not referenced. I read a lot of physics and causes are mentioned constantly. Things like do-calculus were invented for the natural sciences. Bayesian inference is generally couched in causal language. The Routledge Guide to Philosophy of Physics, which is an excellent reference guide BTW, mentions cause 787 times, causal 586 times. Some of these references are indeed arguments against cause, but not most. In general, arguments against causation are nuanced, and not eliminativist at any rate.

"Cause isn't in mathematical equations," certainly isn't taken as gospel in the philosophy of causation (I'm currently in the middle of "Causation: A User's Guide"). Why can mathematics not represent causes, but it can represent state changes and processes with a defined start and end point?

Where I've seen arguments against cause related to physics, it's been in popular science books in the context of arguments for a block universe. The block universe is hardly something all physicists accept, and if authors are putting their best arguments for such a view into their books, they seem to have more motivations in philosophy than in physics. To be sure, this is partly because debates on the nature of causation generally aren't considered a topic for physics articles, and one's popular science books are a good place to get into more speculative discussions.

But I certainly don't see the "cause is antiquated" view writ large on the natural sciences as a whole, or even just physics. Instruction on elements of physics being time symmetric is not an argument that physics itself is time symmetric; it demonstrably is not.

I would be less skeptical of the block universe if the motivation behind some key arguments for it didn't seem to come from philosophers' anxiety over how their propositions could have truth values given some form of presentism. Davies, who I generally like, goes for one of these. It's frustrating because these are presented with an air of certitude (he says something like "one must be a solipsist to disagree") when in fact there is by no means only one way to view SR vis-à-vis the reality of local becoming. These examples amount to attacks on the Newtonian time the audience is expected to be familiar with, and then propose the block universe as the only solution (Putnam does something similar). The issue can also be resolved by seeing time as degenerate in SR, with time bifurcating into co-ordinate time and proper time. This distinction gets muddled in many retellings of twin paradoxes though.

The view on time I like best in modern physics is that events in the past exist, and exist(ed) just at the local time they occurred, while "now" is defined locally by the simultaneity of local interacting processes. I see no reason to jettison the overwhelming empirical evidence for time's passage when there exist fully coherent models that don't require eternalism.

Cause is trickier because people mean many things by cause. Just like time now has to be split into many different types of precisely defined time (and even these might not be enough, some physicists think Minkowski Spacetime is doomed as a flawed model), we probably need some sort of precisely formulated definition of causality. In the philosophy of physics, the transfer of conserved quantities is the leading definition of causation from what I've seen, but there are information theoretic definitions too.
• 768

A world line is an object's 3D path rendered with a time dimension, nothing more. A world line can also be used to describe the history of a path for an observer. We talk about time in statistical mechanics all the time. Even in a model of quantum foundations like consistent histories, where there is no one true state of affairs at time T, a classical history emerges from decoherence/collapse. Physicists don't talk about time in SR/GR without specifying which type of time they are referring to. This doesn't disprove the reality of an arrow of time or local becoming, except inasmuch as philosophers have used the model to construct paradoxes, or pseudoparadoxes depending on who you ask, that call them into question.

The funny thing is that the alleged paradoxes and the arguments that allegedly rebut them haven't really moved since the 1940s; they just get restated. Someone who wants to refute Davies can cite Gödel or Robb who were actually replying to people in their time... and so maybe time is illusory or circular...

The things you mentioned don't have anything to do with history being a "cognitive illusion." The apparent "arrow of time," is one of the big questions in physics, not something that has been solved and written off as illusory by any means. Some physicists speculate that time is somehow "illusory," although the nature of this illusion is generally fairly nuanced and not grounded in cognitive science. When they do so, they tend to be doing more philosophy than physics, although the use of specialized terms certainly confuses this fact.

That time, and thus history, can't flow and that things do not "move" "forwards" and "backwards" in time is more well established. These are bad analogies that lead to apparent paradox. So, "forward flowing of history," is probably best to avoid.
• 11.1k
The gist of my claim herein is that the above quote describes our fluidly transforming world as an ongoing continuity of boundary crossings, boundary mergers, Venn Diagram overlapping and transcendence of boundaries.

All this does is show the deficiency of systems theory as a means for modeling the world. The reality of these "boundary crossings" implies that there are many things which cannot be classified as being proper to one system or another. Initially, this may not appear as a problem, but when it comes to mapping causation, we need to distinguish between what is within the system and what is acting on the system as a causal force. As in my reply to Banno, above, inertial continuity is modeled as internal, therefore non-causal, and external influence is modeled as a causal force of change.

So for example, someone in another thread suggested to me that we could model an atom as a system. However, the natural state of atoms is to exist within complex molecules, where parts (electrons for example) are shared. If two atoms share an electron, and the atoms themselves are being modeled as distinct systems, then in each model the shared electron is both an internal part of the inertial continuity of the system and also a part of the other system, thereby acting as a causal force of change on that same system. In other words, from this 'systems' perspective, the electron must be understood as both a part of the inertial continuity of the system, and a causal force of change to the system (being a part of an external system), at the same time.
• 768

You might be interested in information theoretic, holographic principle-based workarounds for this problem if you're not already aware of them. Since information is only exchanged across any system's (however defined) 2D surface, we can model systems purely relationally. One interpretation of this is that information content is relative between systems, with these relationships formalized using the concept of symmetry and group theory. Example: for many enzyme reactions, a chemical's being composed of isotopes or not is indiscernible for both systems and thus irrelevant to describing the interaction. This was best expressed in a brilliant dissertation that made it into Springer Frontiers and got rave reviews, before the author seemingly disappeared, which is a shame.

Vedral's book sort of goes with this, in his explanation of information only existing relationally between parts of the universe, but he seems to reverse on this later in the book to use the old "amount of bits stored by each particle" calculation to make some points about quantum information.

I think the arbitrary nature of system boundaries is akin to other problems in the sciences and even humanities. For example, in semiotic analysis/communications, a physical entity, say a group of neurons, might act as object, symbol, and interpretant during the process, depending on the level of analysis that is used. But at a certain point, the ability of any one component to convey aspects of the total message breaks down. E.g., a single logic gate can't hold the number "8," itself. Certain relationships only exist at higher levels of emergence, like your example of shared electrons.

Causation, in such models, would likely be interpreted in terms of computation or information exchange, and I'd argue that current theories of computation and communications would actually make it extremely difficult to differentiate these two models at the formal level.

IMO, something like the concept of levels of abstraction in computer science is needed for this sort of problem, but I can't fathom how to formalize it in a manner that isn't arbitrary.

Subjective is fine. Entropy is subjective (see the Gibbs Paradox) but not arbitrary. Arbitrariness seems like a problem however.
• 621

So for example, someone in another thread suggested to me that we could model an atom as a system. However, the natural state of atoms is to exist within complex molecules, where parts (electrons for example) are shared. If two atoms share an electron, and the atoms themselves are being modeled as distinct systems, then in each model the shared electron is both an internal part of the inertial continuity of the system and also a part of the other system, thereby acting as a causal force of change on that same system. In other words, from this 'systems' perspective, the electron must be understood as both a part of the inertial continuity of the system, and a causal force of change to the system (being a part of an external system), at the same time.

I think the arbitrary nature of system boundaries is akin to other problems in the sciences and even humanities. For example, in semiotic analysis/communications, a physical entity, say a group of neurons, might act as object, symbol, and interpretant during the process, depending on the level of analysis that is used. But at a certain point, the ability of any one component to convey aspects of the total message breaks down. E.g., a single logic gate can't hold the number "8," itself. Certain relationships only exist at higher levels of emergence, like your example of shared electrons.

Your above quotes for me are introductions to detailed examinations of topics in physics, each of which, in the elaboration of specialization, would easily engage the entire careers of physicist-specialists.

My label of convenience for the theme connecting and focusing pertinent issues within Time and Boundaries is Boundary Ontology. Under this category the focus is on such questions as: How do we measure the surface of a material object? In the scale of human experience, this question is perhaps mundane. Is that the case at the scale of the elementary particles? How about the scale of the expanding universe? What does it mean for spacetime to expand and yet have no outer boundary?

Speaking mathematically, clearly topology has a key role to play herein. For example: topology might offer a rational approach to a definition of the soul: a surface invariant to unlimited manifolding of a set.

Is system the limit of entropic expansion? Is universe the limit of system? These are, I think, important boundary ontology questions.

Is there a possible general mathematical definition of what constitutes the boundary of a system?

Can boundaries be defined for cognitive inter-relations, thereby establishing a hybrid interweaving the cognitive_physical?

Finally, there's the supreme challenge of the sine qua non of boundary ontology puzzles: Origin Boundary Ontology. First principle, first cause, etc., will need more than three spatial dimensions + time for practical elaboration.
• 2.8k
I don't know wtf I'm talking about, jgill, but somebody with real QM chops is bound to come along who can talk mathematical physics to a mathematician. :sweat:

Real-life Q-physicists have been chased away, I fear. Kenosha Kid tried to get some sympathy for the Transactional approach, but had unsatisfactory experiences and left the room to play his guitar. I know very, very little about Q-theory beyond the elementary stuff. Feynman's path integral I can follow if I take the simplified version involving time splitting. In my old age I dabble in very elementary mathematics (in the professional sense), finding the road I am on challenging enough. :cool:
• 11.6k
:up:
• 11.1k

Interesting. Why do you say that entropy is subjective? Is it because a system's boundary is arbitrary?
• 768

Not just that.

Again take a box with a partition in it, with gas A on one side, gas B on the other side, and both gases are at the same temperature and pressure. If gas A and B are different gases, there is an entropy that arises once the gases are mixed. If the gases are the same, no additional entropy is calculated. The additional entropy from mixing does not depend on the character of the gases; it only depends on the fact that the gases are different. The two gases may be arbitrarily similar, but the entropy from mixing does not disappear unless they are the same gas - a paradoxical discontinuity...

As a central example in Jaynes' paper points out, one can develop a theory that treats two gases as similar even if those gases may in reality be distinguished through sufficiently detailed measurement. As long as we do not perform these detailed measurements, the theory will have no internal inconsistencies. (In other words, it does not matter that we call gases A and B by the same name if we have not yet discovered that they are distinct.) If our theory calls gases A and B the same, then entropy does not change when we mix them. If our theory calls gases A and B different, then entropy does increase when they are mixed. This insight suggests that the ideas of "thermodynamic state" and of "entropy" are somewhat subjective.

I don't agree with the use of the term "arbitrary" in the Wiki article, at least not in an important sense.
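
The discontinuity described in the quoted passage can be made concrete in a short numerical sketch (the function name and its all-or-nothing treatment of distinguishability are mine; the formulas are the standard ideal-gas mixing results, with R the molar gas constant):

```python
import math

R = 8.314  # molar gas constant, J/(mol*K)

def mixing_entropy(n_a, n_b, same_gas):
    """Entropy of mixing two ideal gases at equal T and P.

    If the gases are labelled identical, no mixing entropy is assigned;
    otherwise the standard ideal-mixing formula applies. The paradoxical
    discontinuity lives entirely in the boolean `same_gas`.
    """
    if same_gas:
        return 0.0
    n = n_a + n_b
    x_a, x_b = n_a / n, n_b / n
    return -R * n * (x_a * math.log(x_a) + x_b * math.log(x_b))

# One mole of each: identical gases give 0, while distinct gases give
# 2*R*ln(2), no matter how similar the two gases are.
```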

This paradox has a special place in my heart because when I began reading a lot more on statistical mechanics and doing problems on it I realized this problem myself somewhat early on. I thought to myself "holy shit, maybe I could be really good at this, look what I uncovered, this is air tight too!"

I finally got over the fear of someone stealing my great insight and posted a question in Stack Exchange. Within a few hours someone asked, "do you mean the Gibbs Paradox?"

Yeah, someone had the idea first, over a century ago, pretty much as soon as Boltzmann published. So much for my genius lol. I felt better about this after reading Max Tegmark describe "discovering" decoherence as a first year PhD student, only to learn he'd been scooped by several years. At least that was somewhat close in time though.
• 11.1k
Again take a box with a partition in it, with gas A on one side, gas B on the other side, and both gases are at the same temperature and pressure. If gas A and B are different gases, there is an entropy that arises once the gases are mixed. If the gases are the same, no additional entropy is calculated. The additional entropy from mixing does not depend on the character of the gases; it only depends on the fact that the gases are different. The two gases may be arbitrarily similar, but the entropy from mixing does not disappear unless they are the same gas - a paradoxical discontinuity...

I suggest that this is an illusion created by the terms of the example. If each individual molecule of compartment A is marked as A, and each individual molecule of B is marked as B, then even if the two compartments each contain the same type of gas, the combining will appear the same as if they are different gases, because they are marked as different.

There is no paradox, just an illusion. In the case of two distinct gases, an act of mixing is required, and this requires time and energy. In the case of the gases being the same, it appears like the gases have already mixed as soon as the separation is removed. That's just an illusion, mixing has not occurred, as marking the molecules would reveal.
• 768

In the case of two distinct gases, an act of mixing is required, and this requires time and energy. In the case of the gases being the same, it appears like the gases have already mixed as soon as the separation is removed. That's just an illusion, mixing has not occurred, as marking the molecules would reveal.

Yes, that was sort of Gibbs' original point in the case of ideal gases. You need a non-extensive entropy to deal with that problem.

Jaynes' big point is summed up in the introduction: "We argue that, on the contrary, phenomenological thermodynamics, classical statistics, and quantum statistics are all in just the same logical position with regard to extensivity of entropy; they are silent on the issue, neither requiring it nor forbidding it."

And, counter intuitively, non-extensive entropy actually tends to model many real systems better (e.g. Tsallis entropy).

Jaynes' paper does a better job explaining why this has generally been considered a genuine paradox. Distinguishability is, in an important sense for predicting/describing physical interactions, relational.

• 621
I guess I give up, having not been able to follow what it is you might be claiming.

My central mission in this conversation is to define time in terms of boundaries and their inter-relationships.

My central premise is that time is a type of general boundary modulator; perhaps it is the general boundary modulator.

For an example of what I mean, consider: once you were a boy in single digits; now you are a man in double digits. How did this change happen? Typically, we say, "Time passed and you, making your various rites of passage: birth, first steps, first words, first date, graduation, first job, marriage, etc., moved on, growing older."

Well, do you think these rites of passage are moving you along through one boundary after another? Do you think passage through all of these boundaries has been actuated -- maybe I should rather say, facilitated -- by time?
• 20.4k
What you have to say is too muddled to have any reverberation.
• 621
What you have to say is too muddled to have any reverberation.

Thanks for the weigh-in. Dialogue is divine, even when it's not.

You think my thinking untidy.

The hard trick in slinking behind low expectations: maintaining enough public interest to avoid wholesale dismissal. Invective trumps silence, especially when it's instructive.

Against obverse inclination, you've been doing your job of examination: unselfish.

Hostile interest is intriguing because -- I'm off topic...

Back to chasing reverberation. Goal: sustain your pithy judgments.