• Benj96
    2.2k
    Entropy is the tendency of things to spread out and become more chaotic and disordered.

    Now let's take a look at "order" and "disorder" for a moment.

    I have a bag of 20 marbles. They are homogeneous - all identical. I release the bag onto a table and let them bounce around and roll to a stop. There's a mess. I ask my physicist friend to make these marbles ordered.

    Naturally one would simply gather them all together as close as possible into a single unit. This is an analogy for the singularity of energy at the beginning of the universe.

    But now I take the same bag of 20 marbles, but this time 10 are blue and 10 are green. I say "order these please". To the person doing the task it may be more intuitive to group them into two distinct groups of all blues and all greens. This, to them, is the highest level of "order" one can achieve in such a state because: a) the marbles are collected by their respective qualities, and b) the groups are even in number. Balanced.

    There's a fundamental difference in order here. The physicist has ordered her marbles not spatially but categorically. She has identified a qualitative difference which logically rules out putting them all in one place and calling them ordered.

    Again I take a bag of 20 marbles, but this time 2 are blue, 3 are violet, 5 are green, 1 is yellow, 7 are red and 2 are orange. Again she opts for a "categorical organisation" by colour, making 6 groupings, but she cannot balance them as the numbers in each group are different. So instead she uses another type of relationship: I see they are arranged in a spectrum from violet to red. The physicist explains that this is the highest level of organisation of these marbles, using not only the category of colour but also the relationship of one colour to another (wavelength).

    I ask her: which arrangement is the most ordered? And secondly, which holds the most information?
    The first - where everything is the same and in one place - or the third - which is structured by quality and also by the relationship of those qualities?

    As you can see, the potential for an ordered system is 1 when there is only 1 quality (gather them up). The potential for an ordered system when there are 2 qualities is far, far greater: chronological (which group came first), spatial (how are the qualities located with respect to one another), quantitative (how much of each?), interactive (what effect does one quality have on the behaviour or form of the other?). And this continues to grow explosively with each nth quality added to the system.
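
    One rough way to put numbers on that growth - a sketch of my own, just counting distinguishable ways to line the 20 marbles up, not any official measure of "order":

    from math import factorial, prod

    def arrangements(counts):
        # multinomial coefficient: distinguishable line-ups of the marbles
        return factorial(sum(counts)) // prod(factorial(c) for c in counts)

    print(arrangements([20]))                 # one quality: 1
    print(arrangements([10, 10]))             # two qualities: 184,756
    print(arrangements([2, 3, 5, 1, 7, 2]))   # six qualities: 167,610,643,200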

    How do we approach order in a world whereby everything is both qualitatively the same (energy) but also qualitatively different (mass, time, space etc)?
  • SophistiCat
    2.2k
    You already started a thread on this yesterday. Why did you abandon that discussion just to post another OP with the same ignorant tosh?
  • Kenosha Kid
    3.2k

    I don't see a profound revelation here. The two kinds of marbles were consciously ordered in the same way when they were put in the bag, became disordered in the same thermodynamic way, then were consciously reordered by someone else in similar ways. It is an artefact of the question alone that the person did not put the marbles back in the bag together, nor is there anything obviously profound in the fact that one person ordered the two sets of marbles one way and another person another way.

    I can order my record collection alphabetically by artist then by year, or alphabetically by title, or chronologically, or biographically, or in order of most use.

    There are myriad ways to order something. The same pile of bricks can be ordered to create 100 different houses, but disordering any of them just gives you rubble.
  • Benj96
    2.2k
    You already started a thread on this yesterday. Why did you abandon that discussion just to post another OP with the same ignorant tosh?SophistiCat

    That question was regarding "life" and entropy. This question regards entropy and the classification/definition of order. If you had read carefully and comprehended my line of questioning you would have observed the clear distinction. Besides, I wasn't aware I couldn't ask a question about the same concept more than once - perhaps I should never consider "time" or "consciousness" or any other topic further because I've already asked one question about them.

    Whether you find my questions ignorant tosh or not is up to you. But personally, I feel that needless ridicule of someone for being curious... is pretty ignorant. I wouldn't like to be someone like that, and quite frankly the views of someone with that lack of respect for another individual... don't really impact me at all.
  • Benj96
    2.2k
    There are myriad ways to order something.Kenosha Kid

    Yes, there are. But are there as many ways of ordering one type of thing as there are of ordering a collection of different things with respect to one another, as my question asked? What does variation in quality do to the capacity to order something, and can "one thing" hold as much information as "all things combined"? I never mentioned a bag btw. I mentioned three separate individual scenarios based on the same concept.
  • Kenosha Kid
    3.2k
    I never mentioned a bag btw.Benj96

    Ahem.

    I release the bag onto a table and let them bounce around and roll to a stop.Benj96

    If you're talking entropy in the thermodynamic sense, the order is not necessarily quantitatively different for different categories of ordering. Each category will have multiple orderings with different (low) levels of entropy. So, in answer to your question, some orderings will have more or less entropy than others.

    But you seem to be introducing an element of the scientist knowing that certain orderings have more meaning than others. Putting the coloured balls in a line in order of the EM spectrum has a meaning to the scientist that balls of a fixed colour as close together as possible does not. But this in itself wouldn't affect the entropy of the system: it is the configuration, the number of elements, and the effort required that affects the entropy.
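
    To put a toy number on that - a sketch under my own simplifying assumptions, treating each category of ordering for the 10-blue/10-green row as a macrostate and its entropy as just the log of how many colour patterns fit it:

    from math import comb, log2

    macrostates = {
        "blues exactly in the left half": 1,
        "blues in one contiguous clump":  11,            # the clump can start at slots 1..11
        "any arrangement at all":         comb(20, 10),  # 184,756 colour patterns
    }

    for name, w in macrostates.items():
        print(f"{name}: W = {w}, S = {log2(w):.1f} bits")   # 0.0, 3.5 and 17.5 bits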

    That said, since scientist + balls is not a closed system, there will be entropic effects of the scientist figuring out an ordering with meaning compared with just smushing some balls together.

    Have I understood you correctly?
  • apokrisis
    6.8k
    How do we approach order in a world whereby everything is both qualitatively the same (energy) but also qualitatively different (mass, time, space etc)?Benj96

    What you are drawing attention to is that “disorder” is a relative claim. The question becomes “disordered in relation to what kind of expectation, meaning, purpose or constraint?”

    So a more general definition of entropy would be grounded in an information theoretic perspective. What about this world counts as a degree of uncertainty or surprise in relation to my simplest model of it as a system?

    You can see that your first system - 20 identical balls - is already a highly constrained or ordered one as you have somehow managed to reduce all possible surprise as to the colour of the balls. Surprise is minimised. Your world is completely predictable on that score.
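
    To make the "surprise" reading concrete - a small sketch with my own numbers, computing the Shannon entropy of the colour of one marble drawn blind from each of the OP's three bags:

    from math import log2

    def surprise_per_draw(counts):
        # Shannon entropy of the colour of a single blind draw, in bits
        n = sum(counts)
        return sum(-(c / n) * log2(c / n) for c in counts)

    print(surprise_per_draw([20]))                # all identical: 0.0 bits - nothing to be surprised about
    print(surprise_per_draw([10, 10]))            # blue/green:    1.0 bit per draw
    print(surprise_per_draw([2, 3, 5, 1, 7, 2]))  # six colours:  ~2.32 bits (the cap is log2(6) ~ 2.58)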

    A truly entropic situation would be if the balls could randomly take on any colour at any time. Even as you grouped them, they could switch colour on you. Or split, merge, be in multiple places at once, etc.

    So note how the standard mental image of an entropic system already smuggles in an atomising assumption - some stably countable degree of freedom like a particle that itself is already in a highly negentropic state of constraint. The particle and its qualities are made as homogeneous as possible so that - by contrast - a chosen variable like location becomes maximally surprising. The thing you have the least information about, the least control over ... until you start grouping and impose order over that too.

    Of course, treating physical systems as if they were systems of particles - an ideal gas confined in a container and sat in a heat sink - is a useful model. If you are doing practical thermodynamics here on the warm surface of a planet floating in a cosmic heat sink with a temperature of 2.7 K, then the statistics of bags of marbles pitches things at a suitable level.

    But once you want to apply the concept of entropy to the Universe itself as a system, then you have to recognise this habit of including negentropic assumptions in your metaphysical accounts.

    Take the Big Bang to Heat Death story of a Universe that starts off hot and constrained and becomes cold and spread out. In a broad sense, nothing changes, as the positive contribution to entropification in terms of a disordering of position is matched by a negative contribution in terms of an increase in the resulting gravitational potential. If the universe were just a bunch of balls spilling out, then a gravitational gradient wanting to clump them all back becomes an ever-swelling constraint on their apparently unconstrained kinetics.

    Of course, that in itself is way too simplistic a model of the actual universe, as it presumes that the Big Bang and Heat Death can be modeled in terms of countable degrees of freedom - definite material particles with a defined location and energetic state, and therefore a matchingly undefined degree of surprise as to the locations or energies they might have.

    In the Big Bang, any such degree of freedom is maximally indeterminate. The quantum uncertainty of any claim for identity is as high as it could be. So - relative to that accountancy point of view - the Big Bang was a chaos that became increasingly ordered by a process of spatiotemporal expansion. What got constructed was a developing heat sink that started to make particles - as localised energy densities - countable elements. After a while, the chaos got sorted into collections of quarks and electrons with their identities constrained by fundamental symmetry breakings.

    Then at the other end of the story, you have the Heat Death which - to our best knowledge - will be a state of immense order and uniformity ... measured from a relative point of view.

    At the Heat Death, you will be left with an empty vacuum that continues to radiate with only a zero point quantum energy. All particles will have been swallowed up by black holes that then themselves eventually evaporate. The contents of this world are black body photons with a wavelength the width of the visible universe - the de Sitter horizon. Or an uncountable number of photons with a temperature within a Planck-sized hair's breadth of absolute zero.

    So again, like the Big Bang, essentially a nothingness without a point of view. But still some kind of transition from a hot everythingness of an ur-potential to the chill emptiness of a generalised spatially structured void.

    Thus using entropy models to describe the evolutionary trajectory of systems such as the universe is tricky and fraught. But for quite understandable reasons. We have to make three shifts in our point of view to arrive at a vantage that is actually "objectively" outside the totality of the thing we want to describe.

    The first rung of the modelling is the standard entropy story. We have a bag of balls, a die with a fixed number of faces, an ideal gas with a defined number of identical particles. We are creating a world that is completely ordered or constrained in a way that, by contrast, leaves other aspects completely free or random. A world of degrees ... of freedoms. So this is an internalist dichotomy. We stand inside a world where this contrast is between what we are certain of - some number of balls - and what we are matchingly uncertain about - their possible location.

    A second rung of modelling would be to recognise that this state of affairs is only relative to that constructed point of view. It could be otherwise. We could be certain about the location of the balls - clumped in this group - but uncertain as to their identity. So now your counting of entropy/surprise/disorder is relative to what you decide to fix vs what you leave to swing free. If you are imagining a system as a bag of balls spilling out freely, well what about the gravitational pull that is a countering quantity of negentropy?

    Like cosmologists do, you would have to step up to a viewpoint where the creation of spacetime - as the great heat sink being manufactured to absorb what now looks to be some initiating Big Bang quantity of located energy - is also a thing to be counted in the final balance.

    Then from there, you need to step up to a third rung that achieves a viewpoint completely outside the system in question. If the Universe isn’t just a messy dispersion of degrees of freedom, nor even the orderly construction of a vast heat sink void, then you have to have an evolutionary tale that combines the local and global scales of what is going on in holistic fashion.

    Now you arrive at a picture where the very distinction you seek - order vs disorder - has to emerge into being. At the beginning of time - the Big Bang - order and disorder are radically indistinguishable as there is just an absolute (quantum/Planckian) potential. And at the end of time, you have the opposite of that. The Heat Death is the final maximal dispersion of that potential into the everlasting and unchanging definiteness that is an infinite void with a single temperature and an undifferentiated holographic glow of de Sitter radiation. Both locally and globally, there is maximal uniformity across all possible locations along with a maximal number of those possible locations where something could have been different.

    So at the beginning of time, nothing could be counted as distinctive variety - individual bits of information or degrees of freedom. Everything was a hot quantum blur of potential. A quantified account can only be imputed retrospectively by the countable variety - in terms of a quantity of energy/a quantity of space - that we observe around us now.

    And at the end of time, the number of energy bits (Heat Death photons) and number of spatial bits (Planck scaled distances) will be matchingly infinite in number. So uncountable for the opposite reason of being in unlimited abundance and hence offering zero distinctiveness once more. A chill blandness of differences (radiation) that can’t make a difference (to the cosmically prevailing temperature).

    Standing on the third rung right outside the system that is the universe, we now see a transition from unlimited potential to unlimited difference (that also, matchingly, makes no meaningful difference).

    Each view of the situation can be correct. So the standard bag of marbles modelling works fine within its own limits. But also each enfolds the other as a succession of larger views. And the largest view is radically unlike the standard, or even the second tier relativistic models used mostly in cosmology.

    It is only when you get to quantum holographic type models of the universe - de Sitter horizons, etc - that you start tracking everything that is emerging. Marbles with some countable identity (surprising or otherwise) to have, along with countable locations that give them some place (surprising or otherwise) to be.
  • javra
    2.4k
    So a more general definition of entropy would be grounded in an information theoretic perspective. What about this world counts as a degree of uncertainty or surprise in relation to my simplest model of it as a system? [...] A truly entropic situation would be if the balls could randomly take on any colour at any time. Even as you grouped them, they could switch colour on you. Or split, merge, be in multiple places at once, etc.

    [...]

    Then at the other end of the story, you have the Heat Death which - to our best knowledge - will be a state of immense order and uniformity ... measured from a relative point of view.
    apokrisis

    Wanted to read your thoughts on what I’ve traditionally viewed to be a contradictory semantics between IT notions of entropy and, for lack of better phrasing, empirical notions of entropy. Trying to keep things short:

    IT notions of entropy equate entropy to degrees of uncertainty - to which I'll add: such that multiplicities of possibility result that thereby diminish what is, or else can be, ontically certain and, hence, determinate. I naturally further interpret that the more extreme the ontic uncertainty, or indeterminacy, of a given, the more chaotic the given becomes.

    On the other hand, the empirical notion of entropy holds it that the process of entropy moves individual givens via paths of least environmental resistance toward an end-state of maximal order and uniformity.

    In short, increasing IT’s entropy results in increased disorder. Whereas increasing entropy when empirically understood results in increased, global, homogenized order.

    To me, this is a 180-degree turn in semantics.

    I’m partial to what I’ve here labeled the empirical notion of entropy (entropy leading toward a global, homogenized order), and can’t so far find means of making it cohesive with IT’s notions of entropy.

    You’ve made use of both notions. How do you make sense of them in manners devoid of equivocation? Hopefully I’m missing out on something here.
  • javra
    2.4k
    ps. Glad to see you're still around. :grin:
  • apokrisis
    6.8k
    You’ve made use of both notions. How do you make sense of them in manners devoid of equivocation? Hopefully I’m missing out on something here.javra

    Hi Javra. As Shannon made clear, these would be physically complementary perspectives. The information and the dynamics. But also, that fact gets confused because reductionist science still wants to strip its metaphysics down to a world devoid of meaning. So we have the paradox that information theory winds up counting noise rather than signal. A bit might well have significance, but information theory just locates it as an atomistic position - a bare material absence or presence.

    So again, reductionism gives a useful first order model of reality. But it begs the question as soon as you want to do real philosophical work. Almost anyone trying to be scientific about metaphysical questions finds the whole discussion going off the road, as standard science is designed for modelling a world that already has its global constraints (its laws) and local constants (its atomistic grain) baked in as unexamined presumptions.

    A systems view is based on all four Aristotelean causes. Reductionist science wants to account for the world only in terms of material/efficient causes as its atomistic variables. So that is what still frames the discussion, whether we are counting entropy in terms of informational bits or dynamical degrees of freedom. The holism is collapsed and hidden in the fact that information and dynamics are united by "the Planck scale", where "it" is "bit", and vice versa.

    You would have to crack open the machinery of the Planckscale - the triad of constants that are the speed of light, the strength of gravity, and the uncertainty of the quantum - to find where the deeper holism has got stuffed. (Clue: G and h are in a reciprocal relation to define fundamental location vs fundamental action, and c then scales the interaction to give you an emergent direction for temporal evolution.)

    So the informational bit and the dynamical degree of freedom are not an equivocation but the same thing seen from its two possible directions. The informational angle stresses the formal/final half of the systems view - that which speaks to a capacity to constrain action to a location such as to make definite some atomistic degree of freedom. And then the dynamical angle speaks from a material/efficient perspective where such a degree of freedom simply exists ... in some brute fashion as a given. The constraints acting to make this so are extra to the model.

    And then equivocation of a kind does arise when the constraints-based production of a bit is taken for granted as likewise a brute material fact with no systematic history. This is what happens when physics seems to say reality is made of bits, as if it WERE a material rather than a meaningful limitation that has created a "material" possibility.

    Yet even as an equivocation it is a useful one for founding models of semiotic complexity. It is a huge fact - one that legitimated the whole exercise - to be able to show that there is an irreducible physics of symbols. The old Platonic division between matter and idea does actually reduce to a Planckscale commonality. There is a baseline size to counterfactual definiteness. A bit of noise or entropy is the same size as a bit of signal or negentropy when you drill down to the simplest possible level of material description. And then having a fundamental basis for the measurement of cosmic simplicity, you can do what reductionist science is so good at doing - add levels of more complex systems modelling on top. Like chemistry, biology and sociology.

    Discovering the equivalence of Boltzmann entropy - dynamical degrees of freedom - and Shannon information entropy was an epochal move. And what united them was Planck scale physics.
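
    For reference, the formal bridge is the standard textbook one (nothing Planck-specific in these lines, they just show the two countings collapsing into one):

    S_{\text{Boltzmann}} = k_B \ln W, \qquad S_{\text{Gibbs}} = -k_B \sum_i p_i \ln p_i, \qquad H_{\text{Shannon}} = -\sum_i p_i \log_2 p_i

    \Rightarrow\ S_{\text{Gibbs}} = (k_B \ln 2)\, H_{\text{Shannon}} \quad \text{for the same distribution } \{p_i\}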

    Physics can now recover a full systems perspective from that. As it is doing with its information theoretic turn and attempts to recast quantum theory in the language of contextual constraint (decoherence, etc).

    A simpler way to put it might be that information theory is seeking its least meaningful quantity - the bit that could be countably present because it could be countably absent. Dynamical degrees of freedom are likewise the least form of material action that is countably present vs countably absent. And because reality is a system, based on an interaction between laws and actions, constraints and possibilities, regulation and dynamics, the search for the smallest scale of definite existence - a grain of being - arrives in the same place when you take either route.

    The further complication - the third rung issue I cite - is that the actual Universe only arrives at this physical limit of counterfactual definiteness at the end of time. It is the great fact that evolves.

    Or equivalently, if you unpack the machinery of the Planckscale maths, the end of time is also the biggest and flattest possible state of things. A cold and even void very definitely exists in a way that was not the case at the beginning, when all you could say was there was a state of indeterminate potential.

    So it takes three steps back to see the wholeness.

    Step one creates the reductionist view of an atomised ground. Reality is composed of bits. And both the informational and the dynamical perspective arrive at a counting system to handle that.

    Then step two is to see that information and dynamics are the two complementary halves of the one deal. The maths of the Planckscale encode the fundamental largeness of reality as much as its fundamental smallness. A reciprocal relation is what is baked in, but rarely highlighted.

    Then step three is to see that this very distinction - of maximal largeness and smallness, or order and disorder, spatiotemporal extent and local energy density, and other ways of describing it - is itself a feature that has to emerge via a process of development. Crisp counterfactuality is where things arrive as they cease to change at the end of time. It is only when things get very cold in a very big world that even quantum fluctuation arrives at its residual level.

    An observer of the Heat Death could look around and be sure that there is just nothing happening in the most extreme possible fashion. The cosmos still expands at lightspeed. And that creates event horizons that must radiate. So material dynamics is in full play. But it is equally devoid of informational difference. It is so homogeneous that it is just an eternalised nothing.

    The glass is both completely full and completely empty, and so its counterfactuality is expressed not just locally but globally. If a Heat Death photon represents some hope of an energetic disturbance, a local perturbation, well it has now been stretched so that a single wave beat spans the visible universe and thus can do no work inside that event horizon.

    Thankfully we exist because the universe had to cross over from one kind of simplicity to the other. At the Big Bang, there was no stable counterfactuality in terms of global informational constraint or local dynamical degrees of freedom. At the Heat Death, the two are united by a local~global homogeneity. Halfway through the story, there is an abundance of stars and chill vacuum. There are many localised gradients where energy densities can bleed into heat sinks. The grand equilibration process is still in complex unfolding. Reductionist science has eons before its celestial accountancy is redundant.
  • apokrisis
    6.8k
    Just popped back to check an old post. Nice to see a few metaphysics threads going. :)
  • javra
    2.4k


    Thanks for the account. As previously, some minor metaphysical differences between us - you say the end-state of the universe is a physicalist's Heat Death, I say it's some cosmic form of Nirvana, kind of thing :razz: - but I respect your metaphysics in its own right. (And have few doubts that many hereabouts don't much respect mine.)

    But to rephrase things in as simpleton a fashion as I can currently produce: The entropy of given X within the universe leads to disorder relative to given X (its permanency, or identity, or determinacy steadily ceasing to be), but simultaneously leads to greater order in respect to the universe itself as a whole. Entropy thereby simultaneously increases disorder and order relative to parts and to everything, respectively. Is that about right? If it’s not, please correct this interpretation as needed.
  • apokrisis
    6.8k
    But to rephrase things in as simpleton a fashion as I can currently produce: The entropy of given X within the universe leads to disorder relative to given X (its permanency, or identity, or determinacy steadily ceasing to be), but simultaneously leads to greater order in respect to the universe itself as a whole. Entropy thereby simultaneously increases disorder and order relative to parts and to everything, respectively. Is that about right?javra

    Sorry, I don’t think I follow. Entropy is a measure of where some system X might be on a spectrum between maximal order and maximal disorder - if we are speaking very simply.

    So a pack of cards might be completely ordered in terms of suit value - ranked in a sequence that has zero uncertainty from that point of view. Or it might be completely disordered in being so well shuffled you couldn't guess what came next at a level better than chance. Or it might be somewhere in between in its shuffle, so that you could still guess to a degree that one card would follow the next in sequence.
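
    To put the two extremes in numbers - a quick sketch of my own, measuring uncertainty about the deck's arrangement in bits:

    from math import factorial, log2

    fully_ordered  = 1                # only one arrangement fits "ranked in sequence"
    fully_shuffled = factorial(52)    # any of 52! ~ 8.07e67 arrangements is equally likely

    print(log2(fully_ordered))        # 0.0 bits - zero uncertainty
    print(log2(fully_shuffled))       # ~225.6 bits - maximal uncertainty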

    Reductionism likes to emphasise that random local action will always arrive at a perfectly shuffled deck. An arrangement that offers the least predictability. Mindless nature can have an entropic arrow simply because of unmotivated statistics.

    But I was countering this kind of happy metaphysics by saying it builds in presumptions - like that nature just comes with brute degrees of freedom in the way our imagination supplies us with these handy decks of cards and bags of balls that constitute a reality already pre-atomised.

    So what is determinate - the concealed presumption in the OP - is that a bag of identical balls can just exist. The balls don't fluctuate through all kinds of possible identities, just as the deck of cards doesn't muck about in any fashion and just passively lets you shuffle them.

    But we know from fundamental physics that any notion of countable particles disappears as you reach the energy densities of the Big Bang. Standard notions of entropy counting cannot apply in any simple fashion. And the same applies at the Heat Death in a different way.

    So entropy is a modelling construct - and all the better for the fact that this is not disguised. The mistake was to talk about energy as if it were something substantial and material - a push or impulse. And now people talk about entropy as a similar quantity of some localised stuff that gets spread about and forces things to happen.

    This seems to be what you have in mind here, but I’m not sure. My point was about how the entropic/informational approach to physics can free you from one sided materialistic conceptions. A fuller systems metaphysics is implicit in the maths once you get past the usual introductory examples.
  • javra
    2.4k
    So entropy is a modelling construct - and all the better for the fact that this is not disguised. The mistake was to talk about energy as if it were something substantial and material - a push or impulse. And now people talk about entropy as a similar quantity of some localised stuff that gets spread about and forces things to happen.apokrisis

    Yes, entropy is a model just as much as, say, our notions of biological evolution are a model. However, I yet hold that there is a terrain which is being modeled in both cases. And, as with biological evolution, for lack of better phrasing, we yet term the terrain by the name of the model we employ to map it.

    Because of this, until I stand corrected, I’ll be addressing entropy as the terrain which we do our best to model.

    Also, thought I’d mention this: Maybe I’m cheating, wanting to take a shortcut, by having asked the question - rather than taking time to get into serious study of the differences and commonalities between IT’s entropy and Thermodynamic’s entropy. But to try to make my previous post better understood:

    When considering the metaphysical issue of identity: It can be argued that the universe’s identity as a whole is currently not maximally ordered, being instead fragmented into multiple, often competing, identities – residing within the universe, and from which the universe is constituted – whose often enough conflicting interactions result in a relative disorder, or unpredictability, and, hence, uncertainty. By “identity” I intend anything which can be identified in principle which, for simplicity of argument, is corporeal: be these individual photons, rocks, humans, stars, black holes, etc.

    On the one hand, when considered from the vantage of some individual identity: each existent given within the cosmos a) is in a state of flux (a flux which can be ascribed to entropy and negative entropy) and b) holds its own imperfect order of identity – imperfect on account of a flux that moves toward maximal entropy. As maximal entropy (cosmic thermodynamic equilibrium) is approached, each existing identity within the universe becomes increasingly disordered – this until all identities within the universe cease to be upon obtainment of maximal entropy. From this vantage, increased entropy leads to increased disorder (namely, relative to the parts of the universe as a whole).

    On the other hand, when considering the cosmos’s identity as a whole: increased entropy will simultaneously result in an increased order of the cosmos’s being as a whole - this till maximal entropy is obtained, wherein the identity of all parts of the cosmos vanish so as to result in a maximally ordered, maximally harmonious or cohesive, and maximally homogeneous identity of the universe. From this vantage, increased entropy leads to increased order (namely, relative to the universe as whole).

    You might not agree with this, but hopefully I’ve better expressed the perspective which I previously mentioned.
  • apokrisis
    6.8k
    Because of this, until I stand corrected, I’ll be addressing entropy as the terrain which we do our best to model.javra

    But it is only the differences that we would experience or measure. And "entropy" talk is about imputing the mechanism.

    Time has a thermodynamic arrow. Entropy - measured as disorder - has a tendency to increase. The terrain seems to have this constant slope downwards.

    So are we being propelled down this slope by the hand of some global force? Or are we stumbling down this slope due to the local vagaries of chance? Entropy thinking is a claim about the imagined mechanism.

    When considering the metaphysical issue of identity: It can be argued that the universe’s identity as a whole is currently not maximally ordered, being instead fragmented into multiple, often competing, identities – residing within the universe, and from which the universe is constituted – whose often enough conflicting interactions result in a relative disorder, or unpredictability, and, hence, uncertainty.javra

    Yes. The universe at the age and temperature we live in right now is in the process of transiting from one extreme to the other. So you have this fragmentation that ranges from simple identities to complex ones.

    A mountain is an entropy dissipating structure. A monkey is too. Different grades of complexity can evolve as bits of the universe are hotter than other bits and provide the energy that allows the localised accumulation of information in the form of entropy-producing superstructure.

    But at the beginning of time, such variations from place to place were minimal - quantum level fluctuations around the maximal possible heat density. And at the end of time, they will again become minimal. But now as quantum level fluctuations around the minimal possible heat density.

    So complexity of entropic identity is just a passing stage we are having to live through at the moment.

    On the other hand, when considering the cosmos’s identity as a whole: increased entropy will simultaneously result in an increased order of the cosmos’s being as a whole - this till maximal entropy is obtained, wherein the identity of all parts of the cosmos vanish so as to result in a maximally ordered, maximally harmonious or cohesive, and maximally homogeneous identity of the universe. From this vantage, increased entropy leads to increased order (namely, relative to the universe as whole).javra

    It is a kind of exchanging of one form of order for another. Or one kind of disorder for another. And that is why talk about order vs disorder tends to drop out of the conversation. As concepts, they become too simplistic.

    At the beginning, the Universe is all potential, all becoming. At the end, it is all spent, all become. So something has been wasted to get there. Or has something been achieved?

    We humans can project our value systems on to the scientific facts either way. The accepted scientistic view is to see it as a journey arriving at meaningless waste. You prefer to read it as achieving some ultimate good state - call it Nirvana.

    I say it is what it is. And the remarkable fact looks to be that we count as a high point of that fragmented identity which is the universe in the middle of its grand transition from universalised everythingness to universalised nothingness. We exist at the time of maximal somethingness. This is the time when local complexity - informational densities - can be its own thing.
  • javra
    2.4k


    Your physicalist bias is showing. I didn’t ascribe any value to disorder and order, so why the fuss? As to the metaphysics I’ve previously mentioned in jest, humor here aside, it is far more aligned to Peirce’s pragmatism than the Heat Death you take to be true on grounds of the physicalism you espouse:

    An especially intriguing and curious twist in Peirce's evolutionism is that in Peirce's view evolution involves what he calls its “agapeism.” Peirce speaks of evolutionary love. According to Peirce, the most fundamental engine of the evolutionary process is not struggle, strife, greed, or competition. Rather it is nurturing love, in which an entity is prepared to sacrifice its own perfection for the sake of the wellbeing of its neighbor. This doctrine had a social significance for Peirce, who apparently had the intention of arguing against the morally repugnant but extremely popular socio-economic Darwinism of the late nineteenth century. The doctrine also had for Peirce a cosmic significance, which Peirce associated with the doctrine of the Gospel of John and with the mystical ideas of Swedenborg and Henry James.https://plato.stanford.edu/entries/peirce/#anti

    Topics to make one gag or snide, right? Spewed by none other than Peirce.

    At any rate, I have no present interest in debating against physicalist metaphysics. More pertinently, the question concerning the disparity between IT’s model of entropy and the thermodynamic model of entropy has not been answered clearly, if at all.

    I’ll let you at it.
  • apokrisis
    6.8k
    Topics to make one gag or snide, right? Spewed by none other than Peirce.javra

    But isn't evolution a balancing of the competitive and the co-operative? That's what ecology says.

    Peirce's religious excesses are what they are. To be taken in context.

    More pertinently, the question concerning the disparity between IT’s model of entropy and the thermodynamic model of entropy has not been answered clearly, if at all.javra

    What disparity? They are formally complementary modes of description.

    See Wiki - https://en.wikipedia.org/wiki/Entropy_in_thermodynamics_and_information_theory
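
    (For anyone following the link, the conversion that bridge rests on is, as far as I know, the standard one: a bit of Shannon entropy corresponds to k_B ln 2 of thermodynamic entropy, which is also where Landauer's bound on the heat cost of erasing a bit comes from.)

    S = (k_B \ln 2)\, H_{\text{bits}}, \qquad k_B \ln 2 \approx 9.57 \times 10^{-24}\ \text{J/K}, \qquad Q_{\min} = k_B T \ln 2\ \text{per erased bit}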