Comments

  • Networks, Evolution, and the Question of Life
    Think. Universal Turing computation.
  • Is 'information' physical?
    I've asked you several times to describe how it is that you arrived at your notions of pragmatism and semiotics. There had to be a transfer of information by causal processes somewhere.Harry Hindu

    Seems like a reasonable hypothesis.

    You feeling up to explaining how colour constancy supports your assertions about direct cause and effect yet?
  • Is 'information' physical?
    As to Platonia, again I am arguing that maths structures our experience in a semiotic way.

    So number in fact divides experience into theory and measurement. The mathematically-strengthened concepts on one side, and the values we read off dials on the other.

    So both concepts and impressions are rendered as “objectively” symbolised. Reality is understood in mathematically idealised fashion.

    And that understanding is objective as it is encoded in actual material symbols. It can be written down and transmitted from one mind to another. It is replicable, and indeed evolvable, as semiotic structure.
  • Is 'information' physical?
    Saying there could be primal material cause is different from saying there is primal material substance or being.

    That is why I’m talking in terms of a process ontology.

    And so too that would be the correct understanding of Anaximander’s apeiron, or even Aristotle’s prime matter.
  • Is 'information' physical?
    Hegel is hard to parse,t0m

    You might be closer to Hegel than you think:t0m

    To be honest, I've given up trying to parse Hegel himself. But the key difference that keeps cropping up (apart from the tilting of the argument in favour of theistic readings) is that Hegel stresses the resolution of dichotomies, while I (following Peirce) am stressing the separation that results from dichotomisation.

    This is why Peirce had to make a clear distinction between vague being and general being. Hegel was indeed describing the generality that results from difference being self-annihilated. But Peirce's approach shows that the resolution or synthesis lies in the thirdness of habit formation. The dichotomised become equilibrated in the form of a complex mixture. Every part of reality becomes good and bad, or entropic and negentropic, as some generalised "fractal" balance.

    So generality can't be where things start. It is where things end because - once equilibrium is achieved - a complexly divided system is now in a state of generalised indifference. It is completely composed of differences, yet none of those differences now make a difference (to the overall state of Being).

    This then leaves open the need to characterise the ground that could give rise to this habitual equilibrium thirdness. And that is where vagueness or Firstness - just the pure spontaneity of a difference, not as yet judged or reacted to in any way - comes in.

    So Hegel - for me - failed to see clearly that generality is not any kind of ultimate simplicity. It is well organised complexity. We need a developmental opposite to this generality - which vagueness supplies.

    The vague and the general then find their own resolution in secondness or actuality. The mix of irreducible freedoms (inexhaustible firstness) and constraining limits (robust emergent habits) is why there is then a material reality that forms in-between these two ontic bounds.

    But then again, many passages of Hegel could be read as if he were aware of vagueness in this way too; he just wasn't as clear on the point as Peirce was.
  • Networks, Evolution, and the Question of Life
    The nodes in neural networks are placed there by modellers and use a message passing algorithm to update parameters linking the nodes. The nodes in gene expression networks are discovered through a kind of cluster analysis.fdrake

    I see a confusion here. Human-made machinery of course is engineered to be physics-free and so the hardware is completely generic in terms of its informational capacity. The network boils down to a collection of binary switches that represent a memory state. So there is a strong division between memory and processing. The divorce between the setting of weights and the dynamics of any "processing interaction with the world" is rather absolute.

    From that mechanical basis, a neural network will then evolve some operational state. That can then be described in functional terms. Some collection of nodes, with their pattern of weights, will "act like a particular feature detector". We will want to point to that functional element of the internal model as a cluster revealed by analysis. But really, that seems a heuristic rather than a properly motivated claim. A mathematically soft one, rather than a rigorously grounded one.
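    The "feature detector" talk can be made concrete with a toy sketch (my own illustration, not anything from the thread): a trained node is just a weight vector, and it "detects" a feature only in the weak sense that its dot product is large for inputs resembling those weights.

```python
# Toy illustration: a single "node" in a trained network is just a
# weight vector; it "acts like a feature detector" because its dot
# product is large for inputs that resemble the weights.

def node_response(weights, patch):
    # Linear response of one node: dot product of weights and input.
    return sum(w * x for w, x in zip(weights, patch))

# 3x3 vertical-edge template, flattened row by row:
# bright column on the left, dark column on the right.
edge_detector = [1, 0, -1,
                 1, 0, -1,
                 1, 0, -1]

vertical_edge = [1, 0, 0,
                 1, 0, 0,
                 1, 0, 0]   # a patch containing a vertical edge

flat_patch    = [1, 1, 1,
                 1, 1, 1,
                 1, 1, 1]   # a uniform patch, no edge

print(node_response(edge_detector, vertical_edge))  # strong response: 3
print(node_response(edge_detector, flat_patch))     # no response: 0
```

    The point of the sketch is that calling this unit "an edge detector" is exactly the kind of heuristic, functional gloss described above - nothing in the hardware marks it out as such.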

    Then with real-life genomic networks, we can't really claim that they even start with actual digital circuitry - the binary switches that enforce a strong divorce between memory states and dynamical behaviour. Genes may approach some kind of digitalism, and yet still be essentially "analog all the way down".

    For me, this is still the great unsolved riddle of life and mind. Sure, mostly science presumes a digital ground - life and mind are just super-complicated machinery. But ontologically, I take the semiotic view where creativity, spontaneity and uncertainty are irreducible (even if pragmatically constrainable). And so confusion will arise if genomic networks are still imagined as having to rest on mechanically stable foundations - actual little hardware switches.

    Generally, just a great big citation needed on the material in your post.fdrake

    Hah. Yep. I'm afraid I've deliberately steered clear of gene expression for the past 15 years, rather waiting for the science to sort itself out. But biophysics has suddenly got interesting, so gene regulation again becomes interesting for me as now it is easier to see what it actually has to control. And that has become much less due to the fact that molecular machinery does most of the hands-on regulation.

    A cell isn't a bag of chemistry but itself a highly regulated, almost machine-like, network of entropy flow or dissipative structure.

    Again, the hierarchy of control is becoming apparent. Molecular machinery is like a whole new level of regulation in itself, whereas before there was only a genetic blueprint and a chemical soup.
  • Emergence is incoherent from physical to mental events
    Still, you do tend to downplay the "wonder" at existence as such. That's fine. I don't think such wonder is sustainable for mostly practical creatures like ourselves.t0m

    I dispute that the wonder is something separate - mystical, supernatural, transcendent. I am certainly applying the scientific image of reality - Peirce's definition of inductive reasoning. And this follows from the presuppositions of being a natural philosopher - making the assumption the world is a functional unity, and so explainable in its own rational, immanent, developmental terms.

    So yes, we may be "worlds apart" on that score.

    But that doesn't mean that I just leave some aspects of life unattended. Indeed, rather than downplaying these psychological and cultural aspects of our lived reality, I tend to go after them pretty aggressively according to most people.

    So explaining art or morality or whatever in naturalistic terms is not downplaying. But it is certainly an attempt to "explain away" the mystical, the supernatural, the transcendent. That is, dispel their lingering claims to be part of any totalising metaphysics.
  • Is 'information' physical?
    ...the desire to overcome all contradictions. — Engels

    Hah. Isn't that just showing how Hegel got it the wrong way round?

    My approach (following Peirce) ends up saying that bare contradiction is instead what everything is founded upon.

    So it is a dialectical metaphysics. But it is the fact of symmetry breaking that is the foundation of Being, not the fact of "a pre-existing substantial symmetry".

    The logical process itself is the ground, not some kind of further "indeterminate substance" - as Apeiron is often understood.

    I agree this is not easy to accept as we are so accustomed to materialist ontologies. In fact it seems pretty idealist - objective idealism - in laying so much stress on the logic of dialectics or dichotomies as how "anything can happen in the first place".

    Yet I am still arguing for physicalism even though it is a pansemiotic physicalism. There is still primal material cause as well as primal formal cause. They are just now themselves to be understood as the originating "logical division which couldn't be prevented from starting to express itself as dialectically structured, or hylomorphic, Being."
  • Emergence is incoherent from physical to mental events
    This isn't the philosophical nothing, though. It's a seething chaos.t0m

    But I did say that I am talking about a vagueness - something that is less than nothing.

    Your approach seems to take the form that the mind is a busy place, especially when it is attentive and self-consciously thinking. And then it can relax and go quiet. And if that kept on going, it would have to become completely quiescent. It would become the nothing of the mind ceasing to exist.

    Which is fine for idealists perhaps.

    But I am talking as someone with physicalist ontological commitments. And so any "nothingness" has to make sense within that framework.

    And as I say, my metaphysical goal is imagining the least brute fact foundation for a tale of cosmic development. So vagueness understood as "mere fluctuations" is where that line of thought arrives.

    Sure, some absolute passivity seems like a better ontic candidate. Yet we must accept the fact that the materiality of action exists, along with the directionality provided by global organising form. So the best we can do is imagine the initial conditions as representing the least of both these things. A fluctuation is what that looks like.
  • Is 'information' physical?
    Can we really present any philosophy as a consistent system in a single moment with all of the meanings of each of its terms fixed? I don't think so.t0m

    Yep. Knowledge has to bootstrap itself from axioms. We have to risk making a hypothesis that seems "reasonable". But hey, it seems to work pretty well.
  • Is 'information' physical?
    Actually that claim is ‘transcendental realism’, in philosophy speak.Wayfarer

    I still prefer to call it naive realism in this case. Transcendental realism would surely be only a position that makes sense as a conscious opposition to Kant. So it would count as at least not being folk metaphysics. :)
  • Networks, Evolution, and the Question of Life
    To think of a genomic network as structurally isomorphic to a neural network is probably possible, but it will remove both specificities.fdrake

    That would be the research question. We do have a variety of neural network architectures as candidates. The genome could be just like one of those, or something different.

    I doubt, though I could be wrong, that genomic networks are necessarily concerned with message passing in continuous or quasi-continuous time like neural networks arefdrake

    Surely genomic networks would have to be very much always monitoring the state of the cell? The biophysics now makes the point that cellular machinery is always on the point of breaking down and so needing the right nudge to build itself back up.

    Direct evidence that timing matters is how the most time-critical regulation - that of the respiratory chains of mitochondria - is handled right on the spot by mitochondrial genes.

    So most of the mitochondrial genes have migrated to the central chromosomes, where the time lags are not so critical. But where fast response is needed, the genes are kept right where the reactions are taking place.

    Dynamical systems theory is already being used. Street's reference to canalisation has a link to bifurcation theory.fdrake

    Of course. But this is now different if instead of the dynamics being something directly physical - the self-organising dynamics at a "chemical" level - we are talking about a dynamics-based information model.

    So this is about the genome as analog computation. A dynamical description might apply. But at the level of disembodied information rather than embodied physics.

    It is just the same as the problem of talking about the representation of the world in the brain. Patterns of neural activity don't just simply "look like" the physics of the world they are modelling. There are certainly some topographical relationships preserved, but also a hierarchy of functional specialisation. Activity in one small blob of the visual cortex produces our sense of colour, another that of motion.

    So the old mental picture of the genome was a "flat" network at best. It was just some kind of straight transcription layer with no particular internal complexity.

    Once we start talking about neural networks, we are asking just how much internal hierarchical organisation, and hence "mind-like" complexity, there is going on. A very different ball game.
  • Networks, Evolution, and the Question of Life
    Our bodies and natural selection must obey the laws of physics. If it didn't then we could all fly without the proper anatomy.Harry Hindu

    Physics certainly constrains biology. But what then isn't constrained is by definition free to happen. And this freedom is what demands further modelling.

    In fact, this freedom is something we have begun to generalise by talking about computation, information, negentropy, modelling relations, semiosis, etc.

    So rather than biology just being some small and accidental expression of something physics failed to forbid, we are realising that it leads to a whole realm of "mind". The fact that physics creates a hard baseline of constraints in itself opens up a matchingly definite counter-possibility of a hard superstructure of rather absolute freedoms. Information becomes a real source of causality acting back on the world in evolutionary fashion.

    That is why talk of genomic networks being like neural networks in having a "separate adaptive intelligence" makes sense. A test of how much this is the case would be the resilience genomes show when you knock out one developmental pathway, and yet the genome can reorganise to still produce the desired functional outcome.

    One super interesting thing to bring up in relation to this - I might start another thread on this down the line - is in following Robert Rosen's contention that biology is, contrary to what is commonly thought, a more general science than physics, insofar as biological systems have a richer repertoire of causal entailments than do physical ones.StreetlightX

    Yep.
  • Networks, Evolution, and the Question of Life
    Boy, this gets complicated. I can see - half genes from mom, half from dad. Trait A from mom, Trait B from dad. But if it's a network, that's 1/2 mom's network and 1/2 dad's network. Not even that. How do you inherit 1/2 a network? Why would it fit with the 1/2 network from the other parent? I'm not arguing against what you're saying.T Clark

    That is a good point. A big constraint on the whole gene story is that there is this strong selection for the whole damn business being evolvable too. The genome both has to regulate development and also do its best to expose individual traits to the winnowing force of selection.

    So while a distributed genome behaves as a processing network, it still wants to offer up traits on an individualised basis, not the whole package.

    This was the argument for why we go from the simple gene rings of bacteria to the chromosomes of multicellular life. The chromosomes, with their recombination shuffling of the gene deck, look to be a mechanism that both preserves the coherence of the developmental network and allows traits to be exposed to evolution in a suitably individualised fashion.

    The evolution of sex - the separation of the immortal germ-line - is another important step. It means the genome that runs the everyday body is kept out of the evolutionary fray. The trait-level view is the specialist role played by gametes, or sperm and egg.
  • Networks, Evolution, and the Question of Life
    So it can be said that the topological properties of networks constrain the types of flow that pass through it, but the level of structural isomorphism is inappropriate for the analysis of the expression of particular genes or sets thereof.fdrake

    Yes. The interesting point about genomic networks is that their internal processing structure could be - in principle - uncrackable and forever hidden. Can we reconstruct the way a neural network executes its function even with full knowledge of the weights of its nodes?

    If the functionality is multirealisable, then a knowledge of some particular state of task-adapted componentry does not give a simple theory of the functional dynamics of the network.
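    Multirealisability is easy to demonstrate even in a toy case (my own sketch, not anything from the thread): two threshold networks with entirely different weights compute the same function, so knowing one particular set of task-adapted weights does not identify "the" functional theory.

```python
# Two threshold networks with different weights that realise the same
# function (XOR) - one set of weights does not reveal the unique way
# the function "had to be" wired.

def step(x):
    # Heaviside threshold unit.
    return 1 if x >= 0 else 0

def xor_a(a, b):
    # Realisation 1: XOR = OR, vetoed by AND.
    h_or  = step(a + b - 0.5)
    h_and = step(a + b - 1.5)
    return step(h_or - 2 * h_and - 0.5)

def xor_b(a, b):
    # Realisation 2: XOR = (a and not b) or (b and not a).
    h1 = step(a - b - 0.5)
    h2 = step(b - a - 0.5)
    return step(h1 + h2 - 0.5)

for a in (0, 1):
    for b in (0, 1):
        assert xor_a(a, b) == xor_b(a, b) == (a ^ b)
print("both weight sets compute XOR")
```

    Scaled up to thousands of weights, this is why reading the functional dynamics straight off the componentry looks hopeless.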

    We could still hope to model genomics at a higher level. That’s why I’m thinking of a description in terms of general logical principles. Like the repetition of units (as in segmented body plans) or timing information when it comes to regulating tissue growth and developmental symmetry breakings or bifurcations.
  • Is 'information' physical?
    Colors are merely the effect of the state of the apple interacting with light and your visual processing system.Harry Hindu

    Love the “merely”.

    Anyway, you are still successfully dodging the question of how an apple can still look red to us even when the light it reflects is not in the normal red frequency range. It can’t be then a simple cause and effect relationship in terms of the actual light entering our eye and the way we construe the hue of what we see. What we imagine we should see, given our model of the lighting conditions, takes over.

    The point here is that the indirect perceptual route is more accurate in that it sees the apple as it would be understood in ideal lighting conditions. It is the interpretation that can make allowances because the modelling isn’t simply driven in causal fashion by physical inputs.

    If you say that they are simply another model, then you are using models to explain models, which then makes the term, "model" meaninglessHarry Hindu

    Yes. And why not?

    Of course they are also models at completely different levels of semiosis. Colour experience is biological-level perceptual modelling of “the world”. Talk about electromagnetic radiation and wavelength is socially constructed knowledge of the world.

    One model can only change over eons of evolutionary time. The other we could reinvent tomorrow.
  • Is 'information' physical?
    Every time you make your case for how you think things are, you are attempting to get at and inform me of how things really are.Harry Hindu

    I’m telling you what I find to be a justified belief. I’m not pretending to have transcendent access to absolute truth.

    Why do you find that so hard to follow?
  • Is 'information' physical?
    I think the mistake Apo made was making the distinction that a wave is not red. Apples are red or not red. Anyone who knows what they are talking about should know that waves are not red. Apples are.Harry Hindu

    So how does colour constancy fit in here? You haven’t explained.

    And this talk of apples is just misdirection. A laser beam could be tuned to the same hue as the apple as seen under white light. So trying to treat colour as some material property of an apple is nonsense when we can see red just from the pure shining of a light.
  • Is 'information' physical?
    To even say that it is "indirect" is to admit causation, no?Harry Hindu

    But not your direct cause and effect. Instead my indirect causation which is the modelling relation.

    what is the point in making a distinction between "direct" and "indirect", when you are getting directly at how reality really is in order to make any statement about how reality really is?Harry Hindu

    But my argument is indeed that we don’t get at what reality really is. The modelling relation is about regulation, not knowledge. What we want is the most efficient and useful image of reality.

    You then want to claim there is a problem in that because now I’m making a claim about how things “really are”.

    Well, partly we can know what is real about our own epistemic strategies. Science is about accepting the pragmatism of the modelling relation. But then also it is only you who is concerned about some absolute veridical knowledge of reality in the first place. The high bar you set doesn’t apply to me if my claim to know that this is how the mind works is itself just another testable pragmatic hypothesis.
  • Emergence is incoherent from physical to mental events
    . It's what we experience (or do not experience) in dreamless sleep.t0m

    Even in dreamless sleep, there is a desultory rumination going on. There actually is experience of some deeply disorganised and unremembered kind. And when we try to shut off our minds when awake, or are sitting in a sensory deprivation chamber, the mind still rustles with fleeting visions and half-thoughts.

    So phenomenology supports a “nothingness” that is an active vagueness of fluctuations rather than a passive nothingness or emptiness.

    And I also prefer a foundational tale based on active fluctuation, as that is what science has been finding. The quantum vacuum seethes with virtual particles. It is not empty nothingness but furious action which completely cancels and so amounts to nothing.

    So the ontological argument for a fluctuation-based picture is derived from Peircean logic, phenomenology and our best physical theories. All roads seem to lead to Rome here.

    The beginning is indeed vague. Is this something that is crystal clear in your mind, or is this the sketchier part of the theory you are still working on?t0m

    As metaphysics, sketchy is fine. So my interest is in how the concept of a vagueness of unlimited fluctuations could be a proper scientific theory - actually modelled mathematically. That is what would take it forward.

    And there is already a lot of such physical modelling that bears on the question. Fluctuation-based thinking is pretty common. Virtual particles are one example I just mentioned.
  • Networks, Evolution, and the Question of Life
    That is, if we think in terms of networks, how is it possible to think the specificity of life itselfStreetlightX

    The problem I see here is you seem to be arguing that network topologies force their organisation on biological structure in a rather crystalline fashion. So something brute about possible topological arrangements would impose itself on biology and greatly restrict the variety of body structure that genes could encode.

    But the argument is that genes are organised like neural networks and so embody a plastic intelligence. As you say, the genetic information would be distributed as a pattern of network weights. Yet in being a neural network, the network topology wouldn’t simply impose its own crystalline structure on the biology. Instead, the network could represent any kind of biological design as the “image it holds in mind”.

    So just as a neural network can be trained to recognise any pattern, gene expression would be the same in reverse - the capacity to output any learnt pattern. The same intelligence of course helps to explain how the genome can recognise the current state of the body so as to respond appropriately in the first place. So really the genome is a pattern matching system.

    And as such, there doesn’t seem any strong reason to suggest that the genome would be limited in the variety of states it could choose to represent. The limits such as they are would be more the logical ones - like the rules to generate bilateral symmetry and the repetition of segments - than topological ones. And that in itself is evidence that the gene networks must be operating at an intelligent level - actually relying on logic-like operations to switch and direct body development.
  • Emergence is incoherent from physical to mental events
    But part of the problem is that some philosophical topics are trying to ground a metaphysics of beingschopenhauer1

    So is Being a verb or a noun here? Are we describing the process of how things could come to be, or frustrating ourselves because claiming existence as a brute fact leaves us with no counterfactuals to make a description of "what is" even possible?

    You've driven yourself into a mental cul-de-sac with your insistent reifications. Back up the truck and get back on the road.
  • Emergence is incoherent from physical to mental events
    I like your theory. I don't know that it exactly minimizes brute fact, though. What would this mean, exactly?t0m

    A semiotic approach to metaphysics says all you need to get things going is naked potential. Just the sheer fact of "something happens". So a propensity towards fleeting and dimensionless fluctuations.

    So the first action is neither accidental nor necessary - there is as yet no context to decide that either way.

    It is neither form nor material. It might seem like a fluctuation is the suggestion of an action in a direction, but the world as yet lacks the kind of definite history of existence that could determine that it expressed some direction as a definite fact. Well, there just is no world.

    And likewise, while it seems the fluctuation might have expressed an action, it didn't have a material effect. It did not react with anything else. Its existence left no mark.

    So again, this fluctuation - the first expression of a naked propensity - is the very least state of Being that we could possibly imagine. We start with a brute fact that is also the very least kind of brute fact that seems possible.

    This is what Peirce called Firstness or Tychic spontaneity. His logical term for it was Vagueness. Ancient Greek metaphysics started here too, with Anaximander's Apeiron.

    So then Peirce says: what if a lot of fluctuations are just popping off? Well, they are going to start reacting with each other. They are going to amplify or cancel. Alignments will form. Dimensionality itself will start to emerge, like the way rain drops falling on a virgin hill slope would eventually start to carve out regular channels, a pattern of rivulets, streams and rivers.

    The reacting itself is Secondness, or brute actuality. The third stage is then Thirdness, or indeed the carving out of habits, the establishment of strong constraints that tell of a developing history of action and the statistical emergence of whatever regularity is thus natural.

    So a semiotic metaphysics begins with less than nothing - as nothingness is some kind of already definite state, like a world with dimensionality and some absolute absence of content. And while you might say that this vague potentiality or Firstness is still "a something", a brute fact, it is the least kind of somethingness imaginable.
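    The "furious activity that cancels to almost nothing" intuition has a familiar statistical shape, which can be sketched in a toy simulation (my own illustration, not anything from the post): sum many random ±1 fluctuations and the net effect tends toward zero even though the activity itself never stops.

```python
import random

# Toy illustration: many spontaneous +1/-1 "fluctuations" largely
# cancel each other, so the average settles near zero - constant
# activity that sums to (almost) nothing.

random.seed(0)  # fixed seed so the run is reproducible

n = 100_000
fluctuations = [random.choice((-1, 1)) for _ in range(n)]

total_activity = sum(abs(f) for f in fluctuations)  # always n: never quiet
net_effect = sum(fluctuations) / n                  # mean tends toward 0

print(total_activity)           # 100000 - the seething never stops
print(abs(net_effect) < 0.05)   # True - yet the net effect is tiny
```

    The cancellation is only statistical, of course - which is exactly the point: the residue of uncancelled difference is what a story of amplification and habit-taking would then work on.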
  • Emergence is incoherent from physical to mental events
    Meh. It is a self consistent story about how existence could develop. So of course it may not totally do away with brute fact, but also it minimises the brute factness that normally dominates most folk’s metaphysics.

    Yes, the noumenal is never grasped. This is a phenomenological approach, as it says on the box. Yet the noumenal is going to be approached the most closely this way. Hence how semiosis is the pervasive theme of new frontier science.

    Now your latest ludicrous charge is that I’m dodging metaphysics. Christ you’re basically such a miserable bugger. Don’t you find any joy in encountering new ideas?
  • Is 'information' physical?
    Peirce also treated chance or spontaneity as real in this way. So vagueness, fluctuation or material potential are matchingly real, giving you a triadic metaphysics.
  • The Universe as a Gas Can – Part I: Entropy
    The basic idea of general relativity, as famously expressed by John Wheeler, is: "Mass tells space-time how to curve, and space-time tells mass how to move."

    So the shape of the spacetime "container" is determined by the distribution of its material contents. That shape then determines the gravitational acceleration that distributed masses will feel - mass will just run inertially along the curves of that space.

    I believe that is the basic thought MikeL is trying to express. Though in using his own colourful language, it is not so clear how well he understands what a physicist would be meaning.

    And yes, there is a mysterious issue of how could matter and spacetime "talk to each other" in this fashion. How does each "detect" the difference that each causes in the other?

    This is where the metaphysics comes into it. The physics has a model that works. But the mechanism sounds a little spooky and supernatural when you start to focus on the "how" of such a two-way mutual, or dichotomous, interaction.

    It is just the same as when Newton formulated his own theory of gravity and was forced just to conclude that there is "action at a distance". He knew that talk of an immaterial interaction - a connection supported by nothing - sounded metaphysically crazy. But the maths worked beautifully (over the then available scales of observation).

    Later science had better instruments. General relativity came along to show that "spacetime curvature" could do away with the need for spooky instantaneous connections with nothing in between to explain them.

    But now there is a new spooky connection between mass density and spacetime curvature to be explained. So onwards to quantum gravity theory. :)
  • The Universe as a Gas Can – Part I: Entropy
    Gravity is spatiotemporal curvature according to one way of modelling reality - general relativity. And it is a force in another - quantum physics (with its gravitons).

    So you have to be careful not to mistake the ontological commitments that science might make in modelling with some kind of final exhaustive truth about the "thing in itself". That is, you can't say gravity is not a force because it is curvature. Both of those alternatives are themselves just models.

    Also don't forget that dark matter has nothing whatsoever to do with dark energy.
  • The Universe as a Gas Can – Part I: Entropy
    You haven’t given a good reason for why two directions of action or entropic dissipation would be separated in a way that produces clumpiness.

    The mainstream physics story is that gravity only switched on to become a contracting force after the Higgs symmetry was broken. The universe had grown enough to cool enough and so gravitating mass could condense out and start to make the distribution of matter clumpy, or bubbly.

    This is why primordial quantum fluctuations in the original distribution of the Big Bang radiation are so important. There had to be a seed inhomogeneity already present in the matter distribution which the switching on of gravity would then amplify to create a fractal distribution of stars and galaxies.

    So a well worked out story to account for the bubbly distribution of matter in space already exists. Theory matches observation. An army of well funded scientists have researched this.

    Now you are wanting to invent an explanation for something that is not observed - a bubbly distribution of space itself.

    You might argue spacetime is bubbly because it has event horizons due to its expansion and lightspeed limits on material interactions. The Cosmos has a fractal lightcone structure.

    So yes, on the one hand, bubbly structure - fractal distributions - are the very hallmark of entropy-maximising processes. It is why you run up against bubbly features when talking about cosmological mechanisms.

    But there is no need to invent bubbly explanations unless there are bubbly observations in need of explanation.

    Dark energy is an example of a cosmic feature which appears to exist evenly everywhere in the vacuum - the basic spatial fabric of the universe.

    And yes, the alternative view that dark energy is just an observational illusion - the result of the distribution of matter being more fractally distributed than the standard inflationary Big Bang model - is quite plausible. It depends on whether the calculations those guys need to do pan out and give us some detectable prediction that matches what we can see.

    So matter distribution is already said to be bubbly - due to the amplification of primordial fluctuations. Now the question is whether dark energy is a further expansion mechanism when it comes to the smoothness of matter’s spatial backdrop.

    Dark energy has no reason to be clumpy, either theoretically or observationally. So as speculation, that hypothesis has poor motivation.
  • Emergence is incoherent from physical to mental events
    If we are still discussing the nature of mind, we only need biosemiosis and its epistemic cut.

    Peircean semiosis claims the irreducibility of spontaneity or tychism anyway. Otherwise what is there to constrain or regulate?

    Then this is logical at the metaphysically general level because it is reasonable in a causal sense. If everything tries to happen, much will cancel out. An average will emerge.

    Remember the high esteem in which you hold a statistical principle like natural selection? Well Peirce’s view is that physical existence is probabilistic and falls into the regularity of patterns due to emergent constraints.

    Note also that evo-devo has been replacing the modern Darwinian synthesis in biology. This is a recognition that material self organisation - development - is as important as inheritance and selection, or evolution. ... Just as Pattee’s epistemic cut describes.

    So as a result of the 1980s paradigm shift brought about by chaos theory, dissipative structure theory, self-organised criticality theory, etc., even physics and chemistry seem lively and mindful in that self-constraining order can emerge for purely probabilistic or entropic reasons.

    That makes pansemiosis a reasonable metaphysical framework. And biology certainly now recognises that life is not about bringing dead matter into action. It already wants to develop order. The trick then is to find material processes balanced at the edge of chaos - where they are at the point of critical instability and so easy to tip with just an informational nudge.

    You can’t be a follower of modern biology and not have noted this paradigm shift. The 1960s genecentric view is out. It is now evolution and development because life has to rely on the more fundamental self organising tendencies of a material world.

    Nature is rational or reasonable all the way down in that order cannot help but emerge to make disorder, or entropy, also an actual thing.
  • Is 'information' physical?
    as if you had a "direct" view of reality itself?Harry Hindu

    Precisely. Direct needs to be said in scare quotes. Indirect is admitting that it is only “as if”.
  • Is 'information' physical?
    What is interesting it's that both Bergson and Peirce viewed matter as some sort of decayed or constrained mind.Rich

    Amusingly:

    Charles Sanders Peirce took strong exception to those who associated him with Bergson. In response to a letter comparing his work with that of Bergson he wrote, "a man who seeks to further science can hardly commit a greater sin than to use the terms of his science without anxious care to use them with strict accuracy; it is not very gratifying to my feelings to be classed along with a Bergson who seems to be doing his utmost to muddle all distinctions."

    https://en.wikipedia.org/wiki/Henri_Bergson
  • Emergence is incoherent from physical to mental events
    As I say, it explains how semiosis is even possible due to an emergent scale of physical convergence.
  • Is 'information' physical?
    That question wasn't for you.Srap Tasmaner

    And yet the question was quite clearly addressed to me. So what's the game here?
  • Is 'information' physical?
    The question is whether language has a monopoly on meaning or on information or both.Srap Tasmaner

    Clearly my answer is that it doesn't. And I also pointed out that semiosis recognises grades of "communication" here, like your shift from indexical shrieks to iconic social signalling to symbolic speech acts.

    Is there mind involved in some of these but not others? Do some of these hominin sounds carry meaning or information but not others?Srap Tasmaner

    What do you mean by "mind"? Is there a useful distinction in operation here?
  • Emergence is incoherent from physical to mental events
    Can it be tested for, or is it something you think is not testable?schopenhauer1

    I'll repost the particular way that biosemiosis has now been cashed out as a theory below.

    Of course, the biophysicists involved don't frame their work as "biosemiosis".

    And the shift from a focus on genetic codes to mechanical devices - molecular machinery - is very Pattean. Semiosis itself needs to be understood more generally in terms of essentially physical constraints - again, a closing of the apparent gap between the informational and material modes of action.

    Life and mind don't just thrive despite material instability. Material instability is what they require as their ontic foundation. This is a profoundly new idea. Or at least profoundly different to the assumption that regular reductionist science makes about these things.

    So yes. This is a classic Kuhnian paradigm shift. It is not some particular theory. It is the framework of thought within which theories and experiments are themselves grounded. It is the paradigm within which they could even make sense.

    Counterfactually, it might have indeed proved the case that life doesn't arise at some level of critical instability as I describe below. As I say, it was the last thing most biologists would have expected even a decade or so ago.

    I mean the ATP-ase enzyme is actually a rotating spindle device driven by a proton gradient?!? Kinesin transport proteins literally walk their way down microtubule tight-ropes?!?

    That's got to be whacky science fiction right - if your paradigm of cellular metabolism is that it is a reaction vat or soup of chemistry.

    So your complaint is that this is non-standard and seems more like a metaphysical catch-all than laboratory-ready. My answer is this is a paradigm shift. And it can be seen happening as a series of waves now.

    Chaos and complexity theory were a big shake-up through the 1970s and 1980s for example. That is when the connection between material instability and emergent self-organisation was made.

    That then sent shock waves through life and mind science. Now we are seeing the results of that in terms of increasingly semiotic approaches. The question of how "immaterial" information can harness material "possibility" or instability has come to the forefront.

    Gene theory used to make the semiotic relation between information and matter look simple. Just code for some enzymes and toss them into an organic stew.

    (Note here how you just completely miss the fact that semiotics is already as central to biology as natural selection, by the way.)

    Well whoops. Genetic information and the sign relation they create is just the tip of the iceberg so far as semiotics goes. Looks like we need a better developed metaphysical paradigm to recognise semiosis in all its grades or guises.

    Anyway, here is that post which argues how biological sign has been shown to arise at a particular level of physics. The claims of biosemiosis have demonstrated their foundations.

    ---------------------------------------------

    Biophysics finds a new substance

    This looks like a game-changer for our notions of “materiality”. Biophysics has discovered a special zone of convergence at the nanoscale – the region poised between quantum and classical action. And crucially for theories about life and mind, it is also the zone where semiotics emerges. It is the scale where the entropic matter~symbol distinction gets born. So it explains the nanoscale as literally a new kind of stuff, a physical state poised at “the edge of chaos”, or at criticality, that is a mix of its material and formal causes.

    The key finding: In brief, as outlined in this paper http://thebigone.stanford.edu/papers/Phillips2006.pdf , and in this book http://lifesratchet.com/ the nanoscale turns out to be a convergence zone where all the key structure-creating forces of nature become equal in size, and coincide with the thermal properties/temperature scale of liquid water.

    So at a scale of 10^-9 metres (the average distance of energetic interactions between molecules) and 10^-20 joules (the average background energy due to the “warmth” of water), all the many different kinds of energy become effectively the same. Elastic energy, electrostatic energy, chemical bond energy, thermal energy – every kind of action is suddenly equivalent in strength. And thus easily interconvertible. There is no real cost, no energetic barrier, to turning one kind of action into another kind of action. And so also – from a semiotic or informational viewpoint – no real problem getting in there and regulating the action. It is like a railway system where you can switch trains on to other tracks at virtually zero cost. The mystery of how “immaterial” information can control material processes disappears because the conversion of one kind of action into a different kind of action has been made cost-free in energetic terms. Matter is already acting symbolically in this regard.
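A quick back-of-envelope check (my addition, not from the cited paper) shows the quoted ~10^-20 joule figure really is the thermal energy scale of warm water: a single unit of thermal energy is Boltzmann's constant times temperature, and molecular-scale events typically cost a few such units.

```python
# Sanity check of the ~10^-20 J thermal energy scale quoted above.
K_B = 1.380649e-23  # Boltzmann constant, J/K (CODATA exact value)
T = 310.0           # roughly body temperature, K

kT = K_B * T                     # one unit of thermal energy
print(f"kT  = {kT:.2e} J")       # ~4.3e-21 J

# A conformational change or weak bond at the nanoscale typically
# costs a few kT, i.e. on the order of 10^-20 J.
few_kT = 3 * kT
print(f"3kT = {few_kT:.2e} J")   # ~1.3e-20 J
```

So elastic, electrostatic, chemical and thermal energies all sitting at a few kT is what makes their interconversion effectively cost-free at this scale.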

    This cross-over zone had to happen due to the fact that there is a transition from quantum to classical behaviour in the material world. At the micro-scale, the physics of objects is ruled by surface area effects. Molecular structures have a lot of surface area and very little volume, so the geometry dominates when it comes to the substantial properties being exhibited. The shapes are what matter more than what the shapes are made of. But then at the macro-scale, it is the collective bulk effects that take over. The nature of a substance is determined now by the kinds of atoms present, the types of bonds, the ratios of the elements.

    The actual crossing over in terms of the forces involved is between the steadily waning strength of electromagnetic binding energy – the attraction between positive and negative charges weakens proportionately with distance – and the steadily increasing strength of bulk properties such as the stability of chemical, elastic, and other kinds of mechanical or structural bonds. Get enough atoms together and they start to reinforce each other's behaviour.

    So you have quantum scale substance where the emergent character is based on geometric properties, and classical scale substance where it is based on bulk properties. And this is even when still talking about the same apparent “stuff”. If you probe a film of water perhaps five or six molecules thick with a super-fine needle, you can start to feel the bumps of extra resistance as you push through each layer. But at a larger scale of interaction, water just has its generalised bulk identity – the one that conforms to our folk intuitions about liquidity.

    So the big finding is the way that contrasting forces of nature suddenly find themselves in vanilla harmony at a certain critical scale of being. It is kind of like the unification scale for fundamental physics, but this is the fundamental scale of nature for biology – and also mind, given that both life and mind are dependent on the emergence of semiotic machinery.

    The other key finding: The nanoscale convergence zone has only really been discovered over the past decade. And alongside that is the discovery that this is also the realm of molecular machines.

    In the past, cells were thought of as pretty much bags of chemicals doing chemical things. The genes tossed enzymes into the mix to speed reactions up or slow processes down. But that was mostly it so far as the regulation went. In fact, the nanoscale internals of a cell are incredibly organised by pumps, switches, tracks, transporters, and every kind of mechanical device.

    Great examples are the motor proteins – the kinesin, myosin and dynein families of molecules. These are proteins that literally have a pair of legs which they can use to walk along various kinds of structural filaments – microtubules and actin fibres – while dragging a bag of some cellular product somewhere else in a cell. So stuff doesn’t float to where it needs to go. There is a transport network of lines criss-crossing a cell with these little guys dragging loads.

    It is pretty fantastic and quite unexpected. You’ve got to see this youtube animation to see how crazy this is – https://www.youtube.com/watch?v=y-uuk4Pr2i8 . And these motor proteins are just one example of the range of molecular machines which organise the fundamental workings of a cell.

    A third key point: So at the nanoscale, there is this convergence of energy levels that makes it possible for regulation by information to be added at “no cost”. Basically, the chemistry of a cell is permanently at its equilibrium point between breaking up and making up. All the molecular structures – like the actin filaments, the vesicle membranes, the motor proteins – are as likely to be falling apart as they are to reform. So just the smallest nudge from some source of information, a memory as encoded in DNA in particular, is enough to promote either activity. The metaphorical waft of a butterfly wing can tip the balance in the desired direction.

    This is the remarkable reason why the human body operates on an energy input of about 100 watts – what it takes to run a light bulb. By being able to harness the nanoscale with a vanishingly light touch, it costs next to nothing to run our bodies and minds. The power density of our nano-machinery is such that a teaspoon full would produce 130 horsepower. In other words, the actual macro-scale machinery we make is quite grotesquely inefficient by comparison. All effort for small result, because cars and food mixers work far away from the zone of poised criticality – the realm of fundamental biological substance where the dynamics of material processes and the regulation of informational constraints can interact on a common scale of being.
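Putting rough numbers on the comparison above (the teaspoon volume of ~5 mL and whole-body volume of ~65 L are my assumed figures; the 100 watts and 130 horsepower are the ones quoted):

```python
# Rough power-density arithmetic for the figures quoted above.
HP_TO_W = 745.7          # one mechanical horsepower in watts
TEASPOON_M3 = 5e-6       # ~5 mL teaspoon (assumed)
BODY_W = 100.0           # quoted resting power of a human body
BODY_M3 = 0.065          # ~65 L body volume (assumed)

motor_density = 130 * HP_TO_W / TEASPOON_M3   # W/m^3 of nano-machinery
body_density = BODY_W / BODY_M3               # W/m^3 of the whole body

print(f"molecular machinery: {motor_density:.1e} W/m^3")  # ~1.9e10
print(f"whole body at rest:  {body_density:.1e} W/m^3")   # ~1.5e3
```

On these assumptions the working nano-machinery is some seven orders of magnitude more power-dense than the body that runs on it, which is the point about the vanishingly light touch.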

    The metaphysical implications: The problem with most metaphysical discussions of reality is that they rely on “commonsense” notions about the nature of substance. Reality is composed of “stuff with properties”. The form or organisation of that stuff is accidental. What matters is the enduring underlying material which has a character that can be logically predicated or enumerated. Sure there is a bit of emergence going on – the liquidity of H2O molecules in contrast to gaseousness or crystallinity of … well, water at other temperatures. But essentially, we are meant to look through organisational differences to see the true material stuff, the atomistic foundations.

    But here we have a phase of substance, a realm of material being, where all the actual many different kinds of energetic interaction are zeroed to have the same effective strength. A strong identity (as quantum or classical, geometric or bulk) has been lost. Stuff is equally balanced in all its directions. It is as much organised by its collective structure as its localised electromagnetic attractions. Effectively, it is at its biological or semiotic Planck scale. And I say semiotic because regulation by symbols also costs nothing much at this scale of material being. This is where such an effect – a downward control – can be first clearly exerted. A tiny bit of machinery can harness a vast amount of material action with incredible efficiency.

    It is another emergent phase of matter – one where the transition to classicality can be regulated and exploited by the classical physics of machines. The world the quantum creates turns out to contain autopoietic possibility. There is this new kind of stuff with semiosis embedded in its very fabric as an emergent potential.

    So contra conventional notions of stuff – which are based on matter gone cold, hard and dead – this shows us a view of substance where it is clear that the two sources of substantial actuality are the interaction between material action and formal organisation. You have a poised state where a substance is expressing both these directions in its character – both have the same scale. And this nanoscale stuff is also just as much symbol as matter. It is readily mechanisable at effectively zero cost. It is not a big deal for there to be semiotic organisation of “its world”.

    As I say, it is only over the last decade that biophysics has had the tools to probe this realm and so the metaphysical import of the discovery is frontier stuff.

    And indeed, there is a very similar research-led revolution of understanding going on in neuroscience where you can now probe the collective behaviour of cultures of neurons. The zone of interaction between material processes and informational regulation can be directly analysed, answering the crucial questions about how “minds interact with bodies”. And again, it is about the nanoscale of biological organisation and the unsuspected “processing power” that becomes available at the “edge of chaos” when biological stuff is poised at criticality.
  • Is 'information' physical?
    But what the machine actually does is physical, right? Just because a human designs a machine to serve a human purpose, doesn't mean the machine itself is doing something non-physical, does it?Srap Tasmaner

    This skips too fast over the fact that the machine is operating syntactically - according to rules the physics can't see.

    This is especially obvious with a computer - an information processing device after all. It is explicit in Turing universal computation that physics falls out of the picture. There is a designed in divorce of hardware and software. And even the hardware is divorced from the physics in being a material structure with no inherent dynamics. You have to plug the computer into the wall to make it go.

    So it is missing the point to argue that a computer is "just physics". The essence of computation is the syntax that regulates it. And it is the origins of that syntax which then becomes the larger issue.

    It is pretty easy to see why the physical structure has been given no choice but to act in a way that has been mechanically determined by some program being executed according to a constraining hierarchy of rules.

    So a machine is regular physics silenced and controlled. That creates a space of symbolic freedom. Rules can be freely invented to make the machine "do things".

    The loop is then semiotically complete when the whole of this relation is constrained by the general requirement that it pragmatically works. The machine does "useful things". Or indeed "semantically meaningful things".
  • The Universe as a Gas Can – Part I: Entropy
    Dark energy also wants to flatten itself out in order to become flat space but in the reverse. Gravity flattens out of its well, dark energy flattens its mole hill.MikeL

    But gravity wants to curve spacetime into a tight ball - a singularity. And dark energy is the opposite in wanting to curve spacetime hyperbolically - a constant bending away from itself. So we have a positive and negative curvature deal that balances out and leaves a flat Euclidean spacetime.

    It is more complicated than that. But it is crucial that gravitating mass produces positive curvature (like a sphere surface) while dark energy is about negative curvature (like the surface of a saddle).
  • Is 'information' physical?
    We've been through this before. How can you claim direct cause and effect if we still see red when the wavelength is not "red"?

    https://en.wikipedia.org/wiki/Color_constancy
  • Emergence is incoherent from physical to mental events
    Well, I may well accept it, but then, if this is THE way to look at nature, why is science itself not really concerned with it?schopenhauer1

    Err. It's new.

    Also it is holistic. Science on the whole only needs to be reductionist. Holism only becomes important when science approaches the bounds of existence - the very small, the very large, the very complex.

    Now, yes there are some fairly well-lettered scientists involved with this theory, but again, they seem to have more of an enclave.schopenhauer1

    Ah. You want to make out this is some small cranky cult of the embittered?

    You are funny.

    ...but it seems to be a niche and not THE theory that science is advancing towards.schopenhauer1

    Would you say the information theoretic view is the one overtaking modern science at the general metaphysical level?

    I wonder why that is? It surely can't be because reducing material events to the status of signals for "observing contexts" is a better metaphysics.

    I mean next folk will be saying the whole Cosmos is an emergent self-organising process in which constraints shape its degrees of freedom.

    My goodness, next we will have physicists like Lee Smolin saying this:

    So if we want to ask cosmological questions, if we want to really explain everything, we need to apply a different method. We need to have a different starting point. And the search for that different method has been the central point in my thinking since the early 90's.

    Now some of this is not new. The American philosopher, Charles Sanders Peirce, identified this issue that I've just mentioned in the late 19th century. However, his thinking has not influenced most physicists. Indeed, I was thinking about laws evolving before I read Charles Sanders Peirce. But something that he said encapsulates what I think is a very important conclusion that I came to through a painful route. And other people have more recently come to it, which is that the only way to explain how the laws of nature might have been selected is if there's a dynamical process by which laws can change and evolve in time.

    And so I've been searching to try to identify and make hypotheses about that process where the laws must have changed and evolved in time because the situation we're in is: Either we become kind of mystics, well, just those are the laws full stop, or we have to explain the laws. And if we want to explain the laws, there needs to be some history, some process of evolution, some dynamics by which laws change.

    https://www.edge.org/conversation/lee_smolin-think-about-nature

    So information theory is now dominating frontier physics thinking. And when physicists adopt evolutionary thinking as well, it starts to all sound like what some obscure bearded chappy was saying in the 1880s.

    Hmm. What could be brewing out there?

    The explanation is implications taken after the fact and interpreted in a light that matches the overriding theory.schopenhauer1

    Ah. You mean that just like evolutionary theory or hierarchy theory, we are speaking of a general metaphysical framework - a mathematical truism - more than some particular theory of something?

    As you say, natural selection was born as an explanation for speciation - the variety of life. (It certainly did not explain the origin of life, or in fact explain the generation of variety, just the removal of variety from a population). But despite this small beginning, natural selection is cited just as if it were ... a general statistical principle.

    Well semiosis arises as a science of meaning - a way to account for language use, both in the ordinary sense and propositionally. Or do you doubt that it even applies there?

    All theories can thus never NOT fit into triadic theories because it is always there after the fact.schopenhauer1

    If you want to propose monism or dualism instead, you could give it a go.

    Isn't this where we started? You couldn't even decide whether to be a panexperiential monist or a correlational dualist. Now you are bitching about a triadic metaphysics.

    But keep on going. Prove to yourself that the reason you are confused is that better structured ways of thought are not available.
  • Emergence is incoherent from physical to mental events
    Okay, but this theory still seems non-standard. Ischopenhauer1

    So your complaint is it is too philosophical? Well, OK...

    Would you accept that hierarchy theory is regarded as a universal natural organisational structure just as natural selection is held to be a universal natural organising process?

    I mean even science itself is organised hierarchically.

    And aren't Pattee and Salthe among those who have literally written the book on hierarchical organisation? Yet now they are really keen on calling it semiosis when talking about evolving systems.

    Kind of makes you think.