• Metaphysician Undercover
    13.3k
    For me there are different senses of 'know'. First there is the knowing of participation, familiarity. I believe animals do this; it seems obvious. With symbolic language come recursive and discursive forms of knowing which may be more or less 'digital'. But remember, within linguistically mediated forms of knowing there are metaphorical, which is to say analogical, modes as well as more precisely propositional (digital) modes. And the differences between these modes of knowing do not themselves constitute a sharp dichotomy (although it may be conceived as such) but a series of imprecise locales along a continuum. — John

    I don't dispute that there are different senses of "know". But I think that they all involve some form of identity. Familiarity involves recognition which is a form of identification. I do not think that it is correct to extend "knowing", right down to primitive life forms, and then restrict "identity" to a function of human language.

    As to "unreasonable," I think we need a notion of pure reason to ground any notion of pure unreasonableness (I think you'll agree). — Hoo

    Yes, I agree with this, but the point I was making to apokrisis is that when two terms are seen to be opposed in conception, this does not indicate that the two things referred to are mutually dependent on each other, nor does it mean that one is not prior to the other. So in the case of reasonable and unreasonable, we first make a conception of what qualifies as reasonable, then based on this description of reasonable, we can determine unreasonable. However, that the concept of reasonable is prior to the concept of unreasonable does not mean that reasonableness exists prior to unreasonableness in nature. So I do not think that you can proceed to your conclusion that "reason itself is on fire", because what you are referring to is the conception of reasonable, and unreasonable, not "reason itself".

    Most of "common sense" or our prejudices have to remain intact while we judge and edit a particular prejudice. Pleasure and pain are the hammers that re-shape this edifice. But the pain can be cognitive dissonance, and the pleasure can be a sense of status. It's not at all just bodily. — Hoo
    I see the point with the ship analogy, but here we are concerned with fundamental ontological principles. Can we assume that massive conceptual structures rest on fundamental principles? If so, then when we are examining these fundamental principles, should we judge them according to common sense and good intuition, or should we judge them according to other fundamental principles, so as to maintain consistency with those other principles and not rock the boat? I think the former: if the fundamental principles are not consistent with common sense and good intuition, then there is a problem with those principles, and that must be exposed, despite the fact that other principles might be destabilized in the process.

    The idea that there is something beyond prejudice can itself be described (though not finally, since description is apparently never final) as one more prejudice. This threatens the distinction itself of course which we need in order to get to this threatening... — Hoo
    I don't know if this can be called "prejudice". Prejudice implies a preconception. What I refer to is the potential for a method to go beyond conception, to observe, and describe, in an unbiased and objective way. If the idea that this is possible is considered a preconception, then I guess there is prejudice here as well. I don't see that it is possible to get beyond all prejudice; even common sense and intuition are inherently prejudiced, as there are prejudices inherent within our language.


    I'm not sure what you think I'm arguing here. It has been my point that we impose our frameworks of intelligibility on the world.

    But then a dialectic or dichotomous logic ensures that this process is rigorous. In being able to name the complementary limits on possibility, we have our best shot at talking about the actuality of the world, as it must lie within those (now measurable) bounds.
    apokrisis

    What I am arguing is that this dichotomous logic is the framework of intelligibility which is being imposed on the world. And this is the mistake. It is a mistake to think that the world must fit within our systems of measurement, the "bounds" which we imposed. We must adapt our systems of measurement, shape them to the world. But even this requires a preliminary understanding, which cannot be given by measurement because the system for measurement will be created based on this understanding.

    In order to properly understand the world we must start with a coherent system of description. Despite the fact that dichotomous logic can, and does, place restrictions on how one can describe, it does not attempt to restrict the thing being described to fit the system of description. We shape the system of description to fit the thing being described. We accept the fact that a description may not be precise, that your description may contradict my description of the very same thing, etc. This is the mode of description: we do not attempt to force the world into our devices of measurement; we keep describing, and re-describing, working at the description and altering our descriptive terms, until we are satisfied with it. We then devise a means for measuring that described thing, based on the description.

    So if you want to talk about "time", then it is only going to be an intelligible notion that we can project onto reality in a measurable fashion to the degree we have formed a crisply dichotomous model of it.

    ...

    So we have a variety of ways of thinking about time - all of them models that try to impose some kind of fundamental dichotomy that would make time an intelligible, and thus measurable, concept of the thing-in-itself.
    apokrisis

    This is where I believe the mistake lies. You equate intelligible with measurable. But measurable is restricted by our capacity to measure. A thing is only measurable in so far as we have developed a way to measure it. However, a thing is intelligible to the extent that we have the capacity to describe it, and description does not require measurement. John, above, would argue that the capacity to recognize familiarity makes the thing intelligible. So how we proceed toward understanding the world, is to first develop ways to describe its qualities, then to develop ways of quantifying those qualities (measuring). Therefore, when talking about a thing like time, it is only practical to discuss our ability to measure it, to the extent of our ability to describe it.
  • Hoo
    415

    This is a great conversation. So, first, thanks for that! Again I'll reply freely to your entire post, since, well, I love this stuff.
    However, that the concept of reasonable is prior to the concept of unreasonable, does not mean that reasonableness exists prior to unreasonableness in nature. So I do not think that you can proceed to your conclusion that "reason itself is on fire", because what you are referring to is the conception of reasonable, and unreasonable, not "reason itself". — Metaphysician Undercover
    This is deep water, because I'm not sure how much of a gap there is between reason and the conception of reason. It's connected to the issue of the world-for-us versus the world-in-itself. But the world-in-itself or the world-not-for-us looks necessarily like an empty negation. It marks the expectation that we will update the world-for-us (which includes the model of the filtering mind enclosed in non-mind that it must manage indirectly, conceptually, fictionally). Is there a place for reason in this "real" non-mind enclosure? Or is reason a foggy notion distributed through our practices, verbal and physical? As philosophy wrestles with the definition of reason, or reason-for-reason, it seems to be the very fire I was getting at. The problem with reason-in-itself is that we can't say anything about it. It seems to cash out to the expectation that we will keep reconceptualizing reconceptualization itself, you might say.
    Can we assume that massive conceptual structures rest on fundamental principles? If so, then when we are examining these fundamental principles, should we judge them according to common sense, and good intuition, or should we judge them according to other fundamental principles, so as to maintain consistency with these other principles, and not to rock the boat? I think the former, if the fundamental principles are not consistent with common sense, and good intuition, then there is a problem with those principles, and that must be exposed, despite the fact that other principles might be destabilized in the process. — Metaphysician Undercover
    I think we generally agree. I'd say that we only embrace the destabilization of an investment/prejudice in order to prevent the destabilization of a greater investment/prejudice. We amputate the hand to save the arm, or we trade the old arm for a new arm. It's a model of the modelling mind as a system that seeks to minimize dissonance/tension/confusion and/or maximize preparedness, security, and the sense of well-being. I think it's useful to think of the mind as a "readiness" machine. We have to act quickly sometimes, so the imagination cooks up detailed responses. If we are terrified by the thought of life in a world devoid of principle X, we will probably throw principle Y under the bus to save it.

    I don't know if this can be called "prejudice". Prejudice implies a preconception. What I refer to is the potential for a method to go beyond conception, to observe, and describe, in an unbiased and objective way. If the idea that this is possible is considered as a preconception, then I guess there is prejudice here as well. I don't see that it is possible to get beyond all prejudice, even common-sense, and intuition are inherently prejudiced, as there are prejudices inherent within our language. — Metaphysician Undercover

    So we basically agree. Of course it's not one of my prejudices that all prejudices are equal. The paradigm notion of objectivity is the physical world, perhaps, but maybe that physical world is just a part of the fuzzy intersection of the fuzzy cores of the belief systems involved. You mention common sense and language. That's objective, too, and it seems to ground the objectivity of science. We need the "manifest image" as a background, and ordinary language, in order to practice science. Maybe "true-for-all" is a grandiose extension of "true-for-folks-like-us," encouraged by the apparent universality of mathematical natural science. We share candidates (prejudices) for inter-subjective adoption by a community. As humanists, almost unconsciously so, we tend to think about universal inter-subjective adoption: truth-for-all, not just for Americans or "the superior man." And yet the same prejudice or implicit-rule-for-action might not fit equally well into two differing sets of already-held ideas or rules-for-action. It's just our almost invisible prejudice that what is true for me at least should be true for all.

    Yet when it comes to "fundamental ontologies" (including the presence or absence of the god-thing or the right-and-wrong thing or the truth-for-all thing), I don't think there's any reason to expect convergence. Fuzzy convergence in behavior, perhaps, since that is policed, but acceptable behavior under-determines the belief system, hence the "harmlessness" of freedom of thought. Different personal histories give us a different exposure to adoptable prejudices and really a different world to test them against (since we all live in the same world only by abstraction; our lives are all different streams of experience, etc.) Since science stays so close to the sensual and the mathematical, great consensus can be achieved. But our total, less-specialized belief systems don't seem similarly constrained. Lots of different belief systems "work" and are relatively stable. They're on fire, but it's a low heat, perhaps at the surface. For instance, I don't expect any worldview/ethical revolutions at this point, but I'll always be policing the "corona" of the fuzzy core of my investments.

    And yet this vision/fiction itself is just a "candidate belief" for the relatively small inter-subjective community that is comfortable with such abstraction. I like this prejudice, so I bring it benevolently (perhaps hoping for a reward somehow) to the tiny tribe like a better mousetrap.
  • apokrisis
    7.3k
    It is a mistake to think that the world must fit within our systems of measurement, the "bounds" which we imposed. We must adapt our systems of measurement, shape them to the world. But even this requires a preliminary understanding, which cannot be given by measurement because the system for measurement will be created based on this understanding. — Metaphysician Undercover

    Well if all this is a mistake, what is your alternative? Can you even define your epistemic method here?

    The Peircean model of scientific reason says yes, we have to begin with just a guess, a stab at an answer. And a good stab at an answer is one that tries rationally to imagine the limits that must bound that answer. That is what gives us a reference frame from which to start making actual empirical measurements. And from measurement, we can construct some history of conformity between the thing-in-itself and the way we think about the thing-in-itself. So retrospectively, any founding assumptions, any rational stabs in the dark which got things started, are either going to be justified or rejected by their own empirical consequences. If they were in fact bad guesses, experience will tell us so. And contrariwise.

    Importantly (and something that goes to the analog~digital distinction as SX, channelling Wilden, has defined it), this also means that the model doesn't have to end up looking anything like what it is supposed to "represent".

    I think what troubles you is this apparent loss of veridicality. You want the kind of knowledge of the world that is literally analogic - an intuitive picture in the head. If someone is talking about atoms, you want to see a reality composed of billiard balls rattling around a table. If someone talks about development, you want to see a point moving along a drawn timeline, the future steadily moving backwards to become the past as intervals of the present get consumed.

    But higher order understanding of the world is different in being digital. It throws away the material variation to leave only the digital distinctions - the description of the boundaries or constraints, the description of the rate independent information in the shape of eternal or timeless laws and constants.

    So semiotic modelling is this curious thing of not being a re-presentation of what actually exists in all its messy glory. Instead, it is a boiling down of reality into the sparseness of abstraction entrained to particularity - the semiotic mechanism of theory and measurement.

    Sure, it is still nice to picture billiard balls, waves, timelines, and all kinds of other analogic representations of the thing-in-itself. But the digital thing is all about giving that kind of re-presentation up. In the extreme it becomes the kind of instrumentalism that SX would find disembodied and "un-aesthetic". One may find oneself left simply with a syntax to be followed - a mathematical habit - which works (it makes predictions about future measurables) and yet, for the life of us, we can't picture the "how". That's pretty much where the Copenhagen Interpretation ended up with quantum mechanics.

    So the Peircean/digital/semiotic approach to modelling the thing-in-itself is both cleanly justified in terms of epistemology, and also never going to deliver quite what you probably think it should. This is why, whenever I talk about vagueness, you always just keep saying: tell me about it again, in a way that is not purely rational but instead gives me a picture I can believe inside my head.

    But sorry, that is what it means for modelling to be embodied, or meaning to be use. We have to head in the direction of extreme mathematical-strength abstraction so as to be able in turn to make the most precise and telling acts of measurement - to also digitise experience itself as acts of counting, a harvesting of a field of symbols.

    You equate intelligible with measurable. But measurable is restricted by our capacity to measure. A thing is only measurable in so far as we have developed a way to measure it. However, a thing is intelligible to the extent that we have the capacity to describe it, and description does not require measurement. — Metaphysician Undercover

    So just as I say, you yearn for analog iconicity - a concrete picture in your head that you can stand back and describe ... as if such a representation were the thing-in-itself floating veridically before your eyes.

    Pragmatism says that is a Kantian pipedream. A picture in your head is just going to be a picture. What actually matters - the only thing that in the end you can cling onto - is the functional relationship you can build between your model of existence, and the control that appears to give you over that existence. And the digital is stronger than the analog in that regard because it decisively erases unnecessary details. It can negate the real in a way that makes for the most useful map of the real.

    And we all know how a map bears bugger all material resemblance to the physical reality of the territory-in-itself. But who complains about a map of a country having "unreal" properties like being small and flat enough to fold up in your back pocket?
  • Hoo
    415

    What actually matters - the only thing that in the end you can cling onto - is the functional relationship you can build between your model of existence, and the control that appears to give you over that existence. — apokrisis
    This is where we really overlap. Rescher likes "methodological pragmatism." The epistemological system is machine-like, a normalized discourse. The system as a whole, and not its individual, inter-dependent parts, is put to the test as we act on its output: "truths" or (implicitly) rules for action. For instance, this was probably the "living" justification of infinitesimals. They were part of a model of existence that allowed us to control that existence.
  • Janus
    16.5k
    I don't dispute that there are different senses of "know". But I think that they all involve some form of identity. Familiarity involves recognition which is a form of identification. I do not think that it is correct to extend "knowing", right down to primitive life forms, and then restrict "identity" to a function of human language. — Metaphysician Undercover

    As I said, I think there is a useful logical distinction between identity and identification. To identify something is to identify it as something, which, as you say, involves prior recognition. Animals (at least some) obviously do recognition, but I would not agree it makes sense to say that they identify things, much less to say that they see things as identities.

    But, all of this is just terminology and I acknowledge there may be different ways of interpreting the ambit of terms. For me, though, it is recognition first, then the recursive act of identification of things (most broadly as entities and then kinds of entities), and then the still more reflexive act of understanding things as identities (unique entities).
  • Streetlight
    9.1k
    It is important that the analog or iconic representation already exists on the other side of the epistemic cut - on the side of the symbolic or "rate independent information". It is a distinction made at the level of the mapping, even if it means to be talking about a distinction in the (computational!!) world. — apokrisis

    Sure, every setting of a boundary is always (at least) double: the explicit one between the two (digitized) elements in question (A, not-A), and the implicit one between the 'boundary-setter' and the very system under consideration, taken as a whole. But this is just the methodological constraint set on any attempt at analysis; Wilden himself is perfectly aware of this:

    "Even if we think we have successfully divided the whole of reality and unreality into only two sets by drawing a line between A and non-A (and by including within non-A, non-B, etc.), the act of drawing that line defines at least one system or set as belonging to neither A nor non-A: the line itself. And since that line is the locus of our intervention into a universe, it necessarily defines the goalseeking system that drew the line as itself distinct from both A and non-A: it is their 'frame'".

    The 'goalseeking system' in question being nothing other than living things, of course. But just as the 'location' of the first boundary setting operation is consecutively undecidable - it belongs neither to A nor not-A (W: "it corresponds to nothing in the real world whatsoever") so too is this second-order boundary setting: the line is methodological, and cannot be imputed to the 'world': doing so is nothing but metaphysical dogmatism, in the Kantian sense of the term.

    In other words, you can't have your cake and eat it too: if you insist that the analog/digital distinction is made at the level of digital mapping to begin with, the projection of a more primordial ground of vagueness is simply that: a mythological projection that doesn't abide by the very epistemological constraints you ought to be beholden to. This is why, rather than take the path you do, Wilden correctly recognizes that this higher order 'cut' is just that - a higher order cut:

    "[The second-order distinction] is of a different logical type from the line between A and B. The metalinguistic function of 'not' is in fact what generates the higher-order paradox, for 'not' is the boundary of the empty set, which like 'the class of classes not members of themselves' is both a member of itself and not a member of itself. And [second-order distinction] turns out to be another, higher order, substitute for 'not' : it defines an Imaginary line which belongs to the process of making distinctions, rather than to the distinctions themselves."
  • Mongrel
    3k
    I could take a stab at the question about how the issue bears on formal logic. Logicians get by with very low ambitions. Ontology tends to be in the background driving questions about reference and the nature of knowledge, but logic, being a priori, doesn't make ontological claims. It can't report on what's natural and what isn't.

    And... the point was being made that there are contradictions in our thinking, such as boundaries that are neither something nor nothing. But then set theory was brought up in regard to digital being a subset of analog (which makes no sense to me... but anyway). Set theory is founded on an odd contradictory notion called the transfinite... so nobody has a monopoly on contradiction.
  • Terrapin Station
    13.8k
    Did anyone address whether we're "analog" or "digital" yet? If so, I didn't see it.
  • Streetlight
    9.1k
    The question doesn't make sense. Analog and digital characterize systems or processes in nature, not things or entities.
  • Terrapin Station
    13.8k
    So humans aren't systems or processes in nature in your view?
  • apokrisis
    7.3k
    In other words, you can't have your cake and eat it too: if you insist that the analog/digital distinction is made at the level of digital mapping to begin with, the projection of a more primordial ground of vagueness is simply that: a mythological projection that doesn't abide by the very epistemological constraints you ought to be beholden to. — StreetlightX

    Weird. The definition of vagueness is that it is the "not yet digitised". Vagueness is that state of affairs to which the principle of non-contradiction fails to apply. And thus it stands orthogonal to crispness, the state where A/not-A are busy doing their logically definite thing.

    So in a set theoretic sense, the vague~crisp is the superset here. As I said earlier, it is Peircean thirdness in incorporating the whole of the sign relation - the three levels of logic that would be vagueness, particularity and generality. A/not-A is just the digital crispness which is secondness, or the logic of the particular.
  • Streetlight
    9.1k
    Depends on how you mean. Your original question quoted a statement about symbolic representational systems then asked if 'we' are digital or analog. In that context your question makes no sense.
  • Streetlight
    9.1k
    Weird. The definition of vagueness is that it is the "not yet digitised". Vagueness is that state of affairs to which the principle of non-contradiction fails to apply. And thus it stands orthogonal to crispness, the state where A/not-A are busy doing their logically definite thing. — apokrisis

    So which is it - do vague and crisp map on to analog and digital or do they not? If they do, in what sense can you claim that the analog/digital distinction is derivative from vagueness (circularity)? If they don't, you're back to mythology.
  • Terrapin Station
    13.8k
    So we're not partially symbolic representation systems in your view? How would we not be a symbolic representation system when we do formal logic?
  • Streetlight
    9.1k
    Logic is a symbolic representational system. I have no idea what it means to say that 'we' are a symbolic representation system.
  • Terrapin Station
    13.8k
    Without getting into the ontology of logic for a moment (maybe that will eventually be necessary, but I'll avoid it for now), we can at least do formal logic, can't we?
  • Terrapin Station
    13.8k
    So would you say that we're "functioning digitally" when we do formal logic?
  • Terrapin Station
    13.8k
    Well then how do we manage to do formal logic if it's necessarily digital and we are not functioning digitally when we do it?
  • Streetlight
    9.1k
    I have no idea what you mean by 'we function digitally'. Sorry Terra, I don't think you have a grasp of the vocabulary here, which is why I'm being curt.
  • Terrapin Station
    13.8k
    Oh, well then probably don't answer "no" to the question "are we functioning digitally" when I ask you, as if you know what I'm talking about and you believe the answer is "no." Instead answer "I don't know what functioning digitally would even refer to" or something like that when I ask.

    Okay, so "functioning" just refers to operating or behaving in a particular way or engaging in a particular process. Is that much clear to you? And is it clear then when we say that humans can function in one way or another?
  • Streetlight
    9.1k
    Honestly, I really don't want to play twenty questions. Thanks for engaging though.
  • Terrapin Station
    13.8k
    I think you're bowing out rather because you don't want to deal with the issue I'm bringing up, but okay. Not following through all the way is honestly what I expected. I was pleasantly surprised that the conversation went as far as it did. That's the way I prefer to do philosophy discussions (which is why I'd prefer a chat room to a forum where people routinely write hundreds of words at a time).
  • Mongrel
    3k
    Analog and digital are properties of data. The terms are also used to describe electronic design formats. Analog data is continuous like a sine wave. Digital data is typically a square wave (although multi-level digital formats were discussed at one point).

    Analog design obviously preceded digital. It was characterized by the sorts of things we see in a radio: various kinds of filters, inductors, transformers and so forth. Digital electronics started replacing analog electronics back in the 1960s. The first digital telecommunications transmission system went into operation in 1960 and since then, the majority of electronic equipment has become digital or computer driven.
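The continuous-versus-square-wave contrast above can be sketched in a few lines of code. This is a minimal illustration, not anything from the thread: the function names (`sample_sine`, `quantize_1bit`) are invented for the example, which samples a continuous sine wave at discrete instants and then quantizes each sample to a single bit, yielding the square wave described above.

```python
import math

def sample_sine(freq_hz, rate_hz, n):
    """Sample a continuous (analog) sine wave at n discrete instants.
    Mid-sample offsets (i + 0.5) avoid landing exactly on zero crossings."""
    return [math.sin(2 * math.pi * freq_hz * (i + 0.5) / rate_hz) for i in range(n)]

def quantize_1bit(samples):
    """1-bit quantization: each sample collapses to +1 or -1 (a square wave)."""
    return [1 if s >= 0 else -1 for s in samples]

analog = sample_sine(freq_hz=1, rate_hz=8, n=8)  # one cycle, eight samples
digital = quantize_1bit(analog)
print(digital)  # -> [1, 1, 1, 1, -1, -1, -1, -1]
```

The continuous variation of the sine is thrown away; only the discrete distinction (above or below zero) survives, which is one way to picture digitization discarding material variation.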

    In this thread, the terms are being used metaphorically. It's not clear if everybody realizes that, although it's been pointed out several times in this thread that the metaphor is being stretched pretty far... maybe too far.

    It is interesting to ponder that metaphor. It obviously runs straight into philosophy of math because we're talking about continuity vs discontinuity. Looking at it that way, the notion that the digital is parasitic on the analog is just wrong. If we persist in maintaining that the digital is "loose" on the analog, we're stipulating some specialized meaning for the terms. It wouldn't be appropriate to complain that people don't understand the jargon. You're going to have to explain it since you've made up something unusual.
  • Michael
    15.8k
    You're going to have to explain it since you've made up something unusual. — Mongrel

    He did that in the opening post:

    Analog systems are defined by continuous variables, like the distance between points or changes in velocity; rulers, thermometers, or accelerator pedals are all examples of analog systems. Digital systems, by contrast, are defined by discontinuous or discrete variables: as with the ten 'digits' of the fingers, digital systems, unlike analog systems, involve discontinuous 'jumps' between measurement values. — StreetlightX

    So the analog/digital distinction is a continuous/discrete distinction.
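That continuous/discrete reading can be illustrated with a toy quantizer. This is a hypothetical sketch (the `digitize` function is invented for the example): a continuous thermometer-style value is snapped to a grid of discrete steps, and shrinking the step size makes the digital reading track the continuum more closely, the granularity point raised elsewhere in the thread.

```python
def digitize(value, step):
    """Snap a continuous (analog) value to a discrete grid of the given step size."""
    return round(value / step) * step

reading = 21.73  # a hypothetical continuous thermometer value
print(digitize(reading, 1.0))   # -> 22.0  (coarse discrete jumps)
print(digitize(reading, 0.25))  # -> 21.75 (finer granularity, closer to the continuum)
```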
  • Mongrel
    3k
    I read the OP, thanks Michael. My point is that claims have been made regarding the relationship between analog and digital that are not true of the continuous/discontinuous distinction... so no, that's not how the words are being defined here.
  • Metaphysician Undercover
    13.3k
    Well if all this is a mistake, what is your alternative? Can you even define your epistemic method here? — apokrisis

    I explained the alternative. It involves, first, the recognition that our measurement techniques are inadequate for measuring some aspects of the world, in particular the aspects associated with the assumed continuum. So we need to go back to a method of focusing on description rather than measuring. This is where the scientific method began, and made its greatest advances, developing out of practices such as alchemy. It involves endless observations, defining words and developing new words to avoid inconsistencies and contradictions between the observations of different individuals. The individuals concern themselves with producing a coherent and consistent description of the phenomenon, based on many varying descriptions.

    In the act of describing, the digital method (rules of logic) is applied to the tool of description, language. In the act of measuring, we tend to believe that the digital method is applied directly to the thing being measured, but this is an illusion. In reality, the limitations of the digital method have been incorporated into the language of measurement. The result is that any observations that are measurements are necessarily theory-laden, due to the restrictions which are inherent within the measurement system. That is the position to which science has progressed today. Scientists rarely give themselves the freedom of separating the logic of digital restrictions from the language of description, to produce freely described observations. They cannot produce varying descriptions of the same phenomenon while using the same measurement system. Instead, they are constrained by a language of mathematics which has restrictions inherent within it, to produce observations which are bound by those restrictions. In other words, the perspective from which one observes is completely restricted by the measurement system, such that the possibility of varying descriptions of the same phenomenon has been excluded.

    I think what troubles you is this apparent loss of veridicality. You want the kind of knowledge of the world that is literally analogic - an intuitive picture in the head. — apokrisis
    Well of course that's what I want. If you assume that there is an analog continuum in the world, yet you describe, or model it as being digital, would you be satisfied with that? Either your assumption or your description is wrong. Can you live happily, knowing that you are involved in such self-deception?
  • Streetlight
    9.1k
    In this thread, the terms are being used metaphorically. It's not clear if everybody realizes that, although it's been pointed out several times in this thread that the metaphor is being stretched pretty far... maybe too far.

    It is interesting to ponder that metaphor. It obviously runs straight into philosophy of math because we're talking about continuity vs discontinuity. Looking at it that way, the notion that the digital is parasitic on the analog is just wrong. If we persist in maintaining that the digital is "loose" on the analog, we're stipulating some specialized meaning for the terms. It wouldn't be appropriate to complain that people don't understand the jargon. You're going to have to explain it since you've made up something unusual.
    Mongrel

    I've stipulated what I've meant by the terms multiple times, precisely defining them in terms of negation and reflexivity, meanings which are certainly not idiosyncratic to me, but freely employed in philosophical discourse. Moreover, defining the difference in this way is far more precise than the appeal to the discrete and the continuous, which are more like heuristics, to the extent that the one can simply scale into the other at a level of granularity fine enough. But this is exactly what I'm trying to avoid. In any case, the terms are certainly not meant as metaphors. Of course, if you think Terrapin's question makes any sense whatsoever even in the data sense, you're welcome to engage him (and even then, the original sense of the terms has less to do with data than with information).
  • Metaphysician Undercover
    13.3k
    Digital data is typically a square wave (although multi-level digital formats were discussed at one point). — Mongrel

    Oh good, here's someone with some technical knowledge. Can you explain what a "square" wave is, or is that just a metaphor in itself?