• denis yamunaque
    4
    Well, it's an awkward question, but what, in fact, is probability? I mean, we assume that if an event has a 99.9% probability of happening, then if we simulate the conditions 1000 times, the event would occur close to 999 times. But that's not a fact, since nothing really prohibits the complement of the event, with a probability of 0.1%, from occurring continuously through time, while the first event, with almost 100% probability, never happens.
    One could argue that this is not likely, or that if it happens, repeating the experiment would probably not produce it again. But those two arguments just use the definition of probability again without explaining it.
    I could add that saying it "means that something is more or less likely" just changes the word probable for likely.
    I know that in real life events like the one first described are not usual, but mathematically it's not impossible, and it's just a scenario for the main question: what, conceptually, is probability? What is it for something to be likely to happen?
    I don't know if I expressed my question properly, but we can discuss it to make it clearer if needed.
  • javi2541997
    5.9k


    I think this question goes beyond mathematical criteria. Perhaps we should think in terms of luck.
    Probabilities and chances depend on how far we want to assume that the goals/recognition we get depend somehow on “luck”.
    How much luck do I have of passing the next exam? It depends on how our probability increases as we study more or less.
    Conceptually, when we check the definition of probability in the Oxford dictionary, it redirects you to a synonym, “likelihood”, defined as “the chance of something happening; how likely something is to happen”.

    It is interesting how it literally says “chance”. This is why I guess the significance of probability depends a lot on “opportunities to do something”.
  • T Clark
    14k
    I could add that saying "Means that something is more or less likely" just change the word probable for likely.denis yamunaque

    All definitions are just changing one word that you don't know the meaning of for another that you do.

    A probability that an event P will have a result R is X if, when P takes place N times, on average, R will result X * N times.
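    T Clark's frequency reading can be sanity-checked with a small simulation (a minimal sketch; `simulate` is an illustrative helper name, and the agreement with X * N is only approximate, which is rather the point of the thread):

```python
import random

def simulate(p, n, seed=0):
    """Count how often an event with probability p occurs in n independent trials."""
    rng = random.Random(seed)
    return sum(1 for _ in range(n) if rng.random() < p)

# With X = 0.3 and N = 100_000, the count should land near X * N = 30_000,
# though nothing guarantees it on any particular run.
count = simulate(0.3, 100_000)
print(count)
```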
  • SophistiCat
    2.2k
    Well, it's an awkward question, but, what in fact is probability?denis yamunaque

    It's a very legitimate question that has received plenty of attention from mathematicians (who invented the concept) and philosophers. Have a read: Interpretations of Probability
  • Aryamoy Mitra
    156
    What, conceptually, is probability? What is something being likely to happen?denis yamunaque

    Statistical probability is (for all intents and purposes) a formalism that underlies the intuitive notion of a 'likelihood' (as you've discussed).

    I mean, we assume that if an event has probability of 99,9% of happening, it means that if we simulate the conditions, each 1000 times the event would occur next to 999 times. But that's not a fact, since nothing really prohibits the complement of the event, with probability of 0,1% of keep continuously occurring through time, while the first event, with almost 100% of probability never happens.denis yamunaque

    You've identified precisely why probabilities aren't exhaustive in and of themselves - they don't preclude the complement of an event (or the event itself), irrespective of how unlikely (or likely) they exert themselves to be. Unless they're of unequivocal certainty (0 or 1), they shouldn't be conflated with assertions of certainty - or the lack thereof.

    Probabilistic interpretations, nevertheless, often 'suggest' that an assertion - of likelihood - is of practical utility in a decision-making paradigm (or likewise, not).

    In this particular analogy, one can formalize a probabilistic construct with a binomial distribution for the complement of a 'significantly probable' event - denoted by X, in the characteristic form X ~ B(n, p).

    You might be acquainted, in some measure, with the following sequence of reasoning:

    For any elective integer x, wherein 0 ≤ x ≤ 1000, there exists a discernible probability that conducting 1000 iterations of an arrangement will result in x complements to an event.

    Specifically, if P(X = x) illustrates the likelihood of an outcome (under the purview of a sample space) with an affiliated probability p, emerging on x occasions after exactly n iterations - then a canonical inference entails:

    P(X = x) = C(n, x) · p^x · (1 − p)^(n − x)

    Repurposing this expression, by virtue of the aforementioned system's n = 1000 and p = 0.001 constituents, suggests:

    P(X = x) = C(1000, x) · 0.001^x · 0.999^(1000 − x)
    0.001 is an imperceptible ratio, but is nonetheless fathomable. For instance, when x = 1000 (an indescribably anomalous iteration set), P(X = 1000) = 0.001^1000 ≈ 10^−3000.

    This might be misleading, unfortunately - since singular or incremental probabilities are nearly always inconsequential.

    One may consolidate (or assimilate) the singular probabilities of variable iterations, across the discrete interval [a, b], in order to accumulate superior likelihoods. Naturally, this engenders a binomial cumulative distribution - for a complement to manifest anywhere between a and b times, when a probabilistic event is undertaken with n distinctive iterations.

    What if one were to redefine n (for argumentation) to a far greater magnitude - whilst p and the interval [a, b] were unchanged? We'd witness an inconceivably larger number of operations (or iterations), therefore bolstering the likelihood of spectating a particular outcome across a specific interval of frequencies.
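    The reasoning above can be checked numerically (a sketch; `binom_pmf` is an illustrative helper, computed with exact rationals so the astronomically small terms don't underflow to zero in floating point):

```python
from fractions import Fraction
from math import comb

def binom_pmf(x, n, p):
    """P(X = x): probability of exactly x complements in n independent trials."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

n, p = 1000, Fraction(1, 1000)  # rationals avoid floating-point underflow

# Probability the 0.1% complement appears at least once in 1000 iterations:
at_least_once = 1 - binom_pmf(0, n, p)
print(float(at_least_once))  # ≈ 0.632 - far from negligible

# Probability it appears on every single iteration: (1/1000)**1000,
# around 10**-3000 - unimaginably small, yet strictly positive.
print(binom_pmf(n, n, p) > 0)  # True
```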

    In answering your question, probability - schematically - is merely a suggestive trajectory to how 'likely' an event should be construed as being. It's neither infallible, nor overly authoritative; it exists solely to guide.
  • denis yamunaque
    4
    Replacing the words of a phrase with synonyms may help people understand some concepts by bringing the speech closer to something more familiar to the audience, but that's not a definition at all; it's not adding information.

    "A probability that an event P will have a result R is X if, when P takes place N times, on average, R will result X * N times" -> I don't know yet how to quote as fancily as you do, and would appreciate help learning!

    But, life goes on: I don't know if you are saying that this is a definition, but assuming you are, it is just describing how to arrive at the magnitude of the probability of an event. It is like saying that price is the law of supply and demand.
  • denis yamunaque
    4
    Thank you, I really appreciated your explanation. The last part, 'suggestive trajectory to how likely...', is tricky for me, because in my language the translation of likely and probable is the same word, so it looks like we haven't moved past the initial point. But adding 'suggestive trajectory...' was enough to make me think a little more.
  • Aryamoy Mitra
    156
    Thank you, I really appreciated your explanation. This last part 'suggestive trajectory to how 'likely...', it's tricky to me, 'cause in my language the translation of likely and probable is the same, so looks like not going further from the initial point. But adding this information 'suggestive trajectory...' was enough to make think a little more.denis yamunaque

    Quite truthfully, after engaging in this discussion, I'm not sure I understand probabilities as well as I thought I did.

    You're right - I haven't been able to further your initial argument. Personally, I don't know if one can.

    Mathematically, one can always devise one sophistication upon another.

    Philosophically, however - determining what exactly likelihood entails, is a genuine conundrum.
  • simeonz
    310

    I have a rather layman's understanding of 20th-century physics, but this is evidently an important question for QM. Without a philosophically meaningful description of probability, using propensities as subject matter is vague.

    I have some conclusions of my own that I will propose.

    Basically, the mathematical definition, specifically the one about expectation, aims to compute utilities over collections of instances. It deals with other quantities as well, but those are usually advisory, and the central notion is the mean. There is no need for non-determinism or incomplete knowledge to use probability. You could compute means over populations in some country or over the last month's orders of a company, etc. As long as you know the frequencies in advance, by direct measurement, the results are infallible.

    When you don't have the ability to measure probability directly, because you either lack knowledge or there is genuine non-determinism, there are multiple avenues to take.

    One involves past experience and statistics. The root of confidence in any kind of inductive practice is our history, both personal and collective, both recorded and genetic. Every time we achieve success, we experience confirmation bias and further ourselves, socially and in genetic viability. This develops and reinforces our habits, culture, and heritage. Like anything in nature, statistical induction will either work or it will not, but from what we have seen, in most cases, had we decided not to trust the probabilistic consistency of nature, or the uniformity of nature, we would in retrospect have lost out. So our actions are justified only in retrospect. Statistical thinking is in our natural programming, social convention, and habit.

    There are other ways to extract distributions. You can start from trivial (certain) probability events and derive the rest of the distributions for related properties by first principles. For a die roll, considering that the die is symmetric, and since rolling some side is a certain event, each side would be presumed to roll with an equal one-sixth chance. Similarly, if you toss three coins, you can derive the outcome chance of each configuration by independence. Starting with a certain event and following symmetry and independence arguments, you can derive a lot of event chances.
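    The symmetry-and-independence derivation for the three-coin toss can be written out mechanically (a sketch using exact rational arithmetic; the names are illustrative):

```python
from fractions import Fraction
from itertools import product

# By symmetry, each fair coin shows H or T with chance 1/2; by independence,
# each of the 2**3 = 8 three-coin configurations gets (1/2)**3 = 1/8.
half = Fraction(1, 2)
outcomes = {cfg: half**3 for cfg in product("HT", repeat=3)}

print(len(outcomes))              # 8
print(outcomes[("H", "H", "H")])  # 1/8
print(sum(outcomes.values()))     # 1 - the certain event we started from
```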

    The last method involves Bayesian thinking, where, starting with a particular distribution - either of the initial conditions in some stochastic process, or an initial constraint on the present event - you reevaluate your probability constantly as evidence arrives.
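    The Bayesian reevaluation described above can be sketched in a few lines (a hypothetical example: a coin that is either fair or two-headed, updated after observing one head; all names are illustrative):

```python
from fractions import Fraction

def bayes_update(prior, likelihood):
    """Reweigh each hypothesis by the likelihood of the observed datum, then renormalize."""
    weighted = {h: prior[h] * likelihood[h] for h in prior}
    total = sum(weighted.values())
    return {h: w / total for h, w in weighted.items()}

# Hypothetical setup: the coin is fair or two-headed, 50/50 a priori.
prior = {"fair": Fraction(1, 2), "two-headed": Fraction(1, 2)}
p_heads = {"fair": Fraction(1, 2), "two-headed": Fraction(1)}

# Observing a single head shifts belief toward the two-headed hypothesis:
posterior = bayes_update(prior, p_heads)
print(posterior["two-headed"])  # 2/3
```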

    So much for now. Here is an article in the Stanford Encyclopedia of Philosophy that deals with statistics: https://plato.stanford.edu/entries/statistics
  • fishfry
    3.4k
    A while back someone on the site told me the technical names for what I call ontological and epistemological probability but I don't seem to remember them at the moment so I'll just use my terms.

    Say you flip a coin. If we believe in Newtonian physics for macroscopic events, the result of the coin flip is absolutely determined by the initial state and exact position of the coin, the flipping force, the air pressure, local gravity (accounting for our local elevation, say), the humidity of the air, and so forth. If we only knew all of those factors, we could predict the outcome of the coin toss with perfect accuracy.

    The problem is, we can't know all those things with sufficient accuracy. The coin toss is random not because it's inherently random, which it isn't; it is random because we can't possibly know the factors that go into its determinate outcome. That's what I call epistemological randomness. The event is random because we don't know, or can't know, enough about the factors that cause it to be one way or the other.

    Now a moment's thought reveals that pretty much everything that we regard as probabilistic is actually of this type. It's very difficult to think of anything at all that's truly random -- ontologically random -- which I define as random not because of our ignorance of the deterministic factors, but actually deep down random.

    We are hard pressed to give even a single example of an ontologically random event. Most people will fall back on quantum events. The low-order bit of the femtosecond timestamp of the next neutrino to hit your detector is random because QM says it is.

    But is it? How do we know that it wasn't determined at the moment of the big bang? And that if we only knew enough, we could predict it? Or that even if we could never know enough, it's still only random by virtue of our lack of knowledge?

    This gets into one of the great disputes of twentieth-century physics. Is it possible there are "hidden variables," aspects of reality that we just don't know, that, if we knew them, would enable us to see the determinism lurking behind even quantum probabilities? I'm no expert on this stuff, but I understand that Bell's theorem rules out local hidden variables. At least up to the limits of our present knowledge of physics.

    There's a lot of physics that we don't know. We can't merge quantum physics with general relativity. We don't know what dark matter or dark energy are. There are revolutions waiting to be explicated by geniuses not yet born. It's possible that there is no such thing as randomness; and that probability is simply a measure of our ignorance of the causes of events.

    https://en.wikipedia.org/wiki/Hidden-variable_theory

    https://en.wikipedia.org/wiki/Bell%27s_theorem
  • T H E
    147
    What, conceptually, is probability? What is something being likely to happen?denis yamunaque

    Deep issue, and you've got some good answers already. I'll just introduce the theme of the theoretically fair coin. This is an ideal coin, the prototypical RNG (random number generator). Each time it's activated, it is just as likely to give heads as tails (or 1 as 0, if you prefer to think of a random bit). It has no memory, no anticipation. It doesn't matter that the last 1000 flips were heads; the next flip is just as likely to be heads as tails. I don't know enough about modern physics to say whether anything in nature corresponds with it, but we seem to co-imagine it pretty well. Do we just have an intuition about RNGs that's hard to say more about?
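    The "no memory" property of the ideal coin can be checked empirically (a sketch with a seeded pseudorandom generator, which of course only approximates the ideal coin):

```python
import random

rng = random.Random(7)

# Flip a simulated fair coin many times, and tally the flip that immediately
# follows three heads in a row - the ideal coin is supposed to have no memory.
history, after_streak = [], []
for _ in range(200_000):
    heads = rng.random() < 0.5
    if history[-3:] == [True, True, True]:
        after_streak.append(heads)
    history.append(heads)

# Even right after a streak, heads still comes up about half the time.
print(sum(after_streak) / len(after_streak))
```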
  • fishfry
    3.4k
    The next flip is just as likely to be heads as tails.T H E

    Only by virtue of our ignorance of the physical determinants of the outcome. Else you must not believe in physics at the macroscopic level.

    See for example

    https://www.google.com/search?q=coin+flip+controlled+under+lab+conditions&spell=1&sa=X&ved=2ahUKEwjAisbJgc3vAhW8FzQIHSn5AbYQBSgAegQIARAv&biw=807&bih=549

    and

    https://phys.org/news/2009-10-tails-key-variables.html
  • FlaccidDoor
    132
    I see probability as a count of all possible scenarios with respect to their results. For example, suppose a coin is tossed. That coin, thrown randomly, has an infinite number of ways it can fall, considering its speed, position, the angle at which it lands, etc. However, probability can simplify that to 50/50 because both sides have the same infinite number of scenarios that can lead to them. They cancel out, simply put, allowing a simplified understanding of what is technically something of infinite possibilities.

    Because of its nature, applying the concept of probability to something requires either a very concrete understanding of how things lead to the results or a large number of trials. Going further with the coin example, a concrete understanding is what was mentioned previously, where the 50/50 probability was reached deductively. The other way is to find it inductively, for example by running 100,000 trials and observing something close to the 50/50 ratio.

    But in the end it is the same. It is a simplified way for us to count the infinite possibilities for something.
  • T H E
    147
    Only by virtue of our ignorance of the physical determinants of the outcome. Else you must not believe in physics at the macroscopic level.fishfry

    Nature aside, what I'm trying to get at is the ideal, mathematical model. Specifically, we can think of a Bernoulli variable with p = 0.5. But even this is a formalism that aims at an intuitive notion. Actual coins are used as 'approximations' of the ideal coin, just as we settle for PRNGs.
  • fishfry
    3.4k
    Nature aside, what I'm trying to get at is the ideal, mathematical model. Specifically, we can think of a Bernoulli variable with p = 0.5. But even this is a formalism that aims at an notion. Actual coins are used as 'approximations' of the ideal coin, just as we settle for PRNGs.T H E

    You're right, I reread your post. You're talking about a theoretically random coin, and asking if such a thing could exist or if it's only an object of our intuitions. So that's the right question, if I understood you correctly.
  • T H E
    147

    Yes, I was trying to bring this intuition of the ideal coin into the conversation. Did we evolve so that this intuition was available? And: how does it connect to the larger conversation? Whether nature offers anything comparable is of course another important issue. (Does God play dice? Etc.)
  • Gnomon
    3.8k
    What, conceptually, is probability?denis yamunaque
    The word "probability" was derived from the concept of a provable postulate or prediction. An un-provable prediction is an opinion with no testable grounds for belief. Such prophecies must be taken on faith in the soothsayer, not on any objective evidence pointing to a normal future state. Hence, the prediction may rely on the small possibility of abnormal events (black swans) or miracles (divine intervention).

    Conceptually, a highly probable event doesn't have to be taken on faith, it's almost certain. But the lower the probability, the more faith is required for belief. For example, the likelihood of a tornado hitting my home may seem remote, but if the weather forecaster has a good record of reliability, you'd be wise to take his word for it, and prepare to take shelter.

    Mathematical probability is a numerical evaluation of the odds that the predicted future state can be tested and found true. As a practical method, probability theory derives its power from the stability of the "normal" bell-curve behavior of large numbers of relevant objects or trials. In other words, we predict the future based upon past experience. But the flaw in that theory is the small probability of Black Swans, which don't conform to the Norm. Some people interpret such rare events as miracles, because their minuscule probability is hard to calculate. Nevertheless, we can still retro-compute the probability after the fact; after the evidence has been found; after the event has "come to pass". :smile:

    The law of large numbers is a principle of probability according to which the frequencies of events with the same likelihood of occurrence even out, given enough trials or instances. As the number of experiments increases, the actual ratio of outcomes will converge on the theoretical, or expected, ratio of outcomes.
    https://whatis.techtarget.com/definition/law-of-large-numbers
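    The quoted law of large numbers can be illustrated directly (a sketch; `heads_frequency` is an illustrative helper with a fixed seed for reproducibility):

```python
import random

def heads_frequency(n, seed=42):
    """Empirical frequency of heads over n flips of a simulated fair coin."""
    rng = random.Random(seed)
    return sum(rng.random() < 0.5 for _ in range(n)) / n

# The observed ratio drifts toward the theoretical 0.5 as trials accumulate.
for n in (10, 1_000, 100_000):
    print(n, heads_frequency(n))
```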

    Black Swan theory : 2. The non-computability of the probability of the consequential rare events using scientific methods (owing to the very nature of small probabilities).
    https://en.wikipedia.org/wiki/Black_swan_theory

    Beloved Weatherman : https://www.washingtonpost.com/weather/2021/03/25/james-spann-alabama-tornado/

    True Prophets : Make predictions that “come to pass” (Jeremiah 28:9)
    When a prophet speaks in the name of the Lord, if the word does not come to pass or come true, that is a word that the Lord has not spoken; the prophet has spoken it presumptuously. You need not be afraid of him. (Deuteronomy 18:20-22)
    Beloved, do not believe every spirit, but test the spirits to see whether they are from God, for many false prophets have gone out into the world. (1 John 4:1-6)
  • TheMadFool
    13.8k
    Probability is uncertainty.

    If x + 1 = 1, then I'm absolutely certain, I have no doubts whatsoever, that x = 0. Probability has no role here.

    On the other hand, if 5 < x + 1 < 10 where x is an integer, x could be 5, 6, 7, or 8. I'm uncertain of x's value. Probability enters the picture, and without any further information, the probability that x = 5 is 1/4 = 25%, while the probability that x >= 6 is 3/4 = 75%.
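    Enumerating the possibilities makes the arithmetic explicit (a sketch in exact rationals; note that three of the four candidates satisfy x ≥ 6):

```python
from fractions import Fraction

# Enumerate the integers satisfying 5 < x + 1 < 10 (equivalently 4 < x < 9):
candidates = [x for x in range(-100, 100) if 5 < x + 1 < 10]
print(candidates)  # [5, 6, 7, 8]

# With no further information, treat each candidate as equally likely:
p_five = Fraction(1, len(candidates))
p_at_least_six = Fraction(sum(1 for x in candidates if x >= 6), len(candidates))
print(p_five)          # 1/4
print(p_at_least_six)  # 3/4
```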
  • InPitzotl
    880
    I propose thinking of probability a bit more abstractly.

    Probability fundamentally is a measure; when you assign a probability value, you're assigning a measure. The assignment of numerical probabilities is explicitly a claim that the measure of probability of events scales to the measure of the assigned numbers. To claim that A is twice as likely as B is to claim that P(A)=2P(B); to claim that you cannot say A is twice as likely as B is to claim that you cannot say P(A)=2P(B).
    I mean, we assume that if an event has probability of 99,9% of happening, it means that if we simulate the conditions, each 1000 times the event would occur next to 999 times. But that's not a fact, since nothing really prohibits the complement of the event, with probability of 0,1% of keep continuously occurring through time, while the first event, with almost 100% of probability never happens.denis yamunaque
    I disagree with this assessment. To claim that P(A)=99.9% is to claim that you can assign a probability measure of A happening to the value 0.999. Since the numerical assignments presume the numerical scales map to the probabilities, and since 0.999 is 999 times greater than 0.001, the statement P(A)=99.9% is equivalent to the claim that A is 999 times as likely to occur as it is to not occur. That is what P(A)=99.9% means.

    That does not imply that if you repeat the event 1000 times, then A should occur nearly 999 times (this being an application of the Law of Large Numbers). At best it implies that A is likely to occur nearly 999 times. Nor does this imply that if you repeat the event 1000 times, A would occur at least once; at best, it implies that it is likely to occur at least once. All P(A)=99.9% implies is that there exists a meaningful measure such that you can say things like A is 999 times more likely to occur than it is to not occur. In fact, as @Aryamoy Mitra did earlier, we can use probability to describe exactly how unlikely it is that A never occurs when the event happens 1000 times.
  • EnPassant
    670
    Probability means that out of a number of events each has the same likelihood of happening. But how can we say they are all equally likely? How can we know that?
  • jgill
    3.9k
    Probability means that out of a number of events each has the same likelihood of happeningEnPassant

    You have a container in which there are ten black balls and five white balls. You reach in and pluck out a ball. What is the probability it is black? White?
  • EnPassant
    670
    You have a container in which there are ten black balls and five white balls. You reach in a pluck out a ball. What is the probability it is black? White?jgill

    black = 2/3
    white = 1/3
  • Gnomon
    3.8k
    Well, it's an awkward question, but, what in fact is probability?denis yamunaque
    Humans have an advantage over most animals, in that we can imagine the near future, and prepare to make our next move, before the future actually arrives. Most animals deal with unexpected events with automatic knee-jerk reflexes. Which serves them well, in their narrow niche of the tooth & claw jungle. But humans have created a variety of artificial niches to suit diverse specialized needs and preferences. Consequently, our "asphalt jungle" is even more complex & chaotic, and rapidly changing, than the natural habitat of other animals.

    That may be why we were forced to supplement our basic animal survival instincts with formal methods for more accurately predicting the moving targets of the future. Ancient prophecy was merely educated guessing, based on direct experience from past events and trends. But humans also learned to create abstract mathematical models of how the world works. And Probability Theory eventually emerged, ironically from Game Theory, based on long experience with gambling competitions, to give those-in-the-know an advantage over other players. For example, a card game is an abstract simulation of real-world social situations. If you can "count cards" you will have a better idea of what hand your opponent is holding, and what his next move might be. Hence, when such unknowns can be reduced to number values they can be manipulated more quickly & accurately than the nebulous social values of human communities: e.g., is he bluffing?

    Therefore, what we now call "Probability" is essentially a formalized form of intuition or foresight. It allows us to calculate what is normally to be expected in a well-defined situation. Hence, it gives us an edge in dealing with the unnatural exigencies of the complicated civilized world of cunning thinking animals, and with the unfamiliar uncertainties of the natural world. Probability Theory is "in fact" a new tool, like teeth & claws, for humans to use in the high-stakes game of survival. Unfortunately, Probability is still not a perfect form of Prognostication. :brow:


    There are three major types of probabilities:
    Theoretical Probability.
    Experimental Probability.
    Axiomatic Probability.
  • fishfry
    3.4k
    Humans have an advantage over most animals, in that we can imagine the near future, and prepare to make our next move, before the future actually arrives.Gnomon

    Like a cat badgering you till you open a can of cat food for it? Cats most definitely imagine the near future. "Rub my tummy, human." "Feed me, human."
  • Gnomon
    3.8k
    Cats most definitely imagine the near future.fishfry
    The human advantage over cats is in the degree & detail of its imagery -- including abstract models of Probability. I assume that cats have an instinctive sense of future prospects, but the theory of Probability goes beyond the innate dispositions that humans share with cats, into ideal realms where the cat food doesn't require human servants with hands & can openers. What if cat food just grew on trees -- what are the chances? :joke:
  • tim wood
    9.3k
    Probability is a measure of information about a system. For two bettors, odds differ for each depending on what each knows. The Monty Hall problem makes this reasonably clear.
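    The Monty Hall problem mentioned above can be simulated to show how information shifts the odds (a minimal sketch; the host is assumed always to open a losing, unpicked door, and the names are illustrative):

```python
import random

def monty_hall(switch, trials=100_000, seed=1):
    """Estimate the win rate of sticking vs switching in the Monty Hall game."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        car = rng.randrange(3)
        pick = rng.randrange(3)
        # The host, who knows more than the player, opens a losing, unpicked door.
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += pick == car
    return wins / trials

# The switcher exploits the host's information: ~2/3 wins versus ~1/3.
print(monty_hall(switch=True), monty_hall(switch=False))
```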
  • Gary Enfield
    143


    Denis / All

    As I have mentioned on my thread in the Philosophy of Science forum, probabilities are an acknowledgment of no known cause for a range of outcomes, and therefore their function is to provide a description in the absence of an explanation.

    In providing that description, I agree with T Wood that
    Probability is a measure of information about a system.tim wood

    Determinist theory, and the maths involved in the traditional Laws of Physics & Chemistry, (rather than the principles emerging from QM), says that for every action from a precise start point, there can only be one inevitable outcome - and if you were to accept that premise for all aspects of existence, then everything in existence would be truly inevitable and we would all be acting-out a fixed script.

    The trouble is that this theory contradicts our living experiences, and requires us to effectively deny both the reality of our lives, and the experiences of every person who has ever lived, simply to uphold a doctrine - which may or may not be true. That is a very big ask.

    However, determinists will point out that there is a difference between our limited knowledge and therefore our ability to predict outcomes vs the underlying inevitability of existence. There is an important difference in that respect, but it doesn't deny the possibility of truly random events instead of hidden causes/variables.

    The Laws of Physics demand one outcome - not multiple outcomes from a given start point, (called randomness) - because that is the basis of their explanations. So when multiple outcomes arise, there is either a missing factor, or a lack of determinism.

    In trying to resolve the unknown factors behind the differences in outcome, people may try to assume that the gap simply reflects known factors that are not being monitored. But that is an assumption, and where there is no factor that can logically be applied to resolve a scenario, then such a presumption seems to be at odds with the evidence. It is perfectly valid to look for missing factors, but it is also valid to consider that there may be genuinely non-determinist (spontaneous or random) factors in existence.

    The more narrow the range of potential outcomes, the more likely it would seem that a hidden mechanism is at work, which has not yet been identified - but conversely, a very broad range of outcomes, without any discernible pattern might also be evidence for a lack of determinism.

    As has been pointed out, probabilities, like odds in a lottery, provide a generalised description rather than a firm prediction. The most extreme and unusual outcome could occur next time around. But the converse is also true, probabilities indicate likelihood, and so the chances of the same extreme outcome occurring a second or third time would be extremely remote if the statistical model were originally correct.

    That is the dilemma when trying to apply probabilities to the origin of the first living cell. Multiple examples of every single protein would be needed for nature to experiment and evolve. When the odds of a single protein occurring by chance are one in '10 to the power 366' (i.e. more options than there are atoms in the universe), then yes, the likelihood of a 2nd example occurring by chance is pretty well nil. Yet life emerged.

    In other words, the problems are not confined to randomness, but also examples of co-ordination that break the scientific models. So I don't agree with fishfry that

    We are hard pressed to give even a single example of an ontologically random event. Most people will fall back on quantum events. The low-order bit of the femtosecond timestamp of the next neutrino to hit your detector is random because QM says it is.fishfry

    The traditional main source of examples for a lack of determinism arise from Thought and consciousness. But as we have seen, the origin of proteins, plus the growing number of examples from the activities of molecules within living cells, point to other equally challenging issues. For instance, (as again discussed in the Philosophy of Science forum), certain molecules display characteristics of problem solving which defy the logic of the Laws of Physics which should apply to them.

    In terms of randomness there are also many examples from other disciplines, such as cosmology including the broad theory of origin and the Big Bang. The evidence of the accelerating expansion of the universe either says that the 'Big Bang - Big Crunch' explanation must have had a start point 13.7 billion years ago (representing spontaneity without cause), or had a change to a previously eternal sequence - requiring a change that had to be either spontaneous or random - ie. non-deterministic. (I confess that I got that principle from Finipolscie's books).

    The problem with QM is that we can't see what's happening at that level of existence, and can only describe what is observed. The extensive use of probabilities in QM is a way to try to bring the perceived randomness of the observations into the deterministic fold - but they still represent a description rather than an explanation, because they are basically an admission that we don't have a cause to explain the different outcomes.
  • Possibility
    2.8k
    Well, it's an awkward question, but, what in fact is probability? I mean, we assume that if an event has probability of 99,9% of happening, it means that if we simulate the conditions, each 1000 times the event would occur next to 999 times. But that's not a fact, since nothing really prohibits the complement of the event, with probability of 0,1% of keep continuously occurring through time, while the first event, with almost 100% of probability never happens.
    Anyone could argue that this is not likely, or if it happens, if you repeat the experiment, it would probably not happen again. But those two arguments just use again the definition of probability without explaining it.
    I could add that saying "Means that something is more or less likely" just change the word probable for likely.
    I know that in real life events like the first described are not usual, but mathematically it's not impossible and it's just a scenario to the main question: What, conceptually, is probability? What is something being likely to happen?
    denis yamunaque

    Here’s my take: Probability is a quantified occurrence of a possible event under limited conditions.

    The difference between probability - a mathematical structure - and real life events is the distribution of potential energy into attention and effort, or affect.

    The important point you make is ‘if we simulate the conditions’. So, if an event has a 99.9% probability of happening (under certain conditions), then theoretically we could manipulate these conditions locally in such a way that its complement will always occur, with surprising accuracy. The more precisely controlled and controllable the conditions under which this probability is calculated, the more play we have.

Welcome to The Philosophy Forum!
