Discrete probability distribution
Latest revision as of 16:00, 7 August 2024

This article is developing and not approved.

A discrete probability distribution is one class of probability distributions. The other main class in basic probability theory is continuous probability distributions.

Intro - probability as logic viewpoint

Faced with a list of mutually exclusive propositions or possible outcomes, people intuitively put "degrees of belief" on the different alternatives.

For instance, consider the following 2 propositions:

  • During next week there will be rain in London.
  • During next week there will be no rain in London.

Based on available information about the record of past weather in England, people tend intuitively to put more "belief" in the first possibility than the second.

For another, slightly more complex example, consider the following 6 propositions:

  • During next week no automobiles will enter I-45 from Galveston Island.
  • During next week 1-1,000 automobiles will enter I-45 from Galveston Island.
  • During next week 1,001-10,000 automobiles will enter I-45 from Galveston Island.
  • During next week 10,001-100,000 automobiles will enter I-45 from Galveston Island.
  • During next week 100,001-1,000,000 automobiles will enter I-45 from Galveston Island.
  • During next week more than 1,000,000 automobiles will enter I-45 from Galveston Island.

Based on local information about past traffic patterns, people will intuitively distribute a "degree of belief" among the propositions.

If every "degree of belief" is a real number ranging from 0 to 1 and the sum over all the alternatives is exactly 1, we have a discrete probability distribution, and each "degree of belief" is called a probability.
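Made concrete, the traffic example above becomes a small table of degrees of belief. This is a minimal sketch; the probabilities are made up for the example, not real traffic statistics:

```python
# Illustrative "degrees of belief" for the six traffic propositions above;
# the numbers are invented for the example, not real traffic data.
beliefs = {
    "no automobiles": 0.01,
    "1-1,000": 0.04,
    "1,001-10,000": 0.15,
    "10,001-100,000": 0.45,
    "100,001-1,000,000": 0.30,
    "more than 1,000,000": 0.05,
}

# Each degree of belief lies in [0, 1] and the total is 1,
# so these numbers form a discrete probability distribution.
assert all(0.0 <= p <= 1.0 for p in beliefs.values())
print(abs(sum(beliefs.values()) - 1.0) < 1e-9)  # True
```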

A discrete probability distribution is thus nothing more than a mathematically precise version of a common intuitive phenomenon, reflecting the human mind's ability to deduce and infer the physical propensities of external systems.

As a simple illustration of how the individual probabilities may be obtained in practice, consider the expected results for a coin toss experiment. While we don't know the result of any individual toss of the coin, we can expect the results to average out to be heads half the time and tails half the time (assuming a fair coin).
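This averaging-out can be checked with a small simulation, assuming only Python's standard library:

```python
import random

random.seed(1)  # fixed seed so the run is reproducible
n = 100_000
# Fair coin: each toss is heads with probability 0.5
heads = sum(1 for _ in range(n) if random.random() < 0.5)
print(abs(heads / n - 0.5) < 0.01)  # True: the observed frequency settles near 1/2
```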


Formal definition

Given a countable set S = {s0, ..., sn, ...} of mutually exclusive propositions (or possible outcomes of an experiment), let A = [0,1], the closed unit interval of the real numbers R. A discrete probability distribution is then a subset T = {(s0,t0), ..., (sn,tn), ...} of the Cartesian product S × A, such that all the ti sum to exactly 1.
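For a finite set of outcomes, the definition translates directly into a validity check. The function name here is an illustration, not standard terminology:

```python
def is_discrete_distribution(pairs, tol=1e-9):
    """Check that pairs (s_i, t_i) form a discrete probability distribution:
    the s_i are distinct, each t_i lies in [0, 1], and the t_i sum to 1
    (within a floating-point tolerance)."""
    outcomes = [s for s, _ in pairs]
    probs = [t for _, t in pairs]
    return (len(set(outcomes)) == len(outcomes)      # propositions are distinct
            and all(0.0 <= t <= 1.0 for t in probs)  # each t_i in [0, 1]
            and abs(sum(probs) - 1.0) < tol)         # probabilities sum to 1

fair_die = [(face, 1/6) for face in range(1, 7)]
print(is_discrete_distribution(fair_die))  # True
```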


Important examples

Bernoulli distribution - Each experiment is either a 1 ("success") with probability p or a 0 ("failure") with probability 1-p. An example would be tossing a coin. If the coin is fair, your probability for "success" will be exactly 50%.

An experiment where the outcome follows the Bernoulli distribution is called a Bernoulli trial.
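A Bernoulli trial is easy to sketch in code; the helper name is made up for this example:

```python
import random

def bernoulli_trial(p, rng=random):
    """Return 1 ("success") with probability p, else 0 ("failure")."""
    return 1 if rng.random() < p else 0

random.seed(0)
samples = [bernoulli_trial(0.5) for _ in range(10_000)]
# For a fair coin (p = 0.5) the average outcome should be close to 0.5
print(abs(sum(samples) / len(samples) - 0.5) < 0.05)  # True
```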

Binomial distribution - Each experiment consists of a series of n identical, independent Bernoulli trials, e.g. tossing a coin n times, and the outcome is the number of successes.
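The probability of exactly k successes in n trials is given by the binomial formula C(n,k) p^k (1-p)^(n-k); a sketch using only the standard library:

```python
import math

def binomial_pmf(k, n, p):
    """P(exactly k successes in n independent Bernoulli trials,
    each succeeding with probability p)."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

# Exactly 5 heads in 10 fair-coin tosses: C(10,5) / 2**10 = 252/1024
print(round(binomial_pmf(5, 10, 0.5), 4))  # 0.2461
```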

Uniform distribution - Each experiment has a certain finite number of possible outcomes, each with the same probability. Throwing a fair die, e.g., has six possible outcomes, each with the same probability. The Bernoulli distribution with p=0.5 is another example.
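The fair-die example, written with exact fractions so that the six equal probabilities sum to exactly 1 rather than approximately:

```python
from fractions import Fraction

# Uniform distribution over the six faces of a fair die:
# every face gets the same probability 1/6.
die = {face: Fraction(1, 6) for face in range(1, 7)}

assert all(p == Fraction(1, 6) for p in die.values())
print(sum(die.values()))  # 1
```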

Poisson distribution - Given an experiment where we have to wait for an event to happen, and the expected remaining waiting time is independent of how long we have already waited, the number of events per unit time will be a Poisson-distributed variable.
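A sketch of the Poisson probability mass function P(k) = lambda^k e^(-lambda) / k!, where lambda is the average number of events per unit time:

```python
import math

def poisson_pmf(k, lam):
    """Probability of observing exactly k events per unit time,
    given an average rate of lam events per unit time."""
    return lam**k * math.exp(-lam) / math.factorial(k)

# Average rate of 3 events per unit time: probability of exactly 2 events
print(round(poisson_pmf(2, 3.0), 3))  # 0.224
```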

Geometric distribution - The number of independent Bernoulli trials needed to obtain the first success, e.g. the number of coin tosses up to and including the first head.

Negative binomial distribution - The number of independent Bernoulli trials needed to obtain a fixed number r of successes; the geometric distribution is the special case r = 1.




References

See also


Related topics

External links