Discrete Probability Distribution
Bernoulli Distribution
A random variable X has a Bernoulli distribution with parameter p, where 0 ≤ p ≤ 1, if it has only two possible values, typically denoted 0 and 1. The probability mass function (pmf) of X is given by
p(0)=P(X=0)=1−p
p(1)=P(X=1)=p.
Cumulative Distribution Function F(x)
0, x < 0
1 − p, 0 ≤ x < 1
1, x ≥ 1
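As a quick numerical check of the pmf and CDF above, here is a minimal Python sketch; the value p = 0.3 is an arbitrary illustrative choice.

```python
# Bernoulli(p) pmf and CDF as defined above; p = 0.3 is an arbitrary illustrative value.
p = 0.3

def bernoulli_pmf(x, p):
    # P(X = 0) = 1 - p, P(X = 1) = p, zero for any other value
    if x == 0:
        return 1 - p
    if x == 1:
        return p
    return 0.0

def bernoulli_cdf(x, p):
    # F(x) = 0 for x < 0, 1 - p for 0 <= x < 1, 1 for x >= 1
    if x < 0:
        return 0.0
    if x < 1:
        return 1 - p
    return 1.0

print(bernoulli_pmf(0, p), bernoulli_pmf(1, p))                          # 0.7 0.3
print(bernoulli_cdf(-1, p), bernoulli_cdf(0.5, p), bernoulli_cdf(2, p))  # 0.0 0.7 1.0
```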
Example:
Let A be an event in a sample space Ω. Suppose we are only interested in whether or not the outcome of the underlying probability experiment is in the specified event A. To track this we can define an indicator random variable, denoted I_A, given by
I_A(s) = 1, if s ∈ A; 0, if s ∈ A'.
The random variable I_A will equal 1 if the resulting outcome is in event A, and 0 otherwise.
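The indicator variable I_A is itself a Bernoulli random variable with p = P(A). Below is a minimal sketch using a hypothetical event A = "the roll of a fair die is even", so that P(A) = 0.5; the event and the number of simulated rolls are illustrative choices only.

```python
import random

# Hypothetical event A = "the roll of a fair six-sided die is even", so P(A) = 0.5.
A = {2, 4, 6}

def indicator(s, event):
    # I_A(s) = 1 if s is in A, 0 otherwise
    return 1 if s in event else 0

rolls = [random.randint(1, 6) for _ in range(100_000)]
values = [indicator(s, A) for s in rolls]
print(sum(values) / len(values))  # should be close to P(A) = 0.5
```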
Statistics
Mean: μ = p
Variance: σX² = p(1 − p) = pq
Skewness: β1 = α3² = (1 − 2p)² / (p(1 − p))
Kurtosis: β2 = α4 = (1 − 6p(1 − p)) / (p(1 − p)) + 3
Binomial Distribution
Suppose that n independent trials of the same probability experiment are performed, where each trial results in either a "success" (with probability p) or a "failure" (with probability 1 − p). If the random variable X denotes the total number of successes in the n trials, then X has a binomial distribution with parameters n and p, which we write X ∼ binomial(n, p). The probability mass function of X is given below.
Binomial Experiment
A binomial experiment counts how many successes occur in n repetitions of a Bernoulli experiment. The conditions are as follows (a simulation sketch after this list illustrates them):
n, the total number of times the experiment is repeated, is a constant fixed before the experiment
Each repetition has only two possible outcomes, success or failure
The probability of success is p and the probability of failure is q = 1 − p, the same on every repetition
The outcome of each repetition does not depend on the outcomes of the other repetitions
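As a rough illustration of these conditions, here is a minimal simulation sketch; the values n = 10, p = 0.4, and 100,000 repetitions are arbitrary illustrative choices. It compares the empirical frequency of each success count with the exact binomial probability from the pmf given just below.

```python
import random
from collections import Counter
from math import comb

n, p, reps = 10, 0.4, 100_000  # arbitrary illustrative values

# Each repetition runs n independent Bernoulli(p) trials and records the number of successes.
counts = Counter(sum(random.random() < p for _ in range(n)) for _ in range(reps))

for k in sorted(counts):
    empirical = counts[k] / reps
    exact = comb(n, k) * p**k * (1 - p)**(n - k)
    print(k, round(empirical, 4), round(exact, 4))
```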
p(x) = P(X = x) = (n choose x) p^x (1 − p)^(n − x), for x = 0, 1, …, n
Example: Consider three tosses of a fair coin, so in this case we have parameter n = 3. Furthermore, we are interested in counting the number of heads occurring in the three tosses, so a "success" is getting a heads on a toss, which occurs with probability 0.5, and so parameter p = 0.5. Thus, the random variable X in this example has a binomial(3, 0.5) distribution, and its probabilities follow by applying the formula for the binomial pmf given above.
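To complete the example, the binomial(3, 0.5) pmf can be evaluated directly; a minimal sketch using math.comb for the binomial coefficient:

```python
from math import comb

def binomial_pmf(x, n, p):
    # P(X = x) = C(n, x) * p^x * (1 - p)^(n - x)
    return comb(n, x) * p**x * (1 - p)**(n - x)

# The coin-toss example: n = 3 tosses, "success" = heads with p = 0.5.
n, p = 3, 0.5
for x in range(n + 1):
    print(x, binomial_pmf(x, n, p))
# prints 0 0.125, 1 0.375, 2 0.375, 3 0.125
```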
Reference: Stats.libretext & Dr. Ir Harinaldi
Multinomial Distribution
The multinomial distribution can be used to compute the probabilities in situations in which there are more than two possible outcomes. For the case of three possible outcomes, the probability of a particular combination of counts is
p = (n! / (n1! n2! n3!)) p1^n1 p2^n2 p3^n3
p is the probability of obtaining that particular combination of outcomes,
n is the total number of events
n1 is the number of times Outcome 1 occurs,
n2 is the number of times Outcome 2 occurs,
n3 is the number of times Outcome 3 occurs,
p1 is the probability of Outcome 1
p2 is the probability of Outcome 2 , and
p3 is the probability of Outcome 3.
More generally, with k possible outcomes, p = (n! / (n1! n2! ⋯ nk!)) p1^n1 p2^n2 ⋯ pk^nk.
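Here is a minimal sketch of this formula; the outcome probabilities (0.2, 0.3, 0.5) and the observed counts (2, 3, 5 in n = 10 events) are hypothetical illustrative values.

```python
from math import factorial

def multinomial_pmf(counts, probs):
    # p = n! / (n1! n2! ... nk!) * p1^n1 * p2^n2 * ... * pk^nk
    n = sum(counts)
    coef = factorial(n)
    for c in counts:
        coef //= factorial(c)
    prob = 1.0
    for c, q in zip(counts, probs):
        prob *= q ** c
    return coef * prob

# Hypothetical example: n = 10 events, outcome probabilities 0.2, 0.3, 0.5,
# probability that Outcomes 1, 2, 3 occur exactly 2, 3, and 5 times.
print(multinomial_pmf([2, 3, 5], [0.2, 0.3, 0.5]))  # about 0.085
```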
Hypergeometric Distribution
Geometric Experiment
Poisson Distribution
The Poisson distribution can be used to calculate the probabilities of various numbers of "successes" based on the mean number of successes. In order to apply the Poisson distribution, the various events must be independent. Keep in mind that the term "success" does not really mean success in the traditional positive sense. It just means that the outcome in question occurs.
p=(e^−μ) (μ^x)/x!
e is the base of natural logarithms
μ is the mean number of "successes"
x is the number of "successes" in question
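A minimal sketch of this formula; the mean μ = 2.5 and the range of x values are arbitrary illustrative choices.

```python
from math import exp, factorial

def poisson_pmf(x, mu):
    # p = e^(-mu) * mu^x / x!
    return exp(-mu) * mu**x / factorial(x)

mu = 2.5  # illustrative mean number of "successes"
for x in range(6):
    print(x, round(poisson_pmf(x, mu), 4))
```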
Poisson Process
The Poisson process is widely used to model random points in time and space, such as the times of radioactive emissions, the arrival times of customers at a service center, and the positions of flaws in a piece of material. Examples include the following (a short simulation sketch appears after the list):
The times when a sample of radioactive material emits particles
The times when customers arrive at a service station
The times when file requests arrive at a server computer
The times when accidents occur at a particular intersection
The times when a device fails and is replaced by a new device
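As an illustration, arrival times in a Poisson process with rate λ can be simulated by summing independent exponential inter-arrival times; in the sketch below the rate (2 events per unit time) and the time horizon (10 units) are arbitrary illustrative values.

```python
import random

def poisson_process_times(lam, horizon):
    # Inter-arrival times are independent Exponential(lam); cumulative sums give the event times.
    times, t = [], 0.0
    while True:
        t += random.expovariate(lam)
        if t > horizon:
            return times
        times.append(t)

lam, horizon = 2.0, 10.0  # arbitrary illustrative rate and observation window
times = poisson_process_times(lam, horizon)
print(len(times), "events; expected about", lam * horizon)
print([round(t, 2) for t in times[:5]])  # first few event times
```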