Unit 2


Lesson 1: Independent and mutually exclusive events

Terminology

A trial: an operation with an unpredictable result, such as a single coin toss

An experiment: one or more trials, such as tossing a coin twice

An outcome: a result of an experiment, such as one head and one tail in two coin tosses

The sample space: the set of all possible outcomes, e.g. {HH, HT, TH, TT} for two coin tosses

An event: one or more outcomes, such as {HT, TH} (exactly one head), {HH, TH} (the last toss is a head), {HH, HT, TH} (at least one head), …

Probability: likelihood of an event, between 0 and 1; for equally likely outcomes, P(event) = |event|/|sample space|

Impossible event: p=0

Certain event: p=1

Complementary events: A and A′, P(A) + P(A′) = 1
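The terminology above can be checked with a short sketch for the two-coin-toss experiment (the event names are my own labels):

```python
from fractions import Fraction
from itertools import product

# Sample space for two coin tosses: {HH, HT, TH, TT}
sample_space = ["".join(t) for t in product("HT", repeat=2)]

def p(event):
    """P(event) = |event| / |sample space| for equally likely outcomes."""
    return Fraction(len(event), len(sample_space))

at_least_one_head = [o for o in sample_space if "H" in o]
print(p(at_least_one_head))   # 3/4
print(p(sample_space))        # certain event: 1
print(p([]))                  # impossible event: 0

# Complementary events: P(A) + P(A') = 1
no_head = [o for o in sample_space if o not in at_least_one_head]
print(p(at_least_one_head) + p(no_head))  # 1
```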

Addition Rule for Probability

P(A∪B) = P(A) + P(B) − P(A∩B)

Independence and conditional probability

Independent events: P(A∩B) = P(A)·P(B)

Mutually exclusive events: P(A∩B) = 0, so P(A∪B) = P(A) + P(B)
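The addition rule can be verified by counting, here for one roll of a fair die (the events A, B, C are my own illustrative choices):

```python
from fractions import Fraction

# One roll of a fair die
sample_space = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}   # even number
B = {4, 5, 6}   # greater than 3
C = {1}         # a one (disjoint from A)

def p(event):
    return Fraction(len(event), len(sample_space))

# Addition rule: P(A∪B) = P(A) + P(B) − P(A∩B)
assert p(A | B) == p(A) + p(B) - p(A & B)
print(p(A | B))  # 2/3

# Mutually exclusive events: P(A∩C) = 0, so the rule simplifies
assert p(A | C) == p(A) + p(C)
```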

Lesson 2: Counting Principles

Fundamental Counting Principle

Rule of product

Rule of sum
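Both counting rules can be checked with a short sketch (the shirt/pants items are my own illustrative example, not from the notes):

```python
from itertools import product

# Rule of product: choosing a shirt AND a pair of pants
shirts = ["red", "blue", "green"]
pants = ["jeans", "khakis"]
outfits = list(product(shirts, pants))
assert len(outfits) == len(shirts) * len(pants)   # 3 × 2 = 6

# Rule of sum: choosing a shirt OR a pair of pants (disjoint choices)
single_item = shirts + pants
assert len(single_item) == len(shirts) + len(pants)  # 3 + 2 = 5
```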

Permutations

nPr = n × (n−1) × … × (n−r+1) = n!/(n−r)!

Divide out the duplicates: arranging n objects that include identical groups of sizes a, b, and c gives

n!/(a!·b!·c!)

Combinations

nCr = n!/(r!(n−r)!)
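Python's standard library implements both formulas directly; a quick check, with "BANANA" as my own example for dividing out duplicates:

```python
import math

# Permutations: nPr = n! / (n−r)!
print(math.perm(5, 2))   # 20  (5 × 4)
assert math.perm(5, 2) == math.factorial(5) // math.factorial(3)

# Combinations: nCr = n! / (r!(n−r)!)
print(math.comb(5, 2))   # 10
assert math.comb(5, 2) == math.perm(5, 2) // math.factorial(2)

# Dividing out duplicates: arrangements of the letters of "BANANA"
# 6! / (3!·2!·1!) for 3 A's, 2 N's, 1 B
arrangements = math.factorial(6) // (math.factorial(3) * math.factorial(2))
print(arrangements)      # 60
```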

Lesson 3: Tree diagrams and combined events

Lesson 4: Discrete probability distributions

Random variable: assigns a number to each possible outcome

x            0    1    2
Event for x
P(X = x)
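A distribution table like this can be built by counting outcomes; here X is the number of heads in two coin tosses:

```python
from fractions import Fraction
from itertools import product

# X = number of heads in two coin tosses
sample_space = ["".join(t) for t in product("HT", repeat=2)]

dist = {}
for outcome in sample_space:
    x = outcome.count("H")
    dist[x] = dist.get(x, Fraction(0)) + Fraction(1, len(sample_space))

# P(X=0) = 1/4, P(X=1) = 1/2, P(X=2) = 1/4
assert sum(dist.values()) == 1  # probabilities in a distribution sum to 1
```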

Lesson 5: Binomial Probability Distributions

x           0              1                  2                  …   i                  …   n

P(X = x)    nC0 p^0 q^n    nC1 p^1 q^(n−1)    nC2 p^2 q^(n−2)    …   nCi p^i q^(n−i)    …   nCn p^n q^0

In general, P(X = i) = nCi p^i q^(n−i), where q = 1 − p
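The general term of the table translates directly into code; the four-toss example is my own illustration:

```python
import math
from fractions import Fraction

def binomial_pmf(n, i, p):
    """P(X = i) = nCi · p^i · q^(n−i), with q = 1 − p."""
    q = 1 - p
    return math.comb(n, i) * p**i * q**(n - i)

# Example: P(exactly 2 heads in 4 fair coin tosses)
p = Fraction(1, 2)
print(binomial_pmf(4, 2, p))   # 3/8

# All n+1 probabilities in the table sum to 1
assert sum(binomial_pmf(4, i, p) for i in range(5)) == 1
```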

Lesson 5n: Normal Probability Distributions

Continuous probability distribution

Normal random variable and normal distribution

Standard deviation σ: measures the spread of the distribution's values

Bigger standard deviation -> more spread -> flatter curve

Properties of the Normal Distribution: P(X < x) = f(x, μ, σ), a function of x, μ, and σ

Curve is symmetrical about the mean μ

The mean divides the area under the curve into two equal halves

Total area under curve is 1

μ and σ define the shape of the curve

Standard normal distribution, μ=0, σ=1

Converting any normal random variable X into the standard normal variable Z

z=(x−μ)/σ
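The conversion can be checked with the standard library's `statistics.NormalDist` (the values μ = 100, σ = 15, x = 130 are my own illustrative choices):

```python
from statistics import NormalDist

# Convert X ~ N(μ=100, σ=15) to standard normal via z = (x − μ)/σ
mu, sigma, x = 100, 15, 130
z = (x - mu) / sigma
print(z)  # 2.0

# P(X < x) equals P(Z < z): the conversion preserves the area under the curve
assert abs(NormalDist(mu, sigma).cdf(x) - NormalDist().cdf(z)) < 1e-12
```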

Lesson 6: Expected value

x             x_1       x_2       x_3
Event for x
P(X = x)      P(x_1)    P(x_2)    P(x_3)
Value for x   v_1       v_2       v_3

Expected value: E(X) = v_1 P(x_1) + v_2 P(x_2) + v_3 P(x_3) = Σ v_i P(x_i)
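The expected-value sum is a one-liner in code; the game below (win 2, 1, or 0 with the stated probabilities) is my own example:

```python
from fractions import Fraction

# Win 2 with probability 1/4, win 1 with probability 1/2, win 0 with probability 1/4
values = [2, 1, 0]
probs = [Fraction(1, 4), Fraction(1, 2), Fraction(1, 4)]
assert sum(probs) == 1  # sanity check: a valid distribution

# E(X) = Σ v_i · P(x_i)
expected = sum(v * p for v, p in zip(values, probs))
print(expected)  # 1
```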