Uncertainty: Probability and Markov chains
Probability theory
Distributions
Discrete: Binomial, Poisson
Continuous: Normal, Exponential
Theorems and Laws
Law of Large Numbers
Central Limit Theorem
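Both theorems can be seen in a short simulation (a sketch using only Python's standard library; the die-rolling setup and sample sizes are illustrative):

```python
import random
import statistics

random.seed(42)

# Law of Large Numbers: the sample mean of fair-die rolls
# approaches the true mean of 3.5 as the sample grows.
rolls = [random.randint(1, 6) for _ in range(100_000)]
print(statistics.mean(rolls))  # close to 3.5

# Central Limit Theorem: means of many small samples are
# approximately normally distributed around 3.5.
sample_means = [
    statistics.mean(random.randint(1, 6) for _ in range(30))
    for _ in range(2_000)
]
print(statistics.mean(sample_means))   # also close to 3.5
print(statistics.stdev(sample_means))  # roughly sigma / sqrt(30)
```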
Concepts
Sample Space
Set of all outcomes
Event
Subset of sample space
Probability Function
Assigns probabilities
Definition
Mathematical study of randomness
Decision theory
Definition
Studies decision-making under uncertainty
Objective
Identify the most rational decision
Key Concepts
Utility: Measure of desirability for outcomes
Risk: Uncertainty associated with outcomes
Expected Utility: Value considering probabilities
Approaches
Normative: Identifies most rational decision
Prescriptive: Provides decision-making strategies
Descriptive: Studies actual decision-making
Models
Decision Trees: Graphical representation of choices
Game Theory: Analyzes strategic interactions
Expected Utility Theory: Maximizes expected value
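Expected utility maximization fits in a few lines (a hypothetical two-option example; the actions, probabilities, and utility values are all made up for illustration):

```python
# Each action maps to a list of (probability, utility) outcomes.
# Numbers are illustrative, not from any real decision problem.
actions = {
    "take_umbrella": [(0.3, 8), (0.7, 6)],    # rain / no rain
    "leave_umbrella": [(0.3, 0), (0.7, 10)],
}

def expected_utility(outcomes):
    return sum(p * u for p, u in outcomes)

# The rational choice under expected utility theory is the action
# with the highest probability-weighted utility.
best = max(actions, key=lambda a: expected_utility(actions[a]))
print(best)  # "leave_umbrella": EU 7.0 beats 6.6
```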
Probability
Definition
A measure of the likelihood that an event will occur.
Range
Values range from 0 (impossible event) to 1 (certain event).
Calculation
Probability (P) = Number of favorable outcomes / Total number of possible outcomes (valid when all outcomes are equally likely).
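For equally likely outcomes, the formula translates directly into code (a minimal sketch; the even-number die example is illustrative):

```python
from fractions import Fraction

def probability(favorable, total):
    """P = favorable outcomes / total outcomes (equally likely)."""
    return Fraction(favorable, total)

# Probability of rolling an even number on a fair six-sided die.
outcomes = range(1, 7)
even = [x for x in outcomes if x % 2 == 0]
p = probability(len(even), len(outcomes))
print(p)  # 1/2
```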
Types
Theoretical Probability
Based on reasoning or a model (e.g., rolling a fair die).
Experimental Probability
Based on actual experiments and observations (e.g., flipping a coin multiple times).
Subjective Probability
Based on personal judgment or experience (e.g., estimating the chance of rain).
Applications
Used in various fields such as gambling, statistics, finance, insurance, and science.
Markov Chain Monte Carlo (MCMC)
Markov Chain Monte Carlo (MCMC) is a family of statistical techniques that construct a Markov chain whose stationary distribution is a target probability distribution, so that the chain's samples approximate draws from that distribution; it is widely used for Bayesian inference and model fitting when direct sampling is intractable.
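A minimal Metropolis sampler (one classic MCMC variant) targeting a standard normal distribution, sketched with the standard library only; the step size, burn-in, and iteration counts are arbitrary choices:

```python
import math
import random

random.seed(0)

def log_target(x):
    # Unnormalized log-density of a standard normal.
    return -0.5 * x * x

def metropolis(n_samples, step=1.0, burn_in=1_000):
    x = 0.0
    samples = []
    for i in range(n_samples + burn_in):
        proposal = x + random.uniform(-step, step)
        # Accept with probability min(1, target(proposal)/target(x)).
        if math.log(random.random()) < log_target(proposal) - log_target(x):
            x = proposal
        if i >= burn_in:
            samples.append(x)
    return samples

samples = metropolis(20_000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(round(mean, 2), round(var, 2))  # near 0 and 1
```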
Andrey Markov
Andrey Markov was a Russian mathematician known for developing Markov chains: stochastic processes that move between states, where the probability of the next state depends only on the current state (the Markov property).
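In the simplest form, a Markov chain is a transition table plus the rule that the next state depends only on the current one (a sketch with made-up weather states and transition probabilities):

```python
import random

random.seed(1)

# Transition probabilities: for each current state, a list of
# (next_state, probability) pairs. Numbers are illustrative.
transitions = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state):
    # Markov property: the next state depends only on `state`.
    r = random.random()
    cumulative = 0.0
    for nxt, p in transitions[state]:
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt

state = "sunny"
counts = {"sunny": 0, "rainy": 0}
for _ in range(50_000):
    state = step(state)
    counts[state] += 1

# Long-run fraction approaches the stationary distribution (2/3 sunny).
print(counts["sunny"] / 50_000)
```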
Hidden Markov Models (HMMs)
Speech Processing
Page Ranking
Population Genetic Drift
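The core HMM computation, the probability of an observation sequence given hidden states, can be sketched with the forward algorithm (a toy model; the states, emissions, and all probabilities are made up):

```python
# Toy HMM: hidden weather states emit observed activities.
# All probabilities are illustrative.
states = ["sunny", "rainy"]
start = {"sunny": 0.6, "rainy": 0.4}
trans = {"sunny": {"sunny": 0.7, "rainy": 0.3},
         "rainy": {"sunny": 0.4, "rainy": 0.6}}
emit = {"sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1},
        "rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5}}

def forward(observations):
    """Return P(observations), summing over all hidden state paths."""
    alpha = {s: start[s] * emit[s][observations[0]] for s in states}
    for obs in observations[1:]:
        alpha = {
            s: emit[s][obs] * sum(alpha[prev] * trans[prev][s]
                                  for prev in states)
            for s in states
        }
    return sum(alpha.values())

print(forward(["walk", "shop", "clean"]))
```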
Rational decision
Optimal Timing
Choosing the best time to act
Avoids Long Waits
Minimizes waiting time
Avoids Speeding Fines
Considers safety and legal constraints
Based on Preferences
Considers importance of goals
Depends on Goals
Guided by objectives
Highest Expected Utility
Chosen for overall benefit
Contingency Plans
Plans for uncertainties
Judgmental Domains
Requires subjective judgment
Example
Flight Planning
Optimal time for airport
Involves Uncertainty
Predicting future events
Chaos Theory
Deterministic Processes
Fixed rules can still yield unpredictable behavior
Butterfly Effect
Tiny changes, different outcomes
Applications
Weather, stock markets
Unpredictable Outcomes
Long-term predictions unreliable
Nonlinear Dynamics
System evolution over time
Sensitive Dependence
Importance of initial conditions
Example
Weather Prediction
Small changes, different weather
Interconnected Systems
Variables interconnected
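Sensitive dependence on initial conditions appears even in a one-line deterministic rule, the logistic map x → r·x·(1−x) at r = 4 (a standard chaotic-regime example; the starting values are arbitrary):

```python
def logistic_trajectory(x, r=4.0, steps=50):
    # Fully deterministic: each value depends only on the previous one.
    traj = [x]
    for _ in range(steps):
        x = r * x * (1 - x)
        traj.append(x)
    return traj

a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2000001)  # initial values differ by 1e-7

# The tiny initial gap grows until the trajectories fully diverge.
print(abs(a[1] - b[1]))                       # still tiny
print(max(abs(x - y) for x, y in zip(a, b)))  # grows to order 1
```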
Uncertainty
Challenges: Decision-making, Planning, Prediction.
Types: Epistemic (knowledge-based), Aleatory (random).
Definition: Lack of certainty about outcomes, whether from limited knowledge (epistemic) or inherent randomness (aleatory).
Tools: Probability Theory, Markov Chains.
Applications: Finance, Weather Forecasting, Risk Management.
Example: Predicting stock market movements.