Uncertainty, Probability, and Markov Chains
Uncertainty in Decision-Making
Belief state = the set of all world states the agent might be in
Logic alone is insufficient
Need contingency plans (e.g., catching a flight)
Rational Decision Making
Rational decisions depend on:
Goal importance
Likelihood of outcomes
Decision theory = Probability + Utility
Probability Theory
Degree of belief (0 to 1)
Probability summarizes uncertainty
Sample space = all possible outcomes
Markov Chains
Random transitions between states
Memoryless (only current state matters)
Weather example:
Today’s weather → Tomorrow’s weather
Transition matrix: Rows add up to 1
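The weather node above can be sketched as a tiny simulation. This is a minimal illustration with made-up transition probabilities (the matrix values are assumptions, not from the diagram); the key properties are that each row of the matrix sums to 1 and that the next state depends only on the current one.

```python
import random

STATES = ["sunny", "rainy"]
# P[i][j] = probability of moving from state i to state j.
# Illustrative numbers; each row adds up to 1.
P = [
    [0.8, 0.2],  # sunny -> sunny, sunny -> rainy
    [0.4, 0.6],  # rainy -> sunny, rainy -> rainy
]

def simulate(start_index, days, seed=0):
    """Walk the chain: tomorrow's weather depends only on today's."""
    rng = random.Random(seed)
    state = start_index
    history = [STATES[state]]
    for _ in range(days):
        state = rng.choices(range(len(STATES)), weights=P[state])[0]
        history.append(STATES[state])
    return history

# Sanity check the "rows add up to 1" property.
assert all(abs(sum(row) - 1.0) < 1e-9 for row in P)
```

Running `simulate(0, 7)` yields an eight-day weather sequence starting from "sunny".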
Examples of Markov Systems
Kitten behavior
Weather prediction
Stock market
Blood pressure changes
Markov Chain Monte Carlo (MCMC)
Simulate random processes
Monopoly board example: track which squares are landed on most often
Applications: Forecasting, Strategy optimization
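The Monopoly example above can be approximated with a Monte Carlo simulation. This sketch uses a simplified circular 40-square board with two dice and no jail, chance cards, or "Go to Jail" square (all simplifying assumptions), so the landing counts come out nearly uniform; the real game's jail rules are what skew the distribution toward certain properties.

```python
import random
from collections import Counter

def simulate_board(n_squares=40, n_rolls=100_000, seed=1):
    """Monte Carlo walk on a simplified circular board:
    roll two dice, advance, and count landings per square."""
    rng = random.Random(seed)
    pos = 0
    counts = Counter()
    for _ in range(n_rolls):
        pos = (pos + rng.randint(1, 6) + rng.randint(1, 6)) % n_squares
        counts[pos] += 1
    return counts

counts = simulate_board()
most_landed = counts.most_common(3)  # the three most-visited squares
```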
Hidden Markov Models (HMM)
Patterns hidden behind observable events
Applications:
Speech recognition
Google page ranking
DNA sequencing
Virology
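The "patterns hidden behind observable events" idea can be made concrete with the forward algorithm, which computes how likely an observation sequence is under an HMM. The states, observations, and probabilities below are a standard toy example (illustrative numbers, not from the diagram): hidden weather emits observable activities.

```python
def forward(obs, states, start_p, trans_p, emit_p):
    """Forward algorithm: P(observation sequence) under an HMM,
    summing over all possible hidden-state paths."""
    alpha = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    for o in obs[1:]:
        prev = alpha[-1]
        alpha.append({
            s: emit_p[s][o] * sum(prev[r] * trans_p[r][s] for r in states)
            for s in states
        })
    return sum(alpha[-1].values())

# Toy model: hidden weather, observable activity (assumed numbers).
states = ("Rainy", "Sunny")
start_p = {"Rainy": 0.6, "Sunny": 0.4}
trans_p = {"Rainy": {"Rainy": 0.7, "Sunny": 0.3},
           "Sunny": {"Rainy": 0.4, "Sunny": 0.6}}
emit_p = {"Rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5},
          "Sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1}}

p = forward(("walk", "shop", "clean"), states, start_p, trans_p, emit_p)
```

The same machinery (with the Viterbi variant) underlies the speech-recognition and DNA-sequencing applications listed above.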
Absorbing Markov Chains
Contains at least one absorbing state (once entered, never left)
Other states are transient (left behind eventually, with probability 1)
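Absorbing chains support exact answers via the fundamental matrix N = (I − Q)⁻¹, where Q is the transition matrix restricted to the transient states. A minimal sketch, using the classic gambler's-ruin walk on states 0..3 (0 and 3 absorbing, fair coin) as an assumed example; the 2×2 inverse is done by hand to stay dependency-free.

```python
# Transient states {1, 2}; Q restricted to them for a fair random walk.
Q = [[0.0, 0.5],
     [0.5, 0.0]]

def fundamental_2x2(Q):
    """N = (I - Q)^(-1); N[i][j] = expected visits to transient
    state j when starting from transient state i."""
    a, b = 1 - Q[0][0], -Q[0][1]
    c, d = -Q[1][0], 1 - Q[1][1]
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

N = fundamental_2x2(Q)
# Expected number of steps until absorption = row sums of N.
expected_steps = [sum(row) for row in N]
```

For this walk both transient states take 2 steps to absorption on average, matching the known gambler's-ruin formula i·(n − i).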
Related Concepts
Chaos Theory
Tiny changes → Big effects (Butterfly effect)
Garkov
Garfield comic + Markov text generation
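Garkov-style text generation is just a Markov chain over words: record which word follows which, then walk the chain. A minimal sketch with a made-up corpus (the sample text is an assumption, not actual Garfield dialogue).

```python
import random
from collections import defaultdict

def build_chain(text):
    """Bigram chain: the next word depends only on the current word."""
    chain = defaultdict(list)
    words = text.split()
    for cur, nxt in zip(words, words[1:]):
        chain[cur].append(nxt)
    return chain

def generate(chain, start, length, seed=0):
    """Random walk over the word chain, starting from `start`."""
    rng = random.Random(seed)
    word, out = start, [start]
    for _ in range(length - 1):
        followers = chain.get(word)
        if not followers:
            break
        word = rng.choice(followers)
        out.append(word)
    return " ".join(out)

corpus = "I hate Mondays I love lasagna I hate diets"
sentence = generate(build_chain(corpus), "I", 6)
```

Because duplicated transitions appear multiple times in the follower lists, frequent word pairs in the corpus are sampled proportionally more often.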
PHYLO Game:
Humans outperform AI in genetic alignment tasks