Uncertainty, Probability, and Markov Chains (Coggle Diagram)
Uncertainty
Where it appears:
Predictions about the future (e.g., will it rain tomorrow?).
Rational Decision-Making
Example:
Should you leave 2 hours, 4 hours, or 24 hours early to catch your flight?
The answer depends on balancing the risk of missing the flight against the cost of waiting too long at the airport.
Probability Theory
Definition:
Probability measures how likely an event is, on a scale from 0 (impossible) to 1 (certain).
Important Note:
Probability reflects our state of knowledge, not the world itself.
Example:
If you believe there is an 80% chance a toothache is due to a cavity, the tooth in reality either has a cavity or it does not; the 80% describes your uncertainty, not a property of the tooth.
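The cavity example can be sketched numerically with Bayes' rule; all the probabilities below (the prior and the two likelihoods) are illustrative assumptions, not values from the diagram.

```python
# Hypothetical numbers: updating a belief about a cavity after observing
# a toothache, using Bayes' rule.
p_cavity = 0.2                    # assumed prior belief in a cavity
p_tooth_given_cavity = 0.9        # assumed: toothache likely given a cavity
p_tooth_given_no_cavity = 0.05    # assumed: toothache rare otherwise

# Total probability of observing a toothache.
p_toothache = (p_tooth_given_cavity * p_cavity
               + p_tooth_given_no_cavity * (1 - p_cavity))

# Posterior: updated belief in a cavity given the toothache.
p_cavity_given_tooth = p_tooth_given_cavity * p_cavity / p_toothache
print(round(p_cavity_given_tooth, 3))  # prints 0.818
```

Observing evidence changes the belief (0.2 to about 0.82) even though the physical state of the tooth never changed, which is exactly the point of the note above.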
Utility Theory
Definition:
Utility assigns a numerical degree of usefulness to each outcome, representing the agent's preferences.
Example:
Choosing between waiting 4 hours at the airport (low utility) and risking missing the flight (even lower utility).
Decision Theory
Definition:
Decision theory combines probability theory with utility theory: a rational agent chooses the action that maximizes expected utility.
Example:
Based on the probability of traffic and its preference for not missing the flight, the agent chooses when to leave.
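The flight decision can be sketched as an expected-utility calculation; every number below (each option's probability of missing the flight and the utilities of the outcomes) is a made-up assumption for illustration.

```python
# Hypothetical options: P(miss flight) and outcome utilities are assumptions.
options = {
    "leave 2 hours early":  {"p_miss": 0.30, "u_catch": 100,  "u_miss": -500},
    "leave 4 hours early":  {"p_miss": 0.05, "u_catch": 80,   "u_miss": -500},
    "leave 24 hours early": {"p_miss": 0.00, "u_catch": -200, "u_miss": -500},
}

def expected_utility(option):
    # EU = P(miss) * U(miss) + P(catch) * U(catch)
    return (option["p_miss"] * option["u_miss"]
            + (1 - option["p_miss"]) * option["u_catch"])

best = max(options, key=lambda name: expected_utility(options[name]))
print(best)  # prints "leave 4 hours early"
```

With these numbers, leaving 4 hours early wins: leaving 2 hours early risks too much, and leaving 24 hours early wastes so much time at the airport that its utility is low even though the flight is never missed.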
Markov Model
Definition:
A stochastic model in which the probability of the next state depends only on the current state, not on the history of how that state was reached (the Markov property).
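A minimal Markov chain can be sketched as a transition table plus a sampling step; the two weather states and their transition probabilities below are illustrative assumptions, not part of the diagram.

```python
import random

# Assumed transition probabilities: transitions[s][s'] = P(next = s' | current = s).
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Sample the next state using only the current state (Markov property)."""
    next_states = list(transitions[state])
    weights = [transitions[state][s] for s in next_states]
    return random.choices(next_states, weights=weights)[0]

random.seed(0)  # reproducible run
chain = ["sunny"]
for _ in range(10):
    chain.append(step(chain[-1]))
print(chain)
```

Note that `step` never looks at earlier entries of `chain`: the entire history is irrelevant once the current state is known, which is the defining property above.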