Uncertainty, Probability & Markov Chains
Rational Decision Making
Uncertainty complicates even simple decisions (e.g., deciding when to leave for the airport).
Probability Theory
Definition: Probability measures how likely an event is, on a scale from 0 (impossible) to 1 (certain).
Application: Believing there is a 0.8 probability of a cavity doesn't mean the cavity exists or not; it reflects the agent's current degree of belief given the evidence.
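Because a degree of belief and its complement must sum to 1, the cavity example can be written out directly (the 0.8 figure comes from the example above):

    0 <= P(a) <= 1                       (probabilities live on [0, 1])
    P(cavity) = 0.8
    P(no cavity) = 1 - P(cavity) = 0.2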
Utility Theory
Example: Choosing between waiting 4 hours at the airport (less utility) vs. risking missing a flight (even lower utility).
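In utility-theoretic terms, each action a is scored by its expected utility over outcomes o; a minimal formalization of the airport trade-off (standard notation, assumed here rather than taken from the diagram):

    EU(a) = Σ_o P(o | a) · U(o)      (expected utility of action a)
    a* = argmax_a EU(a)              (a rational agent picks a*)

In the airport example, U(missed flight) < U(4-hour wait) < U(short wait), so which departure time maximizes EU depends on the traffic probabilities.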
Decision Theory
Definition: Decision theory combines probability theory (what the agent believes) with utility theory (what the agent prefers); a rational agent chooses the action with the highest expected utility.
Example: Based on the probability of traffic and a personal preference for not missing flights, the agent chooses when to leave (see the sketch below).
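A minimal Python sketch of that choice, picking the departure time with the highest expected utility; every probability and utility value below is an assumption invented for illustration:

    # Each action maps to its possible outcomes as (probability, utility) pairs.
    # All numbers are illustrative, not from the diagram.
    actions = {
        "leave_4h_early": [(1.0, -10)],             # certain long airport wait
        "leave_1h_early": [(0.8, 0), (0.2, -100)],  # 20% heavy traffic => missed flight
    }

    def expected_utility(outcomes):
        return sum(p * u for p, u in outcomes)

    # Decision theory in miniature: maximize expected utility.
    best = max(actions, key=lambda a: expected_utility(actions[a]))
    print(best)  # leave_4h_early: EU of -10 beats 0.8*0 + 0.2*(-100) = -20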
Markov Model
Definition: the Markov property states that the future state depends only on the present state, not on the past.
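Formally, for a sequence of states X_0, X_1, X_2, ..., the Markov property reads:

    P(X_{t+1} = x | X_t, X_{t-1}, ..., X_0) = P(X_{t+1} = x | X_t)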
Markov Chain
Definition: a Markov chain is a sequence of random states in which every transition obeys the Markov property.
Components
Set of States: e.g., Rainy, Sunny
Transition Probabilities: the probability of moving from each state to each other state, e.g., P(Sunny tomorrow | Rainy today)
Initial State Distribution: the probability of starting in each state
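A short Python sketch tying the components together, using the Rainy/Sunny states from above; the transition probabilities, initial distribution, and chain length are assumptions chosen for illustration:

    import random

    # Components of the chain (all numbers are illustrative):
    transition = {
        "Rainy": {"Rainy": 0.6, "Sunny": 0.4},   # each row sums to 1
        "Sunny": {"Rainy": 0.3, "Sunny": 0.7},
    }
    initial = {"Rainy": 0.5, "Sunny": 0.5}       # initial state distribution

    def sample(dist):
        # Draw one state from a {state: probability} distribution.
        return random.choices(list(dist), weights=list(dist.values()))[0]

    # Simulate 10 steps: each next state depends only on the current
    # state (the Markov property), never on the earlier history.
    state = sample(initial)
    walk = [state]
    for _ in range(9):
        state = sample(transition[state])
        walk.append(state)
    print(" -> ".join(walk))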