Uncertainty: Probability and Markov chains - Coggle Diagram
Uncertainty
Belief state = representation of all the possible world states the agent might be in
Problem-solving and logical agents handle uncertainty by keeping track of a belief state
They make contingency plans for every eventuality
BUT drawback - this requires too much logic: belief states and contingency plans can grow impractically large
Rational decision
The “right” thing to do
Depends on the relative importance of various goals AND the likelihood of achieving them
Probability theory
Probability
provides a way of summarising the uncertainty that comes from our laziness and ignorance
Sample space = all possible outcomes
Logic and probability
SAME - both regard the world as composed of facts. DIFFERENT - logic assigns each fact TRUE/FALSE, whereas probability assigns a degree of belief from 0 to 1
Decision theory
(Decision theory = probability theory + utility theory)
Markov Chains
(similar to the state sequences produced by FSMs, EXCEPT that each transition has a probability associated with it, rather than a definite input/output causation.)
[http://www.statslab.cam.ac.uk/~rrw1/markov/index.html]
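The "probabilistic FSM" idea above can be sketched in a few lines. This is a minimal illustration, not from the diagram: the states (Sunny/Rainy) and transition probabilities are invented for the example, echoing the Weather entry listed further down.

```python
import random

# A two-state weather Markov chain. Unlike an FSM, the next state is
# not determined by an input: it is sampled from a probability
# distribution that depends only on the current state.
TRANSITIONS = {
    "Sunny": [("Sunny", 0.8), ("Rainy", 0.2)],
    "Rainy": [("Sunny", 0.4), ("Rainy", 0.6)],
}

def step(state):
    """Sample the next state from the current state's distribution."""
    states, probs = zip(*TRANSITIONS[state])
    return random.choices(states, weights=probs)[0]

def walk(start, n):
    """Generate a sequence of n states starting from `start`."""
    seq = [start]
    for _ in range(n - 1):
        seq.append(step(seq[-1]))
    return seq

print(walk("Sunny", 5))
```

Running `walk` repeatedly gives different sequences, which is exactly the contrast with a deterministic FSM.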
Andrey Markov
(Markov chain - a mathematical system that undergoes transitions from one state to another in a state space, where the next state depends only on the current state, not on the history.)
Absorbing Markov chain
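An absorbing Markov chain has at least one state that, once entered, is never left. A hypothetical sketch (the gambler's-ruin states and rules below are assumptions for illustration, not from the diagram):

```python
import random

# Gambler's ruin on states 0..4: states 0 and 4 are absorbing
# (once entered, never left); from any interior state the walker
# moves -1 or +1 with equal probability.
ABSORBING = {0, 4}

def run_until_absorbed(state, rng):
    """Simulate the chain until it hits an absorbing state."""
    steps = 0
    while state not in ABSORBING:
        state += rng.choice([-1, 1])
        steps += 1
    return state, steps

end, steps = run_until_absorbed(2, random.Random(0))
```

Every run terminates with probability 1, because from any interior state there is a positive-probability path into an absorbing state.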
Hidden Markov Model
Examples
Blood pressure
Stock market fluctuations
Weather
Stochastic process
Indeterminate outcome
Represents evolution of system over time
Collection of random variables
Applications - statistical models of real world
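The points above (indeterminate outcome, evolution over time, collection of random variables) can be made concrete with a simple symmetric random walk, a standard example of a stochastic process; the function below is an illustrative sketch, not part of the diagram:

```python
import random

# A stochastic process is an indexed collection of random variables
# X_0, X_1, ...  Here X_t is a symmetric random walk: each call
# produces one possible sample path of the system's evolution.
def sample_path(n, seed=None):
    rng = random.Random(seed)
    x, path = 0, [0]
    for _ in range(n):
        x += rng.choice([-1, 1])  # each increment is a random variable
        path.append(x)
    return path
```

Calling `sample_path(10)` twice generally gives two different paths: the outcome is indeterminate even though the generating rule is fixed.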
Chaos Theory
(By contrast, based on deterministic processes at the micro level - apparent randomness emerges from sensitivity to initial conditions.)
Garkov
(Generates Garfield comic dialogue by extracting text from existing strips and recombining it using a math/logic process called a Markov chain.)
PHYLO
(A crowdsourced sequence-alignment game: around 3,000 regular players produced 350,000 solutions to various alignment problems, beating the accuracy of existing alignments.)