Uncertainty: Probability and Markov Chains
Uncertainty
Problem-solving and logical agents keep track of a belief state, a representation of all possible states
But a purely logical approach quickly becomes unwieldy: the belief state must cover every contingency, however unlikely
A rational decision depends on both the relative importance of the goals and the likelihood of achieving them
Markov Chains
Similar to the sequences prescribed by finite state machines (FSMs)
Introduced by Andrey Markov
A Markov chain describes a system that undergoes transitions from one state to another in a state space
The Markov property: the next state depends only on the current state, not on the sequence of states that preceded it
A random process is a collection of random variables indexed by time
A random process with the Markov property is a Markov process
A Markov chain is a Markov process with discrete time and discrete state space
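To make the definition concrete, here is a minimal sketch of a discrete-time, discrete-state Markov chain in Python; the two weather states and their transition probabilities are invented purely for illustration, not taken from the source.

```python
import random

# A minimal Markov chain sketch: discrete time, discrete state space.
# The weather states and transition probabilities below are illustrative
# assumptions, not values from the source material.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(current):
    """Sample the next state using only the current state (the Markov property)."""
    states = list(TRANSITIONS[current])
    weights = [TRANSITIONS[current][s] for s in states]
    return random.choices(states, weights=weights, k=1)[0]

def simulate(start, steps):
    """Build a chain by applying one-step transitions repeatedly."""
    chain = [start]
    for _ in range(steps):
        chain.append(next_state(chain[-1]))
    return chain

print(simulate("sunny", 10))
```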
Markov Chain Usage Examples
Typing word predictor
Markov Chain Monte Carlo
Garkov
Random Walk
Google PageRank (see the sketch below)
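As one concrete case from the list above, the sketch below treats PageRank as the stationary distribution of a random surfer's Markov chain and estimates it by power iteration; the three-page link graph and the 0.85 damping factor are assumptions chosen for illustration.

```python
# PageRank as a Markov chain: a random surfer follows links, and a page's
# rank is its long-run visit probability. The tiny link graph and the 0.85
# damping factor are illustrative assumptions, not real data.
LINKS = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
}
DAMPING = 0.85

def pagerank(links, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}        # start uniform
    for _ in range(iterations):
        new_rank = {p: (1 - DAMPING) / len(pages) for p in pages}
        for page, outgoing in links.items():
            share = DAMPING * rank[page] / len(outgoing)
            for target in outgoing:                    # spread rank along links
                new_rank[target] += share
        rank = new_rank
    return rank

print(pagerank(LINKS))
```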
Probability
Probability Theory is a way of summarizing uncertainty
Logic assigns True/False, while probability assigns a degree of belief between 0 and 1
Decision Theory = Probability Theory + Utility Theory (see the sketch below)
Simply how likely something is to happen
It is the numerical measure of the likelihood that an event will occur
The value of a probability always lies between 0 and 1
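A small numerical sketch of "Decision Theory = Probability Theory + Utility Theory": probabilities (degrees of belief between 0 and 1) weight utilities (the importance of outcomes), and the rational choice maximizes expected utility. The actions, probabilities, and utilities below are made up for illustration.

```python
# Decision Theory = Probability Theory + Utility Theory:
# weight each outcome's utility by its probability and pick the action
# with the highest expected utility. All numbers are illustrative.
ACTIONS = {
    "take umbrella":  [(0.3, 60), (0.7, 80)],   # (probability, utility) pairs
    "leave umbrella": [(0.3, 0),  (0.7, 100)],  # rain is costly without one
}

def expected_utility(outcomes):
    return sum(p * u for p, u in outcomes)

for action, outcomes in ACTIONS.items():
    print(action, expected_utility(outcomes))

best = max(ACTIONS, key=lambda a: expected_utility(ACTIONS[a]))
print("rational choice:", best)
```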
Chaos Theory
Chaos theory explores how small occurrences, such as tiny changes in initial conditions, can dramatically affect the outcomes of seemingly unrelated events.
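A minimal numerical illustration of that sensitivity, using the logistic map x_{n+1} = r * x_n * (1 - x_n) with r = 4, a standard chaotic regime; the map, parameter, and starting values are assumptions for illustration. Two starting points differing by one part in a billion quickly produce very different trajectories.

```python
# Sensitivity to initial conditions via the logistic map x -> r*x*(1-x), r = 4.
# The two starting values differ by only 1e-9, yet the trajectories diverge.
def logistic_trajectory(x, r=4.0, steps=30):
    values = [x]
    for _ in range(steps):
        x = r * x * (1 - x)
        values.append(x)
    return values

a = logistic_trajectory(0.200000000)
b = logistic_trajectory(0.200000001)
for step in (0, 10, 20, 30):
    print(step, round(a[step], 6), round(b[step], 6))
```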
Web links
http://setosa.io/blog/2014/07/26/markov-chains/
http://en.wikipedia.org/wiki/Brownian_motion
http://upload.wikimedia.org/wikipedia/commons/c/cb/Random_walk_25000.svg
https://www.investopedia.com/ask/answers/08/chaos-theory.asp
http://www.nature.com/nbt/journal/v22/n10/full/nbt1004-1315.html
http://www.fejes.ca/easyhmm.html