Uncertainty
Markov Chains: A Markov chain is a mathematical model of a system consisting of random variables that transition from one state to another according to probabilistic rules. It is characterized by the Markov property: the probability of moving to the next state depends only on the current state, not on the sequence of states that preceded it.
Examples: Weather, Kitten Behavior, Stock Market, Blood Pressure
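The weather example above can be sketched as a small simulation. The two states and their transition probabilities below are illustrative assumptions, not data from the source; the key point is that each step looks only at the current state.

```python
import random

# Hypothetical two-state weather chain; the probabilities are assumptions
# chosen only to illustrate the Markov property.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Sample the next state using only the current state (Markov property)."""
    states = list(TRANSITIONS[state])
    weights = [TRANSITIONS[state][s] for s in states]
    return rng.choices(states, weights=weights, k=1)[0]

def simulate(start, n_steps, seed=0):
    """Generate a state sequence of length n_steps + 1 from `start`."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1], rng))
    return path

print(simulate("sunny", 5))
```

Note that `simulate` never inspects earlier history: the entire dynamics is captured by the transition table, which is exactly what makes the chain Markovian.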
Hidden Markov Model (HMM): Applications in Speech Recognition, Google PageRank, Bioinformatics
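In an HMM the states themselves are hidden and only emissions are observed, as in speech recognition where audio features are observed but phonemes are not. A minimal sketch of Viterbi decoding (the standard algorithm for recovering the most likely hidden sequence) follows; the weather/umbrella model and all probabilities are toy assumptions, not from the source.

```python
# Toy HMM: hidden weather states, observed umbrella use.
# All probabilities are illustrative assumptions.
STATES = ["rainy", "sunny"]
START = {"rainy": 0.6, "sunny": 0.4}
TRANS = {"rainy": {"rainy": 0.7, "sunny": 0.3},
         "sunny": {"rainy": 0.4, "sunny": 0.6}}
EMIT = {"rainy": {"umbrella": 0.9, "none": 0.1},
        "sunny": {"umbrella": 0.2, "none": 0.8}}

def viterbi(observations):
    """Return the most likely hidden state sequence for the observations."""
    # prob[s]: probability of the best path ending in state s so far.
    prob = {s: START[s] * EMIT[s][observations[0]] for s in STATES}
    path = {s: [s] for s in STATES}
    for obs in observations[1:]:
        new_prob, new_path = {}, {}
        for s in STATES:
            best_prev = max(STATES, key=lambda p: prob[p] * TRANS[p][s])
            new_prob[s] = prob[best_prev] * TRANS[best_prev][s] * EMIT[s][obs]
            new_path[s] = path[best_prev] + [s]
        prob, path = new_prob, new_path
    return path[max(STATES, key=prob.get)]

print(viterbi(["umbrella", "umbrella", "none"]))  # ['rainy', 'rainy', 'sunny']
```

The same dynamic-programming idea, scaled up, is what speech recognizers use to decode a phoneme sequence from acoustic observations.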
Chaos Theory: the uncertainty in a forecast grows exponentially with elapsed time. Mathematically, if the proportional uncertainty grows like e^(λt), then doubling the forecast time squares it, since e^(2λt) = (e^(λt))^2.
Applications: Drugs, Forecasting
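The exponential growth of forecast uncertainty can be seen in the logistic map, a standard toy chaotic system (my choice of example; the parameter r = 4 puts it in the chaotic regime). Two trajectories starting a tiny distance apart diverge rapidly:

```python
# Logistic map x_{n+1} = r * x * (1 - x); at r = 4 it is chaotic, so a
# tiny initial perturbation grows roughly exponentially until it saturates.
def logistic(x, n_steps, r=4.0):
    for _ in range(n_steps):
        x = r * x * (1 - x)
    return x

a, b = 0.2, 0.2 + 1e-10  # two almost identical initial conditions
print(abs(logistic(a, 10) - logistic(b, 10)))  # still small
print(abs(logistic(a, 30) - logistic(b, 30)))  # comparable to the state itself
```

This sensitivity to initial conditions is why long-range forecasts (weather, drug concentrations in nonlinear pharmacokinetic models) lose precision so quickly.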
Probabilistic Systems: Probability theory offers a mathematical foundation for understanding and measuring uncertainty. Even when we lack complete knowledge or information, it lets us express how likely events are to occur, so that by assigning probabilities to different outcomes we can make better predictions and decisions in ambiguous circumstances.
Rational decision: Probability theory gives decisions under uncertainty a numerical footing. By assigning probabilities to the possible outcomes of each action, we can compare actions and choose rationally even with incomplete information.
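One standard way to operationalize this is expected utility: weight each outcome's utility by its probability and pick the action with the highest total. The umbrella scenario and all numbers below are illustrative assumptions of mine, not from the source.

```python
# Rational choice under uncertainty via expected utility.
def expected_utility(outcomes):
    """outcomes: list of (probability, utility) pairs for one action."""
    return sum(p * u for p, u in outcomes)

# Hypothetical example: 30% chance of rain.
take_umbrella  = [(0.3, 8), (0.7, 6)]   # dry either way, mildly inconvenient
leave_umbrella = [(0.3, 0), (0.7, 10)]  # soaked if it rains, free hands if not

actions = {"take": take_umbrella, "leave": leave_umbrella}
best = max(actions, key=lambda a: expected_utility(actions[a]))
print(best, expected_utility(actions[best]))
```

With these numbers leaving the umbrella wins (7.0 vs 6.6); shift the rain probability upward and the rational choice flips, which is the point: the decision is driven by the probabilities, not by a single guessed outcome.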
Stochastic process
Examples: the growth of a bacterial population, the fluctuating electrical current caused by thermal noise, and the motion of a gas molecule.
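The simplest stochastic process to write down is a random walk, a discrete-time analogue of the gas-molecule motion mentioned above (the example is mine): each increment is an independent random ±1 step.

```python
import random

# A symmetric random walk: a discrete-time stochastic process whose
# increments are independent, identically distributed +/-1 steps.
def random_walk(n_steps, seed=42):
    rng = random.Random(seed)
    position, path = 0, [0]
    for _ in range(n_steps):
        position += rng.choice([-1, 1])
        path.append(position)
    return path

print(random_walk(10))
```

Each run of the process (for a fixed seed) is one sample path; a stochastic process is precisely this family of random trajectories indexed by time.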