Uncertainty: Probability and Markov Chains
Uncertainty
Problem-solving agents use belief states and logic to make decisions.
A belief state represents the set of world states the agent considers possible.
Agents create contingency plans for different scenarios.
Relying on logic alone can make belief states and contingency plans unmanageably large.
Effective management of beliefs and planning is key for agents.
Probability
A measure of the likelihood that a specific event will occur, expressed as a number between 0 and 1.
Probability Theory
Mathematical framework for analyzing random phenomena
Studies uncertainty and likelihood of events
Utilizes concepts like probability distributions and random variables
Fundamental in fields like statistics, machine learning, and physics
Used to model and predict outcomes in various scenarios
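As a minimal sketch of these ideas, the Python snippet below uses a fair six-sided die (an assumed example) as a discrete random variable, writes out its probability distribution, and computes its expected value both exactly and by sampling.
```python
import random

# A discrete random variable: the outcome of a fair six-sided die.
# Its probability distribution assigns 1/6 to each face.
faces = [1, 2, 3, 4, 5, 6]
distribution = {face: 1 / 6 for face in faces}

# Expected value: each outcome weighted by its probability.
expected_value = sum(face * p for face, p in distribution.items())
print(expected_value)  # 3.5

# Estimate the same quantity empirically by sampling.
samples = [random.choice(faces) for _ in range(100_000)]
print(sum(samples) / len(samples))  # close to 3.5
```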
Decision Theory
Decision Theory = Probability theory + utility theory
Decision theory studies rational decision-making by agents.
It underpins advances in fields such as machine learning and artificial intelligence.
Focuses on understanding how decisions are made.
Examines how multiple decisions impact each other.
Investigates strategies for dealing with uncertainty in decision-making.
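A worked sketch of the "probability theory + utility theory" idea, using made-up probabilities and utilities for an umbrella decision: the rational choice is the action with the highest expected utility.
```python
# Hypothetical example: an agent chooses between two actions by maximising
# expected utility, combining outcome probabilities with outcome utilities.
actions = {
    "take_umbrella":  [(0.3, 60), (0.7, 70)],   # (P(outcome), utility)
    "leave_umbrella": [(0.3, 0),  (0.7, 100)],  # 0.3 = assumed chance of rain
}

def expected_utility(outcomes):
    return sum(p * u for p, u in outcomes)

for action, outcomes in actions.items():
    print(action, expected_utility(outcomes))

best = max(actions, key=lambda a: expected_utility(actions[a]))
print("rational choice:", best)
```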
Rational Decision
Logic-based decision-making process
Utilizes reasoning and analysis
Considers available options and their outcomes
Aims to optimize outcomes based on predefined criteria
Incorporates data-driven approaches and algorithms
Web Links
https://www.geeksforgeeks.org/probability-theory/
https://www.geeksforgeeks.org/hidden-markov-model-in-machine-learning/
https://www.vedantu.com/maths/stochastic-process
https://www.geeksforgeeks.org/markov-chain/
Andrey Markov
Andrey Andreyevich Markov was a Russian mathematician best known for his work on stochastic processes.
Markov Chain
Stationary Distribution: Long-term probabilities remain constant.
Types of States: States can be transient, recurrent, or absorbing.
Transition Probabilities: Represented in a transition matrix.
State Space: Consists of a finite or countable set of states.
Memoryless Property: Future state depends only on the current state.
Markov chains, named after Andrey Markov, are stochastic models that describe a sequence of possible events in which the probability of the next state depends only on the current state, not on the states that came before it.
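A minimal Python sketch of these properties, assuming a toy two-state weather chain: the transition matrix encodes the memoryless next-state probabilities, and the stationary distribution is recovered as the left eigenvector of the matrix with eigenvalue 1.
```python
import numpy as np

# Toy two-state chain (states and probabilities are assumptions).
states = ["sunny", "rainy"]
P = np.array([[0.9, 0.1],    # transition matrix: row = current state,
              [0.5, 0.5]])   # column = next state

# Memoryless property: the next state is sampled from the current row only.
rng = np.random.default_rng(0)
state = 0
for _ in range(5):
    state = rng.choice(2, p=P[state])
    print(states[state])

# Stationary distribution: pi such that pi @ P == pi
# (left eigenvector of P with eigenvalue 1, normalised to sum to 1).
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.isclose(eigvals, 1)][:, 0])
pi /= pi.sum()
print(dict(zip(states, pi)))  # ~{'sunny': 0.833, 'rainy': 0.167}
```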
Hidden Markov Model
DNA/Genome sequencing
Speech processing/recognition software
Population genetic drift
Page ranking on Google
A hidden Markov model (HMM) is a statistical model that describes the probabilistic relationship between a sequence of observations and a sequence of hidden states.
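A minimal sketch of the forward algorithm for an HMM, with assumed start, transition, and emission probabilities; it computes the likelihood of an observation sequence by summing over all hidden-state paths.
```python
import numpy as np

# Hidden states: 0 = "rainy", 1 = "sunny"; observations: 0 = "walk", 1 = "shop".
# All numbers below are illustrative assumptions.
start = np.array([0.6, 0.4])            # initial hidden-state probabilities
trans = np.array([[0.7, 0.3],           # P(next hidden state | current state)
                  [0.4, 0.6]])
emit  = np.array([[0.1, 0.9],           # P(observation | hidden state)
                  [0.6, 0.4]])

def forward(observations):
    """Return P(observation sequence) by summing over all hidden paths."""
    alpha = start * emit[:, observations[0]]
    for obs in observations[1:]:
        alpha = (alpha @ trans) * emit[:, obs]
    return alpha.sum()

print(forward([0, 1, 0]))  # likelihood of the observation sequence
```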
Stochastic Process
A stochastic process, also known as a random process, is a collection of random variables that are indexed by some mathematical set.
Bernoulli Process
It is a sequence of independent and identically distributed (iid) random variables, each of which takes the value one with probability p and the value zero with probability 1 − p.
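A short sketch of a Bernoulli process, assuming p = 0.3 purely for illustration:
```python
import random

def bernoulli_process(p, n):
    """Sketch: n iid Bernoulli(p) draws -- 1 with probability p, else 0."""
    return [1 if random.random() < p else 0 for _ in range(n)]

print(bernoulli_process(0.3, 10))  # e.g. [0, 1, 0, 0, 0, 1, 0, 0, 1, 0]
```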
Wiener Process
The Wiener process is a stochastic process with stationary, independent increments that are normally distributed, with variance proportional to the length of the time increment.
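A short sketch that approximates a Wiener process path by summing independent Gaussian increments, each with variance equal to an assumed time step dt:
```python
import random

def wiener_path(n_steps, dt=0.01):
    """Sketch of a Wiener process path: independent Gaussian increments,
    each with mean 0 and variance equal to the time step dt."""
    w, path = 0.0, [0.0]
    for _ in range(n_steps):
        w += random.gauss(0.0, dt ** 0.5)  # increment ~ N(0, dt)
        path.append(w)
    return path

print(wiener_path(5))
```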
Random Walk
Random walks are stochastic processes that are typically defined as sums of iid random variables or random vectors in Euclidean space, implying that they are discrete-time processes.
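A short sketch of a simple symmetric random walk as a running sum of iid ±1 steps:
```python
import random

def random_walk(n_steps):
    """Sketch: discrete-time random walk as a running sum of iid +/-1 steps."""
    position, positions = 0, [0]
    for _ in range(n_steps):
        position += random.choice([-1, 1])
        positions.append(position)
    return positions

print(random_walk(10))
```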
Chaos Theory
Branch of mathematics
Focuses on complex systems
Behavior sensitive to slight changes in initial conditions
Results in seemingly random and unpredictable outcomes
Systems are deterministic in nature
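A small sketch illustrating sensitivity to initial conditions with the logistic map (r = 4, a commonly used chaotic setting): two trajectories that start 10⁻⁶ apart diverge within a few iterations, even though the update rule is fully deterministic.
```python
# Logistic map: x_{t+1} = r * x_t * (1 - x_t); deterministic, yet trajectories
# starting from nearly identical initial conditions separate rapidly.
def logistic_trajectory(x0, r=4.0, steps=20):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.200000)
b = logistic_trajectory(0.200001)   # initial condition shifted by 1e-6
for t, (x, y) in enumerate(zip(a, b)):
    print(t, round(abs(x - y), 6))  # the gap grows quickly with t
```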