Markov Models - Coggle Diagram
Markov Models
Applications
Bioinformatics
DNA Sequences
Molecular Prediction
Speech recognition
Word Recognition
Phoneme Probabilities
Finance
Market Behavior
Interest / Exchange Rates
Stock Prices
Control systems
Robot/Autonomous Vehicle Control
Behavior Modeling
NLP
Text Generation
Word/Phrase Probabilities
Weather forecasting
Weather Pattern Probabilities
Future Weather Predictions
Sports analytics
Event Probabilities
Game Outcome Prediction
Epidemiology
Disease Spread Modeling
Intervention Impact
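One of the applications above, NLP text generation from word/phrase probabilities, can be sketched as a tiny bigram Markov chain. The corpus and all words here are made-up illustrations; any tokenized text would work the same way.

```python
import random
from collections import defaultdict

# Hypothetical toy corpus; real applications would use a large text collection.
corpus = "the cat sat on the mat the dog sat on the rug".split()

# Record bigram transitions: the observed successors of each word stand in
# for P(next word | current word).
transitions = defaultdict(list)
for current, nxt in zip(corpus, corpus[1:]):
    transitions[current].append(nxt)

def generate(start, length, seed=0):
    """Sample a word sequence by repeatedly drawing a successor of the
    current word — the next word depends only on the current one."""
    rng = random.Random(seed)
    words = [start]
    for _ in range(length - 1):
        options = transitions.get(words[-1])
        if not options:  # dead end: this word was never followed by anything
            break
        words.append(rng.choice(options))
    return " ".join(words)

print(generate("the", 6))
```

Sampling uniformly from the list of observed successors is equivalent to sampling from the empirical bigram distribution, since repeated successors appear multiple times in the list.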
Hidden Markov Models
Statistical models
Sequential data with hidden states
Variables
Observed
Hidden (latent)
Assumption
System is a Markov process
System evolves over time
Goal
Estimate
Model parameters
Emission distributions
Transition matrix
Sequence of hidden states generating observed data
Estimation methods
Forward-backward algorithm
Baum-Welch algorithm
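The forward pass underlying both estimation methods above can be sketched in a few lines. The 2-state HMM below (with two observable symbols) and all its probabilities are purely illustrative.

```python
import numpy as np

# Hypothetical 2-state HMM (e.g. 0 = rainy, 1 = sunny) emitting two
# observable symbols (0 = umbrella, 1 = no umbrella). Numbers are made up.
pi = np.array([0.5, 0.5])             # initial state distribution
A = np.array([[0.7, 0.3],             # transition matrix: A[i, j] = P(j | i)
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],             # emission matrix: B[i, k] = P(symbol k | state i)
              [0.2, 0.8]])

def forward(obs):
    """Forward algorithm: total probability of an observation sequence,
    summed over all possible hidden-state paths."""
    alpha = pi * B[:, obs[0]]             # alpha_1(i) = pi_i * b_i(o_1)
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]     # alpha_t = (alpha_{t-1} A) * b(o_t)
    return alpha.sum()

p = forward([0, 0, 1])
```

Baum-Welch wraps this forward pass (plus a symmetric backward pass) in an expectation-maximization loop to re-estimate `pi`, `A`, and `B` from data.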
Uncertainty
Sources
Data Uncertainty
Model Uncertainty
Environmental Uncertainty
Techniques to address uncertainty
Uncertainty quantification
Bayesian inference
Probabilistic modeling
Benefits
More informed decisions
More reliable decisions
Improved performance of AI systems
Types
Discrete-time models
Markov Chain
State Space
Set of All Possible States
Coin Example
{heads, tails}
Size and Complexity varies
Small and Easily Enumerable
Infinite or High-Dimensional
Essential Component of Markov Chains
Transition Probabilities
Probability of transitioning
One state to another
Set of matrices P(1), P(2), ...
Derived from System model
Stationary distribution
Properties
Uniqueness
Conservation
Invariance
Limiting behavior
Methods
Solving a system of linear equations
Power iteration method
Remains unchanged over time
Solving linear equations
Empirical estimation
System's long-term behavior
Conditions
Non-periodicity
No drift to infinity
Communication
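The power iteration method listed above can be sketched directly: repeatedly apply the transition matrix to a distribution until it stops changing, which is exactly the invariance property. The 3-state matrix below is an illustrative ergodic chain, not taken from any particular system.

```python
import numpy as np

# Illustrative 3-state transition matrix (each row sums to 1).
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])

def stationary_power_iteration(P, tol=1e-12, max_iter=10_000):
    """Iterate pi <- pi P from a uniform start until convergence."""
    pi = np.full(P.shape[0], 1.0 / P.shape[0])
    for _ in range(max_iter):
        nxt = pi @ P
        if np.abs(nxt - pi).max() < tol:   # unchanged over time: stationary
            return nxt
        pi = nxt
    return pi

pi = stationary_power_iteration(P)
```

The alternative method from the outline, solving a system of linear equations, would instead solve pi P = pi together with the conservation constraint that pi sums to 1.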
Absorbing Markov Chains
Once an absorbing state is entered, it is never left
Applications
Modeling infectious disease spread
Modeling financial markets
System reaches final states
Absorbing states
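For an absorbing chain, the standard fundamental-matrix calculation gives the expected time to reach a final state. The 3-state chain below (two transient states, one absorbing) is a made-up example.

```python
import numpy as np

# Illustrative absorbing chain: states 0 and 1 are transient, state 2 is
# absorbing (its row keeps all probability on itself forever).
P = np.array([[0.5, 0.3, 0.2],
              [0.4, 0.4, 0.2],
              [0.0, 0.0, 1.0]])

Q = P[:2, :2]   # transient-to-transient block
R = P[:2, 2:]   # transient-to-absorbing block

# Fundamental matrix N = (I - Q)^(-1): N[i, j] is the expected number of
# visits to transient state j when starting from transient state i.
N = np.linalg.inv(np.eye(2) - Q)

expected_steps = N.sum(axis=1)   # expected steps until absorption
absorption_prob = N @ R          # probability of ending in each absorbing state
```

With a single absorbing state, every transient state is eventually absorbed, so each row of `absorption_prob` is 1.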
Continuous-time models
Markov Processes
Set of transition probabilities
Sequence of random variables
State of the system
Continuous-time version of Markov chains
Conditions
No drift to infinity
Communication
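A continuous-time Markov process can be simulated with exponentially distributed holding times, one jump at a time. The rate matrix below is hypothetical, chosen only to illustrate the mechanics.

```python
import random

# Hypothetical jump rates: rates[i][j] = rate of jumping from state i to j.
rates = {0: {1: 2.0},
         1: {0: 0.5, 2: 0.5},
         2: {1: 1.0}}

def simulate(state, t_end, seed=0):
    """Simulate one trajectory up to time t_end. Holding times are
    exponential, and the next state depends only on the current state
    (the Markov property)."""
    rng = random.Random(seed)
    t, path = 0.0, [(0.0, state)]
    while True:
        out = rates[state]
        total = sum(out.values())
        t += rng.expovariate(total)   # exponential holding time in this state
        if t >= t_end:
            return path
        # choose the next state with probability proportional to its rate
        state = rng.choices(list(out), weights=list(out.values()))[0]
        path.append((t, state))

path = simulate(0, t_end=10.0)
```

This is the continuous-time analogue of stepping a Markov chain: the transition matrix is replaced by rates, and the fixed time step by random exponential waits.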
Markov Property
Future state depends only on current state
Predictions based on current state
Properties
Memoryless
Efficient computation
Prediction of future states
Flexible and powerful framework
Model complex stochastic systems
Probabilistic Systems
Evolution of system Over Time
Mathematical Models
Probability Theory