Uncertainty: Probability and Markov Chains
Uncertainty
The absence of total assurance regarding the condition of a system or the outcome of an event is referred to as uncertainty. It arises when information is incomplete, ambiguous, unpredictable, or imprecise, making decisions difficult without probabilistic or logical reasoning.
Example: Tomorrow at 8 a.m., I have a flight from Sri Lanka.
Potential unknowns include traffic bottlenecks, fuel shortages, delayed flights, and traffic accidents.
Rational decision-making depends on the relative importance of different objectives and their chances of being achieved.
Planning: a logical agent cannot prove that these events won't happen; choosing the right departure time avoids both long waits and penalties.
In domains such as law, commerce, and medicine (e.g., dental diagnosis), diagnosis entails uncertainty.
Probability Theory
Probability values range from 0 to 1 and summarize the uncertainty that comes from ignorance and laziness.
Incorporated into machine learning, especially for decision-making and outcome prediction.
For instance, the belief that a toothache indicates an 80% chance of a cavity reflects our current state of knowledge.
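The toothache/cavity belief above can be expressed as a conditional probability obtained from Bayes' rule. A minimal sketch, with a hypothetical prior and likelihoods chosen purely for illustration so that the posterior comes out to 80%:

```python
# Hypothetical numbers (not from the source) chosen so the posterior matches
# the 80% belief mentioned in the text.
p_cavity = 0.20               # prior P(cavity)
p_tooth_given_cavity = 0.80   # likelihood P(toothache | cavity)
p_tooth_given_none = 0.05     # P(toothache | no cavity)

# Bayes' rule: P(cavity | toothache) = P(toothache | cavity) P(cavity) / P(toothache)
evidence = (p_tooth_given_cavity * p_cavity
            + p_tooth_given_none * (1 - p_cavity))
posterior = p_tooth_given_cavity * p_cavity / evidence
print(round(posterior, 2))  # 0.8
```

The posterior is a degree of belief, not a frequency: new evidence (an X-ray, say) would update it again through the same rule.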
Utility Theory
Using a function that maps outcomes to real numbers—higher values denote preferable outcomes—utility theory represents an agent's preferences over potential outcomes.
Agents use Utility Theory for decision making
Decision theory
Decision theory combines probability theory and utility theory.
A rational agent chooses the action yielding the highest expected utility.
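The maximum-expected-utility rule above can be sketched in a few lines. The actions, probabilities, and utilities below are invented for illustration (continuing the flight example):

```python
# Each action maps to a list of (probability, utility) outcome pairs.
# Values are hypothetical, for illustration only.
actions = {
    "leave_early": [(0.95, 60), (0.05, -10)],    # almost surely on time
    "leave_late":  [(0.60, 80), (0.40, -100)],   # risk missing the flight
}

def expected_utility(outcomes):
    """Expected utility = sum of probability * utility over outcomes."""
    return sum(p * u for p, u in outcomes)

# The rational agent picks the action with the highest expected utility.
best = max(actions, key=lambda a: expected_utility(actions[a]))
print(best)  # leave_early
```

Here EU(leave_early) = 56.5 beats EU(leave_late) = 8, so the agent leaves early even though leaving late has a higher best-case payoff.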
Conclusion
The Markov chain is a straightforward yet effective idea for modeling intricate real-world systems.
The Markov chain principle is used in several AI tools for a variety of purposes.
Markov Model
Origin of the Markov Chain Model
Introduced in 1906 by Andrey Markov.
Investigated dependent random sequences and demonstrated that, under certain conditions, their averages still converge.
First applied to sequences of letters in literary text.
The basis of probabilistic state modeling in AI, statistics, and operations research.
What is the Markov Chain Model?
Only the present state has a probabilistic influence on transitions (the memoryless property).
Characterized by a set of states, some of which are silent and others of which emit symbols.
Describes the likelihood that a system will change from one state to another.
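A Markov chain is commonly represented as a transition matrix whose rows sum to 1. A minimal sketch, using a made-up two-state weather chain, showing how a distribution over states evolves by one step:

```python
# Illustrative two-state chain; states and probabilities are assumptions.
states = ["Sunny", "Rainy"]
P = [
    [0.8, 0.2],  # from Sunny: P(Sunny->Sunny), P(Sunny->Rainy)
    [0.4, 0.6],  # from Rainy: P(Rainy->Sunny), P(Rainy->Rainy)
]

def step(dist, P):
    """One step of the chain: new_dist[j] = sum_i dist[i] * P[i][j]."""
    n = len(dist)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

dist = [1.0, 0.0]       # start in Sunny with certainty
dist = step(dist, P)
print(dist)             # [0.8, 0.2]
```

Repeated application of `step` gives the distribution after any number of steps; only the current distribution is needed, which is the memoryless property in action.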
Uses
Predicting, planning, and making decisions in uncertain situations.
Markov Assumption
Every state transition in the system is governed by a probability.
Transition probabilities do not change over time.
Key Principle
The next state depends only on the current state, not on the sequence of past states.
Markov Chain Applications
Used in population genetics, algorithmic music composition, asset pricing, market research, text generation, PageRank (Google search results), and customer-journey forecasting.
Definition
A stochastic (random) model for systems whose states change over time.
Example of a Markov Chain Model
Modeling cola purchases with a stochastic finite-state machine:
If the last purchase was Coke, there is a 90% likelihood that the next purchase will also be Coke; if it was Pepsi, there is an 80% chance the next will be Pepsi.
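Iterating the cola chain above shows where the purchase shares settle in the long run. A minimal sketch, taking the 90%/80% self-transition probabilities from the example (so P(Coke→Pepsi)=0.1 and P(Pepsi→Coke)=0.2):

```python
# Transition probabilities from the cola example.
P = {
    "Coke":  {"Coke": 0.9, "Pepsi": 0.1},
    "Pepsi": {"Coke": 0.2, "Pepsi": 0.8},
}

dist = {"Coke": 1.0, "Pepsi": 0.0}   # suppose the first purchase is Coke
for _ in range(100):                 # iterate until the distribution settles
    dist = {s: sum(dist[t] * P[t][s] for t in P) for s in P}

# Long-run (stationary) purchase shares: about 2/3 Coke, 1/3 Pepsi.
print(round(dist["Coke"], 3), round(dist["Pepsi"], 3))
```

The chain converges to the same 2:1 split regardless of the starting brand, which is the characteristic behavior of a stationary distribution.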
Weather example: there is a 60% chance of rain tomorrow and a 40% chance of no rain.