Markov Chain, Transition matrix
Markov Chain
Definition
A Markov chain or Markov process is a stochastic (random) model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event.
Formula
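The formula nodes here were images in the original diagram and did not survive extraction. A hedged reconstruction of the standard relations they presumably showed, using the column-vector convention of Theorem 10.4.4 below (the state vector x^{(k)} holds the probabilities of being in each state after k steps):

```latex
% Markov property: the next state depends only on the present state
\Pr(X_{k+1} = j \mid X_k = i, X_{k-1} = i_{k-1}, \dots, X_0 = i_0)
    = \Pr(X_{k+1} = j \mid X_k = i)

% Column-stochastic transition matrix:
% p_{ij} is the probability of moving from state j to state i
p_{ij} = \Pr(X_{k+1} = i \mid X_k = j), \qquad \sum_{i} p_{ij} = 1

% Evolution of the state (probability) vector
x^{(k+1)} = P x^{(k)}, \qquad \text{hence} \qquad x^{(n)} = P^{n} x^{(0)}
```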
Theorem 10.4.4
The steady-state vector q of a regular transition matrix P is the unique probability vector that satisfies the equation Pq = q.
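In practice q can be found by solving the linear system (I - P)q = 0 together with the constraint that the entries of q sum to 1, or numerically as the eigenvector of P for eigenvalue 1. A minimal sketch (not part of the original diagram; the function name steady_state is ours), assuming P is a regular column-stochastic NumPy array:

```python
import numpy as np

def steady_state(P: np.ndarray) -> np.ndarray:
    """Return the unique probability vector q with P @ q == q
    for a regular column-stochastic matrix P (Theorem 10.4.4)."""
    eigvals, eigvecs = np.linalg.eig(P)
    # A regular stochastic matrix has 1 as an eigenvalue;
    # pick the eigenvector belonging to the eigenvalue closest to 1.
    k = np.argmin(np.abs(eigvals - 1.0))
    q = np.real(eigvecs[:, k])
    # Normalize so the entries sum to 1 (this also fixes the sign).
    return q / q.sum()
```

Because P is regular, every column of P^n converges to q as n grows, so power iteration (repeatedly multiplying any starting probability vector by P) gives the same answer.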
Problem solving
Question: A country is divided into three demographic regions. It is found that each year 5% of the residents of region 1 move to region 2, and 5% move to region 3. Of the residents of region 2, 15% move to region 1 and 10% move to region 3. Of the residents of region 3, 10% move to region 1 and 5% move to region 2. What percentage of the population resides in each of the three regions after a long period of time?
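The solution nodes here were images that did not survive extraction. A worked sketch, assuming the column-stochastic convention used above (column j of P lists where the residents of region j go in one year; the percentages not accounted for stay put):

```latex
P = \begin{pmatrix}
      0.90 & 0.15 & 0.10 \\
      0.05 & 0.75 & 0.05 \\
      0.05 & 0.10 & 0.85
    \end{pmatrix}

% Steady state: solve (I - P)q = 0 with q_1 + q_2 + q_3 = 1, giving
q = \frac{1}{24}\begin{pmatrix} 13 \\ 4 \\ 7 \end{pmatrix}
  \approx \begin{pmatrix} 0.5417 \\ 0.1667 \\ 0.2917 \end{pmatrix}
```

So after a long period of time roughly 54.2% of the population resides in region 1, 16.7% in region 2, and 29.2% in region 3. Feeding this P to the steady_state sketch above reproduces the same vector numerically.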