Suppose that in a small town there are three places to eat: two restaurants, one Chinese and one Mexican, and a pizza place. Everyone in town eats dinner at one of these places or at home.
Assume that of those who eat at the Chinese restaurant, 20% go to the Mexican restaurant next time, 20% eat at home, and 30% go to the pizza place. Of those who eat at the Mexican restaurant, 10% go to the pizza place, 25% go to the Chinese restaurant, and 25% eat at home next time. Of those who eat at the pizza place, 30% … Of those who eat at home, 20% go to the Chinese restaurant, 25% go to the Mexican restaurant, and 30% go to the pizza place. In each case, the remaining percentage eats at the same place again next time. We call this situation a system. A person in the town can eat dinner in one of these four places, each of which is called a state. In our example, the system has four states. We are interested in the success of these places in terms of their business. For example, after a given period of time, what percentage of the people in town will go to the pizza place?
Suppose there is a physical or mathematical system that has $k$ possible states, and at any one time the system is in one and only one of its states. Suppose further that at a given observation period, say the $n$th period, the probability of the system being in a particular state depends only on its state at the $(n-1)$th period. Such a system is called a Markov chain or Markov process.
In the example above there are four states for the system. Define $p_{ij}$ to be the probability that the system is in state $i$ after it was in state $j$ (at any observation). The matrix $P = (p_{ij})$ is called the transition matrix of the Markov chain.
So the transition matrix for the example above is

$$P = \begin{pmatrix} 0.25 & 0.20 & 0.25 & p_{14} \\ 0.20 & 0.30 & 0.25 & p_{24} \\ 0.25 & 0.20 & 0.40 & p_{34} \\ 0.30 & 0.30 & 0.10 & p_{44} \end{pmatrix},$$

where the entries $p_{i4}$ of the fourth column are the corresponding probabilities for those who last ate at the pizza place.
The first column represents the state of eating at home, the second column represents the state of eating at the Chinese restaurant, the third column represents the state of eating at the Mexican restaurant, and the fourth column represents the state of eating at the pizza place. Similarly, the rows respectively represent eating at home, eating at the Chinese restaurant, eating at the Mexican restaurant, and eating at the pizza place.
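The percentages above translate directly into this matrix. Below is a minimal sketch in Python/NumPy; since the percentages for those who last ate at the pizza place are missing from the text, the fourth column here is a set of values assumed purely for illustration.

import numpy as np

# States, in order: home, Chinese, Mexican, pizza place.
# P[i, j] = probability of being in state i after state j.
P = np.array([
    [0.25, 0.20, 0.25, 0.30],  # to home
    [0.20, 0.30, 0.25, 0.30],  # to Chinese
    [0.25, 0.20, 0.40, 0.10],  # to Mexican
    [0.30, 0.30, 0.10, 0.30],  # to pizza place
])
# NOTE: the fourth column (from the pizza place) is an assumption made
# here for illustration; the text does not give these percentages.

# Every column of a transition matrix must sum to one.
assert np.allclose(P.sum(axis=0), 1.0)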
Note that the sum of each column in this matrix is one. Any matrix with this property is called a stochastic matrix, a probability matrix, or a Markov matrix. We are interested in the following question:
What is the probability that the system is in the $i$th state at the $n$th observation?
To answer this question, we first define the state vector. For a Markov chain with $k$ states, the state vector for an observation period $n$ is a column vector

$$x^{(n)} = \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_k \end{pmatrix},$$

where $x_i$ is the probability that the system is in the $i$th state at the time of the $n$th observation.
Note that the sum of the entries of the state vector has to be one.
Any column vector

$$x = \begin{pmatrix} c_1 \\ c_2 \\ \vdots \\ c_k \end{pmatrix},$$

where $c_1, c_2, \ldots, c_k$ are nonnegative and $c_1 + c_2 + \cdots + c_k = 1$, is called a probability vector. Consider our example: suppose that at the beginning everyone eats at home, that is, the initial state vector is

$$x^{(0)} = \begin{pmatrix} 1 \\ 0 \\ 0 \\ 0 \end{pmatrix}.$$
At the end of the first week the state vector is

$$x^{(1)} = P x^{(0)} = \begin{pmatrix} 0.25 \\ 0.20 \\ 0.25 \\ 0.30 \end{pmatrix},$$

which is just the first column of $P$.
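Since $x^{(0)}$ puts all of the probability on the home state, multiplying by $P$ simply picks out the first column. A quick numerical check, reusing the matrix from the sketch above (with the same assumed pizza-place column):

import numpy as np

P = np.array([
    [0.25, 0.20, 0.25, 0.30],
    [0.20, 0.30, 0.25, 0.30],
    [0.25, 0.20, 0.40, 0.10],
    [0.30, 0.30, 0.10, 0.30],  # fourth column assumed, as before
])
x0 = np.array([1.0, 0.0, 0.0, 0.0])  # everyone starts at home

x1 = P @ x0  # state vector after one observation period
print(x1)    # [0.25 0.2  0.25 0.3 ] -- the first column of P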
Note that we can compute $x^{(n)}$ directly from $x^{(0)}$ as $x^{(n)} = P^n x^{(0)}$.
Similarly, we can find the state vector for any later observation period.
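Continuing under the same assumption about the pizza-place column, we can tabulate $x^{(n)} = P^n x^{(0)}$ for several values of $n$ and watch the entries settle down:

import numpy as np

P = np.array([
    [0.25, 0.20, 0.25, 0.30],
    [0.20, 0.30, 0.25, 0.30],
    [0.25, 0.20, 0.40, 0.10],
    [0.30, 0.30, 0.10, 0.30],  # fourth column assumed, as before
])
x0 = np.array([1.0, 0.0, 0.0, 0.0])

for n in (1, 2, 5, 10, 20, 50):
    xn = np.linalg.matrix_power(P, n) @ x0
    print(n, np.round(xn, 4))  # the entries change less and less as n grows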
Computing the state vectors $x^{(n)}$ for increasing $n$ suggests that the state vectors approach some fixed vector as the number of observation periods increases.
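When such a fixed vector exists, it is a probability vector $q$ with $Pq = q$, i.e. an eigenvector of $P$ for the eigenvalue 1. The sketch below (still using the assumed pizza-place column) extracts $q$ directly, which also answers the question posed at the beginning: the last entry is the long-run fraction of people eating at the pizza place.

import numpy as np

P = np.array([
    [0.25, 0.20, 0.25, 0.30],
    [0.20, 0.30, 0.25, 0.30],
    [0.25, 0.20, 0.40, 0.10],
    [0.30, 0.30, 0.10, 0.30],  # fourth column assumed, as before
])

vals, vecs = np.linalg.eig(P)
k = np.argmin(np.abs(vals - 1.0))  # eigenvalue closest to 1
q = np.real(vecs[:, k])
q = q / q.sum()                    # scale so the entries sum to one
print(np.round(q, 4))              # long-run fractions: home, Chinese, Mexican, pizza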
This is not the case for every Markov chain, however. For example, for a transition matrix that simply swaps two states at every step, such as

$$P = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} \quad \text{with} \quad x^{(0)} = \begin{pmatrix} 1 \\ 0 \end{pmatrix},$$

we can compute the state vectors for different observation periods:

$$x^{(1)} = \begin{pmatrix} 0 \\ 1 \end{pmatrix}, \quad x^{(2)} = \begin{pmatrix} 1 \\ 0 \end{pmatrix}, \quad x^{(3)} = \begin{pmatrix} 0 \\ 1 \end{pmatrix}, \quad \ldots$$
These computations indicate that this system oscillates between two vectors and does not approach any fixed vector.
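A minimal sketch of this oscillating chain, using the two-state swap matrix above:

import numpy as np

P = np.array([[0.0, 1.0],
              [1.0, 0.0]])  # each step swaps the two states
x = np.array([1.0, 0.0])

for n in range(1, 7):
    x = P @ x
    print(n, x)  # alternates between [0. 1.] and [1. 0.] forever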