In mathematics, a Markov chain, named after Andrey Markov, is a discrete random process with the Markov property. A discrete random process means a system which can be in various states, and which changes randomly in discrete steps. It can be helpful to think of the system as evolving once a minute, although strictly speaking the "step" may have nothing to do with time. The Markov property states that the probability distribution for the system at the next step (and in fact at all future steps) only depends on the current state of the system, and not additionally on the state of the system at previous steps. Since the system changes randomly, it is generally impossible to predict the exact state of the system in the future. However, the statistical properties of the system at a great many steps in the future can often be described. In many applications it is these statistical properties that are important.
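To make the Markov property concrete, here is a minimal Python sketch of a hypothetical two-state "weather" chain; the state names and transition probabilities are invented for illustration and are not from the article. The point is that the next state is sampled from a distribution that depends only on the current state, never on earlier history.

```python
import random

# A hypothetical two-state chain ("sunny"/"rainy") used only for illustration;
# the transition probabilities below are made-up numbers, not from the source.
transitions = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(state):
    """Sample the next state; the distribution depends only on the current state."""
    r = random.random()
    cumulative = 0.0
    for next_state, prob in transitions[state].items():
        cumulative += prob
        if r < cumulative:
            return next_state
    return next_state  # guard against floating-point rounding

state = "sunny"
for _ in range(10):
    state = step(state)
    print(state)
```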
The possible values of X_i (the state of the chain at step i) form a countable set S called the state space of the chain. Markov chains are often described by a directed graph, where the edges are labeled by the probabilities of going from one state to the other states.
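That directed-graph description corresponds to a transition matrix whose entry in row i, column j is the probability of moving from state i to state j. As a rough sketch, again using the made-up two-state chain from the example above, iterating that matrix shows how the long-run statistical properties mentioned earlier can be computed even though each individual step is random.

```python
import numpy as np

# The same illustrative two-state chain as above, written as a transition
# matrix P, where P[i][j] is the probability of going from state i to state j.
# Row/column 0 = "sunny", row/column 1 = "rainy"; the numbers are made up.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Starting from a rainy day, the distribution after n steps is the initial
# distribution multiplied by P n times. For large n it settles toward the
# chain's long-run distribution, even though any single trajectory is random.
dist = np.array([0.0, 1.0])
for _ in range(50):
    dist = dist @ P

print(dist)  # approximately [0.833, 0.167]
```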
Source: http://en.wikipedia.org/wiki/Markov_chain