Markov chain

(data structure)

Definition: A finite state machine with probabilities for each transition, that is, a probability that the next state is s_j given that the current state is s_i.

See also hidden Markov model.

Note: Equivalently, a weighted, directed graph in which each edge weight is the probability of that transition. The weights are therefore nonnegative, and every state has positive total outgoing weight. If the weights are normalized, each state's total outgoing weight, including any self-loop, is exactly 1. After [Leda98].
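The weighted-directed-graph view above can be sketched in code. This is a minimal illustration, not from the dictionary entry: states, transition probabilities, and the `next_state` function are all hypothetical, chosen only to show a normalized chain (each state's outgoing weights, including self-loops, sum to 1) and how the next state is sampled from the current one.

```python
import random

# Hypothetical Markov chain as an adjacency structure: each state s_i maps
# to a list of (s_j, probability) pairs. Outgoing probabilities sum to 1.
chain = {
    "A": [("A", 0.1), ("B", 0.9)],
    "B": [("A", 0.5), ("C", 0.5)],
    "C": [("C", 1.0)],  # absorbing state: self-loop with weight 1
}

def next_state(chain, state, rng=random):
    """Sample the next state given the current state."""
    r = rng.random()
    cumulative = 0.0
    for target, p in chain[state]:
        cumulative += p
        if r < cumulative:
            return target
    # Guard against floating-point round-off in the cumulative sum.
    return chain[state][-1][0]

# Walk five steps starting from state "A".
state = "A"
for _ in range(5):
    state = next_state(chain, state)
```

Because "C" has only a self-loop of weight 1, any walk that reaches it stays there; the other states' transitions are drawn according to their edge weights.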

Author: PEB

More information

Named after Andrei Andreyevich Markov (1856–1922), who studied poetry and other texts as stochastic sequences of characters.


If you have suggestions, corrections, or comments, please get in touch with Paul Black.

Entry modified 23 January 2006.

Cite this as:
Paul E. Black, "Markov chain", in Dictionary of Algorithms and Data Structures [online], Vreda Pieterse and Paul E. Black, eds. 23 January 2006. (accessed TODAY) Available from: