Markov chain

(data structure)

Definition: A finite state machine with a probability attached to each transition, that is, a probability that the next state is sj given that the current state is si.

See also hidden Markov model.

Note: Equivalently, a weighted, directed graph in which the weight of each edge is the probability of that transition. In other words, the weights are nonnegative, and the total weight of the outgoing edges of each vertex is positive. If the weights are normalized, the total weight of the outgoing edges of each vertex, including self-loops, is 1. After [Leda98].
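The graph view in the note can be sketched as follows; this is a minimal illustration, not part of the original entry. The two states ("sunny" and "rainy") and their weights are hypothetical, chosen only to show a chain whose normalized outgoing weights, including self-loops, sum to 1.

```python
import random

# A Markov chain as a weighted, directed graph: each state maps to its
# outgoing transitions. Weights are normalized, so each state's outgoing
# weights (including the self-loop) sum to 1. States are illustrative.
chain = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(chain, state, rng):
    """Pick the next state with probability given by the edge weights."""
    targets = list(chain[state])
    weights = [chain[state][t] for t in targets]
    return rng.choices(targets, weights=weights, k=1)[0]

def walk(chain, start, steps, seed=0):
    """Generate a random walk of `steps` transitions from `start`."""
    rng = random.Random(seed)
    states = [start]
    for _ in range(steps):
        states.append(next_state(chain, states[-1], rng))
    return states

# Check the normalization condition from the note: every state's
# outgoing weights total 1.
assert all(abs(sum(w.values()) - 1.0) < 1e-9 for w in chain.values())
```

Only the current state matters when choosing the next one; the walk keeps no other history, which is the defining (Markov) property of the structure.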

Author: PEB

More information

Named after Andrei Andreyevich Markov (1856 - 1922), who studied poetry and other texts as stochastic sequences of characters.


If you have suggestions, corrections, or comments, please get in touch with Paul Black.

Entry modified 21 December 2020.

Cite this as:
Paul E. Black, "Markov chain", in Dictionary of Algorithms and Data Structures [online], Paul E. Black, ed. 21 December 2020. (accessed TODAY) Available from: