
A Markov Chain is a collection of states that you move between, making choices according to probability distributions, in which the probability of moving to the next state depends only on the current state. More to follow.
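The idea above can be sketched in a few lines of Python. This is a minimal illustration, not a library implementation: the two-state "weather" chain and its transition probabilities are made up for the example.

```python
import random

# Hypothetical two-state weather chain. Each state maps to a list of
# (next_state, probability) pairs; each list's probabilities sum to 1.
transitions = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state):
    # Choose the next state using only the current state's distribution --
    # this is the Markov property in action.
    r = random.random()
    cumulative = 0.0
    for next_state, p in transitions[state]:
        cumulative += p
        if r < cumulative:
            return next_state
    return next_state  # guard against floating-point rounding

def walk(start, n):
    # Take n steps through the chain, recording the path.
    state = start
    path = [state]
    for _ in range(n):
        state = step(state)
        path.append(state)
    return path

print(walk("sunny", 10))
```

Note that `step` never looks at the history of the walk, only at `state`; that memorylessness is what distinguishes a Markov chain from a general stochastic process.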

Markov chains are now also used as a spam-filtering technique.

* http://www.google.co.uk/search?q=Markov+Chain