Markov Chain

A Markov chain is a sequence of random variables ( X_1, X_2, X_3, \cdots) with the property that the probability of moving to the next state depends only on the current state, not on the previous states.

The probability of moving to the next state satisfies

 \displaystyle
    P(X_{n+1}=x \mid X_1=x_1, X_2=x_2, \ldots, X_n=x_n)
    = P(X_{n+1}=x \mid X_n = x_n)

that is, the conditional distribution of X_{n+1} given the entire history depends only on the current state X_n = x_n. This is known as the Markov property.
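The Markov property can be seen directly in simulation code: to sample the next state, only the current state is needed. Below is a minimal sketch using a hypothetical two-state weather chain (the states "sunny"/"rainy" and their transition probabilities are illustrative, not from the text above).

```python
import random

# Hypothetical transition probabilities: each row gives the
# distribution of the next state given the current state (rows sum to 1).
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(current, rng):
    """Sample X_{n+1} using only the current state X_n (Markov property)."""
    states = list(TRANSITIONS[current])
    weights = [TRANSITIONS[current][s] for s in states]
    return rng.choices(states, weights=weights)[0]

def simulate(start, steps, seed=0):
    """Generate a realization X_1, ..., X_steps starting from `start`."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(steps - 1):
        chain.append(next_state(chain[-1], rng))
    return chain

print(simulate("sunny", 10))
```

Note that `next_state` never looks at earlier entries of the chain; the full history `chain` is kept only for output, which is exactly the property expressed in the equation above.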

You can find a visual explanation on this page.