Markov Chain
A Markov chain is a sequence of random variables X_1, X_2, X_3, … with the property that the probability of moving to the next state depends only on the current state, not on the previous states.
Formally, the probability of moving to the next state is defined as follows:

Pr(X_{n+1} = x | X_1 = x_1, X_2 = x_2, …, X_n = x_n) = Pr(X_{n+1} = x | X_n = x_n)

that is, it depends only on the current state X_n.
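The memoryless property above can be illustrated with a small simulation. Below is a minimal sketch in Python, assuming a hypothetical two-state weather chain (the states "sunny" and "rainy" and their transition probabilities are invented for illustration); note that the sampling function looks only at the current state, never at the earlier history.

```python
import random

# Hypothetical transition probabilities: each row gives the
# distribution of the next state given only the current state.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Sample the next state; only the current state is consulted."""
    next_states = list(TRANSITIONS[state])
    weights = list(TRANSITIONS[state].values())
    return rng.choices(next_states, weights=weights)[0]

def simulate(start, n_steps, seed=0):
    """Generate a path of n_steps transitions starting from `start`."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1], rng))
    return path

path = simulate("sunny", 10)
print(path)
```

Because each call to `step` receives only the latest state, the simulated path satisfies the Markov property by construction.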
You can find a visual explanation on this page.