N-Step Transition Probabilities

Following up on Markov Chains Introduction

Question: If a Markov Chain is in state i at time m, what is the probability that n periods later it will be in state j?

Answer: It is the (i,j)-th element of P^n, the n-th power of the one-step transition probability matrix P.

Example:

One-step transition probabilities (row: state at t=0; column: state at t=1):

          Sunny   Cloudy   Rainy
Sunny      0.7     0.2      0.1
Cloudy     0.2     0.5      0.3
Rainy      0.2     0.2      0.6
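
To make the n-step claim concrete, here is a minimal Python/NumPy sketch (my own; the variable names and the Sunny/Cloudy/Rainy row ordering are assumptions, not part of the original notes) that encodes the matrix above and computes its square, the two-step transition matrix:

    import numpy as np

    # One-step transition matrix; entry (i, j) = P(next state is j | current state is i).
    # Rows/columns ordered Sunny, Cloudy, Rainy.
    P = np.array([
        [0.7, 0.2, 0.1],   # from Sunny
        [0.2, 0.5, 0.3],   # from Cloudy
        [0.2, 0.2, 0.6],   # from Rainy
    ])

    assert np.allclose(P.sum(axis=1), 1.0)   # each row is a probability distribution

    # Two-step transition probabilities are the entries of P^2.
    P2 = np.linalg.matrix_power(P, 2)
    print(P2)

Each entry of P2 is the probability of moving from the row state to the column state in exactly two periods.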

What is the probability that it is sunny in 2 days, given that it is rainy today?

t=0: Rainy.
t=1: 0.2 sunny, 0.2 cloudy, 0.6 rainy.
t=2: condition on the state at t=1 and apply the transition probabilities once more:
  via sunny (0.2): 0.2(0.7) sunny, 0.2(0.2) cloudy, 0.2(0.1) rainy
  via cloudy (0.2): 0.2(0.2) sunny, 0.2(0.5) cloudy, 0.2(0.3) rainy
  via rainy (0.6): 0.6(0.2) sunny, 0.6(0.2) cloudy, 0.6(0.6) rainy
Collecting the sunny terms: P(sunny at t=2 | rainy at t=0) = 0.2(0.7) + 0.2(0.2) + 0.6(0.2) = 0.14 + 0.04 + 0.12 = 0.3.
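
The hand calculation can be cross-checked against the matrix power. A small self-contained sketch (again assuming the Sunny=0, Cloudy=1, Rainy=2 ordering introduced above):

    import numpy as np

    P = np.array([[0.7, 0.2, 0.1],    # from Sunny
                  [0.2, 0.5, 0.3],    # from Cloudy
                  [0.2, 0.2, 0.6]])   # from Rainy

    P2 = np.linalg.matrix_power(P, 2)
    print(P2[2, 0])                        # Rainy -> Sunny in two steps, ~0.3
    print(0.2*0.7 + 0.2*0.2 + 0.6*0.2)     # the sum computed by hand above, ~0.3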