N-Step Transition
Following up on Markov Chains Introduction
Question: If a Markov chain is in state i at time m, what is the probability that n periods later it will be in state j?
Answer: It is the (i,j)-th element of P^n, the n-th power of the transition probability matrix P.
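A short sketch of why this is a matrix power (the Chapman-Kolmogorov argument; the symbols p_ik below denote one-step entries of P and are notation added here, not from the notes above): conditioning on the intermediate state at time m+1 gives, for two steps,

$$
p^{(2)}_{ij} = \sum_{k} \Pr(X_{m+1}=k \mid X_m=i)\,\Pr(X_{m+2}=j \mid X_{m+1}=k)
            = \sum_{k} p_{ik}\,p_{kj} = \big[P^2\big]_{ij},
$$

and iterating the same conditioning step gives $p^{(n)}_{ij} = [P^n]_{ij}$.

Example: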
| t=0 \ t=1 | Sunny | Cloudy | Rainy |
|---|---|---|---|
| Sunny | 0.7 | 0.2 | 0.1 |
| Cloudy | 0.2 | 0.5 | 0.3 |
| Rainy | 0.2 | 0.2 | 0.6 |
What is the probability that it will be sunny in 2 days if it is rainy today?
t=0: Rainy.
t=1: 0.2 sunny, 0.2 cloudy, 0.6 rainy.
t=2: P(sunny) = 0.2(0.7) + 0.2(0.2) + 0.6(0.2) = 0.30; P(cloudy) = 0.2(0.2) + 0.2(0.5) + 0.6(0.2) = 0.26; P(rainy) = 0.2(0.1) + 0.2(0.3) + 0.6(0.6) = 0.44.
So the probability that it is sunny in 2 days, given it is rainy today, is 0.3.
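A minimal sketch that checks this hand calculation, assuming NumPy and a (sunny, cloudy, rainy) state ordering (neither is part of the original notes): raise the transition matrix to the second power and read off the (rainy, sunny) entry.

```python
# Sketch: verify the two-step hand calculation by computing P^2 with NumPy.
import numpy as np

# One-step transition matrix; rows = today's state, columns = tomorrow's state.
# State order assumed here: 0 = sunny, 1 = cloudy, 2 = rainy.
P = np.array([
    [0.7, 0.2, 0.1],  # sunny  -> sunny, cloudy, rainy
    [0.2, 0.5, 0.3],  # cloudy -> sunny, cloudy, rainy
    [0.2, 0.2, 0.6],  # rainy  -> sunny, cloudy, rainy
])

P2 = np.linalg.matrix_power(P, 2)  # two-step transition matrix P^2

# (rainy, sunny) entry: probability of sunny in 2 days given rainy today.
print(P2[2, 0])  # ~0.30

# Full t=2 distribution starting from rainy: sunny 0.30, cloudy 0.26, rainy 0.44.
print(P2[2])
```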