Example of Markov Chain

Let's start with an example using NIFTY. Tomorrow, NIFTY can be in one of three states -

It can end in green (close higher than today).
It can end in red (close lower than today).
It can close flat (at the same level as today).

Note - On any given day, NIFTY will be in exactly one of these three states. As per the theory of Markov chains, we assume that tomorrow's state of NIFTY depends only on today's state.

Likewise, what happens today depends only on yesterday's state, and so on!

In other words - if you know the state of NIFTY today, you can estimate the probability of each possible state tomorrow.

The diagram below shows an example of such a Markov chain with assumed probability values.
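The idea above can be sketched in code. This is a minimal illustration, not a trading model: the transition probabilities below are made-up placeholder values (like the assumed values in the diagram), and the state names `green`, `red`, and `flat` are labels chosen here for illustration.

```python
import random

# Hypothetical transition probabilities (illustrative only, not real market data).
# Each row is today's state; each entry is the probability of tomorrow's state.
states = ["green", "red", "flat"]
transition = {
    "green": {"green": 0.5, "red": 0.3, "flat": 0.2},
    "red":   {"green": 0.4, "red": 0.4, "flat": 0.2},
    "flat":  {"green": 0.3, "red": 0.3, "flat": 0.4},
}

def next_state(today):
    """Sample tomorrow's state given only today's state (the Markov property)."""
    r = random.random()
    cumulative = 0.0
    for state, p in transition[today].items():
        cumulative += p
        if r < cumulative:
            return state
    return state  # guard against floating-point rounding

# Simulate a 5-day path of states starting from a green day.
random.seed(42)
path = ["green"]
for _ in range(5):
    path.append(next_state(path[-1]))
print(path)
```

Note that `next_state` looks only at `today` - no earlier history is consulted - which is exactly the Markov assumption described above.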
