Hm... I don't know more than the basics, really, but you can think of a Markov chain as being a directed graph (as in graph theory, not charts in Excel), with weighted edges like this. A and E are the two states, so if you're in state A, there's a 40% chance you'll move to state E, and a 60% chance you'll stay in state A.
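To make that concrete, here's a minimal sketch of that two-state chain in Python. Note the post only gives A's outgoing probabilities (60% stay, 40% to E); the numbers I've used for E's edges are made up just so the chain runs.

```python
import random

# Edge weights: current state -> {next state: probability}.
# A's row matches the description above; E's row is invented for illustration.
transitions = {
    "A": {"A": 0.6, "E": 0.4},
    "E": {"A": 0.7, "E": 0.3},
}

def step(state):
    """Pick the next state according to the current state's edge weights."""
    roll = random.random()
    cumulative = 0.0
    for next_state, prob in transitions[state].items():
        cumulative += prob
        if roll < cumulative:
            return next_state
    return next_state  # guard against floating-point rounding

def walk(start, n):
    """Run the chain for n steps, returning the sequence of states visited."""
    states = [start]
    for _ in range(n):
        states.append(step(states[-1]))
    return states

print(walk("A", 10))
```

The key Markov property is visible in `step`: the next state depends only on the current state, not on how you got there.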
If I'm not mistaken, it's related to Bayesian inference as well, since both deal with the same basic question: given that X is true, what's the chance that Y will happen?
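For what it's worth, that "given X, how likely is Y?" question is exactly what Bayes' rule computes. A quick sketch with invented numbers:

```python
# Toy values, purely for illustration -- none of these come from the example above.
p_y = 0.3          # prior: P(Y)
p_x_given_y = 0.8  # likelihood: P(X | Y)
p_x = 0.5          # evidence: P(X)

# Bayes' rule: P(Y | X) = P(X | Y) * P(Y) / P(X)
p_y_given_x = p_x_given_y * p_y / p_x
print(p_y_given_x)  # 0.48
```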
I hope that helps.