Your explanation has made the most sense out of all the ones I've come across in the past month. If you wouldn't mind, to help my brain get a better grasp on this, in what classification/category does a Markov chain fall? Does it have parents, siblings, or strange friends with similar yet at the same time diverse interests?
Hm... I don't know more than the basics, really, but you can think of a Markov chain as a directed graph (as in graph theory, not charts in Excel) with weighted edges. Say A and E are the two states: if you're in state A, there's a 40% chance you'll move to state E, and a 60% chance you'll stay in state A. If I'm not mistaken, it's related to Bayesian inference as well, since they both address the same basic question (if I know that X is true, what's the chance that Y will happen?). I hope that helps.
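If code helps your brain more than pictures, here's a minimal sketch of that two-state chain in Python. The edges out of A (60% stay, 40% move to E) come from the example above; the edges out of E aren't given, so those numbers are made up:

```python
import random

# Two-state Markov chain. Edges out of A match the example above;
# edges out of E are hypothetical, invented just for this sketch.
transitions = {
    "A": [("A", 0.6), ("E", 0.4)],
    "E": [("A", 0.5), ("E", 0.5)],  # hypothetical probabilities
}

def step(state):
    """Pick the next state using the weighted edges out of `state`."""
    states, weights = zip(*transitions[state])
    return random.choices(states, weights=weights)[0]

def walk(start, n):
    """Take n steps from `start`, returning the list of visited states."""
    visited = [start]
    for _ in range(n):
        visited.append(step(visited[-1]))
    return visited

print(walk("A", 10))
```

The key Markov property is visible in `step`: the next state depends only on the current state, not on how you got there.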