2016-11-22 at 10:40 AM UTC
I am making this thread because I was researching some things and came upon the Markov property and the Martingale. I was having trouble understanding the difference until I found this explanation.
Let's play a game called Markovian ball. In this game a sack contains some number of black balls and white balls. To play, you simply pick a ball, record its color, and put it back. Markovian ball is a Markov process because picking a ball and returning it has no bearing on the next pick: nothing in the history of play changes the probability of the next outcome.
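The game above can be sketched in a few lines of Python. This is a minimal illustration, not anything from the original post; the function name markovian_ball and the starting counts are mine. The point is that the draw probability is computed from fixed counts, so every pick is identical no matter what came before:

```python
import random

def markovian_ball(black, white, draws, seed=0):
    """Pick a ball, record its color, put it back; the sack never changes."""
    rng = random.Random(seed)
    results = []
    for _ in range(draws):
        # probability of black is black / (black + white) on every single draw
        results.append("B" if rng.random() < black / (black + white) else "W")
    return results

print(markovian_ball(3, 7, 10))
```

Because the counts never change, the draws are independent and identically distributed, which is the (trivial) Markov case: the history is irrelevant.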
Martingale ball, on the other hand, is a slight variation. After a ball is picked and returned, another ball of the same color is added to the sack. This addition changes the probability of the next outcome.
Thus in Martingale ball, knowledge of previous plays still adds nothing beyond the current contents of the sack: the current state is the only determining factor for the next outcome, so the process is still Markov even though the probabilities drift as balls are added. What makes it a martingale is the expectation: the expected fraction of black balls after the next draw is exactly the current fraction. In Markovian ball, by contrast, the probability of each outcome never changes at all.
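Martingale ball is essentially a Pólya urn, and the martingale property can be checked by simulation. A quick sketch (the function name polya_urn and the run counts are mine, not from the post): averaged over many runs, the final fraction of black balls stays at the starting fraction, even though any single run can wander far from it.

```python
import random

def polya_urn(black, white, draws, rng):
    """Martingale ball: after each pick, add another ball of the same color."""
    for _ in range(draws):
        if rng.random() < black / (black + white):
            black += 1
        else:
            white += 1
    return black / (black + white)

rng = random.Random(42)
runs = [polya_urn(3, 7, 50, rng) for _ in range(20000)]
avg = sum(runs) / len(runs)
# martingale property: the average final fraction stays near the
# starting fraction 3 / (3 + 7) = 0.3
print(round(avg, 3))
```

Individual runs are all over the place (the urn can lock in on either color early), but the expectation is pinned to the starting fraction, which is exactly the martingale condition.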
2016-11-22 at 11:59 AM UTC
Originally posted by mmQ
How do you win?
The only way to win is not to play
2016-11-23 at 1:15 AM UTC
aldra
JIDF Controlled Opposition
note to self: stop browsing this site at work
Lanny: can we please restrict embedded images to a set size, like 150x150 or something, click to enlarge?