I am trying to write something that mimics the Martingale betting system. If you are not familiar with it, this is the “sure thing!” (not actually a sure thing) betting system for coin-flip games, where you double your bet every time you lose, hoping to win back all the money you lost on your first win.
Your bets thus go $10 → loss → $20 → loss → $40 → loss → $80 → win! → $10 → ... (after three losses you are down $10 + $20 + $40 = $70, which the $80 win more than covers).
Simple, right? I suppose the logic would be:
- Have a wallet variable starting at $1,000.
- Make a bet.
- Flip a coin with rand(0..1): 0 is a loss and 1 is a win.
- If I win, add the bet to my wallet. If I lose, subtract the bet from my wallet and place a new bet for twice the previous one.
I wrote it like this:
def flip(bet)
  if rand(0..1) == 1   # 1 is a win
    $balance += bet    # collect the winnings
  else                 # 0 is a loss
    $balance -= bet    # pay out the bet
    flip(bet * 2)      # double the bet and flip again
  end
end
Then I run flip(10) a thousand times to see how effective this betting system is.
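The harness looks roughly like this (my reconstruction, assuming the balance is printed after each flip; the $1,000 starting wallet is from the list above):

$balance = 1000        # the starting wallet

1000.times do
  flip(10)             # each round opens with a $10 bet
  puts $balance        # print the running balance after each round
end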
The problem is that I always get the same results. I will run the program ten times, and the first five results will always be 1010, 1020, 1030, 1040, 1050, ... So something is wrong. But I can’t see what; the logic looks right to me.
To check, I deleted the recursive call flip(bet * 2) and instead just ran a thousand regular bets. That behaves the way you would expect: different results each time.
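That is, something like this (a sketch of the non-recursive variant; flat_flip is just a name I made up for it):

# Flat betting: the bet never changes, win or lose.
def flat_flip(bet)
  if rand(0..1) == 1
    $balance += bet
  else
    $balance -= bet
  end
end

$balance = 1000
1000.times { flat_flip(10) }
puts $balance          # varies from run to run, as expected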
So what is going on here?