
Physics of Coin Tossing and Uncertainty | Cheenta Probability Series

This is the 4th post in the Cheenta Probability Series; it deals with the physics involved in coin tossing and reveals the true nature of uncertainty.

“It is a very tedious task!! First you have to calculate where he is and where he is not, then you must calculate where he could possibly be, then you must seek where he is at this moment, and then finally you have to calculate the probability of finding him when you reach the place where you suspect him to be right now.”

Sukumar Ray, in his novella “Utter Nonsense” (Ho-Jo-Bo-Ro-Lo)

Uncertainty is something that has drawn the attention of mathematicians and philosophers right from the beginning of modern civilization. There have been many schools of thought about the true nature of uncertainty, and they have kept changing as the mathematics has gradually grown more sophisticated. The perspective on uncertainty that was (and is) most appealing to me originates with the great probabilist Bruno de Finetti: “Uncertainty is actually the quantification of our ignorance, of the lack of information.” This straightaway makes the idea of probability subjective, which should be a topic of discussion for another day. But a living legend and ambassador of this school of thought, Persi Diaconis, is a man obsessed with the most fundamental places where uncertainty exists; he goes further, seeking a physical solution to the problem of quantifying uncertainty, which is what our present discussion is all about.

In statistical experiments, lack of knowledge does motivate statisticians to develop more sophisticated laws of prediction, and statisticians gain only as much knowledge as they can afford at a given cost. So lack of knowledge is what induces the desire for prediction, and by extension for handling uncertainty. Statisticians basically map knowledge gained against cost; though the cost-gain relationship of knowledge is a different topic altogether, and not really core probability, from it we end up with two most important philosophical questions:

  • What is uncertainty?
  • Can we actually gather all the information if we have the luxury of infinite investment?

You can see clearly that the two questions are correlated; answering one gives you the justification for the other. I choose to discuss the first question, as it is also the more general one.

Is Coin Flipping Random?

There are quite a number of fundamental models that engage probabilists and mathematicians in their quest for the actual nature of uncertainty; coin tossing, being the most basic and fundamental of them, made the great mathematician and probabilist Diaconis obsessed, and he dived into the search for the true nature of uncertainty. In spite of the simplicity of the experiment, the unknowns are many, and they accumulate into an ultimately uncertain system, subject to physical changes that one can't possibly have the luxury of being aware of. But Diaconis, being an obsessed man, follows the thoughts of de Finetti and asks: “Is coin tossing truly random, or is it just our ignorance of the physical parameters?”

What if we knew the magnitude of the force that our thumb imparted to the coin? At how many spins per second the coin spun while going up or coming down? At what velocity the coin went up? How high the coin went before it fell back to the ground? What about the collisions with the air?

If we can answer these questions, then physics allows us to completely determine the outcome of the flip; so coin tossing is not at all random, it's physics!!

Persi Diaconis

Diaconis is so obsessed by the enigma of uncertainty that he claimed coin tossing obeys the laws of mechanics (ignoring the effect of air molecules). To demonstrate this, he had the physics department build him a coin-tossing machine. The coin starts out on a spring; the spring is released, and the coin spins upward and lands in a cup (as shown in the figure). Because the force imparted is controlled, he claims that the coin always lands with the same side up. Magicians and crooked gamblers (including Persi Diaconis himself) also possess such abilities. I suggest that interested readers read articles and watch videos of Diaconis on coin tossing and games of cards; you will be amazed. These are the reasons I feel he is among the very few who have worked extensively and mathematically on the philosophical questions about uncertainty.

Coin-Tossing Machine
Keller's approach to explaining the dynamics of coin tossing

The careful study of the flipped coin was started by Keller (in 1986). He assumed that a coin flips about an axis in its plane, spinning about this axis at a rate of \(w\) revolutions per second. If the initial velocity in the up direction \(\vec{K}\) is \(v\), then after \(t\) seconds a coin flipped from a height \(z_0\) will be at height \(z_0 + vt - \frac{g}{2}t^2\). If the coin is caught at the height from which it was flipped, the elapsed time \(t^*\) satisfies \(t^* v - \frac{g {t^*}^2}{2} = 0\), or \(t^* = \frac{2v}{g}\) (simple equations of motion). In this time the coin turns \(\frac{2wv}{g}\) times. If this number is between \(2j\) and \(2j+1\), the initial side will turn up; if it is between \(2j+1\) and \(2j+2\), the opposite side turns up.
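To make Keller's rule concrete, here is a minimal sketch in Python (the function name, and the sample velocity and spin values, are assumptions of mine, not Keller's):

```python
g = 9.8  # gravitational acceleration, m/s^2

def keller_outcome(v, w, start="heads"):
    """Deterministic outcome under Keller's model: upward velocity v (m/s),
    spin rate w (rev/s); the coin turns 2wv/g times before it is caught."""
    turns = 2 * w * v / g
    same_side = int(turns) % 2 == 0   # turns in [2j, 2j+1) keep the initial side up
    return start if same_side else ("tails" if start == "heads" else "heads")

print(keller_outcome(2.4, 38))   # heads
print(keller_outcome(2.4, 39))   # tails: one extra spin per second flips the call
```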

Hyperbolas as defined by the various initial values of \(w\).

The figure shows the \((w, v)\) space divided into regions where the coin comes up as the same side or the opposite side. The regions are bounded by the hyperbolas \(\frac{2wv}{g} = j\). Visually, the regions get closer together as the velocity and spin grow, implying that small changes in initial conditions make the difference between heads and tails.

How then is the probabilistic treatment of coin flips so widespread and so successful? The basic answer was given by Poincaré. If a coin is flipped vigorously, with sufficient vertical and angular velocity, there is sensitive dependence on initial conditions. A little uncertainty about the initial conditions is then amplified into a large uncertainty about the outcome, where equiprobability of outcomes is not at all a bad assumption.

In 1992, Engel carried on the work based on Keller's and fitted a probabilistic model to the velocity and spins per second; he too ended up concluding that the chance of getting a head is nearly half. Even after accounting for the velocity and the spins, it seemed difficult to argue against the fact that one obtains a head or a tail with almost equal chances.

Engel proposed a theorem that considers two probability distributions, on the velocity \(v\) and the spins per second \(w\), and gives a bound on the chance of getting a head, finding that this chance hovers around \(\frac{1}{2}\).
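Engel's bound itself is beyond our scope here, but a quick Monte Carlo sketch conveys the spirit: jitter \(v\) and \(w\) around plausible toss values (the normal distributions below are my assumptions, not Engel's model) and apply Keller's parity rule; the fraction of same-side outcomes lands very close to \(\frac{1}{2}\).

```python
import numpy as np

rng = np.random.default_rng(42)
g, n = 9.8, 1_000_000

v = rng.normal(2.4, 0.3, n)      # upward velocity in m/s (assumed spread)
w = rng.normal(38.0, 4.0, n)     # spin rate in rev/s (assumed spread)

# Keller's rule: the same side stays up when floor(2wv/g) is even
same_side = np.floor(2 * w * v / g).astype(int) % 2 == 0
print(same_side.mean())          # prints ~0.50
```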

So, even in a classical mechanical setup, the entire uncertainty can't be wiped out completely.


Do Fair Tosses Depend Only on Fair Coins?

According to Persi Diaconis's observations, coin tossing outcomes are heavily dependent on the way the coin is tossed up. The exact determination of the bias also depends in a delicate way on the shape of the coin's edges, which magicians use to trick the naked human eye. (We will see this fact as a consequence of a problem we will solve later.) He even goes further to suggest that coin tossing gives a fairly unbiased result when a penny is tossed, say, 100 times, but the same penny when spun shows a considerable bias towards tails. We provide the result in a histogram comparing the outcomes of coin tossing and coin spinning. Diaconis further extends his search for uncertainty in coin flipping, explaining more about physical chance; interested readers can make their way further to those works.

Coins spun on their edges vs. tossed coins (histograms)

Having explained and discussed the approaches and ideas associated with understanding the physics behind coin flipping, let us shift to some beautiful problems that involve, and explain, the physical geometry and the classical mechanical fundamentals influencing the outcome of a toss.


Wishful Coin Tossing

Suppose you are tossing a coin and you know nothing about its fairness. We assume that the thickness of the coin is negligible. The coin is tossed: what is the chance of getting a head?

Now, here we really know next to nothing about the initial conditions, which we saw impact the outcome heavily. So here we will deal with the problem geometrically; but don't forget that classical mechanics basically originated from this kind of mathematics. Hence, let us assume that after the coin strikes the surface, the normal vector \(\vec{N}\) drawn on the heads side of the coin generates a cone (follow the figure). The axis of the cone makes an angle \(\theta\) \((-\frac{\pi}{2} \le \theta \le \frac{\pi}{2})\) with the horizontal plane, and \(\alpha\) is the angle between the generatrix of the cone and its axis \((0 \le \alpha \le \frac{\pi}{2})\).

This figure demonstrates the dynamics of the coin according to the stated problem; the shaded sector is where the coin will flip over, and the darker region lies below the horizontal plane when \(|\theta| \le \alpha\).

When the coin hits the ground and starts spinning before finally settling, the normal vector spins randomly over the circumference of the base of the (imaginary) cone. Now imagine the coin in an arbitrary position (as given in the figure).

Now observe, as is quite obvious from the figure, that if \(\theta > \alpha\), then the coin will surely not flip over, i.e., heads will come up.

Similarly, if \(\theta < -\alpha\), then the coin will surely flip over and heads will never come up.

But when \(|\theta| \le \alpha\), we have to observe the rotation of the normal vector \(\vec{N}\) more closely, and locate the regions on the circumference where, if the normal vector stops there, the coin flips over. Observe further that the coin will flip over whenever the normal vector intersects (or penetrates) the horizontal plane. It is hard to imagine; the figure and one's own imagination are all I can offer for clarification. Observe that the part of the base of the cone immersed below the horizontal plane is the part of the circumference on which the coin flips over.

So, observing the figure, say \(O'P = r'\), let the radius of the base of the cone be \(r\) and the height of the cone be \(h\). Using simple properties of the circle, we need to find the arc length of the shaded sector, since on that part of the circle the coin flips over, following the argument we established above. Then,

\(\tan\theta = \frac{r'}{h}\) and \(\tan\alpha = \frac{r}{h}\)

Let the angle of the sector be \(\phi\) (angle at \(O’\)).

So, \(\cos{\frac{\phi}{2}} = \frac{r'}{r} = \frac{h\tan\theta}{h\tan\alpha} \Rightarrow \phi = 2\cos^{-1}\left(\frac{\tan\theta}{\tan\alpha}\right)\)

So, \(P(\text{coin flips over}) = \frac{r\phi}{2\pi r} = \frac{1}{\pi}\cos^{-1}\left(\frac{\tan\theta}{\tan\alpha}\right)\)

Hence, \(P(\text{Heads}) = 1 - \frac{1}{\pi}\cos^{-1}\left(\frac{\tan\theta}{\tan\alpha}\right)\).

So, we can also say that the toss will be fair when \(\theta = 0\), i.e., \(P(\text{Head}) = \frac{1}{2}\). We can conclude the same from the figure: if the axis of the cone becomes parallel to the horizontal plane, there is an equal chance of falling on either side and exposing either face.
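Here is a minimal numerical check of the result just derived (the function names and the Monte Carlo parametrization of \(\vec{N}\) are mine): the closed form and a simulation that stops the normal vector at a uniformly random azimuth on the cone agree, and \(\theta = 0\) indeed gives \(\frac{1}{2}\).

```python
import numpy as np

def p_heads(theta, alpha):
    """Closed form from the cone argument above."""
    if theta >= alpha:
        return 1.0                  # cone never dips below the plane: no flip
    if theta <= -alpha:
        return 0.0                  # normal always ends up below: always flips
    return 1 - np.arccos(np.tan(theta) / np.tan(alpha)) / np.pi

def p_heads_mc(theta, alpha, n=500_000):
    """Spin the normal to a uniform azimuth psi on the cone; the coin flips
    exactly when the vertical component of the normal is negative."""
    psi = np.random.default_rng(0).uniform(0, 2 * np.pi, n)
    n_z = np.cos(alpha) * np.sin(theta) + np.sin(alpha) * np.cos(psi) * np.cos(theta)
    return np.mean(n_z >= 0)

print(p_heads(0.0, 0.5), p_heads_mc(0.0, 0.5))   # both ~0.5: a fair toss
print(p_heads(0.3, 0.5), p_heads_mc(0.3, 0.5))   # a tilted axis favours heads
```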

If you can toss a coin such that it first lands on its edge when it hits the ground (surface), then even with a biased coin you can perform a fair toss!!

So, as Diaconis claimed, edges play an important role in determining the outcome of a toss, and this problem showed the same. Actually, when you see a magician toss a coin claiming that heads will land (as Diaconis does himself) and truly end up with a head, you can be sure that he/she is always using the “\(\theta > \alpha\)” case. Though in this problem I used fewer parameters (as compared to standard mechanical models), I still claim that I ended up with a quite satisfactory chance of getting a head, since the geometry was specific and subtle enough to cover all the possibilities after the coin hits the surface.

Can you control the uncertainty on the edges of the coin like the magicians ? Try it !!

Edgy Tossing

Suppose I ask you about the chance of a coin landing on one of its faces. You may think how foolish of me it is to ask such a question, and smartly reply that “landing on one of its faces is a sure event, man!!”. But then I argue, with sound mathematical logic, that we can always assign some chance to the event that the coin sometimes lands on its edge!! Edgy, right?

Imagine a thick coin (for psychological convenience). Now we can actually model the coin as a cylinder, right? A flatter one, perhaps. Say the coin has radius \(r\) and thickness \(h\); even though the mass is accountable, we ignore it for simplicity. Now you toss the coin (cylinder): what do you think is the probability that the cylinder (coin) lands on its lateral surface (edge)?

Try this problem yourself!! Use the assumption that Keller used, namely that the normal vector of the coin rotates in a spherical space, and use the idea of the solid angle of the geometric shapes you wish to use in analyzing the phenomenon. If you don't know what a solid angle is, give the topic a brief read and come back to the problem!! Good luck!

I tried the problem with a standard coin of diameter (\(2r\)) 19.05 mm and thickness (\(h\)) 1.52 mm. I ended up with the fact that there is about a 3% chance of the coin landing on its edge!! Hence landing on the edge is not at all impossible; on the contrary, it is edgy enough!!
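If you want to compare your own attempt, here is a sketch of one textbook solid-angle model (the one behind the thick-coin problem in Mosteller's book, listed in the references): inscribe the coin in a sphere and take the fraction of the sphere's area lying in the equatorial band subtended by the lateral surface. Be warned that this crude band model returns roughly 8% for the coin above, larger than the ~3% I quoted; the exact number depends on finer modelling choices (energy, bouncing), so treat this as an illustration of the solid-angle idea rather than a definitive computation.

```python
import math

def p_edge(diameter, thickness):
    """Fraction of the sphere's solid angle in the band subtended by the
    coin's edge: the band between latitudes -t and +t covers sin(t) of the
    sphere, where tan(t) = (thickness/2) / radius."""
    r, h = diameter / 2, thickness
    t = math.atan((h / 2) / r)
    return math.sin(t)

print(p_edge(19.05, 1.52))   # ~0.079 under this naive band model
```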

The Coin Finally Becomes Schrödinger's Cat!

So, till now we have been talking about how classical physics tries to explain what comes disguised as uncertainty, but at the end of every instance we failed to free ourselves from chance. Though we must admit that the laws of mechanics explain some part of the uncertainty, this leaves us to propose a “model of uncertainty” for the better understanding of our readers. The model is as follows:

\( \text{Uncertainty} = \text{“Lack of classical mechanical information”} + \text{“Error”} \).

Now, before concluding, we will just discuss and try to argue where this error comes from and what it actually represents.

According to recent research by the two physicists Andreas Albrecht and Daniel Phillips of the University of California, the probabilities we use in our daily life and in science do not “quantify our ignorance” but instead reflect the inherently random nature of the physical world as described by quantum mechanics.

They claim that the reason we fail to determine the coin toss outcome, even after accounting for most of the classical parameters, is that we can't anticipate the collisions of air particles, in Brownian motion over the surface of the coin, which leave their invisible imprint on the instantaneous velocity and spin of the coin. Remember Heisenberg's Uncertainty Principle!!

They in fact claim that anyone who tosses a coin is actually performing the Schrödinger's Cat experiment. But rather than a cat that is both alive and dead, the quantum object in this case is a coin whose final state is heads or tails (or edge!!). Hence the outcome of the flip generally remains genuinely open until the upward face of the coin is looked at, at which point the system takes a definite value of either heads or tails.

So basically, what uncertainty stands to be is this: “Uncertainty is the manifestation of quantum chanciness.” Hence our intuitive model of uncertainty is modified as follows:

\( \text{Uncertainty} = \text{“Lack of classical mechanical information”} + \text{“Quantum phenomenon”} \).

This is also the point where classical probabilities branch off in their respective directions. I hope our interested readers will find themselves hanging on one of these branches. Will you toss a coin to choose a branch?? What do you think!!


Food For Thought

If you have solved the problem left as a thought exercise in “Edgy Tossing”, then take some rest and think again!!

Suppose the chance of the coin landing on its edge is \(\frac{1}{3}\). How thick do you think the coin should be?! Do you have any idea?!

You can share them with us perhaps !!

Share your thoughts with us, and stay tuned, as next week is “Geometric Probability week”, and we hope to make the most of it!! Be ready to be perplexed!

References

  1. Dynamical Bias in the Coin Toss – Persi Diaconis, Susan Holmes and Richard Montgomery
  2. Ten Great Ideas About Chance – Persi Diaconis & Brian Skyrms
  3. The Quantum Coin Toss – Edwin Cartlidge, PhysicsWorld
  4. Problems in the Theory of Probability – Sevastyanov, Chistyakov & Zubkov
  5. Fifty Challenging Problems in Probability – Frederick Mosteller
  6. Special thanks to my friends Avishek Dutta and Soham Ghosh (also the co-writer of this blog) for some discussions which turned out to be productive while writing this article.




An Unexpected Correspondence and some Unfinished Games | Cheenta Probability Series

Humans revolutionized and extended the limits of their perception of natural phenomena when they started thinking about chance. We already know what crucial roles chance plays when we cross the road in busy traffic, or while playing a game of 29 (a card game), when you show a card expecting your opponent not to show the joker (the card with the highest points) of the same colour. Hold on! Did I say “expecting”?

So how about discussing how we actually quantify what we expect when uncertainty is playing her mighty tricks? Let's present two very fundamental problems on mathematical expectation, among the earliest documented problems on mathematical expectations or probabilistic means, encountered by two of the most significant mathematicians of that century. After a little background story, we will finally converge on the problems and present the intuitions and ideas the two greats developed to solve them.

Letters conveying Thoughts

It was in 1654, 78 years after the death of Cardano (the man who first developed the mathematics of probability) and 9 years before Cardano's work on games of chance was published posthumously, that a correspondence of letters started between two mathematicians exchanging ideas on how to solve some gambling problems; and, as it happens, they both ended up setting the essential rules of the subject.

Hence these letters exhibit the first substantial work in the mathematics of probability. These two greats showed how seemingly complex problems can be reduced to straightforward calculations when the required subtlety has not escaped the eyes of the solver.

Blaise Pascal

The main aspects they focused on were fairness and expectation, or probabilistic means.

These two were, namely, Blaise Pascal and Pierre de Fermat, who formally defined the present form of mathematical expectation while building the idea of fairness in an uncertain game. They mainly exchanged thoughts on two problems that Pascal encountered when asked by his Parisian gambler friend Antoine Gombaud, the Chevalier de Méré. Fermat, on the other hand, mainly known for his interest and work in number theory, was getting inclined towards the idea of tackling uncertainties, and hence immediately replied to the letters he received from Pascal explaining the problems and possible approaches.

Before putting forward the problems, and as justification for the title I gave this article: this correspondence, which started seemingly voluntarily, was very unexpected indeed, and itself a matter of chance, for in the rest of his life (as a problem solver) Fermat never collaborated, nor published any of his works. Once Pascal pressed him to publish some of his work, and Fermat replied: “Whatever of my work is judged worthy of publication, I do not want my name to appear there.” Such a secretive man was Fermat. Though, ironically and very rightfully, there exist many theorems under the name “Fermat” in present-day mathematics. So we can claim that this collaboration, however successful it turned out to be, remains one of the most unexpected and unlikely collaborations in the history of the evolution of mathematics; and the man who induced this unlikely event was Father Mersenne, after whom the Mersenne primes are named.

Pierre de Fermat

So, What are the Problems?

There are basically two problems, as follows.

Problem of dice – A player will roll a die 8 times, and he needs to throw a 6 within these 8 throws. Now 3 throws have been made, none of them resulting in a 6, and a stake has been settled. So, what proportion of the stake would it be fair to give the player to forgo his fourth throw (only the fourth)?

And the second problem is a bit more complex, and funny in its own way.

Problem of points – Gombaud and another player were involved in a game of dice rolling, and the rule of the game was that the first to reach a certain number of points wins and collects the whole stake. They started playing, and after a certain number of rounds, as it happens, the gamblers got into an argument and ended up in front of Pascal, demanding a fair division of the stake, since they were not interested in completing the game and wanted to put a stop to their argument.

So basically, Pascal needed to calculate each player's chance of winning the game had they continued, and to divide the stake according to the probabilities of winning of the individual players, assuming that both players have the same probability of winning each point.


Pascal reaches out to Fermat

Observe that both questions deal with a common requirement: fair judgement. But probabilities in those days were based mainly on the intuition and experience of gamblers (remember, Cardano was himself a gambler, and the idea of probability to him was just a survival strategy). Pascal, who had no experience of gambling, reached out to Fermat to share his ideas, and they ended up employing the concept of expectation to answer the question of fairness.

The expected value of a gamble that pays off \(S(x)\) (say) in outcome \(x\) is the probability-weighted average, or probabilistic mean, defined as

\(\text{expectation}(S) = S(x_1)p(x_1) + S(x_2)p(x_2) + \cdots\), where \(p(x_1), p(x_2), \ldots\) are the individual probabilities of the outcomes \(x_1, x_2, \ldots\) respectively.

They both argued that “a transaction that leaves the player's expected value unchanged is fair.” For example, consider flipping a fair coin: if it comes up heads you win 1, and if tails turns up you lose 1. Then the expected value is \((+1)\frac{1}{2} + (-1)\frac{1}{2} = 0\).

Fermat rectifies Pascal

In the letters exchanged between the two mathematicians, it is found that Pascal at first made an error: he claimed that each throw the player lost earned him \(\frac{1}{6}\) of the remaining stake, by which theory the player should earn \(\frac{125}{1296}\) of the total stake if he agrees to forgo his fourth throw!! But Fermat explains that he is mistaken, and provides the solution himself.

Fermat writes,

“…you proposed in the last example in your letter (I quote your very terms) that if I undertake to find the six in eight throws, and I have thrown three times without getting it, and if my opponent proposes that I should not play the fourth time, and if he wishes me to be justly treated, it is proper that I have \(\frac{125}{1296}\) of the entire sum of our wagers.

This is however not true by my theory. For in this case, the first three throws having gained nothing for the player who holds the die, the total sum thus remaining at stake, he who holds the die and who agrees not to play his fourth throw should take \(\frac{1}{6}\) as his reward. And if he has played his fourth throw without finding the desired point, and if they agree that he shall not play the fifth time, he will nevertheless have \(\frac{1}{6}\) of the total for his share. Since the whole sum stays in play, it not only follows from the theory, but indeed from common sense, that each throw should be of equal value.”

So basically, Fermat claimed that in order to conduct a fair game, the expected value of the game must not change according to whether a player forgoes a round, taking a proportion of the stake, or continues to play the game.

Let us explain Fermat's argument more elaborately and mathematically, considering two cases.

Case 1 – Suppose the stake \(s\) is settled after 3 throws (a 6 came in none of them), and our friend is left with 5 throws. If he plays the 4th round, he expects to win the stake \(s\) with probability \(\frac{1}{6}\) (the probability of getting a six), or to lose the 4th throw with probability \(\frac{5}{6}\) (the probability of getting anything but a 6) and then win the stake \(s\) in one of the 4 remaining rounds with probability \(1-(\frac{5}{6})^4\) (the probability of at least one six in those 4 throws). So, mathematically, the expectation becomes

\(\frac{1}{6}s+ \frac{5}{6}(1-(\frac{5}{6})^4)s \) .

Case 2 – Suppose the player forgoes his fourth throw for \(\frac{1}{6}\) of the stake. Then, according to Fermat's suggestion in the letters, his expectation is

\(\frac{1}{6}\) of the stake \(s\), due to his forgoing of the 4th round, plus the remaining \(\frac{5}{6}\) of \(s\) with probability \(1-(\frac{5}{6})^4\) (the probability of winning in the remaining 4 rounds). Hence, in forgoing his fourth throw the player expects

\( \frac{1}{6}s +(1-(\frac{5}{6})^4)\frac{5}{6}s \).

So, clearly, in both cases the expectation of the player remains the same, as stated by Fermat; hence \(\frac{1}{6}\) of the stake is the fair price for forgoing the fourth throw. Using Fermat's argument one can easily generalize this solution, and I leave that task to the readers.
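A quick sanity check of Fermat's claim with exact rational arithmetic (a sketch; the variable names are mine): the player's expectation if he simply keeps playing equals the value of taking \(\frac{1}{6}\) now with \(\frac{5}{6}\) of the stake still in play.

```python
from fractions import Fraction

q = Fraction(5, 6)                                 # chance of missing a six on one throw

keep_playing = 1 - q**5                            # a six somewhere in the 5 remaining throws
forgo_fourth = Fraction(1, 6) + q * (1 - q**4)     # 1/6 of the stake now, 5/6 still in play

print(keep_playing, forgo_fourth)                  # both 4651/7776, so the deal is fair
```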

Fermat gets the key once again

The second problem, the “problem of points”, is also an expected-value problem, with roots going back to 1494, when Fra Luca Pacioli considered a problem where the play is complete at 6 points, one player having won 5 points and the other 3. Pacioli argued that the fair division would be in proportion to the rounds already won, 5 to 3. About 50 years later, Tartaglia objected that according to this rule, if the game were stopped after 1 round, the person winning that single round would take the whole stake!! After objecting, he tried the problem himself and ended up puzzled by a perplexity that later also baffled the likes of Cardano and the Chevalier de Méré.

But Fermat immediately got the key once again, and stated more generally that if Player 1 needs \(p\) more points and Player 2 needs \(q\) more points to win the game, then the result of the game can be determined by playing at most \(p+q-1\) further rounds. Then he used a very important step, which to this day remains a very efficient technique for problem solvers: he mapped each of the \(p+q-1\) rounds to be played onto the tosses of a fair coin, tossed \(p+q-1\) times, each toss representing one round of the game. This is called a bijective mapping in mathematics: you just look at the outcomes of a phenomenon which has exactly identical outcomes to the event you are interested in. Here he basically mapped win and loss onto the two faces of the fair coin, and since we assumed that each player has an equal probability of winning each round, it is just like a win (head) with probability \(\frac{1}{2}\) and a loss (tail) with probability \(\frac{1}{2}\). Hence the problem reduces to a set of events with equiprobable outcomes, and this is the subtlety I was talking about while explaining the nature of, and trick to, the problem.

So, if we consider Pacioli's problem, where Player 1 has 5 points and Player 2 has 3, then since 6 points conclude the game, 3 (= 1+3−1) more rounds will suffice. Now, if we think of tossing 3 fair coins, we will have 8 (\(2^3\)) equiprobable outcomes; but Player 2, who needs 3 points, must win all three rounds, that is, he/she must get the (win, win, win) (or (heads, heads, heads)) outcome, and the chance of all 3 wins (heads) is \(\frac{1}{8}\), which makes the chance of Player 1's win \(\frac{7}{8}\) (i.e., \(1-\frac{1}{8}\)). Hence Player 1's expectation of the stake becomes \(\frac{7}{8}\) of the stake.
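Fermat's mapping is easy to mechanize: enumerate all \(2^{p+q-1}\) equally likely coin sequences and count those favourable to Player 1 (a sketch; the function name is mine).

```python
from fractions import Fraction
from itertools import product

def p1_wins(p, q):
    """Player 1 needs p points, Player 2 needs q; each round is a fair coin.
    Enumerate the p+q-1 rounds that always settle the game."""
    rounds = p + q - 1
    favourable = sum(1 for seq in product((0, 1), repeat=rounds)
                     if sum(seq) >= p)             # 1 = a round won by Player 1
    return Fraction(favourable, 2 ** rounds)

print(p1_wins(1, 3))   # Pacioli's game: 7/8 for the player sitting on 5 points
```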

Here Fermat just magnified his view of the outcomes of each round, and since the outcomes are symmetric, he could extend his observation over a whole set of rounds played. In modern probability theory we use this method, which we sometimes call the fundamental bridge: it breaks a random variable (here, the points scored by each player after a set of rounds has been played) into some finite number of indicator random variables (here, 1 point scored if one wins a round, 0 if one loses). This technique helps tremendously in finding complex expectations, even when you don't know the distribution of the random variable. Hence this work of Fermat initiated the first step towards erecting the pillars of the fundamental bridge in probability theory.

Pascal Generalizes with Pattern

Imagine Fra Pacioli's problem in a modified setup, where, in a game in which a win comes at 6 points, Player 1 has no points and the other has only 1 point. So, by Fermat's argument, the players need to play at most 10 (= 6+5−1) rounds. Now, using the fundamental bridge, tossing a fair coin 10 times leaves us with 1024 (\(2^{10}\)) equally likely outcomes, which is no longer possible to count like the previous one. Obviously, we don't have all day!! But Pascal had an idea.

To count the cases in which one of the players wins at least 6 rounds: she can win exactly 6 out of 10 trials in \({10 \choose 6}\) ways, exactly 7 out of 10 trials in \({10 \choose 7}\) ways, and so on, up to 10 wins out of 10 trials in \({10 \choose 10}\) ways. These numbers can be found lying on the 10th row of Pascal's triangle: that row tells us the number of ways we can choose “wins” from a group of 10 outcomes (of “wins” and “losses” only).

Pascal’s Triangle- The temple of Patterns
(observe row 10 for the given problem)

In this particular problem, Pascal needed the number of ways of getting 6 wins in 10 trials + 7 wins in 10 trials + ... + 10 wins in 10 trials, which from Pascal's triangle is found to be 210+120+45+10+1 = 386.

So Player 1 wins with probability \(\frac{386}{1024}\) (which is about 38%); from here, calculating the fair division has already been explained earlier.
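The same count can be read off as a sum of binomial coefficients, i.e., row 10 of Pascal's triangle (a sketch using Python's math.comb):

```python
from fractions import Fraction
from math import comb

ways = sum(comb(10, k) for k in range(6, 11))   # 210 + 120 + 45 + 10 + 1
print(ways, Fraction(ways, 2 ** 10))            # 386 and 193/512, about 37.7%
```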

These solutions and generalizations by Pascal and Fermat opened the gates to handling equiprobable cases using combinatorial principles, and provided insight into the fairness of a game and the measurement of mathematical expectation, which later turned out to be the most important and fundamental measure, besides probability itself, in the theory of statistics and probability.

Pascal gives the finishing touch

Before concluding, I just want to shed light on another approach that Pascal provided to solve a modified version of the “problem of points”; without discussing this beautiful approach, I think this article on “probabilistic means” would remain incomplete.

Pascal proposed a problem, and a possible approach to it, to Fermat in one of his letters:

“Let us suppose the first of them has two (points) and the other one. They now play one throw of which the chances are such that if the first wins, he will win the entire wager that is at stake, that is to say 64 pistoles. If the other wins, they will be two to two, and in consequence, if they wish to separate, it follows that each will take back his wager, that is to say 32 pistoles.

Consider then, Monsieur, that if the first wins, 64 will belong to him. If he loses, 32 belong to him. Then if they do not wish to play this point and separate without doing it, the first would say: ‘I am sure of 32 pistoles, for even a loss gives them to me. As for the other 32, perhaps I will have them and perhaps you will have them; the risk is equal. Therefore let us divide the 32 pistoles in half (16 each) and give me the 32 of which I am certain besides.’ He will then have 48 (32+16) pistoles and the other will have 16.”

Pascal basically used the situation of the potential draw to create a definition of fairness that is very difficult to argue against. He finely differentiated between the part of the stake that surely belongs to the leading player, and the part of the stake that still needs to be partitioned over the uncertainty.
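Pascal's splitting argument is exactly a backward recursion, and it is worth seeing it run. Here is a minimal sketch (the function name and base cases are my framing of his argument): average the two equally likely futures of one more round.

```python
from fractions import Fraction

def fair_share(p, q, stake=Fraction(64)):
    """Fair share of the player needing p more points against one needing q,
    when every round is won with probability 1/2 by either side."""
    if p == 0:
        return stake                 # this player has already won everything
    if q == 0:
        return Fraction(0)
    return (fair_share(p - 1, q, stake) + fair_share(p, q - 1, stake)) / 2

print(fair_share(1, 2), 64 - fair_share(1, 2))   # 48 and 16 pistoles, as Pascal argues
```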

Pascal further extends his idea to solve the more general problem, where, say, one player is leading by 2 points (or by \(p\) points, more generally), and they play for a point such that if the leader wins, he takes the lot. Can you find the fair share if they do not wish to play further rounds? What do you think the fair share would be if the leader loses the round and is now leading by just a point?? Are you thinking of recursing on the argument Pascal provided for the “2 to 1” case? Well, that's what Pascal himself did; try completing the rest yourself!!

Pascal's method of recursion was the final touch of artistry in the solutions to these problems and in the definition of fairness and its relation to probabilistic means. This method of calculation makes the solution more general, and it lives on today in solving problems on conditional expectation. And since recursion involves much insight into recognizing the patterns inherent in a problem associated with seemingly random outcomes, Pascal's procedure for calculating probabilistic means still attracts plenty of problem solvers and probability fanatics inclined to the mathematical aspects of quantifying chance. The method helps in solving problems involving games and patterns in the outcomes of a sequence of coin tosses, which have extensive applications in many fields, from physics to genetics. Coin tossing is itself a topic that demands discussion in its own right; let's keep it for another day.


Food For Thought

I often find that thinking over mistakes and disputes in arguments helps us gain more insight into the subtle truth that is puzzling us. So I suggest:

Trace back to the section “Fermat rectifies Pascal”: can you explain where Pascal went wrong in his argument, which I left unexplained for you to figure out?!

Share your thoughts with us, and stay tuned to go out for some “Random Walks” with Soham Ghosh in the next post. Be ready to be perplexed!

References

  1. Ten Great Ideas About Chance – Persi Diaconis & Brian Skyrms
  2. Pascal and Fermat on Probability – Vera Sanford
  3. Do Dice Play God? – Ian Stewart
  4. Fermat's Last Theorem – Simon Singh




Measuring Chances and Coincidences | Cheenta Probability Series

Let’s explore coincidence statistics!

(10 min read)

Coincidence Statistics – A non-formal approach to probability!

As Maxwell puts it straight : “The true logic of this world is in the calculus of probabilities.”

When we talk literature, we mean literature only. No need to bring up bookish terms to define probability! To put this in a simpler way, think of the following scenario:

Suppose some dark night a policeman walks down a street, apparently deserted; but suddenly he hears a burglar alarm, looks across the street, and sees a jewellery store with a broken window. Then a gentleman wearing a mask comes crawling out through the broken window, carrying a bag which turns out to be full of expensive jewellery. The policeman doesn't hesitate at all in deciding that this gentleman is dishonest. But by what reasoning process does he arrive at this conclusion?

OK, now you get it, right?? Probability is the science of this intuition or, in stricter terms, of deductive and plausible reasoning.

Time to get formal now, how can we measure probability??

It was as late as the 16th and 17th centuries that scientists actually realized that chance (probability) can be measured. The Greeks even had a goddess of chance, Tyche. Several contemporary philosophers tried to give an empirical measurement of chance, but it appears that there was no quantitative theory of chance in those times.

Consider the analogous problem for another quantity, say length. How would you measure length? Of course, it is silly to ask this question now, but this was a valid problem in the 16th century.

You find a standard of equal length, apply it repeatedly and count. The standard might be your foot, as you pace off a distance. One such approach was made in 1522, etched eternally in history books, known as the “lawful rood”.

The idea was to line up the feet of 16 people as they emerged from the church, as shown in the figure below:

Determination of the lawful rood

The various folks have different foot lengths, but an implicit averaging effect was accepted by the group, even though the explicit term “average” was non-existent at that time.

However, there is an objection: a kind of circularity is involved in the process. We are defining length, but we are already assuming that our standard remains the same length as we step off the distance.

Eventually, we refine our notion of length. Your foot may change length; so may the rod; so may the standard meter stick, at a fine enough precision. We refine the measurement of length using physics. The circularity is real, but it indicates a path for refinement rather than a fatal objection.

The same thing happens with probability. To measure probability, we first find or make equally probable cases and then count them.

We speak of probability only for observations that we contemplate being made in the future. By the “probability” of a particular outcome of an observation we mean our estimate for the most likely fraction of a number of repeated observations that will yield that particular outcome. If we imagine repeating an observation—such as looking at a freshly tossed coin— \( N \) times, and if we call \( N_{A} \) our estimate of the most likely number of our observations that will give some specified result A, say the result “heads,” then by P(A), the probability of observing A, we mean \(P(A)=\frac{N_A}{N}\)

First of all, we may speak of a probability of something happening only if the occurrence is a possible outcome of some repeatable observation.

We should emphasize that \( N \) and \(N_A \)  are not intended to represent numbers based on actual observations. \(N_A \)  is our best estimate of what would occur in \( N \)  imagined observations. Probability depends, therefore, on our knowledge and on our ability to make estimates. In effect, on our common sense! Fortunately, there is a certain amount of agreement in the common sense of many things, so that different people will make the same estimate. Probabilities need not, however, be “absolute” numbers. Since they depend on our ignorance, they may become different if our knowledge changes.

The Birthday Paradox

Reiterating Diaconis' words, coincidences occur to all of us. The birthday paradox has emerged as a useful tool for setting standards of surprise.

In a room of just 23 people, did you know there's a 50-50 chance of at least two people having the same birthday?

This amazingly counter-intuitive result is actually called a paradox because our brain hates processing compounding exponents. We expect probabilities to be linear and consider only the scenarios we're involved in.

Let's investigate the case of \(N\) people and \(C\) categories. (We consider days as the categories here, i.e., \(C = 365\).)

We make a few assumptions before tackling this. First of all, we assume the births of all individuals to be independent of each other (i.e., we drop the case of twins), and for simplicity we assume a non-leap year, which means we deal with only 365 days.

So, we basically have \( N \) people, each independently and uniformly distributed in \( \{1,2,..,C\} \).

Now, if \(N = C+1\), is there a chance that all these numbers are distinct? Of course not (simply by the Pigeonhole Principle).

Call the chance of all these numbers being distinct as \(P(C,N)\).

So, see that \(P(C,N) = 1\left(1-\frac{1}{C}\right)\left(1-\frac{2}{C}\right)\cdots\left(1-\frac{N-1}{C}\right)\).

For a simpler approximation: when \(N = 1.2\sqrt{C}\), the chance of a match is close to \(\frac{1}{2}\).

The basis of this approximation is the proposition that when \(N\) and \(C\) are large and \(\frac{N}{C^{2/3}}\) is small, the chance of no match is \(P(C,N) \sim e^{-\frac{N(N-1)}{2C}}\).

The proof is quite simple, just see that \(\log(1-x) \sim -x \) when \(x\) is small.

Thus, \(P(C,N)=e^{\sum_{i=1}^{N-1} \log(1-\frac{i}{C})} \sim e^{-(\sum_{i=1}^{N-1} \frac{i}{C})}=e^{-\frac{N(N-1)}{2C}} \)

So, guess what happens if \(N = 75\) and \(C = 365\)?

Voila! There's almost a 99.9% chance of at least two people matching.
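You can verify both the exact product and the exponential approximation in a few lines (a sketch; the function names are mine):

```python
import math

def p_no_match(C, N):
    """Exact P(C, N): all N birthdays distinct among C equally likely days."""
    p = 1.0
    for i in range(N):
        p *= 1 - i / C
    return p

def p_no_match_approx(C, N):
    """The e^{-N(N-1)/(2C)} approximation derived above."""
    return math.exp(-N * (N - 1) / (2 * C))

for N in (23, 75):
    print(N, 1 - p_no_match(365, N), 1 - p_no_match_approx(365, N))
# N=23 gives a match chance of ~0.507; N=75 gives ~0.9997
```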

Exponential decrease of P(no match) with increasing N

The birthday paradox has turned the tables in cryptography and, in a more generic sense, applies to hash functions: the expected number of N-bit hashes that can be generated before getting a collision is not \(2^N\), but rather only about \(2^{\frac{N}{2}}\). This is exploited by birthday attacks on hash functions.
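To see the square-root behaviour concretely, here is a toy birthday attack on a deliberately weakened hash (SHA-256 truncated to 32 bits; the truncation and the counter inputs are my choices for the demo):

```python
import hashlib
from itertools import count

def weak_hash(msg: bytes, bits: int = 32) -> int:
    """SHA-256 truncated to `bits` bits, standing in for a weak N-bit hash."""
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") >> (256 - bits)

seen = {}
for i in count():
    h = weak_hash(str(i).encode())
    if h in seen:
        print(f"collision after {i} hashes: inputs {seen[h]} and {i}")
        break
    seen[h] = i
# Typically terminates after a number of hashes on the order of 2**16 = 65536,
# not 2**32: exactly the 2^(N/2) birthday bound described above.
```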

Food For Thought

Let's think of an extension to the birthday problem. Suppose there are two types of people, say \(m\) men and \(n\) women. Then find the probability of no shared birthday between any man and any woman.

A possible hint: you will require Stirling numbers of the second kind.

You may also use the Birthday problem to study multiple coincidences. As an example, how large should \(N\) be to have approximately even odds of a triple birthday match?

Till then, keep learning, keep thinking and keep innovating!

Stay tuned for the next post by Uttaran Chatterjee, I guarantee you that would be a treat!

References:

1. Ten Great Ideas About Chance – Persi Diaconis & Brian Skyrms

2. The Feynman Lectures on Physics (the chapter on probability) – Feynman, Leighton & Sands

3. Mathematical Recreations and Essays – Ball & Coxeter

Check out another blog from this Probability Series: An Unexpected Correspondence and some Unfinished Games.