
Conference rusure::math

Title: Mathematics at DEC
Moderator: RUSURE::EDP
Created: Mon Feb 03 1986
Last Modified: Fri Jun 06 1997
Last Successful Update: Fri Jun 06 1997
Number of topics: 2083
Total number of notes: 14613

718.0. "Brain Bogglers, 7/87 Discover" by COMET::ROBERTS (Dwayne Roberts) Wed Jun 17 1987 01:03

    The following five probability puzzles come from the July 1987 issue
    of Discover magazine.  I'll post Discover's answers later as a reply
    to this note.
    
    Probability puzzles are notoriously difficult to solve.  The ones
    below include several that fascinated--and even stumped--some of
    the eighteenth-century scientists who explored the mathematics of
    chance.
    
    1.	This problem was proposed by Swiss mathematician Nikolaus Bernoulli
        to Pierre Montmort, author of a famous book on probability.
        Montmort concluded the problem was quite difficult, but Isaac
        Todhunter, an English mathematician of a century later, called
        it obvious.  You decide:  A and B play the following dice game.
        A deposits one crown to start the pot, and then the two players
        take turns rolling a single die, with B rolling first. If B
        rolls an even number, he wins a crown; if he rolls an odd number,
        he must add another crown to the pot. A also collects a crown
        for each even number he rolls, but he doesn't deposit anything
        if he rolls an odd number. The game continues until the pot
        is exhausted. Is this game fair?
        
    2.	The Swiss physicist and mathematician Daniel Bernoulli, the
        cousin of Nikolaus, says he'll pay you a sum of money that's
        yet to be determined. Then he will flip a coin until the first
        head turns up, at which point the game is ended and you must
        pay him a sum determined by the following: if the head appears
        on the first toss, you pay $1; on the second, $2; on the third,
        $4; on the nth, $2^(n-1). How much money should Bernoulli give
        you in advance so that this game will be fair?
        
    3.	In his classic treatise, The Doctrine of Chances, Abraham De
        Moivre (1667-1754) describes this problem: A and B shuffle
        identical decks of cards and then deal them out simultaneously,
        one card at a time. Every time the two players deal identical
        cards, A must give B a guinea. How much money must A ask from
        B before the shuffles to make the game fair?
        
    4.	Randomly choose any two positive integers, a and d, so that
        a<d. Then randomly choose b and c, any two positive integers
        between a and d, so that a<b<c<d. Is the probability that b+c
        is an odd number greater than one-half, less than one-half,
        or exactly one-half?
        
    5.	At a carnival, players throw quarters onto a board checkered
        with squares 40 millimeters on a side. If a quarter touches
        a boundary, it's lost. If it rolls off the board, it's returned.
        But if it lies wholly within a square, the player wins his quarter
        back plus 75 cents. If a quarter is 24 millimeters in diameter,
        what's the expected return from tossing 100 quarters?
        
Replies (reply number, title, user, personal name, date, lines):
718.1  CLT::GILBERT "eager like a child"  Wed Jun 17 1987 12:57  (7 lines)
Interesting problems!


    I vote that problem 1 is obvious!

    Problems 2 and 5 are straightforward.  Problem 4 is pretty simple;
    it's odd with probability > 1/2.  Problem 3 looks nasty.
718.2  BEING::POSTPISCHIL "Always mount a scratch monkey."  Wed Jun 17 1987 13:14  (10 lines)
    Re .1:
    
    You are mostly correct.  One is obvious, and the rest are
    straightforward.  But four is not defined, and three is easy because it
    has been solved already.  If four were defined by selecting random
    integers from 1 to N, and taking the limit of the probability as N
    increases without bound, the answer would be exactly 1/2. 
    
    
    				-- edp 
718.3  "thoughts on problem 3"  VINO::JMUNZER  Wed Jun 17 1987 14:12  (7 lines)
    	Does it matter how many cards are in the deck?
    	Does this relate to the chance of two people having the same
    	   birthday (N people are all asked their birthdays)?
    	If you want to see the probability, it's in Note 304 -- anyone
    	   care to prove it's correct?
    
    John
718.4  "it really *is* greater than 1/2"  CLT::GILBERT "eager like a child"  Wed Jun 17 1987 15:48  (32 lines)
718.5  BEING::POSTPISCHIL "Always mount a scratch monkey."  Wed Jun 17 1987 17:12  (12 lines)
    Re .4:
    
    > We must have a+2<d (since b and c must fit between them), but
    > there are *no* other constraints on a and d; . . .
        
    According to the problem, another constraint is that a and d were
    chosen randomly from the positive integers.  Given some definition of
    what this means, I can prove the probability is 1/2.  For example, you
    assumed N is finite.  It cannot be expected to be finite.
    
    
    				-- edp 
718.6  "How do you tell which integers you have chosen?"  AKQJ10::YARBROUGH "Why is computing so labor intensive?"  Wed Jun 17 1987 19:40  (18 lines)
>    According to the problem, another constraint is that a and d were
>    chosen randomly from the positive integers.  Given some definition of
>    what this means, I can prove the probability is 1/2.  For example, you
>    assumed N is finite.  It cannot be expected to be finite.

Eh? N is the difference between two SELECTED (from an infinite population) 
integers a and d. But both a and d must be finite, therefore their 
difference is finite. 

Another way of saying this is that the desired probability approaches 1/2 
from above as N increases and is never =1/2 for any finite values of a and d.
========================================================================

Problem 5: The player wins only if the center of his coin falls within a 
16x16 square in the center of the 40x40 square, so the prob. of winning is 
16x16/40x40, or 4/25.  On each throw he can expect to win 4/25*$0.75 - 
21/25*$0.25, or about -$0.09, losing nine cents a toss.  In reality it's 
worse than that, depending on how thick the lines are! 
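
A quick Monte Carlo check of that nine-cent figure -- a rough Python sketch,
assuming the quarter's center lands uniformly in one 40x40 square and ignoring
tosses that roll off the board (they are returned anyway); the function and
parameter names are just illustrative:

    import random

    def expected_return_per_toss(trials=1_000_000, square=40.0, radius=12.0):
        """Average net winnings per toss, in dollars."""
        total = 0.0
        for _ in range(trials):
            x = random.uniform(0.0, square)
            y = random.uniform(0.0, square)
            # The coin lies wholly inside the square iff its center is at
            # least one radius away from every edge.
            if radius < x < square - radius and radius < y < square - radius:
                total += 0.75      # quarter returned plus 75 cents
            else:
                total -= 0.25      # quarter lost
        return total / trials

    print(expected_return_per_toss())   # about -0.09
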
718.7  BEING::POSTPISCHIL "Always mount a scratch monkey."  Wed Jun 17 1987 20:13  (10 lines)
    Re .6:
    
    > But both a and d must be finite, therefore their difference is
    > finite.
    
    a and d cannot be expected to be finite, either.  That is a consequence
    of choosing integers "randomly".
    
    
    				-- edp
718.8  CLT::GILBERT "eager like a child"  Thu Jun 18 1987 01:12  (32 lines)
re: .6

    This is beginning to sound silly!

re: problem 4

    I figured there would be some question about random integers chosen
    from an infinite range.  Let's suppose that 1<=a,d<=M, determine
    the probability that b+c is odd, and *then* take limit as M goes to
    infinity.  For even M, what I get for prob(b+c is odd) is:

	                          M/2-1
	      5 - M/2 + (2*M-3) *  Sum  1/(2*i-1)
	                           i=2
	1/2 + -----------------------------------
	                (M-2) * (M-3)

    Presuming that this is correct (ha!), an approximation for large M is:

	1/2 + ln(M)/M.

    And so the limit as M goes to infinity is 1/2.  The following may suggest
    why it converges so slowly:  Where the difference between a and d is small,
    the probability is significantly greater than 1/2, and it all adds up.

    Fortunately, someone else can give a real proof that the limit is 1/2.
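
    For anyone who wants to check this numerically, here is a rough Python
    sketch that computes prob(b+c is odd) exactly for finite M by
    conditioning on m = d-a-1, the number of integers strictly between a
    and d (the helper name is just illustrative):

        from fractions import Fraction

        def prob_b_plus_c_odd(M):
            """Exact P(b+c odd) when a,d are uniform on 1..M with d >= a+3,
            and b,c are then uniform among pairs with a < b < c < d."""
            total, pairs = Fraction(0), 0
            for m in range(2, M - 1):          # m = d - a - 1
                weight = M - m - 1             # number of (a,d) pairs with this gap
                # Among m consecutive integers, floor(m/2)*ceil(m/2) of the
                # C(m,2) pairs have an odd sum.
                p = Fraction((m // 2) * ((m + 1) // 2), m * (m - 1) // 2)
                total += weight * p
                pairs += weight
            return total / pairs

        for M in (10, 100, 1000, 10000):
            print(M, float(prob_b_plus_c_odd(M)))   # drifts down toward 1/2 from above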

re: problem 3 and .3

    Unfortunately note 304 is considering a different problem.  I think
    there's a simple answer to the problem, but I don't see a simple proof
    that the answer's correct.
718.9  "More on problem three."  ZFC::DERAMO "Daniel V. D'Eramo"  Thu Jun 18 1987 12:26  (56 lines)
>> .0    3.  In his classic treatise, The Doctrine of Chances, Abraham De
>> .0        Moivre (1667-1754) describes this problem: A and B shuffle
>> .0        identical decks of cards and then deal them out simultaneously,
>> .0        one card at a time. Every time the two players deal identical
>> .0        cards, A must give B a guinea. How much money must A ask from
>> .0        B before the shuffles to make the game fair?

     After A deals his cards, number them from 1 to n.  Then B's
     deal is just a permutation of the numbers 1 to n, and A
     must pay to B the number of places where the permutation
     "matched" the original ordered sequence.  To make the game
     fair A must ask for the expected number of matches.

     Note 304 deals with handing out n hats randomly to n men,
     and asks for the probability that anyone gets the correct
     hat.  Number the hats and the men from 1 to n, and you have
     problem three again, but with a different question asked.

     Problem three asked essentially for the expected number of
     matches and 304 asked for the probability of any matches.
     It was stated without proof in the replies to 304 that the
     expected number of matches was always one.  [And that the
     probability of no matches approached 1/e as n increased
     without bound.  I have seen a proof of this part, but
     not of the claim about the expected number of matches.]
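
     Both claims can be checked by brute force for small n -- a rough
     Python sketch (it enumerates every permutation, so keep n small):

        from itertools import permutations

        for n in range(1, 8):
            perms = list(permutations(range(n)))
            match_counts = [sum(p[i] == i for i in range(n)) for p in perms]
            expected = sum(match_counts) / len(perms)            # expected number of matches
            p_none = sum(c == 0 for c in match_counts) / len(perms)
            print(n, expected, round(p_none, 4))
        # The expectation prints as exactly 1.0 for every n, and P(no match)
        # settles toward 1/e ~ 0.3679 as n grows.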

>> .3        Does it matter how many cards are in the deck?

     According to 304.2, no.  A should ask for one guinea, no
     matter how many cards are in the deck, in order that the
     game be fair.

>> .3        Does this relate to the chance of two people having the same
>> .3           birthday (N people are all asked their birthdays)?

     I don't think that there is any relation here.  The birthday
     problem deals with some number [not necessarily n] of
     selections with replacement from 1 to n, and asks for the
     probability that all selections are distinct.  Problem 3 and
     304 compare a permutation of 1 to n -- exactly n selections
     with no replacement -- with the original ordering.

>> .3        If you want to see the probability, it's in Note 304 -- anyone
>> .3           care to prove it's correct?
>> .8re: problem 3 and .3
>> .8
>> .8    Unfortunately note 304 is considering a different problem.  I think
>> .8    there's a simple answer to the problem, but I don't see a simple proof
>> .8    that the answer's correct.

     Discussed above.  You are both right!

     V={dan}

     P.S.  The author of .3 was VINO::JMUNZER, and 304.0 begins with:
>> 304.0   J.Munzer tells the tale of ....
718.10  CLT::GILBERT "eager like a child"  Thu Jun 18 1987 13:36  (14 lines)
718.11  BEING::POSTPISCHIL "Always mount a scratch monkey."  Thu Jun 18 1987 19:59  (59 lines)
    Re .8, .10:
    
    As .9 mentions, I stated in 304.2 that the mean number of matches is
    always one.  A proof follows.
    
    Let F(n,k) be the number of permutations of n objects in which k of the
    objects are unmoved.  Let f(n,k) be the probability of such a
    permutation given a uniform random selection from the permutations, so
    F(n,k) = n! f(n,k).  Also, C(n,k) is the number of combinations of k
    objects selected from n. 
    
    I believe F(n,k) = F(n-k,0) C(n,k).  And F(n,0) = sum i from 0 to n of
    (-1)^i n!/i!.  This latter gives 1, 0, 1, 2, 9, 44, 265, . . .
    However, it is not necessary to prove these.
    
    Instead, consider: 
    
    	F(n+1,k) = F(n,k-1) + (n-k) F(n,k) + (k+1) F(n,k+1).
    
    This is true because there are three types of ways to make an
    arrangement of n+1 objects with k correct from an arrangement of n
    objects: 
            
         The n objects have k-1 correct, and the new object is put at
         the correct position at the end.  (One way.)
         
         The n objects have k correct, and the new object is put at
         the correct position at the end and swapped with an object
         already in an incorrect position.  (n-k ways.)
         
         The n objects have k+1 correct, and the new object is put at
         the correct position at the end and swapped with an object
         currently in a correct position.  (k+1 ways.)
    
    Now suppose the sum for i from 0 to n of i*F(n,i) is n!, which means
    the sum of i*f(n,i) is one.  This is easily verified for small n.  What
    is the sum for i from 0 to n+1 of i*F(n+1,i)? 
    
    Consider what happens to each F(n,i).  It contributes a bit to
    F(n+1,i-1), F(n+1,i), and F(n+1,i+1).  Consider the sum for i from 0 to
    n+1 of i*F(n+1,i).  First, replace each F(n+1,i) with its equivalent in
    terms of F(n,i-1), F(n,i), and F(n,i+1) as described above.  The
    expanded sum contains extra values like F(n,-1) and F(n,n+1) which are
    zero and may be discarded.  After discarding these, consider a typical
    term containing F(n,i) for a particular i, and gather these terms
    together:
    
    	(i-1) [  i   F(n,i)] +
    	  i   [(n-i) F(n,i)] +
    	(i+1) [  1   F(n,i)]
    
    What is the sum of the three terms with F(n,i)?  Add them up to find
    they are (n*i+1) F(n,i).  Summing over all i, this is n times the sum
    of the terms i*F(n,i) plus the sum of the terms F(n,i).  The first sum
    is n! by hypothesis, and the second is also n!, since it counts all the
    permutations of n objects.  So the sum for i from 0 to n+1 of
    i*F(n+1,i) is n*n! + n! = (n+1)!, and the sum for i from 0 to n+1 of
    i*f(n+1,i) is one.
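
    A small brute-force check of the recurrence and of the identity
    (sum of i*F(n,i) = n!) for small n, as a rough Python sketch with
    illustrative names:

        from itertools import permutations
        from math import factorial

        def fixed_point_counts(n):
            """counts[k] = F(n,k), the number of permutations of n objects
            with exactly k objects unmoved."""
            counts = [0] * (n + 2)            # pad so F(n,k) = 0 reads cleanly
            for p in permutations(range(n)):
                counts[sum(p[i] == i for i in range(n))] += 1
            return counts

        for n in range(1, 7):
            f, g = fixed_point_counts(n), fixed_point_counts(n + 1)
            for k in range(n + 2):
                prev = f[k - 1] if k >= 1 else 0
                nxt = f[k + 1] if k + 1 <= n else 0
                assert g[k] == prev + (n - k) * f[k] + (k + 1) * nxt
            assert sum(k * f[k] for k in range(n + 1)) == factorial(n)
        print("recurrence and sum identity verified for n up to 6")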
    
    
    				-- edp 
718.12  "Spoilers"  COMET::ROBERTS "Dwayne Roberts"  Sat Jun 20 1987 03:07  (50 lines)
    As promised, here are Discover's answers to the five puzzles.
    
    Spoilers follow!
    
    
    1.  The game is fair to both players. Observe that the profit of one
        player must equal the loss of the other, and therefore if the game
        is fair to one player, then it must be fair to the other. Now consider
        player B. On every toss he has an equal chance to win or lose a
        crown. Clearly the game is fair to him, so it follows that the game
        must be fair to A. (A quick simulation sketch follows this list.)
        
    2.	You should get an infinite amount of money from Bernoulli. The reason
        has to do with the "expected value" of the game, which is equal
        to the sum of the individual prizes each multiplied by the probability
        of winning it. Bernoulli's chance of winning $1 is 1/2; of winning
        $2 is 1/4; of winning $4 is 1/8; etc. Therefore, his mathematical
        expected earnings are
        ($1*1/2)+($2*1/4)+($4*1/8)+...+($2^(n-1)*1/2^n)+... and so on. But
        this is equal to $1/2 + $1/2 + $1/2... which is an infinite sum.
        This famous puzzle, known as the St. Petersburg paradox, was first
        formulated by Nikolaus Bernoulli and later investigated by Daniel.
        In fact, the pay-off on most games is small. In 70,000
        computer-simulated games, I paid Bernoulli an average of only $5.30
        per game. The large pay-offs come only on rare occasions when there
        is a long string of tails before the first head.
        
    3.	One guinea. Consider the second pack to be a set of guesses as to
        the identity of each card in the first pack. Before any cards are
        drawn, the probability of correctly guessing any particular card
        in a 52-card deck is 1/52. Since the second deck represents, in
        effect, 52 guesses, the expected number of right guesses is the
        sum of 1/52 + 1/52 + 1/52 ... 52 times, or 1 card.
        
    4.	The sum b+c is more likely to be odd than even. Choose any two integers
        for a and d and enumerate all possible sums of in-between numbers,
        b and c, and you'll find there are always more odd possibilities.
        For example, if a=1 and d=7, then there are six odd sums and four
        even sums of numbers between them: 2+3=5; 2+4=6; 2+5=7; 2+6=8; 3+4=7;
        3+5=8; 3+6=9; 4+5=9; 4+6=10; 5+6=11.
        
    5.	$16. The coin will fall entirely within a square only if its center
        lands inside an imaginary square 16 millimeters on a side centered
        in the larger square. Therefore, the probability of winning is
        (16*16)/(40*40)=16 per cent. Thus, 16 of your 100 quarters will
        win, giving you a $16 payback on a $25 investment, a net loss of
        $9. (In real carnival games, players frequently play back their
        winnings, so that the operator often wins all--not just a fraction--of
        what is bet.)
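
    For the curious, here is a rough Python simulation of the pot game in
    answer 1.  It assumes play stops the instant the pot is empty (my
    reading of "until the pot is exhausted"); the names are illustrative
    only.

        import random

        def play_once():
            pot, a_net, b_net = 1, -1, 0       # A antes the opening crown
            b_turn = True                      # B rolls first
            while pot > 0:
                even = random.randint(1, 6) % 2 == 0
                if b_turn:
                    if even:
                        pot -= 1; b_net += 1   # B wins a crown from the pot
                    else:
                        pot += 1; b_net -= 1   # B must feed the pot
                elif even:
                    pot -= 1; a_net += 1       # A collects; an odd roll costs him nothing
                b_turn = not b_turn
            return a_net, b_net

        games = 200_000
        a_total = b_total = 0
        for _ in range(games):
            a, b = play_once()
            a_total += a; b_total += b
        print(a_total / games, b_total / games)   # both averages hover near zero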
        
718.13  "DISCOVER something new in arithmetic"  AKQJ10::YARBROUGH "Why is computing so labor intensive?"  Mon Jun 22 1987 13:24  (5 lines)
>        		... Thus, 16 of your 100 quarters will
>        win, giving you a $16 payback on a $25 investment, a net loss of
>        $9. ...

Eh? Last time I looked, 16 quarters was $4.
718.14  "really now (tsk tsk)"  BANDIT::MARSHALL "hunting the snark"  Mon Jun 22 1987 14:22  (10 lines)
    re .13:
    
    100 quarters = $25
    16 wins = $16
                                                   
                  /
                 (  ___
                  ) ///
                 /
    
718.15  "an alternate approach to problem 3"  VINO::JMUNZER  Fri Jun 26 1987 14:33  (27 lines)
Number the cards in the deck 1,2,...,n.  Consider a loop starting with
A's card #1 and ending with B's card #1:

	Card from A's deck	Corresponding card in B's deck

		1				a
		a				b
		b				c
				.
				.
		y				z
		z				1

The loop can be 1,2,...,n long, and each length has probability (1/n) -- e.g.

        p(loop length = 3) = ((n-1)/n) * ((n-2)/(n-1)) * (1/(n-2)) = 1/n

E(n) = expected number of matching cards with an n-card deck

     = p(loop length = 1) * (1 + E(n-1))		   /* card #1 matches */
      + sum, from i=2 to i=n-1, of p(loop length = i) * E(n-i)  /* it doesn't */

     = 1/n * (1 + sum, from j=1 to j=n-1, of E(j))

Induction yields E(n) = 1, all n > 0.  [E(1) = 1; E(n) = 1/n * (1 + n-1) = 1.]

John
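
A quick brute-force check of the two facts the induction leans on -- that
each loop length 1..n has probability 1/n, and that E(n) = 1 -- as a rough
Python sketch for small n (names are illustrative):

    from itertools import permutations
    from collections import Counter

    def loop_length(p):
        """Length of the loop that starts at card #1 (index 0)."""
        length, j = 1, p[0]
        while j != 0:
            j = p[j]
            length += 1
        return length

    for n in range(2, 8):
        perms = list(permutations(range(n)))
        lengths = Counter(loop_length(p) for p in perms)
        assert all(c == len(perms) // n for c in lengths.values())   # each length: prob 1/n
        expected = sum(sum(p[i] == i for i in range(n)) for p in perms) / len(perms)
        assert abs(expected - 1.0) < 1e-9                            # E(n) = 1
    print("loop lengths uniform and E(n) = 1 for n up to 7")
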
718.16  "something seems wrong (Bernoulli puzzle)"  VIDEO::OSMAN "type video::user$7:[osman]eric.six"  Tue Jun 30 1987 19:36  (26 lines)
    The answer to this puzzle does not seem right.
    
    The answer claims that no matter how much money is anted, the initial
    receiver of the money is at a loss.
    
    But it just doesn't sound right.
    
    I mean, suppose I offer you $1000.  All you have to do is agree
    that when I start flipping this fair coin afterwards, if it comes
    up heads on first flip, you pay me $1.
    
    If it takes two flips to come up heads, you pay me $2.  If it takes
    three flips, you pay me $4.  If it takes four flips, you pay me
    $8 etc.
    
    Personally, *I'd* accept the $1000 with those conditions.  I mean,
    heck, it's bound to come up heads almost immediately, right ?  So,
    I'll keep most of my $1000.
    
    But no, the "mathematical" solution to the puzzle suggests that
    $1000 is not enough!  The "mathematical" solution claims that
    one ought not accept $1000 under those conditions.
    
    Am I missing something ?
    
    /Eric
718.17  "Please, not in DCL!"  SQM::HALLYB "Like a breath of fresh water..."  Wed Jul 01 1987 01:01  (11 lines)
    Eric,
    
    Write yourself a little program and run it for a while.
    If $1000 is "overpaying" and $0.01 is underpaying, you should
    be able to figure out the real fair "price" to pay after a
    few weeks of CPU time.

    Once you come up with a finite fair value, let's you and me
    get together.  Bring your mortgage.
    
      John
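
    Something like the little program .17 asks for, as a rough Python
    sketch (the payoff rule is the one in .0, problem 2; names are
    illustrative):

        import random

        def one_game():
            """Amount paid to Bernoulli: $2^(n-1) if the first head is on flip n."""
            payment = 1
            while random.random() < 0.5:      # tails -- keep flipping, payment doubles
                payment *= 2
            return payment

        for games in (1_000, 70_000, 1_000_000):
            avg = sum(one_game() for _ in range(games)) / games
            print(games, round(avg, 2))
        # The running average never settles; it creeps upward with the number
        # of games (roughly like log2(games)/2), which is the practical face
        # of the infinite expectation.  Any single run, like Discover's
        # 70,000-game figure of about $5.30, badly understates it.
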
718.18  "define 'fair'"  BANDIT::MARSHALL "hunting the snark"  Wed Jul 01 1987 21:50  (13 lines)
    re .16:
    
    The problem is that you are thinking in terms of a single play.
    When the problem asks how to make the game "fair", it means that
    if you play it an infinite number of times, neither party will come
    out ahead.        
    
                                                   
                  /
                 (  ___
                  ) ///
                 /
    
718.19  "A rough game"  VAXRT::BRIDGEWATER  Wed Jul 01 1987 23:08  (12 lines)
    Re: .17

    John,

    Are you saying that you would be willing to pay $1000 to play this
    game?  If so, I hope you have deep pockets, because the volatility of
    this game is *HUGE*.  (Actually, the expected payoff and the variance of
    the payoff are both infinite.)  Your opponent may also decide to stop
    playing after a hundred games, say, and he would very likely be ahead by
    a substantial sum at that point.

    - Don
718.20  "Anybody want to guess the answer?"  SQM::HALLYB "Like a breath of fresh water..."  Thu Jul 02 1987 14:52  (13 lines)
Re: .19 [re: .17]
    
    No, I wasn't anticipating paying $1000 to play.  It kinda depended
    on what Eric calculated as the "fair price".  If small enough, then
    I might just decide to play for the intellectual curiosity of it all.

    Surely nobody thinks I would consider discussing GAMBLING on company
    resources.
    
      John

    (Actually, if you assume Eric can't/won't pay more than, say, the
    National Debt, then the expected value becomes easy to calculate).
718.21  CLT::GILBERT "eager like a child"  Thu Jul 02 1987 16:54  (2 lines)
    If Eric is only willing to lose $1000 on the game, then $6.50
    sounds like a reasonable amount for John to ante.
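
    A rough sketch of that calculation, assuming the payment is simply
    truncated at whatever cap the payer is willing to lose (the function
    name is just illustrative):

        def fair_ante(cap):
            """Expected payment when the 2^(n-1) payoff is capped."""
            total, n = 0.0, 1
            while 2 ** (n - 1) < cap:
                total += 0.5                      # flip n contributes 2^(n-1) * 2^(-n)
                n += 1
            return total + cap * 2.0 ** (1 - n)   # capped tail: cap * sum of 2^(-k), k >= n

        for cap in (1_000, 2_000, 1_000_000):
            print(cap, round(fair_ante(cap), 2))
        # A $1000 cap gives about $5.98; $6.50 corresponds to a cap near $2000.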