Dominion Strategy Forum

Dominion => Dominion General Discussion => Topic started by: GendoIkari on April 17, 2012, 12:00:01 pm

Title: Math request: Nomad Camp
Post by: GendoIkari on April 17, 2012, 12:00:01 pm
If you have $4 in your opening hand, and buy a Nomad Camp, what are the odds that you will have $5 on turn 2?
Title: Re: Math request: Nomad Camp
Post by: Fabian on April 17, 2012, 12:02:44 pm
40%
Title: Re: Math request: Nomad Camp
Post by: Higara on April 17, 2012, 12:08:06 pm
40%

What he said. To expand, you're looking for the probability that the bottom card of your deck is an Estate. Before you buy the Nomad Camp, you have five cards in your deck, of which two are Estates. Since buying the Nomad Camp doesn't change the order of the cards beneath it, you have a 2 (Estates) / 5 (cards) probability of drawing three Coppers alongside it.
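The 40% answer is easy to check with a short simulation; a sketch (function name mine), assuming the turn-2 hand is the topdecked Nomad Camp (+$2) plus the top four of the remaining C, C, C, E, E:

```python
import random

def turn2_five_coin_rate(trials=100_000, seed=0):
    """Estimate P($5 on turn 2) after a CCCCE opener and a Nomad Camp buy."""
    rng = random.Random(seed)
    rest = ['C', 'C', 'C', 'E', 'E']  # the five cards left under the Nomad Camp
    hits = 0
    for _ in range(trials):
        rng.shuffle(rest)
        # hand = Nomad Camp (+$2) plus the next 4 cards, $1 per Copper
        coins = 2 + rest[:4].count('C')
        hits += (coins >= 5)
    return hits / trials

print(turn2_five_coin_rate())  # ≈ 0.40
```

Hitting $5 requires three Coppers among the four drawn cards, which happens exactly when the bottom card is an Estate.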
Title: Re: Math request: Nomad Camp
Post by: Galzria on April 17, 2012, 12:08:13 pm
30%.

The bottom card of your deck was decided upon first shuffle, and no new knowledge changes its original odds of being an Estate.
Title: Re: Math request: Nomad Camp
Post by: Voltgloss on April 17, 2012, 12:10:11 pm
30%.

The bottom card of your deck was decided upon first shuffle, and no new knowledge changes its original odds of being an Estate.

No, 40% is correct.  The new knowledge that changes the odds is the fact that the first 5 cards of your deck are 4 Coppers and 1 Estate.  It's the same principle by which the odds plummet to 0% if your first hand is CCEEE.
Title: Re: Math request: Nomad Camp
Post by: DStu on April 17, 2012, 12:11:59 pm
30%.

The bottom card of your deck was decided upon first shuffle, and no new knowledge changes its original odds of being an Estate.

No, 40% is correct.  The new knowledge that changes the odds is the fact that the first 5 cards of your deck are 4 Coppers and 1 Estate.  It's the same principle by which the odds plummet to 0% if your first hand is CCEEE.
I would guess they would rise to 100%...
Title: Re: Math request: Nomad Camp
Post by: Voltgloss on April 17, 2012, 12:12:59 pm
30%.

The bottom card of your deck was decided upon first shuffle, and no new knowledge changes its original odds of being an Estate.

No, 40% is correct.  The new knowledge that changes the odds is the fact that the first 5 cards of your deck are 4 Coppers and 1 Estate.  It's the same principle by which the odds plummet to 0% if your first hand is CCEEE.
I would guess they would rise to 100%...
How do the odds of the last card in your deck being Estate rise to 100% if you had all three Estates in your starting hand?
Title: Re: Math request: Nomad Camp
Post by: DStu on April 17, 2012, 12:16:09 pm
30%.

The bottom card of your deck was decided upon first shuffle, and no new knowledge changes its original odds of being an Estate.

No, 40% is correct.  The new knowledge that changes the odds is the fact that the first 5 cards of your deck are 4 Coppers and 1 Estate.  It's the same principle by which the odds plummet to 0% if your first hand is CCEEE.
I would guess they would rise to 100%...
How do the odds of the last card in your deck being Estate rise to 100% if you had all three Estates in your starting hand?
Oh, I was thinking about the probability that you have $5 in hand 2.  But of course the context of "it" changed in between.
Title: Re: Math request: Nomad Camp
Post by: Galzria on April 17, 2012, 12:18:26 pm
30%.

The bottom card of your deck was decided upon first shuffle, and no new knowledge changes its original odds of being an Estate.

No, 40% is correct.  The new knowledge that changes the odds is the fact that the first 5 cards of your deck are 4 Coppers and 1 Estate.  It's the same principle by which the odds plummet to 0% if your first hand is CCEEE.

Give me a minute to pull up the necessary information. Your intuition is correct - And if I were to SHOW you those remaining 5 cards, and THEN reshuffle them, it would be 40%. But because the information entered into the system was 7 Coppers and 3 Estates at the first shuffle, the odds of the BOTTOM card have NOT changed, and were determined at that time to be 30%.

And edit while I find what I need:

The original problem was stated thusly:

If I have a 52 card deck, equal red and equal black, shuffled together randomly, and I start to reveal cards to you one at a time...:
You may tell me to stop at any time, and guess what the color the bottom card will be. Can you ever increase your odds better than 50% that it will be black? What if I show you the NEXT card instead?

-- Against intuition, the answer is NO. It is always 50/50, because those odds were determined with initial information input of 26/26 - and even as you remove cards, and the total remaining may change, it doesn't change the initial odds on any GIVEN card from the remaining to be 50/50.
Title: Re: Math request: Nomad Camp
Post by: DStu on April 17, 2012, 12:22:18 pm
30%.

The bottom card of your deck was decided upon first shuffle, and no new knowledge changes its original odds of being an Estate.

No, 40% is correct.  The new knowledge that changes the odds is the fact that the first 5 cards of your deck are 4 Coppers and 1 Estate.  It's the same principle by which the odds plummet to 0% if your first hand is CCEEE.

Give me a minute to pull up the necessary information. Your intuition is correct - And if I were to SHOW you those remaining 5 cards, and THEN reshuffle them, it would be 40%. But because the information entered into the system was 7 Coppers and 3 Estates at the first shuffle, the odds of the BOTTOM card have NOT changed, and were determined at that time to be 30%.
Just because something did not change doesn't mean the probability did not change; see a T1 hand of CCEEE. The bottom card also does not change when you draw that hand, and still you won't try to convince me that there's a 30% probability that the 10th card is a fourth Estate, will you?

Edit: Just because not ALL of the information got leaked in the first turn, that doesn't mean that NONE has leaked...
Title: Re: Math request: Nomad Camp
Post by: Thisisnotasmile on April 17, 2012, 12:22:42 pm
30%.

The bottom card of your deck was decided upon first shuffle, and no new knowledge changes its original odds of being an Estate.

No, 40% is correct.  The new knowledge that changes the odds is the fact that the first 5 cards of your deck are 4 Coppers and 1 Estate.  It's the same principle by which the odds plummet to 0% if your first hand is CCEEE.

Give me a minute to pull up the necessary information. Your intuition is correct - And if I were to SHOW you those remaining 5 cards, and THEN reshuffle them, it would be 40%. But because the information entered into the system was 7 Coppers and 3 Estates at the first shuffle, the odds of the BOTTOM card have NOT changed, and were determined at that time to be 30%.

If you can SHOW me those five remaining cards and they are not Copper, Copper, Copper, Estate, Estate, (in any order, it doesn't matter because we're going to be reshuffling them) then I'll believe you. Until then, 40% is correct.
Title: Re: Math request: Nomad Camp
Post by: Fabian on April 17, 2012, 12:23:08 pm
michaeljacksonpopcorn.gif
Title: Re: Math request: Nomad Camp
Post by: Voltgloss on April 17, 2012, 12:27:48 pm
Oh, I was thinking about the probability that you have $5 in hand 2.  But of course context of "it" changed in between.
Sorry about the confusion; it started out hungry but then wanted to take a walk instead.

Galzria, the question is not "in a shuffle of 7 Coppers and 3 Estates, what is the chance Estate is the bottom card."  Rather, the question is "in a shuffle of 7 Coppers and 3 Estates, knowing that the first five cards are 4 Coppers and 1 Estate, what is the chance Estate is the bottom card."

Are you familiar with the Monty Hall problem?  That's effectively what this is.
Title: Re: Math request: Nomad Camp
Post by: DStu on April 17, 2012, 12:33:13 pm
Quote
The original problem was stated thusly:

If I have a 52 card deck, equal red and equal black, shuffled together randomly, and I start to reveal cards to you one at a time...:
You may tell me to stop at any time, and guess what the color the bottom card will be. Can you ever increase your odds better than 50% that it will be black? What if I show you the NEXT card instead?

-- Against intuition, the answer is NO. It is always 50/50, because those odds were determined with initial information input of 26/26 - and even as you remove cards, and the total remaining may change, it doesn't change the initial odds on any GIVEN card from the remaining to be 50/50.
You have to be very careful with probabilities; small changes in the conditions can change a lot.
But as stated here, I tell you the answer is yes, because I will just wait for 51 cards, count, and thus know what the last card will be.

If, though I can't tell it from the post, the cards are not an equal number of red and black, but instead each card is independently equally likely to be red or black, that's a completely different situation, and it's also a different situation than we have here. The cards of a Dominion deck are not Copper with 70% and Estate with 30%; there are exactly 3 Estates and 7 Coppers. In the first case, if you know the first 9 cards, you know no more about the 10th card: it is still 30% Estate and 70% Copper. In the second case, if you know the first 9 cards, you know the 10th: it's Copper if you've just found 6 Coppers and Estate if you've found 2 Estates.

But in the first case that would also mean that the starting deck could consist of 10 Estates with probability 0.3^10.
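DStu's distinction can be illustrated with a small simulation; a sketch (names mine) comparing a deck of exactly 7 Coppers and 3 Estates against 10 cards that are each independently Estate with probability 0.3, conditioning both on a CCCCE opening hand:

```python
import random

def p_bottom_estate(model, trials=200_000, seed=1):
    """Estimate P(bottom card is an Estate | opening hand is CCCCE)."""
    rng = random.Random(seed)
    hits = total = 0
    for _ in range(trials):
        if model == 'fixed':
            # exactly 7 Coppers and 3 Estates, shuffled
            deck = ['C'] * 7 + ['E'] * 3
            rng.shuffle(deck)
        else:
            # each card independently Estate with probability 0.3
            deck = ['E' if rng.random() < 0.3 else 'C' for _ in range(10)]
        if sorted(deck[:5]) == ['C', 'C', 'C', 'C', 'E']:  # CCCCE opener
            total += 1
            hits += (deck[9] == 'E')
    return hits / total

print(p_bottom_estate('fixed'))        # ≈ 0.40
print(p_bottom_estate('independent'))  # ≈ 0.30
```

Only the fixed-composition model (the one that matches a real Dominion deck) moves from 30% to 40% after the opener is seen; with independent draws the hand tells you nothing about the bottom card.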
Title: Re: Math request: Nomad Camp
Post by: michaeljb on April 17, 2012, 12:39:36 pm
Isn't it just as simple as this?

Original odds of bottom card being Estate 3/10.

Odds of bottom card being Estate, given exactly one Estate is in the top 5 cards, would be 2/5.

Odds of bottom card being Estate, given exactly two Estates are in the top 5 cards, would be 1/5.

Odds of bottom card being Estate, given exactly three Estates are in the top 5 cards, would be 0.
Title: Re: Math request: Nomad Camp
Post by: Tables on April 17, 2012, 12:39:45 pm
The original problem was stated thusly:

If I have a 52 card deck, equal red and equal black, shuffled together randomly, and I start to reveal cards to you one at a time...:
You may tell me to stop at any time, and guess what the color the bottom card will be. Can you ever increase your odds better than 50% that it will be black? What if I show you the NEXT card instead?

The ultimate question here is a trick, and you're being caught out by it. This asks if there's some strategy to improve the odds above 50% (actually, I think you might be able to improve the odds above 50%, although I'd need to revisit the drunkard's walk to be certain, and that isn't relevant here). But that isn't the same as what's being asked here. Let me take an extreme example in your question. I can let you reveal any number of cards, correct? I'll let you reveal 51. At this point I can be 100% certain of what the bottom card is. BUT, on your argument, having revealed those 51, it's still a 50% chance the card will be black. Revealing 51 cards doesn't change the odds that the bottom card was dealt black, but after seeing 51, I'm not going to turn around and say the last card has a 50/50 chance, am I?

The same principle occurs here. Suppose I draw CCEEE as my first hand. What's the chance my bottom card is an Estate now? It's not 30% any more, is it? Similarly, suppose I draw CCCCE. Well, the last 5 cards of my deck are CCCEE in some order, so there's a 2 in 5 (40%) chance it's an Estate on the bottom.
Title: Re: Math request: Nomad Camp
Post by: Galzria on April 17, 2012, 12:52:19 pm
This is annoying me, because I can't find the main source material that we used way back when to prove this. However:

http://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-042j-mathematics-for-computer-science-spring-2010/readings/MIT6_042JS10_chap18.pdf

Problem 18.5. (Do a ctrl+f) is designed to get you to exactly the right section, or:

"Problem 18.5.
I have a deck of 52 regular playing cards, 26 red, 26 black, randomly shuffled. They all lie face down in the deck so that you can’t see them. I will draw a card off the top of the deck and turn it face up so that you can see it and then put it aside. I will continue to turn up cards like this but at some point while there are still cards left in the deck, you have to declare that you want the next card in the deck to be turned up. If that next card turns up black you win and otherwise you lose. Either way, the game is then over.

(a) Show that if you take the first card before you have seen any cards, you then have probability 1/2 of winning the game.

(b) Suppose you don’t take the first card and it turns up red. Show that you then have a probability of winning the game that is greater than 1/2.

(c) If there are r red cards left in the deck and b black cards, show that the probability of winning if you take the next card is b/(r + b).

(d) Either,
1. come up with a strategy for this game that gives you a probability of winning strictly greater than 1/2 and prove that the strategy works, or,
2. come up with a proof that no such strategy can exist."

Point (D) 2. is asked because, against intuition, the only proof that exists is one showing that no strategy can exist, that is, your odds never change. They were determined at the outset.
Title: Re: Math request: Nomad Camp
Post by: Kuildeous on April 17, 2012, 12:58:11 pm
When I have questions, I throw together a quick Excel sheet and see what distribution I get with 20k iterations. I'll go more if necessary.

It is indeed 40%. I wanted to verify that before cranking out the numbers, but it boils down to conditional probability. The question asked "If first hand has exactly 4 coppers then…" This is more than just calculating the probability of the last card. It's the probability of the last card, GIVEN that the first five cards are arranged a certain way. That's why it's not as simple as 30%.
 
Title: Re: Math request: Nomad Camp
Post by: timchen on April 17, 2012, 12:58:41 pm
This shouldn't need to be explained. Consider the probability of drawing a $4 hand in the second turn (without buying stuff like NC). Whatever you draw in the first turn, the probability becomes either 0% or 100%. It is just silly to say the probability does not change.
Title: Re: Math request: Nomad Camp
Post by: Voltgloss on April 17, 2012, 01:01:48 pm
Galzria, your error is that the question you are answering - 18.5(d) - isn't the question that we're all answering (or the question that was posed by the original poster).  We're all answering 18.5(c), with r = 3 (Coppers) and b = 2 (Estates).
Title: Re: Math request: Nomad Camp
Post by: randomdragoon on April 17, 2012, 01:02:17 pm
This is annoying me, because I can't find the main source material that we used way back when to prove this. However:

http://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-042j-mathematics-for-computer-science-spring-2010/readings/MIT6_042JS10_chap18.pdf

Problem 18.5. (Do a ctrl+f) is designed to get you to exactly the right section, or:

"Problem 18.5.
I have a deck of 52 regular playing cards, 26 red, 26 black, randomly shuffled. They all lie face down in the deck so that you can’t see them. I will draw a card off the top of the deck and turn it face up so that you can see it and then put it aside. I will continue to turn up cards like this but at some point while there are still cards left in the deck, you have to declare that you want the next card in the deck to be turned up. If that next card turns up black you win and otherwise you lose. Either way, the game is then over.

(a) Show that if you take the first card before you have seen any cards, you then have probability 1/2 of winning the game.

(b) Suppose you don’t take the first card and it turns up red. Show that you then have a probability of winning the game that is greater than 1/2.

(c) If there are r red cards left in the deck and b black cards, show that the probability of winning if you take the next card is b/(r + b).

(d) Either,
1. come up with a strategy for this game that gives you a probability of winning strictly greater than 1/2 and prove that the strategy works, or,
2. come up with a proof that no such strategy can exist."

Point (D) 2. is asked because, against intuition, the only proof that exists is one showing that no strategy can exist, that is, your odds never change. They were determined at the outset.

This is correct, but it is not the same situation here. In order to buy Nomad Camp you must have CCCCE in your first hand. This is akin to, in the above problem, saying you don't even get to play the game unless the first card turns up red; in this case, you will easily have higher than a 50% probability of winning.


Or, think of it this way: Take 3 coppers and two estates, and shuffle them. Then pick up 4 coppers and an estate from the supply. (You do this because if your first hand is not CCCCE you can't buy the Nomad Camp). What is the probability that the bottom card of the deck is an estate?
Title: Re: Math request: Nomad Camp
Post by: Galzria on April 17, 2012, 01:05:12 pm

Quote
Or, think of it this way: Take 3 coppers and two estates, and shuffle them. Then pick up 4 coppers and an estate from the supply. (You do this because if your first hand is not CCCCE you can't buy the Nomad Camp). What is the probability that the bottom card of the deck is an estate?

But notably, this is NOT what happened. You didn't take CCCEE shuffled, and then ADD CCCCE to the top. You took CCCCCCCEEE, shuffled, and then revealed CCCCE. This produces different odds, even though we both know that the bottom 5 cards are the same set of CCCEE in some order.
Title: Re: Math request: Nomad Camp
Post by: michaeljb on April 17, 2012, 01:07:04 pm
My intuition tells me if you can't guarantee better than 1/2 odds of winning, it's because only black lets you win, not because you don't get new info. For example, if my strategy is to only take the next card when more reds have been revealed than blacks, I can't win if all the blacks land on top. But if I could win by predicting red or black, then my strategy would just need to include seeing at least one of each color, and that stacked-deck case becomes trivial.

The challenge in that problem is being restricted to black, not that the new info you get from revealing cards doesn't help you.
Title: Re: Math request: Nomad Camp
Post by: O on April 17, 2012, 01:09:53 pm

Quote
Or, think of it this way: Take 3 coppers and two estates, and shuffle them. Then pick up 4 coppers and an estate from the supply. (You do this because if your first hand is not CCCCE you can't buy the Nomad Camp). What is the probability that the bottom card of the deck is an estate?

But notably, this is NOT what happened. You didn't take CCCEE shuffled, and then ADD CCCCE to the top. You took CCCCCCCEEE, shuffled, and then revealed CCCCE. This produces different odds, even though we both know that the bottom 5 cards are the same set of CCCEE in some order.

Except it doesn't produce different odds. This isn't the revealing door puzzle, where a conditional choice is made.
Title: Re: Math request: Nomad Camp
Post by: blueblimp on April 17, 2012, 01:10:37 pm
This is annoying me, because I can't find the main source material that we used way back when to prove this. However:

http://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-042j-mathematics-for-computer-science-spring-2010/readings/MIT6_042JS10_chap18.pdf

Problem 18.5. (Do a ctrl+f) is designed to get you to exactly the right section, or:

"Problem 18.5.
I have a deck of 52 regular playing cards, 26 red, 26 black, randomly shuffled. They all lie face down in the deck so that you can’t see them. I will draw a card off the top of the deck and turn it face up so that you can see it and then put it aside. I will continue to turn up cards like this but at some point while there are still cards left in the deck, you have to declare that you want the next card in the deck to be turned up. If that next card turns up black you win and otherwise you lose. Either way, the game is then over.

(a) Show that if you take the first card before you have seen any cards, you then have probability 1/2 of winning the game.

(b) Suppose you don’t take the first card and it turns up red. Show that you then have a probability of winning the game that is greater than 1/2.

(c) If there are r red cards left in the deck and b black cards, show that the probability of winning if you take the next card is b/(r + b).

(d) Either,
1. come up with a strategy for this game that gives you a probability of winning strictly greater than 1/2 and prove that the strategy works, or,
2. come up with a proof that no such strategy can exist."

Point (D) 2. is asked because, against intuition, the only proof that exists is one showing that no strategy can exist, that is, your odds never change. They were determined at the outset.

(This is basically identical to the problem in the Venture thread, heh.) Anyway, the relevant parts here are 2b/2c, because we already know your first hand is CCCCE, which tells us your remaining cards are CCCEE (in some order).
Title: Re: Math request: Nomad Camp
Post by: Galzria on April 17, 2012, 01:13:15 pm
This is annoying me, because I can't find the main source material that we used way back when to prove this. However:

http://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-042j-mathematics-for-computer-science-spring-2010/readings/MIT6_042JS10_chap18.pdf

Problem 18.5. (Do a ctrl+f) is designed to get you to exactly the right section, or:

"Problem 18.5.
I have a deck of 52 regular playing cards, 26 red, 26 black, randomly shuffled. They all lie face down in the deck so that you can’t see them. I will draw a card off the top of the deck and turn it face up so that you can see it and then put it aside. I will continue to turn up cards like this but at some point while there are still cards left in the deck, you have to declare that you want the next card in the deck to be turned up. If that next card turns up black you win and otherwise you lose. Either way, the game is then over.

(a) Show that if you take the first card before you have seen any cards, you then have probability 1/2 of winning the game.

(b) Suppose you don’t take the first card and it turns up red. Show that you then have a probability of winning the game that is greater than 1/2.

(c) If there are r red cards left in the deck and b black cards, show that the probability of winning if you take the next card is b/(r + b).

(d) Either,
1. come up with a strategy for this game that gives you a probability of winning strictly greater than 1/2 and prove that the strategy works, or,
2. come up with a proof that no such strategy can exist."

Point (D) 2. is asked because, against intuition, the only proof that exists is one showing that no strategy can exist, that is, your odds never change. They were determined at the outset.

(This is basically identical to the problem in the Venture thread, heh.) Anyway, the relevant parts here are 2b/2c, because we already know your first hand is CCCCE, which tells us your remaining cards are CCCEE (in some order).

And yet, if I've flipped over 12 red cards, and 4 black cards, the odds that the next card is black are STILL 50%. The information is predetermined with the original shuffle - This is much clearer when you think of revealing the BOTTOM card, rather than the NEXT card. Shuffle a deck (26/26, or 7/3), and remove the bottom card, without looking at it. What are the odds that it's red/black (or Estate/Copper)? Now reveal X cards, one at a time from the top of the deck. The odds on the removed card don't change. Yes, you could reveal all the other cards, and know with certainty what that last card IS, but its odds at any given moment are predefined.

Because of that, when evaluating what % chance you have to hit $5, the only relevant information is that bottom card, which, since you haven't changed its odds since the original shuffle, when it had 30%, is still 30%.
Title: Re: Math request: Nomad Camp
Post by: Voltgloss on April 17, 2012, 01:15:51 pm
To clarify my last post (I first made edits, then saved, then saw 8 or so posts after it - my bad):


The issue is that 18.5(d) asks you to somehow change your "winning" probability from 50% without yet having any information about the shuffle results (i.e., without having yet dealt any cards from the deck).  You are therefore confronted with a universe of possible outcomes in which exactly half will "win" and half will "lose."  Of course your chances of "winning" are therefore 50%, and nothing can be done to change that.

BUT, once you flip some cards from the deck, you eliminate a number of outcomes from the possible universe of results.  If you eliminate more "losing" outcomes than "winning" outcomes, your probability of winning - knowing that information - goes up.  And vice versa if you eliminate more "winning" than "losing" outcomes.

Back to the Dominion example, where "winning" = Estate on the bottom, drawing your first hand of 5 cards eliminates all "losing" outcomes in which (1) the 3 Estates are in the top 5 cards; and (2) 2 of the 3 Estates are in the top 5 cards.  Whereas the only "winning" outcomes eliminated are those in which none of the Estates are in the top 5 cards.  More losing outcomes were eliminated than winning outcomes, and that is why the chance of winning goes up.

But notably, this is NOT what happened. You didn't take CCCEE shuffled, and then ADD CCCCE to the top. You took CCCCCCCEEE, shuffled, and then revealed CCCCE. This produces different odds, even though we both know that the bottom 5 cards are the same set of CCCEE in some order.

No, these two situations produce the same odds, AFTER having revealed CCCCE.  In the second scenario, while the ORIGINAL probability of Estate-on-the-bottom is 30% (vs. 40% in the first scenario), the CONDITIONAL probability of Estate-on-the-bottom, after revealing the top 5 cards, is 40%.

And yet, if I've flipped over 12 red cards, and 4 black cards, the odds that the next card is black is STILL 50%

No, it's not.  Per 18.5(c), the odds that the next card is black in your example (with 22 black and 14 red remaining in the deck) are 22/36, or 61.1%.

Say you flip over 26 red cards and 0 black cards.  Are the odds of your next card being black still 50%?
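Voltgloss's outcome-elimination argument can be counted out directly; a sketch (names mine) where each equally likely deal of 7 Coppers and 3 Estates is identified by the 3 Estate positions among 10 (positions 0-4 are the opening hand, position 9 is the bottom):

```python
from itertools import combinations

# The full universe of outcomes: C(10, 3) = 120 equally likely deals.
outcomes = list(combinations(range(10), 3))
wins = [o for o in outcomes if 9 in o]        # "winning" = Estate on the bottom
print(len(wins), len(outcomes))               # 36 of 120 -> 30% unconditionally

# Drawing CCCCE eliminates every deal without exactly one Estate on top.
kept = [o for o in outcomes if sum(p < 5 for p in o) == 1]
kept_wins = [o for o in kept if 9 in o]
print(len(kept_wins), len(kept))              # 20 of 50 -> 40% conditionally
```

Seeing the opener removes proportionally more losing deals than winning ones (70 of 84 losers vs. 16 of 36 winners), which is exactly why the conditional chance rises from 30% to 40%.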
Title: Re: Math request: Nomad Camp
Post by: blueblimp on April 17, 2012, 01:16:34 pm
This is annoying me, because I can't find the main source material that we used way back when to prove this. However:

http://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-042j-mathematics-for-computer-science-spring-2010/readings/MIT6_042JS10_chap18.pdf

Problem 18.5. (Do a ctrl+f) is designed to get you to exactly the right section, or:

"Problem 18.5.
I have a deck of 52 regular playing cards, 26 red, 26 black, randomly shuffled. They all lie face down in the deck so that you can’t see them. I will draw a card off the top of the deck and turn it face up so that you can see it and then put it aside. I will continue to turn up cards like this but at some point while there are still cards left in the deck, you have to declare that you want the next card in the deck to be turned up. If that next card turns up black you win and otherwise you lose. Either way, the game is then over.

(a) Show that if you take the first card before you have seen any cards, you then have probability 1/2 of winning the game.

(b) Suppose you don’t take the first card and it turns up red. Show that you then have a probability of winning the game that is greater than 1/2.

(c) If there are r red cards left in the deck and b black cards, show that the probability of winning if you take the next card is b/(r + b).

(d) Either,
1. come up with a strategy for this game that gives you a probability of winning strictly greater than 1/2 and prove that the strategy works, or,
2. come up with a proof that no such strategy can exist."

Point (D) 2. is asked because, against intuition, the only proof that exists is one showing that no strategy can exist, that is, your odds never change. They were determined at the outset.

(This is basically identical to the problem in the Venture thread, heh.) Anyway, the relevant parts here are 2b/2c, because we already know your first hand is CCCCE, which tells us your remaining cards are CCCEE (in some order).

And yet, if I've flipped over 12 red cards, and 4 black cards, the odds that the next card is black is STILL 50%

No. Imagine you flipped over 26 red cards and no black cards (so there are only black cards left). Is the chance that the next card is black 50%, in that case?

(To be fair, conditional probability is counter-intuitive (http://www.youtube.com/watch?v=3pRM4v0O29o).)
Title: Re: Math request: Nomad Camp
Post by: Toskk on April 17, 2012, 01:17:57 pm

Quote
Or, think of it this way: Take 3 coppers and two estates, and shuffle them. Then pick up 4 coppers and an estate from the supply. (You do this because if your first hand is not CCCCE you can't buy the Nomad Camp). What is the probability that the bottom card of the deck is an estate?

But notably, this is NOT what happened. You didn't take CCCEE shuffled, and then ADD CCCCE to the top. You took CCCCCCCEEE, shuffled, and then revealed CCCCE. This produces different odds, even though we both know that the bottom 5 cards are the same set of CCCEE in some order.

Oh god, my head.. it hurts.. this thread is insane, and people seem to be attempting to calculate two totally different things.

The original poster asked, providing your opening hand is CCCCE and you buy a Nomad Camp, what are your odds of getting $5 on turn 2. In this case, we absolutely *do not* care what the odds are of drawing the opening CCCCE. Randomdragoon and others are very correct that the odds of the $5 T2 hand are 40%.. the problem is precisely identical to picking one card from CCCEE (this picked card is the one you don't get for your T2 hand).

But people like Galzria are apparently attempting something totally different.. they're instead talking about the odds of drawing first CCCCE and *then* drawing Nomad Camp + CCCE. The odds of these two occurrences together are *not* 40%.
Title: Re: Math request: Nomad Camp
Post by: michaeljb on April 17, 2012, 01:19:50 pm
And yet, if I've flipped over 12 red cards, and 4 black cards, the odds that the next card is black is STILL 50%.

So part c is asking you to prove true something that is really false? 22/40

edit: cut the quote down to the relevant size, and realized my 22/40 probability isn't right. 22/40 would be if you flipped over 12 total, 4 being black: (26-4)/(52-12). I misread it; with flipping 12 red and 4 black, it's actually (26-4)/(52-12-4) = 22/36, much more than 50%
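The corrected figure follows straight from 18.5(c); a one-step sketch of the arithmetic:

```python
from fractions import Fraction

# After flipping 12 red and 4 black from a 26/26 deck:
r = 26 - 12   # 14 red remaining
b = 26 - 4    # 22 black remaining
p_black_next = Fraction(b, r + b)
print(p_black_next)   # 11/18, i.e. 22/36 ≈ 61.1%
```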
Title: Re: Math request: Nomad Camp
Post by: timchen on April 17, 2012, 01:21:14 pm

But people like Galzria are apparently attempting something totally different.. they're instead talking about the odds of drawing first CCCCE and *then* drawing Nomad Camp + CCCE. The odds of these two occurrences together are *not* 40%.

He is not attempting something totally different. He is just totally wrong.
Title: Re: Math request: Nomad Camp
Post by: Toskk on April 17, 2012, 01:22:48 pm

But people like Galzria are apparently attempting something totally different.. they're instead talking about the odds of drawing first CCCCE and *then* drawing Nomad Camp + CCCE. The odds of these two occurrences together are *not* 40%.

He is not attempting something totally different. He is just totally wrong.

.. I was trying to give him the benefit of the doubt.. :P
Title: Re: Math request: Nomad Camp
Post by: O on April 17, 2012, 01:23:21 pm
And yet, if I've flipped over 12 red cards, and 4 black cards, the odds that the next card is black is STILL 50%.

Don't want to be rude, but you need to retake your statistics class. A finite-sized deck is not a heads-or-tails coin. The odds are nowhere near 50%.
Title: Re: Math request: Nomad Camp
Post by: Galzria on April 17, 2012, 01:24:24 pm
This is annoying me, because I can't find the main source material that we used way back when to prove this. However:

http://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-042j-mathematics-for-computer-science-spring-2010/readings/MIT6_042JS10_chap18.pdf

Problem 18.5. (Do a ctrl+f) is designed to get you to exactly the right section, or:

"Problem 18.5.
I have a deck of 52 regular playing cards, 26 red, 26 black, randomly shuffled. They all lie face down in the deck so that you can’t see them. I will draw a card off the top of the deck and turn it face up so that you can see it and then put it aside. I will continue to turn up cards like this but at some point while there are still cards left in the deck, you have to declare that you want the next card in the deck to be turned up. If that next card turns up black you win and otherwise you lose. Either way, the game is then over.

(a) Show that if you take the first card before you have seen any cards, you then have probability 1/2 of winning the game.

(b) Suppose you don’t take the first card and it turns up red. Show that you then have a probability of winning the game that is greater than 1/2.

(c) If there are r red cards left in the deck and b black cards, show that the probability of winning if you take the next card is b/(r + b).

(d) Either,
1. come up with a strategy for this game that gives you a probability of winning strictly greater than 1/2 and prove that the strategy works, or,
2. come up with a proof that no such strategy can exist."

Point (D) 2. is asked because, against intuition, the only proof that exists is one showing that no strategy can exist, that is, your odds never change. They were determined at the outset.

(This is basically identical to the problem in the Venture thread, heh.) Anyway, the relevant parts here are 2b/2c, because we already know your first hand is CCCCE, which tells us your remaining cards are CCCEE (in some order).

And yet, if I've flipped over 12 red cards, and 4 black cards, the odds that the next card is black is STILL 50%. The information is predetermined with the original shuffle - This is much clearer when you think of revealing the BOTTOM card, rather than the NEXT card. Shuffle a deck (26/26, or 7/3), and remove the bottom card, without looking at it. What are the odds that it's red/black (or Estate/Copper)? Now reveal X cards, one at a time from the top of the deck. The odds on the removed card don't change. Yes, you could reveal all the other cards, and know with certainty what that last card IS, but its odds at any given moment are predefined.

Because of that, when evaluating what % chance you have to hit $5, the only relevant information is that bottom card, which, since you haven't changed its odds since the original shuffle, when it had 30%, is still 30%.

So part c is asking you to prove true something that is really false? 22/40

Not exactly. Yes, you have a better chance of winning - That is, there are more cards remaining in your favor than against - But the odds of the next card, or any given card, regardless of revealed information, remain 50%.
Title: Re: Math request: Nomad Camp
Post by: jonts26 on April 17, 2012, 01:25:06 pm
This is annoying me, because I can't find the main source material that we used way back when to prove this. However:

http://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-042j-mathematics-for-computer-science-spring-2010/readings/MIT6_042JS10_chap18.pdf

Problem 18.5. (Do a ctrl+f) is designed to get you to exactly the right section, or:

"Problem 18.5.
I have a deck of 52 regular playing cards, 26 red, 26 black, randomly shuffled. They all lie face down in the deck so that you can’t see them. I will draw a card off the top of the deck and turn it face up so that you can see it and then put it aside. I will continue to turn up cards like this but at some point while there are still cards left in the deck, you have to declare that you want the next card in the deck to be turned up. If that next card turns up black you win and otherwise you lose. Either way, the game is then over.

(a) Show that if you take the first card before you have seen any cards, you then have probability 1/2 of winning the game.

(b) Suppose you don’t take the first card and it turns up red. Show that you then have a probability of winning the game that is greater than 1/2.

(c) If there are r red cards left in the deck and b black cards, show that the probability of winning if you take the next card is b/(r + b).

(d) Either,
1. come up with a strategy for this game that gives you a probability of winning strictly greater than 1/2 and prove that the strategy works, or,
2. come up with a proof that no such strategy can exist."

Point (D) 2. is asked because, against intuition, the only proof that exists is one showing that no strategy can exist, that is, your odds never change. They were determined at the outset.

(This is basically identical to the problem in the Venture thread, heh.) Anyway, the relevant parts here are 2b/2c, because we already know your first hand is CCCCE, which tells us your remaining cards are CCCEE (in some order).

And yet, if I've flipped over 12 red cards, and 4 black cards, the odds that the next card is black is STILL 50%. The information is predetermined with the original shuffle - This is much clearer when you think of revealing the BOTTOM card, rather than the NEXT card. Shuffle a deck (26/26, or 7/3), and remove the bottom card, without looking at it. What are the odds that it's red/black (or Estate/Copper)? Now reveal X cards, one at a time from the top of the deck. The odds on the removed card don't change. Yes, you could reveal all the other cards, and know with certainty what that last card IS, but its odds at any given moment are predefined.

Because of that, when evaluating what % chance you have to hit $5, the only relevant information is that bottom card, which, since you haven't changed its odds since the original shuffle, when it had 30%, is still 30%.

So part c is asking you to prove true something that is really false? 22/40

Not exactly. Yes, you have a better chance of winning - That is, there are more cards remaining in your favor than against - But the odds of the next card, or any given card, regardless of revealed information, remain 50%.

That's just ... no. No, I'm sorry.
Title: Re: Math request: Nomad Camp
Post by: Galzria on April 17, 2012, 01:28:52 pm
Alright, look - I can't find the source material right now, despite looking through everything I can find, so I'll abstain for the time being.

I was hoping that somebody else would have encountered this particular paradox, and could help me if they had the proof on hand (I imagine mine got recycled at some point, but I'm usually pretty organized and don't get rid of anything).

If I can find the information, I'll bring it out, but for now, follow the intuitive answer of 40% - However, it's still really 30%.  ;)
Title: Re: Math request: Nomad Camp
Post by: michaeljb on April 17, 2012, 01:29:18 pm
Not exactly. Yes, you have a better chance of winning - That is, there are more cards remaining in your favor than against - But the odds of the next card, or any given card, regardless of revealed information, remain 50%.

What is this I don't even
Title: Re: Math request: Nomad Camp
Post by: Voltgloss on April 17, 2012, 01:30:01 pm
And yet, if I've flipped over 12 red cards, and 4 black cards, the odds that the next card is black is STILL 50%. The information is predetermined with the original shuffle - This is much clearer when you think of revealing the BOTTOM card, rather than the NEXT card. Shuffle a deck (26/26, or 7/3), and remove the bottom card, without looking at it. What are the odds that it's red/black (or Estate/Copper)? Now reveal X cards, one at a time from the top of the deck. The odds on the removed card don't change. Yes, you could reveal all the other cards, and know with certainty what that last card IS, but its odds at any given moment are predefined.

Because of that, when evaluating what % chance you have to hit $5, the only relevant information is that bottom card, which, since you haven't changed its odds since the original shuffle, when it had 30%, is still 30%.

I think the issue is that you are defining "odds" differently from the rest of us.  You appear to be defining "odds" as "the chance this card is X when the deck is first shuffled."  We are all defining "odds" as "the chance this card is X, knowing what we know about the rest of the deck."  In other words, you are talking about initial probability; we are talking about conditional probability.  See section 18.3 of the textbook you cite.

Importantly, the example you raise is NOT from the "conditional probability" section of the textbook you cited.  A better example is problem 18.14:

Quote
Problem 18.14.
A 52-card deck is thoroughly shuffled and you are dealt a hand of 13 cards.
(a) If you have one ace, what is the probability that you have a second ace?
(b) If you have the ace of spades, what is the probability that you have a second ace?
Remarkably, the two answers are different. This problem will test your counting ability!

Last two lines are straight from the textbook.
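Problem 18.14 can be worked out exactly with binomial coefficients. Here is one possible sketch (my own counting, not taken from the textbook, so treat the formulas as an assumption):

```python
from math import comb

hands = comb(52, 13)

# (a) P(at least 2 aces | at least 1 ace)
p_zero = comb(48, 13) / hands              # hands with no aces
p_one = 4 * comb(48, 12) / hands           # hands with exactly one ace
p_a = (1 - p_zero - p_one) / (1 - p_zero)

# (b) P(at least 2 aces | hand contains the ace of spades)
# Given the ace of spades, the other 12 cards are drawn from the remaining 51.
p_no_other = comb(48, 12) / comb(51, 12)   # none of the other 3 aces
p_b = 1 - p_no_other

print(round(p_a, 2), round(p_b, 2))        # roughly 0.37 vs 0.56 - indeed different
```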
Title: Re: Math request: Nomad Camp
Post by: Robz888 on April 17, 2012, 01:32:01 pm
This is annoying me, because I can't find the main source material that we used way back when to prove this. However:

http://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-042j-mathematics-for-computer-science-spring-2010/readings/MIT6_042JS10_chap18.pdf

Problem 18.5. (Do a ctrl+f) is designed to get you to exactly the right section, or:

"Problem 18.5.
I have a deck of 52 regular playing cards, 26 red, 26 black, randomly shuffled. They all lie face down in the deck so that you can’t see them. I will draw a card off the top of the deck and turn it face up so that you can see it and then put it aside. I will continue to turn up cards like this but at some point while there are still cards left in the deck, you have to declare that you want the next card in the deck to be turned up. If that next card turns up black you win and otherwise you lose. Either way, the game is then over.

(a) Show that if you take the first card before you have seen any cards, you then have probability 1/2 of winning the game.

(b) Suppose you don’t take the first card and it turns up red. Show that you then have a probability of winning the game that is greater than 1/2.

(c) If there are r red cards left in the deck and b black cards, show that the probability of winning if you take the next card is b/(r + b).

(d) Either,
1. come up with a strategy for this game that gives you a probability of winning strictly greater than 1/2 and prove that the strategy works, or,
2. come up with a proof that no such strategy can exist."

Point (D) 2. is asked because, against intuition, the only proof that exists is one showing that no strategy can exist, that is, your odds never change. They were determined at the outset.

(This is basically identical to the problem in the Venture thread, heh.) Anyway, the relevant parts here are 2b/2c, because we already know your first hand is CCCCE, which tells us your remaining cards are CCCEE (in some order).

And yet, if I've flipped over 12 red cards, and 4 black cards, the odds that the next card is black is STILL 50%. The information is predetermined with the original shuffle - This is much clearer when you think of revealing the BOTTOM card, rather than the NEXT card. Shuffle a deck (26/26, or 7/3), and remove the bottom card, without looking at it. What are the odds that it's red/black (or Estate/Copper)? Now reveal X cards, one at a time from the top of the deck. The odds on the removed card don't change. Yes, you could reveal all the other cards, and know with certainty what that last card IS, but its odds at any given moment are predefined.

Because of that, when evaluating what % chance you have to hit $5, the only relevant information is that bottom card, which, since you haven't changed its odds since the original shuffle, when it had 30%, is still 30%.

So part c is asking you to prove true something that is really false? 22/40

Not exactly. Yes, you have a better chance of winning - That is, there are more cards remaining in your favor than against - But the odds of the next card, or any given card, regardless of revealed information, remain 50%.

They absolutely do not remain 50%. If there is more of one card than the other, the odds of drawing that card are higher.
Title: Re: Math request: Nomad Camp
Post by: Fabian on April 17, 2012, 01:32:16 pm
Galzria,

Just want to make sure I'm getting your argument, please answer the following!

1. I shuffle my deck and draw my starting hand. My starting hand is 3 Estate and 2 Copper. Is the probability of me having an Estate on the bottom of my library a) 0% b) 30% c) 40% d) other?


2. I shuffle my deck and draw my starting hand. My starting hand is 2 Estate and 3 Copper. Is the probability of me having an Estate on the bottom of my library a) 0% b) 30% c) 40% d) other?

3. I shuffle my deck and draw my starting hand. My starting hand is 1 Estate and 4 Copper. Is the probability of me having an Estate on the bottom of my library a) 0% b) 30% c) 40% d) other?
Title: Re: Math request: Nomad Camp
Post by: timchen on April 17, 2012, 01:37:36 pm
I think your problem might be that you are confused by the idea of conditional probability.

Let us talk about a very simple deck with 1 copper and 1 estate. Sure, each card in the deck will have 50% being either.

I would assume that you agree that after I flip the first card to be an estate, the remaining card has to be a copper.

Yet in this case you claim the second card has a 50% chance to be an estate. The only plausible explanation to me is that you have some sort of idea of "prior probability" in mind: that is, you are thinking about this 50% chance at the beginning, or in an ensemble; that is, if you do this many times, there will be a 50% chance for the second card to be an estate.

But that is not the question we are interested in. The question is what the chance is of the second card being an estate, GIVEN THAT the first card was flipped and shown to be an estate. And the answer certainly is not 50%.
Title: Re: Math request: Nomad Camp
Post by: jonts26 on April 17, 2012, 01:42:43 pm
Alright, look - I can't find the source material right now, despite looking through everything I can find, so I'll abstain for the time being.

I was hoping that somebody else would have encountered this particular paradox, and could help me if they had the proof on hand (I imagine mine got recycled at some point, but I'm usually pretty organized and don't get rid of anything).

If I can find the information, I'll bring it out, but for now, follow the intuitive answer of %40 - However, it's still really %30.  ;)

I think no one else is helping you out here because you are quite obviously wrong. Either you are using a definition of odds which is not the standard definition or you have some sort of fundamental misunderstanding of conditional probability.

And it's been said many times already but let me try again. Probability and odds are a measure of unknown information. Once I reveal to you new information (like the first 5 cards) the odds can and do change. This is called conditional probability, the odds of something happening given a certain amount of information. The odds start at 30% but once you draw your hand with 1 estate they change to 40%.
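The conditional shift from 30% to 40% can also be verified by brute-force enumeration (a sketch, not from the thread; positions 0-4 are the opening hand, position 9 the bottom of the deck):

```python
from fractions import Fraction
from itertools import combinations

# A shuffle of the 7-Copper/3-Estate deck is determined by which 3 of the
# 10 positions hold Estates; all C(10,3) = 120 choices are equally likely.
shuffles = list(combinations(range(10), 3))

# Unconditional: P(the bottom card, position 9, is an Estate)
p_prior = Fraction(sum(9 in s for s in shuffles), len(shuffles))

# Conditional on exactly one Estate among the first 5 cards (the CCCCE hand)
matching = [s for s in shuffles if sum(pos < 5 for pos in s) == 1]
p_post = Fraction(sum(9 in s for s in matching), len(matching))

print(p_prior, p_post)   # 3/10 before looking, 2/5 after seeing the hand
```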

Also, as someone mentioned before, I highly recommend you read up on the Monty Hall problem for a particularly mind-bending example of this. http://en.wikipedia.org/wiki/Monty_Hall_problem
Title: Re: Math request: Nomad Camp
Post by: Galzria on April 17, 2012, 01:44:16 pm
Quote
Galzria,

Just want to make sure I'm getting your argument, please answer the following!

Quote
1. I shuffle my deck and draw my starting hand. My starting hand is 3 Estate and 2 Copper. Is the probability of me having an Estate on the bottom of my library a) 0% b) 30% c) 40% d) other?

The odds that the bottom card is an Estate IS still 30% - BUT - You know that it will be 0% of the time, since you have 3 of 3 in hand. This only holds true, however, because you've revealed 100% of the remaining. Thusly:

Quote
2. I shuffle my deck and draw my starting hand. My starting hand is 2 Estate and 3 Copper. Is the probability of me having an Estate on the bottom of my library a) 0% b) 30% c) 40% d) other?

The odds that the bottom card is an Estate IS still 30% - BUT - In this case, intuitively, 1 in 5 remain to be an Estate, so it would SEEM to be 20%, but it isn't, because its initial probability hasn't changed. Until you reach 0% or 100%, it remains 30% because that was the information the system was given to start. EVERY card is 30% until all are revealed. That is, even when you can know 100% of the time, if it IS or ISN'T, ITS odds are still 30%.

Quote
3. I shuffle my deck and draw my starting hand. My starting hand is 1 Estate and 4 Copper. Is the probability of me having an Estate on the bottom of my library a) 0% b) 30% c) 40% d) other?

As above, the odds that the bottom card is an Estate IS still 30% - BUT - In this case, intuitively, 2/5 remain to be an Estate, so it would SEEM to be 40%, but it isn't.

-- Either way! -- I can't provide the proof right now, so I'm happy to abstain and let it stand with the intuitive answer until I can.
Title: Re: Math request: Nomad Camp
Post by: GendoIkari on April 17, 2012, 01:45:37 pm
40%

Thanks. I have poor Nomad Camp luck; it seemed like I never managed to hit $5. To prove cognitive bias, though, I just went through all my games that had Nomad Camp. Turns out I bought it on turn one 12 times and hit $5 on 4 of those. So worse than average, but only a bit.
Title: Re: Math request: Nomad Camp
Post by: Voltgloss on April 17, 2012, 01:46:41 pm
Or, Galzria, how about this variant situation:

- Your deck consists of 3 Provinces, 1 Tactician, 1 Wishing Well, 8 Coppers, and 2 Silvers.
- Your opponent has 4 Provinces, 1 Curse, and no other victory point cards.  Buying the last Province will win you this 2-player game.
- You played Tactician last turn, discarding the rest of your hand.  Your deck was then empty, so you reshuffled (this reshuffle contains 3 Provinces, 1 Wishing Well, 8 Coppers, and 2 Silvers).
- You then drew 3 Provinces, 6 Coppers, and 1 Wishing Well.  You play your Wishing Well and draw another Copper.
- Your discard pile is empty.

Here are the questions:
- What are the odds your next card is Copper?
- What are the odds your next card is Silver?
- To have the best chance of winning this turn, what should you wish for?
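One way to answer those questions is to subtract everything seen since the reshuffle from the deck's contents; here's a sketch of that bookkeeping (my own tallying of Voltgloss's setup, not from the thread):

```python
from collections import Counter

# The post-reshuffle draw pile: 3 Provinces, 1 Wishing Well, 8 Coppers, 2 Silvers
deck = Counter({"Province": 3, "Wishing Well": 1, "Copper": 8, "Silver": 2})

# Seen since the reshuffle: the 10-card Tactician hand (3 Provinces, 6 Coppers,
# 1 Wishing Well) plus the Copper drawn by playing the Wishing Well
seen = Counter({"Province": 3, "Copper": 7, "Wishing Well": 1})

remaining = deck - seen
total = sum(remaining.values())
for card, n in sorted(remaining.items()):
    print(f"P(next is {card}) = {n}/{total}")
# The 3 unseen cards are 1 Copper and 2 Silvers, so wishing for Silver (2/3)
# beats wishing for Copper (1/3).
```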
Title: Re: Math request: Nomad Camp
Post by: Galzria on April 17, 2012, 01:48:27 pm
Alright, look - I can't find the source material right now, despite looking through everything I can find, so I'll abstain for the time being.

I was hoping that somebody else would have encountered this particular paradox, and could help me if they had the proof on hand (I imagine mine got recycled at some point, but I'm usually pretty organized and don't get rid of anything).

If I can find the information, I'll bring it out, but for now, follow the intuitive answer of 40% - However, it's still really 30%.  ;)

I think no one else is helping you out here because you are quite obviously wrong. Either you are using a definition of odds which is not the standard definition or you have some sort of fundamental misunderstanding of conditional probability.

And it's been said many times already but let me try again. Probability and odds are a measure of unknown information. Once I reveal to you new information (like the first 5 cards) the odds can and do change. This is called conditional probability, the odds of something happening given a certain amount of information. The odds start at 30% but once you draw your hand with 1 estate they change to 40%.

Also, as someone mentioned before, I highly recommend you read up on the Monty Hall problem for a particularly mind-bending example of this. http://en.wikipedia.org/wiki/Monty_Hall_problem

In particular Jonts, this came up (for me, years ago) following the Monty Hall problem, when, after solving it, we were provided with a list of other seeming paradoxes to choose from and provide the proofs for. This was one of them. It is just as unintuitive - That is, the easy answer to the Monty Hall problem is 33%, as the easy answer here is 40% (or in the case of the 26/26 card deck, whatever the odds would SEEM based on what you KNOW is left) - But it isn't. As Monty's is actually 66%, so here it is actually 30%, and in the case of the playing deck, 50%.
Title: Re: Math request: Nomad Camp
Post by: GendoIkari on April 17, 2012, 01:50:04 pm
If I have a 52 card deck, equal red and equal black, shuffled to together randomly, and I start to reveal cards to you one at a time...:
You may tell me to stop at any time, and guess what the color the bottom card will be. Can you ever increase your odds better than 50% that it will be black? What if I show you the NEXT card instead?

-- Against intuition, the answer is NO. It is always 50/50, because those odds were determined with initial information input of 26/26 - and even as you remove cards, and the total remaining may change, it doesn't change the initial odds on any GIVEN card from the remaining to be 50/50.

Um, actually, the answer is yes. Given that the top card is red, you now have a less than 50% chance that the bottom card is also red.

The reason this is different than the Monty Hall problem is that the top card was flipped randomly. With Monty Hall, he knows what is where, and ALWAYS opens a door that has a goat. He does not just choose a door to open randomly. This is why the odds of your door being the car don't change. But with the card example (and with the Dominion example), it's different. The top card had a chance to be black, but it turns out it wasn't. This changes things. If, on the other hand, you said "I'm now going to show you a red card" and then remove 1 red card from the deck; the odds of the bottom card haven't changed.
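The knows-vs-random distinction GendoIkari describes is easy to simulate; here's a sketch (a hypothetical helper, not from the thread, conditioning on Monty actually revealing a goat):

```python
import random

def monty(trials=100_000, monty_knows=True):
    """P(win by switching), conditioned on Monty opening a goat door."""
    wins = valid = 0
    for _ in range(trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        others = [d for d in range(3) if d != pick]
        if monty_knows:
            opened = next(d for d in others if d != car)  # always a goat
        else:
            opened = random.choice(others)                # may reveal the car
            if opened == car:
                continue                                  # discard: not a goat reveal
        valid += 1
        switch_to = next(d for d in range(3) if d not in (pick, opened))
        wins += (switch_to == car)
    return wins / valid

print(monty(monty_knows=True))    # converges to 2/3
print(monty(monty_knows=False))   # converges to 1/2
```

When Monty knowingly opens a goat door, switching wins 2/3 of the time; when he opens a random door that merely happens to show a goat, the conditional probability drops to 1/2, exactly the "new information" effect described above.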
Title: Re: Math request: Nomad Camp
Post by: DStu on April 17, 2012, 01:50:25 pm
Quote
-- Either way! -- I can't provide the proof right now, so I'm happy to abstain and let it stand with the intuitive answer until I can.
... which will be, if you want to provide a correct proof, ONCE AND FOR ALL!
Title: Re: Math request: Nomad Camp
Post by: Thisisnotasmile on April 17, 2012, 01:53:35 pm
For the record, I am now +1ing all of the exceptional trolling in this thread for its exceptionalness.
Title: Re: Math request: Nomad Camp
Post by: GendoIkari on April 17, 2012, 01:53:43 pm
Alright, look - I can't find the source material right now, despite looking through everything I can find, so I'll abstain for the time being.

I was hoping that somebody else would have encountered this particular paradox, and could help me if they had the proof on hand (I imagine mine got recycled at some point, but I'm usually pretty organized and don't get rid of anything).

If I can find the information, I'll bring it out, but for now, follow the intuitive answer of 40% - However, it's still really 30%.  ;)

I think no one else is helping you out here because you are quite obviously wrong. Either you are using a definition of odds which is not the standard definition or you have some sort of fundamental misunderstanding of conditional probability.

And it's been said many times already but let me try again. Probability and odds are a measure of unknown information. Once I reveal to you new information (like the first 5 cards) the odds can and do change. This is called conditional probability, the odds of something happening given a certain amount of information. The odds start at 30% but once you draw your hand with 1 estate they change to 40%.

Also, as someone mentioned before, I highly recommend you read up on the Monty Hall problem for a particularly mind-bending example of this. http://en.wikipedia.org/wiki/Monty_Hall_problem

In particular Jonts, this came up (for me, years ago) following the Monty Hall problem, when, after solving it, we were provided with a list of other seeming paradoxes to choose from and provide the proofs for. This was one of them. It is just as unintuitive - That is, the easy answer to the Monty Hall problem is 33%, as the easy answer here is 40% (or in the case of the 26/26 card deck, whatever the odds would SEEM based on what you KNOW is left) - But it isn't. As Monty's is actually 66%, so here it is actually 30%, and in the case of the playing deck, 50%.

Here's another way to think of it. Let's say instead of revealing the top card, you reveal the top 26 cards. And, based on crazy shuffle randomness, all 26 of them were red! According to your argument, there is still a 50/50 chance of the bottom card being red. So, do you think that is correct? Given that the top 26 cards are all red, what are the odds of the bottom card being red? 0 or .5?
Title: Re: Math request: Nomad Camp
Post by: jonts26 on April 17, 2012, 01:53:47 pm
Since this is getting nowhere, I suggest people entertain themselves with a particularly mind wrinkling stats puzzle.

http://scienceblogs.com/evolutionblog/2011/11/the_tuesday_birthday_problem.php
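For the curious, the punchline of that puzzle (13/27) can be confirmed by enumerating the equally likely (sex, weekday) pairs; a sketch, assuming day 0 stands for Tuesday:

```python
from fractions import Fraction
from itertools import product

# A child is a (sex, weekday) pair; all 2 * 7 = 14 combinations equally likely.
children = list(product(["boy", "girl"], range(7)))   # weekday 0 = Tuesday

families = list(product(children, children))          # 196 equally likely pairs
cond = [f for f in families if ("boy", 0) in f]       # at least one Tuesday boy
both_boys = [f for f in cond if f[0][0] == f[1][0] == "boy"]

print(Fraction(len(both_boys), len(cond)))   # 13/27
```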
Title: Re: Math request: Nomad Camp
Post by: DStu on April 17, 2012, 01:54:19 pm
Maybe we can discuss the card problem once this is solved, because it's a different problem, and it does not really help.

So back to the Estates. So we have 2 Estates and 3 Coppers left. Do you agree that each of the
EECCC
ECECC
ECCEC
ECCCE
CEECC
CECEC
CECCE
CCEEC
CCECE
CCCEE

is equally likely for the last 5 cards? If not, why not? And there happen to be just 4 out of 10 setups where the Estate is in the last position...
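DStu's ten orderings can be checked mechanically; a quick sketch (not from the thread):

```python
from itertools import permutations

# The 10 distinct orderings of the remaining CCCEE
orders = sorted(set(permutations("CCCEE")))
estate_last = [o for o in orders if o[-1] == "E"]

print(len(orders), len(estate_last))    # 10 orderings, 4 with an Estate last
print(len(estate_last) / len(orders))   # 0.4
```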
Title: Re: Math request: Nomad Camp
Post by: timchen on April 17, 2012, 01:54:56 pm
BUT - You know that it will be 0% of the time, since you have 3 of 3 in hand. This only holds true, however, because you've revealed 100% of the remaining.
This. I thought this is about the only argument you can make. But seriously:

Suppose we have this deck with 7 coppers and 3 estates. We draw the first hand to contain 1 estate.
 
According to you, the chance for the next card in the remaining 5 to be an estate is 30%. At least for us.

Suppose there is a guy who just came along. We told him that there are 3 coppers and 2 estates in the remaining 5-card deck. What would he say about the chance for the first card among the 5 to be an estate? 40%, without question.

Why and how can our answers be different?
Title: Re: Math request: Nomad Camp
Post by: Fabian on April 17, 2012, 01:55:10 pm
Quote
Galzria,

Just want to make sure I'm getting your argument, please answer the following!

Quote
1. I shuffle my deck and draw my starting hand. My starting hand is 3 Estate and 2 Copper. Is the probability of me having an Estate on the bottom of my library a) 0% b) 30% c) 40% d) other?

The odds that the bottom card is an Estate IS still 30% - BUT - You know that it will be 0% of the time, since you have 3 of 3 in hand. This only holds true, however, because you've revealed 100% of the remaining. Thusly:

Quote
2. I shuffle my deck and draw my starting hand. My starting hand is 2 Estate and 3 Copper. Is the probability of me having an Estate on the bottom of my library a) 0% b) 30% c) 40% d) other?

The odds that the bottom card is an Estate IS still 30% - BUT - In this case, intuitively, 1 in 5 remain to be an Estate, so it would SEEM to be 20%, but it isn't, because it's initial probability hasn't changed. Until you reach 0% or 100%, it remains 30% because that was the information the system was given to start. EVERY card is 30% until all are revealed. That is, even when you can know 100% of the time, if it IS or ISN'T, IT'S odds are still 30%.

Quote
3. I shuffle my deck and draw my starting hand. My starting hand is 1 Estate and 4 Copper. Is the probability of me having an Estate on the bottom of my library a) 0% b) 30% c) 40% d) other?

As above, The odds of the bottom card is an Estate IS still 30% - BUT - In this case, intuitively, 2/5 remain to be an Estate, so it would SEEM to be 40%, but it isn't.

-- Either way! -- I can't provide the proof right now, so I'm happy to abstain and let it stand with the intuitive answer until I can.

This is more than I could have hoped for, thanks!
Title: Re: Math request: Nomad Camp
Post by: GendoIkari on April 17, 2012, 01:55:56 pm
The odds that the bottom card is an Estate IS still 30% - BUT - You know that it will be 0% of the time, since you have 3 of 3 in hand. This only holds true, however, because you've revealed 100% of the remaining. Thusly:

Wait, what? ??? I'm pretty sure you're using the word "odds" here differently than everyone else. If something happens 0% of the time, then the odds of it happening are 0%. You can't say that the odds are 30%, but it will happen 0% of the time. That's not a paradox; that's just defining the word "odds" differently.
Title: Re: Math request: Nomad Camp
Post by: GendoIkari on April 17, 2012, 01:57:05 pm
Since this is getting nowhere, I suggest people entertain themselves with a particularly mind wrinkling stats puzzle.

http://scienceblogs.com/evolutionblog/2011/11/the_tuesday_birthday_problem.php

Nooooooo, not the Tuesday Birthday Problem!!!
Title: Re: Math request: Nomad Camp
Post by: Galzria on April 17, 2012, 02:05:06 pm
For the record, I am now +1ing all of the exceptional trolling in this thread for it's exceptionalness.

To be fair, I wasn't intentionally trolling, and if I could find this stupid proof, it wouldn't have gotten so out of hand. I've also tried to back down until I CAN, as it's as obvious to me as it is to you and anybody else that the right answer APPEARS to be 40%. It's the nature of the problem.

Without the math and proof behind it, I'm more than happy to let it go. I thought I knew where the material was, and I was wrong, so now am stuck in an awkward position of having done this twice academically, and yet not being able to provide more information.

If I can uncover where I put it, I'll be happy to come back with that information.

Quote
-- Either way! -- I can't provide the proof right now, so I'm happy to abstain and let it stand with the intuitive answer until I can.
... which will be, if you want to provide a correct proof, ONCE AND FOR ALL!

That's rather uncalled for. Fair enough that I haven't been able to provide a proof for my reasoning yet, but I've done my best to explain it in layman's terms without it. I've also acknowledged that at this time, I'm willing to let it go with 40% until I can. It took me many weeks of staring at the solution to believe it, because yes, it is VERY unintuitive. Still, I haven't been rude about my reasons for my suggestion.
Title: Re: Math request: Nomad Camp
Post by: GendoIkari on April 17, 2012, 02:13:12 pm
Galzria, the thing is you have already stated that "your odds of winning are > 50%." You've stated that certain things will happen a certain percentage of the time. (Like having an Estate on the bottom if you draw 3 Estates in your hand). These percentages are what everyone here is talking about when we are talking about "odds." The "odds of an Estate being on the bottom, given that you have 3 Estates in hand" is the same as saying "if you were to play 1000 opening hands, and consider only the ones where you start with 3 Estate in hand, how many of them will have an Estate on the bottom?" The answer is, as you would agree, 0. None of those games would have an Estate on the bottom.

Now, there are a lot of other games in there (about 30%) where you would have an Estate on the bottom, but those games aren't being considered. They are irrelevant and have no bearing on the question at hand. It's a different question completely. No one here is asking about the odds of an Estate being on the bottom. We are asking "if you start with 3 Estates in hand, and do this a whole bunch of times, how many of those times will you have an Estate on the bottom?"

Read my post about the Monty Hall problem. This is different from that simply because there was a chance that you would have drawn fewer than 3 Estates in hand, but you didn't. With Monty Hall, there was NOT a chance that the door revealed would have been the car. Monty knows where the car is, and opens a non-car door every time. Basically, Monty is not giving ANY new information to you at all!  You already knew that at least 1 of the 2 doors you didn't choose had a goat, and, well, that's still all you know. No new information was given. If Monty hadn't known where the car was, and had just picked a door at random, then it's a different story. Now new information has been given, because a door that originally had a chance to have the car now doesn't have it.

*Edit* Just saw that Jont's link addresses this with Monty Hall:

Quote
The classic example of this is the Monty Hall problem. (I shall assume you are familiar with this problem. If you are not, I know a good book you should read.) The common fallacy is to ignore what we know about how Monty makes decisions. Thus, when he opens an empty door we tend to think, mistakenly, that we have only learned that that door is empty. In reality we have learned that Monty, who makes his decisions in a rigidly controlled way, chose to take a certain action.

*Changed 1 Estate in hand to 3 Estates in hand because it better illustrates the point, without changing the discussion.
Title: Re: Math request: Nomad Camp
Post by: DG on April 17, 2012, 02:17:01 pm
If you've got a bag of red and black dogs, and draw out a blue dog, should you give it to Monty Hall?
Title: Re: Math request: Nomad Camp
Post by: Kuildeous on April 17, 2012, 02:23:03 pm
If you've got a bag of red and black dogs, and draw out a blue dog, should you give it to Monty Hall?

I can't answer that. I need to know if the blue dog's birthday is on a Tuesday.
Title: Re: Math request: Nomad Camp
Post by: WanderingWinder on April 17, 2012, 02:25:26 pm
If you've got a bag of red and black dogs, and draw out a blue dog, should you give it to Monty Hall?
Do you have an orange goat in another bag, or is it only a pink elephant?
Title: Re: Math request: Nomad Camp
Post by: DStu on April 17, 2012, 02:30:12 pm
To be fair, I wasn't intentionally trolling, and if I could find this stupid proof, it wouldn't have gotten so out of hand. I've also tried to back down until I CAN, as it's as obvious to me as it is to you and anybody else that the right answer APPEARS to be 40%. It's the nature of the problem.
If you are talking about the proof for your card game, where the best strategy gives you a 50% win chance:


Obviously there exists a strategy with p=0.5, namely just guess "black" at the first turn.
Now, every other strategy, one that does not guess "black" on the first turn, does not have a higher win chance:
Any strategy that doesn't guess "black" on the first turn loses with probability 50% already after the first step, namely if the card is black. So no matter what you do on later turns, you can't do better than 50%.
qed
Title: Re: Math request: Nomad Camp
Post by: Kuildeous on April 17, 2012, 02:37:19 pm
If I can find the information, I'll bring it out, but for now, follow the intuitive answer of 40% - However, it's still really 30%.  ;)

Sadly, my own math is rusty. I recognize that it's a conditional probability question. I recognize that the answer is not 30%. Like you, I do not personally have the math handy to show this (though others have provided excellent dissertations).

I will say that hard numbers show that it's 40%. I created a little simulation in Excel. Out of a batch of 30k draws, the simulation picked out how many draws had exactly 4 Coppers in hand and an Estate at the bottom. The percentage was indeed 40%. Granted, while my original sample size was 30k, the subset of first hands containing exactly 1 Estate ranged between 12k and 13k. Still, a pretty good sample size. Every time I regenerated the data with the randomizer, the final number was about 40%.

In fact, I expanded it to consider other assumptions.

If the first hand contains 5 Copper: 60% occurrence of bottom card being Estate.
If the first hand contains 4 Copper: 40% occurrence of bottom card being Estate.
If the first hand contains 3 Copper: 20% occurrence of bottom card being Estate.
If the first hand contains 2 Copper: 0% occurrence of bottom card being Estate (seemed silly to test this, but I like to double-check my code).

So, I'm sorry to say that your claim of 30% does not hold up to actual card draws.

I don't doubt that you thought you had the material to back this up. As Gendo pointed out, this is not exactly like the Monty Hall problem. You may have read something that sounds similar, but it really isn't the same, and conditional probability is the culprit.
Title: Re: Math request: Nomad Camp
Post by: Fabian on April 17, 2012, 02:52:37 pm
Kuildeous, here are the actual numbers though, proof here (http://forum.dominionstrategy.com/index.php?topic=2260.msg35142#msg35142).

If the first hand contains 5 Copper: 30% occurrence of bottom card being Estate.
If the first hand contains 4 Copper: 30% occurrence of bottom card being Estate.
If the first hand contains 3 Copper: 30% occurrence of bottom card being Estate.
If the first hand contains 2 Copper: 0% occurrence of bottom card being Estate (odds are still 30% though)
Title: Re: Math request: Nomad Camp
Post by: Kuildeous on April 17, 2012, 03:05:06 pm
Not to belabor the point (too late), but I think this illustration might help shed some light.

I have a simulation with 30,057 random card draws. Of those 30,057, there are 9,110 results of the bottom card being an Estate, yielding 30.3%. This is to be expected. So far, so good.

A) Of those 30,057, there are 2,570 results where the first hand has 0 Estates. Of those 2,570, there are 1,536 results of the bottom card being an Estate, yielding 59.8%. This coincides with an earlier claim that the probability is 60%.

B) Likewise, there are 12,464 results where the first hand has 1 Estate. Of those 12,464, there are 5,102 results of the bottom card being an Estate, yielding 40.9%. Again, this coincides with the claim that the probability is 40%.

C) There are 12,484 results where the first hand has 2 Estates. Of those, there are 2,472 results of the bottom card being an Estate, yielding 19.8%.

D) Naturally, 0% of the 2,539 results where the first hand has 3 Estates had an Estate on the bottom.

This is where the condition is important. If you add all these up (the probability of A times the sample size of A, and so forth), then you get the final number of 30%. It is absolutely true that the probability of the last card (indeed any one card) being an Estate is 30% when you don't know anything about the deck. Once you know the first hand, then you know that you are in case A, B, C, or D. The probability of the final card of a fresh shuffle being an Estate is still 30%, but you are now no longer looking at a fresh shuffle. You are now looking at one of four cases. Knowledge of the first hand gives you better information.
Title: Re: Math request: Nomad Camp
Post by: sjelkjd on April 17, 2012, 03:11:18 pm
if I could find this stupid proof, it wouldn't have gotten so out of hand
You can't find the proof because it doesn't exist.  If you won't believe logic, maybe you'll believe a simulation:
Code: [Select]
#include <algorithm>
#include <vector>
#include <iostream>
using namespace std;

int main()
{
    // Starting deck: 7 Coppers, 3 Estates
    vector<char> cards;
    for (int i = 0; i < 7; i++)
        cards.push_back('C');
    for (int i = 0; i < 3; i++)
        cards.push_back('E');

    const int TRIALS = 1000000;
    int success = 0;
    int total = 0;
    for (int j = 0; j < TRIALS; j++)
    {
        random_shuffle(cards.begin(), cards.end());

        // Count Estates in the opening hand (top 5 cards)
        int eCount = 0;
        for (int i = 0; i < 5; i++)
        {
            if (cards[i] == 'E') eCount++;
        }

        // Keep only the 4-Copper ($4) openings
        if (eCount == 1)
        {
            // Turn 2: Nomad Camp ($2) plus the next 4 cards
            int money = 2;
            for (int i = 0; i < 4; i++)
            {
                if (cards[5 + i] == 'C') money++;
            }
            if (money == 5) success++;
            total++;
        }
    }
    double ratio = ((double)success) / total;
    cout << ratio << endl;
}
Guess what?  Answer is 40%
Title: Re: Math request: Nomad Camp
Post by: eHalcyon on April 17, 2012, 03:17:35 pm
Perhaps Galzria is thinking of some form of the Gambler's Fallacy.  Suppose that I flip a fair coin.  What is the chance that I flip heads?  The answer is, of course, 50%.

Now let's reword the problem: Suppose that I am flipping a fair coin.  What is the chance that the next flip will be heads?  The answer is still, of course, 50%.

Now here's the rewording that matches it up to our initial problem: Suppose that I flip a fair coin 10 times.  The first 5 flips come up TAILS.  What is the chance that the last flip will be heads?  The answer is STILL 50%.

The thing is, this doesn't apply to cards at all.  Each individual flip of the coin has a 50% chance of coming up heads.  You don't know what it is until it happens.  The same is not true of cards -- they are what they are.  If you have 5 red cards and 5 black cards, revealing 5 red guarantees that the next 5 are black.

I'm guessing that Galzria is remembering a proof for some concept that doesn't actually apply to the question at hand, though at first glance it might seem relevant.
Title: Re: Math request: Nomad Camp
Post by: jonts26 on April 17, 2012, 03:21:29 pm
Perhaps Galzria is thinking of some form of the Gambler's Fallacy.  Suppose that I flip a fair coin.  What is the chance that I flip heads?  The answer is, of course, 50%.

Now let's reword the problem: Suppose that I am flipping a fair coin.  What is the chance that the next flip will be heads?  The answer is still, of course, 50%.

Now here's the rewording that matches it up to our initial problem: Suppose that I flip a fair coin 10 times.  The first 5 flips come up TAILS.  What is the chance that the last flip will be heads?  The answer is STILL 50%.

The thing is, this doesn't apply to cards at all.  Each individual flip of the coin has a 50% chance of coming up heads.  You don't know what it is until it happens.  The same is not true of cards -- they are what they are.  If you have 5 red cards and 5 black cards, revealing 5 red guarantees that the next 5 are black.

I'm guessing that Galzria is remembering a proof for some concept that doesn't actually apply to the question at hand, though at first glance it might seem relevant.

The difference between the cards and the coin is that the coin flip results are independent of each other, whereas the card flips are not. Past results will influence future ones. Which is what you said, I just wanted to make sure we had proper terminology.
Title: Re: Math request: Nomad Camp
Post by: Robz888 on April 17, 2012, 03:31:12 pm
This reminds me of the Lightning Town problem (I don't know what other people call it--I call it the Lightning Town problem).

You have a town that experiences weirdly frequent lightning storms. The storms occur randomly, but on average, once a week. There is a lightning storm today (Tuesday). What day is the most likely day of the next storm?

Tomorrow (Wednesday). Most people incorrectly say either a week from today, or every day is equally likely. They are equally likely to have a storm, true, but they are not equally likely to be the next day with a storm. For instance, for the day after tomorrow (Thursday) to be the next day with a storm, there would have to be no storm on Wednesday. For Friday to be the next day with a storm, there would have to be no storm on Thursday or Wednesday, which is even less likely. So we discover that tomorrow is the next most likely day for a lightning storm, because there is the least amount of time for a storm to interrupt what you could call the next-day chain.
Title: Re: Math request: Nomad Camp
Post by: Galzria on April 17, 2012, 03:52:13 pm
This reminds me of the Lightning Town problem (I don't know what other people call it--I call it the Lightning Town problem).

You have a town that experiences weirdly frequent lightning storms. The storms occur randomly, but on average, once a week. There is a lightning storm today (Tuesday). What day is the most likely day of the next storm?

Tomorrow (Wednesday). Most people incorrectly say either a week from today, or every day is equally likely. They are equally likely to have a storm, true, but they are not equally likely to be the next day with a storm. For instance, for the day after tomorrow (Thursday) to be the next day with a storm, there would have to be no storm on Wednesday. For Friday to be the next day with a storm, there would have to be no storm on Thursday or Wednesday, which is even less likely. So we discover that tomorrow is the next most likely day for a lightning storm, because there is the least amount of time for a storm to interrupt what you could call the next-day chain.

We always did that as Popquiz:


Teacher: Sometime next week (Monday-Friday) will be a popquiz, but I'm not going to tell you when, because I don't want you to put off studying until the night before.
Student 1: Well, it can't be Friday then, because if it hadn't happened on Monday through Thursday, we would KNOW it was Friday, and we would study Thursday night.
Student 2: And if it can't be Friday, then we KNOW it can't be Thursday, because if we haven't had it by Wednesday afternoon, we would KNOW to study that night.
Student 3: ...
Class: Our popquiz therefore MUST be on Monday, so we should study this weekend.
Title: Re: Math request: Nomad Camp
Post by: Axxle on April 17, 2012, 04:02:03 pm
This reminds me of the Lightning Town problem (I don't know what other people call it--I call it the Lightning Town problem).

You have a town that experiences weirdly frequent lightning storms. The storms occur randomly, but on average, once a week. There is a lightning storm today (Tuesday). What day is the most likely day of the next storm?

Tomorrow (Wednesday). Most people incorrectly say either a week from today, or every day is equally likely. They are equally likely to have a storm, true, but they are not equally likely to be the next day with a storm. For instance, for the day after tomorrow (Thursday) to be the next day with a storm, there would have to be no storm on Wednesday. For Friday to be the next day with a storm, there would have to be no storm on Thursday or Wednesday, which is even less likely. So we discover that tomorrow is the next most likely day for a lightning storm, because there is the least amount of time for a storm to interrupt what you could call the next-day chain.

We always did that as Popquiz:


Teacher: Sometime next week (Monday-Friday) will be a popquiz, but I'm not going to tell you when, because I don't want you to put off studying until the night before.
Student 1: Well, it can't be Friday then, because if it hadn't happened on Monday through Thursday, we would KNOW it was Friday, and we would study Thursday night.
Student 2: And if it can't be Friday, then we KNOW it can't be Thursday, because if we haven't had it by Wednesday afternoon, we would KNOW to study that night.
Student 3: ...
Class: Our popquiz therefore MUST be on Monday, so we should study this weekend.


And then she has it Friday after you guys study all weekend.  There's some fallacy here.
Title: Re: Math request: Nomad Camp
Post by: Galzria on April 17, 2012, 04:02:40 pm
This reminds me of the Lightning Town problem (I don't know what other people call it--I call it the Lightning Town problem).

You have a town that experiences weirdly frequent lightning storms. The storms occur randomly, but on average, once a week. There is a lightning storm today (Tuesday). What day is the most likely day of the next storm?

Tomorrow (Wednesday). Most people incorrectly say either a week from today, or every day is equally likely. They are equally likely to have a storm, true, but they are not equally likely to be the next day with a storm. For instance, for the day after tomorrow (Thursday) to be the next day with a storm, there would have to be no storm on Wednesday. For Friday to be the next day with a storm, there would have to be no storm on Thursday or Wednesday, which is even less likely. So we discover that tomorrow is the next most likely day for a lightning storm, because there is the least amount of time for a storm to interrupt what you could call the next-day chain.

We always did that as Popquiz:


Teacher: Sometime next week (Monday-Friday) will be a popquiz, but I'm not going to tell you when, because I don't want you to put off studying until the night before.
Student 1: Well, it can't be Friday then, because if it hadn't happened on Monday through Thursday, we would KNOW it was Friday, and we would study Thursday night.
Student 2: And if it can't be Friday, then we KNOW it can't be Thursday, because if we haven't had it by Wednesday afternoon, we would KNOW to study that night.
Student 3: ...
Class: Our popquiz therefore MUST be on Monday, so we should study this weekend.


And then she has it Friday after you guys study all weekend.  There's some fallacy here.

Of course. That's a bit of the point, but as with Lightning Town, the thought process that brings you to the answer of tomorrow/Monday is the same. In truth, the teacher is NOT randomly choosing a day once per week. But it's the same logic. The longer you go, the more likely it is to occur the following day. Since she's looking for a surprise, she's looking for the smallest chance of your knowing when it'll be.
Title: Re: Math request: Nomad Camp
Post by: toaster on April 17, 2012, 04:23:19 pm
The pop quiz is actually a different problem...it's about game theory or logic more than probability, and the reasoning behind it is quite different.  The most common set up for it is as the Paradox of the Unexpected Hanging (http://en.wikipedia.org/wiki/Unexpected_hanging_paradox).
Title: Re: Math request: Nomad Camp
Post by: Donald X. on April 17, 2012, 04:27:54 pm
The difference between the cards and the coin is that the coin flip results are independent of each other, whereas the card flips are not. Past results will influence future ones. Which is what you said, I just wanted to make sure we had proper terminology.
That terminology is "hypergeometric distribution." There's a wikipedia article on it and everything.

Also: You are standing on one side of a river, with a fox, a goat, and a basket of cabbages. One of them always lies, one always tells the truth, and one is a basket of cabbages. You have to determine which is which using only one weighing on a scale.
Title: Re: Math request: Nomad Camp
Post by: Galzria on April 17, 2012, 04:35:28 pm
The pop quiz is actually a different problem...it's about game theory or logic more than probability, and the reasoning behind it is quite different.  The most common set up for it is as the Paradox of the Unexpected Hanging (http://en.wikipedia.org/wiki/Unexpected_hanging_paradox).

Yes, that would be it almost exactly. And like I said, they aren't the same problem. But the idea behind the prisoner's hanging, or pop quiz, was to illustrate a point, disregarding the fallacies of the logic. The key being "surprise", rather than semi-controlled randomness. It was still designed to get you to "the most likely day is tomorrow".

But following that logic, Lightning Town is a cleaner problem, because it erases the unknown factor. It still brings you to the most likely day, because anything after would be less probable. And that's, at the basic level, what the prisoner's hanging / pop quiz was trying to show. It just leaves too many unknowns, and a loose definition of "surprise". It leads to conclusions that can't be drawn with certainty.
Title: Re: Math request: Nomad Camp
Post by: Axxle on April 17, 2012, 04:43:42 pm
The pop quiz is actually a different problem...it's about game theory or logic more than probability, and the reasoning behind it is quite different.  The most common set up for it is as the Paradox of the Unexpected Hanging (http://en.wikipedia.org/wiki/Unexpected_hanging_paradox).

Yes, that would be it almost exactly. And like I said, they aren't the same problem. But the idea behind the prisoner's hanging, or pop quiz, was to illustrate a point, disregarding the fallacies of the logic. The key being "surprise", rather than semi-controlled randomness. It was still designed to get you to "the most likely day is tomorrow".

But following that logic, Lightning Town is a cleaner problem, because it erases the unknown factor. It still brings you to the most likely day, because anything after would be less probable. And that's, at the basic level, what the prisoner's hanging / pop quiz was trying to show. It just leaves too many unknowns, and a loose definition of "surprise". It leads to conclusions that can't be drawn with certainty.
There's still the very real difference of Lightning Town being about a chance of something happening each day over an unbounded stretch of time, with the probability of it happening not changing even with known information about previous days, and the pop quiz being about something definitely happening in a finite space. As people have been saying this entire thread, it's the difference between flipping coins and drawing cards.
Title: Re: Math request: Nomad Camp
Post by: toaster on April 17, 2012, 04:45:12 pm
Except that the "solution" of the pop quiz/unexpected hanging problem *isn't* "have it tomorrow", nor is it trying to make a point anything like the lightning problem.  The entire point of the unexpected hanging problem is the loose definition of surprise and how that leads to a superficially "airtight" logical inference that turns out to be wrong.  The paradox isn't a problem with the unexpected hanging, it's the entire point of the scenario.
Title: Re: Math request: Nomad Camp
Post by: Voltgloss on April 17, 2012, 04:45:45 pm
Also: You are standing on one side of a river, with a fox, a goat, and a basket of cabbages. One of them always lies, one always tells the truth, and one is a basket of cabbages. You have to determine which is which using only one weighing on a scale.

Sorry, I'm too busy waiting for the basket of cabbages to figure out whether it's wearing a red hat or a white hat, based on the fox and the goat being unable to deduce same.
Title: Re: Math request: Nomad Camp
Post by: eHalcyon on April 17, 2012, 04:50:28 pm
Also: You are standing on one side of a river, with a fox, a goat, and a basket of cabbages. One of them always lies, one always tells the truth, and one is a basket of cabbages. You have to determine which is which using only one weighing on a scale.

Sorry, I'm too busy waiting for the basket of cabbages to figure out whether it's wearing a red hat or a white hat, based on the fox and the goat being unable to deduce same.

The bear that eats them all is white, because they are at the North Pole.
Title: Re: Math request: Nomad Camp
Post by: Robz888 on April 17, 2012, 04:55:18 pm
Except that the "solution" of the pop quiz/unexpected hanging problem *isn't* "have it tomorrow", nor is it trying to make a point anything like the lightning problem.  The entire point of the unexpected hanging problem is the loose definition of surprise and how that leads to a superficially "airtight" logical inference that turns out to be wrong.  The paradox isn't a problem with the unexpected hanging, it's the entire point of the scenario.

Exactly. The Lightning Town and the Pop Quiz are very different. The Lightning Town, I think, is supposed to illustrate why random events appear to occur non-randomly. The lightning storms tend to look planned, rather than random, because they happen in clusters. But they look like clusters to us for good reason, because the day of the next lightning storm is more likely to be tomorrow than 1,000 years from now.

It's similar to the coin flip. People expect roughly the same number of heads as tails, but a random series of coin flips could very well come out TTTTTTHHTT, and so on.
Title: Re: Math request: Nomad Camp
Post by: michaeljb on April 17, 2012, 04:58:20 pm
Fair enough that I havn't been able to provide a proof for my reasoning yet, but I've done my best to explain it in Lamens terms without it.

I've figured it out! You just need to find your magical pendant (http://en.wikipedia.org/wiki/Lamen_(magic)), then you'll be able to give us this elusive proof.

edit: FWIW, if the starting hand is dealt, but no one looks at it to discover that it consists of 4 Coppers and 1 Estate, I do believe the odds of the bottom card being an Estate are indeed 30%. As long as you don't look at that starting hand.
Title: Re: Math request: Nomad Camp
Post by: toaster on April 17, 2012, 05:12:08 pm
Exactly. The Lightning Town and the Pop Quiz are very different. The Lightning Town, I think, is supposed to illustrate why random events appear to occur non-randomly. The lightning storms tend to look planned, rather than random, because they happen in clusters. But they look like clusters to us for good reason, because the day of the next lightning storm is more likely to be tomorrow than 1,000 years from now.

It's similar to the coin flip. People expect roughly the same number of heads as tails, but a random series of coin flips could very well come out TTTTTTHHTT, and so on.

Actually, the Lightning Town example doesn't illustrate that events tend to happen in clusters... if you take the same problem but state that there *wasn't* a lightning storm today, the most likely date of the next storm remains the same: tomorrow.

On the other hand, your point about what humans think of as random and how they perceive clusters where there aren't any is right on the money.
Title: Re: Math request: Nomad Camp
Post by: WanderingWinder on April 17, 2012, 05:22:57 pm
So I think Galzria's problem is this. Here's the proof he remembers. You have the red/black card game thing. When you flip a card over, it of course doesn't change the distribution of cards, which were determined by the shuffle. However, the new information of this top card of course changes what we know. Specifically, every distribution of cards which has that top card at any position other than the top has been eliminated. And so now, after that's happened, we have a better idea about each of the other cards in the deck. It doesn't make it more likely that your shuffling process produces a deck with a black card on the bottom. It just changes what we know about this particular instance, this particular shuffled deck end product, because certain distributions we thought to be possible before we now know are not possible. Ok.
Now, so if the top card is revealed and we know that it's red, then that makes it more likely that the bottom card is black, not because it changes the shuffle order, but because there's more different possibilities left that have black cards on bottom (because there's more black cards left in the deck) than there are that have red on bottom. And so at this point, you CAN make predictions better than 50/50. But the clever proof he's remembering shows that you can't actually take advantage of this extra information to win more than 50% of the time, and the reasoning is, because you need the bottom card to be a particular color to win. And the only time you can actually have any choice (let's say you get to pick which color it is), is before the distribution of cards is fixed. Or let's say you can pick when to stop and look at the bottom card. Fine. Obviously, if you stop when there's more cards in the deck that are black than are red, you're more likely for it to be black. But the marginal gains you get there are totally cancelled out by the losses you get from basically never getting to the situation where you've revealed more reds than blacks. Because every black you reveal hurts your probabilities, and you're as likely to get this hurt as you are to get any help. Now, if you could choose to never go for that bottom card, then you win every time - just wait until all the cards left in the deck are what you need them to be. And this jives with our intuition that no part of looking at the top X cards of the deck will change the color of the bottom card. Because it doesn't. But that doesn't mean we can't guess better now, because we have more information.

So the analogous thing here is basically saying that that bottom card is either an estate, or it isn't. And buying the nomad camp doesn't change that. Drawing your first hand doesn't change that. BUT, there are 10! ways of distributing your 10 cards (assuming you have the faces marked 0-9, so you can tell the difference). 30% of those have an estate on the bottom. And after you see that first hand, man, that doesn't change that 30% of the time you shuffle, you get an estate on the bottom. BUT, after you see your first hand, you know that a whole bunch of those distributions can't be right, because the first 5 cards don't conform. So you have to throw all those out as impossible, and your space of possible distributions shrinks. So in this case, all those cases that had 0 or 1 or 3 estates in the bottom 5 cards aren't viable anymore, because we know from the first hand that those bottom 5 MUST contain 2 estates. And bottom-5 groups containing exactly 2 estates have an estate as the last card more often (2 times in 5) than the pre-shuffle mix of 0-to-3 possibilities does, and there you get the 40% number.

Now, I would love to get a statistics sub-forum so I can go off on this idiotic Tuesday boy birthday garbage :)
Title: Re: Math request: Nomad Camp
Post by: michaeljb on April 17, 2012, 05:29:16 pm
So I think Galzria's problem is this.

...


While I really enjoyed the rest of your post, and completely agree with it, I'm not sure I agree that that's what Galzria's problem is....after all there was this:

Quote
Galzria,

Just want to make sure I'm getting your argument, please answer the following!

Quote
1. I shuffle my deck and draw my starting hand. My starting hand is 3 Estate and 2 Copper. Is the probability of me having an Estate on the bottom of my library a) 0% b) 30% c) 40% d) other?

The odds that the bottom card is an Estate IS still 30% - BUT - You know that it will be 0% of the time, since you have 3 of 3 in hand. This only holds true, however, because you've revealed 100% of the remaining

(http://i.qkme.me/35lyg2.jpg)
Title: Re: Math request: Nomad Camp
Post by: Taco Lobster on April 17, 2012, 07:39:57 pm
Also: You are standing on one side of a river, with a fox, a goat, and a basket of cabbages. One of them always lies, one always tells the truth, and one is a basket of cabbages. You have to determine which is which using only one weighing on a scale.

Sorry, I'm too busy waiting for the basket of cabbages to figure out whether it's wearing a red hat or a white hat, based on the fox and the goat being unable to deduce same.

The bear that eats them all is white, because they are at the North Pole.

Or on the Island...
Title: Re: Math request: Nomad Camp
Post by: DStu on April 18, 2012, 02:02:32 am
The bear that eats them all is white, because they are at the North Pole.

Is this an African or a European polar bear?
Title: Re: Math request: Nomad Camp
Post by: jonts26 on April 18, 2012, 02:06:51 am
I...I don't knowwww aahhhhhhhhh.
Title: Re: Math request: Nomad Camp
Post by: Kirian on April 18, 2012, 02:25:04 am
Also: You are standing on one side of a river, with a fox, a goat, and a basket of cabbages. One of them always lies, one always tells the truth, and one is a basket of cabbages. You have to determine which is which using only one weighing on a scale.

Thank you Donald, this is now my FB status.
Title: Re: Math request: Nomad Camp
Post by: Axxle on April 18, 2012, 02:47:08 am
Also: You are standing on one side of a river, with a fox, a goat, and a basket of cabbages. One of them always lies, one always tells the truth, and one is a basket of cabbages. You have to determine which is which using only one weighing on a scale.

Thank you Donald, this is now my FB status.
Mine too, hahaha ;D
I added a bit to the end though. "Do you testify against your partner?"
Title: Re: Math request: Nomad Camp
Post by: Kirian on April 18, 2012, 02:51:04 am
The pop quiz is actually a different problem...it's about game theory or logic more than probability, and the reasoning behind it is quite different.  The most common set up for it is as the Paradox of the Unexpected Hanging (http://en.wikipedia.org/wiki/Unexpected_hanging_paradox).

What's awesome is that 5 clicks away from that paradox on Wikipedia I got to analysis paralysis, an experience many of us have had in Dominion as well.  I love this thread, but the answer is 40%.
Title: Re: Math request: Nomad Camp
Post by: Axxle on April 18, 2012, 04:27:15 am
The pop quiz is actually a different problem...it's about game theory or logic more than probability, and the reasoning behind it is quite different.  The most common set up for it is as the Paradox of the Unexpected Hanging (http://en.wikipedia.org/wiki/Unexpected_hanging_paradox).

What's awesome is that 5 clicks away from that paradox on Wikipedia I got to analysis paralysis, an experience many of us have had in Dominion as well.  I love this thread, but the answer is 40%.
5 clicks is a lot of clicks: It's also 5 clicks away from Hitler (Paradox > Impossible object > M.C._Escher > Dutch people > Hitler).
Thank Godwin.
Title: Re: Math request: Nomad Camp
Post by: qmech on April 18, 2012, 04:49:43 am
But the clever proof he's remembering shows that you can't actually take advantage of this extra information to win more than 50% of the time, because in the end you still need the bottom card to be a particular color to win.

The extra small observation you need is that while in the original game you were betting on the next card being black, it's entirely equivalent to be betting on the bottom card being black, as it's just one of the remaining cards chosen uniformly at random.  Formally it's a simple coupling argument.

It's slightly embarrassing that this is the topic that gets me to register for such excellent forums.

Title: Re: Math request: Nomad Camp
Post by: BadAssMutha on April 18, 2012, 09:45:55 am
One comment about the Lightning Town problem - while the most likely day for the next lightning storm is indeed tomorrow, we can still say that there will probably not be a lightning storm until next week. By this I mean that  the cumulative probability of there being a storm won't reach 50% until a week has passed. Each day has a decreasing probability of being the first day that it rains, but we need to add these all up to get the intuitive answer that storms happen, on average, once a week. Hope I didn't open another can of worms here.
Title: Re: Math request: Nomad Camp
Post by: Davio on April 18, 2012, 09:57:07 am
Just sneakily look at the bottom card through a glass table and be done with it.
Title: Re: Math request: Nomad Camp
Post by: toaster on April 18, 2012, 11:47:30 am
One comment about the Lightning Town problem - while the most likely day for the next lightning storm is indeed tomorrow, we can still say that there will probably not be a lightning storm until next week. By this I mean that  the cumulative probability of there being a storm won't reach 50% until a week has passed. Each day has a decreasing probability of being the first day that it rains, but we need to add these all up to get the intuitive answer that storms happen, on average, once a week. Hope I didn't open another can of worms here.

Actually, that's not correct.  If storms happen once per week, the median time between storms is actually about 4.5 days.  For some very inexact intuition, think about the fact that at a minimum, the time to the next storm is a day, but there is no maximum time to the next storm...and thus for a mean interval of a week, there are more intervals less than a week than greater than a week.  See http://en.wikipedia.org/wiki/Geometric_distribution
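That median is easy to sanity-check numerically. Here's a small sketch (mine, not from the post), assuming each day has an independent 1-in-7 chance of a storm; it finds the first day by which a storm has happened with probability at least 1/2:

```python
# First day (1-indexed) by which a storm has occurred with probability >= 1/2,
# assuming each day has an independent 1-in-7 chance of a storm.
p = 1 / 7
cum = 0.0
for day in range(1, 15):
    cum += p * (1 - p) ** (day - 1)  # P(first storm falls on exactly this day)
    if cum >= 0.5:
        print(day)  # -> 5
        break
```

The continuous approximation ln 2 / ln(7/6) ≈ 4.5 is where the "about 4.5 days" figure comes from; the discrete cumulative probability first crosses 50% on day 5.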
Title: Re: Math request: Nomad Camp
Post by: BadAssMutha on April 18, 2012, 12:05:44 pm
One comment about the Lightning Town problem - while the most likely day for the next lightning storm is indeed tomorrow, we can still say that there will probably not be a lightning storm until next week. By this I mean that  the cumulative probability of there being a storm won't reach 50% until a week has passed. Each day has a decreasing probability of being the first day that it rains, but we need to add these all up to get the intuitive answer that storms happen, on average, once a week. Hope I didn't open another can of worms here.

Actually, that's not correct.  If storms happen once per week, the median time between storms is actually about 4.5 days.  For some very inexact intuition, think about the fact that at a minimum, the time to the next storm is a day, but there is no maximum time to the next storm...and thus for a mean interval of a week, there are more intervals less than a week than greater than a week.  See http://en.wikipedia.org/wiki/Geometric_distribution

True, but my main point was that even though tomorrow's the most likely day for the first storm, it still probably won't rain for a few days. I was pretty inexact with my definition of "average" time between storms (mean, median, etc.), plus I didn't read closely enough to see if there is, on average, "one storm per week" or "a week between storms", which are indeed different. Gotta triple check the details before posting here, I suppose.
Title: Re: Math request: Nomad Camp
Post by: Avin on April 18, 2012, 04:59:35 pm
This is annoying me, because I can't find the main source material that we used way back when to prove this. However:

http://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-042j-mathematics-for-computer-science-spring-2010/readings/MIT6_042JS10_chap18.pdf

Problem 18.5. (Do a ctrl+f) is designed to get you to exactly the right section, or:

"Problem 18.5.
I have a deck of 52 regular playing cards, 26 red, 26 black, randomly shuffled. They all lie face down in the deck so that you can’t see them. I will draw a card off the top of the deck and turn it face up so that you can see it and then put it aside. I will continue to turn up cards like this but at some point while there are still cards left in the deck, you have to declare that you want the next card in the deck to be turned up. If that next card turns up black you win and otherwise you lose. Either way, the game is then over.

(a) Show that if you take the first card before you have seen any cards, you then have probability 1/2 of winning the game.

(b) Suppose you don’t take the first card and it turns up red. Show that you then have a probability of winning the game that is greater than 1/2.

(c) If there are r red cards left in the deck and b black cards, show that the probability of winning if you take the next card is b/(r + b).

(d) Either,
1. come up with a strategy for this game that gives you a probability of winning strictly greater than 1/2 and prove that the strategy works, or,
2. come up with a proof that no such strategy can exist."

Point (d) 2 is asked because, against intuition, the only proof that exists is one showing that no such strategy can exist; that is, your odds never change. They were determined at the outset.

I can prove this (D2).

What I will show is that the odds of winning this game when there are n cards left in the deck (in other words 52-n cards have been flipped over) in an arbitrary deck are 50%.

This can be expressed as the sum from b=0 to n of the probability of there being b black cards left multiplied by the odds of winning with b black cards left in the deck.

Here's the interesting part. Consider all permutations of 52 cards and divide them up into two groups, ones which have "black" as the final card on one side and ones with "red" as the final card on the other side. You can form a one to one correspondence between these two groups, because if you take each permutation in one group and swap each black card with a red card and each red card with a black card (in other words, inverting the color of each card), you will get a unique permutation from the other group.

Now you can see that the probability of there being b=k black cards left has to be the same as the probability of there being b=(n-k) black cards left, because for each permutation that has k black cards left, we can use that inverting function to obtain a corresponding permutation that has n-k black cards left.

Also, the odds of winning when there are b black cards left out of n cards is b/n (see part c, only I am using n instead of r+b), and the odds of winning when there are n-b black cards left is (n-b)/n. If we suppose we were playing both situations simultaneously, the combined odds of winning are the sum of the odds of winning each one, since the decks are opposites of each other:

(b/n) + (n-b)/n = (n-b+b) / n = n/n = 1

So now we can reduce our big summation above by pairing up those inverted scenarios:

sum from b=0 to n of the probability of there being b black cards left multiplied by the odds of winning with b black cards left in the deck
= sum from b=0 to n/2 rounded down of the probability of there being b black cards left multiplied by the odds of winning with either b black cards or n-b black cards left in the deck *
= sum from b=0 to n/2 rounded down of the probability of there being b black cards left multiplied by 1
= 1 * sum from b=0 to n/2 rounded down of the probability of there being b black cards left
= 1/2 * sum from b=0 to n of the probability of there being b black cards left
= 1/2 * 1, since the sum of the probability of every possibility has to be 1
= 1/2

* edit to note the case in the above step that if n is even, then the case where b=n/2 is also 1/2 because then n-b=b, meaning there are just as many reds as blacks left.
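For what it's worth, the pairing argument can be cross-checked exactly. A sketch (the `win_prob` helper is mine, for illustration): it evaluates, with exact fractions, the sum over b of P(b black cards among the last n) times b/n, for every possible declaration point.

```python
from fractions import Fraction
from math import comb

def win_prob(n, deck=52, blacks=26):
    """Exact P(win) if you always declare when n cards remain in the deck."""
    total = comb(deck, n)
    reds = deck - blacks
    # Sum over the possible number b of black cards among the last n.
    return sum(
        Fraction(comb(blacks, b) * comb(reds, n - b), total) * Fraction(b, n)
        for b in range(max(0, n - reds), min(n, blacks) + 1)
    )

# The probability is exactly 1/2 no matter when you declare.
print(all(win_prob(n) == Fraction(1, 2) for n in range(1, 52)))  # -> True
```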

P.S. 40% to the OP
Title: Re: Math request: Nomad Camp
Post by: qmech on April 18, 2012, 05:18:37 pm
Avin, that works for strategies of the form "wait until there are n cards left, then guess", but there are more complicated strategies that you could follow (you can take into account the number of black cards seen so far, or make a choice randomly if you like).  The easiest way to sidestep these problems is to observe that the problem doesn't change if you switch to betting on the last card instead of the next card, as mentioned above.

The hardest thing about the question as presented is that it comes after three basic calculations, which primes you to calculate rather than look for a more elegant solution.
Title: Re: Math request: Nomad Camp
Post by: Avin on April 18, 2012, 05:29:25 pm
Avin, that works for strategies of the form "wait until there are n cards left, then guess", but there are more complicated strategies that you could follow (you can take into account the number of black cards seen so far, or make a choice randomly if you like).  The easiest way to sidestep these problems is to observe that the problem doesn't change if you switch to betting on the last card instead of the next card, as mentioned above.

The hardest thing about the question as presented is that it comes after three basic calculations, which primes you to calculate rather than look for a more elegant solution.

You can't "guess" because the requirement is that the card has to be black - you don't get to say what color it is. If you got to pick what color then there WOULD be a strategy that would win 100% of the time - wait until you've seen 26 cards of either color, then guess the opposite color for the next card.

And I believe the proof above is sufficient for all possible strategies because it doesn't take into account what strategy you're using, it just takes into account which turn the final declaration was on. So you could randomly pick a turn by rolling a 52-sided die before the game begins and guess that way, or you could attempt to wait until you've seen more reds than blacks and declare then - it wouldn't matter.
Title: Re: Math request: Nomad Camp
Post by: blueblimp on April 18, 2012, 05:56:50 pm
Since this same problem came up in the Venture thread (http://forum.dominionstrategy.com/index.php?topic=1914.msg30602#msg30602), I've thought a bit about solutions.

My favourite so far is:

Since you don't know the order of the remaining cards, it's irrelevant whether you draw the next card from the top of the deck or the bottom of the deck. (This can easily be proven formally using a bijection argument.) So, imagine that every non-bet card is drawn from the bottom of the deck, whereas once you bet the next card is black, then the next card is drawn from the top of the deck. This game is the same as the original one, from a probability standpoint.

Now play the new game with a fixed shuffling of the deck. No matter when you choose to bet, it's obviously not going to affect the top card of the deck, since the timing of your bet only affects how many cards you first draw from the bottom. Therefore, no strategy does any better than 50%.
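This is also easy to sanity-check by simulation. A sketch (the `play` helper and both example strategies are made up for illustration): any rule for deciding when to bet, fed only the colors seen so far, lands at about 50%.

```python
import random

def play(strategy, trials=20000, rng=random.Random(0)):
    """Estimate the win rate of a betting strategy in the red/black game.

    strategy(reds_seen, blacks_seen, cards_left) returns True to bet that
    the next card is black; you are forced to bet on the last card.
    """
    wins = 0
    for _ in range(trials):
        deck = ['R'] * 26 + ['B'] * 26
        rng.shuffle(deck)
        reds = blacks = 0
        for i, card in enumerate(deck):
            if i == 51 or strategy(reds, blacks, 52 - i):
                wins += card == 'B'  # win if the bet-on card is black
                break
            reds += card == 'R'
            blacks += card == 'B'
    return wins / trials

# Bet immediately vs. wait until more reds than blacks have been seen:
# both hover around 0.5.
print(play(lambda r, b, n: True))
print(play(lambda r, b, n: r > b))
```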
Title: Re: Math request: Nomad Camp
Post by: david707 on April 19, 2012, 08:19:41 am
Why is there even an argument here?
Probability (A given B) = Probability (A and B)/Probability (B)
Probability (Bottom card is estate given initial hand is CCCCE)=Probability(Bottom card is estate and initial hand is CCCCE)/Probability(Initial hand is CCCCE)

Probability(Bottom card is estate and initial hand is CCCCE):
A deck of {CCCCCCCEEE} can be arranged in 10!/(7!*3!) ways, which is 120 ways.
We require an arrangement of {CCCCE|CCCE|E}, the order of the first 5 and the order of cards 6-9 don't matter so we get this in 5*4=20 ways. 20/120=1/6.

Probability(initial hand is CCCCE):
We require {CCCCE|CCCEE}, the first part can be ordered 5 different ways, the second part 10 ways. 5*10=50. 50/120=5/12.

so the final answer is (1/6)/(5/12)=2/5=0.4=40%
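The same answer falls out of brute-force enumeration (a sketch of mine, not from the post): since all C(10,3) = 120 placements of the three Estates in the ten deck positions are equally likely, you can just count.

```python
from fractions import Fraction
from itertools import combinations

# Positions 0-9 of the 10-card opening deck; choose which 3 hold the Estates.
# All C(10,3) = 120 position sets are equally likely after a shuffle.
hands = 0      # orderings whose first five cards are CCCCE
bottom_e = 0   # ...of those, orderings with an Estate on the bottom
for estates in combinations(range(10), 3):
    if sum(1 for pos in estates if pos < 5) == 1:  # exactly one Estate in hand
        hands += 1
        bottom_e += 9 in estates                   # Estate as the bottom card
print(Fraction(bottom_e, hands))  # -> 2/5
```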

end of thread.
Title: Re: Math request: Nomad Camp
Post by: DStu on April 19, 2012, 08:34:14 am
end of thread.

I wouldn't be sure...
Title: Re: Math request: Nomad Camp
Post by: Kuildeous on April 19, 2012, 09:27:44 am
end of thread.

I wouldn't be sure...

Well, it should be, but this is the internet after all.

Thank you, David. That is exactly what I was looking for. The fact that you explained it perfectly just demonstrates how rusty I am in probability. I knew what had to be done, but I couldn't quite recall all of it. Your post brings back memories (which I hope to not forget too soon). If I could have properly blown out the cobwebs in my skull, I could have ended this two pages ago.

Although, I believe the 30% claim has been abandoned for now, so it's probably best that we continue on to more pressing questions like when is the blue dog's birthday if he is struck by lightning tomorrow.
Title: Re: Math request: Nomad Camp
Post by: Fabian on April 19, 2012, 01:22:03 pm
Well it is end of thread, because unfortunately the guy started ignoring it once he figured out he was wrong. Which is too bad, I was greatly enjoying this thread up until then.
Title: Re: Math request: Nomad Camp
Post by: Kuildeous on April 19, 2012, 01:36:00 pm
Well it is end of thread, because unfortunately the guy started ignoring it once he figured out he was wrong. Which is too bad, I was greatly enjoying this thread up until then.

I can't blame him. It wasn't a pleasant experience for me when I discovered I was on the wrong side of the Monty Hall problem. And I argued vehemently that the probability changed to 1/2. I was certain that the math used was somehow faulty or abused (like proving that 1=2 with a divide-by-zero error).

I think it took expanding the problem to a million doors to make me re-evaluate the math and realize that I was wrong all along. It sucked. Now I can look back at it and laugh, but I don't think I ever did admit I was wrong to the person who challenged what I thought I knew about the Monty Hall problem.

Humans hate being wrong.
Title: Re: Math request: Nomad Camp
Post by: Galzria on April 19, 2012, 02:34:03 pm
Your last point is valid, however the rest is wrong.

See, my error is where this is DIFFERENT than Monty. We receive information in blocks of 5, not 1. If this were a Hall paradox, we would a) get to evaluate after 8 doors have been revealed, not 5, and b) we would know all subsets that show 3 estates in the first 8 are false, else we would have already lost; that is, he will always leave us a way to win (assuming "win" in this is naming which card holds the 3rd estate with greatest frequency).

Since we don't know cards 6-8, we cannot make assumptions about them. WW did indeed post the proof behind my problem - which is different than here.
Title: Re: Math request: Nomad Camp
Post by: eHalcyon on April 19, 2012, 03:22:20 pm
Your last point is valid, however the rest is wrong.

See, my error is where this is DIFFERENT than Monty. We receive information in blocks of 5, not 1. If this were a Hall paradox, we would a) get to evaluate after 8 doors have been revealed, not 5, and b) we would know all subsets that show 3 estates in the first 8 are false, else we would have already lost; that is, he will always leave us a way to win (assuming "win" in this is naming which card holds the 3rd estate with greatest frequency).

Since we don't know cards 6-8, we cannot make assumptions about them. WW did indeed post the proof behind my problem - which is different than here.

So... are you admitting that you were mistaken earlier?  :P
Title: Re: Math request: Nomad Camp
Post by: Galzria on April 19, 2012, 03:49:38 pm
Sure. I said all along I didn't have the proof. I tried to let it lie. :)

WW put up where I was going with my thoughts, which, while similar, produces very different results. Blueblimp's Venture example is closer to what I intended as well.

I remembered doing something similar (to this) that produced counterintuitive results, rather than the straightforward 2/5 = 40%. I wasn't trying to argue that exactly, but the circumstances of what I was arguing were obviously slightly different - I just couldn't remember what they were at the time, and only saw the similarities.

Title: Re: Math request: Nomad Camp
Post by: eHalcyon on April 19, 2012, 03:55:22 pm
Sure. I said all along I didn't have the proof. I tried to let it lie. :)

WW put up where I was going with my thoughts, which, while similar, produces very different results. Blueblimp's Venture example is closer to what I intended as well.

Well the tone I got from those posts was "I'm sure I'm right but I don't have the proof, so I won't go on about it even though you're all wrong."  But maybe I was misreading it.

Good on you for admitting error after arguing the contrary for so long!
Title: Re: Math request: Nomad Camp
Post by: Galzria on April 19, 2012, 07:13:35 pm
Nope, never meant it like that. That's why I had said I was willing to let 40% stand until I could produce a reason it shouldn't. I was willing to put the onus on me to prove/show where I thought I was right. When I read WW's post, the details of the problem I was remembering came back. Similar, but definitely different than what we have going on here.
Title: Re: Math request: Nomad Camp
Post by: cored on January 20, 2013, 11:48:55 am
Quote
Problem 18.14.
A 52-card deck is thoroughly shuffled and you are dealt a hand of 13 cards.
(a) If you have one ace, what is the probability that you have a second ace?
(b) If you have the ace of spades, what is the probability that you have a second ace?
Remarkably, the two answers are different. This problem will test your counting ability!

Last two lines are straight from the textbook.

I apologize in advance for the necro-post, but damnit, what's the answer to part (b)?

Part (a) is pretty straightforward.  Take an ace out, and you have 51 cards, 3 of which are aces, so that's 1/17, and then the question is basically just what's the probability of an ace being in 12 of those, so 1/17 * 12/17.

I can't think of how it is different knowing that the ace you have is a particular ace, and the answer is not in the link, and I can't seem to google the same scenario...argh.
Title: Re: Math request: Nomad Camp
Post by: qmech on January 20, 2013, 12:16:22 pm
Part (a) is pretty straightforward.  Take an ace out, and you have 51 cards, 3 of which are aces, so that's 1/17, and then the question is basically just what's the probability of an ace being in 12 of those, so 1/17 * 12/17.

It's not quite like that.  If we work with unordered hands then there are (52 choose 13) hands in total, of which (48 choose 13) have no aces.  So (52 choose 13) - (48 choose 13) hands contain at least one ace.  All we know is that our hand is one of these.

Of these, 4 x (48 choose 12) hands contain exactly one ace.  So, courtesy Wolfram Alpha (http://www.wolframalpha.com/input/?i=1+-+%284%2848+choose+12%29%2F%28%2852+choose+13%29+-+%2848+choose+12%29%29%29) (EDIT (http://forum.dominionstrategy.com/index.php?topic=2260.msg181456#msg181456): try this instead (http://www.wolframalpha.com/input/?i=1+-+%284%2848+choose+12%29%2F%28%2852+choose+13%29+-+%2848+choose+13%29%29%29)), the probability of at least two aces given that we have at least one ace is about 0.3696 (my original figure of 0.5071 came from the first link's wrong denominator).

For (b), the number of hands containing the ace of spades is (51 choose 12).  Of these, (48 choose 12) contain no additional aces.  So, again courtesy Wolfram Alpha (http://www.wolframalpha.com/input/?i=1+-+%28%2848+choose+12%29%2F%28%2851+choose+12%29%29%29), the probability of at least two aces given the ace of spades is about 0.5612.

Your analysis for (a) is closer to the correct argument for (b).  You are indeed drawing 12 cards from 51, of which 3 are aces, so the average number of aces you expect to draw is 12/17.  But sometimes you'll get more than one ace, so the probability of at least one ace has to be a bit lower to compensate.
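Both conditional probabilities take only a few lines to compute exactly; a sketch (the variable names are mine):

```python
from math import comb

# (a) P(at least 2 aces | at least 1 ace), over unordered 13-card hands.
total = comb(52, 13)
no_ace = comb(48, 13)
exactly_one = 4 * comb(48, 12)          # pick the ace, then 12 non-aces
p_a = 1 - exactly_one / (total - no_ace)

# (b) P(at least 2 aces | the ace of spades): 12 more cards from the other 51,
# of which comb(48, 12) choices contain no further ace.
p_b = 1 - comb(48, 12) / comb(51, 12)

print(round(p_a, 4), round(p_b, 4))  # -> 0.3696 0.5612
```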
Title: Re: Math request: Nomad Camp
Post by: serakfalcon on January 20, 2013, 12:20:01 pm
I think the real questions in all of this are,
a) what is the likelihood of being nerdsniped on this forum
b) what is the proportion of math nerd/geeks (separate statistics if possible) on this forum
Proof(s) left as an exercise
Title: Re: Math request: Nomad Camp
Post by: Tables on January 20, 2013, 06:21:30 pm
I think the real questions in all of this are,
a) what is the likelihood of being nerdsniped on this forum
b) what is the proportion of math nerd/geeks (separate statistics if possible) on this forum
Proof(s) left as an exercise

I'm not going to give exact answers to each question, but as a spoiler, the sum of both answers together is approximately 2.
Title: Re: Math request: Nomad Camp
Post by: jomini on January 20, 2013, 10:37:12 pm
I think the real questions in all of this are,
a) what is the likelihood of being nerdsniped on this forum
b) what is the proportion of math nerd/geeks (separate statistics if possible) on this forum
Proof(s) left as an exercise

I'm not going to give exact answers to each question, but as a spoiler, the sum of both answers together is approximately 2.

Only for sufficiently small values of 2 =)
Title: Re: Math request: Nomad Camp
Post by: Tables on January 21, 2013, 06:19:16 am
I'm very tempted to now take a picture of me wearing my shirt that says '2+2=5 (for extremely large values of 2)' but alas, it's in the wash.
Title: Re: Math request: Nomad Camp
Post by: AdamH on January 22, 2013, 02:24:00 pm
Problem 18.14.
A 52-card deck is thoroughly shuffled and you are dealt a hand of 13 cards.
(a) If you have one ace, what is the probability that you have a second ace?
(b) If you have the ace of spades, what is the probability that you have a second ace?
Remarkably, the two answers are different. This problem will test your counting ability!

[the answer to the question, which boils down to]
sometimes you'll get more than one ace [for part (b)]

Sorry, but this immediately came to my mind after reading the question and the answer:

http://xkcd.com/169/

This problem is ONLY hard because they stated it poorly, and that upsets me. That comic, OTOH, makes me very happy.
Title: Re: Math request: Nomad Camp
Post by: GigaKnight on January 22, 2013, 09:15:35 pm
Sorry, but this immediately came to my mind after reading the question and the answer:

http://xkcd.com/169/

This problem is ONLY hard because they stated it poorly, and that upsets me. That comic, OTOH, makes me very happy.

I've read that comic over and over... and I've read the Explain XKCD for it twice.  I cannot, for the life of me, parse the first panel into anything other than the obvious meaning. How can it possibly be parsed into meaningful English such that the smug joke works?
Title: Re: Math request: Nomad Camp
Post by: Axxle on January 22, 2013, 09:22:53 pm
You're not alone, I can't parse it either.
Title: Re: Math request: Nomad Camp
Post by: ftl on January 22, 2013, 09:44:55 pm
It can't, the guy phrased his smug joke wrong.
Title: Re: Math request: Nomad Camp
Post by: Drab Emordnilap on January 22, 2013, 11:22:19 pm
The joke is more properly phrased

"Angry and hungry are two words that end in 'gry'. What is the third word in the English language?"
Title: Re: Math request: Nomad Camp
Post by: DStu on January 23, 2013, 02:01:16 am
Problem 18.14.
A 52-card deck is thoroughly shuffled and you are dealt a hand of 13 cards.
(a) If you have one ace, what is the probability that you have a second ace?
(b) If you have the ace of spades, what is the probability that you have a second ace?
Remarkably, the two answers are different. This problem will test your counting ability!

[the answer to the question, which boils down to]
sometimes you'll get more than one ace [for part (b)]

Sorry, but this immediately came to my mind after reading the question and the answer:

http://xkcd.com/169/

This problem is ONLY hard because they stated it poorly, and that upsets me. That comic, OTOH, makes me very happy.

I don't see how the puzzle is stated poorly.  It gives you two well-defined and not really obscure cases.  What's poor is human intuition about probability, and the puzzle should and does show this.
Title: Re: Math request: Nomad Camp
Post by: AdamH on January 23, 2013, 08:03:59 am
Problem 18.14.
A 52-card deck is thoroughly shuffled and you are dealt a hand of 13 cards.
(a) If you have one ace, what is the probability that you have a second ace?
(b) If you have the ace of spades, what is the probability that you have a second ace?
Remarkably, the two answers are different. This problem will test your counting ability!

[the answer to the question, which boils down to]
sometimes you'll get more than one ace [for part (b)]

Sorry, but this immediately came to my mind after reading the question and the answer:

http://xkcd.com/169/

This problem is ONLY hard because they stated it poorly, and that upsets me. That comic, OTOH, makes me very happy.

I don't see how the puzzle is stated poorly.  It gives you two well defined and not really obscure cases.  What's poorly is the human intuition on probability, and the puzzle should and does show this.

I disagree. The puzzle was stated in such a way that it's designed to make you think that the conditions of the second case are exactly the same as the first, except that they're specifying which ace you have. The whole point of the problem is that the conditions are not the same. Once I realized the conditions weren't the same, the probability came really easily.
Title: Re: Math request: Nomad Camp
Post by: DStu on January 23, 2013, 08:09:22 am
I disagree. The puzzle was stated in such a way that it's designed to make you think that the conditions of the second case are exactly the same as the first, except that they're specifying which ace you have. The whole point of the problem is that the conditions are not the same. Once I realized the conditions weren't the same, the probability came really easily.
I seriously don't get it. Exactly this is the only difference, as I understand it. There is no hidden trick, no distraction like in xkcd 169; it's just that one time you have an ace, and the other time you have the ace of spades.

How would you formulate this?
Title: Re: Math request: Nomad Camp
Post by: WanderingWinder on January 23, 2013, 08:12:41 am
I disagree. The puzzle was stated in such a way that it's designed to make you think that the conditions of the second case are exactly the same as the first, except that they're specifying which ace you have. The whole point of the problem is that the conditions are not the same. Once I realized the conditions weren't the same, the probability came really easily.
I seriously don't get it. Exactly this is the only difference, as I understand this. There is no hidden trick, no distraction like in the xkcd169, it's like, one time you have an ace, the other time you have the ace of spades.

How would you formulate this?
I wouldn't. The point is, you never actually care about both questions - you only ever care about one - so it doesn't really make sense to compare them. I mean, show me a situation where the difference is important?
Title: Re: Math request: Nomad Camp
Post by: DStu on January 23, 2013, 08:15:42 am
I wouldn't. The point is, you never actually care about both questions - you only ever care about one - so it doesn't really make sense to compare them. I mean, show me a situation where the difference is important?

It shows that you have to be careful with conditional probabilities.  Of course, in a given setting, there is only one correct method, but knowing that you have to be careful and that you can't apply the method from yesterday just because the problem looks similar is something worth knowing...
Title: Re: Math request: Nomad Camp
Post by: AdamH on January 23, 2013, 08:44:01 am
OK you win. The rules of the happy little universe I live in don't seem to apply when you get into this business. I think the same goes for Goons games.
Title: Re: Math request: Nomad Camp
Post by: DStu on January 23, 2013, 08:48:13 am
I don't want to win, I want to understand...
Title: Re: Math request: Nomad Camp
Post by: AdamH on January 23, 2013, 08:57:06 am
When I went back to try and figure it out it didn't make any sense. The numbers are all there and they make sense when you just put them down from what the problem tells you to do. It doesn't make any sense to me that it works out that way, but it's the same thing with Goons: doing what makes sense to me is clearly wrong.
Title: Re: Math request: Nomad Camp
Post by: DStu on January 23, 2013, 09:00:38 am
When I went back to try and figure it out it didn't make any sense. The numbers are all there and they make sense when you just put them down from what the problem tells you to do. It doesn't make any sense to me that it works out that way, but it's the same thing with Goons: doing what makes sense to me is clearly wrong.
And I think exactly that is the point of this puzzle. When I think about how that should be, I also don't really see why it should make a difference. When I think longer, I see a reason, but it goes the wrong way, i.e. the wrong case gets the higher probability.

But it's not because the puzzle is poorly worded (I don't think you can make the difference any clearer without allcaps), but because intuition clearly doesn't help much here; you just have to count.
Title: Re: Math request: Nomad Camp
Post by: GendoIkari on January 23, 2013, 09:05:59 am
I'm very tempted to now take a picture of me wearing my shirt that says '2+2=5 (for extremely large values of 2)' but alas, it's in the wash.

So you're saying it's missed the reshuffle?
Title: Re: Math request: Nomad Camp
Post by: theory on January 23, 2013, 09:40:46 am
Maybe here's a better, more intuitive explanation.

In one, we are looking for: # hands with at least two Aces / # hands with at least one Ace

and in two, we are looking for

# hands with at least two Aces (one of which is the Ace of Spades) / # hands with the Ace of Spades

In the second problem, the world in which you are operating is slightly different.  There are fewer ways to have hands with 2 A's if one of them has to be the Spade A, and even though there are fewer hands with the Spade A, it doesn't cancel out perfectly.

It is very similar to that "two children one is a boy" vs "two children elder is a boy" question.
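The two conditionals can be checked empirically with a quick Monte Carlo sketch (in Python; the encoding of card 0 as the Ace of Spades and cards 1-3 as the other Aces is my own):

```python
import random

# Encode the deck as 0..51: card 0 is the Ace of Spades, 1-3 the other Aces.
deck = list(range(52))
rng = random.Random(0)

seen_ace = two_given_ace = seen_spade = two_given_spade = 0
for _ in range(200_000):
    hand = rng.sample(deck, 13)
    aces = sum(1 for card in hand if card < 4)
    if aces >= 1:                      # condition: at least one Ace
        seen_ace += 1
        two_given_ace += aces >= 2
    if 0 in hand:                      # condition: holds the Ace of Spades
        seen_spade += 1
        two_given_spade += aces >= 2

print(two_given_ace / seen_ace)      # ~ 0.37
print(two_given_spade / seen_spade)  # ~ 0.56
```

The two estimates come out clearly different, matching theory's point that the two conditioning events are not the same.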
Title: Re: Math request: Nomad Camp
Post by: HiveMindEmulator on January 23, 2013, 12:12:42 pm
The way I understand it is this:

The confusion comes from you thinking of the problem as first taking an Ace (or the Ace of Spades), then drawing 12 other cards. In this formulation, it doesn't matter what the first Ace is.

But the real scenario is more like dealing out all the cards into 4 13-card hands and finding the one with the Ace of Spades vs finding the ones with Aces, and then asking if there are 2 Aces in that chosen hand. Finding the hand with the Ace of Spades is just like the previous formulation. It's the Spade Ace and 12 other random cards. But if you are just trying to find a hand with any Ace, you are more likely to land in a 2-Ace hand, since the 2-Ace hands have more Aces in them.
Title: Re: Math request: Nomad Camp
Post by: eHalcyon on January 23, 2013, 01:19:47 pm
Part (a) is pretty straightforward.  Take an ace out, and you have 51 cards, 3 of which are aces, so that's 1/17, and then the question is basically just what's the probability of an ace being in 12 of those, so 1/17 * 12/17.

It's not quite like that.  If we work with unordered hands then there are (52 choose 13) hands in total, of which (48 choose 13) have no aces.  So (52 choose 13) - (48 choose 13) hands contain at least one ace.  All we know is that our hand is one of these.

Of these, 4 x (48 choose 12) hands contain exactly one ace.  So, courtesy Wolfram Alpha (http://www.wolframalpha.com/input/?i=1+-+%284%2848+choose+12%29%2F%28%2852+choose+13%29+-+%2848+choose+12%29%29%29), the probability of at least two aces given that we have at least one ace is about 0.5071.

For (b), the number of hands containing the ace of spades is (51 choose 12).  Of these, (48 choose 12) contain no additional aces.  So, again courtesy Wolfram Alpha (http://www.wolframalpha.com/input/?i=1+-+%28%2848+choose+12%29%2F%28%2851+choose+12%29%29%29), the probability of at least two aces given the ace of spades is about 0.5612.

Your analysis for (a) is closer to the correct argument for (b).  You are indeed drawing 12 cards from 51, of which 3 are aces, so the average number of aces you expect to draw is 12/17.  But sometimes you'll get more than one ace, so the probability of at least one ace has to be a bit lower to compensate.

I think you input the equation incorrectly on Wolfram Alpha.  The denominator there should have (52 choose 13), not (52 choose 12), right?  With the change, the probability is 0.3696, not 0.5071.

The way I understand it is this:

The confusion comes from you thinking of the problem as first taking an Ace (or the Ace of Spades), then drawing 12 other cards. In this formulation, it doesn't matter what the first Ace is.

But the real scenario is more like dealing out all the cards into 4 13-card hands and finding the one with the Ace of Spades vs finding the ones with Aces, and then asking if there are 2 Aces in that chosen hand. Finding the hand with the Ace of Spades is just like the previous formulation. It's the Spade Ace and 12 other random cards. But if you are just trying to find a hand with any Ace, you are more likely to land in a 2-Ace hand, since the 2-Ace hands have more Aces in them.

qmech's math says that the probability of having more aces is higher given that you have the Ace of Spades, compared to just having an Ace.  Is there something wrong with his math?  It's been a long time since I've done combinations, but his work seems correct to me.
Title: Re: Math request: Nomad Camp
Post by: HiveMindEmulator on January 23, 2013, 03:29:29 pm
The way I understand it is this:

The confusion comes from you thinking of the problem as first taking an Ace (or the Ace of Spades), then drawing 12 other cards. In this formulation, it doesn't matter what the first Ace is.

But the real scenario is more like dealing out all the cards into 4 13-card hands and finding the one with the Ace of Spades vs finding the ones with Aces, and then asking if there are 2 Aces in that chosen hand. Finding the hand with the Ace of Spades is just like the previous formulation. It's the Spade Ace and 12 other random cards. But if you are just trying to find a hand with any Ace, you are more likely to land in a 2-Ace hand, since the 2-Ace hands have more Aces in them.

qmech's math says that the probability of having more aces is higher given that you have the Ace of Spades, compared to just having an Ace.  Is there something wrong with his math?  It's been a long time since I've done combinations, but his work seems correct to me.

Actually my intuitive explanation is wrong AND qmech's typing into Wolfram is wrong.

You're more likely to find any specific Ace (including the Ace of Spades) in a 2 Ace hand than you are to find it in a 1 Ace hand, so P(2 Aces | Spade A) should be larger. If you're looking for a hand with at least one Ace, you're not more likely to count the hands with multiple Aces than the ones with only one Ace.

So P(2 Aces | specific A) = 0.56
and P(2 Aces | at least one A) = 0.37 (qmech's expressions are right, but entered into wolfram wrong).
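Both of those numbers can be verified exactly with binomial coefficients — a minimal sketch using Python's math.comb, with no Wolfram typing to get wrong:

```python
from math import comb

total = comb(52, 13)                 # all 13-card hands
no_ace = comb(48, 13)                # hands with no Aces
exactly_one = 4 * comb(48, 12)       # choose which Ace, then 12 non-Aces

# P(at least two Aces | at least one Ace)
p_any = 1 - exactly_one / (total - no_ace)

# P(at least two Aces | the hand contains the Ace of Spades)
p_spade = 1 - comb(48, 12) / comb(51, 12)

print(round(p_any, 4), round(p_spade, 4))  # 0.3696 0.5612
```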
Title: Re: Math request: Nomad Camp
Post by: dnkywin on January 23, 2013, 03:50:21 pm
f.ds - turning a simple math question into a 6-page argument on probability

>.<
Title: Re: Math request: Nomad Camp
Post by: HiveMindEmulator on January 23, 2013, 05:04:48 pm
You're more likely to find any specific Ace (including the Ace of Spades) in a 2 Ace hand than you are to find it in a 1 Ace hand, so P(2 Aces | Spade A) should be larger. If you're looking for a hand with at least one Ace, you're not more likely to count the hands with multiple Aces than the ones with only one Ace.

So P(2 Aces | specific A) = 0.56
and P(2 Aces | at least one A) = 0.37 (qmech's expressions are right, but entered into wolfram wrong).

I wanted to clarify my intuitive argument by giving the example of a typical deal of cards. Typically, you'll find 1 hand with 2 Aces, 2 with 1 Ace, and 1 with none. Of the hands with Aces, 1/3 have 2 Aces (probability ~ 0.37). But half the Aces are in hands with 2 Aces, so if you look for a specific Ace, you'll find it in a hand with 2 Aces 1/2 the time (probability ~ 0.56).
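That deal-based picture can be simulated directly: deal the whole deck into four hands, then compare "pick a random Ace-holding hand" with "find the hand by tracking one particular Ace" (a sketch; the counting scheme is my own):

```python
import random

rng = random.Random(1)
hands_with_ace = hands_with_two = aces_total = aces_in_multi = 0

for _ in range(50_000):
    deck = list(range(52))           # cards 0-3 are the four Aces
    rng.shuffle(deck)
    for i in range(4):               # deal four 13-card hands
        aces = sum(1 for card in deck[13 * i:13 * (i + 1)] if card < 4)
        if aces >= 1:
            hands_with_ace += 1
            hands_with_two += aces >= 2
        aces_total += aces
        if aces >= 2:
            aces_in_multi += aces    # each Ace sitting in a multi-Ace hand

# Fraction of Ace-holding hands with two Aces, vs. fraction of Aces
# that live in multi-Ace hands (i.e. where a tracked Ace lands).
print(hands_with_two / hands_with_ace)  # ~ 0.37
print(aces_in_multi / aces_total)       # ~ 0.56
```

The second ratio is higher for exactly the reason given above: the multi-Ace hands contain more of the Aces, so tracking any one Ace lands you in them more often.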
Title: Re: Math request: Nomad Camp
Post by: jeb56 on January 23, 2013, 05:27:06 pm
This is another way to frame the argument:

Take four copies of the distribution for a specific Ace (one each for the Spade, Heart, Diamond, and Club Aces), and mix them together.  This resulting distribution has one copy of each hand with one Ace, two copies of each hand with two Aces, three copies of each hand with three Aces, and four copies of each hand with four Aces.

Compare that with the distribution that has one copy (only) of each hand that contains an Ace.
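This mixture argument checks out exactly: a hand with k Aces gets weight k in the mixed distribution, and that weighting reproduces the Ace-of-Spades answer. A sketch using math.comb and exact rational arithmetic:

```python
from fractions import Fraction
from math import comb

# N[k] = number of 13-card hands with exactly k Aces
N = [comb(4, k) * comb(48, 13 - k) for k in range(5)]

# Mixing the four specific-Ace distributions gives each k-Ace hand
# weight k; the total mass is 4 * C(51, 12), one copy per Ace per hand.
assert sum(k * N[k] for k in range(5)) == 4 * comb(51, 12)

p_mixture = Fraction(sum(k * N[k] for k in range(2, 5)), 4 * comb(51, 12))
p_spade = 1 - Fraction(comb(48, 12), comb(51, 12))
print(p_mixture == p_spade, round(float(p_spade), 4))  # True 0.5612
```

So "pick a hand by following one specific Ace" and "pick a hand from the k-weighted mixture" are the same distribution, which is why the specific-Ace probability is the higher one.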

Title: Re: Math request: Nomad Camp
Post by: qmech on January 26, 2013, 05:41:03 pm
I think you input the equation incorrectly on Wolfram Alpha.  The denominator there should have (52 choose 13), not (52 choose 12), right?  With the change, the probability is 0.3696, not 0.5071.
And you correctly corrected the Alpha typing, but then reported the wrong change! :P

Conclusion: typing is hard.
Title: Re: Math request: Nomad Camp
Post by: eHalcyon on January 26, 2013, 06:09:44 pm
I think you input the equation incorrectly on Wolfram Alpha.  The denominator there should have (52 choose 13), not (52 choose 12), right?  With the change, the probability is 0.3696, not 0.5071.
And you correctly corrected the Alpha typing, but then reported the wrong change! :P

Conclusion: typing is hard.

Wait, what did I do wrong then? I am lost now!
Title: Re: Math request: Nomad Camp
Post by: Morgrim7 on January 26, 2013, 06:46:01 pm
f.ds: Where people spend six pages of discussion talking about the odds of one card.
Title: Re: Math request: Nomad Camp
Post by: ycz6 on January 26, 2013, 08:42:55 pm
Ain't you guys ever been on the Internet before? I don't think I've seen a forum discussion or comment thread about a riddle in probability that didn't turn into a long-ass argument about conditional probability where several different people on both sides try to use intuitive arguments to make their point but end up failing, and I've seen a few.

Maybe I hang out around the wrong parts of the Internet.
Title: Re: Math request: Nomad Camp
Post by: qmech on January 27, 2013, 05:55:22 am
(52 choose 13), not (52 choose 12)

That one was right: I had (48 choose 12) when I should have had (48 choose 13).

Hopefully that's an end of this now!
Title: Re: Math request: Nomad Camp
Post by: eHalcyon on January 27, 2013, 10:57:24 am
(52 choose 13), not (52 choose 12)

That one was right: I had (48 choose 12) when I should have had (48 choose 13).

Hopefully that's an end of this now!


And then the answer I reported was still correct... Wow. Typing IS hard! :P