If you have a box with N balls numbered 1 to N, and you repeatedly pull one
out at random (replacing it each time), on average how many times would you
expect to pull out a ball until you have seen all the balls 1 to N?
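
For a quick empirical feel for the answer before doing any maths, a throwaway simulation along these lines works (Python; the function names here are just made up for the sketch):

import random

def draws_until_all_seen(n):
    # Draw balls 1..n with replacement until every ball has been seen;
    # return how many draws that took.
    seen = set()
    draws = 0
    while len(seen) < n:
        seen.add(random.randint(1, n))
        draws += 1
    return draws

def average_draws(n, trials=100000):
    # Monte Carlo estimate of the expected number of draws for n balls.
    return sum(draws_until_all_seen(n) for _ in range(trials)) / trials

for n in (2, 5, 10):
    print(n, average_draws(n))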
scott wrote:
> If you have a box with N balls numbered 1 to N, and you repeatedly pull
> one out at random (replacing it each time), on average how many times
> would you expect to pull out a ball until you have seen all the balls 1
> to N?
N is trivially the lower bound.
Each ball appears with probability 1/N, so the probability for all balls
to appear (given that these events are clearly independent) in N trials
would be (1/N)^N, which is 1/(N^N).
Hmm, but wait! This does not account for ordering. There are N!
permutations of N events, so taking *that* into account, we now actually
have N! / N^N.
Now, if we perform K trials, the probability of seeing all the balls is
(N! + (K-N)!) / N^N. (Assuming K >= N.)
Assuming that by "on average" you mean "with probability > 50%", it is
now a matter of choosing K such that the above formula comes out to more
than 0.5.
> Now, if we perform K trials, the probability of seeing all the balls is
> (N! + (K-N)!) / N^N. (Assuming K >= N.)
Can you explain how you got that formula?
scott wrote:
>> Now, if we perform K trials, the probability of seeing all the balls
>> is (N! + (K-N)!) / N^N. (Assuming K >= N.)
>
> Can you explain how you got that formula?
Intuitively, assuming N trials, the formula is just N! / N^N, so if we
add (K-N) extra trials, you get the formula above.
On reflection, I'm not sure that's actually correct...
>>> Now, if we perform K trials, the probability of seeing all the balls is
>>> (N! + (K-N)!) / N^N. (Assuming K >= N.)
>>
>> Can you explain how you got that formula?
>
> Intuitively, assuming N trials, the formula is just N! / N^N, so if we add
> (K-N) extra trials, you get the formula above.
>
> On reflection, I'm not sure that's actually correct...
What if K=5 and N=2? I think the answer should be a 30/32 chance of seeing
both balls, but your formula gives 8/4!
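
For what it's worth, the 30/32 figure checks out by brute force over all 2^5 equally likely sequences (a throwaway Python sketch):

from itertools import product

# Enumerate all 2**5 equally likely sequences of 5 draws from 2 balls
# and count those in which both balls turn up at least once.
sequences = list(product((1, 2), repeat=5))
hits = sum(1 for seq in sequences if set(seq) == {1, 2})
print(hits, "out of", len(sequences))   # 30 out of 32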
Invisible wrote:
> N is trivially the lower bound.
>
> Each ball appears with probability 1/N, so the probability for all balls
> to appear (given that these events are clearly independent) in N trials
> would be (1/N)^N, which is 1/(N^N).
>
> Hmm, but wait! This does not account for ordering. There are N!
> permutations of N events, so taking *that* into account, we now actually
> have N! / N^N.
>
> Now, if we perform K trials...
Let's try that again...
If we select one ball, the probability of it being ball #1 is 1/N.
If we select K balls, the probability of ball #1 appearing becomes 1/N +
1/N + 1/N + 1/N + ... K times over, minus the probability of ball #1
appearing more than once.
In other words, we have K/N - E, and I can't figure out how to compute E.
(Presumably combinatorics provides a trivial way to solve this. Too bad
I don't know combinatorics, eh?)
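
For small N and K, that E can at least be pinned down numerically without any combinatorics, by enumerating all N^K equally likely sequences. A rough Python sketch (the function name is invented):

from fractions import Fraction
from itertools import product

def prob_ball1_appears(n, k):
    # Exact probability that ball #1 shows up at least once in k draws
    # from n balls, found by enumerating all n**k equally likely sequences.
    sequences = list(product(range(1, n + 1), repeat=k))
    hits = sum(1 for seq in sequences if 1 in seq)
    return Fraction(hits, len(sequences))

n, k = 2, 3
p = prob_ball1_appears(n, k)
print(p, Fraction(k, n) - p)   # 7/8 and 5/8, i.e. the "E" in K/N - E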
> If we select K balls, the probability of ball #1 appearing becomes 1/N +
> 1/N + 1/N + 1/N + ... K times over, minus the probability of ball #1
> appearing more than once.
>
> In other words, we have K/N - E, and I can't figure out how to compute E.
You can think of it the other way round, ie the probability of ball #1
appearing after K tries is 1 minus the probability of ball #1 not occurring
after K tries. That is 1-((N-1)/N)^K.
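
A quick sanity check of that closed form against a simulation, for any N and K you care to pick (Python sketch; the names here are made up):

import random

def ball1_seen_rate(n, k, trials=200000):
    # Fraction of simulated runs of k draws (from n balls, with replacement)
    # in which ball #1 turns up at least once.
    hits = sum(
        1 for _ in range(trials)
        if any(random.randint(1, n) == 1 for _ in range(k))
    )
    return hits / trials

n, k = 6, 10
print(1 - ((n - 1) / n) ** k, ball1_seen_rate(n, k))   # should agree closely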
scott wrote:
>> If we select K balls, the probability of ball #1 appearing becomes 1/N
>> + 1/N + 1/N + 1/N + ... K times over, minus the probability of ball #1
>> appearing more than once.
>>
>> In other words, we have K/N - E, and I can't figure out how to compute E.
>
> You can think of it the other way round, ie the probability of ball #1
> appearing after K tries is 1 minus the probability of ball #1 not
> occurring after K tries.
Ah. So instead of a chain of non-exclusive OR events, you construct a
chain of independent AND events? Ingenious...
scott wrote:
> If you have a box with N balls numbered 1 to N, and you repeatedly pull
> one out at random (replacing it each time), on average how many times
> would you expect to pull out a ball until you have seen all the balls 1
> to N?
I'm sure the answer is in the back of a textbook somewhere ... :)
--
~Mike
And lo On Wed, 17 Dec 2008 13:49:38 -0000, scott <sco### [at] scottcom> did
spake thusly:
> If you have a box with N balls numbered 1 to N, and you repeatedly pull
> one out at random (replacing it each time), on average how many times
> would you expect to pull out a ball until you have seen all the balls 1
> to N?
Do a search for Coupon Collector's Problem. For equal random drawings it
works out to n*Hn, that is n multiplied by the sum over k = 1..n of 1/k.
--
Phil Cook
--
I once tried to be apathetic, but I just couldn't be bothered
http://flipc.blogspot.com
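
A minimal sketch of that formula, assuming Python (the test values 2, 5, 10 are arbitrary):

# Expected number of draws for n balls, uniform case: n * H_n.
def expected_draws(n):
    return n * sum(1.0 / k for k in range(1, n + 1))

for n in (2, 5, 10):
    print(n, expected_draws(n))   # 3.0, about 11.42, about 29.29

Those values should line up with what the simulation sketch near the top of the thread settles on.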