"scott" <sco### [at] scottcom> wrote:
> If you have a box with N balls numbered 1 to N, and you repeatedly pull one
> out at random (replacing it each time), on average how many times would you
> expect to pull out a ball until you have seen all the balls 1 to N?
For sufficiently large N, I'd expect to keep pulling out balls until I die. I
guess something on the order of N=1000 should be sufficient for this.
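Before any derivation, the original question can be estimated empirically with a quick simulation. A minimal sketch (the function name and parameter choices here are illustrative, not from the thread):

```python
import random

def draws_until_all_seen(n, rng=random):
    """Pull numbered balls (with replacement) until every ball 1..n has appeared."""
    seen = set()
    draws = 0
    while len(seen) < n:
        seen.add(rng.randint(1, n))
        draws += 1
    return draws

# Average over many trials for a modest N
random.seed(42)
n, trials = 10, 2000
avg = sum(draws_until_all_seen(n) for _ in range(trials)) / trials
print(avg)  # for n = 10 the true expectation works out to about 29.3
```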
Warp wrote:
> You are thinking about it in the wrong way.
>
> The very first time you select one ball, the probability it's a ball you
> have never seen before is 1.
>
> The second time you select a ball, the probability it's a ball you have
> never seen before is (N-1)/N.
>
> With the third ball, the probability is (N-2)/N. And so on.
As with most mathematical problems, it's trivially easy once you already
know what the answer is... :-}
(And let's not forget, I have the lowest IQ in this newsgroup...)
--
http://blog.orphi.me.uk/
http://www.zazzle.com/MathematicalOrchid*
Orchid XP v8 <voi### [at] devnull> wrote:
> > The very first time you select one ball, the probability it's a ball you
> > have never seen before is 1.
> >
> > The second time you select a ball, the probability it's a ball you have
> > never seen before is (N-1)/N.
> >
> > With the third ball, the probability is (N-2)/N. And so on.
> As with most mathematical problems, it's trivially easy once you already
> know what the answer is... :-}
OTOH that only tells us the probability of getting distinct balls when
taking N balls out. It doesn't tell us how many balls must be taken on
average in order to get N distinct balls.
Btw, in case you didn't bother continuing that chain of deductions: the
probability of getting all distinct balls when taking out N balls is
N!/(N^N).
--
- Warp
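Warp's N!/(N^N) figure is easy to sanity-check numerically. A minimal sketch (the helper name is just an illustration):

```python
import math
import random

def all_distinct_prob(n):
    # Probability that n draws with replacement are all distinct: n!/n^n
    return math.factorial(n) / n ** n

# Monte Carlo cross-check: how often are n draws all different?
random.seed(1)
n, trials = 5, 100_000
exact = all_distinct_prob(n)
sim = sum(len({random.randint(1, n) for _ in range(n)}) == n
          for _ in range(trials)) / trials
print(exact)  # 120/3125 = 0.0384
print(sim)    # should land close to the exact value
```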
>>> In other words, we have K/N - E, and I can't figure out how to compute
>>> E.
>>
>> You can think of it the other way round, ie the probability of ball #1
>> appearing after K tries is 1 minus the probability of ball #1 not
>> occurring after K tries.
>
> Ah. So instead of a chain of non-exclusive OR events, you construct a
> chain of independent AND events? Ingenious...
Yes, it's a useful trick to remember :-)
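The complement trick quoted above can be written out directly: the chance of ball #1 appearing within K tries is 1 minus the chance of missing it K times in a row. A minimal sketch (seen_prob is an illustrative name):

```python
import random

def seen_prob(n, k):
    # P(a given ball appears at least once in k draws)
    # = 1 - P(it is missed on every one of the k independent draws)
    return 1 - (1 - 1 / n) ** k

# Monte Carlo cross-check for one specific ball (ball #1)
random.seed(0)
n, k, trials = 10, 15, 50_000
hits = sum(any(random.randint(1, n) == 1 for _ in range(k))
           for _ in range(trials))
print(seen_prob(n, k))   # 1 - 0.9**15, about 0.794
print(hits / trials)     # should be close to the formula's value
```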
>>I have often wondered this, occasionally looked for the answer, and never
>>really needed to know enough to work it out from first principles.
>
> Strange, I've never wondered about it o_O
Hehe, the exact problem I was applying it to was that I needed to save all
the possible images generated by some website. I knew there was a certain
number of possibilities and that each one appeared at random, so I just
wondered how many times I would need to try before I got all of them.
I didn't need to know exactly (eg if N=100, am I expecting 150 tries, 200
tries, or 500 tries?), but knowing the exact formula would be cool, as it
seemed an interesting problem without an obvious result.
Another situation I can think of is if you need to check every item in a
physical population, but you can only select them at random (eg tagging fish
in a lake or something).
On Wed, 17 Dec 2008 14:49:38 +0100, scott wrote:
> If you have a box with N balls numbered 1 to N, and you repeatedly pull
> one out at random (replacing it each time), on average how many times
> would you expect to pull out a ball until you have seen all the balls 1
> to N?
Without looking at the other solutions (and excluding the trivial case),
I would guess that such an average number does not exist. There is the real
possibility of always pulling the same ball. In fact there is an infinite
number of such combinations, where you can pull balls without ever
getting all of them. If the pulling of balls is truly random and
independent, such combinations are no less likely (it seems unintuitive,
but there is no way of predicting which ball you will pull based on which
balls you have already pulled). In this case the infinite sum diverges and
you can't calculate an average.
On Thu, 18 Dec 2008 02:55:41 -0500, Florian Pesth wrote:
> In this case the infinite sum is
> diverging and you can't calculate an average.
Ah... forget that, there is also an infinite number of solutions of the
same length. Even though it is an infinite sum, one can calculate it.
>> Do a search for Coupon Collector's Problem. For equal random drawings
>> it's roughly nHn that is n * Sum k=1-n of 1/k multiplied by n
>
>
> Brain death! - n * Sum k=1-n of 1/k
Thanks Phil, when you know what it's called it's easy to find.
So, my answer is that the expected number of drawings to get all balls is
given by:
E[T] = n*(1/1 + 1/2 + 1/3 + ... + 1/n)
Cool neat little formula :-)
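The formula is easy to check against a simulation; a minimal sketch, assuming nothing beyond the thread's own formula (helper names are illustrative):

```python
import random

def expected_draws(n):
    # Coupon collector expectation: n * (1/1 + 1/2 + ... + 1/n)
    return n * sum(1 / k for k in range(1, n + 1))

def collect(n):
    # Draw with replacement until all n balls have been seen
    seen, draws = set(), 0
    while len(seen) < n:
        seen.add(random.randint(1, n))
        draws += 1
    return draws

random.seed(7)
n, trials = 100, 500
theory = expected_draws(n)
empirical = sum(collect(n) for _ in range(trials)) / trials
print(theory)     # about 518.7 for n = 100
print(empirical)  # should land near the theoretical value
```

This also answers the earlier "if N=100, am I expecting 150, 200, or 500 tries?" question: just over 500.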
And lo On Thu, 18 Dec 2008 08:26:56 -0000, scott <sco### [at] scottcom> did
spake thusly:
>>> Do a search for Coupon Collector's Problem. For equal random drawings
>>> it's roughly nHn that is n * Sum k=1-n of 1/k multiplied by n
>>
>>
>> Brain death! - n * Sum k=1-n of 1/k
>
> Thanks Phil, when you know what it's called it's easy to find.
Indeed, one of the major difficulties in research is trying to guess what
someone else might have named their solution ;-)
> So, my answer is that the expected number of drawings to get all balls
> is given by:
>
> E[T] = n*(1/1 + 1/2 + 1/3 + ... 1/n)
>
> Cool neat little formula :-)
Yup, of course as you recognise it's only an average: the lower bound is n
and the upper bound is infinity; there's no reason why you can't be pulling
out the same number over and over again, although you might suspect a fiddle
if it yields a high mean absolute error.
--
Phil Cook
--
I once tried to be apathetic, but I just couldn't be bothered
http://flipc.blogspot.com
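Phil's point about the bounds shows up clearly in simulation: no run can finish in fewer than n draws, while the tail on the high side is unbounded. A minimal sketch (function and variable names are illustrative):

```python
import random

def collect(n):
    # Draw with replacement until all n balls have been seen
    seen, draws = set(), 0
    while len(seen) < n:
        seen.add(random.randint(1, n))
        draws += 1
    return draws

random.seed(3)
n, trials = 20, 5_000
results = sorted(collect(n) for _ in range(trials))
print(results[0])    # never below the hard lower bound of n = 20
print(results[-1])   # the worst run can be several times the mean of ~72
```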
>> E[T] = n*(1/1 + 1/2 + 1/3 + ... 1/n)
>>
>> Cool neat little formula :-)
>
> Yup, of course as you recognise it's only an estimate the lower bound
> being n and the upper bound being infinity; there's no reason why you
> can't be pulling out the same number over and over again although you
> might suspect a fiddle if it yields a high mean absolute error.
On an only slightly related topic, I did manage to find this one
http://en.wikipedia.org/wiki/Benford%27s_law
without knowing what it was called beforehand.