This proof solidifies some earlier observations about the reflexive density constant, which underlies a necessary-but-not-sufficient test in the evaluation of random number generator security.

Given T, an unbiased true random number generator that can produce 1 of N unique integers, and S, a collection of N such sample outputs, the expected number E of unique integers in S, in the limit as N approaches infinity, is given by:

E = (1-1/e)N

or, writing R for the reflexive density constant:

E = RN

Here is the proof:

Construct a bitmap B consisting of N 0s. Obtain S by performing N invocations of T. For each integer in S, set the bit in B corresponding to that integer's position in the sorted order of all possible outputs of T; that is, bit 0 corresponds to the least such integer, and bit (N-1) to the greatest. (In practice, the possible outputs would be 0 through N-1, where N is a power of 2. But that's not required here.)
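The construction above is easy to simulate. Here's a small Python sketch (my own illustration, not part of the proof; the choice of N and the use of `random.randrange` as a stand-in for T are arbitrary):

```python
import random

def unique_fraction(n, trials=20):
    """Draw n samples uniformly from n possible values, mark each
    in a bitmap, and return the mean fraction of bits set to 1."""
    total = 0.0
    for _ in range(trials):
        bitmap = [0] * n
        for _ in range(n):
            bitmap[random.randrange(n)] = 1  # set the bit for this draw
        total += sum(bitmap) / n
    return total / trials

print(unique_fraction(10_000))  # ≈ 0.632 for large n
```

For n = 10,000 the printed fraction already sits very close to 1-1/e.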

Because T is unbiased, the probability of any given bit ending up as 1 is uniform throughout B. So consider one of these bits in isolation.

After the first invocation of T, the odds that this bit is 1 are just 1/N. Put another way, the odds that it's 0 are (1-1/N).

After the second invocation, the odds that it's still 0 are: the odds that it was still 0 after the first invocation, times another factor of (1-1/N), since the invocations are independent. Thus the odds that it's still 0 are (1-1/N)^2.

We can continue this logic over all N invocations of T. The odds that this bit ends up as 0 are then (1-1/N)^N. Thus, in the limit of infinite N, the odds R that this bit is 1 are just 1 minus this quantity:

R = lim(N->infinity) (1 - (1-1/N)^N)

R = 1-1/e

R ~ 0.6321205588285576784044762298385391325541888689682321
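The convergence of the finite-N expression toward R is easy to verify numerically (another Python sketch of mine; the values of N are arbitrary):

```python
import math

# Compare 1 - (1 - 1/N)^N against its limit 1 - 1/e for growing N.
for n in (10, 100, 10_000, 1_000_000):
    r_n = 1 - (1 - 1/n) ** n
    print(n, r_n, abs(r_n - (1 - 1/math.e)))
```

The error shrinks roughly in proportion to 1/N.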

It turns out that the reflexive density constant shows up in lots of places. For example:

1. Neural network stability at equation 6.27.

2. The exponential cumulative distribution function.

3. The Poisson distribution, if we were to add up the 1s at each bit of B, instead of ORing them.

4. The Random Arrival Time Best-Choice Problem on p. 499.

5. Some other discussion of it, which shows that I wasn't the first to prove the reflexive density theorem, although hopefully you find my proof intuitive.
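On point 3 above: if each bit of B keeps a count of hits rather than being ORed, the count at any given bit converges to a Poisson distribution with mean 1, so the probability of a count of 0 is e^(-1) = 1 - R. A quick Python check (my own sketch; the sample sizes are arbitrary):

```python
import random

def zero_count_fraction(n, trials=20):
    """Throw n balls uniformly into n bins and return the mean
    fraction of bins that receive zero balls."""
    total = 0.0
    for _ in range(trials):
        counts = [0] * n
        for _ in range(n):
            counts[random.randrange(n)] += 1  # count hits instead of ORing
        total += counts.count(0) / n
    return total / trials

print(zero_count_fraction(10_000))  # ≈ 1/e ≈ 0.368, the Poisson(1) P(X=0)
```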
