testing quality of a /dev/random

Enzo Michelangeli em at who.net
Thu Mar 9 12:16:03 CET 2000


Tests of this kind can, at best, show how bad a (P)RNG is, not how good:
the (very deterministic) digits of pi would pass them with flying colours.
Determining a lower bound for the entropy of a data source from the data
stream alone is a non-computable problem.
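
To see the problem concretely, run the same sort of statistic over a
stream that is fully deterministic. A minimal Python sketch (the seeded
PRNG and the sample size are illustrative assumptions, nothing more):

    import math
    import random
    from collections import Counter

    random.seed(42)  # fully deterministic: given the seed, the stream carries zero entropy
    data = bytes(random.randrange(256) for _ in range(100_000))

    # Shannon entropy estimate in bits per byte; it comes out near the
    # ideal 8.0, indistinguishable from a "true" random source by this
    # measure alone.
    counts = Counter(data)
    H = -sum(c / len(data) * math.log2(c / len(data)) for c in counts.values())
    print(f"Entropy = {H:.6f} bits per byte")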

Enzo

----- Original Message -----
From: "David Frey" <david at eos.lugs.ch>
To: <gnupg-devel at gnupg.org>
Sent: Thursday, March 09, 2000 5:38
Subject: Re: testing quality of a /dev/random


> On Wed, Mar 08, 2000 at 11:37:09AM -0500, Sam Roberts wrote:
> > I'm wondering how to evaluate the quality of the randomness
> > it generates.
> Apply John Walker's ent [1], a pseudorandom number sequence test
> program, to it. Example run under Linux 2.0.36:
>
> $ dd if=/dev/random bs=1k count=100 | ./ent
> 0+100 records in
> 0+100 records out
> Entropy = 7.657693 bits per byte.
>
> Optimum compression would reduce the size
> of this 631 byte file by 4 percent.
>
> Chi square distribution for 631 samples is 279.81, and randomly
> would exceed this value 25.00 percent of the times.
>
> Arithmetic mean value of data bytes is 129.0254 (127.5 = random).
> Monte Carlo value for Pi is 3.352380952 (error 6.71 percent).
> Serial correlation coefficient is -0.056984 (totally uncorrelated = 0.0).
> $
>
> Just my 0.02 CHF.
>
> [1] Distributed as random.zip; see http://www.fourmilab.ch/random/.
>
> David
>
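
For reference, the main figures ent reports can be recomputed directly.
A minimal Python sketch of the same statistics (it reads from
/dev/urandom to avoid blocking, and the sample size is an arbitrary
choice for illustration):

    import math

    with open("/dev/urandom", "rb") as f:  # non-blocking source; substitute /dev/random to test it
        data = f.read(100_000)
    n = len(data)

    # Chi-square statistic against a uniform distribution over the 256 byte values
    counts = [0] * 256
    for b in data:
        counts[b] += 1
    expected = n / 256
    chi2 = sum((c - expected) ** 2 / expected for c in counts)

    # Arithmetic mean of the data bytes (127.5 for an ideal uniform source)
    mean = sum(data) / n

    # Serial correlation between consecutive bytes (0.0 = totally uncorrelated)
    x, y = data[:-1], data[1:]
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    scc = cov / math.sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))

    print(f"Chi square for {n} samples: {chi2:.2f}")
    print(f"Arithmetic mean of data bytes: {mean:.4f} (127.5 = random)")
    print(f"Serial correlation coefficient: {scc:.6f}")

Note that, as the reply above points out, good scores on these statistics
are necessary but not sufficient: they can expose a bad generator, never
certify a good one.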


