testing quality of a /dev/random
Sam Roberts
sroberts at uniserve.com
Wed Mar 8 11:24:04 CET 2000
Thanks!
So, would this:
---------------------------------------------------
$ dd if=/dev/random bs=1k count=100 | ./ent
0+100 records in
0+100 records out
Entropy = 7.705609 bits per byte.
Optimum compression would reduce the size
of this 637 byte file by 3 percent.
Chi square distribution for 637 samples is 230.67, and randomly
would exceed this value 75.00 percent of the times.
Arithmetic mean value of data bytes is 128.2339 (127.5 = random).
Monte Carlo value for Pi is 3.132075472 (error 0.30 percent).
Serial correlation coefficient is -0.015775 (totally uncorrelated = 0.0).
---------------------------------------------------
be a functional source of entropy?
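
One caveat about the run above: dd reports "0+100 records in", i.e. 100
partial records, because /dev/random returns short reads once the kernel's
entropy estimate drops, so only 637 bytes actually reached ent rather than
the 100 KiB the command asks for. As a rough sketch (assuming GNU dd and
./ent in the current directory), reading one byte per record makes the
sample size exact, at the cost of blocking until enough entropy has been
gathered:

$ dd if=/dev/random bs=1 count=10240 | ./ent

The statistics should be rather more stable on a 10 KiB sample than on
637 bytes.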
Sam
Previously, you (David Frey) wrote:
> On Wed, Mar 08, 2000 at 11:37:09AM -0500, Sam Roberts wrote:
> > I'm wondering at how to evaluate the quality of randomness
> > it generates.
> Apply John Walker's ent [1] test (a pseudorandom number sequence test
> program) to it. Example run under Linux 2.0.36:
>
> $dd if=/dev/random bs=1k count=100|./ent
> 0+100 records in
> 0+100 records out
> Entropy = 7.657693 bits per byte.
>
> Optimum compression would reduce the size
> of this 631 byte file by 4 percent.
>
> Chi square distribution for 631 samples is 279.81, and randomly
> would exceed this value 25.00 percent of the times.
>
> Arithmetic mean value of data bytes is 129.0254 (127.5 = random).
> Monte Carlo value for Pi is 3.352380952 (error 6.71 percent).
> Serial correlation coefficient is -0.056984 (totally uncorrelated = 0.0).
> $
>
> Just my 0.02 CHF.
>
> [1] Called random.zip.
>
> David
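
If I'm reading Walker's documentation correctly, the chi-square percentage
is the figure to watch: values between roughly 10 and 90 percent are what
you'd expect from random data, while anything outside the 1-99 percent
range almost certainly isn't random. Since any single small sample is
noisy, a quick sketch like the following (assuming a POSIX shell and ./ent
in the current directory) repeats the test on fresh samples so the spread
is visible:

$ for i in 1 2 3 4 5; do dd if=/dev/random bs=1 count=10240 2>/dev/null | ./ent | grep 'Chi square'; done

If the reported percentages bounce around the middle of the range rather
than pinning near 0 or 100 percent, that's a good sign.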
--
Sam Roberts, sroberts at uniserve dot com, www.emyr.net/Sam