Recommended key size for life long key

Robert J. Hansen rjh at sixdemonbag.org
Mon Sep 9 04:04:07 CEST 2013


On 09/08/2013 06:54 PM, Leo Gaspard wrote:
> Well... If factoring takes a month, with the factor of 125, it takes 
> ten years. Seems not that irrelevant to me.

Or you wait three years and let technological progress reduce the
work factor for you.  Or you throw 125 machines at it instead of one.
Or... etc.  If something is unsafe at work level X, it won't be safe at
work level 125X.
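To make that concrete, here's a small illustrative sketch (not from the original post) of why a constant multiplicative work factor is weak protection. The 2-year throughput-doubling period is purely an assumption for illustration, as is the assumption that the attack parallelizes perfectly:

```python
import math

def years_until_feasible(work_factor, doubling_period_years=2.0):
    """Years of hardware progress needed to erode a given work factor,
    assuming attacker throughput doubles every doubling_period_years.
    (The doubling period is a hypothetical Moore's-law-style figure.)"""
    return math.log2(work_factor) * doubling_period_years

def machines_needed(work_factor):
    """Machines needed today, if the attack parallelizes perfectly."""
    return math.ceil(work_factor)

factor = 125
# 125x is only about 2^7, so under these assumptions the attacker can
# either wait roughly 14 years of hardware progress...
print(years_until_feasible(factor))
# ...or simply buy 125 machines right now.
print(machines_needed(factor))
```

The point of the sketch is that a polynomial safety margin is absorbed by either time or money; only an exponential increase in work (e.g. a longer key) changes the picture.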

> Strangely enough, I would have thought 4k qubits would be quite a 
> huge need, thus meaning we would have overcome the major problems 
> with decoherence.

There's a big difference between physics and engineering.  For an
example, look at the history of aviation.  During the era of
propeller-driven aircraft we were limited by the engineering constraints
of piston-driven propellers.  People said, "ah, but if we could only
perfect jet propulsion we could accelerate as fast as we wanted!"
Well-respected engineers like Buckminster Fuller talked about doing
space launches with jet aircraft -- accelerating up to orbital velocity,
releasing the satellite, and landing the delivery aircraft again.

Once we invented jet engines we discovered those were pipe dreams.
There was a new limitation that no one had considered before jet
engines were perfected: the air inside the engine must be slowed to
subsonic velocity, which sharply limits how far past the speed of sound
a jet engine can go.  There was a new speed limit to replace the old
one -- and a speed limit we didn't foresee until we actually had the
new technology to play with.

Nowadays people are talking about developing scramjets to overcome the
limitations of jet engines -- supersonic combustion within the engine.
And the old ideas from the 1920s are coming back around again, of using
scramjets to deliver satellites (or bombs, if you're working on defense
contracts).  And I have no doubt that if/when we perfect scramjet
technology we'll discover a new limitation, one we couldn't have
foreseen before we had working scramjets to play with.

So, yeah.  A 4k-qubit ensemble would mean we'd overcome the decoherence
problem, but really, so would a 200-qubit ensemble, or even a 50-qubit
ensemble.  I'm not skeptical about our ability to overcome decoherence;
Bill Unruh tells me that we know how to do it at the physics level and
that it's only a matter of time until engineering catches up.  I'm
skeptical about our ability to overcome the new limits that will arise,
limits we are at present unaware of.

> But, again, not being a quantum physicist, I cannot be relied upon
> on that subject.

Nor am I, but Bill Unruh is.  :)  I also attended grad school with Ben
Moehlmann, who has since received his Ph.D. in quantum computation.
Ben's been a great resource over the years for this stuff.  I never have
a conversation with him without walking away staggering under the weight
of the new knowledge.

I am not a quantum computation expert, but I hang out with some really
cool nerds.  :)



More information about the Gnupg-users mailing list