Problem with faked-system-time option
jerome at jeromebaum.com
Thu Jun 16 07:32:34 CEST 2011
>> So, how do you sign
>> (i.e. timestamp) data that isn't already signed by someone else?
> You use a regular old 0x00 signature. 0x50 gives you capabilities that 0x00 doesn't. That doesn't mean 0x50 takes over all purposes of an 0x00. 0x00+notation or 0x50+notation covers either set of semantics.
I understood your suggestion as 0x50, not 0x50+n. 0x50+n, where n is
"timestamp-only", seems redundant.
0x50 doesn't give additional capabilities. You can sign a signature
packet with 0x00 as well. 0x50 is more restricted than 0x00, not more
capable.
In any case, let's just use a notation and concentrate on that. The
0x50, clarity/confusion, notation, 0x40, etc. discussion is wasteful
and not really fun.
>> 1 a. Should we set this notation critical, non-critical, or user's
>> choice? We also had the suggestion of doing two signatures, one w/
>> critical and one w/out. The idea was that the user will be inclined to
>> look more closely.
> I don't see any particular need beyond a straightforward "timestamp-only" at most. Clock drift and clock resolution seems like massive overkill and overcomplexity to me, but if someone else wants it, that's the nice thing about notations - anyone can define them to whatever semantics they like.
I'm thinking in terms of stamper, which issues timestamps at scheduled
intervals (every 10-15 minutes or so).
> Pick critical or not depending on the semantics you want: critical means more or less "the receiving system needs to understand this notation to properly understand/handle the signature". It causes (intentional) incompatibility with all deployed code. If those are the desired semantics, then you have no choice, but it's a bit of a hamper (months to years) to adoption.
I rather was asking (anyone listening in) for an opinion. We've
already discussed the trade-offs that you mention. What I'm looking
for is to get this specified a bit more formally and get everyone's
input, instead of just throwing any random solution out there.
My personal vote would be for critical, because while it might hinder
compatibility, there's no chance of a user mistaking the signature for
something it isn't. The two-part signature sounds interesting, but I'm
afraid one of the signatures might get lost (leaving us with the issue of
a non-critical notation and misinterpretation), and it generally seems
overly complicated.
As for the error/resolution notation, someone else (I can't recall who,
and the Gmail thread is unbearable) mentioned that this would be
relevant, and in the same breath that you could state this in your
signing policy. My thought process is, what if I have two machines,
and one is NTP-synced (or even takes legal time from the broadcast
signal) while the other regularly drifts up to 10 minutes, or runs
timestamping in batches, etc.
Of course, I could set up separate keys. Personally I'd opt for the
notation, as that's also computer-readable (think "Good timestamp from
Alice, between 10:00 and 11:00 on 2011-06-16" or whatever automated
processing people want to cook up -- e.g. keeping signatures valid
under decaying algorithms by resigning/chaining: This could be
verified by some script that you tell when each algorithm was declared
"insufficient"). Basically it allows us to do things that a note in my
policy doesn't, and if we think this through, it won't be very complex
either.
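To illustrate the kind of automated processing I mean, here is a minimal sketch of such a verification script. The notation value format, the function names, and the deprecation dates in the table are all my own illustrative assumptions, not part of any spec:

```python
# Hypothetical check: given the interval from a timestamp notation and a
# table of dates on which each algorithm was declared "insufficient",
# decide whether the signature was made while its hash algorithm was
# still trusted. Dates below are illustrative, not authoritative.
from datetime import datetime

# Date after which each algorithm is no longer considered sufficient.
DEPRECATED_AFTER = {
    "MD5": datetime(2004, 8, 17),
    "SHA1": datetime(2017, 2, 23),
}

def parse_interval(value):
    """Parse an ISO 8601 interval like '2011-06-16T10:00:00/2011-06-16T11:00:00'."""
    start, end = value.split("/")
    fmt = "%Y-%m-%dT%H:%M:%S"
    return datetime.strptime(start, fmt), datetime.strptime(end, fmt)

def timestamp_ok(interval_value, algo):
    """Accept only if the *latest* possible signing time predates the cutoff."""
    start, end = parse_interval(interval_value)
    cutoff = DEPRECATED_AFTER.get(algo)
    return cutoff is None or end < cutoff

print(timestamp_ok("2011-06-16T10:00:00/2011-06-16T11:00:00", "SHA1"))  # True
print(timestamp_ok("2011-06-16T10:00:00/2011-06-16T11:00:00", "MD5"))   # False
```

The point is that a stated interval is something a script can act on, whereas a sentence in my signing policy isn't.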
That was the "why have this data?", here's the "how":
Another alternative is timestamp-interval at gnupg.org = <ISO 8601 time
interval> which describes the interval during which the timestamp was
made, accounting for precision and error, and leaving no room for
interpretation of the interval, but making it the signer's duty to
compute this interval. That's also a lot less complex than separate
timestamp-precision and timestamp-error notations, so we're out of
"massive overkill" territory.
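For concreteness, here is a sketch of the signer-side computation, folding worst-case clock error into a single interval. The function name and the interval format are my assumptions:

```python
# Hypothetical signer-side computation: fold clock precision and
# worst-case error into one ISO 8601 interval, so the verifier never has
# to interpret separate precision/error fields itself.
from datetime import datetime, timedelta, timezone

def timestamp_interval(now, max_error_seconds):
    """Return the notation value for a clock with the given worst-case error."""
    err = timedelta(seconds=max_error_seconds)
    fmt = "%Y-%m-%dT%H:%M:%SZ"
    start = (now - err).strftime(fmt)
    end = (now + err).strftime(fmt)
    return f"{start}/{end}"

# A machine that drifts up to 10 minutes widens its interval accordingly:
now = datetime(2011, 6, 16, 10, 30, tzinfo=timezone.utc)
print(timestamp_interval(now, 600))
# → 2011-06-16T10:20:00Z/2011-06-16T10:40:00Z
```

The NTP-synced machine and the drifting machine from my example above would then use the same notation, differing only in how wide an interval they claim.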
email jerome at jeromebaum.com
PGP: A0E4 B2D4 94E6 20EE 85BA E45B 63E4 2BD8 C58C 753A
PGP: 2C23 EBFF DF1A 840D 2351 F5F5 F25B A03F 2152 36DA