Problem with faked-system-time option

David Shaw dshaw at jabberwocky.com
Wed Jun 15 23:19:33 CEST 2011


On Jun 15, 2011, at 3:50 PM, Daniel Kahn Gillmor wrote:

> On 06/15/2011 03:10 PM, David Shaw wrote:
>> That said, I'd probably suggest notations for this, even though 0x40 exists in the standard.  0x40 signatures are a bit of a leftover tail in the standard, and are not well specified (with a 0x40 signature class, is it a binary signature or a text signature?).  Using notations also gives you more flexibility, since you can do key=value pairs and specify different variations on timestamp signatures.
> 
> Note that if you do decide to use a notation for this, you should mark
> the relevant notation subpacket as "critical", so that the signature is
> not interpreted by an unwitting implementation as meaning something
> other than the specific declaration:

I'm not sure I agree with that.  Essentially, this notation is a way for a user to say "This is what I mean by this signature".  Meaning and intent are difficult for GnuPG to divine :)
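
As a concrete sketch of what I mean (the notation name here is made up for illustration; GnuPG requires user-defined notation names to take the name@domain form), a user could attach such a declaration when signing:

    # "timestamp-only@example.org" is a hypothetical notation name
    gpg --sig-notation timestamp-only@example.org=1 --sign document.txt

The key=value form also leaves room for variations later (a different value, or additional notations) without needing a new signature class.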

In practice, the critical flag tells GnuPG to reject the signature (mark it as invalid) if it doesn't know about the notation.  Why does GnuPG need to know about this notation?  Or more specifically, what should GnuPG do differently for a timestamp-only signature compared to a regular signature?
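
For comparison, the critical variant in GnuPG is just an exclamation mark in front of the notation name, so a user can opt in per signature (again, the notation name is hypothetical):

    # the leading '!' marks the notation as critical
    gpg --sig-notation '!timestamp-only@example.org=1' --sign document.txt

An implementation that doesn't recognize the notation should then treat the signature as invalid, which is exactly the behavior I'm questioning as a default.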

I'm not against a user marking the notation as critical if he chooses to do so.  I just wouldn't have it automatically and always critical.  Unless I'm misunderstanding your point, I don't see that the semantics of a timestamp notation require that.
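
(Anyone who wants to see the difference can check both signatures with a stock gpg:

    gpg --verify document.txt.gpg

The non-critical version should verify fine, since unknown non-critical notations are ignored, while the critical one is rejected as described above.)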

David



