[guardian-dev] WOT and Authentication Research
Patrick Baxter
patch at cs.ucsb.edu
Wed Dec 19 05:29:05 CET 2012
Thanks for the response! I'm glad you have similar interests in this.
I have some responses inline:
On Sat, Dec 8, 2012 at 11:45 PM, elijah <elijah at riseup.net> wrote:
> If I read correctly, a few of your main points are:
>
> (1) We need well defined and expanded trust metrics
> (2) Everything would be better if we enforced a 1:1 mapping of public
> keys and UIDs (e.g. alice at example.org)
> (3) There are many different behaviors users engage in that demonstrate
> trust, and many different cryptographic formats this can take. Why not
> combine them in order to build a much more complete sense of trust?
>
I think that is a really good summary. I could also divide this into
two goals:
1) We must improve the ability for anyone to participate in end-to-end
encryption WITH strong authentication:
a. This means leveraging many different social interactions and
protocols that can capture key validation.
b. Improved ability to manage a private key
c. When possible, rely less on public key servers.
2) We must create a flexible standard for a public data structure that
enforces this one-to-one mapping:
a. It needs to balance the flexibility and usability tradeoff
intelligently.
b. Least possible authority
c. Adaptable to different types of identities
d. "Trust agnostic"
e. Keys can be duplicated among multiple types of identities. Use this
opportunistically to increase our ability to enforce and update the
mapping by using the validations from each domain (see the sketch
after this list).
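To make 2) a bit more concrete, here's a rough sketch in Python of what
one record in that public structure might look like. The field names and
the Validation type are just my own invention for illustration, not any
existing standard:

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Validation:
        # One piece of evidence that a key belongs to an identity.
        # 'domain' is where the evidence came from ("email", "xmpp",
        # "dns", "in-person", ...); 'proof' is an opaque signature or
        # attestation from that domain.
        domain: str
        proof: bytes

    @dataclass
    class MappingRecord:
        # A single entry enforcing a one-to-one UID <-> key mapping.
        uid: str                 # e.g. "alice at example.org"
        key_fingerprint: str     # the one key bound to this UID
        validations: List[Validation] = field(default_factory=list)

        def add_validation(self, v: Validation) -> None:
            # Validations from many domains pile up on the same record,
            # which is what lets the mapping be enforced and updated
            # opportunistically.
            self.validations.append(v)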
> If I understand correctly, another way to think of this same problem is
> in terms of Zooko's triangle:
> *snip*
Agreed!
> Essentially, I think the goal is decentralized human
> memorable public keys. No one wants to type an OpenPGP fingerprint as a
> destination address!
Agreed as well; having a key to encrypt and sign should be a natural
thing to possess for every UID you have. People shouldn't have to
think much about keys, but they can probably understand a key as a
password that must be physically available: "you are using an insecure
connection for this login because you do not have access to your
password."
> I probably missed a few, but it is a good starting list. Ideas that rely
> on a single central authority are excluded.
As an aside, unless you consider something like Sovereign Keys a
centralized solution, I'd actually make the argument AGAINST
federation. Rather, I think having a single 'centralized structure'
that is replicated under distributed control is more desirable.
Federation puts a single authority in charge of a certain group of
people. A central structure controlled by varying groups is arguably
more accountable, since those groups must collude to violate users'
trust.
Still, a key server should only be responsible for keeping track of
valid mappings. A user can do that on their own to varying degrees and
accept changes in identity only if they want. Ultimately, keyservers
must still present valid user signatures to be validated by a user.
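As a quick illustration of "the keyserver only presents signatures, the
user validates them": with plain GnuPG this is roughly the following
(the keyserver and key ID are placeholders):

    import subprocess

    KEYSERVER = "hkp://pool.sks-keyservers.net"  # placeholder
    KEY_ID = "0xDEADBEEF"                        # placeholder

    # Fetch the key; the server is only trusted to deliver bytes.
    subprocess.run(
        ["gpg", "--keyserver", KEYSERVER, "--recv-keys", KEY_ID],
        check=True,
    )

    # Validation happens locally: gpg checks the certifications on the
    # key against keys already in the local keyring, not against
    # anything the server asserts.
    subprocess.run(["gpg", "--check-sigs", KEY_ID], check=True)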
> WOT: Alone, the current approach to WOT has problems with usability,
> reachability, and what exactly counts as trust. Despite this, it works.
> In my mind, however, WOT should not be considered as the primary method
> of proof of control if the signatures are made public. Otherwise, the
> user has exposed their entire social graph--data that can be more
> sensitive in many cases than even the content of communication. WOT
> might be very useful in modified form and in conjunction with other
> schemes. Systems for automating key signing with mobile devices could
> make WOT more practical in the real world.
This can be a tough trade-off. I am aware of the problem of exposing
social graphs thanks to William Binney. This is partly a different
problem than authentication, but it matters to most people. However, I
think anonymity is better solved in other ways that don't cost us the
ability to make public signatures. Too many other things already expose
your social graph for withholding signatures to be worth it:
1) Exposing your social graph as a list of people you've signed isn't
as problematic as the real-time data of how often you contact a
person, when, and what location is associated with that communication
link.
2) All of that real-time information is still exposed for the time
being. When you email someone, the headers are never encrypted, so
hiding signatures is a moot point until general anonymous routing is
fully in place. Even hiding your IP with Tor won't help, since you exit
to the general Internet and still expose your headers. Unless some sort
of anonymity system is built for each type of communication, or a
general darknet replacement for the Internet becomes viable, I don't
see this going away. CJDNS seems like a promising idea.
3) I think the correct way to obtain anonymity/privacy would be to use
strict pseudonyms that are only created/accessed over something that
anonymizes your TCP/IP layer (Tor). This way you may still expose your
social graph, but if you always use encryption, that pseudonym should
stay fully separate from the 'real' identity you wish to protect.
Definitely not perfect though.
4) Without public signatures, third parties really can't help
authenticate you, so you are left with either a fully distributed
system or a system that relies completely on third parties to
authenticate you. Going the distributed route would look similar to GPG
without the keyservers. This works locally, but any time you need to
authenticate a person you haven't met, you most likely don't have a
path to that person (see the trust-path sketch after this list).
5) Since this is such a small slice of the anonymity problem, you could
almost label it a privacy problem instead. One alternative is to send
signatures directly to the person you signed and let them choose
whether to publish them. That way no one publishes that they signed
your key unless you are OK with it.
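To illustrate the path problem from 4): with only the signatures you
hold locally, finding someone is just a graph search, and for a
stranger it usually comes up empty. A rough sketch with a made-up
signature graph:

    from collections import deque
    from typing import List, Optional

    # signer -> keys they have signed (made-up data; in practice this
    # comes from your own keyring).
    signatures = {
        "my-key":   {"friend-a", "friend-b"},
        "friend-a": {"friend-c"},
        "friend-b": set(),
        "friend-c": set(),
    }

    def trust_path(start: str, target: str) -> Optional[List[str]]:
        # Breadth-first search for a chain of signatures.
        queue = deque([[start]])
        seen = {start}
        while queue:
            path = queue.popleft()
            if path[-1] == target:
                return path
            for nxt in signatures.get(path[-1], set()):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(path + [nxt])
        return None  # no path: you can't authenticate this stranger

    print(trust_path("my-key", "friend-c"))  # ['my-key', 'friend-a', 'friend-c']
    print(trust_path("my-key", "stranger"))  # None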
> DNSSEC: There are RFCs for discovery of user's public keys via DNS,
> could be used with DNSSEC... If you love the idea of putting trust in the
> domain name system. Ugh.
It seems necessary and inevitable to work with DNSSEC for the time
being. It's been in the making for so long, and if DNSSEC provides at
least some faith in which keys belong to which domains, then it may be
useful as another authority to help enforce initial mappings in the
DNS world. Convergence and Sovereign Keys both provide methods for
relying on this.
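For reference, discovering a user's key over DNS looks roughly like
this with the dnspython library and a CERT record (the RFC 4398
mechanism). The name is a placeholder, and this only checks the
resolver's AD flag rather than doing full DNSSEC chain validation:

    import dns.flags
    import dns.resolver

    NAME = "alice.example.org"  # placeholder

    # A CERT record can carry an OpenPGP key or point to one.
    answer = dns.resolver.resolve(NAME, "CERT")

    # The AD flag only means the resolver claims it validated DNSSEC;
    # a careful client would validate the chain itself.
    dnssec_ok = bool(answer.response.flags & dns.flags.AD)

    for record in answer:
        print(dnssec_ok, record)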
> Biometric Feedback: In synchronous communication, you can use non-verbal
> clues (like the sound of someone's voice) for both parties to mutually
> validate identities (when used in combination with a challenge, like in
> ZRTP).
> Shared Secret
Yes! I think this type of stuff is really fascinating. The more valid
protocols like this that are available to people, the more people can
sign keys without knowing precisely what's going on. If using a WOT,
this means more paths. If you just collect signatures, this means a
stronger consensus on a mapping.
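The ZRTP-style trick is that both ends derive a short string from the
shared session secret and read it to each other, so a man-in-the-middle
shows up as a mismatch. A toy sketch (made-up word list and secret;
real ZRTP has its own derivation and uses the PGP word lists):

    import hashlib

    WORDS = ["apple", "banana", "cobalt", "delta", "ember", "falcon",
             "granite", "harbor", "indigo", "jasper", "kettle",
             "lantern", "meadow", "nickel", "onyx", "pebble"]

    def short_auth_string(shared_secret: bytes, n_words: int = 2) -> str:
        # Map the session's shared secret to a few comparable words.
        digest = hashlib.sha256(shared_secret).digest()
        return " ".join(WORDS[b % len(WORDS)] for b in digest[:n_words])

    # Both sides compute this from the same key material and read it
    # aloud; an attacker in the middle ends up with different secrets
    # on each leg, so the strings won't match.
    print(short_auth_string(b"example session secret"))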
> Mail-back Verification:
> Network Perspective:
> Authority Binding:
Agreed, all good things!
> Federated WOT: Rather than force users to practice good WOT hygiene, the
> idea with a Federated WOT is to force service providers to do proper
> key-signing and then have service providers sign the keys of their
> users. This allows for end-to-end WOT trust path without exposing the
> social graph, although it requires cooperation from the service
> providers (but not faith in them, see network perspective).
Interesting idea! This puts some structure on the WOT. I'm not sure,
though. Could a single provider in the federation corrupt/MITM a path
for one of its users?
> I have labeled what I think you are proposing "Trust Agnosticism". No
> doubt, you have a better name, but I needed to call it something for
> the purpose of this email.
I have no better name; that's as good as any :)
> I agree with #1 and #2, but ultimately have some reservations about
> parts of #3.
> I absolutely agree that some linkage between different key formats would
> be a big improvement, as proposed by PSST. As for improving the WOT by
> including different sources of linkages, ultimately I think the WOT is a
> dead end for the use cases that many of us would like to support.
>
> Specifically:
>
> (1) To be useful, a WOT exposes a user's social graph.
See the other response about this above.
> (2) In order to address the rapid rise of mass surveillance, in either
> repressive or democratic contexts, we need to dramatically increase the
> rate of adoption of end-to-end message encryption. I think there is a
> lot of evidence that current schemes are too complex (since so few
> people use them correctly). My claim is that this complexity cannot be
> fixed with better tools, but is a problem in the inherent conceptual
> complexity of OpenPGP and OTR, etc. A lot of people will take issue with
> that last statement, but these people have probably never tried to get
> activists in the field to use existing secure communication tools
> correctly and consistently. But our goal cannot just be to achieve
> adoption among people who are motivated enough to pay a privacy-tax. To
> be effective against mass-surveillance, we must work to reduce the
> privacy-tax to nothing.
Most definitely. I think the accessibility is what is really
interesting about LEAP. However, I still think you can make a WOT
usable by updating our keyserver structure and changing the end-user
software to give a better experience: remove trust decisions and
automatically capture signatures through some of the aforementioned
methods.
I'm not convinced the WOT is bad, but the correct way to use it is
still somewhat of an open question. I like the idea of preserving the
ability for users to authenticate without any trust in a third party
when possible.
> We are early in our design and coding process, and would absolutely love
> to dialog and collaborate with other people interested in this problem
> space. Depending on what funding we are able to put together, our work
> in this area may progress faster or slower.
Cool stuff, let's keep up the dialog!
-Patrick