Opportunistic Encryption [Was: Keys not trusted]
Tue May 13 19:18:02 2003
On Tue, May 13, 2003 at 05:52:47PM +0400, Yenot wrote:
> Web-of-trust solutions can work within companies and various
> communities, but the web of trust based on public keyservers will
> never become a universal solution. We should support web-of-trust
> solutions (i.e. use OpenPGP), but we shouldn't ban opportunistic
> encryption in order to force the growth of the Web-of-Trust.
Without fingerprint verification in person, many will argue that their
current notion of the WoT doesn't apply.
Opportunistic encryption will get more messages encrypted, but it
doesn't prevent MITM attacks. PRZ doesn't feel that most people are
targets of such attacks, which is why he now advocates opportunistic
encryption.
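The MITM point can be made concrete: in an unauthenticated key
exchange, an attacker who controls the channel simply substitutes her
own public values. Below is a toy Diffie-Hellman sketch in Python (the
tiny group, the names, and the whole setup are my own illustration,
not anything from this thread or from PGP itself):

```python
import secrets

# Toy finite-field Diffie-Hellman; parameters are for illustration only.
P = 2**127 - 1  # a Mersenne prime, far too small for real use
G = 3

def dh_keypair():
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

# Alice and Bob each generate a keypair...
a_priv, a_pub = dh_keypair()
b_priv, b_pub = dh_keypair()
m_priv, m_pub = dh_keypair()  # Mallory, in the middle

# ...but Mallory relays her own public value to each side.
alice_secret = pow(m_pub, a_priv, P)  # Alice thinks she shares this with Bob
bob_secret = pow(m_pub, b_priv, P)    # Bob thinks he shares this with Alice

# Mallory can decrypt and re-encrypt traffic in both directions:
assert alice_secret == pow(a_pub, m_priv, P)
assert bob_secret == pow(b_pub, m_priv, P)
assert alice_secret != bob_secret  # Alice and Bob never actually agreed
```

Nothing here is detectable by the endpoints without some out-of-band
check on the public values -- which is exactly what fingerprint
verification provides.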
> 1) Keysigning parties are never going to catch on with the masses.
> It's an expensive operation. Protocols that require expensive
> operations (purchasing a certificate, attending keysigning
> parties, etc.) will never catch on with the masses.
So? Them asses can use cryptography without performing said expensive
operations.
> In order to protect the masses, we need opportunistic encryption.
Protect them asses from what and/or whom? If they don't have a threat
model, maybe they have nothing to defend.
> 2) As implemented today, the Web-of-Trust is bad for privacy.
> Advertising e-mail addresses combined with a list of your closest
> contacts (via signatures) works well for an authentication
> protocol, but it's not a good privacy protocol.
Privacy and anonymity are two separate things.
> I'm not the only one with this opinion. 50% of the residential
> phone customers in California, USA pay around $0.28 *every* month
> to keep their phone number unpublished. The nationwide percent
(But how many block their number when making outbound calls? How many
also avoid using toll-free numbers?)
> in America is only around 24%, but some of the phone monopolies
> extort as much as $6.95/month to keep a phone number unpublished.
(Extortion it is.)
> On top of this, I believe there have been multiple battles to stop
> American phone companies from selling name/address/phone lists
> of peoples' closest contacts based on call history. The
> Web-of-Trust forces people to disclose this very same information
> that a large percentage of the population (at least in America) do
> not want published.
No, most people choose to put their names on their keys because using
names instead of numbers (keyids) to identify keys is easier.
Assume we didn't, but still wanted to encrypt email. At a keysigning,
we'd have to provide our email addresses anyway. (Photo IDs might be
irrelevant if we're not certifying everyone's real name for any
auxiliary purposes.) We could take everyone's word that they own the
keys they claim to own, or we could email them encrypted challenges.
In our MUAs, we'd probably manually associate keys with email addresses.
This gives us keys which can't be harvested for their email addresses
and can't be attached to a real person unless you've met them at a
keysigning or do traffic analysis on their email. (If needed, use
--throw-keyid so that anyone doing traffic analysis can't attribute
a specific key[id] to that person.)
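The encrypted-challenge step above has a simple shape: mail the
claimant a random nonce encrypted to the key, and only the true
keyholder can echo it back. Here's a minimal Python sketch, using
textbook RSA with toy primes purely as a stand-in for OpenPGP
encryption (the key material and function names are mine, not a real
protocol spec):

```python
import secrets

# Textbook RSA with tiny primes -- a stand-in for "encrypt to the key".
p, q = 61, 53
n = p * q            # public modulus
e = 17               # public exponent
d = 413              # private exponent: e*d == 1 (mod lcm(p-1, q-1))

def verify_key_ownership(pub, respond):
    """Encrypt a random nonce to the claimed key; the echo proves possession."""
    n, e = pub
    # Avoid fixed points of the toy cipher, so blindly echoing the
    # ciphertext can never pass by accident.
    while True:
        nonce = secrets.randbelow(n - 2) + 2
        challenge = pow(nonce, e, n)
        if challenge != nonce:
            break
    return respond(challenge) == nonce

honest = lambda c: pow(c, d, n)   # holds the private exponent
impostor = lambda c: c            # only saw the public key; echoes blindly

assert verify_key_ownership((n, e), honest)
assert not verify_key_ownership((n, e), impostor)
```

The same exchange works over anonymous channels, since it never needs
a name -- only proof that whoever reads mail at that address can
decrypt with that key.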
(If you need even more anonymity, wear masks at the keysigning and
communicate through anonymous channels.)
The WoT doesn't cease to exist for "anonymous" keys. In fact, it
becomes purer. If you don't have a trust path to an "anonymous" key,
you can't even put any trust into it based on a name or email address
that you might be willing to trust.
Jason Harris | NIC: JH329, PGP: This _is_ PGP-signed, isn't it?
firstname.lastname@example.org | web: http://jharris.cjb.net/