what is killing PKI?

Stan Tobias sttob at mailshack.com
Wed Oct 3 21:19:13 CEST 2012


The impulse for writing my first post in this thread was frustration
with the "technological" treatment that privacy often receives, and with
a lobby that tries to tell everybody to encrypt everything, whether
sensitive or not (I think I've seen this on this list, but I don't have
the time to research it now).  The argument for using encryption seems
to go like this: "privacy (value) is always good, privacy (secrecy)
is achieved by encryption, therefore encryption is always desirable".
I'm bothered when I read "privacy is achieved by encryption" (johnny1)
- okay, maybe they use it as a synonym for "secrecy", but then it blurs
the distinction between the two.

In my posts I just wanted to articulate one reason why, for some people
(from "out in the field"), encryption is a non-choice, and I gave some
context in which such an attitude might be understandable.  In further
discussion I tried to describe Privacy as a social mechanism and show it's
not equivalent to secrecy (i.e. information leakage does not necessarily
mean loss of privacy), and that therefore the above argument is a non
sequitur.  I don't feel I'm the best person to discuss these things, but
I thought someone must first voice an idea, so that others can use it
and test it.

Before writing my first post, I had read the Gaw et al. article (on
your recommendation, actually), and I was a little less than satisfied
(perhaps because the authors didn't make an effort to hide their
opinions).  For this reply, I have read all the other articles mentioned
in someone's earlier post.

Please allow me to add one more thought, which is relevant further down.

It might sound paradoxical, but openness is what protects us in our lives.
We require transparency from the government, from institutions, from
private companies (in certain circumstances), and therefore we practice
a culture of openness ourselves.  This is how we keep institutions
under control, and this is what keeps society together.  People are
more likely to support a transparent organisation.  People are upset
when public institutions become too secretive.

Facing an opponent, to avoid a fight, we may run away (nothing wrong
with that, it's a normal defense), or challenge them (which often works).

I think I can understand why some people (e.g. Jenny) feel encrypting
"public" information is not appropriate: it's a challenge.  "We had
a meeting at 9, coffee was served, so what?  We are worried about a new
law - every citizen has the right to be concerned.  We investigate what
BigCorp does - of course, we're ActivistCorp, it's our job, that's what
our supporters pay us to do.  Now, what are _you_ up to?".  Secrecy would
probably not be adequate, because then the police could use any pretext
to enter the offices and hamper the activity.  Transparency also helps
keep internal discipline (don't do stupid things).


"Robert J. Hansen" <rjh at sixdemonbag.org> wrote:

> On 8/26/12 5:37 PM, Stan Tobias wrote:
> > In the works cited before (this thread and other discussions),
> > one recurring concern could be formulated as: "Why Johnny doesn't
> > encrypt his Christmas greetings to his granny?", with an implicit
> > assumption/expectation that everybody ought to use cryptography by
> > default for any and everything.  I'll concentrate on the encryption only.
>
> Well, speaking just for myself, I try not to make that assumption.  I'm
> interested in knowing why Johnny can't encrypt, and then further why
> Johnny *doesn't* encrypt.  These are two different questions which have
> very different answers.

I didn't adapt the title without reason; my answer was directed at
that attitude.  "What will it take to make the use of encrypted e-mail
universal and routine?" is a quote from Gaw et al.

Do we really have evidence that people can't encrypt?  To me, the
"johnny" articles were not quite clear about it (they seemed to
investigate a different aspect).  I don't believe people are stupid.
They can learn to use cryptography, just as they have learned many
other things in their lives.

Another matter is what "can" means.  I can fly an Airbus A380.
(Sure :-^, I only have to find that "A/P" button.)  That was my
conclusion after reading "All users were able to decrypt. This is
because PGP automatically decrypts emails when they appear in Outlook
Express." (Sheng et al.).  I think what is missing from many discussions
is that to be able to _effectively_ use cryptography on computers, one
has to know much more than how to use cryptography.  (I could tell a
funny story of a security failure caused not by wrong use of GnuPG,
but by a failure to realize how a certain file system worked.)

Can you imagine a responsible person exchanging sensitive information
while not being certain that what he does is safe?  It's a matter of
personal integrity: it's not enough to tell a user "click here and
there, and you're fine"; we have to first convince ourselves that what
we do is right.  The upshot is that you cannot simply make cryptography
easier for users; they will have to study and understand it themselves
anyway.


> "Why Johnny can't encrypt" is a human-computer interaction (HCI)
> problem.  HCI problems are eminently solvable.  The papers have a lot of
> exploration of this problem: see, e.g., "Why Johnny Can't Encrypt",
> "Johnny 2", and "Why Johnny Still Can't Encrypt" for three examples of
> really good peer-reviewed papers that explore this.

One reason I didn't like those papers is that they concentrated on a
particular version of a particular implementation, and they seemed to
make mostly ignorant users work with it (and by my standards, they
were actually quite successful at that).  Their methods and findings
could be applied to any graphical software.  As I said, for me, being
able to use encryption means more than knowing which buttons to click.


> "Why Johnny doesn't encrypt" is a social problem.  Social problems are
> notoriously intractable.  

I disagree here.  They might be difficult to quantify, but we can discuss
those issues and try to reach some conclusions, if not solutions.

One reason I like to read Schneier's blog (when I have the time) is that
he often discusses the social aspects of security.

> See, e.g., Gaw, Felten and Fernandez-Kelly's
> paper.  They found that even when people were aware of the dangers they
> were facing, knew those dangers were real, had easy access to crypto
> software and had been trained in its use, they *still* weren't using
> crypto... principally because they didn't want to be seen as paranoid.

In the article I didn't find anybody who said they didn't want to be
_seen_ as paranoid; they only described certain behaviour as paranoid.
I think people use the word "paranoid" when something conflicts with
their perception of the world and they don't know how to phrase it.

Cryptography might not be difficult to apply, but it is not without
problems *around* it.  In my limited experience, it requires a lot of
planning: what must be encrypted, and why; what passwords will be used
(with many encrypted files) and how to ensure I don't forget them;
how to ensure the encrypted files don't become corrupted (so that the
data doesn't become irrecoverable); how to check that they're not
corrupted, and how often; where and how to make backups (the check
goes before the backup, of course, but you have to remember this);
and many other small details in a long decision tree.
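
Just to illustrate the "check before backup" step, here is a minimal
sketch of the kind of helper one ends up writing (in Python; the file
names and layout are purely my own invention, not anybody's recommended
practice): record a digest of each encrypted file once, then verify the
digests before every backup run.

    import hashlib
    import json
    import sys
    from pathlib import Path

    MANIFEST = Path("checksums.json")  # hypothetical place to keep digests

    def sha256_of(path):
        # Hash in chunks so large encrypted archives don't exhaust memory.
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def record(files):
        # Run once, right after encrypting: remember each file's digest.
        MANIFEST.write_text(json.dumps(
            {name: sha256_of(Path(name)) for name in files}, indent=2))

    def verify():
        # Run before every backup: complain if a file changed or vanished.
        ok = True
        for name, digest in json.loads(MANIFEST.read_text()).items():
            path = Path(name)
            if not path.exists():
                print("MISSING", name)
                ok = False
            elif sha256_of(path) != digest:
                print("CORRUPT", name)
                ok = False
        return ok

    if __name__ == "__main__":
        if sys.argv[1:2] == ["record"]:
            record(sys.argv[2:])
        elif not verify():
            sys.exit(1)  # non-zero: don't let a backup overwrite good copies

Even this toy script adds to the chore: it has to be written, tested,
run at the right moments, backed up itself, and not forgotten - which
is exactly the kind of overhead I mean.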

Recently I helped a friend recover data from a broken NTFS partition -
mainly family pictures going back many years.  Had the disk been
encrypted, the chances of recovering anything (by me, at least) would
have been close to zero.  One has to be able to balance the risk of
leaking information against the risk of losing it completely, and it's
a big headache, especially since we don't realize all the factors that
come into play.

Using cryptography to protect secrets is a serious intellectual
effort.  Abe (in Gaw) described it as a "chore", and I think I can
understand what he meant by that.  If you want others to use
cryptography when communicating with you, you are asking to put the
same burden on their backs.  "Paranoid" in this case does not mean
"tin-hat"; it just means that the effort you put into a message
exchange is not proportional to its value.

As for the people you mentioned, I don't see exactly which ones you
mean; I didn't find any egregious carelessness (except that users didn't
understand digital signatures, but that wasn't a big issue either).
Ultimately, it's the responsibility of the organization's management
to decide to what degree information must be protected.  Should
cleaners be required to encrypt their emails?  What about the plumbers?
And encryption often isn't the only (or the most important) matter to
think about (the WikiLeaks "thank you" email fiasco).


> I really don't want to rain on people's parades.  A lot of these ideas
> of "what the problem is" are deeply interesting.  But until you actually
> go out into the world and ask real users the question, and observe
> workers in their natural environment, then it's a bunch of discussion
> over how many angels can dance on the head of a pin.

Facts by themselves are not knowledge.  We gain insight by discussing
facts.  It's important to discuss before the next round of interviews,
because then we know better what questions to ask, and how to ask them.
My experience is generally in agreement with the findings of the articles.
My only addition here is that I try to rationalize certain behaviour,
which is something the interviewees could not do on the spot, because
they probably acted on instinct rather than calculated risk.  I might
be wrong about Jane, but I'm not wrong about myself, and my writing
here is another testimony.  There's no need to go far: this mailing
list is a mine of issues people deal with, of reasons why they do or
don't encrypt, and of their good and bad perceptions.  If someone had
the time to sort all those things out, it might result in another great
scientific paper.

Best regards, Stan.

P.S.1. Since I have the occasion now, I just want to say to you, Robert,
       a big and sincere "Thank You!" for your articles on this mailing
       list.

P.S.2. I mentioned British police once - they still don't wear guns:
       http://www.bbc.co.uk/news/magazine-19641398

P.S.3. Thanks to others who responded, and especially to Marco - after
       my second post, your reply was the first I received, and it was
       very kind and encouraging.  What I have to say isn't very
       important, and is only slightly topical, but since you asked,
       here goes, briefly:

Facebook users have often been accused of carelessness about their
privacy, even of plain foolishness (by "facebook" I understand any
web-page where users publish themselves, e.g. Wikipedia).  Once I read
a discussion on sexting among teenagers; the conclusion as to why they
do it was that as they grow up, they announce this fact to others -
it's in their nature.  (Some were harshly punished, though IMO not by
life, but rather by self-righteous adults who disregarded their
normality.)  I think the same applies to grown-ups; I've seen people
publish uninteresting things about themselves without purpose, and I
can't explain it other than by a need to announce one's presence to
the world - it's something in human nature.  Much "Internet" time has
passed, and I haven't seen any privacy disaster yet.  People reveal a
lot, but not everything; they make (sometimes funny) mistakes, but they
also learn from those mistakes.  Privacy is not all black-and-white,
and we have room to test how much we can reveal, and when it becomes
too much.

I think a lot of good comes from people publishing themselves en
masse: we learn about other people, but most importantly *we learn
about ourselves*.  This helps _break social taboos_ and bring down
barriers between people.  We keep in contact with other people, share
ideas, organize ourselves, and can influence political change.  People
seek other people; it's in their nature.  "Foolishness" is part of
human life; I think what attracts people is that they can make a
mistake, look foolish, and still maintain dignity, because everyone
else around is equally "foolish".  These are extremely important
things; they help us grow up, and they change our (global) culture.
Ultimately, it may turn out that not so many things are really private,
because essentially we all look the same and do the same things.

I think we sometimes overestimate the negatives when people publish
their lives.  When one person comes out naked into the street, it's a
sensation.  When a thousand do, then effectively no one is naked.
Things that were inappropriate twenty years ago are not so today.

One concern does remain: the published information persists, and we can
never be sure that it won't be used against us in the future, in ways
yet unknown to us.  Well, life is a risk.  We must evaluate what's more
important: creating more good and freedom for ourselves, or avoiding
the risk.  I see it as a race over who will get there first: ordinary
people establishing a new standard of normality, or the self-righteous -
will they sense the change in time and start regulating our lives again?

Last: here on this list we reveal a lot about ourselves, too.  If you
ask questions, or help someone understand cryptography, you reveal that
you know something about it.  This information is potentially more
sensitive than what someone ate or where they were on vacation, and
could be used against you.  So calling facebook people foolish on this
list is... well, paradoxical, to say the least.



