
2017-04-29

This Mess of a Certificate Authority System

I recently had to switch all of my SSL certificates from StartCom to Let’s Encrypt, so let me take this opportunity to expound on the public key infrastructure used on the web.

First, to establish a secure connection over an untrusted channel using symmetric encryption, you need a shared secret. The conventional solution is to start a handshake using asymmetric encryption, through which each party can be identified by their public key. Then the problem becomes, how do you verify the other person’s public key?
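
To make that concrete, here is a rough sketch of the handshake idea in Python, using the pyca/cryptography library’s X25519 exchange (any key-agreement primitive would do; the key names and the “handshake” label are just for illustration):

    # Each side generates an asymmetric key pair and sends its public key
    # over the untrusted channel.
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF
    from cryptography.hazmat.primitives import hashes

    client_priv = X25519PrivateKey.generate()
    server_priv = X25519PrivateKey.generate()

    # Each party combines its own private key with the other's public key
    # and arrives at the same shared secret independently.
    client_secret = client_priv.exchange(server_priv.public_key())
    server_secret = server_priv.exchange(client_priv.public_key())
    assert client_secret == server_secret

    # Derive a symmetric session key from the raw shared secret.
    session_key = HKDF(algorithm=hashes.SHA256(), length=32,
                       salt=None, info=b"handshake").derive(client_secret)

Notice that nothing above tells the client whose public key it actually received, which is exactly the problem.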

There are basically two solutions in practice: SSL/TLS, which delegates validation to a hierarchy of certificate authorities; and SSH, which presents the public key’s fingerprint to users and lets them work it out themselves.
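
In code the difference is small but telling. A sketch using Python’s standard ssl module (example.com stands in for any host): the TLS path validates the chain against a CA store automatically, while the SSH-style path just shows a fingerprint and leaves the judgment to the human.

    import hashlib, socket, ssl

    ctx = ssl.create_default_context()  # loads the system's trusted CA store
    with socket.create_connection(("example.com", 443)) as sock:
        # TLS model: wrap_socket verifies the certificate chain and host
        # name against the CA hierarchy, failing loudly on a bad cert.
        with ctx.wrap_socket(sock, server_hostname="example.com") as tls:
            der = tls.getpeercert(binary_form=True)

    # SSH model: present a fingerprint and let the user work it out.
    print("SHA256:", hashlib.sha256(der).hexdigest())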

Incidentally, any system which doesn’t use either of these techniques, and which doesn’t rely on a pre-shared key, probably has a fundamental weakness. So for example, I think it should be pretty trivial to steal WiFi passwords just by masquerading as someone else’s access point and waiting for them to connect. The only way it could be secure is if the access point provided a public key for clients to validate. (If you Google phrases like “wifi mitm” you get a lot of application man-in-the-middle attacks like Firesheep, which are completely unrelated, so I’m not sure whether this exploit is known or practical.)

Anyway, public keys must be validated by the client, one way or another. Assuming the user is in control of his or her own computer, that means the responsibility ultimately falls to the user. However, since all of us are lazy and most of us are clueless, and browsers do certificate checking by default, we tend to let browsers do it for us.

Responsibility and power must be two sides of the same coin, because browsers have turned around and created a large industry of certificate authorities. The profits of these companies come directly from users’ disinterest in (and to some extent, ignorance and fear of) verifying certificates themselves.

Okay, so I may sound a little bit anti-capitalist, but none of that is bad. The bad part is this:

Before you can verify a website’s certificate, before you can even connect to it, you need to know whether to expect the site to be encrypted or not. After all, there are still lots of unencrypted sites out there. If a site supports encryption, you don’t want to connect to it insecurely; but if it doesn’t, you have to connect insecurely or it won’t work at all. (And note that both “upgrade” and “fallback” strategies are insecure, so you really do have to know in advance.)
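
Here is why fallback can’t be secure, as a sketch (urllib used purely for illustration): a naive client that tries HTTPS and falls back to plain HTTP hands the decision to anyone who can block port 443.

    import urllib.error, urllib.request

    def naive_fetch(host):
        try:
            return urllib.request.urlopen("https://%s/" % host, timeout=5).read()
        except (urllib.error.URLError, OSError):
            # An attacker who blocks or resets port 443 forces every
            # connection down this unencrypted path.
            return urllib.request.urlopen("http://%s/" % host, timeout=5).read()

Upgrade has the mirror-image problem: the first plaintext request is already on the wire before the site gets a chance to ask for anything better.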

The original solution was an “s” in the URL’s protocol. If you go to https://www.mybank.com, your browser knows that the connection must be encrypted. It won’t allow the connection to be downgraded by an attacker.

However, we follow URLs from all sorts of sources: other web pages, emails, chat, etc. We also type mybank.com in the address bar, which still assumes plain “http:”. Either way, if you end up at http://www.mybank.com (without “s”), then someone can intercept your unencrypted connection. So the “s” in the protocol is almost worthless.

The next solution was HSTS (HTTP Strict Transport Security), which simply lets sites indicate that all future connections should be secure. However, that only helps after the user has already connected (potentially insecurely) the first time, similar to TOFU (Trust On First Use).
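
Concretely, HSTS is just a response header sent over HTTPS. A quick way to peek at one (example.com is a stand-in; plenty of sites won’t set it):

    import urllib.request

    resp = urllib.request.urlopen("https://example.com/")
    # Prints something like "max-age=31536000; includeSubDomains; preload",
    # or None if the site hasn't opted in.
    print(resp.headers.get("Strict-Transport-Security"))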

The final solution was HSTS preloading, which is just a way for browsers to bundle in a list of sites which require encrypted connections. It was started by the Chrome developers but these days basically all browsers use their site list.
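
A toy model of what the browser does with that bundled list, before any bytes hit the network (the entries and field names here are made up, not Chromium’s actual format):

    # Hypothetical preload table; real browsers ship many thousands of entries.
    PRELOAD = {"mybank.com": {"include_subdomains": True}}

    def effective_scheme(host, typed_scheme="http"):
        # Check the host and each parent domain against the list.
        parts = host.split(".")
        for i in range(len(parts)):
            entry = PRELOAD.get(".".join(parts[i:]))
            if entry and (i == 0 or entry["include_subdomains"]):
                return "https"  # preloaded: never even attempt plain HTTP
        return typed_scheme

    assert effective_scheme("www.mybank.com") == "https"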

If a site isn’t on the big list, your connections to it are vulnerable. If the site uses HSTS, you’re exposed the first time you visit it (on a new computer, or after a fresh install); if it doesn’t, you’re exposed every time you visit it without following a trusted link or manually typing “https:”.

In other words, the free HSTS preload list is a browser-run, semi-manual certificate authority, exactly the kind of thing the CA ecosystem was created to avoid in the first place. If it were enhanced to record the fingerprint of each site’s root certificate, then all of the current flexibility could be maintained, but free certificates could also become more accessible and more secure (because each level of delegation just increases the attack surface).
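
Extending the toy model above makes the proposal concrete; the root_sha256 field is hypothetical, not part of any real preload format, and the fingerprint value is a placeholder.

    import hashlib

    PRELOAD = {
        "mybank.com": {
            "include_subdomains": True,
            # Hypothetical: pin the site's root certificate fingerprint.
            "root_sha256": "ab12...",  # placeholder value
        },
    }

    def root_matches(host, root_cert_der):
        entry = PRELOAD.get(host)
        if entry is None or "root_sha256" not in entry:
            return True  # not pinned: fall back to ordinary CA validation
        return hashlib.sha256(root_cert_der).hexdigest() == entry["root_sha256"]

One lookup then answers both questions at once: whether to expect TLS at all, and which root to expect it from.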

The problem of verifying certificates and the problem of knowing whether to expect a certificate really are the same problem, so it makes no sense for them to be completely disconnected (even handled by separate entities) the way they are currently.

If you want to get into really crazy territory, browser vendors could bring EV (Extended Validation) certificates in-house, which would give companies like Mozilla another revenue stream, instead of (bizarrely) outsourcing the profits like they do today. As EV gets increasingly automated (from what I understand, some CAs already have a completely automated EV process), this becomes more practical for software companies like browser vendors.

Unfortunately, the biggest problem with this plan is probably social. Once an industry is created, it’s nearly impossible to get rid of, no matter how unnecessary it becomes. Thus, as a final warning for aspiring protocol designers, be very careful where you insert opportunities for profit, because they’ll be impossible to remove later.