In Favour of Self-Signed Certificates?

Today I watched the Google I/O presentation about HTTPS everywhere and read a couple of articles saying that Google is going to rank sites using HTTPS higher. Apart from that, SPDY mandates the use of TLS, and it’s very likely the same will be true for HTTP/2. Chromium proposes marking non-HTTPS sites as non-secure.

And that’s perfect. Except it’s not very nice for small site owners. In the presentation above, the speakers say “it’s very easy” multiple times. And it is: you just have to follow a dozen checklists with a dozen items each, run your site through a couple of tools and pay a CA 30 bucks per year. I have run a couple of personal sites over HTTPS (non-commercial, so using a free StartCom certificate), and I still shiver at the thought of setting up a certificate. You may say that’s because I’m an Ops newbie, but it’s simply a tedious process.

But let’s say every site owner has a webmaster on contract who will renew the certificate every year. What’s the point? The talk rightly points out three main aspects – data integrity, authentication and encryption. It also rightly points out that no matter how mundane your site is, there is information that can be extracted from the fact that someone visited it (although even with HTTPS the hostname is visible, so it doesn’t matter what torrents you downloaded: it is obvious you were visiting thepiratebay).

But does it really matter whether my blog properly authenticates itself to the user? Does it matter if the website of the local hairdresser may suffer a man-in-the-middle attack, with someone posing as the site? Arguably not. If there is a determined attacker who wants to observe what recipes you are cooking right now, I bet he would find it easier to just install a keylogger.

Anyway, do we have any data on how many sites are actually just static websites or blogs? How many websites don’t have anything more than a contact form (if that)? 22% of newly registered domains in the U.S. are using WordPress. That doesn’t tell us much, as you can build quite interactive sites with WordPress, but it is probably an indication. My guess is that the majority of sites are simple content sites that you do not interact with, or whose interactions are limited to posting an anonymous comment. Do these sites need to go through the complexity and cost of providing an SSL certificate? Certification Authorities may be rejoicing already – forcing HTTPS means there will be an inelastic demand for certificates, and that means prices are not guaranteed to drop.

If HTTPS is forced upon every webmaster (which should be the case, and I firmly support that), we need a free, effortless way for the majority of sites to comply. And the only option that comes to mind is self-signed certificates. They do not guarantee there is no man-in-the-middle, but they do encrypt the communication and make it impossible for a passive attacker to see what you are browsing or posting. Server software (apache, nginx, tomcat, etc.) can have a switch “use self-signed certificate” and automatically generate and store the key pair on the server (a single server, of course, as traffic is unlikely to be high for these kinds of sites).
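For illustration, such a switch could boil down to little more than a one-time key generation step on first start. A minimal sketch using openssl (the paths, validity period and hostname are placeholders, not a real server’s configuration):

```shell
# Sketch of what a "use self-signed certificate" switch could do on
# first start: generate a key pair and a self-signed certificate once,
# then reuse them on every subsequent start.
CERTDIR=$(mktemp -d)

# One command produces both the 2048-bit RSA key and a certificate
# self-signed for the server's own hostname, valid for a year:
openssl req -x509 -newkey rsa:2048 -nodes \
    -keyout "$CERTDIR/server.key" \
    -out "$CERTDIR/server.crt" \
    -days 365 \
    -subj "/CN=www.example.com" 2>/dev/null

# nginx, for example, would then point at the generated pair:
#   ssl_certificate     /path/to/server.crt;
#   ssl_certificate_key /path/to/server.key;
```

The renewal problem disappears as well: the server can simply regenerate the pair before expiry, since no CA is involved.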

Browsers must change, however. They should no longer report self-signed certificates as insecure, at least not until the user tries to POST data to the server (and especially if there is a password field on the page). Upon POSTing data, the browser should warn the user that it cannot verify the authenticity of the certificate, and that he should proceed only if he considers the data not sensitive. Or even passing any parameters (be it GET or POST) could trigger a warning. That alone won’t be sufficient, as one can issue a GET request indirectly, for example by embedding an image or using javascript. That’s why the heuristics to detect and show a warning could include submitting forms, changing src and href with javascript, etc. Can that cover every possible case, and won’t it defeat the purpose? I don’t know.

Even small, content-only, CMS-based sites have admin panels, which means the owner sends a username and password. Doesn’t this invalidate the whole point made above? It would, if there weren’t an easy fix: certificate pinning. Even now this approach is employed by mobile apps in order to skip the full certificate checks (including revocation). In short, the owner of the site can take the certificate generated by the webserver, import it in the browser (pin it), and be sure that the browser will warn him if someone tries to intercept his traffic. If he hasn’t imported the certificate, the browser would warn him upon submission of the login form, possibly with instructions on what to do.
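Pinning can also work on the certificate’s public key rather than the whole certificate, so it survives re-issuing with the same key. As a hedged sketch (the throwaway certificate and file names here are illustrative), this is how the SPKI fingerprint that a client could pin is computed with openssl:

```shell
# Setup for the sketch: a throwaway self-signed certificate standing in
# for the one the webserver generated.
openssl req -x509 -newkey rsa:2048 -nodes -keyout /tmp/pin.key \
    -out /tmp/pin.crt -days 1 -subj "/CN=pin-demo" 2>/dev/null

# Compute the pin: extract the public key from the certificate,
# DER-encode it, and take the base64 SHA-256 digest. A client stores
# this value and compares it on every subsequent connection.
PIN=$(openssl x509 -in /tmp/pin.crt -pubkey -noout \
      | openssl pkey -pubin -outform der 2>/dev/null \
      | openssl dgst -sha256 -binary \
      | base64)
echo "pin-sha256=\"$PIN\""
```

If an interceptor swaps in a different certificate, its public key hashes to a different value and the comparison fails, which is exactly the warning moment described above.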

(When speaking about trust, I must mention PGP. I am not aware of a way to use web-of-trust verification of the server certificate instead of CA verification, but it’s worth mentioning as a possible alternative.)

So, are self-signed certificates for small, read-only websites secure enough? Certainly not. But relying on a CA isn’t 100% secure either (breaches are common). And the balance between security and usability is hard to strike.

I’m not saying my proposal in favour of self-signed certificates is the right way to go, but it’s food for thought. The exploits such an approach could open up, even against properly secured sites (via MITM), must be considered with utter seriousness.

Interactive sites, especially online shops, social networks and payment providers, must obviously use a full-featured, CA-issued HTTPS certificate, and that’s only the bare minimum. But let’s not force that upon the website of the local bakery and the millions of sites like it. And let’s not penalize them for not purchasing a certificate.

P.S. EFF’s Let’s Encrypt looks like a really promising alternative.

P.P.S. See also in-session key negotiation.