Onions, Certs & Browsers: a three-way “Mexican Standoff”

Alec Muffett
5 min read · Jan 25, 2017

tl;dr — self-authenticated addresses mesh clumsily with hierarchical identity

New Tor Onion Addresses are coming. They will be 256 bits long, vastly more secure than their predecessors, and enable many new networking solutions.

They may (note: may) be secure enough to become eligible for issuance of SSL “DV” certificates, the kind that LetsEncrypt issues; and this will lead to interesting philosophical questions about trust on the internet.

What’s the background?

Onion addresses are self-authenticating.

When you type “https://www.facebookcorewwwi.onion” into TorBrowser, you are cryptographically certain, at the very level at which packets of data are exchanged, that you are talking to the “facebookcorewwwi” machine — because “facebookcorewwwi” is essentially a cryptographic network address. Onion communication is “end to end” encrypted, with built-in identity & trust.
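To make “self-authenticating” concrete, here is a minimal sketch (not Tor’s actual implementation) of how the forthcoming scheme from Tor proposal 224 derives an address directly from a service’s 256-bit ed25519 public key. Because the address encodes the key itself, no third party is needed to bind name to identity:

```python
# Sketch of next-generation onion address derivation, per Tor proposal 224:
#   address = base32( pubkey || checksum || version ) + ".onion"
import base64
import hashlib

def onion_v3_address(ed25519_pubkey: bytes) -> str:
    assert len(ed25519_pubkey) == 32                    # 256-bit public key
    version = b"\x03"
    checksum = hashlib.sha3_256(
        b".onion checksum" + ed25519_pubkey + version
    ).digest()[:2]
    return (
        base64.b32encode(ed25519_pubkey + checksum + version).decode().lower()
        + ".onion"
    )

# Any 32 bytes will do for illustration; a real service uses its own key.
print(onion_v3_address(bytes(32)))                      # 56 chars + ".onion"
```

The checksum and version constants come from the draft specification; the key point is that no registry or CA needs to be consulted to verify that an address matches its key.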

In contrast, when you type “https://www.foo.com/” into any browser, you are trusting many external entities:

  • a DNS provider to map “www.foo.com” to a network address
  • network providers (ISPs) to send packets to that network address
  • a hosting provider to route that network traffic to the proper machine
  • a certificate provider to have issued a “foo.com” certificate exclusively to the “Foo, Incorporated” organisation with which you want to browse/communicate with some surety.

This is a much more complex, multiparty trust model.
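For contrast, here is a rough sketch of that multiparty trust chain as it plays out in code for an ordinary HTTPS fetch; the hostname is the same hypothetical “www.foo.com”, and each step leans on a different external party:

```python
# Sketch only: whether "www.foo.com" actually answers is beside the point.
import socket
import ssl

hostname = "www.foo.com"                      # hypothetical example host

# 1. Trust a DNS provider to map the name to a network address.
addr = socket.getaddrinfo(hostname, 443, proto=socket.IPPROTO_TCP)[0][4]

# 2 & 3. Trust ISPs and the hosting provider to route packets to that
#        address and onward to the right machine.
raw = socket.create_connection(addr[:2])

# 4. Trust a certificate authority: the TLS handshake only proves we reached
#    *some* holder of a certificate that a CA was willing to issue for the name.
ctx = ssl.create_default_context()            # uses the system's CA trust store
tls = ctx.wrap_socket(raw, server_hostname=hostname)
print(tls.getpeercert()["subject"])
```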

What about Onions & SSL Certificates?

You can get a special kind of SSL certificate for an Onion site; it’s called an “EV” (Extended Validation) certificate. It’s the kind which often has a big green label attached to it in the browser toolbar.

When you connect to https://www.facebookcorewwwi.onion, the SSL certificate helps you be doubly-certain that the site is owned by Facebook, plus it unlocks certain site functionality like “secure cookie handling”.

You can’t yet get basic “DV” (Domain Validation) certificates for Onion sites; these are the certificates where the CA (Certificate Authority) basically attests that “yep, domain name seems to be attached to this person’s current application for a SSL certificate”.
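As an illustration of how thin that attestation is, here is a rough sketch (not Let’s Encrypt’s actual tooling; the token and thumbprint values are made up) of what an HTTP-01-style DV proof amounts to: the CA hands the applicant a token, the applicant serves it from a well-known URL on the domain being claimed, and the CA fetches it over ordinary DNS and HTTP to confirm control:

```python
# Sketch of an HTTP-01 style DV challenge responder; values are hypothetical.
from http.server import BaseHTTPRequestHandler, HTTPServer

TOKEN = "example-token-from-the-ca"                   # hypothetical token
KEY_AUTH = TOKEN + ".example-account-thumbprint"      # hypothetical key authorisation

class ChallengeHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == f"/.well-known/acme-challenge/{TOKEN}":
            self.send_response(200)
            self.end_headers()
            self.wfile.write(KEY_AUTH.encode())
        else:
            self.send_error(404)

# The CA then resolves the claimed domain via DNS and fetches the URL above;
# the whole proof rests on DNS and routing behaving honestly.
HTTPServer(("", 80), ChallengeHandler).serve_forever()
```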

The most common explanation of why Onions are not secure enough to have DV certificates issued is that the existing “80 bits truncated SHA1 hash” Onion address lengths are thought potentially brute-forceable. This would hypothetically permit a fake “facebookcorewwwi.onion” site to be set up, disturbing DV certificate proof-of-ownership protocols and disrupting trust; therefore the greater and deeper background checks of EV certificates are thought necessary. New 256-bit Onion addresses will be proof against this, and therefore may be DV-capable.
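Some back-of-envelope arithmetic makes the difference vivid (the guessing rate below is an arbitrary assumption, not a measurement): an old-style address is the first 80 bits of a SHA-1 hash of the service’s public key, so impersonating a specific address means generating keys until that truncated hash matches, whereas a new-style address encodes the whole 256-bit key.

```python
# Illustrative brute-force arithmetic only; the rate is an assumed figure.
ATTEMPTS_OLD = 2 ** 80        # match an 80-bit truncated SHA-1 onion address
ATTEMPTS_NEW = 2 ** 256       # effectively: recover the full 256-bit key

RATE = 10 ** 12               # assume a trillion candidate keys per second
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

print(f"80-bit:  ~{ATTEMPTS_OLD / (RATE * SECONDS_PER_YEAR):,.0f} machine-years")
print(f"256-bit: ~{ATTEMPTS_NEW / (RATE * SECONDS_PER_YEAR):.1e} machine-years")
```

On those assumed figures, the 80-bit case is expensive but conceivable for a well-resourced adversary, which is precisely the objection; the 256-bit case is hopeless.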

However, there was a constant background subtext to the SSL-Onion discussions — that the Certificate Authorities realise:

  • CRLs (Certificate Revocation Lists) do not work very well, and…
  • regarding trust, the Internet’s real whip hand is the existence of (a) DNS & (b) ISPs.

Certificates are widely considered to be the saviours of all matters identity-related, and the roots of all trust, on the internet; but they sit in tension with user-interface people who just want browser toolbars to say:

[Secure] https://www.foo.com/

…and to mean it with no ifs, ands, or buts.

This tension manifests in UX discussions with perspectives like:

EV Sucks Because People Have To Read The Green Label
And Then They Have To Think Hard About Their Choices
And All Our Data Tells Us That People Don’t Understand The Choices

So, what happens when a repressive state government somehow gets hold of an EV cert for “google.com”?

  1. Google Inc will want it taken down.
  2. The CAs will say “Oops, we’ll add it to the CRL but that won’t help”
  3. The last resort will be to blacklist traffic to the evil-fake-google at network-peering level, block it from mention in DNS, etc.
  4. <snark>“Certificate Transparency” advocates trumpet that if only people would update their browsers to check certificates in (yet another) database, none of this would still be a problem.</snark>

Now consider Onionspace:

If (hypothetically) I create a “https://www.bbcwwwofficialxxx.onion” site — acting as a rewriting proxy for traffic to/from the official BBC website — then:

  • there would be nothing that could be done to take it down…
  • there would be no DNS to ban it from, and…
  • if I got a DV certificate for it from LetsEncrypt, then all of the pages would be properly encrypted, look padlocked, etc.

What, then, exists to prove to a user that it is “not the real BBC”, other than lack of an EV sticker and/or lack of attestation from the main BBC website?

As ever, this is the problem of trying to prove a negative.

In the world of hierarchical identity, DNS & ISPs, the “solution” would be to beat up the DNS provider for “bbcwwwofficialxxx.onion” and get the DNS record removed — but in Onionspace there is no DNS to pressure.

For any journalists reading this: I would like to take a moment to spell out that there is nothing novel in this whatsoever; the mechanism of people setting up man-in-the-middle proxy websites is ancient, and large chunks of internet technology rely upon it (eg: corporate network filters, mobile ISP data compressors, caches, etc) — but in my experience the fact that a challenge is old and well understood has never stopped anyone from trying to pretend that it’s an insurmountable obstacle to change, when in fact they just… really… don’t… want… to… deal… with… change…
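For the avoidance of doubt about how ancient and pedestrian the mechanism is, here is roughly all that a rewriting proxy amounts to (a generic sketch pointed at a placeholder upstream and hostname; nobody’s production code looks quite like this):

```python
# Generic sketch of a rewriting reverse proxy: fetch pages from an upstream
# site and rewrite its hostname in the body before serving them locally.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

UPSTREAM = "https://example.com"              # placeholder upstream site
LOCAL_NAME = b"proxy.example.onion"           # placeholder local hostname

class RewritingProxy(BaseHTTPRequestHandler):
    def do_GET(self):
        with urlopen(UPSTREAM + self.path) as upstream:
            body = upstream.read().replace(b"example.com", LOCAL_NAME)
        self.send_response(200)
        self.send_header("Content-Type",
                         upstream.headers.get("Content-Type", "text/html"))
        self.end_headers()
        self.wfile.write(body)

HTTPServer(("127.0.0.1", 8080), RewritingProxy).serve_forever()
```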

So:

  • Onionspace needs SSL Certificates.
  • EV certificates are complex and expensive to obtain, and some folk think they are ugly (i.e.: not a simple yes/no boolean) & confuse users.
  • DV certificates would fill the gap, although the inherent takedown-resistance of Onionspace would complicate the use of standard mitigations against MITM phishing-sites; and “just look for a padlock” won’t cut it as security advice (not that it ever did).

But there’s more…

These two observations are themselves in tension with a third issue: browser manufacturers are increasingly locking technologies to be SSL-only.

WebRTC and HTTP/2 would be the two biggest cases in point, I think.

If Onion sites continue to be locked out of obtaining cheap, plentiful DV SSL certificates, then Onionspace will turn into a stagnant technological backwater.

Nerdy kludge-fixes like “make TorBrowser equate Onions with SSL” ignore the existence of many Tor platforms other than TorBrowser (eg: Orfox for Android, OnionBrowser for iOS) — and will cause other friction issues related to lack of orthogonality with the rest of the Internet (e.g.: third-party CDNs fretting about sending HTTPS WebRTC traffic to a site with supposedly SSL-equivalent “Origin: http://foofoofoofoofoo.onion” headers).

Summary

  • TorBrowser needs SSL-over-Onion (and therefore certificates must be easily/cheaply available to Onion Sites) in order to use certain new browser technologies and for Onionspace to remain technologically “current”.
  • There will likely be concerns about how to reconcile DV Onions with the lack of mitigation for the above MITM scenario; but trying to restrict issuance of SSL certificates in the face of SSL-locked browser features is not viable if Onionspace is to grow.
  • Big green EV labels constitute an attractive yet possibly doomed “middle ground” solution for identity attestation, viz: “It is/isn’t the real BBC Onion Site because the label says so”; and the Security UX community may revolt against them.

I believe that the only way forward is:

  • to have orthogonality of SSL certificate availability in DNS & Onionspace
  • for Onion sites to be treated by browsers like any other hostname
  • to raise awareness and maintain respect for all these issues as the discussions progress.
