Binary Transparency for Firefox (wiki.mozilla.org)
235 points by _jomo on March 29, 2017 | hide | past | favorite | 38 comments


If I'm understanding correctly, the plan is to piggy-back on top of the existing Certificate Transparency [0] infrastructure by issuing a regular X509 certificate per Firefox release, but for a special domain name that includes a Merkle tree hash for the files in that release, with a known suffix (".fx-trans.net").

In that manner they can piggy-back on top of the CT ecosystem (including existing logs, including existing search / monitoring tools, and presumably gossip if/when that's solved).
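To make the scheme concrete, here's a rough Python sketch of the idea as I understand it: hash each file in the release, combine the hashes into a Merkle tree (using RFC 6962's domain separation), and embed the hex-encoded tree head in a domain name under the known suffix. The exact tree construction and label format Mozilla uses are assumptions on my part; note that a DNS label can't exceed 63 characters, so the 64-char SHA-256 hex digest is split across two labels here.

```python
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list) -> bytes:
    """Merkle tree head over the release's file contents, with
    RFC 6962 domain separation (0x00 for leaves, 0x01 for nodes)."""
    if not leaves:
        return sha256(b"")              # empty-tree hash, per RFC 6962
    nodes = [sha256(b"\x00" + leaf) for leaf in leaves]
    while len(nodes) > 1:
        paired = [sha256(b"\x01" + nodes[i] + nodes[i + 1])
                  for i in range(0, len(nodes) - 1, 2)]
        if len(nodes) % 2:              # odd last node is promoted unchanged
            paired.append(nodes[-1])
        nodes = paired
    return nodes[0]

def release_domain(file_contents: list) -> str:
    """Hypothetical label format: split the 64-char hex digest into
    two 32-char DNS labels under the known suffix."""
    hexroot = merkle_root(file_contents).hex()
    return "{}.{}.fx-trans.net".format(hexroot[:32], hexroot[32:])
```

A CA (e.g. Let's Encrypt) would then issue an ordinary certificate for that name, and the resulting CT log entry becomes the public, append-only record of the release.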

This seems like a really cool hack! The state of binary software distribution is really pretty scary when you think about it - techniques like this have the potential to restore a lot of confidence.

[0] http://www.certificate-transparency.org/


Specifically, Certificate Transparency makes it possible to detect SSL certificates that have been mistakenly issued by a certificate authority or maliciously acquired from an otherwise unimpeachable certificate authority. It also makes it possible to identify certificate authorities that have gone rogue and are maliciously issuing certificates.

Interesting. I assume this either helped with the evidence for - or was developed because of - the whole Symantec CA dustup going on?


CT significantly pre-dates the recent Symantec issues, but yes, it does provide an excellent tool for providing evidence of misissuance [0] [1] - and that's the crux of it - in order for a certificate to be considered valid in a CT world, it must present proof that it has been publicly logged.

[0] https://security.googleblog.com/2015/09/improved-digital-cer... [1] http://searchsecurity.techtarget.com/news/450411573/Certific...


Correct. I believe Certificate Transparency existed prior to any of the issues with Symantec, but CT was indeed involved in exposing some of the Symantec shenanigans.

https://arstechnica.com/security/2017/01/already-on-probatio...

> Ayer discovered the unauthorized certificates by analyzing the publicly available certificate transparency log

That article also links to the primary source, https://www.mail-archive.com/dev-security-policy@lists.mozil... which in turn links to a public viewer for Certificate Transparency logs.


The initial impetus was actually a design to allow a CA to be transparent about its own operations. However, the DigiNotar incident triggered the plan to apply it to all CAs.


Knowing quite little about the technicalities behind CT, I'm interested in the scalability of this. If CT were to be piggybacked upon by a large number of open source binary software distributions, I assume this wouldn't be problematic in any way. CT is already designed - I guess - to handle theoretically all domains. Plus, Firefox is a pretty big, popular distribution to be starting with.


CT logs are designed to be able to handle queries from all web browsers on a daily / more frequent basis, and the output from queries is easily cacheable (and the logs can be mirrored in a read-only manner).

If FF is already doing any log inclusion proofs for certificates, then I think including one more (for the FF release itself) would be pretty much line noise.

I think an interesting question arises as to how well the CT logs themselves would scale to handle the same kinds of certificates for all binaries, if this ends up taking off as a good idea in general. They've had to handle quite an explosion in X509 certificates over the past year or two due to Let's Encrypt. Some of Google's logs now show more than 80,000,000 certificates [0] - IIRC two years ago it was in the low single-digit millions.

[0] https://crt.sh/monitored-logs


I actually think that building an independent system for binaries is a better plan, for various reasons.

One is that log bloat is indeed a problem, not so much for the logs, but for those that want to monitor them.

The other is that CT has made some tradeoffs to allow cert issuance to be quick. I don't believe binaries need the same tradeoffs; for example, instead of an SCT, they should come with an inclusion proof (something I'd like to see for certs, too, in the long run).


Yes, very cool. I believe we should do something similar for web applications. I wrote a blog post about that a while back:

http://blog.airbornos.com/post/2015/04/25/Secure-Delivery-of...


This is a fantastic step forwards for Binary Transparency, which I hope is followed by Linux distros and package managers, so all Free Software gets the benefit.

The one worry that comes to mind, though, is that once a binary transparency log check is made mandatory for any update to a piece of software, there is a risk that a bug in the log checking code makes it impossible to ever upgrade the software again. (This reminds me of the HPKP Suicide attack, but is not quite the same).

Obviously it should be possible, with Firefox at least, to manually download a new copy of the installer and install it from scratch, but I feel there should be a fall-back mechanism where, say, a release signed with a special offline key should be allowed to skip the transparency check (perhaps only if the transparency check has been failing on an offered upgrade for more than a month).


A bug in the log checking code is no worse than a bug in your signature verification code. But obviously, denial-of-update attacks on the log infrastructure should be mitigated in some way before this is mandatory.

An up-side to not having a fall-back mechanism is that you can't produce a secret update, no matter how many $5 wrenches the NSA can afford (https://xkcd.com/538/).


I wonder how much effort it would take them to actually get to fully reproducible builds.

edit:

here, https://bugzilla.mozilla.org/show_bug.cgi?id=885777


As long as PGO (profile guided optimization) is still a thing that yields a significant speed boost, a lot.


The profile data used could be published.


This is all good and nice, especially if reproducible builds come true. But the devil is in the extensions. They are the weakest link.


Binary transparency seems like a nice thing to have, though it's quite limited in scope for Linux, as distros usually compile from source. It's even more limited given that Mozilla knowingly makes controversial choices, telling unhappy users and distros they can recompile with a build flag until the code gets stripped out.

IMHO Mozilla should orient its transparency effort towards its decision process first, so we don't end up with a binary-transparent browser no one uses because management decided to remove user choice and break the UI (to look more like Chrome), break the extensions that contributed to Firefox's success (to be more like Chrome), require PulseAudio and drop ALSA, and so on.


All the examples you mentioned were communicated pretty transparently in Bugzilla, developer blog posts and announcement blog posts. Just because you disagreed with them doesn't mean the reasoning wasn't public, or that you couldn't have contributed to it.


Why not just use openpgp signing instead like most GNU/Linux distros?


Signing a binary allows the updater to verify that it came from a trusted source, but it doesn't tell you whether they gave you the same binary as everyone else, or a custom one.

Binary transparency ensures that there's a complete, public list of all updates sent out. It's an additional level of verification showing that the source isn't up to any shenanigans.

It's more of a deterrent though. It doesn't prevent sending a custom update; it just makes it difficult to hide.
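As a tiny illustration of the difference: a signature check answers "did the vendor sign this?", while comparing your binary's hash against the one recorded in the public log answers "did everyone get the same thing?". A rough Python sketch, with `logged_hash` standing in for whatever the transparency log records (the names here are made up for illustration):

```python
import hashlib

def release_hash(path: str) -> str:
    """SHA-256 of the downloaded binary, streamed in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def same_as_everyone(path: str, logged_hash: str) -> bool:
    # logged_hash comes from the public, append-only log; a valid
    # signature alone can't tell you whether others got the same bytes.
    return release_hash(path) == logged_hash
```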


I think one threat model that Binary Transparency tries to counter is an attacker who gets hold of the signing key (or coerces its holder) to sign a malicious update, which is then sent to a targeted victim (via a compromised mirror site, for example).

If people are combining Reproducible Builds with Binary Transparency, then the attacker probably has to release the same binary to everyone, and release the source code containing the malicious change.

It remains to be seen whether enough people would audit the source code diffs of each release of Firefox, say, to stop a malicious update from affecting a large number of users. In particular, mechanisms would need to be put in place to stop users updating to a release which was discovered (or reported and then verified) to be compromised.


Really interesting hack. It basically gives (almost) free timestamping (using Let's Encrypt for cert issuance and CT logs for storing information). Previously one would use Bitcoin OP_RETURN outputs for timestamping [0].

[0]: https://en.bitcoin.it/wiki/OP_RETURN


Is Debian / Ubuntu doing anything like this?


There is an effort in Debian to make all the packages' builds reproducible, which should automatically propagate to Ubuntu for most packages.


Funny aside - the debian reproducible build effort has uncovered bugs like this: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=848721


What's wrong with just doing SHA1?


If the binary you're downloading might have been modified, how do you know that the hash you're checking against hasn't been as well?


Sha1 collisions...


what's wrong with doing <insert hash function>?


Making a hash of the release is just a small part of it (and is the first part of what they are doing).

The trick is to be confident that you're getting the same hash as everyone else - and that's what requiring proof that it has been added to a CT log gives you some level of assurance about.


The stated goal is to enable someone to verify "that they have gotten the same version as the rest of the world and not a special, possibly compromised version." This is actually two goals: (1) verify that your version is the same as everyone else's, and (2) verify that that version is genuine.

Why should one care about (1)? All that really matters is (2). As long as I'm using a genuine release, does it matter what the rest of the world is using? Unless I wish to establish trust in a binary based on how popular it is, or unless I care about interoperability between the version I have and the version others have, it doesn't really matter what version everyone else has.

I wonder if the author has heard about Nix or Guix? The purely functional software deployment model pioneered by Nix solves (2) trivially, for practically all applications in general, not just Firefox specifically. It also solves many other problems in the field of software deployment that this article doesn't even mention.

Long story short, don't reinvent the wheel. Use Nix or Guix. Learn more by reading the first chapter of Eelco Dolstra's thesis, which describes the problems and how the Nix model solves them:

https://nixos.org/~eelco/pubs/phd-thesis.pdf

Edit: Even if one is concerned about (1), the Nix model enables ways to verify that the origin is actually sending a binary that was built from the source it claims to use. For example, consider "guix challenge":

https://www.gnu.org/software/guix/manual/html_node/Invoking-...


The reason to care about getting the same binary as the rest of the world is that it increases the likelihood that an attack will be detected.

In the case with neither binary transparency or reproducible builds, a nefarious actor can target a single user with a tainted binary and it's unlikely that the user will find out and difficult for them to rule out the possibility of tampering up-front.

In the case with binary transparency but no reproducible builds, a nefarious actor must target all users which makes it more likely that someone will notice, but still difficult for people to rule out tampering up-front.

In the case with reproducible builds but no binary transparency, it's easy for people who are paranoid to rule out tampering with the binary, but people who aren't paranoid are unlikely to discover that their specific binaries were tampered with, so a targeted attack will still probably go undetected.

In the case with both reproducible builds and binary transparency, it only takes one paranoid person discovering a tampered binary to alert the whole world that their own binaries have been tampered with. It's safety in numbers, even for those not technically-literate enough to determine (or even suspect) tampering.


Thank you for the clarification. I can see from your examples why binary transparency is a useful concept worth considering in its own right. I still suspect there is a huge amount of overlap between the problems the author is trying to solve and the ones that Nix/Guix has already solved (especially the way they want to use a hashing algorithm to identify the release). I'll bet a general solution for binary transparency could be built - a solution from which practically all software could benefit, not just Firefox in particular - by building on top of (or at least learning from) the base that the purely functional software deployment model, as pioneered by Nix, has already given us.

I am not simply saying "They should use Nix" as if that would magically accomplish their goals. I am saying that they could build on top of, or at least learn from, the novel techniques that Nix has contributed to the field of software deployment.


One of the people involved in the reproducible builds project is a NixOS committer. I'm fairly certain they're aware of Nix/Guix.


Does the reproducible builds project have a hand in the project to give Security/Binary Transparency to Firefox? I ask because I don't know, and I saw no language suggesting that on the page linked.


The traditional answer is that if your version is different from others', it might be because (1) the developer is trying to attack you, including (1a) the developer wants to attack you, (1b) someone forced the developer to make a custom version of the software in order to attack you, or (1c) someone compromised the internal processes of the developer in order to attack a small group of users (in a way that reduces the chance that the compromise will be discovered by the developer or by others).

This also includes, perhaps, (1d) there are some secret antifeatures in the software whose existence the developer hopes to conceal from the general user population.

For some of these cases, "you" might include not just one person, but also users in a particular country, language community, or income bracket.

Edit: I agree that there may be technical solutions other than binary transparency in particular that can also address some of these concerns.


I can understand why those scenarios would be concerning. Ultimately, what matters depends on the threat model. I believe that the Nix model can be used as a base for solving issues like the one presented here, and that it can be done in generality, for a wide spectrum of software.


Reproducible builds (which it seems like Nix focuses on) are useful, but they don't address the problem of whether you got the same binary or source as everyone else at all.


Nix's functional software deployment model is a useful tool for building software reproducibly, but reproducible builds are neither the primary motivation for nor the primary goal of the Nix model. For information on what problems it aims to solve, how it solves them, and how it can be applied in various useful ways (e.g., a package manager is just one particular thing you can implement using the Nix model), I suggest you read the first chapter of the thesis I linked earlier. It's very interesting, and I think you'll find that the model can be applied to problems like this; it's not just about building software reproducibly, although software that builds reproducibly is conducive to the model.



