Trust but Verify

Background

It is becoming increasingly difficult to trust the privacy properties of software and services we rely on to use the Internet. Governments, companies, groups and individuals may be surveilling us without our knowledge. This is particularly troubling when such surveillance is done by governments under statutes that provide limited court oversight and almost no room for public scrutiny.

As a result of laws in the US and elsewhere, prudent users must interact with Internet services knowing that, no matter how much any cloud-service company wants to protect privacy, at the end of the day most big companies must comply with the law. The government can legally access user data in ways that might violate the privacy expectations of law-abiding users. Worse, the government may force service operators to enable surveillance (something that seems to have happened in the Lavabit case).

Worst of all, the government can do all of this without users ever finding out about it, due to gag orders.

Implications for Browsers

This creates a significant predicament for privacy and security on the Open Web. Every major browser today is distributed by an organization within reach of surveillance laws. As the Lavabit case suggests, the government may request that browser vendors secretly inject surveillance code into the browsers they distribute to users. We have no information that any browser vendor has ever received such a directive. However, if that were to happen, the public would likely not find out due to gag orders.

The unfortunate consequence is that software vendors — including browser vendors — must not be blindly trusted. Not because such vendors don’t want to protect user privacy. Rather, because a law might force vendors to secretly violate their own principles and do things they don’t want to do.

Why Mozilla is different

Mozilla has one critical advantage over all other browser vendors. Our products are truly open source. Internet Explorer is fully closed-source, and while the rendering engines WebKit and Blink (Chromium) are open source, the Safari and Chrome browsers that use them are not fully open source. Both contain significant fractions of closed-source code.

Mozilla Firefox in contrast is 100% open source [1]. As Anthony Jones from our New Zealand office pointed out the other month, security researchers can use this fact to verify the executable bits contained in the browsers Mozilla is distributing, by building Firefox from source and comparing the built bits with our official distribution.

This will be most effective on platforms where we already use open-source compilers to produce the executable, to avoid compiler-level attacks of the kind Ken Thompson described in his 1984 lecture “Reflections on Trusting Trust”.
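The verification step described above can be sketched as a short shell script. This is only a sketch under assumptions: the file names and the `verify_build` helper are illustrative, not Mozilla's actual release layout; a real check would build the tagged source with the same toolchain and compare the resulting artifact against the official download.

```shell
#!/bin/sh
# Sketch: compare a locally built artifact against the official binary by
# SHA-256 digest. File names here are placeholders, not Mozilla's layout.

verify_build() {
    # Digest of the artifact we built ourselves from audited source.
    local_digest=$(sha256sum "$1" | cut -d' ' -f1)
    # Digest of the binary actually being distributed.
    official_digest=$(sha256sum "$2" | cut -d' ' -f1)
    if [ "$local_digest" = "$official_digest" ]; then
        echo "MATCH: distributed bits correspond to the audited source"
    else
        echo "ALERT: distributed bits differ from the audited source"
    fi
}

# Demo with placeholder files standing in for the two tarballs.
printf 'same bits' > local-firefox.tar.bz2
printf 'same bits' > official-firefox.tar.bz2
verify_build local-firefox.tar.bz2 official-firefox.tar.bz2
```

With deterministic builds the two digests agree bit-for-bit; any divergence is worth investigating.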

Call to Action

To ensure that no one can inject undetected surveillance code into Firefox, security researchers and organizations should:

  • regularly audit Mozilla source and verified builds by all effective means;
  • establish automated systems to verify official Mozilla builds from source; and
  • raise an alert if the verified bits differ from official bits.

In the best case, we will establish such a verification system at a global scale, with participants from many different geographic regions and political and strategic interests and affiliations.
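One hedged sketch of what such global-scale verification could look like: suppose each independent verifier publishes the digest it computed for a given release (the `digest.<org>` files and the `check_consensus` helper below are hypothetical). A release is trusted only if every verifier, across jurisdictions, reports the same digest.

```shell
#!/bin/sh
# Sketch: cross-check digests reported by independent verifiers. The
# digest.<org> files are hypothetical; each would be published by a
# separate organization in a different jurisdiction.

check_consensus() {
    # Count how many distinct digests the verifiers reported.
    distinct=$(cat "$@" | sort -u | wc -l)
    if [ "$distinct" -eq 1 ]; then
        echo "OK: all verifiers agree"
    else
        echo "ALERT: verifiers disagree; a build may have been tampered with"
    fi
}

# Demo: three hypothetical verifiers report the same digest.
echo "3a5fc1d2" > digest.eu
echo "3a5fc1d2" > digest.us
echo "3a5fc1d2" > digest.nz
check_consensus digest.eu digest.us digest.nz
```

A single dissenting digest is enough to raise the alert, which is what makes geographic and political diversity among verifiers valuable.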

Security is never “done” — it is a process, not a final rest-state. No silver bullets. All methods have limits. However, open-source auditability cleanly beats the lack of ability to audit source vs. binary.

Through international collaboration of independent entities we can give users the confidence that Firefox cannot be subverted without the world noticing, and offer a browser that verifiably meets users’ privacy expectations.

See bug 885777 to track our work on verifiable builds.

End-to-End Trust

Beyond this first step, can we use such audited browsers as trust anchors, to authenticate fully-audited open-source Internet services? This seems possible in theory. No one has built such a system to our knowledge, but we welcome precedent citations and experience reports, and encourage researchers to collaborate with us.

Brendan Eich, CTO and SVP Engineering, Mozilla
Andreas Gal, VP Mobile and R&D, Mozilla

[1] Firefox on Linux is the best case, because the C/C++ compiler, runtime libraries, and OS kernel are all free and open source software. Note that even on Linux, certain hardware-vendor-supplied system software, e.g., OpenGL drivers, may be closed source.

43 Replies to “Trust but Verify”

  1. “establish automated systems to verify official Mozilla builds from source”
    => I imagine a large part if not all of Mozilla’s current automated build infrastructure is open source (+monitoring, metrics, update infrastructure). This is worth auditing as well of course, especially the update infrastructure actually…
    Providing a link to all of that in your blog post may help the organizations you’re calling to action (so they don’t re-invent the wheel and the cost of replicating such infrastructure is made lower).

    “Every major browser today is distributed by an organization within reach of surveillance laws”
    => This is true, but it can be partially defeated if several independent (not owned by Mozilla) organizations ship the same software from different countries. Mozilla could even outsource building and distribution to trusted foreign companies (not subject to the same laws) that are not owned by Mozilla. The decentralization, and the ability for each organization to HTTP 302 to a more trusted one, might be largely enough to protect users… until all governments agree on a worldwide surveillance agreement?
    It will even be a necessary measure if security researchers do raise an alert.

  2. Maybe we should include more information about the environment in about:buildconfig to verify the build. Notably we could add |$(CC) --version| in there.

  3. Just a heads-up that anyone who is interested in contributing to security activities related to Firefox or Mozilla, or looking for feedback on security research they are doing related to Mozilla products and services, is welcome to reach out to our security teams via security@mozilla.org!

  4. To play devil’s advocate: given the unsafety of a language like C++ and the size of a modern browser’s codebase, and given that one only needs one subtle vulnerability to make the entire stack of cards come tumbling down, can we actually rely on auditing to find such issues? The Underhanded C Contest relies upon the subtlety of such issues, after all.

    I’m not going to claim that verifying the entire browser isn’t a very worthwhile effort (and it would likely turn up several other issues, too!); I’m just sceptical as to whether it’s practical to reliably catch everything that could bring down the stack of cards.

  5. Sorry, no browser can be used as a “trust anchor” in a communications chain unless it fully controls the hardware, which is usually the job of the OS and its underlying drivers.

    Browsers are just a link in the chain, which can be weak or strong depending on their design and implementation, but they cannot stop “end-run attacks” via the likes of “device shims”.

    And whilst being a “trusted link” is a laudable goal, the major “closed source” issue is, at the end of the day, the “binary blobs” from chip manufacturers that infest device drivers. These blobs are present not just in mainstream closed-source OSs but also in some open-source OSs, as it’s the only way some graphics cards can be used effectively.

    Having looked into this issue last century in quite some detail, it becomes clear that the human has to be part of the trust chain, and this requires fully independent side channels for full two-way authentication to avoid “end-run attacks”. Trying to find “fully independent” side channels in this age of smartphones and tablets is at best difficult; making them usable as well as secure is well-nigh impossible…

  6. The auditability properties you discussed are more important today than ever. The problem I see with hosted web apps as they exist today is that they are not auditable. See this blogpost here.

    https://tonyarcieri.com/whats-wrong-with-webcrypto

    That’s why I see so much potential in packaged apps with static, auditable versions (both for Firefox and for Chrome). Unfortunately they are not yet deployable in the Mozilla Marketplace for Desktop and Android (only Firefox OS), which makes it hard for service providers to deploy auditable cross-platform apps. The only choice currently is to employ a vast number of different packaging systems such as PhoneGap, Chrome, and Firefox, which increases the cost of portability (something the web is generally better at).

    That’s why for 2014 I’d really love to see better cross-platform support for Firefox packaged apps, as well as adoption of the W3C standard by other vendors:

    https://www.w3.org/2012/sysapps/runtime/

  7. It seems like this just shifts the point of attack from Mozilla to the plugin and operating system vendors (except for the negligible percentage of Firefox users who run a completely open-source platform like GNU/Linux and don’t install any binary plugins like Flash Player).

    It’s still a necessary step toward a solution, but it seems like it won’t actually meaningfully increase security of typical users unless they also take much more drastic steps along with it.

  8. Verification and auditing are a great idea; but Mozilla engineers currently are the ones with the expertise needed for the first steps so Mozilla needs to get the ball rolling.

    You are asking the community to “regularly audit and verify builds” of Mozilla products, something that takes a lot of work. This is made exceedingly harder by the confused state of the builds that poorly isolate the code used to generate the product from the code for tools, profilers, and utility binaries.

    Firefox OS, the new Mozilla operating system providing an HTML5 execution environment for applications, currently clocks in at 10GB of original source code repositories across around 100 projects (at least for one hardware device). The current build system is one big, poorly documented monolith that builds profilers and runtime tools (some based on Eclipse), Android binaries for interaction with the phone, and other useful binaries that are not part of the OS, at the same time as building the kernel, a root disk of Linux binaries, the system image consisting mostly of Gecko along with some binary blobs, and the user-visible shell and other applications. The build instructions currently are:
    ./config.sh
    ./build.sh
    ./flash.sh
    For those of us unable to ‘trust’ these instructions, figuring out what is supposed to happen takes a long, long time made all the harder because the Mozilla portions of the build chain are not documented. Given this state, it becomes the work of the community not just to ‘audit and verify’ but to ‘reverse engineer, simplify, and isolate’ the relevant parts before even starting their real work.

    So, will the two of you push Mozilla to dedicate engineering resources to make this ‘verify’ easier? Will you push Mozilla to perform the work to isolate the build and code of the product from the build and code for the tooling? Do you consider it Mozilla’s responsibility in this process to help develop and document an auditing methodology?

    It would be nice to know, beyond the call to arms, how central the two of you consider this work in the context of all the other work Mozilla needs to do. It would also be nice to know the extent to which Mozilla will be pitching in to this.

  9. I’ll echo what Adrian and others have said.

    Mozilla has the huge advantage of being completely open-source, true, but it also has the significant disadvantage of being incredibly complex, even for a web browser. It’s a huge, largely monolithic C++ application, with (from what I’ve seen) unimpressive documentation. I can barely *build* it.

    I searched on developer.mozilla.org just now to see if the documentation situation had improved since I last played with it. The first page I found was one describing the “source code directories overview”. It said that the Windows source code is for Windows 95, 98, and NT4, and the Mac source code is for both 68K *and* PPC. Yikes!

    As a programmer, I’ve often wanted to get involved with Firefox. I just don’t have the time to figure out how any more. It looks like it would take me about 10 years to figure anything out, and it’s getting more complex every day. From where I sit, complex and undocumented open source isn’t really that much more auditable than a closed source binary. With either of them, I’d have to spend the first few months reverse-engineering with my debugger.

    My response to this request is: if you want people to audit your code, you’ve got to simplify, and abstract, and document it. You can’t realistically drop 10 million lines of poorly-documented C++ on the world and expect an audit to occur.

  10. Note that unless we publish the PGO data, or stop doing PGO, people are far from being able to reproduce our builds. Plus, on Linux, typical users are getting Firefox from distros, not us, which multiplies the number of different binaries available for the same source code.

  11. Announcing an offensive proof will be the right way!

    Sure, the good old times when programmers were programmers are history, but worth remembering. The sceners sit in front of a mountain of byte-junk, only parts of which can be verified.

    Breaking the silence will be simple.

    The open way, via plugin/addon, is the simplest way to infiltrate any system. Noobs are often just happy to have something that runs; that says nothing about the programming style or quality of a product.

    The closed way is manipulation in transport. You can have clean source and clean binaries, but it is all lost when the transport path is compromised.

    The ugliest way is the use of subtle components. One of the holy pieces is the randomizer; there are more of them.

    Please take care, and remember that the Internet you use today was, in its history, conceived as a military one.

    I will warn here of a hidden human component. In 1989 the Wall came down. Seven days before that magic day fired the emotions, the red sector of divided Germany deleted the identities of elite groups in a psychological operation. They got new identities and were placed around the world. The signature of mind in the NSA’s working methods could be classified and identified as that of East German zero-agents.

    In 1989 the Americans said of the East German operatives that they were at the highest level worldwide at that point.

    Read my words without anger; respect the content in context!

  12. Thank you for sharing these excellent ideas with us!

    About Firefox for Android: There really needs to be a download option on the mozilla homepage for the apk. Currently you cannot use it on Android devices without a Google account. I want to use Firefox to protect my privacy, so binding myself to Google is not an option for me.

    Perhaps open source software needs something like trusted app stores in general?

  13. Firefox urgently needs independent evaluations for every single addon, with the addon visibly marked in the addons page list if it “phones home”.

  14. First batch of replies:

    @AGUMONKEY: then we’ll have a giant super-OMeta++-compiler to audit, but at least it’s the one true Unicorn to audit ;-).

    @DOMINIC: see the bug, we are working on it but we’re not there yet. Tor’s work is known, but we need cloud-hosted CI and v. fast builds. This can be done with determinism; it’s just not how Tor did it.

    @GEOFFREY: see the “No silver bullets” and Denning links. Also see the cryptography@metzdowd list where underhanded C examples were cited. Mozilla code review should (and generally does) reject such code. We are not perfect, but we’re not suckers for C (or C++ mainly) that pushes too far.

    @CLIVE: we’re aware of the nature of the problem in general, and the many other threats, but our conclusion is that deterministic builds, auditing, and verifying binaries are important. Lots of problems to work on, in a sane priority order. Good to have open source for maximum help and advice along the way.

    /be

  15. First, you could release a version of Firefox, say 23, that doesn’t have the SQL database.

    I think that having that untouchable database of visited sites and downloads is the most dangerous thing that could easily be gotten rid of.

    I hate the awesome bar anyway, the autosearch function is a major privacy hazard, and adjusting the frecency to just get bookmark search is pretty hard.

  16. I just wanted to note that I’m getting two separate certificates for this site using the Perspectives plugin, which might be an indication of a MitM attack.
    the certificate fingerprints are:
    3a:df:5b:02:7c:23:56:a2:14:94:00:cb:73:05:54:08 and
    41:47:66:3a:b6:00:46:e6:26:ed:e7:6e:38:8f:dc:eb (which is the one I’m getting, signed by geotrust)

  17. “Mozilla Firefox in contrast is 100% open source”

    Haven’t you just made the perfect argument for opposing the inclusion of EME in HTML5? The BEST case I’ve heard for EME is that it minimizes the amount of totally opaque, proprietary code, but guarantees that it can never be zero.

    As many have pointed out, Firefox is just one link in a chain. No point touting the security of that link, if you don’t work to keep the others clean as well.

  18. Not to start a flamewar, but since your point here is freedom, why would you speak of “open-source” instead of “Free Software” ?

    Also, since Mozilla cares about privacy, why couldn’t extensions such as RequestPolicy (https://www.requestpolicy.com/) be enabled by default ?

  19. Hi
    @ Brendan & Andreas:
    Well U can’t tell us if the valid Version of Firefox has a backdoor or not.
    But it’s not forbidden to tell us a version of firefox we can trust, for whatever reason @ all 😉
    So feel free to tell us ^^

    big hugs @ all, also for those poor guys that be the slaves of their jobs that makes them observe all & everything – my advise: free your conscience and I promise you’ll be loved^^

  20. In reply to “A”:

    short version: Having 2 fingerprints for this website is normal/expected due to SNI.

    long version:

    This website is on a shared host (single public IP) which serves many other websites. For clients which do not support SNI (such as SSLv3 clients), we’re serving a generic certificate, which has a different fingerprint than the TLS certificate.

    SNI stands for server name indication, which lets the webserver know which website is being requested (https://en.wikipedia.org/wiki/Server_Name_Indication).

    Note also that, coincidentally, we updated the TLS certificate on Monday, as the previous certificate was about to expire; thus it also has a new fingerprint (and it comes from a different CA).

    Hope this clarifies it.
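For readers who want to reproduce this kind of check themselves, a hedged offline sketch (host names and file names below are placeholders, not this blog's actual setup): two distinct certificates necessarily have distinct fingerprints, so seeing two fingerprints for one site is not by itself evidence of a MitM. Against a live server, `openssl s_client` with and without `-servername` shows the SNI and non-SNI certificates respectively.

```shell
#!/bin/sh
# Sketch: generate two self-signed certificates standing in for a shared
# host's generic certificate and its SNI certificate, then compare their
# fingerprints. Names are placeholders. A live check would look like:
#   openssl s_client -connect host:443 -servername host </dev/null |
#       openssl x509 -noout -fingerprint
# (omit -servername to behave like a client without SNI support).

openssl req -x509 -newkey rsa:2048 -nodes -keyout generic.key \
    -out generic.crt -subj "/CN=generic.example" -days 1 2>/dev/null
openssl req -x509 -newkey rsa:2048 -nodes -keyout sni.key \
    -out sni.crt -subj "/CN=blog.example" -days 1 2>/dev/null

fp_generic=$(openssl x509 -in generic.crt -noout -fingerprint -sha256)
fp_sni=$(openssl x509 -in sni.crt -noout -fingerprint -sha256)

# Different certificates, different fingerprints.
echo "$fp_generic"
echo "$fp_sni"
```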

  21. @FooBar: Twitter won’t verify my account without some kind of conflict with an impostor or lookalike account, or without other conditions that amount to paying for it. If you follow my account, you’ll see me interact often, so it’s clearly my real account. @BoredBrendanJS is of course not real (and alas not as funny as it should be — but @FakeAlexRussell is great for a fake account).

    /be
