My dotJS 2017 Keynote

Yesterday in Paris, I gave the closing keynote at the dotJS conference. I’ve had the privilege of speaking at dotJS every other year since 2013.

COME OUT TO THE COAST... PUT SCHEME IN THE BROWSER
A Brief History of JavaScript

Click above for a PDF of my slides (sorry, I used Keynote for several reasons, and its generated HTML is huge and not likely to work well with WP). Long-time readers may notice that I re-presented a few still-on-point slides from my TxJS 2011 talk. Video will come in a few weeks, the organizers say.

dotJS 2017 was a terrific TED-style conference with top speakers and a smart, friendly audience. Special shout-out to Christophe Porteneuve for a great intro and for MCing in style — and of course many thanks to Sylvain Zimmer and team for the whole event.

The Render Token

I wrote about OTOY over four years ago, in “Today I Saw The Future”. Since then, I have been inspired by the commitment of the founders Jules Urbach and Alissa Grainger to the vision that Jules enunciates:

“… to render and remix simulated reality as effortlessly as the web did for text and digital media.”

Now, OTOY is building RNDR, its own GPU-cloud rendering token, to decentralize AR/VR, game, and movie rendering to over 7 million GPU owners. The advantages of RNDR align with those of the BAT:

  1. Efficiency: Utility tokens unlock access to idle or mispriced resources, e.g., GPUs for RNDR, user attention for BAT.
  2. Fraud resistance: Tokens act as a low-fraud unit of account with payment only after blockchain-attested verification of work.
  3. Social credit: Pre-creating a pool of tokens not for sale endows users with tokens by fiat.

(A speculative technical note on efficiency: RNDR looks well-timed, in view of GPU service lifetimes, to improve efficiency further by replacing Ethereum’s Proof of Work mining jobs with rendering jobs, as I suggested on Twitter.)

Decentralized rendering requires verification of results. Tokens do not flow until the render-job’s author confirms quality of results via sampling and testing. Renderers gain and lose scored reputation based on degree and quality of jobs completed.
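
To make that flow concrete, here is a minimal sketch in JavaScript of the verify-then-pay loop described above. Every name in it (escrow, sampleFrames, the reputation fields) is hypothetical and illustrative only, not RNDR’s actual API:

    // Illustrative verify-then-pay flow: tokens move only after the job's
    // author confirms quality on sampled output. All names are hypothetical.
    async function settleRenderJob(job, renderer, escrow) {
      const frames = await renderer.render(job.scene);      // do the work
      const samples = sampleFrames(frames, job.sampleRate); // author spot-checks a subset
      const passed = await job.author.verify(samples);      // quality confirmation

      if (passed) {
        await escrow.release(renderer.address, job.tokenAmount); // tokens flow only now
        renderer.reputation += job.weight;                       // scored reputation rises
      } else {
        await escrow.refund(job.author.address, job.tokenAmount);
        renderer.reputation -= job.weight;                       // and falls on failure
      }
      return passed;
    }

    // Hypothetical helper: pick a deterministic subset of frames to check.
    function sampleFrames(frames, rate) {
      const step = Math.max(1, Math.round(1 / rate));
      return frames.filter((_, i) => i % step === 0);
    }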

Decentralized rendering requires confidentiality. In one of those technological ironies for which I live, the same kind of security hardware (e.g., ARM TrustZone) created for DRM helps solve the confidentiality problem.

In my original blog posts about OTOY, I advocated watermarking as inevitable and superior to DRM. OTOY has been developing and deploying watermarking since 2009. Large shared AR/VR worlds cannot possibly “encrypt what you see” (as DRM for fixed media tries to do). Yet creators of models and art in such shared virtual worlds need effective and fair protection, as experience from Second Life shows.

Indelible watermarking is a key part of the solution. See the “Watermarking and Encrypted Escrow Transactions” section at RNDR: A Photon-Driven Economy for more details.

I’m honored to be an advisor to OTOY and RNDR. I’m thrilled by the prospects for RNDR, BAT, and other domain-specific tokens that will bind the Metaverse economy into a coherent and equitable, yet decentralized, whole. The future will be tokenized!

From ASM.JS to WebAssembly

tl;dr I’m burying the lede with context and catch-up material first, so impatient or already-clued-in readers should skip to below the videos for today’s big news. Or just read Luke Wagner‘s blog post right now.

My Fluent 2015 “ECMAScript Harmony: Rise of the Compilers” talk given on April 21st:

Jeremy Ashkenas picked up this ball and ran with it into the next field’s end zone two days later in Brooklyn:

My slides for the talk I gave at ModernWeb.tw on May 15th:

Perhaps you detect a theme, beyond “JS turned 20”. Something about compilers rising, Java bytecode and asm.js, shared memory threads and SIMD. The usual retort, recurrent on Hacker News, looks like this:

In an HN sub-thread sparked by my ModernWeb.tw slides, I concurred with Patrick Walton on analogies between browsers and the Unix kernel, JS and C. One does not get to write kernel code in any language, at least not yet. Elsewhere in that sub-thread the analogue of JS was identified as architecture-specific assembly code, where the CPU architecture is fixed by physical reality, while software is written in virtual languages compiled to machine code, or run through interpreters.

[Screenshot: Hacker News comment 9554914]

In any event, whether analogous to C or assembly, JS has had a lock on the “kernel” of the client side. Web “userland code”, in contrast, can be written in a great many languages. Adding a second (bytecode) syntax has looked like a “now you have two problems” loss, against the alternative of one syntax. Until now.

Today’s Big News

It’s by now a cliché that JS has become the assembly language of the Web. Rather, JS is one syntax for a portable and safe machine language, let’s say. Today I’m pleased to announce that cross-browser work has begun on WebAssembly, a new intermediate representation for safe code on the Web.

What: WebAssembly, “wasm” for short, .wasm filename suffix, a new binary syntax for low-level safe code, initially co-expressive with asm.js, but in the long run able to diverge from JS’s semantics, in order to best serve as common object-level format for multiple source-level programming languages.

It’s crucial that wasm and asm stay equivalent for a decent interval, to support polyfilling wasm via JS. This remains crucial even as JS and asm.js evolve to sprout shared memory threads and SIMD support. Examples of possible longer-term divergence: zero-cost exceptions, dynamic linking, call/cc. Yes, we are aiming to develop the Web’s polyglot-programming-language object-file format.
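
For concreteness, here is the flavor of asm.js-style code that wasm is initially co-expressive with; because the two encode the same semantics, a polyfill can translate a .wasm binary back into JS of roughly this shape and run it in today’s engines. The module below is illustrative, not the output of any real toolchain:

    // A tiny asm.js-style module: typed, statically verifiable JS. A wasm
    // binary of the same function could be decoded back into code like this.
    function AddModule(stdlib, foreign, heap) {
      "use asm";
      function add(x, y) {
        x = x | 0;          // parameter annotation: 32-bit integer
        y = y | 0;
        return (x + y) | 0; // result coerced back to int32
      }
      return { add: add };
    }

    // Runs as plain JS even in engines with no asm.js or wasm support at all.
    var m = AddModule(globalThis, {}, new ArrayBuffer(0x10000)); // 64 KiB heap
    console.log(m.add(2, 3)); // 5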

Why: asm.js is great, but once engines optimize for it, the parser becomes the hot spot — very hot on mobile devices. Transport compression is required and saves bandwidth, but decompression before parsing hurts. A secondary consideration: JS has a few awkward corners even in its asm.js subset. Finally, once browsers support the WebAssembly syntax natively, JS and wasm can diverge, without introducing unsafe or inappropriate features into JS just for use by compilers sourcing a few radically different programming languages.

See the FAQ for more nuance and detail. No, JS isn’t going away in any foreseeable future. Yes, wasm should relieve JS from having to serve two masters. This is a win-win plan.

How: If you use Emscripten, then wasm support via a command-line flag will at first include and target the prototype polyfill. But as native wasm decoders appear in top engines (see the V8 native prototype decoder), Emscripten will auto-configure for best results. Another prototype: a JS AST compressor (encoder).
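
To sketch what “auto-configure for best results” could look like from the application side, a loader might feature-detect a native decoder and fall back to the polyfill otherwise. This is an assumption-laden illustration: the WebAssembly JS API shown here shipped later than this post, and loadViaPolyfill is a made-up helper standing in for the prototype polyfill:

    // Hypothetical loader: prefer a native wasm decoder, else fall back to
    // translating the binary via the JS polyfill. Illustrative only.
    async function loadModule(url, imports) {
      if (typeof WebAssembly === "object" &&
          typeof WebAssembly.instantiate === "function") {
        const bytes = await (await fetch(url)).arrayBuffer();
        const { instance } = await WebAssembly.instantiate(bytes, imports);
        return instance.exports;            // native decoding path
      }
      return loadViaPolyfill(url, imports); // assumed wrapper around the polyfill
    }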

These prototypes will evolve and track draft specification changes as WebAssembly matures and receives progressively more developer testing. Other compilers than Emscripten, and other compiler frameworks than LLVM, will join the mix. I expect all engines will support fast native decoders. All these parts are means to the end of a common WebAssembly standard.

Who: A W3C Community Group, the WebAssembly CG, open to all. As you can see from the github logs, WebAssembly has so far been a joint effort among Google, Microsoft, Mozilla, and a few other folks. I’m sorry the work was done via a private github account at first, but that was a temporary measure to help the several big companies reach consensus and buy into the long-term cooperative game that must be played to pull this off.

Here you can see JF Bastien of Google’s PNaCl team barely able to keep a lid on the secret. Good thing I replied with a comic book reference as plausible cover. Close one! 😀

[Screenshot: the “BE and JF assemble” Twitter exchange]

Having both the PNaCl team and the V8 team from Google, along with key people from Microsoft and the asm.js and Emscripten gurus from Mozilla, collaborating closely once everyone saw the light, has been inspiring. I’d like to single out for highest praise JF Bastien, K. Gadd, and Ben Titzer of Google; Dan Gohman of Mozilla; Abhijith Chatra and Michael Holman of Microsoft; Alon Zakai of asm.js & Emscripten fame; Filip Pizlo for JavaScriptCore/WebKit; and especially asm.js/OdinMonkey mastermind Luke Wagner.

The Big Picture

In the history of computing, the dream of a universal, language-neutral intermediate form goes back to well before Melvin Conway‘s UNCOL (1958, the same year LISP was born). I remember ANDF from the ’80s, and U-something from the ’70s. No one wants the “N x M” language sources vs. machine targets problem.

Sometimes we must take a trip or two around Fortuna’s Wheel before falling into success. Neurons appear to have evolved more than once. Believe me, I find it ha-ha funny that lowly JS (which I did not plan out this far in advance!) has paved the evo-path to WebAssembly. Yet here we are.

Although it won’t be the only compiler framework used to generate wasm, LLVM has been a boon to this project, as to wasm’s Emscripten and PNaCl progenitors. Kudos to Chris Lattner and team.

The PNaCl folks I know are good natured, but I think some are sore at me for playing Cassandra to their Troy over the years on HN. So I want to give them major credit for wringing much undefined behavior (UB) out of LLVM — really out of all levels of abstraction, from C/C++ specs down to hardware. That kind of thankless work is a big help for WebAssembly, which must be mostly-deterministic and well-defined, for interoperation and security.

(Astute readers will have by this point recalled The Birth & Death of JavaScript by Gary Bernhardt. I live to troll Gary. 😉)

Bottom line: with co-evolution of JS and wasm, in a few years I believe all the top browsers will sport JS engines that have become truly polyglot virtual machines. I predict that JS over the same timespan will endure and evolve to absorb more APIs and hardware-based affordances — but not all, where wasm carries the weight.

Again we see how the Web favors a succession of “little bangs”:

  • to hold community consensus along the evolutionary path, by
  • testing each hop for fitness with developers and implementors, and
  • supporting usable polyfills for older deployed browsers.

I usually finish with a joke: “Always bet on JS”. I look forward to working “and wasm” into that line — no joke.

/be

The Next Mission

Slides for the brief talk that I gave at a Harvard seminar on privacy and user data organized by John Taysom last week.

My talk was really more about the “network problem” than the “protocol problem”. Networks breed first- and second-mover winners and other path-dependent powers, until the next disruption. Users, or rather their data, get captured.

Privacy is only one concern among several, including how to realize economic value for many-yet-individually-weak users, not just for data-store/service owners or third parties. Can we do better with client-side and private-cloud tiers, zero-knowledge proofs and protocols, or other ideas?

In the end, I asked these four questions:

  1. Can a browser/OS “unionize its users” to gain bargaining power vs. net super-powers?
  2. To create a data commons with “API to me” and aggregated/clustered economics?
  3. Open the walled gardens to put users first?
  4. Still be usable and private-enough for most?

I think the answer is yes, but I’m not sure who will do this work. It is vitally important.

I may get to it, but not working at Mozilla. I’ve resigned as CEO and I’m leaving Mozilla to take a rest, take some trips with my family, look at problems from other angles, and see if the “network problem” has a solution that doesn’t require scaling up to hundreds of millions of users and winning their trust while somehow covering costs. That’s a rare, hard thing, which I’m proud to have done with Firefox at Mozilla.

I encourage all Mozillians to keep going. Firefox OS is even more daunting, and more important. Thanks indeed to all who have supported me, and to all my colleagues over the years, at Mozilla, in standards bodies, and at conferences around the world. I will be less visible online, but still around.

/be

Inclusiveness at Mozilla

I am deeply honored and humbled by the CEO role. I’m also grateful for the messages of support. At the same time, I know there are concerns about my commitment to fostering equality and welcome for LGBT individuals at Mozilla. I hope to lay those concerns to rest, first by making a set of commitments to you. More important, I want to lay them to rest by actions and results.

A number of Mozillians, including LGBT individuals and allies, have stepped forward to offer guidance and assistance in this. I cannot thank you enough, and I ask for your ongoing help to make Mozilla a place of equality and welcome for all. Here are my commitments, and here’s what you can expect:

  • Active commitment to equality in everything we do, from employment to events to community-building.
  • Working with LGBT communities and allies, to listen and learn what does and doesn’t make Mozilla supportive and welcoming.
  • My ongoing commitment to our Community Participation Guidelines, our inclusive health benefits, our anti-discrimination policies, and the spirit that underlies all of these.
  • My personal commitment to work on new initiatives to reach out to those who feel excluded or who have been marginalized in ways that make their contributing to Mozilla and to open source difficult. More on this last item below.

I know some will be skeptical about this, and that words alone will not change anything. I can only ask for your support to have the time to “show, not tell”; and in the meantime express my sorrow at having caused pain.

Mozilla is a movement composed of different people around the world, working productively together on a common mission. This is important to our ability to work and grow around the world.

Many Mozillians and others know me as a colleague or a friend. They know that I take people as they come and work with anyone willing to contribute. At the same time, I don’t ask for trust free of context, or without a solid structure to support accountability. No leader or person who has a privileged position should. I want to be held accountable for what I do as CEO. I fully expect you all to do so.

I am committed to ensuring that Mozilla is, and will remain, a place that includes and supports everyone, regardless of sexual orientation, gender identity, age, race, ethnicity, economic status, or religion.

You will see exemplary behavior from me toward everyone in our community, no matter who they are; and the same toward all those whom we hope will join, and for those who use our products. Mozilla’s inclusive health benefits policies will not regress in any way. And I will not tolerate behavior among community members that violates our Community Participation Guidelines or (for employees) our inclusive and non-discriminatory employment policies.

You’ll also see more from Mozilla under my leadership in the way of efforts to include potential contributors, especially those who lack privilege. This entails several projects, starting with Project Ascend, which is being developed by Lukas Blakk. I intend to demonstrate with meaningful action my commitment to a Mozilla that lives up to its ideals, including that of being an open and inclusive community.

/be

Mozilla News

A quick note to update everyone on Mozilla news. Our Board of Directors has appointed me CEO of Mozilla, with immediate effect. I’m honored and humbled, and I promise to do everything I can to lead Mozilla to new heights in this role.

I would first like to thank Jay Sullivan for his contributions to Mozilla and to the Web. He has been a passionate force at Mozilla whose leadership, especially during the last year, has been important to our success, in particular with Firefox OS. Jay is helping with the CEO transition and will then leave to pursue new opportunities.

My co-founder and 15-year partner in Mozilla, Mitchell Baker, remains active as Executive Chairwoman of Mozilla. I could not do what I do for Mozilla without Mitchell, and I like to think she feels the same way about me ;-). We have worked together well since she took on management of the tiny mozilla.org staff fragment embedded in Netscape. At that time I was “acting manager” (more like method acting manager :-P). I’ve learned a lot from Mitchell and my other peers at Mozilla about management since then!

Mozilla is about people-power on the Web and Internet — putting individual users, who create as well as consume, above all other agendas. In this light, people-fu trumps my first love, which you might say is math-fu, code-fu or tech-fu (if I may appropriate the second syllable from kung fu). People around the world are our ultimate cause at Mozilla, as well as source of inspiration and ongoing help doing what we do.

Speaking of people a bit more, I’ll take this moment to introduce Li Gong as my incoming COO. Li set up Mozilla China and our Taipei office, and he has been a crucial partner in building up Firefox OS. If you don’t know him yet, you will probably get a chance if you pass through our headquarters, as Li will be moving back to the US to help manage here.

Mozilla remains a global public benefit organization, so I’m sure I will see all of you more as I travel: to all of our offices (I have not yet been to Beijing or Taipei), to the places where we are bringing Firefox OS and the $25 smartphone, and everywhere Mozillians, developers, and others are working to make the Web better for everyone.

/be

The Web at 25

The World Wide Web is 25 years old today.

The Web is a big deal (as is the Internet on which it is built); I don’t need to tell you! But I did have a few thoughts, solicited by a friend who asked “where [do] you think the future of the Internet will take us in the next 25 years?”

My answer: 25 years is a long time. I expect some big changes (computers inside us monitoring body functions), while other things stay remarkably unchanged (no flying cars).

Even now people remark on how much more personal or intimate a smartphone is than a PC (that image still makes me laugh). Think about this when the Internet includes not just your house and most physical artifacts worth hooking up, but yourself.

In such a world, open systems built on open standards and open source are even more important, for all of these reasons:

  • interoperation among implementations;
  • freedom to migrate among different vendors’ systems;
  • ability to mix-and-match, hyperlink/transclude, copy-learn-and-hack, and monitor/audit against mistakes, malware, and surveillance.

We have more work to do. Let’s go.

/be

Other voices:

MWC 2014, Firefox OS Success, and Yet More Web API Evolution

Just over a week ago, I left Barcelona and Mobile World Congress 2014, where Mozilla had a huge third year with Firefox OS.

[Photo: Mozilla’s Firefox OS booth at MWC 2014]

We announced the $25 Firefox OS smartphone with Spreadtrum Communications, targeting retail channels in emerging markets, and attracting operator interest to boot. This is an upgrade for those channels at about the same price as the feature phones selling there today. (Yes, $25 is the target end-user price.)

[Photo: the $25 “Tarako” Firefox OS smartphone]

We showed the Firefox OS smartphone portfolio growing upward too, with more and higher-end devices from existing and new OEM partners. Peter Bright’s piece for Ars Technica is excellent and has nice pictures of all the new devices.

We also were pleased to relay the good news about official PhoneGap/Cordova support for Firefox OS.

We were above the fold for the third year in a row in Monday’s MWC daily.

(Check out the whole MWC 2014 photo set on MozillaEU’s Flickr.)

As I’ve noted before, our success in attracting partners is due in part to our ability to innovate and standardize the heretofore-missing APIs needed to build fully-capable smartphones and other devices purely from web standards. To uphold tradition, here is another update to my progress reports from last year and from 2012.


First, and not yet a historical curiosity: the still-open tracking bug asking for “New” Web APIs, filed at the dawn of B2G by Andreas Gal.

Next, links for “Really-New” APIs, most making progress in standards bodies:

Yet more APIs, some new enough that they are not ready for standardization:

Finally, the lists of new APIs in Firefox OS 1.1, 1.2, and 1.3:

This is how the web evolves: by implementors championing and testing extensions, with emerging consensus if at all possible, else in a pref-enabled or certified-app sandbox if there’s no better way. We thank colleagues at W3C and elsewhere who are collaborating with us to uplift the Web to include APIs for all the modern mobile device sensors and features. We invite all parties working on similar systems not yet aligned with the emerging standards to join us.

/be

Trust but Verify

Background

It is becoming increasingly difficult to trust the privacy properties of software and services we rely on to use the Internet. Governments, companies, groups and individuals may be surveilling us without our knowledge. This is particularly troubling when such surveillance is done by governments under statutes that provide limited court oversight and almost no room for public scrutiny.

As a result of laws in the US and elsewhere, prudent users must interact with Internet services knowing that despite how much any cloud-service company wants to protect privacy, at the end of the day most big companies must comply with the law. The government can legally access user data in ways that might violate the privacy expectations of law-abiding users. Worse, the government may force service operators to enable surveillance (something that seems to have happened in the Lavabit case).

Worst of all, the government can do all of this without users ever finding out about it, due to gag orders.

Implications for Browsers

This creates a significant predicament for privacy and security on the Open Web. Every major browser today is distributed by an organization within reach of surveillance laws. As the Lavabit case suggests, the government may request that browser vendors secretly inject surveillance code into the browsers they distribute to users. We have no information that any browser vendor has ever received such a directive. However, if that were to happen, the public would likely not find out due to gag orders.

The unfortunate consequence is that software vendors — including browser vendors — must not be blindly trusted. Not because such vendors don’t want to protect user privacy. Rather, because a law might force vendors to secretly violate their own principles and do things they don’t want to do.

Why Mozilla is different

Mozilla has one critical advantage over all other browser vendors. Our products are truly open source. Internet Explorer is fully closed-source, and while the rendering engines WebKit and Blink (Chromium) are open-source, the Safari and Chrome browsers that use them are not fully open-source. Both contain significant fractions of closed-source code.

Mozilla Firefox, in contrast, is 100% open source [1]. As Anthony Jones from our New Zealand office pointed out the other month, security researchers can use this fact to verify the executable bits contained in the browsers Mozilla is distributing, by building Firefox from source and comparing the built bits with our official distribution.

This will be most effective on platforms where we already use open-source compilers to produce the executable, to avoid the kind of compiler-level attack Ken Thompson described in 1984.
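
As a minimal sketch of the comparison step, assuming a deterministic build and using hypothetical file paths, the check reduces to hashing the locally built bits against the official distribution (a real verifier must also normalize signatures, timestamps, and other non-deterministic build outputs first):

    // Minimal sketch: compare a locally built binary against the official one.
    // Paths are hypothetical; run under Node.js.
    const crypto = require("crypto");
    const fs = require("fs");

    function sha256(path) {
      return crypto.createHash("sha256").update(fs.readFileSync(path)).digest("hex");
    }

    const local = sha256("./local-build/firefox/libxul.so");
    const official = sha256("./official-dist/firefox/libxul.so");

    console.log(local === official
      ? "Bits match: the official build corresponds to the audited source."
      : "MISMATCH: raise an alert and investigate.");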

Call to Action

To ensure that no one can inject undetected surveillance code into Firefox, security researchers and organizations should:

  • regularly audit Mozilla source and verified builds by all effective means;
  • establish automated systems to verify official Mozilla builds from source; and
  • raise an alert if the verified bits differ from official bits.

In the best case, we will establish such a verification system at a global scale, with participants from many different geographic regions and political and strategic interests and affiliations.

Security is never “done” — it is a process, not a final rest-state. No silver bullets. All methods have limits. However, open-source auditability cleanly beats having no way to audit source against binary at all.

Through international collaboration of independent entities we can give users the confidence that Firefox cannot be subverted without the world noticing, and offer a browser that verifiably meets users’ privacy expectations.

See bug 885777 to track our work on verifiable builds.

End-to-End Trust

Beyond this first step, can we use such audited browsers as trust anchors, to authenticate fully-audited open-source Internet services? This seems possible in theory. No one has built such a system to our knowledge, but we welcome precedent citations and experience reports, and encourage researchers to collaborate with us.

Brendan Eich, CTO and SVP Engineering, Mozilla
Andreas Gal, VP Mobile and R&D, Mozilla

[1] Firefox on Linux is the best case, because the C/C++ compiler, runtime libraries, and OS kernel are all free and open source software. Note that even on Linux, certain hardware-vendor-supplied system software, e.g., OpenGL drivers, may be closed source.