JSConf.eu 2011 was terrific, bigger and juicier than last year, with a strong sense of community felt from the reject.js pre-conf onward.

Chris Williams makes a moving plea for an end to negativity, meaning trolling, flaming, mocking, and hating in online media.

This sounds utopian, like “an end to history”. But it is good as an aspiration, a constant reminder, since we’ve all seen how many people tend to be more negative online than they are in person. This isn’t just a matter of isolated individual behavior, free of cultural feedback loops. The new media reinforce tribalism.

However, it is hard to be positive about some things. I will persevere….

JSConf.eu had too many awesome talks to cover without bi-locating. Mozillians were well-represented, including dmandelin and dvander on JavaScript JITs, Marijn Haverbeke on DOM implementation techniques, Chris Heilmann on Community JS reloaded – how to rock as a movement, and Andreas Gal on PDF.js. Janet Swisher led the MDC doc sprint in the Hacker Lounge.

I would like to single out Alon Zakai‘s Emscripten talk. Emscripten is an LLVM-to-JS compiler, which means it enables compiling C, C++, and Objective-C (and other languages with LLVM front ends) to JS. What’s more, interpreters written in C for Python, Ruby, and Lua have been compiled and hosted on the web.

Alon’s results are impressive, with lots of room for more wins. At JSConf.eu, jaws dropped and eyes were opened.

For my talk, I reprised some CapitolJS material, including the RiverTrail demo, which won loud and enthusiastic applause when I clicked on the “Parallel” button.

(A few people asked afterward about whether the graphics was running on one of four cores. I’ll repeat the answer here: the particle system demo uses WebGL targeting the GPU for rendering, and the four CPUs’ vector units for n-body solving. All from deadlock-free, data-race-free, seemingly single-threaded JS.)

Here’s the video of my talk:

The amazing Anna Lena Schiller created infographics for all the talks, on the spot — a truly impressive display of concentration and stamina. Here’s the one she did for my talk:


And here are the updated and new slides I presented, showing ES6 work-in-progress (none of it final, so don’t panic) and covering some current controversies.


From recent es-discuss messages, I’m afraid that classes are on their way out of ES6. This seems a shame, and avoidable. In hindsight, we did not have all class advocates working in concert on the hard issues last year and earlier this year. But we also do not agree on what’s required for ES6, and some on TC39 view minimizing as future-hostile.

To be blunt, we lost some “classes” advocates who work for Google to Dart. Others at Google on TC39 seem to want more out of ES6 classes than even Dart guarantees (see the future-hostile point above).

I’m not slamming Google as a company here, since it does still support people working on JS in TC39. I respect the people involved and believe they’re for the most part making their own choices. But Dart and other unrelated Google agenda items do impose clear and significant opportunity costs on Google’s standards activities.

To remain positive per “An End to Negativity”, I’ll simply conclude that we TC39ers should pay attention to Dart now that it is out, even though we’ve lost time and potential contributions.

The famous Tony Hoare quote that Bill Frantz cited, which argues for deferring classes, is this:

When any new language design project is nearing completion, there is always a mad rush to get new features added before standardization. The rush is mad indeed, because it leads into a trap from which there is no escape. A feature which is omitted can always be added later, when its design and its implications are well understood. A feature which is included before it is fully understood can never be removed later.
From C. A. R. Hoare’s 1980 ACM Turing Award Lecture

I agree with Erik Arvidsson that “[b]y not providing [class] syntax we are continuing to encourage a million incompatible ‘class’ libraries.” I’m with Erik: I would still like to see TC39 agree on minimal classes. But not at any cost.

Onward to new proposals with sometimes-tentative syntax. I’m continuing to “live in a fishbowl” by showing these proposals, even though doing so risks drive-by misinterpretation that we have finalized the sum of all proposals.

So, please don’t freak out. Not all of this will make it as proposed. We may also make cuts. But it’s important to address the use-cases motivating these proposals, take in the fullness of the problem space and potential solutions, and do the hermeneutic spiral.


Apart from font issues that make <| look lopsided or non-triangular, this proposal looks good. It replaces the main legitimate use-case for assigning to __proto__: presetting the prototype link in an object literal.
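A sketch of the proposed syntax (the example names are mine, and none of this was final):

```js
// Proposal-era sketch, not valid in any shipped engine. The literal's
// [[Prototype]] is preset to the left operand at creation time, replacing
// the old pattern of assigning to __proto__ after the fact.
var base = { greet: function () { return "hi"; } };
var derived = base <| { name: "derived" };
// derived.greet() would find greet via the preset prototype link.
```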


Unlike Object.extend, .{ copies only “own” properties from its right-hand-side object literal, and (this is a crucial difference) it also copies properties with private name object keys (which are non-enumerable by definition). For example: base.{[privateName]: value, publicName: value2}, given a private name object reference denoted privateName in scope.
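Spelled out a bit more (all names here are purely illustrative, and the private-name constructor is a guess at the strawman's shape):

```js
// Proposal-era sketch, not shipped syntax: .{ copies own properties,
// including ones keyed by private name objects, onto the receiver.
var privateName = Name.create();   // hypothetical private-name constructor
base.{ [privateName]: value, publicName: value2 };
// base now carries both; the private-named property is non-enumerable.
```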


Design patterns point to programming language bugs. Nevertheless, this class pattern shows clever work by Allen Wirfs-Brock, decomposing classes-as-sugar into chained operator expressions. It’s still a bit verbose and error-prone in my opinion, and cries out for the ultimate sugar of minimal class syntax (if only we could agree on that).


Much of the Dart class syntax design looks good to me. Possibly TC39 can agree to adopt it, with necessary adjustments. It would still be sugar for constructors and prototypes.


Arrow function syntax faces an uphill battle due to the collision of two things: TC39’s agreement to future-proof by keeping the grammar unambiguously LR(1) (after ASI and with lookahead restrictions), and the comma expression, (a, b, c), which I copied into JS’s grammar straight from C (not from Java, which left it out, instead providing comma-separated special forms in a few contexts, e.g. for(;;) loop heads). You can’t have both, and we do not want to remove the comma expression in Harmony.
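To see the collision concretely, here is arrow syntax as it eventually shipped (an assumption relative to the 2011 proposal) next to the comma expression. The parser cannot tell the two apart until it reaches, or fails to reach, the `=>` token:

```javascript
// "(a, b)" is ambiguous at the open paren: it could be a parenthesized
// comma expression or an arrow parameter list.
var pair = (1, 2);            // comma expression: evaluates to 2
var add  = (a, b) => a + b;   // same "(a, b)" shape, now a parameter list
console.log(pair);            // 2
console.log(add(1, 2));       // 3
```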



I’m quite in favor of block-lambdas, and they meet formal approval from TC39’s strictest grammarian. Some still object to them as an alien DNA injection from Ruby and Smalltalk, both syntactically and (with Tennent Correspondence Principle conformance regarding return, break, continue, and this) semantically.
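For flavor, a sketch of the proposed block-lambda syntax (never standardized, so this runs nowhere):

```js
// Paren-free call with a Ruby-style block; per the Tennent Correspondence
// Principle, return inside the block returns from the enclosing function.
function first(array, predicate) {
  array.forEach { |x| if (predicate(x)) return x }  // exits first(), not just the block
  return null;
}
```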


At this point, ES6 has no shorter function syntax. This seems like a loss, and fixable, to me. Your comments welcome, especially if they make novel distinctions that help forge consensus.


During the talk and Q&A, I recounted how the WHAT-WG was created to counteract a standards body gone wrong (the 2004-era W3C). I then raised the idea of a community-based group, a “JS-WG”, to augment the much healthier but still under-staffed Ecma TC39 committee.

Besides floating more ideas (really, the point is not to bikeshed endlessly or take in too many proposals to digest), a JS-WG worth organizing might actually develop draft specs and prototype implementation patches for JavaScriptCore, SpiderMonkey, and V8. The maintainers of those engines could use the help, and with patches and patched builds, we could scale up user testing beyond what’s in the cards now.


I know it’s hard to believe, but people are finally realizing that with V8 prototyping alongside SpiderMonkey, ES6 is happening. It’ll be prototyped in pieces. I hope many will be “on by default” (e.g., not under a flag in Chrome) well before the new edition is standardized (end of 2013). That’s how we roll in Firefox with SpiderMonkey.


CapitolJS, RiverTrail

I took time away from the Mozilla all-hands last week to help out on-stage at the Intel Developer Forum with the introduction of RiverTrail, Intel’s technology demonstrator for Parallel JS — JavaScript utilizing multicore (CPU) and ultimately graphics (GPU) parallel processing power, without shared memory threads (which suck).

Then over the weekend, I spoke at CapitolJS, talking about ES6 and Dart, and demo’ing RiverTrail to the JS faithful. As usual, I’ll narrate my slides, but look out for something new at the end: a screencast showing the RiverTrail IDF demo.


I had a lot to cover in a half-hour (a good talk-time in my view — we’ll see how the video comes out, but I found it invigorating). CapitolJS had a higher-than-JSConf level of newcomers to JS in attendance, so I ran through material that should be familiar to readers of this blog, presented here without much commentary:


The meaning of my color-coding should be intuitively obvious ;-).


BTW, dom.js, with its Proxy usage under the hood, is going great, with David Flanagan and Donovan Preston among others hacking away on it, and (this is important) providing feedback to WebIDL‘s editor, Cameron McCormack.


Here I must add the usual caveat that “ES6” might be renumbered. Were I more prudent, I’d call it “ES.next”, but it’s highly likely to be the 6th edition, and you’re all sophisticated close readers of the spec and its colorful history, right?


The SpiderMonkey bug tracking binary data prototype implementation is bug 578700.


The SpiderMonkey bug tracking quasi-literal prototype implementation is bug 688857.


The @ notation is actually “out” for ES6, per the July meeting (see notes). Thanks to private name objects and object literal extensions (see middle column), private access syntax is factored out of classes. Just use this[x] or p[x], or (in an object literal for the computed property name, which need not be a private name) [x]: value.
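Private name objects themselves never shipped in that form; the role was eventually filled by symbols. Here is the access pattern in that later form (a reconstruction, not 2011 syntax):

```javascript
// A symbol stands in for a private name object; the computed property
// name [x]: value in the literal and the bracketed access p[x] are the
// same shapes described above.
var x = Symbol("secret");           // plays the private-name role
var p = { [x]: 42, visible: true }; // computed key in an object literal
console.log(p[x]);                  // 42
console.log(Object.keys(p));        // ["visible"]: the symbol key is hidden
```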


As close to CoffeeScript as I can get them, given JS’s grammar and curly-brace (not indentation-based) block structure.


Ruby-esque, Smalltalk is the grandfather.


You know you want it.


I’ve been clear about preferring block-lambdas over arrows for their added semantic value and compelling syntax mimicking control statements. It’s good to hear @jashkenas agree in effect.

@mikeal is right that JS syntax is not a problem for some (perhaps many) users. For others, it’s a hardship. Adding block-lambdas helps that cohort, adds value for code generators (see Harmony Goals above), and on balance improves the language in my view. It won my straw show-of-hands poll at CapitolJS against all of arrows, vaguer alternatives, and doing nothing.


The epic Hacker News thread on my last blog post in relation to Dart needs its own Baedeker. For now I’ll just note that Dart and the politics evident in the memo are not making some of my standards pals who work for other browser vendors happy. Google may fancy itself the new Netscape, but it doesn’t have the market share to pull off proprietary power-move de facto standards.


The leaked memo makes some observations I agree with, some unbacked assertions about unfixable JS problems that TC39 work in progress may falsify this year, and a few implicit arguments that are just silly on their face.


Still, I think we should react to valid complaints about JS, whatever the source. The number type has well-known usability and (in practice, in spite of aggressively optimizing JIT-compiling VMs) performance problems.


The last bullet shows pragmas for switching default numeric type and arithmetic evaluation regime. This would have to affect Number and Math, but lexically — no dynamic scope. Still a bit hairy, and not yet on the boards for Harmony. But perhaps it ought to be.


Links: [it’s true], [SpiderMonkey Type Inference].


Coordinated strawman prototyping in SpiderMonkey and V8 is a tall order. Perhaps we need a separate jswg.org, as whatwg.org is to the w3c, to run ahead? I’ve been told I should be BDFL of such an org. Would this work? Comments welcome.


Remember, ridiculously parallel processing power is coming, if not already present, on your portable devices. It’s here on your laptops and desktops. The good news is that JS can exploit it without your having to deal with data races and deadlocks.

RiverTrail is a Narcissus-based JS to OpenCL compiler, packaged as a Firefox add-on. It demonstrates the utility of a new ParallelArray built-in, based on typed arrays. The JS-to-OpenCL compiler automatically multicore-short-vectorizes your JS for you.
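In rough strokes (method names per the prototype add-on as I understand it; the exact API may differ):

```js
// Elemental functions must be free of side effects so the compiler can
// run them across cores and vector lanes in any order.
var pa  = new ParallelArray([1, 2, 3, 4]);
var out = pa.map(function (x) { return x * 2; });  // candidate for parallel execution
```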


As noted in the previous slide, because the ParallelArray methods compile to parallelized folds (see Guy Steele’s excellent ICFP 2009 keynote), associative operations will be reordered, resulting in small non-deterministic floating point imprecision errors. This won’t matter for graphics code in general, and it’s an inevitable cost of business using parallel floating point hardware.
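A tiny runnable illustration of the underlying fact (plain JS, nothing RiverTrail-specific): IEEE-754 addition is not associative, so any fold that regroups terms can drift in the last bits.

```javascript
// Regrouping the same three terms changes the rounding, hence the result.
var left  = (0.1 + 0.2) + 0.3;  // 0.6000000000000001
var right = 0.1 + (0.2 + 0.3);  // 0.6
console.log(left === right);    // false
```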


The code looks like typical JS, with no hairy callbacks, workers, or threads. It requires thinking in terms of immutable trees and reductions or other folds, but that is not a huge burden. As Guy’s talk makes plain, learning to program this way is the key to parallel speedups.

Here is my screencast of the demo. Alas, since RiverTrail currently targets the CPU and its short vector unit (SSE4), and my screencast software uses the same parallel hardware, the frame rate is not what it should be. But not to worry, we’re working on GPU targeting too.

At CapitolJS and without ScreenFlow running, I saw frame rates above 35 for the Parallel demo, compared to 3 or 2 for Sequential.

The point of a technology demonstrator is to show where real JS engines can go. Automatic parallelization of ParallelArray-based code can be done by next year’s JS engines, based on this year’s Firefox add-on. We’re very close to exploiting massively parallel hardware from JS, without having to write WebCL and risk terrible safety and DoS bugs.


To close, I sought inspiration from Wesley Snipes in Passenger 57. Ok, not his best movie, but I miss the ’90s action movie era.


Seriously, the shortest path on the Web usually is the winning play. JS is demonstrably able to grow new capabilities with less effort than a “replacement” entails. Always bet on JS!


My TXJS talk (Twitter remix)

TXJS 2011 A6 – Brendan Eich – Ecma TC39: The Good, The Bad, and The Ugly.

[Main slides] [Paren-free]

I spoke at TXJS, a really excellent regional JS conference, in June. Thanks to @SlexAxton, rmurphey, and everyone else involved. My talk was concerned with the good, bad, and ugly of Ecma TC39 (and I mean those words in the best possible way!), mixing philosophy with historical events from the last 14 years.

After describing the standards process and its history in Ecma, I presented the good stuff that’s going into ES6 (I gave a remixed version of this part of the talk at an SFTechTalk hosted by Twitter last month — thanks to @valueof and @chanian for hosting). As a bonus, I closed with a paren-free update, whose punchline slide is included at the bottom.

TXJS Talk.001

I am not Tuco.

TXJS Talk.002

We don’t smoke, but the rest is pretty much right.

TXJS Talk.003

Here I praised Jan van den Beld, former Secretary-General of Ecma — a raconteur and polymath — for his stewardship of Ecma.

Yes, Netscape picked ECMA (now Ecma, pay attention ;-)) as the standards body to which to submit JS, mainly to give Microsoft grief, since ECMA had bravely standardized some part of the Windows API. But Jan and the entire TC39 TG1 group played fair on ES1, the first ECMA-262 Edition. Microsoft was so happy they standardized early C# and CLI metadata specs at Ecma.

TXJS Talk.004

Not Bruce Campbell, so not me. That means I’m the Bad.

TXJS Talk.005

Here is something that the Google leak about Dart (née Dash) telegraphs: many Googlers, especially V8 principals, do not like JS and don’t believe it can evolve “in time” (whatever that might mean — and Google of course influences JS’s evolution directly, so they can put a finger on the scale here).

They’re wrong, and I’m glad that at least some of the folks at Google working in TC39 actually believe in JS — specifically its ability to evolve soon enough and well enough to enable both more predictable performance and programming in the large.

There’s a better-is-better bias among Googlers, but the Web is a brutal, shortest-path, Worse-is-Better evolving system.

I’ve spent the last 16 years betting on the Web. Evolving systems can face collapses, die-offs, exigent circumstances. I don’t see JS under imminent threat of death due to such factors, though. Ironic that Google would put a death mark on it.

TXJS Talk.006

TXJS Talk.007

Here I propose that Crock is Plato and I am Aristotle, and that while we need to “keep it real”, we must also hew to high ideals.

TXJS Talk.008

The ideas I cite here, represented by the Hermeneutic Circle, definitely apply to TC39’s understanding of the text of ECMA-262, as well as various canonical texts in Computer Science. The committee works best when it spirals in on a solid design, avoiding local and premature optimizations and pessimizations.

TXJS Talk.009

TXJS Talk.010

Every one of these “Bad Parts” has been on parade in TC39 in recent years, including 2011. I’ve been tempted by the second one, horse-trading, so I’m not innocent (no one is).

The “Scenario Solving” one is particularly subtle in that the Scenario proposed to be solved is quite often a very real developer problem. But a complex, ad-hoc solution to it, especially when rushed, too often is unsound. We would do better to take the Scheme lesson to heart, and develop sound and valid orthogonal primitives. Higher-level libraries built on them can be standardized post hoc.

TXJS Talk.011

TXJS Talk.012

These meta-discussions are sometimes quite funny in light of the vendor whose representative is making them. I note that C# can now be written to look like JS. Why shouldn’t any particular extension to C# be at least considered (not rubber-stamped of course) for JS standardization?

TXJS Talk.013

TXJS Talk.014

TXJS Talk.015

TXJS Talk.016

TXJS Talk.017

These slides provide a glimpse of ES6, still under construction but already full of good stuff.

TXJS Talk.018

The Harmony Goals are not just lofty abstractions.

TXJS Talk.019

I hope the way JS is developed and standardized, as much in the open as the existing standards process permits, and with developer feedback via es-discuss, twitter, and the various conference talks, helps. If not, we may as well wait for some single-source solution to descend from the mountain. And then hold our breaths waiting for Firefox, IE and Safari to implement!

TXJS Talk.020


While paren-free in all likelihood won’t make ES6, the for-of loop will, and comprehensions and generator expressions in ES6 will be paren-free. Yay!
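For example, using for-of as proposed (and as it ultimately shipped; running this assumes an ES6-era engine):

```javascript
// for-of iterates values, unlike for-in's string keys.
var sum = 0;
for (var v of [1, 2, 3]) {
  sum += v;
}
console.log(sum);    // 6
```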


New JavaScript Engine Module Owner

As you may know, I wrote JavaScript in ten days. JS was born under the shadow of Java, and in spite of support by marca and Bill Joy, JS in 1995 was essentially a one-man show.

I had a bit of help, even at the start, that I’d like to acknowledge again. Ken Smith, a Netscape acquiree from Borland, ported JDK 1.0-era java.util.Date (we both just drafted off of the Java truck, per management orders; we did not demur from the Y2K bugs in that Java class). My thanks also to Netscape 2’s front-end hackers, chouck, atotic, and garrett for their support. EDIT: can’t forget spence on the X front end!

That was 1995. Engine prototype took ten days in May. Bytecode compiler and interpreter from the start, because Netscape had a server-side JS product in the works. The rest of the year was browser integration, mainly what became known as “DOM level 0”. Only now standardized in HTML 5 and Anne’s wg. Sentence fragments here show my PTSD from that sprint :-/.

In 1996 I finally received some needed assistance from RRJ, who helped port David M. Gay and Guy Steele’s dtoa.c and fix date/time bugs.

Also in summer 1996, nix interned at Netscape while a grad student at CMU, and wrote the first LiveConnect. I am still grateful for his generous contributions in wide-ranging design discussions and code-level interactions.

At some point in late summer or early fall 1996, it became clear to me that JS was going to be standardized. Bill Gates was bitching about us changing JS all the time (some truth to it; but hello! Pot meet Kettle…). We had a standards guru, Carl Cargill, who knew Jan van den Beld, then the Secretary-General of ECMA (now Ecma). Carl steered our standardization of JS to ECMA.

Joining ECMA and participating in the first JS standards meeting was an eye-opener. Microsoft sent a B-team, and Borland and a company called NOMBAS also attended. “Success has many fathers” was the theme. The NOMBAS founder greeted me by saying “oh, we’ve been doing JavaScript for *years*”. I did not see how that could be the case, unless JS meant any scripting language with C-based syntax. I had not heard of NOMBAS before then.

At that first meeting, I think I did well enough in meta-debate against the Microsoft team that they sent their A-team to the next meeting. This was all to the good, and Microsoft in full-blooded compete mode, but also with individual initiative beyond the call of corporate duty by Shon Katzenberger, materially helped create ES1. Sun contributed Guy Steele, who is composed of pure awesome. Guy even brought RPG for fun to a few meetings (Richard contributed ES1 Clause 4).

Meanwhile, in fall 1996, I was under some pressure from Netscape management to write a proto-spec for JS, but that was not something I could do while also maintaining the “Mocha” engine all by myself in both shipping and future Netscape releases, along with all of the DOM code.

This was a ton of work, and on top of it I had to pay off substantial technical debt that I had willingly taken on in the first year. So I actually stayed home for two weeks to rewrite Mocha as the codebase that became known as SpiderMonkey, mainly to get it done (no other way), also to go on a bit of a strike against the Netscape management team that was still underinvesting in JS. This entailed garbage collection and tagged values instead of slower reference-counting and fat discriminated union values.

Also in fall 1996, chouck decided to join me as the second full-time JS team-mate. He and I did some work targeting the (ultimately ill-fated) Netscape 4 release. This work was ahead of its time. We put the JS engine in a separate thread from the “main thread” in Netscape (still in Mozilla). This allowed us to better overlap JS and HTML/CSS/image computations, years ahead of multicore. You could run an iloop in JS and the “slow script dialog” seamlessly floated above it, allowing you to stop the loop or permit it to continue.

After summer 1996 and the start of ECMA-262 standardization, Netscape finally invested more in JS. Clayton Lewis joined as manager, and hired Norris Boyd, who ended up creating Rhino from SpiderMonkey’s DNA transcoded to Java. This was ostensibly because Netscape was investing in Java on the server, in particular in an AppServer that wanted JS scripting.

I met shaver for the first time in October 1996 at Netscape’s NY-based Developer Conference, where he nimbly nerd-blocked some Netscape plugin API fanboys and saved me from having to digress from the main thing, which was increasingly JS.

Some time in 1997, shaver contributed “LiveConnect 2”, based on more mature Java reflection APIs not available to nix in 1996. Clayton hired shaver and the JS team grew large by end of 1997, when I decided to take a break from JavaScript (having delivered ES1 and ES2) and join the nascent mozilla.org.

I handed the keys to the JS kingdom to Waldemar Horwat, now of Google, in late 1997. Waldemar did much of the work on ES3, and threw his considerable intellect into JS2/ES4 afterwards, but without overcoming the market power and stalling tactics of Microsoft.

True story: Waldemar’s Microsoft nemesis on TC39 back then, at the time a static language fan who hated JS, has come around and now endorses JS and dynamic languages.

Throughout all of this, I maintained module ownership of SpiderMonkey.

Fast-forward to 2008. After a great (at the time) Firefox 3 release where @shaver and I donned the aging super-hero suits one more time to compete successfully on interpreter performance against JavaScriptCore in WebKit, Andreas Gal joined us for the summer in which we hacked TraceMonkey, which we launched ahead of Chrome and V8.

A note on V8: I’d learned of it in 2006, when I believe it was just starting. At that point there was talk about open-sourcing it, and I welcomed the idea, encouraging any of: hosting on code.google.com, hosting without any pressure to integrate into Firefox on mozilla.org (just like Rhino), or hosting with an integration plan to replace SpiderMonkey in Firefox. I had to disclose that another company was about to release their derived-from-JS engine to Mozilla, but my words included “the more the merrier”. It was early days as far as JS JITs were concerned.

V8 never open-sourced in 2006, and stealthed its way to release in September 2008. This may have been a prudent move by Google to avoid exciting Microsoft. Clearly, in 1995, the “Netscape + Java kills Windows” talk from Netscape antagonized Microsoft. I have it on good authority that a Microsoft board member wrote marca at the end of 1995 warning “you’ve waved the cape in the bull’s face — prepare to get the horns!” One could argue that Chrome in 2008 was the new red cape in the bull’s face, which begot IE9 and Chakra.

Whatever Google’s reasoning, keeping V8 closed-source for over two years hurt JS in this sense: it meant Apple and Mozilla had to climb the JIT learning curves on their own (at first; then finally with the benefit of being able to inspect V8 sources). Sure, the Anamorphic work on Self and Smalltalk was somewhat documented, and I had learned it in the ’90s, in part with a stint on loan from Netscape to Sun when they were doing due diligence in preparation for acquiring Anamorphic. But the opportunity to build on a common engine codebase was lost to path dependence.

On the upside, different competing open source engines have demonstrably explored a larger design space than one engine codebase could under consolidated management.

In any event, the roads not taken in JS’s past still give me pause, because similar roads lie ahead. But the past is done, and once we had launched TraceMonkey, and Apple had launched SquirrelFish Extreme, the world had multiple proofs along with the V8 release that JS was no longer consigned to be “slow” or “a toy”, as one referee dismissed it in rejecting a PLDI submission from Andreas in 2006.

You know the rest: JS performance has grown an order of magnitude over the last several years. Indeed, JS still has upside undreamed of in the Java world where 1% performance win is remarkable. And, we are still at an early stage in studying web workloads, in order to synthesize credible benchmarks. On top of all this, the web is still evolving rapidly, so there are no stable workloads as far as I can tell.

Around the time TraceMonkey launched, Mozilla was lucky enough to hire Dave Mandelin, fresh from PhD work at UCB under Ras Bodik.

The distributed, open source Mozilla JS team delivered the goods in Firefox 4, and credit goes to all the contributors. I single Dave out here because of his technical and personal leadership skills. Dave is even-tempered, super-smart, and a true empirical/skeptical scientist in the spirit of my hero, Richard Feynman.

So it is with gratitude and more than a bit of relief, after a very long 16 years in full, 13 years open source, that I’m announcing the transfer of SpiderMonkey’s module ownership to @dmandelin.

Hail to the king, baby!

Mozilla’s NodeConf Presentation

NodeConf is a blast, and Mozilla had a 30 minute slot. Here’s the slideshare.net link.


SpiderNode and V8Monkey are on github, of course. Paul O’Shannessy already blogged a few weeks ago.


To avoid confusion, here’s the cheat-sheet:

  • V8Monkey is SpiderMonkey with V8’s API around it. We are not done emulating the full V8 API.
  • Because we haven’t yet perfectly emulated the full V8 API, its few language-level extensions (e.g., Error.captureStackTrace), or the V8 build system, SpiderNode is a clone of Node with V8Monkey integrated and, in a few cases we want to get rid of, hacked in place of V8.


This slide should speak for itself.

We are not out to make a maintained, competing fork of Node, just a friendly downstream that should go away as soon as possible. We aren’t selling anything to Node users.

We are trying to improve SpiderMonkey’s API, test Harmony JS language features in the Node setting, and have fun learning about the new JS server side.





These four slides are straight from my JSConf.us 2011 talk. I went fast since a lot of NodeConf attendees were at JSConf, but a good number of hands did not go up when I asked who attended both conferences.



yield conquers the nested function spaghetti monster.

Generators are winning, and they are worth playing with and investigating in SpiderNode and of course Firefox and other SpiderMonkey embeddings (also in Rhino). They are not the last word on the “anti-function-nesting” topic, for sure. I’m pretty sure there is no “last word”.
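A minimal sketch of the TaskJS idea: a driver resumes a generator with each result, so the logic reads top-to-bottom instead of nesting callbacks. (This uses the function* syntax that later standardized; 2011-era SpiderMonkey spelled generators differently, and the driver here fakes the async step synchronously.)

```javascript
// run() drives a generator to completion, feeding each yielded value
// through a pretend "async operation" (here: just add 1) and resuming
// the generator with the result.
function run(gen) {
  var it = gen();
  function step(value) {
    var r = it.next(value);         // resume the generator with the result
    if (!r.done) step(r.value + 1); // fake async op: add 1, then resume
  }
  step(undefined);
}

var log = [];
run(function* () {
  var a = yield 10;   // "wait" for an operation; resumed with 11
  var b = yield a;    // chain on the previous result; resumed with 12
  log.push(a, b);
});
console.log(log);     // [11, 12]
```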

Thanks to Rob Arnold for this demo and the next one, and of course to Dave Herman for TaskJS. Thanks too to Shawn Wilsher for the encouragement at the last minute 😉.


This is an even shorter demo. I switched to a terminal window, fired up $ ./node helloworld.js, loaded into a fresh Firefox window, and pressed reload repeatedly to show the counter incrementing.

Node really did break the mold when it comes to ease of writing server code that you can get running super-fast, using JS and lots of the client-side knowledge you may already have.


One more time: thanks to @robarnold, @sdwilsh, @zpao, the awesome @john_h_ford who did our build automation, and of course my partner in crime for much mad science at Mozilla, @andreasgal.

Join us on IRC and the mailing list, we have a lot of work still to do, and we’re having a ton of fun.


My JSConf.US Presentation

@jashkenas was kind enough to let me join him for his JSConf.us session. Here is the slideshare link. I’ll comment on the individual slides below.


Jeremy’s talk was entitled “CoffeeScript as a JS/next”, and I was interested in giving an update on Ecma TC39 Harmony progress, so when Jeremy and I met and caught up during the first day of JSConf, we quickly agreed on a joint session with about 15 minutes in the first half for my stuff.


JS developers sometimes seem afraid of the future, specifically of what browser vendors and Ecma TC39 might do to them! Here I used some inspiring quotes and a practical, three-step process outline to encourage everyone at JSConf to have courage and make their direct wishes known, as best they can. By acting (using CoffeeScript, for example) as well as writing (es-discuss at mozilla.org).

Browser competition is hot, browser vendors crave developers as fans and supporters. This should give JS hackers great power, if they can speak coherently, as individuals and as a group. But it’s important for developers to speak clearly, since TC39 members often listen to their own peers on the committee and in their respective companies more than to the JS community writ large.

Of course “the community” does not speak with one voice. Communities do evolve leaders, and leaders are often necessarily articulate. Also, effective communities produce artifacts such as PrototypeJS and jQuery, which speak in their own ways.

Why does the community matter? One good reason is this: communities facing harsh survival tests (as JS’s has) reward merit better than committees operating under competitive and time pressures.

@andrewdupont observed, in his barn-burner of a talk near the end of JSConf day one, how recent JS standards started as community-project library methods. Often enough, these de facto standards also started out as rough and real, i.e., far from perfect — just as JS did. They had to be user-tested and iterated upon until winning.

I reiterated this point in my talk. TC39 should avoid inventing where it can pave the cowpaths.


The Harmony goals (slightly reworded here to fit on the slide, and for clarity) indeed talk about codifying de facto standards.

The “complex applications” and “libraries” bits suggest programming in the large, implying modules and other Harmony proposals. The “code generators targeting the new edition” verbiage directly invokes CoffeeScript and the many other JS code-generators out there today.

An important point about the Harmony process: you don’t have to wait three years for a finished standard before browsers start supporting prototype implementations of solid proposals. We aim to do as implementors have done with HTML5 and the Web API standards — continuous early implementation and rolling standardization. Thus, Proxies in FF4, WeakMaps in Nightly (FF6) releases, etc.


This slide (which builds bullet by bullet in Keynote) is as concise a summary as I could make of the bulk of what is already very likely in the next ECMA-262 Edition, according to the Harmony Proposals wiki page.

We have invested over the years in most of these proposals, not only on TC39 but in SpiderMonkey (with Norris Boyd, Steve Yegge, et al. matching in Rhino), in order to test implementability and usability. Firefox and other Mozilla XUL apps and add-ons make great use of let, generators, and more. We will keep adjusting our prototype implementations to match the emerging ES.next spec.

Now it looks like other browsers will start to implement Harmony proposals. This effort complements the adoption of de facto standard library functions. Some problems can’t be solved in the current language by writing library code. We need language evolution at both syntactic (user interface) and semantic (deep meaning) layers.


David Flanagan’s talk, which was first up on JSConf day one, covered typed arrays and array buffers a bit. During the Q&A I mentioned the Harmony binary data proposal from @littlecalculist.

The binary data proposal embraces and extends the fine work of WebGL typed arrays. Sometimes someone paves the cowpath into an expressway, which grows into a superhighway. JS has a lot of bits to move these days.
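Typed arrays, which the binary data proposal builds on, are already usable outside WebGL contexts; a minimal sketch of two views sharing one buffer:

```javascript
// WebGL-style typed arrays: a Float32Array and a Uint8Array are
// different views over the same ArrayBuffer, sharing its raw bytes.
const buf = new ArrayBuffer(16);       // 16 bytes of raw storage
const floats = new Float32Array(buf);  // 4 x 32-bit floats
const bytes = new Uint8Array(buf);     // byte-level view of the same bits

floats[0] = 1.0;                       // writes through to buf
```

Because both views alias the buffer, setting `floats[0]` makes some of the underlying bytes nonzero — exactly the bit-level access the binary data proposal generalizes.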


I’ve been talking about better function syntax for a while. function is too long (I wish I’d used fn in 1995 and preempted this issue).

TC39 has had several proposals or variations. Meanwhile, CoffeeScript has emerged and taken off. In arrow function syntax I’ve synthesized all the winning ideas, without goring anyone’s ox. This slide conveys the main points of the proposal.

The # syntax from Harmony of My Dreams evolves in this proposal to be an orthogonal prefix, for freezing and joining (fusing identity up to the nearest relevant closure). See records and tuples for analogous hash meaning.
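For a feel of where this synthesis ended up, here is a sketch in the arrow syntax that modern engines implement — the shipped descendant of this line of proposals (the `#` sigil itself never made it):

```javascript
// Concise function syntax: an expression body is the implicit return
// value, with no `return` keyword needed.
const add = (a, b) => a + b;
const square = x => x * x;

// Multi-statement bodies still use braces and an explicit return.
const hypot = (a, b) => { return Math.sqrt(square(a) + square(b)); };
```

The win is exactly the one argued for here: `function` plus `return` shrinks to a few punctuators.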


Here I go out on a limb in listing some strawman proposals not yet in Harmony. Some of these will make ES.next, some probably won’t (there will be an ES.next.next).

At this point in my talk, I advocated strongly for standardizing prototypal inheritance a la CoffeeScript’s class, super, and @ syntactic sugar.

TC39 is tempted to “do more”, which means invent (not by committee but by champions, single members or sub-groups working on a particular area). I believe we will not achieve consensus soon, and possibly go wrong, if we overreach.

We could renounce classes in order to remain future-proof, but JS developers are crying out for sweet and (more important) foolproof syntax to make prototype-based single-inheritance class-like abstractions. We should pave this cowpath now.

(Some at the conference tweeted against “classes” thinking this was another turn-JS-into-Java thing, but counter-tweeting made it clear that it’s purely prototypal. What’s in a name? It is hard to beat “class”, IMHO.)

Oh, and paren-free still wins. Where it is not simply a relaxation of JS’s over-bracketed C-based grammar, it gives for-in loops, comprehensions, and generator expressions not only lighter syntax (and only paren-free syntax!) — it gives better semantics.

Generally, paren-free for-in provides the iteration protocol hook for custom iterators, definitely including generators (the easiest way to write an iterator).

Specifically, paren-free for-in reforms array iteration to be over values, not keys. This is pure win (ignoring migration, see next paragraph), as far as I can tell.

Forbidding parenthesized for-in heads makes a migration “early error”, a tax for a common good: lighter syntax with much better semantics.
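Value-based iteration can be tried today with the for-of loop that this reform fed into; a sketch contrasting it with key-based for-in:

```javascript
// for-in visits keys (as strings); for-of, the shipped descendant of
// the reformed for-in proposed here, visits values directly.
const arr = ["a", "b", "c"];
const keys = [];
const values = [];

for (const k in arr) keys.push(k);    // string keys: "0", "1", "2"
for (const v of arr) values.push(v);  // values: "a", "b", "c"
```

The migration hazard discussed above is exactly the difference between these two result lists.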


JS developers and implementors on TC39 must learn from one another and “meet in the middle”. The Harmony goals are good. But developers may do only what can be done in library code, or reach for CoffeeScript or another language on top of JS. And TC39 may over-invent.

The better way is a dialog between JS developers, especially natural leaders among them such as @jashkenas, and TC39 members.

This won’t involve “scientific polling”. There’s no substitute for nice judgment and (ultimately) sound language design theory and practice. But the experts must also learn from the users, who’ve moved mountains on their own over the last ten years. And users, meaning JS developers, should step up to this dialog and away from fear and passivity.


After our talk, with some fun stage misdirection, Alex Russell and Peter Hallam of Google introduced Traceur, Google’s Harmony-ish transpiler. This is a good development on the whole. I’m happy to see it as a way to test proposals. However, I need to register some misgivings that go beyond “style point deductions”:

  • The slides are Chrome-only. Come on, it’s not 1997. I’m using slideshare.net and HTML/PNG. There’s no excuse.
  • Some of the extensions are neither Harmony nor Strawman. (In particular for (i : o) was rejected. So was __iterator__.) I hope this can be updated, but if Traceur turns out to support (over the long run, ignoring short-term committee jitter) other than what has been pitched and vetted by TC39, that would be several kinds of wrong.
  • Dion’s post was up last night [UPDATE: not pre-arranged, Dion is just very fast even when he’s not actually at JSConf. Sorry, D], yet it says “JS.next” without qualification. My posts, and other TC39ers as far as I know, have been careful to say “experimental”, “proposed”, “possible”, “dark horse”, etc. I personally try to assess odds of inclusion in ES.next based on the Harmony/Strawman process. No such nuance from Dion.
  • Narcissus on github is where Mozilla and several academic partners have been working on ES.next prototypes. We’re not as well-staffed as Google, but we’ve done this work in the open and I’ve talked about it since last year. Meanwhile, Traceur was done as “delayed open source”, with the wow-effect JSConf unveiling.
  • Ok, to each his own. But the after-show cajoling by Alex to me, along the lines of “come work on our transpiler with us”, left a bad taste. I say: Open-source early, tell your TC39 colleagues your plans and intentions, invite others to join you. Don’t seed the googlecode project with all-Google-employee committers, work for months in relative secrecy, and then go open.

This reminds me of how V8 was developed (I have a few saved emails from 2006… V8 released as open source with Chrome in 2008).

Again, it’s Google’s prerogative to do whatever it wants with its staff and code. But this pattern of secret-first/open-later development is on a slippery slope toward openwashing. That aside, with Traceur it seems “forkish” in TC39 social-cohesion terms.

Another point: a transpiler (syntax only or mostly) or true compiler/runtime targeting JS (new semantics too) is ultimately not enough. Harmony proposals need to be implemented in several engines, ideally including V8, before a new standard edition is ratified. No word on this at JSConf, but I hear good things from others at Google, so I’ll leave this one on a positive note.

Really, I’m not out to break TC39 and Harmony over this. Indeed we haven’t heard anything from Microsoft about what they’re hoping for in Harmony; we have not seen anything like an open source transpiler. So good for Google. But some things about how this came out don’t sit right, enough so that I’m writing publicly.


Jeremy is a star, and I want to thank him again for letting me share his slot. He made a number of excellent points, some intentionally “dialectically sharp”, all encouraging JS developers to use and even implement “better languages” built on top of JS.

The sharpest point in my mind was “JavaScript is too important to be left to the experts.” This is obviously true for several definitions of “experts”, but expertise matters too or we wouldn’t have Jison, the parser-generator used by CoffeeScript, based on Bison and Yacc before it.

Jeremy was dialectically sharp to encourage developers to throw stuff up and see what sticks. Seeing what sticks with a programming language in part depends on user-testing, usability, human factors that are not reduced to science — that is, part of language design is art. He even encouraged people to “cheat when you get stuck”, words I try to live by (after the example of Cadet Kirk).

Now, not everyone is a @WilliamShatner, or a @jashkenas. Many will get stuck, cheat, fail, and cry in despair. Not to be discouraging!

Expertise, even formal methods and people who know them, can help. Zaach‘s Jison is one example. TC39’s generally pragmatic collection of experts, including Dave Herman, Allen Wirfs-Brock, Mark Miller, Sam Tobin-Hochstadt and others, also constitute a valuable resource for the community.

What Jeremy and I tried to advocate, which I believe is happening before our eyes, is a connecting of the circle, from JS as I created it, through its fearless communities and their leaders, back to browser vendors and experts on TC39 — and around again.

We are not just advocating a language on top of JS that you use to avoid JS (GWT, Haxe, Objective-J, etc.). We are advocating that you all help build a better JS on JS, which then becomes standardized as JS.next.

This is important work. I haven’t seen anything like it at this scale, with multiple interoperating implementations. It happens with single-open-source-as-spec languages (Perl, Python, Ruby). With transpilers, better libraries, and de facto, prompt (but not premature) standardization, it can happen with JS, too.


Harmony Of My Dreams

Continuing in the vein of paren-free, I’d like to present a refreshed vision of JavaScript Harmony. This impressionist exercise is of course not canonical (not yet), but it’s not some random, creepy fanfic either. Something like this could actually happen, likelier and better if done with your help (more on how at the end).

I’m blurring the boundaries between Ecma TC39’s current consensus-Harmony, straw proposals for Harmony that some on TC39 favor, and my ideas. On purpose, because I think JS needs some new conceptual integrity. It does not need play-it-safe design-by-committee, either of the “let’s union all proposals” kind (which won’t fly on TC39), or a blind “let’s intersect proposals and if the empty set remains, so be it” approach (which also won’t fly, but it’s the likelier bad outcome).

Anyway, it’s my blog, and my current dream. I hope you like it. Talk and dream back at me, and with any luck we’ll build a better Harmony-in-reality.

little languages

Calling JavaScript a little language is polite but false at this late date: ES5.1 weighs in at over 100,000 words, with hundreds of nonterminals in the lexical and syntactic grammars.

I would say that the same goes for CoffeeScript, although I get the point in its use of the phrase “little language”: concise expression-language sugar for the lever-arm, using JS-in-full as implemented in all browsers as the fulcrum, for maximum productivity leverage. CoffeeScript is well done and more convenient to use than JS, provided you buy into the Python-esque significant space and the costs of generating JS from another source language. But semantically it’s still JS.

Could JS evolve to be a better “little language” in both surface and substance? Implementors and users will impose some fuzzy but obvious and (past the fuzz) hard limits. JS can’t evolve directly into something too different. For instance, I believe JS implementors on TC39 would reject significant space instead of curly braces, or a mandatory bottom-up parser with disambiguation magic.

Meanwhile, translators such as CoffeeScript (when not run as a server-side code generator) may become more widely used, pushing JS in different directions. Still, it will be hard for them to beat native code implementation, <script> tag prefetching, and the other built-into-every-browser advantages of pure JS.

Whatever happens with translators, JS’s fitful progress over its life so far suggests that it can evolve further, and significantly. Such evolution requires growth in the short run, to solve the obvious web-imposed problem of backward compatibility.

growing a language

Maturing languages grow even without web-wide compatibility constraints, and JS is no exception. We should continue to grow the language, keeping support for old forms while adding new forms to help users themselves grow the language.

That last link is to a video of Guy Steele’s famous talk. Here’s a cleaned-up transcript. One quote:

If we add hundreds of new things to the Java programming language, we will have a huge language, but it will take a long time to get there. But if we add just a few things — generic types, overloaded operators, and user defined types of light weight, for use as numbers and small vectors and such — that are designed to let users make and add things for their own use, I think we can go a long way, and much faster. We need to put tools for language growth in the hands of the users.

Look past the Java specifics. This applies deeply to JS as well. Empowering users to grow the language is why modules, proxies, binary data, and even an operators/literals/value-types dark horse, are high priorities for Harmony in my view. Which is not to say we shouldn’t add anything else. Because:

I hope that we can, in this way or some other way, design a programming language where we don’t seem to spend most of our time talking and writing in words of just one syllable.
You can do anything with function in JS, but you shouldn’t have to — it over-taxes JS programmers and VM implementors to learn and optimize all the idiomatic patterns. Too much like writing with only one-syllable words.

grow to shrink

If we do this right, Harmony’s kernel semantics do not grow inordinately in complexity. Then users merely have to choose to use the simpler new syntax over the old, and for those users (and possibly for everyone, many years hence — sooner, if you use a translator to “lower” Harmony to JS-as-it-is), JS is in fact more usable and smaller in its critical dimensions.

Beyond users choosing to code in a subset, we could potentially shrink — or not grow, or grow less — the opt-in Harmony language by excluding misfeatures of “classic JS”. This was one of the ideas developed in paren-free and some followup comments: no messy, underspecified, not-quite-interoperable for (i in o) loop, only for i in o loops, comprehensions, and generator expressions, to take one example. ES5 strict mode already removes with. Harmony already proposes to remove the global object as top-most scope, to pick a non-syntactic example.

(Opt-in is required for Harmony because of new syntax. Yet developer brain-print conservation, existing code migration, shared-object-heap interoperation, and browser engine code re-use, all favor keeping Harmony “close” to JS-as-it-is. How close is the question. I’m in favor of pushing this envelope given the inertia of the standards setting and the conservatism of committees. Bear with me if you disagree.)

On the web, the only way to shrink is to grow first. As I put it at jsconf.us last year, provide better carrots to lead the horses away from the rotting vegetables, which can be cleaned up later.

finding harmony

Some of what’s below is already harmonious according to TC39. Some is new, some is not yet proposed. The idea is to give an overview of Harmony that covers all of the high points and adds some new spice, instead of referring true believers to the sprawling wiki and then hoping they can figure things out from recent changes and the discussion list.

Unlike the CoffeeScript docs, I’ll show JS as it is implemented today first, followed by the Harmony-of-my-dreams code. This emphasizes how we’re working to fill gaps in the language’s semantics, not simply add sugar that lowers from new syntax to old. There’s nothing wrong with desugaring, and I believe CoffeeScript and other front ends for JS have a bright future, but TC39 is charged with evolving the core language, especially in ways that can’t be done efficiently or at all in today’s JS.

With this context in mind, let’s dive in.

binding and scope

Block scoped let and const, not weird old hoisted (to top of function or script) var. Lexical scope all the way up, no global object on the scope chain. Free variables are early errors.

// today:
var still_hoisted = "alas"
var PRETEND_CONST = 3.14
function later(f, t, type) {
  setTimeout(f, t, typo) // oops...
}

// Harmony:
let block_scoped = "yay!"
const REALLY = "srsly"
function later(f, t, type) {
  setTimeout(f, t, typo) // EARLY ERROR
}

Removing the global window object from the scope chain doesn’t mean it won’t be available, though; see modules below.


[Presented in the spirit of the Mozilla Apologator:] I’m sorry for picking so long a keyword. Beyond the length of function, and for all the many wins of closures, the objects that result from evaluating function declarations and expressions have some very shaggy hair. Time for a trim, starting with syntax proposed by @Arv and @Alex, extended to work with binding keywords:

// today:
function add(a, b) { return a + b }
(function(x) { return x * x })

// Harmony:
const #add(a, b) { a + b }
#(x) { x * x }

The # character is one of few ASCII punctuators available. My straw polls on better characters to use have not led to a clear winner, and this one is proposed for Harmony. CoffeeScript’s -> and => seem to require bottom-up parsing, so they’re not going to fly among implementors.

Beyond syntax, notice how the braced body can end in an expression statement that evaluates to the implicit return value. And here’s another difference from functions: #-functions are immutable and joined.

What to call these # functions? Ruby has given up hash rockets. Can JS coin a new hash-phrase: hash-funcs? Suggestions welcome.

We should not call these things lambdas, as that drags in untenable Tennent’s Correspondence Principle strangeness, such as return from a lambda returning from its enclosing function (if still active; otherwise you would get a runtime error).

tail position

The implicit return value does more than save six characters plus one space (relieving you of having to type return ), it also makes the last expression statement be in tail position. This contrasts with a JS function, which has an implicit return undefined; at the end of its body.

// today:
function cps(x) { not_tail(x) }
function cps_harder(x) {
  return tail(x)
}

// Harmony:
const #cps_smarter(x) {
  tail(x)
}

At his JSConf.eu talk last fall, @Crock promoted the idea of proper tail calls, something we have wrestled with in TC39 since ES4 days. Tail calls are a feature of Scheme, providing an asymptotic space guarantee (in plain English, you can tail-call without growing the call stack inevitably to entrain the space for all args and vars active along the dynamic call chain).

I agree with Doug that tail calls would be a win, especially with evented code. The # function syntax allows us to give tail calls a boost and save you seven keystrokes for the dropped return (14 total, counting function → #).

Some have objected that this creates an unintended completion value leak-hazard, but (a) we can improve the Harmony definition of completion value in #-functions, (b) the void operator I added for javascript: URLs back in ’95 stands ready, and (c) when in doubt, use function syntax or write an explicit return.

no arguments

The arguments object is another clump of hair to trim from functions in adding hash-funcs. With Harmony, we have rest parameters, so we don’t need no steenking arguments!

// today:
(function () {
  return typeof arguments
})() == "undefined"

// Harmony:
#() {
  typeof arguments
}() == "undefined"

This haircut makes life simpler for web developers; it means even more to JS implementors.

lexical this

Another Tennent’s Correspondence Principle casualty: this default binding. When you call o.m() in JS, unless m is a bound method, this must bind to o. But for all functions in ES5 strict mode, and therefore in Harmony (based on ES5 strict), this is bound to undefined when the function is called by its lexical name (f, not o.m for o.m=f).

Binding this to undefined censors the global object, a capability leak. Apart from that fix, though, passing undefined as this is nearly useless.

Why not bind this to the same value as in the code enclosing the hash-func? Doing so will greatly reduce the need for var self=this or Function.prototype.bind, especially with the Array extras.

// today:
function writeNodes() {
  var self = this
  this.nodes.forEach(function(node) {
    self.write(node)
  })
}

// Harmony:
function writeNodes() {
  this.nodes.forEach(#(node) {
    this.write(node)
  })
}

It’s great that ES5 added bind as a standard method, but why should you have to call it all over the place? If Harmony does not address this issue, I will count that a failure.
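The gap is easy to see side by side in runnable form, using the arrow functions modern engines ship in place of hash-funcs; `WriterOld` and `WriterNew` are made-up names for illustration:

```javascript
// The var self = this workaround versus a lexically-bound callback.
function WriterOld(nodes) { this.nodes = nodes; this.out = []; }
WriterOld.prototype.writeNodes = function () {
  var self = this;                         // today's workaround
  this.nodes.forEach(function (node) {
    self.out.push(node);                   // `this` here is NOT the writer
  });
};

function WriterNew(nodes) { this.nodes = nodes; this.out = []; }
WriterNew.prototype.writeNodes = function () {
  this.nodes.forEach(node => this.out.push(node)); // lexical this: no self
};
```

Both produce the same result, but only one requires the alias dance.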


A hot-button issue with ES5: Object.freeze. Whose side are you on, Batman’s or Mr. Freeze’s? Simplistic to say the least, since even in one’s own small-world codebase, making some things immutable protects against mistakes. Never mind single- and multi-threaded data sharing and other upside.

However, calling an Object method around every object initialiser you want frozen is a drag, and even then, the JS engine has to stand on its head and spin around to figure out that all the many evaluations of such an expression could be shared (but for the violation of object identity, detectable via === and !==, that sharing would create; but that could be optimized too, at some further expense).

With # we can do better:

// today:
var point = {x: 10, y: 20}
point.equals({x: 10, y: 20})

// Harmony:
const point = #{x: 10, y: 20}
point === #{x: 10, y: 20}

Not only does the # sigil allow us to create records that are hash-cons‘ed so there is only one object identity per nearest containing relevant closure; we also get == and != (and the triple-equals forms) for free. Object content-based equality.
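Engines can hash-cons in private; in today’s JS one can only approximate the idea in library code. A sketch, where `makeRecord` is a hypothetical helper that interns structurally identical frozen objects so === approximates content equality:

```javascript
// Hash-consing sketch: structurally identical records share one frozen
// object. Toy limitation: keys the pool by JSON-able contents only.
const recordPool = new Map();

function makeRecord(obj) {
  const key = JSON.stringify(
    Object.keys(obj).sort().map(k => [k, obj[k]])  // canonical content key
  );
  if (!recordPool.has(key)) {
    recordPool.set(key, Object.freeze({ ...obj }));
  }
  return recordPool.get(key);
}

const p = makeRecord({ x: 10, y: 20 });
const q = makeRecord({ y: 20, x: 10 });  // same contents, same identity
```

A real `#{...}` form would do this in the engine, without the JSON detour or the permanently retained pool.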


You may have noticed a trend here. Gaps in the JS language in usability, semantic unity of purpose, and reliability or invariance, can be made up for by adding “hash forms” that are shorter or still short enough, better for optimization, and free from mutation hazards and other historic hair. The same goes for Arrays, via tuples:

// today:
var tuple = [1, 2, 3]
tuple[tuple.length-1] === 3
Array.prototype.compare = /*...*/
tuple.slice(0, 2).compare([1, 2]) == 0
tuple.compare([1, 2, 4]) < 0

// Harmony:
const tuple = #[1, 2, 3]
tuple[-1] === 3
tuple[0:2] === #[1, 2]
tuple < #[1, 2, 4]

Not only are the equality operators (strict and loose) based on contents and not object identity (which is not material due to the implicit freezing of these array-like objects), tuples support relational operators (< <= > >=), again based on contents not identity.

Relationals could work on records too, using enumeration order (assuming we standardize that order sanely).

Negative indexing, something unlikely to be grafted onto Array in Harmony (see “shared-object-heap interoperation” point above), is more than a minor convenience in my book. It’s also something Harmony Proxy handlers can implement. So tuples should support negative indexing even if arrays do not. And it ought to be cheap to create a tuple from an Array instance.

If negative indexing works, can slices and ranges be far behind? I’ll leave those for another post.
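A Proxy get trap can indeed supply negative indexing. A sketch using the Proxy/Reflect API that engines eventually shipped (which differs in surface from the Proxy.create design discussed here), with `tuple` a hypothetical helper:

```javascript
// Negative indexing via a Proxy get trap: t[-1] reads the last element.
// Freezing the target approximates tuple immutability.
function tuple(...elements) {
  const target = Object.freeze(elements.slice());
  return new Proxy(target, {
    get(t, prop, receiver) {
      if (typeof prop === "string") {       // skip symbols like @@iterator
        const i = Number(prop);
        if (Number.isInteger(i) && i < 0) return t[t.length + i];
      }
      return Reflect.get(t, prop, receiver);
    }
  });
}

const t = tuple(1, 2, 3);
```

Note that `t[-1]` falls to the trap precisely because "-1" is never an own property of an array, so no frozen-target invariant is violated.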

Array.prototype is full of generic methods, most of which do not mutate this. It’s tempting to want tuples to delegate to Array.prototype, with optimizations possible (as fast engines do today for dense-enough arrays). I’ll throw this idea out and confess I haven’t thought through every corner of it.

One known bug in the Array.prototype methods that construct a new array object, e.g. slice: they always make an Array instance, instead of calling new this.constructor. I agree with @Alex that we ought to fix this bug in Harmony.


Here I recap paren-free, which I have prototyped in Narcissus (invoked via njs --paren-free), but with an obvious and convenient extension:

// today:
if (x > y) alert("brace-free")
if (x > z) return "paren-full"
if (x > y) f() else if (x > z) g()

// Harmony:
if x > y { alert("paren-free") }
if x > z return "brace-free"
if x > y { f() } else if x > z { g() }

We do not want else clauses to be braced no matter what. In particular, an if statement as the else clause should not be braced, to avoid rightward indentation drift. Therefore any statement that starts with a keyword need not be braced. This is a boon for short break, continue, return, and throw statements often controlled by if heads that guard uncommon conditions.


The simple modules proposal, along with its module loaders adjunct, is the likely Harmony module system solution. Note that there is no left-hand side example written in current JS below — you’d need a preprocessor, not part of the language.

module M {
  module N = "https://N.com/N.js"
  export const K = N.K
  export #add(x, y) { x + y }
}

Modules are being prototyped in Narcissus by @littlecalculist right now. More on this at the end.


The impetus for paren-free was the poor old for-in loop. I propose we break it utterly by requiring an unparenthesized head sporting implicit let binding, and the “always use the one true iteration protocol” semantics. Also, generators based on JS1.7, based on Python. All of this is pretty much as in Python, but built on proxies.

// today:
for (var k in o) append(o[k])

// Harmony:
module Iter = "@std:Iteration"
import Iter.{keys,values,items,range}
for k in keys(o) { append(o[k]) }
for v in values(o) { append(v) }
for [k,v] in items(o) { append(k, v) }
for x in o { append(x) }
#sqgen(n) { for i in range(n) yield i*i }
return [i * i for i in range(n)]
return (i * i for i in range(n))

Migrating for-in loops into Harmony will require saying what you mean.

That "@std:Iteration"module resource locator is something I made up. It’s an “anti-URL” since it starts with @. The idea is to be able to name built-in modules without having to write URLs, and without colliding with any possible URL. You could imagine "@dom" too.

rest parameters

Instead of the bad-smelling arguments object, Harmony boasts parameter default values (not shown here) and rest parameters.

// today:
function printf(format) {
  var args = Array.prototype.slice.call(arguments, 1)
  /* use args as a real array here */
}

// Harmony:
function printf(format, ...args) {
  /* use args as a real array here */
}

With hash-funcs, default parameter values and rest parameters are all you get — no more arguments.

Should tuples become harmonious, the question arises: how does a rest parameter reflect, as an array or as a tuple? The answer may depend on how important it is to splice, reverse, or sort a rest parameter. VM implementors would love the frozen tuple answer. Most JS hackers who cared would, I suspect, favor array over tuple here.
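As shipped, rest parameters answered the reflection question with a true Array; a runnable sketch, with this simplistic `printf` body my own invention:

```javascript
// args is a genuine Array: no Array.prototype.slice.call(arguments)
// tricks, and mutating methods like shift just work.
function printf(format, ...args) {
  return format.replace(/%s/g, () => String(args.shift()));
}

const line = printf("%s + %s = %s", 1, 2, 3);
```

The hackers won this one: `args` splices, reverses, and sorts like any other array.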


A companion to rest parameters, the spread syntax allows one to expand an array’s elements as positional parameters or array initialiser elements. Finally you can write a generic constructor-invoking helper without using switch and eval:

// today:
function construct(f, a) {
  switch (a.length) {
    case 0: return new f
    case 1: return new f(a[0])
    case 2: return new f(a[0], a[1])
    default:
      var s = "new f("
      for (var i = 0; i < a.length; i++)
        s += "a[" + i + "],"
      s = s.slice(0, -1) + ")"
      return eval(s)
  }
}

// Harmony:
function construct(f, a) {
  return new f(...a)
}

Even without the 0, 1, and 2 special cases, this significant savings in lines screams "semantic gap being filled!"

UPDATE: @markm emailed to remind me that ES5 fills the gap part-way, at the price of a bound function:

// ES5:
function construct(f, a) {
  var ctor = Function.prototype.bind
             .apply(f, [null].concat(a))
  return new ctor();
}

// Harmony:
function construct(f, a) {
  return new f(...a)
}

ES5 helps, but I think it is time to prototype Harmony's spread for Firefox.next.
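Spread as it eventually shipped makes the two-line version real; a runnable sketch with a made-up Point constructor:

```javascript
// new f(...a) spreads an array into constructor arguments, replacing
// the switch/eval contortions (and the ES5 bind trick) entirely.
function construct(f, a) {
  return new f(...a);
}

function Point(x, y) { this.x = x; this.y = y; }
const p = construct(Point, [3, 4]);
```

No eval, no bound-function intermediary, and `p` is a plain Point instance.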


Often in JS you'll find yourself unpacking the properties of an object into same-named variables. Destructuring binding and assignment (prototyped since 2006 in JS1.7 in Firefox) fill this gap:

// today:
var first = sequence[0],
    second = sequence[1]
var name = person.name,
    address = person.address
    // no easy misc solution

// Harmony:
let [first, second] = sequence
const {name, address, ...misc} = person

The destructuring patterns mimic object and array initialisers, and raise the possibility of refutable matching in JS.
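Destructuring as later standardized can be tried today (the object rest pattern `...misc` arrived later still, in ES2018); the sample values below are made up:

```javascript
// Array and object destructuring, including ...misc to collect the
// remaining properties into a fresh object.
const sequence = [10, 20, 30];
const [first, second] = sequence;

const person = { name: "Ada", address: "10 Broad St", age: 36 };
const { name, address, ...misc } = person;
```

`misc` holds only the properties the pattern did not name, which is the "no easy misc solution" gap being filled.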

library missing links

Array.create, Function.create (like Function but with a leading name parameter), binary data, proxies, and weak maps.

Array.create(proto, [1, 2, 3]) // see also Array.createConstructor

Function.create(name, ...params, body);

const Point2D = new StructType({ x: uint32, y: uint32 });
const Color = new StructType({ r: uint8, g: uint8, b: uint8 });
const Pixel = new StructType({ point: Point2D, color: Color });
const Triangle = new ArrayType(Pixel, 3);

Proxy.create(handler, proto)
Proxy.createFunction(handler, call, construct)

const map = WeakMap()
map.set(obj, value)

These are just the big dogs in the Harmony standard library kennel, but worth some attention.

In particular, proxies really want weak maps. Weak maps are something JS has needed for ages in general. Have you ever kept objects in an array and searched by object identity, or else mutated objects to assign hashcodes to them? No more.

Binary data looks insanely useful, and we hope it will supplant WebGL typed arrays in due course.
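Weak maps as they shipped cover the identity-keyed metadata use-case directly; a minimal sketch, with the `tag` helper my own invention:

```javascript
// Attaching metadata to objects without mutating them to add hashcodes,
// and without an array keeping every tagged object alive forever.
const metadata = new WeakMap();

function tag(obj, value) {
  metadata.set(obj, value);  // keyed by object identity
  return obj;
}

const node = tag({}, { id: 42 });
```

Once `node` becomes unreachable, its entry can be collected — the property no array-based side table has.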


This is a long post. If you made it this far and take away anything, I hope it is Guy's "Growing a Language" meta-point. JS will be around for a very long time, and it has a chance to evolve until its users can replace TC39 as stewards of its growth. The promised land would be macros, for syntactic abstraction and extensibility. I am not holding my breath, but even without macros, the Harmony-of-my-dreams sketched here would be enough for me.

We aim to do more than dream. Narcissus is coming along nicely since it moved to github and got a shot in the arm from our excellent Mozilla Research interns last summer. We are prototyping Harmony in Narcissus (invoked via njs -H), so you can run it as an alternate <script> engine via the Zaphod Firefox add-on.

@andreasgal has a JS code generator for Narcissus in the works, which promises huge speedups compared to the old metacircular interpreter I wrote for fun in 2004. With good performance, we can actually do some usability studies of Harmony proposals, and avoid Harmony-of-our-nightmares: untested, hard-to-use committee designs.

A code-generating Narcissus has other advantages than performance. Migrating code into Harmony, what with the removal of the global object as top scope (never mind the other changes I'm proposing -- here's another one: let's fix typeof), needs automated checking and even code rewriting. DoctorJS uses a static analysis built on top of Narcissus, which could be used to find flaws, not just migration changes. Self-hosted parsing, JS-to-JS code generation, and powerful static analysis come together to make a pretty handy Harmonizer tool. So we're going to build that, too.

More on Narcissus and Zaphod as they develop. When the time is right, we will need users -- lots of them. As always, your comments are welcome.



The tl;dr version

Krusty the ventriloquist

<Krusty>So, you kids want CoffeeScript, do you?</Krusty>

<script type="harmony">   // placeholder MIME type

if year > 2010 {

for i in iter {           // i is a fresh let binding!

while lo <= hi {
    let mid = (lo + hi) / 2
    // binary search blah blah blah

... return [i * i for i in range(n)]   // array comprehension


No parentheses around control structure “heads”. If Go can do it, so can JS. And yes, I’m using automatic semi-colon insertion (JSLint can suck it).

There are open issues (are braces required around bodies?) but this is the twitter-friendly section. More below, after some twitter-unfriendly motivation.


We had a TC39 meeting last week, graciously hosted at Apple with Ollie representing. Amid the many productive activities, Dave presented iterators as an extension to proxies.

The good news is that the committee agreed that some kind of meta-programmable iteration should be in the language.


Proxies had already moved to Harmony Proposal status earlier this year, but with an open issue: how to trap for (i in o) where o is a proxy with a huge (or even an infinite — rather, a lazily created and indefinite) number of properties.

js> var handler = {
    enumerate: function () { return ["a", "b", "c"]; }
};
js> var proxy = Proxy.create(handler);
js> for (var i in proxy)
    print(i);
a
b
c
The proxy handler’s fundamental enumerate trap eagerly returns an array of all property names “in” the proxy, coerced to string type if need be. Each string is required to be unique in the returned array. But for a large or lazy object, where the trapping loop may break early, eagerness hurts. Scale up and eagerness (never mind the uniqueness requirement) is fatal. TC39 agreed that a lazy-iteration derived (optional) trap was wanted.

js> var handler = {
    iterate: function () { for (var i = 0; i < 1e9; i++) yield i; }
};
js> var proxy = Proxy.create(handler);
js> for (var i in proxy) {
    if (i == 3) break;
    print(i);
}
0
1
2

The iterators strawman addressed this use-case by proposing that for-in would trap to iterate if present on the handler for the proxy referenced by o, in preference to trapping to enumerate.

js> var handler = {
    enumerate: function () { return ["a", "b", "c"]; },
    iterate: function () { for (var i = 0; i < 1e9; i++) yield i; }
};
js> var proxy = Proxy.create(handler);
js> for (var i in proxy) {
    if (i == 3) break;
    print(i);
}
0
1
2

To avoid switching from enumeration to iteration under a single for-in loop, once the loop has started enumerating a non-proxy, if a proxy is encountered on that object’s prototype chain, the prototype proxy’s enumerate trap will be used, not its iterate trap.

js> var handler = {
    has: function (name) { return /^[abc]$/.test(name); },
    enumerate: function () { return ["a", "b", "c"]; },
    iterate: function () { for (var i = 0; i < 1e9; i++) yield i; }
};
js> var proxy = Proxy.create(handler);
js> var obj = Object.create(proxy);
js> for (var i in obj) {
    print(i);
}
a
b
c

Enumeration walks the prototype chain, and this is why a proxy might want both enumerate and iterate.


What all this means: you can implement Pythonic iterators with proxies, and return a sequence of arbitrary values to a for-in loop that’s given the proxy directly (not on a prototype chain of a non-proxy object, as noted above). A large/lazy proxy would trap iterate instead of enumerate and return string keys, but other iterator-proxies could return Fibonacci numbers, integer ranges, or whatever the proxy implementor and consumer want. This was an intended part of the package deal.

js> function fib(n) {
    var i = 0;
    var a = 0, b = 1;
    return {
        next: function () {
            if (++i > n)
                throw StopIteration;
            [a, b] = [b, a + b];
            return a;
        }
    };
}
js> var handler = {iterate: function () { return fib(10); } };
js> var proxy = Proxy.create(handler);
js> for (var i in proxy)
    print(i);
1
1
2
3
5
8
13
21
34
55

(JS1.7 and above, implemented in both SpiderMonkey and Rhino, prefigured this proposal by supporting an unstratified iteration protocol based on Python 2.5. This JS1.7 Iterator extension is fairly popular in spite of some design flaws, and from the exercise of implementing and shipping it we’ve recognized those flaws and fixed them via proxies combined with the iterators strawman.)

The bad news is that the committee did something committees often do: try to compromise between divergent beliefs or subjective value theories.

In this case the compromise was based on the belief that for-in should not become the wanted meta-programmable iteration syntax. The argument is that for-in must always visit string-typed keys of the object, or at least whatever strings the accepted proxy enumerate trap returns in an array. If a Harmony proxy could somehow be enumerated by pre-Harmony for-in-based code, non-string values in the iteration might break the old code.

(The counter-argument is that once you let the proxy handler trap enumerate, a lot can change behind the back of old for-in-based code; also, enumeration is an underspecified mess. But these points do not completely overcome the objection about potential breakage in old code.)

Fear of Change

To fend off such breakage, we could make for-in meta-programmable only in Harmony code — any loop loaded under a pre-Harmony script tag type would not iterate a proxy.

This opt-in protection probably does not resolve the real issue, which is whether syntax can have its semantics changed much (or at all) in a mature language such as JS, which is being evolved via mostly-compatible standard versions in multi-year cycles.

I acknowledged during the meeting that we would not make progress without trying to agree on new syntax. This was too optimistic but I wanted to discover more about the divergent beliefs that made extending for-in via proxies a showstopper.

A quick whip-round the room with an empty cup managed to net us loose change from latter-day Java and C++:

for (var i : x)   // or let i, or just i for any lvalue i

as our meta-programmable “new syntax”. Bletch!

Not to worry. For-colon is probably not going to fly for some reasons I raised on es-discuss, but it also should die a deserved death as a classic bad compromise forged in the heat of a committee meeting.

The difficulty before us is precisely this how-much-to-change question.

ES5 strict mode already changes runtime semantics for existing syntax (eval of var no longer pollutes the caller’s scope; arguments does not alias formal parameters; a few others), for the better. Unfortunately, developers porting to "use strict" must test carefully, since these are meaning shifts, not new early errors.
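The arguments-aliasing shift is easy to see directly. A minimal sketch (the non-strict behavior assumes the surrounding code itself runs non-strict):

```javascript
function sloppy(x) {
  arguments[0] = 42;
  return x;  // pre-ES5 aliasing: x tracks arguments[0] (non-strict only)
}

function strictly(x) {
  "use strict";
  arguments[0] = 42;
  return x;  // ES5 strict: no aliasing, x keeps its argument value
}

console.log(strictly(1));  // 1
console.log(sloppy(1));    // 42 when this code itself runs non-strict
```

No early error flags the difference; only testing does, which is the porting hazard noted above.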

My point is that syntactic and semantic change has happened over the last 15 years of JS, it is happening now with ES5 strict, and it will happen again.

Change is Coming

We believe that future JS, the Harmony language, must include at least one incompatible change to runtime semantics: no more global object at the top of the scope chain. Instead, programmers would have lexical scope all the way up, with the module system for populating the top scope. By default, the standard library we all know would be imported; also by default in browsers, the DOM would be there.

Can the world handle another incompatible change to the semantics of existing syntax, namely the for-in loop?

There are many trade-offs.

On the one hand, adding new syntax ensures no existing code will ever be confused, even if migrated into Harmony-type script. On the other, adding syntax hurts users and implementors in ways that combine to increase the complexity of the language non-linearly. The chances of failing to standardize, and of making mistakes during standardization, go up too.

What’s more, it will be a long time before anyone can use the new syntax on the web, whereas for-in and proxies implementing well-behaved iterators could be used much sooner, with fallback if (!window.Proxy).

Ultimately, it’s a crap shoot:

  • Play it safe, enlarge the language, freeze (and finally standardize, ahem) the semantics of the old syntax, and try to move users to the new syntax? or
  • Conserve syntax, enable developers to reform the for-in loop from its enumeration-not-iteration state?

All this is prologue. Perhaps the “play it safe” position is right. More important: what if new syntax could be strictly more usable and have better semantics?

New Clothes and Body

Here’s my pitch: committees do not design well, period. Given a solid design, a committee with fundamental disagreements can stall or eviscerate that design out of conservatism or just nay-saying, until the proposal is hardly worth the trouble. At best, the language grows larger more quickly, with conservative add-ons instead of holistic rethinkings.

I’m to blame for some of this, since I’ve been playing the standards game with JS. Why not? It seems to be working, and the alternatives (ScreamingMonkey, another language getting into all browsers) are nowhere. But I observe that even for Harmony, and notably for ES5, much of the innovation came before the committee got together (getters, setters, let, destructuring). Other good parts of ES5 and emerging standards came from strong individual or paired designers (@awbjs, @markm, @tomvc).

And don’t get me wrong: sometimes saying “no” is the right thing. But in a committee tending a mature but still living programming language, it’s too easy to say “no” without any “but here’s a better way” follow-through. To be perfectly clear, TC39 members generally do provide such follow-through. But we are still a committee.

I want to break out of this inherently clog-prone condition.

So, given the concern about changing the meaning of for-in, and the rise of wrist-friendly “unsyntax” (Ruby, Python, CoffeeScript) over the shifted-keystroke-burdened C-family syntax represented by JS, why not make opting into Harmony enable new syntax with the desired meta-programmable semantics?

Paren-Free Heads

It would be a mistake to change syntax (and semantics) utterly. VM implementors and web developers having to straddle both syntaxes would rightly balk. There will be commerce between Harmony and pre-Harmony scripts, via the DOM and the shared JS object heap. But can we relax syntactic rules a bit, and lose two painfully-shifted, off-QWERTY-home-row characters, namely ( and ), from control structure heads?

for i in iter {
    // i is a value of any type;

Here’s your new syntax with new semantics!

We can simplify the iterator strawman too. If you want to iterate and not enumerate, use the new syntax. If you want to iterate keys (both “own” and any enumerable unshadowed property names on prototypes), use a helper function:

for i in keys(o) {
    // i is a string-typed key

The old-style for (var i in o)... loop only traps to enumerate. Large/lazy proxies? Use the new for k in keys(o) {...} form.
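A keys helper is easy to write. Here is a sketch in today's JS, using a generator and for-of in place of the proposed paren-free for-in; the name matches the proposal above, but the exact semantics here are my guess.

```javascript
// keys(o): yields the string-typed enumerable property names of o,
// own and inherited, in the order for-in would visit them.
function* keys(o) {
  for (var k in o)
    yield k;
}

var proto = { a: 1 };
var obj = Object.create(proto);
obj.b = 2;

var seen = [];
for (var k of keys(obj))
  seen.push(k);
console.log(seen);  // ["b", "a"]: own property first, then inherited
```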

Are the braces required? C has parenthesized head expressions and unbraced single-statement bodies. Without parens, a C statement such as

if x

would be ambiguous (don’t try significant newlines on me — I’ve learned my lesson :-/). You need to mandate either parens around the head, or braces around the body (or both, but that seems like overkill).

So C requires parens around head expressions. But many style guides recommend always bracing, to ward off dangling else. Go codifies this fully, requiring braces but relieving programmers from having to parenthesize the head expression.

I swore I’d never blog at length about syntax, but here I am. Syntax matters, it’s programming language UI. Therefore it needs to be improved over time. JS is overdue for an upgrade. So my modest proposal here is: lose the head parens, require braces always.

You could argue for optional braces if there’s no particular ambiguity, e.g.

if foo

But that will be a hard spec to write, a confusing spec to read, and educators and gurus will teach “always brace” anyway. Better to require braces.

Pythonic significant whitespace is too great a change, and bad for minified/crunched/mangled web scripts anyway. JS is a curly-brace language and it always will be.

Implicit Fresh Bindings

Another win: the position between for and in is implicitly a let binding context. You can destructure there too, but whatever names you bind, they’ll be fresh for each iteration of the loop.

This allows us to solve an old and annoying closure misfeature of JS:

js> function make() {
    var a = [];
    for (var i = 0; i < 3; i++)
        a.push(function () { return i; });
    return a;
}
js> var a = make();
js> print(a[0]());
3
js> print(a[1]());
3
js> print(a[2]());
3

Changing var to let in the C-style three-part for loop does not help.
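Pre-Harmony, the workaround is to capture each value of i with an immediately-invoked function; the standard idiom, sketched here:

```javascript
function make() {
  var a = [];
  for (var i = 0; i < 3; i++) {
    // The immediately-invoked function gives each closure its own j.
    (function (j) {
      a.push(function () { return j; });
    })(i);
  }
  return a;
}

var a = make();
console.log(a[0](), a[1](), a[2]());  // 0 1 2
```

It works, but the boilerplate is exactly the kind of tax a fresh binding per iteration removes.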

But for-in is different, and in Harmony we (TC39) believe it should make a fresh let binding per iteration. I’m proposing that the let be implicit and obligatory. And of course the head is paren-free, so the full fix looks like this:

js> function make() {
    var a = [];
    for i in range(3) {
        a.push(function () { return i; });
    }
    return a;
}
js> var a = make();
js> print(a[0]());
0
js> print(a[1]());
1
js> print(a[2]());
2

Part of the Zen of Python: “Explicit is better than implicit.” Of course, Python has implicit block-scoped variable declarations, so this is more of a guideline, or a Zen thing, not some Western-philosophical absolute ;-). Having to declare an outer or global name in Python is therefore an exception, and painful. Like the sound of one hand slapping your face.

Of course JS shouldn’t try to bind block-scoped variables implicitly all over the place, as Python does; once again, that would be too great a change. But an implicit let-style declaration in the for-in loop head wins both as a sensible default and as a way to promulgate the closure-capture fix.


When we implemented iterators and generators in JS1.7, I also threw in array comprehensions:

js> squares = [i * i for (i in range(10))];
[0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
js> odds = [i for (i in range(20)) if (i % 2)]
[1, 3, 5, 7, 9, 11, 13, 15, 17, 19]

At first I actually implemented paren-free heads for the for-in parts in the square brackets, but when I got to the optional trailing if I balked. Too far from JS, and in practical terms, a big-enough refactoring speed-bump for anyone sugaring a for-in loop as a comprehension. But paren-free Harmony rules:

js> squares = [i * i for i in range(10)];
[0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
js> odds = [i for i in range(20) if i % 2]
[1, 3, 5, 7, 9, 11, 13, 15, 17, 19]

The same win applies to generator expressions.
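In today's JS, a generator expression such as (i * i for i in range(n)) corresponds roughly to an explicit generator function; a sketch, with range being the same helper assumed by the examples above:

```javascript
// A range() helper, as assumed by the comprehension examples above.
function* range(n) {
  for (var i = 0; i < n; i++)
    yield i;
}

// Roughly what a generator expression (i * i for i in range(n))
// would desugar to: lazy, one square at a time.
function* squaresGen(n) {
  for (var i of range(n))
    yield i * i;
}

console.log([...squaresGen(4)]);  // [0, 1, 4, 9]
```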


Thanks to TC39 colleagues for their general excellence — we’re a committee but I’ll try not to hold that against any of us.

Thanks especially to @AlexRussell and @arv, who at last week’s meeting brought some attitude about improving syntax and semantics in Harmony that I fought at first (for fear of the committee opening up all design and compatibility constraints and failing to reach Harmony). Their attitude stimulated me to think outside the box, and outside the parens.

Some of you may be thinking “this is crazy!” Others of you will no doubt say “more! more!” I have some other thoughts, inspired by TC39 conversations, that could help make Harmony a better language without it being over-compatible warm beer, but I’ll save them for another post.

My point here is not to rush another syntax strawman through TC39, but to stimulate thinking. I’m serious about paren-free FTW, but I’m more serious about making Harmony better through judicious and holistic re-imaginings, not only via stolid committee goal-tending.


Proxy Inception

After marinating for a few months, my JSConf.eu slides:

(Mobile/No-Flash version)

These are based directly on the excellent work of Mark Miller and Tom Van Cutsem, who developed the harmony:proxies proposal that is now approved for the next major iteration of the JavaScript standard (ECMA-262, probably edition 6 but we’ve learned the hard way not to number prematurely — anyway, approved for “ECMAScript Harmony” [my original Harmony-coining post]).

Harmony Proxies are already prototyped in Firefox 4 betas, thanks to Andreas Gal.

When I reached the “meta-level shifting” slide:


someone in the audience tweeted about how my talk was like Inception (github-sourced simulator). Meta-meta dreams within dreams (warning: meta-to-the-4th-shifting leads to Limbo).

The money-shot slide in my view is:


which depicts how Proxies finally level the playing field between browser implementors using burned-into-browser-binaries C++ and web developers using downloaded JS.

It’s hard to overstate how this matters. The DOM (IE’s for sure, but all of them, back to the original I hacked in Netscape 2) suffers from its “VM territory” privileges, which have been abused to make all kinds of odd-ball “host objects”. Proxies both greatly reduce the weirdness of host objects and let JS hackers emulate and even implement such objects.

Novice JS hackers and all JS programmers happy at the base level of the language need not worry about the details of Proxies. Proxies cannot break the invariants that keep the JS lucid dream unfolding on stage. Specifically, you can’t hack traps onto an existing non-proxy object — you can only create a new proxy and start using it afresh, perhaps passing it off as a preexisting kind of object that it emulates [1].

But when you need to go backstage of the dream and change the rules without breaking the dreamer’s illusion, by interceding on every get, set, call, construct, etc., then Proxies are indispensable.
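For readers on newer engines: the slides use the Proxy.create handler API, but the same backstage get intercession can be sketched with the direct-proxy API (new Proxy) that eventually replaced it in the standard.

```javascript
// Log every property get without the consumer noticing anything
// beyond ordinary object behavior.
const target = { x: 1, y: 2 };
const log = [];

const proxy = new Proxy(target, {
  get(obj, name, receiver) {
    log.push(String(name));                // backstage bookkeeping
    return Reflect.get(obj, name, receiver);  // preserve the illusion
  }
});

console.log(proxy.x + proxy.y);  // 3: the dream unfolds undisturbed
console.log(log);                // ["x", "y"]
```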

Firefox 4 is using Proxies to implement all of its security wrappers.

Long-time SpiderMonkey fans will ask “why no __noSuchMethod__” (or: why not also have a noSuchMethod or invoke trap, or a flag to get telling when it is trapping a get for the entire callee part of a call expression)? The short answer is to keep the set of handler traps minimal in terms of JS semantics (modulo scalability), which do not include “invoke-only methods”. The longer answer is on es-discuss.


[1] Inside the engine, a clever trick from Smalltalk called becomes is used to swap a newborn Proxy and an existing object that has arbitrarily many live references. Thus an object requiring no behavioral intercession can avoid the overhead of traps until it escapes from a same-origin or same-thread context, and only if it does escape through a barrier will it become a trapping Proxy whose handler accesses the original object after performing access control checks or mutual exclusion.

The local jargon for such object/Proxy swapping is “brain transplants”.

Should JS have bignums?

jwz finally learns some JS and picks at an old scab that had almost healed. I reply in various comments. I include some little-known, kind-of-funny (not always ha-ha funny) history along the way to set several records straight.

The issue before us now is whether to add value types to JS, perhaps by extending proxies, so you can implement non-reference-semantics objects with operators (because without operators, what’s the point?); or just add bignums; or do nothing.

Comments welcome (to keep up with Akismet-gaming spammers — anyone have a better WP plugin to stop comment spam?).