The Open Web and Its Adversaries

A small presentation I gave at SXSW asks what it means to be “Open” in the sense of open standards, open source, and a web whose major content formats are not controlled by a single vendor (I credited Nat Friedman’s amusing keynote from last year’s Free Software & Open Source Symposium for inspiring the OED historical “Open” example).

Ted Leung (see followups) blogged about this topic already, specifically in connection with Adobe.

Ted refers to Anne Zelenka’s Why Open is Good and How Open Could Be Good for Flash post, which is worth reading, including the comments. In one comment, John Dowdell of Adobe asks whether the essence of “Open” is the ability to fork. Anne notes the loaded connotation of “fork” but agrees that John’s analysis is “not incorrect.”

I would go further. Forking is an extreme point in a continuum of options that exist with open source. The option to fork must exist as a feedback mechanism, but it need not be used in order for users to gain benefits not available with closed source and proprietary standards. Forking can be the right thing, or it can be a kind of mutually-assured-destruction option that keeps everyone acting in the interest of not forking.

Prior to even approaching a fork, open standards and open source both empower user-driven innovation. This is old news to the Mozilla user community, who have been building and feeding back innovations for the life of the project, increasing over time to include Firefox add-ons and GreaseMonkey user scripts. (BTW, I am pushing to make add-on installation not require a restart in Firefox 3, and I intend to help improve and promote GreaseMonkey security in the Firefox 3 timeframe too.) Without forking, even to make private-label Firefoxes or FlashPlayers, users can innovate ahead of the vendor’s ability to understand, codify, and ship the needed innovations.
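To make the user-innovation point concrete, here is a minimal sketch of a GreaseMonkey-style user script. Everything in it is illustrative — the script name, the namespace URL, and the “highlight external links” behavior are hypothetical, not a real published script — but the shape is the standard one: a metadata block, then plain JS run against the page’s DOM. The matching logic is factored into a pure helper so it stands on its own outside a browser.

```javascript
// ==UserScript==
// @name         Highlight external links (illustrative sketch)
// @namespace    http://example.com/userscripts
// @include      *
// ==/UserScript==

// Pure helper: does this link URL leave the given host?
function isExternal(href, host) {
  var m = /^https?:\/\/([^\/]+)/.exec(href);
  return !!m && m[1] !== host;
}

// DOM pass, guarded so the helper above also works outside a browser.
if (typeof document !== "undefined") {
  var links = document.getElementsByTagName("a");
  for (var i = 0; i < links.length; i++) {
    if (isExternal(links[i].href, location.host)) {
      links[i].style.borderBottom = "1px dashed green";
    }
  }
}
```

Nothing here required Mozilla to understand, codify, or ship anything: the user decides a page should behave differently, and it does.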

Consider just the open standards that make up the major web content languages: HTML, CSS, DOM, and JS. These mix in powerful ways that have no correspondences in something like a Flash SWF. There is no DOM built inside the FlashPlayer for a SWF; there’s just a display list. There’s no eval in ActionScript, and ActionScript features a strict mode that implements a static type checker (with a few big loopholes for explicit dynamic typing). You can’t override default methods or mutate state as freely as you can in the browser content model. Making a SWF is more like making an ASIC: it’s “hardware”, as Steve Yegge argues.
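A tiny plain-JS illustration of what “override default methods and mutate state freely” means in practice — nothing below is Mozilla-specific, it is just the dynamism that a sealed, statically checked SWF class model rules out:

```javascript
// Override a built-in method at runtime: monkey-patching that AS3's
// strict mode and sealed classes disallow.
var originalJoin = Array.prototype.join;
Array.prototype.join = function (sep) {
  return "[" + originalJoin.call(this, sep) + "]";
};
var wrapped = [1, 2, 3].join("-");   // "[1-2-3]"
Array.prototype.join = originalJoin; // put the default back

// And eval compiles and runs new source text on the fly -- there is no
// ActionScript equivalent.
var four = eval("2 + 2");            // 4
```

Whether this flexibility is wise in any given program is a separate question; the point is that the browser content model makes it available to page authors and user-script writers alike.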

This is not necessarily a bad thing; it’s certainly different from the Open Web.

I assert that there is something wrong with web-like “rich” formats that aren’t hyperlinkable or indexable by search engines. You could argue that these bugs could be fixed, and Flash is wisely becoming more URI-addressable and view-source-able over time. But it still ain’t the Web: it is not hand-authored, easily tweaked incrementally, or copy-and-paste-able.

Given the stagnation of the web under IE and the failure of Java on the client, there’s a place for Flash, and I’m sure Microsoft would argue that means there’s a place for WPF/E (though I don’t see the point: either bet on the Web, or bet against it in a transcendently different way from trying to overtake Flash). If I were VP of Engineering at a 10,000-person company, I would want the security blanket of a C-like syntax and a static type system in a well-known, big-bell-curve language like AS3, C#, or Java.

And indeed AS3, C#, and Java all are marketed to big companies with hordes of “hardware” hackers creating fixed-point UIs. Dead UIs, in other words, not enlivened by user innovation. And sorry to say (to Yahoo! and Google, who may be exceptions to the rule), the big innovations don’t come from the big dumb companies.

Meanwhile, the Web is alive precisely because of the distributed extensibility of its content languages, both on the server side and in the client JS/DOM world, not to mention add-ons and GreaseMonkey.

Dare Obasanjo argues that developers crave single-vendor control because it yields interoperation and compatibility, even forced single-version support. Yet this is obviously not the case for anyone who has wasted time getting a moderately complex .ppt or .doc file working on both Mac and Windows. It’s true for some Adobe and Microsoft products, but not all, so something else is going on. And HTML, CSS, DOM, and JS interoperation has gotten better over time, not worse. TCP/IP, NFS, and SMB interoperation is great by now. The assertion fails, and the question becomes: why are some single-vendor solutions more attractive to some developers? The answers are particular to each case, not implied simply by the single-vendor condition.

Implicit in my writing is the assumption (conclusion, really) that browsers can adopt the necessary advanced rendering and faster virtual-machine programming-language support that the “rich client platforms” boast (or promise in version 2.0). For instance, we will support an OpenGL-ES binding for canvas in the near term, in Firefox 3 if vlad and I have anything to say about it. There’s no technical reason this can’t be interoperably supported by other browsers in the near term.

Can the Open Web keep improving enough to stave off the Closed Web? Of course, Mozilla is here to help, and we want every browser to support the nascent standards we support. But at some point the answer does depend on Microsoft upgrading IE without forking web standards from what Firefox, Safari, and Opera agree on now and in their upcoming releases, which are not yet implemented in IE7.

This will mean incompatibility for old IE content, or more painful version checking in the IE engine, or a mix of both. Note how Flash 9 includes both the AS2 and AS3 (Tamarin) engines, since Adobe could not keep compatibility in one engine. Gecko has had to carry more de-facto quirks implementation than we might prefer; we sucked it up to gain market share. IE can do likewise, for a change, to avoid losing more market share.

Such a brute-force versioning approach could work well for Microsoft, as it did for Adobe in Flash 9. Microsoft can better afford to put more megabytes on disk via Windows Update than can browser vendors such as Mozilla and Opera, who rely on voluntary downloads.

Anyway, I’m committed to improving the Open Web, even at the expense of Firefox market share if it comes to it (but we are still growing, and Firefox still has the second-place leverage to improve the Web quickly). If we should fail and just make a fancier browser whose nascent standards are not adopted by IE, at least we tried. The alternative is to renounce innovation and let the proprietary rich clients move the Closed Web forward. That might be convenient for some (big company) developers. It’s not in the interest of the larger developer community, or of the public, in my book.

Let me know what you think.