A small presentation I gave at SXSW asks what it means to be “Open” in the sense of open standards, open source, and a web whose major content formats are not controlled by a single vendor. (I credited Nat Friedman’s amusing keynote from last year’s Free Software & Open Source Symposium for inspiring the OED historical “Open” example.)
Ted Leung (see followups) blogged about this topic already, specifically in connection with Adobe.
Ted refers to Anne Zelenka’s Why Open is Good and How Open Could Be Good for Flash post, which is worth reading, including the comments. In one comment, John Dowdell of Adobe asks whether the essence of “Open” is the ability to fork. Anne notes the loaded connotation of “fork” but agrees that John’s analysis is “not incorrect.”
I would go further. Forking is an extreme point in a continuum of options that exist with open source. The option to fork must exist as a feedback mechanism, but it need not be used in order for users to gain benefits not available with closed source and proprietary standards. Forking can be the right thing, or it can be a kind of mutually-assured-destruction option that keeps everyone acting in the interest of not forking.
Prior to even approaching a fork, open standards and open source both empower user-driven innovation. This is old news to the Mozilla user community, who have been building and feeding back innovations for the life of the project, increasing over time to include Firefox add-ons and GreaseMonkey user scripts. (BTW, I am pushing to make add-on installation not require a restart in Firefox 3, and I intend to help improve and promote GreaseMonkey security in the Firefox 3 timeframe too.) Without forking, even to make private-label Firefoxes or FlashPlayers, users can innovate ahead of the vendor’s ability to understand, codify, and ship the needed innovations.
Consider just the open standards that make up the major web content languages: HTML, CSS, DOM, JS. These mix in powerful ways that have no correspondences in something like a Flash SWF. There is no DOM built inside the FlashPlayer for a SWF; there’s just a display list. There’s no eval in ActionScript, and ActionScript features a strict mode that implements a static type checker (with a few big loopholes for explicit dynamic typing). You can’t override default methods or mutate state as freely as you can in the browser content model. Making a SWF is more like making an ASIC — it’s “hardware”, as Steve Yegge argues.
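To make the contrast concrete, here is a minimal sketch (mine, not from the talk) of two dynamic moves the browser content model permits that a strict, statically checked mode like AS3’s rules out: mutating a built-in prototype at runtime, and evaluating code assembled as a string.

```javascript
// Two dynamic moves ordinary browser JS allows freely.

// 1. Mutate a built-in prototype at runtime ("monkey-patching"):
String.prototype.shout = function () {
  return this.toUpperCase() + "!";
};
console.log("open web".shout()); // "OPEN WEB!"

// 2. Evaluate code built as a string at runtime:
var expr = "2 + 2";
console.log(eval(expr)); // 4
```

User scripts and add-ons lean on exactly this kind of late-bound flexibility to innovate ahead of the vendor.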
This is not necessarily a bad thing; it’s certainly different from the Open Web.
I assert that there is something wrong with web-like “rich” formats that aren’t hyperlink-able or indexable by search engines. You could argue that these bugs could be fixed, and Flash is wisely becoming more URI-addressable and view-source-able over time. But it still ain’t the Web. It is not hand-authored, easily tweaked incrementally, copy-and-paste-able. It’s hardware.
Given the stagnation of the web under IE and the failure of Java on the client, there’s a place for Flash, and I’m sure Microsoft would argue that means there’s a place for WPF/E (but I don’t see the point: either bet on the Web, or bet against it in a transcendently different way from trying to overtake Flash). If I were VP of Engineering in a 10,000 person company, I would want the security blanket of the C-like syntax and a static type system for a well-known, big-bell-curve language like AS3, C#, or Java.
And indeed AS3, C#, and Java all are marketed to big companies with hordes of “hardware” hackers creating fixed-point UIs. Dead UIs, in other words, not enlivened by user innovation. And sorry to say (to Yahoo! and Google, who may be exceptions to the rule), the big innovations don’t come from the big dumb companies.
Meanwhile, the Web is alive precisely because of the distributed extensibility of its content languages, both on the server side and in the client JS/DOM world, not to mention add-ons and GreaseMonkey.
Dare Obasanjo argues that developers crave single-vendor control because it yields interoperation and compatibility, even forced single-version support. Yet this is obviously not the case for anyone who has wasted time getting a moderately complex .ppt or .doc file working on both Mac and Windows. It’s true for some Adobe and Microsoft products, but not all, so something else is going on. And HTML, CSS, DOM and JS interoperation is better over time, not worse. TCP/IP, NFS, and SMB interoperation is great by now. The assertion fails, and the question becomes: why are some single-vendor solutions more attractive to some developers? The answers are particular, not general and implied simply by the single-vendor condition.
Implicit in my writing is the assumption (conclusion, really) that browsers can adopt the necessary advanced rendering and faster virtual-machine programming language support that the “rich client platforms” boast (or promise in version 2.0). For instance, we will support an OpenGL-ES binding for canvas in the near term, in Firefox 3 if vlad and I can help it. There’s no technical reason this can’t be interoperably supported by other browsers in the near term.
Can the Open Web keep improving enough to stave off the Closed Web? Of course, Mozilla is here to help, and we want every browser to support the nascent standards we support. But at some point the answer does depend on Microsoft upgrading IE without forking web standards from what Firefox, Safari, and Opera agree on now and in their upcoming releases, which are not yet implemented in IE7.
This will mean incompatibility for old IE content, or more painful version checking in the IE engine, or a mix. Note how Flash 9 includes both the AS2 and AS3 (Tamarin) engines, since Adobe could not keep compatibility in one engine. Gecko has had to put up with more de-facto quirks implementation than we might prefer; we sucked it up to get market share. IE can do likewise for a change, to avoid losing more market share.
Such a brute-force versioning approach could work well for Microsoft, as it did for Adobe in Flash 9. Microsoft can most afford to put more megabytes on disk via Windows Update than can browser vendors such as Mozilla and Opera, who rely on voluntary downloads.
Anyway, I’m committed to improving the Open Web, even at the expense of Firefox market share if it comes to it (but we are still growing, and Firefox still has the second-place leverage to improve the Web quickly). If we should fail and just make a fancier browser whose nascent standards are not adopted by IE, at least we tried. The alternative is to renounce innovation and let the proprietary rich clients move the Closed Web forward. That might be convenient for some (big company) developers. It’s not in the interest of the larger developer community, or of the public, in my book.
Let me know what you think.
Yes agreed of course…
We need some sort of alert system, like the kind they have in CA for missing persons. (I don’t remember the name anymore; haven’t lived there in a while.) Anytime things start looking a little “proprietary-ish,” an alert goes out and a super strike team of bloggers/critics sets out to see justice done. 😉
I’ve had to maintain a few native desktop apps across at least 3 platforms myself, and I hope to god no one tries to put me there again… I won’t go without a fight, at least.
I was glad to see your presentation Tuesday; I was also glad not to have slept in. Your talk and the Q&A afterward were both refreshingly candid, especially when contrasted with Chris Wilson’s non-answers.
Those of us who support the ideal of an Open Web appreciate your efforts. Look forward to more Road Map updates.
For standardista sitedevs and those not inclined to sign up for TypeKey, I’ve summarized this post over at the Web Standards Project (follow the author link for this comment). Try to keep it nice.
BTW, I used TypeKey without much thought simply because it was a supported option with the MT set-up that mozillazine.org hosts, and it helped (a lot!) with blog spam. Is it considered evil? Is there a better system like it, not considered evil?
/be
I’m all for a standards-based open web as the platform, but I wish there were more choice in the programming model. I programmed in client- and server-side scripting languages for years, but these days I prefer a nice strongly typed OO language like C# or Java for my development. Right now I’m forced to write my client and server sides in different languages, which I wish weren’t the case. I mean, if JS runs inside a virtual machine, why can’t we create compilers that let other languages target the VM and deliver bytecode, so that the client doesn’t even have to deal with a future proliferation of languages?
I mean every platform out there allows developers to choose the language they like best, except for browser land, where it’s JS or you don’t get to play. Any chance that there is a future in rich browser development in which the open standard is the VM and everyone can target it in their own fashion via bytecode?
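Roughly, I imagine something like this hypothetical output from a strongly-typed-source-to-JS compiler in the GWT vein — the class, names, and output shape here are purely illustrative, not any real compiler’s actual emission:

```javascript
// Illustrative only: roughly what a Java-to-JS compiler might emit
// for a tiny typed class (not actual GWT output).
//
// Hypothetical Java source:
//   class Counter {
//     private int n = 0;
//     int next() { return ++n; }
//   }
function Counter() { this.n = 0; }
Counter.prototype.next = function () { return ++this.n; };

var c = new Counter();
console.log(c.next(), c.next()); // 1 2
```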
…There’s nothing evil about TypeKey, just that it’s Yet Another Passkey to Remember – and something of a latecomer in my opinion. (Note to self: TypeKey passkey and Blogger passkey are identical.)
What [sdether, correction] is asking for is *exactly* what Morfik is delivering. You write code in high-level OO languages and it gets compiled to JavaScript.
True, it is a commercial product but totally based on the standards of the Open Web.
If the WHATWG is open, and we’re wondering what Apple is up to, where does this message fit in:
https://lists.whatwg.org/pipermail/whatwg-whatwg.org/2007-March/010129.html
Doesn’t seem very smurfy.
sdether: there won’t be a standard bytecode, on account of at least (a) too much patent encrustation and (b) overt differences in VM architectures. We might standardize binary AST syntax for ES4 in a later, smaller ECMA spec — I’m in favor. For the near term, given the installed base, generating JS, as Morfik (thanks for the reminder, Mauricio), GWT, OpenLaszlo, and more and more tools do, is attractive.
In the medium term, ES4 (JS2) will ship and achieve adoption in the market thanks to modern browsers having auto-update features. Microsoft could do something to try to quash this, for sure, but things would get ugly fast.
/be
Sayrer: not surprising, given the mess that is patent law in the U.S. and much of the rest of the world. Apple needs a patent policy backed by a standards body that is a legal entity, which the WHATWG is not.
So in the best case, Apple reserves its rights until canvas makes it to the W3C’s HTML WG. I hope you can help by participating in the HTML WG.
/be
Your presentation highlighted the fact that some parties may be going for a less expressive Web, in order to ensure a market for their “rich” technologies.
My concern with the HTML WG is that sensible standards values like compromise and simplicity will be the excuse to hobble the Web, and the W3C’s vocal minority of XML/XHTML/RDF standards wonks will create the acrimony necessary to justify a lowest-common-denominator spec, in the name of consensus.
It would be good to see evidence that the group will avoid that hazard. Starting with the WHATWG work is a great way to do that. Starting from scratch is not.
Brendan, Mauricio: Being fixated on VMs, I had never come across Morfik, never looked into GWT (thinking it was just another widget set), and the last time I looked at OpenLaszlo their DHTML support was just starting. In the end, if I can author server and client in the same language, I don’t care how it gets encoded to run on the client, so “compiling” into JS is good enough for me. Thanks for those pointers; looks like I’ve got some toolkits to play with.
“why can’t we create compilers that let other languages address the VM and deliver bytecode?”
I think in the end, that is the future of web applications. As you said, right now it’s JS or go home. And as fantastic as JS is, it’s not brilliant at everything; but then, it was never meant to be. I’d love to see sandboxes for lots of languages available the way Java’s is, and even the ability to code up your page’s interactivity like JS but with other languages. But the last time this was done, it was called ActiveX, and it was done with the naivety typical of a large single-user-oriented company. It’s still a rich vein to mine, however, for a properly designed, cross-platform, open-standards-based approach. With another huge future market of “smart devices” like phones and PDAs and palmtops, being tied to one OS isn’t an option.
But then, you know that. 🙂
“Any chance that there is a future in rich browser development in which the open standard is the VM and everyone can target it in their own fashion via bytecode?”
That’s a question you’d have to ask browser-makers. If only you could contact one. 😉
In the long run, assuming no patent armageddon, there will be multi-language VMs on all devices, with optimizing JITs even in routers — but that is a long-term project. In the near term, fitting a desktop VM into a phone is hard. Instead, you see small VMs targeting devices, making different trade-offs.
Phones are hard targets, but embedding and bundling a big VM even in a desktop browser such as Firefox is still hard. We want to keep our download size under 5MB (give or take), and we can’t afford much code-size growth beyond the download footprint either. And the cost of integration is large: you need a Java/XPCOM bridge with DOM helpers; you need a coherent garbage-collection solution across heaps; you need strong security and a consistent access-control model.
These checklist items (among other necessary ones) may be claimed by various pieces of the puzzle (Firefox has some of them; some VMs may claim others). Putting them all together in 5MB compressed is very difficult. So for the near term, in Mozilla 2, we will integrate Tamarin. If Tamarin becomes attractive as a VM for other source languages, great. It will be in the Flash Player as well as in Firefox.
/be
“Right now I’m forced to write my client and server side in different languages which I wish wasn’t the case.”
I enthusiastically agree! Since JavaScript is the one language we can use in browsers it is the natural choice for the web…on both sides. This would allow code sharing for client and server. Developers would not have to mentally switch gears so frequently.
So why do people choose Ruby, Python, Perl, PHP, etc. for the server side? These languages don’t offer much more than, or different from, JavaScript when it comes to server-side development. The OOP models differ, but none has the full power of Lisp (e.g., macros), and only JavaScript has full-blown first-class functions. Should the JavaScript language make a few allowances for the features needed when developing server-side scripts? Perhaps not much is needed, and it is just a matter of promoting the idea with a web framework like “JavaScript on Rails”.
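A small sketch of the first-class-function point: higher-order helpers like this run unchanged on either side of the wire (the helper names here are mine, for illustration only).

```javascript
// compose returns a new function: the pipeline g-then-f.
function compose(f, g) {
  return function (x) { return f(g(x)); };
}

// Two small string transformers, passed around as plain values.
var trim = function (s) { return s.replace(/^\s+|\s+$/g, ""); };
var lower = function (s) { return s.toLowerCase(); };

// Build a normalizer by composition rather than by subclassing.
var normalize = compose(lower, trim);
console.log(normalize("  Hello  ")); // "hello"
```

The same snippet could validate form input in the browser and re-validate it on a JavaScript server, with no mental gear-switching.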
One thing JavaScript doesn’t have, or at least the perception is that JavaScript doesn’t have, is the libraries that the other languages have for server-side use.
Anti-restart for Firefox extensions is easily the most important thing for Firefox 3.0, in my opinion. The pain of the learning curve, plus the pain of figuring out how to do quick-refresh development, is what has kept me in the past from trying to learn how to make Firefox extensions. And please let me run dev-mode extensions from any old project dir that I want; I don’t want to have to “install”. Really, just a few clean-ups like this could help devs and users A LOT. So thanks for making non-restart a priority. (And I’m eager for Tamarin, too. And Ogg (please?). And other things.) But non-restart would rock.
Oh, also SHA-1 HTTP headers (e.g., “If-Not-Match-Hash” or some equivalent or easier mechanism) for safe, cross-site caching of large libraries/resources would rock. That would make the web browser a much stronger platform with minimal change to current behavior and standards.
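Something like the following hypothetical exchange is the idea; the header name and syntax here are made up for illustration, not a real or proposed standard:

```
GET /ajax/libs/some-lib.js HTTP/1.1
Host: cdn.example.com
If-Not-Match-Hash: sha1=2ef7bde608ce5404e97d5f042f95f89f1c232871

HTTP/1.1 304 Not Modified
```

The client keys its cache on the content hash, so the same library fetched from two different sites would hit one cache entry, and the server can skip the body whenever the hashes agree.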
One more comment. I think that Firefox and the web in general provide a nice feature set, but the foundation isn’t strong. I think a VM like Java (or perhaps Tamarin) provides a strong foundation, but it doesn’t have the web platform at the upper layers. That’s why I’m eager to see Firefox get a stronger foundation. Probably hard to replace the foundation on a house that’s already built, but at least it’s safer with software. And once the cycle collector and Tamarin are in there, it might start to get pretty interesting.
Tom: quick responses, pointers really:
* See https://bugzilla.mozilla.org/show_bug.cgi?id=256509 for progress on addon install w/o restart, and if you can come up with a “zero install” pitch for extension development (shift-reload on the subdir index?), make it to shaver@mozilla.org.
* Ogg Theora is the default, mandatory decoder proposed by Opera for the tag in the WHATWG, and Mozilla is on board. See https://lists.whatwg.org/pipermail/whatwg-whatwg.org/2007-February/009702.html and more recent posts on “video”.
* I believe sayrer has something going like If-Not-Match-Hash, and we’re still talking to Big Partners about universal Ajax library edge-cached hosting. My other idea, probably implementable for 1.9, is a ccache like speedup to hash .js files and avoid client-side recompilation costs by finding cached bytecode. Comments welcome.
* Agreed on the platform, which is why we are investing in Mozilla 2 with JS2/ES4 on Tamarin.
/be
Thanks much for the pointers and info.
Concerning “If-Not-Match-Hash”, how could I look up more info on sayrer’s plans or how to contact him? As for comments on this topic, edge-cached hosting sounds like a nice plus. Cached bytecode also sounds nice (especially if compilation time really is slow). But I also really think something along the lines of “If-Not-Match-Hash” is needed. Mostly because it’s an automatically decentralized solution to the redownload problem. It doesn’t require cooperation from big parties. It doesn’t really even require direct web server support (although this would make support much more pervasive and therefore more useful). And it could be standardized after the fact if it proves useful.
I think this is a great subject you are touching on here. I, for one, would really like to see the Open Web succeed and become the standard for rich web clients.
But at the same time, all too often I get tempted to convert to, e.g., Flash. Doing rich web clients for multiple browsers is still so hard that you have to be something of an enthusiast to endure it.
When reading about you breaking new ground with OpenGL and such I think it’s cool, being an enthusiast, but as a web application developer I’m thinking we should focus more on the basic everyday stuff.
We ought to have a decent native class library standardized across all browsers. Just the quickest look at Java’s or .NET’s class libraries reveals tons of useful stuff (not that EVERYTHING in there should go inside the browser, but you get the idea).
Also, the existing native classes and BOM objects are often too limited. XHR lacks a usable synchronous behaviour with timeouts. Another thing is that you can’t implement blocking behaviour yourself. (See [1] for why this would be useful; it is somewhat remotely related to our discussion about coroutines in your Threads post.)
I would like to see you “influentious” people spend more time with getting basic stuff into all browsers, than with breaking too much new ground for one browser.
I saw your mention of the “law of least effort,” which is in effect when users are asked to download a new runtime (e.g., Flash) to be able to view a site. The same reasoning can be applied to Firefox, but, in addition to having the user download and install the “runtime,” the user will have to get used to a new browser user interface and a separate bookmark collection. Quite disruptive. As a web application builder, I can’t make an application tailored just for Firefox, as most IE users will just choose “least effort” and surf somewhere else.
I’m afraid the Open Web will lose to the Closed Web because the Closed Web will offer so much better APIs and support libraries for the programmer. Enhancing the overall programmer experience in ES/JS in all browsers is the critical factor for the Open Web’s success.
Best regards
Mike
[1]:
My favourite use case for this is a “Do you want to save your edits before closing window?” dialog popped up when closing a browser window (onunload event). As all other dialogs in my application are positioned DIVs with HTML content I want to do that for the exit dialog as well, but no can do as my own dialogs can’t block during the window closing the way Window.prompt() does. Ok, so have to use an ugly prompt() anyway.
The user clicks “Yes, I want to save” and I’m sending the unsaved edits to the server using XHR. Before letting the window close, I want to get an acknowledgement back from the server that the edits were successfully saved. If I use standard asynchronous XHR, my event handler and global context will be long gone when the server replies, as we are in the midst of closing the window. So I have to use synchronous XHR, but then I risk hanging the browser window if the server doesn’t reply, as XHR will wait forever for an answer. Touché.
Mike: My thoughts are that Mozilla could possibly have an official “Firefox Plugin” for IE triggered by certain content types or something that could be installed live on demand like Flash. Not sure if that would work, but it could be an interesting experiment.
By the way, from my understanding, I’d argue that Microsoft Office (and even IE) have not really been “single vendor” products across Mac and Windows. That is, I’ve understood them to have some access to each other’s code, but their code bases are still mostly distinct, with completely different teams. Real single-vendor (or rather, single-code-base) products tend to do better at interoperability. But as you point out, standards are sometimes followed.
While the Canvas support and OpenGL binding is interesting, SVG potentially has a big role to play in the ‘open web’ story, esp. in relation to the proprietary RIA solutions being touted.
The recent announcement (https://weblogs.mozillazine.org/tor/archives/2007/04/smil_animation_and_svg_fonts.html) that SMIL animation and SVG fonts would not be included in the Firefox 3 roadmap is concerning.
IMHO Mozilla is best positioned to foster inline SVG online in the face of Microsoft opting for a proprietary solution; given the stated aim to “protect the open web,” one might say that Mozilla carries the ‘moral responsibility’ to be the standard-bearer in this regard.
‘Open web’ development processes could benefit greatly from increasing SVG support, and web usage would greatly increase the utility of the SVG format to the design community (Inkscape etc.); SVG has the opportunity to become ‘the OpenDocument of the graphics world’ and Mozilla could play a large role in enabling the use of this inter-operable format with increased investment in the short term, to kick-start usage, and influence the other ‘open standard’ browsers.
IMHO this would be a significant contribution to the design/communications and the evolving web2.0/RIA development communities.
Marc: SVG owners are focused on bugs and performance as noted in tor’s blog. There’s still time for folks to help, but it’s going to require fast work.
But even with animation and font support, SVG would not nearly compete with all of Flex+Flash or Silverlight. It’s at most only part of the open web story, and the SVG standard was not developed in a particularly “open” fashion — SVG was a creation of the w3c during a period when Adobe, cell phone software vendors, and others were pushing it in the pay-for-play, members-only silo of the w3c. (Note that Adobe has EOL’ed their SVG plugin.)
At a higher level, I think you misunderstand what SVG is. It is a presentational language with absolute positioning and 2D affine transforms only. There is no way that SVG as designed could be “the OpenDocument of the graphics world”.
So SVG in Firefox is helpful, and the Mozilla community is investing in it, but it’s not enough. SVG in Safari is coming, but I don’t know whether it will include SMIL-driven animation, or what the font story will be. If SVG is implemented in IE8, that would be very helpful. I’ve heard hints of this, but no promises. But even in that happy event, you can bet that MS will be showing Silverlight demos that SVG+SMIL by themselves simply cannot match.
For the open web to compete, leverage and clever embrace and extend strategy will be necessary. SVG in the setting of the w3c standards process offers neither leverage nor E+E applied to the non-open competition. More on this in a future post.
/be
>Marc: SVG owners are focused on bugs and performance as noted in tor’s blog. There’s still time for folks to help, but it’s going to require fast work.
Yes, the effort is definitely appreciated, and I did not intend my comments to sound overly critical. As for diving in, if my skills extended to coding I gladly would.
>But even with animation and font support, SVG would not nearly compete with all of Flex+Flash or Silverlight.
That is most certainly true; it’s not a like-for-like comparison. SVG is largely useful as a file format (including SVG+SMIL, e.g. for dead-simple online slide shows). Initiatives like sIFR and swfIR point to pragmatic, incremental use of Flash as the currently most viable vector format, exactly the uses envisioned for SVG, and as such it would be highly desirable to address ‘231179: SVG in CSS backgrounds’ and ‘276431: SVG in IMG’ sooner rather than later.
>(Note that Adobe has EOL’ed their SVG plugin.)
Exactly why I’m advocating that Firefox pick up the slack. Obviously that doesn’t fix the IE issue, but improved support from Firefox, WebKit, Opera can apply pressure (as they did with CSS), and there are efforts to host SVG in IE via ActiveX, VML, and Silverlight (just like hacks that worked around some CSS issues.) Decent support in ~20% of browsers might allow site developers to provide SVG as an additional content option (just like we provide alternative video formats at the moment.)
In spite of the current RIA hype, it’s unlikely that the web will rapidly transition to these new technologies. Therefore, while there is time for open alternatives to mature, those same issues of momentum will likewise apply to the alternatives; wide adoption will take time, and I’m advocating that the Mozilla Foundation further contribute to accelerating that adoption.
>SVG standard was not developed in a particularly “open” fashion
Yes, SVG has a somewhat spotty history and has developed very slowly, but it is now a published standard gaining momentum, with genuine end-user and developer benefits. Obviously SVG support would bolster the Ajax development story against the new RIA platforms (and of course it can also intertwine with Flex/Apollo via WebKit.)
>At a higher level, I think you misunderstand what SVG is…. There is no way that SVG as designed could be “the OpenDoc of the graphics world”.
Comparisons are always risky, but like OpenDocument (ODF) SVG has potential to become a (vector) lingua franca for the graphics world — currently that’s EPS but there’s no chance of that being employed online.
As for WebKit, there will not be support for SVG in IMG nor CSS, and SVG fonts will not be supported (at least according to current visible progress.)
>MS will be showing Silverlight demos that SVG+SMIL by themselves simply cannot match.
But what of SVG + Ajax + CSS + video… plus Flash, Silverlight or JavaFX for the pragmatic?
A link as an addendum to my comment:
Replacing Flash with something else
https://www.kryogenix.org/days/2007/05/15/replacing-flash-with-something-else
great article