The non-world non-wide non-web

I spent a day at the recent w3c workshop on web apps and compound documents. Due to vacation, that day was the second, so I missed the chance to hear JavaScript praised as the worst invention of all time.

The adolescent sniping and general irrelevance continued on the second day, however. The sad fact is that the w3c is not concerned with the world wide web, AKA the Internet. Rather, the focus for a while now seems to be on vertical tool/plugin and service/cellphone markets, where interoperation is not a requirement, content authors are few and paid by the vertical service provider, and new standards provide livelihoods and junkets for a relative handful of academics, standards body employees, and big company implementers.

Evidence of the vertical nature of the new standards? There are only a few hundred tests for the SVG w3c recommendation. That’s several decimal orders of magnitude short of what’s required just for surface coverage. Lately, when Hixie hears a claim about an interesting SVG standard feature, he writes a testcase. Adobe’s plugin too often fails that testcase, although I am sure Adobe’s SVG tooling produces content that works with the Adobe plugin. Interoperation is a joke.

Real browser vendors, who have to deal with the ugly web as it is, know better. The dream of a new web, based on XHTML + SVG + SMIL + XForms, is just that — a dream. It won’t come true no matter how many toy implementations (including Mozilla implementations — we’ve supported XHTML for years) there are. Long before the w3c gets compound documents working on paper (having missed the chance with SVG 1.0 and 1.1, which ambiguate and conflict with CSS), XAML etc. will leak onto the public web.

What matters to web content authors is user agent market share. The way to crack that nut is not to encourage a few government and big-company “easy marks” to go off on a new de jure standards bender. That will only add to the mix of formats hiding behind firewalls and threatening to leak onto the Internet.

The best way to help the Web is to incrementally improve the existing web standards, with compatibility shims provided for IE, so that web content authors can actually deploy new formats interoperably.

What has this to do with Mozilla’s roadmap? Not much, which is why apart from HTML, CSS, DOM, and SVG, which we support, you probably won’t hear much more about the w3c here. But Mozilla is joining with Opera and others to explore the sort of incremental improvements to HTML proposed by us at the workshop. I expect the resulting specs and implementations to play a significant part in the roadmap.

/be

31 Replies to “The non-world non-wide non-web”

  1. Hi Brendan,
    Just a few quick comments. Some of Ian’s test cases are not valid SVG, and shouldn’t work anywhere. A lot of the tests are also testing SVG inline in (X)HTML, which wouldn’t work with the Adobe SVG Viewer in any browser (unless you used some HTC component in IE6, and even then). So I would not take these few tests as gospel and draw conclusions about the level of interoperability in the SVG world.
    Regarding the SVG test suite, I agree that it would be good if it were bigger. As you may know, writing a test suite is a time-demanding task, and thus an expensive thing to do. I can honestly say that the SVG WG did as much as it could to make the SVG 1.1 test suite as rich as possible. You’ll also note that some tests are profiled for SVG Basic and SVG Tiny. As SVG 1.2 is wrapping up, the group will also devote a lot of effort to upgrading the test suite.
    On the SVG in Mozilla front, you say Mozilla supports SVG, but just to be clear, are you saying that you consider Mozilla’s support of SVG good enough already, or that you expect Mozilla to get to a better SVG implementation? The way it stands today, the implementation is still a long way from complying with any of the conformance criteria listed in the SVG specification.
    Rewinding back to the JavaScript comment: I love JavaScript, and can’t wait to have a JavaScript 2.0 implementation available in an SVG-aware environment.
    Antoine Quint, SVG WG Invited Expert

  2. Hi Brendan,
    adding to Antoine’s comments, it would appear that some of Ian’s tests are not even well-formed, hardly solid proof even if the input is welcome.
    Indeed the spec’s test suite is insufficient, but that is alas an issue that concerns HTML and CSS equally. Thankfully it is being worked on within the 1.2 time-frame and already the number of mobile tests (which also apply to Full naturally) has been increased two-fold.
    Regarding Javascript, the comment you mention came from a handful of individuals who have in common their preference for the “ugly web” over the more recent W3C developments. I believe they have been frustrated with the workarounds and hacks that are required there for interop. I can assure you that the entirety of the SVG community is hooked on js, and one rarely sees an example without it. The spec mandates ECMAScript (ECMA-262 edition 3, i.e. JavaScript 1.5) and specific APIs so as to guarantee interop.
    I have issues with the “let’s just emulate IE6 with minor improvements” approach. First, Microsoft will not hesitate to break existing content — they did it with all previous major releases of their browser software. This means that whatever MS replaces IE6 with at some point in the future will break content, and leave other browser vendors in an unpleasant position: playing catch-up. The only way in which they would not do that would be if IE6 were no longer the dominant gorilla, which requires people to switch. The problem is, people don’t switch for small incremental improvements; they switch for stuff that is an order of magnitude better (in whichever direction). I work in a very standards-oriented company, and have had the hardest time getting people to switch to Firefox (or Mozilla before that). Everyone here agrees that it’s a better browser, but people have habits they don’t like switching out of. If I could say “Look! It does this, that, and that, which are evidently so much better” they’d switch. Currently, it’s only an incremental improvement, of little interest to them.
    That’s precisely what happened with XHTML. Ok, so you can now feed it to XML tools. That’s neat. Yeah yeah I’ll switch my site some day, if I get time or something. Perhaps.
    To me the “small, incremental improvements to the ugly web” approach will make Firefox/Safari/etc. be to IE6 what XHTML 1.x is to HTML: a neat increment, mostly a flop. Please don’t take this as an attack (heck, I use your stuff day-in, day-out), but I would really like to see where you see a path to success in that strategy, because hard as I honestly try, I don’t.
    And while I understand the deployment issues, a user agent that would integrate XHTML, SVG, and XForms (and then, maybe not even XForms natively; XBL can take care of that part at least to start with) would IMHO provide such a leap forward in terms of content capabilities that it would appeal to content creators and, in turn, once they see the content, to users. The content creators learnt the “ugly web” not so long ago, because they had an incentive to switch from whatever they were doing before. With a strong incentive, they can learn again. Likewise, the users switched once; they can switch again. But they need a reason that goes beyond “well, it’s Open Source, it’s a bit faster and lighter, it has those neat little additions here and there, etc.” Integrating those specs together isn’t very far away.
    A few months back I saw a demo from Adobe mixing XForms, SVG, and XHTML, and even though I was well aware of those technologies and what they could do, I couldn’t refrain from going “WOW”. Not because of the tech, but because even without knowing what was inside, I hadn’t seen anything like it before.
    It’s been a long while since anything made me wow on the Web. I want that back.

  3. I’m sure there’s more to SVG non-interoperation than Ian’s tests, however lacking they may be. You don’t implement a 700-page spec and interoperate without testing and massive distribution, period, end of story.
    Yes, real test suites cost money, but there’s really no excuse for going to w3c RECommended status without interoperable implementations, in my opinion. Even market-testing two or more implementations against one another would have been better than what actually happened.
    I get the impression, from the talkback here, others’ words in the w3c lists, and those spoken at the workshop, that egos are on the line here. Just for the record, I don’t mind if Bert Bos hates JavaScript. I have a love/hate relationship with it myself. If you call my baby ugly, I won’t take offense (my real-life baby is beautiful). JavaScript was a rush-job, but I’m glad I got to promote the Self-influenced prototype model, and Scheme-like lambdas, in a mass-market scripting language. Too bad IE does not implement them properly in versions still in the field.
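    (A tiny sketch of what I mean by those two features; the names below are illustrative, not from any spec:)

      // Self-influenced prototypes: objects delegate to a shared prototype.
      function Point(x, y) {
        this.x = x;
        this.y = y;
      }
      Point.prototype.norm = function () {
        return Math.sqrt(this.x * this.x + this.y * this.y);
      };

      // Scheme-like lambdas: functions are first-class values that close
      // over the variables in scope where they were created.
      function makeCounter() {
        var n = 0;
        return function () { return ++n; };
      }

      var p = new Point(3, 4);  // p.norm() == 5
      var next = makeCounter(); // next() == 1, then 2, ...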
    The important point is not anyone’s ego, it is this: the web needs an upgrade, or rather, continuous upgrades under evolutionary pressure (competition, real-world use-cases).
    Lacking that, making up a bunch of new XML-based standards just creates a gulf between where we are, and where you would like us to be, that I see as uncrossable. No killer app supporting XForms and SVG is likely to become widespread, especially not based only on those standards. Any future killer app is likely to be a 3D game, or something dependent on MS or other, similar proprietary technologies, the way I see things.
    But there’s a chance, if we can co-evolve some HTML improvements, to get to a better space.
    Here’s how: if we get IE compatibility shims built for the whatwg.org specs, and the top three minority market share browsers implement the specs well, then web content authors can use them on all major user agents. If between Mozilla, Opera, and perhaps Apple, we gain enough market share to matter to content authors, *then* having XForms support in Mozilla (which I expect will be done first, among the M-O-A troika) may let intranet/vertical XForms apps take flight on the web — provided service or content providers are willing to stipulate and help deploy Mozilla Firefox 2.0 (just for example).
    This assumes that Firefox 2.0, or whatever, is otherwise a competitive browser — something that the “browsers are dead, let’s have a new runtime” folks pretend is not necessary. No one wants to deploy a new browser-like runtime for Internet apps, trust me. I’ve talked to dozens of companies and institutions. Deployment is hard, and the browser is the runtime.
    Similar arguments to the ones I make above for XForms being helped by whatwg.org-based incremental improvements to HTML apply to SVG, or at least to its most useful parts — SVG 1.2 is looking like an enormous spec, again in the real world not interoperably implemented any time soon. SVG is not going to replace HTML as the “host language”, but it should be possible, for those with the right audience, to add SVG to HTML pages, if we make the right incremental changes to browsers in the next year or 18 months.
    (Antoine, anyone can read https://www.mozilla.org/projects/svg/status.html to see how much of SVG we implement — by “support”, I meant, implement partially, with an aim to implement completely, or to some useful subset or profile, in the next year or two.)
    But it simply won’t do to pretend that by focusing only on XForms, SVG, or some combination, we will make enough of a market to compete with proprietary formats, especially with the impending Longhorn ones. No matter how many government or industry sites want XForms, the world wide web will not see significant traffic in the XForms MIME type in the next couple of years. After that, I bet we may see more, especially if whatwg.org succeeds.
    /be

  4. Count me on the ‘Javascript sucks’ side of the debate (but at least it sucks significantly less than the other scripting languages, particularly VBScript). Anyway, it might suck but it basically does the job. Moving on…
    About SVG, you mean that somebody is really working on it? Good…
    About the whatthefuck.org pseudo-‘standards’ effort; I don’t think people are going to use those ‘IE-compatibility shims’ any more than people currently use systems like the ‘IE7’ scripts. Even though you only need to include a stylesheet import or whatever, it’ll still only be the most advanced of the most advanced web designers who even think about it.
    I also think that the wtf stuff (sorry, wtfml or whatever it’s called) is missing the point – it’s largely an effort to continue with the kind of flashy crap that Netscape introduced: a kind of 21st-century blink tag. Drop-down menus, popup menus, etc… it’s all a creep in the shitty ‘web applications’ direction which I think attacks the usability of the Web. Web sites are usable because the metaphor is very simple; it’s a page, it has links you can click on. Maybe occasionally there is a form that takes you to another page when you submit it. Adding ‘application’-style junk only lets developers off the hook of thinking about how their system ought to present itself on the web.
    On the Web, as far as UI elements are concerned, limited is better.
    –sam

  5. sam makes a valid point amid the crudity and misconceptions (thanks, sam!): the web is stable now for reasons beyond monopoly abuses and standards body neglect. A stable web is good for users, in many ways (IE “stability”, meaning security holes, popups/unders, etc., to the contrary notwithstanding).
    Even more to the point, a stable web is good for server-side content authors, service providers, and others who want to innovate on top of the browser, not in it.
    This is sad but true, yet in ten years, we won’t all be using the same old Firefox 9.0 that looks a lot like today’s browser, I predict. Something will give, and the graphics processing power, high screen resolution, superscalar CPU, and generous memory hierarchy of the modern desktop will be much better utilized. That’s why I half-joke that the next revolution will be led by a 3D game.
    The revolution will happen far sooner than ten years, in my view — ten years may be off the edge of the Singularity (https://hanson.gmu.edu/vi.html).
    Can incremental, evolutionary moves by browser vendors reduce the future shock, by exploiting the desktop hardware better without falling into the plugin prison? An interesting question; comments welcome.
    /be

  6. In this discussion I’m missing the position of the leading browser maker. I do not remember Microsoft’s position being articulated one way or another. Does the absence of a position mean that Microsoft is fully satisfied with the present state of web technologies, or does Microsoft not care at this moment? Or does Microsoft think that in the future it can release completely new stuff without concerning itself with backwards compatibility and make everybody else play catch-up?
    Or am I missing a point here? Could it be that all these proposed changes are a way to gradually wrestle control of the web back from Microsoft without an open fight?

  7. Walter, it seems like Microsoft’s position is “XAML”. At least that is what they talked about at the workshop, from what I have heard.
    So yes, no backwards compatibility, no cross-platform, just MS license updates everywhere.

  8. Evolution
    The web will remain backward compatible, with legacies disappearing slowly. If vendors or standards bodies attempt to jump too far ahead, they will fail. New techniques with a high probability of success will solve existing problems and will be easy to use; e.g., a clean solution to sessions. Hence, big new standards have little chance.
    IE
    Like all the other players, MS is limited by the existing web/internet; but MS is the player with the most power. Techniques not implemented in IE will hardly be successful; but not every technique introduced in IE will be successful. We are living through a classic of “International Relations”, as with Iraq.
    Web applications
    The web was born as the HTML markup language with a simple transfer protocol, and it has evolved into a light operating system, like the Blit terminal talking to Unix: this should be the new vision.

  9. I agree with Axel Hecht.
    Hence the question about whatwg.org: does it make any sense to enhance HTML, i.e. to create Web Forms 2, if it is not supported by IE?
    I believe there should be an Alliance of Apple, Opera and Mozilla though probably not to introduce new standards but
    a.) Move up to IE by adding things like the HTTP request object, XMLHttpRequest (see the sketch after this list). Recently I read somewhere that it has been added to Safari but is not yet available in Opera. That could be done for other IE-only features as well, at least for those which are useful. This alliance should minimize the differences between their browsers, so that there are not four browsers to check for incompatibilities but only two: the Apple/Opera/Mozilla alliance and Microsoft IE.
    b.) Apple/Opera/Mozilla, and eventually others such as IBM or Sun, should find an appropriate single answer to XAML. Whatever it is, who knows.
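    To make point a.) concrete, here is a minimal cross-browser sketch of that request object (the URL and handler below are invented for illustration):

      function createRequest() {
        if (window.XMLHttpRequest)        // Mozilla, Safari, ...
          return new XMLHttpRequest();
        if (window.ActiveXObject)         // IE5/IE6
          return new ActiveXObject("Microsoft.XMLHTTP");
        return null;                      // no support at all
      }

      var req = createRequest();
      if (req) {
        req.onreadystatechange = function () {
          if (req.readyState == 4 && req.status == 200)
            handleData(req.responseText); // hypothetical handler
        };
        req.open("GET", "/data.xml", true); // asynchronous GET
        req.send(null);
      }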

  10. Regarding the previous post!
    I mean “I agree with the previous post” which was from “M.T. Carrasco Benitez” and not “Axel Hecht”.
    Thanks

  11. Karl, see https://www.whatwg.org/charter, specifically:
    All specifications produced by this working group must take into account backwards compatibility, and clearly specify reasonable transition strategies for authors. They must also specify error handling behaviour to ensure interoperability even in the face of documents that do not comply with the letter of the specification.
    You can do a lot on top of IE using HTCs and the like to emulate new specs such as Web Forms 2.0. Such an implementation may not be the best-performing, but that’s yet another reason to switch to Firefox.
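    For instance (a rough sketch from memory; the file name and the required-field behavior are just examples), a page could attach an HTC to its form controls from CSS:

      /* IE-only: attach a behavior to every input */
      input { behavior: url(webforms2.htc); }

    and webforms2.htc could emulate a Web Forms 2.0 feature such as the required attribute:

      <!-- webforms2.htc (sketch) -->
      <public:component>
        <public:attach event="oncontentready" onevent="init()" />
        <script type="text/javascript">
        function init() {
          // "element" is the HTC global naming the attached element
          if (element.getAttribute("required") != null) {
            element.attachEvent("onchange", function () {
              // crude validity cue; a real shim would do much more
              element.style.borderColor = element.value == "" ? "red" : "";
            });
          }
        }
        </script>
      </public:component>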
    /be

  12. Brendan, thanks for your info.
    I have been using Mozilla since Netscape 4.x, and I hope that Firefox and Thunderbird (+XUL, etc.) will continue to evolve and that they will be more widely used.
    Though Microsoft will probably remain the market leader, it won’t hurt for the rest of the world to form a strong alliance that goes beyond Web Forms 2, ensures that incompatibilities between the minor browsers are as small as possible, etc.

  13. How about trying to embrace and extend Microsoft?
    Declare that we will support XAML, and that XAML will be a sub-branch of our new mega-standard?
    The other possibility, if the above is bad, and if Javascript/XBL/whatever is not enough to compete with XAML, we could include Java in our standard. That would give us a lot more power.

  14. Idear: XAML is a moving target, as well as an unspecified proprietary format. It won’t be stable until Longhorn market penetration forces it to stabilize, more than a few years from now.
    Rather than waste time eating MS’s dust, why not try to push the web standards that people actually use till they are sufficient? That certainly means https://www.whatwg.org style extensions. It probably means further extensions.
    Java is a potential help, but for Mozilla, it can’t be depended upon in default builds until it is open source and “open spec”.
    /be

  15. Hi Brendan,
    Javascript is fine as a scripting language; I simply refuse to run scripts that I have not audited, effectively killing any client-side scripting. Navigating the web is an exercise in hacking javascript to pull simple URLs that are needlessly embedded in scripts.
    I’m not blaming the tool; rather, the practice of embedding script in html is unforgivable, from both security and usability perspectives.
    To briefly respond to a poster above: flash is totally unacceptable by virtue of being a binary format. At work I have Flash blocked by our content filters; there is no question that running untrusted code behind our firewall constitutes a slight but nonetheless unacceptable security risk.
    Thanks.

  16. Well, it’s good to hear that there are some useful improvements being made to HTML that may actually make it to the public web in the near future.
    However, if XHTML+SVG+XForms compound docs are just a dream, what’s this bit in Robin’s post (above) about Adobe mixing them up in their new Viewer/Browser (PDF+SVG+XForms+XHTML)?
    Adobe already has their PDF browser opening in both IE and Mozilla, and it keeps prompting me to download its new SVG, etc., updates. It’s almost as if Mozilla/IE is the OS (platform) and the Acrobat viewer is the browser. Throw in PDF’s guaranteed formatting, PDF publishing as easy as printing from any application, and now add XHTML+SVG+SMIL+XForms, and Acrobat Viewer is the browser for the next generation of the web; developers can start creating content right away because everyone already has it installed, and it will auto-update so clients will be able to parse all the latest markup. (Hmmm, … well, we’ll see.)
    If, as you say Brendan, compound docs are still a long way off, then the Adobe stuff is just fiddling, and it’s great to see some practical progress taking place on HTML. Shims for IE sound OK; how about embedding Moz into IE like Acrobat Reader – overkill?
    Web forms desperately need help; great to see the plans at whatwg.org. It would sure be nice to have web form fields that can be linked to text nodes in an XML fragment/document DOM; this frag/doc could then be sent back to the server, or saved on the local computer for future reference/submission, all without any web page reloading.
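    Something along these lines, say (a sketch against Mozilla’s DOM; the ids and URL are invented):

      // Mirror a form field into a text node of an XML document:
      var doc = document.implementation.createDocument("", "order", null);
      var item = doc.createElement("item");
      item.appendChild(doc.createTextNode(
          document.getElementById("itemField").value));
      doc.documentElement.appendChild(item);

      // Then send it back to the server without any page reload:
      var req = new XMLHttpRequest();
      req.open("POST", "/submit", true);
      req.setRequestHeader("Content-Type", "text/xml");
      req.send(new XMLSerializer().serializeToString(doc));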
    SVG would be a big boon as well, and it’s good to see its further implementation mentioned as part of Mozilla’s future.
    Best of luck; hopefully the W3C folks will take your ventures not as an offensive but as a bellwether of web developer and browser vendor desires.
    mawyra

  17. mawyra: please pay attention to precisely what I wrote was “just …a dream”: “… a new web, based on XHTML+SVG+SMIL+XForms ….”
    In other (similar) words, a new public web, replacing the one we use every day (the one with >5e9 public pages indexed by Google), and based only on new, incompatible w3c standards, is a pipe-dream.
    Adobe demos do not change the causes of the current web browser and format plateau. New, undertested, backward-incompatible standards, non-interoperably implemented to boot, won’t either. Only user agent market share will.
    /be

  18. Brendan,
    On the whole, I agree with you but when someone pokes a hole in your argument about SVG test cases, “I’m sure there’s more to SVG non-interoperation than Ian’s tests, however lacking they may be. You don’t implement a 700-page spec and interoperate without testing and massive distribution, period, end of story.” is simply not good enough.
    You either have a point and can follow it through or you don’t. Don’t write cheques you can’t cash.

  19. jackiemcghee: I’ll let Ian defend and explain his test cases, but you are the one writing checks without interoperable SVG implementations in hand to back those checks.
    Get real: it took years, and millions of dollars of NRE, for browser vendors to match one another’s implementations of specified standards smaller than SVG (and this does not count all the de-facto standards that the w3c members failed to formally specify, such as DOM level 0). Apple is only the most conspicuous, recent investor in interoperation.
    A non-web example: I spent 1985 and 1986 bringing up Sun NFS in SGI’s Unix-based operating system. Even with shared source and interop get-togethers, vendors took several years in the late ’80s getting their NFS servers and clients interoperating. NFS is almost trivially simple compared to SVG.
    If you think magic standards fairies cause implementations to interoperate, you are very confused. What’s more, SVG’s spec is incomplete and ambiguous, as well as buggy — any 700-page spec is. This is so basic, it should go without saying. Until there are test suites and tested implementations, your talk is empty.
    /be

  20. I followed up on the test suite problems with Antoine and Robin. Antoine pointed to a couple of typos that I have fixed (thanks Antoine!), and Robin didn’t reply.
    Unless someone can point to specific problems on those tests, making dismissive comments about them is silly. To the best of my knowledge, they are all valid. The Adobe plugin fails more than half of them (and it wasn’t even written to find bugs in the Adobe plugin, it was written to find bugs in the Mozilla implementation). SVG’s test suite is a good start, but that’s all it is: a start. It isn’t anywhere near enough to claim conformance, IMHO.

  21. Brendan, I think you missed my point. I don’t know whether or not Ian’s test cases are in/valid or should/shouldn’t work. It’s more that I didn’t think it was good form to use Ian’s work as an early plank to build your argument on, and then not really have an answer when they were called into question.
    I admire all the work that you and Ian and others put in, I really do (most of my work these days revolves around the DOM and CSS, so I definitely appreciate it). I guess I may have come across as being way more aggressive than I was intending and I’ll put your reaction down to a similar perception, because no malice or anything of that sort was meant.

  22. jackiemcghee: I’m sorry if it seemed I thought you were malicious. My interest here is in reality, the same hard-nosed standards of evidence and reasoning that should inform both the SVG spec and any testcases written to validate implementations of it.
    But it seems to me the shoe is on the other foot — your form is not “good form” by your own standard.
    You tried to impeach *all* of Ian’s tests by citing others’ authority here. You used incomplete assertions in comments by Antoine and Robin as “evidence” that Ian’s tests were invalid. Yet you have no answer now that Ian has commented here, except to repeat yourself.
    My argument consisted of three parts:
    – way too few SVG tests in the official suite;
    – other tests showing bugs in implementations;
    – the sheer size of the spec combined with the lack of multiple implementations interoperating in the marketplace before it was ratified.
    The second plank of this platform is not weak, as far as I can tell. And even if all of Ian’s tests were completely broken (they are not), the other two planks would stand.
    /be

  23. JavaScript has become the Rodney Dangerfield of scripting languages. Dilettantes and Johnny-come-latelies seem to think they enhance their cachet by dissing it. Instead, like the snotty dot-commers, they reveal that it is they who, “Just don’t get ‘it’.”

  24. People reading this discussion from a less informed perspective might get the idea that HTML and XHTML are somehow incompatible. When I re-wrote my HTML website in XHTML a year or more ago, it carried on working the same as before in all browsers that I tried. In fact, the XHTML 1.0 spec relies on HTML 4.01 for the meanings of XHTML elements and attributes.
    I realise of course that people who have written a website using some sort of proprietary version of HTML with non-standard extensions may not be able to reproduce the whole look and feel using standard XHTML, but if they produce a new XHTML website providing the same basic functionality as their old HTML site then I wouldn’t have thought they need worry about people not being able to access it.
    Now I suppose someone is going to give me a list of historical browsers that don’t grok XHTML, but I would be very surprised if it affected more than 1% of surfers (and those historical browsers would also be unable to access a great many “HTML” sites either).
    OK, once you add in things like SVG or MathML then you severely limit your audience, but as I understand it XHTML is itself fully supported by all significant browsers – no?
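    For readers wondering what actually changes, here is a minimal XHTML 1.0 document; served as text/html, legacy browsers render it just like HTML:

      <?xml version="1.0" encoding="UTF-8"?>
      <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
          "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
      <html xmlns="http://www.w3.org/1999/xhtml">
        <head><title>Minimal XHTML</title></head>
        <body>
          <p>Lower-case tags, quoted attribute values, and explicitly
          closed empty elements are the main differences from legacy
          HTML.</p>
        </body>
      </html>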

  25. The new ‘less-ugly’ internet you predicted is coming with XAML/Avalon/Longhorn.
    How ready will Mozilla/XUL be when that apocalypse comes?
    Will an installed base of intranet/XUL app users be lined up to sing XUL’s praises as the open source alternative (to XAML/Avalon)?
    Will Mozilla/XUL look like a hacked-up contraption next to the polished and tightly-integrated XAML/Avalon/.NET juggernaut?
    The commodity applications software platform is moving from a 3GL/OS level of abstraction to a better, mostly-declarative, 4GL level of abstraction. The frenzy over web applications will be repeated as the old web (app) platform is abandoned for this new web (app) platform. This is just another turn of a classic cycle. I can already see the frantic job adverts for anyone with XAML/Avalon skills. Will anyone be looking for XUL app dev skills?
    The future looks like either XAML/Avalon or XUL. Both probably will co-exist to one degree or another. The war over this next generation web/software (app) platform is already underway. Which 3GL ‘extensibility’ language is available will hardly matter. All eyes will turn past the 3GL and to the 4GL.
    The ‘necessary-evil’ 3GL might be JavaScript, Python, Perl, etc. The 3GL might even be JAVA/J2EE (for the write-once-port-to-every-JDK-&-tune-like-crazy JAVA zealots). We surely will see support for many 3GL options simultaneously. Yet none of this matters. What a devastating distraction! The better the beyond-procedural (i.e. ‘declarative’), less-ugly-web, 4GL-level XAML/XUL, the less the 3GL will be used, and the less the 3GL will matter. Move on.
    The single worst strategic blunder the open source community can make right now is to *lose* focus on what really matters: XUL. I can scarcely imagine squandering the headstart XUL has over XAML/Avalon. Talk about the worst possible time to get distracted! Yet I hear all kinds of talk about https://www.whatwg.org/. Wow. Suicide by standards.
    Microsoft must be ecstatic. First, they steal the XUL concept. Then, as they sweat bullets trying to catch up, the open source community fragments and dissolves. They watch the XUL visionaries wander off before the battle even starts. Microsoft seems to be getting very lucky here … at just the right time. They get to grab control of the new XUL-like web/software (app) platform. The opposition has wandered off.

  26. Steffen: we’re not squandering XUL’s lead — Firefox’s huge success is the best thing we could do in 2004, better than polishing “4GL” purity points in the Gecko/XUL platform for some future year’s products. The platform work we are starting for Mozilla 2.0 (libxul, xulrunner, etc.) needs the kind of distribution demonstrated and projected by Firefox 1.0.
    See my latest entry on OpenLaszlo and Eclipse, but please don’t buy into the “nGL” theory of programming language evolution. Scripting will never die, nor should it, and declarative languages can capture only certain structures without simply obfuscating and bloating the program. The past 40 years of software are littered with broken monuments to such grand theorizing.
    /be

  27. I’m with Brendan.
    Firefox can and is providing a viable choice in the marketplace. Choice equals competition which equals quality and more choice.
    ————————-
    Scoop
    Build With Steel

  28. Notwithstanding my previous statement, XUL should be a primary focus in the open source community’s struggle.
    ————————-
    Scoop
    Build With Steel
