Assertions should be fatal

Thanks to Dawson Engler for helping get us connected to Coverity last Fall. Dan Veditz and I have done several scans of Mozilla sources (Firefox branch and trunk, Thunderbird with Calendar enabled) using Coverity’s SWAT static analysis toolset.

(As an aside: I am a big fan of the Meta Compiler approach, and I have hopes of helping extend it to the Mozilla-specific domains of HTML, XUL, CSS, the DOM, and JavaScript. Some day, practical programming languages will get beyond out-Java’ing Java or out-C++’ing C++ and let us declare our invariants, instead of trying to assert them with runtime checks that duplicate and invert logic, and that are compiled out of production code.)

The good news is that our nominal error rates are respectable at first glance: as good as or better than other large open source projects. The results show many trivial redundant null checks, missing or inconsistent null checks, and the like. These numerous bugs will be filed and fixed in batches, and we haven’t classified them all yet, so the bug-filing won’t be immediate. These are low priority bugs.

Fewer serious errors were caught, including some dead code, but the most worrisome problems visible to the static analyzers were not obviously bugs. Rather, they were cases where an index into an array would be out of bounds only if assertions (NS_ASSERTION, etc.) were not treated as fatal when botched.

The toolset can be taught that NS_ASSERTION is fatal (exits the control flow graph), just as <assert.h>’s assert macro is understood by default to be fatal. But thanks to some bad ancient history in the Mozilla project, for most builds, NS_ASSERTION is not fatal!

(The bad history dates from 1999 and involved various people, mostly Netscape employees, all “trying to get their jobs done” in the face of bogus assertions from some of their fellow hackers. Instead of forcing bad assertions to be fixed, at some opportunity/social/political cost to oneself, it was too easy to go along with the minority view that “assertions should just be warnings”.

To be fair, some wanted all assertions to be combined via newer macros with run-time checks that would not compile out of production builds, but that went too far: performance-critical code with a limited set of trusted callers should not have to run-time check all indices.

Anyway, I’m sorry I gave in and went along with the assertions-as-warnings flow. That was a decision made in haste and repented at leisure!)

So back to the present. Given assertions-as-warnings, the problem becomes: do we have enough test coverage with DEBUG builds to be sure that developers will actually trigger botched assertions whose invariants are being violated by a rare bad caller or broken data dependency? And of those few developers, how many will actually notice the warning messages in the cluttered standard output or standard error of the Mozilla-based program they’re testing? And of those, how many will actually do anything about fixing the broken assertions?

Anyone committed to reality has to believe that the odds ratios multiply into too small a fraction for us to be confident that assertions are effective “path killers” for the static analyzers to respect.

I go further: anyone committed to software quality should want fatal assertions. Assertions are proof-points, to oneself and to others reading one’s code. They are not idle wishes, or doubtful questions posed to the software gods. If your code has a bogus assertion, you should want to know why it botched, and how to fix it. You should want to keep it in a fixed form, rather than remove it, if at all possible.
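The difference is easy to see in miniature. Here is a hypothetical JavaScript analogue (the names `warnAssert` and `fatalAssert` are illustrative, not Mozilla APIs, and NS_ASSERTION itself is a C++ macro): a warning-style assertion lets control flow continue past a violated invariant, so neither a static analyzer nor a reader can prune the bad path, while a fatal assertion exits the control-flow graph.

```javascript
// Hypothetical sketch -- not Mozilla's NS_ASSERTION, just the idea.

// Warning-style: logs and falls through, so the array access
// below it is still reachable when the invariant is violated.
function warnAssert(cond, msg) {
  if (!cond) {
    console.error("WARNING: assertion botched: " + msg);
  }
}

// Fatal: a botched assertion terminates this control path, so an
// analyzer may assume the condition holds in the code that follows.
function fatalAssert(cond, msg) {
  if (!cond) {
    throw new Error("ASSERTION botched: " + msg);
  }
}

function getWithWarning(arr, i) {
  warnAssert(i >= 0 && i < arr.length, "index in bounds");
  return arr[i]; // still reached when i is bad: in C++, a latent overrun
}

function getWithFatal(arr, i) {
  fatalAssert(i >= 0 && i < arr.length, "index in bounds");
  return arr[i]; // unreachable unless the invariant held
}
```

In the warning version, a bad index sails past the check; in the fatal version, the out-of-bounds path is dead, which is exactly what the static analyzers need to see.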

Dan and I pored over the several dozen potential static and dynamic overrun errors that the Coverity tools found, and at least for the ones in Firefox, we convinced ourselves that the callers were well-behaved. So again, based on our four eyeballs and fallible brains, we believe that the tools found nothing overtly bad or exploitable.

But, we must fix this historic assertions-are-warnings botch.

Just making assertions fatal now will turn tinderboxes on Linux and Windows, at least, bright orange.

I call that a good start, and then we can close the tree, divide the labor among all hands on deck at irc.mozilla.org, and fix or remove the bogus assertions. We could do this right at the start of the 1.9 alpha cycle. You may well object that this will be too disruptive. If so, please comment here proposing a better plan that actually fixes the problem in a finite number of days. We shouldn’t put this off beyond the start of the 1.9 cycle.

/be

OpenLaszlo and Eclipse

Back in my February 2004 Developer Day slides, I promoted the idea of using Eclipse to create a XUL application builder, with direct-manipulation graphical layout construction and editing, project management wizards, etc.

Although a few people expressed interest and even did some hacking (the MozCreator project being the most conspicuous example, although not Eclipse-based), no one actually created an Eclipse project and built on its Graphical Editor Framework to realize a XUL app-builder.

The good news this week is OpenLaszlo and IBM releasing the Eclipse IDE for Laszlo. LZX is cool, and similar in spirit, and in many ways in flesh, to XUL.

So the thought occurs: why not patch the Eclipse IDE for Laszlo to support XUL as an alternative target language, and Firefox (or any new-style XUL app, soon enough unified via XULRunner) as the target runtime? Any takers?

/be

The Firefox and the Hedgehog

The Greek poet Archilochus wrote “The fox knows many things, but the hedgehog knows one big thing.”

But what does the Firefox know? Both many things (tabbed browsing, live bookmarks, popup blocking, mouse gestures, extension architecture, download manager, small, fast . . .) and one immense thing: that the power of the Internet and the power of open source are two sides of one coin, minted by millions of people working together as never before. Firefox shows what can be done when people use the web to collaborate without any agenda other than a common vision of simplicity and ease of use, and with the freedom to extend that vision according to individual good taste in boundless directions through XUL extensions.

In the case of Firefox 1.0, those people include the dozens of top hackers on the Mozilla project, the project managers at the Foundation and among the key strategic partners, the hundreds of CVS committers, the thousands of daily build testers and advocates, and the millions of users. I’ll single out only four by name, without slighting any others in the least.

First, many thanks to ben, who took up the flag after 0.5, kept his cool and his great sense of design under pressure, and carried the ball into the end zone. Kudos also to blake and hyatt, who started it all and showed the world the way to a better mousetrap. Finally, thanks again, and always, to asa, for his tireless testing and release leadership.

Onward to Firefox 1.1 and Mozilla 2.0!

/be

Firefox news in brief

For the impending PR1 candidate builds (tomorrow’s, we hope):

  • Alternate Style Sheet switcher makes a comeback thanks to Fantasai, with Ben reviewing and Asa approving. The statusbar icon won’t show up unless the page has alternate sheets, which is an improvement. There’s a View menu item that disables all author-level style sheets.
  • Work Offline is back in File, and we may fix a few of its bugs for 1.0 if we can get help from Darin.

Back to Mozilla roadmap topics in my next update, some time soon-ish.

/be

Everyone remain calm

A lot of folks in the Mozilla community share the reaction Boris had to some deeply mistaken, tentative and now-aborted plans to remove View / Source and other “developer” features from Firefox. I wanted to point out that these plans were not made with agreement from me or, as far as I can tell, from Ben. First, let me just say that there is no way Firefox would ship without View / Source or any other UI that goes back to Netscape 1, and is therefore part of the “body plan” of browsers. Not while I’m around and involved, at any rate.

People dive into HTML all the time, copying and pasting, hacking, cribbing. View / Source is indispensable for such learning, not to mention for the kind of trouble-shooting all too frequently done by “end users”. My wife uses View / Source, and so do millions of others, whether or not they are “web developers” ™. You don’t have to be a Gecko hacker or even a paid web content designer to appreciate View / Source — far from it.

The web is the most popular and populist programmable content system ever created. There is no government-enforced monopoly, no union card, no web developer elite or cartel using arcane tools not available to mere mortals (well, there are plenty of would-be elites, and too many arcane and expensive tools, but web content can be, and often should be, written by hand). More than a few grandparents hack HTML and even JavaScript (not perfectly, but usually well enough).

The line between a “user” and a “developer” is soft and flexible on the web, and it should remain that way, lest some know-it-alls or business-suited sharpies lead us down an over-complicated, proprietary path.

Even in the early days of NCSA Mosaic, when there were ~40 servers in the world whose content anyone cared about breaking with incompatible browser changes, marca and ebina had good reason to tweak Mosaic’s layout engine to support known usage errors, some of which we now call “quirks”.

I cheerfully acknowledge that this is heresy, but their decision (insofar as it was a decision) was simply good economics, and it offered better usability or human factors design than a strict SGML purism would have afforded. Without tolerating human error of the sort tolerated in natural languages, I think it likely that the web would not have grown as it did.

Throughout the explosive growth of the web, View / Source has played a crucial role, hard to appreciate if you dumb down your user model based on myopic hindsight and a static analysis of the majority cohort of “end users”.

Anyway, I wanted to reassure everyone, from our top Gecko hackers to interested web developers to enthusiastic surfers, that Firefox is not about to implode into a bare-bones, ultra-minimalist browser that those important hackers, et al., can’t use. Firefox cannot be “all things to all people” without at least some people having to configure an extension or two, but the default features should support the crucial user bases.

There’s another point worth making about these late-breaking Firefox feature removal plans: it’s not wise to remove anything significant after Firefox 0.9, because removal has risks and opportunity costs — more marketing than technical to be sure — that we can’t fully assess in the time remaining before 1.0. If we can remove a buggily inadequate, vestigial feature such as offline support, perhaps. But certainly not View / Source, View / Page Info, or the JavaScript Console item in the Tools menu.

I’m willing to see DOM Inspector moved to an extension, based on its relative novelty and complexity compared to View / Source.

I’m increasingly skeptical about the wisdom of the alternative style sheet UI removal decried by Daniel, and I’ll make sure that feedback from the preview release on this removal is heard and fairly evaluated.

About Firefox UI: we’re trying something with both SeaMonkey and Firefox (and Thunderbird) now that couldn’t be done in the old days when Netscape paid most module owners, along with a good number of professional UI (or UE, they used to call it) designers: individually accountable product design leads.

Product design can’t be done well by committee, and SeaMonkey’s UI was always worse for the compromises, bloat, and confusion about its intended audience that resulted from past committees. No one was much or well empowered by any nominal share in such a committee or mob.

For SeaMonkey, which is moving into a sustaining engineering mode, and won’t be our premier product after Firefox 1.0, Neil Rashbrook leads the UI design and implementation.

For Firefox, Ben Goodger is the design lead.

/be

Mozilla Developer Day Slides

The slides that shaver and I presented at last Friday’s Mozilla Developer Day are up now.

As presented at dev-day, these slides nicely demonstrated support for Apple’s canvas tag, embedded in Mozilla as <xul:canvas> and implemented using Cairo (a static PNG of the clock and animated stars must stand in for now, in the published slides, but you can view source to see the starbar.js script and related source). Thanks go to vlad and stuart for their heroic efforts hacking up <canvas> support.

As vlad pointed out, you can think of <canvas> as a programmable image tag. In that light, it’s reminiscent of the XBM images generated using <img src="javascript:genxbm()"> tricks developed by Bill Dortch back in the Netscape 2 era. All of which is to say, we really should have implemented this kind of tag in 1995, but both the management and hacker cultures then at Netscape deferred to Java for programmable graphics and other such features. What’s ironic is that this left most web designers reaching for Flash, not for Java, as the browser and Java-in-the-client wars played out.
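For flavor, here is a hypothetical sketch of that era's trick (illustrative code, not Bill Dortch's original; the name `genxbm` just echoes the idiom): XBM is plain C source text, so a script can compute an image's bits entirely client-side and hand the resulting string to an img tag.

```javascript
// Hypothetical sketch of the <img src="javascript:genxbm()"> trick:
// compute an 8x8 checkerboard as XBM text, no server round-trip.
function genxbm() {
  var width = 8, height = 8, bytes = [];
  for (var y = 0; y < height; y++) {
    var row = 0;
    for (var x = 0; x < width; x++) {
      if ((x + y) % 2 === 0) {   // checkerboard pattern
        row |= (1 << x);         // XBM: the low bit is the leftmost pixel
      }
    }
    bytes.push("0x" + row.toString(16));
  }
  // XBM is just C source: two #defines and a byte array.
  return "#define img_width " + width + "\n" +
         "#define img_height " + height + "\n" +
         "static char img_bits[] = { " + bytes.join(", ") + " };";
}
```

A programmable image, ten years early; <canvas> gives the same power a real API and a real drawing model instead of string-pasted C source.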

The WHAT Working Group is considering standardizing <canvas>, with the goal of interoperating implementations based on the standard. My hope is that this is done both well, and quickly, in keeping with the WHATWG charter.

People ask about how SVG in Mozilla and <canvas> relate. The short answer is that they don’t, except that both must work well in all dimensions (complete and fast alpha-blending, e.g.) on top of a common graphics substrate, which looks likely to be Cairo.

A longer answer could compare and contrast <canvas>’s non-persistent, procedural, PostScript-y rendering API with SVG’s declarative markup and persistent DOM. The upshot is that SVG and <canvas> complement one another, catering to fairly distinct requirements and authoring audiences.

One crucial fact to keep in mind: <canvas> support is tiny compared to the implementation of any known profile of SVG, so it will be easy to justify adding <canvas> support to default builds of Mozilla products. SVG should be supported in the same way as XForms and other, bulkier implementations of standards not yet seen much on the web: as a one-click download-and-install package that extends Gecko. I’ve asked top hackers to look into generalized support for such Gecko extensions, based on XTF, with versioning and update management a la Firefox’s extensions.

I’ll blog separately about the other points of interest raised in these slides.

/be

WHATWG and RSS

Dave Winer seems to have misheard my exchange with the Gillmor Gang about RSS and HTML: I was asked, at around 36 minutes into the show (not 20 minutes), whether the Web Hypertext Application Technology Working Group considered RSS to be “completely orthogonal” to HTML, and I said (paraphrasing slightly) “RSS is not on the WHATWG’s radar, we are extending HTML to better support web apps”.

Steve Gillmor then went on to aver that RSS and HTML were orthogonal, and asked about browser built-in HTML editing support for bloggers.

Based on this brief exchange, Dave blogged: “I agree with everything Brendan says up to the point where he says RSS and HTML are orthogonal. Take another look, RSS wraps chunks of HTML with useful metadata.”

Now, wrapping chunks of HTML in RSS sounds like a proof of orthogonality to me, especially if the HTML can be wrapped and unwrapped freely. But maybe I’m using too mathematical a definition of “orthogonal.”
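In the sense I have in mind, two formats are orthogonal when varying one doesn't constrain the other. A minimal sketch of the wrap/unwrap round-trip (hypothetical helper names, naive CDATA handling standing in for real XML escaping; not safe for untrusted input):

```javascript
// Hypothetical sketch: wrap an HTML chunk in an RSS-ish item and
// unwrap it unchanged. The metadata envelope and the HTML payload
// vary independently -- that independence is the orthogonality.
function wrapInItem(html, title) {
  return "<item><title>" + title + "</title>" +
         "<description><![CDATA[" + html + "]]></description></item>";
}

function unwrapItem(item) {
  var start = item.indexOf("<![CDATA[") + "<![CDATA[".length;
  var end = item.indexOf("]]>", start);
  return item.substring(start, end);
}
```

Any HTML chunk round-trips untouched, and any title can wrap any payload: neither format constrains the other.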

WHATWG may yet be interested in RSS for HTML metadata, but no one involved has brought RSS up in connection with web apps. Dave, if you care to make a proposal, here’s the list info.

/be

Mozilla 2.0 virtual machine goals

  1. Multiple languages supported, including JS, Java, and Python.
  2. Good cross-language integration: inheritance, type matching, etc.
  3. Cross-language debugging, ideally including C++.
  4. One GC to rule them all, preferably one shared GC, not a super-GC ruling a zoo of heterogeneous GCs and reference-counting subsystems.
  5. Decent JITed performance, because performance matters when you can least afford to rewrite in C++.
  6. Sandboxed execution security model, type safety, defense in depth.
  7. Part of a larger open source platform . . .
  8. . . . that has momentum and excited developers hacking on it.
  9. A sufficiently open specification process for standardizing and evolving that platform.
  10. New XML language implementations easy to write, plug in, and demand-load (if we had this today, we would use it for non-web standards such as XForms).

The last item means some number of frozen APIs into the content code, e.g. XTF. SVG and existing XUL and HTML widgets may provide a rich enough rendering vocabulary. If we can mix them well using XBL, then we may not need to expose frozen APIs into layout, gfx, etc.

The goal should be minimal, supportable, and composable high-level APIs, where possible. XBL and XTF point the way.

Back to the more programming-language-specific points above: comments welcome on realistic candidates. Obvious and controversial possibilities include Mono, Java (if open source), and Parrot (last I looked, *way* too Perl6-focused, not very mature, not looking likely to mature quickly).

/be

Mozilla 2.0 platform must-haves

  1. libxul.so/libxul.dll, a versioned shared library with minimal, frozen, documented API exports, and fast intra-library calling convention code (so small footprint compared to today’s “GRE” or “XRE”).
  2. xulrunner/xulrunner.exe, so you can write ‘#! /usr/bin/xulrunner’ at the top of a .xul file and get busy.
  3. XUL 2 and XBL 2 — standardized specifications, greater binding language power, more scripting languages, iTunes-like widgets, and working remote XUL/XBL.
  4. SVG support to a useful level, not necessarily the whole 1.1 spec.
  5. Web Forms 2.0.
  6. XForms Basic (or ultra-Basic, details TBD — the heavyweight item is Schema-based validation) support as an optional, downloadable extension.
  7. JavaScript 2.0 support, including ECMAScript for XML support.
  8. Python support, perhaps via Mono (if so, along with other programming languages).

That’s a good start. Comments?

/be

The non-world non-wide non-web

I spent a day at the recent w3c workshop on web apps and compound documents. Due to vacation, that day was the second, so I missed the chance to hear JavaScript praised as the worst invention of all time.

The adolescent sniping and general irrelevance continued on the second day, however. The sad fact is that the w3c is not concerned with the world wide web, AKA the Internet. Rather, the focus for a while now seems to be on vertical tool/plugin and service/cellphone markets, where interoperation is not a requirement, content authors are few and paid by the vertical service provider, and new standards provide livelihoods and junkets for a relative handful of academics, standards body employees, and big company implementers.

Evidence of the vertical nature of the new standards? There are only a few hundred tests for the SVG w3c recommendation. That’s several decimal orders of magnitude short of what is required just for surface coverage. Often recently, when Hixie hears of a claim about an interesting SVG standard feature, he writes a testcase. Adobe’s plugin too often fails that testcase, although I am sure Adobe SVG tooling produces content that works with the Adobe plugin. Interoperation is a joke.

Real browser vendors, who have to deal with the ugly web as it is, know better. The dream of a new web, based on XHTML + SVG + SMIL + XForms, is just that — a dream. It won’t come true no matter how many toy implementations (including Mozilla implementations — we’ve supported XHTML for years) there are. Long before the w3c gets compound documents working on paper (having missed the chance with SVG 1.0 and 1.1, which ambiguate and conflict with CSS), XAML etc. will leak onto the public web.

What matters to web content authors is user agent market share. The way to crack that nut is not to encourage a few government and big company “easy marks” to go off on a new de-jure standards bender. That will only add to the mix of formats hiding behind firewalls and threatening to leak onto the Internet.

The best way to help the Web is to incrementally improve the existing web standards, with compatibility shims provided for IE, so that web content authors can actually deploy new formats interoperably.

What has this to do with Mozilla’s roadmap? Not much, which is why apart from HTML, CSS, DOM, and SVG, which we support, you probably won’t hear much more about the w3c here. But Mozilla is joining with Opera and others to explore the sort of incremental improvements to HTML proposed by us at the workshop. I expect the resulting specs and implementations to play a significant part in the roadmap.

/be