Today I saw the future (Update)

As noted at the Mozilla blog, OTOY and Amazon along with Autodesk and Mozilla have announced the next step in Amazon and OTOY’s GPU/cloud effort.

Demo videos:

This means developers can get started using ORBX.js with GPU-cloud encoding and downloadable decoding on all modern Web clients.

It also means that any of the Hollywood Six can start a streaming video service that reaches the most users across the Web (compared to any other purely Web-based service), using watermarking not DRM. More on this soon, if all goes as I hope.

Note that I’m an OTOY advisor. Not because of any compensation, but because I believe in their approach and their talent.


14 Replies to “Today I saw the future (Update)”

  1. What has changed since May? I’m interested in using ORBX.js, but just like the announcement in May, there is no link to any source code or download.

    It’s not clear how I as a developer can use ORBX.js. This just seems like a fluffy marketing press release for all the companies involved.

  2. Now all we need is ubiquitous 50mbps data connectivity 🙂

    (I picked a random number)

    The Samsung Pixel would be the only device I’d need then – we’ll get there eventually, and sooner rather than later, judging by these videos.

    Nice one!

  3. What’s the licensing of the ORBX.js encoder and decoder? Is the source code available? Are there any patents that must be licensed to encode or to distribute a compatible encoder, decoder, or encoded files? Is an audio codec bundled? Is this intended only for live streaming, or is there a stable file format for archiving?

  4. Oh and one last question. 🙂 From the announcements earlier this year about ORBX.js I seem to recall that support for mobile devices was limited in framerate or resolution due to lack of WebGL being required for anything but key frames. Is this still a problem, or is there an in-browser (or native app) solution for full frame rate decoding on iOS and Android? (Or is the recommended solution to use H.264 and AAC for these platforms, which requires a patent license to encode and distribute?)

  5. I was going to ask the same question as Brion about ORBX’s source and its availability. Also, did Mozilla contribute to the code at all? Perhaps from Broadway.js?

    Maybe I missed a discussion somewhere, but wouldn’t a very valuable approach be to have a VP8 & Theora (and in the future VP9 & Daala) JS + WebGL implementations similar to ORBX.js that could be used as a fallback if native implementations are not available? This way everyone could play these codecs *now*.

    If a JS implementation can be sufficiently efficient in terms of power and performance (compared to hardware support for alternate codecs (h.264)) this would seem like a silver bullet to allow free, unencumbered codecs to win.
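    A minimal sketch of that fallback idea, assuming a hypothetical JS+WebGL decoder as the fallback path. The selection logic takes a `canPlayType`-style probe (the standard HTMLMediaElement method) as a parameter so it can be exercised outside a browser; `"js-webgl"` is just an illustrative label, not a real API:

    ```javascript
    // Prefer a native <video> decoder when the browser reports support,
    // otherwise fall back to a (hypothetical) JS+WebGL software decoder.
    // canPlayType returns "probably", "maybe", or "" per the HTML spec.
    function pickDecoder(canPlayType, mimeType) {
      var answer = canPlayType(mimeType);
      if (answer === "probably" || answer === "maybe") {
        return { kind: "native", mimeType: mimeType };
      }
      return { kind: "js-webgl", mimeType: mimeType };
    }

    // In a real page one would write something like:
    //   var video = document.createElement("video");
    //   var choice = pickDecoder(video.canPlayType.bind(video),
    //                            'video/webm; codecs="vp8, vorbis"');
    ```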

  6. Caspy7, I’ve been doing some experiments with JavaScript and iOS native Vorbis & Theora decoding which you may find interesting:

    * (JS)

    * (iOS)

    JS performance is better than I expected on modern desktop hardware, but still needs tuning to make use of the GPU when available… at least the YUV->RGB conversion could easily be done with WebGL on IE11, or on Safari with WebGL manually enabled — this is the approach taken in Broadway.js, as I recall, and I may just steal their YUV canvas wrapper code. 😉

    The Vorbis and Theora decoding in JavaScript is currently done with emscripten cross-compiles of the C reference libraries; in theory one could probably hand-tune some of the actual decoding in native JS and/or WebGL, but I’m not sure how much work it would take. Making use of multicore processing also isn’t very compatible with the emscripten approach, so it’s easy to max out a single CPU core on slower devices.

    Don’t expect 1080p60 out of JavaScript Theora decoding, at least not yet. 🙂

    The iOS native version is more encouraging on mobile, but again would need a lot of optimization to avoid draining battery unnecessarily, and to work well on the slowest devices.
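    For concreteness, the YUV->RGB step mentioned above is just a per-pixel linear transform. A scalar plain-JS version of the standard BT.601 limited-range conversion (the same arithmetic a WebGL fragment shader would run for every pixel in parallel) might look like this — the coefficients are the usual published ones, not anything taken from Broadway.js:

    ```javascript
    // BT.601 limited-range YUV -> RGB, one pixel at a time. A shader
    // does this per fragment on the GPU; the scalar JS form is shown
    // only to make the math explicit.
    function clamp255(x) {
      return Math.max(0, Math.min(255, Math.round(x)));
    }

    function yuvToRgb(y, u, v) {
      var c = y - 16;   // luma, limited range 16..235
      var d = u - 128;  // blue-difference chroma offset
      var e = v - 128;  // red-difference chroma offset
      return [
        clamp255(1.164 * c + 1.596 * e),              // R
        clamp255(1.164 * c - 0.392 * d - 0.813 * e),  // G
        clamp255(1.164 * c + 2.017 * d)               // B
      ];
    }
    ```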

    1. @Caspy7 & Brion: one aspect of ORBX.js that’s novel: it was designed over many years, with newer profiles or versions building on past learnings, to be more parallelizable than the codecs “from the 90s” (I’m generalizing a bit here, but not unfairly I think).

      With WebGL2 uplifting OpenGL ES3 stuff, it’s even more efficient (integer not f.p. coding). Yet more wins coming after that, and the downloadable “just content” aspect really shines there, as you note.


  7. @Caspy7 & Brion – OTOY is not sharing info on licensing ORBX.js. Six months ago people asked this question on Brendan’s previous blog post. It’s fairly clear at this point that it’s proprietary tech.

    Even if the decoder is open source, it’s of little use without the ability to encode. After all this PR, it’s no more ‘open web’ than Flash was.

    1. @Ace: ORBX.js is something we are urging OTOY to open-source, at least the decoder side, but that’s their call. Unlike Flash, no plugin is required — the open web aspect is JS+WebGL and more along those lines (WebGL2, SIMD and Parallel JS, safe WebCL-like stuff on our agenda). The web has tons of non-open content (JS, text, images, server-side code), for which no one impeaches a website _a priori_ as “not open web”.

      We wouldn’t have the Open Web (capitalized) without tons of non-open content, along with plenty of open (source, or CC licensed, e.g.). We want video codecs written in JS and WebGL to be possible to express as part of that content, in addition to — and ultimately instead of — the H.264-in-hardware and Flash-as-unsafe-plugin traps.


  8. I’m also wondering if ORBX is still capable of similar performance on iOS without the optimizing JIT?

    1. @Caspy7: In the May event at Autodesk HQ, Jules of OTOY showed ORBX working well on an iPhone (4S I recall). No WebGL, but the JS JIT. Pretty amazing. The smaller screen helps.

