Today I Saw The Future

This morning, Mozilla and OTOY made an announcement:

Mozilla and OTOY deliver the power of native PC applications to the Web, unveil next generation JavaScript video codec for movies and cloud gaming

What this means:

ORBX.js, a downloadable HD codec written in JS and WebGL. The advantages are many. On the good-for-the-open-web side: no encumbered-format burden on web browsers; they are just IP-blind runtimes. Technical wins start with the ability to evolve and improve the codec over time, instead of taking ten years to specify it and burn it into silicon.

After these come more wins: 25% better compression than H.264 for competitive quality, adaptive bit-rate while streaming, integer and (soon) floating point coding, better color depth, better intra-frame coding, a more parallelizable design — the list goes on.
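One concrete payoff of shipping the codec as JS: even streaming policy like adaptive bit-rate becomes small, evolvable client code rather than firmware. A toy sketch (the rung ladder, numbers, and names are illustrative only, not the ORBX.js API):

```javascript
// Illustrative only, not ORBX.js API: pick the highest bit-rate rung
// that fits within a safety margin of the measured network throughput.
var RUNGS = [400, 1200, 2500, 5000]; // kbps ladder (hypothetical)

function pickBitrate(measuredKbps, safety) {
  safety = safety || 0.8;  // keep 20% headroom for jitter
  var best = RUNGS[0];     // floor: always stream something
  for (var i = 0; i < RUNGS.length; i++) {
    if (RUNGS[i] <= measuredKbps * safety) {
      best = RUNGS[i];
    }
  }
  return best;
}
```

A downloadable codec can ship a smarter policy next week; silicon cannot.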

The GPU cloud has your back. Think of the amazing 3D games that we have on PCs, consoles, and handheld devices thanks to the GPU. Now think of hundreds of GPUs in the cloud, working for you to over-detail, ray/path-trace in realtime, encode video, do arbitrary (GPGPU) computation.

Or consider high-powered tools from Autodesk, Adobe, and others for 3D modeling and rendering:

Native apps from any popular OS, in the GPU cloud and on your browser. Yes, both: this is not just remote desktop tech, or X11 reborn via JS. Many local/remote hybrid computation schemes are at hand today, e.g. a game can do near-field computing in the browser on a beefy client while offloading lower LOD work to the GPU cloud.

OTOY’s CEO Jules Urbach demo’ed an entire Mac OS X desktop running in a cloud VM sandbox, rendering via ORBX.js to Firefox, but also showed a Windows homescreen running on his Mac — and the system tray, start menu, and app icons were all local HTML5/JS (apps were a mix ranging from mostly local to fully remoted, each in its own cloud sandbox).

Valve’s Steam was one such app:

Watermarking, not DRM. This could be huge. OTOY’s GPU cloud approach enables individually watermarking every intra-frame, and according to some of its Hollywood supporters including Ari Emanuel, this may be enough to eliminate the need for DRM.

We shall see; I am hopeful. This kind of per-user watermarking has been prohibitively expensive, but OTOY estimates the cost at pennies per movie with their approach.
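To make the idea concrete, here is a toy LSB sketch of per-user marking (mine, purely illustrative; real forensic watermarking is spread-spectrum and survives re-encoding, which this does not):

```javascript
// Toy illustration only: stamp a 32-bit user ID into the least significant
// bits of a frame's first 32 pixel bytes, and read it back out.
function embedWatermark(pixels, userId) {
  var out = pixels.slice();
  for (var bit = 0; bit < 32; bit++) {
    var b = (userId >>> bit) & 1;
    out[bit] = (out[bit] & 0xFE) | b; // overwrite the LSB
  }
  return out;
}

function extractWatermark(pixels) {
  var id = 0;
  for (var bit = 0; bit < 32; bit++) {
    id |= (pixels[bit] & 1) << bit;
  }
  return id >>> 0; // force unsigned
}
```

The point is where this runs: per-frame, per-user marking is cheap when the encode already happens per-session on a cloud GPU.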

Oculus Rift, Lightfield displays, Holodecks, and beyond. OTOY works with Paul Debevec of USC’s Institute for Creative Technologies. This is Tony Stark stuff, coming at us super-fast and soon to be delivered via JS, WebGL, and ORBX.js running in the browser.

I was thrilled to be included in today’s event, hosted at Autodesk’s fabulous San Francisco offices. I gave a demo of Epic Games’ Unreal Engine 3 (Unreal Tournament, “Sanctuary” level) running via Emscripten and asm.js at full frame-rate in Firefox Aurora, and spoke about how JS will continue to evolve “low-road” as well as “high-road” APIs and features to exploit parallel hardware.

As Jeff Kowalski, Autodesk’s CTO, pointed out, the benefits go beyond major cost reduction in CGI and similar processing work, to increase collaboration and innovation radically, by relieving creative people from having to sit at big workstations. The GPU cloud means many alternative ideas, camera angles, etc., can be tried without waiting hours for each rendering. Even from the beach, via your 4G-connected tablet. Teams around the world can collaborate closely as timezones permit, across the web.

We will continue to collaborate with OTOY; I’ll post updates on this topic. It’s hot, and moving very quickly. Kudos to OTOY for their brilliant innovations, and especially for porting them to JS and WebGL so quickly!

When we at Mozilla say the Web is the platform, we are not bluffing.

/be

P.S. Always bet on JS!

P.P.S. Hat tip to Andreas Gal for seeing far, with Broadway.js.

57 Replies to “Today I Saw The Future”

  1. Perhaps I should wait until there’s more info on this announcement, but I just wanted to clarify: the key couldn’t-be-done-before technology here is the HD codec running in Javascript. The remote desktop technology is nice and cool and everything, but a completely separate thing that just happens to be possible because of the codec.

    Some questions:
    1) What license is the codec encoder and decoder?
    2) Can the encoder also run in Javascript?
    3) Assuming the answer to (1) is free for both, and (2) is no, how difficult will it be for hardware and open source encoders to implement?

  2. @Voracity: Please read again, this isn’t “remote desktop technology”, although it can do that too. The codecs (several, including lossless) are essential to the GPU-cloud value proposition. The GPU cloud is essential to the watermarking idea.

    The two parts of the announcement, JS+WebGL decoder and the GPU cloud driving the encoder, combine to make a whole greater than the sum of the parts.

    Answers:
    1) Still being discussed, more when I know more.
    2) The encoder needs OpenCL or similar (OpenGL/Compute) power; WebGL as is (based on OpenGL-ES2) is not enough. But @vvuk says WebGL will be updated soon to OpenGL-ES3. Then we will have to re-evaluate.
    3) See (2): we need to update WebGL at least, make use of ES3 features, and see how well that works.

    The first point I made in bold is that downloadable codecs do not need to be freely licensed. IPR, even if only defensive patents, may be in play. But none of this taints the browser. This is a net win, along with the watermarking-not-DRM idea. It beats the H.264 and DRM alternative!

    /be

  3. How practical would this be for sites like Vimeo, DailyMotion, and YouTube to deploy? It seems like it needs quite a lot of compute resources on the server side.

  4. “downloadable codecs do not need to be freely licensed.”

    They kind of do if you want to actually PARTICIPATE on the web – without a legal encoder, we can’t legally produce our own “content”.

    It’s not really an “open web” if one is only legally allowed to “consume” and not actually participate by contributing one’s own work to the web legally.

    “It beats the H.264 and DRM alternative”

    I can agree with that much, it would be an improvement. I worry that it could turn into a situation like “Well, you’re still enslaved, but now you’ll only be beaten where the bruises won’t show and no bones will be broken. Mission accomplished!” though. It’s starting to feel like everybody’s endpoint is “The Internet is now TV. Open wide, consumer!”

    This will be extremely awesome if the encoder can be at least legally freely reimplemented so that everyone will be able to join in (even if OTOY keeps their own implementation proprietary) though – looking forward to hearing more about this project.

    (P.S.: how easy will it be to sync video in this form to opus audio?)

  5. The otoy link is down for me.

    ORBX sounds great and good to have. It’s not what we need to break the MPEG cartel’s stranglehold on video, though.

  6. I used “remote desktop technology” as imprecise shorthand. To be more accurate, what I meant was “computing not on the client”, which isn’t what interests me in this announcement. (If anything, it somewhat fuels my concerns about the potential to over-centralise, but that’s another issue.)

    “The first point I made in bold is that downloadable codecs do not need to be freely licensed. IPR even if only defensive patents may be in play. But none of this taints the browser. This is a net win”

    I would not consider it a net win if non-defensive patents are involved. Downloadable codecs imply consumers and platforms (browsers, OSes, etc.) aren’t directly affected by patent considerations, but *creators* and tool developers would be. It’s more critical to protect the freedom to create than to consume; a non-free codec doesn’t help, and might make things worse if it became the de facto standard.

    But I don’t know the full details yet and I trust Mozilla, so I trust the codec won’t knowingly involve any non-defensive patents.

  7. Maybe I got it all wrong but as far as I understand, the whole thing does not work without connectivity, right?
    And maybe I also didn’t understand that the whole CPU/GPU power of an end user device goes into rendering instead of computing the specific program locally. Is this correct?

  8. I’d like to see you give Dark Shikari (lead developer of x264) a copy of the encoder so that he can do a thorough comparison of x264 vs. the OTOY codec.

    BTW, can’t you just use VP9 and port that to JavaScript, as I suspect that VP9 will give better video compression than OTOY’s codec?

    Will the source code to codec be released so that others can help improve it?

    I don’t see how the codec will survive lawsuits for patent infringement by MPEG-LA unless you have google bankrolling it like they did with VP8 and licenced some of the patents a few months ago. Are you going to give the MPEG-LA the source code and ask them to tell you what patents it may infringe and then change the codec so that it doesn’t violate the patents?

    Will we be able to download the .mp4 files that are being streamed to us? If not, and YouTube switched to this, we would never be able to download their videos and store them on our PCs, as YouTube often censors things such as govt corruption.

  9. Curious if any of this is using asm? Would it benefit? Perhaps all the heavy lifting is done in WebGL.
    If it might benefit, would be interesting to see the contrast in performance of using asm.js on non optimized browsers (compared to pre-asm conversion).

  10. “25% better compression than H.264…” – 25% better compression than WHICH h.264 encoder?

    Learn it already: it’s not about the format itself, it’s about how well the specific encoder analyzes still frame by still frame and how well it creates the video program stream from that information. There are crappy h.264 encoders, and there are excellent h.264 encoders.

  11. Roc said: ORBX sounds great and good to have. It’s not what we need to break the MPEG cartel’s stranglehold on video, though.

    Exactly. Why isn’t Mozilla focused on making JS shims for competitors’ browsers so *they* can play WebM, rather than looking at ways to tolerate the mess that is H.264 licensing with JS loaders inside Firefox (i.e., Broadway.js)?

    You guys have everything backwards at the moment.

  12. @roc: What do you think possibly could “break the MPEG cartel’s stranglehold on video”? In practical terms, there is no solution. H.264 codecs — including a realtime encoder for the ubiquitous video camera — are on our phones in hardware.

    We need to move the goal posts and aim beyond the ten-year cartelized design-and-implement-in-hardware cycle. OTOY is a bet that way, and a good one in view of the results. The GPU is the more general parallel hardware to focus codec design around. Yes, entropy or arithmetic coding is serial, but that’s only one stage of the pipe.

    I was talking to Tim Sweeney at GDC about how hardware took a wrong turn in the ’90s. “Wintel” meant x86 compat mattered, so big, power-hungry, super-scalar/out-of-order cores emerged. On the post-SGI side, the GPU went parallel in a more power-efficient way.

    This rift is being healed now, but too slowly. Intel folks say they could heal it faster, but for the fact that software (which no longer needs x86 compat — ARM rules mobile, and JITted JS source dominates) can’t scale to match exponential core growth. Still, projecting CPU and GPU trends, the gap will be filled with greater homogeneity and parallelism.

    In the latter half of the noughties, H.264 got fully into hardware, but given the hardware trends and the ongoing need to shrink area and power, are we really condemned to burn encumbered, soon to be obsolete, codecs into SiO2? I think not.

    @John: why do you put it on Mozilla, instead of the much better-heeled Google, to make JS shims for WebM? Your “who/whom” assumption, to borrow Lenin’s famous question, is backwards.

    Most important in context of this post: VP8 like H.264 does not parallelize as well via WebGL as ORBX does, so a “JS shim” is not going to compete. Hardware optimization is required these days, whether bespoke and for only one format (H.264) or more general but still efficient enough. Serial JS software by itself is not enough.

    Note also how Google has been late, with only mixed commitments, in getting SoC vendors to spend chipset area on VP8 on top of mandatory H.264. Indeed, asking for the extra system cost of two codecs when (by most consumers’ and business people’s lights) one is enough is asking for the moon.

    H.264 in hardware happened. We can’t undo it. The question is: what should we do to leapfrog it? Complaining about Mozilla not tilting at windmills misses the mark by light-years.

    /be

  13. @Voracity: over-centralization is a concern, but I contend more with the giant first party powers (Google, FB, etc.) than with an OTOY.

    Economy of scale — more crudely: thermodynamics — favors the cloud, even in the best case of successful and robust de-centralized alternatives (unhosted.org, p2p systems [which have their more centralized backbones too]).

    I’m with you in fighting against over-centralization due to path-dependent winner take all/most effects. That’s one way of looking at Mozilla’s mission, but it does not make us anti-commercial at all. We have to improve the world incrementally.

    Defensive patents are important in this imperfect world, but OTOY assures us, and I believe them after checking their business model, that they have no rent-seeking patent royalty plans at all. OTOH, they may have unaddressed IPR concerns, especially on the encoding side. We will work through these in the coming days and months.

    Part of what I wrote here was not about OTOY, but about the bigger picture: downloadable codecs, the amazing input and output device work at USC ICT and OTOY, the return of VR with the Oculus Rift, real-time lightfield capture/projection and holography. Leapfrogging is not just about codec efficiency!

    /be

  14. >OTOY assures us, and I believe them after checking their business model, that they have no rent-seeking patent royalty plans at all.

    Get it in writing 😉

    My first thoughts on seeing the story were “So they put RealPlayer in JS and RealServer in the cloud?” An obvious play now that computing power is sufficient. It eliminates the ‘click here to install plugin’ barrier to be sure, and the browser is one step further along the path of becoming the entire platform.

    If ORBX survives basic scrutiny (where some previous OTOY announcements have not), it probably means our own future codec releases will need to be in both C and JavaScript, something Lucas Gonze had been practically begging us to do for years. It honestly wasn’t feasible before the MediaSource API; now it may be foolish to ignore.

  15. @Pete: using hundreds of GPUs in the cloud to render in realtime (which used to take coffee-break time or longer) is one application. Remoting a Windows app to an iPad is another (Steam!).

    Distributed computation with some local and (where it wins) anywhere from some to much more remote — that’s the interesting option. It is the best way short of purely a local app to cope with network latency and outages.

    /be

  16. @Caspy7: ORBX.js doesn’t use asm.js yet; it’s not Emscriptened C or C++, but rather a by-hand port or evolution of OTOY codecs written in C/C++. Coding asm.js by hand is pretty counter-productive. You’re right that WebGL does the heavy lifting.

    As we evolve asm.js support, add SIMD to JS, and upgrade WebGL, I’m sure OTOY will keep re-optimizing. That is a huge advantage of downloading rather than burning into SiO2.
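For the curious, this is what the asm.js style looks like, and why hand-coding it is counter-productive: a toy module (not ORBX.js code), where the coercions exist for the compiler, not the human:

```javascript
// Toy asm.js-style module (not from ORBX.js). The "use asm" directive plus
// the |0 coercions let an AOT compiler (OdinMonkey) treat everything as int
// arithmetic; it is still plain JS and runs unchanged in any engine.
function AsmSum(stdlib, foreign, heap) {
  "use asm";
  var HEAP32 = new stdlib.Int32Array(heap);

  function sum(n) {
    n = n | 0; // parameter coercion: n is an int
    var i = 0;
    var total = 0;
    for (i = 0; (i | 0) < (n | 0); i = (i + 1) | 0) {
      total = (total + (HEAP32[i << 2 >> 2] | 0)) | 0;
    }
    return total | 0;
  }

  return { sum: sum };
}
```

Linked against a stdlib object and an ArrayBuffer heap, `AsmSum(...).sum(n)` totals the first n int32s in the heap — machine-friendly, but nothing a person should write by hand at codec scale.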

    /be

  17. @Monty: I’d love to get lots of things in writing, including re: VP8 vs. H.264 non-defection. Wishes are not horses.

    What convinces me here more than words on paper is the viability of the per-GPU-hour business plan, which depends on the downloadable codec (and not hardware) for reach. If this went south, it’d be a shame, but it looks strong so far. Glad you agree!

    /be

  18. @Robin: Please do not shoot the messenger. I “learned it” (encoders matter) long ago (I wrote a toy MPEG-2 encoder while at MicroUnity in ’94). OTOY’s claim was in reply to a reporter, the setting was low on technical nuance. It is credible, given H.265’s claim of 50% better compression.

    Indeed, encoders are still mystery-meat relative to decoders. I am in favor of open source codecs. Should OTOY not see their way clear to open-sourcing all their work, then we still have incremental wins:

    1. Downloadable codecs that use GPUs relieve us down the road of patent and hardware rents.

    2. The OTOY innovation blazes a trail others can follow. You want an open and better codec? Build it. Mozilla is looking into this already, per our Broadway.js work and as Monty’s comment suggested.

    But we are collaborating with OTOY and I like their cadence and thinking. So far so good. Do not throw out baby and bathwater at start of bath!

    /be

  19. This is amazing. Is there an alpha/nightly that includes this functionality available?

  20. So what kind of hardware (CPU, GPU) is necessary on client side? Does it need shaders in GPU?

    I thought video decoding (H.264-like) requires quite a lot of computing power; actually, my desktop system can decode HD H.264 videos only when using Nvidia’s VDPAU hardware acceleration. I’m still fuzzy on how that’s solved in ORBX.js – is the decoding algorithm so much faster, do you offload most of the decoding to the GPU, or is your code better optimized than current H.264 decoders?

  21. Brendan, because I’m wearing a ‘I support the Open Web’ bracelet with Mozilla on the reverse side, not Google. They can speak for themselves on their own blogs.

    Recent changes in Gecko have meant that the web experience on my platform is going to get worse because it’s using platform codec support on other OSes – and developers are going to assume it’s consistent across all Firefox versions; that’s no longer true.

  22. @Scooby: It looked like Jules was demoing ORBX.js in Firefox stable (21), but I will check. asm.js’s OdinMonkey back end is in Aurora, as noted here.

    The great thing about this approach is that it works in other browsers, just at some speed/smoothness hit. They’ll level up soon, I’m pretty sure.

    @Oliver: the demos included one on an iPhone 4S without WebGL enabled — ORBX.js still ran at 30fps 720p, if I recall correctly.

    The advantages of ORBX, as noted, include better i-frame coding and more parallelizable design. So yes, offloading to the GPU is critical. For Broadway.js, we offloaded just H.264 colorspace conversion last I heard, and doing more was hard based on both H.264’s design and WebGL’s state at the time.
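    For the curious, the colorspace-conversion stage is just per-pixel arithmetic; a fragment shader runs the same math for every pixel in parallel. Written out in plain JS for one pixel (BT.601 full-range coefficients shown; Broadway’s exact constants may differ):

```javascript
// Per-pixel YCbCr (BT.601 full-range) to RGB: the arithmetic a colorspace
// fragment shader performs, here in plain JS for a single pixel.
function yuvToRgb(y, u, v) {
  var d = u - 128, e = v - 128; // center the chroma samples
  var clamp = function (x) {
    return Math.max(0, Math.min(255, Math.round(x)));
  };
  return [
    clamp(y + 1.402 * e),                   // R
    clamp(y - 0.344136 * d - 0.714136 * e), // G
    clamp(y + 1.772 * d)                    // B
  ];
}
```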

    @John: please link me to your Google blog comments (appreciated), but they may not have as much effect there. I’m responding here because I think we need more realism with integrity in the world, and that means not expecting Mozilla to tilt at windmills that Google won’t approach.

    What is your platform? We are working on H.264 support cross-OS and will have more to say on that front pretty soon.

    /be

  23. “a downloadable HD codec written in JS and WebGL”

    So I need to give sites I want to see video on full access to WebGL, running on a GPU, possibly the most poorly-sandboxed class of hardware in existence?

    And the internet pipes of the future are going to be filled with millions of huge streams of individualized, non-multicastable video (all of course being real-time “critical”) which could all perfectly well be rendered locally to the user.

    Sometimes I wonder whether people actually think anymore before they invent things.

  24. Robert: WebGL is enabled by default in Chrome, Firefox, and Opera. I believe it’ll be on in IE11 as well. Maybe *you* need a better GPU? 😛

    I hope you’re not sock’ing for Apple here. Of all the browser vendors, they most control their hardware architecture and GPU supply. They could best make WebGL and GLSL as safe as any other part of the system that they currently enable.

    Anyway, GPUs have grown up in terms of context management and better security over time. Your views apply better to CPUs and at the paranoid limit argue against most of the Web as we know it. Paranoia aside, GPUs are part of the processing power of the client now. Genie, bottle: no going back.

    Another point: the Internet is *already* full of non-multicast, compressed streaming video. This is both a problem and an opportunity (see Opera’s acquisition of Skyfire).

    But ORBX.js has nothing to do with this existing video-congestion problem, and watermarking at the edges is doable. Edge “caches” became smart a long time ago.

    As for “perfectly well […] rendered locally”, that’s an option with ORBX.js too. Why wouldn’t it be? You imply that the novel OTOY advantage, watermarking, is an exclusive either/or, which magically disallows downloading. This does not make sense.

    Downloading does bring back the DRM bogeyman, though.

    > Sometimes I wonder whether people actually think anymore before they invent things.

    More than you put into your anonymous drive-by!

    /be

  25. @David: OpenCL feeds into WebCL in Khronos, but still faces safety issues pending better GPUs (and even then, if memory sharing with the CPU is promiscuous). Our approach is to uplift JS (River Trail, ParallelJS, SIMD intrinsics) and upgrade WebGL. Will we need GPGPU support in the form of a new/old (C99 offshoot) programming language? TBD.

    @Jerzy: I didn’t know about that other “Broadway”, thanks. The obvious difference here is that ORBX.js (for video, losslessly for models and data in general) plus GPUs in the cloud provide cross-OS and cross-browser reach.

    /be

  26. What happens when Microsoft accidentally ruins the performance to the point no one wants to use it?

  27. Brendan Eich

    As I understand it, JavaScript doesn’t support UDP. I was wondering what your thoughts were on that, since UDP would be a much faster way to push down video than TCP.

  28. It would make sense for Autodesk to leave Windows and use Linux, since it’s far better at server-based distribution and cloning applications.

    They could then offer the service of cloud-cad, or other cloud based manipulation of their 3D editor engines. They could also sell a server package that runs on a cloud in the business and allows that business to serve up 3D editing through terminals to its workers.

  29. The video streaming is not such an achievement, bar the compression and other efficiencies OTOY has gained; the next discussion is input from users to the server. How is that implemented? Is it based on standard JavaScript/browser events?

  30. John T.: Microsoft is not the spoiler any longer. They do not control enough of the browser market any more (far from it on mobile).

    Dan Dart: no source yet, working on it. It is not mine to release, but I hope OTOY sees the upside of open source development while also taking care to protect their business interests. I will do my best.

    Bob, E8HFFFF: everything’s based on JS, so the whole menu is available: HTTP via links, forms, XHR; WebSockets; and WebRTC provides data comm even via UDP. OTOY has used RUDP from their native code; we’re committed to making sure WebRTC can meet the realtime requirements.

    John Orban: You’re welcome!

    /be

  31. Is this a VM-only tech or can this be possible to implement on a consumer PC? Having a stream of my PC from any screen in the house would be awesome.

  32. Mike: PCs can do it, nothing is VM-only. The demos last Friday showed p2p streaming via an iPad, two laptops (one MBP, one AlienWare PC), and a Chrome Pixel.

    /be

  33. > WebGL is enabled by default in Chrome, Firefox, and Opera

    This is exactly what worries me.

    > Anyway, GPUs have grown up in terms of context management and better security over time.

    Not yet they haven’t. There are improvements coming, but they’re still hideously unsafe when it comes to random content from the web.

    Graphics drivers typically don’t even make use of the IOMMU, and that’s for people who even have systems with IOMMUs.

    “Security” in the GPU world is in its absolute infancy compared to that of CPUs.

    > the Internet is *already* full of non-multicast, compressed streaming video.

    Not to anywhere near the same degree. I also note you missed out “real time” there. My point is we don’t currently have everybody who casually decides to spend 3 solid hours of playing a game (which is very common) requiring (3 hours x HD video bandwidth) of data.

    These things just… worry me in general.

  34. Sorry folks, but I think this goes too far. We’re talking about video decoding here.

    The real future lies in video ENcoding. When the day comes you can encode videos in plain JavaScript to MP4 or WebM format, I will run naked outside.

    For months I have been trying to port ffmpeg / avconv to JavaScript via emscripten (see https://gist.github.com/binarykitchen/5329825), but I am struggling.

    Don’t let the media brainwash you. Video decoding isn’t everything. You also need encoding.

  35. Mind blown – Inception-esque.

    OT – ES6 and yield.

    A generator function is signified with a * – how about instead of the keyword ‘yield’ using:

    function* fibonacci() {
      let [prev, curr] = [0, 1];
      for (;;) {
        [prev, curr] = [curr, prev + curr];
        return* curr;
      }
    }

    And delegation: return** mygenerator;

    Seems symmetrical.

    This could leave the keyword ‘yield’ available for other purposes.
    Maybe something in the await/futures space e.g

    // runtime yields its ‘turn’ to myfunc – and waits for return.
    let x = yield myfunc(x,y);

    This approach inverts the meaning of yield as compared to Python – but I find it more mentally parseable than await.
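For reference, the same function with the syntax as actually drafted for ES6 (yield to produce a value; delegation is spelled yield*):

```javascript
// The fibonacci generator using the drafted ES6 keywords.
function* fibonacci() {
  let [prev, curr] = [0, 1];
  for (;;) {
    [prev, curr] = [curr, prev + curr];
    yield curr;
  }
}
```

Consumed via the iterator protocol: `fibonacci().next().value` yields 1, then 2, 3, 5, …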

  36. ES6 has classes. What’s the behaviour when:

    class MyClass { …. }

    var x = MyClass(); // no ‘new’

  37. “ORBX.js, a downloadable HD codec written in JS and WebGL.”

    Well, that’s great and all but, as ever, it doesn’t mean very much until there’s some functionally practical real world usage. Google is happy, for instance, about VP9 being nearly ready (http://blog.webmproject.org/2013/05/vp9-codec-nears-completion.html) and that YouTube intends to use it, but so what? YouTube isn’t even very good at HTML5 video yet, never mind which codecs it uses.

    Half the time when you attempt to play YouTube video in HTML5 (depending on the particular video and depending on if you watch it embedded or on YouTube, small frame or full screen), YouTube insists on playing the video in Flash. I imagine this is antisocially done for reasons of Business, but it serves to underscore the fact that even Big Advertising still fails to achieve universal content delivery on an open platform.

    If ORBX.js gets rolled out in a big way, I’ll applaud it but the short history of HTML5 Video has taught me Cynicism and taught me well.

  38. Robert: GPUs are behind CPUs for sure, hence the GLSL verifier or hand-verified and restricted GLSL hardcoded today. But genie isn’t going back in the bottle.

    Michael: working on encoder support. Needs WebGL upgrade.

    Kevin: ‘return’ means something other than ‘yield’ for a reason – no sale!

    Calling a class without ‘new’ calls its constructor. This should be in the latest drafts, but looks broken. I will ask @awbjs.

    /be

  39. > Calling a class without ‘new’ calls its constructor. This should be in the latest drafts, but looks broken. I will ask @awbjs.

    Hope so – I think it would be ideal if, for ES6 classes, Call == Constructor.

    class Point { … }
    let p = Point(0,0); // nice and succinct

  40. You mentioned “GPU cloud”. I’m curious, since it’s been a large hole in the cloud services space.

    Is there an existing GPU cloud that scales like AWS, Azure, etc., where apps that are very GPU-heavy (Autodesk, Cinema 4D, Adobe, etc.) can be virtualized?

  41. Thanks, but EC2 leverages Tesla cores, which, while GPUs, are designed more for parallel computing than for graphics or visualization programs (such as Autodesk, C4D, AE, etc.).

    Those programs, while leveraging Tesla cores for some tasks, generally require the Quadro or Kepler Quadro series GPU boards, which I have yet to see offered in the cloud.

    Any chance you’ve seen an offering for those?

    Thanks!

  42. I’m not sure if anyone does yet — largely, I think, because there hasn’t been a good way to virtualize access to them. Most people who are doing it are doing their own custom solutions (e.g. all the streaming-game-playing infrastructure, nvidia’s gpu cloud stuff, etc.). ORBX will hopefully help change that.

  43. Adam: Jules of OTOY shares “We got Crysis, World of Warcraft and tons of PC games running on Amazon’s CentOS GPU instances as far back as 2010 using just WINE and virtualGL on those 2010 Teslas. Newer Teslas – such as the K2’s on our rack, do support full GL/DX graphics without any such hacks.”

    And as Vlad says, more news coming. I’ll blog about it as soon as I can.

    /be

Comments are closed.