Monday, March 30, 2009

OnLive (your e-mail)

OnLive was the #1 topic, by far, in my e-mail last week, and there were several general themes running through your thoughts, so let's take a look.

First off, since technical details were hard to come by in most of the articles I read, Skip Key sent me a link to an EETimes article that provided more information:
A novel non-linear compression technology accelerated by massively multicore ASICs is at the heart of the OnLive service. Each of the company's servers includes a proprietary 100-plus core encoder.

...The encoder ASIC does not use an array of homogeneous cores. Instead it has a wide variety of dedicated blocks, each handling a small piece of the compression algorithm.

Skip's comments:
It appears that what they're doing is outputting the video through a huge array of custom processors to do the compression. The codec used is probably very gaming specific. If that's the case, I don't see any reason the technology can't work as described. It's not really cloud computing, but more of a traditional thin-client server architecture. It is, however, going to be very expensive to run one of these servers. Think quarter-million plus per box, and who knows how many each box will support?

If OnLive is a thin-client architecture rather than a real cloud computing model, that's an important difference, because if I'm understanding the details correctly, it makes scalability more difficult. With proprietary encoder hardware in every box, it doesn't sound like OnLive can temporarily rent extra capacity the way they could in a true cloud computing setup.
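To put Skip's cost point in perspective, here's a hypothetical back-of-envelope calculation (in Python, just for concreteness). The $250,000 server price is Skip's guess, and the players-per-box figures are pure assumptions on my part, since OnLive hasn't published any capacity numbers:

    # Hypothetical back-of-envelope: hardware cost per concurrent player.
    # The $250k figure is Skip's guess; the players-per-box values are
    # assumptions, since OnLive hasn't published capacity numbers.
    SERVER_COST_DOLLARS = 250_000

    for players_per_box in (10, 50, 100):
        cost_per_player = SERVER_COST_DOLLARS / players_per_box
        print(f"{players_per_box:>3} players/box -> ${cost_per_player:,.0f} of hardware each")

Even at 100 simultaneous players per box, that's $2,500 of hardware sitting behind each player, before you count bandwidth or the datacenter itself -- and unlike a true cloud setup, none of it can be rented out when demand drops.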

Now we go from straight tech information to some marketing buzz:
The distributed compression technique uses Internet Protocol multicasting so that as many as a million viewers could watch a single game in progress at full resolution, Perlman said. It can deliver content without perceptible latency for as far as 1,000 kilometers over copper or 1,500 miles on optical nets, he added.

"We got rid of a whole bunch of layers in the software stack needed for a more generalized wireless architecture, but not for a controller," said Pearlman. "We came up with a structure where by the nature of the transmissions, controllers don't interfere so we get low latency," he added.

I think this is the point where stuff starts to smell. For one, I'm not sure what constitutes the threshold of "perceptible latency," but phrases like "we came up with a structure where by the nature of the transmissions, controllers don't interfere so we get low latency" really don't make any sense. It's not someone describing how their perpetual motion machine works, but it's got that kind of vagueness.
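For what it's worth, physics alone puts a floor under that 1,000-kilometer claim. Signals in fiber travel at roughly two-thirds the speed of light (a standard rule of thumb), which works out like this:

    # Rough floor on latency from propagation delay alone.
    # Light in fiber covers roughly 200 km per millisecond (about 2/3 of c).
    KM_PER_MS_IN_FIBER = 200

    distance_km = 1000  # OnLive's claimed "no perceptible latency" radius
    one_way_ms = distance_km / KM_PER_MS_IN_FIBER
    round_trip_ms = 2 * one_way_ms

    print(f"One-way propagation: {one_way_ms:.0f}ms")    # 5ms
    print(f"Round trip:          {round_trip_ms:.0f}ms")  # 10ms

A 10ms round trip is just the physics floor; routing, queuing, encoding, and decoding all stack on top of it, which is where the numbers below come in.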

Here's what Ian Hardingham (Mode 7 Games) had to say about latency:
Existing games use client-side prediction to reduce lag. In online games like first person shooters and so on, you don’t have input lag for movement or mouse movement because it is all done on your computer first. You only have input lag for shooting - and anyone who’s played an online FPS knows how incredibly jarring that is. In Counter Strike, you do not even have shooting lag - it’s all done client side straight away and the game uses some clever maths to sort it out.

None of this is possible on the OnLive light client. You will be dealing with the full ping + processing lag for every action.

...I don’t think you can expect an average ping of less than 40ms. And I’m being VERY generous there - that requires that they have a HUGE number of server farms. In that case, you’re looking at an input lag of 50ms. I’m in the process of mocking up a demo of something with this kind of input lag, but I can tell you it’s not going to be nice to play with.

In an online game, when the ping spikes a little, you often don’t notice it because your computer is doing all the prediction. With OnLive, if you have a spike in the middle of a Mario level, everything will stutter and you’ll mistime the jump and die.
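Ian's point about client-side prediction is the crux, so here's a minimal sketch of the technique (the names and structure are mine, purely for illustration): a traditional client applies your input to its local copy of the game state immediately, then reconciles when the server's authoritative state arrives. A video-streaming client can't do any of this, because it has no local game state at all -- only incoming frames.

    # Minimal sketch of client-side prediction (illustrative, not OnLive's code).
    class PredictingClient:
        def __init__(self):
            self.x = 0.0        # local copy of the player's position
            self.pending = []   # inputs sent to the server, not yet acknowledged

        def on_input(self, seq, move):
            self.x += move                    # applied instantly: no perceived lag
            self.pending.append((seq, move))  # remember it; also send to the server

        def on_server_state(self, server_x, acked_seq):
            # The server's state is authoritative but old. Rewind to it,
            # then replay every input the server hasn't processed yet.
            self.x = server_x
            self.pending = [(s, m) for s, m in self.pending if s > acked_seq]
            for _, move in self.pending:
                self.x += move

    class StreamingClient:
        def on_input(self, seq, move):
            # No local state to predict with: just send the input and wait
            # for new video frames. Every action pays the full round trip
            # plus encode, decode, and display time before anything happens.
            pass

Prediction hides the ping for movement entirely, which is exactly what a pure video stream gives up.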

Richard Leadbetter wrote an excellent article for Eurogamer (thanks to Chad Mercer for the link) where he discusses some of these same issues:
Not only will these datacenters be handling the gameplay, they will also be encoding the video output of the machines in real time and piping it down over IP to you at 1.5Mbps (for SD) and 5Mbps (for HD). OnLive says you will be getting 60fps gameplay. First of all, bear in mind that YouTube's encoding farms take a long, long time to produce their current, offline 2Mbps 30fps HD video. OnLive is going to be doing it all in real-time via a PC plug-in card, at 5Mbps, and with surround sound too.

...OnLive overlord Steve Perlman has said that the latency introduced by the encoder is 1ms. Think about that; he's saying that the OnLive encoder runs at 1000fps. It's one of the most astonishing claims I've ever heard. It's like Ford saying that the new Fiesta's cruising speed is in excess of the speed of sound. To give some idea of the kind of leap OnLive reckons it is delivering, I consulted one of the world's leading specialists in high-end video encoding, and his response to OnLive's claims included such gems as "Bulls***" and "Hahahahaha!" along with a more measured, "I have the feeling that somebody is not telling the entire story here." This is a man whose know-how has helped YouTube make the jump to HD, and whose software is used in video compression applications around the world.

Ouch.
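The arithmetic behind that incredulity is straightforward:

    # What the claimed 1ms encoder latency actually implies.
    MS_PER_SECOND = 1000

    claimed_encode_ms = 1  # Perlman's stated per-frame encoder latency
    implied_fps = MS_PER_SECOND / claimed_encode_ms
    print(f"1ms per frame implies a {implied_fps:.0f}fps encoder")  # 1000fps

    target_fps = 60        # OnLive's promised gameplay framerate
    frame_budget_ms = MS_PER_SECOND / target_fps
    print(f"Real time at 60fps only needs a frame every {frame_budget_ms:.1f}ms")

In other words, the claim isn't just "fast enough for real time"; it's more than sixteen times faster than 60fps requires, on a problem that YouTube's offline farms grind away at for a long, long time.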

There seems to be a general consensus that OnLive simply can't output 720p at an acceptable framerate and image quality. That's before any consideration of lag is included.

Plus, several of you raised this question: if this video compression technology is so incredibly good, why is it being monetized in this manner? A clearly superior compression technology could make far more money through straightforward licensing, instead of being used to chase the gaming tech equivalent of "cold fusion" (as Victor Godinez mentioned via e-mail).

Okay, so let's say that HD at a high and consistent framerate isn't technically possible right now. Standard definition is another possibility, since it requires much lower bandwidth. For OnLive, it seems ideal, but as a consumer, I'm not interested in anything less than HD. If I subscribed and standard-def was fine, but HD was jerky, I'd cancel in the first month. So the business model might be more workable in SD, but the appeal to anyone who is already gaming in HD would be negligible.

Another popular topic was the effect that a move to an OnLive model would have on the modding community. Would the OnLive version of a game support mods? I don't believe anyone asked that question last week, so it wasn't addressed, but I certainly think that fewer people would be making mods if OnLive becomes popular, just because far fewer people would have actual copies of the game (they'd essentially be renting a video stream of the game as they play). It could also potentially reduce the desire of publishers to support the modding community, because if 75% of unit sales can't even support mods, why support them at all? That's an open question going forward, even if it's somewhat under the radar for now.

When I mentioned that ISPs were going to go to war over bandwidth, Shane reminded me that some countries have already done so:
In Australia...each plan from each ISP has a speed associated with it AND a download quota (the link takes you to iiNet's service offerings). The speed is usually a function of the technology used (ADSL, ADSL2+, cable, wireless, etc.) - I am sure you are familiar with this from the US. The difference is the quota - this can range from as little as 500MB of downloads to huge quotas like 130GB per month.
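It's worth running the numbers on what those quotas mean for a streamed game. At OnLive's claimed 5Mbps for HD:

    # What a monthly download quota means for a 5Mbps HD game stream.
    BITS_PER_BYTE = 8
    SECONDS_PER_HOUR = 3600

    stream_mbps = 5  # OnLive's claimed HD bitrate
    gb_per_hour = stream_mbps / BITS_PER_BYTE * SECONDS_PER_HOUR / 1000  # ~2.25

    for quota_gb in (25, 130):
        hours = quota_gb / gb_per_hour
        print(f"{quota_gb}GB quota -> about {hours:.0f} hours of HD play per month")

Even the big 130GB quota buys something like 58 hours of HD play a month before anything else gets downloaded at all; on the small plans, OnLive would be essentially unusable.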

Like I said, this was a very popular e-mail topic last week, and it was mainly about the topics we've already discussed. Tarn Adams, though, raised a concern that was entirely unique:
For me, there's the point about how ephemeral video games are compared to, say, movies, books, and music (even concerts can be recorded, even if that's not "the same"), and how that affects the advancement of video game design as a skill.

Take Seven Cities of Gold, for example. It didn't spawn any imitators, as far as I can tell, at least not in the ways I care about. They made a gold version like ten years later, but it didn't spawn any imitators either. Starflight too. Both of those are now hard to come by. Elite and Civ and so on were more popular, mainly for "space trading" or warfare, and there are plenty of imitators for both of those, but expedition games are effectively dead, before modern systems got a chance to fully realize their potential. I'm not sure this would have happened if they were still as easy to play as it is to pick up a book from the 80s or listen to a song from the 80s. If it happens again, it's more likely that it would be reinvented rather than arrived at through the regular progression of ideas, which is a waste.

The thing is, with something like OnLive, if the company goes under, there won't be physical copies in the hands of every fan, with some enterprising enough to archive and distribute them (illegally or not isn't really germane here). Games that aren't popular (rather than aren't good) might disappear entirely, and everybody would eventually suffer from it. Even under the current system, there are a few arcade games I remember looking around for that are impossible to find, online or in arcades, and those games are gone, aside from a few screenshots, along with most everything they'd have contributed to the people who might have played them. It seems like this could be even worse.

As with anything, I think it's important for a game designer to play a lot of games, and this would make it very hard to have anything like "classics", unless the classic in question were popular enough to justify reissuing it or hawking sequels.

I don't mean to take things to their extreme conclusion, and there'd likely be tie-ins through services like GameTap (or just within OnLive itself) to keep most games going, but as competitors arise and companies fail, if the publisher is already gone and one of the big distribution services flops, there's no guarantee that a bunch of their titles, especially any exclusives, wouldn't die as well, permanently.

As the servers are upgraded over time, it might become a hassle to upgrade the old games as well -- this is currently a problem, of course, but with all of the old games stuck off in a vault someplace, the current solutions to this problem wouldn't apply without some work from the publishers/distributors, who might not be willing if there's no significant return.

Dwarf Fortress-wise, I just hope people that play games on their PCs will still have a reason to have good video cards. People with PCs geared toward OnLive wouldn't be able to play my game if they stumble across it (it takes a decent video card to handle even my display as they continue to deprecate old graphics functions), or many of the other independent games that can't afford to buy in to this sort of distribution system. I don't see this as a huge problem -- people will likely still have video cards good enough to play things that aren't strictly cutting edge graphics-wise -- but it seems like a bit of a blow to hobbyists and independents.

Generally, I think fewer designers and less preservation means more crap in the long term.

That's so good and so interesting that I'll just be quiet now.
