On 4/16/2011 16:59, Patrick Elliott wrote:
> No, I want the damn thing to bloody run.
Here's what I think is happening.
Game designers like to target the consoles, not because they're particularly
good, but because they're only broken in one way.
Given that some 85% of BSODs come from video drivers crashing, it's clear that
nobody really builds video drivers that work 100%. OK, so maybe 98% of your
PC's functionality works perfectly. Maybe 95% of the xbox's functionality
works perfectly.
So you write the game, you test it out on the xbox, and you don't do the
stuff that doesn't work on the xbox. 5% of your rendering code consists of
working around bugs in the xbox hardware and software.
Now you have to port it to the PC. But not just Patrick's PC. Everyone's PC.
Maybe 20 different graphics card/chip/OS version combinations, each of
which has only 2% of the stuff broken instead of 5%. So you wind up
with 95% of the stuff you want to do on the xbox working, and 60% of what
you want to do on the PC working. Sure, you can compensate on the PC for
each broken thing, working around each bug, rewriting that code 20 times,
testing against the versions you can get your hands on. Or you can release a
game that people will buy anyway and just turn off any optimization that
doesn't work on every card.
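The back-of-envelope math here is just compounding: if each of 20 configurations independently breaks its own 2% slice of features, the fraction you can rely on everywhere is roughly 0.98^20, about two-thirds (the figures above are the post's illustrative numbers, not measurements). A quick sketch:

```python
# Rough model of the argument above. The percentages (95% for the one
# console target, 98% per PC configuration, 20 configurations) are the
# post's hypothetical numbers, and the independence assumption is a
# simplification -- real driver bugs overlap.

def fraction_usable(per_config_working: float, n_configs: int) -> float:
    """Fraction of features working on ALL targets, assuming each
    configuration breaks an independent slice of features."""
    return per_config_working ** n_configs

xbox = fraction_usable(0.95, 1)    # one console target
pc = fraction_usable(0.98, 20)     # twenty PC targets, ~0.67

print(f"xbox: {xbox:.0%}, pc: {pc:.0%}")
```

So even though any single PC setup is *less* broken than the console, targeting all of them at once leaves you with a smaller common subset.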
Remember that every game that says "make sure you have the latest video
drivers" is really saying "we're relying on something working that was
broken in the first N official versions of the driver software."
It's equivalent to saying "wow, google released this web site that works
great under Chrome, but it looks awful under IE6. How unprofessional is
that?" And that's exactly where we'd be all the time if people updated
their web browsers as often as they replaced their video cards.
--
Darren New, San Diego CA, USA (PST)
"Coding without comments is like
driving without turn signals."