Oh Bart...
Posted: Fri Apr 23, 2004 3:40 pm
Bart Kijanka – CTO, Gas Powered Games
GD: A lot of attention has been paid to NVIDIA's support of Pixel Shader 3.0 - can you specifically think of anything PS 3.0 can be used for that can't be done in PS 2.0?
Bart: Currently it appears the impact of 3.0 on developers will be minimal. Therefore it's unlikely the consumer will see significant benefits from 3.0 for quite some time, especially since the improvement of 3.0 over 2.0 isn't as great as 2.0 over 1.0.
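For readers wondering what 3.0 actually adds: the headline Shader Model 3.0 features over 2.0 are per-pixel dynamic flow control, much higher instruction-count limits, and vertex texture fetch. The branching difference can be sketched roughly in C (a conceptual analogy only, not real shader code; the two path functions below are hypothetical stand-ins for a cheap and an expensive shading path):

    #include <math.h>

    /* Hypothetical stand-ins for two shading paths of very different cost. */
    static float cheap_path(float x)     { return x * 0.5f; }
    static float expensive_path(float x) { return sinf(x) * sinf(x) + 0.25f; }

    /* PS 2.0 style: no dynamic branch, so both sides are evaluated
       and one result is selected afterwards. */
    float shade_ps20(float x, int in_shadow)
    {
        float lit    = expensive_path(x);   /* always paid for */
        float shadow = cheap_path(x);       /* always paid for */
        return in_shadow ? shadow : lit;
    }

    /* PS 3.0 style: a real per-pixel branch - the untaken path is skipped. */
    float shade_ps30(float x, int in_shadow)
    {
        if (in_shadow)
            return cheap_path(x);
        return expensive_path(x);
    }

Whether that saving matters in practice depends on how often the expensive path is actually skipped, which is part of why opinions like the one above were cautious at the time.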
GD: Is there ever a need to use 32-bit precision over 24-bit? I'm looking for an example where 32-bit precision shows an obvious superiority over 24-bit - both are considered "Full precision" by the current DX9 spec.
Bart: We don't have an identified need for 32-bit precision at this time.
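For a concrete feel for the gap the question is asking about: fp24 carries a 16-bit mantissa against fp32's 23 bits, so at large magnitudes its smallest representable step is much coarser. A minimal C sketch (to_fp24 is a hypothetical helper that only truncates the mantissa; it is an illustration of mantissa resolution, not a model of real hardware):

    #include <stdio.h>
    #include <stdint.h>
    #include <string.h>

    /* Approximate an fp24 value (1 sign, 7 exponent, 16 mantissa bits)
       by truncating a 32-bit float's 23-bit mantissa to 16 bits. */
    static float to_fp24(float f)
    {
        uint32_t bits;
        memcpy(&bits, &f, sizeof bits);
        bits &= 0xFFFFFF80u;            /* clear the low 7 mantissa bits */
        memcpy(&f, &bits, sizeof f);
        return f;
    }

    int main(void)
    {
        float big   = 10000.0f;   /* e.g. a large world-space coordinate */
        float small = 0.05f;      /* a fine offset added in the shader */

        float r32 = big + small;                     /* fp32 keeps the offset */
        float r24 = to_fp24(to_fp24(big) + small);   /* fp24 step here is 0.125 */

        printf("fp32: %.4f\n", r32);   /* ~10000.0500 */
        printf("fp24: %.4f\n", r24);   /* 10000.0000 - the offset vanishes */
        return 0;
    }

Cases like this only bite when shaders work with large value ranges, which is consistent with a developer in 2004 not yet having an identified need for fp32.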
GD: As a developer, how much are you influenced, if at all, by the likes of NVIDIA's "The Way It's Meant to Be Played" or ATI's "Get in the Game" programs?
Bart: We like to ensure our games work on the majority of video hardware regardless. The card vendor certification initiatives seem to affect marketing more than development.
GD: How far away do you think we are from PS 3.0 and 32-bit precision support becoming a "must have" feature in graphics cards?
Bart: Several years. I expect it won't be a "must have" until 3.0 shader support is present in 80% or more of the installed user base.
GD: As a gamer, would it be more important for you to buy a video card with the utmost in performance that supports 24-bit precision and PS 2.0, or one slightly behind in performance that supports PS 3.0 and 32-bit precision?
Bart: Highest performance wins.