Nextgen Video Card Thread
Posted: Thu Apr 22, 2004 2:55 pm
by R3C
Anyone else thinking of going with Nvidia this cycle? I still need to see what the final specs are for the new ATI card, but the 6800Ultra is looking mighty good right now. Supports PS3.0 too. Not sure if the ATI card will do that or not, but I can't imagine them not doing it and falling behind. We'll see....
Posted: Thu Apr 22, 2004 5:23 pm
by mistasparkle*
Tough call... both will most likely be much more than any games will utilize for a while, but I wouldn't sleep on ATI. They've delivered solidly on top for 2 generations now, so if you want the absolute best, I think a wait and see on ATI's next move is a good idea.
...but yea, the 6800Ultra looks really good.
Posted: Thu Apr 22, 2004 6:23 pm
by enderzero
I need to wait until I have a job that allows me to spend the money, or better yet "test at home" a new $500 video card. I have barely gotten to use my 9800 yet... but I am looking forward to it!!
I'm...
Posted: Thu Apr 22, 2004 8:37 pm
by R3C
... a little concerned because all the rumors, and semi-"leaked" info, point to the ATI card only supporting PS2.0. Also, only the top end card looks like it is going to have a 16 pixel pipeline. Of course I'm going to wait and see how it all turns out with final specs, but if all these things turn out to be true, I suppose I'm going with Nvidia. (Haven't had an Nvidia card for quite a while.) I'm a little worried about the 2D quality and the color precision on an NV card, but I'm someone who likes effects a lot, so the PS3.0 may outweigh that. (Even if there's only a game or three that supports it.) Oh well, they'll both be out and all over the place before I can afford a $500 video card anyway.
Posted: Fri Apr 23, 2004 3:23 am
by enderzero
How many games are actually out right now that fully support PS 2.0 features?
There are...
Posted: Fri Apr 23, 2004 8:36 am
by R3C
... quite a few that use 2.0. If you patch Far Cry, it supports PS3.0. So there's already an incredibly good game for it.
So...
Posted: Mon Apr 26, 2004 9:54 am
by R3C
... I've done some reading on SM3.0 vs SM2.0. It looks like the only really cool thing about SM3.0 is that displacement mapping can be done. While this is cool, and something I'll be quite interested in, I don't think it will be useful right now. Epic says that they will have SM3.0 support in Unreal Engine 3.0. (Which is at least a year and a half off.) By then all kinds of new cards will have been released. The rest of the enhancements in SM3.0 are all speed and efficiency related. (Good, but nothing amazing.) So really there isn't much reason to get the 6800Ultra over the next ATI card. I'm sticking with ATI. Frickin' Nvidia marketing. Got me all excited before any real-world examples emerged.
Wow...
Posted: Mon Apr 26, 2004 10:32 am
by R3C
... the NV situation is a little more frigged up than I had originally thought. The GeForce 6800 Ultra doesn't look like a bad card by any means. (In fact it looks really good), but it looks like NV has been misinforming people about its capabilities. Pretty standard for them, but I figured they'd want to behave after the 5800 debacle. Anyway, the screenshots that got me all excited were supposedly the difference between shader model 2.0 and shader model 3.0 in Far Cry. Well, first of all it wasn't Far Cry, it was a Cry-Engine tech demo. Second, the shots were actually comparing SM3.0 to SM1.1. 1.1!!!!!!!!! Which is DirectX 8 if I'm not mistaken. Frixing Firxors!!! And the SM3.0 images weren't even using the 3.0 features, they were a mix of 2.0 and 3.0 type operations. Anyway, the articles and newsbits are on
http://www.hardocp.com Quite a few sites are reporting this. The card itself is not bad, but there is no reason at all to get it over the new ATI card it would seem. Unless ATI screws up badly. (Unlikely.)
Posted: Tue Apr 27, 2004 8:55 pm
by enderzero
I have been doing quite a bit of reading on this and other hardware issues of late, including the HardOCP article. First of all, those screen shots (be they 2.0 or 3.0) got me excited to get my hands on my 9800. Regardless of DX9 advanced features, my PC is going to eat up Far Cry. And with Doom 3 and HL2 right around the corner, I am really set.
It is pretty obvious though that the only real advantage in the near future to SM3.0 is Displacement Mapping. DM is in fact quite cool, but my point is that by the time games start really taking advantage of it ATI will have likely caught up and be producing SM3.0 cards.
But my opinion has shifted a bit. I have to admit to being a long time nVidia fan. I still find it hard to believe they are so relegated to 2nd place these days. And it is even harder for me to believe that the company that did it to them was ATI. What's next... the revival of Matrox? How about Rendition? But anyway, it seems to me the thing that really did nVidia in was the Xbox. They focused so much attention on that project that their consumer products slipped, and ATI just snuck up from behind and grabbed the reins. Then with some really poor decision making on how to bring the NV30 to market (deep deep pipelines and hugely oversized refrigeration system pcbs) nVidia really faltered.
But this isn't necessarily 3Dfx all over again. If nVidia is going to capitalize and try to regain its position on top they should be looking at the opportunity opening up over the next few months.
ATI's R420 can pretty much be considered an inferior product to nVidia's NV40. The R420, from what I understand, is basically just the R360 (9800XT) with a smaller 0.13 process. The budget 9600 already uses a 0.13 process. So basically we get a faster, cooler running 9800. Fine, that video card is great. But I think few people already running 9800s (or 9600s) are going to think the jump to R420 is really necessary. There is a pretty big market for the hobbyist upgrader that does buy a new card every 9-12 months. If I had just shelled out $400 9 months ago and I was looking to upgrade again, I might be pretty tempted by the NV40's SM3.0 support, 32-bit floating-point pipelines (compared to the R420's 24-bit) and 600MHz+ GDDR3 memory.
What nVidia needs to do is stop acting like they are the market leader and play a little catch up. Look at AMD. They have always been the underdog, and undercutting Intel with price and superior technology (sometimes) has allowed them to get a big chunk of the market, especially among enthusiasts. nVidia is still walking around like they are the true heir to the throne while the only people buying their cards are the people that haven't bothered to realize that there is something a whole lot better out there. So give it up. Admit your mistakes from the past, and try to start something new with the next generation. This negative press around the SM3.0 specs is completely the wrong way to do it. It just makes the company look dishonest and arrogant in front of the very audience they most need to win back.
There are some great games slated to drop soon. People are excited and looking to upgrade. If nVidia can drop the price a bit, admit that the double sized jet engine card was a stupid mistake, and tell people about the advantages of the NV40 plain and simple, they might be able to remind us why we used to love the company so much. Plus, isn't ATI working on the Xbox2?
My...
Posted: Tue Apr 27, 2004 10:09 pm
by R3C
3DFX comments are not because of the situation. It's because Nvidia bought 3DFX's IP and engineers. Their work went into the GeForceFX (hence the FX) and that, I believe, is why it sucked so bad. They have indeed been infected with a nasty strain of 3DFXinosis of the eye. The unfounded arrogance is sweeping quietly through the company as we speak.
Ok, so there was a lot more wrong with NV30 than just that, but that did play a big portion...
They dropped FX from the name for this product cycle
Posted: Tue Apr 27, 2004 11:46 pm
by enderzero
I didn't mean to imply that you were comparing the two... but the similarities are rather abundant. It will be very interesting to see if nVidia can turn their long slide around the way 3Dfx couldn't. If I remember correctly, though, there were many other issues that came into play, such as the dismal performance of the Voodoo 3, the inability to make the transition from 3D add-in to standalone 2D/3D cards, their reluctance to give up on Glide, and the merger with STB.
Ahhh, I still fondly remember my first nVidia card, the
STB Velocity 4400. Thanks Sharky. I ditched my 3Dfx card and never looked back. Long live the
TNT!!
Well...
Posted: Wed Apr 28, 2004 8:25 am
by R3C
... those things played a part, but what really killed them was doing their own fabrication. They kicked out all the OEMs who did their card making, bought STB, and opened a new plant in Mexico. (Or something like that.) All the capital they wasted switching to that format, and what they lost from the OEMs, killed them. Nvidia won't die. The GeForce 6800 Ultra is actually a very good card. I would use one. It's fast, and they've done a lot to fix their image quality. They just shouldn't be lying to people. That gets the hardware sites all worked up, starts a war, and then nobody wants the product. The 6800 can stand on its own, so I don't know why they would skew images and results that way. The ATI card will still be better and won't take two power connectors from what I can tell.
(For this product cycle anyway.) Still need to see final specs and benchmark results though before the final decision.
Posted: Wed Apr 28, 2004 11:51 am
by McNevin
I had the best of both worlds, a voodoo and a riva128zx, at the same time.
Muhuhuhuauahah.
Then I sold it to Mike or Matt, can't remember.
Good old Diamond Viper V330.
Funny thing is I never had a TNT2. Went back to jerks Voodoo3
The Voodoo3...
Posted: Wed Apr 28, 2004 12:12 pm
by R3C
... was actually a really nice card for a short time. It was super fast, and there were still a few games supporting Glide, which made it even better. Then the TNT2 Ultra came out.
Posted: Wed Apr 28, 2004 12:46 pm
by McNevin
It did me just fine. It played a mean Unreal Tourney, which for some reason hauled ass on that card. It just so happened I was really into that game at the time, which made for good times for me.
I remember you even regretting buying that game for me, because I wouldn't play anything else.
Posted: Wed Apr 28, 2004 2:02 pm
by spidermonkey
Anyone thinking of getting the Mad Dog Predator Blastwave?
No...
Posted: Wed Apr 28, 2004 2:20 pm
by R3C
... I was thinking of getting a Mean Jerk Killer Explosion 7.1 card though
Or maybe:
Grizzly Bear Kill-You-In-Your-Sleep Apocalypse Blaster XL
Posted: Wed Apr 28, 2004 8:43 pm
by enderzero
I remember when you got the Riva128 quite well kev. What was the racing game it came with? Not moto racer was it? Man we played moto racer a lot. I would play that now. I had that game down. My Canopus 6MB super Voodoo 1 kicked serious ass though. Except I was still using it when everyone else had voodoo 2s. You still running the SLI Voodoo 2s beeeph?
Maximum PC Benchmarks
Posted: Wed May 19, 2004 11:03 am
by enderzero
I pulled these benchmark results out of the latest Maximum PC testing reference models of the new GeForce 6800 Ultra (NV40) and the Radeon X800 XT (R420).
Card Specs:
<table width="500" border="1" cellspacing="0" cellpadding="5"><tr><td></td><td><strong>GeForce 6800 Ultra</strong></td><td><strong>Radeon X800 XT</strong></td></tr> <tr><td><strong>Core Clock</strong></td><td>400 MHz</td><td>500MHz</td></tr><tr><td><strong>Memory Type</strong></td><td>DDR3</td><td>GDDR3</td></tr><tr><td><strong>Memory Clock</strong></td><td>550MHz</td><td>500MHz</td></tr><tr><td><strong>Number of Transistors</strong></td><td>220 million</td><td>170 million</td> </tr><tr><td><strong>Number of Pipelines</strong></td><td>16</td> <td>16</td></tr></table>
The test system is a nForce3 150 with a FX-51 CPU and 1GB of RAM.
Benchmarks:
<table width="650" border="1" cellspacing="0" cellpadding="5"><tr><td></td><td><strong>GeForce 6800 Ultra</strong></td><td><strong>Radeon X800 XT</strong></td><td><strong>Radeon 9800 XT</strong></td></tr><tr><td><strong>Halo v1.02 (fps)</strong></td><td>35.03 *</td><td><strong>59.75</strong></td><td>28.1</td></tr><tr><td><strong>Far Cry v1.1 (fps)</strong></td><td>62.0</td><td><strong>65.8</strong></td><td>53.0</td></tr><tr><td><strong>UT2003 Flyby (fps)</strong></td><td>261</td><td><strong>275.5</strong></td><td>127.3</td></tr><tr><td><strong>Aquamark3</strong></td><td><strong>63,536</strong></td><td>63,487</td><td>45,857</td></tr><tr><td><strong>3DMark2003 Game 2</strong></td><td><strong>95.0</strong></td><td>88.5</td><td>45.0</td></tr><tr><td><strong>3DMark2003 Game 4</strong></td><td>62.8</td><td><strong>71.2</strong></td><td>37.4</td></tr><tr><td><strong>3DMark Pixel Shader 2.0 test</strong></td><td><strong>156.4</strong></td><td>121</td><td>57.2</td></tr><tr><td><strong>3DMark 2003 Overall</strong></td><td><strong>11,833</strong></td><td>11,437</td><td>6563</td></tr></table>
*These results are Halo specific and will likely be remedied before the final drivers.
Note: if these tables show up as code make sure "display HTML" is checked in your profile.
So these are some pretty interesting results. The X800 took all the real world tests and the 6800 took most of the synthetic tests. Regardless, the results were pretty close between the two. It looks like from these results, either card would make a good addition to a smokin' system. But the really amazing thing here (and the real reason I posted this) is look at how badly the 9800XT got its ass handed to it. This is the absolute fastest card on the market right now, and it got utterly destroyed by the new generation. Look at that 3DMark overall score! This is pretty damn impressive. I would say that if you are interested in running the top of the line games at their best, this is possibly an upgrade you cannot go without.
Posted: Wed May 19, 2004 11:25 am
by R3C
It was previously the fastest card.
The X800 Pro is widely available now. [TEE HEE HEE]
I'm picking up the x800 XT as soon as it ships. Doom 3 is out soon, and I need the best card to run it.
I agree though, the 6800 and x800XT are both really good cards. The reasons I wouldn't go NV, though, are that it's a dual slot card, it requires more power (though not as much as they originally specified), and the overall image quality and 2D image stability are still better on the ATI cards. Anyone who is a die-hard NV fan, though, at least won't get totally left out this time around. Oh, and the ATI card is probably a lot quieter.
I almost think the Pro version would be good enough to run things like Doom 3 and HL2 on though, so I may just get one of those and save some money. It may be a better idea to buy the top of the line card when ATI releases a completely new architecture. Much like the x800XT is twice as fast as the 9800XT, the Pro is probably about twice as fast as my 9800 Pro.
Posted: Wed May 19, 2004 11:30 am
by R3C
After peeking at some benchmarks, and some consideration, I think I'm going to go with the X800 Pro. It looks to be twice as fast as the 9800 Pro. I don't think there's much of a reason to spend $500+ at this time. So, McNevin, I'll be ready to do the transaction on June 11th. (That will be the day I buy my card.)
Posted: Wed May 19, 2004 11:34 am
by McNevin
Ok... I will be ready!
I been ready!
Posted: Wed May 19, 2004 12:42 pm
by R3C
You done already been ready!!!
Well, I gotta git me up some munnnnny, then I can git me up wuna those x800Pros. We'll enjoy us up some ATI!!
Posted: Wed May 19, 2004 12:52 pm
by spidermonkey
I reckon you would too.
Posted: Wed May 19, 2004 12:53 pm
by McNevin
Tarnation! I can hardly wait... yee haa!
Posted: Wed May 19, 2004 1:52 pm
by R3C
Commence the jigglin'
This seems...
Posted: Wed May 19, 2004 3:21 pm
by Bill Drayton Jr.
so familiar... trying to determine what is the best from one card to the next, even though the difference in performance is negligible... Either card is going to be good for games since they both represent each vendor's most current offering, but what about 2D, where we spend most of our time? Are the new NVidia cards still blurry?
Posted: Wed May 19, 2004 3:45 pm
by spidermonkey
I think that's caused by your drunken vision...
...
Posted: Wed May 19, 2004 11:54 pm
by Bill Drayton Jr.
you've got the wrong l2icks0r! buddy! besides he would probably think it looks good...
Posted: Sat Jun 19, 2004 3:11 pm
by McNevin
So what's everyone's thoughts on PCI Express?
I was reading an interesting
article about AGP and PCI Express. They touched on something that I found very interesting: AGP was designed to widen the bus between the video card and the CPU, but with huge amounts of memory on the card, and the advent of hardware T&L, is AGP still useful?