
I need a video card, what should I get?

blizz

New member
Funny thing is you guys bitch so much about the drivers but don't take into account how difficult it is: you've got to code so that crappy games still work with your new revision, plus you've got to make sure you support the standards for those "proper" games which do things sensibly.

of course they're never going to be 100%
 

fivefeet8

-= Clark Kent -X- =-
Tagrineth said:
They've only gotten better in terms of the Cg compiler.

Nothing more.

Their trilinear filtering hacks still exist, btw.

I could show you two near-identical shots, one using full trilinear and one with the pseudo bi/tri filtering, and you probably wouldn't be able to tell the difference. Most people aren't going to be able to. Even most hardware review sites that recently did IQ comparisons say the difference is so marginal it becomes subjective.

Of course, I do think nvidia should at least provide users the option to use full trilinear. But I think Nvidia's current stance on this issue is that most users aren't going to see a difference. Full trilinear filtering works fine in OGL games, btw.
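For what it's worth, the mechanics are simple to describe: full trilinear blends between the two nearest mipmaps across the whole transition, while the pseudo bi/tri mode does plain bilinear most of the way and only blends in a narrow band around each mip boundary. A toy sketch in Python (my own illustration of the idea, not anything from NVIDIA's driver):

# Toy model of full trilinear vs. "brilinear" mip blending.
# lod_frac = fractional distance between mip level N and N+1 (0..1).

def trilinear_weight(lod_frac):
    # Full trilinear: blend across the entire transition.
    return lod_frac

def brilinear_weight(lod_frac, band=0.2):
    # Pseudo bi/tri: pure bilinear (weight snapped to 0 or 1) outside a
    # narrow blend band; the band width here is a made-up number.
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if lod_frac <= lo:
        return 0.0
    if lod_frac >= hi:
        return 1.0
    return (lod_frac - lo) / (hi - lo)

for f in (0.0, 0.25, 0.4, 0.5, 0.6, 0.75, 1.0):
    print(f"{f:.2f}  tri={trilinear_weight(f):.2f}  bri={brilinear_weight(f):.2f}")

The narrower the band, the closer it looks to plain bilinear (and the cheaper it is), which is exactly why the difference is hard to spot in stills but can show up as a visible mip line in motion.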

There is one thing you seem to have missed about Nv's AF though. In the earlier dets (44.xx), 8xAF per application was broken: an application requesting 8xAF was really getting 2xAF. With the newer 50.xx drivers, 8xAF per application actually works. Of course, like you said, the pseudo bi/tri filtering is still there, and it is now even more optimized in D3D.
 

AlphaWolf

I prey, not pray.
blizz said:
Funny thing is you guys bitch so much about the drivers but don't take into account how difficult it is: you've got to code so that crappy games still work with your new revision, plus you've got to make sure you support the standards for those "proper" games which do things sensibly.

of course they're never going to be 100%

Well, when you pay $200 for a video card, you should at least expect it to work. Same goes for the games, when software companies charge $50+ for each of 'em.
 

vampireuk

Mr. Super Clever
AlphaWolf said:
Well, when you pay $200 for a video card, you should at least expect it to work. Same goes for the games, when software companies charge $50+ for each of 'em.

Buy ATI then, it works, and they haven't cheated and tried to bullshit their way out of a paper bag for the past year ;)

My 9700 Pro works like a dream; the same could not be said for the reference 5200 Ultra I received. NVIDIA's drivers are picking up, but they have a long way to go before they regain people's trust.
 

AlphaWolf

I prey, not pray.
vampireuk said:
Buy ATI then, it works, and they haven't cheated and tried to bullshit their way out of a paper bag for the past year ;)

My 9700 Pro works like a dream; the same could not be said for the reference 5200 Ultra I received. NVIDIA's drivers are picking up, but they have a long way to go before they regain people's trust.

You know, I don't like either company; they both piss me off. ATI has been caught cheating just as badly as nvidia ever has, and even the ATI fanboy forum thinks their drivers are buggy. Nvidia's cards have sub-par image quality and waste power on laptops. The only thing I am doing here is dispelling the rumors that the fanboys/girls are spreading about one brand being totally superior to the other.

As far as speed is concerned, I really don't care. The fastest video card is always a ripoff; I only go for whoever gives the best overall card for $200. IMO, anybody who goes for a specific brand is just a fanboy.
 

pandamoan

Banned
AlphaWolf said:
You know, I don't like either company; they both piss me off. ATI has been caught cheating just as badly as nvidia ever has, and even the ATI fanboy forum thinks their drivers are buggy. Nvidia's cards have sub-par image quality and waste power on laptops. The only thing I am doing here is dispelling the rumors that the fanboys/girls are spreading about one brand being totally superior to the other.

As far as speed is concerned, I really don't care. The fastest video card is always a ripoff; I only go for whoever gives the best overall card for $200. IMO, anybody who goes for a specific brand is just a fanboy.

i have to agree 100% with alphawolf here.

both companies have disgraceful track records as far as integrity goes.

i'm cheaper though, so i aim for about $150... :)

and vampire, there's evidence of cheating on the new ati drivers as of 3 days ago. i read about it on tom's hardware.

both companies refuse to "do the work" by making good products (and good drivers!) and instead keep trying to short cut each other into a lead role.

it's pathetic. i may buy an xgi when they come out.
 

fivefeet8

-= Clark Kent -X- =-
AlphaWolf said:
Nvidia's cards have sub-par image quality and waste power on laptops. The only thing I am doing here is dispelling the rumors that the fanboys/girls are spreading about one brand being totally superior to the other.

Generally, IQ on Nvidia and ATi products is comparable. ATi has better quality FSAA, but ATi's FSAA doesn't AA everything, and Nvidia has a lot more FSAA options, some of which do compare very well against ATi's quality-wise. Nvidia's AF seems to be a little better though, especially at certain viewing angles. Most people who have both an Nvidia FX card and an ATi Radeon say IQ is comparable on both. Again, it's not one card's IQ totally dominating the other.

The current FW 52.16 video drivers have been found to contain a lot more FSAA options for the FX cards than before. Using the newest RivaTuner along with the 52.16 dets, these FSAA modes have been uncovered; some are new and undocumented, some are old:

old modes enabled:
FSAAMode12 = 4x 9tap
FSAAMode05 = 4x Super Sampling
FSAAMode16 = old 6x
FSAAMode18 = old 8x
8xS-d3d/ogl
12x-d3d
16x-ogl

new unknown modes:
FSAAMode01 = mystery horizontal SS?
FSAAMode02 = mystery vertical SS?
FSAAMode0C = mystery SS mode?
FSAAMode14 = mystery texture filter
FSAAMode0A = StarStorm Quality

These new modes were found by FX card owners editing the RivaTuner config file.
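The edit itself is just a key=value change. Something along these lines would automate it; note that the config file name and key name here are assumptions for illustration, not RivaTuner's documented layout:

# Illustrative sketch only: rewrites one key=value line in an INI-style
# config. "RivaTuner.cfg" and "FSAAMode" are assumed names; the mode
# values are the ones listed above.

from pathlib import Path

CFG = Path("RivaTuner.cfg")   # assumed file name
KEY = "FSAAMode"              # assumed key

def set_mode(value):
    lines = CFG.read_text().splitlines() if CFG.exists() else []
    out, found = [], False
    for line in lines:
        if line.strip().startswith(KEY):
            out.append(f"{KEY} = {value}")
            found = True
        else:
            out.append(line)
    if not found:
        out.append(f"{KEY} = {value}")
    CFG.write_text("\n".join(out) + "\n")

set_mode("0A")   # e.g. the "StarStorm Quality" mode from the list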

As far as synthetic benchmark cheating goes, both companies do it. Using anti-detect scripts with Nvidia/ATi cards, the Game 4 Nature benchmark in 3dmark2k1se runs at a much lower FPS on both products. With the newest 52.16 dets, it seems Nvidia may have gotten rid of the cheat for the Game 4 Nature test, and a lot of Nvidia card users are complaining about dropped framerates/scores in 3dmark2k1se. It's stupid, I know, but I see it at every hardware forum: "The 52.16s suck!!! My score in 3dmark2k1se dropped 2000 points!!" Pff.

In my own benchmark testing, the Nature test in 3dmark2k1se did drop from 140 fps to 100 fps, about 40 fps, going from the 44.67s to the 52.16s, and the total 3dmark2k1se score dropped with it. But it's a synthetic benchmark that has been optimized by both companies; it seems Nvidia may have taken a few "cheats" out of it, though.

I don't really care about synthetic benchmarks; I test them only to compare on a driver-to-driver basis. The actual games I've tested run much better with the current drivers.

Damn. This post is getting too long. :blink:
 

Tagrineth

Dragony thingy
AlphaWolf said:
Wait, so which is it? They code around all but one nvidia bug, and leave six ATI bugs intact? Why not just code around every nvidia bug while you're at it? And IIRC two of these ATI bugs were showstoppers (i.e., they prevented the game from working), so why not code around those? Could it be that the ATI drivers are just buggier?

nVidia is the market leader, and probably provided a lot of the funding for the game. So of course, given a certain amount of time to code, they're gonna focus more on getting it to work with the "current big thing".

Maybe the remaining nVidia bug was so severe it couldn't be worked around without rewriting large portions of the engine?

And if you spend all of your VERY LIMITED coding time on nV bugs, how're you going to fix all of the other vendors' bugs?

You know, I don't like either company; they both piss me off. ATI has been caught cheating just as badly as nvidia ever has, and even the ATI fanboy forum thinks their drivers are buggy. Nvidia's cards have sub-par image quality and waste power on laptops. The only thing I am doing here is dispelling the rumors that the fanboys/girls are spreading about one brand being totally superior to the other.

ATi has been caught cheating just as badly? Example please? I'd like to see where ATi was effectively disabling Trilinear filtering without saying anything about it. I'd like to see where ATi was adding clipping planes everywhere the camera wasn't. I'd like to see where ATi b0rked DXTC with a noticeable visual quality loss. Or how about I dig up the list of complaints Valve shot out when the 50.xx dets appeared? They had about 8-10 bullet points of cheats the NEW driver was using in Half-Life 2...

and vampire, there's evidence of cheating on the new ati drivers as of 3 days ago. i read about it on tom's hardware.

Internet rule #1. Never believe anything you read on Tom's Hardware wrt video cards. Pabst has been severely nVidia-biased since THG opened. I can cite examples of very obviously clouded judgment.

both companies refuse to "do the work" by making good products (and good drivers!) and instead keep trying to short cut each other into a lead role.

You really think ATi's R3x0 line isn't a good product? Hell, even nVidia's been known to make good products (GeForce4Ti).
 

pandamoan

Banned
ok, sorry, yes i think the current r3x0 product line is solid, however without decent drivers, does it matter?

honestly if the gfx card companies would open source their drivers, it would solve all of this, and ensure more optimized drivers.

it's the hardware we pay for anyway.

and i agree, the nvidia ti line was pretty solid driver AND hardware wise, so..... they both have had moments.

my main point is that they are both cheating in drivers.

they are both releasing all kinds of pointless FUD.

they are both releasing hardware and software that is not especially well engineered or stable (as a combo of hardware and software).

they are both f*@#ing up, and it's time for a third party to kick their asses.

enter xgi.

:)
 

AlphaWolf

I prey, not pray.
fivefeet8 said:
Nvidia's AF seems to be a little better though.

I'll have to agree there. When I can get AF to work on ATI cards, it makes this little box with rounded edges surrounding you, whereas nvidia's AF is like a perfect sphere that surrounds you... it just looks a lot smoother that way. On the other hand, and this is the main thing I was referring to: the overall picture is sharper with ATI cards in 2D.

Tagrineth said:
nVidia is the market leader, and probably provided a lot of the funding for the game. So of course, given a certain amount of time to code, they're gonna focus more on getting it to work with the "current big thing".

Maybe the remaining nVidia bug was so severe it couldn't be worked around without rewriting large portions of the engine?

And if you spend all of your VERY LIMITED coding time on nV bugs, how're you going to fix all of the other vendors' bugs?

Come on, you seriously believe that? That is so far-fetched that it's bordering on a conspiracy theory. Quit being a fangirl :p

Tagrineth said:
ATi has been caught cheating just as badly? Example please?

Aside from what fivefeet8 just said, you can't forget quack3.exe.
 

Xade

Irrelevant Insight
Tagrineth said:
Easy, more nVidia bugs were worked around than ATi ones.

You can't code around absolutely every bug, though.

For the best example of bug-evasive coding practices, check out C&C: Renegade.

Oh, and by the way, http://www.beyond3d.com/forum/viewtopic.php?t=8926

The latest Detonators (ForceWare, whatever) have worked FLAWLESSLY with my very, very powerful graphics card, Tagrineth.

The performance is much improved, and I have no complaints.

As for the guys that do... try... uninstalling... the... old... drivers... first...

*Sigh*
 

vampireuk

Mr. Super Clever
ATI were caught out and removed the cheats from the drivers; NVIDIA buried them under more lies and tried to keep the "optimisations" in the drivers. Then of course there was the 5800 winning an award for a year it wasn't even released in... but that's going off topic ;)
 

fivefeet8

-= Clark Kent -X- =-
vampireuk said:
ATI were caught out and removed the cheats from the drivers; NVIDIA buried them under more lies and tried to keep the "optimisations" in the drivers.

Depends on which benchmarks ATi took the optimizations out of. They definitely weren't taken out of 3dmark2k1se, as proven by the anti-detect scripts. You really have to wonder whether a company would cheat once and never do it again. Nvidia is definitely optimizing for benchmarks, but so is ATi.
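For anyone who hasn't followed the anti-detect story: the scripts work by making a benchmark's shaders (or executable) unrecognisable to the driver, so any app-specific fast path stops firing, and the framerate drop tells you a swap was there. Conceptually it's no more than this toy model in Python, which is not how either driver is actually written:

# Toy model of app-specific benchmark "optimisation" and why
# anti-detect scripts expose it.

import hashlib

def fingerprint(shader_source):
    return hashlib.md5(shader_source.encode()).hexdigest()

# Pretend the driver ships hand-tuned replacements keyed on known shaders.
REPLACEMENTS = {
    fingerprint("nature_water_shader_v1"): "cheap_approximation",
}

def compile_shader(shader_source):
    fast = REPLACEMENTS.get(fingerprint(shader_source))
    if fast is not None:
        return fast            # recognised benchmark: swap in the fast path
    return "honest_compile"    # anything unrecognised gets the real thing

print(compile_shader("nature_water_shader_v1"))         # cheap_approximation
# Perturb the shader even trivially and the fingerprint no longer matches:
print(compile_shader("nature_water_shader_v1 //nop"))   # honest_compile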
 

Tagrineth

Dragony thingy
wrt the Quake3 renamed executable thing, that was an accident, a leftover optimisation from Radeon R6.

If it was such a cheat, why did the next driver release look 100% normal and have better performance at the same time? If it was a cheat, the quality reduction would be the CAUSE of the performance increase. Obviously this isn't the case.
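And remember why the renamed-executable test worked at all: this kind of app-specific path usually keys on the process name, so a renamed copy simply misses the lookup. Roughly like this (an illustration of the idea, not ATi's code):

# Toy illustration of executable-name app detection, which is why
# renaming quake3.exe to quack3.exe was enough to toggle the behaviour.

import os

APP_PROFILES = {
    "quake3.exe": {"texture_quality": "reduced"},   # app-specific tweak
}

def profile_for(exe_path):
    exe = os.path.basename(exe_path).lower()
    # A renamed copy misses the lookup and gets the default path.
    return APP_PROFILES.get(exe, {"texture_quality": "full"})

print(profile_for("C:/games/quake3.exe"))   # reduced
print(profile_for("C:/games/quack3.exe"))   # full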

fivefeet8 said:
Depends on which benchmarks the optimizations were taken out of by ATi. They definately weren't taken out of 3dmark2k1se as proven by the anti detects. You really have to wonder if a company would cheat once and never do it again. Nvidia is definately optimizing for benchmarks, but so is Ati.

I get the distinct impression that ATi doesn't care about 3DMark2001SE any more; they probably don't think it's worth the effort to take that out, considering that in an official sense 2001 is obsolete.

They should take it out, though.

And keep in mind they did take out the ops in 3DMark03, and since then I can't remember hearing anything about ATi cheating in anything at all.

AlphaWolf said:
I'll have to agree there. When I can get AF to work on ATI cards, it makes this little box with rounded edges surrounding you, whereas nvidia's AF is like a perfect sphere that surrounds you... it just looks a lot smoother that way. On the other hand, and this is the main thing I was referring to: the overall picture is sharper with ATI cards in 2D.

Who's the fanperson here? "Little box with rounded edges surrounding you"?

Try these on for size. I don't see anything wrong with that AF.
 

AlphaWolf

I prey, not pray.
Tagrineth said:
wrt the Quake3 renamed executable thing, that was an accident, a leftover optimisation from Radeon R6.

And people here accuse me of beating old topics to death.

Tagrineth said:
Who's the fanperson here? "Little box with rounded edges surrounding you"?

Try these on for size. I don't see anything wrong with that AF.

Two things are wrong with those screenshots: they're too dark and textured to make out the seams, and it isn't feasible to run every game at 16x AF.

What I am referring to specifically is the mipmap pattern. An example is attached. Note how it has that corner: it forms a box like that around you. My nvidia cards form a perfect circle, which is less noticeable IMO and just looks more natural.
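That rounded-box shape falls straight out of angle-dependent AF: if the hardware applies full anisotropy only near certain surface angles, the lines of equal filtering quality stop being circles. A crude numeric model of the idea (my own sketch with a made-up falloff, not anything from either driver):

# Crude model of angle-dependent AF. Assume full anisotropy at 0 and 90
# degrees and a reduced amount near 45, then sweep the floor-plane angle.
# Iso-quality lines under this falloff trace a box with rounded corners
# rather than the circle an angle-invariant implementation would give.

import math

def effective_aniso(theta_deg, max_aniso=8.0):
    penalty = abs(math.sin(2 * math.radians(theta_deg)))  # 0 at 0/90, 1 at 45
    return max_aniso * (1.0 - 0.5 * penalty)              # assumed falloff

for deg in range(0, 91, 15):
    print(f"{deg:3d} deg -> ~{effective_aniso(deg):.1f}x AF")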

 

fivefeet8

-= Clark Kent -X- =-
http://www.firingsquad.com/hardware/imagequality2/default.asp

One of the more recent IQ comparisons.

Currently, ATI's method of filtering cannot hit all angles; this is a hardware limitation. In a game like Unreal Tournament 2003 you are very unlikely to notice it, but in bigger games like MMORPGs (EverQuest, Star Wars Galaxies, etc.), if you look off into the distance to the left and right, there are areas where no filtering is occurring. This doesn't happen with Nvidia cards, as those areas do get filtered.
 
