Buying a new Geforce?

fivefeet8

-= Clark Kent -X- =-
poison_dart3 said:
That Nvidia simply cheats people. NV still sells GF4 MX cards (in the middle of 2004!) for about €60. The trouble is that these cards don't support pixel and vertex shaders, so gamers can kiss goodbye to a lot of great games such as Doom 3, Silent Hill 3, Crazy Taxi: High Roller, Sands of Time, Pandora Tomorrow... Meanwhile, ATI cards at the same price can run all of the games listed.

I'm not even gonna respond. :whistling
 

jdsony

New member
fivefeet8 said:
Features I'm aware of on ATI cards that Nvidia doesn't have an equivalent for:

-TAA
-SmartShader
-Hardware 3Dc (can be done in software; see the sketch below)
-TruForm

Features Nvidia has that ATI doesn't have an equivalent for:
-Super Sampling AA
-Hybrid Super Sampling + Multisampling AA modes
-Digital Vibrance
-Image Sharpening
-3D Stereoscopic
-Image quality settings all available through the Control Panel (disable/enable all AF/trilinear optimizations)
-Shader Model 3 support.
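
(For reference: 3Dc is ATI's two-channel normal-map compression, and the "in software" part just means reconstructing the normal's Z component per pixel, since only X and Y are stored. The C sketch below is illustrative; the function name and value ranges are made up for the example.)

#include <math.h>

/* Illustrative sketch: 3Dc keeps only the X and Y components of a unit
 * normal. Z can be rebuilt because x*x + y*y + z*z = 1. */
static void reconstruct_normal(float x01, float y01, float n[3])
{
    float x = 2.0f * x01 - 1.0f;   /* map the stored [0,1] values back to [-1,1] */
    float y = 2.0f * y01 - 1.0f;
    float zz = 1.0f - x * x - y * y;

    n[0] = x;
    n[1] = y;
    n[2] = zz > 0.0f ? sqrtf(zz) : 0.0f;  /* clamp against rounding error */
}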

I don't want to continue the argument here, but you own an NVIDIA card and probably haven't owned any recent ATI cards. I personally don't care much for Nvidia anymore, but I have owned a few of their cards in the past. I'm sure I would be happy enough with an Nvidia card, but I can't really comment on their more recent cards, which I haven't owned.

About the features you mentioned above, though, some of them are the same features under different names for each company, and a lot of the features like "Digital Vibrance" and "Image Sharpening" shouldn't be needed and are probably an attempt to simulate better image quality. If I had to compare video cards to cars, I would say ATI is a BMW while Nvidia is a modded Civic. Nvidia is all flash and in your face, especially with their ludicrous "The way it's meant to be played" ads in so many games and their whining about certain benchmarks. ATI is more low-profile and professional (except if you count the HL2 Source link, I guess).

When ATI and NVIDIA have cards that perform similarly, it comes down to personal preference and who you trust more to make a good product.
 
Last edited:

MindMasher

New member
At the absolute top end of the performance spectrum, the X800 XT is edging out the 6800 Ultra by 5% in some cases. We are talking huge resolutions and big-time AA. Price-wise they are about equal, and at lower resolutions/AA settings the 6800 Ultra seems to be slightly faster (sometimes by 10%). I used to buy nothing but Nvidia, but I like the direction ATI is taking better than NVIDIA; they are more focused on developers than on marketing, at least according to some of the industry types who have been interviewed.

Rest assured, if you bought either of these cards you would have nothing short of a monster on your hands - more GPU power and features than 99% of today's games require. I think that such a close race is good for ATI/Nvidia, but even better for consumers!
 

Nightmare

(when dream come true)
MindMasher said:
I think that such a close race is good for ATI/Nvidia, but even better for consumers!

Hmm... I'm not so sure about that... Nvidia has already released crappy graphics cards (the GeForce 3 and GeForce 5 series), and ATI has released cards that needed extra power (meaning they didn't take the time to make them work with standard motherboards)... They just make money as soon as they can, so how could it be profitable for us? I don't want to change graphics cards every year...
 

MindMasher

New member
How can it be profitable? That's like asking why Microsoft can do whatever they want with Internet Explorer, Windows, and the DOJ. There is simply no competition!

As for your comments about them trying to get rich quick... you're totally off base.

A) The GeForce 3 was a good, solid card for its time, and considering this was near the beginning of the six-month product cycle, the fall refresh was suitably powerful and supported exciting new features (hardware T&L, I believe).

B) The 5xxx series from Nvidia wasn't bad per se; it was just that ATI's product was so good.

C) The GPU cycle is driven by gamers and games for the most part, and the six-month product refresh is successful because people are buying cards at that rate; it's that simple...

Not every company is greedy and cutthroat, and some of them reached their position by hard work and catering to customers.
 

l00pus

New member
fivefeet8 said:
What cards are you comparing (NV/ATI) for TV-out quality and options? Also, what options are on ATI's TV out that aren't on Nvidia cards?


Well, when I originally put together my cab I was only concerned with 2D performance, as it was going to be strictly a MAME cab. I had a GeForce 4 MX and was comparing it to a Radeon 7500, both hooked up via S-Video. Recently, with troubleshooting from a previous thread (if you want to read it, it's on this forum; the title is "I need some suggestions", probably a few pages back), I had the privilege of working with some other cards in my arcade machine. I had a Ti 4200 and an ATI Radeon 9000.

In both comparisons, the way the Nvidia card handled the television seemed worse compared to the ATI. I don't know the technical reasons why, but I went through all of the video/advanced settings and I just couldn't tweak the Nvidia to the image quality of the ATI.

First off, getting the image to fill the TV perfectly on the Nvidia is next to impossible. The TV adjustment options just are not as robust as ATI's. The ATI boots in both cases with the image properly filling the screen, no tweaking needed. Just in case it doesn't, it has separate image resize options for vertical and horizontal in the TV options, which did not seem possible on the Nvidia.

Next, the picture is just much cleaner. The flicker settings are more robust on the ATI card, with a wider range of flicker variance.

The colors and overall image are far more defined on the ATI. Right off the bat you can tell the difference, and I even had three observers while I was doing some of the swaps who noted the same thing. I'm sure you could tweak the Nvidia to make it a bit closer, but getting anywhere near the quality of the ATI's defaults was difficult. I have since tweaked the color/gamma settings on the ATI to show the disparity even further.

Now, trust me, I am one of the biggest Nvidia fanboys there are, but for television work I cannot deny the differences. People in my office actually make fun of how much of an Nvidia fanboy I am. If you were using anything other than a TV I would probably recommend Nvidia, but for a TV, even at default settings, the ATI cards I have tested blow away the Nvidia even after you tweak it.
 

gandalf

Member ready to help
Clements said:
Like what? TruForm? :happy:

I'd take a 6800 Ultra over an X800 XT any day, for many reasons which I won't state now.

I know why:
You have a GeForce.
The 6800 supports DX 9.0c and the X800 does not.

;)


Back to the topic... buy an ATI Radeon.
 

fivefeet8

-= Clark Kent -X- =-
jdsony said:
I don't want to continue the argument here, but you own an NVIDIA card and probably haven't owned any recent ATI cards.

I've never owned an ATI card, but I have used them a lot. I also go to many sites where people who own them talk about them (good or bad). I do not think ATI cards are bad at all. On the contrary, I think ATI's products are just as good as Nvidia's.

jdsony said:
About the features you mentioned above, though, some of them are the same features under different names for each company, and a lot of the features like "Digital Vibrance" and "Image Sharpening" shouldn't be needed and are probably an attempt to simulate better image quality.

Which features are you talking about? The ones I listed for Nvidia aren't even on an ATI card, as far as I'm aware. Digital Vibrance and Image Sharpening aren't needed, but they are features that some people actually like to use. They don't simulate better image quality per se, but they do give the user options to set whatever kind of image quality they prefer.
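
(Roughly speaking, a control like Digital Vibrance boils down to a saturation boost applied at the output stage. The C sketch below is only an illustrative approximation, not NVIDIA's actual algorithm; the function name and weights are made up for the example.)

/* Illustrative only: scale each pixel's chroma away from its luma to
 * "boost vibrance". vibrance = 1.0 leaves the image unchanged. */
static void boost_saturation(float rgb[3], float vibrance)
{
    /* Rec.601 luma used as the neutral axis. */
    float luma = 0.299f * rgb[0] + 0.587f * rgb[1] + 0.114f * rgb[2];
    int i;

    for (i = 0; i < 3; i++) {
        float c = luma + (rgb[i] - luma) * vibrance;      /* scale chroma */
        rgb[i] = c < 0.0f ? 0.0f : (c > 1.0f ? 1.0f : c); /* clamp to [0,1] */
    }
}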

jdsony said:
When ATI and NVIDIA have cards that perform similarly, it comes down to personal preference and who you trust more to make a good product.

Exactly, and that's what I've been saying.
 
Last edited:

fivefeet8

-= Clark Kent -X- =-
MindMasher said:
At the absolute top end of the performance spectrum, the X800 XT is edging out the 6800 Ultra by 5% in some cases. We are talking huge resolutions and big-time AA. Price-wise they are about equal, and at lower resolutions/AA settings the 6800 Ultra seems to be slightly faster (sometimes by 10%). I used to buy nothing but Nvidia, but I like the direction ATI is taking better than NVIDIA; they are more focused on developers than on marketing, at least according to some of the industry types who have been interviewed.

I'm not quite sure ATI is focused on developers more than Nvidia is. I'd say they both try to push their strengths onto developers, but they only get the developers in their own developer campaigns to actually back them up. The difference is, Nvidia seems to have most of the best developers in its corner.

ATI has its "Get in the Game" campaign developers, and Nvidia has its "The Way It's Meant to be Played" campaign.
 

fivefeet8

-= Clark Kent -X- =-
l00pus said:
Well, when I originally put together my cab I was only concerned with 2D performance, as it was going to be strictly a MAME cab. I had a GeForce 4 MX and was comparing it to a Radeon 7500, both hooked up via S-Video. Recently, with troubleshooting from a previous thread (if you want to read it, it's on this forum; the title is "I need some suggestions", probably a few pages back), I had the privilege of working with some other cards in my arcade machine. I had a Ti 4200 and an ATI Radeon 9000.

In both comparisons, the way the Nvidia card handled the television seemed worse compared to the ATI. I don't know the technical reasons why, but I went through all of the video/advanced settings and I just couldn't tweak the Nvidia to the image quality of the ATI.

First off, getting the image to fill the TV perfectly on the Nvidia is next to impossible. The TV adjustment options just are not as robust as ATI's. The ATI boots in both cases with the image properly filling the screen, no tweaking needed. Just in case it doesn't, it has separate image resize options for vertical and horizontal in the TV options, which did not seem possible on the Nvidia.

Next, the picture is just much cleaner. The flicker settings are more robust on the ATI card, with a wider range of flicker variance.

The colors and overall image are far more defined on the ATI. Right off the bat you can tell the difference, and I even had three observers while I was doing some of the swaps who noted the same thing. I'm sure you could tweak the Nvidia to make it a bit closer, but getting anywhere near the quality of the ATI's defaults was difficult. I have since tweaked the color/gamma settings on the ATI to show the disparity even further.

You're basing that on a GeForce 4 Ti? Believe me when I say Nvidia has improved its 2D/TV-out quality a lot with the GeForce FX and NV40. Those problems you mentioned are no longer an issue with my FX 5900 Ultra or 6800 Ultra.
 

l00pus

New member
fivefeet8 said:
You're basing that on a GeForce 4 Ti? Believe me when I say Nvidia has improved its 2D/TV-out quality a lot with the GeForce FX and NV40. Those problems you mentioned are no longer an issue with my FX 5900 Ultra or 6800 Ultra.

Well, unfortunately, a lot of folks building a system to go in a cabinet and looking for a card to support it may not be going that high-end. I'm sure once I get a new Nvidia card for my main system I will try it out and see for myself how it compares, but I will believe it when I see it. I just recently picked up a Radeon card for $55 that is handling anything my cab throws at it, and nothing even up to a $100 price point (probably more) could stand toe to toe with the TV quality of this card. When you don't need support for large resolutions (as most TVs aside from HDTVs can't handle above 800x600), the need for that powerful a graphics card is less, so until Nvidia pushes some of those improvements down to some of the lower-tier cards, ATI will most likely continue to be the card of choice for any standard-television user.
 

fivefeet8

-= Clark Kent -X- =-
l00pus said:
Well, unfortunately, a lot of folks building a system to go in a cabinet and looking for a card to support it may not be going that high-end. I'm sure once I get a new Nvidia card for my main system I will try it out and see for myself how it compares, but I will believe it when I see it. I just recently picked up a Radeon card for $55 that is handling anything my cab throws at it, and nothing even up to a $100 price point (probably more) could stand toe to toe with the TV quality of this card. When you don't need support for large resolutions (as most TVs aside from HDTVs can't handle above 800x600), the need for that powerful a graphics card is less, so until Nvidia pushes some of those improvements down to some of the lower-tier cards, ATI will most likely continue to be the card of choice for any standard-television user.

There are low-end FX cards with the same TV-out features as their high-end counterparts. If you do get a low-end FX card, then I'll let you decide for yourself.
 

l00pus

New member
fivefeet8 said:
There are low-end FX cards with the same TV-out features as their high-end counterparts. If you do get a low-end FX card, then I'll let you decide for yourself.

I'm sure I will get something to test eventually. I only put Nvidia cards in my main box, and I will most likely be jumping right from the 4s to the 6800s, though. Plus, early next year I plan on upgrading the TV in my cab to an HDTV, so then I may have use for some higher resolutions. I will be interested to see if the feature set has gone through that much of an upgrade. I still find it silly that, if the TV-out features have been improved, they couldn't have trickled down to older cards. I mean, there is no reason screen management and flicker controls couldn't have been passed down... if not color and gamma management as well.
 
Last edited:

fivefeet8

-= Clark Kent -X- =-
l00pus said:
I still find it silly that, if the TV-out features have been improved, they couldn't have trickled down to older cards. I mean, there is no reason screen management and flicker controls couldn't have been passed down... if not color and gamma management as well.

I think it's a cost issue more than anything. Nvidia doesn't produce the older cards now anyway, so the only way to improve the TV-out quality of older cards would be to start making them again with updated hardware. There's not much sense in doing that from a business standpoint.

Another reason I think 2D/TV quality has improved over previous generations is that Nvidia has started being a lot stricter about its board partners (IHVs) using better RFI filters, since Nvidia doesn't usually make the cards itself. In previous generations, IHVs had a lot more freedom in the quality of components they used, and they sometimes used poor-quality parts to lower costs at the expense of output quality.
 

l00pus

New member
fivefeet8 said:
I think it's a cost issue more than anything. Nvidia doesn't produce the older cards now anyway, so the only way to improve the TV-out quality of older cards would be to start making them again with updated hardware. There's not much sense in doing that from a business standpoint.

Another reason I think 2D/TV quality has improved over previous generations is that Nvidia has started being a lot stricter about its board partners (IHVs) using better RFI filters, since Nvidia doesn't usually make the cards itself. In previous generations, IHVs had a lot more freedom in the quality of components they used, and they sometimes used poor-quality parts to lower costs at the expense of output quality.

Yes, I actually read about the filters and such. I bought my Ti 4200 because people raved about its 2D quality, since Leadtek used a high-quality filter. It still didn't help much.

I would still think the flicker variance and screen control options would at least be driver-related. I mean, there isn't even an option to stretch the screen horizontally and vertically independently, and I am pretty sure the older cards would be capable of that.
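
(For what it's worth, an interlace flicker filter is essentially a vertical blend of each scanline with its neighbours before the frame is encoded for TV out, which is part of why it feels like something a driver could expose on older chips too. The kernel weights and names in the C sketch below are illustrative, not any vendor's actual implementation.)

/* Illustrative sketch: blend each output scanline with the lines above
 * and below it (a 1/4, 1/2, 1/4 vertical kernel). Stronger flicker
 * settings would widen the kernel or increase the neighbour weights. */
static void flicker_filter_line(const unsigned char *above,
                                const unsigned char *line,
                                const unsigned char *below,
                                unsigned char *out, int width)
{
    int x;

    for (x = 0; x < width; x++)
        out[x] = (unsigned char)((above[x] + 2 * line[x] + below[x]) / 4);
}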
 
