Gfx Card Test

Doomulation

Ok... now for something interesting. This little proggie I made will check the capabilities of your gfx card. It might be interesting to see what everyone's cards support. Note, however, that not even I know what all these functions are used for :whistling but some I know. The more supported, the better.

I can also say that unless your card has hardware vertex processing and pure hardware vertex support, it's crap. Mainly because enabling them gave a 100% speed boost when I tried (*cough* Intel crap *cough*).
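If you're wondering what that actually means, requesting those features from D3D9 looks roughly like this (a simplified sketch, not the app's exact source; the helper name is made up):

// Simplified sketch (assuming Direct3D 9): request hardware vertex
// processing and a pure device, falling back to software processing.
#include <d3d9.h>

IDirect3DDevice9* CreateFastestDevice(IDirect3D9* d3d, HWND hwnd,
                                      D3DPRESENT_PARAMETERS* pp)
{
    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    DWORD flags = D3DCREATE_SOFTWARE_VERTEXPROCESSING;
    if (caps.DevCaps & D3DDEVCAPS_HWTRANSFORMANDLIGHT) {
        flags = D3DCREATE_HARDWARE_VERTEXPROCESSING;
        if (caps.DevCaps & D3DDEVCAPS_PUREDEVICE)
            flags |= D3DCREATE_PUREDEVICE;  // pure hardware device
    }

    IDirect3DDevice9* device = nullptr;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd, flags, pp, &device);
    return device;
}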

Anyway, test it out. This should be interesting.

EDIT 04-05-10 (YY-MM-DD):
Attached version 1.1. Changes:

- Rewrote the pixel and vertex shader version lookup code. It should now hopefully find the correct version. Unlike before, it now extracts the data from the caps structure instead of testing whether each version is correct (see the sketch after this list). If oddities still appear, then it's a D3D or driver problem.
- The lowest fully supported DX version should now be filled in, I hope, since it is derived from which pixel shader version you have.
- Tried to implement tooltips but failed horribly :cry: Damn MS for making static text so lacking! :cry:
- Looked over the vertex tweening and W-buffer support tests and found that they are correctly written. If the information is incorrect, then I'm afraid it's either D3D or your drivers messing with you.
- Changed compiled filename.
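Roughly what the new lookup does (a simplified sketch, not the app's exact source):

// Simplified sketch: read shader versions and feature bits straight out
// of the D3DCAPS9 structure instead of probing each version.
#include <stdio.h>
#include <d3d9.h>

void PrintCaps(const D3DCAPS9& caps)
{
    // Shader versions are packed DWORDs; d3d9caps.h has macros to unpack them.
    printf("Pixel shader:  %lu.%lu\n",
           D3DSHADER_VERSION_MAJOR(caps.PixelShaderVersion),
           D3DSHADER_VERSION_MINOR(caps.PixelShaderVersion));
    printf("Vertex shader: %lu.%lu\n",
           D3DSHADER_VERSION_MAJOR(caps.VertexShaderVersion),
           D3DSHADER_VERSION_MINOR(caps.VertexShaderVersion));

    // Feature support is reported as flag bits in the caps fields.
    printf("Vertex tweening: %s\n",
           (caps.VertexProcessingCaps & D3DVTXPCAPS_TWEENING) ? "yes" : "no");
    printf("W-buffer:        %s\n",
           (caps.RasterCaps & D3DPRASTERCAPS_WBUFFER) ? "yes" : "no");
}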
 

cooliscool

Nintendo Zealot
Cool little app.

Put it in your sig along with what's required for Jabo's D3D plugin (somewhere on smiff's site). Refer n00bs to it when they ask why everything looks so bad on their 1337 intel extreme with OMG 64MB SHARED RAM. :rolleyes:

:sp_canada:
 

Trotterwatch

New member
Xade said:
Because the MX 440 is 1337...

This is a great utility and one which I will actually use :) The only error I've noticed is that it tells me my card supports Pixel Shader 0.4 when it should say 1.4 (if all is correct at this end).
 

Clements

Active member
Moderator
The features shown in the program for my card (FX 5600) are identical to Xade's. Nice program; I'd love to see more from it ;)
 
OP
Doomulation
That's funny...
Has anyone noticed that it says the Z-buffer is unsupported on high-end cards? On the MX400 it says it's supported, but not on the others.
Is D3D playing a trick on me?

And the pixel shader version I must look into, because it's still not quite right. The same is the case for the DX9 card as well.

Btw...about the sig, if I change it, I gotta remove everything else due to the limit of sig chars.
Well, I'll think of a way.
 

Trotterwatch

New member
Doomulation, on this Radeon 8500 the W-buffer can actually be enabled via a compatibility option. And your tester picks up on that, which is cool.

I'd imagine you can do something similar with the other card that didn't support it (or the Z-buffer).
 
OP
Doomulation
Ah well, it relies on the information D3D reports in the caps structure. I guess what it basically does is ask the drivers whether a feature can be used or not.
So whatever D3D reports, it takes as given, just as any D3D application would. So if D3D reports that something (like the W-buffer) isn't enabled, then no D3D application will see it as enabled either.
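In code terms, the whole test boils down to something like this (a simplified sketch, not the app's exact source; the function name is made up):

// Simplified sketch (assuming Direct3D 9): ask the driver for its caps
// and take the reported bits as given; that's all any D3D app can do.
#include <d3d9.h>

bool IsWBufferReported()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return false;

    D3DCAPS9 caps;
    bool supported = false;
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        supported = (caps.RasterCaps & D3DPRASTERCAPS_WBUFFER) != 0;

    d3d->Release();
    return supported;  // whatever the driver reported, nothing more
}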
 
OP
Doomulation
Most of my apps are this small :)
I don't put mega-bigga-crap stuff in them like other companies seem to do.
 

sheik124

Emutalk Member
Kinda off topic, but I was looking at the X800 Pro and X800 XT Ultimate benchmarks, and all I can say is: eat nVidia's dirt. The 5950 Ultra creams the X800 Pro and near ties with the X800 XT Ultimate in OpenGL benchmarks. Although ATI holds most of the ground in the DX9 benchmarks, nVidia shines in DX8. Superiority in DX8 and OGL is enough to keep me from switching over.
 

Reznor007

New member
sheik124 said:
Kinda off topic, but I was looking at the X800 Pro and X800 XT Ultimate benchmarks, and all I can say is: eat nVidia's dirt. The 5950 Ultra creams the X800 Pro and near ties with the X800 XT Ultimate in OpenGL benchmarks. Although ATI holds most of the ground in the DX9 benchmarks, nVidia shines in DX8. Superiority in DX8 and OGL is enough to keep me from switching over.

Umm... which benchmarks are you looking at? This isn't golf, man; higher scores are better ;)

Look at Tom's Hardware's review... Unreal Tournament 2004, 1600x1200, 4x FSAA, 8x AF (all figures in fps):

X800 XT: 82.3
X800 Pro: 68.5
6800 Ultra: 64.1
5950 Ultra: 39.1
9800 XT: 38.2

The 5950 Ultra creaming the X800 Pro? Yeah...
 

sheik124

Emutalk Member
Do you seriously think I trust Tom's Hardware? Check out AnandTech, buddy:

http://www.anandtech.com/video/showdoc.html?i=2044&p=18

http://www.anandtech.com/video/showdoc.html?i=2044&p=19

And here we can see the 5950 Ultra brush past the X800 Pro:
http://www.anandtech.com/video/showdoc.html?i=2044&p=21

And here is a real oddity: the 5950 Ultra comes out ON TOP, due to the different shader programs being run (not pixel shaders; it's shadow maps for NV3X):
http://www.anandtech.com/video/showdoc.html?i=2044&p=13
 

Reznor007

New member
sheik124 said:
Do you seriously think I trust Tom's Hardware? Check out AnandTech, buddy:

http://www.anandtech.com/video/showdoc.html?i=2044&p=18

http://www.anandtech.com/video/showdoc.html?i=2044&p=19

And here we can see the 5950 Ultra brush past the X800 Pro:
http://www.anandtech.com/video/showdoc.html?i=2044&p=21

And here is a real oddity: the 5950 Ultra comes out ON TOP, due to the different shader programs being run (not pixel shaders; it's shadow maps for NV3X):
http://www.anandtech.com/video/showdoc.html?i=2044&p=13


AnandTech is just as bad as Tom... he was the one saying the NV30 was going to crush the R300 before they came out.

But if you insist on using Anand...

http://www.anandtech.com/video/showdoc.html?i=2044&p=17

http://www.anandtech.com/video/showdoc.html?i=2044&p=11

Both show ATI leading here. Oh yeah, you shouldn't count the nVidia 61.11 drivers in Far Cry... they don't even render properly: fog effects are skipped, there are graphical issues on the weapons, lighting/shadowing problems, and visible banding. Also, they use 1.1 shaders instead of the 2.0 shaders the ATI cards use ("The Way It's Meant to Be Played" indeed ;)).

As for why nVidia leads in any Quake 3 engine-based game, it's because it's the main game they have focused on optimizing since the TNT2.
 
