
Gfx Card Test

OP
Doomulation
Make SURE you have the right runtime. You can grab it at the beginning of the thread, where there's a link.
 

MasterPhW

Master of the Emulation Flame
Clements said:
Worked for me, since I was using drivers that have preliminary OpenGL 2.0 support. Before, it didn't work at all (thus the note in the readme), but now there are drivers available that allow the shader effects.
What drivers are you using, Clements? And since which drivers has OpenGL 2.0 been supported?
And the last question: what does "preliminary" mean? Do the ATI drivers handle OpenGL in a better way, or what?

And now, a great pic of GFX Test showing my great GeForce 3 Ti 200:

(I thought it was a DirectX 8 card?!? What's happening? Wrong thought or a bug?)
 

Clements

Active member
Moderator
MasterPhW_DX said:
What drivers are you using, Clements? And since which drivers has OpenGL 2.0 been supported?
And the last question: what does "preliminary" mean? Do the ATI drivers handle OpenGL in a better way, or what?

I'm using the 60.72 drivers. A few of the ones before needed a registry tweak to enable it, so I think the 60.72 drivers are the first to have it enabled from the start. It's preliminary since it's very new and hasn't reached the level of ATi's implementation yet (ATi drivers have supported it a lot longer).
 

rcgamer

the old guy
Clements, you should give the 61.11 drivers a go. They're the best I've used so far. You can get them at guru3d.com in the forums.
 

sheik124

Emutalk Member
MasterPhW_DX said:
What drivers are you using, Clements? And since which drivers has OpenGL 2.0 been supported?
And the last question: what does "preliminary" mean? Do the ATI drivers handle OpenGL in a better way, or what?

And now, a great pic of GFX Test showing my great GeForce 3 Ti 200:

(I thought it was a DirectX 8 card?!? What's happening? Wrong thought or a bug?)
Pixel Shader 1.1 = before DirectX 8.1 (not sure which version exactly)

Doomulation said:
Right, so I was really wondering what EXACTLY the plugins require to work. All I know is that Intel = bad, GF2 MX = okay. And for Direct64: no pixel shader = no workie, pixel 1.1 = bad, pixel 1.3 = okay, pixel 2.0 = good.
I have some corrections to make here. Yes, Intel does = bad (P4, eat my Athlon XP's dirt). The GeForce 2 MX really isn't that great; it would fall under no workie: NO PIXEL SHADER. My buddy has a GeForce 4 MX (there isn't much of a difference between them) and some games look like bleh (try running Halo or Far Cry on there for fun :puke: ). However, Jabo's Direct3D runs great on the card; I use it to play N64 at school (yes, I am a demon :n64: ).
 

cooliscool

Nintendo Zealot
Got a new card. Compared to my MX440 it's a beast; having pixel shaders makes a man feel so powerful. I also bought a GeForce FX 5200, but it's a POS. :p
 
OP
Doomulation
sheik124 said:
I have some corrections to make here. Yes, Intel does = bad (P4, eat my Athlon XP's dirt). The GeForce 2 MX really isn't that great; it would fall under no workie: NO PIXEL SHADER. My buddy has a GeForce 4 MX (there isn't much of a difference between them) and some games look like bleh (try running Halo or Far Cry on there for fun :puke: ). However, Jabo's Direct3D runs great on the card; I use it to play N64 at school (yes, I am a demon :n64: ).
I mean for emulation. Jabo's works fine on a GF2 MX.
 

euphoria

Emutalk Member
Here's what it gives for my Matrox G450.
Why does it draw the drop-down lists with a height of 1 pixel? Do you have to choose the options with the keyboard? BTW, what is "Hardware vertex support"? Is it some kind of pre-computed vertex list, or what?

Anyways, good idea for a program, and very useful.
 

Clements

Active member
Moderator
The GeForce 3 was a DirectX 8 card that supported Pixel Shader 1.1, and thus is a pre-DX8.1 card. The info says 'Below DX8.0', which is misleading since it IS DX8.0 compliant.

(The GeForce4 Ti supported PS 1.3 and was DirectX 8.1 compliant.)
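For anyone curious how a tool like GFX Test can detect this, here's a minimal sketch (assuming Direct3D 9; the actual program may query a different API or version) that reads the card's pixel shader version from the device caps and maps it to a DirectX generation along the lines Clements describes:

#include <d3d9.h>
#include <cstdio>
#pragma comment(lib, "d3d9.lib")

int main()
{
    // Create the Direct3D 9 object and query the default adapter's caps.
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    // PixelShaderVersion packs the major/minor version; comparing it
    // against D3DPS_VERSION places the card in a DirectX generation.
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        printf("PS 2.0+    -> DirectX 9 class\n");
    else if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 2))
        printf("PS 1.2-1.4 -> DirectX 8.1 class\n");
    else if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 0))
        printf("PS 1.0-1.1 -> DirectX 8.0 class\n");
    else
        printf("No pixel shaders -> pre-DX8 class\n");

    d3d->Release();
    return 0;
}

On a GeForce 3 this would report the DX8.0 class (PS 1.1), matching the post above rather than the misleading 'Below DX8.0' label.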
 

sheik124

Emutalk Member
cooliscool said:
Err. You know nothing. :plain:
My bad, dude. Intel = shit :whistling:
That is the one point nobody will ever move me from. And another thing: how do you like your gamepad? ;)
 

cooliscool

Nintendo Zealot
I'd be willing to bet you've never actually owned an Intel system (an old P1/PII system doesn't count). :\

As for the gamepad, I noticed a few days ago we had the same one. ;) I love it, very comfortable, analogs are nice and precise, and the rumble is awesome. :)
 

sheik124

Emutalk Member
cooliscool said:
I'd be willing to bet you've never actually owned an Intel system (an old P1/PII system doesn't count). :\

As for the gamepad, I noticed a few days ago we had the same one. ;) I love it, very comfortable, analogs are nice and precise, and the rumble is awesome. :)
Haha, yeah, my only Intel systems were my laptop (ancient piece of shit, 100 MHz pre-MMX Pentium) and my HP: Pentium 200 MMX, 32 MB of EDO RAM, a whopping 4 MB of video RAM for your powerful 3D S3 ViRGE chipset, and not to mention 3.8 GB of space, more than enough for all of today's applications. Yep, she was a beaut. Then of course I got my Slot A 900 MHz Athlon HP with an nVidia Vanta LT and realized what I had been missing, and with my Athlon XP 1600+ (which sadly had shit S3 ProSavage graphics) my fate was sealed to AMD; the price premium for an Intel isn't enough to sway me. AMD is on top in gaming and workstation performance; Intel is on top in encoding and compression.
 

Tagrineth

Dragony thingy
sheik124 said:
Haha, yeah, my only Intel systems were my laptop (ancient piece of shit, 100 MHz pre-MMX Pentium) and my HP: Pentium 200 MMX, 32 MB of EDO RAM, a whopping 4 MB of video RAM for your powerful 3D S3 ViRGE chipset, and not to mention 3.8 GB of space, more than enough for all of today's applications. Yep, she was a beaut. Then of course I got my Slot A 900 MHz Athlon HP with an nVidia Vanta LT and realized what I had been missing, and with my Athlon XP 1600+ (which sadly had shit S3 ProSavage graphics) my fate was sealed to AMD; the price premium for an Intel isn't enough to sway me. AMD is on top in gaming and workstation performance; Intel is on top in encoding and compression.

So yeah, basically you never had a P3...

P3 > you.

Pentium M, the current uber laptop CPU, is based on P3.

Rumour has it, with all the problems with Prescott and Tejas... and with Tejas now cancelled... Intel will replace NetBurst with a new P6 derivative (basically a souped-up desktop Pentium M).

I'm so fucking psyched, if that's true.
 
OP
Doomulation
euphoria said:
Here's what it gives for my Matrox G450.
Why does it draw the drop-down lists with a height of 1 pixel? Do you have to choose the options with the keyboard? BTW, what is "Hardware vertex support"? Is it some kind of pre-computed vertex list, or what?

Anyways, good idea for a program, and very useful.
Drop-down lists 1 pixel in height? Which drop-down lists? I'm quite sure there are none of those.

Hardware vertex support means that the gfx card can process all the vertex information. The vertices contain points in 3D space where polygons will be drawn. Using different rendering methods, these points are translated into polygons. With hardware vertex support, this is done by the gfx card; otherwise it is done by the processor. Hence, hardware vertex support is much faster.
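To make that concrete, here's a minimal Direct3D 9 sketch (an illustration with made-up helper naming, not necessarily how GFX Test itself does it) of checking the caps bit behind "Hardware vertex support" and picking hardware or software vertex processing accordingly:

#include <d3d9.h>
#pragma comment(lib, "d3d9.lib")

// Pick hardware vertex processing when the HAL device reports hardware
// transform & lighting; otherwise fall back to the CPU.
DWORD ChooseVertexProcessing(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    if (caps.DevCaps & D3DDEVCAPS_HWTRANSFORMANDLIGHT)
        return D3DCREATE_HARDWARE_VERTEXPROCESSING; // GPU processes vertices
    return D3DCREATE_SOFTWARE_VERTEXPROCESSING;     // CPU does it instead
}

The returned flag would then be passed as the behaviour-flags argument to IDirect3D9::CreateDevice.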

I can only say one thing about that card: it sucks.


Clements: would you help me put together a list of which cards support which DX version? The program needs some info to compare against, which currently is the pixel shader version.
 

euphoria

Emutalk Member
Doomulation said:
Drop-down lists 1 pixel in height? Which drop-down lists? I'm quite sure there are none of those.
Here are two screenshots of it. The 1st is what I get immediately after starting your prog; the 2nd shows the "1px dropdown list" (a black horizontal line where the Adapters dropdown should be).
Am I the only one getting this? Maybe I've got a fucked-up mfc70.dll or something.
Doomulation said:
Hardware vertex support means that the gfx card can process all the vertex information. The vertices contain points in 3D space where polygons will be drawn. Using different rendering methods, these points are translated into polygons. With hardware vertex support, this is done by the gfx card; otherwise it is done by the processor. Hence, hardware vertex support is much faster.
What exactly does the hardware do? Store the vertex array, or do the matrix operations in hardware?
Doomulation said:
I can only say one thing about that card: it sucks.
Well, thanks so much. I know.
 
OP
Doomulation
euphoria said:
Here are two screenshots of it. The 1st is what I get immediately after starting your prog; the 2nd shows the "1px dropdown list" (a black horizontal line where the Adapters dropdown should be).
Am I the only one getting this? Maybe I've got a fucked-up mfc70.dll or something.
Then something's fux0red on your system. Try the runtime I've provided at the site I linked earlier.

What exactly does the hardware do? Store the vertex array, or do the matrix operations in hardware?
It does the matrix operations. Not the transformations, but it calculates all the information needed to create the polygons from the information specified by the vertices: processing all the points, drawing lines between them, creating a 3D figure, and applying color and alpha.

There's much more to do before rendering a 3D scene, of course, but that requires other capabilities, such as hardware transformation and clipping.
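For the curious, the per-vertex matrix math being discussed boils down to multiplying every vertex by a 4x4 matrix (e.g. a combined world-view-projection matrix); hardware vertex processing moves exactly this work from the CPU to the card. A tiny plain-C++ sketch of the math, with illustrative type and function names not taken from any plugin:

// Each vertex position (x, y, z, w) is transformed by a 4x4 matrix
// before the polygons between the transformed points can be rasterized.
struct Vec4 { float x, y, z, w; };
struct Mat4 { float m[4][4]; }; // row-major 4x4 matrix

Vec4 Transform(const Mat4& m, const Vec4& v)
{
    Vec4 r;
    r.x = m.m[0][0]*v.x + m.m[0][1]*v.y + m.m[0][2]*v.z + m.m[0][3]*v.w;
    r.y = m.m[1][0]*v.x + m.m[1][1]*v.y + m.m[1][2]*v.z + m.m[1][3]*v.w;
    r.z = m.m[2][0]*v.x + m.m[2][1]*v.y + m.m[2][2]*v.z + m.m[2][3]*v.w;
    r.w = m.m[3][0]*v.x + m.m[3][1]*v.y + m.m[3][2]*v.z + m.m[3][3]*v.w;
    return r;
}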

Well, thanks so much. I know.
Yeah, well, if you got a GF2 MX, I bet you'd see a big speed increase... at least in normal computer games, since they use hardware vertex processing. There aren't that many N64 plugins that do, however.
 
