VooDoo 5 5500 PCI info needed



LazerTag
April 16th, 2003, 08:12
I know it's not directly emulation related, but I will hopefully be using this card with Glide64 (and other emu/game-related software).

So for WinXP Pro, where can I get drivers and what version will I need?

Also, does anyone see any issue with running this card inline with my GF4 Ti4200 AGP?

Guess I just need to hook my second monitor to the new card or get a switchbox.

The Khan Artist
April 16th, 2003, 09:58
http://falconfly.de/, use the AmigaMerlin 2.5SE or 3DHQ beta 10.

And fivefeet8 of the NGEmu forums would be the guy to ask about dual vid boards.

fivefeet8
April 17th, 2003, 00:17
Originally posted by The Khan Artist
http://falconfly.de/, use the AmigaMerlin 2.5SE or 3DHQ beta 10.

And fivefeet8 of the NGEmu forums would be the guy to ask about dual vid boards.

Moi? hehe.. :happy: I'm also registered here too.. ;)


Originally posted by LazerTag
I know it's not directly emulation related, but I will hopefully be using this card with Glide64 (and other emu/game-related software).

So for WinXP Pro, where can I get drivers and what version will I need?

Also, does anyone see any issue with running this card inline with my GF4 Ti4200 AGP?

Guess I just need to hook my second monitor to the new card or get a switchbox.

Go to www.voodoofiles.com for WinXP 3dfx drivers.. They should work fine as long as they are for WinXP.. I personally use the 3dhq beta 10 WinXP drivers.

There is no issue with the V5 PCI running inline with a GeForce 4 card.. I have the setup and they work fine.. You can use your GeForce 4 on one monitor to play PC games and use the V5 on the other to play Glide.. :)

LazerTag
April 17th, 2003, 03:11
Thanks much for the info. I can't wait to get it all set up. Should be much fun.

And thanks to the mod who moved my post. Sorry I missed the Tech forum. I'll try and double-check myself at the door next time. :)

LazerTag
April 18th, 2003, 08:12
fivefeet8,

Is there something I might be missing? I can't seem to get 1964 and others to use the new card for Glide. It seems they just want to continue running on my GF4 even though I deleted eVoodoo and its files.

I think I'll go try some Glide games for PC to see what happens, but any insight you might have since you are already doing this would be great.


thanks

Tagrineth
April 18th, 2003, 08:48
Generally speaking, the 2D-capable Voodoo card has to be the primary adapter for GLide acceleration.

In other words, you'd have to set up another hardware profile in which the Voodoo5 is the primary adapter, and switch monitor cables when you want your uberfast GLide support.

fivefeet8
April 18th, 2003, 10:49
Yeah.. To use Glide on your Voodoo 5, it has to be the primary display.. You can still run OGL and D3D on your GeForce even though the V5 is the primary though.. Enable multi-monitor acceleration on your GeForce..

LazerTag
April 18th, 2003, 22:13
OK, before coming back and seeing these two posts here is what I have done so far.

I was able to get the VooDoo working without being the primary. So now it works, but has a couple issues. When I first run 1964 using Glide64 it works, but the whole machine hangs within a minute with the sound looping. I have to reboot at this point. If I try it with sound disabled in 1964 it works great, until I try to run another game. It doesn't lock when I do this, but it doesn't play the next game either. No matter what I try at this point it doesn't seem to want to load a game again until I reboot or change video plugins back and forth.

I will set it to primary to see if these issues go away, but knowing what they are, I don't see why being primary or secondary would have any effect.

Anyway, I'll post more results tonight after I make the change. And is this change simply a matter of defining the primary from within Windows XP's display properties, or do I have to define it in my BIOS?

Tagrineth
April 18th, 2003, 23:13
You should be able to set up a hardware profile for it.


And, yes, it SHOULD matter whether it's primary or not. Keep in mind, older cards were only meant to drive one monitor, and then, generally, if you had more than one card, you would of course use the better one as primary.

Because of that, or something like that, D3D and OpenGL generally require a primary surface for rendering.

OpenGL doesn't any more, I think, and D3D will probably follow soon (officially)... but GLide? Can't. It will never be updated again; it's dead.

GLide is meant to run on the first GLide-capable adapter it finds; if you have a 2D-capable Voodoo, then it was going to be the primary adapter (at the time), and if it's a 3D-only card, it'll TAKE the primary surface away from your main adapter and use it itself.

API designers initially just didn't think of secondary surfaces, and now it's taking a long time to become standard.

Slougi
April 19th, 2003, 00:09
Tag, Glide was released as open source.

http://maccentral.macworld.com/news/9912/09.3dfx.shtml

Tagrineth
April 19th, 2003, 00:11
Originally posted by Slougi
Tag, Glide was released as open source.

http://maccentral.macworld.com/news/9912/09.3dfx.shtml

Not according to nVidia.

LazerTag
April 19th, 2003, 14:09
Well I give up on this one.

I made the card my primary in the BIOS and in Windows XP; it still hangs with the sound looping, yet neither my sound card nor the V5 is sharing an IRQ. I even went as far as taking my GF4 out just to make sure, and still the same thing.

Well, it was a nice idea I guess, but I'm sticking with supported tech.

Thanks for all the help.

Flash
April 20th, 2003, 00:33
Originally posted by Tagrineth
Not according to nVidia.

AFAIK the Glide sources were released by 3dfx.

Tagrineth
April 21st, 2003, 06:12
Originally posted by Flash
AFAIK the Glide sources were released by 3dfx.

And nearly all IP belonging to 3Dfx now belongs to nVidia. kthxowned.

Flash
April 22nd, 2003, 06:37
Huh... maybe, but anyway, the Glide sources were released by 3dfx in 1999 and they're still available.

Stezo2k
April 22nd, 2003, 06:40
Don't know why nVidia doesn't make their cards compatible with Glide. Glide is a great API, really fast :) Plus it would be good for Voodoo users too, who want to play their Glide games faster, etc.

Flash
April 22nd, 2003, 06:54
Originally posted by Tagrineth
You should be able to set up a hardware profile for it.


And, yes, it SHOULD matter whether it's primary or not. Keep in mind, older cards were only meant to drive one monitor, and then, generally, if you had more than one card, you would of course use the better one as primary.

Because of that, or something like that, D3D and OpenGL generally require a primary surface for rendering.

OpenGL doesn't any more, I think, and D3D will probably follow soon (officially)... but GLide? Can't. It will never be updated again; it's dead.



3dfx GLIDE™ for Windows - glide2.x , glide3.x
Contact: KoolSmoky , [email protected] , http://www.3dfxzone.it/koolsmoky

Open sourced 3dfx GLIDE™ driver for Win32 (Windows 95, 98, Me, NT4, 2000, XP) ported from the Linux 3dfx GLIDE™ source code. The Linux source code was originally open sourced by the now-defunct 3dfx Interactive Inc. Currently supports 3dfx Voodoo Banshee, Velocity, Voodoo3, Voodoo4, Voodoo5. It has been extended to support S3TC (DXTC) compressed textures and T-Buffer effects for all VSA-based cards including Voodoo4, and optimizations for 3DNOW!, MMX, SSE, and SSE2 are used.



Issues
Multi-Monitor with Windows 2000,XP
You may have problems running GLIDE™ on 3dfx devices other than the 1st. To run GLIDE™ on a 2nd or 3rd device, detach the Windows desktop from that display device using the display properties.

Tagrineth
April 22nd, 2003, 16:35
Originally posted by Stezo2k
Don't know why nVidia doesn't make their cards compatible with Glide. Glide is a great API, really fast :) Plus it would be good for Voodoo users too, who want to play their Glide games faster, etc.

They'd have to integrate a full Voodoo-series core into theirs. Not feasible.

Reznor007
April 23rd, 2003, 03:52
You wouldn't need to integrate the old Voodoo core. The only things nVidia's cards don't support that 3dfx did are NCC textures and maybe FXTC (I can't remember if that was added or not).

Of course, you could just do a software emulation of a Voodoo1/2. It's been done already, just check www.aarongiles.com

Tagrineth
April 23rd, 2003, 07:55
Originally posted by Reznor007
You wouldn't need to integrate the old Voodoo core. The only things nVidia's cards don't support that 3dfx did are NCC textures and maybe FXTC (I can't remember if that was added or not).

Yes, but they don't support it in the same way as 3dfx did. GLide is pretty much to-the-metal Voodoo-series code.

Same reason, say, ATi's R5 core can't run GLide.


Of course, you could just do a software emulation of a Voodoo1/2. It's been done already, just check www.aarongiles.com

Software raster = slooooooooooow.

Reznor007
April 23rd, 2003, 10:46
Originally posted by Tagrineth
Yes, but they don't support it in the same way as 3dfx did. GLide is pretty much to-the-metal Voodoo-series code.

Same reason, say, ATi's R5 core can't run GLide.



Software raster = slooooooooooow.

Yes, however, it's not as slow as some think. For example, NFL Blitz '99 arcade plays at 20% speed on a 3GHz P4. That may seem very slow, but you have to realize that it is doing software 3dfx emulation at 640x480, plus interpreter emulation of a 150MHz MIPS R5000, plus interpreter emulation of an ADSP2115. Carnevil is 50-75% on the same system, and runs on the same hardware. Compared to the emulation portion, I'd say the software rasterization only takes up maybe 20-30% of that.

As for a hardware wrapper/driver for other cards, any R200+ or GeForce3+ card could find an appropriate function for anything a 3dfx card could do. With R300 or GeForceFX it's even less of a problem.

milen
April 23rd, 2003, 15:33
Glide is just an API; yes, it's very close to 3dfx hardware because it was written for that purpose. It could be done on ATI, NVIDIA, SiS... cards, but it would not be better than on 3dfx cards, and maybe not faster.

The companies would not make money from this, so they won't do it.

Tagrineth
April 23rd, 2003, 22:11
Originally posted by Reznor007
Yes, however, it's not as slow as some think. For example, NFL Blitz '99 arcade plays at 20% speed on a 3GHz P4. That may seem very slow, but you have to realize that it is doing software 3dfx emulation at 640x480, plus interpreter emulation of a 150MHz MIPS R5000, plus interpreter emulation of an ADSP2115. Carnevil is 50-75% on the same system, and runs on the same hardware. Compared to the emulation portion, I'd say the software rasterization only takes up maybe 20-30% of that.

Is it doing bilinear filtering?


As for a hardware wrapper/driver for other cards, any R200+ or GeForce3+ card could find an appropriate function for anything a 3dfx card could do. With R300 or GeForceFX it's even less of a problem.


Glide is just an API; yes, it's very close to 3dfx hardware because it was written for that purpose. It could be done on ATI, NVIDIA, SiS... cards, but it would not be better than on 3dfx cards, and maybe not faster.

The companies would not make money from this, so they won't do it.

NO. It can NOT be done natively on anything other than the Voodoo series. Reznor, what you suggest would be wrapping code, not running natively! That's what 'finding an appropriate function' IS.

Only the Voodoo series can run GLide natively. Period. Get over it. It's the same as MeTaL only running on S3 cards, RRedline only on Rendition Vérité, No Direct3D on Intergraph Wildcat and Intergraph/3DLabs Wildcat2...

milen
April 23rd, 2003, 23:05
Do you think that ATI and NVIDIA cards support OpenGL natively? It's only supported through their drivers.

         Video Card
             |
       Unified Driver
             |
   ---------------------------
   |           |             |
OpenGL      DirectX        Glide

This is a universal diagram for all cards.

But in fact, on 3dfx's cards OpenGL calls are converted to Glide, then Glide to hardware calls. DirectX is converted directly to hardware calls.

So OpenGL, DirectX, and Glide are software (APIs), and the cards have features which are used through these APIs.

Tagrineth
April 24th, 2003, 03:28
Originally posted by milen
Do you think that ATI and NVIDIA cards support OpenGL natively? It's only supported through their drivers.

This is a universal diagram for all cards. (snipped)

But in fact, on 3dfx's cards OpenGL calls are converted to Glide, then Glide to hardware calls. DirectX is converted directly to hardware calls.

So OpenGL, DirectX, and Glide are software (APIs), and the cards have features which are used through these APIs.

Direct3D and OpenGL are both wrapped to GLide on the Voodoo series; Direct3D, however, unlike GLide, has the necessary runtimes integrated, rather than referencing external files like OpenGL does (which is why OGL stops working if you delete GLide2x and 3x.dll, whereas D3D keeps working - it's for WHQL reasons).

Voodoo3 and VSA-100 actually support some Direct3D and OpenGL in hardware, though. Moreso in VSA than V3... actually, the VSA supports a lot of D3D/OGL, though there's still more wrapped to GLide than necessary...

Anyway, I digress. If nVidia or ATi supported GLide, it would be through a wrapper, and only through a wrapper. The hardware can't support it natively. Basically it wouldn't be any better than using, say, eVoodoo or GL2IDE, except that the IHV's would include more speed hacks.

Reznor007
April 24th, 2003, 07:30
Originally posted by Tagrineth
Is it doing bilinear filtering?





NO. It can NOT be done natively on anything other than the Voodoo series. Reznor, what you suggest would be wrapping code, not running natively! That's what 'finding an appropriate function' IS.

Only the Voodoo series can run GLide natively. Period. Get over it. It's the same as MeTaL only running on S3 cards, RRedline only on Rendition Vérité, No Direct3D on Intergraph Wildcat and Intergraph/3DLabs Wildcat2...

It is doing 100% emulation of the Voodoo graphics core. This includes MIP mapping, perspective-correct texturing, bi/trilinear filtering, alpha blending, fog, and whatever else the game wants to use. The driver also runs Wayne Gretzky's 3D Hockey and San Francisco Rush.

And ALL drivers are essentially wrappers. The SDK for the particular API uses commands that are translated to the rasterizer's machine code. For example, the Glide command grDrawTriangle() is not executed by the graphics core; it is interpreted by the driver to do the appropriate function.

The big difference in current Glide wrappers is that instead of directly controlling the GPU, they send out D3D or OpenGL calls that do essentially the same function. If ATI or nVidia wanted, they could make their own glide3x.dll and have it send direct hardware commands without going through D3D/OpenGL.
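To make that concrete, here's a rough C sketch of what a wrapper-style grDrawTriangle could look like when it targets OpenGL instead of the Voodoo registers. The GrVertex layout here is simplified (the real Glide vertex also carries 1/w, per-TMU texture coordinates, and so on), and the GL projection/state setup is assumed to happen elsewhere, so treat it as an illustration of the idea rather than a drop-in glide2x/3x replacement.

/* A sketch only: a real wrapper would have set up an ortho projection
 * matching the Glide resolution in its grSstWinOpen replacement, and
 * would also translate texture, fog, and color-combine state. */

#include <GL/gl.h>

typedef struct {
    float x, y;        /* screen-space position             */
    float ooz;         /* 65535/z, used for depth           */
    float r, g, b, a;  /* vertex colour, 0..255             */
} GrVertex;            /* simplified for this illustration  */

/* Glide's triangle call, re-implemented on top of OpenGL calls. */
void grDrawTriangle(const GrVertex *va, const GrVertex *vb, const GrVertex *vc)
{
    const GrVertex *v[3] = { va, vb, vc };
    int i;

    glBegin(GL_TRIANGLES);
    for (i = 0; i < 3; i++) {
        /* Glide colours are 0..255 floats; OpenGL wants 0..1. */
        glColor4f(v[i]->r / 255.0f, v[i]->g / 255.0f,
                  v[i]->b / 255.0f, v[i]->a / 255.0f);
        glVertex2f(v[i]->x, v[i]->y);
    }
    glEnd();
}

An IHV-written glide2x.dll could skip the OpenGL layer and issue its own hardware commands at this point, which is the difference being described above.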

Reznor007
April 24th, 2003, 07:36
Originally posted by Tagrineth
Direct3D and OpenGL are both wrapped to GLide on the Voodoo series; Direct3D, however, unlike GLide, has the necessary runtimes integrated, rather than referencing external files like OpenGL does (which is why OGL stops working if you delete GLide2x and 3x.dll, whereas D3D keeps working - it's for WHQL reasons).

Voodoo3 and VSA-100 actually support some Direct3D and OpenGL in hardware, though. Moreso in VSA than V3... actually, the VSA supports a lot of D3D/OGL, though there's still more wrapped to GLide than necessary...

Anyway, I digress. If nVidia or ATi supported GLide, it would be through a wrapper, and only through a wrapper. The hardware can't support it natively. Basically it wouldn't be any better than using, say, eVoodoo or GL2IDE, except that the IHV's would include more speed hacks.

There is no such thing as supporting an API in hardware. The chip supports many 3D math functions, and the drivers simply translate 3D API commands to the chip's own internal instruction set.

Tagrineth
April 24th, 2003, 10:12
Originally posted by Reznor007
There is no such thing as supporting an API in hardware. The chip supports many 3D math functions, and the drivers simply translate 3D API commands to the chip's own internal instruction set.

So would you please explain to me why 3DLabs of all companies couldn't get Direct3D to work at all on the Wildcat and Wildcat 2 cores?

AlphaWolf
April 24th, 2003, 11:20
Originally posted by Tagrineth
So would you please explain to me why 3DLabs of all companies couldn't get Direct3D to work at all on the Wildcat and Wildcat 2 cores?

I am no expert on 3D crap, but I would guess that it was probably due to some limitations of the core that couldn't meet the needs of the API.

AFAIK, the API is in fact done through the driver interfacing with the hardware. I can't see why they would design any specific GPU to tailor to a specific API, because one of the key points of an API is to avoid this; that way the hardware can drastically change without breaking software compatibility. Otherwise there would be no layers between the hardware and the software application, and it would still be like the old days where X game is only compatible with Y card.

I think the real reason is: why use Glide when it's obsolete? It's not a professional standard; it was meant specifically to make the old games at the time work on the slower hardware they had available. Only one company owned it at the time that it was actually a good standard, and now that it's been replaced, why bother?

Reznor007
April 24th, 2003, 13:09
Originally posted by Tagrineth
So would you please explain to me why 3DLabs of all companies couldn't get Direct3D to work at all on the Wildcat and Wildcat 2 cores?

It's probably something like a required feature missing that prevents the API from being able to work. However, it could just be that they A) didn't care enough about it, B) the resulting performance was so slow they decided not to release it and just said it didn't work, or C) were incompetent (hey... Creative owns them, and look at all the crap they get about poor drivers).

milen
April 24th, 2003, 16:06
So would you please explain to me why 3DLabs of all companies couldn't get Direct3D to work at all on the Wildcat and Wildcat 2 cores?

Maybe they are bad programmers.


All drivers are wrappers, in fact. Sometimes there are two or three passes between a D3D/OpenGL command and the video card. And for 3dfx, as was said before, D3D doesn't use Glide; it uses the same layer as Glide for communication with the video card. OpenGL uses Glide, and then Glide uses that layer. Glide is faster because it supports only the functions that 3dfx hardware supports. It's written with 3dfx's hardware in mind and is optimized to get all available performance from it. D3D and OpenGL are universal APIs, and that makes them slower. If NVIDIA made a card which directly supported all D3D functions and worked together with Microsoft to implement NVIDIA-specific code, then the result would maybe be something like Glide.

Tagrineth
April 24th, 2003, 22:19
Originally posted by Reznor007
or C) were incompetent (hey... Creative owns them, and look at all the crap they get about poor drivers).

Creative bought them when Wildcat VP was already ready and IIRC Wildcat 4 was already out.

3DLabs are hardly bad programmers; they have quite possibly the most stable and functional drivers available today, period.

And yet they couldn't get D3D working at all on Wildcat2. Trust me, I've talked to them extensively about it. WC2 owners wanted compatibility, not speed... and 3DLabs simply couldn't give it because the core was designed specifically for OpenGL, and has no D3D capabilities at all.

3DLabs listened to the massive amounts of complaints though... WC3 can do DX7. :)

But anyway, it's the same thing as non-Voodoo series being incapable of running GLide. They simply don't support all of the functions needed to run it. And GLide compilers put out very low-level code...

Reznor007
April 24th, 2003, 22:34
Originally posted by Tagrineth
Creative bought them when Wildcat VP was already ready and IIRC Wildcat 4 was already out.

3DLabs are hardly bad programmers; they have quite possibly the most stable and functional drivers available today, period.

And yet they couldn't get D3D working at all on Wildcat2. Trust me, I've talked to them extensively about it. WC2 owners wanted compatibility, not speed... and 3DLabs simply couldn't give it because the core was designed specifically for OpenGL, and has no D3D capabilities at all.

3DLabs listened to the massive amounts of complaints though... WC3 can do DX7. :)

But anyway, it's the same thing as non-Voodoo series being incapable of running GLide. They simply don't support all of the functions needed to run it. And GLide compilers put out very low-level code...

If they couldn't get D3D working, then either their hardware was sub-par, or their driver writers were incompetent. I'm going to say it was the driver programmers, since if it was capable of running OpenGL it should support most things needed for DX6-level support. How complete was the OpenGL support? Also, since the WC is a pro card, not a gamer card, they probably didn't care to include D3D, as it is a gaming API, not a professional one. Once owners got them, they wanted to try their games, and since 3DLabs had already made their money off the customer, they didn't want to devote programming resources to adding that.

You don't design core features specifically for an API. You have your core support many 3D rasterization functions, then a driver to interpret API calls to your core's specific instructions.

PS, the Glide SDK does not send out low-level code. The SDK merely creates a series of Glide API commands that are sent to the glide2x/3x file and converted to machine code there. If you didn't do it there, it would require a recompile each time a new 3dfx card was released if you wanted to support it.

The Khan Artist
April 25th, 2003, 02:20
Tag, what do you think about Revenge? Is it all a load of BS?

And what about open-sourcing the 3Dfx D3D/OGL drivers?

Reznor007
April 25th, 2003, 07:21
Originally posted by The Khan Artist
Tag, what do you think about Revenge? Is it all a load of BS?

And what about open-sourcing the 3Dfx D3D/OGL drivers?

Load of crap. They talked so much about how they were writing a new ICD, yet the only person who even came close to updating OpenGL was Colourless, who never did any hyping/bragging. 3dhq was all a load of crap. The driver releases they did appeared to be nothing but slight modifications, and none of them worked as well as the official 3dfx Win2k drivers on XP with Colourless's GlideXP files. I should know; I've been running nothing but a Voodoo5 since June 2000... up to this day.

AlphaWolf
April 25th, 2003, 08:56
How do you know Glide would run faster on another brand's card anyway? Remember that it's the driver developers that write the API layer. They probably didn't write the OpenGL layer as well as they wrote the Glide layer on their cards. (I can think of numerous reasons why they would do this on purpose.) Just because Glide is faster for a 3dfx card doesn't mean it would be faster for, say, an nVidia, ATI, or Matrox card, because they may have already coded the OpenGL layer optimally.

Stezo2k
April 25th, 2003, 15:00
Well, eVoodoo & Glide64 seem to be a good combination on PCs, quite fast on slower PCs. Glide has great possibilities as an API. I mean, look at when UltraHLE came out, the Voodoo series ran that superbly, heh, it even made me buy a Voodoo 2. I mean, look at Doom III, the requirements are very high. Let's just imagine a Voodoo had 128MB of RAM and Doom III used Glide; the requirements would be a lot lower.

Reznor007
April 25th, 2003, 21:49
Originally posted by Stezo2k
Well, eVoodoo & Glide64 seem to be a good combination on PCs, quite fast on slower PCs. Glide has great possibilities as an API. I mean, look at when UltraHLE came out, the Voodoo series ran that superbly, heh, it even made me buy a Voodoo 2. I mean, look at Doom III, the requirements are very high. Let's just imagine a Voodoo had 128MB of RAM and Doom III used Glide; the requirements would be a lot lower.

Glide doesn't support the required functions. No dot product 3 blending, no stencil buffer (unless it was added as an undocumented V5 feature), and no vertex programs. I know this is just a hypothetical suggestion... but still :)

AlphaWolf
April 26th, 2003, 01:03
Originally posted by Stezo2k
Well, eVoodoo & Glide64 seem to be a good combination on PCs, quite fast on slower PCs. Glide has great possibilities as an API. I mean, look at when UltraHLE came out, the Voodoo series ran that superbly,

Compared to what? Another card with a Glide > OpenGL wrapper? Of course that's going to be slower. I've had Voodoo cards; I never thought Glide was anything special. I've always looked at it as a stripped-down proprietary counterpart to OpenGL.

Half-Life on a Voodoo3 running Glide wasn't as fast as a GeForce classic DDR running OpenGL.

Stezo2k
April 26th, 2003, 02:30
Originally posted by Reznor007
Glide doesn't support the required functions. No dot product 3 blending, no stencil buffer (unless it was added as an undocumented V5 feature), and no vertex programs. I know this is just a hypothetical suggestion... but still :)

Heh, I know that, it was just an example. Using Glide on a Voodoo can make it faster than OpenGL in some cases.

Stezo2k
April 26th, 2003, 02:37
Originally posted by AlphaWolf
Compared to what? Another card with a Glide > OpenGL wrapper? Of course that's going to be slower. I've had Voodoo cards; I never thought Glide was anything special. I've always looked at it as a stripped-down proprietary counterpart to OpenGL.

Half-Life on a Voodoo3 running Glide wasn't as fast as a GeForce classic DDR running OpenGL.

Well, when comparing my old GeForce 2 in FIFA 99, they both seem to be excellent. I don't know whether my old Voodoo2 was using Glide in it, but it seemed to be similar in performance, the same with Kingpin. I don't know much about APIs, but Glide seems to be a pretty good performer.

And about Half-Life, that worked pretty well on my Voodoo2 12MB, but games were advancing then. It was slower than my GF2, more than likely due to the DDR RAM on the GeForce series.

AlphaWolf
April 26th, 2003, 02:54
Originally posted by Stezo2k
Heh, I know that, it was just an example. Using Glide on a Voodoo can make it faster than OpenGL in some cases.

Right, but see, that's because they run the OpenGL layer through the Glide layer, hence making the OpenGL layer a bit slower. This doesn't happen on non-3dfx cards.

Stezo2k
April 26th, 2003, 06:49
Originally posted by AlphaWolf
Right, but see, that's because they run the OpenGL layer through the Glide layer, hence making the OpenGL layer a bit slower. This doesn't happen on non-3dfx cards.

ah i see, thanx for the info :) heh you know your stuff :phone:

Reznor007
April 26th, 2003, 07:41
Originally posted by AlphaWolf
Compared to what? Another card with a Glide > OpenGL wrapper? Of course that's going to be slower. I've had Voodoo cards; I never thought Glide was anything special. I've always looked at it as a stripped-down proprietary counterpart to OpenGL.

Half-Life on a Voodoo3 running Glide wasn't as fast as a GeForce classic DDR running OpenGL.

Half-Life doesn't support Glide, and never has. The 3dfx option was only to use a 3dfx MiniGL driver, which is a basic OpenGL driver that only supports the functions Half-Life requires. The 3dfx option in Quake2 (and all other Q2-based games) is also MiniGL, not Glide.

The Khan Artist
April 26th, 2003, 12:12
There's also WickedGL. Jedi Outcast runs at full speed at 640x480 2xFSAA on my V5. :)

AlphaWolf
April 26th, 2003, 13:43
Well, I don't know a whole lot about 3D stuff specifically, but I know that the OpenGL drivers for 3dfx cards are broken without the Glide libs, which indicates to me that it's not the same as any other card's OpenGL ICD.

Reznor007
April 26th, 2003, 16:04
All 3dfx card drivers for OpenGL (full ICD or MiniGL) do rely on Glide files to do some of the basic operations (triangle drawing, etc.), but the games I mentioned do not send out direct Glide API calls; it is standard OpenGL stuff.

milen
April 27th, 2003, 07:33
KoolSmoky's experimental OpenGL driver is standalone and doesn't require Glide. It's based on Mesa (OpenGL 1.2) and is not very compatible.

Tagrineth
April 29th, 2003, 09:19
Originally posted by Reznor007
no stencil buffer (unless it was added as an undocumented V5 feature)

First off, not only does VSA-100 naturally support a stencil buffer (not in GLide but yes in OGL/D3D, naturally, since GLide is technically still 16-bit only and thus CAN'T support stencil), it also supports a full 32-bit Z buffer - only other cores that can do that are Radeon R6 and R200, IIRC. :)

And 3dhq did a fantastic job on Direct3D 8 support :)

Reznor007
April 29th, 2003, 09:36
Originally posted by Tagrineth
First off, not only does VSA-100 naturally support a stencil buffer (not in GLide but yes in OGL/D3D, naturally, since GLide is technically still 16-bit only and thus CAN'T support stencil), it also supports a full 32-bit Z buffer - only other cores that can do that are Radeon R6 and R200, IIRC. :)

And 3dhq did a fantastic job on Direct3D 8 support :)

I know the VSA-100 supports stencil; however, Glide does not, so it's a moot point. And I think the newer PowerVR cores support 32-bit Z.

I tried every 3dhq driver set, and none of them worked better than the 3dfx 1.04 Win2k drivers. Using the 3dhq drivers I had to play Warcraft 3 in OpenGL because D3D was horribly broken (missing half of the polygons), while the official drivers work perfectly. I have still yet to see proof that 3dhq did ANYTHING worthwhile. The only non-official 3dfx drivers worth their file size are Colourless's GlideXP or KoolSmoky's Glide (now combined with Colourless's stuff).

Tagrineth
April 30th, 2003, 06:26
3dhq mainly did work on getting D3D 8 to work; yes, they broke some stuff in previous versions but try some games that use heavy DX8 (without pixel shaders), like Unreal2. If you install the drivers right you should get far better functionality than with 3dfx's last drivers.

AlphaWolf
April 30th, 2003, 06:57
Why waste all of that effort on updating drivers for a dead card?

Reznor007
April 30th, 2003, 09:59
Originally posted by Tagrineth
3dhq mainly did work on getting D3D 8 to work; yes, they broke some stuff in previous versions but try some games that use heavy DX8 (without pixel shaders), like Unreal2. If you install the drivers right you should get far better functionality than with 3dfx's last drivers.

I tried Unreal Tournament 2003, and got horridly slow performance, missing polygons, and random triangles popping up over the screen.

I haven't seen/heard anything from them that left me with any faith. The new ICD that never turned up. The fully functional DX8 drivers that never showed up. And a new video card that "sadly" never made it to production. It was all talk. Colourless and KoolSmoky are the only two to actually produce something worthwhile. Even the beta WinXP VoodooTV drivers are crap. BTWincap or DScaler provide much more functionality, and aren't even designed specifically for the VTV.

Tagrineth
April 30th, 2003, 10:38
Originally posted by AlphaWolf
Why waste all of that effort on updating drivers for a dead card?

Because some people still have V3-5's and don't have the MONEY for new cards; also talk to anyone from SimHQ, it took a LONG time for them to give up on their V5's.

Reznor007:

I said Unreal 2, not UT2003.

fivefeet8
April 30th, 2003, 11:17
Originally posted by Tagrineth
Because some people still have V3-5's and don't have the MONEY for new cards; also talk to anyone from SimHQ, it took a LONG time for them to give up on their V5's.



Yeah.. It's taking me quite a long time to give up mine.. ;) Glide rocks in emulators..

Reznor007
April 30th, 2003, 12:03
Originally posted by Tagrineth
Because some people still have V3-5's and don't have the MONEY for new cards; also talk to anyone from SimHQ, it took a LONG time for them to give up on their V5's.

Reznor007:

I said Unreal 2, not UT2003.

Essentially the same engine. I also don't have U2, as I heard it wasn't nearly as good as U1.

AlphaWolf
April 30th, 2003, 12:39
Well, I don't know much about V5 performance, but I am guessing that it's not as good as a GF4 Ti4200, which indexes at $122 for the 128MB version.

The V3/V4 are definitely no better than a GF4 MX440, which indexes at $53 for the 128MB version. That's about the price of your average video game... I am a bit reluctant about saying it, but it sounds to me like they have a religious passion for the name 3DFX or something.

I abandoned them years ago :P

EDIT: OK, I just looked it up; the little-sold Voodoo 5 6000, which was the best of the 3dfx pack ever made, barely matches a GF2 Ultra, which now costs peanuts. V3-5 users would definitely benefit from a GF3 Ti200, which indexes at $64 and has all of your DX8/9 features.

milen
April 30th, 2003, 12:49
AlphaWolf, in some countries $120 is a lot of money. I work a whole month for $200 as a programmer in Bulgaria. My priority is an LCD monitor, not a video card. I plan to keep my Voodoo3 for at least 6 months. At least I can enjoy Glide64 without a wrapper.
For N64 emulation I don't need a newer card.

ra5555
April 30th, 2003, 13:05
Originally posted by AlphaWolf
Well, I don't know much about V5 performance, but I am guessing that it's not as good as a GF4 Ti4200, which indexes at $122 for the 128MB version.

The V3/V4 are definitely no better than a GF4 MX440, which indexes at $53 for the 128MB version. That's about the price of your average video game... I am a bit reluctant about saying it, but it sounds to me like they have a religious passion for the name 3DFX or something.

I abandoned them years ago :P

EDIT: OK, I just looked it up; the little-sold Voodoo 5 6000, which was the best of the 3dfx pack ever made, barely matches a GF2 Ultra, which now costs peanuts. V3-5 users would definitely benefit from a GF3 Ti200, which indexes at $64 and has all of your DX8/9 features.

And there is the Voodoo 5 6000, one of which sold for $2000 on eBay!! Man, why do people like that card so much?

Here is one example http://cgi.ebay.com/ws/eBayISAPI.dll?ViewItem&item=3407094731&category=3762

AlphaWolf
April 30th, 2003, 13:46
Originally posted by ra5555
And there is the Voodoo 5 6000, one of which sold for $2000 on eBay!! Man, why do people like that card so much?

Here is one example http://cgi.ebay.com/ws/eBayISAPI.dll?ViewItem&item=3407094731&category=3762

Actually, that's not even the worst of it. The going rate for this pathetic marketing scheme of a card was somewhere around $696.

(The reason I say "pathetic marketing scheme" is that 3dfx managed to get their followers to believe that the card was "more powerful" because it required an external power source. How cheesy do you get... honestly?)

fivefeet8
April 30th, 2003, 15:22
Originally posted by AlphaWolf


EDIT: OK, I just looked it up; the little-sold Voodoo 5 6000, which was the best of the 3dfx pack ever made, barely matches a GF2 Ultra, which now costs peanuts. V3-5 users would definitely benefit from a GF3 Ti200, which indexes at $64 and has all of your DX8/9 features.

GeForce3 cards aren't DirectX 9 cards.. They are DirectX 7/8 and lack the features for DirectX 9.. The only DirectX 9 cards are GeForce FXs and Radeon 9x00s..

They may be DX9 compatible but not compliant.. The Voodoo 5 has superior FSAA quality compared to the GeForce3 and especially the GeForce2 in emulation.. Plus, running with Glide plugins, it's easily as fast and sometimes faster than OGL and D3D running on GeForce2's and 3's in PSX and N64 emulation..

The V5 6000 has higher memory bandwidth than the GeForce2 Ultra, but lacks certain features that the GeForce2 has.. But it has 8xFSAA, which would blow away the GeForce2's crappy FSAA quality...

AlphaWolf
April 30th, 2003, 20:22
Originally posted by fivefeet8
GeForce3 cards aren't DirectX 9 cards.. They are DirectX 7/8 and lack the features for DirectX 9.. The only DirectX 9 cards are GeForce FXs and Radeon 9x00s..


OK, sorry 8, but the V5 isn't either.



They may be DX9 compatible but not compliant.. The Voodoo 5 has superior FSAA quality compared to the GeForce3 and especially the GeForce2 in emulation.. Plus, running with Glide plugins, it's easily as fast and sometimes faster than OGL and D3D running on GeForce2's and 3's in PSX and N64 emulation..


I don't know anything about the V5's FSAA, but I do know that the card is slower than a GF3, and on a card that slow, using FSAA isn't exactly a good idea.

milen
April 30th, 2003, 21:10
The method used in 3dfx's FSAA is different from Nvidia's. Voodoo 5 6000 FSAA is much faster and better quality than GeForce2/3's.

I mean:
Voodoo5 6000 is slower than GeForce3 Ti200
Voodoo5 6000 with 4xFSAA is faster than GeForce3 Ti200 with 4xFSAA

AlphaWolf
April 30th, 2003, 21:42
That can't be right; the Quincunx mode barely takes any performance hit at all on older stuff.

Reznor007
April 30th, 2003, 22:33
Originally posted by AlphaWolf
That can't be right; the Quincunx mode barely takes any performance hit at all on older stuff.

On VSA-100, the FSAA performance is easy to predict. 2xFSAA gives you 1/2 speed. 4xFSAA gives you 1/4 speed.

The only exception is the Voodoo5 6000, which was 1/2 speed for 4x, and 1/4 for 8x. I'm not getting into the Quantum3D variants using up to 32 VSA-100 chips (and 2GB of video RAM), which could do 64x FSAA at 1/4 normal speed.

The only cards to really beat the VSA-100's FSAA are the Radeon R300 series (9700, 9500, etc.).

Tagrineth
April 30th, 2003, 22:38
Originally posted by Reznor007
The only exception is the Voodoo5 6000, which was 1/2 speed for 4x, and 1/4 for 8x. I'm not getting into the Quantum3D variants using up to 32 VSA-100 chips (and 2GB of video RAM), which could do 64x FSAA at 1/4 normal speed.

Wrong.

4x would be 1/2 speed compared to a 5500's non-AA. :)

fivefeet8
May 1st, 2003, 01:02
Originally posted by AlphaWolf

I don't know anything about the V5's FSAA, but I do know that the card is slower than a GF3, and on a card that slow, using FSAA isn't exactly a good idea.

Its memory bandwidth is lower than a GeForce3's, but in N64 and PSX emus, both the V5 and the GeForce3 have more than enough of it to use FSAA and get good speed..

Hell, I can run PSX and N64 games with Glide at 960x720 + 4xFSAA on my V5 5500 PCI and still get full speed.. And the FSAA quality is better than the GeForce3's..

FYI.. The Voodoo 5 5500 PCI has the same memory bandwidth as a GeForce2 Ultra..


Originally posted by AlphaWolf
That can't be right; the Quincunx mode barely takes any performance hit at all on older stuff.

Yeah.. Quincunx FSAA was designed for minimal performance loss, a different form of FSAA than the other modes like 2x and 4x on GeForces, and it looks like crap at anything 1024x768 and lower.. It only looks decent at higher resolutions..

Reznor007
May 1st, 2003, 01:47
Originally posted by Tagrineth
Wrong.

4x would be 1/2 speed compared to a 5500's non-AA. :)

First off, I wasn't comparing 6000 speeds to 5500.

And we were both wrong. 4x on the 6000 would be 1/4 normal speed, as each chip renders its own frame, thus having 4x as much to render.

Actual rates:
Voodoo4:
Normal=333Mpixel
2xFSAA=1/2 speed-166Mpixel

Voodoo5 5500:
Normal=666Mpixel
2xFSAA=1/2 speed-333Mpixel
4xFSAA=1/4 speed-166Mpixel

Voodoo5 6000:
Normal=1333Mpixel
2xFSAA=doesn't work
4xFSAA=1/4 speed-333Mpixel
8xFSAA=1/8 speed-166Mpixel

4x on the 6000 is the same as 2x on the 5500, not full speed 5500.
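Just to spell out the arithmetic: since VSA-100 supersampling costs one full render per sample, the effective fill rate comes out to roughly (number of chips x ~333 Mpixel/s per chip) / samples. Here's a throwaway C snippet of that rule of thumb; the helper name and the ~333 figure are just taken from the numbers above, and it deliberately ignores the 6000's 2x quirks discussed below.

#include <stdio.h>

/* Rough VSA-100 rule of thumb: each chip pushes ~333 Mpixel/s, and each
 * supersample costs a full render pass, so the effective rate is
 * chips * 333 / samples (samples = 1 means no FSAA). */
static double effective_fillrate(int chips, int fsaa_samples)
{
    const double per_chip = 333.0;  /* Mpixel/s, approximate */
    if (fsaa_samples < 1)
        fsaa_samples = 1;
    return chips * per_chip / fsaa_samples;
}

int main(void)
{
    printf("Voodoo4, 2x FSAA:      %.0f Mpixel/s\n", effective_fillrate(1, 2)); /* ~166 */
    printf("Voodoo5 5500, 4x FSAA: %.0f Mpixel/s\n", effective_fillrate(2, 4)); /* ~166 */
    printf("Voodoo5 6000, 8x FSAA: %.0f Mpixel/s\n", effective_fillrate(4, 8)); /* ~166 */
    return 0;
}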

Tagrineth
May 1st, 2003, 18:40
Originally posted by Reznor007
And we were both wrong. 4x on the 6000 would be 1/4 normal speed, as each chip renders its own frame, thus having 4x as much to render.

Which is exactly what I said :)


Actual rates:
Voodoo4:
Normal=333Mpixel
2xFSAA=1/2 speed-166Mpixel

Voodoo5 5500:
Normal=666Mpixel
2xFSAA=1/2 speed-333Mpixel
4xFSAA=1/4 speed-166Mpixel

Voodoo5 6000:
Normal=1333Mpixel
2xFSAA=doesn't work
4xFSAA=1/4 speed-333Mpixel
8xFSAA=1/8 speed-166Mpixel

Nice job stating the obvious.

And ask any 6k owner about 2x AA. :) Every one of them will tell you 2x works, even if 4x and 8x crash. :)


4x on the 6000 is the same as 2x on the 5500, not full speed 5500.

Re-read what I said:

4x would be 1/2 speed compared to a 5500's non-AA.

4x on a 6k would be half as fast as a 5500's non-AA.

Reznor007
May 2nd, 2003, 01:18
Originally posted by Tagrineth
Which is exactly what I said :)



Nice job stating the obvious.

And ask any 6k owner about 2x AA. :) Every one of them will tell you 2x works, even if 4x and 8x crash. :)



Re-read what I said:

4x would be 1/2 speed compared to a 5500's non-AA.

4x on a 6k would be half as fast as a 5500's non-AA.

That isn't exactly what you said, though. You only said the part about V5 6000 4x speed in relation to the V5 (which I wasn't comparing anyway). You didn't mention ANYTHING about how it works :)

Also, when you said that, I thought you were saying 4x on the 6k was the same as a full-speed V5 (half-speed V5 6k).

And from everything I've heard, 2x on the 6k didn't work due to either software bugs or flaws in the hardware (and let's face it... the 6k was a mess). I heard something about getting 2x to work; however, that was in single-chip mode, which would make for an expensive Voodoo4.

PS, that was only obvious if you knew about VSA-100 in the first place.

Tagrineth
May 2nd, 2003, 01:50
Originally posted by Reznor007
That isn't exactly what you said, though. You only said the part about V5 6000 4x speed in relation to the V5 (which I wasn't comparing anyway). You didn't mention ANYTHING about how it works :)

Nitpicker. :flowers:


And from everything I've heard, 2x on the 6k didn't work due to either software bugs or flaws in the hardware (and let's face it... the 6k was a mess). I heard something about getting 2x to work; however, that was in single-chip mode, which would make for an expensive Voodoo4.

A good number of 6k's can ONLY run 2x. The other modes result in a PCI bus desync and lockup within about 5 minutes, without Hank's mod.

I haven't heard anything whatsoever about any 6k's being incapable of 2x AA. And I've been a part of the x3dfx community since the beginning. And I was a part of 3dhq from its start to its end. :)

AlphaWolf
May 2nd, 2003, 01:51
Now you all see why I don't like to pay attention to the technical side of 3D hardware. It's all hearsay from sales asses and weirdos who measure their pride based on what video card they own. There isn't any accurate information anywhere.

Reznor007
May 2nd, 2003, 08:26
Originally posted by Tagrineth
Nitpicker. :flowers:



A good number of 6k's can ONLY run 2x. The other modes result in a PCI bus desync and lockup within about 5 minutes, without Hank's mod.

I haven't heard anything whatsoever about any 6k's being incapable of 2x AA. And I've been a part of the x3dfx community since the beginning. And I was a part of 3dhq from its start to its end. :)

Well, the 2x works, but I've only heard of it working in single-chip mode (otherwise it would be all 4 chips working together, which causes problems on most since they are all alpha and beta hardware).

I also follow the x3dfx stuff, since the day it opened, all the way up to the past day or two with the driver source released (I'm still wondering if that is completely true or what, though, and yes, I have the file).

And I still feel the way I said about 3dhq...all hype.

Tagrineth
May 2nd, 2003, 15:07
Originally posted by Reznor007
Well, the 2x works, but I've only heard of it working in single-chip mode (otherwise it would be all 4 chips working together, which causes problems on most since they are all alpha and beta hardware).

Talk to LogicalMadness or Colourless.

And all the performance figures I've seen show 2x running full speed. :flowers: I have no idea where you get the idea that 2x runs in single-chip only.

I suppose it might make sense if it ran on two chips only, considering that it doesn't **** over the PCI signal like 4x/8x on most 6k's... but that doesn't make sense because all the numbers I've seen have it faster than a 5500 when both are in 2x.

Reznor007
May 2nd, 2003, 15:17
Originally posted by Tagrineth
Talk to LogicalMadness or Colourless.

And all the performance figures I've seen show 2x running full speed. :flowers: I have no idea where you get the idea that 2x runs in single-chip only.

I suppose it might make sense if it ran on two chips only, considering that it doesn't **** over the PCI signal like 4x/8x on most 6k's... but that doesn't make sense because all the numbers I've seen have it faster than a 5500 when both are in 2x.

I suppose I could ask Colourless, but at this point it doesn't really matter.

But, logically, if 2x uses all 4 chips, it should suffer the same problems as 4x/8x, since all of the chips are functioning.

Flash
May 2nd, 2003, 21:55
I have a 6K and have no problems with any of the modes: 2x, 4x, 8x.

Tagrineth
May 3rd, 2003, 00:19
Originally posted by Flash
I have a 6K and have no problems with any of the modes: 2x, 4x, 8x.

Lucky! Most working non-final revision 6k's have severe problems with 4x/8x locking up within 5 minutes...

How's the performance in 2x compared to standard and 4x?

And, for the love of the Goddess, post some 8x screenshots!!

AlphaWolf
May 3rd, 2003, 15:45
Willie hears ya, and Willie don't care.