======================================
NOTE:
You don't need to replace your current 4.5.0 version
with this one if you never use 16-bit mode.
16-bit is usually no faster than 32-bit, though it is usually not slower either.
======================================
Correction: 16-bit is faster than 32-bit on my laptop.
======================================
Some users have complained about 16-bit support because they have a Voodoo3 or another video card that only supports 16-bit 3D rendering.
I worked on it and fixed it quickly. I thought it would take me a long time to fix, but in fact it did not.
I tested it on my own video card (a GeForce2 MX400) and it works fine in 16-bit, but that result does not mean it will work for you. (My GF2 can somehow even support 32-bit textures in 16-bit mode, so whether I use 16-bit or 32-bit, my plugin always works.)
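To give a rough idea of what 16-bit texture support involves, here is a minimal sketch of packing a 32-bit RGBA8888 texel down to 16-bit RGBA5551, the kind of conversion needed for cards that only accept 16-bit textures. This is only an illustration under an assumed layout (red in the top byte), not my actual routine:

    #include <cstdint>

    // Pack a 32-bit RGBA8888 texel into 16-bit RGBA5551 by keeping the
    // top bits of each channel. Assumes red is in the most significant byte.
    static inline uint16_t RGBA8888ToRGBA5551(uint32_t c)
    {
        uint32_t r = (c >> 24) & 0xFF;
        uint32_t g = (c >> 16) & 0xFF;
        uint32_t b = (c >>  8) & 0xFF;
        uint32_t a = c & 0xFF;
        return (uint16_t)(((r >> 3) << 11) |  // 5 bits of red
                          ((g >> 3) <<  6) |  // 5 bits of green
                          ((b >> 3) <<  1) |  // 5 bits of blue
                          (a >> 7));          // 1 bit of alpha
    }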
Anyway, try this fix on your own, and please let me know whether 16-bit works on your Voodoo or other low-end video cards. (Voodoo cards are not low end; they just don't support 32-bit in 3D.)
Both OpenGL and DirectX support 16-bit. For OpenGL, you need to manually select between 16-bit and 32-bit. For DirectX, you don't need to: the plugin detects your current video card setting and figures it out. This also means you can run OpenGL in either 16-bit or 32-bit mode with the video card set to either 16-bit or 32-bit, though I don't know the result of every combination.
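For reference, here is a minimal sketch of how these two paths can look on Win32. The function names are illustrative, not the plugin's actual code: the first part queries the desktop's current bit depth (what the DirectX path does automatically), and the second requests an OpenGL pixel format from an explicit 16/32 choice:

    #include <windows.h>

    // DirectX-style path: query the desktop's current color depth
    // (typically 16 or 32) instead of asking the user.
    int DetectDesktopBitDepth()
    {
        HDC screen = GetDC(NULL);
        int bits = GetDeviceCaps(screen, BITSPIXEL);
        ReleaseDC(NULL, screen);
        return bits;
    }

    // OpenGL path: the user picks 16 or 32 bits explicitly, and we ask
    // the driver for the closest matching pixel format.
    BOOL SetupOpenGLPixelFormat(HDC hdc, int colorBits /* 16 or 32 */)
    {
        PIXELFORMATDESCRIPTOR pfd = {0};
        pfd.nSize      = sizeof(pfd);
        pfd.nVersion   = 1;
        pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
        pfd.iPixelType = PFD_TYPE_RGBA;
        pfd.cColorBits = (BYTE)colorBits;  // the manual 16/32 selection
        pfd.cDepthBits = 16;               // z-buffer depth
        int format = ChoosePixelFormat(hdc, &pfd);
        if (format == 0)
            return FALSE;
        return SetPixelFormat(hdc, format, &pfd);
    }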
Rice