Direct64 Question

Hal

Bartman
Just a quick question; anyone know why, under bit depth in the Direct64 options, you can only choose 16- or 24-bit? What I mean is, why doesn't it use 32-bit depth instead of 24-bit? From what I know about graphics, 32-bit color is not only higher quality, it's also supposed to be much faster than 24-bit.
 

CF2

Pretends to make sense
No no no, the higher the bit depth, the slower the performance. I'm not sure why Direct64 doesn't use 32-bit color, but I'm guessing the N64 itself only goes up to 24-bit, so allowing 32-bit would be useless.
 

Clements

Active member
Moderator
I *thought* it had to do with the precision of pixel shaders: ATi supports FP24, while NVIDIA uses a combination of FP16/FP32. 32-bit would therefore probably only work on NV30-and-up NVIDIA cards, and on FX cards it would be very slow. I'm probably completely wrong though. :p
 

Reznor007

New member
Technically, 24-bit and 32-bit color carry the same amount of color information; 32-bit just adds 8 bits of alpha. The option could also be referring to the Z-buffer, as current cards only support a 16-bit or 24-bit Z-buffer, not 32-bit Z.
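If you want to see what your own card exposes, a quick check against plain D3D9 (just a sketch I typed up, not anything from Direct64 itself) will list which Z formats the driver accepts:

Code:
// Ask the D3D9 HAL which depth-buffer formats it supports with a
// 32-bit display mode. On current cards D16 and D24X8 typically
// succeed while D32 fails. Link against d3d9.lib.
#include <d3d9.h>
#include <stdio.h>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    const D3DFORMAT depths[] = { D3DFMT_D16, D3DFMT_D24X8, D3DFMT_D32 };
    const char*     names[]  = { "16-bit Z", "24-bit Z", "32-bit Z" };

    for (int i = 0; i < 3; ++i)
    {
        HRESULT hr = d3d->CheckDeviceFormat(
            D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
            D3DFMT_X8R8G8B8,        // display-mode format
            D3DUSAGE_DEPTHSTENCIL,  // usage: depth/stencil surface
            D3DRTYPE_SURFACE,
            depths[i]);
        printf("%s: %s\n", names[i],
               SUCCEEDED(hr) ? "supported" : "not supported");
    }

    d3d->Release();
    return 0;
}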

With shaders you don't actually specify 24/32-bit; both count as full precision under Pixel Shader 2.0 (PS3.0 requires FP32, though). In 2.0 you can only choose between full precision (24/32-bit) and partial precision (16-bit or lower). Since ATI only supports 24-bit, it doesn't make any difference what you select; all calculations are done at 24-bit.
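And if you're curious how partial precision gets requested in practice, it looks roughly like this with D3DX (a sketch assuming the d3dx9 headers; the passthrough shader is just made up for illustration):

Code:
// Compile a trivial ps_2_0 shader with every instruction forced to
// partial precision (_pp). NVIDIA FX parts then run the math at FP16;
// ATI R3xx ignores the hint and runs at FP24 either way.
#include <d3dx9.h>

const char src[] =
    "sampler2D t0 : register(s0);\n"
    "float4 main(float2 uv : TEXCOORD0) : COLOR0\n"
    "{ return tex2D(t0, uv); }\n";

void CompilePartialPrecision()
{
    ID3DXBuffer *code = NULL, *errors = NULL;
    D3DXCompileShader(
        src, sizeof(src) - 1,
        NULL, NULL,                  // no #defines, no include handler
        "main", "ps_2_0",
        D3DXSHADER_PARTIALPRECISION, // emit _pp on every instruction
        &code, &errors, NULL);
    if (code)   code->Release();
    if (errors) errors->Release();
}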
 

BFeely

New member
I hope a 32-bit mode gets added, since I am currently stuck using 16-bit video mode for fullscreen; my NVIDIA GeForce FX 5200 only supports 16- and 32-bit display modes under Windows.
 

dragon_rider

I like dragons
For some reason, it lets me use 24-bit color in fullscreen. Is this normal for a GeForce FX 5700 LE? Not that I'm complaining, mind you; I just find it rather odd.
 

arnalion

Nintendo Fan
What depth are you using on the desktop? Don't NVIDIA cards set the Z-buffer to the same depth as the desktop?
 

Orkin

d1R3c764 & g1|\|64 m4|<3R
Direct64 calls the D3D format D3DFMT_X8R8G8B8 "24-bit", while most programs call it 32-bit. The actual format is 32 bits per pixel; I just called it 24-bit since only 24 of those bits carry color information (the X channel is just padding). This made sense to me since I also called D3DFMT_X1R5G5B5 15-bit, D3DFMT_R5G6B5 16-bit, and D3DFMT_X2R10G10B10 30-bit.
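The logic behind the labels is basically just this (a simplified sketch of the idea, not the plugin's actual code):

Code:
// Count only the bits that carry color; X (padding) and A (alpha)
// channels don't count toward the label.
#include <d3d9.h>

static int ColorBits(D3DFORMAT fmt)
{
    switch (fmt)
    {
    case D3DFMT_X1R5G5B5:    return 15; // 5+5+5,    1 padding bit
    case D3DFMT_R5G6B5:      return 16; // 5+6+5,    no padding
    case D3DFMT_X8R8G8B8:    return 24; // 8+8+8,    8 padding bits
    case D3DFMT_A8R8G8B8:    return 24; // 8+8+8,    plus 8 alpha bits
    case D3DFMT_X2R10G10B10: return 30; // 10+10+10, 2 padding bits
    default:                 return 0;
    }
}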

I'll change it for the next version since it's causing confusion.
 

dragon_rider

I like dragons
It's not that using 24-bit color is a bad thing; I actually find it rather advantageous. It also seems to run faster than 16-bit color mode (though I've yet to figure out why). But Orkin, if you'd like to add a 32-bit color mode, that's fine by me. Your plugin is awesome!!!
 
