Rice's Plugin Source <discussion>

klasa

New member
I've just tried with OpenGL and I got the same error: it checks for the textures, then a black screen, and then Vista says PJ64 has stopped working...

EDIT: Nevermind, OpenGL doesn't work with the stable version either.
 

Doomulation

I wanted you to know that the preview version complains that d3d9_27.dll is missing. But when I downloaded the file it worked again; maybe you could include it or something? (When I went to put the file in the Windows dir, I only saw nos. 28, 29 and 30.)

Just for your reference: downloading and installing the latest version of DirectX installs those d3d9_xx.dll files.
 

mudlord

Banned
If you're using Vista, it could be due to the OpenGL implementation...

I tested out the stable version in DX and OGL on XP, and I noticed no real anomalies. I haven't tested the hi-res texture loading thoroughly, though...I mainly tested on Super Mario 64 and Zelda OoT....

I did some work on the texture loading issues last night, and I managed to find where it broke (I have a fair idea where..). I'll post a solution when it's stable enough, though, without glitching up the major games that people retexture...
 

klasa

New member
I found the problem! The plugin crashes once a hi-res texture is loaded onto the screen (try starting Zelda with only the logo files in the dir).

If you're not using high res textures, there seem to be no problems.
 

Cyberman

Moderator
mudlord,
Could you dump information to a file (this would be an advanced option, obviously) about what file name the game looks for when loading hi-rez textures, and WHERE it started looking as well? I've never been able to get hi-rez textures that I've made to work, so I thought I was insane or something. If the plugin gives feedback as to what it's looking for, that will help immensely in naming the textures.

On a side note, it might be good to have a clean and definitive bit of information about what gets dumped where in the texture dumping. Actually, what I would like most is for it to dump textures into neat separate directories. In a game, textures get loaded as needed for an area. On start-up, a bunch of textures get loaded and dumped to the screen. It would be nice to dump groups of textures that are loaded around the same time into the same directory. That makes it easier to figure out what they belong to.

I ramble.


Cyb
 
OP
Enzo Dragon

STFU, NAVI
The shaders you've implemented thus far are nice, mudlord. I'm quite impressed.

One thing I'm wondering: what do you mean by "make the frame buffers more 'pleasant'"?
 

mudlord

Banned
Could you dump information to a file (this would be an advanced option, obviously) about what file name the game looks for when loading hi-rez textures, and WHERE it started looking as well? I've never been able to get hi-rez textures that I've made to work, so I thought I was insane or something. If the plugin gives feedback as to what it's looking for, that will help immensely in naming the textures.

On a side note, it might be good to have a clean and definitive bit of information about what gets dumped where in the texture dumping. Actually, what I would like most is for it to dump textures into neat separate directories. In a game, textures get loaded as needed for an area. On start-up, a bunch of textures get loaded and dumped to the screen. It would be nice to dump groups of textures that are loaded around the same time into the same directory. That makes it easier to figure out what they belong to.

I ramble.

Logging like that sounds like a cool idea, and shouldn't be that hard. (Your request for showing OpenGL info was pretty damn easy to do, and that will be in the next stable build.)
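A rough sketch of what that logging could look like, assuming a hypothetical LogHiresLookup() helper and log file name; the plugin's real lookup code is of course structured differently:

    // Sketch only: append every hi-res texture lookup to a text file so
    // re-texturers can see exactly which file name was searched for, where,
    // and whether it was found. All names here are hypothetical.
    #include <stdio.h>

    void LogHiresLookup(const char* searchDir, const char* fileName, bool found)
    {
        FILE* log = fopen("hires_lookup.txt", "a");   // only when the advanced option is enabled
        if (!log)
            return;
        fprintf(log, "looking for '%s' in '%s' -- %s\n",
                fileName, searchDir, found ? "FOUND" : "not found");
        fclose(log);
    }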

Dumping into separate directories based on texture type.....that's a great idea too... Maybe that can also tie in with klasa's request, by having control over what exactly is dumped. I'll do a separate "hi-res texture" dialog box in the GUI for all the hi-res texture stuff; that way, all the hi-res options are isolated, making them easier to identify.
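As a sketch of the grouped-dump idea (all paths, names and the pause threshold below are made up): textures dumped close together in time go into the same numbered sub-directory, so everything loaded for one area lands in one place.

    // Sketch only: build a dump path that groups textures dumped within a few
    // seconds of each other into the same "batch" directory.
    #include <stdio.h>
    #include <time.h>

    static time_t g_lastDump = 0;
    static int    g_batch    = 0;

    void BuildDumpPath(char* outPath, unsigned outSize,
                       const char* romName, const char* texName)
    {
        time_t now = time(0);
        if (g_lastDump != 0 && now - g_lastDump > 5)   // a pause in dumping: assume a new area
            ++g_batch;
        g_lastDump = now;

        // e.g. "texture_dump/THE LEGEND OF ZELDA/batch_003/<texName>.png"
        // (creating the directory itself is left out for brevity)
        snprintf(outPath, outSize, "texture_dump/%s/batch_%03d/%s.png",
                 romName, g_batch, texName);
    }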

The shaders you've implemented thus far are nice, mudlord. I'm quite impressed.

One thing I'm wondering: what do you mean by "make the frame buffers more 'pleasant'"?

Actually, those shaders are read from external files....:D Thus, coders can make their own shaders, and they don't have to be "implemented" per se. All that was added is a mechanism and a shader C++ class to read the shader files and compile them on the GPU. Therefore, shaders can be used in conjunction with hi-res texture projects, like if you want a complete hi-res texture pack that processes the models and not just the textures....but I'm glad you like the toon/sketch shaders..
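For anyone curious, the external-shader mechanism boils down to something like the following cut-down sketch (GL 2.0-style calls, assuming an extension loader such as GLEW is already initialised; the plugin's actual shader class is more involved):

    // Sketch only: read a GLSL fragment shader from an external file and
    // compile it on the GPU; linking it into a program happens elsewhere.
    #include <fstream>
    #include <sstream>
    #include <string>
    #include <GL/glew.h>

    GLuint LoadFragmentShader(const char* path)
    {
        std::ifstream file(path);
        std::stringstream buf;
        buf << file.rdbuf();                      // slurp the whole file
        std::string code = buf.str();
        const char* text = code.c_str();

        GLuint shader = glCreateShader(GL_FRAGMENT_SHADER);
        glShaderSource(shader, 1, &text, 0);      // hand the source to the driver
        glCompileShader(shader);

        GLint ok = GL_FALSE;
        glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
        if (!ok)
        {
            glDeleteShader(shader);
            return 0;                             // caller falls back to fixed function
        }
        return shader;
    }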

What I was saying about the framebuffers being more pleasant is that, atm, they are slow. I know there are methods for accelerating them (Jabo showed some of them off in his demonstration of MK64's billboard being emulated at full speed); it's just a case of them being implemented, since most plugins these days support HWFBE (Jabo's, Orkin's and Gonetz's plugins support it).
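Roughly speaking, HWFBE means rendering those auxiliary scenes straight into a texture on the card instead of copying pixels back through system RAM. A bare-bones sketch with the EXT_framebuffer_object calls of that era (state save/restore and error handling omitted, GLEW assumed):

    // Sketch only: create an offscreen render target so frame buffer effects
    // (e.g. MK64's TV billboard) can stay on the GPU at full speed.
    #include <GL/glew.h>

    GLuint MakeRenderTarget(int w, int h, GLuint* outTex)
    {
        glGenTextures(1, outTex);
        glBindTexture(GL_TEXTURE_2D, *outTex);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, 0);          // empty texture to render into

        GLuint fbo = 0;
        glGenFramebuffersEXT(1, &fbo);
        glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
        glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                                  GL_TEXTURE_2D, *outTex, 0);

        if (glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT) != GL_FRAMEBUFFER_COMPLETE_EXT)
            return 0;   // caller falls back to the slow read-back path
        return fbo;     // draw the sub-scene with this bound, then sample *outTex like any texture
    }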
 
OP
Enzo Dragon

STFU, NAVI
Just got around to checking out the newest build, and the "set aspect ratio" option appears to do nothing. I also noticed that I did not have the choice to set my render engine.
 

mudlord

Banned
Just got around to checking out the newest build, and the "set aspect ratio" option appears to do nothing. I also noticed that I did not have the choice to set my render engine.

Huh? Now this is weird......you not having a choice to set render engine is even weirder.....I find both issues incredibly perplexing......:unsure:

EDIT: Tested switching render engines; it works fine for me. I'm currently in the middle of doing some major testing with texture packs and implemented features; the only trouble is finding working links to the nice texture packs from the past.....:(
 
...
1) Is there any real reason for the texture size limit to be increased (if it's technically feasible to do in the first place, without breaking texture loading in the process)?
...

Textures stretched over huge areas like large grass patches, rocks or other vast expanses are very blurry compared to the other textures you've replaced, even though you scale them up 4x the size. If you could throw it up to, say, 8x, 16x or more (depending on how stretched it is), I think that'd make a huge difference in keeping the overall sharpness at the same level.

P.S.: I didn't read every post, I just skimmed through the thread, but I don't think I saw anyone replying to this.
 

mudlord

Banned
Textures stretched over huge areas like large grass patches, rocks or other vast expanses are very blurry compared to the other textures you've replaced, even though you scale them up 4x the size. If you could throw it up to, say, 8x, 16x or more (depending on how stretched it is), I think that'd make a huge difference in keeping the overall sharpness at the same level.

Oh alright, but then, what about VRAM usage? Pushing this to around 16x or 32x could really hurt video RAM usage unless you have a 256MB card or higher...Sure, they'll look nice, but then users might complain about added requirements.

Maybe this can be handled on a per-case basis: the re-texturer can note when a pack uses 8x or higher textures, and let end-users know of the possible consequences.
 
Yeah, I guess if you make an already big texture 16x the size it might be a problem, but if you ask me there isn't really any reason to cap it; the more freedom the better, and it's up to the texturer to make it work properly.
For example, you could offer an extra zip that replaces the "super" high-res textures with lower-res ones for people with less VRAM.
 

Cyberman

Moderator
Apart from load times getting ridiculously long? :D

That will horribly affect performance, is what I would like to point out.

mudlord, I suggest rescaling the textures based on available VRAM. I.e., say someone tries to load a 16x 64x64 texture, which comes out to 4 megabytes of VRAM, by the way. It first checks if there is enough available VRAM, then loads it. This is a HUGE texture, by the way: 1024x1024. If it is used a LOT (i.e. big performance penalty) the plugin could REDUCE it (in spite of its original massive size) until performance isn't so bad (this requires some regular checking of the impact of textures on performance). I would suggest loading the original textures and running one frame with them, then loading the hi-rez one and checking the difference in frame rendering time. If it's under the chosen frame rate, everything is fine. If there's a huge difference, start scaling the hi-rez texture down till it has a sane rendering rate. If it ends up down at the original texture size, then just use the original texture.

I also suggest having a MAX texture scale setting in the advanced options,
i.e. 2x, 4x, 8x, 16x, 32x, 64x. You can grey out the really big ones if there isn't enough VRAM (i.e. check VRAM).
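A rough sketch of the frame-timing check described above; LoadPackAtScale() and RenderOneFrame() are hypothetical stand-ins for whatever the plugin would actually use to reload the pack and time a frame:

    // Sketch only: time one frame with the original textures, then step the
    // hi-rez scale down until the frame time is back within the target.

    // Hypothetical hooks into the plugin (not real functions):
    void   LoadPackAtScale(int scale);   // reload the texture pack; scale 1 = original textures
    double RenderOneFrame();             // render one frame, return its time in milliseconds

    double FrameTimeAtScale(int scale)
    {
        LoadPackAtScale(scale);
        return RenderOneFrame();
    }

    int ChooseTextureScale(int maxScale, double targetMs)
    {
        double baseline = FrameTimeAtScale(1);          // originals first
        for (int scale = maxScale; scale > 1; scale /= 2)
        {
            double t = FrameTimeAtScale(scale);
            if (t <= targetMs || t - baseline < 1.0)    // fast enough, or barely slower
                return scale;
        }
        return 1;   // scaled all the way back down: just use the original texture
    }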

Cyb
 

mudlord

Banned
Apart from load times getting ridiculously long?

But as we know, people want better graphics :evil: ...we'd better give them what they want. :matrix:

That will horribly affect performance, is what I would like to point out.

Yep, Rice held a discussion a while back on the effects of high scaling levels on RAM usage (I still have a copy of the forum's threads)...and the consensus there was that 4x scaling levels were the most reasonable back then.


mudlord, I suggest rescaling the textures based on available VRAM. I.e., say someone tries to load a 16x 64x64 texture, which comes out to 4 megabytes of VRAM, by the way. It first checks if there is enough available VRAM, then loads it. This is a HUGE texture, by the way: 1024x1024. If it is used a LOT (i.e. big performance penalty) the plugin could REDUCE it (in spite of its original massive size) until performance isn't so bad (this requires some regular checking of the impact of textures on performance). I would suggest loading the original textures and running one frame with them, then loading the hi-rez one and checking the difference in frame rendering time. If it's under the chosen frame rate, everything is fine. If there's a huge difference, start scaling the hi-rez texture down till it has a sane rendering rate. If it ends up down at the original texture size, then just use the original texture.

I also suggest having a MAX texture scale setting in the advanced options,
i.e. 2x, 4x, 8x, 16x, 32x, 64x. You can grey out the really big ones if there isn't enough VRAM (i.e. check VRAM).

Hm, so as part of the checks, we check the amount of memory. AFAIK, there is no simple way to return the amount of VRAM in either the DX or OGL APIs (Glide seems to have a function for this, though...). And based on the video RAM values, we load the textures...and do checks so that the card renders them at a decent FPS, one that is chosen by the end-user or retexturer.

For those options, they can go into the new hi-res texture option box I'm pondering. Sorry about the lack of binary releases atm (university is one of my main priorities atm), though...I must seriously get my ass more in gear :p....and knuckle down on this ^^...
 

Cyberman

Moderator
Moderator
Well mudlord, releasing something that isn't well thought out will just waste your time, so think things over first.

The only way I know of guessing the memory available is knowing the card's VRAM and estimating from the various things you have done with it (i.e. current screen resolution, the number of buffers you are using, the textures and their sizes, etc.). This can be done but requires a little work on texture management. However, it is true that over a 4x texture size, things aren't too good.

It might be interesting to use non-power-of-two textures. This might be possible by splitting the single texture into some integer multiple of textures (i.e. 3x of a 64x64 texture becomes 4 128x64 and 1 64x64 texture). Just one of Cyb's random crazy thoughts ;)
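One way to sketch that splitting idea: break each non-power-of-two dimension into power-of-two runs and emit one tile per pair of runs. (The tile shapes come out slightly differently from the example above, but the total area covered is the same.)

    // Sketch only: decompose an NPOT texture (e.g. a 3x-scaled 64x64 texture,
    // i.e. 192x192) into power-of-two tiles for cards without NPOT support.
    #include <vector>

    struct Tile { int x, y, w, h; };

    // Split one dimension into power-of-two runs, largest first (192 -> 128, 64).
    static std::vector<int> PotRuns(int size)
    {
        std::vector<int> runs;
        while (size > 0)
        {
            int p = 1;
            while (p * 2 <= size)
                p *= 2;
            runs.push_back(p);
            size -= p;
        }
        return runs;
    }

    std::vector<Tile> SplitIntoPotTiles(int width, int height)
    {
        std::vector<int> ws = PotRuns(width);
        std::vector<int> hs = PotRuns(height);
        std::vector<Tile> tiles;
        int y = 0;
        for (unsigned j = 0; j < hs.size(); ++j)
        {
            int x = 0;
            for (unsigned i = 0; i < ws.size(); ++i)
            {
                Tile t = { x, y, ws[i], hs[j] };
                tiles.push_back(t);
                x += ws[i];
            }
            y += hs[j];
        }
        return tiles;   // 192x192 -> 128x128, 64x128, 128x64, 64x64
    }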

Also, do your studies; this is ONLY a hobby, no matter how fun.

Cyb
 

mudlord

Banned
Well mudlord, releasing something that isn't well thought out will just waste your time, so think things over first.

Very true...:). Thanks for the advice :).

The only way I know of guessing the memory available is knowing the card's VRAM and estimating from the various things you have done with it (i.e. current screen resolution, the number of buffers you are using, the textures and their sizes, etc.). This can be done but requires a little work on texture management. However, it is true that over a 4x texture size, things aren't too good.

OK, in DX9 it is possible to return the approximate amount of available texture memory via IDirect3DDevice9::GetAvailableTextureMem (there might be an equivalent function in DX8), and in OGL it is possible to use glAreTexturesResident() in a way to gauge texture memory usage, so my understanding was incorrect. But you're correct, it's quite hard to work out running VRAM usage.
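For the DX9 side, a minimal sketch of that check; note the returned value is only an estimate rounded to the nearest MB and includes AGP/shared memory, so it's a guide rather than a true VRAM figure:

    // Sketch only: ask Direct3D 9 roughly how much texture memory is free
    // before committing to an 8x/16x texture pack.
    #include <d3d9.h>

    bool EnoughRoomForPack(IDirect3DDevice9* device, UINT packBytes)
    {
        UINT freeBytes = device->GetAvailableTextureMem();   // estimate, nearest MB
        return freeBytes >= packBytes;
    }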

It might be interesting to use non-power-of-two textures. This might be possible by splitting the single texture into some integer multiple of textures (i.e. 3x of a 64x64 texture becomes 4 128x64 and 1 64x64 texture). Just one of Cyb's random crazy thoughts

Also, do your studies; this is ONLY a hobby, no matter how fun.

Hmm, then cards with NPOT support would be a requirement. Or maybe an optional feature?
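Making it optional could be as simple as checking the extension string at start-up and falling back to the splitting approach when it isn't there, something like:

    // Sketch only: detect whether the card advertises NPOT texture support,
    // so the power-of-two splitting path is only used as a fallback.
    #include <string.h>
    #include <GL/glew.h>

    bool HasNpotTextures()
    {
        const char* ext = (const char*)glGetString(GL_EXTENSIONS);
        return ext != 0 && strstr(ext, "GL_ARB_texture_non_power_of_two") != 0;
    }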
 
