Aristotle's Mudlord & Rice Video

Status
Not open for further replies.

gitech

N64 Artist
Use tinypic.com to host your screenshots and paste the "IMG" code in your post to display screenshots here. ;)

How's it going, Aristotle? Did any of the advice I gave, such as the library from koolsmoky (Glide64 HD), help at all?

Take care,
Jay
 

Gonetz

Plugin Developer (GlideN64)
There was a bug with texture CRC calculation in the original RiceVideo sources. Looking at the change log, I think it is still there. The problem is that the CRC of 4-bit CI textures depends on the state of the "Full TMEM emulation" option. I'll point out the bug in the original sources. First, RDP_texture.h:
TxtrCacheEntry* LoadTexture(uint32 tileno)
{
:
if( !options.bUseFullTMEM && tile.dwSize == TXT_SIZE_4b )
gti.PalAddress += 16 * 2 * tile.dwPalette;
:
}

Here, if "Full TMEM emulation" is off and the texture size is 4-bit, the correct palette offset is added to gti.PalAddress.

Then, TextureManager.cpp

TxtrCacheEntry * CTextureManager::GetTexture(TxtrInfo * pgti, bool fromTMEM, bool doCRCCheck, bool AutoExtendTexture)
{
:
if ( doCRCCheck && (pgti->Format == TXT_FMT_CI || (pgti->Format == TXT_FMT_RGBA && pgti->Size <= TXT_SIZE_8b )))
{
:
if( pgti->Size == TXT_SIZE_8b )
{
dwPalSize = 256;
dwOffset = 0;
}
else
{
dwOffset = pgti->Palette << 4;
}

pStart = (uint8*)pgti->PalAddress+dwOffset*2;
:
dwPalCRC = CalculateRDRAMCRC(pStart, 0, 0, maxCI+1, 1, TXT_SIZE_16b, dwPalSize*2);
:
}

Here, Rice sets the palette offset regardless of "Full TMEM emulation". Thus, if "Full TMEM emulation" is off, the offset is doubled! Normally Rice should have gotten an "out of bounds" error here, a crash or something like that. But he used a trick: he doubled the size of the palette array!
As a result, the palette CRC depends on "Full TMEM emulation". Thus, if you dumped textures with "Full TMEM emulation" off, your replacements for 4-bit textures will not be loaded when "Full TMEM emulation" is on, unless tile.dwPalette is zero.

I think this bug must be fixed. However, all texture packs which were dumped with "Full TMEM emulation" off will not work after the fix.
 
Last edited:

death--droid

Active member
Moderator
w00t, Gonetz is here.
Also, I can confirm that the bug is still there.
Is anyone able to help with my crashing problems?

EDIT:
Can you send me the source code for Rice Video, please?
I can't get it to compile properly, but I can get yours to.
 
Last edited:
OP

Aristotle900

New member
Gonetz wrote:
> There was a bug with texture CRC calculation in the original RiceVideo sources. [...] I think this bug must be fixed. However, all texture packs which were dumped with "Full TMEM emulation" off will not work after the fix.

Okay, then I'm removing the "Full TMEM emulation" option and making it permanently on. Also, this means that everyone who is making, or has made, texture packs will have to redump them once these CRC calculation errors are fixed. You should redump them anyway, with either plugin, with that option turned on.


death--droid wrote:
> w00t, Gonetz is here.
> Also, I can confirm that the bug is still there.
> Is anyone able to help with my crashing problems?
>
> EDIT:
> Can you send me the source code for Rice Video, please?
> I can't get it to compile properly, but I can get yours to.

I'm sorry to hear it won't quit crashing, but it seems this plugin (or this version) simply hates running on Project64 1.6. Use 1964.

Look to the first post in the topic for the latest source code. It's bundled with the binary release, and it does compile.


gitech wrote:
> Use tinypic.com to host your screenshots and paste the "IMG" code in your post to display screenshots here.
>
> How's it going, Aristotle? Did any of the advice I gave, such as the library from koolsmoky (Glide64 HD), help at all?
>
> Take care,
> Jay

I don't think I really read over it. o_O I must have forgotten; I'll read up on that post. Right now it troubles me how many people keep getting these crashing problems, mostly with Project64 and the runtime libraries. So I'm tempted to make a Windows installer that properly installs the runtime libraries for everyone, and puts the source in the plugin directory as well. Something like that, so more people can actually use the plugin. The major issues are the crashing, and now the CRC and texture dumping.
 
Last edited:

death--droid

Active member
Moderator
OK, I'll try out 1964.
The thing is, though, Mudlord's last version worked perfectly with Project64 for me.

EDIT:

It works perfectly with 1964 so far.
1964 seems better than Project64 at the moment.
 
Last edited:

gitech

N64 Artist
Cool, I am glad Gonetz chimed in! :)

I understand. Priorities are priorities!

Because of the time investment involved, and because I want to make sure it's the last time I have to do it, I will start re-re-redoing ( ;) ) CLOD when texture caching and all related issues, such as the 4-bit and slowdown bugs, are ironed out. But I will do it! ;) :D

The info from KoolSmoky is on page 17.

Thanks guys,
Jay
 

Rice

Emulator Developer
Just want to check in here. Gonetz, it is very nice to see you again.

I have not touched the code for more than 2 years. I may be able to answer some questions off the top of my head, but will probably not be able to help too much.

Having said that, I have to admit:

1) The CRC calculation is a mess. It started from the original Daedalus code, and I made changes in multiple places without really thinking carefully about whether there was a better way to do it. It seems too late to change now, because the texture CRC values are used in the texture filenames. If a new CRC calculation is used, then all texture filenames need to be updated (the coding itself is really not that difficult).
2) The TMEM/full-TMEM thing is also a mess, again inherited from very old code. The original Daedalus code completely skips TMEM emulation and loads textures directly from RDRAM. It is very fast that way, but causes texture problems in many games. I added the full-TMEM feature later to fix the texture problems in those games. Full-TMEM can be turned on/off, which is a good feature for most games, but is a nightmare for texture replacement projects.
3) Supporting both DirectX and OpenGL is very nice. However, to do so I made the code very complicated and difficult to follow. The whole project uses C++ classes and virtual functions in order to support multiple render engines (OpenGL/DirectX), multiple graphics cards, and color combiners. To understand why, you have to remember that the code was written for video cards of many generations ago. Every video card supports pixel shaders now, but that was not the case 4 or 5 years ago. Of course, a lot of the old code is not useful anymore.
4) Another important goal was to run fast. This goal may also have made the code difficult to follow, and it may not be as important as it was years ago.
5) Hardware framebuffer features are not very well supported, because such features often depended on the video card.

I certainly do not have any time to rewrite or clean up the code. There is indeed a lot of work to be done, for example deleting either the OpenGL or the DirectX code, or deleting all the code that supports low-end graphics cards. I could offer many worthwhile suggestions if someone would like to take this on.

I will be around here. Please feel free to ask questions.
 

death--droid

Active member
Moderator
Yay, Rice is here too.
I might try removing OpenGL from the code.

EDIT:

Removed it; I just need to make sure the plugin still works XD.
I think I might have removed a bit too much.

EDIT2:
GAAAAH, I did. Now I have to start over XD.

I'll upload it as soon as OpenGL is removed successfully.
 
Last edited:

gitech

N64 Artist
Why?

I prefer OGL.

Tell me why please.

Hello Rice! I knew you would come. Good to see you and thanks! :D

Jay
 
Last edited:

death--droid

Active member
Moderator
Lots of the OGL side of the plugin is incomplete, from what I have seen so far.
Plus, DirectX is more powerful, but Windows-exclusive.

EDIT:

Every time I remove OGL, DirectX stops rendering 3D properly.
 
Last edited:

h4tred

Guest
Rice wrote:
> Just want to check in here.

Long time no see, Rice. I've been wanting to chat with you for ages; I'm glad you're here regardless. :)

Rice wrote:
> I may be able to answer some questions off the top of my head, but will probably not be able to help too much.

Don't worry, I'll be able to assist, though my memory is also a bit rusty.

Rice wrote:
> Supporting both DirectX and OpenGL is very nice. However, to do so I made the code very complicated and difficult to follow. [...]

Personally, Rice, since Shader Model 2.0-4.0 compatible cards are now prevalent, I don't see the harm in completely rewriting the whole plugin to use an exclusively pixel-shader-based rendering pipeline for blenders and combiners. I reckon the shader pipeline does not need separate ATI and NVIDIA render paths; instead, it only needs special cases where required. For instance, in the case of Glide64's wrapper, we use some ATI-specific hacks only to make up for failings in ATI's OpenGL ICD (e.g. specific texture formats).

HOWEVER, that leaves the question: which API?

Rice wrote:
> Another important goal was to run fast. This goal may have also made the code difficult to follow. This goal may not be as important as it was years ago.

I have to agree. In this day and age, I bet most emulation enthusiasts have dual-core or even quad-core systems by now, so speed isn't an issue. Plus, GeForce FX cards are more than enough for N64 emulation with shaders; it's only in PC gaming that they are horrendously slow. So I say a clean code base should now be more important than speed, since we are dealing with HLE here.

Rice wrote:
> Hardware framebuffer features are not very well supported because such features were often depended on the video card.

But now that's not the case. All recent cards support render-to-texture as well as shaders. Antialiasing on hardware render buffers is also possible on the most recent cards. So we don't have to worry about that anymore.

Rice wrote:
> I certainly do not have any time to rewrite or clean up the code. [...] I could give many worthy suggestions if someone would like to do so.

Personally, I say we remove all the code that supports old cards and replace it with a shader-based rendering pipeline. From there, there will be no real need for explicit NVIDIA/ATI render paths.

As for the API choice, that comes down to personal preference. Direct3D 9 and OpenGL 1.5 are nearly identical in feature set, so there is no real difference. That's something we devs have to think hard about, as there is no point in rash decisions.

I also see some things missing from the current source, like support for my ported filters from GlideHQ (HQxxS/LQxxS). I guess I could help with that.

But honestly, a complete rewrite of the plugin to fit modern cards appeals to me, no offense.

--mud.
 

death--droid

Active member
Moderator
Whoa, is that you, Mudlord?
Just wondering because of the end of your post.
I wish I could help with a complete rewrite.
 
OP

Aristotle900

New member
Well, my laptop is a 2002 1.13 GHz P3 Mobile with Intel Extreme Graphics: not OpenGL-friendly, no pixel shader support, software-only T&L... Okay, so I have a low-end video card. XD I should figure out a way to overclock the video card chip..

Anyway, a complete rewrite? Sounds great; however, I'll have to abandon it if it's redone only for new PCs and video cards.
 

death--droid

Active member
Moderator
Aristotle900, are you using DirectX or OpenGL?
If you're using DirectX, can you remove all the OpenGL stuff and send it to me, please?

I delete all the OpenGL stuff, and then DirectX stops rendering 3D XD.
 

h4tred

Guest
Aristotle900 wrote:
> XD I should figure out a way to overclock the video card chip..

You can't. The only way is to upgrade.

Aristotle900 wrote:
> I'll have to abandon it if it's redone only for new PCs and video cards.

Not really; some minor portions could be kept, but the combiner/blender emulation, as well as other major things, would need to be completely redone. It would also be nice to rewrite it in response to new research in N64 emulation (e.g. the skies in GE/PD and the framebuffer effects in CBFD).

--mud.
 
OP

Aristotle900

New member
My video card supports DirectX 9.0, but not as well as 8.0. And no, it really is possible to overclock it; I would just have to be very careful not to mess anything up, since that's easy to do on laptop main-boards. Then there would be the heat issues..

And I use DirectX, because my video card doesn't like OpenGL at all. The only thing that works nicely with it is the DecentX program (a 3D space-shooter game).

Remove all the OpenGL emulation? That would take a little doing, but uh.. what is the need for that? o_O You know, you can select which rendering system to use in the options.
 

MasterPhW

Master of the Emulation Flame
death--droid wrote:
> Aristotle900 are you using directx or opengl?
> If you're using directX can you remove all the stuff for opengl please and send it to me.
>
> I delete all the stuff and then directX stops rendering 3d XD.

Please don't let Aristotle do all the work for you. If you want something done, do it yourself; it will help you understand the source, and then you can change things yourself. Aristotle and the other devs still have enough to do to get the plugin back to its former glory, and if they always have to make changes just for you, it will cost time and work.
 

death--droid

Active member
Moderator
The thing is, I've already tried, and every time I've removed all of OpenGL from the plugin, DirectX has stopped rendering anything 3D.
So at least I tried doing something to help.
But never mind about that.

EDIT:
I made an edit.

Open up TextureFilters.cpp

and replace the entire void function FindAllDumpedTextures
with:

Code:
void FindAllDumpedTextures(void)
{
	char	foldername[256];
	GetPluginDir(foldername);
	if(foldername[strlen(foldername) - 1] != '\\') strcat(foldername, "\\");
	strcat(foldername,"texture_dump\\");
	CheckAndCreateFolder(foldername);

	strcat(foldername,g_curRomInfo.szGameName);
	strcat(foldername,"\\");

	gTxtrDumpInfos.clear();

	char	foldername3[256];
	GetPluginDir(foldername3);
	if(foldername3[strlen(foldername3) - 1] != '\\') strcat(foldername3, "\\");
	strcat(foldername3,"hires_texture\\");
	CheckAndCreateFolder(foldername3);

	strcat(foldername3,g_curRomInfo.szGameName);
	strcat(foldername3,"\\");

	gHiresTxtrInfos.clear();


	if( !PathFileExists(foldername) )
	{
		CheckAndCreateFolder(foldername);
		char	foldername2[256];
		for( int i=0; i<5; i++)
		{
			strcpy(foldername2,foldername);
			strcat(foldername2,subfolders[i]);
			CheckAndCreateFolder(foldername2);
		}
		return;
	}
	else
	{
		gTxtrDumpInfos.clear();
		FindAllTexturesFromFolder(foldername,gTxtrDumpInfos, false, true);
		gHiresTxtrInfos.clear();
		FindAllTexturesFromFolder(foldername3,gHiresTxtrInfos, true, true);

		char	foldername2[256];
		for( int i=0; i<5; i++)
		{
			strcpy(foldername2,foldername);
			strcat(foldername2,subfolders[i]);
			CheckAndCreateFolder(foldername2);
		}
	}
}

It checks the hires folder to see whether the textures are there as well.
I need someone to make sure this works, though.

I'll release a slightly updated version soon that adds an option to enable and disable scanning of hires textures during the dump check.

EDIT:

We need SVN :p

EDIT2:
As soon as I find out how to add more options to the plugin GUI, it'll be ready for some testing.
 
Last edited:

Mollymutt

Member
Are you saying that when you dump textures, it checks the hi-res folder and doesn't dump any textures already in it? I've been waiting for this for a long time. I would love to test this for you.
 