Requesting info on decoding theory.

teetuck

New member
Typical Stuff:
First post.
I'm a n00b to emulation and hacking, but I have several years experience with programming in several languages.
I've tried locating topics with a similar subject, but found none. Please forgive me if this has already been covered.

Problem:
After finding an Easter egg of sorts in Starfox Adventures, I've been attempting to find the specific texture involved within the game's files. See here: http://starfox-online.com/phpBB2/viewtopic.php?t=790&start=0
I got the game image and ran it through GC-Tool; however, the extracted files are, for the most part, compressed binary. Tile Molester has proved useless on every file I've tried.

Question:
I've never done any kind of hex work, so could someone direct me to websites or tutorials explaining the theory behind decompressing and reading binary files, and extracting normal data (e.g. bitmaps) from them? Anything specific to the Gamecube's encoding scheme? Is there a tool I overlooked that would solve this problem?
 

afwefwe

New member
That Google result isn't going to be very helpful.

You'll probably have to deal with several methods of encoding to get at the texture data. At the lowest level, the texture could be stored palettized, raw, or compressed using S3TC. S3TC is the most likely, as it usually has the best quality-to-size ratio.
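To give an idea of what S3TC decoding involves: the common DXT1 variant packs each 4x4 pixel block into 8 bytes, two RGB565 endpoint colors plus sixteen 2-bit palette indices. Here's a rough sketch of a block decoder. It follows the standard little-endian PC DXT1 layout; the Gamecube's CMPR variant reorders bytes and index bits, so treat this as a starting point only:

```python
import struct

def rgb565(v):
    """Expand a packed RGB565 value to an (r, g, b) tuple of 8-bit channels."""
    r, g, b = (v >> 11) & 0x1F, (v >> 5) & 0x3F, v & 0x1F
    return ((r << 3) | (r >> 2), (g << 2) | (g >> 4), (b << 3) | (b >> 2))

def decode_dxt1_block(block):
    """Decode one 8-byte DXT1 block into 16 (r, g, b) pixels, row-major."""
    c0, c1 = struct.unpack("<HH", block[:4])   # two endpoint colors
    p0, p1 = rgb565(c0), rgb565(c1)
    if c0 > c1:  # 4-color mode: two interpolated colors between the endpoints
        palette = [p0, p1,
                   tuple((2 * a + b) // 3 for a, b in zip(p0, p1)),
                   tuple((a + 2 * b) // 3 for a, b in zip(p0, p1))]
    else:        # 3-color mode: one midpoint color plus transparent black
        palette = [p0, p1,
                   tuple((a + b) // 2 for a, b in zip(p0, p1)),
                   (0, 0, 0)]
    bits, = struct.unpack("<I", block[4:8])    # sixteen 2-bit palette indices
    return [palette[(bits >> (2 * i)) & 3] for i in range(16)]

# A block with red/blue endpoints and all indices 0 decodes to 16 red pixels:
block = struct.pack("<HHI", 0xF800, 0x001F, 0)
print(decode_dxt1_block(block)[0])  # (255, 0, 0)
```

The fixed 8-bytes-per-16-pixels ratio (4 bits per pixel) is where that quality-to-size advantage comes from.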

At a higher level, almost all console games store textures in a custom pack file. You'll need to figure out how to decode and unpack the file on your own unless you are very lucky and the developers used a middleware library like ADX to handle details like that.

Start at the highest level possible; try and make a program to unpack the game's pack files. Then make guesses as to which files are the textures, and figure out what format they're stored in.

Keep in mind that this probably won't be very easy to do.
 

KCat

New member
I haven't been trying much to crack the compression scheme these files use, but I think I have the containing pack format partly figured out.

Each "chunk" starts with a ZLB\0 marker, followed by a 32-bit number (which seems to always be 1). From there, I'm guessing it has a 32-bit number for the uncompressed size of the data, followed by a 32-bit number for the compressed size. The files I've tested seem to follow this: the uncompressed size is always bigger, and the smaller the file, the smaller the difference. However, there are sometimes (usually, but not always) gaps between the end of an entry and the beginning of the next one (or the end of the file). This gap has never exceeded 40 bytes in my tests, so I can only assume the packing method just wasn't very efficient, or that entries are simply padded for alignment.
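If that guess at the layout is right, a small script can walk the container. This is only a sketch of the field order described above, and it assumes the 32-bit values are big-endian (the Gamecube's PowerPC CPU is big-endian):

```python
import struct
import zlib

def parse_zlb_chunks(data):
    """Yield (offset, version, unc_size, cmp_size, payload) for each ZLB chunk.

    Assumed layout per chunk: a 'ZLB\\0' marker, then three big-endian
    32-bit fields (the always-1 value, the uncompressed size, and the
    compressed size), then cmp_size bytes of packed data. Gaps between
    entries are skipped by searching for the next marker.
    """
    pos = 0
    while (pos := data.find(b"ZLB\x00", pos)) != -1:
        version, unc_size, cmp_size = struct.unpack_from(">III", data, pos + 4)
        payload = data[pos + 16 : pos + 16 + cmp_size]
        yield pos, version, unc_size, cmp_size, payload
        pos += 16 + cmp_size

# Build a synthetic chunk (with a leading gap) to exercise the parser:
raw = b"texture?" * 64
comp = zlib.compress(raw)
blob = b"\x00" * 8 + struct.pack(">4sIII", b"ZLB\x00", 1, len(raw), len(comp)) + comp
for off, ver, unc, cmp_len, payload in parse_zlb_chunks(blob):
    print(off, ver, unc, cmp_len)  # 8 1 512 <compressed size>
```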

From here, I've tried writing the packed data out to disk. None of the decompression tools on my Linux system wanted to touch it (not surprising, really). So far, all I've been able to figure out is that each of these packed data segments starts with the two byte values 0x58 and 0x85... if that isn't part of the entry header, anyway.
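For what it's worth, 0x58 0x85 happens to be a valid zlib stream header (0x5885 = 31 x 731, which satisfies the RFC 1950 check; CMF 0x58 means deflate with an 8 KB window), which would also fit the "ZLB" marker. Command-line tools like gzip won't accept a bare zlib stream, but the zlib library will. A quick sanity check; the header test and the window-size reading are my own inference, not anything confirmed about these files:

```python
import zlib

def looks_like_zlib(data):
    """RFC 1950 header check: CM must be 8 (deflate) and the CMF/FLG
    byte pair must be divisible by 31."""
    return (len(data) >= 2 and data[0] & 0x0F == 8
            and ((data[0] << 8) | data[1]) % 31 == 0)

# 0x58 0x85 passes the check:
print(looks_like_zlib(b"\x58\x85"))  # True

# In fact, deflating with an 8 KB window produces exactly that header:
co = zlib.compressobj(6, zlib.DEFLATED, 13)  # wbits=13 -> 8 KB window
stream = co.compress(b"texture payload?") + co.flush()
print(stream[:2].hex())            # 5885
print(zlib.decompress(stream))     # a plain decompress handles any window size
```

So it may be worth feeding those packed segments straight to zlib.decompress and seeing if the output length matches the uncompressed-size field.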
 
