I don't know the details of how rendering is accomplished by most N64 emulators (or any emulator, for that matter). However, I'm familiar with rendering concepts in general, and I've thought over what's plausible, though I'm stuck on a number of questions about how things actually work.
Basically, I'm curious whether model data could be brought into an N64 game as it's played, similar to how plugins have made it possible to retexture a game.
However, I'd like to know, when it comes down to it, how the rendering takes place. Does the emulator build the 3D world itself, based on what the game says it should look like? Or does the game handle the rendering itself along with everything else, with the emulator merely translating the native instructions into something that runs on the given platform (i.e. x86 PCs) and displaying the contents of the game's video memory on screen?
How does the Rice Video plugin operate, or any other for that matter? Does it know, at some level, the position, orientation, etc. of a model at runtime, retexturing the game's models as they're rendered by the emulator's software? Or, if the game itself is doing the rendering, are the new textures 'layered' on top of the game's rendering job?
My point in bringing up the video plugins is that there clearly exists some knowledge of what's possible when altering games as they run, but I'd like to know what the plugins are actually doing: altering the game's rendered work, or the emulator's rendered work?
If the game is doing the rendering, then I'd imagine altering which models are drawn would be difficult at best. However, if the emulator is rendering the game, swapping models and textures with whatever the user has locally for the emulator to use should be much easier, and it should have no effect on how the game continues to operate (albeit with the collision models remaining the same in the game).
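To illustrate what I'm imagining, here's a minimal sketch of hash-based texture replacement, which is my rough mental model of how retexturing packs could work: since the N64 game never exposes texture names, the plugin would have to key replacements on a hash of the texture bytes it sees being uploaded. All names here are made up; I don't know the actual internals of Rice or any other plugin.

```python
import zlib

# Hypothetical user pack: maps a CRC32 of the original texture bytes
# to high-resolution replacement data supplied by the user.
replacement_pack = {
    zlib.crc32(b"original-grass-texture"): b"hires-grass-texture",
}

def load_texture(texture_bytes: bytes) -> bytes:
    """Hypothetically called when the emulated hardware is told to
    load a texture. Returns the replacement if the user's pack has
    one, otherwise the original bytes the game supplied."""
    key = zlib.crc32(texture_bytes)
    return replacement_pack.get(key, texture_bytes)

# The game asks for its own texture; the plugin silently swaps it,
# while textures with no replacement pass through unchanged.
print(load_texture(b"original-grass-texture"))
print(load_texture(b"original-rock-texture"))
```

If that's roughly right, then the swap happens entirely on the emulator's side of the fence, which would explain why the game keeps running normally: it never finds out its texture was replaced.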