Hi, my first post. I'm a newbie with N64 emulation, so I might be missing something important, but how about multiplying a 1D sharp lighting texture onto the surface? Multitexturing is needed, obviously.
1) Create a 1D sharp lighting texture.
2) Compute the coordinates of the 1D texture for every vertex using the dot product between the normal and the lighting direction.
3) Disable lighting.
4) Draw the polygons using the original RGB color, coordinates, and texture (if any), and blend in the sharp lighting texture by multiplication. (See the rough sketch after this list.)
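For concreteness, here is a minimal CPU-side sketch of steps 1 and 2 in C. The band thresholds, texture size, and function names are just my own illustration, not code from any existing plugin:

#define SHADE_TEX_SIZE 64

/* Step 1: build a 1D texture with hard brightness bands. */
void build_sharp_lighting_texture(unsigned char tex[SHADE_TEX_SIZE])
{
    int i;
    for (i = 0; i < SHADE_TEX_SIZE; ++i) {
        float u = (float)i / (float)(SHADE_TEX_SIZE - 1);
        if (u < 0.3f)
            tex[i] = 80;            /* dark band */
        else if (u < 0.7f)
            tex[i] = 160;           /* mid band */
        else
            tex[i] = 255;           /* bright band */
    }
}

/* Step 2: per-vertex texture coordinate from N.L (both assumed normalized). */
float sharp_lighting_coord(const float n[3], const float l[3])
{
    float d = n[0]*l[0] + n[1]*l[1] + n[2]*l[2];
    return (d < 0.0f) ? 0.0f : d;   /* clamp back-facing results to the dark end */
}

The resulting coordinate would be fed to the second texture stage for every vertex before drawing.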
I see that Gonetz's implementation does exactly what Sami "MENTAL" Hamlaoui's article says in its "Cel-Shading Textures" section. Unfortunately, that section of the article is incorrect: with Gouraud shading, the RGB values for all points inside the triangle are interpolated between the RGB values of the three vertices defining the triangle, just as Rice pointed out. I guess flat shading wouldn't work either, because it would look too blocky on low-poly models.
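To make the interpolation point concrete, here is a toy comparison in C (my own illustration of the objection, not code from the article). Quantizing the shade at the vertices and letting Gouraud interpolation do the rest gives a smooth gradient inside the triangle, while interpolating the texture coordinate and doing the sharp lookup per pixel keeps the hard bands. w0..w2 are barycentric weights that sum to 1:

/* Per-vertex quantization, then interpolation: the bands get blurred away. */
float quantize_then_interpolate(float shade0, float shade1, float shade2,
                                float w0, float w1, float w2)
{
    return w0 * shade0 + w1 * shade1 + w2 * shade2;
}

/* Interpolate the coordinate, then look up per pixel: the hard bands survive. */
float interpolate_then_lookup(const unsigned char ramp[64],
                              float u0, float u1, float u2,
                              float w0, float w1, float w2)
{
    float u = w0 * u0 + w1 * u1 + w2 * u2;
    int i = (int)(u * 63.0f + 0.5f);
    if (i < 0) i = 0;
    else if (i > 63) i = 63;
    return ramp[i] / 255.0f;
}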
-KoolSmoky
Koolsmoky, welcome to emutalk.
Your method may not work because a vertex does not have its own RGB color when lighting is in use; in fact, the vertex RGB colors are computed from the lights. Because the vertex gets its RGB colors from lighting, RGB-colored lights cannot be replaced successfully by a grayscale texture. A full RGB color texture map must be used.
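To show what I mean, here is a rough sketch of how a vertex gets its RGB from an ambient light plus directional lights when lighting is on (my own pseudo-C, not actual microcode or plugin source). Because each light contributes its own colour, a single grayscale ramp cannot stand in for all of them:

typedef struct { float r, g, b; float dir[3]; } DirLight;   /* illustrative types */

void light_vertex(const float n[3], const float ambient[3],
                  const DirLight *lights, int num_lights, float out_rgb[3])
{
    int i;
    out_rgb[0] = ambient[0];
    out_rgb[1] = ambient[1];
    out_rgb[2] = ambient[2];
    for (i = 0; i < num_lights; ++i) {
        float d = n[0]*lights[i].dir[0] + n[1]*lights[i].dir[1] + n[2]*lights[i].dir[2];
        if (d > 0.0f) {
            out_rgb[0] += d * lights[i].r;   /* each light adds its own colour */
            out_rgb[1] += d * lights[i].g;
            out_rgb[2] += d * lights[i].b;
        }
    }
    for (i = 0; i < 3; ++i)
        if (out_rgb[i] > 1.0f) out_rgb[i] = 1.0f;
}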
BTW, the N64 has 2 texture units. To do cel-shading successfully, a video card needs to support multitexturing with at least 3 texture units. This won't be a problem in general if the video card supports pixel shaders, and pixel shaders are needed anyway to efficiently implement the new multitexture color combiner modes.
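As a rough idea of where the third unit goes (my own fixed-function OpenGL sketch, assuming an OpenGL 1.3+ context where glActiveTexture is exposed; not code from any plugin): the N64's two combiner textures sit on units 0 and 1, and the 1D shading ramp modulates their result on unit 2.

#include <GL/gl.h>

void bind_cel_shading_textures(GLuint tex0, GLuint tex1, GLuint shade_ramp)
{
    glActiveTexture(GL_TEXTURE0);             /* N64 texture 0 */
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, tex0);
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);

    glActiveTexture(GL_TEXTURE1);             /* N64 texture 1 */
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, tex1);
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);

    glActiveTexture(GL_TEXTURE2);             /* 1D cel-shading ramp */
    glEnable(GL_TEXTURE_1D);
    glBindTexture(GL_TEXTURE_1D, shade_ramp);
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
}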
Don't forget this.
// NPR outline shader
// c0-3 view matrix
// c4-7 view projection matrix
// c8
// c9 (0.0, 0.0, 0.0, 1.0f)
// c10 line width scalar
vs.1.1
m4x4 r0, v0, c0 // compute the view vector
dp3 r1, r0, r0 // normalize the view vector
rsq r1, r1
mul r0, r0, r1
m3x3 r1, v7, c0 // multiply normal 1 by the view matrix
m3x3 r2, v8, c0 // multiply normal 2 by the view matrix
dp3 r3, r0, r1 // dot normal 1 with the view vector
dp3 r4, r0, r2 // dot normal 2 with the view vector
mul r3, r3, r4 // multiply the dot products together
slt r3, r3, c9 // check if less than zero
mov oD0, c9 // set the output color
dp4 r0, v0, c6 // compute the vertex depth
mul r0, r0, c10 // multiply by a line thickness scalar
mul r3, r3, r0 // multiply the thickness by the smooth normal
mul r3, v3, r3 // multiply by the normal offset
add r0, v0, r3 // add in the offset
mov r0.w, c9.w // swizzle in a one for the w value
m4x4 oPos, r0, c4 // transform the vertex by the model view projection
// Cartoon vertex shader
// c9 is the light position
// c10 is the view projection matrix
// c14 is the view matrix
vs.1.1
// output the vertex multiplied by the mvp matrix
m4x4 oPos, v0, c10
// compute the normal in eye space
m3x3 r0, v3, c14
mov oT0, r0 // write the normal to tex coord 0
// compute the light vector
sub r0, c9, v0
dp3 r1, r0, r0
rsq r1, r1
mul r0, r0, r1
m3x3 r1, r0, c14 // transform the light vector into eye space
mov oT1, r1 // write the light vector to tex coord 1
// compute half vector
m4x4 r0, v0, c14 // transform the vertex position into eye space
dp3 r3, r0, r0 // normalize to get the view vector
rsq r3, r3
mul r0, r0, r3
add r0, r1, -r0 // add the light vector and the view vector = half angle
dp3 r3, r0, r0 // normalize the half angle vector
rsq r3, r3
mul r0, r0, r3
mov oT2, r0 // write the half angle vector to tex coord 2
// Cartoon shading pixel shader
//
ps.1.4
def c0, 0.1f, 0.1f, 0.1f, 0.1f // falloff 1
def c1, 0.8f, 0.8f, 0.8f, 0.8f // falloff 2
def c2, 0.2f, 0.2f, 0.2f, 1.0f // dark
def c3, 0.6f, 0.6f, 0.6f, 1.0f // average
def c4, 0.9f, 0.9f, 1.0f, 1.0f // bright
// get the normal and place it in register 0
texcrd r0.xyz, t0
// get the light vector and put it in register 1
texcrd r1.xyz, t1
// compute n dot l and place it in register 3
dp3 r3, r0, r1
// subtract falloff 1 from the n dot l computation
sub r4, r3, c0
// check if n dot l is greater than zero
// if yes use average color otherwise use the darker color
cmp_sat r0, r4, c3, c2
// subtract falloff 2 from the n dot l computation
sub r4, r3, c1
// check if n dot l is greater than zero
// if yes use bright color otherwise use whats there
cmp_sat r0, r4, c4, r0
KoolSmoky, a vertex gets its RGB colors (through lighting) from all lights. There is usually 1 ambient light and 1 or more directional lights. Point lighting is not used in N64 games except Zelda MM.
Hyrule Field, as an example, shifts its colours based on the day-to-night cycle, and that would be done through lighting.
Vertex colours are also used for the environments; for example, in the Temple of Time the vertex colours go to black to hide the ceiling.
KoolSmokey, a 3D texture is much easier.
mudlord, where did you see that the lookup method uses 3 colours instead of grayscale? Can you post your link?