Interpolating Normals
When a model is drawn, its normals first arrive in the vertex shader. In the early days of OpenGL, the ambient, diffuse, and specular terms were calculated in the vertex shader, and the resulting colors were interpolated across the triangle. Shading at the vertices rather than at the fragments is fast but ugly: the color can only change linearly across each triangle, so the mesh's facets show through and specular highlights smear or vanish.
Instead of interpolating the shaded color across a triangle, you want to interpolate the normal. The vertex shader sends the normal along to the fragment shader through an out variable, just as you have done with per-vertex colors:
in vec3 position;
in vec3 color;
in vec3 normal;

out vec3 mixColor;
out vec3 mixNormal;

void main() {
  gl_Position = vec4(position, 1.0);
  mixColor = color;
  mixNormal = normal;
}
However, there's a problem with interpolating vertex normals. Consider this side profile of normals being interpolated between two vertices:
[Figure: normals interpolated linearly between two vertices, shrinking below unit length toward the middle of the edge]
The issue is that interpolated normals lose their unit length. You can see in the figure that the blended normals are shorter than the vertex normals. They should look like this:
[Figure: the same interpolated normals renormalized to unit length]
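To quantify the shrinkage, suppose the two vertex normals $\mathbf{n}_0$ and $\mathbf{n}_1$ are unit vectors separated by angle $\theta$. The blend halfway along the edge then has length

$$\left\lVert \tfrac{1}{2}(\mathbf{n}_0 + \mathbf{n}_1) \right\rVert = \sqrt{\tfrac{1}{4}\,(2 + 2\cos\theta)} = \cos\tfrac{\theta}{2},$$

which is 1 only when the two normals point the same way. For normals 90 degrees apart, the blended normal at the midpoint has length $\cos 45^\circ \approx 0.707$.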
In the fragment shader, you must renormalize the interpolated normal, which you can do with the built-in normalize function:
in vec3 mixNormal;

void main() {
  vec3 normal = normalize(mixNormal);
  // ...
}
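Once renormalized, the normal is ready for lighting. As a minimal sketch of where this leads, here is a complete fragment shader that uses the normal for diffuse shading. The fragmentColor output and the lightDirection and albedo uniforms are illustrative names assumed for this example, not ones introduced earlier:

in vec3 mixNormal;
out vec4 fragmentColor;

// Assumed uniforms for this sketch: a unit vector pointing toward the
// light and the surface's base color.
uniform vec3 lightDirection;
uniform vec3 albedo;

void main() {
  vec3 normal = normalize(mixNormal);

  // Diffuse term: how directly the surface faces the light, clamped to 0
  // so surfaces facing away from the light receive none of it.
  float litness = max(0.0, dot(normal, lightDirection));

  fragmentColor = vec4(litness * albedo, 1.0);
}

Because the dot product is taken with the renormalized normal, the brightness depends only on the surface's orientation, not on how much the interpolation shrank the vector.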