Blending
So far we've been rendering opaque surfaces. What makes them opaque is that their colors completely overwrite whatever colors were written to the framebuffer previously. A transparent surface, on the other hand, allows colors from farther surfaces to pass through. The color that reaches your eye through a transparent surface is a mix of colors.
The first step to render transparent surfaces is to enable blending with this statement:
gl.enable(gl.BLEND);
The second step is to draw an object with an opacity less than 1. This means altering the assignment to fragmentColor in the fragment shader. This assignment, for example, renders fragments in a magenta that is 60% opaque:
fragmentColor = vec4(1.0, 0.0, 1.0, 0.6);
Browsers automatically blend the pixels of the framebuffer with whatever page content is behind the canvas. If your page has a background color, it will mix into your 3D scene. Likely you do not want this. One way to disable the mixing is to eliminate the alpha channel from your framebuffer in the getContext call:
window.gl = canvas.getContext('webgl2', {alpha: false});
When blending is disabled, the graphics card overwrites the color in the framebuffer with a statement like this:
framebuffer[c, r] = fragmentColor
When blending is enabled, the assigned color is a weighted sum of the fragment color and the color previously written to the framebuffer:
framebuffer[c, r] = newWeight * fragmentColor.rgb +
                    oldWeight * framebuffer[c, r]
You specify the weights using gl.blendFunc. There are many possible weighting schemes, though one is more physically intuitive than the others. If a fragment has an opacity of 75%, then that means you will see 75% of its color and 25% of whatever color is behind it. That scheme is applied with this statement:
gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA);
With gl.SRC_ALPHA as the first parameter, the new weight is the fragment's opacity. With gl.ONE_MINUS_SRC_ALPHA as the second parameter, the old weight is the complement of the fragment's opacity. The framebuffer is then updated with a statement like this:
framebuffer[c, r] = fragmentColor.a * fragmentColor.rgb +
                    (1 - fragmentColor.a) * framebuffer[c, r]
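This update is easy to simulate on the CPU. Here is a minimal sketch in plain JavaScript; the function name blendOver is mine, not part of WebGL:

```javascript
// Simulate the SRC_ALPHA / ONE_MINUS_SRC_ALPHA update for one pixel.
// src is the incoming fragment color [r, g, b, a]; dst is the color
// already in the framebuffer [r, g, b]. All components are in [0, 1].
function blendOver(src, dst) {
  const a = src[3];
  return [
    a * src[0] + (1 - a) * dst[0],
    a * src[1] + (1 - a) * dst[1],
    a * src[2] + (1 - a) * dst[2],
  ];
}

// 60% opaque magenta over a white background:
console.log(blendOver([1.0, 0.0, 1.0, 0.6], [1.0, 1.0, 1.0]));
// → [1, 0.4, 1]
```

Note that the framebuffer's old alpha doesn't appear in the sum; only the incoming fragment's alpha determines the two weights.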
Blending surfaces seems pretty neat, but it suffers from one major drawback. For a transparent surface to reveal the surfaces behind it, those farther surfaces must already have been recorded in the framebuffer. This requires sorting the geometry of the scene and rendering from farthest to nearest. Because of this cost, transparency is used sparingly or only for visual effects that are always drawn after the rest of the scene, like a fog overlay or a heads-up display.
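The back-to-front ordering can be sketched like this. The object list and distance measure here are illustrative, not part of WebGL:

```javascript
// Sort transparent objects so the farthest from the eye is drawn first.
// Each object records a position; we compare squared distances, which
// avoids a square root and preserves the ordering.
function sortBackToFront(objects, eye) {
  const dist2 = p =>
    (p[0] - eye[0]) ** 2 + (p[1] - eye[1]) ** 2 + (p[2] - eye[2]) ** 2;
  // Descending by squared distance: farthest first.
  return [...objects].sort((a, b) => dist2(b.position) - dist2(a.position));
}

const eye = [0, 0, 5];
const objects = [
  {name: 'near', position: [0, 0, 4]},
  {name: 'far', position: [0, 0, -10]},
  {name: 'middle', position: [0, 0, 0]},
];
console.log(sortBackToFront(objects, eye).map(o => o.name));
// → ['far', 'middle', 'near']
```

Sorting by object position is an approximation; it can still produce wrong results when transparent objects interpenetrate, which is another reason transparency is used sparingly.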