
When implementing the cloaking ability for Project Oort, I came upon a somewhat interesting problem: At what point during the gradual disappearing process should the object’s shadow disappear?

At first, I just had the object abruptly stop writing to the depth buffer at a certain transparency threshold. This caused the shadow to simply cut out at an arbitrary point. I didn’t like the look of that, so I decided I wanted the shadow to fade in and out gradually.

To do this, I needed a way to know the transparency factor of the object that cast each shadow. With that information, it would be trivial to fade the shadow by multiplying the shadow factor (where 1.0 is in total shadow and 0.0 is in total light) by the object’s opacity. Something like this:

// visibilityFactor is 1.0 when the caster is fully visible and 0.0 when
// fully cloaked, so the shadow fades out along with the object
float shadowFac = calcShadow(norm) * visibilityFactor;
vec3 ambient = (kd * diffuse + specular) * ao * (1.0 - shadowFac);

A nicer approach, since I had already implemented PCSS¹, would be to effectively increase the blur on the shadow as the caster became more transparent, until the shadow was unnoticeable. This is more expensive, so I didn’t go for it.
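To illustrate the idea, something like this could work (pcf, maxFadeRadius, and casterTransparency are made-up names for this sketch, not code from Project Oort):

// widen the filter kernel as the caster fades so the shadow blurs away
// instead of popping; pcfRadius is the usual PCSS penumbra radius and
// casterTransparency runs from 0.0 (opaque) to 1.0 (fully invisible)
float radius = mix(pcfRadius, maxFadeRadius, casterTransparency);
// also weaken the shadow so it is fully gone once the caster is invisible
float shadowFac = pcf(shadowCoords, radius) * (1.0 - casterTransparency);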

As for determining the transparency of a shadow’s caster: this can be achieved fairly simply by using a GL_RG32F texture instead of a traditional single-channel depth texture and writing both the depth and the object’s transparency during the depth pass. The depth-pass fragment shader would therefore look something like this:

// Depth-pass fragment shader: writes light-space depth in R and the
// invisibility factor in G of the GL_RG32F target
in float depth;        // light-space depth from the vertex shader
uniform float inv_fac; // invisibility factor of the object being drawn
out vec2 depth_inv;

void main() {
    depth_inv = vec2(depth, inv_fac);
}
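For context, the depth input above comes from the depth pass’s vertex shader. A minimal sketch of that, assuming a light-space view-projection uniform (the names here are illustrative, not Project Oort’s actual code):

layout(location = 0) in vec3 a_pos;
uniform mat4 model;     // object-to-world transform
uniform mat4 light_vp;  // light-space view-projection matrix
out float depth;

void main() {
    gl_Position = light_vp * model * vec4(a_pos, 1.0);
    // map NDC z from [-1, 1] to [0, 1] to match the later depth comparisons
    depth = gl_Position.z / gl_Position.w * 0.5 + 0.5;
}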

Now, if it were that easy, I wouldn’t be talking about it. And you may have already noticed an issue.

In the diagram, suppose the blue box is a transparent object and the red circle is an opaque object. With the approach described above, the depth information of the transparent object would overwrite the depth information of the opaque object, and part of the opaque object’s shadow would fade away along with the transparent one.

My first thought was to use stencil testing to include the invisibility factor in the decision of whether to render to the depth map. One small issue is that this would require quantizing the 32-bit floating-point invisibility factor into 256 levels, so that it could fit in an 8-bit integral stencil value, and setting the stencil function for each object.

Recall that stencil testing compares the masked current value in the stencil buffer against a masked reference value specified in the stencil function. If the comparison (ref & mask) CMP_OP (cur_val & mask) succeeds, the fragment continues through the rendering pipeline and the stencil buffer is updated according to a specified update operation.
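For a sense of what that quantization looks like, here is a hypothetical sketch (in practice this arithmetic would run host-side to produce the reference value passed to glStencilFunc):

// collapse the [0, 1] invisibility factor into one of 256 stencil levels
uint stencil_ref = uint(clamp(inv_fac, 0.0, 1.0) * 255.0 + 0.5);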

Ultimately, what killed this idea is that the stencil test occurs before the depth test, but we want to look at the depth information first. Looking back at the above diagram, the transparent object should cast a shadow on the opaque object (while it transitions from opaque to transparent), but with the stencil-test approach this wouldn’t happen, since the overlapping part of the blue box would never be rendered into the depth map. I also thought of encoding depth information in the stencil buffer, but its limited precision would not be acceptable for depth data.

I’m not sure what the best real-time solution to this problem is, but I can share mine.

My first idea was to render all the transparent objects’ depth/invisibility info into a separate depth map and use logic in the fragment shader to incorporate this data. However, this brings us right back to where we started! The earlier discussion still applies; just substitute “nearly opaque” for “opaque” and “nearly transparent” for “transparent”.

Therefore, we need a depth map for every transparent object to avoid losing depth and transparency information that may become necessary later. Project Oort uses cascaded shadow maps, so this amounts to c * N extra depth maps, where c is the number of cascades and N is the number of transparent objects. Luckily for me, only the player can turn invisible, so there is only one possible transparent object (N = 1).

To make this change, we start by passing the extra shadow maps to the fragment shader.

// per-cascade depth maps containing only the transparent casters
uniform sampler2D cascadeTransMaps[3];
// per-cascade transparency factors of those casters
uniform sampler2D cascadeTransFacs[3];

We use a transparency factor where 1.0 is fully transparent and 0.0 is fully opaque.

When searching for blocking objects to compute the average occluder distance for PCSS, we include semi-transparent objects as if they were opaque.

int blockers = 0;
float avgBlockerDistance = 0.0;
for(int i = 0; i < blocker_search_samples; ++i) {
    vec2 rand_offset = poissonDisc[i] * searchWidth;
    vec2 pos = shadowCoords.xy + rand_offset;
    // depth_map is the regular depth map of the ith cascade;
    // trans_depth_map is the depth map for any transparent objects.
    // Taking the min treats whichever caster is nearer as the blocker,
    // so semi-transparent casters count just like opaque ones
    float sample_depth = min(texture(depth_map, pos).r, texture(trans_depth_map, pos).r);
    if (sample_depth < shadowCoords.z - bias) {
        ++blockers;
        avgBlockerDistance += sample_depth;
    }
}
if (blockers > 0) avgBlockerDistance /= float(blockers);
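From there, the average blocker depth feeds the usual PCSS penumbra estimate from the paper cited below (light_size is a tunable light-width parameter; this is the textbook formula, not necessarily Project Oort’s exact code):

// similar-triangles penumbra estimate from the PCSS paper: the farther the
// receiver is behind the average blocker, the wider the filter kernel
float penumbraWidth = (shadowCoords.z - avgBlockerDistance) * light_size / avgBlockerDistance;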

Then, when computing PCF, we compute the shadow factor for opaque occluders and for semi-transparent occluders independently, and blend the two to produce the final shadow factor:

float calcShadow(vec3 norm) {
    // calcShadowFrom packs the opaque shadow factor, the transparent shadow
    // factor, and the transparent caster's transparency into x, y, and z
    vec3 s = calcShadowFrom(norm);
    float opaqueShadow = s.x;
    float transShadow = s.y;
    float blendFac = s.z;
    // weight the transparent shadow by the caster's opacity, then saturate
    return min((1.0 - blendFac) * transShadow + opaqueShadow, 1.0);
}
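calcShadowFrom isn’t shown here, so as a rough illustration, here’s a single-sample sketch of the values it produces, with PCF filtering and cascade selection stripped out (the function name and sampler parameters are stand-ins, not actual Project Oort code):

vec3 calcShadowFromSketch(sampler2D opaque_map, sampler2D trans_map,
                          sampler2D trans_fac, vec3 shadowCoords, float bias) {
    // 1.0 if an opaque caster is nearer to the light than this fragment
    float opaqueShadow =
        texture(opaque_map, shadowCoords.xy).r < shadowCoords.z - bias ? 1.0 : 0.0;
    // same test against the transparent casters' depth map
    float transShadow =
        texture(trans_map, shadowCoords.xy).r < shadowCoords.z - bias ? 1.0 : 0.0;
    // transparency factor of whatever transparent caster covers this texel
    float blendFac = texture(trans_fac, shadowCoords.xy).r;
    return vec3(opaqueShadow, transShadow, blendFac);
}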

Note that we don’t use a traditional alpha blend ((1.0 - blendFac) * transShadow + opaqueShadow * blendFac). If a fragment is in the shadow of an opaque object, it also being in the shadow of a transparent object should not lighten the shading. For example, with opaqueShadow = 1.0, transShadow = 0.5, and blendFac = 0.5, the alpha blend gives 0.75, lightening a fragment that is fully in opaque shadow, while the saturating sum keeps it at 1.0.

The result ended up looking like so:

Invisibility demo


  1. Fernando, R. (2005). Percentage-Closer Soft Shadows. NVIDIA. https://developer.download.nvidia.com/shaderlibrary/docs/shadow_PCSS.pdf

