This is likely to be an incredibly esoteric post of interest to very few people, but I’ve been wrestling with some really specific technical problems today and I *think* I’ve found the solutions, so I thought I’d document them here for my own benefit. If someone in the future happens to come across this post and it helps them solve the same problems, then all the better!
1.) Don’t rely on a “Fallback” shader to provide a shadow-casting pass
Nome uses a heavily-customised sprite shader which, amongst other things, accounts for normal maps, shadows and fog – all things not normally associated with 2D games. To do so, I’m making use of the fact that Unity will read additional passes from the fallback shader (this isn’t documented, but is widely known). But, for reasons I couldn’t fathom, whenever a sprite was flipped (i.e. a right-facing sprite made to face left, either by setting localScale.x = -1, setting localRotation.eulerAngles.y = 180, or ticking the mirror X option in the inspector), the shader would no longer write to the depth texture. As a result, effects like global fog, which rely on reading the depth buffer from the camera, were not rendering correctly.
Originally I assumed this must be something to do with flipped normals on the reverse side of the mesh? Incorrect tangents? Backface culling? Well, possibly, but the real culprit seemed to be that I was specifying `Fallback "Transparent/Cutout/Diffuse"` as a way to get the cutout shadow-rendering pass for free. When I manually ripped the code of the ShadowCaster pass out of the Transparent/Cutout/Diffuse shader and pasted it directly into my custom shader, it worked fine. Go figure…
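For reference, the inlined pass ends up looking something like this – a minimal sketch of a cutout ShadowCaster pass built from Unity’s standard shadow macros, not the exact code from my shader; it assumes the shader already declares `_MainTex` and `_Cutoff` properties:

```shaderlab
// A minimal cutout ShadowCaster pass, pasted directly into the custom shader
// instead of relying on Fallback "Transparent/Cutout/Diffuse" to supply it.
Pass
{
    Name "ShadowCaster"
    Tags { "LightMode" = "ShadowCaster" }
    Cull Off  // sprites are visible from both sides, so don't cull backfaces

    CGPROGRAM
    #pragma vertex vert
    #pragma fragment frag
    #pragma multi_compile_shadowcaster
    #include "UnityCG.cginc"

    struct v2f
    {
        V2F_SHADOW_CASTER;
        float2 uv : TEXCOORD1;
    };

    sampler2D _MainTex;
    float4 _MainTex_ST;
    fixed _Cutoff;

    v2f vert(appdata_base v)
    {
        v2f o;
        TRANSFER_SHADOW_CASTER_NORMALOFFSET(o)
        o.uv = TRANSFORM_TEX(v.texcoord, _MainTex);
        return o;
    }

    float4 frag(v2f i) : SV_Target
    {
        // Discard transparent texels so the shadow follows the sprite outline
        clip(tex2D(_MainTex, i.uv).a - _Cutoff);
        SHADOW_CASTER_FRAGMENT(i)
    }
    ENDCG
}
```

With the pass inlined, the Fallback line can be removed entirely, so there’s no hidden dependency on another shader’s passes.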
2.) Graphics.Blit doesn’t work with a material that uses a surface shader
I recently changed from using the (CPU-based) LibNoise library to the (GPU-based) Turbulence Library to produce the noise textures used in some of the procedural generation functions in the game. Since the Turbulence Library uses shaders to create noise textures on the GPU, I needed to blit the result to a render texture and then read it back into a regular Texture2D:
```csharp
// Create a temporary render texture
RenderTexture temp = RenderTexture.GetTemporary(size, size, 0, RenderTextureFormat.ARGB32);
// Render the noise material to the render texture (whiteTexture is just a dummy src)
Graphics.Blit(Texture2D.whiteTexture, temp, mat);
// ReadPixels reads from the currently active render texture
RenderTexture.active = temp;
// Copy the render texture to a regular texture
Texture2D heightMap = new Texture2D(size, size);
heightMap.ReadPixels(new Rect(0, 0, size, size), 0, 0, false);
heightMap.Apply();
// Restore state and release the temporary render texture
RenderTexture.active = null;
RenderTexture.ReleaseTemporary(temp);
```
However, while the texture was getting created fine, it was always filled with black pixels. The material itself looked fine in the inspector and when assigned to an object in the scene view. The problem is that (for some reason) the shader template the Turbulence Library uses creates a surface shader, which attempts to apply a Lambertian lighting model to the material surface. It seems that surface shaders simply don’t work in a blit operation – I guess because a blit has no concept of the camera, the object, or the lighting directions within the scene, all of which a lighting function generally needs. Rewriting it as a regular unlit vert/frag shader fixed the issue (and made the operation a bit faster), meaning that I can now generate heightmap textures as shown below on the GPU and apply them to terrain heightmaps in a couple of milliseconds.
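For anyone hitting the same wall, the fix amounts to swapping the surface shader template for a plain unlit one along these lines – a sketch only, where `NoiseFunction` is a placeholder standing in for whatever noise call the Turbulence Library shader actually makes, and the shader name is made up:

```shaderlab
// Sketch of an unlit vert/frag shader suitable for Graphics.Blit:
// no lighting model, so it needs no camera/light context to produce output.
Shader "Hidden/BlitNoise"
{
    SubShader
    {
        Cull Off ZWrite Off ZTest Always

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            struct v2f
            {
                float4 pos : SV_POSITION;
                float2 uv : TEXCOORD0;
            };

            v2f vert(appdata_img v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                o.uv = v.texcoord;
                return o;
            }

            fixed4 frag(v2f i) : SV_Target
            {
                // NoiseFunction is a placeholder for the library's noise call
                float n = NoiseFunction(i.uv);
                return fixed4(n, n, n, 1);
            }
            ENDCG
        }
    }
}
```

Because the fragment stage writes the noise value directly, the blit fills the render texture deterministically, with no dependence on scene lighting.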
So that’s down to only 97 problems, at least…