Procedural Generation

Projecting procedural textures onto 3d surfaces

Background

A little bit of background before we dive right into projecting our textures onto 3d objects:

So far, we have only been working with one kind of shader program: fragment shaders. A full shader program is composed of both a vertex shader and a fragment shader. I can't do a good job of explaining how these two work in tandem; thankfully, there are a few animations that do it much better than I could, Here. The two animations to look at are the one near the top, just under the first block of code, and another one further down, showing a triangle with very jagged, pixelated edges.

The primary differences between the 'vertex' and 'fragment' shaders are:

  1. The vertex shader processes vertices
  2. The fragment shader processes pixels
  3. The fragment shader (typically) runs more times, and does more work
In the last 'paper', most of the examples were simply fragment shaders, running for every pixel in a canvas object. Behind them, there was a vertex shader which didn't do much; its only job was to make the entire canvas be used.
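
Such a vertex shader is nearly trivial. Here's a minimal sketch (written in Cg, since that's the language the rest of this 'paper' uses):

    // pass-through vertex shader: the vertices already cover the whole
    // canvas, so there's nothing to transform
    float4 vert(float4 position : POSITION) : SV_POSITION {
        return position;
    }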

In this 'paper', we are moving on to write shaders for the Unity3d game engine. Shaders for this platform are a bit different, and come in many varieties. One method of producing shaders allows for controlling the legacy 'Fixed Function' pipeline. This is a bit less useful for us, since we want to be able to write more complex programs, rather than just telling the graphics card how to sample and blend texture images. We will be focusing on using Surface Shaders. This kind of shader works a bit differently than what we have been doing so far.

Unity's surface shader expects more than just a single color as output; instead, it expects a much larger struct, containing more information about the pixel.

As given in Unity's built-in documentation:

    fixed3 Albedo;      // base (diffuse or specular) color
    fixed3 Normal;      // tangent space normal, if written
    half3 Emission;     // light emitted by the surface, independent of lighting
    half Metallic;      // 0=non-metal, 1=metal
    half Smoothness;    // 0=rough, 1=smooth
    half Occlusion;     // occlusion (default 1)
    fixed Alpha;        // alpha for transparencies

They also provide a variant of this struct that replaces 'Metallic' with a 'fixed3 Specular' property, for the specular highlight color. This struct is then used by a lighting function to produce the actual color of the pixel.

  1. Albedo is the primary output for surfaces; it is the base color that receives the light.
  2. Specular controls the color of specular reflections. By default, this is just white (100% of whatever light is being reflected).
  3. The Emission property is light that is always present (emitted by the surface), and is used for glows.
  4. The Metallic and Smoothness properties determine the general reflectiveness of the surface, both to light sources and to ambient environment reflections.
  5. Occlusion determines how exposed that part of the surface is to light. This adds to the appearance of depth on the surface.
  6. Alpha has no use for opaque surfaces, but can be used to either fade transparent surfaces, or 'clip' pixels off of partial surfaces, like leaves.

Our 'surface' shader writes to these properties of a struct, and then that struct is unpacked and processed by a lighting function.
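
For example, a minimal surface function (a sketch, with made-up constant values) just fills in those fields:

    void surf(Input IN, inout SurfaceOutputStandard o) {
        o.Albedo = fixed3(0.5, 0.5, 0.5); // flat grey base color
        o.Metallic = 0.0;                 // non-metal
        o.Smoothness = 0.5;               // halfway between rough and smooth
        o.Alpha = 1.0;                    // fully opaque
    }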

Unity's shader system is pretty extensible, and even allows for custom user lighting functions. The lighting function does what our fragment shaders were doing previously: determining a single final color.
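
As a sketch of what that looks like, a custom Lambert-style lighting function (using the classic SurfaceOutput struct, hooked up with something like '#pragma surface surf SimpleLambert') would be:

    half4 LightingSimpleLambert(SurfaceOutput s, half3 lightDir, half atten) {
        half NdotL = max(0, dot(s.Normal, lightDir));          // basic diffuse falloff
        half4 c;
        c.rgb = s.Albedo * _LightColor0.rgb * (NdotL * atten); // tint by the light's color
        c.a = s.Alpha;
        return c;
    }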

However, the other properties give us more places to add detail to our surfaces.

For example, besides the Albedo property, we could vary the Metallic and Smoothness properties based on our noise functions, as well as the Occlusion, Normal, and Emission properties.

Unity's pipeline also does a lot of other useful work. It provides different information to the fragment/surface shaders through a user-defined struct (Input), which can hold whatever information is needed; it lets users supply information themselves through custom vertex functions; it automatically compiles a number of variants of the surface shader for different 'passes' and rendering modes; and it provides a way to pipe information into the shader from each 'material' that uses that shader program.

We use this system to grab the worldspace coordinates of the pixel being rendered (the 'worldPos' field in the 'Input' struct), and we use that information to sample the noise fields to build the texture.
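
In code, that looks roughly like this (a sketch: '_Scale', '_Color1', and '_Color2' are hypothetical properties, and 'nnoise' is the fractal noise function from fbm.cginc, with the same signature it has in the Marble excerpt later on):

    struct Input {
        float3 worldPos; // automatically filled in by Unity when declared
    };

    void surf(Input IN, inout SurfaceOutputStandard o) {
        // sample the noise field at this pixel's worldspace position
        float n = nnoise(IN.worldPos * _Scale, _Factor);
        o.Albedo = lerp(_Color1.rgb, _Color2.rgb, n);
    }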

Unfortunately, Unity has retired their Web Player plugin, and doesn't yet properly support complex shaders in its WebGL pipeline. So, there's no graceful way to embed these examples into the pages this time; instead, there will be embedded images or videos.

Also, much of the previous work has been ported to Nvidia Cg (C for Graphics). Most of the common functions have been separated into .cginc include files, similar to C headers.

CGIncludes
  1. noiseprims.cginc - Holds the hash and basic noise functions
  2. fbm.cginc - Holds the fractal noise function
  3. voroni.cginc - Holds the voroni noise functions
  4. fbmnormal.cginc - Holds a helper function to generate normals for surfaces
  5. procheight.cginc - Holds a helper function to parallax surfaces

For starters, let's look at a 3d projection of the camo created in the last 'paper':

Camo Surface Shader Code
Full Unity Shader File: Camo.shader

Wow, that was a huge amount of stuff. It starts with the properties block, which defines what data is piped into the shader from the engine, allowing people other than the programmer to tweak the shader's settings. Each line in this block corresponds to a variable defined below (the variables starting with _). Then there's a bunch of #include directives pointing to some of the cginc files listed above, and some other compile directives. Towards the bottom, the last few lines assign to the surface outputs. The middle part is still relatively the same as before.
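
As a hedged sketch of that pairing (the exact names vary from shader to shader), a properties block and its matching variables look like:

    Properties {
        _Color1 ("Color 1", Color) = (1,1,1,1)
        _Scale ("Noise Scale", Float) = 1.0
    }
    // ...and inside the CGPROGRAM block, the matching variables:
    fixed4 _Color1;
    float _Scale;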

The surface produced by this shader (with default settings) looks like the following:

Lookin' pretty good. Another benefit of using procedural textures is that it's extremely easy to modify what the surface looks like.

Changing shader properties


So, changing the properties changes the noise fields, or how the samples are used to generate the texture. The parameters that can be piped into shaders are pretty limitless, as are the kinds of things that one can do with the shaders.

Here's a bunch of other, similar effects I've written using the same noise primitives and CGIncludes. As with the camo pattern, I didn't come up with the first two, 'Marble' and 'Digital Brain', myself, but the rest are all my own creations. I wrote the Camo, Marble, and Tech effects to get my bearings in the world of making shaders, then used similar techniques to create the other effects.


'Marble'
'Digital Brain'
'Lumpy'
'Planet'
'Moon'
'Bricks'
Marble

Code Here

This is one of the simpler effects. Again, I didn't come up with this myself, but have seen it used in numerous places. The meat of the effect is the calculation of the value:

float v = ( 1 + sin( ( pos.x + nnoise(pos, _Factor) ) * 50 ) ) / 2;

This value is used to lerp between two colors. The inputs to the sin function are the x position (which always increases) and the noise value at that point (which is essentially random). This creates a sort of 'grain' in the surface. This effect might be good for creating a bit of regularity in a surface, such as the grain of wood, or the layers in sandstone.
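
The blend itself is a one-liner (with hypothetical color properties):

    // v=0 gives the first color, v=1 the second; values between blend smoothly
    o.Albedo = lerp(_Color1.rgb, _Color2.rgb, v);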

Marble morphing
Digital Brain

Code Here

This is a bit of a different effect. It's also transparent, so it writes to the o.Alpha output field, and is compiled with the 'alpha:fade' pragma.

It's a pretty simple effect, just a bunch of layers of voroni noise, but the neato part is the 'electrons' moving across the 'wires'. On octaves deeper than the first, there's an additional sample of the noise function, used to animate the electrons. This extra field is panned across the first field, which is what makes the electrons move. Only a small section of the field is animated this way (sample values between certain thresholds), so the electrons are confined to the 'wires' instead of the 'cells'.
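
A sketch of the panning idea (the names, thresholds, and the exact voroni call here are hypothetical; _Time is Unity's built-in shader time):

    // pan an extra noise field across the first one over time
    float e = voroni(pos + _Time.y * _PanSpeed * _PanDir);
    // only a narrow band of sample values lights up, so the 'electrons'
    // stay confined to the 'wires' between cells
    float electrons = smoothstep(0.48, 0.50, e) * (1.0 - smoothstep(0.50, 0.52, e));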

This is the last effect I used someone else's work on. The original effect I based this on can be found Here. It's one of the effects that inspired me and got me writing these effects in the first place, and even after all this time (and especially now that I understand exactly how it works), I still find it a really cool effect.

Digital Brain
'Lumpy'/Stones

Code Here

Another variant Here

This effect turned out way better than I expected. I used a parallax technique, where the height of the surface is controlled by a Voroni cellular noise. The parallax calculation is fairly simple for the visual interest it adds. It's included in the 'procheight.cginc' file:

Excerpt from procheight.cginc

inline float3 parallax3d(Input IN, float h) {
    const float3 nrm = normalize(IN.wNormal);
    // scale the height sample and shift it by the bias
    const float hv = h * _Parallax - _Parallax * _Bias;
    const float3 eye = normalize(IN.viewDir);
    // project the eye vector onto the surface's tangent plane
    float3 dir = eye - nrm * dot(eye, nrm) / dot(nrm, nrm);
    // offset against the projected eye direction, scaled by the height
    return hv * -dir;
}

That is used to offset the sample for the actual texture. The eye direction (dir) on the last line can be flipped to taste; neither direction looks perfect, as this technique works much more simply on 2-dimensional textures, and not as well on 3-dimensional ones. Parallax calculations in both 2d and 3d rely on projecting the eye vector onto the surface and offsetting the texture sample in that direction by some distance, based on the height sample.

The noise field is then re-sampled at the offset position, and the result is used to calculate the albedo and normals; it is also applied to the 'Glossiness' and 'Metallic'ness of the pixel. I added another feature (_Polish) which varies how much 'Glossiness' and 'Metallic'ness the pixel has based on its 'height'.
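
Put together, the flow looks roughly like this (a sketch; 'heightOf' stands in for whatever noise builds the height field, and '_Glossiness' and the exact use of '_Polish' are stand-ins):

    float h = heightOf(IN.worldPos);          // first sample, just for height
    float3 off = parallax3d(IN, h);           // view-dependent offset
    float n = heightOf(IN.worldPos + off);    // re-sample at the offset position
    o.Smoothness = _Glossiness + _Polish * n; // 'higher' pixels get more polish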

Not being content, I took a technique from the previous 'Digital Brain' effect, and added another set of samples of the voroni noise function, this time 3 of them, at different frequencies. Unlike in Digital Brain, this is applied across the entire space, rather than just along the 'circuits'. Then, the samples are panned and blended, creating an animated, liquid-covered texture.

This effect comes out of the 'water' samples affecting the height (and thereby, the parallax of any pixel), as well as a bit of the color. This gives the 'lower' regions between the stones some color. One cool side effect of the way this is done is that the panning happens in whatever space the texture is in (world or local). When using worldspace, the water always flows in a fixed worldspace direction (as shown in the webm below).

I then modified this water texture to be more like a slime-covered rock, panning at a slower speed, with less high-frequency noise blended in.

Stone Morph
Running Water
Slimy/Fleshy Version
Planet

Code Here

This one is another adapted 2d texture, but one I had made earlier when experimenting with 'mode7'-like projections.

My original effect can be found Here

The original effect sampled the noise function a few times for 'height' and 'moisture' values, then, like the camo effect, 'clipped' the texture into different regions, but based on the combination of the two values.

This adaptation works much the same way, but with improved surfaces, and with the 'height'/'moisture' values adjusted based on the distance from the 'equator' (that is, to the x/z plane).
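
A sketch of the region selection (the thresholds, offset, and colors are hypothetical):

    float height = nnoise(pos, _Factor);              // terrain height field
    float moisture = nnoise(pos + _Offset, _Factor);  // second, independent field
    // pick a region color from the combination of the two values
    fixed3 col = height < _SeaLevel   ? _WaterColor.rgb
               : moisture > _WetLevel ? _GrassColor.rgb
               :                        _SandColor.rgb;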

Planet Morphing
Moon

Code Here

This one was very interesting to write, and looks very convincing until you get right up close to it. It's a simple Euclidean-distance voroni noise, filtered and processed in such a way that not every cell becomes a crater. Then, the edges of the craters are adjusted so they are smooth and rounded, with a slight lip around the edge. And, of course, the craters are fractalized, and the shader uses the same height-parallax technique the 'Lumpy' shader does.

Going into more detail, the craters are determined by the distance to the closest point. That distance is applied to a curve, roughly as follows:

[Figure: a curve of Value (Height) against Distance, both from 0 to 1: near-zero height close to the feature point, a lip just outside the crater, flat further out]

There is some smoothing applied to this curve within the shader, but the effect is that only positions very close to feature points become craters, there's a nice lip at points just outside of those craters, and points far from any feature point stay flat.
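
A sketch of such a curve in code (the constants and shaping are hypothetical):

    inline float craterHeight(float d) {
        // deep bowl close to the feature point, rising to the rim
        float bowl = smoothstep(0.0, _CraterRadius, d);
        // a raised lip that fades out just past the rim
        float lip = 1.0 - smoothstep(_CraterRadius, _CraterRadius + _LipWidth, d);
        // far from any feature point, this settles at a flat height of 1
        return bowl + bowl * lip * _LipHeight;
    }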

The height parallax then adds to the illusion that there is depth on the surface.

Moon Morphing
Bricks

Code Here

This is likely the most complicated shader I have written so far. There's a ton of parameters for all sorts of things, such as the size of the bricks, offsets per 'layer', factors for 'relaxing' the noise, blending different noise layers, and changing the border size between the tiles.

The basic effect works by this process:

  1. Take the sample point and determine the 3d cell it resides within (sketched below).
  2. Blend between 'grout' and 'brick' based on the distance from the sample point to the cell edges.
  3. Sample the 'texture' for a height value, and offset the sample point with parallax.
  4. Re-determine the cell, the 'grout'/'brick' blend, etc.
  5. Sample at the offset point for the color and normal.
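
A sketch of step 1 (the names here are hypothetical):

    // which 3d 'brick' cell does this point fall in?
    float3 cellOf(float3 p) {
        p /= _BrickSize;                  // per-axis brick dimensions
        p.x += floor(p.y) * _LayerOffset; // shift alternating layers, like real brickwork
        return floor(p);                  // integer cell coordinates
    }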

Then, I decided to apply the water texture on top of the bricks, much like in the 'Lumpy' effect. This is applied the same way (once for the height/parallax, and once for the color).

Bricks Morphing
Water on Bricks

Here are the shaders applied to some models that were graciously released into the public domain by 'Yughues' of Open Game Art:

'Asteroids' using 'Planet' and 'Moon' textures
'Wood Crates' using 'Marble' texture