Good-looking randomization for procedural buildings

Learn how to add variation to a modular building. Create a procedural material in Unreal, for both static and instanced meshes

Reading time: 11 minutes
Last edited: Aug 27, 2019


Constructing a procedural building with Blueprint is a tempting idea. Using standardized modules and automatic placement makes a lot of sense. It’s architecture, after all. But how can we texture it to get a natural variation instead of repetition?

The building was made from just 1 module, copied automatically in the Construction Blueprint. The idea for the material was to require almost no manual input. There is only 1 material for the entire building (except the windows). Its features use vertex color and the pixel's world position to drive all the randomization.

A single module – it’s all we need
No manual placement or scripting. All the randomization happens in the material

The material I describe in this tutorial:

  • Has a height-based dirt layer that covers the object only up to the specified absolute height
  • Selects the color of items randomly for every floor and segment
  • Offsets the position of small items slightly, also in a random manner
  • Allows the user to pick 2 colors for the walls, as well as the amount of damage

You can download the project files for free (or a donation, if you want):

Get project files

Vertex color as data ( 03:34)

In addition to a mesh’s vertex positions and normals, game engines usually provide access to additional values, like the vertex color. When shading a triangle of a mesh, the color is interpolated between the vertices. You can either paint it in 3D editing software or use the Paint tab in UE. Doing it in the engine lets you customize a single instance of a mesh in the world. This is a use case I explained in the vertex painting tutorial. Here, however, I stick to the imported color. That’s because I use the RGB channels as precise masks that drive randomization.

Remember that a color in 3D graphics is just a three-component vector. Its components are the brightness of red, green and blue, in the range of 0 – 1. The meaning is arbitrary, as vertex color is just data. Instead of using it directly for colorizing the textures, I decided to pack masks into each vertex color channel:

  • Red channel – a mask between the primary and the secondary color of wall paint. Polygons with the value 0 use the primary color, 1 means the secondary color.
  • Green channel – used for picking a color from a palette. This drives the color variation of smaller items, like the drying clothes and socks. A value between 0 and 1 is rounded to an index (UV position) on a palette texture.
  • Blue channel – vertex position offset, for moving the vertices horizontally. It means that 0 should be used for walls (no movement), while a value up to 1 can be assigned to an air conditioner or to the clothes. This channel also drives visibility (opacity mask). If the value is greater than 0, a random value (different per building segment) is added for variety.
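To make the packing concrete, here is a rough Python sketch (not part of the material – the function and variable names are mine) of how the shader interprets the three channels. The 8-pixel palette width matches the palette texture described later:

```python
def decode_vertex_color(r, g, b, palette_width=8):
    """Interpret the packed vertex color channels (each in 0-1)."""
    # R: blend factor between primary (0) and secondary (1) wall color
    wall_blend = r
    # G: rounded to a column index on the horizontal palette texture
    palette_index = min(round(g * (palette_width - 1)), palette_width - 1)
    # B: position offset strength; anything above 0 is also "movable"
    offset_strength = b
    movable = b > 0.0
    return wall_blend, palette_index, offset_strength, movable
```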

Every major 3D editor provides a feature for painting vertex color. You should be able to easily find instructions for your tool of choice. Just remember to tell Unreal Engine to Replace vertex color (instead of Ignore) in the mesh import window.

As I mentioned, the color in the case of this project is a collection of precise values. For a task like that – and virtually any technical task in game art – I prefer Houdini. However, a similar result can be achieved (just with more effort) in any 3D editing software. Just focus on what each color channel is supposed to mean for the shader.

I split the entire process of assigning values to vertex color channels into a separate tutorial: Store data in vertex color using Houdini. In the tutorial I make use of several clever tools in Houdini to make the process efficient.

Construction Blueprint ( 07:57)

The Blueprint I set up here is basic. It just creates a flat building wall by duplicating the mesh horizontally and vertically. Its editable variables are MeshWall (Static Mesh), NumberOfFloors (integer), NumberOfSegments (integer) and Material.

The result of the blueprint, 4 floors, 2 segments.

The entire process happens in the Construction Script, i.e. during level editing. This will make the resulting mesh behave like any other static object. For example, it will be taken into consideration when building lighting.

First, the mesh’s dimensions are measured. Bounds extents mean half the size of an object. We can calculate this once and store it, as the value will be the same for all segments.

The rest of the code happens in two loops. The outer loop creates entire floors, while inner loop creates the segments within the current floor. The location is calculated from the loops’ iteration indexes, multiplied by mesh width and height.
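In code form, the placement math of the two loops could look like this Python sketch (variable names are my own; the actual Blueprint works with vectors and spawns components instead of returning a list):

```python
def segment_locations(num_floors, num_segments, extent_x, extent_z):
    """Compute the local X/Z location of every wall segment.
    extent_x/extent_z are the bounds extents, i.e. half the mesh size."""
    width, height = extent_x * 2, extent_z * 2   # full module dimensions
    locations = []
    for floor in range(num_floors):              # outer loop: floors
        for segment in range(num_segments):      # inner loop: segments
            locations.append((segment * width, floor * height))
    return locations
```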

A new static mesh component is added for every segment. In some cases it may be useful to use instancing instead. Adding components leads to an increase in the number of draw calls, which may be problematic in heavier scenes.

That’s it. Setting the NumberOfFloors and NumberOfSegments will automatically update the building.

Two wall colors, masked by vertex color ( 21:16)

Both colors are exposed as parameters. The vertex color’s red channel is used as the blending factor between them.
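The blend itself is a plain per-channel Lerp; a minimal Python equivalent (illustration only, not the material's actual nodes) would be:

```python
def lerp(a, b, t):
    """Linear interpolation between a and b by factor t."""
    return a + (b - a) * t

def wall_color(primary, secondary, red_channel):
    """Blend two RGB wall colors using the vertex color's red channel."""
    return tuple(lerp(p, s, red_channel) for p, s in zip(primary, secondary))
```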

Channel packing, mixing ( 14:22)

We’d like the wall colors to only affect actual walls and spare the window frames and air conditioners. The damage patches should also be unaffected. This is done by packing a mask into the base color textures’ alpha channel. In other words, the base color textures have a transparent background – this is where the wall colors kick in.

By the way, I packed the metalness, roughness and occlusion textures – all grayscale – into the R/G/B channels of a single texture. It reduces the number of samplers, files and Lerps to just a third, which is a nice optimization for free. Find a tutorial about it in the footnotes.
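The packing itself is trivial; a Python sketch (with nested lists of 0-255 values standing in for the actual image files) could look like:

```python
def pack_grayscale_maps(metalness, roughness, occlusion):
    """Pack three same-sized grayscale images into one RGB image:
    R = metalness, G = roughness, B = occlusion."""
    return [
        [(m, r, o) for m, r, o in zip(m_row, r_row, o_row)]
        for m_row, r_row, o_row in zip(metalness, roughness, occlusion)
    ]
```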

Noise from a texture

Such a noise pattern is used to mix between the 1st (clean) and 2nd (damaged) sets of textures. Instead of calculating it in real-time, which is expensive for high-quality noise, I just load it from a texture. The WorldAlignedTexture node projects it from 3 sides (it generates UV coordinates procedurally).

Height-based dirt. The Remap function ( 18:00)

The dirt is a flat color, applied by a world-space gradient. The pixel world position’s Z component is remapped into a 0-1 range. This gives us a useful mask – a factor for Lerp. The original minimum and maximum (e.g. from 150 to 700 cm) are provided by the user as scalar parameters. A slight noise is added to make the transition more natural.

The TAA_Remap_01_Clamped function was made by me. I use it in almost every shader. It translates a value within the original range into the 0-1 range. Perfect for making masks out of a distance (be it from the camera, from the ground or even for shapes in the UV space).
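Stripped of the material-graph plumbing, the function is just a linear remap followed by a clamp. A Python equivalent (the actual material function's internal node layout may differ) is:

```python
def remap_01_clamped(value, in_min, in_max):
    """Map value from [in_min, in_max] into [0, 1], clamped at both ends."""
    t = (value - in_min) / (in_max - in_min)
    return max(0.0, min(1.0, t))

# Example: height-based dirt mask between 150 and 700 cm (values from the text)
dirt_mask = remap_01_clamped(425.0, 150.0, 700.0)
```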

Remap, the most useful function in the world

Randomizing colors, hiding elements ( 22:55)

Deriving a random value from the segment pivot’s position allows us to shift the color palette for the small items. The palette is a texture with colors aligned horizontally, so moving the UVs used to read it changes the resulting color.

The palette texture, scaled up. Normally it’s 8×1 pixels, with compression turned off

The vertex color’s green channel serves as a randomization value and a mask. The value of 0 is treated as “don’t apply the color here”.
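A possible Python sketch of the whole lookup. The pivot-based hash here is my stand-in for whatever per-segment noise the material actually uses, and the 8-pixel width matches the palette texture above:

```python
import math

def palette_uv(green, segment_pivot_x, palette_width=8):
    """Return the horizontal UV to sample the palette at, or None when the
    green channel is 0 ("don't apply the color here")."""
    if green <= 0.0:
        return None
    # Cheap per-segment pseudo-random value in 0-1, constant for one segment
    rand = (math.sin(segment_pivot_x * 12.9898) * 43758.5453) % 1.0
    u = (green + rand) % 1.0            # shift and wrap around the palette
    return (int(u * palette_width) + 0.5) / palette_width  # pixel centers
```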

Applying local position offset ( 27:08)

The final feature is moving the items on the X axis, randomly. It works by adding a material-controlled position offset to the vertices. There are 2 things to watch out for with this trick – complex collision (for shooting) and distance fields. Both are unaware that such offset happened.

I take the blue channel again, then add a random number – the same segment position-based noise we used before. Let’s remap it from 0-1 to between -0.5 and 0.5, to make the movement happen in both directions. Then we multiply it by the PositionOffsetStrength. The Append node will add the remaining axes (constant 0 in Y and Z).

What may surprise you is that Unreal expects an offset in world coordinates, as the output. What we calculated is local position. So how do we convert it?

We can do it by changing the space of this new local position into world space using a Transform node. Then I subtract the original world position of the vertex from this new position – getting a world offset instead of a position in return. Connect it to the World Position Offset material output and it’s done.
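As a sanity check of that math, here is a Python sketch (rotation about Z only, for brevity; the engine's Transform node handles the full actor transform):

```python
import math

def world_position_offset(local_offset_x, actor_yaw_deg):
    """Convert a local X offset into the world-space offset expected by the
    World Position Offset output. Since (world_new - world_old) cancels the
    translation, only the rotation of the offset vector remains."""
    yaw = math.radians(actor_yaw_deg)
    return (local_offset_x * math.cos(yaw),
            local_offset_x * math.sin(yaw),
            0.0)
```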

Final material

I hope you learned something new from the tutorial. Here’s a screenshot of the entire material node graph:

Project files and discussion

You can download the project files for free (or a donation, if you want):

Get project files

If you have feedback or questions, join our discussion:

Discuss on Reddit

Learn more

  • Easy Way To Pack Textures Into RGB Channels — How to save 3 grayscale textures into channels of a single RGB image, using just Photoshop. Packing saves space and, most importantly, retrieves 3 textures with a single read. This is worth the effort because reading textures from memory is one of the most time-consuming operations on the GPU.

Big thanks to the reviewers of this tutorial: Mikołaj Fabjański, Alexander Gajek