
Monday, May 9, 2016

Applying textures to voxels

When I look back at the evolution of polygon-based content, I see three distinct ages. There was a time when we could only draw lines or basic colored triangles:

One or two decades later, when memory allowed it, we managed to add detail by applying 2D images along triangle surfaces:

This was much better, but still quite deficient. What typified this brief age is that textures were not closely fitted to meshes. This was a complex problem: textures are 2D objects, while meshes live in 3D. Somehow the 3D space of the mesh had to be mapped into the 2D space of the texture. There was no single, simple analytical solution to this problem, so the mapping had to be approximated by a handful of preset cases: planar, cylindrical, spherical, etc.
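Those preset cases are easy to write down. Here is an illustrative Python sketch of the three classic projections, each taking a 3D point to a UV pair (my own sketch, not engine code):

```python
import math

def planar_uv(p):
    # Project along Z: simply drop the z coordinate.
    x, y, z = p
    return (x, y)

def cylindrical_uv(p):
    # Wrap around the Y axis: angle becomes U, height becomes V.
    x, y, z = p
    u = (math.atan2(z, x) / (2 * math.pi)) % 1.0
    return (u, y)

def spherical_uv(p):
    # Longitude/latitude of the direction from the origin.
    x, y, z = p
    r = math.sqrt(x * x + y * y + z * z)
    u = (math.atan2(z, x) / (2 * math.pi)) % 1.0
    v = math.acos(y / r) / math.pi
    return (u, v)
```

Each projection only works well for shapes that resemble its projection surface; anything else distorts, which is exactly why these mappings were eventually replaced by explicit per-vertex UVs.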

With enough time, memory constraints relaxed again. This allowed us to store the 3D-to-2D mapping as an additional set of coordinates in the mesh itself. That brought us into the latest age: UV-mapped meshes. It is called UV mapping because the two texture-space axes are named U and V: just as XYZ labels coordinates in 3D space, UV labels coordinates in texture space. This is how Lara Croft got her face.
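With explicit UVs, every vertex carries a texture coordinate pair alongside its position, and the rasterizer interpolates those coordinates across each triangle. A minimal sketch of that interpolation (a hypothetical helper, not from any engine):

```python
def interpolate_uv(tri_uvs, bary):
    # Blend the three vertex UVs with barycentric weights (w0, w1, w2),
    # as a rasterizer does for every pixel inside a triangle.
    (u0, v0), (u1, v1), (u2, v2) = tri_uvs
    w0, w1, w2 = bary
    return (w0 * u0 + w1 * u1 + w2 * u2,
            w0 * v0 + w1 * v1 + w2 * v2)
```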

We currently live in this age of polygon graphics. Enhancements like normal maps, or other maps used for physically based rendering, are extensions of this base principle. Even advanced techniques like virtual texturing or Megatextures still rely on this.

You may be wondering why this is relevant to voxel content. I believe voxel content is no different from polygon content when it comes to memory restrictions, so it should go through similar stages as those restrictions relax.

The first question is whether it is necessary to texture voxels at all. Without texturing, each voxel needs to store color and other surface properties individually. Is this feasible?

We can look again to the polygon world for an answer. The equivalent question for polygon content would be: can we get all the detail we need from geometry alone, going Reyes-style and relying on micro-geometry? For some highly stylized games, maybe, but if you want richer, realistic environments this is out of the question. In the polygon realm this also touches on unique texturing and megatextures, as in idTech5 and the game Rage. Megatextures are a more efficient approach to storing a unique color per scene element, but still not efficient enough to compete with traditional texturing. The main reason is that storing unique colors for entire scenes was simply too much: it led to huge game sizes while the perceived resolution remained low. Traditional texturing, on the other hand, reuses the same texture pixel many times over the scene. This redundancy cuts the required information by an order of magnitude, often at no perceivable cost.
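The order-of-magnitude claim is easy to sanity-check with back-of-envelope numbers. The figures below are my own illustration, not data from Rage or any shipped game:

```python
# Hypothetical scene: 1000 m^2 of visible surface at 512 texels per meter.
scene_area_m2 = 1000
texels_per_m = 512
unique_texels = scene_area_m2 * texels_per_m ** 2   # megatexture-style storage

# Traditional texturing: say 20 distinct 1024x1024 materials, tiled everywhere.
tiled_texels = 20 * 1024 * 1024

ratio = unique_texels / tiled_texels  # roughly an order of magnitude
```

Any reasonable choice of numbers lands in the same ballpark: unique texels per scene cost tens of times more storage than a small set of tiled materials.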

Unique geometry and surface properties per voxel are no different from megatextures. They are actually slightly worse: the geometry is also unique, and polygons compress surfaces much more efficiently than voxels do. With that in mind, I think memory and size constraints are still too tight for untextured voxels to be competitive. So there you have the first voxel content age, where you still see large primitives and flat colors, and size constraints won't allow them to become subpixel:

(Image donated by Doug Binks @dougbinks from his voxel engine)

The second age is basic texturing. Here we enhance the surface detail by applying one or more textures. The mapping approach of choice is tri-planar mapping, and this is how Voxel Farm has worked until now. It is sufficient for natural environments, but still not there for architectural builds. You can get fairly good-looking results, but it requires attention to detail and often additional geometry:

In this scene (from Landmark, using Voxel Farm) the pattern in the floor tiles is made out of voxels. The same applies to table surfaces. These are quite intricate and require significant data overhead compared to a texture you could just fit to each table top, for instance, as you would for a normal game asset.
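Tri-planar mapping needs no stored UVs at all: the texture is projected along the three world axes and the three samples are blended by the surface normal. A minimal sketch of the technique (the exact weighting Voxel Farm uses is not described in this post):

```python
def triplanar_weights(normal):
    # Sharpen the absolute normal and normalize, so surfaces facing an
    # axis are dominated by that axis' projection.
    wx, wy, wz = (abs(c) ** 4 for c in normal)
    s = wx + wy + wz
    return (wx / s, wy / s, wz / s)

def triplanar_sample(texture, p, normal):
    # Sample the same 2D texture through the three axis-aligned
    # projections and blend the results.
    x, y, z = p
    wx, wy, wz = triplanar_weights(normal)
    return (wx * texture(y, z)     # projection along X
            + wy * texture(x, z)   # projection along Y
            + wz * texture(x, y))  # projection along Z
```

A flat floor (normal pointing up) gets essentially all of its weight from the top-down projection, which is why tri-planar mapping looks so natural on terrain yet washes out the deliberate alignment an artist would want on a table top.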

We saw it was time for voxels to enter the third age. We wanted voxel content that benefited from carefully created and applied textures, but also kept the typical advantages of voxels: five-year-olds can edit them, and they allow realistic realtime destruction.

The thing about voxels is, they are just a description of a volume of space. We tend to think about them as a place to store a color, but this is a narrow conception. We saw that it was possible to encode UV coordinates in voxels as well.
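Conceptually the change is small: the voxel payload grows from a color to a material plus a texture coordinate pair. A hypothetical layout, for illustration only (the actual Voxel Farm storage format is not described here):

```python
from dataclasses import dataclass

@dataclass
class Voxel:
    # A voxel describes a small volume of space; the payload can be
    # anything we choose to store, not just a color.
    material: int     # material id, as in the untextured ages
    u: float = 0.0    # texture-space coordinates captured
    v: float = 0.0    # during voxelization

vox = Voxel(material=3, u=0.25, v=0.75)
```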

What came next is not for the faint of heart. The levels of trickery and hackery required to get this working in a production-ready pipeline were serious. We had to write voxelization routines that captured the UV data with no ambiguities. We had to make sure our dual contouring methods could output the UV data back into triangle form. The realtime compression now had to be aware of the UV space, while remaining fast enough for realtime use. And last but not least, we knew voxel content would be edited and modified in all sorts of cruel ways. We had to understand how the UV data would survive (or not) all these transformations.

After more than a year working on this, we are pleased to announce this feature will make it into Voxel Farm's next major release. Depending on the questions I get here, I may go into more detail about how all this works. Meanwhile, enjoy a first dev video of how the feature works:


  1. The UV info had to be placed manually on each internal voxel? Cross projection from the surface, or some kind of black magic?

    When you started to smooth the edges I was like "What?".

    But the texture blending between different solid materials still bugs me a lot. Like between the grass and the stone tiles.

    1. Most impressive Miguel. :D
      I wonder if Landmark will buy into the next release?

    2. Thanks, the internal voxels do not have UV info, but a regular material that is exposed when the surface voxels are gone. Only the surface voxels have UVs. Since a single voxel can be quite thick, it takes a bit for the UV data to erode completely. I believe the results are nicely organic.

      About the blending: it is something you can set per material. We should have been more careful with the material settings for the video; we were probably only looking at the UVs :)
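The UV fallback described in this reply can be sketched as follows. The names are hypothetical; only the behavior comes from the reply: surface voxels carry UVs, and interior voxels expose a plain material once the surface erodes away.

```python
from collections import namedtuple

# uv is a (u, v) pair for surface voxels, None for interior voxels.
Voxel = namedtuple("Voxel", ["material", "uv"])

def visible_color(column, atlas, material_colors):
    # Color of the outermost remaining voxel in a column: prefer the
    # authored texel; once erosion removes the UV-carrying surface
    # voxel, the plain material underneath shows through.
    top = column[0]
    if top.uv is not None:
        return atlas(*top.uv)
    return material_colors[top.material]
```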

  2. Amazing! I assume you can generate a textured-voxel "prefab" from a traditional UV-mapped polygonal source model. Any chance you can go in the other direction, exporting a uv-mapped mesh from the voxel world?

    I'm very interested to learn more about how you achieved this effect! Although I can't think of any poignant questions at the moment :)

    1. Yes, it is possible to export a UV-mapped mesh from the voxel world. Check out this older post:

  3. The main thing I thought when I saw you smoothing that wall was: "Procedural building erosion." Be it in the pre-game generation, or in the actual game, either way it's exciting stuff.

    I don't quite know enough about this to get properly excited about the stuff you guys are doing here, but I'm convinced it's grand!

    1. Yes, I too was excited when we finally got the smoothing (erosion) to work with the UVs. Like you say, weathering or maybe even consumption by fire would be the first things to try.

    2. Oh yea! You could make a realistic campfire! With actual disintegrating logs and stuff! (Or an entire house, if you're feeling destructive =P)

    3. ... and different materials would be consumed at different rates, or not consumed at all. For extreme weathering, concrete will disappear at a faster rate than steel. This would leave the structural beams inside a house exposed for a while before the beams begin to erode.

    4. I would not be surprised if people would pay for a... game(?)... where you build structures and then subject them to several types of weathering, be it fire, air, water, something else?

  4. So you mentioned it being a part of dual contouring. Do you generate the UV at the same time that you generate the intersection point of a voxel?

  5. How did you manage to generate UVs procedurally without any distortion? I don't know if I'm just naive, but as far as I have seen, most UV mappers end up either distorting the texture slightly or duplicating vertices.
    Your work is amazing! I can't wait to see more ingenuity and additions to the engine.

