
Friday, June 30, 2017

Unity versus Unreal

This topic is as divisive as the US 2016 presidential election, so I'll tread carefully.

For a middleware maker, it makes no sense to have favorites. We do our best to keep our integrations of Voxel Farm on par so we reach as many users as possible. As an individual, I see no problem stating that I prefer Unreal, but that is only because it is an all-C++ environment. It is not a rational preference.

This post, however, is not about how I feel. It is rather about the state of the two engines and how much they facilitate procedural generation and working with voxel data. I think many of the issues we have encountered over the past few years are common if you are doing a similar type of work with these engines. Hopefully, our story can help.

Let's start with the visuals. Both Unity and Unreal are capable of rendering beautiful scenes. Both can also render at very high frame rates, even for fairly complex content. This has likely been the lion's share of their R&D for years now. Unity has one crucial advantage over Unreal: it natively supports texture arrays. Unreal almost supports them; in fact, we managed to make them work in a custom branch of UE4 with little effort. However, this is not possible with the out-of-the-box Unreal distribution, and that is a dealbreaker if your middleware is to be used as a plugin, as is our case.

Texture Arrays in Unity allow precise filtering and high detail

Texture arrays make a big difference if you need complex materials where many different types of surfaces must be splatted in a single draw call. When an engine lacks texture array support, you must fall back to 2D atlasing. This raises a whole host of issues, like having to pick mip levels yourself and wasting precious memory padding textures to avoid bleeding. When you hit this low point, you begin to seriously question your career choices.
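To make the atlasing pain concrete, here is a minimal sketch (plain Python, not engine code; all names are illustrative) of the bookkeeping an atlas forces on you: every tile's UVs must be remapped past a padding gutter, and that gutter is pure memory overhead.

```python
# Illustrative sketch of 2D atlasing bookkeeping, the workaround needed when
# texture arrays are unavailable. Parameters are hypothetical example values.

def atlas_uv(tile, u, v, tiles_per_row=4, tile_px=256, pad_px=8):
    """Map a local (u, v) in [0,1] into atlas space, offset past the padding
    (gutter) that guards against bleeding when mip levels average neighbors."""
    cell_px = tile_px + 2 * pad_px           # each tile plus its gutter
    atlas_px = tiles_per_row * cell_px       # square atlas, row-major tiles
    col, row = tile % tiles_per_row, tile // tiles_per_row
    au = (col * cell_px + pad_px + u * tile_px) / atlas_px
    av = (row * cell_px + pad_px + v * tile_px) / atlas_px
    return au, av

def padding_overhead(tile_px=256, pad_px=8):
    """Fraction of atlas memory spent on gutters instead of useful texels."""
    cell = tile_px + 2 * pad_px
    return 1.0 - (tile_px / cell) ** 2
```

With 8 pixels of padding around 256-pixel tiles, more than 11% of the atlas is wasted on gutters; with a texture array, both functions simply disappear.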

If your application uses procedural generation, the contents of the scene are likely not known while the application is in design mode. This is at odds with how these engines have evolved to work, and if your application allows users to change the world, it only gets worse. For the most part, both engines expect you to manage new chunks of content in the main thread, which, if left unattended, can cause severe spikes in your framerate.
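A common mitigation, sketched below in plain Python (the class and names are illustrative, not engine API): instead of applying every new chunk the moment it arrives, queue the main-thread work and drain the queue under a per-frame time budget, so a burst of generated content is amortized over several frames instead of spiking one.

```python
import time
from collections import deque

class ChunkUploadQueue:
    """Amortize main-thread chunk work (mesh/collider creation) over frames."""

    def __init__(self, budget_ms=2.0):
        self.budget = budget_ms / 1000.0   # per-frame time budget in seconds
        self.pending = deque()

    def enqueue(self, upload_fn):
        """Called from the generation side; upload_fn runs on the main thread."""
        self.pending.append(upload_fn)

    def pump(self):
        """Call once per frame from the main thread; returns uploads completed.
        Stops as soon as the frame budget is spent, leaving the rest queued."""
        start, done = time.perf_counter(), 0
        while self.pending and time.perf_counter() - start < self.budget:
            self.pending.popleft()()       # e.g. build one mesh or collider
            done += 1
        return done
```

The generation itself can run on worker threads; only the final, engine-facing step goes through the queue.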

There are multiple aspects involved in maintaining a dynamic world. First, you must update the meshes used to render the world. This is fairly quick in both engines, but it does not come free. Then, you must create collision models for the new geometry. Here Unreal does better: since you have closer access to the PhysX implementation, you can submit a much simpler version of the content. In Unity, you may be stuck using the same geometry for rendering and collision. (EDIT: I was wrong about this, see the comments section.) From reading their latest update, I see this motivated the Card Life developers to ditch PhysX collisions altogether.

Card Life, made in Unity, features a hi-res voxel world
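The "submit a much simpler version of the content" idea above can be sketched in a few lines. One cheap way to simplify a voxel chunk for collision (illustrative only; this is not what any particular engine does internally) is to max-pool its occupancy grid before contouring the collider, trading contact precision for far fewer collision faces:

```python
# Sketch: downsample a 3D boolean occupancy grid before building a collider.
# Max-pooling keeps a coarse cell solid if ANY of its fine cells are solid,
# so the simplified collider never shrinks inside the visible walls.

def downsample_occupancy(grid, factor=2):
    """grid is a nested list grid[x][y][z] of booleans; dimensions must be
    divisible by `factor`. Returns the coarse occupancy grid."""
    nx, ny, nz = len(grid), len(grid[0]), len(grid[0][0])
    out = [[[False] * (nz // factor) for _ in range(ny // factor)]
           for _ in range(nx // factor)]
    for x in range(nx):
        for y in range(ny):
            for z in range(nz):
                if grid[x][y][z]:
                    out[x // factor][y // factor][z // factor] = True
    return out
```

Halving the resolution in each axis cuts the collider's surface area bookkeeping roughly by four, which matters when PhysX bakes the model on the main thread.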

Voxel Farm allows players to cut arbitrary chunks of the world, which then become subject to physics. Unity was able to take fragments of any complexity and properly simulate physics for them. Unreal, on the other hand, would model each fragment as a cube. Apparently, PhysX cannot compute convex hulls on its own, so for any object subject to physics you must supply a simplified model. Unity appears to create these on the fly. For Unreal, we had to plug in a separate convex hull generation algorithm. Only then could we get the ball rolling, literally.
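The hull algorithm we actually plugged in works in 3D and is more involved; the 2D Andrew monotone chain below is a minimal sketch of the core idea, which is the same: wrap the fragment in a convex proxy with far fewer vertices, which is all a physics engine needs for a dynamic body.

```python
# Andrew's monotone chain convex hull (2D sketch of the 3D problem).
# Runs in O(n log n) due to the sort.

def convex_hull(points):
    """Return the convex hull of 2D points in counter-clockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z-component of (a - o) x (b - o); <= 0 means a clockwise turn
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:                     # build lower hull left to right
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):           # build upper hull right to left
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]    # endpoints shared, drop duplicates
```

Interior vertices of the fragment vanish from the proxy, which is exactly what keeps the physics step cheap.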

When it comes to AI and pathfinding, both engines appear to use Recast, a third-party navigation mesh library. Recast uses voxels under the hood (go voxels!), but this aspect is not exposed by its interface. For a voxel system like ours, it is a bit awkward to submit meshes to Recast, which are then voxelized again and ultimately contoured back into navigation meshes. But this is not bad, just messy. There is one key difference between Unreal and Unity here: Unreal will not let you change the scope of the nav-mesh solution in real time, meaning you cannot have the nav-mesh scope follow the player across a large open world. This is unfortunate, since it is a tiny correction if you can modify the source code, but again, for a plugin like Voxel Farm that is not an option.

Dynamic nav-mesh in UE4
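The "nav-mesh scope follows the player" idea is simple enough to sketch on a grid (a toy stand-in for Recast's tiled nav-mesh; all names here are illustrative): rebuild walkability only inside a window centered on the player, so an unbounded world never needs a world-sized nav-mesh.

```python
# Toy sketch of a sliding nav-mesh scope on a 2D grid. A real implementation
# would rebuild Recast tiles entering the window and drop tiles leaving it.

def navmesh_window(solid_cells, player, radius):
    """Return the walkable cells in a (2*radius+1)^2 window around the player.
    `solid_cells` is a set of (x, y) grid cells occupied by geometry."""
    px, py = player
    return {(x, y)
            for x in range(px - radius, px + radius + 1)
            for y in range(py - radius, py + radius + 1)
            if (x, y) not in solid_cells}
```

As the player moves, only the cells entering and leaving the window need recomputation, which is why this is such a small correction when you can touch the engine source.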

This brings me to the last issue in this post: Unreal is open source, while Unity is closed. As a plugin developer, I am surprised to find that a closed-source system may be more amenable to plugin development. Here is my rationale: so far, the open-source model has been great for discovering why a given feature will not work in the official distribution. You can clearly see the brick wall you are about to hit. For application developers, open source works better because you can always fork the engine code and remove the brick wall. The problem is that this takes the pressure off, and the brick wall stays up for longer. In Unity, both application and middleware developers must use the same version of the engine. I believe this creates an incentive for a more complete interface.

I'm sure there is more to add to this topic. There are some key aspects we still need to cover for both engines, like multiplayer. If you find any of our issues to be unjustified, I would love to be proven wrong, for the betterment of our little engine. Just let me know by dropping a comment.

12 comments:

  1. Thanks for the insight. Does the fact that Epic occasionally merges pull requests offset the negative pressure of open source? And in what way is the branch requirement for texture arrays a deal breaker? Is it not reasonable to make a branch available which enables these features?

    1. It is hard for me to define what "occasionally" should be in this case. For the issues we depend on, there has been little movement; some of these branches have been around for a long time now. Some bits of these branches have inspired changes in the official release, but at the moment I do not recall a single entire pull request being merged. I could be wrong here; we should really do a spectral analysis (or something like that) before making any claims.

      For a plugin it is a deal breaker because it prevents the use case where your product is obtained from the asset store and just works. It is clear ease of use is paramount in gamedev today; just see how much Epic has invested in their Blueprint system. If you suddenly require users to use git to obtain a custom version of the engine and compile it in Visual Studio, it will be too much work, or too difficult, for a large section of your potential user base.

      And the biggest issue is that this model does not scale. Now imagine you need another three pieces of middleware like Voxel Farm in your project. Each one comes with its own branch. You will need to get four branches, merge them, pray they do not step on each other's toes, and repeat this process every time there is a new UE4 release or a new release of any of your four middleware pieces.

    2. They merge pull requests a *lot*.

      As for a multi-way merge: git cherry-pick. Not as hard as you might expect; however, it's not plug and play.

      When making plugins, I've rarely (once ever?) encountered an issue in UE4 I couldn't work around with clever code. Source code over black box has been a big win, at least over here.

  2. Have you tried contacting Epic about these issues? Don't know if it'll help, but it might =P.

    1. I believe others have done it, if you follow the forums. For the most part we have just encountered these issues. It was only two releases ago that Voxel Farm became a plugin for UE.

      If you look at the pace of UE4's development, they have covered a lot of ground very quickly. It is likely they will turn their attention to procedural and user-generated content in the near future.

      Also, being able to circumvent these issues is a competitive advantage for us. If the tax code were straightforward, we would not need accountants. I am not complaining.

  3. FWIW you can definitely use a different mesh for the collider than the renderer in Unity: PhysX uses whatever is in MeshCollider.sharedMesh, and the MeshRenderer uses MeshFilter.sharedMesh.

    1. Thanks for the tip! Our Unity integration team is out for the Canadian holiday, as soon as they are back I will bring this to their attention.

    2. I just asked, and I was completely wrong. This is not an issue for us at all, we already submit different meshes for collision. The issue is this still happens in the main thread and PhysX can take unpredictable time to bake the collision model.

    3. Splitting big meshes into a bunch of smaller ones usually helps. This is ugly as hell, but it is the only working solution to the problem at the moment, as far as I know.

  4. A lot of Unity forums feedback about this entry, if you are interested:
    https://forum.unity3d.com/threads/unity-is-not-behind-ue4-when-it-comes-to-dynamic-procedural-content-and-may-be-ahead-in-some-areas.481332/

    1. Thanks, just looked at that thread. It seems I should not have used the words "open source" anywhere near UE4. Twitter was also very enthusiastic about this. My point was about the fact that you can read all of UE4's code and compile it, not about how that code is licensed to you. This post was about technical issues, nothing to do with their business models.

    2. Don't sweat that noise. Your points are 100% dead-on with my experience doing plugin work in UE. Anyone trying to nitpick is doing just that.

