My last two posts were about the flame pyro simulation in Houdini and the custom render pipeline I made for Unity. The flame was intended for a cinematic, but it felt like a pity not to have it in the game. So, bringing the Houdini pyro simulation into Unity will be the topic this time.
UNITY PARTICLE SYSTEMS
Unity has an old legacy particle system, as well as the newer VFX Graph. I started by trying out the legacy system. It is similar to most traditional particle systems, like Cascade in Unreal Engine, and as such is battle-proven technology used in many games. However, I did not want to start from scratch on a flame particle system when I had already made one in Houdini. The VFX Graph is more modern, but it is tightly integrated with the URP and HDRP render pipelines. So, I set out to create my own solution for integrating Houdini simulations with my custom render pipeline.
THE HOUDINI PYRO SIMULATION
While I was talking about particle systems above, the Houdini pyro simulation is not one. It is a volumetric simulation, involving density, heat, and temperature fields, represented as voxel data. The image below shows cross sections of the density, heat, and temperature fields, from left to right.
The density field is used when rendering smoke, while the heat and temperature are used for the emissive flame. I will focus on the flame for now. It might seem a bit confusing to have both heat and temperature rather than just temperature, but having both is correct. I'll clarify the difference below.
IMPORTING PARTICLES IN UNITY
Ok, so what about the particles? As I discussed in earlier posts, I spawned particles in the pyro volume to use as a complementary light source. Those same particles will serve well for the real-time version in the game.
The first thing I did was to export one frame of this point cloud as both FBX and OBJ. However, Unity's mesh importer cannot import pure point clouds in these formats; Unity expects a mesh of connected points. So, let's connect them. This can be achieved by squashing the points onto a flat plane, performing a Delaunay triangulation, and pulling the points back into 3D space.
Now we have a properly connected mesh that Unity is happy to import. I stored the heat and temperature fields in the first UV set. My last post goes into more detail regarding exporting geometry from Houdini.
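As a minimal sketch (class and field names are illustrative, not from my actual code), the imported per-point data can be read back in Unity like this, since the triangulation only exists to satisfy the importer:

```csharp
using UnityEngine;

public class FlameParticleData : MonoBehaviour
{
    public Mesh importedMesh; // the triangulated point cloud from Houdini

    public Vector3[] positions;
    public Vector2[] heatAndTemperature;

    void Awake()
    {
        // The triangles are irrelevant; we only need the per-point data back.
        positions = importedMesh.vertices;
        // uv.x = heat, uv.y = temperature (stored in the first UV set in Houdini)
        heatAndTemperature = importedMesh.uv;
    }
}
```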
UNITY INSTANCING
To make a particle system, I need to instance a particle onto each point. This is done using GPU instancing. The particle itself can be any kind of geometry, but I am using simple quads, rendered as billboards. Each particle needs to know its position and other attributes, like heat and temperature. These attributes can be passed to the particle shader as compute buffers.
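As a rough sketch of this setup (buffer names and bounds are assumptions; the actual pipeline code differs), drawing one instanced quad per particle with the attributes in compute buffers might look like:

```csharp
using UnityEngine;

public class FlameParticleRenderer : MonoBehaviour
{
    public Mesh quad;          // the particle geometry, a simple quad
    public Material material;  // the particle shader described below

    ComputeBuffer positionBuffer;
    ComputeBuffer attributeBuffer; // heat and temperature per particle

    public void Setup(Vector3[] positions, Vector2[] heatAndTemperature)
    {
        positionBuffer = new ComputeBuffer(positions.Length, 3 * sizeof(float));
        positionBuffer.SetData(positions);
        attributeBuffer = new ComputeBuffer(heatAndTemperature.Length, 2 * sizeof(float));
        attributeBuffer.SetData(heatAndTemperature);

        // The shader indexes these buffers by instance id.
        material.SetBuffer("_Positions", positionBuffer);
        material.SetBuffer("_HeatTemp", attributeBuffer);
    }

    void Update()
    {
        // One quad per particle; the bounds (sized roughly to the flame here)
        // are needed for culling.
        Graphics.DrawMeshInstancedProcedural(quad, 0, material,
            new Bounds(transform.position, Vector3.one * 2f),
            positionBuffer.count);
    }

    void OnDestroy()
    {
        positionBuffer?.Release();
        attributeBuffer?.Release();
    }
}
```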
PARTICLE SHADER
This is the fun part. The particle shader defines the look of the particle system. Since we are working on the flame, it is a fully emissive shader. Each particle adds a small amount of light to the scene and is thus rendered additively.
As a side note, additive lighting assumes that we are working in a linear color space. Adding 50% grey in sRGB gamma space would mean adding 0.5 to 0.5, resulting in 1.0, which is pure white. In linear color space, mid grey has a value of about 18%, and adding 0.18 to 0.18 gives 0.36, which is physically correct. My render pipeline works in linear color space and also has high dynamic range (HDR), allowing values to exceed 1.
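To illustrate with Unity's built-in conversion helpers (a small sketch; exact values depend on the transfer function):

```csharp
using UnityEngine;

public static class LinearSpaceExample
{
    public static void Demo()
    {
        // 50% grey in sRGB is roughly 0.214 in linear space,
        // close to the 18% middle grey mentioned above.
        float linearGrey = Mathf.GammaToLinearSpace(0.5f); // ~0.214

        // Adding two such contributions in linear space:
        float sum = linearGrey + linearGrey;               // ~0.43, not pure white

        // Converted back to sRGB for display:
        Debug.Log(Mathf.LinearToGammaSpace(sum));          // ~0.69
    }
}
```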
We have the heat and temperature, and need to find the RGB value to add to the scene. What determines the color of a fire particle? We all know that fire has a color temperature of around 1900K (or say 1700K for a deeper red). Ok, so the temperature determines the color... Then we can map the particle temperature using a color ramp, similar to the color temperature charts we have all seen.
Let's look at Houdini's flame preset for the Mantra fire shader.
It uses a ramp to map the temperature. This is only one part of the full ramp above it, but it follows the same idea of mapping the temperature to the color temperature. All artists are familiar with this. I started with a similar approach, but the result did not look convincing.
Back to the discussion about the heat field. Isn't heat the same thing as temperature? No, heat is what is referred to as intensity in the screenshot above. The heat determines the intensity of the emission, and the temperature determines the color. At least, that is how Houdini's default pyro shader is set up. So I could look up the color from a temperature ramp and multiply it by the intensity. Still, this was not looking good.
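In code, this interim 1D approach boils down to something like the following sketch, using Unity's Gradient as the ramp (the temperature range is illustrative):

```csharp
using UnityEngine;

// The first, 1D approach: color from a temperature ramp, scaled by heat.
public static class FireColor1D
{
    public static Color Evaluate(Gradient temperatureRamp,
                                 float heat, float temperature,
                                 float maxTemperature = 5000f)
    {
        // Map temperature into the ramp's 0..1 domain.
        float t = Mathf.Clamp01(temperature / maxTemperature);
        // Heat scales the emissive intensity; HDR values above 1 are fine.
        return temperatureRamp.Evaluate(t) * heat;
    }
}
```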
So, a bit of research on blackbody radiation. As it turns out, the Kelvin color temperature we have all been using is just a simplification. The color of a blackbody depends on both luminance (intensity/heat) and temperature. The real color temperature chart looks more like the following.
Using a two-dimensional ramp to get a more accurate blackbody color greatly improved the look of the particle system.
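A minimal sketch of the two-dimensional lookup, assuming the chart has been baked into a small texture with temperature along one axis and heat along the other (the ranges and layout are assumptions):

```csharp
using UnityEngine;

public static class FireColor2D
{
    // lut: the blackbody chart baked to a texture, temperature on U, heat on V.
    // The texture must be CPU-readable for GetPixelBilinear; in practice this
    // lookup lives in the particle shader rather than on the CPU.
    public static Color Evaluate(Texture2D lut, float heat, float temperature,
                                 float maxHeat = 1f, float maxTemperature = 5000f)
    {
        float u = Mathf.Clamp01(temperature / maxTemperature);
        float v = Mathf.Clamp01(heat / maxHeat);
        return lut.GetPixelBilinear(u, v);
    }
}
```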
MANY TORCHES
Now I had one torch flame, frozen in time, that looked good enough for me. The next step was to use it for all the torches in the game. As you would only be able to see a few torches at a time, I added an occlusion test so that only visible particle systems are drawn. Each particle system has a bounding box used to determine visibility.
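The post doesn't show the test itself; as a simple stand-in, a frustum check against each system's bounding box could look like this (my actual occlusion test may be more involved):

```csharp
using UnityEngine;

public static class TorchCulling
{
    // Returns true if the particle system's bounds intersect the camera frustum.
    public static bool IsVisible(Camera camera, Bounds bounds)
    {
        Plane[] planes = GeometryUtility.CalculateFrustumPlanes(camera);
        return GeometryUtility.TestPlanesAABB(planes, bounds);
    }
}
```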
ANIMATED FLAME
The flame is still frozen in time, since we exported only one frame from Houdini. How do we export the whole animation? That is tens of thousands of particles for every single frame; the mesh approach above was only good enough to get us this far. One way to pass a massive amount of data into a shader is to pack it into a texture, with each pixel storing the attributes of one particle. 100 frames with 20,000 particles would require two million pixels, which is the same as a 2,000x1,000 image.
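The addressing then becomes simple arithmetic. A sketch, assuming frames are packed back to back and wrap at the texture width (the actual layout may differ):

```csharp
using UnityEngine;

public static class ParticleTextureLayout
{
    // Pixel coordinate for a given particle in a given frame.
    public static Vector2Int PixelCoord(int frame, int particle,
                                        int particlesPerFrame, int textureWidth)
    {
        // Flat index into the packed texture, e.g. frame 50 of 20,000
        // particles starts at pixel 1,000,000.
        int index = frame * particlesPerFrame + particle;
        return new Vector2Int(index % textureWidth, index / textureWidth);
    }
}
```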
In my previous post, I introduced how to write VEX in a COP network. With VEX, we have full access to all geometry data, which is perfect for packing the particles into textures. Each frame is first packed into a multi-channel EXR, one slice of the total animation. All slices are then stacked into a single image. Unity will not read multi-channel EXRs, so the channels are repacked and stored as separate images.
COP networks have the bad habit of not updating properly, which caused problems when trying to make all of this work in Unity. I needed a way to check that the exported images contained the data they were supposed to. Luckily, Nuke not only supports multi-channel EXRs, but can also display the position pass as a point cloud. This is done using the PositionToPoints node (Gizmo), as shown below.