Interested in creating film-like effects such as fire, smoke, or holograms in Unity? Discover how to use the node-based workflows and adaptive features in Unity's Visual Effect Graph to create striking VFX in real-time. This article is based on a session at Unite Copenhagen 2019 led by Vlad Neykov, lead graphics test engineer at Unity.
We recently published a blog post that covers all the verified features and workflows for the VFX Graph. Read on to get some tips on the VFX Graph architecture as well as a deep-dive into how we used the graph in the Magic Book demo.
VFX graph overview
The Visual Effect Graph is a next-generation visual effects tool for Unity. It can be used to create anything ranging from simple effects to complex simulations.
Each visual effect resides in the Project folder, as a self-contained asset; it has event and parameter interfaces that allow it to communicate with the rest of Unity. The Visual Effect Graph is designed to work with the High Definition Render Pipeline (HDRP), and it’s tailored for next-generation platforms (which support compute shaders).
The VFX Graph is simulated on the GPU, which allows it to support significantly more particles than CPU-simulated systems. Its node-based approach lets users create custom behaviors and complex simulations, and its access to the frame buffers enables powerful functionality such as spawning particles on top of the scene geometry.
However, being simulated on the GPU means that it's not trivial to send data back and forth to the CPU. The VFX Graph does not support reading VFX particle data in C#, and it doesn't interact with Unity's Physics system, so you'll need to employ some workarounds to create physics-based effects.
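One common workaround for the missing physics integration, sketched below, is to sample the physics world on the CPU each frame and feed the results into exposed graph properties. The property names `GroundHeight` and `GroundNormal` here are hypothetical; they must match properties you expose in your own graph's Blackboard.

```csharp
using UnityEngine;
using UnityEngine.VFX;

// Sketch: since VFX Graph particles cannot read Unity's Physics system
// directly, raycast on the CPU and push the result into the graph via
// exposed properties. "GroundHeight" and "GroundNormal" are hypothetical
// names -- define matching properties in your graph's Blackboard.
public class FeedPhysicsToVFX : MonoBehaviour
{
    public VisualEffect vfx;

    void Update()
    {
        if (Physics.Raycast(transform.position, Vector3.down, out RaycastHit hit))
        {
            vfx.SetFloat("GroundHeight", hit.point.y);
            vfx.SetVector3("GroundNormal", hit.normal);
        }
    }
}
```

Inside the graph, a Collide with Plane block (or similar) can then read these properties to approximate collision against the surface the raycast found.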
The flow of the VFX Graph
When creating a system, a series of contextual nodes, or contexts, are vertically connected, and executed from top to bottom. These are the building blocks of any effect.
Spawn is the context that determines how many particles should be created in the current frame. The next context, Initialize, determines the capacity and bounding box of the effect, and is where the initial values of the particles are specified. Remember that the bounding box is not dynamic, unlike in the Built-in Particle System, so you need to make sure the box encompasses the whole effect for the effect to be culled properly.
Update runs every frame and is a great place to do collision simulations, add different noise turbulence, and more. Output determines how particles are rendered, and each system can support multiple outputs. You can also adjust many particle attributes in Output before they are rendered. If you don’t need to change the size throughout the whole graph, you can just set it at the end, and you don’t need to reserve a buffer for it.
The Architecture of the VFX Graph
The VFX Graph is an asset in the Project folder that contains blocks and nodes as scriptable objects stored as sub-assets. From the VFX Graph, Unity generates an Expression Graph, which is a lower-level representation of the VFX Graph. Then, the Expression Graph sends this information to the VFX compiler, which generates a lot of runtime data.
This data includes shaders – compute shaders for the simulation and vertex and pixel shader pairs for rendering particles. It also generates byte code for the CPU interpreter as it is more efficient to calculate some things once per frame on the CPU than once for each particle on the GPU.
The VFX compiler also generates a particle data layout, so each system stores only the data that is actually used. For example, if you're not using velocity in your system, Unity won't store an extra Vector3 per particle for velocity. Each system's layout contains only the attributes that system reads or writes, so a simple effect doesn't carry all the attributes a larger effect might require.
The compiler also creates a property sheet where you’ll find all the exposed parameters. You can access elements of the VFX Graph through Timeline or code, or you can modify them directly in the Inspector.
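As a sketch of the code route, the `VisualEffect` component exposes setters keyed by the names in that property sheet. The property names (`Intensity`, `SpawnPosition`) and the event name (`OnBurst`) below are hypothetical; substitute the names defined in your own graph's Blackboard and Spawn contexts.

```csharp
using UnityEngine;
using UnityEngine.VFX;

// Minimal sketch of driving exposed VFX Graph properties from C#.
// "Intensity", "SpawnPosition", and "OnBurst" are placeholder names --
// they must match what your graph actually exposes.
public class MagicBookController : MonoBehaviour
{
    public VisualEffect vfx;

    void Start()
    {
        // HasFloat guards against a renamed or missing property.
        if (vfx.HasFloat("Intensity"))
            vfx.SetFloat("Intensity", 2.0f);

        vfx.SetVector3("SpawnPosition", transform.position);

        // Fires a custom event that a Spawn context can listen for.
        vfx.SendEvent("OnBurst");
    }
}
```

Timeline exposes the same parameters through Visual Effect Activation tracks, so artists can keyframe them without writing code.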
And, finally, the compiler lists all of the VFX systems, so you can issue different calls to each of them.
If you’re editing an effect, it’s convenient to use Auto Compile as it allows the VFX Graph to be constantly modified, even in play mode. You can change values and see the results without recompiling the effect, and changing the graph by connecting or disconnecting blocks or nodes will trigger a recompile.
New features in the Visual Effects Graph
We added a wealth of features to the VFX Graph, including:
- VFX ShaderGraphs (Preview)
- Particle Strips (Preview)
- Internal sequencing (Loop and Delay)
- Motion vector support
Subgraphs are essential for visual effects in larger projects, where elements can be reused, and they’re great for organizing your project. Subgraphs can be created either via the contextual menu by selecting the desired nodes/blocks or directly in the project window.
We also integrated Shader Graph with the VFX Graph. This allows you to customize rendering behaviors for each VFX output by creating Shader Graph shaders, exposing parameters via the Blackboard, and using them directly within the VFX Graph.
Motion vector support is especially useful for creating sparks or other fast-moving particles. It works in conjunction with the Motion Blur post-processing effect and can be activated simply by checking a box in the desired output.
Learn more about the new additions — both experimental and production-ready features — in the VFX Graph, and how you can use them in your project, in our recent blog post.
Magic Book in action
The demo from the Unite Copenhagen session shows a magic book effect, which is included in the VFX Samples project and makes use of the recently added features.
VFX operators and blocks
The VFX Shader Graph integration and the particle strip data type for creating trails are still experimental. To use them, the experimental features need to be enabled within Preferences > Visual Effects > Experimental Operators/Blocks.
Burning pages effect
The pages of the magic book use a Shader Graph shader for a custom alpha-erosion effect which is assigned to the Lit Mesh output of the Visual Effect Graph. The exposed properties within the Shader Graph shader are accessible in the Visual Effect Graph and can be modified at will.
Another great feature that you can leverage is the addition of Subgraphs. Subgraphs enable you to create new custom operators, blocks, or behaviors and are a powerful way of building up a library of reusable functionality or effects. There are three types of subgraphs:
- Subgraph Operators
- Subgraph Blocks
- Subgraph Effects
Subgraph Operators are created by linking different operators together. An example in the Magic Book effect is an operator that creates a random Vector3. This subgraph is searchable, can be added to any other graph, and behaves like a regular operator.
Subgraph Blocks are created via blocks (and, optionally, operators). An example of this in the Magic Book effect is a simple block combining a few things needed to initialize the page particles. Block subgraphs, such as the one in the image above, can be used as regular blocks.
Subgraph Effects are a great way of nesting different effects within the same graph. This helps with simplifying graphs and also gives you control over multiple effects within the same Visual Effect Graph window. The Floating Rock VFX contains an Output Mesh for the rock itself, two instances of the Chain VFX, and a simple simulation which sways the rock and makes the two chains follow it in unison.
Another newly added feature is the extra Spawner functionality for controlling the loop duration and loop count, or adding delays around each loop. It is useful for creating more complex spawning behaviors and is used in the Jacob's Ladder effect here to control the spawn bursts.
The trails swirling around the beam use particle strips. They are a new data type that can be set within the Initialize context. Particle strips use dedicated particle strip outputs and connect all particles within each strip into a seamless trail.
In this particular effect, one system is used for the heads of the trails, and it triggers GPU Events to create trail segments behind itself. Particle Strips are also still being refined; to use them, you need to enable the experimental options under Preferences > Visual Effects > Experimental Operators/Blocks.
A final touch is added to the effect by enabling the generation of Motion Vectors for some fast-moving particles. The option can be enabled in the Inspector after selecting the desired VFX Output and ticking the Generate Motion Vector checkbox. This, in conjunction with the Motion Blur Post-processing effect, allows Unity to blur out fast-moving particles in the direction they are moving.
Creating potion bubbles and smoke effects
The two remaining effects we have not discussed yet are the potion bubbles and smoke.
The bubbles use a Signed Distance Field (SDF) for collision, baked out in Houdini from the mesh of the potion glass. The bubbles are both attracted to it (using the Conform to SDF block) and collide with it (using the Collide with SDF block).
The smoke uses a similar approach, except that the collision is done by approximating the shape of the potion with primitive shapes, such as spheres and cylinders.