The Visual Effect Graph (VFX Graph) enables the authoring of both simple and complex effects using node-based visual logic. As one of several major toolsets available in Unity, the VFX Graph allows artists and designers to create with little or no coding.
Visual effects are key to crafting deeply immersive experiences for your players. And thanks to continuous hardware advancements, what used to be available only for Hollywood blockbusters can now be achieved in real time.
This article is an excerpt from our 120-page e-book, The definitive guide to creating advanced visual effects in Unity, which guides artists, technical artists, and programmers using the Unity 2021 LTS version of VFX Graph. Leverage it as a reference for producing richly layered, real-time visual effects for your games.
Interested in creating film-like effects such as fire, smoke, or holograms in Unity? Discover how to use the node-based workflows and adaptive features in Unity's Visual Effect Graph to create striking VFX in real time. This article is based on a session at Unite Copenhagen 2019 led by Vlad Neykov, lead graphics test engineer at Unity.
We recently published a blog post that covers all the verified features and workflows for the VFX Graph. Read on to get some tips on the VFX Graph architecture as well as a deep-dive into how we used the graph in the Magic Book demo.
VFX graph overview
The Visual Effect Graph is a next-generation visual effects tool for Unity. It can be used to create anything ranging from simple effects to complex simulations.
Each visual effect resides in the Project folder, as a self-contained asset; it has event and parameter interfaces that allow it to communicate with the rest of Unity. The Visual Effect Graph is designed to work with the High Definition Render Pipeline (HDRP), and it’s tailored for next-generation platforms (which support compute shaders).
The VFX Graph is simulated on the GPU, which allows it to support significantly more particles. Its node-based approach allows users to create custom behaviors and complex simulations, and its access to the frame buffers enables powerful functionalities such as spawning particles on top of the scene geometry.
However, being simulated on the GPU means that it’s not trivial to send data back and forth to the CPU. The VFX Graph does not support reading VFX particle data in C# and it doesn’t interact with Unity's Physics system, so you’ll need to employ some workarounds to create physics-based effects.
The flow of the VFX Graph
When creating a system, a series of contextual nodes, or contexts, are vertically connected, and executed from top to bottom. These are the building blocks of any effect.
Spawn is the context that determines how many particles should be created in the current frame. The next context, Initialize, determines the capacity and bounding box of the effect, and is where the initial values of the particles are specified. Remember that the bounding box is not dynamic, unlike in the Built-in particle system, so you need to make sure the box encompasses the whole effect for the effect to be culled properly.
Update runs every frame and is a great place to simulate collisions, add noise turbulence, and more. Output determines how particles are rendered, and each system can support multiple outputs. You can also adjust many particle attributes in Output before the particles are rendered. For example, if size doesn't change throughout the graph, you can just set it once in Output, and no buffer needs to be reserved for it.
The Architecture of the VFX Graph
The VFX Graph is an asset in the Project folder that contains blocks and nodes as scriptable objects stored as sub-assets. From the VFX Graph, Unity generates an Expression Graph, a lower-level representation of the VFX Graph. The Expression Graph is then fed to the VFX compiler, which generates the runtime data.
This data includes shaders: compute shaders for the simulation, and vertex and pixel shader pairs for rendering particles. The compiler also generates bytecode for the CPU interpreter, since it is more efficient to calculate some things once per frame on the CPU than once per particle on the GPU.
The VFX Compiler also generates a particle data layout, so each system stores only the data that is actually used. For example, if you’re not using velocity in your system, Unity won’t store another Vector3 per particle for velocity. Each system is optimized to recognize only what pertains to it. If you need to make a simple effect, it’s not going to carry all attributes a larger effect might require.
The compiler also creates a property sheet where you’ll find all the exposed parameters. You can access elements of the VFX Graph through Timeline or code, or you can modify them directly in the Inspector.
And, finally, the compiler lists all of the VFX systems, so you can issue different calls to each of them.
If you’re editing an effect, it’s convenient to use Auto Compile as it allows the VFX Graph to be constantly modified, even in play mode. You can change values and see the results without recompiling the effect, and changing the graph by connecting or disconnecting blocks or nodes will trigger a recompile.
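As mentioned above, the exposed parameters in the property sheet can also be driven from code through the VisualEffect component. Here is a minimal sketch; the "Intensity" property name is a hypothetical exposed float on the graph:

```csharp
using UnityEngine;
using UnityEngine.VFX;

// Sketch: drives a hypothetical exposed "Intensity" float on a VisualEffect.
[RequireComponent(typeof(VisualEffect))]
public class FlameIntensity : MonoBehaviour
{
    // Caching the property ID avoids a string lookup every frame.
    static readonly int IntensityID = Shader.PropertyToID("Intensity");

    VisualEffect vfx;

    void Start()
    {
        vfx = GetComponent<VisualEffect>();
    }

    void Update()
    {
        // HasFloat guards against a renamed or missing exposed property.
        if (vfx.HasFloat(IntensityID))
            vfx.SetFloat(IntensityID, Mathf.PingPong(Time.time, 1f));
    }
}
```

With Auto Compile enabled, changing the "Intensity" value this way takes effect immediately, without recompiling the graph.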
New features in the Visual Effects Graph
We added a wealth of features to the VFX Graph, including:
- Subgraphs
- VFX ShaderGraphs (Preview)
- Particle Strips (Preview)
- Internal sequencing (Loop and Delay)
- Motion vector support
Subgraphs are essential for visual effects in larger projects, where elements can be reused, and they’re great for organizing your project. Subgraphs can be created either via the contextual menu by selecting the desired nodes/blocks or directly in the project window.
We added the Shader Graph integration with the VFX Graph. This allows you to customize rendering behaviors for each VFX output by creating Shader Graph shaders, exposing parameters via the Blackboard, and using them directly within the VFX Graph.
Motion vector support is especially useful for creating sparks or other fast-moving particles. It works in conjunction with the Motion Blur post-processing effect and can be activated simply by checking a box in the desired output.
Learn more about the new additions — experimental and production-ready features — in the VFX Graph and how you can use them in your project from our recent blog post.
In the image above, you’ll notice the four Contexts present in the empty particle system graph.
The flow between the Contexts determines how particles spawn and simulate. Each Context defines one stage of computation:
- Spawn: Determines how many particles you should create and when to spawn them (e.g., in one burst, looping, with a delay, etc.)
- Initialize: Determines the starting Attributes for the particles, as well as the Capacity (maximum particle count) and Bounds (volume where the effect renders)
- Update: Changes the particle properties each frame; here you can apply Forces, add animation, create Collisions, or set up some interaction, such as with Signed Distance Fields (SDF)
- Output: Renders the particles and determines their final look (color, texture, orientation); each System can have multiple outputs for maximum flexibility
Systems and Contexts form the backbone of the graph’s “vertical logic,” or processing workflow. Data in a System flows downward, from top to bottom, and each Context encountered along the way modifies the data according to the simulation.
Systems are flexible, so you can omit a Context as needed or link multiple outputs together.
Contexts themselves behave differently depending on their individual Blocks, which similarly calculate data from top to bottom. You can add and manipulate more Blocks to process that data.
Click the button at the top-right corner of a Context to toggle the System’s simulation space between Local and World.
See the Node Library for a complete list of Contexts and Blocks.
Blocks can do just about anything, from simple value storage for Color, to complex operations such as Noises, Forces, and Collisions. They often have slots on the left, where they can receive input from Operators and Properties.
Properties and Operators
Just as Systems form much of the graph’s vertical logic, Operators make up the “horizontal logic” of its property workflow. They can help you pass custom expressions or values into your Blocks.
Operators flow from left to right, akin to Shader Graph nodes. You can use them for handling values or performing a range of calculations.
Use the Create Node menu (right-click or press the spacebar) to create Operator Nodes.
Properties are editable fields that connect to graph elements using the property workflow. Properties can be:
- Any Type, including integers, floats, and booleans
- Made from Compound components, such as Vectors and Colors
- Cast and converted (e.g., an integer to a float)
- Local or World space; click the L or W to switch between them
Properties reflect their current value in the graph at any given time. You can connect the input ports (to the left of the Property) to other graph nodes.
Property Nodes are Operators that allow you to reuse the same value at various points in the graph. They have corresponding Global properties that appear in the Blackboard.
A utility panel called the Blackboard manages Global properties, which can appear multiple times throughout the graph as Property Nodes.
Properties in the Blackboard are either:
- Exposed: The green dot to the left of any Exposed Property indicates that you can see and edit it outside of the graph. Access an Exposed Property in the Inspector via script using the Exposed Property class.
- Constant: A Blackboard property without a green dot is a Constant. It is reusable within the graph but does not appear in the Inspector.
New properties are Exposed by default and, as such, appear in the Inspector. Uncheck the Exposed option to hide a property outside of the graph. You can also create Categories to keep your properties organized.
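From script, the ExposedProperty class mentioned above wraps a property name and caches its integer ID for you. A sketch, assuming a hypothetical Exposed Color property named "TintColor" on the Blackboard:

```csharp
using UnityEngine;
using UnityEngine.VFX;
using UnityEngine.VFX.Utility;

// Sketch: animates a hypothetical exposed "TintColor" property.
public class TintDriver : MonoBehaviour
{
    [SerializeField] VisualEffect vfx;

    // ExposedProperty converts implicitly from a string name...
    [SerializeField] ExposedProperty tintProperty = "TintColor";

    void Update()
    {
        // ...and implicitly to an int property ID when passed to the API.
        if (vfx.HasVector4(tintProperty))
            vfx.SetVector4(tintProperty,
                Color.Lerp(Color.red, Color.yellow, Mathf.PingPong(Time.time, 1f)));
    }
}
```

Using ExposedProperty instead of raw strings keeps the name editable in the Inspector while avoiding repeated string-to-ID lookups at runtime.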
Group Nodes and Sticky Notes
As your graph logic grows, use Group Nodes and Sticky Notes to cut down on clutter. Group Nodes let you label a set of nodes and move them as one, while Sticky Notes work like code comments.
To create a Group Node, select a set of nodes, right-click over them, then choose Group Selection from the Context menu. You can also drag and drop a node into an existing Group Node, or hold the Shift key to drag it back out. Deleting a Group Node, either with the Delete key or from the Context menu, does not delete the nodes it contains.
Use Sticky Notes to describe how a section of the graph works, or to leave comments for yourself and your teammates. Add as many Sticky Notes as you need, and freely move or resize them.
A Subgraph appears as a single node, which can help declutter your graph logic. Use it to save part of your VFX Graph as a separate asset that you can drop into another VFX Graph for reorganization and reuse.
To create a Subgraph, select a set of nodes and then pick Convert To Subgraph Operator from the right-click menu. Saving the asset to disk converts the selected nodes into a single Subgraph Node. You can package Systems, Blocks, and Operators into different types of Subgraphs.
Creating a Subgraph is analogous to refactoring code. Just as you would organize logic into reusable methods or functions, a Subgraph makes elements of your VFX Graph more modular.
Levels of editing in VFX Graph
The VFX Graph supports three different levels of editing:
- Asset instance configuration: Use this to modify any existing VFX Graph. Designers and programmers alike can adjust exposed parameters in the Inspector to tweak an effect’s look, timing, or setup. Artists can also use external scripting or events to change preauthored content. At this level, you’re treating each graph as a black box.
- VFX asset authoring: This is where your creativity can truly take charge. Build a network of Operator Nodes to start making your own VFX Graph, and set up custom behaviors and parameters to create custom simulations. Whether you’re riffing off existing samples or starting from scratch, you can take ownership of a specific effect.
- VFX scripting: This supports more experienced technical artists or graphics programmers using the component API to customize the VFX Graph’s behavior. With VFX scripting, your team can enjoy a more efficient pipeline for managing specific effects, and access advanced features like the Graphics Buffers.
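As an example of the scripting level, the Graphics Buffers mentioned above can be allocated and bound from C#. This sketch assumes the graph exposes a GraphicsBuffer property named "PointBuffer" (a hypothetical name) that it reads, for instance with a Sample Graphics Buffer Operator:

```csharp
using UnityEngine;
using UnityEngine.VFX;

// Sketch: binds a structured buffer of points to a hypothetical
// exposed GraphicsBuffer property named "PointBuffer".
public class PointBufferBinder : MonoBehaviour
{
    [SerializeField] VisualEffect vfx;
    GraphicsBuffer buffer;

    void OnEnable()
    {
        var points = new Vector3[256];
        for (int i = 0; i < points.Length; i++)
            points[i] = Random.insideUnitSphere * 5f;

        // One Vector3 = 3 floats = 12 bytes per element.
        buffer = new GraphicsBuffer(GraphicsBuffer.Target.Structured,
                                    points.Length, 12);
        buffer.SetData(points);
        vfx.SetGraphicsBuffer("PointBuffer", buffer);
    }

    void OnDisable()
    {
        // Graphics buffers are not garbage collected; release them explicitly.
        buffer?.Release();
    }
}
```

The script owns the buffer's lifetime; the graph only reads from it, so updating the buffer contents each frame drives the effect without recompiling anything.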
An Attribute is a piece of data you might use within a System, such as the color of a particle, its position, or how many of them you should spawn.
Use nodes to read from or write to Attributes. In particular, use the:
- Get Attribute Operator to read from Attributes in the Particle or ParticleStrip System
- Experimental Spawner Callbacks to read from Attributes in Spawn systems
- Set Attribute Block to write values to an Attribute; either set the value of the Attribute directly or use a random mode (for example, set a Color Attribute with a Random Gradient or Random Per-component Block)
See the documentation for a complete list of Attributes.
Note: A System only stores an Attribute when it needs it. In order to save memory, it does not store any unnecessary data. If you read from an Attribute whose simulation data the VFX Graph has not stored, the Attribute returns its default constant value.
The various parts of a VFX Graph communicate with each other (and the rest of your scene) through Events. For example, each Spawn Context contains Start and Stop flow ports, which receive Events to control particle spawning.
When something needs to happen, external GameObjects can notify parts of your graph using the SendEvent method of the C# API, passing the Event to the Visual Effect component by its string name or property ID.
An Event Context identifies an Event by its Event string name or ID inside a graph. In the above example, external objects in your scene can raise an OnPlay Event to start a Spawn system or an OnStop Event to stop it.
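Raising those Events from gameplay code can look like the following sketch. The OnPlay/OnStop names match the example above; the key binding and the "position" payload are assumptions, and the payload only matters if the graph actually reads that attribute:

```csharp
using UnityEngine;
using UnityEngine.VFX;

// Sketch: raises the graph's OnPlay/OnStop Events from gameplay code.
public class SpawnToggle : MonoBehaviour
{
    [SerializeField] VisualEffect vfx;

    void Update()
    {
        if (Input.GetKeyDown(KeyCode.Space))
        {
            // Optional payload: a VFXEventAttribute can carry per-event data,
            // such as a spawn position, if the graph reads that attribute.
            var attrib = vfx.CreateVFXEventAttribute();
            attrib.SetVector3("position", transform.position);
            vfx.SendEvent("OnPlay", attrib);
        }

        if (Input.GetKeyUp(KeyCode.Space))
            vfx.SendEvent("OnStop");
    }
}
```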
You can combine an Output Event with an Output Event Handler. Output Events are useful if the initial spawning of the particles needs to drive something else in your scene. This is common for synchronizing lighting or gameplay with your visual effects.
The above example sends an OnReceivedEvent to a GameObject component outside of the graph. The C# script will then react accordingly to intensify a light or flame, activate a spark, etc. See the Interactivity section of the VFX Graph e-book for more information on Output Events.
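One way to react on the C# side is to subclass the VFXOutputEventAbstractHandler utility that ships with the VFX Graph package's Output Event Helpers sample. This is a sketch under that assumption; the light reference is hypothetical, and the base-class members can vary between package versions:

```csharp
using UnityEngine;
using UnityEngine.VFX;
using UnityEngine.VFX.Utility;

// Sketch: reacts to an Output Event by repositioning and flashing a light.
// Assumes the Output Event Helpers sample from the VFX Graph package.
public class OutputEventLightFlash : VFXOutputEventAbstractHandler
{
    public Light flashLight;
    public float intensity = 10f;

    public override bool canExecuteInEditor => true;

    public override void OnVFXOutputEvent(VFXEventAttribute eventAttribute)
    {
        // Event attributes (e.g., position) travel with the event.
        flashLight.transform.position = eventAttribute.GetVector3("position");
        flashLight.intensity = intensity;
    }
}
```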
At the same time, you can use GPU Events to spawn particles based on other particle behavior. This way, when a particle dies in one system, you can notify another system, which creates a useful chain reaction of effects, such as a projectile particle that spawns a dust effect upon death.
These Update Blocks can send GPU Event data in the following way:
- Trigger Event On Die: Spawns particles on another system when a particle dies
- Trigger Event Rate: Spawns particles per second (or based on their velocity)
- Trigger Event Always: Spawns particles every frame
The Blocks’ outputs connect to a GPU Event Context, which can then notify an Initialize Context of a dependent system. Chaining different systems together in this fashion helps you create richly detailed and complex particle effects.
The Initialize Context of the GPU Event system can also inherit Attributes available in the parent system prior to the Trigger Event. So, for instance, by inheriting its position, a new particle will appear in the same place as the original particle that spawned it.
Magic Book in action
The demo in the Unite Copenhagen session shows a magic book effect, which is included in the VFX Samples project, and it utilizes the recently added features.
Exploring VFX sample content
A VFX Graph is more than the sum of its parts. It requires a solid understanding of how to apply nodes and Operators, along with the ways they can work together.
The VFX Graph Additions sample, available in the Package Manager, demonstrates several simple graphs, making it a great starting point for learning how to manage particles.
The following sections introduce you to some of the common Blocks and Operators you’ll encounter as you explore the samples provided.
Noise and Operators
Procedural Noise helps reduce the “machine-like” look of your rendered imagery. The VFX Graph provides several Operators that you can use for one-, two-, and three-dimensional Noise and Randomness.
Attribute Blocks similarly include the option of applying Randomness in various modes. They can vary slightly per Attribute, so experiment with them to familiarize yourself with their behavior.
An animated texture can do wonders to make your effects believable. Generate these from an external Digital Content Creation (DCC) tool or from within Unity. Use Operators to manage the Flipbook Block.
For more information on creating your own Flipbooks within Unity, check out the Image Sequencer in the VFXToolbox section of the VFX Graph e-book.
Forces, Collisions, and Drag are essential to making particles simulate natural phenomena. But don’t be afraid to push the boundaries of what’s real. As the artist, you get to decide what looks just right.
Visual Effect Subgraphs
A Visual Effect Subgraph is an asset containing a portion of a Visual Effect Graph that can be reused in another Visual Effect Graph or Subgraph. Subgraphs appear as a single node.
Subgraphs can be used in graphs in the following three ways:
- System Subgraph: One or many Systems contained in one Graph
- Block Subgraph: A set of Blocks and Operators packaged together and used as a Block
- Operator Subgraph: A set of Operators packaged together and used as an Operator
Subgraphs enable you to factorize commonly used sets of nodes from graphs into reusable assets to add to the Library.
Get the free e-book
Our 120-page e-book, The definitive guide to creating advanced visual effects in Unity, guides artists, designers, and programmers using the Unity 2021 LTS version of VFX Graph. Written by experts, it is a reference to help you take on richly layered, real-time visual effects for your games.