Efficient development of simulated environments for autonomous vehicle training

Using simulated scenarios for testing in the automotive industry is a well-established practice. However, the scenarios used in the past, for example to develop anti-lock braking systems (ABS), do not suffice for autonomous vehicle training. Essentially, autonomous vehicles need to be trained to behave like humans, which requires highly complex simulations.

A key part of any autonomous vehicle training simulation is the simulation environment. Unity, the real-time 3D rendering platform, is being used by engineering teams to efficiently create simulation environments for autonomous vehicle training that are rich in sensory and physical complexity, provide compelling cognitive challenges, and support dynamic multi-agent interaction.

This article provides a useful overview of what comprises a simulation environment and how Unity is used in the production of training environments for autonomous vehicles.

What makes an autonomous vehicle think?

Just like humans, an autonomous vehicle needs a “brain”: this is the autonomous system, which comprises four key areas:

Control: This part takes care of the actions the car needs to perform, such as braking, accelerating, and steering.

Planning: The planning part looks after how the vehicle navigates, overtakes, and avoids obstacles.

Perception: This covers how the car gets information about the real world. Information can be gathered with a combination of sensors, such as:

  • Computer vision if you're using cameras
  • LiDAR (Light Detection and Ranging) sensors
  • Radars

Lastly, via a process called sensor fusion, the information collected with the above methods is combined into a single representation that the car can actually act on.
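
To make this concrete, below is a minimal sketch of one common fusion technique, inverse-variance weighting, where two noisy distance estimates are blended so that the more reliable sensor contributes more. The class and method names are illustrative; real systems fuse full object tracks rather than single distances.

    // Minimal illustration of sensor fusion: two noisy distance estimates
    // (e.g., from a camera and a radar) are combined by inverse-variance
    // weighting, so the more reliable sensor contributes more to the result.
    public static class SensorFusion
    {
        // Distances in meters, variances in m^2.
        public static float FuseDistances(
            float cameraDistance, float cameraVariance,
            float radarDistance, float radarVariance)
        {
            float wCamera = 1f / cameraVariance;
            float wRadar = 1f / radarVariance;
            return (wCamera * cameraDistance + wRadar * radarDistance)
                   / (wCamera + wRadar);
        }
    }

For example, a camera estimate of 21 m with variance 4 and a radar estimate of 20 m with variance 1 fuse to 20.2 m, pulled toward the more precise radar reading.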

Coordination: This part is closely tied to planning, as it deals with how the car behaves in relation to other smart cars it encounters. It requires communication with other vehicles and infrastructure, examples of which include:

  • Platooning: how cars can closely follow each other on highways, forming a kind of "train" that saves fuel by cutting air resistance, among other things (see the sketch after this list).
  • Merging and intersections: how cars could adjust collectively to the traffic flow.
  • Swarming: the underlying coordination concept that enables behaviors like platooning and collective merging.
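
As a concrete illustration of platooning, the sketch below implements a constant time-gap follower: each car commands an acceleration proportional to the error between its actual gap to the leader and a speed-dependent desired gap. The gains and gap values are illustrative, not taken from any production controller.

    // Hypothetical sketch of a constant time-gap controller for platooning:
    // each follower adjusts its acceleration to keep a speed-dependent gap
    // to the vehicle ahead.
    public class PlatoonFollower
    {
        const float TimeGap = 1.2f;      // desired gap in seconds
        const float StandstillGap = 4f;  // minimum gap in meters
        const float GapGain = 0.4f;      // correction per meter of gap error
        const float SpeedGain = 0.8f;    // correction per m/s of speed error

        // Returns a commanded acceleration in m/s^2.
        public float ComputeAcceleration(
            float gapToLeader, float ownSpeed, float leaderSpeed)
        {
            float desiredGap = StandstillGap + TimeGap * ownSpeed;
            float gapError = gapToLeader - desiredGap;
            float speedError = leaderSpeed - ownSpeed;
            return GapGain * gapError + SpeedGain * speedError;
        }
    }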

The key challenges with training autonomous vehicles

How do you collect all the data you need? Machine learning is at the heart of autonomous vehicles, and it's a very data-hungry approach. Huge amounts of data are necessary to train an autonomous vehicle. How do you gather that data cost-effectively and accurately?

How will the car understand what the data is? It's not enough just to collect the data; you have to make sure the car understands what the data represents. It can't just see an object: it has to understand whether that object is a tree, a road, a person, and so on.

How do you sort and structure the data? Every single piece of data has to be labeled so the autonomous vehicle can learn from it, a process that's both expensive and prone to error when carried out by people. If it could be done automatically, you'd arrive at the required algorithms much faster and more reliably.
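
This is one of the strongest arguments for simulation: ground truth comes for free, because the simulator already knows what every object is. The hypothetical Unity script below logs the class tag and on-screen position of every labeled object each frame, producing annotated data with no human in the loop.

    using UnityEngine;

    // Hypothetical sketch: in simulation, ground truth is a byproduct of the
    // scene itself. This script logs the class tag and screen position of
    // every tagged object in front of the camera, frame by frame, producing
    // labeled data with no human annotation.
    public class GroundTruthLabeler : MonoBehaviour
    {
        public Camera sensorCamera;
        public string[] classTags = { "Pedestrian", "Vehicle", "TrafficSign" };

        void LateUpdate()
        {
            foreach (string classTag in classTags)
            {
                foreach (GameObject obj in GameObject.FindGameObjectsWithTag(classTag))
                {
                    Vector3 screenPos =
                        sensorCamera.WorldToScreenPoint(obj.transform.position);
                    if (screenPos.z > 0f)  // object is in front of the camera
                        Debug.Log($"{Time.frameCount},{classTag},{screenPos.x},{screenPos.y}");
                }
            }
        }
    }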

How do you prepare the vehicle for the inevitable unforeseen situations? Data collected exclusively from the real world can only prepare the autonomous vehicle for what it’s already seen out there.

Rich and complex simulation environments give engineering teams control over data generation and, ultimately, help train an autonomous vehicle system to be ready for all scenarios, including unforeseen ones and edge cases.

[Diagram: a simulation environment]
What’s in a simulation environment?

To train an autonomous vehicle system, you need to produce an environment that is as close as possible to what the real car would see on the road. The key parts of a simulation environment are:

Vehicle dynamics: how the car behaves physically, for example how its tires grip the asphalt.
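
In Unity, wheel-road interaction is typically simulated with WheelCollider components, which model suspension travel and tire friction against the surface. The minimal sketch below wires steering and motor torque to the wheels; the limits are illustrative, and SetControls is a hypothetical entry point for a planning layer to drive the car.

    using UnityEngine;

    // Minimal Unity sketch of vehicle dynamics using WheelColliders, which
    // simulate suspension travel and tire friction against the road surface.
    // The torque and steering limits are illustrative, not a tuned model.
    public class SimpleCarController : MonoBehaviour
    {
        public WheelCollider frontLeft, frontRight, rearLeft, rearRight;
        public float maxMotorTorque = 400f;  // N*m per driven wheel
        public float maxSteerAngle = 30f;    // degrees

        float steerInput, throttleInput;  // set by the planning layer, -1..1

        public void SetControls(float steer, float throttle)
        {
            steerInput = Mathf.Clamp(steer, -1f, 1f);
            throttleInput = Mathf.Clamp(throttle, -1f, 1f);
        }

        void FixedUpdate()
        {
            frontLeft.steerAngle = frontRight.steerAngle = maxSteerAngle * steerInput;
            rearLeft.motorTorque = rearRight.motorTorque = maxMotorTorque * throttleInput;
        }
    }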

Environment: This part comprises three sub-categories:

  • Static elements, such as roads, trees, and traffic lights and signs.
  • Dynamic elements, such as pedestrians or other cars, that provide variations within your scenario and allow you to create scenarios that can be used to validate or collect data for your vehicles.
  • Parameters, such as the time of day and different weather conditions, that you can apply to a given scenario to recreate different situations.

The combination of these varied environmental factors is what allows you to produce edge cases that are rare in reality.
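
As a sketch of what this parameterization can look like in Unity, the hypothetical script below randomizes the sun's elevation (time of day) and the fog density at the start of each run, so that rare combinations such as dusk plus dense fog appear in the generated data.

    using UnityEngine;

    // Hypothetical scenario parameter randomization: each run rotates the
    // sun to a random time of day and samples a fog density. Ranges are
    // illustrative.
    public class ScenarioRandomizer : MonoBehaviour
    {
        public Light sun;  // the scene's directional light

        void Start()
        {
            // Sun elevation: -10 (just below the horizon) to 90 degrees (noon).
            float sunElevation = Random.Range(-10f, 90f);
            sun.transform.rotation = Quaternion.Euler(sunElevation, 170f, 0f);

            RenderSettings.fog = true;
            RenderSettings.fogDensity = Random.Range(0f, 0.05f);
        }
    }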

Sensor model: The simulated scene has to reach the autonomous system through a sensor model, such as a LiDAR sensor, camera, or radar. The model has to be physically accurate to the point that an algorithm relying on its output behaves in the synthetic environment just as it would in reality.
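
A common way to approximate a LiDAR in Unity is a sweep of raycasts, as in the hypothetical sketch below. A physically accurate sensor model would add per-channel vertical angles, measurement noise, and surface reflectance on top of this raycast core.

    using UnityEngine;

    // Minimal sketch of a raycast-based LiDAR approximation: one horizontal
    // sweep of rays returns hit distances.
    public class SimpleLidar : MonoBehaviour
    {
        public int rayCount = 360;
        public float maxRange = 100f;  // meters

        public float[] Scan()
        {
            var distances = new float[rayCount];
            for (int i = 0; i < rayCount; i++)
            {
                float angle = i * 360f / rayCount;
                Vector3 dir = Quaternion.Euler(0f, angle, 0f) * transform.forward;
                distances[i] =
                    Physics.Raycast(transform.position, dir, out RaycastHit hit, maxRange)
                        ? hit.distance
                        : maxRange;  // no return within range
            }
            return distances;
        }
    }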

[Diagram: environment development]
Unity: a natural choice for developing simulation environments

Engineering a simulation environment requires the same features and toolsets used in creating other types of rich interactive content: lighting and physics, particle and weather systems, animation, machine learning and more.

Unity is the world-leading real-time 3D rendering platform for games and other interactive content development. It's a tried-and-tested, full-featured platform that powers millions of multi-platform games and applications. It also provides the unique advantages of the Asset Store (see below for more information) and of its huge community of cross-industry developers and creators.

Key Unity features for engineering teams developing autonomous vehicle systems

Scripting flexibility: Teams can adapt Unity to their workflows with a powerful C# scripting system and a comprehensive API. Source code access can be purchased for under-the-hood C++ development.

Speed: The intuitive UI of the Unity editor makes it possible to prototype quickly. When you are in “Play” mode in the editor, you can play and preview how your application will look in its final build. You can pause the scene, and alter values, assets, scripts and other properties, and instantly see the results. You can also step through a project frame by frame for easy debugging.

Rich interactivity: Unity provides a robust and well-documented API that gives access to the complete range of its systems, including physics, rendering, animation, and communications, enabling a rich interaction model and integration with other systems.

High-end graphics: The Scriptable Render Pipeline (SRP) allows you to code the core of your render loop in C#, giving you much more flexibility to customize how your scene is drawn and tailor rendering to your content.

There are two SRPs available: the High-Definition Render Pipeline (HDRP) offers world-class visual quality on high-performance hardware, while the Universal Render Pipeline (URP) maintains responsive performance when scaling for mobile.

VR and AR support (plus deployment to 25 other platforms): Due to its extensive platform support, Unity is used by AAA game studios, top creative agencies, film studios, and research teams in automotive, aerospace, and other industries to create immersive applications.

Advanced artist and designer tools: Unity includes 3D scene design tools, storytelling and cinematics features, lighting and special effects, an audio system, and a powerful dopesheet animation system.

Machine learning and AI capabilities: The Unity ML-Agents Toolkit enables machine learning researchers to study complex behaviors using Unity, and provides interactive content developers with the latest machine learning technologies to develop intelligent agents.
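
To give a flavor of the ML-Agents API, here is an illustrative driving agent: it reports observations through CollectObservations, receives continuous steering and throttle actions in OnActionReceived, and is rewarded for staying near the lane center. The observation set and reward are placeholders, and SimpleCarController refers to the hypothetical vehicle dynamics sketch earlier in this article.

    using Unity.MLAgents;
    using Unity.MLAgents.Actuators;
    using Unity.MLAgents.Sensors;
    using UnityEngine;

    // Illustrative ML-Agents sketch: the agent observes its speed and the
    // lane center's position relative to the car, receives continuous
    // steering/throttle actions from the learning policy, and is rewarded
    // for staying near the lane center. Not a recommended reward design.
    public class DrivingAgent : Agent
    {
        public SimpleCarController car;  // hypothetical controller from the sketch above
        public Transform laneCenter;
        Rigidbody body;

        public override void Initialize() => body = GetComponent<Rigidbody>();

        public override void CollectObservations(VectorSensor sensor)
        {
            sensor.AddObservation(transform.InverseTransformPoint(laneCenter.position));
            sensor.AddObservation(body.velocity.magnitude);
        }

        public override void OnActionReceived(ActionBuffers actions)
        {
            car.SetControls(actions.ContinuousActions[0], actions.ContinuousActions[1]);

            float offset = Vector3.Distance(transform.position, laneCenter.position);
            AddReward(0.01f - 0.001f * offset);  // small step reward for staying centered
            if (offset > 5f) EndEpisode();       // left the lane entirely
        }
    }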

The Asset Store: The Asset Store gives you access to the largest marketplace of off-the-shelf assets and productivity tools, including a huge selection for creating environments, to save on development time.

BMW’s autonomous driving journey

The BMW Group used Unity to develop a graphical scenario editor that vastly simplifies the process of testing and validating automated driving (AD) features in development. The interface makes it easy for its AD developers to visualize and set up thousands of simulated scenarios that increase feature maturity and readiness.

Nearly 95% of all BMW’s test miles for autonomous driving are driven by virtual vehicles in virtual worlds.

Read our blog series to learn more.

Unity for automotive

Learn more about how the world’s leading automotive manufacturers are using Unity to accelerate autonomous vehicle development and their entire product lifecycle, from design to marketing to maintenance.