The AD/ADAS Pack is designed for engineers in charge of testing and validating advanced driver-assistance systems, as well as for researchers studying human-factors behavior when using ADAS.
It covers everything from simulating functional sensors to providing reliable target lists to ADAS, and offers off-the-shelf solutions for simulating autonomous vehicle behavior, allowing researchers to focus on human driver behavior.
The AD/ADAS Pack includes the functional sensors listed in the table below.
Because the sensors are fully integrated within the SCANeR software, engineers benefit from its rich and representative simulation environments and features for stimulating ADAS. Each sensor has a full graphical interface for customization.
Using the information available in the manufacturers’ product sheets, engineers can precisely define the characteristics of the sensors’ capabilities (e.g. perception area, range limit, types of smart-function targets) and customize error or noise models to get even closer to real sensor behavior.
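To illustrate the kind of error model a product sheet translates into, here is a minimal sketch (not SCANeR code — every name and parameter is hypothetical) that filters an ideal target list through a perception area, a range limit, Gaussian range noise, and random dropout:

```python
import random
from dataclasses import dataclass

@dataclass
class Detection:
    target_id: int
    range_m: float      # distance to the target, in metres
    azimuth_deg: float  # bearing relative to the sensor axis

def apply_sensor_model(detections, max_range_m=150.0, fov_deg=60.0,
                       range_sigma_m=0.5, dropout_prob=0.02):
    """Filter an ideal target list through a simple functional sensor model:
    perception area (field of view), range limit, Gaussian range noise,
    and random missed detections (dropout)."""
    visible = []
    for det in detections:
        if det.range_m > max_range_m or abs(det.azimuth_deg) > fov_deg / 2:
            continue  # target outside the perception area
        if random.random() < dropout_prob:
            continue  # simulated missed detection
        visible.append(Detection(det.target_id,
                                 det.range_m + random.gauss(0.0, range_sigma_m),
                                 det.azimuth_deg))
    return visible
```

The parameters mirror what a radar or camera product sheet typically specifies: maximum range, field of view, range accuracy, and detection probability.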
As SCANeR enables the simulation of any type of road infrastructure and environment, as well as the creation of complex scenarios, the millions of kilometers required to validate ADAS can be performed in simulation.
Furthermore, as cars get smarter every year, ADAS are no longer isolated components. They interact with the HMI (e.g. head-up displays), headlight projectors (e.g. glare-free high beam), etc. The AD/ADAS Pack works well when coupled with other SCANeR Packs, resulting in time saved and better consistency: teams from different departments can share not only the ADAS but also the scenarios, the vehicle dynamics model, the road networks, etc.
The versatility of this Pack makes it possible to meet the needs of users at every level of testing and validation: integrated models (MIL), software (SIL), hardware (HIL, VIL), and drivers (DIL).
SCANeR sensors can be distributed across multiple computers to optimize computation and ensure that results are calculated with sufficient resources. For example, engineers can dedicate one computer to the simulation of one LiDAR, another to two radars, another to three cameras, etc.
In addition, the Pack provides features to communicate with sensors via standardized means such as:
- ROS (Robot Operating System),
Test and validate ADAS
Engineers working on ADAS such as AEB, ACC, LKA, and many others are able to test and validate their systems in an unlimited number of known and unknown situations created in SCANeR (e.g. for SOTIF analysis). Through its ability to model a large number of sensors, the Pack enables the validation of any driving automation level (1 to 5).
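To make the system under test concrete, a toy AEB trigger of the kind such scenarios exercise might look like the following (a hypothetical time-to-collision sketch for illustration only, not AVSimulation code):

```python
def time_to_collision_s(range_m, closing_speed_mps):
    """Seconds until impact at constant closing speed; infinite if the gap opens."""
    if closing_speed_mps <= 0.0:
        return float("inf")
    return range_m / closing_speed_mps

def aeb_should_brake(range_m, closing_speed_mps, brake_ttc_s=1.5):
    """Trigger emergency braking when time-to-collision falls below a threshold."""
    return time_to_collision_s(range_m, closing_speed_mps) < brake_ttc_s
```

In simulation, the `range_m` and `closing_speed_mps` inputs would come from the functional sensors' target lists, letting the same logic be exercised across thousands of generated scenarios.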
Study drivers’ behavior when using driving assistance systems
Thanks to the included Autonomous Driving feature, researchers can study human factors when using driving assistance systems. The Autonomous Driving feature simulates full or partial delegation of control: you do not need to have an ADAS, as it is simulated for you, so you can simply focus on your study. Interactions with the driver, especially the transition phases, are critical and require full attention. Combined with the Simulators and Human Factors Packs, the AD/ADAS Pack facilitates research on and fine-tuning of these delicate transition phases on the simulator.
| Feature | Description |
|---|---|
| LiDAR | Simulate as many LiDARs as the GPU can handle. |
| Camera | Simulate up to 5 camera sensors. |
| Radar | Simulate radar sensors. It supports as many functional sensors as the CPU/GPU is able to process. |
| Ultrasonic | Simulate ultrasonic sensors. It supports as many functional sensors as the CPU/GPU is able to process. |
| Lighting | Simulate lighting sensors. It supports as many functional sensors as the CPU/GPU is able to process. |
| E-Horizon | Simulate E-Horizon sensors and retrieve the electronic horizon in ADASIS format. |
| SensorViewer | Visualize the sensors’ detections in real time. |