Human Factors Pack

Discover all the features of our Human Factors Pack

The Human Factors Pack brings together all the tools and interfaces needed to measure and analyze driver behavior: head movements, eye movements, breathing, heart rate, other physiological measurements, video recordings, and connection to virtual reality systems (Oculus, HTC…).

Engineers can evaluate the driver’s performance and behavior and carry out studies on fatigue, drowsiness, vigilance, effects of drugs, and driving under the influence without putting the driver and other road users in danger.

The Human Factors Pack answers questions about in-vehicle user interfaces such as: Does the HUD fall naturally within the driver's field of view? How does the driver adapt to and embrace the proposed HMI? Is the driver focused on the road? How does the driver react to a specific feature (a new horn, a new HMI, etc.)?

The Pack can be used throughout the V-cycle. In the design phase, it eases and hastens the decision for a particular HMI concept. During the verification phase, it is used to validate the design and ensure that the results match the specifications.

Among the many benefits of the Pack are:

  • Native Support: The Human Factors Pack provides native support for different types of hardware, interfaces and manufacturers: Oculus, Vive, TrackIR, SmartEye, UDP, ANT etc. Learn more in the features section.
  • Synchronization: All the devices and tools are synchronized with the driving simulation. Engineers and scientists can focus on utilizing the data after the simulation instead of sorting and synchronizing the data.
  • Serenity: Engineers can focus on their own development and expertise while AVSimulation provides and maintains the interfaces with the various devices used for experimentation.
  • Openness: An interface with any specific or new device can be implemented through the SDK provided with the driving simulation software, SCANeR. Should you have any doubt, feel free to contact our sales engineers for advice and the latest list of supported devices.
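As a sketch of the kind of device integration the UDP interface and SDK make possible, the snippet below streams a timestamped physiological sample as a UDP datagram. The JSON payload layout is purely illustrative, not SCANeR's actual protocol; a real integration would follow the message format defined in the SCANeR SDK documentation.

```python
import json
import socket

def stream_sample(sock, addr, heart_rate_bpm, sim_time_s):
    """Send one physiological sample as a JSON datagram.

    The payload layout is illustrative only -- a real integration
    would use the message format from the SCANeR SDK documentation.
    """
    payload = json.dumps({
        "sim_time_s": sim_time_s,        # simulation timestamp, for later synchronization
        "heart_rate_bpm": heart_rate_bpm,
    }).encode("utf-8")
    sock.sendto(payload, addr)
    return payload

# Example: send one sample to a (hypothetical) local listener on port 9000.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
packet = stream_sample(sock, ("127.0.0.1", 9000), 72, 12.5)
```

Carrying the simulation timestamp inside each datagram is what lets the recorded physiological stream be aligned with the simulation afterwards.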

The Human Factors Pack will allow you to save time and give more attention to your expertise.

USE CASES

Ergonomics and HMI.

These applications verify that the driver uses the vehicle's interfaces efficiently. The focus is on the information available to the driver and how it is accessed. A typical example: head-up display systems and virtual image placement. The driving simulation lets you iterate very quickly: adjust the position, collect feedback, and change the amount of information displayed. In some cases, it is possible to carry out semi-functional tests to validate the dynamic behavior of the HMI from a graphical point of view, the relevance of the displayed information, etc. In ergonomics and HMI studies, sensors are often used to track the head, eyes, and fingers of test subjects. The simulation and the driver are both recorded and synchronized. Virtual and mixed reality can also be used; when studying the driver under stress, the previous measurements are complemented with physiological measurements.
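A common analysis on synchronized eye-tracking data is the fraction of gaze samples that land on a given area of interest (AOI), such as the road ahead or the HUD region. This is a minimal sketch with hypothetical normalized screen coordinates, not a SCANeR API:

```python
def gaze_on_area_ratio(gaze_points, area):
    """Fraction of gaze samples falling inside a rectangular AOI.

    gaze_points: list of (x, y) gaze coordinates from the eye tracker
                 (here assumed normalized to [0, 1] screen space).
    area: (x_min, y_min, x_max, y_max) bounds of the AOI.
    """
    x_min, y_min, x_max, y_max = area
    hits = sum(1 for x, y in gaze_points
               if x_min <= x <= x_max and y_min <= y <= y_max)
    return hits / len(gaze_points) if gaze_points else 0.0

# Example: three of four samples land inside the AOI covering the road ahead.
samples = [(0.5, 0.4), (0.6, 0.5), (0.9, 0.9), (0.55, 0.45)]
ratio = gaze_on_area_ratio(samples, (0.3, 0.3, 0.7, 0.6))
```

Comparing this ratio across HUD positions or information layouts gives a quick, quantitative way to rank HMI variants.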

 

Driver fatigue, drowsiness, hypovigilance, drug and alcohol effects.

During these kinds of studies, we are interested in the same metrics (physiological data, the driver's head and eye movements, etc.) as mentioned previously. Any additional data can be computed, synchronized, and recorded in SCANeR.
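One standard drowsiness metric that can be derived from eye-tracker output is PERCLOS, the proportion of time the eyes are mostly closed. The sketch below assumes per-frame eyelid-openness values in [0, 1] and uses the conventional P80 criterion (eye at least 80% covered counts as closed); the threshold and data format are assumptions, not part of the Pack's documented API:

```python
def perclos(eyelid_openness, closed_threshold=0.2):
    """PERCLOS: fraction of samples where the eye counts as closed.

    eyelid_openness: per-frame openness values in [0, 1] from an eye
    tracker (1.0 = fully open). A sample counts as closed when openness
    falls below the threshold, i.e. the eyelid covers at least 80% of
    the eye (the conventional P80 criterion).
    """
    if not eyelid_openness:
        return 0.0
    closed = sum(1 for v in eyelid_openness if v < closed_threshold)
    return closed / len(eyelid_openness)

# Example: 3 of 10 samples are below the 0.2 openness threshold.
openness = [1.0, 0.9, 0.1, 0.05, 0.8, 0.15, 1.0, 0.9, 0.95, 1.0]
score = perclos(openness)
```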

Infrastructure and transport studies.

The goal of these studies is to measure how well drivers understand the road infrastructure (road signs and their positions, entry and exit positions, road markings, etc.). The Human Factors Pack allows you to record the driver's reactions to different road infrastructures and compare them against your own criteria.
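A typical criterion in such studies is reaction time: the delay between a road event (a sign or hazard becoming visible) and the driver's first braking input, taken from the recorded pedal log. This is a hedged sketch over a hypothetical log format of (timestamp, brake-pedal position) samples:

```python
def reaction_time(event_time_s, brake_log):
    """Delay from a road event to the driver's first braking input after it.

    brake_log: time-ordered list of (timestamp_s, brake_pedal_position)
    samples recorded during the simulation. Returns None if the driver
    never brakes after the event.
    """
    for t, brake in brake_log:
        if t >= event_time_s and brake > 0.0:
            return t - event_time_s
    return None

# Example: hazard appears at t = 10.0 s, first brake input at t = 10.8 s.
log = [(9.5, 0.0), (10.2, 0.0), (10.8, 0.35), (11.0, 0.6)]
rt = reaction_time(10.0, log)
```

Running the same scenario with different signage or road markings and comparing these reaction times is one concrete way to rank infrastructure variants.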

FEATURES

  • Physiological: Interaction with electrophysiological devices (Biopac, ANT+).
  • Video: Record a video stream from video cameras.
  • Eye Tracker: Interface and synchronization between eye tracking systems and SCANeR.
  • Head Tracker: Interface between head tracking devices and SCANeR.
  • Head Mounted Display (HMD): Interface with HMD devices.
  • Augmented Reality: Full set of functions for AR applications to merge real images from a camera with synthetic images.
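At its core, merging a real camera image with a synthetic one is alpha compositing: the synthetic overlay is drawn over the camera frame at a given opacity. The toy sketch below shows the principle on images represented as row-major lists of RGB tuples; it illustrates the "over" operator, not the Pack's actual AR functions.

```python
def blend_pixel(real, synthetic, alpha):
    """Alpha-composite one RGB pixel: the synthetic overlay at opacity
    `alpha` over the real camera pixel (the classic 'over' operator)."""
    return tuple(round(alpha * s + (1.0 - alpha) * r)
                 for r, s in zip(real, synthetic))

def blend_image(real_img, synth_img, alpha):
    """Blend two same-sized images given as row-major lists of RGB tuples."""
    return [[blend_pixel(r, s, alpha) for r, s in zip(row_r, row_s)]
            for row_r, row_s in zip(real_img, synth_img)]

# Example: a 1x2 camera frame with a half-opaque synthetic overlay.
camera = [[(100, 100, 100), (0, 0, 0)]]
overlay = [[(200, 0, 0), (0, 200, 0)]]
merged = blend_image(camera, overlay, 0.5)
```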