Human Factors Pack
The Human Factors Pack brings together all the tools and interfaces needed to measure and analyze a driver’s behavior: head movements, eye movements, breathing, heart rate and other physiological measurements, video recordings, and connections to virtual reality systems (Oculus, HTC…).
Engineers can evaluate the driver’s performance and behavior and carry out studies on fatigue, drowsiness, vigilance, the effects of drugs, and driving under the influence, without endangering the driver or other road users.
The Human Factors Pack answers questions about in-vehicle user interfaces, such as: Does the HUD fall naturally before the eyes? How does the driver adapt to and embrace the proposed HMI? Is the driver focused on the road? How does the driver react to a specific feature (a new horn sound, a new HMI, etc.)?
The Pack can be used throughout the V-cycle. In the design phase, it eases and speeds up the decision for a particular HMI concept. During the verification phase, it is used to validate that the results match the specifications.
Among the many benefits of the Pack are:
- Native Support: The Human Factors Pack provides native support for different types of hardware, interfaces and manufacturers: Oculus, Vive, TrackIR, SmartEye, UDP, ANT… Learn more in the features section.
- Synchronization: All the devices and tools are synchronized with the simulation. Engineers and scientists can focus on analyzing the data after the simulation instead of sorting and synchronizing it.
- Serenity: Engineers focus on their development and their expertise while AVSimulation provides and maintains the interfaces with the different devices used in the experiment.
- Openness: The interface with any specific or new device can be achieved through the SDK provided with SCANeR. Should you have any doubt, feel free to contact our sales engineers for advice and the latest updates on supported devices.
The Human Factors Pack will allow you to save time and devote more attention to your expertise.
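Since UDP is listed among the supported interfaces, a custom or in-house device can stream its samples to the simulation host over the network. The sketch below is a hypothetical illustration of such a bridge: the packet layout, field choices, and names are assumptions made for this example, not the actual SCANeR SDK protocol.

```python
# Hypothetical sketch: streaming a custom sensor sample over UDP.
# The wire format (float64 timestamp + uint16 heart rate) is an
# assumption for illustration, not the SCANeR SDK protocol.
import socket
import struct

PACKET_FMT = "<dH"  # little-endian: timestamp (s), heart rate (bpm)

def pack_sample(timestamp_s: float, heart_rate_bpm: int) -> bytes:
    """Serialize one physiological sample into the assumed wire format."""
    return struct.pack(PACKET_FMT, timestamp_s, heart_rate_bpm)

def unpack_sample(payload: bytes) -> tuple:
    """Deserialize a sample received on the simulation side."""
    return struct.unpack(PACKET_FMT, payload)

if __name__ == "__main__":
    # Loopback demonstration: one socket plays the device, one the simulator.
    rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    rx.bind(("127.0.0.1", 0))  # let the OS pick a free port
    tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    tx.sendto(pack_sample(12.5, 72), rx.getsockname())
    ts, hr = unpack_sample(rx.recv(64))
    print(ts, hr)
    tx.close()
    rx.close()
```

In a real experiment the receiving side would be the simulation's device interface rather than a second local socket; the point here is only the serialization and transport pattern.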
Ergonomics and HMI.
These applications verify that the driver uses the vehicle’s interfaces efficiently. The focus is on the information available to the driver and how it is accessed. A typical example: head-up display aiming and virtual image placement. The simulation lets you iterate very quickly: adjust the position, collect feedback, and change the amount of information displayed. In some cases, semi-functional tests can be carried out to validate the dynamic behavior of the HMI from a graphical point of view, the relevance of the displayed information, etc. In ergonomics and HMI studies, sensors are often used to track the head, eyes and fingers of test subjects. The simulation and the driver are both recorded and synchronized. Virtual and mixed reality can also be used. When studying the driver under stress, the previous measurements are complemented with physiological measurements.
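The synchronization mentioned above amounts to aligning sensor streams that run at different rates against a common clock. As a purely illustrative sketch (the data layout and function names are assumptions, not SCANeR's API), matching each simulation frame to the nearest eye-tracker sample could look like this:

```python
# Illustrative sketch: align eye-tracker samples with simulation frames
# by nearest timestamp. Names and data layout are assumptions for this
# example; the Pack performs this kind of bookkeeping automatically.
from bisect import bisect_left

def nearest_sample(gaze_times: list, frame_time: float) -> int:
    """Index of the gaze sample whose timestamp is closest to frame_time.
    gaze_times must be sorted in ascending order."""
    i = bisect_left(gaze_times, frame_time)
    if i == 0:
        return 0
    if i == len(gaze_times):
        return len(gaze_times) - 1
    # pick the closer of the two neighbouring samples
    if gaze_times[i] - frame_time < frame_time - gaze_times[i - 1]:
        return i
    return i - 1

gaze_times = [0.000, 0.016, 0.033, 0.050]   # e.g. a 60 Hz tracker
print(nearest_sample(gaze_times, 0.031))    # -> 2 (the sample at 0.033 s)
```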
Driver fatigue, drowsiness, hypovigilance, and the effects of drugs and alcohol.
In these kinds of studies, we are interested in the same metrics as above (physiological data, the driver’s head and eye movements, etc.). Any additional data can be computed, synchronized and recorded in SCANeR.
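One example of "additional data" that drowsiness studies commonly derive from recorded eyelid signals is PERCLOS, the proportion of time the eyes are more than 80% closed over an observation window. The function below is a hypothetical post-processing sketch, not part of SCANeR; it assumes a uniformly sampled eyelid-closure signal normalized to [0, 1].

```python
# Illustrative sketch: PERCLOS (P80 convention), a standard drowsiness
# metric. Assumes a uniformly sampled eyelid-closure signal in [0, 1];
# this is an example post-processing step, not SCANeR functionality.

def perclos(eyelid_closure: list, threshold: float = 0.8) -> float:
    """Fraction of samples where eyelid closure exceeds the threshold."""
    if not eyelid_closure:
        return 0.0
    closed = sum(1 for c in eyelid_closure if c > threshold)
    return closed / len(eyelid_closure)

samples = [0.1, 0.2, 0.9, 0.95, 0.3, 0.85, 0.1, 0.2, 0.1, 0.1]
print(perclos(samples))  # -> 0.3 (3 of 10 samples above 0.8)
```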
Infrastructure and transport studies.
The goal of these studies is to measure the comprehensibility of the road infrastructure (road signs and their positions, entry and exit positions, road markings, etc.). The Human Factors Pack allows you to record the driver’s reaction to different road infrastructures and compare it against your own criteria.
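A typical criterion in such a study is reaction time: the delay between the moment a piece of infrastructure becomes relevant (a sign becomes visible, a lane merge begins) and the driver's first response. The sketch below is a hypothetical example of deriving that delay from two recorded channels; the channel names and formats are illustrative assumptions.

```python
# Hypothetical sketch: reaction time from a stimulus instant to the first
# subsequent brake-pedal input. Channel layout is an assumption for this
# example: pedal_log holds (time_s, pedal_position) pairs in time order.
from typing import Optional

def reaction_time(stimulus_t: float,
                  pedal_log: list,
                  pedal_threshold: float = 0.05) -> Optional[float]:
    """Seconds from the stimulus to the first pedal press above threshold,
    or None if the driver never reacted."""
    for t, pos in pedal_log:
        if t >= stimulus_t and pos > pedal_threshold:
            return t - stimulus_t
    return None

log = [(10.0, 0.0), (10.5, 0.0), (11.25, 0.4), (11.5, 0.8)]
print(reaction_time(10.0, log))  # -> 1.25 (seconds)
```

Comparing this value across several infrastructure variants (with the rest of the scenario held constant) is the kind of criterion-based comparison the paragraph above refers to.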
| Feature | Description |
|---|---|
| Physiological | Interaction with electrophysiological devices (Biopac, ANT+). |
| Video | Record a video stream coming from video cameras. |
| Eye Tracker | Interface and synchronization between eye tracking systems and SCANeR studio. |
| Head Tracker | Interface between head tracking devices and SCANeR studio. |
| Head Mounted Display (HMD) | Interface with HMD devices. |
| Augmented Reality | Full set of functions for AR applications to merge real images from a camera with synthetic images. |