Description While the concept of emotions is familiar to most people, it is difficult to define. An emotion is a strong feeling deriving from one’s circumstances, mood, and/or relationships with other people. In the driving context, the emotions most commonly monitored for safety purposes are stress and anger, as they negatively impact driving and create danger (82; 170).
Stress is a state of physical, emotional, or psychological tension resulting from adverse or demanding circumstances. In biology, stress is defined as a state in which homeostasis is challenged by a stressor (128).
Anger is a strong feeling of annoyance, displeasure, and/or hostility. It is a common negative emotion in the context of driving, where it is often called road rage (81).
Indicators Emotion recognition is currently a hot topic in affective computing, and is gaining interest in the field of advanced driver-assistance systems (ADASs). To recognize emotions, one can use various behavioral features, e.g., speech (64) and facial expressions (49; 184).
Among the driver-based indicators of both stress and anger, physiological indicators are commonly used. Stress causes physiological responses (43), such as variations in HR (43; 77; 40; 253), breathing activity (43; 77), blood pressure, EDA (77; 40; 200), and pupil activity (168). The two physiological features that exhibit the highest correlations with driver stress are HR and EDA (77).
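For concreteness, the following minimal sketch (ours, not taken from any of the cited studies) computes simple per-window HR and EDA features of the kind such stress detectors typically build on; the exact feature set, window length, and sampling rates are illustrative assumptions.

```python
import numpy as np

def stress_features(hr_bpm: np.ndarray, eda_us: np.ndarray) -> dict:
    """Toy per-window features for HR (beats/min) and EDA (microsiemens).

    The choices (means, an HR-variability proxy, an EDA trend) are common
    in the literature, but this particular set is illustrative only.
    """
    return {
        "hr_mean": float(np.mean(hr_bpm)),
        # Standard deviation of HR as a crude variability proxy.
        "hr_std": float(np.std(hr_bpm)),
        "eda_mean": float(np.mean(eda_us)),
        # A positive EDA trend within the window often accompanies arousal.
        "eda_slope": float(np.polyfit(np.arange(len(eda_us)), eda_us, 1)[0]),
    }

# Example: one 60 s window of synthetic data (HR at 1 Hz, EDA at 4 Hz).
rng = np.random.default_rng(0)
hr = 75 + 5 * rng.standard_normal(60)
eda = 2.0 + 0.01 * np.arange(240) + 0.05 * rng.standard_normal(240)
print(stress_features(hr, eda))
```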
For anger in the driving context, Wan et al. (228) suggest identifying it based on physiological indicators such as HR, EDA, breathing rate, and EEG, with obvious practical limitations, at present, for the latter.
The self-assessment manikin (SAM) (25) is a subjective, picture-based assessment technique that characterizes an emotion along the three dimensions of valence, arousal, and dominance.
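A SAM questionnaire thus yields, per stimulus, three ratings, typically on a 9-point scale. The sketch below shows one way such ratings could be stored and reduced to the coarse affective states discussed here (e.g., high arousal with negative valence read as stress/anger); this mapping is a simplification we introduce for illustration and is not part of the SAM itself.

```python
from dataclasses import dataclass

@dataclass
class SamRating:
    """One SAM response; each dimension uses a 1-9 pictorial scale."""
    valence: int    # 1 = very negative ... 9 = very positive
    arousal: int    # 1 = very calm ... 9 = very excited
    dominance: int  # 1 = dominated ... 9 = in control

def coarse_label(r: SamRating) -> str:
    """Hypothetical mapping to coarse affective states (illustrative only)."""
    if r.arousal >= 6 and r.valence <= 4:
        # Negative, high-arousal states such as stress or anger.
        return "stress/anger"
    if r.arousal >= 6:
        return "excited"
    if r.valence <= 4:
        return "low mood"
    return "calm/neutral"

print(coarse_label(SamRating(valence=2, arousal=8, dominance=3)))  # stress/anger
```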
The above information allows one to fill, in Table 4, the relevant cells of the “Emotions” column.
Sensors The development of wearable devices with physiological sensors facilitates the recognition of emotions in real-driving conditions, i.e., outside of a laboratory context.
Facial expressions constitute a good indicator of emotions. The analysis and recognition of facial expressions is currently a field of great interest in scientific research (115; 252). Facial expressions can be monitored in a vehicle via a camera facing the driver (62; 87; 142). Indeed, Jeong and Ko (87) recently developed an algorithm for monitoring the emotions of a driver based on the analysis of facial expressions. Using DNNs that perform facial-expression recognition (FER), they can identify, in real time and in real-driving situations, anger, disgust, fear, happiness, sadness, and surprise. A smartphone with a camera facing the user can also be used for FER, and hence for estimating the user’s emotional state (142).
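As a rough sketch of such a camera-based FER pipeline (not the actual system of Jeong and Ko), one might detect the face in each frame and feed a crop to a trained classifier. The classifier below is a stand-in that outputs random probabilities; the input size, the six-class output, and the model interface are all assumptions.

```python
import cv2
import numpy as np

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

class DummyFerModel:
    """Stand-in for a trained FER DNN; returns uniform random probabilities."""
    def predict(self, batch):
        rng = np.random.default_rng()
        p = rng.random((len(batch), len(EMOTIONS)))
        return p / p.sum(axis=1, keepdims=True)

face_det = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
model = DummyFerModel()       # replace with an actual trained model
cap = cv2.VideoCapture(0)     # driver-facing camera

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_det.detectMultiScale(gray, 1.3, 5):
        # Normalized 48x48 grayscale crop, shaped (1, 48, 48, 1) for the model.
        crop = cv2.resize(gray[y:y + h, x:x + w], (48, 48)) / 255.0
        probs = model.predict(crop[np.newaxis, ..., np.newaxis])[0]
        print(EMOTIONS[int(np.argmax(probs))])
```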
Far-infrared (FIR) imaging (with wavelengths of 15–1000 μm), also called infrared thermography (IRT), can be used to quantify stress and emotions by monitoring breathing activity (150). This can be done via an IRT camera facing the driver.
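The signal processing behind such breathing monitoring can be as simple as tracking the mean temperature of a nostril-area region of interest (ROI) over time and locating the dominant spectral peak. The sketch below assumes the per-frame ROI temperature series has already been extracted from the IRT video; the breathing band limits are illustrative.

```python
import numpy as np

def breathing_rate_bpm(roi_temp: np.ndarray, fs: float) -> float:
    """Estimate breaths/min from a nostril-ROI mean-temperature series.

    roi_temp: 1-D array of per-frame mean ROI temperatures.
    fs: frame rate of the thermal camera, in Hz.
    """
    x = roi_temp - roi_temp.mean()             # remove the DC component
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    # Restrict the search to a plausible adult breathing band (0.1-0.7 Hz).
    band = (freqs >= 0.1) & (freqs <= 0.7)
    peak = freqs[band][np.argmax(spectrum[band])]
    return 60.0 * peak

# Example: 60 s of synthetic data at 10 Hz with a 0.25 Hz (15 bpm) rhythm.
fs = 10.0
t = np.arange(0, 60, 1.0 / fs)
sig = 34 + 0.2 * np.sin(2 * np.pi * 0.25 * t) + 0.02 * np.random.randn(len(t))
print(breathing_rate_bpm(sig, fs))  # ~15
```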
The recognition of emotions can also be performed using wearable sensors (175) such as the Empatica E4 wristband, a wearable research device that acquires physiological data in real time. Many studies (68; 162; 197) have indeed shown that one can detect stress using the physiological data that this device provides, in particular HR and EDA.
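As an illustration of how such wristband data might feed a stress detector (a generic sketch, not the pipeline of any cited study), one can window the HR and EDA streams, extract features, and train an off-the-shelf classifier on labeled windows; the data below are synthetic, and the window length, feature set, and classifier are assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

def window_features(hr: np.ndarray, eda: np.ndarray) -> np.ndarray:
    """Simple per-window features; the exact set is illustrative."""
    return np.array([hr.mean(), hr.std(), eda.mean(), eda.std(),
                     np.polyfit(np.arange(len(eda)), eda, 1)[0]])

# Synthetic stand-in for labeled 60 s windows (0 = baseline, 1 = stressed).
rng = np.random.default_rng(1)
X, y = [], []
for label in (0, 1):
    for _ in range(100):
        hr = (75 + 12 * label) + 5 * rng.standard_normal(60)   # 1 Hz HR
        eda = (2 + label) + 0.3 * rng.standard_normal(240)     # 4 Hz EDA
        X.append(window_features(hr, eda))
        y.append(label)
X, y = np.array(X), np.array(y)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```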
Bořil et al. (24) developed a stress detector employing a combination of the driver’s speech and some CAN-bus parameters, mainly the steering-wheel angle and the speed. Basu et al. (19) review various methods (that are not specific to the field of driving) for recognizing emotions from speech. Zhang et al. (251) explore how to utilize a deep CNN for the same purpose.
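To give a flavor of the front end that such speech-based methods share, the sketch below computes MFCC features from an utterance and passes them through a small CNN; the file path is a placeholder, the model is untrained, and the architecture, number of classes, and hyperparameters are our assumptions, not those of Zhang et al.

```python
import librosa
import torch
import torch.nn as nn

# Load an utterance and compute an MFCC "image" (13 coefficients x frames).
wav_path = "utterance.wav"  # placeholder path to a recorded utterance
y, sr = librosa.load(wav_path, sr=16000)
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)

class SpeechEmotionCnn(nn.Module):
    """Tiny placeholder CNN over MFCC features (architecture is illustrative)."""
    def __init__(self, n_classes: int = 4):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)))
        self.fc = nn.Linear(16 * 4 * 4, n_classes)

    def forward(self, x):
        z = self.conv(x)
        return self.fc(z.flatten(1))

model = SpeechEmotionCnn()
batch = torch.from_numpy(mfcc).float()[None, None]  # shape (1, 1, 13, frames)
logits = model(batch)                               # untrained: shapes only
print(logits.shape)  # torch.Size([1, 4])
```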
The above information allows one to fill the relevant cells of Table 5.