Philart (Myounghoon) Jeon
Engineering Psychology, Georgia Institute of Technology, USA.
Designing In-Vehicle Affect Detection and Regulation Interfaces
Research Question: How can drivers’ various affective states be effectively regulated using auditory displays?
Last year, I brought an ‘advanced auditory menus’ project to the Think Tank, in which I applied various new types of non-speech sounds, including spearcons, spindexes, and auditory scrollbars, to spoken menus. I have demonstrated the effectiveness of these sonification enhancements with diverse user populations (e.g., sighted undergraduates, visually impaired users, and drivers) on several devices (e.g., desktops, in-vehicle technologies [e.g., 1], and touchscreen smartphones).
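For context, a spearcon is typically made by time-compressing a text-to-speech rendering of a menu item (pitch preserved) until it is no longer recognized as speech, while a spindex cue uses only the initial sound of the item. The sketch below is a rough illustration of that idea, not the tooling used in the studies cited here; the input file name is hypothetical, and the librosa and soundfile packages are assumed to be available.

```python
# Rough illustration of spearcon/spindex generation (hypothetical file name;
# not the actual tooling from the studies cited in this statement).
import librosa
import soundfile as sf

# Load a pre-recorded TTS rendering of a menu item, e.g., "Navigation".
y, sr = librosa.load("menu_item.wav", sr=None)

# Spearcon: uniformly time-compress the speech (pitch preserved) until it is a
# brief cue that is no longer intelligible as speech but still maps uniquely
# to the original phrase.
spearcon = librosa.effects.time_stretch(y, rate=2.5)  # ~40% of original duration
sf.write("menu_item_spearcon.wav", spearcon, sr)

# Spindex (rough approximation): a spindex cue is the spoken first letter of
# the item; here we simply slice the onset of the recording as a stand-in.
spindex = y[: int(0.25 * sr)]  # first ~250 ms
sf.write("menu_item_spindex.wav", spindex, sr)
```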
This year, I bring my dissertation research, ‘Designing In-Vehicle Affect Detection and Regulation Interfaces’. Because this research is still at an initial stage, several processes are running in parallel (e.g., framing the project scope [2], extracting and analyzing a driving-specific affect dimension [3], specifying affective effects [4], and a participatory design process involving interviews and focus group studies with various user groups).
At the Think Tank in particular, I would like to discuss regulation methods for drivers using auditory displays, since vision is the most heavily taxed sense while driving. Based on a literature review and previous research, plausible directions for affect regulation using auditory user interfaces are as follows:
- ‘Attention Deployment’, one of the five stages in Gross’s process model of emotion regulation [5], can be adopted. For example, hearing one’s own name can sometimes break through to conscious awareness (Moray, 1959). Because this processing occurs preattentively, it can have a direct and strong impact on driving performance and safety. Attention deployment may work well for diverting a driver from an affective source, but how the system can make the driver concentrate on driving itself is another question to answer. Directly telling a driver about driving or the driving task may make him or her feel that the in-vehicle system is a back-seat driver (or a nag).
- If the mechanisms underlying a certain affective effect are clearly known, indirect mitigation of the affective state may be possible. For example, using simple voice prompts, Harris and Nass [6] showed that ‘Cognitive Reframing’ of frustrating events yielded better driving performance and less negative emotion.
- Finally, the system can probably be designed to improve overall performance regardless of whether the affective state itself is regulated. In high-workload, stressful situations, an intelligent system could temporarily hold incoming calls or email notifications. Conversely, adding auditory displays for a specific task may improve driving and related-task performance while reducing affective effects and workload, as reported in [1]. For this adaptive user interface, I would like to apply advanced sonifications to in-vehicle technologies. A toy sketch combining these three directions follows this list.
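To make the discussion concrete, the following is a minimal, hypothetical sketch of a rule-based strategy selector spanning the three directions above. The DriverState fields, thresholds, and strategy names are illustrative assumptions, not components of any implemented system.

```python
# Hypothetical sketch: a rule-based regulation-strategy selector tying together
# the three directions above. All names, thresholds, and fields are illustrative.
from dataclasses import dataclass
from enum import Enum, auto


class Strategy(Enum):
    ATTENTION_DEPLOYMENT = auto()      # e.g., a name-call or earcon redirecting attention
    COGNITIVE_REFRAMING = auto()       # e.g., a voice prompt reframing a frustrating event
    NOTIFICATION_SUPPRESSION = auto()  # hold calls/email alerts until workload drops
    TASK_SONIFICATION = auto()         # add auditory display support for the current task


@dataclass
class DriverState:
    affect: str               # e.g., "anger", "fear", "sadness", "neutral"
    affect_intensity: float   # 0.0 (none) .. 1.0 (extreme), from the detection module
    workload: float           # 0.0 .. 1.0, e.g., from driving-task demand estimates


def select_strategy(state: DriverState) -> Strategy:
    """Pick a regulation strategy from the detected state (illustrative rules only)."""
    if state.workload > 0.8:
        # High workload: manage performance directly, regardless of affect.
        return Strategy.NOTIFICATION_SUPPRESSION
    if state.affect == "anger" and state.affect_intensity > 0.6:
        # Frustrating events with a known mechanism: try reframing (cf. Harris & Nass [6]).
        return Strategy.COGNITIVE_REFRAMING
    if state.affect_intensity > 0.6:
        # Strong affect of unclear origin: redirect attention preattentively.
        return Strategy.ATTENTION_DEPLOYMENT
    # Otherwise, support the ongoing task with sonification.
    return Strategy.TASK_SONIFICATION


if __name__ == "__main__":
    print(select_strategy(DriverState(affect="anger", affect_intensity=0.8, workload=0.4)))
```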
References
[1] M. Jeon, B. K. Davison, M. A. Nees, J. Wilson, and B. N. Walker, “Enhanced auditory menu cues improve dual task performance and are preferred with in-vehicle technologies,” in Proc. 1st International Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI ’09), Essen, Germany, 2009, pp. 91-98.
[2] M. Jeon and B. N. Walker, “Emotion detection and regulation interface for drivers with traumatic brain injury,” in Proc. ACM SIGCHI Conference on Human Factors in Computing Systems (CHI ’11), Vancouver, BC, Canada, 2011.
[3] M. Jeon and B. N. Walker, “What to detect? Analyzing factor structures of affect in driving contexts for an emotion detection and regulation system,” in Proc. 55th Annual Meeting of the Human Factors and Ergonomics Society, Las Vegas, NV, 2011.
[4] M. Jeon, J. Yim, and B. N. Walker, “An angry driver is not the same as a fearful driver: Different effects of specific negative emotions on risk perception, driving performance, and workload,” in Proc. 3rd International Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI ’11), Salzburg, Austria, 2011 (in preparation).
[5] J. J. Gross, “The emerging field of emotion regulation: An integrative review,” Review of General Psychology, vol. 2, no. 3, pp. 271-299, 1998.
[6] H. Harris and C. Nass, “Emotion regulation for frustrating driving contexts,” in Proc. ACM SIGCHI Conference on Human Factors in Computing Systems (CHI ’11), Vancouver, BC, Canada, 2011.