
9 - Emotions in human–computer interaction

Published online by Cambridge University Press: 05 June 2012

Veikko Surakka (University of Tampere, Finland)
Toni Vanhala (University of Tampere, Finland)
Arvid Kappas (Jacobs University Bremen)
Nicole C. Krämer (Universität Duisburg–Essen)

Summary

Overview: Human–computer interaction (HCI) may be significantly improved by incorporating social and emotional processes. Developing appropriate technologies is only one side of the problem; it is also vital to investigate how synthesized emotional information affects human behavior in the context of information technology. Despite previous suggestions that people treat computers as social actors, we still know relatively little about the possible, and presumably positive, effects of using emotional cues or messages in human–technology interaction. The aim of the present chapter is to provide a theoretical and empirical basis for integrating emotions into the study of HCI. We will first argue for, and present evidence supporting, the use of virtual emotions in HCI. We will then examine how a computer can analyze human emotion-related processes, and consider some of the physiological measures used for this purpose in more detail. In this context, we will also briefly describe some new technological prototypes for measuring computer users' behavior. The chapter ends with a discussion summarizing the findings and addressing the advantages of studying emotions in the context of present-day technology.
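As a rough illustration of the kind of physiological measurement discussed later in the chapter, the sketch below shows how facial electromyography (EMG) recorded over corrugator supercilii (brow) and zygomaticus major (cheek) might be reduced to a crude valence score: corrugator activity tends to increase with negative affect and zygomaticus activity with positive affect. This is a minimal sketch under assumed conditions (two pre-recorded channels, a 1000 Hz sampling rate, a conventional 20–450 Hz surface-EMG band, and a simple difference score), not the measurement pipeline used by the authors.

```python
# Minimal sketch: crude valence estimate from two facial EMG channels.
# Channel placements, sampling rate, filter band, and the difference
# score are illustrative assumptions, not the chapter's method.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 1000  # assumed sampling rate (Hz)

def emg_amplitude(signal, fs=FS):
    """Band-pass filter a raw EMG trace and return its RMS amplitude."""
    # 20-450 Hz band-pass, a common convention for surface EMG
    b, a = butter(4, [20 / (fs / 2), 450 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, signal)
    return float(np.sqrt(np.mean(filtered ** 2)))

def valence_index(zygomaticus, corrugator):
    """Crude valence score: positive when cheek activity dominates brow activity."""
    return emg_amplitude(zygomaticus) - emg_amplitude(corrugator)

# Synthetic one-second traces standing in for real recordings
rng = np.random.default_rng(0)
cheek = rng.normal(0.0, 2.0, FS)  # stronger zygomaticus activity
brow = rng.normal(0.0, 0.5, FS)   # weaker corrugator activity
print(f"valence index: {valence_index(cheek, brow):+.2f}")
```

A real system would add artifact rejection, baseline correction, and per-participant normalization before interpreting any such score.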

Introduction

The qualitative improvement and facilitation of human–computer interaction (HCI) has become a central research issue in computer science. Traditionally, attempts to improve HCI have centered on making computers more user-friendly along technical dimensions. In this line of work, perhaps still the most visible milestone for an ordinary user has been the development of a graphical, mouse-driven user interface.

Type: Chapter
In: Face-to-Face Communication over the Internet: Emotions in a Web of Culture, Language, and Technology, pp. 213–236
Publisher: Cambridge University Press
Print publication year: 2011

