Fictional emotions and emotional reactions to social robots as depictions of social agents
Published online by Cambridge University Press: 05 April 2023
Abstract
According to Clark and Fischer's depiction theory, we would expect people interacting with robots to experience fictional emotions akin to those felt toward films or novels. However, some people's emotional reactions toward robots display the motivational force typical of non-fictional emotions. We discuss this incongruity and offer two suggestions for how to explain it while maintaining the depiction theory.
Type: Open Peer Commentary
Copyright: © The Author(s), 2023. Published by Cambridge University Press
References
Bainbridge, W. A., Hart, J. W., Kim, E. S., & Scassellati, B. (2011). The benefits of interactions with physically present robots over video-displayed agents. International Journal of Social Robotics, 3(1), 41–52. https://doi.org/10.1007/s12369-010-0082-7
Bartneck, C., Van Der Hoek, M., Mubin, O., & Al Mahmud, A. (2007). "Daisy, Daisy, Give Me Your Answer Do!" Switching off a Robot. 2nd ACM/IEEE International Conference on Human–Robot Interaction, Washington, DC, pp. 217–222.
Darling, K. (2017). Who's Johnny? Anthropomorphic framing in human–robot interaction, integration, and policy. In Lin, P., Bekey, G., Abney, K., & Jenkins, R. (Eds.), Robot ethics 2.0 (pp. 173–188). Oxford University Press.
Darling, K. (2021). The new breed: What our history with animals reveals about our future with robots. Henry Holt.
Döring, S. A. (2007). Seeing what to do: Affective perception and rational motivation. Dialectica, 61(3), 363–394. https://doi.org/10/ff8c8p
Döring, S. A. (2008). Conflict without contradiction. In Brun, G., Doguoglu, U., & Kuenzle, D. (Eds.), Epistemology and emotions (pp. 83–103). Ashgate.
Frijda, N. H. (1988). The laws of emotion. American Psychologist, 43(5), 349–358.
Garber, M. (2013). Funerals for fallen robots: New research explores the deep bonds that can develop between soldiers and the machines that help keep them alive. The Atlantic, September 20.
Garreau, J. (2007). Bots on the ground: In the field of battle (or even above it), robots are a soldier's best friend. Washington Post, May 6.
Gendler, T. S. (2010). Genuine rational fictional emotions. In Gendler, T. S. (Ed.), Intuition, imagination, and philosophical methodology (pp. 227–237). Oxford University Press. https://doi.org/fn7mcd
Goldie, P. (2000). The emotions: A philosophical exploration. Oxford University Press.
Lazarus, R. S. (1991). Cognition and motivation in emotion. American Psychologist, 46(4), 352–367. https://doi.org/czd6kp
Lee, K. M., Jung, Y., Kim, J., & Kim, S. R. (2006). Are physically embodied social agents better than disembodied social agents?: The effects of physical embodiment, tactile interaction, and people's loneliness in human–robot interaction. International Journal of Human–Computer Studies, 64(10), 962–973. https://doi.org/10.1016/j.ijhcs.2006.05.002
Medina, J. (2013). An enactivist approach to the imagination: Embodied enactments and "fictional emotions". American Philosophical Quarterly, 50(3), 317.
Seo, S. H., Geiskkovitch, D., Nakane, M., King, C., & Young, J. E. (2015). Poor Thing! Would You Feel Sorry for a Simulated Robot? A Comparison of Empathy toward a Physical and a Simulated Robot. 10th ACM/IEEE International Conference on Human–Robot Interaction (HRI), Portland, OR, pp. 125–132.
Singer, P. W. (2009). Wired for war: The robotics revolution and conflict in the 21st century. Penguin.
Sung, J., Guo, L., Grinter, R. E., & Christensen, H. I. (2007). "My Roomba is Rambo": Intimate Home Appliances. UbiComp, pp. 145–162. https://doi.org/10.1007/978-3-540-74853-3_9
Tappolet, C. (2016). Emotions, values, and agency. Oxford University Press. https://doi.org/10.1093/acprof:oso/9780199696512.001.0001
Teroni, F. (2019). Emotion, fiction and rationality. The British Journal of Aesthetics, 59(2), 113–128. https://doi.org/10.1093/aesthj/ayz015
Vendrell Ferran, I. (2022). Sham emotions, quasi-emotions or non-genuine emotions? Fictional emotions and their qualitative feel. In Breyer, T., Cavallaro, M., & Sandoval, R. (Eds.), Phenomenology of phantasy and emotion (pp. 231–259). WBG Academic.
Walton, K. L. (1990). Mimesis as make-believe: On the foundations of the representational arts. Harvard University Press.
Clark and Fischer's depiction theory is meant to answer, among other things, the question of why we respond to social robots emotionally while knowing that they are not real social agents. Their claim is that this is most likely because we view them not as real social agents but as merely depicting social agents, and that our emotions are directed only at the depicted layer. They seem to understand these emotions as cases of what others have called fictional emotions (Gendler, 2010; Medina, 2013; Teroni, 2019; Vendrell Ferran, 2022). These are emotions experienced toward characters or situations that we know are imaginary or fictional, such as fear of the monster in a scary movie, compassion for the characters in a tragic novel, or vicarious joy at seeing the protagonists victorious at the end of a play. In all these cases, we know that the characters are fictional, but having followed their stories we feel emotions that are very similar to the ones we would feel for real people.
Fictional emotions differ from non-fictional emotions in some respects, most crucially in their action-motivating force: Emotions are closely related to certain types of action or behavior. Fear, for example, disposes the subject to avoid, or want to avoid, the object of their fear; anger is associated with confrontational and aggressive behavior; compassion with consoling the grieving person; and so on. This motivational force (sometimes called action tendencies, or action readiness) is usually understood to be a central feature of emotions (Frijda, 1986, 1988; Lazarus, 1991). In the case of fictional emotions, however, the motivational force is strongly reduced or takes on a subdued form (Gendler, 2010; Walton, 1990). For instance, despite feeling fear while watching Kubrick's The Shining, we might tense up in our seat, but we do not flee from the cinema, nor do we try to console the Angel of Grief chiseled in stone. Thus, if we could really experience only fictional emotions toward depicted social agents, we should expect emotions directed at social robots to show the same kind of reduced action tendencies or motivational force. The authors acknowledge this point in section 9.2 and explain it by a "[c]ompartmentalization of emotions":
“Suppose Amy sees a forklift operator run into Ben and severely injure his arm. She would surely fear for Ben's health, rush to his aid, and call for an ambulance. If she saw the same happen to Asimo, she would do none of that. She would take her time in contacting Asimo's principal about the damage, […]” (target article, sect. 9.2, para. 4)
The authors assume that, in this scenario, while Amy might experience a certain degree of fear for the depicted Asimo-char, she would not display the same action tendency as with a genuine social agent such as Ben. But this is just a conjecture about how a person might react, not an observation of actual behavior. In light of several empirical studies, the conjecture seems not very well justified. Strong social emotional responses to robots have been documented in many cases (Darling, 2017), ranging from people feeling gratitude toward Roomba, their vacuum cleaner (Sung, Guo, Grinter, & Christensen, 2007), through others refraining from hitting, switching off (Bartneck, Van Der Hoek, Mubin, & Al Mahmud, 2007), or destroying a robot (Darling, 2021, Ch. 10), to soldiers who risk their lives to save the robots they work with (Singer, 2009, Ch. 17) or who bury and hold a funeral for a defective mine-disposal robot (Garber, 2013; Garreau, 2007). The interactive and immediate nature of robots seems to elicit social emotions with the motivational force typical of non-fictional emotions, which we would not expect of a fictional emotion toward other forms of depiction. A person in Amy's situation would probably never feel the urge to rush to help a depicted agent in a painting, novel, or movie, but might feel the urge to help Asimo out of fear for him (although she would probably still call a technician rather than an ambulance). It seems, therefore, that the emotions we experience toward social robots can take on a stronger motivational force than we would expect from the fictional emotions we experience toward other forms of depiction.
We offer two suggestions for explaining this apparent discrepancy between the fictional status of the social agent depicted by robots and the motivational force that emotions toward them can have, without giving up the depiction theory. First, while it might be possible for people to keep the three perspectives distinct on a cognitive level, they might fail to keep them separate on an emotional level. Moreover, keeping emotional reactions distinct across the three perspectives might be easier with forms of depiction that are more spatially and temporally distant and less interactive. Emotional overreactions might be strongest with embodied, physically present depictions. Several studies suggest that participants tend to respond with more empathy toward (Seo, Geiskkovitch, Nakane, King, & Young, 2015), place greater trust in (Bainbridge, Hart, Kim, & Scassellati, 2011), and report a stronger feeling of social presence with (Lee, Jung, Kim, & Kim, 2006) physically present robots compared to telepresent or simulated ones. Second, emotions might be triggered by features other than the depicted social agent. A robot's parts might additionally depict bodily features of a human or animal, and seeing such depicted body parts being damaged might also elicit emotional responses, without requiring the depiction of a social agent. Emotions are often said to work on the level of perception and to be somewhat – but not entirely – inaccessible to higher-level cognitive penetration (Döring, 2007, 2008; Goldie, 2000; Tappolet, 2016). If this is the case, then we should expect that the intellectually demanding work of keeping depiction and reality separate might sometimes fail to translate to the emotional level.
Acknowledgments
We thank Tobias Starzak and the INTERACT! project for their useful feedback.
Financial support
This is a publication in the context of the project INTERACT!, funded by the Ministry of Culture and Science of North Rhine-Westphalia.
Competing interest
None.