
Cultural Differences in People's Reactions and Applications of Robots, Algorithms, and Artificial Intelligence

Published online by Cambridge University Press: 29 August 2023

Kai Chi Yam* (National University of Singapore, Singapore)
Tiffany Tan (National University of Singapore, Singapore)
Joshua Conrad Jackson (Kellogg School of Management, Evanston, USA)
Azim Shariff (University of British Columbia, Vancouver, Canada)
Kurt Gray (University of North Carolina at Chapel Hill, USA)

*Corresponding author: Kai Chi Yam ([email protected])

Abstract

Although research in cultural psychology has established that virtually all human behaviors and cognitions are in some ways shaped by culture, culture has been surprisingly absent from the emerging literature on the psychology of technology. In this perspective article, we first review recent findings on machine aversion versus appreciation. We then offer a cross-cultural perspective in understanding how people might react differently to machines. We propose three frameworks – historical, religious, and exposure – to explain how Asians might be more accepting of machines than their Western counterparts. We end the article by discussing three exciting human–machine applications found primarily in Asia and provide future research directions.

Abstract (translated from the Chinese)

Research in cultural psychology shows that virtually all human behaviors and cognitions are shaped by culture to some degree. Surprisingly, however, the emerging literature on the psychology of technology has paid little attention to culture. In this perspective article, the authors first review existing findings on machine and algorithm aversion versus appreciation, noting that Asians are more accepting of machine algorithms than Westerners. Taking a cross-cultural perspective, the authors propose three frameworks – history, religion, and exposure – to explain why Asians and Westerners react so differently to machine algorithms. Finally, the authors discuss three major and interesting human–machine applications emerging in Asia and outline directions for future research.

Type
Perspectives
Creative Commons
Creative Commons License - CC BY
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
Copyright © The Author(s), 2023. Published by Cambridge University Press on behalf of The International Association for Chinese Management Research

Introduction

Robots, algorithms, and artificial intelligence (AI) are rapidly becoming commonplace in today's technology-driven world. In 2020, a record three million industrial robots were operating across the globe (IFR, 2021a). Asia, in particular, leads as the largest market for industrial robots, with China, Japan, and South Korea remaining the first-, second-, and fourth-largest market players, respectively. Beyond industrial robots, Asia is also a stronghold for the development of social robots, a market projected to grow 36% in the APAC region alone by 2025, outpacing other markets such as Europe and the United States (Technavio, 2022). Unlike industrial robots, which are predominantly found in factories, social robots are designed to behave like and interact with humans. Japan, for example, is home to robot-staffed hotels (Yam, Bigman, Tang, et al., 2021), robot pets (Craft, 2022), and robot caregivers in nursing homes (Lufkin, 2020). In China, the COVID-19 lockdown saw robots deployed to deliver food and medicine and to disinfect hospitals (Fannin, 2020).

Outside of robots, Asia has also grown substantially in the space of AI technologies (International Institute of Communications, 2020). In Singapore and Japan, large insurance companies have adopted machine learning algorithms to automate claims processing. In China, AI helps farmers monitor and predict environmental conditions and yields. Despite these promising and exciting developments, however, social scientists have been met with a perplexing finding: people are averse to machines[1] (for a review, see Gray, Yam, Eng, Wilbanks, & Waytz, 2023). Robots may be helpful, but people also find them creepy (MacDorman, 2006). Algorithms may make objective, calculated decisions, but people do not always trust them (von Eschenbach, 2021). Importantly, these effects appear to be culturally dependent (Lim, Rooksby, & Cross, 2021). In this article, we first briefly review recent findings on machine aversion versus appreciation, focusing on the who – who likes machines, and who does not? – and the what – what do machines do that people like or dislike? For the former, past work has examined how people of varying expertise (Logg, Minson, & Moore, 2019), ages (Chien et al., 2019), genders (Nomura, 2017), and occupations (Reich & Eyssel, 2013) respond differently toward machines. For the latter, research has established that people appreciate machines differently depending on the nature of the task they perform. For example, people do not like algorithms making moral decisions (Bigman & Gray, 2018), but outside of the moral domain people perceive algorithms as better at objective than at subjective tasks (Castelo, Bos, & Lehmann, 2019). Nevertheless, there is another important variable that may affect attitudes and behaviors toward machines – culture, defined as an 'untidy and expansive set of material and symbolic concepts, such as world, environment, contexts, cultural systems, social systems, social structures, institutions, practices, policies, meanings, norms, and values, that give form and direction to behavior' (Markus & Kitayama, 2010: 422).

Broadly, culture[2] is who we are – it encompasses our shared values, beliefs, and norms, distinct from those of other groups (Lehman, Chiu, & Schaller, 2004), and is shaped by our unique histories, languages, geographies, and so on. Culture is often treated as all-encompassing; it reaches into and influences behavior from the way we greet each other (Li, 2009) to the way we negotiate job salaries (Lu, 2023). In this perspective article, we ask: how do cultures shape attitudes and behavior toward machines? To explore these differences, we delineate a cross-cultural perspective that attempts to explain culturally divergent responses to machines. We propose three frameworks – historical, religious[3], and exposure – to help scholars understand the nuanced cultural differences in how people view and interact with machines. Finally, we look ahead and discuss three interesting areas where new technological applications are emerging in Asia – sex, religion, and therapy – and urge future research to attend to these new and exciting contexts, as well as their implications for human–human interactions.

Machine Aversion or Appreciation?

The phenomenon of machine aversion was first documented by Dietvorst, Simmons, and Massey (2015). In a series of studies, participants witnessed an algorithm making a forecast, a human making a forecast, or both. Strikingly, even after seeing algorithms outperform human forecasters (i.e., make better, more accurate predictions), participants still largely chose human forecasters over algorithmic ones for future predictions. Furthermore, people seem most averse to machines making moral (Bigman & Gray, 2018) and social (Yeomans, Shah, Mullainathan, & Kleinberg, 2019) decisions, even when machines make these decisions better than humans do.

Algorithms aside, robots also inspire feelings of threat and anxiety, especially in the workplace. For instance, Yam and colleagues found that adults exposed to robots either physically (e.g., interacting with them at work) or psychologically (e.g., reading an article about the presence of robots in businesses) report higher job insecurity and anxiety (Yam, Tang, Jackson, Su, & Gray, 2023), especially when interacting with humanoid robots (Yam, Bigman, & Gray, 2021). Such job insecurity further predicted employee burnout and workplace incivility. Separately, Gamez-Djokic and Waytz (2020) found that automation-related job anxiety can translate into negative sentiments toward out-groups, such as immigrants. Taken together, these findings suggest not only that people generally fear novel technologies they do not yet understand, but also that such fears can spill over into negative interpersonal outcomes.

But there is another side to the story – one in which people do not fear, but prefer, machines. Admittedly, this literature is less developed than the machine aversion research. In a series of studies, Logg et al. (2019) examined participants' preferences for human versus algorithmic predictions regarding people's weight, song popularity, and geopolitical events, among others, without providing any feedback on prediction accuracy. Interestingly, they found that people relied more heavily on advice they thought was generated by algorithms than by people (though the forecasts were otherwise identical). Another line of research by Dietvorst, Simmons, and Massey (2018) found that people are less averse to algorithms when they have the power to modify how the algorithms work.

These findings challenge the existence of a general machine aversion, suggesting instead that certain individuals, such as experts, may be more machine-averse than others, preferring to rely on their own predictions. Beyond individual differences, the task domain in which machines operate also influences people's aversion (vs. appreciation) toward them. For example, people are highly averse to machines making decisions in the moral domain, such as deciding whether a criminal gets parole or whether a patient should undergo a risky surgery that may save or kill them (Bigman & Gray, 2018). This is because people do not perceive machines as able to fully think or feel. People also trust algorithmic advice less for tasks they perceive as subjective (e.g., dating advice) than for tasks they perceive as objective (e.g., financial advice) (Castelo et al., 2019). Evidently, people's attitudes toward machines are influenced by a range of factors, including individual differences among users and the type of task the machine performs.

Are Attitudes Toward Machines Culture-Specific?

Culture may be a hidden influence on how people respond to machines. Cross-cultural research in the social sciences has paid much attention to the East–West dichotomy, exploring how people in the East diverge from those in the West in values, beliefs, and behavioral norms (Kagitcibasi & Berry, 1989). For example, scholars have long discussed fundamental cultural differences in how the East and the West organize knowledge (Nisbett, Peng, Choi, & Norenzayan, 2001), use logical rules (Nisbett, 2004), and cooperate in teams (Qin et al., 2023). In terms of interacting with machines, a popular perception is that Eastern cultures are fonder of robots than Western ones (see headlines like 'Why Japanese Love Robots (And Americans Fear Them)', Mims, 2010; or 'Asia Has Learnt to Love Robots – The West Should, Too', Thornhill, 2018).

Take Japan, for example, known as the home of Paro, a therapeutic baby seal robot (IEEE, 2022); Geminoid HI-2, a robot doppelganger of roboticist Hiroshi Ishiguro (Guizzo, 2010); and, perhaps most famously, Astro Boy, a friendly, superpowered manga and anime android who fights evil and saves lives (Robot Hall of Fame, 2004). Japan is a prime example of a globally lauded robot-loving culture. Research on attitudes toward machines in Asia – not just Japan – has also reported positive reactions. For instance, Bigman, Yam, Marciano, Reynolds, and Gray (2021) found that participants in Singapore reported an increased preference for algorithmic decision-making in healthcare settings when inequality in healthcare access was highlighted to them. Likewise, Oh et al. (2019) found that Korean doctors and medical students held generally positive attitudes toward the use of AI in healthcare, largely appreciating its capacity to analyze large amounts of clinical data in a short time. Similarly, in China, a content analysis of social media posts found that most of the general public regard medical AI positively, believing that such technology has the potential to partially or even fully replace human doctors (Gao, He, Chen, Li, & Lai, 2020). A recent field experiment showed that workers in China perceived algorithmic task assignment as fairer than human-based task assignment (Bai, Dai, Zhang, Zhang, & Hu, 2022).

But beyond commonplace assumptions, the cross-cultural research itself is more nuanced. Though some studies have reported more positive reactions to machines among people from the East than among their counterparts from the West (e.g., Li, Rau, & Li, 2010), others have found the opposite (e.g., Bartneck, Suzuki, Kanda, & Nomura, 2007). Lim et al. (2021: 1307) recently reviewed the past two decades of research on human–robot interaction (HRI) focusing on cultural influences and noted 'complex and intricate relationships between culture and human cognition', discussing how national culture and prior experience with robots affect human–robot interactions.

There may be no easy answer to the question, 'Do Eastern cultures like and accept machines more than Western cultures do?' Cultures are multifaceted and dynamic and may affect a wide range of attitudes and behaviors toward machines in diverse yet interactive ways. Rather than settling on a simplistic 'yes' or 'no', in this article we aim to dissect and understand the differences in how the East and the West think about and interact with machines at large. There is a need to go beyond the basic assumption that the East simply has more affinity for technology and to explore the specific differences and underlying mechanisms. Furthermore, most social science research on novel technologies has been conducted using samples from the West – specifically, from Western, educated, industrialized, rich, and democratic (WEIRD) societies (Henrich, Heine, & Norenzayan, 2010). Ironically, as previously discussed, many of these technologies are primarily deployed and used outside of the West. It is thus no longer sufficient to rely on findings situated in WEIRD societies to fully understand people's attitudes and behaviors toward machines without considering the significance of culture in shaping everyday thoughts and behaviors. The time is ripe to review the present state of knowledge about cultural attitudes and behaviors toward machines and to identify how and why the East and West may differ in this regard (see Table 1 for a summary).

Table 1. Summary of factors proposed to affect East-West differences in people's attitudes and behaviors toward new technologies

Historical and Religious Perspectives

The different historical legacies of Eastern and Western cultures may shed light on how these cultural traditions perceive machines today. Eastern and Western cultural traditions ultimately stem from the same deep historical roots (Bouckaert et al., 2022; Lee, Han, Byron, & Fan, 2008). Historical and ethnographic studies suggest that early societies in both regions held widespread beliefs in animism[4], in which many non-human agents or spiritual forces animated the natural world (Jackson, Dillion, et al., 2023; Peoples, Duda, & Marlowe, 2016). However, the religious and philosophical traditions of the East and West have diverged in key ways over the last 3,000 years. Many Eastern religious traditions, such as Buddhism and Shintoism, continue to emphasize the co-existence of humans and animistic agents or forces. In contrast, Western religious and philosophical traditions have come to emphasize 'human exceptionalism', the perception that human beings have distinct minds, rights, and capabilities not shared by non-human animals (Srinivasan & Kasturirangan, 2016). Belief in other non-human agents, such as spirits or disembodied forces, has also declined in Western cultures, further contributing to the view that humans have unique abilities, rights, and privileges (Jackson, Caluori, Gray, & Gelfand, 2021).

We argue that this difference in how the two cultures perceive and understand non-human entities shapes fundamentally divergent relationships with machines. For the East, machines can be part of the natural world as much as other life is; they complement it. For the West, machines are Others, too inherently different from humans. As such, machines are more accepted in Eastern cultures, where people view them as in harmony with themselves, whereas in Western cultures they are seen as dissimilar, unfamiliar aliens that may threaten people's identities and societies.

The legacy of animistic beliefs in present-day human–machine interaction is particularly visible in Japan. Broadly, in Japanese culture, non-human entities are frequently thought of as having souls or spirits, not unlike humans. Even entities that are neither human nor alive are perceived to possess life in their own right. These beliefs can be traced to the Shinto religion, which holds that kami, or divinity, lives in all parts of nature, such as the seas, mountains, and flowers (Asquith & Kalland, 1996). Notably, Shinto is the largest religion in Japan in practice, with over 80% of the Japanese population participating in some Shinto practices (Breen & Teeuwen, 2010). It is thus not uncommon there to practice rites honoring or respecting non-human entities, such as funeral ceremonies and offerings for pets after they die (Kenney, 2004), or the famous KonMari method, in which homeowners greet their houses and belongings upon entering and converse with them as if they were conscious, living beings (Kahn, 2021). The idea that spirits exist within non-humans is not just accepted but embraced in Shintoism. As Geraci (2006: 236) summarizes, 'robots fit into the natural world [of Japan] as easily as any other object'. Despite not being human, machines are easily accepted as part of the natural world, complementing what is already there.

In Western cultures, by contrast, the Judeo-Christian worldview emphasizes human uniqueness and importance (MacDorman, Vasudevan, & Ho, 2009). Medieval Christianity, for instance, posited a natural order or hierarchy in the world, known as the 'great chain of being', which has historically dominated Western thought (Nee, 2005). In this order, the Universe is ranked linearly, starting with rocks, then plants, animals, humans, angels, and finally, God. Notably, the order values humans over inanimate entities. In a similar vein, Westerners do not typically view non-human entities as possessing agency or as warranting moral concern the way humans do (Laham, 2009). Though Medieval Christianity no longer holds the influence it once did, scholars argue that the human exceptionalism worldview in the West has simply evolved into anthropocentrism (Daliot-Bul, 2019).

Anthropocentrism is the broad belief that humans are the only entities with intrinsic value, whereas the value of all other entities lies only in their instrumental value to humans (Goralnik & Nelson, 2012). Unlike human exceptionalism, anthropocentrism orders and arranges the world around humans rather than God. Though philosophically distinct, anthropocentrism resembles human exceptionalism in holding that humans are distinct and important – indeed, now they are the center. Either way, dominant Western thought distinguishes neatly between humans and non-humans. But why would this create machine aversion? First, people categorize others into in-groups and out-groups, in which in-group members share a social identity and out-group members do not. People behave more favorably toward their in-groups while discriminating against out-groups (Tajfel, Billig, Bundy, & Flament, 1971). Under human exceptionalism and/or anthropocentrism, the strict distinction between humans and non-humans likely exacerbates the social categorization of machines as an out-group, evoking unfavorable evaluations and reactions. Second, given these philosophies' basic assumption that humans are superior to all other entities (other than the divine), the rise of machines may be perceived as a threat (Floridi, 2014). Humans have always been positioned near the top or at the center of the order – usually assumed to be the most intelligent and exceptional (Finlay & Workman, 2013; Roth & Dicke, 2005). As such, the expanding capabilities and presence of machines may be seen as a threat of replacement, both realistic (people's jobs and livelihoods are on the line) and symbolic (people's identity as unique and superior entities is endangered) (Gray et al., 2023), contributing to the unease people feel about co-existing with machines. Hence, while animism in the East may promote a complementary and harmonious relationship between humans and machines by highlighting a commonality (that we all have a soul and a place in the world), Western human exceptionalism and anthropocentrism emphasize differences, presenting a more competitive relationship.

But how relevant are these historical and religious perspectives to the human–machine relationship today? Societies have become more secular over recent decades (Jackson et al., 2021), but the history of places and cultures continues to be reflected in people's behavior today. To illustrate, Talhelm et al. (2014) demonstrated that the agricultural histories of different regions in China (e.g., farming rice vs. farming wheat), which shaped community networks (e.g., interdependent vs. independent), continue to affect people's thinking styles (e.g., holistic vs. analytical) in the modern day. As such, even if these traditional beliefs are less subscribed to today than before, it is nevertheless likely that the opposing historical backdrops of the East and the West have played a significant role in cultivating the different cultural attitudes and behaviors toward machines observed today. Of course, the past alone cannot paint the full picture, and in the next section we discuss a more contemporary force that shapes cross-cultural perceptions of machines.

The Role of Exposure

Exposure to machines in everyday life can influence people's attitudes and behavior toward machines in important ways. Here, we discuss two modes of exposure to machines – (a) through the media and (b) through real-life experience – zooming in on how they differ across the East and the West, and how such differences in the quality and quantity of exposure can shape culturally divergent attitudes toward machines.

Cultural representations of technology

First, how a culture represents machines through its artifacts (e.g., toys, the media) can shape people's judgments of and preferences for them. This draws from the information processing approach (Entman, 1989), which proposes that salient information about a target object in the media has the power to influence people's pre-existing schemas (i.e., mental representations) of it. Indeed, the media can serve as a means by which cultural values are transmitted. How machines are represented in the mass media of each culture (e.g., as kind vs. evil) thus tells people what and how to think of them.

In Japan, the government[5] has long promoted the advancement of social robotics on a national scale to address the manpower needs of a rapidly ageing and declining population (Wagner, 2009). Supporting this push, the Japanese mass media has heavily promoted robot and automation acceptance through positive narratives of robots or androids working alongside humans for the benefit of humanity (e.g., Astro Boy, Power Rangers, Doraemon). Such positive depictions of machines tie in with – and are likely fostered by – the animistic beliefs of the East, which support the idea that machines can co-exist in peace and harmony with humans. Doraemon, for instance, features the adventures of an earless robotic cat that lives happily with the Nobi family, often coming to the rescue of their young son in times of trouble. The concept of robotic help is thus neither unfamiliar nor jarring; it is actively encouraged. Positive depictions of robots by the government and mass media, as helpers and as a potential solution to a societal problem, communicate and reinforce the idea that the appropriate or 'correct' attitude toward machines is acceptance, because machines benefit individuals and society. It is therefore unsurprising that Japanese people may exhibit a greater preference for and intention to use machines.

In contrast, media depictions of technology in the West tend to be far more mixed – some downright negative and threatening. The Hollywood blockbuster The Terminator (1984), for instance, stars a cyborg assassin as its antagonist and features the threat of an AI-triggered nuclear holocaust (Eoin, 2020). Narratives in which machines may one day outsmart and overtake humans are common in Western media and contribute to a larger, overarching cultural view of technology as a threatening entity that endangers us. Such narratives are also likely underpinned by the Western philosophy of human exceptionalism – they send the message that machines compete with humans for power and dominance, and they count on audiences rooting for the humans to win. Given the large amount of media discussion about the threat of machines replacing (or displacing) human employees, Westerners learn to perceive machines as a threat and to fear or dislike them. This is in line with Gerbner and Gross's (1976) cultivation theory, which posits that messages and images depicted in the media 'act like the pull of gravity toward an imagined center' over time (Judge & Cable, 2011: 96). Furthermore, negative events and information have asymmetrically strong effects on the human psyche compared to positive ones (Baumeister, Bratslavsky, Finkenauer, & Vohs, 2001), meaning that the impact of negative machine portrayals on people's attitudes likely supersedes the influence of positive depictions. Given the contrasting dominant narratives the two cultures hold about machines, it is no surprise that the West might be more ambivalent toward machines than the East (De Boer, Jansen, Bustos, Prinse, Horwitz, & Hoorn, 2021).

Technology exposure

Second, real-life experience interacting with machines can also affect people's attitudes toward them. Researchers usually hypothesize that more prior exposure to machines translates into more positive attitudes, presumably based on the mere-exposure hypothesis (that familiarity leads to liking) (Harrison, 1977), or on the idea that exposure reduces uncertainty and anxiety toward machines (Bartneck et al., 2007). Indeed, some researchers have demonstrated that previous interactions with robots, or real-life experiences of seeing them in action, correlate with less negative attitudes toward them. For example, Nomura, Suzuki, Kanda, and Kato (2006) found that Japanese students who had seen real robots in person had less negative attitudes toward them than those who had not. In another study, Bartneck et al. (2007) found that participants who had directly interacted with Aibo, a robot dog, rated lower on negative attitudes toward robots than participants who had not.

Given that robot density in Asia exceeds that of Europe and the Americas (IFR, 2021b), the East can be expected to afford more opportunities to experience and interact with robots than the West, contributing to differential attitudes toward them. This line of reasoning was explored by Li et al. (2010), who found that German participants liked, trusted, and engaged with social robots less than Chinese and Korean participants did, purportedly due to Germany's general lack of exposure to social robots (though Germans are more exposed to industrial robots, such as welding robots and robotic arms). In contrast, social robots are presumably more commonplace and better utilized in Korea (a culture that prefers 'small and slow', in line with the appearance and mobility of social robots) than in Germany (a culture that prefers 'big and fast', more in line with the features of industrial robots). Separately, Han, Hyun, Kim, Cho, Kanda, and Nomura (2009) found that parents in Spain expressed the most negativity about the use of educator robots for their children, compared to parents from Korea and Japan. Korean parents were the least resistant to educator robots, consistent with the existing prevalence of e-learning in Korea. These studies suggest that different cultures afford different levels of familiarity with machines – and such differences relate to less negativity and aversion toward machines in the East than in the West.

Further unpacking technology exposure

Although extant research has largely supported our proposed cultural perspective on human–machine interactions, there are exceptions. While Li et al. (2010) and Han et al. (2009) reported fewer negative attitudes and less machine aversion in Eastern cultures than in Western ones, a handful of studies have found that attitudes across cultures are more similar than different. For example, MacDorman et al. (2009) found that both Japanese and US participants self-reported preferring humans over robots and implicitly associated robots with weapons more than they did humans (albeit slightly more so in the US than in Japan), despite Japanese participants having had more exposure to robot-related content through both the media and real-life interactions. Similarly, Haring, Mougenot, Ono, and Watanabe (2014) found that Japanese and European participants reported similar assumptions about and attitudes toward robots, neither more positive than the other, as well as similar levels of fear. They also found that Japanese participants had higher robot exposure than European participants only through the media (e.g., TV, manga), but less through personal contact.

Surprisingly, despite both studies finding that Japan has some form of greater exposure to robots than the US and Europe, neither found strong support for Japanese people having a stronger preference for, or more positive attitudes toward, robots than Western participants. A plausible explanation is that although interacting with machines in daily life provides ample room to observe and learn about their value, the opposite is also true: people may learn about machine failures, have bad experiences with them, or feel disappointed and disillusioned when machines do not meet their expectations (Yam, Bigman, Tang, et al., 2021; Yam, Goh, Fehr, Lee, Soh, & Gray, 2022). As Bartneck et al. (2007) suggest, prior exposure to robots may have made Japanese participants more aware not only of robots' capabilities but also of their limitations and weaknesses.

Given these mixed findings, it is difficult to conclude the true extent of differences in the amount of technology exposure the East versus the West affords, or how divergent (or convergent) attitudes toward machines are across the two cultures. What can be said is that machine exposure as discussed so far is only one piece of the cultural story. Yes, there are more robots in the East than in the West (IFR, 2021b) – but does everyone in a culture get the same access to them? Other factors like age, gender, and occupation may also affect a person's likelihood of prior exposure to machines, and depending on a study's sample demographics, its findings may or may not represent a cultural group's general level of exposure. And if the East does afford more opportunities to interact with machines than the West, are these interactions always positive? Do they promote machine acceptance, or do they backfire and create machine aversion? Beyond how much exposure someone has had to machines in the past, the quality of that exposure needs to be examined too. Future research can therefore clarify the role of technology exposure in cultural attitudes toward machines by studying machine exposure not just in terms of quantity, but of quality as well.

New Applications of Machines in Asia

Thus far, we have reviewed and proposed plausible reasons for how and why different cultures may diverge in their views of machines. Importantly, culture forms another piece of the puzzle in understanding the who of 'who likes machines, and who does not?' But to return to the what – what do machines do that people like or dislike? – here we review the benefits and drawbacks of three new and exciting types of machine applications found primarily in Asia. Our review suggests that Asians appear more receptive than their Western counterparts to interacting with machines in the social, moral, and spiritual domains. This review is by no means exhaustive; rather, it aims to facilitate generative research beyond the traditional paradigm of machine aversion versus appreciation by exploring how machines might fundamentally change human-to-human relationships in these unique contexts.

Sex Robots Change Romantic Relationships

The sex robot industry is booming, estimated to be worth $30 billion in 2019 and expected to double by 2026 (Williams, 2021; Figure 1A). Nagoya, Japan, for instance, is home to a 'hyper-realistic' robot brothel, where customers pay ¥13,000 (approximately US$100) for an hour's session with one of the brothel's four sex robots (Hicks, 2019). Meanwhile, in China, tech companies have launched sales of customizable, AI-powered sex dolls that can hold simple conversations with users and move their eyes and arms (Song, 2018). On the one hand, such developments might help alleviate loneliness and provide a source of intimacy for people with social anxiety. On the other hand, the surge of sex robots raises ethical challenges. Learning to compromise and think about other people is important for healthy psychological growth and maturity, and partners provide a valuable source of social support. Seeking companionship with robots rather than humans might increase egoism or reduce human reliance on one another for social connection (Kiron & Unruh, 2018). It might also change the way people think about their romantic interests: if people can get sexual gratification from robots, they may come to objectify potential future partners, be they mechanical or flesh and blood.

Figure 1. Benefits and ethical threats of robots in sex, religion, and mental health. Note: The left panel (A) shows Henry, a sexbot, sold online. From Meet Henry, the Male Sex Robot With Artificial Intelligence and a British Accent [Photograph], by Realbotix, 2018, Allure (https://www.allure.com/story/realbotix-henry-male-sex-robot-with-artificial-intelligence). The middle panel (B) shows Mindar, a robot priest introduced in a 400-year-old temple in Kyoto, Japan. From Kyoto Temple Enlists Android Buddhist Deity To Help People [Photograph], by The Asahi Shimbun, 2019, Getty Images (https://www.gettyimages.com/detail/news-photo/android-kannon-named-minder-is-displayed-at-kodaiji-temple-news-photo/1131988643). The right panel (C) shows NAO, a 60-cm robot piloted in Singapore to engage children with autism in social interactions. From NAO Robot Aims to Help Kids with Autism Become More Social [Photograph], by Nuria Ling, 2013, The Straits Times (https://www.straitstimes.com/singapore/nao-robot-aims-to-help-kids-with-autism-become-more-social).

Clergy Robots Change People's Relationship with the Divine and Religious Leaders

For millennia, religious groups around the world have elevated people to elite roles such as shaman, priest, or medicine man. These religious elites have in turn been crucial in maintaining the credibility of the religious beliefs they espouse (Henrich, 2009; Lanman & Buhrmester, 2017). However, as religious decline has accelerated in multiple world regions (Norris & Inglehart, 2011), some religious groups are using 'robot priests' to try to attract younger, more technologically savvy adherents.

The rise of robot priests has been especially pronounced in Japan, consistent with Japan's cultural legacy of elevating non-human agents to anthropocentric roles. For example, Japan's SoftBank Robotics is producing a new line of Pepper robot Buddhist monks that will lead funeral rites (Gibbs, 2017). A 400-year-old Japanese temple has taken this trend one step further by introducing a robot named Mindar to deliver Buddhist sermons (Figure 1B; see also Jackson, Yam, Tang, Liu, & Shariff, 2023). The rise of robots in East Asian religious settings contrasts with a deep aversion to religious robots in many Western cultural contexts. For example, the advent of large language models such as ChatGPT has prompted a wave of opposition to the possibility of automated sermons (Crary, 2023; Gerber, 2023). The 2017 introduction of a robot named 'BlessU-2' in a German Protestant church provoked a wave of media interest, but it had a fairly minor responsibility (reading blessings in different languages) and has not spread in the intervening years.

Time will tell whether robot preachers continue to be accepted in East Asian contexts. Given people's attraction to credible and charismatic religious figures (Lanman & Buhrmester, 2017; Sperber, 2010), robot preachers may fail to inspire the same commitment as human preachers and may eventually drift out of style. But the fact that temples in Japan are adopting automated agents in religious contexts suggests that East Asians may be less averse than Westerners to machines making moral decisions (Bigman & Gray, 2018). This may foreshadow an era in which machines seep into the moral and religious spheres of East Asian societies. Such a trend could have implications for well-being and ethics. For example, people see religious leaders as sources of community support during difficult circumstances, and this trust and support may be undermined when robots serve in these roles.

Robot Therapists Shape Information Disclosure to Others

A final area where robots are transforming jobs is psychotherapy. People are often reluctant to seek help because confiding in a mental health professional makes them vulnerable and is stigmatized in many cultures. In East Asian cultures, for instance, unrestrained expression of emotion is generally frowned upon and sometimes viewed as a threat to social harmony (Ng, 1997). Unsurprisingly, fear of stigmatization around mental health can significantly delay people from seeking treatment (Subramaniam et al., 2020). Robot therapists offer a potential solution to this problem (Figure 1C).

For example, effective psychological therapies often require full disclosure of the patient's darkest fears and secrets, which can be difficult to achieve with human therapists as clients struggle with feelings of embarrassment or shame. Significantly, self-disclosure to a robot does not seem to evoke the same kind or extent of resistance – research has found that people engage in more self-disclosure, particularly on negative topics, when interacting with robot therapists than with human therapists (Takahashi, Takahashi, Ban, Shimaya, Yoshikawa, & Ishiguro, 2017). This may be why people have been confiding in and seeking mental health advice from ChatGPT, a popular AI language model chatbot released in late 2022 (Broderick, 2023). Interestingly, ChatGPT was not designed or intended as a therapy chatbot. Yet chatbots like these can promise anonymity and non-judgment, possibly more so than professional human therapists, which may make them more appealing 'listeners' to some audiences. Such trends suggest there is already some openness to – or even a demand for – robot therapists. In Asia, chatbots designed specifically for counseling have been sprouting up over the past few years, such as Singapore's mindline.sg, launched by the government to support the community during the COVID-19 pandemic (Goh, 2020). It guides users through meditative and breathing exercises, among other care techniques. In China, a similar AI chatbot designed to provide free, around-the-clock counseling services, Xiaotian, is in development (Xinhua, 2021). According to its creators, Xiaotian will be able to guide users through 50-minute conversations about their feelings and experiences and direct them to professional help if needed.

But despite the potential benefits of robot therapists, these developments also raise ethical concerns. Beyond the obvious issues of privacy and data security, there is evidence that a therapeutic alliance between patient and therapist is necessary for effective psychotherapy (Horvath & Luborsky, 1993). With non-human therapists, this might be lost. Moreover, deep and personal disclosures to a robot therapist might ironically further reduce people's desire to connect socially with other humans. Such robots might create a paradox for social connectedness: on the one hand, they allow people to receive counseling and access mental health resources; on the other, they can lead to further social alienation and ostracism if people who seek help from robots no longer feel the need or desire to share their deepest disclosures with other people.

Future Research

Having proposed a cross-cultural perspective and discussed unique machine applications in Asia, we now turn to future research directions we believe will be fruitful. We organize this section into three broad categories – unexamined populations and underlying mechanisms, boundary conditions within and across cultures, and the paradox of technology development versus adoption and deployment.

Unexamined Populations and Underlying Mechanisms

The examples and research cited throughout this article pertaining to the East have primarily been based on developed economies in Asia, such as China, Singapore, Japan, and South Korea. This largely corresponds to the top three countries in robot density – South Korea, Singapore, and Japan (IFR, 2021b). China has also seen tremendous growth in robot application – robot density per capita grew by over 400% in China between 2015 and 2020 (compared to less than 45% growth in the US). Still, we call for more research on human–machine interactions in other parts of Asia, particularly the developing economies of Southeast and South Asia. Approximately one-third of the world's population resides in this region, yet there is little research on how South and Southeast Asians perceive new technologies and machines – a problem noted in many areas of behavioral science research (e.g., Bernardo, Mateo, & Dela Cruz, 2022). That said, we have reason to believe that some of the aforementioned cross-cultural differences might replicate in these regions. For example, in one rare study, Yam et al. (2023) examined how engineers from India reacted to robots, and the results were largely consistent with samples from Singapore.

Although these cross-cultural differences in how people perceive new technologies may replicate, their underlying mechanisms might differ. Throughout this article, we have proposed a framework to explain why cross-cultural differences in machine aversion versus appreciation exist, but most of the proposed mechanisms remain untested. We also urge future research to unpack cultural mechanisms not discussed in this article. For example, cognitive differences between cultures may help explain why Asians are more receptive to machines than Westerners. Because Asians are more likely to be holistic thinkers (Markus & Kitayama, 1991) and relationship-focused (Qin et al., 2023), they may be more likely to perceive machines in terms of their relationships with others. In other words, machines can be thought of as helpful and/or harmful depending on the relationship in question (e.g., beneficial to consumers but harmful to some employees). Westerners, on the other hand, are more likely to be analytical thinkers, perceiving machines in isolation and, hence, as potential threats. As another example, much extant research has examined cultural differences in trust toward machines (e.g., Haring, Silvera-Tawil, Matsumoto, Velonaki, & Watanabe, 2014; Wang, Rau, Evers, Robinson, & Hinds, 2010), but this line of work has not examined more nuanced mechanisms such as cognitive versus affective trust (McAllister, 1995). Notably, the latter has been found to be especially important in Asian contexts (Chen, Eberly, Chiang, Farh, & Cheng, 2014) and is likely more difficult for machines to cultivate.

Boundary Conditions within and Across Cultures

Although we have argued that Asians generally react more positively to machines than their counterparts in the West, there are important moderators to consider within cultures. For example, we suggest an interesting industry-level moderating effect: although Asians are generally more accepting of robots than Westerners, they may reject the deployment of robotic caregivers in nursing homes, because using robots to take up the responsibilities of caring for one's aged parents may be at odds with the Confucian concept of filial piety (Ng, Lee, & Wu, 2022). Conversely, the use of robots for surveillance may be better accepted in the East than in the West, as Western cultures tend to value individual rights to privacy and freedom more strongly than Eastern cultures do. A closely related illustration is the pervasive uptake of digital tracking tools to support contact tracing in East Asian nations during the COVID-19 pandemic, compared to Western countries (Cha, 2020).

In addition, there are broader moderators applicable across cultures. One such moderator is age or generational difference. Mahmud, Islam, Ahmed, and Smolander's (2022) review of algorithm aversion identified that older people find algorithms less useful and trustworthy. However, it also reported conflicting findings regarding older people's preference for algorithms over humans – Thurman, Moeller, Helberger, and Trilling (2019) found that older people preferred human news recommenders over algorithmic ones, whereas Ho, Wheatley, and Scialfa (2005) found that older people depended more on an algorithm to perform a medication management task than younger people did. Turning to robots, some studies have found that older adults feel more fearful and anxious about having robots at home (Scopelliti, Giuliani, & Fornara, 2005) and tend to hold more negative implicit attitudes toward robots than younger people (Chien et al., 2019). All in all, age is likely a relevant but incomplete piece in understanding algorithm aversion. Older people are indeed less likely to be familiar with new technologies and machines, which may account for greater anxiety and distrust. On the other hand, older people may also find machines more valuable and useful than younger people do for tasks that are too demanding or difficult for them (e.g., in Ho et al.'s (2005) study, older people relied more on the algorithm because they had less confidence in their own ability to perform the task). In sum, we encourage future work to seriously examine age or generational difference and technology familiarity/affinity as moderators, because these factors might trump the cross-cultural differences identified in this article.

The Paradox of Technology Development Versus Adoption and Deployment

Finally, our review reveals an interesting paradox – most new AI innovations are happening in the West, yet their widespread application and deployment are often found in the East. One possible reason for the slow uptake of AI and machines in real-world settings in the West is the lengthy review and approval processes of regulatory authorities. Interestingly, global surveys have found that concerns about the harms of AI decision-making run highest in regions like Latin America and North America, whereas Southeast Asia and East Asia report much lower levels of such worries (Neudert, Knuutila, & Howard, 2020). These attitudes of concern and apprehension may be reflected in societies' relative willingness or resistance to deploying newly developed technologies, explaining some of the West's relatively slow deployment of new technologies despite its thriving state of innovation.

Conclusion

In this perspective article, we have provided a brief review of the recent machine aversion versus appreciation debate through a cultural lens, drawing on historical and religious perspectives and the role of machine exposure. Additionally, we discussed three unique applications of machines found primarily in Asia – in the domains of sex, religion, and mental health – that could generate exciting future research beyond the traditional paradigm of machine aversion versus appreciation. Given the extensive and exciting developments in technology in both the East and the West, the time is ripe for cross-cultural scholarly collaboration to realize the full potential of human–machine interaction.

Acknowledgment

This research is supported by a Singapore Ministry of Education Tier 1 grant (A-8000768-00-00) awarded to the first author.

Footnotes

[1] In this article, we use the term machines to refer loosely to embodied industrial and social robots, disembodied algorithms, automation, and artificial intelligence.

[2] Historically, cross-cultural psychology has studied East–West differences as a dichotomy (Markus & Kitayama, 1991). However, many contemporary scholars have argued that cultural differences are best viewed in relative terms (Takano & Osaka, 2018), a view we endorse in this article.

[3] In this article, we suggest that religion is a form of culture that can vary along cultural axioms (see Cohen, 2009), and that religious groups and other cultural groups constantly exchange cultural values such as cultural tightness (see Caluori, Jackson, Gray, & Gelfand, 2020). Since cross-cultural psychologists have often acknowledged cultural and religious values as intertwined, we do not distinguish them in our work.

[4] Animism and anthropomorphism are both religious and cultural beliefs. Generally speaking, while cultural and religious values are not synonymous, they are heavily intertwined and affect one another. Animism and anthropomorphism, for example, have been treated as religious beliefs since the earliest days of cultural anthropology by scholars like Max Müller and Edward Burnett Tylor, and they continue to be studied in the psychology of religion (Jackson, Dillion, et al., 2023). But these beliefs have also become part of many metaphysical philosophical traditions – particularly in East Asia – and are now held by many people who may not consider themselves traditionally religious (Fuller, 2001).

[5] An in-depth discussion of political and institutional strategies is beyond the scope of this article, but we acknowledge that they play an important role in shaping one's exposure to machines. In our view, culture and national politics or agendas are not independent of each other; the Japanese government's national strategy to promote robotics, for example, has bled into people's sociocultural identity (Kovacic, 2018).

References

Asquith, P. J., & Kalland, A. 1996. Japanese images of nature: Cultural perspectives. (1st ed.). London, UK: Routledge.Google Scholar
Bai, B., Dai, H., Zhang, D., Zhang, F., & Hu, H. 2022. The impacts of algorithmic work assignment on fairness perceptions and productivity: Evidence from field experiments. Manufacturing & Service Operations Management, 24(6): 27973306.Google Scholar
Bartneck, C., Suzuki, T., Kanda, T., & Nomura, T. 2007. The influence of people's culture and prior experiences with Aibo on their attitude towards robots. AI & Society, 21: 217–230. https://doi.org/10.1007/s00146-006-0052-7
Baumeister, R. F., Bratslavsky, E., Finkenauer, C., & Vohs, K. D. 2001. Bad is stronger than good. Review of General Psychology, 5(4): 323–370. https://doi.org/10.1037/1089-2680.5.4.323
Bernardo, A. B. I., Mateo, N. J., & Dela Cruz, I. C. 2022. The psychology of well-being in the margins: Voices from and prospects for South Asia and Southeast Asia. Psychological Studies, 67(3): 273–280. https://doi.org/10.1007/s12646-022-00676-5
Bigman, Y. E., & Gray, K. 2018. People are averse to machines making moral decisions. Cognition, 181: 21–34. https://doi.org/10.1016/j.cognition.2018.08.003
Bigman, Y. E., Yam, K. C., Marciano, D., Reynolds, S. J., & Gray, K. 2021. Threat of racial and economic inequality increases preference for algorithm decision-making. Computers in Human Behavior, 122: 106859. https://doi.org/10.1016/j.chb.2021.106859
Bouckaert, R., Redding, D., Sheehan, O., Kyritsis, T., Gray, R., Jones, K. E., & Atkinson, Q. 2022. Global language diversification is linked to socio-ecology and threat status. https://doi.org/10.31235/osf.io/f8tr6
Breen, J., & Teeuwen, M. 2010. A new history of Shinto. West Sussex, UK: John Wiley & Sons. https://doi.org/10.1002/9781444317190
Broderick, R. 2023. People are using AI for therapy, whether the tech is ready for it or not. Fast Company. Available from URL: https://www.fastcompany.com/90836906/ai-therapy-koko-chatgpt
Caluori, N., Jackson, J. C., Gray, K., & Gelfand, M. 2020. Conflict changes how people view god. Psychological Science, 31(3): 280–292. https://doi.org/10.1177/0956797619895286
Castelo, N., Bos, M. W., & Lehmann, D. R. 2019. Task-dependent algorithm aversion. Journal of Marketing Research, 56(5): 809–825. https://doi.org/10.1177/0022243719851788
Cha, V. 2020. Asia's COVID-19 lessons for the West: Public goods, privacy, and social tagging. The Washington Quarterly, 43(2): 1–18. https://doi.org/10.1080/0163660X.2020.1770959
Chen, X.-P., Eberly, M. B., Chiang, T.-J., Farh, J.-L., & Cheng, B.-S. 2014. Affective trust in Chinese leaders: Linking paternalistic leadership to employee performance. Journal of Management, 40(3): 796–819. https://doi.org/10.1177/0149206311410604
Chien, S.-E., Li, C., Lee, H.-H., Yang, C.-C., Lin, F.-H., Yang, P.-L., Wang, T.-M., & Yeh, S.-L. 2019. Age difference in perceived ease of use, curiosity, and implicit negative attitude toward robots. ACM Transactions on Human-Robot Interaction, 8(2): 1–19. https://doi.org/10.1145/3311788
Cohen, A. B. 2009. Many forms of culture. American Psychologist, 64(3): 194–204. https://doi.org/10.1037/a0015308
Craft, L. 2022. Robo-dogs and therapy bots: Artificial intelligence goes cuddly. CBS News. Available from URL: https://www.cbsnews.com/news/robo-dogs-therapy-bots-artificial-intelligence/
Crary, D. 2023. Pastors' view: Sermons written by ChatGPT will have no soul. Wisconsin State Journal. Available from URL: https://madison.com/lifestyles/faith-and-values/pastors-view-sermons-written-by-chatgpt-will-have-no-soul/article_273e7cb7-302d-5978-9337-54675b4e9d26.html
Daliot-Bul, M. 2019. Ghost in the shell as a cross-cultural franchise: From radical posthumanism to human exceptionalism. Asian Studies Review, 43(3): 527–543. https://doi.org/10.1080/10357823.2019.1631257
De Boer, S., Jansen, B., Bustos, V. M., Prinse, M., Horwitz, Y., & Hoorn, J. F. 2021. Social robotics in Eastern and Western newspapers: China and (even) Japan are optimistic. International Journal of Innovation and Technology Management, 18(1): 2040001. https://doi.org/10.1142/S0219877020400015
Dietvorst, B. J., Simmons, J. P., & Massey, C. 2015. Algorithm aversion: People erroneously avoid algorithms after seeing them err. Journal of Experimental Psychology: General, 144(1): 114–126. https://doi.org/10.1037/xge0000033
Dietvorst, B. J., Simmons, J. P., & Massey, C. 2018. Overcoming algorithm aversion: People will use imperfect algorithms if they can (even slightly) modify them. Management Science, 64(3): 1155–1170. https://doi.org/10.1287/mnsc.2016.2643
Entman, R. M. 1989. How the media affect what people think: An information processing approach. The Journal of Politics, 51(2): 347–370. https://doi.org/10.2307/2131346
Eoin. 2020. The terminator (1984) review. The Action Elite. Available from URL: https://theactionelite.com/the-terminator-1984-review/
Fannin, R. 2020. The rush to deploy robots in China amid the coronavirus outbreak. CNBC. Available from URL: https://www.cnbc.com/2020/03/02/the-rush-to-deploy-robots-in-china-amid-the-coronavirus-outbreak.html
Finlay, B. L., & Workman, A. D. 2013. Human exceptionalism. Trends in Cognitive Sciences, 17(5): 199–201. https://doi.org/10.1016/j.tics.2013.03.001
Floridi, L. 2014. Smart, autonomous, and social: Robots as challenge to human exceptionalism. In Sociable robots and the future of social relations, vol. 273: 11. IOS Press. Available from URL: https://ebooks.iospress.nl/doi/10.3233/978-1-61499-480-0-11
Fuller, R. C. 2001. Spiritual, but not religious: Understanding unchurched America. New York: Oxford University Press. https://doi.org/10.1093/0195146808.001.0001
Gamez-Djokic, M., & Waytz, A. 2020. Concerns about automation and negative sentiment toward immigration. Psychological Science, 31(8): 987–1000. https://doi.org/10.1177/0956797620929977
Gao, S., He, L., Chen, Y., Li, D., & Lai, K. 2020. Public perception of artificial intelligence in medical care: Content analysis of social media. Journal of Medical Internet Research, 22(7): e16649. https://doi.org/10.2196/16649
Geraci, R. M. 2006. Spiritual robots: Religion and our scientific view of the natural world. Theology and Science, 4(3): 229–246. https://doi.org/10.1080/14746700600952993
Gerber, A. 2023. ChatGPT has no future in the pulpit. Preaching Today. Available from URL: https://www.preachingtoday.com/skills/2023/chatgpt-has-no-future-in-pulpit.html
Gerbner, G., & Gross, L. 1976. Living with television: The violence profile. Journal of Communication, 26(2): 172–199. https://doi.org/10.1111/j.1460-2466.1976.tb01397.x
Gibbs, S. 2017. The future of funerals? Robot priest launched to undercut human-led rites. The Guardian. Available from URL: https://www.theguardian.com/technology/2017/aug/23/robot-funerals-priest-launched-softbank-humanoid-robot-pepper-live-streaming
Goh, Y. H. 2020. New 'emotionally intelligent' chatbot to help Singaporeans stressed by pandemic. The Straits Times. Available from URL: https://www.straitstimes.com/singapore/community/new-emotionally-intelligent-chatbot-to-help-singaporeans-stressed-by-pandemic
Goralnik, L., & Nelson, M. P. 2012. Anthropocentrism. In Chadwick, R. (Ed.), Encyclopedia of applied ethics (2nd ed.): 145–155. Academic Press. https://doi.org/10.1016/B978-0-12-373932-2.00349-5
Gray, K., Yam, K. C., Eng, A., Wilbanks, D., & Waytz, A. 2023. The psychology of social robots and artificial intelligence. In Gilbert, D., et al. (Eds.), Handbook of social psychology (6th ed.). Cambridge: Situational Press.
Guizzo, E. 2010. Hiroshi Ishiguro: The man who made a copy of himself. IEEE Spectrum. Available from URL: https://spectrum.ieee.org/hiroshi-ishiguro-the-man-who-made-a-copy-of-himself https://doi.org/10.1109/MSPEC.2010.5434851
Han, J., Hyun, E., Kim, M., Cho, H.-K., Kanda, T., & Nomura, T. 2009. The cross-cultural acceptance of tutoring robots with augmented reality services. International Journal of Digital Content Technology and Its Applications, 3(2): 95–102. https://doi.org/10.4156/jdcta.vol3.issue2.han
Haring, K. S., Mougenot, C., Ono, F., & Watanabe, K. 2014. Cultural differences in perception and attitude towards robots. International Journal of Affective Engineering, 13(3): 149–157. https://doi.org/10.5057/ijae.13.149
Haring, K. S., Silvera-Tawil, D., Matsumoto, Y., Velonaki, M., & Watanabe, K. 2014. Perception of an android robot in Japan and Australia: A cross-cultural comparison. International Conference on Social Robotics, 8755: 166–175. https://doi.org/10.1007/978-3-319-11973-1_17
Harrison, A. A. 1977. Mere exposure. Advances in Experimental Social Psychology, 10: 39–83. https://doi.org/10.1016/S0065-2601(08)60354-8
Henrich, J. 2009. The evolution of costly displays, cooperation and religion: Credibility enhancing displays and their implications for cultural evolution. Evolution and Human Behavior, 30(4): 244–260. https://doi.org/10.1016/j.evolhumbehav.2009.03.005
Henrich, J., Heine, S. J., & Norenzayan, A. 2010. Most people are not WEIRD. Nature, 466(7302): 29. https://doi.org/10.1038/466029a
Hicks, A. 2019. Sex robot brothel opens in Japan amid surge of men wanting bisexual threesomes. The Mirror. Available from URL: https://www.mirror.co.uk/news/weird-news/sex-robot-brothel-opens-japan-14792161
Ho, G., Wheatley, D., & Scialfa, C. T. 2005. Age differences in trust and reliance of a medication management system. Interacting with Computers, 17(6): 690–710. https://doi.org/10.1016/j.intcom.2005.09.007
Horvath, A. O., & Luborsky, L. 1993. The role of the therapeutic alliance in psychotherapy. Journal of Consulting and Clinical Psychology, 61(4): 561–573. https://doi.org/10.1037/0022-006X.61.4.561
IEEE. 2022. Paro. ROBOTS: Your Guide to the World of Robotics. Available from URL: https://robots.ieee.org/robots/paro/
IFR. 2021a. IFR presents world robotics 2021 reports. IFR International Federation of Robotics. Available from URL: https://ifr.org/ifr-press-releases/news/robot-sales-rise-again
IFR. 2021b. Robot density nearly doubled globally. IFR International Federation of Robotics. Available from URL: https://ifr.org/ifr-press-releases/news/robot-density-nearly-doubled-globally
International Institute of Communications. 2020. Artificial intelligence in the Asia-Pacific region. Available from URL: https://www.iicom.org/wp-content/uploads/IIC-AI-Report-2020.pdf
Jackson, J. C., Caluori, N., Gray, K., & Gelfand, M. 2021. The new science of religious change. American Psychologist, 76(6): 838–850. https://doi.org/10.1037/amp0000818
Jackson, J. C., Yam, K. C., Tang, P. M., Liu, T., & Shariff, A. 2023. Exposure to robot preachers undermines religious commitment. Journal of Experimental Psychology: General (in press). https://doi.org/10.1037/xge0001443
Jackson, J. C., Dillion, D., Bastian, B., Watts, J., Buckner, W., DiMaggio, N., & Gray, K. 2023. Supernatural explanations across 114 societies are more common for natural than social phenomena. Nature Human Behaviour, 7: 707–717. https://doi.org/10.1038/s41562-023-01558-0
Judge, T. A., & Cable, D. M. 2011. When it comes to pay, do the thin win? The effect of weight on pay for men and women. Journal of Applied Psychology, 96(1): 95–112. https://doi.org/10.1037/a0020860
Kagitcibasi, C., & Berry, J. W. 1989. Cross-cultural psychology: Current research and trends. Annual Review of Psychology, 40: 493–531. https://doi.org/10.1146/annurev.ps.40.020189.002425
Kahn, J. 2021. Eros in the closet. Psychological Perspectives, 64(3): 413–418. https://doi.org/10.1080/00332925.2021.1996827
Kenney, E. 2004. Pet funerals and animal graves in Japan. Mortality, 9(1): 42–60. https://doi.org/10.1080/13576270410001652532
Kiron, D., & Unruh, G. 2018. Even if AI can cure loneliness—Should it? MIT Sloan Management Review. Available from URL: https://sloanreview.mit.edu/article/even-if-ai-can-cure-loneliness-should-it/
Kovacic, M. 2018. The making of national robot history in Japan: Monozukuri, enculturation and cultural lineage of robots. Critical Asian Studies, 50(4): 572–590. https://doi.org/10.1080/14672715.2018.1512003
Laham, S. M. 2009. Expanding the moral circle: Inclusion and exclusion mindsets and the circle of moral regard. Journal of Experimental Social Psychology, 45(1): 250–253. https://doi.org/10.1016/j.jesp.2008.08.012
Lanman, J. A., & Buhrmester, M. D. 2017. Religious actions speak louder than words: Exposure to credibility-enhancing displays predicts theism. Religion, Brain & Behavior, 7(1): 3–16. https://doi.org/10.1080/2153599X.2015.1117011
Lee, Y.-T., Han, A.-G., Byron, T. K., & Fan, H.-X. 2008. Daoist leadership: Theory and practice. In Chen, C. C., & Lee, Y. T. (Eds.), Leadership and management in China: Philosophies, theories, and practices: 83–107. New York: Cambridge University Press. https://doi.org/10.1017/CBO9780511753763.005
Lehman, D. R., Chiu, C., & Schaller, M. 2004. Psychology and culture. Annual Review of Psychology, 55: 689–714. https://doi.org/10.1146/annurev.psych.55.090902.141927
Li, D., Rau, P. L. P., & Li, Y. 2010. A cross-cultural study: Effect of robot appearance and task. International Journal of Social Robotics, 2(2): 175–186. https://doi.org/10.1007/s12369-010-0056-9
Li, W. 2009. Different communication rules between the English and Chinese greetings. Asian Culture and History, 1(2): 72–74. https://doi.org/10.5539/ach.v1n2p72
Lim, V., Rooksby, M., & Cross, E. S. 2021. Social robots on a global stage: Establishing a role for culture during human–robot interaction. International Journal of Social Robotics, 13(6): 1307–1333. https://doi.org/10.1007/s12369-020-00710-4
Logg, J. M., Minson, J. A., & Moore, D. A. 2019. Algorithm appreciation: People prefer algorithmic to human judgment. Organizational Behavior and Human Decision Processes, 151: 90–103. https://doi.org/10.1016/j.obhdp.2018.12.005
Lu, J. G. 2023. Asians don't ask? Relational concerns, negotiation propensity, and starting salaries. Journal of Applied Psychology, 108(2): 273–290. https://doi.org/10.1037/apl0001017
Lufkin, B. 2020. What the world can learn from Japan's robots. BBC. Available from URL: https://www.bbc.com/worklife/article/20200205-what-the-world-can-learn-from-japans-robots
MacDorman, K. F. 2006. Subjective ratings of robot video clips for human likeness, familiarity, and eeriness: An exploration of the uncanny valley. ICCS/CogSci-2006 Long Symposium: Toward Social Mechanisms of Android Science: 26–29. Available from URL: http://www.macdorman.com/kfm/writings/pubs/MacDorman2006SubjectiveRatings.pdf
MacDorman, K. F., Vasudevan, S. K., & Ho, C.-C. 2009. Does Japan really have robot mania? Comparing attitudes by implicit and explicit measures. AI & Society, 23(4): 485–510. https://doi.org/10.1007/s00146-008-0181-2
Mahmud, H., Islam, A. K. M. N., Ahmed, S. I., & Smolander, K. 2022. What influences algorithmic decision-making? A systematic literature review on algorithm aversion. Technological Forecasting and Social Change, 175: 121390. https://doi.org/10.1016/j.techfore.2021.121390
Markus, H. R., & Kitayama, S. 1991. Culture and the self: Implications for cognition, emotion, and motivation. Psychological Review, 98(2): 224–253. https://doi.org/10.1037/0033-295X.98.2.224
Markus, H. R., & Kitayama, S. 2010. Cultures and selves: A cycle of mutual constitution. Perspectives on Psychological Science, 5(4): 420–430. https://doi.org/10.1177/1745691610375557
McAllister, D. J. 1995. Affect- and cognition-based trust as foundations for interpersonal cooperation in organizations. The Academy of Management Journal, 38(1): 24–59. https://doi.org/10.2307/256727
Mims, C. 2010. Why Japanese love robots (and Americans fear them). MIT Technology Review. Available from URL: https://www.technologyreview.com/2010/10/12/120635/why-japanese-love-robots-and-americans-fear-them/
Nee, S. 2005. The great chain of being. Nature, 435(7041): 429. https://doi.org/10.1038/435429a
Neudert, L.-M., Knuutila, A., & Howard, P. N. 2020. Global attitudes towards AI, machine learning & automated decision making. Oxford Commission on AI & Good Governance. Available from URL: https://oxcaigg.oii.ox.ac.uk/wp-content/uploads/sites/11/2020/10/GlobalAttitudesTowardsAIMachineLearning2020.pdf
Ng, C. H. 1997. The stigma of mental illness in Asian cultures. Australian & New Zealand Journal of Psychiatry, 31(3): 382–390. https://doi.org/10.3109/00048679709073848
Ng, T. P., Lee, J.-J., & Wu, Y. 2022. Unpacking cultural perceptions of future elder care through design fiction. In Bruyns, G., & Wei, H. (Eds.), Proceedings of the 9th Congress of the International Association of Societies of Design Research (IASDR 2021): 1632–1652. Springer Nature. https://doi.org/10.1007/978-981-19-4472-7_107
Nisbett, R. 2004. The geography of thought: How Asians and Westerners think differently…and why. New York: Simon and Schuster.
Nisbett, R. E., Peng, K., Choi, I., & Norenzayan, A. 2001. Culture and systems of thought: Holistic versus analytic cognition. Psychological Review, 108(2): 291–310. https://doi.org/10.1037/0033-295X.108.2.291
Nomura, T. 2017. Robots and gender. Gender and the Genome, 1(1): 18–26. https://doi.org/10.1089/gg.2016.29002.nom
Nomura, T., Suzuki, T., Kanda, T., & Kato, K. 2006. Measurement of negative attitudes toward robots. Interaction Studies: Social Behaviour and Communication in Biological and Artificial Systems, 7(3): 437–454. https://doi.org/10.1075/is.7.3.14nom
Norris, P., & Inglehart, R. 2011. Sacred and secular: Religion and politics worldwide. Cambridge: Cambridge University Press. https://doi.org/10.1017/CBO9780511894862
Oh, S., Kim, J. H., Choi, S.-W., Lee, H. J., Hong, J., & Kwon, S. H. 2019. Physician confidence in artificial intelligence: An online mobile survey. Journal of Medical Internet Research, 21(3): e12422. https://doi.org/10.2196/12422
Peoples, H. C., Duda, P., & Marlowe, F. W. 2016. Hunter-gatherers and the origins of religion. Human Nature, 27(3): 261–282. https://doi.org/10.1007/s12110-016-9260-0
Qin, X., Yam, K. C., Ye, W., Zhang, J., Liang, X., Zhang, X., & Savani, K. 2023. Collectivism impairs team performance when relational goals conflict with group goals. Personality and Social Psychology Bulletin. https://doi.org/10.1177/01461672221123776
Realbotix. 2018. Meet Henry, the male sex robot with artificial intelligence and a British accent [Photograph]. Allure. Available from URL: https://www.allure.com/story/realbotix-henry-male-sex-robot-with-artificial-intelligence
Reich, N., & Eyssel, F. 2013. Attitudes towards service robots in domestic environments: The role of personality characteristics, individual interests, and demographic variables. Paladyn, Journal of Behavioral Robotics, 4(2): 123–130. https://doi.org/10.2478/pjbr-2013-0014
Robot Hall of Fame. 2004. Astro boy. The Robot Hall of Fame Powered by Carnegie Mellon. Available from URL: http://www.robothalloffame.org/inductees/04inductees/astro_boy.html
Roth, G., & Dicke, U. 2005. Evolution of the brain and intelligence. Trends in Cognitive Sciences, 9(5): 250–257. https://doi.org/10.1016/j.tics.2005.03.005
Scopelliti, M., Giuliani, M. V., & Fornara, F. 2005. Robots in a domestic setting: A psychological approach. Universal Access in the Information Society, 4(2): 146–155. https://doi.org/10.1007/s10209-005-0118-1
Song, A. 2018. Chinese factory builds AI sex dolls – in pictures. The Guardian. Available from URL: http://www.theguardian.com/world/gallery/2018/jul/30/chinese-factory-builds-ai-sex-dolls-in-pictures
Sperber, D. 2010. The guru effect. Review of Philosophy and Psychology, 1(4): 583–592. https://doi.org/10.1007/s13164-010-0025-0
Srinivasan, K., & Kasturirangan, R. 2016. Political ecology, development, and human exceptionalism. Geoforum, 75: 125–128. https://doi.org/10.1016/j.geoforum.2016.07.011
Subramaniam, M., Abdin, E., Vaingankar, J. A., Shafie, S., Chua, H. C., Tan, W. M., Tan, K. B., Verma, S., Heng, D., & Chong, S. A. 2020. Minding the treatment gap: Results of the Singapore Mental Health Study. Social Psychiatry and Psychiatric Epidemiology, 55(11): 1415–1424. https://doi.org/10.1007/s00127-019-01748-0
Tajfel, H., Billig, M. G., Bundy, R. P., & Flament, C. 1971. Social categorization and intergroup behaviour. European Journal of Social Psychology, 1(2): 149–178. https://doi.org/10.1002/ejsp.2420010202
Takahashi, U., Takahashi, H., Ban, M., Shimaya, J., Yoshikawa, Y., & Ishiguro, H. 2017. A robot counseling system—What kinds of topics do we prefer to disclose to robots? 2017 26th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN): 207–212. https://doi.org/10.1109/ROMAN.2017.8172303
Takano, Y., & Osaka, E. 2018. Comparing Japan and the United States on individualism/collectivism: A follow-up review. Asian Journal of Social Psychology, 21(4): 301–316. https://doi.org/10.1111/ajsp.12322
Talhelm, T., Zhang, X., Oishi, S., Shimin, C., Duan, D., Lan, X., & Kitayama, S. 2014. Large-scale psychological differences within China explained by rice versus wheat agriculture. Science, 344(6184): 603–608. https://doi.org/10.1126/science.1246850
Technavio. 2022. Social robots market size to grow by USD 1.10 trillion | Dominant players include Diligent Robotics Inc., Furhat Robotics AB, Hitachi Ltd., Knightscope Inc. among others | Technavio. Cision PR Newswire. Available from URL: https://www.prnewswire.com/news-releases/social-robots-market-size-to-grow-by-usd-usd-1-10-trillion-dominant-players-include-diligent-robotics-inc-furhat-robotics-ab-hitachi-ltd-knightscope-inc-among-others-technavio-301571823.html
The Asahi Shimbun. 2019. Kyoto temple enlists Android Buddhist deity to help people [Photograph]. Getty Images. Available from URL: https://www.gettyimages.com/detail/news-photo/android-kannon-named-minder-is-displayed-at-kodaiji-temple-news-photo/1131988643
Thornhill, J. 2018. Asia has learnt to love robots—The west should, too. Financial Times. Available from URL: https://www.ft.com/content/6e408f42-4145-11e8-803a-295c97e6fd0b
Thurman, N., Moeller, J., Helberger, N., & Trilling, D. 2019. My friends, editors, algorithms, and I. Digital Journalism, 7(4): 447–469. https://doi.org/10.1080/21670811.2018.1493936
von Eschenbach, W. J. 2021. Transparency and the black box problem: Why we do not trust AI. Philosophy & Technology, 34(4): 1607–1622. https://doi.org/10.1007/s13347-021-00477-0
Wagner, C. 2009. 'The Japanese way of robotics': Interacting 'naturally' with robots as a national character? RO-MAN 2009 - The 18th IEEE International Symposium on Robot and Human Interactive Communication: 510–515. https://doi.org/10.1109/ROMAN.2009.5326221
Wang, L., Rau, P.-L. P., Evers, V., Robinson, B. K., & Hinds, P. 2010. When in Rome: The role of culture & context in adherence to robot recommendations. 2010 5th ACM/IEEE International Conference on Human-Robot Interaction (HRI): 359–366. https://doi.org/10.1109/HRI.2010.5453165
Williams, M. F. 2021. Sex tech and the rise of AI robotics in the bedroom: 9 investment ideas. Financial News Now. Available from URL: https://financial-news-now.com/sex-tech-and-the-rise-of-ai-robotics-in-the-bedroom-9-investment-ideas/
Xinhua. 2021, August 25. AI chatbot offers mental health support for humans. China Daily. Available from URL: https://www.chinadaily.com.cn/a/202108/25/WS61259a6aa310efa1bd66aedc.html
Yam, K. C., Bigman, Y. E., & Gray, K. 2021. Reducing the uncanny valley by dehumanizing humanoid robots. Computers in Human Behavior, 125: 106945. https://doi.org/10.1016/j.chb.2021.106945
Yam, K. C., Tang, P. M., Jackson, J. C., Su, R., & Gray, K. 2023. The rise of robots increases job insecurity and maladaptive workplace behaviors: Multimethod evidence. Journal of Applied Psychology, 108(5): 850–870. https://doi.org/10.1037/apl0001045
Yam, K. C., Goh, E.-Y., Fehr, R., Lee, R., Soh, H., & Gray, K. 2022. When your boss is a robot: Workers are more spiteful to robot supervisors that seem more human. Journal of Experimental Social Psychology, 102: 104360. https://doi.org/10.1016/j.jesp.2022.104360
Yam, K. C., Bigman, Y. E., Tang, P. M., Ilies, R., De Cremer, D., Soh, H., & Gray, K. 2021. Robots at work: People prefer—and forgive—service robots with perceived feelings. Journal of Applied Psychology, 106(10): 1557–1572. https://doi.org/10.1037/apl0000834
Yeomans, M., Shah, A., Mullainathan, S., & Kleinberg, J. 2019. Making sense of recommendations. Journal of Behavioral Decision Making, 32(4): 403–414. https://doi.org/10.1002/bdm.2118
Table 1. Summary of factors proposed to affect East-West differences in people's attitudes and behaviors toward new technologies.

Figure 1. Benefits and ethical threats of robots in sex, religion, and mental health. Note: The left panel (A) shows Henry, a sexbot sold online. From Meet Henry, the Male Sex Robot With Artificial Intelligence and a British Accent [Photograph], by Realbotix, 2018, Allure (https://www.allure.com/story/realbotix-henry-male-sex-robot-with-artificial-intelligence). The middle panel (B) shows Mindar, a robot priest introduced in a 400-year-old temple in Kyoto, Japan. From Kyoto Temple Enlists Android Buddhist Deity To Help People [Photograph], by The Asahi Shimbun, 2019, Getty Images (https://www.gettyimages.com/detail/news-photo/android-kannon-named-minder-is-displayed-at-kodaiji-temple-news-photo/1131988643). The right panel (C) shows NAO, a 60-cm robot piloted in Singapore to engage children with autism in social interactions. From NAO Robot Aims to Help Kids with Autism Become More Social [Photograph], by Nuria Ling, 2013, The Straits Times (https://www.straitstimes.com/singapore/nao-robot-aims-to-help-kids-with-autism-become-more-social).