
A COMMON FRAMEWORK FOR THEORIES OF NORM COMPLIANCE

Published online by Cambridge University Press:  04 December 2018

Adam Morris (Psychology, Harvard University)
Fiery Cushman (Psychology, Harvard University)

Abstract:

Humans often comply with social norms, but the reasons why are disputed. Here, we unify a variety of influential explanations in a common decision framework, and identify the precise cognitive variables that norms might alter to induce compliance. Specifically, we situate current theories of norm compliance within the reinforcement learning framework, which is widely used to study value-guided learning and decision-making. This framework offers an appealingly precise language to distinguish between theories, highlights the various points of convergence and divergence, and suggests novel ways in which norms might penetrate our psychology.

Type: Research Article
Copyright © Social Philosophy and Policy Foundation 2018

Footnotes

*

We thank Jonathan Phillips and other members of the Moral Psychology Research Lab for their advice and assistance. This research was supported by Grant N00014-14-1-0800 from the Office of Naval Research.

References

1 Bicchieri, Christina, The Grammar of Society: The Nature and Dynamics of Social Norms (New York: Cambridge University Press, 2006); Cooter, R., "Do Good Laws Make Good Citizens? An Economic Analysis of Internalized Norms," Virginia Law Review 86, no. 8 (2000): 1577–1601. https://doi.org/10.2307/1073825; Fehr, E. and Fischbacher, U., "Social Norms and Human Cooperation," Trends in Cognitive Sciences 8, no. 4 (2004): 185–90. https://doi.org/10.1016/j.tics.2004.02.007.

2 Nisbett, R. E. and Cohen, D., Culture of Honor: The Psychology of Violence in the South (Boulder, CO: Westview Press, 1996).

3 Turnbull, C. M., The Mountain People (New York: Touchstone, 1987).

4 Elster, Jon, "Social Norms and Economic Theory," The Journal of Economic Perspectives 3, no. 4 (1989): 99–117.

5 Ellickson, R., Order without Law: How Neighbors Settle Disputes (Cambridge, MA: Harvard University Press, 1991).

6 For much more nuanced definitions of norms and their kinds, see Bicchieri, The Grammar of Society.

7 Cialdini, R. B. and Trost, M. R., "Social Influence: Social Norms, Conformity and Compliance," in Gilbert, D. T., Fiske, S. T., and Lindzey, G., eds., The Handbook of Social Psychology, Vols. 1 and 2, 4th ed. (New York: McGraw-Hill, 1998), 151–92.

8 Rand, D. G. and Epstein, Z. G., "Risking Your Life without a Second Thought: Intuitive Decision-Making and Extreme Altruism," PLOS ONE 9, no. 10 (2014): e109687. https://doi.org/10.1371/journal.pone.0109687.

9 Fehr, E. and Schmidt, K. M., "The Economics of Fairness, Reciprocity and Altruism–Experimental Evidence and New Theories," in Kolm, S.-C. and Ythier, J. M., eds., Handbook of the Economics of Giving, Altruism and Reciprocity, Vol. 1 (Amsterdam: Elsevier, 2006), 615–91. Retrieved from http://www.sciencedirect.com/science/article/pii/S1574071406010086.

10 Thaler, R. H., "Anomalies: The Ultimatum Game," Journal of Economic Perspectives 2, no. 4 (1988): 195–206.

11 We use the terms "obey," "follow," "comply with," and so on, interchangeably to indicate situations where people do what a norm prescribes (or avoid doing what a norm forbids). Unlike some authors, we do not use the terms to differentiate hypotheses about the psychology underlying that behavior (Kelman, H. C., "Compliance, Identification, and Internalization: Three Processes of Attitude Change," The Journal of Conflict Resolution 2, no. 1 [1958]: 51–60; Koh, H. H., "Why Do Nations Obey International Law?" Yale Law Journal 106, no. 8 [1997]: 2599–2659. https://doi.org/10.2307/797228).

12 Dolan, R. J. and Dayan, P., "Goals and Habits in the Brain," Neuron 80, no. 2 (2013): 312–25. https://doi.org/10.1016/j.neuron.2013.09.007.

13 Samuelson, P. A., "Foundations of Economic Analysis," Science and Society 13, no. 1 (1948): 93–95; von Neumann, J. and Morgenstern, O., Theory of Games and Economic Behavior (Princeton, NJ: Princeton University Press, 1944).

14 Glimcher, P. W. and Fehr, E., Neuroeconomics: Decision Making and the Brain (London: Academic Press, 2013).

15 Ibid.

16 Becker, G. S., Accounting for Tastes (Cambridge, MA: Harvard University Press, 1996); Sen, Amartya, Choice, Welfare and Measurement (Cambridge, MA: Harvard University Press, 1997); Sunstein, Cass R., "Social Norms and Social Roles," Columbia Law Review 96, no. 4 (1996): 903–968. https://doi.org/10.2307/1123430.

17 Pavlov, I. P. and Anrep, G. V., Conditioned Reflexes (North Chelmsford, MA: Courier Corporation, 1927).

18 Lichtenstein, S. and Slovic, P., The Construction of Preference (New York: Cambridge University Press, 2006); Vlaev, I., Chater, N., Stewart, N., and Brown, G. D. A., "Does the Brain Calculate Value?" Trends in Cognitive Sciences 15, no. 11 (2011): 546–54. https://doi.org/10.1016/j.tics.2011.09.008.

19 Andreoni, J., Castillo, M., and Petrie, R., "What Do Bargainers' Preferences Look Like? Experiments with a Convex Ultimatum Game," The American Economic Review 93, no. 3 (2003): 672–85; Andreoni, J. and Miller, J., "Giving According to GARP: An Experimental Test of the Consistency of Preferences for Altruism," Econometrica 70, no. 2 (2002): 737–53. https://doi.org/10.1111/1468-0262.00302.

20 C. M. Anderson and L. Putterman, "Do Non-Strategic Sanctions Obey the Law of Demand? The Demand for Punishment in the Voluntary Contribution Mechanism," Games and Economic Behavior 54, no. 1 (2006): 1–24. https://doi.org/10.1016/j.geb.2004.08.007; J. Andreoni and L. Vesterlund, "Which Is the Fair Sex? Gender Differences in Altruism," The Quarterly Journal of Economics 116, no. 1 (2001): 293–312; V. Capraro, J. J. Jordan, and D. G. Rand, Heuristics Guide the Implementation of Social Preferences in One-Shot Prisoner's Dilemma Experiments (SSRN Scholarly Paper No. ID 2429862) (Rochester, NY: Social Science Research Network, 2014). Retrieved from http://papers.ssrn.com/abstract=2429862; J. P. Carpenter, "The Demand for Punishment," Journal of Economic Behavior and Organization 62, no. 4 (2007): 522–42. https://doi.org/10.1016/j.jebo.2005.05.004.

21 Ruff, C. C. and Fehr, E., "The Neurobiology of Rewards and Values in Social Decision Making," Nature Reviews Neuroscience 15, no. 8 (2014): 549–62. https://doi.org/10.1038/nrn3776.

22 de Quervain, D. J. F., Fischbacher, U., Treyer, V., Schellhammer, M., Schnyder, U., Buck, A., and Fehr, E., "The Neural Basis of Altruistic Punishment," Science 305, no. 5688 (2004): 1254–58. https://doi.org/10.1126/science.1100735; Rilling, J. K., Gutman, D. A., Zeh, T. R., Pagnoni, G., Berns, G. S., and Kilts, C. D., "A Neural Basis for Social Cooperation," Neuron 35, no. 2 (2002): 395–405. https://doi.org/10.1016/S0896-6273(02)00755-9; Tabibnia, G. and Lieberman, M. D., "Fairness and Cooperation Are Rewarding," Annals of the New York Academy of Sciences 1118, no. 1 (2007): 90–101. https://doi.org/10.1196/annals.1412.001; Zaki, J. and Mitchell, J. P., "Equitable Decision Making Is Associated with Neural Markers of Intrinsic Value," Proceedings of the National Academy of Sciences 108, no. 49 (2011): 19761–66. https://doi.org/10.1073/pnas.1112324108.

23 Glimcher and Fehr, Neuroeconomics: Decision Making and the Brain.

24 Cushman, Fiery, "From Moral Concern to Moral Constraint," Current Opinion in Behavioral Sciences 3 (2015): 58–62. https://doi.org/10.1016/j.cobeha.2015.01.006; Gershman, S. J. and Niv, Y., "Learning Latent Structure: Carving Nature at Its Joints," Current Opinion in Neurobiology 20, no. 2 (2010): 251–56. https://doi.org/10.1016/j.conb.2010.02.008.

25 There are two key details about MDPs that we've omitted for simplicity. First, the transition and reward functions, when conditioned on the current decision point, are assumed to be independent of past experience. This restriction is known as the Markov property, and is often attained by simply enhancing the representation of the current state to include all relevant prior factors (Sutton, R. S. and Barto, A. G., Introduction to Reinforcement Learning [Cambridge, MA: MIT Press, 1998]). Second, there is also a discount parameter, which controls the rate at which future rewards are discounted relative to current rewards (Sutton and Barto, ibid.).
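In symbols, these two details read as follows; this is a minimal sketch in the standard notation of Sutton and Barto, and the particular symbols are ours, chosen for illustration only:

```latex
% Markov property: conditioned on the current state and action, the next
% state and reward are independent of the earlier history of the episode.
\[
\Pr\bigl(s_{t+1}=s',\, r_{t+1}=r \mid s_t, a_t, s_{t-1}, a_{t-1}, \ldots, s_0, a_0\bigr)
  = \Pr\bigl(s_{t+1}=s',\, r_{t+1}=r \mid s_t, a_t\bigr)
\]
% Discount parameter \gamma: future rewards are weighted geometrically
% less than current rewards in the return the agent seeks to maximize.
\[
G_t = r_{t+1} + \gamma\, r_{t+2} + \gamma^2 r_{t+3} + \cdots
    = \sum_{k=0}^{\infty} \gamma^k\, r_{t+k+1}, \qquad \gamma \in [0,1].
\]
```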

26 von Neumann, J. and Morgenstern, O., Theory of Games and Economic Behavior (Princeton, NJ: Princeton University Press, 1944).

27 Elster, Jon, "Social Norms and Economic Theory," Journal of Economic Perspectives 3, no. 4 (1989): 99–117; Harsanyi, J. C., "Morality and the Theory of Rational Behavior," Social Research 44, no. 4 (1977): 623–56; Kahneman, D. and Thaler, R. H., "Anomalies: Utility Maximization and Experienced Utility," The Journal of Economic Perspectives 20, no. 1 (2006): 221–34. https://doi.org/10.1257/089533006776526076.

28 Elster, "Social Norms and Economic Theory," 99–117.

29 Tversky, A. and Kahneman, D., "Judgment under Uncertainty: Heuristics and Biases," Science 185, no. 4157 (1974): 1124–31. https://doi.org/10.1126/science.185.4157.1124.

30 Dolan, R. J. and Dayan, P., "Goals and Habits in the Brain," Neuron 80, no. 2 (2013): 312–25. https://doi.org/10.1016/j.neuron.2013.09.007.

31 This method of habit learning, reinforcing actions historically associated with reward, emerged in the reinforcement learning literature in the 1980s and revolutionized the field, enabling human-level proficiency in games like backgammon. Computational signatures of these model-free RL algorithms were also discovered in dopaminergic neural circuits that implement value-guided learning and decision-making, catalyzing two decades of rapid theoretical and empirical advances (W. Schultz, P. Dayan, and P. R. Montague, "A Neural Substrate of Prediction and Reward," Science 275, no. 5306 [1997]: 1593–99. https://doi.org/10.1126/science.275.5306.1593).
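To make the footnote's description concrete, here is a minimal tabular sketch of such a model-free update; the state names, parameter values, and helper function are illustrative assumptions of ours, not anything specified in the paper:

```python
# Minimal sketch of a model-free ("habit-like") temporal-difference update.
# The agent caches a value Q[s][a] for each state-action pair and nudges it
# toward observed outcomes, with no internal model of transitions or rewards.
from collections import defaultdict

ALPHA = 0.1   # learning rate (illustrative value)
GAMMA = 0.9   # discount factor for future reward (illustrative value)

Q = defaultdict(lambda: defaultdict(float))  # cached action values

def td_update(state, action, reward, next_state, next_actions):
    """One Q-learning step: move Q(s, a) toward r + GAMMA * max_a' Q(s', a').

    The difference computed below is the reward prediction error, the
    quantity whose neural signature Schultz, Dayan, and Montague reported
    in dopaminergic circuits.
    """
    best_next = max((Q[next_state][a] for a in next_actions), default=0.0)
    prediction_error = reward + GAMMA * best_next - Q[state][action]
    Q[state][action] += ALPHA * prediction_error
    return prediction_error

# Example: an action that yielded reward becomes more valuable and hence more
# likely to be repeated, which is the reinforcement of historically rewarded
# actions described in the footnote (state/action names are hypothetical).
td_update(state="offer_received", action="accept", reward=1.0,
          next_state="end", next_actions=[])
```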

32 Thorndike, E. L., "The Law of Effect," The American Journal of Psychology 39 (1927): 212–22. https://doi.org/10.2307/1415413.

33 Kahneman, D., Thinking, Fast and Slow (New York: Farrar, Straus, and Giroux, 2011).

34 Dickinson, A., Balleine, B., Watt, A., Gonzalez, F., and Boakes, R. A., "Motivational Control after Extended Instrumental Training," Animal Learning and Behavior 23, no. 2 (1995): 197–206. https://doi.org/10.3758/BF03199935.

35 Dolan and Dayan, "Goals and Habits in the Brain," 312–25.

36 Lichtenstein, S. and Slovic, P., The Construction of Preference (New York: Cambridge University Press, 2006); Vlaev, I., Chater, N., Stewart, N., and Brown, G. D. A., "Does the Brain Calculate Value?" Trends in Cognitive Sciences 15, no. 11 (2011): 546–54. https://doi.org/10.1016/j.tics.2011.09.008.

37 McClelland, J. L., Rumelhart, D. E., and Hinton, G. E., "The Appeal of Parallel Distributed Processing," in Rumelhart, D. E., McClelland, J. L., and the PDP Research Group, eds., Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Vol. 1 (Cambridge, MA: MIT Press, 1986), 3–44. Retrieved from http://dl.acm.org/citation.cfm?id=104279.104284.

38 Jowett, B., "The Dialogues of Plato," Journal of Hellenic Studies 45, no. 4 (1925): 274.

39 Sunstein, “Social Norms and Social Roles,” 903–968.

40 Camerer, C., Behavioral Game Theory: Experiments in Strategic Interaction (Princeton, NJ: Princeton University Press, 2003).

41 Fehr and Schmidt, “The Economics of Fairness, Reciprocity and Altruism,” 615–91.

42 Delton, A. W., Krasnow, M. M., Cosmides, L., and Tooby, J., "Evolution of Direct Reciprocity under Uncertainty Can Explain Human Generosity in One-Shot Encounters," Proceedings of the National Academy of Sciences 108, no. 32 (2011): 13335–40. https://doi.org/10.1073/pnas.1102131108.

43 Andreoni, J., "Cooperation in Public-Goods Experiments: Kindness or Confusion?" American Economic Review 85, no. 4 (1995): 891–904; Fehr and Schmidt, "The Economics of Fairness, Reciprocity and Altruism"; Thaler, "Anomalies: The Ultimatum Game," 195–206.

44 Fehr and Schmidt, "The Economics of Fairness, Reciprocity and Altruism."

45 Henrich, J., "Does Culture Matter in Economic Behavior? Ultimatum Game Bargaining among the Machiguenga of the Peruvian Amazon," American Economic Review 90, no. 4 (2000): 973–79; Henrich, J., McElreath, R., Barr, A., Ensminger, J., Barrett, C., Bolyanatz, A., Ziker, J., "Costly Punishment Across Human Societies," Science 312, no. 5781 (2006): 1767–70. https://doi.org/10.1126/science.1127333; Henrich, J., Ensminger, J., McElreath, R., Barr, A., Barrett, C., Bolyanatz, A., Ziker, J., "Markets, Religion, Community Size, and the Evolution of Fairness and Punishment," Science 327, no. 5972 (2010): 1480–84. https://doi.org/10.1126/science.1182238.

46 Cialdini, R. B., Kallgren, C. A., and Reno, R. R., "A Focus Theory of Normative Conduct: A Theoretical Refinement and Reevaluation of the Role of Norms in Human Behavior," in Zanna, M. P., ed., Advances in Experimental Social Psychology, vol. 24 (Waltham, MA: Academic Press, 1991), 201–234. Retrieved from http://www.sciencedirect.com/science/article/pii/S0065260108603305; Cialdini, R. B., Reno, R., and Kallgren, C. A., "A Focus Theory of Normative Conduct: Recycling the Concept of Norms to Reduce Littering in Public Places," Journal of Personality and Social Psychology 58, no. 6 (1990): 1015–26. https://doi.org/10.1037/0022-3514.58.6.1015.

47 Goldstein, N. J., Cialdini, R. B., and Griskevicius, V., "A Room with a Viewpoint: Using Social Norms to Motivate Environmental Conservation in Hotels," Journal of Consumer Research 35, no. 3 (2008): 472–82. https://doi.org/10.1086/586910; Raihani, N. J. and McAuliffe, K., "Dictator Game Giving: The Importance of Descriptive versus Injunctive Norms," PLOS ONE 9, no. 12 (2014): e113826. https://doi.org/10.1371/journal.pone.0113826.

48 Many studies show that manipulating a variable (e.g., time available to make a decision) makes people more or less cooperative, fair, generous, and so on, in economic games. These choices are likely influenced by the norms to which people have been exposed (Fehr and Fischbacher, "Social Norms and Human Cooperation"; Rand, D. G., Peysakhovich, A., Kraft-Todd, G. T., Newman, G. E., Wurzbacher, O., Nowak, M. A., and Greene, J. D., "Social Heuristics Shape Intuitive Cooperation," Nature Communications 5, no. 3677 [2014]. https://doi.org/10.1038/ncomms4677). But ideally, to show that the specific variable at hand affects norm compliance, the study would simultaneously manipulate whether the relevant norm is present, and show that the variable only has an effect when the norm is present. Unfortunately, the ideal study has often not yet been run. We cite the imperfect studies with the hope that future work will fill in the gaps.

49 Rand, D. G. and Epstein, Z. G., "Risking Your Life without a Second Thought: Intuitive Decision-Making and Extreme Altruism," PLOS ONE 9, no. 10 (2014): e109687. https://doi.org/10.1371/journal.pone.0109687; Rand, D. G., Greene, J. D., and Nowak, M. A., "Spontaneous Giving and Calculated Greed," Nature 489, no. 7416 (2012): 427–30. https://doi.org/10.1038/nature11467; Rand, Peysakhovich, Kraft-Todd, Newman, Wurzbacher, Nowak, and Greene, "Social Heuristics Shape Intuitive Cooperation."

50 Crockett, M. J., "Models of Morality," Trends in Cognitive Sciences 17, no. 8 (2013): 363–66. https://doi.org/10.1016/j.tics.2013.06.005; Cushman, Fiery, "Action, Outcome, and Value: A Dual-System Framework for Morality," Personality and Social Psychology Review 17, no. 3 (2013): 273–92. https://doi.org/10.1177/1088868313495594.

51 Greene, J. D., Sommerville, R. B., Nystrom, L. E., Darley, J. M., and Cohen, J. D., "An fMRI Investigation of Emotional Engagement in Moral Judgment," Science 293, no. 5537 (2001): 2105–2108. https://doi.org/10.1126/science.1062872.

52 Cushman, F., Gray, K., Gaffey, A., and Mendes, W. B., "Simulating Murder: The Aversion to Harmful Action," Emotion 12, no. 1 (2012): 2–7. https://doi.org/10.1037/a0025071.

53 Cushman, Fiery, "From Moral Concern to Moral Constraint," Current Opinion in Behavioral Sciences 3 (2015): 58–62. https://doi.org/10.1016/j.cobeha.2015.01.006.

54 Rand, D. G., "Cooperation, Fast and Slow: Meta-Analytic Evidence for a Theory of Social Heuristics and Self-Interested Deliberation," Psychological Science (2016): 0956797616654455. https://doi.org/10.1177/0956797616654455.

55 Cappelletti, D., Güth, W., and Ploner, M., "Being of Two Minds: Ultimatum Offers under Cognitive Constraints," Journal of Economic Psychology 32, no. 6 (2011): 940–50. https://doi.org/10.1016/j.joep.2011.08.001; Halali, E., Bereby-Meyer, Y., and Ockenfels, A., "Is It All about the Self? The Effect of Self-Control Depletion on Ultimatum Game Proposers," Frontiers in Human Neuroscience 7 (2013). https://doi.org/10.3389/fnhum.2013.00240.

56 Anderson, C. and Dickinson, D. L., "Bargaining and Trust: The Effects of 36-H Total Sleep Deprivation on Socially Interactive Decisions," Journal of Sleep Research 19 (2010): 54–63. https://doi.org/10.1111/j.1365-2869.2009.00767.x; Grimm, V. and Mengel, F., "Let Me Sleep on It: Delay Reduces Rejection Rates in Ultimatum Games," Economics Letters 111, no. 2 (2011): 113–15. https://doi.org/10.1016/j.econlet.2011.01.025; Halali, E., Bereby-Meyer, Y., and Meiran, N., When Rationality and Fairness Conflict: The Role of Cognitive-Control in the Ultimatum Game (SSRN Scholarly Paper No. ID 1868852) (Rochester, NY: Social Science Research Network, 2011). Retrieved from http://papers.ssrn.com.ezp-prod1.hul.harvard.edu/abstract=1868852; Halali, E., Bereby-Meyer, Y., and Meiran, N., "Between Self-Interest and Reciprocity: The Social Bright Side of Self-Control Failure," Journal of Experimental Psychology: General 143, no. 2 (2014): 745–54. https://doi.org/10.1037/a0033824; Neo, W. S., Yu, M., Weber, R. A., and Gonzalez, C., "The Effects of Time Delay in Reciprocity Games," Journal of Economic Psychology 34 (2013): 20–35. https://doi.org/10.1016/j.joep.2012.11.001; Sutter, M., Kocher, M., and Straub, S., "Bargaining under Time Pressure in an Experimental Ultimatum Game," Economics Letters 81, no. 3 (2003): 341–47. https://doi.org/10.1016/S0165-1765(03)00215-5.

57 Schulz, J. F., Fischbacher, U., Thöni, C., and Utikal, V., "Affect and Fairness: Dictator Games under Cognitive Load," Journal of Economic Psychology 41 (2014): 77–87. https://doi.org/10.1016/j.joep.2012.08.007. The effect of time pressure and cognitive depletion on generosity is mixed. Some studies report that they induce more giving (Cornelissen, G., Dewitte, S., and Warlop, L., "Are Social Value Orientations Expressed Automatically? Decision Making in the Dictator Game," Personality and Social Psychology Bulletin, 0146167211405996 [2011]. https://doi.org/10.1177/0146167211405996; Schulz, Fischbacher, Thöni, and Utikal, "Affect and Fairness," 77–87), while others report a null or (rarely) reversed effect (Hauge, K. E., Brekke, K. A., Johansson, L. O., Johansson-Stenman, O., and Svedsäter, H., "Keeping Others in Our Mind or in Our Heart? Distribution Games under Cognitive Load," Experimental Economics 19, no. 3 [2015]: 562–76. https://doi.org/10.1007/s10683-015-9454-z; Halali, Bereby-Meyer, and Ockenfels, "Is It All about the Self?"). Interestingly, the game used in these studies to measure generosity has notoriously fickle norms (Bicchieri, The Grammar of Society; Fehr and Schmidt, "The Economics of Fairness, Reciprocity and Altruism"; Krupka, E. L. and Weber, R. A., "Identifying Social Norms Using Coordination Games: Why Does Dictator Game Sharing Vary?" Journal of the European Economic Association 11, no. 3 (2013): 495–524. https://doi.org/10.1111/jeea.12006). Perhaps the ambiguous effects of time pressure on generosity can be explained by differences in norm perception across studies.

58 Rand and Epstein, "Risking Your Life without a Second Thought," e109687.

59 Ibid.; Rand, Greene, and Nowak, "Spontaneous Giving and Calculated Greed," 427–30; Rand, Peysakhovich, Kraft-Todd, Newman, Wurzbacher, Nowak, and Greene, "Social Heuristics Shape Intuitive Cooperation."

60 Rand and Epstein, "Risking Your Life without a Second Thought: Intuitive Decision-Making and Extreme Altruism," e109687; Rand, Peysakhovich, Kraft-Todd, Newman, Wurzbacher, Nowak, and Greene, "Social Heuristics Shape Intuitive Cooperation."

61 It is possible that, under time pressure, people do not become more compliant; they simply become more prosocial (Rand, Greene, and Nowak, "Spontaneous Giving and Calculated Greed"). The fact that people are also more negatively reciprocal under time pressure suggests the former interpretation. But it is an open question. One study appeared to show that, even after being instilled with a norm for competition instead of cooperation, people were still more cooperative under time pressure (J. Cone and D. G. Rand, "Time Pressure Increases Cooperation in Competitively Framed Social Dilemmas," PLOS ONE 9, no. 12 [2014]: e115756. https://doi.org/10.1371/journal.pone.0115756). But people in the competitive condition did not, on average, contribute less than people in the cooperative condition (the difference was around one cent out of an endowment of forty), suggesting that the norm manipulation was insufficient.

62 Rand, "Cooperation, Fast and Slow: Meta-Analytic Evidence for a Theory of Social Heuristics and Self-Interested Deliberation"; Rand, Greene, and Nowak, "Spontaneous Giving and Calculated Greed," 427–30.

63 Fehr, E. and Schmidt, K. M., "A Theory of Fairness, Competition, and Cooperation," Quarterly Journal of Economics 114, no. 3 (1999): 817–68.

64 Cialdini, R. B. and Trost, M. R., "Social Influence: Social Norms, Conformity and Compliance," in Gilbert, D. T., Fiske, S. T., and Lindzey, G., eds., The Handbook of Social Psychology, Vols. 1 and 2, 4th ed. (New York: McGraw-Hill, 1998), 151–92; Parsons, Talcott and Shils, Edward, Toward a General Theory of Action (Charleston, SC: Nabu Press, 2011), sec. 1.

65 Ho, M. K., MacGlashan, J., Littman, M. L., and Cushman, F., "Social Is Special: A Normative Framework for Teaching with and Learning from Evaluative Feedback," Cognition (2017). https://doi.org/10.1016/j.cognition.2017.03.006.

66 Fehr, E. and Fischbacher, U., "The Nature of Human Altruism," Nature 425, no. 6960 (2003): 785–91. https://doi.org/10.1038/nature02043.

67 Fehr and Schmidt, "The Economics of Fairness, Reciprocity and Altruism."

68 David Ackley and Michael Littman, "Interactions Between Learning and Evolution," in Artificial Life II, SFI Studies in the Sciences of Complexity, vol. X, ed. C. G. Langton, C. Taylor, J. D. Farmer, and S. Rasmussen (London: Addison-Wesley, 1991); Ho, MacGlashan, Littman, and Cushman, "Social Is Special"; Satinder Singh, Richard Lewis, and Andrew Barto, "Where Do Rewards Come From?" Proceedings of the Annual Conference of the Cognitive Science Society (2009): 2601–2606.

69 Fehr and Schmidt, "A Theory of Fairness, Competition, and Cooperation," 817–68.

70 Andreoni, J., "Impure Altruism and Donations to Public Goods: A Theory of Warm-Glow Giving," The Economic Journal 100, no. 401 (1990): 464–77. https://doi.org/10.2307/2234133.

71 Fehr and Schmidt, “A Theory of Fairness, Competition, and Cooperation.”

72 Andreoni, “Impure Altruism and Donations to Public Goods.”

73 Dana, J., Cain, D. M., and Dawes, R. M., "What You Don't Know Won't Hurt Me: Costly (But Quiet) Exit in Dictator Games," Organizational Behavior and Human Decision Processes 100, no. 2 (2006): 193–201. https://doi.org/10.1016/j.obhdp.2005.10.001; Krupka and Weber, "Identifying Social Norms Using Coordination Games"; López-Pérez, R., "Aversion to Norm-Breaking: A Model," Games and Economic Behavior 64, no. 1 (2008): 237–67. https://doi.org/10.1016/j.geb.2007.10.009.

74 Baron, J. and Spranca, M., "Protected Values," Organizational Behavior and Human Decision Processes 70, no. 1 (1997): 1–16. https://doi.org/10.1006/obhd.1997.2690.

75 Tetlock, P. E., Kristel, O. V., Elson, S. B., Green, M. C., and Lerner, J. S., "The Psychology of the Unthinkable: Taboo Trade-Offs, Forbidden Base Rates, and Heretical Counterfactuals," Journal of Personality and Social Psychology 78, no. 5 (2000): 853–70. https://doi.org/10.1037/0022-3514.78.5.853.

76 Raz, Joseph, Practical Reason and Norms (Oxford: Oxford University Press, 1999).

77 Nozick, Robert, Anarchy, State, and Utopia (New York: Basic Books, 1974); Raz, Joseph, Practical Reason and Norms (Oxford: Oxford University Press, 1999); Schauer, Frederick, Playing by the Rules: A Philosophical Examination of Rule-Based Decision-Making in Law and in Life (Oxford: Clarendon Press, 1991).

78 Becker, G. S., Accounting for Tastes (Cambridge, MA: Harvard University Press, 1996).

79 Tetlock, Kristel, Elson, Green, and Lerner, "The Psychology of the Unthinkable."

80 Ubel, P. A., Pricing Life: Why It's Time for Health Care Rationing (Cambridge, MA: MIT Press, 2001).

81 Milgram, S., "Behavioral Study of Obedience," Journal of Abnormal and Social Psychology 67, no. 4 (1963): 371–78. https://doi.org/10.1037/h0040525.

82 Phillips, J. and Knobe, J., "Moral Judgments and Intuitions About Freedom," Psychological Inquiry 20, no. 1 (2009): 30–36. https://doi.org/10.1080/10478400902744279.

83 Phillips, J., Luguri, J. B., and Knobe, J., "Unifying Morality's Influence on Non-Moral Judgments: The Relevance of Alternative Possibilities," Cognition 145 (2015): 30–42. https://doi.org/10.1016/j.cognition.2015.08.001.

84 Cushman, Fiery and Morris, Adam, "Habitual Control of Goal Selection in Humans," Proceedings of the National Academy of Sciences 112, no. 45 (2015): 13817–22. https://doi.org/10.1073/pnas.1506367112; Huys, Q. J. M., Eshel, N., O'Nions, E., Sheridan, L., Dayan, P., and Roiser, J. P., "Bonsai Trees in Your Head: How the Pavlovian System Sculpts Goal-Directed Choices by Pruning Decision Trees," PLOS Comput Biol 8, no. 3 (2012): e1002410. https://doi.org/10.1371/journal.pcbi.1002410.

85 Hoffman, M., Yoeli, E., and Nowak, M. A., "Cooperate without Looking: Why We Care What People Think and Not Just What They Do," Proceedings of the National Academy of Sciences 112, no. 6 (2015): 1727–32. https://doi.org/10.1073/pnas.1417904112; Jordan, J. J., Hoffman, M., Nowak, M. A., and Rand, D. G., "Uncalculating Cooperation Is Used to Signal Trustworthiness," Proceedings of the National Academy of Sciences 113, no. 31 (2016): 8658–63. https://doi.org/10.1073/pnas.1601280113.

86 Phillips, Jonathan and Cushman, Fiery, "Morality Constrains the Default Representation of What Is Possible," Proceedings of the National Academy of Sciences 114, no. 18 (2017): 4649–54.

87 Jonathan Phillips and P. Bloom, “Do Children Believe Immoral Events Are Magical?” (unpublished manuscript, available at https://osf.io/en7ut/).

88 Cushman, Fiery, "Action, Outcome, and Value: A Dual-System Framework for Morality," Personality and Social Psychology Review 17, no. 3 (2013): 273–92. https://doi.org/10.1177/1088868313495594; Ruff, C. C. and Fehr, E., "The Neurobiology of Rewards and Values in Social Decision Making," Nature Reviews Neuroscience 15, no. 8 (2014): 549–62. https://doi.org/10.1038/nrn3776.

89 Daw, N. D., Gershman, S. J., Seymour, B., Dayan, P., and Dolan, R. J., "Model-Based Influences on Humans' Choices and Striatal Prediction Errors," Neuron 69, no. 6 (2011): 1204–1215. https://doi.org/10.1016/j.neuron.2011.02.027.

90 An alternative formulation of this idea is that the decision variables have different levels of informational encapsulation (Fodor, J. A., The Modularity of Mind: An Essay on Faculty Psychology [Cambridge, MA: MIT Press, 1983]): Q and R are more encapsulated than T, and there are therefore fewer types of experience that can change them. At present, it is unknown how encapsulated the action set A_s or the state space S are.
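To fix ideas, here is a minimal sketch (our own illustrative grouping, not a structure given by the authors) of the decision variables this footnote refers to, collected as a data structure with the footnote's encapsulation claims recorded as comments:

```python
# Sketch of the decision variables named in footnote 90, grouped as a container.
# The comments record the footnote's claims; the code itself is only illustrative.
from dataclasses import dataclass, field
from typing import Dict, FrozenSet, Tuple

State = str
Action = str

@dataclass
class DecisionVariables:
    # T: transition expectations. Less encapsulated: many kinds of experience
    # (instruction, observation, inference) can revise them.
    T: Dict[Tuple[State, Action], Dict[State, float]] = field(default_factory=dict)
    # R: reward function, and Q: cached action values. More encapsulated:
    # fewer types of experience can change them.
    R: Dict[Tuple[State, Action], float] = field(default_factory=dict)
    Q: Dict[Tuple[State, Action], float] = field(default_factory=dict)
    # A_s: the action set considered in each state, and S: the state space.
    # The footnote leaves open how encapsulated these are.
    A: Dict[State, FrozenSet[Action]] = field(default_factory=dict)
    S: FrozenSet[State] = frozenset()
```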