The social sciences are increasingly ill-equipped to design system-level reforms

Published online by Cambridge University Press:  30 August 2023

Michelle Jackson*
Affiliation:
Department of Sociology, Stanford University, Stanford, CA, USA [email protected]; www.mivich.com

Abstract

Our social policy landscape is characterized by incrementalism, while public calls for more radical reform get louder. But the social sciences cannot be relied upon to generate a steady stream of radical, system-level policies. Long-standing trends in social science – in particular, increasing specialization, an increasing emphasis on causal inference, and the growing replication crisis – are barriers to system-level policy development.

Type
Open Peer Commentary
Copyright
Copyright © The Author(s), 2023. Published by Cambridge University Press

The “behavioral turn” in public policy can be identified across a wide range of policy areas in recent decades. Whether the problem to be addressed relates to public health, the environment, inequality, or human capital development, policies rooted in behavioral science have become commonplace. Chater & Loewenstein (C&L) well describe the consequences of this behavioral turn in policy-making, and the dominance of “i-frame” (individual-level) proposals for reform. But they underestimate the extent to which the more macro-oriented social science disciplines that they rely upon to propose sweeping “s-frame” (system-level) reforms – sociology, political science, and economics – are themselves compromised. Over the past half-century, social science has developed in ways that strongly constrain the types of policies on offer. Incrementalism characterizes our policy landscape, and even where s-frame reforms are proposed by social scientists, they are relatively modest as compared with the ambitious policies of the past. Here, I propose that several changes in scientific practice have led to i-frame reforms being prioritized over their more radical, s-frame counterparts.

First, true s-frame reform requires social scientists to propose policies that affect multiple social institutions (e.g., Jackson, Reference Jackson2020). The European welfare state reforms of the early–mid-twentieth century, for example, introduced policies to simultaneously improve health systems, education systems, pension and unemployment rights, and housing. But as the body of social science research grows larger, individual social scientists have become ever more specialized. Increasing specialization of research output is likely to produce increases in the productivity of academic researchers, but research shows that negative outcomes are also likely: Disciplinary silos temper innovation and inhibit communication across specialties (e.g., Jacobs, Reference Jacobs2014; Sherif & Sherif, Reference Sherif and Sherif1969), and those researchers who do engage in interdisciplinary research are less productive (Leahey, Beckman, & Stanko, Reference Leahey, Beckman and Stanko2017). A less appreciated side effect of increasing specialization is that social scientists are left with diminishing capacity to design more radical policies (Jackson, Reference Jackson2020; Jacobs, Reference Jacobs2014). Specialized social scientists focus on individual social institutions (e.g., the education system), or even parts of individual social institutions (e.g., early childhood education), but system-level policies require breadth of knowledge and an understanding of multiple social institutions. It would be quite ineffective, for example, to attempt to eliminate racial inequality in US society by addressing a single social institution such as education while leaving policing, employment discrimination, and other areas untouched. But in a highly specialized research environment, it is more likely that incremental policies focusing on single institutions will be proposed, crowding out the development of s-frame policies.

Second, the causal revolution in social science has transformed both the practice of research and the evidentiary standards with respect to policy development and evaluation. Social scientists are increasingly expected to demonstrate causal effects through application of recognized methods of causal inference (Angrist & Pischke, Reference Angrist and Pischke2010), and policy proposals that cannot call upon evidence gathered via experimental methods or alternative techniques of causal inference have little chance of gaining support. The nailing down of precise causal effects, whether in pure or applied research, necessarily entails focusing in on narrow questions and well-defined mechanisms.

There are, of course, good reasons to insist on the scientific soundness of policy proposals: Policy-makers have limited resources and limited political capital, and policies must therefore have a high likelihood of producing the outcomes that are promised. But we must also acknowledge that the popularity of i-frame proposals in part arises from stronger claims to empirical support. It is hard to imagine how the welfare state reforms of the last century could have been introduced under current evidentiary standards for policy implementation: These reforms were simply too expansive in their scope, and there was certainly no body of experimental or quasi-experimental evidence to support an overhaul of multiple institutions. If s-frame policies cannot gain the imprimatur of “scientifically sound,” it will be difficult for such policies to challenge the dominance of i-frame policies. Advocates of s-frame policies, and particularly the more radical of these policies, must therefore consider how best to build a convincing evidence base in support of sweeping reform. Put simply, we need to build a science of radical reform.

Finally, it is important to consider the extent to which the replication crisis has led to poor-quality policy-making. One reason why i-frame policies became dominant in recent decades is that the research promised large effects for small investment. Even the most radical of s-frame advocates would be hard-pressed to object to i-frame policies that delivered effects of the size promised. Unfortunately, it has become increasingly clear that i-frame policies have failed to deliver on their promise, in part because this promise was built on weak scientific foundations.

Publication bias, p-hacking, insufficient replication, and other perversities of scientific practice are not just consequential for scientific work: Our policy suffers too. Take, for example, the nudge policies that were embraced by the Obama administration a decade ago. A recent comprehensive meta-analysis suggested that nudges “promote behavior change with a small to medium effect size,” although there is also evidence of a “moderate publication bias” in the nudge literature (Mertens, Herberz, Hahnel, & Brosch, Reference Mertens, Herberz, Hahnel and Brosch2022, p. 1). In a reply to the paper, Maier et al. show that there is evidence for “severe publication bias” in the nudging literature, and that once this bias is accounted for, “no evidence remains that nudges are effective as tools for behaviour change” (2022, p. 2; see also Bakdash & Marusich, Reference Bakdash and Marusich2022). Although we might expect the replication crisis to have had consequences for the evidentiary foundations of s-frame policies too, the likely overestimation of the strength of evidence underlying i-frame policies is particularly damaging, given the understandable preference among policy-makers for policies that are both effective and cheap.

Our policy did not just get small: Social science helped to push it in that direction. The times demand a social science that allows us to take risks. Without changes to our science, the s-frame policy proposals that C&L praise will be all too scarce, and i-frame reforms will continue to dominate.

Acknowledgments

I thank David Grusky and Robb Willer for their comments on an earlier version of this commentary.

Financial support

This research received no specific grant from any funding agency, commercial, or not-for-profit sectors.

Competing interest

None.

References

Angrist, J. D., & Pischke, J.-S. (2010). The credibility revolution in empirical economics: How better research design is taking the con out of econometrics. Journal of Economic Perspectives, 24(2), 3–30. doi:10.1257/jep.24.2.3
Bakdash, J. Z., & Marusich, L. R. (2022). Left-truncated effects and overestimated meta-analytic means. Proceedings of the National Academy of Sciences, 119(31), e2203616119. doi:10.1073/pnas.2203616119
Jackson, M. (2020). Manifesto for a dream: Inequality, constraint, and radical reform. Stanford University Press.
Jacobs, J. A. (2014). In defense of disciplines: Interdisciplinarity and specialization in the research university. University of Chicago Press.
Leahey, E., Beckman, C. M., & Stanko, T. L. (2017). Prominent but less productive: The impact of interdisciplinarity on scientists’ research. Administrative Science Quarterly, 62(1), 105–139. doi:10.1177/0001839216665364
Maier, M., Bartoš, F., Stanley, T. D., Shanks, D. R., Harris, A. J. L., & Wagenmakers, E.-J. (2022). No evidence for nudging after adjusting for publication bias. Proceedings of the National Academy of Sciences, 119(31), e2200300119. doi:10.1073/pnas.2200300119
Mertens, S., Herberz, M., Hahnel, U. J. J., & Brosch, T. (2022). The effectiveness of nudging: A meta-analysis of choice architecture interventions across behavioral domains. Proceedings of the National Academy of Sciences, 119(1), e2107346118. doi:10.1073/pnas.2107346118
Sherif, M., & Sherif, C. W. (Eds.). (1969). Interdisciplinary relationships in the social sciences. Routledge.