
Cognitive rules, institutions, and economic growth: Douglass North and beyond

Published online by Cambridge University Press:  22 November 2016

AVNER GREIF*
Affiliation:
Department of Economics, Stanford University, Stanford, CA, USA and CIFAR, Toronto, Canada
JOEL MOKYR*
Affiliation:
Department of Economics, Northwestern University, Evanston, IL, USA, The University of Tel Aviv, Tel Aviv, Israel, and CIFAR, Toronto, Canada

Abstract

Douglass North's writing on institutional change recognized from the very start that such change depends on cognition and beliefs. Yet, although he focused on individual beliefs, we argue in this paper that such beliefs are social constructs. We suggest that institutions – rules, expectations, and norms – are based on shared cognitive rules. Cognitive rules are social constructs that convey information that distills and summarizes society's beliefs and experience. These rules have to be self-enforcing and self-confirming, but they do not have to be ‘correct’. We describe the characteristics of such rules in the context of a market for ideas, and illustrate their importance in two developments central to the growth of modern economies: the rise of the modern state with its legitimacy based on consent, and the rise of modern science-based technology that was the product of the scientific revolution and the Enlightenment.

Copyright © Millennium Economics Ltd 2016

1. Introduction

Neither of us ever studied formally with Douglass North, nor were we his colleagues, nor did we co-author with him. Yet, as practicing economic historians for many years, we cannot imagine what our work would have looked like without his influence, support, and encouragement, whether through his writings or through our conversations with him. North was a scholar like no other: He was never bothered by the standard methods and conventional wisdom of economics and never hesitated to take his colleagues to task if he felt they were missing something important – which was most of the time. He ignored the traditional boundaries between the various social sciences, and his thinking influenced and stimulated people across the social and historical disciplines. He also was rarely locked into a position on anything: As his thinking evolved, he readily abandoned positions he had taken in the past.

North believed in the importance of institutions in economic history, and rightly accused much of the new economic history – which he helped found – of ignoring them for many decades. This call resonated with many scholars who were inspired by him to ask historical questions about the kind of institutions that he was interested in: property rights, contract enforcement, the political and legal frameworks of markets, the deployment of power and violence in society, and so on. North stressed from the start that institutions should be differentiated from organizations. But what precisely are institutions? In his path-breaking book, North (1990) defined them as human-made constraints (and hence the ‘rules of the game’), but that definition seemed to raise as many difficulties as it resolved: who actually defines these constraints, and if they were constraints, what happened if they were violated? It seemed more natural to define them as incentives (which North immediately added): the rewards and penalties that society imposes on people who display certain behaviors. But if so, who set these incentives and who enforced the rewards? What was the role of customs and norms? And above all, what determined if agents would pay any attention to them?

From the outset, it became (almost) a consensus in the profession that we could not understand economic history without paying attention to institutions. It became imperative to confront the question of institutional change: why and how institutions looked the way they did, how they changed in the long run and in response to what, and why some countries had such dramatically different institutions from others. In time, it became clear that changing cognition and beliefs were important to institutional change. North (2005: 49) noted that ‘there is an intimate relationship between belief systems and the institutional framework’. What people believed to be true, fair, and reasonable mattered a great deal not just to their behavior directly but also through the institutions they lived with. North famously referred to these beliefs as the ‘scaffolds’ on which institutional structures rested (ibid.: 8–9).Footnote 1

In what follows, we take up the challenge posed by North in his 2005 book. We suggest that institutions – rules, expectations, and norms – are based on shared cognitive rules. Indeed, it is hard to think of incentives as anything but a cognitive rule. Cognitive rules are social constructs that convey information which distills and summarizes society's beliefs and experience. These rules have to be self-enforcing and self-confirming, but they do not have to be ‘correct’. Cognitive rules include not only beliefs based on observed empirical regularities such as the difference in temperature between winter and summer (which each individual can observe on his or her own), but also beliefs about nature such as that the gravity of the moon causes the tides and that smoking causes cancer, which individuals believe because they are socially accepted (equilibrium) cognitive rules. The incentives that people respond to are also socially based cognitive rules: people believe that certain actions will lead to certain outcomes. For instance, in some societies people believe that working hard and paying one's taxes honestly are rewarded and are the correct things to do. In others, people have different moral beliefs on what constitutes ‘cheating’ on their taxes, what constitutes ‘shirking’ on their jobs, and what constitutes a ‘bribe’ as opposed to a ‘fair payment’. The very definitions of these concepts are shared cognitive rules. Others may think of these rules as ‘institutions’. We think that cognitive rules – such as what is moral, what people are expected to do in certain situations, and how causes lead to outcomes – underlie the regularities in behavior that are generated by institutions. Without such social mechanisms, people are incapable of making sense of much of the world around them – neither of the society they live in and the markets they buy and sell in, nor of the physical and biological world they cope with on a daily basis (Greif, 2006, chapter 5; Greif, 2014; Scott, 1998). Economists have typically assumed that people make decisions on the basis of knowledge of the problems they have to solve. What North and others have pointed out is that the rules by which this knowledge emerges are the result of individual learning; we, on the other hand, see them as social constructs that provide the foundation of individual decision making and that are transmitted by social rules – such as the rules of the road or the rules of the market (Greif, 1998; Greif and Kingston, 2011).

How should we think about institutions and cognitive rules? Cognitive rules, which summarize and aggregate society's beliefs and attitudes, are followed because individuals with limited cognition – that is, everyone – have to rely on them in exercising their choices, since these rules link outcomes to decisions and thus set the incentive structure. These rules specify the knowledge and information about the consequences of choices and decisions. Individuals have the option to follow the rules or not, but they normally cannot set the rules. In other words, cognitive rules correspond to behavior when the cognitive frameworks they convey constitute an equilibrium in a ‘game’ between each individual and the rules. If I take this action, such and such is likely to happen – whether that involves eating spoiled food, driving through a stop sign, or writing a rude email. In a sense, one is ‘playing’ by responding to the rules rather than to the other players. The individual takes the rules as given – as if they correspond to reality – in making choices.

In what follows, we hope to show that this view of institutions can help us understand historical phenomena and not just behavioral issues, such as why (almost) all drivers stay to the right side of the road as the law stipulates, yet few drivers observe posted speed limits. That raises two questions of profound historical importance: one is why and how the accepted cognitive rules that matter to the economy change; the other is how such changes mapped into economic change eventually leading to the modern economy, which is what North (1990, 2005) was after all interested in. Specifically, we consider the rise of the consent-based government of the modern state, the idea of the Law as a means of enabling market and other economic activities, and the cognitive basis of the rise of modern science and the technology based on it.

2. Cognitive rules and the ‘market for ideas’

Of all the cognitive rules in a society, some of the most important may be the meta-rules that specify which cognitive rules are accepted or not. One reason for the limited capacity of individuals to form correct beliefs is poor informational feedback in an environment in which multiple interpretations are possible. Such feedback is particularly poor under individualistic (atomistic) learning based only on outcomes that are observed by each individual, given the inherent attributes of the interaction. Each individual learns quickly that if they drop an object, it falls on the floor, and if they want to buy an item on the market, they have to pay the market price. But many other rules are socially conveyed and distributed and cannot be tested individually. For instance, the age-old custom of bleeding fever patients may have been viewed as effective if the patient subsequently recovered; a more rigorous conclusion about whether it was effective requires a large sample and random assignment of bloodletting, something that was beyond the power of the individual patient to observe. The rule was self-confirming: if the patient recovered, bleeding had worked. If he did not, this was despite the procedure. The socially constructed cognitive rule that bloodletting worked was left intact for many generations.Footnote 2 The same is true for social interactions: Each individual attempts to form beliefs about others' future behavior based on their past behavior. Yet, these others also adjust their future behavior in response to past outcomes.Footnote 3 Perhaps most difficult are cognitive rules about economic policy. Is free trade a policy that makes people better off? Do minimum wages create unemployment? Do democratic regimes foster economic growth?

One way of making this point is to utilize a distinction made many decades ago by Hayek (1942). Hayek insisted that one of the errors made in the social sciences is to overlook ‘the real contrast between ideas which, by being held by the people, become the causes of a social phenomenon and the ideas which people form about that phenomenon’. He referred to the former as ‘constitutive’ ideas, as they are the real causes of a phenomenon, and the latter as ‘speculative’ or ‘explanatory’ ideas, which are the ex post notions people have about a phenomenon.Footnote 4 In all emergent properties, the collective or aggregate may be regarded as very different from the causes of the elements that account for it – because it is. And yet, in this paper, we want to argue that at times the two may coincide: the people who carry out research in the hope of making society richer may actually believe that the path to economic growth is paved by scientific innovation; the people who are driven to political action to bring about government by consent may actually regard government by consent as a superior form of politics.

Because individuals cannot normally make such decisions on their own, they often rely on experts: priests, officials, teachers, physicians, scientists, ethicists, and legal experts – all help agents decide what they can and should do, and what the payoffs are of each action. These experts constitute a way in which society distributes the distilled cumulative aggregated wisdom of the totality to individuals. Yet, such a rule of experts – inevitable in every society in which the set of social knowledge is larger than what each individual agent can verify on his or her own and which practices a division of knowledge – raises many other issues. First, how do these experts themselves reach the beliefs and convictions they have? Second, who appoints those experts, and who appoints the appointers? And third, what happens when experts disagree and when they compete with one another, holding conflicting views? How do people choose?

Before delving deeper into these issues, it is important to stress that there is nothing in human history and experience that indicates some kind of Gresham's Law in reverse, namely that ‘good’ cognitive rules drive out ‘bad’ ones. Much of that depends on the meta-rules that help people decide what knowledge is valid. In a world in which the wisdom of ancient sages is decisive – be they Aristotle, the Talmud, or Zhu Xi – the likelihood that bogus beliefs can remain powerful is high. But even a world that relies on evidence and logic has to make difficult decisions about what evidence counts, and what rules of logic are admissible. Is statistical evidence acceptable if experimental data are unavailable? And when is experimental evidence decisive? When new ideas strongly conflict with an existing view of the world, the rules of evidence may be cast aside.Footnote 5

Such strong persistence of beliefs is due to confirmation bias on the individual and societal levels, compounded by the material interests of those benefitting from these beliefs. Confirmation bias implies that when beliefs are challenged by new evidence, individuals and groups seek ways to reconcile them with existing beliefs rather than replacing them with new beliefs that are better supported by the data. The belief system advanced by the Catholic Church has survived, although it was modified so as not to be refuted by the Copernican view that it eventually had to accept. Ironically, cognitive systems that are false but cannot be disproved can last longer than systems that might be mainly right but cannot be proved (the germ theory was first proposed in 1546 by Girolamo Fracastoro, 50 years before the first microscope).

The idea of a competitive market for ideas has long been popular among some scholars (Coase, 1974; Gans and Stern, 2003; Mokyr, 2007; Polanyi, 1962; Stigler, 1965). It is at once misleading and helpful (Hodgson, 2015: 130–131). Well-functioning markets imply well-defined property rights that are transferable at a price – which does not apply here. Yet, disruptive new ideas are generated by somebody, and they become accepted cognitive rules when a sufficient number of others accept them, usually abandoning or modifying previously held views. Such a change can be regarded in terms of a metaphorical market, in which intellectual innovators try to persuade the relevant public to accept new beliefs. If such persuasion is successful, a ‘sale’ has taken place. Markets for ideas can be highly competitive or dominated by monopolists; they can be open or erect high barriers to entry; and they certainly exhibit transaction costs and taboos, much as we see in any other market. While there are no prices that are paid when transactions take place, successful sellers gain fame and prestige, and the utility and resources correlated with them. Old ideas are stubborn and fight for their survival, so such persuasion is often accompanied by serious conflict – one thinks of the persecution of heretics and dissenters over the ages, and the religious wars of the 16th and 17th centuries.

Like all markets, however, the market for ideas needs to have institutional and technological underpinnings that make it work – indeed, that was one of North's main messages. If repeated and sustainable transactions are to take place, there have to be meta-rules about how cognitive rules are assessed. Those include rhetorical conventions about what constitutes proof and evidence, but also the rules of conduct in this market.Footnote 6 There is also the matter of the technology of communication. The market for ideas in a world of internet and Facebook is as different from the market of the 1950s as the market of the first half of the 16th century (with widespread printing presses and effective long-distance mail services) was from the medieval environment.

Cognitive rules tend to reproduce themselves and to be highly persistent – except when they are not. They tend to become unstable when they lose their ability to be self-confirming. This can happen for example when new evidence emerges that is viewed as incontrovertible yet is inconsistent with accepted cognitive rules. Such new evidence can be the unexpected by-product of new technology; the new scientific instruments of the 17th century showed clearly the errors of classical physics, astronomy, and geography, and the improved microscopes of the 19th century demonstrated the validity of the germ theory as opposed to miasma theories. In other cases, however, more subtle persuasion was at work that changed people's views of the organizations that defined their collective lives: was the king the citizen's master by divine right, or was his legitimacy based on the rule of law and his subjects’ consent? Was rent-seeking a legitimate activity or was the only legitimate economic activity the one that actually produced (rather than redistributed) resources? Were protection and subsidies a good way to run an economy or was unfettered free trade? Here smoking guns or mathematical proofs were largely absent, and persuasion became a central feature. Indeed, social learning, imitation, and persuasion through one form or another were the essence of what was taking place in the market for ideas, shaping the kind of self-reinforcing cognitive rules that North viewed as institutions and that in his view set the rules of the game.

The market for ideas can throw up different kinds of equilibria. One is the degenerate equilibrium in which a single belief or cognitive rule becomes ‘fixed’ in the population. Worldwide, flat-earthers are practically extinct, as extinct as the number of Swiss drivers who will try to offer a traffic policeman a bribe. In many other cases, one mental species becomes dominant, but the others are driven into smaller or larger niches. Creationist biology may still be taken seriously in Petersburg, KY, and taught at Liberty University and a handful of other Christian colleges, but can hardly be regarded as a serious discipline in American higher education. Although there is serious support for homeopathic and other alternative medicine, there seems to be little sign of a serious ‘alternative chemistry’ or an ‘alternative nuclear physics’. Yet the cognitive rule that guarantees such ‘alternative’ ideas the chance to compete in the market for ideas without retribution, no matter how widely their authors are viewed as crackpots, is itself a successful meta-rule (at least in the United States) that had to compete in the market for ideas (and which clearly was rejected in many other cases).

North had a great deal of sympathy for evolutionary theory and models of institutions and cognition. He realized full well that evolutionary thinking is a natural way to connect the past to the present and make sense of how institutions change over time. However, in the end he concluded that the differences between the two are too deep to apply evolutionary thinking to economics (North, 2005: 65–66). Yet, a generalized evolutionary structure is attractive to historically minded scholars precisely because it provides a link between any society's rules and its past. Through the socialization of beliefs, customs, and values, children become imperfect copies of their parents and teachers. But at times this process fails in important respects: all Protestants before, say, 1535 were born Catholics, and early members of the Communist parties were not brought up as Communists. New items appear on the menu of cultural options. Such changes differ in important respects from ‘mutations’ in biology but they have similar effects. Selection on intellectual innovations works through the market for ideas.

What counts for the dynamics is the concept of coevolution (Durham, 1991; Richerson and Christiansen, 2013): two entities or species can affect each other either positively or negatively. Cognitive rules affect one another. In some cases, they are mutually antagonistic, whereas in other cases they mutually reinforce one another. There are a few instances in human history when a process of positive feedback, in which multiple sets of beliefs and knowledge reinforced one another, had the power to change history. None of these, it appears to us, are more important than the evolution of beliefs and institutions in the West in the 18th century, which triggered the Industrial Revolution and everything that came after, and in which the foundations of our current prosperous world were laid.

Co-evolution also helps resolve the issue of the direction of causality. Historical materialism subjugated ideas and beliefs to material economic forces; historical ideationism, in which ideas drive historical development, has had a recent revival (McCloskey, 2006, 2016). But nobody is arguing that ideas alone or material forces alone drive institutions and historical outcomes. Material interests determine to some extent what people believe and what institutions will emerge. The ability to create intellectual rationalizations for one's hopes for material advancement, to say nothing of naked greed, should not be underestimated. Ideas change to fit changing times, but as they change they affect the way the environment changes, much like the environment affects how species evolve and yet the species change the environment in turn. All the same, beliefs and cognitive rules are formed in more complex ways than ‘how can I profit?’ Did material interests affect whether people believed in the theory of evolution or in Newtonian celestial mechanics? Co-evolution, in which the two constantly interact and feed back on one another, is a more accurate way of looking at the kind of issues North was interested in. It also underlines that outcomes are on the whole indeterminate and characterized by a multiplicity of equilibria, much like history itself.

Did cognitive rules matter for outcomes economic historians care about? We argue that they did and below we present a number of examples to that effect, each of which touches directly on the two developments that are at the core of the historical transformations that created the modern economy. The first is the rise of the modern Western-style nation state aimed at improving the welfare of its citizens and relying on an effective legal system, and the second is the rise of modern science and technology and the increase in productivity and economic welfare it implied, which McCloskey has termed the Great Enrichment.

3. Cognitive rules, legitimacy, and political development

With some notable exceptions, the role that cognition played in the historical process of political development has not been examined (among these exceptions are Greif and Rubin, 2016; Levi and Sacks, 2009). Yet, as we argue below, cognition – particularly regarding legitimacy – has had a large impact on historical trajectories of political development.

The cognitive aspects of interest here are those concerning legitimacy, that is, the rational or moral basis for the right to rule. Political regimes face the challenge of motivating compliance with demands on the citizenry that, absent either intrinsic motivation or coercive power, violate the individual-level participation constraint. North noted the role of intrinsic motivation in governing: the fundamental aim of ideology, he argued, is to make people behave in ways that are contrary to their simple hedonistic individual cost/benefit calculus and to overcome free riding (North, 1981: 53). In other words, people in power advanced cognitive rules justifying their control of others to motivate compliance. But how?

To begin this task, it is useful to consider the conditions under which states provide public goods (rather than delegating this role to purely social or economic organizations). Moreover, if legitimacy and coercion are substitutes, what limits the reliance on legitimacy? We depart from North by considering why the state faces the following tradeoff in providing public goods and why it provides them to begin with. Recall that the provision of public goods is characterized by a free-rider problem, as identified in the seminal contribution by Olson (1965). Collective actions differ in attributes such as excludability and observability, which determine whether free riders can be deterred by non-coercive (i.e., economic or social) mechanisms. How can free riding be mitigated if economic and social punishments are not available? Either intrinsic motivation or coercion or a combination of the two can be used. In particular, a legitimate ruler who orders his subjects to contribute resources to a public good can rely on an intrinsic motivation to mitigate the collective action problem. Similarly, if the ruler can collect taxes using coercive power, he can finance the provision of public goods.

Both coercion and legitimacy, however, come at a price. The price of coercion, which has been and still is common, is the division of society between the coerced and the coercing, where the former pay the costs required to motivate the latter to subdue them. Such internal divisions, in turn, required resources and fostered social unrest and violent conflicts. Equality and open access were inversely related to the degree of intra-state coercion used to mobilize resources for public goods. Moreover, intra-state imbalances in the allocation of coercive power created opportunities for a military elite (originally designed to protect the country from external threats) to exploit the non-elite and extract resources from the larger population. This is the essence of the ‘natural state’ described by North et al. (2009) and the ‘extractive state’ described by Acemoglu and Robinson (2012). Their attempt to understand the rise of the modern European state using only this perspective has been innovative, but without a more explicit emphasis on beliefs and ideology it has been incomplete. Extending the analysis to consider cognitive rules therefore seems promising.

Legitimacy is a perception shared by the citizens that a particular regime is rightfully in power. Because it is a moral view, it tends to be persistent. Some ancient rulers seem to have been remarkably adept at prolonging their regimes, although it is difficult to distinguish whether their legitimacy or their power was the reason. Be that as it may, later regimes faced the challenge of motivating compliance among subjects who often already held a cognitive structure created by a previous regime or another entity (e.g., religious authorities). Roland (2004) noted that such ‘slow moving’ cultural features constrain the set of behaviors a ruler can institutionalize. Subjects may pretend to recognize legitimacy even when they do not.Footnote 7 If the economic and coercive pressure to conform to a new legitimacy rule is too strong, it may lead to resistance to the new regime that may be deeply entrenched. Regimes require powerful and influential individuals or organizations to declare their recognition of the rulers, such as the prophet Samuel in the biblical book of Samuel. The fact that such an agent has been asked to legitimize a ruler is a source of additional power to such agents. Such power to bestow legitimacy enhances the legitimacy of the ruler over other agents who have not been asked to legitimize him (Greif and Rubin, 2016).

One indication of the importance of cognitive rationales for political systems is the large extent to which political regimes invoked religious justifications for their control, despite the risks involved. Egyptian pharaohs, Persian kings, Japanese and Roman emperors alike were among the many who claimed themselves to be divine. The benefit to them was the ability to delegate punishment to a third party (that is, the divine entity), or to postpone punishment for non-compliance to the afterlife. The risks, however, were substantial. The first risk is the need for the rule to be self-confirming (Greif, 2006, chapters 2, 6, 7; Greif and Laitin, 2011). A cognitive rule refuted by observable outcomes would not last long. A king who claimed to be a god risked being unmasked as an impostor by outcomes inconsistent with the claim, such as a military defeat or a natural disaster. The second risk is that a supporting religious authority could become ambitious or greedy and challenge the monarchy.Footnote 8 Religious authorities therefore had to be compensated to maintain their loyalty.

Divine justification was nevertheless sufficiently valuable that rulers often sought it. It is therefore possible to evaluate whether cognitive rules mattered by regressing observable outcomes on proxies of differences in cognitive rules. Specifically, did different religiously based cognitive rules have distinct implications regarding the longevity, effectiveness, of economic and political institutions? Iyigun (Reference Iyigun2015: 23–45) established that political units in which monotheistic religions prevailed last longer and were bigger than others. It is significant, however, that post-Roman European rulers at first did not rely on religion to justify their control.

They ruled because they were the descendants of the traditional chieftains of the Germanic tribes and other groups that created the new post-Roman political units. As such they were considered first among equals, and to become a ruler, a son of the previous chieftain had to obtain the consent of the free men who were to follow him to battle. Although consent could not be taken back, a chieftain who lost the confidence of his followers could expect to find them following someone else to whom they declared loyalty.

By the 8th century, challengers to the traditional rulers of various European polities were deploying Christianity to gain legitimacy. Specifically, rulers whose legitimacy was based on blood line and consent were challenged by those whose legitimacy was acquired by Papal blessing. The legitimizing power of the Papacy was based on the new cognitive idea of the Christian king and on the Papal position as an intermediary between the Lord and the believers. The cognitive foundation for this position is reflected in the Papal emblem, two crossed keys, symbolizing that any door that the Papacy opens on earth, God will open in heaven. The Papacy was therefore in a position to influence compliance and loyalty to rulers and to those who challenged these rulers.

The Papal role as king-maker is illustrated, for example, by the history of one of the most important European dynasties, that of Charles the Great, whose father, Pepin the Short, became king of the Franks in 751 with Papal support. Previously, the traditional rulers of the Franks, known as the Merovingians, legitimized their rule based on hereditary rights and the consent of their aristocracy. Pepin's family held the position of the maior domus, the chief administrator under the king, and over time created a professional army under its control. In 751, when the Pope needed Pepin's military support against the Lombards, he approved Pepin as king. Only afterward did Pepin seek the consent of the aristocracy, while having his army close by. The hint was clear and consent was given. Pepin was not unique in invoking the Papacy to justify his rule. William the Conqueror sought Papal approval in 1066 before sailing to capture England and, as with Pepin, the consent of the English nobility was given when William's army was at hand.

There is an interesting historical dialectic at work here. The king-making powers of the Church endangered existing royal houses and other powerful actors, thereby undermining the Church's own position. Kings feared that their opponents would ally with the Pope, whereas intra-state actors who sought to limit royal power feared Papal support of the monarchy. Perhaps the most striking example is the case of the Magna Carta in 13th-century England. The barons forced the king to take an oath to keep the charter, and as a Christian king he could not renege without committing a mortal sin. In order to break his oath anyway, the king offered England to the Pope, ruling it thereafter as a papal vassal, in return for the Pope annulling the oath. Later, an act of Parliament declared it illegal for a king to offer England to the Pope.

The religious obligation of the king's subjects to comply with his rule was thus beneficial to kings, as long as it could be controlled and shielded from papal meddling in the affairs of the realm. By the late 11th century, the tension erupted in an open confrontation between the Papacy and the Holy Roman Emperor. The results were devastating to both, as the Empire disintegrated and the papacy saw its king-making capacity decline over the following centuries.

More generally, to buttress their independence from the papacy, European rulers relied on the legitimizing power of consent, as had been the case in the period before the rise of the political power of the papacy.Footnote 9 In other words, in their attempts to weaken the cognitive rules that regarded the pope as the supreme political authority, the rulers promoted the role of consent by their subjects, harking back to pre-Christian institutions of legitimacy by consent. In 1302, the French Estates General was assembled by the King, Philip the Fair, when he sought its support in his struggle with Pope Boniface VIII. In England, the House of Commons was drastically expanded after Henry VIII broke with Rome from 1529 onward (Greif and Rubin, 2016).Footnote 10 Similarly, the earlier conflict between the king and his barons led the monarch to strengthen cities and foster commerce in order to weaken the barons by shifting power, wealth, and administrative capacity to the commoners.

European rulers still sought religious legitimacy, but one that did not depend on the consent of the papacy. For this purpose, they promoted national church hierarchies under their control that shielded them from Rome and enabled them to gain the support of religious authorities. In other words, the European monarchs created a new cognitive concept, a national church. These churches were part of a universalistic religion but were more amenable to sanctioning the current ruler. In creating a national church to justify their rule, the European monarchs exploited divisions within the Church and the cognitive rules inherited from tribal institutions that required consent. It was relatively easy to align the interests of the monarch with those of the local high clergy. Archbishops preferred to crown a king rather than let the Pope do so.

In some cases, the Pope was formally deposed as head of the church and replaced by the King (as in England) or basically deprived of any serious political influence, as in France, where Louis XIV forged a form of Catholic absolutism known as ‘Gallicanism’ (Pincus, 2011). In early modern Europe, the cognitive rules for legitimacy had clearly changed. The kings no longer had to rely primarily on a religious imprimatur to attain the consent of their citizens.

When the European monarchs turned to limit the power of the representative assemblies, they again did so using a cognitive innovation that combined two previous legitimacy principles: hereditary rights and Church approval of a Christian king. By combining these, the rulers demanded compliance based on their ex dei gratia, divine right. This was a brilliant cognitive innovation that, subject to the constraint implied by monotheism, provided a way of invoking divine sanction without assuming the risk of declaring oneself God. Even kings who ruled by consent valued their divine right. That European kings cared about the perception of their divine power is illustrated, for example, in the restoration of King Charles II to the throne of England in 1660. One of his demands was to resume holding public healing sessions, demonstrating his divine powers to perform miracles. His grandfather, James I, articulated in Parliament that he had a divine right to rule shortly after his coronation in 1603.

The appeal of monarchs to a divine right is natural, given the discussion regarding the nature of cognitive rules and their function in justifying rulers. Under this concept the king was not God, but his right to demand compliance was God-given. This rule was a way of achieving compliance based on divine sanction, while avoiding the risk of claiming to be a god. The message he sent to his subjects was: obey the king regardless of his performance, or else be a sinner against the will of God. All the same, the divine right was not absolute: even kings who invoked it ruled only with the consent of their subjects – the divine right was a supplementary way to elicit that consent.

The effectiveness of national churches depended on context. According to Charles II, who was nominally the head of the Church of England, not all monotheistic religions were created alike. Specifically, he famously declared that Catholicism was the best religion for an absolutist ruler.Footnote 11 Whereas Catholicism centers on the Papacy, Protestantism had no equivalent central religious authority. Moreover, Protestants read the Old Testament, in which the idea that only God should govern the community of believers undermined claims that kings had a divinely ordained right to be obeyed by their subjects. Protestant intellectuals generally supported the notion that subjects had the right to overthrow rulers of whom they did not approve.Footnote 12

In the Catholic and Slavic parts of Europe, in which the church and/or the nobility were strong, rulers held power only through the legitimacy and support provided by the national church and/or nobility (landowners more generally). During the 18th century, the total number of sessions held by European representative assemblies went down relative to the previous century by 15% (to 804). The decline was particularly large in Denmark, Poland, Portugal, Russia, Spain, Italy, and France (van Zanden et al., 2012). This was the period known as European Absolutism. It did not last long, however. The cognitive rule claiming the right of representation reasserted itself during the 19th century.

The importance of the cognitive foundations of political order becomes clearer once we broaden the scope of the analysis beyond Europe. Political orders based on cognitive rules that can be refuted by not meeting the standards of proof defined by the rule were particularly vulnerable. This was the case in China, where the cognitive rule justifying the Chinese emperor was that he held a mandate from Heaven (e.g., Zhao, 2009). The mandate manifested itself in peace and prosperity, for which the emperor was responsible. Chinese dynasties were in jeopardy and even ended whenever some combination of population growth, climatic change, natural disasters, political weakness due to internal divisions, and external attacks invalidated the mandate (Morris, 2010).

To sum up, the political foundations of legitimate rulers in Europe changed over time due to cognitive innovations, changes in the balance of power, and strategic interactions. In particular, following the collapse of the Roman Empire, traditional or hereditary rights and consent provided the basis for political legitimacy in the new political entities, many of which were initially pagan. As Christianity spread, the papacy introduced the concept of the Christian king, on the basis of which it could have become the kingmaker in Europe. In this quest, the papacy benefited from conflict between the monarchy and nobility. The king-making powers of the Church, however, endangered existing royal houses and other powerful actors, thereby undermining the Church's own position. The monarchs initially weakened the power of the papacy by reviving legitimacy by consent and by allying with and strengthening the commoners, particularly the cities. Subsequently, however, the cognitive innovation of divine right, strengthened by national churches and supported by the weakened nobility, enabled the rulers to restrict the power of the commoners as well.

4. The cognition of modern growth: progress, science, and technology

The European Enlightenment is a hugely complex and controversial topic; specialists still disagree on many aspects, including whether we should think of a common denominator to the rather divergent views that constituted it, or whether we should speak of many ‘enlightenments’ and leave it at that. For the economic historian, however, the significance of the Enlightenment is above all concentrated in the belief in progress, in the capability of economic agents to work successfully towards improving their lives. New discoveries and instruments emerging after 1500 showed the many errors of classical learning and raised skepticism and contestability of age-old accepted beliefs to the level of a cognitive rule. Evidence and logic replaced unassailable authority. In this intellectual environment, the Enlightenment emerged triumphant (Mokyr, 2016). It bears emphasis that radical skepticism and contestability of received wisdom were found in other societies, but never to the extent and with the force they attained in early modern Europe.

Specifically, an aspect of the Enlightenment that was central to the subsequent economic history of Europe was the change in views regarding the physical and biological world around us. The behavioral rules of interest here, above all, were the rules by which people distributed ideas and knowledge and the rhetorical conventions by which they persuaded one another on both these subjects (Mokyr, 2016). These two entities co-evolved, reinforcing one another. In the end, the cognitive rules became inconsistent with the existent political forms, and the latter had to be changed either through revolution or through reforms.

The exact attitudes regarding progress and how to bring it about differed, especially between the great thinkers of the Scottish Enlightenment and their French counterparts. But they shared a most important mental model, namely the belief that economic progress depended on the ‘progress of the arts and sciences’, as Hume titled his famous 1742 essay, and on suitable political institutions, as formulated by Adam Smith in his widely cited statement that ‘little else is required to carry a nation to the highest state of opulence from the lowest barbarism but peace, easy taxes, and a tolerable administration of justice . . . All governments which thwart this natural course, which force things into another channel or which endeavor to arrest the progress of society at a particular point, are unnatural, and to support themselves are obliged to be oppressive and tyrannical’.Footnote 13 Hume and others conjectured about the likelihood of progress occurring in their lives and in future generations, but neither he nor his more enthusiastic French colleagues, such as Turgot and Condorcet, had much of an inkling of what was to come.

The best-known part of the Enlightenment dealt with politics, including of course the matter of legitimacy. But the cognitive rules regarding the state and its relation to the economy went much further. As long as the essence of the state was to transfer resources from the weak multitudes to the powerful few, improvements in technology and the allocation of resources would be hard to translate into widespread growth in the standard of living. The Enlightenment ushered in the beginning of the end of the extractive state in Europe. Government still taxed, but even in absolutist empires such as Russia and Prussia, the purpose of taxation became less and less to enrich the rulers and their cronies, and more and more the provision of supposedly welfare-enhancing public goods and services that the private sector for one reason or another could not provide. The rise of free trade in post-1780 Europe is a good indicator of these changing cognitive rules; tariffs were one of the oldest and most widely used practices of rent-seekers. The growing conviction that free trade was desirable and good for the economy derived not just from the highly influential writing of Smith and his liberal followers, who stressed that exchange was a positive-sum and not a zero-sum game, but also from the increasing resistance to any kind of measure that benefitted a few at the expense of the many. The same held for freedom of occupational choice and location of residence. Rent-seeking (which was what mercantilist policies were largely about) was increasingly understood to be associated with large deadweight losses. Monopolies, tariffs, subsidies, cozy offices, what the French called privileges, were all leaky buckets, in which the gains to the winners were smaller than the losses of those who paid the price. North (2005: 63) explicitly mentions the transition from a cognitive rule that sees all economic activity as a zero-sum game to one that sees it as a positive-sum game, but he did not pinpoint the intellectual innovations of the Enlightenment as the crucial events that brought this transformation about.

Continental Europe and the North American colonies implemented many of these reforms through revolution. In Britain, although the unfolding of these policies may have been slower than impatient reformers wished for, the mercantile state as it had existed in the 17th and 18th centuries was practically dismantled by 1850. Along with mercantilist policies, corruption and, to a great extent, rent-seeking melted away, just as Smith and Hume had hoped (Mokyr, 2009, chapter 4). Although as always some corrupt behavior could not be avoided, it became the exception rather than the rule. In Britain, as Harling (1995, 1996) has shown, corruption declined and its ruling class was on its way to turning itself from an extractive class into a professional and largely conscientious service elite (Colley, 1992: 192). In Prussia, Scandinavia, the Low Countries, and to some extent France, rent-seeking and corruption were kept in check, and the State in the countries that had experienced the Enlightenment became increasingly the kind of organization that the 18th-century philosophes had dreamed of. The cognitive rules that governed how the citizens saw the state and their rulers had changed dramatically.

The other great cognitive change of the Enlightenment was the realization that the understanding and control of natural phenomena and regularities were essential to human progress. The importance of scientific insights (both substantive and methodological) to the Industrial Revolution and subsequent economic growth has been a matter of dispute. In many areas, technological progress still occurred the way it always had: small cumulative improvements in processes and products through trial and error and artisanal serendipity. Yet, that system was changing, and many of the great industrialists of the Industrial Revolution sought the advice and counsel of scientists at the cutting edge of their profession. Whether the advice of consulting scientists such as William Cullen and Davies Gilbert did much good to the instrument-makers and the spinning-mill owners who sought it is not at all certain: in some cases, more so than in others. But what is striking is how committed the age of the Industrial Revolution was to the basic cognitive rule that the insights of science could and would eventually lift productivity and living standards. It is not surprising that one of the heroes of 18th-century thought was Francis Bacon, the philosopher who did more than anyone else to change the way people thought about progress and acted on those beliefs.

Bacon's influence on European economic history is a topic that has not engaged the profession much till now, but the Northian concept of mental models at the individual level and its extension to the social and institutional levels (Aoki, 2001; Greif, 2006, chapter 5; see Greif and Kingston, 2011 for a survey) – that is, cognitive rules and their effects on institutions – are quite helpful here. Bacon's main message was that the agenda of natural philosophy, which we would call applied science, should be driven by practical and material needs to solve technological bottlenecks and ‘the relief of Man's Estate’, as he called it. This message resonated enormously during the century and a half that followed his death in 1626, and intellectual historians such as Zagorin (1998) and Zittel et al. (2008) have given him the credit he deserves for being the pivotal thinker in creating economic modernity and the so-called Baconian program that created it (see also Farrington, 1979; Rossi, 1970). The ideas he promulgated were of course not altogether new, but his writings served as a focal point that clarified and organized the thinking of his followers in the age of Enlightenment and created an altogether novel cognitive rule. To cast this in terms of game theory, subjectively developed beliefs converged on equilibrium beliefs. An initial ‘grain of truth’ regarding others' behavior – which is what Bacon proposed – is thus sufficient for individuals to learn independently how others will play and for convergence on a cognitive equilibrium. In the market for ideas, he can be regarded as a highly successful entrepreneur (Mokyr, 2016).

The other idea that drove much of Enlightenment science was the realization that ancient learning was not the be-all and end-all of knowledge. The debate within the European intellectual community, now largely forgotten, is known as the struggle between the ancients and moderns (Lecoq, 2001; Levine, 1981, 1991). The belief in progress logically implies a certain lack of respect for the learning of earlier generations. French thinkers such as Pascal and Fontenelle argued that knowledge was cumulative and that it was therefore inevitable that each generation knew more than the previous ones.Footnote 14 New cognitive rules emerged that denigrated the once-powerful authority of ancient wisdom, reduced the built-in persistence of knowledge systems, and allowed faster change. If Aristotle and Ptolemy could be wrong about so many things, could the zero-sum mercantilist view of international trade and the unassailable divine right of kings be far behind? The intellectual community that formed in the 16th century (known as the Republic of Letters) adopted a meta-principle that turned out to be transformative: contestability. There were no more sacred cows, not Aristotle, not the Bible, not even Newton. It was no accident that the Royal Society adopted the motto of nullius in verba (on no one's word). Authority was demoted, and had to make room for evidence and logic.

The cognitive rules established in the age of Enlightenment thus radically changed the way in which Europeans thought about the natural world around them, and how to go about understanding it. Not only the agenda but also the methods of inquiry were transformed between 1500 and 1750: experimental methods had become legitimate, mathematics and precise computation had gained respectability, and new tools and instruments were deployed to measure and observe new objects and with greater precision. How and why to improve science and technology were not the only cognitive rules that changed in this era. McCloskey (2016) argues that the hierarchy of values changed as well, although it did not change monotonically. For her what mattered above all is that at some point in early modern Europe, society began to honor the ‘bourgeoisie’ – merchants, investors, high-skill artisans, and speculators – giving them a respect and a social standing that changed their position in society and made others want to excel in these activities. That bourgeois spirit, she maintains, was a key factor in the economic changes that North was trying to explain. North would certainly agree.Footnote 15 It is hard to know whether the ethical factors that McCloskey is talking about are more important than the more mundane advances in the understanding of the physical world stressed by scholars such as Jacob (2007, 2014) and Wootton (2015). That debate will continue. But where everyone agrees is that what people believed to be true and how they processed information must be at the center of any argument that explains the modern world.

5. Cognitive rules and legal development

Belief in progress, the scientific method, and science-based technology provided the cognitive foundations of modern growth. Modern growth, however, would not have come about, at least in Europe in that period, had it not been complemented by reinforcing cognitive foundations of states and the law. The cognitive foundations of the European states were already discussed above. Political voice and political representation by economic agents, the rule of law, and the interest of rulers to promote economic growth as a way to gain in interstate competition were conducive to implementing the agenda, now recognized as possible, of modern economic growth. This does not imply, however, that contemporaries recognized that this was the direction of the European economy. In fact, even the Wealth of Nations, written by Adam Smith in 1776, reveals little awareness of what was about to transpire. The cognitive foundation of the European legal systems was crucial for the economic growth that was to follow, and their functioning, in turn, critically depended on the nature of the political systems. When the states were growth-oriented, the cognitive foundation of the European legal systems rendered them effective in the emergence of modern growth.

One role of the legal system in the European transition to modern growth was to mitigate the social upheavals implied by the transition. Maintaining social order in a society experiencing a transition to modern economic growth is challenging. The challenges are many, and among them are protecting new forms of property rights such as copyright and patents, and providing social safety nets to a relatively large urban population that depends on the market for staple food. The transition to modern growth is socially challenging also because it requires large investment in new public goods such as research institutes, schooling, and infrastructure. Losers from economic development and change need to be compensated or otherwise held at bay. Population explosion in urban areas had to be dealt with and checked, and the internal and external predators that more wealth attracted had to be deterred.

In Europe, the evolving cognitive foundations of the law facilitated achieving such objectives. The cognitive foundations of the European legal systems evolved alongside, and in a manner complementary to, the political developments described above. An important consequence of these developments in the cognitive basis of the European state was the decline in the legal power of the Church and the increasing authority of the state over the law. As late as 1300, the Church's canon law was more advanced than the laws of individual states. Moreover, early states needed to hire churchmen in order to have literate civil servants. The cognitive rules of Christianity, however, were not well suited to bringing civil law under the Church's control: having emerged within the political body of the Roman Empire and its strong civil legal tradition, the Church was to render unto Caesar that which was Caesar's.

In its power struggles with the European monarchs, however, the Church sought control over the law. Legal authority would have enhanced the Church's capacity to discredit a ruler as a sinner. A king who was a sinner contradicted the premise that a Christian king had first and foremost to be a Christian. As the Church had the authority to decide who was a sinner, such discretion implied a great deal of political power, and the Papacy repeatedly excommunicated kings and placed nations under interdict. Over time, however, the cognitive rules changed: the number of actions considered crimes increased while the number considered sins declined. Eventually the idea of sin disappeared as a concrete political concept.

To illustrate other possibilities, consider the legal development of the Muslim Mediterranean area, which evolved along a different path of cognitive foundations complementing that of its political system. The cognitive foundations of the political order in the Muslim lands in the medieval era differed from those in Europe and China. First and foremost, the Muslim rulers who created the first Muslim empire were titled Caliph, Amir al-Mu'minin, meaning the substitute and military leader of the believers. In other words, no Muslim ruler, including the Caliph himself, inherited the role of Muhammad as a spiritual leader. Authority over religious matters remained the responsibility of the Islamic scholars.

Among the responsibilities of the Islamic scholars were advising the provincial administrators, caring for the needy, providing education, and interpreting the Sharia, the Muslim religious law, and judging accordingly. Islamic law, however, did not cover all legal matters and therefore had to be complemented by other codes, among them civil codes, non-Sharia codes, and codes of customary law. Common to these codes was that they were neither based on nor became part of Islamic law before the modern period. In contrast, the law in Europe became increasingly unified and centralized as rulers and representative assemblies gained power relative to the religious authorities and came to control the legal system. Divergence in the cognitive foundations of political order thus influenced institutional – legal – developments.

The following three tables summarize the situation. They contrast the nature of the legal systems of Europe and the Muslim world, the former represented by the early 19th-century Code Napoleon (which epitomizes and aggregates many of the legal customs of continental Europe).

The important point to take from this comparison is that in the areas of the law most important for economic development – contract, constitutional, and tax law – the capacity of the Muslim state was particularly limited. In general, the Ottoman state was effective and quick in adopting military technology from Europe, and this served Islam well on the battlefields. But the Ottomans were ineffective in enacting laws that could foster modern growth. Charity, contract law, property law, and inheritance law, among others, were in the domain of the Islamic scholars, not the state. State legislation in these areas would have been considered un-Islamic and would only have highlighted the predicament of the state regarding its consistency with Islamic law.Footnote 16

The cost to the Sultans of altering the laws covered by the Sharia was high, as such changes reaffirmed the state's inherently un-Islamic nature. The high cost of changing the law limited the incentive to introduce transition-enhancing legal changes. Recall that the areas of the law most important for economic activity, such as contract law, charity, and property law, were covered by the Sharia. It may very well be that early in the history of Islam the Sharia law on these matters was appropriate for the needs of the time. But modern growth required commercial law, contract law, and property law very different from those of the previous era. In the absence of the capability to adapt gradually to changing circumstances, change, when it arrived in the 20th century, was violent and imposed top-down, as the experience of the Turkish Republic demonstrates.

An important question in growth economics is whether differences in legal systems affect economic growth and other welfare-related outcomes. An important line of work in economics has established the importance of the law and its historical origins for subsequent economic development. Perhaps the most prominent is the legal origins literature initiated by Andrei Shleifer and his collaborators, who have shown that the common law adopted by ex-British colonies fostered the deepening of financial markets and thus contributed to development.Footnote 17 By focusing on the adoption of colonial law, this work circumvented the question of the endogenous determinants of law (Berkowitz and Clay, 2011). That question has similarly been neglected in the important line of research associated with Kuran (2011) on the role of law in the economic decline of the Muslim world. His pioneering work examined the implications of the Sharia for economic development and concluded that its inheritance laws, the absence of laws enabling formal incorporation, and the rigidity of the laws governing pious foundations limited capital accumulation and formation. The importance of this insight notwithstanding, it sidesteps the relationship between the cognitive foundations of the state and legal change.

These issues are the focus of the analysis here. The transition to the modern economy required more than formal legal reforms: it required changing the cognitive rules of society and embedding these changes in the legal code. Kuran touches upon these issues when discussing the reasons why the Islamic world did not adopt or invent Western-style business corporations.Footnote 18 Modern economic growth required more than adapting existing contractual and organizational forms to new tasks. It required forming different cognitive rules regarding the nature of the economy, the practice of business and commerce, the mechanisms of conflict resolution and contract enforcement, and the precise role of government in managing human affairs. At the same time, it also required an agile legal system, such as Great Britain possessed, that did not obstruct the political and social adaptations and legal innovations that underpinned growth (Mokyr, 2009: 377, 413–418). Implementing the new technologies and knowledge that provided the basis for the modern market economy required creating and adopting new cognitive rules about the world around us and about the nature of the economy.

Another important driver of changes in cognitive rules was the slow decline of fatalism, of the belief that the outcomes of human life were the result of God's will and hence an inevitable destiny. Instead, the cognitive rule that slowly emerged viewed outcomes, good or bad, in a different way: economic outcomes were due to a combination of human agency, ability, and diligence with luck, random events, and accident. The challenge was to tell one from the other.

This cognitive transformation had profound implications for the rise of a 'modern' political economy: it enabled legal changes that the transition required if the losers were to be prevented from trying to block further progress (see Greif and Iyigun, 2013; Greif and Tabellini, 2010, 2016 for analyses of these changes in distinct societies). To illustrate, the concept of the deserving poor (as distinct from the idle poor) recognized that although in medieval societies most individuals had direct access to land from which they could make a living, this was no longer the case in industrialized economies. In agrarian economies, people were still subject to shocks caused by weather and other natural factors, but these reflected divine will. Charity mitigated the worst outcomes, but it was seen as a redemption of the giver, not the receiver.

After 1500, the working poor merited support once they became, through no fault of their own, deserving poor. Western societies developed cognitive rules that stressed the distinction between people who were poor through no fault of their own (and thus merited relief) and those who were able-bodied but idle by their own choice and thus did not. Orphans, widows, cripples, the blind, and the mentally handicapped were all unequivocally deserving, but so were people who had been the victims of economic and technological forces stronger than themselves. The history of the English poor law demonstrates how difficult it was to make this distinction and prevent moral hazard in such situations.

Similarly, in industrial market economies, in which innovations and exogenous shocks on both the supply and demand sides were common, it became recognized that a businessman might go bankrupt even if he had taken all the right actions. In England during the 17th century, the modern notions of bankruptcy and insolvency were introduced. Was bankruptcy caused by force majeure or by bad faith? Britain's bankruptcy laws, originating in 1542 but reformulated in the 1706 Bankruptcy Act, recognized that some debtors could not pay because of events beyond their control, and that punishing such people would have neither a deterrence nor a signaling value. Under the Lord's Act of 1759, Parliament allowed creditors to demand that bankrupt debtors prepare a list of their assets under oath, and debtors were released from debtors' prison when they did.

6. Conclusions

In his first serious work on institutions, North (1981) pointed to the importance of cognitive rules in explaining human behavior. He (somewhat confusingly) used the term 'ideology', but it is unambiguous what he meant: everyday behavior and our view of the world around us are guided by what we think is knowledge, which he thought was, at base, theoretical: intellectual efforts to rationalize the behavioral patterns of individuals and groups (p. 48). Yet he provided little elaboration on this insight, and the historical examples he provided were mostly concerned with property rights.

We have argued here that this framework, suitably expanded, can provide a critical component for understanding the evolution of institutions. To understand historical change, we should explore not so much the physiological roots of cognition and the nature of consciousness, as North (2005, chapter 4) suggested, but their evolution over time through learning, imitation, and persuasion. Cognitive rules change over time as the result of competitive forces in a 'market for ideas', in which basic cognitive rules are proposed and either accepted or rejected. Among the most important rules that established the modern economies are those concerning the legitimacy of the ruler, the incentive structures that govern wealth creation and distribution, and the agenda, methods, and purpose of scientific research. Far beyond his own focus on the evolution of property rights, North's insights provide us with guidance on how to see 'the Rise of the Western World' in an entirely new light.

We do not mean this account to sound like some kind of Whiggish narrative in which good, progressive, and just ideas drove out selfishness and stupidity. Economic historians have written for decades about technological progress and institutional change. There is no presumption that over the long haul, there is a secular trend toward improvement in institutions or even in beliefs about the cognitive structures that underlie them.Footnote 19

The Enlightenment was followed by a counter-enlightenment of xenophobia, cultural arrogance, and romantic militarism, and with them came protectionism and new opportunities for rent-seekers. Democratic and open institutions are constantly challenged by the likes of Mussolini and Viktor Orbán. The evidence for institutional progress – even if we could find a consensus on what it means – is spotty and ambiguous. Well-functioning and integrated markets can disintegrate faster than they can emerge, as happened in August 1914 (and, within a hair's breadth, on 9/11). The rule of law, to say nothing of the peace and the respect for life and property on which efficient allocations depend, has been abruptly reversed more than once – most recently in Syria and Libya.

Although some societies may have become more inclusive and open, in many others autocratic rulers have driven rent-seeking and corruption to a peak that has given rise to the term ‘kleptocracy’. For every Brazil, in which there has been significant improvement, popular perceptions notwithstanding (Alston et al., 2016), there is a Venezuela and an El Salvador.Footnote 20

Footnotes

1 Rules are obeyed not just because of sanctions imposed by an authority, but also because legal systems ‘can acquire the force of moral legitimacy’ (Hodgson, 2006: 4). In various papers, such as Denzau and North (1994), North tried to come to grips with the hard issues of learning and rationality, and although the concept of ‘shared mental models’ highlights the social dimension of cognition, his work tends to stress the importance of individual learning, not social interactions.

2 The breakthrough recognizing that bloodletting was a useless procedure owed most to Pierre C. A. Louis, who developed a ‘numerical method’ for evaluating therapy and in about 1840 provided statistical proof that bloodletting was useless, leading to the gradual demise of the technique (Hudson, 1983: 206).

3 Formal models of such learning confirm this intuition. Consider, for example, the following relatively simple case that is conducive to such learning. Suppose that the rules of the game are common knowledge and there are only a few rational players, each of whom tries to learn how the others will behave. Learning is difficult because it is interactive: each player's learning process complicates what has to be learned by everyone else (e.g., Kalai and Lehrer, 1993a, 1993b; Nachbar, 1997, 2005).
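To give readers a concrete feel for the interactive difficulty described in footnote 3, the following toy simulation is a minimal, hypothetical sketch in Python (our illustration, with assumed payoffs; it is not the Kalai–Lehrer or Nachbar models). Two myopic players repeatedly play a 2×2 coordination game and each best-responds to the empirical frequency of the other's past actions ('fictitious play'); because each player's choices generate the very data from which the other learns, beliefs and behavior co-evolve.

```python
# Minimal illustrative sketch (assumed payoffs; not the models cited above):
# two players learn each other's behavior by best-responding to the empirical
# frequency of the opponent's past actions ("fictitious play").

import numpy as np

# Stag-hunt-style payoffs for each player's own action (hypothetical numbers).
# Rows: own action (0 = cooperate, 1 = defect); columns: opponent's action.
PAYOFF = np.array([[4.0, 0.0],
                   [3.0, 2.0]])

def best_response(opponent_counts):
    """Best-respond to the empirical distribution of the opponent's play."""
    beliefs = opponent_counts / opponent_counts.sum()
    expected = PAYOFF @ beliefs        # expected payoff of each own action
    return int(np.argmax(expected))

def simulate(rounds=200, seed=0):
    rng = np.random.default_rng(seed)
    # counts[i] holds (pseudo-)counts of player i's past actions; random priors
    # keep beliefs well defined in the first round.
    counts = [rng.uniform(0.5, 1.5, size=2), rng.uniform(0.5, 1.5, size=2)]
    history = []
    for _ in range(rounds):
        a0 = best_response(counts[1])  # player 0 responds to beliefs about player 1
        a1 = best_response(counts[0])  # player 1 responds to beliefs about player 0
        counts[0][a0] += 1             # both players observe the round and update
        counts[1][a1] += 1
        history.append((a0, a1))
    return history

if __name__ == "__main__":
    print("last 10 rounds:", simulate()[-10:])
```

Which of the two equilibria the simulated players lock into depends on their arbitrary initial beliefs; the point, echoing the footnote, is that what has to be learned is itself a moving target shaped by the learners.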

4 For instance, a growing fear of the impact of smoking on health may lead to a decline in the demand for cigarettes, which is quite different from a change in the way people regard the market for tobacco. The constitutive idea behind all markets is that people want to do better by exchange; yet whether people see markets as benevolent entities that allocate resources effectively or as rapacious entities that enable predatory agents to exploit others may differ greatly.

5 The classic examples are two great scientific breakthroughs: Darwinian evolutionary theory, which ran into a great deal of resistance simply because its moral and metaphysical implications seemed to some to contradict religious beliefs, and Einstein's relativity theory, to which there never was much popular resistance.

6 An important example is priority rules, in which the person who first comes up with a widely accepted idea gets the recognition and the prestige associated with it. Like so many such rules, their working is highly imperfect (Stigler, 1999: 277–290).

7 Greif and Tadelis (2010) identified crypto-morality as a main source of the limited success of extrinsic rewards in changing intrinsic motivation.

8 The biggest risk, as many rulers from the Holy Roman Emperor Henry IV to England's Henry VIII found out, was that the Pope could try to deprive them of their legitimacy by excommunicating them. Between 1077 and 1538, however, this weapon lost much of its power because the cognitive rules that governed papal authority had changed dramatically.

9 The discussion relies on the rich scholarship on the topic, among which are Barzel and Kiser (1997), Bisson (1966), Graves (2001), Herb (2003), Hoffman and Norberg (2001), Myers (1975), and van Zanden et al. (2012).

10 Representative assemblies emerged throughout Europe but exhibited large variations in form and function that had important implications once the European monarchs attempted to limit their authority. In assemblies with three estates, the nobility and clergy dominated in a way that often hurt the economy. Some assemblies had standing taxation committees that rulers found relatively easy to co-opt, whereas others lacked legislative power or the power to approve all taxation, limiting their ability to constrain rulers. Yet other assemblies were so large that they were ineffective in mobilizing resources against demanding rulers. Notably, the English Parliament had none of these deficiencies.

11 See, for instance, Troost (2001: 72).

12 Theodore Beza, Calvin's successor in Geneva and one of the founding fathers of the reformed church, wrote a tract titled De jure magistratuum (1574) in which he revised the Calvinist doctrine of obedience to civil authority. A few years later, in another Huguenot tract titled Vindiciae Contra Tyrannos (published in Basel in 1579), the anonymous author specified the conditions under which the people may resist a ruler, which included not only that a ruler broke divine law but also that his rule was harmful to the commonwealth.

13 This statement does not appear in the Wealth of Nations. Smith's successor and student Dugald Stewart noted that the sentences appear in a small 1755 manuscript by Smith that was in his possession but was not to be published.

14 The philosopher Carl Becker, whose work North knew well, noted that 'a Philosopher could not grasp the modern idea of progress . . . until he was willing to abandon ancestor worship, until he analyzed away his inferiority complex toward the past, and realized that his own generation was superior to any yet known' (Becker, 1932: 131).

15 Later generations, McCloskey argues, led by a retrograde “clerisy,” could no longer rally uniformly to these beliefs. Instead, left-wing intellectuals, led by Marxists, turned the bourgeoisie into a bête noire and a scapegoat for all of society's ills. But by that time the engine of growth had been set in motion, and a series of self-enforcing, irreversible changes had occurred that led to the Great Enrichment.

16 The same was true in business and management practice: in 1494, Luca Pacioli printed in Venice his Summa de Arithmetica, which contained an introduction to modern accounting practices (based on the method developed in Venice). European traders adopted it within a fairly short time. In the Ottoman Empire, printing was not permitted by the religious scholars, and these accounting practices were not adopted.

17 The seminal paper in this literature is Glaeser and Shleifer (2002). For a summary, see La Porta et al. (2008).

18 See also Harris (2016) for a general analysis of the failure of non-European economies to adopt corporate forms of business organization.

19 In an often-cited remark, Freud wrote in his The Future of an Illusion, 'While mankind has made continual advances in its control over nature and may be expected to make still greater ones, it is not possible to establish with certainty that a similar advance has been made in the management of human affairs'.

20 The notable exception to this rule is the secular decline in violence in human history, most recently surveyed and analyzed by Pinker (2011). Pinker credits the Enlightenment for what he calls 'the Humanitarian Revolution'. As he sees it, the Enlightenment ushered in 'explicit arguments that institutionalized violence should be minimized or abolished . . . people began to sympathize with their fellow humans. . . a new ideology coalesced from these forces, one that placed life and happiness at the center of values and that used reason and evidence to motivate the design of institutions' (p. 133).

References

Acemoglu, D. and Robinson, J. (2012), Why Nations Fail: The Origins of Power, Prosperity, and Poverty, New York: Crown.
Aoki, M. (2001), Toward a Comparative Institutional Analysis, Cambridge, MA: MIT Press.
Barzel, Y. and Kiser, E. (1997), ‘The Development and Decline of Medieval Voting Institutions: A Comparison of England and France’, Economic Inquiry, 35 (2): 244–260.
Becker, C. L. (1932), The Heavenly City of the Eighteenth-Century Philosophers, New Haven, CT and London: Yale University Press.
Berkowitz, D. and Clay, K. (2011), The Evolution of a Nation: How Geography and Law Shaped the American States, Princeton: Princeton University Press.
Bisson, T. N. (1966), ‘The Military Origins of Medieval Representation’, American Historical Review, 71 (4): 1199–1218.
Coase, R. (1974), ‘The Market for Goods and the Market for Ideas’, American Economic Review, 64 (2): 384–391.
Colley, L. (1992), Britons: Forging the Nation, 1707–1837, New Haven, CT: Yale University Press.
Denzau, A. T. and North, D. C. (1994), ‘Shared Mental Models: Ideologies and Institutions’, Kyklos, 47 (1): 3–31.
Durham, W. H. (1991), Coevolution: Genes, Culture, and Human Diversity, Stanford: Stanford University Press.
Gans, J. S. and Stern, S. (2003), ‘The Product Market and the “Market for Ideas”: Commercialization Strategies for Technology Entrepreneurs’, Research Policy, 32 (2): 333–350.
Glaeser, E. L. and Shleifer, A. (2002), ‘Legal Origins’, Quarterly Journal of Economics, 117 (4): 1193–1230.
Graves, M. A. R. (2001), The Parliaments of Early Modern Europe, Harlow, England: Pearson Education Ltd.
Greif, A. (1998), ‘Historical and Comparative Institutional Analysis’, American Economic Review, 88 (2): 80–84.
Greif, A. (2006), Institutions and the Path to the Modern Economy: Lessons from Medieval Trade, Cambridge: Cambridge University Press.
Greif, A. (2014), ‘Do Institutions Evolve?’, Journal of Bioeconomics (A Special Issue Honoring Elinor Ostrom), 16 (1): 53–60.
Greif, A. and Kingston, C. (2011), ‘Institutions: Rules or Equilibria?’, in Caballero, G. and Schofield, N. (eds.), Political Economy of Institutions, Democracy and Voting, Heidelberg: Springer, pp. 13–44.
Greif, A. and Iyigun, M. (2013), ‘Social Organizations, Violence, and Modern Growth’, American Economic Review (Papers & Proceedings), 103 (3): 534–538.
Greif, A. and Rubin, J. (2016), ‘Political Legitimacy and the Transition to the Rule of Law: The English Experience’, Working paper, Stanford University.
Greif, A. and Tabellini, G. (2010), ‘Cultural and Institutional Bifurcation: China and Europe Compared’, American Economic Review, 100 (2): 135–140.
Greif, A. and Tabellini, G. (2016), ‘The Clan and the City: Sustaining Cooperation in China and Europe’, Journal of Comparative Economics (forthcoming).
Greif, A. and Tadelis, S. (2010), ‘A Theory of Moral Persistence: Crypto-morality and Political Legitimacy’, Journal of Comparative Economics, 38 (3): 229–244.
Harling, P. (1995), ‘Rethinking “Old Corruption”’, Past and Present, 147 (1): 127–158.
Harling, P. (1996), The Waning of ‘Old Corruption’: The Politics of Economical Reform in Britain, 1779–1846, Oxford: Clarendon Press.
Harris, R. (2016), The Birth of the Business Corporation East and West: The Organization of Eurasian Trade 1400–1700, Princeton: Princeton University Press.
Hayek, F. A. (1942), ‘Scientism and the Study of Society. Part I’, Economica, 9 (35): 267–291.
Herb, M. (2003), ‘Taxation and Representation’, Studies in Comparative International Development, 38 (3): 3–31.
Hodgson, G. (2006), ‘What are Institutions?’, Journal of Economic Issues, 40 (1): 1–25.
Hodgson, G. (2015), Conceptualizing Capitalism: Institutions, Evolution, Future, Chicago: University of Chicago Press.
Hoffman, P. T. and Norberg, K. (2001), Fiscal Crises, Liberty, and Representative Government 1450–1789, Stanford: Stanford University Press.
Hudson, R. P. (1983), Disease and Its Control: The Shaping of Modern Thought, Westport: Greenwood Press.
Iyigun, M. (2015), War, Peace, and Prosperity in the Name of God: The Ottoman Role in Europe's Socioeconomic Evolution, Chicago: University of Chicago Press.
Jacob, M. C. (2007), ‘Mechanical Science on the Factory Floor’, History of Science, 45 (2): 197–221.
Jacob, M. C. (2014), The First Knowledge Economy, Cambridge: Cambridge University Press.
Kalai, E. and Lehrer, E. (1993a), ‘Rational Learning Leads to Nash Equilibrium’, Econometrica, 61 (5): 1019–1045.
Kalai, E. and Lehrer, E. (1993b), ‘Subjective Equilibrium in Repeated Games’, Econometrica, 61 (6): 1231–1240.
Kuran, T. (2011), The Long Divergence: How Islamic Law Held Back the Middle East, Princeton: Princeton University Press.
La Porta, R., Lopez-de-Silanes, F. and Shleifer, A. (2008), ‘The Economic Consequences of Legal Origins’, Journal of Economic Literature, 46 (2): 285–332.
Lecoq, A.-M. (ed.) (2001), La Querelle des Anciens et des Modernes, Paris: Éditions Gallimard.
Levi, M. and Sacks, A. (2009), ‘Legitimating Beliefs: Concepts and Measurements’, Regulation and Governance, 3 (3): 331–333.
Levine, J. M. (1981), ‘Ancients and Moderns Reconsidered’, Eighteenth-Century Studies, 15 (1): 72–89.
Levine, J. M. (1991), The Battle of the Books: History and Literature in the Augustan Age, Ithaca, NY: Cornell University Press.
McCloskey, D. N. (2006), The Bourgeois Virtues: Ethics for an Age of Commerce, Chicago: University of Chicago Press.
McCloskey, D. N. (2016), Bourgeois Equality: How Ideas, Not Capital or Institutions, Enriched the World, Chicago: University of Chicago Press.
Mokyr, J. (2007), ‘The Market for Ideas and the Origins of Economic Growth in Eighteenth Century Europe’, Tijdschrift voor Sociale en Economische Geschiedenis, 4 (1): 3–38 (Heineken Lecture).
Mokyr, J. (2009), The Enlightened Economy, New York and London: Yale University Press.
Mokyr, J. (2016), A Culture of Growth: The Origins of the Modern Economy, Princeton: Princeton University Press.
Morris, I. (2010), Why the West Rules – For Now, New York: Farrar, Straus and Giroux.
Myers, A. R. (1975), Parliament and Estates in Europe to 1789, London: Harcourt Brace Jovanovich.
Nachbar, J. H. (1997), ‘Prediction, Optimization, and Learning in Repeated Games’, Econometrica, 65 (2): 275–310.
Nachbar, J. H. (2005), ‘Beliefs in Repeated Games’, Econometrica, 73 (2): 459–480.
North, D. C. (1981), Structure and Change in Economic History, New York: W. W. Norton.
North, D. C. (1990), Institutions, Institutional Change, and Economic Performance, Cambridge: Cambridge University Press.
North, D. C. (2005), Understanding the Process of Economic Change, Princeton, NJ: Princeton University Press.
North, D. C., Wallis, J. J. and Weingast, B. (2009), Violence and Social Orders: A Conceptual Framework for Interpreting Recorded Human History, Cambridge: Cambridge University Press.
Olson, M. (1965), The Logic of Collective Action: Public Goods and the Theory of Groups, Cambridge, MA: Harvard University Press.
Pincus, S. (2011), 1688: The First Modern Revolution, New Haven: Yale University Press.
Pinker, S. (2011), The Better Angels of Our Nature, New York: Penguin.
Polanyi, M. (1962), ‘The Republic of Science: Its Political and Economic Theory’, Minerva, 1 (1): 54–73.
Richerson, P. J. and Christiansen, M. H. (eds.) (2013), Cultural Evolution: Society, Technology, Language, and Religion, Cambridge, MA: MIT Press.
Roland, G. (2004), ‘Understanding Institutional Change: Fast-Moving and Slow-Moving Institutions’, Studies in Comparative International Development, 38 (4): 109–131.
Rossi, P. (1970), Philosophy, Technology and the Arts in the Early Modern Era, New York: Harper Torchbooks.
Scott, W. R. (1998), Organizations: Rational, Natural and Open Systems, New Jersey: Prentice Hall.
Stigler, G. J. (1965), ‘The Intellectual and the Marketplace’, Kansas Journal of Sociology, 1 (2): 69–77.
Stigler, S. M. (1999), Statistics on the Table: The History of Statistical Concepts and Methods, Cambridge, MA: Harvard University Press.
Troost, W. (2001), Stadhouder-koning Willem III: Een Politieke Biografie, Hilversum: Uitgeverij Verloren.
Van Zanden, J. L., Buringh, E. and Bosker, M. (2012), ‘The Rise and Decline of European Parliaments, 1188–1789’, The Economic History Review, 65 (3): 835–862.
Wootton, D. (2015), The Invention of Science: A New History of the Scientific Revolution, London: Allen Lane.
Zagorin, P. (1998), Francis Bacon, Princeton, NJ: Princeton University Press.
Zittel, C., Engel, G., Nanni, R. and Karafyllis, N. C. (eds.) (2008), Philosophies of Technologies: Francis Bacon and His Contemporaries, Leiden and Boston: Brill.
Zhao, D. (2009), ‘The Mandate of Heaven and Performance Legitimation in Historical and Contemporary China’, The American Behavioral Scientist, 53 (3): 416–443.