Conviction Narrative Theory gains from a richer formal model
Published online by Cambridge University Press: 08 May 2023
Abstract
Conviction Narrative Theory (CNT) is a convincing descriptive theory, and Johnson et al.'s formal model is a welcome contribution to building more precise, testable hypotheses. However, some extensions to the proposed model would make it better defined and more powerful. The suggested extensions enable the model to go beyond CNT, predicting choice outcomes and explaining affective phenomena.
Type: Open Peer Commentary
Copyright © The Author(s), 2023. Published by Cambridge University Press
References
Ainslie, G. (2017). De Gustibus Disputare: Hyperbolic delay discounting integrates five approaches to impulsive choice. Journal of Economic Methodology, 24(2), 166–189.
Berns, G. S., Laibson, D., & Loewenstein, G. (2007). Intertemporal choice – toward an integrative framework. Trends in Cognitive Sciences, 11(11), 482–488.
Blue, M., Bush, B., & Puckett, J. (2002). Unified approach to fuzzy graph problems. Fuzzy Sets and Systems, 125(3), 355–368.
Boyer, P. (2003). Religious thought and behaviour as by-products of brain function. Trends in Cognitive Sciences, 7(3), 119–124.
Bruner, J. (1985). Chapter VI: Narrative and paradigmatic modes of thought. Teachers College Record, 86(6), 97–115.
Caldwell, L. (2018a). Reinforcement learning made me irrational [Conference presentation]. TIBER 2018, Tilburg, Netherlands.
Caldwell, L. (2018b). Inferring heuristics from complex causal models [Conference presentation]. International Conference: Heuristics in Organisations and Society, Herbert Simon Society, Turin, Italy.
Caldwell, L., & Seear, L. (2019). System 3: Measuring the consumer's imagination. ESOMAR Congress 2019.
De Beaugrande, R., & Colby, B. N. (1979). Narrative models of action and interaction. Cognitive Science, 3(1), 43–66.
Eliaz, K., & Spiegler, R. (2020). A model of competing narratives. American Economic Review, 110(12), 3786–3816.
Kóczy, L. (1992). Fuzzy graphs in the evaluation and optimization of networks. Fuzzy Sets and Systems, 46(3), 307–319.
Momennejad, I., Otto, A. R., Daw, N. D., & Norman, K. A. (2018). Offline replay supports planning in human reinforcement learning. eLife, 7, e32548.
Parunak, H. V. D. (2022). Model mechanisms and behavioral attractors. In Social Simulation Conference (SSC2022), University of Milan, Milan, Italy (under review).
Pennington, N., & Hastie, R. (1986). Evidence evaluation in complex decision making. Journal of Personality and Social Psychology, 51(2), 242.
Pham, M. T. (1998). Representativeness, relevance, and the use of feelings in decision making. Journal of Consumer Research, 25(2), 144–159.
Polichak, J. W., & Gerrig, R. J. (2002). Get up and win! Participatory responses to narratives. In Green, M. C., Strange, J. J., & Brock, T. C. (Eds.), Narrative impact: Social and cognitive foundations (pp. 71–95). Erlbaum.
Sarbin, T. R. (1986). Narrative psychology: The storied nature of human conduct. Praeger Publishers/Greenwood Publishing Group.
Schelling, T. C. (1987). The mind as a consuming organ. In Elster, J. (Ed.), The multiple self (pp. 177–196). Cambridge University Press.
Schultz, W., Dayan, P., & Montague, P. R. (1997). A neural substrate of prediction and reward. Science, 275(5306), 1593–1599.
Shiller, R. J. (2017). Narrative economics. American Economic Review, 107(4), 967–1004.
Sloman, S. A., & Lagnado, D. (2015). Causality in thought. Annual Review of Psychology, 66, 223–247.
Tuckett, D., & Nikolic, M. (2017). The role of conviction and narrative in decision-making under radical uncertainty. Theory & Psychology, 27(4), 501–523.
Yeh, R. T., & Bang, S. Y. (1975). Fuzzy relations, fuzzy graphs, and their applications to clustering analysis. In Fuzzy sets and their applications to cognitive and decision processes (pp. 125–149). Academic Press.
Zadeh, L. A. (1999). Fuzzy logic and the calculi of fuzzy rules, fuzzy graphs, and fuzzy probabilities. Computers & Mathematics with Applications, 37(11–12), 35.
Narratives have long been recognised as important by psychologists (Bruner, 1985; De Beaugrande & Colby, 1979; Pennington & Hastie, 1986; Sarbin, 1986) and more recently by economists (Eliaz & Spiegler, 2020; Shiller, 2017). Intuitively and empirically, narrative plays an important role in thinking.
Conviction Narrative Theory (CNT) (target article; Tuckett & Nikolic, 2017) convincingly proposes that people use narratives to settle on a course of action in the face of uncertainty. Earlier work on CNT did not clearly define what a narrative is: the target article's significant new contribution is a formal model of narrative. Its model of a narrative can be summarised as a graph of objects (sketched in illustrative code after the list below), in which:
• Each object may have positive, negative or neutral valence,
• Any pair may be causally related,
• Any pair may be temporally related,
• Any subgraph can represent an analogy with a second, isomorphic subgraph.
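As a minimal sketch of this representation (the class and field names below are illustrative, not the authors'; the only structure assumed is the four properties listed above):

```python
# Illustrative sketch of the target article's narrative representation:
# a graph of objects with valence, causal and temporal edges, and
# analogies between (assumed isomorphic) subgraphs.
from dataclasses import dataclass, field
from typing import Literal

Valence = Literal["positive", "negative", "neutral"]

@dataclass
class NarrativeGraph:
    valence: dict[str, Valence] = field(default_factory=dict)    # node -> valence
    causal: set[tuple[str, str]] = field(default_factory=set)    # (cause, effect)
    temporal: set[tuple[str, str]] = field(default_factory=set)  # (earlier, later)
    analogies: list[tuple[frozenset[str], frozenset[str]]] = field(default_factory=list)

# Rough rendering of the Masks example (Figure 5a of the target article):
masks = NarrativeGraph(
    valence={"Masks": "positive", "Infection": "negative"},
    causal={("Masks", "Infection")},    # a causal link whose sign the model cannot express
    temporal={("Masks", "Infection")},
)
```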
This model helps to formalise phenomena previously treated only descriptively. As it stands, however, it leaves some matters unclear, and it is not quite rich enough to support some of the phenomena described in the target article.
The authors ask, of probabilistic utility theory, where its numbers come from. One might ask them in turn: Where do these narratives come from? Their answer, that narratives are "supplied in part by the social environment," is imprecise and incomplete.
Narratives are to be used for affective evaluation, yet other than an optional valence label on each node, the model has no affective component. During explanation, agents evaluate whether a narrative "feels right," but the model contains nothing they can use to make this judgement; the feeling must be imported from outside.
This missing information renders some of the examples ambiguous. In Figure 6c, Fundamentals have a causal effect on Price, but that effect could be either positive or negative; the diagram cannot capture which. In other examples, the assignment of valence is not explained. In Figure 5a, Masks receive positive valence: Is this because people like wearing them, or is the valence inferred from Masks' negative effect on Infection? The latter would require narratives to be dynamic rather than static.
Three extensions to the model would resolve these issues. The mathematical machinery is provided by the "fuzzy graph" literature (Blue, Bush, & Puckett, 2002; Kóczy, 1992; Yeh & Bang, 1975; Zadeh, 1999). In a fuzzy graph, nodes and edges take non-binary values, usually a real number in [0, 1].
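For concreteness, a toy fuzzy graph might look as follows; the nodes, values, and the min-based consistency condition are illustrative assumptions rather than the cited papers' notation:

```python
# Toy fuzzy graph: nodes and edges carry membership values in [0, 1]
# rather than being simply present or absent.
fuzzy_nodes = {"Fundamentals": 0.9, "Price": 0.8, "Sentiment": 0.4}
fuzzy_edges = {
    ("Fundamentals", "Price"): 0.7,   # a strong but uncertain causal link
    ("Sentiment", "Price"): 0.3,      # a weaker one
}

def edge_strength(u: str, v: str) -> float:
    """Membership of an edge, capped by its endpoints' memberships
    (one common consistency condition in fuzzy-graph formulations)."""
    return min(fuzzy_edges.get((u, v), 0.0), fuzzy_nodes[u], fuzzy_nodes[v])

print(edge_strength("Sentiment", "Price"))  # 0.3
```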
First, where narratives come from: instead of constructing narratives on the fly for each decision, a persistent causal graph is specified, representing the agent's whole mental model of the world. This graph is learned (by conditioning, or through verbal learning). Narratives are selected as subgraphs of this model (perhaps with minor modifications), not created from scratch.
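One way to make "narratives as subgraphs" concrete is sketched below; the background graph, node names, and the simple reachability rule are all illustrative assumptions:

```python
from collections import deque

# Hypothetical persistent causal graph: cause -> {effect: strength in [0, 1]}.
background = {
    "Lantern kicked": {"Fire starts": 0.8},
    "Fire starts": {"House burns": 0.9, "Firefighters called": 0.7},
    "House burns": {"Family homeless": 0.9},
    "Rain": {"Fire starts": 0.1},
}

def select_narrative(graph: dict, cues: set[str], max_steps: int = 2) -> dict:
    """Return the subgraph reachable from the cue nodes within max_steps
    causal links: a candidate narrative drawn from the background model."""
    frontier = deque((c, 0) for c in cues)
    keep = set(cues)
    while frontier:
        node, depth = frontier.popleft()
        if depth == max_steps:
            continue
        for effect in graph.get(node, {}):
            if effect not in keep:
                keep.add(effect)
                frontier.append((effect, depth + 1))
    return {u: {v: w for v, w in graph.get(u, {}).items() if v in keep}
            for u in keep}

print(select_narrative(background, {"Lantern kicked"}))
```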
The authors relate the story of Mrs O'Leary's cow, which kicked over a lantern that started the Great Chicago Fire. Although this narrative is "given" to us in the telling, it is believable because we already know that cows kick things over and that lanterns start fires. Without such prior causal beliefs, the story might invite doubt rather than conviction.
Narrative subgraphs of a pre-existing causal graph can be instantiated and used faster, and more easily communicated to other people, than new narratives. This explains why a chosen narrative is more likely to contain concepts already salient to the agent (Boyer, 2003). The authors hint at this: "…the most contagious narratives including a small number of novel concepts against a larger background of familiar concepts…in digestible form because they match our default causal-temporal representations of events…"
Second, to enable affective evaluation, valence in the graph is replaced with a scalar affective value, represented by degree of shading in Figures 1 and 2 and interpreted as "anticipated reward" (Berns, Laibson, & Loewenstein, 2007). When a narrative is evaluated during mental simulation, this affective value is experienced (Ainslie, 2017), and that experience informs the judgement.
Figure 1. The background graph prior to narrative selection.
Figure 2. A narrative selected as subgraph, prompted by a choice situation.
Each affective value is updated during mental simulation as its downstream causal consequences are experienced, allowing future simulations to be more efficient and accurate (Schultz, Dayan, & Montague, 1997). Affectively appealing narratives become "attractors" (Parunak, 2022).
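A sketch of what such an update could look like, loosely in the spirit of temporal-difference learning; the node names, learning rate, discount factor, and the way experienced affect is totalled are assumptions for illustration only:

```python
# Illustrative sketch of affective evaluation during mental simulation:
# each node carries a scalar anticipated reward; simulating a pathway
# "experiences" those rewards, and a prediction-error update nudges each
# node's value toward what its downstream consequence delivered.
affect = {"Invest": 0.1, "Price rises": 0.6, "Comfortable retirement": 0.9}
ALPHA, GAMMA = 0.1, 0.9   # assumed learning rate and discount factor

def simulate(pathway: list[str]) -> float:
    """Traverse a narrative pathway, return the total experienced affect,
    and update each node's anticipated reward from its successor."""
    experienced = 0.0
    for here, there in zip(pathway, pathway[1:]):
        experienced += affect[here]
        prediction_error = GAMMA * affect[there] - affect[here]
        affect[here] += ALPHA * prediction_error
    return experienced + affect[pathway[-1]]

print(simulate(["Invest", "Price rises", "Comfortable retirement"]))
print(affect)  # upstream nodes have drifted toward their consequences
```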
Third, to better specify causal influence, each edge is assigned a scalar coefficient reflecting the strength or likelihood of the causal link. The authors rightly observe that exact probabilities are incalculable under radical uncertainty, but agents can certainly grasp that some causal links are stronger than others (Sloman & Lagnado, 2015). These coefficients can influence how narratives are mentally simulated, even without the agent's conscious awareness.
This change allows a narrative to be traversed along more than one possible pathway. Imagine a brain evaluating multiple pathways in parallel (or by switching rapidly between them) rather than in isolation. CNT's proposed explanation and simulation phases then merge into one process, in which multiple narrative pathways are tried on for both plausibility and affective appeal until one preferred option is found. This perhaps better reflects the human experience of struggling with uncertainty.
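A sketch of this merged explanation-and-simulation step, under the assumption that a pathway's plausibility can be read off its causal coefficients (here, their product) and its appeal off the summed affective values; the names, numbers, and the way the two scores are combined are illustrative:

```python
from math import prod

# Illustrative: score alternative narrative pathways on both plausibility
# (product of causal-strength coefficients) and affective appeal (summed
# anticipated reward), then keep the preferred one.
strength = {("Invest", "Price rises"): 0.6, ("Price rises", "Retire well"): 0.7,
            ("Invest", "Price falls"): 0.4, ("Price falls", "Losses"): 0.8}
affect = {"Invest": 0.0, "Price rises": 0.6, "Retire well": 0.9,
          "Price falls": -0.5, "Losses": -0.8}

def score(pathway: list[str]) -> float:
    plausibility = prod(strength[(u, v)] for u, v in zip(pathway, pathway[1:]))
    appeal = sum(affect[node] for node in pathway)
    return plausibility * appeal   # one simple way to combine the two criteria

pathways = [["Invest", "Price rises", "Retire well"],
            ["Invest", "Price falls", "Losses"]]
best = max(pathways, key=score)
print(best, score(best))
```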
These refinements more fully implement CNT, and enable the model to predict more than just conviction. In offline replay (Momennejad, Otto, Daw, & Norman, 2018), by contrasting multiple narratives of present and future, agents can update their mental models even when not choosing between options. If mental simulation over this narrative model generates "synthetic reward" (Caldwell, 2018a, 2018b), then daydreaming (Schelling, 1987), the enjoyment of memory replay, and transportation by fiction (Polichak & Gerrig, 2002) can all be explained. The How-Does-It-Feel heuristic (Caldwell, 2018b; Pham, 1998) is also implemented by this model.
In commercial work, a similar formal model ("System 3") has been used to predict shopping frequency, responses to advertising messages, and consumer sustainability behaviours (Caldwell & Seear, 2019).
In their conclusion, the authors anticipate two critiques: that their theory may be seen as too grandiose, or as too skeletal. This commentary lands on the "skeletal" side, but perhaps the extensions proposed here can add some meat to those very promising bones.
Financial support
This research received no specific grant from any funding agency, commercial or not-for-profit sectors.
Competing interest
None.