Introduction
Financial literacy, defined as the ability to make informed and effective decisions regarding the use and management of money, has gained increasing recognition as an essential life skill (Kaiser and Lusardi 2024). Reflecting this importance, the Organisation for Economic Co-operation and Development (OECD) incorporated financial literacy into its Programme for International Student Assessment (PISA) in 2012. This inclusion enables an evaluation of 15-year-olds’ ability to apply their financial knowledge to real-life contexts, measuring their readiness to make informed financial decisions, manage personal finances, and navigate the financial challenges they will encounter (Lusardi 2015).
PISA provides unique global data to explore the state of financial literacy among the young. The four financial literacy assessments conducted in 2012, 2015, 2018, and 2022 offer rich insights into global trends and variations in financial literacy, attitudes, and behaviors among 15-year-old students. The recent release of the 2022 student-level data allows us to provide an overview of the main results of a decade of PISA financial literacy data. The dataset also includes test scores on math and reading, which enables researchers to examine the interplay between financial literacy and other cognitive abilities.
Despite the availability of this rich data, its use has been surprisingly limited. Existing studies tend to focus on individual countries (Hospido et al. 2015; Riitsalu and Põder 2016; Cordero and Pedraja 2019; Pesando 2018; Arellano et al. 2018; Mancebón et al. 2019; Bottazzi and Lusardi 2021; Silinskas et al. 2021) or a single iteration of the PISA assessment (Gramaţki 2017; Moreno-Herrero et al. 2018; Salas-Velasco et al. 2021; Cordero et al. 2022; Davoli 2023; Pulk and Riitsalu 2024). This article draws on all available data from the 2012, 2015, 2018, and 2022 PISA financial literacy assessments. By pooling data across these four waves, we are able to document key stylized facts, providing a comprehensive analysis of the correlates of financial literacy among adolescents across different countries and socioeconomic contexts.
Across the four cycles of data collection, approximately 20 countries and economies participated in each assessment, including long-standing participants such as Italy, the United States, Poland, and Spain. We find that students from countries like Estonia and Finland consistently rank at the top, while students in emerging economies, such as Brazil and Peru, achieve much lower scores. Furthermore, we document persistent gender differences, with boys typically outperforming girls in many countries, although in some, like Australia, girls have outperformed boys in recent waves of the assessment. Moreover, students from wealthier backgrounds score significantly higher than their less affluent peers. These findings underscore the importance of targeted educational initiatives to bridge existing financial literacy gaps.
Much can be learned from the PISA financial literacy data. First, the dataset includes a comprehensive set of student-level covariates, allowing for in-depth analyses of the factors associated with financial literacy, such as demographic background, socioeconomic status, parental involvement, access to financial education, and the use of digital tools. These individual-level characteristics can provide critical insights for policy and programs, including the design of financial education interventions. Second, the dataset makes it possible to examine the links between financial literacy and macroeconomic indicators such as GDP growth, national savings rates, and measures of financial stability. Third, PISA’s focus on youth offers a unique opportunity to study financial literacy at a formative age, providing insights into how financial education impacts long-term financial behaviors and outcomes. Fourth, the rigorous psychometric design of the assessment ensures that financial literacy is measured consistently across a broad spectrum of ability levels, which minimizes measurement error and improves the precision of estimates. Finally, the standardized testing framework used in PISA allows for meaningful comparisons across countries and over time, despite some changes in the conceptual model and assessment protocols across waves.
This article is structured as follows: “The assessment of financial literacy in PISA” outlines the conceptual framework underlying the PISA financial literacy assessment. “Data” provides an overview of the dataset, including details on how to use student-level test scores and covariates. Because these data are complex, we provide as much information as possible to help researchers make good use of the data. “An overview of results across waves” presents key descriptive findings, highlighting differences in financial literacy across demographic groups. “Discussion” concludes with potential avenues for future research.
The assessment of financial literacy in PISA
The OECD launched the PISA initiative in 2000 with the aim of evaluating and comparing educational systems worldwide (Hanushek and Woessmann 2023). Initially, PISA focused on assessing adolescents’ proficiency in reading, mathematics, and science. However, in 2012, the assessment expanded to include a financial literacy component, reflecting the growing significance of financial skills in an increasingly globalized economy and rapidly evolving financial markets.Footnote 1 More than 12 years ago, and before it became a mandatory school topic in many countries, financial literacy had already emerged as an essential skill for young people.
The primary objective of the PISA financial literacy assessment is to gauge adolescents’ ability to make informed financial decisions, manage resources efficiently, and navigate the financial challenges they are likely to encounter throughout their lives (OECD 2013). This section outlines the conceptual framework of the financial literacy assessment, detailing the assessment procedures, protocols, and the psychometric models and methods applied.
Framework of the financial literacy assessment
The conceptual model for assessing financial literacy in PISA is grounded in a framework that emphasizes both the knowledge and skills necessary for individuals to make informed and effective financial decisions. It is important to describe the definition of financial literacy and the framework that were used to design the assessment. Financial literacy is defined in PISA (OECD 2023a: 112) as follows:
“Financial literacy is knowledge and understanding of financial concepts and risks, as well as the skills and attitudes to apply such knowledge and understanding in order to make effective decisions across a range of financial contexts, to improve the financial well-being of individuals and society, and to enable participation in economic life.”
This definition highlights the importance of not only acquiring financial knowledge but also applying it to make sound financial decisions that benefit both the individual and society (for an in-depth discussion of this definition, see Lusardi 2015). The PISA framework assesses financial literacy through three core dimensions: content, processes, and contexts.
The content dimension focuses on specific areas of financial knowledge, such as understanding money and transactions, planning and managing finances, recognizing risks and rewards, and grasping the broader financial landscape. This encompasses everyday financial activities like making payments, managing bank accounts, budgeting, saving, understanding credit and insurance, and being aware of consumer rights and economic activities.
The process dimension examines the cognitive approaches students use when engaging with financial information. This includes identifying financial information, analyzing it within a financial context, evaluating financial issues, and applying financial knowledge to real-world situations. These cognitive processes are critical for making informed decisions, interpreting financial data, assessing the reliability of information, and solving financial problems.
The context dimension considers the various settings in which financial knowledge and skills are applied, ranging from personal and household finances to broader societal and economic environments. This includes making decisions related to education and work, managing personal expenses and savings, and understanding financial systems and institutions at a societal level (Lusardi 2015; OECD 2019).
Figure 1 provides an example of a question designed following the framework described above. Note how the question aims to assess not just whether students have financial knowledge but whether they can apply that knowledge.
PISA evaluates financial literacy across five proficiency levels, which capture the range of students’ financial capabilities. At the basic level, students can recognize common financial terms and handle simple tasks, such as differentiating between needs and wants. At higher levels, students demonstrate the ability to apply financial knowledge in financial decision-making, such as creating budgets, interpreting financial documents, and understanding long-term financial implications. The highest proficiency level involves advanced understanding of financial concepts, solving complex financial problems, and making informed, long-term decisions.
Although the core principles of PISA’s financial literacy assessment have remained constant over time, the revised frameworks have introduced additional elements to enhance the evaluation. For example, the 2018 assessment collected data to form new indices, including information about the extent of financial education in school, confidence in managing money, parental involvement in financial matters, and additional items to capture nuances in youth financial behavior. Moreover, the shift from paper-based to computer-based assessments allowed for more interactive tasks, such as requiring students to seek additional information or manipulate data to explore different financial scenarios. These innovations have enabled a more comprehensive assessment of students’ financial literacy.
Methods and test administration
Test development
PISA follows a rigorous process to ensure the validity, reliability, and cross-country comparability of financial literacy data. The development of assessment items is grounded in the conceptual framework described in “Framework of the financial literacy assessment.” These items undergo piloting, review, and refinement through extensive international collaboration, drawing on expertise from various disciplines, including education and policymaking. PISA test items are designed to reflect a wide range of contexts and cognitive processes, and the development process consists of several stages, including drafting, reviewing, piloting, and finalization. A key priority in the design is ensuring cultural neutrality, with all items subjected to thorough reviews to guarantee fairness and minimize bias.
The assessment includes a combination of multiple-choice questions, constructed-response items, and interactive tasks. The latter became particularly prominent with the introduction of computer-based assessments in the more recent waves. These varied formats are intended to capture a broad range of skills and cognitive abilities, allowing for a very comprehensive evaluation of financial literacy.
In comparison to the widely used Big Three and Big Five financial literacy questions (Lusardi and Mitchell 2014), PISA’s financial literacy items differ significantly in target audience, scope, and complexity. For example, while the Big Three questions are designed to be implemented in household surveys of adults to assess knowledge of basic financial concepts such as interest compounding, inflation, and risk diversification, the PISA questions are designed for 15-year-old students and encompass a wider range of content areas. As mentioned earlier, these include money management, financial risks, and understanding of financial products. Additionally, PISA’s items are more contextualized, asking students to apply their knowledge in practical, real-world scenarios (as demonstrated by the example in Figure 1). Finally, the open-ended nature of some of the questions reduces the incidence of potential gender differences in financial literacy, given the reluctance of female respondents to pick an answer when lacking confidence about the topic. As reported in Table 1, financial literacy is measured with as many as 40 test items (43 in the more recent waves).
Notes: This table shows wave-specific characteristics regarding participation in the PISA financial literacy assessment. As shown in Figure 2, separate assessments in country regions were combined with the respective economy.
Source: Authors’ calculations based on the data source discussed in the text.
Sampling
PISA employs a stratified sampling method to select a representative sample of 15-year-old students from each participating country. This approach ensures that the sample reflects the diversity of the student population, accounting for variations in socioeconomic backgrounds, geographic regions, and types of schools. The sampling process follows a two-stage design: in the first stage, schools are selected, and in the second stage, students are chosen from within those schools. This procedure is meticulously managed to meet the requirements necessary for ensuring the validity and comparability of results across countries.
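To illustrate the logic of this two-stage design, the sketch below draws a sample in R. It is a simplified illustration rather than the official PISA sampling algorithm: the sample sizes, the probability-proportional-to-size approximation, and all variable names are assumptions made for the example.

```r
# Illustrative two-stage sample (simplified; not the official PISA procedure).
# Assumes a data frame `schools` with columns school_id and n_students, and a
# data frame `students` with columns school_id and student_id.
set.seed(1)

# Stage 1: draw schools with probability roughly proportional to enrollment.
sampled_schools <- sample(schools$school_id, size = 150, prob = schools$n_students)

# Stage 2: within each sampled school, draw up to 35 eligible students.
sampled_students <- do.call(rbind, lapply(sampled_schools, function(s) {
  pool <- students[students$school_id == s, ]
  pool[sample(nrow(pool), size = min(35, nrow(pool))), ]
}))
```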
Test administration
The test administration is standardized across all participating countries to ensure consistency and comparability. The assessment is conducted over a two-hour period, during which students complete a variety of item clusters covering different domains. Depending on the focus of the assessment waves, each student typically completes a combination of tasks in reading, mathematics, science, and financial literacy. Additionally, students, teachers, and school principals answer separate questionnaires designed to gather contextual information on the learning environment, teaching practices, and students’ backgrounds, providing valuable insights into the factors that may influence student performance.
Scaling and scoring
PISA test scores are estimated using an Item Response Theory (IRT) framework. IRT models are designed to infer latent (unobserved) proficiency scores from observed (manifest) item responses, such as answers to test questions. The IRT models employed in most educational assessments, including PISA, rely on two key assumptions: local independence, meaning the probability of answering an item correctly depends solely on the student’s proficiency and not on responses to other items; and unidimensionality, meaning that a single latent construct (e.g., proficiency in a particular domain) is being measured. In PISA assessments, these assumptions are met, and item parameters, along with proficiency scores, are estimated for each domain separately (see OECD 2018 for details). For more details, we refer the reader to the technical appendix at the end of the paper.
Data
The OECD provides comprehensive data repositories for each wave of the PISA assessment, containing questionnaires, codebooks, compendia, and datasets in both SAS and SPSS formats. The questionnaire section includes student-level questionnaires, which capture information on demographics and the assessment itself, as well as questionnaires for parents, teachers, and school principals. Additionally, some years feature specialized questionnaires, such as those on well-being or digital media usage (e.g., in 2018). A separate questionnaire is also available, focusing on noncognitive outcomes in the financial domain.
The dataset section contains the main datasets with test score data for the core domains of mathematics, reading, and science, along with data from the student and parent questionnaires. Separate financial literacy datasets are also included for each wave (starting in 2012) and can be merged with the main dataset using student and country identifiers.
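As an illustration, the sketch below shows how such a merge might be carried out in R. The file names are hypothetical, and the identifier variables (CNT, CNTSCHID, CNTSTUID) follow recent PISA codebooks but should be checked against the wave-specific documentation.

```r
# Minimal merge sketch: combine the main student file with the financial
# literacy file using country, school, and student identifiers.
library(haven)  # reads the SPSS (.sav) files distributed by the OECD

main_stu <- read_sav("STU_QQQ.sav")  # hypothetical file name
flit_stu <- read_sav("STU_FLT.sav")  # hypothetical file name

merged <- merge(main_stu, flit_stu,
                by = c("CNT", "CNTSCHID", "CNTSTUID"),
                suffixes = c("", "_flt"))
```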
Separate datasets are available for the demographic variables collected via the school and teacher questionnaires. Additionally, two specific datasets provide information on response times: one records the total time spent on the background questionnaire, and the other records response times and the number of attempts for each test item in the assessment. These data can be used to derive metrics related to test-taking motivation, as suggested by studies such as Wise and Kong (2005) and Wise (2015).
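As a concrete example, one simple motivation metric in the spirit of Wise and Kong (2005) is the share of items on which a student spends more than a minimal amount of time. The sketch below assumes a long-format data frame of item-level response times with illustrative variable names and an arbitrary five-second threshold.

```r
# Illustrative response-time-effort measure (assumed data layout).
# `item_times` has one row per student-item pair, with columns student_id
# and time_sec; the 5-second threshold is chosen only for illustration.
threshold <- 5

rte <- aggregate(time_sec ~ student_id, data = item_times,
                 FUN = function(t) mean(t > threshold))
names(rte)[2] <- "response_time_effort"  # share of items with non-rapid responses
```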
Test scores: IRT framework and plausible values (PVs)
The PISA datasets provide student proficiency scores estimated using an IRT model, as outlined in “Methods and test administration” and the technical appendix. Each PISA test score data file contains five plausible values (PVs) for the 2012 wave and ten PVs for the 2015, 2018, and 2022 waves for each participant. These PVs are standardized with a mean of 500 and a standard deviation of 100, based on the average performance of OECD economies in the first financial literacy assessment in 2012. When interpreting differences in performance at the country or individual level, a difference of about one-fifth of a standard deviation (20 points) on the PISA scale is equivalent to the typical learning outcome of approximately one year of schooling (Avvisati and Givord 2021).
PVs are multiple imputations of the latent variable (e.g., financial literacy). Each PV represents a random draw from the posterior distribution of the latent trait, conditional on the observed data. To account for the complexities of this design, regression analyses are conducted using a multiple imputation framework: the analysis is repeated with each of the five (or ten) PVs, and pooled estimates are reported. Popular statistical software, such as R and STATA, offer functions to handle imputed data (e.g., MICE; van Buuren and Groothuis-Oudshoorn 2011). For PISA data, the OECD recommends the repest package for STATA users and the Rrepest package for R users (Ilizaliturri et al. 2023), which enable researchers to account for complex survey designs and imputed variables.Footnote 2
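For readers who want to see the mechanics, the following base-R sketch applies Rubin’s combination rules to the plausible values for a simple regression. Variable names are illustrative rather than the official PISA names, and a complete analysis would additionally use the replicate weights (as handled by repest and Rrepest) to obtain correct sampling variances.

```r
# Minimal sketch of pooling a regression coefficient across plausible values.
# Assumes a data frame `pisa` with plausible values pv1 ... pv10, a regressor
# escs, and final student weights stu_wt (illustrative variable names).
pv_cols <- paste0("pv", 1:10)

fits <- lapply(pv_cols, function(pv)
  lm(reformulate("escs", response = pv), data = pisa, weights = stu_wt))

betas <- sapply(fits, function(f) coef(f)["escs"])
vars  <- sapply(fits, function(f) vcov(f)["escs", "escs"])

M           <- length(pv_cols)
pooled_beta <- mean(betas)                                # point estimate
pooled_se   <- sqrt(mean(vars) + (1 + 1/M) * var(betas))  # within + between variance

c(estimate = pooled_beta, se = pooled_se)
```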
Demographic and other information
The variables in the main dataset and the financial literacy dataset encompass a wide range of demographic, socioeconomic, and educational factors, along with financial literacy measures. These variables can be easily merged with auxiliary datasets, such as the parent or school datasets, using country and student identifiers.
Demographic variables include age, gender, immigrant background, family wealth (an index based on the number and type of home possessions), and language spoken at home, offering insights into the characteristics of the student population. Socioeconomic variables, such as parents’ education levels, occupations, income, and family structure, provide critical information about the students’ socioeconomic backgrounds. Educational variables capture data on students’ academic performance, school type, and the availability of educational resources, offering context for their financial literacy outcomes.
The datasets also feature several indices derived from multiple variables. For example, the 2018 wave introduced new indices measuring confidence in handling financial matters, familiarity with digital financial services, exposure to school-based financial education, and parental involvement in financial decision-making. Additionally, the 2018 wave provided new indices on topics such as exposure to bullying, attitudes toward competition, fear of failure, and self-efficacy. Indices available across all waves include measures of parental emotional support, teacher support, and the perceived value of schooling. Furthermore, the Index of Economic, Social, and Cultural Status (ESCS), which is available for all waves, is based on three key indicators related to family background: parents’ highest education (in years), parents’ highest occupational status, and home possessions (OECD 2023b).
An overview of results across waves
The inclusion of the financial literacy assessment into PISA has generated important insights into the state of financial literacy of the young around the globe. By benchmarking students’ performance across countries, PISA enables policymakers, educators, and researchers to identify best practices, highlight areas for improvement, and develop evidence-based interventions to enhance adolescents’ financial literacy levels. In this section, we report stylized results from the four waves of the PISA financial literacy data (2012, 2015, 2018, and 2022). All analyses make use of sampling weights.
Participating countries
The PISA financial literacy assessment was first administered in 2012 with 18 participating countries (see Table 1).
Since then, the number of countries involved has varied, with 15 countries participating in 2015 and 20 in 2018 and 2022. Over the course of these assessments, more than a quarter of a million students have taken part.
Figure 2 provides a map displaying the participating countries and the number of participations across the last four waves. Notably, four countries – the United States, Spain, Italy, and Poland – have participated in all four assessments.
As mentioned earlier, financial literacy is measured using 40 questions in the 2012 assessment and 43 questions in the later assessments. The number of test items enhances the assessment’s reliability, allowing for a more precise measurement of financial literacy levels across time and between countries. In the 2018 assessment, approximately two-thirds of the financial literacy questions were carried over from the 2012 and 2015 assessments, with the remaining one-third comprising newly developed items (OECD 2018). The 2022 financial literacy assessment is structured similarly to the 2018 version, incorporating a mix of previously used items to monitor performance trends and new interactive items aligned with the revised framework (OECD 2023a).
The variation of financial literacy scores between countries and waves
We begin by examining the variation in financial literacy scores at the country level and over time. Figure 3 presents financial literacy levels across all countries participating in PISA over four waves (2012, 2015, 2018, and 2022): each square represents the average financial literacy score for a specific country in a particular assessment wave. The scores are color-coded by year: navy blue for 2012, light gray for 2015, yellow for 2018, and burgundy red for 2022. The colored bars around each square depict the 95% confidence intervals, indicating the range within which we can be 95% confident the true average score lies, thereby reflecting the uncertainty around each country’s estimated mean.
The top-performing countries, which surpass the OECD average in financial literacy, include nations such as Estonia and Finland, which have consistently demonstrated strong results across multiple assessment waves, particularly in 2012 and 2015. These countries have made substantial investments in financial education initiatives within school systems and have recently implemented comprehensive national strategies to support youth-oriented financial education programs. This approach to embedding financial literacy into educational frameworks may have contributed to their ongoing success in these assessments.
On the other hand, countries such as Brazil and Georgia consistently rank among the lowest performers, with scores falling well below the OECD average. This finding reveals a marked divide between high-income OECD nations, which generally display high financial literacy scores, and emerging economies, where financial literacy levels remain significantly below the OECD average. In emerging economies, challenges such as limited resources for financial education and the absence of cohesive national strategies to promote financial literacy often contribute to these lower scores.
Examining the trends for the four countries that participated in all four assessment waves – Spain, the United States, Poland, and Italy – we find distinct trajectories in financial literacy over time. Poland and the United States display relatively stable levels, with financial literacy scores fluctuating slightly around the 500-point mark. Spain, on the other hand, has consistently performed just below the OECD average, with scores typically ranging between 450 and 500 points. Italy, while still below the 500-point benchmark, has shown modest improvements since 2012, indicating gradual progress over time, which may be due to targeted policy efforts in enhancing the financial literacy of the population.
Emerging economies in regions such as Latin America and Southeast Asia, including Brazil, Peru, and Indonesia, consistently score below the OECD average in financial literacy.
From 2018 to 2022, some countries, such as Russia and Slovakia, experienced notable fluctuations in their financial literacy scores. In contrast, countries like Estonia and Chile have shown steady, albeit modest, improvements over the years, though Chile’s scores remain below the OECD average.
Proficiency gaps by countries and waves
We now turn to a descriptive analysis of the differences in financial literacy by student subgroups across countries and over time.
Gender gap
We start by examining differences between male and female students. While much of the literature highlights a gender gap in financial literacy among adults (Lusardi et al. 2010; Bucher-Koenen and Lusardi 2011; Bucher-Koenen et al. 2017), the evidence from PISA assessments presents a more nuanced picture. In fact, when examining the pooled data across all participating countries, no significant gender gap in financial literacy is detected (OECD 2020). Note that, as mentioned earlier, the methodology used to design the assessment was meant to minimize gender differences in financial literacy generated by how the questions are asked.
However, as illustrated in Panel A of Figure 4, when examining individual countries, gender gaps in financial literacy emerge in Italy, the United States, Peru, and Canada. In contrast, in some countries, such as Bulgaria, Poland, and Slovakia, female students outperform their male counterparts.
Immigrant background
Another important factor influencing financial literacy is students’ immigrant background, as shown in Panel B of Figure 4. Hanushek et al. (2022) investigate how cultural traits, specifically patience and risk-taking, influence human capital investments worldwide. They find that cultural variations in these traits significantly affect human capital investment and educational outcomes. For immigrant students, who may bring distinct cultural perspectives on patience, risk-taking, and educational investment, these traits may uniquely shape their learning experiences and financial literacy development in their countries of residence. Results shown in Panel B reveal substantial differences between immigrant and nonimmigrant students across assessment waves and countries. However, countries like Canada, Israel, and Latvia show no significant disparities, suggesting these nations may provide effective support systems for immigrant students within their educational frameworks. Additionally, countries such as the Netherlands and Estonia display a narrowing performance gap over time, which may indicate the positive effects of policies aimed at better integrating immigrant students.
Parental socioeconomic status
Panel C of Figure 4 highlights one of the most influential factors associated with financial literacy scores: the parental socioeconomic index derived from occupational data on both the father and mother, collected through open-ended responses (for more details on the index construction, see OECD 2023b). Across all countries, students from higher socioeconomic backgrounds consistently perform better in financial literacy assessments than those from mid- or lower socioeconomic backgrounds. The magnitude of this gap often exceeds 50 points, a difference far larger than the variation observed between or within countries over time. This finding underscores important disparities in financial literacy and highlights the critical need for policies or programs that can address these differences.
Pooled correlates of financial literacy by wave
We now examine all the correlates of financial literacy mentioned before in a regression model with country-fixed effects and study their importance by wave (see Table 2).
Notes: This table shows multiple regressions with financial literacy scores (imputed by plausible values) as the dependent variable for each wave separately. The dependent variable is standardized to have a mean of 500 and a standard deviation of 100 based on the OECD mean in 2012. Parents’ highest occupational status (High HISEI) takes the value 1 if the participant is in the highest quarter of the distribution, and 0 otherwise. Standard errors are clustered at the school level. All regression models include country-fixed effects and senate weights. High HISEI is omitted when the regression contains the Index of Economic, Social, and Cultural Status (ESCS) to avoid multicollinearity.
* p < 0.10,
** p < 0.05,
*** p < 0.01.
Source: Authors’ calculations based on the data source discussed in the text.
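To make this specification concrete, the sketch below estimates a regression of this type for a single plausible value using the fixest package in R. It is a simplified illustration rather than the exact code behind Table 2: the variable names are assumptions, the model would in practice be repeated across all plausible values and pooled as described in the Data section, and replicate weights would be used for variance estimation.

```r
# Illustrative specification: country fixed effects, school-clustered standard
# errors, and (senate-type) weights, for one plausible value only.
library(fixest)

fit <- feols(pv1 ~ female + immigrant + high_hisei + math + read | cnt,
             data    = pisa,
             weights = ~senate_wt,   # senate weights (illustrative name)
             cluster = ~school_id)   # standard errors clustered by school
summary(fit)
```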
Starting with the initial model that includes key demographic indicators, we find that immigrant background and parents’ highest occupational status are significantly associated with financial literacy scores across all four waves, as shown in Table 2. Additionally, math and reading abilities are strong predictors of financial literacy: a one-point increase in math or reading proficiency is associated with an increase of nearly half a point in financial literacy. When adjusting for differences in math and reading scores, the gender gap in financial literacy becomes more pronounced, with a difference of about 5 to 6 points in the pooled analysis. This suggests that controlling for math and reading comprehension is essential when examining the factors influencing financial literacy.
In a more comprehensive model that includes both demographic and socioeconomic factors, immigrant background and parents’ occupational status remain significant predictors of financial literacy. However, when further adjusting for educational resources and social and cultural status (ESCS), the effect of immigrant background loses both economic and statistical significance. This suggests that the initial association between immigration background and financial literacy may reflect better access to educational resources at home rather than immigration status itself. This finding warrants further exploration and has potentially important policy implications, particularly regarding the provision of personal finance education in schools.
Discussion
The comprehensive information provided by the PISA financial literacy assessments offers a valuable resource for researchers examining the financial literacy of adolescents globally. Adding a financial literacy component to PISA in 2012 was not just visionary; it has delivered important insights into the ability of 15-year-old students to navigate the financial landscape and participate in economic life. This article provides an overview of the data collected over four assessment cycles (2012, 2015, 2018, and 2022), outlines the conceptual framework and methodology, and offers guidance on utilizing the test scores generated by psychometric models. These data are complex, and the article explains in detail the financial literacy assessment and many of the technical aspects of the PISA financial literacy scores in order to facilitate their use in research by other scholars. Additionally, we highlight key findings on financial literacy among youth and differences across time, countries, and demographic characteristics.
The PISA assessments underscore the significant impact of socioeconomic status, immigrant background, and gender on financial knowledge. Higher socioeconomic status is consistently linked to better financial literacy scores, likely due to greater access to educational resources and support. Immigrant background and gender also play a role, although patterns vary across countries. For instance, some countries show gender gaps favoring male students, while others exhibit higher financial literacy levels among females. Given the importance of demographic, socioeconomic, and educational factors, it may be important to design policies and programs that provide better access to financial education for more vulnerable groups.
Several challenges may explain why PISA data remain underutilized in financial literacy and personal finance research. One primary obstacle is the difficulty of establishing causality using repeated cross sections. While researchers studying math and reading literacy have developed methods to address this issue (e.g., Hanushek et al. 2013, 2014; Hogrebe and Strietholt 2016; Hanushek et al. 2022), the assumptions involved in identifying causal effects from these data are usually strong and difficult to test. Another factor is the focus on 15-year-old students, with no possibility of linking these individual-level test score data to administrative data on financial choices. Despite these limitations, early financial education is crucial in shaping future financial behaviors, making this age group essential for understanding the foundations of financial literacy and its broader implications for macroeconomic outcomes like economic growth and financial stability over time.
Several promising avenues for future research emerge from the PISA financial literacy assessments. Researchers can exploit variations within and between countries, as well as changes over time, to rigorously analyze the determinants of financial literacy. Additionally, the relationship between financial literacy and noncognitive skills, such as risk-taking and patience – critical traits influencing financial decisions – can be explored further (Falk et al. 2018; Hanushek et al. 2022). The rapid growth of financial technologies presents another area for investigation, particularly how digital platforms, online financial tools, and gamification affect financial literacy among young people. While much research has examined the link between financial literacy and financial decision-making (Kaiser et al. 2022), there is still considerable room to explore the long-term impacts of financial literacy on economic outcomes such as wealth accumulation, debt management, and economic mobility.
Examining how different educational reforms correlate with financial literacy outcomes over time could provide valuable policy insights. By delving deeper into these data, researchers can uncover the root causes of disparities among young people and identify effective interventions to enhance financial literacy. Policymakers can leverage these insights to design targeted educational programs that equip youth with the skills needed to manage their finances effectively, ultimately improving their financial well-being and fostering a more financially literate society. In this respect, we also call for continuation of the data collection. As countries start or implement national strategies for financial literacy, it is of paramount importance to track the state of financial literacy among the young and its progress over time.
Technical appendix
Scaling and scoring
PISA test scores are estimated using an IRT framework.
For dichotomous item responses (e.g., correct or incorrect answers), PISA uses a two-parameter logistic IRT model (Birnbaum 1968). For items with more than two response categories, referred to as polytomous items, each response option is awarded a different number of points, and PISA applies the Generalized Partial Credit Model (GPCM) (Muraki 1992) to these ordered responses.
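For reference, a standard formulation of the two-parameter logistic model for a correct response is shown below, using generic notation rather than the exact parameterization of the PISA technical reports:

$$P\big(X_{vi} = 1 \mid \theta_v\big) \;=\; \frac{\exp\!\big(a_i(\theta_v - b_i)\big)}{1 + \exp\!\big(a_i(\theta_v - b_i)\big)},$$

where $\theta_v$ is the proficiency of student $v$, $a_i$ is the discrimination parameter of item $i$, and $b_i$ its difficulty. The GPCM extends this logic to polytomous items by modeling the probability of reaching each successive score category.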
The IRT models allow for various methods to estimate student proficiency (denoted as $\theta_v$). The most traditional method is Maximum Likelihood Estimation (MLE), which identifies the value of $\theta_v$ that maximizes the likelihood of observing a given set of responses. However, MLE can be biased or produce undefined estimates, particularly when a student answers all items correctly or incorrectly (i.e., extreme responses). To address this, Weighted Likelihood Estimation (WLE) was introduced by Warm (1989), modifying the likelihood function by assigning greater weight to more informative items (those that provide more information about the student’s ability).
While MLE and WLE can produce a single-point estimate for each student’s ability, recent research has suggested that these methods do not fully capture the uncertainty and variability in such measurements (Wu 2005). As a result, most large-scale educational assessments, including PISA, employ a method known as PVs to account for this uncertainty. Instead of generating a single proficiency score, PVs offer multiple estimates of a student’s latent ability, representing reasonable values drawn from the posterior distribution of the latent trait based on item responses and background variables. PVs are not individual test scores but random draws that reflect the range of potential proficiency levels, accounting for the limited number of test items and the inherent uncertainty in educational measurement (Mislevy et al. 1992).
In addition, the estimation procedures in large-scale assessments often integrate test scores with key background variables, such as gender, migrant status, or socioeconomic status, through latent regression modeling. This approach enhances the accuracy of the proficiency estimates and reduces bias in estimating the relationships between proficiency and student-level covariates (see Mislevy et al. 1992; von Davier et al. 2019 for detailed explanations). Thus, PVs are preferred in many educational assessments for their ability to provide more robust and nuanced insights into student proficiency (Mislevy 1991; Wu 2005).
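Schematically, and again in generic notation, each PV for student $v$ is a draw from a posterior of the form

$$p\big(\theta_v \mid \mathbf{x}_v, \mathbf{y}_v\big) \;\propto\; P\big(\mathbf{x}_v \mid \theta_v\big)\, \phi\big(\theta_v;\ \mathbf{y}_v'\boldsymbol{\Gamma},\ \sigma^2\big),$$

where $\mathbf{x}_v$ collects the item responses, $\mathbf{y}_v$ the background variables entering the latent regression, and $\phi(\cdot)$ denotes a normal density with conditional mean $\mathbf{y}_v'\boldsymbol{\Gamma}$ and residual variance $\sigma^2$.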
Competing interests
None.