
Students’ Complex Problem Solving Profiles

Published online by Cambridge University Press:  01 January 2025

Michela Gnaldi (University of Perugia), Silvia Bacci* (University of Florence), Thiemo Kunze (University of Luxembourg), Samuel Greiff (University of Luxembourg)

*Correspondence should be made to Silvia Bacci, Department of Statistics, Computer Science, Applications “G. Parenti”, University of Florence, Viale Morgagni 59, 50134 Firenze, Italy. Email: [email protected]

Abstract

Complex problem solving (CPS) is an up-and-coming twenty-first-century skill that requires test-takers to solve dynamically changing problems, often assessed using computer-based tests. The log data that users produce when interacting with a computer-based test provide valuable information about each behavioral action they undertake, but such data are rather difficult to handle from a statistical point of view. This paper addresses this issue by building upon recent research on decoding log data, and aims to identify homogeneous student profiles with regard to their ability to solve CPS tasks. To this end, we estimated a discrete two-tier item response theory model, which allowed us to profile units (i.e., students) while accounting for the multidimensionality of the data and the explanatory effect of individual characteristics. The results indicate that: (1) CPS can be thought of as a three-dimensional latent variable; (2) there are ten latent classes of students with homogeneous profiles across the CPS dimensions; (3) students in the higher latent classes generally demonstrate higher cognitive and non-cognitive performance; (4) some latent classes seem to profit from learning-by-doing within tasks, whereas others exhibit the reverse behavior; (5) cognitive and non-cognitive skills, as well as gender and, to some extent, age, contribute to distinguishing among the latent classes.
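The core idea of the discrete (latent class) modeling approach described above — grouping students into a finite number of homogeneous profiles based on their item responses — can be illustrated with a minimal toy sketch. The example below fits a simple two-class latent class model to simulated binary responses via the EM algorithm; the class count, item probabilities, and sample sizes are illustrative assumptions only, not the paper's two-tier IRT specification or its estimates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate binary item responses from two latent classes
# (toy stand-in; the paper's model is a discrete two-tier IRT model
# with ten classes and three CPS dimensions).
n, J = 400, 6
true_pi = np.array([0.6, 0.4])            # latent class weights
true_p = np.array([[0.8] * J, [0.3] * J])  # P(correct | class)
z = rng.choice(2, size=n, p=true_pi)
X = (rng.random((n, J)) < true_p[z]).astype(float)

# EM algorithm for a two-class latent class model
pi = np.array([0.5, 0.5])
p = np.array([[0.7] * J, [0.4] * J])
for _ in range(200):
    # E-step: posterior class membership probabilities per student
    logl = X @ np.log(p).T + (1 - X) @ np.log(1 - p).T + np.log(pi)
    post = np.exp(logl - logl.max(axis=1, keepdims=True))
    post /= post.sum(axis=1, keepdims=True)
    # M-step: update class weights and conditional item probabilities
    pi = post.mean(axis=0)
    p = (post.T @ X) / post.sum(axis=0)[:, None]
    p = np.clip(p, 1e-6, 1 - 1e-6)

print(np.round(np.sort(pi), 2))  # recovered class weights
```

Each student's profile is then given by the class with the highest posterior probability (`post.argmax(axis=1)`). The actual analysis in the paper was carried out with dedicated R software (the MLCIRTwithin package), which additionally handles within-item multidimensionality and covariates.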

Type: Application Reviews and Case Studies
Copyright: © 2020 The Psychometric Society

