
How to Measure and Explain Achievement Change in Large-Scale Assessments: A Rejoinder


Marian Hickendorff* (Leiden University)
Willem J. Heiser (Leiden University)
Cornelis M. van Putten (Leiden University)
Norman D. Verhelst (CITO, National Institute for Educational Measurement)

* Requests for reprints should be sent to Marian Hickendorff, Division of Methodology and Psychometrics, Leiden University Institute for Psychological Research, P.O. Box 9555, 2300 RB Leiden, The Netherlands. E-mail: [email protected]

Abstract


In this rejoinder, we discuss substantive and methodological validity issues in large-scale assessments of trends in student achievement, in response to the discussion paper by Van den Heuvel-Panhuizen, Robitzsch, Treffers, and Köller (2009). We focus on the methodological challenges of deciding what to measure, how to measure it, and how to foster stability of the results. We then discuss what to do with the trends that are found, and finally reflect on how the research findings were received.

Type
Theory and Methods
Creative Commons
Creative Commons License: CC BY-NC
This article is distributed under the terms of the Creative Commons Attribution Noncommercial License, which permits any noncommercial use, distribution, and reproduction in any medium, provided the original author(s) and source are credited.
Copyright
Copyright © 2009 The Psychometric Society

References

De Knecht-Van Eekelen, A., Gille, E., & Van Rijn, P. (2007). Resultaten PISA-2006. Praktische kennis en vaardigheden van 15-jarigen [PISA 2006 results: Functional knowledge and skills of 15-year-olds]. Arnhem: CITO.
Expertgroep Doorlopende Leerlijnen Taal en Rekenen (2008). Over de drempels met rekenen. Consolideren, onderhouden, gebruiken en verdiepen [Crossing the thresholds with mathematics: Strengthen, maintain, use, and deepen]. Enschede: Expertgroep Doorlopende Leerlijnen Taal en Rekenen.
Hickendorff, M., Heiser, W.J., Van Putten, C.M., & Verhelst, N.D. (2009). Solution strategies and achievement in Dutch complex arithmetic: Latent variable modeling of change. Psychometrika, 74(2). doi: 10.1007/s11336-008-9074-z.
Hickendorff, M., Van Putten, C.M., Verhelst, N.D., & Heiser, W.J. (2009). Individual differences in strategy use on division problems: Mental versus written computation. Manuscript submitted for publication.
Hiebert, J., & Grouws, D. (2007). The effects of classroom mathematics teaching on students’ learning. In F.K. Lester (Ed.), Second handbook of research on mathematics teaching and learning (pp. 371-404). Charlotte: Information Age Publishing.
Janssen, J., Van der Schoot, F., & Hemker, B. (2005). Balans van het reken-wiskundeonderwijs aan het einde van de basisschool 4 [Fourth assessment of mathematics education at the end of primary school]. Arnhem: CITO.
Mazzeo, J., & von Davier, M. (2008). Review of the Programme for International Student Assessment (PISA) test design: Recommendations for fostering stability in assessment results. Paris: OECD Education Working Papers (EDU/PISA/GB(2008)28).
Meelissen, M.R.M., & Drent, M. (2008). TIMSS-2007 Nederland. Trends in leerprestaties in exacte vakken van het basisonderwijs [TIMSS 2007 the Netherlands: Trends in achievement in mathematics and science in primary education]. Enschede: Twente University.
Mullis, I.V.S., Martin, M.O., & Foy, P. (2008). TIMSS 2007 international mathematics report: Findings from IEA’s Trends in International Mathematics and Science Study at the fourth and eighth grades. Boston: Boston College TIMSS & PIRLS International Study Center.
National Assessment Governing Board (2006). Mathematics framework for the 2007 National Assessment of Educational Progress. Washington: US Department of Education.
OECD (2004). Learning for tomorrow’s world: First results from PISA 2003. Paris: OECD.
Porter, A.C. (2006). Curriculum assessment. In J.L. Green, G. Camilli, & P.B. Elmore (Eds.), Handbook of complementary methods in education research (pp. 141-160). Mahwah: Lawrence Erlbaum Associates.
Stein, M.K., Remillard, J., & Smith, M.S. (2007). How curriculum influences student learning. In F.K. Lester (Ed.), Second handbook of research on mathematics teaching and learning (pp. 319-370). Charlotte: Information Age Publishing.
Van den Heuvel-Panhuizen, M., Robitzsch, A., Treffers, A., & Köller, O. (2009). Large-scale assessments of change in student achievement: Dutch primary school students’ results on written division in 1997 and 2004 as an example. Psychometrika, 74(2). doi: 10.1007/s11336-009-9110-7.
Van der Schoot, F. (2008). Onderwijs op peil? Een samenvattend overzicht van 20 jaar PPON [A summary overview of 20 years of national assessments of the level of education]. Arnhem: CITO.
Van Putten, C.M. (2008). De onmiskenbare daling van het prestatiepeil bij de bewerkingen sinds 1987: een reactie [The unmistakable decline of the complex arithmetic achievement level since 1987: A reaction]. Reken-wiskundeonderwijs: onderzoek, ontwikkeling, praktijk, 27(1), 35-40.
Van Putten, C.M., & Hickendorff, M. (2006). Strategieën van leerlingen bij het beantwoorden van deelopgaven in de periodieke peilingen aan het eind van de basisschool van 2004 en 1997 [Students’ strategies when solving division problems in the PPON test at the end of primary school in 2004 and 1997]. Reken-wiskundeonderwijs: onderzoek, ontwikkeling, praktijk, 25(2), 16-25.