12 - A Learning Sciences Perspective on the Design and Use of Assessment in Education

from Part II - Methodologies

Published online by Cambridge University Press: 14 March 2022

R. Keith Sawyer
Affiliation: University of North Carolina, Chapel Hill

Summary

This chapter reviews assessment research with the goal of helping all readers understand how to design and use effective assessments. The chapter begins by introducing the purposes and contexts of educational assessment. It then presents four related frameworks to guide work on assessment: (1) assessment as a process of reasoning from evidence, (2) assessment driven by models of learning expressed as learning progressions, (3) the use of an evidence-centered design process to develop and interpret assessments, and (4) the centrality of the concept of validity in the design, use, and interpretation of any assessment. The chapter then explores the implications of these frameworks for real-world assessments and for learning sciences research. Most learning sciences research studies forms of deeper learning that go beyond what traditional student assessments measure, and the field can contribute its insights to help shape the future of educational assessment.

Type: Chapter
Publisher: Cambridge University Press
Print publication year: 2022
