Overview
- Helps integrate standards, instruction, formative assessment, and accountability measures
- Builds on principles of evidence-centered assessment design
- Provides links to interactive, online versions of design patterns
- Includes supplementary material: sn.pub/extras
Part of the book series: SpringerBriefs in Statistics (BRIEFSSTATIST)
About this book
This Springer Brief provides theory, practical guidance, and support tools to help designers create complex, valid assessment tasks for hard-to-measure, yet crucial, science education standards. Understanding, exploring, and interacting with the world through models characterizes science in all its branches and at all levels of education. Model-based reasoning is central to science education and thus science assessment. Current interest in developing and using models has increased with the release of the Next Generation Science Standards, which identified this as one of the eight practices of science and engineering. However, the interactive, complex, and often technology-based tasks that are needed to assess model-based reasoning in its fullest forms are difficult to develop.
Building on research in assessment, science education, and the learning sciences, this Brief describes a suite of design patterns that can help assessment designers, researchers, and teachers create tasks for assessing aspects of model-based reasoning: Model Formation, Model Use, Model Elaboration, Model Articulation, Model Evaluation, Model Revision, and Model-Based Inquiry. Each design pattern lays out considerations concerning targeted knowledge and ways of capturing and evaluating students’ work. These design patterns are available at http://design-drk.padi.sri.com/padi/do/NodeAction?state=listNodes&NODE_TYPE=PARADIGM_TYPE. The ideas are illustrated with examples from existing assessments and the research literature.
Keywords
- model-based reasoning
- model formation
- model use
- model elaboration
- model articulation
- model evaluation
- model revision
- model-based inquiry
- design patterns
- assessment
- science education
- learning science
- evidence-centered design
- Next Generation Science Framework
- Bayesian inference
- learning and instruction
Table of contents (12 chapters)
About the authors
Geneva D. Haertel, PhD, is Director of Assessment Research and Design at the Center for Technology in Learning, SRI International.
Michelle M. Riconscente, PhD, is Director of Learning and Assessment at GlassLab in California.
Daisy Wise Rutstein, PhD, is Education Researcher in the Education Division at SRI International.
Cindy S. Ziker, PhD, MPH, is Senior Researcher for Assessment in the Education Division of the Center for Technology and Learning at SRI International.
Bibliographic Information
Book Title: Assessing Model-Based Reasoning using Evidence-Centered Design
Book Subtitle: A Suite of Research-Based Design Patterns
Authors: Robert J Mislevy, Geneva Haertel, Michelle Riconscente, Daisy Wise Rutstein, Cindy Ziker
Series Title: SpringerBriefs in Statistics
DOI: https://doi.org/10.1007/978-3-319-52246-3
Publisher: Springer Cham
eBook Packages: Mathematics and Statistics, Mathematics and Statistics (R0)
Copyright Information: The Author(s) 2017
Softcover ISBN: 978-3-319-52245-6 (published 02 August 2017)
eBook ISBN: 978-3-319-52246-3 (published 25 July 2017)
Series ISSN: 2191-544X
Series E-ISSN: 2191-5458
Edition Number: 1
Number of Pages: XVII, 130
Number of Illustrations: 14 b/w illustrations, 9 illustrations in colour
Topics: Statistics for Social Sciences, Humanities, Law, Assessment, Testing and Evaluation, Statistical Theory and Methods, Educational Technology, Teaching and Teacher Education, Learning & Instruction