
Assessing Model-Based Reasoning using Evidence-Centered Design: A Suite of Research-Based Design Patterns, 1st ed. 2017 [Paperback]

  • Format: Paperback / softback, 130 pages, height x width: 235x155 mm, weight: 454 g; XVII + 130 pp., 23 illustrations (9 in color, 14 in black and white)
  • Series: SpringerBriefs in Statistics
  • Publication date: 02-Aug-2017
  • Publisher: Springer International Publishing AG
  • ISBN-10: 3319522450
  • ISBN-13: 9783319522456
  • Paperback
  • Price: 60,29 €*
  • * This is the final price; no additional discounts apply.
  • List price: 70,94 €
  • Save 15%
  • Delivery time is 3-4 weeks if the book is in stock at the publisher's warehouse. If the publisher needs to print a new run, delivery may be delayed.
This Springer Brief provides theory, practical guidance, and support tools to help designers create complex, valid assessment tasks for hard-to-measure, yet crucial, science education standards. Understanding, exploring, and interacting with the world through models characterizes science in all its branches and at all levels of education. Model-based reasoning is central to science education and thus science assessment. Current interest in developing and using models has increased with the release of the Next Generation Science Standards, which identified this as one of the eight practices of science and engineering. However, the interactive, complex, and often technology-based tasks that are needed to assess model-based reasoning in its fullest forms are difficult to develop.  

Building on research in assessment, science education, and the learning sciences, this Brief describes a suite of design patterns that can help assessment designers, researchers, and teachers create tasks for assessing aspects of model-based reasoning: Model Formation, Model Use, Model Elaboration, Model Articulation, Model Evaluation, Model Revision, and Model-Based Inquiry. Each design pattern lays out considerations concerning targeted knowledge and ways of capturing and evaluating students' work. These design patterns are available at http://design-drk.padi.sri.com/padi/do/NodeAction?state=listNodes&NODE_TYPE=PARADIGM_TYPE. The ideas are illustrated with examples from existing assessments and the research literature.
1 Introduction 1(8)
2 Model-Based Reasoning 9(10)
2.1 Scientific Models 9(3)
2.2 The Inquiry Cycle 12(2)
2.3 Some Relevant Results from Psychology 14(5)
2.3.1 Experiential Aspects of Model-Based Reasoning 14(1)
2.3.2 Reflective Aspects of Model-Based Reasoning 15(1)
2.3.3 Higher-Level Skills 16(1)
2.3.4 Implications for Assessment Use Cases 17(2)
3 Evidence-Centered Assessment Design 19(6)
3.1 Assessment Arguments 20(2)
3.2 Design Patterns 22(3)
4 Design Patterns for Model-Based Reasoning 25(6)
5 Model Formation 31(18)
5.1 Rationale, Focal KSAs, and Characteristic Task Features 31(5)
5.2 Additional KSAs 36(3)
5.3 Variable Task Features 39(3)
5.4 Potential Work Products and Potential Observations 42(4)
5.5 Considerations for Larger Investigations 46(1)
5.6 Some Connections to Other Design Patterns 47(2)
6 Model Use 49(10)
6.1 Rationale, Focal KSAs, and Characteristic Task Features 49(4)
6.2 Additional KSAs 53(2)
6.3 Variable Task Features 55(1)
6.4 Potential Work Products and Potential Observations 55(2)
6.5 Some Connections with Other Design Patterns 57(2)
7 Model Elaboration 59(6)
7.1 Rationale, Focal KSAs, and Characteristic Task Features 59(2)
7.2 Additional KSAs 61(1)
7.3 Variable Task Features 61(2)
7.4 Potential Work Products and Potential Observations 63(1)
7.5 Some Connections with Other Design Patterns 64(1)
8 Model Articulation 65(6)
8.1 Rationale, Focal KSAs, and Characteristic Task Features 66(1)
8.2 Additional KSAs 67(1)
8.3 Variable Task Features 67(1)
8.4 Potential Work Products and Potential Observations 68(1)
8.5 Some Connections with Other Design Patterns 69(2)
9 Model Evaluation 71(10)
9.1 Rationale, Focal KSAs, and Characteristic Task Features 71(5)
9.2 Additional KSAs 76(1)
9.3 Variable Task Features 77(1)
9.4 Potential Work Products and Potential Observations 78(1)
9.5 Some Connections with Other Design Patterns 79(2)
10 Model Revision 81(8)
10.1 Rationale, Focal KSAs, and Characteristic Task Features 81(3)
10.2 Additional KSAs 84(1)
10.3 Variable Task Features 85(1)
10.4 Potential Work Products and Potential Observations 86(1)
10.5 Some Connections with Other Design Patterns 86(3)
11 Model-Based Inquiry 89(10)
11.1 Rationale, Focal KSAs, and Characteristic Task Features 90(2)
11.2 Additional KSAs 92(1)
11.3 Variable Task Features 92(2)
11.4 Potential Work Products and Potential Observations 94(3)
11.5 Some Connections with Other Design Patterns 97(2)
12 Conclusion 99(6)
12.1 Standards-Based Assessment 99(1)
12.2 Classroom Assessment 100(1)
12.3 Large-Scale Accountability Testing 101(1)
12.4 Simulation- and Game-Based Assessment 102(1)
12.5 Closing Comments 103(2)
Appendix: Summary Form of Design Patterns for Model-Based Reasoning 105(16)
References 121(8)
Index 129
Robert J. Mislevy, PhD, is Frederic M. Lord Chair in Measurement and Statistics at the Educational Testing Service in Princeton, New Jersey, and Professor Emeritus at the University of Maryland. Geneva D. Haertel, PhD, is Director of Assessment Research and Design at the Center for Technology in Learning, SRI International. Michelle M. Riconscente, PhD, is Director of Learning and Assessment at GlassLab in California. Daisy Wise Rutstein, PhD, is Education Researcher in the Education Division at SRI International. Cindy S. Ziker, PhD, MPH, is Senior Researcher for Assessment in the Education Division of the Center for Technology and Learning at SRI International.