Classroom Assessment Strategies
Assessment: process of observing a sample of a student's behavior and drawing inferences about the student's knowledge and abilities
Formative (conducted before or during instruction to facilitate planning and learning) vs. Summative (conducted after instruction to assess final achievement)
Informal (spontaneous, day-to-day observations) vs. Formal (preplanned to ascertain what students know and can do)
Paper-pencil (written response to written items) vs. Performance (demonstrate knowledge in a nonwritten fashion)
Traditional (measure basic knowledge and skills in relative isolation) vs. Authentic (focuses on students' knowledge and skills in a context that might be found in the outside world)
Standardized (developed by test construction experts and published for use in different classrooms) vs. Teacher-Developed (developed for use in a specific classroom)
Criterion-Referenced (what students know and can do in reference to predetermined standards) vs. Norm-Referenced (how students perform relative to a peer group)
Evaluating/Promoting Learning:
Response to Intervention: approach to diagnosing a cognitive impairment in which students are identified for in-depth assessment after failing to master certain basic skills
Assessments can motivate students to study and learn
Assessments can influence students' cognitive processes as they study
Assessments can serve as learning experiences in and of themselves
Assessments can provide valuable feedback about learning progress
Enhancing learning through formative assessments:
rubrics: list of characteristics and components that a student's performance on an assessment should ideally have
dynamic assessment: systematic examination of how easily and in what ways a student can acquire new knowledge or skills, usually within the context of instruction or scaffolding
Qualities of a Good Assessment:
1. Did the teacher evaluate students' responses inconsistently?
2. Were some students assessed under more favorable conditions than others?
3. Was the assessment a poor measure of what you had learned?
4. Was the assessment so time-consuming that after a while you no longer cared how well you performed?
Reliability: extent to which an assessment yields consistent information about the knowledge, skills, or characteristics being assessed
Standardization: refers to the extent to which an assessment involves similar content and format and is administered and scored similarly for everyone
Validity: extent to which an assessment actually measures what it is intended to measure and allows appropriate inferences about the characteristic or ability in question
Content Validity: extent to which an assessment includes a representative sample of tasks within the content domain being assessed
Curriculum-based measurement (CBM): use of frequent assessments to track students' progress in acquiring basic skills
Table of specifications: two-way grid indicating the topics to be covered in an assessment and the things students should be able to do with those topics
Predictive validity: extent to which the results of an assessment predict future performance in a particular domain
Construct validity: extent to which an assessment accurately measures an unobservable educational or psychological characteristic
Practicality: extent to which an assessment instrument or procedure is inexpensive and easy to use and takes only a small amount of time to administer and score
Assessments:
Informal: personal journals, body language during activities, asking questions during the lesson
halo (tendency to perceive positive behaviors in someone one likes) vs. horns (tendency to perceive negative behaviors in someone one has little respect for)
Paper-pencil: true/false, multiple choice, matching
Recognition task: identify correct information among incorrect or irrelevant statements
Recall task: assessment task in which one must retrieve information from long-term memory with only minimal retrieval cues
Constructed response: recall assessment task that requires a lengthy response
Performance: playing a musical instrument, role-playing, presentations
Analytic scoring: evaluating various aspects of a student's performance separately
Holistic scoring: summarizing performance with a single score
Formal Considerations:
testwiseness: test-taking know-how that enhances test performance
test anxiety: excessive anxiety about a particular test or about assessment in general
item analysis: patterns in student response to various items
item difficulty: index reflecting the proportion of students getting a particular assessment item correct
item discrimination: index reflecting the relative proportion of high-scoring vs low-scoring students getting a particular assessment item correct
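Both indices can be computed directly from item-level scores. Below is a minimal Python sketch, assuming binary-scored items (1 = correct, 0 = incorrect) and the common convention of comparing upper and lower groups of scorers on the total test; the function names, the 27% grouping fraction, and the sample data are illustrative assumptions, not taken from the source.

```python
def item_difficulty(item_scores):
    """Proportion of students answering the item correctly (difficulty index)."""
    return sum(item_scores) / len(item_scores)

def item_discrimination(item_scores, total_scores, group_fraction=0.27):
    """Difference in the proportion correct between high- and low-scoring groups."""
    n = len(total_scores)
    k = max(1, int(n * group_fraction))          # size of each comparison group
    order = sorted(range(n), key=lambda i: total_scores[i])  # rank by total score
    low_group, high_group = order[:k], order[-k:]
    p_high = sum(item_scores[i] for i in high_group) / k
    p_low = sum(item_scores[i] for i in low_group) / k
    return p_high - p_low

# Hypothetical responses to one item (1 = correct) and total test scores
item = [1, 1, 0, 1, 0, 1, 1, 0, 1, 0]
totals = [88, 92, 55, 78, 60, 95, 81, 48, 70, 52]
print(item_difficulty(item))                 # 0.6 -> 60% answered correctly
print(item_discrimination(item, totals))     # positive -> high scorers did better on this item
```

An item with a discrimination index near zero or negative (low scorers doing as well as or better than high scorers) is usually flagged for revision, while very high or very low difficulty values suggest the item tells the teacher little about differences among students.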