Ch. 10
Multiple-Choice and Matching Exercises
Multiple-Choice Items
To Do
To Avoid
Does the item assess an important aspect of the unit's instructional objectives?
Does the item match your assessment plan in terms of performance, emphasis, and number of points?
Does the stem ask a direct question or set a specific problem?
Is the item based on a paraphrase rather than words lifted directly from a textbook?
Are the vocabulary and sentence structure at a relatively low and nontechnical level (except for content-related vocabulary)?
Is each alternative plausible so that a student who lacks knowledge of the correct answer cannot view it as absurd or silly?
If possible, is every incorrect alternative based on a common student error or misconception?
Is the correct answer to this item independent of the correct answers of other items?
Are all of the alternatives homogeneous and appropriate to the content of the stem?
Did you avoid using "all of the above" or "none of the above" as much as possible?
Is there only one correct or best answer to the item?
Creating Alternative Varieties of Multiple-Choice Items
Does each item in the greater-less-same set assess an important aspect of the unit's instructional objectives?
Does each item in the greater-less-same set match your assessment plan in terms of performance, emphasis, and number of points?
Do some of the items in the greater-less-same set require students to apply their knowledge and skill to new situations, examples, or events?
Do your directions clearly and completely explain the basis you intend students to use when judging "greater than," "less than," or "same as" for each pair of statements?
Do your directions state which pair member (left or right) is the referent?
Did you avoid using a pattern (GGSSLLGGSSLL, etc.) for the correct answers?
Does each best-answer item assess an important aspect of the unit's instructional objectives?
Does each best-answer item match your assessment plan in terms of performance, emphasis, and number of points?
Does each best-answer item require students to apply their knowledge and skill in some manner to new situations, examples, or events?
Do your directions clearly and completely explain the basis you intend students to use when judging "best"? Have your students been given practice in using the appropriate criteria for judging "best"?
Are all the options correct to some degree?
Is the keyed answer the only one that can be defended as "the best" by applying the criteria you specify in the directions?
Is each distractor based on an important misconception, misunderstanding, or common way of giving an incomplete answer? Did you avoid tricky or trivial ways of making a distractor partially correct or of having it contain misinformation?
Are all of the options of equal length (within five words of each other)?
Did you avoid
(a) having more than one "best" answer
(b) using "all of the above" or "none of the above"?
Did you apply all of the item-writing guidelines described in the multiple-choice checklist?
Does each item assess an important aspect of the unit's instructional objectives?
Does each experiment-interpretation item match your assessment plan in terms of performance, emphasis, and number of points?
Does each item focus on requiring students to apply one or more important principles or criteria to new situations, examples, or events?
Have you given students opportunity to practice applying the appropriate criteria or principles for judging the "best" or "most valid" interpretation?
Did you describe an experiment or research study in concise but sufficient detail that a student can use the appropriate criteria or principles to interpret the results?
Is the keyed answer the only one that can be defended as the "best" or "most valid" interpretation?
Is each distractor based on an important misconception, misinterpretation, or misapplication of a criterion or principle? Did you avoid tricky or trivial ways of making a distractor partially correct or of having it contain misinformation?
Did you avoid
(a) having more than one "best" or "most valid" answer
(b) using "all of the above" or "none of the above"?
Did you apply all of the appropriate item-writing guidelines described in the multiple-choice checklist?
If you used short-answer items, did you apply all of the appropriate item-writing guidelines described in the short essay checklist (Chapter 11)?
Does the exercise assess an important aspect of the unit's instructional objectives?
Does the exercise match your assessment plan in terms of performance, emphasis, and number of points?
Within this exercise, does every premise and response belong to the same category of things?
Do your directions clearly state the basis you intend students to use to complete the matching correctly?
Does every element in the response list function as a plausible alternative to every element in the premise list?
Are there fewer than 10 responses in this matching exercise?
Did you avoid "perfect matching"?
Are the longer statements placed in the premise list and the shorter statements (names, words, symbols, etc.) in the response list?
If possible, are the elements in the response list ordered in a meaningful way (logically, numerically, alphabetically, etc.)?
Are the premises numbered and the responses lettered?
Does the masterlist exercise assess an important aspect of the unit's instructional objectives?
Does the masterlist exercise match your assessment plan in terms of performance, emphasis, and number of points?
Does the masterlist exercise require the students to apply their knowledge and skill to new situations, examples, or events?
Did you provide enough information so that knowledgeable students are able to apply the knowledge and skill called for by the item?
Do your directions to the students clearly and completely explain the basis you intend them to use when applying masterlist response choices to the stems?
Within this masterlist exercise, does every stem and every response choice in the masterlist belong to the same category of things?
Does every response choice in the masterlist function as a plausible alternative for every stem?
Did you avoid "perfect matching"?
If possible, are the options in the masterlist ordered in a meaningful way (logically, numerically, alphabetically, etc.)?
Are the stems numbered and the masterlist response choices lettered?
Does the tabular exercise assess an important aspect of the unit's instructional objectives?
Does the tabular exercise match your assessment plan in terms of performance, emphasis, and number of points?
Do your directions to students clearly explain (a) the basis you intend students to use when matching the responses to the premises, (b) how to mark their answers, and (c) that a response choice may be used once, more than once, or not at all?
Do the response choices within each response list all belong to the same category of things?
Does every response choice function as a plausible alternative to every premise?
Did you avoid "perfect matching"?
If possible, are the response choices ordered in a meaningful way (logically, numerically, alphabetically, etc.)?
Are the premises numbered and the response choices lettered?
On the test page, are the directions placed first, the response choices second, and the table third?
If possible, is the entire exercise printed on one page rather than split between two pages?
Conclusion
This chapter has discussed writing multiple-choice items and matching exercises. Together with the fill-in-the-blank and true-false items discussed in Chapter 9, these formats comprise the common selected-response, objectively scored item types used in paper-and-pencil tests. In Chapter 11, we turn to the essay question, a constructed-response format.