Essay Exams: Beyond Knowledge and Recall of Factual Information
Intended audience: DMU faculty and staff.
Curriculum Design and Evaluation
Curriculum development and evaluation in the modern health sciences environment pose many challenges for course developers. Faculty members face an ever-increasing demand to develop integrated courses that incorporate active learning and that are also matched to assessment and program evaluation criteria. To meet this demand and provide quality education programs, faculty members must have a thorough understanding of course design. The IAMSE Spring Series will cover several key topics to help course directors design courses based on carefully planned objectives and expected student outcomes, assess student performance using several types of questions, and perform detailed program evaluation to help gauge course effectiveness and promote successful quality improvement. Session 1 will explore issues related to instructional design, with emphasis on creating measurable learning objectives using Bloom's Taxonomy and on using a backward design approach to course development. Session 2 will focus on how to use curriculum mapping to identify content gaps and undesired redundancy within programs. The next two sessions will demonstrate how to perform post-hoc multiple-choice item analysis using psychometric data and how to design effective essay questions that assess student knowledge. The final session will concentrate on methods to connect program evaluation to continuous quality improvement.
A firmly held belief in medical education is that assessment drives learning: students generally learn what they need to learn to succeed on required assessments. As part of our curriculum redesign a decade ago at Case Western Reserve University School of Medicine, we switched from almost exclusively multiple-choice questions to open-ended, essay-type questions. The switch occurred after vigorous debate. The leadership believed that constructed-response questions promoted more desirable study methods and required conceptual organization and synthesis of information on the students' part to a greater degree than multiple-choice questions did. This shift was supported by our faculty. During this webinar, we will review our experience with open-ended assessments, as well as the lessons learned in using open-ended, essay-type questions to assign student grades during Foundations of Medicine and Health. We will share a sampling of our faculty's comments and insights regarding the assessment of student performance using open-ended, essay-type questions. We will explore the evidence behind the commonly held view that open-ended items require students both to search for and to retrieve information, whereas multiple-choice items require only that students recognize and pick the correct answer out from among a list of incorrect choices (ironically enough, called distractors) — that is, that different assessment formats place different cognitive demands on students.
- Explain the importance of aligning course objectives, instructional methods, and assessment.
- Describe an example of a synthesis essay question and explain logistics of scoring and reporting.
- Consider the evidence behind the commonly held view that different assessment formats place different cognitive demands on students.
Amy Wilson-Delfosse, PhD
Dr. Wilson-Delfosse is the Associate Dean for Curriculum at Case Western Reserve University School of Medicine and the immediate Past President of the International Association of Medical Science Educators. She has published in the fields of pharmacology education, team-based learning, and the integration of basic science and clinical medicine. She also has particular interest in identifying strategies to optimize learning within the context of problem-based learning teams. The Association of American Medical Colleges recognized Dr. Wilson-Delfosse's accomplishments in medical education with the Alpha Omega Alpha Robert J. Glaser Distinguished Teacher Award in 2012.
Klara Papp, PhD
Dr. Papp is Professor and Director of the Center for the Advancement of Medical Learning at Case Western Reserve University School of Medicine in Cleveland, Ohio, USA. The Center supports students and faculty in their efforts to pursue excellence in learning, teaching, and educational scholarship. Dr. Papp completed her PhD in Educational Psychology at the State University of New York at Buffalo. She provides expertise in educational testing and measurement, including the construction and interpretation of measures of student performance such as multiple-choice tests, essay examinations, behavior checklist rating scales, and patient-based clinical skills exams. She is working with a small, dedicated team of faculty to design assessments for the curriculum redesign at CWRU that align educational goals, instructional methods, and assessments.
- Bloom BS. Taxonomy of Educational Objectives: the classification of educational goals. New York: D. McKay Co., Inc., 1974 (c1956).
- Case SM, Swanson DB. Constructing written test questions for the basic and clinical sciences. 3rd Ed. Philadelphia: National Board of Medical Examiners, 2000.
- Haladyna TM, Downing SM. Construct-irrelevant variance in high stakes testing. Educ Meas: Issues & Practice 2004; 23: 17-27.
- Martinez ME. Cognition and the question of test item format. Educ Psychol 1999; 34: 207-18.
- Norman GR, Swanson DB, Case SM. Conceptual and methodological issues in studies comparing assessment formats. Teach Learn Med 1996; 8: 208-16.
- Swanson DB, Case SM. Assessment in basic science instruction: directions for practice and research. Adv Health Sci Educ 1997; 2: 71-84.
- Joint Committee. Standards for educational and psychological testing. The joint committee of the American Educational Research Association, American Psychological Association, and National Council on Measurement in Education. Washington, DC: AERA, 1999.
- Zieky MJ, Livingston SA. Passing Scores: a manual for setting standards of performance on educational and occupational tests. Educational Testing Service, 1982.
- 1.00 CE Contact Hour(s)