Research Article: Educational Methodologies

Short-Answer Examinations Improve Student Performance in an Oral and Maxillofacial Pathology Course

R. Neal Pinckard, C. Alex McMahan, Thomas J. Prihoda, John H. Littlefield and Anne Cale Jones
Journal of Dental Education August 2009, 73 (8) 950-961;
  • © 2009 American Dental Education Association

Abstract

The effect of examination question format on student performance was assessed by investigating three academically comparable second-year dental school classes in an oral and maxillofacial pathology course. One class was given examinations with all multiple-choice questions, one class was given examinations with all short-answer questions, and one class was given examinations with half multiple-choice questions and half short-answer questions. The class given examinations with half short-answer questions along with half multiple-choice questions had a significantly higher average score and grade category distribution (80–100 percent, 70–79 percent, <70 percent) than the class given examinations with all multiple-choice questions. When students in these two classes were divided into three academic ability groups based on the student’s score in a prerequisite general pathology course, the class given examinations with half short-answer questions and half multiple-choice questions in the oral and maxillofacial pathology course had significantly higher scores and grade category distributions in all three ability groups. The average score and grade category distribution in the class given examinations with all short-answer questions in the oral and maxillofacial pathology course were not significantly different from the average score and grade category distribution in the class given examinations with half short-answer and half multiple-choice questions. Our interpretation of these results is that the utilization of examinations containing short-answer questions created a more challenging learning environment that motivated students to adopt more effective study regimens.

Keywords:
  • Aptitude-Treatment Interaction
  • student performance evaluation
  • multiple-choice questions
  • short-answer questions
  • formula scoring
  • correction for guessing
  • educational methodology
  • educational measurement
  • dental education

It is well recognized that students can improve their scores on cued multiple-choice examinations by guessing. This raised a crucial question in our minds: does this examination question format accurately assess student knowledge? The answer is of paramount importance for health professions educators. To examine this fundamental question, we retrospectively applied the standard correction for random (no knowledge) guessing1–3 to the scores on multiple-choice examinations in an oral and maxillofacial pathology course for dental students in the 2004–05 academic year.4 We found that corrected multiple-choice examination scores (half a student’s grade) agreed significantly better with short-answer examination scores (half a student’s grade) than did the uncorrected, and inflated, multiple-choice examination scores. Therefore, we concluded that correction for guessing increased the validity of the multiple-choice examinations and provided a better evaluation of student knowledge.

As an obvious follow-up to the above findings, the course director prospectively implemented the correction for guessing on the multiple-choice portion of the examinations in the same oral and maxillofacial pathology course in the next academic year (2005–06). We observed significantly improved student performance, particularly among students in the lower portion of the grade distribution, relative to the previous year (2004–05) in which prospective correction for guessing was not employed (an Aptitude-Treatment Interaction [ATI]5,6). The improved performance was observed not only on the multiple-choice portions of the examinations, but on the short-answer portions as well. However, student reaction to the implementation of correction for guessing was overwhelmingly negative. As a consequence, in the next academic year (2006–07), the course director decided to utilize all multiple-choice questions, without correction for guessing, as is done in most of the courses in the first two years of our dental school curriculum. Unexpectedly and unfortunately, student performance decreased substantially in this 2006–07 academic year. As a result, the decision was made that for the 2007–08 academic year, only short-answer questions would be used to evaluate student performance.

In view of the preceding, we serendipitously found ourselves in a unique position to evaluate quantitatively the association between examination question format and student academic performance. The results of these analyses indicated that using short-answer format questions significantly improved student performance. Presumably, this improved performance occurred because, in anticipation of the short-answer examinations as opposed to multiple-choice examinations, the students were motivated to modify their study regimen in a manner that fostered more effective learning.

Methods

This study evaluated the effects of examination question format on student academic performance. Student examination scores in an oral and maxillofacial pathology course were compared among three second-year dental school classes in which different examination question formats (multiple-choice, short-answer, or both; see Table 1⇓) were used. In the 2004–05 class, course grades were determined from scores obtained on an equally weighted combination of short-answer examinations and multiple-choice examinations (SA/MC, 2004–05). In the 2006–07 class, course grades were determined using only multiple-choice examinations (MC, 2006–07) and in the 2007–08 class, course grades were determined using only short-answer examinations (SA, 2007–08).

Table 1.

Summary of examination formats and scoring methods used in the oral and maxillofacial pathology course, by academic class

Examination scores in the prerequisite general pathology course were used to determine whether the three classes were academically comparable. Grade categories (80–100 percent, 70–79 percent, <70 percent) in the general pathology course were used to identify subgroups of students of different academic ability. The examination procedures (all multiple-choice questions) in the prerequisite general pathology course were the same in all three years.

In the three different academic years in this report, there were no changes in instructors in the oral and maxillofacial pathology course and only slight changes (affecting five of sixty-one contact hours) in instructors in the general pathology course. There were no changes in course content or objectives for either course. Any other changes among the three years, such as slight changes in emphasis or changes in clinical cases and images, were changes that typically occur from year to year.

To improve the validity of scores from all multiple-choice examinations (that is, comparability with short-answer examination scores), we retrospectively applied the standard correction for guessing.4 However, only the uncorrected scores were used in the calculation of a student’s official grade in these years. Because the correction for guessing was applied retrospectively, it had no influence on student behavior.

The oral and maxillofacial pathology course at the University of Texas Health Science Center at San Antonio, given during the spring semester to second-year dental students, consisted of fifty hours of didactic lecture and four two-hour examinations. At the outset of this course, students were told explicitly that they were expected to learn and understand each of the following characteristics for the various pathologic processes discussed in the course: etiology, age and sex predilection, most common anatomic location, distinguishing features (clinical, radiographic, microscopic), diagnostic aids and laboratory tests, treatment options, and prognosis. Moreover, students were told they would be examined on these characteristics. Both short-answer and multiple-choice examinations were constructed to test not only recall of memorized facts but also to test student knowledge and understanding of these disease characteristics. The following illustrates a test question on the same information using both short-answer and multiple-choice formats.

A fourteen-year-old female demonstrates hypertelorism, frontal bossing, mandibular prognathism, and palmar pitting. A panoramic radiograph reveals multiple radio-lucent lesions in the right and left posterior mandible and a radiolucent lesion surrounding the right maxillary canine. Based on the clinical and radiographic findings, what is the most likely pathologic process associated with these multiple radiolucent lesions?

Short-answer: Odontogenic keratocyst

Multiple-choice: A. Dentigerous cyst

B. Odontogenic keratocyst

C. Calcifying odontogenic cyst

D. Ameloblastic fibroma

E. Ameloblastoma

Each of the four examinations in the oral and maxillofacial pathology course covered between eleven and thirteen hours of lecture material and comprised 25 percent of the final grade. No comprehensive final examination was given. Students received final course grades based on averages calculated from the scores on the four examinations. The questions on both short-answer and multiple-choice examinations were equally weighted to the topics that were presented prior to each of the four examinations. This was to ensure that a given topic was not stressed more often than another topic and that a topic was not stressed more in one question format than another.

In the SA/MC (2004–05) class, each of the four examinations was divided into two one-hour examinations. The first hour of each examination was based on the presentation of twenty-five clinical cases. Each student was given a written examination containing the clinical histories corresponding to the twenty-five clinical cases that would be projected. Each case consisted of a brief written clinical history and projected gross, microscopic, and/or radiographic findings. Students were advised to respond succinctly to the two short-answer questions for each case and not to use verbose responses. Appropriate answers to short-answer questions typically consisted of one or more sentences or key words. All short-answer examinations were graded solely by the course director (coauthor ACJ). The short-answer questions were graded by identifying key words delineated at the time of construction of the examination. Points were not deducted for spelling errors as long as responses were phonetically correct. If a student gave several answers, only the first answer was evaluated; no partial credit was awarded. The second hour of each examination in the SA/MC (2004–05) class consisted of fifty multiple-choice questions, each of which had one correct answer and four plausible distractors. At the end of the second hour, the multiple-choice examination answer sheets were collected and graded electronically. Since the multiple-choice and short-answer examinations each consisted of fifty questions, they were equally weighted in the calculation of each student’s final grade.

In the MC (2006–07) class, each of the four examinations consisted of seventy-five multiple-choice questions of construction similar to that described in the foregoing paragraph. In the SA (2007–08) class, each of the four examinations consisted of seventy-five short-answer questions constructed as described in the foregoing paragraph for the 2004–05 class.

The general pathology course at the University of Texas Health Science Center at San Antonio is given in the fall semester to second-year dental students and immediately precedes the oral and maxillofacial pathology course. The course and examination methods were the same for all academic years included in this report. The course consisted of sixty-one hours of didactic lecture, four two-hour review sessions, and four two-hour examinations. The review sessions were structured in a question and answer format. Each faculty member who had previously presented didactic information for the upcoming examination presented a brief verbal review of their topics. Students were then allowed to ask questions, the answers to which were discussed by the faculty member. This procedure was repeated until there were no further questions. Each of the examinations consisted of seventy-five multiple-choice questions with one correct answer and four distractors; test construction strategies were similar to those described for the oral and maxillofacial pathology course. The multiple-choice questions covered information presented in the lectures and reading assignments in the period immediately preceding each examination. Each two-hour examination comprised 25 percent of the final course grade. No comprehensive final examination was given. Students received a final course grade based on the averages calculated from the four two-hour examinations. Each of the four examinations covered between thirteen and nineteen hours of lecture material. When the multiple-choice questions were constructed, the questions were equally weighted to the topics that were presented prior to each of the four examinations.

We analyzed data from students who completed all four examinations in the oral and maxillofacial pathology course and all four examinations in the prerequisite general pathology course. The analyses presented in this report were based on eighty-eight students in the SA/MC (2004–05) class, eighty-eight students in the MC (2006–07) class, and eighty-six students in the SA (2007–08) class. This study was approved by the Institutional Review Board of the University of Texas Health Science Center at San Antonio.

The correction for guessing utilized was a modification to the common grading method for multiple-choice examinations (number-correct or number-right scoring) in which zero points are assigned for an incorrect answer and full credit is given for a correct answer.4,7 Since each multiple-choice question had five possible answers, the standard correction for guessing consisted of awarding -¼ for an incorrect answer, 0 for a question not answered, and +1 for a correct answer. The retrospective application of the correction for guessing had no influence on student behavior, and students answered all multiple-choice questions because they expected to benefit by guessing. The retrospective correction for guessing therefore was equivalent to applying a straight line adjustment to the multiple-choice examination scores such that a grade of 100 percent was unchanged and a grade of 20 percent was adjusted to zero. The equation of this straight line was: Corrected Score (%) = 1.25 [Uncorrected Score (%) – 20].4 The grade categories used in this report (80–100 percent, 70–79 percent, <70 percent) were based on corrected multiple-choice examination scores and therefore correspond to uncorrected scores of 84–100 percent, 76–83 percent, and <76 percent.
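The straight-line adjustment described above can be sketched in code. The function names and the example counts below are illustrative only; the formulas themselves follow the paper's stated scoring rules (+1 correct, -¼ incorrect, 0 omitted, and Corrected Score = 1.25 [Uncorrected Score - 20]).

```python
def corrected_score(uncorrected_pct, n_choices=5):
    """Standard correction for guessing, assuming every question was answered.

    With k choices per question, pure guessing yields a chance level of
    100/k percent; the linear adjustment maps that chance level to zero
    and leaves a score of 100 percent unchanged.
    """
    chance = 100.0 / n_choices                        # 20% for five choices
    return (100.0 / (100.0 - chance)) * (uncorrected_pct - chance)


def formula_score(n_correct, n_incorrect, n_total, n_choices=5):
    """Equivalent per-item formula scoring: +1 for a correct answer,
    -1/(k-1) for an incorrect answer, 0 for an omitted question."""
    raw = n_correct - n_incorrect / (n_choices - 1)
    return 100.0 * raw / n_total
```

When every question is answered, the two forms agree: on a five-choice, fifty-question examination, 42 correct and 8 incorrect give an uncorrected score of 84 percent and a corrected score of 80 percent, matching the grade-category boundary stated above.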

Means were compared between classes using analysis of variance.8 Cumulative relative frequency distributions were used to graphically show distributions of numerical scores. Numerical course averages in the oral and maxillofacial pathology course and in the general pathology course were used to assign course grade category as 80–100 percent, 70–79 percent, and <70 percent. Frequencies in grade categories were analyzed using the chi-square test for independence;8 if low expected frequencies were encountered, an exact procedure was used to obtain the P-value. We classified students’ performance based on their grade category in the general pathology course (an estimate of student academic ability) and then analyzed their respective numerical scores in the oral and maxillofacial pathology course by these classifications and academic class in a two-way analysis of variance8 to investigate Aptitude-Treatment Interactions.5 All calculations were carried out using SAS 9.1.3 (SAS Institute, Cary, NC).
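As a rough sketch of the chi-square test for independence applied to grade-category frequencies, the following computes the Pearson statistic for a classes-by-categories contingency table. The counts in the example are hypothetical, not the study's data, and a statistics package (SAS was used here) would also supply the exact P-value.

```python
def chi_square_stat(table):
    """Pearson chi-square statistic for an r x c contingency table,
    e.g. rows = academic classes, columns = grade categories."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            # Expected count under independence of row and column factors
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat


# Hypothetical counts: two classes x three grade categories
# (80-100%, 70-79%, <70%)
example = [[50, 33, 5],
           [38, 34, 16]]
```

With 2 rows and 3 columns the statistic has (2 - 1)(3 - 1) = 2 degrees of freedom, so values above the 5.991 critical point are significant at the 0.05 level.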

Results

General Pathology Course

In the general pathology course, there were no significant differences among the three academic classes in either course averages (P=0.3754, Table 2⇓) or grade category distributions (80–100 percent, 70–79 percent, <70 percent) (P=0.4527, Table 2⇓).

Table 2.

Mean scores and grade category distributions in the general pathology course and the oral and maxillofacial pathology course, by academic class

Oral and Maxillofacial Pathology Course

Student performance assessed by multiple-choice examinations.

The class means and grade category distributions of the multiple-choice examination scores in the oral and maxillofacial pathology course are given in Table 2⇑. Utilization of half short-answer examinations was associated with enhanced student performance on the multiple-choice examinations as indicated by a significantly higher mean score for the SA/MC (2004–05) class than for the MC (2006–07) class (81.6 vs. 78.4, P=0.0101).

The grade category distributions for the multiple-choice examinations in the oral and maxillofacial pathology course are shown in Figure 1⇓ and Table 2⇑. The two pie charts in the left-hand column of Figure 1⇓ show little difference in the grade distributions for the two classes in the general pathology course (not significantly different as stated earlier). The grade distributions of the MC (2006–07) class in the general pathology and oral and maxillofacial pathology courses were not different (P=0.8211); this is not surprising because the examination formats were the same, that is, all multiple-choice questions. However, in the SA/MC (2004–05) class, student performance on the multiple-choice examinations in the oral and maxillofacial pathology course was significantly better than performance in the general pathology course (P=0.0010). The grade category distribution in the oral and maxillofacial pathology course based on the multiple-choice examinations in the SA/MC (2004–05) class reflected a higher fraction of students in the 80–100 percent grade category than in the MC (2006–07) class (56.8 percent vs. 43.2 percent, P=0.0704, Table 2⇑). There also was a lower fraction of students in the <70 percent grade category in the SA/MC (2004–05) class than in the MC (2006–07) class (5.7 percent vs. 18.2 percent, P=0.0105, Table 2⇑). Thus, by utilizing a short-answer format examination in the oral and maxillofacial pathology course in the SA/MC (2004–05) class, the students’ performances improved relative to the grade distribution anticipated from the general pathology course.

Figure 1.

Distribution (percent of students) of grade category in the general pathology course and oral and maxillofacial pathology course based on multiple-choice examination

Student performance assessed by short-answer examinations.

The mean score on the short-answer examinations for the SA/MC (2004–05) class was not significantly different from the mean score for the SA (2007–08) class (82.8 vs. 82.8, P=0.9628, Table 2⇑); and the grade category distribution was not significantly different between the SA/MC (2004–05) class and the SA (2007–08) class (Table 2⇑, P=0.2607). Thus, there was no association of student performance with utilization of half short-answer examinations relative to using all short-answer examinations.

Oral and Maxillofacial Pathology Course Performance for Students of Different Ability

The foregoing analyses demonstrated improved overall class performance in the oral and maxillofacial pathology course when short-answer format questions were used. We further investigated how individual students of differing academic ability, as estimated by their grade category in the general pathology course, performed in the oral and maxillofacial pathology course relative to examination question format.

Student performance assessed by multiple-choice examinations.

The cumulative frequency distributions of individual student averages on the multiple-choice examinations for the oral and maxillofacial pathology course by grade category in the general pathology course are shown in the left-hand column of Figure 2⇓. These cumulative distributions indicated that students in all three grade categories (80–100 percent, 70–79 percent, <70 percent) in the general pathology course performed better in the oral and maxillofacial pathology course in the SA/MC (2004–05) class, in which short-answer questions comprised half of the examinations. Furthermore, the magnitudes of the improved performances were similar in all three grade categories determined from the prerequisite general pathology course.

Figure 2.

Cumulative relative frequency distributions of average student scores from four multiple-choice examinations and four short-answer examinations in the oral and maxillofacial pathology course by grade category in the prerequisite general pathology course and academic class

The foregoing impressions were confirmed quantitatively. For all general pathology grade categories, the average scores (Table 3⇓) on the multiple-choice examinations in the oral and maxillofacial pathology course were significantly higher in the SA/MC (2004–05) class than in the MC (2006–07) class (80–100 percent grade category, 88.5 vs. 84.4, P=0.0012; 70–79 percent grade category, 80.7 vs. 75.6, P=0.0001; <70 percent grade category, 72.6 vs. 69.1, P=0.0408). The differences in mean score on the multiple-choice examinations (Table 3⇓) between the SA/MC (2004–05) class and the MC (2006–07) class were similar in all three general pathology grade categories (differences calculated from mean scores in Table 3⇓ were 4.0 for the 80–100 percent grade category, 5.1 for the 70–79 percent grade category, and 3.5 for the <70 percent grade category), indicating that there was no Aptitude-Treatment Interaction (academic class by grade category interaction, P=0.7075).

Table 3.

Mean scores on the multiple-choice examinations and the short-answer examinations in the oral and maxillofacial pathology course by grade category in the general pathology course (retrospectively corrected) and academic class

The grade category distributions of the multiple-choice examinations in the oral and maxillofacial pathology course are given in Figure 3⇓ and Table 4⇓. All of the students in the 80–100 percent grade category in the general pathology course achieved this target grade category in the oral and maxillofacial pathology course in the SA/MC (2004–05) class, in which short-answer examinations were included. However, in the MC (2006–07) class, there was a significant fraction of these students (80–100 percent grade category in the general pathology course) that dropped into the 70–79 percent grade category in the oral and maxillofacial pathology course (0 percent vs. 24.4 percent, P=0.0019). For the students in the 70–79 percent grade category in the general pathology course, the fraction of students moving up into the 80–100 percent grade category in the oral and maxillofacial pathology course approximately doubled in the SA/MC (2004–05) class relative to the MC (2006–07) class (51.4 percent vs. 23.3 percent, P=0.0203). Also, in the SA/MC (2004–05) class, none of these students (70–79 percent grade category in the general pathology course) moved down into the <70 percent grade category in the oral and maxillofacial pathology course, whereas in the MC (2006–07) class, 20 percent of students fell into the <70 percent grade category (P=0.0072). Among those students in the <70 percent grade category in the general pathology course, there was a significantly greater fraction of students that remained in this grade category in the oral and maxillofacial pathology course in the MC (2006–07) class than in the SA/MC (2004–05) class (58.8 percent vs. 23.8 percent, P=0.0281).

Table 4.

Grade category distribution based on multiple-choice examinations and short-answer examinations in the oral and maxillofacial pathology course by grade category (retrospectively corrected) in the general pathology course and academic class

Figure 3.

Distribution of grade category in the oral and maxillofacial pathology course based on multiple-choice examination by academic class and grade category in the general pathology course

Student performance assessed by short-answer examinations.

The cumulative frequency distributions of individual student averages on the short-answer examinations in the oral and maxillofacial pathology course by grade category in the general pathology course are shown in the right-hand column of Figure 2⇑. These cumulative distributions clearly indicated that, within the three general pathology grade categories, students in the SA/MC (2004–05) class and the SA (2007–08) class performed comparably in the oral and maxillofacial pathology course. For students in all general pathology grade categories, the average scores (Table 3⇑) on the short-answer examinations were not significantly different between the SA/MC (2004–05) class and the SA (2007–08) class (80–100 percent grade category, 90.3 vs. 88.9, P=0.2935; 70–79 percent grade category, 81.6 vs. 81.4, P=0.8635; <70 percent grade category, 73.4 vs. 74.2, P=0.6613). The differences in mean scores between the SA/MC (2004–05) class and the SA (2007–08) class were similar in all three grade categories, indicating that there was no Aptitude-Treatment Interaction (academic class by grade category interaction, P=0.5974).

The grade category distributions in the oral and maxillofacial pathology course based on the short-answer examinations are given in Figure 4⇓ and Table 4⇑. There was no significant difference in performance on short-answer examinations in the oral and maxillofacial pathology course between the SA (2007–08) class and the SA/MC (2004–05) class, regardless of student ability as assessed by grade category in the general pathology course.

Figure 4.

Distribution of grade category in the oral and maxillofacial pathology course based on short-answer examination by academic class and grade category in the general pathology course

Discussion

This study has shown that utilization of more challenging short-answer questions, as opposed to cued multiple-choice questions, on examinations in a second-year dental school oral and maxillofacial pathology course was associated with significantly improved student academic performance. Moreover, this improved performance was observed in students of all academic abilities as determined by their scores (80–100 percent, 70–79 percent, <70 percent) in the prerequisite general pathology course, in which multiple-choice questions were used exclusively. Of note, utilization of all short-answer examinations was not associated with any additional improved performance compared to having only half of the examinations comprised of short-answer format questions.

The results of a comparison between the two classes with the most divergent examination question formats are worthy of note. The SA (2007–08) class (all short-answer) significantly outperformed the MC (2006–07) class (all multiple-choice) (class means 82.8 vs. 78.4, P=0.0003, Table 2⇑). It was striking to us that, even though the MC (2006–07) class (all multiple-choice) had the advantage of cueing and was also rewarded for educated guessing, they did not achieve the academic performance of the SA (2007–08) class (all short-answer). Moreover, the students in the SA (2007–08) class were not accustomed to short-answer format questions since nearly all courses in the first two years of our dental school curriculum exclusively utilize multiple-choice examinations.

Our interpretation of the improved performance associated with the utilization of short-answer questions is that the students’ study regimens likely differed from and were more effective than those they would have used if the examinations had comprised only cued multiple-choice questions. The study regimens used to prepare for short-answer examinations also resulted in improved scores on the multiple-choice format questions when this format represented half of the examinations and short-answer format questions the other half (SA/MC, 2004–05 class) compared to all multiple-choice format questions (MC, 2006–07 class). These results are consistent with Balch’s finding that, faced with the expectation of short-answer examinations, students were motivated to study differently and more effectively.9 Furthermore, Balch reported that students anticipating a short-answer examination had higher perceived anxiety levels while studying for the examination relative to the anxiety levels of students anticipating a multiple-choice examination; however, the relative perceived anxiety level between the two groups was reversed during the actual examination, which, in fact, was the same multiple-choice examination for both groups.

The results reported here strongly support the use of short-answer format questions. We believe that utilization of other types of examination questions, such as essay format questions, which require even more integration of information than short-answer questions, would further motivate students to modify their study regimens and result in an even better comprehension of the subject matter. Our results using half short-answer questions suggest that only a fraction of the examination questions would need to be essay format in order to achieve such benefits.

Previously, we documented in the same oral and maxillofacial pathology course the beneficial effects on student performance of prospective (with the students’ awareness) implementation of correction for guessing on multiple-choice questions (2005–06 class). The examination format in the 2005–06 class was identical to that in the SA/MC (2004–05) class—that is, one-half of the examination questions were multiple-choice questions and the other half short-answer questions.6 Prospectively implemented correction for guessing was associated with an additional improvement in academic performance that was over and above the improvement associated with the use of short-answer questions reported here. However, in marked contrast to the current study, the improvement was seen only in those students whose previous academic performance in the general pathology course placed them in the lowest part of the grade distribution;6 furthermore, the improved performance by these students was not only realized on the multiple-choice examinations but on the short-answer examinations as well. Because the correction for guessing only changed performance in a group of underachieving students, we interpreted this observation as representing an Aptitude-Treatment Interaction.5 Thus, we believe that the reasons underlying the overall improved student performance seen in the present study by inclusion of short-answer questions on examinations were different from the improved performance attained by the underachieving students after imposition of correction for guessing of multiple-choice questions.6

Finally, we believe it is important to draw a clear distinction between using examinations solely to evaluate students’ academic achievement and using them as a vehicle to challenge students to learn more effectively. Our study has shown that short-answer examinations created a more challenging learning environment and motivated students to adopt more effective study regimens.

Acknowledgments

The authors gratefully acknowledge Ms. Belen Ballesteros for her excellent management of the database of student test scores that were used in this study.

Footnotes

  • Dr. Pinckard is Professor, Department of Pathology; Dr. McMahan is Professor, Department of Pathology; Dr. Prihoda is Associate Professor, Department of Pathology; Dr. Littlefield is Director, Academic Center for Excellence in Teaching; and Dr. Jones is Professor, Department of Pathology—all at the University of Texas Health Science Center at San Antonio. Drs. Pinckard and McMahan contributed equally to this study. Direct correspondence and requests for reprints to Dr. Anne Cale Jones, Department of Pathology, University of Texas Health Science Center at San Antonio, 7703 Floyd Curl Drive, San Antonio, TX 78229-3900; 210-567-4122 phone; 210-567-2303 fax; jonesac@uthscsa.edu.

REFERENCES

  1. Diamond J, Evans W. The correction for guessing. Rev Educ Res 1973;43:181–91.
  2. Rogers HJ. Guessing in multiple choice tests. In: Masters GN, Keeves JP, eds. Advances in measurement in educational research and assessment. New York: Pergamon, 1999.
  3. Muijtjens AMM, van Mameren H, Hoogenboom RJI, Evers JLH, van der Vleuten CPM. The effect of a “don’t know” option on test scores: number-right and formula scoring compared. Med Educ 1999;33:267–75.
  4. Prihoda TJ, Pinckard RN, McMahan CA, Jones AC. Correcting for guessing increases validity in multiple-choice examinations in an oral and maxillofacial pathology course. J Dent Educ 2006;70(4):378–86.
  5. Snow R. Aptitude-Treatment Interaction as a framework for research on individual differences in learning. In: Ackerman P, Sternberg RJ, Glaser R, eds. Learning and individual differences. New York: W.H. Freeman, 1989.
  6. Prihoda TJ, Pinckard RN, McMahan CA, Littlefield JH, Jones AC. Prospective implementation of correction for guessing in oral and maxillofacial pathology multiple-choice examinations: did student performance improve? J Dent Educ 2008;72(10):1149–59.
  7. Lord FM. Formula scoring and number-right scoring. J Educ Meas 1975;12:7–12.
  8. Snedecor GW, Cochran WG. Statistical methods. Ames: Iowa State University Press, 1967.
  9. Balch WR. Effects of test expectation on multiple-choice performance and subjective ratings. Teaching Psychol 2007;34:219–25.