Limitations of physician ratings in the assessment of student clinical performance in an obstetrics and gynecology clerkship

Research output: Contribution to journal › Review article

36 Citations (Scopus)

Abstract

A review of the grading process used in an obstetrics and gynecology clerkship prompted an analysis of physician ratings of student clinical performance. The study assessed the following: 1) the degree to which raters distinguished among six categories of performance, 2) the concordance among raters in terms of evaluation criteria used, 3) the degree of inter-rater agreement, and 4) the relationship between the ratings and student performance on the National Board of Medical Examiners Subject Examination in obstetrics and gynecology. Data from physician ratings and examination scores of 82 students were analyzed. The physicians received no standardized training in using the evaluation form. Seven raters (three faculty, four residents [one per postgraduate year]) were randomly selected from each student’s set of evaluations. Contrary to expectations, physicians made global ratings of the students using a single criterion and did not distinguish among the six evaluation categories. Each of the five groups of physicians used a different criterion to evaluate students. The ratings were inflated and suffered from low inter-rater reliability. First-year residents were more lenient in their grading tendencies than the other physician groups. The ratings showed a weak correlation with the examination scores, and the strength of this relationship varied with the physician group and performance category. These findings and supporting literature should remind clerkship directors to periodically check the quality of clinical performance ratings and to recognize the limitations of these ratings for grading purposes. Suggestions are presented for improving the student evaluation process.

Original language: English (US)
Pages (from-to): 136-141
Number of pages: 6
Journal: Obstetrics and Gynecology
Volume: 78
Issue number: 1
State: Published - Jan 1 1991
Externally published: Yes


All Science Journal Classification (ASJC) codes

  • Medicine(all)

Cite this

@article{afc252d3cf294a69b78a39d0a53ed942,
title = "Limitations of physician ratings in the assessment of student clinical performance in an obstetrics and gynecology clerkship",
abstract = "A review of the grading process used in an obstetrics and gynecology clerkship prompted an analysis of physician ratings of student clinical performance. The study assessed the following: 1) the degree to which raters distinguished among six categories of performance, 2) the concordance among raters in terms of evaluation criteria used, 3) the degree of inter-rater agreement, and 4) the relationship between the ratings and student performance on the National Board of Medical Examiners Subject Examination in obstetrics and gynecology. Data from physician ratings and examination scores of 82 students were analyzed. The physicians received no standardized training in using the evaluation form. Seven raters (three faculty, four residents [one per postgraduate year]) were randomly selected from each student’s set of evaluations. Contrary to expectations, physicians made global ratings of the students using a single criterion and did not distinguish among the six evaluation categories. Each of the five groups of physicians used a different criterion to evaluate students. The ratings were inflated and suffered from low inter-rater reliability. First-year residents were more lenient in their grading tendencies than the other physician groups. The ratings showed a weak correlation with the examination scores, and the strength of this relationship varied with the physician group and performance category. These findings and supporting literature should remind clerkship directors to periodically check the quality of clinical performance ratings and to recognize the limitations of these ratings for grading purposes. Suggestions are presented for improving the student evaluation process.",
author = "William Metheny",
year = "1991",
month = "1",
day = "1",
language = "English (US)",
volume = "78",
pages = "136--141",
journal = "Obstetrics and Gynecology",
issn = "0029-7844",
publisher = "Lippincott Williams and Wilkins",
number = "1",
}

TY - JOUR

T1 - Limitations of physician ratings in the assessment of student clinical performance in an obstetrics and gynecology clerkship

AU - Metheny, William

PY - 1991/1/1

Y1 - 1991/1/1

N2 - A review of the grading process used in an obstetrics and gynecology clerkship prompted an analysis of physician ratings of student clinical performance. The study assessed the following: 1) the degree to which raters distinguished among six categories of performance, 2) the concordance among raters in terms of evaluation criteria used, 3) the degree of inter-rater agreement, and 4) the relationship between the ratings and student performance on the National Board of Medical Examiners Subject Examination in obstetrics and gynecology. Data from physician ratings and examination scores of 82 students were analyzed. The physicians received no standardized training in using the evaluation form. Seven raters (three faculty, four residents [one per postgraduate year]) were randomly selected from each student’s set of evaluations. Contrary to expectations, physicians made global ratings of the students using a single criterion and did not distinguish among the six evaluation categories. Each of the five groups of physicians used a different criterion to evaluate students. The ratings were inflated and suffered from low inter-rater reliability. First-year residents were more lenient in their grading tendencies than the other physician groups. The ratings showed a weak correlation with the examination scores, and the strength of this relationship varied with the physician group and performance category. These findings and supporting literature should remind clerkship directors to periodically check the quality of clinical performance ratings and to recognize the limitations of these ratings for grading purposes. Suggestions are presented for improving the student evaluation process.

AB - A review of the grading process used in an obstetrics and gynecology clerkship prompted an analysis of physician ratings of student clinical performance. The study assessed the following: 1) the degree to which raters distinguished among six categories of performance, 2) the concordance among raters in terms of evaluation criteria used, 3) the degree of inter-rater agreement, and 4) the relationship between the ratings and student performance on the National Board of Medical Examiners Subject Examination in obstetrics and gynecology. Data from physician ratings and examination scores of 82 students were analyzed. The physicians received no standardized training in using the evaluation form. Seven raters (three faculty, four residents [one per postgraduate year]) were randomly selected from each student’s set of evaluations. Contrary to expectations, physicians made global ratings of the students using a single criterion and did not distinguish among the six evaluation categories. Each of the five groups of physicians used a different criterion to evaluate students. The ratings were inflated and suffered from low inter-rater reliability. First-year residents were more lenient in their grading tendencies than the other physician groups. The ratings showed a weak correlation with the examination scores, and the strength of this relationship varied with the physician group and performance category. These findings and supporting literature should remind clerkship directors to periodically check the quality of clinical performance ratings and to recognize the limitations of these ratings for grading purposes. Suggestions are presented for improving the student evaluation process.

UR - http://www.scopus.com/inward/record.url?scp=0025906667&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0025906667&partnerID=8YFLogxK

M3 - Review article

C2 - 2047054

AN - SCOPUS:0025906667

VL - 78

SP - 136

EP - 141

JO - Obstetrics and Gynecology

JF - Obstetrics and Gynecology

SN - 0029-7844

IS - 1

ER -