

Can performance in medical school predict performance in residency? A compilation and review of correlative studies

Katya L. Harfmann, MD, and Matthew J. Zirwas, MD

Columbus, Ohio

From the Ohio State University.

Funding sources: None.

Conflicts of interest: None declared.

Accepted for publication July 16, 2010.

Reprint requests: Matthew J. Zirwas, MD, the Ohio State University, 540 Officenter Plaza, Suite 240, Gahanna, OH 43230. E-mail: [email protected].

Published online May 25, 2011.

0190-9622/$36.00

© 2010 by the American Academy of Dermatology, Inc.

doi:10.1016/j.jaad.2010.07.034

Background: The current resident selection process relies heavily on medical student performance, with the assumption that analysis of this performance will aid in the selection of successful residents. Although there is abundant literature analyzing medical student performance measures as predictors of success in residency, wide-ranging differences in beliefs persist concerning their validity.

Objective: We sought to collect and review studies that have correlated medical student performance with residency performance.

Methods: The English-language literature from 1996 to 2009 was searched with PubMed. Selected studies evaluated medical students on the basis of US Medical Licensing Examination scores, preclinical and clinical performance, research experience, objective structured clinical examination performance, medical school factors, honor society membership, Medical Student Performance Evaluations, letters of recommendation, and faculty interviews. Outcome measures were standardized residency examinations and residency supervisor ratings.

Results: The medical student factors that correlated most strongly with performance on examinations in residency were medical student examination scores, clinical performance, and honor society membership. Those that correlated most strongly with supervisor ratings were clinical grades, faculty interview, and medical school attended. Overall, there were inconsistent results for most performance measures.

Limitations: In addition to the lack of a widely used measure of success in residency, most studies were small, single institution, and single specialty, and thus of limited ability to generalize findings.

Conclusion: No one medical student factor can be used to predict performance in residency. There is a need for a more consistent and systematic approach to determining predictors of success in residency. (J Am Acad Dermatol 2011;65:1010-22.)

Key words: assessment; medical student; performance; residency; resident selection.

Abbreviations used:

ACGME: Accreditation Council for Graduate Medical Education
AOA: Alpha Omega Alpha
GPA: grade-point average
ITE: in-training examination
LOR: letters of recommendation
MSPE: Medical Student Performance Evaluation
OSCE: objective structured clinical examination
PD: program director
USMLE: US Medical Licensing Examination

Every year, residency programs embark on a process of evaluating and selecting applicants. Although programs vary in the qualities they value, the general goal of applicant selection is to select individuals who will meet or exceed the expectations placed on them during residency and who will have successful careers after residency training.


J AM ACAD DERMATOL, VOLUME 65, NUMBER 5, NOVEMBER 2011

CAPSULE SUMMARY

- Residency program directors rely on medical student performance measures to aid in the selection of successful residents.

- It is unknown whether medical student performance measures can predict success in residency.

- Competitive specialties such as dermatology are faced with greater challenges in resident selection as a result of the increased number of applicants with outstanding academic records.

- This article reviews studies done to this end to assist program directors with resident selection.

There are two widely accepted domains under which student and physician characteristics can be classified: cognitive and noncognitive. Cognitive traits include memory and reasoning, and these traits are commonly objectively measured in both medical school and residency; examples include medical school examinations, US Medical Licensing Examination (USMLE) step examinations, and residency in-training examinations (ITEs). Noncognitive traits encompass a number of subjective measures, such as attitude, motivation, interpersonal skills, and "emotional intelligence," defined by Martinez1 as the "array of noncognitive skills, capabilities, and competencies that influence a person's ability to cope with environmental demands and pressures." A recent study found that most orthopedic program directors (PDs) thought that noncognitive factors were very important in selecting residency candidates, but these traits are difficult to assess when reviewing medical students.2,3

In the 2008 Match, PDs were asked to rate the importance of a number of objective and subjective criteria used to assess applicants (Table I).4

Competitive specialties are often forced to select applicants to interview from a large pool of highly qualified applicants; dermatology had 1.7 applicants per position in the 2009 Match, second only to plastic surgery with 1.9 applicants per position.5

Despite the plethora of data available about applicants, however, only half of plastic surgery PDs were "somewhat satisfied" with the selection process, and nearly one quarter were "less than satisfied."6

We undertook this review to determine if the current processes and criteria by which medical students are assessed are predictors of, or are associated with, future performance as residents. Our secondary goal was to provide practical guidance for those charged with assessing applicants and selecting residents, particularly in fields that must select residents from a highly qualified pool of applicants.

METHODS

We searched the English-language literature about resident selection for 1996 to March 2009 with PubMed. The initial year was selected based on the implementation of the USMLE sequence in 1994.7 The PubMed search was conducted by combining the following terms in a key word search: "residen*," "student," and "selection." The abstracts of the resultant list were reviewed for relevance, and additional articles were selected from references and the "related article" function in PubMed. Studies were chosen for review based on several inclusion criteria. We included studies that: (1) had medical students as study subjects; (2) contained measures of medical student performance as the independent variable; and (3) contained performance measures of the same medical students during residency as the dependent variable. Both prospective studies following medical students into residency programs and retrospective studies analyzing prior medical student performance of current residents were included.

DEFINING SUCCESS IN RESIDENCY

One key prerequisite in determining if a given factor is associated with residency performance is a reliable, universally accepted measure of resident performance. The definition of a successful resident, however, and the criteria assessed differ widely among individuals, specialties, and programs (Table II; available at http://www.eblue.org). Some propose that a successful resident is one who works independently, functions well under stress, and exercises sound judgment.8 Other criteria include patient rapport, work ethic, maturity, perceived knowledge, integrity, leadership ability, use of literature, judgment, acquisition of teaching and research awards, or specialty-specific technical skills.9-14

In response to the absence of a reliable assessment tool for residency performance, Durning et al15 created and validated an overall performance assessment tool for PDs to evaluate resident performance. This tool was validated via 1247 evaluations of trainees in the full spectrum of disciplines, all of whom had graduated from a single medical school. In addition to having a high degree of internal consistency, the ratings of the 18 cognitive and noncognitive items collapsed into two domains: professionalism and expertise, roughly analogous to noncognitive and cognitive traits, respectively.
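Internal consistency of a multi-item rating form of this kind is conventionally quantified with Cronbach's alpha, which compares the variance of the individual items with the variance of the total score. The sketch below is purely illustrative: the `cronbach_alpha` helper and the 6-evaluation, 4-item `demo` matrix are invented and are not data from Durning et al.

```python
from statistics import variance  # sample variance (n - 1 denominator)

def cronbach_alpha(ratings):
    """Cronbach's alpha for a list of evaluations, each a list of item scores.

    alpha = k/(k-1) * (1 - sum(per-item variances) / variance(total scores))
    """
    k = len(ratings[0])  # number of items on the form
    item_vars = [variance([row[i] for row in ratings]) for i in range(k)]
    total_var = variance([sum(row) for row in ratings])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Hypothetical ratings: 6 evaluations x 4 items on a 1-5 scale, where the
# items tend to rise and fall together, so alpha should come out high.
demo = [
    [4, 5, 4, 5],
    [2, 2, 3, 2],
    [5, 5, 5, 4],
    [3, 3, 3, 3],
    [1, 2, 1, 2],
    [4, 4, 5, 5],
]
print(round(cronbach_alpha(demo), 2))  # -> 0.96
```

Values near 1 indicate that the items are measuring a common underlying construct, which is what "a high degree of internal consistency" asserts about the rating form.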


Table I. Variation in mean importance ratings of factors by program directors in applicant ranking in the 2008 Match

Factor / Rating* (1-5)

Interpersonal skills exhibited during interview: 4.3
Interactions with faculty during interview: 4.3
Professional attributes exhibited during interview: 4.2
Interactions with housestaff during interview: 4.2
Perceived commitment to specialty: 4.1
Grades in clerkship in desired specialty: 4.0
Letters of recommendation from clerkship director in specialty: 3.9
Grades in required clerkships: 3.8
Honors in clerkship in desired specialty: 3.8
Medical Student Performance Evaluation: 3.8
Honors in basic sciences: 3.8
USMLE step 1 score: 3.8
Letters of recommendation from colleague or chair of specialty: 3.8
Leadership qualities exhibited during interview: 3.8
Honors in clinical clerkships: 3.7
Class ranking/quartile: 3.7
Consistency of grades: 3.6
USMLE step 2 score: 3.6
Personal prior knowledge of the applicant: 3.6
Personal statement: 3.3
Graduate of highly regarded US medical school: 3.3
Alpha Omega Alpha membership: 3.2
Demonstrated involvement in research: 2.8
Interest in academic career: 2.6

USMLE, US Medical Licensing Examination.
*Ratings on scale from 1 (not at all important) to 5 (very important).


A similar study found that PDs' ratings of 24 items were best explained by 5 factors: interpersonal communication, clinical skills, population-based health care, record-keeping skills, and critical appraisal skills.16 These factors had excellent internal consistency reliability, meaning that evaluators were consistent in their evaluations of particular traits in individual residents. In short, although most faculty agree that they know a good resident when they see one and vice versa, there is no general consensus on objective measures that correlate with this gestalt method.

PERFORMANCE ON USMLE STEPS 1 AND 2

Residency programs rely on USMLE scores as objective and consistent measures of medical student cognitive performance that provide an easy method for comparison of applicants from different medical schools. PDs in the 2008 Match used step 1 scores to a greater extent than step 2 scores in interview selection: 82% compared with 70%, respectively.4 The importance of USMLE scores tends to be greater in competitive specialties; dermatology applicants typically have some of the highest USMLE scores, with a median step 1 score second only to plastic surgery, and the highest median step 2 score in the 2009 Match.5 Considering the higher than average scores of applicants and the ever-increasing number of applications to competitive specialties, "very satisfied" plastic surgery PDs were significantly more likely to use a minimum step 1 cut-off score than their less satisfied colleagues.6 In addition to their known use as a screening tool in certain programs and specialties, USMLE step 1 scores have been shown to correlate with the final National Resident Matching Program rank list, indicating either continued use in later stages of the Match, or true correlation with the strength and competence of the applicants.17

When PDs evaluated residents using the form of Durning et al,15 both steps 1 and 2 correlated significantly with the expertise domain but failed to correlate with the professionalism domain, suggesting that the USMLE may only predict perceived cognitive competence. Indeed, studies of residents in multiple specialties have shown a positive correlation between USMLE scores and ITE or board examination scores, with the exception of two studies.9-11,14,15,18-25 The evidence correlating performance on the USMLE with subjective resident performance is much less conclusive. Most studies failed to find significant correlations with noncognitive assessments during residency such as faculty evaluations and standardized patient encounters.9-11,14,19,26 Furthermore, two studies found significant negative relationships between USMLE scores and subsequent noncognitive performance, measured through faculty evaluations of the Accreditation Council for Graduate Medical Education (ACGME) competencies and professionalism ratings.27,28

There are studies, however, that have demonstrated positive correlations between USMLE scores and subjective performance. Two of these studies followed medical students into residency and found positive correlations between USMLE scores and faculty evaluations.16,29 Taylor et al29 analyzed steps 1 and 2, and the step 2 clinical skills prototype, and found that the examinations were significant predictors of the intern's quartile ranking, average ACGME competency score, and interpersonal competency score. Studies found that, in addition to positive correlations with individual scores, the relationship persisted when medical students were divided into thirds according to their USMLE scores.16,30



Summary

Overall, the evidence linking medical student performance on USMLE steps 1 and 2 with resident performance is inconsistent. High performance on examinations at one level of training likely predicts high performance on examinations at future levels. When attempts are made to correlate the USMLE with noncognitive performance measures, the results are unclear and unpredictable. Studies originating from single residency programs that examined USMLE scores of their residents tended not to find a correlation between the objective scores and subjective performance, whereas studies originating from medical schools that examined entire classes of graduates did tend to find positive correlations. One hypothesis is that a given residency program is composed of a relatively homogeneous group of trainees with relatively similar characteristics and a narrow range of USMLE scores. Entire medical school graduating classes represent a more heterogeneous population that will pursue multiple specialties, and it is with these more diverse groups that relationships between USMLE scores and subjective resident performance may be revealed.
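The single-program versus whole-class pattern described above is consistent with the statistical phenomenon of range restriction: admitting only applicants above a score threshold shrinks the score variance within a program, which mechanically attenuates any observed correlation with later performance. The short simulation below illustrates the effect; every number in it (the score distribution, the effect size, the 245 cut-off) is invented for illustration and is not taken from the reviewed studies.

```python
import random

random.seed(0)  # deterministic illustration

# Synthetic graduating class: an examination score and a later performance
# rating that genuinely depends on the score plus independent noise.
n = 5000
scores = [random.gauss(230, 15) for _ in range(n)]
performance = [0.04 * s + random.gauss(0, 1) for s in scores]

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

# Correlation across the heterogeneous, class-wide sample...
full_r = pearson(scores, performance)

# ...versus within a hypothetical program admitting only scores above 245.
selected = [(s, p) for s, p in zip(scores, performance) if s > 245]
sel_r = pearson([s for s, _ in selected], [p for _, p in selected])

print(f"class-wide r = {full_r:.2f}, within-program r = {sel_r:.2f}")
```

In this sketch the within-program correlation comes out markedly smaller than the class-wide one even though the underlying score-performance relationship is identical for every trainee, which is the pattern the single-program studies report.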

PERFORMANCE IN PRECLINICAL COURSEWORK

Medical student performance in the preclinical years is another objective measure of student performance assessed during the application process. Unlike the USMLE, preclinical coursework and grading schemes vary greatly from school to school. Despite this lack of comparability, nearly half of PDs in the 2008 Match cited achievement of "honors" in the basic sciences as a factor in interview selection of residency candidates.4

Only one study compared preclinical grade-point average (GPA) with subsequent performance on examinations during residency. Honors or "A grades" in preclinical courses were significantly related to success on board examinations, despite only a sporadic relationship with ITE scores.19 Regarding correlations with subjective performance measures, some studies found low positive correlations between preclinical GPAs and residency supervisor ratings.19,31 Two studies following medical students into residency, however, found significant correlations with residency director ratings, concluding that overall 4-year GPA was the best predictor of the intern's subsequent performance.16,29 In another study, lower performance on any academic marker was associated with an increased risk of poor knowledge and professionalism ratings.28 Like the USMLE score data, significant correlations remained when GPA was sectioned into thirds.16,30

Summary

The findings demonstrate a similar pattern to that seen with USMLE scores, likely because both measures assess competence in the cognitive domain. Again, studies emerging from medical schools that examined residency performance of graduates in multiple specialties at multiple institutions were more likely to find significant relationships than studies emerging from single residency programs. This could be for the same reason mentioned when discussing USMLE performance, but could also be largely attributable to the lack of comparability of preclinical GPAs across institutions. Because residency programs are likely composed of trainees from multiple institutions with different preclinical grading schemes, attempts at drawing correlations between resident performance and preclinical GPAs are unlikely to be useful. On the other hand, preclinical performance of a medical school class from a single institution is consistently measured and is more likely to be a valid tool for comparison of students from that institution.

PERFORMANCE ON CLINICAL ROTATIONS

Medical student performance during the clinical years is widely used in resident selection, with 71% of PDs in the 2008 Match citing it as a factor in interview selection.4 Grades on rotations are often determined through a combination of examinations and supervisor evaluations. The weight of these cognitive and noncognitive measures in determining the final clerkship grade differs among and within institutions.32 There are also many grading schemes, causing difficulties in comparing students from different schools. Furthermore, certain clerkships and medical school factors, such as geographic location and source of funding, have been shown to impact the distribution of grades among students.33

Subjective evaluations introduce the "halo effect," whereby a single strong positive or negative characteristic about an individual affects perception of unrelated characteristics, and subsequent differences in evaluations based on student characteristics and demographics.34-36 The order of clerkships during the clinical years also impacts the final grade.37 Despite these confounding factors, 63% of PDs value the designation of clinical honors in residency candidates.4

The PD evaluation form of Durning et al15 found clinical GPA to correlate with both the expertise and professionalism domains, suggesting that clinical grades reflect both cognitive and noncognitive performance. Accordingly, when the evaluation form was used, clinical GPA could predict subsequent knowledge ratings and was the only independent predictor of professionalism ratings.28 Other studies comparing clerkship performance with residency supervisor ratings have had varying results. Several studies found positive correlations between the two performance measures; notably, Dirschl et al38 determined that the number of clinical clerkship honors in medical school was the best predictor of overall resident performance.25,31 Two prospective studies of medical school graduates found, again, that clinical GPA correlated with PD ratings of overall performance.16,30 There are studies, however, that failed to find a relationship between clinical GPA and noncognitive performance measures, including one study that found a negative correlation.10,19,27

When clerkship performance is compared with cognitive performance in residency, the results are more consistent. Multiple studies have found that honors grades in clerkships can predict performance on step 3 and a passing board score on the first attempt.25,31 Interestingly, studies that could not find a correlation between clinical GPA and faculty evaluations did find positive correlations with ITE and board examination scores.19,27 However, the particular subject in which a medical student received clinical honors did not seem to matter. For example, Boyse et al19 found that honors in the radiology clerkship did not impact future radiology board scores.

Clinical clerkship scores are often presented with written comments derived from the faculty evaluation, which provide an additional means of assessment of residency applicants. A study that evaluated written comments on clerkship evaluations found that most comments addressed issues of initiative, relationships, patient skills, work habits, self-improvement, and composure.39 Of the evaluations analyzed, most comments were positive. When positive, equivocal, and negative comments were correlated with the nonnarrative portion of the evaluation, positive comments correlated positively with professionalism questions, whereas both negative and equivocal comments correlated negatively. This suggests that equivocal comments should be regarded as negative and taken seriously, considering their rarity in evaluations.

Summary

The findings indicate that clinical clerkship performance holds promise as a predictor of resident success. It is unknown whether written comments are correlated with success in residency. Thorough evaluation of these comments, though, may lessen the confusion created by the various grading systems used during the clinical years. All studies analyzed found that clinical GPA correlated positively with cognitive performance measures. Regarding noncognitive performance measures, the results are promising, albeit less straightforward. Most studies that failed to find a positive correlation between clinical GPA and noncognitive performance measures tended not to examine overall medical student or resident performance, but rather examined specific clerkships in medical school thought to be most relevant and specific rotation evaluations in residency. This could indicate that clerkship success does not necessarily represent mastery or skill in a particular area, but instead demonstrates overall work ethic and professionalism that is best revealed when clerkship performance is examined en masse.

RESEARCH EXPERIENCE

Nearly half of PDs in the 2008 Match cited involvement in research as a factor in interview selection, and dermatology applicants were in the top 4 specialties regarding research experiences, abstracts, posters, presentations, and possession of a PhD in the 2009 Match.4,5 One study found that student research contributed to creating physician-scientists.40 Brothers and Wetherholt,27 however, found that research experience and publications of medical students had a negative correlation with faculty clinical performance ratings during residency. Another study found no significant correlation between research publications and ITE scores.22

Summary

The lack of studies is most likely a result of the inherent difficulties in quantifying involvement in research. It is possible that in selected situations, such as MD/PhD candidates, research experience and demonstrated success are crucial, but the studies that have been done have found either no correlation or a negative correlation with success in residency.

OBJECTIVE STRUCTURED CLINICAL EXAMINATION PERFORMANCE

The objective structured clinical examination (OSCE) was introduced in 1975 to provide a standardized measure of clinical performance.41 It is not consistently used among medical schools, nor are the results specifically reported in applications; therefore, it is not a commonly used item for assessing residency applicants. Its use is increasing, however, in various capacities, including as a means for student assessment in clerkships, as a teaching tool, and as an indicator of potential weaknesses in programs.42-44

In one study, communication skills during the OSCE positively correlated with measures of emotional intelligence.45 Despite this finding of a significant correlation between OSCE performance and noncognitive traits of medical students, studies conflict over whether there is a positive correlation with subsequent intern performance and over the OSCE's ability to identify knowledge gaps.29 Several studies have shown a low correlation between OSCE performance and supervisor ratings during residency, with the interpersonal skills domain having the highest correlation.31

Summary

Most studies assessing the use of the OSCE evaluate its effectiveness in measuring clinical performance and noncognitive traits of medical students, whereas few correlate this clinical performance measure with success in residency. This is likely a result of the inconsistent use of the OSCE among medical schools; reporting of OSCE results is not expected in residency applications and, if present, is not standardized. The findings in the above studies suggest that, despite its rare use as a predictive factor, the OSCE may be helpful in evaluating certain noncognitive traits of medical students that could impact performance in residency.

MEDICAL SCHOOL FACTORS

In addition to characteristics of the medical student, medical school factors are used to evaluate potential residents. Slightly more than half of PDs in the 2008 Match cited the reputation of an applicant's medical school as a factor in interview selection.4 Despite its use by the majority of PDs and the prevalence of applicants to competitive specialties from top institutions (nearly 50% in dermatology), medical school reputation is valued significantly less by the "very satisfied" plastic surgery PDs compared with their less satisfied colleagues.5,6

Few studies have examined the relationship between medical school factors and success in residency. Two studies analyzed found no correlation between medical school rank and subsequent residency performance measures.14,19 An interesting study, however, found that the medical school attended was more consistently predictive of clinical and academic success in residency than USMLE scores, dean's letters, and letters of recommendation (LOR).46 Regarding geographic factors, one study found no correlation with ITE scores, whereas another found that ties to the region were associated with significantly better performance evaluations.19,21

Summary

Results indicating a minimal role are not surprising, as the school attended portrays nothing unique about the work ethic and cognitive achievements of the student that is not demonstrated by the student's achievements during medical school. On the other hand, those studies finding positive correlations may indicate that a given medical school tends to attract a certain type of student or that the overall culture of the medical school tends to favor the development of certain traits that may affect residency performance. Because every residency program is different in terms of the specific traits that are valued and lead to success, this implies that, practically, faculty can best determine the medical school factors that predict success in their programs by drawing on experiences with previous residents from the medical school in question. Taken together, the results suggest that medical school factors are unlikely to play a predictive role for cognitive success in residency, but may be helpful in predicting success in the noncognitive domain by selecting residents from medical schools that have previously produced residents who were successful in the program.

HONOR SOCIETY MEMBERSHIP

Medical student membership in honor societies, particularly Alpha Omega Alpha (AOA), is another factor that has been evaluated for correlation with future success in residency. In the 2008 Match, slightly more than half of PDs cited AOA membership as a factor in interview selection, with 15 plastic surgery PDs listing it as the most important objective criterion.4,47 This is particularly relevant in the evaluation of dermatology applicants, as 51% of successful applicants in 2009 were members of AOA, the highest proportion of any specialty.5

Multiple studies have examined AOA membership and its association with success in residency. Regarding board examination scores, two studies found that significantly more AOA members passed the written board examinations on the first attempt.25,31 The relationship between AOA membership and subsequent ITE scores is less straightforward. Two studies, including one that found a positive correlation with step 3 scores, failed to find a significant difference in ITE scores.22,25 Boyse et al19 found that senior AOA election did not significantly affect ITE scores, and junior AOA election had a significant positive correlation with only 1 year of ITE scores. Two studies examined AOA membership and subsequent faculty ratings. Although one study failed to find a correlation between AOA membership and faculty rankings, another study found a significant relationship.10,14

J AM ACAD DERMATOL, NOVEMBER 2011

Summary

Honor society membership does not have a clear impact on success in residency. In the cognitive domain, AOA membership consistently predicted success on board examinations, but did not consistently relate to ITE scores. Membership in AOA may represent strong studying and test-taking skills that express themselves again during the preparation for board examinations. In contrast, ITEs occur yearly and are often not studied for with the same intensity as other examinations. Therefore, the characteristics of students that lead to election to AOA may not be represented by isolated ITE performance to the same degree as the collective residency knowledge that board examination performance represents.

In the noncognitive domain, the two studies are contradictory, but suggest that AOA membership may predict better faculty rankings during residency. Because AOA membership generally depends on both strong overall performance in preclinical coursework and on clinical rotations, it is not surprising that there would be a correlation with faculty rankings, which are also related to performance on clinical rotations. However, the evidence that performance on clinical rotations is predictive of residency performance is stronger, and it is not clear that AOA membership adds any additional information beyond that obtained by performance on clinical rotations alone.

MEDICAL STUDENT PERFORMANCE EVALUATION

The Medical Student Performance Evaluation (MSPE), formerly known as the dean's letter, was created as an instrument to provide information about the achievements of medical students.48 It has since undergone extensive revisions to improve its consistency and is now a commonly used tool to assess residency applicants, with 76% of PDs citing it as a factor in interview selection.4 The Association of American Medical Colleges established specific guidelines for the preparation of the MSPE, furthering the regularity and reliability of this instrument. The parts perceived by PDs to be predictive of future performance include the academic history summary, academic progress, academic ranking, and comparative clinical performance with classmates; parts perceived to be not predictive include unique characteristics, preclinical comparative performance, professional behaviors and attitudes in comparison with classmates, and the summary statement.49 Despite these perceptions, residency directors admit that they often only read the last paragraph, which contains a summary statement about the applicant's overall medical school performance.50

Appropriately, the studies examining possible relationships between the MSPE and residency performance have largely looked at the summary statement and its inferred class rank. In several studies, MSPE rankings of medical school graduates correlated positively with PD evaluations, although the level of correlation ranged from "closely related" to a "low correlation."31,51 Another study, however, found no significant difference in future evaluations during residency.19 Lurie et al51 also compared the MSPE with peer assessments of medical students and found significant correlations with peer-assessed work habits, but no correlation with interpersonal relations. Concerning the MSPE and its relationship with future cognitive measures, studies have primarily found trends toward better performance on examinations, but no significant relationship between the two factors.19,31,52

Summary

The relationship between the MSPE and performance during residency is unpredictable. Within the cognitive domain, no study found a relationship more remarkable than a trend toward higher performance. This is interesting, given that the factors of the MSPE that PDs perceived as most predictive primarily represented cognitive measures. Regarding noncognitive measures of performance, the results indicate that a positive correlation may exist. The varying degree of relationship is most likely a result of the inherent variability of subjective supervisor evaluations and the differing criteria for measuring performance among evaluators. According to the results of Lurie et al,51 one could surmise that resident performance assessed by faculty who evaluate primarily on the basis of work habits may significantly correlate with the MSPE, whereas performance as judged by faculty who heavily value interpersonal skills may not positively correlate with the MSPE. It is also possible, even likely, that student work ethic does not predict resident work ethic. Depending on the motivation for a student's work ethic in medical school, that work ethic may or may not be transferred to residency. Overall, the MSPE should be recognized partially as a reflection of a student's work ethic during medical school, rather than a predictor of future cognitive success, and it is not clear that this work ethic is directly predictive of work ethic during residency.

LETTERS OF RECOMMENDATION

LOR are intended to provide a unique perspective on a student's strengths and abilities not reflected in other performance measures. They are considered an important element of residency applications, with internal medicine PDs rating them in the top one third of all data received.53 Because there are few

Table III. Recommended use of residency applicant variables in the resident selection process, based on reviewed literature. Each recommendation is followed by its strength of recommendation* and supporting references.

USMLE step 1 and 2 scores
- Should not be used among small groups of applicants to predict noncognitive performance. Strong.9-11,14,19,26
- Can be used among large groups of heterogeneous applicants (ie, entire medical school classes) to predict noncognitive performance. Strong.16,29,30
- Can be used to predict future cognitive performance. Strong.9-11,14,15,18,19,21,22,24,25

Preclinical performance
- Can be used among applicants from the same institution to predict noncognitive performance. Very strong.16,28-30
- Should not be used among applicants from different undergraduate medical schools to predict noncognitive performance. Very strong.19,31
- Should not be used to predict future cognitive performance. Very weak.19

Clinical performance
- Can be used to predict future cognitive performance. Very strong.19,25,27,31
- Equivocal and negative comments in clinical narratives should be taken seriously. Very strong.39
- Clerkship-specific performance should not be used to predict future noncognitive performance. Very strong.10,19
- Overall performance can be used to predict future noncognitive performance. Strong.16,28,30,31,38

Research experience
- Should not be used to predict future cognitive or noncognitive performance. Very strong.22,27

OSCE performance
- Can be used to evaluate noncognitive traits. Moderate.31,45

Medical school factors
- Program directors can draw on previous experiences with graduates from specific schools to predict success in the noncognitive domain. Very strong.46
- Geographic factors should not be used to predict resident performance. Very weak.19,22
- School rank should not be used to predict resident performance. Very weak.14,19

Honor society membership
- Can be used to predict future board examination performance. Very strong.19,31
- Should not be used to predict future noncognitive performance. Very weak.10,12,19

MSPE
- Should not be used to predict future cognitive performance. Strong.19,51
- Can be used to predict future noncognitive performance, particularly regarding work ethic. Weak.31,51

LOR
- Can be used to predict future noncognitive performance if evaluation is standardized and objectified. Very strong.27
- Overall strength of the LOR, as determined subjectively by reviewers, should not be used to predict future cognitive or noncognitive performance. Moderate.19,53,55

Faculty interviews
- Interviewers should be blinded to cognitive applicant data to more accurately assess noncognitive traits. Very strong.17,57-59
- Can be used to predict future noncognitive performance. Very strong.10,27

LOR, Letters of recommendation; OSCE, objective structured clinical examination; MSPE, Medical Student Performance Evaluation; USMLE, US Medical Licensing Examination.

*Very strong = all studies support recommendation; Strong = almost all studies support recommendation, at least one study equivocal or contrary; Moderate = combination of studies supporting and not supporting recommendation, one side clearly stronger; Weak = combination of studies supporting and not supporting recommendation, one side somewhat stronger; Very weak = studies supporting and not supporting, largely inconclusive.

J AM ACAD DERMATOL, VOLUME 65, NUMBER 5

guidelines for LOR, there is substantial variability in content and interpretation. Regarding content, internal medicine PDs revealed that the writer's depth of understanding of the student was the most "essential" component.53 A study of orthopedic PDs, however, revealed that personal knowledge of the letter writer was the most desired factor.54 Despite clear preferences concerning the content of the LOR, most PDs thought that the content was not meant to be used to predict future performance.53

Fig 1. Suggested algorithm for selection of residents to a residency program. AOA, Alpha Omega Alpha; LOR, letters of recommendation; MSPE, Medical Student Performance Evaluation; USMLE, US Medical Licensing Examination.

Several studies have analyzed LOR and their ability to predict success in multiple domains during residency. Two studies found no relationship to subsequent residency faculty evaluations. Regarding future examinations, they found either no correlation or only an insignificant trend toward higher scores.19,38 A study by Dirschl and Adams55 found only slight interobserver reliability when the letters were evaluated on content alone. Residents deemed "superior," however, tended to have more "outstanding" LOR.

Two studies did demonstrate a significant positive correlation between LOR and performance during residency, however. Using a standardized evaluation form with numeric scores, Brothers and Wetherholt27 found that LOR positively correlated with resident core competency evaluations and were more predictive of resident ratings than GPA, USMLE performance, or research experience. Another study found that candidates with an "exceptional" trait were more likely to rank in the highest one third of faculty rankings.14

Summary

These results do not indicate a reliable relationship between LOR and success in residency. Aside from the finding that residents with a subjectively determined "exceptional" trait perform better in residency, there is little evidence to suggest that LOR can predict future performance. Although strong LOR may be more common in high-performing residents, residents of all levels have both good and bad LOR.

Fig 1. Continued.

Much of the difficulty with using LOR to predict future performance is based on the high degree of interobserver variability in evaluating LOR. The Brothers and Wetherholt27 study implies that the introduction of a standardized evaluation form may be beneficial in reducing this variation and improving predictive value. Another contributing factor to the poor predictive quality is the manner in which LOR are requested by students. Faculty who write letters on students' behalf are handpicked by the students, with the assumption that the chosen author will write a strong recommendation. With this in mind, there may be some benefit in looking beyond the content of the letter to the author of the letter. PDs who are familiar with specific authors of LOR (ie, having reviewed many of the author's letters for multiple applicants) may be able to discern small differences in letters indicative of a large variation in performance. The approach could be similar to the approach to clinical narratives, where certain comments are regarded as negative, despite their equivocal or unenthusiastically positive nature.

Thus, the role that LOR should play in the application process is unclear. They have not been shown to be predictors of performance. The type of information that should be included in the letters is variable, and different assessors reading the same letter often interpret the letter differently. With these limitations, it is difficult to argue that LOR, in their current form, should play a meaningful role in the residency application process.

FACULTY INTERVIEWS

As mentioned in the introduction and illustrated in Table I, PDs consider the interview to be one of the most important factors during the residency application process.4 Only a small majority of plastic surgery PDs, however, agreed that interview performance is indicative of residency performance.6 Multiple studies have shown the interview to be the most important factor in the determination of the final National Resident Matching Program rank for individual programs.17,56

There are many different formats for interviews, with variations in the number of interviews, interviewers, length, content, and factors evaluated. In addition, interviews often differ in their degree of blinding. Three studies found that interview scores were significantly affected depending on whether or not interviewers were provided with applicant USMLE scores before the interview, with higher USMLE scores leading to higher interview scores.17,57,58 Hauge et al59 found that blinded interviewers tended to rate known candidates higher than unknown candidates, a tendency that was reversed with open-file interviews.

Few studies have directly examined the predictive value of the interview on success in residency. Brothers and Wetherholt27 found that personal characteristics observed during the interview were more predictive of core competency ratings than GPA, USMLE scores, and research experience. This score was also able to predict residents who were eventually identified as "cause for concern." Another study found a weak, but significant, correlation between the residency interview and future faculty ratings.10

Summary

These results are encouraging and indicate that noncognitive traits assessed during the interview may predict positive faculty evaluations during residency. The studies comparing open- and closed-file interviews show that both interrater and intrarater variability exist with both forms. Higher interview scores were largely attributable to better performance on the USMLE. The finding that, when interviewers were blinded, known candidates had higher interview scores suggests that the snapshot of a candidate's personality provided by the interview does not change positive opinions that have developed over multiple interactions.

These results suggest that the residency interview, conducted in an appropriate manner, may provide helpful insight to predict success in residency, particularly in the noncognitive domain. Because interpersonal skills and interactions with faculty are considered the most important elements of the interview, the format of the interview should be tailored to best reflect and promote their evaluation and to limit the impact of factors that are assessed at other points in the application process. Interviews conducted by faculty without access to indicators of academic performance are more likely to give a specific assessment of interpersonal qualities than interviews in which faculty are aware of applicant academic performance, as academic performance clearly influences the interview score when available.

CONCLUSION

This review was undertaken to determine whether the current criteria used for resident selection are associated with success in residency. To this end, the studies included in the review were able to provide a few criteria that correlated strongly with residency success, but did not show consistent results for most criteria. How can criteria that are used consistently by medical schools across the country be such poor predictors of success in residency? Several factors could be responsible for the paucity of correlations and associations found: success in medical school may not be related to success in residency, current assessments of medical student performance may not be the appropriate assessments for predicting success in residency, the study designs may not be able to reveal true correlations that exist, or it may be a combination of the above. Based on experience and interpretation of the literature, the authors believe that the most likely explanation for the lack of strong predictors is that current assessments of medical students are not designed or intended to predict success in residency; they are designed and intended to measure medical school performance. Perhaps the characteristics that lead to success in medical school are not identical to the characteristics that lead to success in residency.

The lack of a reliable, validated, widely used outcome measure for success in residency, and the accompanying difficulty of predicting the ill-defined "successful resident," is another major issue. Study design likely plays a large role; as seen with several of the criteria assessed, simple changes in study population could change what was no correlation into a significant positive correlation. Most of the studies used were also small, single-institution, single-specialty, retrospective studies with limited ability to apply findings outside their narrow focus of study. Despite these limitations, several recommendations can be made concerning use of the discussed criteria in the resident selection process (Table III).

If most individual factors cannot reliably be used to predict success in residency, how should residency programs select residents who will be successful? A suggested algorithm that incorporates program-specific goals with the previous recommendations is presented in Fig 1. Perhaps the best way is to examine the applicant as a whole, rather than dissecting apart the individual components of the application. Two studies have created program-specific scoring systems for applicants that use many of the studied variables in proportions that correspond with previously determined correlations with residency performance in that particular program.60,61 These studies have allowed the determination of appropriate screening tools and predictive measures for success in their respective residency programs. This approach could be promising for many residency programs, as the definition of success and the type of resident valued differ among

programs and specialties. An even better approach, however, likely consists of multi-institutional, longitudinal studies of medical school graduates and their subsequent performance in residency. This perspective best represents the heterogeneous view of applicants that residency programs have when evaluating applications and would allow widespread application of results. Establishment of the qualities of medical students predictive of success in specific outcome measures in residency would allow program-specific determination of the ideal residency candidate, properly equipped with the necessary skills and abilities to succeed in residency.
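The program-specific scoring systems described above combine applicant variables as a weighted composite, with weights proportional to each variable's previously observed correlation with residency performance in that program. A minimal sketch of this idea follows; all variable names and weights are illustrative assumptions, not values taken from the cited studies, and each program would derive its own weights from its previous residents' outcomes.

```python
# Hypothetical sketch of a program-specific composite scoring tool.
# The weights below are illustrative assumptions only; in practice a
# program would set them in proportion to correlations observed
# between each variable and its own residents' past performance.

ILLUSTRATIVE_WEIGHTS = {
    "clinical_performance": 0.35,  # clinical grades: strong reviewed predictor
    "interview_score": 0.30,       # noncognitive predictor
    "usmle_percentile": 0.20,      # cognitive predictor
    "honor_society": 0.10,         # board examination predictor
    "mspe_rank": 0.05,             # weaker predictor per the review
}

def composite_score(applicant: dict, weights: dict = ILLUSTRATIVE_WEIGHTS) -> float:
    """Weighted sum of applicant measures, each normalized to a 0-100 scale."""
    return sum(weights[k] * applicant[k] for k in weights)

# Example applicant with hypothetical normalized measures.
applicant = {
    "clinical_performance": 90.0,
    "interview_score": 85.0,
    "usmle_percentile": 70.0,
    "honor_society": 100.0,  # eg, 100 if AOA member, else 0
    "mspe_rank": 60.0,
}
print(round(composite_score(applicant), 1))
```

Ranking all applicants by such a composite, then auditing how well past composites predicted resident outcomes, is the essence of the screening-tool validation the two cited studies performed within their own programs.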

REFERENCES
1. Martinez MN. The smarts that count. HR Magazine 1997;42:72-8.
2. Evarts CM. Resident selection: a key to the future of orthopedics. Clin Orthop Relat Res 2006;449:39-43.
3. Sklar DP, Tandberg D. The relationship between national resident match program rank and perceived performance in an emergency medicine residency. Am J Emerg Med 1996;14:170-2.
4. National Resident Matching Program, Data Release and Research Committee. Results of the 2008 NRMP program director survey. Washington: National Resident Matching Program; December 2008. 144 p. Available from: URL: http://www.nrmp.org/data/programresultsbyspecialty.pdf. Accessed April 9, 2009.
5. National Resident Matching Program, Data Release and Research Committee; Association of American Medical Colleges. Charting outcomes in the match: characteristics of applicants who matched to their preferred specialty in the 2009 main residency match. 3rd ed. Washington: National Resident Matching Program; August 2009. 277 p. Jointly published by the Association of American Medical Colleges. Available from: URL: http://www.nrmp.org/data/chartingoutcomes2009v3.pdf. Accessed March 13, 2010.
6. Janis JE, Hatef DA. Resident selection protocols in plastic surgery: a national survey of plastic surgery program directors. Plast Reconstr Surg 2008;122:1929-39.
7. USMLE Composite Committee, Committee to Evaluate the USMLE Program. Comprehensive review of USMLE: summary of the final report and recommendations. Philadelphia: Federation of State Medical Boards; June 2008. 11 p. Jointly published by the National Board of Medical Examiners. Available from: URL: http://www.usmle.org/general_information/CEUP-Summary-Report-June2008.pdf. Accessed April 4, 2010.
8. Talarico JF, Metro DG, Patel RM, Carney P, Wetmore AL. Emotional intelligence and its correlation to performance as a resident: a preliminary study. J Clin Anesth 2008;20:84-9.
9. Bell JG, Kanellitsas I, Shaffer L. Selection of obstetrics and gynecology residents on the basis of medical school performance. Am J Obstet Gynecol 2002;186:1091-4.
10. Borowitz SM, Saulsbury FT, Wilson WG. Information collected during the residency match process does not predict clinical performance. Arch Pediatr Adolesc Med 2000;154:256-60.
11. Thordarson DB, Ebramzadeh E, Sangiorgio SN, Schnall SB, Patzakis MJ. Resident selection: how are we doing and why? Clin Orthop Relat Res 2007;459:255-9.
12. Andriole DA, Jeffe DB, Whelan AJ. What predicts surgical internship performance? Am J Surg 2004;188:161-4.
13. Metro DG, Talarico JF, Patel RM, Wetmore AL. The resident application process and its correlation to future performance as a resident. Anesth Analg 2005;100:502-5.
14. Daly KA, Levine SC, Adams GL. Predictors for resident success in otolaryngology. J Am Coll Surg 2006;202:649-54.
15. Durning SJ, Pangaro LN, Lawrence LL, Waechter D, McManigle J, Jackson JL. The feasibility, reliability, and validity of a program director's (supervisor's) evaluation form for medical school graduates. Acad Med 2005;80:964-8.
16. Paolo AM, Bonaminio GA. Measuring outcomes of undergraduate medical education: residency directors' ratings of first-year residents. Acad Med 2003;78:90-5.
17. Swanson WS, Harris MC, Master C, Gallagher PR, Mauro AE, Ludwig S. The impact of the interview in pediatric residency selection. Ambul Pediatr 2005;5:216-20.
18. Armstrong A, Alvero R, Nielsen P, Deering S, Robinson R, Frattarelli J, et al. Do US medical licensure examination step 1 scores correlate with council on resident education in obstetrics and gynecology in-training examination scores and American board of obstetrics and gynecology written examination performance? Mil Med 2007;172:640-3.
19. Boyse TD, Patterson SK, Cohan RH, Korobkin M, Fitzgerald JT, Oh MS, et al. Does medical student performance predict radiology resident performance? Acad Radiol 2002;9:437-45.
20. Gunderman RB, Jackson VP. Are NBME examination scores useful in selecting radiology residency candidates? Acad Radiol 2000;7:603-6.
21. Thundiyil JG, Modica RF, Silvestri S, Papa L. Do United States medical licensing examination (USMLE) scores predict in-training test performance for emergency medicine residents? J Emerg Med 2010;38:65-9.
22. Carmichael KD, Westmoreland JB, Thomas JA, Patterson RM. Relation of residency selection factors to subsequent orthopedic in-training examination performance. South Med J 2005;98:528-32.
23. Black KP, Abzug JM, Chinchilli VM. Orthopedic in-training examination scores: a correlation with USMLE results. J Bone Joint Surg Am 2006;88:671-6.
24. Perez JA Jr, Greer S. Correlation of United States medical licensing examination and internal medicine in-training examination performance. Adv Health Sci Educ Theory Pract 2009;14:753-8.
25. Andriole DA, Jeffe DB, Hageman HL, Whelan AJ. What predicts USMLE step 3 performance? Acad Med 2005;80(Suppl):S21-4.
26. Rifkin WD, Rifkin A. Correlation between housestaff performance on the United States medical licensing examination and standardized patient encounters. Mt Sinai J Med 2005;72:47-9.
27. Brothers TE, Wetherholt S. Importance of the faculty interview during the resident application process. J Surg Educ 2007;64:378-85.
28. Greenburg DL, Durning SJ, Cohen DL, Cruess D, Jackson JL. Identifying medical students likely to exhibit poor professionalism and knowledge during internship. J Gen Intern Med 2007;22:1711-7.
29. Taylor ML, Blue AV, Mainous AG III, Geesey ME, Basco WT Jr. The relationship between the National Board of Medical Examiners' prototype of the step 2 clinical skills exam and interns' performance. Acad Med 2005;80:496-501.
30. Alexander GL, Davis WK, Yan AC, Fantone JC III. Following medical school graduates into practice: residency directors' assessments after the first year of residency. Acad Med 2000;75(Suppl):S15-7.
31. Hamdy H, Prasad K, Anderson MB, Scherpbier A, Williams R, Zwierstra R, et al. BEME systematic review: predictive values of measurements obtained in medical schools and future performance in medical practice. Med Teach 2006;28:103-16.
32. Kassebaum DG, Eaglen RH. Shortcomings in the evaluation of students' clinical skills and behaviors in medical school. Acad Med 1999;74:842-9.
33. Takayama H, Grinsell R, Brock D, Foy H, Pellegrini C, Horvath K. Is it appropriate to use core clerkship grades in the selection of residents? Curr Surg 2006;63:391-6.
34. Lee KB, Vaishnavi SN, Lau SK, Andriole DA, Jeffe DB. "Making the grade:" noncognitive predictors of medical students' clinical clerkship grades. J Natl Med Assoc 2007;99:1138-50.
35. Shen H, Comrey AL. Factorial validity of personality structure in medical school applicants. Educ Psychol Meas 1995;55:1008-15.
36. Davis KR, Banken JA. Personality type and clinical evaluations in an obstetrics/gynecology medical student clerkship. Am J Obstet Gynecol 2005;193:1807-10.
37. Hampton HL, Collins BJ, Perry KG Jr, Meydrech EF, Wiser WL, Morrison JC. Order of rotation in third-year clerkships: influence on academic performance. J Reprod Med 1996;41:337-40.
38. Dirschl DR, Campion ER, Gilliam K. Resident selection and predictors of performance: can we be evidence based? Clin Orthop Relat Res 2006;449:44-9.
39. Frohna A, Stern D. The nature of qualitative comments in evaluating professionalism. Med Educ 2005;39:763-8.
40. Solomon SS, Tom SC, Pichert J, Wasserman D, Powers AC. Impact of medical student research in the development of physician-scientists. J Investig Med 2003;51:149-56.
41. Harden RM, Stevenson M, Downie WW, Wilson GM. Assessment of clinical competence using objective structured examination. Br Med J 1975;1:447-51.
42. Prislin MD, Fitzpatrick CF, Lie D, Giglio M, Radecki S, Lewis E. Use of an objective structured clinical examination in evaluating student performance. Fam Med 1998;30:338-44.
43. Tervo RC, Dimitrievich E, Trujillo AL, Whittle K, Redinius P, Wellman L. The objective structured clinical examination (OSCE) in the clinical clerkship: an overview. S D J Med 1997;50:153-6.
44. Brazeau C, Boyd L, Crosson J. Changing an existing OSCE to a teaching tool: the making of a teaching OSCE. Acad Med 2002;77:932.
45. Stratton TD, Elam CL, Murphy-Spencer AE, Quinlivan SL. Emotional intelligence and clinical skills: preliminary results from a comprehensive clinical performance examination. Acad Med 2005;80(Suppl):S34-7.
46. Hayden SR, Hayden M, Gamst A. What characteristics of applicants to emergency medicine residency programs predict future success as an emergency medicine resident? Acad Emerg Med 2005;12:206-10.
47. LaGrasso JR, Kennedy DA, Hoehn JG, Ashruf S, Przybyla AM. Selection criteria for the integrated model of plastic surgery residency. Plast Reconstr Surg 2008;121:121-15e.
48. Association of American Medical Colleges, Dean's Letter Advisory Committee. A guide to the preparation of the medical student performance evaluation. Washington: Association of American Medical Colleges; 2002. 11 p. Available from: URL: http://www.aamc.org/students/eras/resources/downloads/mspeguide.pdf. Accessed July 28, 2009.
49. Swide C, Lasater K, Dillman D. Perceived predictive value of the medical student performance evaluation (MSPE) in anesthesiology resident selection. J Clin Anesth 2009;21:38-43.
50. Mallott D. Interview, dean's letter, and affective domain issues. Clin Orthop Relat Res 2006;449:56-61.
51. Lurie SJ, Lambert DR, Grady-Weliky TA. Relationship between dean's letter rankings and later evaluations by residency program directors. Teach Learn Med 2007;19:251-6.
52. Papp KK, Polk HC Jr, Richardson JD. The relationship between criteria used to select residents and performance during residency. Am J Surg 1997;173:326-9.
53. DeZee KJ, Thomas MR, Mintz M, Durning SJ. Letters of recommendation: rating, writing, and reading by clerkship directors of internal medicine. Teach Learn Med 2009;21:153-8.
54. Bernstein AD, Jazrawi LM, Elbeshbeshy B, Della Valle CJ, Zuckerman JD. An analysis of orthopedic residency selection criteria. Bull Hosp Jt Dis 2002-2003;61:49-57.
55. Dirschl DR, Adams GL. Reliability in evaluating letters of recommendation. Acad Med 2000;75:1029.
56. Wagoner NE, Suriano R. Program directors' responses to a survey on variables used to select residents in a time of change. Acad Med 1999;74:51-8.
57. Smilen SW, Funai EF, Bianco AT. Residency selection: should interviewers be given applicants' board scores? Am J Obstet Gynecol 2001;184:508-13.
58. Miles WS, Shaw V, Risucci D. The role of blinded interviews in the assessment of surgical residency candidates. Am J Surg 2001;182:143-6.
59. Hauge LS, Stroessner SJ, Chowdhry S, Wool NL; Association for Surgical Education. Evaluating resident candidates: does closed file review impact faculty ratings? Am J Surg 2007;193:761-5.
60. Dirschl DR. Scoring of orthopedic residency applicants: is a scoring system reliable? Clin Orthop Relat Res 2002;399:260-4.
61. Turner NS, Shaughnessy WJ, Berg EJ, Larson DR, Hanssen AD. A quantitative composite scoring tool for orthopedic residency screening and selection. Clin Orthop Relat Res 2006;449:50-5.

Table II. Medical student factors and various outcome measures analyzed in selected studies

Alexander et al30 (2000)
Acceptance criteria: Overall GPA; USMLE scores
Performance criteria: Clinical judgment; patient management; CS; professional qualities; humanistic qualities; oral and written presentation skills; overall performance

Andriole et al25 (2005)
Acceptance criteria: USMLE scores; clinical GPA; AOA election
Performance criteria: USMLE step 3 score

Armstrong et al18 (2007)
Acceptance criteria: USMLE scores
Performance criteria: ITE scores; board examination scores

Bell et al9 (2002)
Acceptance criteria: USMLE scores; clinical honors; interview score
Performance criteria: ITE scores; clinical judgment and acumen; patient rapport; surgical ability; work ethic

Black et al23 (2006), Perez and Greer24 (2009), Thundiyil et al21 (2010)
Acceptance criteria: USMLE scores
Performance criteria: ITE scores

Borowitz et al10 (2000)
Acceptance criteria: NRMP rank; NBME scores; clinical grades; AOA election; interview score
Performance criteria: Overall quality; board examination scores; ITE scores

Boyse et al19 (2002)
Acceptance criteria: Overall GPA; dean’s letter; letters of recommendation; AOA election; USMLE scores; medical school prestige
Performance criteria: General knowledge; overall performance; ITE scores; board examination scores

Brothers and Wetherholt27 (2007)
Acceptance criteria: Personal characteristics; reference letters; academic record; academic honors; USMLE performance; research experience
Performance criteria: Faculty rating; ITE scores; board examination scores

Carmichael et al22 (2005)
Acceptance criteria: USMLE scores; AOA election; research publications; age; marital status
Performance criteria: ITE scores

Daly et al14 (2006)
Acceptance criteria: USMLE scores; AOA election; awards; letters of recommendation; national presentations; medical school prestige; peer-reviewed publications; interview score; received masters or doctorate
Performance criteria: Faculty ranking; ITE scores

Dirschl et al38 (2006)
Acceptance criteria: Honors grades; USMLE scores; AOA election; research experience; publications; medical school reputation
Performance criteria: ITE scores; board examination scores; overall performance

Gunderman and Jackson20 (2000)
Acceptance criteria: NBME scores; AOA election
Performance criteria: Board examination scores; ITE scores


J AM ACAD DERMATOL, VOLUME 65, NUMBER 5. Harfmann and Zirwas 1022.e1

Table II. Cont’d

Hamdy et al31 (2006) (systematic review of the literature)
Acceptance criteria: NBME scores; clinical GPA; OSCE; dean’s letter; preclinical GPA
Performance criteria: Supervisor rating; NBME III scores; board examination scores

Hayden et al46 (2005)
Acceptance criteria: Medical school ranking; medical school record; dean’s letter; letters of recommendation; USMLE scores; research experience; interview score; distinctive factors
Performance criteria: Overall performance; academic success; clinical success

Metro et al13 (2005)
Acceptance criteria: Grades; USMLE scores; research experience; extracurricular activities; interview
Performance criteria: Knowledge; judgment; intrapersonal attributes; overall impression; motor skills; work habits; interpersonal attitudes; ITE percentile score

Papp et al52 (1997)
Acceptance criteria: AOA election; class rank; honors grades; research experience; NBME scores; extracurricular activities
Performance criteria: Knowledge; technical skill; maturity; individual judgment; ITE scores; end of residency faculty ranking

Rifkin and Rifkin26 (2005)
Acceptance criteria: USMLE scores
Performance criteria: History and physical examination skills; interpersonal skills

Sklar and Tandberg3 (1996)
Acceptance criteria: Grades; written autobiography; interview; letters of recommendation; NBME scores
Performance criteria: Clinical competence; leadership qualities; intelligence

Taylor et al29 (2005)
Acceptance criteria: USMLE step 2 CS prototype; USMLE scores; overall GPA
Performance criteria: ACGME competency average score; global quartile ranking; interpersonal competency

Thordarson et al11 (2007)
Acceptance criteria: Initial faculty ranking; USMLE scores
Performance criteria: End of residency faculty ranking; board examination scores; ITE scores; “best doctor” rank

ACGME, Accreditation Council for Graduate Medical Education; AOA, Alpha Omega Alpha; CS, clinical skills; GPA, grade point average; ITE, in-training examination; NBME, National Board of Medical Examiners; NRMP, National Resident Matching Program; OSCE, objective structured clinical examination; USMLE, US Medical Licensing Examination.

J AM ACAD DERMATOL, NOVEMBER 2011. Harfmann and Zirwas 1022.e2