


A Two-Year Study of Microscopic Urinalysis Competency Using the Urinalysis-Review Computer Program

Michael L. Astion,1* Sara Kim,2 Amanda Nelson,1 Paul J. Henderson,1 Carla Phillips,1 Claudia Bien,1 Lynn Mandel,2 Adam R. Orkand,1 and James S. Fine1

Background: The microscopic examination of urine sediment is one of the most commonly performed microscope-based laboratory tests, but despite its widespread use, there has been no detailed study of the competency of medical technologists in performing this test. One reason for this is the lack of an effective competency assessment tool that can be applied uniformly across an institution.

Methods: This study describes the development and implementation of a computer program, Urinalysis-Review™, which periodically tests competency in microscopic urinalysis and then summarizes individual and group test results. In this study, eight Urinalysis-Review exams were administered over 2 years to medical technologists (mean, 58 technologists per exam; range, 44–77) at our academic medical center. The eight exams contained 80 test questions, consisting of 72 structure identification questions and 8 quantification questions. The 72 structure questions required the identification of 134 urine sediment structures consisting of 63 examples of cells, 25 of casts, 18 of normal crystals, 8 of abnormal crystals, and 20 of organisms or artifacts.

Results: Overall, the medical technologists correctly identified 84% of cells, 72% of casts, 79% of normal crystals, 65% of abnormal crystals, and 81% of organisms and artifacts, and correctly answered 89% of the quantification questions. The results are probably a slight underestimate of competency because the images were analyzed without knowledge of the urine chemistry results.

Conclusions: The study shows the feasibility of using a computer program for competency assessment in the clinical laboratory. In addition, the study establishes baseline measurements of competency that other laboratories can use for comparison, and which we will use in future studies that measure the effect of continuing education efforts in microscopic urinalysis.

© 1999 American Association for Clinical Chemistry

The examination of urine sediment is one of the most commonly performed microscope-based tests in the clinical laboratory and in point-of-care settings. Therefore, the teaching of microscopic urinalysis is part of the education of medical technologists and other healthcare personnel. Manual microscopic urinalysis is associated with poor reproducibility (1, 2). This is likely the result of variability in the experience and education of personnel and variability in the procedures and equipment for performing the test (3, 4).

In the US, microscopic urinalysis is considered a moderate complexity task. Therefore, personnel who perform microscopic urinalysis must be assessed for competency. A competency assessment program for a moderate complexity test should have the following characteristics: (a) it should be performed and documented at least two times annually; (b) it should identify problem areas for individuals and provide remedial retraining in those areas; (c) it should be applied uniformly across the laboratory; and (d) it should be easy to administer, taking a minimal amount of technologist and supervisory time while providing a maximal amount of information regarding the competency of both individuals and the entire laboratory.

The standard approaches to competency assessment for microscopic urinalysis are exams based on viewing urine specimens through a microscope or exams based on printed photos or projected slides. These traditional approaches have several drawbacks. The microscope-based approach is difficult because specimens that demonstrate the desired urine sediment structures are not always available, and when available, the specimens are difficult to preserve for use by a large group of technologists. In addition, the variability of microscope equipment can make it difficult to administer an exam uniformly across a large laboratory or group of laboratories.

Departments of 1Laboratory Medicine and 2Medical Education, University of Washington School of Medicine, Seattle, WA 98195.

*Address correspondence to this author at: Department of Laboratory Medicine, University of Washington School of Medicine, Box 357110, Seattle, WA 98195-7110. Fax 206-548-6189; e-mail [email protected].

Received January 13, 1999; accepted March 16, 1999.

Clinical Chemistry 45:6 757–770 (1999) Special Report

An adequate collection of photos or slides can demonstrate all the relevant urine sediment structures. However, it is time-consuming to develop a photo or slide library and expensive to purchase one. In addition, it is difficult to obtain adequate brightfield photomicrographs of urine sediment structures that have low contrast, such as hyaline casts and bacteria. Finally, photos or slides cannot simulate microscope techniques that are sometimes necessary to identify urine sediment structures. These techniques include moving the microscope stage and using polarization or phase contrast microscopy.

Although clinical laboratories are required to assess the competency of their technologists in microscopic urinalysis, the results of these assessments are not published routinely. Therefore, the average competency of technologists who perform this test is unknown. Proficiency testing results are available, but it is difficult to derive general conclusions about technologist competency from these results.

For the last 6 years, the University of Washington Department of Laboratory Medicine has been developing computer programs that teach the interpretation of image-based laboratory tests and monitor the competency of individuals and laboratories that perform image-based laboratory tests [for an early review, see Ref. (5)]. The underlying premise of this work is that computers can overcome some of the problems associated with traditional methods of instruction and competency assessment. Our previous work covers a variety of image-based tasks including peripheral blood smears (6), gram stains (7, 8), ova and parasite exams (9), identification of fungi (10), electrophoresis of serum and other body fluids (11), and immunofluorescence assays for anti-nuclear and other autoantibodies (12–15).3

Urinalysis-Tutor™ (distributed by the University of Washington, Seattle, WA and by Bayer Diagnostics, Tarrytown, NY) systematically teaches the microscopic examination of urine sediment, and its development, implementation, and educational effectiveness have been the subject of a previous publication (16). This study focuses on a complementary computer program, Urinalysis-Review™ (University of Washington). Urinalysis-Review is a competency assessment program that tests the ability of laboratory personnel to identify and quantify urine sediment structures and keeps track of individual and laboratory performance over time. Here we describe the program and the results obtained when we used the program in our laboratory over a 2-year period. The study demonstrates the feasibility of using computer programs as an alternative to traditional methods for competency assessment, and it establishes baseline measurements of competency that other laboratories can use as a basis for comparison.

Materials and Methods

program description

Urinalysis-Review administers competency assessment exams and displays the results of the exams. The exams were developed by a team of clinical laboratory science educators working with a computer programmer. Each exam consists of 10 image-based questions that test the user's ability to identify and quantify urine sediment structures.

The images in the exams were collected from fresh urine sediments that were prepared in the clinical laboratories at the University of Washington Medical Center (Seattle, WA) and the Harborview Medical Center (Seattle, WA). The images were captured using a digital video microscopy system operated by Optimas image analysis software (Optimas). The hardware components of the system included a light microscope (Olympus model BH2; Olympus), a color CCD camera (Javelin Chromachip II model JE3462RGB; Javelin Electronics), an 80486 computer (Gateway 2000), a video imaging board (MVP-AT; Matrox Electronic Systems), and a 13-inch closed circuit television monitor (Sony) that was used to display the images. The imaging board converted the analog camera signal into a digital image that could then be saved and edited. When necessary, digital image editing was performed using Adobe Photoshop (Adobe Systems). Editing could include contrast and brightness adjustment, color correction, and noise reduction. The goal of image enhancement was to make the images appear nearly identical to images viewed through a high-quality microscope.

Urinalysis-Review is written in Microsoft Visual Basic for Windows® (Microsoft). The program runs under Windows 3.1, 95, 98, or NT, and the minimal computer requirements to run the program are an 80486 computer running at 33 megahertz, four megabytes of RAM, and two megabytes of hard disk storage per exam. The minimal display resolution is 640 × 480 with 256 colors; the optimal display resolution is 800 × 600 with 256 or more colors.

Urinalysis-Review is distributed to the laboratories in our academic medical center and to client laboratories. There are currently two modes of distribution of the program. The first is a subscription service (University of Washington, Seattle, WA) in which participating laboratories receive a new exam on a floppy disk every 3 months. The second mode of distribution is through a special release of Urinalysis-Tutor (Bayer Diagnostics), which incorporates a version of Urinalysis-Review that has four exams.

3 Additional information about educational software from the University of Washington Department of Laboratory Medicine can be found on the World Wide Web at: http://www.labmed.washington.edu/Tutors.

A schematic diagram of Urinalysis-Review is shown in Fig. 1. Urinalysis-Review is designed for two general classes of users. The first class of users includes technologists or other individuals who are taking the exam. At the start of the exam, the examinee logs in his or her name and then is presented with 10 image-based, multiple-choice questions. There is no exit from the program until the exam is completed. The second class of users includes the laboratory supervisor, who interacts with a control panel that displays results and program settings. The supervisor accesses the control panel using a password. The control panel can display the results of individuals and the entire laboratory for the current exam, any previous exam, or cumulatively on all exams to date. The control panel also has a program options menu that can change the password to the panel, back up the program's statistics file, or export the file for use by other programs, such as a spreadsheet, database, or statistics program.

For each exam, 9 of the 10 questions require the identification of one or more urine sediment structures. Each structure identification question has 34 possible answers, which are listed in Table 1. For an answer to a question to be scored as correct, the examinee must identify all structures that are present and must not select any structure that is absent. The large majority of the structure identification questions are on unstained urine. Some structure identification questions allow the examinee to change the microscope configuration from brightfield microscopy to polarization or phase contrast. After the examinee completes the question, the correct answer and a detailed explanation are presented.
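The all-or-nothing scoring rule described above (every structure that is present must be selected, and nothing that is absent may be selected) amounts to set equality. The sketch below is illustrative only; the function names and the missed-identification helper are assumptions, not code from Urinalysis-Review:

```python
# Illustrative sketch of the all-or-nothing scoring rule: an answer is
# correct only if the selected structures exactly match the structures
# present. "Missed identifications" (errors of commission) are structures
# selected even though they are absent.

def score_structure_question(present: set, selected: set) -> bool:
    """Correct only when the selected set equals the present set."""
    return selected == present

def missed_identifications(present: set, selected: set) -> set:
    """Errors of commission: structures selected that are not present."""
    return selected - present

# Example based on the question in Fig. 2, which contains four structures.
present = {"transitional epithelial cells", "white blood cells",
           "hyaline casts", "bacteria"}
selected = {"transitional epithelial cells", "white blood cells",
            "hyaline casts", "yeast"}  # bacteria missed, yeast chosen

print(score_structure_question(present, selected))  # False
print(missed_identifications(present, selected))    # {'yeast'}
```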

An example of a structure identification question, with its associated answer, is shown in Fig. 2. The question (Fig. 2A) involves the identification of transitional epithelial cells, white blood cells, hyaline casts, and bacteria. The user navigates through the 34 possible answers using the "Next" button and selects the structures that are present using the mouse. The screen in Fig. 2A was captured when the display showed the cell types that can be selected. If the "Next" button were chosen from this screen, the list of possible casts would then be presented. After casts, the order of presentation is normal crystals, abnormal crystals, and organisms/artifacts. When a structure is selected, the name of the structure turns from white lettering to green, and it is then listed in the "Your answer(s)" field. In Fig. 2A, this field is blank, which indicates that no structures have been selected. After choosing the answers, the examinee selects the "Done" button, and a screen (not shown) displays a list of the correct answers next to the user's answers. If these two lists are not identical, the user is told that the question was not answered correctly, and then a screen is presented in which the correct answers are demonstrated using text and arrows pointing into the image (Fig. 2B).

One of the 10 questions requires the quantification of a urine sediment structure. Each quantification question simulates the movement of the microscope stage so that multiple microscope fields can be viewed and counted. After the question is completed, the correct answer and an explanation are presented.

An example of a quantification question is presented in Fig. 3, which tests the examinee's ability to quantify white blood cells. In Fig. 3A, the user is asked to quantify (on a scale of 1+ to 4+) the white blood cells present, assuming the microscope is set at high power (total magnification, ×400). The user simulates moving the microscope stage using the arrows displayed below the image. The white square in the center of the arrows represents the entire microscope slide, and the movable gray box within the square represents the current field of view, which is shown in the image above the arrows. After an answer is selected, the user is told if the answer is correct (not shown) and then shown the explanation for the correct answer (Fig. 3B). The explanation for the correct answer expresses the answer both on the semiquantitative scale and as cells per high power field.

Fig. 1. Schematic diagram of the Urinalysis-Review computer program. See text for details.

Table 1. List of the 34 possible answers to each structure identification question.

| Cells                          | Casts            | Normal crystals   | Abnormal crystals | Organisms and artifacts |
| White blood cells              | Hyaline          | Uric acid         | Leucine           | Yeast                   |
| Red blood cells                | Granular         | Hippuric acid     | Tyrosine          | Parasites               |
| Squamous epithelial cells      | Waxy             | Calcium oxalate   | Cystine           | Spermatozoa             |
| Transitional epithelial cells  | Fatty            | Triple phosphate  | Bilirubin         | Bacteria                |
| Renal tubular epithelial cells | Red blood cell   | Calcium carbonate | Cholesterol       | Fibers                  |
| Oval fat bodies                | White blood cell | Calcium phosphate | Sulfonamide       | Starch                  |
|                                | Renal cell       | Ammonium biurate  | Radiopaque dye    | Air bubble              |

study population

The primary examinees in this study were medical technologists, all with a Bachelor of Science degree in medical technology, who work at the University of Washington Medical Center. The study was performed from May 1996 to May 1998. To ensure that there was no fragmentation of the database of exam results, Urinalysis-Review resided on only one of the several computers in the clinical chemistry laboratory. A new Urinalysis-Review exam was installed on the computer every 3 months, and the study encompassed eight consecutive exams. The technologists were required minimally to take a Urinalysis-Review exam twice per year. However, the majority of technologists took all eight exams. Overall, a mean of 58 (range, 44–77) different technologists took each of the exams.

Fig. 2. Screen captures showing a structure identification question (A) and part of its answer (B). The structures that are present in this image are transitional epithelial cells, white blood cells, hyaline casts, and bacteria. See text for details.

Fig. 3. Screen captures showing a quantification question related to white blood cells (A) and part of its answer (B). See text for details.

The official departmental policy regarding use of theprogram as a competency assessment tool is stated below:

Each technologist who is routinely scheduled on the urinalysis bench must complete a designated Urinalysis-Review exam twice each year. Results are recorded in five major areas: cells, casts, normal crystals, abnormal crystals, and organisms/artifacts. Quantification skill is also assessed. Each technologist is given a score for each area as well as a composite score for the entire exam. Acceptable performance on each exam will be an overall score of at least 70% and a score of at least 60% in each individual area. Any technologist who obtains a score below the stated criteria is required to complete the appropriate portion of the Urinalysis-Tutor as retraining. The date that retraining is completed is recorded on the exam performance summary log.
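The acceptability criteria in the policy above (overall score of at least 70% and at least 60% in each individual area) can be sketched as a simple check. The function name, the dictionary layout, and the computation of the composite score from raw counts are illustrative assumptions, not code from the actual program:

```python
# Minimal sketch of the departmental pass/fail criteria: overall >= 70%
# and >= 60% in each assessed area. The composite score is computed here
# from raw correct/possible counts, which is an assumption about how the
# program weights the areas.

def acceptable(correct: dict, possible: dict) -> bool:
    """correct/possible map each assessed area to counts for one exam."""
    overall = 100 * sum(correct.values()) / sum(possible.values())
    area_ok = all(100 * correct[a] / possible[a] >= 60 for a in possible)
    return overall >= 70 and area_ok

# Hypothetical exam with 22 scorable items across the six areas.
possible = {"cells": 10, "casts": 3, "normal crystals": 4,
            "abnormal crystals": 2, "organisms/artifacts": 2,
            "quantification": 1}
passing = {"cells": 9, "casts": 2, "normal crystals": 3,
           "abnormal crystals": 2, "organisms/artifacts": 2,
           "quantification": 1}
failing = dict(passing, **{"abnormal crystals": 0})  # 0% in one area

print(acceptable(passing, possible))  # True
print(acceptable(failing, possible))  # False: area below 60%
```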

overview of exams used in the study

The eight exams used in this study had a total of 80 questions consisting of 8 quantification questions (1 per exam) and 72 sediment structure questions (9 per exam), which required the identification of one or more structures. The eight quantification questions consisted of four questions about red blood cells, three about white blood cells, and one about granular casts. The 72 structure identification questions covered 134 structures (mean, 17 per exam) consisting of 63 examples of cells, 25 casts, 18 normal crystals, 8 abnormal crystals, and 20 organisms/artifacts. Of the 72 sediment structure questions, 68 (94%) were on unstained urine and 4 (6%) were on urine stained with either the Kova variant of the Sternheimer-Malbin stain (3 of the 4 stained urines; Hycor Biomedical) or a Sudan IV fat stain (1 of the 4 stained urines; JT Baker). In addition, 8 (11%) of the 72 sediment structure questions allowed the examinee to switch back and forth from brightfield to polarization microscopy, and 1 (1%) allowed the user to switch back and forth from brightfield to phase contrast microscopy. The structure identification questions encompassed the majority of microscopic findings in urine, and many structures, especially each specific cell type, were covered multiple times.

Results

group performance: specific exams

December 1996. The group performance (n = 53) for a single exam (December 1996) is presented in Fig. 4. For this exam, the quantification question was about granular casts. The nine structure identification questions had a total of 19 urine sediment structures to identify, consisting of 9 cells (3 questions containing white blood cells, 2 containing renal tubular cells, 2 containing squamous epithelial cells, 1 containing red blood cells, 1 containing transitional epithelial cells), 2 casts (1 hyaline, 1 renal cell), 2 normal crystals (1 calcium oxalate, 1 triple phosphate), 2 abnormal crystals (1 bilirubin, 1 cystine), and 4 organisms/artifacts (3 bacteria, 1 yeast).

Fig. 4. Group performance (n = 53) for the December 1996 exam. See text for details.

Like all the figures presented in this section, Fig. 4 is a computer screen captured from the exam results section of the program. The text at the top of the screen in Fig. 4 shows that the exam date is December 1996 and 53 examinees took the exam. For the 53 people to correctly identify the 19 structures in the exam, they would need to make a total of 53 × 19 = 1007 correct responses. Of the 1007 possible correct responses for sediment structures, the group scored 809 (80%) correct responses. There were 53 × 9 = 477 possible correct cell responses, and the group scored 341 (71%; blue column in graph, first column from left) correctly. The group had 86 missed cell identifications. Missed cell identifications are defined in the program as errors of commission in which a cell that was not present was identified as being present. There were 53 × 2 = 106 possible correct responses for casts, and the group scored 76 (72%; green column, second column from left) correctly. There were 53 × 2 = 106 possible correct responses for normal crystals, and the group scored 104 (98%; turquoise column, third column from left) correctly. There were 53 × 2 = 106 possible correct responses for abnormal crystals, and the group scored 93 (88%; red column, fourth column from left) correctly. There were 53 × 4 = 212 possible correct responses for organisms/artifacts, and the group scored 195 (92%; magenta column, fifth column from left) correctly. There was one quantification question; therefore, there were 53 × 1 = 53 possible correct answers for the group, and the group scored 37 (70%; yellow column, sixth column from the left) correctly. The menu at the bottom left of Fig. 4 shows the exam date field, which indicates that these results pertain to the December 1996 exam. Results for a different exam or for all exams cumulatively can be selected using the mouse. The field in the bottom center of Fig. 4 indicates that these results are group performance on the December 1996 exam. The other option, "Performance over time", can be activated only when "All Exams" has been selected from the exam date field. There are two buttons located in the lower right of the screen. One is a help button, which leads to instructions on how to interpret the display, and the other is a control panel button, which leads to the general menu for the results section of the program.
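The December 1996 arithmetic above can be re-derived directly from the reported counts: possible correct responses per category are the 53 examinees times the number of structures in that category, and the percentages follow from the reported correct counts. Variable names are illustrative:

```python
# Re-deriving the December 1996 group results: 53 examinees, 19 structures
# across five categories, and the correct counts reported in the text.

n_examinees = 53
structures = {"cells": 9, "casts": 2, "normal crystals": 2,
              "abnormal crystals": 2, "organisms/artifacts": 4}
correct = {"cells": 341, "casts": 76, "normal crystals": 104,
           "abnormal crystals": 93, "organisms/artifacts": 195}

# Possible correct responses per category = examinees x structures.
possible = {cat: n_examinees * count for cat, count in structures.items()}

print(possible["cells"])       # 477 possible cell responses
print(sum(possible.values()))  # 53 x 19 = 1007 possible responses
print(sum(correct.values()))   # 809 correct responses
print(round(100 * sum(correct.values()) / sum(possible.values())))  # 80
```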

March 1997. Fig. 5 shows the group performance (n = 56 examinees) on the March 1997 exam. For this exam, the quantification question was about red blood cells. The nine structure identification questions had a total of 20 urine sediment structures to identify, consisting of 10 cells (4 questions containing red blood cells, 3 containing white blood cells, 1 containing renal tubular cells, 1 containing squamous epithelial cells, 1 containing transitional epithelial cells), 3 casts (1 hyaline, 1 waxy, 1 granular), 4 normal crystals (1 triple phosphate, 1 ammonium biurate, 1 urate, 1 calcium oxalate), 1 abnormal crystal (sulfonamide), and 2 organisms/artifacts (1 bacteria, 1 yeast). The most obvious finding in Fig. 5 is the low score on the abnormal crystal (13 of 56, or 23%). The results on the normal crystals (143 of 224, or 64%) were just above the minimum competency point. The group performance on cells (485 of 560, or 87%), casts (150 of 168, or 89%), and the quantification question (53 of 56, or 95%), although suggesting room for improvement, was well above the 60% competency threshold.

Fig. 5. Group performance (n = 56) for the March 1997 exam. See text for details.

group performance: cumulative results on allexams, and performance over timeA statistical summary of the ability of the University ofWashington Medical Center laboratory staff to provideproper microscopic interpretations of urine sediment isshown in Fig. 6, which shows the cumulative results on alleight exams taken over a 2-year period. These examscontained a total of 80 questions on urine sediment andcovered the majority of microscopic findings in urine,many of the structures being covered multiple times. Forthe laboratory staff to get every structure identificationquestion correct, they would have to give 8019 correctanswers regarding urine sediment structures. Overall, thelaboratory scored 6369 (79%) correct responses for urinesediment structures. The performance on specific struc-tures was 84% (3143 of 3759) for cells, 72% (1102 of 1526)for casts, 79% (842 of 1068) for normal crystals, 65% (293of 448) for abnormal crystals, and 81% (989 of 1218) fororganisms and artifacts. The laboratory scored 89% (414 of465) on the quantification questions.
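The cumulative totals above are internally consistent: the per-category counts sum to the overall 6369 of 8019 (79%) correct responses. A short check, with illustrative variable names:

```python
# Cross-checking the cumulative results on all eight exams: per-category
# correct and possible counts from the text should sum to the reported
# overall totals for urine sediment structures.

correct = {"cells": 3143, "casts": 1102, "normal crystals": 842,
           "abnormal crystals": 293, "organisms/artifacts": 989}
possible = {"cells": 3759, "casts": 1526, "normal crystals": 1068,
            "abnormal crystals": 448, "organisms/artifacts": 1218}

print(sum(correct.values()))   # 6369 correct structure responses
print(sum(possible.values()))  # 8019 possible structure responses
print(round(100 * sum(correct.values()) / sum(possible.values())))  # 79
```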

The group performance over time is shown in Fig. 7. This is the screen that is displayed when "All Exams" is selected in the exam field and "Performance over time" is selected in the graph options field. For this graph, the program combines all urine sediment structures into one category. The graph indicates that the overall performance of the University of Washington Medical Center laboratory staff on urine sediment structures (blue line in the graph) has remained relatively constant over a 2-year period. The performance on quantification questions (green line in the graph) varied from 70% on the December 1996 exam to 100% on the June 1997 exam, but it does not show an upward or downward trend.

individual performance: specific exam, march 1997

For each individual technologist in the laboratory, Urinalysis-Review can show the results of a single exam, results on all exams cumulatively, and results over time. Fig. 8 shows the performance of an individual on the March 1997 exam. Fig. 8A is a one-screen statistical summary of the performance on this exam. The fields at the top of Fig. 8A show the exam date and the technologist's name (which has been changed for publication). Either field can be changed using the scroll bars and clicking on the desired date and name. The results listed in the lower left of Fig. 8A indicate that the technologist was correct on six of the nine (67%) structure identification questions and the one quantification question. The nine structure identification questions contained 20 structures to identify, and 16 of these (80%) were identified correctly. For the individual structure categories, the technologist correctly identified 10 of 10 examples of cells, 2 of the 3 casts, 3 of the 4 normal crystals, and 1 of the 2 organisms/artifacts. The one example of abnormal crystals was missed.

Fig. 6. Group performance on all exams cumulatively. A mean of 58 technologists (range, 44–77) took each exam, and there were eight exams administered over 2 years. See text for details.

The panel of buttons in the lower right of Fig. 8A allows the supervisor to view the questions in the exam and the answers given by each technologist. The incorrect responses are indicated in red. Thus, in this example, the technologist was incorrect on questions 1, 5, and 7. (In Urinalysis-Review, the structure identification questions are questions 1–9, and the quantification question is number 10.) Selection of question 7 leads to the screen presented in Fig. 8B, which shows the question and the answer given by this technologist. The answer given was “Leucine crystal”, and the correct answer was “Sulfonamide crystal”. Because this was the only abnormal crystal on the March 1997 exam, this answer was scored as 0 of 1 abnormal crystals correctly identified, and one missed identification because leucine was selected when it was absent.
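The scoring rule illustrated by question 7 can be sketched as set arithmetic over structure names. This is a hypothetical reconstruction for illustration only, not the program’s actual code; the record layout (sets of `present` and `selected` structure names) is an assumption.

```python
def score_structure_question(present, selected):
    """Score one structure-identification question.

    present  -- set of structure names actually in the image
    selected -- set of structure names the technologist chose
    Returns (correct, errors): structures identified correctly, and
    errors, i.e., present structures that were not selected plus
    selected structures that were absent from the image.
    """
    correct = present & selected
    errors = (present - selected) | (selected - present)
    return correct, errors

# Question 7 of the March 1997 exam, as described in the text: the
# technologist chose "Leucine crystal" when only a sulfonamide crystal
# was present, yielding 0 of 1 correct plus one misidentification.
correct, errors = score_structure_question(
    present={"Sulfonamide crystal"},
    selected={"Leucine crystal"},
)
```

Under this rule, a wrong choice is penalized twice over: the structure that was present goes unidentified, and an absent structure is reported.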

The ability to rapidly view the answers to questions in the exam also allows the supervisor to detect error trends. Thus, by toggling through the list of technologists taking the March 1997 exam, the supervisor could determine that leucine crystals were frequently confused with the sulfonamide crystals in question 7, that yeast and red blood cells were sometimes confused in question 5 (not shown), and that uric acid crystals and triple phosphate crystals were occasionally confused in question 1 (not shown). The supervisor can analyze errors for their clinical relevance and direct continuing education efforts toward reducing the most serious errors.

individual performance: cumulative results on all exams, and performance over time

The cumulative results on all exams for an individual technologist (whose name has been changed for publication) in our laboratory are shown in Fig. 9, which is a statistical summary of the ability of an individual to provide proper microscopic interpretation of urine sediment. Fig. 9 has the same basic features as Fig. 6 except that the statistics are for an individual and not a group. This individual took all eight exams, which represent a total of 80 questions, consisting of 72 structure identification questions and 8 quantification questions. Overall, there were a total of 134 urine sediment structures to identify, and this technologist identified 115 (86%) correctly. The best performance (100% correct) was on abnormal crystals (red column) and the quantification questions (yellow column). This technologist also scored 87% (55 of 63; blue column) on cells, 80% (20 of 25; green column) on casts, 89% (16 of 18; turquoise column) on normal crystals, and 80% (16 of 20; magenta column) on organisms and artifacts. A comparison with the group cumulative performance presented in Fig. 6 shows that this technologist

Fig. 7. Group performance over time. The blue line shows the performance over time for the sediment structures, which are combined into one category; the green line shows the performance for the quantification questions. See text for further details.

Clinical Chemistry 45, No. 6, 1999 765


Fig. 8. Performance of an individual technologist on the March 1997 exam. The technologist’s name has been changed for publication. See text for details.



Fig. 9. Performance of an above-average technologist on all exams cumulatively. The technologist’s name has been changed for publication. See text for details.

Fig. 10. Performance of an individual technologist over time. The blue line shows the performance over time for the sediment structures, which are combined into one category; the green line shows the performance for the quantification questions. See text for further details.



had an above-average performance. The individual cumulative results were also used to detect technologists with below-average performance, and these technologists received additional continuing education.
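Flagging technologists whose cumulative accuracy falls below the group’s could be implemented along these lines. This is a sketch under an assumed data layout (a list of `(technologist, is_correct)` answer records), not the program’s actual logic.

```python
from collections import defaultdict

def cumulative_accuracy(answers):
    """answers: iterable of (technologist, is_correct) records, one per
    structure or quantification item. Returns ({tech: fraction correct},
    group fraction correct)."""
    n_correct = defaultdict(int)
    n_total = defaultdict(int)
    for tech, ok in answers:
        n_correct[tech] += bool(ok)
        n_total[tech] += 1
    per_tech = {t: n_correct[t] / n_total[t] for t in n_total}
    group = sum(n_correct.values()) / sum(n_total.values())
    return per_tech, group

def below_average(answers):
    """Technologists whose cumulative accuracy is below the group's,
    i.e., candidates for additional continuing education."""
    per_tech, group = cumulative_accuracy(answers)
    return sorted(t for t, acc in per_tech.items() if acc < group)

# Hypothetical records: technologist A answers 9 of 10 items correctly,
# technologist B answers 6 of 10 correctly.
answers = (
    [("A", True)] * 9 + [("A", False)]
    + [("B", True)] * 6 + [("B", False)] * 4
)
```

Here the group fraction (15 of 20, or 0.75) serves as the threshold, so only technologist B would be flagged.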

The individual performance over time for the technologist in Fig. 9 is shown in Fig. 10, which is similar in appearance to the screen on group performance over time shown in Fig. 7. For this graph, the program combines all urine sediment structures into one category. The graph indicates that the overall performance of this individual has remained relatively constant over a 2-year period. The performance-over-time graph can also identify technologists who are improving or declining over time, or it can be used to rapidly detect a particularly problematic exam for a technologist. That exam can then be further dissected to define specific areas for improvement.
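Detecting technologists who are improving or declining amounts to fitting a trend to the per-exam score series. A minimal sketch (not the program’s actual method), using an ordinary least-squares slope over exams taken in chronological order:

```python
def trend_slope(scores):
    """Least-squares slope of per-exam scores (percent correct) taken in
    chronological order: positive suggests improvement, negative suggests
    decline, and near zero a stable performer."""
    n = len(scores)
    x_mean = (n - 1) / 2          # mean of exam indices 0..n-1
    y_mean = sum(scores) / n
    num = sum((x - x_mean) * (y - y_mean) for x, y in enumerate(scores))
    den = sum((x - x_mean) ** 2 for x in range(n))
    return num / den

def worst_exam(dates, scores):
    """Date of the lowest-scoring exam -- a starting point for dissecting
    a particularly problematic exam."""
    return min(zip(scores, dates))[1]
```

For example, `trend_slope([70, 75, 80, 85])` is 5.0 (five percentage points gained per exam), while a flat series gives a slope of zero.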

Discussion

The first accomplishment of this study was the development and implementation of a computer program to assess competency in microscopic urinalysis. The program has advantages over the traditional approaches to competency assessment. Unlike the microscope-based approach, the computer program does not require microscopes or fresh specimens, and it is therefore easier to administer to a laboratory of any size and at any time. In contrast to a photo or textbook exam, the computer program can simulate manipulations of the microscope, such as stage movement and configuring the microscope for polarization or phase contrast. Another advantage of the computer program is that it automatically records results and allows supervisors to efficiently document and display these results for each individual in the laboratory as well as the laboratory as a whole (Figs. 4–10).

A second achievement of this study is the publication of 2 years of urinalysis competency data from a large clinical laboratory (Figs. 6 and 7). Despite a detailed literature search, we were unable to find similar studies. These baseline results are a starting point for our laboratory as we attempt to measure the effects of educational programs designed to improve urinalysis performance. In addition, other laboratories could use these data for comparison as they implement their own procedures for competency assessment and continuing education.

Because competency assessment results for microscopic urinalysis are not published routinely, many clinical laboratory scientists use proficiency testing results to compare the competency of their institution with other institutions. Compared with the exams and results provided by Urinalysis-Review, proficiency testing is not as useful for assessing and comparing competency in microscopic urinalysis. This is because proficiency testing is not as comprehensive, and proficiency testing specimens are not handled by all technologists in the laboratory. In addition, although proficiency testing specimens should be handled like any other laboratory specimen, many laboratories give these specimens special attention because of the importance of maintaining accreditation. Thus, proficiency testing tends to overestimate competency.

The group performance results presented in Figs. 6 and 7 are probably a conservative estimate of our laboratory’s competency because the technologists analyzed the images without urine dipstick or other urine chemistry results and without the ability to chemically manipulate the urine with acid, alkali, stains, or other treatments. In addition, no patient history was provided, and the program only occasionally allowed the technologist to invoke polarization (11% of structure identification questions) or phase contrast microscopy (1% of structure identification questions) and did not allow changes in focusing. Lastly, the display quality on the laboratory computer was slightly inferior to the computer display used to develop the questions. This might have caused some unfairness in a few questions related to bacteria.

Thus, although a very skilled technologist should have been able to identify the images in Urinalysis-Review without further aid, additional information and a better computer display would have improved performance. For example, many technologists confused the sulfonamide crystal in question 7 on the March 1997 exam (Fig. 8B) with a leucine crystal. In actual practice, technologists might take a variety of actions when faced with the crystal in Fig. 8B. Because leucine crystals are rare and are associated with liver disease, most technologists would look for confirmatory evidence such as a positive bilirubin on the urine dipstick. In addition, a technologist suspecting a sulfa crystal might perform a sulfa confirmatory test and call the clinic to ask if the patient was taking a sulfa drug. Another technologist error we detected was confusion of yeast and red blood cells. In this case, many technologists would have treated the urine with dilute acetic acid, which lyses red blood cells while leaving yeast intact, to differentiate the two. Similarly, several crystals would have been identified correctly if the technologists had known the pH of the urine and the solubility characteristics of the crystals. Some technologists who missed hyaline casts would have identified them correctly if phase contrast microscopy and changes in focusing had been available.

Despite its limitations, the study clearly indicates that the laboratory could substantially improve its ability to identify all categories of urine sediment structures. Specifically, we should focus on abnormal crystals because the laboratory performed most poorly (65% correct) on these clinically important structures.

There are several areas where Urinalysis-Review will be improved. As discussed above, the program would be more realistic if it included urine chemistry results and patient history, and allowed more microscope manipulation. In addition, it would be useful if the technologist had the option to chemically manipulate the urine with acid, alkali, stains, or other treatments. Moreover, it would be useful if the results section of the program could break down the five sediment structure categories into subcategories so that specific errors could be more easily detected. Thus, in addition to being able to score the general categories of cells, casts, crystals, and organisms and artifacts, the program should be able to rapidly identify the specific structures (e.g., granular casts, renal epithelial cells, sulfonamide crystals) that are problematic. This is important because not all errors are equal. For example, there is an important clinical difference between missing a normal crystal such as ammonium biurate and missing an abnormal crystal such as leucine. With the current version of the program, it is possible to determine specific structures that are problematic, but this is a time-consuming task because it requires dissecting the results of each exam by going through each individual’s performance (Fig. 8). A more rapid display of the specific structures that are problematic to the laboratory would greatly help supervisors identify and provide education regarding the most clinically relevant errors.

In this study, we concentrated on results that are available using the display options within Urinalysis-Review (Figs. 4–10). Another ongoing study involves analyzing the Microsoft Access® database that underlies Urinalysis-Review and collecting survey data from technologists who have been using Urinalysis-Review. This detailed analysis will help us more rigorously define the current state of competence in our institution. For example, we will be able to compare and contrast the performance of the two geographically distinct core laboratories in our academic institution, the first step in ensuring that the laboratories are performing at the same level. In addition, we will be able to determine which specific structures are commonly confused with each other and which clinically relevant errors are made most frequently. Lastly, the addition of the survey results might help us identify modifiable factors that affect competency in microscopic urinalysis.
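Once answer records are extracted from the underlying database, commonly confused structure pairs can be tallied directly. A sketch, assuming records of `(correct_structure, chosen_structure)` pairs (the layout is an assumption, not the actual Access schema):

```python
from collections import Counter

def confusion_pairs(records):
    """records: iterable of (correct_structure, chosen_structure) pairs,
    one per structure judgment. Returns misidentified pairs ordered from
    most to least frequent; correct identifications are excluded."""
    return Counter(
        (truth, chosen) for truth, chosen in records if truth != chosen
    ).most_common()

# Hypothetical records echoing the error trends noted in the text:
# sulfonamide crystals called leucine, red blood cells called yeast.
records = (
    [("Sulfonamide crystal", "Leucine crystal")] * 3
    + [("Red blood cell", "Yeast")] * 2
    + [("Red blood cell", "Red blood cell")] * 5
)
```

Ranking the resulting pairs would give supervisors a rapid view of the most frequent misidentifications, which could then be screened for clinical relevance.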

In addition to a more detailed data analysis, we are pursuing other logical extensions of our work. One obvious extension is to repeat the study after enhancing both Urinalysis-Tutor and Urinalysis-Review. This will allow us to establish even more accurate competency data. A second project is an interlaboratory comparison based on collecting competency data from other laboratories that subscribe to Urinalysis-Review. A third project is to implement the program outside the core clinical laboratory as a competency assessment tool in point-of-care settings. In this case, the examinees are healthcare personnel outside the laboratory who perform microscopic urinalysis (for example, physicians, nurses, and physician assistants), and the supervisor could be the staff member who oversees laboratory testing.

Another area of interest is to determine competency for other common microscope-based laboratory tests such as peripheral blood smears and direct Gram stains of body fluids. We have developed computer programs to accomplish this. The programs, PeripheralBlood-Review™ and GramStain-Review™, function analogously to Urinalysis-Review, but our implementation of these programs in our laboratory would need to be more systematic and formal to yield the quality of results that have been presented here.

In addition to conducting further studies, our future plans include improving the distribution and content of Urinalysis-Review. In June 1999, the University of Washington will begin the twice-yearly distribution of the program on floppy disks.4 In addition, there is an enhanced release of Urinalysis-Tutor (Bayer Diagnostics) that contains a four-exam version of Urinalysis-Review. In the United States, this can provide 2 years of competency assessment.

In addition to the current modes of distribution, we would eventually like to make the program available over the World Wide Web via a subscription service. This has several advantages. Most importantly, it would allow laboratories to anonymously compare their competency assessment results with those of other laboratories and then set quantitative goals regarding improvement in microscopic urinalysis. In addition, an on-line service should enhance convenience because technologists could take exams from any computer with Internet access, and the results would reside on a master database in our institution. Currently, the program lacks an easy method for combining results residing on multiple computers; therefore, to maintain the integrity of the database of results, our technologists must take the exams on the same computer.

In summary, we have developed and implemented Urinalysis-Review, a computer program that periodically monitors competency in performing microscopic urinalysis. Results collected over a 2-year period from our laboratory showed that our technologists correctly identified 79% of urine sediment structures, and this performance has been relatively constant over time. Our future efforts will focus on improving the content and distribution of the program and performing additional studies.

We thank Mark Wener, Doug Schaad, and Dina Horton for help with the manuscript. In addition, we thank the staffs of the clinical chemistry laboratories at the University of Washington Medical Center and Harborview Medical Center as well as the staff of the University of Washington Roosevelt Medical Center for participating in this study.

4 The cost of the program is $100 per year.


