Content Analysis: Reliability
Kimberly A. Neuendorf, Ph.D.
Cleveland State University
Fall 2011
Reliability

Generally: the extent to which a measuring procedure yields the same results on repeated trials (Carmines & Zeller, 1979).

Types:
- Test-retest: same people, different times (intracoder reliability).
- Alternative-forms: different people, same time, different measures.
- Internal consistency: multiple measures, same construct.
- Inter-rater/intercoder: different people, same measures.
Index/Scale Construction

- Similar to survey or experimental work (e.g., Bond analysis: harm to female, sexual activity).
- Need to check internal consistency reliability (e.g., Cronbach's alpha).
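The internal-consistency check above can be sketched in Python. This is a minimal hand-rolled Cronbach's alpha; the item scores below are hypothetical, not data from the Bond analysis.

```python
def cronbachs_alpha(items):
    """Cronbach's alpha for an index: items is a list of lists,
    one list of scores per item, over the same coded units."""
    k = len(items)           # number of items in the index
    n = len(items[0])        # number of units

    def var(xs):             # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Total score per unit, summed across items
    totals = [sum(item[j] for item in items) for j in range(n)]
    return k / (k - 1) * (1 - sum(var(i) for i in items) / var(totals))

# Hypothetical example: two items that track each other closely
item1 = [1, 2, 3, 4, 5]
item2 = [2, 2, 3, 5, 5]
print(round(cronbachs_alpha([item1, item2]), 3))  # → 0.968
```

High alpha here reflects that the two items covary strongly, so combining them into one index is defensible.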
Intercoder Reliability

- Defined: the level of agreement or correspondence on a measured variable among two or more coders.
- What contributes to good reliability? Careful unitizing, codebook construction, and coder training (training, training!).
Reliability Subsamples

- Pilot and final reliability subsamples (because of drift, fatigue, experience).
- Selection of subsamples:
  - Random, representative subsample.
  - "Rich range" subsample: useful for "rare event" measures (note the reliability/variance relationship).
Intercoder Reliability Statistics - 1

Types:
- Agreement:
  - Percent agreement
  - Holsti's
- Agreement beyond chance:
  - Scott's pi
  - Cohen's kappa
  - Fleiss' multi-coder extension of kappa
  - Krippendorff's alpha(s)
- Covariation:
  - Spearman rho
  - Pearson r
  - Lin's concordance correlation coefficient (rc)
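Two of the statistics listed above can be sketched directly, assuming two coders and nominal codes (the coder data below are made up):

```python
from collections import Counter

def percent_agreement(a, b):
    """Proportion of units on which the two coders assigned the same code."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Cohen's kappa: agreement corrected for chance, where chance
    agreement is estimated from each coder's own marginal distribution."""
    n = len(a)
    po = percent_agreement(a, b)                       # observed agreement
    ca, cb = Counter(a), Counter(b)
    pe = sum((ca[c] / n) * (cb[c] / n) for c in set(a) | set(b))
    return (po - pe) / (1 - pe)

coder_a = [1, 1, 0, 1, 0, 1, 0, 0, 1, 1]
coder_b = [1, 0, 0, 1, 0, 1, 0, 1, 1, 1]
print(percent_agreement(coder_a, coder_b))        # → 0.8
print(round(cohens_kappa(coder_a, coder_b), 3))   # → 0.583
```

Scott's pi differs only in the chance term: it pools both coders' codes into one distribution before computing expected agreement.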
Reliability Statistics - 2

See handouts on (a) Bivariate Correlation and (b) Pearson's and Lin's Compared.
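A sketch of the contrast those handouts cover: Pearson's r measures covariation only, while Lin's rc also penalizes a coder whose ratings are shifted or rescaled relative to the other's. The ratings below are hypothetical; the formula is the population-variance form of Lin's coefficient.

```python
def lins_ccc(x, y):
    """Lin's concordance correlation coefficient for two coders'
    interval/ratio ratings (population-variance, i.e. /n, form)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx2 = sum((v - mx) ** 2 for v in x) / n
    sy2 = sum((v - my) ** 2 for v in y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    # The (mx - my)^2 term is the bias penalty Pearson's r ignores
    return 2 * sxy / (sx2 + sy2 + (mx - my) ** 2)

# Coder B always rates one point higher than coder A:
# Pearson's r = 1.0, but Lin's rc is pulled down by the constant bias.
a = [1, 2, 3, 4, 5]
b = [2, 3, 4, 5, 6]
print(round(lins_ccc(a, b), 3))  # → 0.8
```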
Reliability Statistics - 3

Core assumptions of the coefficients: "more scholarship is needed"; these coefficients have not been assessed!
Reliability Statistics - 4

My recommendations:
- Do NOT use percent agreement ALONE.
- Nominal/ordinal: kappa (Cohen's, Fleiss').
- Interval/ratio: Lin's concordance.
- Calculate via PRAM.

Use reliability analyses as diagnostics, e.g.:
- Problematic variables, coders ("rogues"?), and variable/coder interactions.
- Confusion matrixes (categories that tend to be confused).
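A confusion matrix of the kind described can be tallied directly from two coders' codes; the data here are hypothetical.

```python
from collections import Counter

def confusion_matrix(a, b):
    """Cross-tabulate coder A's codes (rows) against coder B's codes
    (columns) to see which categories tend to be confused."""
    cats = sorted(set(a) | set(b))
    counts = Counter(zip(a, b))
    return {ra: {cb: counts[(ra, cb)] for cb in cats} for ra in cats}

# Hypothetical codes: categories 1 and 2 get mixed up, 3 is clean.
coder_a = [1, 1, 2, 2, 1, 3, 3, 2, 1, 3]
coder_b = [1, 2, 2, 1, 1, 3, 3, 2, 2, 3]
for row, cols in confusion_matrix(coder_a, coder_b).items():
    print(row, cols)
```

Off-diagonal cells flag confusable category pairs, a cue for codebook revision or targeted retraining rather than blanket recoding.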
Reliability Statistics - 5

"Standards" for minimums for reliability statistics:
- Percent agreement: 90%??
- Kappa (Cohen's, Fleiss'): .40 minimally, .60 OK, .80 good.
- Pearson correlation; Lin's concordance: .70 (~50% shared variance)??
Reliability Statistics - 6

The problem of the "extreme" or "skewed" distribution: you can have a percent agreement of .95 and a Cohen's kappa of -.10! Why? What to do?
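The paradox is easy to reproduce: when nearly every unit falls in one category, expected chance agreement approaches 1, so the kappa denominator shrinks and the coefficient collapses even though raw agreement is high. A sketch with made-up data:

```python
from collections import Counter

def cohens_kappa(a, b):
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n         # observed agreement
    ca, cb = Counter(a), Counter(b)
    pe = sum((ca[c] / n) * (cb[c] / n) for c in set(a) | set(b))
    return (po - pe) / (1 - pe)

# 96 of 100 units: both coders say 0; the 4 disagreements go both ways.
a = [0] * 96 + [1, 1, 0, 0]
b = [0] * 96 + [0, 0, 1, 1]
po = sum(x == y for x, y in zip(a, b)) / len(a)
print(po)                              # → 0.96 agreement...
print(round(cohens_kappa(a, b), 3))    # → -0.02: kappa is negative
```

Here expected agreement is .98 × .98 + .02 × .02 ≈ .96, essentially equal to observed agreement, so kappa lands at roughly zero or below. This is why a "rich range" reliability subsample matters for rare-event measures.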
PRAM: Program for Reliability Analysis with Multiple Coders

Written by rocket scientists! Trial version available from Dr. N!