Bringing Equity and Quality Learning Together
10/25/2019
1
Office of Academic Planning and Assessment
Dr. Jeff Roberts, Director of Assessment, Sam Houston State University
Presented at the 2019 SACSCOC Annual Meeting
How many of you are from:
◦ Public Institution?
◦ Private Institution?
◦ 2-year College?
◦ 4-year University?
How many of you are:
◦ Assessment Professional
◦ Faculty
◦ Administrator
◦ Other?
4-year, public university approximately one hour north of Houston, Texas
Current enrollment of approximately 22,000 undergraduate and graduate students
◦ 91 bachelor’s degree programs,
◦ more than 57 master’s degree programs, and
◦ 10 doctoral programs
Classified by the Carnegie Commission on Higher Education as a “Doctoral Research University” and a “Community Engaged” University
SHSU was one of seven institutions recognized in 2019.
SHSU was one of twenty-seven total institutions awarded since 2016.
SHSU is the FIRST Texas institution to be recognized.
Co-sponsored by:
◦ Voluntary System of Accountability (VSA)
◦ National Institute for Learning Outcomes Assessment (NILOA)
◦ Association of American Colleges and Universities (AAC&U)
Recognizes institutions that are successfully:
◦ Integrating assessment practices throughout the institution
◦ Providing evidence of student learning outcomes
◦ Using assessment results to guide institutional decision-making and improve student performance
Focus is on processes and uses of assessment data, rather than on student performance or accomplishment
Critical Thinking
Communication
Empirical and Quantitative Reasoning
Teamwork
Personal Responsibility
Social Responsibility
To include effective development, interpretation, and expression of ideas through written, oral, and visual communication
This presentation focuses on the written communication portion of that larger general education outcome
Written communication skills are increasingly important for students as they graduate college and enter the workforce (Arum & Roksa, 2011; AAC&U, 2008; Hart Research Associates, 2013, 2015b; THECB, 2017)
Many perceive that students are graduating from college lacking necessary written communication skills (Arum & Roksa, 2011; Hart Research Associates, 2013, 2015b)
There are also broader concerns regarding equity of student learning and student success (Montenegro & Jankowski, 2017; THECB, 2016)
Institutions are not examining student learning data through an equity lens. In a 2015 survey of AAC&U institutions:
◦ 70% tracked student learning data
◦ 31% set equity goals for different racial/ethnic groups
◦ 24% set equity goals for different socioeconomic groups
◦ 14% set equity goals by parental educational level
◦ 17% actually examined their data by these factors: 16% by race/ethnicity, 9% by socioeconomic status, and 6% by parental educational level
By examining student data, institutions can "respond to the doubts of skeptical publics" (Kuh et al., 2015, pp. 8-9).
In particular, Kuh et al. (2015) noted that:
…if academic institutions are collecting and using evidence of student learning to inform decisions and guide change that can help students and institutions improve performance, the confidence of the American public is likely to follow (p. 12).
Mixture of direct and indirect assessment measures, both within the curriculum AND at end-of-experience
◦ Examples of Direct Measures
 Assessment of Written Communication (AWC)
 Critical Thinking Assessment Test (CAT)
 Course-embedded American/Texas Government tests
◦ Examples of Indirect Measures
 National Survey of Student Engagement (NSSE)
 Teamwork Self-Reflection Instrument (TSRI)
Student artifacts collected from end-of-experience courses (i.e., 3000- and 4000-level courses)
◦ Preferred artifacts from writing-enhanced, gateway, or capstone courses
All colleges at SHSU evaluated on a 3-year, rotating cycle
Artifacts scored with a locally developed written communication rubric
Rubric created by an interdisciplinary faculty committee with expertise and experience in teaching and assessing written communication
◦ Promotes content validity
AAC&U Written Communication VALUE Rubric was influential in identifying the domains measured by the rubric
◦ Promotes theoretical validity
Rubric measured student written communication across four separate domains
◦ Ideas/Critical Thinking/Synthesis
◦ Style
◦ Organization
◦ Conventions
A student’s scores across all four domains were combined to create a single, overall average score for the student artifact
◦ Data from the overall student score were used for analysis
Data were collected from the 2015, 2016, and 2017 academic years
Data were available for six of the seven academic colleges at SHSU
1,284 junior- and senior-level students
◦ Gender: Male (n=457, 36%); Female (n=827, 64%)
◦ Race/Ethnicity: White (n=825, 64%); Black (n=224, 17%); Hispanic (n=235, 18%)
◦ First-Generation Status: First Generation (n=512, 40%); Not First Generation (n=448, 35%); Unknown (n=324, 25%)
◦ Socio-Economic Status: Pell Eligible (n=600, 47%); Not Pell Eligible (n=684, 53%)
Comparable junior- and senior-level students enrolled in Fall 2016 (n=9,416)
◦ Gender: Male (n=3,594, 38%); Female (n=5,822, 62%)
◦ Race/Ethnicity: White (n=5,699, 61%); Black (n=1,655, 18%); Hispanic (n=2,062, 21%)
◦ First-Generation Status: comparison data not available
◦ Socio-Economic Status: comparison data not available
In rubric-based assessments, the consistency of the raters’ scores is an important measure of reliability (Banta & Palomba, 2015; Millett, Payne, Dwyer, Stickler, & Alexiou, 2008)
Inter-rater reliability was calculated using one-way, random-effects intraclass correlation (ICC) coefficients (Fleiss, 2003; Shrout & Fleiss, 1979)
ICC values (Cicchetti, 1994):
◦ < .40 = poor agreement
◦ .40 - .59 = fair agreement
◦ .60 - .74 = good agreement
◦ > .74 = excellent agreement
The ICC value for the overall paper score was .70, which indicated good agreement between the raters
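As a concrete sketch, a one-way random-effects ICC in the Shrout and Fleiss (1979) sense can be computed from an artifacts-by-raters score matrix. The implementation and ratings below are illustrative, not SHSU's data or code:

```python
import numpy as np

def icc_one_way(scores):
    """One-way random-effects ICC (Shrout & Fleiss, 1979, ICC(1)).

    scores: 2-D array-like, rows = artifacts, columns = raters.
    """
    scores = np.asarray(scores, dtype=float)
    n, k = scores.shape
    grand_mean = scores.mean()
    row_means = scores.mean(axis=1)
    # Between- and within-subject mean squares from a one-way ANOVA
    ms_between = k * np.sum((row_means - grand_mean) ** 2) / (n - 1)
    ms_within = np.sum((scores - row_means[:, None]) ** 2) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Hypothetical ratings: 5 artifacts, 2 raters on a 4-point rubric
ratings = [[2.0, 2.5], [3.0, 3.0], [1.5, 2.0], [3.5, 3.0], [2.5, 2.5]]
print(round(icc_one_way(ratings), 2))  # → 0.81
```

By Cicchetti's (1994) benchmarks, the 0.81 from this toy matrix would fall in the "excellent" band, while the study's .70 falls in the "good" band.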
Multiple regression was used for analysis
◦ Prior to analysis, appropriate checks were made of necessary assumptions (Field, 2009; Onwuegbuzie & Daniel, 2002)
Standardized skewness and kurtosis values were within the +/-3 range associated with normality (Onwuegbuzie & Daniel, 2002)
The possible presence of multicollinearity was addressed by examining Tolerance and VIF values
◦ Tolerance values were above .10 and VIF values were very close to 1
Significant residual autocorrelation was examined using the pooled Durbin-Watson statistic
◦ The pooled Durbin-Watson statistic was .02, indicating this assumption was violated
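The assumption checks described above can be sketched with numpy alone. The functions below are illustrative implementations of the standardized skewness/kurtosis, VIF, and Durbin-Watson checks; the simulated data are assumptions for demonstration, not the study's:

```python
import numpy as np

def standardized_skew_kurt(x):
    """Skewness and excess kurtosis divided by their approximate standard
    errors (sqrt(6/n) and sqrt(24/n)); values within +/-3 are commonly
    taken as consistent with normality."""
    x = np.asarray(x, dtype=float)
    n = x.size
    z = (x - x.mean()) / x.std()
    skew, kurt = np.mean(z ** 3), np.mean(z ** 4) - 3.0
    return skew / np.sqrt(6.0 / n), kurt / np.sqrt(24.0 / n)

def vif(X):
    """Variance inflation factor per predictor column:
    VIF_j = 1 / (1 - R^2_j), regressing column j on the other columns."""
    X = np.asarray(X, dtype=float)
    factors = []
    for j in range(X.shape[1]):
        y, others = X[:, j], np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(len(y)), others])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        r2 = 1.0 - np.var(y - A @ coef) / np.var(y)
        factors.append(1.0 / (1.0 - r2))
    return factors

def durbin_watson(residuals):
    """Durbin-Watson statistic; values near 2 suggest little
    residual autocorrelation."""
    e = np.asarray(residuals, dtype=float)
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

# Illustrative check on simulated, well-behaved data (n matches the sample size)
rng = np.random.default_rng(1)
X = rng.normal(size=(1284, 5))   # five independent predictors
resid = rng.normal(size=1284)    # independent residuals
print(standardized_skew_kurt(resid))  # both inside +/-3
print(max(vif(X)))                    # close to 1
print(durbin_watson(resid))           # close to 2
```

A Durbin-Watson value near 0 (as the study's pooled .02) indicates strong positive autocorrelation, which is why the slide flags the assumption as violated.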
As first-generation status was self-reported, data were missing for approximately 25% of students
◦ Multiple imputation analysis was employed to account for missing data
 3-5 imputations are typically sufficient (Schafer, 1999)
 5 imputations were used for this study
Multiple imputation is "one of the best options for handling missing data" (Schlomer, Bauman, & Card, 2010, p. 5)
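One piece of the multiple-imputation workflow that can be sketched compactly is the pooling step: after fitting the regression on each of the m = 5 imputed datasets, per-dataset coefficients are combined with Rubin's rules. The coefficients and standard errors below are hypothetical, not the study's:

```python
import numpy as np

def pool_estimates(estimates, variances):
    """Pool one coefficient across m imputed datasets (Rubin's rules)."""
    est = np.asarray(estimates, dtype=float)
    var = np.asarray(variances, dtype=float)
    m = est.size
    q_bar = est.mean()               # pooled point estimate
    w = var.mean()                   # within-imputation variance
    b = est.var(ddof=1)              # between-imputation variance
    t = w + (1.0 + 1.0 / m) * b      # total variance
    return q_bar, np.sqrt(t)         # pooled estimate and pooled SE

# Hypothetical coefficients for one predictor from m = 5 imputations
coefs = [-0.15, -0.14, -0.16, -0.15, -0.15]
ses = [0.031, 0.030, 0.032, 0.031, 0.030]
est, se = pool_estimates(coefs, np.square(ses))
print(round(est, 3), round(se, 3))  # → -0.15 0.032
```

The pooled SE exceeds the average per-imputation SE slightly because the between-imputation variance term adds the uncertainty introduced by the missing data.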
The overall regression model was predictive of differences in student written communication skills, F(5, 1278) = 2.58, p < .001
◦ Pooled adjusted R2 was .028, indicating 2.8% of the total variance in written communication skills could be explained by student demographic characteristics
◦ This value was small (Cohen, 1988)
Two variables were statistically significant in the model
On a 4-point scale:
◦ Scores for male students were 0.14 points, or approximately 3%, lower than those of female students
 Effect size was small (Cohen’s d = 0.31)
◦ Scores for Black students were 0.17 points, or approximately 5%, lower than those of White students
 Effect size was small (Cohen’s d = 0.25)
Scores for Hispanic students, first-generation students, and students who were Pell eligible were not statistically significantly different
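For reference, one common way to obtain a Cohen's d from group means, SDs, and group sizes uses a pooled standard deviation; the slides do not specify the presenters' exact computation, and the inputs below are hypothetical values on a 4-point rubric scale, not the study's data:

```python
import numpy as np

def cohens_d_from_summary(m1, sd1, n1, m2, sd2, n2):
    """Cohen's d from summary statistics, using a pooled SD
    (one standard formulation; assumed here, not taken from the slides)."""
    pooled_sd = np.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2)
                        / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Hypothetical group summaries: mean, SD, n for two groups
d = cohens_d_from_summary(2.45, 0.55, 800, 2.31, 0.57, 450)
print(round(d, 2))  # → 0.25
```

By Cohen's (1988) conventions, values around 0.2 are small and around 0.5 medium, which is why the slide labels both observed effects as small.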
Summary of Pooled Multiple Regression Results of Predictor Variables Upon Student Written Communication Performance
Predictor Variable B SE B β t p
Constant 2.51 0.03 77.14 < .01
Male -0.15 0.03 -.13 -4.70 < .01
Black -0.18 0.04 -.12 -4.10 < .01
Hispanic -0.06 0.04 -.04 -1.35 .18
Pell Eligible -0.03 0.04 -.03 -0.88 .38
First Generation < -0.00 0.04 < -.01 -0.03 .97
Note. The performance of Black and Hispanic students is compared to that of White students.
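The structure of the dummy-coded model behind the table can be sketched with ordinary least squares on simulated data. Everything here (predictors, effect sizes, noise level) is an illustrative assumption, not the study's data or code:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1000
# Hypothetical dummy-coded predictors (1 = member of group, 0 = reference group)
male = rng.integers(0, 2, size=n)
black = rng.integers(0, 2, size=n)
X = np.column_stack([np.ones(n), male, black])  # intercept + two dummies
# Simulated 4-point rubric scores with built-in group differences
y = 2.5 - 0.15 * male - 0.18 * black + rng.normal(0.0, 0.55, size=n)
# OLS fit: beta = [constant, male coefficient, black coefficient]
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(beta, 2))  # recovers values near [2.5, -0.15, -0.18]
```

In this coding, each unstandardized B in the table reads as the score difference, in rubric points, between the named group and its reference group, holding the other dummies constant.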
Descriptive Statistics for Student Writing Scores by Student Group
Student Group n M SD
Overall Population 1,284 2.40 0.56
Race/Ethnicity
White 825 2.43 0.56
Black 224 2.26 0.52
Hispanic 235 2.39 0.59
Gender
Male 457 2.31 0.57
Female 827 2.45 0.55
Socio-Economic Status
Pell Eligible 600 2.36 0.55
Not Pell Eligible 684 2.42 0.57
First Generation Status
First Generation 512 2.39 0.57
Not First Generation 448 2.39 0.54
The small percentage of variance explained by the model is a positive sign, as it indicates these demographic variables potentially had a limited effect
No statistically significant differences were seen in the scores between:
◦ White and Hispanic Students
◦ First Generation and Non-First Generation Students
◦ Pell Eligible and Non-Pell Eligible Students
The model did reveal that the scores of Black students and of male students were statistically significantly lower than the scores of White students and of female students
◦ These findings align with Arum and Roksa (2011) for race/ethnicity and Preiss et al. (2013) for gender
Findings are a departure from Roberts et al. (2017), who examined different students at the same university at a different time
◦ Mean differences were similar between studies, but were not statistically significant in Roberts et al. (2017)
Performance of all students was lower than desired, at 2.40 on a 4-point scale
Roberts et al., 2017
Race/Ethnicity
◦ White (n=249) – 2.69
◦ Black (n=51) – 2.53
◦ Hispanic (n=56) – 2.54
Gender
◦ Male (n=143) – 2.59
◦ Female (n=251) – 2.66
Current Study
Race/Ethnicity
◦ White (n=825) – 2.43
◦ Black (n=224) – 2.26
◦ Hispanic (n=235) – 2.36
Gender
◦ Male (n=457) – 2.31
◦ Female (n=827) – 2.42
Continued examinations of student written communication skills are needed to better understand the performance of all student groups
◦ Particularly at the individual domain levels
Studies are needed to explore the interactions among different demographic variables in their effect on written communication skills
Future studies will be needed to examine the impact of any interventions for improvement
All student groups underperformed with regard to university expectations
The results did reveal an equity gap in the performance of Black students when compared to White students
The results did reveal an equity gap in the performance of male students when compared to female students
Arum, R., & Roksa, J. (2011). Academically adrift: Limited learning on college campuses. Chicago, IL: University of Chicago Press.
Association of American Colleges and Universities. (2008). College learning for the new global century: Executive summary with employers’ views on learning outcomes and assessment approaches (2008 edition). Washington, D.C: Author.
Banta, T. W., & Palomba, C. A. (2015). Assessment essentials: Planning implementing, and improving assessment in higher education (2nd ed.). San Francisco, CA: Jossey-Bass.
Cicchetti, D. V. (1994). Guidelines, criteria, and rules of thumb for evaluating normed and standardized assessment instruments in psychology. Psychological Assessment, 6, 284-290. doi:10.1037/1040-3590.6.4.284
Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum.
Field, A. (2009). Discovering statistics using SPSS (3rd ed.). Thousand Oaks, CA: Sage.
Fleiss, J. L. (2003). Statistical methods for rates and proportions (3rd ed.). New York, NY: Wiley. doi:10.1002/0471445428
Hart Research Associates. (2013). It takes more than a major: Employer priorities for college learning and student success. Washington, DC: Association of American Colleges and Universities. Retrieved from http://www.aacu.org/leap/presidentstrust/compact/2013SurveySummary
Hart Research Associates. (2015a). Bringing equity and quality learning together: Institutional priorities for tracking and advancing underserved students’ success. Key findings from a survey and in-depth interviews among administrators at AAC&U member institutions. Washington, DC: Association of American Colleges and Universities. Retrieved from http://www.aacu.org/publications/bringing-equity-and-quality-learning-together
Hart Research Associates. (2015b). Falling short? College learning and career success. Washington, DC: Association of American Colleges and Universities. Retrieved from http://www.aacu.org/leap/public-opinion-research/2015-survey-falling-short
Kuh, G. D., Ikenberry, S. O., Jankowski, N. A., Cain, T. R., Ewell, P. T., Hutchings, P., & Kinzie, J. (2015). Using evidence of student learning to improve higher education. San Francisco, CA: Jossey-Bass.
Millett, C. M., Payne, D. G., Dwyer, C. A., Stickler, L. M., & Alexiou, J. J. (2008). A culture of evidence: An evidence-centered approach to accountability for student learning outcomes. Princeton, NJ: Educational Testing Service. Retrieved from https://www.ets.org/Media/Education_Topics/pdf/COEIII_report.pdf
Montenegro, E., & Jankowski, N. A. (2017). Equity and assessment: Moving towards culturally responsive assessment. (Occasional Paper #29). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA).
Onwuegbuzie, A. J., & Daniel, L. G. (2002). Uses and misuses of the correlation coefficient. Research in the Schools, 9(1), 73-90.
Preiss, D. D., Castillo, J. C., Flotts, P., & Martín, E. S. (2013). Assessment of argumentative writing and critical thinking in higher education: Educational correlates and gender differences. Learning and Individual Differences, 28, 193-203. doi: 10.1016/j.lindif.2013.06.004
Roberts, J., Nardone, C. F., & Bridges, B. (2017). Differences in student writing ability as a function of student race and gender. Research & Practice in Assessment, 12(Winter).
Schafer, J. L. (1999). Multiple imputation: A primer. Statistical Methods in Medical Research, 8, 3-15. doi:10.1177/096228029900800102
Schlomer, G. L., Bauman, S., & Card, N. A. (2010). Best practices for missing data management in counseling psychology. Journal of Counseling Psychology, 57(1), 1-10. doi:10.1037/a0018082
Shrout, P. E., & Fleiss, J. L. (1979). Intraclass correlations: Uses in assessing rater reliability. Psychological Bulletin, 86, 420-428. doi:10.1037/0033-2909.86.2.420
Texas Higher Education Coordinating Board. (2016). Closing the gaps final progress report. Retrieved from http://www.thecb.state.tx.us/index.cfm?objectid=858D2E7C-F5C8-97E9-0CDEB3037C1C2CA3
Texas Higher Education Coordinating Board. (2017). Elements of the Texas core curriculum. Retrieved from http://www.thecb.state.tx.us/index.cfm?objectid=427FDE26-AF5D-F1A1-E6FDB62091E2A507