
General Education Student Learning Outcomes Assessment Report 2012-2014


Rochester Institute of Technology, One Lomb Memorial Drive, Rochester, NY 14623-5603, U.S.A. ©2014 Rochester Institute of Technology. All rights reserved.


Table of Contents

I. Executive Summary 1

Indirect Assessment Opportunities 3

II. Summary of Key Findings and Use of Results 4

Outcome: Use relevant evidence gathered through accepted scholarly methods, and properly acknowledge sources of information 4

Outcome: Express oneself effectively in common college-level written forms using standard American English 7

Outcome: Express oneself effectively in presentations 9

Outcome: Interpret and evaluate artistic expression considering the cultural context in which it was created 10

Outcome: Identify contemporary ethical questions and relevant positions 11

Outcome: Perform college-level mathematical operations or apply statistical techniques 13

Outcome: Comprehend and evaluate mathematical or statistical information 14

Outcome: Demonstrate knowledge of basic principles and concepts of one of the natural sciences 15

Outcome: Apply methods of scientific inquiry and problem solving to contemporary issues 16

Reflections: Evidence, Effectiveness, Engagement, and Inclusion 17

III. Appendices

A. Faculty Engagement Model 18

B. Participating Faculty Members 2012-2014 19

C. Cycle 3 General Education Student Learning Outcomes Achievement Summary 20

D. Indirect Assessment – National Survey of Student Engagement (NSSE) 23

E. Express Oneself Effectively in Presentations Rubric 24

F. Interpret and Evaluate Artistic Expression Considering the Cultural Context in Which It Was Created Rubric 25

G. Identify Contemporary Ethical Questions and Relevant Positions Rubric 26



Section I. Executive Summary

Students learn best, and assessment works best, when the academic experience is intentional, integrated, and meaningful. As RIT entered its third General Education assessment cycle, a key focus was moving beyond refining assessment processes (e.g., changes to rubrics, data collection forms, and procedures) to examining how the results are used to improve the student learning experience. This report highlights the results from our assessment initiatives and describes how those results are being used to improve student learning and to inform decision making related to curricular and instructional changes. The General Education Committee (GEC) provides oversight to ensure ongoing monitoring of the General Education curriculum and assessment of the outcomes. The GEC provides regular updates and annual reports to both the Inter-College Curriculum Committee and Academic Senate. In addition, the Faculty Associate for General Education provides campus-wide leadership in the coordination and implementation of the General Education Framework.

RIT’s General Education Student Learning Outcomes are mapped to the required Foundation and Perspective categories of the General Education Framework. This ensures that we provide opportunities for all students to achieve the General Education Student Learning Outcomes. We assessed 9 of the current 15 RIT outcomes during this cycle.

Results: Our overall progress on the achievement of the student learning outcomes is presented in Table 1. Reported results include direct assessment and, in some cases, data from the employer co-op evaluation. For complete results, see Appendix C.

Table 1: Direct Course-Based Assessment Summary

Student Learning Outcome | Achievement of Benchmark

Writing: Use relevant evidence gathered through accepted scholarly methods, and properly acknowledge sources of information | Not Met

Writing: Express oneself effectively in common college-level written forms using standard American English | Partially Met**

Communication: Express oneself effectively in presentations | Partially Met**

*Ethical: Identify contemporary ethical questions and relevant positions | Partially Met**

*Artistic: Interpret and evaluate artistic expression considering the cultural context in which it was created | Met

*Mathematical: Perform college-level mathematical operations or apply statistical techniques | Met

*Mathematical: Comprehend and evaluate mathematical or statistical information | Met

*Scientific Principles: Demonstrate knowledge of basic principles and concepts of one of the natural sciences | Met

*Natural Science Inquiry: Apply methods of scientific inquiry and problem solving to contemporary issues | Met

*Outcome assessed in more than one course. **Partially Met indicates the outcome was also assessed with the co-op evaluation and the benchmark was met in at least one of the two methods.


Engaging Faculty: Equally important is the work of our General Education Faculty Teams. RIT established eight interdisciplinary teams that provide leadership for ongoing assessment and guide the use of results for improvements to teaching and learning. The Faculty Engagement Model (see Appendix A) outlines our faculty-driven assessment practices. For a full list of the participating faculty in this assessment cycle, see Appendix B.

Academic Quality across the Globe: While the assessment of student learning remains our top priority, ensuring that students at our international campuses (Croatia, Dubai, and Kosovo) fulfill the same General Education requirements as students on the main campus is equally important. The first inclusive assessment, in 2012-13, included writing samples from First Year Writing courses at all international locations. Using video conference technology, the faculty normed themselves on the scoring guide, evaluated the student writing, and then discussed both the experience of scoring essays from all locations and the preliminary results (using the Clipboard Survey tool). The results from this assessment are provided on page 5.

Moving Beyond Process: Writing: One clear and compelling example of moving beyond process is the assessment of the General Education writing outcome: use relevant evidence gathered through scholarly methods and properly acknowledge sources of information. The assessment data revealed that students struggled with evaluating information critically, an area noted in both assessment cycles and at each of RIT's international campus locations. Not only did we move beyond process and into teaching and learning, but assessment of this outcome also led us to move beyond borders. Faculty from RIT's main campus and international campuses worked together to assess student writing and discuss the assessment findings. More details on this writing assessment can be found on page 5.

Math: After analyzing the outcomes for three cycles, the faculty hypothesized that students taking courses off schedule (i.e., students who repeat a class or take time off between courses that are meant to be taken consecutively, enrolling in what are referred to as "trailer sections") were less likely to achieve the student learning outcome benchmark than students in non-trailer courses. Math faculty partnered to collect more data on student learning in trailer sections and to develop solutions to enhance student success. The math faculty received a Student Learning @ RIT mini-grant and presented their work at a conference. More details on the math assessment can be found on pages 14-15.


Indirect Assessment Opportunities

National Survey of Student Engagement (NSSE)

NSSE measures the extent to which first-year students and seniors are engaged in educational practices associated with student success. In 2013, NSSE 2.0 was introduced; the order and grouping of items changed, and items were revised, eliminated, or replaced with new ones. We re-mapped our General Education student learning outcomes to the most closely aligned new items so that we could continue using NSSE as an indirect measure of student achievement. A summary of indirect results is provided in Appendix D.

Alumni Survey 2014

RIT's Alumni Attitudinal Survey measures graduates' perceptions of their educational experience. The goal of the survey is to better understand the impact an RIT degree has on their professional life and career path. We added an Educational Outcomes module to the 2014 survey to determine how important RIT's core educational outcomes (knowledge and skills) have been to our alumni's professional or personal life since college. We then asked how effective RIT was in supporting their development of that knowledge or those skills. We mapped the writing, communication, ethical, and math General Education student learning outcomes to the survey items.

Alumni rated writing, communication, ethical reasoning, and quantitative reasoning very high in importance: 98-99% ranked each outcome as somewhat important or higher. RIT was generally effective in supporting the development of those same skills; alumni rated RIT most effective in quantitative reasoning (78% rated good or excellent), followed by ethical reasoning and action (71%). See Tables 2 and 3 for a summary of the outcomes.

Table 2: RIT Alumni Attitudinal Survey, Level of Importance

Outcome | Not Important | Somewhat Important | Very Important | Critically Important
Effective writing | 1% | 12% | 41% | 45%
Effective oral communication | 1% | 6% | 38% | 56%
Ethical reasoning and action | 2% | 13% | 37% | 48%
Quantitative reasoning | 2% | 14% | 41% | 43%

Table 3: RIT Alumni Attitudinal Survey, Level of Effectiveness

Outcome | Poor | Fair | Good | Excellent
Effective writing | 5% | 31% | 45% | 20%
Effective oral communication | 5% | 27% | 45% | 22%
Ethical reasoning and action | 5% | 25% | 49% | 22%
Quantitative reasoning | 4% | 18% | 48% | 30%


Section II. Summary of Key Findings and Use of Results

This section provides a more detailed description of the General Education student learning outcomes assessed in this cycle. For each outcome, we provide the following:

1. A description of the assessment method (course, assignment, rubric)
2. The level of achievement of the benchmark(s) and a summary of the key findings
3. The use of results and how we are closing the loop

Writing Outcome: Use relevant evidence gathered through accepted scholarly methods, and properly acknowledge sources of information

1. Assessment Method

Research-based essays were collected from First Year Writing courses throughout AY 2012-13. The collection included 193 essays from randomly selected students in all sections to gain a sample that is representative of the First Year Writing population. All three international locations submitted essays for inclusion in the assessment.

Essays were scored by 15 full and part-time writing faculty from RIT’s main campus and four faculty from the three international locations. Faculty communicated across multiple locations via video conference technology. The University Writing Program Director led this multiple location review and discussion on the scoring guide and the assessment results from prior years. Before scoring student work, faculty participated in a norming session in which common papers were scored and reviewed as a group. Each essay in the collection was then scored by two different faculty members.

The scoring guide for this outcome was developed and piloted by faculty prior to the 2012-13 assessment. Readers evaluate the students' ability to utilize and cite sources. The scoring guide includes the following five criteria:

• Scope (how writers determine the extent of information needed)
• Context (how writers evaluate information and its sources critically)
• Purpose (how writers use information effectively to accomplish their specific purposes for writing)
• Integration and Use of Document Sources
• Variety of Sources (how writers select a range of source material that relates directly to their specific purpose for writing)

2. Achievement of Benchmarks and Summary of Key Findings

RIT's aspirational benchmark is that 100% of students will receive a score of 5 (out of 20) or greater. This benchmark was not met: only 90% of students received a score of 5 or greater on the rubric (see Table 4). Rubric scores ranged from 2 to 20, with an average score of 11.

Table 4: Overall % of Students Meeting the Benchmark

Location | All | RIT | Croatia | Dubai | Kosovo
# Essays | 193 | 148 | 14 | 7 | 24
% Met Benchmark (scored 5/20 or better) | 90% | 92% | 68% | 87% | 98%
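For readers who want to reproduce this kind of summary from raw rubric scores, the "percent meeting the benchmark" figure is a simple threshold count. The sketch below is illustrative only: the scores are invented (the report publishes only aggregates), and the function name is ours, not part of any RIT tool.

```python
# Illustrative only: compute the share of essays scoring at or above a rubric cutoff.
# The scores below are invented; the report publishes only aggregate percentages.

def percent_meeting_benchmark(scores, cutoff=5):
    """Return the percentage of rubric totals at or above the cutoff."""
    if not scores:
        return 0.0
    return 100.0 * sum(s >= cutoff for s in scores) / len(scores)

sample_scores = [4, 7, 11, 15, 20, 3, 9, 12, 6, 17]   # hypothetical totals out of 20
print(f"{percent_meeting_benchmark(sample_scores):.0f}% met the benchmark")  # -> 80%
```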


Although the benchmark (100% achieving a 5 or greater) was not changed from the prior pilot assessment, additional performance benchmarks were set to provide specific goals for improvement by criterion. The new performance benchmarks are described in Table 5 and focus on the areas with the lowest assessment scores: Context, Purpose, and Integrates.

Table 5: Adjusted Benchmarks and Results 2013

Criterion | Benchmark | Results
Context | 85% of students will receive a 1 (benchmark) or better on the rubric | Not Met: 79% of students received a 1 (benchmark) or better
Purpose | 90% of students will receive a 1 (benchmark) or better on the rubric | Met: 93% of students received a 1 (benchmark) or better
Integrates | 75% of students will receive a 2 (milestone) or better on the rubric | Met: 84% of students received a 2 (milestone) or better

1 = Benchmark, 2-3 = Milestone, 4 = Capstone

While the percentage of students meeting the predetermined benchmark varied by campus location, the criterion with the lowest score at each location was Context. Additionally, Context was the only area where the performance benchmark was not met.

3. Use of Results and Closing the Loop

The faculty debriefed immediately following the writing assessment to discuss the process and their experiences. The discussion focused on three questions.

What did the assessment show you about students' use of relevant sources? Responses:

• Synthesizing information is difficult for students
• Introducing sources is difficult: saying why the source is valid and important to the discussion
• Students tend either to present information rather than using it to back a claim, or to make a strong argument and back the claim with their own opinions

How will this experience impact your teaching? Responses:

• We are expecting student writers to accomplish tasks that we are not always teaching
• Enhance the practice of assessing work collaboratively in class

What did you think of the overall assessment process? Responses:

• The broad range of assignments complicates the assessment
• Significant outliers sometimes skew the assessment

The faculty met a second time to share and develop potential assignments, activities, and teaching strategies aimed at addressing what appeared to be student challenges. Table 6 provides a summary of common challenges and instructional strategies. The focus of the discussion was on curriculum and instruction related to students' use of evidence, with a particular emphasis on Context. Faculty reviewed this criterion using sample essays from the collection as the basis for discussion. During that meeting, faculty collaborated on the creation of course activities, assignments, and pedagogical approaches that would:

• Ensure that students have greater opportunity to practice evaluating sources
• Improve students' achievement of the learning outcome

Table 6: Context - Challenges and Instructional Strategies

Common Challenge: Attention to the rhetorical context of a source
Instructional Strategies:
• "Post-card" activity – student writers work in small groups to analyze the "rhetorical space" of a source and its argument, and gain rhetorical awareness

Common Challenge: Analysis of the assumptions of the author of the source
Instructional Strategies:
• "Where's the 'poop'?" – student writers work together to find authors' assumptions, and gaps or questions that authors fail to address
• Author-Question Postcard – student writers pose questions that other students attempt to answer as the source author

Common Challenges: Sources not presented in dialogue with other sources; student writer's own place in the dialogue is unclear
Instructional Strategies:
• Visual Mapping – student writers use color coding to highlight the proximity of sources
• "Signal Phrases" – student writers use color coding to highlight key phrases introducing different sources
• Topic Assumption List – student writers take turns to identify and compare the authors' assumptions of different sources
• Role Playing – student writers take on roles of the authors of different sources and act out a dialogue among the sources

Another focus of the workshop was a discussion of the scoring guide and a review of the current benchmark. Changes made to the assessment instrument and process included:

• Bringing each criterion into alignment by keeping the 0-4 (five-point) scale and expanding the scale of the Integrate and Variety criteria to include a value of 0, or "below benchmark"; the minimum score for an essay can now be 0
• Aligning the Clipboard survey tool to match the scoring guide for ease of data entry and analysis
• Review of benchmarks with faculty will be ongoing, but the benchmarks will remain the same for the next assessment cycle

The previous General Education report included details about the initial pilot assessment of this outcome and the recommendations made by the faculty. Completing this follow-up assessment was a priority recommendation. The other high priority action item was to review and discuss the results from the assessment with faculty and collect information on teaching and curriculum related to using source data. This faculty development approach was completed on RIT’s campus and will be expanded to the international campuses in spring 2015.


Writing Outcome: Express oneself effectively in common college-level written forms using standard American English

1. Assessment Methods

This outcome was assessed using research-based essays collected from First Year Writing courses throughout AY 2012-13. The collection included 193 essays from randomly selected students in all sections to gain a sample that is representative of the First Year Writing population. All three international locations submitted essays for inclusion in the assessment.

The scoring guide for this outcome was developed and piloted in 2012 and revised in 2013. The focus of the scoring guide is on the understanding of audience, purpose, genre, content and thought process as demonstrated through appropriate use of rhetorical conventions.

This outcome is also assessed using the Co-op Employer Evaluations (n=2000). Employers are asked to provide feedback on students who have worked for them in a co-op experience. The Co-op Employer Evaluation was revised in 2013 to align more closely with the General Education Student Learning Outcomes. The new performance criteria include "Communicates effectively in written form" and "Demonstrates effective written skills."

2. Achievement of Benchmarks and Summary of Key Findings

RIT's aspirational benchmark, that 100% of students will receive a 2 (Emerging) or better out of a possible 4 (Highly Competent), was not met. Only 90% of students received a 2 or better on the rubric. However, the percentage of students meeting the benchmark increased over the past two years, from 78% in 2012 to 90% in 2013.

This assessment cycle also includes scores from the international locations. The addition of international location samples does not appear to be a factor in the increase in students meeting the benchmark as overall scores were generally lower for the international locations.

Table 7: Course Assessment Effective Writing Skills

Rating | Cycle 2 (2012), RIT only (n=174) | Cycle 3 (2013), all locations (n=193)
Weak (1) | 22% | 10%
Emerging (2) | 29% | 47%
Competent (3) | 35% | 32%
Highly Competent (4) | 14% | 11%
Total scores of 2 or better | 78% | 90%

The Co-op Employer Evaluation benchmark goal is that employers will rate RIT students’ communication skills as a 4 (Exceeds Expectations) or greater. This benchmark was met, with students from different colleges receiving average ratings from employers above 4 on their written communication skills for both survey items (see Table 8).


Table 8: Co-op Employer Evaluation Effective Writing Skills

Co-op Employer Evaluation Item | # of Respondents | Benchmark | Mean Rating AY 2013-14 (1-5)*
Communication Skills: Communicates effectively in written form | 2232 | Met | 4.16
Writing: Demonstrates effective written skills | 278 | Met | 4.2

*5 = Excellent, 4 = Exceeds Expectations, 3 = Meets Expectations, 2 = Progressing Towards Expectations, 1 = Poor

3. Use of Results and Closing the Loop

RIT faculty met to discuss the results of both writing assessments and focused on curriculum and instruction related to students’ effective written expression.

Instruction: After the previous assessment, faculty determined that papers scored below the benchmark should be reviewed to gain a better understanding of what was needed to better support students. While this review is ongoing, a number of improvements to the support of student writing have already been made:

• The University Writing Program hired a new Writing Commons Director
• In the Writing Commons, priority has shifted away from walk-in consultations to appointment-based consultations, including recurring weekly appointments. Each semester, referrals are solicited from campus-wide programs that provide support to at-risk students (e.g., HEOP, Spectrum Support, College Restoration Program). This shift in focus helps develop more meaningful relationships with students who may struggle in the transition to college-level writing
• The Writing Commons Director emphasized class visits early in the semester to increase student awareness of locations for writing support
• The Writing Commons implemented a faculty referral form so students, consultants, and faculty can appropriately identify students' needs and address them through writing consultations
• The First Year Writing course aligned placement practices with NTID, HEOP, and the English Language Center in order to place students preparing to complete or exit those programs in appropriate writing courses (i.e., either Critical Reading and Writing or FYW: Writing Seminar)

Assessment: University Writing Program faculty also determined the following related to their assessment practices in 2014-15:

• Faculty will again evaluate students' ability to express themselves effectively in writing during the assessment of another outcome: students' ability to revise and improve written products
• The benchmark will be reviewed with faculty and modified, if appropriate

Over the past few years of analyzing co-op evaluation data, we have found that employers typically rate students' writing skills higher than faculty do. This may be attributable to the differences between academic writing and job-related writing.


Communication Outcome: Express oneself effectively in presentations

A new General Education faculty team was convened to plan the initial assessment for this outcome. The team developed a rubric to define and assess effective presentation skills. The faculty emphasized the need for an instrument that could be integrated into a variety of General Education courses that provide opportunities to assess this outcome.

1. Assessment Method

This outcome was assessed in multiple sections of Public Speaking (n=18) on three assignments: Introductory Speech, Informative Speech, and Persuasive Speech. The outcome was also assessed in Human Communication (n=60) using an individual oral presentation assignment. The newly developed rubric defines three performance categories for effective presentations: content and organization, language, and delivery. The rubric was designed using a checklist format to facilitate the scoring of student presentations. The rubric can be found in Appendix E.

This outcome was also assessed using the Co-op Employer Evaluation. RIT established a benchmark that employers will rate RIT students’ ability to express oneself effectively in presentations as a 4 (Exceeds Expectations) or greater out of 5.

2. Achievement of Benchmarks and Summary of Key Findings

For direct assessment in courses, RIT established a benchmark that 80% of students receive an overall rubric score of 21 (Accomplished) or better. The benchmark was met in both courses, with 98% of students in Human Communication and 100% of students in Public Speaking receiving an overall rubric score of 21 (Accomplished) or greater.

The Co-op Employer Evaluation was also used to measure the outcome (see Table 9). The Co-op Evaluation benchmark is 4.0, and the mean rating was 3.9. We are approaching the benchmark and will collect and review the data over the next few semesters to monitor the mean.

Table 9: Co-op Evaluation Effective Presentation Skills

Co-op Employer Evaluation Item | # of Respondents | Benchmark | Mean Rating AY 2013-14 (1-5)*
Verbal Communication: Express oneself effectively in presentations | 914 | Not Met | 3.9

*5 = Excellent, 4 = Exceeds Expectations, 3 = Meets Expectations, 2 = Progressing Towards Expectations, 1 = Poor

3. Use of Results and Closing the Loop

The results of this assessment are encouraging, and the faculty plan to continue using the rubric in courses. The faculty shared and reviewed the rubric with students, which helped clarify the expectations and performance criteria related to effective speaking.

The faculty recommended that this assessment be expanded to include a larger variety of General Education courses to increase opportunities for students to demonstrate the outcome through presentations. It was noted that students taking both of these courses were given more extensive instruction in effective public speaking methods and skills. Completing this assessment in a different setting or course would be a good way to test the rubric. The rubric could serve as a good resource for faculty as it outlines all criteria for demonstrating effective presentation skills. We will continue to monitor the results of the Co-op Employer Evaluation for trends.


Artistic Outcome: Interpret and evaluate artistic expression considering the cultural context in which it was created.

A new General Education faculty team convened to plan the initial assessment of the artistic outcome. The team included faculty from English, Photographic Arts and Sciences, and Art History, and they focused on developing a rubric (see Appendix F) that could be applied across multiple disciplines and General Education courses.

1. Assessment Method

The outcome was initially assessed in three sections of Survey of Western Art and Architecture II (n=208) in the spring of 2013. Faculty selected assignments that provided evidence of the achievement of the student learning outcome which included a comparison paper, an exam, and an artistic response paper.

The rubric includes three criteria that delineate the outcome:

• Experience and explore artistic expression
• Interpret and evaluate art, providing a cogent critique utilizing formal concepts and appropriate terminology
• Demonstrate informed appreciation of art forms in their cultural and historical context and recognize the distinct contribution of art to human life

2. Achievement of Benchmarks and Summary of Key Findings

RIT faculty established multiple benchmarks that are considered developmental, because they determined that students need to first experience and explore before being able to interpret and evaluate artistic expression:

• Experience and Explore: 80% of students will accomplish a 3 (Acceptable) or better
• Interpret and Evaluate: 70% of students will accomplish a 3 (Acceptable) or better
• Knowledge and Appreciation: 60% of students will accomplish a 3 (Acceptable) or better

All benchmarks were met in each section of Survey of Western Art and Architecture II. The data indicate that a higher percentage of students achieved an acceptable rating on Experience and Explore and that the subsequent areas were more challenging. These results are similar to what the faculty team predicted.

3. Use of Results and Closing the Loop

Although the results indicated students achieved the established student learning outcome benchmark, the faculty determined a need to expand this assessment to a variety of courses. Additionally, the rubric was designed intentionally to apply to the interpretation and evaluation of a variety of disciplines and art forms including art history, photography, architecture, and poetry. Expanding this assessment is an important next step in testing the rubric.

The faculty team made the following recommendations based on the results of the assessment:

• Add a mid-semester meeting to provide clarity and consistency on how rubric scores are designated (this will be incorporated into the spring 2015 assessment)
• Refine performance levels on the rubric (minor modifications were made)
• Use the rubric to intentionally articulate and clarify expectations with students


Ethical Outcome: Identify contemporary ethical questions and relevant positions

An interdisciplinary General Education faculty team attended a full-day retreat in June 2012 to plan for the assessment of the ethical outcome. The team began by defining the outcome and developing a rubric with corresponding framing language that could be generally applied across courses and disciplines (see Appendix G).

1. Assessment Method

This outcome was assessed for the first time in two courses, STSO 220 Environment and Society (n=21) and Introduction to Environmental Studies (n=38). The faculty attended an assessment kick-off meeting to review General Education assessment at RIT, the ethical outcome rubric and framing language, the student achievement benchmarks, and the data collection forms for recording and submitting their findings. Faculty members selected essay questions from mid-term and final exams, as well as take-home essays, as the primary data sources for the assessment of the outcome.

The team then set a preliminary benchmark for student achievement and made recommendations for conducting an initial assessment. When constructing this rubric, the faculty team differentiated three distinct elements that defined the student learning outcome. Each of these individual steps has its own benchmark.

They envisioned a step-by-step process in which, at the highest level, a student is able to do all of the following:

• Recognize an ethical problem
• Identify possible positions and consider the full implications
• Select a position and provide logical justification for it, responding to objections and displaying original insight

2. Achievement of Benchmarks and Summary of Key Findings

The rubric was created with three distinct but interrelated benchmarks, as each one builds on the previous:

• Ethical Problem Recognition: 90% of students will earn a 3 (Acceptable)
• Identify Ethical Positions: 80% of students will earn a 3 (Acceptable)
• Evaluation of Different Ethical Perspectives: 70% of students will earn a 3 (Acceptable)

Rubric Scale: 1 = Insufficient, 2 = Developing, 3 = Acceptable, 4 = Exemplary

Figure 1: Average on Rubric Elements (all classes): Recognize Ethical Problems 75%, Identify Ethical Positions 66%, Evaluate Ethical Perspectives 56%


3. Use of Results and Closing the Loop

The faculty analyzed the data and determined the difference in outcome results was due to the selection of assignments. Some assignments did not provide the best opportunity for students to demonstrate achievement of the student learning outcome. The structure of exam questions may have contributed to a negative result, as faculty felt they did not offer students enough of an opportunity to demonstrate the learning outcome.

While faculty members found the assessment process helpful in improving the course and agreed that their courses "fit" the outcome, they felt unsure about the assignments they selected and requested additional support, in terms of guidelines and resources, for identifying assignments that elicit more in-depth responses. Based on these requests and the inconsistent results from the pilot assessment, the initial faculty team and the implementation team worked together to develop resources for faculty assessing this outcome, including key concepts, potential pedagogical approaches, and assignment samples. The Ethical Guidelines are available on the SLOA website (rit.edu/outcomes). Going forward, faculty assessing this outcome will be able to use the newly developed resources. In addition, the sample will be expanded to include business, social sciences, and humanities courses in additional disciplines such as philosophy.

This outcome was also assessed using the RIT Co-op Employer Evaluation. The benchmark is that employers will rate RIT students' ethical and professional behavior as a 4 (Exceeds Expectations) or greater. Employers were asked to rate students on the item Demonstrates ethical and professional behavior. The benchmark was met (see Table 10), with a mean rating of 4.5 for RIT students' ethical and professional behavior.

Table 10: Co-op Evaluation Demonstrates Ethical and Professional Behavior

Co-op Employer Evaluation Item | # of Respondents | Benchmark | Mean Rating AY 2013-14 (1-5)*
Ethics: Demonstrates ethical and professional behavior | 3298 | Met | 4.5

*5 = Excellent, 4 = Exceeds Expectations, 3 = Meets Expectations, 2 = Progressing Towards Expectations, 1 = Poor

Although our direct course assessment yielded varying results, employers clearly rated students very highly on demonstrating ethical and professional behavior. We will continue to monitor the co-op evaluation trends.


Mathematical Outcome: Perform college-level mathematical operations or apply statistical techniques

1. Assessment Method

The General Education math faculty team uses the assessment process as an opportunity to have conversations about student learning outcomes data, share teaching strategies, and collaborate with colleagues. It was particularly important for the team to continue to collect data in the last year of the quarter model to provide another point of comparison before RIT transitioned into semesters. The faculty repeated the assessment with the following courses: Calculus B, Data Analysis I, University Physics II, and Calculus C (total n=155). Faculty selected questions on exams and quizzes as the assessment opportunities.

2. Achievement of Benchmarks and Summary of Key Findings

RIT faculty set two benchmarks: 1) 80% of the students achieve an overall rubric score of 2.0 or higher, and 2) 50% of the students achieve an overall rubric score of 3.0 or higher. A summary of the direct, course-embedded findings can be found in Table 11.
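As a minimal sketch of how the two thresholds combine into a Met / Partially Met / Not Met determination (the classification logic is our reading of how the report labels courses, and the rubric scores are invented):

```python
# Hypothetical sketch: classify a course against the two math benchmarks
# (80% of students at rubric score >= 2, 50% at >= 3). Scores are invented.

def benchmark_status(rubric_scores):
    """Return a label plus the percentage at or above each threshold."""
    n = len(rubric_scores)
    pct_at_2 = 100.0 * sum(s >= 2 for s in rubric_scores) / n
    pct_at_3 = 100.0 * sum(s >= 3 for s in rubric_scores) / n
    met_count = (pct_at_2 >= 80) + (pct_at_3 >= 50)
    label = {2: "Met", 1: "Partially Met", 0: "Not Met"}[met_count]
    return label, pct_at_2, pct_at_3

print(benchmark_status([2, 3, 3, 4, 1, 2, 3, 2]))  # ('Met', 87.5, 50.0)
```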

3. Use of Results and Closing the Loop

After collecting data for multiple years, the math faculty determined that students' ability to meet the benchmark was highly dependent on whether students taking sequential math courses were "on track." When courses are offered off schedule (e.g., Data Analysis, offered in the spring), it is likely that some students are taking the course for the second time, and scores in the "trailer" sections tend to be lower. Three math faculty members applied for an assessment grant with the goal of increasing student success in the math trailer sections. The team wanted to gain a better understanding of how well trailer and non-trailer students meet the course learning outcomes and then formulate recommendations for additional student learning support. Faculty analyzed and used the results to determine the following course-specific improvements to curriculum and instruction:

• Complete a question-by-question analysis
• Test various pedagogical strategies to implement in trailer sections
• Add an assignment related to the topic of concern
• Add an additional assessment opportunity (weekly online quiz) designed to assess student knowledge of the concepts in addition to the procedures

Table 11: Summary of Findings for Perform College-Level Mathematical Operations or Apply Statistical Techniques

Benchmark (all courses): 80% of students achieve an overall rubric score of 2 (Developing) or higher; 50% of students achieve an overall rubric score of 3 (Competent) or higher.

• Calculus B (n=32; data sources: Exams 1, 2, and 3): Partially Met. 2 or higher: 84%; 3 or higher: 34%
• Data Analysis I (n=36; data sources: Exams 1, 2, and 3): Met. 2 or higher: 100%; 3 or higher: 78%
• University Physics II (n=38; data sources: 6 questions on 3 exams): Met. 2 or higher: 100%; 3 or higher: 91%
• Calculus C (n=49; data sources: quizzes and exams): Met. 2 or higher: 90%; 3 or higher: 71%



In the next cycle, RIT will include its international locations in the assessment of the two math outcomes. A mathematical outcome “tool kit” was developed by the math faculty. The tool kit contains a variety of resources including sample problems.

Mathematical Outcome: Comprehend and evaluate mathematical or statistical information

1. Assessment Method

The General Education math faculty team repeated the assessment of this outcome as well, using the following courses: Calculus C and Data Analysis I (total n= 86). Faculty selected questions on exams and quizzes as the assessment opportunities.

2. Achievement of Benchmarks and Summary of Key Findings

RIT faculty set benchmarks that 80% of the students would achieve an overall rubric score of 2 or higher and that 50% of the students would achieve an overall rubric score of 3 or higher. A summary of the direct, course-embedded findings for Cycle 3 can be found in Table 12.

3. Use of Results and Closing the Loop

The majority of the recommendations from the last cycle were implemented, including revision of the benchmarks, guidelines about rounding and course withdrawal, additional research on "trailer sections," and increasing student success in trailer sections. Faculty will focus on the following curricular modifications: introduce more difficult problems based on the success of this course, add weekly online quizzes designed to test students' knowledge of the concepts instead of just procedures, and increase the time spent on topics that the data indicated were difficult for students.

Table 12: Summary of Findings for Comprehend and Evaluate Mathematical or Statistical Information

Benchmark (both courses): 80% of students achieve an overall rubric score of 2 (Developing) or higher; 50% of students achieve an overall rubric score of 3 (Competent) or higher.

• Calculus C (n=49; data sources: quizzes and exam questions): Met. 2 or higher: 92%; 3 or higher: 76%
• Data Analysis I (n=37; data sources: exam questions): Met. 2 or higher: 100%; 3 or higher: 95%


Scientific Principles Outcome: Demonstrate knowledge of basic principles and concepts of one of the natural sciences

1. Assessment Method

This is the fifth year that RIT has collected data on the two scientific student learning outcomes. Faculty wanted to continue refining their teaching and assessment practices and to collect data from the last assessment conducted under the quarter calendar.

One of the goals was to increase the sample size beyond individual courses. Faculty recommended using the Physics common exam, as it provided an opportunity for students to achieve the outcome. The final exam tests students' overall ability to use basic physics principles and concepts in problem solving and reasoning, covering material typically found in an introductory-level course in electricity and magnetism. Faculty reviewed the course constructs and determined that the final exam aligned directly to the General Education student learning outcome being assessed.

For the first time, the commonly graded final exam from University Physics II (n=578) was used to assess this outcome. University Physics II was offered in 20 sections during the fall and spring semesters. The course coordinators developed a method to "rate" the final exam score using the university-developed rubric.

Table 13: Data Source Alignment to Rubric

Rubric Scale | Exemplary (4) | Competent (3) | Developing (2) | Beginning (1)
Final Exam Score Range | 85+ | 70-85 | 55-70 | below 55
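A minimal sketch of how such a conversion might be implemented is shown below. The score bands in Table 13 share endpoints (e.g., 70-85 and 85+), so the boundary handling here is an assumption, and the example exam scores are invented.

```python
# Hypothetical sketch of the Table 13 conversion from a final exam score to a rubric level.
# Boundary handling is assumed: scores exactly at a band edge are placed in the higher level.

def exam_score_to_rubric(score):
    """Map a final exam percentage to a rubric level (1-4)."""
    if score >= 85:
        return 4   # Exemplary
    if score >= 70:
        return 3   # Competent
    if score >= 55:
        return 2   # Developing
    return 1       # Beginning

print([exam_score_to_rubric(s) for s in (92, 71, 60, 40)])  # [4, 3, 2, 1]
```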

2. Achievement of Benchmarks and Summary of Key Findings

RIT’s benchmark: 1) 80% of students achieve an overall rubric score of 2 or higher and 2) 50% of students achieve an overall rubric score of 3 or higher.

The benchmark was met with 83% of students scoring a 2 or better on the rubric and 51% of students scoring a 3 or better on the rubric. The results were similar for both semesters.

Figure 2: Data Summary by Rubric Element (percentage of students at each rubric level)
Fall: Exemplary (4) 16%, Competent (3) 37%, Developing (2) 31%, Beginning (1) 16%
Spring: Exemplary (4) 17%, Competent (3) 33%, Developing (2) 32%, Beginning (1) 18%
Both: Exemplary (4) 16%, Competent (3) 35%, Developing (2) 32%, Beginning (1) 17%

3. Use of Results and Closing the Loop

Faculty reviewed the data and, although the benchmarks were both met, they would like to continue to collect data so there is a larger sample size and the groups can be compared from year to year. Additionally, faculty recommended collecting data from College Physics.



Scientific Inquiry Outcome: Apply methods of scientific inquiry and problem solving to contemporary issues

1. Assessment Method

Another focus for this outcome was the inclusion of laboratory science courses. The Separations Lab, General and Analytical Chemistry, and two sections of the General Biology Lab III (for majors and non-majors) were included in the sample (total n=445).

2. Achievement of Benchmarks and Summary of Key Findings

RIT’s benchmark is that 80% of the students achieve an overall rubric score of 2 or higher and 50% of students achieve an overall rubric score of 3 or higher.

The benchmark was met in each course: 87-98% of students scored a 2 or better on the rubric, and 53-74% scored a 3 or better. The results were similar for both semesters. Table 14 provides a summary of the data.

3. Use of Results and Closing the Loop

Faculty reviewed the data and, although the benchmarks were met in all courses, recommended the following curriculum, instruction, and assessment actions:

• Conduct further research to determine how best to teach the topic of solubility and to identify the exact cause of the student misconceptions
• Because the benchmark is consistently met in labs, increase the benchmark in lab sections to 3.0
• Increase the rigor in areas such as homework and quizzes to capture the range of students' abilities and to help them improve

Table 14: Summary of Findings for Apply Methods of Scientific Inquiry and Problem Solving to Contemporary Issues

Benchmark (all courses): 80% of students achieve an overall rubric score of 2 (Developing) or higher; 50% of students achieve an overall rubric score of 3 (Competent) or higher.

• General & Analytical Chemistry (n=171; data sources: exam questions): Met. 2 or higher: 87%; 3 or higher: 53%
• General Biology Lab III, non-science majors (n=79; data sources: homework, quizzes, lab report, exams): Met. 2 or higher: 92%; 3 or higher: 57%
• General Biology Lab III, science majors (n=147; data sources: homework, quizzes, lab report, exams): Met. 2 or higher: 93%; 3 or higher: 74%
• Separations Lab (n=48; data sources: lab reports): Met. 2 or higher: 98%; 3 or higher: 73%


Reflections: Evidence, Effectiveness, Engagement, and Inclusion

Evidence of Learning: As indicated, our focus has shifted to assessing the outcomes and to the ability we now have to provide credible evidence of student learning to relevant internal and external stakeholders. Faculty use student learning outcomes data to determine improvements that impact student success in terms of achieving the student learning outcomes. The majority of our improvements involve making changes to curriculum, revising individual General Education courses, adding or revising instructional strategies, and directly addressing students' learning needs. See Table 1 or Appendix C for examples of improvements.

Effective Assessment: At the conclusion of the previous General Education Student Learning Outcomes Assessment report, we noted that assessment processes are continually improving and evolving at RIT. We know this will endure as assessment, at its best, is a dynamic process and our goal continues to be sustaining effective assessment practices. As we examine the work done by RIT’s faculty, the outcomes that we have now assessed multiple times using direct course-level assessment yield a high level of consistency among faculty raters. Our reform efforts will continue to focus on improving teaching and learning – the ultimate measure of effective assessment.

Engagement of Faculty: As our faculty engagement model continues to evolve, RIT faculty have moved into a new role of mentoring and working with additional faculty to build capacity for assessing the General Education student learning outcomes. Faculty members who participate on faculty teams or assess a General Education student learning outcome in their courses become part of a larger collaborative exchange of ideas about improving student outcomes. We have faculty who volunteer to work with international faculty, recruit other faculty, and integrate and share their expertise with their home programs. RIT faculty add breadth and depth to our faculty engagement model and assessment work.

Inclusive Assessment: We are growing on a variety of fronts. First, the inclusion of international campuses in the assessment of General Education student learning outcomes has increased our faculty and student numbers; second, our faculty teams have grown in scope and outreach; and finally, we have added new methods to increase our sample sizes and improve our ability to generalize our findings. This growth has allowed us to "close the loop" on some important initiatives from the previous cycle, including moving beyond tactical concerns (processes and practices) to focus on using data to improve student learning, relying less on indirect measures and expanding direct measures, and increasing the level of confidence in the data.

Next Steps

As noted throughout the report, we have made progress or closed the loop on our recommendations from the last cycle. Our work in the next cycle is focused on working with all campuses to assess six General Education student learning outcomes. The planning and preparation to include the international campuses is extensive, and the goal is to infuse systematic processes to sustain meaningful assessment that will yield evidence of student learning. We also plan to continue expanding our assessment opportunities in additional courses to increase faculty engagement and our sample sizes.


Section III. Appendices

Appendix A: Faculty Engagement Model


Appendix B: Participating Faculty Members 2012-2014

Robert Barbato, Professor, Management, E. Philip Saunders College of Business
Evelyn Brister, Assistant Professor, Philosophy Department, College of Liberal Arts
Peter Cardegna, Associate Head and Professor, School of Physics and Astronomy, College of Science
Collette Caton, Lecturer, University Writing Program, Academic Affairs
Rebecca Charry, Faculty, English Department, RIT Croatia
Pamela Conley, Associate Professor, Liberal Studies, National Technical Institute for the Deaf
Sandi Connelly, Assistant Professor, Thomas H. Gosnell School of Life Sciences, College of Science
Matthew Coppenbarger, Associate Professor, Associate Head, School of Mathematical Sciences, College of Science
Rebecca Daggar, Senior Lecturer, School of Mathematical Sciences, College of Science
Daniel Deibel, Adjunct Faculty, School of Chemistry and Materials Science, College of Science
Mary Delmastro, Visiting Assistant Professor, College of Imaging Arts & Sciences
Gina Ferrari, Lecturer, Foundations, College of Imaging Arts & Sciences
Amy Foley, Adjunct Faculty, University Writing Program, College of Liberal Arts
Elizabeth Hane, Associate Professor/Associate Head, Thomas H. Gosnell School of Life Sciences, College of Science; Faculty Associate in General Education, Academic Affairs
Dawn Hollenbeck, Associate Professor, Undergraduate Program Coordinator, College of Science
Gail Hosking, Lecturer, University Writing Program, Academic Affairs
Michelle Jansen, Visiting Assistant Professor, English Department, College of Liberal Arts
Keith Jenkins, Associate Professor, Undergraduate Program Director, College of Liberal Arts
Erin Karl, Faculty, English Language Center, Student Affairs
Christine Keiner, Associate Professor, Science, Technology & Society Department, College of Liberal Arts
Pamela Kincheloe, Assistant Professor, Liberal Studies, National Technical Institute for the Deaf
Jonathan Kruger, Associate Professor, Performing Arts and Visual Culture Department, College of Liberal Arts
Carrie Lahnovych, Lecturer, School of Mathematical Sciences, College of Science
Joseph Lanzafame, Senior Lecturer, School of Chemistry and Materials Science, College of Science
Carl Lutzer, Assistant Head, Professor, School of Mathematical Sciences, College of Science
David Martins, Associate Professor, University Writing Program Director, Academic Affairs
Deana Olles, Senior Lecturer, School of Mathematical Sciences, College of Science
Heidi Nickisher, Senior Lecturer, School of Art, College of Imaging Arts & Sciences
Michael Palanski, Assistant Professor, Management, E. Philip Saunders College of Business
Andrew Perry, Senior Lecturer, University Writing Program, Academic Affairs
Elizabeth Reeves-O'Connor, Senior Lecturer, Communication Department, College of Liberal Arts
Wade Robison, Ezra A. Hale Chair in Applied Ethics, Philosophy Department, College of Liberal Arts
John Roche, Associate Professor, English Department, College of Liberal Arts
Phillip Shaw, Lecturer, University Writing Program, Academic Affairs
Christine Shank, Assistant Professor, MFA Director, School of Photographic Arts & Sciences, College of Imaging Arts & Sciences
Richard Shearman, Associate Professor, Science, Technology & Society Department, College of Liberal Arts
Chip Sheffield, Associate Professor, College of Imaging Arts & Sciences; Eugene H. Fram Chair in Critical Thinking, Academic Affairs
Thomas Stone, Lecturer, University Writing Program, Academic Affairs
Paulette Swartzfager, Lecturer, University Writing Program, Academic Affairs
Sarah Thompson, Assistant Professor, School of Art, College of Imaging Arts & Sciences
Helen Timberlake, Senior Lecturer, School of Mathematical Sciences, College of Science
Karen vanMeenen, Lecturer, English Department, College of Liberal Arts
Michael Waschak, Faculty, Public Policy and Governance, American Education in Kosovo
Kevin Watson, Faculty, English Department, RIT Dubai
Dianna Winslow, Assistant Professor, University Writing Program, First Year Writing Program Director, English Department, College of Liberal Arts


Appendix C: Cycle 3 General Education Student Learning Outcomes Achievement Summary

Student Learning Outcome: Writing: Use relevant evidence gathered through accepted scholarly methods, and properly acknowledge sources of information
Data Source: Writing Seminar (FYW), claim-based research essay
Performance Benchmark: 100% of students will receive a total rubric score of 5 or better
Findings: 90% earned a total score of 5 or better
Achievement of Benchmark: Not Met
Use of Results / Action Items: Based on the results, Context was selected as a focus area and the writing faculty examined pedagogical strategies to support student achievement of the outcome.

Student Learning Outcome: Writing: Express oneself effectively in common college-level written forms using standard American English
Data Source: Writing Seminar portfolio collection, claim-based research essay
Performance Benchmark: 100% of students will receive an overall 2 (Emerging) or better on the scoring guide
Findings: 90% earned an overall score of 2 (Emerging) or better
Achievement of Benchmark: Not Met
Use of Results / Action Items: The faculty determined the need to provide more direct writing support to students. They instituted weekly appointments in the Writing Commons and created a referral form to facilitate ongoing dialogue among faculty, students, and writing consultants.

Data Source: Co-op Employer Evaluation item: Communicates effectively in written form
Performance Benchmark: RIT students will receive an overall mean rating of 4 (Exceeds Expectations) or higher
Findings: Mean rating for students was 4.16
Achievement of Benchmark: Met

Data Source: Co-op Employer Evaluation item: Demonstrates effective written skills
Performance Benchmark: RIT students will receive an overall mean rating of 4 (Exceeds Expectations) or higher
Findings: Mean rating for students was 4.2
Achievement of Benchmark: Met

Communication: Express oneself effectively in presentations

COMM 201 Public Speaking and COMM 101 Human Communication Presentations

80 % of the students receive an overall score of 21 (Accomplished) or better on the rubric

99%** earned an overall score of 21 (Accomplished) or better

Met

A core focus is to pilot the rubric in a wider variety of courses that include presentations. Completing this assessment in a different type of course will further test the rubric and serve as a resource for faculty as it outlines criteria for demonstrating effective presentation skills.

Co-op Employer EvaluationItem: Express oneself effectively in presentations

RIT students’ will receive an overall mean rating of 4 (Exceeds Expectations) or higher

Mean rating for students was 3.9

Not Met


Identify contemporary ethical questions and relevant positions*
Data Source: STSO 220 Environment and Society and STSO 120 Introduction to Environmental Studies
Performance Benchmark: 90% of students will achieve a 3 (Acceptable) or better on Ethical Problem Recognition
Findings: 50%** earned a 3 (Acceptable) or better
Performance Benchmark: 80% of students will achieve a 3 (Acceptable) or better on Identify Ethical Positions
Findings: 60%** earned a 3 (Acceptable)
Performance Benchmark: 70% of students will achieve a 3 (Acceptable) or better on Evaluation of Different Ethical Perspectives
Findings: 69%** earned a 3 (Acceptable)
Achievement of Benchmark: Not Met
Use of Results / Action Items: Design assignments with more attention to the differences between ethical problem recognition, identification, and evaluation. Develop examples of assignments across disciplines.
Data Source: Co-op Employer Evaluation item: Demonstrates ethical and professional behavior
Performance Benchmark: RIT students will receive an overall mean rating of 4 (Exceeds Expectations) or higher
Findings: Mean rating for students was 4.5
Achievement of Benchmark: Met

Interpret and evaluate artistic expression considering the cultural context in which it was created*
Data Source: Survey of Western Art and Architecture II exams and response papers
Performance Benchmark: 60% of students will earn a 3 (Acceptable) or better on the Overall: Knowledge and Appreciation of Cultural Context criterion of the rubric
Findings: 60% earned a 3 (Acceptable) or better
Achievement of Benchmark: Met
Use of Results / Action Items: Expand assessment to courses such as literature, music, and creative writing. Provide additional support to faculty assessing for the first time.

Perform college-level mathematical operations or apply statistical techniques*
Data Source: Calculus B, Data Analysis I, University Physics II, and Calculus C exam and quiz questions
Performance Benchmark: 80% of the students achieve an overall rubric score of 2 (Developing) or higher
Findings: 94%** earned a score of 2 (Developing)
Performance Benchmark: 50% of the students achieve an overall rubric score of 3 (Competent) or higher
Findings: 69%** earned a score of 3 (Competent)
Achievement of Benchmark: Met
Use of Results / Action Items: Expand assessment to include international locations. Develop an assessment toolkit. Analyze results for each question to determine whether the questions need refinement. Provide additional instruction on related topics.


Comprehend and evaluate mathematical or statistical information*
Data Source: Calculus C (Spring 2013) quizzes and exam questions; Data Analysis (Spring 2013) exam questions
Performance Benchmark: 80% of the students achieve an overall rubric score of 2 (Developing) or higher
Findings: 96%** earned a 2 (Developing) or higher
Performance Benchmark: 50% of the students achieve an overall rubric score of 3 (Competent) or higher
Findings: 86%** earned a 3 (Competent) or higher
Achievement of Benchmark: Met
Use of Results / Action Items: The core recommendation is to strengthen the focus on content with the addition of weekly online quizzes that test knowledge of concepts instead of procedures. Reassess in the semester course format. Increase time spent on difficult topics.

Demonstrate knowledge of basic principles and concepts of one of the natural sciences*
Data Source: University Physics II commonly graded final exam
Performance Benchmark: 80% of the students achieve an overall rubric score of 2 (Developing) or higher
Findings: 83% earned a 2 (Developing)
Performance Benchmark: 50% of the students achieve an overall rubric score of 3 (Competent) or higher
Findings: 51% earned a 3 (Competent)
Achievement of Benchmark: Met
Use of Results / Action Items: Although the benchmarks were met, faculty will continue to collect data for trends and comparison, as well as expand the assessment to include College Physics.

Apply methods of scientific inquiry and problem solving to contemporary issues*
Data Source: General & Analytical Chemistry, General Biology III, Separations Lab
Performance Benchmark: 80% of the students achieve an overall rubric score of 2 (Developing) or higher
Findings: 93%** earned a 2 (Developing)
Performance Benchmark: 50% of the students achieve an overall rubric score of 3 (Competent) or higher
Findings: 63%** earned a 3 (Competent)
Achievement of Benchmark: Met
Use of Results / Action Items: Faculty will work with lab instructors to make the course homework and quizzes more intentional and meaningful in terms of providing opportunities for students to practice the outcome.

*Outcomes were assessed in more than one course or section. **Rubric score is the average across multiple sections.


Appendix D: Indirect Assessment – National Survey of Student Engagement (NSSE)

NSSE measures the extent to which first-year students and seniors are engaged in educational practices associated with student success. In 2013, NSSE 2.0 was introduced: the order and grouping of items changed, and individual items were revised, eliminated, or replaced with new items.

RIT Benchmark: The first-year and senior mean scores on the selected NSSE items are on par with or higher than those of their Carnegie Class peers. In other words, if our mean score is not significantly lower than our peer mean score, then we have met our benchmark. Items with mean differences larger than would be expected by chance (as calculated by NSSE) are noted with one, two, or three asterisks (p<.05 = *, p<.01 = **, p<.001 = ***).
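To make the decision rule concrete, the following is a minimal Python sketch. It is illustrative only and not part of RIT's reporting process; the function name and the p-value shown are assumptions made for the example, while the two pairs of means come from Table 15 below.

    # Sketch of the benchmark rule described above: an item is "Met" unless the
    # RIT mean is both lower than the Carnegie peer mean and the difference is
    # statistically significant (p < .05, as reported by NSSE).
    def benchmark_status(rit_mean, peer_mean, p_value):
        significantly_lower = rit_mean < peer_mean and p_value < 0.05
        return "Not Met" if significantly_lower else "Met"

    # Hypothetical check using two first-year means from Table 15 (p-values assumed):
    print(benchmark_status(2.7, 3.0, 0.0005))  # "Not Met": significantly lower than peers
    print(benchmark_status(2.3, 2.2, 0.0005))  # "Met": RIT mean is higher than peers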

We share the following data with the appropriate General Education faculty teams, and they review and use both the direct and indirect results when making recommendations for improvement.

Table 15: Indirect Assessment – NSSE

For each RIT student learning outcome and NSSE item, the findings below list the class level, achievement of the benchmark, the 2013 RIT mean, the 2013 Carnegie Class peer mean, and the significance flag.

Use relevant evidence gathered through accepted scholarly methods, and properly acknowledge sources of information
NSSE Item 4d. Evaluating a point of view, decision, or information source
First-year: Not Met (RIT mean 2.7; Carnegie peers 3.0; ***)
Senior: Not Met (RIT mean 2.7; Carnegie peers 3.0; ***)

Express oneself effectively in common college-level written forms using standard American English
NSSE Item 17a. Writing clearly and effectively
First-year: Not Met (RIT mean 2.4; Carnegie peers 3.0; ***)
Senior: Not Met (RIT mean 2.8; Carnegie peers 3.1; ***)

Express oneself effectively in presentations
NSSE Item 1i. Gave a course presentation
First-year: Met (RIT mean 2.3; Carnegie peers 2.2; ***)
Senior: Met (RIT mean 2.9; Carnegie peers 2.6; ***)

Interpret and evaluate artistic expression considering the cultural context in which it was created
NSSE Item 1d. Attended an art exhibit, play, or other arts performance (dance, music, etc.)
First-year: Met (RIT mean 2.0; Carnegie peers 2.0)
Senior: Met (RIT mean 1.9; Carnegie peers 1.8; ***)

Identify contemporary ethical questions and relevant positions
NSSE Item 17g. Developing a personal code of values and ethics
First-year: Not Met (RIT mean 2.5; Carnegie peers 2.8; ***)
Senior: Not Met (RIT mean 2.7; Carnegie peers 2.9; ***)

Comprehend and evaluate mathematical or statistical information
NSSE Item 6a. Reached conclusions based on your own analysis of numerical information (numbers, graphs, statistics, etc.)
First-year: Met (RIT mean 2.7; Carnegie peers 2.5; ***)
Senior: Met (RIT mean 2.7; Carnegie peers 2.6; ***)
NSSE Item 17c. Analyzing numerical and statistical information
First-year: Met (RIT mean 2.8; Carnegie peers 2.6; ***)
Senior: Met (RIT mean 2.9; Carnegie peers 2.8; ***)


Appendix E: Express Oneself Effectively in Presentations Rubric

Student Learning Outcome: Express Oneself Effectively in Presentations
Performance levels: Beginning (1), Developing (2), Accomplished (3), Exemplary (4)

CONTENT AND ORGANIZATION

Introduction criteria: engages audience with effective attention getter; introduces topic clearly; establishes credibility; relates topic to audience; clearly presents thesis or previews main points
Beginning (1): meets 0-1 criteria | Developing (2): meets 2 criteria | Accomplished (3): meets 3-4 criteria | Exemplary (4): meets all 5 criteria

Body criteria: articulates identifiable main points; balances time among main points; presents main points in a logical order; includes clear transitions between main points
Beginning (1): meets 0-1 criteria | Developing (2): meets 2 criteria | Accomplished (3): meets 3 criteria | Exemplary (4): meets all 4 criteria

Supporting Materials criteria: supports main points (explanations, examples, illustrations, statistics, analogies, quotations); cites sources clearly; depth of content reflects thorough understanding of topic; support materials are relevant, timely, appropriate, and unbiased
Beginning (1): meets 0-1 criteria | Developing (2): meets 2 criteria | Accomplished (3): meets 3 criteria | Exemplary (4): meets all 4 criteria

Conclusion criteria: transitions clearly from body to conclusion; summarizes main points and/or moves audience to action; includes strong final statement
Beginning (1): meets 0 criteria | Developing (2): meets 1 criterion | Accomplished (3): meets 2 criteria | Exemplary (4): meets all 3 criteria

LANGUAGE

Language criteria: identifies with audience (builds rapport, makes connections); language choices are imaginative, memorable, and compelling and enhance the effectiveness of the presentation; language is appropriate to audience (level of formality, inclusive language); language is correct (uses proper grammar and syntax); language is concise (uncluttered, avoids wordiness)
Beginning (1): meets 0-1 criteria | Developing (2): meets 2-3 criteria | Accomplished (3): meets 4 criteria | Exemplary (4): meets all 5 criteria

DELIVERY

Paralanguage criteria: has spontaneous, strong conversational quality (no reading); speaks at an appropriate rate and volume; speaks clearly and articulately (forms speech sounds crisply and distinctly); pronounces words correctly; voice is dynamic (vocal variety); avoids fillers such as "um," "like," "you know"
Beginning (1): meets 0-1 criteria | Developing (2): meets 2-3 criteria | Accomplished (3): meets 4-5 criteria | Exemplary (4): meets all 6 criteria

Movement and Gestures criteria: consistently uses eye contact to maintain rapport with audience; effective use of scanning to expand zone of interaction; inconspicuous use of notes; gestures appropriately, stands and moves deliberately, and avoids rocking and swaying; avoids distracting mannerisms (tapping, fidgeting, wringing hands); facial expressions match tone of speech
Beginning (1): meets 0-1 criteria | Developing (2): meets 2-3 criteria | Accomplished (3): meets 4-5 criteria | Exemplary (4): meets all 6 criteria

Scoring: a subtotal is recorded for each criterion row and the subtotals are summed to produce the TOTAL SCORE.
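To show how the rubric total relates to the 21-point (Accomplished) benchmark cited in Appendix C, here is a minimal sketch, assuming one 1-4 level score per criterion row; the sample scores are hypothetical and the code is illustrative only.

    # Illustrative only: sums one 1-4 level score per rubric criterion row and
    # compares the total to the 21-point "Accomplished" benchmark from Appendix C.
    CRITERIA = ["Introduction", "Body", "Supporting Materials", "Conclusion",
                "Language", "Paralanguage", "Movement and Gestures"]

    def total_score(scores):
        assert set(scores) == set(CRITERIA), "one score per criterion"
        assert all(1 <= s <= 4 for s in scores.values()), "levels run 1-4"
        return sum(scores.values())

    sample = {c: 3 for c in CRITERIA}      # "Accomplished" (3) on every criterion
    print(total_score(sample))             # 21
    print(total_score(sample) >= 21)       # True: meets the benchmark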


Appendix F: Interpret and Evaluate Artistic Expression Considering the Cultural Context in Which It Was Created Rubric

Benchmarks (Progressive)
Experience and Explore: 80% of students will receive a 3 or better on the rubric.
Interpret and Evaluate: 70% of students will receive a 3 or better on the rubric.
Knowledge and Appreciation of Cultural Context: 60% of students will receive a 3 or better on the rubric.

Student Learning Outcome: Interpret and Evaluate Artistic Expression Considering the Cultural Context in Which It Was Created

Perspective: Artistic
Performance levels: Insufficient (1), Developing (2), Acceptable (3), Exemplary (4)

Experience and Explore
Insufficient (1): Inability to experience and unwillingness to explore. Student demonstrates passivity and narrow response to artistic experiences.
Developing (2): Moderately engaged; student demonstrates satisfactory but uneven response to artistic experiences.
Acceptable (3): Actively engaged; student demonstrates an informed response to artistic experiences.
Exemplary (4): Enthusiastically and deeply engaged; seeks out further artistic experiences; applies them meaningfully to the world around them.

Interpret and Evaluate
Insufficient (1): Little or no use of appropriate terminology; ineffectively communicates interpretive and evaluative response.
Developing (2): Uses appropriate terminology; able to articulate basic concepts, but insufficiently communicates interpretive and evaluative response.
Acceptable (3): Provides a cogent critique, utilizes relevant terminology, and demonstrates awareness of disciplinary conventions. Makes adequate connections. Effectively communicates interpretive and evaluative response.
Exemplary (4): Provides a sophisticated critique. Forges strong connections with previous experiences and formulates an insightful response.

Knowledge and Appreciation of Cultural Context
Insufficient (1): Fails to demonstrate that there is an interaction of art and culture and cannot recognize significant historical perspectives.
Developing (2): Rudimentary knowledge of art forms in their cultural and historical context.
Acceptable (3): Informed appreciation of art forms in their cultural and historical context, but response lacks subtlety and complexity.
Exemplary (4): Recognizes complexity, ambiguity, subtlety, irony, interconnections between disciplines, the significance of historical and cultural narratives, and above all, the distinct contribution of art to human life.


Appendix G: Identify Contemporary Ethical Questions and Relevant Positions Rubric

Student Learning Outcome: Identify Contemporary Ethical Questions and Relevant Positions

Performance levels: Insufficient (1), Developing (2), Acceptable (3), Exemplary (4)

Ethical Problem Recognition
Insufficient (1): Student cannot recognize ethical problems and is unaware of complexity.
Developing (2): Student can recognize basic ethical problems but fails to grasp complexity or interrelationships.
Acceptable (3): Student can recognize basic ethical problems and grasp (incompletely) the complexities or interrelationships among the problems.
Exemplary (4): Student can recognize ethical problems when presented in a complex, multilayered (gray) context and can recognize relationships with other ethical problems.

Identification of Possible Positions
Insufficient (1): Student is unable to diagnose the ethical problem.
Developing (2): Student is able to diagnose the ethical problem.
Acceptable (3): Student is able to diagnose the ethical problem and can identify potential solutions.
Exemplary (4): Student is able to diagnose the ethical problem and can identify potential solutions and consider the full implications of them.

Evaluation of Different Ethical Perspectives
Insufficient (1): Student is not able to state any position.
Developing (2): Student is able to state a position but gives incomplete or flawed justification.
Acceptable (3): Student states a position and displays a logical justification for that position.
Exemplary (4): Student states a position, provides a logical justification, responds to objections and displays original insight.

Benchmarks (Progressive)
Ethical Problem Recognition: 90% of students will receive a 3 or better on the rubric.
Identify Possible Positions: 80% of students will receive a 3 or better on the rubric.
Evaluation of Different Ethical Perspectives: 70% of students will receive a 3 or better on the rubric.